The solution to hosted software's control over our infrastructure is quite simple: we have to decentralize the power. Just as freedom of the press can be achieved by giving people the tools to print and distribute underground pamphlets, we can give people their software freedom back by teaching them to control their own servers.
Now that I've set up a music-streaming app on my home server, I have an opportunity to do one of my favorite menial tasks: tagging, organizing, and standardizing a music library.
I'm not kidding. I actually really enjoy this. 🤓
The last time I undertook a significant music library organization effort was the very end of 2012/beginning of 2013. My now-wife and I were down in Tucson so she could do thesis research on the border. I didn't have any such task to occupy myself, so I decided to clean up my music library.
My personal library has stayed pretty clean over the years since, but now that I have a home music server, I'm integrating my wife's library as well, and she... is not nearly as fastidious as me.
California doesn't have enough prisoners to fight wildfires for subminimum wage because too many are sick with or dying of COVID-19.
🇺🇲🦅🎆🙃
The SAND Lab at University of Chicago has developed Fawkes, an algorithm and software tool (running locally on your computer) that gives individuals the ability to limit how their own images can be used to track them. At a high level, Fawkes takes your personal images and makes tiny, pixel-level changes that are invisible to the human eye, in a process we call image cloaking. You can then use these "cloaked" photos as you normally would, sharing them on social media, sending them to friends, printing them, or displaying them on digital devices, the same way you would any other photo. The difference, however, is that if and when someone tries to use these photos to build a facial recognition model, "cloaked" images will teach the model a highly distorted version of what makes you look like you. The cloak effect is not easily detectable by humans or machines and will not cause errors in model training. However, when someone tries to identify you by presenting an unaltered, "uncloaked" image of you (e.g. a photo taken in public) to the model, the model will fail to recognize you.
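To give a feel for what "tiny, pixel-level changes" means numerically, here's a toy sketch. To be clear, this is not what Fawkes actually does — Fawkes computes targeted adversarial perturbations against feature extractors, not random noise — it only illustrates the idea of a small per-pixel perturbation budget that's invisible to the eye:

```python
import numpy as np

# Toy illustration only: real cloaking optimizes the perturbation
# against a face-recognition feature extractor. Here we just show
# what a small, bounded per-pixel change looks like numerically.
rng = np.random.default_rng(0)

# A stand-in for a personal photo: a random 64x64 RGB image.
image = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)

# Perturbation bounded to +/-3 intensity levels per channel --
# far below what the human eye notices in a photograph.
budget = 3
perturbation = rng.integers(-budget, budget + 1, size=image.shape)

# Apply the perturbation and clamp back to valid pixel values.
cloaked = np.clip(image.astype(int) + perturbation, 0, 255).astype(np.uint8)

# No pixel moved by more than the budget.
max_change = int(np.abs(cloaked.astype(int) - image.astype(int)).max())
print(max_change)  # at most 3
```

The point of the real algorithm is that even changes this small, if chosen adversarially rather than randomly, can poison the features a model learns about your face.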
I've successfully set up my own #selfhosted Navidrome server (thanks to @YunoHost@mastodon.social, @deluan@twitter.com, and Éric Gaspar on Github), and it's awesome.
I love having my own music streaming server. As Google prepares to kill Google Play Music (and shuttle users to YouTube Music), and in an era when we're seeing the idea of "ownership" in tech degrade more and more, it's nice to have my stuff and know it's mine.
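One nice consequence of Navidrome in particular is that it speaks the open Subsonic API, so any Subsonic-compatible client can stream from it. As a sketch — the server URL and credentials below are made-up placeholders — this is how a client builds an authenticated request under the Subsonic API's salted-token scheme:

```python
import hashlib
import secrets
from urllib.parse import urlencode

# Hypothetical server address and credentials -- substitute your own.
SERVER = "https://music.example.com"
USER = "alice"
PASSWORD = "correct horse battery staple"

def subsonic_url(endpoint: str, **params: str) -> str:
    """Build an authenticated URL for a Subsonic-compatible server.

    Per the Subsonic API, the client sends md5(password + salt) plus
    the salt, so the plaintext password never goes over the wire.
    """
    salt = secrets.token_hex(8)
    token = hashlib.md5((PASSWORD + salt).encode()).hexdigest()
    query = urlencode({
        "u": USER,       # username
        "t": token,      # md5(password + salt)
        "s": salt,       # the salt used to compute the token
        "v": "1.16.1",   # API protocol version
        "c": "demo",     # client name
        "f": "json",     # response format
        **params,
    })
    return f"{SERVER}/rest/{endpoint}?{query}"

# A "ping" request is the usual connectivity check.
print(subsonic_url("ping"))
```

Because the API is open, my library isn't locked to any one app: the same server works from a phone client, a desktop client, or a quick script.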