This blog is on a subdomain of my site, sixy.name. My social media and messaging names are almost always “Sixthhokage1” or “Sixthhokage95”. So how did I come to be known by these names?
Well, it starts in 2006. A friend got me to try RuneScape around the end of fifth grade, so I created an account using my go-to username at the time. I ended up losing the password for that account. Later that year, I got into Naruto. When I created a new account to get back into RS, I tried to use the name “Sixthhokage” but found it was already taken. I tacked a 1 onto the end and a new identity was born.
For Christmas 2007 I got an Xbox 360, and my Xbox Live account became the next major one to bear the name Sixthhokage1. Years and several email account migrations later, I nuked the Microsoft account associated with that gamertag. My current gamertag is Sixthhokage95, though I never use it since I don’t play Xbox games anymore and don’t have any games from the Windows Store.
I created a YouTube account as Sixthhokage1 in 2007 or 2008, then ended up locked out of it, so at some point in 2008 I created a new YouTube account, which was the first thing to bear the name Sixthhokage95. I chose 95 as the numeric suffix for the obvious reason of being born in 1995. Even though it’s no longer my display name, my channel URL is still youtube.com/user/Sixthhokage95.
In 2010 I found and joined TV Tropes. I mostly lurked the wiki until I joined the roleplay We Are All Pokémon Trainers on the forum and found a community I felt comfortable in. Until that point I’d left a trail of dead accounts on forums I never really participated in, thanks to what I later found out was social anxiety. But by summer 2012 I was exploring the wider TV Tropes community and started posting in the Yack Fest section. There I made friends with a troper named Inhopelessguy, who nicknamed me Sixy, and the nickname caught on. A few years later, when I was purchasing a domain name, I settled on sixy.name. For a while my display name on YouTube was Sixy, but that attracted a lot of follow spam through Google+ from Middle Eastern and Indian pornbots.
The funny thing is that even though I’ve been using a name based on Naruto all these years, I haven’t watched the show since before Shippuden. The last thing I read was the end of the Pein arc, after Shonen Jump skipped a chunk of serialization to get more volumes out, so I had no real idea what was happening.
So for about the last week I’ve been somewhat involved in a really frustrating situation in Mastodon development, which has thankfully been resolved. Let me preface this by saying that I am in no way a developer; I’m simply a user who runs their own masto instance and was frustrated enough with a change to both revert it on my server and voice my opinion on GitHub. If it were just this one issue I wouldn’t be writing a blog post about it, but the core problem is a pattern of behavior and a concerning attitude from the lead developer, Gargron.
For those who don’t know what the hell I’m talking about, Mastodon is an open source, decentralized, federated Twitter-like social network. The official user’s guide explains it much better than I could, so I suggest reading the linked intro. The change in question concerns the content warning system and the metadata used for link embeds on other platforms, like Facebook, Twitter, Discord, or anything else that pulls Open Graph metadata.
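For context on how those embeds work: platforms like Discord fetch the linked page and read its Open Graph meta tags to build the preview card. A minimal sketch of that extraction step in Python, using only the standard library (the sample HTML below is a hypothetical example of the kind of markup a toot page might serve, not Mastodon’s actual output):

```python
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Collects <meta property="og:..." content="..."> tags from a page."""
    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        if prop.startswith("og:") and "content" in attrs:
            self.og[prop] = attrs["content"]

# Hypothetical markup of the sort an embed-generating platform would fetch.
# The whole debate was about what a post's og:description should contain
# when the post is behind a content warning.
html = """
<head>
  <meta property="og:title" content="Example User (@user@example.social)">
  <meta property="og:description" content="CW: example content warning">
  <meta property="og:image" content="https://example.social/preview.png">
</head>
"""

parser = OpenGraphParser()
parser.feed(html)
print(parser.og["og:title"])
```

The embedding platform never runs the remote site’s JavaScript or respects its click-through UI; it only sees whatever text the server chose to put in those tags, which is why the choice of what goes in them matters.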
So, the content warning, or CW, system. This lets you hide a post behind a warning and a clickthrough, and any attached media will automatically be hidden as sensitive too. (You can also mark media as sensitive without a CW, but usually anything marked sensitive gets put behind a CW anyway.)
Previously, the metadata would cause a link on an Open Graph enabled platform, e.g. Discord, to embed like this:[1]
As of the current stable release, it looks like this:
Thankfully, after a lot of back and forth, a reversion plus revamp was put forth and merged; there are example images in the linked PR discussion.
This change was made almost a month before the final release of 2.3.0, during which time very few people would have actually run into it: running from master is very much not recommended for a production instance, and even the instances using the release candidates in prerelease testing are mostly mastodev adjacent. Unfortunately, that also means Gargron was unprepared for it to become a sticking point with the community. The discussion on the pull request that implemented the change, in which I chimed in and which I subsequently followed, got a bit heated and also showcased the stubbornness that Garg is unfortunately noted for. I was already somewhat aware of this temperament of his, so I was frustrated but not surprised. Wanna know what got me pissed off, though? From the federated timeline I came across a thread about this situation in which Gargron got involved, and one of his responses was this:
@cassolotl @clar @HeckTheCistem @maloki Look, I'm not making bank on this. I think I'm slightly above German minimum wage. In the open-source scene, that's a great accomplishment. Compared to Twitter's 200 engineers, that's nothing. But it's my passion and I have fun doing it, so we get the Mastodon that we see. Don't mess with that. If you disagree with my decisions, fork, or switch to one of the existing forks. I listen to feedback. But I don't *owe* anyone to agree with it.
I listen to feedback. But I don’t owe anyone to agree with it.
Yeah, true, you don’t owe anyone agreement with feedback, Eugen, but you sure as hell have a responsibility not to try your damnedest to dismiss criticism that isn’t from your own friends and/or others heavily involved in development until it becomes obvious that the users offering it aren’t going to back the fuck down in the face of your stubbornness. Especially since, unless an instance’s sysadmin changes the localization YML files, you get this (or a translation thereof) on the landing page of every single instance:
A more humane approach
Learning from failures of other networks, Mastodon aims to make ethical design choices to combat the misuse of social media.
Yeah, being dismissive of your users’ concerns about Open Graph embeds not respecting content warnings, in the interest of ~engagement~, sure is some ethical design right there. That’s definitely the sort of project leadership I look for.
Okay, that’s enough snarky bitchiness from me. But in all seriousness, if Gargron wants to talk the talk about breaking free of the shit of corporate social media, he needs to walk the fucking walk. He got lucky that his passion project, an alternative to GNU social without its technical debt, happened to catch on with a good number of marginalized folks sick of the dumpster fire of Twitter and its enablement of nazis, white supremacists, and harassment in general. Now he’s built masto’s brand around its moderation tooling, robust privacy features, and yes, the content warning system.
I think he truly believes in doing better, but an unwillingness to look past his own privileged perspective leads to, well, shitstorms like this. It leads to a userbase that dreads having to bring up concerns with development. It leads to resentment among users toward the guy who, while nominally only in charge of the flagship mastodon.social instance, is essentially the Benevolent Dictator for Life of the software’s development, and if you don’t like it you can go fork yourself.[2] It leads to god knows how many followers-only toot threads about how fed up people are with this bullshit, and to some people just abandoning ship, either back to Twitter or to other OStatus/ActivityPub projects. It’s not good for the health of either the community or the continued development and ~engagement~ of the software.
1. These Discord screenshots are slightly modified, using Chromium’s element inspector, from how they actually appear, due to my own testing of modified CW behavior in Open Graph tags.
2. (On that note, the glitch-soc fork is pretty nice and the custom profile fields are a great feature.)
Update, November 2018: I have not actually been using this setup in months, as shortly after writing this, my stupidity in torrenting openly without a VPN or even PeerBlock on my home network got us a nastygram from our ISP, because one of the torrents was monitored by IP-Echelon. Don’t be stupid like me: protect yourself, and if you torrent a lot, probably use a seedbox instead of doing so locally.
I’ve just gotten back into using Kodi and have my library managers set to monitor things again. God, this is so much better than manually downloading things and just playing them in VLC.
For those unfamiliar, Kodi is an open source media center application, created in the early 2000s for the original Xbox as Xbox Media Center, or XBMC. Over time it was ported to all major desktop and mobile operating systems, Xbox support was dropped, and in 2014 the project was renamed to Kodi so its branding could be protected through trademark law against those selling prebuilt piracy boxes based on the software. There’s been a lot of conversation about Kodi and streaming of pirated content, but those streams come from third parties, who make libraries of content available to add as a network media source or build plugins for browsing pirated streams.
Of course I’m also using it for pirated media, but with local downloads instead. To manage this I use Sonarr, Radarr, Jackett, uTorrent, and SABnzbd. Sonarr (formerly NzbDrone) is a TV show library manager that can search for episodes from both torrent and usenet indexers, send them to your download clients (where uTorrent and SABnzbd come in), move them into your library location (for me, a 1 TB external hard drive), and download metadata and artwork formatted for a few different media centers. Radarr is a fork of Sonarr modified to manage movies, and I’m glad it exists, because before I found it I used CouchPotato, which in my opinion is garbage. Jackett provides API access to torrent trackers through both the Torznab and TorrentPotato APIs, and is how Sonarr and Radarr access most of their torrent indexers. Usenet binary uploads are pulled from NZBFinder (where I have a Basic paid plan) and downloaded through SABnzbd. Sonarr and Radarr run on my desktop as Windows services, and along with SABnzbd and Jackett are primarily controlled through the browser. Radarr and Sonarr are also both set to send notifications to Kodi and automatically update its library when it’s running.
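To give a feel for how the Torznab side of that pipeline fits together: Torznab is a Newznab-style REST API, so a search from Sonarr or Radarr ultimately boils down to an HTTP GET against Jackett with a query string. A minimal sketch of building such a request URL in Python; the host, port, indexer path, and API key below are placeholder values, so check your own Jackett instance for the real feed URL it gives you:

```python
from urllib.parse import urlencode

# Placeholder Jackett Torznab endpoint and API key (yours will differ).
JACKETT_URL = "http://localhost:9117/api/v2.0/indexers/all/results/torznab/api"
API_KEY = "your-jackett-api-key"

def torznab_search_url(query, categories=(5000,)):
    """Build a Torznab t=search request URL.

    5000 is the Newznab/Torznab category group for TV; a movie manager
    like Radarr would use the 2000 group instead.
    """
    params = {
        "apikey": API_KEY,
        "t": "search",
        "q": query,
        "cat": ",".join(str(c) for c in categories),
    }
    return JACKETT_URL + "?" + urlencode(params)

url = torznab_search_url("some show s01e01")
print(url)
```

The response is an RSS-style XML feed of results, which the library manager parses, scores against its quality profiles, and hands off to the download client, which is the part Sonarr and Radarr automate so you never have to touch the indexer directly.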
I first started using Kodi itself at some point in 2016. At that point I had my desktop set to use my TV as a secondary display and launched Kodi through Steam Big Picture, navigating with my Steam Controller. I moved later that year, and my current bedroom setup doesn’t allow me to connect my TV to my desktop. However, with the long cable of my headphones and the extremely short gap between my bed and desk, I can simply watch while lying or sitting on my bed. Instead of using the Steam Controller like I used to, I use an Android app called Yatse, which acts as a remote for Kodi (and apparently works for some other media centers too). I’ve got it set up on both my phone and my Fire tablet.
If I don’t want to watch on my desktop itself, I also have Kodi installed on my Fire TV Stick and my laptop, and I can stream videos and music from my desktop over the local network via universal plug and play (UPnP), as long as Kodi is running there. There are limitations, though, as my video library is not optimized for streaming: I’m not going to be streaming a 1080p60 video I recorded with my camcorder to my Fire stick, for example. But for the most part it works well enough, with maybe a bit of buffering at the start. I have yet to test streaming the comically large Star Wars Despecialized Editions, though, and am interested to see how that plays out (A New Hope clocks in at 20 gigs, but these full-version MKV files contain a wonderfully large selection of audio tracks, including an isolated musical score).
Ideally I’d have a dedicated home theater PC for this and maybe a NAS for storage, but lol, I’m on a Kroger paycheck and can’t afford that. This system on my existing PC works, and I’m stupidly proud of it, hence this rambling write-up. If you want actual instructions on how to set this stuff up, there’s each project’s own documentation and countless tutorials from cordcutter and HTPC enthusiast sites. And if you want to use Usenet downloads, you’re going to have to find a paid provider; recommendations can be found on r/usenet (as much as reddit is a toxic hellhole of a site to participate in, some subs, such as this one, are wonderful for gathering information).
I’ve been pondering making an actual blog after messing around with the Feels Engine on tilde.town. Hopefully this will get me more comfortable writing more long-form content again, something I’ve been out of the habit of since retiring from a play-by-post RP I was in and graduating high school.