
GNOME’s Video Player (Showtime) looks somewhat similar, as does Moonplayer.

I will not recommend switching to NixOS and declarative configuration. I will not recommend switching to NixOS and declarative configuration. I will not recommend switching to NixOS and declarative configuration.
…fuck. I failed the saving throw. I’m sorry.
Do look into Ansible, and the whole configuration management topic, though.
While I am not a fan of Nix the language, it is no more insane than Ansible or Kubernetes YAML soups.
As for packages… nixpkgs is by far the largest repo of packaged software. There are very few things I haven’t found there - and they are usually not in any other distro either.
I switched to NixOS because I wanted a declarative system that isn’t YAML soup bolted onto a generic distro.
By 2022, my desktop system was an unmanageable mess. It was a direct descendant of the Debian I installed in 1997. It was migrated piece by piece, and even switched architectures (multiple times! i386 -> ppc -> i386 -> amd64), but its roots remained firmly in 1997. It was an unsalvageable mess.
My server, although much younger, also showed signs of accumulating junk, even though it was Ansible-managed.
I tried documenting my systems, but it was a pain to maintain. With NixOS, due to it being declarative, I was able to write my configuration in a literate programming style. That helps immensely in keeping my system sane. It also makes debugging easy.
On top of that, with stuff like Impermanence, my backups are super simple: btrfs snapshot of /persist, exclude a few things, ship it to backup. Done. And my systems always have a freshly installed feel! Because they are! Every boot, they’re pretty much rebuilt from the booted config + persisted data.
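For a sense of how simple that is, here is a minimal sketch of the flow in Python (the snapshot location, restic repository, and excludes are made-up placeholders, not my actual setup):

```python
#!/usr/bin/env python3
"""Toy sketch of the backup flow: snapshot /persist, then ship it off.

Everything concrete here (snapshot location, restic repository, excludes)
is a hypothetical placeholder.
"""
import subprocess
from datetime import datetime, timezone

snapshot = f"/snapshots/persist-{datetime.now(timezone.utc):%Y%m%d-%H%M%S}"

# Read-only btrfs snapshot of the persisted state.
subprocess.run(
    ["btrfs", "subvolume", "snapshot", "-r", "/persist", snapshot],
    check=True,
)

# Ship it to a restic repository, excluding the few noisy bits.
subprocess.run(
    ["restic", "-r", "sftp:backup-host:/srv/restic", "backup", snapshot,
     "--exclude", "**/.cache", "--exclude", "**/tmp"],
    check=True,
)
```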
In short, declarative NixOS + literate style config gave me superpowers.
Oh, and NixOS’s packaging story is much more convenient than Debian’s (and I say that as an ex-DD who used to be intimately familiar with Debian packaging).
SuSE in 1996. Then Debian from mid-1997 to late 2023, and NixOS since.
I’m not a big distrohopper…
I do, yes. I’d love to use it, because I like Scheme a whole lot more than Nix (I hate Nix, the language), but Guix suffers from a few shortcomings that make it unsuitable for my needs.
Before I switched from Debian to NixOS, I experimented with Guix for a good few months, and ultimately decided to go with NixOS instead, despite not liking Nix. Guix’s shortcomings were just too severe for my use cases.
NixOS, because:
All of these combined mean my backups are simple (just snapshot /persist, with a few dirs excluded, and restic them to N places) and reliable. The systems all have that newly installed feel, because there is zero cruft accumulating.
And with the declarative config being tangled out from a literate Org Roam garden, I have tremendous, up-to-date documentation too. Declarative config + literate programming work really well together, and give me immense power.


I am doing exactly that. AI turns my work into garbage, so I serve them garbage in the first place, so they have less work to do. I am helping AI!
I’m also helping AI-using visitors: they will either stop that practice, or stop visiting my stuff. In either case, we’re both better off.


A human using a browser feature/extension you personally disapprove of does not make them a bot
So…? It is my site. If I see visitors engaging in behaviour I deem disrespectful or harmful, I’ll give them the boot, bot or human. If someone comes to my party and starts behaving badly, I will kick them out. If someone shows up at work and starts harassing people, they will be dealt with (hopefully!). If someone starts trying to DoS my services, I will block them.
Blocking unwanted behaviour is normal. I don’t like anything AI near my stuff, so I will block them. If anyone thinks they’re entitled to my work regardless, that’s their problem, not mine. If they leave because of my hard stance on AI, that’s a win.
Once your content is inside my browser I have the right to disrespect it as I see fit.
Then I have the right to tell you in advance to fuck off, and serve you garbage! Good, we’re on the same page then!


you disallow access to your website
I do. Any legit visitor is free to roam around. I keep the baddies away, like I would with a firewall. You do use a firewall, right?
when the user agent is a little unusual
Nope. I disallow them when the user agent is very obviously fake. No one in 2025 is going to browse the web with “Firefox 3.8pre5”, or “Mozilla/4.0”, or a decade-old Opera, or Microsoft Internet Explorer 5.0. None of those would be able to connect anyway, because they do not support the modern TLS ciphers required. The rest are similarly unrealistic.
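To give an idea of how cheap this kind of passive check is, here is a minimal sketch; the markers are just the examples above, the real ruleset is larger and more careful:

```python
# Sketch of a "very obviously fake user agent" check. The markers are just
# the examples mentioned above; a real ruleset is larger and more careful.
FAKE_UA_MARKERS = (
    "Firefox/3.8pre5",  # a prerelease nobody has legitimately run in years
    "Mozilla/4.0",      # Netscape-era prefix, nowadays only sent by fakers
    "Opera/9",          # decade-old Opera
    "MSIE 5.0",         # IE5 cannot even complete a modern TLS handshake
)

def is_sketchy(user_agent: str) -> bool:
    """Return True if the User-Agent string is very obviously fake."""
    return any(marker in user_agent for marker in FAKE_UA_MARKERS)
```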
nepenthes. make them regret it
What do you think happens when a bad agent is caught by my rules? They end up in an infinite maze of garbage, much like the one generated by nepenthes. I use my own generator (iocaine), for reasons, but it is very similar to nepenthes. But… I’m puzzled now. Just a few lines above, you argued that I am disallowing access to my website, and now you’re telling me to use an infinite maze of garbage to serve them instead?
That is precisely what I am doing.
By the way, nepenthes/iocaine/etc. alone do not do jack shit against these sketchy agents. I can guide them into the maze, but as long as they can access content outside of it, they’ll keep bombarding my backend, and will keep training on my work. There are two ways to stop them: passive identification, like my sketchy-agents ruleset, or proof-of-work solutions like Anubis. Anubis has the huge downside that it is very disruptive to legit visitors. So I’m choosing the lesser evil.
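For the curious, the core trick behind such a maze is tiny. A toy sketch of the idea (not how iocaine actually works, just the shape of it):

```python
# Toy maze: every URL below it deterministically yields a garbage page full
# of links leading deeper into the maze. iocaine and nepenthes are far more
# sophisticated; this only demonstrates the core idea.
import hashlib
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = ["flux", "ontology", "quincunx", "bramble", "sonder", "glitch", "vellum"]

class MazeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Seed from the path, so the same URL always yields the same garbage.
        rng = random.Random(hashlib.sha256(self.path.encode()).digest())
        text = " ".join(rng.choices(WORDS, k=200))
        links = " ".join(
            f'<a href="{self.path.rstrip("/")}/{rng.randrange(10**6)}">more</a>'
            for _ in range(10)
        )
        body = f"<html><body><p>{text}</p><p>{links}</p></body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), MazeHandler).serve_forever()
```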


This feature will fetch the page and summarize it locally. It’s not being used for training LLMs.
And what do you think the local model is trained on?
It’s practically like the user opened your website manually and skimmed the content
It is not. A human visitor will skim through, and pick out the parts they’re interested in. A human visitor has intelligence. An AI model does not. An AI model has absolutely no clue what the user is looking for, and it is entirely possible (and frequent) that it discards the important bits, and dreams up some bullshit. Yes, even local ones. Yes, I tried, on my own sites. It was bad.
It has value to a lot of people including me so it’s not garbage.
If it does, please don’t come anywhere near my stuff. I don’t share my work only for an AI to throw away half of it and summarize it badly.
But if you make it garbage intentionally then everyone will just believe your website is garbage and not click the link after reading the summary.
If people who prefer AI summaries stop visiting, I’ll consider that a win. I write for humans, not for bots. If someone doesn’t like my style, or finds me too verbose, then my content is not for them, simple as that. And that’s ok, too! I have no intention of appealing to everyone.


Pray tell, how am I making anyone’s browsing experience worse? I disallow LLM scrapers and AI agents. Human visitors are welcome. You can visit any of my sites with Firefox, even 139 Nightly, and it will Just Work Fine™. It will show garbage if you try to use an AI summary, but AI summaries are garbage anyway, so nothing of value is lost there.
I’m all for a free and open internet, as long as my visitors act respectfully, and don’t try to DDoS me from a thousand IP addresses, trying to train on my work, without respecting the license. The LLM scrapers and AI agents do not respect my work, nor its license, so they get a nice dose of garbage. Coincidentally, this greatly reduces the load on my backend, so legit visitors can actually access what they seek. Banning LLM scrapers & AI bots improves the experience of my legit visitors, because my backend doesn’t crumble under the load.


Overboard? Because I disallow AI summaries?
Or are you referring to my “try to detect sketchy user agents” ruleset? Because that had two false positives in the past two months, yet those rules are responsible for stopping about 2.5 million requests per day, none of which were from a human (I’d know; human visitors have very different access patterns, even when they visit the maze).
If the bots were behaving correctly, and respected my robots.txt, I wouldn’t need to fight them. But when they’re DDoSing my sites from literally thousands of IPs, generating millions of requests a day, I will go to extreme lengths to make them go away.


I wonder if the preview does a pre-fetch which can be identified as such? As in, I wonder if I’d be able to serve garbage to the AI summarizer, but the regular content to normal visitors. Guess I’ll have to check!
Update: It looks like it sends an X-Firefox-Ai: 1 header. Cool. I can catch that, and deal with it.
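Catching it is then a one-line check anywhere in the request path. A sketch, assuming lower-cased header keys; how it gets wired into a real server, and what garbage gets served, is left out:

```python
def wants_ai_summary(headers: dict[str, str]) -> bool:
    """True if the request carries Firefox's AI-preview marker.

    Header names are case-insensitive, so keys are assumed lower-cased.
    """
    return headers.get("x-firefox-ai", "").strip() == "1"

# Requests carrying the marker get the maze; everyone else, the real page.
assert wants_ai_summary({"x-firefox-ai": "1"})
assert not wants_ai_summary({"user-agent": "Mozilla/5.0 (X11; Linux x86_64)"})
```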


Considering the number of CVEs the kernel puts out, I’d argue there’s plenty there that’s broken, and that could be fixed by reimplementing those parts in a language less broken than C.


TLDR: Is it normal to distro hop after using a distro happily for so long?
I had used the same distribution (Debian) for over 20 years when I decided to switch to NixOS. Debian was - and still is - a very fine distribution. I just needed something radically different.
So, to answer your question: yes, it is perfectly normal. Two years isn’t even long.
If they have no desire to maintain/sysadmin their own Linux systems, then the best distro to recommend is whatever you can help them with, and possibly even maintain for them.
Case in point: my wife is a very happy NixOS user, despite knowing absolutely nothing about Linux. Yet she’s on a distribution that’s as far from newbie-friendly as a distro can possibly be. She’s still happy with it, because I set it up for her and I maintain it for her; she never has to install, upgrade, or configure anything, ever.


I’d say “under no circumstances”. When building for production, you want to build on a stable foundation. LFS isn’t that; it’s an educational tool. It does not result in a maintainable, robust system. It requires tremendous amounts of work to keep it secure and updated: there’s no package manager, no repository you can pull from, no nothing. You have to build an entire distribution on your own. Outside of educational purposes, I have trouble imagining any situation where that might be a good idea.
No, not even embedded. There were always distros targeting embedded systems; LFS was never a good choice there either. It was much more straightforward to strip down - say - Debian for a limited device than to build something from scratch for it. (I spent a few years building and operating embedded Linux systems in the early 2000s; we built on a stripped-down Debian.)
A new chapter, for sure. A very sad chapter, unfortunately.