I deleted my desktop environment during an apt upgrade, not once but twice. Bad habit of not properly reading the prompts that pop up: it did ask me if I wanted to remove it all, and I just said “yea lol lfg”. The cause was a conflict with a third-party PPA.
I didn’t even know that had happened at first. I was stuck on the session manager’s login screen, and it just wouldn’t proceed after I entered my password. The first time I just reinstalled Linux; the second time I found out how to reinstall the desktop environment from a tty, which is also how I learned about ttys.
Jellyfin is also useful for a music collection. I tried both it and Navidrome to start with, and ended up only using Jellyfin.
Good to know! I mean, their WebFAI installer does support installing other distros as well, so I would imagine it works better than I originally feared after reading the latest news.
I have not fully understood the meaning or significance of this news or the content of this article. I have a Tuxedo laptop, and for now I am happy with Tuxedo OS. But I am of course interested in the ability to change distros at some point.
Am I understanding correctly that I will have a very hard time doing so without patching the kernel myself to get proper hardware support, and that even then it will be difficult?
There are periods where YouTube makes changes frequently, so that e.g. FreeTube stops working for a while, but for the most part it works reliably. I would say it is much more convenient than watching on youtube.com logged out, as you get profiles, subscriptions, playlists and history. It also includes adblock, SponsorBlock and DeArrow, and granular control over what to show or not (comments, shorts, live streams, etc.).
Cheers, I’ll look into setting up SFTP in Dolphin.
Samba shuffles rather a lot of data, quite happily. You have not given us an exhaustive description of the shoddy wiring, dodgy switches and wonky configuration that makes up your network. If it was perfect, you would not be posting here.
The network is by no means ideal. I am transferring from a laptop on WiFi to a server on WiFi located some distance from the WAP. If I owned the place I would rewire it, but for now it’s the best I can do. I think I assumed that there would be error checking involved when copying. Since following the advice here to use rsync instead, I have found that files tend to fail in bunches, and I need to rerun it several times for it to actually complete. Am I right to assume that comes down to packet loss from a poor signal?
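The rerunning could presumably be scripted as a retry loop. A sketch of the idea, with a stand-in `flaky_copy` function in place of the real rsync invocation (rsync exits non-zero whenever any file failed, so the same loop shape would work with the real command):

```shell
#!/bin/sh
# Retry-until-complete sketch. flaky_copy stands in for the real
# rsync call; here it fails twice before succeeding, mimicking a
# lossy WiFi link.
attempts=0
flaky_copy() {
    attempts=$((attempts + 1))
    [ "$attempts" -ge 3 ]    # succeed on the third try
}
until flaky_copy; do
    # Give up rather than loop forever if the link is truly down:
    [ "$attempts" -ge 10 ] && { echo "giving up" >&2; exit 1; }
    sleep 1
done
echo "completed after $attempts attempts"
```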
Your issue is probably hardware related. Test your network with say iperf3. Have a look at network stats. Don’t rely on cargo cult bollocks - do some investigations. Nowadays we have nearly all the tools as open source to do the entire job - we did not have that 30 years ago. Grab wireshark, nmap, mtr and the rest and get nerdy (or hire me to do it - don’t do that please!)
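Rough iperf3 usage for a test like that — the address is a placeholder for whatever the server's IP actually is:

```shell
# On the server:
iperf3 -s                          # listen for incoming tests

# On the laptop (192.168.1.10 is a placeholder address):
iperf3 -c 192.168.1.10 -t 30       # 30-second TCP throughput test
iperf3 -c 192.168.1.10 -u -b 50M   # UDP at 50 Mbit/s; the report includes packet loss
```

The UDP run is the interesting one for a flaky WiFi path, since its summary reports lost datagrams directly.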
This is above my skill level for now, but I’m adding it to my notes to go back to. I have some ambition of upping my network knowledge in the coming year, and being able to use such tools to troubleshoot would be great.
Windows 11 on a Lenovo ThinkPad for work. No Linux option, but we are working on it. I would still need Win11 for Office work, as it is widespread in the organization and interop with LibreOffice or OnlyOffice isn’t flawless.
Oh, I didn’t know that. Neat!
I think I will go with rsync for future transfers, but I would still like the files to be browsable through the file browser. Is there a better option than Samba I should consider? I guess it is not an issue to just keep them as Samba shares for that purpose?
How would I achieve that? With cron?
I tried to resync now, and had to pass the -c flag to make it compare checksums when deciding whether files should be updated. Then it worked. Looks like that does not affect the after-transfer checksum check though, so that’s good (from the documentation):
Note that rsync always verifies that each transferred file was correctly reconstructed on the receiving side by checking a whole-file checksum that is generated as the file is transferred, but that automatic after-the-transfer verification has nothing to do with this option's before-the-transfer “Does this file need to be updated?” check.
Thanks! Glad to know rsync includes a check after transfer, as I’ve just recently used it to back up everything on these drives to another hard drive that will not always be spinning. But I did not consider using it to transfer new media onto these hard drives.
I’ll try to use it to resync the files that were acting up.
Both machines are WiFi-connected.
I am unsure what logs to look at for this, and I have not done any filesystem check on the hard drive, as I am unfamiliar with those tools. It’s an external Seagate HDD with an ext4 filesystem.
Yes, I should have specified that.
My conversations with any LLM tend to go: “you got a, b and c wrong, it should be d, e and f”, and it says “sorry, of course it should be d, e and f, my mistake, here it is with d, e, f, g and h”. Then I say “g and h are wrong, it should be i and j”. And it keeps going. In the end I write it myself. Huge time waster.
And yet people at work will take its word when asking about things they don’t know anything about beforehand and have no real way of fact-checking without actually doing the research they are trying to avoid.
Thanks for the clarification :)
I use ledger. I have not automated much beyond autocomplete macros in my text editor, but it doesn’t take too much time and forces me to look over my spending, so I like it. I will eventually attempt to build some kind of Dash application for visualising the output, but have only started on the parsers so far.
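For anyone unfamiliar: a ledger journal is just plain text, so it’s easy to edit and parse. A made-up entry (account names, amount and currency are examples, not my real setup):

```
2024/06/01 Grocery store
    Expenses:Food:Groceries      243.50 NOK
    Assets:Bank:Checking
```

The second posting is left blank, so ledger balances it automatically, and `ledger -f main.journal balance` then prints the running totals per account.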