• 0 Posts
  • 45 Comments
Joined 2 years ago
Cake day: August 6th, 2023

  • I think the home media collector use case is actually a complete outlier in terms of what these formats are actually being developed for.

    Well yeah, given who makes it, but it’s what I care about. I couldn’t care less about obscure and academic efforts (or the profits of some evil tech companies) except as vague curiosities. HEVC wasn’t designed with people like me in mind either, yet it means I can have roughly 30% more stuff in the same space, and the encoders are mature enough that the difference in encode time between it and AVC is negligible on a decently powered server.
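
    As rough arithmetic (a sketch; the ratio below is an assumption picked to match that ~30% figure, and real savings vary a lot by content and settings):

    ```python
    # Back-of-the-envelope: how much more content fits in the same space
    # if HEVC encodes come out around 77% of the AVC size (assumed ratio;
    # real results vary by content, encoder, and settings).
    hevc_ratio = 0.77  # assumed HEVC size relative to AVC

    extra_capacity_pct = (1 / hevc_ratio - 1) * 100
    print(f"Extra content in the same space: ~{extra_capacity_pct:.0f}%")
    # -> Extra content in the same space: ~30%
    ```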

    Transparency (or great visual fidelity, period) also isn’t likely the top concern here, because development is driven by companies that want to save money on bandwidth and perhaps on CDN storage.

    Which I think is a shame. Lower bitrates at transparency -should- be the goal: getting streaming content to consumers at very high quality, ideally close to or equivalent to UHD Blu-ray for 4K. Instead we get companies that bit-starve and hop onto these new encoders because they can spend fewer bits while leaning on tricks to maintain a certain baseline of perceptual image quality that passes the sniff test for the average viewer. So instead of quality bumps, we get them using fewer bits and passing the savings on to themselves, with little meaningful upgrade in visual fidelity for the viewer. That’s why it’s hard to care much about a lot of this stuff when it doesn’t benefit the user in any way.


  • And which will be so resource intensive to encode with compared to existing standards that it’ll probably take 14 years before home media collectors (or yar har types) are able and willing to use it over HEVC and AV1. :\

    As an example, AV1 encodes are to this day extremely rare in the p2p scene. Most groups still work with h264 or h265, even those focusing specifically on reducing sizes while maintaining quality. By contrast, HEVC saw significant uptake in the p2p scene within 3-4 years of its release (we’re on year 7 for AV1).
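
    If you want to see the encode-time gap for yourself, here’s a minimal sketch (assuming an ffmpeg build with libx265 and SVT-AV1; the clip name is hypothetical, and CRF values aren’t comparable across codecs, just illustrative starting points):

    ```python
    # Time an HEVC encode vs. an AV1 encode of the same source clip.
    import subprocess
    import time

    SOURCE = "sample.mkv"  # hypothetical test clip

    encodes = {
        "out_hevc.mkv": ["-c:v", "libx265", "-preset", "slow", "-crf", "22"],
        "out_av1.mkv": ["-c:v", "libsvtav1", "-preset", "6", "-crf", "30"],
    }

    for outfile, codec_args in encodes.items():
        start = time.monotonic()
        subprocess.run(
            ["ffmpeg", "-y", "-i", SOURCE, *codec_args, "-c:a", "copy", outfile],
            check=True,
        )
        print(f"{outfile}: {time.monotonic() - start:.0f}s")
    ```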

    These greedy, race-to-the-bottom device-makers are still fighting AV1. People are keeping devices longer and not upgrading as much, and tons of people rely on under-powered smart TVs for watching (forcing streaming services to maintain older codecs like h264/h265 to keep those customers), so I fear it’s going to take a depressingly long time for AV1 to be anything but a web streaming phenomenon.




  • Probably the best choice if OP is dreading 11. Put it off and hope that in 3 years Linux support has matured even more for their use cases.

    MS support has used this software themselves in an edge case where they couldn’t get Windows to activate properly.

    You have two options here:

    1. Enable the extended support (no payment needed with this software, though if OP absolutely refuses to run it they can pay Microsoft directly, if they can find where to do that) and run on that for 3 years, until 2028.

    2. Upgrade to LTSC IoT using the method outlined at the link there. Again there are two options: one is free; the other is following that guide but paying for a gray-market key (G2A, for instance) for LTSC IoT, which avoids running this software on their PC but means paying someone for a corporate volume key they’re not technically allowed to sell. That means support until 2032.




  • The only thing I would note is that -if- your volumes are not partition- or disk-based but -file-based, there is the possibility that corruption of the host file system on the disk holding the container files could result in pieces of those files being marked unreadable, and it’s POSSIBLE one way to solve this would be a file system check utility.

    HOWEVER, such checks carry a -large- risk of data loss, so I would advise making a bit-for-bit copy of the disk and doing the repair on that, so if it goes wrong you’re no worse off. -If- you cannot make a copy, then I would advise at least trying to mount using the backup headers first and copying off anything you can salvage, as file system checks can really mess up data recovery and should only be used in certain circumstances.
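
    A dedicated imaging tool like ddrescue is the right choice for the copy, but the idea is simple enough to sketch (device and image paths here are hypothetical; run with appropriate privileges, and note a real tool retries and logs bad regions far better than this):

    ```python
    # Minimal bit-for-bit imaging sketch: copy a suspect disk to an image
    # file, writing zeros where a region can't be read so the image stays
    # aligned. ddrescue does all of this much better.
    import os

    SRC = "/dev/sdX"             # suspect disk (hypothetical path)
    DST = "/mnt/backup/sdX.img"  # image file on a known-good disk
    CHUNK = 1024 * 1024          # 1 MiB per read

    with open(SRC, "rb", buffering=0) as src, open(DST, "wb") as dst:
        while True:
            try:
                block = src.read(CHUNK)
            except OSError:
                # Unreadable region: skip past it and keep the image aligned.
                src.seek(CHUNK, os.SEEK_CUR)
                dst.write(b"\x00" * CHUNK)
                continue
            if not block:
                break
            dst.write(block)
    ```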

    In fact, you’re much better off trying the recovery software I linked than doing a file system check, as it tends to give better results.

    You can also use VeraCrypt’s option to mount read-only, to prevent writes to a suspected failing disk.

    Let me know if you need further advice.


  • VeraCrypt has backup headers located elsewhere in the volume that are unlikely to have been overwritten.

    First things first, I would strongly recommend copying the drive as it currently exists, bit for bit, to another drive of equal or larger size. Don’t work on the original if you can help it.

    Now, with this copy, check the option to use the backup header when mounting and try again. If the partition is gone and VeraCrypt doesn’t see it, you’ll need a tool that recovers partitions and doesn’t mind encrypted partitions or file system types it doesn’t understand, and use it, ON THE COPY, to recover and recreate the partition (this writes data and can cause further loss or worsen your ability to recover, which is why it’s important to perform it on a copy). TestDisk may work for this, but there are other options that are probably better.
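
    For the mounting step, this is roughly what it looks like with the Linux text-mode CLI (paths are hypothetical, and double-check the flag names against `veracrypt --text --help` on your build):

    ```python
    # Sketch: mount a VeraCrypt volume read-only using its embedded
    # backup header. Work against the copy, not the original.
    # The CLI prompts for the password interactively in text mode.
    import subprocess

    VOLUME = "/dev/sdY1"          # the COPY of the damaged volume
    MOUNTPOINT = "/mnt/recovery"  # hypothetical mount point

    subprocess.run(
        [
            "veracrypt", "--text",
            # "ro" prevents writes to the volume; "headerbak" tells
            # VeraCrypt to use the backup header at the end of the volume.
            "--mount-options", "ro,headerbak",
            VOLUME, MOUNTPOINT,
        ],
        check=True,
    )
    ```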

    See this list: https://old.reddit.com/r/datarecovery/wiki/software and choose something from there if this data is truly important. Again, only work on a copy on another drive. Some of these tools actually read from the original drive and write recovered data elsewhere, and should be safe to use on the original so long as they have you select a target drive to push the recovered data to, but read the documentation. TestDisk absolutely must be used on a copy.

    You will incur some data loss, and once the volume successfully mounts you should likely run one of the file recovery tools mentioned above on it to attempt to recover as much as possible.


  • This is ridiculous. I like the way it’s set up now. They tried “simpler” before and I hated it and turned it off. Along with the news that they’re supposedly getting rid of tags for bookmarks (I have so many bookmarks that without tags they’d be useless), I’m just feeling so much despair for the web right now.

    Also, hiding the https:// scheme in the address bar so it’s no longer shown as part of the URL is another negative change, catering to what they believe is the lowest common denominator. Consider for a moment that browsers still support multiple protocols besides hypertext transfer.


    Use the Secure Erase function built into the ATA spec (other specs have equivalents); it applies a voltage spike to clear the cells of all held charge, thus wiping them. This happens near-instantly: the drive will signal that the process is finished within a minute, and the erase itself takes much less time than that.
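
    On Linux the usual way to trigger this is hdparm (a sketch; the device path is hypothetical, this irreversibly wipes the drive, and the drive must not be in the “frozen” state, which you can check with `hdparm -I`):

    ```python
    # Sketch of the ATA Secure Erase flow via hdparm. IRREVERSIBLE.
    import subprocess

    DEV = "/dev/sdX"  # hypothetical target drive
    PW = "p"          # temporary security password; cleared by the erase

    # A security password must be set before the erase command is accepted.
    subprocess.run(
        ["hdparm", "--user-master", "u", "--security-set-pass", PW, DEV],
        check=True,
    )
    # Issue the erase; on most SSDs this returns in well under a minute.
    subprocess.run(
        ["hdparm", "--user-master", "u", "--security-erase", PW, DEV],
        check=True,
    )
    ```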

    If you want to be extra paranoid, I suppose you could follow that up by encrypting the entire (empty) drive and then erasing again. I’m not sure this has any real benefit, but it’s the closest you can get to forcing the cells to be written and then cleared again. Even that doesn’t guarantee that exhausted, worn-out areas of flash aren’t spared by both passes. It’s unlikely that large amounts of data could be recovered unless your drive is failing or completely worn out, but it’s also why, if you ever store sensitive data on an SSD, it’s preferable to store it in encrypted form from the start (such as encrypting the whole disk or partition).




  • They’re not good, I admit that. But there is nothing better at present.

    Your choices are Google (Chrome), Safari (Apple devices and OSes only), or Firefox. It’s as simple as that. Pretending otherwise is living in a fantasy land. There’s no easy road out of here, realistically. New browser engines take years to build (perhaps the better part of a decade at this point), and the inherent complexity and sheer volume of web standards mean I regard things like Ladybird as a silly meme sucking up nerd and venture capital dollars rather than a serious endeavor.

    The effort to build a web browser from scratch today, compared to 15 years ago, has grown massively, and I think that’s intentional on the part of companies like Google and Microsoft: shut out competition and small actors, and control the web for themselves and Western governments.

    The last decent bits of Firefox are what’s holding back a tidal wave of bad things from destroying the sickly remains of the open web in very quick fashion. Right now I can block ads, I can stop my browser from phoning home, my browser isn’t made by an ad company, and it’s not made by a company with a vested interest in completely airtight DRM because they own a video platform and/or are friends with big Hollywood studios (yes, they implement DRM; no, it’s not done as tightly as in Chrome; the fact that major streaming platforms restrict it to 720p should show you that).

    They’re not the hero we need, but they’re far from the worst villain, and when they’re gone, much as I have criticized them, we are going to be fucked, because no one can replace them.

    The 90s ideals of an open internet that persisted into the 2000s and led to Firefox have vanished, replaced by various grifts that call themselves web 3.0. The illusion of human rights and freedom the liberal capitalist West was weaving, which made space for many good things, is being clawed back now that its hegemony is under threat.

    Frankly, I don’t see the EU or China or some large, benevolent, very wealthy organization stepping in to build a new browser that’s privacy-respecting, not full of backdoors, and not totally in the thrall of the worst corporate interests. And I don’t see Mozilla selling Firefox to some benevolent org. Not in the near term; in 8 years, who can say, but we’ll spend many horrible years wandering in the wilderness in the meantime, and the web will permanently enshittify in ways that Firefox could at least have slowed.

    I see two options in the present: either Firefox somehow manages to continue existing without compromising things to the point that the LibreWolf devs and others give up because the soil is too toxic, or it fails at that, collapses entirely, and stuffs itself full of ads and spyware that’s very hard to remove in an attempt to stay afloat.

    It’s like shrugging at a law gutting union protections and saying “revolution, revolution, revolution,” indifferent to the suffering and uncertainty coming down the pipe, when the conditions for what you want to happen are nowhere near, when you’re staring down the barrel of worsened oppression and even the potential of salvation is years, a decade, away. That’s how I regard people indifferent to Mozilla imploding.

    Do I wish there was a way to snatch Firefox away from them? Yes. But there isn’t. In fact, if anyone were able to, they could do it right now: it’s open source, and they could just fork it, get to work, and start making something better. The idea that the void will be filled by good things is “hand of god, hand of the markets” liberal capitalist brained thinking.

    Most people don’t give a shit about web privacy, about not seeing ads online, about controlling how websites display, about not having all their data sucked up, or about companies pushing evil web standards that take away control and hand it to abusive governments and corporate actors. So this isn’t going to lead to some revolutionary push-back; it’s going to lead to the collapse of the last militant hold-out for privacy advocates.

    Frankly, I see a nightmare scenario where Chrome is bought by a company that takes it closed source (even partially), or buries the spyware and bad things so deeply that open source fork maintainers can’t remove them under the burden, while simultaneously Firefox either simply ceases to be developed or enshittifies and deploys its own ads and spying. At that point we’ll have nothing. There aren’t enough nerds who care about privacy to fund a privacy-respecting, standards-compliant web browser that manages not to be blocked by most websites. As it is, if Firefox had come out 5 years ago and weren’t grandfathered in from its good old days as a big player, it probably wouldn’t have the sway it has on web standards bodies and would probably be blocked a lot more aggressively.

    Should Mozilla be restructured and stop acting in such a lousy fashion? Absolutely. Do I see any way for us random web users to force that? Not at all. It’s a lousy situation but one which can get much, much, much worse.


  • Literally the other way around.

    Mozilla can continue to be an irrelevant little NGO with a tiny little office in SF, pestering people, shouting into the void, and setting up booths at tech conventions on very, very, very little money: a few million a year, much less than they stand to earn annually from their investment fund returns.

    Firefox, on the other hand, requires Mozilla’s hundreds of paid full-time developers. Its codebase is nearly the size of Linux’s, and as a browser it’s constantly patching security issues, adding new features, fixing things that break for small slices of the web, etc.

    There is simply no organization waiting in the wings with both the money and the interest in making a privacy-preserving web browser to pick up that slack.


  • And with it the open web.

    If (and it’s still a big if) Google is forced to sell Chrome they’ll sell it to either Facebook, AltmanAI, Microsoft (lol), or else some shady tech company that has no reason to want to own it but is an even thinner rubber mask for the CIA/FBI/etc.

    This is why I’m sure it’ll happen (dooming hard). The US government wants web control and censorship, and one big thing standing in the way is the open web Firefox fosters. Kill that off and the rest falls in line: corporate/government surveillance and control, and the end of anonymity and of any speech the aforementioned parties dislike.


  • Yes, absolutely. And they can drag Canonical into it as well if they wish, though it’s harder. Being UK-based doesn’t protect them from the long arm of US law: arresting any US personnel, freezing and seizing their funds, putting out arrest warrants for those in the UK and harassing them with the fear of arrest and rendition to the US if they go to a third country (for a conference, vacation, etc.; most would buckle rather than live under that). Additionally, the US could sanction them for non-cooperation by making it illegal for US companies to sell them products and services, for US citizens to work for or aid them, and so on.

    They can go after community-led projects too: just send the feds to the houses of some senior US developers to threaten and intimidate them, intimating imminent arrest and a prison sentence unless they cut contact and stop working with parties from whatever countries the US chooses to name. Raid their houses, seize their electronics, detain them for hours in poor conditions. There are lots of ways to apply pressure that don’t even have to stand up to extensive legal scrutiny (they can keep the devices, and the people would have to sue to get them back).

    The code itself likely exists in multiple places, so if someone wanted to fork from, say, next week’s builds for an EU build, they could, and there would be little the US could do to stop that. But it could stop cooperation and force these developers to apply technical measures attempting to prevent downloads from IP addresses known to belong to sanctioned countries of its choosing.

    It’s not like the US can slam the door, take its Linux home, and leave China, the EU, and Russia with nothing. They’d still have old builds and code and could develop off of those, though with international cooperation broken it would be a fragmented process prone to various teething issues.



  • Interesting project. Thanks for the link; I do appreciate it and can see some very good uses for it, but it’s not quite what I meant.

    Unfortunately, as it notes, it works as a companion for reverse proxies, so it doesn’t solve the big hurdle there, which is handling a secure, working flow (specifically ingress) of Jellyfin traffic into a network as a turn-key solution. All this does is change the authorization mechanism, and my users don’t have an issue with writing down passwords and emails. It still leaves the burden of:

    • choosing and setting up the reverse proxy,
    • certificates for that,
    • paying for a domain so I can properly use certificates for encryption,
    • making sure that works,
    • chore of updating the reverse proxy, refreshing certs (and it breaking if we forget or the process fails), etc

    Which is a hassle and a half for technically proficient users, and the point at which most other people would give up.
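
    For concreteness, here’s roughly what the reverse-proxy half of that list amounts to, assuming Caddy (which at least automates certificate issuance and renewal) and a hypothetical domain; DNS and port forwarding still have to be set up separately:

    ```python
    # Sketch: write a minimal Caddy config for Jellyfin (default port
    # 8096) and start the proxy. Caddy fetches and renews Let's Encrypt
    # certificates itself, but only if the domain already resolves to
    # this host and ports 80/443 are reachable.
    import pathlib
    import subprocess

    caddyfile = """\
    jellyfin.example-domain.xyz {
        reverse_proxy localhost:8096
    }
    """
    pathlib.Path("Caddyfile").write_text(caddyfile)

    subprocess.run(["caddy", "run", "--config", "Caddyfile"], check=True)
    ```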

    By contrast, with Plex, how many steps are there?

    1. Install (skipping media library setup, as Jellyfin requires that too, so it’s assumed)
    2. Configure any port settings, open the relevant ports on the firewall, and enable remote access in settings with a tickbox
    3. Set up users
    4. Done. It now works and doesn’t need to be touched; it will handle connecting clients directly to the server. Users just install the Plex client, log in to their account, and they have access.

    By contrast, this still requires the host to set up a reverse proxy (a major hassle if done securely with certificates, plus the expense of a domain, which works out to maybe $5 a year), then have their users point their Jellyfin clients at a domain name (possibly a hard-to-remember one, as majesticstuffbox[.]xyz is a lot cheaper than the .com/.org/.net equivalents or a shorter domain that’s more to the point), authenticate, and so on. It’s many, many more steps, more software, more configuration, and more chances for the hosting party to mess something up.

    My point was that I, and many others, would rather take the $5 a year we’d spend on a domain name and pay it for this kind of turn-key solution for ourselves and our users, even if provided by a third party. If Jellyfin integrated this as an option, it could provide some revenue for them and get the kind of people who don’t want to mess with reverse proxies and certificates into their ecosystem and off Plex.