MyNameIsFred

  • 2 Posts
  • 28 Comments
Joined 1 year ago
Cake day: June 8th, 2023




  • I know this thread is a tad old, but I’ve been considering a Framework for the family laptop for a while. Problem is stock: this recent Framework 16 run was pre-order batches, and the 13 had pre-order runs too. I won’t pay for a unit months in advance — it weakens certain buyer protections against the merchant (like recourse for failure to deliver).

    Instead I got a ThinkPad T480 for like 400 bucks. It will do fine as a laptop mostly used for Chrome, paying bills, Zoom calls, etc.





  • My trait is I think cars are too digital and should be analogue. Giant touch panels are distracting and generally have bad UI design. You can control an A/C with three dials (four if you have zones) and never need to look down. Pinnacle of engineering.

    I will never own a car that has features behind a paywall or that I can’t directly control. Computer cars are fine as long as I have root.





  • As per most tech things, though, I don’t think there’s a good end-to-end guide out there (lots of piecemeal ones, though) and having good research skills and being able to fill in the gaps in guides yourself is pretty important.

    Yeah for sure. For most non-techy folks, using one of the *arr setups or even Plex has a pretty steep learning curve.

    It’s why Netflix will continue to gain subs.

    I think what’s missing from this article is that they’ve had a show or two lately that were solid, e.g. The Diplomat, and that will drive up subs. But I’m not sure it has staying power. Folks will flip back to something else when another service drops something good.


  • What they don’t explain is that you need two accounts (or more) for these to work.

    A Usenet account.

    An indexer account that is basically a search engine.

    You also need a download app like NZBGet. And of course you set up an account on that and plug it into Sonarr.

    And an account for the NAS or storage if it’s not local.

    Sonarr searches the indexer, finds the files, talks to NZBGet and says “download that shit for me and put it together”. NZBGet uses the Usenet account to fetch the stuff, assembles the parts and tells Sonarr it’s done. Sonarr then renames the file and puts it on your NAS.

    It’s admittedly fairly abstract, even for someone seasoned in systems admin work.
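    The hand-off described above can be sketched roughly like this. All names here are illustrative stand-ins, not the real Sonarr or NZBGet APIs — they just mirror the steps:

    ```python
    # Illustrative sketch of the Sonarr -> indexer -> NZBGet -> NAS flow.
    # None of these functions are real APIs; each one stands in for a step.

    def search_indexer(title: str) -> str:
        """Sonarr queries the indexer (the 'search engine' account) for an NZB."""
        return f"{title}.nzb"

    def fetch_from_usenet(nzb: str) -> list[str]:
        """NZBGet uses the Usenet account to pull the article parts."""
        return [f"{nzb}.part{i}" for i in range(1, 4)]

    def assemble(parts: list[str]) -> str:
        """NZBGet joins the parts and reports completion back to Sonarr."""
        return "download.bin"

    def rename_and_store(file: str, title: str, nas: str) -> str:
        """Sonarr renames the finished file and moves it to the NAS."""
        return f"{nas}/{title}.mkv"

    title = "Some.Show.S01E01"
    parts = fetch_from_usenet(search_indexer(title))
    final_path = rename_and_store(assemble(parts), title, "//nas/tv")
    print(final_path)  # //nas/tv/Some.Show.S01E01.mkv
    ```

    Point being: Sonarr never downloads anything itself — it only coordinates the indexer and the download client, then tidies up the result.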




  • I have had issues with it over the years. Many services will blacklist an entire CIDR range for a single bad actor. I get this on my Linodes frequently if I proxy traffic through them, e.g. tons of captchas on Google/YouTube.

    When I ran my own mail it was similar: often spending time getting IPs off RBLs and the like because some other node on my subnet was malicious.

    In the end, I just moved my email over to workspace. Not ideal. But it works.

    One thing I did notice was that as soon as I registered my domain in Workspace (before I had even set up MX records or begun moving mail), a lot of issues with Google immediately stopped, and likewise with Office.com. I actually ran this way for a while, but then Google axed the free accounts and I just moved my stuff to them and pay.

    Maybe because I use a gTLD? I dunno. But it was a headache.
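    For anyone curious what “getting IPs off RBLs” involves: the first step is checking whether you’re listed, and a DNSBL lookup is just a DNS query with the IPv4 octets reversed and the blocklist zone appended. A minimal sketch (zone name is Spamhaus’s public ZEN list; the helper names are my own):

    ```python
    import socket

    def dnsbl_query(ip: str, zone: str = "zen.spamhaus.org") -> str:
        """Build the DNSBL query name: reverse the IPv4 octets, append the zone.
        e.g. 203.0.113.7 -> 7.113.0.203.zen.spamhaus.org"""
        return ".".join(reversed(ip.split("."))) + "." + zone

    def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
        """True if the IP resolves inside the DNSBL zone (i.e. it is listed).
        An NXDOMAIN (lookup failure) means the IP is not listed."""
        try:
            socket.gethostbyname(dnsbl_query(ip, zone))
            return True   # any A record back means listed
        except socket.gaierror:
            return False  # not listed

    print(dnsbl_query("203.0.113.7"))  # 7.113.0.203.zen.spamhaus.org
    ```

    Delisting itself is per-RBL (each has its own removal form), which is exactly the time sink I’m describing.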



  • Couldn’t really tell you. I haven’t done any moderation in many years, and have no knowledge of how their DB system or backups are structured. But make no mistake, Reddit has admin rights and the ability to take over any sub if they don’t feel the mods are doing a good job, and there’s precedent for such action — whether due to a mod abusing their powers and shuttering a sub, not engaging at all, or just going AFK and abandoning a sub.

    I don’t think mods can outright delete comments or submissions, only hide them. Only a user can overwrite and delete their own comments. So unless basically all users started scrubbing comments, it would be hard, I would guess. And I wouldn’t be shocked if they had replicas or DB backups at the page/submission level to roll back from, to protect against exactly that. Heck, Pushshift was doing roughly that. I really detested that guy for how he handled privacy — he even had people sign up for exemptions and then straight-up ignored specific requests to have their user pages excluded from crawling.