• 0 Posts
  • 25 Comments
Joined 1 year ago
Cake day: July 10th, 2023


  • cmhe@lemmy.world to Linux@lemmy.ml · Recommend me a scripting language · 1 month ago

    What about Lua/LuaJIT?

    In most scripting languages you have the interpreter binary and the (standard) libraries as separate files. But creating self-extracting executables that clean up after themselves can easily be done by wrapping them in a shell script.
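
    For example, a minimal self-extracting wrapper could look roughly like this (purely illustrative; the luajit path, file names and the archive marker are assumptions, not taken from any particular project):

        #!/bin/sh
        # Sketch of a self-extracting wrapper: a tar.gz containing bin/luajit and
        # main.lua is appended to this file after the __ARCHIVE__ marker.
        set -eu
        workdir="$(mktemp -d)"
        trap 'rm -rf "$workdir"' EXIT      # clean up after ourselves on exit
        # everything after the marker line is the embedded archive
        sed '1,/^__ARCHIVE__$/d' "$0" | tar -xz -C "$workdir"
        "$workdir/bin/luajit" "$workdir/main.lua" "$@"
        exit $?
        __ARCHIVE__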

    IMO, if low dependencies and small size are really important, you could also just write your script in a low-level compiled language (C, Rust, Zig, …), link it statically (e.g. with musl) and execute that.
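
    A rough sketch of what that could look like (myscript.c is a placeholder; exact package and target names depend on the distro and toolchain):

        # C, statically linked against musl via the musl-gcc wrapper
        musl-gcc -static -Os -o myscript myscript.c

        # Rust, built for the static musl target
        rustup target add x86_64-unknown-linux-musl
        cargo build --release --target x86_64-unknown-linux-musl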


  • I started using Fedora Silverblue on a tablet, and it seems to work fine so far. Requiring a reboot in order to install new system packages is a bit cumbersome, though, and the process itself takes a while; then again, ordinary Fedora also doesn’t win any races when asked to install a new package.

    I think switching to FCOS or Flatcar on servers that just run containers makes sense, since it lessens the burden of administrating the base system itself. Using Butane/Ignition might be unusual at first, but it allows putting the base system configuration into a git repo and makes initial provisioning with Ansible or similar tools unnecessary. The rest of the system and services can be managed via Portainer or similar software.
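
    A hedged sketch of that workflow (repo URL and file names are made up): the Butane config lives in git and is transpiled into Ignition JSON before provisioning.

        git clone https://example.com/infra/fcos-config.git
        cd fcos-config
        # transpile the human-readable Butane config into machine-readable Ignition
        butane --pretty --strict server.bu > server.ign
        # server.ign is then handed to the FCOS installer or hypervisor on first boot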

    I also do not have long-term experience with FCOS, but the advertised features of auto-updates, rolling releases, and a focus on security and stability make it a good fit for container servers, IMO.

    An alternative to Debian on servers might also be Alpine Linux, which has more of a focus on network devices, but some people use it on the desktop as well.

    If you have many different systems and just want to learn one way to operate them all, NixOS might be interesting. Using flakes, you can configure multiple machines from just one repo and share configuration between them. But getting up to speed on NixOS might not be so easy; it has a steep learning curve.
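
    As a rough illustration (repo URL and hostname are placeholders), assuming the flake defines one nixosConfiguration per machine:

        git clone https://example.com/infra/nixos-config.git
        cd nixos-config
        # build and activate this machine's configuration from the shared flake
        sudo nixos-rebuild switch --flake .#server01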


  • cmhe@lemmy.world to Linux@lemmy.ml · Coreboot: Pros and Cons · 2 months ago (edited)

    So generally the pro of coreboot is that it is open source, but the con is that it is open source.

    What I mean by that: you can fix any issues yourself; however, if you are unable to do so, you have to wait until someone does it for you, and which features are available and stable is often hit and miss.

    With proprietary BIOSes, the company has some kind of standardized process for developing the BIOS, so you often get what you would expect. However, if the money flow from the PC vendor to the BIOS vendor dries up, you, or the community of owners, will not be able to fix any issues.

    Linux support should be the same regardless of whether you choose a proprietary or open source BIOS. But that depends on how well coreboot was ported to the platform, so officially supported coreboot BIOSes are likely better than others.

    Personally, if all other attributes are equal, I would go with coreboot, because I like to support vendors that offer that choice, and IMO an open source solution that you can review and build yourself is intrinsically more secure than a binary blob where you have to blindly trust some corporation. But other security-minded people might disagree, which is fine.



  • Depends a bit on what the default cloning URL will be. If the domain is under Mozilla’s control and forwards to GitHub, then fine; if most people start using the GitHub URL directly, then it is still vendor lock-in, because many people and projects will use it, and that is not so easy to move away from.

    Update: To the people down-voting my comment, I would love to hear why you either disagree with me or find that my contribution to this discussion is worthless.

    The upstream URL of a project or repo is important because it will be used in other projects, e.g. in build scripts for fetching the sources. If a project changes that URL in the future and the old URL is no longer available/functional, all those scripts need to be changed, and old versions of those scripts no longer work out of the box.
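
    A typical fetch step in a build script looks roughly like this (URL and version are placeholders); if the upstream URL stops working, every pinned copy of such a script breaks:

        version=1.2.3
        curl -fL -o "project-${version}.tar.gz" \
            "https://git.example.org/project/archive/v${version}.tar.gz"
        tar -xzf "project-${version}.tar.gz"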

    If the project owns the URL, then it can add redirect rules, which might help alleviate some of these issues. I don’t think GitHub allows projects that move away from it to do that, so this is a sort of vendor lock-in: the project needs to keep maintaining the repo on GitHub, because it wants to break the internet as little as possible.




  • I only play single-player games, but I couldn’t care less about achievements. It is all about exploration, story, game mechanics and modding for me.

    People treat achievements as if they are a status symbol. I mean sure, if you don’t know what else to do in a game, they can give you some goal, but IMO the game itself should encourage you to reach the goal, not some external badge. The experience of doing the task should be the reward in and of itself.


  • It’s not the drama itself that should influence your judgment, but how they deal with it.

    Whenever people work together on something, there will be some drama, but if they are dealing with it, then that should be fine.

    Nix and NixOS are big enough that even if the project fails, there are enough other people who will continue it, maybe under a different name.

    Even if that causes a hard fork, which I currently think is unlikely, there are many examples where that worked and resolved itself over time without too much of a burden on the users, meaning there were clear migration processes available: ownCloud/Nextcloud, Gogs/Gitea/Forgejo, Redis/Valkey, …


  • I like RPGs; however, I don’t like it when the company has the ability and incentive to bait-and-switch my game into a worse version after I bought it.

    Denuvo forces me to be connected to the internet, which makes playing the game on the move difficult or even impossible. It also allows them to make sure that only the most current version is played. MTX means they don’t have incentives to fix the game; instead they can sell you the fixes, or even enshittify it, to squeeze out more money.

    This gives me an incentive to wait a couple of years until the game doesn’t receive any updates anymore, then decide if the final product is worth it, and hope that I get a good experience out of it before the Denuvo activation servers are shut down.

    So you have to wait for a few years, in order to know if the gameplay is (and stays) any good.




  • What kind of comparison is that? sudo is setuid, while Firefox and its extensions run as the user who started them.

    Also sudo has just one very specific and limited use case, while Firefox is more of a platform for web content. I could argue that sudo itself is an ‘extension’ to a Linux system, like every application.

    You also don’t have to install all of those extensions; you can choose which ones you trust, similar to a Linux system, where you don’t have to install every application in the repository.

    If you say the Firefox add-on repo should be managed more like a Linux distribution’s repository, where developers cannot simply upload their own software but need to find a trusted maintainer first, I could agree with that. But that would mean more work and overhead.


  • Snap is just one case where Ubuntu is annoying.

    It is also a commercial distribution. If you have ever used a community distribution like Arch, Gentoo or even Debian, you will notice that they encourage participation much more. You can contribute your ideas and work without being required to sign any CLAs.

    Because Ubuntu wants to control/own parts of the system, they tend to create their own, often subpar, software that requires CLAs, rather than contributing to existing solutions. See Upstart vs OpenRC (and later systemd), Mir vs Wayland (both of which they later adopted anyway), Unity vs GNOME, snap vs Flatpak, MicroK8s vs k3s, Bazaar vs git or Mercurial, … The NIH syndrome is pretty strong in Ubuntu. And even where Ubuntu came first with some of these solutions, the community had to create the alternative because Ubuntu was controlling them.


  • cmhe@lemmy.world to Selfhosted@lemmy.world · Uncommon Syncthing usecases · 11 months ago (edited)

    I mod my games on my PC and sync them to my Steam Deck. I also sync the save files back and forth to continue playing on different devices. Mostly non-Steam games.

    I also sync my eBook collection to my eink reader with Syncthing.

    Everything is also mirrored to my always-on NAS, so syncing always works.