• 4 Posts
  • 35 Comments
Joined 1 year ago
Cake day: June 8th, 2023

  • Sure! You’ll probably want to look at train-text-from-scratch in the llama.cpp project, it runs on pure CPU. The docs are admittedly sparse, but they should help, and otherwise ChatGPT is a good help if you show it the code. NanoGPT is fine too.

    For a dataset, maybe you could train on French Wikipedia, or scrape a French story or fan-fiction site. Wikipedia is probably easiest, since they provide downloadable offline versions that are only a couple of gigs; a rough sketch of that step is below.
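
    Something like this minimal sketch should work for the Wikipedia route. It assumes the Hugging Face datasets package and its pre-built "20220301.fr" snapshot (both my assumptions, swap in whatever dump you actually grab) and just writes the article text into one plain-text file that train-text-from-scratch or NanoGPT can read:

    ```python
    # Sketch: dump French Wikipedia into a single plain-text training file.
    # Assumes the Hugging Face `datasets` package and its pre-built
    # "20220301.fr" snapshot; adjust to whatever dump/config you actually use.
    from datasets import load_dataset

    wiki = load_dataset("wikipedia", "20220301.fr", split="train")

    with open("fr_wiki.txt", "w", encoding="utf-8") as f:
        for article in wiki:
            text = article["text"].strip()
            if text:                    # skip empty pages
                f.write(text + "\n\n")  # blank line between articles
    ```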

  • SkySyrup@sh.itjust.works to Linux@lemmy.ml · What's your preferred DE?
    11 months ago

    I use GNOME as my primary; it feels really polished and doesn’t break or crash. Very modern, but if you want a super-customized experience, you’re gonna have a bad time. Extensions break every update and so do themes, so you either wait for the dev to port them or do it yourself. Annoying, so I only use vanilla for now.

    Maybe I’ll try Plasma, it looks cool.


  • Was messing around with this a bit and it’s really quite good! It doesn’t work in text-generation-webui for some reason (probably end tokens, but I don’t know) and I had to use llama.cpp for inference, but the German it produces is pretty good! Sometimes, even with the correct prompt, it veers off course and starts doing something completely different though ._. Roughly how I ran it is sketched below.
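
    If it helps anyone, this is a minimal sketch of how I ran it, assuming the llama-cpp-python bindings; the GGUF filename and prompt are made up, and the stop string is just a guess at the end-token issue mentioned above.

    ```python
    # Sketch: run the model through llama.cpp's Python bindings instead of
    # text-generation-webui. Model filename, prompt and stop string are
    # assumptions, not taken from the model itself.
    from llama_cpp import Llama

    llm = Llama(model_path="german-model.Q4_K_M.gguf", n_ctx=2048)

    out = llm(
        "Schreibe eine kurze Geschichte über einen Fuchs im Wald.",
        max_tokens=256,
        stop=["</s>"],     # guard against the end-token weirdness
        temperature=0.7,
    )
    print(out["choices"][0]["text"])
    ```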