Are you saying that you’re running an Nvidia/AMD multi-GPU system, and they can work together during inference? So, your LLM-relevant VRAM is 10 GB (3080) + 20 GB (7900 XT)?
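A minimal sketch of what that kind of pooling might look like with llama-cpp-python, assuming a build with the Vulkan backend, a placeholder GGUF path, and a hypothetical 10/20 split — a sketch, not a definitive setup:

```python
# Sketch: pooling VRAM across two GPUs via llama-cpp-python.
# Assumes the package was built with the Vulkan backend
# (e.g. CMAKE_ARGS="-DGGML_VULKAN=on" pip install llama-cpp-python)
# and that both cards show up as Vulkan devices 0 and 1.
from llama_cpp import Llama

llm = Llama(
    model_path="models/example-model.Q4_K_M.gguf",  # hypothetical path
    n_gpu_layers=-1,        # offload every layer to the GPUs
    tensor_split=[10, 20],  # split roughly 1:2, matching ~10 GB vs ~20 GB of VRAM
)

out = llm("Q: Why pool VRAM across mismatched GPUs? A:", max_tokens=64)
print(out["choices"][0]["text"])
```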
I’ve been curious about Scorn for a while, if it’s available. Regardless, thanks for doing this!
clothes@lemmy.world to Selfhosted@lemmy.world • “what’s your experience with paperless?” (1 year ago): What scanners do people recommend?
Vermillion VR, if it’s available! I’ve been curious about it for a while. This is really nice of you, thanks!
Edit: Sent a PM.
clothes@lemmy.world to Selfhosted@lemmy.world • “Ethical cloud (VPS) provider recommendation” (2 years ago): It’s worth pointing out that they’re now a publicly traded company.
clothes@lemmy.world to Linux@lemmy.ml • “With Firefox on X11, any page can pastejack you anytime” (2 years ago): Wow, you weren’t kidding! Makes me think it’s a sketchy add-on, even if it’s not.
I can relate to this, so I’ll just add that aligning my diet with my values was the best decision I’ve ever made. Being able to eat without feeling guilty or confused, and without it being complicated, was life-changing. I didn’t do it all at once, or torture myself.
And we live in an amazing time when you can be lazy AND get delicious, not-morally-horrible food! Typing this as I munch on an entire package of addictive store-bought, animal-free chocolate chip cookies :P
Wow, I had no idea! Nor did I know that Vulkan performs so well. I’ll have to read more, because this could really simplify my planned build.
Count me as someone who would be interested in a post!