• 0 Posts
  • 5 Comments
Joined 1 year ago
Cake day: July 22nd, 2023

  • I know that people are using P40 and P100 GPUs. These are outdated but still work with some software stacks / applications. The P40 GPU, once very cheap for the amount of VRAM, is no longer as cheap as it was, probably because folks have been picking them up for inference.

    I’m getting a lot done with an Nvidia GTX 1080, which only has 8 GB of VRAM. I can run a quant of Dolphin Mixtral 8x7B and it works well enough. It takes minutes to load, almost too long for me, but after that I get 3-5 tokens per second (TPS) with an acceptable delay between questions.

    I can even run Miqu quants at 2 or 3 bits. It’s super smart even at these low quant levels.

    Llama 3.1 8B runs great on this 8 GB 1080 at Q4_K_M, and also at Q5_K_M or Q6_K_M. I believe I can even run Gemma 9B at 8 bpw (quantized down from the f16 weights).
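    In case it helps anyone with a similar card, here is a minimal sketch of how a setup like this can be loaded with llama-cpp-python and partial GPU offload. The model filename, layer count, and context size are assumptions for illustration; tune n_gpu_layers until the model fits in your VRAM and the remaining layers stay on the CPU.

    ```python
    # Minimal sketch: load a quantized GGUF model with partial GPU offload.
    # Requires llama-cpp-python built with CUDA support.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf",  # hypothetical local path
        n_gpu_layers=24,  # offload only as many layers as ~8 GB of VRAM can hold
        n_ctx=4096,       # context window; larger values need more memory
        verbose=False,
    )

    out = llm("Summarize what Q4_K_M quantization means.", max_tokens=200)
    print(out["choices"][0]["text"])
    ```

    On an 8 GB card you trade n_gpu_layers against context size: more offloaded layers usually means higher TPS, at the cost of leaving less room for the KV cache.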


  • So far, the community can only read the source code. All of it has been provided by a set of internal developers.

    The fact that it is open source means that, if somehow two malware elements have made it into the source code, someone will eventually find and report them. But it doesn’t mean those two malware elements cannot be there right now.

    These two malware hits from the VirusTotal scan should be reported to the developers.