Stamets to People Twitter@sh.itjust.works • 2 years ago — "The dream" (image post on lemmy.world)
@CeeBee@lemmy.world • 2 years ago

> I don't know of an LLM that works decently on personal hardware

Ollama with ollama-webui. Models like solar-10.7b and mistral-7b work nicely on local hardware. Solar 10.7b should run well on a card with 8 GB of VRAM.
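For anyone who wants to try this, here's a minimal sketch of querying a locally running Ollama server from Python over its REST API. It assumes Ollama is installed and serving on its default port, and that a model such as mistral has already been pulled (e.g. `ollama pull mistral`); the model name is just an example.

```python
# Minimal sketch: send a prompt to a local Ollama server via its REST API.
# Assumes Ollama is running locally and the model has already been pulled.
import requests

def ask(prompt: str, model: str = "mistral") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    # Non-streaming responses return the full completion in the "response" field
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask("Briefly, what does quantization do to a 7B model?"))
```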
@ParetoOptimalDev@lemmy.today • 2 years ago

If you have really low specs, use the recently open-sourced Microsoft Phi model.