cm0002 to Technology@lemmy.zip · English · 2 months ago

Microsoft just open-sourced bitnet.cpp, a 1-bit LLM inference framework. It lets you run 100B-parameter models on your local CPU without GPUs, with up to 6.17x faster inference and 82.2% less energy on CPUs.

github.com

cross-posted to: technology@hexbear.net, technology@lemmy.ml, localllama@sh.itjust.works
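For context on what "1-bit" means here: bitnet.cpp targets BitNet b1.58-style models, whose weights are quantized to the ternary values {-1, 0, 1}, so matrix multiplies reduce to additions and subtractions. A minimal NumPy sketch of the absmean ternary quantization described in the BitNet b1.58 paper (this is an illustration of the idea, not bitnet.cpp's actual C++ kernels):

```python
import numpy as np

def absmean_ternary_quantize(W, eps=1e-5):
    """Quantize a weight matrix to {-1, 0, 1} via absmean scaling.

    Each weight is divided by the mean absolute value of the matrix,
    rounded, and clipped to the ternary set. The returned scale lets
    you approximately reconstruct W as Wq * scale.
    """
    scale = np.mean(np.abs(W)) + eps
    Wq = np.clip(np.round(W / scale), -1, 1).astype(np.int8)
    return Wq, scale

# Example: quantize a small random weight matrix.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4)).astype(np.float32)
Wq, scale = absmean_ternary_quantize(W)
print(np.unique(Wq))   # only values from {-1, 0, 1}
```

Because every quantized weight is -1, 0, or 1, a dot product against `Wq` needs no multiplications at all, which is where the CPU speed and energy savings come from.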
Microslop*