cm0002 to Technology@lemmy.zip · English · 2 months ago

Microsoft just open-sourced bitnet.cpp, a 1-bit LLM inference framework. It lets you run 100B parameter models on your local CPU without GPUs, with up to 6.17x faster inference and 82.2% less energy use on CPUs. (github.com)

cross-posted to: technology@hexbear.net, technology@lemmy.ml, localllama@sh.itjust.works
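For context on what "1-bit" means here: BitNet b1.58 (the model family bitnet.cpp targets) uses ternary weights in {-1, 0, +1} produced by absmean quantization, which turns matrix multiplies into additions and subtractions. The sketch below illustrates that scheme in NumPy; it is a simplified illustration, not bitnet.cpp's actual kernel code, and the function names are made up for this example.

```python
import numpy as np

def absmean_ternary_quantize(W: np.ndarray):
    """Quantize a weight matrix to {-1, 0, +1} with one per-tensor scale,
    following the absmean scheme described for BitNet b1.58.
    (Illustrative sketch only -- not bitnet.cpp's implementation.)"""
    scale = np.mean(np.abs(W)) + 1e-8          # gamma = mean(|W|)
    Wq = np.clip(np.round(W / scale), -1, 1).astype(np.int8)
    return Wq, scale

def ternary_matmul(x: np.ndarray, Wq: np.ndarray, scale: float):
    # With ternary weights the inner products need no multiplications,
    # only adds/subtracts; here we emulate the result with a float dot
    # product for clarity.
    return (x @ Wq.astype(np.float32)) * scale

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8)).astype(np.float32)
Wq, s = absmean_ternary_quantize(W)
# Every quantized weight is -1, 0, or +1:
assert set(np.unique(Wq).tolist()).issubset({-1, 0, 1})
x = rng.normal(size=(1, 8)).astype(np.float32)
y = ternary_matmul(x, Wq, s)   # approximates x @ W
```

Replacing multiplies with adds over int8 ternary weights is where the claimed CPU speed and energy savings come from.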
scholar@lemmy.world · 2 months ago

No, that would require AI (Actual Intelligence)
Leon@pawb.social · 2 months ago

We already have that. It's like, you put the AI in your brain. You are the AI.