☆ Yσɠƚԋσʂ ☆@lemmy.ml to Technology@lemmy.ml · English · 3 months ago

Microsoft just open-sourced bitnet.cpp, a 1-bit LLM inference framework. It lets you run 100B-parameter models on your local CPU without GPUs, with up to 6.17x faster inference and 82.2% less energy use on CPUs.

Link: github.com

cross-posted to: technology@hexbear.net