News

The BitNet b1.58 2B4T model was developed by Microsoft's General Artificial Intelligence group and contains two billion ...
Microsoft's BitNet challenges industry norms with a minimalist approach using ternary weights that require just 400MB of ...
U.S. tech giant Microsoft’s research wing has come up with a new AI model that can function on a CPU instead of a GPU. This ...
Memory requirements are the most obvious advantage of reducing the complexity of a model's internal weights. The BitNet b1.58 ...
Microsoft researchers have developed — and released — a hyper-efficient AI model that can run on CPUs, including Apple's M2.
Microsoft’s model BitNet b1.58 2B4T is available on Hugging Face but doesn’t run on GPUs and requires a proprietary framework.
Microsoft’s new BitNet b1.58 model significantly reduces memory and energy requirements while matching the capabilities of ...
Microsoft put BitNet b1.58 2B4T on Hugging Face, a collaboration platform for the AI community. “We introduce BitNet b1.58 2B4T, the first open-source, native 1-bit Large Language Model (LLM ...
Microsoft’s General Artificial Intelligence group has introduced a groundbreaking large language model (LLM) that drastically ...
Microsoft released what it describes as the most expansive 1-bit AI model to date, BitNet b1.58 2B4T. Unlike traditional ...
Microsoft Research has introduced BitNet b1.58 2B4T, a new 2-billion parameter language model that uses only 1.58 bits per ...
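The figures in the snippets above fit together: a ternary weight takes one of three values {-1, 0, +1}, which carries log2(3) ≈ 1.58 bits of information, and 2 billion such weights come to roughly 400MB. The sketch below illustrates this arithmetic plus a simple absmean ternary quantizer; the function name and helper details are illustrative, not Microsoft's actual implementation.

```python
import math

def ternary_quantize(weights, eps=1e-8):
    # Absmean quantization of the kind described for BitNet b1.58:
    # scale each weight by the mean absolute weight, then round and
    # clip to the ternary set {-1, 0, +1}.
    scale = sum(abs(w) for w in weights) / len(weights) + eps
    quantized = [max(-1, min(1, round(w / scale))) for w in weights]
    return quantized, scale

# Each ternary weight carries log2(3) ≈ 1.58 bits, hence "b1.58".
BITS_PER_WEIGHT = math.log2(3)

# Back-of-the-envelope storage for a 2-billion-parameter model:
# 2e9 weights * ~1.58 bits / 8 bits-per-byte ≈ 396MB, consistent
# with the ~400MB figure reported above.
approx_megabytes = 2e9 * BITS_PER_WEIGHT / 8 / 1e6
```

Note that this estimate covers weight storage only; activations, embeddings, and runtime buffers add further memory on top.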