Tether releases cross-platform BitNet LoRA framework, enabling fine-tuning of billion-parameter models on consumer-grade devices.
March 21, 2026, 01:50
CoinFeed News
CoinFeed reported on March 21 that Tether has launched a cross-platform BitNet LoRA fine-tuning framework in QVAC Fabric, optimized for training and inference with Microsoft's BitNet (1-bit LLM) models. By sharply reducing compute and memory requirements, the framework allows billion-parameter models to be trained and fine-tuned on laptops, consumer GPUs, and smartphones. It is reportedly the first to fine-tune BitNet models on mobile GPUs, including Adreno, Mali, and Apple Bionic. In tests, a 125M-parameter model was fine-tuned in roughly 10 minutes and a 1B model in about an hour, with the approach said to scale to 13B-parameter models on mobile devices.
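The efficiency gain comes from combining two ideas the article names: low-bit (BitNet-style) frozen base weights and LoRA, which trains only small low-rank adapter matrices instead of the full weight matrix. The sketch below illustrates that combination in plain NumPy; it is a generic LoRA forward pass over a ternary-quantized weight, not Tether's QVAC Fabric code, and the layer sizes, rank, and scale values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, r = 512, 512, 8  # hypothetical layer width and LoRA rank

# Frozen base weight, quantized to {-1, 0, +1} to mimic BitNet-style
# low-bit weights (illustrative only; BitNet's actual scheme differs).
W = np.sign(rng.standard_normal((d_out, d_in))).astype(np.int8)
scale = 0.02  # per-tensor scale applied at dequantization

# Trainable low-rank adapters: only A and B would receive gradients.
A = (rng.standard_normal((r, d_in)) * 0.01).astype(np.float32)
B = np.zeros((d_out, r), dtype=np.float32)  # zero-init: adapter starts as a no-op
alpha = 16.0  # LoRA scaling hyperparameter

def lora_forward(x):
    """y = (scale * W) x + (alpha / r) * B (A x)."""
    base = scale * (W.astype(np.float32) @ x)
    return base + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in).astype(np.float32)
y = lora_forward(x)

full_params = d_out * d_in           # parameters a full fine-tune would update
lora_params = r * (d_in + d_out)     # parameters LoRA updates
print(y.shape, lora_params / full_params)
```

At rank 8 the adapters hold about 3% of the layer's parameters, which is why adapter training fits in the memory budget of a phone or laptop while the quantized base weights stay frozen and cheap to store.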