Liquid AI Unveils Revolutionary Non-Transformer Models with Superior Efficiency

The new Liquid Foundation Models promise state-of-the-art performance with reduced memory usage, challenging existing AI architectures.

  • Liquid AI, a startup founded by MIT researchers, has introduced Liquid Foundation Models (LFMs), which the company reports outperform comparably sized transformer-based AI models.
  • The LFMs are available in three sizes: 1.3B, 3.1B, and 40.3B parameters, with the largest being a Mixture of Experts model.
  • The models have a markedly smaller memory footprint, making them more cost- and energy-efficient than competitors such as Meta's Llama and Microsoft's Phi models.
  • Liquid AI says its architecture is designed from first principles around custom computational units, with the goal of maximizing knowledge capacity and reasoning ability.
  • The models are currently accessible through Liquid's platforms and are optimized for hardware from Nvidia, AMD, Qualcomm, and Apple.