Foxconn Introduces FoxBrain AI Model, Trained in Just Four Weeks

The Taiwan-based electronics giant aims to improve manufacturing and supply chain efficiency with its new AI model, built on Meta's Llama 3.1 architecture and optimized for Traditional Chinese.

  • Foxconn's FoxBrain is a large language model capable of tasks like data analysis, code generation, reasoning, and mathematical calculations.
  • The model was trained in about four weeks on 120 Nvidia H100 GPUs, with Nvidia providing technical consulting and access to its Taiwan-based Taipei-1 supercomputer.
  • FoxBrain is based on Meta's Llama 3.1 architecture and is optimized for Traditional Chinese and Taiwanese language styles.
  • Initially designed for internal applications, the model is slated to be open-sourced by Foxconn to foster collaboration and advance AI-driven manufacturing and supply chain management.
  • Foxconn's move into AI reflects its diversification strategy beyond electronics manufacturing, addressing profitability challenges in its core business.