Foxconn Introduces FoxBrain AI Model, Trained in Just Four Weeks

The Taiwan-based electronics giant aims to enhance manufacturing and supply chain efficiency with its new AI model, built on Meta's Llama 3.1 framework and optimized for traditional Chinese.

The logo of Foxconn is seen outside the company's building in Taipei, Taiwan November 10, 2022. REUTERS/Ann Wang/File Photo
An Nvidia GPU is seen inside a computer server displayed at Foxconn’s annual tech day in Taipei, October 8, 2024. Photo: Reuters

Overview

  • Foxconn's FoxBrain is a large language model capable of tasks like data analysis, code generation, reasoning, and mathematical calculations.
  • The model was trained in about four weeks on 120 Nvidia H100 GPUs, with technical support from Nvidia provided through its Taiwan-based supercomputer, Taipei-1.
  • FoxBrain is based on Meta's Llama 3.1 architecture and is optimized for traditional Chinese and Taiwanese language styles.
  • The model was initially designed for internal applications, but Foxconn plans to open-source it to foster collaboration and advance AI-driven manufacturing and supply chain management.
  • Foxconn's move into AI reflects its diversification strategy beyond electronics manufacturing, addressing profitability challenges in its core business.