Foxconn Launches FoxBrain LLM Based on Llama 3.1

News summary

Foxconn, the leading electronics contract manufacturer, has launched its first large language model (LLM), 'FoxBrain', to enhance manufacturing efficiency and supply chain management. Built on Meta's Llama 3.1 architecture and trained in four weeks on 120 Nvidia H100 GPUs, FoxBrain is Taiwan's first LLM optimized for Traditional Chinese and Taiwanese language styles. Although its performance slightly trails China's DeepSeek, FoxBrain is close to world-class standards, and Foxconn plans to open-source the model to promote AI adoption in manufacturing. The initiative underscores Foxconn's strategic shift toward AI, driven by the need for new revenue streams and the competitive AI landscape in China. Nvidia supported the development through its Taiwan-based supercomputer, 'Taipei-1', and further details about FoxBrain will be shared at Nvidia's upcoming GTC developer conference.

Story Coverage

Total News Sources: 3 (Left: 1, Center: 2, Right: 0, Unrated: 0)
Bias Distribution: 67% Center, 33% Left
Last Updated: 22 days ago