
Ant develops AI models using Chinese chips to lower training costs
April 2, 2025 · Ant Group uses Chinese chips and MoE models to cut AI training costs and reduce reliance on Nvidia. Releases open-source AI models, claiming strong benchmark results with domestic hardware. ... Ant’s latest research paper, published this month, outlines how the company has been working to lower training expenses by not relying on high-end ...
Ant Group proposes a method to further cut the cost of training large models, may open-source it in the future
March 24, 2025 · Experiments show that its 300-billion-parameter MoE (Mixture of Experts) model can be trained efficiently on lower-performance devices using domestic GPUs, with performance on par with dense models and MoE models of the same scale trained entirely on Nvidia chips.
Ant Group uses domestic chips to train AI models and cut costs
6 days ago · The MoE concept is similar to having a team of specialists, each handling part of a task to make the process of producing models more efficient. Ant has declined to comment on its hardware sources. Training MoE models depends on high-performance GPUs, which can be too expensive for smaller companies to acquire or use.
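To make the "team of specialists" analogy above concrete, here is a minimal PyTorch sketch of an MoE layer with top-k routing. It is illustrative only: the expert count, layer sizes, and top-k value are assumptions for the example, not Ant's published configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Toy Mixture-of-Experts layer: a router picks the top-k
    'specialists' (experts) for each token, and only those run."""
    def __init__(self, d_model=64, d_hidden=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )
        # The router scores every expert for every token.
        self.router = nn.Linear(d_model, n_experts)

    def forward(self, x):                      # x: (tokens, d_model)
        scores = self.router(x)                # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)   # normalize chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e       # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

layer = MoELayer()
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```

Because each token touches only top_k of the n_experts networks, total capacity can grow with the number of experts while per-token compute stays roughly fixed, which is the property the reports above credit for making cheaper hardware viable.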
Ant Group Cuts AI Costs by 20% Using Chinese Chips and MoE …
March 24, 2025 · Instead of solely relying on NVIDIA’s powerful (and now export-restricted) H800 GPUs, Ant has been experimenting with chips from Alibaba (its affiliate) and Huawei, applying a technique known as Mixture of Experts (MoE) to train AI models more efficiently.
Ant Group focuses on Chinese chips to strengthen its strategy in ...
6 days ago · Strategic Turning Point in AI Model Training for Ant Group. In recent months, Ant Group has adopted chips supplied by local companies, including entities connected to Alibaba and Huawei Technologies, to train its AI models using the Mixture of Experts (MoE) technique. This approach, increasingly widespread among researchers, allows for effectively dividing tasks among different "experts" ...
Ant Group’s use of China-made GPUs, not Nvidia, cuts AI model …
March 25, 2025 · Ant Group, the fintech affiliate of Alibaba Group Holding, is able to train large language models (LLMs) using locally produced graphics processing units (GPUs), reducing reliance on Nvidia’s...
Jack Ma-backed Ant Group touts AI breakthrough built on …
March 24, 2025 · Ant declined to comment in an e-mailed statement. However, the training of MoE models typically relies on high-performance chips like the graphics processing units (GPUs) that Nvidia sells.
Ant Group’s AI breakthrough: Cuts AI training costs by 20% using ...
March 24, 2025 · “Ant used domestic chips, including from affiliate Alibaba Group Holding Ltd. and Huawei Technologies Co., to train models using the so-called Mixture of Experts machine learning approach, the people said. ... (MoE), which splits AI tasks among different model components and only activates the necessary ones. This lets Ant get more out of ...
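The cost logic behind "only activates the necessary ones" can be shown with back-of-the-envelope arithmetic. The 300-billion-parameter scale appears in the reports above; the expert count and top-k value below are hypothetical, chosen only to illustrate the effect.

```python
# Hypothetical MoE configuration for illustration only; not Ant's
# published architecture. The point: only a fraction of the
# parameters participate in each forward pass.
total_params = 300e9   # 300B-parameter MoE model (reported scale)
n_experts    = 64      # assumed number of experts
top_k        = 2       # assumed experts activated per token

# If expert weights dominate the parameter count, roughly
# top_k / n_experts of them are active for any given token.
active_fraction = top_k / n_experts
active_params   = total_params * active_fraction
print(f"active params per token: ~{active_params/1e9:.0f}B "
      f"({active_fraction:.1%} of {total_params/1e9:.0f}B)")
# -> active params per token: ~9B (3.1% of 300B)
```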
Jack Ma-backed Ant Group claims AI breakthrough built on Chinese-made chips
March 24, 2025 · Ant Group revealed that it has developed new techniques for training AI models using Chinese-made semiconductors from Alibaba and Huawei. The models, trained with the Mixture of Experts (MoE) machine-learning approach, achieve results comparable to those from Nvidia H800 chips at a cost at least 20% lower.
Ant-Multi-Modal-Framework/README.md at main · alipay/Ant ... - GitHub
This repository contains multimodal research code from Ant Group's multimodal cognition team, integrated into AntMMF. The AntMMF multimodal framework packages standard multimodal functionality, including dataset management, data processing, training pipelines, models, and modules, and supports custom extensions of these components. 2024.04: Pink, a multimodal large model with enhanced referring-comprehension ability, was accepted to CVPR 2024; the paper's code is open-sourced: Pink. 2023.12: Open-sourced code for the following papers: SNP-S3, DMAE, and CNVid-3.5M. 2023.06: SNP-S3 was accepted by IEEE T-CSVT (Transactions on Circuits and Systems for Video Technology) 2023. …