Distributed parallel training technology for large-scale model with heterogeneous computing resources
Correspondence | Updated: 2025-11-25

    • Journal on Communications, Vol. 46, Issue 10, Pages: 309-325 (2025)
    • DOI: 10.11959/j.issn.1000-436x.2025167
    • CLC: TP391
    • Received: 30 May 2025
    • Revised: 21 September 2025
    • Accepted: 23 September 2025
    • Published: 20 October 2025


  • HUANG Lei, WANG Sheng, BAN Yourong, et al. Distributed parallel training technology for large-scale model with heterogeneous computing resources[J]. Journal on Communications, 2025, 46(10): 309-325. DOI: 10.11959/j.issn.1000-436x.2025167.

Views: 0 · Downloads: 1077 · CSCD citations: 0


Related Articles

Multicast time-expanded graph-based collective communication scheduling algorithm
Large-scale post-disaster user distributed coverage optimization based on multi-agent reinforcement learning

Related Author

HUANG Zixiao
XU Si
DI Xinkai
WANG Sheng
LI Hongyan
LIU Fei
WANG Peng

Related Institution

State Key Laboratory of Integrated Services Networks, Xidian University
Singapore University of Technology and Design
School of Artificial Intelligence, Beijing University of Posts and Telecommunications
Lab of BLOS Trusted Information Transmission, Chongqing University of Posts and Telecommunications
School of Information and Communication Engineering, Beijing University of Posts and Telecommunications