According to Foresight News, Jack Clark, co-founder of Anthropic and former policy director at OpenAI, highlighted the significance of decentralized training in his weekly AI newsletter, Import AI. He emphasized that decentralized training enhances data privacy and system robustness by distributing learning across multiple nodes. Citing a study by Epoch AI, Clark noted that the computational scale of decentralized training is growing roughly 20-fold per year, significantly outpacing the roughly fivefold annual growth of cutting-edge centralized training. Although decentralized training currently operates at about 1/1,000th the scale of advanced centralized methods, it remains technically feasible and could eventually enable broader collaborative efforts to develop more powerful models.
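As a back-of-the-envelope extrapolation (ours, not a figure from the newsletter or the Epoch AI study): if both growth rates were to persist, decentralized training would gain a factor of 20/5 = 4 on the frontier each year, so a 1,000x gap would close in roughly five years:

\[
t = \frac{\log 1000}{\log(20/5)} = \frac{3}{\log_{10} 4} \approx 5 \ \text{years}
\]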
source: https://www.binance.com/en/square/post/34754569562226?utm_source=BinanceNewsRSS