Decentralized Computing Networks for AI Training: The Future of ML?
How to merge two buzzwords into something that might actually make sense.
Training large machine learning models is computationally intensive. Decentralized networks of compute resources could share the load and accelerate training.
- “AI” requires compute time for training (GPT, etc.)
- Soon, using any “AI” service may require sharing your idle computing power to help improve that “AI.”
- Contributions could be tokenized via crypto: the more compute you contribute, the more “AI” usage you earn.
Could the use of tokens or cryptocurrency incentivize participants to contribute their idle computing power to facilitate the process?
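One way to picture such an incentive scheme is a simple ledger that credits participants with tokens for compute time donated and debits tokens when they use the service. The sketch below is purely illustrative: the class name, the `credit`/`spend` methods, and the exchange rate are all invented assumptions, not any real protocol.

```python
class ComputeLedger:
    """Toy ledger: earn tokens by donating compute, spend them on AI usage.
    Everything here (names, rates) is a hypothetical illustration."""

    TOKENS_PER_GPU_HOUR = 10  # assumed exchange rate, chosen arbitrarily

    def __init__(self):
        self.balances = {}

    def credit(self, user, gpu_hours):
        """Credit a user with tokens for donated compute time."""
        earned = gpu_hours * self.TOKENS_PER_GPU_HOUR
        self.balances[user] = self.balances.get(user, 0) + earned
        return earned

    def spend(self, user, tokens):
        """Spend tokens on AI usage; succeeds only if the balance covers it."""
        if self.balances.get(user, 0) < tokens:
            return False
        self.balances[user] -= tokens
        return True


ledger = ComputeLedger()
ledger.credit("alice", 2.5)       # 2.5 GPU-hours donated -> 25 tokens
print(ledger.spend("alice", 10))  # True: balance covers the request
print(ledger.balances["alice"])   # 15.0 tokens remaining
```

A real system would of course need verifiable proof that the compute was actually performed, which is where blockchain-style consensus is usually proposed.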
“Atkinson-dithered AI image,” prompt on Stable Diffusion 2.1-768, 12/20/2022