
Near Term Future of Generative AI


Ark Invest forecasts a massive drop in the cost to train generative AI models and a corresponding surge in model size, to hundreds of trillions of parameters. The limiting factor will not be training cost but training data: those with better, proprietary training data will be the winners.

Increases in model size are currently running into limits in the quality and accuracy of the resulting AI. However, thousands of companies and billions of dollars are being spent to push past those limits.

Those foreseeable limits appear to be near or beyond human capability, which will make future AI enormously valuable and economically impactful.

The human brain has about 100 billion neurons and more than 100 trillion synaptic connections, the biological analogue of model parameters. In 2022, Graphcore laid out a roadmap for IPU technology to power an AI supercomputer with the following capabilities:

Over 10 Exa-Flops of AI floating point compute
Up to 4 Petabytes of memory with a bandwidth of over 10 Petabytes/second
Support for AI model sizes of 500 trillion parameters
3D wafer on wafer logic stack
Fully supported by Graphcore's Poplar® SDK
Expected cost: ~$120 million (configuration dependent)
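
As a rough sanity check of these roadmap figures, the sketch below (a back-of-envelope calculation under assumed precisions, not from Graphcore) estimates how much of the 4 Petabytes of memory a 500-trillion-parameter model would occupy, and how long the 10 Petabytes/second of bandwidth would take to stream the weights once:

# Back-of-envelope check of the roadmap numbers above (Python).
# Assumption (not from the source): parameters stored at 4, 2, or 1 bytes
# each, roughly FP32, FP16, or FP8/INT8 precision.

PARAMS = 500e12               # 500 trillion parameters
MEMORY_PB = 4.0               # 4 Petabytes of memory
BANDWIDTH_PB_PER_S = 10.0     # 10 Petabytes/second of memory bandwidth

for bytes_per_param in (4, 2, 1):
    weights_pb = PARAMS * bytes_per_param / 1e15
    fraction = weights_pb / MEMORY_PB
    # Time to stream every weight once: an optimistic lower bound on the
    # per-token latency of a dense model at this scale.
    read_time_ms = weights_pb / BANDWIDTH_PB_PER_S * 1000
    print(f"{bytes_per_param} bytes/param: {weights_pb:.1f} PB of weights "
          f"({fraction:.0%} of 4 PB), one full read in {read_time_ms:.0f} ms")

At 2 bytes per parameter, the weights alone would occupy about 1 Petabyte, a quarter of the stated 4 Petabyte budget.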

Graphcore has raised $682 million in funding.


