Anthropic is expanding its use of Google Cloud's TPU chips, planning to access up to one million TPUs by 2026 to train and serve future generations of its Claude AI models.
The article predicts that by 2035, Alphabet could become the Nvidia of quantum computing, leveraging its DeepMind research, custom hardware (TPUs), and quantum programming framework (Cirq) to build a dominant AI and quantum ecosystem, potentially transforming its market valuation and industry role.
Google has revealed its TPU-based supercomputer, TPU v4, which it claims is faster and more efficient than Nvidia's A100 chips. The system, in operation since 2020, was used to train Google's PaLM model over a period of more than 50 days. While Nvidia dominates the market for AI model training and deployment, Google has been designing and deploying its own AI chips, called Tensor Processing Units, since 2016. The heavy compute requirements of AI are also a boon to cloud providers such as Google, Microsoft, and Amazon, which rent out processing power by the hour.