AMD secures a major AI chip deal with OpenAI, granting OpenAI warrants to purchase up to 160 million AMD shares at 1 cent each. The agreement is part of OpenAI's broader effort to diversify its chip supply and meet growing demand for AI computing power, alongside its existing partnerships with Nvidia and Broadcom.
OpenAI is reportedly facing diminishing returns as it continues to invest heavily in computing resources for its large language models like ChatGPT. Despite efforts to scale up, recent tests suggest that these models have plateaued, challenging the belief that more data and computing power will consistently enhance AI capabilities. This development aligns with concerns that AI firms are reaching the limits of current scaling strategies and underscores the need for higher data quality rather than sheer quantity. OpenAI cofounder Ilya Sutskever, who has since left the company, describes a shift in AI development from scaling to discovery.
Apple is working on optimizing artificial intelligence (AI) models to run on smartphones with limited memory, according to a research paper published by Apple researchers. This development suggests that Apple is preparing to introduce new AI-powered features for the iPhone.
China plans to increase its aggregate computing power by over 50% by 2025, aiming for a total computing power of 300 EFLOPS. The plan, released by multiple departments including the Ministry of Industry and Information Technology, comes as China faces increasing competition from the US in high-tech areas such as semiconductors, supercomputers, and AI. The expansion of computing power is crucial for AI training, and China plans to build more data centers and improve computational infrastructure in western regions. The plan also emphasizes improving the speed and efficiency of the computation network.
Silicon Valley start-up Cerebras has unveiled a new supercomputer built with its specialized chips, which are the size of a dinner plate and pack the computing power of hundreds of traditional chips. The demand for computing power and AI chips has surged due to the global AI boom, leading tech giants and start-ups to develop their own alternatives to meet the demand. Cerebras aims to compete in the market dominated by Nvidia and has already built a supercomputer for AI company G42, with plans to build more in the future. The company hopes to advance AI development and provide an alternative to Nvidia's chips.
Start-ups in the field of artificial intelligence (AI) are increasingly relying on bigger rivals like Google, Microsoft, and Amazon for the computing power needed to develop their own AI systems. The industry's giants control the vast data centers required to run AI systems, putting them in a dominant position. Start-ups like Cohere, Anthropic, Character.AI, and Inflection AI have secured funding to purchase computing power from these tech giants. OpenAI, the company behind ChatGPT, recently raised $10 billion from Microsoft and will use the funds to pay for time on Microsoft's massive clusters of computer servers. While the tech giants have an advantage due to their vast resources, open-source software could level the playing field by allowing anyone to compete, although access to the larger competitors' data centers would still be necessary.
OpenAI's ChatGPT could cost up to $700,000 a day to run due to expensive servers, according to an analyst. Microsoft is reportedly developing an AI chip called Athena to reduce the cost of running generative AI models. The chip could be released for internal use by Microsoft and OpenAI as early as next year.