Tag

Computational Power

All articles tagged with #computational power

artificial-intelligence · 1 year ago

"Enhancing AI Power Through Google's Chess Experiments"

Google's research on a diversified version of AlphaZero, built from multiple AI agents trained independently and on varied situations, found that the diversified player experimented with new, effective openings and novel strategies, defeating the original AlphaZero in most matches. The approach adds a "diversity bonus" that rewards drawing strategies from a wide pool of options, which could make AI systems more capable by encouraging creative problem-solving. The implications extend beyond chess, with potential applications in drug development and stock-trading strategies. Although computationally expensive, the diversified approach represents a step toward better performance on hard tasks and mitigates some of the known shortcomings of machine learning.
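
As a rough illustration of the idea (not Google's actual training setup), the sketch below shapes an agent's reward with a hypothetical diversity bonus: the function name `reward_with_diversity_bonus`, the `bonus_weight` parameter, and the toy opening statistics are all assumptions for demonstration, not details from the research.

```python
import math
from collections import Counter

def reward_with_diversity_bonus(base_reward, chosen_strategy, population_history, bonus_weight=0.1):
    """Hypothetical reward shaping: add a bonus for choosing strategies
    that are rare in the population's play history."""
    counts = Counter(population_history)
    total = sum(counts.values()) or 1
    frequency = counts[chosen_strategy] / total        # how common this strategy already is
    rarity_bonus = -math.log(frequency + 1e-6)         # rarer strategies earn a larger bonus
    return base_reward + bonus_weight * rarity_bonus

# Example: two agents both win (+1), but the one playing a rare opening is rewarded more.
history = ["e4"] * 80 + ["d4"] * 19 + ["c4"]
print(reward_with_diversity_bonus(1.0, "c4", history))  # rare opening: noticeably above 1.0
print(reward_with_diversity_bonus(1.0, "e4", history))  # common opening: only slightly above 1.0
```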

technology · 2 years ago

"Meta and Qualcomm Collaborate to Bring Powerful A.I. Models to Mobile Devices"

Qualcomm and Meta have announced a partnership to enable Meta's large language model, Llama 2, to run on Qualcomm chips in phones and PCs starting in 2024. The move aims to position Qualcomm processors as suitable for AI applications "on the edge," that is, on devices rather than on cloud-based servers. Running large language models on phones could lower the cost of serving AI models, leading to improved voice assistants and faster apps. Qualcomm's chips include a tensor processor unit (TPU) well suited to AI calculations, although processing power on mobile devices remains limited compared with data centers. Meta's Llama 2, an open-source model, can be packaged in a smaller program to run on phones. The partnership builds on Qualcomm and Meta's previous collaborations in the virtual reality space.
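
The article does not say how Llama 2 would be shrunk to fit on a phone; as one illustrative (and purely hypothetical) approach, post-training dynamic quantization stores weights as int8 instead of fp32 to cut memory for on-device inference. The PyTorch sketch below uses a toy stand-in network, not Llama 2, and is not Meta's or Qualcomm's actual pipeline.

```python
import os
import torch
import torch.nn as nn

# Stand-in for a large model (a small hypothetical network, not Llama 2 itself).
model = nn.Sequential(
    nn.Linear(4096, 4096),
    nn.ReLU(),
    nn.Linear(4096, 4096),
).eval()

# Post-training dynamic quantization: Linear weights are stored as int8 instead of fp32,
# cutting their memory roughly 4x -- one common way to make a model fit on a device.
quantized = torch.ao.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def size_mb(m, path="/tmp/_model.pt"):
    """Serialize the model and report its on-disk size in megabytes."""
    torch.save(m.state_dict(), path)
    return os.path.getsize(path) / 1e6

print(f"fp32 size: {size_mb(model):.1f} MB, int8 size: {size_mb(quantized):.1f} MB")
```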

technology · 2 years ago

The Pros and Cons of ChatGPT for AI and Robotics Development

The cost of running the large language models that underpin AI chatbots is limiting their quality and threatening to throttle the global AI boom. That expense also constrains which companies can afford to run them and pressures even the world's richest companies to turn chatbots into moneymakers sooner than they may be ready to. The computational power required for generative AI is why OpenAI has held back its more powerful GPT-4 model from the free version of ChatGPT, which still runs the weaker GPT-3.5. The cost of AI language models is a moving target, and companies are working to make them more efficient; the push for smaller, cheaper models marks a sudden reversal for the industry.
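
As a rough, hypothetical illustration of why serving cost scales with model size, the sketch below approximates inference compute per generated token as about 2 x (parameter count) FLOPs and converts that into a per-GPU throughput estimate. The hardware figure, utilization factor, and model sizes are assumptions for illustration, not numbers from the article, and real systems are often memory-bound, so actual throughput is usually lower.

```python
# Back-of-envelope estimate (hypothetical numbers): for a dense transformer,
# inference compute per generated token is commonly approximated as ~2 * parameter_count FLOPs.
def tokens_per_second(param_count, accelerator_flops, utilization=0.3):
    flops_per_token = 2 * param_count
    return accelerator_flops * utilization / flops_per_token

A100_FP16_FLOPS = 312e12  # peak fp16 throughput of one NVIDIA A100, in FLOP/s

for params in (7e9, 70e9, 175e9):  # 7B, 70B, and 175B parameter models
    tps = tokens_per_second(params, A100_FP16_FLOPS)
    print(f"{params/1e9:>5.0f}B params -> ~{tps:,.0f} tokens/s per GPU (rough upper bound)")
```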

technology · 2 years ago

Harnessing Ecosystems for Powerful Computing

A study conducted at Kyoto University has demonstrated the computational power of ecosystems, pointing to a new direction for rapidly developing AI technologies. The researchers built two types of ecological reservoir computing as a proof of concept that ecological networks possess computational power, and showed that a Tetrahymena population could be used to make near-future predictions of an ecological time series. The results suggest a possible link between high biodiversity and high computational power, shedding light on previously unrecognized value of biodiversity.
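
The article does not describe the method in detail; as a generic illustration of reservoir computing (here a simulated echo state network standing in for the living Tetrahymena "reservoir"), the sketch below drives a fixed random recurrent network with a toy time series and trains only a linear readout to predict the next value. The noisy logistic-map series, reservoir size, and ridge parameter are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "ecological" time series: noisy logistic-map dynamics (illustrative only).
T = 1000
x = np.empty(T)
x[0] = 0.4
for t in range(T - 1):
    x[t + 1] = 3.8 * x[t] * (1 - x[t]) + rng.normal(0, 0.001)

# Fixed random reservoir (echo state network): only the linear readout is trained.
N = 200
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(0, 1, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep the spectral radius below 1

states = np.zeros((T, N))
s = np.zeros(N)
for t in range(T):
    s = np.tanh(W @ s + W_in * x[t])
    states[t] = s

# Train a ridge-regression readout to predict the next value of the series.
train = slice(100, 800)
X, y = states[train], x[train.start + 1: train.stop + 1]
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)

# Evaluate one-step-ahead predictions on held-out data.
test = slice(800, T - 1)
pred = states[test] @ w_out
true = x[test.start + 1: test.stop + 1]
print("test RMSE:", np.sqrt(np.mean((pred - true) ** 2)))
```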