"Meta and Qualcomm Collaborate to Bring Powerful A.I. Models to Mobile Devices"

Source: CNBC
"Meta and Qualcomm Collaborate to Bring Powerful A.I. Models to Mobile Devices"
Photo: CNBC
TL;DR Summary

Qualcomm and Meta have announced a partnership to enable Meta's large language model, Llama 2, to run on Qualcomm chips in phones and PCs starting in 2024. The move aims to position Qualcomm processors for AI applications "on the edge," running directly on devices rather than relying on cloud-based servers. Running large language models on phones could lower the cost of serving AI models and enable improved voice assistants and faster apps. Qualcomm's chips include a tensor processor unit (TPU) well-suited to the calculations AI models require, although the processing power available on a mobile device is still limited compared with a data center. Because Meta's Llama 2 is open source, it can be packaged into a program small enough to run on a phone. The partnership builds on Qualcomm and Meta's previous collaborations in the virtual reality space.

Want the full story? Read the original article on CNBC.