
Qualcomm has already demonstrated AI technology such as Stable Diffusion running locally on Snapdragon 8 Gen 2 smartphones, without requiring an internet connection. Now the company has announced that next year's flagship phones will gain broader on-device AI support.
The chipmaker announced that, starting in 2024, it will enable on-device AI on phones and PCs using the Llama 2 model developed by Meta. Qualcomm points out that this support will enable a range of applications without the need to connect to the Internet, including intelligent virtual assistants, productivity apps, content creation tools, entertainment and much more.
"Qualcomm Technologies has scheduled the availability of an AI implementation based on Llama 2 on devices with Snapdragon processors, starting in 2024", the press release adds. There is no word yet on whether Meta itself will bring Llama 2-based apps with on-device processing to Snapdragon phones next year. However, independent Android app developers will have the tools to release their own projects.
This wasn't the only Llama 2 announcement: Meta also revealed that it has made the LLM open source. Meta says it decided to open-source the software to give companies, startups, entrepreneurs and researchers access to more tools, which open "opportunities to experiment, innovate in exciting ways and ultimately bring economic and social benefits".
photo: Meta AI
According to the press release, Meta believes that opening up access to its AI makes it more secure. The company points out that developers and researchers will be able to stress-test the LLM, which will help identify and fix problems faster. Meta also explains that Llama 2 was subjected to both internal and external "red team" testing, which involved generating "controversial requests". Meta adds that it will "continue to invest in security by fine-tuning and benchmarking" the model.
Finally, Microsoft and Meta announced an expanded partnership that makes Microsoft the preferred partner for Llama 2. Redmond added that the new LLM will be available on Azure and Windows. The software is available starting today in the Azure AI model catalog and is optimized to run locally on Windows. It will also be available through Amazon Web Services (AWS), Hugging Face, and other providers.
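For readers who want to try the Hugging Face release mentioned above, it helps to know that Llama 2's chat variants expect prompts in a specific `[INST]`/`<<SYS>>` template. Below is a minimal sketch of building such a prompt in Python; the `meta-llama/Llama-2-7b-chat-hf` checkpoint named in the comments is gated behind Meta's license, so the actual model loading is shown only as commented-out code, and the helper function name is our own illustration, not an official API.

```python
# Sketch: preparing a prompt for a Llama 2 chat model, one of the models
# Meta open-sourced and distributed via Hugging Face, Azure and AWS.
# Llama 2 chat checkpoints expect this single-turn template:
#   <s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]

def build_llama2_prompt(user_message: str,
                        system_message: str = "You are a helpful assistant.") -> str:
    """Format a single-turn prompt in the Llama 2 chat template."""
    return (f"<s>[INST] <<SYS>>\n{system_message}\n<</SYS>>\n\n"
            f"{user_message} [/INST]")

if __name__ == "__main__":
    prompt = build_llama2_prompt("Summarize what runs on-device on Snapdragon.")
    print(prompt)

    # Actual inference (requires accepting Meta's license on Hugging Face
    # and downloading roughly 13 GB of weights for the 7B model):
    # from transformers import AutoModelForCausalLM, AutoTokenizer
    # tok = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
    # model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
    # out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=64)
    # print(tok.decode(out[0], skip_special_tokens=True))
```

The same template applies whether the model runs in the cloud or, as Qualcomm plans, directly on a Snapdragon device; only the inference backend changes.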