MediaTek, a global semiconductor company, has announced that it is working closely with Meta on Llama 2, Meta's next-generation open-source large language model (LLM).
MediaTek Aims to Build a Complete Edge Computing Ecosystem
Using Meta's LLM as well as MediaTek's latest APUs and NeuroPilot AI platform, MediaTek aims to build a complete edge computing ecosystem designed to accelerate AI application development across smartphones, IoT, vehicles, smart homes and other edge computing devices.
Currently, most generative AI processing is done in the cloud. Running Llama 2 models on MediaTek-powered devices, however, will allow generative AI applications to run directly on the device.
This offers a number of benefits for developers and users: seamless performance, enhanced privacy, improved security and reliability, lower latency, the ability to operate in areas with little or no connectivity, and lower operating costs.
To fully leverage generative AI on edge computing devices, device makers will need to adopt high-performance, low-power AI processors and faster, more reliable connectivity. Every MediaTek 5G mobile chip shipping today is equipped with an APU designed to perform a variety of generative AI features, such as AI noise reduction and AI resolution enhancement.
Additionally, MediaTek’s next-generation flagship processor, expected to be introduced later this year, will feature an optimized software stack to run Llama 2, along with an upgraded APU with Transformer core acceleration, reduced DRAM area and bandwidth usage, and further enhanced LLM and AIGC performance. These advancements facilitate the rapid development of on-device generative AI use cases.
MediaTek expects Llama 2-based AI applications to be available on smartphones equipped with its next-generation flagship SoC, expected to hit the market later this year.