Project: Bringing Hardware-Accelerated Language Models to Android Devices
We introduce MLC LLM for Android: a solution that allows large language models to be deployed natively on Android devices, plus a productive framework for everyone to further optimize model performance for their use cases. Everything runs locally and is accelerated by the phone's native GPU.
We can run Vicuña-7B on Android on a Samsung Galaxy S23.
GitHub: https://github.com/mlc-ai/mlc-llm/tree/main/android
Demo: https://mlc.ai/mlc-llm/#android
#ai #assistant #gpt #android