With the rapid growth of artificial intelligence (AI), the AIPC (AI Personal Computer) and the AI Phone are gradually becoming part of our lives. Yet compared with traditional devices, today's AIPCs and AI Phones mostly just add an NPU (Neural Processing Unit, a chip that accelerates machine-learning workloads); there is no real change in how they work or feel to use. Even eagerly anticipated features like Apple Intelligence fall short of the intelligence we hoped for. Future devices will need more than better hardware: they will need a whole new design philosophy.
I believe the future of AIPC and AI Phone won’t just depend on powerful chips. Instead, built-in AI models will be the heart of these devices. These models will act as the “brain,” working together with hardware and operating systems to make one complete system. Competition will shift from hardware specs to how well the AI models perform and improve the user experience.
In this blog, I’ll dive into this idea, looking at the design, benefits, challenges, and big impact future AI devices could have on our lives. Let’s explore how this tech revolution might change our digital world.
Today’s popular AI models (like GPT-4) have billions of parameters and files that are hundreds of GBs, stored in the cloud for users to access online. But this setup doesn’t fully fit the future of AIPC and AI Phone. Future devices need lightweight, efficient AI models built right into them, not always relying on huge cloud models.
I think these built-in AI models will land in the 10GB-to-20GB range, like phi4: a smaller, optimized model with fewer parameters but still strong performance. As hardware gets more powerful, that size may grow, but for now this range is practical. These models will focus on key device tasks, such as understanding what users ask for, managing device resources, and coordinating apps and outside services.
The built-in AI won’t be an all-knowing “super brain.” It’ll be more like a focused “helper,” understanding what users want and managing device resources or outside services to get tasks done quickly. Complex or special jobs can still use cloud models or apps over the internet.
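As a rough sketch of that split, the decision "handle it on-device or escalate" could look like the snippet below. Everything here (the skill registry, the handler names) is a hypothetical illustration, not any real device API:

```python
# Hybrid dispatcher sketch: the on-device model handles tasks it knows,
# everything else is escalated to a cloud model or an external app.
# LOCAL_SKILLS, handle_locally, and escalate_to_cloud are invented names.

LOCAL_SKILLS = {"summarize_notes", "set_reminder", "adjust_settings"}

def handle_locally(task: str) -> str:
    # Stand-in for running the built-in model, no network involved.
    return f"[on-device] completed '{task}'"

def escalate_to_cloud(task: str) -> str:
    # Stand-in for handing the task to a larger cloud model.
    return f"[cloud] queued '{task}' for a larger model"

def dispatch(task: str) -> str:
    """Route a recognized task to the local model when possible."""
    if task in LOCAL_SKILLS:
        return handle_locally(task)
    return escalate_to_cloud(task)
```

The point of the sketch is only the shape of the decision: the device stays responsive for common tasks, and the cloud remains a fallback rather than a requirement.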
Built-in AI models will bring two big advantages: speed and safety. These will set future devices apart from what we have now.
Today’s cloud-based AI services send your requests to faraway servers over the internet, process them, and send back results. Network delays happen, and if the signal is weak, the experience gets worse. A built-in AI model works right on the device, so data doesn’t travel online, and responses are nearly instant.
Example: Imagine you’re on a fast train and tell your AI Phone, “Organize today’s meeting notes.” The built-in model can handle it locally right away, no internet needed. With a cloud model, a shaky network might fail the request or make you wait.
Privacy and data safety are big concerns with AI. Cloud models need you to upload data to servers, which raises the risk of leaks and can clash with privacy laws. A built-in AI model keeps everything on the device—your voice, photos, or health info doesn’t leave, making it much safer.
Future AIPC and AI Phone won’t just be tools—they’ll be “partners” with their own smarts. The built-in AI model, as the device’s “little brain,” will understand what you want, manage resources, and make interaction feel natural.
The trick is to keep these models lightweight and specialized. Unlike massive cloud models that run to hundreds of gigabytes, they will target the device's most-used features, such as language understanding, voice interaction, and image processing.
Using techniques like model compression and quantization, these models stay powerful while using less power and space, perfect for mobile devices.
Future apps will team up with built-in AI models. Developers will give each app detailed "instructions" so the AI knows what it can do. Think of the function-calling setup in LangChain (a framework for building applications on top of language models), but deeper and more detailed.
For example, if you say, “Check my schedule today,” the AI reads the app’s instructions, calls the right functions, and gives you the answer. The app stays a normal program, but the AI makes it smarter and faster to use—safely too.
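A minimal sketch of that flow is below. The tool registry and the keyword matcher are invented stand-ins for a real model's tool selection; the "instructions" are just the `description` fields an app would ship:

```python
# App "instructions" sketch, loosely in the spirit of LangChain-style
# function calling. A real built-in model would choose a tool from the
# descriptions; here simple keyword matching stands in for that choice.

TOOLS = [
    {
        "name": "get_schedule",
        "description": "Read today's calendar events",
        "keywords": ["schedule", "calendar", "meeting"],
        "fn": lambda: "09:00 stand-up; 14:00 design review",
    },
    {
        "name": "play_media",
        "description": "Play a movie or song by title",
        "keywords": ["play", "watch", "listen"],
        "fn": lambda: "now playing",
    },
]

def route(query: str) -> str:
    """Pick the tool whose keywords match the query and call it."""
    q = query.lower()
    for tool in TOOLS:
        if any(k in q for k in tool["keywords"]):
            return tool["fn"]()
    return "no matching tool; falling back to chat"
```

So "Check my schedule today" matches `get_schedule` and runs it locally, while an unrecognized request falls back to plain conversation.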
The idea of built-in AI models is exciting, but there are tech hurdles to solve first.
Running AI on devices with limited power and storage is tough. The fix is model distillation and quantization, which shrink big models into much smaller ones without losing too much capability. Models like phi4 and the distilled variants of DeepSeek-R1 show this can work, and these techniques will keep improving.
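The core idea behind quantization fits in a few lines. This toy example uses a single per-tensor int8 scale; production schemes (per-channel scales, 4-bit formats such as those used by GPTQ or AWQ) are far more sophisticated, but the trade-off is the same:

```python
import numpy as np

# Toy post-training quantization: map float32 weights to int8 with one
# per-tensor scale. Storage drops 4x; a small reconstruction error appears.

def quantize_int8(w: np.ndarray):
    scale = float(np.abs(w).max()) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=1000).astype(np.float32)   # pretend weight tensor
q, s = quantize_int8(w)

# int8 takes a quarter of the float32 storage; the worst-case rounding
# error is half the scale step.
error = np.abs(w - dequantize(q, s)).max()
```

The same logic, applied layer by layer with better calibration, is how a model measured in hundreds of gigabytes shrinks toward something a phone can hold.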
The AI needs to juggle tasks like language, voice, and images at once. It’ll need multi-task skills and smart resource use. One idea is a modular design, splitting tasks into separate pieces that load only when needed, saving power and boosting speed.
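One way to sketch that modular, load-on-demand idea is below, using Python stdlib modules as stand-ins for real skill backends. The registry and class are hypothetical; a real system would be loading model weights, not modules:

```python
import importlib

# Sketch of on-demand capability loading: each "skill" (speech, vision,
# language...) is loaded only when first requested, keeping idle memory low.

class SkillLoader:
    def __init__(self, registry):
        self._registry = registry   # skill name -> module path
        self._loaded = {}

    def get(self, skill: str):
        """Import and cache the module backing a skill on first use."""
        if skill not in self._loaded:
            self._loaded[skill] = importlib.import_module(self._registry[skill])
        return self._loaded[skill]

# Stdlib modules stand in for heavyweight skill backends here.
loader = SkillLoader({"language": "json", "math": "math"})
math_skill = loader.get("math")   # only "math" is loaded so far
```

The design choice being illustrated: pay the loading cost per skill, at first use, rather than holding every capability in memory at once.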
For the AI to understand apps, developers need a shared “instruction format,” like the OpenAPI standard. The AI must be able to read these and make smart choices. This will take teamwork across the industry to set up.
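To make that concrete, here is one invented shape such an instruction format could take, OpenAPI-inspired but not any existing standard, plus a minimal check that a model's proposed call matches what the app declared:

```python
# Hypothetical OpenAPI-style capability description an app might publish
# for the built-in model. The schema layout here is invented for
# illustration; no shared standard like this exists yet.

CAPABILITY = {
    "name": "search_products",
    "description": "Search the shopping app's catalog",
    "parameters": {
        "query":     {"type": "string", "required": True},
        "max_price": {"type": "number", "required": False},
    },
}

def validate_call(capability: dict, args: dict) -> bool:
    """Check required params are present and no unknown params are passed."""
    params = capability["parameters"]
    for name, spec in params.items():
        if spec["required"] and name not in args:
            return False
    return all(name in params for name in args)

ok = validate_call(CAPABILITY, {"query": "bluetooth headset", "max_price": 200})
bad = validate_call(CAPABILITY, {"max_price": 200})  # missing required query
```

Validation like this is what would keep the "smarter and faster" interaction safe: the model can only call what the app explicitly declared, with the parameters the app expects.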
Built-in AI will change how we use devices. Old-school interfaces (like clicking menus) will give way to natural language, gestures, or even brain-computer links later on. Example: On a PC, just say, “Play Interstellar,” and the AI finds and starts it. Or while shopping, say, “Find a good-value Bluetooth headset,” and it picks the best one. Today’s basic dialog boxes won’t cut it—future solutions need to be smoother.
Built-in AI models will reshape how we connect with devices. Future AIPCs and AI Phones will be smart helpers that understand what we need and act on it, touching many areas of daily life and work.
As tech improves and companies work together, built-in AI models will become standard, kicking off a new age of smart computing.
The future of AIPC and AI Phone will center on built-in AI models. These small, sharp models will power devices, delivering fast, safe, and smart services. Devices won’t just be tools anymore—they’ll be partners we interact with deeply. Are you ready for this revolution?