My vision of AIPC and AI Phone [English]

February 24, 2025

With the fast growth of artificial intelligence (AI) technology, AIPC (Artificial Intelligence Personal Computer) and AI Phone (Artificial Intelligence Phone) are slowly becoming part of our lives. However, compared to traditional devices, today's AIPC and AI Phone only add an NPU (Neural Processing Unit, a chip made to speed up machine learning). There's no big change in how they work or feel to use. Even headline features like Apple Intelligence, which people were excited about, don't fully deliver the smartness we hoped for. In the future, these devices will need more than better hardware; they will need a whole new way of thinking about design.

I believe the future of AIPC and AI Phone won’t just depend on powerful chips. Instead, built-in AI models will be the heart of these devices. These models will act as the “brain,” working together with hardware and operating systems to make one complete system. Competition will shift from hardware specs to how well the AI models perform and improve the user experience.

In this blog, I’ll dive into this idea, looking at the design, benefits, challenges, and big impact future AI devices could have on our lives. Let’s explore how this tech revolution might change our digital world.

Built-in AI Models: Small but Smart Device Brains

Today’s popular AI models (like GPT-4) have billions of parameters and files that are hundreds of GBs, stored in the cloud for users to access online. But this setup doesn’t fully fit the future of AIPC and AI Phone. Future devices need lightweight, efficient AI models built right into them, not always relying on huge cloud models.

I think these built-in AI models will be 10GB to 20GB in size, like phi4—a smaller, optimized model with fewer parameters but still strong performance. As hardware gets more powerful, the size might grow, but for now, this range is practical and doable. These models will focus on key device tasks, such as:

  • Language Processing: Handling smart conversations, writing text, and understanding commands.
  • Voice Interaction: Recognizing speech (STT) and creating speech (TTS) for natural voice commands and responses.
  • Image Recognition: Giving devices basic vision, like identifying objects or faces.

The built-in AI won’t be an all-knowing “super brain.” It’ll be more like a focused “helper,” understanding what users want and managing device resources or outside services to get tasks done quickly. Complex or special jobs can still use cloud models or apps over the internet.
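
To make this concrete, here is a minimal routing sketch in Python. The `LocalModel` class, the `cloud_complete` function, and the thresholds are all illustrative assumptions, not a real device API; the point is simply that the built-in "helper" answers what it can locally and only hands off complex jobs.

```python
# Minimal routing sketch: a hypothetical on-device "helper" that answers
# common requests locally and falls back to the cloud for complex jobs.
# LocalModel and cloud_complete are illustrative stand-ins, not real APIs.

from dataclasses import dataclass

@dataclass
class Request:
    text: str
    needs_web: bool = False      # e.g. live data the device cannot know
    est_tokens: int = 256        # rough size of the expected answer

class LocalModel:
    """Placeholder for a 10GB-20GB built-in model running on the NPU."""
    def generate(self, prompt: str) -> str:
        return f"[local answer to: {prompt}]"

def cloud_complete(prompt: str) -> str:
    """Placeholder for an optional cloud model, used only when needed."""
    return f"[cloud answer to: {prompt}]"

def handle(req: Request, local: LocalModel) -> str:
    # Keep everything on the device unless the task clearly exceeds it.
    if req.needs_web or req.est_tokens > 4096:
        return cloud_complete(req.text)
    return local.generate(req.text)

if __name__ == "__main__":
    model = LocalModel()
    print(handle(Request("Organize today's meeting notes"), model))          # stays local
    print(handle(Request("Summarize today's news", needs_web=True), model))  # goes to cloud
```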

Benefits of Built-in AI Models

Built-in AI models will bring two big advantages: speed and safety. These will set future devices apart from what we have now.

1. Faster Speed: Local Processing, No Delays

Today’s cloud-based AI services send your requests to faraway servers over the internet, process them, and send back results. Network delays happen, and if the signal is weak, the experience gets worse. A built-in AI model works right on the device, so data doesn’t travel online, and responses are nearly instant.

Example: Imagine you’re on a fast train and tell your AI Phone, “Organize today’s meeting notes.” The built-in model can handle it locally right away, no internet needed. With a cloud model, a shaky network might fail the request or make you wait.

2. Safety and Privacy: Data Stays on the Device

Privacy and data safety are big concerns with AI. Cloud models need you to upload data to servers, which raises the risk of leaks and can clash with privacy laws. A built-in AI model keeps everything on the device—your voice, photos, or health info doesn’t leave, making it much safer.

Design Ideas for Built-in AI Models

Future AIPC and AI Phone won’t just be tools—they’ll be “partners” with their own smarts. The built-in AI model, as the device’s “little brain,” will understand what you want, manage resources, and make interaction feel natural.

1. Lightweight and Focused

The trick is to keep these models lightweight and specialized. Unlike massive cloud models that are hundreds of GBs, these will target the device’s most-used features:

  • A 10GB language model (LLM) can handle daily chats and text writing.
  • A voice module (STT and TTS) works offline for voice commands.
  • An image recognition model processes photos locally to spot objects or faces.

Using techniques like model compression and quantization, these models stay powerful while using less power and space, perfect for mobile devices.
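
As a rough illustration of what quantization does, here is a small PyTorch sketch. The two-layer toy network stands in for a real on-device model; the same `quantize_dynamic` call converts its Linear layers to int8, cutting weight size by roughly 4x.

```python
# Minimal sketch of post-training dynamic quantization with PyTorch.
# A toy two-layer network stands in for a real on-device model.

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(1024, 1024),
    nn.ReLU(),
    nn.Linear(1024, 1024),
)

# Quantize all Linear layers to int8: weights take about 4x less space,
# and inference can use integer kernels where the hardware supports them.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 1024)
print(quantized(x).shape)  # torch.Size([1, 1024])
```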

2. Working Closely with Apps

Future apps will team up with built-in AI models. Developers will give apps detailed "instructions" so the AI knows what they can do. Think of it like function or tool calling in LangChain (a framework for building AI applications), but deeper and more detailed.

For example, if you say, “Check my schedule today,” the AI reads the app’s instructions, calls the right functions, and gives you the answer. The app stays a normal program, but the AI makes it smarter and faster to use—safely too.
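
A minimal Python sketch of this idea follows. The registry, the `tool` decorator, and `pick_tool` are hypothetical stand-ins for whatever mechanism the OS and model would actually use; they only show an app publishing an "instruction" and the AI matching a request to it.

```python
# Sketch of how an app might expose "instructions" to the built-in AI.
# The registry, schema, and pick_tool() are hypothetical illustrations.

from datetime import date

TOOLS = {}

def tool(name: str, description: str):
    """Register an app function together with a description the AI can read."""
    def wrap(fn):
        TOOLS[name] = {"description": description, "fn": fn}
        return fn
    return wrap

@tool("get_schedule", "Return the user's calendar events for a given day (YYYY-MM-DD).")
def get_schedule(day: str) -> list[str]:
    return ["09:00 stand-up", "14:00 design review"]  # stub data

def pick_tool(user_request: str) -> str:
    # In a real device the built-in model would match the request against
    # each tool's description; a keyword check stands in for that here.
    return "get_schedule" if "schedule" in user_request.lower() else ""

request = "Check my schedule today"
name = pick_tool(request)
if name:
    result = TOOLS[name]["fn"](day=str(date.today()))
    print(result)
```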

Challenges and Fixes for Built-in AI Models

The idea of built-in AI models is exciting, but there are tech hurdles to solve first.

1. Model Optimization and Compression

Running AI on devices with limited power and space is tough. The fix is model distillation and quantization, which shrink big models into smaller ones without losing too much ability. Models like phi4 and DeepSeek-R1 show this can work, and these methods will get even better.
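
Here is a toy PyTorch sketch of the distillation part: a small "student" learns to match the softened outputs of a larger "teacher." The models and data are placeholders, not the actual recipes behind phi4 or DeepSeek-R1.

```python
# Minimal knowledge-distillation sketch: the student mimics the teacher's
# softened output distribution. Models and data are toy placeholders.

import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Linear(128, 10)   # stand-in for a large pretrained model
student = nn.Linear(128, 10)   # much smaller model we want to deploy on-device
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0                        # temperature softens the teacher's distribution

for _ in range(100):
    x = torch.randn(32, 128)                  # toy unlabeled batch
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)

    # KL divergence between softened distributions (scaled by T^2 as usual).
    loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final distillation loss: {loss.item():.4f}")
```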

2. Handling Multiple Tasks and Resources

The AI needs to juggle tasks like language, voice, and images at once. It’ll need multi-task skills and smart resource use. One idea is a modular design, splitting tasks into separate pieces that load only when needed, saving power and boosting speed.
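
The sketch below shows one way such a modular, load-on-demand design could look. The module names, the loader, and the eviction policy are assumptions for illustration only.

```python
# Sketch of a modular design where task modules (language, voice, vision)
# are loaded on first use and evicted when memory gets tight.
# Names and the loader are illustrative, not a real device API.

import time

class ModuleManager:
    def __init__(self, max_loaded: int = 2):
        self.max_loaded = max_loaded
        self.loaded = {}          # name -> (module, last_used timestamp)

    def _load(self, name: str):
        print(f"loading {name} module from disk...")
        return object()           # placeholder for real model weights

    def get(self, name: str):
        if name not in self.loaded:
            if len(self.loaded) >= self.max_loaded:
                # Evict the least recently used module to free memory.
                oldest = min(self.loaded, key=lambda k: self.loaded[k][1])
                print(f"unloading {oldest} module")
                del self.loaded[oldest]
            self.loaded[name] = (self._load(name), time.time())
        module, _ = self.loaded[name]
        self.loaded[name] = (module, time.time())
        return module

manager = ModuleManager(max_loaded=2)
manager.get("language")   # loaded
manager.get("voice")      # loaded
manager.get("vision")     # evicts "language", loads "vision"
```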

3. Standard App Instructions

For the AI to understand apps, developers need a shared “instruction format,” like the OpenAPI standard. The AI must be able to read these and make smart choices. This will take teamwork across the industry to set up.
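
For illustration, a shared instruction format might look something like the hypothetical manifest below, loosely modeled on OpenAPI and JSON Schema. The field names are my own invention; an industry standard would have to define them properly.

```python
# Hypothetical example of a shared "instruction format" an app could publish,
# loosely modeled on OpenAPI / JSON Schema. Field names are illustrative.

import json

calendar_app_manifest = {
    "app": "calendar",
    "version": "1.0",
    "capabilities": [
        {
            "name": "get_schedule",
            "description": "Return the user's events for a given day.",
            "parameters": {
                "type": "object",
                "properties": {
                    "day": {"type": "string", "format": "date"},
                },
                "required": ["day"],
            },
        }
    ],
}

# The built-in AI would read manifests like this at install time and decide
# which capability matches a spoken or typed request.
print(json.dumps(calendar_app_manifest, indent=2))
```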

4. New Ways to Interact

Built-in AI will change how we use devices. Old-school interfaces (like clicking menus) will give way to natural language, gestures, or even brain-computer links later on. Example: On a PC, just say, “Play Interstellar,” and the AI finds and starts it. Or while shopping, say, “Find a good-value Bluetooth headset,” and it picks the best one. Today’s basic dialog boxes won’t cut it—future solutions need to be smoother.
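
As a rough sketch of that "just say it" flow, the Python below chains local speech-to-text, a simple intent check, and launching the media. It assumes the openai-whisper package as a stand-in for the device's built-in STT module, and the audio file, library path, and player command are made up.

```python
# Sketch of a voice-driven action: local speech-to-text, a simple intent
# match, then launching the media. Whisper stands in for the built-in STT
# module; file paths and the player command are illustrative.

import subprocess
from pathlib import Path

import whisper  # pip install openai-whisper

def transcribe(audio_path: str) -> str:
    model = whisper.load_model("base")               # small model that can run locally
    return model.transcribe(audio_path)["text"].strip()

def play_movie(title: str, library: Path = Path.home() / "Movies") -> None:
    matches = list(library.glob(f"*{title}*"))
    if matches:
        subprocess.run(["xdg-open", str(matches[0])])  # default player on Linux
    else:
        print(f"Could not find '{title}' in {library}")

command = transcribe("command.wav")                   # e.g. "Play Interstellar"
if command.lower().startswith("play "):
    play_movie(command[5:])
```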

What’s Next for Built-in AI Models

Built-in AI models will reshape how we connect with devices. Future AIPC and AI Phone will be smart helpers that get what we need and act on it, impacting tons of areas:

  • Education: Offering custom learning materials based on your progress.
  • Health: Tracking your data and warning you about issues.
  • Work: Sorting meeting notes, managing schedules, and boosting productivity.

As tech improves and companies work together, built-in AI models will become standard, kicking off a new age of smart computing.

Conclusion

The future of AIPC and AI Phone will center on built-in AI models. These small, sharp models will power devices, delivering fast, safe, and smart services. Devices won’t just be tools anymore—they’ll be partners we interact with deeply. Are you ready for this revolution?
