Betting your next smartphone almost entirely on AI is a big risk and one that hasn’t paid off for any smartphone so far this year. The HONOR Magic 7 Pro was undermined by leaning too heavily on underbaked features, while the Samsung Galaxy S25 series is a stale upgrade unless you’re big into Galaxy AI. The ASUS Zenfone 12 Ultra also hinges its sales pitch virtually entirely on AI, and while I don’t think that was particularly wise either, ASUS has at least shown how to give users a proper choice when it comes to AI privacy, and I think everyone should take notes.
The key is that nearly all of ASUS’ AI features can run either online or offline. The exceptions sit at either end: Google’s Circle to Search is always online, while Call Translator and the imaging features always run locally. So whether you’re removing a photobomber or summarizing some notes, you control how your data is handled. That level of privacy is hard to put a price on at a time when it’s increasingly clear that the companies behind online AI services can use anything you share with them. Having the option to run all of your phone’s AI tools locally is essential.
![ASUS AI Offline Settings](https://www.wiredfocus.com/wp-content/uploads/2025/02/ASUS-is-showing-Google-and-Apple-exactly-how-to-handle.jpg)
Robert Triggs / Android Authority
Best of all, you can mix and match if you’d like. Want to summarize your more private documents on the security of your Zenfone while allowing the cloud to condense those long public web articles into something more digestible? ASUS lets you do that.
![ASUS AI Cloud Quota](https://www.wiredfocus.com/wp-content/uploads/2025/02/Ive-glimpsed-the-future-of-mobile-AI-and-itll-cost.jpg)
Robert Triggs / Android Authority
But you don’t have to don a tinfoil hat to see the other upside here. Hidden away in ASUS’ UI is a daily “cloud service quota,” no doubt the start of a grim new trend. It’s generous enough that I never hit it, even after several requests in quick succession, but it highlights that cloud-based AI won’t necessarily remain free forever. Samsung’s Galaxy AI may start charging at the end of 2025, Google already charges a subscription for Gemini Advanced, and reports suggest Apple could eventually charge for Apple Intelligence too. Cloud AI is expensive, and that cost will eventually be passed on to consumers unless features can run on-device.
Thankfully, ASUS handles this all elegantly. The first time you run any given AI feature, it prompts you to select online (fast) or offline (private) processing. You can set a consistent or one-time preference, so you can swap back and forth between cloud or local processing as you see fit. If you pick the offline option, ASUS will direct you to download an AI kit update to run the feature.
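To make that flow concrete, here’s a minimal Python sketch of how such a router might behave: a first-run prompt, a preference that can be saved or used once, and a daily cloud quota. Every name and number here is hypothetical; ASUS hasn’t published how its implementation actually works.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class AiRouter:
    """Hypothetical model of per-feature online/offline AI routing."""
    daily_cloud_quota: int = 20            # assumed cap; ASUS doesn't publish one
    cloud_used: int = 0
    prefs: dict = field(default_factory=dict)  # feature -> "online" | "offline"

    def run(self, feature: str, choice: Optional[str] = None,
            remember: bool = False) -> str:
        # A one-time choice overrides any saved preference for this request.
        mode = choice or self.prefs.get(feature)
        if mode is None:
            # First run of this feature: the user must pick a mode.
            raise ValueError(f"first run of {feature!r}: pick 'online' or 'offline'")
        if remember and choice:
            self.prefs[feature] = choice   # persist as the consistent preference
        if mode == "online":
            if self.cloud_used >= self.daily_cloud_quota:
                return f"{feature}: quota exhausted, falling back to on-device"
            self.cloud_used += 1
            return f"{feature}: processed in the cloud"
        return f"{feature}: processed on-device"
```

In this sketch, exhausting the quota silently falls back to on-device processing; a real implementation might instead warn the user or queue the request until the quota resets.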
If there’s one caveat to these offline options, it’s that they require additional downloads, and they’re not small. In total, ASUS’ offline AI kits can eat up an extra 6GB of space (another reason that 256GB should be the baseline storage model), which is also far too much data to pull down over a mobile plan. Even with unlimited data, you’re not going to wait several minutes (or probably longer) to perform an offline text summary for the first time. A simple fix would be to make the online/offline AI question part of the phone setup process and pre-download any requested offline AI kits over Wi-Fi.
Samsung has a similar option that disables cloud AI features, but it’s hidden away deep in the settings menu where few will find it. Even then, not all of Samsung’s features can run offline. Generative edits in Gallery no longer work, Notes’ auto-formatting and spelling corrections stop functioning, and so do Voice Recorder summaries. ASUS loses no features in offline mode; the compromise is that quality takes a hit for some tools and processing times are longer on your phone’s more limited hardware, but at least everything keeps working.
Between hedging against costs and keeping my data private, offline AI is a must for me.
Having experimented with running AI models directly on my phone, I know the latest handsets are powerful enough to handle a range of smaller models, especially once NPU acceleration enters the equation. For example, ASUS accelerates Meta’s 8-billion-parameter Llama 3 model for text tasks, but Alibaba’s Qwen or Microsoft’s Phi could also work for text in place of Google’s tiny Gemini Nano. There’s a range of equally open models for image generation, audio processing, and much more, too.
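Those multi-gigabyte kit sizes line up with simple back-of-the-envelope math: a model’s weights dominate its footprint. This quick Python estimate (parameter counts are approximate public figures, and the calculation ignores tokenizers and metadata) shows why an 8-billion-parameter model quantized to 4 bits per weight lands in the gigabytes:

```python
def quantized_size_gb(params_billions: float, bits_per_weight: int = 4) -> float:
    """Rough on-disk size of a weights-only checkpoint at a given
    quantization level (decimal gigabytes)."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9


# Approximate public parameter counts; sizes are estimates only.
for name, params in [("Llama 3 8B", 8.0), ("Phi-3-mini", 3.8), ("Gemini Nano-2", 3.25)]:
    print(f"{name}: ~{quantized_size_gb(params):.1f} GB at 4-bit")
```

At 4 bits per weight, Llama 3 8B alone accounts for roughly 4GB, so ASUS’ roughly 6GB total across all of its offline kits is about what you’d expect once imaging and audio models are added.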
I hope this is just the beginning of a better direction for mobile AI, where consumers are empowered to run AI tools locally and perhaps even have greater control over which apps and models they can use for tasks as well. To get there, we’ll need developers to unlock our processors’ NPUs and GPUs, but we’re not that far off. Until then, giving consumers the option to run all of their AI tasks offline is the next best thing; Apple, Google, and Samsung should follow ASUS’ lead.