Global hackathon champions private on-device personal AI apps
A global hackathon spanning London, Boston, and San Francisco has brought together hundreds of developers to advance personal AI that runs directly on consumer devices, without requiring cloud-based infrastructure.
Collaboration across cities
Cactus Compute joined forces with Nothing and Hugging Face for a 24-hour competition designed to push the boundaries of on-device artificial intelligence. The event attracted participants both on-site and online, highlighting a shared commitment to developing AI systems that offer data privacy, real-time responsiveness, and offline functionality.
On-device AI focus
The central aim was to demonstrate that the next phase of personal AI will rely on edge computing instead of the cloud. Developers competed by building mobile applications that use local AI inference, rather than communicating constantly with remote servers. This approach addresses user concerns over data privacy and lets applications function reliably even without internet connectivity.
Participants used the Cactus SDK, a new framework built for efficient local AI processing across React Native, Flutter, Kotlin, and C++. Teams experimented with models such as Liquid Foundation Models, Smol, and Qwen3, alongside several edge-optimised models supplied by the organisers.
Competition tracks
The hackathon included three competitive tracks. The main track challenged teams to produce the most effective on-device AI mobile application. The Memory Master track encouraged efforts to develop local knowledge-base systems for small models, while Hybrid Hero prompted participants to explore hybrid strategies combining local and cloud inference.
Teams were explicitly encouraged to go beyond simplistic integrations and apply local inference to meaningful user problems. Projects were assessed on privacy, speed, and offline capability.
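The hybrid strategy explored in the Hybrid Hero track can be illustrated with a small sketch. This is a generic routing pattern, not the Cactus SDK's actual API: the `run_local` and `run_cloud` functions and the length-based routing heuristic are illustrative assumptions.

```python
# Hypothetical hybrid inference router: prefer the on-device model,
# escalate long prompts to the cloud only when a connection exists.
# run_local and run_cloud are stand-ins, not a real SDK API.

def run_local(prompt: str) -> str:
    # Placeholder for a call to a small on-device model.
    return f"[local] {prompt[:40]}"

def run_cloud(prompt: str) -> str:
    # Placeholder for a call to a remote model endpoint.
    return f"[cloud] {prompt[:40]}"

def route(prompt: str, is_online: bool, cloud_threshold: int = 200) -> str:
    """Short prompts stay on-device; long prompts go to the cloud
    when online. Offline requests always run locally."""
    if not is_online or len(prompt) <= cloud_threshold:
        return run_local(prompt)
    try:
        return run_cloud(prompt)
    except Exception:
        # Network failure mid-request: fall back to local inference.
        return run_local(prompt)

print(route("Summarise my notes", is_online=False))
```

The key design property, matching the track's brief, is that offline operation is the default path rather than an error case: the cloud is an optional accelerator, not a dependency.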
Winning entries
First prize went to Obi-Wan Qwenobi (InnerSense), a keyboard-focused application that monitors keystrokes across multiple apps, uses local inference to classify emotional tone, and creates private behavioural memory embeddings. All data analysis and storage remain strictly on the user's device. The team will receive a trip to San Francisco and lunch with a Y Combinator partner as their award.
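The winning team's models are not public, but the pattern it describes can be sketched: classify tone locally and keep only derived vectors in a local store, never raw keystrokes. The word lists and the toy hashed embedding below are assumptions for illustration, not the team's implementation.

```python
# Illustrative sketch of the privacy pattern: all analysis runs locally,
# and only tone labels plus derived vectors are retained, not raw text.
# Word lists and the hashing scheme are assumptions, not InnerSense's models.
import hashlib

POSITIVE = {"great", "love", "thanks", "happy"}
NEGATIVE = {"hate", "angry", "awful", "sad"}

def classify_tone(text: str) -> str:
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def embed(text: str, dims: int = 8) -> list[float]:
    # Toy hashed bag-of-words vector; stands in for a real embedding model.
    vec = [0.0] * dims
    for word in text.lower().split():
        h = int(hashlib.sha256(word.encode()).hexdigest(), 16)
        vec[h % dims] += 1.0
    return vec

# The "behavioural memory" lives only in a local structure;
# the raw input is discarded after processing.
memory: list[tuple[str, list[float]]] = []

def record(text: str) -> None:
    memory.append((classify_tone(text), embed(text)))

record("love this, thanks!")
print(memory[0][0])  # → positive
```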
Second place was won by Lucid (formerly Bscribe) for an augmented reality interface that overlays digital productivity tools on the physical environment. The solution performs all inference on-device, integrating data from email, calendar, tasks, and notes. Team members each receive a Nothing Phone (3) handset as their prize.
SDK deployment
The Cactus SDK at the centre of the event aims to let developers build offline-capable AI applications that are both private and responsive. The platform supports several popular languages and frameworks and is designed to make high-performance inference accessible on everyday devices.
Cactus Compute described its ambitions for the technology as making AI 'cheaper, faster, private, offline-capable' and removing the dependency on cloud-based inference for personal applications.