MOB · MOD.MOB-06 · v1.0
Run inference offline.
Stream when you're online.
6 micro-lessons · ~48 min · Real Docker images
[Device mockup · ON-DEVICE]
MODEL core-ml-q4 · SIZE 47 MB · INFERENCE ~12 ms/inf
CORE ML 6 · NNAPI 1.3 · OFFLINE READY
MOB · ROLE TRACK
AI for Mobile Engineers
On-device inference, iOS / Android, flaky-network streaming.
WHY THIS MATTERS · SNAP INTERNAL
Mobile track adopted by 4 of the top 10 iOS app studios in our 2026-Q1 partner cohort.
01 · On-device inference
02 · iOS Core ML
03 · Android NNAPI
04 · Streaming on flaky networks
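The flaky-network lesson boils down to two moves: resume the stream from the last byte received, and back off with jitter between retries so reconnecting clients don't stampede. A minimal sketch of that pattern (function names and parameters here are illustrative, not taken from the course material):

```python
import random
import time

def backoff_delays(attempts, base=0.5, cap=8.0, seed=0):
    # Exponential backoff with full jitter: delay_n is uniform in
    # [0, min(cap, base * 2**n)], so retries spread out over time.
    rng = random.Random(seed)
    return [rng.uniform(0, min(cap, base * 2 ** n)) for n in range(attempts)]

def stream_with_retry(chunks_fn, max_retries=4, sleep=time.sleep):
    # chunks_fn(offset) yields bytes starting at `offset`; on a dropped
    # connection we sleep per the backoff schedule, then resume where
    # the stream broke instead of re-downloading from zero.
    offset = 0
    for delay in [0.0] + backoff_delays(max_retries):
        sleep(delay)
        try:
            for chunk in chunks_fn(offset):
                offset += len(chunk)
                yield chunk
            return
        except ConnectionError:
            continue
    raise ConnectionError("retries exhausted")
```

On a real client, `chunks_fn` would map to an HTTP range request; the resume-from-offset shape is the same either way.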
Ship Core ML / NNAPI models
Stream over flaky networks gracefully
Cache + invalidate sanely
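"Cache + invalidate sanely" usually means two things in practice: entries carry bounded staleness (a TTL), and there is a cheap way to drop everything at once when a new model ships. A hedged sketch of that shape (class and method names are my own, not from the lessons):

```python
import time

class TTLCache:
    # Tiny model-artifact cache: entries expire after `ttl` seconds, and a
    # version bump (e.g. a new model release) invalidates everything in O(1).
    def __init__(self, ttl=300.0, clock=time.monotonic):
        self.ttl, self.clock = ttl, clock
        self.version = 0
        self._store = {}  # key -> (version, expires_at, value)

    def put(self, key, value):
        self._store[key] = (self.version, self.clock() + self.ttl, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        version, expires_at, value = entry
        if version != self.version or self.clock() > expires_at:
            del self._store[key]  # lazily evict stale entries on read
            return None
        return value

    def invalidate_all(self):
        self.version += 1  # old entries become unreadable without a sweep
```

The injected `clock` makes expiry testable without sleeping, which matters on device where wall-clock jumps are common.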
$ docker pull snap/ai-mobile:lesson-01
$ docker run --rm -it snap/ai-mobile:lesson-01
Flaky-network lesson fixed our AppStore complaints.
@mob_mae · VERIFY ON GITHUB
On-device inference: practical, runnable, fast.
@sre_maya · VERIFY ON GITHUB
LESSONS 6
HOURS ~0.8
LEARNERS 740
THIS WEEK +22%