MOBMOD.MOB-06 · v1.0

Run inference offline.
Stream when you're online.

6 micro-lessons · ~48 min · Real Docker images

MOBILE CHASSIS · ON-DEVICE
[Device mockup] Demo model: core-ml-q4 · 47 MB · ~12 ms / inference · Core ML 6 · NNAPI 1.3 · Offline ready
MOB · ROLE TRACK

AI for Mobile Engineers

On-device inference, iOS / Android, flaky-network streaming.

WHY THIS MATTERS · SNAP INTERNAL
Mobile track adopted by 4 of the top 10 iOS app studios in our 2026-Q1 partner cohort.
WHAT YOU'LL LEARN
01 · On-device inference
02 · iOS Core ML
03 · Android NNAPI
04 · Streaming on flaky networks
YOU'LL BE ABLE TO
Ship Core ML / NNAPI models
Stream over flaky networks gracefully
Cache + invalidate sanely
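As a taste of the streaming material, here is a minimal, hypothetical sketch (not course code) of streaming over a flaky network gracefully: resume from the last good byte offset and back off exponentially between retries. The `fetch_chunk` callback and all parameter names are illustrative stand-ins for whatever transport your app actually uses.

```python
import random
import time

def stream_with_retry(fetch_chunk, total_size, chunk_size=4096,
                      max_retries=5, base_delay=0.5):
    """Stream `total_size` bytes via `fetch_chunk(offset, size)`,
    resuming from the last good offset after transient failures."""
    data = bytearray()
    retries = 0
    while len(data) < total_size:
        try:
            # Resume exactly where the last successful chunk ended.
            chunk = fetch_chunk(len(data), chunk_size)
            data.extend(chunk)
            retries = 0  # reset the budget after any successful chunk
        except ConnectionError:
            retries += 1
            if retries > max_retries:
                raise
            # Exponential backoff with jitter so clients don't stampede.
            time.sleep(base_delay * (2 ** (retries - 1)) * random.uniform(0.5, 1.5))
    return bytes(data)
```

Resetting the retry budget after each successful chunk is what makes this tolerate long flaky sessions rather than just one bad moment.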
SKILLS YOU'LL GAIN

Real skills, real career delta.


7 skills
  • Ship Core ML / NNAPI models · Working

    Outcome from completing the course: ship Core ML / NNAPI models.

  • Stream over flaky networks gracefully · Working

    Outcome from completing the course: stream over flaky networks gracefully.

  • Cache + invalidate sanely · Working

    Outcome from completing the course: cache + invalidate sanely.

  • On-device inference · Working

    Covered in the lesson sequence; drop-in ready.

  • iOS Core ML · Working

    Covered in the lesson sequence; drop-in ready.

  • Android NNAPI · Working

    Covered in the lesson sequence; drop-in ready.

  • Streaming on flaky networks · Working

    Covered in the lesson sequence; drop-in ready.
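To illustrate what "cache + invalidate sanely" can look like in practice, here is a small hypothetical sketch (not course code) of an on-device model-blob cache that invalidates on either a TTL or a version bump. The class and parameter names are assumptions for illustration only.

```python
import time

class ModelCache:
    """Tiny on-device model cache: entries expire after `ttl` seconds,
    and bumping the model version invalidates stale blobs immediately."""

    def __init__(self, ttl=3600.0):
        self.ttl = ttl
        self._store = {}  # name -> (version, stored_at, blob)

    def put(self, name, version, blob, now=None):
        stored_at = now if now is not None else time.time()
        self._store[name] = (version, stored_at, blob)

    def get(self, name, version, now=None):
        entry = self._store.get(name)
        if entry is None:
            return None
        cached_version, stored_at, blob = entry
        now = now if now is not None else time.time()
        if cached_version != version or now - stored_at > self.ttl:
            # Stale: the server bumped the version or the TTL elapsed.
            del self._store[name]
            return None
        return blob
```

Checking the version on every read means a server-side model update takes effect on the next fetch, without waiting for the TTL to run out.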

RUNNABLE ON YOUR MACHINE
$ docker pull snap/ai-mobile:lesson-01
$ docker run --rm -it snap/ai-mobile:lesson-01
QUICK PREVIEW · 7 MIN
VERIFIED ENGINEER REVIEWS
Flaky-network lesson fixed our AppStore complaints.
@mob_mae · VERIFY ON GITHUB
On-device inference: practical, runnable, fast.
@sre_maya · VERIFY ON GITHUB
LESSONS · 6
HOURS · ~0.8
LEARNERS · 740
THIS WEEK · +22%