Focsle

MobileCLIP-S2

by Apple

Apple's reference vision-language embedding model for mobile, and the encoder behind much of Apple's on-device retrieval.

Multimodal · Apache-2.0 · INT8 · FP16 · clip · embedding · mobile
313K downloads · 18K deployments · Updated Mar 30, 2028
Headline: 6.4 ms · Apple Neural Engine (M4) · INT8

Deploy MobileCLIP-S2

Pick a chip family. We hand you the artifacts (HEF, TRT engine, Core ML, ONNX) plus a one-click endpoint deploy. For private endpoints, on-prem deploy, or air-gapped distribution, see Enterprise.

Apple Neural Engine (M4)
# Pull the Core ML / ANE artifact, then compile the ONNX export for Core ML
$ focsle pull apple/mobileclip-s2 --target coreml-ane
$ focsle compile mobileclip-s2.onnx --target coreml --precision fp16

# Run inference through the Core ML runtime
import focsle.runtime as fr

m = fr.load("mobileclip-s2.mlmodelc", target="coreml")
out = m.run(frame)  # frame: a preprocessed image tensor
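Once you have embeddings from the model, retrieval is a nearest-neighbor search in embedding space. A minimal sketch of that step with NumPy, using random vectors as stand-ins for real image/text embeddings (the `cosine_top_k` helper and the 512-dim shape are illustrative assumptions, not part of the Focsle API):

```python
import numpy as np

def cosine_top_k(query, gallery, k=3):
    """Return indices of the k gallery embeddings most similar to query."""
    q = query / np.linalg.norm(query)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = g @ q                      # cosine similarity against every row
    return np.argsort(scores)[::-1][:k]  # highest similarity first

# Stand-in gallery: in practice these would be m.run(...) outputs.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(100, 512)).astype(np.float32)
# A query near gallery item 42 (item 42 plus a little noise).
query = gallery[42] + 0.01 * rng.normal(size=512).astype(np.float32)
print(cosine_top_k(query, gallery, k=3)[0])  # → 42
```

CLIP-style encoders are trained so that cosine similarity between image and text embeddings reflects semantic match, which is why this one `argsort` is the whole retrieval core.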

One-click endpoint

Spins up a managed endpoint in the region closest to you. Available on Pro plans and above.

Or deploy yourself