Note: All images in this directory, unless specified otherwise, are licensed under CC BY-NC 4.0.
Figure number | Description |
---|---|
11-1 | The mobile AI development life cycle |
11-2 | Frameworks compatible with Core ML for model interchange as of 2019 |
11-3 | Different levels of APIs provided by Apple for app developers |
11-4 | Ready-to-use models from the Apple Machine Learning website |
11-5 | How different scaling options modify the input image to Core ML models |
11-6 | Project information view within Xcode |
11-7 | Select a team and let Xcode automatically manage code signing |
11-8 | Select the device and click the “Build and Run” button to deploy the app |
11-9 | Profiles and Device Management screen |
11-10 | Screenshot of the app |
11-11 | Xcode model inspector showing the inputs and outputs of the MobileNetV2 model |
11-12 | Output layer of MobileNet as seen in Netron |
11-13 | App Store reviews for YouTube and Snapchat that complain about heavy battery consumption |
11-14 | Xcode Debug Navigator tab |
11-15 | Xcode Energy Impact chart on an iPad Pro 2017 (note: this screenshot was taken at a different time than Figure 11-14, which is why the numbers are slightly different) |
11-16 | Comparing CPU and GPU utilization for different models on iOS 11 |
11-17 | Varying the FPS and analyzing the load on an iPad Pro 2017 |
11-18 | The main window of Xcode Instruments |
11-19 | An app being profiled live in the Core Animation instrument |
11-20 | Step-by-step solving using ARKit |
11-21 | The HomeCourt app tracking a player’s shots live as they are playing |
11-22 | Screenshots of InstaSaber and YoPuppet |
11-23 | Real-time hand pose estimation within YoPuppet |