The Deep Dojo Machine Learning Blog

A Mac lover's guide to the robot apocalypse.

iOS 11: Machine Learning for Everyone

The best post on Core ML I’ve seen so far.

“[Metal Performance Shaders] Graph API. This is the big news as far as I’m concerned. Creating all the layers and (temporary) images by hand was always a nuisance. Now you can describe a graph, just like you would in Keras. MPS will automatically figure out how large the images need to be, how to deal with padding, how to set the offset of your MPS kernels, and so on. It can even optimize the graph behind the scenes by fusing layers.”

— Matthijs Hollemans

Matthijs is a pro. Helping people integrate deep learning into iOS apps is what he does for a living. His blog is rich with explanations, diagrams, and technical detail.

“The new graph API makes my Forge library pretty much obsolete, unless you want to keep supporting iOS 10 in your apps.”

If your deployment target is staying on iOS 10 for a while, the Forge library may be your best bet until you’re able to migrate to the machine learning features in iOS 11.
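To make the quote concrete, here is a minimal sketch of what describing a network with the new graph API looks like. The toy network (one convolution, a ReLU, and a max pool) and the names ZeroConvDataSource and makeGraph are illustrative inventions, not anything from Matthijs’s post; a real data source would hand MPS trained weights rather than zeros.

import Metal
import MetalPerformanceShaders

// Supplies the convolution’s weights to MPS. Here the weights are all
// zeros; a real data source would load trained parameters from disk.
class ZeroConvDataSource: NSObject, MPSCNNConvolutionDataSource {
    let kernelSize = 3
    let inputChannels = 3
    let outputChannels = 16

    private let weightPointer: UnsafeMutablePointer<Float>
    private let biasPointer: UnsafeMutablePointer<Float>

    override init() {
        let weightCount = outputChannels * kernelSize * kernelSize * inputChannels
        weightPointer = UnsafeMutablePointer<Float>.allocate(capacity: weightCount)
        weightPointer.initialize(repeating: 0, count: weightCount)
        biasPointer = UnsafeMutablePointer<Float>.allocate(capacity: outputChannels)
        biasPointer.initialize(repeating: 0, count: outputChannels)
        super.init()
    }

    deinit {
        weightPointer.deallocate()
        biasPointer.deallocate()
    }

    func dataType() -> MPSDataType { return .float32 }

    func descriptor() -> MPSCNNConvolutionDescriptor {
        return MPSCNNConvolutionDescriptor(kernelWidth: kernelSize,
                                           kernelHeight: kernelSize,
                                           inputFeatureChannels: inputChannels,
                                           outputFeatureChannels: outputChannels)
    }

    func weights() -> UnsafeMutableRawPointer {
        return UnsafeMutableRawPointer(weightPointer)
    }

    func biasTerms() -> UnsafeMutablePointer<Float>? { return biasPointer }
    func load() -> Bool { return true }
    func purge() { }
    func label() -> String? { return "conv1" }

    // Required when building against newer SDKs; the data source is
    // immutable here, so returning self is a safe “copy”.
    func copy(with zone: NSZone?) -> Any { return self }
}

// Describe the network as nodes. MPS sizes the intermediate images and
// handles padding, offsets, and layer fusion when the graph is encoded.
func makeGraph(device: MTLDevice) -> MPSNNGraph? {
    let input = MPSNNImageNode(handle: nil)
    let conv = MPSCNNConvolutionNode(source: input, weights: ZeroConvDataSource())
    let relu = MPSCNNNeuronReLUNode(source: conv.resultImage)
    let pool = MPSCNNPoolingMaxNode(source: relu.resultImage, filterSize: 2)
    return MPSNNGraph(device: device, resultImage: pool.resultImage)
}

At run time you encode the graph into a Metal command buffer with your input image; the intermediate sizing, padding, offsets, and layer fusion the quote describes all happen inside MPS.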

Jun 8, 2017

Apple Introduces Core ML

When was the last time you opened up a PDF file and edited the design of the document directly?

You don’t.

PDF is not about making a document. PDF is about being able to easily view a document.

With Core ML, Apple has managed to achieve an equivalent of PDF for machine learning. With their .mlmodel format, the company is not venturing into the business of training models (at least not yet). Instead, they have rolled out a meticulously crafted red carpet for models that are already trained. It’s a carpet that deploys across their entire lineup of hardware.

As a business strategy, it’s shrewd. As a technical achievement, it’s stunning. It moves complex machine learning technology within reach of the average developer.

To use a trained model in your project, you literally drag and drop the model file into Xcode. A type-safe Swift interface for the model gets synthesized automatically. A hardware-accelerated runtime implementation is created as well. Vast amounts of technical detail that typically encumber machine learning are encapsulated away.
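For a sense of what that synthesized interface looks like, here is a hand-written approximation, not Apple’s actual generated source: the feature names flowerImage and flowerType are taken from the example below, and the real generated file adds richer input and output types.

import CoreML
import CoreVideo

// A simplified stand-in for the output type Xcode generates.
struct FlowerClassifierOutput {
    let flowerType: String
}

class FlowerClassifier {
    let model: MLModel

    // The generated code loads the compiled model (.mlmodelc) from the app bundle.
    init() {
        let url = Bundle.main.url(forResource: "FlowerClassifier",
                                  withExtension: "mlmodelc")!
        model = try! MLModel(contentsOf: url)
    }

    // One strongly typed method per model; the input and output names
    // come straight from the .mlmodel file’s metadata.
    func prediction(flowerImage: CVPixelBuffer) throws -> FlowerClassifierOutput {
        let input = try MLDictionaryFeatureProvider(
            dictionary: ["flowerImage": MLFeatureValue(pixelBuffer: flowerImage)])
        let features = try model.prediction(from: input)
        return FlowerClassifierOutput(
            flowerType: features.featureValue(for: "flowerType")?.stringValue ?? "")
    }
}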

You provide input. The model provides output. You’re done.

let flowerModel = FlowerClassifier()

// prediction(flowerImage:) is generated from the model’s input names.
// It throws on failure, so try? collapses any error to nil here.
if let prediction = try? flowerModel.prediction(flowerImage: image) {
    return prediction.flowerType
}
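To classify real images, one convenient route, sketched here as an assumption about your setup rather than anything the generated code requires, is to wrap the model in the Vision framework, which scales and crops each image to the input size the model expects:

import CoreML
import Vision

// Classify a CGImage with the generated FlowerClassifier model.
func classifyFlower(in image: CGImage) throws -> String? {
    let visionModel = try VNCoreMLModel(for: FlowerClassifier().model)

    var bestLabel: String?
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // For classifiers, Vision reports ranked classification observations.
        bestLabel = (request.results as? [VNClassificationObservation])?
            .first?.identifier
    }

    // perform(_:) runs synchronously, so bestLabel is set before we return.
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    return bestLabel
}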

Apple's Neural Engine

“The Apple AI chip is designed to make significant improvements to Apple’s hardware over time, and the company plans to eventually integrate the chip into many of its devices, including the iPhone and iPad.”

It’s been hard to watch machine learning take off without corresponding developer support from Apple. We saw a hint of it last year at WWDC, but when it comes to training networks we still face the prospect of writing Python for machine learning environments that are ultimately optimized for NVIDIA cards.

“Apple also plans to offer developer access to the chip so third-party apps can also offload artificial intelligence-related tasks.”

Swift, despite showing up with little warning, is a modern language that has enjoyed significant traction. I wouldn’t put it past Apple to accomplish something similar with a machine learning development environment written in Swift and optimized for Apple hardware.

May 31, 2017

Welcome

Welcome to Deep Dojo, a guide to machine learning on the Mac. If you’re interested in machine learning news and how it intersects with developing software for Apple hardware, you’re in the right place.
