The Deep Dojo Machine Learning Blog

A Mac lover's guide to the robot apocalypse.

Jul 12, 2017

Python Notes

Just enough syntax to get you started in Python. Many details will look familiar to Swift programmers.


A Firsthand History of Deep Learning

Imagine you had Geoffrey Hinton, Yoshua Bengio and Yann LeCun in the same room talking about deep learning.

“In the 90’s other machine learning methods, that were easier for a novice to apply, did as well or better than neural nets on many problems. Interest in them died.

The three of us all knew they were ultimately going to be the answer. When we got better hardware and more data and a slight improvement in the techniques, they suddenly took off again.”

— Geoffrey Hinton

The interview starts 11 minutes in but the rest of the episode (and the Talking Machines podcast in general) has great content and production value.

“We had small data sets in computer vision that only have a few thousand training samples. If you train a convolutional net of the type that we had in the late 80’s and early 90’s, the performance would be very much lower than what you would get with classical vision systems. Mostly because those networks with many parameters are very hard to train. They learn the training set perfectly but they over-fit on the test set.

We devised a bunch of architectural components like rectification, contrast normalization and unsupervised pre-training that seemed to improve the performance significantly, which allowed those very heavy learning-based systems to match the performance or at least come close to the performance of classical systems. But it turns out all of this is rendered moot if you have lots of data and you use very large networks running on very fast computers.”

— Yann LeCun

“In the late 90’s and early 2000’s it was very, very difficult to do research in neural nets. In my own lab, I had to twist my students’ arms to do work on neural nets. They were afraid of seeing their papers rejected because they were working on the subject. Actually it did happen quite a bit for all the wrong reasons like, ‘Oh. We don’t do neural nets anymore.’

… I tried to even show mathematically why [the alternatives] wouldn’t work for the kinds of ambitious problems we wanted to solve for AI. That was how I started contributing towards the new wave of deep learning that CIFAR has promoted.”

— Yoshua Bengio

Correction: The original version of this post misspelled Yoshua Bengio’s name.

A Core ML Tutorial from Big Nerd Ranch

The post starts with the basics: building a linear model in Python with scikit-learn, converting it to an .mlmodel, and hooking it up to UIKit controls.

“Model building is difficult, and this isn’t the right post for a deep dive into model selection and performance.”

— Matt Mathias

The model uses a couple of features from the Boston housing data set to predict house prices, a simple problem space to wrap your head around.
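The workflow the post describes can be sketched roughly like this. This is a minimal sketch, not the tutorial's actual code: the training values are made up, and the feature names ("rooms", "lstat") and output filename are illustrative. The conversion step assumes the coremltools package is installed.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative stand-ins for two Boston housing features:
# average rooms per dwelling and percent lower-status population.
X = np.array([[6.5, 4.0], [5.9, 12.4], [7.2, 2.9], [6.0, 17.1]])
y = np.array([24.0, 19.9, 34.7, 16.5])  # median home value, in $1000s

model = LinearRegression()
model.fit(X, y)

# Predict a price for a new house.
price = model.predict([[6.8, 5.0]])[0]

# Convert the trained model to Core ML's .mlmodel format.
try:
    import coremltools
    mlmodel = coremltools.converters.sklearn.convert(
        model, ["rooms", "lstat"], "price")
    mlmodel.save("HousePricer.mlmodel")
except ImportError:
    pass  # coremltools not installed; skip the conversion step
```

The resulting .mlmodel file is what gets dragged into Xcode, where a typed Swift interface is generated for it.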

“Core ML makes working with machine learning and statistical models more convenient and idiomatic.

Nonetheless, it is worth ending with a caveat. Statistics and machine learning are not simply APIs. They are entire fields of study that are equal parts art and science. Developing and selecting an informative model requires practice and study.”

I completely agree. The same can be said about good graphic design. Having an expert create a design and integrating that design into your app are two different things.

What makes Core ML interesting is how little it asks of the developer who already has their .mlmodel in hand. It’s an approach to machine learning that says, “We’ll bring this technology to you instead of making you come to us.”

Machine Learning Inside Apple

Apple machine learning APIs aren’t just for third-party developers. Steven Levy wrote this article almost ten months before Apple announced Core ML. It offers a detailed look into how the company uses machine learning across its own products.

“As the briefing unfolds, it becomes clear how much AI has already shaped the overall experience of using the Apple ecosystem. The view from the AI establishment is that Apple is constrained by its lack of a search engine (which can deliver the data that helps to train neural networks) and its inflexible insistence on protecting user information (which potentially denies Apple data it otherwise might use). But it turns out that Apple has figured out how to jump both those hurdles.”

— Steven Levy

The integration of deep learning into Siri dates back to 2014.

“This was one of those things where the jump was so significant that you do the test again to make sure that somebody didn’t drop a decimal place.”

— Eddy Cue

One detail that keeps getting hinted at, both in the article and elsewhere, is online learning.

“We keep some of the most sensitive things where the ML is occurring entirely local to the device.”

— Craig Federighi

Details aren’t clear enough from this kind of statement to know for sure. Online learning implies an ability to train the machine learning model on the device, something conspicuously absent from the first public version of Core ML.

This kind of limitation is understandable. As Gaurav Kapoor said in his introduction to the framework, “Training is a complex subject.”

With deep learning, jumping from inference to training is like jumping from the Preview app into Photoshop. The change in complexity and required expertise is significant. It wouldn’t be surprising if Apple waits and doesn’t add training to Core ML until they are able to flatten that learning curve, making it more like a jump from Preview to Pages.

iOS 11: Machine Learning for Everyone

The best post on Core ML I’ve seen so far.

“[Metal Performance Shaders] Graph API. This is the big news as far as I’m concerned. Creating all the layers and (temporary) images by hand was always a nuisance. Now you can describe a graph, just like you would in Keras. MPS will automatically figure out how large the images need to be, how to deal with padding, how to set the offset of your MPS kernels, and so on. It can even optimize the graph behind the scenes by fusing layers.”

— Matthijs Hollemans

Matthijs is a pro. Helping people integrate deep learning into iOS apps is what he does for a living. His blog is rich with explanation, diagrams and technical detail.

“The new graph API makes my Forge library pretty much obsolete, unless you want to keep supporting iOS 10 in your apps.”

If your deployment target is staying on iOS 10 for a while, the Forge library may be your best bet until you’re able to migrate to the machine learning features in iOS 11.

Jun 8, 2017

Apple Introduces Core ML

When was the last time you opened up a PDF file and edited the design of the document directly?

You haven’t.

PDF is not about making a document. PDF is about being able to easily view a document.

With Core ML, Apple has managed to achieve an equivalent of PDF for machine learning. With their .mlmodel format, the company is not venturing into the business of training models (at least not yet). Instead, they have rolled out a meticulously crafted red carpet for models that are already trained. It’s a carpet that deploys across their entire lineup of hardware.

As a business strategy, it’s shrewd. As a technical achievement, it’s stunning. It moves complex machine learning technology within reach of the average developer.

To use a trained model in your project, you literally drag and drop the model file into Xcode. A type-safe Swift interface for the model gets synthesized automatically. A hardware-accelerated runtime implementation is created as well. Vast amounts of technical detail that typically encumber machine learning are encapsulated away.

You provide input. The model provides output. You’re done.

// Xcode synthesizes the FlowerClassifier class from the .mlmodel file.
let flowerModel = FlowerClassifier()

// prediction(flowerImage:) can throw, so a failed prediction simply falls through.
if let prediction = try? flowerModel.prediction(flowerImage: image) {
    return prediction.flowerType
}

Apple's Neural Engine

“The Apple AI chip is designed to make significant improvements to Apple’s hardware over time, and the company plans to eventually integrate the chip into many of its devices, including the iPhone and iPad.”

It’s been hard to watch machine learning take off without corresponding developer support from Apple. We saw a hint of it last year at WWDC, but in terms of training networks we still face the prospect of writing Python for machine learning environments that are ultimately optimized for NVIDIA cards.

“Apple also plans to offer developer access to the chip so third-party apps can also offload artificial intelligence-related tasks.”

Swift, despite showing up with little warning, is a modern language that has enjoyed significant traction. I wouldn’t put it past Apple to accomplish something similar with a machine learning development environment written in Swift and optimized for Apple hardware.

May 31, 2017

Welcome

Welcome to the Deep Dojo blog. A guide to machine learning on the Mac. If you’re interested in machine learning news and how it intersects with developing software for Apple hardware, you’re in the right place.
