Pete Warden discusses code, data, and software process. When machine learning enters the mix, iterating on results can get tricky.
“I’m no shrinking violet when it comes to version control. I’ve toughed my way through some terrible systems, and I can still monkey together a solution using rsync and chicken wire if I have to. Even with all that behind me, I can say with my hand on my heart, that machine learning is by far the worst environment I’ve ever found for collaborating and keeping track of changes.”
Chris Lattner joined Google back in August.
“Super excited to be part of the team that's bringing Machine Learning supercomputers to the world with Google Cloud TPUs. An immense number of scalable FLOPS can completely change productivity - and hopefully unlock entirely new avenues for ML research and production.”
— Chris Lattner (@clattner_llvm), February 12, 2018
A list of over 150 machine learning terms, sorted alphabetically.
A small, randomly selected subset of the entire batch of examples run together in a single iteration of training or inference. The batch size of a mini-batch is usually between 10 and 1,000. It is much more efficient to calculate the loss on a mini-batch than on the full training data.
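The definition above can be sketched in plain Python. This is a minimal illustration, not code from the glossary; the toy model and squared-error loss are made up for the example.

```python
import random

def minibatch_loss(examples, labels, loss_fn, batch_size=32):
    """Estimate the training loss on a randomly sampled mini-batch
    instead of the full data set."""
    # Sample batch_size distinct indices uniformly at random.
    indices = random.sample(range(len(examples)), batch_size)
    # Average the per-example loss over the mini-batch only.
    return sum(loss_fn(examples[i], labels[i]) for i in indices) / batch_size

# Toy usage: a deliberately biased linear model with squared-error loss.
xs = list(range(1000))
ys = [2 * x for x in xs]
predict = lambda x: 2 * x + 1
squared_error = lambda x, y: (predict(x) - y) ** 2
print(minibatch_loss(xs, ys, squared_error, batch_size=10))  # → 1.0
```

Because every example here has the same error, the mini-batch estimate matches the full-data loss exactly; in practice the mini-batch loss is a noisy but far cheaper estimate of it.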
People, it seems, have an emotional reaction to things that look almost, but not quite, human. There are theories as to why, but regardless of the reason, it has been a challenging barrier to using photo-realistic computer-generated humans. Something the entertainment industry has been chipping away at for a while.
The makers of Mug Life have released an iOS app that can look at a 2D image of a face and then animate it in three dimensions. The results are eye-opening.
“This innovative technology featuring deep neural networks marries decades of video game expertise with the latest advances in computer vision.”
Technical advancements in this area are accelerating.
MLModel sits at the heart of Core ML. It's an abstraction that's focused on input and output features.
MLModelDescription indicates how these features are structured.
A list of free Core ML models with associated sample code and reference. Curated by Kedan Li.
“Download and use per license. Remember to acknowledge.”
The site currently has eight image filters and roughly twenty classifiers.
CUDA is NVIDIA's parallel computing platform and toolkit. Many deep learning frameworks use it to accelerate training. Apple last offered an NVIDIA GPU in a MacBook Pro in 2014: a GeForce GT 750M, with a CUDA compute capability of 3.0.
At WWDC, Apple announced official support for external GPU enclosures in macOS High Sierra. This week NVIDIA followed suit.
“After skipping the assorted High Sierra betas, NVIDIA has rolled out drivers for its line of PCI-E graphics cards.”
Coursera launched in 2011. As a co-founder, Andrew Ng offered a course on machine learning that quickly became popular. Six years later, Professor Ng is offering a new series of courses on deep learning.
In his post, Arvind Nagaraj offers some observations.
“In classic Ng style, the course is delivered through a carefully chosen curriculum, neatly timed videos and precisely positioned information nuggets. Andrew picks up from where his classic ML course left off.”
He also offers some encouragement.
“Everyone starts in this field as a beginner. If you are a complete newcomer to the deep learning field, it’s natural to feel intimidated by all the jargon and concepts. Please don’t give up.”
Apple isn’t just a hardware company. It’s not just a software company either. It’s both. I’ve always admired how clever the company is at leveraging that distinction.
The latest iPhone will unlock itself merely by looking at your face.