Apple Embracing Machine Learning with Core ML

Demand for machine learning in intelligent app development is increasing day by day, and the rise of intelligent apps is enormous.

According to Forrester Research, investment in artificial intelligence across all businesses will increase by more than 300% in 2017 compared with 2016.

Apple made its own investment in machine learning clear by recently introducing Core ML. The company wants AI on your mobile device to be as fast and powerful as possible, which is why it unveiled a new machine learning framework (Core ML) for developers on 5th June 2017.

Introducing Core ML – A New Machine Learning API by Apple

The Core ML framework covers everything from text analysis to face recognition, and it will impact a broad category of apps, transforming them into intelligent apps. With the introduction of Core ML, Apple claims that face recognition will run six times faster than Google's.

“Core ML delivers blazingly fast performance with easy integration of machine learning models enabling you to build apps with intelligent new features using just a few lines of code,” says Apple.

Core ML supports computer-vision features such as face tracking, face detection, landmark detection, text detection, rectangle detection, barcode detection, object tracking, and image registration. The natural language processing enabled by this machine learning framework can deeply understand text through language identification, tokenization, lemmatization, part-of-speech tagging, and named entity recognition.

Apple has introduced this machine learning framework so that developers can easily leverage the API to build image recognition into their photo apps, or develop chatbots that understand natural language more like a human does.

Core ML Models and Tools

Apple introduced the audience to four ready-to-use Core ML models:

  • Places205 – GoogLeNet

This model detects the scene in a picture, such as a forest, beach, airport, or shopping mall, from 205 listed categories.

  • ResNet50, InceptionV3, VGG16

These models detect objects in an image, such as people, animals, food, or trees, from 1,000 categories.

Core ML Tools

Core ML Tools is a Python package that allows developers to convert trained models from popular machine learning toolboxes (such as Caffe, Keras, and scikit-learn) into the Core ML format.

What's Your Say on This?

The key benefit of Core ML, as suggested by Apple, is speeding up how quickly AI tasks execute on the iPhone, iPad, and Apple Watch. However, other tech giants such as Google and Facebook already have a head start in machine learning.

LetsNurture always strives to learn and adopt new technologies. With the introduction of Core ML, we expect to deliver more intelligent iPhone apps in the future.

Will this new announcement of Apple's Core ML bring a revolution in machine learning? Do write to us or leave us a tweet at @letsnurture.