Apple’s New Core ML 2 is a Gold Mine Worth Exploring

The Worldwide Developers Conference (WWDC) hosted by Apple every year has always been the stage for ground-breaking announcements and releases from the tech giant. This year brought another significant step forward, with Apple announcing version 2 of Core ML. Apple introduced Core ML, its machine learning framework, in 2017 to enable developers to integrate on-device machine learning models into its products. Core ML is designed to optimize models for power efficiency, and because models run on the device, no internet connection is required to benefit from them.

What’s new in Core ML 2

According to Apple, Core ML 2 delivers 30 percent faster performance and makes it easy to integrate machine learning models: with a few lines of code, you can build applications with intelligent features. Core ML 2 also lets you create your own machine learning models on your Mac, using playgrounds in Xcode 10. You can integrate a wide range of machine learning model types into your application, and Apple claims the framework supports extensive deep learning with over 30 layer types. Because the models run on the device itself, your data never has to leave the device for analysis.
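To give a sense of what "a few lines of code" can look like, here is a minimal Swift sketch that classifies an image with a Core ML model through the Vision framework. The FlowerClassifier model class is a hypothetical example used only for illustration; Xcode generates a similar Swift class for any .mlmodel file added to a project.

```swift
import CoreML
import UIKit
import Vision

// Minimal sketch: classify a UIImage with a bundled Core ML model.
// "FlowerClassifier" is a hypothetical model; Xcode generates a Swift
// class like it for every .mlmodel file you add to the project.
func classify(_ image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: FlowerClassifier().model) else {
        completion(nil)
        return
    }

    // Vision wraps the Core ML model and handles image scaling and cropping.
    let request = VNCoreMLRequest(model: model) { request, _ in
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The heavy lifting (loading the model, preprocessing the image, running inference) stays on the device, which is the point of the framework.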

The Core ML 2 toolkit also enables developers to reduce the size of trained ML models. One of its greatest advantages is built-in support for vision and natural language processing. Machine learning features you can build into your application with Core ML 2 include face detection and tracking, text detection, rectangle detection, barcode detection, object tracking, and image registration, as sketched below. You can also train and deploy custom NLP models for analyzing natural language text and deducing language-specific metadata.
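As a rough illustration of the vision support, the sketch below uses the Vision framework's built-in face detector on a UIImage. The function name and threading choices are assumptions made for the example, not Apple sample code.

```swift
import UIKit
import Vision

// Minimal sketch: detect faces in a UIImage with the Vision framework,
// one of the built-in tasks that runs on top of Core ML.
func detectFaces(in image: UIImage, completion: @escaping ([CGRect]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNDetectFaceRectanglesRequest { request, _ in
        // Bounding boxes come back in normalized coordinates (0...1).
        let boxes = (request.results as? [VNFaceObservation])?.map { $0.boundingBox } ?? []
        completion(boxes)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Barcode detection, text detection, rectangle detection, and object tracking follow the same request-and-handler pattern with different VNRequest subclasses.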

News of Core ML’s update comes as a welcome boost for Apple’s developers and fans, especially since Google announced a machine learning software development kit compatible with both iOS and Android at its I/O 2018 conference. Core ML is expected to play a crucial role in Apple’s future hardware products. Reports from Silicon Valley also indicate that the tech giant is developing a chip, named the Apple Neural Engine, to accelerate computer vision, speech recognition, facial recognition, and other forms of artificial intelligence. Apple plans to include the chip in its upcoming devices and to offer third-party developers access to it so they can run their own AI.
