Apple’s Craig Federighi explains how handwriting recognition on iPadOS 14 works

Image: Craig Federighi detailing Scribble for iPadOS 14

One of iPadOS 14's headline features is handwriting recognition, known as Scribble. With the new software, released a little more than two weeks ago, users can write with the Apple Pencil in any text field and have their handwriting automatically converted to typed text.

Most existing handwriting-recognition systems work by sending the handwriting data to remote servers or data centers for processing. Like many of Apple's machine learning features, handwriting recognition on iPadOS 14 happens entirely on device. As a privacy measure, Apple handles every request with on-device machine learning rather than off-site processing. While beneficial from a privacy standpoint, this also means the device's own computing power must be able to handle such demanding work.
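Apple doesn't publish Scribble's internals, but the on-device pattern is visible in its public frameworks. As a rough illustration, here is a minimal Swift sketch using Vision's VNRecognizeTextRequest, which performs text recognition locally; the image never leaves the device:

```swift
import Vision
import UIKit

// A minimal sketch of on-device recognition using Apple's public Vision
// framework. This is not Scribble's private pipeline, but it illustrates
// the same principle: all processing happens locally.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate string for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate // trades speed for accuracy, still local

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```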

Beyond raw processing constraints, the iPad also needs to actually recognize each user's handwriting. To achieve this, Apple collected handwriting samples from people around the world in different variations, such as slow, fast, or tilted writing, according to Apple's SVP of Software Engineering Craig Federighi.

“When it comes to understanding [handwriting] strokes, we do data-gathering. We find people all over the world, and have them write things,” says Craig Federighi, senior vice president of software engineering at Apple. “We give them a Pencil, and we have them write fast, we have them write slow, write at a tilt. All of this variation.”

Popular Mechanics

Typical computer models recognize a handwritten letter by comparing it to static images of letters stored on servers, an approach whose accuracy is questionable given how unique each person's handwriting is. On iPadOS 14, Apple doesn't compare user input against a database of images; instead, it analyzes the individual strokes of the Apple Pencil to understand what is being written.

“If you understand the strokes and how the strokes went down, that can be used to disambiguate what was being written.”

Craig Federighi
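To get a feel for what "understanding the strokes" can mean in practice, consider the data a stroke actually carries. The sketch below uses PencilKit's public PKDrawing and PKStroke types (iOS 14) to pull out ordered points with timing, force, and tilt; the StrokeFeatures container, and the idea of feeding it to a recognizer, are hypothetical illustrations, not Apple's internal representation:

```swift
import PencilKit
import CoreGraphics
import Foundation

// Hypothetical per-stroke feature bundle. PencilKit exposes the stroke
// path itself -- ordered points with timing, pressure, and tilt -- rather
// than a flat image, which is exactly the kind of signal a stroke-based
// recognizer can exploit.
struct StrokeFeatures {
    let points: [CGPoint]           // ordered pen positions
    let timeOffsets: [TimeInterval] // when each point was sampled
    let forces: [CGFloat]           // pen pressure at each point
    let altitudes: [CGFloat]        // pencil tilt, in radians
}

func extractFeatures(from drawing: PKDrawing) -> [StrokeFeatures] {
    drawing.strokes.map { stroke in
        // Sample the stroke's interpolated path at a fixed parametric step.
        let samples = stroke.path.interpolatedPoints(by: .parametricStep(1.0))
        var points: [CGPoint] = [], times: [TimeInterval] = []
        var forces: [CGFloat] = [], altitudes: [CGFloat] = []
        for p in samples {
            points.append(p.location)
            times.append(p.timeOffset)
            forces.append(p.force)
            altitudes.append(p.altitude)
        }
        return StrokeFeatures(points: points, timeOffsets: times,
                              forces: forces, altitudes: altitudes)
    }
}
```

Because the order and dynamics of the strokes are preserved, a recognizer working from this data can tell apart characters that look identical as finished images but are drawn differently.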

Apple’s approach means that the software can understand what you’re writing as you’re writing it, which allows it to also predict what you’ll write next.

“It’s gotta be happening in real time, right now, on the device that you’re holding,” Federighi says. “Which means that the computational power of the device has to be such that it can do that level of processing locally.”

Popular Mechanics
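Apple hasn't published how Scribble schedules its recognition passes, but the shape of a "recognize as you write" loop is easy to sketch. The example below reacts to PencilKit's real canvasViewDrawingDidChange(_:) callback and hands the drawing to a placeholder recognize(_:) function, which stands in for an on-device model:

```swift
import PencilKit
import UIKit

// A sketch of a live recognition loop. canvasViewDrawingDidChange(_:) is
// a real PencilKit delegate callback that fires with every change to the
// drawing -- the kind of hook that makes real-time, local recognition
// feasible. The recognize(_:) function is a stand-in, not Apple's API.
final class LiveRecognizer: NSObject, PKCanvasViewDelegate {
    var onUpdate: ((String) -> Void)?

    func canvasViewDrawingDidChange(_ canvasView: PKCanvasView) {
        let drawing = canvasView.drawing
        // Run the (assumed) recognizer off the main thread so inking
        // stays responsive, then publish the latest transcription.
        DispatchQueue.global(qos: .userInteractive).async { [weak self] in
            let text = self?.recognize(drawing) ?? ""
            DispatchQueue.main.async { self?.onUpdate?(text) }
        }
    }

    // Placeholder for an on-device model; Apple's actual Scribble
    // recognizer is private and not exposed as public API.
    private func recognize(_ drawing: PKDrawing) -> String { "" }
}
```

Keeping the model call off the main thread is what lets the pencil keep inking smoothly while recognition runs continuously in the background.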

Apple's implementation of this approach extends well beyond handwriting and into drawing as well, powering shape recognition in Notes, for example. Either way, the new feature lets Apple Pencil power users keep working without having to set the Pencil down, pull up the keyboard, and type. The fluidity of the iPad workflow, thanks to the Apple Pencil and iPadOS 14, is one way Apple hopes to further distinguish the iPad in a crowded marketplace of tablets and touchscreen laptops.

Sami
Sami started falling in love with Apple in 2010 with the iPhone 4S. As a registered developer, he deeply admires the world of Apple. Sami is an aspiring journalist, writer, and actor, and has devoted his life to sharing his passion and knowledge with others around the world.