Bringing Animation to Life

What do you call it when you discover your customers are using your product in a way it was never intended or designed to support?

Opportunity? Inspiration?

At Adobe we call it Character Animator.

Demonstrated at NAB 2015 and rolling out as a preview technology with the next version of Adobe After Effects, Character Animator provides professional and novice animators alike with an exciting new way to bring their creations to life.

Character Animator is the result of three years of collaboration between Adobe’s Advanced Product Development Group and the Creative Technology Labs (CTL) Research Group. “We visit our customers every year to see what they’re doing, and increasingly we noticed people using After Effects to create character animation despite the fact it wasn’t designed for it,” says David Simons, principal scientist in the Advanced Product Development Group.

“After Effects is very powerful — anything you can imagine as moving imagery you can probably create in After Effects if you spend enough time and you know the tool,” he adds, “but creating a character that looks like it is alive is always a lot of work and it doesn’t need to be that way… We thought, what if we could take someone who knows how to draw in Photoshop or Illustrator, and allow them to create a puppet that’s already alive the moment you place it on the stage?”

Although Character Animator is part of After Effects, it’s built from the ground up to support a completely new way of controlling animation via face-tracking technology and live performance.
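Adobe hasn’t published the internals, but the core idea is easy to picture: a face tracker measures facial features every frame, and those measurements drive the puppet’s controls. Here is a minimal Python sketch under that assumption; the feature names, thresholds, and layer names are hypothetical illustrations, not Character Animator’s actual data model.

```python
# Minimal sketch: per-frame face measurements drive puppet controls.
# FaceFeatures, PuppetPose, and the mapping below are hypothetical
# illustrations of the technique, not Character Animator's real API.
import math
from dataclasses import dataclass

@dataclass
class FaceFeatures:
    head_tilt: float       # radians, from the tracked head pose
    mouth_openness: float  # 0.0 (closed) to 1.0 (wide open)
    brow_raise: float      # 0.0 (neutral) to 1.0 (fully raised)

@dataclass
class PuppetPose:
    head_rotation: float   # degrees applied to the head layer
    mouth_layer: str       # which mouth artwork layer to display
    brow_offset: float     # pixels to shift the eyebrow layers up

def pose_from_face(f: FaceFeatures) -> PuppetPose:
    """Map one frame of face-tracking data onto puppet controls."""
    mouth = "Mouth Open" if f.mouth_openness > 0.3 else "Mouth Closed"
    return PuppetPose(
        head_rotation=math.degrees(f.head_tilt),
        mouth_layer=mouth,
        brow_offset=20.0 * f.brow_raise,  # scale the raise into pixels
    )

# One frame of a live performance: tilted head, open mouth, raised brows.
print(pose_from_face(FaceFeatures(head_tilt=0.1, mouth_openness=0.6, brow_raise=0.8)))
```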

Existing software for character animation tends to be either very complicated, giving the animator the freedom to do practically anything, or simple but limited to canned characters. “We wanted something that’s the best of both worlds,” David reveals, “where it’s easy to use, but you can use your own artwork to create characters, and they’re still expressive.

“The design goal is that if you watch someone doing it for one minute, you should feel like you know how to do it too. And everyone knows how to use their own face, so using face-tracking to control the character opens the door to many people who would have never considered it before.”

Although controlling puppets with face-tracked performances is intuitive and easy, there’s a lot going on behind the scenes to provide animators with the customization and performance-recording capabilities needed to make Character Animator useful in professional settings.

For example, artists can customize the look of each puppet by generating layered artwork to match a predefined template of expressions and behaviors, such as the shape of the mouth when making different sounds, or the way eyebrows rise to express surprise.
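As a rough sketch of how such a template could work, the mapping below ties detected speech sounds (visemes) to mouth layers named after template slots. The viseme and layer names are invented for illustration, not Character Animator’s actual template.

```python
# Sketch of a layer-naming template: detected speech sounds (visemes)
# select which mouth artwork layer to show. All names here are
# hypothetical, not Character Animator's actual template.
VISEME_TO_LAYER = {
    "Aa": "Mouth/Aa",
    "Oh": "Mouth/Oh",
    "Ee": "Mouth/Ee",
    "M": "Mouth/M",           # lips pressed together
    "rest": "Mouth/Neutral",  # silence or unrecognized sounds
}

def mouth_layer_for(viseme: str) -> str:
    """Return the artwork layer matching the current speech sound."""
    return VISEME_TO_LAYER.get(viseme, VISEME_TO_LAYER["rest"])

print(mouth_layer_for("Oh"))  # -> Mouth/Oh
print(mouth_layer_for("Zz"))  # unknown sound falls back to Mouth/Neutral
```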

Character Animator also allows the animator to record multiple performances and meld them into one, as the sketch below illustrates.
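To make the idea concrete, here is a toy sketch of melding takes, assuming each performance is recorded as per-frame parameter tracks; the track names and data structure are invented for illustration.

```python
# Toy sketch of melding takes: each recorded performance is a set of
# per-frame parameter tracks, and a final performance is assembled by
# picking tracks from different takes. Track names are hypothetical.
take_one = {"head_rotation": [0.0, 2.0, 4.0], "mouth_openness": [0.1, 0.1, 0.1]}
take_two = {"head_rotation": [0.0, 0.0, 0.0], "mouth_openness": [0.2, 0.7, 0.4]}

def meld(source_take_by_track):
    """Assemble one performance by pulling each track from a chosen take."""
    return {track: take[track] for track, take in source_take_by_track.items()}

# Keep the head motion from take one and the lip-sync from take two.
final = meld({"head_rotation": take_one, "mouth_openness": take_two})
print(final)
```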

Looking ahead, David and team are excited to get Character Animator in front of customers and collect real-world feedback.

“Ultimately, I’m hoping it defines a new kind of app,” David says, “something that’s a combination between GarageBand and After Effects — a performance capture app that’s a compositor, but rather than compositing pixels, it’s compositing the performances themselves.”

This story is part of a series that will give you a closer look at the people and technology that were showcased as part of Adobe Sneaks.

Watch other Sneaks and videos from NAB here. Read our other features on MAX Sneaks, including: Time of Day, Live Mobile Dev, Gap Stop, PSD Web Editing, 3D Photo Magic, Project Para, Project Recess, and Visual Speech Editor.

There’s plenty of Adobe magic in digital marketing as well. See our stories on Adobe Summit Sneaks, including: Datatone, SmartPic, Mobile Reality, Freeform Analysis, Creative Now, and Benchmarker.