Animation Studio

May 30, 2016 - Animation

When a live Simpsons segment was announced several days ago, many speculated about how exactly it might be accomplished. Would it be done via motion capture? Perhaps a markerless facial animation setup?

Ultimately, the 3-minute segment, in which Homer (voiced by Dan Castellaneta) answered live questions phoned in by fans, was realized with the help of the still-in-development Adobe Character Animator handling lip sync, with keyboard-triggered animations adding to the mix. Cartoon Brew got all of the tech details from Simpsons producer and director David Silverman and from David Simons, Adobe’s senior principal scientist for Character Animator and co-creator of After Effects.

But first, here’s the live segment:

The roots of the live Simpsons

The idea for a live-animated segment had been around for some time, according to Silverman, who noted the aim was to take advantage of Castellaneta’s ad-libbing abilities. “We all know that Dan is a superb improv guy. He came from Second City in Chicago, where comedians like Bill Murray and John Belushi had also performed.” However, it wasn’t so obvious what technology could be employed to create a live broadcast. That is, until the Simpsons team noticed how the Fox Sports on-air graphics division was handling the live manipulation of its robot mascot, Cleatus. That led to an investigation of Adobe Character Animator.

Still a relatively new feature in After Effects CC, Character Animator is designed to animate layered 2D characters created in Photoshop CC or Illustrator CC by translating real human actions into animated form. This can be done via keystrokes, but the real drawcard of the tool is the translation, via webcam, of the user’s facial expressions onto a 2D character, and user dialogue driving lip sync.

Facial animation wasn’t used in the live Simpsons segment, but lip sync driven directly by Castellaneta’s performance was. The lip sync feature works by analyzing the audio input and transforming it into a series of phonemes. “If you take the word ‘map’,” explained Adobe’s David Simons, “each letter in the word would be an individual phoneme. The final step would be displaying what we’re calling ‘visemes’. In the ‘map’ example, the ‘m’ and ‘p’ phonemes can both be represented by the same viseme. We support up to 11 visemes, but we recognize many more (60+) phonemes. In short, if you create mouth shapes in Photoshop or Illustrator and tag them properly in Character Animator, you can animate the mouth simply by speaking into the microphone.”
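
To make that pipeline concrete, here is a minimal sketch of the phoneme-to-viseme collapse Simons describes. The phoneme labels and viseme grouping below are illustrative assumptions, not Adobe’s actual tables:

    # Many distinct phonemes collapse onto a shared mouth shape (viseme).
    # These labels are illustrative assumptions, not Adobe's real tables.
    PHONEME_TO_VISEME = {
        "m": "M-B-P",   # bilabials share one closed-lips mouth shape
        "b": "M-B-P",
        "p": "M-B-P",
        "ae": "AA",     # open-mouth vowel, as in "map"
        "f": "F-V",
        "v": "F-V",
    }

    def visemes_for(phonemes):
        """Map a phoneme sequence to the viseme sequence a puppet displays."""
        return [PHONEME_TO_VISEME.get(p, "NEUTRAL") for p in phonemes]

    # "map" has three phonemes but only two distinct visemes, because
    # 'm' and 'p' are drawn with the same closed-lips mouth shape.
    print(visemes_for(["m", "ae", "p"]))  # ['M-B-P', 'AA', 'M-B-P']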

Curiously, when the Simpsons team was looking to adopt Character Animator for the live segment, the tool was at that point, and still is, in preview release form (currently Preview 4). But the Simpsons team was able to work with Fox Sports to produce a prototype Homer puppet in the software that convinced everybody that a live Simpsons segment would be possible. “To ensure the Simpsons team was using a very stable product,” said Simons, “we created a new branch of Preview 4 called ‘Springfield’ with the version number starting at x847, because that’s the price Maggie rings up in the show’s intro. We knew that good lip sync would be a priority, so a lot of work went into refining our lip sync algorithm so the end result would be broadcast quality.”

See also: Adobe Character Animator Lets You Animate With Your Face

Making the animation

During the live segment – recorded twice, for west coast and east coast viewers of the show – Castellaneta was located in a remote sound booth at the Fox Sports facility, listening and responding to callers, while Silverman was called upon to operate the additional animation with a custom XKEYS keyboard device that included printed thumbnail icons of the Homer animations. Adobe also implemented a way to send the Character Animator output directly as a video signal via SDI to enable the live broadcast.
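
As a rough sketch of what that trigger setup amounts to – with hypothetical key bindings and clip names, not Adobe’s or Fox’s actual implementation – each key simply fires a pre-built clip while lip sync runs independently off the audio feed:

    # Hypothetical key-to-clip bindings; the clip names are assumptions
    # based on the poses described above, not the production rig.
    TRIGGERS = {
        "1": "homer_raise_arms",
        "2": "homer_turn_sideways",
        "3": "homer_eye_blink",
    }

    def handle_keypress(key, playback_queue):
        """Queue the animation clip bound to a key, if one exists."""
        clip = TRIGGERS.get(key)
        if clip is not None:
            playback_queue.append(clip)  # picked up by the renderer
        return playback_queue

    queue = []
    handle_keypress("1", queue)
    print(queue)  # ['homer_raise_arms']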

So, why was Silverman given the job of pressing the buttons? “They wanted me to run the animation because of my familiarity,” acknowledged the director, who has worked on the show almost from day one. “I’m the guy who invented many of the rules for Homer [and] they look to me as a Homer expert. So they thought it would be smart to have someone who understood how the character sounded and worked.”

Of course, before the broadcast, the animatable pieces had to be put together. This was done in Illustrator by the Simpsons animation team, then brought into Character Animator. “One of our animation directors, Eric Koenig, set up the animation stems that would be used,” said Silverman. “We had Homer talking, all of the dialogue mouths, the design of the room, and the animation of Homer raising his arms, turning sideways, eye blinks, etc. Eric Kurland then set up the programming for this with Adobe on all of the buttons and rigging of the character.”
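
A hedged sketch of how such a layered puppet might be described – the layer and trigger names here are illustrative assumptions, not the actual Simpsons rig – shows the split between mouth layers driven by lip sync and swaps fired from the keyboard:

    # Illustrative puppet description: tagged mouth layers are driven by
    # the lip sync engine; the other swaps are fired from the keyboard.
    homer_puppet = {
        "layers": [
            "head",
            "mouth_viseme_M-B-P",   # one tagged layer per mouth shape
            "mouth_viseme_AA",
            "arms_down",
            "arms_raised",
            "eyes_open",
            "eyes_closed",
        ],
        "triggers": {
            "raise_arms": ("arms_down", "arms_raised"),  # hide, show
            "blink": ("eyes_open", "eyes_closed"),
        },
    }

    for name, (hide, show) in homer_puppet["triggers"].items():
        print(f"{name}: hide {hide}, show {show}")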
