I went to Adapt 2006 this weekend and decided to post about it. You’ll see some stuff on here in the future that will be born of some ideas I picked up during this weekend, of that I’m positive.
Disney Feature Animation
The first talk on Saturday was by Aaron Holly, a rigger at Disney, about deformable rigging in Maya. The idea is to enable animators, who often have a background in classical 2D animation, to achieve the exact pose they require, including all kinds of squash and stretch.
The approach Aaron shared was to build simple skeletal structures that drive an intermediate deformation layer whose controls can be torn off the skeletal structure, thus achieving any exaggerated pose required.
Aaron, who studied philosophy before swerving into the 3D realm, actually invoked Occam’s razor as an underlying philosophy of his approach to 3D rigging. Occam’s razor states that, given two valid solutions to a problem, the simpler one should be preferred. Nice words to work by.
Another issue Aaron talked about is linking things (secondary shape animation, for example) to the rotation of a ball joint in a rig. A ball joint will usually have rotation on all three axes, while you typically want only two values to drive your linked parameters: a top/bottom rotation and a forward/backward one. The approach shown was to project the end of the joint onto a plane at the root of the joint. The position on this 2D plane relative to the root thus represents, in 2D space, the amount of rotation applied to the bone. Interesting. I’m sure a custom operator for XSI could compute this in a flash. That to-do list of mine is getting way too long!
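Here’s a minimal sketch of that projection in plain Python. The helper names and the choice of an `up` vector to orient the plane are my own assumptions, not Aaron’s implementation (a real version would live in an XSI custom operator and use the host package’s vector types):

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(a):
    length = math.sqrt(dot(a, a))
    return tuple(x / length for x in a)

def joint_rotation_2d(root, tip, rest_dir, up=(0.0, 0.0, 1.0)):
    """Project the joint tip onto the plane through `root` that is
    perpendicular to the joint's rest direction, and return the tip's
    2D coordinates on that plane: one axis reads as side-to-side
    rotation, the other as up/down."""
    n = normalize(rest_dir)          # plane normal = rest direction
    side = normalize(cross(up, n))   # first in-plane axis
    plane_up = cross(n, side)        # second in-plane axis
    v = sub(tip, root)
    # The two dot products drop the component of v along n, which is
    # exactly the projection onto the plane.
    return (dot(v, side), dot(v, plane_up))
```

At the rest pose the tip projects onto the plane’s origin, so both values read zero; rotating the joint drives one or both values smoothly, which is what makes them nice link sources.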
Mark showed a typical digital environment workflow. A lot of interesting stuff was presented; Mark’s production experience really shows.
He shared tips and tricks for easing the workflow, such as working at half resolution or 1K and then scaling up the renders for final output.
All in all, a neat talk. Mark has often worked in Maya and RenderMan and mentioned some pass workflows that were interesting. XSI’s passes and partitions still seem to dwarf anything else I’ve seen, though.
Having never worked at a games development house, I had never really realized how much the games and film industries have come together.
Chris of LucasArts spoke a lot about convergence. We all know the Lucas empire is huge, and as technologies have been moving toward a common nodal point, a lot of effort has been put into unifying pipelines and processes over at George’s place. Slick stuff.
We’ve all heard a bit about Zeno, ILM’s new development framework for custom tools. Well, it seems to be a much larger endeavor than I thought. The folks at ILM R&D seem to have come up with almost a full content creation system: integrating game technologies to create film previs tools, moving editing tools from a film background into a cutscene blocking environment, and providing a unified environment for the artists. It’s all proprietary, so we can only guess at everything Zeno can do.
Lastly, we were able to see a tech demo of the euphoria technology jointly developed with UK-based NaturalMotion, who market a character simulation package called Endorphin. Seamless blending from animation to ragdoll and back, as well as simulated character motion reacting to environmental factors. Imagine a simulated character latching on to a second-story railing after being launched sky high by an explosion. Check out the Endorphin demo to get an idea of the technology.
Consumer technology is always pushed forward by a few R&D brainiacs going “I bet I can do this” while everyone else is saying “Wouldn’t it be cool if…” or even “It can’t be done.” Looking at Stanford, UCLA, ILM, Disney, Sony Pictures Imageworks, and other large players, as well as a host of smaller ones, makes me say it’s a good time for digital content creation.
It was refreshing to see someone approach digital compositing from a more technical point of view. Most people, when they talk about compositing, concentrate on the finished product and its artistic and aesthetic value, but knowing what happens under the hood is, IMHO, a prerequisite to getting that extra edge.
Jeremy managed to explain some of the mathematics involved in very clear and simple terms. It was nice to have representatives of both the artistic comper (Mark Lefitz) and the technical comper (Jeremy) at the same conference.
Jeremy also covered the creation and management of passes/render layers in Maya and how these different outputs can then be recombined in Shake to produce final output. XSI’s implementation of this stuff rocks. The use of XSI in that last phrase wasn’t a typo.
… And Jeremy was the first speaker of the weekend I heard mention XSI. Is Maya really so pervasive and have so many people not yet discovered the strong points of Softimage’s product portfolio? You do know they’ve evolved quite a bit since Softimage|3D!
Emile is obviously a great animator with a lot of experience. His demo of the squash-and-stretch style of cartoon animation as applied to 3D characters was very enlightening. Emile often stressed that, as an animator, it is very important for him to be as close to the character as possible. For the riggers out there: the rigs you produce should keep the animator on the character and avoid as much clutter as possible. For the animators out there: you are as much an actor as an animator, and you should live and breathe through your character while you are working. Anything in your 3D environment that could tear you away from the performance should be avoided. Even though Emile seems to be the strong silent type, or maybe he was just really nervous, he is obviously passionate about his work, and it shows in his results.
Pirates of the Caribbean 2: Dead Man’s Chest
What an impressive show.
While many movie critics talked about digital makeup or flesh extensions for Davy Jones, ILM is working hard to debunk that myth: Davy Jones and his crew are 100% digital, eyeballs and all.
The talk was pretty much the usual show and tell, and when ILM does a show and tell you can be sure a lot of your questions will be answered with either:
- “It was proprietary software.”
- “We did it with Zeno”
which is proprietary software. I guess you can do that when you have one third as many coders and TDs as artists.
One thing that really blew me away is a proprietary tool ILM developed for this show called iMocap (taking a page from Apple’s marketing handbook, are we?). It can pick up an actor’s performance on set using the main film camera and two witness cameras. The kicker is that the iMocap system doesn’t care about lighting conditions and can capture performances from many actors at once in an almost unrestricted set space.
… Say what?
Yep, they built an uber-flawless mocap system. I would love to get into the guts of this thing and see how it works, but I guess I’ll have to get my own gears turning and figure it out on my own.
No matter how you did it, kudos to ILM. You did great work.
Arnaud started with an overview of the production process over at PDI and mentioned something called the PDI pipeline. Yeah, don’t you love these large studios and all their proprietary tools?
Most of the talk was a show and tell about Shrek 2’s effects work and, refreshingly, a few details were given. One effect that was dissected was the fireballs. Surprisingly few passes and a rather simple solution, growing isosurfaces with fractal noise displacement and a normal-based color shader, accounted for 90% of the effect. Neat.
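Out of curiosity, here’s a rough Python sketch of what a normal-based color shader along those lines might look like: a facing ratio (the dot of the surface normal with the view direction) drives a ramp from a hot core color toward darker edges. The colors, the squared ramp, and the function names are all my own guesses for illustration, not the actual Shrek 2 shader:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lerp(a, b, t):
    # Component-wise linear interpolation between two colors.
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def fireball_color(normal, view_dir,
                   core=(1.0, 0.95, 0.6),   # hot, near-white center
                   edge=(0.55, 0.08, 0.0)): # dark red silhouette
    """Normal-based color: surfaces facing the camera get the hot core
    color; grazing surfaces fall off toward the dark edge color.
    `view_dir` points from the camera toward the surface."""
    facing = max(0.0, -dot(normal, view_dir))  # 1 facing camera, 0 at silhouette
    # Squaring the facing ratio biases the ramp toward the edge color,
    # keeping the hot core tight (an arbitrary artistic choice).
    return lerp(edge, core, facing ** 2)
```

Displace the isosurface with fractal noise first and this kind of cheap view-dependent ramp already reads as roiling fire, which would explain why so few passes were needed.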
All in all, I had a great time, because content was king at ADAPT 2006. For a first-time conference it could have been otherwise, but the organizers really did their homework.
I hope next year I’ll get to meet you all at the 2007 edition.