Machinima made easy: iClone 4 reviewed

Materials

Materials throughout iClone support all of the essential settings plus a few extras. Texture images can be used to drive diffuse, opacity, bump, specular, glow, reflection, and blend; some types of objects don't support all of these maps, but I never found that limiting. You can drive bump maps with either a traditional grayscale bump map or a normal map, which is fantastic. I am a little surprised that there isn't any implementation of procedurals, since they are often a smaller resource hit than actual image-based textures, but good-looking procedurals might be beyond the scope of iClone's commitment to usability.
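
To make that channel list concrete, here's a rough sketch (in Python, and purely my own illustration rather than anything shipped with iClone) of how a multi-channel material with an optional normal map might be represented:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Material:
    """Rough sketch of a multi-channel material, loosely mirroring the
    channels described above. Field names are illustrative, not iClone's."""
    diffuse_map: Optional[str] = None      # color texture
    opacity_map: Optional[str] = None      # grayscale transparency
    bump_map: Optional[str] = None         # grayscale height or RGB normal map
    bump_is_normal_map: bool = False       # True if bump_map stores normals
    specular_map: Optional[str] = None
    glow_map: Optional[str] = None
    reflection_map: Optional[str] = None
    blend_map: Optional[str] = None

    def assigned_channels(self):
        """List which channels actually have a texture assigned."""
        return [name for name, value in vars(self).items()
                if name.endswith("_map") and value is not None]

# Example: a brick wall driving bump with a grayscale height image
wall = Material(diffuse_map="brick_diffuse.png", bump_map="brick_height.png")
print(wall.assigned_channels())  # ['diffuse_map', 'bump_map']
```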

Moving on from textures, materials also have options for diffuse, ambient, and specular colors as well as refraction, reflection, opacity, self-illumination, specular, and glossiness. You can do basic UV projections (planar, box, spherical, etc.), and there are settings for offset, tiling, and rotation. UVs can even be applied on a per-channel basis. As with modeling, if you want to do full UV mapping you'll need to do it elsewhere (this is often included in your modeling application). You can, however, manipulate a model's UVs to the extent of offset, tiling, mirroring, and 90° rotations.
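
For readers who haven't wrangled UVs before, the offset, tiling, and rotation controls boil down to some simple coordinate math. The helper below is a hypothetical illustration of that math, not an iClone function:

```python
import math

def transform_uv(u, v, offset=(0.0, 0.0), tiling=(1.0, 1.0), rotation_deg=0.0):
    """Apply the kind of per-channel UV adjustments described above:
    tiling (repeat), rotation about the texture center, then offset."""
    u, v = u * tiling[0], v * tiling[1]
    theta = math.radians(rotation_deg)
    cu, cv = u - 0.5, v - 0.5
    u = 0.5 + cu * math.cos(theta) - cv * math.sin(theta)
    v = 0.5 + cu * math.sin(theta) + cv * math.cos(theta)
    return u + offset[0], v + offset[1]

# Tile a texture twice in U and rotate it 90 degrees
print(transform_uv(0.25, 0.75, tiling=(2.0, 1.0), rotation_deg=90.0))
```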

Inter-penetration happens but doesn't look too bad in motion.

iClone supports two kinds of dynamics on props: flex and spring. Flex seems to be limited right now to the preset flex objects included with iClone, but it works well for things like cloth when it can be used. With flex objects you can change an object's stiffness, lift, gravity, wind, and centripetal force. Reallusion has struck a great balance between offering too many options and keeping the cloth dynamics simple to use, and having worked on some nightmare cloth projects in other engines, I appreciate the ease of getting decent results fast. Spring dynamics are similar to flex in the sense that you also need to start out with preset spring objects, and the spring engine is even simpler, with only three settings to adjust: mass, strength, and bounciness. Spring looks great on hair, some types of cloth, and, yes, breasts. In fact, there's a whole section in the help files entitled 'Bouncing Body Parts,' and it is only enabled on two of the female body types included with iClone.
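
If you're curious what mass, strength, and bounciness are actually doing under the hood, a spring simulation is essentially a damped pull toward a rest position. Here's a minimal sketch; the parameter names and damping formula are my own guesses at the behavior, not Reallusion's code:

```python
def step_spring(position, velocity, target, mass=1.0, strength=50.0,
                bounciness=0.5, dt=1.0 / 60.0):
    """One step of a damped spring: higher strength pulls harder toward the
    target, higher bounciness damps less so the motion oscillates longer."""
    # Scale critical damping (2 * sqrt(k * m)) down as bounciness goes up.
    damping = (1.0 - bounciness) * 2.0 * (strength * mass) ** 0.5
    force = strength * (target - position) - damping * velocity
    velocity += (force / mass) * dt
    position += velocity * dt
    return position, velocity

# A strand of hair settling back toward its rest position over two seconds
pos, vel = 1.0, 0.0
for _ in range(120):  # 120 frames at 60 fps
    pos, vel = step_spring(pos, vel, target=0.0, bounciness=0.8)
print(round(pos, 3))
```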

Grass and plants are effortless to place wherever you want, and they do a great job sticking to the terrain. The included plants are 2D images with alphas that always face the camera; they look pretty good and keep your performance high, but there's no reason you couldn't bring in full 3D plant models if you need them. Trees have full 3D trunks and branches while leaves are also cutouts; again, these are just the presets, so import higher- or lower-resolution trees as needed. I am also really impressed by the wind effect on the plants and the trees. It looks great, and adjusting the strength is as simple as moving a single slider.
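
Camera-facing cutouts like these plants are classic billboards: the sprite simply rotates to face the viewer each frame. A simplified, Y-axis-only version of the idea, purely for illustration:

```python
import math

def billboard_yaw(plant_pos, camera_pos):
    """Return the yaw angle (degrees) that turns a 2D plant sprite toward
    the camera, ignoring tilt so the plant stays upright."""
    dx = camera_pos[0] - plant_pos[0]
    dz = camera_pos[2] - plant_pos[2]
    return math.degrees(math.atan2(dx, dz))

# A plant at the origin turning to face a camera off to the side
print(round(billboard_yaw((0.0, 0.0, 0.0), (10.0, 2.0, 10.0)), 1))  # 45.0
```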

I didn’t spend a ton of time with the particles, but based on a few of the example scenes and reading through the documentation, iClone seems quite capable of a wide range of sprite-based particle effects like rain, shooting stars, water droplets, and even World of Warcraft-style magic effects.

Finally, iClone also supports videos with alpha channels—something put to great use in their demo videos—which gives you a lot of flexibility for doing things like creating virtual news sets for your green-screened actors or adding 3D animated characters to a real video background.
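
The magic ingredient there is ordinary alpha-over compositing, which blends a foreground pixel onto a background pixel by its alpha value. A one-function sketch of the operation:

```python
def over(fg_rgb, fg_alpha, bg_rgb):
    """Standard 'over' compositing of a foreground pixel with alpha onto a
    background pixel, e.g. a green-screened anchor over a virtual news set."""
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha)
                 for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent red pixel over a blue background
print(over((1.0, 0.0, 0.0), 0.5, (0.0, 0.0, 1.0)))  # (0.5, 0.0, 0.5)
```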

It may not be Crysis, but iClone plants look great.

Characters

While setting up a world in iClone is nice, working with characters is absolutely what iClone excels at. There are two sections dedicated to character setup, as well as another section for animation.

The first thing you set up is the body, or avatar. A character's body style and proportions can be adjusted, including some very configurable hand and finger options for length and thickness.

Mixing and matching can be amusing.

Hair and clothing models are entirely preset-driven, so you will either need to download clothing that fits the style you're looking for or, where possible, modify the texture properties of existing skin or clothing to meet your needs. Options include full texture and material support as well as controls to quickly adjust brightness, contrast, hue, and saturation. Some hair objects also support spring dynamics, with the same limitations as other spring-enabled assets. Clothing is split into four subsections: upper body, lower body, gloves, and shoes.

Accessory presets like sunglasses or a purse can also be attached to a character. Accessories do a good job staying where they're supposed to with simple parent constraints, and they can also be detached or attached to a curve. Some accessories also support flex or spring dynamics.
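
A parent constraint just means the accessory's position is computed from its parent's transform plus a local offset. A flattened, 2D sketch of the idea (not iClone's implementation):

```python
import math

def constrained_position(parent_pos, parent_yaw_deg, local_offset):
    """Where a parent-constrained accessory ends up: the parent's position
    plus the accessory's local offset rotated by the parent's orientation."""
    theta = math.radians(parent_yaw_deg)
    ox, oz = local_offset
    wx = parent_pos[0] + ox * math.cos(theta) - oz * math.sin(theta)
    wz = parent_pos[1] + ox * math.sin(theta) + oz * math.cos(theta)
    return wx, wz

# Sunglasses sitting 0.1 units in front of a head that has turned 90 degrees
print(constrained_position((5.0, 5.0), 90.0, (0.0, 0.1)))
```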

The final subsection of Actor is Skin, where you can modify all the normal texture and material options and also get a color balance section for adjusting cyan-to-red, magenta-to-green, and yellow-to-blue values. This is also a great place to add textured clothing that sticks close to the body, like tight pants or shirts, if you don't want to use the full clothing items, but you'll need an image-editing application to add these to the skin texture.
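
Those color balance sliders amount to nudging each channel toward one end of a complementary pair. A deliberately simple approximation, since I don't know the exact curve iClone uses:

```python
def color_balance(rgb, cyan_red=0.0, magenta_green=0.0, yellow_blue=0.0):
    """Positive values push toward red, green, and blue respectively;
    negative values push toward cyan, magenta, and yellow."""
    r, g, b = rgb
    r = min(1.0, max(0.0, r + cyan_red))
    g = min(1.0, max(0.0, g + magenta_green))
    b = min(1.0, max(0.0, b + yellow_blue))
    return r, g, b

# Warm up a neutral gray skin tone slightly
print(color_balance((0.5, 0.5, 0.5), cyan_red=0.1, yellow_blue=-0.05))
```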

Coolest feature in iClone.

Moving on to the head section, iClone has one of the most unusual and cool features for facial texturing I've ever seen. When you load an image to create a face, iClone brings up a wizard into which you load your front and side images. The first step is to crop, adjust, and rotate the images to the correct size for the face. Then, using a predefined male, female, or neutral facial boundary curve, you line up the curve with the chin, sides of the face, and top of the head. At this point you also get a 3D preview window that gives you feedback on the rough size, so that the eyes, nose, mouth, and chin line up with the 3D model. The final step brings up detailed facial feature curves for defining exactly what the shape of the head is, as well as the size and shape of the eyes, eyebrows, nose, mouth, and ears, so that the texture gets applied to exactly the right places on a model that is custom-fit to the image you used. This is much easier than adjusting the model in another program and distorting a texture in an image-editing program to match up with a UV map. It's a very cool tool for doing something that's traditionally a much more time-consuming process.

From this point you can further refine the shape of the head from a set of preset shapes or bring your own morph in from a modeling application. There are eight other facial features, each with slider-driven morphs. For example, when you adjust the mouth you have four sliders driving wide-or-purse, thick-or-thin, concave-or-convex, and down-or-up. There are also entire subsections for the eyes and mouth, allowing you to further define the shape and texture of those features and add settings for things like teeth that probably weren't included in your original face image (yes, vampire teeth are included). Not everything works quite as you'd expect, though, and you should plan to spend some time tweaking all of the head and facial settings to get exactly what you're looking for.
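
Slider-driven morphs are easier to picture if you think of each slider as scaling a set of per-vertex offsets added onto the base head. A toy example, with invented data rather than iClone's internals:

```python
def apply_morphs(base_vertices, morph_deltas, slider_weights):
    """Each slider weight scales that morph's per-vertex offsets, and all
    weighted offsets are summed onto the base mesh."""
    result = [list(v) for v in base_vertices]
    for name, weight in slider_weights.items():
        for i, delta in enumerate(morph_deltas[name]):
            for axis in range(3):
                result[i][axis] += weight * delta[axis]
    return result

# One vertex, with the mouth widened to 50% and the lips thickened to 25%
base = [(0.0, 0.0, 0.0)]
deltas = {"mouth_wide": [(0.2, 0.0, 0.0)], "lips_thick": [(0.0, 0.1, 0.0)]}
print(apply_morphs(base, deltas, {"mouth_wide": 0.5, "lips_thick": 0.25}))
```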

Animation

Animation is where you really see the machinima inspiration come into play. There are two modes of animation: Director Mode and Editor Mode. In Director Mode you choose your actor or iProp (preset props for things like vehicles) and pilot them in real time the same way you would in a game, using WASD, the number keys, and a few other hotkeys. In walking or running modes the actor moves exactly as they would in a third-person game. Special actions like sighs, pants, emotions, and interactions ('hey, what's up?') are mapped to the number keys and a few other hotkeys. Switch the mode to sit and you lose the movement keys but still have actions you can perform with your other hotkeys. It isn't meant to be perfect, and not everything blends together naturally, but the goal is to lay down some rough animation quickly. However, it does appear that you're stuck with the preset animations attached to your actor.

You can then switch back to Editor Mode, bring up your timeline, and get a single-track, non-linear-editor-style interface where each of the commands you performed appears as an individual clip. You can move these clips around and adjust the Transition Curves, which let you ease in and out of an action or pop instantly from one to the next, as well as control how long each transition lasts.
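
The ease-in/ease-out option is the familiar smoothstep-style curve applied over the length of the transition. Here's roughly what that blend looks like, sketched with a hypothetical pose format:

```python
def smoothstep(t):
    """Classic ease-in/ease-out curve: slow at both ends, fast in the middle."""
    return t * t * (3.0 - 2.0 * t)

def blend_clips(pose_a, pose_b, t, ease=True):
    """Blend from the end of one clip to the start of the next across a
    transition of normalized length t in [0, 1]."""
    w = smoothstep(t) if ease else t  # eased vs. linear transition
    return {joint: a + w * (pose_b[joint] - a) for joint, a in pose_a.items()}

# Halfway through an eased transition from a wave to an idle pose
print(blend_clips({"elbow": 90.0}, {"elbow": 10.0}, t=0.5))
```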

An attractive and easy to use timeline.

Alternatively, you can also import motion you might have gotten from another character animation program or from motion capture; there are a lot of free and commercial motion capture files on the web, so you should be able to find many of the actions you'd like your character to perform.

The color of the muscles shows what weighting each has.

The Puppeteering system for facial animation is probably the most complicated part of iClone, and the second coolest after the face setup. When you bring up the Puppeteering panel, you see an image of a face with the facial muscle sections split into various colors. You choose which features you want active, then preview or record animation in real time, controlling the expressions of your chosen features by moving your mouse around the screen. Along with expressions, you can control where the eyes are and which direction they're looking, and you can have the head tilt to follow your mouse. There are a bunch of puppet profiles to choose from, with preselected weights applied to the muscles, but I found it easiest to solo specific features one or two at a time and record the motion over a couple of takes. For instance, on the first take you might adjust the head orientation and eye direction, on the second the smile, on the third the eyebrow movement, and so on. There's a lot of power here, and it takes a fair amount of practice to understand what you're doing and figure out how to get the results you want, but once you do, recording facial motion feels natural and efficient.
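
Conceptually, puppeteering maps a normalized mouse position onto weights for whichever muscles you've left active. The mapping below is invented purely to show the shape of that idea; iClone's profiles weight things differently:

```python
def puppet_weights(mouse_x, mouse_y, active_features):
    """Map a normalized mouse position (each axis in [-1, 1]) onto the
    facial features soloed for this take."""
    weights = {}
    if "smile" in active_features:
        weights["smile"] = max(0.0, mouse_y)     # push up to smile
        weights["frown"] = max(0.0, -mouse_y)    # pull down to frown
    if "brows" in active_features:
        weights["brow_raise"] = max(0.0, mouse_y)
    if "head" in active_features:
        weights["head_yaw"] = mouse_x * 30.0     # degrees of head turn
    return weights

# One sample from a take where only the smile is soloed
print(puppet_weights(0.2, 0.6, {"smile"}))
```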

PuppetClips work almost the same way Director Mode tracks do, except that one PuppetClip is the entire set of motion you recorded when you hit the record button. Any time you hit record while you're over an existing PuppetClip in the timeline, it will record to that clip. Multiple clips can be overlapped and given the same Transition Curves as Director Mode animation.

If you'd prefer, there are also more traditional morph-target-based facial animation features. You can simply set a keyframe to a preset expression, follow that up with a different keyframe, and it will blend between the two. This definitely doesn't seem like the focus of the application or the ideal way to animate, but it's there if you want it.

Rounding out the facial animation features, iClone can load an audio file or record directly from your microphone and animate the lip sync for you. It does quite a good job of it, too. If you're really desperate, there is also a Text-to-Speech function, but it's only as good as Text-to-Speech ever is.

There's a joke here somewhere...

There are two more animation subsets: hands and paths. Hand animation is a pretty basic implementation with preset finger poses that you apply by setting hand keyframes, the same way you do with facial morph targets.

Paths also work as you would expect: there are preset paths, but you can also draw your own with a simple vector curve or linear tool. One cool thing about them is that you can project paths onto the terrains you have in your scene, so getting an object to follow the contours of a mountain is nearly effortless.
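
Projecting a path onto terrain is essentially sampling the terrain's height under each path point. A tiny sketch, with a made-up heightfield function standing in for the scene's terrain:

```python
def project_path_to_terrain(path_xy, height_fn, clearance=0.0):
    """Drop each 2D path point onto the terrain by sampling its height,
    optionally keeping a small clearance above the surface."""
    return [(x, height_fn(x, y) + clearance, y) for x, y in path_xy]

# A straight path draped over a gentle hill
hill = lambda x, y: max(0.0, 3.0 - 0.1 * (x * x + y * y))
print(project_path_to_terrain([(0, 0), (2, 0), (5, 0)], hill, clearance=0.1))
```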

Export Options

You have four basic types of export: Image, Video, Image Sequence, and Widget Output. Image, Video, and Image Sequence work exactly as you'd expect, though I am disappointed that there's no QuickTime support. Then again, I usually render image sequences anyway. The final option, Widget Output, exports a Flash-based iWidget file for use with one of Reallusion's other products, WidgetCast. It looks like a very compelling feature but requires an additional purchase to use.


Comments

  1. Scott: Been using iClone for a few months now. For me, its the 'sweet-spot' between High-end rendering programs and low-end machinima tools.
  2. Anim8tor Cathy: I've been with iClone since version 1 and it has been a pleasure for me to watch both the software and the community around it grow, evolve and flourish.
  3. David Barnes: Oh my my! I have a group of 8-10 year old girls that I'm working with that have amazing creative gifts and I think that what they'd be able to do with something like this is beyond description. Of course this means that I'd have to get up to speed so I hope this is as easy as the reviewer thinks.
  4. robert flask: looking for a good animator experienced in iclone4 or a strong background in machinima. we r great writers but these damn computers r not doing what we telll them. lets talk. flaskfull@yahoo.com
