This is an image of the original car from the LEGO "Creator" kit. The kit has two other forms, and I'd like to incorporate them into the shot at some point. I got the kit as a Christmas gift from a friend, and almost the first thing that came to mind was doing something with it through some kind of visual effect. As I kicked the idea around, I settled on the concept of showing off not only the cool LEGO car, but also telling a simple story. When I was little, I always loved dumping a big pile of LEGOs out on the floor or table before I began to build.
I wanted to put my idea down "on paper" so I'd have a clear goal for what I was trying to do, so I put together some quick thumbnail sketches of my scene. I didn't feel it was necessary to detail them out; they just needed to be fast enough to capture the idea, solidify it, and confirm, at least in a very preliminary way, that it would work as I moved forward.
The LEGO set consists of over 280 pieces, so I knew I had to be efficient about modeling all of them. The best approach was to model in a modular fashion: I built several "reference nodes" that I reused over and over to construct most of the pieces. A few specialized pieces couldn't be built this way, but not enough to cause problems. I also had to weigh each piece's level of detail against the screen real estate it would occupy, so I didn't overburden the scene with hundreds of objects carrying unnecessary amounts of geometry.
I knew from the beginning that organization would be key to not going insane. I built a simple chart out of splines in the scene and laid out all the pieces according to the inventory in the kit's manual. I instanced the appropriate number of each piece, double-checked the naming convention, gave everything a final once-over for problems, and then set about the next phase of the project.
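Just to illustrate the bookkeeping, a pass like that boils down to something like this little MAXScript sketch; the inventory array, naming pattern, and spacing here are placeholders of mine, not my actual scene setup.
    -- Sketch: instance each master piece the number of times the manual calls for,
    -- name the copies consistently, and lay them out in a row for the spline chart.
    -- 'inventory' is a hypothetical array of #(masterNode, count) pairs.
    fn instancePieces inventory spacing:40 =
    (
        for entry in inventory do
        (
            local master = entry[1]
            local count  = entry[2]
            for i = 1 to count do
            (
                local piece = instance master
                piece.name  = master.name + "_" + (formattedPrint i format:"03d")
                piece.pos   = master.pos + [i * spacing, 0, 0]
            )
        )
    )
    -- Example call: instancePieces #(#($brick2x4, 24), #($plate1x2, 12))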
When I began bringing the organized pieces together to form the roadster, I thought it would be fun to create a simple animation of the pieces coming together. It wasn't a necessary step, but when communicating progress I think it's fun not to be so dry about things. The animation was made by moving each piece into place over a single frame, so in the video they just pop in; I didn't want to waste time fiddling with unnecessary keyframe adjustments. The video itself is a series of screen captures of the 3ds Max viewport with antialiasing turned on.
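The one-frame "pop" keying itself is simple; something like this MAXScript sketch shows the idea (the arrays and frame offsets are stand-ins of mine, not how the actual previz was keyed).
    -- Sketch: key each piece to snap from its scattered spot to its assembled spot
    -- over a single frame, one piece per frame, so they "pop" in with no easing.
    -- 'pieces' and 'targets' are assumed parallel arrays (piece node, Point3 target).
    fn popIntoPlace pieces targets startFrame:0 =
    (
        for i = 1 to pieces.count do
        (
            local f = startFrame + i
            animate on
            (
                at time (f - 1) pieces[i].pos = pieces[i].pos   -- hold the scattered position
                at time f       pieces[i].pos = targets[i]      -- snap into the build
            )
        )
    )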
I wanted to produce a multi-channel EXR render that I could composite together, but before I could, I needed to set a stage for the car and get it lit. I used a lathed spline with a filleted corner to make the room it would be situated in. I then brought lights in one at a time, testing each individually with different settings until it felt right. After the main lights were in, I wanted to supplement them with some indirect lighting and a few extra hot spots for specular pop. I know Final Gather takes care of a lot of indirect illumination on its own, but I wanted to try something different and go for a specific effect: I placed several matte white cards angled at the car to catch light and bounce it back in a very diffuse manner. One card in particular, behind the rear of the car, has a very bright light shone directly on it to really push this effect. It probably could have been done with area lights alone, but I thought this approach would give it a different feel.
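To illustrate the bounce-card idea, a card plus a hot spot aimed at it could be set up roughly like this in MAXScript; all the positions, sizes, and the multiplier are made-up values, and the bounce only reads once Final Gather (or some other indirect solution) is enabled.
    -- Sketch: a matte white card angled toward the car, with a bright spot aimed at it
    -- so the card kicks soft, diffuse light back at the rear of the model.
    card = Plane length:60 width:60 pos:[0, 80, 30] name:"bounceCard_rear"
    card.material = StandardMaterial diffuse:white specularLevel:0 glossiness:0
    rotate card (eulerAngles 60 0 180)      -- tilt the card to face the car

    hotSpot = TargetSpot pos:[0, 160, 120] target:(TargetObject pos:card.pos)
    hotSpot.multiplier = 3.0                -- deliberately hot so the bounce reads clearly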
The next step in my process was to begin simulations. In preparation for creating a PhysX simulation, I knew, based on the way reactor works, that the objects in my scene needed to be simplified; with too many vertices, getting feedback on tests would take forever. I made drastically reduced versions of all the pieces (far fewer vertices), while trying to preserve their general shape so they would still react properly. These versions were used as proxies for the main pieces to keep things fast while building the simulation. Once I had the animation nailed down, I would link the main pieces to them in some fashion to produce the final render.
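The proxy bookkeeping looks roughly like this MAXScript sketch; the "_proxy" suffix and wirecolor are just conventions I'm assuming here, and the actual vertex reduction still happens per piece where the comment sits.
    -- Sketch: give every piece a simulation stand-in, tag it, and hide the heavy original.
    fn makeSimProxies pieces =
    (
        for p in pieces do
        (
            local proxy     = copy p
            proxy.name      = p.name + "_proxy"
            proxy.wirecolor = yellow
            -- reduce the proxy's vertex count here, keeping the overall silhouette
            p.isHidden = true               -- simulate with the light version only
        )
    )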
I mail-ordered an 8" stainless steel garden ornament, and got a friend to help me cut a tripod-threaded hole into it. In this picture, and in how I used it for my scene, it isn't on a tripod, but the threaded mount lets it sit on the desk the way I want without rolling around. This orb is what I used to generate my HDRI.
I'm compositing my roadster onto my desk at home, using the top image as the plate. I changed the angle to let the roadster occupy more frame real estate after a suggestion from Naylor. The bottom image is my reference for size, color, and lighting on the object itself.
Here is an animation showing the reactor/hand-animation results in my scene. These proxy pieces would later be replaced with the textured, higher-poly versions for final rendering and compositing, as well as animated coming to life and forming the car itself. Based on critiques I received, I ended up adjusting the timing quite a bit and completely eliminating the little slow-mo section.
LEGO ROADSTER
.....
^^^PUMP UP THE VOLUME!!^^^
.....
This is a project I began in my first portfolio class at the Art Institute of Dallas in the winter quarter of 2011. After going over some ideas and discussing the direction I wanted to take with my professor, I decided on a piece that would let me incorporate multiple layers of process, something to show the different aspects of visual effects I've learned so far. This LEGO piece seemed like it would be fun. I'm fond of LEGOs; I feel like they're an approach to creativity from a young age that can still be enjoyed after adolescence.
.............................................
With the car constructed, I needed to make a nice render of it. I knew I'd be doing compositing not only for this, but for a later portion of the project, and because of that (and my desire to get better at the process in general) I decided to get familiar with a node-based compositing system. I wanted to go with NUKE, but the Personal Learning Edition only renders with a nasty watermark that I ultimately couldn't live with. My next option was Autodesk Composite (formerly Toxik).
.....
I read up on it, watched a few tutorials, and started playing around to get familiar with the software. I like it a lot so far, mainly because of how node-based workflows are designed. I also like being able to embed multiple render elements in a single OpenEXR file and work from that; it's so much easier, and immensely more time-saving, than setting up multiple renders with different settings and re-rendering for each pass. Some things may still need to be done that way, but most can be achieved with render elements, or so it seems so far.
.....
After some more playing around in Composite, and scene tweaking in Max, I got close to what I was looking for. There are several elements involved in this render:
.....
A. Diffuse
B. Direct Light
C. Indirect Light
D. Tight AO
E. Mid AO
F. Wide AO
G. Z Depth
H. Reflectance
I. Specular
J. Transparency
.............................................
Here's the fully composited render. It wasn't meant as a final piece, just a stepping stone. I was happy with what I got from it, but I know it's not perfect. Critiques I received on this render helped shape some decisions as I moved forward with the project.
.....
Check out this little video I put together to show the different passes being composited together in a progressive fashion.
.............................................
When built, these pieces were laid out in a grid for the sake of organization, and they were left that way for this test, though I ended up offsetting them far more for the final scene. I did, however, use Neil Blevins' SoulburnScripts "transformRandomizer" to set them all at odd, different angles, so that when they hit the surface they'd behave a little more randomly and the test simulation would feel a bit more real.
.....
Getting this close to what I wanted was largely a matter of tweaking the variables over and over, and testing and re-testing. One way I was causing myself problems was by trying too hard to keep the individual settings close to "real" numbers, according to what 3ds Max defines as "real" in its help-file description of the simulator. Once I started pushing the numbers past "real," I began to get closer to what I wanted.
.....
The linked video wasn't quite there yet; it's definitely slow, and some of the pieces take too long to settle (among other things), but it was getting there.
.............................................
At first I was going to just grab an image off the web to composite my scene into, and use an HDR that would be "close enough" to get the effect of HDR lighting. I decided against this, because one of the best parts of a project like this is the process and all its individual steps. Why cheat myself out of the fun and end up with a weaker piece in the process? I truly feel that doing every step myself gives me more control, a sort of unspoken attachment to the work, and more learning along the way.
.....
To shoot the images for the HDRI I needed a better telephoto lens than the one I own, so I rented one from a store here in Dallas at a really reasonable rate. I had no idea I'd get to work with an L-series lens, which was great, since those run $2,000 or more, a premium I've never been able to afford. It really is a beautiful piece of equipment. I used a telephoto lens because I wanted to be as far from the orb as possible, to reduce the amount of editing needed to remove my own reflection.
.............................................
Putting together an HDRI requires multiple exposures of the reflective ball, so there's enough color and luminance information to generate a 32-bit image with real depth. My Canon camera, and from what I understand all Canon cameras (due to stubbornness on Canon's part), only bracket one stop up and one stop down. To get around this, I shot several bracketed sets to cover the range I wanted. That's a bit irritating, because touching the camera between shots inevitably causes unwanted movement, which ultimately introduces some noise into the final HDR. I did try to line the frames up as much as possible in Photoshop before combining them.
.....
CLICK THE IMAGE ABOVE if you'd like to see the HDRI, or even grab it and see how it looks in a scene yourself.
.............................................
The top image shows my scene layout in Max. It took a bit of effort to get the proxy objects to match the perspective of the photo properly, mainly because the perspective simply wouldn't match when the Max camera used a 20mm focal length, the focal length I actually shot the picture at. I had to go all the way up to 28mm to get it to look right. The objects themselves are just stand-ins carrying the 'matte shadow/reflection' shader for the sake of compositing.
.....
The bottom image is a first pass of my car with its shaders applied, rendered in the scene with the HDRI as well as some supplemental lights to help control things a bit more.
.............................................
I needed to drop my LEGOs into the scene, and I wanted them to fall in a random fashion. I didn't want to manually move all those pieces into a "random" arrangement, so I let Max generate the randomness for me.
.....
I made a "container" for the pieces to drop into as part of a really quick reactor sequence. With the pieces in their original gridded layout, I simply turned the whole group on its side, then ran the PhysX simulation to provide a first pass of randomness.
.....
This first pass of randomness wasn't enough; the pieces were still lying in a very flat plane, so another quick simulation was in order.
.....
Again, I turned the pieces on their side and created another container for them to fall into. I made this one wider so the pieces would fill a larger volume and have more room to spread out.
.....
Even after this last simulation, the pieces were still in a very flat plane. That would cause them all to hit the surface of the desk at roughly the same moment, and I didn't want that. To fix it I used Neil Blevins' SoulburnScripts "transformRandomizer". With fairly large min and max values I spread the objects out vertically; once they were spread out enough on Z, they still needed to escape the tight "square" footprint of the container, so I pushed them out on X and Y as well, just not as much. I then randomized their rotations so they didn't all land in the same flat fashion.
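The transformRandomizer handles this interactively, but the idea reduces to something like this MAXScript sketch (the ranges are arbitrary stand-ins, not the values I used): more offset on Z than on X and Y, plus a random rotation per piece.
    -- Sketch: spread pieces out vertically so they don't all hit the desk at once,
    -- nudge them on X/Y to break up the square footprint, and randomize rotations.
    fn scatterPieces pieces zRange:80.0 xyRange:25.0 =
    (
        for p in pieces do
        (
            p.pos += [random (-xyRange) xyRange, random (-xyRange) xyRange, random 0.0 zRange]
            rotate p (eulerAngles (random 0 360) (random 0 360) (random 0 360))
        )
    )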
.............................................
I needed the higher-poly, shaded versions of the LEGO pieces to inherit the motion of the proxy pieces. To do this, I went to the end of the animation that had already been created and, with Auto Key turned on, aligned each proxy piece (position and orientation) to its corresponding shaded piece already sitting in place in the constructed roadster. I couldn't think of, or find, a faster way to do this than manually aligning each piece.
.....
Since Auto Key was on and the time slider was a few frames ahead, this keyed each proxy piece to move from its settled physics-simulation position to where it needed to be in the car. These started as simple linear paths, but I later went back and added arcs and other touches to liven up the motion.
.....
After the proxy pieces were keyed into place, I parented the shaded pieces to them, so the shaded pieces picked up the proxies' motion without the expense of simulating all those extra polygons. This process was made much easier by the sensible naming convention I'd used for every piece, and by the 'Outliner' plugin.
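For reference, the whole hookup amounts to roughly this MAXScript sketch; the "_proxy" suffix and the assembly frame are assumptions, and I actually did the aligning by hand rather than with a script.
    -- Sketch: at the frame where the car is fully assembled, key each proxy onto its
    -- shaded piece, then parent the shaded piece to the proxy so it inherits the motion.
    fn linkProxiesToCar shadedPieces assembleFrame:200 =
    (
        sliderTime = assembleFrame          -- parent at the frame where both transforms match
        for hi in shadedPieces do
        (
            local proxy = getNodeByName (hi.name + "_proxy")
            if proxy != undefined do
            (
                animate on
                (
                    at time assembleFrame proxy.transform = hi.transform
                )
                hi.parent        = proxy    -- shaded piece now rides the proxy's animation
                proxy.renderable = false    -- proxies only drive motion, never render
            )
        )
    )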
.....
I also used selection sets for the shaded pieces and the simulation pieces to avoid unnecessary re-selecting, and I made the proxy pieces non-renderable so their sole purpose is driving motion.
.............................................
The jiggle portion of the animation was the next part I tackled. I decided to use a noise controller to generate the jitter. I applied a controller to every piece that needed one, then instanced the sub-controller attributes so I only had to change a single set of values; I did, however, change each controller's seed individually so the pieces jiggled differently. I only did this for the 160-some-odd pieces on screen at the time; there's no need to animate things I'm not seeing.
.....
I had to clone the objects in place at the moment of the jiggle and swap them in with a visibility track, because assigning the noise controller overwrites any existing rotation values. The jiggle clones swap back out to the original pieces once they've done their jiggle duty.
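Wired up as a script, the clone-and-swap could look something like this MAXScript sketch; the frame numbers, the "_jig" suffix, and the noise values are placeholders, and here I've layered the noise in a position list so the clone jiggles around its resting spot instead of having its position overwritten outright.
    -- Sketch: clone each on-screen piece, layer a noise controller with its own seed
    -- on top of the clone's resting position, and key visibility so the clone is only
    -- shown during the jiggle frames.
    fn addJiggleClones pieces jiggleStart:100 jiggleEnd:115 =
    (
        for p in pieces do
        (
            local jig = copy p
            jig.name  = p.name + "_jig"

            -- a list controller keeps the resting position and adds the noise on top
            local posList = position_list()
            jig.position.controller = posList
            local n = Noise_position()
            n.frequency = 0.5                      -- placeholder values
            n.strength  = [1.0, 1.0, 1.0]
            n.seed      = random 1 99999           -- unique seed so each piece jiggles differently
            posList.available.controller = n

            -- visibility swap between the original and the jiggle clone
            local vOrig = bezier_float()
            local vJig  = bezier_float()
            p.visibility   = vOrig
            jig.visibility = vJig
            animate on
            (
                at time (jiggleStart - 1) ( vOrig.value = 1.0; vJig.value = 0.0 )
                at time jiggleStart       ( vOrig.value = 0.0; vJig.value = 1.0 )
                at time jiggleEnd         ( vOrig.value = 0.0; vJig.value = 1.0 )
                at time (jiggleEnd + 1)   ( vOrig.value = 1.0; vJig.value = 0.0 )
            )
        )
    )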
.............................................
This was the last test render I made for the physics and animation in general. It still felt a bit fast, so it was later slowed down again. I learned a good lesson from one of my instructors here: I can time things out and reproduce that timing physically in my animation, but sometimes it's not about being accurate, it's about looking good. It ultimately looked better not going so fast.
.....
At this point, finishing the project meant spending time rendering, re-rendering, testing, and re-testing. Materials were adjusted, timing was played with, animations were pushed, and so forth. Finally, the render was taken into After Effects and some "presentation" was added.