CGI Nissan 370Z


I must admit, we do have a quiet chuckle to ourselves occasionally when we come across those people who think creating great images by CGI is just a matter of throwing some CAD data of a vehicle and a background into Maya or 3DS Max and out pops a glorious image! Ummmm, let's just say there's a bit more to it than that.

For those marketing and creative souls out there who are perhaps not familiar with CGI for image creation, the following provides a basic overview of creating a real 'location' based image by CGI.

1. DATA - Firstly, there must be a good quality model of the vehicle available, prepared carefully to particular requirements. This is a highly technical area in itself and would require pages of explanation, but for this example we shall assume good vehicle data exists and has been prepped for use in Maya or 3DS Max, which are the two principal pieces of software used. Prepping the CAD data for each vehicle model, which usually comes via the car manufacturer's design department, can be a $10K to $100K exercise in itself. It's often called 'garaging the car' or creation of a 'virtual garage' from which the appropriate vehicle data is sourced ready for CGI visualisation and rendering.

Below: Wireframe view of vehicle data
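
To give a flavour of what 'garaging' involves (and this is only a sliver of it), the clean-up on imported CAD panels looks something like the sketch below, assuming Maya and its Python module maya.cmds; the object and group names are made up for illustration.

```python
# Sketch only: a tiny fraction of what real CAD prep involves.
# Assumes Maya's built-in Python module; object names are hypothetical.
import maya.cmds as cmds

def garage_vehicle(panel_nodes, garage_name="virtualGarage_370Z"):
    """Clean up imported CAD panels and park them in a 'virtual garage' group."""
    for node in panel_nodes:
        cmds.delete(node, constructionHistory=True)   # drop the CAD import history
        cmds.makeIdentity(node, apply=True,
                          translate=True, rotate=True, scale=True)  # freeze transforms
    # Group everything so the whole car can be referenced into any scene later
    return cmds.group(panel_nodes, name=garage_name)

# e.g. garage_vehicle(cmds.ls("370Z_body*", "370Z_glass*"))
```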

2. BACKPLATES - Real world, location based CGI usually uses actual locations that have been photographed and light mapped, with the car then added from a CAD model. You can create the entire scene in 3D, but that's another example for another day.

All the processes leading up to shooting can be surprisingly similar to photography. The type and style of backgrounds need to be determined as they always have been, and location or library searches are normally involved. Your average stock library is not so helpful: while you might find an appropriate background, it will not have an accompanying 'lighting map', which makes it poorly suited to CGI. There are new libraries of CGI backgrounds becoming available and these provide specially produced backplates with accompanying lighting maps. We call these matching sets of backplates and lighting maps 'datasets'. Accordingly, your location needs may be fulfilled either from a CGI stock library, or by searching out and selecting a real world location which is then photographed according to your specific needs.
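
If it helps, you can think of a 'dataset' as nothing more than the plates, their matching lighting map and the capture notes that make them usable later. A minimal sketch of how one might be catalogued (the file names and fields here are invented):

```python
# Illustrative only: the file names, fields and values here are invented.
from dataclasses import dataclass

@dataclass
class Dataset:
    """A matching set of backplates and lighting map, shot at the same time and place."""
    location: str
    backplates: list          # several camera angles of the same spot
    lighting_map: str         # 360 degree HDR captured from the car position
    camera_height_m: float    # needed later to match the virtual camera
    lens_mm: float
    notes: str = ""

harbour_dusk = Dataset(
    location="Harbour quay at dusk",
    backplates=["quay_35mm_low.tif", "quay_50mm_mid.tif"],
    lighting_map="quay_dusk_360.exr",
    camera_height_m=1.2,
    lens_mm=35.0,
    notes="Sun low, camera left; wet tarmac",
)
```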

One of the main benefits of shooting your own locations is obviously exclusivity, but also that backplates can be produced on angles and from camera positions that allow you to depict the vehicle on the very specific angle you require. CGI library shots will only have a certain number of camera positions to choose from and you may not be able to position the vehicle on the exact angle you'd like without it looking a bit fake or unrealistic.

The example below was produced from our own library 'TransportalCGI'. It's small but growing and we hope to have it online in the coming months.

3. LIGHTING MAPS

For each set of backplates taken at a particular location there needs to be an accompanying lighting map. The lighting map needs to be captured at the same time and in the same lighting conditions as the backplate, and is taken from the position in the frame where the vehicle will be placed, so that reflections of the surrounding environment are appropriately rendered onto the car.

Technically these lighting map images are full 360 degree images of the surrounding scene shot with massive exposure variations (+/- 13 stops from normal exposure). They can be captured by stitching and combining hundreds of individual images and exposures from a rotating standard camera, or in Blue Fish's case we use a Spheron HDR camera which creates a full spherical 32 bit HDR recording 26 stops of exposure information.

Below: 360° 32 bit HDR image from Spheron camera
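
For the stitched-and-bracketed route, the exposure merging step looks roughly like this sketch using OpenCV's Debevec merge; the file names and shutter times are placeholders, and stitching the merged frames into the full 360 is a separate step not shown.

```python
# Sketch: merging one bracketed set into a single 32 bit HDR frame with OpenCV.
# File names and shutter times are placeholders; stitching the frames into a
# full 360 degree panorama is a separate step not shown here.
import cv2
import numpy as np

files = ["bracket_minus4ev.jpg", "bracket_0ev.jpg", "bracket_plus4ev.jpg"]
times = np.array([1 / 1000.0, 1 / 60.0, 1 / 4.0], dtype=np.float32)  # shutter speeds (s)

images = [cv2.imread(f) for f in files]
hdr = cv2.createMergeDebevec().process(images, times)  # float32 radiance image

cv2.imwrite("frame_hdr.hdr", hdr)  # Radiance .hdr keeps the full dynamic range
```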

4. CGI Visualisation

Once you have your first three components, Vehicle Data, Backplate and Lighting Map, the CGI work begins by importing these three elements into your CGI software (usually Maya or 3DS Max). First up is establishing the car's actual size, position and scale in the scene and setting the virtual camera position, lens and distance to the vehicle. This can take hours of refinement to get exactly right, for having just one of these variables wrong will produce a vehicle that never matches into the scene properly. This is especially so with moving backplates, as the track of the vehicle needs to sit perfectly on the road surface and be going in a direction that correctly correlates to the movement in the image.
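
As a rough idea of that first matching step, here's a sketch assuming Maya's maya.cmds, with the focal length, camera height, plate and group names as illustrative values only:

```python
# Sketch: matching the virtual camera to the backplate (Maya's maya.cmds assumed).
# Focal length, camera height, plate and group names are illustrative only.
import maya.cmds as cmds

cam_transform, cam_shape = cmds.camera(focalLength=35)               # same lens as the plate
cmds.xform(cam_transform, worldSpace=True, translation=(0, 1.2, 8))  # plate shot at ~1.2 m

# Put the backplate behind the 3D scene so the car can be lined up against it
cmds.imagePlane(camera=cam_shape, fileName="quay_35mm_low.tif")

# Drop the garaged car in at real world scale and nudge it into position
cmds.xform("virtualGarage_370Z", worldSpace=True,
           translation=(0, 0, 0), rotation=(0, 25, 0))
```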

The client may well like to have the car on a different angle, perhaps from higher up or more front-on (whatever), but changing this without a backplate designed exactly to match is never going to work well. More often than not we custom shoot locations and backplates because very particular vehicle heights and angles are required by the client. Accordingly, backplates have to be shot from corresponding camera heights and with appropriate lenses etc. In critical cases we will take vehicle models and CGI software on location and actually drop vehicles into the scene while shooting the backplates, so we can check how car positions and angles are matching up to the environment and to client expectations.
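
One small piece of that on-location checking is simple lens geometry: the virtual camera has to reproduce the plate camera's field of view. A back-of-envelope version (full frame sensor and 35 mm lens used as example values):

```python
# Quick sanity check when matching the plate lens to the virtual camera.
# Sensor width and focal length are example values (full frame sensor, 35 mm lens).
import math

def horizontal_fov(sensor_width_mm=36.0, focal_length_mm=35.0):
    """Horizontal field of view in degrees for a given sensor width and lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

print(round(horizontal_fov(36.0, 35.0), 1))  # roughly 54.4 degrees
```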

Below: Car wireframe already correctly positioned within the backplate. The image below already has the car shadow applied from the final render, but it's not usually there at this stage, which makes positioning and grounding the car very difficult. Once we're happy with our initial positioning of the car we will quickly do test renders and vehicle shadows before locking off a final car position in frame.

5. Visualisation & lighting maps

Once the vehicle has been established in the scene in the correct position, we then apply the lighting maps and produce the first quick renders in clay mode. This helps us initially position and rotate the lighting map to match the light source of the backplate. While the starting position replicates the real scene, luckily we are not bound by real world physics. There have been many times on location shooting real cars when I've had the background angle I want but the light is not coming from the perfect direction; well, in the CGI world we can move the sun, yipee! There are limits to how far we might go, as the reflections on the vehicle will also move as you manipulate or alter the lighting dome, but there is plenty of scope to play.
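
In practice, 'moving the sun' on an image based light just means rotating the spherical map. On an equirectangular (lat-long) HDR that's a horizontal shift, something like this sketch, which assumes the 360 map is already loaded as a float array:

```python
# Sketch: 'moving the sun' by yawing an equirectangular (lat-long) HDR map.
# Assumes the 360 degree map is already loaded as a float32 array (height, width, 3).
import numpy as np

def rotate_environment(hdr_latlong, yaw_degrees):
    """Shift a lat-long environment map horizontally, i.e. rotate it around the vertical axis."""
    width = hdr_latlong.shape[1]
    shift = int(round(width * yaw_degrees / 360.0))
    return np.roll(hdr_latlong, shift, axis=1)

# e.g. swing the key light 40 degrees around the car:
# rotated = rotate_environment(hdr, 40.0)
```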

Below: vehicle clay view with lighting map applied.

6. Visualisation - Additional Lighting & rendering

Once our main lighting map has been positioned there may then be many modifications made to it and a kazillion settings and parameters applied. We'll conduct our first half-res renders at this stage to get a better feel for how it's all coming together. After reviewing those renders we will generally add additional light sources, reflectors and blacks within the CGI scene. It's almost like doing it for real on location: put a 2K light over there, add a white board for some kick or fill over here, etc. The nice thing is that we can have those additions affect the lighting on the vehicle but make them invisible in the actual scene. Try doing that in the real world.
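
That 'invisible reflector' trick usually comes down to a visibility flag on the extra geometry. In Maya, for example, it's along these lines (a sketch; the shader hookup is omitted and the names are made up):

```python
# Sketch: a white bounce card that lights the car but never shows up to camera.
# Maya's maya.cmds assumed; the shader assignment is omitted and names are made up.
import maya.cmds as cmds

card, _ = cmds.polyPlane(name="bounceCard_left", width=4, height=2)
cmds.xform(card, worldSpace=True, translation=(-3, 1, 0), rotation=(0, 0, 90))

card_shape = cmds.listRelatives(card, shapes=True)[0]
cmds.setAttr(card_shape + ".primaryVisibility", 0)  # hidden from camera rays, but
                                                    # still contributes bounce and reflections
```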

Many more adjustments and test renders are completed as we move closer to final renders, each time making small to larger scale adjustments to the image. Final full resolution renders are then normally queued to the render farm overnight, and the following morning we start compositing those renders for final retouching. To give us more control, there are many, many render layers making up the final composite. Initial composites are in 32 bit and these are simplified down as we blend different layers for colour, reflections, shadows etc. There are probably ten pages of information and steps we could additionally outline here, but I trust you'll respect that we're not about to give all our techniques or secrets away.
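
Stripped right back, the first comp step is a shadow multiply on the plate followed by an 'over' of the car's beauty pass. With the real stack of layers left out, it looks something like this (layer names invented, everything in 32 bit float):

```python
# Minimal sketch of the first comp step: shadow multiply, then beauty 'over' the plate.
# Real comps use many more layers (reflections, AO, ID mattes, etc.); names are invented.
import numpy as np

def basic_comp(backplate, beauty_rgb, beauty_alpha, shadow):
    """All inputs are float32 arrays in linear light; beauty is premultiplied, shadow is 0..1."""
    plate_shadowed = backplate * shadow                 # darken the plate under the car
    alpha = beauty_alpha[..., None]                     # (h, w) -> (h, w, 1)
    return beauty_rgb + plate_shadowed * (1.0 - alpha)  # standard premultiplied 'over'

# comp = basic_comp(plate, car_beauty, car_alpha, car_shadow)
```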

Regards, GG

Final retouched render above. The main feature image at the top of this article is a Photoshop retouch of this file to add a more dynamic, cross processed type effect.