A series of tests varying the breakable-velocity value for joints. The strength of the joints is also affected by a distance test to the center of the force that's used to explode the boxes. In some of the tests I add a random multiplier to give the effect some variation.
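The setup itself lives in the 3D package, but the logic behind it can be sketched in Python. Everything here is illustrative: the function name, the default threshold and radius, and the jitter range are all invented for the sketch, not taken from the actual scene.

```python
import math
import random

def break_velocity(joint_pos, blast_center, base_threshold=5.0,
                   falloff_radius=10.0, jitter=0.3):
    """Hypothetical per-joint break threshold: joints nearer the
    blast center break more easily (lower threshold), with a
    random multiplier to add variation between joints."""
    dx, dy, dz = (j - b for j, b in zip(joint_pos, blast_center))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Strength ramps up with distance from the blast, clamped at 1.0
    falloff = min(dist / falloff_radius, 1.0)
    rand_mult = 1.0 + random.uniform(-jitter, jitter)
    return base_threshold * falloff * rand_mult
```

Joints at the blast center get a threshold of zero (they always break), while joints beyond the falloff radius keep the full base strength, give or take the random jitter.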
I worked on some nightclub visuals recently, which was a great change from my usual work. This was more of a motion-graphics-style job, and involved a cloth-like material streaming off a dancer as they performed traditional Chinese dancing.
I ended up simming about 8,000 frames of particles, and then meshed the streams in Frost. Render times were pretty long, as the client required double HD. This is a small sample of the final result. I've changed the V-Ray shader in this version, as I wanted to try something else.
This is a render test at the double-HD size; there are some Frost meshing issues with this test.
The particle count had to be increased to remove these holes in the mesh, but it was still a problem in the final render. This is partly down to the thin nature of the cloth-like structures, as it's not possible to increase the thickness of the mesh in Frost without changing the look of the streams. The only way to remove the holes is to massively increase the particle count.
I was asked to provide the graphics for a video game in the TV show Bluestone 42. This is a quick breakdown showing the level building process I went through.
The brief called for realistic current-gen game graphics, so it was important to keep this in mind while building the level. I kept poly counts reasonably low, and used normal maps for extra detail. V-Ray was used to render the scenes, with cached GI. A game-style look for the shadows was achieved by using V-Ray shadow maps on a low setting.
I delivered 17 shots for the show, based in 3 environments: the desert level shown above, a village level, and finally a recreation of the Bluestone 42 base.
I’ll upload some more videos showing my work on the show when I have a bit more time.
Very pleased to have worked on this Coca-Cola ad for Nexus. I was responsible for all the hair and fur in the commercial.
There were 14 separate character hair setups, all sculpted in Hair Farm and rendered in V-Ray. We were going for a photoreal stop-motion look, so the hair is designed to look more like nylon doll hair than human hair. The scale of the styling and the thickness of the hair strands were chosen to achieve this look. Hair Farm dynamics didn't work at this small scale, so we used a Hair Farm link mesh, which was then rigged to drive any required motion for the hair. This also suited the hand-animated stop-motion feel we'd designed for the characters.
A selection of shots from Nazi Mega Weapons: Super Tanks.
Didn't have much time to get these explosions delivered: just one day of setup, and then half a day per explosion for further setup, sim and render.
The fume sim is driven by a Particle Flow system. I wanted a strong petrol-based explosion type, with an initial large blast, followed by multiple secondary explosions as other sources of fuel are affected by the heat of the first explosion.
To achieve this I used multiple fume particle sources with different settings, and could adjust the amount of fuel and heat I was adding to the explosion with each blast.
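The staggered-blast idea can be sketched as a simple emission schedule. To be clear, this is not how the actual fume sources were authored; the frame delays and fuel/heat ranges below are made up purely to illustrate the structure of one big blast followed by smaller, randomly timed secondaries.

```python
import random

def explosion_schedule(start_frame=0, num_secondary=4, seed=42):
    """Hypothetical emission schedule: one full-strength initial
    blast, then smaller staggered secondary blasts as other fuel
    sources catch. Each entry is (frame, fuel, heat) that a
    particle source would feed into the fluid sim."""
    rng = random.Random(seed)
    events = [(start_frame, 1.0, 1.0)]  # main blast: full fuel and heat
    frame = start_frame
    for _ in range(num_secondary):
        frame += rng.randint(8, 25)          # random delay between blasts
        fuel = rng.uniform(0.2, 0.6)         # smaller fuel pockets
        heat = fuel * rng.uniform(0.8, 1.2)  # heat roughly tracks fuel
        events.append((frame, round(fuel, 2), round(heat, 2)))
    return events
```

Each entry would map to one particle source with its own settings, which is what lets the fuel and heat added per blast be dialled independently.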
So this is the same creature as shown previously, but I've spent some time rendering the hair fully in V-Ray rather than using a separate light rig and the Hair Farm renderer.
The hair is generated in Hair Farm, and then brought fully into V-Ray using the VrayHairFarmMod. The Hair Farm renderer is turned off and only V-Ray is used to render it. The V-Ray hair shader was used, with the white matted preset as a starting point. I still used Hair Farm's Color Variation material in the diffuse slot, which adds an individual colour to each strand. The hair was also set to be opaque for GI and shadows, and Simplify for GI was enabled.
Rendering hair can be problematic because of its potential to create noise in the image if the sampling is too low. The aliasing filter can also cause issues, so it took a bit of fine tuning to get this right.
Here are the settings I used in the test:
Image Sampler: Adaptive DMC
Antialiasing Filter: Cook Variable set to 2.5
The DMC noise threshold, set to 0.01, is not as low as it could be, but I didn't notice any major noise issues.
The scene is lit with 2 area spots and an HDR dome light. Primary GI was Brute Force, with secondary set to Light Cache. Render times were pretty high using the HDR, but it was worth it for the final image quality. Without the HDR, and still using GI, render times were reduced by three quarters.
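Gathered in one place, the settings above could be recorded like this. This is just a plain Python record of the values stated in the post, for reference; the key names are my own and don't correspond to any real V-Ray API.

```python
# The V-Ray settings from the hair test, collected in one place.
# Key names are illustrative, not actual V-Ray parameter names.
render_settings = {
    "image_sampler": "Adaptive DMC",
    "aa_filter": ("Cook Variable", 2.5),
    "dmc_noise_threshold": 0.01,
    "gi_primary": "Brute Force",
    "gi_secondary": "Light Cache",
    "lights": ["area spot", "area spot", "HDR dome"],
}
```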
View fullscreen in HD for best results.
In 2012 I was asked to set up a small vfx team to work on a BBC comedy/horror pilot called Deadbeats. I was brought in at the pre-production stage to plan shots, and also breakdown the vfx requirements. Once filming started, I supervised the vfx shots over a 2 week shoot.
We produced over 75 shots ranging from 2D effects, to full CG creature work, and FX heavy shots.
Some of this work is visible on my latest reel, which you can see in the showreel section of my site.
Unfortunately the pilot was not commissioned and will not be broadcast, but I wanted to show some of the great work we produced on the show.
The original ghost concept was produced by the awesome Lee Ray, who I've worked with many times in the past: http://www.lee-ray.com/
I then sculpted and textured the ghost in Mudbox. I originally rendered the ghost in Mental Ray, but this new test uses V-Ray’s Fast SSS2 shader for some great results! (best viewed in HD)
The ghost is designed to have hair (which is why the back of the head is not sculpted or textured in any great detail), but the V-Ray version of the hair is taking a while to set up, so I'll post a new test when that's complete.
I was asked by The Viral Factory to supervise and produce the vfx for a Qualcomm viral commercial.
I supervised the 3 day shoot in London, and then produced most of the post work over the following few days.
At the time of posting the viral has had 2.4 million views!
Here’s a breakdown of the Tweeting shot:
Thinking Particles 5 was used to sim the 10,000 falling birds. The sim times were actually pretty quick, at about 30 minutes, considering the large number of objects.
V-Ray was used to render, with a V-Ray camera and motion blur. I did think about rendering a velocity pass and using Realsmart, but with the large number of overlapping birds, artifacts could have been an issue. The shot was comped in After Effects.
I recently finished working with the fantastic RealtimeUK on the World of Tanks: Endless War Cinematic.
I worked on a few shots, but mainly on the Chi-Ha tank crashing through the bamboo. The bamboo was rigged and animated by hand, while the leaves were a Particle Flow system driven by a fume grid. The grass was Hair Farm. Here's a video of that shot with a brighter grade.
An early test of the leaf dynamics:
Wanted to get back into some more Mudbox action. This was started as a doodle from a sphere primitive.
It's a crab-type creature, and its anatomy is mostly based on various references of crabs and lobsters.
The render times on this were pretty huge, given the mesh is 6 million polys. So I ended up rendering this from the viewport, which only took 30 mins even with all the shadows and AO.