Like many other LightWave 3D users, one of the things I love most about the software is the relative simplicity of its rendering engine. This is a great advantage for an independent 3D generalist like me: the ability to churn out frames with minimal tinkering is invaluable and an immense time saver. Nevertheless, simplicity doesn't save you from aggravating, hair-tearing moments, such as the one I experienced while rendering the test animation in the previous post.
The camera and rendering settings were similar to what I typically use.
The test renders were also normal, clocking in at about 6 minutes per frame on my old Core 2 Duo machine. If you're familiar with LW, you'll see that these are hardly the best settings for high-quality output, but for a test animation I decided they would do.
Satisfied that everything was in order, I sent the job to my i5 quad-core machine for final rendering and moved on to other tasks.
Three hours later, I came back to check on the progress. To my utter dismay, it was still working on the third frame, with the second frame having taken over 2 hours to render. Needless to say, that was a huge jump from 6 minutes a frame on a lesser computer. Suspecting a freak memory leak at first, I restarted the render from the last frame to let it work backwards toward the first. The first few frames took about 7-8 minutes each, which was still considerably slower than expected given that two more cores were working on it, but I was too tired to troubleshoot, so I let it be and went to sleep.
In the morning, I checked again and found the render stuck at the 37th frame, with the previous frame having taken 1.5 hours. By then it was already 10 hours into the render. For a simple 4-second animation, it was getting ridiculous.
By studying the render progress, I quickly realized that the render slowed to a crawl whenever it reached the portions with heavy motion blur. Lowering the blur amount and the number of blur passes helped very little, and I was not about to reduce the AA and adaptive sampling settings, because the output was already bordering on unacceptably grainy.
Initial forum searches yielded no results, until I found a thread on, where else, the NewTek forums. In a nutshell, the problem lay in the fact that I was using Photoreal motion blur on a deforming object, and the two did not play well with each other.
Faced with the grim prospect of spending a week rendering a test animation (at 1.5-2 hours per frame, 130 frames would have taken well over a week), I bit the bullet and switched to Dithered motion blur and, voilà, the render time dropped to 6 minutes a frame, which stayed more or less consistent throughout the 130 frames.
The result of using dithered MB was, in my opinion, not all that inferior to Photoreal. The blur amount was low enough that the checkered dither pattern was not very noticeable. I suppose that given enough passes it would look just as good as Photoreal MB (which uses stochastic dithering), though I'm not sure the resulting increase in render time would be worth it.
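To make that trade-off concrete, here's a toy sketch of the idea, nothing like LightWave's actual implementation; the 1-D scene, function names, and pass counts are all made up for illustration. Fixed, evenly spaced sub-frame offsets stand in for dithered multi-pass blur, random offsets stand in for stochastic sampling, and both converge to the same smooth ramp as the pass count grows:

```python
import random

# Toy 1-D "scene": a bright edge sweeps across the pixel row during the
# shutter interval. A pixel reads white (1.0) once the edge has passed it.
def sample_scene(pixel, edge_pos):
    return 1.0 if pixel < edge_pos else 0.0

def blurred_pixel(pixel, start, end, passes, stochastic=False):
    """Average `passes` sub-frame samples over the shutter interval.
    Fixed, evenly spaced offsets mimic dithered multi-pass blur;
    random offsets mimic stochastic (photoreal-style) sampling."""
    total = 0.0
    for i in range(passes):
        t = random.random() if stochastic else (i + 0.5) / passes
        total += sample_scene(pixel, start + t * (end - start))
    return total / passes

# With few passes, the evenly spaced samples quantize the blur into visible
# steps (the banding you see in dithered MB); more passes smooth it out.
for passes in (3, 8, 32):
    row = [round(blurred_pixel(px, 2.0, 6.0, passes), 2) for px in range(8)]
    print(f"{passes:>2} passes:", row)
```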
In any case, I guess I'll have to start looking into implementing MB in post.
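The crudest way to fake MB in post is plain frame blending: averaging each frame with its neighbours. The sketch below is just that, assuming a hypothetical image sequence named frame_0001.png, frame_0002.png and so on, and using Pillow and NumPy; a proper compositing workflow would use a velocity/vector pass instead, but this shows the basic idea:

```python
# Minimal sketch of post-process motion blur via frame blending.
# Assumes a hypothetical sequence frame_0001.png, frame_0002.png, ...
# A real compositing setup would use a velocity (vector) pass instead.
import numpy as np
from PIL import Image

def blend_frames(index, radius=1, pattern="frame_%04d.png"):
    """Average frame `index` with `radius` neighbours on each side."""
    stack = [np.asarray(Image.open(pattern % i), dtype=np.float32)
             for i in range(index - radius, index + radius + 1)]
    return Image.fromarray(np.mean(stack, axis=0).astype(np.uint8))

# e.g. blend_frames(50, radius=2).save("blurred_0050.png")
```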