Starbucks-fueled Developer

Monday, May 29, 2006

6 Seconds of Fame

As you may (or may not) have read previously, my brother is currently living and studying at Full Sail in Central Florida to become a computer animator. He'll be starting his third month tomorrow, and the fruit of his 3D Foundations course is available for viewing on his blog (or directly downloadable; again, apologies for the URL...). Watch and return...

Some specs not mentioned in his post: it's only 6 seconds long. At 30 fps, that's 180 frames...which took nearly 3 hours to render/compile/encode using QuickTime and produced the 18.5 MB file that you (hopefully) viewed. That boggles my mind. If you take a feature-length film, like The Incredibles - which runs nearly 2 hours - you're talking about 216,000 frames.

Remember: his doesn't have textures either. His rendered-frames-per-hour (Rfph) rate is 60. Now, granted, Disney is going to have supercomputers rendering their films. But on the G4s that he's using*, that feature-length film would take 3,600 hours, or 150 DAYS!!!

They desperately need to refactor their Continuous Integration process... ;-)


*sorry, I had to get this in...Sean mentioned that his PC (an older Dell P4 3.0GHz HyperThreading Wintel) is faster than the Macs for this stuff...OK, I'm better now...
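For the curious, the back-of-the-envelope math above checks out. Here's a quick sketch; the numbers come straight from the post, and the variable names are mine:

```python
# Render math from the post: a 6-second clip at 30 fps,
# taking nearly 3 hours to render on one machine.
clip_seconds = 6
fps = 30
render_hours = 3.0

frames = clip_seconds * fps           # 180 frames
rfph = frames / render_hours          # 60 rendered frames per hour

# Scale up to a feature-length film (~2 hours at 30 fps).
feature_frames = 2 * 60 * 60 * fps    # 216,000 frames
hours_needed = feature_frames / rfph  # 3,600 hours on a single machine
days_needed = hours_needed / 24       # 150 days

print(frames, rfph, hours_needed, days_needed)
```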

3 Comments:

  • Faster than those Macs, yes, though partly because I wasn't sure how to optimize the render on them. But since I know my own processor, and I know so well the ATI 128 MB Radeon 9800 Pro that dwells (nay, lurks, lying in merciless wait for unsuspecting graphical prey) within, it was a little different.

    Pixar's render farms use a different rendering pipeline, and they have coders who build optimized rendering software for each instance of RenderMan. Still, figure the first cut of any movie probably takes a full weekend to render.

    My teacher, Marcus, at one point estimated approximately 10 hours using 24 computers to spit out 720 frames for his most recently completed project. 24 fps is the standard NTSC format, hence the lower frame count.

    Grats on the new pad; heard everything has gone mostly smoothly. 'Tis awesome.

    By Anonymous Anonymous, at Tuesday, May 30, 2006 2:51:00 AM  

  • I can't edit, so I'll just add this.

    If I were going to render something that's show-length (that's what we call it; whether it's a commercial, a TV sequence, or a film scene, it's called a show), I would never do it on only one computer. I would die first. I would optimize the hell out of it, save it to a network, then load it up on all the other computers in the render farm and instruct each computer to render a set number of frames (i.e. 1-10 on machine 1, 11-20 on machine 2, etc.), leave for a couple hours, and let it spit out the individual images (which is what it does, in sequence). I'd come back when I estimate it should be done, do a quick test and some troubleshooting on the rendered images, grab them all with some compositing software (we use Shake), fix the lighting, the layering, the positioning, the compositing, etc., then render it to a new file using the required end-result codec and file type, which, 90% of the time, will be DVCPRO/NTSC, the DVD standard format.

    The resolution of the file spat out will change frequently. For example, Pixar doesn't render at 720 * 480; they probably render much higher, probably at like 1280 * 720 or even 1900 * 1080, which is what we consider the true HiDef standards. Technically, 1080p isn't true HiDef because we can't pipe it through at 60 fps yet, and 1080i is... well, it's ugly. Besides, you can't tell the difference between 1080p at 30 fps and 1080i at 60 fps. It's actually even fairly hard to tell the difference between 1280*720p at 30 fps and 1900*1080i at 30 fps, and 720 looks a little cleaner.

    Ultimately, your primary platform determines how you think in terms of rendering. If you're thinking big-screen, you're thinking super high quality then compressed down, so you're pulling in textures at 1024 * 1024 and doing all kinds of crazy resolution and UV mapping.

    Ok, point is: your analogy is not even close to accurate, lol. You would never DO a feature show the way we did it. The way we did it was just to get it done and show we knew how to do it, which was the point of the class.

    By Anonymous Anonymous, at Tuesday, May 30, 2006 3:08:00 AM  

  • It's 1920*1080 you idiot

    By Anonymous Anonymous, at Thursday, November 02, 2006 9:22:00 PM  
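As an aside, the frame-splitting scheme described in the second comment (frames 1-10 on machine 1, 11-20 on machine 2, and so on) can be sketched in a few lines. The function name and setup below are illustrative, not any real render-farm tool:

```python
# Sketch of the frame-splitting scheme from the comment above:
# divide a show's frames into contiguous chunks, one per
# render-farm machine, handing out any remainder one frame at a time.
def split_frames(total_frames, machines):
    """Return a list of (start, end) frame ranges, one per machine."""
    base, extra = divmod(total_frames, machines)
    ranges = []
    start = 1
    for i in range(machines):
        count = base + (1 if i < extra else 0)
        ranges.append((start, start + count - 1))
        start += count
    return ranges

# The teacher's example from comment 1: 720 frames across 24 machines
# works out to 30 frames per machine.
print(split_frames(720, 24))
```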

Post a Comment

<< Home