Tools and Samplers
One of my mentors at ILM, John Hansen, taught me a valuable lesson. I didn't really understand it until years later, but it was important.
Iterate faster.
I was working on the first Pacific Rim movie, and I was given the incredibly awesome task of developing the effects for the 'Rocket Punch', as we called it internally.
This was the last feature film I worked on, and I think it broke me, being roughly the 17th giant robot movie I had worked on.
I can still hear John Knoll: "Can we do shock diamonds!?"
Me: "I think - What's a shock diamond?"
Working with the existing tools at ILM was a bit of a mixed bag. The company helped shape much of the software the industry uses today, even Python. But that history comes with a lot of cruft, and a lot of attachment to older, familiar tools.
I was struggling - I couldn't figure out how to achieve the look I wanted with the tools I had. It took ages to run a simulation and even longer to render it. We're talking hours per change.
John was appalled, and rightly so. It was taking too long to iterate.
He taught me to make changes in orders of magnitude, rather than piecemeal value changes. He taught me to write samplers to visualize the complex data in a quick, realtime way. Year over year, I take these lessons and apply them to everything I do.
If I'm working blind on anything, I now stop and take the time to write an iterative sampler so that I can make many more changes, faster, in realtime. This is insanely helpful not only for productivity, but also for getting better results.
Think about it - if you can test only 10 revisions in a day and you have to pick the best one, your result may suffer. Now suppose you can execute 100, or 1,000, or 1,000,000 revisions in a day - or see the result of a value change update in realtime - you're going to get better results.
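To make the idea concrete, here's a minimal sketch of that kind of sampler. Everything in it is hypothetical - `render_proxy` is a stand-in for whatever expensive simulation or render you're actually iterating on - but it shows the two lessons together: sweep the parameter in orders of magnitude (logarithmically, not piecemeal value changes), and evaluate a cheap proxy so you can see hundreds of results in milliseconds instead of hours.

```python
import math

def render_proxy(intensity: float) -> float:
    """Hypothetical stand-in for an expensive sim/render.

    Returns a cheap 'quality score' in [0, 1] so the sweep runs in
    milliseconds. In practice this would be a low-res sim, a slice of
    the data, or any fast visualization of the thing you care about.
    """
    # Arbitrary objective for the demo: peaks when intensity is near 100.
    return math.exp(-(math.log10(intensity) - 2.0) ** 2)

def sweep_orders_of_magnitude(lo_exp: int, hi_exp: int, steps_per_decade: int = 4):
    """Sample a parameter logarithmically from 10**lo_exp to 10**hi_exp.

    Order-of-magnitude steps cover a huge range with few samples;
    once you see which decade works, you refine within it.
    """
    results = []
    total = (hi_exp - lo_exp) * steps_per_decade
    for i in range(total + 1):
        value = 10 ** (lo_exp + i / steps_per_decade)
        results.append((value, render_proxy(value)))
    return results

if __name__ == "__main__":
    samples = sweep_orders_of_magnitude(-1, 4)
    for value, score in samples:
        bar = "#" * int(score * 40)  # quick ASCII visualization of the sweep
        print(f"{value:>12.3f}  {bar}")
    best_value, _ = max(samples, key=lambda s: s[1])
    print(f"best result near: {best_value:.1f}")
```

The sweep covers five decades (0.1 to 10,000) in 21 samples, and the ASCII bars give you an instant picture of where the good values live - the same feedback loop a realtime sampler gives you, at toy scale.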
Just for fun…