It occurs to me that I have literally spent years of my life dealing with how to render transparent surfaces.
Ok, this hit a nerve; transparency might be worth a thread of random observations. (2/17)
Transparency to me has been one of those classic 90/10 engineering problems: you spend 90% of your time dealing with an effect that shows up in maybe 10% of your pixels. (3/17)
Meanwhile it's also the effect that invariably causes your beautiful rendering algorithm to blow up, with the addition of weird bubbles, stalls, and loops. (4/17)
When we added the occlusion buffer to REYES, and then later a full two-pass hide-before-shade feedback loop, transparency was the complication that required careful memory management of speculatively hidden grids. (5/17)
This seems to be a repeating pattern for me over 20 years: transparency is a delicate tightrope walk of balancing memory vs keeping your processor busy vs speculative overshading - no matter what your algorithm is. (6/17)
In the REYES algorithm, I spent months if not years dealing with the space complexity of stacks of visible points (mainly coming from hair). Stochastic methods had to be deployed for some of that complexity. (7/17)
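For context, here is a minimal sketch of the generic stochastic-transparency idea (keep a sample as opaque with probability equal to its alpha), which bounds per-pixel memory at the cost of noise. This is the textbook technique, not necessarily the exact method that was used in the REYES implementation:

```cpp
// Sketch of generic stochastic transparency: rather than stacking every
// semi-transparent visible point (hair produces very deep stacks), keep a
// sample as fully opaque with probability equal to its alpha and discard
// it otherwise. Per-pixel memory stays bounded; averaging over many pixel
// samples converges to the correct blend. Hypothetical code, not the
// actual REYES data structures.
#include <random>

struct VisiblePoint { float depth; float alpha; float rgb[3]; };

bool stochasticallyKeep(const VisiblePoint& vp, std::mt19937& rng) {
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    return u01(rng) < vp.alpha;  // kept samples are treated as opaque
}
```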
Even after REYES is gone, deep output still has to deal with a similar problem. I'm not even going to go near what it means to comp behind glass. (8/17)
Did I say comp? Alpha (at least for me) is a gigantic pain to get correct _particularly_ in production ray tracers. (9/17)
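For reference, the comp expectation boils down to the standard Porter-Duff "over" operator on premultiplied alpha; getting a ray tracer's alpha channel to stay consistent with this is where the pain starts. This is just the textbook formula, not any RenderMan API:

```cpp
// The Porter-Duff "over" operator on premultiplied RGBA: what a comp
// department assumes the renderer's alpha channel is consistent with.
struct Rgba { float r, g, b, a; };  // premultiplied: rgb already scaled by a

Rgba over(const Rgba& front, const Rgba& back) {
    const float k = 1.0f - front.a;  // fraction of the back layer showing through
    return { front.r + k * back.r,
             front.g + k * back.g,
             front.b + k * back.b,
             front.a + k * back.a };
}
```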
Right now for me, alpha, matte, and volume are four-letter words, especially when put together into a single sentence. (10/17)
Moving on from REYES: the number of transparency headaches in path tracing should not have been surprising at the outset, but they nonetheless somehow still surprise us when we run into them. (11/17)
RIS is heavily dependent on next event estimation. Guess what? Transmission rays in the presence of opacity need to be shaded, and that shading causes a massive bubble in a nice orderly path-tracing wavefront pipeline. (12/17)
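Roughly what that looks like, as a generic sketch (hypothetical types, not RIS internals): every transparent hit along the transmission/shadow ray has to have its opacity shaded before the light contribution can be resolved.

```cpp
// Generic sketch of resolving light visibility through semi-transparent
// hits: each hit's opacity may be textured or procedural, so it has to be
// shaded mid-traversal -- this is the "bubble" in the wavefront pipeline.
// Hypothetical types; shadeOpacity stands in for running the material's
// opacity network.
#include <functional>
#include <vector>

struct Hit { int primId; float u, v; };  // stand-in for a real hit record

float lightTransmittance(const std::vector<Hit>& hits,
                         const std::function<float(const Hit&)>& shadeOpacity) {
    float transmittance = 1.0f;
    for (const Hit& h : hits) {
        transmittance *= 1.0f - shadeOpacity(h);  // the expensive part
        if (transmittance <= 0.0f) break;         // fully blocked, stop early
    }
    return transmittance;
}
```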
Even just shutting off opacity on glass for next event estimation as an approximation (i.e. ignoring the bending of transmission rays) is a headache - our solution *requires* LPEs to avoid double illumination, and I can never keep the details in my head. (13/17)
Introducing shading into NEE transmission rays reintroduces the tightrope walk I mentioned earlier: balancing memory vs processor utilization vs speculative overshading, but worse: (14/17)
Most of the time only a few of the transmission rays have actually hit something transparent, but meanwhile you've potentially stalled all the direct lighting that's sitting around waiting for the results. (15/17)
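One way (of several) to keep the rest of the wavefront moving is to split the batch so only the rays that actually hit transparent geometry wait on opacity shading. A hypothetical sketch, not the RIS scheduler:

```cpp
// Sketch: peel the (usually small) transparent subset of a shadow-ray batch
// into its own queue so the direct lighting for everything else can resolve
// immediately instead of stalling behind opacity shading.
// Hypothetical structures, not the actual RIS scheduler.
#include <algorithm>
#include <vector>

struct ShadowRay { bool hitTransparent; float transmittance; };

void scheduleShadowBatch(std::vector<ShadowRay>& batch,
                         std::vector<ShadowRay>& needsOpacityShading) {
    auto mid = std::partition(batch.begin(), batch.end(),
                              [](const ShadowRay& r) { return !r.hitTransparent; });
    needsOpacityShading.insert(needsOpacityShading.end(), mid, batch.end());
    batch.erase(mid, batch.end());  // what's left resolves right away
}
```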
Don't get me started on trying to balance the use of stochastic methods for transparency in a path tracer (answer: your artists will break it no matter what you choose). (16/17)
I could go on forever (seriously), but I think you get the gist by now. Transparency is the problem that no one wants to think about early enough; dealing with it correctly is part of turning your research renderer into the production renderer it deserves to be. (17/17)
Forgot about this fun thread until this week's highest priority rendering bug at Pixar, which turned out to be a subtle issue in optimizations for hit testing out-of-order transparent objects in SSE code. Doing minimal work is tricky. #productionrendering
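The reason out-of-order hit testing is even on the table: pure transmittance is a product of per-hit (1 - opacity) factors, and the product commutes, so hits can be folded in whatever order traversal delivers them; the "minimal work" part is knowing when to stop. A hedged scalar sketch, unrelated to the actual SSE code or the bug:

```cpp
// Transmittance along a ray is order-independent: each transparent hit
// contributes a factor of (1 - opacity), and multiplication commutes.
// Since every factor is <= 1, the running product only decreases, so an
// early-out below a cutoff is safe no matter what order the hits arrive in.
// Hypothetical scalar code, not the SSE path mentioned above.
float accumulateTransmittance(const float* hitOpacities, int numHits,
                              float cutoff = 1e-3f) {
    float t = 1.0f;
    for (int i = 0; i < numHits; ++i) {
        t *= 1.0f - hitOpacities[i];
        if (t < cutoff) return 0.0f;  // effectively opaque; skip remaining hits
    }
    return t;
}
```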