Thursday, September 29, 2011

In-depth video on micropolygons in OpenGL

Anyone interested in OpenGL and micropolygons should definitely check out the following 1-hour-long video presentation by Kayvon Fatahalian, "Evolving the OpenGL Graphics Pipeline in Pursuit of Real-Time, Film-Quality Rendering":


Lots of interesting insights into the future of GPU hardware.

Tuesday, September 27, 2011

Real-time dynamic GPU path tracing

A small teaser of something I'm working on, rendered in real-time on a GeForce GTS 450:


12 samples per pixel, max path length = 4, 584x266 render resolution, Stanford Bunny with 69k triangles (rendered in about 0.4 seconds on a low-end GTS 450):



This will eventually become a full-fledged game inspired by this video:

A few details on Imagination's PowerVR GPU with Caustic Graphics hardware ray tracing

A few days ago, an article appeared on the Japanese website 4gamer.net about an in-development PowerVR GPU, the PowerVR RTX, which is said to have integrated ray tracing hardware from Caustic Graphics (acquired by Imagination Technologies in December 2010). The article is in Japanese, and the Google translated version is barely understandable:

http://translate.google.com/translate?hl=en&sl=auto&tl=en&u=http%3A%2F%2Fwww.4gamer.net%2Fgames%2F017%2FG001762%2F20110920023%2F

Saturday, September 24, 2011

Video of Unbiased FPS + a new video of Brigade!

The following two videos of "Unbiased FPS" were rendered on a GTS 450 and demonstrate mouse look, shooting mechanics and how adjusting the number of averaged frames affects the amount of noise and blurring. YouTube's video compression handles fine-grained noise (such as Monte Carlo noise) very poorly, so the videos came out much worse than expected:




The frame averaging technique (implemented by Kerrash) significantly reduces the amount of noise without affecting the framerate, but the resulting blurring of fast-moving objects is of course less than ideal. At the upcoming Siggraph Asia 2011, a paper entitled "Image-space bidirectional scene reprojection" will be presented that could potentially solve the blurring issue by reconstructing additional frames without distracting blurry artefacts (there are two versions, a raster-based one and an image-based one). The video accompanying the paper shows a lot of promise and is definitely worth checking out: http://research.microsoft.com/en-us/um/people/hoppe/proj/bireproj/
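
For readers wondering what the frame averaging boils down to, here is a minimal CPU-side sketch (the names and the exponential-moving-average formulation are my own illustration, not Kerrash's actual implementation): the current noisy frame is blended into an accumulation buffer, so the displayed image effectively combines samples from the last several frames, which is also where the ghosting on moving objects comes from.

    #include <cstddef>
    #include <vector>

    struct Color { float r, g, b; };

    // Blend the freshly rendered (noisy) frame into an accumulation buffer.
    // With weight w = 1 / (frames_to_average + 1) this behaves roughly like a
    // moving average over the last N frames without storing them separately.
    void average_frames(const std::vector<Color>& current,  // this frame, e.g. 4 spp
                        std::vector<Color>& accumulated,    // what gets displayed
                        int frames_to_average)              // the "slider" value
    {
        float w = 1.0f / float(frames_to_average + 1);      // 0 -> no averaging
        for (std::size_t i = 0; i < current.size(); ++i) {
            accumulated[i].r = (1.0f - w) * accumulated[i].r + w * current[i].r;
            accumulated[i].g = (1.0f - w) * accumulated[i].g + w * current[i].g;
            accumulated[i].b = (1.0f - w) * accumulated[i].b + w * current[i].b;
        }
    }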

On a side note, I found a new, short video of the Brigade path tracer, showing a real-time path traced animated character: http://www.youtube.com/watch?v=RbxoZK6Apj8 (no details on the hardware used)



The video shows very nice soft shadows and path traced ambient light coming from the skydome, calculated in real-time, in contrast to the constant ambient term that many games use to approximate indirect lighting.

Wednesday, September 21, 2011

Unbiased FPS

Another small experiment to see how a path traced first person shooter game might look (all screenshots rendered on an 8600M GT):



4 spp, reusing samples from the previous 6 frames, 720p resolution:


720p resolution, 4 spp with 12 averaged frames (reusing samples from the previous 12 frames, resulting in heavy blur for moving objects but crisp image quality for the rest of the scene):



Some features:

- a first person "gun" sticks to the camera and shoots chrome balls with physical weight

- mouse look is semi-functional: you can move the mouse cursor up to the borders of the screen

- the car can be controlled with the arrow keys

- the robot has an initial angular momentum, but essentially does nothing

- the camera is controlled with mouse + WASD keys (the classic FPS controls)

- the number of averaged frames can now be set with a slider (from 0 to 12). It regulates the amount of "blur" caused by reusing samples from previous frames and is very useful for comparison purposes. Setting the slider to 1 or higher greatly reduces the amount of noise at the expense of a blurrier image and has no impact on the framerate: you can, for example, quadruple the effective number of samples per pixel (thereby halving the amount of noise) for free, and if the framerate is high enough, the blurring isn't even that distracting. Setting the slider to 12 with just 4 spp results in a very high quality image (effectively more than 100 spp) at high framerates, but at the expense of lots of motion blur when the car or camera is moving

There are no acceleration structures used (not even a bounding box around the car or the robot). Implementing those should give a nice ray tracing performance boost.
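
As a side note on what even a single bounding box would buy: the standard "slab" test below rejects a ray against an axis-aligned box with a handful of multiplies and comparisons, so all of the car's or robot's primitives can be skipped whenever the box is missed. This is a generic sketch with illustrative names, not code from the demo.

    #include <algorithm>

    struct Vec { float x, y, z; };

    // Slab test: the ray hits the box only if the parameter intervals where it
    // is inside the x, y and z slabs overlap in front of the ray origin.
    // inv_dir holds 1 / direction per component (precomputed once per ray).
    bool hit_aabb(const Vec& orig, const Vec& inv_dir,
                  const Vec& box_min, const Vec& box_max)
    {
        float t1 = (box_min.x - orig.x) * inv_dir.x;
        float t2 = (box_max.x - orig.x) * inv_dir.x;
        float tmin = std::min(t1, t2), tmax = std::max(t1, t2);

        t1 = (box_min.y - orig.y) * inv_dir.y;
        t2 = (box_max.y - orig.y) * inv_dir.y;
        tmin = std::max(tmin, std::min(t1, t2));
        tmax = std::min(tmax, std::max(t1, t2));

        t1 = (box_min.z - orig.z) * inv_dir.z;
        t2 = (box_max.z - orig.z) * inv_dir.z;
        tmin = std::max(tmin, std::min(t1, t2));
        tmax = std::min(tmax, std::max(t1, t2));

        return tmax >= std::max(tmin, 0.0f);   // hit in front of the ray origin
    }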

Monday, September 12, 2011

Video of Octane Render rendering in real-time on 8x GTX580s

I just saw a very impressive video on YouTube, showing real-time rendering of a complex interior scene with Octane Render using eight GTX 580s. Octane Render has very recently improved its "direct lighting/AO" kernel, which now includes a new ambient occlusion preset and a diffuse indirect lighting option (there are separate sliders for specular, glossy and diffuse path depth, and a slider for AO distance), so it is now capable of rendering very realistic looking interior scenes extremely fast:


Screengrab from YouTube:



Some observations:

- in contrast to the type of scene usually used to show off unbiased renderers (wide open outdoor scenes with shiny cars and lots of direct lighting from a skydome or sun/sky), this is an interior scene that is mostly lit by indirect lighting and contains many glossy surfaces

- the scene in the video contains more than 1.3 million triangles

- rendered at very high resolution (1500x1000 pixels)

- perfect scaling with number of GPUs (99-100% extra performance per additional GPU)

- while navigating through the scene, the image remains remarkably clear and recognizable without ever degenerating into a pixelated mess of big, blocky (and often black) pixels

- convergence to a noise-free image is extremely quick

It's not hard to imagine that this renderer will be truly real-time with the upcoming generation of GPUs: according to Nvidia, Kepler will be more than twice as fast at path tracing as Fermi, just like Fermi is 2-4x as fast as Tesla (GT200) thanks to caches and other improvements (see http://www.youtube.com/watch?v=0IC2NIogWR4), and AMD's Graphics Core Next will be much more focused on GPGPU computing than previous architectures. These graphics can be rendered noise-free in real-time at high resolution with techniques like adaptive sampling, image reconstruction, compressed sensing (currently a hot topic in medical imaging), edge-aware filtering of indirect lighting (e.g. the à-trous wavelet noise filter), extraction of spatiotemporal coherence with reprojection, path regeneration, reusing samples with frame averaging, and frameless rendering. Rendering complex, 100% photorealistic scenes in real-time is much closer than commonly believed, and cloud rendering will play a key role in accelerating this process.
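
To make the edge-aware filtering idea a bit more concrete, here is a heavily simplified single pass of an à-trous wavelet filter over a luminance buffer. This is only a sketch under simplifying assumptions: the actual filter of Dammertz et al. runs several passes with growing step sizes and also uses normal and world-space position buffers in its edge-stopping weights, whereas this version keeps just the color term.

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // One pass of a simplified a-trous filter: a 5x5 B3-spline kernel whose
    // taps are spread apart by 'step', with an edge-stopping weight that keeps
    // large color differences (edges) from being blurred away.
    std::vector<float> atrous_pass(const std::vector<float>& img,
                                   int width, int height,
                                   int step,            // 1, 2, 4, ... per pass
                                   float sigma_color)   // edge-stopping strength
    {
        const float k[5] = { 1/16.f, 1/4.f, 3/8.f, 1/4.f, 1/16.f };
        std::vector<float> out(img.size());
        for (int y = 0; y < height; ++y)
        for (int x = 0; x < width;  ++x) {
            float center = img[y * width + x];
            float sum = 0.f, wsum = 0.f;
            for (int j = -2; j <= 2; ++j)
            for (int i = -2; i <= 2; ++i) {
                int sx = std::clamp(x + i * step, 0, width  - 1);
                int sy = std::clamp(y + j * step, 0, height - 1);
                float sample = img[sy * width + sx];
                float d = sample - center;
                float w = k[i + 2] * k[j + 2]
                        * std::exp(-(d * d) / (sigma_color * sigma_color));
                sum  += w * sample;
                wsum += w;
            }
            out[y * width + x] = sum / wsum;
        }
        return out;
    }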

UPDATE: another interior animation rendered on an Nvidia GTX 590 with Octane Render's new indirect diffuse lighting/AO kernel: http://www.youtube.com/watch?v=O2Aufjr44g4
Render time per frame: ~1 minute at 1024 samples per pixel (see the screenshot in this thread). This is absolutely amazing...

Wednesday, September 7, 2011

Unbiased Stunt Racer in HD!

I've changed the 'Unbiased Stunt Racer' demo a little bit: I've added the buildings from my Futuristic Buildings demo to the scene and added a function to adjust the speed of the physics simulation at runtime with the N and M keys. The platform and sphere also move at a speed that's independent of the framerate, so the game is still playable on high-end cards like the GTX 480 and higher. The new download contains 480p and 720p executables. I find it mindblowing that real-time path tracing is feasible today at HD resolutions using only one GPU.

Below is a 720p image of the new demo, rendered on an 8600M GT at 8 samples per pixel:



Some 480p screenshots (all captured on an 8600M GT):




A video will follow soon and the source code will also be uploaded. The code is very messy, with lots of obsolete comments and heaps of global variables, which isn't good programming practice, but it does the job. :)

The executable "Unbiased Stunt Racer HD Final" is available at http://code.google.com/p/tokap-the-once-known-as-pong/downloads/list

This will be one of my last demos involving only spheres and boxes as primitives. The limitations on the number of objects and shapes are starting to outweigh the benefits of these cheap-to-intersect primitives, so I'm moving to the wonderful world of triangles and BVHs soon.
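
As a taste of what that switch involves, the standard ray/triangle test is the Möller-Trumbore algorithm; the sketch below is a generic textbook version, not code from any of these demos.

    #include <cmath>

    struct Vec { float x, y, z; };

    Vec   sub(const Vec& a, const Vec& b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
    float dot(const Vec& a, const Vec& b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
    Vec   cross(const Vec& a, const Vec& b) { return { a.y * b.z - a.z * b.y,
                                                       a.z * b.x - a.x * b.z,
                                                       a.x * b.y - a.y * b.x }; }

    // Moller-Trumbore ray/triangle intersection: returns true and the hit
    // distance t if the ray (orig, dir) hits the triangle (v0, v1, v2).
    bool hit_triangle(const Vec& orig, const Vec& dir,
                      const Vec& v0, const Vec& v1, const Vec& v2, float& t)
    {
        const float eps = 1e-6f;
        Vec e1 = sub(v1, v0), e2 = sub(v2, v0);
        Vec p  = cross(dir, e2);
        float det = dot(e1, p);
        if (std::fabs(det) < eps) return false;      // ray parallel to triangle
        float inv_det = 1.0f / det;
        Vec tv = sub(orig, v0);
        float u = dot(tv, p) * inv_det;
        if (u < 0.0f || u > 1.0f) return false;
        Vec q = cross(tv, e1);
        float v = dot(dir, q) * inv_det;
        if (v < 0.0f || u + v > 1.0f) return false;
        t = dot(e2, q) * inv_det;                     // distance along the ray
        return t > eps;
    }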

UPDATE: source code is online at http://code.google.com/p/tokap-the-once-known-as-pong/downloads/list

UPDATE 2: sneak peek at an upcoming demo (robot and car will be user controlled and "shootable")



UPDATE 3: Finally found some time to make an HD video of Unbiased Stunt Racer HD:

720p "fly-through":


480p gameplay:

Executable and source can be found at http://code.google.com/p/tokap-the-once-known-as-pong/downloads/list




There is also a new paper and video on reducing Monte Carlo rendering noise called "Random parameter filtering" with amazing results (thanks to the ompf.org forum): http://agl.unm.edu/rpf/ The video is definitely worth the download and shows great potential for real-time path tracing.

Monday, September 5, 2011

Unbiased Stunt Racer!

For the last couple of days, I've been busy creating "Unbiased Stunt Racer", a new real-time path traced demo involving physics, a vehicle and a shooting robot. It's actually a mix of a platformer and a stunt driving game: first you have to reach a platform by driving the vehicle over a ramp, then you have to reach another platform, occupied by a robot that shoots at you, by driving over two rails where a nasty chrome sphere hinders the passage. The final goal is to push the robot off its cosy spot. The sunlight constantly changes position and color, casting nice soft shadows over the scene. Below is a walkthrough video (rendered on a GTS 450) and some screenshots:








The number of samples per pixel can now be altered at runtime with the O and P keys. Pressing space resets the car. I've converted some of the stratified sampling in the original code back to purely random sampling, resulting in a slightly noisier but more realistic image.
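
To illustrate the difference between the two sampling strategies, here is a small generic sketch (the helper names are mine, not from the demo code): stratified sampling jitters one sample inside each cell of an n x n grid over the pixel, which usually lowers variance for the same sample count, while purely random sampling just draws independent points.

    #include <cstdlib>

    struct Sample2D { float x, y; };

    float rand01() { return std::rand() / (float)RAND_MAX; }

    // Purely random: 'count' independent points in the unit square (the pixel).
    void random_samples(Sample2D* out, int count) {
        for (int i = 0; i < count; ++i)
            out[i] = { rand01(), rand01() };
    }

    // Stratified: one jittered point per cell of an n x n grid, n * n samples
    // in total, giving more even coverage of the pixel.
    void stratified_samples(Sample2D* out, int n) {
        for (int j = 0; j < n; ++j)
            for (int i = 0; i < n; ++i)
                out[j * n + i] = { (i + rand01()) / n, (j + rand01()) / n };
    }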

An executable demo for Unbiased Stunt Racer is available at http://code.google.com/p/tokap-the-once-known-as-pong/downloads/list

The biggest advantage of using real-time ray tracing/path tracing for these "games" (besides having higher quality graphics) is that almost anyone can immediately start creating simple games without worrying about lighting, shadows, multiple reflections, refractions and all combinations of these effects. Everything just works as expected, without any effort on the artist's side. It's like a "push button" game engine: just create the scene and the renderer takes care of everything else.

Thursday, September 1, 2011

New video of Unbiased Physics Games

Changed the ground color and brightened up the sky in preparation for an upcoming demo called "Unbiased Stunt Racer" (courtesy of an anonymous blog reader :). It looks a little bit more realistic now (480p video, rendered on a GTS 450):



"Unbiased Physics Games v1.0" executable demo at http://code.google.com/p/tokap-the-once-known-as-pong/downloads/list