Philip Nemec

Resume available in HTML if you want the quick summary...


Interactive ClearCoat™ 360 application - SGI
A significant project of mine at SGI was working with automotive companies interested in interactive styling review. The two main components were effective paint simulation and efficient use of the hardware to handle high-fidelity models at interactive rates. This model from DaimlerChrysler is being rendered interactively on an Onyx2 InfiniteReality against a cube environment. The same environment was used in the paint simulation, so reflections and lighting match from any angle.

The technique used in this software has been granted a patent and was published and presented at SIGGRAPH '99 (Reflection Space Image Based Rendering) with two co-authors. My part in all this was the model processing (using OpenGL Optimizer™) and all of the interactive software.
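The heart of any environment-mapped rendering like this is the reflection-vector math: reflect the view direction about the surface normal, then use the result to sample the cube environment. A minimal sketch of that math, my own illustration rather than the patented technique itself:

```python
def reflect(incident, normal):
    """Reflect an incident direction about a unit surface normal: R = I - 2(N.I)N."""
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))

def cube_face(direction):
    """Select the cube-map face a direction falls on, by dominant axis."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+X" if x > 0 else "-X"
    if ay >= az:
        return "+Y" if y > 0 else "-Y"
    return "+Z" if z > 0 else "-Z"

# Looking straight down at an upward-facing surface reflects straight back up,
# so the lookup lands on the +Z face of the cube environment.
r = reflect((0.0, 0.0, -1.0), (0.0, 0.0, 1.0))
```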


From Space to In Your Face - SGI
My wife calls this the demo that won't die - I worked on various incarnations of it for pretty much my entire time at Silicon Graphics. This demo was designed and written to demonstrate InfiniteReality graphics (running at 60Hz) for the product launch. The original version featured massive texture paging to allow zooming in to the Matterhorn (imagery on 3D terrain) and even a 3D model resting there.

Later versions used clipmapping, an InfiniteReality feature that allows for sparsely populated textures of enormous size (paged from disk). This provided the ability to navigate anywhere on the globe, zooming from the entire globe down to areas with up to 0.5 meter per texel coverage. These later versions continued to include 3D models, with textures for the models paged in as needed.
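The clipmap idea can be sketched with two small calculations: how many pyramid levels are larger than the resident clip region (and so must be paged sparsely), and which level a given on-screen texel footprint calls for. A simplified illustration using generic mipmap math, not the actual InfiniteReality implementation:

```python
import math

def clip_levels(full_size, clip_size):
    """Count pyramid levels whose extent exceeds the resident clip region --
    these are the levels that must be sparsely paged from disk."""
    levels = 0
    size = full_size
    while size > clip_size:
        size //= 2
        levels += 1
    return levels

def level_for_footprint(finest_m_per_texel, footprint_m):
    """Pick the pyramid level whose texel size best matches the ground
    distance covered by one screen pixel (texel size doubles per level)."""
    return int(max(0.0, math.log2(footprint_m / finest_m_per_texel)))

# A 1M-texel-wide virtual texture with a 2048-texel clip region
# has 9 levels too large to keep fully resident.
n = clip_levels(1048576, 2048)
```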

I was one of two programmers for the initial version and the sole programmer for the clipmap version (with lots of interaction with other SGI teams to effectively utilize the hardware). I also worked on tools to process this ever-growing amount of texture data - tools that were later much expanded and improved (by someone else) and distributed to customers.

This demo turned into a product (Keyhole) - check out the terrain tool images further down the page. And then Keyhole turned into Google Earth...



C-Galaxy for Linux - Aechelon
Aechelon produced an interactive planetarium application for the seven-projector dome at the Hayden Planetarium in New York. This system uses an Onyx2 with 7 graphics pipes and 8 gigabytes of memory.

I took that application and added star paging (from disk) to reduce the memory footprint and increase the possible number of stars. I used this technique to page galaxies as well. The image on the left shows galaxies (from the Redshift catalog) scaled in size (40X) and randomly textured with images from the Hubble Space Telescope. [The sizes are based on the catalog, the image is chosen randomly.]

I did much work to enable scalability (use whatever graphics power is available) and to support immense volumes of stars and galaxies (I used half a million stars and galaxies from the Hipparcos star catalog and the Redshift galaxy catalog). I also improved the motion model to provide additional device support and simpler configurability and to enable smooth motion from planetary speeds up to flying through galaxies.
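One common way to structure this kind of paging is by spatial cells: keep only the cells around the camera resident, and load/unload as the camera moves. A hypothetical sketch of that bookkeeping - the cell size and radius are made-up parameters, not values from C-Galaxy:

```python
def cell_of(position, cell_size):
    """Integer cell coordinates containing a 3D position."""
    return tuple(int(c // cell_size) for c in position)

def resident_cells(camera_pos, cell_size, radius_cells):
    """The set of cells that should be paged in around the camera."""
    cx, cy, cz = cell_of(camera_pos, cell_size)
    r = radius_cells
    return {(cx + dx, cy + dy, cz + dz)
            for dx in range(-r, r + 1)
            for dy in range(-r, r + 1)
            for dz in range(-r, r + 1)}

def update_paging(loaded, camera_pos, cell_size, radius_cells):
    """Return (cells to load, cells to unload) given the currently loaded set."""
    want = resident_cells(camera_pos, cell_size, radius_cells)
    return want - loaded, loaded - want

# Starting cold at the origin with radius 1 pages in a 3x3x3 block of cells.
to_load, to_unload = update_paging(set(), (0.0, 0.0, 0.0), 10.0, 1)
```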

A big part of this work was done to enable a Linux port (which I did), thus significantly reducing the minimum hardware cost. The snapshots shown were running on a prototype NVIDIA GeForce3 board at 25Hz.



C-Nova Lite (Linux Water demo) - Aechelon
This demonstration was originally developed to run at 60Hz on two InfiniteReality graphics pipes (ganged together using DPLEX to produce a single output channel). My work was to port this to Linux and to take advantage of some of the multitexture and shading features available on NVIDIA's GeForce2 and beyond. This shows in the water - multiple textures are drawn (in a single pass) to provide the surface texture and the reflected environment. The environment reflection (cubemap) is updated in real time to match what is seen out the window (for example, the reflected sun in the right image).



Tools - Aechelon
These are snapshots of some tools I wrote at Aechelon, all using FLTK (a cross-platform GUI toolkit). The two right images are of the Configurator - a tool designed to make configuring Aechelon's various applications much easier. Essentially the Configurator is a front end for editing text (tag based) configuration files. Using the Configurator made typos much less of an issue as well as providing range checking and help information.

A big motivation for developing such a tool was all the work I've done configuring multiple-channel simulators (something I did quite a bit of at SGI as well). The middle image shows some of the settings for one channel; the right image shows a graphical front end written to significantly automate the settings for all the channels.

The leftmost image is of the cultural feature Planter, part of Aechelon's C-Genesis suite of database generation tools for flight simulation. This tool uses the same datasets as the flight simulator to enable exact planting of features (buildings, trees, etc.) on the terrain imagery (clipmap). The Planter also uses OpenGL Performer (for most of the window area), just like the flight simulator. Tool features include cut/paste, random and uniform planting (and thinning), import/export, and UTM and lat/long placement. To manage the large number of features in a typical database (10 degrees by 10 degrees), various cell-based LOD structures are used.
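Cell-based management of planted features usually combines two ingredients: bucketing features into cells so whole cells can be culled at once, and picking a representation per feature by viewer distance. A hypothetical sketch - the cell size and LOD ranges are invented, not C-Genesis values:

```python
def bucket_features(features, cell_deg=0.25):
    """Group (lat, lon) features into lat/long cells so distant cells
    can be skipped wholesale."""
    cells = {}
    for lat, lon in features:
        key = (int(lat // cell_deg), int(lon // cell_deg))
        cells.setdefault(key, []).append((lat, lon))
    return cells

def lod_for_distance(d, model_range=2000.0, billboard_range=10000.0):
    """Pick a representation for a feature by viewer distance (meters):
    full 3D model up close, flat billboard at mid range, culled beyond."""
    if d < model_range:
        return "model"
    if d < billboard_range:
        return "billboard"
    return "culled"

# Two trees in the same quarter-degree cell, one building a few cells away.
cells = bucket_features([(10.10, 20.10), (10.11, 20.12), (12.00, 20.00)])
```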

Both these tools were developed and are used on both Irix and Linux.



Terrain tools - Keyhole
Unfortunately the tools I produced for Keyhole are all command-line based, so they don't really provide pretty pictures. Fortunately, the terrain my tools produced looks nice in the Keyhole client (formerly called EarthViewer3D, now Google Earth).

I was contracted by Keyhole to integrate a terrain tessellation and reduction algorithm with an image processing pipeline and an in-house streaming web mechanism. These images show just two little areas of the whole-world coverage processed by the tool. In addition to running on both Irix and Linux, the terrain tool runs concurrently on multiple machines and takes advantage of multiple-CPU systems.
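Terrain reduction algorithms generally remove samples whose height can be predicted from their neighbors within an error tolerance, keeping detail only where the terrain actually bends. A 1D analogue of the idea - not the algorithm Keyhole used, which operated on full 2D tessellations:

```python
def simplify_profile(heights, max_error):
    """Greedily drop samples whose height lies within max_error of the
    straight line between their kept neighbors (1D terrain reduction)."""
    kept = list(range(len(heights)))
    changed = True
    while changed:
        changed = False
        for i in range(1, len(kept) - 1):
            a, b, c = kept[i - 1], kept[i], kept[i + 1]
            t = (b - a) / (c - a)
            interp = heights[a] * (1.0 - t) + heights[c] * t
            if abs(heights[b] - interp) <= max_error:
                kept.pop(i)  # sample b is redundant at this tolerance
                changed = True
                break
    return kept

# A constant slope followed by a cliff: the interior slope samples vanish,
# but the sample at the base of the cliff survives.
kept = simplify_profile([0.0, 1.0, 2.0, 3.0, 10.0], 0.1)
```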


AGP Fast - NVIDIA
I was tasked with turning a technology demo into a marketing tool to demonstrate peak rates for AGP 8X. This demo pages 18 textures every frame (from system memory to graphics memory) in order to saturate the AGP bus (and showing the short video clips is good eye candy). The benchmark did well at approaching the theoretical peak rate of 2000 MB/s. Unreleased additional features include front-side bus speed measurements (from system memory to system memory) and testing memory bandwidth contention (performing both front-side bus and AGP paging tests simultaneously).

Quality evaluation tools - NVIDIA
I extended and created a variety of tools to evaluate quality and allow precise competitive quality analysis. The tool shown is Aniso Fly - in addition to the original purpose of showing ATI's rotation fallback with anisotropic filtering, the tool also evaluates different texture filter modes (with both single and dual texturing), mipmap level calculations and more.

Film Production Support - NVIDIA
As part of the team working on Gelato (NVIDIA's GPU-accelerated film renderer), I supported a variety of customers. This particular image is from Anibrain, a company I supported both remotely and on site in Mumbai, India - it's from one of the shots they did for Resident Evil 3. This shot included close-ups of the exterior and a fly-through of the interior, and it took some work to get the desired quality (including ambient occlusion computations for each frame) in a reasonable amount of time. The small render farm I set up in Berkeley turned out to be a big help in debugging their issues.

Renderer scalability - NVIDIA
One of the themes of my development tasks on the Gelato team is scalability. I've spent quite a bit of time working on scaling across multiple CPUs and multiple GPUs. There are lots of performance and correctness issues that make working with multiple CPUs a worthy challenge; developing for multiple GPUs adds heterogeneous memory, multiple programming languages, device drivers, and more. I enjoy the challenge and the results - this particular image is a visualization of which tiles are rendered on which GPU (shown at SIGGRAPH).
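Assigning image tiles to GPUs can be as simple as a static interleave across the frame. A hypothetical sketch of one such scheme - the actual Gelato scheduling isn't described here, and a real renderer might instead balance dynamically by tile cost:

```python
def assign_tiles(width_tiles, height_tiles, num_gpus):
    """Round-robin assignment of image tiles to GPUs.
    Returns {gpu_index: [(tx, ty), ...]}."""
    buckets = {g: [] for g in range(num_gpus)}
    for ty in range(height_tiles):
        for tx in range(width_tiles):
            gpu = (ty * width_tiles + tx) % num_gpus
            buckets[gpu].append((tx, ty))
    return buckets

# A 4x2 tile grid split across two GPUs: each gets four tiles,
# interleaved like a checkerboard along each row.
buckets = assign_tiles(4, 2, 2)
```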

Render wrangler - NVIDIA
I set up a small render farm (small enough that we described it as more of a petting zoo) in the Berkeley office and then proceeded to use and manage it remotely (from the Santa Clara office). The initial setup had a small enough number of machines that I just used shell scripts to distribute the frames. In addition to testing with in-house scenes (and support scenes like the one above from Anibrain), I ran "jobs" for Little Red Robot on their "Hurt" music video project. Once we added machines and needed more artist control for The Plush Life, we used render farm management software from PipelineFX called Qube! - it made our lives easier and we helped them debug their Gelato support.
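Distributing frames across a handful of machines really is simple enough for shell scripts; the same round-robin idea fits in a few lines of Python (host names here are placeholders, and the original scripts' exact scheme isn't described):

```python
def distribute_frames(first, last, hosts):
    """Split an inclusive frame range across render hosts, round-robin.
    Returns {host: [frame, ...]}."""
    jobs = {h: [] for h in hosts}
    for i, frame in enumerate(range(first, last + 1)):
        jobs[hosts[i % len(hosts)]].append(frame)
    return jobs

# Five frames across two machines: odd frames on one, even on the other.
jobs = distribute_frames(1, 5, ["rendernode1", "rendernode2"])
```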