
PQ Houdini Tutorial

Too many lights? I've been playing with lighting a large-scale spacecraft of mine, and one thing I've noticed is the large number of lights I now have. For example, on the top and bottom sides of the main hull I have a series of ten navigation lights (point lights), so 20 total (top and bottom) on the port side, and another 20 on the starboard side. There are another 6 spots on the top hull illuminating various bits, and another 12 area lights illuminating various internal docking bays. This seems like a lot, but it does give the effect I'm looking for. Are there any "tricks" to use when dealing with this many lights? The scene itself is somewhat large in scale (the model is about 1.7 km in length, with 1 Houdini unit = 1 meter). Current render times (for 1 frame) range from 4 minutes 50 seconds to 7 minutes, with the scene made of ~500K faces and rendered with the micropolygon renderer.

Spaces Definitions (from Odwiki) Spaces in VEX First of all, we need to make a subtle but important distinction: VEX, unlike languages like RenderMan's Shading Language (RSL), is not just a shading language; it is a generic, multi-context, "vector expression" language. This is great, but some concepts are not equally meaningful across all contexts. For example: what does the phrase "camera space" mean in the CHOP context? Spaces, then, fall into this category of concepts which, while meaningful in some contexts, are either ambiguous or simply not applicable in others. There are others, and we'll get to what they all mean in a second, but for now, it's important to get used to thinking about these names as simply labels -- they won't necessarily match your idea of what "world space" or "texture space" should mean. Spaces in the Shading Contexts: What follows is an overview of each one of these. World Space: All global variables available to the shading contexts are given in this space. Object Space. NDC Space.
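As a minimal VEX sketch of moving a position between these named spaces in a surface shading context (assuming the standard ptransform() and toNDC() helpers; the "space:" strings are the built-in space labels):

    // convert the global P from the space it arrives in ("space:current")
    // into the other named spaces
    vector p_world  = ptransform("space:current", "space:world",  P);
    vector p_object = ptransform("space:current", "space:object", P);
    vector p_ndc    = toNDC(P);   // normalized device coordinates, 0-1 across the image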

OpenVDB Christian Schnellhammer . TD Blog Houdini Ocean Toolkit for H12 Training Video For Houdini Flocking Systems on cmiVFX cmiVFX just released my flocking tutorial: store.cmivfx.com/tutorials/view/330/Houdini+Flocking+Systems Check it out. This training focuses on a basic concept of flocking simulations: steering behaviors. This step-by-step training video introduces how steering behaviors work and how one can create complex simulations just by combining simple behaviors. As a bonus, this tutorial comes with a powerful multi-agent system library which you can use to create flocking and crowd simulations. Houdini Closest Primitive VEX Function This Houdini 11 plugin extends VEX with a function to retrieve the closest primitive to a given point. Usage: prim = csClosestPrimitive(file, pos, maxDistance, &closestPoint, &closestPrimNormal, &primU, &primV); Parameters: There is also a VOP node available. Download source: csVEXClosestPrimitive_src.zip Binaries: csVEXClosestPrimitive_H11_win_x64.zip Showreel 2011 Projects shown: Downloads:
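For the Closest Primitive VEX function above, a hedged sketch of how a call might look inside a surface shader, mirroring the usage line from the post (the variable names, types and the meaning of the return value are illustrative; the & in the post simply marks the write-back outputs, which receive the hit point, normal, and parametric uv):

    string file = "collision.bgeo";        // geometry to search (placeholder path)
    float  maxDistance = 10.0;             // ignore primitives further away than this
    vector closestPoint, closestPrimNormal;
    float  primU, primV;
    int prim = csClosestPrimitive(file, P, maxDistance,
                                  closestPoint, closestPrimNormal, primU, primV);
    if (prim >= 0)                         // assuming a negative result means "nothing within range"
        Cf = set(primU, primV, 0);         // e.g. visualize the parametric uv of the hit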

eetu's lab - od[forum] - Page 5 diula, on Feb 17 2009, 10:13 PM, said: Thanks for the info! Still it's a bit unclear: what do you mean by resimulation? Are you moving the new particles according to the velocity in the previous frame? Or are you averaging the 'v' and 'P' values from the neighboring points for each point - if that makes any sense? The basics are in the original post; Quote The method I ended up with was simple; the new particles get their velocity from nearby particles and move according to that. The transfer recipients are the new particles, and the source is the original Houdini-simulated particles in the neighborhood. eetu.
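A minimal modern sketch of that neighbour-velocity transfer, written as a point wrangle run over the new particles (the file path, radius, and point count are placeholder values, not eetu's actual settings):

    int   maxpts = 8;
    float radius = 0.5;
    // look up the already-simulated particles around each new particle
    int handle = pcopen("sim_particles.bgeo", "P", @P, radius, maxpts);
    vector v_avg = pcfilter(handle, "v");   // weighted average of the nearby velocities
    pcclose(handle);
    v@v = v_avg;
    @P += v_avg * @TimeInc;                 // move the new particle according to the borrowed velocity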

Deforming RBDs Simple setup that uses a SOP solver and deforms the geo at points of impact. The impact data is already available in the SOP solver: the normal is the impact direction, and there is a pscale attribute for the magnitude of the impact. In this setup I just used an attributeTransfer to get the impact data onto the mesh, then a vopsop to deform the mesh. Houdini - Impact deformation for RBDs from Sam Hancock on Vimeo. Also going to try getting some volume preservation going!

Peter Claes Vfx Hi, welcome to my blog. This will be a blog about Houdini, visual effects and some of my other effects-related interests. I would like to keep the tone of this blog light and fun, but with in-depth information about how to achieve certain effects. I hope you will find some of the information (technical or otherwise) interesting! I have no idea yet how often I will be able to update this blog; time will tell. If you feel like replying to some of the posts, please keep it civilized. Disclaimer: none of the information I share on this blog represents any opinions of my employers; it is completely my own. That’s it for now, time for some content! Peter Claes
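Going back to the RBD impact deformation above, a minimal sketch of the deform step as a point wrangle inside the SOP solver, assuming the impact direction has already been transferred into N and the magnitude into pscale ("strength" is an illustrative multiplier, not from the original setup):

    float strength = 0.1;
    // push points along the impact direction, scaled by the impact magnitude;
    // flip the sign if the transferred normal points the other way
    if (f@pscale > 0)
        @P -= normalize(v@N) * f@pscale * strength;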

Mantra VS Mantra VS Etc... Recently I have begun to evaluate and update our rendering pipeline to work with H11, in particular to look at perhaps starting to use PBR and photon maps for indirect lighting. A bit of background. Our current method has been to use the Gather raytracing mechanism to look for specific exports as opposed to Cf (for control and optimization purposes). This has been successful so far in matching (H10) PBR renders/speed with the regular Raytrace renderer, and was faster when optimizations were in use (like max distance, object scopes, not gathering expensive stuff, etc.). It's amazing that the flexibility to do this stuff in Houdini exists, and although I have a great deal of fun doing this, I'd rather Mantra had the V-Ray/Modo-like lighting accelerators to begin with. It's quite a lot of work for me to create and support this method, when I should probably focus more of my ubershader maintenance time on building better shading interfaces for my users. My test environment started as a Cornell box...
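As a rough illustration of the kind of gather-based lookup described above (not the author's actual shader), a VEX gather loop can import a named export from whatever the ray hits instead of Cf; here "gi_export" and the "gi_objects" scope are hypothetical names, and the bias/maxdist values are placeholders:

    vector nI  = normalize(I);
    vector nN  = normalize(N);
    vector dir = reflect(nI, nN);           // sample direction; a real shader would loop over many
    vector hitval = 0;                      // receives the named export from the hit surface
    vector gi = 0;
    int    nhits = 0;
    gather(P, dir, "bias", 0.01, "maxdist", 1000.0, "scope", "gi_objects",
           "gi_export", hitval)
    {
        // the body runs once per ray that hits something in scope
        gi += hitval;
        nhits++;
    }
    if (nhits > 0)
        gi /= nhits;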

michael rice misc » COP2_tonemap A compositing operator for Side Effects Houdini. Based on the method described by Jiang Duan, Guoping Qiu and Min Chen in “Comprehensive Fast Tone Mapping for High Dynamic Range Image Visualization” and several other papers. The idea is that first the HDR luminance is compressed to display luminance using a logarithmic mapping weighted by a user control (brightness). The compressed luminance is then blended between a linear mapping and simple histogram equalization by a “contrast” control. Enough rambling; in the file below you’ll find the source and binaries for Linux gcc 4.1, OSX 10.5, win32, and win64. Copy the binary to the dso folder in your Houdini home directory. Copy tonemap.txt and COP2_tonemap.png to help/nodes/COP2 in your Houdini home directory. To compile from source, run “hcustom COP2_tonemap.C” from inside the src directory. COP2_tonemap_1.0
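For illustration of the logarithmic compression step described above, one common way to write such a mapping as a VEX function (not necessarily the exact formula from the paper or the operator; "brightness" plays the role of the user weight, and the min/max luminances are assumed to be measured over the frame beforehand):

    float compress_log(float L; float Lmin; float Lmax; float brightness)
    {
        // a larger "brightness" value lifts the mid-tones of the curve
        float tau = max(brightness, 1e-6);
        return (log(L + tau) - log(Lmin + tau)) / (log(Lmax + tau) - log(Lmin + tau));
    }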

The Cornell Box Diaries - od[forum] - Page 2 Day 4: And so we begin day 4 with an interesting turn of events. For some esoteric reason, no matter how hard I tried, I could not get rid of the noise in the render, regardless of how high I turned up my irradiance sampling quality. In fact, I could not detect a noticeable difference in the prettiness of the render at all... all it did was take longer (8, 64, 128, 256 settings). So I discussed this with someone here at work (Erich) and was told that you cannot have the full irradiance and photon maps activated in the same shader. With all this in mind, I have a new render on the go using only caustic photons and global photons, no irradiance. As soon as the render is done I shall post it here. Image 1: photons only (global/caustic); image 2: irradiance only. Last one for today. Here is the hip file, in case anyone is interested.

Approach to Lighting and Managing big Scenes Currently I don't have access to the H12 beta thread, so I'll reply here. Below I list how I got here, the system I used briefly that led us to this logic, and some thoughts on the rather random notes below. I don't have the time to clean this up to be very specific in my reply, but this should give some food for thought. You'll have to come up with a very decisive argument for SESI to modify the fundamental setup. HDAs are perfectly fine for rendering... if you use Mantra. For background, I'm coming off a project with FloqFX. I'm primarily a shading, lighting and rendering artist, which is one of the reasons I got rolled into this task. One of the biggest things we came across in our pipeline was one-offs and assets, along with not being able to manage all the properties and tasks in a method like Katana's. So, in order to hit a bunch more points in this thread, I'm going to explain more of how our setup worked and how we got around these issues. We had plenty of OTLs embedded within OTLs. Slash and hack!!!

Bubbles in Houdini with Flip solver | Peter Claes Vfx Rendered simulation: I am interested in fluids and in the interaction between different types of fluids, so I started looking into a way to create the interaction between air and water, specifically how to create bubbles. As of right now (Houdini 11), I found that the only way to do it with FLIP fluids is by using a single fluid object and processing the bubble particles differently from the liquid particles, even though the distinction is made by the user rather than by a density parameter. After defining an initial volume of “water” particles, “air” particles are emitted from groups of points at the bottom of the container. The lookup of the surrounding particles is achieved with a point cloud lookup in VOPs, then filtering and including points when the dot product between the positive Y axis and the direction vector to the neighbour is lower than a certain selection angle. Once the particles have been simulated and cached to disk, it was time to create a mesh out of them. bubbles
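A minimal point-wrangle sketch of the neighbour test described above (the radius, point count, and selection angle are placeholder values, and the dot product is compared against the cosine of the angle, which is one reasonable reading of the threshold):

    float radius = 0.2;
    int   maxpts = 16;
    float selection_angle = 60.0;               // degrees
    float cos_limit = cos(radians(selection_angle));
    int   count = 0;
    int handle = pcopen(0, "P", @P, radius, maxpts);
    while (pciterate(handle)) {
        vector npos;
        pcimport(handle, "P", npos);
        if (length2(npos - @P) < 1e-8)
            continue;                           // skip the particle itself
        vector dir = normalize(npos - @P);
        // include neighbours that are not roughly straight above this particle
        if (dot(dir, {0, 1, 0}) < cos_limit)
            count++;
    }
    pcclose(handle);
    i@filtered_neighbours = count;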

PhotonMapping (from Odwiki) Photon mapping is one method of performing BackwardsRayTracing. Rather than sending rays from a camera into the scene and then sampling the direct illumination, photons are sent from the light sources into the scene. So, in essence, photon mapping is a two-pass process: the first pass generates the photon map and the second pass uses the photon map to compute illumination. Typically, photon maps are stored in a k-d tree data structure. Photon Mapping in the Mantra PBR Engine: If you're rendering with PBR, much of the setup for photon mapping is implicitly defined in the way that PBR works, and so there is an easy and more accurate simulation of photons for direct and indirect light. To generate a photon map for the non-PBR render pipeline, you need to have an output driver dedicated to photon generation. Global Photon Map: This map is used to store photons which don't hit any specular surfaces. Caustic Photon Map. Photon Mapping in RenderMan. Photon Mapping in MentalRay.

Hosuk's personal website Point cloud and Brick map There are a couple of ways to bake 3D data. RenderMan provides two 3D bake systems: point clouds and brick maps. A point cloud is just a large group of points in 3D space. We could store just a simple noise color, but normally we want to use it for something heavier, so it can be re-used later without expensive calculation. So, I tried some expensive stuff like SSS, photon mapping, caustics, and final gathering. First I tried SSS and found that a point cloud is not only used to store data but is also used to calculate SSS or photon mapping via a stand-alone tool called ptfilter. Direct illumination point cloud: I made a point cloud file that stores just direct illumination data and used ptfilter to make another point cloud file that has subsurface scattering diffusion data: ptfilter -ssdiffusion -material marble direct_rad_t.ptc ssdiffusion.ptc Subsurface scattering diffusion point cloud. The next step was to make a brick map. The caustic point cloud didn’t have “_area” data.
