For a long time, rasterization has dominated real-time rendering. That is not because rasterization is an especially good way to render; it is mainly because it is easy to accelerate in hardware. The basic rasterization algorithm is very simple, but getting good image quality out of it means patching it again and again. Want shadows? You need a dedicated algorithm. Want reflection and refraction? Each needs its own special technique, to say nothing of global illumination. This reminds me of the years when the "fixed-function rendering pipeline" was at its peak: graphics programmers had to tweak all kinds of parameters to add environment maps, bump maps and so on. Later, the programmable graphics pipeline was introduced, and those tricks were no longer necessary. Today, GPUs keep growing more powerful, with better performance and increasingly general-purpose compute capability; at the same time, the emergence of low-level graphics APIs gives graphics programmers far more freedom to control the GPU. All of this is pushing real-time rendering toward its next generation: real-time ray tracing. This is the biggest advance in game rendering technology in decades!

NVIDIA already unveiled its new Turing-architecture GPUs at SIGGRAPH 2018 in August, and a new generation of consumer GeForce graphics cards is about to arrive. The biggest highlights of the new architecture are undoubtedly hardware-accelerated real-time ray tracing and AI-based denoising of the ray tracing results. This article focuses on what the rendering pipeline defined by DirectX Ray Tracing looks like.

DirectX Raytracing (DXR for short) is not a whole new API; it is a new feature of DirectX 12. It mainly introduces four new things:

* Acceleration structures;
* A new command list method, DispatchRays(), used to kick off the ray tracing rendering process;
* A set of new shader types used in the ray tracing pipeline:
  * Ray generation shader
  * Intersection shader
  * Miss shader
  * Closest-hit shader
  * Any-hit shader
* Raytracing pipeline state.
The key parts are introduced in detail below.

Acceleration Structure

As the name suggests, to accelerate ray-scene intersection, the geometry in the scene needs to be organized in a special way, typically as the result of some spatial partitioning algorithm. In DXR this data structure is called the "Acceleration Structure". DXR is responsible for building it, which is done by calling the BuildRaytracingAccelerationStructure() function.

The acceleration structure has two levels:

* Top-level acceleration structures work at the object-instance level; each instance can carry a transform matrix.
* Bottom-level acceleration structures work at the geometry level; for a traditional triangle mesh, a bounding volume hierarchy (BVH) tree is used to manage the triangles.

When animation in the scene is updated, it is usually enough to rebuild only the top level, so updates are very efficient. There is also a related paper at this year's SIGGRAPH; have a look if you are interested: MergeTree: A Fast Hardware HLBVH Constructor for Animated Ray Tracing.

The two levels of an acceleration structure (image from NVIDIA)

DXR Shaders

Because the ray tracing computation process is completely different from rasterization, vertex shaders, geometry shaders and pixel shaders can all be set aside! DXR introduces a set of new shader types, listed above. Here is a simple example to get a feel for some of them. The shader code below implements the simplest possible ray traced rendering: wherever a model is hit, render red; where nothing is hit, render the background blue.
RWTexture2D<float4> gOutTex;

struct RayPayload
{
    float3 color;
};

[shader("miss")]
void MyMiss(inout RayPayload payload)
{
    payload.color = float3(0, 0, 1);
}

[shader("closesthit")]
void MyClosestHit(inout RayPayload data, BuiltInTriangleIntersectionAttributes attribs)
{
    data.color = float3(1, 0, 0);
}

[shader("raygeneration")]
void MyRayGen()
{
    uint2 curPixel = DispatchRaysIndex().xy;
    float3 pixelRayDir = normalize(getRayDirFromPixelID(curPixel));
    // RayDesc fields: Origin, TMin, Direction, TMax
    RayDesc ray = { gCamera.posW, 0.0f, pixelRayDir, 1e+38f };
    RayPayload payload = { float3(0, 0, 0) };
    TraceRay(gRtScene, RAY_FLAG_NONE, 0xFF, 0, 1, 0, ray, payload);
    gOutTex[curPixel] = float4(payload.color, 1.0f);
}
( Code from Chris Wyman, NVIDIA)

Let's look at the functions in this code:

* MyRayGen()
This function is a ray generation shader. Once the DispatchRays() API is called, it is launched automatically, somewhat like the main() function in C: it controls the whole ray tracing process. Typically it calls the new built-in shader function TraceRay() to perform the ray tracing computation; the result comes back through the last parameter, RayPayload payload in the example above. That struct holds a color value as the result of the shading computation, which is finally written to the render target.
* MyClosestHit()
Its type is closest-hit shader; this is where material shading is performed. It is worth noting that inside this shader you can still call TraceRay() for recursive ray tracing, or even call TraceRay() multiple times, for example to compute ambient occlusion by Monte Carlo sampling. In short, this is very flexible.
* MyMiss()
Its type is miss shader. As the name suggests, this shader is called when a ray does not hit any geometry in the scene. It is typically used to render the background, such as the sky.
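To make the sky idea concrete, here is a minimal sketch (hypothetical code, not from the original example) of a miss shader that renders a simple gradient from the ray direction; it assumes the same RayPayload struct as the example above.

```hlsl
struct RayPayload
{
    float3 color;
};

[shader("miss")]
void SkyMiss(inout RayPayload payload)
{
    // WorldRayDirection() is a DXR intrinsic returning the current ray's direction.
    // Map the direction's vertical component from [-1, 1] to [0, 1].
    float t = saturate(WorldRayDirection().y * 0.5f + 0.5f);
    // Blend from a white horizon to a blue zenith.
    payload.color = lerp(float3(1.0f, 1.0f, 1.0f), float3(0.3f, 0.5f, 1.0f), t);
}
```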
Two shader types are not used in the example:

* Intersection shader
This shader defines the intersection computation between rays and geometry. Ray-triangle intersection for traditional triangle meshes is provided by default; if you need to intersect rays with spheres, parametric surfaces or other special shapes, you can customize the computation with this shader.
* Any-hit shader
The name poses the question: any hit? When a ray collides with geometry, this shader is called to decide whether the hit really counts. It is typically used to implement alpha-mask effects.
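As a rough sketch of what these two shader types can look like (hypothetical code, not from the original article): an intersection shader for a unit sphere in object space, and an any-hit shader implementing an alpha mask. The gAlphaTex and gSampler resources and the getHitUV() helper are assumed here for illustration.

```hlsl
struct SphereAttribs
{
    float3 normal;
};

[shader("intersection")]
void SphereIntersect()
{
    // ObjectRayOrigin()/ObjectRayDirection() give the ray in object space;
    // assume a unit sphere centered at the object-space origin.
    float3 o = ObjectRayOrigin();
    float3 d = ObjectRayDirection();
    float a = dot(d, d);
    float b = dot(o, d);
    float c = dot(o, o) - 1.0f;
    float disc = b * b - a * c;
    if (disc >= 0.0f)
    {
        float t = (-b - sqrt(disc)) / a;   // nearest root of the quadratic
        if (t >= RayTMin() && t <= RayTCurrent())
        {
            SphereAttribs attr;
            attr.normal = normalize(o + t * d);
            // ReportHit() registers a candidate hit with the traversal.
            ReportHit(t, /*hitKind*/ 0, attr);
        }
    }
}

struct RayPayload
{
    float3 color;
};

Texture2D<float4> gAlphaTex;   // assumed alpha-mask texture
SamplerState gSampler;         // assumed sampler

[shader("anyhit")]
void AlphaMaskAnyHit(inout RayPayload payload,
                     BuiltInTriangleIntersectionAttributes attribs)
{
    // getHitUV() is a hypothetical helper interpolating the triangle's UVs
    // from the barycentric coordinates of the hit.
    float2 uv = getHitUV(attribs.barycentrics);
    // Sample with an explicit LOD: implicit derivatives are unavailable here.
    float alpha = gAlphaTex.SampleLevel(gSampler, uv, 0).a;
    if (alpha < 0.5f)
        IgnoreHit();   // discard this hit and let traversal continue
}
```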
DXR Pipeline

The ray tracing pipeline (image from Chris Wyman, NVIDIA)

Once you understand what each of the shaders above computes, you can see the flow of the whole DXR pipeline at a glance. The green parts are implemented by the system; they can be GPU-accelerated, and we control them through the API. The blue parts are implemented by programmable shaders. The shaders listed above describe the computation for a single ray, and this process runs in parallel on the GPU. The shaders communicate with each other through the ray payload struct, a user-defined data structure. It is passed to the TraceRay() function as an inout parameter and can be modified in the closest-hit and miss shaders.

Conclusion

Isn't real-time ray tracing exciting? Are you ready to give it a try? If you are passionate about graphics technology and confident enough to take on the cutting edge of real-time rendering, there is a great opportunity calling you! The Ant Financial Graphics and Digital Art Lab is hiring graphics development experts and art designers, and we look forward to talented people of all kinds joining us! You can contact the blogger directly and send your resume to neil3d (at) <> . This is a sincere recruiting post; please help share it! See this document for the detailed job list and descriptions.

References

* GDC 2018: DirectX Raytracing
* Announcing Microsoft DirectX Raytracing!
* SIGGRAPH 2018 Course: “Introduction to DirectX Raytracing”
* D3D12 Raytracing Functional Spec v0.09
* D3D12 Raytracing Samples Github
* Introduction to NVIDIA RTX and DirectX Ray Tracing