Render System update

Since it's the holidays now, I've had the chance to really knuckle down and get to work on my render system. I have been working out the kinks behind the scenes, taking my time getting to know how to use threads safely, prevent race conditions, and keep synchronization as seamless as possible. Now that all that is done I get to the fun part: actually making all the fancy graphics.

 

Sorted Transparency 3

Image displaying the correctly sorted draw call order on transparent surfaces

 

1000NSpheres

An image displaying over 5000 sorted draw calls, totalling over 20 million triangles, rendered at 47 fps while running the NVIDIA Nsight graphics debugger. I can make this go faster yet!

Deferred Screen Space Realtime Reflections

What is it and how does it work?

Screen space reflections are a real-time post effect used in some modern games to give shiny surfaces a more convincing look. It works by marching a ray through screen space and, at each step along the ray, comparing the ray's depth against the depth already stored for that pixel; where the two meet, the reflected colour can be sampled. The picture below should help describe it.

How it works
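
To make that a bit more concrete, here is a rough GLSL sketch of the ray march. It assumes a deferred G-buffer with view-space positions and normals, and all of the names, step counts, and thresholds here are illustrative only; my actual shader is linked in the Source section below.

```glsl
// Minimal screen-space-reflection ray march (illustrative names, not the shader linked below).
#version 330

uniform sampler2D uViewPosTex;    // view-space positions from the G-buffer
uniform sampler2D uViewNormalTex; // view-space normals from the G-buffer
uniform sampler2D uSceneColor;    // lit scene colour to reflect
uniform mat4 uProjection;         // camera projection matrix

in vec2 vTexCoord;
out vec4 fragColor;

// Project a view-space point into [0,1] texture coordinates.
vec2 toScreen(vec3 viewPos)
{
    vec4 clip = uProjection * vec4(viewPos, 1.0);
    return (clip.xy / clip.w) * 0.5 + 0.5;
}

void main()
{
    vec3 pos    = texture(uViewPosTex, vTexCoord).xyz;
    vec3 normal = normalize(texture(uViewNormalTex, vTexCoord).xyz);

    // Reflect the view ray (the camera sits at the origin in view space).
    vec3 rayDir = reflect(normalize(pos), normal);
    vec3 rayPos = pos;
    vec4 reflected = vec4(0.0);

    // Step along the ray; at each step compare the ray's depth against the
    // depth stored in the G-buffer at that screen position.
    for (int i = 0; i < 32; i++)
    {
        rayPos += rayDir * 0.1;                 // fixed step size for simplicity
        vec2 uv = toScreen(rayPos);
        if (uv.x < 0.0 || uv.x > 1.0 || uv.y < 0.0 || uv.y > 1.0)
            break;                              // left the screen: no data available

        float sceneDepth = texture(uViewPosTex, uv).z;
        // The ray has just passed behind a surface if it is farther from the
        // camera than the geometry there, but not by too much (thickness check).
        if (rayPos.z < sceneDepth && sceneDepth - rayPos.z < 0.2)
        {
            reflected = vec4(texture(uSceneColor, uv).rgb, 1.0);
            break;                              // hit: sample the scene colour here
        }
    }

    fragColor = reflected;                      // alpha 0 means "no hit"
}
```

The step size, step count, and thickness threshold are the main quality/speed trade-off, which is exactly the "the more accurate, the slower" con listed below.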

Pros:

  •  Looks freaking gorgeous!
  • Makes shiny surfaces far more believable, because everything is influenced by everything else, giving the user better visuals and a stronger sense of depth.
  • Reasonably fast for decent results

Cons:

  •  Inaccurate data
  • Data is limited to screen space
  • The more accurate it is, the slower it gets
  • Because it works in screen space, geometry facing away from the camera does not exist

This image demonstrates some of the conditions under which the system fails.

In such situations you could probably fall back to a cube map for the reflected colour.

Pitfalls

Source:

Here is the GLSL fragment shader code that I wrote; I hope it helps you understand it.

https://github.com/JackMcCallum/Portfolio/blob/master/Code/Example_DeferredSSRR.h

Results:

Results

Deferred Rendering

On Tuesday I decided to begin a really brute-forced prototype of how I want my engine's rendering system to look. I stayed up late and the results are in! I have been thinking about my lighting strategies a lot recently, and one thing in particular is that I want to make the ambient pass look a lot less boring than a solid shade of colour. To tackle this I am multiplying Screen Space Ambient Occlusion with a colour blended according to the direction a surface is facing: if it faces the ground it gets one hundred percent of the ground colour, and if it faces the sky it gets one hundred percent of the sky colour. If you were in the desert, the ground colour would have an orange tint and the sky colour a blue tint.

For those who don't know, screen space ambient occlusion is a technique used in modern games to compensate for the lack of global illumination: it uses an image of the scene's surface normals and positions to find crevices and darken them, simulating the reduced light bouncing around in those areas.
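
Here is a rough GLSL sketch of that ambient term, just to show the idea. It assumes the G-buffer provides world-space normals and that the SSAO result has already been rendered to its own texture; the names are placeholders rather than my actual code.

```glsl
// Hemisphere ambient blended by surface direction, multiplied by SSAO (placeholder names).
#version 330

uniform sampler2D uNormalTex;  // world-space normals from the G-buffer
uniform sampler2D uSSAOTex;    // occlusion factor: 1.0 = fully open, 0.0 = fully occluded
uniform vec3 uSkyColour;       // e.g. a blue tint
uniform vec3 uGroundColour;    // e.g. an orange tint in a desert

in vec2 vTexCoord;
out vec4 fragColor;

void main()
{
    vec3 normal = normalize(texture(uNormalTex, vTexCoord).xyz);
    float ao    = texture(uSSAOTex, vTexCoord).r;

    // normal.y is 1 facing straight up and -1 facing straight down;
    // remap to [0,1] so up gets 100% sky colour and down gets 100% ground colour.
    float upness = normal.y * 0.5 + 0.5;
    vec3 ambient = mix(uGroundColour, uSkyColour, upness);

    fragColor = vec4(ambient * ao, 1.0);
}
```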

You can see the results of what I just described in the following picture:

Image

 

Try it for yourself: https://dl.dropboxusercontent.com/u/62003039/Hosted%20Files/AmbientPass.7z

 

Then I render each light in the scene on top of the ambient pass, using a high dynamic range buffer to hold the new data. After the lights are all rendered, I filter out the colours that exceed white, blur them, then add them back to get the light bloom effect. Finally I end up with the following:

Image

No executable file for this yet, sorry.
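
For anyone curious, the bright-pass filter step mentioned above looks roughly like the GLSL below. The blur and the final additive combine happen in separate passes, and the texture and uniform names are placeholders rather than my actual code.

```glsl
// Bright-pass: keep only the part of the HDR colour that exceeds "white" (placeholder names).
#version 330

uniform sampler2D uHDRScene;   // HDR buffer holding the ambient + light passes
uniform float uThreshold;      // e.g. 1.0 - anything brighter than this blooms

in vec2 vTexCoord;
out vec4 fragColor;

void main()
{
    vec3 colour = texture(uHDRScene, vTexCoord).rgb;

    // Subtract the threshold so only the over-bright portion survives;
    // this result is then blurred and added back onto the scene.
    vec3 overBright = max(colour - vec3(uThreshold), vec3(0.0));

    fragColor = vec4(overBright, 1.0);
}
```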

First blog ever

Introduction

For a first post I would like to introduce myself to my readers! My name is Jack McCallum, and as of 2014 I am a student at the Academy of Interactive Entertainment, learning game development using mostly C++ with OpenGL and Visual Studio. In this blog I will be keeping track of the things I am covering in class time as well as the products/applications I have worked on, providing links to small applications that anyone can have a go at, along with some pictures and videos I capture along the way.

 

N-Gine

This is the main project I am currently working on. A lot of my smaller projects have hit walls due to my lack of knowledge in certain areas, which has required me to take a step back and start working on the elements of a game engine from rock bottom, so I can obtain a strong understanding of how each individual thing works. I am calling this project N-Gine, short for Nugget Engine, based on a nickname another programming buddy has given me. As of today, N-Gine is less of an engine and more of a small framework consisting of a bunch of wrapper classes for OpenGL with a few extra perks such as logging and mesh loading; things such as window handling and texture loading are handled by third-party libraries such as GLFW and SOIL. I am trying to polish the really low-level foundation as much as I can before I move on to the next level of abstraction.

An executable will not be provided for this yet. Hang in there and I will release one soon enough.

Asset provided by John Costello http://johncostelloart.com/

Image

 

PhysX

As part of the assignment my class is currently working on, I am required to use the PhysX libraries to create a physics scene simulation. I have not spent much time on this, but below is a picture of my progress on it. So far I have added 150 capsules, boxes, and spheres. PhysX has been designed well, so it's pretty straightforward to use. Because of this, I am also required to make my own implementation.

Demo link provided below.

Image
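
For anyone who hasn't touched PhysX before, the setup is roughly along the lines of the C++ sketch below, written against the PhysX 3.x API I am using. Error handling, cleanup, and the capsule/box variants are left out, and the exact values are just placeholders.

```cpp
// Rough PhysX 3.x setup: a scene with gravity, a ground plane, and a few dynamic spheres.
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(1);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // A plane to land on, plus a small stack of dynamic spheres.
    PxMaterial*    material = physics->createMaterial(0.5f, 0.5f, 0.3f);
    PxRigidStatic* ground   = PxCreatePlane(*physics, PxPlane(0.0f, 1.0f, 0.0f, 0.0f), *material);
    scene->addActor(*ground);

    for (int i = 0; i < 10; i++)
    {
        PxRigidDynamic* sphere = PxCreateDynamic(*physics,
            PxTransform(PxVec3(0.0f, 2.0f + i * 1.5f, 0.0f)),
            PxSphereGeometry(0.5f), *material, 1.0f);
        scene->addActor(*sphere);
    }

    // Step the simulation at 60 Hz.
    for (int frame = 0; frame < 600; frame++)
    {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }
    return 0;
}
```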

 

Custom Physics

As I said above, I am required to build my own physics implementation. I have been working on this the most, doing my best to understand the simplified pseudo-algorithms used to build real-time physics engines. So far I have sphere-sphere collision detection with linear collision response.
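
The core of it boils down to something like the C++ sketch below, which uses glm for the vector maths; the structure and names here are illustrative rather than taken from my actual code.

```cpp
// Sphere-sphere collision detection with a simple linear impulse response.
#include <glm/glm.hpp>

struct Sphere
{
    glm::vec3 position;
    glm::vec3 velocity;
    float     radius;
    float     mass;
};

// Detect overlap and, if found, separate the spheres and exchange momentum
// along the line between their centres (linear response: no friction or spin).
void resolveSphereSphere(Sphere& a, Sphere& b)
{
    glm::vec3 delta    = b.position - a.position;
    float     distance = glm::length(delta);
    float     overlap  = (a.radius + b.radius) - distance;

    if (overlap <= 0.0f || distance == 0.0f)
        return; // not touching (or perfectly coincident - skip that degenerate case)

    glm::vec3 normal = delta / distance;

    // Push the spheres apart in proportion to their masses so they no longer overlap.
    float totalMass = a.mass + b.mass;
    a.position -= normal * overlap * (b.mass / totalMass);
    b.position += normal * overlap * (a.mass / totalMass);

    // Relative velocity along the collision normal.
    float relVel = glm::dot(b.velocity - a.velocity, normal);
    if (relVel > 0.0f)
        return; // already separating, no impulse needed

    // Impulse magnitude for a perfectly elastic collision (restitution = 1).
    float restitution = 1.0f;
    float j = -(1.0f + restitution) * relVel / (1.0f / a.mass + 1.0f / b.mass);

    a.velocity -= (j / a.mass) * normal;
    b.velocity += (j / b.mass) * normal;
}
```

Because the impulse is applied only along the line between the two centres, the response stays purely linear, which matches what I have working so far.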

Demo link provided below.

Image

 

Link to built executables for you to try:

https://dl.dropboxusercontent.com/u/62003039/Hosted%20Files/Demos%2019-5-14.7z