How To: Create an Orthographic Projection Matrix for OpenGL

This version of the code uses the Matrix4f class from LWJGL Utils, but you can obviously use any 4×4 matrix class you’d like:

// Method to create and return a 2D orthographic projection matrix 
public Matrix4f createOrthoProjectionMatrix(float left, float right, float top, float bottom, float near, float far)
{
	Matrix4f m = new Matrix4f();
 
	m.m00 = 2.0f / (right - left);
	m.m01 = 0.0f;
	m.m02 = 0.0f;
	m.m03 = 0.0f;
 
	m.m10 = 0.0f;
	m.m11 = 2.0f / (top - bottom);
	m.m12 = 0.0f;
	m.m13 = 0.0f;
 
	m.m20 = 0.0f;
	m.m21 = 0.0f;
	m.m22 = -2.0f / (far - near);
	m.m23 = 0.0f;
 
	m.m30 = -(right + left  ) / (right - left  );
	m.m31 = -(top   + bottom) / (top   - bottom);
	m.m32 = -(far   + near  ) / (far   - near  );
	m.m33 = 1.0f;
 
	return m;
}

To place the origin in the bottom-left corner of the window (positive y-axis pointing upwards) you’d then create a matrix like this:

Matrix4f orthographicProjectionMatrix = createOrthoProjectionMatrix(0.0f, windowWidth, windowHeight, 0.0f, -1.0f, 1.0f);

To place the origin in the top-left corner instead (positive y-axis pointing downwards), just swap the windowHeight and 0.0f arguments:

Matrix4f orthographicProjectionMatrix = createOrthoProjectionMatrix(0.0f, windowWidth, 0.0f, windowHeight, -1.0f, 1.0f);

As a final note, for 2D graphics sitting on z = 0.0f (you can easily have a 3D orthographic projection – for example in a CAD package) we typically specify the near and far values as -1.0f and 1.0f in a right-handed coordinate system such as OpenGL – setting them both to 0.0f can cause issues, so it's best not to.
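If you want to convince yourself the matrix is correct, here's a quick C++ sketch (the function names and layout helper are mine, not LWJGL's) that builds the same column-major matrix and checks that the corners of an 800×600 window land on the corners of normalised device coordinates:

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Column-major 4x4 orthographic projection matching the Matrix4f fields above
// (m[col * 4 + row]); parameter order here follows glOrtho:
// left, right, bottom, top, near, far.
std::array<float, 16> makeOrtho(float l, float r, float b, float t, float n, float f)
{
	std::array<float, 16> m{}; // zero-initialised
	m[0]  =  2.0f / (r - l);    // m00
	m[5]  =  2.0f / (t - b);    // m11
	m[10] = -2.0f / (f - n);    // m22
	m[12] = -(r + l) / (r - l); // m30 (x translation)
	m[13] = -(t + b) / (t - b); // m31 (y translation)
	m[14] = -(f + n) / (f - n); // m32 (z translation)
	m[15] =  1.0f;              // m33
	return m;
}

// Transform an (x, y, z, 1) point by a column-major matrix
std::array<float, 4> transform(const std::array<float, 16>& m, float x, float y, float z)
{
	return { m[0]*x + m[4]*y + m[8]*z  + m[12],
	         m[1]*x + m[5]*y + m[9]*z  + m[13],
	         m[2]*x + m[6]*y + m[10]*z + m[14],
	         m[3]*x + m[7]*y + m[11]*z + m[15] };
}
```

With left = 0, right = 800, bottom = 0, top = 600, the pixel (800, 600) should come out at (1, 1) and the origin at (-1, -1).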

2D Fire Effect in ActionScript 3

I had a little bit of time today to do some me-coding (as opposed to work-coding), so I knocked up a quick 90’s era 2D fire effect. I’d actually written it in C++ and OpenGL the other week and was meaning to port it to WebGL so it could run live rather than as a captured video, but my WebGL kung-fu is pretty weak at the moment, so just to get it done I translated it to Flash.

How it works

The effect itself is incredibly simple, as the video below explains. You randomly add “hot-spots” to the bottom of the pixel array, then the new temperature value for each pixel is just the average of the pixel and the three pixels below it, with a small amount subtracted so that the flames “cool” – the temperature values are then mapped to a colour gradient.

Sneaky!
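In case the description is too terse, here's a minimal C++ sketch of that update pass – the exact neighbourhood and cooling constant are my reading of the description above, not lifted from the original source:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// One cooling/averaging pass over a row-major temperature buffer (row 0 at
// the top; the bottom row holds the random "hot-spots"). Each pixel's new
// value is the average of itself and the three pixels below it, minus a
// small cooling amount, clamped so it never goes negative.
void firePass(std::vector<int>& temp, int width, int height, int cooling)
{
	for (int y = 0; y < height - 1; ++y) // skip the bottom (seed) row
	{
		for (int x = 0; x < width; ++x)
		{
			int below      = temp[(y + 1) * width + x];
			int belowLeft  = temp[(y + 1) * width + std::max(x - 1, 0)];
			int belowRight = temp[(y + 1) * width + std::min(x + 1, width - 1)];
			int avg        = (temp[y * width + x] + below + belowLeft + belowRight) / 4;
			temp[y * width + x] = std::max(avg - cooling, 0);
		}
	}
}
```

Run this once per frame, re-seed the bottom row with random hot-spots, then map each temperature to your colour gradient.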

Flash implementation

In the OpenGL version I’d made the colours for the flames into a 1D texture which can be easily interpolated, but I had to find some functions to do that in AS3 as there don’t appear to be 1D textures! Also, getting and converting the colours from uints and hex-codes and stuff was a pain – but I finally nailed it after googling around and pulling functions from various places (referenced in the source).

I think the most important thing I learnt from getting this working in Flash was how to and how NOT to convert colours from uints into red/green/blue components. For example, a lot of code online will use a function like the following to convert red/green/blue values into a 24-bit uint:

// NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!!!!!!!!!
function rgbToUint(red:uint, green:uint, blue:uint):uint
{
	var hex = "0x" + red.toString(16) + green.toString(16) + blue.toString(16);
  	return hex;
}

The above function looks like it works, and in fact in most cases does work – but if you get single digit values they aren’t zero-padded properly and craziness ensues!
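To see the craziness concretely, here's the same naive concatenation sketched in C++ (mimicking the AS3 anti-pattern above): with red = 255, green = 0, blue = 15 the unpadded string comes out as "ff0f", which parses back to 0xFF0F rather than the intended 0xFF000F:

```cpp
#include <cassert>
#include <cstdint>
#include <sstream>
#include <string>

// Mimics the broken AS3 approach: build a hex string per channel with NO
// zero-padding, then parse the concatenation back into an integer.
uint32_t naiveRgbToUint(unsigned red, unsigned green, unsigned blue)
{
	std::ostringstream ss;
	ss << std::hex << red << green << blue; // "ff" + "0" + "f" -> "ff0f"!
	return static_cast<uint32_t>(std::stoul(ss.str(), nullptr, 16));
}
```

It only behaves when every channel happens to be two hex digits (i.e. 16 or above), which is why the bug can hide for so long.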

The correct way to convert RGB values to a uint is to use the two functions below, which ensure that channel values are capped AND zero-padded as appropriate, so everything works under all possible conditions:

function convertChannelToHexStr(hex:uint):String
{
	if (hex > 255)
	{
		hex = 255;
		//trace("hex val overloaded");
	}
 
	var hexStr:String = hex.toString(16);
 	if (hexStr.length < 2)
	{
		hexStr = "0" + hexStr;
	}
 
	return hexStr;
}
 
function getHexColourFromRGB(r:uint, g:uint, b:uint):uint
{
	var hexStr:String = convertChannelToHexStr(r);
	hexStr           += convertChannelToHexStr(g);
	hexStr           += convertChannelToHexStr(b);
 
	// Each channel is always two characters, but pad on the left just in case
	var strLength:uint = hexStr.length;
	if (strLength < 6)
	{
		for (var j:uint = 0; j < (6 - strLength); j++)
		{
			hexStr = "0" + hexStr;
		}
	}
 
	return uint("0x" + hexStr);
}
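For what it's worth, the same clamp-and-pack can be done without strings at all – here's the idea sketched in C++ (AS3's bitwise operators behave the same way), which sidesteps the padding problem entirely:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>

// Clamp each channel to 0-255 and pack into a 24-bit 0xRRGGBB value with
// bit shifts -- no hex strings, so no padding to get wrong.
uint32_t packRGB(unsigned r, unsigned g, unsigned b)
{
	r = std::min(r, 255u);
	g = std::min(g, 255u);
	b = std::min(b, 255u);
	return (r << 16) | (g << 8) | b;
}
```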

Wrap up

Overall, it’s a nice simple effect – but it’s pretty CPU intensive. In the above example I’m only using a stage size of 250px by 120px and the framerate takes a hit even then.

I could use a single loop instead of nested x and y loops, and I could manipulate colours directly as uints rather than pulling out the RGB components, but it's probably not going to improve things by more than about 10-15%. As usual, there's source code after the jump, so if you have a play with it and manage to speed it up a bunch, feel free to let me know how you did it!

Cheers!

Download Link: 2D-Fire-Effect.fla (Flash CS4 fla – but CS5 and later will convert on load).


2D C++ OpenGL/GLFW Basecode

I’m teaching some games programming stuff this year, and we’d started off using SDL, but the students are having a hard time with it – the main problems being:
– It’s bulky,
– It demands control of the mainline and then bloats it,
– The documentation is okay, but significantly less than stellar.

This isn’t to say that I have much against SDL – it does a lot of good things, but I’m concerned it’s providing too much specific functionality, which I don’t want to rely on and be tied to when (not if) things change in the future. So with this in mind, I’ve spent the evening reading about OpenGL frameworks and have decided to take the class in a new direction – namely GLFW, the cross-platform OpenGL framework.

We’re mainly going to be working in 2D, so I’ve put together some OpenGL/GLFW basecode that initialises a window, sets up an orthographic projection (i.e. things further away don’t get any smaller) and just draws a line from the top-left to the bottom-right (so it’s easy to strip out when you want to adapt it to your own purposes) – and the entire thing is only around 100 lines of code, and that’s with stacks of whitespace! The equivalent in SDL is closer to 350 lines and significantly more complex-looking.

Check it out…

// OpenGL/GLFW Basecode | r3dux
 
#include <iostream>
#include <GL/glfw.h> // Include OpenGL Framework library
using namespace std;
 
void initGL(int width, int height)
{
	// ----- Window and Projection Settings -----
 
	// Set the window title
	glfwSetWindowTitle("GLFW Basecode");
 
	// Setup our viewport to be the entire size of the window
	glViewport(0, 0, (GLsizei)width, (GLsizei)height);
 
	// Change to the projection matrix, reset the matrix and set up orthographic projection (i.e. 2D)
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	glOrtho(0, width, height, 0, 0, 1); // Parameters: left, right, bottom, top, near, far
 
	// ----- OpenGL settings -----
 
	glfwSwapInterval(1); 		// Lock to vertical sync of monitor (normally 60Hz, so 60fps)
 
	glShadeModel(GL_SMOOTH);	// Use (gouraud) smooth shading
 
	glDisable(GL_DEPTH_TEST); 	// Disable depth testing
 
	glEnable(GL_BLEND);		// Enable blending (used for alpha) and blending function to use
	glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
 
	glLineWidth(5.0f);		// Set a 'chunky' line width
 
	glEnable(GL_LINE_SMOOTH);	// Enable anti-aliasing on lines
 
	glPointSize(5.0f);		// Set a 'chunky' point size
 
	glEnable(GL_POINT_SMOOTH);	// Enable anti-aliasing on points
}
 
void drawScene()
{
	// Clear the screen
	glClear(GL_COLOR_BUFFER_BIT);
 
	// Reset the matrix
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
 
	// ----- Draw stuff! -----
 
	glBegin(GL_LINES);
		glColor3ub(255, 0, 0);
		glVertex2f(0.0f, 0.0f);
 
		glColor3ub(0, 0, 255);
		glVertex2f(800.0f, 600.0f);
	glEnd();
 
	// ----- Stop Drawing Stuff! ------
 
	glfwSwapBuffers(); // Swap the buffers to display the scene (so we don't have to watch it being drawn!)
}
 
int main()
{
	// Frame counter and window settings variables
	int frame      = 0, width     = 800, height      = 600;
	int redBits    = 8, greenBits = 8,   blueBits    = 8;
	int alphaBits  = 8, depthBits = 0,   stencilBits = 0;
 
	// Flag to keep our main loop running
	bool running = true;
 
	// Initialise glfw
	glfwInit();
 
	// Create a window
	if (!glfwOpenWindow(width, height, redBits, greenBits, blueBits, alphaBits, depthBits, stencilBits, GLFW_WINDOW))
	{
		cout << "Failed to open window!" << endl;
		glfwTerminate();
		return 0;
	}
 
	// Call our initGL function to set up our OpenGL options
	initGL(width, height);
 
	while (running == true)
	{
		// Increase our frame counter
		frame++;
 
		// Draw our scene
		drawScene();
 
		// Exit if ESC was pressed or the window was closed
		running = !glfwGetKey(GLFW_KEY_ESC) && glfwGetWindowParam(GLFW_OPENED);
	}
 
	glfwTerminate();
 
	return 0;
}

How streamlined is that?!? Sweeeeeeet!

Next task – do stuff with it! =D

P.S. If you want to add GLEW to take care of all your extension wrangling needs, just include the glew.h header before glfw.h, link the GLEW library in your project, and add the following to the top of the initGL function:

//  ----- Initialise GLEW -----
 
GLenum err = glewInit();
if (GLEW_OK != err)
{
	cout << "GLEW initialisation error: " << glewGetErrorString(err) << endl;
	exit(-1);
}
cout << "GLEW initialised successfully. Using GLEW version: " << glewGetString(GLEW_VERSION) << endl;

Cheers!

Update: Modified initGL() function to disable the depth testing correctly via glDisable(GL_DEPTH_TEST) and not glDisable(GL_DEPTH) – silly mistake.