Thursday, January 31, 2013
Portfolio Base
Since I don't really have a place to show all the stuff and prototypes I've made (which I obviously should), I figured I'd just put them all here.
Hack n' Hide:
TeamBlog:
http://hacknhide.blogspot.com/
Reveal:
Reveal_UnityPrototype
Control:
Turn On/Off Flashlight - left click;
Melee - F;
Turn energy into battery - right click;
Move - WASD.
Goobles:
Goobles_UnityPrototype
Control:
Placing turret - 1;
Placing landmine - 2;
Placing sticky paper - 3;
Shooting - right click.
Converse:
Converse_playableLink
Control:
Click.
Crusaders:
Crusaders_XNAPrototype
Control:
Move - WASD
Monte's Quest:
Monty's Quest_MOAI_SDK_Prototype
Control:
Jump - click.
Friday, January 25, 2013
Graphics Programming 3
Since I'm on fire today, I thought I might as well finish assignment 3 :p
The concept is simple:
Texture coordinate:
The whole point of texture coordinates is to mark each vertex with a coordinate so it knows which part of the texture it maps to. That way, when the coordinates get interpolated into the fragment shader, each fragment knows exactly which point on the texture to sample, and gets its color from there. That's how a triangle knows which part of the texture to render.
One thing to note, though: the texture coordinate system is actually "flipped", meaning the upper-left corner is 0, 0 instead of 0, 1 (so the lower-left is 0, 1).
Point light& diffuse light:
In graphics, we assume diffuse light scatters in all directions, and the "strength" of the light is determined by the light source (in our case, a point light).
How do we do that?
First: get the light direction vector by subtracting the fragment's position from the light's position. (So we know we're going to do this in the fragment shader. Well, we could actually calculate the lighting in the vertex shader and let the graphics card interpolate the result, but that would look worse than per-pixel lighting, and considering how powerful graphics cards are nowadays, we can afford it.)
Second: get the normal vector from the mesh and transform it into world space (that's how I prefer it, anyway).
Finally: take the dot product of these two normalized vectors, which is the cosine of their angle, which is exactly the intensity of the light at that point.
Anyway, it's all simple 3D math.
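The three steps above can be sketched in plain C++ like this (a minimal illustration, not the actual shader code; `Vec3` and `diffuseFactor` are names I made up for the sketch):

```cpp
#include <cmath>

// Minimal vector type for the sketch.
struct Vec3 { float x, y, z; };

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Step 1: light direction = light position - fragment position.
// Step 2: the normal is assumed to already be in world space.
// Step 3: the clamped dot product of the two unit vectors is the
// cosine of their angle, i.e. the diffuse intensity at that point.
float diffuseFactor(Vec3 lightPos, Vec3 fragPos, Vec3 normal) {
    Vec3 toLight = normalize({ lightPos.x - fragPos.x,
                               lightPos.y - fragPos.y,
                               lightPos.z - fragPos.z });
    float d = dot(toLight, normalize(normal));
    return d > 0.0f ? d : 0.0f; // surfaces facing away get no light
}
```

A fragment whose normal points straight at the light gets the full intensity of 1; at 90 degrees or beyond it gets 0.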
So, this is how it actually looks:
And the texture resource in PIX:
To control the point light:
use "I" - up, "K" - down, "J" - left, "L" - right, "U" - backward, "O" - forward;
To control the camera: Arrow Keys
To control the box: WASD.
Here is the code:
Graphics03_Source
Graphics Programming 2
This week was kind of tough. The goal of the assignment itself isn't hard, but refactoring the code (the render function simply gets too long if we don't) actually took me most of the time.
As always, I'll start on the theory for future reference:
This assignment is all about showing a 3D mesh on screen, which is, in fact, projecting everything in 3D space onto a 2D plane. To be able to do that, you need three matrices:
ModelToWorld Matrix: Transforms the mesh data from model space (which is centered at the origin) to world space, where the mesh actually is. To do that, you need to get the transform information from your code (position, rotation, even scale). You get the final matrix by multiplying the rotation matrix, translation matrix, and so on. (You can build any transform you want by continually multiplying matrices; it's all linear algebra.)
WorldToView Matrix: Transforms everything from world space based on the camera's location (position and rotation, to be exact), so that everything is placed relative to our point of view. To do that, you multiply the ModelToWorld matrix by the INVERSE of the camera's transform matrix (since everything moves opposite to the camera; for example, when the camera moves left, everything moves right in the viewport).
Projection Matrix: Right now what we have is a 3D space, but what we see on the screen is 2D, so we need a matrix to project everything from 3D onto a 2D plane. A projection matrix does exactly that, and DirectX provides functions to build one (D3DXMatrixPerspectiveLH() or D3DXMatrixPerspectiveFovLH()).
Now back to the code:
I actually did this assignment twice. The first time I just went along and started refactoring right at the beginning. After I finished everything, the cube simply wouldn't show on the screen. I debugged and rechecked the code for 6 hours, did everything I could, and still couldn't figure out what the reason could possibly be, even now. That really pissed me off.
Here is the PIX debug screen for that one:
Everything looks perfect: the vertex buffer and index buffer did get passed into the vertex shader, the preVS and postVS are right on track, and all the matrices are loaded into the vertex shader. It's just that nothing shows in the viewport. I even tried disabling backface culling, and it still didn't work. My guess is that somehow the data isn't making it into the fragment shader, but I can't figure out why.
So I restarted, again from the last assignment. This time, every time I changed part of the code, I debugged it and made sure everything still showed properly on the screen. It worked, and it didn't take much time since I simply reused most of the code from the first attempt.
Anyway, the box:
PreVS:
PostVS:
Input instructions:
Move Camera: Arrow Keys;
Move Box: WASD
Link to the code:
Graphic02_WorkingCode
Here is also the link to the first one that doesn't work if anyone wants to check it out:
Graphics02_BrokenCode
Wednesday, January 23, 2013
First two weeks - the game pitch
Beginning this semester, we are going into full production on a complete game until we graduate. Not only will the game be submitted to the IGF, it could also be the first published title we have. So it's kind of a big deal.
The first two weeks were the pitching phase. Anyone could form a team and pitch their idea if they wanted. By the end of the first week, there were more than ten pitches going around, many of them brilliant and innovative. Choosing one of them became a quite painful task.
And the truth is, the choosing process (copied from EA) is not quite what I imagined it would be. I like the fact that everybody can choose the game they want to make, which is the right thing to do and quite reasonable, to be honest. But letting this process go on for a week, plus the rule that each team must have at least 5 people, kind of shifted everything away from its original purpose. Politics came in; people started to compromise or got influenced by others. Some chose the strong team that they thought actually had a shot; some just let go and chose the people they wanted to work with rather than the game. A lot of brilliant ideas got killed because not enough people supported them. Even so, I believe most people are pretty happy about where they ended up. I guess that's good enough.
For the next cohort, my suggestion would be to let people vote for the two games they like most (two, in case everybody votes for what they pitched, and they would) right after the pitching and feedback are finished, then pick the 5 or 6 games that got the most votes and regroup after that.
Anyway, I'm very glad that I'm working on the game that I like the most. The first gate is after three weeks and we better start working on the prototype.
Friday, January 18, 2013
Graphics Programming 1
I stated my interest in shaders in the previous post. That's why I picked a graphics class this semester. I'll share my learning experience here from now on.
The first class is pretty basic. It's all about introducing the rendering pipeline, which is, of course, very important and useful for us to know. Since we are starting in 2D, the basic process is (if I'm not getting anything wrong):
Load vertex information from a mesh file (grouped in threes - at least for now, since we are using a triangle list)
-> process each vertex in the vertex shader (where you get all the inputs you want from your program - fed in through DirectX in our case, and only the positions of the vertices for now) and output the positions and colors of those vertices to the hardware
-> the hardware interpolates everything the vertex shader sends it and passes that fragment information to the pixel shader (or fragment shader)
--BTW, a fragment is a potential pixel on the screen (which means there is a chance it may not be rendered, because it could be a bad pixel - for example, blocked by another object).
-> for each fragment that comes in, the pixel shader processes it (applying modifications if you want) and outputs a fragment. Then the hardware decides which fragments to render at the end.
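The interpolation step in the middle of the pipeline can be sketched very simply: a fragment lying between two vertices receives a blend of their outputs. Real hardware uses barycentric weights across the whole triangle; this illustration (with made-up names `Color` and `lerpColor`) just shows one edge:

```cpp
// Minimal color type for the sketch.
struct Color { float r, g, b; };

// Linearly interpolate two per-vertex colors; t = 0 gives a, t = 1 gives b.
// This is what the hardware does (per attribute) for each fragment between
// the vertices, before handing the result to the pixel shader.
Color lerpColor(Color a, Color b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}
```

A fragment halfway along an edge from a red vertex to a blue vertex comes out purple, which is exactly the smooth gradient you see across a triangle with differently colored corners.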
Anyway, learning this really helped clear up my vague understanding of the pipeline.
As for the assignment, it just loads a file that holds a rectangle's data (two triangles - six vertices, to be exact) and renders it onto the screen through the pipeline. Basically, all we have to do is write a file loader that parses the vertex positions and then play around with the vertex shader and pixel shader. Nothing much.
Since I figured we are going to write an FBX parser eventually, I might as well do some simple practice now; that's why I used this format (not exactly the same as FBX, but similar in spirit. And yes, it's 2D for now - baby steps):
VertexNumber = [number of vertices];
VertexData =
{
[x0], [y0],
[x1], [y1],
...
[xn], [yn]
}
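A loader for this format can be sketched as below. This is just an illustration of the parsing idea, not the actual assignment code; `Vertex2D` and `parseVertexFile` are names I made up. It strips everything that isn't part of a number, then reads the count followed by the x/y pairs:

```cpp
#include <cctype>
#include <sstream>
#include <string>
#include <vector>

struct Vertex2D { float x, y; };

// Parse the "VertexNumber = N; VertexData = { x0, y0, ... }" format above.
// Works because the keywords contain no digits, so after blanking out
// every non-numeric character, only the count and coordinates remain.
std::vector<Vertex2D> parseVertexFile(const std::string& text) {
    std::string cleaned;
    for (char c : text) {
        if (std::isdigit((unsigned char)c) || c == '-' || c == '.')
            cleaned += c;
        else
            cleaned += ' ';
    }
    std::istringstream in(cleaned);
    int count = 0;
    in >> count; // first number is the vertex count
    std::vector<Vertex2D> verts;
    Vertex2D v;
    while (verts.size() < (size_t)count && in >> v.x >> v.y)
        verts.push_back(v);
    return verts;
}
```

The real loader would also report malformed files instead of silently returning fewer vertices, but the core idea is the same.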
As for the actual shaders, there isn't much to do. I added some funny equations to the vertex shader to see how it behaved. In the pixel shader I only pass the color information through, since I really want to check out the interpolation effect. Anyway, here is part of the code for the vertex shader:
o_position = float4(
i_position.x * abs(cos( g_secondsElapsed * 0.3)) + abs( ( 0.5 * i_position.x ) * cos(g_secondsElapsed * i_position.y ) ),
i_position.y * abs(sin( g_secondsElapsed * 0.5)) + ( ( 0.3 * i_position.y ) * sin( g_secondsElapsed * i_position.x ) ),
0.0, 1.0 );
// Calculate color
float red = ( sin( g_secondsElapsed ) * 0.5 * i_position.x ) + 0.5;
float green = ( cos( 5.0 * g_secondsElapsed ) * 0.5 * i_position.y) + 0.5f;
float blue = 1.0 - sin( g_secondsElapsed * 4.0);
o_color = float4( red, green, blue, 1.0 );
Here is how it looks:
Also, the debug information from the PIX:
Anyway, pretty basic stuff. We are doing 3D in the next class; it will get more and more interesting.
BTW,
the project file for the assignment:
Code