Thursday, 7 May 2015

Displaying 3D Polygon Animations


API
Games use software known as an API (Application Programming Interface), which is a set of tools for building software applications. A good API makes the software easier to develop, because it gives you all the building blocks you need to make the software you're trying to make.

Most operating systems, such as Windows, provide an API so that programmers can make applications for that operating system. Although APIs are aimed at programmers, they also benefit users, because applications made with the same API will probably have similar interfaces.

Direct 3D

Direct 3D is an API designed for manipulating and displaying 3D objects, and it was developed by Microsoft. Direct 3D gives programmers a way to utilise any graphics card in a PC and use it to display objects, and almost all PCs are compatible with Direct 3D.
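
As a rough illustration only (not taken from any particular tutorial, and the variable names are my own), this is roughly what asking Direct 3D 11 for a device looks like in C++ on Windows:

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;          // represents the graphics card
    ID3D11DeviceContext* context = nullptr;  // used to issue drawing commands

    // Ask Direct 3D for a hardware accelerated device on the default adapter.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &device, nullptr, &context);

    if (SUCCEEDED(hr))
    {
        // The device can now be used to create buffers, textures and shaders.
        context->Release();
        device->Release();
    }
    return 0;
}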

Open GL
Open GL was developed by Silicon Graphics in the early 90s and has become one of the most widely used graphics APIs in the world. It is very similar to Direct 3D, but it is an open standard, meaning anyone can implement it and use it however they want, whereas Direct 3D limits you to what Microsoft says you can do.
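
As a very small sketch of what Open GL code can look like, here is an old style ("immediate mode") triangle in C++. It assumes the GLFW library is installed to open the window, since Open GL itself doesn't handle windows:

#include <GLFW/glfw3.h>

int main()
{
    if (!glfwInit()) return -1;
    GLFWwindow* window = glfwCreateWindow(640, 480, "Open GL triangle", nullptr, nullptr);
    if (!window) { glfwTerminate(); return -1; }
    glfwMakeContextCurrent(window);

    while (!glfwWindowShouldClose(window))
    {
        glClear(GL_COLOR_BUFFER_BIT);

        // Draw one triangle primitive.
        glBegin(GL_TRIANGLES);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
        glEnd();

        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}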

Graphics Pipeline
The graphics pipeline is the way that a computer transforms the mathematical data it has about an object into the object that we see on the screen. The 3D graphics pipeline typically takes a 3D object described as data and converts it into a 2D raster image. Open GL and Direct 3D both have very similar graphics pipelines.

Stages of the graphics pipeline 
First the scene is created out of geometric primitive shapes. This is usually done using triangles, because a triangle's three vertices always lie on a single plane.
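
A rough C++ sketch of what the scene data looks like at this point (the struct names are just made up for illustration):

#include <vector>

struct Vec3 { float x, y, z; };     // a point in 3D space

struct Triangle { Vec3 a, b, c; };  // the basic primitive: three vertices,
                                    // which always lie on a single plane

int main()
{
    // A scene is simply a list of triangles.
    std::vector<Triangle> scene = {
        { {0, 0, 0}, {1, 0, 0}, {0, 1, 0} },
        { {1, 0, 0}, {1, 1, 0}, {0, 1, 0} },
    };
    (void)scene;
    return 0;
}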


Modelling Transformation
This stage transforms each object's local coordinates into the 3D world coordinate system.
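
A simplified C++ sketch of this step, assuming the object only needs a scale, a rotation around the y axis and a move to its position in the world (real pipelines normally do all of this with a single 4x4 matrix):

#include <cmath>

struct Vec3 { float x, y, z; };

// Modelling transformation: take a vertex in the object's own local space
// and place it in the shared 3D world coordinate system.
Vec3 localToWorld(const Vec3& v, float scale, float yaw, const Vec3& worldPos)
{
    Vec3 s{ v.x * scale, v.y * scale, v.z * scale };        // scale
    Vec3 r{ s.x * std::cos(yaw) + s.z * std::sin(yaw),      // rotate around y
            s.y,
           -s.x * std::sin(yaw) + s.z * std::cos(yaw) };
    return { r.x + worldPos.x, r.y + worldPos.y, r.z + worldPos.z };  // translate
}

int main()
{
    Vec3 worldVertex = localToWorld({1, 0, 0}, 2.0f, 0.0f, {10, 0, 5});
    (void)worldVertex;
    return 0;
}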


Camera Transformation
Next it transforms the 3D world coordinates into 3D camera coordinates, with the camera as the origin.
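
A simplified sketch in the same style, assuming the camera only has a position and a rotation around the y axis:

#include <cmath>

struct Vec3 { float x, y, z; };

// Camera transformation: express a world space point relative to the camera,
// so the camera becomes the origin of the coordinate system.
Vec3 worldToCamera(const Vec3& p, const Vec3& cameraPos, float cameraYaw)
{
    // Move the world so the camera sits at the origin...
    Vec3 t{ p.x - cameraPos.x, p.y - cameraPos.y, p.z - cameraPos.z };
    // ...then undo the camera's rotation (rotate by -cameraYaw around y).
    return { t.x * std::cos(-cameraYaw) + t.z * std::sin(-cameraYaw),
             t.y,
            -t.x * std::sin(-cameraYaw) + t.z * std::cos(-cameraYaw) };
}

int main()
{
    Vec3 cameraSpace = worldToCamera({10, 0, 5}, {0, 0, -5}, 0.0f);
    (void)cameraSpace;
    return 0;
}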

Lighting 
The scene is illuminated according to the lighting and the reflectiveness of each object; for example, if the room is pitch black, the objects will appear black.
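
A sketch of the simplest version of this (Lambert style diffuse lighting); the surface normal and light direction are assumed to already be normalised:

#include <algorithm>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Diffuse lighting: the more the surface faces the light, and the more
// reflective it is, the brighter it appears; with no light it stays black.
float diffuse(const Vec3& surfaceNormal, const Vec3& directionToLight,
              float lightIntensity, float reflectiveness)
{
    float facing = std::max(0.0f, dot(surfaceNormal, directionToLight));
    return lightIntensity * reflectiveness * facing;   // 0 in a pitch black room
}

int main()
{
    float brightness = diffuse({0, 1, 0}, {0, 1, 0}, 1.0f, 0.8f);
    (void)brightness;
    return 0;
}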

Projection Transformation
This stage transforms the 3D camera coordinates into the 2D view of the camera. An object further away from the camera looks smaller and ones that are closer look larger; this happens because the x and y coordinates of each point are divided by its z coordinate (which represents its distance from the camera). In orthographic projection, objects keep their original size regardless of their distance from the camera.
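
A sketch of the difference between the two projections; focalLength here is just a made up constant that controls the field of view:

struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

// Perspective projection: divide x and y by the distance from the camera (z),
// so points that are further away end up closer to the centre and look smaller.
Vec2 perspectiveProject(const Vec3& p, float focalLength)
{
    return { focalLength * p.x / p.z, focalLength * p.y / p.z };
}

// Orthographic projection: simply drop z, so size never changes with distance.
Vec2 orthographicProject(const Vec3& p)
{
    return { p.x, p.y };
}

int main()
{
    Vec2 nearPoint = perspectiveProject({1, 1, 2}, 1.0f);    // appears larger
    Vec2 farPoint  = perspectiveProject({1, 1, 20}, 1.0f);   // appears smaller
    (void)nearPoint; (void)farPoint;
    return 0;
}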


Clipping  

At this stage anything that cannot be seen by the camera is discarded rather than drawn.
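
A rough sketch of the test, assuming a camera with a 90 degree field of view that can see between a near and a far distance:

#include <cmath>

struct Vec3 { float x, y, z; };

// Clipping: keep only points inside the camera's view volume; anything
// outside it is discarded and never drawn.
bool isVisible(const Vec3& p, float nearDist, float farDist)
{
    if (p.z < nearDist || p.z > farDist) return false;   // behind the camera or too far away
    // With a 90 degree field of view the visible |x| and |y| range grows with z.
    return std::fabs(p.x) <= p.z && std::fabs(p.y) <= p.z;
}

int main()
{
    bool visible = isVisible({0.5f, 0.0f, 2.0f}, 0.1f, 100.0f);
    (void)visible;
    return 0;
}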

Scan Conversion or Rasterization
This is the stage where the 2D image is converted into a raster format and the resulting pixel values are determined.
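
A simplified C++ sketch of scan conversion for one triangle (it assumes the triangle's vertices are already in pixel coordinates and wound anti-clockwise):

#include <algorithm>
#include <vector>

struct Vec2 { float x, y; };

// "Edge function": positive when point p lies on the inner side of the edge a->b.
float edge(const Vec2& a, const Vec2& b, const Vec2& p)
{
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// Scan conversion: walk over every pixel in the triangle's bounding box and
// mark the pixels whose centre falls inside the triangle.
void rasterize(const Vec2& a, const Vec2& b, const Vec2& c,
               std::vector<unsigned char>& pixels, int width, int height)
{
    int minX = std::max(0,          (int)std::min({a.x, b.x, c.x}));
    int maxX = std::min(width - 1,  (int)std::max({a.x, b.x, c.x}));
    int minY = std::max(0,          (int)std::min({a.y, b.y, c.y}));
    int maxY = std::min(height - 1, (int)std::max({a.y, b.y, c.y}));

    for (int y = minY; y <= maxY; ++y)
        for (int x = minX; x <= maxX; ++x)
        {
            Vec2 p{ x + 0.5f, y + 0.5f };                  // centre of the pixel
            if (edge(a, b, p) >= 0 && edge(b, c, p) >= 0 && edge(c, a, p) >= 0)
                pixels[y * width + x] = 255;               // this pixel is covered
        }
}

int main()
{
    std::vector<unsigned char> pixels(64 * 64, 0);
    rasterize({10, 10}, {50, 12}, {30, 55}, pixels, 64, 64);
    return 0;
}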

Texturing, Fragment Shading
At this stage each pixel (fragment) is assigned a colour, based on values interpolated from the triangle's vertices during rasterization, taken from a texture held in memory or produced by a shader program.
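
A sketch of the texture lookup side of this, using a simple nearest pixel lookup (the Texture struct here is made up for illustration):

#include <vector>

// A texture held in memory: a grid of packed colour values.
struct Texture
{
    int width, height;
    std::vector<unsigned int> texels;   // one packed RGBA colour per texel
};

// Fragment shading: the fragment arrives with texture coordinates (u, v) that
// were interpolated from the triangle's vertices during rasterization, and
// its colour is read from the texture.
unsigned int shadeFragment(const Texture& tex, float u, float v)
{
    int tx = (int)(u * (tex.width - 1));    // map u in [0, 1] to a column
    int ty = (int)(v * (tex.height - 1));   // map v in [0, 1] to a row
    return tex.texels[ty * tex.width + tx];
}

int main()
{
    Texture checker{ 2, 2, { 0xFFFFFFFF, 0xFF000000, 0xFF000000, 0xFFFFFFFF } };
    unsigned int colour = shadeFragment(checker, 0.9f, 0.1f);
    (void)colour;
    return 0;
}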

Display 
At this point the final coloured pixels are displayed on the screen.
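
In a real system the graphics card sends the finished framebuffer to the monitor; as a stand in, this sketch just writes the framebuffer out as an image file (PPM format) so it can be opened and viewed:

#include <fstream>
#include <vector>

int main()
{
    const int width = 256, height = 256;
    std::vector<unsigned char> framebuffer(width * height * 3);

    // Fill the framebuffer with a simple gradient instead of a rendered scene.
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
        {
            framebuffer[(y * width + x) * 3 + 0] = (unsigned char)x;   // red
            framebuffer[(y * width + x) * 3 + 1] = (unsigned char)y;   // green
            framebuffer[(y * width + x) * 3 + 2] = 0;                  // blue
        }

    // "Display" the pixels by writing them to frame.ppm.
    std::ofstream out("frame.ppm", std::ios::binary);
    out << "P6\n" << width << " " << height << "\n255\n";
    out.write(reinterpret_cast<const char*>(framebuffer.data()),
              (std::streamsize)framebuffer.size());
    return 0;
}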

