# Initial commit of first glTF tutorial #1
# glTF Tutorial

This tutorial gives an introduction to [glTF](https://www.khronos.org/gltf), the GL transmission format. It summarizes the most important features and application cases of glTF, and describes the structure of the files that are related to glTF. It explains how glTF files may be read, processed, and used to display 3D graphics efficiently.

Some basic knowledge about [JSON](http://json.org/), the JavaScript Object Notation, and about [OpenGL](https://www.khronos.org/opengl/) is required. Where appropriate, the related concepts of OpenGL will be explained quickly, usually with examples in [WebGL](https://www.khronos.org/webgl/).

> **Review comment:** This is a bit confusingly worded as is. Perhaps say something like "knowledge of a graphics API...examples will be in WebGL."

- [Introduction](gltfTutorial_001_Introduction.md)

> **Review comment:** Can each .md file link to the previous, next, and contents?

- [Basic glTF structure](gltfTutorial_002_BasicGltfStructure.md)
- [Scene structure](gltfTutorial_003_SceneStructure.md)
- [Scenes, nodes, cameras and animations](gltfTutorial_004_ScenesNodesCamerasAnimations.md)
- [Meshes, textures, images and samplers](gltfTutorial_005_MeshesTexturesImagesSamplers.md)
- [Materials, techniques, programs and shaders](gltfTutorial_006_MaterialsTechniquesProgramsShaders.md)
- [Buffers, bufferViews and accessors](gltfTutorial_007_BuffersBufferViewsAccessors.md)
- Skins
- Extensions

> **Review comment:** Reminder that Extensions and Summary aren't in yet.
>
> **Reply:** This as well as the TBD will likely not be part of the initial version. The extensions will be explained separately, and the Summary (locally) mainly contains code snippets for rendering with WebGL, which would go into a separate tutorial.

- Summary: Rendering a glTF asset with WebGL
## Introduction to glTF

There is an increasing number of applications and services that are based on 3D content. Online shops are offering product configurators with a 3D preview. Museums are digitizing their artifacts with 3D scans, and allow exploring their collections in virtual galleries. City planners are using 3D city models for planning and information visualization. Educators are creating interactive, animated 3D models of the human body. In many cases, these applications run directly in the web browser, which is possible because all modern browsers support efficient, OpenGL-based rendering with WebGL.

> **Review comment:** Here as well, the "OpenGL-based rendering with WebGL" is confusing. I suggest positioning statements like this as "graphics API in general (could be OpenGL, Vulkan, D3D, etc); WebGL examples for this tutorial" since this is the direction for glTF.

*(figure: applications based on 3D content)*

> **Review comment:** It would be good to have captions for each figure, or label them as
### 3D content creation

The 3D content that is rendered in the client application comes from different sources. In some cases, the raw 3D data is obtained with a 3D scanner, and the resulting geometry data is stored as [OBJ](https://en.wikipedia.org/wiki/Wavefront_.obj_file), [PLY](https://en.wikipedia.org/wiki/PLY_(file_format)) or [STL](https://en.wikipedia.org/wiki/STL_(file_format)) files. These file formats only support simple geometric objects and do not contain information about how the objects should be rendered. Additionally, they cannot describe more complex 3D scenes that involve multiple objects.

Such sophisticated 3D scenes can be created with authoring tools. These tools allow editing the overall structure of the scene, the light setup, cameras, animations, and, of course, the 3D geometry of the objects that appear in the scene. Each authoring tool defines its own file format in which the 3D scenes are stored. For example, [Blender](https://www.blender.org/) stores scenes in `.blend` files, [LightWave3D](https://www.lightwave3d.com/) uses the `.lws` file format, [3ds Max](http://www.autodesk.com/3dsmax) uses the `.max` file format, and [Maya](http://www.autodesk.com/maya) scenes are stored as `.ma` files.

For the exchange of data between 3D authoring applications, different standard file formats have been developed. For example, Autodesk has defined the [FBX](http://www.autodesk.com/products/fbx/overview) format. The Web3D Consortium has established [VRML and X3D](http://www.web3d.org/standards) as international standards for web-based 3D graphics. And the [COLLADA](https://www.khronos.org/collada/) specification defines an XML-based data exchange format for 3D authoring tools. The [list of 3D graphics file formats on Wikipedia](https://en.wikipedia.org/wiki/List_of_file_formats#3D_graphics) shows that there are more than 70 different file formats for 3D data, serving different purposes and application cases.

> **Review comment:** This paragraph is really long. Consider breaking it up into two or three paragraphs. If you have some spare time, I really love this book on writing: http://www.bartleby.com/141/
>
> **Review comment:** This paragraph does a nice job of setting the context for glTF in the sea of 3D formats. It would be awesome to complement it with a figure that shows how formats are grouped/relate.
### 3D content rendering

Although there are so many different file formats for 3D data, there is only one open, truly platform-independent and versatile way of *rendering* 3D content, and that is OpenGL. It is used for high-performance rendering applications on the desktop, on smartphones, and directly in the web browser, using WebGL. In order to render a 3D scene that was created with an authoring tool, the input data has to be parsed, and the 3D geometry data has to be converted into the format that is required by OpenGL. The 3D data has to be transferred to the graphics card memory, and then the rendering process can be described with shader programs and sequences of OpenGL API calls.

> **Review comment:** I suggest repositioning this as described above. Set the stage that there is a gap between content tools and rendering content with graphics APIs (again, could be OpenGL, WebGL, Vulkan, D3D, etc.). Each game/graphics engine has solved this with their own format and their own toolchain. That is silly; we can move the field forward faster if we collaborate around an open standard - glTF
### 3D content delivery

There is a strong and constantly increasing demand for 3D content in various applications. Many of these applications should receive the 3D content over the web. In many cases, the 3D content should also be rendered directly in the browser, with WebGL. But until now, there has been a gap between the creation of 3D content and the efficient rendering of that content in the runtime applications:

*(figure: the gap between 3D content creation and rendering)*

The existing file formats are not appropriate for this use case: Some of them do not contain any scene information, but only geometry data. Others have been designed for exchanging data between authoring applications, and their main goal is to retain as much information about the 3D scene as possible. As a result, the files are usually large, complex and hard to parse. None of the existing file formats was designed for the use case of efficiently transferring 3D scenes over the web and rendering them as efficiently as possible with OpenGL.

> **Review comment:** "hard to parse" -> not just hard to parse; they are hard to fully process so that we can generate draw calls. For example, COLLADA has polygons that would need to be triangulated at runtime.
### glTF: A transmission format for 3D scenes

The goal of glTF is to define a standard for the efficient transfer of 3D scenes that can be rendered with OpenGL. So glTF is not "yet another file format". It is the definition of a *transmission* format for 3D scenes:

- The scene structure is described with JSON, which is very compact and can easily be parsed
- The 3D data of the objects is stored in a form that can directly be used by OpenGL, so there is no overhead for decoding or pre-processing the 3D data

Different content creation tools may now provide the 3D content in the glTF format, and an increasing number of client applications are able to consume and render glTF. So glTF may help to bridge the gap between content creation and rendering:

*(figure: glTF bridging the gap between content creation and rendering)*

The content creation tools may either provide glTF directly, or use one of the open-source conversion utilities like [obj2gltf](https://github.com/AnalyticalGraphicsInc/obj2gltf) or [COLLADA2GLTF](https://github.com/KhronosGroup/glTF/tree/master/COLLADA2GLTF) to convert other formats to glTF.
## The basic structure of glTF

The core of glTF is a JSON file. This file describes the whole contents of the 3D scene. It consists of a description of the scene structure itself, which is given by a hierarchy of nodes that define a scene graph. The 3D objects that appear in the scene are defined using meshes that are attached to the nodes. Animations describe how the 3D objects are transformed (e.g. rotated or translated) over time. Materials and textures define the appearance of the objects, and cameras describe the view configuration for the renderer.

> **Review comment:** There are also skins. Up to you if you think they are worth a mention here.
>
> **Review comment:** "Materials and textures" could just be "Materials" since materials imply techniques/textures/programs/shaders/etc. Really up to you.
## The JSON structure

The scene objects are stored in dictionaries in the JSON file. They can be accessed using an ID, which is the key of the dictionary:

```javascript
"meshes": {
    "FirstExampleMeshId": { ... },
    "SecondExampleMeshId": { ... },
    "ThirdExampleMeshId": { ... }
}
```
These IDs are also used to define the *relationships* between the objects. The example above defines multiple meshes, and a node may refer to one of these meshes, using the mesh ID, to indicate that the mesh should be attached to this node:

```javascript
"nodes": {
    "FirstExampleNodeId": {
        "meshes": [
            "FirstExampleMeshId"
        ]
    },
    ...
}
```
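
Resolving such an ID-based reference is then a plain dictionary lookup. The following sketch (in JavaScript, with purely illustrative object contents and a hypothetical `getMeshesOfNode` helper) shows how an application might follow the mesh IDs of a node after the JSON has been parsed:

```javascript
// Illustrative, strongly simplified glTF structure after JSON parsing.
// The IDs and contents are only examples, matching the snippets above.
const gltf = {
  meshes: {
    FirstExampleMeshId: { name: "first mesh" },
    SecondExampleMeshId: { name: "second mesh" }
  },
  nodes: {
    FirstExampleNodeId: {
      meshes: ["FirstExampleMeshId"]
    }
  }
};

// Follow the mesh IDs of a node to the actual mesh objects:
function getMeshesOfNode(gltf, nodeId) {
  const node = gltf.nodes[nodeId];
  return (node.meshes || []).map(meshId => gltf.meshes[meshId]);
}

const meshes = getMeshesOfNode(gltf, "FirstExampleNodeId");
console.log(meshes[0].name); // "first mesh"
```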
More information about the top-level elements and the relationships between these elements will be given in the [Scene structure](gltfTutorial_003_SceneStructure.md) section.
## References to external data

The binary data, like the geometry and textures of the 3D objects, is usually not contained in the JSON file. Instead, it is stored in dedicated files, and the JSON part only contains links to these files. This allows the binary data to be stored in a form that is very compact and can efficiently be transferred over the web. Additionally, the data can be stored in a format that can be used directly in OpenGL, without having to parse, decode or preprocess the data.

*(figure: the glTF JSON with references to external resources)*

As shown in the image above, there are three types of objects that may contain such links to external resources, namely `buffers`, `images` and `shaders`. These objects will later be explained in more detail.

> **Review comment:** Skins too.
>
> **Reply:** Ah, I think buffers cover skins.
### Reading and managing external data

Reading and processing a glTF asset starts with parsing the JSON structure. After the structure has been parsed, the `buffers`, `images` and `shaders` are available as dictionaries. The keys of these dictionaries are IDs, and the values are the [`buffer`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-buffer), [`image`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-image) and [`shader`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-shader) objects, respectively.

Each of these objects contains links, in the form of URIs that point to external resources. For further processing, this data has to be read into memory. Usually it will be stored in a dictionary (or map) data structure, so that it may be looked up using the ID of the object that it belongs to.
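
As a sketch of this bookkeeping, the following JavaScript shows one way to build such an ID-to-data dictionary. The `loadExternalData` and `loadUri` names are hypothetical; `loadUri` stands for an application-provided function that resolves a URI to its contents (e.g. via an `XMLHttpRequest`), and is stubbed out here:

```javascript
// Sketch: read the external resources of all objects in one of the
// glTF dictionaries (e.g. "buffers" or "images") into a map from
// object ID to the raw data behind the object's URI.
function loadExternalData(dictionary, loadUri) {
  const dataById = {};
  for (const id of Object.keys(dictionary)) {
    dataById[id] = loadUri(dictionary[id].uri);
  }
  return dataById;
}

// Usage with a stubbed loader that just echoes the URI:
const buffers = {
  buffer01: { byteLength: 4, uri: "buffer01.bin" }
};
const fakeLoadUri = uri => "<data of " + uri + ">";
const bufferData = loadExternalData(buffers, fakeLoadUri);
console.log(bufferData.buffer01); // "<data of buffer01.bin>"
```

In a real application, the loader would perform the actual network or file access, and the resulting map would be kept alongside the parsed JSON for the later processing steps.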
### Binary data in `buffers`

A [`buffer`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-buffer) contains a URI that points to a file containing the raw, binary buffer data:

```javascript
"buffer01": {
    "byteLength": 12352,
    "type": "arraybuffer",
    "uri": "buffer01.bin"
}
```
This binary data has no inherent meaning or structure. It is only a compact representation of data that can efficiently be transferred over the web, and then be interpreted in different ways. For example, it may later be interpreted as geometry, skinning or animation data. For the case of geometry data, it can directly be passed to an OpenGL-based renderer, without having to decode or pre-process it. Until then, it is just a raw block of memory that is read from the URI of the `buffer`:

> **Implementation note:**
>
> The data that is read for one buffer may combine different data blocks. It is therefore important to store the data in a way that later allows obtaining *parts* or *segments* of the whole buffer:
>
> * In **JavaScript**, the buffer data is received as the response of sending an [`XMLHttpRequest`](https://developer.mozilla.org/en/docs/Web/API/XMLHttpRequest) to the URI. It will be an [`ArrayBuffer`](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer). Parts of this buffer may then be obtained by creating a [`TypedArray`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray) from the desired part of the `ArrayBuffer`.
> * In **Java**, the buffer data may be read from the [`InputStream`](https://docs.oracle.com/javase/8/docs/api/java/io/InputStream.html) of the URI. It may be stored in a [`ByteBuffer`](https://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html). In order to easily pass the data to OpenGL, this should be a *direct* `ByteBuffer`. Typed views on parts of the buffer may then be created from this buffer, for example, as a [`FloatBuffer`](https://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html#asFloatBuffer--).
> * In **C++**, the data is read from an [`std::istream`](http://www.cplusplus.com/reference/istream/istream/) and stored in an [`std::vector<uint8_t>`](http://www.cplusplus.com/reference/vector/vector/). The raw data pointer of this vector may then be used to construct vectors that represent parts of the buffer.
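
For the JavaScript case, obtaining such a typed view of one segment of the raw buffer may look as follows. The buffer contents, byte offset and element count are only illustrative:

```javascript
// Sketch: interpret one segment of a raw ArrayBuffer as 32-bit floats.
// First fill an example buffer with six floats (24 bytes):
const arrayBuffer = new ArrayBuffer(24);
const fullView = new Float32Array(arrayBuffer);
fullView.set([0.0, 1.0, 2.0, 3.0, 4.0, 5.0]);

// Create a typed view of only a part of the buffer:
// skip the first 8 bytes (two floats) and read three floats.
const byteOffset = 8;
const elementCount = 3;
const segment = new Float32Array(arrayBuffer, byteOffset, elementCount);
console.log(Array.from(segment)); // [2, 3, 4]
```

Note that the typed array is only a *view*: no data is copied, so several segments of one buffer can share the same underlying memory.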
### Image data in `images`

An [`image`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-image) refers to an external image file that can be used as the texture of a rendered object:

```javascript
"image01": {
    "uri": "image01.png"
}
```
The reference is given as a URI that usually points to a PNG or JPG file. These formats significantly reduce the size of the files, so that they may efficiently be transferred over the web, and they can still be decoded quickly and easily.

> **Implementation note:**
>
> * In **JavaScript**, the data will be an [`ImageData`](https://developer.mozilla.org/en-US/docs/Web/API/ImageData) that is obtained from a [`CanvasRenderingContext2D`](https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/getImageData) after setting the image URI as the source of the [`Image`](https://developer.mozilla.org/en-US/docs/Web/API/HTMLImageElement/Image) that was rendered into the render context. See [this Stack Overflow answer](http://stackoverflow.com/a/17591386/) for an example.
> * In **Java**, the data will be a [`ByteBuffer`](https://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html) that is filled with the pixel data from an image that was read with [`ImageIO`](https://docs.oracle.com/javase/8/docs/api/javax/imageio/ImageIO.html) from the [`InputStream`](https://docs.oracle.com/javase/8/docs/api/java/io/InputStream.html) of the URI.
> * In **C++**, it can be an [`std::vector<uint8_t>`](http://www.cplusplus.com/reference/vector/vector/) that is filled with the pixel data of an image. The data is read from an [`std::istream`](http://www.cplusplus.com/reference/istream/istream/) that either points to a file or to a URI, and decoded using a C++ image loading library.
>
> There are some caveats regarding the *format* of this pixel data. It has to be in a form that can be passed to OpenGL. Some libraries contain utility methods for reading and converting the pixel data from images. The section about [Textures, Images and Samplers](gltfTutorial_005_MeshesTexturesImagesSamplers.md) contains more information about the image formats, and how the images are passed to OpenGL.
### GLSL shader data in `shaders`

A GLSL vertex or fragment [`shader`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-shader) that should be used for rendering the 3D objects contains a URI that points to a file containing the shader source code:

```javascript
"fragmentShader01": {
    "type": 35632,
    "uri": "fragmentShader01.glsl"
}
```

The shader source code is stored as plain text, so that it can directly be compiled with OpenGL.
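
As an illustration of this compilation step with WebGL, a minimal helper might look like the following. The function name `compileShader` is just an illustrative choice; a real application would obtain `gl` from a canvas via `canvas.getContext("webgl")`, and error handling is kept minimal:

```javascript
// Sketch: compile a shader from its source text with the WebGL API.
// Note that the glTF "type" value 35632 is the same numeric constant
// as gl.FRAGMENT_SHADER (35633 is gl.VERTEX_SHADER), so the type from
// the glTF JSON can be passed to createShader directly.
function compileShader(gl, type, sourceCode) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, sourceCode);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error("Could not compile shader: " + gl.getShaderInfoLog(shader));
  }
  return shader;
}
```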
# The glTF scene structure

The following image (adapted from the [glTF concepts section](https://github.com/KhronosGroup/glTF/tree/master/specification#concepts)) gives an overview of the top-level elements of a glTF:

*(figure: the top-level elements of a glTF asset)*

These elements are summarized here quickly, to give an overview, with links to the respective sections of the glTF specification. More detailed explanations of the relationships between these elements will be given in the following sections.
- The [`scene`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-scene) is the entry point for the description of the scene that is stored in the glTF. It refers to the `node`s that define the scene graph.
- The [`node`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-node) is one node in the scene graph hierarchy. It can contain a transformation (like rotation or translation), and it may refer to further (child) nodes. Additionally, it may refer to `mesh` or `camera` instances that are "attached" to the node, or to a `skin` that describes a mesh deformation.
- The [`camera`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-camera) defines the view configuration for rendering the scene.
- A [`mesh`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-mesh) describes a geometric object that appears in the scene. It refers to `accessor` objects that are used for accessing the actual geometry data, and to `material`s that define the appearance of the object when it is rendered.
- The [`skin`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-skin) defines parameters that are required for vertex skinning, which allows the deformation of a mesh based on the pose of a virtual character. The values of these parameters are obtained from an `accessor`.
- An [`animation`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-animation) describes how transformations of certain nodes (like rotation or translation) change over time.
- The [`accessor`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-accessor) is used as an abstract source of arbitrary data. It is used by the `mesh`, `skin` and `animation`, and provides the geometry data, the skinning parameters and the time-dependent animation values. It refers to a [`bufferView`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-bufferView), which is a part of a [`buffer`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-buffer) that contains the actual raw binary data.
- The [`material`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-material) contains the parameters that define the appearance of an object. It can refer to a `texture` that will be applied to the object, and it refers to the `technique` for rendering an object with the given material. The [`technique`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-technique) refers to a [`program`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-program) that contains the OpenGL GLSL vertex and fragment [`shader`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-shader)s that are used for rendering the object.
- The [`texture`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-texture) is defined by a [`sampler`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-sampler) and an [`image`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-image). The `sampler` defines how the texture `image` should be placed on the object.
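
To illustrate how these references chain together, the following sketch walks the node hierarchy of a strongly simplified, hypothetical glTF structure (all IDs and names are only illustrative) and collects the IDs of all meshes that appear in the scene:

```javascript
// Illustrative, strongly simplified glTF structure: a scene refers to
// root nodes, nodes refer to child nodes and to meshes.
const gltf = {
  scene: "defaultScene",
  scenes: { defaultScene: { nodes: ["rootNode"] } },
  nodes: {
    rootNode: { children: ["childNode"], meshes: ["groundMesh"] },
    childNode: { meshes: ["characterMesh"] }
  }
};

// Recursively visit the node hierarchy, starting at the root nodes
// of the default scene, and collect the IDs of attached meshes:
function collectMeshIds(gltf) {
  const meshIds = [];
  const visit = nodeId => {
    const node = gltf.nodes[nodeId];
    meshIds.push(...(node.meshes || []));
    (node.children || []).forEach(visit);
  };
  gltf.scenes[gltf.scene].nodes.forEach(visit);
  return meshIds;
}

console.log(collectMeshIds(gltf)); // ["groundMesh", "characterMesh"]
```

A renderer would perform a similar traversal, additionally accumulating the transformation of each node along the way.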
> **Review comment:** Perhaps add an Acknowledgements section to the bottom of this .md file and add everyone who contributes content, provides feedback, edits, etc. This may help encourage contributions and tie together the glTF community.
>
> **Review comment:** Perhaps rename this to something like "Introduction to glTF using WebGL"