
Initial commit of first glTF tutorial #1

Merged (55 commits, Nov 26, 2017)

Commits:
- `cf36a27` Initial commit of first glTF tutorial (javagl, Oct 20, 2016)
- `57ad950` Included first comments for first pull request (javagl, Oct 31, 2016)
- `9c09afe` Updated link anchor (javagl, Oct 31, 2016)
- `14ef12f` Fixed link anchor (javagl, Oct 31, 2016)
- `9c67b42` Refactored introduction (javagl, Nov 1, 2016)
- `2030187` Fixed matrix image (javagl, Nov 2, 2016)
- `e108349` Next refactoring and cleanup of the generic glTF tutorial (javagl, Nov 2, 2016)
- `e73b738` Next step of tutorial refactoring (javagl, Nov 4, 2016)
- `6e73912` Fixed navigation elements (javagl, Nov 4, 2016)
- `be84f85` Next refactoring step of tutorial (javagl, Nov 4, 2016)
- `9325982` Fixed buffer length in images (javagl, Nov 4, 2016)
- `13414b4` Next refactoring steps of glTF tutorial (javagl, Nov 7, 2016)
- `4de3dcc` Fixed navigation links (javagl, Nov 7, 2016)
- `a9921ea` Refactoring of tutorial, mainly related to materials (javagl, Nov 14, 2016)
- `692af8e` Continued with materials section (javagl, Nov 15, 2016)
- `96db262` Continued with "Advanced Material". Started cameras. (javagl, Nov 17, 2016)
- `000d921` Added cameras. Started simple texture example. (javagl, Nov 27, 2016)
- `3a1112b` Updates for materials (javagl, Dec 9, 2016)
- `e2eaf9c` Updated links (javagl, Dec 9, 2016)
- `e35949f` Minor restructuring before adding skins (javagl, Dec 10, 2016)
- `2f61612` Updated links and image captions (javagl, Dec 11, 2016)
- `b190766` Updated skin images (javagl, Dec 11, 2016)
- `e73f143` Added skinning sections (javagl, Dec 14, 2016)
- `dfb0979` Finished skinning. Minor cleanups. (javagl, Dec 19, 2016)
- `4b275d0` Copy of @slchow edits, attempt to fix git branch/pull request. (emackey, Jan 6, 2017)
- `2e56f56` Merge pull request #1 from emackey/slchow-edits (javagl, Jan 6, 2017)
- `add4488` Minor fixes based on copyedit comments (javagl, Jan 12, 2017)
- `baea61e` Updated content pipeline image (javagl, Jan 12, 2017)
- `cfeef19` Minor fix based on copyedit comment (javagl, Jan 12, 2017)
- `b5a0f84` Aligned tutorial with updated sample models (javagl, Jan 12, 2017)
- `b1dadea` First pass of update for 2.0 (javagl, Apr 9, 2017)
- `96b4459` Updated sections 8 and 9 for 2.0 (javagl, Apr 9, 2017)
- `47e9669` Update gltfTutorial_005_BuffersBufferViewsAccessors.md (PeakFish, Jul 9, 2017)
- `d357a07` Merge pull request #2 from PeakFish/patch-1 (javagl, Jul 9, 2017)
- `cc8b428` Update gltfTutorial_007_Animations.md (PeakFish, Jul 12, 2017)
- `fced2e1` Merge pull request #3 from PeakFish/patch-1 (javagl, Jul 13, 2017)
- `a5cc729` Added a minor fix to the 4th tutorial (ambient-seclusion, Sep 7, 2017)
- `35f0dcd` Merge pull request #4 from LT-Kerrigan/master (javagl, Sep 7, 2017)
- `68d627b` Minor fix of broken anchor to 8th tutorial (ambient-seclusion, Sep 7, 2017)
- `8c19bcc` Merge pull request #5 from LT-Kerrigan/master (javagl, Sep 7, 2017)
- `5dffe82` Updated links to point to 2.0 specification (javagl, Oct 12, 2017)
- `42f113c` Updated section about interleaved data (javagl, Oct 14, 2017)
- `92b78b0` Updates of Materials and Cameras for 2.0 (javagl, Oct 16, 2017)
- `34b2af2` Updated first part of skinning for 2.0 (javagl, Oct 17, 2017)
- `ddf5cd8` Update gltfTutorial_020_Skins.md (alexchicn, Oct 31, 2017)
- `73cb6fc` Merge pull request #6 from alexchicn/patch-1 (javagl, Oct 31, 2017)
- `7762dbc` Added section about sparse accessors (javagl, Nov 4, 2017)
- `d40b5ee` Added note about infinite projection matrices (javagl, Nov 4, 2017)
- `f72209c` Updated skins sections for 2.0 (javagl, Nov 4, 2017)
- `6b9b245` Minor fix in animation sample (javagl, Nov 5, 2017)
- `9f55617` Added initial draft of morph targets section (javagl, Nov 7, 2017)
- `848d8ab` Finalized Morph and Skin. Added Advanced Material. (javagl, Nov 12, 2017)
- `8b520ae` Fixes for images and image links (javagl, Nov 12, 2017)
- `82349ca` Copyedit (slchow, Nov 23, 2017)
- `4afe369` Merge pull request #7 from slchow/master (javagl, Nov 26, 2017)
**gltfTutorial/README.md** (18 additions, 0 deletions)
# glTF Tutorial
> **Review comment (Member):** Perhaps add an Acknowledgements section to the bottom of this .md file and add everyone who contributes content, provides feedback, edits, etc. This may help encourage contributions and tie together the glTF community.

> **Review comment (Member):** Perhaps rename this to something like "Introduction to glTF using WebGL"


This tutorial gives an introduction to [glTF](https://www.khronos.org/gltf), the GL transmission format. It summarizes the most important features and application cases of glTF, and describes the structure of the files that are related to glTF. It explains how glTF files may be read, processed and used to display 3D graphics efficiently.

Some basic knowledge about [JSON](http://json.org/), the JavaScript Object Notation, and about [OpenGL](https://www.khronos.org/opengl/) is required. Where appropriate, the related concepts of OpenGL will be explained quickly, usually with examples in [WebGL](https://www.khronos.org/webgl/).
> **Review comment (Member):** This is a bit confusing worded as is. Perhaps say something like "knowledge of a graphics API...examples will be in WebGL."


- [Introduction](gltfTutorial_001_Introduction.md)
> **Review comment (Member):** Can each .md file link to the previous, next, and contents?

- [Basic glTF structure](gltfTutorial_002_BasicGltfStructure.md)
- [Scene structure](gltfTutorial_003_SceneStructure.md)
- [Scenes, nodes, cameras and animations](gltfTutorial_004_ScenesNodesCamerasAnimations.md)
- [Meshes, textures, images and samplers](gltfTutorial_005_MeshesTexturesImagesSamplers.md)
- [Materials, techniques, programs and shaders](gltfTutorial_006_MaterialsTechniquesProgramsShaders.md)
- [Buffers, bufferViews and accessors](gltfTutorial_007_BuffersBufferViewsAccessors.md)
- Skins
- Extensions
> **Review comment (Contributor):** Reminder that Extensions and Summary aren't in yet.

> **Reply (Contributor, Author):** This as well as the TBD will likely not be part of the initial version. The extensions will be explained separately, and the Summary (locally) mainly contains code snippets for rendering with WebGL, which would go into a separate tutorial.

- Summary: Rendering a glTF asset with WebGL


**gltfTutorial/gltfTutorial_001_Introduction.md** (38 additions, 0 deletions)

## Introduction to glTF

There is an increasing number of applications and services that are based on 3D content. Online shops are offering product configurators with a 3D preview. Museums are digitizing their artifacts with 3D scans and allowing visitors to explore their collections in virtual galleries. City planners are using 3D city models for planning and information visualization. Educators are creating interactive, animated 3D models of the human body. In many cases, these applications run directly in the web browser, which is possible because all modern browsers support efficient, OpenGL-based rendering with WebGL.
> **Review comment (Member):** Here as well, the "OpenGL-based rendering with WebGL" is confusing.
>
> I suggest positioning statements like this as "graphics API in general (could be OpenGL, Vulkan, D3D, etc); WebGL examples for this tutorial" since this is the direction for glTF.


![applications](images/applications.png)
> **Review comment (Member):** It would be good to have captions for each figure, or label them as (a), (b), etc. so the descriptions can be connected.



### 3D content creation

The 3D content that is rendered in the client application comes from different sources. In some cases, the raw 3D data is obtained with a 3D scanner, and the resulting geometry data is stored as [OBJ](https://en.wikipedia.org/wiki/Wavefront_.obj_file), [PLY](https://en.wikipedia.org/wiki/PLY_(file_format)), or [STL](https://en.wikipedia.org/wiki/STL_(file_format)) files. These file formats only support simple geometric objects and do not contain information about how the objects should be rendered. Additionally, they cannot describe more complex 3D scenes that involve multiple objects.

Such sophisticated 3D scenes can be created with authoring tools. These tools allow editing the overall structure of the scene, the light setup, cameras, animations, and, of course, the 3D geometry of the objects that appear in the scene. Each authoring tool defines its own file format in which the 3D scenes are stored. For example, [Blender](https://www.blender.org/) stores the scenes in `.blend` files, [LightWave3D](https://www.lightwave3d.com/) uses the `.lws` file format, [3ds Max](http://www.autodesk.com/3dsmax) uses the `.max` file format, and [Maya](http://www.autodesk.com/maya) scenes are stored as `.ma` files.

For the exchange of data between 3D authoring applications, different standard file formats have been developed. For example, Autodesk has defined the [FBX](http://www.autodesk.com/products/fbx/overview) format. The Web3D Consortium has set up [VRML and X3D](http://www.web3d.org/standards) as international standards for web-based 3D graphics. And the [COLLADA](https://www.khronos.org/collada/) specification defines an XML-based data exchange format for 3D authoring tools. The [list of 3D graphics file formats on Wikipedia](https://en.wikipedia.org/wiki/List_of_file_formats#3D_graphics) shows that there are more than 70 different file formats for 3D data, serving different purposes and application cases.
> **Review comment (Member):** This paragraph is really long. Consider breaking it up into two or three paragraphs.
>
> If you have some spare time, I really love this book on writing: http://www.bartleby.com/141/

> **Review comment (Member):** This paragraph does a nice job of setting the context for glTF in the sea of 3D formats. It would be awesome to complement it with a figure that shows how formats are grouped/relate.


### 3D content rendering

Although there are so many different file formats for 3D data, there is only one open, truly platform-independent and versatile way of *rendering* 3D content: OpenGL. It is used for high-performance rendering applications on the desktop, on smartphones, and directly in the web browser, using WebGL. In order to render a 3D scene that was created with an authoring tool, the input data has to be parsed, and the 3D geometry data has to be converted into the format that is required by OpenGL. The 3D data has to be transferred to the graphics card memory, and then the rendering process can be described with shader programs and sequences of OpenGL API calls.
> **Review comment (Member):** I suggest repositioning this as described above.
>
> Set the stage that there is a gap between content tools and rendering content with graphics APIs (again, could be OpenGL, WebGL, Vulkan, D3D, etc.). Each game/graphics engine has solved this with their own format and their own toolchain. That is silly; we can move the field forward faster if we collaborate around an open standard - glTF



### 3D content delivery

There is a strong and constantly increasing demand for 3D content in various applications. Many of these applications receive their 3D content over the web, and in many cases the content should also be rendered directly in the browser, with WebGL. But until now, there has been a gap between the creation of 3D content and the efficient rendering of that content in the runtime applications:

![3D content creation and rendering](images/contentCreationAndRendering.png)

The existing file formats are not appropriate for this use case: Some of them do not contain any scene information, but only geometry data. Others have been designed for exchanging data between authoring applications, and their main goal is to retain as much information about the 3D scene as possible. As a result, the files are usually large, complex, and hard to parse. None of the existing file formats was designed for the use case of efficiently transferring 3D scenes over the web and rendering them as efficiently as possible with OpenGL.
> **Review comment (Member):** "are do not" -> "do not"

> **Review comment (Member):** "hard to parse" -> not just hard to parse; they are hard to fully process so that we can generate draw calls. For example, COLLADA has polygons that would need to be triangulated at runtime.



### glTF: A transmission format for 3D scenes

The goal of glTF is to define a standard for the efficient transfer of 3D scenes that can be rendered with OpenGL. So glTF is not "yet another file format". It is the definition of a *transmission* format for 3D scenes:

- The scene structure is described with JSON, which is very compact and can easily be parsed
- The 3D data of the objects is stored in a form that can directly be used by OpenGL, so there is no overhead for decoding or pre-processing the 3D data

Different content creation tools may now provide the 3D content in the glTF format, and an increasing number of client applications are able to consume and render glTF. So glTF may help to bridge the gap between content creation and rendering:

![3D content creation and rendering with glTF](images/contentCreationAndRenderingWithGltf.png)

The content creation tools may either provide glTF directly, or use one of the open-source conversion utilities like [obj2gltf](https://github.com/AnalyticalGraphicsInc/obj2gltf) or [COLLADA2GLTF](https://github.com/KhronosGroup/glTF/tree/master/COLLADA2GLTF) to convert legacy formats to glTF.
> **Review comment (Member):** I don't think we can imply that OBJ and COLLADA are legacy. They just serve a different purpose than glTF.

**gltfTutorial/gltfTutorial_002_BasicGltfStructure.md** (111 additions, 0 deletions)
## The basic structure of glTF

The core of glTF is a JSON file. This file describes the whole contents of the 3D scene. It consists of a description of the scene structure itself, which is given by a hierarchy of nodes that define a scene graph. The 3D objects that appear in the scene are defined using meshes that are attached to the nodes. Animations describe how the 3D objects are transformed (e.g., rotated or translated) over time. Materials and textures define the appearance of the objects, and cameras describe the view configuration for the renderer.
> **Review comment (Member):** There are also skins. Up to you if you think they are worth a mention here.

> **Review comment (Member):** "Materials and textures" could just be "Materials" since materials imply techniques/textures/programs/shaders/etc. Really up to you.


## The JSON structure

The scene objects are stored in dictionaries in the JSON file. They can be accessed using an ID, which is the key of the dictionary:

```javascript
"meshes": {
    "FirstExampleMeshId": { ... },
    "SecondExampleMeshId": { ... },
    "ThirdExampleMeshId": { ... }
}
```


These IDs are also used to define the *relationships* between the objects. The example above defines multiple meshes, and a node may refer to one of these meshes, using the mesh ID, to indicate that the mesh should be attached to this node:

```javascript
"nodes": {
    "FirstExampleNodeId": {
        "meshes": [
            "FirstExampleMeshId"
        ]
    },
    ...
}
```

More information about the top-level elements and the relationships between these elements will be given in the [Scene structure](gltfTutorial_003_SceneStructure.md) section.
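To illustrate how such ID-based references may be resolved, here is a small sketch in JavaScript. The `resolveMeshes` helper and the asset literal are hypothetical, reusing the example IDs from above; they are not part of glTF itself:

```javascript
// A (hypothetical) parsed glTF 1.0 asset, using the example IDs from above:
const gltf = {
  meshes: {
    FirstExampleMeshId: { name: "First example mesh" }
  },
  nodes: {
    FirstExampleNodeId: { meshes: ["FirstExampleMeshId"] }
  }
};

// Look up the mesh objects that are attached to a node, via their IDs:
function resolveMeshes(asset, nodeId) {
  const node = asset.nodes[nodeId];
  return (node.meshes || []).map(meshId => asset.meshes[meshId]);
}

const attached = resolveMeshes(gltf, "FirstExampleNodeId");
console.log(attached[0].name); // "First example mesh"
```

The same pattern applies to any other ID-based reference in the JSON, e.g. a `material` referring to a `technique`.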



## References to external data

The binary data, like the geometry and textures of the 3D objects, is usually not contained in the JSON file. Instead, it is stored in dedicated files, and the JSON part only contains links to these files. This allows the binary data to be stored in a form that is very compact and can efficiently be transferred over the web. Additionally, the data can be stored in a format that can be used directly in OpenGL, without having to parse, decode, or preprocess the data.

![glTF structure](images/gltfStructure.png)

As shown in the image above, there are three types of objects that may contain such links to external resources, namely `buffers`, `images` and `shaders`. These objects will later be explained in more detail.
> **Review comment (Member):** Skins too.

> **Review comment (Member):** Ah, I think buffers cover skins.




### Reading and managing external data

Reading and processing a glTF asset starts with parsing the JSON structure. After the structure has been parsed, the `buffers`, `images` and `shaders` are available as dictionaries. The keys of these dictionaries are IDs, and the values are the [`buffer`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-buffer), [`image`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-image) and [`shader`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-shader) objects, respectively.

Each of these objects contains links, in the form of URIs, that point to external resources. For further processing, this data has to be read into memory. Usually it will be stored in a dictionary (or map) data structure, so that it may be looked up using the ID of the object that it belongs to.
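As a sketch of this bookkeeping, the following (hypothetical) helper reads each referenced resource into a map keyed by the object ID. The names `readExternalData` and `readUri`, and the in-memory "files", are illustrative assumptions; in a real application `readUri` would be whatever I/O mechanism is used (an `XMLHttpRequest`, a file read, etc.):

```javascript
// Read the external resource of each object in a glTF dictionary
// (e.g. "buffers" or "images") into a map, keyed by the object ID.
function readExternalData(dictionary, readUri) {
  const dataById = new Map();
  for (const id of Object.keys(dictionary)) {
    dataById.set(id, readUri(dictionary[id].uri));
  }
  return dataById;
}

// Hypothetical example: two buffers whose URIs resolve to raw byte arrays.
const buffers = {
  buffer01: { uri: "buffer01.bin" },
  buffer02: { uri: "buffer02.bin" }
};
const fakeFiles = {
  "buffer01.bin": new Uint8Array([1, 2, 3]),
  "buffer02.bin": new Uint8Array([4, 5])
};
const bufferData = readExternalData(buffers, uri => fakeFiles[uri]);
console.log(bufferData.get("buffer01").length); // 3
```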


### Binary data in `buffers`

A [`buffer`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-buffer) contains a URI that points to a file containing the raw, binary buffer data:

```javascript
"buffer01": {
    "byteLength": 12352,
    "type": "arraybuffer",
    "uri": "buffer01.bin"
}
```

This binary data has no inherent meaning or structure. It is only a compact representation of data that can efficiently be transferred over the web, and then be interpreted in different ways. For example, it may later be interpreted as geometry, skinning, or animation data. For the case of geometry data, it can directly be passed to an OpenGL-based renderer, without having to decode or pre-process it. Until then, it is just a raw block of memory that is read from the URI of the `buffer`:

> **Implementation note:**

> The data that is read for one buffer may combine different data blocks. It is therefore important to store the data in a way that later allows obtaining *parts* or *segments* of the whole buffer:

> * In **JavaScript**, the buffer data is received as the response of sending an [`XMLHttpRequest`](https://developer.mozilla.org/en/docs/Web/API/XMLHttpRequest) to the URI. It will be an [`ArrayBuffer`](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer). Parts of this buffer may then be obtained by creating a [`TypedArray`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray) from the desired part of the `ArrayBuffer`.

> * In **Java**, the buffer data may be read from the [`InputStream`](https://docs.oracle.com/javase/8/docs/api/java/io/InputStream.html) of the URI. It may be stored in a [`ByteBuffer`](https://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html). In order to easily pass the data to OpenGL, this should be a *direct* `ByteBuffer`. Typed views on parts of the buffer may then be created from this buffer, for example, as a [`FloatBuffer`](https://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html#asFloatBuffer--).

> * In **C++**, the data is read from an [`std::istream`](http://www.cplusplus.com/reference/istream/istream/) and stored in an [`std::vector<uint8_t>`](http://www.cplusplus.com/reference/vector/vector/). The raw data pointer of this vector may then be used to construct vectors that represent parts of the buffer.
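For the JavaScript case, obtaining a typed view on a *segment* of a raw buffer may look like the following sketch. The offsets and values are made up for illustration; in practice the `ArrayBuffer` would be the response received for the buffer URI:

```javascript
// A raw block of 24 bytes, standing in for the data read from the buffer URI:
const buffer = new ArrayBuffer(24);

// Pretend that bytes 8..20 contain three 32-bit floats:
new Float32Array(buffer, 8, 3).set([1.0, 2.0, 3.0]);

// A consumer can later create a typed view on exactly that segment,
// without copying the underlying data:
const byteOffset = 8;
const count = 3;
const segment = new Float32Array(buffer, byteOffset, count);
console.log(Array.from(segment)); // [1, 2, 3]
```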



### Image data in `images`

An [`image`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-image) refers to an external image file that can be used as the texture of a rendered object:

```javascript
"image01": {
    "uri": "image01.png"
}
```

The reference is given as a URI that usually points to a PNG or JPG file. These formats significantly reduce the size of the files, so that they may efficiently be transferred over the web, and they still can be decoded quickly and easily.

> **Implementation note:**

> * In **JavaScript**, the data will be an [`ImageData`](https://developer.mozilla.org/en-US/docs/Web/API/ImageData) that is obtained from a [`CanvasRenderingContext2D`](https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/getImageData) after setting the image URI as the source of the [`Image`](https://developer.mozilla.org/en-US/docs/Web/API/HTMLImageElement/Image) that was rendered into the render context. See [this stackoverflow answer](http://stackoverflow.com/a/17591386/) for an example.
> * In **Java**, the data will be a [`ByteBuffer`](https://docs.oracle.com/javase/8/docs/api/java/nio/ByteBuffer.html) that is filled with the pixel data from an image that was read with [`ImageIO`](https://docs.oracle.com/javase/8/docs/api/javax/imageio/ImageIO.html) from the [`InputStream`](https://docs.oracle.com/javase/8/docs/api/java/io/InputStream.html) of the URI.
> * In **C++**, it can be an [`std::vector<uint8_t>`](http://www.cplusplus.com/reference/vector/vector/) that is filled with the pixel data of an image. The data is read from an [`std::istream`](http://www.cplusplus.com/reference/istream/istream/) that either points to a file or to a URI and decoded using a C++ image loading library.

> There are some caveats regarding the *format* of this pixel data. It has to be in a form that can be passed to OpenGL. Some libraries contain utility methods for reading and converting the pixel data from images. The section about [Textures, Images and Samplers](gltfTutorial_005_MeshesTexturesImagesSamplers.md) contains more information about the image formats, and how the images are passed to OpenGL.



### GLSL shader data in `shaders`

A GLSL vertex or fragment [`shader`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-shader) that should be used for rendering the 3D objects contains a URI that points to a file with the shader source code:

```javascript
"fragmentShader01": {
    "type": 35632,
    "uri": "fragmentShader01.glsl"
}
```

The shader source code is stored as plain text, so that it can directly be compiled with OpenGL.
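A minimal sketch of compiling such a shader source with WebGL, assuming a valid rendering context `gl` is available. The `compileShader` helper is illustrative, not part of glTF; it follows the usual WebGL pattern, and real loaders may handle errors differently:

```javascript
// Compile a shader of the given type (e.g. gl.FRAGMENT_SHADER, which is
// the constant 35632 seen in the JSON above) from its source string.
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    const log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error("Shader compilation failed: " + log);
  }
  return shader;
}
```

The shader source itself would be the plain text that was read from the `uri` of the `shader` object.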
**gltfTutorial/gltfTutorial_003_SceneStructure.md** (19 additions, 0 deletions)

# The glTF scene structure

The following image (adapted from the [glTF concepts section](https://github.com/KhronosGroup/glTF/tree/master/specification#concepts)) gives an overview of the top-level elements of a glTF asset:

![glTF JSON structure](images/gltfJsonStructure.png)


These elements are summarized here quickly, to give an overview, with links to the respective sections of the glTF specification. More detailed explanations of the relationships between these elements will be given in the following sections.

- The [`scene`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-scene) is the entry point for the description of the scene that is stored in the glTF. It refers to the `node`s that define the scene graph.
- The [`node`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-node) is one node in the scene graph hierarchy. It can contain a transformation (like rotation or translation), and it may refer to further (child) nodes. Additionally, it may refer to `mesh` or `camera` instances that are "attached" to the node, or to a `skin` that describes a mesh deformation.
- The [`camera`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-camera) defines the view configuration for rendering the scene.
- A [`mesh`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-mesh) describes a geometric object that appears in the scene. It refers to `accessor` objects that are used for accessing the actual geometry data, and to `material`s that define the appearance of the object when it is rendered.
- The [`skin`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-skin) defines parameters that are required for vertex skinning, which allows the deformation of a mesh based on the pose of a virtual character. The values of these parameters are obtained from an `accessor`.
- An [`animation`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-animation) describes how transformations of certain nodes (like rotation or translation) change over time.
- The [`accessor`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-accessor) is used as an abstract source of arbitrary data. It is used by the `mesh`, `skin` and `animation`, and provides the geometry data, the skinning parameters and the time-dependent animation values. It refers to a [`bufferView`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-bufferView), which is a part of a [`buffer`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-buffer) that contains the actual raw binary data.
- The [`material`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-material) contains the parameters that define the appearance of an object. It can refer to a `texture` that will be applied to the object, and it refers to the `technique` for rendering an object with the given material. The [`technique`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-technique) refers to a [`program`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-program) that contains the OpenGL GLSL vertex- and fragment [`shader`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-shader)s that are used for rendering the object.
- The [`texture`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-texture) is defined by a [`sampler`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-sampler) and an [`image`](https://github.com/KhronosGroup/glTF/tree/master/specification#reference-image). The `sampler` defines how the texture `image` should be placed on the object.
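To make the scene graph part of this summary concrete, the following sketch traverses the node hierarchy of a hypothetical, heavily simplified asset, visiting each node and its children recursively. The IDs and the `traverse` helper are made up for illustration:

```javascript
// A (hypothetical) minimal asset: one scene with a root node and two children.
const asset = {
  scenes: { defaultScene: { nodes: ["root"] } },
  nodes: {
    root: { children: ["childA", "childB"] },
    childA: { children: [] },
    childB: { children: [] }
  }
};

// Depth-first traversal of the node hierarchy, starting at a given node ID:
function traverse(asset, nodeId, visit) {
  visit(nodeId, asset.nodes[nodeId]);
  for (const childId of asset.nodes[nodeId].children || []) {
    traverse(asset, childId, visit);
  }
}

const visited = [];
for (const rootId of asset.scenes.defaultScene.nodes) {
  traverse(asset, rootId, id => visited.push(id));
}
console.log(visited); // ["root", "childA", "childB"]
```

A real renderer would additionally accumulate each node's local transformation along the way, to obtain the global transformation of the attached meshes and cameras.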