---
title: AudioWorklet
slug: Web/API/AudioWorklet
page-type: web-api-interface
browser-compat: api.AudioWorklet
---
{{APIRef("Web Audio API")}}{{securecontext_header}}
The **`AudioWorklet`** interface of the [Web Audio API](/en-US/docs/Web/API/Web_Audio_API) is used to supply custom audio processing scripts that execute in a separate thread to provide very low latency audio processing.
The worklet's code is run in the {{domxref("AudioWorkletGlobalScope")}} global execution context, using a separate Web Audio thread which is shared by the worklet and other audio nodes.
Access the audio context's instance of `AudioWorklet` through the {{domxref("BaseAudioContext.audioWorklet")}} property.
{{InheritanceDiagram}}
## Instance properties
_The `AudioWorklet` interface does not define any properties of its own, but does inherit properties of {{domxref("Worklet")}}._
## Instance methods
_This interface inherits methods from {{domxref('Worklet')}}. The `AudioWorklet` interface does not define any methods of its own._
## Events
_`AudioWorklet` has no events to which it responds._
## Examples
See {{domxref("AudioWorkletNode")}} for complete examples of custom audio node creation.
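For instance, a minimal sketch (the module URL `noise-processor.js` and the processor name `white-noise-processor` are hypothetical) might look like this:
```js
const audioContext = new AudioContext();
// Load the module that registers the processor via registerProcessor().
await audioContext.audioWorklet.addModule("noise-processor.js");
// Create a node backed by that processor and connect it to the output.
const noiseNode = new AudioWorkletNode(audioContext, "white-noise-processor");
noiseNode.connect(audioContext.destination);
```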
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("AudioWorkletGlobalScope")}} — the global execution context of an `AudioWorklet`
- [Web Audio API](/en-US/docs/Web/API/Web_Audio_API)
- [Using the Web Audio API](/en-US/docs/Web/API/Web_Audio_API/Using_Web_Audio_API)
- [Using AudioWorklet](/en-US/docs/Web/API/Web_Audio_API/Using_AudioWorklet)
---
title: PushEvent
slug: Web/API/PushEvent
page-type: web-api-interface
browser-compat: api.PushEvent
---
{{APIRef("Push API")}}{{SecureContext_Header}}
The **`PushEvent`** interface of the [Push API](/en-US/docs/Web/API/Push_API) represents a push message that has been received. This event is sent to the [global scope](/en-US/docs/Web/API/ServiceWorkerGlobalScope) of a {{domxref("ServiceWorker")}}. It contains the information sent from an application server to a {{domxref("PushSubscription")}}.
{{InheritanceDiagram}}
## Constructor
- {{domxref("PushEvent.PushEvent", "PushEvent()")}}
- : Creates a new `PushEvent` object.
## Instance properties
_Inherits properties from its parent, {{domxref("ExtendableEvent")}}. Additional properties:_
- {{domxref("PushEvent.data")}} {{ReadOnlyInline}}
- : Returns a reference to a {{domxref("PushMessageData")}} object containing data sent to the {{domxref("PushSubscription")}}.
## Instance methods
_Inherits methods from its parent, {{domxref("ExtendableEvent")}}_.
## Examples
The following example takes the data from a `PushEvent` and displays it in a notification.
```js
self.addEventListener("push", (event) => {
  if (!(self.Notification && self.Notification.permission === "granted")) {
    return;
  }
  const data = event.data?.json() ?? {};
  const title = data.title || "Something Has Happened";
  const message =
    data.message || "Here's something you might want to check out.";
  const icon = "images/new-notification.png";
  // The Notification() constructor is not available in a service worker;
  // show the notification via the registration instead, keeping the worker
  // alive until it has been displayed.
  event.waitUntil(
    self.registration.showNotification(title, {
      body: message,
      tag: "simple-push-demo-notification",
      icon,
    }),
  );
});
// Clicks are delivered to the service worker as "notificationclick" events.
self.addEventListener("notificationclick", (event) => {
  event.notification.close();
  event.waitUntil(
    clients.openWindow("https://example.blog.com/2015/03/04/something-new.html"),
  );
});
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Push API](/en-US/docs/Web/API/Push_API)
- [Service Worker API](/en-US/docs/Web/API/Service_Worker_API)
---
title: "PushEvent: data property"
short-title: data
slug: Web/API/PushEvent/data
page-type: web-api-instance-property
browser-compat: api.PushEvent.data
---
{{APIRef("Push API")}}{{SecureContext_Header}}
The `data` read-only property of the **`PushEvent`** interface returns a reference to a {{domxref("PushMessageData")}} object containing data sent to the {{domxref("PushSubscription")}}.
## Value
A {{domxref("PushMessageData")}} object, or `null` if no `data` member was passed when the event was initialized.
## Examples
The following example takes the data from a `PushEvent` and displays it in a notification.
```js
self.addEventListener("push", (event) => {
  if (!(self.Notification && self.Notification.permission === "granted")) {
    return;
  }
  const data = event.data?.json() ?? {};
  const title = data.title || "Something Has Happened";
  const message =
    data.message || "Here's something you might want to check out.";
  const icon = "images/new-notification.png";
  // The Notification() constructor is not available in a service worker;
  // show the notification via the registration instead, keeping the worker
  // alive until it has been displayed.
  event.waitUntil(
    self.registration.showNotification(title, {
      body: message,
      tag: "simple-push-demo-notification",
      icon,
    }),
  );
});
// Clicks are delivered to the service worker as "notificationclick" events.
self.addEventListener("notificationclick", (event) => {
  event.notification.close();
  event.waitUntil(
    clients.openWindow("https://example.blog.com/2015/03/04/something-new.html"),
  );
});
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "PushEvent: PushEvent() constructor"
short-title: PushEvent()
slug: Web/API/PushEvent/PushEvent
page-type: web-api-constructor
browser-compat: api.PushEvent.PushEvent
---
{{APIRef("Push API")}}{{SecureContext_Header}}
The **`PushEvent()`** constructor creates a new
{{domxref("PushEvent")}} object. Note that this constructor is exposed only to a
service worker context.
## Syntax
```js-nolint
new PushEvent(type)
new PushEvent(type, options)
```
### Parameters
- `type`
- : A string with the name of the event.
It is case-sensitive and browsers set it to `push` or `pushsubscriptionchange`.
- `options` {{optional_inline}}
- : An object that, _in addition to the properties defined in {{domxref("ExtendableEvent/ExtendableEvent", "ExtendableEvent()")}}_, can have the following properties:
- `data`
- : The data you want the `PushEvent` to contain, if any.
When the constructor is invoked, the {{domxref("PushEvent.data")}} property of the resulting object will be set
to a new {{domxref("PushMessageData")}} object containing these bytes.
### Return value
A new {{domxref("PushEvent")}} object.
## Examples
```js
const dataInit = {
data: "Some sample text",
};
const myPushEvent = new PushEvent("push", dataInit);
myPushEvent.data.text(); // should return 'Some sample text'
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Push API](/en-US/docs/Web/API/Push_API)
- [Service Worker API](/en-US/docs/Web/API/Service_Worker_API)
---
title: EventCounts
slug: Web/API/EventCounts
page-type: web-api-interface
browser-compat: api.EventCounts
---
{{APIRef("Performance API")}}
The **`EventCounts`** interface of the [Performance API](/en-US/docs/Web/API/Performance_API) provides the number of events that have been dispatched for each event type.
An `EventCounts` instance is a read-only [`Map`-like object](/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map#map-like_browser_apis), in which each key is the name string for an event type, and the corresponding value is an integer indicating the number of events that have been dispatched for that event type.
## Constructor
This interface has no constructor. You typically get an instance of this object using the {{domxref("performance.eventCounts")}} property.
## Instance properties
- `size`
- : See {{jsxref("Map.prototype.size")}} for details.
## Instance methods
- `entries()`
- : See {{jsxref("Map.prototype.entries()")}} for details.
- `forEach()`
- : See {{jsxref("Map.prototype.forEach()")}} for details.
- `get()`
- : See {{jsxref("Map.prototype.get()")}} for details.
- `has()`
- : See {{jsxref("Map.prototype.has()")}} for details.
- `keys()`
- : See {{jsxref("Map.prototype.keys()")}} for details.
- `values()`
- : See {{jsxref("Map.prototype.values()")}} for details.
## Examples
### Working with EventCount maps
Below are a few examples to get information from an `EventCounts` map. Note that the map is read-only and the `clear()`, `delete()`, and `set()` methods aren't available.
```js
for (const [type, count] of performance.eventCounts.entries()) {
  console.log(`${type}: ${count} events`);
}
const clickCount = performance.eventCounts.get("click");
const isExposed = performance.eventCounts.has("mousemove");
const exposedEventsCount = performance.eventCounts.size;
const exposedEventsList = [...performance.eventCounts.keys()];
```
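You can also iterate over all entries with `forEach()`; for example, to log each exposed event type with its count (the callback receives the value first, then the key, as with {{jsxref("Map.prototype.forEach()")}}):
```js
performance.eventCounts.forEach((count, type) => {
  console.log(`${type}: ${count}`);
});
```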
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("performance.eventCounts")}}
- {{domxref("PerformanceEventTiming")}}
- {{jsxref("Map")}}
---
title: OffscreenCanvas
slug: Web/API/OffscreenCanvas
page-type: web-api-interface
browser-compat: api.OffscreenCanvas
---
{{APIRef("Canvas API")}}
When using the {{HtmlElement("canvas")}} element or the [Canvas API](/en-US/docs/Web/API/Canvas_API), rendering, animation, and user interaction usually happen on the main execution thread of a web application.
The computation relating to canvas animations and rendering can have a significant impact on application performance.
The **`OffscreenCanvas`** interface provides a canvas that can be rendered off screen, decoupling the DOM and the Canvas API so that the {{HtmlElement("canvas")}} element is no longer entirely dependent on the DOM.
Rendering operations can also be run inside a [worker](/en-US/docs/Web/API/Web_Workers_API) context, allowing you to run some tasks in a separate thread and avoid heavy work on the main thread.
`OffscreenCanvas` is a [transferable object](/en-US/docs/Web/API/Web_Workers_API/Transferable_objects).
{{AvailableInWorkers}}
{{InheritanceDiagram}}
## Constructors
- {{domxref("OffscreenCanvas.OffscreenCanvas", "OffscreenCanvas()")}}
- : `OffscreenCanvas` constructor. Creates a new `OffscreenCanvas` object.
## Instance properties
- {{domxref("OffscreenCanvas.height")}}
- : The height of the offscreen canvas.
- {{domxref("OffscreenCanvas.width")}}
- : The width of the offscreen canvas.
## Instance methods
- {{domxref("OffscreenCanvas.getContext()")}}
- : Returns a rendering context for the offscreen canvas.
- {{domxref("OffscreenCanvas.convertToBlob()")}}
- : Creates a {{domxref("Blob")}} object representing the image contained in the canvas.
- {{domxref("OffscreenCanvas.transferToImageBitmap()")}}
- : Creates an {{domxref("ImageBitmap")}} object from the most recently rendered image of the `OffscreenCanvas`. See the {{domxref("OffscreenCanvas.transferToImageBitmap()", "API description")}} for important notes on managing this {{domxref("ImageBitmap")}}.
## Examples
### Synchronous display of frames produced by an `OffscreenCanvas`
One way to use the `OffscreenCanvas` API is to use a rendering context that has been obtained from an `OffscreenCanvas` object to generate new frames. Once a new frame has finished rendering in this context, the {{domxref("OffscreenCanvas.transferToImageBitmap", "transferToImageBitmap()")}} method can be called to save the most recent rendered image. This method returns an {{domxref("ImageBitmap")}} object, which can be used in a variety of Web APIs and also in a second canvas without creating a transfer copy.
To display the `ImageBitmap`, you can use an {{domxref("ImageBitmapRenderingContext")}} context, which can be created by calling `canvas.getContext("bitmaprenderer")` on a (visible) canvas element. This context only provides functionality to replace the canvas's contents with the given `ImageBitmap`. A call to {{domxref("ImageBitmapRenderingContext.transferFromImageBitmap()")}} with the previously rendered and saved `ImageBitmap` from the `OffscreenCanvas` will display the `ImageBitmap` on the canvas and transfer its ownership to the canvas. A single `OffscreenCanvas` may transfer frames into an arbitrary number of other `ImageBitmapRenderingContext` objects.
Given these two {{HTMLElement("canvas")}} elements
```html
<canvas id="one"></canvas> <canvas id="two"></canvas>
```
the following code will provide the rendering using `OffscreenCanvas` as described above.
```js
const one = document.getElementById("one").getContext("bitmaprenderer");
const two = document.getElementById("two").getContext("bitmaprenderer");
const offscreen = new OffscreenCanvas(256, 256);
const gl = offscreen.getContext("webgl");
// Perform some drawing for the first canvas using the gl context
const bitmapOne = offscreen.transferToImageBitmap();
one.transferFromImageBitmap(bitmapOne);
// Perform some more drawing for the second canvas
const bitmapTwo = offscreen.transferToImageBitmap();
two.transferFromImageBitmap(bitmapTwo);
```
### Asynchronous display of frames produced by an `OffscreenCanvas`
Another way to use the `OffscreenCanvas` API is to call {{domxref("HTMLCanvasElement.transferControlToOffscreen", "transferControlToOffscreen()")}} on a {{HTMLElement("canvas")}} element on the main thread, which returns an `OffscreenCanvas` object from the {{domxref("HTMLCanvasElement")}} that can then be transferred to a [worker](/en-US/docs/Web/API/Web_Workers_API). Calling {{domxref("OffscreenCanvas.getContext", "getContext()")}} will then obtain a rendering context from that `OffscreenCanvas`.
The `main.js` script (main thread) may look like this:
```js
const htmlCanvas = document.getElementById("canvas");
const offscreen = htmlCanvas.transferControlToOffscreen();
const worker = new Worker("offscreencanvas.js");
worker.postMessage({ canvas: offscreen }, [offscreen]);
```
While the `offscreencanvas.js` script (worker thread) can look like this:
```js
onmessage = (evt) => {
const canvas = evt.data.canvas;
const gl = canvas.getContext("webgl");
// Perform some drawing using the gl context
};
```
It's also possible to use {{domxref("Window.requestAnimationFrame", "requestAnimationFrame()")}} in workers:
```js
onmessage = (evt) => {
const canvas = evt.data.canvas;
const gl = canvas.getContext("webgl");
function render(time) {
// Perform some drawing using the gl context
requestAnimationFrame(render);
}
requestAnimationFrame(render);
};
```
For a full example, see the [OffscreenCanvas example source](https://github.com/mdn/dom-examples/tree/main/web-workers/offscreen-canvas-worker) on GitHub or run the [OffscreenCanvas example live](https://mdn.github.io/dom-examples/web-workers/offscreen-canvas-worker/).
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("CanvasRenderingContext2D")}}
- {{domxref("OffscreenCanvasRenderingContext2D")}}
- {{domxref("ImageBitmap")}}
- {{domxref("ImageBitmapRenderingContext")}}
- {{domxref("HTMLCanvasElement.transferControlToOffscreen()")}}
- {{domxref("Window.requestAnimationFrame()", "requestAnimationFrame()")}}
- [WebGL Off the Main Thread – Mozilla Hacks](https://hacks.mozilla.org/2016/01/webgl-off-the-main-thread/) (2016)
---
title: "OffscreenCanvas: getContext() method"
short-title: getContext()
slug: Web/API/OffscreenCanvas/getContext
page-type: web-api-instance-method
browser-compat: api.OffscreenCanvas.getContext
---
{{APIRef("Canvas API")}}
The **`OffscreenCanvas.getContext()`** method returns a drawing context for an offscreen canvas, or [`null`](/en-US/docs/Web/JavaScript/Reference/Operators/null) if the context identifier is not supported.
## Syntax
```js-nolint
getContext(contextType, contextAttributes)
```
### Parameters
- `contextType`
- : A string containing the context identifier defining the drawing context associated with the canvas. Possible values are:
- `2d`
- : Creates a {{domxref("OffscreenCanvasRenderingContext2D")}} object representing a two-dimensional rendering context.
- `webgl`
- : Creates a {{domxref("WebGLRenderingContext")}} object representing a three-dimensional rendering context.
This context is only available on browsers that implement [WebGL](/en-US/docs/Web/API/WebGL_API) version 1 (OpenGL ES 2.0).
- `webgl2`
- : Creates a {{domxref("WebGL2RenderingContext")}} object representing a three-dimensional rendering context.
This context is only available on browsers that implement [WebGL](/en-US/docs/Web/API/WebGL_API) version 2 (OpenGL ES 3.0).
- `bitmaprenderer`
- : Creates a {{domxref("ImageBitmapRenderingContext")}} which only provides functionality to replace the content of the canvas with a given {{domxref("ImageBitmap")}}.
> **Note:** The identifiers **`"experimental-webgl"`** or **`"experimental-webgl2"`** are also used in implementations of WebGL.
> These implementations have not reached test suite conformance, or the graphics driver situation on the platform is not yet stable.
> The [Khronos Group](https://www.khronos.org/) certifies WebGL implementations under certain [conformance rules](https://www.khronos.org/registry/webgl/sdk/tests/CONFORMANCE_RULES.txt).
- `contextAttributes`
- : You can use several context attributes when creating your rendering context, for example:
```js
offscreen.getContext("webgl", { antialias: false, depth: false });
```
2d context attributes:
- `alpha`
- : Boolean that indicates if the canvas contains an alpha channel. If set to `false`, the browser knows that the backdrop is always opaque, which can speed up the drawing of transparent content and images.
- `willReadFrequently` {{non-standard_inline}} (Firefox only)
- : Boolean that indicates whether or not a lot of read-back operations are planned.
This will force the use of a software (instead of hardware accelerated) 2D canvas and can save memory when calling {{domxref("CanvasRenderingContext2D.getImageData", "getImageData()")}} frequently.
This option is only available if the flag `gfx.canvas.willReadFrequently.enable` is set to `true` (which, by default, is only the case for B2G/Firefox OS).
- `storage` {{non-standard_inline}} (Blink only)
- : String that indicates which storage is used ("persistent" by default).
WebGL context attributes:
- `alpha`
- : Boolean that indicates if the canvas contains an alpha buffer.
- `depth`
- : Boolean that indicates that the drawing buffer is requested to have a depth buffer of at least 16 bits.
- `stencil`
- : Boolean that indicates that the drawing buffer is requested to have a stencil buffer of at least 8 bits.
- `antialias`
- : Boolean that indicates whether or not to perform anti-aliasing if possible.
- `premultipliedAlpha`
- : Boolean that indicates that the page compositor will assume the drawing buffer contains colors with pre-multiplied alpha.
- `preserveDrawingBuffer`
- : If the value is true, the buffers will not be cleared and will preserve their values until cleared or overwritten by the author.
- `failIfMajorPerformanceCaveat`
- : Boolean that indicates whether context creation will fail if the system performance is low.
### Return value
A rendering context which is either a
- {{domxref("OffscreenCanvasRenderingContext2D")}} for `"2d"`,
- {{domxref("WebGLRenderingContext")}} for `"webgl"` and `"experimental-webgl"`,
- {{domxref("WebGL2RenderingContext")}} for `"webgl2"` and `"experimental-webgl2"` {{experimental_inline}}, or
- {{domxref("ImageBitmapRenderingContext")}} for `"bitmaprenderer"`.
If the `contextType` doesn't match a possible drawing context, `null` is returned.
## Examples
```js
const offscreen = new OffscreenCanvas(256, 256);
const gl = offscreen.getContext("webgl");
gl; // WebGLRenderingContext
gl.canvas; // OffscreenCanvas
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- The interface defining this method: {{domxref("OffscreenCanvas")}}
- {{domxref("HTMLCanvasElement.getContext()")}}
- Available rendering contexts: {{domxref("CanvasRenderingContext2D")}}, {{domxref("WebGLRenderingContext")}}, {{domxref("WebGL2RenderingContext")}}, {{domxref("ImageBitmapRenderingContext")}}, and {{domxref("OffscreenCanvasRenderingContext2D")}}
---
title: "OffscreenCanvas: convertToBlob() method"
short-title: convertToBlob()
slug: Web/API/OffscreenCanvas/convertToBlob
page-type: web-api-instance-method
browser-compat: api.OffscreenCanvas.convertToBlob
---
{{APIRef("Canvas API")}}
The **`OffscreenCanvas.convertToBlob()`** method creates a {{domxref("Blob")}} object representing the image contained in the canvas.
The desired file format and image quality may be specified.
If the file format is not specified, or if the given format is not supported, then the data will be exported as `image/png`.
Browsers are required to support `image/png`; many will support additional formats including `image/jpeg` and `image/webp`.
The created image will have a resolution of 96dpi for file formats that support encoding resolution metadata.
## Syntax
```js-nolint
convertToBlob()
convertToBlob(options)
```
### Parameters
- `options` {{optional_inline}}
- : An object with the following properties:
- `type`
- : A string indicating the image format.
The default type is `image/png`; this image format will also be used if the specified type is not supported.
- `quality`
- : A {{jsxref("Number")}} between `0` and `1` indicating the image quality to be used when creating images using file formats that support lossy compression (such as `image/jpeg` or `image/webp`).
A user agent will use its default quality value if this option is not specified, or if the number is outside the allowed range.
### Return value
A {{jsxref("Promise")}} returning a {{domxref("Blob")}} object representing the image contained in the canvas.
### Exceptions
The promise may be rejected with the following exceptions:
- `InvalidStateError` {{domxref("DOMException")}}
- : The `OffscreenCanvas` is not detached; in other words, it is still associated with the DOM and not the current worker.
- `SecurityError` {{domxref("DOMException")}}
- : The canvas context mode is 2d and the bitmap is not origin-clean; at least some of its contents have or may have been loaded from a site other than the one from which the document itself was loaded.
- `IndexSizeError` {{domxref("DOMException")}}
- : The canvas bitmap has no pixels (either the horizontal or vertical dimension is zero).
- `EncodingError` {{domxref("DOMException")}}
- : The blob could not be created due to an encoding error.
## Examples
```js
const offscreen = new OffscreenCanvas(256, 256);
const gl = offscreen.getContext("webgl");
// Perform some drawing using the gl context
offscreen.convertToBlob().then((blob) => console.log(blob));
// Blob { size: 334, type: "image/png" }
```
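To request a different format and quality, pass an `options` object (a sketch; only `image/png` support is guaranteed, so the result may fall back to PNG):
```js
offscreen
  .convertToBlob({ type: "image/jpeg", quality: 0.8 })
  .then((blob) => console.log(blob.type));
```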
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- The interface defining this method, {{domxref("OffscreenCanvas")}}.
---
title: "OffscreenCanvas: transferToImageBitmap() method"
short-title: transferToImageBitmap()
slug: Web/API/OffscreenCanvas/transferToImageBitmap
page-type: web-api-instance-method
browser-compat: api.OffscreenCanvas.transferToImageBitmap
---
{{APIRef("Canvas API")}}
The **`OffscreenCanvas.transferToImageBitmap()`** method creates an {{domxref("ImageBitmap")}} object from the most recently rendered image of the `OffscreenCanvas`. The `OffscreenCanvas` allocates a new image for its subsequent rendering.
## Syntax
```js-nolint
transferToImageBitmap()
```
### Parameters
None.
### Return value
A newly-allocated {{domxref("ImageBitmap")}}.
This `ImageBitmap` references a potentially large graphics resource, and to ensure your web application remains robust, it is important to avoid allocating too many of these resources at any point in time. For this reason it is important to ensure that the `ImageBitmap` is either _consumed_ or _closed_.
As described in the {{domxref("OffscreenCanvas")}} examples, passing this `ImageBitmap` to {{domxref("ImageBitmapRenderingContext.transferFromImageBitmap()")}} _consumes_ the `ImageBitmap` object; it no longer references the underlying graphics resource, and can not be passed to any other web APIs.
If your goal is to pass the `ImageBitmap` to other web APIs which do not consume it - for example, {{domxref("CanvasRenderingContext2D.drawImage()")}} - then you should _close_ it when you're done with it by calling {{domxref("ImageBitmap.close()")}}. Don't simply drop the JavaScript reference to the `ImageBitmap`; doing so will keep its graphics resource alive until the next time the garbage collector runs.
If you call `transferToImageBitmap()` and don't intend to pass it to {{domxref("ImageBitmapRenderingContext.transferFromImageBitmap()")}}, consider whether you need to call `transferToImageBitmap()` at all. Many web APIs which accept `ImageBitmap` also accept `OffscreenCanvas` as an argument.
## Examples
```js
const offscreen = new OffscreenCanvas(256, 256);
const gl = offscreen.getContext("webgl");
// Perform some drawing using the gl context
const bitmap = offscreen.transferToImageBitmap();
// ImageBitmap { width: 256, height: 256 }
// Either:
// Pass this `ImageBitmap` to `ImageBitmapRenderingContext.transferFromImageBitmap`
// or:
// Use the `ImageBitmap` with other web APIs, and call `ImageBitmap.close()`!
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- The interface defining this method, {{domxref("OffscreenCanvas")}}
- {{domxref("ImageBitmapRenderingContext.transferFromImageBitmap")}}
---
title: "OffscreenCanvas: OffscreenCanvas() constructor"
short-title: OffscreenCanvas()
slug: Web/API/OffscreenCanvas/OffscreenCanvas
page-type: web-api-constructor
browser-compat: api.OffscreenCanvas.OffscreenCanvas
---
{{APIRef("Canvas API")}}
The **`OffscreenCanvas()`** constructor returns a newly instantiated {{domxref("OffscreenCanvas")}} object.
## Syntax
```js-nolint
new OffscreenCanvas(width, height)
```
### Parameters
- `width`
- : The width of the offscreen canvas.
- `height`
- : The height of the offscreen canvas.
## Examples
This example creates a new offscreen canvas using the `OffscreenCanvas()` constructor.
We then initialize a [WebGL](/en-US/docs/Web/API/WebGL_API) context on it using the {{domxref("OffscreenCanvas.getContext()", "getContext()")}} method.
```js
const offscreen = new OffscreenCanvas(256, 256);
const gl = offscreen.getContext("webgl");
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("OffscreenCanvas")}}, the interface this constructor belongs to
---
title: "OffscreenCanvas: width property"
short-title: width
slug: Web/API/OffscreenCanvas/width
page-type: web-api-instance-property
browser-compat: api.OffscreenCanvas.width
---
{{APIRef("Canvas API")}}
The **`width`** property returns and sets the width of an {{domxref("OffscreenCanvas")}} object.
## Value
An integer representing the width of the `OffscreenCanvas` object.
## Examples
Creating a new offscreen canvas and returning or setting the width of the offscreen canvas:
```js
const offscreen = new OffscreenCanvas(256, 256);
offscreen.width; // 256
offscreen.width = 512;
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("OffscreenCanvas")}}, the interface this property belongs to.
---
title: "OffscreenCanvas: height property"
short-title: height
slug: Web/API/OffscreenCanvas/height
page-type: web-api-instance-property
browser-compat: api.OffscreenCanvas.height
---
{{APIRef("Canvas API")}}
The **`height`** property returns and sets the height of an {{domxref("OffscreenCanvas")}} object.
## Value
An integer representing the height of the `OffscreenCanvas` object.
## Examples
Creating a new offscreen canvas and returning or setting the height of the offscreen canvas:
```js
const offscreen = new OffscreenCanvas(256, 256);
offscreen.height; // 256
offscreen.height = 512;
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("OffscreenCanvas")}}, the interface this property belongs to.
---
title: WebTransport API
slug: Web/API/WebTransport_API
page-type: web-api-overview
browser-compat: api.WebTransport
---
{{DefaultAPISidebar("WebTransport API")}}{{SecureContext_Header}}
The **WebTransport API** provides a modern update to {{domxref("WebSockets API", "WebSockets", "", "nocode")}}, transmitting data between client and server using [HTTP/3 Transport](https://datatracker.ietf.org/doc/html/draft-ietf-webtrans-http3/). WebTransport provides support for multiple streams, unidirectional streams, and out-of-order delivery. It enables reliable transport via {{domxref("Streams API", "streams", "", "nocode")}} and unreliable transport via UDP-like datagrams.
{{AvailableInWorkers}}
## Concepts and usage
[HTTP/3](https://en.wikipedia.org/wiki/HTTP/3) has been in progress since 2018. It is based on Google's QUIC protocol (which is itself based on UDP), and fixes several issues around the classic TCP protocol, on which HTTP and WebSockets are based.
These include:
- **Head-of-line blocking**
- : HTTP/2 allows multiplexing, so a single connection can stream multiple resources simultaneously. However, if a single resource fails, all other resources on that connection are held up until any missing packets are retransmitted. With QUIC, only the failing resource is affected.
- **Faster performance**
- : QUIC is more performant than TCP in many ways. QUIC can handle security features by itself, rather than handing responsibility off to other protocols like TLS — meaning fewer round trips. And streams provide better transport efficiency than the older packet mechanism. That can make a significant difference, especially on high-latency networks.
- **Better network transitions**
- : QUIC uses a unique connection ID to handle the source and destination of each request — to ensure that packets are delivered correctly. This ID can persist between different networks, meaning that, for example, a download can continue uninterrupted if you switch from Wi-Fi to a mobile network. HTTP/2, on the other hand, uses IP addresses as identifiers, so network transitions can be problematic.
- **Unreliable transport**
- : HTTP/3 supports unreliable data transmission via datagrams.
The WebTransport API provides low-level access to two-way communication via HTTP/3, taking advantage of the above benefits, and supporting both reliable and unreliable data transmission.
### Initial connection
To open a connection to an HTTP/3 server, you pass its URL to the {{domxref("WebTransport.WebTransport", "WebTransport()")}} constructor. Note that the scheme needs to be HTTPS, and the port number needs to be explicitly specified. Once the {{domxref("WebTransport.ready")}} promise fulfills, you can start using the connection.
Also note that you can respond to the connection closing by waiting for the {{domxref("WebTransport.closed")}} promise to fulfill. Errors returned by WebTransport operations are of type {{domxref("WebTransportError")}}, and contain additional data on top of the standard {{domxref("DOMException")}} set.
```js
const url = "https://example.com:4999/wt";
async function initTransport(url) {
// Initialize transport connection
const transport = new WebTransport(url);
// The connection can be used once ready fulfills
await transport.ready;
// ...
}
// ...
async function closeTransport(transport) {
// Respond to connection closing
try {
await transport.closed;
console.log(`The HTTP/3 connection to ${url} closed gracefully.`);
} catch (error) {
console.error(`The HTTP/3 connection to ${url} closed due to ${error}.`);
}
}
```
### Unreliable transmission via datagrams
"Unreliable" means that transmission of data is not guaranteed, nor is arrival in a specific order. This is fine in some situations and provides very fast delivery. For example, you might want to transmit regular game state updates where each message supersedes the last one that arrives, and order is not important.
Unreliable data transmission is handled via the {{domxref("WebTransport.datagrams")}} property — this returns a {{domxref("WebTransportDatagramDuplexStream")}} object containing everything you need to send datagrams to the server, and receive them back.
The {{domxref("WebTransportDatagramDuplexStream.writable")}} property returns a {{domxref("WritableStream")}} object that you can write data to using a writer, for transmission to the server:
```js
const writer = transport.datagrams.writable.getWriter();
const data1 = new Uint8Array([65, 66, 67]);
const data2 = new Uint8Array([68, 69, 70]);
writer.write(data1);
writer.write(data2);
```
The {{domxref("WebTransportDatagramDuplexStream.readable")}} property returns a {{domxref("ReadableStream")}} object that you can use to receive data from the server:
```js
async function readData() {
const reader = transport.datagrams.readable.getReader();
while (true) {
const { value, done } = await reader.read();
if (done) {
break;
}
// value is a Uint8Array.
console.log(value);
}
}
```
### Reliable transmission via streams
"Reliable" means that transmission and order of data are guaranteed. That provides slower delivery (albeit faster than with WebSockets), and is needed in situations where reliability and ordering are important (such as chat applications, for example).
When using reliable transmission via streams you can also set the relative priority of different streams over the same transport.
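For example, a sketch using the `sendOrder` option (where supported) to hint that data on one stream should be sent ahead of another's:
```js
// Streams with a higher sendOrder are given higher send priority.
const controlStream = await transport.createUnidirectionalStream({
  sendOrder: 10,
});
const bulkStream = await transport.createUnidirectionalStream({
  sendOrder: 1,
});
```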
### Unidirectional transmission
To open a unidirectional stream from a user agent, you use the {{domxref("WebTransport.createUnidirectionalStream()")}} method to get a reference to a {{domxref("WritableStream")}}. From this you can {{domxref("WritableStream.getWriter", "get a writer")}} to allow data to be written to the stream and sent to the server.
```js
async function writeData() {
const stream = await transport.createUnidirectionalStream();
const writer = stream.writable.getWriter();
const data1 = new Uint8Array([65, 66, 67]);
const data2 = new Uint8Array([68, 69, 70]);
writer.write(data1);
writer.write(data2);
try {
await writer.close();
console.log("All data has been sent.");
} catch (error) {
console.error(`An error occurred: ${error}`);
}
}
```
Note also the use of the {{domxref("WritableStreamDefaultWriter.close()")}} method to close the associated HTTP/3 stream once all data has been sent.
If the server opens a unidirectional stream to transmit data to the client, this can be accessed on the client via the {{domxref("WebTransport.incomingUnidirectionalStreams")}} property, which returns a {{domxref("ReadableStream")}} of {{domxref("WebTransportReceiveStream")}} objects. These can be used to read {{jsxref("Uint8Array")}} instances sent by the server.
In this case, the first thing to do is set up a function to read a `WebTransportReceiveStream`. These objects inherit from the `ReadableStream` class, so can be used in just the same way:
```js
async function readData(receiveStream) {
const reader = receiveStream.getReader();
while (true) {
const { done, value } = await reader.read();
if (done) {
break;
}
// value is a Uint8Array
console.log(value);
}
}
```
Next, call {{domxref("WebTransport.incomingUnidirectionalStreams")}} and get a reference to the reader available on the `ReadableStream` it returns, and then use the reader to read the data from the server. Each chunk is a `WebTransportReceiveStream`, and we use the `readData()` function set up earlier to read them:
```js
async function receiveUnidirectional() {
const uds = transport.incomingUnidirectionalStreams;
const reader = uds.getReader();
while (true) {
const { done, value } = await reader.read();
if (done) {
break;
}
// value is an instance of WebTransportReceiveStream
await readData(value);
}
}
```
### Bidirectional transmission
To open a bidirectional stream from a user agent, you use the {{domxref("WebTransport.createBidirectionalStream()")}} method to get a reference to a {{domxref("WebTransportBidirectionalStream")}}.
This contains `readable` and `writable` properties returning references to `WebTransportReceiveStream` and `WebTransportSendStream` instances that can be used to read from and write to the server.
> **Note:** `WebTransportBidirectionalStream` is similar to {{domxref("WebTransportDatagramDuplexStream")}}, except that in that interface the `readable` and `writable` properties are `ReadableStream` and `WritableStream` respectively.
```js
async function setUpBidirectional() {
const stream = await transport.createBidirectionalStream();
// stream is a WebTransportBidirectionalStream
// stream.readable is a WebTransportReceiveStream
const readable = stream.readable;
// stream.writable is a WebTransportSendStream
const writable = stream.writable;
  // ...
}
```
Reading from the `WebTransportReceiveStream` can then be done as follows:
```js
async function readData(readable) {
const reader = readable.getReader();
while (true) {
const { value, done } = await reader.read();
if (done) {
break;
}
// value is a Uint8Array.
console.log(value);
}
}
```
And writing to the `WebTransportSendStream` can be done like this:
```js
async function writeData(writable) {
const writer = writable.getWriter();
const data1 = new Uint8Array([65, 66, 67]);
const data2 = new Uint8Array([68, 69, 70]);
writer.write(data1);
writer.write(data2);
}
```
If the server opens a bidirectional stream to transmit data to and receive it from the client, this can be accessed via the {{domxref("WebTransport.incomingBidirectionalStreams")}} property, which returns a {{domxref("ReadableStream")}} of `WebTransportBidirectionalStream` objects. Each one can be used to read and write {{jsxref("Uint8Array")}} instances as shown above. However, as with the unidirectional example, you need an initial function to read the bidirectional stream in the first place:
```js
async function receiveBidirectional() {
const bds = transport.incomingBidirectionalStreams;
const reader = bds.getReader();
while (true) {
const { done, value } = await reader.read();
if (done) {
break;
}
// value is an instance of WebTransportBidirectionalStream
await readData(value.readable);
await writeData(value.writable);
}
}
```
## Interfaces
- {{domxref("WebTransport")}}
- : Provides functionality to enable a user agent to connect to an HTTP/3 server, initiate reliable and unreliable transport in either or both directions, and close the connection once it is no longer needed.
- {{domxref("WebTransportBidirectionalStream")}}
- : Represents a bidirectional stream created by a server or a client that can be used for reliable transport. Provides access to a {{domxref("ReadableStream")}} for reading incoming data, and a {{domxref("WritableStream")}} for writing outgoing data.
- {{domxref("WebTransportDatagramDuplexStream")}}
- : Represents a duplex stream that can be used for unreliable transport of datagrams between client and server. Provides access to a {{domxref("ReadableStream")}} for reading incoming datagrams, a {{domxref("WritableStream")}} for writing outgoing datagrams, and various settings and statistics related to the stream.
- {{domxref("WebTransportError")}}
- : Represents an error related to the WebTransport API, which can arise from server errors, network connection problems, or client-initiated abort operations (for example, arising from a {{domxref("WritableStream.abort()")}} call).
- {{domxref("WebTransportReceiveStream")}}
- : Provides streaming features for an incoming WebTransport unidirectional or bidirectional {{domxref("WebTransport")}} stream.
- {{domxref("WebTransportSendStream")}}
- : Provides streaming features for an outgoing WebTransport unidirectional or bidirectional {{domxref("WebTransport")}} stream.
## Examples
For complete examples, see:
- [WebTransport over HTTP/3 client](https://webtransport.day/)
- [WebTransport (BYOB) Echo with WebCodecs in Worker](https://webrtc.internaut.com/wc/wtSender4/)
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Using WebTransport](https://developer.chrome.com/docs/capabilities/web-apis/webtransport)
- {{domxref("WebSockets API", "WebSockets API", "", "nocode")}}
- {{domxref("Streams API", "Streams API", "", "nocode")}}
- [WebTransport over HTTP/3](https://datatracker.ietf.org/doc/html/draft-ietf-webtrans-http3/)
---
title: AudioData
slug: Web/API/AudioData
page-type: web-api-interface
status:
- experimental
browser-compat: api.AudioData
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`AudioData`** interface of the [WebCodecs API](/en-US/docs/Web/API/WebCodecs_API) represents an audio sample.
`AudioData` is a [transferable object](/en-US/docs/Web/API/Web_Workers_API/Transferable_objects).
## Description
An audio track consists of a stream of audio samples, each sample representing a captured moment of sound. An `AudioData` object is a representation of such a sample. Working alongside the interfaces of the [Insertable Streams API](/en-US/docs/Web/API/Insertable_Streams_for_MediaStreamTrack_API), you can break a stream into individual `AudioData` objects with {{domxref("MediaStreamTrackProcessor")}}, or construct an audio track from a stream of frames with {{domxref("MediaStreamTrackGenerator")}}.
> **Note:** Find out more about audio on the web in [Digital audio concepts](/en-US/docs/Web/Media/Formats/Audio_concepts).
### The media resource
An `AudioData` object contains a reference to an attached **media resource**. This media resource contains the actual audio sample data described by the object. A media resource is maintained by the user agent until it is no longer referenced by an `AudioData` object, for example when {{domxref("AudioData.close()")}} is called.
### Planes and audio format
To return the sample format of an `AudioData` use the {{domxref("AudioData.format")}} property. The format may be described as **interleaved** or **planar**. In interleaved formats, the audio samples from the different channels are laid out in a single buffer, described as a **plane**. This plane contains a number of elements equal to {{domxref("AudioData.numberOfFrames")}} \* {{domxref("AudioData.numberOfChannels")}}.
In planar format, the number of planes is equal to {{domxref("AudioData.numberOfChannels")}}, and each plane is a buffer containing a number of elements equal to {{domxref("AudioData.numberOfFrames")}}.
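As a sketch, assuming `audioData` holds samples in the `"f32-planar"` format, each plane's size in bytes is the frame count times four (the size of a 32-bit float):
```js
const planeBytes = audioData.allocationSize({ planeIndex: 0 });
console.log(planeBytes === audioData.numberOfFrames * 4); // true for "f32-planar"
```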
## Constructor
- {{domxref("AudioData.AudioData", "AudioData()")}} {{Experimental_Inline}}
- : Creates a new `AudioData` object.
## Instance properties
- {{domxref("AudioData.format")}} {{ReadOnlyInline}} {{Experimental_Inline}}
- : Returns the sample format of the audio.
- {{domxref("AudioData.sampleRate")}} {{ReadOnlyInline}} {{Experimental_Inline}}
- : Returns the sample rate of the audio in Hz.
- {{domxref("AudioData.numberOfFrames")}} {{ReadOnlyInline}} {{Experimental_Inline}}
- : Returns the number of frames.
- {{domxref("AudioData.numberOfChannels")}} {{ReadOnlyInline}} {{Experimental_Inline}}
- : Returns the number of audio channels.
- {{domxref("AudioData.duration")}} {{ReadOnlyInline}} {{Experimental_Inline}}
- : Returns the duration of the audio in microseconds.
- {{domxref("AudioData.timestamp")}} {{ReadOnlyInline}} {{Experimental_Inline}}
- : Returns the timestamp of the audio in microseconds.
## Instance methods
- {{domxref("AudioData.allocationSize()")}} {{Experimental_Inline}}
- : Returns the number of bytes required to hold the sample as filtered by options passed into the method.
- {{domxref("AudioData.copyTo()")}} {{Experimental_Inline}}
- : Copies the samples from the specified plane of the `AudioData` object to the destination.
- {{domxref("AudioData.clone()")}} {{Experimental_Inline}}
- : Creates a new `AudioData` object with reference to the same media resource as the original.
- {{domxref("AudioData.close()")}} {{Experimental_Inline}}
- : Clears all states and releases the reference to the media resource.
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: clone() method"
short-title: clone()
slug: Web/API/AudioData/clone
page-type: web-api-instance-method
status:
- experimental
browser-compat: api.AudioData.clone
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`clone()`** method of the {{domxref("AudioData")}} interface creates a new `AudioData` object with reference to the same media resource as the original.
## Syntax
```js-nolint
clone()
```
### Parameters
None.
### Return value
The cloned {{domxref("AudioData")}} object.
### Exceptions
- `InvalidStateError` {{domxref("DOMException")}}
- : Thrown if the `AudioData` object has been [transferred](/en-US/docs/Web/API/Web_Workers_API/Transferable_objects).
## Examples
The following example clones an existing `AudioData` object (`audioData`) as `audioData2`.
```js
const audioData2 = audioData.clone();
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: copyTo() method"
short-title: copyTo()
slug: Web/API/AudioData/copyTo
page-type: web-api-instance-method
status:
- experimental
browser-compat: api.AudioData.copyTo
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`copyTo()`** method of the {{domxref("AudioData")}} interface copies a plane of an `AudioData` object to a destination buffer.
## Syntax
```js-nolint
copyTo(destination, options)
```
### Parameters
- `destination`
- : An {{jsxref("ArrayBuffer")}}, a {{jsxref("TypedArray")}}, or a {{jsxref("DataView")}} to copy the plane to.
- `options`
- : An object containing the following:
- `planeIndex`
- : The index of the plane to copy from.
- `frameOffset` {{optional_inline}}
- : An integer giving an offset into the plane data indicating which frame to begin copying from. Defaults to `0`.
- `frameCount` {{optional_inline}}
- : An integer giving the number of frames to copy. If omitted then all frames in the plane will be copied, beginning with the frame specified in `frameOffset`.
### Return value
Undefined.
### Exceptions
- `InvalidStateError` {{domxref("DOMException")}}
- : Thrown if the `AudioData` object has been [transferred](/en-US/docs/Web/API/Web_Workers_API/Transferable_objects).
- {{jsxref("RangeError")}}
- : Thrown if one of the following conditions is met:
- The length of the sample is longer than the destination length.
- The format of the `AudioData` object describes a planar format, but `options.planeIndex` is outside of the number of planes available.
- The format of the `AudioData` object describes an interleaved format, but `options.planeIndex` is greater than `0`.
## Examples
The following example copies the plane at index `1` of an existing `AudioData` object (`audioData`) to a destination buffer sized using {{domxref("AudioData.allocationSize()")}}.
```js
const destination = new ArrayBuffer(
  audioData.allocationSize({ planeIndex: 1 }),
);
audioData.copyTo(destination, { planeIndex: 1 });
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: format property"
short-title: format
slug: Web/API/AudioData/format
page-type: web-api-instance-property
status:
- experimental
browser-compat: api.AudioData.format
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`format`** read-only property of the {{domxref("AudioData")}} interface returns the sample format of the `AudioData` object.
## Value
A string. One of:
- `"u8"`
- : 8-bit unsigned integer samples, in an interleaved format.
- `"s16"`
- : 16-bit signed integer samples, in an interleaved format.
- `"s32"`
- : 32-bit signed integer samples, in an interleaved format.
- `"f32"`
- : 32-bit float samples, in an interleaved format.
- `"u8-planar"`
- : 8-bit unsigned integer samples, in a planar format.
- `"s16-planar"`
- : 16-bit signed integer samples, in a planar format.
- `"s32-planar"`
- : 32-bit signed integer samples, in a planar format.
- `"f32-planar"`
- : 32-bit float samples, in a planar format.
## Examples
The below example prints the value of `format` from an `AudioData` object named `audioData` to the console.
```js
console.log(audioData.format);
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: allocationSize() method"
short-title: allocationSize()
slug: Web/API/AudioData/allocationSize
page-type: web-api-instance-method
status:
- experimental
browser-compat: api.AudioData.allocationSize
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`allocationSize()`** method of the {{domxref("AudioData")}} interface returns the size in bytes required to hold the current sample as filtered by options passed into the method.
## Syntax
```js-nolint
allocationSize(options)
```
### Parameters
- `options`
- : An object containing the following:
- `planeIndex`
- : The index of the plane to return the size of.
- `frameOffset` {{optional_inline}}
- : An integer giving an offset into the plane data indicating which frame to begin from. Defaults to `0`.
- `frameCount` {{optional_inline}}
- : An integer giving the number of frames to return the size of. If omitted then all frames in the plane will be used, beginning with the frame specified in `frameOffset`.
### Return value
An integer containing the number of bytes needed to hold the samples described by `options`.
## Examples
The following example gets the size of the plane at index `1` of an `AudioData` object named `audioData`.
```js
const size = audioData.allocationSize({ planeIndex: 1 });
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: numberOfFrames property"
short-title: numberOfFrames
slug: Web/API/AudioData/numberOfFrames
page-type: web-api-instance-property
status:
- experimental
browser-compat: api.AudioData.numberOfFrames
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`numberOfFrames`** read-only property of the {{domxref("AudioData")}} interface returns the number of frames in the `AudioData` object.
## Value
An integer.
## Examples
The below example prints the value of `numberOfFrames` from an `AudioData` object named `audioData` to the console.
```js
console.log(audioData.numberOfFrames);
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: AudioData() constructor"
short-title: AudioData()
slug: Web/API/AudioData/AudioData
page-type: web-api-constructor
status:
- experimental
browser-compat: api.AudioData.AudioData
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`AudioData()`** constructor creates a new {{domxref("AudioData")}} object which represents an individual audio sample.
## Syntax
```js-nolint
new AudioData(init)
```
### Parameters
- `init`
- : An object containing the following:
- `format`
- : One of:
- "u8"
- "s16"
- "s32"
- "f32"
- "u8-planar"
- "s16-planar"
- "s32-planar"
- "f32-planar"
- `sampleRate`
- : A decimal containing the sample rate in Hz.
- `numberOfFrames`
- : An integer containing the number of frames in this sample.
- `numberOfChannels`
- : An integer containing the number of channels in this sample.
- `timestamp`
- : An integer indicating the data's time in microseconds.
- `data`
- : A typed array of the audio data for this sample.
- `transfer`
- : An array of {{jsxref("ArrayBuffer")}}s that `AudioData` will detach and take ownership of. If the array contains the {{jsxref("ArrayBuffer")}} backing `data`, `AudioData` will use that buffer directly instead of copying from it.
### Return value
A new {{domxref("AudioData")}} object.
### Exceptions
- {{jsxref("TypeError")}}
- : Thrown if `init` is in an incorrect format.
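## Examples
The following sketch (all values are hypothetical) creates 10 milliseconds of silent mono audio at 48 kHz in 32-bit float format:
```js
const numberOfFrames = 480; // 10 ms at 48,000 Hz
const audioData = new AudioData({
  format: "f32",
  sampleRate: 48000,
  numberOfFrames,
  numberOfChannels: 1,
  timestamp: 0, // microseconds
  data: new Float32Array(numberOfFrames), // zero-filled, so silent
});
```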
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: timestamp property"
short-title: timestamp
slug: Web/API/AudioData/timestamp
page-type: web-api-instance-property
status:
- experimental
browser-compat: api.AudioData.timestamp
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`timestamp`** read-only property of the {{domxref("AudioData")}} interface returns the timestamp of this `AudioData` object.
## Value
An integer.
## Examples
The below example prints the value of `timestamp` from an `AudioData` object named `audioData` to the console.
```js
console.log(audioData.timestamp);
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: duration property"
short-title: duration
slug: Web/API/AudioData/duration
page-type: web-api-instance-property
status:
- experimental
browser-compat: api.AudioData.duration
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`duration`** read-only property of the {{domxref("AudioData")}} interface returns the duration in microseconds of this `AudioData` object.
## Value
An integer.
## Examples
The below example prints the value of `duration` from an `AudioData` object named `audioData` to the console.
```js
console.log(audioData.duration);
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: close() method"
short-title: close()
slug: Web/API/AudioData/close
page-type: web-api-instance-method
status:
- experimental
browser-compat: api.AudioData.close
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`close()`** method of the {{domxref("AudioData")}} interface clears all states and releases the reference to the media resource.
## Syntax
```js-nolint
close()
```
### Parameters
None.
### Return value
Undefined.
## Examples
The following example shows an `AudioData` object named `audioData` being closed.
```js
audioData.close();
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: numberOfChannels property"
short-title: numberOfChannels
slug: Web/API/AudioData/numberOfChannels
page-type: web-api-instance-property
status:
- experimental
browser-compat: api.AudioData.numberOfChannels
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`numberOfChannels`** read-only property of the {{domxref("AudioData")}} interface returns the number of channels in the `AudioData` object.
## Value
An integer.
## Examples
The below example prints the value of `numberOfChannels` from an `AudioData` object named `audioData` to the console.
```js
console.log(audioData.numberOfChannels);
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: "AudioData: sampleRate property"
short-title: sampleRate
slug: Web/API/AudioData/sampleRate
page-type: web-api-instance-property
status:
- experimental
browser-compat: api.AudioData.sampleRate
---
{{APIRef("WebCodecs API")}}{{SeeCompatTable}}
The **`sampleRate`** read-only property of the {{domxref("AudioData")}} interface returns the sample rate in Hz.
## Value
A decimal value.
## Examples
The below example prints the value of `sampleRate` to the console.
```js
console.log(audioData.sampleRate);
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: GPUSupportedLimits
slug: Web/API/GPUSupportedLimits
page-type: web-api-interface
status:
- experimental
browser-compat: api.GPUSupportedLimits
---
{{APIRef("WebGPU API")}}{{SeeCompatTable}}{{SecureContext_Header}}
The **`GPUSupportedLimits`** interface of the {{domxref("WebGPU API", "WebGPU API", "", "nocode")}} describes the limits supported by a {{domxref("GPUAdapter")}}.
The `GPUSupportedLimits` object for the current adapter is accessed via the {{domxref("GPUAdapter.limits")}} property.
You should note that, rather than reporting the exact limits of each GPU, browsers will likely report different tier values of different limits to reduce the unique information available to drive-by fingerprinting. For example, the tiers of a certain limit might be 2048, 8192, and 32768. If your GPU's actual limit is 16384, the browser will still report 8192.
Given that different browsers will handle this differently and the tier values may change over time, it is hard to provide an accurate account of what limit values to expect — thorough testing is advised.
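As a quick way to see what your browser actually reports, the following sketch (to be run inside an async function or a JavaScript module) logs a couple of limit values:
```js
// Minimal sketch: log a few reported (possibly tiered) limit values.
const adapter = await navigator.gpu.requestAdapter();
if (adapter) {
  console.log(adapter.limits.maxTextureDimension2D);
  console.log(adapter.limits.maxStorageBufferBindingSize);
}
```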
{{InheritanceDiagram}}
## Instance properties
The following limits are represented by properties in a `GPUSupportedLimits` object. See the [Limits](https://gpuweb.github.io/gpuweb/#limits) section of the specification for detailed descriptions of what the limits relate to.
| Limit name | Default value |
| ------------------------------------------- | ------------------------ |
| `maxTextureDimension1D` | 8192 |
| `maxTextureDimension2D` | 8192 |
| `maxTextureDimension3D` | 2048 |
| `maxTextureArrayLayers` | 256 |
| `maxBindGroups` | 4 |
| `maxBindingsPerBindGroup` | 640 |
| `maxDynamicUniformBuffersPerPipelineLayout` | 8 |
| `maxDynamicStorageBuffersPerPipelineLayout` | 4 |
| `maxSampledTexturesPerShaderStage` | 16 |
| `maxSamplersPerShaderStage` | 16 |
| `maxStorageBuffersPerShaderStage` | 8 |
| `maxStorageTexturesPerShaderStage` | 4 |
| `maxUniformBuffersPerShaderStage` | 12 |
| `maxUniformBufferBindingSize` | 65536 bytes |
| `maxStorageBufferBindingSize` | 134217728 bytes (128 MB) |
| `minUniformBufferOffsetAlignment` | 256 bytes |
| `minStorageBufferOffsetAlignment` | 256 bytes |
| `maxVertexBuffers` | 8 |
| `maxBufferSize` | 268435456 bytes (256 MB) |
| `maxVertexAttributes` | 16 |
| `maxVertexBufferArrayStride` | 2048 bytes |
| `maxInterStageShaderComponents` | 60 |
| `maxInterStageShaderVariables` | 16 |
| `maxColorAttachments` | 8 |
| `maxColorAttachmentBytesPerSample` | 32 |
| `maxComputeWorkgroupStorageSize` | 16384 bytes |
| `maxComputeInvocationsPerWorkgroup` | 256 |
| `maxComputeWorkgroupSizeX` | 256 |
| `maxComputeWorkgroupSizeY` | 256 |
| `maxComputeWorkgroupSizeZ` | 64 |
| `maxComputeWorkgroupsPerDimension` | 65535 |
## Examples
In the following code we query the `GPUAdapter.limits` value of `maxBindGroups` to see if it is equal to or greater than 6. Our theoretical example app ideally needs 6 bind groups, so if the returned value is >= 6, we add a maximum limit of 6 to the `requiredLimits` object. We then request a device with that limit requirement using {{domxref("GPUAdapter.requestDevice()")}}:
```js
async function init() {
if (!navigator.gpu) {
throw Error("WebGPU not supported.");
}
const adapter = await navigator.gpu.requestAdapter();
if (!adapter) {
throw Error("Couldn't request WebGPU adapter.");
}
const requiredLimits = {};
// App ideally needs 6 bind groups, so we'll try to request what the app needs
if (adapter.limits.maxBindGroups >= 6) {
requiredLimits.maxBindGroups = 6;
}
const device = await adapter.requestDevice({
requiredLimits,
});
// ...
}
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- The [WebGPU API](/en-US/docs/Web/API/WebGPU_API)
| 0 |
data/mdn-content/files/en-us/web/api | data/mdn-content/files/en-us/web/api/datatransferitem/index.md | ---
title: DataTransferItem
slug: Web/API/DataTransferItem
page-type: web-api-interface
browser-compat: api.DataTransferItem
---
{{APIRef("HTML Drag and Drop API")}}
The **`DataTransferItem`** object represents one drag data item. During a _drag operation_, each {{domxref("DragEvent","drag event")}} has a {{domxref("DragEvent.dataTransfer","dataTransfer")}} property which contains a {{domxref("DataTransferItemList","list")}} of drag data items. Each item in the list is a `DataTransferItem` object.
This interface has no constructor.
## Instance properties
- {{domxref("DataTransferItem.kind")}} {{ReadOnlyInline}}
- : The _kind_ of drag data item, `string` or `file`.
- {{domxref("DataTransferItem.type")}} {{ReadOnlyInline}}
- : The drag data item's type, typically a MIME type.
## Instance methods
- {{domxref("DataTransferItem.getAsFile()")}}
- : Returns the {{domxref("File")}} object associated with the drag data item (or null if the drag item is not a file).
- {{domxref("DataTransferItem.getAsFileSystemHandle()")}} {{Experimental_Inline}}
- : Returns a {{domxref('FileSystemFileHandle')}} if the dragged item is a file, or a {{domxref('FileSystemDirectoryHandle')}} if the dragged item is a directory.
- {{domxref("DataTransferItem.getAsString()")}}
- : Invokes the specified callback with the drag data item string as its argument.
- {{domxref("DataTransferItem.webkitGetAsEntry()")}}
- : Returns an object based on {{domxref("FileSystemEntry")}} representing the selected file's entry in its file system. This will generally be either a {{domxref("FileSystemFileEntry")}} or {{domxref("FileSystemDirectoryEntry")}} object.
## Example
All of this interface's methods and properties have their own reference page, and each reference page has an example of its usage.
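As a quick illustration, the sketch below iterates over the drag data items in a `drop` handler and logs each item's `kind` and `type`. The `drop-zone` element ID is an assumption for this example.
```js
const dropZone = document.getElementById("drop-zone");
dropZone.addEventListener("dragover", (e) => {
  e.preventDefault(); // allow the drop
});
dropZone.addEventListener("drop", (e) => {
  e.preventDefault();
  for (const item of e.dataTransfer.items) {
    console.log(`kind: ${item.kind}, type: ${item.type}`);
  }
});
```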
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/datatransferitem | data/mdn-content/files/en-us/web/api/datatransferitem/getasfilesystemhandle/index.md | ---
title: "DataTransferItem: getAsFileSystemHandle() method"
short-title: getAsFileSystemHandle()
slug: Web/API/DataTransferItem/getAsFileSystemHandle
page-type: web-api-instance-method
status:
- experimental
browser-compat: api.DataTransferItem.getAsFileSystemHandle
---
{{securecontext_header}}{{APIRef("HTML Drag and Drop API")}}{{SeeCompatTable}}
The **`getAsFileSystemHandle()`** method of the
{{domxref("DataTransferItem")}} interface returns a {{domxref('FileSystemFileHandle')}}
if the dragged item is a file, or a {{domxref('FileSystemDirectoryHandle')}} if the
dragged item is a directory.
## Syntax
```js-nolint
getAsFileSystemHandle()
```
### Parameters
None.
### Return value
A {{jsxref('Promise')}}.
If the item's {{domxref("DataTransferItem.kind", "kind")}} property is `"file"`, and this item is accessed in the {{domxref("HTMLElement/dragstart_event", "dragstart")}} or {{domxref("HTMLElement/drop_event", "drop")}} event handlers, then the returned promise is fulfilled with a {{domxref('FileSystemFileHandle')}} if the dragged item is a file or a {{domxref('FileSystemDirectoryHandle')}} if the dragged item is a directory.
Otherwise, the promise fulfills with `null`.
### Exceptions
None.
## Examples
This example uses the `getAsFileSystemHandle` method to return
{{domxref('FileSystemHandle','file handles')}} for dropped items.
```js
elem.addEventListener("dragover", (e) => {
// Prevent navigation.
e.preventDefault();
});
elem.addEventListener("drop", async (e) => {
// Prevent navigation.
e.preventDefault();
// Process all of the items.
for (const item of e.dataTransfer.items) {
// kind will be 'file' for file/directory entries.
if (item.kind === "file") {
const entry = await item.getAsFileSystemHandle();
if (entry.kind === "file") {
// run code for if entry is a file
} else if (entry.kind === "directory") {
// run code for if entry is a directory
}
}
}
});
```
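Note that drag data items are generally only accessible while the `drop` handler is running, so when processing several items it can be safer to collect the promises synchronously before awaiting any of them. A minimal sketch of that pattern:
```js
elem.addEventListener("drop", async (e) => {
  e.preventDefault();
  // Collect all handle promises before the first await, because the
  // items may become inaccessible once the handler yields.
  const handlePromises = [...e.dataTransfer.items]
    .filter((item) => item.kind === "file")
    .map((item) => item.getAsFileSystemHandle());
  for (const handle of await Promise.all(handlePromises)) {
    console.log(`${handle.kind}: ${handle.name}`);
  }
});
```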
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [File System API](/en-US/docs/Web/API/File_System_API)
- [The File System Access API: simplifying access to local files](https://developer.chrome.com/docs/capabilities/web-apis/file-system-access)
| 0 |
data/mdn-content/files/en-us/web/api/datatransferitem | data/mdn-content/files/en-us/web/api/datatransferitem/getasstring/index.md | ---
title: "DataTransferItem: getAsString() method"
short-title: getAsString()
slug: Web/API/DataTransferItem/getAsString
page-type: web-api-instance-method
browser-compat: api.DataTransferItem.getAsString
---
{{APIRef("HTML Drag and Drop API")}}
The **`DataTransferItem.getAsString()`** method invokes the given callback with the drag data item's string data as the argument if the item's {{domxref("DataTransferItem.kind","kind")}} is a _plain Unicode string_ (i.e. `kind` is `string`).
## Syntax
```js-nolint
getAsString(callbackFn)
```
### Parameters
- `callbackFn`
- : A callback function that receives the following arguments:
- `data`
- : The {{domxref("DataTransferItem", "data transfer item's")}} string data.
### Return value
None ({{jsxref("undefined")}}).
## Examples
This example shows the use of the `getAsString()` method as an _inline function_ in a {{domxref("HTMLElement/drop_event", "drop")}} event handler.
```js
function dropHandler(ev) {
console.log("Drop");
ev.preventDefault();
const data = ev.dataTransfer.items;
for (let i = 0; i < data.length; i += 1) {
if (data[i].kind === "string" && data[i].type.match("^text/plain")) {
// This item is the target node
data[i].getAsString((s) => {
ev.target.appendChild(document.getElementById(s));
});
} else if (data[i].kind === "string" && data[i].type.match("^text/html")) {
// Drag data item is HTML
console.log("… Drop: HTML");
} else if (
data[i].kind === "string" &&
data[i].type.match("^text/uri-list")
) {
// Drag data item is URI
console.log("… Drop: URI");
} else if (data[i].kind === "file" && data[i].type.match("^image/")) {
// Drag data item is an image file
const f = data[i].getAsFile();
console.log("… Drop: File");
}
}
}
```
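Because `getAsString()` is callback-based, it is sometimes convenient to wrap it in a promise. A minimal sketch (the helper name is our own):
```js
function getStringData(item) {
  // Assumes item.kind is "string"; the callback receives the string data.
  return new Promise((resolve) => item.getAsString(resolve));
}
// Usage inside an async drop handler:
// const text = await getStringData(ev.dataTransfer.items[0]);
```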
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("DataTransfer.getData()")}}
| 0 |
data/mdn-content/files/en-us/web/api/datatransferitem | data/mdn-content/files/en-us/web/api/datatransferitem/getasfile/index.md | ---
title: "DataTransferItem: getAsFile() method"
short-title: getAsFile()
slug: Web/API/DataTransferItem/getAsFile
page-type: web-api-instance-method
browser-compat: api.DataTransferItem.getAsFile
---
{{APIRef("HTML Drag and Drop API")}}
If the item is a file, the **`DataTransferItem.getAsFile()`** method returns the drag data item's {{domxref("File")}} object.
If the item is not a file, this method returns `null`.
## Syntax
```js-nolint
getAsFile()
```
### Parameters
None.
### Return value
- {{domxref("File")}}
- : If the drag data item is a file, a {{domxref("File")}} object is returned; otherwise `null` is returned.
## Examples
This example shows the use of the `getAsFile()` method in a {{domxref("HTMLElement/drop_event", "drop")}} event handler.
```js
function dropHandler(ev) {
console.log("Drop");
ev.preventDefault();
const data = ev.dataTransfer.items;
for (let i = 0; i < data.length; i += 1) {
if (data[i].kind === "string" && data[i].type.match("^text/plain")) {
// This item is the target node
data[i].getAsString((s) => {
ev.target.appendChild(document.getElementById(s));
});
} else if (data[i].kind === "string" && data[i].type.match("^text/html")) {
// Drag data item is HTML
console.log("… Drop: HTML");
} else if (
data[i].kind === "string" &&
data[i].type.match("^text/uri-list")
) {
// Drag data item is URI
console.log("… Drop: URI");
} else if (data[i].kind === "file" && data[i].type.match("^image/")) {
// Drag data item is an image file
const f = data[i].getAsFile();
console.log("… Drop: File");
}
}
}
```
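`getAsFile()` is equally useful outside of drag and drop, for example when handling pasted files, since {{domxref("ClipboardEvent.clipboardData")}} is also a {{domxref("DataTransfer")}}. A minimal sketch:
```js
document.addEventListener("paste", (e) => {
  for (const item of e.clipboardData.items) {
    if (item.kind === "file" && item.type.startsWith("image/")) {
      const file = item.getAsFile();
      console.log(`Pasted image: ${file.name} (${file.size} bytes)`);
    }
  }
});
```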
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("DataTransfer.files")}}
| 0 |
data/mdn-content/files/en-us/web/api/datatransferitem | data/mdn-content/files/en-us/web/api/datatransferitem/webkitgetasentry/index.md | ---
title: "DataTransferItem: webkitGetAsEntry() method"
short-title: webkitGetAsEntry()
slug: Web/API/DataTransferItem/webkitGetAsEntry
page-type: web-api-instance-method
browser-compat: api.DataTransferItem.webkitGetAsEntry
---
{{APIRef("HTML Drag and Drop API")}}
If the item described by the {{domxref("DataTransferItem")}} is a file, `webkitGetAsEntry()` returns a {{domxref("FileSystemFileEntry")}} or {{domxref("FileSystemDirectoryEntry")}} representing it. If the item isn't a file, `null` is returned.
> **Note:** This function is implemented as `webkitGetAsEntry()` in non-WebKit browsers including Firefox at this time; it may be renamed to
> `getAsEntry()` in the future, so you should code defensively, looking for both.
## Syntax
```js-nolint
webkitGetAsEntry()
```
### Parameters
None.
### Return value
A {{domxref("FileSystemEntry")}}-based object describing the dropped item.
This will be either {{domxref("FileSystemFileEntry")}} or {{domxref("FileSystemDirectoryEntry")}}.
The method aborts and returns `null` if the dropped item isn't a file, or if the {{domxref("DataTransferItem")}} object is not in read or read/write mode.
## Examples
In this example, a drop zone is created, which responds to the {{domxref("HTMLElement/drop_event", "drop")}} event
by scanning through the dropped files and directories, outputting a hierarchical
directory listing.
### HTML
The HTML establishes the drop zone itself, which is a {{HTMLElement("div")}} element with the ID `"dropzone"`, and an unordered list element with the ID `"listing"`.
```html
<p>Drag files and/or directories to the box below!</p>
<div id="dropzone">
<div id="boxtitle">Drop Files Here</div>
</div>
<h2>Directory tree:</h2>
<ul id="listing"></ul>
```
### CSS
The styles used by the example are shown here.
```css
#dropzone {
text-align: center;
width: 300px;
height: 100px;
margin: 10px;
padding: 10px;
border: 4px dashed red;
border-radius: 10px;
}
#boxtitle {
display: table-cell;
vertical-align: middle;
text-align: center;
color: black;
font:
bold 2em "Arial",
sans-serif;
width: 300px;
height: 100px;
}
body {
font:
14px "Arial",
sans-serif;
}
```
### JavaScript
First, let's look at the recursive `scanFiles()` function.
This function takes as input a {{domxref("FileSystemEntry")}} representing an entry in the file system to be scanned and processed (the `item` parameter), and an element into which to insert the list of contents (the `container` parameter).
> **Note:** To read all files in a directory, `readEntries` needs to be called repeatedly until it returns an empty array.
> In Chromium-based browsers, the following example will only return a max of 100 entries.
```js
let dropzone = document.getElementById("dropzone");
let listing = document.getElementById("listing");
function scanFiles(item, container) {
let elem = document.createElement("li");
elem.textContent = item.name;
container.appendChild(elem);
if (item.isDirectory) {
let directoryReader = item.createReader();
let directoryContainer = document.createElement("ul");
container.appendChild(directoryContainer);
directoryReader.readEntries((entries) => {
entries.forEach((entry) => {
scanFiles(entry, directoryContainer);
});
});
}
}
```
`scanFiles()` begins by creating a new {{HTMLElement("li")}} element to represent the item being scanned, inserts the name of the item into it as its text content, and then appends it to the container.
The container is always a list element in this example, as you'll see shortly.
Once the current item is in the list, the item's {{domxref("FileSystemEntry.isDirectory", "isDirectory")}} property is checked.
If the item is a directory, we need to recurse into that directory.
The first step is to create a {{domxref("FileSystemDirectoryReader")}} to handle fetching the directory's contents.
That's done by calling the item's {{domxref("FileSystemDirectoryEntry.createReader", "createReader()")}} method.
Then a new {{HTMLElement("ul")}} is created and appended to the parent list; this will contain the directory's contents in the next level down in the list's hierarchy.
After that, {{domxref("FileSystemDirectoryReader.readEntries", "directoryReader.readEntries()")}} is called to read in all the entries in the directory.
These are each, in turn, passed into a recursive call to `scanFiles()` to process them.
Any of them which are files are inserted into the list; any which are directories are inserted into the list and a new level of the list's hierarchy is added below, and so forth.
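As the note before the code points out, `readEntries()` only returns a batch of entries at a time, so for large directories the example above would miss entries in Chromium-based browsers. A minimal helper sketch that drains a reader by calling `readEntries()` until it returns an empty array:
```js
function readAllEntries(directoryReader) {
  return new Promise((resolve, reject) => {
    const entries = [];
    function readBatch() {
      directoryReader.readEntries((batch) => {
        if (batch.length === 0) {
          resolve(entries); // an empty batch signals the end
        } else {
          entries.push(...batch);
          readBatch();
        }
      }, reject);
    }
    readBatch();
  });
}
```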
Then come the event handlers. First, we prevent the {{domxref("HTMLElement/dragover_event", "dragover")}} event from being handled by the default handler, so that our drop zone can receive the drop:
```js
dropzone.addEventListener(
"dragover",
(event) => {
event.preventDefault();
},
false,
);
```
The event handler that kicks everything off, of course, is the handler for the {{domxref("HTMLElement/drop_event", "drop")}} event:
```js
dropzone.addEventListener(
"drop",
(event) => {
let items = event.dataTransfer.items;
event.preventDefault();
listing.textContent = "";
for (let i = 0; i < items.length; i++) {
let item = items[i].webkitGetAsEntry();
if (item) {
scanFiles(item, listing);
}
}
},
false,
);
```
This fetches the list of {{domxref("DataTransferItem")}} objects representing the items dropped from `event.dataTransfer.items`.
Then we call {{domxref("Event.preventDefault()")}} to prevent the event from being handled further after we're done.
Now it's time to start building the list. First, the list is emptied by setting {{domxref("Node.textContent", "listing.textContent")}} to be empty.
That leaves us with an empty {{HTMLElement("ul")}} to begin inserting directory entries into.
Then we iterate over the items in the list of dropped items.
For each one, we call its {{domxref("DataTransferItem.webkitGetAsEntry", "webkitGetAsEntry()")}} method to obtain a {{domxref("FileSystemEntry")}} representing the file.
If that's successful, we call `scanFiles()` to process the item—either by adding it to the list if it's just a file or by adding it and walking down into it if it's a directory.
### Result
You can see how this works by trying it out below. Find some files and directories and drag them in, and take a look at the resulting output.
{{EmbedLiveSample('Examples', 600, 400)}}
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [File and Directory Entries API](/en-US/docs/Web/API/File_and_Directory_Entries_API)
- [Introduction to the File and Directory Entries API](/en-US/docs/Web/API/File_and_Directory_Entries_API/Introduction)
- {{domxref("DataTransferItem")}}
- {{domxref("FileSystemEntry")}}, {{domxref("FileSystemFileEntry")}}, and {{domxref("FileSystemDirectoryEntry")}}
- Events: {{domxref("HTMLElement/dragover_event", "dragover")}} and {{domxref("HTMLElement/drop_event", "drop")}}
| 0 |
data/mdn-content/files/en-us/web/api/datatransferitem | data/mdn-content/files/en-us/web/api/datatransferitem/type/index.md | ---
title: "DataTransferItem: type property"
short-title: type
slug: Web/API/DataTransferItem/type
page-type: web-api-instance-property
browser-compat: api.DataTransferItem.type
---
{{APIRef("HTML Drag and Drop API")}}
The read-only **`DataTransferItem.type`** property returns the type (format) of the {{domxref("DataTransferItem")}} object representing the drag data item.
The `type` is a Unicode string generally given by a MIME type, although a MIME type is not required.
Some example types are: `text/plain` and `text/html`.
## Value
A string representing the drag data item's type.
## Examples
This example shows the use of the `type` property.
```js
function dropHandler(ev) {
console.log("Drop");
ev.preventDefault();
const data = ev.dataTransfer.items;
for (let i = 0; i < data.length; i += 1) {
if (data[i].kind === "string" && data[i].type.match("^text/plain")) {
// This item is the target node
data[i].getAsString((s) => {
ev.target.appendChild(document.getElementById(s));
});
} else if (data[i].kind === "string" && data[i].type.match("^text/html")) {
// Drag data item is HTML
console.log("… Drop: HTML");
} else if (
data[i].kind === "string" &&
data[i].type.match("^text/uri-list")
) {
// Drag data item is URI
console.log("… Drop: URI");
} else if (data[i].kind === "file" && data[i].type.match("^image/")) {
// Drag data item is an image file
const f = data[i].getAsFile();
console.log("… Drop: File");
}
}
}
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("DataTransfer.types()")}}
- [List of common MIME types](/en-US/docs/Web/HTTP/Basics_of_HTTP/MIME_types/Common_types)
| 0 |
data/mdn-content/files/en-us/web/api/datatransferitem | data/mdn-content/files/en-us/web/api/datatransferitem/kind/index.md | ---
title: "DataTransferItem: kind property"
short-title: kind
slug: Web/API/DataTransferItem/kind
page-type: web-api-instance-property
browser-compat: api.DataTransferItem.kind
---
{{APIRef("HTML Drag and Drop API")}}
The read-only **`DataTransferItem.kind`** property returns the kind (a string or a file) of the {{domxref("DataTransferItem")}} object representing the _drag data item_.
## Value
A string representing the drag data item's kind.
It must be one of the following values:
- `'file'`
- : If the drag data item is a file.
- `'string'`
- : If the kind of drag data item is a _plain Unicode string_.
## Examples
This example shows the use of the `kind` property.
```js
function dropHandler(ev) {
console.log("Drop");
ev.preventDefault();
const data = ev.dataTransfer.items;
for (let i = 0; i < data.length; i += 1) {
if (data[i].kind === "string" && data[i].type.match("^text/plain")) {
// This item is the target node
data[i].getAsString((s) => {
ev.target.appendChild(document.getElementById(s));
});
} else if (data[i].kind === "string" && data[i].type.match("^text/html")) {
// Drag data item is HTML
console.log("… Drop: HTML");
} else if (data[i].kind === "file" && data[i].type.match("^image/")) {
// Drag data item is an image file
const f = data[i].getAsFile();
console.log("… Drop: File");
}
}
}
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Drag and drop](/en-US/docs/Web/API/HTML_Drag_and_Drop_API)
- [Drag Operations](/en-US/docs/Web/API/HTML_Drag_and_Drop_API/Drag_operations)
- [Recommended Drag Types](/en-US/docs/Web/API/HTML_Drag_and_Drop_API/Recommended_drag_types)
- [DataTransfer test - Paste or Drag](https://codepen.io/tech_query/pen/MqGgap)
| 0 |
data/mdn-content/files/en-us/web/api | data/mdn-content/files/en-us/web/api/clipboarditem/index.md | ---
title: ClipboardItem
slug: Web/API/ClipboardItem
page-type: web-api-interface
browser-compat: api.ClipboardItem
---
{{APIRef("Clipboard API")}}{{SecureContext_Header}}
The **`ClipboardItem`** interface of the [Clipboard API](/en-US/docs/Web/API/Clipboard_API) represents a single item format, used when reading or writing clipboard data using {{domxref("clipboard.read()")}} and {{domxref("clipboard.write()")}} respectively.
The benefit of having the **`ClipboardItem`** interface to represent data is that it enables developers to cope with the varying scope of file types and data.
> **Note:** To work with text see the {{domxref("Clipboard.readText()")}} and {{domxref("Clipboard.writeText()")}} methods of the {{domxref("Clipboard")}} interface.
## Constructor
- {{domxref("ClipboardItem.ClipboardItem", "ClipboardItem()")}}
- : Creates a new **`ClipboardItem`** object, with the {{Glossary("MIME type")}} as the key and {{domxref("Blob")}} as the value.
## Instance properties
_This interface provides the following properties._
- {{domxref("ClipboardItem.types", "types")}} {{ReadOnlyInline}}
- : Returns an {{jsxref("Array")}} of MIME types available within the **`ClipboardItem`**.
- {{domxref("ClipboardItem.presentationStyle", "presentationStyle")}} {{ReadOnlyInline}}
- : Returns one of the following: `"unspecified"`, `"inline"` or `"attachment"`.
## Instance methods
_This interface defines the following methods._
- {{domxref("ClipboardItem.getType", "getType()")}}
- : Returns a {{jsxref("Promise")}} that resolves with a {{domxref("Blob")}} of the requested {{Glossary("MIME type")}}, or an error if the MIME type is not found.
## Examples
### Writing to the clipboard
Here we're writing a new {{domxref("ClipboardItem.ClipboardItem", "ClipboardItem()")}} to the system clipboard by requesting a PNG image using the {{domxref("Fetch API")}}, and in turn, the {{domxref("Response.blob()", "response's blob()")}} method, to create the new `ClipboardItem`.
```js
async function writeClipImg() {
try {
const imgURL = "/myimage.png";
const data = await fetch(imgURL);
const blob = await data.blob();
await navigator.clipboard.write([
new ClipboardItem({
[blob.type]: blob,
}),
]);
console.log("Fetched image copied.");
} catch (err) {
console.error(err.name, err.message);
}
}
```
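A single `ClipboardItem` can also carry several representations of the same content, letting the paste target pick the richest format it supports. A minimal sketch:
```js
async function writeRichText() {
  const htmlBlob = new Blob(["<strong>Hello</strong>"], { type: "text/html" });
  const textBlob = new Blob(["Hello"], { type: "text/plain" });
  await navigator.clipboard.write([
    new ClipboardItem({
      "text/html": htmlBlob,
      "text/plain": textBlob,
    }),
  ]);
}
```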
### Reading from the clipboard
Here we're returning all items on the clipboard via the {{domxref("clipboard.read()")}} method.
We then use the {{domxref("ClipboardItem.types")}} property to set the {{domxref("ClipboardItem.getType", "getType()")}} argument and return the corresponding blob object.
```js
async function getClipboardContents() {
try {
const clipboardItems = await navigator.clipboard.read();
for (const clipboardItem of clipboardItems) {
for (const type of clipboardItem.types) {
const blob = await clipboardItem.getType(type);
// we can now use blob here
}
}
} catch (err) {
console.error(err.name, err.message);
}
}
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Clipboard API](/en-US/docs/Web/API/Clipboard_API)
- [Image support for Async Clipboard article](https://web.dev/articles/async-clipboard)
| 0 |
data/mdn-content/files/en-us/web/api/clipboarditem | data/mdn-content/files/en-us/web/api/clipboarditem/presentationstyle/index.md | ---
title: "ClipboardItem: presentationStyle property"
short-title: presentationStyle
slug: Web/API/ClipboardItem/presentationStyle
page-type: web-api-instance-property
browser-compat: api.ClipboardItem.presentationStyle
---
{{APIRef("Clipboard API")}} {{securecontext_header}}
The read-only **`presentationStyle`** property of the {{domxref("ClipboardItem")}} interface returns a string indicating how an item should be presented.
For example, in some contexts an image might be displayed inline, while in others it might be represented as an attachment.
## Value
One of either `"unspecified"`, `"inline"` or `"attachment"`.
## Examples
In the below example, we're returning all items on the clipboard via the {{domxref("clipboard.read()")}} method, then logging the `presentationStyle` property.
```js
async function getClipboardContents() {
try {
const clipboardItems = await navigator.clipboard.read();
for (const clipboardItem of clipboardItems) {
console.log(clipboardItem.presentationStyle);
}
} catch (err) {
console.error(err.name, err.message);
}
}
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Clipboard API](/en-US/docs/Web/API/Clipboard_API)
- [Image support for Async Clipboard article](https://web.dev/articles/async-clipboard)
| 0 |
data/mdn-content/files/en-us/web/api/clipboarditem | data/mdn-content/files/en-us/web/api/clipboarditem/clipboarditem/index.md | ---
title: "ClipboardItem: ClipboardItem() constructor"
short-title: ClipboardItem()
slug: Web/API/ClipboardItem/ClipboardItem
page-type: web-api-constructor
browser-compat: api.ClipboardItem.ClipboardItem
---
{{APIRef("Clipboard API")}} {{securecontext_header}}
The **`ClipboardItem()`** constructor creates a new {{domxref("ClipboardItem")}} object, which represents data to be stored or retrieved via the [Clipboard API](/en-US/docs/Web/API/Clipboard_API) {{domxref("clipboard.write()")}} and {{domxref("clipboard.read()")}} methods, respectively.
> **Note:** Image format support varies by browser. See the browser compatibility table for the {{domxref("Clipboard")}} interface.
## Syntax
```js-nolint
new ClipboardItem(data)
new ClipboardItem(data, options)
```
### Parameters
- `data`
- : An {{jsxref("Object")}} with the {{Glossary("MIME type")}} as the key and data as the value.
The data can be represented as a {{domxref("Blob")}}, a {{jsxref("String")}} or a {{jsxref("Promise")}} which resolves to either a blob or string.
- `options` {{optional_inline}}
- : An object with the following properties:
- `presentationStyle` {{optional_inline}}
- : One of the three strings: `unspecified`, `inline` or `attachment`.
The default is `unspecified`.
> **Note:** You can also work with text via the {{domxref("Clipboard.readText()")}} and {{domxref("Clipboard.writeText()")}} methods of the {{domxref("Clipboard")}} interface.
## Examples
The below example requests a PNG image using the {{domxref("Fetch API")}}, and in turn, the {{domxref("Response.blob()", "response's blob()")}} method, to create a new {{domxref("ClipboardItem")}}.
This item is then written to the clipboard, using the {{domxref("Clipboard.write()")}} method.
> **Note:** You can only pass in one clipboard item at a time.
```js
async function writeClipImg() {
try {
const imgURL = "/myimage.png";
const data = await fetch(imgURL);
const blob = await data.blob();
await navigator.clipboard.write([
new ClipboardItem({
[blob.type]: blob,
}),
]);
console.log("Fetched image copied.");
} catch (err) {
console.error(err.name, err.message);
}
}
```
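The optional `options` argument can be combined with the same pattern; a minimal sketch passing an explicit `presentationStyle`:
```js
async function writeInlineText() {
  const blob = new Blob(["Hello clipboard"], { type: "text/plain" });
  await navigator.clipboard.write([
    new ClipboardItem({ [blob.type]: blob }, { presentationStyle: "inline" }),
  ]);
}
```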
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Clipboard API](/en-US/docs/Web/API/Clipboard_API)
- [Image support for Async Clipboard article](https://web.dev/articles/async-clipboard)
| 0 |
data/mdn-content/files/en-us/web/api/clipboarditem | data/mdn-content/files/en-us/web/api/clipboarditem/types/index.md | ---
title: "ClipboardItem: types property"
short-title: types
slug: Web/API/ClipboardItem/types
page-type: web-api-instance-property
browser-compat: api.ClipboardItem.types
---
{{APIRef("Clipboard API")}} {{securecontext_header}}
The read-only **`types`** property of the {{domxref("ClipboardItem")}} interface returns an {{jsxref("Array")}} of {{Glossary("MIME type", 'MIME types')}} available within the {{domxref("ClipboardItem")}}.
## Value
An {{jsxref("Array")}} of available {{Glossary("MIME type", 'MIME types')}}.
## Examples
In the below example, we're returning all items on the clipboard via the {{domxref("Clipboard.read()")}} method.
We then check the `types` property for available types before utilizing the {{domxref("ClipboardItem.getType()")}} method to return the {{domxref("Blob")}} object. If no clipboard content is found for the specified type, the promise rejects with an error.
```js
async function getClipboardContents() {
try {
const clipboardItems = await navigator.clipboard.read();
for (const clipboardItem of clipboardItems) {
for (const type of clipboardItem.types) {
const blob = await clipboardItem.getType(type);
// we can now use blob here
}
}
} catch (err) {
console.error(err.name, err.message);
}
}
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Clipboard API](/en-US/docs/Web/API/Clipboard_API)
- [Image support for Async Clipboard article](https://web.dev/articles/async-clipboard)
| 0 |
data/mdn-content/files/en-us/web/api/clipboarditem | data/mdn-content/files/en-us/web/api/clipboarditem/gettype/index.md | ---
title: "ClipboardItem: getType() method"
short-title: getType()
slug: Web/API/ClipboardItem/getType
page-type: web-api-instance-method
browser-compat: api.ClipboardItem.getType
---
{{APIRef("Clipboard API")}} {{securecontext_header}}
The **`getType()`** method of the {{domxref("ClipboardItem")}} interface returns a {{jsxref("Promise")}} that resolves with a {{domxref("Blob")}} of the requested {{Glossary("MIME type")}} or an error if the MIME type is not found.
## Syntax
```js-nolint
getType(type)
```
### Parameters
- `type`
- : A valid {{Glossary("MIME type")}}.
### Return value
A {{jsxref("Promise")}} that resolves with a {{domxref("Blob")}} object.
### Exceptions
- `NotFoundError` {{domxref("DOMException")}}
- : The `type` does not match a known {{Glossary("MIME type")}}.
- {{jsxref("TypeError")}}
- : No parameter is specified or the `type` is not that of the {{domxref("ClipboardItem")}}.
## Examples
In the following example, we're returning all items on the clipboard via the {{domxref("clipboard.read()")}} method.
We then use the {{domxref("ClipboardItem.types")}} property to set the `getType()` argument and return the corresponding blob object.
```js
async function getClipboardContents() {
try {
const clipboardItems = await navigator.clipboard.read();
for (const clipboardItem of clipboardItems) {
for (const type of clipboardItem.types) {
const blob = await clipboardItem.getType(type);
// we can now use blob here
}
}
} catch (err) {
console.error(err.name, err.message);
}
}
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Clipboard API](/en-US/docs/Web/API/Clipboard_API)
- [Image support for Async Clipboard article](https://web.dev/articles/async-clipboard)
| 0 |
data/mdn-content/files/en-us/web/api | data/mdn-content/files/en-us/web/api/domparser/index.md | ---
title: DOMParser
slug: Web/API/DOMParser
page-type: web-api-interface
browser-compat: api.DOMParser
---
{{APIRef("DOM")}}
The **`DOMParser`** interface provides
the ability to parse {{Glossary("XML")}} or {{Glossary("HTML")}} source code from a
string into a DOM {{domxref("Document")}}.
You can perform the opposite operation—converting a DOM tree into XML or HTML
source—using the {{domxref("XMLSerializer")}} interface.
In the case of an HTML document, you can also replace portions of the DOM with new DOM
trees built from HTML by setting the value of the {{domxref("Element.innerHTML")}} and
{{domxref("Element.outerHTML", "outerHTML")}} properties. These properties can also be
read to fetch HTML fragments corresponding to the underlying DOM subtree.
Note that {{domxref("XMLHttpRequest")}} can parse XML and HTML directly
from a URL-addressable resource, returning a `Document` in its
{{domxref("XMLHttpRequest.response", "response")}} property.
> **Note:** Be aware that [block-level elements](/en-US/docs/Glossary/Block-level_content)
> like `<p>` will be automatically closed if another
> block-level element is nested inside and therefore parsed before the closing `</p>` tag.
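A quick sketch of that behavior: parsing a `<div>` nested inside a `<p>` and inspecting the result (the logged serialization is what the HTML parser is expected to produce):
```js
const parser = new DOMParser();
const doc = parser.parseFromString(
  "<p>outer<div>inner</div></p>",
  "text/html",
);
// The <p> is closed before the <div> is parsed, so this logs
// something like: "<p>outer</p><div>inner</div><p></p>"
console.log(doc.body.innerHTML);
```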
## Constructor
- {{domxref("DOMParser.DOMParser","DOMParser()")}}
- : Creates a new `DOMParser` object.
## Instance methods
- {{domxref("DOMParser.parseFromString()")}}
- : Parses a string using either the HTML parser or the XML parser, returning an {{domxref("HTMLDocument")}} or {{domxref("XMLDocument")}}.
## Examples
The documentation for {{domxref("DOMParser.parseFromString()")}}, this interface's only method, contains examples for parsing XML, SVG, and HTML strings.
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Parsing and serializing XML](/en-US/docs/Web/XML/Parsing_and_serializing_XML)
- {{domxref("XMLHttpRequest")}}
- {{domxref("XMLSerializer")}}
- {{jsxref("JSON.parse()")}} - counterpart for {{jsxref("JSON")}} documents.
| 0 |
data/mdn-content/files/en-us/web/api/domparser | data/mdn-content/files/en-us/web/api/domparser/parsefromstring/index.md | ---
title: "DOMParser: parseFromString() method"
short-title: parseFromString()
slug: Web/API/DOMParser/parseFromString
page-type: web-api-instance-method
browser-compat: api.DOMParser.parseFromString
---
{{APIRef("DOMParser")}}
The **`parseFromString()`** method of the {{domxref("DOMParser")}} interface parses a string containing either HTML or XML, returning an {{domxref("HTMLDocument")}} or an {{domxref("XMLDocument")}}.
## Syntax
```js-nolint
parseFromString(string, mimeType)
```
### Parameters
- `string`
- : The string to be parsed. It must contain either an
{{Glossary("HTML")}}, {{Glossary("xml")}}, {{Glossary("XHTML")}}, or
{{Glossary("svg")}} document.
- `mimeType`
- : A string. This string determines whether the XML parser or the HTML parser is used to parse the string. Valid values are:
- `text/html`
- `text/xml`
- `application/xml`
- `application/xhtml+xml`
- `image/svg+xml`
A value of `text/html` will invoke the HTML parser, and the method will return an {{domxref("HTMLDocument")}}. Any {{HTMLElement("script")}} element gets marked non-executable, and the contents of {{HTMLElement("noscript")}} are parsed as markup.
The other valid values (`text/xml`, `application/xml`, `application/xhtml+xml`, and `image/svg+xml`) are functionally equivalent. They all invoke the XML parser, and the method will return a {{domxref("XMLDocument")}}.
Any other value is invalid and will cause a [`TypeError`](/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypeError) to be thrown.
### Return value
An {{domxref("HTMLDocument")}} or an {{domxref("XMLDocument")}}, depending on the
`mimeType` argument.
## Examples
### Parsing XML, SVG, and HTML
Note that a MIME type of `text/html` will invoke the HTML parser, and any other valid MIME type will invoke the XML parser. The `application/xml` and `image/svg+xml` MIME types in the example below are functionally identical — the latter does not include any SVG-specific parsing rules. Distinguishing between the two serves only to clarify the code's intent.
```js
const parser = new DOMParser();
const xmlString = "<warning>Beware of the tiger</warning>";
const doc1 = parser.parseFromString(xmlString, "application/xml");
// XMLDocument
const svgString = '<circle cx="50" cy="50" r="50"/>';
const doc2 = parser.parseFromString(svgString, "image/svg+xml");
// XMLDocument
const htmlString = "<strong>Beware of the leopard</strong>";
const doc3 = parser.parseFromString(htmlString, "text/html");
// HTMLDocument
console.log(doc1.documentElement.textContent);
// "Beware of the tiger"
console.log(doc2.firstChild.tagName);
// "circle"
console.log(doc3.body.firstChild.textContent);
// "Beware of the leopard"
```
### Error handling
When using the XML parser with a string that doesn't represent well-formed XML, the {{domxref("XMLDocument")}} returned by `parseFromString` will contain a `<parsererror>` node describing the nature of the parsing error.
```js
const parser = new DOMParser();
const xmlString = "<warning>Beware of the missing closing tag";
const doc = parser.parseFromString(xmlString, "application/xml");
const errorNode = doc.querySelector("parsererror");
if (errorNode) {
// parsing failed
} else {
// parsing succeeded
}
```
Additionally, the parsing error may be reported to the browser's JavaScript console.
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("XMLSerializer")}}
- {{jsxref("JSON.parse()")}} - counterpart for {{jsxref("JSON")}} documents.
| 0 |
data/mdn-content/files/en-us/web/api/domparser | data/mdn-content/files/en-us/web/api/domparser/domparser/index.md | ---
title: "DOMParser: DOMParser() constructor"
short-title: DOMParser()
slug: Web/API/DOMParser/DOMParser
page-type: web-api-constructor
browser-compat: api.DOMParser.DOMParser
---
{{APIRef("DOM")}}
The **`DOMParser()`** constructor creates a new [`DOMParser`](/en-US/docs/Web/API/DOMParser) object. This object can be used to parse the text of a document using the `parseFromString()` method.
## Syntax
```js-nolint
new DOMParser()
```
### Parameters
None.
### Return value
A new [`DOMParser`](/en-US/docs/Web/API/DOMParser) object. This object can be used to parse the text of a document using the `parseFromString()` method.
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api | data/mdn-content/files/en-us/web/api/webgl_api/index.md | ---
title: "WebGL: 2D and 3D graphics for the web"
slug: Web/API/WebGL_API
page-type: web-api-overview
browser-compat:
- api.WebGLRenderingContext
- api.WebGL2RenderingContext
---
{{DefaultAPISidebar("WebGL")}}
**WebGL** (Web Graphics Library) is a JavaScript API for rendering high-performance interactive 3D and 2D graphics within any compatible web browser without the use of plug-ins. WebGL does so by introducing an API that closely conforms to OpenGL ES 2.0 that can be used in HTML {{HTMLElement("canvas")}} elements. This conformance makes it possible for the API to take advantage of hardware graphics acceleration provided by the user's device.
Support for WebGL is present in all modern browsers (see the [compatibility tables](#browser_compatibility) below); however, the user's device must also have hardware that supports these features.
The [WebGL 2](#webgl_2) API introduces support for much of the OpenGL ES 3.0 feature set; it's provided through the {{domxref("WebGL2RenderingContext")}} interface.
The {{HTMLElement("canvas")}} element is also used by the [Canvas API](/en-US/docs/Web/API/Canvas_API) to do 2D graphics on web pages.
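As a minimal sketch of getting started (the canvas ID is an assumption), the code below obtains a rendering context, preferring WebGL 2 and falling back to WebGL 1, then clears the drawing buffer:
```js
const canvas = document.getElementById("gl-canvas");
const gl = canvas.getContext("webgl2") ?? canvas.getContext("webgl");
if (!gl) {
  console.error("WebGL is not supported on this device/browser.");
} else {
  gl.clearColor(0.0, 0.0, 0.0, 1.0); // opaque black
  gl.clear(gl.COLOR_BUFFER_BIT);
}
```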
## Reference
### Standard interfaces
- {{domxref("WebGLRenderingContext")}}
- {{domxref("WebGL2RenderingContext")}}
- {{domxref("WebGLActiveInfo")}}
- {{domxref("WebGLBuffer")}}
- {{domxref("WebGLContextEvent")}}
- {{domxref("WebGLFramebuffer")}}
- {{domxref("WebGLProgram")}}
- {{domxref("WebGLQuery")}}
- {{domxref("WebGLRenderbuffer")}}
- {{domxref("WebGLSampler")}}
- {{domxref("WebGLShader")}}
- {{domxref("WebGLShaderPrecisionFormat")}}
- {{domxref("WebGLSync")}}
- {{domxref("WebGLTexture")}}
- {{domxref("WebGLTransformFeedback")}}
- {{domxref("WebGLUniformLocation")}}
- {{domxref("WebGLVertexArrayObject")}}
### Extensions
- {{domxref("ANGLE_instanced_arrays")}}
- {{domxref("EXT_blend_minmax")}}
- {{domxref("EXT_color_buffer_float")}}
- {{domxref("EXT_color_buffer_half_float")}}
- {{domxref("EXT_disjoint_timer_query")}}
- {{domxref("EXT_float_blend")}} {{experimental_inline}}
- {{domxref("EXT_frag_depth")}}
- {{domxref("EXT_shader_texture_lod")}}
- {{domxref("EXT_sRGB")}}
- {{domxref("EXT_texture_compression_bptc")}}
- {{domxref("EXT_texture_compression_rgtc")}}
- {{domxref("EXT_texture_filter_anisotropic")}}
- {{domxref("EXT_texture_norm16")}}
- {{domxref("KHR_parallel_shader_compile")}}
- {{domxref("OES_draw_buffers_indexed")}}
- {{domxref("OES_element_index_uint")}}
- {{domxref("OES_fbo_render_mipmap")}}
- {{domxref("OES_standard_derivatives")}}
- {{domxref("OES_texture_float")}}
- {{domxref("OES_texture_float_linear")}}
- {{domxref("OES_texture_half_float")}}
- {{domxref("OES_texture_half_float_linear")}}
- {{domxref("OES_vertex_array_object")}}
- {{domxref("OVR_multiview2")}}
- {{domxref("WEBGL_color_buffer_float")}}
- {{domxref("WEBGL_compressed_texture_astc")}}
- {{domxref("WEBGL_compressed_texture_etc")}}
- {{domxref("WEBGL_compressed_texture_etc1")}}
- {{domxref("WEBGL_compressed_texture_pvrtc")}}
- {{domxref("WEBGL_compressed_texture_s3tc")}}
- {{domxref("WEBGL_compressed_texture_s3tc_srgb")}}
- {{domxref("WEBGL_debug_renderer_info")}}
- {{domxref("WEBGL_debug_shaders")}}
- {{domxref("WEBGL_depth_texture")}}
- {{domxref("WEBGL_draw_buffers")}}
- {{domxref("WEBGL_lose_context")}}
- {{domxref("WEBGL_multi_draw")}}
### Events
- {{domxref("HTMLCanvasElement/webglcontextlost_event", "webglcontextlost")}}
- {{domxref("HTMLCanvasElement/webglcontextrestored_event", "webglcontextrestored")}}
- {{domxref("HTMLCanvasElement/webglcontextcreationerror_event", "webglcontextcreationerror")}}
### Constants and types
- [WebGL constants](/en-US/docs/Web/API/WebGL_API/Constants)
- [WebGL types](/en-US/docs/Web/API/WebGL_API/Types)
### WebGL 2
WebGL 2 is a major update to WebGL which is provided through the {{domxref("WebGL2RenderingContext")}} interface. It is based on OpenGL ES 3.0 and new features include:
- [3D textures](/en-US/docs/Web/API/WebGL2RenderingContext/texImage3D),
- [Sampler objects](/en-US/docs/Web/API/WebGLSampler),
- [Uniform Buffer objects](/en-US/docs/Web/API/WebGL2RenderingContext#uniform_buffer_objects),
- [Sync objects](/en-US/docs/Web/API/WebGLSync),
- [Query objects](/en-US/docs/Web/API/WebGLQuery),
- [Transform Feedback objects](/en-US/docs/Web/API/WebGLTransformFeedback),
- Promoted extensions that are now core to WebGL 2: [Vertex Array objects](/en-US/docs/Web/API/WebGLVertexArrayObject), [instancing](/en-US/docs/Web/API/WebGL2RenderingContext/drawArraysInstanced), [multiple render targets](/en-US/docs/Web/API/WebGL2RenderingContext/drawBuffers), [fragment depth](/en-US/docs/Web/API/EXT_frag_depth).
See also the blog post ["WebGL 2 lands in Firefox"](https://hacks.mozilla.org/2017/01/webgl-2-lands-in-firefox/) and [webglsamples.org/WebGL2Samples](https://webglsamples.org/WebGL2Samples/) for a few demos.
## Guides and tutorials
Below, you'll find an assortment of guides to help you learn WebGL concepts and tutorials that offer step-by-step lessons and examples.
### Guides
- [Data in WebGL](/en-US/docs/Web/API/WebGL_API/Data)
- : A guide to variables, buffers, and other types of data used when writing WebGL code.
- [WebGL best practices](/en-US/docs/Web/API/WebGL_API/WebGL_best_practices)
- : Tips and suggestions to help you improve the quality, performance, and reliability of your WebGL content.
- [Using extensions](/en-US/docs/Web/API/WebGL_API/Using_Extensions)
- : A guide to using WebGL extensions.
### Tutorials
- [WebGL tutorial](/en-US/docs/Web/API/WebGL_API/Tutorial)
- : A beginner's guide to WebGL core concepts. A good place to start if you don't have previous WebGL experience.
### Examples
- [A basic 2D WebGL animation example](/en-US/docs/Web/API/WebGL_API/Basic_2D_animation_example)
- : This example demonstrates the simple animation of a one-color shape. Topics examined are adapting to aspect ratio differences, a function to build shader programs from sets of multiple shaders, and the basics of drawing in WebGL.
- [WebGL by example](/en-US/docs/Web/API/WebGL_API/By_example)
- : A series of live samples with short explanations that showcase WebGL concepts and capabilities. The examples are sorted according to topic and level of difficulty, covering the WebGL rendering context, shader programming, textures, geometry, user interaction, and more.
### Advanced tutorials
- [WebGL model view projection](/en-US/docs/Web/API/WebGL_API/WebGL_model_view_projection)
- : A detailed explanation of the three core matrices that are typically used to represent a 3D object view: the model, view and projection matrices.
- [Matrix math for the web](/en-US/docs/Web/API/WebGL_API/Matrix_math_for_the_web)
- : A useful guide to how 3D transform matrices work, and can be used on the web — both for WebGL calculations and in CSS transforms.
## Resources
- [Khronos WebGL site](https://www.khronos.org/webgl/) The main website for WebGL at the Khronos Group.
- [WebGL Fundamentals](https://web.dev/articles/webgl-fundamentals) A basic tutorial with fundamentals of WebGL.
- [Raw WebGL: An introduction to WebGL](https://www.youtube.com/embed/H4c8t6myAWU/?feature=player_detailpage) A talk by Nick Desaulniers that introduces the basics of WebGL.
- [WebGL playground](http://webglplayground.net) An online tool for creating and sharing WebGL projects. Good for quick prototyping and experimenting.
- [WebGL Academy](http://www.webglacademy.com) An HTML/JavaScript editor with tutorials to learn the basics of WebGL programming.
- [WebGL Stats](https://webglreport.com/) A site with statistics about WebGL capabilities in browsers on different platforms.
### Libraries
- [three.js](https://threejs.org/) is an open-source, fully featured 3D WebGL library.
- [Babylon.js](https://www.babylonjs.com) is a powerful, simple, and open game and 3D rendering engine packed into a friendly JavaScript framework.
- [Pixi.js](https://pixijs.com/) is a fast, open-source 2D WebGL renderer.
- [Phaser](https://phaser.io/) is a fast, free and fun open source framework for Canvas and WebGL powered browser games.
- [PlayCanvas](https://playcanvas.com/) is an open-source game engine.
- [glMatrix](https://github.com/toji/gl-matrix) is a JavaScript matrix and vector library for high-performance WebGL apps.
- [twgl](https://twgljs.org) is a library for making WebGL less verbose.
- [RedGL](https://github.com/redcamel/RedGL2) is an open-source 3D WebGL library.
- [vtk.js](https://kitware.github.io/vtk-js/) is a JavaScript library for scientific visualization in your browser.
- [webgl-lint](https://greggman.github.io/webgl-lint/) will help find errors in your WebGL code and provide useful info.
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
### Compatibility notes
In addition to the browser, the GPU itself also needs to support the feature. So, for example, S3 Texture Compression (S3TC) is only available on Tegra-based tablets. Most browsers make the WebGL context available through the `webgl` context name, but older ones need `experimental-webgl` as well. In addition, [WebGL 2](/en-US/docs/Web/API/WebGL2RenderingContext) is fully backwards-compatible and is available through the context name `webgl2`.
### Gecko notes
#### WebGL debugging and testing
Firefox provides two preferences available which let you control the capabilities of WebGL for testing purposes:
- `webgl.min_capability_mode`
- : A Boolean property that, when `true`, enables a minimum capability mode. When in this mode, WebGL is configured to only support the bare minimum feature set and capabilities required by the WebGL specification. This lets you ensure that your WebGL code will work on any device or browser, regardless of their capabilities. This is `false` by default.
- `webgl.disable_extensions`
- : A Boolean property that, when `true`, disables all WebGL extensions. This is `false` by default.
## See also
- [Canvas API](/en-US/docs/Web/API/Canvas_API)
- [Compatibility info about WebGL extensions](/en-US/docs/Web/API/WebGLRenderingContext/getSupportedExtensions#browser_compatibility)
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/matrix_math_for_the_web/index.md | ---
title: Matrix math for the web
slug: Web/API/WebGL_API/Matrix_math_for_the_web
page-type: guide
---
{{DefaultAPISidebar("WebGL")}}
Matrices can be used to represent transformations of objects in space, and are used for performing many key types of computation when constructing images and visualizing data on the Web. This article explores how to create matrices and how to use them with [CSS transforms](/en-US/docs/Web/CSS/CSS_transforms/Using_CSS_transforms) and the `matrix3d` transform type.
While this article uses [CSS](/en-US/docs/Web/CSS) to simplify explanations, matrices are a core concept used by many different technologies including [WebGL](/en-US/docs/Web/API/WebGL_API), the [WebXR](/en-US/docs/Web/API/WebXR_Device_API) (VR and AR) API, and [GLSL shaders](/en-US/docs/Games/Techniques/3D_on_the_web/GLSL_Shaders). This article is also available as an [MDN content kit](https://github.com/gregtatum/mdn-matrix-math). The live examples use a collection of [utility functions](https://github.com/gregtatum/mdn-webgl) available under a global object named `MDN`.
## Transformation matrices
There are many types of matrices, but the ones we are interested in are the 3D transformation matrices. These matrices consist of a set of 16 values arranged in a 4×4 grid. In [JavaScript](/en-US/docs/Web/JavaScript), it is easy to represent a matrix as an array.
Let's begin by considering the **identity matrix**. This is a special transformation matrix which functions much like the number 1 does in scalar multiplication; just like n \* 1 = n, multiplying any matrix by the identity matrix gives a resulting matrix whose values match the original matrix.
The identity matrix looks like this in JavaScript:
```js
let identityMatrix = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1];
```
What does multiplying by the identity matrix look like? The easiest example is to multiply a single point by the identity matrix. Since a 3D point only needs three values (`x`, `y`, and `z`), and the transformation matrix is a 4×4 value matrix, we need to add a fourth dimension to the point. By convention, this dimension is called the **perspective**, and is represented by the letter `w`. For a typical position, setting `w` to 1 will make the math work out.
After adding the `w` component to the point, notice how neatly the matrix and the point line up:
```js-nolint
[1, 0, 0, 0,
0, 1, 0, 0,
0, 0, 1, 0,
0, 0, 0, 1]
[4, 3, 2, 1] // Point at [x, y, z, w]
```
The `w` component has some additional uses that are out of scope for this article. Check out the [WebGL model view projection](/en-US/docs/Web/API/WebGL_API/WebGL_model_view_projection) article for a look into how it comes in handy.
### Multiplying a matrix and a point
In our example code we have defined a function to multiply a matrix and a point — `multiplyMatrixAndPoint()`:
```js
// point • matrix
function multiplyMatrixAndPoint(matrix, point) {
// Give a simple variable name to each part of the matrix, a column and row number
let c0r0 = matrix[0],
c1r0 = matrix[1],
c2r0 = matrix[2],
c3r0 = matrix[3];
let c0r1 = matrix[4],
c1r1 = matrix[5],
c2r1 = matrix[6],
c3r1 = matrix[7];
let c0r2 = matrix[8],
c1r2 = matrix[9],
c2r2 = matrix[10],
c3r2 = matrix[11];
let c0r3 = matrix[12],
c1r3 = matrix[13],
c2r3 = matrix[14],
c3r3 = matrix[15];
// Now set some simple names for the point
let x = point[0];
let y = point[1];
let z = point[2];
let w = point[3];
// Multiply the point against each part of the 1st column, then add together
let resultX = x * c0r0 + y * c0r1 + z * c0r2 + w * c0r3;
// Multiply the point against each part of the 2nd column, then add together
let resultY = x * c1r0 + y * c1r1 + z * c1r2 + w * c1r3;
// Multiply the point against each part of the 3rd column, then add together
let resultZ = x * c2r0 + y * c2r1 + z * c2r2 + w * c2r3;
// Multiply the point against each part of the 4th column, then add together
let resultW = x * c3r0 + y * c3r1 + z * c3r2 + w * c3r3;
return [resultX, resultY, resultZ, resultW];
}
```
Now using the function above we can multiply a point by the matrix. Using the identity matrix it should return a point identical to the original, since a point (or any other matrix) multiplied by the identity matrix is always equal to itself:
```js
// sets identityResult to [4,3,2,1]
let identityResult = multiplyMatrixAndPoint(identityMatrix, [4, 3, 2, 1]);
```
Returning the same point is not very useful, but there are other types of matrices that can perform helpful operations on points. The next sections will demonstrate some of these matrices.
### Multiplying two matrices
In addition to multiplying a matrix and a point together, you can also multiply two matrices together. The function from above can be re-used to help out in this process:
```js
// matrixB • matrixA
function multiplyMatrices(matrixA, matrixB) {
// Slice the second matrix up into rows
let row0 = [matrixB[0], matrixB[1], matrixB[2], matrixB[3]];
let row1 = [matrixB[4], matrixB[5], matrixB[6], matrixB[7]];
let row2 = [matrixB[8], matrixB[9], matrixB[10], matrixB[11]];
let row3 = [matrixB[12], matrixB[13], matrixB[14], matrixB[15]];
// Multiply each row by matrixA
let result0 = multiplyMatrixAndPoint(matrixA, row0);
let result1 = multiplyMatrixAndPoint(matrixA, row1);
let result2 = multiplyMatrixAndPoint(matrixA, row2);
let result3 = multiplyMatrixAndPoint(matrixA, row3);
// Turn the result rows back into a single matrix
return [
result0[0],
result0[1],
result0[2],
result0[3],
result1[0],
result1[1],
result1[2],
result1[3],
result2[0],
result2[1],
result2[2],
result2[3],
result3[0],
result3[1],
result3[2],
result3[3],
];
}
```
Let's look at this function in action:
```js
let someMatrix = [4, 0, 0, 0, 0, 3, 0, 0, 0, 0, 5, 0, 4, 8, 4, 1];
let identityMatrix = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1];
// Returns a new array equivalent to someMatrix
let someMatrixResult = multiplyMatrices(identityMatrix, someMatrix);
```
> **Warning:** These matrix functions are written for clarity of explanation, not for speed or memory management. These functions create a lot of new arrays, which can be particularly expensive for real-time operations due to garbage collection. In real production code it would be best to use optimized functions. [glMatrix](https://glmatrix.net) is an example of a library that has a focus on speed and performance. The focus in the glMatrix library is to have target arrays that are allocated before the update loop.
## Translation matrix
A **translation matrix** is based upon the identity matrix, and is used in 3D graphics to move a point or object in one or more of the three directions (`x`, `y`, and/or `z`). The easiest way to think of a translation is like picking up a coffee cup. The coffee cup must be kept upright and oriented the same way so that no coffee is spilled. It can move up off the table and through the air in space.
You can't actually drink the coffee using only a translation matrix, because to drink it, you have to be able to tilt or rotate the cup to pour the coffee into your mouth. We'll look at the type of matrix (cleverly called a **[rotation matrix](#rotation_matrix)**) you use to do this later.
```js
let x = 50;
let y = 100;
let z = 0;
let translationMatrix = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, x, y, z, 1];
```
Place the distances along the three axes in the corresponding positions in the translation matrix, then multiply it by the point or matrix you need to move through 3D space.
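To check our work, we can run a point through this matrix using the `multiplyMatrixAndPoint()` function defined earlier (the specific numbers below are purely illustrative):
```js
// Move the point (10, 10, 10) with the translation matrix above
let translatedPoint = multiplyMatrixAndPoint(translationMatrix, [10, 10, 10, 1]);
// translatedPoint is [60, 110, 10, 1]: x moved by 50, y by 100, z unchanged
```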
## Manipulating the DOM with a matrix
A really easy way to start using a matrix is to use the CSS {{cssxref("transform-function/matrix3d","matrix3d()")}} {{cssxref("transform")}}. First we'll set up a simple {{htmlelement("div")}} with some content. The style is not shown, but it's set to a fixed width and height and is centered on the page. The `<div>` has a transition set for the transform so that the matrix changes are animated, making it easy to see what is being done.
```html
<div id="move-me" class="transformable">
<h2>Move me with a matrix</h2>
<p>Lorem ipsum dolor sit amet, consectetur adipisicing elit…</p>
</div>
```
Finally, for each example, we will generate a 4×4 matrix, then update the `<div>`'s style to have a transform applied to it, set to a `matrix3d`. Bear in mind that even though the matrix is made up of 4 rows and 4 columns, it collapses into a single line of 16 values. Matrices are always stored in one-dimensional lists in JavaScript.
```js
// Create the matrix3d style property from a matrix array
function matrixArrayToCssMatrix(array) {
return `matrix3d(${array.join(",")})`;
}
// Grab the DOM element
let moveMe = document.getElementById("move-me");
// Returns a result like: "matrix3d(1,0,0,0,0,1,0,0,0,0,1,0,50,100,0,1)"
let matrix3dRule = matrixArrayToCssMatrix(translationMatrix);
// Set the transform
moveMe.style.transform = matrix3dRule;
```
[View on JSFiddle](https://jsfiddle.net/tatumcreative/g24mgw6y/)

## Scale matrix
A **scale matrix** makes something larger or smaller in one or more of the three dimensions: width, height, and depth. In typical (cartesian) coordinates, this causes stretching or contracting of the object in the corresponding directions.
The amounts of change to apply to the width, height, and depth are placed diagonally, starting at the top-left corner and making their way down toward the bottom-right.
```js
let w = 1.5; // width (x)
let h = 0.7; // height (y)
let d = 1; // depth (z)
let scaleMatrix = [w, 0, 0, 0, 0, h, 0, 0, 0, 0, d, 0, 0, 0, 0, 1];
```
[View on JSFiddle](https://jsfiddle.net/tatumcreative/fndd6e1b/)

## Rotation matrix
A **rotation matrix** is used to rotate a point or object. Rotation matrices look a little bit more complicated than scaling and transform matrices. They use trigonometric functions to perform the rotation. While this section won't break the steps down into exhaustive detail (check out [this article on Wolfram MathWorld](https://mathworld.wolfram.com/RotationMatrix.html) for that), take this example for illustration.
First, here's code that rotates a point around the origin without using matrices.
```js
// Manually rotating a point about the origin without matrices
let point = [10, 2];
// Calculate the point's distance and angle from the origin
let distance = Math.sqrt(point[0] * point[0] + point[1] * point[1]);
let angle = Math.atan2(point[1], point[0]);
// The equivalent of 60 degrees, in radians
let rotationInRadians = Math.PI / 3;
// Add the rotation to the point's existing angle, keeping its distance
let transformedPoint = [
  Math.cos(angle + rotationInRadians) * distance,
  Math.sin(angle + rotationInRadians) * distance,
];
```
It is possible to encode these types of steps into a matrix, and do it for each of the `x`, `y`, and `z` dimensions. Below is the representation of a counterclockwise rotation about the Z axis in a left-handed coordinate system:
```js
let sin = Math.sin;
let cos = Math.cos;
// NOTE: There is no perspective in these transformations, so a rotation
// at this point will only appear to shrink the div
let a = Math.PI * 0.3; // Rotation amount in radians
// Rotate around Z axis
let rotateZMatrix = [
  cos(a), -sin(a), 0, 0,
  sin(a), cos(a), 0, 0,
  0, 0, 1, 0,
  0, 0, 0, 1,
];
```
[View on JSFiddle](https://jsfiddle.net/tatumcreative/9vr2dorz/)

Here are a set of functions that return rotation matrices for rotating around each of the three axes. One big note is that there is no perspective applied, so it might not feel very 3D yet. The flatness is equivalent to when a camera zooms in really close onto an object in the distance — the sense of perspective disappears.
```js
function rotateAroundXAxis(a) {
return [1, 0, 0, 0, 0, cos(a), -sin(a), 0, 0, sin(a), cos(a), 0, 0, 0, 0, 1];
}
function rotateAroundYAxis(a) {
return [cos(a), 0, sin(a), 0, 0, 1, 0, 0, -sin(a), 0, cos(a), 0, 0, 0, 0, 1];
}
function rotateAroundZAxis(a) {
return [cos(a), -sin(a), 0, 0, sin(a), cos(a), 0, 0, 0, 0, 1, 0, 0, 0, 0, 1];
}
```
[View on JSFiddle](https://jsfiddle.net/tatumcreative/tk072doc/)
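As a quick sanity check (illustrative values, reusing `multiplyMatrixAndPoint()` from earlier), rotating a point that lies on the x axis a quarter turn around the Z axis moves it onto the y axis:
```js
// Rotate the point (10, 0, 0) a quarter turn around the Z axis
let rotatedPoint = multiplyMatrixAndPoint(rotateAroundZAxis(Math.PI * 0.5), [
  10, 0, 0, 1,
]);
// rotatedPoint is approximately [0, -10, 0, 1]; the tiny non-zero x
// comes from floating point error in cos(PI / 2). The y value is negative
// because the y axis points down in CSS's coordinate space.
```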
## Matrix composition
The real power of matrices comes from **matrix composition**. When matrices of a certain class are multiplied together they preserve the history of the transformations and are reversible. This means that if a translation, rotation, and scale matrix are all combined together, when the order of the matrices is reversed and re-applied, then the original points are returned.
The order that matrices are multiplied in matters. When multiplying numbers, a \* b = c, and b \* a = c are both true. For example 3 \* 4 = 12, and 4 \* 3 = 12. In math, these numbers would be described as **commutative**. Matrices are _not_ guaranteed to be the same if the order is switched, so matrices are **non-commutative**.
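We can demonstrate this with the `multiplyMatrices()` function from above; composing a translation and a scale in the two possible orders produces different matrices (the values here are illustrative):
```js
let moveRight = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 50, 0, 0, 1]; // translate x by 50
let doubleSize = [2, 0, 0, 0, 0, 2, 0, 0, 0, 0, 2, 0, 0, 0, 0, 1]; // scale by 2
// Scale first, then translate: the x translation stays 50
let scaleThenMove = multiplyMatrices(moveRight, doubleSize);
// Translate first, then scale: the translation itself is doubled to 100
let moveThenScale = multiplyMatrices(doubleSize, moveRight);
// scaleThenMove[12] is 50 while moveThenScale[12] is 100, so the order matters
```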
Another mind-bender is that matrix multiplication in WebGL and CSS needs to happen in the reverse order that the operations intuitively happen. For instance, to scale something down to 80% of its size, move it down 200 pixels, and then rotate about the origin 90 degrees would look something like the following in pseudocode.
```plain
transformation = rotate * translate * scale
```
### Composing multiple transformations
The function that we will be using to compose our matrices is `multiplyArrayOfMatrices()`, which is part of the set of [utility functions](https://github.com/gregtatum/mdn-webgl) introduced near the top of this article. It takes an array of matrices and multiplies them together, returning the result. In WebGL shader code, this is built into the language and the `*` operator can be used. Additionally, this example uses `scale()` and `translate()` functions, which return matrices as defined above.
```js
let transformMatrix = MDN.multiplyArrayOfMatrices([
  rotateAroundZAxis(Math.PI * 0.5), // Step 3: rotate 90 degrees around the Z axis
translate(0, 200, 0), // Step 2: move down 200 pixels
scale(0.8, 0.8, 0.8), // Step 1: scale down
]);
```
[View on JSFiddle](https://jsfiddle.net/tatumcreative/qxxg3yvc/)

Finally, a fun step to show how matrices work is to reverse the steps to bring the matrix back to the original identity matrix.
```js
let transformMatrix = MDN.multiplyArrayOfMatrices([
scale(1.25, 1.25, 1.25), // Step 6: scale back up
translate(0, -200, 0), // Step 5: move back up
rotateAroundZAxis(-Math.PI * 0.5), // Step 4: rotate back
  rotateAroundZAxis(Math.PI * 0.5), // Step 3: rotate 90 degrees around the Z axis
translate(0, 200, 0), // Step 2: move down 200 pixels
scale(0.8, 0.8, 0.8), // Step 1: scale down
]);
```
## Why matrices are important
Matrices are important because they comprise a small set of numbers that can describe a wide range of transformations in space. They can easily be shared around in programs. Different coordinate spaces can be described with matrices, and some matrix multiplication will move one set of data from one coordinate space to another coordinate space. Matrices effectively remember every part of the previous transforms that were used to generate them.
For uses in WebGL, the graphics card is particularly good at multiplying a large number of points in space by matrices. Different operations like positioning points, calculating lighting, and posing animated characters all rely on this fundamental tool.
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/compressed_texture_formats/index.md | ---
title: Compressed texture formats
slug: Web/API/WebGL_API/Compressed_texture_formats
page-type: guide
---
{{DefaultAPISidebar("WebGL")}}
The WebGL API provides methods to use compressed texture formats. These are useful to increase texture detail while limiting the additional video memory necessary. By default, no compressed formats are available: a corresponding compressed texture format extension must first be enabled.
## Usage
Unless otherwise specified, this article applies to both WebGL 1 and 2 contexts.
If supported, textures can be stored in a compressed format in video memory. This allows for additional detail while limiting the added video memory necessary. Textures are uncompressed on the fly when being accessed by a shader. Note that this advantage doesn't translate to network bandwidth: while the formats are better than uncompressed data, they are in general far worse than standard image formats such as PNG and JPG.
Because compressed textures require hardware support, no specific formats are required by WebGL; instead, a context can make different formats available, depending on hardware support. [This site](https://toji.github.io/texture-tester/) shows which formats are supported in the browser you are using.
Usage of compressed formats first requires activating the respective extension with {{domxref("WebGLRenderingContext.getExtension()")}}. If supported, it will return an extension object with constants for the added formats and the formats will also be returned by calls to `gl.getParameter(gl.COMPRESSED_TEXTURE_FORMATS)`. (E.g. `ext.COMPRESSED_RGBA_S3TC_DXT1_EXT` for the {{domxref("WEBGL_compressed_texture_s3tc")}} extension.) These can then be used with {{domxref("WebGLRenderingContext.compressedTexImage2D()", "compressedTexImage[23]D")}} or {{domxref("WebGLRenderingContext.compressedTexSubImage2D()", "compressedTexSubImage[23]D")}} instead of `texImage2D` calls.
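For example, a minimal sketch of such a query (assuming `gl` is a WebGL context):
```js
// Enable a compression extension, then list the formats the context exposes.
// The list is empty until at least one compression extension has been enabled.
const ext = gl.getExtension("WEBGL_compressed_texture_s3tc");
const formats = gl.getParameter(gl.COMPRESSED_TEXTURE_FORMATS);
console.log(formats); // includes ext.COMPRESSED_RGBA_S3TC_DXT1_EXT if supported
```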
Note that WebGL makes no functionality available to compress or decompress textures: they must already be in a compressed format and can then be directly uploaded to video memory.
All formats support 2D textures. Whether a format also supports the `TEXTURE_2D_ARRAY` and `TEXTURE_3D` targets (in combination with `compressedTexImage3D`) is noted in the following table.
| Extension | Notes | TEXTURE_2D_ARRAY | TEXTURE_3D |
| ---------------------------------- | ---------------------------------------------------------- | ---------------- | ---------- |
| WEBGL_compressed_texture_astc | | Yes | Yes |
| WEBGL_compressed_texture_etc | | Yes | No |
| WEBGL_compressed_texture_etc1\* | Not usable with compressedTexSubImage2D/copyTexSubImage2D. | No | No |
| WEBGL_compressed_texture_pvrtc | Width and height must be powers of 2. | No | No |
| WEBGL_compressed_texture_s3tc | Width and height must be multiples of 4. | Yes | No |
| WEBGL_compressed_texture_s3tc_srgb | Width and height must be multiples of 4. | ? | No |
## Examples
```js
async function getCompressedTextureIfAvailable(gl) {
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture); // create texture object on GPU
const ext = gl.getExtension("WEBGL_compressed_texture_s3tc"); // will be null if not supported
if (ext) {
// the file is already in the correct compressed format
const dataArrayBuffer = await fetch(
"/textures/foobar512x512.RGBA_S3TC_DXT1",
).then((response) => response.arrayBuffer());
gl.compressedTexImage2D(
gl.TEXTURE_2D,
0, // set the base image level
ext.COMPRESSED_RGBA_S3TC_DXT1_EXT, // the compressed format we are using
512,
512, // width, height of the image
0, // border, always 0
new DataView(dataArrayBuffer),
);
    gl.generateMipmap(gl.TEXTURE_2D); // create mipmap levels, like we would for a standard image
return texture;
}
}
```
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/using_extensions/index.md | ---
title: Using WebGL extensions
slug: Web/API/WebGL_API/Using_Extensions
page-type: guide
---
{{DefaultAPISidebar("WebGL")}}
WebGL, like its sister APIs (OpenGL and OpenGL ES), supports extensions. A complete list of extensions is available in the [khronos webgl extension registry](https://www.khronos.org/registry/webgl/extensions/).
> **Note:** In WebGL, unlike in other GL APIs, extensions are only available if explicitly requested.
## Canonical extension names, vendor prefixes and preferences
Extensions may be supported by browser vendors before being officially ratified (but only when they are in draft stage). In that case, their name can be prefixed by the vendor prefix (`MOZ_`, `WEBKIT_`, etc.) or the extension is only available once a browser preference has been toggled.
If you wish to work with the bleeding edge of extensions and want your code to keep working upon ratification (assuming, of course, that the extension doesn't change in incompatible ways), it is recommended that you query the canonical extension name as well as the vendor extension names. For instance:
```js
const ext =
gl.getExtension("OES_vertex_array_object") ||
gl.getExtension("MOZ_OES_vertex_array_object") ||
gl.getExtension("WEBKIT_OES_vertex_array_object");
```
Note that vendor prefixes have been discouraged, so most browsers implement experimental extensions behind a feature flag rather than a vendor prefix.
The feature flags are:
- `webgl.enable-draft-extensions` in Firefox
- `chrome://flags/#enable-webgl-draft-extensions` in Chromium-based browsers (Chrome, Opera).
## Naming conventions
WebGL extensions are prefixed with "ANGLE", "OES", "EXT", "OVR" or "WEBGL". These prefixes reflect origin and intent:
- `ANGLE_`: Extensions that are written by the [ANGLE library](https://en.wikipedia.org/wiki/ANGLE_%28software%29) authors.
- `OES_` and `KHR_`: Extensions that mirror functionality from OpenGL ES (OES) or OpenGL API extensions approved by the respective architecture review boards (Khronos).
- `OVR_`: Extensions that optimize for virtual reality.
- `EXT_`: Extensions that mirror other OpenGL ES or OpenGL API extensions.
- `WEBGL_`: Extensions that are WebGL-specific and intended to be compatible with multiple web browsers. It should also be used for extensions which originated with the OpenGL ES or OpenGL APIs, but whose behavior has been significantly altered.
## Querying available extensions
The WebGL context supports querying what extensions are available.
```js
const available_extensions = gl.getSupportedExtensions();
```
The {{domxref("WebGLRenderingContext.getSupportedExtensions()")}} method returns an array of strings, one for each supported extension.
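For example, a minimal sketch that checks for a particular extension by name before enabling it:
```js
if (gl.getSupportedExtensions().includes("OES_texture_float")) {
  gl.getExtension("OES_texture_float"); // enable it
  // floating-point textures are now available to this context
}
```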
## Extension list
The current extensions are:
- {{domxref("ANGLE_instanced_arrays")}}
- {{domxref("EXT_blend_minmax")}}
- {{domxref("EXT_color_buffer_float")}}
- {{domxref("EXT_color_buffer_half_float")}}
- {{domxref("EXT_disjoint_timer_query")}}
- {{domxref("EXT_float_blend")}} {{experimental_inline}}
- {{domxref("EXT_frag_depth")}}
- {{domxref("EXT_shader_texture_lod")}}
- {{domxref("EXT_sRGB")}}
- {{domxref("EXT_texture_compression_bptc")}}
- {{domxref("EXT_texture_compression_rgtc")}}
- {{domxref("EXT_texture_filter_anisotropic")}}
- {{domxref("EXT_texture_norm16")}}
- {{domxref("KHR_parallel_shader_compile")}}
- {{domxref("OES_draw_buffers_indexed")}}
- {{domxref("OES_element_index_uint")}}
- {{domxref("OES_fbo_render_mipmap")}}
- {{domxref("OES_standard_derivatives")}}
- {{domxref("OES_texture_float")}}
- {{domxref("OES_texture_float_linear")}}
- {{domxref("OES_texture_half_float")}}
- {{domxref("OES_texture_half_float_linear")}}
- {{domxref("OES_vertex_array_object")}}
- {{domxref("OVR_multiview2")}}
- {{domxref("WEBGL_color_buffer_float")}}
- {{domxref("WEBGL_compressed_texture_astc")}}
- {{domxref("WEBGL_compressed_texture_etc")}}
- {{domxref("WEBGL_compressed_texture_etc1")}}
- {{domxref("WEBGL_compressed_texture_pvrtc")}}
- {{domxref("WEBGL_compressed_texture_s3tc")}}
- {{domxref("WEBGL_compressed_texture_s3tc_srgb")}}
- {{domxref("WEBGL_debug_renderer_info")}}
- {{domxref("WEBGL_debug_shaders")}}
- {{domxref("WEBGL_depth_texture")}}
- {{domxref("WEBGL_draw_buffers")}}
- {{domxref("WEBGL_lose_context")}}
- {{domxref("WEBGL_multi_draw")}}
## Enabling an extension
Before an extension can be used it has to be enabled using {{domxref("WebGLRenderingContext.getExtension()")}}. For example:
```js
const float_texture_ext = gl.getExtension("OES_texture_float");
```
The return value is `null` if the extension is not supported, or an extension object otherwise.
## Extension objects
If an extension defines specific symbols or functions that are not available in the core specification of WebGL, they will be available on the extension object returned by the call to `gl.getExtension()`.
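For example, the {{domxref("EXT_texture_filter_anisotropic")}} extension exposes its constants on the returned object rather than on the context (a sketch; the texture setup itself is omitted):
```js
const ext = gl.getExtension("EXT_texture_filter_anisotropic");
if (ext) {
  // Both constants live on the extension object, not on `gl`
  const max = gl.getParameter(ext.MAX_TEXTURE_MAX_ANISOTROPY_EXT);
  gl.texParameterf(gl.TEXTURE_2D, ext.TEXTURE_MAX_ANISOTROPY_EXT, max);
}
```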
## See also
- {{domxref("WebGLRenderingContext.getSupportedExtensions()")}}
- {{domxref("WebGLRenderingContext.getExtension()")}}
- [webglreport.com](https://webglreport.com/)
- [web3dsurvey.com - WebGL Extension Support Survey](https://web3dsurvey.com/)
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/data/index.md | ---
title: Data in WebGL
slug: Web/API/WebGL_API/Data
page-type: guide
---
{{DefaultAPISidebar("WebGL")}}
Shader programs have access to three kinds of data storage, each of which has a specific use case. Each kind of variable is accessible by one or both types of shader program (depending on the data store type) and possibly by the site's JavaScript code, depending on the specific type of variable.
## GLSL data types
See [Data Types](<https://www.khronos.org/opengl/wiki/Data_Type_(GLSL)>) in the GLSL documentation.
## GLSL variables
There are three kinds of "variable" or data storage available in GLSL, each with its own purpose and use cases: **[attributes](#attributes)**, **[varyings](#varyings)**, and **[uniforms](#uniforms)**.
### Attributes
**Attributes** are GLSL variables which are only available to the vertex shader (as variables) and the JavaScript code. Attributes are typically used to store color information, texture coordinates, and any other data calculated or retrieved that needs to be shared between the JavaScript code and the vertex shader.
```js
// init colors
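// Note: vec4() and flatten() are helpers from an external vector library, not part of WebGL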
const vertexColors = [
vec4(0.0, 0.0, 0.0, 1.0), // black
vec4(1.0, 0.0, 0.0, 1.0), // red
vec4(1.0, 1.0, 0.0, 1.0), // yellow
vec4(0.0, 1.0, 0.0, 1.0), // green
vec4(0.0, 0.0, 0.0, 1.0), // black
vec4(1.0, 0.0, 0.0, 1.0), // red
vec4(1.0, 1.0, 0.0, 1.0), // yellow
vec4(0.0, 1.0, 0.0, 1.0), // green
];
const cBuffer = gl.createBuffer();
```
```js
// continued
// create buffer to store colors and reference it to "vColor" which is in GLSL
gl.bindBuffer(gl.ARRAY_BUFFER, cBuffer);
gl.bufferData(gl.ARRAY_BUFFER, flatten(vertexColors), gl.STATIC_DRAW);
const vColor = gl.getAttribLocation(program, "vColor");
gl.vertexAttribPointer(vColor, 4, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(vColor);
```
```cpp
// GLSL vertex shader
attribute vec4 vColor;
varying vec4 fColor;
void main()
{
fColor = vColor;
}
```
### Varyings
**Varyings** are variables that are declared by the vertex shader and used to pass data from the vertex shader to the fragment shader. This is commonly used to share a vertex's [normal vector](<https://en.wikipedia.org/wiki/Normal_(geometry)>) after it has been computed by the vertex shader.
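As a minimal sketch (the variable names here are illustrative), a varying is declared with the same name and type in both shaders; the vertex shader writes it, and the fragment shader receives the value interpolated across the primitive:
```js
// Vertex shader source: declares and writes the varying
const vsSource = `
  attribute vec4 aVertexPosition;
  attribute vec3 aVertexNormal;
  varying highp vec3 vNormal;
  void main() {
    gl_Position = aVertexPosition;
    vNormal = aVertexNormal;
  }
`;
// Fragment shader source: reads the interpolated value
const fsSource = `
  varying highp vec3 vNormal;
  void main() {
    gl_FragColor = vec4(normalize(vNormal) * 0.5 + 0.5, 1.0);
  }
`;
```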
### Uniforms
**Uniforms** are set by the JavaScript code and are available to both the vertex and fragment shaders. They're used to provide values that will be the same for everything drawn in the frame, such as lighting positions and magnitudes, global transformation and perspective details, and so forth.
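As a minimal sketch (the names are illustrative), the uniform is declared in GLSL, its location is looked up once from JavaScript, and its value is set before drawing:
```js
// In GLSL (either shader): uniform mediump vec4 uGlobalColor;
// In JavaScript, look the location up once after linking the program,
// then set it; the value is the same for every vertex and fragment drawn
const uGlobalColor = gl.getUniformLocation(program, "uGlobalColor");
gl.useProgram(program);
gl.uniform4fv(uGlobalColor, [0.1, 0.7, 0.2, 1.0]);
```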
## Buffers
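In brief, and as a sketch only (the tutorial pages cover this in full): a buffer is created, bound to a binding point, and filled with data.
```js
const positionBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
// STATIC_DRAW hints that the data will not change often
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([0, 0, 0, 1, 1, 0]), gl.STATIC_DRAW);
```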
## Textures
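In brief, and as a sketch only (see [Using textures in WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL) for full context): a texture object is created, bound, and filled with image data. Here `image` is assumed to be an already-loaded image, canvas, or video source:
```js
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
```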
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/tutorial/index.md | ---
title: WebGL tutorial
slug: Web/API/WebGL_API/Tutorial
page-type: guide
---
{{DefaultAPISidebar("WebGL")}}
This tutorial describes how to use the {{HTMLElement("canvas")}} element to draw WebGL graphics, starting with the basics. The examples provided should give you some clear ideas of what you can do with WebGL and will provide code snippets that may get you started in building your own content.
[WebGL](https://www.khronos.org/webgl/) enables web content to use an API based on [OpenGL ES](https://www.khronos.org/opengles/) 2.0 to perform 3D rendering in an HTML `<canvas>` in browsers that support it without the use of plug-ins. WebGL programs consist of control code written in JavaScript and special effects code (shader code) that is executed on a computer's Graphics Processing Unit (GPU). WebGL elements can be mixed with other HTML elements and composited with other parts of the page or page background.
## Before you start
Using the `<canvas>` element is not very difficult, but you do need a basic understanding of [HTML](/en-US/docs/Web/HTML) and [JavaScript](/en-US/docs/Web/JavaScript). The `<canvas>` element and WebGL are not supported in some older browsers, but are supported in recent versions of all major browsers. In order to draw graphics on the canvas we use a JavaScript context object, which creates graphics on the fly.
## In this tutorial
- [Getting started with WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Getting_started_with_WebGL)
- : How to set up a WebGL context.
- [Adding 2D content to a WebGL context](/en-US/docs/Web/API/WebGL_API/Tutorial/Adding_2D_content_to_a_WebGL_context)
- : How to render simple flat shapes using WebGL.
- [Using shaders to apply color in WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Using_shaders_to_apply_color_in_WebGL)
- : Demonstrates how to add color to shapes using shaders.
- [Animating objects with WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Animating_objects_with_WebGL)
- : Shows how to rotate and translate objects to create simple animations.
- [Creating 3D objects using WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Creating_3D_objects_using_WebGL)
- : Shows how to create and animate a 3D object (in this case, a cube).
- [Using textures in WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL)
- : Demonstrates how to map textures onto the faces of an object.
- [Lighting in WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Lighting_in_WebGL)
- : How to simulate lighting effects in your WebGL context.
- [Animating textures in WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Animating_textures_in_WebGL)
- : Shows how to animate textures; in this case, by mapping an Ogg video onto the faces of a rotating cube.
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/tutorial | data/mdn-content/files/en-us/web/api/webgl_api/tutorial/creating_3d_objects_using_webgl/index.md | ---
title: Creating 3D objects using WebGL
slug: Web/API/WebGL_API/Tutorial/Creating_3D_objects_using_WebGL
page-type: guide
---
{{DefaultAPISidebar("WebGL")}} {{PreviousNext("Web/API/WebGL_API/Tutorial/Animating_objects_with_WebGL", "Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL")}}
Let's take our square plane into three dimensions by adding five more faces to create a cube. To do this efficiently, we're going to switch from drawing using the vertices directly by calling the {{domxref("WebGLRenderingContext.drawArrays()", "gl.drawArrays()")}} method to using the vertex array as a table, and referencing individual vertices in that table to define the positions of each face's vertices, by calling {{domxref("WebGLRenderingContext.drawElements()", "gl.drawElements()")}}.
Consider: each face requires four vertices to define it, but each vertex is shared by three faces. We can pass a lot less data around by building an array of all 24 vertices, then referring to each vertex by its index into that array instead of moving entire sets of coordinates around. If you're wondering why we need 24 vertices, and not just 8, it is because each corner belongs to three faces of different colors, and a single vertex needs to have a single specific color; therefore we will create three copies of each vertex in three different colors, one for each face.
## Define the positions of the cube's vertices
First, let's build the cube's vertex position buffer by updating the code in `initBuffers()`. This is pretty much the same as it was for the square plane, but somewhat longer since there are 24 vertices (4 per side).
> **Note:** In the `initPositionBuffer()` function of your "init-buffers.js" module, replace the `positions` declaration with this code:
```js
const positions = [
// Front face
-1.0, -1.0, 1.0, 1.0, -1.0, 1.0, 1.0, 1.0, 1.0, -1.0, 1.0, 1.0,
// Back face
-1.0, -1.0, -1.0, -1.0, 1.0, -1.0, 1.0, 1.0, -1.0, 1.0, -1.0, -1.0,
// Top face
-1.0, 1.0, -1.0, -1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, -1.0,
// Bottom face
-1.0, -1.0, -1.0, 1.0, -1.0, -1.0, 1.0, -1.0, 1.0, -1.0, -1.0, 1.0,
// Right face
1.0, -1.0, -1.0, 1.0, 1.0, -1.0, 1.0, 1.0, 1.0, 1.0, -1.0, 1.0,
// Left face
-1.0, -1.0, -1.0, -1.0, -1.0, 1.0, -1.0, 1.0, 1.0, -1.0, 1.0, -1.0,
];
```
Since we've added a z-component to our vertices, we need to update the `numComponents` of our `vertexPosition` attribute to 3.
> **Note:** In the `setPositionAttribute()` function of your "draw-scene.js" module, change the `numComponents` constant from `2` to `3`:
```js
const numComponents = 3;
```
## Define the vertices' colors
We also need to build an array of colors for each of the 24 vertices. This code starts by defining a color for each face, then uses a loop to assemble an array of all the colors for each of the vertices.
> **Note:** In the `initColorBuffer()` function of your "init-buffers.js" module, replace the `colors` declaration with this code:
```js
const faceColors = [
[1.0, 1.0, 1.0, 1.0], // Front face: white
[1.0, 0.0, 0.0, 1.0], // Back face: red
[0.0, 1.0, 0.0, 1.0], // Top face: green
[0.0, 0.0, 1.0, 1.0], // Bottom face: blue
[1.0, 1.0, 0.0, 1.0], // Right face: yellow
[1.0, 0.0, 1.0, 1.0], // Left face: purple
];
// Convert the array of colors into a table for all the vertices.
let colors = [];
for (let j = 0; j < faceColors.length; ++j) {
const c = faceColors[j];
// Repeat each color four times for the four vertices of the face
colors = colors.concat(c, c, c, c);
}
```
## Define the element array
Once the vertex arrays are generated, we need to build the element array.
> **Note:** In your "init-buffers.js" module, add the following function:
```js
function initIndexBuffer(gl) {
const indexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
// This array defines each face as two triangles, using the
// indices into the vertex array to specify each triangle's
// position.
  const indices = [
    0, 1, 2, 0, 2, 3, // front
    4, 5, 6, 4, 6, 7, // back
    8, 9, 10, 8, 10, 11, // top
    12, 13, 14, 12, 14, 15, // bottom
    16, 17, 18, 16, 18, 19, // right
    20, 21, 22, 20, 22, 23, // left
  ];
// Now send the element array to GL
gl.bufferData(
gl.ELEMENT_ARRAY_BUFFER,
new Uint16Array(indices),
gl.STATIC_DRAW,
);
return indexBuffer;
}
```
The `indices` array defines each face as a pair of triangles, specifying each triangle's vertices as an index into the cube's vertex arrays. Thus the cube is described as a collection of 12 triangles.
Next, you need to call this new function from `initBuffers()`, and return the buffer it creates.
> **Note:** At the end of the `initBuffers()` function of your "init-buffers.js" module, add the following code, replacing the existing `return` statement:
```js
const indexBuffer = initIndexBuffer(gl);
return {
position: positionBuffer,
color: colorBuffer,
indices: indexBuffer,
};
```
## Drawing the cube
Next we need to add code to our `drawScene()` function to draw using the cube's index buffer, adding new {{domxref("WebGLRenderingContext.bindBuffer()", "gl.bindBuffer()")}} and {{domxref("WebGLRenderingContext.drawElements()", "gl.drawElements()")}} calls.
> **Note:** In your `drawScene()` function, add the following code just before the `gl.useProgram` line:
```js
// Tell WebGL which indices to use to index the vertices
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, buffers.indices);
```
> **Note:** In the `drawScene()` function of your "draw-scene.js" module, replace the block just after the two `gl.uniformMatrix4fv` calls, that contains the `gl.drawArrays()` line, with the following block:
```js
{
const vertexCount = 36;
const type = gl.UNSIGNED_SHORT;
const offset = 0;
gl.drawElements(gl.TRIANGLES, vertexCount, type, offset);
}
```
Since each face of our cube is composed of two triangles, there are 6 vertices per side, or 36 total vertices in the cube, even though many of them are duplicates.
Finally, let's replace our variable `squareRotation` by `cubeRotation` and add a second rotation around the x axis.
> **Note:** At the start of your "webgl-demo.js" file, replace the `squareRotation` declaration with this line:
```js
let cubeRotation = 0.0;
```
> **Note:** In your `drawScene()` function declaration, replace the `squareRotation` with `cubeRotation`:
```js-nolint
function drawScene(gl, programInfo, buffers, cubeRotation) {
```
> **Note:** In your `drawScene()` function, replace the `mat4.rotate` call with the following code:
```js
mat4.rotate(
modelViewMatrix, // destination matrix
modelViewMatrix, // matrix to rotate
cubeRotation, // amount to rotate in radians
[0, 0, 1],
); // axis to rotate around (Z)
mat4.rotate(
modelViewMatrix, // destination matrix
modelViewMatrix, // matrix to rotate
cubeRotation * 0.7, // amount to rotate in radians
[0, 1, 0],
); // axis to rotate around (Y)
mat4.rotate(
modelViewMatrix, // destination matrix
modelViewMatrix, // matrix to rotate
cubeRotation * 0.3, // amount to rotate in radians
[1, 0, 0],
); // axis to rotate around (X)
```
> **Note:** In your `main()` function, replace the code that calls `drawScene()` and updates `squareRotation` to pass in and update `cubeRotation` instead:
```js
drawScene(gl, programInfo, buffers, cubeRotation);
cubeRotation += deltaTime;
```
At this point, we now have an animated cube rotating, its six faces rather vividly colored.
{{EmbedGHLiveSample('dom-examples/webgl-examples/tutorial/sample5/index.html', 670, 510) }}
[View the complete code](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample5) | [Open this demo on a new page](https://mdn.github.io/dom-examples/webgl-examples/tutorial/sample5/)
{{PreviousNext("Web/API/WebGL_API/Tutorial/Animating_objects_with_WebGL", "Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/tutorial | data/mdn-content/files/en-us/web/api/webgl_api/tutorial/animating_textures_in_webgl/index.md | ---
title: Animating textures in WebGL
slug: Web/API/WebGL_API/Tutorial/Animating_textures_in_WebGL
page-type: guide
---
{{DefaultAPISidebar("WebGL")}} {{Previous("Web/API/WebGL_API/Tutorial/Lighting_in_WebGL")}}
In this demonstration, we build upon the previous example by replacing our static textures with the frames of an mp4 video file that's playing. This is actually pretty easy to do and fun to watch, so let's get started. You can use similar code to use any sort of data (such as a {{ HTMLElement("canvas") }}) as the source for your textures.
## Getting access to the video
The first step is to create the {{ HTMLElement("video") }} element that we'll use to retrieve the video frames.
> **Note:** Add this declaration to the start of your "webgl-demo.js" script:
```js
// will be set to true when the video can be copied to the texture
let copyVideo = false;
```
> **Note:** Add this function to your "webgl-demo.js" script:
```js
function setupVideo(url) {
const video = document.createElement("video");
let playing = false;
let timeupdate = false;
video.playsInline = true;
video.muted = true;
video.loop = true;
// Waiting for these 2 events ensures
// there is data in the video
video.addEventListener(
"playing",
() => {
playing = true;
checkReady();
},
true,
);
video.addEventListener(
"timeupdate",
() => {
timeupdate = true;
checkReady();
},
true,
);
video.src = url;
video.play();
function checkReady() {
if (playing && timeupdate) {
copyVideo = true;
}
}
return video;
}
```
First we create a video element. We set it to play inline, mute the sound, and loop the video. We then set up two events to make sure the video is playing and the time has been updated. We need both of these checks because it will produce an error if you upload a video to WebGL that has no data available yet. Checking for both of these events guarantees there is data available and it's safe to start uploading video to a WebGL texture. In the code above, we confirm whether we got both of those events; if so, we set a global variable, `copyVideo`, to true to indicate that it's safe to start copying the video to a texture.
And finally, we set the `src` attribute to start loading the video, and call `play()` to start playing it.
The video must be loaded from a secure source in order to be used to provide texture data to WebGL. That means that you'll not only need to deploy code like this using a secure web server, but you'll need a secure server to test with as well. See [How do you set up a local testing server?](/en-US/docs/Learn/Common_questions/Tools_and_setup/set_up_a_local_testing_server) for help.
## Using the video frames as a texture
The next change is to initialize the texture, which becomes much simpler, since we no longer need to load an image file. Instead, we create an empty texture object, put a single pixel in it, and set its filtering for later use.
> **Note:** Replace the `loadTexture()` function in "webgl-demo.js" with the following code:
```js
function initTexture(gl) {
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
  // Because video frames have to be downloaded over the internet,
  // they might take a moment to be ready, so
// put a single pixel in the texture so we can
// use it immediately.
const level = 0;
const internalFormat = gl.RGBA;
const width = 1;
const height = 1;
const border = 0;
const srcFormat = gl.RGBA;
const srcType = gl.UNSIGNED_BYTE;
const pixel = new Uint8Array([0, 0, 255, 255]); // opaque blue
gl.texImage2D(
gl.TEXTURE_2D,
level,
internalFormat,
width,
height,
border,
srcFormat,
srcType,
pixel,
);
// Turn off mips and set wrapping to clamp to edge so it
// will work regardless of the dimensions of the video.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
return texture;
}
```
> **Note:** Add the following function to "webgl-demo.js":
```js
function updateTexture(gl, texture, video) {
const level = 0;
const internalFormat = gl.RGBA;
const srcFormat = gl.RGBA;
const srcType = gl.UNSIGNED_BYTE;
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(
gl.TEXTURE_2D,
level,
internalFormat,
srcFormat,
srcType,
video,
);
}
```
You've seen this code before. It's nearly identical to the image onload function in the previous example — except when we call `texImage2D()`, instead of passing an `Image` object, we pass in the {{ HTMLElement("video") }} element. WebGL knows how to pull the current frame out and use it as a texture.
Next, we need to call these new functions from our `main()` function.
> **Note:** In your `main()` function, replace the call to `loadTexture()` with this code:
```js
const texture = initTexture(gl);
const video = setupVideo("Firefox.mp4");
```
> **Note:** You'll also need to download the [Firefox.mp4](https://github.com/mdn/dom-examples/blob/main/webgl-examples/tutorial/sample8/Firefox.mp4) file to the same local directory as your JavaScript files.
> **Note:** In your `main()` function, replace the `render()` function with this:
```js
// Draw the scene repeatedly
function render(now) {
now *= 0.001; // convert to seconds
deltaTime = now - then;
then = now;
if (copyVideo) {
updateTexture(gl, texture, video);
}
drawScene(gl, programInfo, buffers, texture, cubeRotation);
cubeRotation += deltaTime;
requestAnimationFrame(render);
}
```
If `copyVideo` is true, we call `updateTexture()` just before we call the `drawScene()` function.
That's all there is to it!
{{EmbedGHLiveSample('dom-examples/webgl-examples/tutorial/sample8/index.html', 670, 510) }}
[View the complete code](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample8) | [Open this demo on a new page](https://mdn.github.io/dom-examples/webgl-examples/tutorial/sample8/)
## See also
- [Using audio and video in Firefox](/en-US/docs/Learn/HTML/Multimedia_and_embedding/Video_and_audio_content)
{{Previous("Web/API/WebGL_API/Tutorial/Lighting_in_WebGL")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/tutorial | data/mdn-content/files/en-us/web/api/webgl_api/tutorial/lighting_in_webgl/index.md | ---
title: Lighting in WebGL
slug: Web/API/WebGL_API/Tutorial/Lighting_in_WebGL
page-type: guide
---
{{DefaultAPISidebar("WebGL")}} {{PreviousNext("Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL", "Web/API/WebGL_API/Tutorial/Animating_textures_in_WebGL")}}
As should be clear by now, WebGL doesn't have much built-in knowledge. It just runs two functions you supply — a vertex shader and a fragment shader — and expects you to write creative functions to get the results you want. In other words, if you want lighting you have to calculate it yourself. Fortunately, it's not all that hard to do, and this article will cover some of the basics.
## Simulating lighting and shading in 3D
Although going into detail about the theory behind simulated lighting in 3D graphics is far beyond the scope of this article, it's helpful to know a bit about how it works. Instead of discussing it in depth here, take a look at the article on [Phong shading](https://en.wikipedia.org/wiki/Phong_shading) at Wikipedia, which provides a good overview of the most commonly used lighting model, or, if you'd like a WebGL-based explanation, [see this article](https://webglfundamentals.org/webgl/lessons/webgl-3d-lighting-point.html).
There are three basic types of lighting:
**Ambient light** is the light that permeates the scene; it's non-directional and affects every face in the scene equally, regardless of which direction it's facing.
**Directional light** is light that is emitted from a specific direction. This is light that's coming from so far away that every photon is moving parallel to every other photon. Sunlight, for example, is considered directional light.
**Point light** is light that is being emitted from a point, radiating in all directions. This is how many real-world light sources usually work. A light bulb emits light in all directions, for example.
For our purposes, we're going to simplify the lighting model by only considering simple directional and ambient lighting; we won't have any [specular highlights](https://en.wikipedia.org/wiki/Specular_highlights) or point light sources in this scene. Instead, we'll have our ambient lighting plus a single directional light source, aimed at the rotating cube from the [previous demo](/en-US/docs/Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL).
Once you drop out the concept of point sources and specular lighting, there are two pieces of information we'll need in order to implement our directional lighting:
1. We need to associate a **surface normal** with each vertex. This is a vector that's perpendicular to the face at that vertex.
2. We need to know the direction in which the light is traveling; this is defined by the **direction vector**.
Then we update the vertex shader to adjust the color of each vertex, taking into account the ambient lighting as well as the effect of the directional lighting given the angle at which it's striking the face. We'll see how to do that when we look at the code for the shader.
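To make the upcoming shader code easier to follow, here is the same computation written out in plain JavaScript. This is purely illustrative; the real work happens in GLSL below, and the constants here match the ones used in the shader:
```js
function dot(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return v.map((n) => n / len);
}
const ambientLight = [0.3, 0.3, 0.3];
const directionalLightColor = [1, 1, 1];
const directionalVector = normalize([0.85, 0.8, 0.75]);
const normal = [0, 0, 1]; // for example, the front face's normal
// Clamp at zero: a face angled away from the light receives no directional light
const directional = Math.max(dot(normal, directionalVector), 0.0);
const vLighting = ambientLight.map(
  (ambient, i) => ambient + directionalLightColor[i] * directional,
);
```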
## Building the normals for the vertices
The first thing we need to do is generate the array of normals for all the vertices that comprise our cube. Since a cube is a very simple object, this is easy to do; obviously for more complex objects, calculating the normals will be more involved.
> **Note:** Add this function to your "init-buffers.js" module:
```js
function initNormalBuffer(gl) {
const normalBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, normalBuffer);
const vertexNormals = [
// Front
0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0,
// Back
0.0, 0.0, -1.0, 0.0, 0.0, -1.0, 0.0, 0.0, -1.0, 0.0, 0.0, -1.0,
// Top
0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0,
// Bottom
0.0, -1.0, 0.0, 0.0, -1.0, 0.0, 0.0, -1.0, 0.0, 0.0, -1.0, 0.0,
// Right
1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0,
// Left
-1.0, 0.0, 0.0, -1.0, 0.0, 0.0, -1.0, 0.0, 0.0, -1.0, 0.0, 0.0,
];
gl.bufferData(
gl.ARRAY_BUFFER,
new Float32Array(vertexNormals),
gl.STATIC_DRAW,
);
return normalBuffer;
}
```
This should look pretty familiar by now; we create a new buffer, bind it to be the buffer we're working with, then send along our array of vertex normals into the buffer by calling `bufferData()`.
As before, we have updated `initBuffers()` to call our new function, and to return the buffer it created.
> **Note:** At the end of your `initBuffers()` function, add the following code, replacing the existing `return` statement:
```js
const normalBuffer = initNormalBuffer(gl);
return {
position: positionBuffer,
normal: normalBuffer,
textureCoord: textureCoordBuffer,
indices: indexBuffer,
};
```
Then we add the code to the "draw-scene.js" module to bind the normals array to a shader attribute so the shader code can get access to it.
> **Note:** Add this function to your "draw-scene.js" module:
```js
// Tell WebGL how to pull out the normals from
// the normal buffer into the vertexNormal attribute.
function setNormalAttribute(gl, buffers, programInfo) {
const numComponents = 3;
const type = gl.FLOAT;
const normalize = false;
const stride = 0;
const offset = 0;
gl.bindBuffer(gl.ARRAY_BUFFER, buffers.normal);
gl.vertexAttribPointer(
programInfo.attribLocations.vertexNormal,
numComponents,
type,
normalize,
stride,
offset,
);
gl.enableVertexAttribArray(programInfo.attribLocations.vertexNormal);
}
```
> **Note:** Add this line to the `drawScene()` function of your "draw-scene.js" module, just before the `gl.useProgram()` line:
```js
setNormalAttribute(gl, buffers, programInfo);
```
Finally, we need to update the code that builds the uniform matrices to generate and deliver to the shader a **normal matrix**, which is used to transform the normals when dealing with the current orientation of the cube in relation to the light source.
> **Note:** Add the following code to the `drawScene()` function of your "draw-scene.js" module, just after the three `mat4.rotate()` calls:
```js
const normalMatrix = mat4.create();
mat4.invert(normalMatrix, modelViewMatrix);
mat4.transpose(normalMatrix, normalMatrix);
```
> **Note:** Add the following code to the `drawScene()` function of your "draw-scene.js" module, just after the two previous `gl.uniformMatrix4fv()` calls:
```js
gl.uniformMatrix4fv(
programInfo.uniformLocations.normalMatrix,
false,
normalMatrix,
);
```
## Update the shaders
Now that all the data the shaders need is available to them, we need to update the code in the shaders themselves.
### The vertex shader
The first thing to do is update the vertex shader so it generates a shading value for each vertex based on the ambient lighting as well as the directional lighting.
> **Note:** Update the `vsSource` declaration in your `main()` function like this:
```js
const vsSource = `
attribute vec4 aVertexPosition;
attribute vec3 aVertexNormal;
attribute vec2 aTextureCoord;
uniform mat4 uNormalMatrix;
uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;
varying highp vec2 vTextureCoord;
varying highp vec3 vLighting;
void main(void) {
gl_Position = uProjectionMatrix * uModelViewMatrix * aVertexPosition;
vTextureCoord = aTextureCoord;
// Apply lighting effect
highp vec3 ambientLight = vec3(0.3, 0.3, 0.3);
highp vec3 directionalLightColor = vec3(1, 1, 1);
highp vec3 directionalVector = normalize(vec3(0.85, 0.8, 0.75));
highp vec4 transformedNormal = uNormalMatrix * vec4(aVertexNormal, 1.0);
highp float directional = max(dot(transformedNormal.xyz, directionalVector), 0.0);
vLighting = ambientLight + (directionalLightColor * directional);
}
`;
```
Once the position of the vertex is computed, and we pass the coordinates of the {{Glossary("texel")}} corresponding to the vertex to the fragment shader, we can work on computing the shading for the vertex.
The first thing we do is transform the normal based on the current orientation of the cube, by multiplying the vertex's normal by the normal matrix. We can then compute the amount of directional lighting that needs to be applied to the vertex by calculating the dot product of the transformed normal and the directional vector (that is, the direction from which the light is coming). If this value is less than zero, then we pin the value to zero, since you can't have less than zero light.
Once the amount of directional lighting is computed, we can generate the lighting value by taking the ambient light and adding in the product of the directional light's color and the amount of directional lighting to provide. As a result, we now have an RGB value that will be used by the fragment shader to adjust the color of each pixel we render.
### The fragment shader
The fragment shader now needs to be updated to take into account the lighting value computed by the vertex shader.
> **Note:** Update the `fsSource` declaration in your `main()` function like this:
```js
const fsSource = `
varying highp vec2 vTextureCoord;
varying highp vec3 vLighting;
uniform sampler2D uSampler;
void main(void) {
highp vec4 texelColor = texture2D(uSampler, vTextureCoord);
gl_FragColor = vec4(texelColor.rgb * vLighting, texelColor.a);
}
`;
```
Here we fetch the color of the texel, just like we did in the previous example, but before setting the color of the fragment, we multiply the texel's color by the lighting value to adjust the texel's color to take into account the effect of our light sources.
The only thing left is to look up the location of the `aVertexNormal` attribute and the `uNormalMatrix` uniform.
> **Note:** Update the `programInfo` declaration in your `main()` function like this:
```js
const programInfo = {
program: shaderProgram,
attribLocations: {
vertexPosition: gl.getAttribLocation(shaderProgram, "aVertexPosition"),
vertexNormal: gl.getAttribLocation(shaderProgram, "aVertexNormal"),
textureCoord: gl.getAttribLocation(shaderProgram, "aTextureCoord"),
},
uniformLocations: {
projectionMatrix: gl.getUniformLocation(shaderProgram, "uProjectionMatrix"),
modelViewMatrix: gl.getUniformLocation(shaderProgram, "uModelViewMatrix"),
normalMatrix: gl.getUniformLocation(shaderProgram, "uNormalMatrix"),
uSampler: gl.getUniformLocation(shaderProgram, "uSampler"),
},
};
```
And that's it!
{{EmbedGHLiveSample('dom-examples/webgl-examples/tutorial/sample7/index.html', 670, 510) }}
[View the complete code](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample7) | [Open this demo on a new page](https://mdn.github.io/dom-examples/webgl-examples/tutorial/sample7/)
## Exercises for the reader
Obviously, this is a simple example, implementing basic per-vertex lighting. For more advanced graphics, you'll want to implement per-pixel lighting, but this will get you headed in the right direction.
You might also try experimenting with the direction of the light source, the colors of the light sources, and so forth.
{{PreviousNext("Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL", "Web/API/WebGL_API/Tutorial/Animating_textures_in_WebGL")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/tutorial | data/mdn-content/files/en-us/web/api/webgl_api/tutorial/getting_started_with_webgl/index.md | ---
title: Getting started with WebGL
slug: Web/API/WebGL_API/Tutorial/Getting_started_with_WebGL
page-type: guide
---
{{DefaultAPISidebar("WebGL")}} {{Next("Web/API/WebGL_API/Tutorial/Adding_2D_content_to_a_WebGL_context")}}
[WebGL](/en-US/docs/Web/API/WebGL_API) enables web content to use an API based on [OpenGL ES](https://www.khronos.org/opengles/) 2.0 to perform 2D and 3D rendering in an HTML [`canvas`](/en-US/docs/Web/API/Canvas_API) in browsers that support it without the use of plug-ins.
WebGL programs consist of control code written in JavaScript and shader code (GLSL) that is executed on a computer's Graphics Processing Unit (GPU). WebGL elements can be mixed with other HTML elements and composited with other parts of the page or page background.
This article will introduce you to the basics of using WebGL. It's assumed that you already have an understanding of the mathematics involved in 3D graphics; this article doesn't attempt to teach you 3D graphics concepts itself.
The code examples in this tutorial can also be found in the [webgl-examples folder on GitHub](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial).
It's worth noting here that this series of articles introduces WebGL itself; however, there are a number of frameworks available that encapsulate WebGL's capabilities, making it easier to build 3D applications and games, such as [THREE.js](https://threejs.org/) and [BABYLON.js](https://www.babylonjs.com/).
## Preparing to render in 3D
First, create two new files:
- "index.html"
- "webgl-demo.js"
The "index.html" file should contain the following:
```html
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8" />
<title>WebGL Demo</title>
<script src="webgl-demo.js" type="module"></script>
</head>
<body>
<canvas id="glcanvas" width="640" height="480"></canvas>
</body>
</html>
```
Note that this declares a canvas that our sample will draw into.
### Preparing the WebGL context
Add the following code to the "webgl-demo.js" file:
```js
main();
//
// start here
//
function main() {
const canvas = document.querySelector("#glcanvas");
// Initialize the GL context
const gl = canvas.getContext("webgl");
// Only continue if WebGL is available and working
if (gl === null) {
alert(
"Unable to initialize WebGL. Your browser or machine may not support it.",
);
return;
}
// Set clear color to black, fully opaque
gl.clearColor(0.0, 0.0, 0.0, 1.0);
// Clear the color buffer with specified clear color
gl.clear(gl.COLOR_BUFFER_BIT);
}
```
The `main()` function is called when our script is loaded. Its purpose is to set up the WebGL context and start rendering content.
The first thing we do here is obtain a reference to the canvas, assigning it to a variable named `canvas`.
Once we have the canvas, we try to get a [`WebGLRenderingContext`](/en-US/docs/Web/API/WebGLRenderingContext) for it by calling [`getContext()`](/en-US/docs/Web/API/HTMLCanvasElement/getContext) and passing it the string `"webgl"`. If the browser does not support WebGL, `getContext()` will return `null` in which case we display a message to the user and exit.
If the context is successfully initialized, the variable `gl` is our reference to it. In this case, we set the clear color to black, and clear the context to that color (redrawing the canvas with the background color).
At this point, you have enough code that the WebGL context should successfully initialize, and you should wind up with a big black, empty box, ready and waiting to receive content.
{{EmbedGHLiveSample('dom-examples/webgl-examples/tutorial/sample1/index.html', 670, 510) }}
[View the complete code](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample1) | [Open this demo on a new page](https://mdn.github.io/dom-examples/webgl-examples/tutorial/sample1/)
## See also
- [An introduction to WebGL](https://dev.opera.com/articles/introduction-to-webgl-part-1/): Written by Luz Caballero, published at dev.opera.com. This article addresses what WebGL is, explains how WebGL works (including the rendering pipeline concept), and introduces some WebGL libraries.
- [WebGL Fundamentals](https://webglfundamentals.org/)
- [An intro to modern OpenGL:](https://duriansoftware.com/joe/an-intro-to-modern-opengl.-table-of-contents) A series of nice articles about OpenGL written by Joe Groff, providing a clear introduction to OpenGL from its history to the important graphics pipeline concept, and also includes some examples to demonstrate how OpenGL works. If you have no idea what OpenGL is, this is a good place to start.
{{Next("Web/API/WebGL_API/Tutorial/Adding_2D_content_to_a_WebGL_context")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/tutorial | data/mdn-content/files/en-us/web/api/webgl_api/tutorial/using_textures_in_webgl/index.md | ---
title: Using textures in WebGL
slug: Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL
page-type: guide
---
{{DefaultAPISidebar("WebGL")}} {{PreviousNext("Web/API/WebGL_API/Tutorial/Creating_3D_objects_using_WebGL", "Web/API/WebGL_API/Tutorial/Lighting_in_WebGL")}}
Now that our sample program has a rotating 3D cube, let's map a texture onto it instead of having its faces be solid colors.
## Loading textures
The first thing to do is add code to load the textures. In our case, we'll be using a single texture, mapped onto all six sides of our rotating cube, but the same technique can be used for any number of textures.
> **Note:** It's important to note that the loading of textures follows [cross-domain rules](/en-US/docs/Web/HTTP/CORS); that is, you can only load textures from sites for which your content has CORS approval. See [Cross-domain textures below](#cross-domain_textures) for details.
> **Note:** Add these two functions to your "webgl-demo.js" script:
```js
//
// Initialize a texture and load an image.
// When the image has finished loading, copy it into the texture.
//
function loadTexture(gl, url) {
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
// Because images have to be downloaded over the internet
// they might take a moment until they are ready.
// Until then put a single pixel in the texture so we can
// use it immediately. When the image has finished downloading
// we'll update the texture with the contents of the image.
const level = 0;
const internalFormat = gl.RGBA;
const width = 1;
const height = 1;
const border = 0;
const srcFormat = gl.RGBA;
const srcType = gl.UNSIGNED_BYTE;
const pixel = new Uint8Array([0, 0, 255, 255]); // opaque blue
gl.texImage2D(
gl.TEXTURE_2D,
level,
internalFormat,
width,
height,
border,
srcFormat,
srcType,
pixel,
);
const image = new Image();
image.onload = () => {
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(
gl.TEXTURE_2D,
level,
internalFormat,
srcFormat,
srcType,
image,
);
// WebGL1 has different requirements for power of 2 images
// vs. non power of 2 images so check if the image is a
// power of 2 in both dimensions.
if (isPowerOf2(image.width) && isPowerOf2(image.height)) {
// Yes, it's a power of 2. Generate mips.
gl.generateMipmap(gl.TEXTURE_2D);
} else {
// No, it's not a power of 2. Turn off mips and set
// wrapping to clamp to edge
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
}
};
image.src = url;
return texture;
}
function isPowerOf2(value) {
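  // A power of 2 has exactly one bit set, so value & (value - 1) is 0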
return (value & (value - 1)) === 0;
}
```
The `loadTexture()` routine starts by creating a WebGL texture object `texture` by calling the WebGL {{domxref("WebGLRenderingContext.createTexture()", "createTexture()")}} function. It then uploads a single blue pixel using {{domxref("WebGLRenderingContext.texImage2D()", "texImage2D()")}}. This makes the texture immediately usable as a solid blue color even though it may take a few moments for our image to download.
To load the texture from the image file, it then creates an `Image` object and assigns its `src` to the URL of the image we wish to use as the texture. The function we assign to `image.onload` will be called once the image has finished downloading. At that point we again call {{domxref("WebGLRenderingContext.texImage2D()", "texImage2D()")}}, this time using the image as the source for the texture. After that we set up filtering and wrapping for the texture based on whether or not the downloaded image is a power of 2 in both dimensions.
WebGL1 can only use non power of 2 textures with filtering set to `NEAREST` or `LINEAR`, and it cannot generate a mipmap for them. Their wrapping mode must also be set to `CLAMP_TO_EDGE`. On the other hand, if the texture is a power of 2 in both dimensions, then WebGL can do higher quality filtering, it can use mipmaps, and it can set the wrapping mode to `REPEAT` or `MIRRORED_REPEAT`.
An example of a repeated texture is tiling an image of a few bricks to cover a brick wall.
Mipmapping and UV repeating can be disabled with {{domxref("WebGLRenderingContext.texParameter()", "texParameteri()")}}. This will allow non-power-of-two (NPOT) textures at the expense of mipmapping, UV wrapping, UV tiling, and your control over how the device will handle your texture.
```js
// gl.NEAREST is also allowed, instead of gl.LINEAR, as neither filter uses a mipmap.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
// Prevents s-coordinate wrapping (repeating).
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
// Prevents t-coordinate wrapping (repeating).
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
```
Again, with these parameters, compatible WebGL devices will automatically accept any resolution for that texture (up to their maximum dimensions). Without performing the above configuration, WebGL requires all samples of NPOT textures to fail by returning transparent black: `rgb(0 0 0 / 0%)`.
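If you're curious what "maximum dimensions" means for a particular device, you can query it from the rendering context. Here's a small sketch (the value varies by hardware; `gl` is the same rendering context used throughout this tutorial):

```js
// Query the largest texture width/height this context supports.
// WebGL1 guarantees at least 64; most devices report 4096 or more.
const maxTextureSize = gl.getParameter(gl.MAX_TEXTURE_SIZE);
console.log(`Largest supported texture: ${maxTextureSize}×${maxTextureSize}`);
```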
To load the image, add a call to our `loadTexture()` function within our `main()` function. This can be added after the `initBuffers(gl)` call.
But also note: Browsers copy pixels from the loaded image in top-to-bottom order — from the top-left corner; but WebGL wants the pixels in bottom-to-top order — starting from the bottom-left corner. (For more details, see [Why is my WebGL texture upside-down?](https://jameshfisher.com/2020/10/22/why-is-my-webgl-texture-upside-down/).)
So in order to prevent the resulting image texture from having the wrong orientation when rendered, we also need to call [`pixelStorei()`](/en-US/docs/Web/API/WebGLRenderingContext/pixelStorei) with the `gl.UNPACK_FLIP_Y_WEBGL` parameter set to `true` — to cause the pixels to be flipped into the bottom-to-top order that WebGL expects.
> **Note:** Add the following code to your `main()` function, right after the call to `initBuffers()`:
```js
// Load texture
const texture = loadTexture(gl, "cubetexture.png");
// Flip image pixels into the bottom-to-top order that WebGL expects.
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
```
> **Note:** Finally, download the [cubetexture.png](https://raw.githubusercontent.com/mdn/dom-examples/main/webgl-examples/tutorial/sample6/cubetexture.png) file to the same local directory as your JavaScript files.
## Mapping the texture onto the faces
At this point, the texture is loaded and ready to use. But before we can use it, we need to establish the mapping of the texture coordinates to the vertices of the faces of our cube. This replaces all the previously existing code for configuring colors for each of the cube's faces in `initBuffers()`.
> **Note:** Add this function to your "init-buffers.js" module:
```js
function initTextureBuffer(gl) {
const textureCoordBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, textureCoordBuffer);
const textureCoordinates = [
// Front
0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0,
// Back
0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0,
// Top
0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0,
// Bottom
0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0,
// Right
0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0,
// Left
0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0,
];
gl.bufferData(
gl.ARRAY_BUFFER,
new Float32Array(textureCoordinates),
gl.STATIC_DRAW,
);
return textureCoordBuffer;
}
```
First, this code creates a WebGL buffer into which we'll store the texture coordinates for each face, then we bind that buffer as the array we'll be writing into.
The `textureCoordinates` array defines the texture coordinates corresponding to each vertex of each face. Note that the texture coordinates range from 0.0 to 1.0; the dimensions of textures are normalized to a range of 0.0 to 1.0 regardless of their actual size, for the purpose of texture mapping.
Once we've set up the texture mapping array, we pass the array into the buffer, so that WebGL has that data ready for its use.
Then we return the new buffer.
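As an aside, texture coordinates outside the 0.0 to 1.0 range interact with the wrapping mode set in `loadTexture()`. Here's a hypothetical variant of the front face entry (not used in this tutorial) to illustrate: with a power-of-two texture and wrapping left at its default of `gl.REPEAT`, coordinates of 2.0 tile the image twice across the face.

```js
// Hypothetical front-face coordinates: with gl.REPEAT wrapping and a
// power-of-two texture, values above 1.0 repeat the image, so this
// face would show the texture tiled 2×2.
const tiledFront = [0.0, 0.0, 2.0, 0.0, 2.0, 2.0, 0.0, 2.0];
```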
Next, we need to update `initBuffers()` to create and return the texture coordinates buffer instead of the color buffer.
> **Note:** In the `initBuffers()` function of your "init-buffers.js" module, replace the call to `initColorBuffer()` with the following line:
```js
const textureCoordBuffer = initTextureBuffer(gl);
```
> **Note:** In the `initBuffers()` function of your "init-buffers.js" module, replace the `return` statement with the following:
```js
return {
position: positionBuffer,
textureCoord: textureCoordBuffer,
indices: indexBuffer,
};
```
## Updating the shaders
The shader program also needs to be updated to use the textures instead of solid colors.
### The vertex shader
We need to replace the vertex shader so that instead of fetching color data, it instead fetches the texture coordinate data.
> **Note:** Update the `vsSource` declaration in your `main()` function like this:
```js
const vsSource = `
attribute vec4 aVertexPosition;
attribute vec2 aTextureCoord;
uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;
varying highp vec2 vTextureCoord;
void main(void) {
gl_Position = uProjectionMatrix * uModelViewMatrix * aVertexPosition;
vTextureCoord = aTextureCoord;
}
`;
```
The key change here is that instead of fetching the vertex color, we're fetching the texture coordinates and passing them on to the fragment shader via the `vTextureCoord` varying; this indicates the location within the texture corresponding to the vertex.
### The fragment shader
The fragment shader likewise needs to be updated.
> **Note:** Update the `fsSource` declaration in your `main()` function like this:
```js
const fsSource = `
varying highp vec2 vTextureCoord;
uniform sampler2D uSampler;
void main(void) {
  gl_FragColor = texture2D(uSampler, vTextureCoord);
}
`;
```
Instead of assigning a fixed color value to the fragment, the fragment's color is computed by fetching the {{Glossary("texel")}} (that is, the pixel within the texture) based on the value of `vTextureCoord`, which, like the colors, is interpolated between vertices.
### Attribute and uniform locations

Because we changed an attribute and added a uniform, we need to look up their locations.
> **Note:** Update the `programInfo` declaration in your `main()` function like this:
```js
const programInfo = {
program: shaderProgram,
attribLocations: {
vertexPosition: gl.getAttribLocation(shaderProgram, "aVertexPosition"),
textureCoord: gl.getAttribLocation(shaderProgram, "aTextureCoord"),
},
uniformLocations: {
projectionMatrix: gl.getUniformLocation(shaderProgram, "uProjectionMatrix"),
modelViewMatrix: gl.getUniformLocation(shaderProgram, "uModelViewMatrix"),
uSampler: gl.getUniformLocation(shaderProgram, "uSampler"),
},
};
```
## Drawing the textured cube
The changes to the `drawScene()` function are simple.
> **Note:** In the `drawScene()` function of your "draw-scene.js" module, add the following function:
```js
// Tell WebGL how to pull out the texture coordinates from the buffer
function setTextureAttribute(gl, buffers, programInfo) {
const num = 2; // every coordinate composed of 2 values
const type = gl.FLOAT; // the data in the buffer is 32-bit float
const normalize = false; // don't normalize
const stride = 0; // how many bytes to get from one set to the next
const offset = 0; // how many bytes inside the buffer to start from
gl.bindBuffer(gl.ARRAY_BUFFER, buffers.textureCoord);
gl.vertexAttribPointer(
programInfo.attribLocations.textureCoord,
num,
type,
normalize,
stride,
offset,
);
gl.enableVertexAttribArray(programInfo.attribLocations.textureCoord);
}
```
> **Note:** In the `drawScene()` function of your "draw-scene.js" module, replace the call to `setColorAttribute()` with the following line:
```js
setTextureAttribute(gl, buffers, programInfo);
```
Then add code to specify the texture to map onto the faces.
> **Note:** In your `drawScene()` function, just after the two calls to `gl.uniformMatrix4fv()`, add the following code:
```js
// Tell WebGL we want to affect texture unit 0
gl.activeTexture(gl.TEXTURE0);
// Bind the texture to texture unit 0
gl.bindTexture(gl.TEXTURE_2D, texture);
// Tell the shader we bound the texture to texture unit 0
gl.uniform1i(programInfo.uniformLocations.uSampler, 0);
```
WebGL provides a minimum of 8 texture units; the first of these is `gl.TEXTURE0`. We tell WebGL we want to affect unit 0. We then call {{domxref("WebGLRenderingContext.bindTexture()", "bindTexture()")}}, which binds the texture to the `TEXTURE_2D` bind point of texture unit 0. Finally, we tell the shader to use texture unit 0 for the `uSampler` uniform.
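If you ever need more than the guaranteed minimum, you can ask the context how many texture units the fragment shader actually has available on the current device. A quick sketch:

```js
// Query how many texture units fragment shaders can use on this device.
// The WebGL1 minimum is 8; many GPUs report 16 or more.
const maxTextureUnits = gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS);
console.log(`Fragment shader texture units available: ${maxTextureUnits}`);
```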
Lastly, add `texture` as a parameter to the `drawScene()` function, both where it is defined and where it is called.
> **Note:** Update the declaration of your `drawScene()` function to add the new parameter:
```js-nolint
function drawScene(gl, programInfo, buffers, texture, cubeRotation) {
```
> **Note:** Update the place in your `main()` function where you call `drawScene()`:
```js
drawScene(gl, programInfo, buffers, texture, cubeRotation);
```
At this point, the rotating cube should be good to go.
{{EmbedGHLiveSample('dom-examples/webgl-examples/tutorial/sample6/index.html', 670, 510) }}
[View the complete code](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample6) | [Open this demo on a new page](https://mdn.github.io/dom-examples/webgl-examples/tutorial/sample6/)
## Cross-domain textures
Loading of WebGL textures is subject to cross-domain access controls. In order for your content to load a texture from another domain, CORS approval needs to be obtained. See [HTTP access control](/en-US/docs/Web/HTTP/CORS) for details on CORS.
Because WebGL now requires textures to be loaded from secure contexts, you can't use textures loaded from `file:///` URLs in WebGL. That means that you'll need a secure web server to test and deploy your code. For local testing, see our guide [How do you set up a local testing server?](/en-US/docs/Learn/Common_questions/Tools_and_setup/set_up_a_local_testing_server) for help.
See this [hacks.mozilla.org article](https://hacks.mozilla.org/2011/11/using-cors-to-load-webgl-textures-from-cross-domain-images/) for an explanation of how to use CORS-approved images as WebGL textures.
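In practice, requesting CORS approval for an image comes down to setting its `crossOrigin` property before assigning `src`. Here's a minimal sketch, assuming the `gl` context and `texture` object from this tutorial, and a hypothetical server that sends the appropriate `Access-Control-Allow-Origin` header (the URL is a placeholder):

```js
// Load a cross-origin image for use as a WebGL texture. The server
// must grant CORS access, or the image load will fail; without
// crossOrigin set, the upload would throw a SecurityError instead.
const image = new Image();
image.crossOrigin = "anonymous";
image.onload = () => {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
};
image.src = "https://example.com/textures/bricks.png";
```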
Tainted (write-only) 2D canvases can't be used as WebGL textures. A 2D {{ HTMLElement("canvas") }} becomes tainted, for example, when a cross-domain image is drawn on it.
{{PreviousNext("Web/API/WebGL_API/Tutorial/Creating_3D_objects_using_WebGL", "Web/API/WebGL_API/Tutorial/Lighting_in_WebGL")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/tutorial | data/mdn-content/files/en-us/web/api/webgl_api/tutorial/adding_2d_content_to_a_webgl_context/index.md | ---
title: Adding 2D content to a WebGL context
slug: Web/API/WebGL_API/Tutorial/Adding_2D_content_to_a_WebGL_context
page-type: guide
---
{{DefaultAPISidebar("WebGL")}} {{PreviousNext("Web/API/WebGL_API/Tutorial/Getting_started_with_WebGL", "Web/API/WebGL_API/Tutorial/Using_shaders_to_apply_color_in_WebGL")}}
Once you've successfully [created a WebGL context](/en-US/docs/Web/API/WebGL_API/Tutorial/Getting_started_with_WebGL), you can start rendering into it. A simple thing we can do is draw an untextured square plane, so let's start there.
The complete source code for this project is [available on GitHub](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample2).
## Including the glMatrix library
This project uses the [glMatrix](https://glmatrix.net/) library to perform its matrix operations, so you will need to include that in your project. We're loading a copy from a CDN.
> **Note:** Update your "index.html" so it looks like this:
```html
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8" />
<title>WebGL Demo</title>
<link rel="stylesheet" href="./webgl.css" type="text/css" />
<script
src="https://cdnjs.cloudflare.com/ajax/libs/gl-matrix/2.8.1/gl-matrix-min.js"
integrity="sha512-zhHQR0/H5SEBL3Wn6yYSaTTZej12z0hVZKOv3TwCUXT1z5qeqGcXJLLrbERYRScEDDpYIJhPC1fk31gqR783iQ=="
crossorigin="anonymous"
defer></script>
<script src="webgl-demo.js" type="module"></script>
</head>
<body>
<canvas id="glcanvas" width="640" height="480"></canvas>
</body>
</html>
```
## Drawing the scene
The most important thing to understand before we get started is that even though we're only rendering a square plane object in this example, we're still drawing in 3D space. It's just that we're drawing a square and putting it directly in front of the camera, perpendicular to the view direction. We need to define the shaders that will create the color for our simple scene as well as draw our object. These will establish how the square plane appears on the screen.
### The shaders
A **shader** is a program, written using the [OpenGL ES Shading Language](https://www.khronos.org/registry/OpenGL/specs/es/3.2/GLSL_ES_Specification_3.20.pdf) (**GLSL**), that takes information about the vertices that make up a shape and generates the data needed to render the pixels onto the screen: namely, the positions of the pixels and their colors.
There are two shader functions run when drawing WebGL content: the **vertex shader** and the **fragment shader**. You write these in GLSL and pass the text of the code into WebGL to be compiled for execution on the GPU. Together, a set of vertex and fragment shaders is called a **shader program**.
Let's take a quick look at the two types of shader, with the example in mind of drawing a 2D shape into the WebGL context.
#### Vertex shader
Each time a shape is rendered, the vertex shader is run for each vertex in the shape. Its job is to transform the input vertex from its original coordinate system into the **[clip space](/en-US/docs/Web/API/WebGL_API/WebGL_model_view_projection#clip_space)** coordinate system used by WebGL, in which each axis has a range from -1.0 to 1.0, regardless of aspect ratio, actual size, or any other factors.
The vertex shader must perform the needed transforms on the vertex's position, make any other adjustments or calculations it needs to make on a per-vertex basis, then return the transformed vertex by saving it in a special variable provided by GLSL, called `gl_Position`.
The vertex shader can, as needed, also do things like determine the coordinates within the face's texture of the {{Glossary("texel")}} to apply to the vertex, apply the normals to determine the lighting factor to apply to the vertex, and so on. This information can then be stored in [varyings](/en-US/docs/Web/API/WebGL_API/Data#varyings) or [attributes](/en-US/docs/Web/API/WebGL_API/Data#attributes) as appropriate to be shared with the fragment shader.
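To make the clip-space range concrete, here's a small illustrative helper (not part of the tutorial code) that converts a pixel position on the canvas into clip-space coordinates:

```js
// Convert a canvas pixel coordinate to WebGL clip space. Both results
// fall in the range -1.0 to 1.0; note that the Y axis is flipped,
// since clip space points up while canvas pixels count downward.
function pixelToClipSpace(canvas, x, y) {
  const clipX = (x / canvas.width) * 2 - 1;
  const clipY = (y / canvas.height) * -2 + 1;
  return [clipX, clipY];
}
```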
Our vertex shader below receives vertex position values from an attribute we define called `aVertexPosition`. That position is then multiplied by two 4x4 matrices we provide called `uProjectionMatrix` and `uModelViewMatrix`; `gl_Position` is set to the result. For more info on projection and other matrices, [you might find this article useful](https://webglfundamentals.org/webgl/lessons/webgl-3d-perspective.html).
> **Note:** Add this code to your `main()` function:
```js
// Vertex shader program
const vsSource = `
attribute vec4 aVertexPosition;
uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;
void main() {
gl_Position = uProjectionMatrix * uModelViewMatrix * aVertexPosition;
}
`;
```
It's worth noting that we're using a `vec4` attribute for the vertex position even though our data doesn't actually need a 4-component vector; it could be handled as a `vec2` or `vec3` depending on the situation. But when we do our math we will need it to be a `vec4`, so rather than converting it to a `vec4` every time we do math, we'll just use a `vec4` from the beginning. This eliminates operations from every calculation we do in our shader. Performance matters.
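For comparison, here's what a hypothetical alternative would look like if we declared the attribute as a `vec2` instead (`vsSourceAlt` is an illustrative name, not part of the tutorial code):

```js
const vsSourceAlt = `
  attribute vec2 aVertexPosition;

  uniform mat4 uModelViewMatrix;
  uniform mat4 uProjectionMatrix;

  void main() {
    // The per-calculation conversion we avoid by using vec4 directly:
    gl_Position = uProjectionMatrix * uModelViewMatrix *
        vec4(aVertexPosition, 0.0, 1.0);
  }
`;
```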
In this example, we're not computing any lighting at all, since we haven't yet applied any to the scene. That will come later, in the example [Lighting in WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Lighting_in_WebGL). Note also the lack of any work with textures here; that will be added in [Using textures in WebGL](/en-US/docs/Web/API/WebGL_API/Tutorial/Using_textures_in_WebGL).
#### Fragment shader
The **fragment shader** is called once for every pixel on each shape to be drawn, after the shape's vertices have been processed by the vertex shader. Its job is to determine the color of that pixel by figuring out which texel (that is, the pixel from within the shape's texture) to apply to the pixel, getting that texel's color, then applying the appropriate lighting to the color. The color is then returned to the WebGL layer by storing it in the special variable `gl_FragColor`. That color is then drawn to the screen in the correct position for the shape's corresponding pixel.
In this case, we're returning white every time, since we're just drawing a white square, with no lighting in use.
> **Note:** Add this code to your `main()` function:
```js
const fsSource = `
void main() {
gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}
`;
```
### Initializing the shaders
Now that we've defined the two shaders, we need to pass them to WebGL, compile them, and link them together. The code below creates the two shaders by calling `loadShader()`, passing the type and source for the shader. It then creates a program, attaches the shaders, and links them together. If compiling or linking fails, the code displays an alert.
> **Note:** Add these two functions to your "webgl-demo.js" script:
```js
//
// Initialize a shader program, so WebGL knows how to draw our data
//
function initShaderProgram(gl, vsSource, fsSource) {
const vertexShader = loadShader(gl, gl.VERTEX_SHADER, vsSource);
const fragmentShader = loadShader(gl, gl.FRAGMENT_SHADER, fsSource);
// Create the shader program
const shaderProgram = gl.createProgram();
gl.attachShader(shaderProgram, vertexShader);
gl.attachShader(shaderProgram, fragmentShader);
gl.linkProgram(shaderProgram);
// If creating the shader program failed, alert
if (!gl.getProgramParameter(shaderProgram, gl.LINK_STATUS)) {
alert(
`Unable to initialize the shader program: ${gl.getProgramInfoLog(
shaderProgram,
)}`,
);
return null;
}
return shaderProgram;
}
//
// creates a shader of the given type, uploads the source and
// compiles it.
//
function loadShader(gl, type, source) {
const shader = gl.createShader(type);
// Send the source to the shader object
gl.shaderSource(shader, source);
// Compile the shader program
gl.compileShader(shader);
// See if it compiled successfully
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
alert(
`An error occurred compiling the shaders: ${gl.getShaderInfoLog(shader)}`,
);
gl.deleteShader(shader);
return null;
}
return shader;
}
```
The `loadShader()` function takes as input the WebGL context, the shader type, and the source code, then creates and compiles the shader as follows:
1. A new shader is created by calling {{domxref("WebGLRenderingContext.createShader", "gl.createShader()")}}.
2. The shader's source code is sent to the shader by calling {{domxref("WebGLRenderingContext.shaderSource", "gl.shaderSource()")}}.
3. Once the shader has the source code, it's compiled using {{domxref("WebGLRenderingContext.compileShader", "gl.compileShader()")}}.
4. To check to be sure the shader successfully compiled, the shader parameter `gl.COMPILE_STATUS` is checked. To get its value, we call {{domxref("WebGLRenderingContext.getShaderParameter", "gl.getShaderParameter()")}}, specifying the shader and the name of the parameter we want to check (`gl.COMPILE_STATUS`). If that's `false`, we know the shader failed to compile, so show an alert with log information obtained from the compiler using {{domxref("WebGLRenderingContext.getShaderInfoLog", "gl.getShaderInfoLog()")}}, then delete the shader and return `null` to indicate a failure to load the shader.
5. If the shader was loaded and successfully compiled, the compiled shader is returned to the caller.
> **Note:** Add this code to your `main()` function:
```js
// Initialize a shader program; this is where all the lighting
// for the vertices and so forth is established.
const shaderProgram = initShaderProgram(gl, vsSource, fsSource);
```
After we've created a shader program, we need to look up the locations that WebGL assigned to our inputs. In this case we have one attribute and two uniforms. Attributes receive values from buffers. Each iteration of the vertex shader receives the next value from the buffer assigned to that attribute. [Uniforms](/en-US/docs/Web/API/WebGL_API/Data#uniforms) are similar to JavaScript global variables. They keep the same value for all iterations of a shader. Since the attribute and uniform locations are specific to a single shader program, we'll store them together to make them easy to pass around.
> **Note:** Add this code to your `main()` function:
```js
// Collect all the info needed to use the shader program.
// Look up which attribute our shader program is using
// for aVertexPosition and look up uniform locations.
const programInfo = {
program: shaderProgram,
attribLocations: {
vertexPosition: gl.getAttribLocation(shaderProgram, "aVertexPosition"),
},
uniformLocations: {
projectionMatrix: gl.getUniformLocation(shaderProgram, "uProjectionMatrix"),
modelViewMatrix: gl.getUniformLocation(shaderProgram, "uModelViewMatrix"),
},
};
```
## Creating the square plane
Before we can render our square plane, we need to create the buffer that contains its vertex positions and put the vertex positions in it.
We'll do that using a function we call `initBuffers()`, which we will implement in a separate [JavaScript module](/en-US/docs/Web/JavaScript/Guide/Modules). As we explore more advanced WebGL concepts, this module will be augmented to create more — and more complex — 3D objects.
> **Note:** Create a new file called "init-buffers.js", and give it the following contents:
```js
function initBuffers(gl) {
const positionBuffer = initPositionBuffer(gl);
return {
position: positionBuffer,
};
}
function initPositionBuffer(gl) {
// Create a buffer for the square's positions.
const positionBuffer = gl.createBuffer();
// Select the positionBuffer as the one to apply buffer
// operations to from here out.
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
// Now create an array of positions for the square.
const positions = [1.0, 1.0, -1.0, 1.0, 1.0, -1.0, -1.0, -1.0];
// Now pass the list of positions into WebGL to build the
// shape. We do this by creating a Float32Array from the
// JavaScript array, then use it to fill the current buffer.
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(positions), gl.STATIC_DRAW);
return positionBuffer;
}
export { initBuffers };
```
This routine is pretty simplistic given the basic nature of the scene in this example. It starts by calling the `gl` object's {{domxref("WebGLRenderingContext.createBuffer()", "createBuffer()")}} method to obtain a buffer into which we'll store the vertex positions. This is then bound to the context by calling the {{domxref("WebGLRenderingContext.bindBuffer()", "bindBuffer()")}} method.
Once that's done, we create a JavaScript array containing the position for each vertex of the square plane. This is then converted into an array of floats and passed into the `gl` object's {{domxref("WebGLRenderingContext.bufferData()", "bufferData()")}} method to establish the vertex positions for the object.
## Rendering the scene
Once the shaders are established, the locations are looked up, and the square plane's vertex positions put in a buffer, we can actually render the scene. We'll do this in a `drawScene()` function that, again, we'll implement in a separate JavaScript module.
> **Note:** Create a new file called "draw-scene.js", and give it the following contents:
```js
function drawScene(gl, programInfo, buffers) {
gl.clearColor(0.0, 0.0, 0.0, 1.0); // Clear to black, fully opaque
gl.clearDepth(1.0); // Clear everything
gl.enable(gl.DEPTH_TEST); // Enable depth testing
gl.depthFunc(gl.LEQUAL); // Near things obscure far things
// Clear the canvas before we start drawing on it.
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
// Create a perspective matrix, a special matrix that is
// used to simulate the distortion of perspective in a camera.
// Our field of view is 45 degrees, with a width/height
// ratio that matches the display size of the canvas
// and we only want to see objects between 0.1 units
// and 100 units away from the camera.
const fieldOfView = (45 * Math.PI) / 180; // in radians
const aspect = gl.canvas.clientWidth / gl.canvas.clientHeight;
const zNear = 0.1;
const zFar = 100.0;
const projectionMatrix = mat4.create();
// note: glmatrix.js always has the first argument
// as the destination to receive the result.
mat4.perspective(projectionMatrix, fieldOfView, aspect, zNear, zFar);
// Set the drawing position to the "identity" point, which is
// the center of the scene.
const modelViewMatrix = mat4.create();
// Now move the drawing position a bit to where we want to
// start drawing the square.
mat4.translate(
modelViewMatrix, // destination matrix
modelViewMatrix, // matrix to translate
[-0.0, 0.0, -6.0],
); // amount to translate
// Tell WebGL how to pull out the positions from the position
// buffer into the vertexPosition attribute.
setPositionAttribute(gl, buffers, programInfo);
// Tell WebGL to use our program when drawing
gl.useProgram(programInfo.program);
// Set the shader uniforms
gl.uniformMatrix4fv(
programInfo.uniformLocations.projectionMatrix,
false,
projectionMatrix,
);
gl.uniformMatrix4fv(
programInfo.uniformLocations.modelViewMatrix,
false,
modelViewMatrix,
);
{
const offset = 0;
const vertexCount = 4;
gl.drawArrays(gl.TRIANGLE_STRIP, offset, vertexCount);
}
}
// Tell WebGL how to pull out the positions from the position
// buffer into the vertexPosition attribute.
function setPositionAttribute(gl, buffers, programInfo) {
const numComponents = 2; // pull out 2 values per iteration
const type = gl.FLOAT; // the data in the buffer is 32bit floats
const normalize = false; // don't normalize
const stride = 0; // how many bytes to get from one set of values to the next
// 0 = use type and numComponents above
const offset = 0; // how many bytes inside the buffer to start from
gl.bindBuffer(gl.ARRAY_BUFFER, buffers.position);
gl.vertexAttribPointer(
programInfo.attribLocations.vertexPosition,
numComponents,
type,
normalize,
stride,
offset,
);
gl.enableVertexAttribArray(programInfo.attribLocations.vertexPosition);
}
export { drawScene };
```
The first step is to clear the canvas to our background color; then we establish the camera's perspective. We set a field of view of 45°, with a width to height ratio that matches the display dimensions of our canvas. We also specify that we only want objects between 0.1 and 100 units from the camera to be rendered.
Then we establish the position of the square plane by loading the identity position and translating away from the camera by 6 units. After that, we bind the square's vertex buffer to the attribute the shader is using for `aVertexPosition` and we tell WebGL how to pull the data out of it. Finally we draw the object by calling the {{domxref("WebGLRenderingContext.drawArrays()", "drawArrays()")}} method.
Finally, let's call `initBuffers()` and `drawScene()`.
> **Note:** Add this code to the start of your "webgl-demo.js" file:
```js
import { initBuffers } from "./init-buffers.js";
import { drawScene } from "./draw-scene.js";
```
> **Note:** Add this code to the end of your `main()` function:
```js
// Here's where we call the routine that builds all the
// objects we'll be drawing.
const buffers = initBuffers(gl);
// Draw the scene
drawScene(gl, programInfo, buffers);
```
The result should look like this:
{{EmbedGHLiveSample('dom-examples/webgl-examples/tutorial/sample2/index.html', 670, 510) }}
[View the complete code](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample2) | [Open this demo on a new page](https://mdn.github.io/dom-examples/webgl-examples/tutorial/sample2/)
## Matrix utility operations
Matrix operations might seem complicated but [they are actually pretty simple if you take them one step at a time](https://webglfundamentals.org/webgl/lessons/webgl-2d-matrices.html). Generally people use a matrix library rather than writing their own. In our case we're using the popular [glMatrix library](https://glmatrix.net/).
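As a taste of how glMatrix composes transforms (using the same calls that appear in `drawScene()`), here's a short sketch:

```js
// Build a model-view matrix step by step: start from the identity,
// move 6 units away from the camera, then rotate 45° around the Z axis.
// glMatrix always takes the destination matrix as the first argument.
const matrix = mat4.create();
mat4.translate(matrix, matrix, [0.0, 0.0, -6.0]);
mat4.rotate(matrix, matrix, Math.PI / 4, [0, 0, 1]);
```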
### See also
- [Matrices](https://webglfundamentals.org/webgl/lessons/webgl-2d-matrices.html) on WebGLFundamentals
- [Matrices](https://mathworld.wolfram.com/Matrix.html) on Wolfram MathWorld
- [Matrix](<https://en.wikipedia.org/wiki/Matrix_(mathematics)>) on Wikipedia
{{PreviousNext("Web/API/WebGL_API/Tutorial/Getting_started_with_WebGL", "Web/API/WebGL_API/Tutorial/Using_shaders_to_apply_color_in_WebGL")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/tutorial | data/mdn-content/files/en-us/web/api/webgl_api/tutorial/animating_objects_with_webgl/index.md | ---
title: Animating objects with WebGL
slug: Web/API/WebGL_API/Tutorial/Animating_objects_with_WebGL
page-type: guide
---
{{DefaultAPISidebar("WebGL")}} {{PreviousNext("Web/API/WebGL_API/Tutorial/Using_shaders_to_apply_color_in_WebGL", "Web/API/WebGL_API/Tutorial/Creating_3D_objects_using_WebGL") }}
## Making the square rotate
In this example, we'll actually rotate our camera. By doing so, it will look as if we are rotating the square. First we'll need some variables in which to track the current rotation of the camera.
> **Note:** Add this code at the start of your "webgl-demo.js" script:
```js
let squareRotation = 0.0;
let deltaTime = 0;
```
Now we need to update the `drawScene()` function to apply the current rotation to the camera when drawing it. After translating the camera to the initial drawing position for the square, we apply the rotation.
> **Note:** In your "draw-scene.js" module, update the declaration of your `drawScene()` function so it can be passed the rotation to use:
```js-nolint
function drawScene(gl, programInfo, buffers, squareRotation) {
```
> **Note:** In your `drawScene()` function, right after the `mat4.translate()` call, add this code:
```js
mat4.rotate(
modelViewMatrix, // destination matrix
modelViewMatrix, // matrix to rotate
squareRotation, // amount to rotate in radians
[0, 0, 1],
); // axis to rotate around
```
This rotates the `modelViewMatrix` by the current value of `squareRotation`, around the Z axis.
To actually animate, we need to add code that changes the value of `squareRotation` over time.
> **Note:** Add this code at the end of your `main()` function, replacing the existing `drawScene()` call:
```js
let then = 0;
// Draw the scene repeatedly
function render(now) {
now *= 0.001; // convert to seconds
deltaTime = now - then;
then = now;
drawScene(gl, programInfo, buffers, squareRotation);
squareRotation += deltaTime;
requestAnimationFrame(render);
}
requestAnimationFrame(render);
```
This code uses `requestAnimationFrame` to ask the browser to call the function `render` on each frame. `requestAnimationFrame` passes us the time in milliseconds since the page loaded. We convert that to seconds and then subtract the previous frame's time from it to compute `deltaTime`, which is the number of seconds since the last frame was rendered.
Finally, we update `squareRotation`.
{{EmbedGHLiveSample('dom-examples/webgl-examples/tutorial/sample4/index.html', 670, 510) }}
[View the complete code](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample4) | [Open this demo on a new page](https://mdn.github.io/dom-examples/webgl-examples/tutorial/sample4/)
{{PreviousNext("Web/API/WebGL_API/Tutorial/Using_shaders_to_apply_color_in_WebGL", "Web/API/WebGL_API/Tutorial/Creating_3D_objects_using_WebGL") }}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/tutorial | data/mdn-content/files/en-us/web/api/webgl_api/tutorial/using_shaders_to_apply_color_in_webgl/index.md | ---
title: Using shaders to apply color in WebGL
slug: Web/API/WebGL_API/Tutorial/Using_shaders_to_apply_color_in_WebGL
page-type: guide
---
{{DefaultAPISidebar("WebGL")}} {{PreviousNext("Web/API/WebGL_API/Tutorial/Adding_2D_content_to_a_WebGL_context", "Web/API/WebGL_API/Tutorial/Animating_objects_with_WebGL")}}
Having created a square plane in the [previous demonstration](/en-US/docs/Web/API/WebGL_API/Tutorial/Adding_2D_content_to_a_WebGL_context), the next obvious step is to add a splash of color to it. We can do this by revising the shaders.
## Applying color to the vertices
In WebGL, objects are built using sets of vertices, each of which has a position and a color. By default, all other pixels' colors (and all their other attributes, including position) are computed using interpolation, automatically creating smooth gradients. Previously, our vertex shader didn't apply any specific colors to the vertices. Between this and the fragment shader assigning the fixed color of white to each pixel, the entire square was rendered as solid white.
Let's say we want to render a gradient in which each corner of the square is a different color: red, blue, green, and white. The first thing to do is to establish these colors for the four vertices. To do this, we first need to create an array of vertex colors, then store it into a WebGL buffer.
> **Note:** Add the following function to your `init-buffers.js` module:
```js
function initColorBuffer(gl) {
const colors = [
1.0,
1.0,
1.0,
1.0, // white
1.0,
0.0,
0.0,
1.0, // red
0.0,
1.0,
0.0,
1.0, // green
0.0,
0.0,
1.0,
1.0, // blue
];
const colorBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(colors), gl.STATIC_DRAW);
return colorBuffer;
}
```
This code starts by creating a JavaScript array containing four 4-value vectors, one for each vertex color. Then a new WebGL buffer is allocated to store these colors, and the array is converted into floats and stored into the buffer.
Of course, we also need to call this new function from `initBuffers()`, and return the new buffer it creates.
> **Note:** At the end of your `initBuffers()` function, add the following code, replacing the existing `return` statement:
```js
const colorBuffer = initColorBuffer(gl);
return {
position: positionBuffer,
color: colorBuffer,
};
```
To use these colors, the vertex shader needs to be updated to pull the appropriate color from the color buffer.
> **Note:** Update the `vsSource` declaration in your `main()` function like this:
```js
// Vertex shader program
const vsSource = `
attribute vec4 aVertexPosition;
attribute vec4 aVertexColor;
uniform mat4 uModelViewMatrix;
uniform mat4 uProjectionMatrix;
varying lowp vec4 vColor;
void main(void) {
gl_Position = uProjectionMatrix * uModelViewMatrix * aVertexPosition;
vColor = aVertexColor;
}
`;
```
The key difference here is that for each vertex, we pass its color using a `varying` to the fragment shader.
## Coloring the fragments
In order to pick up the interpolated color for each pixel, we need to change the fragment shader to fetch the value from the `vColor` varying.
> **Note:** Update the `fsSource` declaration in your `main()` function like this:
```js
// Fragment shader program
const fsSource = `
varying lowp vec4 vColor;
void main(void) {
gl_FragColor = vColor;
}
`;
```
Each fragment receives the interpolated color based on its position relative to the vertex positions instead of a fixed value.
## Drawing using the colors
Next, you need to add code to look up the attribute location for the colors and set up that attribute for the shader program.
> **Note:** Update the `programInfo` declaration in your `main()` function like this:
```js
// Collect all the info needed to use the shader program.
// Look up which attributes our shader program is using
// for aVertexPosition, aVertexColor and also
// look up uniform locations.
const programInfo = {
program: shaderProgram,
attribLocations: {
vertexPosition: gl.getAttribLocation(shaderProgram, "aVertexPosition"),
vertexColor: gl.getAttribLocation(shaderProgram, "aVertexColor"),
},
uniformLocations: {
projectionMatrix: gl.getUniformLocation(shaderProgram, "uProjectionMatrix"),
modelViewMatrix: gl.getUniformLocation(shaderProgram, "uModelViewMatrix"),
},
};
```
Next, `drawScene()` needs to use these colors when drawing the square.
> **Note:** Add the following function to your `draw-scene.js` module:
```js
// Tell WebGL how to pull out the colors from the color buffer
// into the vertexColor attribute.
function setColorAttribute(gl, buffers, programInfo) {
const numComponents = 4;
const type = gl.FLOAT;
const normalize = false;
const stride = 0;
const offset = 0;
gl.bindBuffer(gl.ARRAY_BUFFER, buffers.color);
gl.vertexAttribPointer(
programInfo.attribLocations.vertexColor,
numComponents,
type,
normalize,
stride,
offset,
);
gl.enableVertexAttribArray(programInfo.attribLocations.vertexColor);
}
```
> **Note:** Call the `setColorAttribute()` function from `drawScene()`, right before the `gl.useProgram()` call:
```js
setColorAttribute(gl, buffers, programInfo);
```
The result should now look like this:
{{EmbedGHLiveSample('dom-examples/webgl-examples/tutorial/sample3/index.html', 670, 510) }}
[View the complete code](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample3) | [Open this demo on a new page](https://mdn.github.io/dom-examples/webgl-examples/tutorial/sample3/)
{{PreviousNext("Web/API/WebGL_API/Tutorial/Adding_2D_content_to_a_WebGL_context", "Web/API/WebGL_API/Tutorial/Animating_objects_with_WebGL")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/by_example/index.md | ---
title: WebGL by example
slug: Web/API/WebGL_API/By_example
page-type: guide
---
{{Next("Learn/WebGL/By_example/Detect_WebGL")}}
_WebGL by example_ is a series of live samples with short explanations that showcase WebGL concepts and capabilities.
The examples are sorted according to topic and level of difficulty, covering the WebGL rendering context, shader programming, textures, geometry, user interaction, and more.
## Examples by topic
The examples are sorted in order of increasing difficulty. But rather than just presenting them in a single long list, they are additionally divided into topics. Sometimes we revisit a topic several times, such as when needing to discuss it initially at a basic level, and later at intermediate and advanced levels.
Rather than trying to juggle shaders, geometry, and {{Glossary("GPU")}} memory all at once in the very first program, the examples here explore WebGL in an incremental way. We believe that this leads to a more effective learning experience and ultimately a deeper understanding of the underlying concepts.
Explanations about the examples are found in both the main text and in comments within the code. You should read all comments, because more advanced examples do not repeat comments about parts of the code that were explained earlier.
### Getting to know the rendering context
- [Detect WebGL](/en-US/docs/Web/API/WebGL_API/By_example/Detect_WebGL)
- : This example demonstrates how to detect a {{Glossary("WebGL")}} rendering context and reports the result to the user.
- [Clearing with colors](/en-US/docs/Web/API/WebGL_API/By_example/Clearing_with_colors)
- : How to clear the rendering context with a solid color.
- [Clearing by clicking](/en-US/docs/Web/API/WebGL_API/By_example/Clearing_by_clicking)
- : How to combine user interaction with graphics operations. Clearing the rendering context with a random color when the user clicks.
- [Simple color animation](/en-US/docs/Web/API/WebGL_API/By_example/Simple_color_animation)
- : A very basic color animation, done by clearing the {{Glossary("WebGL")}} drawing buffer with a different random color every second.
- [Color masking](/en-US/docs/Web/API/WebGL_API/By_example/Color_masking)
- : Modifying random colors by applying color masking and thus limiting the range of displayed colors to specific shades.
- [Basic scissoring](/en-US/docs/Web/API/WebGL_API/By_example/Basic_scissoring)
- : How to draw simple rectangles and squares with scissoring operations.
- [Canvas size and WebGL](/en-US/docs/Web/API/WebGL_API/By_example/Canvas_size_and_WebGL)
- : The example explores the effect of setting (or not setting) the canvas size to its element size in {{Glossary("CSS")}} pixels, as it appears in the browser window.
- [Boilerplate 1](/en-US/docs/Web/API/WebGL_API/By_example/Boilerplate_1)
- : The example describes repeated pieces of code that will be hidden from now on, as well as defining a JavaScript utility function to make WebGL initialization easier.
- [Scissor animation](/en-US/docs/Web/API/WebGL_API/By_example/Scissor_animation)
- : Some animation fun with scissoring and clearing operations.
- [Raining rectangles](/en-US/docs/Web/API/WebGL_API/By_example/Raining_rectangles)
- : A simple game that demonstrates clearing with solid colors, scissoring, animation, and user interaction.
### Shader programming basics
- [Hello GLSL](/en-US/docs/Web/API/WebGL_API/By_example/Hello_GLSL)
- : A very basic shader program that draws a solid color square.
- [Hello vertex attributes](/en-US/docs/Web/API/WebGL_API/By_example/Hello_vertex_attributes)
- : Combining shader programming and user interaction through vertex attributes.
- [Textures from code](/en-US/docs/Web/API/WebGL_API/By_example/Textures_from_code)
- : A simple demonstration of procedural texturing with fragment shaders.
### Miscellaneous advanced examples
- [Video textures](/en-US/docs/Web/API/WebGL_API/By_example/Video_textures)
- : This example demonstrates how to use video files as textures.
{{Next("Learn/WebGL/By_example/Detect_WebGL")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/by_example | data/mdn-content/files/en-us/web/api/webgl_api/by_example/hello_glsl/index.md | ---
title: Hello GLSL
slug: Web/API/WebGL_API/By_example/Hello_GLSL
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Raining_rectangles","Learn/WebGL/By_example/Hello_vertex_attributes")}}
This WebGL example demonstrates a very basic GLSL shader program that draws a solid color square.
> **Note:** This example will most likely work in all modern desktop browsers. But it may not work in some mobile or older browsers. If the canvas remains blank, you can check the output of the next example, which draws exactly the same thing. But remember to read through the explanations and code on this page, before moving on to the next.
## Hello World program in GLSL
{{EmbedLiveSample("Hello_World_program_in_GLSL",660,425)}}
A very simple first shader program.
```html hidden
<p>Hello World! Hello GLSL!</p>
```
```html hidden
<canvas>Your browser does not seem to support HTML canvas.</canvas>
```
```css hidden
body {
text-align: center;
}
canvas {
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
button {
display: block;
font-size: inherit;
margin: auto;
padding: 0.6em;
}
```
```html
<script type="x-shader/x-vertex" id="vertex-shader">
#version 100
void main() {
gl_Position = vec4(0.0, 0.0, 0.0, 1.0);
gl_PointSize = 64.0;
}
</script>
```
```html
<script type="x-shader/x-fragment" id="fragment-shader">
#version 100
void main() {
gl_FragColor = vec4(0.18, 0.54, 0.34, 1.0);
}
</script>
```
```js hidden
;(() => {
"use strict";
```
```js
window.addEventListener("load", setupWebGL, false);
let gl;
let program;
function setupWebGL(evt) {
window.removeEventListener(evt.type, setupWebGL, false);
if (!(gl = getRenderingContext())) return;
let source = document.querySelector("#vertex-shader").innerHTML;
const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, source);
gl.compileShader(vertexShader);
source = document.querySelector("#fragment-shader").innerHTML;
const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, source);
gl.compileShader(fragmentShader);
program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
gl.detachShader(program, vertexShader);
gl.detachShader(program, fragmentShader);
gl.deleteShader(vertexShader);
gl.deleteShader(fragmentShader);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
const linkErrLog = gl.getProgramInfoLog(program);
cleanup();
document.querySelector("p").textContent =
`Shader program did not link successfully. Error log: ${linkErrLog}`;
return;
}
initializeAttributes();
gl.useProgram(program);
gl.drawArrays(gl.POINTS, 0, 1);
cleanup();
}
let buffer;
function initializeAttributes() {
gl.enableVertexAttribArray(0);
buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.vertexAttribPointer(0, 1, gl.FLOAT, false, 0, 0);
}
function cleanup() {
gl.useProgram(null);
if (buffer) {
gl.deleteBuffer(buffer);
}
if (program) {
gl.deleteProgram(program);
}
}
```
```js hidden
function getRenderingContext() {
const canvas = document.querySelector("canvas");
canvas.width = canvas.clientWidth;
canvas.height = canvas.clientHeight;
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
const paragraph = document.querySelector("p");
paragraph.textContent =
"Failed. Your browser or device may not support WebGL.";
return null;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
return gl;
}
```
```js hidden
})();
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/hello-glsl).
{{PreviousNext("Learn/WebGL/By_example/Raining_rectangles","Learn/WebGL/By_example/Hello_vertex_attributes")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/by_example | data/mdn-content/files/en-us/web/api/webgl_api/by_example/video_textures/index.md | ---
title: Video textures
slug: Web/API/WebGL_API/By_example/Video_textures
page-type: guide
---
{{Previous("Learn/WebGL/By_example/Textures_from_code")}}
This example demonstrates how to use video files as textures for WebGL surfaces.
## Textures from video
{{EmbedGHLiveSample('dom-examples/webgl-examples/tutorial/sample8/index.html', 670, 510) }}
The source code of this example is available on [GitHub](https://github.com/mdn/dom-examples/tree/main/webgl-examples/tutorial/sample8).
{{Previous("Learn/WebGL/By_example/Textures_from_code")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/by_example | data/mdn-content/files/en-us/web/api/webgl_api/by_example/canvas_size_and_webgl/index.md | ---
title: Canvas size and WebGL
slug: Web/API/WebGL_API/By_example/Canvas_size_and_WebGL
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Basic_scissoring","Learn/WebGL/By_example/Boilerplate_1")}}
This WebGL example explores the effect of setting (or not setting) the canvas size to its element size in {{Glossary("CSS")}} pixels, as it appears in the browser window.
## Effect of canvas size on rendering with WebGL
{{EmbedLiveSample("Effect_of_canvas_size_on_rendering_with_WebGL",660,180)}}
With {{domxref("WebGLRenderingContext.scissor()","scissor()")}} and {{domxref("WebGLRenderingContext.clear()","clear()")}} we can demonstrate how the WebGL drawing buffer is affected by the size of the canvas.
The size of the first canvas is set to the styled {{domxref("Element")}} size, determined by {{Glossary("CSS")}}. This is done by assigning the {{domxref("HTMLCanvasElement.width","width")}} and {{domxref("HTMLCanvasElement.height","height")}} properties of the canvas to the values of the {{domxref("Element.clientWidth","clientWidth")}} and {{domxref("Element.clientHeight","clientHeight")}} properties, respectively.
In contrast, no such assignment is done for the second canvas. The internal {{domxref("HTMLCanvasElement.width","width")}} and {{domxref("HTMLCanvasElement.height","height")}} properties of the canvas remain at default values, which are different than the actual size of the canvas {{domxref("Element")}} in the browser window.
The effect is clearly visible when using {{domxref("WebGLRenderingContext.scissor()","scissor()")}} and {{domxref("WebGLRenderingContext.clear()","clear()")}} to draw a square in the center of the canvas, by specifying its position and size in pixels. In the first canvas, we get the desired result. In the second, the square has the wrong shape, size, and position.
```html
<p>Compare the two canvases.</p>
<canvas>Your browser does not seem to support HTML canvas.</canvas>
<canvas>Your browser does not seem to support HTML canvas.</canvas>
```
```css
body {
text-align: center;
}
canvas {
display: inline-block;
width: 120px;
height: 80px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
```
```js
window.addEventListener(
"load",
() => {
const [firstCanvas, secondCanvas] = document.getElementsByTagName("canvas");
firstCanvas.width = firstCanvas.clientWidth;
firstCanvas.height = firstCanvas.clientHeight;
[firstCanvas, secondCanvas].forEach((canvas) => {
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
document.querySelector("p").textContent =
"Failed. Your browser or device may not support WebGL.";
return;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
gl.enable(gl.SCISSOR_TEST);
gl.scissor(30, 10, 60, 60);
gl.clearColor(1.0, 1.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
});
},
false,
);
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/canvas-size-and-webgl).
{{PreviousNext("Learn/WebGL/By_example/Basic_scissoring","Learn/WebGL/By_example/Boilerplate_1")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/by_example | data/mdn-content/files/en-us/web/api/webgl_api/by_example/simple_color_animation/index.md | ---
title: Simple color animation
slug: Web/API/WebGL_API/By_example/Simple_color_animation
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Clearing_by_clicking","Learn/WebGL/By_example/Color_masking")}}
A very basic color animation created using {{Glossary("WebGL")}}, performed by clearing the drawing buffer with a different random color every second.
## Color animation with clear
{{EmbedLiveSample("Color_animation_with_clear",660,425)}}
This example provides a simple illustration of color animation with {{Glossary("WebGL")}}, as well as user interaction. The user can start, stop and restart the animation by clicking the button.
This time we put the {{Glossary("WebGL")}} function calls within a timer event handler. A click event handler additionally enables the basic user interaction of starting and stopping the animation. The timer and the timer handler function establish the animation loop, a set of drawing commands that are executed at a regular period (typically, every frame; in this case, once per second).
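As an aside, per-frame animations usually use `requestAnimationFrame` rather than a timer; later examples in this series do exactly that. A minimal sketch of the same loop in that style (with `drawAnimation` as defined below) might look like this; a once-per-second color change is simpler with `setInterval`, which is why this example uses a timer.

```js
// Alternative loop using requestAnimationFrame: the browser invokes the
// callback before each repaint (typically 60 times per second).
let rafId;
function renderLoop() {
  drawAnimation();
  rafId = requestAnimationFrame(renderLoop);
}
// Start with: rafId = requestAnimationFrame(renderLoop);
// Stop with: cancelAnimationFrame(rafId);
```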
```html
<p>A simple WebGL program that shows color animation.</p>
<p>You can click the button below to toggle the color animation on or off.</p>
<canvas id="canvas-view">
Your browser does not seem to support HTML canvas.
</canvas>
<button id="animation-onoff">
Press here to
<strong>[verb goes here]</strong>
the animation
</button>
```
```css
body {
text-align: center;
}
canvas {
display: block;
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
button {
display: inline-block;
font-size: inherit;
margin: auto;
padding: 0.6em;
}
```
```js
window.addEventListener(
"load",
function setupAnimation(evt) {
"use strict";
window.removeEventListener(evt.type, setupAnimation, false);
// A variable to hold a timer that drives the animation.
let timer;
// Click event handlers.
const button = document.querySelector("#animation-onoff");
const verb = document.querySelector("strong");
function startAnimation(evt) {
button.removeEventListener(evt.type, startAnimation, false);
button.addEventListener("click", stopAnimation, false);
verb.textContent = "stop";
// Setup animation loop by redrawing every second.
timer = setInterval(drawAnimation, 1000);
// Give immediate feedback to user after clicking, by
// drawing one animation frame.
drawAnimation();
}
function stopAnimation(evt) {
button.removeEventListener(evt.type, stopAnimation, false);
button.addEventListener("click", startAnimation, false);
verb.textContent = "start";
// Stop animation by clearing the timer.
clearInterval(timer);
}
// Call stopAnimation() once to set up the initial event
// handlers for canvas and button.
stopAnimation({ type: "click" });
let gl;
function drawAnimation() {
if (!gl) {
const canvas = document.getElementById("canvas-view");
gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
clearInterval(timer);
alert(
"Failed to get WebGL context.\n" +
"Your browser or device may not support WebGL.",
);
return;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
}
// Get a random color value using a helper function.
const color = getRandomColor();
// Set the WebGLRenderingContext clear color to the
// random color.
gl.clearColor(color[0], color[1], color[2], 1.0);
// Clear the context with the newly set color.
gl.clear(gl.COLOR_BUFFER_BIT);
}
// Random color helper function.
function getRandomColor() {
return [Math.random(), Math.random(), Math.random()];
}
},
false,
);
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/simple-color-animation).
{{PreviousNext("Learn/WebGL/By_example/Clearing_by_clicking","Learn/WebGL/By_example/Color_masking")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api/by_example | data/mdn-content/files/en-us/web/api/webgl_api/by_example/color_masking/index.md | ---
title: Color masking
slug: Web/API/WebGL_API/By_example/Color_masking
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Simple_color_animation","Learn/WebGL/By_example/Basic_scissoring")}}
This WebGL example modifies random colors by applying color masking to limit the range of displayed colors to specific shades.
## Masking random colors
{{EmbedLiveSample("Masking_random_colors",660,425)}}
This example modifies the random color animation by applying color masking with {{domxref("WebGLRenderingContext.colorMask()","colorMask()")}}. You can think of the color masking operation as if looking at the colored canvas through some tinted glass or color filter. So, by masking off the blue and green channels, you are only allowing the red component of pixels to be updated, and therefore it is as if you were looking through a red tinted glass.
Color masking allows us to demonstrate some basics of [color theory](https://en.wikipedia.org/wiki/Color_theory). By masking off some channel(s), we are in fact biasing the displayed colors towards the complementary color. So, masking off both blue and red would give us shades of green. Masking only the blue channel would give us shades of yellow (including shades of orange, brown, olive and yellow-green), the complementary of blue. Similarly, masking only green would give us shades of magenta (also purples, crimsons, and so on), and masking only red would give shades of cyan (also sea greens, blues, and so on).
Note that the calls to `colorMask()` only occur when the user clicks on one of the toggle buttons. But rendering is done every second, using the timer. The color mask state of {{Glossary("WebGL")}} is preserved, so we do not need to call `colorMask()` every frame to set up the color mask. This is an important aspect of the WebGL state machine. It allows us to set up WebGL in a single initialization phase, and then just execute drawing commands for each frame.
Color masking gives you fine control of updating pixel values on the screen. By limiting the color channels that are written by each drawing command, you can use each channel, for example, to store a different grayscale image. Alternatively, you may use the {{Glossary("RGB")}} components for color, but the alpha component for some custom pixel data of your invention.
Finally, color masking teaches us that {{Glossary("WebGL")}} is not only a state machine, it is also a _graphics pipeline_. This means that graphics operations in WebGL are done in a certain order, where the output of each operation serves as the input of the next. So, for example, the clearing operation sets the value of each pixel to the chosen clear color. Masking occurs later in the pipeline, and modifies the pixel color value, so the final result on the screen is that of the clear color, tinted by the color mask.
```html
<p>Tinting the displayed colors with color masking.</p>
<canvas>Your browser does not seem to support HTML canvas.</canvas>
<button id="red-toggle">On</button>
<button id="green-toggle">On</button>
<button id="blue-toggle">On</button>
```
```css
body {
text-align: center;
}
canvas {
display: block;
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
button {
display: inline-block;
font-family: serif;
font-size: inherit;
font-weight: 900;
color: white;
margin: auto;
padding: 0.6em 1.2em;
}
#red-toggle {
background-color: red;
}
#green-toggle {
background-color: green;
}
#blue-toggle {
background-color: blue;
}
```
```js
window.addEventListener(
"load",
function setupAnimation(evt) {
"use strict";
window.removeEventListener(evt.type, setupAnimation, false);
const canvas = document.querySelector("canvas");
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
document.querySelector("p").textContent =
"Failed to get WebGL context. Your browser or device may not support WebGL.";
return;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
const timer = setInterval(drawAnimation, 1000);
const mask = [true, true, true];
const redtoggle = document.querySelector("#red-toggle");
const greentoggle = document.querySelector("#green-toggle");
const bluetoggle = document.querySelector("#blue-toggle");
redtoggle.addEventListener("click", setColorMask, false);
greentoggle.addEventListener("click", setColorMask, false);
bluetoggle.addEventListener("click", setColorMask, false);
function setColorMask(evt) {
const index =
(evt.target === greentoggle && 1) ||
(evt.target === bluetoggle && 2) ||
0;
mask[index] = !mask[index];
evt.target.textContent = mask[index] ? "On" : "Off";
gl.colorMask(mask[0], mask[1], mask[2], true);
drawAnimation();
}
function drawAnimation() {
const color = getRandomColor();
gl.clearColor(color[0], color[1], color[2], 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
}
function getRandomColor() {
return [Math.random(), Math.random(), Math.random()];
}
},
false,
);
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/color-masking).
{{PreviousNext("Learn/WebGL/By_example/Simple_color_animation","Learn/WebGL/By_example/Basic_scissoring")}}
---
title: Basic scissoring
slug: Web/API/WebGL_API/By_example/Basic_scissoring
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Color_masking","Learn/WebGL/By_example/Canvas_size_and_WebGL")}}
In this example, we see how to draw simple rectangles and squares using WebGL scissoring operations. Scissoring establishes a clipping region outside which drawing will not occur.
## Clearing the drawing buffer when scissoring applies
{{EmbedLiveSample("Clearing_the_drawing_buffer_when_scissoring_applies",660,425)}}
This is a simple demonstration of rendering with {{domxref("WebGLRenderingContext.scissor","scissor()")}}.
Although the {{domxref("WebGLRenderingContext.clear","clear()")}} drawing command writes the clear color (set by {{domxref("WebGLRenderingContext.clearColor","clearColor()")}}) to all pixels in the drawing buffer, {{domxref("WebGLRenderingContext.scissor","scissor()")}} defines a mask that only allows pixels inside the specified rectangular area to be updated.
This is a good opportunity to talk about the difference between pixels and _fragments_. A pixel is a picture element (in practice, a point) on the screen, or a single element of the drawing buffer (the area in memory that holds your pixel data, such as {{Glossary("RGBA")}} color components). A _fragment_ refers to the pixel while it is being handled by the {{Glossary("WebGL")}} pipeline.
The reason for this distinction is that fragment color (and other fragment values, such as depth) may be manipulated and changed several times during graphics operations before finally being written to the screen. We have already seen how fragment color changes during graphics operations, by applying {{domxref("WebGLRenderingContext.colorMask()","color masking", "", 1)}}. In other cases, the fragments may be discarded altogether (so the pixel value is not updated), or it may interact with the already existing pixel value (such as when doing color blending for non-opaque elements in the scene).
Here we see another example of the distinction between fragments and pixels. Scissoring is a distinct stage in the {{Glossary("WebGL")}}/{{Glossary("OpenGL")}} graphics pipeline (it occurs after color clearing, but before color masking). Before the actual pixels are updated, fragments must go through the scissor test. If the fragments pass the scissor test, they continue down the graphics pipeline, and the corresponding pixels are updated on the screen. If they fail the test, they are immediately discarded, no further processing occurs, and pixels are not updated. Because only fragments within the specified rectangular area successfully pass the scissor test, only pixels inside that area are updated, and we get a rectangle on the screen.
The scissoring stage of the pipeline is disabled by default. We enable it here using the {{domxref("WebGLRenderingContext.enable","enable()")}} method (you will also use `enable()` to activate many other features of WebGL; hence, the use of the `SCISSOR_TEST` constant as an argument in this case). This again demonstrates the typical order of commands in {{Glossary("WebGL")}}. We first tweak the WebGL state, in this case by enabling the scissor test and establishing a rectangular mask. Only when the WebGL state has been satisfactorily tweaked do we execute the drawing command (in this case, `clear()`) that starts the processing of fragments down the graphics pipeline.
```html
<p>Result of scissoring.</p>
<canvas>Your browser does not seem to support HTML canvas.</canvas>
```
```css
body {
text-align: center;
}
canvas {
display: block;
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
```
```js
window.addEventListener(
"load",
function setupWebGL(evt) {
"use strict";
window.removeEventListener(evt.type, setupWebGL, false);
const paragraph = document.querySelector("p");
const canvas = document.querySelector("canvas");
// The following two lines set the size (in CSS pixels) of
// the drawing buffer to be identical to the size of the
// canvas HTML element, as determined by CSS.
canvas.width = canvas.clientWidth;
canvas.height = canvas.clientHeight;
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
paragraph.innerHTML =
"Failed to get WebGL context. " +
"Your browser or device may not support WebGL.";
return;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
// Enable scissoring operation and define the position and
// size of the scissoring area.
gl.enable(gl.SCISSOR_TEST);
gl.scissor(40, 20, 60, 130);
// Clear the drawing buffer solid yellow.
gl.clearColor(1.0, 1.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
},
false,
);
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/basic-scissoring).
{{PreviousNext("Learn/WebGL/By_example/Color_masking","Learn/WebGL/By_example/Canvas_size_and_WebGL")}}
---
title: Detect WebGL
slug: Web/API/WebGL_API/By_example/Detect_WebGL
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example","Learn/WebGL/By_example/Clearing_with_colors")}}
This example demonstrates how to detect a {{Glossary("WebGL")}} rendering context and reports the result to the user.
## Feature-detecting WebGL
{{EmbedLiveSample("Feature-detecting_WebGL",660,150)}}
In this first example, we are going to check whether the browser supports {{Glossary("WebGL")}}. To that end, we will try to obtain the {{domxref("WebGLRenderingContext","WebGL rendering context","",1)}} from a {{domxref("HTMLCanvasElement","canvas")}} element. The {{domxref("WebGLRenderingContext","WebGL rendering context", "", 1)}} is an interface through which you can set and query the state of the graphics machine, send data to WebGL, and execute draw commands.
Saving the state of the graphics machine within a single context interface is not unique to {{Glossary("WebGL")}}. This is also done in other graphics {{Glossary("API","APIs")}}, such as the {{domxref("CanvasRenderingContext2D","canvas 2D rendering context", "", 1)}}. However, the properties and variables you can tweak differ for each {{Glossary("API")}}.
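For comparison, both kinds of context are obtained through the same `getContext()` method of the canvas element; only the context identifier differs. A minimal sketch (the variable names here are illustrative):
```js
// Each canvas element can hand out one kind of rendering context.
const canvas2D = document.createElement("canvas");
const ctx = canvas2D.getContext("2d"); // CanvasRenderingContext2D
const canvas3D = document.createElement("canvas");
const gl = canvas3D.getContext("webgl"); // WebGLRenderingContext, or null
```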
```html
<p>[ Here would go the result of WebGL feature detection ]</p>
<button>Press here to detect WebGLRenderingContext</button>
```
```css
body {
text-align: center;
}
button {
display: block;
font-size: inherit;
margin: auto;
padding: 0.6em;
}
```
```js
// Run everything inside window load event handler, to make sure
// DOM is fully loaded and styled before trying to manipulate it.
window.addEventListener(
"load",
() => {
const paragraph = document.querySelector("p");
const button = document.querySelector("button");
// Adding click event handler to button.
button.addEventListener("click", detectWebGLContext, false);
function detectWebGLContext() {
// Create canvas element. The canvas is not added to the
// document itself, so it is never displayed in the
// browser window.
const canvas = document.createElement("canvas");
// Get WebGLRenderingContext from canvas element.
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
// Report the result.
paragraph.textContent =
gl instanceof WebGLRenderingContext
? "Congratulations! Your browser supports WebGL."
: "Failed. Your browser or device may not support WebGL.";
}
},
false,
);
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/detect-webgl).
{{PreviousNext("Learn/WebGL/By_example","Learn/WebGL/By_example/Clearing_with_colors")}}
---
title: Scissor animation
slug: Web/API/WebGL_API/By_example/Scissor_animation
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Boilerplate_1","Learn/WebGL/By_example/Raining_rectangles")}}
A simple WebGL example in which we have some animation fun using scissoring and clearing operations.
## Animation with scissoring
{{EmbedLiveSample("Animation_with_scissoring",660,425)}}
In this example, we are animating squares using {{domxref("WebGLRenderingContext.scissor()","scissor()")}} and {{domxref("WebGLRenderingContext.clear()","clear()")}}. We again establish an animation loop using timers. Note that this time it is the position of the square (the scissoring area) that is updated every frame (we set the timer to fire every 17 ms, which gives a frame rate of roughly 60 frames per second).
In contrast, the color of the square (set with {{domxref("WebGLRenderingContext.clearColor()","clearColor")}}) is only updated when a new square is created. This is a nice demonstration of {{Glossary("WebGL")}} as a state machine. For each square, we set its color once, and then update only its position every frame. The clear color state of WebGL remains at the set value, until we change it again when a new square is created.
```html hidden
<p>
WebGL animation by clearing the drawing buffer with solid color and applying
scissor test.
</p>
<button id="animation-onoff">
Press here to <strong>[verb goes here]</strong> the animation.
</button>
```
```html hidden
<canvas>Your browser does not seem to support canvases.</canvas>
```
```css hidden
body {
text-align: center;
}
canvas {
display: block;
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
button {
display: block;
font-size: inherit;
margin: auto;
padding: 0.6em;
}
```
```js hidden
;(() => {
"use strict";
```
```js
window.addEventListener("load", setupAnimation, false);
// Variables to hold the WebGL context, and the color and
// position of animated squares.
let gl;
let color = getRandomColor();
let position;
function setupAnimation(evt) {
window.removeEventListener(evt.type, setupAnimation, false);
if (!(gl = getRenderingContext())) return;
gl.enable(gl.SCISSOR_TEST);
gl.clearColor(color[0], color[1], color[2], 1.0);
// Unlike the browser window, vertical position in WebGL is
// measured from bottom to top. In here we set the initial
// position of the square to be at the top left corner of the
// drawing buffer.
position = [0, gl.drawingBufferHeight];
const button = document.querySelector("button");
let timer;
function startAnimation(evt) {
button.removeEventListener(evt.type, startAnimation, false);
button.addEventListener("click", stopAnimation, false);
document.querySelector("strong").textContent = "stop";
timer = setInterval(drawAnimation, 17);
drawAnimation();
}
function stopAnimation(evt) {
button.removeEventListener(evt.type, stopAnimation, false);
button.addEventListener("click", startAnimation, false);
document.querySelector("strong").textContent = "start";
clearInterval(timer);
}
stopAnimation({ type: "click" });
}
// Variables to hold the size and velocity of the square.
const size = [60, 60];
let velocity = 3.0;
function drawAnimation() {
gl.scissor(position[0], position[1], size[0], size[1]);
gl.clear(gl.COLOR_BUFFER_BIT);
// Every frame the vertical position of the square is
// decreased, to create the illusion of movement.
position[1] -= velocity;
// When the square hits the bottom of the drawing buffer,
// we override it with new square of different color and
// velocity.
if (position[1] < 0) {
// Horizontal position chosen randomly, and vertical
// position at the top of the drawing buffer.
position = [
Math.random() * (gl.drawingBufferWidth - size[0]),
gl.drawingBufferHeight,
];
// Random velocity between 1.0 and 7.0
velocity = 1.0 + 6.0 * Math.random();
color = getRandomColor();
gl.clearColor(color[0], color[1], color[2], 1.0);
}
}
function getRandomColor() {
return [Math.random(), Math.random(), Math.random()];
}
```
```js hidden
function getRenderingContext() {
const canvas = document.querySelector("canvas");
canvas.width = canvas.clientWidth;
canvas.height = canvas.clientHeight;
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
const paragraph = document.querySelector("p");
paragraph.textContent =
"Failed. Your browser or device may not support WebGL.";
return null;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
return gl;
}
```
```js hidden
})();
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/scissor-animation).
{{PreviousNext("Learn/WebGL/By_example/Boilerplate_1","Learn/WebGL/By_example/Raining_rectangles")}}
---
title: Textures from code
slug: Web/API/WebGL_API/By_example/Textures_from_code
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Hello_vertex_attributes","Learn/WebGL/By_example/Video_textures")}}
This WebGL example provides a simple demonstration of procedural texturing with fragment shaders. That is, using code to generate textures for use in shading WebGL objects.
## Drawing textures with code
{{EmbedLiveSample("Drawing_textures_with_code", 660, 425)}}
Texturing a point sprite with calculations done per-pixel in the fragment shader.
```html hidden
<p>Texture from code. Simple demonstration of procedural texturing.</p>
```
```html hidden
<canvas>Your browser does not seem to support canvases.</canvas>
```
```css hidden
body {
text-align: center;
}
canvas {
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
button {
display: block;
font-size: inherit;
margin: auto;
padding: 0.6em;
}
```
```html
<script type="x-shader/x-vertex" id="vertex-shader">
#version 100
precision highp float;
attribute vec2 position;
void main() {
gl_Position = vec4(position, 0.0, 1.0);
gl_PointSize = 128.0;
}
</script>
```
```html
<script type="x-shader/x-fragment" id="fragment-shader">
#version 100
precision mediump float;
void main() {
vec2 fragmentPosition = 2.0*gl_PointCoord - 1.0;
float distance = length(fragmentPosition);
float distanceSqrd = distance * distance;
gl_FragColor = vec4(
0.2/distanceSqrd,
0.1/distanceSqrd,
0.0, 1.0 );
}
</script>
```
```js hidden
;(() => {
"use strict";
```
```js
window.addEventListener("load", setupWebGL, false);
let gl;
let program;
function setupWebGL(evt) {
window.removeEventListener(evt.type, setupWebGL, false);
if (!(gl = getRenderingContext())) return;
let source = document.querySelector("#vertex-shader").innerHTML;
const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, source);
gl.compileShader(vertexShader);
source = document.querySelector("#fragment-shader").innerHTML;
const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, source);
gl.compileShader(fragmentShader);
program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
gl.detachShader(program, vertexShader);
gl.detachShader(program, fragmentShader);
gl.deleteShader(vertexShader);
gl.deleteShader(fragmentShader);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
const linkErrLog = gl.getProgramInfoLog(program);
cleanup();
document.querySelector("p").textContent =
`Shader program did not link successfully. Error log: ${linkErrLog}`;
return;
}
initializeAttributes();
gl.useProgram(program);
gl.drawArrays(gl.POINTS, 0, 1);
cleanup();
}
let buffer;
function initializeAttributes() {
gl.enableVertexAttribArray(0);
buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([0.0, 0.0]), gl.STATIC_DRAW);
gl.vertexAttribPointer(0, 2, gl.FLOAT, false, 0, 0);
}
function cleanup() {
gl.useProgram(null);
if (buffer) {
gl.deleteBuffer(buffer);
}
if (program) {
gl.deleteProgram(program);
}
}
```
```js hidden
function getRenderingContext() {
const canvas = document.querySelector("canvas");
canvas.width = canvas.clientWidth;
canvas.height = canvas.clientHeight;
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
const paragraph = document.querySelector("p");
paragraph.textContent =
"Failed. Your browser or device may not support WebGL.";
return null;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
return gl;
}
```
```js hidden
})();
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/textures-from-code).
{{PreviousNext("Learn/WebGL/By_example/Hello_vertex_attributes","Learn/WebGL/By_example/Video_textures")}}
---
title: Raining rectangles
slug: Web/API/WebGL_API/By_example/Raining_rectangles
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Scissor_animation","Learn/WebGL/By_example/Hello_GLSL")}}
A simple WebGL game that demonstrates clearing with solid colors, scissoring, animation, and user interaction.
## Animation and user interaction with scissoring
{{EmbedLiveSample("Animation_and_user_interaction_with_scissoring",660,425)}}
This is a simple game. The objective: try to catch as many of the raining rectangles as you can by clicking on them. In this example, we use an object-oriented approach for the displayed rectangles, which helps to keep the state of the rectangle (its position, color, and so on) organized in one place, and the overall code more compact and reusable.
This example combines clearing the drawing buffer with solid colors and scissoring operations. It is a preview of a full graphical application that manipulates various phases of the {{Glossary("WebGL")}} graphics pipeline and state machine.
In addition, the example demonstrates how to integrate the WebGL function calls within a game loop. The game loop is responsible for drawing the animation frames, and keeping the animation responsive to user input. Here, the game loop is implemented using timeouts.
```html hidden
<p>You caught <strong>0</strong>. You missed <strong>0</strong>.</p>
```
```html hidden
<canvas>Your browser does not seem to support canvases.</canvas>
```
```css hidden
body {
text-align: center;
}
canvas {
display: block;
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
button {
display: block;
font-size: inherit;
margin: auto;
padding: 0.6em;
}
```
```js hidden
;(() => {
"use strict";
```
```js
window.addEventListener("load", setupAnimation, false);
let gl;
let timer;
let rainingRect;
let scoreDisplay;
let missesDisplay;
function setupAnimation(evt) {
window.removeEventListener(evt.type, setupAnimation, false);
if (!(gl = getRenderingContext())) return;
gl.enable(gl.SCISSOR_TEST);
rainingRect = new Rectangle();
timer = setTimeout(drawAnimation, 17);
document
.querySelector("canvas")
.addEventListener("click", playerClick, false);
[scoreDisplay, missesDisplay] = document.querySelectorAll("strong");
}
let score = 0;
let misses = 0;
function drawAnimation() {
gl.scissor(
rainingRect.position[0],
rainingRect.position[1],
rainingRect.size[0],
rainingRect.size[1],
);
gl.clear(gl.COLOR_BUFFER_BIT);
rainingRect.position[1] -= rainingRect.velocity;
if (rainingRect.position[1] < 0) {
misses += 1;
missesDisplay.textContent = misses;
rainingRect = new Rectangle();
}
// We are using setTimeout for animation. So we reschedule
// the timeout to call drawAnimation again in 17ms.
// Otherwise we won't get any animation.
timer = setTimeout(drawAnimation, 17);
}
function playerClick(evt) {
// We need to transform the position of the click event from
// window coordinates to relative position inside the canvas.
// In addition we need to remember that vertical position in
// WebGL increases from bottom to top, unlike in the browser
// window.
const position = [
evt.pageX - evt.target.offsetLeft,
gl.drawingBufferHeight - (evt.pageY - evt.target.offsetTop),
];
// If the click falls inside the rectangle, we caught it.
// Increment score and create a new rectangle.
const diffPos = [
position[0] - rainingRect.position[0],
position[1] - rainingRect.position[1],
];
if (
diffPos[0] >= 0 &&
diffPos[0] < rainingRect.size[0] &&
diffPos[1] >= 0 &&
diffPos[1] < rainingRect.size[1]
) {
score += 1;
scoreDisplay.textContent = score;
rainingRect = new Rectangle();
}
}
function Rectangle() {
// Keeping a reference to the new Rectangle object, rather
// than using the confusing this keyword.
const rect = this;
// We get three random numbers and use them for new rectangle
// size and position. For each we use a different number,
// because we want horizontal size, vertical size and
// position to be determined independently.
const randNums = getRandomVector();
rect.size = [5 + 120 * randNums[0], 5 + 120 * randNums[1]];
rect.position = [
randNums[2] * (gl.drawingBufferWidth - rect.size[0]),
gl.drawingBufferHeight,
];
rect.velocity = 1.0 + 6.0 * Math.random();
rect.color = getRandomVector();
gl.clearColor(rect.color[0], rect.color[1], rect.color[2], 1.0);
function getRandomVector() {
return [Math.random(), Math.random(), Math.random()];
}
}
```
```js hidden
function getRenderingContext() {
const canvas = document.querySelector("canvas");
canvas.width = canvas.clientWidth;
canvas.height = canvas.clientHeight;
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
const paragraph = document.querySelector("p");
paragraph.textContent =
"Failed. Your browser or device may not support WebGL.";
return null;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
return gl;
}
```
```js hidden
})();
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/raining-rectangles).
{{PreviousNext("Learn/WebGL/By_example/Scissor_animation","Learn/WebGL/By_example/Hello_GLSL")}}
---
title: Clearing with colors
slug: Web/API/WebGL_API/By_example/Clearing_with_colors
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Detect_WebGL","Learn/WebGL/By_example/Clearing_by_clicking")}}
An example showing how to clear a WebGL rendering context to a solid color.
## Clearing the WebGL context with a solid color
{{EmbedLiveSample("Clearing_the_WebGL_context_with_a_solid_color",660,425)}}
The simplest graphical {{Glossary("WebGL")}} program. Set up the {{domxref("WebGLRenderingContext","rendering context", "", 1)}} and then just clear it solid green. Note that {{Glossary("CSS")}} sets the background color of the canvas to black, so when the canvas turns green we know that {{Glossary("WebGL")}}'s magic has worked.
In addition, you may notice that clearing the drawing buffer with a solid color is a two-stage process. First, we set the clear color to green, using the method {{domxref("WebGLRenderingContext.clearColor()","clearColor()")}}. This only changes some internal state of {{Glossary("WebGL")}}, but does not draw anything yet. Next, we actually do the drawing by calling the {{domxref("WebGLRenderingContext.clear()","clear()")}} method. This is typical of how drawing is done with WebGL. There is only a handful of methods for actual drawing (`clear()` is one of them). All other methods are for setting and querying WebGL state variables (such as the clear color).
There are many "dials" and "switches" that affect drawing with {{Glossary("WebGL")}}. The clear color is just the first of many you will get to know. This is why {{Glossary("WebGL")}}/{{Glossary("OpenGL")}} is often called a _state machine_. By tweaking those "dials" and "switches" you can modify the internal state of the WebGL machine, which in turn changes how input (in this case, a clear command) translates into output (in this case, all pixels are set to green).
Finally, we note that color in WebGL is usually in {{Glossary("RGBA")}} format, that is four numerical components for red, green, blue and alpha (opacity). Therefore, `clearColor()` takes four arguments.
```html
<p>A very simple WebGL program that shows some color.</p>
<!-- Text within a canvas element is displayed
only if canvas is not supported. -->
<canvas>Your browser does not seem to support HTML canvas.</canvas>
```
```css
body {
text-align: center;
}
canvas {
display: block;
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
```
```js
// Run everything inside window load event handler, to make sure
// DOM is fully loaded and styled before trying to manipulate it,
// and to not mess up the global scope. We are giving the event
// handler a name (setupWebGL) so that we can refer to the
// function object within the function itself.
window.addEventListener(
"load",
function setupWebGL(evt) {
"use strict";
// Cleaning after ourselves. The event handler removes
// itself, because it only needs to run once.
window.removeEventListener(evt.type, setupWebGL, false);
// References to the document elements.
const paragraph = document.querySelector("p"),
canvas = document.querySelector("canvas");
// Getting the WebGL rendering context.
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
// If failed, inform user of failure. Otherwise, initialize
// the drawing buffer (the viewport) and clear the context
// with a solid color.
if (!gl) {
paragraph.innerHTML =
"Failed to get WebGL context. " +
"Your browser or device may not support WebGL.";
return;
}
paragraph.innerHTML = "Congratulations! Your browser supports WebGL. ";
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
// Set the clear color to darkish green.
gl.clearColor(0.0, 0.5, 0.0, 1.0);
// Clear the context with the newly set color. This is
// the function call that actually does the drawing.
gl.clear(gl.COLOR_BUFFER_BIT);
},
false,
);
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/clearing-with-colors).
{{PreviousNext("Learn/WebGL/By_example/Detect_WebGL","Learn/WebGL/By_example/Clearing_by_clicking")}}
---
title: Boilerplate 1
slug: Web/API/WebGL_API/By_example/Boilerplate_1
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Canvas_size_and_WebGL","Learn/WebGL/By_example/Scissor_animation")}}
This example describes repeated pieces of code that will be hidden from now on, as well as defining a JavaScript utility function to make WebGL initialization easier.
## Boilerplate code for setting up WebGL rendering context
By now you are quite used to seeing the same pieces of {{Glossary("HTML")}}, {{Glossary("CSS")}}, and {{Glossary("JavaScript")}} repeated again and again. So we are going to hide them from now on. This will allow us to focus on the interesting pieces of code that are most relevant for learning {{Glossary("WebGL")}}.
Specifically, the HTML has a {{HTMLElement("p")}} element that contains some descriptive text about the page and may also hold error messages; a {{HTMLElement("canvas")}} element; and optionally a {{HTMLElement("button")}}. The CSS contains rules for `body`, `canvas`, and `button`. Any additional non-trivial CSS and HTML will be displayed on the pages of specific examples.
In the following examples, we will use a JavaScript helper function, `getRenderingContext()`, to initialize the {{domxref("WebGLRenderingContext","WebGL rendering context", "", 1)}}. By now, you should be able to understand what the function does. Basically, it gets the WebGL rendering context from the canvas element, initializes the drawing buffer, clears it black, and returns the initialized context. In case of error, it displays an error message and returns [`null`](/en-US/docs/Web/JavaScript/Reference/Operators/null).
Finally, all JavaScript code will run within an immediately invoked function expression (IIFE), which is a common JavaScript technique (see {{Glossary("IIFE")}}). The function declaration and invocation will also be hidden.
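As a sketch, that hidden wrapper looks like this:
```js
(() => {
  "use strict";
  // All of the example's JavaScript runs in here, keeping its
  // variables and functions out of the global scope.
})();
```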
### HTML
```html
<p>[ Some descriptive text about the example. ]</p>
<button>[ Optional button element. ]</button>
<canvas>Your browser does not seem to support HTML canvas.</canvas>
```
### CSS
```css
body {
text-align: center;
}
canvas {
display: block;
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
button {
display: block;
font-size: inherit;
margin: auto;
padding: 0.6em;
}
```
### JavaScript
```js
function getRenderingContext() {
const canvas = document.querySelector("canvas");
canvas.width = canvas.clientWidth;
canvas.height = canvas.clientHeight;
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
const paragraph = document.querySelector("p");
paragraph.innerHTML =
"Failed to get WebGL context." +
"Your browser or device may not support WebGL.";
return null;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
return gl;
}
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/boilerplate-1).
{{PreviousNext("Learn/WebGL/By_example/Canvas_size_and_WebGL","Learn/WebGL/By_example/Scissor_animation")}}
---
title: Clearing by clicking
slug: Web/API/WebGL_API/By_example/Clearing_by_clicking
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Clearing_with_colors","Learn/WebGL/By_example/Simple_color_animation")}}
This example demonstrates how to combine user interaction with WebGL graphics operations by clearing the rendering context with a random color when the user clicks.
## Clearing the rendering context with random colors
{{EmbedLiveSample("Clearing_the_rendering_context_with_random_colors",660,425)}}
This example provides a simple illustration of how to combine {{Glossary("WebGL")}} and user interaction. Every time the user clicks the canvas or the button, the canvas is cleared with a new randomly chosen color.
Note how we embed the {{Glossary("WebGL")}} function calls inside the event handler function.
```html
<p>
A very simple WebGL program that still shows some color and user interaction.
</p>
<p>
You can repeatedly click the empty canvas or the button below to change color.
</p>
<canvas id="canvas-view">
Your browser does not seem to support HTML canvas.
</canvas>
<button id="color-switcher">Press here to switch color</button>
```
```css
body {
text-align: center;
}
canvas {
display: block;
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
button {
display: inline-block;
font-size: inherit;
margin: auto;
padding: 0.6em;
}
```
```js
window.addEventListener(
"load",
function setupWebGL(evt) {
"use strict";
// Cleaning after ourselves. The event handler removes
// itself, because it only needs to run once.
window.removeEventListener(evt.type, setupWebGL, false);
// Adding the same click event handler to both canvas and
// button.
const canvas = document.querySelector("#canvas-view");
const button = document.querySelector("#color-switcher");
canvas.addEventListener("click", switchColor, false);
button.addEventListener("click", switchColor, false);
// A variable to hold the WebGLRenderingContext.
let gl;
// The click event handler.
function switchColor() {
// Referring to the externally defined gl variable.
// If undefined, try to obtain the WebGLRenderingContext.
// If failed, alert user of failure.
// Otherwise, initialize the drawing buffer (the viewport).
if (!gl) {
gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
alert(
"Failed to get WebGL context.\n" +
"Your browser or device may not support WebGL.",
);
return;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
}
// Get a random color value using a helper function.
const color = getRandomColor();
// Set the clear color to the random color.
gl.clearColor(color[0], color[1], color[2], 1.0);
// Clear the context with the newly set color. This is
// the function call that actually does the drawing.
gl.clear(gl.COLOR_BUFFER_BIT);
}
// Random color helper function.
function getRandomColor() {
return [Math.random(), Math.random(), Math.random()];
}
},
false,
);
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/clearing-by-clicking).
{{PreviousNext("Learn/WebGL/By_example/Clearing_with_colors","Learn/WebGL/By_example/Simple_color_animation")}}
---
title: Hello vertex attributes
slug: Web/API/WebGL_API/By_example/Hello_vertex_attributes
page-type: guide
---
{{PreviousNext("Learn/WebGL/By_example/Hello_GLSL","Learn/WebGL/By_example/Textures_from_code")}}
This WebGL example demonstrates how to combine shader programming and user interaction by sending user input to the shader using vertex attributes.
## Hello World program in GLSL
{{EmbedLiveSample("Hello_World_program_in_GLSL",660,425)}}
How to send input to a shader program by saving data in GPU memory.
```html hidden
<p>
First encounter with attributes and sending data to GPU. Click on the canvas
to change the horizontal position of the square.
</p>
```
```html hidden
<canvas>Your browser does not seem to support HTML canvas.</canvas>
```
```css hidden
body {
text-align: center;
}
canvas {
width: 280px;
height: 210px;
margin: auto;
padding: 0;
border: none;
background-color: black;
}
button {
display: block;
font-size: inherit;
margin: auto;
padding: 0.6em;
}
```
```html
<script type="x-shader/x-vertex" id="vertex-shader">
#version 100
precision highp float;
attribute float position;
void main() {
gl_Position = vec4(position, 0.0, 0.0, 1.0);
gl_PointSize = 64.0;
}
</script>
```
```html
<script type="x-shader/x-fragment" id="fragment-shader">
#version 100
precision mediump float;
void main() {
gl_FragColor = vec4(0.18, 0.54, 0.34, 1.0);
}
</script>
```
```js hidden
;(() => {
"use strict";
```
```js
window.addEventListener("load", setupWebGL, false);
let gl;
let program;
function setupWebGL(evt) {
window.removeEventListener(evt.type, setupWebGL, false);
if (!(gl = getRenderingContext())) return;
let source = document.querySelector("#vertex-shader").innerHTML;
const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, source);
gl.compileShader(vertexShader);
source = document.querySelector("#fragment-shader").innerHTML;
const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, source);
gl.compileShader(fragmentShader);
program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
gl.detachShader(program, vertexShader);
gl.detachShader(program, fragmentShader);
gl.deleteShader(vertexShader);
gl.deleteShader(fragmentShader);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
const linkErrLog = gl.getProgramInfoLog(program);
cleanup();
document.querySelector("p").textContent =
`Shader program did not link successfully. Error log: ${linkErrLog}`;
return;
}
initializeAttributes();
gl.useProgram(program);
gl.drawArrays(gl.POINTS, 0, 1);
document.querySelector("canvas").addEventListener(
"click",
(evt) => {
const clickXRelativeToCanvas = evt.pageX - evt.target.offsetLeft;
const clickXinWebGLCoords =
(2.0 * (clickXRelativeToCanvas - gl.drawingBufferWidth / 2)) /
gl.drawingBufferWidth;
gl.bufferData(
gl.ARRAY_BUFFER,
new Float32Array([clickXinWebGLCoords]),
gl.STATIC_DRAW,
);
gl.drawArrays(gl.POINTS, 0, 1);
},
false,
);
}
let buffer;
function initializeAttributes() {
gl.enableVertexAttribArray(0);
buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([0.0]), gl.STATIC_DRAW);
gl.vertexAttribPointer(0, 1, gl.FLOAT, false, 0, 0);
}
window.addEventListener("beforeunload", cleanup, true);
function cleanup() {
gl.useProgram(null);
if (buffer) {
gl.deleteBuffer(buffer);
}
if (program) {
gl.deleteProgram(program);
}
}
```
```js hidden
function getRenderingContext() {
const canvas = document.querySelector("canvas");
canvas.width = canvas.clientWidth;
canvas.height = canvas.clientHeight;
const gl =
canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
if (!gl) {
const paragraph = document.querySelector("p");
paragraph.textContent =
"Failed. Your browser or device may not support WebGL.";
return null;
}
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
return gl;
}
```
```js hidden
})();
```
The source code of this example is also available on [GitHub](https://github.com/idofilin/webgl-by-example/tree/master/hello-vertex-attributes).
{{PreviousNext("Learn/WebGL/By_example/Hello_GLSL","Learn/WebGL/By_example/Textures_from_code")}}
---
title: WebGL constants
slug: Web/API/WebGL_API/Constants
page-type: guide
spec-urls:
- https://www.khronos.org/registry/webgl/specs/latest/1.0/#5.14
- https://www.khronos.org/registry/webgl/specs/latest/2.0/#3.7
---
{{DefaultAPISidebar("WebGL")}}
The [WebGL API](/en-US/docs/Web/API/WebGL_API) provides several constants that are passed into or returned by functions. All constants are of type {{domxref("WebGL_API/Types", "GLenum")}}.
Standard WebGL constants are installed on the {{domxref("WebGLRenderingContext")}} and {{domxref("WebGL2RenderingContext")}} objects, so that you use them as `gl.CONSTANT_NAME`:
```js
const canvas = document.getElementById("myCanvas");
const gl = canvas.getContext("webgl");
gl.getParameter(gl.LINE_WIDTH);
```
Some constants are also provided by [WebGL extensions](/en-US/docs/Web/API/WebGL_API/Using_Extensions). A [list](#constants_defined_in_webgl_extensions) is provided below.
```js
const debugInfo = gl.getExtension("WEBGL_debug_renderer_info");
const vendor = gl.getParameter(debugInfo.UNMASKED_VENDOR_WEBGL);
```
The [WebGL tutorial](/en-US/docs/Web/API/WebGL_API/Tutorial) has more information, examples, and resources on how to get started with WebGL.
## Table of contents
- [Standard WebGL 1 constants](#standard_webgl_1_constants)
- [Standard WebGL 2 constants](#additional_constants_defined_webgl_2)
- [WebGL extension constants](#constants_defined_in_webgl_extensions)
## Standard WebGL 1 constants
These constants are defined on the {{domxref("WebGLRenderingContext")}} interface.
### Clearing buffers
Constants passed to {{domxref("WebGLRenderingContext.clear()")}} to clear buffer masks.
| Constant name | Value | Description |
| -------------------- | ---------- | ------------------------------------------------------ |
| `DEPTH_BUFFER_BIT` | 0x00000100 | Passed to `clear` to clear the current depth buffer. |
| `STENCIL_BUFFER_BIT` | 0x00000400 | Passed to `clear` to clear the current stencil buffer. |
| `COLOR_BUFFER_BIT` | 0x00004000 | Passed to `clear` to clear the current color buffer. |
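The three values are bitmasks, so several buffers can be cleared with one call by combining them with the bitwise OR operator. A minimal sketch (assuming an initialized context `gl`):
```js
// Clear the color, depth, and stencil buffers in a single call.
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT | gl.STENCIL_BUFFER_BIT);
```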
### Rendering primitives
Constants passed to {{domxref("WebGLRenderingContext.drawElements()")}} or {{domxref("WebGLRenderingContext.drawArrays()")}} to specify what kind of primitive to render.
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>POINTS</code></td>
<td>0x0000</td>
<td>
Passed to <code>drawElements</code> or <code>drawArrays</code> to draw
single points.
</td>
</tr>
<tr>
<td><code>LINES</code></td>
<td>0x0001</td>
<td>
        Passed to <code>drawElements</code> or <code>drawArrays</code> to draw
        lines. Each set of two vertices is treated as a separate line segment.
</td>
</tr>
<tr>
<td><code>LINE_LOOP</code></td>
<td>0x0002</td>
<td>
        Passed to <code>drawElements</code> or <code>drawArrays</code> to draw a
        connected group of line segments from the first vertex to the last,
        then back to the first vertex, closing the loop.
</td>
</tr>
<tr>
<td><code>LINE_STRIP</code></td>
<td>0x0003</td>
<td>
Passed to <code>drawElements</code> or <code>drawArrays</code> to draw a
connected group of line segments from the first vertex to the last.
</td>
</tr>
<tr>
<td><code>TRIANGLES</code></td>
<td>0x0004</td>
<td>
Passed to <code>drawElements</code> or <code>drawArrays</code> to draw
triangles. Each set of three vertices creates a separate triangle.
</td>
</tr>
<tr>
<td><code>TRIANGLE_STRIP</code></td>
<td>0x0005</td>
<td>
Passed to <code>drawElements</code> or <code>drawArrays</code> to draw a
connected group of triangles.
</td>
</tr>
<tr>
<td><code>TRIANGLE_FAN</code></td>
<td>0x0006</td>
<td>
Passed to <code>drawElements</code> or <code>drawArrays</code> to draw a
connected group of triangles. Each vertex connects to the previous and
the first vertex in the fan.
</td>
</tr>
</tbody>
</table>
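For example, assuming an initialized context `gl` with a bound vertex buffer containing three vertices, a single triangle could be drawn like this:
```js
// Interpret the first three vertices in the bound buffer as one triangle.
gl.drawArrays(gl.TRIANGLES, 0, 3);
```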
### Blending modes
Constants passed to {{domxref("WebGLRenderingContext.blendFunc()")}} or {{domxref("WebGLRenderingContext.blendFuncSeparate()")}} to specify the blending mode (for both, RGB and alpha, or separately).
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>ZERO</code></td>
<td>0</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
turn off a component.
</td>
</tr>
<tr>
<td><code>ONE</code></td>
<td>1</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
turn on a component.
</td>
</tr>
<tr>
<td><code>SRC_COLOR</code></td>
<td>0x0300</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
multiply a component by the source elements color.
</td>
</tr>
<tr>
<td><code>ONE_MINUS_SRC_COLOR</code></td>
<td>0x0301</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
multiply a component by one minus the source elements color.
</td>
</tr>
<tr>
<td><code>SRC_ALPHA</code></td>
<td>0x0302</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
multiply a component by the source's alpha.
</td>
</tr>
<tr>
<td><code>ONE_MINUS_SRC_ALPHA</code></td>
<td>0x0303</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
multiply a component by one minus the source's alpha.
</td>
</tr>
<tr>
<td><code>DST_ALPHA</code></td>
<td>0x0304</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
multiply a component by the destination's alpha.
</td>
</tr>
<tr>
<td><code>ONE_MINUS_DST_ALPHA</code></td>
<td>0x0305</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
multiply a component by one minus the destination's alpha.
</td>
</tr>
<tr>
<td><code>DST_COLOR</code></td>
<td>0x0306</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
multiply a component by the destination's color.
</td>
</tr>
<tr>
<td><code>ONE_MINUS_DST_COLOR</code></td>
<td>0x0307</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
multiply a component by one minus the destination's color.
</td>
</tr>
<tr>
<td><code>SRC_ALPHA_SATURATE</code></td>
<td>0x0308</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
multiply a component by the minimum of source's alpha or one minus the
destination's alpha.
</td>
</tr>
<tr>
<td><code>CONSTANT_COLOR</code></td>
<td>0x8001</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
specify a constant color blend function.
</td>
</tr>
<tr>
<td><code>ONE_MINUS_CONSTANT_COLOR</code></td>
<td>0x8002</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
specify one minus a constant color blend function.
</td>
</tr>
<tr>
<td><code>CONSTANT_ALPHA</code></td>
<td>0x8003</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
specify a constant alpha blend function.
</td>
</tr>
<tr>
<td><code>ONE_MINUS_CONSTANT_ALPHA</code></td>
<td>0x8004</td>
<td>
Passed to <code>blendFunc</code> or <code>blendFuncSeparate</code> to
specify one minus a constant alpha blend function.
</td>
</tr>
</tbody>
</table>
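A common configuration is standard alpha blending, sketched here (assuming an initialized context `gl`):
```js
gl.enable(gl.BLEND);
// Weight the source by its alpha and the destination by one minus that alpha.
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
```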
### Blending equations
Constants passed to {{domxref("WebGLRenderingContext.blendEquation()")}} or {{domxref("WebGLRenderingContext.blendEquationSeparate()")}} to control how the blending is calculated (for both, RGB and alpha, or separately).
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>FUNC_ADD</code></td>
<td>0x8006</td>
<td>
Passed to <code>blendEquation</code> or
<code>blendEquationSeparate</code> to set an addition blend function.
</td>
</tr>
<tr>
<td><code>FUNC_SUBTRACT</code></td>
<td>0x800A</td>
<td>
Passed to <code>blendEquation</code> or
<code>blendEquationSeparate</code> to specify a subtraction blend
function (source - destination).
</td>
</tr>
<tr>
<td><code>FUNC_REVERSE_SUBTRACT</code></td>
<td>0x800B</td>
<td>
Passed to <code>blendEquation</code> or
<code>blendEquationSeparate</code> to specify a reverse subtraction
blend function (destination - source).
</td>
</tr>
</tbody>
</table>
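For instance (assuming an initialized context `gl`), the blend equation can be switched like this; `FUNC_ADD` is the default:
```js
gl.blendEquation(gl.FUNC_ADD); // result = source + destination
gl.blendEquation(gl.FUNC_SUBTRACT); // result = source - destination
gl.blendEquation(gl.FUNC_REVERSE_SUBTRACT); // result = destination - source
```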
### Getting GL parameter information
Constants passed to {{domxref("WebGLRenderingContext.getParameter()")}} to specify what information to return.
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>BLEND_EQUATION</code></td>
<td>0x8009</td>
<td>
Passed to <code>getParameter</code> to get the current RGB blend
function.
</td>
</tr>
<tr>
<td><code>BLEND_EQUATION_RGB</code></td>
<td>0x8009</td>
<td>
        Passed to <code>getParameter</code> to get the current RGB blend
        function. Same as <code>BLEND_EQUATION</code>.
</td>
</tr>
<tr>
<td><code>BLEND_EQUATION_ALPHA</code></td>
<td>0x883D</td>
<td>
Passed to <code>getParameter</code> to get the current alpha blend
function.
</td>
</tr>
<tr>
<td><code>BLEND_DST_RGB</code></td>
<td>0x80C8</td>
<td>
Passed to <code>getParameter</code> to get the current destination RGB
blend function.
</td>
</tr>
<tr>
<td><code>BLEND_SRC_RGB</code></td>
<td>0x80C9</td>
<td>
        Passed to <code>getParameter</code> to get the current source RGB
        blend function.
</td>
</tr>
<tr>
<td><code>BLEND_DST_ALPHA</code></td>
<td>0x80CA</td>
<td>
Passed to <code>getParameter</code> to get the current destination alpha
blend function.
</td>
</tr>
<tr>
<td><code>BLEND_SRC_ALPHA</code></td>
<td>0x80CB</td>
<td>
Passed to <code>getParameter</code> to get the current source alpha
blend function.
</td>
</tr>
<tr>
<td><code>BLEND_COLOR</code></td>
<td>0x8005</td>
<td>
        Passed to <code>getParameter</code> to return the current blend color.
</td>
</tr>
<tr>
<td><code>ARRAY_BUFFER_BINDING</code></td>
<td>0x8894</td>
<td>
Passed to <code>getParameter</code> to get the array buffer binding.
</td>
</tr>
<tr>
<td><code>ELEMENT_ARRAY_BUFFER_BINDING</code></td>
<td>0x8895</td>
<td>
Passed to <code>getParameter</code> to get the current element array
buffer.
</td>
</tr>
<tr>
<td><code>LINE_WIDTH</code></td>
<td>0x0B21</td>
<td>
Passed to <code>getParameter</code> to get the current
<code>lineWidth</code> (set by the <code>lineWidth</code> method).
</td>
</tr>
<tr>
<td><code>ALIASED_POINT_SIZE_RANGE</code></td>
<td>0x846D</td>
<td>
        Passed to <code>getParameter</code> to get the range of available sizes
        for a point drawn with <code>gl.POINTS</code>.
</td>
</tr>
<tr>
<td><code>ALIASED_LINE_WIDTH_RANGE</code></td>
<td>0x846E</td>
<td>
        Passed to <code>getParameter</code> to get the range of available widths
        for a line. The returned value is a length-2 array with the low value at
        index 0 and the high value at index 1.
</td>
</tr>
<tr>
<td><code>CULL_FACE_MODE</code></td>
<td>0x0B45</td>
<td>
Passed to <code>getParameter</code> to get the current value of
<code>cullFace</code>. Should return <code>FRONT</code>,
<code>BACK</code>, or <code>FRONT_AND_BACK</code>
</td>
</tr>
<tr>
<td><code>FRONT_FACE</code></td>
<td>0x0B46</td>
<td>
Passed to <code>getParameter</code> to determine the current value of
<code>frontFace</code>. Should return <code>CW</code> or
<code>CCW</code>.
</td>
</tr>
<tr>
<td><code>DEPTH_RANGE</code></td>
<td>0x0B70</td>
<td>
Passed to <code>getParameter</code> to return a length-2 array of floats
giving the current depth range.
</td>
</tr>
<tr>
<td><code>DEPTH_WRITEMASK</code></td>
<td>0x0B72</td>
<td>
Passed to <code>getParameter</code> to determine if the depth write mask
is enabled.
</td>
</tr>
<tr>
<td><code>DEPTH_CLEAR_VALUE</code></td>
<td>0x0B73</td>
<td>
Passed to <code>getParameter</code> to determine the current depth clear
value.
</td>
</tr>
<tr>
<td><code>DEPTH_FUNC</code></td>
<td>0x0B74</td>
<td>
Passed to <code>getParameter</code> to get the current depth function.
Returns <code>NEVER</code>, <code>ALWAYS</code>, <code>LESS</code>,
<code>EQUAL</code>, <code>LEQUAL</code>, <code>GREATER</code>,
<code>GEQUAL</code>, or <code>NOTEQUAL</code>.
</td>
</tr>
<tr>
<td><code>STENCIL_CLEAR_VALUE</code></td>
<td>0x0B91</td>
<td>
Passed to <code>getParameter</code> to get the value the stencil will be
cleared to.
</td>
</tr>
<tr>
<td><code>STENCIL_FUNC</code></td>
<td>0x0B92</td>
<td>
Passed to <code>getParameter</code> to get the current stencil function.
Returns <code>NEVER</code>, <code>ALWAYS</code>, <code>LESS</code>,
<code>EQUAL</code>, <code>LEQUAL</code>, <code>GREATER</code>,
<code>GEQUAL</code>, or <code>NOTEQUAL</code>.
</td>
</tr>
<tr>
<td><code>STENCIL_FAIL</code></td>
<td>0x0B94</td>
<td>
Passed to <code>getParameter</code> to get the current stencil fail
function. Should return <code>KEEP</code>, <code>REPLACE</code>,
<code>INCR</code>, <code>DECR</code>, <code>INVERT</code>,
<code>INCR_WRAP</code>, or <code>DECR_WRAP</code>.
</td>
</tr>
<tr>
<td><code>STENCIL_PASS_DEPTH_FAIL</code></td>
<td>0x0B95</td>
<td>
Passed to <code>getParameter</code> to get the current stencil fail
function should the depth buffer test fail. Should return
<code>KEEP</code>, <code>REPLACE</code>, <code>INCR</code>,
<code>DECR</code>, <code>INVERT</code>, <code>INCR_WRAP</code>, or
<code>DECR_WRAP</code>.
</td>
</tr>
<tr>
<td><code>STENCIL_PASS_DEPTH_PASS</code></td>
<td>0x0B96</td>
<td>
Passed to <code>getParameter</code> to get the current stencil fail
function should the depth buffer test pass. Should return KEEP, REPLACE,
INCR, DECR, INVERT, INCR_WRAP, or DECR_WRAP.
</td>
</tr>
<tr>
<td><code>STENCIL_REF</code></td>
<td>0x0B97</td>
<td>
Passed to <code>getParameter</code> to get the reference value used for
stencil tests.
</td>
</tr>
<tr>
<td><code>STENCIL_VALUE_MASK</code></td>
<td>0x0B93</td>
<td></td>
</tr>
<tr>
<td><code>STENCIL_WRITEMASK</code></td>
<td>0x0B98</td>
<td></td>
</tr>
<tr>
<td><code>STENCIL_BACK_FUNC</code></td>
<td>0x8800</td>
<td></td>
</tr>
<tr>
<td><code>STENCIL_BACK_FAIL</code></td>
<td>0x8801</td>
<td></td>
</tr>
<tr>
<td><code>STENCIL_BACK_PASS_DEPTH_FAIL</code></td>
<td>0x8802</td>
<td></td>
</tr>
<tr>
<td><code>STENCIL_BACK_PASS_DEPTH_PASS</code></td>
<td>0x8803</td>
<td></td>
</tr>
<tr>
<td><code>STENCIL_BACK_REF</code></td>
<td>0x8CA3</td>
<td></td>
</tr>
<tr>
<td><code>STENCIL_BACK_VALUE_MASK</code></td>
<td>0x8CA4</td>
<td></td>
</tr>
<tr>
<td><code>STENCIL_BACK_WRITEMASK</code></td>
<td>0x8CA5</td>
<td></td>
</tr>
<tr>
<td><code>VIEWPORT</code></td>
<td>0x0BA2</td>
<td>
Returns an {{jsxref("Int32Array")}} with four elements for the
current viewport dimensions.
</td>
</tr>
<tr>
<td><code>SCISSOR_BOX</code></td>
<td>0x0C10</td>
<td>
Returns an {{jsxref("Int32Array")}} with four elements for the
current scissor box dimensions.
</td>
</tr>
<tr>
<td><code>COLOR_CLEAR_VALUE</code></td>
<td>0x0C22</td>
<td></td>
</tr>
<tr>
<td><code>COLOR_WRITEMASK</code></td>
<td>0x0C23</td>
<td></td>
</tr>
<tr>
<td><code>UNPACK_ALIGNMENT</code></td>
<td>0x0CF5</td>
<td></td>
</tr>
<tr>
<td><code>PACK_ALIGNMENT</code></td>
<td>0x0D05</td>
<td></td>
</tr>
<tr>
<td><code>MAX_TEXTURE_SIZE</code></td>
<td>0x0D33</td>
<td></td>
</tr>
<tr>
<td><code>MAX_VIEWPORT_DIMS</code></td>
<td>0x0D3A</td>
<td></td>
</tr>
<tr>
<td><code>SUBPIXEL_BITS</code></td>
<td>0x0D50</td>
<td></td>
</tr>
<tr>
<td><code>RED_BITS</code></td>
<td>0x0D52</td>
<td></td>
</tr>
<tr>
<td><code>GREEN_BITS</code></td>
<td>0x0D53</td>
<td></td>
</tr>
<tr>
<td><code>BLUE_BITS</code></td>
<td>0x0D54</td>
<td></td>
</tr>
<tr>
<td><code>ALPHA_BITS</code></td>
<td>0x0D55</td>
<td></td>
</tr>
<tr>
<td><code>DEPTH_BITS</code></td>
<td>0x0D56</td>
<td></td>
</tr>
<tr>
<td><code>STENCIL_BITS</code></td>
<td>0x0D57</td>
<td></td>
</tr>
<tr>
<td><code>POLYGON_OFFSET_UNITS</code></td>
<td>0x2A00</td>
<td></td>
</tr>
<tr>
<td><code>POLYGON_OFFSET_FACTOR</code></td>
<td>0x8038</td>
<td></td>
</tr>
<tr>
<td><code>TEXTURE_BINDING_2D</code></td>
<td>0x8069</td>
<td></td>
</tr>
<tr>
<td><code>SAMPLE_BUFFERS</code></td>
<td>0x80A8</td>
<td></td>
</tr>
<tr>
<td><code>SAMPLES</code></td>
<td>0x80A9</td>
<td></td>
</tr>
<tr>
<td><code>SAMPLE_COVERAGE_VALUE</code></td>
<td>0x80AA</td>
<td></td>
</tr>
<tr>
<td><code>SAMPLE_COVERAGE_INVERT</code></td>
<td>0x80AB</td>
<td></td>
</tr>
<tr>
<td><code>COMPRESSED_TEXTURE_FORMATS</code></td>
<td>0x86A3</td>
<td></td>
</tr>
<tr>
<td><code>VENDOR</code></td>
<td>0x1F00</td>
<td></td>
</tr>
<tr>
<td><code>RENDERER</code></td>
<td>0x1F01</td>
<td></td>
</tr>
<tr>
<td><code>VERSION</code></td>
<td>0x1F02</td>
<td></td>
</tr>
<tr>
<td><code>IMPLEMENTATION_COLOR_READ_TYPE</code></td>
<td>0x8B9A</td>
<td></td>
</tr>
<tr>
<td><code>IMPLEMENTATION_COLOR_READ_FORMAT</code></td>
<td>0x8B9B</td>
<td></td>
</tr>
<tr>
<td><code>BROWSER_DEFAULT_WEBGL</code></td>
<td>0x9244</td>
<td></td>
</tr>
</tbody>
</table>
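A few of these constants in use (assuming an initialized context `gl`):
```js
const viewport = gl.getParameter(gl.VIEWPORT); // Int32Array: [x, y, width, height]
const clearColor = gl.getParameter(gl.COLOR_CLEAR_VALUE); // Float32Array: [r, g, b, a]
const vendor = gl.getParameter(gl.VENDOR); // a browser-dependent string
```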
### Buffers
Constants passed to {{domxref("WebGLRenderingContext.bufferData()")}}, {{domxref("WebGLRenderingContext.bufferSubData()")}}, {{domxref("WebGLRenderingContext.bindBuffer()")}}, or {{domxref("WebGLRenderingContext.getBufferParameter()")}}.
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>STATIC_DRAW</code></td>
<td>0x88E4</td>
<td>
Passed to <code>bufferData</code> as a hint about whether the contents
of the buffer are likely to be used often and not change often.
</td>
</tr>
<tr>
<td><code>STREAM_DRAW</code></td>
<td>0x88E0</td>
<td>
Passed to <code>bufferData</code> as a hint about whether the contents
of the buffer are likely to not be used often.
</td>
</tr>
<tr>
<td><code>DYNAMIC_DRAW</code></td>
<td>0x88E8</td>
<td>
Passed to <code>bufferData</code> as a hint about whether the contents
of the buffer are likely to be used often and change often.
</td>
</tr>
<tr>
<td><code>ARRAY_BUFFER</code></td>
<td>0x8892</td>
<td>
Passed to <code>bindBuffer</code> or <code>bufferData</code> to specify
the type of buffer being used.
</td>
</tr>
<tr>
<td><code>ELEMENT_ARRAY_BUFFER</code></td>
<td>0x8893</td>
<td>
Passed to <code>bindBuffer</code> or <code>bufferData</code> to specify
the type of buffer being used.
</td>
</tr>
<tr>
<td><code>BUFFER_SIZE</code></td>
<td>0x8764</td>
<td>Passed to <code>getBufferParameter</code> to get a buffer's size.</td>
</tr>
<tr>
<td><code>BUFFER_USAGE</code></td>
<td>0x8765</td>
<td>
Passed to <code>getBufferParameter</code> to get the hint for the buffer
passed in when it was created.
</td>
</tr>
</tbody>
</table>
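As a quick illustration of how these constants fit together (a sketch assuming an existing context `gl` and a `Float32Array` named `data`):

```js
const buf = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buf); // target constant
gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW); // usage hint constant
// BUFFER_USAGE reads the hint back; logs 35044 (0x88E4, i.e. STATIC_DRAW):
console.log(gl.getBufferParameter(gl.ARRAY_BUFFER, gl.BUFFER_USAGE));
```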
### Vertex attributes
Constants passed to {{domxref("WebGLRenderingContext.getVertexAttrib()")}}.
| Constant name | Value | Description |
| ------------------------------------ | ------ | ---------------------------------------------------------------------- |
| `CURRENT_VERTEX_ATTRIB` | 0x8626 | Passed to `getVertexAttrib` to read back the current vertex attribute. |
| `VERTEX_ATTRIB_ARRAY_ENABLED` | 0x8622 | |
| `VERTEX_ATTRIB_ARRAY_SIZE` | 0x8623 | |
| `VERTEX_ATTRIB_ARRAY_STRIDE` | 0x8624 | |
| `VERTEX_ATTRIB_ARRAY_TYPE` | 0x8625 | |
| `VERTEX_ATTRIB_ARRAY_NORMALIZED` | 0x886A | |
| `VERTEX_ATTRIB_ARRAY_POINTER` | 0x8645 | |
| `VERTEX_ATTRIB_ARRAY_BUFFER_BINDING` | 0x889F | |
### Culling
Constants passed to {{domxref("WebGLRenderingContext.cullFace()")}}.
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>CULL_FACE</code></td>
<td>0x0B44</td>
<td>
        Passed to <code>enable</code>/<code>disable</code> to turn on/off
        culling. Can also be used with <code>getParameter</code> to query
        whether culling is enabled.
</td>
</tr>
<tr>
<td><code>FRONT</code></td>
<td>0x0404</td>
<td>
Passed to <code>cullFace</code> to specify that only front faces should
be culled.
</td>
</tr>
<tr>
<td><code>BACK</code></td>
<td>0x0405</td>
<td>
Passed to <code>cullFace</code> to specify that only back faces should
be culled.
</td>
</tr>
<tr>
<td><code>FRONT_AND_BACK</code></td>
<td>0x0408</td>
<td>
Passed to <code>cullFace</code> to specify that front and back faces
should be culled.
</td>
</tr>
</tbody>
</table>
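For example, a typical setup using these constants (assuming a context `gl`):

```js
gl.enable(gl.CULL_FACE); // turn face culling on
gl.cullFace(gl.BACK); // cull back faces (the default)
```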
### Enabling and disabling
Constants passed to {{domxref("WebGLRenderingContext.enable()")}} or {{domxref("WebGLRenderingContext.disable()")}}.
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>BLEND</code></td>
<td>0x0BE2</td>
<td>
        Passed to <code>enable</code>/<code>disable</code> to turn on/off
        blending. Can also be used with <code>getParameter</code> to query
        whether blending is enabled.
</td>
</tr>
<tr>
<td><code>DEPTH_TEST</code></td>
<td>0x0B71</td>
<td>
Passed to <code>enable</code>/<code>disable</code> to turn on/off the
depth test. Can also be used with <code>getParameter</code> to query the
depth test.
</td>
</tr>
<tr>
<td><code>DITHER</code></td>
<td>0x0BD0</td>
<td>
        Passed to <code>enable</code>/<code>disable</code> to turn on/off
        dithering. Can also be used with <code>getParameter</code> to query
        whether dithering is enabled.
</td>
</tr>
<tr>
<td><code>POLYGON_OFFSET_FILL</code></td>
<td>0x8037</td>
<td>
        Passed to <code>enable</code>/<code>disable</code> to turn on/off the
        polygon offset. Useful for rendering hidden-line images, decals, and
        solids with highlighted edges. Can also be used with
        <code>getParameter</code> to query whether polygon offset fill is
        enabled.
</td>
</tr>
<tr>
<td><code>SAMPLE_ALPHA_TO_COVERAGE</code></td>
<td>0x809E</td>
<td>
Passed to <code>enable</code>/<code>disable</code> to turn on/off the
alpha to coverage. Used in multi-sampling alpha channels.
</td>
</tr>
<tr>
<td><code>SAMPLE_COVERAGE</code></td>
<td>0x80A0</td>
<td>
Passed to <code>enable</code>/<code>disable</code> to turn on/off the
sample coverage. Used in multi-sampling.
</td>
</tr>
<tr>
<td><code>SCISSOR_TEST</code></td>
<td>0x0C11</td>
<td>
Passed to <code>enable</code>/<code>disable</code> to turn on/off the
scissor test. Can also be used with <code>getParameter</code> to query
the scissor test.
</td>
</tr>
<tr>
<td><code>STENCIL_TEST</code></td>
<td>0x0B90</td>
<td>
Passed to <code>enable</code>/<code>disable</code> to turn on/off the
stencil test. Can also be used with <code>getParameter</code> to query
the stencil test.
</td>
</tr>
</tbody>
</table>
### Errors
Constants returned from {{domxref("WebGLRenderingContext.getError()")}}.
| Constant name | Value | Description |
| -------------------- | ------ | ------------------------- |
| `NO_ERROR` | 0 | Returned from `getError`. |
| `INVALID_ENUM` | 0x0500 | Returned from `getError`. |
| `INVALID_VALUE` | 0x0501 | Returned from `getError`. |
| `INVALID_OPERATION` | 0x0502 | Returned from `getError`. |
| `OUT_OF_MEMORY` | 0x0505 | Returned from `getError`. |
| `CONTEXT_LOST_WEBGL` | 0x9242 | Returned from `getError`. |
### Front face directions
Constants passed to {{domxref("WebGLRenderingContext.frontFace()")}}.
| Constant name | Value | Description |
| ------------- | ------ | -------------------------------------------------------------------------------------------------------- |
| `CW`          | 0x0900 | Passed to `frontFace` to specify that the front face of a polygon is drawn in the clockwise direction.         |
| `CCW`         | 0x0901 | Passed to `frontFace` to specify that the front face of a polygon is drawn in the counter-clockwise direction. |
### Hints
Constants passed to {{domxref("WebGLRenderingContext.hint()")}}
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>DONT_CARE</code></td>
<td>0x1100</td>
<td>There is no preference for this behavior.</td>
</tr>
<tr>
<td><code>FASTEST</code></td>
<td>0x1101</td>
<td>The most efficient behavior should be used.</td>
</tr>
<tr>
<td><code>NICEST</code></td>
<td>0x1102</td>
<td>The most correct or the highest quality option should be used.</td>
</tr>
<tr>
<td><code>GENERATE_MIPMAP_HINT</code></td>
<td>0x8192</td>
<td>
Hint for the quality of filtering when generating mipmap images with
{{domxref("WebGLRenderingContext.generateMipmap()")}}.
</td>
</tr>
</tbody>
</table>
### Data types
| Constant name | Value | Description |
| ---------------- | ------ | ----------- |
| `BYTE` | 0x1400 | |
| `UNSIGNED_BYTE` | 0x1401 | |
| `SHORT` | 0x1402 | |
| `UNSIGNED_SHORT` | 0x1403 | |
| `INT` | 0x1404 | |
| `UNSIGNED_INT` | 0x1405 | |
| `FLOAT` | 0x1406 | |
### Pixel formats
| Constant name | Value | Description |
| ----------------- | ------ | ----------- |
| `DEPTH_COMPONENT` | 0x1902 | |
| `ALPHA` | 0x1906 | |
| `RGB` | 0x1907 | |
| `RGBA` | 0x1908 | |
| `LUMINANCE` | 0x1909 | |
| `LUMINANCE_ALPHA` | 0x190A | |
### Pixel types
| Constant name | Value | Description |
| ------------------------ | ------ | ----------- |
| `UNSIGNED_BYTE` | 0x1401 | |
| `UNSIGNED_SHORT_4_4_4_4` | 0x8033 | |
| `UNSIGNED_SHORT_5_5_5_1` | 0x8034 | |
| `UNSIGNED_SHORT_5_6_5` | 0x8363 | |
### Shaders
Constants passed to {{domxref("WebGLRenderingContext.createShader()")}} or {{domxref("WebGLRenderingContext.getShaderParameter()")}}
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>FRAGMENT_SHADER</code></td>
<td>0x8B30</td>
<td>Passed to <code>createShader</code> to define a fragment shader.</td>
</tr>
<tr>
<td><code>VERTEX_SHADER</code></td>
<td>0x8B31</td>
      <td>Passed to <code>createShader</code> to define a vertex shader.</td>
</tr>
<tr>
<td><code>COMPILE_STATUS</code></td>
<td>0x8B81</td>
<td>
Passed to <code>getShaderParameter</code> to get the status of the
compilation. Returns false if the shader was not compiled. You can then
        query <code>getShaderInfoLog</code> to find the exact error.
</td>
</tr>
<tr>
<td><code>DELETE_STATUS</code></td>
<td>0x8B80</td>
<td>
Passed to <code>getShaderParameter</code> to determine if a shader was
deleted via <code>deleteShader</code>. Returns true if it was, false
otherwise.
</td>
</tr>
<tr>
<td><code>LINK_STATUS</code></td>
<td>0x8B82</td>
<td>
Passed to <code>getProgramParameter</code> after calling
<code>linkProgram</code> to determine if a program was linked correctly.
Returns false if there were errors. Use
<code>getProgramInfoLog</code> to find the exact error.
</td>
</tr>
<tr>
<td><code>VALIDATE_STATUS</code></td>
<td>0x8B83</td>
<td>
Passed to <code>getProgramParameter</code> after calling
<code>validateProgram</code> to determine if it is valid. Returns false
if errors were found.
</td>
</tr>
<tr>
<td><code>ATTACHED_SHADERS</code></td>
<td>0x8B85</td>
<td>
        Passed to <code>getProgramParameter</code> to get the number of
        shaders attached to a program.
</td>
</tr>
<tr>
<td><code>ACTIVE_ATTRIBUTES</code></td>
<td>0x8B89</td>
<td>
Passed to <code>getProgramParameter</code> to get the number of
attributes active in a program.
</td>
</tr>
<tr>
<td><code>ACTIVE_UNIFORMS</code></td>
<td>0x8B86</td>
<td>
Passed to <code>getProgramParameter</code> to get the number of uniforms
active in a program.
</td>
</tr>
<tr>
<td><code>MAX_VERTEX_ATTRIBS</code></td>
<td>0x8869</td>
<td>
The maximum number of entries possible in the vertex attribute list.
</td>
</tr>
<tr>
<td><code>MAX_VERTEX_UNIFORM_VECTORS</code></td>
<td>0x8DFB</td>
<td></td>
</tr>
<tr>
<td><code>MAX_VARYING_VECTORS</code></td>
<td>0x8DFC</td>
<td></td>
</tr>
<tr>
<td><code>MAX_COMBINED_TEXTURE_IMAGE_UNITS</code></td>
<td>0x8B4D</td>
<td></td>
</tr>
<tr>
<td><code>MAX_VERTEX_TEXTURE_IMAGE_UNITS</code></td>
<td>0x8B4C</td>
<td></td>
</tr>
<tr>
<td><code>MAX_TEXTURE_IMAGE_UNITS</code></td>
<td>0x8872</td>
<td>
        The implementation-dependent maximum number of texture units. At least 8.
</td>
</tr>
<tr>
<td><code>MAX_FRAGMENT_UNIFORM_VECTORS</code></td>
<td>0x8DFD</td>
<td></td>
</tr>
<tr>
<td><code>SHADER_TYPE</code></td>
<td>0x8B4F</td>
<td></td>
</tr>
<tr>
<td><code>SHADING_LANGUAGE_VERSION</code></td>
<td>0x8B8C</td>
<td></td>
</tr>
<tr>
<td><code>CURRENT_PROGRAM</code></td>
<td>0x8B8D</td>
<td></td>
</tr>
</tbody>
</table>
### Depth or stencil tests
Constants passed to {{domxref("WebGLRenderingContext.depthFunc()")}} or {{domxref("WebGLRenderingContext.stencilFunc()")}}.
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>NEVER</code></td>
<td>0x0200</td>
<td>
Passed to <code>depthFunction</code> or <code>stencilFunction</code> to
specify depth or stencil tests will never pass, i.e., nothing will be
drawn.
</td>
</tr>
<tr>
<td><code>LESS</code></td>
<td>0x0201</td>
<td>
Passed to <code>depthFunction</code> or <code>stencilFunction</code> to
specify depth or stencil tests will pass if the new depth value is less
than the stored value.
</td>
</tr>
<tr>
<td><code>EQUAL</code></td>
<td>0x0202</td>
<td>
Passed to <code>depthFunction</code> or <code>stencilFunction</code> to
        specify depth or stencil tests will pass if the new depth value is
        equal to the stored value.
</td>
</tr>
<tr>
<td><code>LEQUAL</code></td>
<td>0x0203</td>
<td>
Passed to <code>depthFunction</code> or <code>stencilFunction</code> to
specify depth or stencil tests will pass if the new depth value is less
than or equal to the stored value.
</td>
</tr>
<tr>
<td><code>GREATER</code></td>
<td>0x0204</td>
<td>
Passed to <code>depthFunction</code> or <code>stencilFunction</code> to
specify depth or stencil tests will pass if the new depth value is
greater than the stored value.
</td>
</tr>
<tr>
<td><code>NOTEQUAL</code></td>
<td>0x0205</td>
<td>
Passed to <code>depthFunction</code> or <code>stencilFunction</code> to
specify depth or stencil tests will pass if the new depth value is not
equal to the stored value.
</td>
</tr>
<tr>
<td><code>GEQUAL</code></td>
<td>0x0206</td>
<td>
Passed to <code>depthFunction</code> or <code>stencilFunction</code> to
specify depth or stencil tests will pass if the new depth value is
greater than or equal to the stored value.
</td>
</tr>
<tr>
<td><code>ALWAYS</code></td>
<td>0x0207</td>
<td>
Passed to <code>depthFunction</code> or <code>stencilFunction</code> to
        specify depth or stencil tests will always pass, i.e., pixels will be
        drawn in the order they are submitted.
</td>
</tr>
</tbody>
</table>
### Stencil actions
Constants passed to {{domxref("WebGLRenderingContext.stencilOp()")}}.
| Constant name | Value | Description |
| ------------- | ------ | ----------- |
| `KEEP` | 0x1E00 | |
| `REPLACE` | 0x1E01 | |
| `INCR` | 0x1E02 | |
| `DECR` | 0x1E03 | |
| `INVERT` | 0x150A | |
| `INCR_WRAP` | 0x8507 | |
| `DECR_WRAP` | 0x8508 | |
### Textures
Constants passed to {{domxref("WebGLRenderingContext.texParameter", "WebGLRenderingContext.texParameteri()")}}, {{domxref("WebGLRenderingContext.texParameter", "WebGLRenderingContext.texParameterf()")}}, {{domxref("WebGLRenderingContext.bindTexture()")}}, {{domxref("WebGLRenderingContext.texImage2D()")}}, and others.
| Constant name | Value | Description |
| ----------------------------- | --------------- | -------------------------------- |
| `NEAREST` | 0x2600 | |
| `LINEAR` | 0x2601 | |
| `NEAREST_MIPMAP_NEAREST` | 0x2700 | |
| `LINEAR_MIPMAP_NEAREST` | 0x2701 | |
| `NEAREST_MIPMAP_LINEAR` | 0x2702 | |
| `LINEAR_MIPMAP_LINEAR` | 0x2703 | |
| `TEXTURE_MAG_FILTER` | 0x2800 | |
| `TEXTURE_MIN_FILTER` | 0x2801 | |
| `TEXTURE_WRAP_S` | 0x2802 | |
| `TEXTURE_WRAP_T` | 0x2803 | |
| `TEXTURE_2D` | 0x0DE1 | |
| `TEXTURE` | 0x1702 | |
| `TEXTURE_CUBE_MAP` | 0x8513 | |
| `TEXTURE_BINDING_CUBE_MAP` | 0x8514 | |
| `TEXTURE_CUBE_MAP_POSITIVE_X` | 0x8515 | |
| `TEXTURE_CUBE_MAP_NEGATIVE_X` | 0x8516 | |
| `TEXTURE_CUBE_MAP_POSITIVE_Y` | 0x8517 | |
| `TEXTURE_CUBE_MAP_NEGATIVE_Y` | 0x8518 | |
| `TEXTURE_CUBE_MAP_POSITIVE_Z` | 0x8519 | |
| `TEXTURE_CUBE_MAP_NEGATIVE_Z` | 0x851A | |
| `MAX_CUBE_MAP_TEXTURE_SIZE` | 0x851C | |
| `TEXTURE0 - 31` | 0x84C0 - 0x84DF | A texture unit. |
| `ACTIVE_TEXTURE` | 0x84E0 | The current active texture unit. |
| `REPEAT` | 0x2901 | |
| `CLAMP_TO_EDGE` | 0x812F | |
| `MIRRORED_REPEAT` | 0x8370 | |
### Uniform types
| Constant name | Value | Description |
| -------------- | ------ | ----------- |
| `FLOAT_VEC2` | 0x8B50 | |
| `FLOAT_VEC3` | 0x8B51 | |
| `FLOAT_VEC4` | 0x8B52 | |
| `INT_VEC2` | 0x8B53 | |
| `INT_VEC3` | 0x8B54 | |
| `INT_VEC4` | 0x8B55 | |
| `BOOL` | 0x8B56 | |
| `BOOL_VEC2` | 0x8B57 | |
| `BOOL_VEC3` | 0x8B58 | |
| `BOOL_VEC4` | 0x8B59 | |
| `FLOAT_MAT2` | 0x8B5A | |
| `FLOAT_MAT3` | 0x8B5B | |
| `FLOAT_MAT4` | 0x8B5C | |
| `SAMPLER_2D` | 0x8B5E | |
| `SAMPLER_CUBE` | 0x8B60 | |
### Shader precision-specified types
| Constant name | Value | Description |
| -------------- | ------ | ----------- |
| `LOW_FLOAT` | 0x8DF0 | |
| `MEDIUM_FLOAT` | 0x8DF1 | |
| `HIGH_FLOAT` | 0x8DF2 | |
| `LOW_INT` | 0x8DF3 | |
| `MEDIUM_INT` | 0x8DF4 | |
| `HIGH_INT` | 0x8DF5 | |
### Framebuffers and renderbuffers
| Constant name | Value | Description |
| ---------------------------------------------- | ------ | ----------- |
| `FRAMEBUFFER` | 0x8D40 | |
| `RENDERBUFFER` | 0x8D41 | |
| `RGBA4` | 0x8056 | |
| `RGB5_A1` | 0x8057 | |
| `RGB565` | 0x8D62 | |
| `DEPTH_COMPONENT16` | 0x81A5 | |
| `STENCIL_INDEX8` | 0x8D48 | |
| `DEPTH_STENCIL` | 0x84F9 | |
| `RENDERBUFFER_WIDTH` | 0x8D42 | |
| `RENDERBUFFER_HEIGHT` | 0x8D43 | |
| `RENDERBUFFER_INTERNAL_FORMAT` | 0x8D44 | |
| `RENDERBUFFER_RED_SIZE` | 0x8D50 | |
| `RENDERBUFFER_GREEN_SIZE` | 0x8D51 | |
| `RENDERBUFFER_BLUE_SIZE` | 0x8D52 | |
| `RENDERBUFFER_ALPHA_SIZE` | 0x8D53 | |
| `RENDERBUFFER_DEPTH_SIZE` | 0x8D54 | |
| `RENDERBUFFER_STENCIL_SIZE` | 0x8D55 | |
| `FRAMEBUFFER_ATTACHMENT_OBJECT_TYPE` | 0x8CD0 | |
| `FRAMEBUFFER_ATTACHMENT_OBJECT_NAME` | 0x8CD1 | |
| `FRAMEBUFFER_ATTACHMENT_TEXTURE_LEVEL` | 0x8CD2 | |
| `FRAMEBUFFER_ATTACHMENT_TEXTURE_CUBE_MAP_FACE` | 0x8CD3 | |
| `COLOR_ATTACHMENT0` | 0x8CE0 | |
| `DEPTH_ATTACHMENT` | 0x8D00 | |
| `STENCIL_ATTACHMENT` | 0x8D20 | |
| `DEPTH_STENCIL_ATTACHMENT` | 0x821A | |
| `NONE` | 0 | |
| `FRAMEBUFFER_COMPLETE` | 0x8CD5 | |
| `FRAMEBUFFER_INCOMPLETE_ATTACHMENT` | 0x8CD6 | |
| `FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT` | 0x8CD7 | |
| `FRAMEBUFFER_INCOMPLETE_DIMENSIONS` | 0x8CD9 | |
| `FRAMEBUFFER_UNSUPPORTED` | 0x8CDD | |
| `FRAMEBUFFER_BINDING` | 0x8CA6 | |
| `RENDERBUFFER_BINDING` | 0x8CA7 | |
| `MAX_RENDERBUFFER_SIZE` | 0x84E8 | |
| `INVALID_FRAMEBUFFER_OPERATION` | 0x0506 | |
### Pixel storage modes
Constants passed to {{domxref("WebGLRenderingContext.pixelStorei()")}}.
| Constant name | Value | Description |
| ------------------------------------ | ------ | ----------- |
| `UNPACK_FLIP_Y_WEBGL` | 0x9240 | |
| `UNPACK_PREMULTIPLY_ALPHA_WEBGL` | 0x9241 | |
| `UNPACK_COLORSPACE_CONVERSION_WEBGL` | 0x9243 | |
## Additional constants defined in WebGL 2
These constants are defined on the {{domxref("WebGL2RenderingContext")}} interface. All WebGL 1 constants are also available in a WebGL 2 context.
### Getting GL parameter information
Constants passed to {{domxref("WebGLRenderingContext.getParameter()")}} to specify what information to return.
| Constant name | Value | Description |
| --------------------------------- | ------ | ----------- |
| `READ_BUFFER` | 0x0C02 | |
| `UNPACK_ROW_LENGTH` | 0x0CF2 | |
| `UNPACK_SKIP_ROWS` | 0x0CF3 | |
| `UNPACK_SKIP_PIXELS` | 0x0CF4 | |
| `PACK_ROW_LENGTH` | 0x0D02 | |
| `PACK_SKIP_ROWS` | 0x0D03 | |
| `PACK_SKIP_PIXELS` | 0x0D04 | |
| `TEXTURE_BINDING_3D` | 0x806A | |
| `UNPACK_SKIP_IMAGES` | 0x806D | |
| `UNPACK_IMAGE_HEIGHT` | 0x806E | |
| `MAX_3D_TEXTURE_SIZE` | 0x8073 | |
| `MAX_ELEMENTS_VERTICES` | 0x80E8 | |
| `MAX_ELEMENTS_INDICES` | 0x80E9 | |
| `MAX_TEXTURE_LOD_BIAS` | 0x84FD | |
| `MAX_FRAGMENT_UNIFORM_COMPONENTS` | 0x8B49 | |
| `MAX_VERTEX_UNIFORM_COMPONENTS` | 0x8B4A | |
| `MAX_ARRAY_TEXTURE_LAYERS` | 0x88FF | |
| `MIN_PROGRAM_TEXEL_OFFSET` | 0x8904 | |
| `MAX_PROGRAM_TEXEL_OFFSET` | 0x8905 | |
| `MAX_VARYING_COMPONENTS` | 0x8B4B | |
| `FRAGMENT_SHADER_DERIVATIVE_HINT` | 0x8B8B | |
| `RASTERIZER_DISCARD` | 0x8C89 | |
| `VERTEX_ARRAY_BINDING` | 0x85B5 | |
| `MAX_VERTEX_OUTPUT_COMPONENTS` | 0x9122 | |
| `MAX_FRAGMENT_INPUT_COMPONENTS` | 0x9125 | |
| `MAX_SERVER_WAIT_TIMEOUT` | 0x9111 | |
| `MAX_ELEMENT_INDEX` | 0x8D6B | |
### Textures
Constants passed to {{domxref("WebGLRenderingContext.texParameter", "WebGLRenderingContext.texParameteri()")}}, {{domxref("WebGLRenderingContext.texParameter", "WebGLRenderingContext.texParameterf()")}}, {{domxref("WebGLRenderingContext.bindTexture()")}}, {{domxref("WebGLRenderingContext.texImage2D()")}}, and others.
| Constant name | Value | Description |
| -------------------------- | ------ | ----------- |
| `RED` | 0x1903 | |
| `RGB8` | 0x8051 | |
| `RGBA8` | 0x8058 | |
| `RGB10_A2` | 0x8059 | |
| `TEXTURE_3D` | 0x806F | |
| `TEXTURE_WRAP_R` | 0x8072 | |
| `TEXTURE_MIN_LOD` | 0x813A | |
| `TEXTURE_MAX_LOD` | 0x813B | |
| `TEXTURE_BASE_LEVEL` | 0x813C | |
| `TEXTURE_MAX_LEVEL` | 0x813D | |
| `TEXTURE_COMPARE_MODE` | 0x884C | |
| `TEXTURE_COMPARE_FUNC` | 0x884D | |
| `SRGB` | 0x8C40 | |
| `SRGB8` | 0x8C41 | |
| `SRGB8_ALPHA8` | 0x8C43 | |
| `COMPARE_REF_TO_TEXTURE` | 0x884E | |
| `RGBA32F` | 0x8814 | |
| `RGB32F` | 0x8815 | |
| `RGBA16F` | 0x881A | |
| `RGB16F` | 0x881B | |
| `TEXTURE_2D_ARRAY` | 0x8C1A | |
| `TEXTURE_BINDING_2D_ARRAY` | 0x8C1D | |
| `R11F_G11F_B10F` | 0x8C3A | |
| `RGB9_E5` | 0x8C3D | |
| `RGBA32UI` | 0x8D70 | |
| `RGB32UI` | 0x8D71 | |
| `RGBA16UI` | 0x8D76 | |
| `RGB16UI` | 0x8D77 | |
| `RGBA8UI` | 0x8D7C | |
| `RGB8UI` | 0x8D7D | |
| `RGBA32I` | 0x8D82 | |
| `RGB32I` | 0x8D83 | |
| `RGBA16I` | 0x8D88 | |
| `RGB16I` | 0x8D89 | |
| `RGBA8I` | 0x8D8E | |
| `RGB8I` | 0x8D8F | |
| `RED_INTEGER` | 0x8D94 | |
| `RGB_INTEGER` | 0x8D98 | |
| `RGBA_INTEGER` | 0x8D99 | |
| `R8` | 0x8229 | |
| `RG8` | 0x822B | |
| `R16F`                     | 0x822D | |
| `R32F`                     | 0x822E | |
| `RG16F`                    | 0x822F | |
| `RG32F`                    | 0x8230 | |
| `R8I`                      | 0x8231 | |
| `R8UI`                     | 0x8232 | |
| `R16I`                     | 0x8233 | |
| `R16UI`                    | 0x8234 | |
| `R32I`                     | 0x8235 | |
| `R32UI`                    | 0x8236 | |
| `RG8I`                     | 0x8237 | |
| `RG8UI`                    | 0x8238 | |
| `RG16I`                    | 0x8239 | |
| `RG16UI`                   | 0x823A | |
| `RG32I`                    | 0x823B | |
| `RG32UI`                   | 0x823C | |
| `R8_SNORM`                 | 0x8F94 | |
| `RG8_SNORM`                | 0x8F95 | |
| `RGB8_SNORM`               | 0x8F96 | |
| `RGBA8_SNORM`              | 0x8F97 | |
| `RGB10_A2UI` | 0x906F | |
| `TEXTURE_IMMUTABLE_FORMAT` | 0x912F | |
| `TEXTURE_IMMUTABLE_LEVELS` | 0x82DF | |
### Pixel types
| Constant name | Value | Description |
| -------------------------------- | ------ | ----------- |
| `UNSIGNED_INT_2_10_10_10_REV` | 0x8368 | |
| `UNSIGNED_INT_10F_11F_11F_REV` | 0x8C3B | |
| `UNSIGNED_INT_5_9_9_9_REV` | 0x8C3E | |
| `FLOAT_32_UNSIGNED_INT_24_8_REV` | 0x8DAD | |
| `UNSIGNED_INT_24_8`              | 0x84FA | |
| `HALF_FLOAT` | 0x140B | |
| `RG` | 0x8227 | |
| `RG_INTEGER` | 0x8228 | |
| `INT_2_10_10_10_REV` | 0x8D9F | |
### Queries
| Constant name | Value | Description |
| --------------------------------- | ------ | ----------- |
| `CURRENT_QUERY` | 0x8865 | |
| `QUERY_RESULT` | 0x8866 | |
| `QUERY_RESULT_AVAILABLE` | 0x8867 | |
| `ANY_SAMPLES_PASSED` | 0x8C2F | |
| `ANY_SAMPLES_PASSED_CONSERVATIVE` | 0x8D6A | |
### Draw buffers
| Constant name | Value | Description |
| ----------------------- | ------ | ----------- |
| `MAX_DRAW_BUFFERS` | 0x8824 | |
| `DRAW_BUFFER0` | 0x8825 | |
| `DRAW_BUFFER1` | 0x8826 | |
| `DRAW_BUFFER2` | 0x8827 | |
| `DRAW_BUFFER3` | 0x8828 | |
| `DRAW_BUFFER4` | 0x8829 | |
| `DRAW_BUFFER5` | 0x882A | |
| `DRAW_BUFFER6` | 0x882B | |
| `DRAW_BUFFER7` | 0x882C | |
| `DRAW_BUFFER8` | 0x882D | |
| `DRAW_BUFFER9` | 0x882E | |
| `DRAW_BUFFER10` | 0x882F | |
| `DRAW_BUFFER11` | 0x8830 | |
| `DRAW_BUFFER12` | 0x8831 | |
| `DRAW_BUFFER13` | 0x8832 | |
| `DRAW_BUFFER14` | 0x8833 | |
| `DRAW_BUFFER15` | 0x8834 | |
| `MAX_COLOR_ATTACHMENTS` | 0x8CDF | |
| `COLOR_ATTACHMENT1` | 0x8CE1 | |
| `COLOR_ATTACHMENT2` | 0x8CE2 | |
| `COLOR_ATTACHMENT3` | 0x8CE3 | |
| `COLOR_ATTACHMENT4` | 0x8CE4 | |
| `COLOR_ATTACHMENT5` | 0x8CE5 | |
| `COLOR_ATTACHMENT6` | 0x8CE6 | |
| `COLOR_ATTACHMENT7` | 0x8CE7 | |
| `COLOR_ATTACHMENT8` | 0x8CE8 | |
| `COLOR_ATTACHMENT9` | 0x8CE9 | |
| `COLOR_ATTACHMENT10` | 0x8CEA | |
| `COLOR_ATTACHMENT11` | 0x8CEB | |
| `COLOR_ATTACHMENT12` | 0x8CEC | |
| `COLOR_ATTACHMENT13` | 0x8CED | |
| `COLOR_ATTACHMENT14` | 0x8CEE | |
| `COLOR_ATTACHMENT15` | 0x8CEF | |
### Samplers
| Constant name | Value | Description |
| ------------------------------- | ------ | ----------- |
| `SAMPLER_3D` | 0x8B5F | |
| `SAMPLER_2D_SHADOW` | 0x8B62 | |
| `SAMPLER_2D_ARRAY` | 0x8DC1 | |
| `SAMPLER_2D_ARRAY_SHADOW` | 0x8DC4 | |
| `SAMPLER_CUBE_SHADOW` | 0x8DC5 | |
| `INT_SAMPLER_2D` | 0x8DCA | |
| `INT_SAMPLER_3D` | 0x8DCB | |
| `INT_SAMPLER_CUBE` | 0x8DCC | |
| `INT_SAMPLER_2D_ARRAY` | 0x8DCF | |
| `UNSIGNED_INT_SAMPLER_2D` | 0x8DD2 | |
| `UNSIGNED_INT_SAMPLER_3D` | 0x8DD3 | |
| `UNSIGNED_INT_SAMPLER_CUBE` | 0x8DD4 | |
| `UNSIGNED_INT_SAMPLER_2D_ARRAY` | 0x8DD7 | |
| `MAX_SAMPLES` | 0x8D57 | |
| `SAMPLER_BINDING` | 0x8919 | |
### Buffers
| Constant name | Value | Description |
| ----------------------------- | ------ | ----------- |
| `PIXEL_PACK_BUFFER` | 0x88EB | |
| `PIXEL_UNPACK_BUFFER` | 0x88EC | |
| `PIXEL_PACK_BUFFER_BINDING` | 0x88ED | |
| `PIXEL_UNPACK_BUFFER_BINDING` | 0x88EF | |
| `COPY_READ_BUFFER` | 0x8F36 | |
| `COPY_WRITE_BUFFER` | 0x8F37 | |
| `COPY_READ_BUFFER_BINDING` | 0x8F36 | |
| `COPY_WRITE_BUFFER_BINDING` | 0x8F37 | |
### Data types
| Constant name | Value | Description |
| --------------------- | ------ | ----------- |
| `FLOAT_MAT2x3` | 0x8B65 | |
| `FLOAT_MAT2x4` | 0x8B66 | |
| `FLOAT_MAT3x2` | 0x8B67 | |
| `FLOAT_MAT3x4` | 0x8B68 | |
| `FLOAT_MAT4x2` | 0x8B69 | |
| `FLOAT_MAT4x3` | 0x8B6A | |
| `UNSIGNED_INT_VEC2` | 0x8DC6 | |
| `UNSIGNED_INT_VEC3` | 0x8DC7 | |
| `UNSIGNED_INT_VEC4` | 0x8DC8 | |
| `UNSIGNED_NORMALIZED` | 0x8C17 | |
| `SIGNED_NORMALIZED` | 0x8F9C | |
### Vertex attributes
| Constant name | Value | Description |
| ----------------------------- | ------ | ----------- |
| `VERTEX_ATTRIB_ARRAY_INTEGER` | 0x88FD | |
| `VERTEX_ATTRIB_ARRAY_DIVISOR` | 0x88FE | |
### Transform feedback
| Constant name | Value | Description |
| ----------------------------------------------- | ------ | ----------- |
| `TRANSFORM_FEEDBACK_BUFFER_MODE` | 0x8C7F | |
| `MAX_TRANSFORM_FEEDBACK_SEPARATE_COMPONENTS` | 0x8C80 | |
| `TRANSFORM_FEEDBACK_VARYINGS` | 0x8C83 | |
| `TRANSFORM_FEEDBACK_BUFFER_START` | 0x8C84 | |
| `TRANSFORM_FEEDBACK_BUFFER_SIZE` | 0x8C85 | |
| `TRANSFORM_FEEDBACK_PRIMITIVES_WRITTEN` | 0x8C88 | |
| `MAX_TRANSFORM_FEEDBACK_INTERLEAVED_COMPONENTS` | 0x8C8A | |
| `MAX_TRANSFORM_FEEDBACK_SEPARATE_ATTRIBS` | 0x8C8B | |
| `INTERLEAVED_ATTRIBS` | 0x8C8C | |
| `SEPARATE_ATTRIBS` | 0x8C8D | |
| `TRANSFORM_FEEDBACK_BUFFER` | 0x8C8E | |
| `TRANSFORM_FEEDBACK_BUFFER_BINDING` | 0x8C8F | |
| `TRANSFORM_FEEDBACK` | 0x8E22 | |
| `TRANSFORM_FEEDBACK_PAUSED` | 0x8E23 | |
| `TRANSFORM_FEEDBACK_ACTIVE` | 0x8E24 | |
| `TRANSFORM_FEEDBACK_BINDING` | 0x8E25 | |
### Framebuffers and renderbuffers
| Constant name | Value | Description |
| --------------------------------------- | ------ | ----------- |
| `FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING` | 0x8210 | |
| `FRAMEBUFFER_ATTACHMENT_COMPONENT_TYPE` | 0x8211 | |
| `FRAMEBUFFER_ATTACHMENT_RED_SIZE` | 0x8212 | |
| `FRAMEBUFFER_ATTACHMENT_GREEN_SIZE` | 0x8213 | |
| `FRAMEBUFFER_ATTACHMENT_BLUE_SIZE` | 0x8214 | |
| `FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE` | 0x8215 | |
| `FRAMEBUFFER_ATTACHMENT_DEPTH_SIZE` | 0x8216 | |
| `FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE` | 0x8217 | |
| `FRAMEBUFFER_DEFAULT` | 0x8218 | |
| `DEPTH_STENCIL_ATTACHMENT` | 0x821A | |
| `DEPTH_STENCIL` | 0x84F9 | |
| `DEPTH24_STENCIL8` | 0x88F0 | |
| `DRAW_FRAMEBUFFER_BINDING` | 0x8CA6 | |
| `READ_FRAMEBUFFER` | 0x8CA8 | |
| `DRAW_FRAMEBUFFER` | 0x8CA9 | |
| `READ_FRAMEBUFFER_BINDING` | 0x8CAA | |
| `RENDERBUFFER_SAMPLES` | 0x8CAB | |
| `FRAMEBUFFER_ATTACHMENT_TEXTURE_LAYER` | 0x8CD4 | |
| `FRAMEBUFFER_INCOMPLETE_MULTISAMPLE` | 0x8D56 | |
### Uniforms
| Constant name | Value | Description |
| --------------------------------------------- | ------ | ----------- |
| `UNIFORM_BUFFER` | 0x8A11 | |
| `UNIFORM_BUFFER_BINDING` | 0x8A28 | |
| `UNIFORM_BUFFER_START` | 0x8A29 | |
| `UNIFORM_BUFFER_SIZE` | 0x8A2A | |
| `MAX_VERTEX_UNIFORM_BLOCKS` | 0x8A2B | |
| `MAX_FRAGMENT_UNIFORM_BLOCKS` | 0x8A2D | |
| `MAX_COMBINED_UNIFORM_BLOCKS` | 0x8A2E | |
| `MAX_UNIFORM_BUFFER_BINDINGS` | 0x8A2F | |
| `MAX_UNIFORM_BLOCK_SIZE` | 0x8A30 | |
| `MAX_COMBINED_VERTEX_UNIFORM_COMPONENTS` | 0x8A31 | |
| `MAX_COMBINED_FRAGMENT_UNIFORM_COMPONENTS` | 0x8A33 | |
| `UNIFORM_BUFFER_OFFSET_ALIGNMENT` | 0x8A34 | |
| `ACTIVE_UNIFORM_BLOCKS` | 0x8A36 | |
| `UNIFORM_TYPE` | 0x8A37 | |
| `UNIFORM_SIZE` | 0x8A38 | |
| `UNIFORM_BLOCK_INDEX` | 0x8A3A | |
| `UNIFORM_OFFSET` | 0x8A3B | |
| `UNIFORM_ARRAY_STRIDE` | 0x8A3C | |
| `UNIFORM_MATRIX_STRIDE` | 0x8A3D | |
| `UNIFORM_IS_ROW_MAJOR` | 0x8A3E | |
| `UNIFORM_BLOCK_BINDING` | 0x8A3F | |
| `UNIFORM_BLOCK_DATA_SIZE` | 0x8A40 | |
| `UNIFORM_BLOCK_ACTIVE_UNIFORMS` | 0x8A42 | |
| `UNIFORM_BLOCK_ACTIVE_UNIFORM_INDICES` | 0x8A43 | |
| `UNIFORM_BLOCK_REFERENCED_BY_VERTEX_SHADER` | 0x8A44 | |
| `UNIFORM_BLOCK_REFERENCED_BY_FRAGMENT_SHADER` | 0x8A46 | |
### Sync objects
| Constant name | Value | Description |
| ---------------------------- | ---------- | ----------- |
| `OBJECT_TYPE` | 0x9112 | |
| `SYNC_CONDITION` | 0x9113 | |
| `SYNC_STATUS` | 0x9114 | |
| `SYNC_FLAGS` | 0x9115 | |
| `SYNC_FENCE` | 0x9116 | |
| `SYNC_GPU_COMMANDS_COMPLETE` | 0x9117 | |
| `UNSIGNALED` | 0x9118 | |
| `SIGNALED` | 0x9119 | |
| `ALREADY_SIGNALED` | 0x911A | |
| `TIMEOUT_EXPIRED` | 0x911B | |
| `CONDITION_SATISFIED` | 0x911C | |
| `WAIT_FAILED` | 0x911D | |
| `SYNC_FLUSH_COMMANDS_BIT` | 0x00000001 | |
### Miscellaneous constants
| Constant name | Value | Description |
| ------------------------------- | ---------- | ----------- |
| `COLOR` | 0x1800 | |
| `DEPTH`                         | 0x1801 | |
| `STENCIL` | 0x1802 | |
| `MIN` | 0x8007 | |
| `MAX`                           | 0x8008 | |
| `DEPTH_COMPONENT24` | 0x81A6 | |
| `STREAM_READ` | 0x88E1 | |
| `STREAM_COPY` | 0x88E2 | |
| `STATIC_READ` | 0x88E5 | |
| `STATIC_COPY` | 0x88E6 | |
| `DYNAMIC_READ` | 0x88E9 | |
| `DYNAMIC_COPY` | 0x88EA | |
| `DEPTH_COMPONENT32F` | 0x8CAC | |
| `DEPTH32F_STENCIL8` | 0x8CAD | |
| `INVALID_INDEX` | 0xFFFFFFFF | |
| `TIMEOUT_IGNORED` | -1 | |
| `MAX_CLIENT_WAIT_TIMEOUT_WEBGL` | 0x9247 | |
## Constants defined in WebGL extensions
### ANGLE_instanced_arrays
| Constant name | Value | Description |
| ----------------------------------- | ------ | ------------------------------------------------------------- |
| `VERTEX_ATTRIB_ARRAY_DIVISOR_ANGLE` | 0x88FE | Describes the frequency divisor used for instanced rendering. |
For more information, see {{domxref("ANGLE_instanced_arrays")}}.
### WEBGL_debug_renderer_info
| Constant name | Value | Description |
| ------------------------- | ------ | --------------------------------------------------------------------------- |
| `UNMASKED_VENDOR_WEBGL` | 0x9245 | Passed to `getParameter` to get the vendor string of the graphics driver. |
| `UNMASKED_RENDERER_WEBGL` | 0x9246 | Passed to `getParameter` to get the renderer string of the graphics driver. |
For more information, see {{domxref("WEBGL_debug_renderer_info")}}.
### EXT_texture_filter_anisotropic
| Constant name | Value | Description |
| -------------------------------- | ------ | ----------------------------------------------------------------------------- |
| `MAX_TEXTURE_MAX_ANISOTROPY_EXT` | 0x84FF | Returns the maximum available anisotropy. |
| `TEXTURE_MAX_ANISOTROPY_EXT` | 0x84FE | Passed to `texParameter` to set the desired maximum anisotropy for a texture. |
For more information, see {{domxref("EXT_texture_filter_anisotropic")}}.
### WEBGL_compressed_texture_s3tc
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>COMPRESSED_RGB_S3TC_DXT1_EXT</code></td>
<td>0x83F0</td>
<td>A DXT1-compressed image in an RGB image format.</td>
</tr>
<tr>
<td><code>COMPRESSED_RGBA_S3TC_DXT1_EXT</code></td>
<td>0x83F1</td>
<td>
A DXT1-compressed image in an RGB image format with a simple on/off
alpha value.
</td>
</tr>
<tr>
<td><code>COMPRESSED_RGBA_S3TC_DXT3_EXT</code></td>
<td>0x83F2</td>
<td>
A DXT3-compressed image in an RGBA image format. Compared to a 32-bit
RGBA texture, it offers 4:1 compression.
</td>
</tr>
<tr>
<td><code>COMPRESSED_RGBA_S3TC_DXT5_EXT</code></td>
<td>0x83F3</td>
<td>
A DXT5-compressed image in an RGBA image format. It also provides a 4:1
compression, but differs to the DXT3 compression in how the alpha
compression is done.
</td>
</tr>
</tbody>
</table>
For more information, see {{domxref("WEBGL_compressed_texture_s3tc")}}.
### WEBGL_compressed_texture_etc
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>COMPRESSED_R11_EAC</code></td>
<td>0x9270</td>
<td>One-channel (red) unsigned format compression.</td>
</tr>
<tr>
<td><code>COMPRESSED_SIGNED_R11_EAC</code></td>
<td>0x9271</td>
<td>One-channel (red) signed format compression.</td>
</tr>
<tr>
<td><code>COMPRESSED_RG11_EAC</code></td>
<td>0x9272</td>
<td>Two-channel (red and green) unsigned format compression.</td>
</tr>
<tr>
<td><code>COMPRESSED_SIGNED_RG11_EAC</code></td>
<td>0x9273</td>
<td>Two-channel (red and green) signed format compression.</td>
</tr>
<tr>
<td><code>COMPRESSED_RGB8_ETC2</code></td>
<td>0x9274</td>
<td>Compresses RGB8 data with no alpha channel.</td>
</tr>
<tr>
<td><code>COMPRESSED_RGBA8_ETC2_EAC</code></td>
<td>0x9275</td>
<td>
Compresses RGBA8 data. The RGB part is encoded the same as
<code>RGB_ETC2</code>, but the alpha part is encoded separately.
</td>
</tr>
<tr>
<td><code>COMPRESSED_SRGB8_ETC2</code></td>
<td>0x9276</td>
<td>Compresses sRGB8 data with no alpha channel.</td>
</tr>
<tr>
<td><code>COMPRESSED_SRGB8_ALPHA8_ETC2_EAC</code></td>
<td>0x9277</td>
<td>
Compresses sRGBA8 data. The sRGB part is encoded the same as
<code>SRGB_ETC2</code>, but the alpha part is encoded separately.
</td>
</tr>
<tr>
<td><code>COMPRESSED_RGB8_PUNCHTHROUGH_ALPHA1_ETC2</code></td>
<td>0x9278</td>
<td>
Similar to <code>RGB8_ETC</code>, but with ability to punch through the
alpha channel, which means to make it completely opaque or transparent.
</td>
</tr>
<tr>
<td><code>COMPRESSED_SRGB8_PUNCHTHROUGH_ALPHA1_ETC2</code></td>
<td>0x9279</td>
<td>
Similar to <code>SRGB8_ETC</code>, but with ability to punch through the
alpha channel, which means to make it completely opaque or transparent.
</td>
</tr>
</tbody>
</table>
For more information, see {{domxref("WEBGL_compressed_texture_etc")}}.
### WEBGL_compressed_texture_pvrtc
| Constant name | Value | Description |
| ---------------------------------- | ------ | -------------------------------------------------------------- |
| `COMPRESSED_RGB_PVRTC_4BPPV1_IMG` | 0x8C00 | RGB compression in 4-bit mode. One block for each 4×4 pixels. |
| `COMPRESSED_RGBA_PVRTC_4BPPV1_IMG` | 0x8C02 | RGBA compression in 4-bit mode. One block for each 4×4 pixels. |
| `COMPRESSED_RGB_PVRTC_2BPPV1_IMG` | 0x8C01 | RGB compression in 2-bit mode. One block for each 8×4 pixels. |
| `COMPRESSED_RGBA_PVRTC_2BPPV1_IMG` | 0x8C03 | RGBA compression in 2-bit mode. One block for each 8×4 pixels. |
For more information, see {{domxref("WEBGL_compressed_texture_pvrtc")}}.
### WEBGL_compressed_texture_etc1
| Constant name | Value | Description |
| --------------------------- | ------ | ------------------------------------------------- |
| `COMPRESSED_RGB_ETC1_WEBGL` | 0x8D64 | Compresses 24-bit RGB data with no alpha channel. |
For more information, see {{domxref("WEBGL_compressed_texture_etc1")}}.
### WEBGL_depth_texture
| Constant name | Value | Description |
| ------------------------- | ------ | ---------------------------------------------------- |
| `UNSIGNED_INT_24_8_WEBGL` | 0x84FA | Unsigned integer type for 24-bit depth texture data. |
For more information, see {{domxref("WEBGL_depth_texture")}}.
### OES_texture_half_float
| Constant name | Value | Description |
| ---------------- | ------ | ---------------------------------- |
| `HALF_FLOAT_OES` | 0x8D61 | Half floating-point type (16-bit). |
For more information, see {{domxref("OES_texture_half_float")}}.
### WEBGL_color_buffer_float
| Constant name | Value | Description |
| ------------------------------------------- | ------ | --------------------------------------------------- |
| `RGBA32F_EXT` | 0x8814 | RGBA 32-bit floating-point color-renderable format. |
| `RGB32F_EXT` | 0x8815 | RGB 32-bit floating-point color-renderable format. |
| `FRAMEBUFFER_ATTACHMENT_COMPONENT_TYPE_EXT` | 0x8211 | |
| `UNSIGNED_NORMALIZED_EXT` | 0x8C17 | |
For more information, see {{domxref("WEBGL_color_buffer_float")}}.
### EXT_blend_minmax
| Constant name | Value | Description |
| ------------- | ------ | --------------------------------------------------------------------------- |
| `MIN_EXT` | 0x8007 | Produces the minimum color components of the source and destination colors. |
| `MAX_EXT` | 0x8008 | Produces the maximum color components of the source and destination colors. |
For more information, see {{domxref("EXT_blend_minmax")}}.
### EXT_sRGB
| Constant name | Value | Description |
| ------------------------------------------- | ------ | --------------------------------------------------------------- |
| `SRGB_EXT` | 0x8C40 | Unsized sRGB format that leaves the precision up to the driver. |
| `SRGB_ALPHA_EXT` | 0x8C42 | Unsized sRGB format with unsized alpha component. |
| `SRGB8_ALPHA8_EXT` | 0x8C43 | Sized (8-bit) sRGB and alpha formats. |
| `FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING_EXT` | 0x8210 | Returns the framebuffer color encoding. |
For more information, see {{domxref("EXT_sRGB")}}.
### OES_standard_derivatives
<table class="no-markdown">
<thead>
<tr>
<th scope="col">Constant name</th>
<th scope="col">Value</th>
<th scope="col">Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>FRAGMENT_SHADER_DERIVATIVE_HINT_OES</code></td>
<td>0x8B8B</td>
<td>
Indicates the accuracy of the derivative calculation for the GLSL
built-in functions: <code>dFdx</code>, <code>dFdy</code>, and
<code>fwidth</code>.
</td>
</tr>
</tbody>
</table>
For more information, see {{domxref("OES_standard_derivatives")}}.
### WEBGL_draw_buffers
| Constant name | Value | Description |
| ----------------------------- | ------ | ----------------------------------------------------- |
| `COLOR_ATTACHMENT0_WEBGL` | 0x8CE0 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT1_WEBGL` | 0x8CE1 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT2_WEBGL` | 0x8CE2 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT3_WEBGL` | 0x8CE3 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT4_WEBGL` | 0x8CE4 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT5_WEBGL` | 0x8CE5 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT6_WEBGL` | 0x8CE6 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT7_WEBGL` | 0x8CE7 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT8_WEBGL` | 0x8CE8 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT9_WEBGL` | 0x8CE9 | Framebuffer color attachment point |
| `COLOR_ATTACHMENT10_WEBGL` | 0x8CEA | Framebuffer color attachment point |
| `COLOR_ATTACHMENT11_WEBGL` | 0x8CEB | Framebuffer color attachment point |
| `COLOR_ATTACHMENT12_WEBGL` | 0x8CEC | Framebuffer color attachment point |
| `COLOR_ATTACHMENT13_WEBGL` | 0x8CED | Framebuffer color attachment point |
| `COLOR_ATTACHMENT14_WEBGL` | 0x8CEE | Framebuffer color attachment point |
| `COLOR_ATTACHMENT15_WEBGL` | 0x8CEF | Framebuffer color attachment point |
| `DRAW_BUFFER0_WEBGL` | 0x8825 | Draw buffer |
| `DRAW_BUFFER1_WEBGL` | 0x8826 | Draw buffer |
| `DRAW_BUFFER2_WEBGL` | 0x8827 | Draw buffer |
| `DRAW_BUFFER3_WEBGL` | 0x8828 | Draw buffer |
| `DRAW_BUFFER4_WEBGL` | 0x8829 | Draw buffer |
| `DRAW_BUFFER5_WEBGL` | 0x882A | Draw buffer |
| `DRAW_BUFFER6_WEBGL` | 0x882B | Draw buffer |
| `DRAW_BUFFER7_WEBGL` | 0x882C | Draw buffer |
| `DRAW_BUFFER8_WEBGL` | 0x882D | Draw buffer |
| `DRAW_BUFFER9_WEBGL` | 0x882E | Draw buffer |
| `DRAW_BUFFER10_WEBGL` | 0x882F | Draw buffer |
| `DRAW_BUFFER11_WEBGL` | 0x8830 | Draw buffer |
| `DRAW_BUFFER12_WEBGL` | 0x8831 | Draw buffer |
| `DRAW_BUFFER13_WEBGL` | 0x8832 | Draw buffer |
| `DRAW_BUFFER14_WEBGL` | 0x8833 | Draw buffer |
| `DRAW_BUFFER15_WEBGL` | 0x8834 | Draw buffer |
| `MAX_COLOR_ATTACHMENTS_WEBGL` | 0x8CDF | Maximum number of framebuffer color attachment points |
| `MAX_DRAW_BUFFERS_WEBGL` | 0x8824 | Maximum number of draw buffers |
For more information, see {{domxref("WEBGL_draw_buffers")}}.
### OES_vertex_array_object
| Constant name | Value | Description |
| -------------------------- | ------ | ------------------------------------ |
| `VERTEX_ARRAY_BINDING_OES` | 0x85B5 | The bound vertex array object (VAO). |
For more information, see {{domxref("OES_vertex_array_object")}}.
### EXT_disjoint_timer_query
| Constant name | Value | Description |
| ---------------------------- | ------ | ----------------------------------------------------------------------------- |
| `QUERY_COUNTER_BITS_EXT` | 0x8864 | The number of bits used to hold the query result for the given target. |
| `CURRENT_QUERY_EXT` | 0x8865 | The currently active query. |
| `QUERY_RESULT_EXT` | 0x8866 | The query result. |
| `QUERY_RESULT_AVAILABLE_EXT` | 0x8867 | A Boolean indicating whether or not a query result is available. |
| `TIME_ELAPSED_EXT` | 0x88BF | Elapsed time (in nanoseconds). |
| `TIMESTAMP_EXT` | 0x8E28 | The current time. |
| `GPU_DISJOINT_EXT` | 0x8FBB | A Boolean indicating whether or not the GPU performed any disjoint operation. |
For more information, see {{domxref("EXT_disjoint_timer_query")}}.
## Specifications
{{Specifications}}
## See also
- {{domxref("WebGLRenderingContext")}}
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/webgl_best_practices/index.md
---
title: WebGL best practices
slug: Web/API/WebGL_API/WebGL_best_practices
page-type: guide
---
{{DefaultAPISidebar("WebGL")}}
WebGL is a complicated API, and it's often not obvious what the recommended ways to use it are. This page tackles recommendations across the spectrum of expertise, and not only highlights dos and don'ts, but also details _why_. You can rely on this document to guide your choice of approach, and ensure you're on the right track no matter what browser or hardware your users run.
## Address and eliminate WebGL errors
Your application should run without generating any WebGL errors (as returned by `getError`). Every WebGL error is reported in the Web Console as a JavaScript warning with a descriptive message. After too many errors (32 in Firefox), WebGL stops generating descriptive messages, which really hinders debugging.
The _only_ errors a well-formed page generates are `OUT_OF_MEMORY` and `CONTEXT_LOST_WEBGL`.
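During development, a small helper along these lines can surface errors near the call that caused them (a sketch; the `label` tag is just an illustrative convention). Because `getError` is a blocking call (see below), strip it from production builds:

```js
function checkGlError(gl, label) {
  // Dev-only: getError forces a synchronous round-trip to the GPU process.
  let err = gl.getError();
  while (err !== gl.NO_ERROR) {
    console.warn(`WebGL error 0x${err.toString(16)} after ${label}`);
    err = gl.getError();
  }
}
```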
## Understand extension availability
The availability of most WebGL extensions depends on the client system. When using WebGL extensions, if possible, try to make them optional by gracefully adapting to the case where they are not supported.
These WebGL 1 extensions are universally supported, and can be relied upon to be present:
- ANGLE_instanced_arrays
- EXT_blend_minmax
- OES_element_index_uint
- OES_standard_derivatives
- OES_vertex_array_object
- WEBGL_debug_renderer_info
- WEBGL_lose_context
_(see also: [WebGL feature levels and % support](https://kdashg.github.io/misc/webgl/webgl-feature-levels.html))_
Consider polyfilling these into WebGLRenderingContext, like: <https://github.com/kdashg/misc/blob/tip/webgl/webgl-v1.1.js>
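For extensions outside that list, one possible shape for graceful fallback (a sketch using the anisotropic filtering extension; a bound `TEXTURE_2D` is assumed):

```js
// Anisotropic filtering is a nice-to-have: use it when present, and fall
// back to whatever filtering you already configured when it isn't.
const aniso = gl.getExtension("EXT_texture_filter_anisotropic");
if (aniso) {
  const max = gl.getParameter(aniso.MAX_TEXTURE_MAX_ANISOTROPY_EXT);
  gl.texParameterf(gl.TEXTURE_2D, aniso.TEXTURE_MAX_ANISOTROPY_EXT, max);
}
```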
## Understand system limits
Similarly to extensions, the limits of your system will be different from those of your clients' systems! Don't assume you can use thirty texture samplers per shader just because it works on your machine!
The minimum requirements for WebGL are quite low. In practice, effectively all systems support at least the following:
```plain
MAX_CUBE_MAP_TEXTURE_SIZE: 4096
MAX_RENDERBUFFER_SIZE: 4096
MAX_TEXTURE_SIZE: 4096
MAX_VIEWPORT_DIMS: [4096,4096]
MAX_VERTEX_TEXTURE_IMAGE_UNITS: 4
MAX_TEXTURE_IMAGE_UNITS: 8
MAX_COMBINED_TEXTURE_IMAGE_UNITS: 8
MAX_VERTEX_ATTRIBS: 16
MAX_VARYING_VECTORS: 8
MAX_VERTEX_UNIFORM_VECTORS: 128
MAX_FRAGMENT_UNIFORM_VECTORS: 64
ALIASED_POINT_SIZE_RANGE: [1,100]
```
Your desktop may support 16k textures, or maybe 16 texture units in the vertex shader, but most other systems don't, and content that works for you will not work for them!
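Instead, query the actual limits once at startup and size your resources accordingly (a sketch; the 2048 shadow-map target is an arbitrary example):

```js
// One-time startup queries (getParameter is blocking, so don't call it
// per-frame):
const maxTexSize = gl.getParameter(gl.MAX_TEXTURE_SIZE);
const shadowMapSize = Math.min(2048, maxTexSize); // clamp to what's supported
```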
## Avoid invalidating FBO attachment bindings
Almost any change to an FBO's attachment bindings will invalidate its framebuffer completeness. Set up your hot framebuffers ahead of time.
In Firefox, setting the pref `webgl.perf.max-warnings` to `-1` in about:config will enable performance warnings that include warnings about FB completeness invalidations.
### Avoid changing VAO attachments (vertexAttribPointer, disable/enableVertexAttribArray)
Drawing from static, unchanging VAOs is faster than mutating the same VAO for every draw call. For unchanged VAOs, browsers can cache the fetch limits, whereas when VAOs change, browsers must revalidate and recalculate limits. The overhead for this is relatively low, but re-using VAOs means fewer `vertexAttribPointer` calls too, so it's worth doing wherever it's easy.
## Delete objects eagerly
Don't wait for the garbage collector/cycle collector to realize objects are orphaned and destroy them. Implementations track the liveness of objects, so 'deleting' them at the API level only releases the handle that refers to the actual object (conceptually, releasing the handle's ref-pointer to the object). Only once the object is unused in the implementation is it actually freed. For example, if you never want to access your shader objects directly again, just delete their handles after attaching them to a program object.
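For example (assuming `vs` and `fs` are compiled shaders):

```js
const prog = gl.createProgram();
gl.attachShader(prog, vs);
gl.attachShader(prog, fs);
gl.linkProgram(prog);
// The program keeps what it needs internally; if you'll never touch the
// shader objects again, release your handles to them right away.
gl.deleteShader(vs);
gl.deleteShader(fs);
```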
## Lose contexts eagerly
Consider also eagerly losing WebGL contexts via the `WEBGL_lose_context` extension when you're definitely done with them and no longer need the target canvas's rendering results. Note that this is not necessary to do when navigating away from a page - don't add an unload event handler just for this purpose.
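For example:

```js
// Once this canvas's rendering results are definitely no longer needed:
const loseCtx = gl.getExtension("WEBGL_lose_context");
if (loseCtx) {
  loseCtx.loseContext();
}
```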
## Flush when expecting results
Call `flush()` when expecting results such as queries, or at completion of a rendering frame.
Flush tells the implementation to push all pending commands out for execution, flushing them out of the queue, instead of waiting for more commands to enqueue before sending for execution.
For example, it is possible for the following to never complete without context loss:
```js
// Native-GL-style pseudocode: if the fence command is never flushed to the
// GPU, this wait can block forever.
sync = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
glClientWaitSync(sync, 0, GL_TIMEOUT_IGNORED);
```
WebGL doesn't have a SwapBuffers call by default, so a flush can help fill the gap, as well.
### Use `webgl.flush()` when not using requestAnimationFrame
When not using requestAnimationFrame (RAF), use `webgl.flush()` to encourage eager execution of enqueued commands.
Because RAF is directly followed by the frame boundary, an explicit `webgl.flush()` isn't really needed with RAF.
## Avoid blocking API calls in production
Certain WebGL entry points - including `getError` and `getParameter` - cause synchronous stalls on the calling thread. Even basic requests can take as long as 1ms, but they can take even longer if they need to wait for all graphics work to be completed (with an effect similar to `glFinish()` in native OpenGL).
In production code, avoid such entry points, especially on the browser main thread where they can cause the entire page to jank (often including scrolling or even the whole browser).
- `getError()`: causes a flush + round-trip to fetch errors from the GPU process.
For example, within Firefox, the only time glGetError is checked is after allocations (`bufferData`, `*texImage*`, `texStorage*`) to pick up any GL_OUT_OF_MEMORY errors.
- `getShader/ProgramParameter()`, `getShader/ProgramInfoLog()`, other `get`s on shaders/programs: flush + shader compile + round-trip, if not done after shader compilation is complete. (See also [parallel shader compilation](#compile_shaders_and_link_programs_in_parallel) below.)
- `get*Parameter()` in general: possible flush + round-trip. In some cases, these will be cached to avoid the round-trip, but try to avoid relying on this.
- `checkFramebufferStatus()`: possible flush + round-trip.
- `getBufferSubData()`: usually a finish + round-trip. (This is okay for READ buffers in conjunction with fences - see [async data readback](#non-blocking_async_data_downloadreadback) below.)
- `readPixels()` to the CPU (i.e. without an UNPACK buffer bound): finish + round-trip. Instead, use GPU-GPU `readPixels` in conjunction with async data readback.
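As a sketch of the fence-based readback pattern the last two items allude to (WebGL 2; `readbackBuf`, `w`, `h`, and the destination `dstBytes` array are assumed to exist and be appropriately sized):

```js
// GPU-GPU copy into a PIXEL_PACK buffer instead of a blocking CPU readback:
gl.bindBuffer(gl.PIXEL_PACK_BUFFER, readbackBuf);
gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, 0);
const sync = gl.fenceSync(gl.SYNC_GPU_COMMANDS_COMPLETE, 0);
gl.flush(); // make sure the fence is actually submitted

function pollReadback() {
  const status = gl.clientWaitSync(sync, 0, 0); // timeout 0: never blocks
  if (status === gl.ALREADY_SIGNALED || status === gl.CONDITION_SATISFIED) {
    gl.bindBuffer(gl.PIXEL_PACK_BUFFER, readbackBuf);
    gl.getBufferSubData(gl.PIXEL_PACK_BUFFER, 0, dstBytes);
    gl.deleteSync(sync);
  } else {
    requestAnimationFrame(pollReadback); // try again next frame
  }
}
requestAnimationFrame(pollReadback);
```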
## Always enable vertex attrib 0 as an array
If you draw without vertex attrib 0 enabled as an array, you will force the browser to do complicated emulation when running on desktop OpenGL (such as on macOS). This is because in desktop OpenGL, nothing gets drawn if vertex attrib 0 is not array-enabled. You can use `bindAttribLocation` to force a vertex attribute to use location 0, and use `enableVertexAttribArray(0)` to make it array-enabled.
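For example (assuming your vertex shader has an attribute named `aPosition`):

```js
// Before linking, pin a known attribute to location 0…
gl.bindAttribLocation(prog, 0, "aPosition");
gl.linkProgram(prog);
// …and keep location 0 array-enabled when drawing:
gl.enableVertexAttribArray(0);
gl.vertexAttribPointer(0, 3, gl.FLOAT, false, 0, 0);
```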
## Estimate a per-pixel VRAM Budget
WebGL doesn't offer APIs to query the maximum amount of video memory on the system because such queries are not portable. Still, applications must be conscious of VRAM usage and not just allocate as much as possible.
One technique pioneered by the Google Maps team is the notion of a _per-pixel VRAM budget_:
1. For one system (e.g., a particular desktop / laptop), decide the maximum amount of VRAM your application should use.
2. Compute the number of pixels covered by a maximized browser window. For example: `(window.innerWidth * window.devicePixelRatio) * (window.innerHeight * window.devicePixelRatio)`
3. The per-pixel VRAM budget is (1) divided by (2), and is a constant.
This constant should _generally_ be portable among systems. Mobile devices typically have smaller screens than powerful desktop machines with large monitors. Re-compute this constant on a few target systems to get a reliable estimate.
Now adjust all internal caching in the application (WebGLBuffers, WebGLTextures, etc.) to obey a maximum size, computed by this constant multiplied by the number of pixels covered by the _current_ browser window. This requires estimating the number of bytes consumed by each texture, for example. The cap also must typically be updated as the browser window resizes, and older resources above the limit must be purged.
Keeping the application's VRAM usage under this cap will help to avoid out-of-memory errors and associated instability.
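A sketch of the arithmetic (the 500 MiB reference budget and 1920×1080 reference screen are made-up numbers; substitute your own measurements):

```js
// Budget decided on a reference system: 500 MiB over a 1920×1080 window.
const BYTES_PER_PIXEL_BUDGET = (500 * 1024 * 1024) / (1920 * 1080); // ≈253
function currentVramCap() {
  const pixels =
    window.innerWidth * window.devicePixelRatio *
    (window.innerHeight * window.devicePixelRatio);
  return BYTES_PER_PIXEL_BUDGET * pixels;
}
// Recompute on resize, and evict cached textures/buffers until your
// estimated usage fits under the cap.
```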
## Consider rendering to a smaller back buffer
A common (and easy) way to trade off quality for speed is rendering into a smaller back buffer, and upscaling the result. Consider reducing canvas.width and height and keeping canvas.style.width and height at a constant size.
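For example, to render at half resolution while displaying at a fixed CSS size (the 800×600 display size is illustrative):

```js
// Display size stays fixed via CSS…
canvas.style.width = "800px";
canvas.style.height = "600px";
// …while the drawing buffer renders at half resolution and is upscaled.
const scale = 0.5;
canvas.width = Math.round(800 * devicePixelRatio * scale);
canvas.height = Math.round(600 * devicePixelRatio * scale);
```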
## Batch draw calls
"Batching" draw calls into fewer, larger draw calls will generally improve performance. If you have 1000 sprites to paint, try to do it as a single drawArrays() or drawElements() call.
It's common to use "degenerate triangles" if you need to draw discontinuous objects as a single drawArrays(TRIANGLE_STRIP) call. Degenerate triangles are triangles with no area, that is, any triangle where two or more vertices are in the exact same location. These triangles are effectively skipped, which lets you start a new triangle strip unattached to your previous one, without having to split into multiple draw calls.
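A sketch of the idea with two disjoint quads (2D positions; the coordinates are arbitrary):

```js
const a = [0, 0], b = [0, 1], c = [1, 0], d = [1, 1]; // quad 1
const e = [2, 0], f = [2, 1], g = [3, 0], h = [3, 1]; // quad 2
// Repeating d and e produces zero-area (degenerate) triangles that bridge
// the quads invisibly, so one draw call renders both:
const verts = new Float32Array([a, b, c, d, d, e, e, f, g, h].flat());
gl.bufferData(gl.ARRAY_BUFFER, verts, gl.STATIC_DRAW);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 10);
```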
Another important method for batching is texture atlasing, where multiple images are placed into a single texture, often like a checkerboard. Since you need to split draw call batches to change textures, texture atlasing lets you combine more draw calls into fewer, bigger batches. See [this example](https://webglsamples.org/sprites/readme.html) demonstrating how to combine even sprites referencing multiple texture atlases into a single draw call.
## Avoid "#ifdef GL_ES"
You should never use `#ifdef GL_ES` in your WebGL shaders; this condition is always true in WebGL. Although some early examples used this, it's not necessary.
## Prefer doing work in the vertex shader
Do as much work as you can in the vertex shader, rather than in the fragment shader. This is because per draw call, fragment shaders generally run many more times than vertex shaders. Any calculation that can be done on the vertices and then just interpolated among fragments (via `varying`s) is a performance boon. (The interpolation of varyings is very cheap, and is done automatically for you through the fixed functionality rasterization phase of the graphics pipeline.)
For example, a simple animation of a textured surface can be achieved through a time-dependent transformation of texture coordinates. (The simplest case is adding a uniform vector to the texture coordinate attribute vector.) If visually acceptable, one can transform the texture coordinates in the vertex shader rather than in the fragment shader, to get better performance.
One common trade-off is to do some lighting calculations per-vertex instead of per-fragment (pixel). In some cases, especially with simple models or dense vertices, this looks good enough.
The inversion of this is when a model has more vertices than pixels in the rendered output. However, using LOD meshes is usually the answer to this problem; it is rare that moving work from the vertex _to_ the fragment shader is the right fix.
## Compile Shaders and Link Programs in parallel
It's tempting to compile shaders and link programs serially, but many browsers can compile and link in parallel on background threads.
Instead of:
```js
function compileOnce(gl, shader) {
if (shader.compiled) return;
gl.compileShader(shader);
shader.compiled = true;
}
for (const [vs, fs, prog] of programs) {
compileOnce(gl, vs);
compileOnce(gl, fs);
gl.linkProgram(prog);
if (!gl.getProgramParameter(prog, gl.LINK_STATUS)) {
console.error(`Link failed: ${gl.getProgramInfoLog(prog)}`);
console.error(`vs info-log: ${gl.getShaderInfoLog(vs)}`);
console.error(`fs info-log: ${gl.getShaderInfoLog(fs)}`);
}
}
```
Consider:
```js
function compileOnce(gl, shader) {
if (shader.compiled) return;
gl.compileShader(shader);
shader.compiled = true;
}
for (const [vs, fs, prog] of programs) {
compileOnce(gl, vs);
compileOnce(gl, fs);
}
for (const [vs, fs, prog] of programs) {
gl.linkProgram(prog);
}
for (const [vs, fs, prog] of programs) {
if (!gl.getProgramParameter(prog, gl.LINK_STATUS)) {
console.error(`Link failed: ${gl.getProgramInfoLog(prog)}`);
console.error(`vs info-log: ${gl.getShaderInfoLog(vs)}`);
console.error(`fs info-log: ${gl.getShaderInfoLog(fs)}`);
}
}
```
## Prefer KHR_parallel_shader_compile
While we've described a pattern to allow browsers to compile and link in parallel, normally checking `COMPILE_STATUS` or `LINK_STATUS` blocks until the compile or link completes. In browsers where it's available, the [KHR_parallel_shader_compile](https://www.khronos.org/registry/webgl/extensions/KHR_parallel_shader_compile/) extension provides a _non-blocking_ `COMPLETION_STATUS` query. Prefer to enable and use this extension.
Example usage:
```js
ext = gl.getExtension("KHR_parallel_shader_compile");
gl.compileShader(vs);
gl.compileShader(fs);
gl.attachShader(prog, vs);
gl.attachShader(prog, fs);
gl.linkProgram(prog);
// Store program in your data structure.
// Later, for example the next frame:
if (ext) {
if (gl.getProgramParameter(prog, ext.COMPLETION_STATUS_KHR)) {
// Check program link status; if OK, use and draw with it.
}
} else {
// Program linking is synchronous.
// Check program link status; if OK, use and draw with it.
}
```
This technique may not work in all applications, for example those which require programs to be immediately available for rendering. Still, consider how variations may work.
## Don't check shader compile status unless linking fails
There are very few errors that are guaranteed to cause shader compilation failure, but cannot be deferred to link time. The [ESSL3 spec](https://www.khronos.org/registry/OpenGL/specs/es/3.0/GLSL_ES_Specification_3.00.pdf) says this under "Error Handling":
> The implementation should report errors as early as possible but in any case must satisfy the following:
>
> - All lexical, grammatical and semantic errors must have been detected following a call to glLinkProgram
> - Errors due to mismatch between the vertex and fragment shader (link errors) must have been detected following a call to glLinkProgram
> - Errors due to exceeding resource limits must have been detected following any draw call or a call to glValidateProgram
> - A call to glValidateProgram must report all errors associated with a program object given the current GL state.
>
> The allocation of tasks between the compiler and linker is implementation dependent. Consequently there are many errors which may be detected either at compile or link time, depending on the implementation.
Additionally, querying compile status is a synchronous call, which breaks pipelining.
Instead of:
```js
gl.compileShader(vs);
if (!gl.getShaderParameter(vs, gl.COMPILE_STATUS)) {
console.error(`vs compile failed: ${gl.getShaderInfoLog(vs)}`);
}
gl.compileShader(fs);
if (!gl.getShaderParameter(fs, gl.COMPILE_STATUS)) {
console.error(`fs compile failed: ${gl.getShaderInfoLog(fs)}`);
}
gl.linkProgram(prog);
if (!gl.getProgramParameter(prog, gl.LINK_STATUS)) {
console.error(`Link failed: ${gl.getProgramInfoLog(prog)}`);
}
```
Consider:
```js
gl.compileShader(vs);
gl.compileShader(fs);
gl.linkProgram(prog);
if (!gl.getProgramParameter(prog, gl.LINK_STATUS)) {
console.error(`Link failed: ${gl.getProgramInfoLog(prog)}`);
console.error(`vs info-log: ${gl.getShaderInfoLog(vs)}`);
console.error(`fs info-log: ${gl.getShaderInfoLog(fs)}`);
}
```
## Be precise with GLSL precision annotations
If you expect to pass an essl300 `int` between shaders, and you need it to have 32 bits, you _must_ use `highp` or you will have portability problems. (Such code may work on desktop but fail on Android.)
If you have a float texture, iOS requires that you use `highp sampler2D foo;`, or it will very painfully give you `lowp` texture samples! (+/-2.0 max is probably not good enough for you)
### Implicit defaults
The vertex language has the following predeclared globally scoped default precision statements:
```glsl
precision highp float;
precision highp int;
precision lowp sampler2D;
precision lowp samplerCube;
```
The fragment language has the following predeclared globally scoped default precision statements:
```glsl
precision mediump int;
precision lowp sampler2D;
precision lowp samplerCube;
```
### In WebGL 1, "highp float" support is optional in fragment shaders
Using `highp` precision unconditionally in fragment shaders will prevent your content from working on some older mobile hardware.
You can use `mediump float` instead, but be aware that this often results in corrupted rendering due to lack of precision (particularly on mobile systems), though the corruption won't be visible on a typical desktop computer.
If you know your precision requirements, `getShaderPrecisionFormat()` will tell you what the system supports.
If `highp float` is available, `GL_FRAGMENT_PRECISION_HIGH` will be defined as `1`.
A good pattern for "always give me the highest precision":
```glsl
#ifdef GL_FRAGMENT_PRECISION_HIGH
precision highp float;
#else
precision mediump float;
#endif
```
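If you'd rather query support from JavaScript, a sketch (assuming an existing `gl` context):

```js
// Ask what the hardware provides for highp float in fragment shaders.
const fmt = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
// In WebGL 1, precision === 0 means highp float is unsupported in fragment shaders.
const hasHighpFragmentFloat = !!fmt && fmt.precision > 0;
```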
### ESSL100 minimum requirements (WebGL 1)
| `float` | think | range | min above zero | precision |
| --------- | ------------------- | ------------- | -------------- | -------------- |
| `highp` | float24\* | (-2^62, 2^62) | 2^-62 | 2^-16 relative |
| `mediump` | IEEE float16 | (-2^14, 2^14) | 2^-14 | 2^-10 relative |
| `lowp` | 10-bit signed fixed | (-2, 2) | 2^-8 | 2^-8 absolute |
| `int` | think | range |
| --------- | ----- | ------------- |
| `highp` | int17 | (-2^16, 2^16) |
| `mediump` | int11 | (-2^10, 2^10) |
| `lowp` | int9 | (-2^8, 2^8) |
_\*float24: sign bit, 7-bit for exponent, 16-bit for mantissa._
### ESSL300 minimum requirements (WebGL 2)
| `float` | think | range | min above zero | precision |
| --------- | ------------------- | --------------- | -------------- | -------------- |
| `highp` | IEEE float32 | (-2^126, 2^127) | 2^-126 | 2^-24 relative |
| `mediump` | IEEE float16 | (-2^14, 2^14) | 2^-14 | 2^-10 relative |
| `lowp` | 10-bit signed fixed | (-2, 2) | 2^-8 | 2^-8 absolute |
| `(u)int` | think | `int` range | `unsigned int` range |
| --------- | -------- | ------------- | -------------------- |
| `highp` | (u)int32 | [-2^31, 2^31] | [0, 2^32] |
| `mediump` | (u)int16 | [-2^15, 2^15] | [0, 2^16] |
| `lowp` | (u)int9 | [-2^8, 2^8] | [0, 2^9] |
## Prefer builtins instead of building your own
Prefer builtins like `dot`, `mix`, and `normalize`. At best, custom implementations might run as fast as the builtins they replace, but don't expect them to. Hardware often has hyper-optimized or even specialized instructions for builtins, and the compiler can't reliably replace your custom builtin-replacements with the special builtin codepaths.
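For example, a sketch of the difference (function names are illustrative):

```glsl
// A hand-rolled replacement: at best as fast as the builtin, often slower.
vec3 myNormalize(vec3 v) {
  return v / sqrt(dot(v, v));
}

// Prefer the builtin, which may map to a specialized hardware codepath.
vec3 usingBuiltin(vec3 v) {
  return normalize(v);
}
```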
## Use mipmaps for any texture you'll see in 3d
When in doubt, call `generateMipmap()` after texture uploads. Mipmaps are cheap on memory (only 30% overhead) while providing often-large performance advantages when textures are "zoomed out" or generally downscaled in the distance in 3d, or even for cube-maps!
It's quicker to sample from smaller texture images due to better inherent texture fetch cache locality: Zooming out on a non-mipmapped texture ruins texture fetch cache locality, because neighboring pixels no longer sample from neighboring texels!
However, for 2d resources that are never "zoomed out", don't pay the 30% memory surcharge for mipmaps:
```js
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR); // Defaults to NEAREST_MIPMAP_LINEAR, for mipmapping!
```
(In WebGL 2, you should just use `texStorage` with `levels=1`)
One caveat: `generateMipmap` only works if you would be able to render into the texture if you attached it to a framebuffer. (The spec calls this "color-renderable formats".) For example, if a system supports float-textures but not render-to-float, `generateMipmap` will fail for float formats.
## Don't assume you can render into float textures
There are many, many systems that support RGBA32F textures, but if you attach one to a framebuffer you'll get `FRAMEBUFFER_INCOMPLETE_ATTACHMENT` from `checkFramebufferStatus()`. It may work on your system, but _most_ mobile systems will not support it!
On WebGL 1, use the `EXT_color_buffer_half_float` and `WEBGL_color_buffer_float` extensions to check for render-to-float-texture support for float16 and float32 respectively.
On WebGL 2, `EXT_color_buffer_float` checks for render-to-float-texture support for both float32 and float16. `EXT_color_buffer_half_float` is present on systems which only support rendering to float16 textures.
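As a sketch, here's one way to verify render-to-float32 support on WebGL 2 before relying on it (assuming an existing `gl` context):

```js
const extFloat = gl.getExtension("EXT_color_buffer_float");

// Even with the extension present, confirm completeness with a real attachment.
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texStorage2D(gl.TEXTURE_2D, 1, gl.RGBA32F, 1, 1);

const fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);

const canRenderToFloat32 =
  !!extFloat &&
  gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;
```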
### Render-to-float32 doesn't imply float32-blending!
It may work on your system, but on many others it won't. Avoid it if you can. Query the `EXT_float_blend` extension to check for support.
Float16-blending is always supported.
## Some formats (e.g. RGB) may be emulated
A number of formats (particularly three-channel formats) are emulated. For example, RGB32F is often actually RGBA32F, and Luminance8 may actually be RGBA8. RGB8 in particular is often surprisingly slow, as masking out the alpha channel and/or patching blend functions has fairly high overhead. Prefer to use RGBA8 and ignore the alpha yourself for better performance.
## Avoid alpha:false, which can be expensive
Specifying `alpha:false` during context creation causes the browser to composite the WebGL-rendered canvas as though it were opaque, ignoring any alpha values the application writes in its fragment shader. On some platforms, this capability unfortunately comes at a significant performance cost. The RGB back buffer may have to be emulated on top of an RGBA surface, and there are relatively few techniques available in the OpenGL API for making it appear to the application that an RGBA surface has no alpha channel. [It has been found](https://crbug.com/1045643) that all of these techniques have approximately equal performance impact on affected platforms.
Most applications, even those requiring alpha blending, can be structured to produce `1.0` for the alpha channel. The primary exception is any application requiring destination alpha in the blending function. If feasible, producing `1.0` alpha is recommended over using `alpha:false`.
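As a sketch, leave the context with its default alpha channel and write opaque alpha yourself (assuming an existing `canvas`):

```js
// Default context creation; equivalent to passing { alpha: true }.
const gl = canvas.getContext("webgl");
// Then, in the fragment shader, always produce an opaque alpha:
//   gl_FragColor = vec4(color.rgb, 1.0);
```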
## Consider compressed texture formats
While JPG and PNG are generally smaller over-the-wire, GPU compressed texture formats are smaller in GPU memory, and are faster to sample from. (This reduces texture memory bandwidth, which is precious on mobile.) However, compressed texture formats have worse quality than JPG, and are generally only acceptable for colors (not e.g. normals or coordinates).
Unfortunately, there's no single universally supported format. Every system has at least one of the following though:
- WEBGL_compressed_texture_s3tc (desktop)
- WEBGL_compressed_texture_etc1 (Android)
- WEBGL_compressed_texture_pvrtc (iOS)
WebGL 2 has universal support by combining:
- WEBGL_compressed_texture_s3tc (desktop)
- WEBGL_compressed_texture_etc (mobile)
WEBGL_compressed_texture_astc offers higher quality and/or higher compression, but is only supported on newer hardware.
### Basis Universal texture compression format/library
Basis Universal solves several of the issues mentioned above. It offers a way to support all common compressed texture formats with a single compressed texture file, through a JavaScript library that efficiently converts formats at load time. It also adds additional compression that makes Basis Universal compressed texture files much smaller than regular compressed textures over-the-wire, more comparable to JPEG.
<https://github.com/BinomialLLC/basis_universal/blob/master/webgl/README.md>
## Memory usage of depth and stencil formats
Depth and stencil attachments and formats are actually inseparable on many devices. You may ask for DEPTH_COMPONENT24 or STENCIL_INDEX8, but you're often getting D24X8 and X24S8 32bpp formats behind the scenes. Assume that the memory usage of depth and stencil formats is rounded up to the nearest four bytes.
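For example, a back-of-the-envelope estimate for a 1920×1080 `DEPTH_COMPONENT24` attachment:

```js
// D24 is typically stored as D24X8, i.e. 4 bytes per pixel, not 3.
const naiveBytes = 1920 * 1080 * 3; // ~6.2 MB: what you might expect
const actualBytes = 1920 * 1080 * 4; // ~8.3 MB: what you likely get
```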
## texImage/texSubImage uploads (esp. videos) can cause pipeline flushes
Most texture uploads from DOM elements will incur a processing pass that will temporarily switch GL Programs internally, causing a pipeline flush. (Pipelines are formalized explicitly in Vulkan\[[1](https://www.khronos.org/registry/vulkan/specs/1.2/html/chap9.html#VkGraphicsPipelineCreateInfo)] et al, but are implicit behind-the-scenes in OpenGL and WebGL. Pipelines are more or less the tuple of shader program, depth/stencil/multisample/blend/rasterization state)
In WebGL:
```glsl
…
useProgram(prog1)
<pipeline flush>
bindFramebuffer(target)
drawArrays()
bindTexture(webgl_texture)
texImage2D(HTMLVideoElement)
drawArrays()
…
```
Behind the scenes in the browser:
```glsl
…
useProgram(prog1)
<pipeline flush>
bindFramebuffer(target)
drawArrays()
bindTexture(webgl_texture)
-texImage2D(HTMLVideoElement):
+useProgram(_internal_tex_transform_prog)
<pipeline flush>
+bindFramebuffer(webgl_texture._internal_framebuffer)
+bindTexture(HTMLVideoElement._internal_video_tex)
+drawArrays() // y-flip/colorspace-transform/alpha-(un)premultiply
+bindTexture(webgl_texture)
+bindFramebuffer(target)
+useProgram(prog1)
<pipeline flush>
drawArrays()
…
```
Prefer doing uploads before starting drawing, or at least between pipelines:
In WebGL:
```glsl
…
bindTexture(webgl_texture)
texImage2D(HTMLVideoElement)
useProgram(prog1)
<pipeline flush>
bindFramebuffer(target)
drawArrays()
bindTexture(webgl_texture)
drawArrays()
…
```
Behind the scenes in the browser:
```glsl
…
bindTexture(webgl_texture)
-texImage2D(HTMLVideoElement):
+useProgram(_internal_tex_transform_prog)
<pipeline flush>
+bindFramebuffer(webgl_texture._internal_framebuffer)
+bindTexture(HTMLVideoElement._internal_video_tex)
+drawArrays() // y-flip/colorspace-transform/alpha-(un)premultiply
+bindTexture(webgl_texture)
+bindFramebuffer(target)
useProgram(prog1)
<pipeline flush>
bindFramebuffer(target)
drawArrays()
bindTexture(webgl_texture)
drawArrays()
…
```
## Use texStorage to create textures
The WebGL 2.0 `texImage*` API lets you define each mip level independently and at any size; even mismatched mip sizes are not an error until draw time. This means there is no way for the driver to actually prepare the texture in GPU memory until the first time the texture is drawn.
Further, some drivers might unconditionally allocate the whole mip-chain (+30% memory!) even if you only want a single level.
So, prefer `texStorage`+`texSubImage` for textures in WebGL 2.
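A minimal sketch of the preferred path (the `width`, `height`, and `pixels` names are illustrative):

```js
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
// Allocate immutable storage up front; levels: 1 means no mip chain.
gl.texStorage2D(gl.TEXTURE_2D, 1, gl.RGBA8, width, height);
// Then upload the texel data into the preallocated storage.
gl.texSubImage2D(
  gl.TEXTURE_2D,
  0,
  0,
  0,
  width,
  height,
  gl.RGBA,
  gl.UNSIGNED_BYTE,
  pixels,
);
```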
## Use invalidateFramebuffer
Storing data that you won't use again can have high cost, particularly on tiled-rendering GPUs common on mobile. When you're done with the contents of a framebuffer attachment, use WebGL 2.0's `invalidateFramebuffer` to discard the data, instead of leaving the driver to waste time storing the data for later use. DEPTH/STENCIL and/or multisampled attachments in particular are great candidates for `invalidateFramebuffer`.
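For example, a sketch (assuming a framebuffer `fb` whose depth/stencil contents are not needed after drawing):

```js
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
// …draw…
// Tell the driver it may discard the depth/stencil data instead of storing it.
gl.invalidateFramebuffer(gl.FRAMEBUFFER, [gl.DEPTH_STENCIL_ATTACHMENT]);
```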
## Use non-blocking async data readback
Operations like `readPixels` and `getBufferSubData` are typically synchronous, but using the same APIs, non-blocking, asynchronous data readback can be achieved. The approach in WebGL 2 is analogous to the approach in OpenGL: [Async downloads in blocking APIs](https://kdashg.github.io/misc/async-gpu-downloads.html)
```js
function clientWaitAsync(gl, sync, flags, interval_ms) {
return new Promise((resolve, reject) => {
function test() {
const res = gl.clientWaitSync(sync, flags, 0);
if (res === gl.WAIT_FAILED) {
reject();
return;
}
if (res === gl.TIMEOUT_EXPIRED) {
setTimeout(test, interval_ms);
return;
}
resolve();
}
test();
});
}
async function getBufferSubDataAsync(
gl,
target,
buffer,
srcByteOffset,
dstBuffer,
/* optional */ dstOffset,
/* optional */ length,
) {
const sync = gl.fenceSync(gl.SYNC_GPU_COMMANDS_COMPLETE, 0);
gl.flush();
await clientWaitAsync(gl, sync, 0, 10);
gl.deleteSync(sync);
gl.bindBuffer(target, buffer);
gl.getBufferSubData(target, srcByteOffset, dstBuffer, dstOffset, length);
gl.bindBuffer(target, null);
return dstBuffer;
}
async function readPixelsAsync(gl, x, y, w, h, format, type, dest) {
const buf = gl.createBuffer();
gl.bindBuffer(gl.PIXEL_PACK_BUFFER, buf);
gl.bufferData(gl.PIXEL_PACK_BUFFER, dest.byteLength, gl.STREAM_READ);
gl.readPixels(x, y, w, h, format, type, 0);
gl.bindBuffer(gl.PIXEL_PACK_BUFFER, null);
await getBufferSubDataAsync(gl, gl.PIXEL_PACK_BUFFER, buf, 0, dest);
gl.deleteBuffer(buf);
return dest;
}
```
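Hypothetical usage of the helper above, from inside an async function:

```js
const w = 256;
const h = 256;
const pixels = new Uint8Array(w * h * 4);
// Resolves once the GPU has finished, without ever blocking the CPU.
await readPixelsAsync(gl, 0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
```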
### `devicePixelRatio` and high-dpi rendering
Handling `devicePixelRatio !== 1.0` is tricky. While the common approach is to set `canvas.width = width * devicePixelRatio`, this will cause moiré artifacts with non-integer values of `devicePixelRatio`, as is common with UI scaling on Windows, as well as zooming on all platforms.
Instead, we can use non-integer values for CSS's `top`/`bottom`/`left`/`right` to fairly reliably 'pre-snap' our canvas to whole integer device coordinates.
Demo: [Device pixel presnap](https://kdashg.github.io/misc/webgl/device-pixel-presnap.html)
### ResizeObserver and 'device-pixel-content-box'
On supporting browsers (Chromium?), `ResizeObserver` can be used with `'device-pixel-content-box'` to request a callback that includes the true device pixel size of an element. This can be used to build an async-but-accurate function:
```js
window.getDevicePixelSize =
window.getDevicePixelSize ||
(async (elem) => {
await new Promise((fn_resolve) => {
const observer = new ResizeObserver((entries) => {
for (const cur of entries) {
const dev_size = cur.devicePixelContentBoxSize;
const ret = {
width: dev_size[0].inlineSize,
height: dev_size[0].blockSize,
};
fn_resolve(ret);
observer.disconnect();
return;
}
throw `device-pixel-content-box not observed for elem ${elem}`;
});
observer.observe(elem, { box: "device-pixel-content-box" });
});
});
```
Please refer to [the specification](https://www.w3.org/TR/resize-observer/#resize-observer-interface) for more details.
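Hypothetical usage, sizing a canvas's drawing buffer to true device pixels from inside an async function:

```js
const size = await getDevicePixelSize(canvas);
canvas.width = size.width;
canvas.height = size.height;
```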
## ImageBitmap creation
Using the [ImageBitmapOptions dictionary](https://html.spec.whatwg.org/multipage/imagebitmap-and-animations.html#imagebitmapoptions) is essential for properly preparing textures for upload to WebGL, but unfortunately there's no obvious way to query exactly which dictionary members are supported by a given browser.
[This JSFiddle](https://jsfiddle.net/ptkyewhx/) illustrates how to determine which dictionary members a given browser supports.
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/webgl_model_view_projection/fullcamerafov.svg | <svg xmlns="http://www.w3.org/2000/svg" width="531" height="432.367"><defs><linearGradient id="a" gradientUnits="userSpaceOnUse" x1="1017.875" y1="1034.054" x2="1079.235" y2="637.946"><stop offset="0" stop-color="red" stop-opacity=".56"/><stop offset=".838" stop-color="red" stop-opacity=".56"/><stop offset="1" stop-color="#FFF" stop-opacity=".56"/></linearGradient><linearGradient id="b" gradientUnits="userSpaceOnUse" x1="837.656" y1="902.355" x2="1311.453" y2="800.645"><stop offset="0" stop-color="#0F0" stop-opacity=".56"/><stop offset=".751" stop-color="#0F0" stop-opacity=".56"/><stop offset="1" stop-color="#FFF" stop-opacity=".56"/></linearGradient></defs><g stroke="#000" stroke-dasharray="3,2" stroke-linecap="round" stroke-linejoin="round" fill="none"><path d="M36.609 402.5l180-400"/><path d="M36.609 402.5l248.891-167" stroke-opacity=".35"/><path d="M36.609 402.5L525.21 284.4M36.609 402.5L477.5 33.5"/></g><path d="M0 416.562l19.756-15.32 12.256 15.806-19.756 15.319z"/><path d="M23.513 410.983l6.547-16.035 10.614 13.687z"/><path d="M828.109 1036l180-400L1269 667z" fill="url(#a)" transform="translate(-792 -634)"/><path d="M35.609 402.5l180-400m260.891 31l-440.891 369" stroke="#000" stroke-width="3" stroke-linecap="round" stroke-linejoin="round" fill="none"/><path d="M1321 918l-52-251-440.891 369z" fill="url(#b)" transform="translate(-792 -634)"/><path d="M477.5 32.5l-440.891 369m.001 0l492.89-118" stroke="#000" stroke-width="3" stroke-linecap="round" stroke-linejoin="round" fill="none"/><path d="M216.609 2.5l68.891 233 244 49-52-251z" stroke="#000" stroke-opacity=".354" stroke-dasharray="3,2" stroke-linecap="round" stroke-linejoin="round" fill="none"/></svg> | 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/webgl_model_view_projection/index.md | ---
title: WebGL model view projection
slug: Web/API/WebGL_API/WebGL_model_view_projection
page-type: guide
---
{{DefaultAPISidebar("WebGL")}}
This article explores how to take data within a [WebGL](/en-US/docs/Web/API/WebGL_API) project, and project it into the proper spaces to display it on the screen. It assumes a knowledge of basic matrix math using translation, scale, and rotation matrices. It explains the three core matrices that are typically used when composing a 3D scene: the model, view and projection matrices.
> **Note:** This article is also available as an [MDN content kit](https://github.com/gregtatum/mdn-model-view-projection). It also uses a collection of [utility functions](https://github.com/gregtatum/mdn-webgl) available under the `MDN` global object.
## The model, view, and projection matrices
Individual transformations of points and polygons in space in WebGL are handled by the basic transformation matrices like translation, scale, and rotation. These matrices can be composed together and grouped in special ways to make them useful for rendering complicated 3D scenes. These composed matrices ultimately move the original model data around into a special coordinate space called **clip space**. This is a 2 unit wide cube, centered at (0,0,0), and with corners that range from (-1,-1,-1) to (1,1,1). This clip space is compressed down into a 2D space and rasterized into an image.
The first matrix discussed below is the **model matrix**, which defines how you take your original model data and move it around in 3D world space. The **projection matrix** is used to convert world space coordinates into clip space coordinates. A commonly used projection matrix, the **perspective projection matrix**, is used to mimic the _effects_ of a typical camera serving as the stand-in for the viewer in the 3D virtual world. The **view matrix** is responsible for moving the objects in the scene to simulate the position of the camera being changed, altering what the viewer is currently able to see.
The sections below offer an in-depth look into the ideas behind and implementation of the model, view, and projection matrices. These matrices are core to moving data around on the screen, and are concepts that transcend individual frameworks and engines.
## Clip space
In a WebGL program, data is typically uploaded to the GPU with its own coordinate system and then the vertex shader transforms those points into a special coordinate system known as **clip space**. Any data which extends outside of the clip space is clipped off and not rendered. However, if a triangle straddles the border of this space then it is chopped up into new triangles, and only the parts of the new triangles that are in clip space are kept.

The above graphic is a visualization of the clip space that all of the points must fit into. It is a cube two units on each side, with one corner at (-1,-1,-1) and the opposite corner at (1,1,1). The center of the cube is the point (0,0,0). This 8-cubic-unit coordinate system used by clip space is known as normalized device coordinates (NDC). You may encounter that term from time to time while researching and working with WebGL code.
For this section we will put our data into the clip space coordinate system directly. Normally model data is used that is in some arbitrary coordinate system, and is then transformed using a matrix, converting the model coordinates into the clip space coordinate system. For this example, it's easiest to illustrate how clip space works by using model coordinate values ranging from (-1,-1,-1) to (1,1,1). The code below will create 2 triangles that will draw a square on the screen. The Z depth in the squares determines what gets drawn on top when the squares share the same space. The smaller Z values are rendered on top of the larger Z values.
### WebGLBox example
This example will create a custom `WebGLBox` object that will draw a 2D box on the screen.
> **Note:** The code for each WebGLBox example is available in this [GitHub repo](https://github.com/gregtatum/mdn-model-view-projection/tree/master/lessons) and is organized by section. In addition there is a JSFiddle link at the bottom of each section.
#### WebGLBox constructor
The constructor looks like this:
```js
function WebGLBox() {
// Setup the canvas and WebGL context
this.canvas = document.getElementById("canvas");
this.canvas.width = window.innerWidth;
this.canvas.height = window.innerHeight;
this.gl = MDN.createContext(this.canvas);
const gl = this.gl;
// Setup a WebGL program, anything part of the MDN object is defined outside of this article
this.webglProgram = MDN.createWebGLProgramFromIds(
gl,
"vertex-shader",
"fragment-shader",
);
gl.useProgram(this.webglProgram);
// Save the attribute and uniform locations
this.positionLocation = gl.getAttribLocation(this.webglProgram, "position");
this.colorLocation = gl.getUniformLocation(this.webglProgram, "color");
// Tell WebGL to test the depth when drawing, so if a square is behind
// another square it won't be drawn
gl.enable(gl.DEPTH_TEST);
}
```
#### WebGLBox draw
Now we'll create a method to draw a box on the screen.
```js
WebGLBox.prototype.draw = function (settings) {
// Create some attribute data; these are the triangles that will end up
// being drawn to the screen. There are two that form a square.
const data = new Float32Array([
//Triangle 1
settings.left,
settings.bottom,
settings.depth,
settings.right,
settings.bottom,
settings.depth,
settings.left,
settings.top,
settings.depth,
//Triangle 2
settings.left,
settings.top,
settings.depth,
settings.right,
settings.bottom,
settings.depth,
settings.right,
settings.top,
settings.depth,
]);
// Use WebGL to draw this onto the screen.
// Performance Note: Creating a new array buffer for every draw call is slow.
// This function is for illustration purposes only.
const gl = this.gl;
// Create a buffer and bind the data
const buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, data, gl.STATIC_DRAW);
// Setup the pointer to our attribute data (the triangles)
gl.enableVertexAttribArray(this.positionLocation);
gl.vertexAttribPointer(this.positionLocation, 3, gl.FLOAT, false, 0, 0);
// Setup the color uniform that will be shared across all triangles
gl.uniform4fv(this.colorLocation, settings.color);
// Draw the triangles to the screen
gl.drawArrays(gl.TRIANGLES, 0, 6);
};
```
The shaders are the bits of code written in GLSL that take our data points and ultimately render them to the screen. For convenience, these shaders are stored in a {{htmlelement("script")}} element that is brought into the program through the custom function `MDN.createWebGLProgramFromIds()`. This function is part of a collection of [utility functions](https://github.com/gregtatum/mdn-webgl) written for these tutorials and is not explained in depth here. This function handles the basics of taking some GLSL source code and compiling it into a WebGL program. The function takes three parameters — the context to render the program in, the ID of the {{htmlelement("script")}} element containing the vertex shader, and the ID of the {{htmlelement("script")}} element containing the fragment shader. The vertex shader positions the vertices, and the fragment shader colors each pixel.
First take a look at the vertex shader that will move the vertices on the screen:
```glsl
// The individual position vertex
attribute vec3 position;
void main() {
// the gl_Position is the final position in clip space after the vertex shader modifies it
gl_Position = vec4(position, 1.0);
}
```
Next, to actually rasterize the data into pixels, the fragment shader evaluates everything on a per pixel basis, setting a single color. The GPU calls the shader function for each pixel it needs to render; the shader's job is to return the color to use for that pixel.
```glsl
precision mediump float;
uniform vec4 color;
void main() {
gl_FragColor = color;
}
```
With those settings included, it's time to directly draw to the screen using clip space coordinates.
```js
const box = new WebGLBox();
```
First draw a red box in the middle.
```js
box.draw({
top: 0.5, // y
bottom: -0.5, // y
left: -0.5, // x
right: 0.5, // x
depth: 0, // z
color: [1, 0.4, 0.4, 1], // red
});
```
Next, draw a green box up top and behind the red box.
```js
box.draw({
top: 0.9, // y
bottom: 0, // y
left: -0.9, // x
right: 0.9, // x
depth: 0.5, // z
color: [0.4, 1, 0.4, 1], // green
});
```
Finally, to demonstrate that clipping is actually going on, this box doesn't get drawn because it's entirely outside of clip space. The depth is outside of the -1.0 to 1.0 range.
```js
box.draw({
top: 1, // y
bottom: -1, // y
left: -1, // x
right: 1, // x
depth: -1.5, // z
color: [0.4, 0.4, 1, 1], // blue
});
```
#### The results
[View on JSFiddle](https://jsfiddle.net/tatumcreative/mff99yu5/)

#### Exercise
A helpful exercise at this point is to move the boxes around the clip space by varying the code to get a feel for how points get clipped and moved around in clip space. Try drawing a picture like a boxy smiley face with a background.
## Homogeneous coordinates
The main line of the previous clip space vertex shader contained this code:
```glsl
gl_Position = vec4(position, 1.0);
```
The `position` variable was defined in the `draw()` method and passed in as an attribute to the shader. This is a three dimensional point, but the `gl_Position` variable that ends up getting passed down through the pipeline is actually 4 dimensional — instead of `(x, y, z)` it is `(x, y, z, w)`. There is no letter after `z`, so by convention this fourth dimension is labeled `w`. In the above example the `w` coordinate is set to 1.0.
The obvious question is "why the extra dimension?" It turns out that this addition allows for lots of nice techniques for manipulating 3D data. This added dimension introduces the notion of perspective into the coordinate system; with it in place, we can map 3D coordinates into 2D space—thereby allowing two parallel lines to intersect as they recede into the distance. The value of `w` is used as a divisor for the other components of the coordinate, so that the true values of `x`, `y`, and `z` are computed as `x/w`, `y/w`, and `z/w` (and `w` is then also `w/w`, becoming 1).
A three dimensional point is defined in a typical Cartesian coordinate system. The added fourth dimension changes this point into a [homogeneous coordinate](https://en.wikipedia.org/wiki/Homogeneous_coordinates). It still represents a point in 3D space and it can easily be demonstrated how to construct this type of coordinate through a pair of simple functions.
```js
function cartesianToHomogeneous(point) {
let x = point[0];
let y = point[1];
let z = point[2];
return [x, y, z, 1];
}
function homogeneousToCartesian(point) {
let x = point[0];
let y = point[1];
let z = point[2];
let w = point[3];
return [x / w, y / w, z / w];
}
```
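For example:

```js
cartesianToHomogeneous([1, 2, 3]);
//> [1, 2, 3, 1]

homogeneousToCartesian([10, 4, 5, 2]);
//> [5, 2, 2.5]
```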
As previously mentioned and shown in the functions above, the w component divides the x, y, and z components. When the w component is a non-zero real number then the homogeneous coordinate easily translates back into a normal point in Cartesian space. Now what happens if the w component is zero? In JavaScript the value returned would be as follows.
```js
homogeneousToCartesian([10, 4, 5, 0]);
```
This evaluates to: `[Infinity, Infinity, Infinity]`.
This homogeneous coordinate represents some point at infinity. This is a handy way to represent a ray shooting off from the origin in a specific direction. In addition to a ray, it could also be thought of as a representation of a directional vector. If this homogeneous coordinate is multiplied against a matrix with a translation then the translation is effectively stripped out.
When numbers are extremely large (or extremely small) on computers they begin to become less and less precise because there are only so many ones and zeros that are used to represent them. The more operations that are done on larger numbers, the more and more errors accumulate into the result. When dividing by w, this can effectively increase the precision of very large numbers by operating on two potentially smaller, less error-prone numbers.
The final benefit of using homogeneous coordinates is that they fit very nicely for multiplying against 4x4 matrices. A vertex must match at least one of the dimensions of a matrix in order to be multiplied against it. The 4x4 matrix can be used to encode a variety of useful transformations. In fact, the typical perspective projection matrix uses the division by the w component to achieve its transformation.
The clipping of points and polygons from clip space happens before the homogeneous coordinates have been transformed back into Cartesian coordinates (by dividing by w). This final space is known as **normalized device coordinates** or NDC.
To start playing with this idea the previous example can be modified to allow for the use of the `w` component.
```js
//Redefine the triangles to use the W component
const data = new Float32Array([
//Triangle 1
settings.left,
settings.bottom,
settings.depth,
settings.w,
settings.right,
settings.bottom,
settings.depth,
settings.w,
settings.left,
settings.top,
settings.depth,
settings.w,
//Triangle 2
settings.left,
settings.top,
settings.depth,
settings.w,
settings.right,
settings.bottom,
settings.depth,
settings.w,
settings.right,
settings.top,
settings.depth,
settings.w,
]);
```
Then the vertex shader uses the 4 dimensional point passed in.
```glsl
attribute vec4 position;
void main() {
gl_Position = position;
}
```
First, we draw a red box in the middle, but set W to 0.7. As the coordinates get divided by 0.7 they will all be enlarged.
```js
box.draw({
top: 0.5, // y
bottom: -0.5, // y
left: -0.5, // x
right: 0.5, // x
w: 0.7, // w - enlarge this box
depth: 0, // z
color: [1, 0.4, 0.4, 1], // red
});
```
Now, we draw a green box up top, but shrink it by setting the w component to 1.1.
```js
box.draw({
top: 0.9, // y
bottom: 0, // y
left: -0.9, // x
right: 0.9, // x
w: 1.1, // w - shrink this box
depth: 0.5, // z
color: [0.4, 1, 0.4, 1], // green
});
```
In the first example, this last box was clipped because its depth of -1.5 falls outside of the -1.0 to 1.0 range. This time, dividing by the w component of 1.5 brings that depth back to -1.0, moving the box into clip space so it gets drawn.
```js
box.draw({
top: 1, // y
bottom: -1, // y
left: -1, // x
right: 1, // x
w: 1.5, // w - Bring this box into range
depth: -1.5, // z
color: [0.4, 0.4, 1, 1], // blue
});
```
### The results

### Exercises
- Play around with these values to see how it affects what is rendered on the screen. Note how the previously clipped blue box is brought back into range by setting its w component.
- Try creating a new box that is outside of clip space and bring it back in by dividing by w.
## Model transform
Placing points directly into clip space is of limited use. In real-world applications, you don't have all your source coordinates already in clip space coordinates. So most of the time, you need to transform the model data and other coordinates into clip space. The humble cube is an easy example of how to do this. Cube data consists of vertex positions, the colors of the faces of the cube, and the order of the vertex positions that make up the individual polygons (in groups of 3 vertices to construct the triangles composing the cube's faces). The positions and colors are stored in GL buffers, sent to the shader as attributes, and then operated upon individually.
Finally a single model matrix is computed and set. This matrix represents the transformations to be performed on every point making up the model in order to move it into the correct space, and to perform any other needed transforms on each point in the model. This applies not just to each vertex, but to every single point on every surface of the model as well.
In this case, for every frame of the animation a series of scale, rotation, and translation matrices move the data into the desired spot in clip space. The cube is the size of clip space (-1,-1,-1) to (1,1,1) so it will need to be shrunk down in order to not fill the entirety of clip space. This matrix is sent directly to the shader, having been multiplied in JavaScript beforehand.
The following code sample defines a method on the `CubeDemo` object that will create the model matrix. It uses custom functions to create and multiply matrices as defined in the [MDN WebGL](https://github.com/gregtatum/mdn-webgl) shared code. The new function looks like this:
```js
CubeDemo.prototype.computeModelMatrix = function (now) {
//Scale down by 50%
const scale = MDN.scaleMatrix(0.5, 0.5, 0.5);
// Rotate a slight tilt
const rotateX = MDN.rotateXMatrix(now * 0.0003);
// Rotate according to time
const rotateY = MDN.rotateYMatrix(now * 0.0005);
// Move slightly down
const position = MDN.translateMatrix(0, -0.1, 0);
// Multiply together, making sure to read them in the opposite order
this.transforms.model = MDN.multiplyArrayOfMatrices([
position, // step 4
rotateY, // step 3
rotateX, // step 2
scale, // step 1
]);
};
```
In order to use this in the shader it must be set to a uniform location. The locations for the uniforms are saved in the `locations` object shown below:
```js
this.locations.model = gl.getUniformLocation(webglProgram, "model");
```
And finally the uniform is set to that location. This hands off the matrix to the GPU.
```js
gl.uniformMatrix4fv(
this.locations.model,
false,
new Float32Array(this.transforms.model),
);
```
In the shader, each position vertex is first transformed into a homogeneous coordinate (a `vec4` object), and then multiplied against the model matrix.
```glsl
gl_Position = model * vec4(position, 1.0);
```
> **Note:** In JavaScript, matrix multiplication requires a custom function, while in the shader it is built into the language with the simple \* operator.
### The results
[View on JSFiddle](https://jsfiddle.net/tatumcreative/5jofzgsh/)

At this point the w value of the transformed point is still 1.0. The cube still doesn't have any perspective. The next section will take this setup and modify the w values to provide some perspective.
### Exercises
- Shrink down the box using the scale matrix and position it in different places within clip space.
- Try moving it outside of clip space.
- Resize the window and watch as the box skews out of shape.
- Add a `rotateZ` matrix.
## Divide by W
An easy way to start getting some perspective on our model of the cube is to take the Z coordinate and copy it over to the w coordinate. Normally when converting a Cartesian point to a homogeneous coordinate it becomes `(x,y,z,1)`, but we're going to set it to something like `(x,y,z,z)`. In reality we want to make sure that z is greater than 0 for points in view, so we'll modify it slightly by changing the value to `((1.0 + z) * scaleFactor)`. This will take a point that is normally in clip space (-1 to 1) and move it into a space more like (0 to 1) depending on what the scale factor is set to. The scale factor changes the final w value to be either higher or lower overall.
The shader code looks like this.
```glsl
// First transform the point
vec4 transformedPosition = model * vec4(position, 1.0);
// How much effect does the perspective have?
float scaleFactor = 0.5;
// Set w by taking the z value which is typically ranged -1 to 1, then scale
// it to be from 0 to some number, in this case 0-1.
float w = (1.0 + transformedPosition.z) * scaleFactor;
// Save the new gl_Position with the custom w component
gl_Position = vec4(transformedPosition.xyz, w);
```
### The results
[View on JSFiddle](https://jsfiddle.net/tatumcreative/vk9r8h2c/)

See that small dark blue triangle? That's an additional face added to our object because the rotation of our shape has caused that corner to extend outside clip space, thus causing the corner to be clipped away. See [Perspective projection matrix](#perspective_projection_matrix) below for an introduction to how to use more complex matrices to help control and prevent clipping.
### Exercise
If that sounds a little abstract, open up the vertex shader and play around with the scale factor and watch how it shrinks vertices more towards the surface. Completely change the w component values for really trippy representations of space.
In the next section we'll take this step of copying Z into the w slot and turn it into a matrix.
## Simple projection
The last step of filling in the w component can actually be accomplished with a simple matrix. Start with the identity matrix:
```js
const identity = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1];
MDN.multiplyPoint(identity, [2, 3, 4, 1]);
//> [2, 3, 4, 1]
```
Then move the last column's 1 up one space.
```js
const copyZ = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0];
MDN.multiplyPoint(copyZ, [2, 3, 4, 1]);
//> [2, 3, 4, 4]
```
However in the last example we performed `(z + 1) * scaleFactor`:
```js
const scaleFactor = 0.5;
const simpleProjection = [
1,
0,
0,
0,
0,
1,
0,
0,
0,
0,
1,
scaleFactor,
0,
0,
0,
scaleFactor,
];
MDN.multiplyPoint(simpleProjection, [2, 3, 4, 1]);
//> [2, 3, 4, 2.5]
```
Breaking it out a little further we can see how this works:
```js
let x = 2 * 1 + 3 * 0 + 4 * 0 + 1 * 0;
let y = 2 * 0 + 3 * 1 + 4 * 0 + 1 * 0;
let z = 2 * 0 + 3 * 0 + 4 * 1 + 1 * 0;
let w = 2 * 0 + 3 * 0 + 4 * scaleFactor + 1 * scaleFactor;
```
The last line could be simplified to:
```js
w = 4 * scaleFactor + 1 * scaleFactor;
```
Then factoring out the scaleFactor, we get this:
```js
w = (4 + 1) * scaleFactor;
```
Which is exactly the same as the `(z + 1) * scaleFactor` that we used in the previous example.
In the box demo, an additional `computeSimpleProjectionMatrix()` method is added. This is called in the `draw()` method and has the scale factor passed to it. The result should be identical to the last example:
```js
CubeDemo.prototype.computeSimpleProjectionMatrix = function (scaleFactor) {
this.transforms.projection = [
1,
0,
0,
0,
0,
1,
0,
0,
0,
0,
1,
scaleFactor,
0,
0,
0,
scaleFactor,
];
};
```
Although the result is identical, the important step here is in the vertex shader. Rather than modifying the vertex directly, it gets multiplied by an additional **[projection matrix](#projection_matrix)**, which (as the name suggests) projects 3D points onto a 2D drawing surface:
```glsl
// Make sure to read the transformations in reverse order
gl_Position = projection * model * vec4(position, 1.0);
```
### The results
[View on JSFiddle](https://jsfiddle.net/tatumcreative/zwyLLcbw/)

## The viewing frustum
Before we move on to covering how to compute a perspective projection matrix, we need to introduce the concept of the **[viewing frustum](https://en.wikipedia.org/wiki/Viewing_frustum)** (also known as the **view frustum**). This is the region of space whose contents are visible to the user at the current time. It's the 3D region of space defined by the field of view and the distances specified as the nearest and farthest content that should be rendered.
While rendering, we need to determine which polygons need to be rendered in order to represent the scene. This is what the viewing frustum defines. But what's a frustum in the first place?
A [frustum](https://en.wikipedia.org/wiki/Frustum) is the 3D solid that results from taking any solid and slicing off two sections of it using two parallel planes. Consider our camera, which is viewing an area that starts immediately in front of its lens and extends off into the distance. The viewable area is a four-sided pyramid with its peak at the lens, its four sides corresponding to the extents of its peripheral vision range, and its base at the farthest distance it can see, like this:

If we used this to determine the polygons to be rendered each frame, our renderer would need to render every polygon within this pyramid, all the way off into infinity, including polygons that are very close to the lens—likely too close to be useful (and certainly including things that are so close that a real human wouldn't be able to focus on them in the same setting).
So as a first step toward reducing the number of polygons we need to compute and render, we turn this pyramid into the viewing frustum. The two planes we'll use to chop away vertices in order to reduce the polygon count are the **near clipping plane** and the **far clipping plane**.
In WebGL, the near and far clipping planes are defined by specifying the distance from the lens to the closest point on a plane which is perpendicular to the viewing direction. Anything closer to the lens than the near clipping plane or farther from it than the far clipping plane is removed. This results in the viewing frustum, which looks like this:

The set of objects to be rendered for each frame is essentially created by starting with the set of all objects in the scene. Then any objects which are _entirely_ outside the viewing frustum are removed from the set. Next, objects which partially extrude outside the viewing frustum are clipped by dropping any polygons which are entirely outside the frustum, and by clipping the polygons which cross outside the frustum so that they no longer exit it.
Once that's been done, we have the largest set of polygons which are entirely within the viewing frustum. This list is usually further reduced using processes like [back-face culling](https://en.wikipedia.org/wiki/Back-face_culling) (removing polygons whose back side is facing the camera) and occlusion culling using [hidden-surface determination](https://en.wikipedia.org/wiki/Hidden-surface_determination) (removing polygons which can't be seen because they're entirely blocked by polygons that are closer to the lens).
## Perspective projection matrix
Up to this point, we've built up our own 3D rendering setup, step by step. However the current code as we've built it has some issues. For one, it gets skewed whenever we resize our window. Another is that our simple projection doesn't handle a wide range of values for the scene data. Most scenes don't work in clip space. It would be helpful to define what distance is relevant to the scene so that precision isn't lost in converting the numbers. Finally it's very helpful to have a fine-tuned control over what points get placed inside and outside of clip space. In the previous examples the corners of the cube occasionally get clipped.
The **perspective projection matrix** is a type of projection matrix that accomplishes all of these requirements. The math also starts to get a bit more involved and won't be fully explained in these examples. In short, it combines dividing by w (as done with the previous examples) with some ingenious manipulations based on [similar triangles](https://en.wikipedia.org/wiki/Similarity_%28geometry%29). If you want to read a full explanation of the math behind it check out some of the following links:
- [OpenGL Projection Matrix](https://www.songho.ca/opengl/gl_projectionmatrix.html)
- [Perspective Projection](https://ogldev.org/)
- [Trying to understand the math behind the perspective projection matrix in WebGL](https://stackoverflow.com/questions/28286057/trying-to-understand-the-math-behind-the-perspective-matrix-in-webgl/28301213#28301213)
One important thing to note about the perspective projection matrix used below is that it flips the z axis. In clip space the z+ goes away from the viewer, while with this matrix it comes towards the viewer.
The reason to flip the z axis is that the clip space coordinate system is a left-handed coordinate system (wherein the z-axis points away from the viewer and into the screen), while the convention in mathematics, physics and 3D modeling, as well as for the view/eye coordinate system in OpenGL, is to use a right-handed coordinate system (z-axis points out of the screen towards the viewer). More on that in the relevant Wikipedia articles: [Cartesian coordinate system](https://en.wikipedia.org/wiki/Cartesian_coordinate_system#Orientation_and_handedness), [Right-hand rule](https://en.wikipedia.org/wiki/Right-hand_rule).
Let's take a look at a `perspectiveMatrix()` function, which computes the perspective projection matrix.
```js
MDN.perspectiveMatrix = function (
fieldOfViewInRadians,
aspectRatio,
near,
far,
) {
const f = 1.0 / Math.tan(fieldOfViewInRadians / 2);
const rangeInv = 1 / (near - far);
return [
f / aspectRatio,
0,
0,
0,
0,
f,
0,
0,
0,
0,
(near + far) * rangeInv,
-1,
0,
0,
near * far * rangeInv * 2,
0,
];
};
```
The four parameters into this function are:
- `fieldOfViewInRadians`
- : An angle, given in radians, indicating how much of the scene is visible to the viewer at once. The larger the number, the more the camera can see, but the geometry at the edges becomes more and more distorted, equivalent to a wide angle lens, and the objects typically appear smaller. When the field of view is smaller, the camera can see less of the scene; the objects are distorted much less by perspective and seem much closer to the camera.
- `aspectRatio`
- : The scene's aspect ratio, which is equivalent to its width divided by its height. In these examples, that's the window's width divided by the window height. The introduction of this parameter finally solves the problem wherein the model gets warped as the canvas is resized and reshaped.
- `near`
- : A positive number indicating the distance into the screen to the near clipping plane, which is perpendicular to the viewing direction; anything nearer than this is clipped away. This is mapped to -1 in clip space, and should not be set to 0.
- `far`
- : A positive number indicating the distance to the plane beyond which geometry is clipped away. This is mapped to 1 in clip space. This value should be kept reasonably close to the distance of the geometry in order to avoid precision errors creeping in while rendering.
In the latest version of the box demo, the `computeSimpleProjectionMatrix()` method has been replaced with the `computePerspectiveMatrix()` method.
```js
CubeDemo.prototype.computePerspectiveMatrix = function () {
const fieldOfViewInRadians = Math.PI * 0.5;
const aspectRatio = window.innerWidth / window.innerHeight;
const nearClippingPlaneDistance = 1;
const farClippingPlaneDistance = 50;
this.transforms.projection = MDN.perspectiveMatrix(
fieldOfViewInRadians,
aspectRatio,
nearClippingPlaneDistance,
farClippingPlaneDistance,
);
};
```
The shader code is identical to the previous example:
```glsl
gl_Position = projection * model * vec4(position, 1.0);
```
Additionally (not shown), the position and scale matrices of the model have been changed to take it out of clip space and into the larger coordinate system.
### The results
[View on JSFiddle](https://jsfiddle.net/tatumcreative/Lzxw7e1q/)

### Exercises
- Experiment with the parameters of the perspective projection matrix and the model matrix.
- Swap out the perspective projection matrix to use [orthographic projection](https://en.wikipedia.org/wiki/Orthographic_projection). In the MDN WebGL shared code you'll find the `MDN.orthographicMatrix()` function. This can replace the `MDN.perspectiveMatrix()` function in `CubeDemo.prototype.computePerspectiveMatrix()`.
## View matrix
While some graphics libraries have a virtual camera that can be positioned and pointed while composing a scene, OpenGL (and by extension WebGL) does not. This is where the **view matrix** comes in. Its job is to translate, rotate, and scale the objects in the scene so that they are located in the right place relative to the viewer given the viewer's position and orientation.
### Simulating a camera
This makes use of one of the fundamental facets of Einstein's special relativity theory: the principle of reference frames and relative motion says that, from the perspective of a viewer, you can simulate changing the position and orientation of the viewer by applying the opposite change to the objects in the scene. Either way, the result appears to be identical to the viewer.
Consider a box sitting on a table and a camera resting on the table one meter away, pointed at the box, the front of which is pointed toward the camera. Then consider moving the camera away from the box until it's two meters away (by adding a meter to the camera's Z position), then sliding it 10 centimeters to its left. The box recedes from the camera by that amount and slides to the right slightly, thereby appearing smaller to the camera and exposing a small amount of its left side to the camera.
Now let's reset the scene, placing the box back in its starting point, with the camera two meters from, and directly facing, the box. This time, however, the camera is locked down on the table and cannot be moved or turned. This is what working in WebGL is like. So how do we simulate moving the camera through space?
Instead of moving the camera backward and to the left, we apply the inverse transform to the box: we move the _box_ backward one meter, and then 10 centimeters to its right. The result, from the perspective of each of the two objects, is identical.
The final step in all of this is to create the **view matrix**, which transforms the objects in the scene so they're positioned to simulate the camera's current location and orientation. Our code as it stands can move the cube around in world space and project everything to have perspective, but we still can't move the camera.
Imagine shooting a movie with a physical camera. You have the freedom to place the camera essentially anywhere you wish, and to aim the camera in whichever direction you choose. To simulate this in 3D graphics, we use a view matrix to simulate the position and rotation of that physical camera.
Unlike the model matrix, which directly transforms the model vertices, the view matrix moves an abstract camera around. In reality, the vertex shader is still only moving the models while the "camera" stays in place. In order for this to work out correctly, the inverse of the transform matrix must be used. The inverse matrix essentially reverses a transformation, so if we move the camera view forward, the inverse matrix causes the objects in the scene to move back.
The following `computeViewMatrix()` method animates the view matrix by moving it in and out, and left and right.
```js
CubeDemo.prototype.computeViewMatrix = function (now) {
const moveInAndOut = 20 * Math.sin(now * 0.002);
const moveLeftAndRight = 15 * Math.sin(now * 0.0017);
// Move the camera around
const position = MDN.translateMatrix(moveLeftAndRight, 0, 50 + moveInAndOut);
// Multiply together, making sure to read them in the opposite order
const matrix = MDN.multiplyArrayOfMatrices([
// Exercise: rotate the camera view
position,
]);
// Invert the operation for camera movements, because we are actually
// moving the geometry in the scene, not the camera itself.
this.transforms.view = MDN.invertMatrix(matrix);
};
```
The shader now uses three matrices.
```glsl
gl_Position = projection * view * model * vec4(position, 1.0);
```
After this step, the GPU pipeline will clip the out of range vertices, and send the model down to the fragment shader for rasterization.
### The results
[View on JSFiddle](https://jsfiddle.net/tatumcreative/86fd797g/)

### Relating the coordinate systems
At this point it would be beneficial to take a step back and look at and label the various coordinate systems we use. First off, the cube's vertices are defined in **model space**. To move the model around the scene, these vertices need to be converted into **world space** by applying the model matrix.
model space → model matrix → world space
The camera hasn't done anything yet, and the points need to be moved again. Currently they are in world space, but they need to be moved to **view space** (using the view matrix) in order to represent the camera placement.
world space → view matrix → view space
Finally a **projection** (in our case the perspective projection matrix) needs to be applied in order to map the view space coordinates into clip space coordinates.
view space → projection matrix → clip space
### Exercise
- Move the camera around the scene.
- Add some rotation matrices to the view matrix to look around.
- Finally, track the mouse's position. Use 2 rotation matrices to have the camera look up and down based on where the user's mouse is on the screen.
## See also
- [WebGL](/en-US/docs/Web/API/WebGL_API)
- [3D projection](https://en.wikipedia.org/wiki/3D_projection)
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/webgl_model_view_projection/camera_view_frustum.svg | <svg xmlns="http://www.w3.org/2000/svg" width="544" height="431.367"><path d="M0 415.562l19.756-15.319 12.256 15.805-19.756 15.319z"/><path d="M23.513 409.983l6.547-16.035 10.614 13.688z"/><path d="M36.609 401.5l180-400m-180 400l286.062-194.075M36.609 401.5L525.21 283.4M36.609 401.5l438.97-376.164" stroke="#000" stroke-dasharray="3,2" stroke-linecap="round" stroke-linejoin="round" fill="none"/><path d="M196.609 46.5l66.533 201.778L455.609 300.5l-48-217z" fill="#CCC" fill-opacity=".559"/><path d="M196.609 46.5l66.533 201.778L455.609 300.5l-48-217z" stroke="#000" stroke-width="3" stroke-linecap="round" stroke-linejoin="round" fill="none"/><path d="M101.609 257.5l95-211 211 37-237 203z" fill="red" fill-opacity=".559"/><path d="M101.609 257.5l95-211 211 37-237 203z" stroke="#000" stroke-width="3" stroke-linecap="round" stroke-linejoin="round" fill="none"/><path d="M455.609 300.5l-48-217-237 203 27 77z" fill="#0F0" fill-opacity=".559"/><path d="M455.609 300.5l-48-217-237 203 27 77z" stroke="#000" stroke-width="3" stroke-linecap="round" stroke-linejoin="round" fill="none"/><path d="M101.609 257.5l69 29 27 77-68.699-24.611z" fill="#FF0" fill-opacity=".559"/><path d="M101.609 257.5l69 29 27 77-68.699-24.611z" stroke="#000" stroke-width="3" stroke-linecap="round" stroke-linejoin="round" fill="none"/><text transform="translate(219.895 389.302)"><tspan x="-60.35" y="4" font-family="ArialMT" font-size="22">Near Plane</tspan></text><text transform="translate(483.65 93)"><tspan x="-60.35" y="4" font-family="ArialMT" font-size="22">Far Plane</tspan></text></svg> | 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/webgl_model_view_projection/clip_space_graph.svg | <svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" viewBox="0 0 500 432"><path fill="#F4F4F4" d="M119.9 128.3l14.8 220.3 222.5 9.2 60.5-112-4.5-190.9H220.9z"/><path fill="#00CE00" d="M280.7 103.3l-6.5-5.5-6.5 5.5 6.5-14.1z"/><path fill="#A50000" d="M371.8 201.6l5.4-6.5-5.4-6.6 14.1 6.6z"/><path fill="#0A00EC" d="M312.7 165.8l-5.6-2.8-9.3 2 12.7-6.4z"/><path fill="none" stroke="#0A00EC" stroke-width="4" stroke-linecap="round" stroke-miterlimit="10" d="M278.7 193.5l28.1-30.8"/><path fill="none" stroke="#00CE00" stroke-width="4" stroke-linecap="round" stroke-miterlimit="10" d="M278.8 193.5l-4.6-95.8"/><path fill="none" stroke="#A50000" stroke-width="4" stroke-linecap="round" stroke-miterlimit="10" d="M278.7 193.5l99.8 1.7"/><g font-family="'CourierNewPSMT'" font-size="10"><text transform="translate(258.696 205.393)">(0,0,0)</text><text transform="translate(396.863 48.56)">(1,1,1)</text><text transform="translate(102.86 364.06)">(-1,-1,-1)</text><text transform="translate(329.194 370.06)">(1,-1,-1)</text><text transform="translate(417.195 257.06)">(1,-1,1)</text><text transform="translate(229.194 255.06)">(-1,-1,1)</text><text transform="translate(193.529 48.727)">(-1,1,1)</text><text transform="translate(60.528 130.06)">(-1,1,-1)</text><text transform="translate(355.195 139.06)">(1,1,-1)</text></g><g opacity=".2" fill="none" stroke="#6D6D6D" stroke-width=".75" stroke-miterlimit="10"><path d="M189 287l200 6-5.3-203.3L175 88z"/><path d="M244 351.5l-10.5-222 74-74L313 243z"/><path d="M127.5 235.6l224.5 2.9 63-79-188.7-1.8zm110.5.9l72.5-78M182 192l204 3m-102.7 94.8l-9.3-201"/></g><defs><path id="a" d="M413.2 54.9l4.5 190.9-187.3-4.5-9.5-186.4z"/></defs><clipPath id="b"><use xlink:href="#a" overflow="visible"/></clipPath><path clip-path="url(#b)" fill="none" stroke="#000" stroke-width="3" stroke-miterlimit="10" d="M413.2 54.9l4.5 190.9-187.3-4.5-9.5-186.4z"/><defs><path id="c" d="M349.7 130.5l7.5 227.3-222.8-8.5-14.5-221z"/></defs><clipPath id="d"><use xlink:href="#c" overflow="visible"/></clipPath><path clip-path="url(#d)" fill="none" stroke="#000" stroke-width="3" stroke-miterlimit="10" d="M349.7 130.5l7.5 227.3-222.8-8.5-14.5-221z"/><defs><path id="e" d="M349.7 130.5l63.5-75.6 4.5 190.9-60.5 112z"/></defs><clipPath id="f"><use xlink:href="#e" overflow="visible"/></clipPath><path clip-path="url(#f)" fill="none" stroke="#000" stroke-width="3" stroke-miterlimit="10" d="M349.7 130.5l63.5-75.6 4.5 190.9-60.5 112z"/><defs><path id="g" d="M357.2 357.8l-222.8-8.5 96-108 187.3 4.5z"/></defs><clipPath id="h"><use xlink:href="#g" overflow="visible"/></clipPath><path clip-path="url(#h)" fill="none" stroke="#000" stroke-width="3" stroke-miterlimit="10" d="M357.2 357.8l-222.8-8.5 96-108 187.3 4.5z"/><defs><path id="i" d="M134.4 349.3l-14.5-221 101-73.4 9.5 186.4z"/></defs><clipPath id="j"><use xlink:href="#i" overflow="visible"/></clipPath><path clip-path="url(#j)" fill="none" stroke="#000" stroke-width="3" stroke-miterlimit="10" d="M134.4 349.3l-14.5-221 101-73.4 9.5 186.4z"/><defs><path id="k" d="M119.9 128.3l101-73.4h192.3l-63.5 75.6z"/></defs><clipPath id="l"><use xlink:href="#k" overflow="visible"/></clipPath><path clip-path="url(#l)" fill="none" stroke="#000" stroke-width="3" stroke-miterlimit="10" d="M119.9 128.3l101-73.4h192.3l-63.5 75.6z"/><text transform="translate(14 418.334)" font-family="'CourierNewPSMT'" 
font-size="19.633">Clipspace</text></svg> | 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/types/index.md | ---
title: WebGL types
slug: Web/API/WebGL_API/Types
page-type: guide
spec-urls:
- https://www.khronos.org/registry/webgl/specs/latest/1.0/#5.1
- https://www.khronos.org/registry/webgl/specs/latest/2.0/#3.1
- https://www.khronos.org/registry/webgl/extensions/EXT_disjoint_timer_query/
---
{{DefaultAPISidebar("WebGL")}}
The following types are used in [WebGL](/en-US/docs/Web/API/WebGL_API) interfaces.
## WebGL 1
These types are used within a {{domxref("WebGLRenderingContext")}}.
<table class="no-markdown">
<thead>
<tr>
<th>Type</th>
<th>Web IDL type</th>
<th>Description</th>
</tr>
</thead>
<tbody>
<tr>
<td><code>GLenum</code></td>
<td><code>unsigned long</code></td>
<td>
Used for enums. See also the list of
<a href="/en-US/docs/Web/API/WebGL_API/Constants">constants</a>.
</td>
</tr>
<tr>
<td><code>GLboolean</code></td>
<td><code>boolean</code></td>
<td>A boolean value.</td>
</tr>
<tr>
<td><code>GLbitfield</code></td>
<td><code>unsigned long</code></td>
<td>
A bit field that stores multiple, logical bits. Used for example in
{{domxref("WebGLRenderingContext.clear()")}}.
</td>
</tr>
<tr>
<td><code>GLbyte</code></td>
<td><code>byte</code></td>
<td>8-bit two's complement signed integer.</td>
</tr>
<tr>
<td><code>GLshort</code></td>
<td><code>short</code></td>
<td>16-bit two's complement signed integer.</td>
</tr>
<tr>
<td><code>GLint</code></td>
<td><code>long</code></td>
<td>32-bit two's complement signed integer.</td>
</tr>
<tr>
<td><code>GLsizei</code></td>
<td><code>long</code></td>
<td>Used for sizes (e.g. width and height of the drawing buffer).</td>
</tr>
<tr>
<td><code>GLintptr</code></td>
<td><code>long long</code></td>
<td>Special type for pointer arithmetic.</td>
</tr>
<tr>
<td><code>GLsizeiptr</code></td>
<td><code>long long</code></td>
<td>Special type for pointer arithmetic.</td>
</tr>
<tr>
<td><code>GLubyte</code></td>
<td><code>octet</code></td>
<td>8-bit unsigned integer.</td>
</tr>
<tr>
<td><code>GLushort</code></td>
<td><code>unsigned short</code></td>
<td>16-bit unsigned integer.</td>
</tr>
<tr>
<td><code>GLuint</code></td>
<td><code>unsigned long</code></td>
<td>32-bit unsigned integer.</td>
</tr>
<tr>
<td><code>GLfloat</code></td>
<td><code>unrestricted float</code></td>
<td>32-bit IEEE floating point number.</td>
</tr>
<tr>
<td><code>GLclampf</code></td>
<td><code>unrestricted float</code></td>
<td>Clamped 32-bit IEEE floating point number.</td>
</tr>
</tbody>
</table>
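In JavaScript, values of all of these types surface as ordinary numbers and booleans. As a brief, hedged sketch (assuming a browser with WebGL support):

```js
const gl = document.createElement("canvas").getContext("webgl");

// GLenum and GLint values are plain JavaScript numbers:
const maxTextureSize = gl.getParameter(gl.MAX_TEXTURE_SIZE); // GLint

// GLboolean values are plain booleans:
const depthTestOn = gl.isEnabled(gl.DEPTH_TEST); // GLboolean

// GLbitfield arguments are built by OR-ing constants together:
gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
```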
## WebGL 2
These types are used within a {{domxref("WebGL2RenderingContext")}}. All WebGL 1 types are used as well.
| Type | Web IDL type | Description |
| --------- | ------------ | ----------------------------- |
| `GLint64` | `long long` | Signed 64-bit integer number. |
## WebGL extensions
These types are used within [WebGL extensions](/en-US/docs/Web/API/WebGL_API/Using_Extensions).
| Type | Web IDL type | Description |
| ------------- | ------------ | ------------------------------- |
| `GLuint64EXT` | `long long` | Unsigned 64-bit integer number. |
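As an illustration, the `EXT_disjoint_timer_query` extension (listed in the specifications below) reports elapsed GPU time as a `GLuint64EXT` in nanoseconds. A minimal sketch, assuming a WebGL context `gl` and browser support for the extension:

```js
const ext = gl.getExtension("EXT_disjoint_timer_query");
if (ext) {
  const query = ext.createQueryEXT();
  ext.beginQueryEXT(ext.TIME_ELAPSED_EXT, query);
  // … issue the draw calls to be timed …
  ext.endQueryEXT(ext.TIME_ELAPSED_EXT);

  // Later, once the result is available, read it as a GLuint64EXT:
  if (ext.getQueryObjectEXT(query, ext.QUERY_RESULT_AVAILABLE_EXT)) {
    const elapsedNs = ext.getQueryObjectEXT(query, ext.QUERY_RESULT_EXT);
    console.log(`GPU time: ${elapsedNs} ns`);
  }
}
```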
## Specifications
{{Specifications}}
## See also
- {{domxref("WebGLRenderingContext")}}
| 0 |
data/mdn-content/files/en-us/web/api/webgl_api | data/mdn-content/files/en-us/web/api/webgl_api/basic_2d_animation_example/index.md | ---
title: A basic 2D WebGL animation example
slug: Web/API/WebGL_API/Basic_2D_animation_example
page-type: guide
---
{{DefaultAPISidebar("WebGL")}}
In this WebGL example, we create a canvas and within it render a rotating square using WebGL. The coordinate system we use to represent our scene is the same as the canvas's coordinate system. That is, (0, 0) is at the top-left corner and the bottom-right corner is at (600, 460).
## A rotating square example
Let's follow the different steps to get our rotating square.
### Vertex shader
First, let's take a look at the vertex shader. Its job, as always, is to convert the coordinates we're using for our scene into clipspace coordinates (that is, the system by which (0, 0) is at the center of the context and each axis extends from -1.0 to 1.0 regardless of the actual size of the context).
```html
<script id="vertex-shader" type="x-shader/x-vertex">
attribute vec2 aVertexPosition;
uniform vec2 uScalingFactor;
uniform vec2 uRotationVector;
void main() {
vec2 rotatedPosition = vec2(
aVertexPosition.x * uRotationVector.y +
aVertexPosition.y * uRotationVector.x,
aVertexPosition.y * uRotationVector.y -
aVertexPosition.x * uRotationVector.x
);
gl_Position = vec4(rotatedPosition * uScalingFactor, 0.0, 1.0);
}
</script>
```
The main program shares with us the attribute `aVertexPosition`, which is the position of the vertex in whatever coordinate system it's using. We need to convert these values so that both components of the position are in the range -1.0 to 1.0. This can be done easily enough by multiplying by a scaling factor that's based on the context's aspect ratio. We'll see that computation shortly.
We're also rotating the shape, and we can do that here, by applying a transform. We'll do that first. The rotated position of the vertex is computed by applying the rotation vector, found in the uniform `uRotationVector`, that's been computed by the JavaScript code.
Then the final position is computed by multiplying the rotated position by the scaling vector provided by the JavaScript code in `uScalingFactor`. The values of `z` and `w` are fixed at 0.0 and 1.0, respectively, since we're drawing in 2D.
The standard WebGL global `gl_Position` is then set to the transformed and rotated vertex's position.
### Fragment shader
Next comes the fragment shader. Its role is to return the color of each pixel in the shape being rendered. Since we're drawing a solid, untextured object with no lighting applied, this is exceptionally simple:
```html
<script id="fragment-shader" type="x-shader/x-fragment">
#ifdef GL_ES
precision highp float;
#endif
uniform vec4 uGlobalColor;
void main() {
gl_FragColor = uGlobalColor;
}
</script>
```
This starts by specifying the precision of the `float` type, as required. Then we set the global `gl_FragColor` to the value of the uniform `uGlobalColor`, which is set by the JavaScript code to the color being used to draw the square.
### HTML
The HTML consists solely of the {{HTMLElement("canvas")}} that we'll obtain a WebGL context on.
```html
<canvas id="glcanvas" width="600" height="460">
Oh no! Your browser doesn't support canvas!
</canvas>
```
### Globals and initialization
First, the global variables. We won't discuss these here; instead, we'll talk about them as they're used in the code to come.
```js
let gl = null;
let glCanvas = null;
// Aspect ratio and coordinate system
// details
let aspectRatio;
let currentRotation = [0, 1];
let currentScale = [1.0, 1.0];
// Vertex information
let vertexArray;
let vertexBuffer;
let vertexNumComponents;
let vertexCount;
// Rendering data shared with the
// scalers.
let uScalingFactor;
let uGlobalColor;
let uRotationVector;
let aVertexPosition;
// Animation timing
let shaderProgram;
let currentAngle;
let previousTime = 0.0;
let degreesPerSecond = 90.0;
```
Initializing the program is handled through a {{domxref("Window/load_event", "load")}} event handler called `startup()`:
```js
window.addEventListener("load", startup, false);
function startup() {
glCanvas = document.getElementById("glcanvas");
gl = glCanvas.getContext("webgl");
const shaderSet = [
{
type: gl.VERTEX_SHADER,
id: "vertex-shader",
},
{
type: gl.FRAGMENT_SHADER,
id: "fragment-shader",
},
];
shaderProgram = buildShaderProgram(shaderSet);
aspectRatio = glCanvas.width / glCanvas.height;
currentRotation = [0, 1];
currentScale = [1.0, aspectRatio];
vertexArray = new Float32Array([
-0.5, 0.5, 0.5, 0.5, 0.5, -0.5, -0.5, 0.5, 0.5, -0.5, -0.5, -0.5,
]);
vertexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
gl.bufferData(gl.ARRAY_BUFFER, vertexArray, gl.STATIC_DRAW);
vertexNumComponents = 2;
vertexCount = vertexArray.length / vertexNumComponents;
currentAngle = 0.0;
animateScene();
}
```
After getting the WebGL context, `gl`, we need to begin by building the shader program. Here, we're using code designed to let us add multiple shaders to our program quite easily. The array `shaderSet` contains a list of objects, each describing one shader function to be compiled into the program. Each function has a type (one of `gl.VERTEX_SHADER` or `gl.FRAGMENT_SHADER`) and an ID (the ID of the {{HTMLElement("script")}} element containing the shader's code).
The shader set is passed into the function `buildShaderProgram()`, which returns the compiled and linked shader program. We'll look at how this works next.
Once the shader program is built, we compute the aspect ratio of our context by dividing its width by its height. Then we set the current rotation vector for the animation to `[0, 1]`, and the scaling vector to `[1.0, aspectRatio]`. The scaling vector, as we saw in the vertex shader, is used to scale the coordinates to fit the -1.0 to 1.0 range.
The array of vertices is created next, as a {{jsxref("Float32Array")}} with six coordinates (three 2D vertices) per triangle to be drawn, for a total of 12 values.
As you can see, we're using a coordinate system of -1.0 to 1.0 for each axis. Why, you may ask, do we need to do any adjustments at all? This is because our context is not square. We're using a context that's 600 pixels wide and 460 tall. Each of those dimensions is mapped to the range -1.0 to 1.0. Since the two axes aren't the same length, if we don't adjust the values of one of the two axes, the square will get stretched out in one direction or the other. So we need to normalize these values.
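To make that concrete, here's the arithmetic for our 600×460 context:

```js
// With a 600×460 context:
const aspectRatio = 600 / 460; // ≈ 1.304
// currentScale = [1.0, aspectRatio], so a clipspace extent of 0.5 maps to:
//   x: 0.5 × (600 / 2) = 150 pixels
//   y: 0.5 × 1.304 × (460 / 2) ≈ 150 pixels
// Equal pixel extents on both axes, so the square stays square.
```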
Once the vertex array has been created, we create a new GL buffer to contain them by calling {{domxref("WebGLRenderingContext.createBuffer", "gl.createBuffer()")}}. We bind the standard WebGL array buffer reference to that by calling {{domxref("WebGLRenderingContext.bindBuffer", "gl.bindBuffer()")}} and then copy the vertex data into the buffer using {{domxref("WebGLRenderingContext.bufferData", "gl.bufferData()")}}. The usage hint `gl.STATIC_DRAW` is specified, telling WebGL that the data will be set only one time and never modified, but will be used repeatedly. This lets WebGL consider any optimizations it can apply that may improve performance based on that information.
With the vertex data now provided to WebGL, we set `vertexNumComponents` to the number of components in each vertex (2, since they're 2D vertexes) and `vertexCount` to the number of vertexes in the vertex list.
Then the current rotation angle (in degrees) is set to 0.0, since we haven't performed any rotation yet. The rotation speed, in degrees per second, was set when the globals were declared: `degreesPerSecond` is 90.0.
Finally, `animateScene()` is called to render the first frame and schedule the rendering of the next frame of the animation.
### Compiling and linking the shader program
The `buildShaderProgram()` function accepts as input an array of objects describing a set of shader functions to be compiled and linked into the shader program and returns the shader program after it's been built and linked.
```js
function buildShaderProgram(shaderInfo) {
const program = gl.createProgram();
shaderInfo.forEach((desc) => {
const shader = compileShader(desc.id, desc.type);
if (shader) {
gl.attachShader(program, shader);
}
});
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
console.log("Error linking shader program:");
console.log(gl.getProgramInfoLog(program));
}
return program;
}
```
First, {{domxref("WebGLRenderingContext.createProgram", "gl.createProgram()")}} is called to create a new, empty, GLSL program.
Then, for each shader in the specified list of shaders, we call a `compileShader()` function to compile it, passing into it the ID and type of the shader function to build. Each of those objects includes, as mentioned before, the ID of the `<script>` element the shader code is found in and the type of shader it is. The compiled shader is attached to the shader program by passing it into {{domxref("WebGLRenderingContext.attachShader", "gl.attachShader()")}}.
> **Note:** We could go a step farther here, actually, and look at the value of the `<script>` element's `type` attribute to determine the shader type.
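In sketch form, that hypothetical variant might look like the following; the article's actual `compileShader()`, defined below, still does the real work:

```js
// Hypothetical helper: derive the shader type from the <script> element's
// type attribute rather than passing it in explicitly.
function compileShaderFromScript(id) {
  const script = document.getElementById(id);
  const type =
    script.type === "x-shader/x-vertex" ? gl.VERTEX_SHADER : gl.FRAGMENT_SHADER;
  return compileShader(id, type);
}
```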
Once all of the shaders are compiled, the program is linked using {{domxref("WebGLRenderingContext.linkProgram", "gl.linkProgram()")}}.
If an error occurs while linking the program, the error message is logged to the console.
Finally, the compiled program is returned to the caller.
### Compiling an individual shader
The `compileShader()` function, below, is called by `buildShaderProgram()` to compile a single shader.
```js
function compileShader(id, type) {
const code = document.getElementById(id).firstChild.nodeValue;
const shader = gl.createShader(type);
gl.shaderSource(shader, code);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
console.log(
`Error compiling ${
type === gl.VERTEX_SHADER ? "vertex" : "fragment"
} shader:`,
);
console.log(gl.getShaderInfoLog(shader));
}
return shader;
}
```
The code is fetched from the HTML document by obtaining the value of the text node contained within the {{HTMLElement("script")}} element with the specified ID. Then a new shader of the specified type is created using {{domxref("WebGLRenderingContext.createShader", "gl.createShader()")}}.
The source code is sent into the new shader by passing it into {{domxref("WebGLRenderingContext.shaderSource", "gl.shaderSource()")}}, and then the shader is compiled using {{domxref("WebGLRenderingContext.compileShader", "gl.compileShader()")}}
Compile errors are logged to the console. Note the use of a [template literal](/en-US/docs/Web/JavaScript/Reference/Template_literals) string to insert the correct shader type string into the message that gets generated. The actual error details are obtained by calling {{domxref("WebGLRenderingContext.getShaderInfoLog", "gl.getShaderInfoLog()")}}.
Finally, the compiled shader is returned to the caller (which is the `buildShaderProgram()` function).
### Drawing and animating the scene
The `animateScene()` function is called to render each animation frame.
```js
function animateScene() {
gl.viewport(0, 0, glCanvas.width, glCanvas.height);
gl.clearColor(0.8, 0.9, 1.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
const radians = (currentAngle * Math.PI) / 180.0;
currentRotation[0] = Math.sin(radians);
currentRotation[1] = Math.cos(radians);
gl.useProgram(shaderProgram);
uScalingFactor = gl.getUniformLocation(shaderProgram, "uScalingFactor");
uGlobalColor = gl.getUniformLocation(shaderProgram, "uGlobalColor");
uRotationVector = gl.getUniformLocation(shaderProgram, "uRotationVector");
gl.uniform2fv(uScalingFactor, currentScale);
gl.uniform2fv(uRotationVector, currentRotation);
gl.uniform4fv(uGlobalColor, [0.1, 0.7, 0.2, 1.0]);
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
aVertexPosition = gl.getAttribLocation(shaderProgram, "aVertexPosition");
gl.enableVertexAttribArray(aVertexPosition);
gl.vertexAttribPointer(
aVertexPosition,
vertexNumComponents,
gl.FLOAT,
false,
0,
0,
);
gl.drawArrays(gl.TRIANGLES, 0, vertexCount);
requestAnimationFrame((currentTime) => {
const deltaAngle =
((currentTime - previousTime) / 1000.0) * degreesPerSecond;
currentAngle = (currentAngle + deltaAngle) % 360;
previousTime = currentTime;
animateScene();
});
}
```
The first thing that needs to be done in order to draw a frame of the animation is to clear the background to the desired color. In this case, we set the viewport based on the size of the {{HTMLElement("canvas")}}, call {{domxref("WebGLRenderingContext.clearColor", "clearColor()")}} to set the color to use when clearing content, then we clear the buffer with {{domxref("WebGLRenderingContext.clear", "clear()")}}.
Next, the current rotation vector is computed by converting the current rotation in degrees (`currentAngle`) into [radians](https://en.wikipedia.org/wiki/Radians), then setting the first component of the rotation vector to the [sine](https://en.wikipedia.org/wiki/Sine) of that value and the second component to the [cosine](https://en.wikipedia.org/wiki/Cosine). The `currentRotation` vector is now the location of the point on the [unit circle](https://en.wikipedia.org/wiki/Unit_circle) located at the angle `currentAngle`.
{{domxref("WebGLRenderingContext.useProgram", "useProgram()")}} is called to activate the GLSL shading program we established previously. Then we obtain the locations of each of the uniforms used to share information between the JavaScript code and the shaders (with {{domxref("WebGLRenderingContext.getUniformLocation", "getUniformLocation()")}}).
The uniform named `uScalingFactor` is set to the `currentScale` value previously computed; this, as you may recall, is the value used to adjust the coordinate system based on the aspect ratio of the context. This is done using {{domxref("WebGLRenderingContext/uniform", "uniform2fv()")}} (since this is a 2-value floating-point vector).
`uRotationVector` is set to the current rotation vector (`currentRotation`), also using `uniform2fv()`.
`uGlobalColor` is set using {{domxref("WebGLRenderingContext/uniform", "uniform4fv()")}} to the color we wish to use when drawing the square. This is a 4-component floating-point vector (one component each for red, green, blue, and alpha).
Now that's all out of the way, we can set up the vertex buffer and draw our shape. First, the buffer of vertexes that will be used to draw the triangles of the shape is set by calling {{domxref("WebGLRenderingContext.bindBuffer", "bindBuffer()")}}. Then the vertex position attribute's index is obtained from the shader program by calling {{domxref("WebGLRenderingContext.getAttribLocation", "getAttribLocation()")}}.
With the index of the vertex position attribute now available in `aVertexPosition`, we call `enableVertexAttribArray()` to enable the position attribute so it can be used by the shader program (in particular, by the vertex shader).
Then the vertex buffer is bound to the `aVertexPosition` attribute by calling {{domxref("WebGLRenderingContext.vertexAttribPointer", "vertexAttribPointer()")}}. This step is not obvious, since this binding is almost a side effect. But as a result, accessing `aVertexPosition` now obtains data from the vertex buffer.
With the association in place between the vertex buffer for our shape and the `aVertexPosition` attribute used to deliver vertexes one by one into the vertex shader, we're ready to draw the shape by calling {{domxref("WebGLRenderingContext.drawArrays", "drawArrays()")}}.
At this point, the frame has been drawn. All that's left to do is to schedule to draw the next one. That's done here by calling {{domxref("Window.requestAnimationFrame", "requestAnimationFrame()")}}, which asks that a callback function be executed the next time the browser is ready to update the screen.
Our `requestAnimationFrame()` callback receives as input a single parameter, `currentTime`, which specifies the time at which the frame drawing began. We use that and the saved time at which the last frame was drawn, `previousTime`, along with the number of degrees per second the square should rotate (`degreesPerSecond`) to calculate the new value of `currentAngle`. Then the value of `previousTime` is updated and we call `animateScene()` to draw the next frame (and in turn schedule the next frame to be drawn, ad infinitum).
### Result
This is a pretty simple example, since it's just drawing one simple object, but the concepts used here extend to much more complex animations.
{{EmbedLiveSample("A_rotating_square_example", 660, 500)}}
## See also
- [WebGL API](/en-US/docs/Web/API/WebGL_API)
- [WebGL tutorial](/en-US/docs/Web/API/WebGL_API/Tutorial)
| 0 |
data/mdn-content/files/en-us/web/api | data/mdn-content/files/en-us/web/api/paymentresponse/index.md | ---
title: PaymentResponse
slug: Web/API/PaymentResponse
page-type: web-api-interface
browser-compat: api.PaymentResponse
---
{{SecureContext_Header}}{{APIRef("Payment Request API")}}
The **`PaymentResponse`** interface of the [Payment Request API](/en-US/docs/Web/API/Payment_Request_API) is returned after a user selects a payment method and approves a payment request.
{{InheritanceDiagram}}
## Instance properties
- {{domxref('PaymentResponse.details')}} {{ReadOnlyInline}}
- : Returns a JSON-serializable object that provides a payment method specific message used by the merchant to process the transaction and determine successful fund transfer. The contents of the object depend on the payment method being used. Developers need to consult whomever controls the URL for the expected shape of the details object.
- {{domxref('PaymentResponse.methodName')}} {{ReadOnlyInline}}
- : Returns the payment method identifier for the payment method that the user selected, for example, Visa, Mastercard, PayPal, etc.
- {{domxref('PaymentResponse.payerEmail')}} {{ReadOnlyInline}} {{Deprecated_Inline}} {{Non-standard_Inline}}
- : Returns the email address supplied by the user. This option is only present when the `requestPayerEmail` option is set to `true` in the `options` parameter of the {{domxref('PaymentRequest.PaymentRequest','PaymentRequest()')}} constructor.
- {{domxref('PaymentResponse.payerName')}} {{ReadOnlyInline}} {{Deprecated_Inline}} {{Non-standard_Inline}}
- : Returns the name supplied by the user. This option is only present when the `requestPayerName` option is set to `true` in the `options` parameter of the {{domxref('PaymentRequest.PaymentRequest','PaymentRequest()')}} constructor.
- {{domxref('PaymentResponse.payerPhone')}} {{ReadOnlyInline}} {{Deprecated_Inline}} {{Non-standard_Inline}}
- : Returns the phone number supplied by the user. This option is only present when the `requestPayerPhone` option is set to `true` in the `options` parameter of the {{domxref('PaymentRequest.PaymentRequest','PaymentRequest()')}} constructor.
- {{domxref('PaymentResponse.requestId')}} {{ReadOnlyInline}}
- : Returns the identifier of the {{domxref('PaymentRequest')}} that produced the current response. This is the same value supplied in the {{domxref('PaymentRequest.PaymentRequest','PaymentRequest()')}} constructor by `details.id`.
- {{domxref('PaymentResponse.shippingAddress')}} {{ReadOnlyInline}} {{Deprecated_Inline}} {{Non-standard_Inline}}
- : Returns the shipping address supplied by the user. This option is only present when the `requestShipping` option is set to `true` in the `options` parameter of the {{domxref('PaymentRequest.PaymentRequest','PaymentRequest()')}} constructor.
- {{domxref('PaymentResponse.shippingOption')}} {{ReadOnlyInline}} {{Deprecated_Inline}} {{Non-standard_Inline}}
- : Returns the ID attribute of the shipping option selected by the user. This option is only present when the `requestShipping` option is set to `true` in the `options` parameter of the {{domxref('PaymentRequest.PaymentRequest','PaymentRequest()')}} constructor.
## Instance methods
- {{domxref('PaymentResponse.retry()')}}
- : If something is wrong with the payment response's data (and there is a recoverable error), this method allows a merchant to request that the user retry the payment. The method takes an object as argument, which is used to signal to the user exactly what is wrong with the payment response so they can try to correct any issues.
- {{domxref('PaymentResponse.complete()')}}
- : Notifies the user agent that the user interaction is over. This causes any remaining user interface to be closed. This method should only be called after the {{jsxref("Promise")}} returned by the {{domxref('PaymentRequest.show()')}} method is resolved.
- {{domxref("PaymentResponse.toJSON()")}}
- : Returns a [JSON object](/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON) representing this `PaymentResponse` object.
## Events
Listen to this event using [`addEventListener()`](/en-US/docs/Web/API/EventTarget/addEventListener) or by assigning an event listener to the `oneventname` property of this interface.
- [`payerdetailchange`](/en-US/docs/Web/API/PaymentResponse/payerdetailchange_event) {{Deprecated_Inline}} {{Non-standard_Inline}}
- : Fired during a retry when the user makes changes to their personal information while filling out a payment request form. Allows the developer to revalidate any requested user data (e.g., the phone number or the email address) if it changes.
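## Examples

The following sketch shows the typical lifecycle of a `PaymentResponse`, assuming `methodData` and `details` objects built elsewhere and a hypothetical `processOnServer()` helper:

```js
const request = new PaymentRequest(methodData, details);

request.show().then(async (response) => {
  // `response` is the PaymentResponse for the payment method the user chose
  console.log(response.methodName, response.requestId);

  // Submit the serialized response to the server for processing
  const ok = await processOnServer(response.toJSON());

  // Close the payment UI, reporting the outcome
  await response.complete(ok ? "success" : "fail");
});
```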
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/methodname/index.md | ---
title: "PaymentResponse: methodName property"
short-title: methodName
slug: Web/API/PaymentResponse/methodName
page-type: web-api-instance-property
browser-compat: api.PaymentResponse.methodName
---
{{securecontext_header}}{{APIRef("Payment Request API")}}
The **`methodName`** read-only
property of the {{domxref("PaymentResponse")}} interface returns a string uniquely
identifying the payment handler selected by the user.
This string may be either
one of the standardized payment method identifiers or a URL used by the payment handler
to process payments.
## Value
A string uniquely identifying the payment handler being used to
process the payment. This may be either a standardized identifier, or a URL used by the
payment processor to handle payments. See
how [merchant validation](/en-US/docs/Web/API/Payment_Request_API/Concepts#merchant_validation) works.
## Examples
The following example extracts the method name from the {{domxref('PaymentResponse')}}
object made available to the promise returned from {{domxref('PaymentRequest.show()')}}. In a
real-world implementation this data would then be sent to a payment server.
```js
payment.show().then((paymentResponse) => {
const paymentData = {
// payment method string
method: paymentResponse.methodName,
// payment details as you requested
details: paymentResponse.details,
// shipping address information
address: toDict(paymentResponse.shippingAddress),
};
// Send information to the server
});
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/requestid/index.md | ---
title: "PaymentResponse: requestId property"
short-title: requestId
slug: Web/API/PaymentResponse/requestId
page-type: web-api-instance-property
browser-compat: api.PaymentResponse.requestId
---
{{securecontext_header}}{{APIRef("Payment Request API")}}
The **`requestId`** read-only property of the
{{domxref("PaymentResponse")}} interface returns the free-form identifier supplied by
the `PaymentResponse()` constructor by details.id.
## Value
A string.
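## Examples

A short sketch showing how the identifier supplied in `details.id` round-trips into `requestId` (assuming a `methodData` array defined elsewhere):

```js
const details = {
  id: "order-12345", // free-form identifier chosen by the merchant
  total: { label: "Total", amount: { currency: "USD", value: "10.00" } },
};
const request = new PaymentRequest(methodData, details);

const response = await request.show();
console.log(response.requestId); // "order-12345"
```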
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/payerdetailchange_event/index.md | ---
title: "PaymentResponse: payerdetailchange event"
short-title: payerdetailchange
slug: Web/API/PaymentResponse/payerdetailchange_event
page-type: web-api-event
status:
- deprecated
- non-standard
browser-compat: api.PaymentResponse.payerdetailchange_event
---
{{APIRef("Payment Request API")}}{{SecureContext_Header}}{{Deprecated_Header}}{{Non-standard_Header}}
A **`payerdetailchange`** event is fired by the [Payment Request API](/en-US/docs/Web/API/Payment_Request_API) to a {{domxref("PaymentResponse")}} object when the user makes changes to their personal information while filling out a payment request form. This can happen when the payer is retrying to submit their details after an error has been detected.
The event handler for `payerdetailchange` should check each value in the form that has changed and ensure that the values are valid. If any are invalid, appropriate error messages should be configured and the {{domxref("PaymentResponse.retry", "retry()")}} method should be called on the {{domxref("PaymentResponse")}} to ask the user to update the invalid entries.
This event is not cancelable and does not bubble.
## Syntax
Use the event name in methods like {{domxref("EventTarget.addEventListener", "addEventListener()")}}, or set an event handler property.
```js
addEventListener("payerdetailchange", async (event) => {});
onpayerdetailchange = async (event) => {};
```
## Event type
A {{domxref("PaymentRequestUpdateEvent")}}. Inherits from {{domxref("Event")}}.
{{InheritanceDiagram("PaymentRequestUpdateEvent")}}
## Event properties
Although this event type is {{domxref("PaymentRequestUpdateEvent")}}, it doesn't implement any property that is not already on {{domxref("Event")}}.
## Examples
In the example below, `onpayerdetailchange` is used to set up a listener for the `payerdetailchange` event in order to validate the information entered by the user, requesting that any mistakes be corrected:
```js
// Options for PaymentRequest(), indicating that shipping address,
// payer email address, name, and phone number all be collected.
const options = {
requestShipping: true,
requestPayerEmail: true,
requestPayerName: true,
requestPayerPhone: true,
};
const request = new PaymentRequest(methods, details, options);
const response = await request.show();
// Get the data from the response
let {
payerName: oldPayerName,
payerEmail: oldPayerEmail,
payerPhone: oldPayerPhone,
} = response;
// Set up a handler for payerdetailchange events, to
// request corrections as needed.
response.onpayerdetailchange = async (ev) => {
const promisesToValidate = [];
const { payerName, payerEmail, payerPhone } = response;
// Validate each value which changed by calling a function
// that validates each type of data, returning a promise which
// resolves if the data is valid.
if (oldPayerName !== payerName) {
promisesToValidate.push(validateName(payerName));
oldPayerName = payerName;
}
if (oldPayerEmail !== payerEmail) {
promisesToValidate.push(validateEmail(payerEmail));
oldPayerEmail = payerEmail;
}
if (oldPayerPhone !== payerPhone) {
promisesToValidate.push(validatePhone(payerPhone));
oldPayerPhone = payerPhone;
}
// As each validation promise resolves, add the results of the
// validation to the errors list
const errors = await Promise.all(promisesToValidate).then((results) =>
results.reduce((errors, result) => Object.assign(errors, result), {}),
);
// If we found any errors, wait for them to be corrected
if (Object.getOwnPropertyNames(errors).length) {
await response.retry(errors);
} else {
// We have a good payment; send the data to the server
await fetch("/pay-for-things/", { method: "POST", body: response.json() });
response.complete("success");
}
};
await response.retry({
payer: {
email: "invalid domain.",
phone: "invalid number.",
},
});
```
### addEventListener equivalent
You could also set up the event handler using the `addEventListener()` method:
```js
response.addEventListener("payerdetailchange", async (ev) => {
// …
});
```
## Browser compatibility
{{Compat}}
## See also
- [Payment Request API](/en-US/docs/Web/API/Payment_Request_API)
- [Using the Payment Request API](/en-US/docs/Web/API/Payment_Request_API/Using_the_Payment_Request_API)
- {{domxref("PaymentResponse")}}
- [`paymentmethodchange`](/en-US/docs/Web/API/PaymentRequest/paymentmethodchange_event)
- [`shippingaddresschange`](/en-US/docs/Web/API/PaymentRequest/shippingaddresschange_event)
- [`shippingoptionchange`](/en-US/docs/Web/API/PaymentRequest/shippingoptionchange_event)
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/shippingoption/index.md | ---
title: "PaymentResponse: shippingOption property"
short-title: shippingOption
slug: Web/API/PaymentResponse/shippingOption
page-type: web-api-instance-property
status:
- deprecated
- non-standard
browser-compat: api.PaymentResponse.shippingOption
---
{{securecontext_header}}{{APIRef("Payment Request API")}}{{Deprecated_header}}{{Non-standard_header}}
The **`shippingOption`** read-only property of
the {{domxref("PaymentResponse")}} interface returns the ID attribute of the shipping
option selected by the user. This option is only present when the
`requestShipping` option is set to `true` in the
{{domxref('PaymentOptions')}} object passed to the
{{domxref('PaymentRequest.PaymentRequest','PaymentRequest')}} constructor.
## Value
A string.
## Examples
In the example below, a handler for the {{domxref('PaymentRequest.shippingoptionchange_event', 'shippingoptionchange')}} event
is registered. It calls `updateDetails()` to toggle the shipping method between
"standard" and "express".
```js
// Initialization of PaymentRequest arguments are excerpted for brevity.
const payment = new PaymentRequest(supportedInstruments, details, options);
request.addEventListener("shippingoptionchange", (evt) => {
evt.updateWith(
new Promise((resolve, reject) => {
updateDetails(details, payment.shippingOption, resolve, reject);
}),
);
});
payment
.show()
.then((paymentResponse) => {
// Processing of paymentResponse excerpted for the sake of brevity.
})
.catch((err) => {
console.error("Uh oh, something bad happened", err.message);
});
function updateDetails(details, shippingOption, resolve, reject) {
let selectedShippingOption;
let otherShippingOption;
if (shippingOption === "standard") {
selectedShippingOption = details.shippingOptions[0];
otherShippingOption = details.shippingOptions[1];
details.total.amount.value = "55.00";
} else if (shippingOption === "express") {
selectedShippingOption = details.shippingOptions[1];
otherShippingOption = details.shippingOptions[0];
details.total.amount.value = "67.00";
} else {
reject(`Unknown shipping option '${shippingOption}'`);
return;
}
selectedShippingOption.selected = true;
otherShippingOption.selected = false;
details.displayItems.splice(2, 1, selectedShippingOption);
resolve(details);
}
```
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/details/index.md | ---
title: "PaymentResponse: details property"
short-title: details
slug: Web/API/PaymentResponse/details
page-type: web-api-instance-property
browser-compat: api.PaymentResponse.details
---
{{securecontext_header}}{{APIRef("Payment Request API")}}
The **`details`** read-only property of the
{{domxref("PaymentResponse")}} interface returns a JSON-serializable object that
provides a payment method specific message used by the merchant to process the
transaction and determine a successful funds transfer.
This data is returned by the payment app that satisfies the payment request. Developers need to consult whomever controls the URL for the expected shape of the details object.
## Value

A JSON-serializable object whose contents depend on the payment method being used, providing the data the merchant needs to process the transaction.
## Examples
The following example extracts the details from the {{domxref('PaymentResponse')}}
object made available to the promise returned from {{domxref('PaymentRequest.show()')}}. In a
real-world implementation this data would then be sent to a payment server.
```js
payment.show().then((paymentResponse) => {
const paymentData = {
// payment method string
method: paymentResponse.methodName,
// payment details as you requested
details: paymentResponse.details,
// shipping address information
address: toDict(paymentResponse.shippingAddress),
};
// Send information to the server
});
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/payerphone/index.md | ---
title: "PayerResponse: payerPhone property"
short-title: payerPhone
slug: Web/API/PaymentResponse/payerPhone
page-type: web-api-instance-property
status:
- deprecated
- non-standard
browser-compat: api.PaymentResponse.payerPhone
---
{{securecontext_header}}{{APIRef("Payment Request API")}}{{Deprecated_header}}{{Non-standard_header}}
The `payerPhone` read-only property of the {{domxref("PaymentResponse")}}
interface returns the phone number supplied by the user. This option is only present
when the `requestPayerPhone` option is set to `true` in the
{{domxref('PaymentOptions')}} object passed to the
{{domxref('PaymentRequest.PaymentRequest','PaymentRequest')}} constructor.
## Value
A string.
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/shippingaddress/index.md | ---
title: "PaymentResponse: shippingAddress property"
short-title: shippingAddress
slug: Web/API/PaymentResponse/shippingAddress
page-type: web-api-instance-property
status:
- deprecated
- non-standard
browser-compat: api.PaymentResponse.shippingAddress
---
{{securecontext_header}}{{APIRef("Payment Request API")}}{{Deprecated_header}}{{Non-standard_header}}
The **`shippingAddress`** read-only property of
the {{domxref("PaymentResponse")}} interface returns a {{domxref('PaymentAddress')}} object
containing the shipping address provided by the user.
## Value
A {{domxref("PaymentAddress")}} object providing details comprising the shipping
address provided by the user.
## Examples
Generally, the user agent will fill the `shippingAddress` property for you.
You can trigger this by
setting `PaymentOptions.requestShipping` to `true` when calling
the {{domxref('PaymentRequest.PaymentRequest','PaymentRequest')}} constructor.
In the example below, the cost of shipping varies by geography. When the
{{domxref('PaymentRequest.shippingaddresschange_event', 'shippingaddresschange')}} event is
fired and caught, `updateDetails()` is called to update the details of
the `PaymentRequest`, using `shippingAddress` to set the correct
shipping cost.
```js
// Initialization of PaymentRequest arguments are excerpted for brevity.
const payment = new PaymentRequest(supportedInstruments, details, options);
request.addEventListener("shippingaddresschange", (evt) => {
evt.updateWith(
new Promise((resolve) => {
updateDetails(details, payment.shippingAddress, resolve);
}),
);
});
payment
.show()
.then((paymentResponse) => {
// Processing of paymentResponse excerpted for the sake of brevity.
})
.catch((err) => {
console.error("Uh oh, something bad happened", err.message);
});
function updateDetails(details, shippingAddress, resolve) {
if (shippingAddress.country === "US") {
const shippingOption = {
id: "",
label: "",
amount: { currency: "USD", value: "0.00" },
selected: true,
};
if (shippingAddress.region === "MO") {
shippingOption.id = "mo";
shippingOption.label = "Free shipping in Missouri";
details.total.amount.value = "55.00";
} else {
shippingOption.id = "us";
shippingOption.label = "Standard shipping in US";
shippingOption.amount.value = "5.00";
details.total.amount.value = "60.00";
}
details.displayItems.splice(2, 1, shippingOption);
details.shippingOptions = [shippingOption];
} else {
delete details.shippingOptions;
}
resolve(details);
}
```
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/retry/index.md | ---
title: "PaymentResponse: retry() method"
short-title: retry()
slug: Web/API/PaymentResponse/retry
page-type: web-api-instance-method
browser-compat: api.PaymentResponse.retry
---
{{securecontext_header}}{{APIRef("Payment Request API")}}
The {{domxref("PaymentResponse")}} interface's
**`retry()`** method makes it possible to ask the user to
retry a payment after an error occurs during processing.
This lets your app
gracefully deal with situations such as invalid shipping addresses or declined credit
cards.
## Syntax
```js-nolint
retry(errorFields)
```
### Parameters
- `errorFields`
- : An object, with the following properties:
- `error` {{optional_inline}}
- : A general description of a payment error from which the user may attempt to recover by retrying the payment, possibly after correcting mistakes in the payment information. `error` can be provided all by itself to provide only a generic error message, or in concert with the other properties to serve as an overview while other properties' values guide the user to errors in specific fields in the payment form.
- `paymentMethod` {{optional_inline}}
- : Any payment-method-specific errors which may have occurred. This object's contents will vary depending on the payment method used.
### Return value
A {{jsxref("Promise")}} which is resolved when the payment is successfully completed.
The promise is rejected with an appropriate exception value if the payment fails again.
Typically you will use this by calling {{domxref("PaymentRequest.show", "show()")}},
then entering a loop or recursive function that checks the
{{domxref("PaymentResponse")}} for errors or other reasons to retry the payment request.
If a retry is needed, the loop calls `retry()`, then loops back to check the
response when it comes in. The loop exits only when the user either cancels the payment
request or the request is successful.
See the [example](#examples) below for a thorough example, but the basic
concept, in outline form, is:
1. Create a new {{domxref("PaymentRequest")}}
(`new` {{domxref("PaymentRequest.PaymentRequest", "PaymentRequest()")}})
2. Display the payment request ({{domxref("PaymentRequest.show()")}})
3. If `show()` resolves, the returned {{domxref("PaymentResponse")}}
describes the requested payment and the options chosen by the user. Continue with the following steps:
1. Validate the returned response; if there are any fields whose values are not
acceptable, call the response's {{domxref("PaymentResponse.complete",
"complete()")}} method with a value of `"fail"` to indicate failure.
2. If the response's data is valid and acceptable, call
`complete("success")` to finalize the payment and process it.
4. If `show()` is rejected, the payment request failed, usually because
either there's already one being processed, because the {{Glossary("user agent")}}
doesn't support any of the specified payment methods, or because of a security issue.
See the [list of exceptions](/en-US/docs/Web/API/PaymentRequest/show#exceptions) for `show()` for further details. Call
`complete("fail")` to close the payment request.
```js
async function handlePayment() {
const payRequest = new PaymentRequest(methodData, details, options);
try {
let payResponse = await payRequest.show();
while (validate(payResponse)) {
/* let the user edit the payment information,
wait until they submit */
await payResponse.retry();
}
await payResponse.complete("success");
} catch (err) {
/* handle the exception */
}
}
```
## Examples
```js
async function doPaymentRequest() {
const request = new PaymentRequest(methodData, details, options);
const response = await request.show();
await recursiveValidate(request, response);
await response.complete("success");
}
// Keep validating until the data looks good!
async function recursiveValidate(request, response) {
const promisesToFixThings = [];
const errors = await validate(request, response);
if (!errors) {
return;
}
if (errors.shippingAddress) {
// "shippingaddresschange" fired at request object
const promise = fixField(
request,
"shippingaddresschange",
shippingValidator,
);
promisesToFixThings.push(promise);
}
if (errors.payer) {
// "payerdetailchange" fired at response object
const promise = fixField(response, "payerdetailchange", payerValidator);
promisesToFixThings.push(promise);
}
await Promise.all([response.retry(errors), ...promisesToFixThings]);
await recursiveValidate(request, response);
}
function fixField(requestOrResponse, event, validator) {
return new Promise((resolve) => {
// Browser keeps calling this until promise resolves.
requestOrResponse.addEventListener(event, async function listener(ev) {
const promiseToValidate = validator(requestOrResponse);
ev.updateWith(promiseToValidate);
const errors = await promiseToValidate;
if (!errors) {
// yay! fixed!
requestOrResponse.removeEventListener(event, listener);
resolve();
}
});
});
}
doPaymentRequest();
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{domxref("PaymentResponse")}} interface.
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/payeremail/index.md | ---
title: "PaymentResponse: payerEmail property"
short-title: payerEmail
slug: Web/API/PaymentResponse/payerEmail
page-type: web-api-instance-property
status:
- deprecated
- non-standard
browser-compat: api.PaymentResponse.payerEmail
---
{{securecontext_header}}{{APIRef("Payment Request API")}}{{Deprecated_header}}{{Non-standard_header}}
The `payerEmail` read-only property of the {{domxref("PaymentResponse")}}
interface returns the email address supplied by the user. This option is only present
when the `requestPayerEmail` option is set to `true` in the
{{domxref('PaymentOptions')}} object passed to the
{{domxref('PaymentRequest.PaymentRequest','PaymentRequest')}} constructor.
## Value
A string.
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/complete/index.md | ---
title: "PaymentResponse: complete() method"
short-title: complete()
slug: Web/API/PaymentResponse/complete
page-type: web-api-instance-method
browser-compat: api.PaymentResponse.complete
---
{{securecontext_header}}{{APIRef("Payment Request API")}}
The {{domxref("PaymentRequest")}} method
**`complete()`** of the [Payment Request API](/en-US/docs/Web/API/Payment_Request_API) notifies the
{{Glossary("user agent")}} that the user interaction is over, and causes any remaining
user interface to be closed.
This method must be called after the user accepts
the payment request and the {{jsxref("Promise")}} returned by the
{{domxref('PaymentRequest.show()')}} method is resolved.
## Syntax
```js-nolint
complete()
complete(result)
```
### Parameters
- `result` {{optional_inline}}
- : A string indicating the state of the payment operation upon
completion. It must be one of the following:
- `success`
- : The payment was successfully processed. The user agent may or may not present
some form of "payment successful" indication to the user.
- `fail`
- : The payment was not successfully processed. The failure may or may not be
announced to the user by the user agent, depending on its design.
- `unknown`
- : The success or failure status of the transaction is unknown or irrelevant, and
the user agent should not present any notification, even if it normally would.
_This is the default value._
> **Note:** In older versions of the specification, an empty string,
> `""`, was used instead of `unknown` to indicate a completion
> without a known result state. See the [Browser compatibility](#browser_compatibility) section
> below for details.
### Return value
A {{jsxref("Promise")}} which resolves with no input value once the payment interface
has been fully closed. If an error occurs, the promise instead rejects, returning one of
the exceptions listed below.
### Exceptions
- `AbortError` {{domxref("DOMException")}}
- : Returned if the document in which the payment request is taking place became inactive while the
user interface was shown.
- `InvalidStateError` {{domxref("DOMException")}}
- : Returned if the payment has already completed, or `complete()` was called while a
request to retry the payment is pending. You can't treat a payment as complete after
requesting that the payment be tried again.
## Examples
The following example sends payment information to a secure server using the [Fetch API](/en-US/docs/Web/API/Fetch_API). It
calls `complete()` with an answer appropriate to the status in the response.
```js
// Initialization of PaymentRequest arguments is excerpted for the
// sake of brevity.
const payment = new PaymentRequest(supportedInstruments, details, options);
payment
.show()
.then((paymentResponse) => {
const fetchOptions = {
method: "POST",
credentials: "include",
body: JSON.stringify(paymentResponse),
};
const serverPaymentRequest = new Request("secure/payment/endpoint");
fetch(serverPaymentRequest, fetchOptions)
.then((response) => {
if (response.status < 400) {
paymentResponse.complete("success");
} else {
paymentResponse.complete("fail");
}
})
.catch((reason) => {
paymentResponse.complete("fail");
});
})
.catch((err) => {
console.error("Uh oh, something bad happened", err.message);
});
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/payername/index.md | ---
title: "PaymentRequest: payerName property"
short-title: payerName
slug: Web/API/PaymentResponse/payerName
page-type: web-api-instance-property
status:
- deprecated
- non-standard
browser-compat: api.PaymentResponse.payerName
---
{{securecontext_header}}{{APIRef("Payment Request API")}}{{Deprecated_header}}{{Non-standard_header}}
The **`payerName`** read-only property of the
{{domxref("PaymentResponse")}} interface returns the name supplied by the user. This
option is only present when the `requestPayerName` option is set to
`true` in the options parameter of the
{{domxref('PaymentRequest.PaymentRequest','PaymentRequest()')}} constructor.
## Value
A string containing the payer name.
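## Examples

A minimal sketch, assuming `methodData` and `details` objects defined elsewhere; the name is only populated because `requestPayerName` is `true`:

```js
const options = { requestPayerName: true };
const request = new PaymentRequest(methodData, details, options);

const response = await request.show();
console.log(response.payerName); // e.g. "Jane Doe"
```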
## Browser compatibility
{{Compat}}
| 0 |
data/mdn-content/files/en-us/web/api/paymentresponse | data/mdn-content/files/en-us/web/api/paymentresponse/tojson/index.md | ---
title: "PaymentResponse: toJSON() method"
short-title: toJSON()
slug: Web/API/PaymentResponse/toJSON
page-type: web-api-instance-method
browser-compat: api.PaymentResponse.toJSON
---
{{SecureContext_Header}}{{APIRef("Payment Request API")}}
The **`toJSON()`** method of the {{domxref("PaymentResponse")}} interface is a {{Glossary("Serialization","serializer")}}; it returns a JSON representation of the {{domxref("PaymentResponse")}} object.
## Syntax
```js-nolint
toJSON()
```
### Parameters
None.
### Return value
A {{jsxref("JSON")}} object that is the serialization of the {{domxref("PaymentResponse")}} object.
## Examples
### Using the toJSON method
In this example, calling `paymentResponse.toJSON()` returns a JSON representation of the `PaymentResponse` object.
```js
payment.show().then((paymentResponse) => {
console.log(paymentResponse.toJSON());
});
```
To get a JSON string, you can use [`JSON.stringify(paymentResponse)`](/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify) directly; it will call `toJSON()` automatically.
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- {{jsxref("JSON")}}
| 0 |
data/mdn-content/files/en-us/web/api | data/mdn-content/files/en-us/web/api/messageport/index.md | ---
title: MessagePort
slug: Web/API/MessagePort
page-type: web-api-interface
browser-compat: api.MessagePort
---
{{APIRef("Channel Messaging API")}}
The **`MessagePort`** interface of the [Channel Messaging API](/en-US/docs/Web/API/Channel_Messaging_API) represents one of the two ports of a {{domxref("MessageChannel")}}, allowing messages to be sent from one port and listening out for them arriving at the other.
`MessagePort` is a [transferable object](/en-US/docs/Web/API/Web_Workers_API/Transferable_objects).
{{AvailableInWorkers}}
{{InheritanceDiagram}}
## Instance methods
_Inherits methods from its parent, {{domxref("EventTarget")}}_.
- {{domxref("MessagePort.postMessage","postMessage()")}}
- : Sends a message from the port, and optionally, transfers ownership of objects to other browsing contexts.
- {{domxref("MessagePort.start","start()")}}
- : Starts the sending of messages queued on the port (only needed when using {{domxref("EventTarget.addEventListener")}}; it is implied when using {{domxref("MessagePort.message_event", "onmessage")}}).
- {{domxref("MessagePort.close","close()")}}
- : Disconnects the port, so it is no longer active.
## Events
_Inherits events from its parent, {{domxref("EventTarget")}}_.
- {{domxref("MessagePort.message_event","message")}}
- : Fired when a `MessagePort` object receives a message.
- {{domxref("MessagePort.messageerror_event","messageerror")}}
- : Fired when a `MessagePort` object receives a message that can't be deserialized.
## Example
In the following example, you can see a new channel being created using the {{domxref("MessageChannel.MessageChannel","MessageChannel()")}} constructor.
When the IFrame has loaded, we register an {{domxref("MessagePort/message_event","onmessage")}} handler for {{domxref("MessageChannel.port1")}} and transfer {{domxref("MessageChannel.port2")}} to the IFrame using the {{domxref("window.postMessage")}} method along with a message.
When a message is received back from the IFrame, the `onMessage` function outputs the message to a paragraph.
```js
const channel = new MessageChannel();
const output = document.querySelector(".output");
const iframe = document.querySelector("iframe");
// Wait for the iframe to load
iframe.addEventListener("load", onLoad);
function onLoad() {
// Listen for messages on port1
channel.port1.onmessage = onMessage;
// Transfer port2 to the iframe
iframe.contentWindow.postMessage("Hello from the main page!", "*", [
channel.port2,
]);
}
// Handle messages received on port1
function onMessage(e) {
output.innerHTML = e.data;
}
```
For a full working example, see our [channel messaging basic demo](https://github.com/mdn/dom-examples/tree/main/channel-messaging-basic) on GitHub ([run it live too](https://mdn.github.io/dom-examples/channel-messaging-basic/)).
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Using channel messaging](/en-US/docs/Web/API/Channel_Messaging_API/Using_channel_messaging)
| 0 |
data/mdn-content/files/en-us/web/api/messageport | data/mdn-content/files/en-us/web/api/messageport/messageerror_event/index.md | ---
title: "MessagePort: messageerror event"
short-title: messageerror
slug: Web/API/MessagePort/messageerror_event
page-type: web-api-event
browser-compat: api.MessagePort.messageerror_event
---
{{APIRef("Channel Messaging API")}}
The **`messageerror`** event is fired on a {{domxref('MessagePort')}} object when it receives a message that can't be deserialized.
This event is not cancelable and does not bubble.
{{AvailableInWorkers}}
## Syntax
Use the event name in methods like {{domxref("EventTarget.addEventListener", "addEventListener()")}}, or set an event handler property.
```js
addEventListener("messageerror", (event) => {});
onmessageerror = (event) => {};
```
## Event type
A {{domxref("MessageEvent")}}. Inherits from {{domxref("Event")}}.
{{InheritanceDiagram("MessageEvent")}}
## Event properties
_This interface also inherits properties from its parent, {{domxref("Event")}}._
- {{domxref("MessageEvent.data")}} {{ReadOnlyInline}}
- : The data sent by the message emitter.
- {{domxref("MessageEvent.origin")}} {{ReadOnlyInline}}
- : A string representing the origin of the message emitter.
- {{domxref("MessageEvent.lastEventId")}} {{ReadOnlyInline}}
- : A string representing a unique ID for the event.
- {{domxref("MessageEvent.source")}} {{ReadOnlyInline}}
- : A `MessageEventSource` (which can be a {{glossary("WindowProxy")}}, {{domxref("MessagePort")}}, or {{domxref("ServiceWorker")}} object) representing the message emitter.
- {{domxref("MessageEvent.ports")}} {{ReadOnlyInline}}
- : An array of {{domxref("MessagePort")}} objects representing the ports associated with the channel the message is being sent through (where appropriate, e.g. in channel messaging or when sending a message to a shared worker).
## Examples
Suppose a script creates a [`MessageChannel`](/en-US/docs/Web/API/MessageChannel) and sends one of the ports to a different browsing context, such as another [`<iframe>`](/en-US/docs/Web/HTML/Element/iframe), using code like this:
```js
const channel = new MessageChannel();
const myPort = channel.port1;
const targetFrame = window.top.frames[1];
const targetOrigin = "https://example.org";
const messageControl = document.querySelector("#message");
const channelMessageButton = document.querySelector("#channel-message");
channelMessageButton.addEventListener("click", () => {
myPort.postMessage(messageControl.value);
});
targetFrame.postMessage("init", targetOrigin, [channel.port2]);
```
The target can receive the port and start listening for messages and message errors on it using code like this:
```js
window.addEventListener("message", (event) => {
const myPort = event.ports[0];
myPort.addEventListener("message", (event) => {
received.textContent = event.data;
});
myPort.addEventListener("messageerror", (event) => {
console.error(event.data);
});
myPort.start();
});
```
Note that the listener must call [`MessagePort.start()`](/en-US/docs/Web/API/MessagePort/start) before any messages will be delivered to this port. This is only needed when using the [`addEventListener()`](/en-US/docs/Web/API/EventTarget/addEventListener) method: if the receiver uses `onmessage` instead, `start()` is called implicitly:
```js
window.addEventListener("message", (event) => {
const myPort = event.ports[0];
myPort.onmessage = (event) => {
received.textContent = event.data;
};
myPort.onmessageerror = (event) => {
console.error(event.data);
};
});
```
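One situation that produces a `messageerror` in practice is a message that fails to deserialize on the receiving side, for example a {{jsxref("SharedArrayBuffer")}} posted to a document that is not allowed to receive shared memory. The sketch below (reusing `myPort` from the sender above) is hedged: whether deserialization fails depends on the cross-origin isolation of the two contexts.

```js
// If the receiver cannot deserialize shared memory, its port fires
// "messageerror" (handled above) instead of "message".
const shared = new SharedArrayBuffer(1024);
myPort.postMessage(shared);
```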
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- Related events: [`message`](/en-US/docs/Web/API/MessagePort/message_event).
- [Using channel messaging](/en-US/docs/Web/API/Channel_Messaging_API/Using_channel_messaging)
---
title: "MessagePort: start() method"
short-title: start()
slug: Web/API/MessagePort/start
page-type: web-api-instance-method
browser-compat: api.MessagePort.start
---
{{APIRef("Channel Messaging API")}}
The **`start()`** method of the {{domxref("MessagePort")}}
interface starts the sending of messages queued on the port. This method is only needed
when using {{domxref("EventTarget.addEventListener")}}; it is implied when using
{{domxref("MessagePort.message_event", "onmessage")}}.
{{AvailableInWorkers}}
## Syntax
```js-nolint
start()
```
### Parameters
None.
### Return value
None ({{jsxref("undefined")}}).
## Examples
In the following code block, you can see a `handleMessage` handler function,
run when a message is sent back to this document using `onmessage`:
```js
channel.port1.onmessage = handleMessage;
function handleMessage(e) {
para.innerHTML = e.data;
}
```
Another option would be to do this using {{domxref("EventTarget.addEventListener")}};
however, when this method is used, you need to explicitly call `start()` to
begin the flow of messages to this document:
```js
channel.port1.addEventListener("message", handleMessage, false);
function handleMessage(e) {
para.innerHTML = e.data;
textInput.value = "";
}
channel.port1.start();
```
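Messages posted before `start()` is called are queued rather than lost, so the exact ordering of these calls is forgiving. A minimal sketch (the port names are illustrative):

```js
const { port1, port2 } = new MessageChannel();
port2.postMessage("queued early"); // held in port1's message queue
port1.addEventListener("message", (e) => {
  console.log(e.data); // "queued early"
});
port1.start(); // queued messages are delivered from this point on
```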
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Using channel messaging](/en-US/docs/Web/API/Channel_Messaging_API/Using_channel_messaging)
---
title: "MessagePort: postMessage() method"
short-title: postMessage()
slug: Web/API/MessagePort/postMessage
page-type: web-api-instance-method
browser-compat: api.MessagePort.postMessage
---
{{APIRef("Channel Messaging API")}}
The **`postMessage()`** method of the
{{domxref("MessagePort")}} interface sends a message from the port, and optionally,
transfers ownership of objects to other browsing contexts.
{{AvailableInWorkers}}
## Syntax
```js-nolint
postMessage(message)
postMessage(message, options)
postMessage(message, transfer)
```
### Parameters
- `message`
  - : The message you want to send through the channel. This can be any value that can be handled by the [structured clone algorithm](/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm), including all basic data types. Multiple data items can be sent as an array.
- `options` {{optional_inline}}
- : An optional object containing a `transfer` field with an [array](/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array) of [transferable objects](/en-US/docs/Web/API/Web_Workers_API/Transferable_objects) to transfer ownership of. The ownership of these objects is given to the destination side and they are no longer usable on the sending side.
- `transfer` {{optional_inline}}
- : An optional [array](/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array) of [transferable objects](/en-US/docs/Web/API/Web_Workers_API/Transferable_objects) to transfer ownership of. The ownership of these objects is given to the destination side and they are no longer usable on the sending side.
### Return value
None ({{jsxref("undefined")}}).
## Examples
In the following code block, you can see a new channel being created using the
{{domxref("MessageChannel.MessageChannel", "MessageChannel()")}} constructor. When the
iframe has loaded, we pass {{domxref("MessageChannel.port2")}} to the iframe using
{{domxref("window.postMessage")}} along with a message. The iframe receives the message
and sends a message back on the `MessageChannel` using `postMessage()`.
The `handleMessage` handler, listening on {{domxref("MessageChannel.port1")}}, then
responds to the message sent back from the iframe, putting it into a paragraph.
```js
const channel = new MessageChannel();
const para = document.querySelector("p");
const ifr = document.querySelector("iframe");
const otherWindow = ifr.contentWindow;
ifr.addEventListener("load", iframeLoaded, false);
function iframeLoaded() {
otherWindow.postMessage("Transferring message port", "*", [channel.port2]);
}
channel.port1.onmessage = handleMessage;
function handleMessage(e) {
para.innerHTML = e.data;
}
// in the iframe…
window.addEventListener("message", (event) => {
  const messagePort = event.ports[0]; // the port transferred with the message
messagePort.postMessage("Hello from the iframe!");
});
```
For a full working example, see our [channel messaging basic demo](https://github.com/mdn/dom-examples/tree/main/channel-messaging-basic) on GitHub ([run it live too](https://mdn.github.io/dom-examples/channel-messaging-basic/)).
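The difference between cloning and transferring is easy to observe: an `ArrayBuffer` listed in the transfer array is detached on the sending side as soon as it is posted. A short sketch:

```js
const { port1, port2 } = new MessageChannel();
const buffer = new ArrayBuffer(32);
port1.postMessage({ payload: buffer }, [buffer]);
// Equivalent: port1.postMessage({ payload: buffer }, { transfer: [buffer] });
console.log(buffer.byteLength); // 0: ownership has moved to the receiver
```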
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Using channel messaging](/en-US/docs/Web/API/Channel_Messaging_API/Using_channel_messaging)
---
title: "MessagePort: message event"
short-title: message
slug: Web/API/MessagePort/message_event
page-type: web-api-event
browser-compat: api.MessagePort.message_event
---
{{APIRef("Channel Messaging API")}}
The **`message`** event is fired on a {{domxref('MessagePort')}} object when a message arrives on that channel.
This event is not cancellable and does not bubble.
{{AvailableInWorkers}}
## Syntax
Use the event name in methods like {{domxref("EventTarget.addEventListener", "addEventListener()")}}, or set an event handler property.
```js
addEventListener("message", (event) => {});
onmessage = (event) => {};
```
## Event type
A {{domxref("MessageEvent")}}. Inherits from {{domxref("Event")}}.
{{InheritanceDiagram("MessageEvent")}}
## Event properties
_This interface also inherits properties from its parent, {{domxref("Event")}}._
- {{domxref("MessageEvent.data")}} {{ReadOnlyInline}}
- : The data sent by the message emitter.
- {{domxref("MessageEvent.origin")}} {{ReadOnlyInline}}
- : A string representing the origin of the message emitter.
- {{domxref("MessageEvent.lastEventId")}} {{ReadOnlyInline}}
- : A string representing a unique ID for the event.
- {{domxref("MessageEvent.source")}} {{ReadOnlyInline}}
- : A `MessageEventSource` (which can be a {{glossary("WindowProxy")}}, {{domxref("MessagePort")}}, or {{domxref("ServiceWorker")}} object) representing the message emitter.
- {{domxref("MessageEvent.ports")}} {{ReadOnlyInline}}
- : An array of {{domxref("MessagePort")}} objects representing the ports associated with the channel the message is being sent through (where appropriate, e.g. in channel messaging or when sending a message to a shared worker).
## Examples
Suppose a script creates a [`MessageChannel`](/en-US/docs/Web/API/MessageChannel) and sends one of the ports to a different browsing context, such as another [`<iframe>`](/en-US/docs/Web/HTML/Element/iframe), using code like this:
```js
const channel = new MessageChannel();
const myPort = channel.port1;
const targetFrame = window.top.frames[1];
const targetOrigin = "https://example.org";
const messageControl = document.querySelector("#message");
const channelMessageButton = document.querySelector("#channel-message");
channelMessageButton.addEventListener("click", () => {
myPort.postMessage(messageControl.value);
});
targetFrame.postMessage("init", targetOrigin, [channel.port2]);
```
The target can receive the port and start listening for messages on it using code like this:
```js
window.addEventListener("message", (event) => {
const myPort = event.ports[0];
myPort.addEventListener("message", (event) => {
received.textContent = event.data;
});
myPort.start();
});
```
Note that the listener must call [`MessagePort.start()`](/en-US/docs/Web/API/MessagePort/start) before any messages will be delivered to this port. This is only needed when using the [`addEventListener()`](/en-US/docs/Web/API/EventTarget/addEventListener) method: if the receiver uses `onmessage` instead, `start()` is called implicitly:
```js
window.addEventListener("message", (event) => {
const myPort = event.ports[0];
myPort.onmessage = (event) => {
received.textContent = event.data;
};
});
```
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- Related events: [`messageerror`](/en-US/docs/Web/API/MessagePort/messageerror_event).
- [Using channel messaging](/en-US/docs/Web/API/Channel_Messaging_API/Using_channel_messaging)
---
title: "MessagePort: close() method"
short-title: close()
slug: Web/API/MessagePort/close
page-type: web-api-instance-method
browser-compat: api.MessagePort.close
---
{{APIRef("Channel Messaging API")}}
The **`close()`** method of the {{domxref("MessagePort")}}
interface disconnects the port, so it is no longer active. This stops the flow of
messages to that port.
{{AvailableInWorkers}}
## Syntax
```js-nolint
close()
```
### Parameters
None.
### Return value
None ({{jsxref("undefined")}}).
## Examples
In the following code block, you can see a `handleMessage` handler function,
run when a message is sent back to this document using
{{domxref("EventTarget.addEventListener")}}.
```js
channel.port1.addEventListener("message", handleMessage, false);
function handleMessage(e) {
para.innerHTML = e.data;
textInput.value = "";
}
channel.port1.start();
```
You could stop messages being sent at any time using:
```js
channel.port1.close();
```
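Closing either port disentangles the channel: messages posted afterwards are simply discarded. A minimal sketch:

```js
const { port1, port2 } = new MessageChannel();
port1.onmessage = (e) => console.log("received:", e.data);
port1.close(); // port1 will no longer receive anything
port2.postMessage("lost"); // discarded: the ports are disentangled
```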
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [Using channel messaging](/en-US/docs/Web/API/Channel_Messaging_API/Using_channel_messaging)
---
title: USBIsochronousInTransferPacket
slug: Web/API/USBIsochronousInTransferPacket
page-type: web-api-interface
status:
- experimental
browser-compat: api.USBIsochronousInTransferPacket
---
{{securecontext_header}}{{APIRef("WebUSB API")}}{{SeeCompatTable}}
The `USBIsochronousInTransferPacket` interface of the [WebUSB API](/en-US/docs/Web/API/WebUSB_API) is part of the response from a call to the `isochronousTransferIn()` method of the `USBDevice` interface. It represents the status of an individual packet from a request to transfer data from the USB device to the USB host over an isochronous endpoint.
## Constructor
- {{domxref("USBIsochronousInTransferPacket.USBIsochronousInTransferPacket", "USBIsochronousInTransferPacket()")}} {{Experimental_Inline}}
- : Creates a new `USBIsochronousInTransferPacket` object with the provided `status` and `data` fields.
## Instance properties
- {{domxref("USBIsochronousInTransferPacket.data")}} {{ReadOnlyInline}} {{ReadOnlyInline}} {{Experimental_Inline}}
- : Returns a `DataView` object containing the data received from the USB device in this packet, if any.
- {{domxref("USBIsochronousInTransferPacket.status")}} {{ReadOnlyInline}} {{ReadOnlyInline}} {{Experimental_Inline}}
- : Returns the status of the transfer request, one of:
- `"ok"` - The transfer was successful.
- `"stall"` - The device indicated an error by generating a stall condition on the endpoint. A stall on an isochronous endpoint does not need to be cleared.
- `"babble"` - The device responded with more data than was expected.
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
---
title: The WebSocket API (WebSockets)
slug: Web/API/WebSockets_API
page-type: web-api-overview
browser-compat: api.WebSocket
---
{{DefaultAPISidebar("WebSockets API")}}
The **WebSocket API** is an advanced technology that makes it possible to open a two-way interactive communication session between the user's browser and a server. With this API, you can send messages to a server and receive event-driven responses without having to poll the server for a reply.
> **Note:** While a WebSocket connection is functionally somewhat similar to standard Unix-style sockets, they are not related.
## Interfaces
- [`WebSocket`](/en-US/docs/Web/API/WebSocket)
- : The primary interface for connecting to a WebSocket server and then sending and receiving data on the connection.
- [`CloseEvent`](/en-US/docs/Web/API/CloseEvent)
- : The event sent by the WebSocket object when the connection closes.
- [`MessageEvent`](/en-US/docs/Web/API/MessageEvent)
- : The event sent by the WebSocket object when a message is received from the server.
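As a quick taste of the API, a minimal client might look like this (the URL is illustrative):

```js
const ws = new WebSocket("wss://example.com/chat");
ws.addEventListener("open", () => ws.send("Hello over WebSocket!"));
ws.addEventListener("message", (event) => console.log(event.data));
ws.addEventListener("close", (event) => console.log("closed:", event.code));
```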
## Guides
- [Writing WebSocket client applications](/en-US/docs/Web/API/WebSockets_API/Writing_WebSocket_client_applications)
- [Writing WebSocket servers](/en-US/docs/Web/API/WebSockets_API/Writing_WebSocket_servers)
- [Writing a WebSocket server in C#](/en-US/docs/Web/API/WebSockets_API/Writing_WebSocket_server)
- [Writing a WebSocket server in Java](/en-US/docs/Web/API/WebSockets_API/Writing_a_WebSocket_server_in_Java)
- [Writing a WebSocket server in JavaScript (Deno)](/en-US/docs/Web/API/WebSockets_API/Writing_a_WebSocket_server_in_JavaScript_Deno)
## Tools
- [AsyncAPI](https://www.asyncapi.com/): A specification for describing event-driven architectures based on protocols like WebSocket. You can use it to describe WebSocket-based APIs just as you would describe REST APIs with the OpenAPI specification. Learn [why you should consider using AsyncAPI with WebSocket](https://www.asyncapi.com/blog/websocket-part1) and [how to do so](https://www.asyncapi.com/blog/websocket-part2).
- [HumbleNet](https://hacks.mozilla.org/2017/06/introducing-humblenet-a-cross-platform-networking-library-that-works-in-the-browser/): A cross-platform networking library that works in the browser. It consists of a C wrapper around WebSockets and WebRTC that abstracts away cross-browser differences, facilitating the creation of multi-user networking functionality for games and other apps.
- [µWebSockets](https://github.com/uNetworking/uWebSockets): Highly scalable WebSocket server and client implementation for [C++11](https://isocpp.org/) and [Node.js](https://nodejs.org).
- [Socket.IO](https://socket.io): A long-polling/WebSocket-based third-party transfer protocol for [Node.js](https://nodejs.org).
- [SocketCluster](https://socketcluster.io/): A pub/sub WebSocket framework for [Node.js](https://nodejs.org) with a focus on scalability.
- [WebSocket-Node](https://github.com/theturtle32/WebSocket-Node): A WebSocket server API implementation for [Node.js](https://nodejs.org).
- [Total.js](https://www.totaljs.com): Web application framework for [Node.js](https://nodejs.org/en/) (Example: [WebSocket chat](https://github.com/totaljs/examples/tree/master/websocket))
- [Faye](https://www.npmjs.com/package/faye-websocket): A {{DOMxRef("WebSocket")}} (two-way connections) and [EventSource](/en-US/docs/Web/API/EventSource) (one-way connections) server and client for [Node.js](https://nodejs.org).
- [SignalR](https://dotnet.microsoft.com/en-us/apps/aspnet/signalr): SignalR will use WebSockets under the covers when it's available, and gracefully fall back to other techniques and technologies when it isn't, while your application code stays the same.
- [Caddy](https://caddyserver.com/): A web server capable of proxying arbitrary commands (stdin/stdout) as a WebSocket.
- [ws](https://github.com/websockets/ws): a popular WebSocket client & server library for [Node.js](https://nodejs.org/).
- [jsonrpc-bidirectional](https://github.com/bigstepinc/jsonrpc-bidirectional): Asynchronous RPC which, on a single connection, may have functions exported on the server and, at the same time, on the client (client may call server, server may also call client).
- [cowboy](https://github.com/ninenines/cowboy): Cowboy is a small, fast and modern HTTP server for Erlang/OTP with WebSocket support.
- [ZeroMQ](https://zeromq.org): ZeroMQ is an embeddable networking library that carries messages across in-process, IPC, TCP, UDP, TIPC, multicast, and WebSocket transports.
- [WebSocket King](https://websocketking.com): A client tool to help develop, test and work with WebSocket servers.
- [PHP WebSocket Server](https://github.com/napengam/phpWebSocketServer): A server written in PHP that handles connections over WebSockets (`wss://` or `ws://`) and normal sockets (`ssl://`, `tcp://`).
- [Channels](https://channels.readthedocs.io/en/stable/index.html): Django library that adds support for WebSockets (and other protocols that require long running asynchronous connections).
- [Flask-SocketIO](https://flask-socketio.readthedocs.io/en/latest/): gives Flask applications access to low latency bi-directional communications between the clients and the server.
- [Gorilla WebSocket](https://pkg.go.dev/github.com/gorilla/websocket): Gorilla WebSocket is a [Go](https://go.dev/) implementation of the WebSocket protocol.
## Specifications
{{Specifications}}
## Browser compatibility
{{Compat}}
## See also
- [RFC 6455 — The WebSocket Protocol](https://datatracker.ietf.org/doc/html/rfc6455)
- [WebSocket API Specification](https://websockets.spec.whatwg.org/)
- [Server-Sent Events](/en-US/docs/Web/API/Server-sent_events)
---
title: Writing WebSocket servers
slug: Web/API/WebSockets_API/Writing_WebSocket_servers
page-type: guide
---
{{DefaultAPISidebar("WebSockets API")}}
A WebSocket server is nothing more than an application listening on any port of a TCP server that follows a specific protocol. The task of creating a custom server tends to scare people; however, it can be straightforward to implement a simple WebSocket server on your platform of choice.
A WebSocket server can be written in any server-side programming language that is capable of [Berkeley sockets](https://en.wikipedia.org/wiki/Berkeley_sockets), such as C(++), Python, {{Glossary("PHP")}}, or [server-side JavaScript](/en-US/docs/Learn/Server-side/Node_server_without_framework). This is not a tutorial in any specific language, but serves as a guide to facilitate writing your own server.
This article assumes you're already familiar with how {{Glossary("HTTP")}} works, and that you have a moderate level of programming experience. Depending on language support, knowledge of TCP sockets may be required. The scope of this guide is to present the minimum knowledge you need to write a WebSocket server.
> **Note:** Read the latest official WebSockets specification, [RFC 6455](https://datatracker.ietf.org/doc/rfc6455/?include_text=1). Sections 1 and 4-7 are especially interesting to server implementors. Section 10 discusses security and you should definitely peruse it before exposing your server.
A WebSocket server is explained on a very low level here. WebSocket servers are often separate and specialized servers (for load-balancing or other practical reasons), so you will often use a [reverse proxy](https://en.wikipedia.org/wiki/Reverse_proxy) (such as a regular HTTP server) to detect WebSocket handshakes, pre-process them, and send those clients to a real WebSocket server. This means that you don't have to bloat your server code with cookie and authentication handlers (for example).
## The WebSocket handshake
First, the server must listen for incoming socket connections using a standard TCP socket. Depending on your platform, this may be handled for you automatically. For example, let's assume that your server is listening on `example.com`, port 8000, and your socket server responds to {{HTTPMethod("GET")}} requests at `example.com/chat`.
> **Warning:** The server may listen on any port it chooses, but if it chooses any port other than 80 or 443, it may have problems with firewalls and/or proxies. Browsers generally require a secure connection for WebSockets, although they may offer an exception for local devices.
The handshake is the "Web" in WebSockets. It's the bridge from HTTP to WebSockets. In the handshake, details of the connection are negotiated, and either party can back out before completion if the terms are unfavorable. The server must be careful to understand everything the client asks for, otherwise security issues can occur.
> **Note:** The request-uri (`/chat` here) has no defined meaning in the spec. So, many people use it to let one server handle multiple WebSocket applications. For example, `example.com/chat` could invoke a multiuser chat app, while `/game` on the same server might invoke a multiplayer game.
### Client handshake request
Even though you're building a server, a client still has to start the WebSocket handshake process by contacting the server and requesting a WebSocket connection. So, you must know how to interpret the client's request. The **client** will send a pretty standard HTTP request with headers that looks like this (the HTTP version **must** be 1.1 or greater, and the method **must** be `GET`):
```bash
GET /chat HTTP/1.1
Host: example.com:8000
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13
```
The client can solicit extensions and/or subprotocols here; see [Miscellaneous](#miscellaneous) for details. Also, common headers like {{HTTPHeader("User-Agent")}}, {{HTTPHeader("Referer")}}, {{HTTPHeader("Cookie")}}, or authentication headers might be there as well. Do whatever you want with those; they don't directly pertain to the WebSocket. It's also safe to ignore them. In many common setups, a reverse proxy has already dealt with them.
> **Note:** All **browsers** send an [`Origin` header](/en-US/docs/Web/HTTP/CORS#origin). You can use this header for security (checking for same origin, automatically allowing or denying, etc.) and send a [403 Forbidden](/en-US/docs/Web/HTTP/Status#403) if you don't like what you see. However, be warned that non-browser agents can send a faked `Origin`. Most applications reject requests without this header.
If any header is not understood or has an incorrect value, the server should send a {{HTTPStatus("400")}} ("Bad Request") response and immediately close the socket. As usual, it may also give the reason why the handshake failed in the HTTP response body, but the message may never be displayed (browsers do not display it). If the server doesn't understand that version of WebSockets, it should send a {{HTTPHeader("Sec-WebSocket-Version")}} header back that contains the version(s) it does understand. In the example above, it indicates version 13 of the WebSocket protocol.
The most interesting header here is {{HTTPHeader("Sec-WebSocket-Key")}}. Let's look at that next.
> **Note:** [Regular HTTP status codes](/en-US/docs/Web/HTTP/Status) can be used only before the handshake. After the handshake succeeds, you have to use a different set of codes (defined in section 7.4 of the spec).
### Server handshake response
When the **server** receives the handshake request, it should send back a special response that indicates that the protocol will be changing from HTTP to WebSocket. That header looks something like the following (remember each header line ends with `\r\n` and put an extra `\r\n` after the last one to indicate the end of the header):
```bash
HTTP/1.1 101 Switching Protocols
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```
Additionally, the server can decide on extension/subprotocol requests here; see [Miscellaneous](#miscellaneous) for details. The `Sec-WebSocket-Accept` header is important in that the server must derive it from the {{HTTPHeader("Sec-WebSocket-Key")}} that the client sent to it. To get it, concatenate the client's `Sec-WebSocket-Key` and the string "`258EAFA5-E914-47DA-95CA-C5AB0DC85B11`" together (it's a "[magic string](https://en.wikipedia.org/wiki/Magic_string)"), take the [SHA-1 hash](https://en.wikipedia.org/wiki/SHA-1) of the result, and return the [base64](https://en.wikipedia.org/wiki/Base64) encoding of that hash.
> **Note:** This seemingly overcomplicated process exists so that it's obvious to the client whether the server supports WebSockets. This is important because security issues might arise if the server accepts a WebSockets connection but interprets the data as a HTTP request.
So if the Key was "`dGhlIHNhbXBsZSBub25jZQ==`", the `Sec-WebSocket-Accept` header's value is "`s3pPLMBiTxaQ9kYGzzhZRbK+xOo=`". Once the server sends these headers, the handshake is complete and you can start swapping data!
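In server-side JavaScript (a Node.js sketch; the function name is illustrative), the derivation is only a few lines, and the sample key below reproduces the value from the handshake above:

```js
const crypto = require("node:crypto");

function makeAcceptKey(secWebSocketKey) {
  const MAGIC = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"; // fixed magic string
  return crypto
    .createHash("sha1")
    .update(secWebSocketKey + MAGIC)
    .digest("base64");
}

console.log(makeAcceptKey("dGhlIHNhbXBsZSBub25jZQ=="));
// "s3pPLMBiTxaQ9kYGzzhZRbK+xOo="
```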
> **Note:** The server can send other headers like {{HTTPHeader("Set-Cookie")}}, or ask for authentication or redirects via other status codes, before sending the reply handshake.
### Keeping track of clients
This doesn't directly relate to the WebSocket protocol, but it's worth mentioning here: your server must keep track of clients' sockets so you don't keep handshaking again with clients who have already completed the handshake. The same client IP address can try to connect multiple times. However, the server can deny them if they attempt too many connections in order to save itself from [Denial-of-Service attacks](https://en.wikipedia.org/wiki/Denial_of_service).
For example, you might keep a table of usernames or ID numbers along with the corresponding {{domxref("WebSocket")}} and other data that you need to associate with that connection.
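For instance, a hedged Node.js sketch of such bookkeeping with a `Map` (names are illustrative):

```js
const clients = new Map(); // userId -> socket
function register(userId, socket) {
  clients.set(userId, socket);
  socket.on("close", () => clients.delete(userId)); // net.Socket "close" event
}
```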
## Exchanging data frames
Either the client or the server can choose to send a message at any time — that's the magic of WebSockets. However, extracting information from these so-called "frames" of data is a not-so-magical experience. Although all frames follow the same specific format, data going from the client to the server is masked using [XOR encryption](https://en.wikipedia.org/wiki/XOR_cipher) (with a 32-bit key). Section 5 of the specification describes this in detail.
### Format
Each data frame (from the client to the server or vice versa) follows this same format:
```bash
Frame format:
0 1 2 3
0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
+-+-+-+-+-------+-+-------------+-------------------------------+
|F|R|R|R| opcode|M| Payload len | Extended payload length |
|I|S|S|S| (4) |A| (7) | (16/64) |
|N|V|V|V| |S| | (if payload len==126/127) |
| |1|2|3| |K| | |
+-+-+-+-+-------+-+-------------+ - - - - - - - - - - - - - - - +
| Extended payload length continued, if payload len == 127 |
+ - - - - - - - - - - - - - - - +-------------------------------+
| |Masking-key, if MASK set to 1 |
+-------------------------------+-------------------------------+
| Masking-key (continued) | Payload Data |
+-------------------------------- - - - - - - - - - - - - - - - +
: Payload Data continued ... :
+ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +
| Payload Data continued ... |
+---------------------------------------------------------------+
```
This means that a frame contains the following bytes:
- First byte:
- bit 0: FIN
- bit 1: RSV1
- bit 2: RSV2
- bit 3: RSV3
- bits 4-7 OPCODE
- Bytes 2-10: payload length (see [Decoding Payload Length](#decoding_payload_length))
- If masking is used, the next 4 bytes contain the masking key (see [Reading and unmasking the data](#reading_and_unmasking_the_data))
- All subsequent bytes are payload
The MASK bit tells whether the message is masked. Messages from the client must be masked, so your server must expect this to be 1. (In fact, [section 5.1 of the spec](https://datatracker.ietf.org/doc/html/rfc6455#section-5.1) says that your server must disconnect from a client if that client sends an unmasked message.) When sending a frame back to the client, do not mask it and do not set the mask bit. We'll explain masking later. _Note: clients must mask messages even when using a secure socket._ RSV1-3 can be ignored; they are for extensions.
The opcode field defines how to interpret the payload data: `0x0` for continuation, `0x1` for text (which is always encoded in UTF-8), `0x2` for binary, and other so-called "control codes" that will be discussed later. In this version of WebSockets, `0x3` to `0x7` and `0xB` to `0xF` have no meaning.
The FIN bit tells whether this is the last message in a series. If it's 0, then the server keeps listening for more parts of the message; otherwise, the server should consider the message delivered. More on this later.
### Decoding Payload Length
To read the payload data, you must know when to stop reading. That's why the payload length is important to know. Unfortunately, this is somewhat complicated. To read it, follow these steps:
1. Read bits 9-15 (inclusive) and interpret that as an unsigned integer. If it's 125 or less, then that's the length; you're **done**. If it's 126, go to step 2. If it's 127, go to step 3.
2. Read the next 16 bits and interpret those as an unsigned integer. You're **done**.
3. Read the next 64 bits and interpret those as an unsigned integer. (The most significant bit _must_ be 0.) You're **done**.
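As a sketch, here is what those three cases look like when reading from a Node.js `Buffer` holding the start of a frame (the function and field names are illustrative):

```js
// Returns the payload length plus the offset of the next header field
// (the masking key, or the payload itself). Assumes `buf` holds enough
// of the frame to read the extended length.
function readPayloadLength(buf) {
  const len7 = buf[1] & 0x7f; // drop the MASK bit, keep the low 7 bits
  if (len7 <= 125) return { length: len7, offset: 2 };
  if (len7 === 126) return { length: buf.readUInt16BE(2), offset: 4 };
  // len7 === 127: 64-bit extended length (most significant bit must be 0)
  return { length: Number(buf.readBigUInt64BE(2)), offset: 10 };
}
```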
### Reading and unmasking the data
If the MASK bit was set (and it should be, for client-to-server messages), read the next 4 octets (32 bits); this is the masking key. Once the payload length and masking key are decoded, you can read that number of bytes from the socket. Let's call the data `ENCODED`, and the key `MASK`. To get `DECODED`, loop through the octets (bytes a.k.a. characters for text data) of `ENCODED` and XOR each octet with the (i modulo 4)th octet of `MASK`. In pseudocode (that happens to be valid JavaScript):
```js
const MASK = [1, 2, 3, 4]; // 4-byte mask
const ENCODED = [105, 103, 111, 104, 110]; // encoded string "hello"
// Create the byte Array of decoded payload
const DECODED = Uint8Array.from(ENCODED, (elt, i) => elt ^ MASK[i % 4]); // XOR each byte with the matching mask byte
```
Now you can figure out what **DECODED** means depending on your application.
### Message Fragmentation
The FIN and opcode fields work together to send a message split up into separate frames. This is called message fragmentation. Fragmentation is only available on opcodes `0x0` to `0x2`.
Recall that the opcode tells what a frame is meant to do. If it's `0x1`, the payload is text. If it's `0x2`, the payload is binary data. However, if it's `0x0`, the frame is a continuation frame; this means the server should concatenate the frame's payload to the last frame it received from that client. Here is a rough sketch, in which a server reacts to a client sending text messages. The first message is sent in a single frame, while the second message is sent across three frames. FIN and opcode details are shown only for the client:
```plain
Client: FIN=1, opcode=0x1, msg="hello"
Server: (process complete message immediately) Hi.
Client: FIN=0, opcode=0x1, msg="and a"
Server: (listening, new message containing text started)
Client: FIN=0, opcode=0x0, msg="happy new"
Server: (listening, payload concatenated to previous message)
Client: FIN=1, opcode=0x0, msg="year!"
Server: (process complete message) Happy new year to you too!
```
Notice the first frame contains an entire message (has `FIN=1` and `opcode!=0x0`), so the server can process or respond as it sees fit. The second frame sent by the client has a text payload (`opcode=0x1`), but the entire message has not arrived yet (`FIN=0`). All remaining parts of that message are sent with continuation frames (`opcode=0x0`), and the final frame of the message is marked by `FIN=1`. [Section 5.4 of the spec](https://datatracker.ietf.org/doc/html/rfc6455#section-5.4) describes message fragmentation.
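The server-side reassembly logic can therefore be small. A hedged sketch (per-connection state; `handleMessage` is an assumed helper, and interleaved control frames are not handled here):

```js
let fragments = []; // payload fragments of the message in progress
function onFrame(fin, opcode, payload) {
  if (opcode === 0x1 || opcode === 0x2) {
    fragments = [payload]; // first (or only) frame of a new message
  } else if (opcode === 0x0) {
    fragments.push(payload); // continuation frame
  }
  if (fin) {
    handleMessage(Buffer.concat(fragments)); // complete message
    fragments = [];
  }
}
```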
## Pings and Pongs: The Heartbeat of WebSockets
At any point after the handshake, either the client or the server can choose to send a ping to the other party. When the ping is received, the recipient must send back a pong as soon as possible. You can use this to make sure that the client is still connected, for example.
A ping or pong is just a regular frame, but it's a **control frame**. Pings have an opcode of `0x9`, and pongs have an opcode of `0xA`. When you get a ping, send back a pong with the exact same Payload Data as the ping (for pings and pongs, the max payload length is 125). You might also get a pong without ever sending a ping; ignore this if it happens.
> **Note:** If you have gotten more than one ping before you get the chance to send a pong, you only send one pong.
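Since server-to-client frames are unmasked and a ping's payload fits in the single length byte, replying is straightforward. A sketch (the `socket` object and `pingPayload` `Buffer` are assumed):

```js
// Build a pong frame: FIN=1, opcode=0xA, unmasked, echoing the ping payload.
function pongFrame(pingPayload) {
  const header = Buffer.from([0x8a, pingPayload.length]); // payload ≤ 125 bytes
  return Buffer.concat([header, pingPayload]);
}
socket.write(pongFrame(pingPayload));
```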
## Closing the connection
To close a connection either the client or server can send a control frame with data containing a specified control sequence to begin the closing handshake (detailed in [Section 5.5.1](https://datatracker.ietf.org/doc/html/rfc6455#section-5.5.1)). Upon receiving such a frame, the other peer sends a Close frame in response. The first peer then closes the connection. Any further data received after the connection is closed is discarded.
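A Close frame carries an optional two-byte status code followed by a UTF-8 reason. A hedged Node.js sketch of building one on the server side (status code 1000 means normal closure; the reason must keep the total payload at 125 bytes or less):

```js
// FIN=1, opcode=0x8 (Close), unmasked; payload = status code + reason.
function closeFrame(code = 1000, reason = "") {
  const reasonBuf = Buffer.from(reason, "utf8");
  const payload = Buffer.alloc(2 + reasonBuf.length);
  payload.writeUInt16BE(code, 0); // status code, big-endian
  reasonBuf.copy(payload, 2);
  return Buffer.concat([Buffer.from([0x88, payload.length]), payload]);
}
```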
## Miscellaneous
> **Note:** WebSocket codes, extensions, subprotocols, etc. are registered at the [IANA WebSocket Protocol Registry](https://www.iana.org/assignments/websocket/websocket.xml).
WebSocket extensions and subprotocols are negotiated via headers during [the handshake](#the_websocket_handshake). Sometimes extensions and subprotocols are very similar, but there is a clear distinction. Extensions control the WebSocket _frame_ and _modify_ the payload, while subprotocols structure the WebSocket _payload_ and _never modify_ anything. Extensions are optional and generalized (like compression); subprotocols are mandatory and localized (like ones for chat and for MMORPG games).
### Extensions
Think of an extension as compressing a file before emailing it to someone. Whatever you do, you're sending the _same_ data in different forms. The recipient will eventually be able to get the same data as your local copy, but it is sent differently. That's what an extension does. WebSockets defines a protocol and a simple way to send data, but an extension such as compression could allow sending the same data but in a shorter format.
> **Note:** Extensions are explained in sections 5.8, 9, 11.3.2, and 11.4 of the spec.
### Subprotocols
Think of a subprotocol as a custom [XML schema](https://en.wikipedia.org/wiki/XML_schema) or [doctype declaration](https://en.wikipedia.org/wiki/Document_Type_Definition). You're still using XML and its syntax, but you're additionally restricted by a structure you agreed on. WebSocket subprotocols are just like that. They do not introduce anything fancy, they just establish structure. Like a doctype or schema, both parties must agree on the subprotocol; unlike a doctype or schema, the subprotocol is implemented on the server and cannot be externally referred to by the client.
> **Note:** Subprotocols are explained in sections 1.9, 4.2, 11.3.4, and 11.5 of the spec.
A client has to ask for a specific subprotocol. To do so, it will send something like this _as part of the original handshake_:
```bash
GET /chat HTTP/1.1
...
Sec-WebSocket-Protocol: soap, wamp
```
or, equivalently:
```bash
...
Sec-WebSocket-Protocol: soap
Sec-WebSocket-Protocol: wamp
```
Now the server must pick one of the protocols that the client suggested and that it supports. If there is more than one, send the first one the client sent. Imagine our server can use both `soap` and `wamp`. Then, in the response handshake, it sends:
```bash
Sec-WebSocket-Protocol: soap
```
> **Warning:** The server can't send more than one `Sec-WebSocket-Protocol` header.
> If the server doesn't want to use any subprotocol, **_it shouldn't send any `Sec-WebSocket-Protocol` header_**. Sending a blank header is incorrect. The client may close the connection if it doesn't get the subprotocol it wants.
If you want your server to obey certain subprotocols, then naturally you'll need extra code on the server. Let's imagine we're using a subprotocol `json`. In this subprotocol, all data is passed as [JSON](https://en.wikipedia.org/wiki/JSON). If the client solicits this protocol and the server wants to use it, the server needs to have a JSON parser. Practically speaking, this will be part of a library, but the server needs to pass the data around.
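The selection logic itself is small. A sketch, assuming the server supports `soap` and `wamp` and receives the client's (possibly comma-separated) header value as a string:

```js
const SUPPORTED = ["soap", "wamp"];
function selectSubprotocol(headerValue) {
  const offered = headerValue.split(",").map((s) => s.trim());
  // Pick the first protocol the client offered that we support.
  // undefined means: send no Sec-WebSocket-Protocol header at all.
  return offered.find((p) => SUPPORTED.includes(p));
}
```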
> **Note:** To avoid name conflict, it's recommended to make your subprotocol name part of a domain string. If you are building a custom chat app that uses a proprietary format exclusive to Example Inc., then you might use this: `Sec-WebSocket-Protocol: chat.example.com`. Note that this isn't required, it's just an optional convention, and you can use any string you wish.
## Related
- [Writing WebSocket client applications](/en-US/docs/Web/API/WebSockets_API/Writing_WebSocket_client_applications)
- [Tutorial: WebSocket server in C#](/en-US/docs/Web/API/WebSockets_API/Writing_WebSocket_server)
- [Tutorial: WebSocket server in Java](/en-US/docs/Web/API/WebSockets_API/Writing_a_WebSocket_server_in_Java)