An experimental HTML5/WebGL library for creating interactive and responsive videos with a graph-based processing pipeline and sequencing timeline.
VideoContext is an experimental HTML5 and WebGL library that provides a graph-based API for creating interactive and responsive videos on the web. It combines a shader-accelerated processing pipeline with a media sequencing timeline, letting developers compose, apply effects to, and transition between video, audio, and images in real time. The library is designed to bring the node-based modularity of the Web Audio API to video processing.
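The graph-based flow can be sketched as a small helper. `composeWithEffect` is a hypothetical function name; the `video`, `effect`, `connect`, `start`, and `stop` calls follow VideoContext's documented API, with `ctx` standing in for a `VideoContext` instance (created in a browser via `new VideoContext(canvas)`).

```javascript
// Sketch of VideoContext's node-graph flow. `composeWithEffect` is a
// hypothetical helper; `ctx` stands in for a VideoContext instance.
function composeWithEffect(ctx, src, effectDefinition) {
  const videoNode = ctx.video(src);                // media source node
  const effectNode = ctx.effect(effectDefinition); // shader-backed processing node
  videoNode.connect(effectNode);                   // source -> effect
  effectNode.connect(ctx.destination);             // effect -> output canvas
  videoNode.start(0);                              // timeline position, in seconds
  videoNode.stop(4);
  return { videoNode, effectNode };
}
```

In a browser you would then call `ctx.play()` to start rendering the graph to the canvas.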
Web developers and creative coders building interactive video experiences, real-time video effects, or complex media sequencing applications in the browser. It's particularly suited for those familiar with shader programming and graph-based architectures.
VideoContext offers unique, low-level control over video rendering through WebGL shaders and a flexible node graph, enabling real-time, interactive video compositions that are difficult to achieve with the standard HTML5 video APIs. Its Web Audio API-inspired design makes it intuitive for developers already experienced with audio processing graphs.
An experimental HTML5 & WebGL video composition and rendering API.
Supports custom GLSL shaders for effects and transitions, enabling high-performance visual processing, as demonstrated in the EffectNode and TransitionNode examples with fragment shader code.
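A custom effect is described as a plain object pairing GLSL shader source with uniform properties, following the definition shape shown in the library's README. The grayscale shader below is an illustrative example written for this sketch, not one of the library's built-in definitions.

```javascript
// Sketch of a custom effect definition in VideoContext's documented shape:
// shader sources plus uniform properties and named texture inputs.
// The grayscale math itself is illustrative, not a built-in effect.
const monochromeDefinition = {
  title: "Monochrome",
  description: "Mixes RGB channels into a single luminance value.",
  vertexShader: `
    attribute vec2 a_position;
    attribute vec2 a_texCoord;
    varying vec2 v_texCoord;
    void main() {
      gl_Position = vec4(vec2(2.0, 2.0) * a_position - vec2(1.0, 1.0), 0.0, 1.0);
      v_texCoord = a_texCoord;
    }`,
  fragmentShader: `
    precision mediump float;
    uniform sampler2D u_image;
    uniform vec3 inputMix;
    varying vec2 v_texCoord;
    void main() {
      vec4 color = texture2D(u_image, v_texCoord);
      float mono = dot(color.rgb, inputMix);   // weighted channel mix
      gl_FragColor = vec4(vec3(mono), color.a);
    }`,
  properties: {
    // Uniform exposed to JavaScript; tweakable at runtime per node.
    inputMix: { type: "uniform", value: [0.299, 0.587, 0.114] },
  },
  inputs: ["u_image"],
};
```

Passing such an object to `ctx.effect(...)` yields an EffectNode whose `inputMix` uniform can be animated from JavaScript.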
Node-based architecture allows modular composition of media sources and effects, inspired by Web Audio API, for complex video sequences, shown in the graph and timeline diagram.
Timeline control with start and stop methods provides precise scheduling of video, audio, and image playback, as illustrated in the demo where videos overlap with timed transitions.
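The overlap arithmetic such a timeline implies can be sketched as plain calculation. `scheduleClips` is a hypothetical helper, not a VideoContext API: it computes start/stop times for clips joined by fixed-length crossfades, which would then be fed to each source node's `start()`/`stop()` calls.

```javascript
// Hypothetical helper: compute timeline start/stop times for a sequence of
// clips joined by crossfades of `fade` seconds, so each clip begins `fade`
// seconds before the previous one ends.
function scheduleClips(durations, fade) {
  const schedule = [];
  let t = 0;
  for (const d of durations) {
    schedule.push({ start: t, stop: t + d });
    t += d - fade; // advance, minus the overlap consumed by the crossfade
  }
  return schedule;
}

// scheduleClips([4, 4], 1)
// -> [{ start: 0, stop: 4 }, { start: 3, stop: 7 }]
```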
Allows creation of custom source nodes, such as for HLS playback or GIF decoding, extending functionality beyond built-in types, as shown in the CustomSourceNode example with hls.js.
Labeled as experimental in the README, indicating potential for breaking changes, limited production readiness, and fewer guarantees for long-term support.
Custom effects require GLSL knowledge, making them less accessible to developers without graphics programming experience, as highlighted in the Writing Custom Effect Definitions section.
API documentation must be generated locally with ESDoc, an extra step compared to pre-generated online docs, as mentioned in the Documentation section.