

@twilio/video-processors

twilio · 63.7k · BSD-3-Clause · 3.1.0 · TypeScript support: included

Twilio Video Processors JavaScript Library

Keywords: twilio, webrtc, library, javascript, video, processors, virtual background

readme

Twilio Video Processors

Twilio Video Processors is a collection of video processing tools that can be used with the Twilio Video JavaScript SDK to apply transformations and filters to a VideoTrack.

   See it live here!

Features

Two Video Processors are provided to apply transformations and filters to a person's background: VirtualBackgroundProcessor (virtual background replacement) and GaussianBlurBackgroundProcessor (background blur). You can also use them as a reference for creating your own Video Processors that can be used with the Twilio Video JavaScript SDK, as shown in the sketch below.
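
As a starting point for your own processor, the minimal sketch below implements the processor interface expected by VideoTrack.addProcessor with a simple grayscale filter. The class name is illustrative, and it assumes the default canvas-based frame buffers (see the Usage section and the 3.0.0 changelog notes for the videoframe/bitmaprenderer options used with the built-in processors).

  import { createLocalVideoTrack } from 'twilio-video';

  // Hypothetical custom processor: draws each input frame onto the
  // output canvas with a grayscale filter applied.
  class GrayscaleProcessor {
    processFrame(inputFrameBuffer, outputFrameBuffer) {
      const ctx = outputFrameBuffer.getContext('2d');
      ctx.filter = 'grayscale(100%)';
      ctx.drawImage(inputFrameBuffer, 0, 0, outputFrameBuffer.width, outputFrameBuffer.height);
    }
  }

  createLocalVideoTrack().then(track => {
    track.addProcessor(new GrayscaleProcessor());
  });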

Prerequisites

Note

The Node.js and NPM requirements apply only if you want to check out the source code, build the artifacts, and/or run tests. They do not apply if you are just using this library as a dependency of your project.

Installation

NPM

You can install directly from npm.

npm install @twilio/video-processors --save

Using this method, you can import twilio-video-processors like so:

import * as VideoProcessors from '@twilio/video-processors';

Script tag

You can also copy twilio-video-processors.js from the dist/build folder and include it directly in your web app using a <script> tag.

 <script src="https://my-server-path/twilio-video-processors.js"></script>

Using this method, twilio-video-processors.js will set a browser global:

 const VideoProcessors = Twilio.VideoProcessors;

Assets

In order to achieve the best performance, the VideoProcessors use WebAssembly to run TensorFlow Lite for person segmentation. You need to serve the tflite model, binaries and web workers so they can be loaded properly. These files can be downloaded from the dist/build folder. Check the API docs for details and the examples folder for reference.
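
As a rough sketch, once the files from dist/build are hosted on your server (the URL below is a placeholder), point the processor at that location via the assetsPath option and load the model before use:

  import { GaussianBlurBackgroundProcessor } from '@twilio/video-processors';

  // assetsPath points at wherever the files from dist/build are hosted
  // (tflite model, wasm binaries, and worker scripts). Placeholder URL.
  const blurProcessor = new GaussianBlurBackgroundProcessor({
    assetsPath: 'https://my-server-path/twilio-video-processors/assets'
  });

  // Load the segmentation model before adding the processor to a track.
  await blurProcessor.loadModel();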

CORS

If you are serving the assets from a domain that is different from that of your application, then ensure that the Access-Control-Allow-Origin response header points to your application's domain.
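
For example, if the assets were served by an Express app on a separate host (the paths and host name below are hypothetical), the static middleware could set the header roughly like this:

  const express = require('express');
  const app = express();

  // Serve the files copied from dist/build and allow the application's
  // origin (hypothetical) to fetch them cross-origin.
  app.use('/twilio-video-processors/assets', express.static('assets', {
    setHeaders: (res) => {
      res.setHeader('Access-Control-Allow-Origin', 'https://app.example.com');
    }
  }));

  app.listen(3000);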

Usage

These processors run TensorFlow Lite using the MediaPipe Selfie Segmentation Landscape model and require WebAssembly SIMD support in order to achieve the best performance. We recommend that, when calling Video.createLocalVideoTrack, the video capture constraints be set to a 24 fps frame rate with 640x480 capture dimensions. Higher resolutions can still be used for increased accuracy, but may degrade performance, resulting in a lower output frame rate on low-powered devices.
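
For example, a typical setup following these recommendations looks roughly like this (the assetsPath URL is a placeholder; the AddProcessorOptions values are the ones listed in the 3.0.0 changelog below):

  import { createLocalVideoTrack } from 'twilio-video';
  import { GaussianBlurBackgroundProcessor } from '@twilio/video-processors';

  const blurProcessor = new GaussianBlurBackgroundProcessor({
    assetsPath: 'https://my-server-path/twilio-video-processors/assets' // placeholder
  });

  // Capture at the recommended 640x480 resolution and 24 fps frame rate.
  const track = await createLocalVideoTrack({
    width: 640,
    height: 480,
    frameRate: 24
  });

  await blurProcessor.loadModel();
  track.addProcessor(blurProcessor, {
    inputFrameBufferType: 'videoframe',
    outputFrameBufferContextType: 'bitmaprenderer'
  });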

Best Practice

Please check out the following pages for best practices:

changelog

3.1.0 (April 25, 2025)

New Features

  • Added backward compatibility for environments that do not have access to the WebGL2 API. In such cases, the Pipeline will automatically revert to Canvas2D, ensuring functionality. However, WebGL2 is still preferred and recommended due to its superior performance capabilities.

3.0.0 (January 16, 2025)

Breaking Changes

  • Now requires twilio-video SDK v2.29.0 or later
  • Unified the Canvas2D and WebGL2 pipelines into a single hybrid pipeline
    • Pipelines are now automatically managed
    • Removed BackgroundProcessorOptions.debounce and BackgroundProcessorOptions.pipeline
    • The Pipeline enum is no longer exported
  • Changed frame buffer handling in video processors
    • When adding a VideoProcessor to a VideoTrack, use the following AddProcessorOptions:
      videoTrack.addProcessor(processor, {
        inputFrameBufferType: 'videoframe',
        outputFrameBufferContextType: 'bitmaprenderer'
      });
  • Bumped the minimum Node version to 18.

Features

  • Added web worker support

    • Now supported across all major browsers (Chrome, Firefox, Safari)
    • Cross-domain worker hosting is now supported. Example cross-domain configuration:

      import { GaussianBlurBackgroundProcessor } from '@twilio/video-processors';
      
      /* Application is running at https://example.com/app */
      
      const processor = new GaussianBlurBackgroundProcessor({
        assetsPath: "https://example.net/path/to/assets"
      });

      (Requires proper Access-Control-Allow-Origin headers pointing to your application domain)

Performance

This update improves video processing performance, especially on low-powered devices. Key advantages include:

  • Processing is handled by web workers in all major browsers, reducing main thread blocking.
  • Resource usage is optimized through a unified hybrid pipeline.

3.0.0-beta.1 (December 4, 2024)

  • Web workers are now supported for Firefox and Safari.

3.0.0-preview.2 (September 16, 2024)

  • The web workers can now be hosted on a different domain than that of the application, provided the Access-Control-Allow-Origin response header points to the domain of the application.

    import { GaussianBlurBackgroundProcessor } from '@twilio/video-processors';
    
    /* Application is running at https://appserver.com/app */
    
    const processor = new GaussianBlurBackgroundProcessor({
      assetsPath: "https://assetsserver.com/path/to/assets"
    });

3.0.0-preview.1 (August 13, 2024)

Version 3 of the Video Processors introduces significant enhancements, delivering improved performance on low-powered devices. This version is compatible with twilio-video SDK versions 2.29.0 and later.

API Changes

  • The VideoProcessors now run in web workers on Chromium-based browsers. Support for web workers on other supported browsers is upcoming. When adding a VideoProcessor to a VideoTrack, use the following AddProcessorOptions:
    videoTrack.addProcessor(processor, {
      inputFrameBufferType: 'videoframe',
      outputFrameBufferContextType: 'bitmaprenderer'
    });
  • GaussianBlurBackgroundProcessor and VirtualBackgroundProcessor's processFrame method now accepts an inputFrameBuffer of type VideoFrame.
  • Added the following APIs:
  • The Canvas2D and WebGL2 pipelines are replaced by a single hybrid pipeline. Therefore, the following APIs are no longer available:
    • BackgroundProcessorOptions.debounce
    • BackgroundProcessorOptions.pipeline
    • Pipeline enum exported by @twilio/video-processors

2.2.0 (July 16, 2024)

Performance Improvements

  • The WebGL2 pipeline now has an overall higher output frame rate, even for 720p (HD) video.

Changes

  • BackgroundProcessorOptions.debounce is now set to false by default.
  • BackgroundProcessorOptions.maskBlurRadius now defaults to 8 for the WebGL2 pipeline and 4 for the Canvas2D pipeline (see the sketch after this list).
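
If you prefer the previous behavior, both values can still be set explicitly when constructing a processor. A minimal sketch using the 2.x option names (the assetsPath URL is a placeholder):

    import { GaussianBlurBackgroundProcessor } from '@twilio/video-processors';

    const processor = new GaussianBlurBackgroundProcessor({
      assetsPath: 'https://my-server-path/assets', // placeholder
      debounce: true,      // restore the pre-2.2.0 default
      maskBlurRadius: 5    // override the new pipeline-specific defaults
    });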

Bug Fixes

  • Fixed trailing effect of the person mask in both Canvas2D and WebGL2 pipelines.
  • Fixed a bug where changing the maskBlurRadius value on the VideoProcessor was not working.
  • The TFLite module is now loaded and initialized only once, no matter how many VideoProcessor instances are created.

2.1.0 (December 12, 2023)

  • Previously, the VideoProcessors SDK failed to compile with TypeScript 5.x. This release contains changes to support TypeScript 5.x.
  • Fixed a bug where WebGL2-based VideoProcessors sometimes generated very low output fps, especially on low-powered Intel graphics cards.

2.0.0 (March 21, 2023)

  • The VideoProcessors now work on browsers that do not support OffscreenCanvas. With this release, when used with twilio-video v2.27.0, the Virtual Background feature will work on browsers that support WebGL2. See VideoTrack.addProcessor for details.
  • On Chrome, our tests with 640x480 VideoTracks show up to a 30% reduction in CPU usage when WebGL2 is used instead of Canvas2D. Higher resolutions degrade performance compared to Canvas2D. While we work to support higher resolutions in future releases, we strongly recommend that you set the maximum resolution to 640x480 for WebGL2, or use Canvas2D instead (see the sketch after this list).
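
As a hedged sketch of how the WebGL2 pipeline was typically selected in the 2.x API: BackgroundProcessorOptions.pipeline and the Pipeline enum existed in 2.x (they were removed in 3.0.0, as noted above), while the specific AddProcessorOptions values shown below are assumptions about the 2.x frame buffer settings.

    import { GaussianBlurBackgroundProcessor, Pipeline } from '@twilio/video-processors';

    const processor = new GaussianBlurBackgroundProcessor({
      assetsPath: 'https://my-server-path/assets', // placeholder
      pipeline: Pipeline.WebGL2
    });
    await processor.loadModel();

    // track: a LocalVideoTrack captured at 640x480 (see the recommendation above).
    // Frame buffer types for the WebGL2 pipeline (assumed 2.x values).
    track.addProcessor(processor, {
      inputFrameBufferType: 'video',
      outputFrameBufferContextType: 'webgl2'
    });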

API Changes

NOTES:

  • Although iOS and Android browsers (Safari and Chrome) are supported, the performance of the VideoProcessors is not optimized for mobile browsers at this time. Using the VideoProcessors on a mobile browser may overload the CPU, resulting in a poor-quality video experience.
  • Since desktop Safari and iOS browsers do not support WebAssembly SIMD, it is recommended to use camera input dimensions of 640x480 or lower to maintain an acceptable frame rate for these browsers.

Example

See the following pages for best practices.

Other Changes

  • Removed unused BodyPix-related logic.
  • Removed unnecessary loading of JS files after loading the model.

1.0.2 (November 5, 2021)

Changes

  • Moved @types/node to devDependencies.
  • Fixed an issue where twilio-video-processors threw an exception in server-side rendering applications.

1.0.1 (July 12, 2021)

Bug Fixes

  • Fixed an issue where the following internal classes and interfaces were being exported.
    • BackgroundProcessor
    • BackgroundProcessorOptions
    • GrayscaleProcessor
    • Processor
    • Dimensions

1.0.0 (June 24, 2021)

1.0.0-beta.3 has been promoted to 1.0.0 GA. Twilio Video Processors will use Semantic Versioning 2.0.0 for all future changes. Additionally, this release also includes the following new features and improvements.

  • Added the isSupported API, which can be used to check whether the browser is supported. This API returns true for Chromium-based desktop browsers.

    import { isSupported } from '@twilio/video-processors';
    
    if (isSupported) {
      // Initialize the background processors
    }
  • The processFrame method signature of GaussianBlurBackgroundProcessor and VirtualBackgroundProcessor has been updated in order to improve performance. With this update, the output frame buffer, which is used to draw the processed frame, should now be provided to the processFrame method.

    Old signature:

    processFrame(inputFrame: OffscreenCanvas)
      : Promise<OffscreenCanvas | null>
      | OffscreenCanvas | null;

    New signature:

    processFrame(inputFrameBuffer: OffscreenCanvas, outputFrameBuffer: HTMLCanvasElement)
      : Promise<void> | void;
  • The segmentation model has been changed from MLKit Selfie Segmentation to MediaPipe Selfie Segmentation Landscape to improve performance.

  • Added debounce logic on the image resizing step to improve performance.

1.0.0-beta.3 (May 25, 2021)

Improvements

  • The VideoProcessors now use WebAssembly to run TensorFlow Lite for faster and more accurate person segmentation. You need to deploy the tflite model and binaries so the library can load them properly. Additionally, this improvement requires Chrome's WebAssembly SIMD support in order to achieve the best performance. WebAssembly SIMD can be turned on by visiting chrome://flags on versions 84 through 90. This will be enabled by default on Chrome 91+. You can also enable this on versions 84-90 for your users without turning on the flag by registering for a Chrome Origin Trial for your website.

  • The segmentation model has been changed from BodyPix to MLKit Selfie Segmentation to improve segmentation accuracy.

1.0.0-beta.2 (April 16, 2021)

Improvements

  • The background processors now stabilize the boundary of the foreground (person), thereby reducing the 'shakiness' effect.

1.0.0-beta1 (March 31, 2021)

Background Processors (Desktop Chrome only)

You can now use GaussianBlurBackgroundProcessor to apply a Gaussian blur filter on the background of a video frame, or use VirtualBackgroundProcessor to replace the background with a given image.

  import { createLocalVideoTrack } from 'twilio-video';
  import {
    GaussianBlurBackgroundProcessor,
    VirtualBackgroundProcessor,
  } from '@twilio/video-processors';

  // The blur processor can be constructed right away.
  const blurBackground = new GaussianBlurBackgroundProcessor();

  // The virtual background processor needs the background image to be loaded first.
  const img = new Image();
  let virtualBackground;
  img.onload = () => {
    virtualBackground = new VirtualBackgroundProcessor({ backgroundImage: img });
  };
  img.src = '/background.jpg';

  const setProcessor = (track, processor) => {
    if (track.processor) {
      track.removeProcessor(track.processor);
    }
    track.addProcessor(processor);
  };

  createLocalVideoTrack({
    width: 640,
    height: 480
  }).then(track => {
    document.getElementById('preview').appendChild(track.attach());
    document.getElementById('blur-bg').onclick = () => setProcessor(track, blurBackground);
    document.getElementById('virtual-bg').onclick = () => setProcessor(track, virtualBackground);
  });