A pure JavaScript implementation of a BPE tokenizer (Encoder/Decoder) for GPT-2 / GPT-3 / GPT-4 and other OpenAI models
Count the number of OpenAI tokens in a string. Supports all OpenAI text models, including GPT-5, GPT-4, GPT-3.5-turbo, and specialized models
JavaScript BPE Encoder/Decoder for GPT-2 / GPT-3. The "gpt-3-encoder" module provides functions for encoding and decoding text using the Byte Pair Encoding (BPE) algorithm. It can be used to process text data for input into machine learning models.
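The core idea behind BPE, which the modules above implement, is to repeatedly merge the most frequent adjacent pair of symbols into a single token. A minimal self-contained sketch of one merge pass (illustrative only, not the gpt-3-encoder implementation; the function names are hypothetical):

```javascript
// Find the most frequent adjacent pair of symbols in the sequence.
function mostFrequentPair(symbols) {
  const counts = new Map();
  for (let i = 0; i < symbols.length - 1; i++) {
    const key = symbols[i] + '\u0000' + symbols[i + 1];
    counts.set(key, (counts.get(key) || 0) + 1);
  }
  let best = null, bestCount = 0;
  for (const [key, n] of counts) {
    if (n > bestCount) { best = key; bestCount = n; }
  }
  return best ? best.split('\u0000') : null;
}

// Merge every occurrence of the pair [a, b] into a single symbol.
function mergePair(symbols, [a, b]) {
  const out = [];
  for (let i = 0; i < symbols.length; i++) {
    if (symbols[i] === a && symbols[i + 1] === b) {
      out.push(a + b);
      i++; // skip the second half of the merged pair
    } else {
      out.push(symbols[i]);
    }
  }
  return out;
}

// One BPE iteration over a toy string.
let symbols = 'aaabdaaabac'.split('');
const pair = mostFrequentPair(symbols); // ['a', 'a'] is the most frequent pair
symbols = mergePair(symbols, pair);
console.log(symbols.join(' ')); // aa a b d aa a b a c
```

With the gpt-3-encoder module itself, `encode(text)` returns an array of token ids and `decode(tokens)` reverses it, so the encoded array's length gives the token count.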
AI text generation plugin for the CE.SDK editor
Venice.ai integration for n8n with fixed thinking control via model suffixes