AI Token Counter - Count Tokens for Prompts & Chat Payloads
Pick a model, paste text or chat JSON, and get instant token counts (and cost estimates) to avoid context-limit errors and control spend.
About this tool · FAQ
Count tokens for AI prompts and OpenAI-style chat payloads before you send them. Paste plain text, a messages array, or a full Chat Completions JSON request and instantly see token counts for popular models with cost estimates. Everything runs locally in your browser.
What is a token?
A token is a chunk of text that a language model processes as a single unit. Tokens are not the same as words: a single word may split into multiple tokens, and punctuation and whitespace count toward the total as well.
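Exact counts require the model's own tokenizer, but a widely cited rule of thumb for English text is roughly four characters per token. A minimal sketch of that heuristic (an approximation only, not a real tokenizer):

```python
import math

def rough_token_estimate(text: str) -> int:
    """Very rough token estimate using the common ~4 characters
    per token heuristic for English text. Real BPE tokenizers will
    differ, especially for code or non-English text."""
    return math.ceil(len(text) / 4)

print(rough_token_estimate("Hello, world!"))  # 13 characters -> 4
```

Use an estimate like this only for ballpark sizing; the counts shown by this tool come from model-aware encodings instead.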
Does this token counter support OpenAI chat payloads?
Yes. You can paste either a messages array (role/content objects) or a full Chat Completions request JSON with a messages field.
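Accepting both shapes comes down to a small normalization step. A sketch of that idea (function and variable names are illustrative, not the tool's actual code):

```python
import json

def extract_messages(payload: str) -> list:
    """Accept either a bare messages array or a full Chat
    Completions request object containing a "messages" field."""
    data = json.loads(payload)
    if isinstance(data, list):
        # Bare array: [{"role": ..., "content": ...}, ...]
        return data
    if isinstance(data, dict) and isinstance(data.get("messages"), list):
        # Full request: {"model": ..., "messages": [...], ...}
        return data["messages"]
    raise ValueError("Expected a messages array or a Chat Completions request")

full = '{"model": "gpt-4o", "messages": [{"role": "user", "content": "Hi"}]}'
print(extract_messages(full))  # [{'role': 'user', 'content': 'Hi'}]
```

Either way, the same role/content objects end up being counted.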
Why do token counts differ between models?
Different model families use different token encodings, which can change how text is split into tokens. This tool counts tokens using model-aware encodings.
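For example, OpenAI's tiktoken library associates model families with named encodings; the mapping below reflects its published pairings (shown here as an assumption about the encodings involved, not this tool's internal table):

```python
# Published tiktoken encoding names per OpenAI model family:
# gpt-4o uses o200k_base; gpt-4 and gpt-3.5-turbo use cl100k_base.
MODEL_ENCODINGS = {
    "gpt-4o": "o200k_base",
    "gpt-4": "cl100k_base",
    "gpt-3.5-turbo": "cl100k_base",
}
# Because the encodings differ, the same text can produce
# different token counts for gpt-4o than for gpt-4.
```

This is why picking the right model before counting matters: a count made with one encoding can be wrong for another.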
Is my prompt uploaded anywhere?
No. All counting runs locally in your browser. Your text is not sent to a server.