Tokens, Context Windows, and Scenario Length

DreamGen sends your scenario and recent story or role-play history to the AI as a prompt. That prompt has to fit inside the model's context window. If the scenario is very long, there is less room left for recent story content or role-play context.

This guide explains what tokens are, how context windows work, what warnings and errors mean, and how to shorten a scenario when needed.

What Is a Token?

A token is a small unit of text used by AI models. A token can be a word, part of a word, punctuation, or whitespace.

As a rough rule of thumb, 1 token is about 4 characters of English text. This is only an estimate. Short words, punctuation-heavy text, non-English text, and special formatting can all change the exact token count.
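The rule of thumb above can be sketched in a couple of lines. This is a rough heuristic only, not a real tokenizer; actual token counts vary by model.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters per token heuristic.

    Real tokenizers differ by model; treat this as an order-of-magnitude guess.
    """
    return max(1, round(len(text) / 4))

# "A token is a small unit of text." is 32 characters -> roughly 8 tokens.
print(estimate_tokens("A token is a small unit of text."))  # 8
```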

AI models see text as tokens.

Tokens Are Not Credits

Tokens measure text length. Credits are DreamGen's internal currency for using models. Model usage is usually priced in credits per token, but the two numbers are not the same thing.

For example, a scenario can be short enough to fit inside a model's context window but still cost credits to run. Or a scenario can be too long for a model's context window even if you have credits available.
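The two checks are independent, which a small sketch can make concrete. The context limit and credit rate below are made-up numbers for illustration, not DreamGen's real pricing.

```python
# Hypothetical numbers for illustration only; they are not DreamGen's real rates.
CONTEXT_LIMIT = 5000           # example context-window limit, in tokens
CREDITS_PER_1K_TOKENS = 2.0    # made-up credit rate

def fits_window(prompt_tokens: int) -> bool:
    # Length check: independent of how many credits you have.
    return prompt_tokens <= CONTEXT_LIMIT

def credit_cost(prompt_tokens: int) -> float:
    # Cost check: charged even for prompts that fit comfortably.
    return prompt_tokens / 1000 * CREDITS_PER_1K_TOKENS

# A short scenario fits, yet still costs credits:
print(fits_window(1200), credit_cost(1200))   # True 2.4
# A long scenario exceeds the window even if credits are available:
print(fits_window(8000), credit_cost(8000))   # False 16.0
```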

What Is a Context Window?

A context window is the maximum number of tokens a model can see at once.

On DreamGen, context-window limits depend on:

  • The model you choose
  • Your current subscription tier
  • Whether you are writing a story or playing a role-play

The table below shows the current context-window limits available for supported text models:

| Subscription | GLM 4.7 | Qwen 3.5 27B | Lucid Max (Chonker) | Lucid Base |
| --- | --- | --- | --- | --- |
| Free | 5000 tokens | 5000 tokens | 4096 tokens | 4096 tokens |
| Starter | 5000 tokens | 5000 tokens | 4096 tokens | 4096 tokens |
| Advanced | 8000 tokens | 15000 tokens | 8000 tokens | 8000 tokens |
| Pro | 30000 tokens | 30000 tokens | 15000 tokens | 30000 tokens |

Larger context windows let the model see more text at once, but they are still finite. Once the prompt reaches the model's limit, older or optional context has to be removed, or the request cannot run.
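As an illustration, the limits from the table above can be expressed as a simple lookup. The function is a sketch for this guide, not part of any DreamGen API.

```python
# Context-window limits copied from the table above (in tokens).
LIMITS = {
    "Free":     {"GLM 4.7": 5000,  "Qwen 3.5 27B": 5000,  "Lucid Max (Chonker)": 4096,  "Lucid Base": 4096},
    "Starter":  {"GLM 4.7": 5000,  "Qwen 3.5 27B": 5000,  "Lucid Max (Chonker)": 4096,  "Lucid Base": 4096},
    "Advanced": {"GLM 4.7": 8000,  "Qwen 3.5 27B": 15000, "Lucid Max (Chonker)": 8000,  "Lucid Base": 8000},
    "Pro":      {"GLM 4.7": 30000, "Qwen 3.5 27B": 30000, "Lucid Max (Chonker)": 15000, "Lucid Base": 30000},
}

def prompt_fits(tier: str, model: str, prompt_tokens: int) -> bool:
    """True if a prompt of this size fits the model's window at this tier."""
    return prompt_tokens <= LIMITS[tier][model]

print(prompt_fits("Free", "GLM 4.7", 6000))           # False: over the 5000-token limit
print(prompt_fits("Advanced", "Qwen 3.5 27B", 6000))  # True: within 15000 tokens
```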

What Uses the Context Window?

The context window is shared by several kinds of text:

  • Scenario content, such as plot, setting, style, history, characters, locations, objects, and examples
  • Recent content of the story or role-play

Most scenario content is required. The model needs to see it so it can follow the scenario. Some parts, such as most examples, may be dropped automatically when the model needs more room for recent story or role-play content.

The important tradeoff is simple: the more room your scenario takes, the less room remains for recent content. That can make the model forget recent events sooner, repeat information, or respond as if it has less memory of what just happened.
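The tradeoff amounts to simple subtraction: the window is a fixed budget shared by the scenario and recent content. The limit below is an example value.

```python
CONTEXT_LIMIT = 8000  # example context-window limit, in tokens

def room_for_recent(scenario_tokens: int, limit: int = CONTEXT_LIMIT) -> int:
    """Tokens left for recent story or role-play content after the scenario."""
    return max(0, limit - scenario_tokens)

print(room_for_recent(2000))  # 6000: plenty of room for recent history
print(room_for_recent(7500))  # 500: the model "forgets" recent events sooner
```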

Scenario Length Warnings

When you create or edit a scenario, DreamGen estimates how many tokens the required scenario content uses. If it is getting close to the current model limits, you may see a warning.

Scenario length warnings appear while editing, before you start playing or writing.

Warnings do not always mean the scenario is unusable. They mean the scenario is large enough that it may leave limited space for recent story or role-play content.

You may see messages such as:

  • This scenario is long. It should run, but there may be less room for recent content.
  • This scenario is very long. It may run, but memory can become noticeably worse.
  • This scenario is too long. It will not fit for one or more primary models at your current subscription level.

If you see a warning, the scenario may still work, especially with a larger context window. But shortening it usually improves reliability.
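To make the three warning levels above concrete, here is a sketch of how such tiered warnings could work. The threshold ratios are invented for illustration; DreamGen's actual cutoffs are not documented here.

```python
# Hypothetical thresholds; DreamGen's real warning cutoffs may differ.
def scenario_warning(scenario_tokens: int, limit: int):
    """Return a warning string, or None if the scenario is comfortably sized."""
    ratio = scenario_tokens / limit
    if ratio > 1.0:
        return "too long: will not fit at your current subscription level"
    if ratio > 0.8:
        return "very long: memory can become noticeably worse"
    if ratio > 0.5:
        return "long: less room for recent content"
    return None

print(scenario_warning(3000, 5000))  # long warning (60% of the window)
print(scenario_warning(6000, 5000))  # too-long warning (over the window)
```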

Token Limit Errors

If the required prompt is too large for the selected model, DreamGen cannot start or continue the story or role-play with that model. In that case, you may see an error like this:

A session token error means the prompt cannot fit inside the selected model's context window.

You may also see an error directly in the scenario editor. This does not prevent you from saving the scenario. It means the scenario is too long to play or write with one or more primary models at your current limits.

The editor can warn you when a saved scenario will not fit one or more primary models.

To fix this, use a model or subscription tier with a larger context window, or shorten the scenario.

How to Make Scenarios Shorter

Tip:

Shorter scenarios are not worse by default. Leaving space for the model to infer and invent can make stories and role-plays feel more flexible, surprising, and replayable.

Some of the most popular scenarios are less than 1000 tokens.

Start by deciding what the model truly needs to know up front. Models can interpolate, extrapolate, and improvise from a compact premise. You can have deep stories and role-plays with very short scenarios, as long as the important constraints are clear.

Remove details the model can invent.

If you do not care exactly how a tavern is decorated, what every side character wears, or the full history of a minor faction, leave it out. The model can invent those details during play.

Lean on the model's existing knowledge when it helps.

For fan-fiction scenarios, the model may already know a lot about the universe, characters, places, and tone, especially if the franchise is recent, popular, or heavily discussed online. In those cases, you often do not need to restate every canon detail. Focus the scenario on what is different, what the user should experience, and any rules the model must not miss.

Model knowledge is not guaranteed, and it depends on the model and when it was trained. Include details that are obscure, newly released, easy to confuse, or important to get exactly right.

Make long descriptions more compact.

Verbose prose often costs more tokens than a structured description. Instead of a long paragraph, use compact bullets:

- Looks: Tall, silver hair, formal black coat
- Personality: Patient, secretive, protective of apprentices
- Skills: Court politics, ritual magic, memory charms
- Secret: Knows why the archive was sealed

Prioritize reusable facts over one-time flavor.

Keep details that affect many scenes: relationships, goals, rules of magic, factions, locations, and important constraints. Cut atmospheric text that only matters once.

Shorten character descriptions.

Character fields can grow quickly. Focus on traits that change how the character behaves:

  • What they want
  • How they speak
  • How they treat the user or protagonist
  • What they know that others do not
  • What they are likely to do under pressure