The maximum amount of preceding text, typically measured in tokens, that a model can consider when generating its next output; a longer context window lets the model maintain coherence across longer passages.
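
The effect of a fixed context window can be sketched as a simple truncation: once the input exceeds the window, the oldest tokens are dropped and the model only "sees" the most recent ones. The sketch below uses whitespace-split words as stand-in tokens and a hypothetical `fit_to_context` helper; real systems use a proper tokenizer and the window size is model-specific.

```python
def fit_to_context(tokens, max_context):
    """Keep only the most recent tokens that fit in the window.

    This is a simplified illustration: real models count subword
    tokens, not words, and max_context is fixed by the model.
    """
    if len(tokens) <= max_context:
        return tokens
    # Drop the oldest tokens; everything before the window is invisible
    # to the model when it generates its next output.
    return tokens[-max_context:]

history = "the cat sat on the mat".split()
print(fit_to_context(history, 4))  # → ['sat', 'on', 'the', 'mat']
```

Anything truncated away this way cannot influence generation, which is why long documents can lose coherence once they no longer fit in the window.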