A compact, open-weight text model from IBM's Granite line that punches above its weight at a practical size — 3 billion parameters with a 131k-token context window, notably large for its class. It is a base model, meaning it has not been fine-tuned for instruction following or chat, so it behaves more like a raw language engine than a conversational assistant. Developers typically use base models like this as a starting point for further fine-tuning on specific tasks.