An open-source, transformer-based architecture for training large language models, structurally similar to the GPT family of models.
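To make the comparison concrete, the sketch below shows the core residual block of a GPT-style decoder-only transformer: causal self-attention followed by a feed-forward MLP, each wrapped in a pre-norm residual connection. This is an illustrative NumPy toy, not the project's actual code; all function and parameter names (`gpt_block`, `causal_self_attention`, the `params` layout, the ReLU in place of GELU) are assumptions chosen for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def layer_norm(x, eps=1e-5):
    # normalize each token vector to zero mean, unit variance
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def causal_self_attention(x, Wq, Wk, Wv, Wo):
    # x: (seq_len, d_model); single head for simplicity
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = (q @ k.T) / np.sqrt(d)
    # causal mask: position t may only attend to positions <= t
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -1e9
    return softmax(scores) @ v @ Wo

def gpt_block(x, params):
    # pre-norm residual block: attention, then MLP
    x = x + causal_self_attention(layer_norm(x), *params["attn"])
    h = np.maximum(layer_norm(x) @ params["W1"], 0.0)  # ReLU stand-in for GELU
    return x + h @ params["W2"]

# toy usage with random weights
rng = np.random.default_rng(0)
d, T = 8, 4
params = {
    "attn": [0.1 * rng.standard_normal((d, d)) for _ in range(4)],
    "W1": 0.1 * rng.standard_normal((d, 4 * d)),
    "W2": 0.1 * rng.standard_normal((4 * d, d)),
}
x = rng.standard_normal((T, d))
y = gpt_block(x, params)
```

A full model stacks many such blocks over token and position embeddings; the causal mask is what lets the model be trained with a next-token prediction objective over whole sequences at once.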