Incorporated GPT-style attention blocks (causal multi-head self-attention with residual connections and pre-layer normalization) into a Transformer architecture, achieving strong results.
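
For illustration, here is a minimal sketch of what such a block typically looks like, assuming PyTorch (>= 2.0). The names `CausalSelfAttention` and `GPTBlock`, and all dimensions, are hypothetical and not taken from the original project; this follows the standard pre-LayerNorm GPT-2 block layout rather than any confirmed detail of the work described above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalSelfAttention(nn.Module):
    """GPT-style masked multi-head self-attention: each position may
    attend only to itself and earlier positions."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)   # fused Q, K, V projection
        self.proj = nn.Linear(d_model, d_model)      # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape to (B, n_heads, T, head_dim) for per-head attention.
        shape = (B, T, self.n_heads, C // self.n_heads)
        q, k, v = (t.view(shape).transpose(1, 2) for t in (q, k, v))
        # is_causal=True applies the lower-triangular (autoregressive) mask.
        y = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        y = y.transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(y)


class GPTBlock(nn.Module):
    """Pre-LayerNorm Transformer block as popularized by GPT-2:
    x + attn(ln(x)), then x + mlp(ln(x))."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = CausalSelfAttention(d_model, n_heads)
        self.ln2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),  # 4x expansion, as in GPT-2
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.attn(self.ln1(x))   # residual around attention
        x = x + self.mlp(self.ln2(x))    # residual around MLP
        return x


if __name__ == "__main__":
    block = GPTBlock(d_model=64, n_heads=4)
    x = torch.randn(2, 16, 64)           # (batch, sequence, embedding)
    print(block(x).shape)                # torch.Size([2, 16, 64])
```

Such blocks are typically stacked N times between a token/position embedding layer and a final language-model head; the causal mask is what distinguishes a GPT-style (decoder-only) block from a bidirectional encoder block.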