The Marque

Noam Shazeer

Gemini Co-Lead & VP Engineering

Google

Palo Alto, CA, USA



Noam Shazeer’s Biography

Noam Shazeer is a computer scientist and entrepreneur currently serving as Vice President of Engineering and Co-Lead of Google Gemini, Google’s strategic AI initiative.

He oversees the development of next-generation large language models and AI systems, guiding research and engineering teams in building scalable infrastructure, multimodal models, and advanced conversational AI technologies.

In 2021, predating the launch of ChatGPT, Noam co-founded Character.AI, an early chat platform built on Transformer technology that enables people to create and interact with AI-driven characters across entertainment, education, and creative use cases.

He has played a pivotal role in foundational AI development, beginning with the Sparsely-Gated Mixture-of-Experts model in 2016, followed by the design of the Transformer’s multi-head attention and residual architecture. In 2018, he introduced Mesh-TensorFlow to support training very large Transformers and later contributed to the T5 text-to-text model in 2019.

Noam also played a central role in Google’s LaMDA conversational AI project, helping define large-scale dialogue system architectures. His work has been recognized with awards including the 2023 WTF Innovators Award for contributions that advance emerging technologies.

On joining Google in 2000, he initially built Google's state-of-the-art spelling-correction system and later developed the PHIL algorithm, which became a core component of Google AdSense. His career encompasses deep technical research, large-scale system design, and leadership in AI product development.

Noam holds a Bachelor of Science in Mathematics and Computer Science from Duke University.

Noam Shazeer’s Background

Google
Gemini Co-Lead & VP Engineering
Aug 2024 - Present
Palo Alto, CA, USA
Google Gemini is Google’s next-generation AI program, focused on developing advanced large language models and multimodal systems that integrate cutting-edge research with scalable, real-world applications.
 
Character.AI
Chief Executive Officer
Nov 2021 - Aug 2024
Character.AI is a platform that allows people to create and interact with customizable AI characters powered by advanced large language models.
 

Noam Shazeer’s Gallery

Interviews

Dwarkesh Podcast | Jeff Dean & Noam Shazeer on 25 years at Google: from PageRank to AGI
Podcast
Feb 2025
Noam Shazeer and Jeff Dean reflect on their careers at Google, detailing the evolution of AI technologies from foundational systems to state-of-the-art models like the Transformer and Gemini.
 
Unsupervised Learning | Noam Shazeer and Jack Rae: Scaling Test-time Compute, Reactions to Ilya & AGI
Podcast
Mar 2025
In this episode, Noam Shazeer and Jack Rae discuss how scaling test-time compute can enhance AI reasoning. They also share their reactions to recent AGI discussions and their views on the direction of advanced model development.
 

Publications

Attention Is All You Need
Author
Jun 2017
“Attention Is All You Need” is Noam’s seminal paper introducing the Transformer architecture, which revolutionized natural language processing by enabling models to efficiently capture long-range dependencies without recurrent structures.
 
PaLM: Scaling language modeling with pathways
Author
Apr 2023
Noam Shazeer’s paper “PaLM: Scaling Language Modeling with Pathways” presents a method for efficiently training extremely large language models using the Pathways system, demonstrating how scaling model size and data can improve performance across a wide range of language understanding and generation tasks.
 

Awards & Recognition

Time100 AI List
Sep 2023
Noam was named one of the 100 Most Influential People in AI in 2023, acknowledging his leadership and transformative contributions to the development of advanced artificial intelligence technologies.
 
WTF Innovators Award
Award for contributions that advance emerging technologies
Jun 2023
The WTF Innovators Award, which debuted with a cohort of AI pioneers, recognizes individuals pushing the boundaries of technology and society whose work is helping define the next era of innovation.
 

Speaking Engagements

Mesh-TensorFlow: Model Parallelism for Supercomputers (TF Dev Summit ‘19)
Presenter
Noam explains how Mesh-TensorFlow enables scalable neural network training by expressing model parallelism through tensor dimensions mapped across large, distributed computing systems.
 
Keynote: Predictions for the Next Phase of AI
Keynote
Aug 2025
In this keynote, Noam outlines how future AI systems will be shaped by advances in model architecture, compute efficiency, and large-scale infrastructure design.
 

Noam Shazeer’s Education

Duke University
Bachelor of Science in Mathematics and Computer Science
1994 - 1998