Thorsten Joachims named vice provost for AI strategy
By Susan Kelley, Cornell Chronicle
Thorsten Joachims has been named vice provost for artificial intelligence strategy, a newly created position intended to bolster the Cornell AI Initiative.
This new position further expands Cornell’s universitywide effort to advance leadership in research and education in AI, while creating, applying and evaluating AI as a tool across the university – from classrooms and laboratories to clinics and university processes. His appointment took effect Jan. 1.
In the new role, Joachims will coordinate expertise, communication and resources for Cornell’s strategy for artificial intelligence. The initiative brings together all Cornell colleges, as well as Cornell Tech and Weill Cornell Medicine, for the development and application of AI in research, education and operations.
Joachims has directed the Cornell AI Initiative since 2021 and is a Jacob Gould Schurman Professor in the departments of Computer Science and of Information Science in the Cornell Ann S. Bowers College of Computer and Information Science. He served as interim dean of Cornell Bowers from January to October 2025.
The new position greatly expands the role of the AI initiative, reflecting AI’s importance not only in research and teaching, but also as a tool that will change the ways the university operates, said Provost Kavita Bala.
“As AI transforms all academic disciplines and society more broadly, strategically coordinating our AI activities is essential,” said Bala, who led the launch of the Cornell AI Initiative as Cornell Bowers dean in 2021. “With his deep knowledge and experience, Thorsten can empower and support innovation and experimentation across Cornell that advances our mission to improve lives through research, learning, public engagement and clinical care.”
The goal is to develop a strategic approach to AI that leverages Cornell’s unique breadth of expertise, Joachims said.
“We have computational scientists who develop the AI algorithms, social scientists who understand AI’s effects on organizations and society, and domain specialists from materials scientists to veterinarians,” he said. “We have humanists grappling with ethics and creativity, and policymakers and lawyers developing frameworks for society. We can develop and lead a responsible vision for AI in a way that few other institutions can.”
Joachims will collaborate with deans, the Office of the Provost and a new AI Strategy Council that includes Natalie Bazarova, M.S. ’05, Ph.D. ’09, associate vice provost for research and innovation; Steven J. Jackson, vice provost for academic innovation; Ben Maddox, chief information officer; Vinay Varughese, chief information officer for Weill Cornell; and Fei Wang, associate dean for AI and data science at Weill Cornell.
Joachims will encourage and enable faculty, staff and students to experiment with and evaluate AI to discover where it can improve research, education and operations. Key to this effort is the AI Innovation Hub, led by Ayham Boucher and initiated by Maddox. Staff and faculty members with ideas for AI solutions to workplace, research or teaching problems become clients of the hub, where students and staff build AI prototypes to solve those problems. In time, the prototypes can create value and streamline operations for the university.
“But more AI is not always the solution,” Joachims said. “In some cases, innovation means new strategies to keep AI from degrading student learning, research quality and the integrity of Cornell as an organization. Sometimes less will be more.”
On the research front, a series of upcoming “Thought Summits” will identify novel AI research areas, such as AI benchmarks for veterinary medicine and community-centered design methods for AI.
AI is also increasingly used as a tool to conduct research and scholarship, and the initiative will facilitate dialogue to set community norms for responsible use of AI in research. “Where can AI enable new research breakthroughs and improve practice? But also, what happens if people are using AI as part of the peer review process? What does this mean for the rigor and quality of the scientific process?” Joachims said. To address those questions and more, on March 3-6 Cornell is hosting a symposium, Assessing and Imagining the Impact of Generative AI on Science, which will cover how AI is transforming the scientific enterprise.
Cornell will continue to build infrastructure that supports research on AI. Currently that includes Empire AI, a consortium supported by New York state that provides top-tier AI research computing capacity. In the current cycle, 45 faculty members have received funding support for their AI-related projects.
In education, the university aims to meet students’ strong, growing demand for more courses on AI as a subject of study. A recently added undergraduate minor in artificial intelligence allows students from all colleges to add AI expertise to their major curriculum. An additional minor focused on AI in society is under development. It will enable students to learn how AI affects people, organizations and institutions, and will encourage thoughtful analysis of and reflection on AI’s social impacts, both positive and negative.
Cornell will also continue to experiment with AI as a classroom tool to strengthen teaching and learning, an effort led by Jackson. The Center for Teaching Innovation (CTI) has already created substantial resources on the use of generative AI in the classroom. These support faculty who wish to “lean in” to experimental uses of AI in teaching and learning, as well as those who want to “lean out” and keep AI out of the classroom where these tools undermine student learning, for example through oral assessments, assignment redesign and “tech-lite” classroom approaches.
And starting this spring, CTI and Cornell University Library will pilot a new generative AI critical literacy program to enhance students’ understanding of AI tools and their responsible use in academics.
“In the learning environment in particular, it is important to remain experimental, balanced and evidence-based,” Jackson said. “AI tools bring both important benefits and potential harms to student learning, and we’re seeing both. Our ability to navigate these dynamics artfully in the years ahead will be essential to Cornell’s thoughtful leadership in this space.”