Navigating the Turbulent Future of AI and Work
Feature Story
By Sara Frueh
Last update May 4, 2026
As artificial intelligence starts to reshape work across many sectors of the economy, students, parents, and educators are grappling with how to prepare young people for the workforce when the future impacts of AI on jobs are still unclear.
“It’s a really nerve-racking time for teachers, for students, for parents, for entire industries,” said Adam Browning of Washington Leadership Academy, a charter school in Washington, D.C., who spoke at a recent workshop hosted by the National Academies’ Action Collaborative on Education and Workforce Trajectories in Tech.
The event gathered participants from education, industry, and philanthropy to explore how AI is affecting education and jobs in and beyond tech — and how organizations can collaborate to steer those impacts.
“AI has challenged us, maybe changed us, and it is us who will shape its direction and trajectory,” said Talitha Washington of Howard University, who chaired the planning committee for the workshop. “We aren’t starting from scratch, and we are the ones who will mold and shape it.”
‘Learning will never stop’
One of the workshop’s panels weighed in on a fraught issue: Given the uncertain future impacts of AI on many jobs, what should young people study in order to be employable, whether in technology or other sectors?
Shabbir Qutbuddin of the School of IT and Entrepreneurship at Ivy Tech Community College advised students to start with their curiosity and interests and to investigate the real-world applications of what they want to study. He also urged them to develop transferable skills such as critical thinking, problem solving, and collaboration.
Qutbuddin pointed to research emphasizing the combination of humanities and technology-related skills, whether in the manufacturing, health care, or business sector. “I think that combination would probably provide students with the most adaptable skills,” he said.
“Even after they graduate [from] high school or college, that learning will never stop — they have to be adaptable to the new changes,” he continued. “It’s going to be consistent upskilling, reskilling, lifelong learning.”
Kristin Lauter of Facebook AI Research (FAIR) Labs North America offered thoughts on what it means to be “job-ready” now, based on what she’s seen at Meta. “Students or job candidates really need to have their thinking caps on and their eyes open for being able to use AI in creative ways,” she said. And in technical roles, coders are expected to use AI to be much more productive.
Lauter stressed that it will still be important for people to develop and maintain deep subject-matter expertise; even when tasks are handed off to AI models, humans need to be able to evaluate what is produced. “I think that we have to critically examine in every field what we really need students to learn,” she said.
‘Opportunities and risks’ in K-12 education
Maya Israel of the University of Florida leads a task force on AI that offers guidance to Florida school districts on how to teach students AI literacy — how AI works and its appropriate uses and limits — and how to incorporate AI into K-12 education.
“We really think about this in terms of balancing opportunities and risks,” said Israel. AI offers opportunities for personalized learning pathways, greater student autonomy, and workforce skill-building, she explained, while risks include threats to academic integrity and the possibility of overreliance on AI.
The Computer Science Teachers Association is revising the standards that identify the computer science content all K-12 students need to know, explained Bryan Twarek, who works for the association. The standards integrate AI across concept areas, from algorithms to programming to systems and security, and they also place a greater emphasis on how computing and society interact, he said.
“If we do this work well, and we can teach ethics and responsibility alongside the technical content, then we can shape the next generation of builders who have those dispositions,” said Twarek. Those developers can ask critical questions: Is AI right for this? Should I be building this? Who will be privileged and who will be harmed by this? “That is a set of skills, dispositions that don’t just happen — they need to be taught, fostered, supported, and that doesn’t really happen now,” he said.
Educators will need more professional development to teach AI, Twarek also stressed. In a recent biennial survey of K-12 teachers, 81% said they believe AI is foundational, but only 42% felt prepared to teach it. “As a field and a community, we need to step up and better support them in order to make these shifts,” he said.
While industry has provided useful resources to districts and teachers, it’s important not to rely on industry alone for professional support and curricula, said Katy Knight of the Siegel Family Endowment. “The purpose of a K-12 education is to produce a productive and well-rounded citizen — someone who has agency, can make their own choices, and has a path forward in life, not necessarily just someone who will be a good worker,” she said.
AI in higher education
Postsecondary institutions, too, are seeking ways to prepare their students for jobs that AI is altering.
“To be employable in the near future is all about adaptability … the durable skills that will help you continue moving forward in your future career,” said Antonio Delgado of Miami Dade College. “But it’s also about the real technical skills that you’ll need to learn to be employable.”
Delgado explained how Miami Dade created the first associate and bachelor’s degrees in applied AI, and then worked with other community colleges and the National Science Foundation to scale the idea. They created the National Applied AI Consortium, which teaches educators about AI and helps community colleges build AI education programs.
“In the next two to three years, we’re going to see more and more need for industry and academia co-development of workforce and talent coming out of our universities and community colleges with the right skills,” he said.
Margie Vela of Machine Learning University at Amazon Web Services described one such collaboration: she works with faculty around the country to help them upskill on AI and machine learning tools.
Amazon is constantly training its employees to keep up with the changes and evolution in the technology, Vela said. “As we train our internal employees, we have scientists that are taking that content and customizing it for our educators.” They share the content on GitHub so that educators can integrate it into their classroom teaching, enabling students to learn the latest AI technologies.
Barbara Grosz of Harvard University stressed the importance of embedding ethics into computer science curricula and teaching. “Every postsecondary computing educational program can teach ethics,” she said.
To that end, philosophers and computer scientists at Harvard collaborated to develop Embedded EthiCS modules that help students learn how to identify problems related to technology and think them through, Grosz said. For example, what is the reward structure of a particular AI system? Is it designed to maximize the benefit for the least well-off, or for the total benefit of society, or to maximize profits? The modules are freely available to everyone through a Creative Commons license, she noted.
‘The change has to be holistic’ in workplaces
The skills needed in workplaces are changing in response to AI, and not only for those directly using AI tools, explained Marachel Knight, who serves on the boards of Marvell Technology and Ericsson AB. In addition to those direct skills, there is a different skill set needed by the technical staff who integrate those tools into corporate systems. Still other skill sets are needed by the engineers responsible for building the tech infrastructure, and the leaders who are responsible for reshaping workflows across the company and setting guidelines for ethical and responsible AI use.
Andrew Puryear of 1AU Technologies also pointed to the broader changes in process and culture that companies need. “The change has to be holistic,” he said. “Unless the culture changes so that every engineer, every employee in your workforce is understanding how to use AI to accelerate their workflows, then there are just going to be tools that are left on the cutting room floor.”
Marachel Knight expressed hope that as companies race to adopt AI, they will not let ethics be a casualty of that speed. “I would like to see us attempt to do both,” she said. “Strategically, AI is a competitive advantage, but safety also is a competitive requirement.”
Next Steps
The Action Collaborative on Education and Workforce Trajectories in Tech is now pursuing ways to advance the ideas and collaborations suggested during the workshop, which was attended online by over 500 people.
“The strong viewer engagement signals the broad interest in and importance of this topic,” commented staff officer Jeena Thomas. “And the workshop underscored the urgency of better aligning education, workforce development, and industry efforts in response to emerging technologies like AI. In the coming months, the collaborative’s member organizations will work to translate the workshop’s discussions into action, and we invite additional organizations to join us in this effort.”
Organizations from education, industry, and philanthropy, and other nonprofits that are interested in joining the Tech Action Collaborative should contact Thomas at JMThomas@nas.edu. A comprehensive summary of the workshop’s discussions will be available this summer.