On April 24, the National Catholic Educational Association announced a new artificial intelligence training initiative that constitutes a “major expansion” of the organization’s relationship with Google for Education, the tech company’s education branch.

Building on N.C.E.A.’s existing partnership with Google, which included the widespread adoption of Chromebooks and the search giant’s “ecosystem” of apps in Catholic schools across the country, the new program aims “to deepen that integration” between the tech company and Catholic education, according to an N.C.E.A. press release. It proposes to include generative A.I., or A.I. that can create new content on its own, with the goal of enhancing “instructional impact and administrative efficiency.”

The announcement coincides with rising concerns among parents across the country over the growing technologization of childhood. The potential harms of screens and digital devices at home and in the classroom, from kindergarteners being handed iPads to middle schoolers scrolling short-form video for hours on end, are at the top of the list.

Google for Education’s “AI Educator Series,” the basis for N.C.E.A.’s A.I. training initiative, launches on May 13 in partnership with the education technology nonprofit and curriculum developer ISTE+ASCD. The series will provide free sessions designed to teach educators about A.I. fundamentals as well as the pedagogical and administrative applications of generative A.I., specifically Google’s Gemini.

Google for Education’s website provides vague examples of what this looks like in practice, ranging from promises to “leverage Gemini to boost student understanding” and enhance “comprehension & creativity” among elementary school students to helping high schoolers achieve workplace readiness “for an AI-first future.” Potential administrative applications include aiding teachers in student assessments “by generating standards-aligned rubrics and consistent feedback frameworks.”

“The idea is to ensure that every Catholic school educator has some sort of foundational A.I. competence,” Steven Cheeseman, the president and C.E.O. of N.C.E.A., said in an interview. “The reality is that A.I. is here and is pretty much everywhere…. We want to make sure that educators are not navigating the tools alone.”

Mr. Cheeseman said that the new initiative “is solely about professional development for teachers” and not about the integration of A.I. into the classroom or the principal’s office. But he also recognized that because educators “will be using Google tools as part of this training…certainly there might be some teachers who decide to adopt things at their schools with their principals or their tech directors.”

“In some respects, I think [the new N.C.E.A. initiative] is an extension of something that has been going on for a long time in education, which is the use of tech monopolies as a means of educating our children about technology. I think this is, in some respects, just kind of a norm in our society,” Nathan Schneider, a professor of media studies at the University of Colorado Boulder, said. “I find it disappointing but not altogether unexpected.”

Mr. Schneider has had to navigate the rise of A.I. in his own teaching and has critically examined the social and political implications of A.I. and other digital technologies in his scholarship.

Mr. Cheeseman emphasized that it is important for the A.I. training to be led in conjunction with Catholic voices and teaching. N.C.E.A. is accepting applications for six Catholic educators with expertise using Google Workspace to help lead the new Catholic “Google Educator Group,” the committee that will spearhead the A.I. literacy training program for K-12 teachers. Once trained, this group will lead instruction efforts in Catholic schools across the country.

He explained that while N.C.E.A. was enthusiastic about a professional development training partnership with Google, the nonprofit’s directors insisted that they wanted a “Catholic cohort” to lead trainings. “We wanted to make sure that there was a real explicit emphasis on the Catholic faith and tradition,” he said.

The “Googlification” of the American classroom has been underway for over a decade. After Google Classroom—a one-stop shop for educators to assign, collect and assess student work—launched in 2014, Google quickly made its way into school districts nationwide.

By 2024, upward of 90 percent of public school districts in the United States were using Google Classroom and/or G Suite for Education regularly. Around 75 percent of districts now regularly use Chromebooks, and globally, over 50 million students and teachers use Google Chromebooks for education. Over 150 million teachers and students actively use Google Classroom.

This adoption has occurred in Catholic schools as well. “Google and the Google suite of programs are used in the majority of Catholic schools around the country,” Mr. Cheeseman said.

Monetization concerns regarding Google’s access to young students are not new, but the introduction of generative A.I. presents new opportunities for recruiting and profiling potential customers—in this instance, schoolchildren.

Some parents and educators have questioned whether the close relationship with Google has had a positive impact on the American education system, or if Google has gotten more out of it than students and taxpayers. One school district in Kansas recently had Chromebook buyer’s remorse: It took back the budget laptops from students in December 2025 after students mostly used them to play video games and watch YouTube, according to The New York Times.

That same report noted that the expansion of digital tools and “one laptop per child” policies, encouraged by tech companies, have not measurably improved education outcomes. In fact, some studies suggest that overreliance on technology can actually hinder learning.

There are also related privacy concerns regarding what kind of data Google and other tech companies get access to. “Google is a company that has a business model built on surveilling its users. That’s been a problem for a long time, and A.I. just deepens it,” Mr. Schneider said. “It further advances the company’s ability to profile and probe the thoughts—not just the relationships or the outputs—of users, how they’re processing through a problem.”

Mr. Cheeseman said that N.C.E.A. leadership raised these concerns with Google representatives “to ensure that there’s nothing that we’re going to be doing to somehow lead to any sort of sharing of data beyond what would be needed at the classroom level for the teacher.”

He acknowledged that concerns over privacy, A.I. and the overreliance on learning tech like Chromebooks are valid. “Schools should not have students spend lots of time on Chromebooks or laptops or iPads,” Mr. Cheeseman said. He also noted that he believes Google is “following the federal guidelines around student privacy.”

Federal regulation around A.I. has thus far been scarce; most data privacy regulation is being enacted piecemeal at the state level, often focusing on specific types of data. In the educational sphere, federal legislation like the Family Educational Rights and Privacy Act of 1974 and the Children’s Online Privacy Protection Act provides safeguards around student data, although how existing laws apply in light of emerging technologies may be up to interpretation.

Potential exceptions and workarounds of F.E.R.P.A. involving third-party technology providers like Google raise serious questions about the extent to which student data is actually protected. And school districts may not be best suited to navigating technically and legally complicated data protection laws: Chalkbeat reported on May 4 that the New York City school district, the largest in the nation, had critical oversights and shortcomings in protecting student privacy.

Regardless of how protected student data is, tech companies still gain from educational partnerships by habituating new users to their platforms. Internal documents from Google that came to light through a child safety lawsuit in California said the company’s education partnerships helped generate a “pipeline of future users” and that exposure to Google Classroom helps “get that loyalty early, and potentially for life,” NBC News reported. Once students leave school, they are likely to continue using ecosystems they are already familiar with, creating opportunities for companies to access and monetize data not protected by student privacy regulations.

Two major federal bills to regulate A.I. and data protection, the American Privacy Rights Act and the American Data Privacy and Protection Act, have failed to gain traction in recent years. A.P.R.A. expired in January and has not been reintroduced. A.D.P.P.A. also failed to see a vote in the House of Representatives. In April, House Republicans introduced the SECURE Data Act, the most recent effort at comprehensive federal data protection legislation.

Mr. Cheeseman and Mr. Schneider agree that it is imperative for Catholic schools to draw on the wealth of Catholic teaching to successfully address the challenges that A.I. raises for education. Mr. Schneider is hopeful that Catholic institutions can rise to the challenge. 

“I think Catholic education is in a position to actually succeed in the sense that it has long rejected the factory model of education,” Mr. Schneider said. “It has had a kind of disciplined orientation to education, grounded in the belief that old things are still valuable.” He believes that educating “the well-rounded person matters more than the industrial specialist.”

He noted that there “are options” for utilizing A.I. outside of the commercial model offered by big tech companies but that they “are not as well-advertised.”

In an economy and culture soon likely to be dominated by A.I., “what matters most is the stuff that Catholic education has done all along,” he said, “which is focus on discernment, the whole person, wisdom and integral human development.”

“The more the world gets automated, those things are the only things that are going to matter.”

Edward Desciak is an O'Hare Fellow at America Media.