Parents Need to Pay Attention to Artificial Intelligence, Too

From personalized tutoring to ethical concerns about privacy and bias, AI is reshaping education in ways that require parent engagement. (Pexels photo by Polina Zimmerman)

by Aziah Siid

Advocates say it’s a game-changer in education, a high-tech tool that can reach students where they are. Opponents say it’s flawed and biased, mostly because its designers don’t have Black students in mind when they program it.

Both sides agree: ready or not, the artificial intelligence revolution has reached the K-12 classroom. Despite what seems to be nationwide attention on AI, it remains a topic many parents don’t know much about, including the racial bias that comes with it.

Ezekiel Dixon-Román, professor of critical race, media, and educational studies at Columbia University Teachers College, is both a father and a digital expert who believes parents should not leave the development and distribution of AI materials solely up to their schools.

Holding AI Companies Accountable

“A lot of parents don’t realize that we do have power,” Dixon-Román says. “We all have the right of refusal,” like the right to opt out of certain assessment or achievement tests.

“We have the right to refuse to be subject to these technologies,” he says. 

There’s no doubt the use of AI has exploded in recent years, powering everything from job applications to internet search engines. But implicit, and sometimes overt, racial bias has shown up in the technology, too, appearing in applications ranging from algorithms used in the criminal justice system to facial recognition systems that misidentify African American faces at rates as much as 100 times higher than white faces.


In the K-12 space, AI-based grading and testing tools have been found to favor students who write in a certain style, often to the disadvantage of non-native English speakers. In addition, the use of AI predictive analytics in schools — such as estimating the percentage of at-risk youth enrolled in a given school — can reinforce racial and socioeconomic stereotypes and inequities.

The rapid integration of AI in education has prompted calls for developers and policymakers to address the digital divide and ensure all students have equitable access to tools and resources. But Dixon-Román says trusting them to make that happen is not enough, and existing disparities could widen if families aren’t diligent about what content their students are exposed to.

“Companies can have the capacity to do and try to build technologies and design systems with equity in mind,” Dixon-Román says. “Whether they feel any pressure from the executive orders or not has nothing to do with whether they want to comply with them.”

Necessity of Teacher Training

Although organizations like the Gates Foundation and Alphabet, parent company of Google, have pledged to create fair and accessible tools, schools and parents should still make an effort to learn about them. 

“Companies have much more leeway to try to work within to build the kind of potential educational technologies that would lead to more equitable learning,” Dixon-Román says. That, he says, includes technologies that would be informative, culturally responsive, and sustain existing pedagogies for effective education of children of color.

“The concern that I have is how many of them will be out here who are actually going to fill that gap — that is going to be standing in a space that’s quite contentious and might also be lonely, if you will,” Dixon-Román says.

Teachers are looking for helpful AI-powered tools, like Brisk Teaching, Quizizz, and EdPuzzle, to generate test questions and assist with grading.

But effective AI integration requires educators to undergo professional development that helps them understand the implications for students, including how students choose to learn. Educators also must recognize, and mitigate, bias in AI-driven assessments and learning materials before incorporating those tools into the classroom.

Power of the Parent Voice

Part of adopting these technologies is informing parents how and why they are being used, and giving them the option to opt out.

The professor himself practiced what he preaches during COVID, when new classroom tools, like ClassDojo, were adopted.

“When my son’s school adopted it several years ago, I shared articles with the teacher, and I told the teacher I do not want this being used by my son, and she complied,” Dixon-Román says. “She was like, ‘I appreciate you also for sharing this and informing me.’ She actually even changed her own practices on how she started to use it because she didn’t know.”

In addition to individual parents’ voices, collective voices, like school boards and parent-led organizations, can also influence whether educators deploy AI in the classroom.

Parent-teacher organizations “have tremendous power,” Dixon-Román says. “PTAs are powerful organizations for every school, and if the PTAs can organize collectively against it, they can also push back on this — on school adoptions of particular educational technologies.”