Why Biden Nominee Miguel Cardona Should Keep Facial Recognition Out of Schools

President-elect Joe Biden on Tuesday announced his intention to nominate Connecticut’s Commissioner of Education, former public school principal Miguel Cardona, for education secretary. The role is a consequential one: Cardona would assume the top position at the Department of Education just as debate has erupted over when and how to safely reopen schools and how to address the disparities exacerbated by the Covid-19 pandemic.

Cardona will face many difficult decisions, assuming he is confirmed by the Senate and takes office. But here’s an easy one: He must do everything in his power to keep facial recognition technology out of our schools.

Cardona has been vocal about racial and class disparities in the education system, and invasive surveillance technologies like facial recognition supercharge those injustices. A major study from the University of Michigan found that the use of facial recognition in education would “exacerbate racism, normalize surveillance and erode privacy, narrow the definition of the ‘acceptable’ student, commodify data, and institutionalize inaccuracy.” The report’s authors recommended an outright ban on using this technology in schools.

They are not alone. The Boston Teachers Union voted to oppose facial recognition in schools and backed a citywide ban on the technology. On Tuesday, New York’s governor signed into law a bill that prohibits public and private schools from using or purchasing facial recognition and other biometric technologies. More than 4,000 parents have signed a letter organized by my organization, Fight for the Future, calling for a ban on facial recognition in schools. The letter warns that automated monitoring will accelerate the school-to-prison pipeline and questions the psychological impact of subjecting children in the classroom to unproven and invasive artificial intelligence technology.

Surveillance breeds conformity and obedience, which harms our children’s ability to learn and be creative.

We have only begun to see the potential harms associated with facial recognition and algorithmic decision-making; deploying these technologies in the classroom amounts to conducting unethical experiments on children. And while we do not yet know their full long-term impact, the current effects of these technologies are — or should be — setting off human rights alarm bells.

Today’s facial recognition algorithms exhibit systemic racial and gender bias, making them more likely to wrongly or unfairly flag people of color, women, and anyone who does not conform to gender stereotypes. They are also less accurate on children. In practice, this means Black and brown students and LGBTQ students, as well as parents, faculty members, and staff members who are Black, brown, and/or LGBTQ, can be stopped and harassed by school police because of false matches, or marked absent from distance learning by automated attendance systems that fail to recognize their humanity. A transgender college student could be locked out of their dorm by a camera that cannot identify them. A student activist group could be tracked down and punished for organizing protests.

Surveillance breeds conformity and obedience, which harms our children’s ability to learn and be creative. And even if the accuracy of facial recognition algorithms improves, the technology remains fundamentally flawed. Experts have compared it to nuclear weapons and lead paint: so harmful that the risks far outweigh any potential benefits.

It is no surprise, then, that schools that have attempted to use facial recognition have faced massive backlash from students and civil rights groups. A student-led campaign last year prompted more than 60 major colleges and universities in the US to say they would not use facial recognition on their campuses. In perhaps the most dramatic reversal, UCLA scrapped its plan to implement facial recognition surveillance on campus, instead enacting a policy banning it altogether.

Deploying these technologies in the classroom amounts to conducting unethical experiments on children.

But despite overwhelming opposition and evidence of harm, facial recognition is still creeping into our schools. Surveillance tech vendors have shamelessly exploited the Covid-19 pandemic to promote their ineffective and discriminatory technology, and school officials desperate to pacify anxious parents and frustrated teachers are buying into promises of quick technological fixes that will not actually make schools safer.

An investigation by Wired found that dozens of school districts had purchased temperature-monitoring devices that also come equipped with facial recognition. One district in Georgia purchased thermal imaging cameras from Hikvision, a company that has since been barred from selling its products in the US because of its complicity in human rights abuses targeting Uyghur people in China.

Privacy-violating technologies are also spreading in districts where students are learning remotely during the pandemic. Horror stories about exam-surveillance apps that use facial recognition, such as Proctorio and Honorlock, have gone viral on social media. Students of color who took the bar exam remotely were forced to shine a bright, headache-inducing light on their faces for the entire two-day test, while data from hundreds of thousands of students who used ProctorU was leaked this summer.

The use of facial recognition in schools should be banned — full stop — and that is a task for legislators. We have seen growing bipartisan interest in Congress, and several prominent lawmakers have proposed a federal ban on law enforcement use of the technology. But legislation takes time to pass; facial recognition companies are already aggressively pushing their software on schools, and children are being subjected to this surveillance right now.

It is only getting worse.

So one of the first things the new Department of Education leader should do is issue guidance to schools on the dangers of facial recognition technology and prevent federal grants from being used to purchase this surveillance technology, which has enormous potential to deepen racial disparities. This is particularly urgent given that both Biden and Cardona have pushed to reopen schools sooner rather than later; the temptation to point to technology as a way to do so safely will be strong. But facial recognition is not magic, and it is not a replacement for masks or social distancing. It will make students, teachers, and parents less safe in the long run.

Trump’s secretary of education, Betsy DeVos, used her post as a bully pulpit to undermine public education and rescind commonsense guidance aimed at protecting students of color and transgender students from systemic discrimination. Cardona is positioned to take a different, and urgently needed, approach. He has said “we need to address inequities in education.” An important first step would be to use his position as secretary of education, once he is confirmed, to oppose the use of technologies that exacerbate and automate the very inequities he wants to end.
