Social media and other browsing records have long been used by police during criminal abortion investigations, a precedent that has drawn new controversy after Dobbs v. Jackson Women’s Health Organization overturned federal abortion protections earlier this year.
In a recent case, police used a teen’s private Facebook conversations to charge her and her mother with violating Nebraska’s abortion ban. Advocates are now raising concerns about the treatment of private data for LGBTQ children, The Guardian reports.
This year alone, legislators have put forward 300 anti-LGBTQ bills, including rules that restrict discussion of sexuality and gender in schools; approximately a dozen have been passed into law. And according to a survey by the Center for Democracy and Technology, one in five LGBTQ students reported that they or a friend had been ‘outed’ without their consent as a result of online student monitoring.
Sens. Ed Markey and Elizabeth Warren (both D-Mass.) warned in an April report that the widespread use of surveillance tools in schools may violate students’ civil rights. The pair argued that by flagging and tracking phrases related to sexual orientation, the software leaves LGBTQ students more likely to face disproportionately high discipline rates and to be outed to their parents without their consent.
Advocates are increasingly concerned that, along with private health discussions surrounding abortion, student conversations about gender and sexuality could end up in the hands of police as a result of strict abortion laws being implemented across the U.S.
Software companies have not taken any steps to determine whether student activity monitoring software disproportionately targets students from marginalized groups, leaving schools in the dark, according to the senators’ report, entitled Constant Surveillance: Implications of Around-the-Clock Online Student Activity Monitoring.
None of the companies the senators contacted have analyzed their products for potential discriminatory bias, even though data indicate that students from marginalized groups, particularly students of color, already face disparities in discipline, and more recent studies show that algorithms are more likely to flag language used by people of color and LGBTQ+ students as problematic.