Since the release of ChatGPT to the public in November 2022, the number of AI tools has skyrocketed, and advocates now tout AI's potential to transform education.
But school districts have been slower to provide teachers with training. As a result, many teachers are experimenting without guidance, an approach that can pose serious risks to student privacy.
To learn about how teachers and other educators can protect student data and abide by the law when using AI tools, Chalkbeat consulted documents and interviewed specialists from school districts, nonprofits, and other groups. Here are nine suggestions from experts.
Consult with your school district about AI
Navigating the details of each tool's privacy policy can be challenging for a teacher. Some districts list tools that they have vetted or with which they have contracts.
Give preference to these tools, if possible. When a tool has a contract with a school or a district, the company is supposed to protect students' data and follow federal and state law, but always check whether your district has any recommendations on how to use the tool. Checking with your school's IT or education technology department is also a good option.
It is also essential to investigate if your school or district has guidelines or policies for the general use of AI. These documents usually review privacy risks and ethical questions.
Check for reviews about AI platforms' safety
Organizations like Common Sense Media and iKeepSafe review ed-tech tools and provide feedback on their safety.
Be careful when platforms say they comply with laws like the Family Educational Rights and Privacy Act, or FERPA, and the Children's Online Privacy Protection Rule. According to the law, the school is ultimately responsible for children's data and must be aware of any information it shares with a third party.
Study the AI platform's privacy policy and terms
The privacy policy and the terms of use should provide some answers about how a company uses the data it collects from you. Make sure to read them carefully, and look for some of the following information:
Advertisement
Advertisement
- What information does the platform collect?
- How does the platform use the collected data? Is it used to determine which ads it will show you? Does it share data with any other company or platform?
- For how long does it keep the collected data?
- Is the data it collects used to train the AI model?
The list of questions that Common Sense Media uses for its privacy evaluations is available online.
You should avoid signing up for platforms that collect a large amount of data or that are not clear in their policies. One potential red flag: vague claims about "retaining personal information for as long as necessary" and "sharing data with third parties to provide services."
Bigger AI platforms can be safer
Big companies like OpenAI, Google, Meta, and others are under more scrutiny: NGOs, reporters, and politicians tend to investigate their privacy policies more frequently. They also have bigger teams and resources that allow them to invest heavily in compliance with privacy regulations. For these reasons, they tend to have better safeguards than small companies or start-ups.
You still have to be careful. Most of these platforms are not explicitly intended for educational purposes, making them less likely to create specific policies regarding student or teacher data.
Use the tools as an assistant, not a replacement
Even though these tools provide better results when you input more information, try to use them for tasks that don't require much information about your students.
AI tools can help provide suggestions on how to ask questions about a book, set up document templates, such as an Individualized Education Program (IEP) or a behavioral assessment, or create assessment rubrics.
But even tasks that seem mundane can increase risk. For example, providing the tool with a list of students and their grades on a specific assignment and asking it to sort the list alphabetically could violate student privacy.
Turn on maximum privacy settings for AI platforms
Some tools allow you to adjust your privacy settings. Look online for tutorials on the most protective settings for the tool you are using and how to activate them. ChatGPT, for example, lets users stop their data from being used to train AI models.
Doing this does not necessarily make AI tools completely safe or compliant with student privacy regulations.
Never input personal information into AI platforms
Even if you take all the steps above, do not input student information. Information that is restricted can include:
- Personal information: a student's name, Social Security number, education ID, names of parents or other relatives, address and phone number, place of birth, or any other information that can be used to identify a student.
- Academic records: reports about absences, grades, and student behavior in school; student work; and teachers' feedback on and assessments of student work.
This may be harder than it sounds.
If teachers upload student work to a platform to get help with grading, for example, they should remove all identification, including the student's name, and replace it with an alias or random number that can't be traced back to the student. It's also wise to check that students haven't included any personal information, such as their place of birth, where they live, or details about their families, friends, religious or political inclinations, sexual orientation, or club affiliations.
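For educators (or district technology staff) comfortable with a bit of scripting, the de-identification step can be partly automated so a name is less likely to slip through. The snippet below is a minimal, hypothetical Python sketch, not something from the guidance cited in this article; the roster, names, and alias scheme are made-up examples, and it only replaces names you list, so a manual check for other personal details is still needed.

```python
import re
import uuid

# Hypothetical example roster; in practice this would come from the class list.
roster = ["Jordan Smith", "Maria Lopez"]

def pseudonymize(text: str, names: list[str]) -> tuple[str, dict]:
    """Replace each listed student name with a random alias and return the mapping."""
    mapping = {}
    for name in names:
        alias = f"Student-{uuid.uuid4().hex[:8]}"  # random ID, not traceable to the student
        mapping[alias] = name
        text = re.sub(re.escape(name), alias, text, flags=re.IGNORECASE)
    return text, mapping

essay = "Jordan Smith wrote about visiting his grandmother every summer."
cleaned, key = pseudonymize(essay, roster)
print(cleaned)  # the alias appears in place of the student's name
# Keep `key` stored locally and never upload it, so feedback can still be
# matched back to the right student afterward.
```

A sketch like this only handles exact name matches; nicknames, misspellings, and other identifying details in the text still require a human read-through before anything is uploaded.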
One exception is platforms that are approved by the school or district and hold contracts with them.
Be transparent with others about using AI
Communicate with your school supervisors, principal, parents, and students about when and how you use AI in your work. That way, everyone can ask questions and bring up concerns you may not know about.
It is also a good way to model behavior for students. For example, if teachers ask students to disclose when they use AI to complete assignments, being transparent in turn about how they themselves use AI can foster a better classroom environment.
If uncertain, ask AI platforms to delete information
In some states, the law says platforms must delete users' information if they request it. And some companies will delete it even if you aren't in one of these states.
Deleting the data can be challenging, and it may not solve all of the problems caused by misusing AI. Some companies may take a long time to respond to deletion requests or find loopholes to avoid deleting the data.
The tips listed above come from the Commonsense Guardrails for Using Advanced Technology in Schools, published by the American Federation of Teachers; the Artificial Intelligence and the Future of Teaching and Learning report by the U.S. Department of Education's Office of Educational Technology; and the list of questions Common Sense Media uses to carry out its privacy evaluations.
Additional help came from Calli Schroeder, senior counsel and global privacy counsel at the Electronic Privacy Information Center; Brandon Wilmart, director of educational technology at Moore Public Schools in Oklahoma; and Anjali Nambiar, education research manager at Learning Collider.
This story was originally published by Chalkbeat. Chalkbeat is a nonprofit news site covering educational change in public schools. Sign up for their newsletters at ckbe.at/newsletters.