Generative AI and Confidential Meetings: What School Leaders Need to Know about Privacy Risks

As AI technology becomes more prevalent in education, school districts are exploring ways to use these tools to streamline administrative tasks, and some districts have already implemented pilot programs with AI platforms.  We have recently received a surge in inquiries from district administrators regarding the use of AI for various educational purposes.  One recurring inquiry is whether it is permissible to use AI to transcribe and translate meetings that are legally protected from general public access or disclosure, such as Individualized Education Program (“IEP”) team meeting discussions, student disciplinary hearings, and counseling sessions.  While the benefits of such technological advances may be appealing, there are important privacy and legal considerations that administrators need to know.

A. How Generative AI Platforms Receive and Process Information

The use of generative AI platforms in connection with confidential meetings essentially involves sharing sensitive student information with third-party companies.  For an AI tool to be effective, it must store and retrieve large volumes of data to ensure fast data access and processing for its users.  That data is stored in the AI company’s databases, data lakes, or cloud storage, which provide the infrastructure needed to support the tool’s analysis.  If the AI tool is being used to take notes, summarize discussions, or generate or modify documentation, it needs access to the actual content being discussed (e.g., a recording of the meeting), and that content is transmitted to the AI company’s servers for processing.

In addition, AI companies may use collected data to train their models so they run more efficiently.  For example, an AI company may use collected data to improve how its systems store, retrieve, and process large volumes of information, making the tool faster in the future.  Essentially, information gathered from meetings regarding students’ learning disabilities, mental health challenges, family circumstances, disciplinary issues, or medical conditions could become incorporated into the company’s databases.  The mere act of processing this information through the AI platform constitutes sharing, regardless of the company’s retention policies.

To be fair, many generative AI platforms’ terms of service include provisions about data collection and model training. However, even if an AI platform claims not to store or train on inputted data, the information is still being shared with that third party.  In addition, even if a company claims to delete the data, it would be difficult to verify what actually happens to it.  While companies typically have security measures in place, no system is completely breach-proof.  A single breach could potentially expose sensitive information about thousands of students.

B. Compliance with Student Privacy Laws

Before jumping on the AI bandwagon, districts need to carefully consider student privacy protections. Schools must comply with several laws protecting student records and data. The two main federal laws that govern student records and data protection are the Family Educational Rights and Privacy Act (“FERPA”) and the Children’s Online Privacy Protection Act (“COPPA”). FERPA is a federal law that protects the privacy of student education records in elementary, secondary, and higher education. FERPA is limited to protecting “education records,” meaning records containing personally identifiable information (“PII”) that are “directly related to a student” and “maintained by an educational agency or institution” or its agent. (34 C.F.R. § 99.3.) Education records generally include student data held by third parties on behalf of a covered educational agency or institution. FERPA prohibits schools from disclosing PII in education records without parental consent unless an exception applies. PII is broadly defined to include information that is linked or linkable to the identity of the student with reasonable certainty.

In addition to FERPA, other federal laws protect student privacy, such as the Individuals with Disabilities Education Act (“IDEA”), the Health Insurance Portability and Accountability Act (“HIPAA”), and the Protection of Pupil Rights Amendment (“PPRA”). IDEA requires the provision of a free appropriate public education, including special education and related services, for eligible children with disabilities. Under this law, parents have data access and destruction rights similar to FERPA, but IDEA establishes a higher standard for schools to maintain for their students with disabilities. This would apply to any student with an IEP. Some states also require schools to employ additional safeguards for student data, such as California’s Student Online Personal Information Protection Act (“SOPIPA”).

Violation of these laws could result in a school, district, or university losing federal funding, and students and their families may have the right to file a complaint with the state and federal departments of education and/or sue for a breach of privacy protection.  The applicability of these laws must be considered before a school, district, or university implements AI for transcription and translation.

C. Recommendations

If your district is interested in using AI in connection with meetings involving confidential student information, there are several important safeguards to consider implementing.  Start by thoroughly vetting vendors—review their privacy policies, require written agreements about data usage and deletion, verify compliance with all relevant privacy laws (including, but not limited to, Education Code section 49073.1), and check for System and Organization Controls (“SOC”) 2 certification or similar security standards.  Developing clear consent forms that explain how AI tools will be used is crucial, and you should provide opt-out options and alternative methods for families who decline AI use.  Practice data minimization by recording only essential information, using pseudonyms or codes instead of student names, and deleting transcripts once they are no longer needed.  Regular audits of vendor compliance, security measures, and data access are also essential to maintaining legally required privacy.

You may also consider alternatives to third-party AI services that might better protect student privacy.  Some districts use their own transcription software that does not share data externally.  Others employ qualified human translators for language needs or electronically record meetings (with consent) and transcribe them internally.  Traditional note-taking methods, while perhaps less efficient, remain a reliable option that does not compromise privacy.  This is particularly important for highly sensitive situations like crisis intervention meetings or discussions of abuse reports where confidentiality is paramount.

The bottom line is that while AI tools offer exciting possibilities for improving school documentation and communication, the privacy risks are significant.  School districts must carefully weigh the benefits against their legal and ethical obligations to protect student privacy.  When in doubt, err on the side of caution: your students' privacy should always come first.  Technology should serve your students, not compromise their confidentiality.  As AI continues to evolve, it will be critical to stay informed about both the opportunities and the risks it presents for educational institutions.

Special thanks to AALRR Law Clerk Brenna Hatcher for assisting in preparing this blog.
