Evaluating AI tools for COPPA compliance

By Amit Kothari, December 15, 2024

COPPA - the Children’s Online Privacy Protection Act - gets less attention than FERPA in school conversations. That’s a mistake. For students under 13, COPPA creates specific requirements that many AI tools simply don’t meet.

And the penalties? Up to $50,120 per violation, a cap the FTC adjusts annually for inflation. The FTC isn't joking around.

What COPPA actually requires

COPPA applies when online services collect personal information from children under 13. In a K-8 school context, that’s most of your students.

The law requires:

  • Verifiable parental consent before collecting data
  • Clear privacy policies explaining data practices
  • Reasonable security measures
  • Data minimization (collect only what’s needed)
  • Parental access to review and delete data

Schools can consent on behalf of parents for educational purposes, but only when specific conditions are met. This “school consent” exception is narrower than many administrators realize.

The evaluation framework

When teachers or departments ask about a new AI tool, here’s what we recommend evaluating:

1. Does the vendor explicitly support education use?

Consumer AI tools - the free versions of ChatGPT, Claude, Gemini - generally aren't designed for K-12 use. Their consumer terms of service typically require users to be at least 13, and they may not have appropriate data handling for under-13 users.

Look for:

  • Education-specific product tiers
  • Explicit K-12 or COPPA compliance statements
  • Student data privacy agreements

If the vendor’s website doesn’t mention schools or education at all, that’s your first red flag.

2. What data is actually collected?

Get specific. Ask the vendor directly:

  • What information is collected from students?
  • Is any of it considered “personal information” under COPPA?
  • How long is data retained?
  • Is it used for purposes beyond the immediate educational function?

Vague answers aren’t acceptable. If a vendor can’t clearly explain their data practices, walk away.

3. Does the school consent exception apply?

Under the school consent exception, schools can authorize collection of student data for educational purposes. But this exception has limits.

Schools cannot authorize:

  • Data collection for commercial purposes
  • Sharing with third parties for non-educational use
  • Retention beyond what’s educationally necessary

If the AI tool uses student data for anything beyond immediate educational functionality - including model training - you likely need direct parental consent.

4. What security measures exist?

COPPA requires “reasonable” security. For AI tools handling student data, reasonable should include:

  • Encryption in transit and at rest
  • Access controls limiting who can see student data
  • Audit logs tracking data access
  • Incident response procedures

Ask about SOC 2 reports or similar independent security assessments. Not all vendors have them, but they're becoming standard for education technology.

5. Can data be deleted?

Parents have the right to request deletion of their child’s data. Schools acting on their behalf need to honor this.

Verify that the vendor:

  • Can delete individual student data on request
  • Provides a clear process for deletion requests
  • Actually removes data rather than just anonymizing it

Red flags we’ve seen

In conversations with IT directors evaluating AI tools, certain warning signs come up repeatedly:

“We don’t store any data.” Often false. Even if content isn’t stored, metadata, usage patterns, or account information may be retained. Push for specifics.

“We’re FERPA compliant.” FERPA and COPPA are different laws with different requirements. FERPA compliance doesn’t automatically mean COPPA compliance, especially for under-13 students.

“Our privacy policy covers this.” Read the actual policy. Many are written for consumer use and include broad data use rights that aren’t appropriate for K-12 settings.

Pricing that doesn’t add up. If a powerful AI tool is free, the business model probably involves your data somehow. Education versions with appropriate privacy protections usually cost money.

A practical approach

Here’s what we typically recommend to schools:

Start with the Student Data Privacy Consortium. SDPC maintains a registry of vendors who have signed standardized privacy agreements. It’s not comprehensive, but it’s a useful starting point.

Create an internal evaluation checklist. Don’t rely on teachers to evaluate every tool independently. Have a standard process with specific questions that must be answered before any AI tool is approved.
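As one illustration, that checklist can live in something as simple as a short script, so no question gets skipped before sign-off. This is a minimal sketch in Python with made-up example questions - adapt the questions and approval logic to your district's policy, and treat it as a process aid, not a compliance guarantee.

    # Hypothetical pre-approval checklist for AI tools.
    # The questions below are illustrative examples, not legal requirements.

    REQUIRED_QUESTIONS = [
        "Does the vendor offer an education-specific tier?",
        "Has the vendor signed a student data privacy agreement?",
        "What personal information is collected from students?",
        "How long is student data retained?",
        "Is student data used for model training or other non-educational purposes?",
        "Is data encrypted in transit and at rest?",
        "Is there a documented process for deleting an individual student's data?",
    ]

    def review_tool(tool_name: str, answers: dict) -> bool:
        """Return True only if every required question has a written answer."""
        missing = [q for q in REQUIRED_QUESTIONS if not answers.get(q, "").strip()]
        if missing:
            print(f"{tool_name}: NOT approved. Unanswered questions:")
            for q in missing:
                print(f"  - {q}")
            return False
        print(f"{tool_name}: all questions answered; ready for final sign-off.")
        return True

    # Example: a vendor response with gaps still blocks approval.
    review_tool("ExampleAI", {
        "Does the vendor offer an education-specific tier?": "Yes, EDU plan",
        "How long is student data retained?": "",  # blank or vague answers block approval
    })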

Get agreements in writing. Verbal assurances aren’t sufficient. Require signed data processing agreements that explicitly address COPPA requirements.

Review annually. Vendor practices change. That tool you approved two years ago may have different data handling today.
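Annual reviews are easier to enforce when approval dates live somewhere machine-readable. A small sketch, assuming a simple inventory of approved tools with last-review dates (the tool names and dates here are made up; in practice this might come from a spreadsheet or asset inventory):

    # Hypothetical sketch: flag approved tools whose last review is over a year old.
    from datetime import date, timedelta

    approved_tools = [
        {"name": "ReadingHelper AI", "last_reviewed": date(2023, 9, 1)},
        {"name": "MathTutor EDU", "last_reviewed": date(2024, 8, 15)},
    ]

    REVIEW_INTERVAL = timedelta(days=365)

    for tool in approved_tools:
        overdue = date.today() - tool["last_reviewed"] > REVIEW_INTERVAL
        status = "DUE FOR RE-REVIEW" if overdue else "current"
        print(f"{tool['name']}: last reviewed {tool['last_reviewed']} ({status})")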

When you’re not sure

Some situations genuinely aren’t clear-cut. Is this particular use case covered by the school consent exception? Does this data element count as personal information?

When in doubt, consult with legal counsel familiar with education privacy law. The cost of professional advice is far less than the cost of a COPPA violation.

If you’re a school administrator in St. Louis navigating these questions, we can help you think through the evaluation process. Not as lawyers - we’re not that - but as people who understand both the technology and the regulatory context.