Sample Guidance on Staff Use of Generative AI for K-12 School Districts

While many districts work toward appropriate policy language on AI use, this resource, endorsed by the AI Statewide Workgroup, outlines guidelines and recommendations for the safe and productive use of generative AI in K-12 schools.

Generative artificial intelligence (AI) language models can assist with tasks ranging from teaching and learning to writing support to data analysis. District staff who have access to generative AI tools should understand how these tools behave and the potential benefits and limitations associated with their use. This interim guidance outlines recommendations regarding the types of data that may and may not be entered into consumer or commercial generative AI products, with specific considerations for the safety and productivity of K-12 schools. It also offers an overview of limitations to be aware of when using generative AI and describes some current best practices for working with these tools. The creation of future AI usage policies will be handled at the local board level in collaboration with administrators and community stakeholders. This sample guidance document is intended to support school districts and should be modified and reviewed with independent legal counsel as needed prior to implementation.

Recommended Practices:

For district staff using generative AI tools that are not governed by a formal agreement with the district, we recommend the following practices.

Risks and Limitations:

Users of generative AI should also be aware of risks and limitations related to the output these products generate.

It is important for educators to set exemplary standards in their use of AI technology. Staff who use AI are expected to:

Data Stewardship:

All data use must comply with applicable state and federal laws and organizational regulations and requirements, including the district’s acceptable use and data policies. Ethical considerations in alignment with the district’s mission, vision, and values must also be taken into account. Although generative AI products may claim to have some privacy safeguards in place, users should assume that all consumer generative AI products make data publicly available unless an explicit official agreement with the school district indicates otherwise.

In addition to the expectations above, specific types of data should be handled in different ways when using a generative AI product:

Data Types:

By adhering to these disclosure principles, we as educators not only enhance the credibility of our work but also position ourselves as responsible leaders in the educational application of AI technology.

Further guidance on more specific needs, such as handling generative AI in teaching and learning activities, selecting and adopting AI tools, and creating sample syllabus language, will follow as the district continues to explore how to effectively leverage these new tools in a way that meets stakeholder needs while keeping data and users safe.

This document is adapted from Michigan State University’s Interim Guidance on Data Uses and Risks of Generative AI 2023.

Supported by these members of the Michigan Virtual AI Statewide Workgroup:
MASSP: Michigan Association of Secondary School Principals
MEMSPA: Michigan Elementary and Middle School Principals Association