The increasing popularity and accessibility of Generative AI and Large Language Model (LLM) tools such as ChatGPT have raised questions about their academic and administrative use at George Mason.
This guidance is intended to help administrative units understand the security and compliance considerations that apply when using these tools.
For further details about the use of Artificial Intelligence at George Mason University, see the guidelines from the AI Task Force at https://www.gmu.edu/ai-guidelines.
Requirements
- ALL procurement and development of software applications or information services that will use George Mason Protected data¹, integrate with George Mason’s systems², or have a user interface³ must be reviewed and approved by the Architecture Standards Review Board (ASRB). This requirement applies to free and open-source software as well as Generative AI and LLM tools.
- Submitting protected categories of data (as defined by University Policy Number 1114: Data Stewardship) to Generative AI or LLM tools, whether as queries, questions, prompts, or in any other form, is prohibited.
- Departments subscribing to Generative AI tools such as ChatGPT MUST obtain ASRB approval and maintain a documented internal procedure governing that subscription and its associated account management. Upload this procedure when submitting for ASRB review. The procedure must include:
- Primary owner of the subscription
- Secondary owner of the subscription
- How users will be trained and authorized to gain access
- How access will be revoked when users are no longer with the department
- Per Executive Order 46, no employee of any agency of the Commonwealth of Virginia shall download or use the DeepSeek AI application on any government-issued devices, including state-issued cell phones, laptops, or other devices capable of connecting to the internet. The Order further prohibits downloading or accessing the DeepSeek AI app on Commonwealth networks.
Guidance
- Turn off chat history within the tool if that option is available
- Review outputs carefully before use, since they may contain factual errors or biased or inappropriate statements
Additional Guidance for Administrative & IT Departments and ASRB Committee Members
- Data Privacy: Ensure AI tools comply with relevant data protection regulations (e.g., GDPR, CCPA). Implement data anonymization, encryption, and access controls to protect sensitive information.
- Risk Assessment: Conduct a thorough risk assessment to identify potential security vulnerabilities and privacy risks associated with the AI tools. Reach out to [email protected] for assistance with the risk assessment.
- Data Governance: Establish clear data governance policies and procedures for collecting, storing, and processing data with AI tools. Ensure data is accurate, complete, and up-to-date.
- Compliance Monitoring: Regularly monitor and audit AI tools for compliance with security and privacy regulations. Implement mechanisms for reporting and resolving non-compliance issues.
- User Training: Provide training on security best practices and the responsible use of AI tools to staff and students to mitigate risks. George Mason has partnered with LinkedIn Learning, which offers multiple courses on this topic. These courses can be accessed by signing in at https://lil.gmu.edu/.
- Vendor Management: If using third-party AI tools, ensure vendors comply with security and privacy requirements and have robust security measures.
- Incident Response: Develop and maintain an incident response plan to quickly address and mitigate security breaches or data leaks involving AI tools. Information Security-related incidents can be reported using the form available on the Computer Security Incident Response Team (CSIRT) page.
- Ethical Considerations: Consider the ethical implications of using AI tools, such as bias in algorithms or unintended consequences. Implement measures to mitigate these risks.
- Legal Compliance: Ensure that the use of AI tools complies with all applicable laws and regulations, including intellectual property rights and licensing agreements.
- Continuous Improvement: Regularly evaluate and improve security and compliance measures to adapt to evolving threats and regulations.
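As a concrete illustration of the data-anonymization step mentioned in the Data Privacy guidance above, the sketch below redacts a few common identifiers (SSNs, email addresses, phone numbers) from text before it is submitted as a prompt. The patterns and placeholder labels are illustrative assumptions, not a sanctioned George Mason tool, and pattern-based redaction alone does not make Protected data safe to submit.

```python
import re

# Illustrative patterns only; real Protected data (per Policy 1114) must not
# be submitted to Generative AI tools even after redaction.
PATTERNS = {
    "[REDACTED-SSN]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "[REDACTED-EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "[REDACTED-PHONE]": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier with its placeholder label."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(label, text)
    return text
```

A department might run prompts through such a filter as one layer of its internal governance procedure, alongside access controls and user training.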
By following these guidelines, university departments can effectively leverage AI tools while ensuring security and compliance with relevant regulations.
References
Responsible Use of Computing – University Policy (gmu.edu)
AI-Education-Guidelines.pdf (virginia.gov)
Governor Glenn Youngkin Bans DeepSeek AI
¹ Per University Policy Number 1114: Data Stewardship, Protected data includes two subcategories: Highly Sensitive and Restricted.
- Highly Sensitive: Data that (1) by their personal nature can lead to identity theft or exposure of personal health information, or (2) a researcher, funding agency, or other research partner has identified as highly sensitive or otherwise requiring a high level of security protection. Some examples are data classified as secret by the Federal government, data that is often involved in identity theft (e.g., SSNs), data described in the Health Insurance Portability and Accountability Act (HIPAA) as needing to be secured, and data that could lead to financial theft (e.g., credit card information). See Appendix A for a list of the data types classified as Protected Data – Highly Sensitive.
- Restricted: Data that, by their very nature or regulation, is private or confidential and must not be disclosed except to a previously defined set of authorized users. Some examples are data defined as confidential by the Family Educational Rights and Privacy Act (FERPA), employee performance evaluations, confidential donor information, some research data, minutes from confidential meetings, accusations of misconduct, or any other information that has been identified by the University, its contractors or funding agencies, or Federal or State regulations, as private or confidential and not to be disclosed.
² Per University Policy Number 1307: Procurement and/or Development of Administrative Systems/Applications, university systems include applications, utilities, network, storage, compute, databases, and similar George Mason-owned and/or -operated assets, on premises or in the cloud.
³ Frequently Asked Questions – Assistive Technology Initiative (gmu.edu)