Strategies for Technology Evaluation:

  1. Research and Identify AI Tools: Conduct thorough research to identify various AI tools that align with course goals. Look for tools with good reviews, user feedback, and proven track records in education.
  2. Assess Features and Functionality: Review the features and functionalities of each AI tool. Ensure that they align with your specific learning objective(s) and enhance the learning experience.
  3. User Interface and Experience: Test the user interface of the AI tool to ensure it is intuitive and user-friendly. A complicated interface can hinder student engagement and learning.
  4. Data Privacy and Security: Evaluate the AI tool’s data privacy and security measures. Ensure that student data is protected and that the tool complies with relevant privacy regulations.
    1. Data Collection and Storage: Determine what data the AI tool collects from students and how it is stored. Ensure that personally identifiable information (PII) and sensitive data are handled securely and that data retention policies comply with relevant regulations.
      1. Determine whether the data collected is used to train the tool and what impact this may have on your teaching practice or students.
    2. Vendor Policies and Agreements: Carefully review the privacy policy and terms of service of the AI tool provider to understand how they handle student data and what responsibilities they hold.
    3. Data Sharing: Check whether the AI tool shares student data with third parties or aggregates data across institutions. Be cautious about tools that may share data without explicit consent or for purposes beyond the scope of the educational context.
    4. Data Anonymization and De-identification: Verify whether the AI tool anonymizes or de-identifies student data to protect student privacy. This limits the harm from data breaches and unauthorized access.
    5. Access Controls: Check the access controls and permissions of the AI tool. Instructors should have access only to the data necessary for teaching, while students should retain appropriate control over their personal information.
    6. GDPR and Compliance: If the AI tool operates in or collects data from users in the European Union, ensure that it complies with the General Data Protection Regulation (GDPR) and other relevant data protection laws.
    7. Security Audits and Certifications: Inquire whether the AI tool provider undergoes regular security audits and holds relevant certifications to ensure that their data protection practices meet industry standards.
    8. Incident Response and Data Breach Policies: Understand the AI tool provider’s incident response plan and data breach policies. Confirm that they have processes in place to handle any potential security breaches promptly and responsibly.
    9. Data Ownership and Portability: Clarify who owns the data generated through the AI tool and ensure that students have the right to access and export their data if needed.
  5. Compatibility and Integration: Check whether the AI tool integrates seamlessly with the existing learning management system or, if not, whether it can be easily accessed on its own.
  6. Vendor Reputation and Support: Research the reputation of the AI tool’s vendor. Consider factors like customer support, ongoing updates, and responsiveness to issues or concerns.
  7. Instructor Training and Support: Consider what training and support are provided to help instructors use the AI tool effectively.
  8. Institutional Approval and Policy Compliance: Ensure that the AI tool meets institutional policies and guidelines for educational technology adoption.

License

SUNY FACT2 Guide to Optimizing AI in Higher Education Copyright © by Faculty Advisory Council On Teaching and Technology (FACT2) is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.