What is Generative AI?
“Generative AI, or Generative Artificial Intelligence, is a technology that holds immense potential for enhancing various aspects of education. At its core, generative AI refers to computer systems that can autonomously generate content, such as text, images, or even entire documents, videos, or slides, in a manner that resembles human creativity and decision-making. This technology relies on large datasets and complex algorithms to learn patterns and generate new, contextually relevant information.”
“Define Generative Artificial Intelligence in relation to education.” prompt. ChatGPT (August 3 Version), OpenAI, 11 Sept. 2023, https://chat.openai.com
Generative AI presents both exciting opportunities and significant challenges for faculty. It raises concerns about academic integrity, data privacy, misinformation, and potential bias, yet it can also be a powerful tool for enhancing academic content and fostering innovation. Finding a balance between harnessing the benefits of generative AI and ensuring appropriate and ethical use is paramount. Privacy deserves particular attention: AI systems often require access to vast amounts of data, which can include sensitive student information, and safeguarding that information while leveraging generative AI's capabilities requires careful consideration.
Teaching and Learning
The integration of ChatGPT and similar generative AI technologies in educational settings presents a dual landscape of considerations and opportunities, particularly in relation to academic integrity and student learning. On one hand, it raises important concerns about maintaining academic honesty, ensuring proper authorship and citation practices, and guarding against the spread of misinformation and disinformation. On the other hand, it provides an opportunity to enhance student learning and AI literacy. The rapid advancements in generative AI underscore the need for educators and institutions to navigate this evolving terrain thoughtfully.
At the Academic Technology Center, we recognize that ChatGPT and similar generative AI tools signify the onset of a broader educational trend marked by personalization and automation. This trend, while presenting challenges, also opens doors to innovative and tailored educational experiences for students, fostering both their academic growth and their ability to engage responsibly with AI technologies.
To adapt to the presence of AI tools, consider these strategies to reevaluate course design, teaching methods, and assessments:
- Engage students in conversations about course policies concerning AI use, fostering their self-awareness of the learning process.
- Incorporate low-stakes formative assessments, create feedback mechanisms, and design activities that encourage student reflection.
- Be transparent in discussing course design, activities, and assessments with students; this helps them connect their learning journey with the significance of completing all assignment components.
- Consider redesigning assignments so they cannot be easily completed with AI tools alone, prompting students to apply concepts, problem-solve, or analyze case studies while integrating class discussions, direct instruction, personal experiences, and specific course readings.
- Aim to enhance students' critical thinking skills and lifelong learning commitment, all while recognizing the importance of digital literacy and the evolving role of AI in the learning process.
The expanding capabilities of AI tools are transforming both higher education and the workforce, opening up exciting opportunities for creativity and innovation. These tools are becoming increasingly prevalent, ushering us into new educational frontiers. However, along with these possibilities, there arises a crucial need for an ongoing conversation about the objectives of learning.
As you begin to research AI tools, ask questions, gain insights, and consider their potential applications within your courses. It's important to reflect on how these tools might impact the learning experience. Additionally, seeking input from your colleagues and students about their experiences with these technologies can be incredibly valuable.
Consider the following possibilities as you research and begin to think about how GenAI can help you and your students:
- GenAI can analyze student learning patterns and recommend unique learning experiences that best suit individual student needs.
- AI can save educators time by automating the development of course materials (content, quizzes, activities, resources, etc.).
- GenAI can translate educational materials into multiple languages, making course materials more accessible.
- GenAI can assist with data analysis and literature reviews, potentially accelerating the research process and improving outputs.
- AI can analyze student data to predict which students may be at risk, and GenAI can suggest interventions to help those students succeed.
- GenAI can generate alternative forms of course materials (e.g., audio descriptions and accessible documents).
What opportunities have you found with GenAI?
It's essential to consider the limitations of AI tools like ChatGPT, especially in the context of education. The Educator FAQ provided by OpenAI offers a comprehensive overview of these limitations and provides valuable recommendations for their use in educational settings. These tools may sometimes generate incorrect, misleading, and/or biased content, a challenge that OpenAI has acknowledged. It's crucial to recognize that while AI tools have many possibilities, they cannot replace the diverse range of interactions that are fundamental to successful teaching and learning.
Privacy and Data Concerns
Unlike Moodle, Google for Education, and other systems used for teaching, learning, and document management, ChatGPT and similar AI systems have no contract or agreement with Lane. Therefore, no formal agreement exists to maintain the privacy of student data. We strongly recommend against entering personal information into these systems without fully understanding how that information might be used.
Similarly, if you are considering using ChatGPT or other AI systems with your students, ensure they are not required to submit personal or sensitive information. If such systems are required to complete coursework, an opt-out provision for students is recommended.
Finally, make sure you understand who will own the outputs generated by these systems. Recent court rulings have indicated that works generated through generative AI systems may not be copyrightable.
Our neighboring universities offer additional recommendations regarding privacy and user data:
OSU Office of Information Security Statement (2023)
- “Because OSU does not control the online AI tools associated with the curriculum of this course, the Office of Information Security advises students to avoid entering Personally Identifiable Information (PII) or otherwise sensitive data into any AI prompt.”
UO Information Services (2023)
- “We therefore strongly recommend that instructors who ask or encourage students to use any AI system remind students that they should avoid providing any personal or other sensitive data to AI prompts. We also advise that instructors consider making AI use voluntary or, if AI use is part of a required course assignment or activity, include an opt-out alternative for students who do not want to create an account with an AI system or interact with them.”
Generative AI outputs can be biased because these models learn from data that reflects the prejudices and stereotypes present in society. Because they generate content based on patterns learned from their training data, they can inadvertently perpetuate and amplify existing biases, whether related to race, gender, culture, or other factors. Biased training data can also lead to skewed representations and unfair outcomes, as the AI may prioritize or generate content that aligns with these biases, reinforcing and potentially exacerbating societal inequalities and discrimination. Efforts to mitigate bias in generative AI systems involve careful data curation, model architecture improvements, and ongoing monitoring and evaluation to ensure fair and ethical outputs.
AI hallucination refers to instances where artificial intelligence systems generate false or erroneous outputs that deviate from reality. This occurs for various reasons, such as biases in training data, overfitting to specific patterns, or limitations in the underlying algorithms. AI systems, including ChatGPT, may hallucinate by producing results that seem plausible within their learned context but are fundamentally incorrect. These false outputs can appear in many AI applications, from image generation to natural language processing, potentially leading to misleading or undesirable outcomes.
Understandably, academic integrity is the top concern for many faculty. Lane's own Academic Integrity policy is currently in development, and the LCC student handbook provides direction for students on Academic Honesty. Although AI detection tools exist that attempt to identify AI-generated content, they are unreliable and can produce both false positives and false negatives; there is no foolproof method to determine whether written content was produced by generative AI tools. Faculty will need to evaluate their assessments, understand whether GenAI could be used to assist students, and decide whether such use is appropriate under their course policies.
Many strategies can help ensure academic integrity in our courses:
- Set clear course policies on academic integrity and appropriate GenAI use.
- Use authentic assessment.
- Maintain Regular and Substantive Interaction.
- Build course community and connections with students.
- Communicate the “why” by highlighting your course outcomes and objectives.
There are many more strategies. What have you found that works for you and your students?
Course Policy (Syllabus)
We are in an era when Artificial Intelligence (AI) technologies are becoming deeply integrated into various fields. It's essential for faculty to guide students in understanding and responsibly utilizing AI in a changing environment.
- We encourage instructors to have an explicit policy about AI use in the course syllabus and to reinforce it in conversation with students.
- It's crucial to keep in mind that there's no universal solution to our questions. Each department, discipline, course, instructor, student, and activity presents its own distinct considerations when it comes to AI tool usage.
- Instructors have the flexibility to establish extra requirements, such as "original work" or "non-AI assisted work," for individual assignments and exams. However, these requirements must be clearly communicated to students to ensure transparency and fairness.
Resources to Assist in Drafting a Course Policy
There are many valuable resources to support faculty in drafting course policies that address appropriate AI use in their courses.
- Gettysburg College has generously provided a “What is my stance on GenAI in this class?” decision tree that can guide your exploration of AI tool integration.
- The Sentient Syllabus Project offers syllabus resources with considerations and sample policies that can be readily adopted or adapted to fit your needs.
- A crowdsourced document shares sample course policies, searchable by discipline and course topic.
At Lane Community College, faculty have access to several valuable support opportunities to enhance their teaching and technology integration efforts. The Academic Technology Center (ATC) serves as a hub for educational technology support, offering guidance and assistance to faculty members as they navigate the ever-evolving landscape of technology in education. Whether it's troubleshooting technical issues or exploring innovative teaching tools, the Academic Technology Center is there to provide essential support.
Additionally, the college's Instructional Design team plays a pivotal role in empowering faculty with pedagogical support related to the application of educational technologies. These experts collaborate with instructors to effectively integrate technology into their teaching and learning strategies. They assist in designing engaging online or hybrid courses, creating accessible content, and aligning instructional goals with technology tools to enhance the overall learning experience for students.
Faculty members are encouraged to actively participate in Center for Teaching and Learning (CTL) and Faculty Professional Development (FPD) opportunities and to engage with instructional design services. These initiatives offer a platform for collaboration and learning from colleagues who have successfully integrated technology into their teaching practices. By engaging with instructional design and FPD offerings, faculty can tap into a wealth of knowledge, share experiences, and continually refine their pedagogical approaches, ultimately benefiting both their teaching and their students' learning experiences.
This work was derived from the OSU Center for Teaching and Learning as well as the UO Teaching and Innovation department. This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.