Author: Laura Schmidli. Editors: Jonathan Klein & Molly Harris. Published on February 16, 2024.
As generative artificial intelligence (Gen AI) continues to become more sophisticated and ubiquitous, the utility of Gen AI tools, perspectives about their use, and cultural acceptance of them will continue to change. This landscape of larger societal change will continue to inform our context for teaching – including our course policies and assignments. This ongoing shift prompts a number of questions:
- What do we know about student and instructor perspectives on Gen AI?
- How are these perspectives shaping classroom activities and vice-versa?
- What adaptations to assignments have been successful for instructors and students in L&S?
- How can we best prepare instructors for future semesters and students for their future beyond academia?
In this article, we look more closely at these questions, drawing on emerging research literature, examples from instructors in disciplines across L&S, and perspectives from UW students. We also offer considerations to help you evaluate how Gen AI can support student learning in your context.
What's Effective?
Understanding student perceptions of Gen AI in the classroom can help instructors design assignments that support learning outcomes while meeting students’ needs and appealing to their motivation. While some students are eager to engage with these technologies, others remain anxious and uncertain about their use. Recent research suggests that students who understand how these tools work may have less anxiety about them overall (Chan & Hu, 2023). But developing that understanding is not always easy and requires context. Gen AI tools have strengths and weaknesses, and critical thinking skills and disciplinary knowledge are necessary to evaluate their output (Dahlkemper et al., 2023). To complicate matters further, different instructors, even within the same discipline, may have different ideas about how Gen AI helps or hinders students in reaching their learning outcomes. For these reasons, it is essential to help students use and evaluate Gen AI tools relative to your course and discipline. Providing practice using and evaluating Gen AI in your classroom makes your expectations clear, gives students evidence of Gen AI’s utility and limitations, helps them build skills, and encourages critical thinking about these tools.
1. Incorporate Gen AI to support learning outcomes and disciplinary practices.
The weaknesses and limitations of Gen AI tools have been widely documented. At times they generate inaccurate information, reproduce biases and stereotypes, and fabricate non-existent citations. They also write in a style that conveys confidence even when they are wrong. Using these tools effectively requires evaluating their output critically using background knowledge and additional research. In other words, a disciplinary novice may need more support to interact with these tools than an expert does. Providing students with opportunities to use and evaluate these tools within an assignment, at a level appropriate to their experience, can help them determine when these tools help or hinder their learning in your course and discipline. When the use of these tools supports specific course outcomes and learning goals, students will be better able to make sense of connections between Gen AI tools, their coursework, and their learning overall. This can also reinforce values related to academic integrity, knowledge production, and learning.
Two examples below from L&S instructors highlight different ways they have incorporated Gen AI tools within assignments. In both examples, students are asked to incorporate Gen AI into an assignment in a specific way and evaluate its effectiveness within a disciplinary practice. Both examples also retain space for students to demonstrate creativity, develop research skills, and accomplish other goals central to the learning outcomes.
L&S Instructor Example
Shanan Peters, Professor, Geoscience
What did you do? I incorporated the use and critique of ChatGPT into a traditional research project completed by honors students in an intro-level course. Students are expected to identify a viable research topic relevant to course content, and then select and gather information from traditional reputable sources on their own. Once they have completed these research steps, they plan and execute prompts for ChatGPT to obtain additional explanation, clarify complex concepts, hypothesize potential scenarios, or further explore the topic and its implications. After engineering their prompts for the AI chatbot, students annotate and critique the exchange based on their background research and then reflect on the experience overall.
Why did you do it? When ChatGPT hit the scene, it was clear that this was going to be a powerful new tool that could facilitate some types of work. Perhaps more importantly, I was no longer confident that the traditional research paper would be an effective assignment, or that I would be able to consistently recognize ChatGPT-generated content and respond accordingly. So, I decided to tackle this new resource head on and incorporate it into the activity.
What impact did it have on students? Some students were very creative in their interaction with this AI tool. The feedback from students was generally positive. Most of them had little or no experience using it, and some explicitly stated that they would begin using it more frequently for some tasks. Overall, students gained experience with both the utility and shortcomings of generative AI for basic research. As an example of the latter, ChatGPT happily responds, in a rather authoritative tone, to all manner of nonsense. After productively using the tool to start the assignment, one student set out to demonstrate just that, and did so spectacularly, with the system producing fantastical scenarios in response to probing prompts.
What might you change in the future? This first year had the advantage of novelty, for the students and me. Next year, that novelty will have likely worn off for most everyone involved. Nevertheless, the generative AI space is fast moving and the capabilities of the platforms are improving. I’m likely to try this type of assignment again, with a revised set of guidelines to foster even more critical assessment by students that is grounded in their traditional background research. One student also used the system to generate Julia code that attempted to reproduce and improve on an R simulation demonstrating selection that I showed in class. I liked this analytical bent and I might consider encouraging code generation to demonstrate a relevant concept as part of the assignment, though the very diverse backgrounds in this intro-level course would make that challenging as a general expectation.
L&S Instructor Example
Anna Andrzejewski, Professor of American Art and Architecture
What did you do? I asked students in my intermediate-level Frank Lloyd Wright class to use AI to create the “first draft” of their text for their research projects. Wright is an iconic, popular figure in architecture, and there is a great deal about him on the internet. Much of this is written by enthusiasts and is of questionable accuracy, so I thought it would be a good way for students to exercise their “research chops” by assessing the accuracy of online information. They were to pick a building designed by Wright and ask the free version of ChatGPT to write a paper about that building relative to one of six course themes.
Why did you do it? I was coming back from a year of research leave, and I had heard horror stories from my colleagues about how AI was affecting their teaching. I had not yet had any experiences in the classroom with students using AI on assignments, but I figured rather than lose sleep over people misusing it, I would try to see if there was a way to embrace it. I wanted to have students think about its benefits and drawbacks critically. My goal was partly to enlist students in the process of assessing where the technology is in my field, where information of all sorts circulates on the internet.
What impact did it have on students? The students were surprised by how AI repeated many of the common ideas about Wright. Even if their building was not in the “prairie style” (just one phase of Wright’s long career), AI often repeated the idea that Wright was a prairie style architect. In one case, a student was working on a Wright-designed building that is well known to scholars, which AI insisted didn’t exist. They learned that AI could be a starting point for art historical research but had its limits. Many of the same ideas they had learned about in overview lectures and textbooks were repeated, without real attentiveness to their particular buildings. For some, it led them to the library or at least to scholarly databases to find detailed information. In other words, most of what they got through AI remained surface level, leading them to go back to research resources to deepen their knowledge of their particular building and research topic.
What might you change in the future? I know AI will get better. With the technology in its infancy, I expect the assignment will work differently in the future. I like the idea of using it for a draft, but I think I’ll have to be very specific about expectations for editing that draft in the future. For example, I’ll likely have to say “find three other research sources not listed in the draft” to supplement the work.
L&S Student Voices
“[Using AI in an assignment] was enjoyable and meaningful because it directed me to many resources and made connections when prompted appropriately, yet it began recycling its responses and gave misleading or false information when it could not locate something to fill the content gap. Consequently, I opted to abandon the AI output and instead crafted my own outline from scratch, focusing on what I deemed important. Utilizing notes, topics, and research citations from my Zotero manager, I developed a cohesive paper that reflected my own narrative voice that expressed my interests and ideas. While AI excels at information retrieval, it lacks passion and enthusiasm for the topic. Its approach was overly factual and robotic, lacking the spirit and enthusiasm I had for my project.”
– Margaret Murphy, Art History student
“Seeing AI’s poor response motivated me to put more effort into the project to write a great essay. It did a good job of setting a quality standard that our projects needed to exceed. As a result, I conducted a much deeper literature review than I typically would for an essay.”
– Jack, Art History student
2. Help students consider uses of Gen AI beyond writing assignments.
Many students express skepticism and uncertainty about Gen AI’s utility and appropriateness for college-level work (Baek and Tate, 2023). Furthermore, engaging with Gen AI in the context of a writing assignment can contribute to this skepticism by helping students see its weaknesses relative to their own skills (Tossell et al., 2024). However, students also report experimenting with Gen AI for tasks beyond writing assignments, including generating illustrations and other images, searching for and summarizing information, generating practice questions, generating topic ideas for assignments, offering suggestions for coding computer programs and scripts, analyzing data sets, and more. Determining when and how generative AI output can be useful for these tasks can require more specific evaluation skills and background knowledge.
In the example below, an instructor and students experimented with Gen AI as a partner in Socratic questioning during a class session. Modeling ways Gen AI can or cannot be effective during class can show students possible uses for it outside of the classroom, including for studying, testing their knowledge, or generating practice questions.
L&S Instructor Example
Jan Miernowski, Professor of French
What did you do? I assigned groups of 2-3 students to interact with ChatGPT, adapting a Socratic Tutor exercise shared by Jon Ippolito on the University of Maine’s Learning with AI webpage. Students asked ChatGPT to use the Socratic method to question the basis of their claims, with the interaction consisting of a series of claims and challenges. The students were supposed to make interpretative claims about Balzac’s novel ‘Le Colonel Chabert’, which we had just finished analyzing in class.
Why did you do it? I wanted to see to what extent generative AI might replace my own interaction with students during live class discussions. My teaching style is largely based on guided questioning of students’ interpretative claims about previously assigned readings.
What impact did it have on students? The exercise came on the heels of two weeks of discussion of the novel. At best, it served as a recapitulation and further practice in interpreting a literary text the students had read.
What might you change in the future? If I were to reuse this kind of prompt, I would make sure the students’ initial statements are not equivocal, so the machine is at least set on a reasonably correct path regarding the object of the exchange, and I would not allow the exchange to go beyond 5-6 statements from the students. The exchange becomes increasingly idle after that.
3. Help students consider Gen AI tools relative to future careers.
Students are also concerned about the impact of Gen AI on their future careers (Chan & Hu, 2023; Tossell et al., 2024). Students have highlighted that Gen AI use may be prohibited in their coursework but required later on the job. Helping students consider these tools relative to their future roles as professionals, community leaders, and critical consumers of information is therefore important.
In the example below, an instructor engages students in a productive conversation in which they choose how to reflect on their use of Gen AI in relation to their internships, classes, personal tasks, and futures. The instructor is also transparent about their own use of Gen AI and joins students in reflecting on its use.
L&S Instructor Example
Jennie Mauer Maunnamalai, Lecturer, La Follette School of Public Affairs
What did you do? Undergraduate students in my course are participating in a public policy internship, and the course provides opportunities for reflection, analysis, and engagement with their classmates on their internship experiences. In the general course guidance on my syllabus, I allowed students to use AI as a tool and required them to properly cite its use. Students in my course complete regular online discussion prompts designed to help them reflect on their internship experience along with their peers, as well as additional written assignments submitted to me. In several discussion prompts I modeled using AI to summarize a text and generate discussion questions. One week I also included a prompt focused on AI to help students consider how AI was (or was not) part of their career, academic, or personal lives and why. Students could choose which questions they wanted to respond to, in order to preserve flexibility and choice in what they shared.
Why did you do it? There is a lot of discussion about AI, and I had assumed that students would be using AI in my course, their internship, other courses, and their personal lives. I wanted to learn more about their usage of AI and how it was or was not showing up in other professional and personal activities. I also wanted to encourage the group to begin exploring and experimenting with using AI.
What impact did it have on students? I was surprised to learn that students were using AI tools less than I expected and that they shared many of the same concerns and anxieties I have seen among my own professional peers. I was impressed that students were thoughtful and measured in their thinking on AI, and I hope that our continued group exploration of AI tools will expand our thinking.
What might you change in the future? I intend to become more familiar with using AI tools to better prompt students and to model some real usage. I also want to continue facilitating an environment where students can speak candidly about this emerging resource.
Considerations for Your Own Context
We know that Gen AI output, whether text or images, reproduces biases and stereotypes. Many students also recognize that generative AI can perpetuate existing societal biases (Tossell et al., 2024). What might bias and stereotyping look like within your particular discipline? How might you help students explore this further and think critically?
How are students engaging with Gen AI as a study tool in your courses? Consider if you want to provide students with guidance or practice in using Gen AI to aid their thinking – e.g., generating practice quiz questions using Gen AI.
Consider where you can transparently engage students in critical thinking around the use of Gen AI in their future workplaces or other aspects of their lives. How can the skills and background knowledge needed to use Gen AI effectively in your course help students in future workplace situations? How can your own professional use of Gen AI tools inform students?
What do your learning outcomes tell you is most important in your course? What disciplinary practices and ways of thinking does your course support? Consider how student use of Gen AI might support these outcomes and practices, and where its use might hinder these outcomes. This can help you determine where you can incorporate Gen AI, versus where you can emphasize the value of original thinking for learning.
What existing assignments in your course do you think students are already using AI to complete? If these assignments are essential for students to complete on their own, consider student motivations and incentives. Are there ways you can better communicate and incentivize the value of original thinking and work in these assignments? Do these assignments clearly connect to your learning outcomes, and are those outcomes compelling to students? Are students able to make mistakes without huge penalties? How might you transparently and proactively incorporate Gen AI into small parts of these assignments?
Challenges and Opportunities for the Future
- Just as instructors are concerned that reliance on Gen AI might erode critical thinking skills and creativity, so are students. Assignment revisions that focus student effort solely on evaluating AI-generated content, rather than creating their own content, may make students feel short-changed (Smolansky et al., 2023). Student comments in a recent study indicate that they appreciate assignments that preserve student creativity (Tossell et al., 2024). Yet many instructors remain concerned that students will rely on AI when asked to create their own content. How might we design assignments that preserve authentic student creativity while also discouraging misuse of Gen AI?
- If Gen AI tools become more often correct or more nuanced and complex, will assignments that focus on correcting or evaluating Gen AI output remain compelling? In what other ways might we incorporate these tools?
- Gen AI written output can appear confident, persuasive, and even empathetic at a surface level. How can we help students think critically about the tone, style, and rhetorical strategies they encounter when interacting with Gen AI chatbots?
- Access to Microsoft Copilot through UW-Madison might help students and instructors who have privacy and intellectual property concerns, as institutional access provides greater security. However, institutional access does not address disparities if some individuals can pay for more advanced tools than those the institution provides. How might we support efforts to democratize access to these tools?
L&S Student Voices
“I’m worried that using AI is a ‘slippery slope’. Grammarly could make my writing better, but will I unlearn how to write on my own? Other tools that write for you seem even worse. When is my work no longer really my own? Sometimes struggling with writing and revising over time helps me have better ideas and creativity. Not everything is supposed to be easy.”
– Anonymous Student
Connect with Help
Meet with Us
Would you like help getting started on making a change to your course? Do you want to discuss your ideas? Our team is happy to meet with you, brainstorm solutions that meet your needs, and help implement your ideas. Our work typically starts with one 45-minute virtual meeting. To get started, request a meeting.
Suggest a Topic
We want to hear your ideas for future articles in our Design for Learning Series! We focus on local L&S examples, backed by research, that can help solve common teaching challenges. We gather input from instructors and students, as well as research literature. What teaching topic or trend would you like to know more about?
See More Example Assignments
Locally Sourced, from Writing Across the Curriculum, features a section on Teaching Writing in an Age of AI that includes recommendations for writing assignments and examples from several disciplines. See the Writing Across the Curriculum website for more information or to schedule a consultation.
References & Further Reading
Amani, S., White, L., Balart, T., Arora, L., Shryock, K. J., Brumbelow, K., & Watson, K. L. (2023). Generative AI Perceptions: A Survey to Measure the Perceptions of Faculty, Staff, and Students on Generative AI Tools in Academia. https://doi.org/10.48550/ARXIV.2304.14415
Baek, C., & Tate, T. (2023). “ChatGPT Seems Too Good to be True”: College Students’ Use and Perceptions of Generative AI. OSF Preprints. https://osf.io/preprints/osf/6tjpk
Bitzenbauer, P. (2023). ChatGPT in physics education: A pilot study on easy-to-implement activities. Contemporary Educational Technology, 15(3), ep430. https://doi.org/10.30935/cedtech/13176
Chan, C. K. Y., & Hu, W. (2023). Students’ voices on generative AI: Perceptions, benefits, and challenges in higher education. International Journal of Educational Technology in Higher Education, 20(1), 43. https://doi.org/10.1186/s41239-023-00411-8
Dahlkemper, M. N., Lahme, S. Z., & Klein, P. (2023). How do physics students evaluate artificial intelligence responses on comprehension questions? A study on the perceived scientific accuracy and linguistic quality of ChatGPT. Physical Review Physics Education Research, 19(1), 010142. https://doi.org/10.1103/PhysRevPhysEducRes.19.010142
Hou, I., Metille, S., Li, Z., Man, O., Zastudil, C., & MacNeil, S. (2024). The Effects of Generative AI on Computing Students’ Help-Seeking Preferences (arXiv:2401.02262). arXiv. https://doi.org/10.48550/arXiv.2401.02262
How AI reduces the world to stereotypes. (2023, October 10). Rest of World. https://restofworld.org/2023/ai-image-stereotypes/
Hutson, J., & Robertson, B. (2023). A Matter of Perspective: A Case Study in the Use of AI-Generative Art in the Drawing Classroom. The International Journal of New Media, Technology and the Arts, 18(1). https://doi.org/10.18848/2326-9987/CGP/v18i01/17-31
In the Age of ChatGPT, What’s It Like to Be Accused of Cheating? (2023, September 12). http://drexel.edu/news/archive/2023/September/ChatGPT-cheating-accusation-analysis
Shoufan, A. (2023). Can students without prior knowledge use ChatGPT to answer test questions? An empirical study. ACM Transactions on Computing Education, 3628162. https://doi.org/10.1145/3628162
Smolansky, A., Cram, A., Raduescu, C., Zeivots, S., Huber, E., & Kizilcec, R. F. (2023). Educator and Student Perspectives on the Impact of Generative AI on Assessments in Higher Education. Proceedings of the Tenth ACM Conference on Learning @ Scale, 378–382. https://doi.org/10.1145/3573051.3596191
Tirado-Olivares, S., Navío-Inglés, M., O’Connor-Jiménez, P., & Cózar-Gutiérrez, R. (2023). From Human to Machine: Investigating the Effectiveness of the Conversational AI ChatGPT in Historical Thinking. Education Sciences, 13(8), 803. https://doi.org/10.3390/educsci13080803
Tossell, C. C., Tenhundfeld, N. L., Momen, A., Cooley, K., & de Visser, E. J. (2024). Student Perceptions of ChatGPT Use in a College Essay Assignment: Implications for Learning, Grading, and Trust in Artificial Intelligence. IEEE Transactions on Learning Technologies, 1–15. https://doi.org/10.1109/TLT.2024.3355015
Van Campenhout, R., Hubertz, M., & Johnson, B. G. (2022). Evaluating AI-Generated Questions: A Mixed-Methods Analysis Using Question Data and Student Perceptions. In M. M. Rodrigo, N. Matsuda, A. I. Cristea, & V. Dimitrova (Eds.), Artificial Intelligence in Education (Vol. 13355, pp. 344–353). Springer International Publishing. https://doi.org/10.1007/978-3-031-11644-5_28
West, J. K., Franz, J. L., Hein, S. M., Leverentz-Culp, H. R., Mauser, J. F., Ruff, E. F., & Zemke, J. M. (2023). An Analysis of AI-Generated Laboratory Reports across the Chemistry Curriculum and Student Perceptions of ChatGPT. Journal of Chemical Education. https://doi.org/10.1021/acs.jchemed.3c00581
How to Cite this Article
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. This means that you are welcome to adopt and adapt content, but we ask that you provide attribution to the L&S Instructional Design Collaborative and do not use the material for commercial purposes.
Example attribution: From Revise Assignments Relative to Generative AI by the L&S Instructional Design Collaborative, licensed under the CC BY-NC 4.0 license.