Considerations for using and addressing advanced automated tools in coursework and assignments
This webpage focuses primarily on course policies, how to discuss those policies, and some examples of assignments that address or make use of advanced technologies such as generative AI. Additional resources about the use of artificial intelligence tools in teaching and learning, including important considerations for faculty considering the use of these tools in their courses and an AI literacy tutorial, can be found on the webpage for the university’s Artificial Intelligence for Teaching and Learning Working Group. The university’s information technology unit also hosts the university’s current policies about the use of generative AI tools, a list of tools available at the university, and a set of frequently asked questions.
Definition
Advanced automated tools – artificial intelligence or machine learning tools such as Microsoft Copilot and Google Gemini that are sometimes described as “generative” or “autogenerative” tools – use sophisticated technology and very large data sets to create realistic writing, images, or other artifacts in response to natural language queries and prompts. They are very easy to use and some of their output is very difficult to distinguish from human-generated material.
Purpose
This website provides context and practical suggestions, including sample language for course syllabi, for faculty who are addressing the use of advanced automated tools in their course(s). Should students be allowed to use these tools? What are the most pressing issues – practical, pedagogical, and ethical – related to the use of these tools? How can we support students’ learning about these tools and the many complex, interesting, and rapidly developing issues that surround them?
Context and impact
These tools are readily accessible to students and have the potential to significantly change some aspects of traditional Western higher education, especially assignments that require students to write or create other artifacts that can easily be created or changed using one of these tools. Although there is research suggesting that some automated tools can help students learn and improve skills such as writing (e.g., Wilson, Olinghouse, & Andrada, 2014), the extent of the impact of these newer tools is unknown and subject to debate. However, their accessibility and ease of use make it likely that some students will at least experiment with these tools, and many will be expected to use similar tools after graduation.
Just as faculty have previously adapted their teaching to accommodate the use of other tools and advancements in technology, faculty are encouraged to develop and share with students a coherent approach to the use or non-use of these tools in their courses.
Course policies
In each course, at least four approaches to student use of these tools seem plausible:
- Prohibit all use of these tools
- Allow their use only with prior permission
- Allow their use only with explicit acknowledgement
- Freely allow their use without any need for acknowledgement
Each approach is discussed below with some thoughts and considerations; possible syllabus language for each approach is included in a separate section below. Regardless of the approach selected, faculty should explicitly discuss with students the approach and its underlying rationale.
If use of these tools is prohibited, limited, or must be documented, faculty should also consider whether to include an explicit reminder about plagiarism and whether use or misuse of these tools would be considered plagiarism.
Use prohibited
In courses where students are expected to independently produce work without any collaboration or the use of external tools, it may be appropriate to not allow any use of these tools. In these situations, it would be most helpful to not only explicitly tell students this but also explain to them why they are not allowed to collaborate or use tools. An honest, respectful discussion about why it’s important for students to work independently in this particular class can help students understand that critical context and broader (academic, professional, or disciplinary) norms and expectations.
Use only with prior permission
In some courses, it may be appropriate to use these tools in some scenarios or on some assignments but not in others. In those situations, faculty should clearly communicate with students when and how they can and cannot use these tools. It would be helpful to also make clear the rationale for allowing these tools in some situations but not allowing them in others; this could be an open discussion and exploration of (academic, professional, or disciplinary) norms and expectations or it could be a brief explanation of your thinking and expectations. Remember that it would also be helpful to explicitly (a) note how students should cite or otherwise acknowledge these tools, with one or more examples, and (b) help students understand the limits and appropriate uses of these tools.
Use only with acknowledgement
In courses where students are allowed or expected to collaborate or use advanced tools, it may be appropriate to allow students to use these tools throughout the class as long as they explicitly cite or otherwise acknowledge the use of these tools. Remember that it would also be helpful to explicitly (a) note how students should cite or otherwise acknowledge these tools, with one or more examples, and (b) help students understand the limits and appropriate uses of these tools.
Use is freely permitted with no acknowledgement
In courses where students are allowed or expected to frequently collaborate or use advanced tools, it may be appropriate to allow students to use these tools throughout the class without requiring that they explicitly cite or otherwise acknowledge the use of these tools. In these classes, it is critical that students understand the limits and appropriate uses of these tools.
Talking with students
If students are allowed to use these tools, faculty should be able to help students understand:
- How to cite or acknowledge the use of these tools. Some major style guides and publishers have provided updates or advice about how to cite these tools in scholarly texts: APA, Chicago, and MLA (ChatGPT and Dall-E). Faculty can also require students to adopt other practices such as a broader acknowledgement, e.g., “Some text and ideas in this document were created or edited using ChatGPT.”
- How to effectively use these tools. Like all other tools, these tools have affordances that support and encourage some uses and discourage other uses. These are in part shaped by their limitations but there are many other considerations such as the nature of the data that were used to create the tools and the kind of output they were designed to create. Experience with similar tools suggests that some of these tools may be most useful in (a) generating ideas and (b) suggesting potential improvements or changes to existing material. However, this is an area of rapid growth and development so this may involve a genuine partnership with students to discover some of the most effective and ethical uses.
- The limitations of these tools. The current generation of these tools is quite limited, sometimes in ways that are not obvious to novice users. For example, the current generation of ChatGPT cannot synthesize sources or even document the sources used to create its responses. It may be useful for idea generation or generic recommendations, but it has no ability to creatively and independently generate new insights or make novel connections. These tools may also produce information that is incorrect or biased, so students must ensure that material they submit or share has been verified with appropriate sources and is reasonably free of biases.
- Ethical issues related to the development and use of these tools. There are many challenging, important, and unresolved ethical issues related to the use of these tools that should be addressed. How should their use be documented? In what situations should these tools not be used? How do we reconcile the complex issues of copyright and intellectual property, not only of the material that is generated but also the material that was used to create the tool? How do we grapple with the environmental costs of these tools?
- The closed and commercial nature of many of these tools. Many of these tools are developed and hosted by companies that do not disclose many important details about the data they have collected or require to use these tools. This extends to the information required to register for or use some of these tools, e.g., a phone number or e-mail address.
“Computational thinking” is one of UD’s General Education objectives, the set of skills and knowledge that we expect all UD undergraduate students to attain through their educational experience. Meaningfully addressing the use of these tools – probing their limits, exploring their ethical uses, and integrating them into practice – in the context of a specific course or discipline can be a very effective way of helping students build computational thinking skills and knowledge (Martín-Núñez et al. (2023) explicitly supports this hypothesis).
Examples of syllabus language
Four examples of policy statements suitable for inclusion on a course syllabus are listed below. These statements should be modified to fit the specific context of each course and syllabus e.g., change from the impersonal “students” to the personal “you” if that language is consistently used throughout the syllabus. Examples of what these kinds of policies can look like in a syllabus can be found in this document from Yale University.
Use prohibited
Students are not allowed to use advanced automated tools (artificial intelligence or machine learning tools such as ChatGPT or Dall-E 2) on assignments in this course. Each student is expected to complete each assignment without substantive assistance from others, including automated tools.
Use only with prior permission
Students are allowed to use advanced automated tools (artificial intelligence or machine learning tools such as ChatGPT or Dall-E 2) on assignments in this course if instructor permission is obtained in advance. Unless given explicit permission to use those tools, each student is expected to complete each assignment without substantive assistance from others, including automated tools.
You may also want to require students to explicitly document or acknowledge their use of this tool. Potential language for that:
If permission is granted to use advanced automated tools (artificial intelligence or machine learning tools such as ChatGPT or Dall-E 2), they must be properly documented and credited. Text generated using ChatGPT-3 should include a citation such as: “ChatGPT-3. (YYYY, Month DD of query). “Text of your query.” Generated using OpenAI. https://chat.openai.com/” Material generated using other tools should follow a similar citation convention.
You may also want to require students to provide a brief explanation of how they used a particular tool. For example:
If a tool is used in an assignment, students must also include a brief description (2-3 sentences) of how they used the tool, e.g., what specific tool was used, what prompt and settings were used to generate material, and how that material was incorporated into the assignment.
Use only with acknowledgement
Students are allowed to use advanced automated tools (artificial intelligence or machine learning tools such as ChatGPT or Dall-E 2) on assignments in this course if that use is properly documented and credited. For example, text generated using ChatGPT-3 should include a citation such as: “ChatGPT-3. (YYYY, Month DD of query). “Text of your query.” Generated using OpenAI. https://chat.openai.com/” Material generated using other tools should follow a similar citation convention.
You may also want to require students to provide a brief explanation of how they used a particular tool. For example:
If a tool is used in an assignment, students must also include a brief description (2-3 sentences) of how they used the tool, e.g., what specific tool was used, what prompt and settings were used to generate material, and how that material was incorporated into the assignment.
Use is freely permitted with no acknowledgement
Students are allowed to use advanced automated tools (artificial intelligence or machine learning tools such as ChatGPT or Dall-E 2) on assignments in this course; no special documentation or citation is required.
Even if students are not required to cite the tool(s), you may want to require them to provide a brief explanation of how they used a particular tool. For example:
If a tool is used in an assignment, students must also include a brief description (2-3 sentences) of how they used the tool, e.g., what specific tool was used, what prompt and settings were used to generate material, and how that material was incorporated into the assignment.
Examples of assignments
The widespread accessibility of these tools – they are mostly free, easy to access, and easy to use – may motivate many faculty to reexamine assignments and activities to ensure they take into account students’ potential use of these tools. That can include an explicit incorporation of these tools or a design that implicitly dissuades students from using these tools. Further examples are included below in the “Resources” section.
Incorporating the use of advanced automated tools
Courses that allow the use of advanced automated tools may want to focus on improving students’ use and understanding of these tools or at least helping them have a clear understanding of the ethical and effective uses of these tools. Below are some examples of assignments that might help students develop a deeper understanding of these kinds of tools. Before incorporating these tools into assignments, however, you should consider whether all students can access these tools and how they access them e.g., is it ethical to require students to give a company their phone number and e-mail address when the university has no relationship with that company or assurances that information will be protected or ethically used?
- Have students enter a course-specific prompt and analyze the quality of the response using course materials. Are there factual errors in the generated text? Where does the generated text’s interpretation of sources differ from your course materials?
- Assign students multiple uses of the same tool and ask them to compare those different uses. For example, ask students to (a) begin answering a complex question by posing it to ChatGPT and using that to inspire their own writing, (b) begin answering a complex question by posing it to ChatGPT and continue editing and refining that response, and (c) begin answering a complex question on their own and then use ChatGPT to modify and refine the text that the student initially drafted. Afterward, ask students to compare the different approaches. Which seemed most helpful? Which took the most time and work? Which produced the best answer to the question(s)?
- Provide students with opportunities for written reflection on their use of these tools in the service of their learning. What was useful? What made their learning more difficult? Would they use the tool again for a future assignment? What are the costs and benefits of using such a tool? Would the tool be useful outside of coursework, e.g., in a professional setting?
- Assign students two different documents to read and compare, one generated with an AI tool and one published in your discipline, about the same topic. Include all of the available context and supplementary materials e.g., references, footnotes, bibliographic data. Ask students questions to compare the documents. How were the two documents written, edited, and published and how did those processes shape the documents differently? Does one appear more trustworthy or useful? In what situations might one document be “better” than the other?
Dissuading the use of these tools
Courses that do not allow the use of these tools may want to modify assignments to ensure that students cannot easily use these tools. This may most easily be done by understanding and addressing the weaknesses of these tools – no ability to synthesize, lack of the specific context of a particular class and discipline, difficulty providing genuine sources, etc. Modifications to assignments that may dissuade the use of these tools:
- Focus on analysis of sources and information. The current generation of tools cannot engage in in-depth, meaningful, and sustained analysis that incorporates and references multiple sources. For example, instead of asking students to write an essay, write an essay yourself or provide them with an essay and ask students to provide meaningful, substantive critique and feedback that specifically references not only the essay but also course materials (thanks to Michael Bugeja for making this suggestion).
- Require students to explicitly reference and use materials – vocabulary, concepts, references, etc. – from your course in the assignment. In many cases, these materials are not accessible to these tools; in particular, your course notes and the discussions you have in class are not part of the data sets used to develop these tools.
- Place a greater emphasis on the writing process. For example, in addition to requiring a bibliography you could also require that students include a brief note for each source that specifically indicates how they used it or how it substantively influenced that specific paper.
Detecting the use of these tools
There have been and continue to be efforts to develop tools that can detect materials generated by these advanced automated tools. The accuracy of these detection tools varies and will continue to change as both the generators and the detectors are updated. Many experts are skeptical of the quality of these tools, with significant concerns about “false positives” that can be used to accuse students of using these tools when they have not done so; our colleagues at Temple University have written about this, including their own evaluation of Turnitin’s tool. The difficulty of creating an accurate detection tool led the creators of ChatGPT to remove their own tool, AI Text Classifier, and note in their Educator FAQs that “none of [the detection tools] have proven to reliably distinguish between AI-generated and human-generated content.”
Examples of detection tools focused on text that may have been developed by ChatGPT or similar tools include:
- AI Content Detector Free (tool developed by Corrector App; more information about the tool and the developers are available on that same webpage below the text entry box)
- GPTZero (tool developed by an undergraduate student at Princeton University; multiple media outlets have written about it, such as NPR)
Instead of (solely) focusing on detecting the use of the tools or attempting to design assignments that the current tools cannot complete, the MLA-CCCC (Modern Language Association and Conference on College Composition and Communication) Joint Task Force on Writing and AI has several recommendations for faculty. This group focuses on writing and writing assignments, but these recommendations can easily be extended to other domains:
- Design assignments to support intrinsic motivation
- Emphasize teacher, peer, and tutor relationships in the writing process
- Assign steps in the writing process
- Ask for documentation of and reflection on the writing process
- Test assignments on language models [and AI tools]
Resources
- AI Literacy & Tools: Managing AI Use in Your Classroom (2024) (Padlet with many resources and examples; created and regularly updated by UD faculty and staff)
- University of Maine “Learn with AI” site (website with extensive examples from the University of Maine’s Center for Innovation in Teaching and Learning and New Media major)
- AI Assignment Library (examples of assignments focused on AI hosted by the University of North Dakota)
- Teaching with Generative AI Resource Hub (MIT Sloan School of Management webpage with resources for faculty)
- TextGenEd: Teaching with Text Generation Technologies (Creative Commons-licensed book from the WAC Clearinghouse with assignments focused on text-based AI tools)
- AI Prompts for Teaching (Ongoing collection of example prompts and activities to help faculty become more familiar and comfortable with generative AI text tools by Cynthia J. Alby at Georgia College and State University)
- Assigning AI: Seven Approaches for Students, with Prompts (June 12, 2023, pre-print journal article by Ethan R. Mollick and Lilach Mollick at the University of Pennsylvania Wharton School)
- Using ChatGPT and Other Large Language Model (LLM) Applications for Academic Paper Assignments (April 14, 2023, journal article pre-print by Andreas Jungherr at the University of Bamberg)
- Discipline-specific Generative AI Teaching and Learning Resources (Regularly-updated list of articles and papers describing specific examples of generative AI used in teaching and learning of specific disciplines)
Acknowledgements
Thank you to:
- Michael Bugeja for suggesting that faculty provide essays to students for them to critique as a potential assignment that dissuades the use of these tools. He made this suggestion during a webinar hosted by Iowa State University’s Center for Excellence in Learning and Teaching on January 30, 2023.
- Spencer Ross for sharing sample citation language for GPT-3 in a course syllabus at https://express.adobe.com/page/bdoHybDdLidEq/
Revised by Kevin R. Guidry on November 20, 2024