Generative Artificial Intelligence in the Classroom: FAQs

Updated August 29, 2024

The latest generation of Artificial Intelligence (AI) systems is affecting teaching and learning in many ways, presenting both opportunities and challenges for how our course instructors and students engage in learning. At the University of Toronto, we remain committed to providing students with transformative learning experiences and to supporting instructors as they adapt their pedagogy in response to this emerging technology.

Many generative AI systems have become available, including Microsoft Copilot, ChatGPT, Gemini, and others. These AI tools use predictive technology to create or revise written products of all kinds, including essays, computer code, lesson plans, poems, reports, and letters. They also summarize text, respond to questions, and so on. The products that the tools create are generally of good quality, although they can have inaccuracies. We encourage you to try these systems to test their capabilities and limitations.

May 2024: A new institutional website on artificial intelligence was launched. This site provides a space for U of T community members and the public to find academic and research opportunities at the University, information on technologies currently in use, institutional guidelines and policies, and updates on new artificial intelligence activities across the University. Visit https://ai.utoronto.ca/.

August 2024: In May 2024, the U of T Artificial Intelligence Task Force was established to develop a vision and strategy to guide the University’s AI activities, and to guide the integration of AI within our teaching, learning, and administrative processes and frameworks, ensuring alignment with our core values and mission. The Task Force and its working groups will meet through early 2025 to develop recommendations for the U of T community. Information and updates about the Task Force are available on the U of T AI Task Force Updates SharePoint site. 

Sample Syllabus Statements

Revised August 2024: The University has created sample statements for instructors to include in course syllabi and course assignments to help shape the message to students about what AI technology is, or is not, allowed. These statements may be used for both graduate and undergraduate level courses. 

If you have included syllabus statements in previous offerings of the course, be aware that GenAI technology is advancing very rapidly, and the current state-of-the-art may be more sophisticated compared to when the course was last offered. Check that your syllabus statements still seem appropriate in light of current capabilities. 

You may also want to include a statement to the effect that students may be asked to explain their work at a meeting with the instructor. While you can call a student in for such a discussion whether or not you include a statement to this effect on your syllabus, reiterating it there may help remind students that they are responsible for the work they submit for credit.

Microsoft Copilot

In December 2023, a protected version of Microsoft Copilot (formerly Bing AI) became available to all U of T faculty, librarians, and staff. This protected version is now also available to U of T students. Copilot is an enterprise version of an AI-powered chatbot and search engine that better protects the privacy and security of users when they are signed into their U of T account. Copilot, like other generative AI tools, may provide information that is not correct (“hallucinations”), and it is up to each individual user to determine if the results are acceptable. For information and instructions on accessing the enterprise edition, please read and adhere to the Microsoft Copilot guidelines for use.

If you are an instructor who is interested in using generative AI with students or to develop course materials, review the FAQ below for considerations.

Frequently Asked Questions

About Generative AI

Updated: August 21, 2024 

Large Language Models are trained to predict the next word in a sentence, given a prompt and the text that has already been written. Early attempts at this task (such as the next-word prediction on a smartphone keyboard) were only coherent within a few words; as the sentence continued, these earlier systems quickly digressed. A major innovation of models such as GPT is their ability to pay attention to words and phrases written much earlier in the text, allowing them to maintain context for much longer and, in a sense, remember the topic of conversation. This capacity is combined with a training phase that involves looking at billions of pages of text. As a result, models like ChatGPT, Gemini, Claude, and their underlying foundation models are good at predicting which words are most likely to come next in a sentence, which results in generally coherent text.
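For readers curious about what next-word prediction looks like in practice, the short Python sketch below is purely illustrative and not an official U of T example: it uses the small, open GPT-2 model from the Hugging Face transformers library to extend a prompt one predicted token at a time. The model, prompt, and settings are arbitrary choices made for demonstration.

    # Illustrative sketch only: the open GPT-2 model (a much smaller precursor
    # to today's systems) is used here to show next-token prediction in action.
    # The model, prompt, and settings are arbitrary choices for demonstration.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    prompt = "Large Language Models are trained to"
    # The model repeatedly predicts the most likely next token, extending the prompt.
    result = generator(prompt, max_new_tokens=20, do_sample=False)
    print(result[0]["generated_text"])

Running a small model like this makes the limitation described above easy to see: the continuation is locally fluent but may drift or assert things that are not true.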

One area where generative AI tools sometimes struggle is in stating facts or quotations accurately. This means that these tools sometimes generate claims that sound real, but to an expert are clearly wrong.  

The best way to become familiar with the capabilities and limitations of the tools is to try them. Their capabilities continue to grow, so we recommend continuing to engage with the tools to keep your knowledge of their abilities current.    

Updated:  August 21, 2024 

Instructors are welcome and encouraged to test Microsoft Copilot, ChatGPT, Gemini, Claude, Perplexity, and other tools that are currently free to use. You can also test other AI tools to assess their capabilities, for instance to see how they respond to the assignments used in your courses, how they improve the readability and grammar of a paragraph, or how they answer typical questions students may have about course concepts. Experimentation is also useful for assessing the limitations of the tools. However, confidential information should never be entered into unprotected AI tools. Content entered into ChatGPT, Gemini, or other public, unprotected tools may become part of the tool’s dataset.

Note that information entered into the protected version of Microsoft Copilot is not used for training. 

Updated:  August 21, 2024 

This is a threshold question that instructors may want to consider.  Mainstream media has been covering this issue extensively, and alternate viewpoints are widely available.   

Given that generative AI systems are trained on materials that are available online, it is possible that they will repeat biases present online. OpenAI and other companies have invested substantial effort into addressing this problem, but biases remain inherent in these types of systems. You may also want to familiarize yourself with questions about the way the technology was developed and trained (e.g., who were the people who trained it? Whose work was it trained on?), the way we use the responses it provides, and the long-term impacts of these technologies on the world.

The Provost is consulting with faculty and staff experts on these larger questions involving generative AI systems, and welcomes debate and discussion on these issues. 

Updated: August 21, 2024 

There remains significant legal uncertainty concerning the use of generative AI tools with respect to copyright. This is an evolving area, and our understanding will develop as new policies, regulations, and case law become settled. Some of the concerns surrounding generative AI and copyright include:

  • Input: The legality of the content used to train AI models is unknown in some cases. A number of lawsuits originating in the US allege that the development of generative AI tools infringed copyright, and it remains unclear if and how the fair use doctrine can or will be applied. In Canada, there also remains uncertainty regarding the extent to which existing exceptions in the copyright framework, such as fair dealing, apply to this activity.
  • Output: Authorship and ownership of works created by AI are unclear. Traditionally, Canadian law has indicated that an author must be a natural person (human) who exercises skill and judgement in the creation of a work. As there are likely to be varying degrees of human input in generated content, it is unclear in Canada how the appropriate author and owner of such works will be determined. More recently, the US Copyright Office has published the following guide addressing these issues: Copyright Registration Guidance for Works Containing AI-Generated Materials.

If you have further questions about copyright, please view the U of T Libraries webpage, Generative AI tools and Copyright Considerations for the latest information. 

Student Use of Generative AI

Updated:  August 21, 2024 

Yes. Instructors may wish to use the technology to demonstrate how it can be used productively, or what its limitations are. The U of T Teaching Centres are continuing to develop more information and advice about how you might use generative AI as part of your learning experience design. 

You can ask your students to use the protected version of Microsoft Copilot. However, keep in mind that asking or requiring your students to access other tools is complicated by the fact that they have not been vetted by the University for privacy or security.  The University generally discourages the use of such systems for instruction until we are assured that the system is protecting any personal data (e.g., the email address used to register on the system). These tools should be considered with the same cautions as other third-party applications that ingest personal data. 

If you decide to ask or encourage students to use an AI system in your courses, there are a few issues to consider before you do so: 

  • Never input confidential information or student work into an unprotected/unvetted AI tool. All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts. 
  • Note that if you ask ChatGPT or another tool whether it wrote a particular piece of text, such as a paragraph or other work, it will not give you an accurate answer.
  • There may be some students who are opposed to using AI tools. Instructors should consider offering alternative forms of assessment for those students who might object to using the tools, assuming that AI is not a core part of the course. 
  • Instructors should consider indicating on their syllabus what AI tools may be used in the course and, as relevant, identify restrictions to this usage in relation to learning outcomes and assessments. 
  • Be aware that not all text that generative AI technology produces is factually correct. You may wish to experiment with ChatGPT and other tools to see what kinds of errors they generate; citations are sometimes fabricated, and inaccurate prompts are sometimes taken as fact.
  • Different tools will create different responses to the same prompt, with varying quality. You may want to try several different systems to see how they respond. 
  • There is a risk that Large Language Models may perpetuate biases inherent in the material on which they were trained. 
  • OpenAI and other companies may change their terms of use without notice. If you plan on using a system in the classroom, consider having a back-up plan. Because of the University’s relationship with Microsoft, use of Microsoft Copilot may help you avoid unexpected and disruptive changes in terms of use or model availability. 

New: August 29, 2024

Undergraduate and graduate students may use GenAI tools as learning aids, for example, to summarize information or test their understanding of a topic. These tools should be used in a manner similar to consulting library books, online sources, peers, or a tutor. Such uses are generally acceptable even if an instructor has stated that AI tools are not otherwise permitted in the course. These uses typically do not need to be cited or disclosed.

However, if students use information from GenAI tools that would normally require citation (e.g., quoting, paraphrasing, or reproducing text or ideas), they must cite their use according to guidelines provided by U of T Libraries or as directed by their instructor.

Instructors may require students to describe their use of GenAI in assignments, but this must be clearly stated in the syllabus and assignment instructions.

Students should be aware of the limitations of GenAI tools, including the potential for generating inaccurate or biased information, and must adhere to copyright and intellectual property guidelines as outlined in the FAQ, “Can students use GenAI to generate study and learning aids derived from course materials?”

New: August 29, 2024

Students may use AI tools to generate learning aids (e.g., quizzes, flashcards, summaries) as long as they comply with copyright and intellectual property guidelines and the appropriate use guidelines in the FAQ, “Are students permitted to use AI tools as a learning aid?”

What materials can be used to generate learning aids?

As a general rule, course materials should not be uploaded to GenAI systems, because some AI tools use uploaded documents or prompts to train their models. Students may violate copyright laws if they submit materials to third-party tools without owning the IP or copyright. Course materials developed by instructors (e.g., syllabus, lecture notes, slides) are their intellectual property, and course texts (e.g., textbooks, readings) are likely copyrighted.

However, students can use the institutionally-approved version of Microsoft Copilot to generate study and learning aids, because of the data protection offered through this tool. 

To use the institutionally-approved version of Microsoft Copilot:

  • Log in to the U of T-approved version at copilot.microsoft.com using your U of T credentials.
  • Verify data protection by checking for the green shield and checkmark symbol in the upper right-hand corner. This symbol indicates data protection but does not guarantee accuracy or quality of the information.

Use AI-generated materials for personal use only—do not share them with others or commercialize them.

Students should also:

  • Let instructors know if they identify particularly useful approaches to using GenAI to support their learning, or if they have questions about the learning materials they generate.
  • Obtain explicit instructor permission to record or capture class lectures. Recording or using AI to capture transcripts without permission is not allowed.

Updated:  August 21, 2024 

The University expects students to complete assignments on their own, without any outside assistance, unless otherwise specified. However, for the purposes of transparency and clarity for students, instructors are strongly encouraged to go further and to specify what tools may be used, if any, in completing assessments in their courses. Written assignment instructions should indicate what types of tools are permitted; vague references to not using ‘the internet’ will generally not suffice today.   

If you are permitting, or even encouraging, students to use generative AI tools for developing their assignments, be explicit about this on the syllabus. Consider what tools and what use is acceptable. Can students use it for critiquing their work? For editing? For creating an outline? For summarizing sources? For searching the literature (e.g., using Semantic Scholar)? You may also want to ask students to reflect on how they used the tools to improve their writing/learning process. 

If adding a prohibition on AI tools to assignment instructions, it is best to suggest that the ‘use of generative AI tools’ is prohibited, as opposed to the use of one particular tool, such as ChatGPT. There are many generative AI tools available today. 

The University has created sample language that instructors may include in their course syllabi to clarify for students if the use of generative AI tools for completing course work is acceptable, or not, and why. We also encourage instructors to include information on assignment instructions to explicitly indicate whether the use of generative AI is acceptable or not. 

If an instructor indicates that use of AI tools is not permitted on an assessment, and a student is later found to have used such a tool on the assessment, the instructor should meet with the student as the first step of a process under the Code of Behaviour on Academic Matters.   

Some students may ask if they can create their assignment outline or draft using generative AI and then edit the generated first draft. Before discussing the assignment with your students, consider what your response to this question might be, and perhaps address it in advance.

You may wish to consider some of the tips for assessment design available on the Centre for Teaching Support & Innovation’s webpage, Teaching with Generative AI at U of T. You might also consider meeting with, or attending a workshop at, your local Teaching Centre to get more information about assignment design. Consider what your learning goals are for the assignment, and how you can best achieve those considering this new technology. 

Updated: August 29, 2024

Beyond use of GenAI tools as a general learning aid (see “Are students permitted to use AI tools as a learning aid?” above), if an instructor specified that no outside assistance was permitted on an assignment, the University would typically consider a student’s use of generative AI to be use of an “unauthorized aid” under the Code of Behaviour on Academic Matters. Instructors should keep in mind that students might receive a range of instructions across different courses about what constitutes appropriate use of AI within each course. We therefore encourage all instructors to be very transparent and clear as to whether, and in what ways, use of AI is permitted on any given assessment.

Updated: June 7, 2023

The University does not support the use of AI-detection software programs on student work. None of these programs have been found to be sufficiently reliable, and they are known to incorrectly flag human-written content as AI-generated. Some AI-detection programs assess whether a piece of writing was generated by AI based simply on its level of sophistication.

Sharing your students’ work with these software programs without their permission also raises a range of privacy and ethical concerns.

However, instructors are encouraged to continue to use their traditional methods for detection of potential academic misconduct, including meeting with a student to discuss their assignment in person.

Updated: April 10, 2024

Yes. If you use multiple-choice quizzes/tests, assume that generative AI systems will be able to answer the questions unless they pertain to the specifics of a classroom discussion, the content of which cannot be found on the internet. Some instructors may wish to test the capability of generative AI systems by using their multiple-choice/short answer assessments as prompts, and reviewing responses from a variety of tools (e.g., ChatGPT, Microsoft Copilot, Gemini, Perplexity, Poe Assistant, etc.).  

Updated:  August 21, 2024 

Talking to students about generative AI tools and their limitations lets them know that you are aware of the technology, generates discussion, and helps set guidelines. Let students know clearly, both verbally and in assignment instructions, what tools may or may not be used to complete the assignment. Advise students of the limitations of the technology and its propensity to generate erroneous content. You might also show students what AI produces when responding to an assignment prompt, and how it differs from your expectations of quality work.

Please note that detection of student use, especially if these tools are used to their best effect, is not possible. Like use of the internet, generative AI use will become ubiquitous.

Visit the Centre for Teaching Support & Innovation’s webpage, Teaching with Generative AI at U of T, for course and assessment design considerations. 

Updated: September 29, 2023

Students and faculty can refer to the U of T Libraries Citation Guide for Artificial Intelligence Generative Tools, which provides guidance on how to cite generative AI use in MLA, APA and Chicago Style.

Updated: July 17, 2023

The School of Graduate Studies (SGS) has posted Guidance on the Appropriate Use of Generative Artificial Intelligence in Graduate Theses which will be of interest to graduate students, supervisors, supervisory committee members, Graduate Chairs and Graduate Units.

Updated: April 10, 2024

No. Large Language Model (LLM) technology is at the heart of a variety of generative AI products that are currently available, including writing assistant programs (e.g., Microsoft Copilot, Gemini, and a huge number of others), image creation programs (e.g., DALL-E 3, Midjourney, etc.) and programs to assist people who are creating computer code (e.g., GitHub Copilot). It is also possible for you to build a system which utilizes this underlying technology (GPT-4 or another model) if you are interested in doing so. 
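As a rough illustration of what building on this underlying technology can look like, the Python sketch below calls a hosted model through the OpenAI Python library. It is a minimal, hypothetical example, not a University recommendation: it assumes you have your own OpenAI account with an API key set in the OPENAI_API_KEY environment variable, and the model name and prompts are placeholders you would replace.

    # Hypothetical sketch only: assumes an OpenAI account and an API key
    # available in the OPENAI_API_KEY environment variable. Model name and
    # prompts are illustrative placeholders.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4",  # or another available model
        messages=[
            {"role": "system", "content": "You are a concise study assistant."},
            {"role": "user", "content": "Explain fair dealing in one sentence."},
        ],
    )
    print(response.choices[0].message.content)

Many of the commercial products mentioned below are, at their core, wrappers of this kind around the same underlying models.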

It is also worth noting that a variety of products (online and mobile apps) have appeared that use GPT-4, Gemini, or other AI models and require paid subscriptions. Some add features such as editing tools and templates. Others, however, do nothing more than the free versions and are meant to fool people into paying for a service that is currently free.

Instructor Use of Generative AI

Updated: April 10, 2024

Currently, Microsoft Copilot is the recommended generative AI tool to use at U of T. When a user signs in using University credentials, Microsoft Copilot conforms to U of T’s privacy and security standards (i.e., it does not share any data with Microsoft or any other company). It is also free to use. Microsoft Copilot uses OpenAI’s GPT-4 model and performs comparably to ChatGPT. For more information about Copilot, refer to CTSI’s Copilot Tool Guide.

Updated: August 29, 2024

GenAI tools may assist instructors in developing or updating course materials.

Instructors who make substantive use of these tools are encouraged to acknowledge this in their syllabus or assignment documents, similar to how they would acknowledge materials borrowed or adapted from a colleague. This serves as a model for students regarding the expected use of GenAI tools.

Note that copyright ownership of outputs produced by generative AI is currently unsettled in law. Instructors considering using these tools should:

  • Understand that while they can create content with these tools, they may not own or hold copyright over the generated works.
  • Avoid inputting confidential information or intellectual property they do not have rights to use (e.g., student work or questions without permission). Content entered may become part of the tool’s dataset and could resurface in response to other prompts (except for tools like the protected version of Microsoft Copilot).
  • Review each tool’s terms of service, which govern the use and ownership of inputs and outputs. Note that these terms can change without notice.

For more information, refer to U of T Libraries’ Generative AI tools and Copyright Considerations.

Updated: January 27, 2023

Please note that the instructor is ultimately responsible for ensuring the grade accurately reflects the quality of the student’s work, regardless of the tool used. The University asks that you not submit student work to any third-party software system for grading, or any other purpose, unless the software is approved by the University. A completed assignment, or any student work, is the student’s intellectual property (IP), and should be treated with care.

The University currently has several licensed software tools available for facilitating grading, such as SpeedGrader and Crowdmark. These systems safeguard the student’s IP while also supporting the grading process. In the future these types of systems may include AI-powered grading assistance.

Updated: August 21, 2024 

A Provostial Advisory Group on Generative AI in Teaching and Learning was struck in spring 2023 to identify areas in teaching and learning that require an institutional response or guidance. One such example is providing instructors with sample language to include in their course syllabi to clarify for students if the use of generative AI tools for completing course work is acceptable, or not, and why. A Generative AI in Teaching and Learning Working Group, chaired by the Centre for Teaching Support & Innovation, coordinates and plans for instructor resources needed to support generative AI in the classroom. There are also groups around the university (e.g., the libraries) that are tracking the technology and identifying opportunities and issues that we will need to confront.  

In May 2024, the U of T Artificial Intelligence Task Force was established to develop a vision and strategy to guide the University’s AI activities, and to guide the integration of AI within our teaching, learning, and administrative processes and frameworks, ensuring alignment with our core values and mission. The Task Force and its working groups will meet through early 2025 to develop recommendations for the U of T community. Information and updates about the Task Force are available on the U of T AI Task Force Updates SharePoint site.

Decisions regarding the use of generative AI tools in courses will remain with instructors based on the type of course and assessments within them. Regardless of your stance on this technology, it is important that you discuss it with your students, so they understand the course expectations.  

Have feedback or want more information?

If you have any suggestions for teaching and learning resources that would be helpful to you as a course instructor, or if you have any other questions about generative AI at U of T that are not addressed through this FAQ, please contact us.