E-learner's handbook
Using chatbots
Microsoft Copilot (https://copilot.microsoft.com/), ChatGPT (https://chat.openai.com/), Gemini (https://gemini.google.com/), Claude AI (https://claude.ai/), and other text-generation tools or chatbots are systems based on artificial intelligence (AI) language models that can produce text, images, or other media so well that the result may be difficult to distinguish from human-generated content.
When interacting with a chatbot, you enter your question or request as a prompt. To get a better result, you can provide additional information and context in a follow-up question or request. The chatbot then produces a text that you can use to continue the dialogue and ask more specific questions.
Always verify the facts and source references provided by the chatbots!
Please note that a chatbot may cite fictional sources, commit logical fallacies, make formatting and grammatical errors, and give biased responses that disregard cultural differences or social norms. The output may not comply with data protection regulations and may contain false personal data. Therefore, check the facts and references provided. Using the chatbot’s output is the user’s responsibility, and the user must have adequate knowledge to evaluate the text produced.
The university generally encourages the use of chatbots to support teaching and learning and to develop students’ learning and working skills. Since 2024, all members of the university can use the Copilot chatbot. To do this, log in to https://copilot.microsoft.com/ with your university user account in the Microsoft Edge or Google Chrome web browser (see the instructions under Additional reading).
For example, in independent work you can use a chatbot to:
- ask for clarification of terms or ask for ideas;
- revise a text;
- ask yourself follow-up questions;
- overcome the block of starting a piece of writing or the fear of the blank page;
- brainstorm;
- help with programming;
- edit and translate a text;
- develop critical thinking by evaluating the chatbot’s output;
- get a quick overview of voluminous material.
Using a chatbot has a large ecological footprint: chatbots run on powerful servers that consume a lot of energy for computation and cooling.
Lecturers have the right to decide whether to allow chatbot use in their course or, if necessary, to limit it. Before completing an assignment, check whether chatbots are permitted in the course. If you use a chatbot despite a restriction, fail to cite its use correctly, or submit a text created by the chatbot under your own name, it is academic fraud and will be handled the same way as other cases of academic fraud. If you use the chatbot despite the restriction but submit the generated work with proper references, it is not academic fraud but a failure to meet the conditions for a grade/pass in the subject.
The important criteria when using chatbots are purposefulness, ethics, transparency, and a critical approach. Entering another person’s data into a chatbot or searching for their data constitutes the processing of personal data and should therefore be avoided in teaching and studies. Under the General Data Protection Regulation, this can be understood as profiling an individual, which requires the individual’s explicit consent.
Lecturers must not require students to create a separate account or use their Google or Facebook account to use a chatbot.
Presenting chatbot-generated text as your own thoughts in any academic text is academic fraud.
When you use a chatbot for writing a home assignment, explain how it was used (for example, in the methodology chapter, elsewhere in the text, or in an appendix): describe, for instance, the questions asked, the output received from the chatbot, and to what extent you changed it (see examples 1 and 2). The description of chatbot use must make clear how and to what extent the chatbot was used in the work.
Example 1. I used Microsoft Copilot (2024) to create the structure of the questionnaire. …
Example 2. The following definition is based on ChatGPT’s response given on 22 April 2023 to the question, “What is a language model?”. The result was as follows: “[—]” (OpenAI, 2023).
In-text citation depends on the specific referencing style used by the academic unit or journal (APA, Chicago, MLA, etc.).
In the reference list, indicate the chatbot developer, the year of the version used, the specific chatbot and its version, the type or description of the language model, and the web address.
For example, the reference may be written in APA style as follows:
- Microsoft. (2024). Microsoft Copilot (3 March version) [large language model]. https://copilot.microsoft.com/
- OpenAI. (2023). ChatGPT (22 April version) [large language model]. https://chat.openai.com/
Additional reading
- University of Tartu guidelines for using AI chatbots for teaching and studies. https://ut.ee/en/node/151731
- University of Tartu instructions about artificial intelligence (AI). https://wiki.ut.ee/pages/viewpage.action?pageId=218073757
- Instructions on how to use Microsoft Copilot in the Microsoft Edge or Google Chrome web browser. https://wiki.ut.ee/display/IT/Microsoft+Copilot