1. What is artificial intelligence and what is generative artificial intelligence?

Artificial intelligence (AI), based on software-defined algorithms, is a broad field comprising various technologies used, among other things, for data analysis, computer vision, planning, detection, chatbots, etc.

AI has become a central element of contemporary education, where machine learning methods make it possible to personalise learning materials and learning experiences. At the same time, learning analytics facilitates the analysis of learning processes and study results, offering deeper insight into more complicated topics as well as feedback that caters to learners' individual needs. The ability of AI to process huge amounts of data makes it possible to detect the strengths and weaknesses of learners and to predict their progress in studies. The latest significant innovations in the field of education include generative AI applications, such as ChatGPT, which have widened the range of possibilities for developing the learning process and including students, and have changed the ways students solve tasks and acquire new knowledge.

Generative AI, one of the applications of artificial intelligence, makes it possible to create new content, including text, pictures and sound. Such technology is trained on existing examples, for instance text data such as books and webpages; once the machine has learned from these examples, AI can be used to create similar content (i.e. to write texts or create works of art). Even if the language models on which the technology is based can produce impressive results, it is important to understand that they generate text on the basis of learned probability. Such technology does not know (and cannot know) which answers are right or wrong, and it does not possess an understanding of the deeper meaning of human language.
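
To illustrate the idea of learned probability, here is a minimal, purely illustrative Python sketch (the tiny word table is invented for illustration and is not a real language model): each next word is simply sampled from a probability distribution learned from examples, with no notion of whether the resulting sentence is true or meaningful.

    import random

    # Toy "learned" probabilities: for a given pair of preceding words, the
    # model stores how likely each possible next word is. A real language
    # model learns vastly more such patterns from its training data.
    next_word_probs = {
        ("the", "student"): {"writes": 0.5, "learns": 0.3, "asks": 0.2},
        ("student", "writes"): {"an": 0.6, "the": 0.4},
    }

    def generate(context, steps=2):
        # Extend the context word by word, sampling each next word
        # according to the learned probabilities.
        words = list(context)
        for _ in range(steps):
            probs = next_word_probs.get(tuple(words[-2:]))
            if probs is None:  # no learned continuation for this context
                break
            choices, weights = zip(*probs.items())
            words.append(random.choices(choices, weights=weights)[0])
        return " ".join(words)

    print(generate(["the", "student"]))  # e.g. "the student writes an"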

In education, generative AI can be used to create new material, for word and data processing, for exchanging ideas and for other educational purposes, e.g. for creating study materials.



2. The use of generative AI in studies

AI can provide valuable support for learners and teaching staff in their studies. Have a look at the study materials Artificial Intelligence in Studies.

The use of AI has increased uncertainty and unpredictability in learning situations, which underlines the fact that all parties (teaching staff as well as students) take on the role of a learner. Since students bring along their experience as users of AI, this experience should be treated as a valuable resource in studies. We recommend that students and teaching staff share their experience more and experiment together to find out how the use of generative AI could enhance problem solving, while at the same time contributing to the development of students' competences (digital and science-related competences as well as analytical, problem-solving and critical thinking skills). Teaching staff can discuss with students their experience of using AI and its impact on learning. When students understand the purpose of their learning, how they learn, and what their responsibility for learning is, this all contributes to deep learning. We encourage responsible experimentation with AI among teaching staff and students by creating new learning situations and agreeing on how studies are conducted, how and for what purpose each party uses AI, and how assessment is done.

Students can use AI for learning support and development in the following manner:

  • as a study partner, for generating and developing initial ideas (e.g. to overcome writer's block when starting a written paper);
  • for self-check;
  • for correcting texts, translating, finding different solutions;
  • for developing critical thinking (e.g. analysing answers provided by text robots);
  • as an aid when working with large volumes of materials, making summaries during the learning process;
  • as a programming aid, etc.

Teaching staff can use AI in different stages of the learning process in the following manner:

  • as a partner for generating ideas to prepare study activities;
  • for creating study materials;
  • for mapping students’ prior knowledge and tuning in to the topic;
  • for creating learning tasks;
  • for giving feedback on students’ work (it is recommended to inform the students), etc.

Teaching staff may use various assessment methods and define tasks in different ways to make sure that students achieve the learning outcomes and to prevent the use of AI for academic fraud, e.g.:

  • by creating tasks that require collecting and analysing original data in different ways: interviews, observation, archive studies, practical work etc.;
  • mid-term tests and learners’ self-check methods;
  • narrative self-reflections and/or group reflections to analyse learning;
  • describing the learning experience, where the learner models the experience by demonstrating how a theory or experimentation with new activities has led to new practices, etc.

Teaching staff can direct the use of AI and the extent to which it is used in studies. It should be borne in mind, however, that the use of generative AI applications or proofreading tools cannot be detected or proven afterwards in a trustworthy manner. The university does not recommend using “AI detection” services, since the results they provide are mostly statistical predictions rather than sound evidence. This differs from plagiarism detection, where the text can be compared with possible source texts.

It is the responsibility of the teaching staff to explain which violations of the rules of research ethics would prevent the student from achieving the learning outcomes and how such violations are detected. Dialogue, discussion and mutual agreements are important. When choosing AI software, the teaching staff have the responsibility to find out about students’ skills and experience in using the software and whether all students are guaranteed equal access to it. Please also have a look at the Good Practice of Teaching and Supervising, the Good Practice of Learning and the Code of Conduct for Research Integrity.

For students, it is important to define for themselves the goal and the learning outcomes of the course. Students are responsible for their learning and for achieving the learning outcomes. Using AI where the teaching staff have restricted its use, or failing to make proper reference to the use of AI, is considered academic fraud. Please also have a look at the Good Practice of Learning and the Code of Conduct for Research Integrity.

Sources

1. Using AI in learning and teaching:

1.1. European Commission, Directorate-General for Education, Youth, Sport and Culture (2022). Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and learning for Educators. Publications Office of the European Union. https://education.ec.europa.eu/news/ethical-guidelines-on-the-use-of-artificial-intelligence-and-data-in-teaching-and-learning-for-educators

1.2. Council of Europe (2022). Artificial intelligence and education: A critical view through the lens of human rights, democracy and the rule of law. https://www.coe.int/en/web/education/edutalks/-/asset_publisher/gS9v25LDjB54/content/edutalk-coe-on-the-importance-of-evidence-based-practices-in-artificial-intelligence-and-education-1

1.3. Ministry of Education and Research (2024). https://www.hm.ee/uldharidus-ja-noored/meediapadevus/tekstirobotid-koolis

1.4. Hughes, S. (21.09.2023). Why AI makes traditional education models obsolete – and what to do about it. World Economic Forum.

1.5. Minerva Project (2023). Integrating Artificial Intelligence: Key Strategies for Higher Education.



2. Bias in AI, risks:

2.1. Schwartz, Reva; Vassilev, Apostol; Greene, Kristen; Perine, Lori; Burt, Andrew; Hall, Patrick (2022). Towards a Standard for Identifying and Managing Bias in Artificial Intelligence. National Institute of Standards and Technology Special Publication 1270. https://doi.org/10.6028/NIST.SP.1270

2.2. Trasberg, Henrik (06.04.2023). Tehisintellekt toob kasu, aga ka riskidega tuleb tegeleda [Artificial intelligence brings benefits, but the risks must also be addressed]. ERR news. https://www.err.ee/1608939326/henrik-trasberg-tehisintellekt-toob-kasu-aga-ka-riskidega-tuleb-tegeleda

2.3. Laas, Oliver (11.01.2023). ChatGPT ja tehisintellekti plagiaat [ChatGPT and AI plagiarism]. ERR news. https://www.err.ee/1608846064/oliver-laas-chatgpt-ja-tehisintellekti-plagiaat

2.4. Laas, Oliver (22.03.2023). ChatGPT eetilisest kasutamisest koolides [On the ethical use of ChatGPT in schools]. ERR news. https://www.err.ee/1608922886/oliver-laas-chatgpt-eetilisest-kasutamisest-koolides

2.5. Ott, T. (22.09.2023). OpenAI lawsuit: US authors allege ChatGPT copyright theft. Deutsche Welle news.

2.6. D’Agostino, S. (22.08.2023). AI Raises Complicated Questions About Authorship. Inside Higher Ed. 

2.7. Appel, G.; Neelbauer, J.; Schweidel, D.A. (07.04.2023). Generative AI Has an Intellectual Property Problem. Harvard Business Review.

2.8. Barker, N. (20.06.2023). The Dezeen guide to AI. (accessed 10.09.2023)

2.9. Oidermaa, J.J. (07.07.2023). Eetikauurija: ChatGPT-d saatev kõmu röövib tähelepanu tegelikelt probleemidelt [Ethics researcher: the hype surrounding ChatGPT distracts attention from the real problems]. ERR Novaator.

2.10. Abudawood, T. (28.06.2023). What cybersecurity threats does generative AI expose us to? World Economic Forum. 



3. Is it possible to detect the use of AI in student work:

Eaton, S. (06.05.2023). The use of AI-detection tools in the Assessment of Student Work. https://drsaraheaton.wordpress.com/2023/05/06/the-use-of-ai-detection-tools-in-the-assessment-of-student-work/ (accessed 08.05.2023)