6. Concerns with GenAI in Education

[Image: A robot sitting in the scales of justice.]

GenAI can be a powerful tool, but you should be aware of the many (many!) concerns about its use.

  • SHAPING THE MIND. Some people argue that skills traditionally taught in schools, such as writing, help shape the mind: they teach students to think critically and logically and to articulate their thoughts clearly. If GenAI tools such as ChatGPT take over writing and essays are no longer assigned in schools, some worry that students will miss out on an important aspect of intellectual development.
  • INACCURACIES & HALLUCINATIONS. GenAI tools don’t understand what they are putting out. They are predictive models. As such, they are prone to making mistakes. In some cases, they invent information, a process known as hallucinations (e.g., ask ChatGPT to come up with the top 10 list of citations on a topic. The citations will all seem plausible, but if you look them up, you will find that none exist…). For this reason, humans, with their experiential learning of the concepts that the GenAI puts out, can and must remain critical and assess the veracity and accuracy of the output.
  • BIASES. GenAI is trained on large data sets. These data sets – for example, documents or images found on the internet – contain biases. For example, asking a GenAI to draw a picture of leaders will tend to produce images of men. There are concerns that as we use more GenAI tools, these biases will be reproduced and amplified.
  • OWNERSHIP. There are tons of issues surrounding intellectual property, ownership of information, and authorship when using GenAI. Firstly, GenAI tools like ChatGPT were trained on documents, articles, and books found on the internet, yet the authors of those works are neither recognized nor compensated for that data. Next, once someone enters information into a GenAI, that data becomes the property of the GenAI app developers. For this reason, many organizations currently forbid their employees from entering sensitive data into a GenAI. And once a user has the output to a prompt, how do they acknowledge and cite it, especially since the output will change even when the same prompt is entered at a later time? These are just some of the issues arising from the use of GenAI.
  • PLAGIARISM. Of course, related to this is the concern that students will use GenAI tools to plagiarize on assignments. GenAI tools such as ChatGPT made headlines when it was found that they tend to outperform most students on entrance exams (AI models like ChatGPT and GPT-4 are acing everything from the bar exam to AP Biology. Here's a list of difficult exams both AI versions have passed, 2023). The tools can make it tempting for students to cut corners when faced with uncertainty over their performance and tight deadlines; some have called ChatGPT the new paper mill. Plagiarism-detection software was put in place to detect the use of GenAI in assignments, but it was deemed highly inaccurate (Murphy Kelly, 2023). One of the best ways to guard against plagiarism is to create assignments that are localized to the specific context of the course. Another is to invite, rather than forbid, the use of GenAI, and to work alongside the tool rather than against it.
  • PRIVACY. As mentioned above, once data is entered into a GenAI prompt, it becomes the property of the GenAI, and the tool will use that data going forward. There are therefore privacy issues in the use of GenAI tools, and experts recommend not entering any personally identifiable or sensitive data (CBC, 2023). If you would like a quick "at a glance" comparison of the terms of use of different GenAI tools, consider glancing at this summary table. Note that since GenAI companies update their terms of service frequently, it won't be long until this table is out of date.
  • EQUITY & ACCESS. Many of the GenAI tools require a subscription to access them. This means that they are not available to everyone, especially those with fewer resources such as internet access to financial resources to purchase the subscription. In addition, there is some indication that some of the plagiarism detection tools has a bias and that they inaccurately flag nonnative English speakers’ written work as generated by GenAI, when this is not the case (Myers, 2023).
  • CRIMINAL ACTIVITY. There are concerns that GenAI could assist criminals in conducting their activities, such as by writing computer code that helps them spread a virus or malware, or hack into a secure system. There are also concerns that the AI could be re-purposed to help criminals create poisons or bombs. Fortunately, GenAI tools have guardrails that prevent them from engaging in such activities, but they can be tricked…
  • ENVIRONMENTAL. Most GenAI tools require more computing power than a simple web browser search. This takes energy, on a planet that is already taxed by our use of resources… Is the use of such an energy-consuming tool, at this period of history, ethical?
  • TRUTH & RECONCILIATION. If you are wondering what this means in terms of Truth & Reconciliation, consider reading the position paper written by Indigenous peoples around the world.

[Image: A robot stands on trial in a courtroom full of humans.]

Many institutions are currently grappling with these issues and putting in place institutional policies governing the use of GenAI. Most institutional policies focus their efforts on student use while not allowing employees to use GenAI for their work. You should consult your institution’s policies to determine whether you can assign GenAI in your classroom.

Some of the interesting innovations arising in this sphere include the use of an ethics board to review teachers’ proposed use of GenAI in student assignments, assessing whether students’ privacy and rights are protected by the assignment design (see University of Technology Sydney). Other institutions, like Kwantlen Polytechnic University, suggest that any student assignment requiring the use of GenAI offer an alternative option for students who do not want to use it.

Regulations are fast evolving to catch up to the rapid rise in GenAI, so expect that new legislation may impact this field in the coming weeks, months, and years.

Note: Images created using DALL-E in August 2023.