

What is AI?

Artificial Intelligence, or AI, is the science of making machines that can think like humans. It can be defined as "the use or study of computer systems or machines that have some qualities that the human brain has, such as the ability to interpret and produce language in a way that seems human, recognise or create images, solve problems, and learn from data supplied to them" (Cambridge Dictionary).

AI is a tool and can fulfil various functions. In an academic setting, the most common of these include text generation, text analysis, grammar checking, and image generation. This guide will go into more detail on some of the types of AI you may encounter.

Ethical Use of AI

There may be times when you need help understanding a topic, or perhaps more information on how to structure an assignment. Here are some examples of cases where it is ethical to use AI.

  • Defining an unfamiliar term e.g. What does pedagogy mean?
  • Providing an overview or explanation of a topic. This should only be used as a starting point to grasp a topic, not as a substitute for your own academic research. Critical thinking still applies: you must judge the reliability of the information.
  • Acting as a study buddy you ask for advice, rather than a tool for generating content. This can help with revision: comparing your own notes with an AI's summary of a topic can develop your critical thinking and understanding.
  • Providing advice on time management techniques e.g. Create a to-do list or schedule.
  • Giving examples of how to structure an assignment. This should be used purely for formatting and structuring purposes, not to generate content e.g. What does a good dissertation in nursing look like?
  • Providing recommendations on research methods e.g. Can you recommend some techniques for a research interview in the social sciences?

AI Limitations

Limited knowledge - AI is constantly improving and updating, but it does come with limitations. Examples include:

  • Prompt dependent - it can be tricky to write the right prompt to get the answer you need when using chatbots. They do not ask qualifying questions to confirm your meaning, and slight variations in wording can lead to very different results.
  • Questionable data - generative AI is limited by the information put into it, which it harvests to generate results. The information it gives out may therefore be out of date, biased, untrustworthy or otherwise inaccurate, depending on the data it is working with.
  • Lack of human understanding - AI can struggle with culture, emotion, context and nuance that a human might grasp more easily. So while it can give helpful pointers when summarising an article, it may miss something a human would not. Make sure to read the article yourself as well!

Bias - AI is heavily dependent on the data it is fed to generate results; models are essentially trained on existing text, images and other material that appears online. If that material contains bias, such as sexist, racist, homophobic, xenophobic or politically slanted content, that bias may be reproduced in the final results.

Fake responses and hallucinations - a 'hallucination' in AI terms is "a plausible but false or misleading response generated by an artificial intelligence algorithm" (Merriam-Webster).

This means that even though the information or text generated may sound plausible, it can be misleading, or simply wrong.

One of the main ways we see this at the moment is the creation of false references. When asked to write academic work, LLMs such as ChatGPT often simply invent citations and references that do not exist in the real world. These will be picked up by tutors and Turnitin.

  • Double-check ANY facts generated by an AI tool.
  • Check any references it generates or suggests through the library catalogue or Google Scholar to ensure they are genuine. If there is any doubt at all, contact us at library@eca.edu.au

*Information courtesy of University of Huddersfield
