Library Services

Student Guide to Generative AI

Generative Artificial Intelligence (AI) platforms exploded in popularity in 2023. ChatGPT was quickly followed by a large number of text-generating tools, alongside software that creates original images (such as DALL·E 2), presentations or computer code.

AI has, of course, been around for some time and we often use it without realising. However, it is becoming an increasingly important part of modern society and has begun to revolutionise the way we work, communicate, and learn.

In this guide, we will discuss the strengths and limitations of Generative AI, as well as consider ethical, privacy and other factors to help support you in using this technology effectively and in a way that aligns with Queen Mary's regulations and values.

What is Generative AI and how does it work?

Generative AI tools create content based on patterns in their training data, generating reasonably accurate, plausible-sounding (or plausible-looking) responses to prompts on a wide range of topics.

Limitations of Generative AI

  • Generative AI tools are unable to access information specific to your particular course or module at Queen Mary, nor do they have access to journals and resources that are available to you via Queen Mary Library Services, so they will only make use of generally available information.
  • At the time of writing, most Generative AI tools do not retrieve information from ‘live’ online sources, meaning that information from the last few months is not included.
  • Generative AI can produce ‘hallucinations’: fake data or information that is convincing but not based on reality. It can also produce fake references or citations, which could lead to academic misconduct (Nature, February 2023). As soon as ChatGPT was launched, stories started to appear about it passing exams (NBC, 23 Jan 2023). However, these platforms do not ‘think’ or create original ideas.

Ethical issues

  • Discussion about Generative AI tools is just one part of much wider concerns about the development and control of AI in society.
  • Because Generative AI systems are trained on existing texts or other datasets, they — like other algorithm-based tools such as internet search engines — reproduce the biases contained within the source material.
  • Issues around AI-generated content go beyond text. The growth in ‘deep fake’ images and videos requires you to develop your information literacy skills to evaluate the authenticity of all sources.

So, can you use Generative AI in your assignments at university? See the FAQs below for more information:

Some schools have their own specific guidance, so the first thing to do is check with your school or institute.

Some scenarios when it is acceptable to use AI tools to help with learning or preparation include:

  • using AI to revise by preparing summaries of your own notes
  • helping you to understand a challenging concept or reading more clearly by generating a plain English explanation or providing examples
  • exploring the general ideas about a topic to identify search terms and keywords for your search strategy using the Library database

Some students are already being instructed to use AI for specific assignments, with tasks such as creating an AI-generated report and then critiquing it. Being asked to do this does not mean you can use AI throughout your course, though.

You need to take a critical approach to AI output. Is it accurate? Does it tell the full story? What are the gaps, inconsistencies, or irrelevant pieces of information? What happens if you ask the question again? Look at similarities, differences and patterns in the outputs and think about what they could tell you.

No. Use multiple sources. Don’t just rely on whatever is generated by an AI tool. These tools can’t access the most current information and can only use openly available materials, meaning that to get information from many journals and books you still need to go via the Queen Mary University Library. To learn more about finding information, see our Find It! Use It! Reference It! QMPlus course.

Experiment with different ways of asking AI for information, including asking follow-up questions.

What personal information are you sharing with the company behind the tool? Will the company own or be able to share anything you submit? What biases are demonstrated by the tool? Are there any ethical concerns about the tool that you’re thinking about using? If someone is making the tool available apparently for free, what’s in it for them?

Yes. It is important to explain where you got your information from. Debates are still ongoing about how to cite generative AI tools, but we have put together a Queen Mary guide on how to reference AI within our Referencing Hub, including information on how to do so using different referencing systems.

AI has already been here for a long time, with features such as predictive text, and it is likely that more tools will incorporate generative AI so seamlessly that you may not realise you’re using it. This brings us back to the key question — always ask yourself, honestly, ‘Am I sure this is my own work I’m submitting?’

AI has long been incorporated into tools such as text-to-speech software. Students with disabilities should be assured that any assistive software and other technology that they use to access teaching and learning is permissible, even though it may use artificial intelligence.

For an overview of issues around misconduct and integrity, see the QMPlus course Academic Integrity at Queen Mary. For student perspectives, see these examples from St Andrews and UCL.

Please note that for research students (PGRs) there is specific guidance on editorial assistance.

This is a rapidly developing area and as such members of the Queen Mary community will be engaging with AI in a variety of ways. This page will be updated regularly, so do check back. This guide is a companion to the Queen Mary staff guide to the use of Generative AI.

This guide has been created in partnership with Queen Mary Students’ Union and colleagues from across Queen Mary.
