
Current University guidance on AI systems and tools states:
“Our University’s position is that when used appropriately AI tools have the potential to enhance teaching and learning, and can support inclusivity and accessibility. Output from AI systems must be treated in the same manner by staff and students as work created by another person or persons, i.e. used critically and with permitted license, and cited and acknowledged appropriately.”
In addition, specific guidance appropriate to your module, course or programme may be detailed in course handbooks and other spaces. Your programme, module or assignment may have modified guidance on the use of AI tools.
There are a variety of tools available. The University offers a licensed version of Microsoft Copilot that ensures data privacy, and this licensed version should be used for all University work that involves any reading, course resources or other material written by others.
For other questions on use of AI in your studies, please refer to the Teaching Guidance document.
When using AI tools, you should ensure that you reference them appropriately. If you use an AI tool (such as a chatbot, image generator or other AI system) and include its text or images directly in your work, you will need to cite it.
If you are using Harvard Manchester referencing style, you should use the 'Software' example on the Library's guide to Harvard Manchester referencing.
These tools are developing at pace, so it is important to state which version of the tool you used, as well as the date you accessed it.
See the FAQ below for further information:
These examples are in the Harvard Manchester style.
In-text citation
Posing Microsoft Copilot the question 'Does Microsoft Copilot aid academic malpractice?', the AI places the responsibility on the user, stating:
"Microsoft Copilot is designed to be a supportive and ethical AI tool, assisting users in enhancing their learning and productivity. It aims to empower students, researchers, and professionals to develop their skills and complete tasks effectively—but it does so responsibly.
Academic malpractice is never encouraged or supported, whether it’s plagiarism, cheating, or any other unethical practice. I always emphasize proper citation of sources, originality, and integrity in all academic endeavors. My role is to provide guidance, inspiration, and accurate information to support legitimate learning and understanding. If you're curious about how to use tools like me responsibly, I can share tips on ethical practices in academia! What do you think?" (Microsoft Copilot, 2025).
Reference list entry
Microsoft (2025). Microsoft Copilot. [Computer program]. Available at: https://copilot.microsoft.com/ (Accessed: 11 April 2025).
How to reference an AI-generated image
If you use an AI tool to generate an image, you would reference it as follows:
In-text citation
Figure 1 highlights a realistic visual simulation based on a range of textual prompts using an AI tool (Midjourney, 2023).
Image caption
Place the caption for the image directly below the image.
Figure 1: Synthetic landscape (Source: Midjourney, 2023)
Reference list entry
Midjourney (2023) Synthetic landscape [image]. Available at: https://midgard.com/image-53461 (Accessed: 18 September 2023).
You may be asked or feel it is appropriate to acknowledge your use of an AI tool or system. This acknowledgement can sit at the start of your work (or sometimes be found in a footnote) and would usually give an overview of the tools used and the outputs. Please see the example below:
"Generative AI Disclosure: I used Microsoft Co-Pilot to assist in idea generation, image creation, and for feedback on grammar and content. I implemented some of its recommendations. I used DALL-E to explore ideas for visuals (one of which is used and cited on page 2)"
Quoting, summarising and paraphrasing, editing, translating, data processing, re-writing your work and the generation of ideas.
Generative AI tools can invent citations for sources that do not exist. They may also cite a real piece of writing, but the content attributed to it may be irrelevant, inaccurate or different from what was actually written.
Do not use GenAI tools to create citation lists or bibliographies.
For further support on assessing the relevance and credibility of sources you can access our online resource Evaluating sources of information.
When prompted with “Is the left brain right brain divide real or a metaphor?” the ChatGPT-generated text indicated that although the two brain hemispheres are somewhat specialized, “the notion that people can be characterized as ‘left-brained’ or ‘right-brained’ is considered to be an oversimplification and a popular myth” (OpenAI, 2023).
OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat
(2023) Information provided by https://apastyle.apa.org/blog/how-to-cite-chatgpt
OSCOLA has not yet issued any guidance on how to reference AI-generated content. Given that AI-generated content is generally not recoverable, it should be treated as a personal communication. It should be included in a footnote and not in the bibliography. See below for an example.
1. Response from Copilot to author (9 September 2024).
(2024) Information provided by https://subjectguides.york.ac.uk/referencing-style-guides/generative-ai
We do not recommend treating the AI tool as an author. This recommendation follows the policies developed by various publishers, including the MLA’s journal PMLA.
Describe what was generated by the AI tool. This may involve including information about the prompt in the Title of Source element if you have not done so in the text.
Use the Title of Container element to name the AI tool (e.g., Microsoft Copilot).
Name the version of the AI tool as specifically as possible. For example, the examples in this post were developed using ChatGPT 3.5, which assigns a specific date to the version, so the Version element shows this version date.
Name the company that made the tool.
Give the date the content was generated.
Give the general URL for the tool.
While the green light in The Great Gatsby might be said to chiefly symbolize four main things: optimism, the unattainability of the American dream, greed, and covetousness (“Describe the symbolism”), arguably the most important—the one that ties all four themes together—is greed.
“Describe the symbolism of the green light in the book The Great Gatsby by F. Scott Fitzgerald” prompt. ChatGPT, 13 Feb. version, OpenAI, 8 Mar. 2023, chat.openai.com/chat.
(2025) Information provided by https://style.mla.org/citing-generative-ai/
The editors of MHRA have not yet issued any guidance on how to reference AI-generated content. Given that AI-generated content is generally not recoverable, it should be treated as a personal communication. It should be included in a footnote and not in the reference list. See below for an example.
'Paris, Rome and Berlin are the most popular tourist destinations in Europe.'1
1. Microsoft Copilot's response to Sam Jones, 23 August 2024.
(2024) Information provided by https://subjectguides.york.ac.uk/referencing-style-guides/generative-ai
No official guidance has been provided yet, so we recommend you use or adapt the following format:
#. Name of AI Tool [type of medium]. Creator of tool; version date. [Accessed YYYY Month DD]. Available from: URL.
1. ChatGPT. [Online conversation]. OpenAI; 2023. [Accessed 2024 August 23]. Available from: https://chat.openai.com.
(2024) Information provided by https://subjectguides.york.ac.uk/referencing-style-guides/generative-ai
Citation order: Programmer, Year, Title, Version, Type of medium, URL, Date accessed
Active EndNote fields: Programmer, Year, Title, Version, Type, URL, Date Accessed
"If you're curious about how to use tools like me responsibly, I can share tips on ethical practices in academia! What do you think?" (Microsoft Copilot, 2025).
Microsoft Copilot (2025). 'Does Microsoft Copilot aid academic malpractice?' [AI assistant]. Available at: https://copilot.microsoft.com/ (Accessed: 11 April 2025).
Disclaimer: A citation may not be needed for these examples, but you should check your School's guidance in case you are required to acknowledge how you have used them.