Student guide to using generative AI

While generative AI tools are still in the early stages of development, they are evolving rapidly. As a Deakin student, it is essential that you use these tools critically, effectively and ethically. It may take some practice to build these skills throughout your studies.

Deakin is an innovative and proactive educator that aims to develop students’ capacity to use current technology. Use this guide to build the awareness, knowledge and skills you need to use these technologies ethically and responsibly as a digitally fluent citizen.


Generative artificial intelligence (AI) uses machine learning to respond to prompts entered by the user. Well-known examples include the text generator ChatGPT and the image generator DALL-E 2.

Generative AI has a wide range of potential applications across art, architecture, law, business, IT, game design and many other disciplines.

Using generative AI with integrity

Here are some guidelines to consider when using generative AI in your study and assessments:

  1. Make sure that the final product is your own work, and not just copied from an AI generator. You can use the generated text as a learning tool for inspiration or guidance, but the final submitted assessment must be your own work, creation, and analysis.
  2. You must acknowledge where generative AI tools have been used in an assessment. Appropriately acknowledge any AI-generated content you use in your work, and clearly indicate where you have used an AI tool and to what extent. In addition, be aware of the University’s Student Academic Integrity policy and your responsibility to follow it.
  3. Understand the risks and limitations of using such tools and critically evaluate any output they produce.

Learn more about how to do this in the next section of this guide.

There are many exciting applications of generative AI tools. However, you should use them with caution, critically evaluating both how you use them and the output they produce.

Limitations of some generative AI tools include:

  • Currency - Content generated may or may not be up to date.
  • Accuracy - While generated content might seem plausible, it can be inaccurate and sometimes completely untrue. Fluent and grammatically correct responses can be deceiving, so it is important to fact-check and quality-assure all output.
  • Obscured authorship - Tools may not provide adequate, or any, sources for the output they produce, and where sources are provided they may not be accurate.
  • Biases - Content may reflect the biases inherent in a tool’s algorithms, despite any claims to the contrary. In addition, be aware that these are commercial tools and may be influenced by commercial objectives.
  • Quality of output - Output can be clichéd and generic in nature and, as stated previously, inaccurate or untrue.

Assessing the value of the tool

Given these risks and limitations, ask yourself: ‘For what purpose am I using this tool?’

  • Will exploring this tool save me time and enhance my learning – or will it distract me from my purpose?
  • Would it be more effective to investigate my task in another way?
  • Am I using the right tool for the right job, and using it in an ethical way that aligns with Deakin’s Student Academic Integrity policy?

Other risks to consider:

  • Intellectual property - Keep in mind that while generated content may appear to be a “unique” creation, it is based on the data used to train the AI and may, in fact, breach intellectual property rights.
    In addition, when you submit content to an AI platform through your prompts, you grant the AI service the right to reuse and distribute that content, which may result in a breach of copyright. Providing copies of text, images, sound or video that you do not own the copyright for when prompting AI is likely to be a breach of copyright.
  • Privacy and security - Be aware that AI tools learn from user inputs, so do not enter any personal details or confidential data related to your work.
  • Future storage - Consider that both your inputs and the outputs are stored by the owner of the AI tool. What happens to this data in the future? What happens if it is connected to you and becomes publicly available?

The bottom line is that you always need to be careful about what you input into generative AI tools and to critically analyse any output. In most cases you will need to validate output by cross-checking it against information from credible sources.

Learn more about using and citing AI in the next section of this guide.

Have I evaluated the credibility of my sources?

Before you cite any source of information, you need to evaluate it for credibility.

Relying on AI-generated content as the primary source of information for an assessment may not be acceptable.

While you might use AI-generated content in your early investigations for a task, you need to ask yourself if it is a reliable source to cite. If you decide it is, then it needs to be cited appropriately.

In most cases, you will need to follow up with further research from credible sources. We recommend citing journal articles, books, websites and other materials from reliable and credible sources. Your Unit Chair, as well as the Library, will be able to advise further on appropriate sources in your discipline.

Have I produced my own assessment?

Assessments are designed to test your understanding of a subject and your ability to apply that knowledge. If you simply use AI to write your assessment, the value of the assessment is diminished, you may not develop the skills required for your future studies or career, and you would be breaching academic integrity.

Taking your own notes, synthesising the information you have read, and writing your own drafts are important steps in the learning process. Your assessments are only one outcome of your learning in a trimester; you also gain skills and knowledge at each step of producing them. Using AI-generated text to simply write your assessments can undermine this learning process.

How do I acknowledge the use of AI?

All sources used in assessments – including generative AI tools – require full and proper acknowledgement. Using AI-generated content without acknowledgement is a breach of academic integrity that may result in academic misconduct allegations and subsequent consequences.

The Deakin Guide to Referencing has advice on when and how to acknowledge the use of AI tools. This advice is likely to be updated, so please check it again each trimester.

Go to your required referencing style in the Deakin Guide to Referencing, then navigate to Other Sources > Artificial Intelligence.
