
AI in academic research

Explore how to use artificial intelligence ethically in research.

What researchers need to know about generative AI

Generative AI tools can support research by improving workflows, assisting with analysis, and helping prepare outputs. Researchers, students and staff at Deakin can use both in-house and commercial tools to boost efficiency while maintaining academic standards.

But with opportunity comes risk, from ethical concerns to data security and research integrity. As the technology evolves, so do the standards for using it well. This page and the resources below will help you use AI responsibly and in line with research integrity principles.

Student guide to using generative AI

Deakin Library guide for using generative AI

Living Knowledge Review System

What is generative AI?

Generative AI refers to tools that create new content by analysing existing data. Examples include ChatGPT, DALL·E, Bing, Scite.ai and Elicit, which can generate text, images, research summaries and more.

These tools can help summarise content, automate tasks, support creativity and solve complex problems. But they also raise challenges around ethics, data quality and reliability. As use grows in research and higher education, so does the need for guidance on using AI responsibly.

Potential benefits for researchers

Generative AI tools can assist researchers by streamlining critical analysis, synthesis, design and writing processes, including the preparation of grant, fellowship and project proposals.

They can also encourage critical and creative thinking and support data analysis, highlighting key findings and reducing manual work.

How to input data safely

Commercial generative AI platforms like ChatGPT, Bing and others store data externally. Once entered, your information may be retained, reused or accessed beyond your control, even if the platform offers options to delete or restrict access.

To minimise risk, use only non-sensitive, non-confidential content that is suitable for public or external use. Take care with what you upload, considering data security, reuse risks and your ethical and legal responsibilities. This includes when preparing grants, fellowships or publications.

Submitting the wrong type of data can breach Deakin policies, the National Statement on Ethical Conduct in Human Research and other research integrity standards. It may also expose participants, projects or the university to legal or reputational harm.

The following types of data should never be submitted to external AI tools:

Copyrighted or third-party material

Avoid uploading content you do not own or control. This includes confidential intellectual property, co-owned Indigenous Cultural and Intellectual Property or materials protected by copyright. Further detail can be found in Deakin’s Intellectual Property Policy.

Private or personal information

Do not submit personally identifiable information, including names, contact details, login credentials, images or health records.

Confidential or sensitive data

Exclude information related to cultural practices, vulnerable species, security classifications, export controls or material shared under confidentiality agreements.
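As a purely illustrative sketch (not a Deakin tool or requirement), a researcher might run a basic pre-check over text before pasting it into an external generative AI platform. The Python example below uses hypothetical placeholder patterns for email addresses and phone numbers; pattern matching of this kind cannot detect most personal, confidential or culturally sensitive information, so it is no substitute for your own judgement or the obligations described on this page.

```python
# Illustrative sketch only: a lightweight pre-check before pasting text into an
# external generative AI tool. Pattern matching catches only obvious identifiers
# (email addresses and phone numbers); it is not a substitute for Deakin policy,
# ethics approval or careful review of confidential or sensitive content.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+(\.[\w-]+)+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact_obvious_identifiers(text: str) -> str:
    """Replace obvious email addresses and phone numbers with placeholders."""
    text = EMAIL.sub("[REDACTED EMAIL]", text)
    text = PHONE.sub("[REDACTED PHONE]", text)
    return text

if __name__ == "__main__":
    sample = "Contact Dr J. Citizen on +61 3 9244 0000 or j.citizen@example.edu.au."
    print(redact_obvious_identifiers(sample))
    # -> Contact Dr J. Citizen on [REDACTED PHONE] or [REDACTED EMAIL].
```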

Need help with ethics and integrity?

Whether you're planning a project or preparing to publish, get across all the research ethics you need to know as a Deakin graduate research student.

Visit Research Integrity

Your key obligations when using generative AI

Responsible research is founded on principles such as honesty, rigour, transparency, fairness, respect and accountability. If you're using generative AI in your work, you're expected to apply these principles and take responsibility for any content or outputs it helps generate.

Researchers at Deakin must ensure their use of generative AI complies with all relevant ethical, legal and professional standards.

You are legally and ethically responsible

Your use of generative AI must comply with all relevant legislation, institutional policies, research guidelines and ethical frameworks. Breaches of research integrity will be handled in accordance with Deakin’s Research Integrity Breaches Procedure.

You must adhere to conduct policies

Ensure your use of AI aligns with Deakin’s Research Conduct Policy and the Australian Code for the Responsible Conduct of Research, 2018 (the Code).

You must be transparent and accountable

Clearly document and disclose how AI has been used in your research, including any limitations or potential biases.

Generative AI is not a peer review tool

Do not use generative AI to assess peer review material, such as grants, manuscripts, HDR theses or ethics applications. This may breach confidentiality requirements and academic integrity standards.

Familiarise yourself with scholarly publishers’ policies

Publishing guidelines are changing rapidly. Regularly check the latest publisher requirements on AI use and disclosure. You can find more detail in Scholarly publishing and generative AI - Your publishing plan - LibGuides at Deakin University.

Review for inaccuracy and bias

Generative AI can reproduce biases from its training data, leading to outputs that are inaccurate, misleading or harmful. This includes structural bias like racism, discrimination and underrepresentation.

When using these tools, critically assess how you frame prompts and interpret results. Always check for bias, assess accuracy, and make sure your use aligns with ethical research standards.

Using generative AI in HDR thesis writing

As a research student, you can use generative AI tools in your work, but only in specific, clearly defined ways:

  • Generative AI may be used for copyediting and proofreading thesis text only. Refer to the Institute of Professional Editors (IPEd) guidelines for what qualifies as copyediting and proofreading support
  • You must not input confidential or sensitive information into AI tools, as this breaches the Research Conduct Policy and the Australian Code
  • Using generative AI to create or manipulate images is not permitted in your thesis, unless image generation is part of your approved research methodology and fully disclosed
  • Submitting AI-generated text as your own writing is a breach of Deakin’s Higher Research Degree Policy and academic integrity standards
  • While Deakin does not use AI-detection software to assess student work, using generative AI to write or redraft text in your thesis is not permitted.

Contact us

Research Integrity
For guidance or questions about research integrity, contact the team at research-integrity@deakin.edu.au