Generative AI Tools
These Teaching, Learning and Assessment focused FAQs support the University’s main GenAI information portal at Generative AI Tools.
Over the past year, a number of roundtables, open fora, workshops, and even a festival have been held with staff across the University on how we can productively engage with artificial intelligence at Newcastle. This FAQ collates key issues from these discussions and will help you understand how to approach generative AI (GenAI) during the 2025 academic year. For more information, see the collection of resources (links embedded) covering what GenAI is and how it might be useful for teachers and students.
This page was last updated 16 April 2025
Prepared by LDTI and oPVCEI
Suggestions for additions and improvements welcome, please contact LDTI@newcastle.edu.au
Access to GenAI
All University of Newcastle staff (and students) can access Microsoft Copilot with Enterprise Data Protection. Information on how to access Copilot is available at Microsoft Copilot: Enterprise Data Protection - How to Guide. This is the only GenAI application currently endorsed by the University.
Microsoft Copilot allows staff to interact with advanced GenAI models (OpenAI’s GPT-4 for text and DALL-E for images). To request or enquire about any non-endorsed GenAI application, please get in touch with your business partner.
Find out more:
Yes.
Microsoft Copilot with Enterprise Data Protection should be used for any work involving the use of personal, private, sensitive, or confidential data.
A range of other platforms are available. While these are not necessarily endorsed or supported by the University, they may be beneficial to the work you do provided they are used thoughtfully and in line with the relevant University policies.
GenAI usage should adhere to the same policies, procedures, and guidelines as any other system, so make sure that any use of a GenAI tool or platform adheres to our policies and procedures.
Find out more:
Digital Technology Solutions’ guidance on the Responsible Use of AI recommends critically considering the data used to train a model when deciding which AI tools should be used.
Copilot with Enterprise Data Protection is a platform that allows users to interact with OpenAI's GPT-4 large language model (LLM). OpenAI’s models have been trained on select publicly available data, proprietary data from data partnerships, and human feedback.
Find out more:
Please refer to the Data Security and Privacy information available on the Library Generative AI tools page.
Do not enter any personally identifiable information (PII) or other sensitive or confidential material. PII is any information that can be used to confirm a person’s identity (e.g. names, addresses, phone numbers, etc.).
Avoid entering any information that is protected by copyright, unless it aligns with principles like fair dealing, educational exceptions, or if explicit permission has been granted.
Consider the University’s Data Classification and Handling Policy and Standard. Information classified as “public” may be used with minimal risk (assuming it does not involve PII).
For any questions about interpreting and applying the University’s Information Security Policies, contact the DTS Cyber Security Team.
Find out more:
Copyright is an important consideration when using GenAI. All staff members need to be aware of their responsibilities when working in this space. The Library has created a go-to guide for navigating the copyright landscape to help you use GenAI tools in a non-infringing way.
Review the information available at Copyright for Generative AI. For specific guidance contact the Copyright Advisor.
Find out more:
Policy Framework
Governance
The Artificial Intelligence Governance Policy (under development) guides the governance, use, procurement, development and management of artificial intelligence (AI) at the University of Newcastle for the purposes of teaching, learning, research and operations.
Academic Integrity
The Academic Integrity and Ethical Academic Conduct Policy sets the expectations to uphold the standards of academic and research integrity at the University.
The Interview on Assessment Items Procedure outlines the process for conducting an interview to confirm authenticity of a student's assessment task.
Course Coordinators must clearly communicate expectations regarding academic integrity, including appropriate use of generative artificial intelligence and other emerging technologies.
Teaching, Learning, and Assessment
The Policy on the use of Generative AI in Teaching, Learning and Assessment outlines principles to support student, course, discipline, Program, School and College decisions on the appropriate application of GenAI.
The Course Management and Assessment Manual and Program Management Manual – Coursework provide instructions for designing, managing, and reviewing courses and programs.
Through coherent arrangement of teaching, learning, and assessment activities, program design should foster progressive and verifiable achievement of expected learning outcomes.
Course design should include careful consideration of appropriate assessment which reflects the principles outlined in the Course Assessment and Grading Manual.
Information Security
GenAI usage should adhere to the same policies, procedures, and guidelines as any other system. Make sure that any use of a system adheres to our policies and procedures, including but not limited to the Information Security Policy, Information Technology Conditions of Use Policy, Procurement Policy, Privacy Policy, Privacy Management Plan, and the Information Security Data Classification and Handling Manual.
Assessments and Academic Integrity
Short of complete in-person supervision of all student work, there is no foolproof way to prevent student use of GenAI.
For unsupervised assessment tasks, there are strategies that can be implemented to make use of GenAI more difficult and less effective. Staff should consider the following risk factors when (re)designing assessment tasks.
Risk Factor | Description
---|---
Type | Some task types (for example, a written essay) will have a higher level of risk than others (e.g. an oral presentation or invigilated exam).
Context | Tasks based on very specific material, or that involve authentic application to novel real-world situations, are likely to have lower risk than tasks based on more generally available information or basic concepts.
Conditions | Fully online assessments may involve higher levels of risk than those that involve some in-person component.
Output | Tasks that involve creation of an artefact (for example, an essay or report) may involve higher levels of risk than tasks that involve a performance component.
Submission | Tasks that end at submission may have a higher level of risk than those that involve a post-submission discussion or follow-up (for example, a Q&A post presentation).
Quality of AI Output | Tasks where AI-generated output is of high quality present greater risk than those where the output quality is poor and therefore recognisable as AI-produced. Staff are encouraged to test their assessment via a secure GenAI platform and consider the output quality.
Invigilation | Fully invigilated tasks have a comparatively low risk.
The University’s position is that students must “refrain from relying on any form of artificial intelligence or paraphrasing tools for the completion of their assessment tasks, unless the prescribed assessment task specifically allows this” (Academic Integrity and Ethical Academic Conduct Policy).
Acceptable use will vary between disciplines, courses, and even different assessments within a course. This could range from cases where the use of generative AI tools is an integral part of completing the assessment, right through to assessments where AI should not be used in any way. Course Coordinators must clearly communicate expectations regarding academic integrity, including appropriate use of generative artificial intelligence and other emerging technologies.
Find out more:
Below are some examples of information that could be included in your Canvas course site or communicated to students in class. Information should be amended to reflect the specifics of your assessment task.

Assessment items where GenAI must not be used at all
For this assessment, my expectation is that students will not use any form of generative artificial intelligence. This includes, but is not limited to, AI writing assistants, content generators, or AI-based problem-solving tools. Your submission should be entirely your own work, reflecting your understanding and analysis.
Any use of artificial intelligence will be considered academic misconduct, and you may be subject to penalties.

Students may use GenAI in preparing their assignment
For this assessment, students may use generative AI for preliminary research, brainstorming, or drafting purposes. However, it is essential that all AI-generated content or insights used in the final submission be clearly acknowledged, detailing the nature and extent of AI assistance.
See AI tools: Cite for information on how to reference the use of artificial intelligence.

Use of GenAI is an explicit part of the assessment
For this assessment, my expectation is that the use of generative AI will be central to the task. You are encouraged to leverage AI technologies to achieve the assignment's objectives, with a clear requirement to outline how AI was used, the rationale behind its use, and its impact on the final product.
See AI tools: Cite for information on how to reference the use of artificial intelligence.
Refer to the “Cite” section of the Library’s GenAI Tools guide.
The University Teaching and Learning Committee has approved the use of Turnitin's AI detection capability. Staff should not enter student work into any AI detection platforms other than Turnitin.
No student should be referred to SACO on the basis of a TII score alone. Supporting documentation should include, but is not limited to:
- Samples of comparisons between the work in question and the student’s previous work
- The outcomes of an Interview on Assessment undertaken to confirm the student’s understanding of the assessment item
- Portions of the work that you believe are AI generated
- Evidence of misconduct based on AI usage
AND details of the TII detection score.
Find out more:
Learning Design and Teaching Innovation's team of Senior Learning Designers can provide guidance and recommendations. Contact LDTI@newcastle.edu.au to book a consultation. Major assessment change can take time. It may not always be feasible to implement large-scale changes within a current teaching period, particularly when changes result in a level of course revision that requires consideration and approval through governance committees (e.g. School/College Teaching and Learning Committees). Consider the lead time required to make changes to your assessments.
Find out more:
Teaching and Learning
Generative AI should not be used to mark student work unless the output is reviewed and approved by the marker/course coordinator before the output is shared with the student. Any use of generative AI in marking processes must be clearly communicated to students, with students given the opportunity to opt out of the use of GenAI in marking their work.
Staff must:
- Communicate any use of GenAI in marking processes prior to the assessment submission deadline.
- Review and approve the output of any GenAI used in marking processes, prior to provision of marks to students.
- Communicate the option for students to opt out of the use of GenAI in marking their submission.
- Retain responsibility for the accuracy of marks provided to students.
- Refrain from using GenAI in marking any work where the student has opted out.
When used appropriately, AI-generated feedback could provide real benefits to educators in terms of efficiency and possibly also quality of feedback. Depending on the platform or method used to access AI-generated feedback, students could also benefit greatly from 24-hour, on-demand feedback.
However, there are a number of unanswered questions and concerns about this approach, particularly if AI is used for feedback on high-stakes assessments. These include possible limitations on effectiveness due to educators' levels of AI literacy and familiarity with the available tools, as well as ethical considerations and risks related to privacy and intellectual property.
Feedback could be accessed and/or provided in a number of ways:
Student Sourced Feedback
Given the increasing sophistication of openly available GenAI platforms, and the growing number of tailored agents, it is quite possible that students are accessing AI-generated feedback without any involvement from educators. An example would be a student entering a draft essay into Copilot (or another platform) and prompting the AI to provide feedback on structure, tone, strength of argument, etc.
Educator Sourced via Prompt Engineering
Similarly, educators may enter student work and ask AI to provide draft feedback, which can then be reviewed, edited, and provided to the student. As with the approach above, there are obvious potential privacy risks, particularly if the AI platform does not have appropriate security measures in place.
Off the shelf options via proprietary platforms
There are a growing number of ed tech providers starting to embed generative artificial intelligence feedback into their platforms. The University also provided all students with access to AI generated feedback via Studiosity’s Writing Feedback+ platform.
Find out more:
Much of the recent discussion around the use of generative artificial intelligence has focused on understandable concerns in relation to assessments and academic integrity. While these concerns are valid, and will take significant effort to address, the increasing access to, and sophistication of, generative artificial intelligence systems is also resulting in opportunities that are worth exploring.
GenAI can assist with course and program design and prototyping, creating assessment tasks and questions, drafting teaching materials, creating learning activities and more.
Learning Design and Teaching Innovation can assist with options for utilising GenAI to support teaching and learning.
Find out more:
Communication Resources and Training
Keep an eye out for regular updates from the Artificial Intelligence Working Group via The Loop (previous updates available here). The Working Group also provides formal updates to the University Teaching and Learning Committee.
The Artificial Intelligence Community of Practice provides a platform on which all staff can engage in open and informal dialogue about their experiences with AI, float possible benefits to their work, share interesting articles and information, and discuss how to use AI in their day-to-day work.
Academic Integrity Module - All students are required to complete the Academic Integrity Module. This includes information on appropriate use of GenAI.
Student Webpages - The Academic Integrity and Artificial Intelligence in Assessment webpages present key messages to students.
Canvas Content – Since Trimester 1, 2025, all Canvas course sites have included general information for students, including:
- Promotion of the NUPrep course EPPREP860 - Digital and AI Literacies
- A basic introduction to GenAI
- Information about student access to Microsoft Copilot with Enterprise Data Protection
- Issues and considerations for AI use, including data security and privacy
- Using GenAI: Academic Integrity
- An introduction to prompting
- Ideas for the use of GenAI in studies
ASKUON – The ASKUON Knowledge base includes answers covering the main points that students need to know.
Course Coordinator Slides – the AI Working Group provides slides for course coordinators to use when discussing GenAI with their students.
Commencing Student Presentations – Key messages are delivered to commencing students via a variety of channels (e.g. Welcome Week presentations).
The AI Working Group maintains a library of resources (academic papers, reports, presentations, etc) from across the global higher education sector.
Learning Design and Teaching Innovation regularly creates, publishes, and distributes new online resources to support teaching staff.
LDTI Workshops - The Education Development and Learning Design teams in LDTI have designed and delivered a series of workshops to upskill academic staff in GenAI technologies. These workshops, grounded in pedagogical principles, aim to enhance teaching practices and improve student learning across face-to-face, blended, and online environments.
External Sessions – The AI Working Group manages and promotes a calendar of GenAI related events from across the sector.
The Educator Network – tEN delivers regular Showcase Events and Learning Lunches.
EDUC6902 – The School of Education course Emerging Technologies: Planning teaching and assessing covers a range of technological tools and gives tertiary educators the knowledge and skills to practically apply and integrate emerging technologies into their own educational programs.
Free Introductory GenAI Courses – the AI Working Group identifies and promotes a range of Free Introductory GenAI Courses. Some can be completed in as little as 30 minutes.
The University of Newcastle acknowledges the traditional custodians of the lands within our footprint areas: Awabakal, Darkinjung, Biripai, Worimi, Wonnarua, and Eora Nations. We also pay respect to the wisdom of our Elders past and present.