GenAI University wide policy and practice

Brookes’ Position on the use of Generative AI in teaching, learning and assessment: Embrace and Adapt

In line with the majority of the sector, Brookes has taken what JISC would describe as a progressive ‘embrace and adapt’ approach to generative AI. This approach recognises the value of GenAI in developing digital literacy and as a learning tool, and acknowledges that its use cannot easily be systematically controlled or detected. The ‘embrace and adapt’ (JISC, 2023) approach is sympathetic to Brookes’ graduate attributes and lends itself to authentic assessment, which is integral to the IDEAS model of curriculum development. Like many institutions in the sector, we have not, at this stage, adopted the Turnitin AI detection tool, due to concerns about false positives.


  • Module leaders have the discretion to advise on AI use in the assessment for their modules;
  • Module handbooks should include up-to-date information on using GenAI in research or assessment;
  • In line with QAA guidance, students are asked to declare their use of AI on submission of an assessment and to check with module leaders before using it.

Use of GenAI should take account of ethical and data privacy considerations. Any use of GenAI software not supported by the University must be in accordance with the IT Acceptable Use Policy and sanctioned by the relevant authority in the IT Directorate before being introduced (please contact the IT Directorate to discuss your requirements).

Note that any unsanctioned software or hardware found on the Oxford Brookes University network could expose the University to the risk of a personal data breach or security incident.

To further support this future-facing position, the GenAI Working Group has made three strategic recommendations:

Assure academic integrity: 
Establish a cross-university position on the acceptable use of AI in assessment and embed this in the Assessment and Feedback policy.

Build generative AI literacy:
Offer academic development for staff and students, raising awareness of generative AI, its appropriate and critical use in teaching, learning and assessment, and the associated data security and ethical issues.

Remain agile and responsive:
Be alert to new developments, challenges, risks and opportunities and open to revisiting guidance, policy and practice.

The Brookes AI in HE Working Group includes representatives from Oxford Centre for Academic Enhancement and Development (OCAED), Centre for Academic Development (CAD), Faculties, Learning Resources, and the Student Investigation and Resolution Team (SIRT).