Responsible and Ethical use of Artificial Intelligence Technologies - Guidelines

Approval authority: Deputy Vice-Chancellor (Academic)
Responsible Executive member: Deputy Vice-Chancellor (Academic)
Designated officer: Pro Vice-Chancellor (Learning and Teaching Futures)
First approved: 11 July 2024
Last amended: 23 October 2024
Review date: 11 July 2026
Status: Active

1. Purpose

1.1 These guidelines provide guidance for staff and students on the ethical and responsible use of Artificial Intelligence, including generative Artificial Intelligence (genAI) tools such as ChatGPT, in their studies.

1.2 These guidelines must be read in conjunction with the linked Assessment: Courses and Coursework Programs – Procedures and Student Misconduct – Procedures.

2. Scope and application

2.1 These guidelines apply to all UniSC staff and students.

3. Definitions

3.1 Refer to the University’s Glossary of Terms for definitions as they specifically relate to policy documents.

4. Guidance for students

4.1 The use of outputs from Artificial Intelligence tools without appropriate acknowledgement constitutes academic misconduct. Students should confirm assessment requirements with teaching staff, or seek advice from the course coordinator, student support services or the library on how to acknowledge output from Artificial Intelligence tools.

4.2 The unauthorised use of Artificial Intelligence or paraphrasing tools can be a form of plagiarism, cheating or academic fraud and may result in a finding of academic misconduct.

4.3 Some editing tools also have genAI functions. Course coordinators can permit the use of editing tools for spelling and grammar checking; however, the use of the genAI functions of these tools is prohibited.

4.4 Students must acknowledge any use of Artificial Intelligence tools in assessment tasks, in accordance with any advice provided by course coordinators and the University.

4.4.1 Students must describe how the Artificial Intelligence tool has been used, how they have significantly modified the tool’s output so that it is their own work, and how these results have been integrated into their work, as appropriate to the specific advice within their discipline or course.

5. Guidance for staff

5.1 Course and assessment materials must clearly describe the conditions for the use of Artificial Intelligence in each assessment and state that inappropriate use, or failure to acknowledge the use, of Artificial Intelligence tools by students will result in academic misconduct.

5.2 Before referring possible cases of academic misconduct to the Integrity Compliance Unit (ICU), the teaching team must identify at least three strong indicators of secondary evidence for the ICU to investigate further, in accordance with the Student Misconduct – Procedures.

5.2.1 Potential indicators of secondary evidence include, but are not limited to:

(a) inconsistencies in the metadata – for example, differing authors, or an editing time that is unusually short or long for the document;

(b) inconsistencies in the formatting of the text – for example, differences in font or font colour, extra spacing, incomplete sentences, or evidence of text spinning;

(c) repetition of ideas over several sentences or paragraphs, repetitive use of key words, or use of terminology inconsistent with the student’s level of study or course content;

(d) artefacts of copying and pasting – for example, prompts or extra information left in the text that suggest information was sourced from elsewhere;

(e) inconsistent use of language throughout the document, including the use of archaic words or spellings, or switching between American English and Australian English spellings of the same word or term;

(f) references that are not used correctly – for example, wrong details in a citation, references that do not exist, a reference source that does not match its in-text use, or a lack of secondary citations;

(g) references that are old, from a foreign-language journal, or from a journal not normally associated with the speciality;

(h) over-referencing or under-referencing;

(i) incorrect details within a reference;

(j) a high Artificial Intelligence score or percentage in a detection tool (such as Turnitin); or

(k) a summary of the findings from an interview conducted by teaching staff with the student, which demonstrates the student’s lack of skills or knowledge regarding the assessment tasks under investigation.

6. Authorities and responsibilities

6.1 The Deputy Vice-Chancellor (Academic) is authorised to make these guidelines for the operation of University Policy. These guidelines must be compatible with the provisions of the Assessment: Courses and Coursework Programs – Procedures and the Student Misconduct – Procedures.

6.2 The Pro Vice-Chancellor (Learning and Teaching Futures) is authorised to make associated documents to support the application of policy documents. These must be compatible with the provisions of the respective policy document.

6.3 These guidelines operate from the last amended date; however, they do not come into effect until the effective start date, the commencement of Semester 2 2024, and apply only to study periods commencing after this date.

6.4 All records relating to the responsible and ethical use of Artificial Intelligence technologies must be stored and managed in accordance with the Information Management – Governing Policy.

6.5 This policy document must be maintained in accordance with the Policy Framework – Procedures and reviewed on a two-year policy review cycle.

6.6 Any exception to this policy document to enable a more appropriate result must be approved in accordance with the Policy Framework – Procedures prior to any deviation from the policy document.

END