Guidelines for IHL Educators When Using AI Tools for Grading and Assessment Purposes
Published on: 07 May 2026
Name and Constituency of Member of Parliament
Ms He Ting Ru, Sengkang GRC
Question
To ask the Minister for Education, in relation to the usage of artificial intelligence tools by educators in Institutes of Higher Learning for grading and assessment purposes (a) what safeguards govern accuracy, bias, confidentiality of student data, and human oversight in the final marking decision; and (b) whether the Ministry can share more details of the guidelines issued to educators on their use.
Response
1. MOE has shared guidelines on the use of AI in education with all Institutes of Higher Learning (IHLs). These guidelines cover the use of AI in teaching and assessment, as well as how AI use should be aligned with learning outcomes. For details on these guidelines, members may refer to the response to Questions 41 and 42 for oral answer in the 24 September 2025 Order Paper.
2. In line with MOE's guidelines, the IHLs use AI for assessment in a careful and calibrated manner. Before deployment, AI grading tools are validated for reliability to ensure that assessment outcomes accurately reflect students' mastery of the intended learning outcomes. More importantly, educators retain responsibility for the final grading decision by remaining in the loop throughout the assessment process and reviewing individual assessment outcomes. Educators must also inform their students when AI is used in assessments and communicate the grading criteria clearly.
3. In line with prevailing government guidelines for data governance, all IHLs have institutional safeguards to protect the confidentiality of student data. For instance, only institutionally approved AI platforms are used for AI-assisted assessment, and all personal identifiers are removed from student work before it is processed by AI grading systems.