Opinion

INSIGHT | The issue isn’t students using AI, it’s SA’s outdated academic models

In April, the Daily Maverick published a piece titled “CheatGPT Crisis”, highlighting growing concerns about the use of tools such as ChatGPT in SA universities. While the anxiety over academic integrity is understandable, this debate reveals a deeper issue — our higher education system is failing to evolve.

Chuma Memela. (SUPPLIED)


Every breakthrough in educational technology has faced scepticism. Calculators were once banned. Search engines were vilified. Yet we now consider them essential.

The shift did not come from the tools themselves changing, but from us adapting.

SA has one of the highest youth unemployment rates globally.

In this context, restricting access to the very tools shaping the digital economy is not only reckless; it prepares students for a world that no longer exists.

If anything, it does not preserve academic integrity; it undermines it.

The issue is not that students are using AI; it is that our academic models remain outdated while the world evolves rapidly.

Academic assessments have long prioritised surface-level recall over depth and reflection.

These formats are easily reproduced by AI, which should prompt us to rethink how we evaluate learning.

If ChatGPT can complete an assignment convincingly, then the issue is with the design, not the student.

In my view, real learning lies in complexity. Oral defences, reflective critiques, real-world cases, and collaborative projects force students to engage intellectually.

These tasks test judgment, reasoning, and understanding — skills AI cannot replicate.

Blaming AI is like blaming a thermometer for a fever. It reveals a symptom, not the cause. 

Success today depends on knowing how to use tools effectively.

Labelling AI use as “cheating” shows a lack of understanding about the future of work.

Globally, leading universities are embracing this shift. Consider a few:

  • University College London has modules on AI ethics and prompt design;
  • IE University in Spain integrates AI into law and business education;
  • Arizona State University has partnered with OpenAI to enhance teaching and learning;
  • University of Florida aims to embed AI into every major; and
  • Barnard College, Colby College, and the California State University system are investing in AI-driven curricula.

Even the University of Oxford and University of Pennsylvania have incorporated ChatGPT into educational design.

These institutions are not in crisis; they are preparing students to thrive in an AI-enabled world.

Meanwhile, many universities in SA remain stuck in the past, upholding handwritten essays as the gold standard.

But intellectual rigour has never resided in pen and paper.

Teaching students to prompt AI, evaluate its responses, and refine output with human insight is cognitive labour. It demands logic, clarity, and understanding: skills worth cultivating.

One of the understated advantages of AI is that it removes the mechanical layers of work, creating room for deeper engagement.

For example, a student might use ChatGPT to draft a literature review.

Instead of rejecting this, a lecturer could ask the student to annotate the draft, highlighting what was edited, what sources were verified, and what was kept or removed. The outcome would be a richer, human-led document.

Vietnam, for instance, allows AI integration through the British University's Artificial Intelligence Assessment Scale.

The results have shown higher performance, fewer plagiarism incidents, and better student engagement.

This is a testament to what is possible when AI is reframed as a tool rather than banned.

This is what SA’s universities must do: 

  • Institutionalise AI literacy as a core requirement. Every undergraduate should complete a credit-bearing module in AI literacy. This includes conceptual, ethical, and technical foundations for engaging with AI in any field.
  • Redesign assessments to reflect AI-assisted thinking. Traditional models rooted in rote learning must give way to formats that reward reflection and discernment. AI use should be permitted with proper engagement and analysis.
  • Teach prompt engineering as an academic skill. Prompt engineering is to AI what referencing is to academic writing. It must be taught, practised, and assessed like any other core competency.
  • Establish AI governance within faculties. Faculty-specific AI committees should guide implementation, ensuring policy is contextual and responsive to disciplinary needs.
  • Mandate annual AI training for staff. Lecturers must be equipped to design tasks that incorporate AI meaningfully. Annual, accredited training should be compulsory to close the gap between policy and practice.

We need to stop preparing students for an outdated world and start preparing them for the one they are entering.

If AI outperforms students, the issue is not the tool; it is the kind of thinking we are asking students to do.

We can cling to obsolete assessment models, or we can evolve and create learning environments that prioritise interpretation, discernment, and purpose. In closing, the rise of AI in academic spaces is not a threat but rather a wake-up call. The future demands it.

Chuma Memela, lead AI consultant at Genie-yus AI

