NYARI SAMUSHONGA
THE world has been sent into a frenzy over OpenAI’s latest iteration of its chatbot, ChatGPT.
The fear that platforms such as ChatGPT evoke in educators must be acknowledged.
Some professors have already caught students using ChatGPT to produce their coursework submissions. As educators, one of our fears is that we grade students as competent in subjects whose concepts they have not grasped. Students' ability to pass off ChatGPT's work as their own has amplified this fear.
As with all things, there is the other side of the coin. In this case, it is great excitement at the prospect of a chatbot being able to do the heavy lifting and mundane tasks, freeing us to be the creative beings we were born to be.
It is this angle that compels the more open-minded among us to ask whether young people leveraging powerful technology tools to aid their work should be considered cheating, or whether it is a skill and a resource that should be embraced.
This perspective would signal a shift to a new paradigm, one in which humans and machines work together to produce outcomes superior to what was previously possible. It is the kind of shift that previous industrial revolutions have ushered in.
But it is the third side of the coin, the perimeter that runs around the edge and connects the two sides, that is perhaps the most intriguing. This is not some new, objective source of intelligence. It is a mirror of us — it is a programme that consumes datasets created by humans and learns from those datasets how to mimic human knowledge. This means that the fuel behind its “brain” is a collection of things that have previously come from human brains.
When we view it through the lens of the digital divide, we must acknowledge that the technology is laced with all our societal biases and blind spots. Far from being precise and objective purveyors of information, programmes like ChatGPT are not just biased; they are trained to represent the realities of the digital “haves” while minimising or excluding those of the digital “have-nots”.
We have all these layers of complexity around masses of data that are inherently flawed. When we apply algorithms to them, they produce an output that runs the risk of being perceived as “true” or “objectively correct”. This is where the consequences of digital inequity can become quite dangerous.
We have seen this in the criminal justice system’s reliance on facial recognition tools that are poorly trained at identifying people of colour. In medical diagnostics, clinical data from industrialised nations has been incorrectly presumed to be a representative sample of the broader world, resulting in compromised care for patients in developing nations.
So how do we deal with the coin as a whole? Can we prevent Africa and other developing regions from being left even further behind as artificial intelligence entrenches the perspective of the digital “haves” as the universal truth? How should we approach this topic from the perspective of education, educators and society?
It is not all doom and gloom. We have known for a while that we need to end digital inequity, and there is no question that all new technology development must be approached ethically. If we wish to use technology that represents a holistic rather than an exclusive view of reality, it is paramount that Africans are part of building the datasets and the tools that consume them. This will reduce the biases and enrich the quality of the insights we can gain.
At the same time, we must train people ethically and ensure that this superpower is in the hands of people who can leverage its strengths while minimising the damage it can cause. As we raise the next generation of technologists, we need to impress upon them the need to think before they code.
It is evident, then, that technology like ChatGPT asks far bigger questions of us as educators than simply how we are going to prevent cheating. This technology is here to stay and will probably become more sophisticated.
We educators can be instrumental in ensuring that as technology rapidly evolves, more Africans have a seat at the table and are able to be a part of these pivotal technological developments.
About the writer: Nyari Samushonga is the CE of WeThinkCode, a South Africa-based software development training academy. – Mail & Guardian