TechCentral

    Bots can be brutal

    AI chatbots are becoming more human-like, to the point that some people may struggle to tell if they're human or machine.
By The Conversation | 19 August 2023

Artificial intelligence (AI)-powered chatbots are becoming increasingly human-like by design, to the point that some among us may struggle to distinguish between human and machine.

    This week, Snapchat’s My AI chatbot glitched and posted a story of what looked like a wall and ceiling, before it stopped responding to users. Naturally, the internet began to question whether the ChatGPT-powered chatbot had gained sentience.

    A crash course in AI literacy could have quelled this confusion. But, beyond that, the incident reminds us that as AI chatbots grow closer to resembling humans, managing their uptake will only get more challenging – and more important.


    Since ChatGPT burst onto our screens late last year, many digital platforms have integrated AI into their services. Even as I draft this article on Microsoft Word, the software’s predictive AI capability is suggesting possible sentence completions.

    Known as generative AI, this relatively new type of AI is distinguished from its predecessors by its ability to generate new content that is precise, human-like and seemingly meaningful.

    Generative AI tools, including AI image generators and chatbots, are built on large language models (LLMs). These computational models analyse the associations between billions of words, sentences and paragraphs to predict what ought to come next in a given text. As OpenAI co-founder Ilya Sutskever puts it, an LLM is “just a really, really good next-word predictor”.
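The “next-word predictor” idea can be illustrated with a deliberately tiny sketch. The snippet below is not how a real LLM works – real models learn statistical associations over billions of tokens using neural networks – but a simple bigram counter over a made-up corpus shows the core intuition: given a word, predict what most often comes next.

```python
from collections import Counter, defaultdict

# Toy illustration of next-word prediction (not a real LLM).
# We count, in a tiny hypothetical corpus, which word most often
# follows each preceding word, then "predict" by picking the
# most frequent follower.
corpus = (
    "the cat sat on the mat "
    "the dog sat on the rug "
    "the cat chased the dog"
).split()

# For each word, count the words observed immediately after it
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most frequently observed word after `word`."""
    return following[word].most_common(1)[0][0]

print(predict_next("sat"))  # "on" – the only word seen after "sat"
print(predict_next("on"))   # "the" – the only word seen after "on"
```

A real LLM replaces these raw counts with learned probabilities conditioned on the entire preceding context, which is what lets it produce fluent, seemingly meaningful text rather than short memorised phrases.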

    Advanced LLMs are also fine-tuned with human feedback. This training, often delivered through countless hours of cheap human labour, is the reason AI chatbots can now have seemingly human-like conversations.

    Flagship

    OpenAI’s ChatGPT is still the flagship generative AI model. Its release marked a major leap from simpler “rules-based” chatbots, such as those used in online customer service.

    Human-like chatbots that talk to a user rather than at them have been linked with higher levels of engagement. One study found the personification of chatbots leads to increased engagement which, over time, may turn into psychological dependence. Another study involving stressed participants found a human-like chatbot was more likely to be perceived as competent, and therefore more likely to help reduce participants’ stress.

These chatbots have also been effective in fulfilling organisational objectives in various settings, including retail, education, the workplace and healthcare.

    Google is using generative AI to build a “personal life coach” that will supposedly help people with various personal and professional tasks, including providing life advice and answering intimate questions.

This is despite Google’s own AI safety experts warning that users could grow too dependent on AI and may experience “diminished health and wellbeing” and a “loss of agency” if they take life advice from it.

    In the recent Snapchat incident, the company put the whole thing down to a “temporary outage”. We may never know what actually happened; it could be yet another example of AI “hallucinating”, or the result of a cyberattack, or even just an operational error.

    Either way, the speed with which some users assumed the chatbot had achieved sentience suggests we are seeing an unprecedented anthropomorphism of AI. It’s compounded by a lack of transparency from developers, and a lack of basic understanding among the public.

    We shouldn’t underestimate how individuals may be misled by the apparent authenticity of human-like chatbots.

    Earlier this year, a Belgian man’s suicide was attributed to conversations he’d had with a chatbot about climate inaction and the planet’s future. In another example, a chatbot named Tessa was found to be offering harmful advice to people through an eating disorder helpline.

    Chatbots may be particularly harmful to the more vulnerable among us, and especially to those with psychological conditions.

    Bots can be brutal

    You may have heard of the “uncanny valley” effect. It refers to that uneasy feeling you get when you see a humanoid robot that almost looks human, but its slight imperfections give it away, and it ends up being creepy.

    It seems a similar experience is emerging in our interactions with human-like chatbots. A slight blip can raise the hairs on the back of the neck.

    One solution might be to lose the human edge and revert to chatbots that are straightforward, objective and factual. But this would come at the expense of engagement and innovation.

    Even the developers of advanced AI chatbots often can’t explain how they work. Yet in some ways (and as far as commercial entities are concerned) the benefits outweigh the risks.

    Generative AI has demonstrated its usefulness in big-ticket items such as productivity, healthcare, education and even social equity. It’s unlikely to go away. So how do we make it work for us?


    Since 2018, there has been a significant push for governments and organisations to address the risks of AI. But applying responsible standards and regulations to a technology that’s more “human-like” than any other comes with a host of challenges.

Currently, there is no legal requirement for Australian businesses to disclose the use of chatbots. In the US, California has introduced a “bot bill” that would require this, but legal experts have poked holes in it – and at the time of writing, the bill has yet to be enforced.

    Moreover, ChatGPT and similar chatbots are made public as “research previews”. This means they often come with multiple disclosures on their prototypical nature, and the onus for responsible use falls on the user.


The European Union’s AI Act, the world’s first comprehensive regulation on AI, has identified moderate regulation and education as the path forward – since excess regulation could stunt innovation. Similar to digital literacy, AI literacy should be mandated in schools, universities and organisations, and should also be made free and accessible to the public.

    • The author, Daswin de Silva, is deputy director of the Centre for Data Analytics and Cognition, La Trobe University
    • This article is republished from The Conversation under a Creative Commons licence
