    Your data, your hardware: the DIY AI revolution is coming

    Falling hardware costs will make powerful, home-hosted large language models both practical and essential.
    By Duncan McLeod, 20 November 2025

    If you’ve played with ChatGPT, Claude or Copilot for more than five minutes, you’ve probably had the same uneasy thought: I’m pouring my life into someone else’s black box.

    Every query, draft contract, medical worry, marital gripe, trade secret and half-baked business idea goes up to data centres run by a handful of US (and Chinese) tech giants. They promise to protect it. But the basic power imbalance is obvious: they own the servers, so they set the terms.

    Over the next decade, that imbalance is going to be challenged – not just by regulation, but by something more basic: commodity hardware. We are heading for a world where it becomes perfectly normal to run serious AI language models on machines you own, in your study or in your home server cabinet. It won’t happen tomorrow, or even next year, but it will happen – and I plan to be among the first to do it (wallet willing).

    Right now, building your own serious AI server is still eye-wateringly expensive.

    At the extreme end, Nvidia’s hardware is priced beyond the reach of most individuals. Industry guides suggest a fully configured H100 server with eight H100 GPUs costs well north of US$300 000 all-in. New Blackwell-based systems – the kind of kit hyperscalers like Microsoft, Google and Meta Platforms buy in bulk – are reported to be in the region of $3-million per rack. They run hot and they guzzle electricity.

    But Nvidia has started to talk about “personal AI supercomputers”. Its new DGX Spark is pitched exactly at that niche. Reports put its price somewhere around $3 000 to $4 000, depending on configuration and vendor. That’s a huge step down from data centre hardware, but in South African rand terms, you’re still looking at R60 000 to R80 000 or more.

    Still, that’s cheaper than renting cloud GPUs indefinitely. One recent analysis put a single H100 instance at up to $65 000/year via the cloud, versus about $30 000 to $35 000 to own equivalent hardware over three to five years. But that’s still enterprise-scale economics, and it’s not something you casually buy or build in your study at home.
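
    To make that rent-versus-buy arithmetic concrete, here is a minimal Python sketch using the figures quoted above; the annual running cost is an illustrative assumption, not a sourced number.

        # Rough break-even sketch: renting a cloud H100 vs owning equivalent hardware.
        # Rental and purchase figures are the estimates quoted above; the annual
        # running cost (power, cooling, upkeep) is an illustrative assumption.

        CLOUD_RENTAL_PER_YEAR = 65_000  # US$/year for a rented H100 instance
        OWNED_HARDWARE_COST = 32_500    # US$ up front, midpoint of the $30 000-$35 000 range
        RUNNING_COST_PER_YEAR = 2_000   # US$/year for power and upkeep (assumption)

        def cloud_cost(years: float) -> float:
            return CLOUD_RENTAL_PER_YEAR * years

        def owned_cost(years: float) -> float:
            return OWNED_HARDWARE_COST + RUNNING_COST_PER_YEAR * years

        for years in (1, 3, 5):
            print(f"{years} yr: cloud ${cloud_cost(years):,.0f} vs owned ${owned_cost(years):,.0f}")

    On these numbers, owning pays for itself well inside a year of sustained use.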

    Middle ground

    There is a middle ground, and it’s where many early adopters are already playing: high-end workstations and gaming rigs.

    Consider Apple’s Mac Studio. The current M3 Ultra option can be specced with up to 512GB of unified memory (shared by the CPU and GPU) and up to an 80-core GPU, easily pushing the machine well into the six-digit rand range depending on storage and CPU/GPU configuration. It’s an incredible little machine for its size – and capable of running substantial local models – but it’s still “very serious hobbyist” money and completely out of the reach of most people.

    On the PC (non-Mac) side, the picture is slightly better. You can run respectable seven- to 13-billion parameter models on consumer GPUs like Nvidia’s RTX 5080 and 5090 graphics cards. A brand-new RTX 5090-class card with 32GB of VRAM is still in the “luxury toy” bracket but older (and second-hand) 24GB 4090 and 3090 cards offer a lot of VRAM at lower prices than Nvidia’s new halo products.
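
    A rough rule of thumb explains those figures: a model’s weights need about one byte per parameter at 8-bit quantisation and half that at 4-bit, plus headroom for the context cache. The sketch below is a simplified estimate, not a vendor specification, and the 20% overhead figure is an assumption.

        # Back-of-the-envelope VRAM estimate for running a quantised LLM locally.
        # The 20% overhead for the KV cache and activations is an assumption;
        # real usage depends on context length and the inference runtime.

        def estimated_vram_gb(params_billion: float, bits_per_weight: int,
                              overhead: float = 0.2) -> float:
            weight_bytes = params_billion * 1e9 * bits_per_weight / 8
            return weight_bytes * (1 + overhead) / 1e9

        for params in (7, 13, 70):
            for bits in (4, 8):
                print(f"{params}B model at {bits}-bit: ~{estimated_vram_gb(params, bits):.0f}GB")

    A 13-billion-parameter model needs roughly 8GB at 4-bit and 16GB at 8-bit, which is why a 24GB card handles that class comfortably; a 70-billion-parameter model won’t fit without heavier quantisation or spilling over into system RAM.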

    By the time you’ve added 64GB (or, better, 128GB) of system RAM, fast flash storage and a decent CPU, you’re still staring at a machine in the upper five digits in rand terms. That’s okay for a small business running its own, fine-tuned models in-house; it’s overkill (and wildly over-budget) for a typical household.

    So, yes, local LLMs are possible today. They are even practical for some workloads on less-demanding hardware (I run some smaller models, like Mistral and GLM-4, using Ollama on my ageing M1 Max MacBook Pro). But the machine croaks on larger models, limited by the available unified memory (32GB in my case) and the lack of GPU grunt in the now-four-year-old Apple chip.
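
    For the curious, this is what the plumbing looks like in practice: Ollama runs a small HTTP server on your own machine (by default on localhost, port 11434), and anything from a script to a chat front-end can query it. A minimal Python sketch, assuming the model has already been pulled:

        # Minimal sketch: query a model served locally by Ollama.
        # Assumes Ollama is running and the model has been pulled
        # (e.g. "ollama pull mistral"); nothing in this exchange leaves your machine.

        import requests

        OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

        def ask_local_model(prompt: str, model: str = "mistral") -> str:
            response = requests.post(
                OLLAMA_URL,
                json={"model": model, "prompt": prompt, "stream": False},
                timeout=300,
            )
            response.raise_for_status()
            return response.json()["response"]

        print(ask_local_model("Summarise the privacy case for running LLMs at home."))

    The interesting part is the address: everything in that exchange stays on 127.0.0.1.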

    Given the current costs, why should we even care about local LLMs? Because cloud AI is a privacy nightmare waiting to happen.

    Even assuming perfect behaviour by the big platforms – no training on your private data, for example (yeah, right) – the architecture itself centralises risk. Your prompts, outputs and sometimes your underlying data all leave your environment. That’s before we even get to the business model. The same companies selling you “AI productivity” are also in the business of ad targeting, behavioural profiling and squeezing every possible useful morsel out of user data, including your private and sensitive information.

    Running models locally flips that. Your raw data never leaves your machine. There is no provider log to subpoena in court, no system quietly learning that you’re considering leaving your employer or buying a competitor’s product. The attack surface shrinks to: “Can someone break into my hardware?”

    For journalists, lawyers, doctors or anyone else dealing with sensitive data, that’s not a nice-to-have. It’s quickly going to become essential.

    The hopeful bit is that the economics are moving in our favour. The CPU-centric Moore’s Law has undeniably slowed, but AI price-performance is still improving at breakneck speed. Each GPU generation brings more performance per watt, more memory bandwidth and (sometimes) more VRAM. At the same time, the software stack is advancing at a rapid pace, helping LLMs use available hardware more efficiently.

    Combine these trends and something interesting happens: the line where “good enough local AI” intersects with “ordinary household budget” is moving inexorably closer.

    Road map

    Here’s a speculative road map (disclosure: provided with the assistance of ChatGPT – yes, I see the irony):

    • By 2027/2028: High-end gaming PCs and creative workstations in the R40 000 to R60 000 range will routinely ship with 32-48GB of VRAM and 64GB or even 128GB of system RAM. That’s enough to run genuinely capable assistant-class LLMs locally.
    • By 2030: The “upper-midrange” desktop – what a serious gamer might buy – will comfortably host 64GB VRAM GPUs and 128GB or 256GB of system RAM. Think of this as the point where buying a local LLM-capable machine no longer feels like a choice between PC hardware and a small car. In rand terms, that means perhaps R30 000 or R40 000 for a box that can handle the bulk of everyday AI workloads at home, assuming the rand keeps steady and the current surge in memory prices is temporary.
    • Early 2030s: Expect AI appliances, shoebox-sized boxes (or smaller), perhaps sold by the same brands that make your home router, bundling an efficient AI accelerator, plenty of memory and a slick user interface. Price bracket: high-end smartphone, perhaps. They’ll sit next to your fibre wall box, quietly running your family’s chatbot assistants, summarising mail, indexing your documents and photos, and answering general questions – all without touching the public cloud.

    Even if those date ranges are on the optimistic side, barring some catastrophic slowdown, the direction of travel here is clear.

    This is not an argument to abandon cloud AI. Hyperscale models will always be ahead on raw capability, training data and cutting-edge research. But we should absolutely be planning for a hybrid world where routine, private workloads run on devices and servers we own. In this world, cloud AI will be used more selectively for tasks that genuinely justify it.

    Local AI models do run on high-end consumer GPU hardware, like Nvidia’s RTX 5090 (pictured)

    This shift will start with all of us asking this question: do I really want to send this request to someone else’s server? If the answer is increasingly “no”, then building your own AI server stops being a geek fantasy and starts looking like a rational act of digital self-defence.

    • Duncan McLeod is editor of TechCentral
