    Fake news is your fault, not Facebook’s

In-depth | By The Conversation | 6 December 2016

    Following the shock results of Brexit and the Trump victory, a lot of attention has focused on the role that Facebook might have played in creating online political ghettos in which false news can easily spread.

    Facebook now has serious political influence thanks to its development from a social networking tool into a primary source of news and opinions. And for many, the way it manages this influence needs greater scrutiny. But to put the blame solely on the company is to overlook how people use the site, and how they themselves create a filter bubble effect through their actions.

    Much of this debate has focused on the design of Facebook itself. The site’s personalisation algorithm, which is programmed to create a positive user experience, feeds people what they want. This creates what the CEO of viral content site Upworthy, Eli Pariser, calls “filter bubbles”, which supposedly shield users from views they disagree with.
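
To make the "feeds people what they want" mechanism concrete, here is a minimal, hypothetical sketch of an engagement-based ranker. It is not Facebook's actual algorithm; the topics, interaction history and function names are invented for illustration. The point is only that ranking by predicted engagement tends to resurface more of what a user already interacts with.

```python
# Toy illustration (not Facebook's real system): rank posts by how often the
# user has previously engaged with the post's topic. All data here is invented.
from collections import Counter

def rank_feed(posts, interaction_history, top_n=5):
    """Return the top_n posts, favouring topics the user already engages with."""
    topic_affinity = Counter(interaction_history)   # e.g. {"politics_left": 12, "sport": 3}
    total = sum(topic_affinity.values()) or 1

    def score(post):
        # Predicted engagement: share of past interactions on this post's topic.
        return topic_affinity[post["topic"]] / total

    return sorted(posts, key=score, reverse=True)[:top_n]

# A user who mostly clicks one kind of political content...
history = ["politics_left"] * 12 + ["sport"] * 3
posts = [
    {"id": 1, "topic": "politics_left"},
    {"id": 2, "topic": "politics_right"},
    {"id": 3, "topic": "sport"},
    {"id": 4, "topic": "politics_left"},
]

# ...is shown mostly that same kind of content; the posts they then click feed
# back into their history, narrowing the feed further over time.
print(rank_feed(posts, history, top_n=3))
```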

    People are increasingly turning to Facebook for their news (44% of US adults now report getting news from the site), and fake news is not editorially weeded out. This means that misinformation can spread easily and quickly, hampering people's ability to make informed decisions.

    Over the last few weeks, there have been frequent calls for Facebook to address this issue. President Obama himself has weighed in, warning of the perils that rampant misinformation poses for the democratic process.

    Much of the debate around this, however, has had an element of technological determinism to it, suggesting that users of Facebook are at the mercy of the algorithm. In fact, research shows that the actions of users themselves are still a very important element in the way that Facebook gets used.

    Our research has been looking specifically at how people's actions create the context of the space in which they communicate. Just as important as the algorithm is how people use the site and shape it around their own communications. We've found an overwhelming view among users that Facebook is not ideally suited to political debate, and that posts and interactions should be kept trivial and light-hearted.

    This isn’t to say that people don’t express political opinions on Facebook. But for many people there’s a reluctance to engage in discussion, and a sense that anything that might be contentious is better handled by face-to-face conversation. People report that they fear the online context will lead to misunderstandings because written communication lacks some of the non-linguistic cues of spoken communication, such as tone of voice and facial expressions.


    There’s strong evidence in our research that people are actually exposed to a great deal of diversity through Facebook. This is because their network includes people from all parts of their life, a finding that echoes other research. In this respect, the algorithm doesn’t have a marked influence on the creation of filter bubbles. But because they often want to avoid conflict, people report ignoring or blocking posts, or even unfriending people, when confronted with views with which they strongly disagree.

    They also report taking care over what they say so as not to antagonise people such as family members or work colleagues whose views differ from theirs, but whose friendship they wish to maintain. And finally, they talk of making a particular effort to put forward a positive persona on social media, which again stops them from engaging in debate that might lead to argument.

    Not so easy to fix

    The idea that algorithms are responsible for filter bubbles suggests it should be easy to fix (by getting rid of the algorithms), which makes it an appealing explanation. But this perspective ignores the part played by users themselves, who effectively create their own filter bubbles by withdrawing from political discussions and hiding opinions they disagree with.

    This isn’t done with the intention of sifting out diversity, but is instead due to a complex mix of factors. These include the perceived purpose of Facebook, how users want to present themselves in what is effectively a public forum, and how responsible they feel for the diverse ties that make up their online network.

    The fact that manipulation by the algorithm isn’t the only issue here means that other solutions, such as raising people’s awareness of the possible consequences of their online actions, can also help encourage debate. We have to recognise that the impact of technology comes not just from the innovations themselves but also from how we use them, and that solutions have to come from us as well.

    • Philip Seargeant is senior lecturer in applied linguistics, The Open University, and Caroline Tagg is lecturer in applied linguistics and English language, also at The Open University
    • This article was originally published on The Conversation