I was among the first people in the UK to get a Facebook account in 2005. Back then it was only available to students at Oxford and Cambridge; it was then rolled out to students at other universities and finally, later in 2006, made available to the wider public. In 2006 it had only 500,000 UK users, mostly students, but by 2017 an estimated 38 million people in the UK had a Facebook account. Around a quarter to a third of the global population – some two billion people – have a Facebook account.

What started as somewhere to post embarrassing photos, share student gossip and join groups such as “I Will Go Slightly Out of My Way To Step On That Crunchy-looking Leaf” has utterly transformed the internet and how we consume information. Facebook then set out to become ‘the front page of the internet’, and now massive sections of society get their news from the site. In a survey carried out by Ofcom last year, 27% of UK online news users said they use Facebook to access the news, second only to the BBC website or app. That is a huge section of the population.

But last week, Facebook announced that it was changing how its news feed works. Rather than a proliferation of posts from ‘professional’ publishers, it will focus on posts from friends and family and what Mark Zuckerberg described as ‘meaningful interaction’. So instead of posts from the BBC, the Canary, LADBible, ASOS or Skwawkbox, you’ll be more likely to see something from your aunt or an old school friend.

Facebook has been under huge pressure to clean up the ‘news feed’, as the proliferation of ‘fake news’ on people’s profiles has been identified as a major threat to the democratic process and perhaps even an instrument for foreign governments to affect the outcomes of elections. Until now Facebook has largely buried its head in the sand, insisting that it is not responsible for the content on its platform, but it is now scrambling to tackle the issue, with mixed results. Its recent move to warn people that what they were about to read was “fake news” backfired: analysis showed that the warnings only made people more likely to read the posts and entrenched their beliefs.

The move away from media domination of the ‘news feed’ and (back?) towards friends and family came as a shock to media organisations and businesses, many of whom are reliant on the traffic generated by Facebook. Wider reaction has been mixed, with some commentators saying that it will ultimately improve online media, while others have noted potential unintended consequences, such as limiting access to ‘legitimate’ news organisations as well as ‘fake news’, particularly in more authoritarian countries. Indeed, there are concerns that as people retreat into closed groups discussing their views without challenge on the ‘news feed’ it may, in fact, exacerbate the problem.

However, there has been limited discussion about what Facebook’s announcement might mean for political campaigning, with the focus in news coverage on the impact to businesses and media organisations.

Social media has become increasingly important in political campaigning. In 2008 social media played a big role in the election of Barack Obama as US President, and in 2016 Donald Trump’s huge Twitter presence bolstered his campaign for the presidency. Similarly, Facebook became a major forum in the 2016 US presidential election, and there are even suggestions that foreign state actors used social media, and Facebook in particular, to influence the outcome. On this side of the pond, beyond elections, Facebook and Twitter had a significant impact on the outcome of the EU referendum, and again it has been suggested that foreign state actors interfered in the process using ‘bot’ accounts and ‘fake news’.

Facebook, in particular, has become an increasingly important part of political campaigns in the UK in recent years. Many attribute the success of the Conservatives in the 2015 general election to their extraordinary Facebook operation, targeting posts down to almost an individual level to encourage shares by users, with a budget that blew their opponents out of the water (£1.2 million for the Conservatives compared with just £16,000 spent by Labour).

The changes to Facebook should be of as much concern to political parties as they are to companies that rely on Facebook for commercial traffic.

Facebook is not alone in facing the problem of what to do about ‘fake news’. The issue of who is responsible for what is posted on wider social media is not going away. Last month Facebook, Twitter and Google appeared before the Culture, Media and Sport Committee in the House of Commons for a gruelling and deeply awkward session, where MPs hammered them on sexist, racist and anti-Semitic abuse that appears on their platforms.

Now that Facebook has blinked and done a volte-face on the place of news on its site, this might be the beginning of a major move across online platforms to crack down on a range of issues, including ‘fake news’ and hate speech. Other social media companies will be under pressure to do something (whether similar or not) to tackle these problems, and such changes, too, may have a significant impact both on news consumption across social media and on social media’s usefulness as a tool of political campaigning.

Any organisation – business, news organisation, third sector group or political party – that uses Facebook or social media to sell in key messages or to campaign may soon face a harder time reaching its audience and effecting change through organic user shares. In this context, digital advertising will become more important as platforms restrict organisations’ ability to share content organically and funnel them towards paid content.