Jane Fae: What we define as sexual, and what we define as undesirable, matters greatly

Comment: We need to fight Facebook’s smut unilateralism

By Jane Fae

Something has changed in the field of child protection. There is a new mood abroad: a real determination by the political establishment to make a difference.

Over the last decade, debate about the internet has gradually shifted from 'wild west' metaphors, according to which policymakers could do little about this lawless new virtual space, to general agreement that it is not so different from other media and should be regulated.

The debate now is about what, exactly, could or should be done. Criminal material – especially material based on the exploitation and abuse of children – is being dealt with on a daily basis by the Internet Watch Foundation (as watchdog) and CEOP (as police enforcement agency). According to veteran campaigner John Carr, of the UK Children's Charities' Coalition on Internet Safety, it is a positive sign that the police are finally admitting they need more resources.

This leaves another, more difficult issue: how to protect children from exposure to legitimate adult material. It's a distinction – the criminal versus the legitimate – that policymakers mostly grasp and the public mostly doesn't.

The solution, as espoused by Claire Perry, the prime minister's adviser on the issue, is filtering. She is not wrong: if you don't want children to see something, you need to place limits on their access to the internet. The debate has turned technical, with internet service providers and libertarians arguing over the level (network, domain, individual device) at which blocks and filters should be imposed.

Behind these are secondary debates about whether such filters should be automatically on at start-up, or turn themselves on from time to time. What is not being discussed is who sets the filters, who classifies material, and who is responsible for the massive imposition of cultural value implicit in controlling that classification.

Imagine, for a moment, that the prime minister announced tomorrow that he had identified a provider of filtering software. OK: they took a slightly hard line on porn – in fact, on any kind of fleshy exposure. And they weren't too hot on LGBT rights either. But they had a VERY GOOD filter, best of breed and all that. So that was settled.

The name of the supplier? Oh, just some Middle Eastern start-up called Taliban Enterprises Inc. Faced with such madness, we'd rightly judge the PM to be quite, quite bonkers. But why? After all, if it works…

The problem is cultural values. As more and more social interaction takes place on the internet, cultural views are increasingly shaped by what we may and may not see 'out there'. Take Facebook, for instance, still recovering from a recent mauling by the #FBrape campaign.

Its pages, as European policy director Simon Milner has explained, are a porn-free, nudity-free zone. Facebook also removes material deemed to be hate speech. Superficially, that sounds admirable – until one understands that, behind that policy, Facebook happily hosts the softer end of 'smut' while just as happily removing pages dedicated to breastfeeding, to LGBT support or even, in one unfortunate incident, to helping young men self-diagnose testicular cancer.

Add to that the fact that Facebook also, for a long time, refused to remove a series of images treating violence against women as little more than a laddish joke, and the scene was set for a collision: in this instance, between a determined boycott campaign and the company's long-established policy of 'even-handedness'.

At a Westminster Forum event on the issue last week, Milner announced that Facebook had changed policy on these matters – while adding, rather less felicitously, that the images of violence against women had been the subject of controversy: while many objected to them, Facebook had also taken criticism for taking them down. The point was unclear: was he really saying that such images should stay up if enough people liked them?

More seriously, he did not wish to engage in direct discussion of Facebook's values: that required the press office's involvement. That is fine as large corporate policy, but it does underline just how much of our internet safeguarding is now being outsourced to overseas multinationals on a vague 'trust us' promise.

Claire Perry, also present at the event, suggested that no-one had ever complained about being blocked from viewing porn on the tube. That, though, is mere soundbite – and probably only tangentially true. Faced with a question about why two middle-aged women had found themselves unable to search for a relatively innocuous, if topless, picture of Richard Armitage in their suburban shopping centre, she responded with another soundbite: when it came to internet safeguarding, "we should not allow the perfect to drive out the good".

This matters. The government has already decided that, in a few months' time, PCs will automatically block various categories of undesirable material, including 'sexual' content. This will be one of the biggest impositions of collective cultural values since Europeans colonised Africa, teaching the indigenous population, inter alia, to put on clothes and shun homosexuals. That didn't end well!

Of course we don't want children having unlimited access to the sexual. But what we define as sexual, and what we define as undesirable, matters greatly. And if, in the end, we put in place, with no public debate whatsoever, a system that deems female nudity and breastfeeding 'sexual' – while permitting the subtler sexualised messages of big business to permeate the ether – it raises the question of why we should seek to preserve any independent British values at all.

Jane Fae is a feminist and writer on gender issues.

The opinions in politics.co.uk's Comment and Analysis section are those of the author and are no reflection of the views of the website or its owners.