E-safety tips for parents of 11-13 year olds

E-safety tips for parents of teenagers

Staying Safe Online – a guide for Parents and Carers

Survive the Summer Holidays

Talking Angela App
After receiving concerns from pupils regarding the ‘Talking Angela’ app, the app has been investigated and appears to be fine. However, we would recommend, as always, that parents go through this and any other apps their children download and judge their suitability for themselves. Information regarding the hoax concerns can be found using the Google search engine.

A link to up-to-date information relating to this app is:

http://www.theepochtimes.com/n3/542009-talking-angela-app-game-programmer-talks-pedophilia-hoax-talks-game-ai/

This linked website has an interesting video about keeping your data safe.

Below is the link to Digital Parenting. We believe this to be an excellent publication, giving parents the information they need to help protect their children from online threats.

http://www.vodafone.com/content/parents.html

Teens Now Start With “Friends” Privacy for New Accounts; Adding the Option to Share Publicly

On Facebook, you control who you share with. That can be a single person in a message, a small group, your friends, or the world. Each time you share a status update, you choose the audience you want to share with. Unless you change it, the audience remains the same for future posts. Previously, for people aged 13 through 17, the initial audience of their first post on Facebook was set to “Friends of Friends” – with the option to change it. Going forward, when people aged 13 through 17 sign up for an account on Facebook, the initial audience of their first post will be set to a narrower audience of “Friends.”

A new option to share more broadly

Teens are among the savviest people using social media, and whether it comes to civic engagement, activism, or their thoughts on a new movie, they want to be heard. So, starting today, people aged 13 through 17 will also have the choice to post publicly on Facebook. While only a small fraction of teens using Facebook might choose to post publicly, this update now gives them the choice to share more broadly, just like on other social media services.

In addition, teens will be able to turn on Follow so that their public posts can be seen in people’s News Feeds. As always, followers can only see posts they are in the audience for. These changes are designed to improve the experience for teens on Facebook. As part of this, we are also looking at ways to improve the way teens use messages and connect with people they may know.

Inline Reminders and Education

We take the safety of teens very seriously, so they will see an extra reminder before they can share publicly.

When teens choose “Public” in the audience selector, they’ll see a reminder that the post can be seen by anyone, not just people they know, with an option to change the post’s privacy.

And if they choose to continue posting publicly, they will get an additional reminder.


Freemium – Tricks of the trade or legitimate practice?

It may seem somewhat ironic but the Internet has disrupted the traditional business models of the computer games industry just as it has many other industries.

The ability to distribute data electronically, as opposed to on a physical disk, has been an undoubted boon for many suppliers of digital or digitisable content. It has also provided great benefits to the end consumer.

Not only does the Internet allow gamers located in different continents and time zones to communicate and play each other in real time, it also provides an efficient way of acquiring upgrades, expansions and even bug fixes to the original game software. Modern games consoles provide a platform which assumes a hybrid online/physical disk model.

These days of course gaming is big business not just on the traditional platforms and computers, but also on mobile devices such as tablets and smart phones – all of which have an Internet connection.

Playable game demos have always been important for the marketing of new computer games, and magazines have been distributing demos on cover-mounted cassette tapes, disks and, later, CDs and DVDs for almost as long as the industry has existed.

It is therefore no surprise that the playable demo has adopted the Internet as a means of distribution.

Combine the notion of online program upgrades and the idea of the playable demo and you get the ‘Freemium’ model as it applies to the computer games industry.

A player can acquire a basic form of a game for little or no money and have a play. If they enjoy the game and wish to experience more, they can expand the game’s parameters by paying for an upgrade from within the game’s own interface (an ‘in-game’ or ‘in-app’ purchase).

The freemium model for digital content is widely used in a number of industries. Many online newspapers, for instance, will provide a certain amount of content for free but require a paid subscription for full access.

There has however been much debate around the freemium model as applied to computer games, and especially those which appeal to younger children. Recently the Office of Fair Trading warned the games and online application industry of what it perceived as “potentially unfair and aggressive commercial practices” amid concerns that they could irresponsibly coerce children to pay to continue playing.

There is obvious concern over the potential for children to spend, or run up bills on, in-game or in-app purchases.

It is yet one more area of online safety which parents and teachers will need to educate their children about. But like many aspects of e-safety, much of the learning is about ensuring that everyday practice and knowledge is understood when contextualised within the online world. If a child has no concept of money or cost, what hope do they have of understanding a virtual purchase?

While it is undoubtedly possible to cite cases of app and games providers taking a cynical approach, bamboozling the end user into making in-app purchases, the model, when used responsibly, is a legitimate mainstay of software publishers’ sales strategies.

The freemium model is here to stay and is comparable to the way in which we pay for utilities per metered unit or cell phone call time through pay as you go.

One of the reasons that app and games producers use the freemium model is that it provides some defence against the rampant piracy that the software, games, music and movie industries have suffered. Piracy is now so commonplace that many people simply expect all digital content to be free of charge and show little respect for the talent, energy, time and cost which goes into producing it.


Demonstrating a digital footprint

We hear the phrase ‘digital footprint’ a lot these days – for example, in the recent story of Paris Brown, whose digital footprint uncovered some inappropriate comments made several years earlier. But what is a digital footprint? Put simply, it is the trace of a person’s online activity.

That said, what does this mean in reality? Everything that is typed, liked or tagged online leaves a trace behind, and that trace becomes part of a digital footprint. Add to that every time a person logs in or out of a website, uses mobile data on a phone, collects emails on a tablet or plays an online game, and you can begin to see how a digital footprint is much more than the odd tweet that we regret.

It’s not too many years ago that I remember learning the phrase ‘ego search’ (or ego surf). This wasn’t a complex psychological term, but simply the act of putting your name into a search engine and seeing if ‘you’ came up in the results. Back then it was something of a challenge, and indeed an achievement, if a search engine could find you. I recall pressing the search button and getting my name to appear twice in the results (ok, so it was actually three times, but the third one wasn’t me!).

Repeating the same exercise today I am faced with about 19 million results. Now, I’m not going to check every one of the 19 million to see how many are actually me, although I dare say, it will be more than two.

There are many parts of our digital footprint that are out of our control or just happen behind the scenes (like cookies or what other people say about us for example), but what is important is to make sure the things that are in our control are handled responsibly. It’s all too easy these days to make a comment on an online forum or social media platform, but what is easy to forget is that this comment will leave a permanent trace on a digital footprint.

To demonstrate both the scale of the digital footprint and also how everything leaves a trace, why not try typing your school name into Google and recording the number of results. Some of these may include student social media accounts, directory listings and the school website pages, so it is also worth discussing this. Then, using the search tools, select a custom date range from several years ago and note the difference in both the volume and the nature of the listings.
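For anyone who wants to repeat this comparison programmatically rather than by hand, the short Python sketch below shows one possible approach. It is an illustration only, not part of the original exercise: it assumes you have set up access to Google’s Custom Search JSON API (which needs your own API key and Programmable Search Engine ID), and the key, engine ID and school name shown are placeholders. Note also that the API’s dateRestrict parameter limits results to a recent window (for example, the last year), which gives a similar, though not identical, comparison to the custom date range described above, and that the counts returned are only rough estimates.

    # Rough sketch: compare Google's estimated result counts for a school name
    # with and without a date restriction, via the Custom Search JSON API.
    # API_KEY, ENGINE_ID and the school name are placeholders, not real values.
    import requests

    API_KEY = "YOUR_API_KEY"             # obtain from the Google Cloud console
    ENGINE_ID = "YOUR_SEARCH_ENGINE_ID"  # a Programmable Search Engine ID

    def estimated_results(query, date_restrict=None):
        """Return Google's estimated total result count for a query.

        date_restrict uses the API's dateRestrict format, e.g. "y1" for
        results from roughly the last year; None applies no restriction.
        """
        params = {"key": API_KEY, "cx": ENGINE_ID, "q": query}
        if date_restrict:
            params["dateRestrict"] = date_restrict
        response = requests.get("https://www.googleapis.com/customsearch/v1",
                                params=params, timeout=10)
        response.raise_for_status()
        return int(response.json()["searchInformation"]["totalResults"])

    school = "Example High School"  # hypothetical school name
    print("All time:      ", estimated_results(school))
    print("Last year only:", estimated_results(school, date_restrict="y1"))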


Beware of the Trolls

The media has recently highlighted the case of Caroline Criado-Perez – a feminist campaigner and journalist who, after successfully campaigning for a woman’s face to appear on bank notes, was subjected to a torrent of abusive posts on ‘Twitter’, including threats of rape, from male internet ‘trolls’.

But what is ‘trolling’? Who are the trolls and why do they behave in such a way online? How can they be stopped; indeed can they be stopped?

The Oxford Dictionary defines a ‘troll’ as someone who “…submits a deliberately provocative posting to an online message board (or some form of social media) with the aim of inciting an angry response.” It is regarded as a type of cyber bullying and can take a number of forms.

Admittedly, high-profile cases like this are extreme; however, trolling can be seen every day on almost any social platform. Browse the responses to any Facebook or Twitter posting by the BBC, the Guardian or any other news organisation and it is highly likely that you will come across a troll’s comments, characterised by an intentionally extreme and contentious point of view, frequently laced with foul and vile language, and posted with the sole aim of annoying other contributors or, better still, provoking them into responding.

So, trolling is a broad term that encompasses everything from a mischievous provocation to threats of violence or rape, but what drives someone to become a troll?

Professor Mark Griffiths, director of the International Gaming Research Unit at Nottingham Trent University, stated to the BBC that “…online, people feel anonymous and disinhibited. They lower their emotional guard and in the heat of the moment either troll reactively or proactively.” He added that trolls are usually young adult males seeking either amusement out of boredom or revenge.

However, a quick browse of any football, music or other fan site will uncover people of all ages and genders subjecting others to the most venomous and vicious attacks. Comedian Dom Joly was the victim of a devious troll with nine different online identities – she turned out to be a 14-year-old girl.

It would therefore appear that the pretence of anonymity that the Internet apparently provides is a key reason why people who normally conduct themselves pleasantly and responsibly in the real world feel they can engage in offensive behaviour in the virtual world.

If recent reports in the media are to be believed, ‘trolling’ is a phenomenon that is on the increase, and there are growing calls for something to be done to stop it. But can this be done, and if so, how?

In response to the virulent abuse that Caroline Criado-Perez received, a petition was set up that received tens of thousands of signatures, including those of prominent politicians and celebrities, urging Twitter to adopt “a zero tolerance policy” and to include a button that could be used to report abusive behaviour on its platform.

But should the policing of online discussions and debates be left to the social media platforms themselves? Is there a requirement for increased monitoring and prosecution by the police and the courts or could this be seen as an infringement of free-speech?

So far, two men have been arrested in the case of Ms Criado-Perez, with the possibility of more to follow. Sean Duffy and Colm Coss, who both posted offensive messages on tribute pages for people who had died, were convicted and imprisoned in the UK, as was Paul Chambers, who ‘threatened’ to blow up Robin Hood Airport (the latter conviction was subsequently quashed on appeal at the High Court).

As recently as June 2013, however, Keir Starmer QC, the Director of Public Prosecutions, in an effort to “strike the right balance between freedom of expression and a need to uphold the law”, published guidelines for prosecutors taking on cases involving ‘grossly offensive communication’. Under these guidelines, prosecution over the posting of an offensive message could be considered unnecessary if the perpetrator “has expressed genuine remorse” or has “taken swift and effective action” to “remove the communication in question or otherwise block access to it”.

Trolling is unpleasant, deeply offensive and upsetting. In one foolish moment it can devastate the lives of both the victim and, if prosecution ensues, the abuser. Yet, because of its perceived anonymity, young people can look upon trolling as ‘a bit of a laugh’ at someone else’s expense, a way to get back at someone, or a way to exert power over someone in order to garner popularity within a gang. As teachers we are in a perfect position to educate students, and we should be willing to demonstrate that cyber-bullying behaviour such as trolling is unacceptable and can result in serious consequences with a lasting impact on people’s lives.


Is censorship of adult content the best way to educate children?

With the news that the Government is to impose ‘family-friendly’ restrictions on internet services, there are many welcoming the change. Any measures that can be implemented to help protect our children can only be a good thing.

But is ‘family-friendly’ filtering there to stop the potentially corrupting and dangerous, or is it there to stop the innocent? Children will be prevented from accessing adult content, while adults will have the ability to turn the filter off and view anything from the good to the bad and the frankly disturbing.

If we are worried that viewing adult material at a young age will have detrimental effects on today’s youth, do we take the option of tackling the situation head on, or is a prohibitive approach the better option? Do we help them to learn what is right and what is wrong (as we would with many other topics such as healthy eating, social awareness and so on) or do we hide things away? Is filtering a sensible approach, or is it just avoiding the issue and hoping we don’t need to confront it?

I’m sure many of us were told as children that we were not allowed to do something and, of course, we did it anyway. Curiosity has a lot to answer for, so perhaps we should let them explore, knowing what they might find and being prepared to discuss it. However, you wouldn’t let a child play with matches; we know that is dangerous… the debate is endless.

There are a good many pros and cons to the filtering solution, but as long as safeguarding is at the root of the decision rather than censorship, then there must be some merit in it. However, we mustn’t become complacent. Adult content is not the only risk on the internet – so we can’t assume that our children will be safe once the legislation is in place.

Written by E-safety Support on July 25, 2013 09:54


David Cameron made a speech about cracking down on online pornography and making the internet safer for children on 22 July 2013. This is a transcript of the speech, exactly as it was delivered.

Thank you to the NSPCC for hosting me today and thank you for all the amazing work you do for Britain’s children.

Today I am going to tread into territory that can be hard for our society to confront. It is frankly difficult for politicians to talk about, but I believe we need to address it as a matter of urgency.

I want to talk about the internet, the impact it’s having on the innocence of our children, how online pornography is corroding childhood and how, in the darkest corners of the internet, there are things going on that are a direct danger to our children and that must be stamped out. Now, I’m not making this speech because I want to moralise or scaremonger but because I feel profoundly, as a politician and as a dad, that the time for action has come. This is, quite simply, about how we protect our children and their innocence.

Now, let me be very clear right at the start: the internet has transformed our lives for the better. It helps liberate those who are oppressed, it allows people to tell truth to power, it brings education to those previously denied it, it adds billions to our economy, it is one of the most profound and era‑changing inventions in human history.

But because of this, the internet can sometimes be given a sort of special status in debate. In fact, it can almost be seen as beyond debate, that to raise concerns about how people should access the internet or what should be on it, is somehow naïve or backwards looking. People sometimes feel they’re being told almost the following: that an unruled internet is just a fact of modern life; any fallout from that is just collateral damage and that you can as easily legislate what happens on the internet as you can legislate the tides.

And against this mind-set, people’s, most often parents’, very real concerns get dismissed. They’re told the internet is too big to mess with; it’s too big to change. But to me, the questions around the internet and the impact it has are too big to ignore. The internet is not just where we buy, sell and socialise; it’s where crimes happen; it’s where people can get hurt; it’s where children and young people learn about the world, each other, and themselves.

The fact is that the growth of the internet as an unregulated space has thrown up 2 major challenges when it comes to protecting our children. The first challenge is criminal and that is the proliferation and accessibility of child abuse images on the internet. The second challenge is cultural; the fact that many children are viewing online pornography and other damaging material at a very early age and that the nature of that pornography is so extreme it is distorting their view of sex and relationships.

Now, let me be clear, the 2 challenges are very distinct and very different. In one we’re talking about illegal material, the other is legal material that is being viewed by those who are underage. But both the challenges have something in common; they’re about how our collective lack of action on the internet has led to harmful and, in some cases, truly dreadful consequences for children.

Now, of course, a free and open internet is vital. But in no other market and with no other industry do we have such an extraordinarily light touch when it comes to protecting our children. Children can’t go into the shops or the cinema and buy things meant for adults or have adult experiences; we rightly regulate to protect them. But when it comes to the internet, in the balance between freedom and responsibility we’ve neglected our responsibility to children.

My argument is that the internet is not a side-line to real life or an escape from real life, it is real life. It has an impact on the children who view things that harm them, on the vile images of abuse that pollute minds and cause crime, on the very values that underpin our society. So we’ve got to be more active, more aware, more responsible about what happens online. And when I say we I mean we collectively: governments, parents, internet providers and platforms, educators and charities. We’ve got to work together across both the challenges that I’ve set out.

So let me start with the criminal challenge, and that is the proliferation of child abuse images online. Obviously we need to tackle this at every step of the way, whether it’s where the material is hosted, transmitted, viewed or downloaded. And I am absolutely clear that the state has a vital role to play here.

The police and CEOP, that is the Child Exploitation and Online Protection Centre, are already doing a good job in clamping down on the uploading and hosting of this material in the UK. Indeed, they have together cut the total amount of known child abuse content hosted in the UK from 18% of the global total in 1996 to less than 1% today. They’re also doing well on disrupting the so-called hidden internet, where people can share illegal files and on peer‑to‑peer sharing of images through photo-sharing sites or networks away from the mainstream internet.

Once CEOP becomes a part of the National Crime Agency, that will further increase their ability to investigate behind the pay walls, to shine a light on this hidden internet and to drive prosecutions and convictions of those who are found to use it. So we should be clear to any offender who might think otherwise, there is no such thing as a safe place on the internet to access child abuse material.

But government needs to do more. We need to give CEOP and the police all the powers they need to keep pace with the changing nature of the internet. And today I can announce that from next year we’ll also link up existing fragmented databases across all police forces to produce a single, secure database of illegal images of children which will help police in different parts of the country work together more effectively to close the net on paedophiles. It will also enable the industry to use digital hash tags from the database to proactively scan for, block and take down those images wherever they occur. Otherwise you have different police forces with different databases; you need one set of all the hash tags, all the URLs, in one place for everybody to use.

Now, industry has agreed to do exactly that because this isn’t just a job for government. The internet service providers and the search engine companies have a vital role to play and we’ve already reached a number of important agreements with them. A new UK-US taskforce is being formed to lead a global alliance with the big players in the industry to stamp out these vile images. I’ve asked Joanna Shields, CEO of Tech City and our business ambassador for digital industries, who is here today, to head up engagement with industry for this taskforce. And she’s going to work both with the UK and US governments and law enforcement agencies to maximise our international efforts.

Here in Britain, Google, Microsoft and Yahoo are already actively engaged on a major campaign to deter people who are searching for child abuse images. Now, I can’t go into the detail about this campaign, because that might undermine its effectiveness, but I can tell you it is robust, it is hard-hitting; it is a serious deterrent to people who are looking for these images.

Now, where images are reported they are immediately added to a list and they’re blocked by search engines and ISPs so people can’t access those sites. These search engines also act to block illegal images and the URLs, or pathways, that lead to these images from search results, once they’ve been alerted to their existence.

But here to me is the problem. The job of actually identifying these images falls to a small body called the Internet Watch Foundation. Now this is a world leading organisation, but it relies almost entirely on members of the public reporting things they’ve seen online.

So the search engines themselves have a purely reactive position. When they’re prompted to take something down they act, otherwise they don’t. And if an illegal image hasn’t been reported it can still be returned in searches. In other words, the search engines are not doing enough to take responsibility. Indeed, in this specific area, they are effectively denying responsibility.

And this situation has continued because of a technical argument. It goes like this: the search engine shouldn’t be involved in finding out where these images are because the search engines are just the pipe that delivers the images, and that holding them responsible would be a bit like holding the Post Office responsible for sending illegal objects in anonymous packages. But that analogy isn’t really right, because the search engine doesn’t just deliver the material that people see, it helps to identify it.

Companies like Google make their living out of trawling and categorising content on the web, so that in a few key strokes you can find what you’re looking for out of unimaginable amounts of information. That’s what they do. They then sell advertising space to companies based on your search patterns. So if I go back to the Post Office analogy, it would be like the Post Office helping someone to identify and then order the illegal material in the first place and then sending it on to them, in which case the Post Office would be held responsible for their actions.

So quite simply we need the search engines to step up to the plate on this issue. We need a situation where you cannot have people searching for child abuse images and being aided in doing so. If people do try and search for these things, they are not only blocked, but there are clear and simple signs warning them that what they are trying to do is illegal, and where there is much more accountability on the part of the search engines to help find these sites and block them.

On all of these things, let me tell you what we’ve already done and what we’re going to do. What we’ve already done is insist that clear, simple warning pages are designed and placed wherever child abuse sites have been identified and taken down so that if someone arrives at one of these sites they are clearly warned that the page contained illegal images. These so-called splash pages are up on the internet from today and this is, I think, a vital step forward. But we need to go further.

These warning pages should also tell people who’ve landed on these sites that they face consequences like losing their job, losing their family or even access to their children if they continue. And vitally they should direct them to the charity Stop it Now! which can help people change their behaviour anonymously and in complete confidence.

On people searching for these images, there are some searches where people should be given clear routes out of that search to legitimate sites on the web. Let me give you an example. If someone is typing in ‘child’ and ‘sex’ there should come up a list of options: do you mean child sex education? Do you mean child gender? What should not be returned is a list of pathways into illegal images which have yet to be identified by CEOP or reported to the Internet Watch Foundation.

Then there’s this next issue. There are some searches which are so abhorrent and where there could be no doubt whatsoever about the sick and malevolent intent of the searcher – terms that I can’t say today in front of you with the television cameras here, but you can imagine – where it’s absolutely obvious the person at the keyboard is looking for revolting child abuse images. In these cases, there should be no search results returned at all. Put simply, there needs to be a list of terms – a blacklist – which offer up no direct search returns.

So I have a very clear message for Google, Bing, Yahoo! and the rest: you have a duty to act on this, and it is a moral duty. I simply don’t accept the argument that some of these companies have used to say that these searches should be allowed because of freedom of speech.

On Friday, I sat with the parents of Tia Sharp and April Jones. They want to feel that everyone involved is doing everything they can to play their full part in helping rid the internet of child abuse images. So I’ve called for a progress report in Downing Street in October with the search engines coming in to update me.

And the question we’ve asked is clear. If CEOP give you a blacklist of internet search terms, will you commit to stop offering up any returns on these searches? If the answer is yes, good. If the answer is no and the progress is slow or non-existent, I can tell you we’re already looking at legislative options so that we can force action in this area.

There’s one further message I have for the search engines. If there are technical obstacles to acting on this, don’t just stand by and say nothing can be done, use your great brains to overcome them. You’re the people who’ve worked out how to map almost every inch of the earth from space. You’ve designed algorithms to make sense of vast quantities of information. You’re the people who take pride in doing what they say can’t be done.

You hold hackathons for people to solve impossible internet conundrums. Well hold a hackathon for child safety. Set your greatest brains to work on this. You’re not separate from our society, you’re part of our society and you must play a responsible role within it. This is quite simply about obliterating this disgusting material from the net, and we should do whatever it takes.

So that’s how we’re going to deal with the criminal challenge. The cultural challenge is the fact that many children are watching online pornography and finding other damaging material online at an increasingly young age. Now young people have always been curious about pornography; they’ve always sought it out.

But it used to be that society could protect children by enforcing age restrictions on the ground; whether that was setting a minimum age for buying top-shelf magazines, putting watersheds on the TV or age rating films and DVDs. But the explosion of pornography on the internet, and the explosion of the internet into our children’s lives, has changed all of that profoundly. It’s made it much harder to enforce age restrictions. It’s made it much more difficult for parents to know what’s going on. And as a society we need to be clear and honest about what is going on.

For a lot of children, watching hard-core pornography is in danger of becoming a rite of passage. In schools up and down our country, from the suburbs to the inner city, there are young people who think it’s normal to send pornographic material as a prelude to dating in the same way you might once have sent a note across the classroom.

Over a third of children have received a sexually explicit text or email. In a recent survey, a quarter of children said they had seen pornography which had upset them. This is happening, and it is happening on our watch as adults. And the effect that it can have can be devastating. Effectively our children are growing up too fast. They’re getting distorted ideas about sex and being pressurised in a way that we’ve never seen before, and as a father I am extremely concerned about this.

Now there’s some who might say, ‘Well, it’s fine for you to have a view as a parent but not as Prime Minister. This is – this is an issue for parents not the state.’ But the way I see it, there is a contract between parents and the state. Parents say, ‘Look, we’ll do our best to raise our children right and the state should agree to stand on our side, to make that job a bit easier not a bit harder.’

But when it comes to internet pornography, parents have been left too much on their own. And I’m determined to put that right. We all need to work together, both to prevent children from accessing pornography and educate them about keeping safe online. This is about access and it’s about education. And I want to say briefly what we’re doing about each.

On access, things have changed profoundly in recent years. Not long ago access to the internet was mainly restricted to the PC in the corner of the living room with a beeping dial-up modem – we all remember the worldwide wait – it was downstairs in the house where parents could keep an eye on things. But now the internet is on the smartphones, the laptops, the tablets, the computers, the games consoles. And with high speed connections that make movie downloads and real time streaming possible, parents need much, much more help to protect their children across all of these fronts.

So on mobile phones, it’s great to report that all of the operators have now agreed to put adult content filters onto phones automatically. And to deactivate them you have to prove you’re over 18 and operators will continue to refine and improve those filters.

On public wi-fi, of which more than 90% is provided by 6 companies – O2, Virgin Media, Sky, Nomad, BT and Arqiva – I’m pleased to say we’ve now reached an agreement with all of them that family friendly filters are to be applied across public wi-fi networks wherever children are likely to be present. This will be done by the end of next month. And we’re keen to introduce a family friendly wi-fi symbol which retailers, hotels and transport companies can use to show their customers that their public wi-fi is properly filtered. So I think good progress there; that’s how we’re protecting children outside the home.

Inside the home, on the private family network, it is a more complicated issue. There’s been a big debate about whether internet filters should be set to a default ‘on’ position, in other words with adult content filters applied by default, or not. Let’s be clear, this has never been a debate about companies or government censoring the internet, but about filters to protect children at the home network level.

Those who wanted default ‘on’ said, ‘It’s a no-brainer: just have the filter set to ‘on’, then adults can turn them off if they want to and that way we can protect all children whether their parents are engaged in internet safety or not.’ But others said default ‘on’ filters could create a dangerous sense of complacency. They said that with default filters parents wouldn’t bother to keep an eye on what their kids are watching, as they’d be complacent; they’d just assume the whole thing had been taken care of.

Now, I say we need both: we need good filters that are preselected to be on, pre-ticked unless an adult turns them off, and we need parents aware and engaged in the setting of those filters. So, that is what we’ve worked hard to achieve, and I appointed Claire Perry to take charge of this, for the very simple reason that she’s passionate about this issue, determined to get things done and extremely knowledgeable about it at the same time too. Now, she’s worked with the big 4 internet service providers – TalkTalk, Virgin, Sky and BT – who together supply internet connections to almost 9 out of 10 homes.

And today, after months of negotiation, we’ve agreed home network filters that are the best of both worlds. By the end of this year, when someone sets up a new broadband account, the settings to install family friendly filters will be automatically selected; if you just click next or enter, then the filters are automatically on.

And, in a really big step forward, all the ISPs have rewired their technology so that once your filters are installed they will cover any device connected to your home internet account; no more hassle of downloading filters for every device, just one click protection. One click to protect your whole home and to keep your children safe.

Now, once those filters are installed it should not be the case that technically literate children can just flick the filters off at the click of the mouse without anyone knowing, and this, if you’ve got children, is absolutely vital. So, we’ve agreed with industry that those filters can only be changed by the account holder, who has to be an adult. So an adult has to be engaged in the decisions.

But of course, all this only deals with the flow of new customers, new broadband accounts, those switching service providers or buying an internet connection for the first time. It doesn’t deal with the huge stock of the existing customers, almost 19 million households, so that is where we now need to set our sights.

Following the work we’ve already done with the service providers, they have now agreed to take a big step: by the end of next year, they will have contacted all their existing customers and presented them with an unavoidable decision about whether or not to install family friendly content filters. TalkTalk, who’ve shown great leadership on this, have already started and are asking existing customers as I speak.

We’re not prescribing how the ISPs should contact their customers; it’s up to them to find their own technological solutions. But however they do it, there’ll be no escaping this decision, no, ‘Remind me later,’ and then it never gets done. And they will ensure that it’s an adult making the choice.

Now, if adults don’t want these filters that is their decision, but for the many parents who would like to be prompted or reminded, they’ll get that reminder and they’ll be shown very clearly how to put on family friendly filters. I think this is a big improvement on what we had before and I want to thank the service providers for getting on board with this, but let me be clear: I want this to be a priority for all internet service providers not just now, but always.

That is why I am asking today for the small companies in the market to adopt this approach too, and I am also asking Ofcom, the industry regulator, to oversee this work, to judge how well the ISPs are doing and to report back regularly. If they find that we’re not protecting children effectively, I will not hesitate to take further action.

But let me also say this: I know there are lots of charities and other organisations which provide vital online advice and support that many young people depend on, and we need to make sure that the filters do not, even unintentionally, restrict this helpful and often educational content. So I’ll be asking the UK Council for Child Internet Safety to set up a working group to ensure this doesn’t happen, as well as talking to parents about how effective they think that these filter products we’re talking about really are.

So, making filters work is one front we’re acting on; the other is education. In the new national curriculum, launched just a couple of weeks ago, there are unprecedented requirements to teach children about online safety. That doesn’t mean teaching young children about pornography; it means sensible, age-appropriate education about what to expect on the internet. We need to teach our children not just about how to stay safe online, but how to behave online too, on social media and over phones with their friends.

And it’s not just children that need to be educated; it’s us parents, too. People of my generation grew up in a completely different world; our parents kept an eye on us in the world they could see. This is still relatively new, a digital landscape, a world of online profiles and passwords, and speaking as a parent, most of us do need help in navigating it.

Companies like Vodafone already do a good job at giving parents advice about online safety; they spend millions on it, and today they’re launching the latest edition of their Digital Parenting guide. They’re also going to publish a million copies of a new educational tool for younger children called, ‘The digital facts of life.’

And I’m pleased to announce something else today: a major new national campaign that is going to be launched in the new year, that is going to be backed by the 4 major internet service providers as well as other child focused companies, that will speak directly to parents about how to keep their children safe online and how to talk to their children about other dangers like sexting or online bullying.

And government is going to play its part, too, because we get millions of people interacting with government. Whether that’s sorting out their road tax or their Twitter account, or soon registering for Universal Credit, I’ve asked that we use these interactions to keep up the campaign, to prompt parents to think about filters and to think about how they can keep their children safe online. This is about all of us playing our part.

So, we’re taking action on how children access this stuff, how they’re educated about it, and I can tell you today we’re also taking action on the content that is online. There are certain types of pornography that can only be described as extreme; I am talking particularly about pornography that is violent and that depicts simulated rape. These images normalise sexual violence against women and they’re quite simply poisonous to the young people who see them.

The legal situation is, although it’s been a crime to publish pornographic portrayals of rape for decades, existing legislation does not cover possession of this material, at least in England and Wales. Possession of such material is already an offence in Scotland, but because of a loophole in the Criminal Justice and Immigration Act 2008 it is not an offence south of the border. But I can tell you today, we are changing that: we are closing the loophole, making it a criminal offence to possess internet pornography that depicts rape.

And we’re going to do something else to make sure that the same rules apply online as they do offline. There are examples of extreme pornography that are so bad you can’t even buy this material in a licensed sex shop, and today I can announce we’ll be legislating so that videos streamed online in the UK are subject to the same rules as those sold in shops. Put simply: what you can’t get in a shop, you shouldn’t be able to get online.

Now, everything today I’ve spoken about comes back to one thing: the kind of society we want to be. I want Britain to be the best place to raise a family; a place where your children are safe, where there’s a sense of right and wrong and proper boundaries between them, where children are allowed to be children.

And all the actions we’re taking today come back to that basic idea: protecting the most vulnerable in our society, protecting innocence, protecting childhood itself. That is what is at stake, and I will do whatever it takes to keep our children safe.


This booklet includes information to help parents and guardians ensure e-safety for themselves and their children.


Top Tips for a Safer Internet

  1. Talk to your child about their favourite websites. Starting a conversation on a positive foot can lead nicely into a chat about online safety.
  2. If your child loves to use social networking sites, teach them about protecting their personal information by thinking about what they are sharing and who they are sharing it with. Show them how to use privacy settings, and how to block and report – and advise them to only accept friend requests from people they know in real life.
  3. Remind your child that showing respect for others online is just as important as showing it offline. Encourage them to think before they post and to show positive behaviour online.
  4. There are lots of ways you can advise your child about cyber bullying. If they are worried, remind them to save the evidence and always to tell an adult they trust if something upsets them online.
  5. There are ways in which you can help to prevent your child from seeing inappropriate content online. Have you considered parental controls and filtering in your home and also on your children’s portable internet enabled devices?