Filters and monitoring: Panacea or band-aid


Co-Director Larry Magid's presentation about filtering and monitoring, given for Russia's Safer Internet Day, February 2012

  • Speaker notes — filtering levels:
    • None: Effective for some children, especially teens
    • Device level: Works best for younger children. Teens can often get around it and may not need it. Doesn't work on "other" devices
    • Router level in home or school: An excellent way to regulate content throughout a home or school. Can have adult "accounts." Doesn't work away from home (mobile)
    • DNS level: Similar to router control, only coming from the DNS server
    • ISP optional blocking: Allows parents to request blocking by the ISP
    • ISP mandatory: Raises serious free speech issues. Can be defeated via proxy servers and tunneling. A one-size-fits-all approach for everyone, regardless of age
    1. Parental controls: An overview — Larry Magid, Co-Director. Safer Internet Day, Moscow, February 2012
    2. Filtering technologies: None, Device, Router, DNS level, ISP optional, ISP mandatory
    3. What parental controls can do
       • Block undesirable content, such as sites containing sexual images or advocating violence or the use of drugs or alcohol
       • Block websites that parents or authorities are concerned about, such as social networking (Facebook, Google+, Twitter)
       • Limit how a child uses a device by total amount of time per day, time of day, or day of week
       • Block or limit features on a device, such as texting, games, web access, or any specific program or app
       • Monitor and report on use of a device or service
    4. Technical issues around filtering
       • Overblocking
       • Underblocking
       • Levels of control: granularity gives greater control but is more confusing
       • Parental white-listing tools
       • Parental black-listing tools
       • Filtering vs. monitoring: issues of "too much information"; children's privacy (trust factor); stealth vs. open
    5. What filters can't do very well
       • Prevent bullying and peer harassment
       • Prevent posting inappropriate content (reputation damage)
       • Prevent inappropriate or unwanted contact*
       • Teach self-control and critical thinking*
       *Monitoring technologies can be used for these purposes
    6. Monitoring
       • Can run on a device or on the network
       • Should it report all activity or just suspicious activity?
       • Should it report on private conversations, or only on public postings and what people are saying about the child? (Safetyweb vs. Spectorsoft)
       • Can run in stealth mode or require the child to know it's running
    7. Social issues around parental controls
       • Openness about what is being filtered and by what criteria
       • Part of a conversation, or stealth mode?
       • How and when do you wean children away from filtering?
       • The values that go into filtering are not necessarily universally accepted
       • Free speech issues, especially with social networking
       • Privacy and trust
       • Failure to teach critical thinking
    8. Illegal content
       • We need to separate the question of "child pornography" from Internet safety: child pornography (child abuse images) is a legal issue, while Internet safety is a social issue
       • Most NGOs and government officials in the U.S. are very careful not to mix child abuse images with other concerns such as adult pornography, drug abuse, extremism, and intellectual property protection
       • There is some concern that controlling any content could lead to banning more content, including political speech