An Italian court in Milan has found three Google executives
guilty of privacy violations, handing them suspended six-month sentences.
The executives were prosecuted over a video,
posted to a Google site in 2006,
showing four youths bullying a boy with Down syndrome.
It seems to me the ruling highlights three questions. Firstly,
should user-generated content be policed by
service providers? Secondly,
should the onus be on them to act upon user complaints? Thirdly, was the
punishment fair?
Though I self-classify as a liberal, I was taken aback by the
reaction of those who support
an unfettered right to publish any user-generated
content on the
internet, no matter how shocking or reprehensible.
Prosecutors successfully argued that Google had been negligent:
the video of the Down syndrome child being bullied by four classmates
at a Turin school remained online for two months, even though some web users
had already posted comments asking for it to be taken down. Google did act
swiftly as soon as it was notified of the video's existence by Turin police.
Sentencing executives for vicarious responsibility over content posted by
others may seem extreme, but consider this point: "
A company's
rights cannot prevail over a person's dignity. This sentence sends a clear
signal," public prosecutor Alfredo Robledo told reporters outside the
Milan courthouse.
There are precedents: don't forget that eBay monitors auctions to ensure that certain
items, such as Nazi war memorabilia or human organs, are not sold online.
In Europe the human rights agenda is gathering pace. In my view this has far less
to do with state censorship or conspiracy theories inspired by Silvio
Berlusconi, and far more to do with an emerging belief that service
providers must take responsibility for content, and that commercial rights cannot
override consumer rights at any cost.
I do not buy the argument that the judgment '
attacks the very principles of freedom on which the internet is built'. Lazy liberalism is no answer
to the question of whether content should be policed, and if so, how. Community
is a powerful answer: communities are very good at judging content against
societal moral standards, and social media owners have a responsibility to harness them. Whether they like it or not, an element of judgement, reflected in basic rules, is unavoidable.