Don’t Debate If You Haven’t Read This is a new feature of The Debating News that selects a debate-worthy article or news story every day for its readers. A motion is worded at the end of each article and included in the motion bank.
by Jojo Moyes
Published by: The Telegraph
Date: February 26, 2011
At first glance, Natasha MacBryde’s Facebook page is nothing unusual. A pretty, slightly self-conscious blonde teenager gazes out, posed in the act of taking her own picture. But unlike other pages, this one was set up in commemoration, following her death under a train earlier this month. Now, though, it has had to be moderated after being hijacked by commenters who heartlessly mocked both Natasha and the manner of her death.
“Natasha wasn’t bullied, she was just a whore,” said one, while another added: “I caught the train to heaven LOL [laugh out loud].” Others clicked on the “like” symbol, safe in their anonymity, to indicate that they agreed. The messages were removed after a matter of hours, but Natasha’s grieving father Andrew revealed that Natasha’s brother had also discovered a macabre video – entitled “Tasha The Tank Engine” – on YouTube (it has since been removed). “I simply cannot understand how or why these people get any enjoyment or satisfaction from making such disgraceful comments,” he said.
He is far from alone. Following the vicious sexual assault on NBC reporter Lara Logan in Cairo last week, online debate on America’s NPR website became so ugly that moderator Mark Memmott was forced to remove scores of comments and reiterate the organisation’s stance on offensive message-posting.
He added: “Here’s a suggestion based on my more than 30 years of reporting and editing experience. Before you submit a comment, ask yourself this question: If I had to put my real name with this, would I hit ‘publish?’
“If the answer is no, the better move might be to hit ‘delete’.”
It’s a sensible message. But it’s one that fewer internet users seem to be heeding. “Trolls”, or users who deliberately post offensive or inflammatory comments, are on the rise. America’s Today Show recently ran a story about trolls’ behaviour after the deaths of three adolescent girls. One of the girls, Alexis Pilkington, was referred to as a “suicidal slut”, while the grieving family of an 18-year-old who had died in a car crash were targeted by trolls who emailed them leaked pictures of her mutilated corpse. Last year, after others defaced the Facebook pages of two murdered children, Australian communications minister Stephen Conroy claimed that the free-for-all nature of the internet had become “a recipe for anarchy and the Wild West”.
In Britain, website Little Gossip prompted outrage after it enabled – some say encouraged – school pupils to post unproven sexual gossip about other, named pupils. It was closed earlier this month after the owners confessed they were unable to prevent what they called “malicious and unwanted comments”.
But who is posting such vile content? And why? Neuroscientist Baroness Greenfield has expressed concern as to whether internet use is responsible for what she sees as an increasing lack of empathy among the young. At the British Festival of Science she said that while some “very good things” were emerging from information technology, “by the same token we have got to be very careful about what price we are paying”.
Website netbullies.com has identified four kinds of people who post offensive content. The most dangerous, it says, is the “power hungry” bully, often someone who has little power or voice in real life. “They are empowered by the anonymity of the internet and communications and the fact that they never have to confront their victim.”
Someone who would agree is Sports Illustrated writer Jeff Pearlman, who has to deal daily with offensive tweets and postings that, he says now, “come with the turf”. “It’s getting worse because the internet allows for anonymity,” he says. “And anonymity is the e-mail equivalent of drunken courage in a bar. It allows people to fire off vulgarity and threats sans consequences.”
Last month, however, Pearlman decided to track down and confront those who had insulted him and, in one case, tricked him into opening a link containing extreme pornography while his young daughter was present.
“Matt”, the first of the commenters he confronted, apologised profusely, saying he had simply wanted to get a rise out of Pearlman. “I thought it was cool,” Matt said. “I never meant for it to reach this point.” “Andy”, another, confessed that he was not proud of what he had done, “but the internet got the best of me”. He pleaded, without irony, for Pearlman “not to eviscerate me”.
“The main problem,” says Pearlman, “is there’s no longer a stamp-and-envelope moment. Back when we communicated via letters, there was time between writing something and sending it to kick back and re-think your sentiment. Now, there’s no time. It’s write, click, send — bam!”
This appears to be true for Nir Rosen, a Fellow at New York University’s Centre for Law and Security, who resigned his post last week after his own unpleasant tweets about Lara Logan’s plight were publicised. “It was the Twitter equivalent of blurting something out…” he explained afterwards. “In those few minutes I didn’t think about it, you’re lying in bed late at night… just –––––––around on the internet thoughtlessly.”
Etiquette expert William Hanson agrees: “Writing something on Facebook, Twitter or an internet forum detaches you from your remarks… it gives people a kind of ‘courage’ to be vindictive and come out with things that in their right mind they would never say.”
But this apparent licence to express one’s most toxic thoughts is evident on ordinary newspaper websites, where, this week, for example, comments below a photograph of pregnant Myleene Klass included: “She looks a complete mess,” “totally gross”, “saggy breasted” and even “revolting”. One pregnant woman told me she had felt intimidated just reading them.
Technology experts are divided as to whether insisting on the use of real names would improve online behaviour – or whether trolls would simply find a way around it. But some websites are trying to solve the problem, harnessing the wisdom of the crowd, and relying on the good sense, and manners, of the majority.
Tech website Slashdot has for years made visible only those comments that receive a certain number of “approvals” from other users. Websites such as Huffington Post and Jezebel have recently introduced similar systems, with posts requiring peer approval. Gawker, meanwhile, requires commenters to “audition” before their remarks appear. In a bold strategy, weblog site Metafilter requires its users to pay to comment – those who make offensive remarks are banned and lose their money. It has proven a powerful deterrent.
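The peer-approval idea described above – a comment stays hidden until enough other users vouch for it – can be sketched in a few lines of code. This is an illustrative toy, not the actual system used by Slashdot or any other site; the class names and the threshold value are assumptions for the example.

```python
# Minimal sketch of threshold-based peer approval: a comment becomes
# visible only once a set number of other users have approved it.
# The threshold of 3 is illustrative, not any real site's policy.

class Comment:
    def __init__(self, author, text, approval_threshold=3):
        self.author = author
        self.text = text
        self.approvals = set()  # each user counts once, however often they click
        self.approval_threshold = approval_threshold

    def approve(self, user):
        # Authors cannot approve their own comments.
        if user != self.author:
            self.approvals.add(user)

    def is_visible(self):
        return len(self.approvals) >= self.approval_threshold


def visible_comments(comments):
    """Return only the comments that have cleared the peer-approval bar."""
    return [c for c in comments if c.is_visible()]
```

Because approvals are stored in a set, repeated clicks by the same user count only once – the crowd, not any individual, decides what appears.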
But Facebook, which is primarily a networking site rather than a commenting one, is struggling to deal with the problem, as evidenced by the callous comments left in support of killer Raoul Moat after his recent death. In the meantime, few have faith that the internet’s “Wild West nature” will change any time soon. Anyone who writes or is written about is now a target for abuse, says Pearlman. “I don’t think it can be improved, unless there’s some sort of genuine accountability. And that’s probably impossible.”
Hanson believes the issue may simply reflect society as a whole, and that people are becoming less respectful of each other generally. “Manners are selfless – they put other people first, and we as individuals second. We must remember that the whole point of manners and civility is other people, Internet or no internet.”