CDZ The net giants enabling suicide.

Instagram boss in talks over self-harm content

The boss of Instagram will meet the health secretary this week over the platform's handling of content promoting self-harm and suicide.

It comes after links were made between the suicide of teenager Molly Russell and her exposure to harmful material.

Instagram's boss, Adam Mosseri, said it would begin adding "sensitivity screens" which hide images until users actively choose to look at them.

But he admitted the platform was "not yet where we need to be" on the issue.

The sensitivity screens will mean certain images, for example of cutting, will appear blurred and carry a warning to users that clicking on them will open up sensitive content which could be offensive or disturbing.

This is a huge issue in the UK. Molly committed suicide after viewing all sorts of crap on the net. Suicide, mutilation and self-harm. All of this crap is freely available to children and the Net Moguls don't care.

Now the UK government is threatening action and they will have to clean up their act.

I don't know if it is possible to create a 100% safe environment, but I suspect that it isn't. It should be more difficult for kids to access this shit, though.

People have always taken their own lives and that shadow has loomed over my family since I was a child. It is ludicrous that we make it easier for people to do so.

I have no idea what action the government is planning.

How do you know that exposure causes more suicides as opposed to preventing them? I would think that someone looking at photos of a suicide scene would be better informed as to what hell they are about to put the person who finds them through.
Instagram vows to clamp down on self-harm content after calls from suicide victims' parents
Instagram thinks so.

Public relations doesn't make it so. Parents whose kids didn't kill themselves after looking at gore would have no reason to rabble-rouse. I am not saying they should or shouldn't "clamp down"; I just don't see how this can be declared evidence of something.
The Instagram boss made the point that shutting it down would leave suicide-inclined people alone and out there. But he did not outline any help that they would receive.

My uncle killed himself when I was two. I don't know why, because it is never spoken about. I think I can probably guess, though. As a child I used to accompany my Nana on a Sunday to place flowers and tidy up the grave.
It became her life's mission to ensure his grave was kept in good order.

She spent time in various institutions when it became too much for her to bear. I like to think that she is now at peace and reunited with him. Self-harm and suicide are not a matter of individual expression. They affect a wider community, and we need some protection. They should certainly not be a means of making money.

There is plenty of gore out there on the internet that they will be able to access with or without Instagram. Instagram, owned by Facebook, designs the total information protocols over its users as a means of making money. Just blocking the content, rather than treating it as a signal about the user, at the very best leaves them with deniability should they ever be sued for failing to identify that a person was suicidal and to intervene. Their decision is "looking the other way", not being proactive.