A recording of an apparent murder posted on Facebook over the weekend prompted calls for social media companies to do more to police their sites.
The murder suspect fatally shot himself Monday in Pennsylvania as troopers moved in, according to authorities.
This is not the first incident to alarm social media users and advertisers. The accidental death of a teen in Georgia who shot himself on Instagram Live, the rape of a girl in Chicago, and the earlier torture of a special-needs teen in the city were also seen and shared on social media.
Even Facebook CEO Mark Zuckerberg noted, "We have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening."
Hemanshu Nigam, founder of the online advisory company SSP Blue, believes part of the problem is that many sites rely too heavily on their users or on artificial intelligence ("algorithms") to detect inappropriate content.
Nigam says those are important tools, but "companies need to invest more in actual human beings sitting in front of systems and saying, wait, does this comply with our terms? Is this something we want on our site? Should we allow it or not allow it?"
He says sites should make the process of reporting inappropriate content obvious and simple.
And users, he said, should play a more active role.
"Yes, as you would in the real world, when you see something negative happening in your community, you tell somebody," Nigam said. "You do something about it. Treat the online environment the same exact way you treat the physical world."