Live streaming social media platforms such as Twitch are shrinking the distance between broadcasters and their viewers, who watch and comment in real time. But these interactive forums also bring added layers of intensity to online bullying: trolls who post negative comments can instantly observe the impact of their nastiness on the shocked faces of streamers.

“When you post something online and someone comments, you don’t have to look. But there is no way for people who are streaming to shield themselves from what others are saying,” notes Yvette Wohn, an assistant professor of informatics at NJIT who researches both harassment and emerging methods for controlling it.

With grants from the National Science Foundation and the Mozilla Foundation, Wohn is currently focusing on the growing use of volunteer moderators by Twitch broadcasters, who set their own rules of engagement based on the types of communities they are trying to create. She’s looking closely at minority broadcasters and moderators, including those from the LGBT community.

Her goal is to develop tools to combat bullying, and they will not all be technical. Among other options, “broadcasters can ban you – or engage you in conversation,” she says. “Some people simply don’t know what’s socially appropriate. What people don’t realize is that what they say online has direct consequences offline. What happens online doesn’t stay online.”

While she studies harassment, Wohn, the director of NJIT’s Social Interaction Laboratory, is equally interested in the ways that technology connects people, through social supports such as virtual communities for women on campus and for people recovering from opioid addiction.

“There is a lot of discussion about whether technology makes us more or less lonely, and I think the answer is that it depends on the way you use it. I want to develop technologies to help people who are disconnected, but want to engage with other people.”