Photo: Thomas Trutschel/Photothek via Getty Images
Twitter has selected two proposals to study user interactions and discourse since putting out a call in March for help in measuring "conversation health" on its service.
Why it matters: A growing number of users have been taking time off Twitter, using it less, or quitting it altogether, saying the service has become mentally toxic.
The two proposals, which Twitter selected from more than 230 submissions:
1. Echo chambers and uncivil discourse: This project will measure the extent to which Twitter users interact with and acknowledge a variety of viewpoints.
- It will also work on developing algorithms that can distinguish between incivility and intolerance.
- Uncivil discourse, which breaks the norms of politeness, can serve important functions in political debate. Intolerance, on the other hand, includes hate speech and racism and runs counter to democracy, per Twitter and the research team.
- This is led by professors from Leiden University, Syracuse University, Delft University of Technology, and Bocconi University.
2. Interactions and decreasing prejudice: This project will study whether (and how) user behavior on Twitter can decrease prejudice and discrimination when people interact with users holding diverse viewpoints.
- "When the communication between groups contains more positive sentiments, cooperative emotions, and more complex thinking and reasoning from multiple perspectives, prejudice is reduced and relations can improve," Twitter says.
- This is led by professors from the University of Oxford and the University of Amsterdam.
Yes, but: Twitter will still have to figure out what to do about the problematic interactions or trends these projects unearth. And the research won't resolve some of the biggest criticisms of the company, including its policies and enforcement around abusive and harassing behavior. Some victims have said they don't always get the help they need from the company.