The work of integrity teams offers a different solution. Integrity work may be in the spotlight right now, but it has a long history in the industry. We've learned a lot from approaches to combating spam in email and search engines, and we've borrowed many concepts from computer security.
One of the best strategies for integrity we've found is to bring some real-world friction back into online interactions. I'll focus on two examples to explain this, but there are many more mechanisms, such as limits on group size, a karma or reputation system (like Google's PageRank), an indicator of which "neighborhood" you're from, structures for good conversations, and a less powerful share button. For now, let's talk about two ideas that integrity workers have developed: we'll call them driving tests and speed bumps.
First, we need to make it harder for people to run fake accounts. Imagine if, after being arrested for a crime, someone could walk out of jail disguised as a completely new person. Imagine it were impossible to tell whether you're talking to a group of people or to one person rapidly switching disguises. That breakdown of trust is corrosive. At the same time, we must not forget that pseudonymous accounts are not always bad. Perhaps the person behind the pseudonym is a gay teenager estranged from their family, or a human rights activist living under an oppressive regime. We don't need to ban all fake accounts. But we can raise their costs.
One solution borrows from the road: in many countries, you cannot drive until you have learned under supervision and passed a driving test. Similarly, a new account should not get instant access to every feature in an app. To unlock the features most prone to abuse (the ones spammers and harassers rely on), perhaps an account should have to pay some cost in time and effort. Maybe it just needs time to "ripen." Maybe it needs to accrue enough goodwill in some karma system. Maybe it needs to do a few things that are hard to automate. Only after passing this "driving test" is the account trusted with access to the rest of the application.
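To make the "driving test" idea concrete, here is a minimal sketch of what such feature gating might look like. The thresholds, field names, and the set of gated features are all illustrative assumptions, not any platform's actual policy:

```python
from dataclasses import dataclass

# Illustrative thresholds -- real values would be tuned per platform.
MIN_ACCOUNT_AGE_DAYS = 14    # time to "ripen"
MIN_KARMA = 50               # goodwill accrued in a karma system
MIN_VERIFIED_ACTIONS = 3     # hard-to-automate steps completed

@dataclass
class Account:
    age_days: int
    karma: int
    verified_actions: int    # e.g., tasks a bot cannot easily script

def has_passed_driving_test(account: Account) -> bool:
    """An account earns trust gradually before unlocking abuse-prone features."""
    return (
        account.age_days >= MIN_ACCOUNT_AGE_DAYS
        and account.karma >= MIN_KARMA
        and account.verified_actions >= MIN_VERIFIED_ACTIONS
    )

def can_use_feature(account: Account, feature: str) -> bool:
    # Basic features work immediately; powerful ones wait for the test.
    abuse_prone = {"mass_invite", "bulk_message", "create_many_groups"}
    if feature in abuse_prone:
        return has_passed_driving_test(account)
    return True
```

The point is not the specific thresholds but the shape of the check: trust is earned over time, and every fresh disguise has to earn it again.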
Spammers can, of course, jump through these hoops; in fact, we expect them to. After all, we don't want to make things too difficult for the legitimate users of pseudonymous accounts. But it takes some effort to create each new "disguise," and that brings some physics back into the equation. Three fake accounts might be manageable. Hundreds or thousands would be too costly to sustain.
Online, the worst harms almost always come from power users. This is fairly intuitive: social apps encourage their members to share as much as possible, and power users can do so far more often, and to many more audiences at once, than is possible in real life. In a physical city, the harm any one person can do is limited by the need to be in one place, speaking to one audience, at a time. That is not true online.
Online, some actions are perfectly reasonable in moderation but become suspicious en masse. Consider creating two dozen groups at once, commenting on a thousand videos an hour, or posting every minute for an entire day. When we see someone using a feature that heavily, they are probably doing something akin to driving at an unsafe speed. We have a solution for that: the speed bump. Keep them from doing that thing for a while. There is no value judgment here; it's a safety feature, not a punishment. Such measures are an easy way to make things safer for everyone while inconveniencing only a small fraction of people.
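As a sketch, a speed bump can be as simple as a per-account sliding-window rate limit with a cool-off period. The window size, action limit, and cool-off duration below are made-up illustrations:

```python
import time
from collections import defaultdict, deque

# Illustrative limits: at most 20 actions per 60-second window,
# with a 10-minute cool-off once the bump is hit.
MAX_ACTIONS = 20
WINDOW_SECONDS = 60
COOL_OFF_SECONDS = 600

_recent = defaultdict(deque)   # account_id -> timestamps of recent actions
_blocked_until = {}            # account_id -> time the cool-off ends

def allow_action(account_id: str, now: float | None = None) -> bool:
    """Return True if the action may proceed, False if the account has
    hit a speed bump. Not a punishment, just a temporary pause."""
    now = time.time() if now is None else now

    if _blocked_until.get(account_id, 0) > now:
        return False           # still cooling off

    window = _recent[account_id]
    while window and window[0] <= now - WINDOW_SECONDS:
        window.popleft()       # drop actions outside the window

    if len(window) >= MAX_ACTIONS:
        _blocked_until[account_id] = now + COOL_OFF_SECONDS
        return False           # speed bump: slow down for a while

    window.append(now)
    return True
```

A sliding window like this barely inconveniences anyone acting at a human pace, while making bulk abuse slow and expensive.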