Trust & Safety

Challenges & How to Build an Effective Team

Online marketplaces are great platforms for people to exchange goods and services that would otherwise be difficult to get. But they tend to work well only when users can trust that other users are dealing in good faith.

The business that operates a marketplace has a role to play here. It must lay down rules for what is or isn’t allowed on the platform. It also needs to have a team of employees that investigates potential violations of those rules, and takes disciplinary action if necessary. These operations are collectively called Trust and Safety.

For some companies, this means a dedicated Trust & Safety department; for many others, Trust & Safety operations are a core component of risk and compliance management, whether teams realize it or not.

This piece will explain more about what Trust and Safety is, why marketplaces need it, and what challenges it commonly faces. It will also offer tips for building Trust and Safety teams, measuring the success of operations, and other best practices.

What is Trust and Safety?

Trust and Safety refers to tools and business practices a marketplace uses to protect its users and its own integrity. The goal of Trust and Safety is to promote trust among a marketplace’s users by identifying and limiting behaviors that present risks to users or the marketplace’s functionality.

Why is Trust and Safety Important for Digital Marketplaces?

Digital Trust and Safety is essential for marketplaces because it helps them retain users – both customers and merchants. Users who are subjected to abuse on a marketplace are more likely to leave it – and take their business with them. So it’s in a marketplace’s financial interest to maintain a safe and fair platform that entices users to stay.

Some specific benefits of having dedicated Trust and Safety services for a marketplace include:

  • Improved user experience: Keeping abusive users off a marketplace makes the platform easier and safer for legitimate users to efficiently do business on.
  • Higher customer loyalty: Marketplaces that keep their users safe will often find those users stick around longer and do more business.
  • Better brand credibility: A marketplace with a reputation for keeping its users safe will often find it easier to attract new users through referrals and searchability.
  • More resilient platform: Having a specialized Trust and Safety team at a marketplace allows it to more easily respond to – and recover from – high-profile incidents.
  • Tighter privacy compliance: As digital privacy legislation is developed and implemented, taking a proactive approach to Trust and Safety helps a marketplace avoid the regulatory consequences of failing to meet these obligations.
  • Increased transparency: Trust and Safety specialists help to build user confidence in a marketplace by clearly outlining the platform’s commitments to – and expectations of – its users.

Trust and Safety Challenges Marketplaces Face

One of the biggest Trust and Safety challenges for online marketplaces is simply a lack of understanding, both inside and outside the marketplace, of why these operations are necessary. Most focus goes to anti-money laundering (AML) and counter-terrorist financing (CTF) for the sake of regulatory compliance, and to anti-fraud to protect the marketplace from unnecessary losses.

Little attention is paid to the very real harm that online abuse can do to a marketplace’s users, and how much business the marketplace can end up losing as a consequence.

Other common issues online Trust and Safety teams face in marketplace settings include:

  • User and content volume: Bigger marketplaces mean more users and more content to monitor for rules violations. Many Trust and Safety teams struggle to find a way to handle this efficiently.
  • Action and content context: Some actions and content on a marketplace may appear to violate rules at first glance, but actually don’t when taken in proper context. Automated moderation solutions can sometimes end up being too heavy-handed in these situations.
  • Multiple communication channels: There are often several different ways for users to interact on a marketplace, such as through usernames, profiles, messages, pictures, and videos. Moderation solutions that are effective for one medium may not work so well for another.
  • Varying types of abusive behavior: There are many types of activities and content that can make marketplace users feel unsafe, from posting explicitly violent media to sending hateful or threatening messages. Trust and Safety solutions need to take into account who is typically responsible for each type of abuse, how they accomplish it, and who they tend to target.
  • Changing tactics for abuse: Trust and Safety teams also need to realize they will have to adapt solutions to deal with abusive users who find ways around existing safeguards. For example, harassers may try to bypass profanity filters by writing disallowed words in code or in an uncommon language (a simple mitigation is sketched after this list).
  • The danger of false positives: As with contextual challenges, Trust and Safety teams that rely too heavily on automated solutions may end up accidentally disciplining users who have done nothing wrong. This can scare legitimate users off a marketplace as effectively as not dealing with deliberate rules violations.
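
A minimal Python sketch of one common mitigation for the evasion problem above: normalize text (lowercase it and undo simple character substitutions) before matching it against a blocklist, and flag matches for human review rather than disciplining users automatically. The substitution map, blocklist terms, and function names here are hypothetical illustrations, not any particular product’s implementation.

```python
# Minimal, hypothetical sketch: normalize text before matching it against a
# blocklist, so simple evasions such as character substitutions ("sc4m" for
# "scam") are still caught. Flagged items only go to human review here,
# which keeps false positives relatively cheap.

SUBSTITUTIONS = str.maketrans({
    "@": "a", "4": "a", "3": "e", "1": "i", "0": "o", "$": "s", "5": "s",
})

BLOCKLIST = {"scam", "counterfeit"}  # hypothetical disallowed terms


def normalize(text: str) -> str:
    """Lowercase the text and undo common character substitutions."""
    return text.lower().translate(SUBSTITUTIONS)


def flag_for_review(text: str) -> bool:
    """Return True if any normalized token matches the blocklist."""
    tokens = normalize(text).split()
    return any(token.strip(".,!?") in BLOCKLIST for token in tokens)


if __name__ == "__main__":
    print(flag_for_review("Great seller, fast shipping"))    # False
    print(flag_for_review("This listing is a total sc4m!"))  # True
```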

Who is Responsible for Trust and Safety Operations?

A larger marketplace may have a dedicated Trust and Safety team to handle operations and incidents. However, since Trust and Safety is a relatively new concept, some marketplaces – especially smaller ones – may not.

Marketplaces without dedicated teams may rely on their anti-fraud team to handle Trust and Safety issues. This is because many marketplace Trust and Safety abuses are forms of fraud, or are activities done with the intention of committing fraud later.

In smaller marketplaces where anti-fraud and AML operations are combined into a single risk management function, that department will likely handle Trust and Safety matters as well.

How to Build an Effective Trust and Safety Team

Building a successful Trust and Safety division at a marketplace starts with establishing why the platform needs it, and then selling that need to other marketplace stakeholders. From there, it’s a matter of selecting the right team members and setting standards for both marketplace employees and users. Of course, it also requires being able to adapt to changing circumstances.

The process of putting an effective Trust and Safety program together for a marketplace looks like this:

Step 1: Evaluate the marketplace’s needs and decide on roles

Collect and review data on the marketplace’s current Trust and Safety situation, including talking to employees, users, and even teams at similar companies. Learn which issues the marketplace is facing (and will likely face), and which solutions will likely help the most.

From there, put together a template for how the Trust and Safety team will be built. It will likely need people for general oversight, operations oversight, content moderation, public relations, engineering, and legal.

Step 2: Get other marketplace leaders and departments to buy in

The next key to creating and filling out a Trust and Safety team is to make a case for its value to senior marketplace personnel. Point out where the marketplace’s current Trust and Safety operations could use improvement, and – when possible – quantify how much current solutions are saving the company in resources (especially money).

Also talk to other departments such as product, customer experience, marketing, and sales. Make cases for how Trust and Safety can smooth out each of their operations without them having to do extra work or hire extra people.

Step 3: Clearly outline policies and procedures

Make sure to thoroughly outline the marketplace’s Trust and Safety guidelines for purposes such as moderating community conduct and internal training. The former will build trust with the user base by reducing confusion surrounding what is or is not allowed on the marketplace. The latter will make onboarding new Trust and Safety team members faster and easier.
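
One way to keep the written guidelines, the moderation tooling, and onboarding material consistent is to maintain a simple machine-readable summary of the rules alongside the human-readable policy. The sketch below is purely illustrative; the category names, default actions, and review requirements are hypothetical placeholders rather than a recommended policy.

```python
# Hypothetical, simplified policy summary: each rule category maps to a
# default enforcement action and whether a human reviewer must confirm it.
# The point of keeping this alongside the written guidelines is that the
# policy document, the tooling, and training material stay in sync.

POLICY = {
    "spam_listing":        {"default_action": "remove_content", "requires_human_review": False},
    "counterfeit_goods":   {"default_action": "remove_content", "requires_human_review": True},
    "harassment":          {"default_action": "warn_user",      "requires_human_review": True},
    "threats_of_violence": {"default_action": "suspend_user",   "requires_human_review": True},
}


def default_action(category: str) -> str:
    """Look up the default enforcement action for a flagged category."""
    rule = POLICY.get(category)
    return rule["default_action"] if rule else "escalate_to_policy_team"
```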

Step 4: Keep communication open to maintain flexibility

Abusive users will continually invent new tactics, techniques, and procedures to test what they can get away with on a marketplace. For example, certain offensive slang terms and symbols, or pieces of misinformation, won’t always be covered by the marketplace’s current Trust and Safety policy.

That’s why it’s important to encourage open conversations about what Trust and Safety team members find out about these new threats. This will help in revising both internal and external guidelines to keep them current.

Step 5: Ensure morale stays high

Working in Trust and Safety can be difficult, as it inherently involves dealing with a marketplace’s most abusive users. Offer health benefits (especially surrounding mental wellness), and schedule regular check-ins to get a sense of what headspace each member of the team is in.

Foster a culture of interdependence where team members know they can reach out to each other for help if things get tough.

Measuring the Success of Trust and Safety Operations: What Metrics to Use

Finding the right metrics to measure for Trust and Safety can be tricky. Part of this is because marketplaces in different industries have different priorities. It’s also because Trust and Safety is responsible for resolving rules violations on a marketplace, but its ultimate goal is to prevent those rules from being violated at all. And it’s difficult to measure events that could happen, but never actually do.

With that said, here are some suggested metrics to track (a brief sketch of how two of them can be computed follows the list):

  • Users exposed to violations: The proportion of a marketplace’s total users who viewed disallowed content, or were the victims of malicious behavior on the platform. A goal of Trust and Safety should be to act quickly and isolate rules violation incidents so they affect as few users as possible.
  • Content and users flagged as violating: The proportion of content submitted to a marketplace, or users on the marketplace, that either other users or the Trust and Safety team mark as potentially rule-violating. Minimizing this metric shows a marketplace’s enforcement procedures are working well enough that users aren’t trying to violate rules in the first place.
  • Types of violations committed: The proportion of marketplace rules violations broken down by which rules were violated. This can show Trust and Safety where they should focus their efforts, or perhaps update the rules if certain risky content or activities aren’t actionable under current policy.
  • Average case resolution time: The average amount of time between when rule-violating content or activity is flagged, and when a Trust and Safety operative is able to complete their review or contain the security threat. Faster times mean that marketplace incidents are being handled more efficiently.
  • Automated vs. manual violation detection: The rate at which rules violations were picked up by automated moderating systems, as opposed to being spotted by Trust and Safety operatives or reported by marketplace users. This shows how accurate an automated moderation solution is, as well as which types of violations it may be having difficulty detecting.
  • Moderation fairness: The number of times marketplace users successfully appealed moderation decisions, versus the total number of appeals. This can show which of a marketplace’s rules may be too strict, or which Trust and Safety operatives – including automated solutions – may be overly heavy-handed in their judgments on certain types of perceived rules violations.
  • Concentration of violations: An analysis of the number and types of rules violations committed by specific users. On one hand, if it’s the same users committing rules violations over and over, a Trust and Safety team may need stricter policy enforcement for those individuals. On the other hand, if multiple users are committing the same type of rules violation, it may point to coordinated activity or even a fraud ring.
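
As a rough illustration, two of the metrics above – average case resolution time and moderation fairness – can be computed from simple case records. The field names and sample data in this sketch are assumptions made for illustration, not a standard schema.

```python
# Hypothetical sketch of two Trust and Safety metrics: average case resolution
# time, and moderation fairness measured as the share of appeals users won.
# The field names and records below are invented for illustration only.
from datetime import datetime
from statistics import mean

cases = [
    {"flagged_at": datetime(2023, 5, 1, 9, 0), "resolved_at": datetime(2023, 5, 1, 11, 30),
     "appealed": False, "appeal_successful": False},
    {"flagged_at": datetime(2023, 5, 2, 14, 0), "resolved_at": datetime(2023, 5, 3, 9, 0),
     "appealed": True, "appeal_successful": True},
    {"flagged_at": datetime(2023, 5, 4, 8, 0), "resolved_at": datetime(2023, 5, 4, 10, 0),
     "appealed": True, "appeal_successful": False},
]

# Average case resolution time: hours from flagging to resolution.
resolution_hours = mean(
    (c["resolved_at"] - c["flagged_at"]).total_seconds() / 3600 for c in cases
)

# Moderation fairness: successful appeals as a share of all appeals filed.
appeals = [c for c in cases if c["appealed"]]
overturn_rate = sum(c["appeal_successful"] for c in appeals) / len(appeals) if appeals else 0.0

print(f"Average resolution time: {resolution_hours:.1f} hours")
print(f"Appeal overturn rate: {overturn_rate:.0%}")
```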

Trust and Safety Policy Best Practices to Follow

In addition to the recommendations above, here are four other Trust and Safety best practices that marketplaces can follow to ensure smooth and secure operations.

Consider potential risks during product development

A company’s Trust and Safety team should work closely with the Product team(s) to understand what kind of marketplace is being built, and how it’s being built. This provides an opportunity to consider and prepare for ways the marketplace could be abused, or even design the marketplace to prevent certain types of abuse.

Be transparent with policies and metrics

A Trust and Safety team should develop conduct guidelines that are thorough, specific to the marketplace, and clear for users to understand. It should also publish these guidelines in a place that’s easily accessible for users.

Additionally, it should periodically release public reports on its tracked metrics to show users that the marketplace is making maintaining their trust and safety a top priority.

Allow users to report potential abuses

Trust and Safety can’t always catch every instance of abusive behavior on a marketplace by itself. That’s why it should consult with the Product team to implement a mechanism that lets users themselves report potentially abusive content or activities. Instructions for accessing and using this mechanism should be included in the marketplace’s community guidelines.
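
As a hedged sketch of what such a reporting mechanism might capture, the example below models a user-submitted report as a small structured record that can be routed to the appropriate review queue. The field names and report categories are hypothetical, not a prescribed design.

```python
# Hypothetical sketch of a user-submitted abuse report. Capturing a small,
# structured set of fields keeps reports easy to triage and route to the
# right Trust and Safety queue. Field names and categories are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone

REPORT_CATEGORIES = {"harassment", "counterfeit_goods", "scam", "explicit_content", "other"}


@dataclass
class AbuseReport:
    reporter_id: str          # user who filed the report
    reported_content_id: str  # listing, message, or profile being reported
    category: str             # ideally one of REPORT_CATEGORIES
    description: str = ""     # optional free-text details from the reporter
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def __post_init__(self) -> None:
        # Unknown categories are still accepted, but routed for manual triage.
        if self.category not in REPORT_CATEGORIES:
            self.category = "other"


# Example: a report created when a user taps a (hypothetical) in-app "Report" button.
report = AbuseReport(
    reporter_id="user_123",
    reported_content_id="listing_456",
    category="counterfeit_goods",
    description="Item is advertised as brand-name but the photos show a fake.",
)
```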

Have a high-profile incident response plan ready

A Trust and Safety team should still have specific procedures in place in the event that a high-profile incident – such as a data breach or coordinated mass fraud – happens to the marketplace. The plan should first focus on how to protect users – including their accounts, money, and credentials. This should involve communicating to them what the nature of the incident is, as well as anything they should do to secure their assets.

It should also consider how to quickly but securely restart the marketplace’s operations after the incident has been resolved. Better yet, it should also cover how to keep the marketplace running safely while the incident is in progress.
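
Purely as an illustration, such a plan can be kept as an ordered checklist with a named owner for each step, mirroring the priorities above: protect users first, then communicate, then restore operations. The steps and owners below are hypothetical.

```python
# Hypothetical incident response runbook, expressed as an ordered checklist
# with a named owner per step. Steps and owner names are illustrative only.
INCIDENT_RUNBOOK = [
    {"step": "Freeze affected accounts and pending payouts", "owner": "Trust & Safety on-call"},
    {"step": "Force credential resets for impacted users", "owner": "Security engineering"},
    {"step": "Notify impacted users of the incident and the actions they should take", "owner": "Communications"},
    {"step": "Keep unaffected parts of the marketplace running safely", "owner": "Platform engineering"},
    {"step": "Verify containment, then restore full operations", "owner": "Incident commander"},
    {"step": "Publish a post-incident summary and update policies as needed", "owner": "Trust & Safety lead"},
]


def print_runbook(runbook: list) -> None:
    """Print the checklist in order so it can be followed under pressure."""
    for i, item in enumerate(runbook, start=1):
        print(f"{i}. {item['step']} (owner: {item['owner']})")


if __name__ == "__main__":
    print_runbook(INCIDENT_RUNBOOK)
```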

Augment your Trust and Safety Operations with Unit21

Monitoring for, detecting, investigating, and acting on threats to a marketplace is highly inefficient when done completely manually. Despite their potential blind spots, automated Trust and Safety solutions are virtually essential for handling the volume and scope of potential abuses that modern digital marketplaces face.

Schedule a demo with Unit21 to see how our Trust and Safety solutions for marketplaces help to automate abuse detection, investigation, and analysis to maintain marketplace safety and customer loyalty.
