
[NYT] Your Tinder Match Will Soon Be Able to Run a Background Check on You

Discussion in 'BBS Hangout: Debate & Discussion' started by Os Trigonum, Mar 31, 2021.

  1. Os Trigonum

    "Your Tinder Match Will Soon Be Able to Run a Background Check on You":

    https://www.nytimes.com/2021/03/31/opinion/tinder-match-background-check.html

    Your Tinder Match Will Soon Be Able to Run a Background Check on You
    The app is trying to make it easier to obtain data on potential partners. That could create more problems than it solves.
    By Karen Levy
    Dr. Levy is an assistant professor in the department of information science at Cornell University. Her work focuses on the legal, social and ethical aspects of data-intensive technologies.
    March 31, 2021, 5:00 a.m. ET

    What does it mean to gather “verified” data on potential romantic partners? There’s something to be said for the idea that intimacy is based on having discretion to share information with others — on deciding how much of yourself to reveal to someone, and when, and how — as trust builds in a relationship.

    Match Group — which owns dating and hookup platforms including Tinder, OKCupid and Match.com — is trying to make it easier to obtain data on potential partners. The company announced this month that it will help users run background checks on potential dates. Tinder users will be the first to receive the feature, which will allow them (for a fee not yet determined) to obtain public records on a match, based only on first and last name, or a first name and phone number.

    That data, provided by a nonprofit company called Garbo, will include “arrests, convictions, restraining orders, harassment, and other violent crimes” in order to “empower users with information” to protect themselves. Garbo’s website also indicates that it accepts evidence submitted directly by users, “including police reports, orders of protection and more,” though it’s not clear whether this capability would be integrated into its partnership with Match.

    It’s easy to understand why Match Group is making this move. Potential partners sometimes deceive each other, in ways both trivial and significant. Gender-based violence is a serious and prevalent problem, experienced by one in four women and one in nine men at some point. Intimate platforms have come under fire for their lack of action when users report being assaulted by someone they met through the app. Many people already take steps to check up on each other before meeting in person — doing searches of each other’s names on Google, perusing each other’s social media profiles, even in some cases running formal background checks of their own.

    It’s laudable that Match Group wants to prevent its platforms from propagating sexual violence, and it’s attractive to try to fix the problem with technology. But we should be clear about the trade-offs. Technological measures that make us seem more secure may not always be as effective as they seem — and they can introduce a host of concerns around privacy, equity and the process of trust-building required for true intimacy to develop. If we normalize the practice of building a dossier of external data points on a person to avoid the risk of deception, we might upend an important aspect of creating close connections.

    The risks associated with meeting potential partners stem in part from the way we tend to pair up today. Before the emergence of intimate platforms, more people met through common connections. In those cases, you had some sense of knowledge about the person — he’s a friend of a friend, I know where she works — which allowed for inferences about the person and a degree of comfort about interacting.

    Intimate platforms have changed the game: We increasingly meet online. And we may believe a digital record to be a full, “true” representation of someone. But these kinds of records are known to be far from perfect, especially when they rely on names for matching: records are often misattributed to people with the same or a similar name. They commonly include criminal convictions that were later expunged or charges that were ultimately dropped. It can be difficult for people with inaccurate records to become aware of them, and it’s sometimes impossible to have errors or inconsistencies removed.

    Moreover, a truly motivated bad actor can often circumvent policies like these by using a different name or phone number. So even to the extent that background checks appear to provide security, they can function more like a security blanket — they might give us the feeling of safety without actually ensuring it.

    There’s also substantial social value in letting people shed stigmatizing or embarrassing information from these records. That is the rationale behind “ban the box” policies, which prevent employers from asking about criminal history on job applications in order to give applicants a fair chance at being hired. Letting people with stains on their records reintegrate into social life — including intimate relationships — has important social benefits.

    Also, because data collection is often racially disproportionate — particularly in the context of involvement with the justice system — we should be mindful of who is most likely to be affected by policies like these. Match and Garbo have shown some foresight here: In recognition of the discrimination faced by Black Americans in the criminal justice system, they exclude drug possession offenses and traffic offenses (aside from D.U.I.s and vehicular manslaughter) from their background checks.

    But even with these exclusions, over-policing of people of color, and racial bias present in all stages of the criminal justice system, should give us significant pause when drawing on criminal justice data. We should be especially careful about integrating these records into intimate platforms, which can be sites of racial exclusion and race-based harassment.

    It’s not hard to imagine how background checks might open the door to other kinds of data. Do we want to start vetting our partners in the same way we decide what kind of car to buy, or whom to hire, or who is likely to repay a loan? Should I know whether someone has filed for bankruptcy or been married before or owns property? Should I be able to sort partners by their credit score? Introducing this level of data use into the intimate sphere seems at odds with how we typically learn about one another — gradually, and with the benefit of context.

    Match Group is trying to address a real, urgent problem — but we need to be very thoughtful about what tools are appropriate to combat sexual assault and what impacts they might have on user privacy and on how we develop relationships. Using data as a weapon against sexual violence can introduce more problems than it solves.

    Karen Levy (@karen_ec_levy) is an assistant professor in the department of information science at Cornell University.

     
  2. jiggyfly

    If you are ok putting up a profile on these sites, then having background info shared should not be an issue.

    If you want privacy don't sign up for these sites.

    Eazy Peazy.
     
    jcf likes this.
  3. ThatBoyNick

    Where is Os going to find his hot young tail now?
     
    jiggyfly and fchowd0311 like this.
