AI-ssisted Stereotyping? The Launch of Peek’s Compatibility Scores

Just as privacy concerns were being raised over the new AI-powered tabloid entertainment app “BuzzOff,” another app, “Peek,” has launched, dealing privacy advocates a double whammy.

Like “BuzzOff,” “Peek” uses AI to learn about a user from their social and personal history, but instead of delivering surveys and games, “Peek” lets users see whether they are likely to get along with people at a certain company or in a certain city or neighborhood. The app is opt-in for users who download it, but it also draws on troves of public data about people who never signed up.

“Peek” uses an algorithm that analyzes social media interactions and behavior patterns to determine compatibility with people in a given location or company. The app then generates a “compatibility score” that ranges from 0 to 100.
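Peek has not published how its scoring actually works. As a rough illustration only, a score in this style could be computed by comparing a user’s behavioral feature vector against an aggregated profile for a company, city, or neighborhood and rescaling the similarity to 0–100. The sketch below assumes cosine similarity over numeric features; the feature names, function, and scaling are hypothetical, not Peek’s disclosed method.

```python
import numpy as np


def compatibility_score(user_features: np.ndarray, group_profile: np.ndarray) -> float:
    """Hypothetical 0-100 compatibility score.

    Assumes the score is the cosine similarity between a user's behavioral
    feature vector and an aggregated profile for a location or company,
    rescaled from [-1, 1] to the app's advertised 0-100 range.
    """
    denom = np.linalg.norm(user_features) * np.linalg.norm(group_profile)
    if denom == 0:
        return 0.0
    cosine = float(np.dot(user_features, group_profile) / denom)
    # Map cosine similarity from [-1, 1] onto 0-100.
    return round((cosine + 1) / 2 * 100, 1)


# Toy example: made-up weights for interaction topics or activity patterns.
user = np.array([0.8, 0.1, 0.6, 0.3])
company = np.array([0.7, 0.2, 0.5, 0.4])
print(compatibility_score(user, company))  # prints a high score (~99) for these similar vectors
```

Even under generous assumptions like these, the score only reflects how closely two sets of recorded behaviors overlap, which is a narrow proxy for whether people will actually get along.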

“I was curious to see if I would be a good fit for the company I was interviewing with, so I used Peek,” said user Makaila Nichols. “The app gave me a compatibility score of 85. I guess time will tell if I get the job.”

The creators of “Peek” insist that the app is designed with user privacy in mind and that all data is used anonymously and securely. The company’s CEO, Jack Brown, stated, “We understand the importance of protecting user data and have implemented strict security measures to ensure that all personal information is kept confidential.”

The launch of “Peek” comes at a time when concerns over privacy and data usage are at an all-time high. As more and more apps use AI to collect and analyze personal data, it’s crucial for both consumers and companies to be aware of the potential risks and work towards protecting personal privacy.

One of the potential risks of “Peek” is its use by employers to pre-screen job candidates. Even though the app is not marketed for that purpose, experts warn that employers may be tempted to use its compatibility scores to make hiring decisions. “The use of ‘Peek’ by employers raises serious ethical concerns,” said job market analyst Kara Stone. “It could lead to discrimination based on superficial data and limit job opportunities for qualified candidates.”

Behavioral researcher Dr. Samantha Smith also weighed in on the matter: “While the idea of using AI to determine compatibility is intriguing, it’s important to remember that true compatibility goes beyond surface-level data. Human interactions and relationships are complex, and basing them solely on an algorithm could lead to missed opportunities for growth and understanding.”

These experts point out that genuine compatibility rests less on superficial data than on the growth and understanding that come from diverse interactions. By that measure, Peek’s approach may be too limiting, discouraging users from seeking out people unlike themselves and causing them to miss valuable experiences.

“If a stereotype is statistically overwhelmingly true, then it’s a fact, is it not? I’m not sorry that I don’t want to live in a neighborhood where I won’t be welcome,” said one anonymous user who downloaded the Peek app.

This quote highlights the appeal of Peek to some users as a tool for avoiding uncomfortable or potentially hostile situations. But it also underscores the danger of basing important decisions on stereotypes and superficial data, and the risk of perpetuating discrimination and prejudice.