OPW - July 23 - On 10th July we ran an IDEA Webinar with leaders of some tech-driven safety services. Here's the short summary video.
Online dating companies are in a unique position to adapt to social nuances, cultural changes and even global pandemics, through the effective application of advanced technologies. The overarching goal is to provide a safer and more fulfilling service to users, while combating fraud and romance scams. In the latest IDEA Webinar, Mark Brooks discusses the latest advances in technology-driven safety measures with Geoff Cook, CEO of The Meet Group, Anthony Oyogoa, CEO of the URSafe app, Jeff Tinsley, CEO of RealMe, and Justin Davis, CEO of Spectrum Labs.
Here's the full video.
Email me at [email protected] if you'd like to connect with anyone on this webinar.
Mark Brooks: Hello, and welcome to another IDEA webinar. We've got quite a wonderful panel, once again. We're going to cover safer dating. We've got Anthony Oyogoa, who is the CEO of the URSafe app, Jeff Tinsley, who is the CEO of RealMe, Justin Davis, who's the CEO of Spectrum Labs, and Geoff Cook, CEO of The Meet Group. Geoff will be joining us very shortly. I'm Mark Brooks. I think you know me. I run IDEA, the Internet Dating Excellence Association, along with the little consultancy Courtland Brooks.
So, without further ado, I'd like to ask my panelists about their companies. Specifically, what does your company do in a nutshell to help internet dating and online social community app users improve their safety when meeting online and in real life? And why don't we start with Jeff Tinsley?
Jeff Tinsley: Hello. Hi Mark. Thanks for having me. I'm excited to talk to your audience about safety. It's a big issue. I'll give a quick introduction about us. RealMe is a unique and free reputation platform that helps make online dating and marketplaces safer. You know, we're trying to reduce fraud, scams and worse, and at the same time, we're trying to improve trust in the marketplace.
So, think of any kind of site that connects strangers, trying to help them improve trust to drive increased conversion and drive greater user adoption from people that just aren't using those platforms for that reason. So excited to talk about safety online. We know it's a big growing issue.
Mark Brooks: Thank you, Jeff. Anthony?
Anthony Oyogoa: Thank you again for having us come on here and speak to your audience about our app. We are the first global hands-free, voice-activated personal safety app. Our app is really designed to allow people, when they do meet strangers or when they're interacting in a place where they might feel uncomfortable, to have an app that they can voice activate. It can alert their friends and family beforehand:
I'm going on this journey. I'm going to meet this person here. And then lastly, we have certain things like a fake call. We have a check-in feature. It's all automated around what the usual date-night process might be for a person: hey, call me in 10 minutes, make sure I'm okay. If I'm uncomfortable, I can activate the app and set up what would be an incoming fake call. So as not to aggravate the other person, I can just let them know, hey, sorry, I have a call from my kid's school. The caller ID says the kid's school. You can remove yourself from a situation. It's one of the things we love about this space.
For dating, obviously, when you're meeting a new person for the first time, everybody's a little apprehensive. Everybody has butterflies. But just knowing that you have an app, or you have something like this for protection, we feel is going to enhance that actual in-person dating experience, because the person knows that if something does go wrong, or I do feel uncomfortable, I can exit. I can leave the situation. Again, thank you for having us.
Mark Brooks: Thank you, Anthony. Justin?
Justin Davis: Appreciate you having us all on the webinar today. I'm Justin Davis, CEO and cofounder of Spectrum Labs. Spectrum Labs works with basically social networks, dating apps, gaming companies, and marketplaces to help them detect some of the nastiest stuff that happens on all these various properties.
We can detect over 40 behaviors, and we have a patent on the way we do this across many given languages on the fly. We're focused primarily on text and voice: usernames, chat, messages, forums, post comments, memes, and emojis, that sort of thing. Where we really shine is leveraging contextual AI to do this. It understands, you know, metadata around the user, the conversation, and the entire conversational analysis around the history of that conversation. This is important for detecting complex behaviors like underage user detection on dating apps, sexual harassment, cyberbullying, that sort of thing.
Mark Brooks: Great. So, Jeff, Anthony, Justin, you all have services that are available to internet dating and online social communities. Geoff Cook, you're bringing us the perspective of The Meet Group. So, could you tell us what The Meet Group's position is on how you improve safety across the various Meet Group properties? In a nutshell, what's the overarching philosophy?
Geoff Cook: Yeah. So, a few things, what is The Meet Group? You know, we're five apps, MeetMe, Skout, Tagged, LOVOO, and Growlr. We have 15 plus million monthly users. What we've been doing that's fairly unique within dating is really bringing live streaming into dating since 2016. And we've seen kind of a dramatic acceleration in that with the virus. Our users are playing about 185,000 dating games a day. That's almost double since just March. We really aim to make everyone the star of their own dating show.
As it relates to safety, what we're seeing is that video is a great filter for determining who to meet in real life, both from the perspective of feeling more comfortable with someone's personality and appearance. Obviously, authenticity is a problem endemic to dating apps. Is the person you're talking to who they represent themselves to be? But then also, with new levels of concern in this age of COVID, we want to know if they are coughing. Is this someone I'm going to risk an exposure on? So, we think video has a place in dating for the future, and the role it will play is to pre-screen and pre-qualify a potential in-person meeting.
Mark Brooks: Gotcha. Thank you, Geoff. So, internet dating is quite a special environment, of course. And what do you see as being the greatest issues for this context? What are the vulnerabilities around safety and security in online dating? Could we get your perspective, Jeff Tinsley?
Jeff Tinsley: Well, I guess, you know, because we don't own a dating service ourselves, but of course work with many clients that do, we do have some perspective. One of the most important things I can probably share is some details from a study that we did with FTI. It's a new study that's coming out later this month.
Some of the key issues people are dealing with are, once again, that they just don't trust the online services and marketplaces very much. There's a low level of trust, but there are positives too. So, you know, even though we're seeing that 30 to 40% of profiles are fake, what we're also finding out from customers is that there are easy ways to deal with the trust issues. They are in line with what Geoff Cook said.
I mean, a key part of what we do as a service is help make sure that people are real, that they are who they say they are. First, a vast majority of users in a dating context, nine out of 10 users, have an interest in knowing that they're engaging with somebody who's actually been verified. They want access to more information about that individual, as well as access to some of their historical behaviors.
So, trust is a big issue. There are too many people who've been hurt, but there are simple ways to deal with those issues. We have some other data from the FTI study around how coronavirus has affected companies. We really love hearing from Geoff Cook that they're seeing huge increases in video usage even during this tough time. The study that we ran says the same: 72% of users who use dating services are spending more time using them. And if the social distancing continues, they're going to continue to do that.
Anyway, the key issues that we're seeing are really around trust in these marketplaces. A lot of people aren't using dating services at all, or marketplaces in general, simply because they are afraid. So, there's also a massive opportunity for companies to grow their market share by doing some very simple things that I, and some of the others on this call, can provide, not just to improve conversion and engagement from existing markets.
Mark Brooks: Thank you, Jeff. Anthony, what is your perspective on vulnerabilities in the context of internet dating? What do we have to work with from your perspective?
Anthony Oyogoa: From our perspective, we are a personal safety app. We get involved in the last step, right? So, you've met the person online. You've maybe done a video chat, and you've exchanged some comments. All our partners here have been part of that exchange. Geoff's company brings you together on the platform. Justin's company can help monitor some of those conversations, and Jeff's company can help you verify what the person's history has been like online, to validate that they are a real person and that their history matches what they've posted. And then we're the last leg, right?
So, when I decide that I want to meet this person in real life, whether I want to go on a date, or it's a marketplace and I want to meet them to buy something, we want to be part of that experience. We allow you to set up three, four, five, or six people on the app, your loved ones, who can monitor where you are.
They can be contacted using voice activation in case you do feel unsafe. And our app is global, just like Geoff's dating company. We have users activating the app in Dubai, in the Philippines, and all around the world. We are making sure that we can provide that level of safety to people everywhere, especially for that last leg of the dating or marketplace experience, when you meet in person.
And, I think, similar to what Jeff said, once you're able to let a person know, hey, I have a personal safety app, or it comes with the app, we think that it may deter a lot of people that might be bad actors. If they did have some intent to do something bad, suddenly it gives them an additional reason to pause and say, wait, maybe I'll look for somebody that might be an easier target somewhere else. So, we're excited about being able to bring our app and leverage it into the space.
Mark Brooks: Right. Thank you, Anthony. Justin?
Justin Davis: I think Anthony mentioned something that's important. You know, the three companies here outside The Meet Group, RealMe, URSafe, and Spectrum Labs, have a strong partnership. It covers all three levels. For Tinsley and RealMe, it's around how you verify that the user is who they say they are during the signup process. Then we come in after that, when they're on the platform and start creating usernames, creating content, and sharing ideas and conversations with other users. That's where we really identify, you know, is there something nefarious in that conversation?
This isn't about chat filtering or filtering out content. No dating app or social network really wants to go down the path of infringing upon free speech and restricting words on that level, unless it's a child-focused platform, which has a whole different set of context and nuance to it.
But in the dating world, you really don't want to come across as big brother-ish or preventing free speech in any way. So, if you're monitoring chat or preventing words from going from one user to another, that's not really the best approach for identifying complex behaviors related to prostitution, SESTA, FOSTA, or detecting underage users that might be on the platform.
Determining that underage users are talking to, you know, grown men and women has risk for the brand, and risk for the individuals talking to that person. Then there are spammers and fraudsters who come on the platform looking to erode the user experience and cause havoc, maybe for financial gain, maybe for no other purpose at all. These things go beyond just looking at a single word.
So again, you have RealMe checking up front, and you have us in process when the dating app is live and users are interacting with one another. Then once they make a connection and meet, that's when having something like URSafe deployed with the user base is strong, because that way you've got all three levels of the system around how a platform like The Meet Group, and what Geoff Cook's team is doing, brings people together and allows them to communicate.
You've got all three levels basically covered with a strong partnership of technology vendors like us. I mean, we're seeing right now, especially in the age of COVID, a rise in new types of behaviors. I wouldn't say new; they've been around for decades, but they're on the rise in terms of prevalence and severity. This is related to racism and anti-Chinese and anti-Asian rhetoric. This gets into a much larger conversation around the filters that some of the dating apps are looking to deploy, or not deploy, in relation to that.
Mark Brooks: Great. Thank you, Justin. So, you've all taken quite interesting and different approaches to improving safety. Let's have a macro perspective now. Geoff Cook, you've got the macro perspective, running a group of dating apps that are video focused. What do you see as the biggest vulnerabilities right now? What are the biggest issues around safety?
Geoff Cook: At the end of last year, we wrote a blog post that set forth our four areas of focus with respect to safety in 2020, and we've essentially been executing against those four areas ever since. So, one is authenticity. Authenticity is important to any dating app. There are issues of catfishing, where, you know, users aren't who they say they are. They're scamming our users, trying to defraud someone. How can you get more trust that the person in a profile photo is the person operating the profile?
Video, of course, live video in particular, is very good at assessing this person. Are they likely to be who they say they are, and what do they look like right now, as opposed to 10 years ago? We work with a vendor named FaceTec, and we've begun rolling it out. We'll be adding a verified badge relatively soon, where, if you complete a two- to three-second 3D scan, we'll be able to conclude that you're a live person. In the future we'll also be able to compare you to your profile photo and report some kind of confidence interval, to further provide authenticity.
You know, I think age estimation is another factor. That FaceTec scan is also very good at estimating an age. You could use age estimation from AWS Rekognition or Google's version. Age estimation clearly plays an important role in trying to flag users who could be too young. We obviously want to keep those users off the system, so we need to have really the best age estimation. Every dating app is focused on this, I'm sure, on having ever-improving age estimation capabilities. We believe a 3D face scan is about as close to a gold standard as you can get.
Device blocking is another factor. We just rolled out Apple's DeviceCheck across all iOS users on our MeetMe app, one of our biggest. That was very successful at stopping spammy profiles. Basically, the only way to defeat it is to have a valid iOS device, and you can't really simulate one. So, that has been very helpful.
I would throw bad actor precautions in there. We were excited to work with URSafe. You know, I think that's a tremendous app that we're excited to bring to all the users of our platform. There's always an intrinsic risk when you're meeting someone in person for the first time, and anything that can help, helps. We give tips, such as meeting in a public place, and we provide virus safety tips now through our Safer Dating Advisory Board of epidemiologists. I think there's still a risk, and a URSafe-type app will help mitigate that.
I would say the fourth area we've been focused on is textual screening. You know, Spectrum is a solution we've been deploying now, and we're excited about the opportunities here. We had several homegrown solutions for textual screening. It sounds like an easy thing. When you say those two words, textual screening, how hard could it be? But users, of course, find a way to say a bad word in every different character set available, including with any number of spaces or misspellings, and it becomes actually very difficult to block. You need an advanced AI machine learning capability to do that. That's where something like Spectrum Labs is powerful.
Those are, I would say, the four areas we've been concentrating on since the beginning of the year. A fifth area was new to the year, and that was, of course, virus safety. It's probably one of the most relevant areas right now, given it affects everyone. With respect to that, we did commission a Safer Dating Advisory Board, which joined us yesterday and is consulting on our product pipeline, making sure that any tips we provide to users are scientifically sound.
Mark Brooks: Why don't we follow along with that thread? Of course, we're in some very unusual and crazy times with the rise of the pandemic. So, Geoff, I'd like to ask you how the Coronavirus is changing how people are dating. How does this affect safety and security for meeting and dating during the pandemic? How's The Meet Group helping?
Geoff Cook: Yeah. So, it affects things in several ways. For one thing, people clearly differ in their precautions, and how they keep themselves safe. Some people, you know, wear a mask and some people don't. It can become friction where, you know, a person who's only doing virtual dates or is willing to have an outdoor picnic, is meeting someone who doesn't take it seriously.
And that creates its own level of social friction. Our view is, well, look, let's come up with scientifically sound and practical guidelines and put them out to our users. One of those is to pre-screen potential dates on video. We think that's an important one, you know, beyond assessing appearance and personality.
I think a shallow swiping app, where, you know, you're swiping right and you're connecting in person, is concerning. That level of shallow engagement is probably the most disrupted by this crisis, the reason being that, you know, that's the sort of activity that is about as risky as you can imagine within the dating universe. Having many frequent in-person interactions with new people you've never pre-qualified is about as risky as you can get. That's what we want to avoid. You know, beyond video's capacity to assess appearance and personality, it's very good at showing if the person is coughing or ill, right?
You could ask questions you might not otherwise ask, like, hey, I'm going to wear a mask. What are you going to do? Do you live in a multigenerational household? Do you take mass transit to work? Right? Whatever it is that a person is particularly keyed in on. I think that's helpful. I think, you know, just not going on a date if you're sick, or if you've had contact with someone who is, is important advice. Keep initial dates outside and in public. And if you're going inside, minimize your time spent inside. If you are going inside to buy a coffee or some food, go eat it outside.
When you're inside, wear a mask, and then keep boundaries and follow general common sense, such as handwashing and so forth. I think our approach is essentially harm minimization. People are meeting, and that's the reality on the ground. The question is, how can you give them some knowledge to keep them safe?
Mark Brooks: Right. Thank you, Geoff. So, Jeff Tinsley, I'd love to get your perspective on this as well, and your focus on trust and reputation. So, coronavirus changed the world and changed dating. How do you think coronavirus affects safety and security, from your perspective for meeting and dating during the pandemic? How can RealMe help with this problem?
Jeff Tinsley: How is it affecting safety? From my perspective, I'd probably regurgitate much of what Geoff Cook had to say. I mean, we're understanding that getting to know somebody before you start interacting with them is critical. People should know as much as is reasonable before they start making decisions about getting together, because, as Geoff Cook said, there is risk when you do finally get together. Coronavirus is one issue, but understanding a little bit more about whether someone has a history of doing bad things is critical as well.
So, my simple answer coincides with Geoff Cook's. I think it's critical to do that pre-screening, as necessary, and be wise about the choices you make about who you're getting together with. And hopefully people make wise decisions to limit the number of people they are meeting and choosing to get together with. So, I'm all for the pre-screening. That's where we sit in this entire landscape.
Mark Brooks: Gotcha. Great. Thank you, Jeff. Anthony, what's your perspective?
Anthony Oyogoa: Well, I think for us, COVID was different. We started seeing lockdowns around the world, and we started seeing daily active users come down around the world. What we also started seeing in the numbers, starting in Asia and then Europe, and then slowly coming to the US, were what we believe were instances of domestic violence. Our app can obviously be used for dating, but also offline for other things happening in the real world, and COVID has brought an increase in incident numbers. People are stressed. People are doing inexcusable things.
We decided, as a brand, to make sure that our app was free for everybody around the world, regardless of whether you're coming from a dating platform, coming from a marketplace, or just finding it in the app store. We had to pivot. We wanted to make sure that during this time, when lockdowns were happening, we were able to provide that.
And we added some languages. We have six languages now, and we're in 185 countries and 200 territories. We're very excited about still being able to provide the use of our app, even though the person might not be meeting in real life. But overall, just to echo what Geoff and Jeff Tinsley said, people are meeting now, regardless of whatever the lockdown rules or precautions might be. We're just letting other people know: hey, this is where I am. This is who I'm meeting with.
We're just trying to do our part. And I think once people start feeling a little more comfortable and their inhibitions come down a little, we're going to see that uptick in in-person dating again. But we wanted to make sure, as a company, that we were able to make that pivot, to say, hey, what can we do during this pandemic? Instead of just sitting on the sidelines, we made sure that people using our app in a domestic violence situation know that we're here for them.
Mark Brooks: Yeah. Thank you, Anthony. Justin, what's your perspective? How does Spectrum help with this scenario?
Justin Davis: I might take a slightly different approach to answering this, at a more tactical level, based on what we're observing from the moderators who are dealing with this issue and how they're adjusting to it. So, last night we had one of our monthly virtual wine tasting events, where 25 or 30 people from social, dating, gaming, and marketplaces, VP of Product and data science folks from the trust and safety world, join and discuss various issues around how they're dealing with these exact challenges.
And, you know, one of our other dating conglomerate customers, one of their brands, was on, talking about how they're reacting to the Black Lives Matter protests and related problematic terms. And this is related to the pandemic as well. Anytime you have new terms or new types of abuse, where people are being extremely racist or hateful or discriminatory because of new societal factors, there are new topical societal factors making things much more inflammatory.
To give you an exact use case of what happens on the moderator side: one of these brands was auto-banning users anytime a racist comment was made related to Black Lives Matter, Blue Lives Matter, or All Lives Matter, just to keep racist people off the platform. But what happened was that users quickly figured out that flagging someone got them banned. So they started to flag any user they wanted to, even if they just didn't agree with their political stance, not because the person was being racist or hateful. And what this caused on the moderation side for this platform was a lot of extra work.
On the moderation side, they had to go in and review cases where users were auto-banned for being reported by another user, for a reason that had nothing to do with the actual policies, which were put in place with good intentions related to, you know, anti-Asian rhetoric or racism or hate speech, that sort of thing. So, this just caused more workload. And the feedback we all settled on at the end of that jam session was that you have to be really careful about the ways you think about auto-moderating anything related to these types of issues.
And it's not, pun intended, black and white, or cut and dried, the way this stuff works. It's extremely challenging to figure out the right policy to deploy to make sure that you're not infringing upon free speech, and that you're creating a safe environment for users to interact and meet people.
But also, how do you keep awful racist, hateful content off the platform in a way that doesn't inundate or burden your moderation team with extra workload? That's been one of the things we've observed, and it's been consistent not just in the dating world; we see this across marketplaces and the gaming environment too. They're trying to clamp down on new terms and new ways of abuse related to societal and political factors happening right now, at least in the US.
The other thing that came out of that was that if you go too aggressive on this, it's a US-centric policy, especially as it relates to BLM. Some of the gaming companies have an international audience, with users coming from other geos, and they don't necessarily see it the same way politically that we do here in the US related to some of the protests.
And so, you have a challenge around potentially, you know, upsetting a lot of users who don't really care that you're trying to clamp down on a certain type of speech. They go, hey, you know, this has nothing to do with us, or with me. Why are you trying to infringe? This is a US-centric thing. I'm on this platform to do whatever I need to do.
So, it's a challenging prospect for a company, whether you have AI solutions in place or an army of human moderators looking at this stuff. It's a complex issue.
Mark Brooks: Thank you, Justin. So, I'll ask my final question. We'll start with you, Jeff Tinsley. You know, this is changing behavior. The pandemic is changing behavior in the moment. But my question to you is: do you think there'll be a lasting effect from the pandemic which you think will fundamentally affect the way people meet online and especially for internet dating? Do you think we're going to see some changes after the pandemic, with user behavior?
Jeff Tinsley: I'll start with the safety side, with one of the headlines that I buried in my introduction at the beginning. I was talking about, you know, the impact of the FTC study, and how fraud is the single biggest category that the FTC is tracking out there. The FBI thinks it's a $15 to $20 billion problem. The FTC reported a 40% increase in 2019 versus the previous year. The reason I bring that up is that, if you're paying attention to the news, you're already starting to see a lot more scamming going on.
People are getting more desperate. The coronavirus has challenged the financial system, for the entire economy and for individuals. And so, there's real risk that the scamming continues and gets worse. Fortunately, there are simple solutions. I'm talking about our kind of platform; knowing that our whole purpose in life is to reduce fraud and scams and build trust in these marketplaces is important. I believe it's going to get worse, because it's been growing at a dramatic clip, and I think that's going to continue.
Every dating service and marketplace is going to be forced to identify their users, start to track behavioral history and background, and make that available to their consumers to build that trust, so that people don't get hurt. I don't see it slowing down. I love what everybody's doing in the space right now, like showing videos and interacting with people in real time. But people are still hiding a lot, and it needs to be brought to the surface. I hope the scams and the fraud don't continue, but the numbers are not looking good in that regard right now, and I think it's going to get even tougher, unfortunately.
Mark Brooks: Thank you, Jeff. Anthony, what are your thoughts? How do you think it's going to change things over the longer haul?
Anthony Oyogoa: I think fundamentally, people are going to continue using the video services. I think Geoff would attest to that. The success of video has led people to understand that when I meet somebody online, I do want to do some sort of video conference or video date, just to get a feel for the person. But one of the things we feel is that marketplaces and dating apps, just like Mr. Tinsley said, must have a long-term plan to start integrating more safety features.
And I think what The Meet Group has been able to do on a global scale is put the marker down by saying, safety is important to us, and not just as lip service. This is what we're really doing; look at the partners we're bringing in. We're making a community. It's not just, like you said, a simple swipe right. You're creating a community: making sure that people getting in there are the right age, making sure that people are behaving correctly and having the right conversations, like Justin is doing, and, like Tinsley is doing, making sure that their background confirms they are who they say they are.
And when you are meeting in person, allowing the URSafe app to be that last part: hey, when you do meet in person, we're providing this, so that you can use it to get there safely and interact safely. The best thing for us is when a person is on a date and they do a Check In and say, I'm fine.
Right. That's a great thing, because that's a head nod. Yes, the person that they met on Geoff's site was great. The person they were interacting and talking to, using Spectrum, was the right person; the moderation and the conversations were good and positive. And, with Tinsley's RealMe, the person was who they really said they were.
So, they then use our app on the last leg of that journey, the first time they meet, and they've set up an automated Check In: Hey, in 10 minutes, check in on me. The app does that, and the person says, no, I'm fine. That's the 360; that's us bringing everything full circle. And that's what we hope, and where we think things are going to continue to go.
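(Editor's note: for the technically curious, the automated Check In flow Anthony describes can be sketched with a simple timer that escalates if the user never confirms they're fine. This is a hypothetical illustration, not UrSafe's actual implementation; the function and parameter names are invented.)

```python
import threading

def schedule_check_in(minutes, on_no_response, responded):
    # After `minutes`, if the user hasn't confirmed "I'm fine"
    # (signaled by setting the `responded` event), escalate.
    def fire():
        if not responded.is_set():
            on_no_response()
    timer = threading.Timer(minutes * 60, fire)
    timer.start()
    return timer
```

In a real app, the escalation callback would notify emergency contacts or authorities; here it is just a placeholder.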
Mark Brooks: Thank you, Anthony. Justin?
Justin Davis: I think about it in terms of how what's going on is going to shift behaviors, whether it's the users' or the companies' themselves, to make a stronger investment in this area. Historically, companies have basically deployed good-enough solutions in the area of safety. Whether it's the protests, the pandemic, or the political season that's coming up here in the US, I think what you're going to see is increased scrutiny on policy, especially related to things like the EARN IT Act, SESTA, and FOSTA. It is probably going to get another look. CDA 230 is going to get debated endlessly, probably indefinitely, as time goes on.
With that increased scrutiny on policy, and increased awareness on the user side of the rising prevalence and severity of different types of abuse and hate speech, we're all much angrier, more sensitive, and more aware of these things. The collision of both of those factors, I think, will motivate companies the way The Meet Group was motivated a couple of years ago as an early adopter. I think you'll see a lot of companies in the social space, marketplaces, dating, gaming, and all the above, thinking about two core areas.
One is privacy by design, and the other is safety by design. Privacy by design means thinking about how we protect user data and use the minimal amount of data possible in order to do the things we need to do to make money and keep users safe. There are some new techniques, differential privacy and federated learning, that your listeners could research if they're interested, or they could reach out to me to talk more about.
Those are two interesting concepts in the data science world that allow companies to take advantage of very minimal data. And then on the safety by design side, or, I guess in this world, safety after design: there'll be a whole new rise of social apps that propagate out of this new environment we're in, and those companies will probably be thinking about moderation and user safety from the get-go.
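(Editor's note: as a rough illustration of the differential privacy concept Justin mentions, the classic Laplace mechanism adds calibrated random noise to an aggregate statistic, so the published number stays useful while masking any single user's contribution. This is a textbook sketch, not any panelist's product.)

```python
import math
import random

def laplace_noise(scale):
    # Sample from a Laplace(0, scale) distribution via inverse CDF.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_count(true_count, epsilon=1.0, sensitivity=1.0):
    # Laplace mechanism: noise scale = sensitivity / epsilon.
    # Smaller epsilon means more noise and stronger privacy.
    return true_count + laplace_noise(sensitivity / epsilon)
```

A service could publish `private_count(n_reports)` instead of the raw number: accurate on average, but no individual's presence can be reliably inferred from it.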
I think some companies are still not getting it right. You look at what happened with Clubhouse over the last couple of weeks, and it just goes to show you that, even knowing everything that we know from this panel right here, there are still companies coming out today that don't put safety at the forefront of their capabilities. And I think that's going to be a very good differentiator for any business that's looking to build a new company, or to go back and think about how they retroactively address safety after design.
That's really going to separate the winners from the losers and attract the right audiences. Studies show that healthier communities drive engagement and retention. People are more willing to be themselves and express themselves. Their NPS scores go through the roof, because they're willing to recommend that service to a friend or a loved one. And that's what ultimately drives revenue for the brand. So, I think everything that's happening right now is going to put more eyeballs onto those two core issues, privacy by design and safety by design, and we're going to see some great apps come about because of it.
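(Editor's note: the NPS Justin cites is the Net Promoter Score, computed from "how likely are you to recommend us?" ratings on a 0-10 scale. A minimal version of the standard formula:)

```python
def net_promoter_score(ratings):
    # Promoters rate 9-10, detractors 0-6; passives (7-8) are ignored.
    # NPS = % promoters minus % detractors, on a -100..100 scale.
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)
```

For example, ratings of [10, 9, 8, 6, 10] yield an NPS of 40.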
Mark Brooks: Thank you, Justin.
Jeff Tinsley: If I may, just quickly adding to what Justin had to say: safety is simply a critical demand from the consumer. Our studies show that 70% of users will use a safer app, one that's doing all the things this panel is providing, more. As I mentioned before, there's still a huge market that isn't using dating services because they don't feel safe.
So, there's an opportunity to grow market share another 35% just by doing some of the basic things that the people on this panel can provide. It's critical to address those safety issues. It will help the dating services and marketplaces themselves drive more utilization, more conversion, and audiences they don't already have. It's just necessary at this point; the consumers are demanding it.
Mark Brooks: Absolutely. Thank you so much, Jeff. Geoff Cook, what's your perspective? For The Meet Group, safety is paramount, of course. How is the pandemic going to change that for the long term?
Geoff Cook: I think Justin's answer is interesting. Safety and privacy by design are important, but they're also frequently in tension. Do you moderate one-on-one chats, for example? I'd love to see a webinar topic on just that, because at The Meet Group, we frequently come out on the side of safety. We make difficult decisions to moderate content, but we do have that privacy discussion.
And so, we focus on privacy by design and safety by design. I think the tension and the difficulty there is that everyone is optimizing in a different way that suits their business needs and their users' needs. But, as it relates to the question, we think the future of dating, of course, is video. I like to think of the age of shallow swiping apps as being over. The pandemic has ushered in the end of that by basically having three years of innovation happen in three months.
We continue to innovate in video. We're looking at things within video, like multi-guest video, and also more with one-on-one video. Where I think video will go: one-on-one will be important, but you're also going to see more gamification. We'd like to make everyone the star of their own dating game. One of the things we're experimenting with right now is stakes. When you think about a dating game, what you'll often think about is that there's a winner. There are often losers, but there's a winner, and there's a date, and it's often an all-expenses-paid date.
And so, as we start thinking about what the other side of the pandemic looks like, and we think about video as a kind of filter for in-person interactions, we're going to be doing something with stakes: a live streaming dating game that supplants the linear-programming television dating games.
In terms of where we're focused for the future, I would say we think of three pillars for the business. One is always building the best dating features in the world for our engaged community. People come to our apps to meet and date, first and foremost. Video is important, but that's not why they're coming; they're coming to meet and date.
Two is to invest in live streaming content and innovation. Three is to look for other methods of growing our usage, through acquisition or through sensible partnerships. We recently powered a large dating app with our video solutions in that vein. So that's how we're really thinking about things going forward.
Mark Brooks: Great. Thank you so much, Geoff. This has been a wonderful panel, I must say. I think the role of IDEA is to help inspire the dating industry to really deliver on our true promise. And I think the absolute value we deliver is making sure that the people who show up on a date are the people who are supposed to be showing up. There are fewer surprises. We improve significantly over the real-world scenario of meeting someone just by happenstance. I think that's what we do extremely well. We've got some best-in-class vendors, and we've got The Meet Group, who are very proactive on this front as well.
I hope we can inspire you further. Let us know any topics that you'd like to see addressed; I made a note about the chat, and that's a good one. Thank you very much. We've got another IDEA webinar coming up on the 23rd of July, which dovetails very nicely with this one.
I'll be running a post on it on OPW very shortly. I think that concludes our call for now. I really appreciate your time. This has been very informative. Thank you so much.