John Matze, CEO of social media company Parler, committed not to ban users for "hate speech," stated that his company would fix an "awkward" "fighting words" clause in its community guidelines, and called the decision by Big Tech companies to censor the America's Frontline Doctors video "ridiculous," in an exclusive interview with CNSNews.com.
"We refuse to ban people on something so arbitrary that it can’t be defined," Matze said when asked whether Parler has banned or ever will ban users for "hate speech." "You see these sites trying to enforce these arbitrary rules and you notice that people are getting kicked off for the most random and arbitrary things like misgendering people. It's absurd. So no, we won't be pursuing that policy."
The Parler CEO also commented on the subjective nature of the "Fighting Words or Threats to Harm" portion of the company's community guidelines, which, as of press time, gives as an example "any direct and very personal insult with the intention of stirring and upsetting the recipient—i.e., words that would lead to violence if you were to speak in that fashion in person."
"We just hired a chief policy officer who's a real lawyer," Matze said. "She's actually overhauling that specific clause that you brought up because she said it's a really awkward clause to have online....Our goal here is to maximize free speech, maximize online discussion, while maintaining an actual community feel."
Finally, the head of the Twitter alternative addressed the censorship by Big Tech companies of a video by America's Frontline Doctors in which one doctor posited hydroxychloroquine as a cure for the coronavirus.
"We allow [the video] freely," Matze said. "This person's a doctor, they're making a statement, they're liable for the statement. They could get sued for malpractice, they could lose their job, but they want to say it anyway. That's their right."
"When you see these social media platforms cracking down, it just makes these people feel more disenfranchised. They feel like they have no freedoms, they can't talk about this. They're not even in control of their own health. And that's wrong."
The Parler CEO discussed with CNSNews a variety of other topics, including the platform's content moderation system, its recent growth from 1 million to 3.3 million users, its plan to implement a "groups" feature, the dropping of an indemnification clause in its user agreement, and the company's plans to combat other kinds of tech censorship.
Below is a transcript of the interview:
Rob Shimshock: Hello there, I’m Rob Shimshock, commentary editor for CNSNews.com, and today I’m joined by John Matze, CEO of the up-and-coming social media company, Parler. Thanks so much for coming on, John.
John Matze: Thank you.
Shimshock: Now, your company Parler has positioned itself as an alternative to Twitter by striving to embrace the culture of free speech that Twitter has left by the wayside, if not actively smothered. Is that a fair characterization?
Matze: Yeah, that’s accurate. Basically, a lot of people have come over because there seems to be a lot of ambiguity with their terms of service, to say it lightly. And so, what we’ve done is we’ve created a platform where people are not judged by us. They are judged by a jury of their peers and our rules are transparent. They are involved -- you know, they basically are free speech-oriented. Anything that you can say on the street in New York, you can say on Parler and the goal is to create conversation, not to dismantle conversation, to allow debate, conversation in general. And we’re seeing that people love that concept. It’s kind of old-fashioned, but it seems to be very popular.
Shimshock: Great, well I have a couple of questions about the actual terms of service and policies. But first, I’d like to know, we’ve seen Parler’s user base explode recently, with site users soaring from one million to 1.5 million. The platform does seem to have attracted more right-wing than left-wing folks. And I saw that Parler is offering $20,000 to a high-profile liberal pundit who joins the platform. But speaking more broadly, how will Parler ensure it becomes a true Twitter alternative, that is, a facilitator of debate from perspectives across the political spectrum instead of a conservative echo chamber?
Matze: Well, you’ve hit a few points. So, the numbers are looking really good. We’ve actually passed 3.3 million total users now, at this point. And so in less than a month, we’ve added 2.3 million people. Fun fact: in the last 24 hours, 50 percent of them have been from Brazil, actually. A lot of people in Brazil are being censored by their Supreme Court there, who’s actually ordering journalists to be taken offline by Big Tech companies in the United States and they’re complying. So it is crazy. And so to your other points that you had made: you had mentioned that we had offered a bounty for liberal journalists to come on. We did. We didn’t have any takers. And it wasn’t just liberals. We were specifically asking for progressives, so very self-described progressives. They didn’t really take us up on the offer. We’ve kind of dropped it lately because we didn’t have anybody coming in. We would have really liked it, though, had they gone for it. But what we have seen is a lot of people on the left, a small portion, right, about 10 percent of our audience is left-leaning, but they are coming in and you’re seeing some debate and they’re upset because the left-leaning individuals who are coming in are not….You know they’re a little bit uncouth sometimes and they like to be, they like to joke still. And they’re actually being taken down off of Twitter as well, because they’re joking around or saying things that are not politically correct and that seems to make Twitter angry, and Facebook, and these other companies. And to your point, how do you make it closer to being more Twitter-esque: we don’t want to be Twitter-esque, right? We want to beat Twitter because they haven’t innovated. They haven’t monetized. Jack Dorsey just recently announced that they’re going to be trying to go for a subscription-based model because they can’t seem to be making enough money off their ads. 
So we have an opportunity to not just build compatible features, but really take on the space of social media as a whole because, you know, people want to be able to reach out; they don’t want censorship, but they also want neat tools that Twitter has never been able to provide like groups, like having, you know, basically having cordial conversations you can moderate on your own instead of just what I would call a social dumpster fire. So, you know, really, people need to have a better set of tools to moderate their own experience and not leave it to the platform. So there’s a lot of things that we can do. I hope that answered all your questions.
Shimshock: Yeah, now one thing I’ve seen recently that’s caught my attention are the boycotts of Facebook by major companies that take issue with supposed “hate speech” pervading the platform. And it’s unclear how damaging this has been so far, but does Parler foresee its commitment to free speech conflicting with its attempts to fundraise? And if so, how do you plan to overcome that?
Matze: So, yeah, a few things. One is there’s the Anti-Defamation League study that came out that said Twitter and Facebook are the two most hateful places on the Internet and Twitter, by a long shot, is not even the number two social media platform online, which is shocking that they were rated so poorly. And so to counter that, that same list listed competitors of Parler that were far fewer in number and we were actually far better ranked. We actually weren’t ranked at all as being a hateful place. And a lot of that comes down to spam and not having duplicate accounts. We enforce very strictly that you can have one account and that’s it. That’s your one account. And as a result, you don’t see people coming in with 20 accounts, just attacking people like you do on Twitter. I don’t know if you’ve ever been on Congressman Nunes’ page, but if you’ve ever been on his page, it’s just nasty, nasty stuff. The same with President Trump, too, it’s just nasty comments. You don’t want to be in a place like that; nobody does. And Facebook has this boycott going on right now. Now the boycott -- the corporate boycott -- amounts to something like $50 million a quarter in ad revenue, which to you and me may sound like a lot, but it’s actually not. Proportionally, it’s an extremely minor percent of their income. The boycott is not substantial at all. And part of me thinks that, you know, we don’t know if this boycott is really just a virtue-signaling technique because these companies are having to cut ad revenue, like a lot of companies are doing right now, because of the pandemic and how it’s affected their economics. They could be boycotting it because it’s nice virtue-signaling and free advertising for them because they can’t actually afford the ad slots. And no one’s really talking about that point either. So there’s a few different possibilities.
For us, we’re actually doing really well on the monetization front because we allow political ads during an election year, which Twitter doesn’t allow, which is why you’re seeing, you know, Parler’s actually becoming profitable, even in its infancy, which is unheard of for social media. Whereas these other sites who are, you know, not allowing political ads in an election year are suffering. So we’re making the right decisions, and we’re doing the right thing for the community and we believe in the American people and we believe in people’s rights to discuss things on their own, and it seems to be paying off really well.
Shimshock: Gotcha. Now turning to the topic of censorship, I have a rather simple question for you. Has Parler ever banned, and will it ever ban, anyone for “hate speech?”
Matze: There is no definition of “hate speech” legally; there never has been. They’ve attempted to define it and never will. And therefore we cannot. We refuse to ban people on something so arbitrary that it can’t be defined. Now, and the reason that I say that is nobody wants hateful content, right? Nobody wants nasty things said at them, but everyone’s definition of hate is different. You and I having a simple disagreement could be me viewing this disagreement as a debate or as hateful, whereas you may view it as normal. You may state a fact, saying, “hey, I view this to be true.” And someone may say that's hateful. So how do you define the undefinable? You can't. The government has tried; they couldn't. The only countries that have tried have had very arbitrary rules that are rather weak and hard to enforce. And so you see these sites trying to enforce these arbitrary rules and you notice that people are getting kicked off for the most random and arbitrary things like misgendering people. It's absurd. So no, we won't be pursuing that policy.
Shimshock: So I went through Parler’s community guidelines and I did note one section called “fighting words or threats to harm,” defining the concept of fighting words as “use of incitements to violence that produce a clear and present danger or a personal assault with the intention of inviting the other party to fisticuffs.” But then, as an example, Parler gives “any direct and very personal insult with the intention of stirring and upsetting the recipient, i.e., words that would lead to violence if you were to speak in that fashion in person.” Now, of course, there is some subjective language in here -- “insult,” for instance. Then, how do you determine someone's intention? And of course, people have different tolerances, as you mentioned, for language they perceive as hostile. So John, how does Parler hope to maintain a fair and balanced enforcement of these guidelines?
Matze: So to the point that you just brought up: this is excellent, thank you. So, one is we just hired a chief policy officer who's a real lawyer and not me writing the community guidelines, and she's actually overhauling that specific clause that you brought up because she said it's a really awkward clause to have online. Second, we're currently enforcing the clause through a community jury system. That means we have a quorum of five community jurors, juries of your peers, not Parler as a company, and they judge it independently of each other. They don't know what the others have said. They independently judge the situation and then they make a determination. Now, we've said that our community guidelines are a bit of a work in progress, because we're trying to make it fair for everybody. Our goal here is to maximize free speech, maximize online discussion, while maintaining an actual community feel. So, the goal is to allow people to say what they'd like, but also we don't want people breaking the law. We don't want people to get attacked, right? We don't want people threatening violence. There was a whole group of people photoshopping me getting shot through the head. So stuff like that, obviously, is not allowed.
You know, there's lines that we're trying to draw, but we also want people to have conversation. And naturally, as you know, online arguments typically get people angry and using what would be described as “fighting words” on the street, but not online. And so we're trying to clarify that to make sure that people don't end up in some kind of cyber jail over, you know, an online debate or dispute that got heated, if that makes sense.
Shimshock: All right. Yeah. Would one other possible avenue perhaps be ideological diversity in hiring? And so like we've seen, I think, with a couple of these companies, a lack of that seems to be behind some of the Big Tech censorship. I doubt, for instance, Twitter has even one content moderator that voted for Trump.
Matze: We don't have mods. Like I said, we've got a community jury. We don't— they're not employees. They're not hired. They're volunteers and they are members of the community. We picked them because they were able to pass a community guidelines test of previous rulings, kind of like a historical Supreme Court ruling test, but of Parler violations. And they were able to do really well. I didn't even get a hundred percent on it and I wrote the rules originally. So it's a very, very comprehensive test. We weeded out anybody who was ideologically far-right or far-left; we pick moderates. And we constantly monitor and moderate the jurors to make sure they stay in line with the pool -- like for example, if most jurors say 80 percent of violations are not violations, right? And so if we notice somebody says 95 percent are, or we notice somebody says, you know, 60 percent are, or we notice that they don't line up with the other jurors, then we kick them from the pool, because we want to make sure that people are actually doing a good job and being legitimate with their moderation.
Shimshock: Great, now one other language specific question and this one about Parler’s user agreement. I noticed last month that the company had a provision, number 14, that tasked users with defending and indemnifying Parler, including paying for legal expenses pertaining to their use of the platform. Now that clause along with one preventing users from taking part in a class action lawsuit against Parler appeared to have been removed from the agreement. Can you tell us why that is?
Matze: Yeah, like I said, we hired a new chief policy officer, who is awesome, by the way, and our goal with hiring her is to take things that were basically templates, because the original community guidelines was a template that we got from our lawyers. They had put this together. They said it's very standard for social media. The indemnification clause really doesn't look very nice. It wasn't that bad of a clause but we said “you know what, why don't we do something else? Let's take a look at Twitter's rules. Let's take a look at Facebook's and let's make sure that whatever we have for our rules, we are less strict and we give the user more rights than they do.” And so we've actually updated those rules to do that. We've also tried to clarify a lot of the legalese to be more legible, because these things are nearly illegible, if you've ever looked at this stuff. It's a mess. And I'm kind of used to this kind of documentation, and even I'm bored to tears looking at it. So we tried to upgrade it, so it's a little bit more legible too. And so you'll find if you look at our community guidelines and terms of service, if you look at Twitter, if you look at Facebook, if you look at any of the tech tyrants, our rules are more in the favor of users than theirs. Actually, it should be all of them. If they're not, bring it to our attention; we’ll make sure that ours is more in a user's favor than those sites'.
Shimshock: Great, now recently Twitter took down and even penalized the president’s son for sharing a video pertaining to hydroxychloroquine. And this was a video in which a doctor posited that drug as a cure for the coronavirus. Now how has Parler handled that video on its platform?
Matze: We allow it freely. We have, we have a lot of people discussing this topic, including my father, who I adamantly disagree with, who probably would have gotten kicked off of Twitter for his views on -- but, we allow people to talk about it, right? And so I was actually on CNBC’s morning show having a debate with them about whether or not we should do that. And I just adamantly said, look it, this person's a doctor, they're making a statement, they're liable for the statement. They could get sued for malpractice, they could lose their job, but they want to say it anyway. That's their right. That's their risk. They're taking it. If they give bad advice, they're going to get sued. Furthermore, if they give bad advice, and they prescribe something to somebody, that's even more of an issue, but this drug is not available over the counter, they have to get prescribed this drug. So a fair debate with the general public -- even if somebody were to “get misled,” like a lot of these social platforms are contesting -- “social publications,” I like to call them -- even if they got misled, they still have to go talk to a doctor and get recommended to take the drug and actually get approval to do it. So this is a ridiculous concept that we're censoring this topic. It's completely politically motivated because I think that there's a lot of people on the right who view this as positive hope that there is a solution out there and they want to see positive hope. And they want to see positivity in a time where there's so much negativity in general. And so when you see these social media platforms cracking down, it just makes these people feel more disenfranchised. They feel like they have no freedoms, they can't talk about this. They're not even in control of their own health. And that's wrong. They should be able to talk to people about this. And I feel very passionate, you know, about that, even if I might disagree with somebody on the topic. 
It's their right to have that debate. It has nothing to do with me; it's their personal health. And as a company, that's what we stand for and believe in, which is people's rights to make these decisions on their own.
Shimshock: Great to hear. Now when discussing tech censorship, people typically only address practices employed by Big Tech giants themselves, such as suspension and shadow banning. And when I say Big Tech giants, I mean the social media companies. But that's only really one segment of the conversation. Over the past few years, we've seen numerous other censorship weapons in action, such as app stores banning apps with problematic points of view, domain registrars revoking website registrations, and payment processors nixing user accounts. I know for instance that Gab, another Twitter alternative, has experienced a couple of these problems and those doctors I mentioned earlier had their website taken down, as well. Now can you walk me through how Parler is prepared to combat these issues?
Matze: Sure, and you'd see at the congressional hearings, you saw that people were pointing out to companies like Apple, because Apple's Tim Cook was there, saying “look it, you're giving preferential treatment to some apps.” And he claimed that they treat all apps equally, which is obviously, in my opinion, not true. Apple's App Store clause 1.1.1 says any apps that might contain objectionable or harmful or “hateful” content -- by no legal definition, but an arbitrary definition -- are not allowed on the App Store. Now that is impossible to maintain on a social media platform. Impossible. Twitter violates that all the time. And yet Twitter, for example, is Editor's Choice. They are given a special status and treatment by Apple, which actually disproves his claim that they treat all apps equally. Meanwhile, as you had mentioned, that company was banned from the App Store, along with many other companies that have been banned from the App Store, including other apps that I have made, which were banned from the App Store on purely ideological reasons. And Parler has so far kind of reached the threshold where I think we're too big to take off the store, at least right now. We're working with them as much as we can to make sure that we don't run into any problems. They have said that we're okay: as long as we continue to moderate by the rules that we've set up, which are clear and everybody can read, and we don't publish content, we’re fine. And so as long as we're not publishing content, which we're not, we don't curate it -- it's very chronological; there's no algorithms -- we're fine. But if you take a look at other apps, like Facebook, take a look at Twitter, and you look at their rules, they're not allowed to have, according to 1.1.1, any hateful or obscene or awful content. Yet Twitter has hundreds of thousands of tweets about “hashtag kill all certain groups of people,” and that's allowed.
So it's really a double standard, and I think it was a little bit misleading, the statements that were made at the congressional hearing, because there is a bias. But the real question is not “are they misleading us about their bias?” Because we know that they are. The question is: is it their right to have this bias as a private company? And should we do something about it? Personally, I think it's their right to do it. They built these companies; if they want to be biased, they can. I just don't think it's their right to lie or mislead people about their bias. And I also don't think, I don't think it's really morally acceptable to do it, though. I think it's wrong. So that's where I stand on the hearing.
Shimshock: All right, now turning to the larger political scene, I want to ask, we saw the hearing Wednesday with Big Tech executives, like you mentioned, and we have a very short window from now to November, but in your opinion, what can lawmakers do to combat Big Tech election interference?
Matze: They can keep raising awareness and marketing about it. But I don't think that they can do much of anything in that period of time, at all. The only thing they can do is promote competition, which is effectively working the best out of any of it. We have politicians raising more money on Parler than they are on Twitter with the same audiences, with the same numbers even. So you're seeing better conversions on a platform like Parler; you're seeing people come over and in large waves. They're getting better traction, they're getting more reach. Articles are being clicked on and read, which is unheard of right now on these other platforms. So the best thing they can do is promote competition. And I Parlered about that last night saying “thank you to all the lawmakers, to all the congressmen and congresswomen, the senators that are on Parler” because by supporting a competitive platform, they are effectively making the biggest impact they can, you know, on promoting competition and solving this problem.
Shimshock: All right. And lastly, what's next for Parler in the coming months and then going into 2021?
Matze: Next, we want groups. That is our big thing that we want to do. We want groups. Now, a timeline, I can't guarantee anything. But we would love to replace our Discover page with groups. And we're working really hard on doing that because people need a place to have conversations with one another to organize events and these keep getting shut down elsewhere. We need to have that.
Shimshock: Great. Well, thanks so much for your time, John, and best of luck with Parler.
Matze: Thank you. Take care.
Rob Shimshock is the commentary editor at CNSNews.com. He has covered education, culture, media, technology, and politics for a variety of national outlets, hosted the Campus Unmasked YouTube show, and was named to The Washington Examiner's "30 Under 30" list. Shimshock graduated from the University of Virginia with a Bachelor of Arts in English and Media Studies.