Anxiety over character assassination

The rise of the internet and social networks has been an undeniable boon for the state of the world at large, but can such easy and open access to anonymous communication have a downside? To learn more about what happens when people become the targets of anonymous online ire, we spoke with Richard Levick, founder and CEO of LEVICK; Steven Nardizzi, partner at Paragon Strategic Insights; and Simon Newman, CEO at CMG Innovation.

ABERMAN: Well, Richard wants to tell our listeners what we were talking about the other day and why this is such an important issue for us to be thinking about.

LEVICK: You know, Jonathan, it’s always great to be on your show, because we know it’s going to take two seconds to get into Plato and Socrates. And here we are with Plato and Socrates, who posed the question two millennia ago: is the death of democracy too much democracy? And we’re testing it now. Now that we all have our own soapboxes, we’re on the Internet and have this ability to communicate with each other. But rather than seek our highest selves, to seek out mercy and understanding and listening, we’re much more about cancel culture and accusations. And I’m afraid we’re finding that the difference between mob rule and democracy is the rule of law. And when we stop being civil, when we stop abiding by rules, we get to this moment where it is so easy to destroy people’s lives and careers.

ABERMAN: Is that democracy, or are you talking about technology?

LEVICK: It’s really hard to determine what the difference is right now. Do we actually have the right to vote anymore? That question came into focus in the 2016 election, and we’re worried about it in 2020. The folks over at Turbine Labs talk about the fact that, since algorithms are what create the Internet, are we in control when we get to the keyboards, or is the machine in charge of us? We wonder, do they really know what we’re thinking? Or do Facebook and the other big Silicon Valley companies know so much about us already that we have no independent thought?

ABERMAN: Well, what we’re talking about today is how these particular platforms allow for people to get in the middle of situations that have a tremendous and rapid adverse effect on their livelihoods. And so, Simon, I’ll ask you: I know that this has been your life experience recently. What happened to you, and what was it like being subjected to this phenomenon?

NEWMAN: It wasn’t fun. My story started when I was elected the 25th president of America’s second oldest Catholic university. I was hired by the board because they were looking for a nontraditional leader to help turn it around. It had failing fortunes. It lost money for most of the last decade. It had junk bonds. It had three times the level of debt that was considered prudent. It had declining enrollment and a discount rate that was above 50 percent. So it was a challenge, but I loved it. My background was in turning around and re-visioning companies. And so we got a lot done quickly.

We started five new programs in high-growth areas with the help of some really good faculty. We got a deal done with Cambridge University, so we were going to teach Cambridge courses in America, which was a great, prestigious thing for the university. And I spent nine months negotiating the largest donation in the university’s history, well over 100 million dollars’ worth of assets, which would have been transformational.

ABERMAN: But we’re not here today talking about that. In my own academic background, that’s a tremendous track record of success for a university president. So why are we not talking about success today?

NEWMAN: When you make changes in an organization, particularly something as old as a university, the people who have power, the old guard, get upset. So I was attacked by a group of professors and their agents who came from the worst-performing departments in the university: philosophy and languages, mathematics and history. And you know, this started with gaslighting, making it very difficult to do things, hiding the ball constantly, coming up with accusations of wrongdoing internally. And events escalated into a toxic narrative being fed to students and put into a student newspaper, making completely defamatory accusations: that I was trying to kick out students for struggling, that I was trying to use a survey to figure out which students to kick out in order to goose the retention statistics of the university. Those were all false.

ABERMAN: So what ended up happening there is that the technology, the rapidity of communication, ultimately allowed an echo chamber reverberation of these accusations or these statements.

NEWMAN: I think that’s exactly right. They put up a web page, a closed Facebook group, where they recruited fellow travelers, to use the vernacular of the Russian counterintelligence operatives. So in active measures, you recruit fellow travelers to believe the same general sentiment. And you inflame their passions, and make them very active by feeding this toxic narrative of false and defamatory information.

ABERMAN: Which is something that, you know, frankly, we see in many different parts of our political discourse now, and in various places. Steve, how about you? How did this impact your career?

NARDIZZI: Sure. Well, my experience is very similar to Simon’s. I helped found a charity, Wounded Warrior Project, right at the height of the conflicts in Iraq and Afghanistan. That was in 2003. I took over as CEO and we quickly grew from a startup charity to one of the largest charities in the country. Four hundred million dollars, program offices all around the country, helping 100,000 warriors with some really deep programming, mental health programming. We had expertise in employment. We created a hospital network. We were providing in-home care. So really, really great programmatic things happening in the organization. And we grew it during the recession. So think about that, going from zero to 400 million through the worst recession in history.

And then the music stopped when two hit pieces came out. CBS News ran one, and The New York Times ran one as well. And those stories were generated very similarly to Simon’s experience, from some disgruntled former employees. We had a number of individuals who stayed in touch after they left the organization, and this is the way I think social media can get weaponized. They created a closed Facebook group, and then began recruiting. They would look for other individuals who had been fired, and they would bring them into this closed group. And therein you created this echo chamber of negativity. And then, one or two of them were PR people. They reached out eventually and got a hook into a reporter. And that’s how these stories really got legs.

ABERMAN: After Richard and I talked about this, I went off and talked with some people that I know that are involved in the Internet. And their initial response was, well, if there’s so much traffic, there must have been something there. How would you respond if you saw somebody who said that to you?

NARDIZZI: I’m very fortunate in that there were a host of investigations, deep investigations, that were done after these stories. So there is a nonprofit expert and journalist himself named Doug White, who actually did a multi-year investigation, wrote a book that, you know, sort of categorized how these stories were made up, all the falsehoods that were carried in the initial media stories. And then there are multiple investigations. Groups like the Better Business Bureau did a very lengthy investigation after these stories came out, and said they found no evidence of all of the things that are being claimed here. So I’m fortunate in that I can point to these external validated reports that say, look, it’s not just Steve saying, this didn’t happen. This really didn’t happen this way.

NEWMAN: Well, when these accusations hit the student newspaper, I actually asked the board to do their own investigation and investigate me as well. They did. And they concluded that none of it was true. Apart from, you know, I used some salty language. I called a professor an idiot, with perhaps a few Old English intensifiers too, because he was an idiot. He completely mischaracterized a perfectly good program that was designed to prevent students from being recruited to the university and then kicked out, academically dismissing 30 students around Christmastime each year. I put a program in place to prevent that from happening, but then got accused of kicking students out for struggling, which was utter defamation. So I had the board report, and then it was investigated by Middle States. The cabal that attacked me even got Brian Frosh, the attorney general of Maryland, to investigate some aspects of this survey that we did. It was all above board. Nothing happened. There was nothing wrong there. But it didn’t get reported in the news that way.

ABERMAN: What you both experienced is not new. This is the inevitable challenge. This is what happens in organizations: there is always an antibody to change. But it seems that there’s something going on here where technology is creating an amplification, which is highly troubling. That’s what I want to talk about when we come back after the break.

ABERMAN: Richard, why and how is technology making what, frankly, has always been a situation of people getting upset about change become such a mess?

LEVICK: Well, first of all, we’ve monetized it. We have heard stories from Steve and Simon about how, when you make change, you make people uncomfortable. That discomfort, which used to be an opportunity for us to grow, is now a cause of action. Glassdoor actually monetizes that discomfort. If you leave a company and you didn’t like it, you’re encouraged to post there. And Glassdoor will then go to the companies and say, hey, if you advertise here, we can make sure you get better reviews, or the bad reviews go down the page. So we’ve now monetized these complaints, too. And I think this is a real challenge. You know, people often ask, well, how do you tell the targets of false accusations from the, I’ll call them, oftentimes self-righteous accusers? And it’s that the victims are often self-aware. What was the mistake that I made? Should I not have used the word idiot in referring to a professor? I wasn’t perfect.

Should I have been more sensitive during that fundraising? The self-aware are always looking at themselves. The self-righteous aren’t, because it’s always easier to accuse others. And we really have to work on being better human beings. You know, Mark Twain said that a lie will make it halfway around the world before the truth has its sneakers on. And that is so true today that it’s very hard for us to include in our counseling a prophylactic. How can you prevent it? All we can say is what you can do after. It happens too quickly to really be able to prevent it. And finally, the tool that was often used in the early days of organizing, by Saul Alinsky, by Gandhi, making it personal, has now also become monetized. We see doxxing all of the time, and that is putting personal information out about people who are not public figures. And now anyone who runs a company, in any position of authority, no matter how anonymous they previously were, is becoming a public figure.

ABERMAN: So, Simon and Steve, I’ll turn to you. Let’s start with this. What would you ask people to do? What would you ask our citizens to do when these things occur? How would you like people to react?

NARDIZZI: Take time to read and reflect before you jump on the sort of virtue bandwagon, right, the self-righteous bandwagon. I think we’re all guilty of this at some point. And certainly social media has fueled this. You know, it used to be you’d see something on the news, and if it was a horrible story, you might go, oh, that’s terrible, somebody did something awful. But there was really nothing you could do about it. And now I think we all feel inclined: well, we can do something, right? At the very least, I can go out and pop off on social media, say something negative and re-share the story, and then I’m validated for that. Somebody comes on and likes my post. Somebody else comes on and likes my post, re-shares it. And so we’ve really trained ourselves to just home in on those 140 characters that are out there, the tiny headline, and put it out there.

And I think we forget that there is context, usually, behind the story. And I think we also forget that the people being attacked are real human beings. They are not just the sort of caricature that they are portrayed as in a headline, in a soundbite. Nobody is. They’re real human beings. And also, your posting and inflaming communications on social media can have real-world consequences. I don’t know, Simon, if this happened to you as well, but it did not stay on social media. I had actual death threats. I had people looking for me, coming to my house. Police involved. So your activity online doesn’t always stay there, and you can really inflame others. And people need to be more thoughtful about that.

NEWMAN: Yeah, that’s exactly right. And Steve and I have both talked about this. I think the character assassin’s weapon of choice is something I call a toxic narrative. A toxic narrative is something which contains a sprinkling of facts, some salacious passages, a quote which implies wrongdoing or malevolence, then statements of outrage, which further intensify it, and then an explicit or implicit accusation of wrongdoing. That narrative is a character assassination weapon, because you cannot respond to it, once it’s in the public domain, without repeating the toxic false narrative and the defamation in it. You need to be very aware of these, and not share them. So the thing to avoid, if you’re a listener, is premature cognitive commitment: being lured into sharing and saying, this is just another example of whatever your prized sentiment was, and sharing it with your friends. In GRU terms, the Russian influence model, you’re being a useful idiot. So don’t be a useful idiot. Don’t share things that are false. That would be the best advice I could give.

ABERMAN: This is so similar to the conversations that we’ve had in prior shows around the politics and integrity of elections and privacy and so forth. It seems to run four-square into: people get hurt when lies are told. Our legal system is designed to give people the benefit of the doubt. You have to prove a crime beyond a reasonable doubt, meaning that sometimes guilty people go free. We accept that as a society. It seems like the Internet is now the reverse. Everybody is guilty, and innocent people suffer. Even if a small number of innocent people suffer, that seems to be okay. It leads me to ask, Rich, and I know you’ve given this a lot of thought: shouldn’t there be some sort of policy answer to this?

LEVICK: I don’t know where policy is going to come from. We have the Chinese model where the government is in control; we have the American model where Silicon Valley is in control. Neither seems to be very helpful for any of us. But I do think as citizens, we have a responsibility. You know, maybe we should identify some Ten Commandments for communicating on the Internet. Don’t press send. I mean, let’s start with not pressing send, and just thinking about things. Let’s think about being Buddhist. That is, what is the other perspective? I may have a strong opinion on the impeachment, and may not agree with Alan Dershowitz, but I want to understand his arguments.

Our epistemology has changed. We want to first gather information before drawing conclusions, and understand what we’re doing when we practice cancel culture. When we turn around and say, you’re racist, you’re sexist, you’re Islamophobic, you’re anti-Semitic, what we’re doing is practicing prior restraint. That is, we’re saying: nothing that you say has any value anymore, because I’ve already labeled you. You need to be looking at these things in terms of how you communicate.

ABERMAN: The underpinning of democracy is the exchange of ideas, and there has always been an implicit understanding in our societal approach to democracy that people have responsibility for their own actions. There needs to be some sort of policy approach. When you both dealt with this, were the people online spreading these things anonymous, or were they people who were standing behind their statements?

NARDIZZI: It was mainly anonymous. We did find out who many of them were. They had recruited somebody into their Facebook group.

ABERMAN: But there was no accountability. There was no legal responsibility to prove the truth of their allegations.

LEVICK: We have Internet courage, where people say and do things on the Internet, because of its anonymity, that they wouldn’t do if they had to communicate face to face.

NEWMAN: Yeah. Wolfgang Goethe said a coward only strikes when he feels safe, and the Internet provides anonymity. People act as trolls, they mob, they get a whole group of people together to actually attack someone. That’s what happened to me. I was the target of one of the worst academic mobbings there’s been in America. And how that happens is, people are influenced at the emotional level. They’re sold this toxic, awful narrative. It had a bizarrely uniting effect, because I was hated by all ends of the political spectrum. The conservatives hated me. The liberal press hated me. But it was designed to be that way. And behind all this, something your listeners should know, is a tremendous amount of planning by the cabal, the group that attacked both Steve and me.

We both have perfect records of what happened in each of our cases. I had a whole forensic study done. They were meeting for months and months plotting this, collecting toxic narratives that they planned to launch. And the phrases they used were: the media is in our back pocket. They knew they could get people in the media to write those narratives exactly as they wanted them. And they used students. They used students, put them in harm’s way, to communicate that to the media. So when you look at this, what should change? This type of attack, what happened to Steve, what happened to me, is intrinsically evil. You don’t need a great theory to see that. You deliberately go out and attack someone’s character to get them fired or removed from their office. That is intrinsically bad, and there need to be repercussions for it. The professors that attacked me, and their agents, all got promoted.

LEVICK: And you know, Simon, productive people don’t have time for all that planning.

ABERMAN: My conclusion from today is that we have another example of why and how it is that the current structure of the Internet has gone wildly away from an optimistic view that the creators had. There is no doubt it could be a tool for good or a tool for ill. But clearly we’ve got some work to do as a society right now to create better rules for the road.

Copyright © 2024 Federal News Network. All rights reserved.