Today we explore the rise of fake news and the disturbing impact it is having on how we think, how we behave, and what we believe. Advances in artificial intelligence and digital alteration platforms have led to a rise in the volume of so-called deepfakes: synthetic media created using machine learning. Recent examples include the first Covid-19-related deepfakes, manipulated media designed to influence elections, and fake pornography designed to harass and humiliate its victims, most of whom are women. Today we'll investigate this wave of disinformation and how we can mitigate its effects.

With that, I'm excited to welcome our guest speaker today, Nina Schick. Nina is a political commentator, advisor and public speaker who specializes in how technology is reshaping politics in the 21st century. She has advised governments on Russian election interference in the US and elsewhere around the world since 2016, and has worked with global leaders including Emmanuel Macron, Joe Biden and Anders Fogh Rasmussen, the former Secretary General of NATO. Her first book, Deep Fakes and the Infocalypse: What You Urgently Need to Know, has recently been published. Nina, welcome. Thanks so much for joining us today.

Hi, thank you, Greg. Thank you for having me.

So I've been fortunate enough to spend some time with your book, and it's obviously an area that's still emerging and moving at a very fast pace. Maybe we can start with a basic question: how do you define a deepfake?

Well, a deepfake is a piece of synthetic media, and synthetic media is any form of media, whether that's a video or an image, that's either partially or wholly generated by AI. There are going to be many, many positive and commercial applications for synthetic media, so when it's used in a negative way, as misinformation or disinformation, that's when I call it a deepfake. The taxonomy around this area still hasn't really been defined, so from the top I want to make it very clear that we're
only talking about the bad use cases of synthetic media, in the form of deepfakes.

Okay, that makes a lot of sense. You used "misinformation" and "disinformation" there in your answer, so I'm interested if you could maybe just unpack that: what are the differences from your perspective?

Well, disinformation is when a piece of information is deliberately made in order to deceive or manipulate, and misinformation is a piece of incorrect or bad information that just goes viral naively, without any intent. Those, I think, are the two big challenges in our information ecosystem when it comes to bad, damaging and dangerous information: there is the bad information that's disseminated with bad intent, and there is the information that's disseminated without this bad intent but which can be just as harmful.

Okay, so if we're thinking about bad intent, who's generating this stuff, and why are they doing it?

When it comes to deepfakes in particular, the really interesting thing is that the first use case was non-consensual pornography, so a lot of this has been generated and disseminated by anonymous actors on the internet. But when it comes to bad information more broadly, because our information ecosystem has changed so dramatically in the past few decades, this is really in the ambit of not only state actors but also domestic political agents, and really anybody. It could be a teenager in a basement; increasingly, as this technology is democratized, it could even be a child. Disinformation and misinformation can be disseminated by anyone in the 21st century.

Okay. And in the book you describe the Infocalypse as the increasingly dangerous and untrustworthy information ecosystem within which most humans now live. Can you give us a sense, for everyone watching, of how this is being played out
through the ecosystem, just so we can get a sense of the scale and the impact that it's having?

So when I first came to deepfakes, my background is in politics, and I came at it from a kind of geopolitical, information-warfare perspective: what are rogue authoritarian states going to do to disrupt our elections, and what tools are going to be developed in future that will make this more potent? But what I quickly realized, if you look at how corrupt our information ecosystem has become over the past ten years or so, is that deepfakes didn't emerge in a vacuum. What we're actually seeing is how the age of information, and all the rapid, exponential technological advances attached to it, which have had many positive benefits, also has a darker underbelly. What I contend is that the entire ecosystem, which in the age of information has become increasingly important and increasingly universal, in the sense that almost everyone in the world is now connected into it, and the half of the world which still isn't will be in the next decade, has become dangerous and untrustworthy, because it is completely ungoverned and unregulated and has no infrastructure in terms of safety. I call it the Infocalypse: this thing that is developing and, in my view, growing increasingly more potent.

So it's becoming more potent. Obviously legislators are often running behind technology, and large technology companies tend to have that "move fast and break things" attitude of "we'll worry about the impact of what we've done later on". How do you think we can get a handle on this? Is it possible, or is the genie out of the bottle?

It's very, very hard to regulate this. Very, very hard.

Now, obviously many people are involved in this ecosystem of misinformation and disinformation, state actors among them. What kind of sense do you have of
how we can try to, and I'm using this word advisedly, control or mediate this kind of information?

Well, first of all, I'll make a few comments on how quickly this is happening, because I want to say that misinformation and disinformation, bad information, are as old as humanity itself. At its heart this is not a tech problem: the tech is not inherently bad, but it is an amplifier of human intention. What I argue is that because the information ecosystem has changed so quickly, we are now facing an unprecedented crisis of bad information. If you look at the huge technological advances that have completely transformed the way we communicate and receive information, there has usually been more time for society to catch up. Take the invention of the printing press, arguably the most important invention for human communication before the 20th century: there were 400 years between the printing press and the invention of photography. But in the last 30 years it's been internet broadband, smartphones and social media, and coming into that now is synthetic media, potentially disrupting the most important medium of human communication in this ecosystem, which is video.

So when it comes to solutions, I would love to say there is one silver-bullet answer, but there really isn't. The first step, and this is why I wrote the book, is just understanding and putting a conceptual framework around what is going on. How is Russian interference connected to Pizzagate, connected to Donald Trump, connected to pornography? On the surface these look like separate, disparate events, but if you conceive of them as all being interconnected in this new information ecosystem, it starts making more sense. Only once we have understood and put a conceptual framework around this challenge, which, by the way, I think is
soon going to emerge as just as important a challenge as climate change, can we begin to defend ourselves from it. I argue in the book that there really is no one solution: it has to be a society-wide effort, where civil society works with government works with tech, and it needs to be a networked approach. Unfortunately, when it comes to regulation, given my background in politics, what I would say is that too often policymakers lag too far behind. And I would not advise that the way to deal with this, because the genie is out of the bottle and there will be many positive applications, is to regulate it out of existence. Not only because I don't think that's possible, but because I think it sets a very dangerous precedent. There is already one country where they've said, well, we've outlawed the use of deepfakes, so the government is effectively given the power to say what's real and what's not real. We don't want to go down that path.

Absolutely, I think that's crucial; we shouldn't be censored because of this. But in the second section of the book, which I thought was very powerful, you look at Russian disinformation. There was a story only this week, which the New York Times reported, about Black Lives Matter supporters in Portland supposedly burning bibles. The story was shared by all kinds of people, including Donald Trump Jr. The only problem is that it seems to have been generated by Russian bad actors. How do you see this Russian interference playing out over the coming weeks and months as we go into the general election on November the third?

So there are, without a doubt, Russian influence operations already underway. I track some of them in my book, and I also describe how this Russian strategy of information warfare is really an old strategy that goes all the way back to the Soviet Union. It's just that now, with modern technology, their potency has increased by an
unbelievable amount.

It's really interesting that you mention Black Lives Matter, because although you can't say the history of race relations was manufactured by the Russians, it absolutely wasn't, it is a real and very important issue in American society and politics, nonetheless, because it is such a divisive and polarizing issue, it is a theme the Russians have consistently picked at and tried to inflame. They have tried to worsen race relations in America since the Soviet Union, since the Cold War. In the 1980s, and I describe this in my book, one of the most outrageous disinformation operations the Russians ran was called Operation Infektion, in which they claimed that HIV/AIDS, which disproportionately affects African-American communities, was invented by the CIA as a biological weapon to kill African Americans. In 2016 they ran similar influence operations to undermine the African-American community: on Facebook communities which they had built up to enforce tribal identity across the spectrum, they were trying to agitate between Black Lives Matter and Blue Lives Matter. And the same thing is happening in 2020. What has happened in the wake of the George Floyd protests was of course not caused by Russia, but given what a raw issue it is, it is an issue the Russians are agitating on in this election, as they have done in previous elections and even in the Cold War. The lesson to take from that is that the rawest issues in our democracies are being exploited by foreign agents like Russia, and increasingly by other rogue and authoritarian states as well.

And clearly the coronavirus pandemic has been an opportunity for bad actors of all kinds. What kind of trends have you seen during the last few months? Clearly this is something that bad actors can seize upon.

It was so interesting, because when I was writing
the book, which I wrote very quickly, although the ideas behind it had been developing for a long time, I tracked all the dangers of our information ecosystem and how they play out, not only in the realm of the political, where I look at geopolitical actors and how disinformation is increasingly infiltrating Western domestic politics as well, but also how it affects private citizens in their personal lives, and companies. So when Covid happened, it was of course unfortunate, but for me it was the perfect case study for my argument, because everything I was saying about our corrupt information ecosystem is displayed and manifested in the Covid information epidemic.

On the geopolitical level, you have seen countries like Russia and China spread disinformation about the origins of the virus. Russia, again going back to its old trope from the Cold War, was saying this is a US-made biological weapon. Interestingly, that's what China started doing as well, because the geopolitical ramifications for China in the wake of Covid are so severe that China, a country which has traditionally kept its information operations mostly pointed at its own citizens, has been taking a much more Russian approach, infiltrating Western social media platforms to push its false narratives. So you have the whole geopolitical thing going on.

Then you have domestic disinformation campaigns by leaders of our own Western democracies. One of the case studies I include is Donald Trump, who for months in the run-up to what has become this awful pandemic was saying he wasn't worried, that it was going to go away, actively spreading disinformation. We know bad information is dangerous, but in the case of Covid it has literally been claiming lives. So you have these politicized information operations around Covid within Western
democratic societies as well. And then you have the full gamut of bad actors who are not state actors or national politicians: the scammers who are trying to sell fake cures, or who are sending emails around posing as the World Health Organization so that they can phish for your details. So you've really seen it play out in all its dimensions with Covid, which I think is the first global event that is the perfect manifestation of the Infocalypse.

We're getting plenty of questions coming in, Nina, so I'd love to go over to those in a second, but just one more question from me for the time being. Do you think there's a critical threshold at a certain point where our information systems are just going to become overwhelmed? How serious do you think this challenge is?

I think it is fundamental. You could even make the argument that we've already hit that critical threshold. A good example for the Western world is the issue of Russian interference, which, by the way, is a fact: it has been proven by all Western intelligence agencies. But if you look at the discourse around it in our society, it's still treated as a partisan issue, which means that it cannot be treated as the national security threat that it is. So you can make the argument that we've already hit that critical threshold in Western democracies. But actually I'm more worried about other parts of the world, where they don't have the kind of safeguards and institutions that we still have to protect us from this sort of bad information, and where the potential consequences of disinformation and information operations are far more devastating. One of the harbingers of that, a case study I illustrate in my book, is how this kind of information ecosystem actually led to the genocide of the Rohingya in Burma, starting in 2017.
Right, okay, great. So Nina, if you don't mind taking a look at some of the questions we've got coming in, I think there are maybe two that we can conflate. Misha Glenny and Mikko Hyppönen are both asking a similar question. Misha asks: at what stage are we at in creating watermarks on videos that states could sign up to as mandatory? And the similar question is: in your opinion, is there an organization currently that has technology proficient in detecting deepfake videos?

When it comes to the solutions, I think the more important piece is societal and digital literacy, but the tech solutions are equally relevant and important. There are two ways to look at it: you have detection, and you have provenance, or authentication. On the authentication side, Adobe has just released a white paper, which they have done with the New York Times, and they're working with Twitter, where they hope to set an industry standard so that all media can be authenticated from the point of capture. On the detection side, there are many detection efforts already underway, from various companies with different models, but the difficulty is that this is all still so nascent that the AI behind detection has lagged behind the generation side of things. Deepfakes have only been around for about three years. Increasingly, people are putting resources into detection, but given the nature of the AI behind synthetic media, the problem is that every time the detection capability gets better, so too does the generation. And the hypothetical question which hasn't yet been answered by the AI research community, which I suppose we'll see play out in the next few years, is: is there a point at which the generator, the AI, gets so good that it is able to fool any detector? We don't know the answer to that yet.
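The point-of-capture authentication idea Nina describes can be illustrated with a toy sketch. To be clear, this is not the Adobe/New York Times/Twitter standard she mentions: the key, function names and payloads below are all assumptions for the example, and a real provenance scheme would use asymmetric signatures held in a device's secure hardware plus signed edit-history metadata, not a shared secret.

```python
import hashlib
import hmac

# Hypothetical device key. In a real provenance scheme this would be an
# asymmetric key pair living in the camera's secure hardware, not a shared secret.
CAPTURE_KEY = b"example-device-secret"

def sign_at_capture(media: bytes) -> str:
    """Produce a tamper-evident tag over the raw media at the point of capture."""
    return hmac.new(CAPTURE_KEY, media, hashlib.sha256).hexdigest()

def verify(media: bytes, tag: str) -> bool:
    """Re-derive the tag and compare in constant time; any edit breaks the match."""
    expected = hmac.new(CAPTURE_KEY, media, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

frame = b"raw image bytes straight from the sensor"
tag = sign_at_capture(frame)

assert verify(frame, tag)                   # untouched media verifies
assert not verify(frame + b"edited", tag)   # a single changed byte fails
```

The design point this toy makes concrete is the asymmetry between the two approaches in the answer above: a detector has to win an arms race against ever-better generators, whereas authentication only has to prove that a given file is bit-for-bit what a trusted device captured.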
Great, thank you, Nina. We've got plenty more questions coming in, so if there are any you'd like to pick, that would be terrific.

Yes, I'd love to answer Jack Graham's question, because I think it's a really, really important one. He asks: out of all the immediate problems, how concerned are you about the liar's dividend? And my answer is that I'm very, very concerned. The liar's dividend is the concept that in a world where anything can be faked, including video, which until now we deemed to be pretty incorruptible and authentic unless we saw it in the movies, then everything can be denied, and everyone has plausible deniability. I actually write in my book that until the age when synthetic media and deepfakes become ubiquitous, and I think we're still a few years away from that, in three to five years it'll be ubiquitous and we might be looking at a different situation, but until then I'm far more concerned about the liar's dividend. And I'm far more concerned about the liar's dividend when it comes to the U.
S. election this year. We have already seen how, in 2016, when Donald Trump hit the nadir of his election campaign, when that leaked tape came out in which he bragged about grabbing women by, well, I won't say the word, he had to apologize. Donald Trump never apologizes, never backs down, but in that instance he did. If that tape came out today, he would not apologize; he would say it's a deepfake. And interestingly, he has already started calling that video a deepfake.

Another really interesting incident was around the George Floyd video, that video so powerful that it united not only the US but the world in these protests against racism. Had it emerged in five or six years' time, its authenticity could plausibly have been denied, so that it became a polarizing issue. We've already had one verified Republican candidate standing for Senate who has an entire theory about how the George Floyd video is a deepfake. So we'll start seeing a lot more of that; we're already seeing it, to be honest with you. The liar's dividend is real, and it's very scary.

Maybe I can ask you to take a look at Sigmund Halderson's question as well, Nina. I think the nugget of it is: how do we avoid restricting freedom of expression while also ensuring that we get quality information? Can we bring people together to make this happen, so that we have some verifiable "truth", in quotes, that we can all hang on to?

So I think the entire crisis of information that we see right now exists because we made the flawed assumption that the information age and all its associated technologies would just lead to a human utopia, which was the optimism of many of the founders. But actually, and this is why I say it's not the technology that's to blame here, it is ultimately a story about humanity. We're not all good and
we're not all bad, and what we are seeing is the downside of the age of information, which is manifesting in this crisis. The only way I think we can tackle it, and this is why I'm hesitant to say we should regulate it or impose blanket bans on anything like deepfakes, because we want to maintain the values of our democratic society, including our rights to free information, free expression and saying what we think, is to look at the actual ecosystem as something we need to shore up in terms of the infrastructure around it, so that it's a safe space, just as we do in civil society. You can walk out of your door and shoot someone in the head, but there will be consequences. Whereas in this information ecosystem, where it's becoming increasingly easy to create disinformation and disseminate it, not only within a small community but literally to the whole world, with immediacy, there are no consequences, and you can behave with anonymity. So I think we have to start looking at solutions that treat this information ecosystem like civil society, and then look at how you can make the entire infrastructure safer, a place where you can actually ensure there is trust in digital media.

Okay, we're pretty much out of time, Nina. Thank you for taking those questions. Just one final question, if you don't mind me asking: I'm really keen to get a sense of where you think this is going next. Is there a best-case scenario and a worst-case scenario you can lay out for us, as you see the ecosystem at the moment?

So I would say that things are going to get worse before they get better, whichever scenario you look at. In the best-case scenario, things will get worse, but with this problem, which I've tried to conceptualize as the Infocalypse, by
connecting all the dots, people will start to realize that it is one of the biggest challenges of our age, and we will gather not only the political will but the cross-industry, cross-society, networked approach you need to tackle it. The encouraging side of things is that that work is already underway, and in my book I highlight some of the organizations and people doing pioneering work in this field. In the potential worst-case scenario, things get worse and keep getting worse, and there is no networked or joint effort to fix the actual ecosystem, so that the future is basically a never-ending warfare around narratives, in which nobody knows who or what to trust anymore. You just trust what you want to trust, rather than having any kind of objective reality or truth, which is pretty devastating to Western democracies.

A bold warning that we need to address something very important now. Nina, thank you for your time today, and thank you for writing the book, which I think is a very urgent call to arms in many ways. Deep Fakes and the Infocalypse is available now. Thank you, everyone, for joining us today and watching the session. If you enjoyed it, please do check out the rest of the WIRED Foresight series: we have discussions with technologist Kate Kalat on the future of the AI ecosystem, entrepreneur Godfrey Nazareth on innovation in Covid-19, Harvard professor Rebecca Henderson on a new type of capitalism, and many more, at wired.co.uk.