Thank you very much, and thank you to the Turing Institute for inviting me. I'm based in Toronto, and we have a small team there for BuzzFeed News. If you're not familiar with BuzzFeed News, we are the news division of BuzzFeed, which also has lots of entertainment and video people producing content. We have about 250 journalists around the world, and our little team of weirdos is focused on all of the ways that people are manipulating and deceiving in the digital environment. As part of that work, we
encounter a lot of disinformation campaigns and tactics, we investigate things like ad fraud, and we look at basically any way people are exploiting all of these new and wonderful information technologies that we have. So tonight I want to take you on a tour of a lot of the things that we see, the things I think the average person is encountering more and more, and also look at some of the motivations and some of the impacts these things can have. With that, I want to tell you sort
of three stories about the kinds of things that are going on, stories that I think speak to some of the larger trends we're seeing. The first one is about a town called Veles in Macedonia. You might be familiar with the Macedonian teens; this was a story we did in late 2016 that ended up capturing a lot of attention. I visited Veles a couple of summers ago, and I went up on this mountain with a few local journalists to kind of look at
the town; this is one of the best places to get a view of it. You can see, off in the distance, there's this building, and it's actually a factory. At one point this factory was really the pride of Veles; it was the major employer there. The town had competed with a lot of other towns (this was in the former Yugoslavia) to win this factory, and it was a big thing. It remained the main employer until a while ago, when an investigative journalist, a local
journalist, started looking into the factory and realized its toxic waste was being dumped close by. That was creating air pollution problems and problems with the water supply, and as a result, eventually (not right away, but eventually) the plant closed down, and the major economic engine of the town went away. Over time, particularly the young people of the town started to find a new way to earn income. What happened was that a lot of them took an internet marketing course, as it was described, from
a guy in the capital of Macedonia, and he taught them how to choose a niche, create websites for it, run Facebook pages, and do a whole bunch of things that often veered into the completely unscrupulous, all in order to get eyeballs to your website and make money from advertising. So there were health websites, motorcycle websites, muscle car websites, largely targeting the English-language market, because that's where you could earn a lot more money. Rather than doing something for the Macedonian market, with its small population, you go
for the global market, where a US or UK reader is worth a lot more for an ad than someone in Macedonia. Then, as we got into 2015, a guy launched the first political site. He was a Trump fan; he was actually a media lawyer working in the capital, Skopje, but his brother was in Veles, and they launched the first site. By the time I started looking into this in the summer of 2016, this was the kind of stuff they were publishing: completely false stories, largely pro-Trump,
anti-Hillary Clinton. And even though the guy who started the first site was kind of inclined to like Trump, it was all about business. They didn't actually care who won; they didn't necessarily have strong political leanings. What mattered to them was what content was going to perform best on Facebook, what was going to get people to react on Facebook, which would then make the algorithms decide who to show content to, and to pick that particular article because it was sending off the signals that, oh, people are really liking this and sharing
this. So we ended up talking to a bunch of the people there, some of them as young as 16. I said to one guy, you realize that when we did the stats and looked at the best-performing articles from these sites, the top four or five were completely false, and they were the ones that performed best on Facebook. And this is what one of the guys said: yes, they know they're false (oh, sorry, misleading), but the rationale is that
if it gets people to click on it and engage, then you use it. Most of the time they were copying and pasting stuff, maybe tweaking the headline a little bit, and sometimes copying and pasting so poorly that they didn't even get the whole article in there. But it didn't really matter, as long as the headline worked; that's what they needed. So it was this pure expression of what was really working on Facebook, and of what was performing best when it came to content around the US election. And we learned that
not from what Americans were doing, but from what a bunch of teenagers and young men in Macedonia were doing. I think one of the lessons from this is that a lot of the time, the people who are winning in this information environment are the ones who know how best to exploit the systems that are now in charge of deciding what you're going to see and how many people are going to see a particular thing. Systems like the News Feed on Facebook, which is governed by an algorithm, are in effect the arbiters
of attention in our society now. The second story involves this delicious, maybe a little over-the-top, hamburger. A group of researchers decided to look at food photos on Instagram, just to see what people are sharing, what works and what doesn't. Initially, a lot of what they were studying was, you know, people who made a nice breakfast for their family and took a photo of it, or went to a restaurant and took a photo of their meal; we've all seen the people in restaurants
who sit down and do that before they touch anything. What they found was that over time, as people shared their stuff, the thinking became: my regular breakfast doesn't really do it anymore; I'm not going to get a lot of likes for that. So over time things started to get a little more out of the box, a little more extreme, and eventually you end up with a quadruple-decker hamburger drenched in cheese, and menu items literally existing just so they can be Instagrammed, and people coming to restaurants just to order things and Instagram them.
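The dynamic in this story, where like-maximizing feedback nudges posts toward the extreme, can be sketched as a toy simulation. Everything below is my own illustrative assumption (the `likes` model, the numbers, the hill-climbing posters), not anything from the talk or from the study it describes:

```python
import random

def likes(extremeness, rng):
    # Toy assumption (mine, not the talk's): expected likes grow with how
    # unusual a post is, plus some noise.
    return 10 * extremeness + rng.gauss(0, 2)

def simulate_poster(rng, weeks=30):
    """One poster hill-climbs on likes: try a slightly more or less
    extreme post, and keep whichever version earned more likes."""
    level = 1.0  # ordinary-breakfast baseline
    for _ in range(weeks):
        candidate = max(0.0, level + rng.choice([-0.2, 0.2]))
        if likes(candidate, rng) > likes(level, rng):
            level = candidate
    return level

rng = random.Random(0)
final_levels = [simulate_poster(rng) for _ in range(200)]
avg = sum(final_levels) / len(final_levels)
print(avg > 1.0)  # True: the population drifts toward more extreme content
```

No individual poster decides to be extreme; each just keeps whichever version of their content earned more likes, and the average "extremeness" of the whole population still ends up well above the starting baseline.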
And what they came away with was the conclusion that the average person, without really realizing it, started playing to the algorithms: the algorithms that drove what got distributed to more people, the algorithms that helped you get more likes and more comments on something. Those algorithms were really driving people toward the odd and the unusual, and if somebody wanted something to go viral on their own scale, to get beyond their small group of friends, it started to lead them to
the extreme. I think that's true for food photos on Instagram, but it's also been proven again and again on these platforms: the more extreme, the more unusual, the more over-the-top tends to win out in a very information-saturated environment. It doesn't necessarily have to be negative; it could be something that makes you extremely happy or is extremely uplifting, but that's what tends to win. The last story is a story in one and a half images. This is the first image: a graph showing the fans of the USA Today
Facebook page in 2017, from October until about the end of April. Now, if I'm the social media manager at USA Today and I'm showing this to my bosses, I can't wait for end-of-year reviews, because this is really good; you are absolutely going to get a raise if this is what's happening. But they were actually really unhappy about it, and the reason is that as they more than doubled the number of fans they had over a period of just a few months, nothing actually changed. So they
got all these new fans, who were seemingly really excited, but the same number of people were commenting, the same number of people were sharing stuff and liking stuff. So they started to become really suspicious. They said: we have all these new fans, our page looks massive, but the rate of engagement is exactly the same; there's something wrong with these fans. Eventually they went so far as to actually report it to the FBI, but their first step was to report it to Facebook, as you could imagine. So they went to Facebook, and Facebook looked
into it, and here's what they found. Some people operating, I think largely in Bangladesh but also in some other overseas countries, had been creating fake Facebook accounts en masse. The way you make a fake account look real and passable is that you get it to like a bunch of pages, you get it to post statuses, you steal a photo or two and put that up, so it sort of passes the sniff test. They had programmed these fake accounts to go and like the USA Today Facebook page, but
they never programmed them to go like any other content or engage with it, so it was this huge, giant red flag for USA Today. Facebook looked into it, and USA Today ended up losing some of those new fans, which seems like a sad end to the story, but it's actually a much better scenario for USA Today to have real people engage with their content. This is a problem across the board in digital media: there is so much fakery going on, there are so many fake accounts that are programmed
a lot more intelligently than these ones were, accounts that will actually engage with things, that will talk to you and engage with you, and it's very, very difficult to separate that out from the real stuff. So the three takeaways from that first section, for me, are: one, we obviously have these systems that are increasingly governing what gets attention and what gets distributed, and they're complicated systems and very opaque; we don't really know how they're making their decisions, even though the companies give us some general guidance. The second thing is
that we see the people who understand these systems, who manipulate them, who understand the idea of serving more extreme content, often veering into false or misleading content; they understand that that gets reactions, which causes people to engage with things, and they're exploiting that again and again, not just on Facebook but on many platforms, as you'll see. And the final thing is that the things we're familiar with now, like so-called fake news and maybe bots and other things that have gotten into the sort
of general consciousness: they're a small piece of what's going on. That's what I want to go through, to give you a sense of how so much of what you touch in the digital environment, so much of what you engage with, is easily manipulated, is actively being exploited, and in many cases is part of huge amounts of criminality, which is again a piece I think gets missed a lot. So I'm going to take you on a terrible journey now. We're going to climb the ladder
of manipulation, going from basic stuff all the way up to really advanced, 100% completely synthetic things that are starting to draw concern, but that also give us a peek at where things might be going. The simplest level of media manipulation today is what is generally referred to as trolling. This is the guy who runs a virulently anti-immigrant white nationalist organization called Identity Evropa, and he posted this photo in November saying, basically, hey, I'm hanging out at the White House. You could imagine
people saw this and thought, how the hell did this guy get into the White House? So some reporters reached out to the White House and to him, asking: were you invited, were you participating in an event, what was going on here? And you can see he got a decent number of retweets and likes out of it, probably more by the time we captured this screenshot. Eventually he posted a tweet and said, no, no, I was just there for a public event
that anyone could get a ticket to. So the first level of the troll is making people think he'd been invited by the White House and getting attention for that, getting journalists to reach out to the White House for comment, basically turning himself into a story. Then he calls out the journalists, the ones who actually did the work of figuring out what this guy was doing at the White House, saying basically that they were gullible losers
for trying to figure out what he was doing there. So he got them to spend their time, and spend other people's attention, on this and elevate it. And if you look at this smaller tweet under here: this couldn't have turned out more fantastically, the left, conservatives, and mainstream media all fell for it, just as planned. Now, did he actually plan it? It's possible, but it's also just him completely capitalizing on this, making it seem like what happened was his devious, brilliant plan all along, trying to take credit for everything, like
he was playing, you know, 4-D chess or what have you. So that's a basic level of trolling that we see. Another thing like this happened during the most recent French national elections. We saw a lot of far-right trolls, people who pull pranks online and spread disinformation online, hanging out in a private space called a Discord server, which is mainly a chat app for gamers, and actually trying to plan campaigns for how they could get stuff to trend on Twitter that would support the far-right candidate in France. So
we have far-right trolls in the United States reaching out to far-right trolls in France, hanging out in a private space, planning trolling campaigns and disinformation campaigns. Again, this is about humans trying to get other humans' attention and using these systems any way they can. This is a quote from a guy named Microchip, who would actually run Twitter bots, and he basically talks about how they would come up with over-the-top things to try to get journalists to cover them, and how they would love it even when journalists like us
would debunk them, because they still wasted our time; they still forced us to give them attention. What's amazing to me is where he talks about how we should have already figured this out and stopped covering them, but we kind of can't resist. And this is a problem a lot of newsrooms are really dealing with right now: when do you give attention to things? Even when you're trying to warn people about something, are you giving it even more oxygen? This next one gets into
the realm of humans and machines interacting. So many of us shop on Amazon, right? Amazon has an epidemic of fake reviews, and if that wasn't something you were already familiar with, you need to have your guard up as you're reading these things. There are Facebook groups and tons of other places online where people hang out, and basically someone who's got a product they've made comes into the group and says, hey, I need ten people from the United States to review this product, and they'll send it to them
for free. The people then review it using their real Amazon accounts, and in some cases they're given some additional money, or at the very least they get to keep the product for free. And of course the seller doesn't actually want a real review; they just want them to post something positive. It can even get to the point where, when there's a trust relationship between the fake reviewer and the product person, they'll actually have the reviewer buy the product with their own money and then reimburse them later, so the review gets the Verified Purchase label
on it. So even when you're looking at reviews and you see Verified Purchase, some of that might actually be a fake reviewer. And that's the other image here: somebody saying, oh, I have a competing product, can I get a bunch of fake one-star reviews to knock this one down? Part of this is obviously about you going to the page of that product and trying to decide whether you should buy it or not, but part of it is also that Amazon, depending on what you're searching for, chooses which
products to show, in some cases based on what people have paid to advertise, but also in a lot of ways based on things like the number of positive reviews a product has and its sales trend over the last 24 hours or 30 days. So by doing this, you can not only convince a human who's looking at the page; you also send signals to an algorithm that say, hey, our product is really popular and people are really happy with it. I think the classic 100% false, economically oriented fake news
fits in that category as well. Whether it's teens in Macedonia or this young guy in Kosovo, who created this completely false article with very poor grammar in the headline (I always feel I have to point that out, being a journalist), they're creating content, or copying other people's content, for humans, but the end goal is to get those humans to react so that it sends a signal to Facebook's algorithms saying, hey, you need to show this to more people. So we're creating stuff for humans in order
to get humans to act in ways that will also trigger algorithms, and this is what's happening more and more in our society. When you don't have the ability to go after humans or target humans first, you can very easily just buy a bunch of fake humans online. This is very easy. This is not a how-to section, so don't go out and just google "fake Twitter accounts", but if you were to do that, you would see it's very easy to buy them. It's very easy to buy views and fake accounts, and very
easy to buy retweets, likes, favorites, shares, all of that; there is a marketplace for it. I bought about a hundred accounts for, I think, less than ten dollars, and I was basically just sent a text file with the username, the password, and the phone number each account was linked to, because the sellers will actually buy SIM cards and connect a phone number to all of their fake accounts; that's more likely to pass Twitter's initial test of whether an account is real or not. I showed you that quote from
Microchip; this is an example of very rudimentary bots from him. These were bots he was using to help spread the conspiracy theory that Seth Rich, a former DNC staffer, had actually been murdered by the Clintons and high-ranking Democrats. This is rudimentary because these accounts don't even have user images; they seem to have names attached to them, but the names don't match the usernames, and of course they're all tweeting the exact same thing. The average person would look at this and say, okay, this
is clearly problematic, I don't think that's a real person. But this wasn't necessarily meant for humans; it was meant for the algorithms that decide what trends on Twitter. The goal here was basically to bomb a hashtag with enough tweets in a short period of time to make Twitter's algorithms think, oh, Seth Rich is a popular topic that people are talking about, we should make that trend. And then when real users log into Twitter and see that Seth Rich thing: maybe this really is a thing, so many people are talking about it, maybe
he really was murdered by the Clintons. So he's trying to mess with Twitter's systems, and he's also trying to mess with your mind through getting access to that system. Here's a video; let's watch it. Okay, the video is playing, so nothing's happening; it's not a glitch, this is actually the video. At some point, there we go. Okay, so they removed some words there because, you know, they needed to do that in the video. So what's going on here? This was a video on just a silly kind of memes page. A few years ago,
Facebook told everybody video was a big thing, native video, upload your video to Facebook, this is a big deal. Lots of media companies ran out and hired tons of video people, I know we did, and then something was supposed to happen. Nothing really happened, at least in terms of money. But what did happen was that Facebook promoted the hell out of video, so if you wanted to get engagement on your Facebook page, to have your post travel very far, to maybe get into the News Feed of new people so they might
like your page, so that you could get more views for your video, you needed to have video. And so pages like this, which used to just post a static meme like that and do really well, all of a sudden realized: we need to do video, but we have no resources to do video, so let's take a static meme and just turn it into a video with basically 25 seconds of dead air. And if you see in the bottom corner there, this video got 11 million views. So 11 million
views for that. Obviously there's no reason why average, regular people would want to watch that video to completion; they've been tricked into watching it. And as they've been tricked into watching it, because of autoplay, Facebook's signals get the sense that lots of people are watching this video, this is a really popular video, and before you know it, this thing has 11 million views and over 88,000 reactions on it. So this is a way for them to keep their page growing by completely gaming the system, and Facebook eventually found
a way to kind of stop this, but it was something that was spotted by us, by journalists, before their own systems picked it up. This next one is one of my favorites. Please enjoy the beauty of this Instagram photo; it's remarkable, it has over 1,500 likes, it is so attractive, right? And I like the comments too; my favorite is the one that says, I like the colors used for this photo. Clearly somebody has really appreciated it. This one is my favorite because it has like four levels of awfulness. Okay, so
the first level is: you have real people with Instagram accounts, and they sign up for a service that's going to pay them, and it's going to pay them if they hand over the logins to their Instagram accounts, which is a very bad idea; don't give people the logins to your Instagram account. What would happen is that somebody like our friend Viral Hippo, an account run by one of our BuzzFeed News reporters, would go to this service and say, hey, I've just uploaded a new photo, I would like to purchase
1,500 likes and a certain number of comments. Then the service goes and uses its credentials to log into 1,500 or so real people's accounts and gets them to like this photo and add some comments to it. So if you're Instagram, or if you're someone like us trying to figure out why a black square would get 1,500 likes and comments about colors, you're going to look at it, you're going to investigate the accounts that left those comments and liked it, and they're all going to look like
real accounts because they are but their real accounts who were automated in order to trick other humans so that the algorithms would see other humans liking this as well and so I mean the way I expressed it was humans being turned into BOTS to trigger algorithms to reach other humans which is like the future is terrible if you think about that like This is where we are and so so that's that's the extent that it's going and this interplay between what we're doing and how people are manipulating these platforms and and in their head they're
not just thinking about reaching other people; they're also thinking about triggering these algorithms, about getting these systems to be tricked as well. You can't just trick humans today; you also need to trick algorithms, and that's the next level of manipulation people are really working on. When I think about these examples, I think of this pizza box, which became a meme and was kind of popular a few years ago. It looks like you're getting a good cheese and pepperoni pizza, and you open it up, and this
is what you're actually getting. It looks like that account got a bunch of likes, it looks like that hashtag is trending, but it's actually just completely fabricated. And this is what happens: you kind of peek under the hood of the pizza, and this is what you're left with. This is the experience we have again and again and again as we look into this stuff: if you scratch the surface, if you just lift it up a little bit, there is so
much that is being gamed, so much deception going on, and it is reaching the average person, because these platforms are so ubiquitous. This was a comment from a guy who's the chief scientist at a company called White Ops, which specializes in investigating bots, particularly in digital advertising, which we'll talk about in a second. He expressed much the same thing about how these things get hacked. But in fairness, I should probably show you the earlier and far more popular meme, because journalists are
kind of guilty of this too, right? That kind of bait-and-switch of a headline: you think it's really great, then you go to the article and it's complete crap. This was the original meme that did really, really well; the machine learning one, maybe not surprisingly, didn't get as many likes, but it was nevertheless true. And the fact that he investigates bots is really important. This was a recent tweet by a former chief digital officer of the United States, and the really horrible, frightening truth is that most days, the majority of
traffic on the internet is bots, not humans. Sometimes, on certain services where there's an active attack going on, you can see a huge ratio of bots to humans. And it's not always malicious; bots can be useful, they can be helpful. These can be automated services that are running and taking up some traffic on the internet, completely normal. But that's not the majority of what these bots are. The majority of these bots are absolutely engaged in things that are often meant to either steal
money or trick humans, or some combination of both. So the hope here, a hope I have for this year, is that we start to realize this, and that all of us take it into account as we see things, as we see metrics and likes and shares; that we start to realize how easily this stuff is gamed and how pervasive things like bots really are. On that point, I mentioned digital advertising. This was an investigation I did in October, which I'll break down for you, but the bottom
line was that this group of fraudsters, two of whom had experience in digital advertising and two of whom had experience running servers, came together to eventually collect more than 125 Android apps and websites, and to steal what one insider, in an email I obtained, estimated to be hundreds of millions of dollars over the last few years. When we published this story (I had taken it to Google ahead of time), Google admitted that ten million dollars had been stolen from it and its partners over the last two years or so, and they ended up
actually refunding that money to people as a result. This is another example of the human and bot connection. Here's the breakdown of how this works. The first thing is, these guys created a front company to go out and acquire apps from real app developers. So if you made a game, they would go out, offer you really good money for it, and buy it. What they wanted was your game's reputation, the fact that people had already heard of it, but more than anything they wanted the real humans who were
already using it. They didn't need a lot of humans, but they needed some. They would then take time to watch what your human users were doing, and they basically recorded things like what countries they were coming from, what phones they were using, and how long they spent in the game. The whole point was to record that and start to see what these humans were doing. At the same time, the app went from being owned by that one developer to one of the many shell companies they created. And
so they're watching the humans, and watching the humans, and then what they start to do is take that human behavior and replay it into bots that they had programmed. They were actually copying what real humans were doing and then creating bots to mimic those humans, so that they could have basically unlimited traffic: they had, you know, a certain amount of real humans in the app, and then they could double it, they could triple it, and it was literally just some lines of code; it was so easy for them. Once they had the technology
to clone the human audience, they could basically send unlimited traffic to these apps. They had to be careful about not being too greedy, because if the app was not a very popular game, and not a very good game, but suddenly it was number one and beating Fortnite or what have you, that would probably raise some suspicions. But they could basically add as much traffic to it as they wanted, and they blended it with the humans, which meant that the automated detection systems just thought it was more humans, because the baseline
was these were humans, but now there were bots performing like humans, and basically then it's just a cash machine. So we're talking about at least tens of millions, if not hundreds of millions, of dollars that these guys were able to steal. It is absolutely an industry: there will probably be about 15 to 20 billion dollars stolen out of digital advertising this year by complete, 100% crooks, and that's a number approaching the size of the global drug trade, but it doesn't really get as much attention. That's how bad this has
gotten. So they were using humans and creating bots from the humans. Where we're headed is that you won't need any humans to start with at all. These are not real people; these are faces that were generated by artificial intelligence. They fed photos of real people into a system, which then learned how those people looked and started freelancing and auto-generating other faces. So if this were the face of somebody said to be the CEO of one
of the shell companies that these fraudsters had set up, I wouldn't be able to find any other photos of this person, but I also wouldn't be able to conclusively say, oh, this was a stock image they had bought, which was one way, when we were unraveling the scheme, that we were able to prove they had a lot of fraudulent things going on. So these are faces, and if you look at the progress in just the last few years: faces that were generated two or
three years ago were really kind of poor; these ones could absolutely pass, and there's no way to really know where they came from. You couldn't reverse-search one and find out, oh, this is a photo they stole from somewhere else. There are, however, some clues, as much as it's getting better; a guy named Kyle McDonald talked about how to spot these things. So, this guy has two different-colored eyes, which some humans do have, but it's not that common. The eyes are very different if you look in the middle of the one,
like the shape of Them and everything like that what's going on with her earrings over there is a very strange puzzle you know it's trying to guess things and so over time like now we could still spot some things but over time we won't be able to over time is going to be indistinguishable and we're gonna have to think about what our information environments going to be like when it's this easy to fabricate things like that you know another example is we've heard about deep fakes So this technology can generate someone's you know face onto
One of the uses people decided to come up with for that was, you know, Nicolas Cage as Indiana Jones. Is this going to offend everyone — him as James Bond? I don't know, I don't want to start a riot in here. Him in American Psycho, him as the female lead in one of the recent Superman movies. This is often what happens with new tech like this: people play around with it, they joke around, maybe they troll a little bit, and then, certainly, people think about how to make money from it. But people also use it for much more disconcerting things. Right now there are sites online where you can pay money, hand over a photo of someone, and they will put that person into porn videos. Women are being harassed by men who know them, who pay to have their faces put into porn; initially it was done to a lot of Hollywood actors, who were suddenly being put into porn. You can see how quickly the technology moves from a gee-whiz, "that's kind of awesome" thing, to applications that are kind of silly and kind of fun, and then to harassment and manipulation and things like that. This is the path we seem to follow again and again and again. And if we're not able to detect these things and unpack them, this is going to completely take over our information environment.
This is also an example of what I love — people's ingenious ways of evading the algorithms. So this is a machine: anyone want to guess what it's for? Does anybody know? Some health plans have a minimum-steps requirement, right, to get a better rate — so you can buy these in China. That's pretty awesome, huh? You'll see videos and photos of people in China sitting in a cafe, smoking and eating, and they're getting their steps in. That's brilliant. There are lots of examples like that, of people worming their way through these systems and undermining them for their own personal benefit, which you sort of have to love. This was the first place I saw it, when this guy tweeted it — and often the cafes are providing the machines for you: yes, come and sit down, don't worry, you're going to hit your steps for today.
So this is a great way of manipulating it — and since I've spent so much time in the dark place, I wanted to at least give you something somewhat light. But the reality is that so many of the things in our lives are going to be governed by systems like this, and when we look at media, at what's happening to information, in some ways it's the canary in the coal mine. Your health insurance, whether you get a mortgage — these things are increasingly decided by formulas that you can't really see and don't even know are there. All you know is you didn't get that mortgage; all you know is your health insurance rate went up — because your score came out badly, because you were flagged by a system. If you don't understand the system, if you don't even know it's there, how can you actually improve your situation? This is having real effects on people's lives. So the awareness of this, and the demand for these systems to be more transparent, is a really, really important thing: the first step is for all of us to have that awareness, and the second is to advocate for this whenever we can. The truth is that as these systems become more a part of our lives, as they get integrated into more of the things that decide our social mobility and our economic situation, the ability of people to manipulate them becomes the difference between the haves and the have-nots. The ability of people to understand them becomes the difference between the haves and the have-nots. And the effects of manipulation and exploitation become that much scarier. In some ways we'll notice it; in other ways we won't — our situation will just get worse and we won't really know why, and in some cases we may be actively causing the system to behave that way ourselves.

So, I've laid out the horribleness. I do think there are a few things we should keep in mind. One: if you are a person who builds these systems, you have to think about how they will be exploited and manipulated. This is one of the big flaws at a lot of the tech companies — they built these systems without necessarily thinking through the somewhat edge cases of how people would exploit them. They're getting better about that; they're now hiring teams to interrogate the things they're building. But thinking about all the ways something can be gamed still has to become standard practice.
The second thing: too often, the people who build these systems think about the overall accuracy rate of their machine-learning algorithm, or the overall health of their platform. That scale piece is important, because these platforms are so big. But a lot of this manipulation happens at a scale that seems tiny compared to, say, the two billion people who log in to Facebook every month — and it's still having real-world repercussions, real-world harm. So you have to think about the really small cases as well as the overall system, and I think that sometimes gets missed.

Not surprisingly, I also think it's essential for every journalist to understand, at the very least, everything I've gone through here — and to know how to check whether a photo has been Photoshopped, how to reverse-image-search something, and not to take what you see on social networks at face value and just throw it into an article, writing it up as if it represents actual public sentiment, because you don't know that. Everything you see, you have to question. Journalists can play a really important role in helping guide people, but it's also a challenge for every single person in society: to be aware of this stuff, to raise your level of skepticism, and to build enough understanding and knowledge that you can protect yourself — and demand better from the products you use, from your representatives, from the companies you interact with. Showing that this is something you care about, and advocating for it, is one way to get them to do a better job with all of this and to protect you better. The downside, of course, is that otherwise we end up with an information environment that is just rife with fakes and disinformation, where you can't tell the real from the fake, where the things that decide what happens in your financial life and your personal life are completely out of your control — a future that, as you saw earlier, looks pretty ugly. So thank you very much. [Applause] [Music]

One of the points I noticed was that all your political examples showed the extreme right wing. Is it only the right wing that does this? Does the left wing do this at all, and if so, why were there no left-wing examples?

The short answer is yes, they do it as well. The longer answer is that, from the research we've been doing, and from research that other people like Yochai Benkler at Harvard have done, there does seem to be a difference
between the liberal and mainstream media ecosystems, especially in the United States, and the conservative and even further-right media ecosystems. So by no means is it one side and not the other. But one of the things a lot of conservatives have held for a long time is that the media is more left-leaning, that it doesn't reflect their point of view or give them enough representation — and I think those have actually been legitimate beefs in a lot of ways. One result is that at a certain point a lot of conservatives — and I'm based in Canada, but we cover a lot of what happens in the U.S. — just decided to build their own media ecosystem. They were very early to blogs, very early to cable news; in some ways you could say they've been innovative in adapting to new media models. And the result is a kind of completely separate conservative and right-wing media ecosystem that's a little more detached, when you look at network maps and things like that, than the liberal media ecosystem. When you're separated like that, you do tend to get more extreme. It's well documented that the more you're surrounded by people with the same point of view as you, the more extreme your views get. So I think this more closed-off ecosystem on the right has also created a bigger, more extremist ecosystem. There are lots of great, strong conservative journalists who do good work — but if you look at YouTube, at Facebook, at Twitter, there does seem to be a bigger, more extreme right-wing ecosystem where conspiracy theories and disinformation take flight. The last thing I'd say, though, is that the election of Donald Trump has created a boom in liberal misinformation in the United States. You're seeing the same kind of conspiratorial thinking, the same conspiracy theories, taking hold on the left, and it's been really wild to watch, because I think it shows it's not actually about your ideology. It's about what's going on in your media ecosystem, and about how strong your emotions are, how much you hate or feel separated from what's going on — that's what pushes you to the extreme. But it's a good point, and I should include some of those examples.
Thank you for a really interesting talk. On your last slide you talked about how we can address some of the challenges we're facing today. Do you think there's a role for regulation — or, in fact, for changing incentives? Because that seems to me to be the underlying thing: the profit incentive, the business models that depend on getting eyeballs on the page, the advertising models. Before Google we didn't have these issues, and these business models have been developed and cemented in stone to the point where they encourage exactly the kind of behavior we're then trying to address with secondary measures. I just wondered about your thoughts on that.

On regulation: there have already been some very bad laws passed in places around the world, where basically authoritarian governments have used "fake news" as a reason to clamp down on opposition media. So I'm pretty wary of regulation in general. What I am a big fan of is the threat of regulation, because these companies do not want to be regulated — even though they've started to come around to realizing, hey, this might actually happen. The idea that if they don't get themselves together there could be legislation — that, I think, is a very effective way of getting them to act. Of course, you can only take the threat so far if you're not actually going to follow through. But that is one thing I would endorse. Your Commons committee here was very good — compared to the hearings in the U.S., what happened here was a lot better — and I think it applied a lot of pressure that companies like Facebook and Google and others have taken seriously.

On the incentives piece: it's definitely not new for media organizations to go to extremes and be rewarded financially for it. We've all seen the tabloids; that's been around for a long time. Very early newspapers, once they moved to the mass model of gathering an audience to get money from advertising rather than being primarily subscriber-driven — that's where we saw a lot of this start, particularly in the 1830s in the U.S. So the model has been there. But I agree with you: this has poured gasoline all over that fire, and it's everywhere. At this point even Mark Zuckerberg has acknowledged that some of the incentives are really off, particularly around what gets rewarded on Facebook. The problem is that it's really hard for them to change that. One, they're a publicly traded company; they need to hit their numbers. So you're right, that model is difficult. The second piece: about two years ago I talked to a Facebook product manager who'd been hired to work on these kinds of issues, and he said to me very plainly, yes, we have to switch it so that the incentive is toward higher-quality, at least credible, content — he recognized the incentive ran the other way. Well, two years later, Facebook has hired a lot of engineers and actually invested a fair amount in this, and they still haven't figured it out. So I don't expect it to suddenly be fixed. There's a mix of the economic model and the fact that their system is really built on engagement — and how you fix that, I don't have an easy answer.
I want to ask: how easy is it to spot the difference between bots and fake accounts if you don't have access to the insights the way USA Today did? How easy is it for journalists to gather that evidence?

The short answer is, it depends. Sometimes it's actually really, really easy. Take the example of Microchip's bots: if you'd seen one of those, you would probably say, naturally, without needing any training, well, they don't have a profile photo and they're tweeting this thing with four hashtags in it — that seems a bit weird. So there's some stuff at a really basic level. If you go to a Twitter account: do they have a photo? How long has the account been around? (Not a pure sign on its own that it's a fake account.) What are they talking about, what are they tweeting? How quickly are they tweeting — are they doing a hundred and fifty or a thousand tweets a day? Is this something you think a human would sit there and do? There are some basic questions you can ask as you look at a Twitter account or a Facebook account. You can reverse-image-search their profile photo: if they seem to be claiming that's them, where else does that photo exist online? Reverse image search is probably the most useful skill you could have, to be honest, and if you don't know what it is, just look it up and you'll see.
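Those basic checks can be strung together into a crude screening pass. Here's a sketch — the account fields, thresholds, and the `bot_red_flags` helper are all invented for illustration, not part of any real Twitter API:

```python
from datetime import datetime

def bot_red_flags(account, now):
    """Apply the basic manual checks to an account dict.
    Field names and thresholds are illustrative, not any real API's."""
    flags = []
    if not account.get("has_profile_photo"):
        flags.append("no profile photo")
    age_days = max((now - account["created_at"]).days, 1)
    if age_days < 30:
        flags.append("account is under a month old")
    per_day = account["tweet_count"] / age_days
    if per_day > 150:  # the "would a human sit there and do this?" test
        flags.append(f"{per_day:.0f} tweets a day")
    if account.get("avg_hashtags_per_tweet", 0) > 3:
        flags.append("hashtag stuffing")
    return flags

suspect = {
    "has_profile_photo": False,
    "created_at": datetime(2019, 1, 1),
    "tweet_count": 9000,
    "avg_hashtags_per_tweet": 4.2,
}
print(bot_red_flags(suspect, now=datetime(2019, 1, 15)))
```

None of these signals is conclusive on its own — a brand-new human account trips the age check too — which is why they're questions to ask, not a verdict.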
Then you can get into more advanced analysis: looking over time at how much they tweet per day, whether they tend to tweet the same links over and over and over again. But the truth is, as with the Turing test, it's getting more and more complicated to spot bots. At one point, when I was talking to Microchip about where bots were going, he was talking about things called word spinners: you enter a sentence or a block of text, and it scrambles it for you — but scrambles it in a way that still says the same thing, in a grammatically correct way. So instead of Microchip having a hundred or a thousand accounts tweeting the exact same thing with the exact same hashtags, he could spin that same sentence, that same sentiment, a thousand different ways and have them all seem like they were actually written by a human — he's taken that message and made it seem like it's not all the same account copying him. And that's actually quite rudimentary.
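A toy version of the word-spinner idea — the synonym table here is made up, and real spinners are far larger and grammar-aware, but it shows how one sentiment becomes many surface forms:

```python
import random

# Tiny hand-made synonym table; a real spinner would have thousands of
# entries plus grammar rules so every variant stays well-formed.
SPINS = {
    "crooked": ["crooked", "corrupt", "dishonest"],
    "media": ["media", "press", "papers"],
    "lying": ["lying", "misleading you", "hiding the truth"],
}

def spin(template, rng):
    """Fill each {slot} in the template with a randomly chosen synonym."""
    return template.format(**{k: rng.choice(v) for k, v in SPINS.items()})

rng = random.Random(42)  # seeded so runs are repeatable
template = "The {crooked} {media} is {lying} again"
variants = {spin(template, rng) for _ in range(50)}
# 50 draws over 3*3*3 = 27 possible combinations: many distinct tweets,
# all saying the same thing, none an exact copy-paste of another.
print(len(variants))
```

Because no two accounts post identical text, naive duplicate-detection — the easiest copy-paste tell — stops working.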
You can get far more advanced than that. You can program bots so that they're tweeting links with, say, an hour in between, drawn from a variety of fifty different websites — and that might just look like somebody who reads a lot of news. So there are some basic tests — look at the account and ask yourself, would a human behave this way? — but there are also cases where you have to do something computational to analyze it. If you want a tool, there's one called Botometer, run out of Indiana University. It's not foolproof — it's not perfect — but a lot of researchers use it. I think it looks at a hundred or two hundred different characteristics of a Twitter account and its last two hundred tweets to give you a score. So there are things that can assist you.
How effective do you think adversarial, moderated discussion groups are — where there's the possibility of deep conversations, and the people in the group may well be quite motivated to identify fake stories? I saw it with the giant cheese hamburger from Instagram. I quite enjoyed looking at a subreddit about sushi that shouldn't exist, and it really hasn't degraded over a year — there's a huge gulf between the bad subreddits and the good ones — and people there are pretty exercised about identifying fakes. How effective do you think that community identification is?

I think an engaged, focused, and especially specialized community, where people have specific expertise or a specific passion for a hobby or something like that, can be tremendously effective. There are certainly examples where amateur sleuths have gone completely awry, and people who live in conspiracy groups will force each other even further down the rabbit hole. But it can absolutely work in really positive ways, where you have people working together, people sharing their expertise — and doesn't that warm your heart? That's what we all thought the internet was going to be. So I think it can be tremendously effective. It can also go completely the other way. The reality is that after a while, the Macedonians — they had Facebook pages and that kind of thing, but what they really realized was that the best way to get their stories to spread was to use fake accounts to drop them into Facebook groups, pro-Trump Facebook groups. So you had a similar thing: people rallying together because they had a similar political point of view, because they wanted to support a candidate — but because of that group dynamic, those groups became honeypots for people to drop conspiracy material and fake information into, and the stuff went viral. It worked really well.

Yeah — I think in a lot of ways Wikipedia works really well, but it's very hard to get that one-to-three percent of people who do 99% of the work, the way you have the super-editors on Wikipedia. It becomes a community-building challenge: what is the motivation for people to get involved in these things? There have been projects to do crowd fact-checking, and it's just hard to keep people motivated and into it. And as much as Wikipedia really works, it also has a real problem with the fact that so many of the people engaged in it are men, so there are very few pages about famous women scientists and women who've been prominent in other ways. The bias of your community starts to be represented in the result. It can turn out to be a great service, but of course there's always more work to do. And absolutely — if it's material from English-language culture you see it all over Wikipedia, or Japanese anime I suppose; outside of that, in other countries, it can be a bit of a ghost town.
In the UK — and I don't know how well this translates to America — we have a history of the news being an acceptable level of inaccurate, I would say. We've had the Sun and, say, the Hillsborough disaster; we've had the Times and a false story about Muslims adopting Christian children; and there are the Daily Mail corrections Twitter accounts, and so on. With that in mind, and given the acceptance that the news is going to be, if not explicitly false, then deeply politicized — why should people care whether or not the news is accurate, when in some ways we've become so accustomed to it being slanted one way or the other? A weird question for me to be asking, as a journalist myself.

There are a lot of people in journalism who feel that way right now. Side note: I used to run a blog about corrections — I'm a complete nerd for newspaper corrections, and I used to read the UK tabloid corrections, which are unbelievable, the stuff that gets reported. I know — the problem is the stuff that doesn't get corrected. But you're right: there is a certain acceptable level at which journalists and news organizations can make mistakes. The question is whether they're making them in good faith or in bad faith, and whether they're owning up to them publicly. That's the contract we've had with readers for a long time. And yes, a lot of people argue that having an overt partisan leaning — knowing the Mail leans this way and the Guardian leans that way — is actually a good thing, because readers can understand that bias; it's very clear, and there's good reporting on both sides. So I think that has overall kind of worked. Where it becomes different is: are people deliberately introducing errors? Are people deliberately withholding critical information? And when they make a mistake, are they refusing to correct it? That's the difference between responsible and irresponsible media, I think. Because that system has been around and worked that way for so long, people have come to know it and to feel very comfortable with it — which is why the way things work now is so completely uncomfortable for so many of us. Frankly, on a cognitive level, it's a huge challenge to go from knowing you're reading the Mail — understanding the framework of the world the Mail exists in, knowing they'll probably correct an error if people yell at them enough — to sitting there scanning a feed that mixes the Mail with some site created three days ago by somebody in Macedonia and this and that, and not being able to apply any of that understanding. It's something none of us are comfortable with, including, in some cases, journalists. So it's a huge difference, and I don't think it's something we're going to adapt to very quickly. It's going to take time, for sure.
Do you think there's scope for algorithms to become subject to more effective democratic scrutiny, some kind of public challenge — or do they just remain this opaque thing with more and more control?

I think we're moving toward potentially figuring out some models for that. A lot of people are advocating for greater transparency in the algorithms that are making decisions about our lives, and I think politicians are looking into it more and understanding how much these systems really do decide what we see, what we engage with, and the decisions made about us. I haven't seen a model for it yet — that's the downside; I don't think a template exists. Facebook is not going to allow a government body — probably nor should it — to go in and inspect its code. But is there a way to have some kind of disclosure about what gets weighted? Here are all the things we throw into the black box, and here, roughly, is how we weight them — because they're not going to give away the super-secret sauce — so that you understand that if an account has these characteristics, or a piece of content has these characteristics, it may get a high result or a low result.
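Even a rough disclosure could look like a published table of characteristics and approximate weights. Everything below is invented for illustration — no platform has published numbers like these — but it shows the kind of transparency I mean, and why engagement-heavy weights are the problem:

```python
# Invented feature weights — a hypothetical disclosure, not any real ranker.
WEIGHTS = {
    "from_friend": 2.0,
    "comments": 0.5,
    "shares": 0.8,
    "flagged_by_fact_checkers": -3.0,
    "post_age_hours": -0.1,
}

def rank_score(post):
    """Linear score: each disclosed characteristic times its weight."""
    return sum(WEIGHTS[k] * post.get(k, 0) for k in WEIGHTS)

hoax = {"comments": 40, "shares": 25, "flagged_by_fact_checkers": 1, "post_age_hours": 2}
news = {"from_friend": 1, "comments": 5, "shares": 3, "post_age_hours": 2}

# Even with a fact-check penalty, raw engagement dominates: the hoax
# outscores the news story — exactly the skewed incentive at issue.
print(rank_score(hoax), rank_score(news))
```

A disclosure at this level of detail would let outsiders reason about behavior — "heavily commented content outranks fact-checked penalties" — without exposing the code itself.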
I think Facebook has become a bit more transparent about that, but we do need some kind of workable model. And I actually think we're at the point now where these companies have been so beaten up that if there were a model that still protected their intellectual property, they'd probably be open to figuring it out. So I'm somewhat hopeful we may get to that point. This is also a place where the threat of regulation is really useful: threaten them and see what they can come up with. And we should have some independent bodies — NGOs, universities, other places — put forward some firm proposals. This is an area we should put a lot more work into, for sure.

I see why they wouldn't want to share that — but what's the motivation? Why shouldn't they?

I think it's more that it's so unlikely as to never happen. And part of it is that if that's the demand we make of Facebook, then Facebook is going to sit there and say, "you're an idiot, we're not going to do that." So I think we should make requests that actually move the ball along but are also reasonable. I don't think they get to rule everything just because they're a public company and economically driven — just because something is good for their business doesn't mean they should be able to do it. But I think the chances of Facebook open-sourcing the News Feed algorithm are basically zero, and I can't imagine anybody forcing them to do it. Though maybe I shouldn't rule it out completely. It would be really great if we could figure out models here, because open source can obviously be much more secure — we can get at the code and figure out some of the flaws earlier. It would be amazing to find models where they could remain a for-profit company but still do that, and make the system that much more secure. This leads into the argument — which I think is true — that these companies are very different from anything we've had in society before: they are the default platforms of speech, and they should have new responsibilities and new requirements on them. But I don't know that we have a template to figure out exactly what those are.

I mean, I think they realize they have a lot of trust to rebuild, and they're trying — they've made a lot of changes. This used to be a company that gave you nothing: you'd reach out about a story, you'd have found some stuff, and they'd say nothing, or they'd give you boilerplate. Now you can't get them to leave you alone — they want to talk and talk, mostly off the record, which is not very helpful. And this is one of the things that's maybe not very helpful: their PR staff has grown by leaps and bounds, so now they're all over it.
But that's the window dressing; we have to get at the actual behavior of the system.

I was just wondering, given what you've said about the new funding model for most, if not the vast majority, of media nowadays: what advice would you give yourself at, say, twenty — twenty-two, maybe — in terms of having a career in journalism, given how things are now and the funding models that exist for journalists and news organizations?

That's a good question. One thing: you need to understand the environment you're working in. Being familiar with the basics of how websites work, with internet networking — things that might sound completely obscure — really understanding the underpinnings helps you understand why the business models have evolved the way they have. So for a journalist, regardless of your beat, if you're planning to work in digital — and if you're 22, you should expect that's going to be a big part of it — I think understanding some of those fundamentals matters. The other piece is that specialization tends to be valued a lot in the world of journalism today. If you're the person associated with one particular thing, that's often going to serve you better than being a generalist. And on business models: I think we're seeing a bit of a swing back. I work at BuzzFeed, a place that has always said we will never have a paywall, we're not going to ask people to pay for our content or our journalism, we're going to work with social networks — that's been the philosophy of the company since day one, long before it had a news division. But we're seeing a huge shift in a lot of places to subscriptions, back to trying to find that direct relationship with the reader rather than passing it off to a bunch of programmatic advertising companies. So there are a lot more business models in journalism today than there were when I was 22, and it's nice that subscriptions and a close relationship with readers is one of the options, because there are only so many places that can try to operate at scale, completely free. There's a big argument for being smaller but having a very close relationship with your audience. And that's probably the last piece of advice, actually.
part of, where your expertise can be valued, and where you can also lift up and elevate the people who are part of it, so that they see you as a strong and positive actor in that larger ecosystem.

Thank you for the talk. I was wondering if there are any projects to educate young children on democracy and news, whether sponsored by Facebook or the states? There are. This is one of the things that's happened since 2016: suddenly everybody's talking about media literacy, and so there are projects like that around the world, in so many different countries now, where governments are doing it. But yes, absolutely, Facebook and Google are just throwing out cash left and right. This is a great time to try and get money out of them, by the way; they feel very guilty, so you should think about how you can extract cash from them. So what have I advocated? Threatening them with regulation and extracting cash from them, right? So there are a lot of initiatives. I live in Canada, and I can think of at least three off the top of
my head, two of which did not exist before 2016, so one of the nice things is that there is funding flowing into this. One thing, however, that I think we have to be really cautious of: there was a university in the United States that used to require all of its first-year students, long before any of this, to take a first-year media literacy course. It was a great program and it was really celebrated. The problem is that nobody wants to take a media literacy course; you see that on the class schedule and it's just like, next, right? So they had to require it of everybody. The challenge, I think, is figuring out how you integrate this stuff into existing frameworks that people are already engaged in, rather than forcing everyone into a media literacy class. Some of the more effective programs seem to find ways of doing that, to not make it seem like eating your vegetables but instead: would you like to show your parents that they don't know what they're talking about? Here are three things you can do to show that the Facebook post your dad shared is garbage. So there's a lot of that, and it's encouraging, but it's not the only solution, and it's also not just young people. I think all of us are struggling at every age, and there is some recent research showing that in the United States it was actually older people who had more of a propensity to share fake or very hyper-partisan content. So it's a challenge for all of us, and I think we shouldn't shame anybody about it, and
we should all realize this is something we are all struggling with, and be aware of that as we're out there consuming information.

Other nation-states have been deliberately putting out propaganda online. Have you uncovered any of that at all? How much have you seen, if any, beyond just the criminal gangs? Absolutely, there is state-sponsored trolling that takes place. There are absolutely states doing this; I mean, there has always been state-sponsored propaganda, right? So that has absolutely moved into the digital realm. Russia has obviously gotten a
lot of attention, and to a certain extent it's deserved. I think they realized many years ago that they weren't as powerful militarily; in the post-Soviet era they lost a lot of the satellite countries, and they were starting to lose their influence over them as well. So they had to come up with a new way to exert their power, but to do it when they weren't as powerful as the democratic Western alliance. A general came up with the idea of what was called hybrid warfare: using the information space and dominating it. And I think Russia has been very effective at that. I think also the United States and countries in the West, while they may not be hiring trolls to go and work for them, are also conscious of trying to influence the information space, and we should always pay attention to what our own governments are up to as well. China is, I think, interesting. If you look at the information
space within China, it's a mixture. Obviously they run a very, very tight ship, especially under the current leadership. Even though they don't necessarily have a government censor look at everything, the fact is that all of the big social networks in China, all the big technology companies in China, employ their own censors, because they know that if they do not censor they could get into huge trouble. So there's a massive censorship apparatus there. At the same time, they also practice flooding the zone: if there's a lot of conversation about a touchy subject, yes, some of it will be censored and removed, but what they'll also do is flood the zone with innocuous content from people who are hired by or working for the government, to sort of drown it out, a white-noise approach. China seems to have less of an approach of hiring trolls to go after Americans, but they certainly do try to work a lot with their expat communities, to strengthen favorable views of China among them and to help those propagate through society there. So I think there are a lot of copycats of Russia in particular. China is a very different thing; I don't think other countries can really emulate that model, but it does seem like it's the Russian model that's really taking off. And we've seen this in elections: there are efforts, often not by the nation-state itself but by political parties, to engage in this
stuff or encourage it. So I think it's a reality; it's not going away. But awareness of it deters nation-states to a certain extent. It also makes them go more underground, which is the thing I worry about: what have we not seen? That's always the big question.

With all this talk of fake news becoming more and more prevalent, and regulation probably still a few years off, do you think fake news can become its own enemy, in the sense that it becomes so prevalent, like 90% of the
stuff on the internet is fake news, that people shift from the default position of "everything on the internet is true" to "everything is false"? Which sort of puts us back in the pre-internet age, where you can only trust what you see or what you hear from your trusted friends. So do you think that's possible, like a trust inversion? I mean, I would hate to be in the in-between phase of the trust inversion; that sounds like a horrible time, where 60% or 40% of everything is fake and nobody really knows. That sounds like bad things will happen. But I do think over time there is a necessity to shift. There's that famous saying, "trust but verify." The default, as you sort of expressed, was: I'm seeing something, it's come from a friend, or I'm seeing something from what appears to be a news website, so I'm going to apply a certain level of trust. We do have to flip that to a certain extent, to verify and then trust. We
have to be a little more judicious with how we're applying trust to things; we have to raise that level of skepticism. But the danger is that we can't get to the place where nothing matters and we trust nothing, because then what holds us together in society? That is the inversion I definitely don't want us to get to. I don't think we'll get that saturated with fake stuff. What's frankly far more prevalent than 100% fake stuff is stuff that is very misleading, stuff that's very torqued, stuff that leaves things out. And there are a lot of times when mainstream journalism can fit any of those descriptions, where we don't do our jobs properly, even though we may be making a good-faith effort. So I worry less about the hundred-percent fakes, and I worry more about how we get ourselves to adapt to this new universe, so that we're not in a place where we have no trust for anything, but where we're also reducing that instant trust, so that we apply some basic tests to things before
really reading them or sharing them, that kind of thing.

Hi, thanks, that was a great talk. I think my question goes back to the solutions. We talked about regulation, and we talked about educating people about what they're reading, but to what extent do you think the very models that caused the problem in the first place could be the solution? In real life we see very positive deployments of machine learning and AI models, so could the answer just be a game of catch-up: let's build a load of models that can identify the fake faces on the internet, for example, and just be more sophisticated than those causing the problem? I think it's absolutely a piece of it, but it's also hard to predict how good AI can get. Right now, for example, if you look at Facebook, I think they said they're something like 97% effective at spotting ISIS content and removing it automatically; they feel they're really, really good at that. But when it comes to hate speech,
they acknowledge that they're not as good, that it's a real problem for AI, and so they rely on human oversight. I don't feel like we're going to get to the point, especially at the scale Facebook is at and with the breadth of content on it, where you can a hundred percent just turn it over to AI. I think it will get a lot better, but for something like hate speech you have to understand the context: a word that could be okay in this post but is actually very harmful in that one. Even humans can sit and have a debate about it, right? It's hard to write the rules for that, which at the end of the day is going to determine how good it is, how well it's going to work. So I think we'll see a lot of improvement, and AI has to be a part of the solution, but to a certain extent we have to recognize that humans have to be part of this equation too, as a fail-safe, as oversight, because I just don't know that we can get all the way there with AI, although at the same time I can't predict how good it's going to get. I hope it gets a lot better. But there's always bias in any system: there's bias among humans, so AI can sometimes help us with that, and there's bias in AI, so I think human oversight will be necessary there too.