We might have all experienced scrolling past a video and landing on a similar one, or opening a web page only to come across an ad for a product we searched for on Google. This is because algorithms keep a close watch on our activity in order to offer us customized content. To do that, they rely on a wide array of clues we give away: our browsing history, the posts we engage with, the videos we watch, and the personal information we disclose while signing up for a website, including age, gender, and location, are all used to serve us personalized content or place ads
accordingly. Algorithms are also used by social media platforms like YouTube, TikTok, Twitter, and Instagram to predict what we might want to see next based on our activity: are you into cooking content? Do you tend to watch a lot of video? The platform then predicts a few things, such as how likely you are to like, save, or share those posts. Those are the most important predictions, and they all get wrapped up into an educated guess about how interested you might be in that specific photo or video. Personalization can sound like a good thing, as we get to see more of what we care about. But when we only see content similar to what we have consumed or engaged with before, and all the opposing perspectives are hidden on the other side of the internet, we can end up trapped in a filter bubble, a term coined by activist Eli Pariser. A filter bubble can create the illusion that what we see is all there is, and lead us to think that everyone agrees with us. That is hardly surprising: when we only see content that reinforces what we already
believe in, we're left with little to no choice. Being inside a filter bubble is similar to being surrounded by the walls of an echo chamber, another term for an environment dominated by a single mindset or idea. In such an environment, what we believe appears to be common sense, and any opposing views seem to belong to fringe groups and are therefore considered insignificant and destined to die down. The world witnessed a similar echo chamber during the U.S. presidential election in 2016. For many, it had one probable outcome: a pro-Clinton bubble consisting of YouTube videos, Facebook posts, and online polls gave many the impression that everyone was seemingly on the same page, that Trump's candidacy was nothing more than a fleeting effort and that he would likely suffer a significant defeat by his Democratic rival. Some critics have suggested that this led Trump supporters to express their views only within smaller circles to avoid confrontation, which brings to mind the allegations around Russia's possible involvement in the outcome of the U.S. elections. Russia was accused of hacking the Clinton campaign, creating frequent negative news cycles about the
Democratic candidate to deter her supporters from voting for her. In a case that raises ethical questions about whether the algorithmic systems companies employ to combat misinformation are effective, Twitter was recently caught up in a debate about this very question, when Elon Musk criticized Twitter's algorithmic feed for subconsciously manipulating users. Jack Dorsey, Twitter's former CEO, admitted that the new feed could have unintended consequences, although he insisted that it was not designed to manipulate. What can be even more unsettling is that, more often than not, we're not aware we're trapped in a filter bubble, because of the unknown mechanisms behind algorithms. Filter bubbles and algorithms can be messy and unavoidable, but even so, being aware of them is the least we can do to avoid being absorbed by an isolated world fueled by popularity bias, echo chambers, and groupthink. By keeping that in mind, we might be able to reassert some control over what we are exposed to and how it shapes our perceptions of the world.
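The prediction step described earlier, where a platform estimates how likely you are to like, save, or share a post and wraps those estimates into a single interest score, can be sketched in a few lines of Python. This is a hypothetical illustration, not any platform's real code; the action weights and predicted probabilities are made-up values:

```python
# Hypothetical sketch of engagement-based feed ranking (not any platform's real code).
# For each candidate post, a model would predict probabilities of key actions
# (like, save, share); these are combined into a single interest score.

def interest_score(predictions, weights):
    """Weighted sum of predicted engagement probabilities."""
    return sum(weights[action] * p for action, p in predictions.items())

# Assumed weights: shares and saves signal stronger interest than likes.
WEIGHTS = {"like": 1.0, "save": 2.0, "share": 3.0}

# Made-up per-post engagement probabilities for one user.
candidates = {
    "cooking_video": {"like": 0.30, "save": 0.20, "share": 0.10},
    "travel_photo":  {"like": 0.40, "save": 0.05, "share": 0.02},
}

# Rank candidates by score, highest first: the feed surfaces the top result next.
ranked = sorted(candidates,
                key=lambda post: interest_score(candidates[post], WEIGHTS),
                reverse=True)
print(ranked)  # the cooking video wins despite fewer predicted likes
```

Because saves and shares are weighted more heavily than likes, a post with modest like probability but strong save/share signals can outrank one that is merely likable, which is one way engagement-weighted feeds keep steering users back toward the niches they already interact with.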