bitstriada.blogg.se

Swift share through facebook

In the online and digital world, everything you do leaves a trace of “data exhaust” – a record of everything, from what time you put petrol in your car, to what websites you visited. Cambridge Analytica bragged that it had up to 5000 data points on every US voter. When combined, even seemingly innocuous data points can reveal a LOT about a person.

By applying “psychographic” analytics to its dataset, it claimed to be able to determine people’s personality type and then individually micro-target messages to influence their behaviour. The most important source of the data was Facebook. Via a third-party app, Cambridge Analytica improperly obtained data from up to 87 million Facebook profiles – including status updates, likes and even private messages.

But the incident was not an aberration: it was an inevitable consequence of a system founded on harvesting and monetising our information – the business model that academic Shoshana Zuboff dubs “surveillance capitalism”. The model’s fundamental characteristics are: aggregating vast amounts of data on people, using it to infer incredibly detailed profiles on their lives and behaviour, and monetising it by selling these predictions to others such as advertisers. Cambridge Analytica simply deployed the same basic model to target voters rather than consumers.

This model has become core to the data economy, and underpins a complex ecosystem of tech companies, data brokers, advertisers and beyond. But it is the model’s pioneers Google and Facebook that have unparalleled access to tracking and monetising our lives, by controlling the primary gateways – outside China – to the online world (between them Google Search, Chrome, Android, YouTube, Instagram and WhatsApp).

“Facebook and Google have amassed data vaults with an unprecedented volume of information on human beings.” – Joe Westby

Facebook and Google of course have long affirmed their commitment to respecting human rights. But increasingly, we are being forced to ask whether the internet’s surveillance model itself inherently conflicts with our human rights.

Facebook and Google have amassed data vaults with an unprecedented volume of information on human beings. This goes far beyond the data that you choose to share on their platforms to include the vast amounts of data tracked as you engage with the digital world. Mass corporate surveillance on such a scale threatens the very essence of the right to privacy. Indeed, in 2010, Facebook CEO Mark Zuckerberg famously claimed that social networking had already changed privacy as a “social norm”.

But harvesting the data is only the first part of the story. The next step is using sophisticated analytics powered by machine learning to profile people – and thereby influence their behaviour. In the furore over Cambridge Analytica, Facebook’s own profiling practices largely escaped scrutiny. The company has explored personality profiling, how to manipulate emotions, and how to target people based on psychological vulnerabilities such as when they felt “worthless” or “insecure”.

Google developed a tool to target ads so precisely that it can sway people’s beliefs and change behaviour through “social engineering” – while initially developed to counter Islamic extremism, the tool is publicly available for anyone to (mis)use.

One of the most urgent and uncomfortable questions raised in The Great Hack is: to what extent are we susceptible to such behavioural manipulation? Advertising and propaganda aren’t new, but there is no precedent for targeting individuals in such intimate depth, and at the scale of whole populations.

The push to grab users’ attention and to keep them on platforms can also encourage the current toxic trend towards the politics of demonization. The model may also be helping to fuel discrimination. Companies – and governments – could easily abuse data analytics to target people based on their race, ethnicity, religion, gender, or other protected characteristics.

Ultimately, if these capabilities are as powerful as the companies and their customers claim, they pose a real threat to our ability to make our own autonomous decisions or even our right to opinion, undermining the fundamental value of dignity that underpins all human rights.