Content Type: Examples
In 2015, London's Royal Free, Barnet, and Chase Farm hospitals agreed to provide Google's DeepMind subsidiary with access to an estimated 1.6 million NHS patient records, including full names and medical histories. The company claimed the information, which would remain encrypted so that its employees could not identify individual patients, would be used to develop a system for flagging patients at risk of acute kidney injury, a major reason people need emergency care. Privacy campaigners…
Content Type: Examples
A new breed of market research company is pioneering geoanalytics to uncover financial information. That is, these firms use machine learning algorithms to search for patterns in high-resolution satellite imagery that is refreshed daily and available at a scale of one metre per pixel. Much of the information gleaned this way relies on detecting change or counting items such as cars or trucks parked in a particular location. The resulting financial insights are sold to paying customers such as…
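The counting approach described above can be sketched very roughly in Python. In the snippet below, detect_vehicles() is a hypothetical placeholder for a trained object detector, and daily counts for a single site are collapsed into a week-over-week change signal; this is an illustration of the general idea under those assumptions, not any firm's actual pipeline.

```python
# Rough sketch: aggregate hypothetical per-image vehicle counts into a daily
# series and a week-over-week change signal, the kind of derived indicator
# geoanalytics firms sell. detect_vehicles() stands in for a real detector.
from datetime import date, timedelta
from typing import List, Tuple


def detect_vehicles(image) -> List[Tuple[int, int, int, int]]:
    """Placeholder for an object detector returning one bounding box per vehicle."""
    # A real pipeline would run a trained model over 1 m/pixel imagery here.
    return [(10, 12, 14, 16), (40, 41, 44, 45)]  # dummy boxes


def daily_counts(images_by_date: dict) -> dict:
    """Count detected vehicles for each acquisition date."""
    return {d: len(detect_vehicles(img)) for d, img in images_by_date.items()}


def week_over_week_change(counts: dict) -> float:
    """Compare the latest seven days of counts with the preceding seven days."""
    days = sorted(counts)
    recent = sum(counts[d] for d in days[-7:])
    previous = sum(counts[d] for d in days[-14:-7])
    return (recent - previous) / previous if previous else float("nan")


if __name__ == "__main__":
    today = date(2016, 6, 30)
    imagery = {today - timedelta(days=i): object() for i in range(14)}  # fake daily imagery
    counts = daily_counts(imagery)
    print(f"Week-over-week change in parked vehicles: {week_over_week_change(counts):+.1%}")
```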
Content Type: Examples
In April 2016, Google's Nest subsidiary announced it would drop support for Revolv, a rival smart home start-up the company had bought in 2014. Once support ended, the company said, Revolv's hub devices would cease functioning entirely because they relied on connecting to a central server and had no local-only mode. The decision elicited angry online responses from Revolv owners, who criticised the company for arbitrarily turning off devices that they had purchased. The story also raised wider concerns about…
Content Type: Examples
A new examination of documents detailing the US National Security Agency's SKYNET programme shows that SKYNET carries out mass surveillance of Pakistan's mobile phone network and then uses a machine learning algorithm to score each of the network's 55 million users on their likelihood of being a terrorist. The documents were released as part of the Edward Snowden cache. The data scientist Patrick Ball, director of research at the Human Rights Data Analysis Group, which produces scientifically…
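Mechanically, scoring subscribers in this way means running a classifier over per-subscriber metadata features. The sketch below is not the NSA's model: it uses scikit-learn's RandomForestClassifier on entirely synthetic data with invented feature names, purely to show what such a scoring step looks like and why the tiny base rate of genuine positives makes the resulting labels fragile.

```python
# Illustrative only: a classifier assigns each subscriber a "risk" score from
# metadata-style features. Data, features, and model choice are all synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_subscribers = 10_000

# Synthetic per-subscriber features (e.g. travel, SIM swaps, call patterns):
# illustrative stand-ins, not features taken from the leaked documents.
X = rng.normal(size=(n_subscribers, 3))

# A handful of labelled positives, mimicking the base-rate problem critics of
# such systems raise: almost nobody in the population is a true positive.
y = np.zeros(n_subscribers, dtype=int)
y[rng.choice(n_subscribers, size=7, replace=False)] = 1

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
scores = model.predict_proba(X)[:, 1]  # per-subscriber score

flagged = int((scores > 0.5).sum())
print(f"Subscribers flagged at a 0.5 threshold: {flagged} of {n_subscribers}")
```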
Content Type: Examples
In 2016 the Dutch Data Protection Authority (AP) ruled that the Personal Data Protection Act prohibits companies from monitoring their employees' health via wearables, even when employees have given their permission. The ruling concluded the AP's investigation into two companies; in one of them, wearables even gave the employer insight into its employees' sleep patterns. The AP argued that employers are free to give wearables as gifts, but that the power relationship between employer and…
Content Type: Examples
A 2009 paper by the US National Academy of Sciences found that among forensic methods only DNA can reliably and consistently match evidence to specific individuals or sources. While it's commonly understood that techniques such as analysis of blood spatter patterns are up for debate, other types of visual evidence have been more readily accepted. In 2015 the FBI announced that virtually all of its hair analysis testing was scientifically indefensible, and in 2016 the Texas Forensic Science…
Content Type: Examples
In 2016, researchers discovered that the personalisation built into online advertising platforms such as Facebook makes it easy to invisibly bypass anti-discrimination laws regarding housing and employment. Under the US Fair Housing Act, it would be illegal for ads to explicitly state a preference based on race, colour, religion, gender, disability, or familial status. Despite this, some policies - such as giving preference to people who already live in a particular neighbourhood - work to ensure that white…
Content Type: Examples
In 2017, an automated facial recognition dispenser was installed in one of Beijing's busiest public toilets in order to prevent theft of toilet paper rolls, chiefly by elderly residents. Would-be users must remove hats and glasses and stand in front of a high-definition camera for three seconds in order to receive a 60cm length of paper. Users have complained of software malfunctions that force them to wait, the lack of privacy, and difficulty getting the machines to work. The last of these led the city…
Content Type: Examples
A US House of Representatives oversight committee was told in March 2017 that photographs of about half of the adult US population are stored in facial recognition databases that can be accessed by the FBI without their knowledge or consent. In addition, about 80% of the photos in the FBI's network are of non-criminals and come from sources such as passports. Eighteen states supply driver's licence photos under arrangements with the FBI. In response, privacy advocates and politicians called for…
Content Type: Examples
Few people realise how many databases may include images of their face; these may be owned by data brokers, social media companies such as Facebook and Snapchat, and governments. The systems in use by Snap and the Chinese start-up Face++ don't save facial images, but map detailed points on faces and store that data instead. The FBI's latest system, as of 2017, gave it the ability to scan the images of millions of ordinary Americans collected from millions of mugshots and the driver's licence…
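To illustrate the difference between storing photographs and storing mapped facial points, the sketch below keeps only a small vector of landmark coordinates per enrolled person and matches a new capture by average point distance. The five-point template, threshold, and matching rule are all invented for illustration; real systems use learned embeddings with far more dimensions.

```python
# Illustrative only: store a few facial landmark coordinates per person rather
# than the image itself, and match a probe face by mean point distance.
from typing import Optional
import numpy as np

# Enrolled templates: person -> (x, y) coordinates for a handful of points
# (eyes, nose tip, mouth corners), normalised to the unit square.
templates = {
    "alice": np.array([[0.30, 0.35], [0.70, 0.35], [0.50, 0.55], [0.35, 0.75], [0.65, 0.75]]),
    "bob":   np.array([[0.28, 0.38], [0.72, 0.38], [0.50, 0.60], [0.33, 0.78], [0.67, 0.78]]),
}


def match(probe: np.ndarray, threshold: float = 0.05) -> Optional[str]:
    """Return the enrolled identity whose stored points best match the probe,
    provided the mean point-to-point distance falls under the threshold."""
    best_name, best_dist = None, float("inf")
    for name, tmpl in templates.items():
        dist = float(np.linalg.norm(probe - tmpl, axis=1).mean())
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None


if __name__ == "__main__":
    # Simulate a fresh capture of "alice" with a little measurement noise.
    noise = np.random.default_rng(1).normal(scale=0.01, size=(5, 2))
    print("Matched:", match(templates["alice"] + noise))
```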
Content Type: Examples
By 2017, facial recognition was developing quickly in China and was beginning to become embedded in payment and other systems. The Chinese startup Face++, valued at roughly $1 billion, supplies facial recognition software to Alipay, a mobile payment app used by more than 120 million people; the dominant Chinese ride-hailing service, Didi; and several other popular apps. The Chinese search engine Baidu is working with the government of popular tourist destination Wuzhen to enable visitors to…
Content Type: Examples
For a period between late October and 3 November 2016, the heating and hot water systems in two buildings in the city of Lappeenranta, Finland, were knocked out by a distributed denial of service attack designed to make the systems fail. The systems responded by repeatedly rebooting the main control circuit, which meant that the heating never came on - at a time when temperatures had already dropped below freezing. Specialists in building maintenance noted that companies often skimp on…
Content Type: Examples
In 2015, the Swedish startup hub Epicenter began offering employees microchip implants that unlock doors, operate printers, and pay for food and drink. By 2017, about 150 of the 2,000 workers employed by the hub's more than 100 companies had accepted the implants. Epicenter is just one of a number of companies experimenting with this technology, which relies on Near Field Communication (NFC). The chips are biologically safe, but pose security and privacy issues by making it possible to track…
Content Type: Examples
The payday lender Wonga announced in April 2017 that a data breach at the company affected an estimated 270,000 customers, 245,000 of them in the UK and the rest in Poland. The company sent those it thought were affected messages warning that it believed there may have been illegal and unauthorised access to some of the data in their accounts. Wonga was already controversial because of the high rates of interest it charged, and findings by the UK's financial regulator that it had made loans to…
Content Type: Examples
In 2017, an anonymous whistleblower sent a letter to Green party peer Jenny Jones alleging that a secretive Scotland Yard unit was illegally monitoring the private emails of campaigners and journalists. The letter included a list of ten people and the passwords to their email accounts and claimed the police were using an India-based operation that did the work of hacking emails, shredding documents, and using sex as a method of infiltration. Jones's background includes a decade on the…
Content Type: Examples
In 2017, when user Robert Martin posted a frustrated, disparaging review of the remote garage door opening kit Garadget on Amazon, the peeved owner briefly locked him out of the company's server and told him to send the kit back. After complaints on social media and from the company's board members, CEO Denis Grisak reinstated Martin's service. The incident highlighted the capricious and fine-grained control Internet of Things manufacturers can apply and the power they retain over devices…
Content Type: Examples
A 2017 research report found that the most vulnerable smartphone users - typically the poorest - are also the ones whose devices are most open to fraud and harassment. Cheaper, low-end devices are less secure to begin with, and they are also replaced less often than their more expensive counterparts made by Apple and Google. At any given time there are millions of Android devices that are open to known exploits. Worse, the poorer population that owns these phones is more likely to use them as their sole means of accessing…
Content Type: Examples
Facebook has come under fire after leaked documents revealed the social media site has been targeting potentially vulnerable children. The allegations suggest the company is gathering information on young people who “need a confidence boost” to facilitate predatory advertising practices. Confidential documents obtained by The Australian reportedly show how Facebook can exploit the moods and insecurities of teenagers using the platform for the benefit of advertisers…
Content Type: Examples
In 2017, Uber began a programme experimenting with using psychology and social science insights to influence when, where, and how long its drivers work. Among other techniques, Uber auto-loaded the next fare to encourage the driver equivalent of binge TV-watching; reminded drivers when they were close to their earnings targets to keep them from logging off; and used game-style graphics and small-value awards to keep drivers at the wheel. The company also had male managers adopt female…
Content Type: Examples
Connecticut police have used the data collected by a murder victim's Fitbit to question her husband's alibi. Richard Dabate, accused of killing his wife in 2015, claimed a masked assailant came into the couple's home and used pressure points to subdue him before shooting his wife, Connie. However, her Fitbit's data acts as a "digital footprint", showing that she continued to move around for more than an hour after the time Dabate said the shooting took place. A 2015 report from the National Institute of…
Content Type: Examples
Even after they move out, domestic abusers may retain control over their former residence via Internet of Things devices and the mobile phone apps that control them. Using those tools, abusers can confuse, intimidate, and spy upon their former spouses and partners. Lack of knowledge about how these technologies work means that those who complain are often not taken seriously. Even the victims themselves may believe it's all in their minds; lawyers are struggling to develop language to add to…
Content Type: Examples
A 2017 lawsuit filed by Chicagoan Kyle Zak against Bose Corp alleges that the company uses the Bose Connect app associated with its high-end QuietComfort 35 wireless headphones to spy on its customers, tracking the music, podcasts, and other audio they listen to, and then violates their privacy rights by selling the information without permission. The case reflects many of the concerns associated with Internet of Things devices, which frequently arrive with shoddy security or dubious data…
Content Type: Examples
In 2017, a website run by the Jharkhand Directorate of Social Security leaked the personal details of over 1 million Aadhaar subscribers, most of them old age pensioners who had enabled automatic benefits payment into their bank accounts. Aadhaar is a 12-digit unique identification number issued to all Indian residents based on their biometric and demographic data. Both cyber security agencies and the Supreme Court have expressed concerns over its security,…
Content Type: Examples
In 2015, IBM began testing its i2 Enterprise Insight Analysis software to see if it could pick out terrorists, distinguish genuine refugees from imposters carrying fake passports, and perhaps predict bomb attacks. Using a hypothetical scenario and a scoring system that draws on several data sources, IBM tested the system on a fictional list of passport-carrying refugees. The score is meant to act as a single piece of data to flag individuals for further scrutiny using additional…
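The article gives no detail of the scoring itself beyond it being a single number drawn from several data sources. The snippet below is a hedged sketch of how such a composite flag could work in general - the signal names, weights, and threshold are invented - and is not IBM's i2 Enterprise Insight Analysis logic.

```python
# Hedged sketch: combine several invented signals into a single score used to
# flag an individual for further human scrutiny. Not IBM's actual logic.
from dataclasses import dataclass


@dataclass
class Signals:
    passport_mismatch: bool      # e.g. document checks
    travel_anomaly: bool         # e.g. ticketing or border-crossing data
    watchlist_proximity: float   # e.g. a 0..1 link-analysis score


WEIGHTS = {"passport_mismatch": 0.5, "travel_anomaly": 0.2, "watchlist_proximity": 0.3}
REVIEW_THRESHOLD = 0.6  # arbitrary cut-off for human review


def composite_score(s: Signals) -> float:
    """Collapse the separate signals into a single score used to flag individuals."""
    return (WEIGHTS["passport_mismatch"] * s.passport_mismatch
            + WEIGHTS["travel_anomaly"] * s.travel_anomaly
            + WEIGHTS["watchlist_proximity"] * s.watchlist_proximity)


if __name__ == "__main__":
    applicant = Signals(passport_mismatch=True, travel_anomaly=False, watchlist_proximity=0.4)
    score = composite_score(applicant)
    print(f"Score {score:.2f}: {'flag for review' if score >= REVIEW_THRESHOLD else 'no flag'}")
```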
Content Type: Examples
In 2017, the New York Times discovered that Uber had a secret internal programme known as "Greyball", which used data collected from the Uber app and other techniques to identify and bar regulators and officials from using its service. As the company expanded into new areas, its standard practice was to open up and begin offering rides without seeking regulatory approval first. The company used Greyball to prevent regulators from building a case against the company in areas where…
Content Type: Examples
In 2015 Hong Kong's Face of Litter campaign used DNA samples taken from street litter and collected from volunteers to create facial images that were then posted on billboards across the city. The campaign, conceived by PR firm Ogilvy & Mather and organised by online magazine Ecozine and the Nature Conservancy, was intended to give a face to anonymous Hong Kong litterbugs, raise awareness of the extent of littering in the city, and encourage people to…
Content Type: Examples
For some months in 2017, in one of a series of high-risk missteps, Uber violated Apple's privacy guidelines by tagging and identifying iPhones even after their users had deleted Uber's app. When Apple discovered the deception, CEO Tim Cook told Uber CEO Travis Kalanick to cease the practice or face having the Uber app barred from the App Store.
External Link to Story
https://www.nytimes.com/2017/04/23/technology/travis-kalanick-pushes-uber-and-himself-to-the-precipice.html
Content Type: Examples
In December 2016, the French spy agency Direction Générale de la Sécurité Intérieure signed a 10 million euro contract buying access to Palantir's Gotham software. French politicians have voiced concerns over the software as France pushes to become more technologically independent.
Publication: EU Observer
Date: 9 June 2017