How much do you trust big tech companies? If the answer is “very little”, there are few who could argue with that assessment.
There’s an old saying that if the service is free then you are the product. Or, in most cases, your data. As numerous tech companies have demonstrated, there is money to be made from data.
We put our trust in these companies, and it has rarely been rewarded. From the Facebook debacle with Cambridge Analytica to the revelations that smart speaker companies had used human reviewers to listen to recordings, there have been too many incidents in recent months that reinforce the idea that trusting your data to private companies is not the best idea we’ve ever had.
It’s not surprising that people are becoming more aware of the implications of data security – and that companies are starting to see the benefits of making privacy a feature to shout about rather than something to tolerate.
Think about the phone in your pocket. It is no longer a simple means of making and receiving calls. It tracks your location, monitors your activity, looks after your financial information. We have apps that replace loyalty cards, giving up valuable data on our shopping habits. Our credit and debit cards live in mobile wallets, ready for us to pay for goods and services with contactless technology. Your phone knows who you contact most often, what you spent your last few euro on and the places you visit most frequently.
Our devices are becoming ever more intertwined with our daily lives. Activity trackers monitor your health, sharing information with healthcare professionals and in some cases insurance companies.
Smart speakers, for example, have become an increasingly common sight in homes, listening for the “wake word” that tells them it is time to spring into action.
From there, your device can control your connected home security system.
The problem is not only the amount of control the speakers have over your home, but also what they hear.
Apple apologised for allowing contractors to listen to voice recordings from Siri users
Amazon was the first to come to public attention, when it was reported that the company sent snippets of voice commands to human reviewers to help improve the accuracy of the service. Although people may have known that their recordings would be used for this purpose, very few probably realised the extent of the human involvement. Amazon said it only used “an extremely small sample” of Alexa voice recordings to improve the customer experience.
Amazon now allows people to opt out, but it took some public outcry before the tech company changed its course.
It seemed no company was immune. Google recently introduced a new policy that will see users of Google Assistant opt in if they want to have their voice recorded or reviewed by humans through its Voice & Audio Activity (VAA) programme.
In August, Apple apologised for allowing contractors to listen to voice recordings from Siri users, as part of an evaluation process for its digital assistant.
In a report by the Guardian, former graders claimed accidental activations were regularly sent for review. Among the clips were recordings of confidential information, illegal acts, and even sexual activity.
Following a review, Apple stopped the programme, and said it would restart it later in the year as an opt-in initiative with Apple employees instead of contractors. Recruitment for those positions began last month.
“As a result of our review, we realise we have not been fully living up to our high ideals, and for that we apologise,” the company said at the time.
It was a rare misstep for Apple, which has in recent years pushed the idea that privacy should be at the heart of all its products. From its mapping software to its mobile payments, the company keeps customer data close to its chest.
For security consultant Brian Honan, Apple is one of the top companies when it comes to data privacy, chiefly because it refuses to share such data and because of its refusal to build vulnerabilities into its software to satisfy law enforcement requests.
The company has made it central to its strategy. When it unveiled its latest News+ and TV+ apps in March, Tim Cook stressed that advertising would not be allowed on the new services, and that Apple would not be passing data to third parties to allow them to sell its subscribers more products.
In an op-ed for Time magazine earlier this year, Cook once again laid out his views. He described 2019 as the time to stand up for the right to privacy, and called for the US Congress to pass federal privacy legislation that minimises personal data and gives consumers the right to know what data is being collected and why, the right to access that data and the right to data security.
“Consumers shouldn’t have to tolerate another year of companies irresponsibly amassing huge user profiles, data breaches that seem out of control and the vanishing ability to control our own digital lives,” he wrote. “This problem is solvable – it isn’t too big, too challenging or too late. Innovation, breakthrough ideas and great features can go hand-in-hand with user privacy – and they must. Realising technology’s potential depends on it.”
You decide what data you want to share and with whom. Apple cannot access any information that directly identifies you
Apple, for its part, says privacy is a “fundamental human right” and is keen to point out that users of its products keep ownership and control over their data.
When the company announced that it would add to its series of health studies with research on hearing, women’s health, mobility and heart health, it immediately reassured customers that it would only be with their express permission.
“You decide what data you want to share and with whom. Apple cannot access any information that directly identifies you,” Apple’s VP of health Sumbul Desai said.
Cook’s op-ed is not the first time Apple has made a pitch as a privacy protector. The company went up against federal investigators in the US in 2015 when it refused to build a vulnerability into its mobile software so the FBI could unlock a phone used by Syed Farook, one of the suspects in the San Bernardino shooting.
Into the wild
Its reasoning was sound: building software that would allow authorities to bypass its security would put millions of users at risk, and the chance of it getting into the wild was one that Apple wasn’t willing to take. If authorities could get around the security protections, there was nothing to stop bad actors from doing likewise.
Another company that makes it to the top of Honan’s list for its commitment to privacy and security is Microsoft. Despite its Skype troubles (the service was touted as a machine-learning translator that could work in real time, but it later emerged that human contractors were listening to some conversations), Microsoft has built up goodwill from its opposition to the US government’s attempt to force it to turn over data from a server located in Ireland. The case began in 2013 when Microsoft opposed an order from the authorities to hand over emails stored on that server, and subsequently appealed the verdict ordering it to comply with US legislation.
However, the case was later abandoned when, ahead of an appeal in the Supreme Court, Congress passed the Cloud Act (Clarifying Lawful Overseas Use of Data Act), which resolved the concerns of both the government and Microsoft.
Microsoft has also won some praise for its approach to its Windows 10 software, which, while not perfect, is more secure than previous versions of Windows the company has released.
There is a difference, however, between security and privacy, Honan points out.
“You can have very secure windows in your bathroom, but if you don’t have curtains or frosting on the glass, there’s not very much privacy,” he said. “Likewise, if the window is left open, it’s not very secure.
“They’re very closely related, they have to work together to protect people’s data,” said Honan. “Don’t mistake one for the other.”
One thing that needs to be taken into account is the role that the authorities have played in forcing the privacy agenda. While we may be sick of hearing the data protection excuse when trying to deal with companies on a daily basis, the introduction of GDPR last year has certainly had an impact.
We still see a lot of bad practice regarding GDPR. It demonstrates they don’t know what GDPR is about
“GDPR has been a very positive move in the protection of individuals’ privacy rights,” said Honan. “People are more aware of their individual rights.”
However, there is still a long way to go, with May 25th, 2018 only the beginning rather than D-Day for data protection. In places, the implementation of the new regulations leaves much to be desired, with stories of visitor books and waste-paper bins being removed from public buildings under the guise of GDPR.
“We still see a lot of bad practice regarding GDPR. It demonstrates they don’t know what GDPR is about,” said Honan.
The focus on data privacy has helped create new opportunities for Irish companies, too. Start-up Evervault, which was founded by 19-year-old Shane Curran, has just raised $3.2 million from tech investor Sequoia. The company is aiming to turn privacy into a product, with a cloud-based secure processing service aimed at developers that allows them to bake privacy into their products without having to change how they build their software.
“Our outlook is retooling the developers and the creators of the next couple of decades to basically embed data privacy from day one. As a company, we’re basically working to make data privacy simple and accessible for all,” Curran said.
The company is planning to expand in Ireland and get its product built using the money it raised from Sequoia and other investors such as Kleiner Perkins and Frontline. It has the potential, its founder thinks, to make a significant impact.
In the meantime, companies that have found themselves in the spotlight over data privacy, such as Facebook, are taking steps to rehabilitate their reputations. Chief executive Mark Zuckerberg announced plans to introduce end-to-end encryption in both Messenger and Instagram messages, making them more secure and keeping private messages away from prying eyes.
It’s ironic that just as companies are realising the value of privacy to their customers, the authorities may be taking steps to weaken the most effective tool: encryption. The US, UK and Australia have called on Facebook to abandon the plan, citing the need to fight criminals, terrorists and child abusers.
In an op-ed published by the Guardian, former NSA contractor Edward Snowden criticised the push against encryption and said cutting governments off from Facebook’s “convenient trove” of private lives would force government surveillance to become “more targeted and methodical”.
“By limiting the amount of personal records and intensely private communications held by companies, governments are returning to classic methods of investigation that are both effective and rights-respecting, in lieu of total surveillance,” he said. “In this outcome we remain not only safe, but free.”