Data spills of this sort would not happen if there were actual privacy, if the organizations involved lacked any records to lose.
But the panelists weren't there to ponder the absence of information in some dream world where netizens have complete control over the data that describes them. They were not looking to explore privacy as defined in a dictionary: "the quality or state of being apart from company or observation." Nor were they looking at tools like messaging client Signal that can actually provide some measure of privacy by denying data to those who'd gather it.
Rather, they chose to focus on the parameters of lawful data usage, on the ways companies can handle data with informed notice and consent from customers.
In other words, the focus was compliance rather than abstinence.
"Companies increasingly look at privacy and security as a cost of doing business, and those on the leading edge think of it as an opportunity," said Kimberly Nevala, strategic advisor at software biz SAS.
The word "opportunity" here means using data to compete more effectively, to generate more revenue for one's business. Data has been likened to oil as a commodity that fuels business growth. Hence for-profit companies willing to forgo data collection are few and far between.
Data from global IT biz Unisys suggests some forbearance might be wise. In a survey of more than 1,000 US adults described in the 2018 Unisys Security Index, the firm found: 42 per cent of survey respondents don't want their health insurance provider using fitness data from wearable devices to influence premiums or incentivize behavior; 38 per cent don't want police to determine their locations from their wearable fitness devices (good luck with that); and 27 per cent dislike the idea of baggage sensors interacting with airport baggage management systems to track bags and send text updates.
The results show folks are fearful of these technologies because they feel ill-equipped to prevent potential online abuse, said Tom Patterson, chief trust officer of Unisys, in a statement.
Legal types like to talk about how companies should obtain informed consent to collect data. But Nevala called the notion into question with her observation that technologies like artificial intelligence complicate matters by obscuring how information gets used.
"As a consumer, you can't give informed consent because you don't know how data will be used or combined," she said, in reference to the largely inscrutable decisions of machine learning algorithms.
She suggested some data uses ought to be viewed as toxic. "We don't allow lead paint," she said. "There should be some uses of information we just don't abide."
Velasquez added that consumers need to be motivated to become informed. She likened privacy to health, noting that it tends to be ignored until it causes pain. Your doctor can warn you to live a healthy lifestyle, but many people won't pay attention until they experience chest pains, she said.
The discussion inevitably turned to privacy laws, and to the business community's desire for a federal law in the US that would override the emerging patchwork of privacy legislation at the state level.
Karen Zacharia, chief privacy officer at Verizon, declined to describe the features that should be present in federal privacy legislation but she said, "It's important that we have a consistent regime that applies to all players in the ecosystem, enforced by the FTC."
The Register asked Zacharia and Kalinda Raina, head of global privacy for LinkedIn, whether a federal privacy law should include the right for individuals to sue companies for failing to live up to privacy promises – a private right of action included in the Illinois Biometric Information Privacy Act and opposed by many companies.
Both were non-committal. Raina suggested that GDPR-style fines – the result of legal action brought by government authorities rather than consumers – work better to encourage responsible data handling by companies. Zacharia said a personal right to sue might not be the best way to protect consumers.