Including Privacy in Human Interaction by Design
As users, we value a personalized experience in which we are presented with offers and results relevant to us, ideally at just the right time, exactly when we need them. Today, we encounter these services as personal assistants such as Google Now, Siri, and Cortana, which provide us with electronic tickets, traffic updates, and travel predictions.
When it comes to privacy, our behavior is contradictory. We are willing to give away some private data in exchange for these free services, yet many of us admit we are wary when it comes to sharing private data, the very building block that enables personalized services. Then again, services with privacy features at their core seem to lead a niche existence at best, far from mass adoption. Examples include the social network Ello, messenger services like Cryptocat and Zendo, and the Blackphone, a privacy-focused smartphone. Everyday users prefer the ease of Facebook Login to more anonymous alternatives and favor WhatsApp over competing messenger apps with better encryption and privacy features.
As Big Data moves forward to include information fed from an abundance of nearly omnipresent sensors, cameras, and microphones, a vast amount of human interaction can be captured. From its analysis, patterns of human behavior emerge. As users, we understand that the accuracy of these patterns determines just how useful Big Data-reliant services are. We are not averse to this reciprocity: to get something personal, we have to share something personal.
But when data collection goes beyond that, users feel their privacy — and their right to privacy — is being infringed upon. Services collect more information than they need to fulfill their purpose; data correlation can single out individuals despite anonymous collection; companies snoop on the data, messages, or images users transmit and pass collected data on to third parties, be it advertisers, other corporations, or even governments. All of this raises privacy concerns.
From a user perspective, two things seem certain:
we cannot expect corporations to uphold our privacy demands. Big Data enables a wealth of innovation and improved user experience on one side, but brings great possibilities for monetization tied to privacy infringement on the other;
we cannot expect governments, lawmakers, and policymakers to keep up with the pace of invention and innovation. Technology companies move with a can-do attitude, implementing whatever is possible.
The duality of users being simultaneously served and used will remain in the interest of technology companies. Faced with data collection at the massive scale of the Internet of Things, how can users regain control of their privacy?