The U.S. Constitution also upholds the ethos of privacy. The Fourth Amendment encapsulates the idea of protection against unreasonable searches and of a person’s home being their protective castle. The amendment states:

“The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”

So it seems fair to say that privacy is a basic human right and a need. As we become embedded in the digital age, this inherent right to a private life comes along for the ride. However, this right is all too often ignored and abused.

Examples of the modern-day flouting of privacy are everywhere. The International Computer Science Institute looked at the privacy practices of around 5,000 Android apps aimed at children – apps with millions of downloads between them. It found evidence that most of the apps collected and shared the Personally Identifiable Information (PII) of children without parental consent.

Social media is almost a poster child for privacy violations. Facebook has set the bar as low as it can go in terms of disrespect for personal data. Its latest misdemeanor looks set to bag it a $5 billion fine from the Federal Trade Commission (FTC) for use of personal data without express consent – a penalty rooted in an agreement Facebook made with the FTC back in 2011.

Privacy is now the new security. The Snowden revelations opened a hornet’s nest, and the sting continues to cause pain. But privacy is not security, so what is it?

What is Privacy All About and Why Is It Important?

Online and data privacy can be nuanced. It can mean several things that overlap and sometimes cause arguments. In a nutshell, privacy is about choosing whom you share your data with and determining what they do with that data.

The use of a VPN, for example, allows you to control your visibility when using the internet. Security mechanisms like encryption add ways to enforce the choices made. But within that definition lie many variants and complications. I’ll give you an example of these nuances.

Within the privacy community, there is an ongoing discussion around the monetization of data. Is making money out of personal data a good or a bad thing? If we choose to receive payment when platforms process our data, will this then end the privacy malpractices of such platforms?

The trouble with commoditizing something that is so personal is that it can end up being misused. Data and online privacy need to be distinct from any structures that can be used to create ‘tiers of privacy’. Privacy, after all, is a right for all, irrespective of social class, religious belief, gender, culture, etc.

Data and online privacy can be thought of as an extension of our need to keep control of our lives. When we lose control of our digital privacy, we can end up with spiraling issues around surveillance, harassment, and even the misuse of data by legitimate companies.

Privacy is not security but the two are related. Good security used in the right way can augment privacy. Poor security can create a false sense of privacy.



What is Privacy by Design?

Privacy also needs practical application. Privacy within a digital system has ‘rules of engagement’. We know that privacy is about choosing who receives your personal data and how they use it. And personal data can be almost anything.

It is ultimately any identifying information that can be used to pick out an individual from the digital crowd. The obvious data items are name, address, date of birth. But other data such as geo-location, IP address, and biometrics also need to be covered when looking at how to respect your customer’s privacy.
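One common way to reduce the exposure of identifiers like these is pseudonymization: replacing direct identifiers with keyed hashes before data is stored or shared. Here is a minimal sketch in Python; the field names and the key are hypothetical, and a real system would keep the key in a secrets manager.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice this would come from a secrets manager.
PSEUDONYM_KEY = b"replace-with-a-real-secret"

# Fields treated as Personally Identifiable Information (PII) in this sketch.
PII_FIELDS = {"name", "address", "date_of_birth", "ip_address"}

def pseudonymize(record: dict) -> dict:
    """Replace PII values with keyed hashes, leaving other fields intact."""
    result = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            digest = hmac.new(PSEUDONYM_KEY, str(value).encode(), hashlib.sha256)
            result[field] = digest.hexdigest()[:16]  # truncated for readability
        else:
            result[field] = value
    return result

record = {"name": "Alice Example", "ip_address": "203.0.113.7", "plan": "basic"}
print(pseudonymize(record))
```

A keyed hash (HMAC) rather than a plain hash matters here: without the key, an attacker could simply hash every likely name or IP address and reverse the pseudonyms.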

When a software product, system, or service is first specified, it should have privacy as a central design remit. This is the basic ethos behind the concept of Privacy by Design (PbD). Ann Cavoukian, former Information and Privacy Commissioner of Ontario, was the architect behind PbD. She set out seven guiding principles to apply when designing digital systems.

I’ll list them here to give you a flavor of the breadth and scope of the principles of PbD:

  1. Proactive not Reactive; Preventative not Remedial
  2. Privacy as the Default
  3. Privacy Embedded into Design
  4. Full Functionality – Positive-Sum, not Zero-Sum
  5. End-to-End Security – Lifecycle Protection
  6. Visibility and Transparency
  7. Respect for User Privacy
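Principle 2, ‘Privacy as the Default’, means a user should get maximum privacy without having to do anything. As a minimal illustration of what that might look like in code (the setting names here are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    """Hypothetical user settings: every sharing option is off by default,
    so a user who never opens the settings screen gets maximum privacy."""
    share_location: bool = False
    share_usage_analytics: bool = False
    personalized_ads: bool = False
    allowed_recipients: list = field(default_factory=list)  # opt-in, starts empty

def new_user_settings() -> PrivacySettings:
    # Privacy as the Default: sharing happens only after an explicit,
    # informed opt-in by the user, never by silent default.
    return PrivacySettings()

print(new_user_settings())
```

The design choice is simply that opting *in* requires action while opting *out* requires none – the reverse of the dark-pattern defaults the principles were written to prevent.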

Adding in the layers needed to create a truly privacy respectful digital system is not easy. But it is worth the effort. Digital life is about sharing digital data. And sharing digital data is part of a wider effort to build relationships.

Any commercial entity will tell you that the relationship with a customer is the key to success – the saying “The customer is king” came out of this truism. Building trust begins with being respectful towards the data that represents your customer in cyberspace.

Ann Cavoukian’s PbD principles document signs off with this statement:

“Respect for User Privacy goes beyond these FIPs, and extends to the need for human-machine interfaces to be human-centered, user-centric and user-friendly so that informed privacy decisions may be reliably exercised.”

How Consumers View Data Privacy

How we, as consumers, view privacy has been examined in a variety of research projects. One thing is certain: each of us has our own elastic measure of how far our digital privacy can be stretched. Some people are happy to be totally open about the most personal of things on social media, while others are wary of posting anything but the most impersonal information.

One thing that is hitting home with consumers is that their online data is not safe and can be used to commit fraud. Security incidents like the Equifax and Uber data breaches of 2017 make headlines, and the breaches keep on coming. Gemalto keeps a running count of data records lost or stolen; since 2013, around 14.7 billion records have been exposed.

This level of data exposure, including the high-profile cases, means that the public now has a fairly good understanding of security issues and the privacy consequences that flow from them.

A 2018 study “Global data privacy: What the consumer really thinks” presented some interesting findings:

  • Around half of consumers were found to be ‘pragmatic’ about privacy – they will share data if they see a benefit in doing so.
  • 38 percent of consumers felt they were responsible for their own data security.
  • Trust in a company was deemed one of the most important factors in data-sharing choices.

Age can also have an impact on how digital privacy is viewed. In a study on the use of Facebook by older adults, the results showed that older adults were less likely to share personal data on Facebook.

A further study, which looked at the data-sharing views of connected-car users, found differences in attitudes to privacy between men and women. When asked whether they would decline to use an app because of data-sharing concerns, 73 percent of women said ‘Yes’, compared with 61 percent of men.

As mentioned earlier, trust is a key determinant in data-sharing choices. A study by Akamai found a number of trust indicators around the privacy of data. For example, just under half of consumers would ‘forgive’ a company for a data breach if they were informed of it immediately.


Privacy and the Law

The heightened awareness of data and online privacy is being reflected in our laws and regulations too. Possibly the best known of these is the General Data Protection Regulation (GDPR). The GDPR came into effect on May 25, 2018, replacing the EU’s earlier data protection law, the Data Protection Directive.

Laws like the GDPR have come about because digital service providers were not dealing with personal data in a secure or privacy-respectful way.

Laws like this and the California Consumer Privacy Act (CCPA) have been enacted to attempt to create a more privacy-respectful consumer environment – one where choice about data sharing is key. We should expect more laws of this nature to come into effect in the coming years.

How Reputation is Impacted When Trust is Lost

So, what happens when privacy is disrespected?

When data breaches occur, or your system does not abide by the rules of privacy engagement, trust breaks down. Trust isn’t just about exposed data. Trust is also about saying one thing and doing another. The Facebook fine mentioned at the beginning illustrates this: Facebook told users it would not share their data without consent, then went and shared it anyway.

Facebook may not even wince at a $5 billion fine, but if its users leave through lack of trust, then it may sit up and take notice. The #deletefacebook campaign, which began as a result of the Cambridge Analytica scandal, is a case in point: around 1 in 20 British people deleted their Facebook account. Facebook still persists and thrives, but eventually, disrespect for the very data it depends on may come home to roost; a growing tide of anti-Facebook sentiment across several sites, including Reddit, is trying to topple the giant.

The statistics should provide the rest of us with evidence enough to offer our customers data privacy: Semafone found that over 86 percent of customers would take their business elsewhere after a data breach.

The fallout when trust is lost is costly, and in a competitive world, reputation is as important as share price. As Warren Buffett put it:

“It takes 20 years to build a reputation and five minutes to ruin it. If you think about that you’ll do things differently.”

Privacy is something that we human beings understand at a deep and intrinsic level. Most of us need our own personal space, to lie in our own bed and to choose who lies there with us. Once we move from the real world into the digital one, those ideas of privacy seem to become blurred with commercial needs.

Finding a balance between personal privacy of digital data and commercial transaction requirements is tricky. But to build strong relationships you need to build mutual trust.

Trust is built on respect. Being privacy-respectful is part of the relationship-building exercise that digital systems need to perform in a world that devours personal data. The seven principles of Privacy by Design can guide you when building digital systems that make great relationships that work.
