February 1, 2021

Privacy in Action: Daniel Miessler, Information Security Thinker


Daniel Miessler is one of the most influential people in the cybersecurity industry. He has written about many areas of information security, including security testing, policy, and privacy, on his website, which he has updated regularly since 1999. Miessler is also the host of the Unsupervised Learning podcast. Feel free to check out his website and podcast if you’d like to learn more about cybersecurity! You can also follow him on Twitter @DanielMiessler.

Maintaining user privacy is an important part of information security. So we’re delighted to interview Miessler for our Privacy in Action series.

Interview with Daniel Miessler

Startpage: What does privacy mean to you?

Daniel Miessler: For me, (data) privacy is part of security, and it means making sure people feel comfortable with how their data is being shared and used online. Being part of security doesn’t mean it isn’t distinct within security, because it definitely is. It just means that it’s not as different as many people imagine.

I like to reduce everything to first principles, and security comes from the Latin for “without worry” (se-cura). To me that absolutely applies to privacy because we want to not worry, as consumers, that our data is being misused by this or that entity.

Startpage: We know confidentiality is one of the components of the CIA Triad of cybersecurity. Is there a difference between confidentiality and privacy?

Daniel Miessler: Great question, and it absolutely relates to the similarity with security in general. The fact that confidentiality is part of the triad is another clear signal that privacy is part of security. Most people think of confidentiality as “making sure only the right people can see your data”, and that’s absolutely the main thread in privacy as well. I’d differentiate privacy by the involvement of the user in setting the policy for who can see the data. Normally the policy that security-types use for protecting data is more implied or assumed, but with Privacy I think the policy needs to come directly from the user.

Startpage: You’re one of the top experts in security testing. How can the red team (offensive security) and the blue team (defensive security) work together to improve the privacy of user data?

Daniel Miessler: First, that’s nice of you to say! I think the main benefit security testing can have on privacy is in looking at how entire platforms and ecosystems can be abused, rather than just specific systems. Normally when someone looks at a security test they’re thinking about a particular piece of tech, and many privacy assessments focus on a specific workflow or IT system. I think we could really benefit from red/blue team mentalities that look at an entire platform, like Facebook for example, and say, “If I were a bad person, what could I do using all the various functionality that the platform offers?”

Startpage: What are some of the things pentesters (a type of security testing where people pretend to be cyber attackers) can focus on to find vulnerabilities that put the privacy of user data at risk?

Daniel Miessler: Pentesters, and especially bounty-focused people who are paid for the impact of vulnerabilities, focus heavily on how bad a breach would be to the user or organization if the issue were made public. So it’s not so much about the severity of the technical bug, but more so about how that bug can be chained for maximum negative impact. The key is to fully explore the platform as a user—to understand it—before you start testing.

Startpage: What’s one of the biggest misconceptions about digital privacy?

Daniel Miessler: The biggest misconception about digital privacy is that it’s fundamentally different from digital security. The primary difference between privacy and security, when it comes to data, is that with privacy we’re explicitly asking the user to set their own policy for who can and cannot possess, view, share, and otherwise use their information. But as you already pointed out, confidentiality of data is already part of the Security Triad, and that already covers making sure only the right people can see one’s data. The real question is who’s deciding who gets to see what, and with privacy we’re saying that decision should be up to the user.

Startpage: What are some of the best things web developers can do to protect the privacy of user data?

Daniel Miessler: Web designers are usually pretty far out of the decision chain for privacy-related questions, but one thing they can do is pay careful attention to how sensitive data is presented. It should be clear when a given dataset, or interface, contains sensitive data, and the UI/UX (user interface, user experience) should take this into account for actions like sharing, downloading, printing, etc. In short, designers can use their interface knowledge to help nudge people into making better privacy-related decisions.


Privacy in Action is a series of interviews with privacy-minded Startpage users from diverse backgrounds. If you are interested in participating in the Privacy in Action series or would like to nominate someone to be interviewed by us, reach out to us at [email protected].

The views expressed in this Q&A are those of the interviewee and do not necessarily reflect those of Startpage.
