
Consequences of Identity Done Wrong

In this episode of Cybersecurity (Marketing) Unplugged, Maler also discusses:

  • Creating an architecture for meaningful consent;
  • Social sign-on, single sign-on and the battle between convenience and security;
  • A real path to passwordless authentication;
  • Decentralized identity and identity proofing. 

Eve Maler is the distinguished CTO at ForgeRock. Prior to joining ForgeRock, she served as the principal analyst on identity and access management, authentication and API security at Forrester Research. She was also an identity solutions architect with PayPal and a technology director at Sun Microsystems where she co-founded and made major contributions to the SAML standard. In a previous life, she co-invented XML.

Maler comments on the importance of new policies regarding identity, especially in light of the ramifications playing out in the case of Afghan biometric databases:

We’re being challenged by the fact that they had used biometrics and had digital identity information. And [they] were now concerned pretty recently about, you know, ‘How do I wipe my profile? Because I don’t want to be compromised by the fact that you can’t revoke your fingerprint or your face.’ … Identity is getting more consequential with every passing day.

Full Transcript

This episode has been automatically transcribed by AI; please excuse any typos or grammatical errors.

 

Steve King 00:04
Welcome to Cybersecurity Unplugged, the CyberTheory podcast where we explore issues that matter in the world of cybersecurity. Good day, everyone. I’m Steve King, the managing director of CyberTheory. And today’s episode is going to focus on today’s challenges in identity proofing, access management and excessive trust. Joining me today is Eve Maler, the distinguished CTO at ForgeRock, who is as highly regarded as anybody in the IAM space. On her way to ForgeRock she served as the principal analyst on identity and access management, authentication and API security at Forrester Research. She was an identity solutions architect with PayPal, and a technology director at Sun Microsystems, where she co-founded and made major contributions to the SAML standard. In a previous life, she co-invented XML. So let me repeat that: in a previous life, she co-invented XML. In her spare time, perhaps even more impressively, she is the music director, co-founder and sometimes the lead singer of the rock band she organized, ZZ Auth and the Love Tokens, finding yet another channel through which to drive her message home. So welcome, Eve. I’m glad you could join me today.

Eve Maler 01:34
I am so delighted to be with you, Steve.

Steve King 01:36
Thank you. So let’s jump into this and talk to me about consent. We know that internet-enabled relationships raise all kinds of issues around identity, you know, and authentication can serve as sort of a poor descriptor for the type of relationship trust assessment people perform in these circumstances, when they’re talking to someone and they have no idea who they’re talking to. We have all argued, you have argued, that a mutual agency relationship makes more sense, and you propose the right-to-use license for access permissions as a practical alternative to consent, at least consent as we use it today. Can you expand on that a little bit and help our listeners understand what we’re talking about here?

Eve Maler 02:24
Absolutely. Well, of course, so many folks who are working in IT, making enterprise digital realities happen, understand that consent is important as a term of art, right? I mean, it’s becoming an increasing part of all of the privacy regulations across the globe: GDPR and all of its cousins, CCPA and its follow-ons. And that’s because we’re kind of in this era of what I think of as data privacy 2.0. You know, we’ve got sort of a new flowering of the importance of what is technically known as the data subject in all of these equations. In data privacy 1.0, as in, for example, the Data Protection Directive in the EU, the data subject was really kind of this passive entity to whom things happen; you know, a “we’re going to protect your data, don’t worry about it” sort of approach. In data privacy 2.0, we do see that asking people what they want, and, you know, trying to do it, is becoming more important. Nonetheless, consent is pretty well defined in legal terms and experiential terms, often with this kind of opt-in or opt-out. And what that means is, the person is kind of the last one asked to the table: there was an offer, and they get to accept or not accept, and the more coarse-grained that opportunity is, the less meaningful it is. It’s not very meaningful consent to say, well, you’ve got to agree to this or you can’t use our service, or, you know, you’ve got to agree to share this information or the service just doesn’t work, even if that’s maybe marginally not true. GDPR has this notion of, you know, the legitimate interests of the business in using information, and that’s even without consent. So the whole thing is a little bit suspect from the perspective of somebody who wants some control. So the data privacy 2.0 kind of architecture that I have in my head is kind of a pyramid where data protection is the baseline; data transparency, where somebody tells you what they know about you and what they want to know about you, is the next requirement; and then data control, giving people the knobs and buttons, is where it gets meaningful. And that probably means, ultimately, that we get to a point where we can have people choose what they want to share, and do that without external influence. And that opt-in, opt-out, “right now, give me an answer” approach is part of the challenge.

Steve King 04:43
Yeah, so in your mind, what would an architecture for that look like?

Eve Maler 04:51
I do have something in mind, maybe no surprise. A good example of how to enable people to make those sharing choices without external influence is User-Managed Access, or UMA, which is an OAuth-based standard that I founded and have been working on for quite some time now. And what it changes is that it enables a kind of framework where, you know how if you go into Google Docs or similar, you have a share button (not a social share button, a share button), and it lets you choose what, about the thing that you’re in, you want to share? That is what a UMA-type model enables: putting a share button on any service you have, to share not just with other applications that you might be using (that’s OAuth, that’s a big kind of innovation, and it’s a great one) but actually sharing with other people. So it’s kind of a person-to-person thing, it’s an independent control thing, it’s an autonomy thing. And this is becoming more popular as a model. For example, if you look at the healthcare regulations in the US, and the Pensions Dashboards Programme in the UK, it’s not just about getting access to your information and seeing it conveniently, but sharing it with the people in your life, whether they’re financial advisors or caregivers or, you know, adult children who are going to help you do something if you’re elderly and perhaps more of an offline person. So that is kind of an empowering model that’s becoming more of a reality, partly in regulations, and partly the regulations are driven by consumer sentiment and demand.
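To make the share-button model Maler describes more concrete, here is a minimal Python sketch of a UMA 2.0-style exchange. The authorization server URL, endpoint paths, token values, and resource details are placeholders invented for illustration; a real deployment publishes its endpoints in a discovery document, and the owner's sharing policy lives at the authorization server.

```python
# Minimal sketch of a UMA-style sharing flow, assuming a hypothetical
# authorization server at as.example.com and placeholder tokens and paths.
import requests

AS_BASE = "https://as.example.com"    # authorization server (assumed)
PAT = "protection-api-token"          # token the resource server holds (assumed)

# 1. The resource server registers Alice's resource so the AS can govern sharing.
resource = requests.post(
    f"{AS_BASE}/uma/resources",
    headers={"Authorization": f"Bearer {PAT}"},
    json={"name": "Alice's glucose readings", "resource_scopes": ["view", "share"]},
).json()

# 2. When Bob's client arrives without a token, the resource server asks the AS
#    for a permission ticket describing what was attempted.
ticket = requests.post(
    f"{AS_BASE}/uma/permissions",
    headers={"Authorization": f"Bearer {PAT}"},
    json={"resource_id": resource["_id"], "resource_scopes": ["view"]},
).json()["ticket"]

# 3. Bob's client trades the ticket for an access token; the AS issues one only
#    if Alice's sharing policy says this requesting party may "view".
#    (Pushing Bob's claims to the AS is omitted here for brevity.)
token = requests.post(
    f"{AS_BASE}/oauth2/token",
    data={"grant_type": "urn:ietf:params:oauth:grant-type:uma-ticket", "ticket": ticket},
).json()
print("Requesting-party token:", token.get("access_token"))
```

The design point is the one she highlights: the person granting access sets policy at the authorization server ahead of time, so the requesting side can be another person, not just another app.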

Steve King 06:21
Isn’t one of the challenges in getting this down to the sort of everyday data level, where I want to share a Google Doc with you, that I have one more step now that I need to perform to say, yeah, Eve is cool, I’m going to share with her?

Eve Maler 06:40
I mean, there’s still identity involved in all those pictures, and when it’s person to person, it particularly matters. But the idea of people being able to choose for themselves whether to share is, I almost want to say, a newer conception of privacy than this passive data protection, keep-everything-in vision. And the regulations are kind of starting to go a little bit in that positive direction, which is letting people share, and then ideally letting them be able to mitigate the results of what they just did. And that’s where UMA actually has this additional approach that lets you kind of have a unified dashboard or monitoring station of what you did share across many different sources of digital information, or even, you know, APIs leading to IoT devices, or whatever it might be. Being able to give access, which we often do with the Google Docs share button, right? You know, we’ll say, “Oh, you got edit rights. Oops, that’s wrong. Let me go change that,” if that’s inappropriate, or cut it off entirely. We have some great examples of user experiences for how to do this.

Steve King 07:43
Yeah, what a weird concept, actually putting accountability at the individual level, right? Suddenly somebody says to you, right, it’s up to you, and whatever consequences result, there you are.

Eve Maler 07:59
And you know, some healthcare providers have a little bit of a challenge with this, because they’re so used to, you know, sort of being your everything when it comes to your health care, and they’re dubious about this. What it’s known as, at least in the US health care conversation about regulations and so on, is data blocking, or information blocking: not letting somebody see or share data that’s about themselves and their bodies.

Steve King 08:24
Yeah, an interesting reflection on the control mechanisms and what’s going on with those people, too. I think all of this stuff is always reflective of our culture and society at any given moment.

Eve Maler 08:37
Absolutely. I think there are disruptive changes going on now around trust. You know, you mentioned excessive trust; I think people have been starting to be burned, as enterprises have been burned, with excessive trust. That’s why we have things like zero trust, which is kind of an interesting concept.

Steve King 08:51
Oh, yeah, no kidding. And we can talk about that in a little bit here. All right, you may arguably be called the godmother of identity and access management. And I know when you were at Sun, you were the founding chair at OASIS, and the result of that was SAML, the Security Assertion Markup Language, for single sign-on and identity federation. Can you provide some context so our listeners can understand the significance of the Security Assertion Markup Language in our everyday internet interactions?

Eve Maler 09:26
Absolutely. So SAML, and also its more modern-day equivalent, to a degree, OpenID Connect, really govern a lot of the logins that we experience today. SAML is more for kind of workforce members, so employees might experience signing in to their day job at work, and then sort of being redirected over to some other application like Workday, or, you know, a benefits provider, things like that. SAML is the underlying piece that makes that single sign-on happen across domains, across different companies, securely and safely. And it was the first thing to sort of achieve that cross-domain thing, which is tricky. OpenID Connect, as I say, is the modern equivalent; it came after, and it’s based on this OAuth technology that I mentioned that lets you kind of sew together some application you’re using and an API or service. And it’s more mobile-friendly, and more consumers experience it, because they might see things that we usually call social sign-in, which is kind of a subset of single sign-on. And that’s where it says “Log in with Google” or “Sign in with Facebook” or similar. Those are where, you know, some awesome service you want to use says, “I don’t want to store these credentials; I want to get to know them, build a trusted digital relationship with them, but I’m happy to let Facebook, which is really good at this, do this job.” You know, it’s got sort of fraud detection built in, and so on. So in the parlance of identity folks, we would say that in that case Facebook would be the identity provider, asserting identity information, essentially, like, “Yeah, they authenticated, they logged in at this strength this time, they did it this way,” and then the other side is known as the relying party, or the service provider. And it’s a very powerful pattern.
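For readers who want to see the shape of that relying-party pattern, here is a minimal sketch of the OpenID Connect authorization-code request behind a "sign in with ..." button, using only the Python standard library. The identity provider endpoint, client ID, and redirect URI are placeholder values, not any particular provider's.

```python
# Minimal relying-party sketch: build the OpenID Connect authorization request
# that the browser is redirected to when the user clicks "sign in with ...".
import secrets
from urllib.parse import urlencode

AUTHORIZE_ENDPOINT = "https://idp.example.com/authorize"  # identity provider (assumed)
CLIENT_ID = "my-relying-party"                            # issued at registration (assumed)
REDIRECT_URI = "https://app.example.com/callback"

def build_login_url() -> tuple[str, str, str]:
    """Return the login URL plus the state and nonce the relying party must remember."""
    state = secrets.token_urlsafe(16)   # CSRF protection, checked on the way back
    nonce = secrets.token_urlsafe(16)   # bound into the ID token to prevent replay
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid profile email",  # "openid" is what makes this OIDC, not bare OAuth
        "state": state,
        "nonce": nonce,
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}", state, nonce

url, state, nonce = build_login_url()
print(url)
# After login, the identity provider redirects back with ?code=...&state=...;
# the relying party verifies state, exchanges the code at the token endpoint,
# and validates the ID token's issuer, audience, signature, and nonce before
# trusting the login.
```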

Steve King 11:14
I guess with the old notion of convenience versus security, that train left the station a long time ago, right? We’re perfectly willing to take the easiest path possible to do stuff. I mean, if we get to the point where we’re really trying to remove the excessive trust landscape from identity proofing, aren’t we going to need to impose more work, if you will, on behalf of folks that are trying to interact with internet-enabled services?

Eve Maler 11:55
You know, I sure hope not. I mean, I like to call those two sort of categories you put out there protection and personalization, just because I like to have the mnemonic to help everybody remember. And, you know, protection is something people do seek. There are certain specialized circumstances where they’re willing to do more work, like, you know, being given a hard token of some sort, maybe a YubiKey or something, if they’re, you know, a high-net-worth individual working with a bank, for example. Or World of Warcraft is sort of famous for incentivizing people who are, after all, average (well, maybe not average consumers, but you know, they’re sort of playing the consumer role) to use such a token. At the same time, we’ve finally got this world where it’s possible to not have to compromise on experience if you do it right. You can even approach passwordless experiences, and those can be super powerful for letting you get to what you want to get to, but in a more secure way. And, you know, I think we’re just gonna see more of these stronger authentication patterns that can be done with little impact, if any impact. I mean, Touch ID and Face ID and similar were great examples of how to sort of do it right. Okay, you could complain about the quality of the fingerprint recognition; at the same time, how many more millions of people started locking their phones because it was easy to do? And, you know, on that basis, actually pulled our average protection level up as ordinary people.

Steve King 13:24
Very powerful. Yeah. What is your view? You know, Microsoft, of course, just announced what’s been the holy grail here forever.

Eve Maler 13:32
Yes, they did passwordless for real, where they take away your password.

Steve King 13:37
And what is that? How’s that gonna work? Yeah, I’m sure you’ve started looking at it.

Eve Maler 13:40
Yeah, I think there’s a lot of folks in the industry who are going to be looking at it very closely. I mean, you know, where I am at ForgeRock, we integrate with lots of different authentication technologies, identity proofing technologies, risk detection technologies. Watching Microsoft do this is fascinating, because you have to get not just the happy path right (you log in, and it succeeds, because you’re legit) but the unhappy paths, like “I lost my phone.” And, you know, password reset was never anybody’s favorite thing to do. I’m fond of saying, you know, nobody wakes up and says, “I think I’ll log in today because it’s so much fun,” as well as, you know, saying, “I think I’ll go get budget to implement IAM today, this is so much fun.” Those things don’t happen. But I think that people will be watching closely to see if their design is successful, because it isn’t so much at that point about the individual methods, although it’s important to pick, you know, methods that can be at least equally as strong, but all of those paths; you know, you have to think through the transition and all that sort of thing. So it looks pretty comprehensive from where I sit, so they must have probably tested it on, you know, Microsoft employees who have, you know, individual logins. I hope so.

Steve King 14:57
Yeah. But have you looked at it architecturally? I mean, are you digging into it right now?

Eve Maler 15:02
I mean, it was just announced like a week ago, and I’ve been studying the flows; that’s where I’ve been sort of coming into it, because the technologies, you know, I haven’t done an analysis of all the different technologies that are enabling it. The one thing that we know is not safe and secure is just SMS one-time passwords, but I believe that’s one of the choices they give people, and it may be because passwords were bad enough as it is, right? So, you know, there are a lot of services that use one-time passwords, and they might be strong enough for their purpose. That’s where, you know, NIST Special Publication 800-63 (they’re working on rev 4 now) really comes in, because they’ve got these assurance levels for identity, for authentication, for federation. Really rather clever.
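As a concrete contrast with SMS delivery, here is a compact sketch of a time-based one-time password (RFC 6238), the kind an authenticator app computes on the device itself, so no code ever crosses the phone network. The shared secret below is an example value, not a real credential.

```python
# Compact TOTP illustration (RFC 6238) using only the standard library.
import base64, hmac, struct, time

def totp(secret_b32: str, at: float | None = None, digits: int = 6, step: int = 30) -> str:
    """Derive the current one-time code from a shared Base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((at if at is not None else time.time()) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# Both sides hold the same secret (typically provisioned via a QR code at enrollment),
# so the server can recompute and compare without anything traveling over SMS.
shared_secret = "JBSWY3DPEHPK3PXP"   # example secret, not a real credential
print(totp(shared_secret))
```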

Steve King 15:47
Yeah, no surprise to me. It’s one of those announcements like, “Hey, you know, world peace is around the corner,” and you’re like, “Well, but how is that gonna work?”

Eve Maler 15:59
I’ll tell you what, I mean, it’s complicated, if only because of things like biometrics. You know, speaking of world peace, I suppose I’m wading into uncharted waters here. But, you know, I was watching as, you know, a whole bunch of Afghans were being challenged by the fact that they had used biometrics and had digital identity information, and were now concerned pretty recently about, you know, “How do I wipe my profile? I want the right to be forgotten, right? Because I don’t want to be compromised by the fact that you can’t revoke your fingerprint or your face.”

Steve King 16:32
Yeah, it’s not life or death.

Eve Maler 16:35
Yeah, yeah. The consequences are severe. Identity is getting more consequential with every passing day. Right?

Steve King 16:42
Yeah, that’s our marketing tagline here. Another challenging intersection: federated and decentralized identity. Can you weigh in on how identity management works in that context? And, yeah, your view on how well we’re doing, particularly on the identity proofing end of the puzzle? You and I both know folks that are well regarded in the space, and I think we’re pretty good at the authentication thing, but we pretty much still suck at identity proofing.

Eve Maler 17:13
But the first step is a doozy. Hmm, yeah. So when it comes to this notion of decentralized identity, the last five, six years have seen this attempt at reimagining how identity is done. Because right now, you know, a server will hold information about you (by the way, personally identifiable information, regulated and all that sort of thing), and there’s a number of folks who are interested in seeing if we can sort of flip that around and have people store their own information, information about them, in what’s colloquially called a wallet. You know, it takes its name directly from things like payment wallets, including digital wallets, but it’s for identity information instead of money. And so the idea is that you can use decentralized technology, blockchains, ledgers, distributed ledgers, to help achieve this. And it’s been six years of a lot of research, a lot of people trying out things, but the essential architecture that’s been come up with in the standards kind of comes out to two parts. One is a decentralized identifier, and there’s a sort of ceremony prescribed for how you establish those; in the SAML world we’d call them pairwise pseudonymous, that’s how SAML does it, and OpenID Connect can do that as well, and in the decentralized identity world they talk about DIDs, decentralized identifiers. And then there’s this other half, which is about verifiable credentials. And, you know, you might have accepted at some point a badge that some organization offered you; like recently IDPro.org, the identity professionals organization, started issuing badges to their members, and I probably put an IDPro badge, you know, on my LinkedIn. So the notion that it was verified by IDPro is kind of the idea behind verifiable credentials, only super portable and protected and privacy-protective and all the rest. I think verifiable credentials are really super interesting, because they’re kind of a holy grail of some of the single sign-on stuff that’s been done all this time: the transfer of attributes about somebody from a trusted source. But it hasn’t really worked at scale. And so this notion of decentralized identity, DI, or sometimes known as self-sovereign identity, to give people control back: I think where it’s ultimately probably going to be the most successful is in the verifiable credentials portion, and we’ll have to see if blockchain becomes a thing in that eventual solution. I did invent a blockchain drinking game when all this started to come about, and it’s simply: if someone says “blockchain,” you drink. So it can really, really get you in a good mood.
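To give a feel for the two halves she outlines, here is a rough sketch shaped after the W3C Verifiable Credentials data model: a wallet-held decentralized identifier, and a credential an issuer asserts about it. The DIDs, issuer, dates, and proof value are all invented placeholders, and the signing step is only hinted at in a comment.

```python
# Rough sketch of a verifiable credential about a wallet-held DID.
import json

holder_did = "did:example:123456789abcdefghi"   # pairwise, wallet-generated identifier (placeholder)

membership_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential", "MembershipCredential"],
    "issuer": "did:example:professional-org",    # hypothetical issuer DID
    "issuanceDate": "2021-09-01T00:00:00Z",
    "credentialSubject": {
        "id": holder_did,                        # the badge is about this DID, not a username
        "membership": "Professional member",
    },
    # In a real credential this proof is a digital signature the verifier checks
    # against the issuer's public key, resolved from the issuer's DID document.
    "proof": {"type": "Ed25519Signature2020", "proofValue": "z3FXQ...placeholder"},
}

print(json.dumps(membership_credential, indent=2))
```

The holder can present this to any verifier without the issuer being in the loop at presentation time, which is the "transfer of attributes from a trusted source" idea Maler describes.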

Steve King 19:49
Yes, a dangerous game. Yeah.

Eve Maler 19:54
Oh, I have a new drinking game, though. It’s about NFTs, which of course are related. Oh, yeah, non-fungible tokens. And what I decided the drinking game should be is that if someone says “NFT,” you drink. Absolutely. So knock yourself out.

Steve King 20:10
Yeah, maybe literally. Well, yeah, that whole world is mind-exploding anyway. What you’re talking about here pushes that; you know, we have to be careful, we don’t want to push too much accountability back to the owner, God forbid, right?

Eve Maler 20:26
Well, it’s like, what will the experience be? And will we be dealing with the right person? You asked about identity proofing, and indeed, you know, there’s actually a lot of pandemic-spurred popularity of remote proofing methods, you know, sort of holding up a passport and holding up your face, that sort of thing, which makes me think of hostages and headlines and newspapers, but, you know, it can be effective. The only challenge is that the documents on which they’re based are fallible themselves. I don’t know that remote is actually really any worse on average than in person. So I would say for average-risk use cases, you know, you can get by, but you have to start piling up proofing and verification if you want to mitigate your risk more.

Steve King 21:12
Yeah, it’s a super important piece of the puzzle. It’ll be interesting to see how we end up solving it. You’ve been around the healthcare space for a while, you’ve been a warrior for that industry sector, and you know they’re tormented by identity attacks. It’s the main form of cyberattack: 50% of the breaches last year took place as so-called unauthorized access attacks. And now we see telemedicine and remote care, and I’m pretty sure that’s going to be permanent; this is not like a coronavirus-pandemic temporary thing. So that presents a very attractive target for PHI thieves. Talk to us about how, you know, multifactor or second-factor authentication can reduce the volume of successful breaches in the sector.

Eve Maler 22:04
Yeah, absolutely. And I have worked with a lot of healthcare and healthcare IT folks; I don’t dare say I’m anything like a subject matter expert. But, you know, I’ve helped develop some of the standards, like HEART is one of them, which is Health Relationship Trust, and it’s based on that whole kind of OAuth-based stack that I was talking about. It’s one of the most essential things that we can really do, and if we’re talking healthcare providers, we’re talking employees and the supply chain, it’s absolutely to strengthen our authentication, and strengthen proofing where we can, although, you know, that tends to be more of a front-end activity. I mean, I think if we can find easy ways to make it more of an ongoing activity, that would be helpful. But this is where the passwordless revolution is going to be very consequential for healthcare, because actually workers all over are acting a lot more like individual consumers and sort of telling their companies what they need out of the experience of working with them, as you can see from what’s known as the Great Resignation: you know, “Hey, I want to work at home,” or, “Hey, I’m going to move.” People are sort of taking things into their own hands. So consumer-friendly, stronger authentication methods, I think, are going to be a key to success for many of the healthcare provider and payer situations. I mean, there are really weird ones. I remember when I was at Forrester, you know, studying different strong auth methods, and I was talking to some folks in healthcare, and they’re like, “Well, don’t forget that, you know, if we’re in the operating room, we’ve got gloves on, and so, you know, we’re not going to be using fingerprint.” And so you have to think about all the constraints and the parameters. And they’re absolutely right, you know, there are special circumstances. But I think for a lot of folks, desktops, or just walking up to a workstation they want to disable when they walk away from it, we’ve actually got a lot of pretty good solutions.

Steve King 23:56
That’s good to hear. I know that you and John Kindervag are kindred spirits on the importance of zero trust in both network and identity management. Can you tell us a bit about how these two realities cross paths and can integrate?

Eve Maler 24:12
Absolutely. So yeah, John and I go way back; we do go back to Forrester, when he was working on zero trust, and I kind of asked him if I could borrow the concept and apply it to identity. And what it looked like was extreme interoperability with extreme fine-grained security. And, you know, John’s right when he talks about, you know, zero trust not equaling identity, but identity has become an essential tool for implementing the new zero trust architectures that we’re seeing. If you think about NIST 800-207 (not to overwhelm you with NIST Special Publication numbers), that’s the zero trust architecture that they came out with, and what is the essential feature of that? It’s fine-grained dynamic authorization, which is going to frequently depend on authentication. And it’s an architectural separation between your policy decision-making and your policy enforcement, so that you can get the enforcement really close to the resources you want to protect; that’s that kind of protect surface that John talks about. So identity and access management are absolutely essential for that proposition. And fine-grained, I think, is really the theme, in a way, of everything we’re talking about, including the excessive trust that people have in services, right? So if you’ve got excessive trust, start to make it more fine-grained, more frequent, more ongoing. Be able to do things like transactional authorization: somebody’s spending five bucks with you, maybe you don’t care so much, you can give a nice smooth experience; you might want to ask them to do something a little more if they want to spend 10,000 bucks of their bucks.
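As an illustration of that fine-grained, transactional style of authorization, here is a toy policy decision point in Python. The dollar thresholds, signal names, and verdicts are invented for the sketch; the point is the architectural split, with enforcement sitting next to the resource and simply honoring the decision.

```python
# Toy policy decision point (PDP): low-value actions glide through, higher-value
# ones demand a step-up, and the enforcement point (PEP) just asks for a verdict.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    amount_usd: float
    auth_strength: str      # e.g. "password", "otp", "webauthn" (labels assumed)
    device_trusted: bool

def decide(req: AccessRequest) -> str:
    """Return 'permit', 'step_up', or 'deny' for a single transaction."""
    if not req.device_trusted:
        return "step_up"                         # unknown device: re-verify first
    if req.amount_usd <= 5:
        return "permit"                          # five bucks: keep the experience smooth
    if req.amount_usd <= 10_000:
        # Mid-range spend: require something stronger than a password alone.
        return "permit" if req.auth_strength in ("otp", "webauthn") else "step_up"
    return "step_up" if req.auth_strength == "webauthn" else "deny"

# The enforcement point, deployed beside the payment API, simply honors the verdict.
print(decide(AccessRequest("alice", 4.99, "password", True)))      # permit
print(decide(AccessRequest("alice", 9_500.00, "password", True)))  # step_up
print(decide(AccessRequest("alice", 25_000.00, "otp", True)))      # deny
```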

Steve King 25:46
Yeah, it’s important, I think, for folks to understand, and they don’t today, what zero trust actually is, what the impact of a zero trust program would be on organizations, and the incremental paths to get there. I mean, it’s very obvious to me, and, you know, to you and Kindervag and Richard Bird and some of the other folks that are working in this space, that this is all very doable, even though, if you look at it one way, I suppose, you know, we’ve already built all this stuff out; we can still get there. And then the key, I think, in putting that together is the protect surface and an incremental approach. Does that kind of resonate with you?

Eve Maler 26:31
It really does. I mean, the incremental approach, in terms of, like, we’re starting to see maturity models now coming out of US and UK organizations, for example. And I think it’s really key to understand (even, no pun intended, with “key”) that you can’t do it all at once. It’s a mindset shift. John will often say it’s not a product. You need to decide how you’re going to apply it, just beyond, you know, the maybe old-fashioned least-privilege approach. How much more can you get into that mindset of doing these things all the time? And you can take that in stages, absolutely.

Steve King 27:01
I agree totally. And Eve, I know you’ve got a hard stop here in a couple of minutes, so, though we could talk about this for hours, I want to thank you for taking time out of your crazy schedule to join me in what I think was a pretty interesting exchange today. I’d like to come back a couple of months from now and dive a little bit deeper, if that works for you. Happy to chat anytime. That’s great. Thanks again, Eve. And thanks to our listeners for joining us in another one of CyberTheory’s unplugged reviews of the complex and frightening world of cybersecurity technology in our new digital reality. Until next time, I’m your host, Steve King, signing out. Thank you for joining us for another episode of Cybersecurity Unplugged. You can connect with us on LinkedIn or Facebook at CyberTheory, or send us an email at social@cybertheory.io. For more information about the podcast, visit cybertheory.io/podcast. Until next week, thanks again.