Put 100 Relying Parties in a Room and What Do You Get?

It’s an open secret among us identity geeks that, despite all of federated identity’s progress, one thing has lagged significantly: relying party participation. Getting relying parties to the table to talk about the challenges they have with identity on the Internet has always been a hard problem. Although the identity community has grown, the number of relying parties getting involved with things like the Internet Identity Workshop hasn’t kept pace.

Willingly or not, NIST’s National Strategy for Trusted Identities in Cyberspace (NSTIC) has taken up the challenge of increasing relying party participation. Without real-life use cases based on actual businesses and their actual problems, NSTIC is aspirational but vague. However, armed with a set of discrete use cases, NSTIC (and, more importantly, the identity community) can begin to craft solutions, discover unforeseen challenges, strengthen protocols, and tackle policy issues. But getting these needed use cases requires relying parties to be involved.

Collective Punishment: SOPA and Protect-IP are Threats to NSTIC and Federated Identity

As a technologist you’ve likely heard about the Stop Online Piracy Act (SOPA) or the Protect-IP Act. The intention of these bills, as described by SOPA, is “[t]o promote prosperity, creativity, entrepreneurship, and innovation by combating the theft of U.S. property, and for other purposes.” It provides a range of resources to tackle “foreign websites” that “engage in, enable or facilitate” copyright or trademark infringement. Among SOPA’s so-called “reasonable measures” for dealing with the assertion that a site engages in, enables, or facilitates copyright infringement is the use of DNS filtering. In essence, the site’s hosting provider would be required to modify its DNS records such that the entry for supposedly_infringingsite.com does not resolve. Besides the well-publicized incompatibility between DNS filtering and DNSSEC, DNS filtering has tangible negative effects on federated identity systems, including the National Strategy for Trusted Identities in Cyberspace (NSTIC).
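To make the mechanics concrete, here is a minimal sketch (Python, standard library only) of what a relying party or NSTIC-style service would experience if an identity provider’s domain were DNS-filtered; the hostname is hypothetical and the code is illustrative, not drawn from either bill.

```python
import socket

def can_resolve_idp(idp_host: str) -> bool:
    """Return True if the identity provider's hostname still resolves.

    Under SOPA-style DNS filtering the record is removed or blocked, so
    resolution fails before any federation protocol (OpenID, SAML, etc.)
    can even begin its redirect/discovery dance.
    """
    try:
        socket.getaddrinfo(idp_host, 443)
        return True
    except socket.gaierror:
        # NXDOMAIN or similar: the federated sign-in flow dies right here.
        return False

# Hypothetical IdP host, used only for illustration.
if not can_resolve_idp("idp.example-identity-provider.com"):
    print("Cannot resolve the identity provider; federated login is unavailable.")
```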

Notes from the "Government as Identity Oracle" session at IIW East

These are my raw notes, put here for reference purposes.

Attendees:

  • Peter A
  • Mary R
  • Ian G
  • Gerry B
  • others

What is meant by “identity oracle”?

  • An oracle provides an answer to a question but not a specific attribute.
  • If you ask an oracle “is Peter over 21?” it says yes; it does not hand back an attribute such as a birthdate (a sketch of this distinction follows these notes).

Peter: The federal government is authoritative for very few attributes. The State Department is authoritative for passport number and citizenship; state governments are authoritative for driver’s license numbers; SSA for SSNs. eVerify is an example of an oracle, says Gerry. Peter: what will drive this is the requirement for LOA3 credentials needed to access medical records. Peter: “We do not have an attribute infrastructure.” A lot of attributes are simply issued via IdPs.

Ian: Our examples so far have shown organizations that are authoritative for identifiers but not attributes. Peter raises the need for back-end attribute exchange (BAE). Gerry: The problem with authoritative attribute providers is that the PDP makes a decision as to what is truly authoritative for a given context. An authoritative data source must provide an SLA or MOU so that the relying party can establish trust. Peter: A BAE is half of the equation and an attribute provider (market?) is the other half. Anil: Is there a business model for attribute providers? Gerry: I have problems seeing attribute exchange at enterprise scale, let alone government scale. Quality and availability are just some of the issues. Access decisions are fairly local, and these decisions are not often known at the higher enterprise layer. Things are made authoritative by policy decision. Peter: There is a second model for authoritativeness — a local decision to assign authoritativeness to something. Nishant: Should we get rid of the term “authoritative”? Peter foresees multiple attribute providers having a say over the same attribute for the same person.

If I use an oracle, do I have to know its sources? No, says Gerry, as you form an agreement with the oracle ahead of time as to what happens when something goes wrong. Peter: I am running validation services which serve 400 back-end apps. I am standing up a BAE to help. I could build that infrastructure or I could contract it out to an oracle. The oracle has to tell me its sources so I can make a decision to use it or not. Gerry comments that you may not want to know the oracle’s sources of data. Returning to the eVerify system: is a person allowed to work? eVerify doesn’t disclose its sources of information, but DHS takes responsibility for its decisions. Pam asks about redundancy of providers; redundancy allows the same decision to be made via separate paths. Anil feels that there is a business case for multiple providers. Mary raises the point that there are organizations who have a lot of data on people; these are often highly regulated organizations because they are related to financial services. Gerry uses HealthVault and Google Health as examples of multiple providers of health information data. Anil: Talked to a financial roundtable; these orgs are not interested in B2C but are very interested in B2B situations. Having the government offer services to help vet people would be of great service.

Is there a government business in providing identity information? There are certainly companies that will aggregate public data for a fee. If a service provider helps get me, as a business, the information I need to hire someone (citizenship, for example), would I use it? Would I form a business to do this? Nishant raises BT’s You Are You service as an example of this. Pam: Talking about building cloud services in this area. There is definitely interest from small business in federation and in using Google as an authoritative source. Sees consumer-focused needs later down the road.

Ian asks Peter about persisting “over 18” information if it is acquired from Equifax. Peter says they’d have to issue a SORN and protect it as PII. I am curious about government as oracle and the implications with respect to the Privacy Act. Peter wants to facilitate a market for oracles. NIH had an MOU with InCommon which included the use of attributes and information. This included agreed-upon protections for those attributes, which was coherent with InCommon users’ requirements. Peter acknowledges this doesn’t scale, but he offers as a counterpoint that NIH is doing this federation to federation; he asserts there won’t be that many to federate to. I may not want to maintain a BAE with hundreds of connections to attribute providers; I would likely outsource the work to an oracle. “It is easier to affiliate with a hub than it is to affiliate with each provider,” says Peter A. Peter says that NIH sees a need to handle attributes and thus NIH is setting up a BAE. He acknowledges that there needs to be policy and practice around this, which Peter is on the hook to build. The FICAM roadmap says that if you are standing up an attribute service it must be a BAE if you want funding.

Gerry: If I am a BAE affiliate and I want to consume another affiliate’s data, what is the quality I can expect? Anil says that this is currently being discussed amongst architecture groups. Gerry talked about the quality within his organization; there is no strong commitment to the data that internal data collectors collect. At the end of the day, if something goes wrong, is it my fault or someone else’s? This is part of the contractual relationship between data consumer and provider. There is a hold-harmless clause within the MOUs used by the PKI Bridge: so long as an org is acting in accordance with its own policies, it is to be held harmless. Gerry: In certain situations this works, but in others it does not; I might have to run my own infrastructure or shop for another provider who can back up their assertions. Pam asks: if this is a government-to-government discussion, would a private group come in and provide services for G2G? Anil says yes, and that this is currently happening. Because there are so many millions of high-level-of-assurance credentials, one would think that someone would want to build an e-commerce infrastructure to consume these creds, says Peter. Peter asserts authentication is a solved problem and next up is authorization, claims, roles, etc. Every application owner wants to maintain control over who comes into the app, but this is a way that Peter gets people to plug into the federated SSO environment. Are people building services to consider risk-based authorization in transactions, asks Pam. Anil mentions the consideration of environmental attributes for initial authorization. Gerry says this is a hot space now. Anil brings up how PayPal takes a low-assurance cred and uses it for financial transactions.
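To make the oracle-versus-attribute-provider distinction from the top of these notes concrete, here is a minimal sketch of my own (not something shown in the session); the data and function names are hypothetical.

```python
from datetime import date

# Hypothetical attribute store; in practice this would be an authoritative
# source such as a DMV, the State Department, or SSA.
_BIRTHDATES = {"peter": date(1970, 5, 17)}

def attribute_provider(subject: str) -> date:
    """Attribute provider: releases the raw attribute (the birthdate itself)."""
    return _BIRTHDATES[subject]

def oracle_is_over(subject: str, years: int) -> bool:
    """Identity oracle: answers the question without disclosing the attribute."""
    today = date.today()
    born = _BIRTHDATES[subject]
    age = today.year - born.year - ((today.month, today.day) < (born.month, born.day))
    return age >= years

# The relying party learns only yes/no; the birthdate never leaves the oracle.
print(oracle_is_over("peter", 21))
```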

T Minus 7 days to Catalyst EU

I’ve been a bit quiet on Tuesdaynight lately… sorry, it has been a bit crazy around here. At any rate, we are 7 days away from Burton Group Catalyst EU! In the 7+ years that I’ve been involved in one way, shape, or form with Burton Group, I’ve never been to a Catalyst EU, so I am very excited. For those of you joining us, you are in for a treat: John Seely Brown will be delivering the keynote for us. Besides Mr. Brown, the IdPS team has some great content waiting for you:

Hopes and concerns for identity

A friend in the industry recently asked me for my thoughts on OpenID, InfoCards, and the US federal government’s work to consume non-government-issued credentials. Having let the question rattle around in my head for a while, here’s what I’ve got so far.

My hope is that the overall ICAM initiative is successful—not because I have been eagerly waiting to interact with the federal government using some form of authenticated credential—but because we (citizens, enterprises, and government) are at a pivotal moment in the history of the web. With the US government working with both the OpenID and InfoCard Foundations, there exists an opportunity to change how individuals interact with large organizations, both public and private. For the first time, individuals would be able to (even encouraged to) interact with a large organization (such as the US federal government) using an identity asserted not by the large organization, but by the individual. In this case, the State is no longer the sole provider of identity. This breaks the monopoly that the State has had on credentials and is indicative of the future to come. But there is a long road to walk before getting there.

There are numerous concerns with these plans. Among these are notable security concerns, especially with OpenID, that the identity community is not blind to. These are not my primary concerns. My primary concern is with the establishment of standard user behavior that could prolong existing problems.

Today, after decades of enterprise training and a decade of consumer training, people naturally expect to see two text boxes on web sites: one is for their username and the one with the little stars is for their password. This behavior is ingrained. Changing this behavior is no small feat - just ask the OpenID and InfoCard groups. But it is a change that must occur to normalize people using something stronger than usernames and passwords to authenticate themselves. My concern is that the behavior that is being established as a norm - the use of either an identity selector or some other user interface means - will become the username/password for the next generation.

This isn’t a hypothetical problem; the writing is already on the wall. Currently, OpenID will only be accepted for low-value transactions with the government, known as Level of Assurance 1 (LOA1). Activities like filing tax returns require far greater assurance that the person is who they claim to be and thus require a Level of Assurance 3 (LOA3) identifier. And therein lies the problem: the way people use an LOA3 credential may be very different from how they use an LOA1 credential. If we, as an industry, normalize user behavior that meets LOA1 needs but not LOA3 needs, we are training in behavior that will have to be untrained in the near future.

What the government and its partners are on the path to doing is effecting real cultural change. This kind of change doesn’t happen often and is hard to do, and especially hard to undo. I definitely want a future in which I can assert my own identity without validation from the State, but I am very willing to wait for that future to assure that the behavior the industry normalizes is one that will work for generations to come.

(Cross-posted from Burton Group’s Identity blog.)

Transparent or Translucent?

Last week I was at the Department of Homeland Security’s Government 2.0 Privacy and Best Practices conference. Not surprisingly, the subject of transparency came up again and again. One thing that definitely caught my attention was a comment by one of the panelists that efforts towards government transparency are too often focused on data transparency rather than process transparency. While we have Data.gov as one of the current administration’s steps towards furthering government transparency, we do not have an analogous Process.gov. Said another way – we get the sausage but don’t get to see how it is made. This isn’t transparent government but translucent government.

From what I’ve seen, I’d say that enterprises have achieved the opposite kind of translucency with their identity management programs. Though enterprises have achieved some degree of process transparency by suffering through the pains of documenting, engineering, and re-engineering processes, they haven’t been able to achieve data transparency. Identity information has yet to become readily available throughout the enterprise in ways that the business can take advantage of. Identity information (such as entitlements) has yet to achieve enterprise master-data status. Worse yet, the quality of identity data still lags behind the quality of identity-related processes in the enterprise.

For those of you attending the Advanced Role Management workshop at Catalyst this year, you’ll hear Kevin and me present the findings from our recent roles research. Throughout our interviews we heard identity teams discuss their struggles with data management and data quality. Finding authoritative sources of information, relying on self-certified entitlement information, and decoding arcane resource codes were just some of the struggles we heard. No one said that identity data transparency was easy, but without it enterprises can only achieve identity translucency, not true transparency.

(Cross-posted from Burton Group’s Identity Blog.)

Nailing Down the Definition of "Entitlement Management"

Ian Yip’s take on access management versus entitlement management can be partially summed up with this equation:

Entitlement management is simply fine-grained authorisation + XACML

I have four problems with this.

First, definitions that include a protocol are worrisome as they can overly restrict the definition. For example, if I defined federation as authentication via SAML, people would quickly point out that authentication via WS-Fed was just as viable a definition. So in terms of an industry conversation, we need to make sure that our terms are not too narrow.

Second, I fear that this definition is a reflection of products in the market today and not a statement on what “entitlement management” is meant to do. Yes, most of today’s products can use XACML. Yes, they facilitate authorization decisions based on a wider context. But who’s to say that these products, and the market as a whole, have reached their final state? Along these lines, I wonder whether externalized authorization stores are a required part of an “entitlement management” solution.

Third, there is something missing from the definition – the policy enforcement point. A fine-grained authorization engine provides a policy decision point, but that still leaves the need for an enforcement point (see the sketch at the end of this post). This holds true whether an application has externalized its authorization decisions or not.

Finally, I have a problem with the phrase “entitlement management” (just ask my co-workers). As I have blogged about before, Kevin and I have been in the midst of a large research project focusing on role management. One of the things we have learned from this project is that enterprises do not use the phrase “entitlement management” the same way we do. A bit of history – three or so years ago Burton Group, at a Catalyst, introduced the phrase “entitlement management” to include the run-time authorization decision process that most of the industry referred to as “fine-grained authorization.” At the time, this seemed about right. Flash forward to this year and our latest research, and we have learned that our definition was too narrow. The enterprises that we talked to use “entitlement management” to mean:

  • The gathering of entitlements from target systems (for example, collecting all the AD groups or TopSecret resource codes)
  • Reviewing these entitlements to see if they are still valid
  • Reviewing the assignment of these entitlements to individuals to see if the assignments are appropriate
  • Removing and cleaning up excessive or outdated entitlements

More often than not, we found that our customers used “entitlement management” as a precursor to access certification processes. Using a single term (“entitlement management”) to span both the run-time authorization decisions as well as the necessary legwork of gathering, interpreting, and cleansing entitlements can lead to confusion. The way enterprise customers currently use “entitlement management” works well to describe how legwork is vital to the success of other identity projects. (I’ll be working on a report this quarter that delves deeper into this.)

I am all for a broader conversation on fine-grained authZ versus entitlement management. And as Ian Yip has pointed out on Twitter, identity blog conversations have dropped off lately and I’d love to stoke the fire a bit. But we can’t have meaningful conversations without shared definitions. So what’s your take? What do you mean when you say “fine-grained authorization” and “entitlement management?” (Cross-posted from Burton Group’s Identity blog.)
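To illustrate the decision-point/enforcement-point split from the third point above, here is a minimal Python sketch; the policy, function names, and in-process PDP stub are hypothetical, and a real deployment would call out to an external fine-grained authorization engine (over XACML or otherwise) rather than a local function.

```python
import functools

def pdp_decide(subject_role: str, action: str, resource: str) -> bool:
    """Policy decision point (stub): evaluate policy and return Permit/Deny.

    In an entitlement management product this evaluation would live in an
    external authorization engine; only the decision comes back.
    """
    # Hypothetical policy: only managers may approve invoices.
    return (subject_role, action, resource) == ("manager", "approve", "invoice")

def pep(action: str, resource: str):
    """Policy enforcement point: intercept the call and enforce the decision.

    The PDP only answers the question; without an enforcement point (a servlet
    filter, an agent, an API gateway, or a wrapper like this) nothing actually
    stops the request.
    """
    def wrap(fn):
        @functools.wraps(fn)
        def guarded(subject_role, *args, **kwargs):
            if not pdp_decide(subject_role, action, resource):
                raise PermissionError(f"{subject_role} may not {action} {resource}")
            return fn(subject_role, *args, **kwargs)
        return guarded
    return wrap

@pep("approve", "invoice")
def approve_invoice(subject_role, invoice_id):
    return f"invoice {invoice_id} approved"
```

Whether the decision point sits behind an XACML interface or something proprietary, the point stands: a definition of "entitlement management" has to account for where enforcement happens, not just where decisions are made.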

Zen Mind, Newb Mind

Being the new-ish addition to the IdPS team is, well, an interesting place to be. Besides the requisite induction activities (ask me at Catalyst how you pick up the dry cleaning for a team that lives all across the country), I’ve been working with my peers on vastly different pieces of research. And being curious by nature, I’m loving the chance not only to dig into different topics, but also to observe how different people go about the actual process of analyzing a topic or a market.

One technique that Burton Group uses is Contextual Research (CR). Essentially, the CR process is meant to challenge an analyst’s knowledge of a subject and their associated preconceived notions as to what problems enterprises face and how they are facing them. It turns seasoned veterans, experts in the field, into beginners again. This is what practitioners of Zen Buddhism call “beginner’s mind.”

Here’s how it works in a nutshell. Kevin (seasoned vet) and Ian (newbie) identify a bunch of organizations to talk to. So far, nothing out of the ordinary compared to our other approaches to research. That said, the conversations we have with these organizations are very different from typical research techniques. Instead of coming to the conversation with a fixed hypothesis that we want to prove out, we come to the conversation with nothing. No leading questions. No surveys. No preconceptions. In these conversations, we, the analysts, are newbs. We let the people we are talking to teach us what is important to them about a subject, how they have approached a problem, and what wisdom they’d like to share with others. The analysts furiously take notes, listen, and try not to talk. Having listened to as many people as we can, we bring the whole team together to find affinities among the statements, identify trends and common techniques, and evaluate the state of a market through the eyes of a customer.

Right now, Kevin and I are in the midst of a role management CR. Although we are far too early in the process to comment on what we’ve found, some of the anecdotes we have learned along the way are really fascinating. Discussions about the needs of the business, efficiencies gained, and methodologies for conducting role analysis – all of these conversations have been grounded firmly in the realities of today’s economy as well as the current state of identity management in the enterprise.

You’ll see some of the results of this beginner’s mind approach to analysis at Catalyst this summer. In fact, the Catalyst workshop on Advanced Role Management is going to be a master class of a sort, shaped by what Kevin and I learn during this CR process. Stay tuned for more on our roles CR. Towards the end of April, I’ll be updating you on how the process has fared. (Cross-posted from Burton Group’s Identity Blog)

Will the "real" federated provisioning please stand up?

Nishant has commented on my post about federated provisioning. He has provided two different examples of federated provisioning. One of these, the advanced provisioning example, involves a company that manages its employees’ access to a service provider’s service via provisioning. In this case, Nishant agrees with me that provisioning of this sort is no different than provisioning the UNIX box down the hall.

But it is Nishant’s second example, the just-in-time provisioning example, which is a bit tougher. In this case, the enterprise and its service provider have a federation in place. Using SAML-based authentication, a new user attempts to access the service provider’s service. The idea (hope?) is that the service provider recognizes the new user request, provisions the user, and authenticates the user in the same conversation. Nishant does add a degree of difficulty to this scenario as he ties the federation service to a provisioning service. Grabbing attributes from the SAML token, creating an SPML message, and handing that to a provisioning service is possible, but as a commenter points out, this sort of interop isn’t spec’ed out, so the heavy lifting is left to the service provider. And even if the service provider doesn’t want to directly link its federation and provisioning services, it still needs to grab those assertion attributes and create the account in the backend system.
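As a rough sketch of that heavy lifting, assuming Python and the standard library: pull the attributes out of the incoming SAML 2.0 assertion and reshape them into an SPML-style add request for the provisioning service. The SAML namespace is the standard one; the SPML element layout here is only indicative since, as noted, the interop between the two isn’t spec’ed out.

```python
import xml.etree.ElementTree as ET

SAML_NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

def attributes_from_assertion(assertion_xml: str) -> dict:
    """Collect name/value pairs from the assertion's AttributeStatement."""
    root = ET.fromstring(assertion_xml)
    attrs = {}
    for attribute in root.findall(".//saml:AttributeStatement/saml:Attribute", SAML_NS):
        values = [v.text for v in attribute.findall("saml:AttributeValue", SAML_NS)]
        attrs[attribute.get("Name")] = values[0] if len(values) == 1 else values
    return attrs

def build_spml_add_request(attrs: dict) -> str:
    """Shape the attributes into an SPML-style addRequest.

    Element names follow my reading of SPML 2.0 (addRequest/data), but a given
    provisioning service may expect a different structure or a target-specific
    schema, so treat this as a sketch rather than a wire-ready message.
    """
    add = ET.Element("addRequest", {"xmlns": "urn:oasis:names:tc:SPML:2:0"})
    data = ET.SubElement(add, "data")
    for name, value in attrs.items():
        attr_el = ET.SubElement(data, "attr", {"name": name})
        attr_el.text = str(value)
    return ET.tostring(add, encoding="unicode")

# Flow: the federation service validates the assertion, the service provider maps
# its attributes, hands the add request to provisioning, then completes the login.
```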

Down with federated provisioning

There’s been a bit of recent blogging activity about federated provisioning and SPML. Having worked on both federated provisioning and SPML in a past life, it warms my heart to see this discussion. Jackson, quoting the CIO of Educational Testing Service, Daniel Wakeman, restates the observation that SaaS providers come up short when it comes to federated identity management. This “major shortcoming” leaves service subscribers to fend for themselves in managing user lifecycle events like on-boarding and off-boarding. Not acceptable.

That got me thinking - there really ought not to be a concept of federated provisioning. Provisioning an application in the data center must be the same as provisioning an application in the cloud. However, in the course of the conversation between James, Jackson, and Mark, it seemed that SaaS applications and in-house applications were treated as different from a provisioning perspective. SaaS applications may be harder to provision and de-provision than non-SaaS applications, but that doesn’t make them fundamentally different animals. The point was made that SaaS apps lack a standards-based provisioning interface, an SPML interface. The fact is the vast majority of applications, SaaS or not, lack a standards-based provisioning interface, and this makes dealing with them very much the same. Now there are two reasons that we don’t hear the same sort of clamor about provisioning non-SaaS applications as we do with SaaS applications: