I enjoyed reading a good-natured rant about the vagaries of managing your identity online on the Des Res blog the other week. If, like me, you work for a large organisation, you’ll probably be obliged to follow strict rules on selecting a password for access to corporate systems. If, again like me, you use a lot of websites that require you to select credentials for logging in, you may struggle to manage a large (and constantly growing) set of strong passwords without writing them down. In these circumstances, it’s very tempting to re-use the strong password from your work systems for other purposes.

Identity 2.0

Identity 2.0, or digital identity, has long promised to solve these problems in a world where a user can potentially have one online identity, with a pre-certified proof which is submitted when required for authentication. This model is represented by Microsoft’s CardSpace and the open source Higgins project, but has been slow to gain momentum. However, in recent years, a number of the larger IAM vendors, starting with CA Technologies, have added support for these technologies to their Web Access Management products.

Multiple Identities Online

Of course, being able to use a single identity and set of credentials for all your online activities is a real “good news/bad news” story. The convenience of managing a single set of credentials comes at a price: it’s quite conceivable that your visits to different websites could be aggregated and correlated to build a far more comprehensive (and revealing) picture of your online activity than you might feel comfortable with. It’s also true that not all the websites we visit (and register for) justify the same strength of authentication. For example:

Online Banking: There’s so much at stake if your banking credentials become compromised that it’s obvious to all but the hard of thinking that those credentials should never be used elsewhere. In a previous post, I described how my bank allows me to be warned if I try to re-use internet banking credentials on another site, by providing me with a free copy of Trusteer Rapport. This protection can be easily extended to other high risk sites.

Social Media: As I’ve described on these pages before, I use a wide range of social media applications (in the widest sense of the term) to maintain my contact list, collect and collate information and publicise this blog. Each site requires a separate set of credentials, but increasingly I’m offered the chance to sign in to one application using the credentials from another (very often, either Twitter or Facebook). This makes use of the OAuth protocol. OAuth allows the user to authenticate with their chosen service to generate a token. The token can then be used to allow another application to access resources for a given period of time. So, for example, when configuring Tweetdeck, I authenticate in turn to Twitter, Facebook, LinkedIn and Google Buzz and authorise Tweetdeck to use the OAuth tokens to retrieve data from those applications until I revoke that access.
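The token flow described above can be sketched as a toy model. To be clear, this is not the real OAuth wire protocol, and `AuthServer` and its methods are invented for illustration; the point it demonstrates is that the client application holds a revocable, expiring token, never the user’s password:

```python
import secrets
import time

class AuthServer:
    """Toy model of an OAuth-style provider: the user authenticates once,
    then authorises a client app to hold a token on their behalf."""
    def __init__(self):
        self._tokens = {}  # token -> (user, expiry time)

    def issue_token(self, user, ttl_seconds):
        # Issued after the user has authenticated and approved the client.
        token = secrets.token_hex(8)
        self._tokens[token] = (user, time.time() + ttl_seconds)
        return token

    def fetch_resource(self, token):
        # The client presents only the token, never the user's credentials.
        entry = self._tokens.get(token)
        if entry is None or time.time() > entry[1]:
            raise PermissionError("token revoked or expired")
        return f"timeline for {entry[0]}"

    def revoke(self, token):
        # The user can withdraw the client's access at any time.
        self._tokens.pop(token, None)

provider = AuthServer()
token = provider.issue_token("bob", ttl_seconds=3600)
print(provider.fetch_resource(token))  # client reads data with the token
provider.revoke(token)                 # access withdrawn; token now useless
```

This mirrors what happens when Tweetdeck is authorised against Twitter or Facebook: revoking the token at the provider cuts the client off without a password change.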

Single Sign On

This still leaves a wide range of different sites that require a login. I use a number of Cloud services, including Dropbox (of which, more in a moment), Windows Live Mesh, MindMeister (for collaborating on mind maps), MobileNoter (for sharing and synchronising Microsoft OneNote) and of course, Google Docs. These (or at least the data I entrust to them) are important enough to me to warrant good quality credentials and together they make a good case for Single Sign On. With more than 10 years’ experience in Identity Management projects, I’ve always viewed SSO as primarily a user productivity tool, with some incidental security benefits. However, I came across a story on Mashable, describing tools for managing web passwords and quickly realised that I could:

Store all my credentials in a single location;

Secure them with a single strong password, which never leaves my machine;

Synchronise that credential store across multiple computers by locating the credential store on Dropbox;

Use the same, synchronised solution on my iPhone.

So, armed with these requirements and the Mashable product reviews, I eventually settled on 1Password. As well as a management app, which sits in the system tray, 1Password installs a plug-in for all the modern browsers (I’m using it with IE and Firefox) which detects when you’re completing a registration or login form and prompts you to save the credentials. Next time you visit the site, just press the 1Password button to log in. Incidentally, the Mashable article mentions that 1Password is primarily a Mac product, with a Windows version in beta. The Windows version is now in fact available as a paid-for GA product.

Summing Up

So, in conclusion, it’s possible to figure out a strategy to at least simplify sign on and credential management across a wide range of websites and applications, each with differing needs for strength and protection. By and large, the tools to do this are available for free, and even the commercial components I chose cost only a very modest fee. All in all, the benefits far outweigh the modest outlay of time and cash.

CA itself is a Salesforce.com customer, with access to the applications made available to its sales and pre-sales teams. CA Siteminder is already integrated into the Salesforce.com offering, to provide single sign on.

What will be interesting will be to see to what extent CA can incorporate this cloud-based provisioning into their role life cycle management story.

I was reading Dave Kearns’ article on directories in Network World Identity Management Alert (more on that in a later blog) the other day and I spotted a reference to an article from 10 years ago (the newsletter was then called “Fusion Focus on Directory Services”) on the beginnings of the provisioning sector. Aberdeen had christened this new breed of “office productivity” applications e-provisioning in their Technology Viewpoint of September 1999. Dave recounts how he came across a startup, Business Layers, at NetWorld+Interop 99, and noted that it was the only vendor active in the new space.

At around that time (well, OK, in early 2000) I had moved from infrastructure and security management at a UK defence contractor to the newly formed security practice at a Top 5 software vendor. While I’m loath to dispute Dave’s account, it’s not quite as I remembered it. I chatted with colleagues from that time and confirmed that, for example, CA had released their first provisioning solution in 1997. The solution was designed as an extension to CA’s flagship Unicenter networks and systems management family, and released under the name Unicenter Directory Management Option (DMO). Following CA’s acquisition of Platinum, DMO was relaunched as a standalone product under the name eTrust Admin in 2000. It’s maybe not all that surprising though that this went largely unnoticed. A friend (who was the eTrust Admin development manager at the time) recalls how one of the major industry analyst firms contacted CA Analyst Relations to ask if they had a tool for provisioning, only to be told “No”.

It seems to me that the earliest provisioning vendors were top tier network and systems management vendors (BMC, CA, IBM Tivoli). They started with important advantages. First, their presence in the mainframe market exposed them to the effective and mature (though largely manual) processes for user administration widely found in mainframe shops, built around RACF, ACF2 or Top Secret. Second, their experience in building network and systems management solutions meant expertise in developing agent technology and reliable (store and forward) messaging, the vital “plumbing” for a provisioning engine. These first attempts placed emphasis on centralised, consistent manipulation of credentials on target systems.
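That store-and-forward “plumbing” can be sketched in a few lines. This toy channel (the class, message format and target names are all illustrative assumptions, not any vendor’s actual design) queues changes destined for a target system and retries them when its agent is reachable, so a provisioning request survives a transient outage:

```python
from collections import deque

class StoreAndForwardChannel:
    """Minimal sketch of store-and-forward delivery: messages for a target
    system are queued in order and flushed when the agent is reachable."""
    def __init__(self, send):
        self.send = send      # callable that delivers a message to the agent
        self.queue = deque()  # pending messages, oldest first

    def submit(self, message):
        self.queue.append(message)
        self.flush()

    def flush(self):
        while self.queue:
            try:
                self.send(self.queue[0])
            except ConnectionError:
                return        # agent down: keep the message for later
            self.queue.popleft()

# Simulate a target agent that is briefly unavailable.
delivered, online = [], [False]
def agent(msg):
    if not online[0]:
        raise ConnectionError
    delivered.append(msg)

channel = StoreAndForwardChannel(agent)
channel.submit("create account jbloggs on RACF")  # queued: agent offline
online[0] = True
channel.flush()                                   # retried and delivered
print(delivered)  # ['create account jbloggs on RACF']
```

The essential property is that the provisioning engine never loses an administrative change just because a managed endpoint happens to be down.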

The second wave of provisioning products came from niche vendors (Business Layers, Access 360, Waveset, Thor) and was characterised by the use of web technology and the adoption of configurable workflow-based approval processes. These products also initially had limited coverage for connectors (and some connectors had limited capabilities). At the time of the CA acquisition of Netegrity in 2005, Identity Minder eProvision (formerly the Business Layers Day One product) was still licensed to use the connectors from BMC’s Control-SA product.

In late 2000, at the height of the DotCom boom, I was lead security architect for a proposed chain of high security hosting centres around the world, to be implemented by a consortium that included CA, Sun and Oracle. Business Layers demonstrated their product, showing me a workflow process, updating its status in real time, displayed on a Unicenter Worldview map. I was impressed – it was better than the integration between the Unicenter components!

These new capabilities, however, proved to be prerequisites for delegated administration and user self-service. This then led to a rash of acquisitions, with Netegrity joining CA, Access 360 joining IBM, Thor joining Oracle and Waveset joining Sun. Netegrity brought two distinct offerings to the party, in Identity Minder (web based administration for Siteminder deployments) and eProvision (the former Business Layers product). The second-generation CA product was built by integrating Netegrity’s Identity Minder with CA’s eTrust Admin. The eProvision developers left CA to form a new company, IDFocus, which developed add-ons for Identity Manager implementing the best features of eProvision that were still missing from the CA product. CA eventually acquired IDFocus in late 2008 and merged the two development teams. BMC acquired a directory management product (Calendra) in 2005 to add the missing elements of workflow and graphical interfaces.

The current race for the Identity Management vendors is to integrate role mining and role management capabilities into their solutions. First, Oracle acquired Bridgestream, then Sun acquired VAAU with their RBACx product. Finally in late 2008, CA acquired Eurekify. Meanwhile IBM have decided to build their capability in-house.

So, where next? It goes without saying that all the major vendors still have much to do to improve integration and remove duplication between the multiple components from which their products are built. Beyond that, I think there’s a growing realisation that real-world deployments of identity management will have to be built from multi-vendor solutions. Mergers, acquisitions and divestments will see to that. The cost, time and risk of replacing one vendor’s IdM products with another’s will prove to be completely unacceptable to the business. So, vendors are going to have to address interoperability seriously. Perhaps this will be the catalyst for renewed interest in open standards, such as SPML and DSML. In his article on directories, Dave Kearns noted that as directories matured from the hype of directory-centric networks to unglamorous (but still vital) low-level infrastructure, DSML never really took off, despite being adopted by OASIS in 2002. Interoperability is aided when directories (the single source of truth for an IdM system) are able to exchange updated information autonomously.

Which brings us back to where we started.

A few days ago, I was invited to IBM South Bank for a workshop on Identity and Access Management (IAM) Governance. The workshop was timed to coincide with the launch of the latest release of Tivoli Identity Manager (v5.1). IBM’s press release describes the new features in TIM v5.1, but I’ll summarise them here:

Role management capabilities
The latest version of TIM allows the definition of (optionally nested) roles. Roles do not replace the existing mechanisms for managing user access to resources, but rather provide a structure through which to manage that access more efficiently.
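As a sketch of how nested roles flatten out in practice, the toy function below (the role names and `contains` mapping are invented for illustration, not TIM’s actual model) expands a single role membership into the full set of roles it implies:

```python
def expand_role(role, contains):
    """Return the set of roles implied by membership of `role`, following
    nested (contained) roles transitively. `contains` maps a role name to
    the list of parent roles it is nested within."""
    seen = set()
    stack = [role]
    while stack:
        current = stack.pop()
        if current not in seen:
            seen.add(current)
            stack.extend(contains.get(current, []))
    return seen

# Hypothetical nesting: a "Payroll Clerk" is also in "Finance",
# and all "Finance" members are in "All Staff".
nesting = {"Payroll Clerk": ["Finance"], "Finance": ["All Staff"]}
print(sorted(expand_role("Payroll Clerk", nesting)))
# ['All Staff', 'Finance', 'Payroll Clerk']
```

The structural point is that one assignment ("Payroll Clerk") can carry a whole chain of memberships, which is exactly what makes nested roles an efficiency gain for administrators.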

Separation of duty capabilities
Separation of duty is a policy-driven feature to manage potential or existing role conflicts. A separation of duty policy is a logical container of separation rules that define mutually exclusive relationships among roles. Separation of duty policies are defined by one or more business rules that exclude users from membership in multiple roles that might present a business conflict.
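A rule of this shape is easy to sketch. The following toy checker (the policy format, role names and function are my own assumptions, not TIM’s actual API) flags any policy whose mutually exclusive role set a user’s assignments would violate:

```python
def sod_violations(user_roles, policies):
    """Return the names of separation-of-duty policies violated by a user's
    role set. Each policy is (name, mutually_exclusive_roles, max_allowed):
    holding more than `max_allowed` roles from the set is a conflict."""
    violations = []
    for name, exclusive_set, max_allowed in policies:
        if len(user_roles & exclusive_set) > max_allowed:
            violations.append(name)
    return violations

# Hypothetical policy: a user may hold at most one of the payment roles.
policies = [("payments", {"Create Payment", "Approve Payment"}, 1)]
print(sod_violations({"Create Payment", "Approve Payment"}, policies))  # ['payments']
print(sod_violations({"Create Payment"}, policies))                     # []
```

In other words, the policy does not grant or deny access itself; it constrains which combinations of role memberships a single user may hold.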

User recertification
TIM provides a process to periodically certify and validate a user’s access to IT resources, combining recertification of a user’s accounts, the group memberships of those accounts, and role memberships into a single activity.
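A minimal sketch of what such a combined activity might look like as a data structure follows; the field names and helper function here are my own invention for illustration, not TIM’s actual schema:

```python
def build_recertification(user, accounts, groups, roles):
    """Gather a user's accounts, group memberships and role memberships
    into one review activity, so a single reviewer pass covers them all."""
    return {
        "user": user,
        "items": (
            [("account", a) for a in accounts]
            + [("group", g) for g in groups]
            + [("role", r) for r in roles]
        ),
        "decision": None,  # set by the reviewer: certify or revoke each item
    }

activity = build_recertification(
    "jbloggs",
    accounts=["RACF:JBLOGGS", "AD:j.bloggs"],
    groups=["AD:Finance-Readers"],
    roles=["Payroll Clerk"],
)
print(len(activity["items"]))  # 4 items reviewed in a single activity
```

The benefit of the single activity is precisely this consolidation: the reviewer sees the user’s whole footprint at once, rather than certifying accounts, groups and roles in separate campaigns.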

Group management capabilities
TIM now allows the creation of groups of users, to help with automation of identity management processes.

Tivoli Common Reporting
TIM’s reporting capabilities have been migrated to IBM Tivoli Common Reporting. This component is based on the Eclipse Business Intelligence Reporting Tool and provides custom report authoring, report distribution, report scheduling capabilities, and the ability to run and manage reports from multiple IBM Tivoli products.

New APIs
Additional APIs, workflow extensions and JavaScript functions are provided to support the new functionality of this release.

The theme for the day was IAM Governance and in IBM’s view “Tivoli Identity Manager delivers important functionality for identity and access management governance”. The new features support governance, by enforcing compliance through product policies (as opposed to technical policies – see Earl Perkins’ blog for more details) and by allowing reconciliation between the policy-based view of user entitlements, stored in TIM’s directory and the reality, defined on the managed platforms and applications. While regulatory mandates don’t demand the use of roles (though corporate policy might) they do offer a simplified abstraction, through which access can be governed. At the risk of being pedantic, I’d call this compliance, rather than governance, but it’s all down to your own definition.

Uniquely among the major IAM vendors, IBM chose not to acquire a niche role management vendor to add this capability, and instead developed it in-house as an integral part of their identity management platform. This has the positive effect of avoiding the inevitable difficulties of bringing together two distinct (and often conflicting) technology platforms and development teams. Sun, Oracle and CA are all working through these issues currently, following their acquisitions of VAAU, Bridgestream and Eurekify respectively. On the negative side, it means that role management in TIM is a “work in progress”. However, I’m assured that IBM plan to release further functionality in this area during the second half of 2009.

What would I like to see added to the role management capability? I think that a function to help with the discovery and mining of roles from existing entitlement data would speed up the creation and deployment of an initial enterprise role structure. I have to declare an interest here. As a consultant who specialises in the organisational change required for IAM programmes, I strongly favour the ability to run the role mining and discovery effort without the need to deploy the identity management infrastructure and connectors. Once an initial enterprise model is complete (and agreed) then it can be imported into the identity management system, where it should become subject to life cycle management, with TIM providing recertification and approval for changes to role definitions. This approach is elegantly illustrated by CA’s deployment architecture for Eurekify. So, if I had a vote, I’d say integrate role life cycle management into TIM and leave role mining as a stand-alone tool.

My final thought relates to Governance, Risk and Compliance. The objective must be to take results from computerised controls (such as TIM) and use those results to update an overall picture of the organisation’s risk exposure. This is the job of a GRC Management platform. In the final session of the South Bank workshop, IBM showed how TIM can be used in conjunction with Tivoli Compliance Insight Manager. This closed loop integration between security event management and identity and access management allows administrators to compare real user behaviour with desired behaviour, exactly as an auditor would. TCIM can provide a graphical representation of the information, along the lines of a heat map. IBM partner with niche vendors, such as SailPoint and Aveksa, to deliver a complete IAM Governance solution. Personally, I’d love to see the TIM and TCIM products integrated with (for example) the excellent STREAM integrated risk and assurance management platform from Acuity.

By way of a conclusion, this latest release of TIM continues to address the use cases needed by IAM professionals, and does so with the benefit of a simple, consistent user interface and a trouble-free install process. If there’s a downside, it’s that TIM is a monolithic application, limiting the ability of an organisation to pick only the parts they need to start with. Having said that, organisations can readily deploy the application and initially utilise (say) reconciliation, recertification or compliance reporting, without needing to design and implement the heavyweight user provisioning and role management functions.

Disclosure

Readers may notice from my profile that I’m currently employed as a Senior Managing Consultant in IBM’s Global Business Services. However, at the time of writing this article, I was an independent consultant, with no commercial relationship with IBM.