Sunday, March 23, 2014

When I first started in the IT/Services field, I worked on a
help desk providing technical support for a software company. Everything on the
job was about metrics: how many calls you took, how long you were on
each call. Later, I managed the support group
and the metrics got more interesting: Who called the most? What did people call
about the most? This information needed to be shared amongst my fellow team
leaders, my immediate management, sales management, product management, and
even the executive management of the company. As a primary interface between
the company and its clients, I had a lot of information.

I see Identity Management as holding a lot of that same type
of information that would be of interest to various parts of the organization.
Understanding who is provisioned to what endpoint, how identities change in the
organization over time, which users hold a given access right. How many people are set
up with the new password management scheme? Who has processed their attestations?
The list is ever changing and of course would vary by organization, per Pollicove's Law.

Sure, we can generate reports that show this information, but
I think there could be a use for dashboard views showing real-time or near
real-time displays of essential data. Reports are all
well and good, particularly when trying to analyze data over a period of time.

What metrics do you think would be of the most value in an
Identity Management Solution?

Thursday, December 12, 2013

People often ask me what comes after Identity Management? My answer is that it depends.

As I've discussed before, it's really up to what the organization requires and simply tacking on new workflows, approvals, and functionality is only really useful if it will actually be used. However, I think we can make a few generalizations about what might occur as part of a far reaching and comprehensive IdM program.

Assuming that the initial project succeeded in its goal of creating an authoritative data store and basic provisioning, there's always the goal of adding more applications and additional attributes in existing applications that can be provisioned as part of workflow.

Password management is always popular. If your application allows password provisioning or linking into an SSO or password management solution, this is a great place to add further automation. Your organization's help desk will appreciate it, I'm sure.

Adding in a compliance solution (sometimes referred to as certification or attestation) is always something organizations look into as part of that overall IdM program. Some applications, such as SailPoint IIQ, have this built in, while others, such as SAP GRC or Oracle Identity Governance, are separate but complementary modules to the Identity Management offering.

However, one of the key areas I think the IdM program manager should be looking at is the automation of IT processes. Every day the Help Desk and the System/Network administrators use untracked and unauditable tools to edit user accounts. IT Management and Audit staff have no idea exactly what these administrators are doing on the job. At the very least, there is the possibility that users will accidentally be granted the wrong entitlements, and in the worst case, there could be the creation of undocumented SuperUsers. If we can direct these actions through the user provisioning application, then we have an audit trail that tells us:

Who was worked with

What was done to them

Who did the work

When the work happened

It also becomes a lot easier to do these tasks when they are placed in the IdM solution. This lets our Server Admin and Help Desk teams work on the more detailed analysis and troubleshooting that they were hired for rather than mundane user management, all while creating a more secure and audited environment.
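As a minimal sketch of what such an audit trail could look like as a table, here is a hypothetical T-SQL definition; the table and column names are invented for illustration and are not part of any IdM product's schema:

```sql
-- Hypothetical audit trail table; names are illustrative only.
CREATE TABLE idm_audit_log (
    audit_id     INT IDENTITY PRIMARY KEY,
    target_user  NVARCHAR(128) NOT NULL,              -- who was worked with
    action_taken NVARCHAR(256) NOT NULL,              -- what was done to them
    performed_by NVARCHAR(128) NOT NULL,              -- who did the work
    performed_at DATETIME2 DEFAULT SYSUTCDATETIME()   -- when the work happened
);
```

Each provisioning action routed through the IdM solution would write one row here, giving Audit staff a single place to answer all four questions.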

Tuesday, June 11, 2013

Deadlocks are the bane of those of us responsible for
designing and maintaining any type of database system. I’ve written about these
before on the dispatcher
level. However this time around, I’d like to discuss them a little further
“down” so to speak, at the database level. Also, in talking to various people
about this topic, I've found that it’s potentially the most divisive question
since “Tastes good vs. Less filling.”

Database deadlocks are much like application-level ones: they typically
come about when two processes try to access the same database rows at the same
time, each holding a lock the other needs. Most often this happens when the
system is trying to read and write the same row at the same time. A nice
explanation can be found here.
What we essentially wind up with is the database equivalent of a traffic jam
where no one can move. It’s interesting to note that Oracle and Microsoft SQL
Server handle these locking scenarios differently. I’m not going to go into DB2
at the moment but will address it if there is sufficient demand.
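To make the scenario concrete, here is a hypothetical two-session sequence that produces a classic deadlock in SQL Server; the table and column names are invented for illustration:

```sql
-- Hypothetical deadlock scenario; table/column names are invented.

-- Step 1, session 1:
BEGIN TRANSACTION;
UPDATE UserValues SET AttrValue = 'a' WHERE RowID = 1;  -- session 1 locks row 1

-- Step 2, session 2:
BEGIN TRANSACTION;
UPDATE UserValues SET AttrValue = 'b' WHERE RowID = 2;  -- session 2 locks row 2

-- Step 3, session 1:
UPDATE UserValues SET AttrValue = 'a' WHERE RowID = 2;  -- blocks, waiting on session 2

-- Step 4, session 2:
UPDATE UserValues SET AttrValue = 'b' WHERE RowID = 1;  -- blocks, waiting on session 1: deadlock
-- SQL Server's deadlock monitor picks one session as the victim and rolls it back.
```

Neither session can proceed because each is waiting for a lock the other holds, which is exactly the traffic jam described above.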

When dealing with SQL Server, management of locks is handled
through the use of the “hint” called NOLOCK. According to MSDN:

Hints are options or strategies specified for enforcement by
the SQL Server query processor on SELECT, INSERT, UPDATE, or DELETE statements.
The hints override any execution plan the query optimizer might select for a
query. (Source)

Using NOLOCK is the same as using
READUNCOMMITTED, which some of you might be familiar with if you did the
NetWeaver portion of the IDM install when setting up the data source. Using
this option keeps the SQL Server database engine from issuing shared locks on
reads. The big issue here is that one runs the risk of dirty reads, that is,
seeing uncommitted (and possibly rolled-back) data. Be careful when using
NOLOCK for this reason. Even though the SAP Provisioning Framework makes
extensive use of the NOLOCK functionality, they regression test the heck out
of the configuration. Make sure you do, too; misuse of NOLOCK can lead to bad
things happening in the Identity Store database.
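As a sketch of what this looks like in practice (the table name is illustrative, not a claim about the Identity Store schema), the hint goes on the table reference, or the equivalent isolation level can be set for the whole session:

```sql
-- Read without taking shared locks; may return dirty (uncommitted) data.
SELECT MSKEY, AttrValue
FROM UserValues WITH (NOLOCK)
WHERE AttrID = 42;

-- Session-level equivalent of the NOLOCK hint:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
```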

There is also a piece of SQL Server functionality referred
to as Snapshot Isolation, which uses row versioning, with the row versions
held in TEMPDB, so that readers see a consistent snapshot of the data instead
of blocking or reading dirty data (source).
This functionality was recommended by a DBA I worked with on a project some
time ago. The functionality was tested in DEV and then rolled to the customer’s
PRODUCTION instance.
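For reference, snapshot isolation is switched on per database and then requested per session; the database name below is a placeholder, and as noted, this should only be done together with your DBA and after testing:

```sql
-- Allow sessions to request SNAPSHOT isolation (row versions kept in TEMPDB).
ALTER DATABASE MyIdentityStore SET ALLOW_SNAPSHOT_ISOLATION ON;

-- Optionally make ordinary READ COMMITTED reads use row versioning as well:
ALTER DATABASE MyIdentityStore SET READ_COMMITTED_SNAPSHOT ON;

-- A session then opts in explicitly:
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
```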

Oracle is a little different in the way that it approaches
locking, in that the system has more internal management of conflicts through
its use of undo (rollback) segments, which give readers a consistent view of
the data without blocking writers, and thus deadlocks occur much less often
(Source). This means that there is no similar NOLOCK functionality in the
Oracle Database System.

One final thing to consider with database deadlocks is how
the database is being accessed, regardless of the database being used. It is considered a best practice in SAP IDM
to use To Identity Store passes as opposed to uIS_SetValue whenever possible (Source).

At the end of the day, I don’t know that I can really tell
you whether to employ these mechanisms or not. In general, we do know that it’s
better not to have deadlocks than to have them, and you should do what you can
to achieve this goal. If you are going to use these techniques, make sure you
are doing so in concert with your DBA team and after careful testing. I have
seen Microsoft SQL Server’s Snapshot Isolation work well in a busy production
environment, but I will not recommend its universal adoption, as I can’t tell
you how well it will work in your environment. I will, however, recommend that
you look into it with your DBA team if you are experiencing deadlocks in SQL
Server.

Thursday, March 21, 2013

There has been quite a bit of discussion about the potential futures of SAP IDM and SAP GRC. SAP has just started a survey so that they can get customer input. I would encourage all customers using or considering these products to take the poll.

I've worked with the integration between the two products several times now, and I can honestly say that I have never achieved the results that I wanted. As I've thought about the issues that have kept me from getting what I (and of course, my clients) want, it all seems to come down to the architecture.

The way SAP would have it, GRC is the brains, VDS the nervous system, and IDM is the muscle. IDM workflow does all the work using the various frameworks (Provisioning, Exchange, GRC, Lotus Notes, etc.) while it checks with GRC via VDS to tell it what to do.

The problem as I see it is that there are:

Too many moving parts - IDM, VDS via WebServices to GRC, back to IDM

Not enough information that passes back from GRC - We don't see why things are rejected and it's not clear what is happening.

A lack of ways that conflicts can be addressed from IDM - This means that the "Security Desk" needs to get involved so they can fix the issue.

So how should this be addressed? One option is a tighter, more direct integration, one where more information is passed, so that IDM becomes the "face" of GRC, allowing for mitigation and remediation activities. However, I do not know that the current SAP architecture really supports this. Therefore, I think it makes more sense for IDM to "consume" GRC and make the GRC functionality part of IDM.

IDM already has a very basic concept of Segregation of Duties through its Role Mutual Exclusion functionality. Having logic that determines what should be "Mutually Excluded" based on GRC-type functionality makes sense. However, as SAP Roles map to IDM Privileges, it would also be necessary for this concept to be extended to the IDM Privilege level.

Finally, this new functionality would need to include the ability to implement periodic entitlement reviews (sometimes referred to as attestation or certification). Since in a typical SAP Landscape implementation IDM is connected to HCM with Manager and Organizational properties already defined, IDM is in an excellent position to use its Presentation Layer, Notifications, and Identity Store database to support this.

This is just my opinion, and I have registered it via the survey posted above. Go register yours!

Tuesday, March 19, 2013

Hi there. I know it's been a while since I posted here, but it's not because I'm not working on NetWeaver IDM or writing. I've been doing a lot of the former and a bit of the latter. In order to help promote the growth of a NW IDM technical knowledge base, I've been posting most of my IDM specific things on the SAP Community Network Blog. I'll still be posting here from time to time, but it will more likely be architectural or opinion related pieces about IDM.

To that point I'd like to talk about the seldom discussed Virtual Directory Server. I've always loved VDS and its MaXware predecessor, MVD. There's just so much this product can do. While most of the SAP world is familiar with the Virtual Directory as a Web Services proxy for GRC or use with HCM, it is so powerful and flexible that it can do everything from provisioning to authorization and authentication management, to representing data sources in all kinds of different ways.

That's one of the things I'd like to talk about today. Ask most Directory Services administrators about a recommended architecture and they will tell you straight out, "flat, as flat as possible." However there are a number of reasons that this tends not to happen.

So how do we deal with this? Simple: via the Virtual Directory Server. Set up the flat structures that the administrators want, then use VDS to represent the directory with different views: deep views organized by geography, department, types of equipment, whatever. Present the displayName and other attributes as the different divisions request. Create separate customer-facing views of your Identity Data.

Also, don't be limited to using Directory Services information for your virtual view of data; use the Identity Store, UME, and other sources, separately or joined together, to create your new interface. Information on this can be found here. The advantage is that you can create a virtually (if you'll pardon the pun) unlimited number of data representations. Now go forth and create Virtual Directories; make your Identity Management group the "Can do!" group that provides the flexibility your external customers need while delivering the efficiency the back office wants.

Tuesday, October 23, 2012

One of the things I've been hearing (and experiencing) lately is that there are a lot of questions about how SAP IDM works. Sometimes it's a functionality question, sometimes it's an enhancement request, other times it's a bug report. Taking a look at the SAP IDM SDN forum, one can see several instances of all of these issues. However, some feel that the actual issues are never recognized by SAP. This leads to feelings of frustration and that IDM is too complicated.

There's actually a pretty simple resolution to this. When you have a problem, log an OSS note; that's what they are there for. Too often, we bolt right to the forums to get answers, and that's not a bad idea at all, but if the issue is significant enough, we need to inform SAP formally.

There's a few long term benefits to this as well:

OSS Notes gives SAP Support the opportunity to learn what we are doing in the field and what they should expect to see.

SAP Product management gets some metrics to see what needs to be improved in the product.

The "institutional" knowledge grows, which results in the production of better wiki entries, SAP Notes, and overall documentation.

It is much easier to escalate an issue with your SAP account manager if you have an OSS number.

Of course, this also places some responsibilities on SAP's side:

Support responses must be timely! I've heard of some significant delays in getting even preliminary answers.

Speaking of preliminary answers, first-level support needs to be able to do more than simply take error codes and logs. We need some real troubleshooting at that level.

There must not be an automatic response of "we cannot support you, you have made customizations." For pity's sake, the system was designed to be configurable, and everyone knows the Provisioning Framework will have some work done to it; that's why it's called a framework!

With a little bit of patience, understanding and work I think the system will work just fine. So please, take the time to document your issues and submit those OSS notes!