Using SAML Assertion Attributes in ForgeRock OpenAM – Concluding Episode: Using SAML Assertion Attributes

This blog post was first published @, included here with permission.

You’ve reached the concluding episode of a four-part video series on using SAML v2 assertion attributes in an application protected by ForgeRock OpenAM. This being the last one in the set, it may seem pointless to read/view this entry independently without first going through the entries below, preferably in exactly the order listed:

1. Protecting a J2EE Application with ForgeRock OpenAM
2. Configuring Federation in ForgeRock OpenAM
3. Configuring Transient Federation in ForgeRock OpenAM
4. Using SAMLv2 Assertion Attributes

We can safely say that the diagram below is the end state of our demonstration:


So what we have here is a client attempting to access the protected J2EE application. The request is intercepted by the OpenAM Policy Agent, which redirects it to an IDP-initiated SSO URL, resulting in a login page being presented to the end user by the IDP. The IDP then validates the credentials supplied by the end user and, if they are found authentic, sends an assertion to the SP containing the user attributes (such as mail and telephonenumber) specified in the federation configuration. Because transient federation is used, the user has no profile at the SP, yet the attributes in the assertion are available in the user’s session for the agent to pass on to the application. It may sound complicated, but I’m confident that the concluding episode of this rather lengthy screen-cast will help you piece it all together.
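To make the last step concrete: a J2EE policy agent can inject session/profile attributes into the request as headers for the application to read. The sketch below parses such a header; the attribute names (mail, telephonenumber) come from this demo, but the "|" multi-value separator is an assumption based on common agent configurations, so check your own agent profile.

```java
// Sketch: reading SAML-sourced attributes that a J2EE policy agent injects
// as request headers. The "|" multi-value separator is an assumption; it
// depends on the agent's profile-attribute configuration.
import java.util.Arrays;
import java.util.List;

public class AssertionAttributes {

    /** Splits a multi-valued agent header into its individual values. */
    static List<String> parseHeader(String headerValue) {
        if (headerValue == null || headerValue.isEmpty()) {
            return Arrays.asList();
        }
        return Arrays.asList(headerValue.split("\\|"));
    }

    public static void main(String[] args) {
        // In a servlet you would call request.getHeader("mail") instead.
        System.out.println(parseHeader("jdoe@example.com"));
        System.out.println(parseHeader("555-0100|555-0101"));
    }
}
```

In the application itself you would read the header from the `HttpServletRequest`; the parsing logic stays the same.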

Thank you for taking the time to read/view my web logs on ‘Using SAML Assertion Attributes’. I sincerely hope they were useful.


Using SAML Assertion Attributes in ForgeRock OpenAM – Episode 03/04 : Configuring Transient Federation in ForgeRock OpenAM

This blog post was first published @, included here with permission.

This is the third episode of a four-part video series on using SAML v2 assertion attributes in an application protected by ForgeRock OpenAM. In the interest of continuity, and to get the context right, it makes sense to read/view the blog posts in the following order:

1. Protecting a J2EE Application with ForgeRock OpenAM
2. Configuring Federation in ForgeRock OpenAM
3. Configuring Transient Federation in ForgeRock OpenAM
4. Using SAMLv2 Assertion Attributes

Let me throw a picture at you:


The diagram is a slightly modified version of the one you would have seen in my earlier blog entry. It has one additional user in the Identity Provider (who of course looks like a world-famous detective, and that’s no coincidence), but no corresponding entry in the Service Provider. In the identity federation configuration earlier, we saw how a user with the id ‘demo’ in the Identity Provider linked her account with her id in the Service Provider. But there can be situations when we want to use federation with identities that exist only at the IDP, while still gaining access to the applications protected by the SP. That’s where transient federation comes into play: it maps the identities from the IDP to a single anonymous user in the SP (a many-to-one mapping).
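On the SP side, that many-to-one mapping is typically a one-line setting. Here is a hedged sketch of the relevant fragment of the SP’s extended metadata (sp-extended.xml); the attribute name transientUser and the value anonymous follow OpenAM’s conventions, but verify them against your own deployment’s metadata:

```xml
<!-- Fragment of sp-extended.xml: all transiently federated users from the
     IDP are mapped onto this single local SP account (many-to-one). -->
<Attribute name="transientUser">
    <Value>anonymous</Value>
</Attribute>
```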


ForgeRock OpenIG as SAML 2.0 Service Provider

This blog post was first published @, included here with permission.

This post is based on the ForgeRock documentation on configuring OpenIG as a SAML 2.0 Service Provider. The video log embedded just below this write-up is a visual representation of what is already in that document. For a detailed study, please read through the documentation and then sit back, relax and watch the demonstration in the screen-cast below.

SAML 2.0, as you probably know, is a standard for exchanging information between a SAML authority (an Identity Provider, a.k.a. IDP) and a SAML consumer (a Service Provider, a.k.a. SP). In the demonstration that follows, ForgeRock OpenAM acts as the Identity Provider and ForgeRock OpenIG acts as the Service Provider. So the authentication of a user is done by the IDP, which then sends a piece of information (an assertion) to the SP that can contain attributes from the user’s profile in the IDP datastore. The SP then uses the information thus obtained (the assertion) to take further action (such as granting the user access).

There are two ways of getting this done:
(i) SP initiated SSO
(ii) IDP initiated SSO

In simple words, in SP-initiated SSO the user contacts the Service Provider, which in turn gets in touch with the Identity Provider; the IDP validates the user’s credentials and then passes a piece of information (an assertion), which can contain the user’s attributes, to the Service Provider. In IDP-initiated SSO, on the other hand, the IDP authenticates the user and then sends an unsolicited message (an assertion) to the SP, which takes further action (such as granting the user access).
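As a concrete illustration of the IDP-initiated variant, OpenAM exposes a JSP endpoint that starts unsolicited SSO. The sketch below composes such a URL; the host names and metaAlias are made-up placeholders, and the idpSSOInit.jsp path follows OpenAM’s conventional endpoint layout.

```java
// Sketch: composing an OpenAM IDP-initiated SSO URL. Host names and the
// metaAlias are placeholders. The spEntityID must be URL-encoded because
// entity IDs are usually URLs themselves.
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class IdpSsoUrl {

    static String build(String openamBase, String metaAlias, String spEntityId) {
        return openamBase + "/saml2/jsp/idpSSOInit.jsp"
                + "?metaAlias=" + metaAlias
                + "&spEntityID=" + URLEncoder.encode(spEntityId, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(build("https://idp.example.com:8443/openam",
                "/idp", "https://sp.example.com:8443/fedlet"));
    }
}
```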

The following two illustrations might give a rough idea:


In our story (above in the illustration and below in the video), a user authenticates against ForgeRock OpenAM (the IDP), which then sends an assertion (containing the user’s mail and employeenumber attributes) to ForgeRock OpenIG (the Service Provider), which applies filters (such as extracting the attributes from the assertion and posting them as username and password) to post the user’s credentials to a protected application (Minimal HTTP Server).
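The "extract attributes and post them as credentials" step can be sketched in plain Java. The form field names username and password are assumptions for illustration; in a real deployment OpenIG expresses this step declaratively in its route configuration rather than in hand-written code.

```java
// Sketch of the credential-replay idea: attributes extracted from the SAML
// assertion become the fields of a login form POST to the protected
// application.
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

public class CredentialReplay {

    /** URL-encodes a map of form fields into an application/x-www-form-urlencoded body. */
    static String formBody(Map<String, String> fields) {
        StringBuilder body = new StringBuilder();
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (body.length() > 0) {
                body.append('&');
            }
            body.append(URLEncoder.encode(e.getKey(), StandardCharsets.UTF_8))
                .append('=')
                .append(URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8));
        }
        return body.toString();
    }

    public static void main(String[] args) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("username", "jdoe@example.com"); // e.g. from the mail attribute
        fields.put("password", "1234");             // e.g. from employeenumber
        System.out.println(formBody(fields));       // username=jdoe%40example.com&password=1234
    }
}
```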

If you’ve only got a vague picture of what’s discussed above, I believe it’ll be clearer after watching the video below:


Introduction to OpenIG (Part 1: Use cases)


You've probably landed here because you want to know something about OpenIG. This is the right place to be :)

This post is the first in a series of OpenIG-related articles that will give you hands-on samples with explanations.

Identity Gateway

OpenIG stands for Open Identity Gateway; it is an identity/security-specialized reverse proxy. That means it controls access to a set of internal services.

By controlling access, I mean that it intercepts all the requests coming to the protected service (be it a RESTful API or a web application) and processes them before (and after) forwarding them to the server.

Different kinds of processing can be handled:

  • Request authorization
  • Capture and password replay
  • Message logging
  • Transformation

Common use cases

OK, that was a rather general description (transformation is intentionally vague :) ). Some real-life use cases will give a better feeling/understanding of what OpenIG is capable of.

SAML Application Federation

In this use case, OpenIG acts as a SAML-enabled facade to a legacy application that cannot be adapted to support SAML federation. The Identity Provider (which could be OpenAM) sees OpenIG as a standard SAML Service Provider.

SAML CoT with OpenIG used as a facade to a legacy application

Application Authentication

Here, OpenIG acts as an OpenID Connect Relying Party (OIDC terminology for client) and requires the user to authenticate to an OpenID Connect Provider (the identity provider) before giving him/her access to the protected application.

The authenticated user's profile information (such as name, email, address, picture, ...) is available to enrich the user experience or to make further verifications.
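To make "profile information" concrete, here is an illustrative set of standard OpenID Connect claims as they might appear in an ID token payload or userinfo response (the values are invented):

```json
{
  "sub": "248289761001",
  "name": "Jane Doe",
  "email": "janedoe@example.com",
  "picture": "https://example.com/janedoe/me.jpg"
}
```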

OpenID Connect - OpenIG Relying Party

RESTful Services Protection

This simple case shows OpenIG verifying requests to a proxied RESTful API: each request must contain a valid OAuth 2.0 bearer token to be allowed to reach the service API. In this case, OpenIG acts as an OAuth 2.0 Resource Server.

Very useful if you have an old-fashioned REST API that you cannot easily update to deal with OAuth 2.0 tokens.
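The first thing a resource server does with each request can be sketched in a few lines: pull the bearer token out of the Authorization header (per RFC 6750) before validating it. This is only the parsing step; the actual validation (token introspection or signature checking) is what OpenIG performs for you.

```java
// Sketch: extracting an OAuth 2.0 bearer token from an Authorization header.
public class BearerToken {

    /** Returns the token, or null if the header is absent or not a Bearer scheme. */
    static String extract(String authorizationHeader) {
        if (authorizationHeader == null) {
            return null;
        }
        String prefix = "Bearer ";
        if (!authorizationHeader.startsWith(prefix)) {
            return null;
        }
        String token = authorizationHeader.substring(prefix.length()).trim();
        return token.isEmpty() ? null : token;
    }

    public static void main(String[] args) {
        System.out.println(extract("Bearer mF_9.B5f-4.1JqM")); // mF_9.B5f-4.1JqM
        System.out.println(extract("Basic dXNlcjpwYXNz"));     // null
    }
}
```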

OAuth 2.0 - OpenIG ResourceServer

Not enough?

Well, we're done with the 'marketing' stuff. In the next post, we will start to play with OpenIG.

See you soon!

Setting up Java Fedlet with Shibboleth IdP

The Java Fedlet is basically a lightweight SAML Service Provider (SP) implementation that can be used to add SAML support to existing Java EE applications. Today we are going to try to set up the fedlet sample application with a Shibboleth IdP (available at

Preparing the fedlet

There are two kinds of Java fedlet in general: configured and unconfigured. The configured fedlet is what you can generate in the OpenAM admin console; it basically preconfigures the fedlet to use the hosted OpenAM IdP instance and also sets up the necessary SP settings. The unconfigured fedlet, on the other hand, is more like starting from scratch (as the name itself suggests :) ), and you have to perform all the configuration steps manually. To simplify things, today we are going to use the configured fedlet for our little demo.

To get a configured fedlet, first you have to install OpenAM of course. Once you have OpenAM set up, create a new dummy hosted IdP (to generate a fedlet it is currently required to also have a hosted IdP):

  • On the Common Tasks page click on Create Hosted Identity Provider.
  • Leave the entity ID as the default value.
  • For the name of the New Circle Of Trust enter cot.
  • Click on the Configure button.

Now, to generate the configured fedlet, let’s go back to the Common Tasks page and click on the Create Fedlet option.

  • Here you can set the Name to any arbitrary string; this will be the fedlet’s entity ID. For the sake of simplicity let’s use the fedlet’s URL as the entity ID, e.g.,
  • The Destination URL of the Service Provider which will include the Fedlet setting, on the other hand, needs to be the exact URL of the fedlet, so for me this is just a matter of copy-paste:
  • Click on the Create button.

This will generate a fedlet that should be available under the OpenAM configuration directory (in my case it was under ~/openam/myfedlets/httpfedletexamplecom18080fedlet/). Let’s grab this file and unzip it to a convenient location. Now we need to edit the contents of fedlet.war itself and modify the files under the conf folder. As a first step, open sp.xml and remove the RoleDescriptor and XACMLAuthzDecisionQueryDescriptor elements from the end of the XML.

At this point we have everything we need to register our fedlet on the site, so let’s head there and upload our metadata (sp.xml). But in order to prevent clashes with other entity configurations, we should first rename the sp.xml file to something more unique, like

After successful registration, the next step is to grab the testshib IdP metadata and add it to the fedlet as idp.xml. There are some small changes we need to make to the metadata to make it actually work with the fedlet:

  • Remove the EntitiesDescriptor wrapping element, but make sure you copy the xmlns* attributes to the EntityDescriptor element.
  • Since the XML now has two EntityDescriptor root elements, you should keep only the one made for the IdP (i.e. the one that has the “” entityID) and remove the other.
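The unwrapping step looks roughly like this (element content abbreviated; the entity IDs are placeholders):

```xml
<!-- Before: the downloaded metadata wraps several entities -->
<EntitiesDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata">
    <EntityDescriptor entityID="...idp-entity-id...">...</EntityDescriptor>
    <EntityDescriptor entityID="...sp-entity-id...">...</EntityDescriptor>
</EntitiesDescriptor>

<!-- After (idp.xml): a single root EntityDescriptor, keeping only the IdP
     entry, with the xmlns* attributes moved onto it -->
<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata"
                  entityID="...idp-entity-id...">...</EntityDescriptor>
```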

The next step is to update the idp-extended.xml file by replacing the entityID attribute’s value in the EntityConfig element with the actual entity ID of the testshib instance, which should be

After all of this we should have all the standard and extended metadata files sorted, so the last thing to sort out is the Circle of Trust between the remote IdP and the fedlet. To do that, we need to edit the fedlet.cot file and update the sun-fm-trusted-providers property to include the correct IdP entity ID:
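For orientation, fedlet.cot is a simple properties file. A sketch with placeholder entity IDs is below; substitute the real IdP and fedlet entity IDs from your metadata:

```properties
# fedlet.cot -- circle-of-trust definition shipped inside the fedlet
cot-name=cot
sun-fm-cot-status=Active
# Comma-separated list of every trusted provider in the CoT:
sun-fm-trusted-providers=https://idp.example.org/idp/shibboleth,http://fedlet.example.com:18080/fedlet
```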


It’s time to start testing now, so let’s repackage the WAR (so it has all the updated configuration files) and deploy it to an actual web container. After deploying the WAR, let’s access it at. Since there is no fedlet home directory yet, the fedlet suggests clicking on a link to create one based on the configuration in the WAR file, so let’s click on it and hope for the best. :)

Testing the fedlet

If we did everything correctly, we finally end up on a page with some details about the fedlet configuration and some links to initiate the authentication process. As a test, let’s click on the Run Fedlet (SP) initiated Single Sign-On using HTTP POST binding link, and we should be facing the testshib login page, where you can provide one of the suggested credentials. After performing the login, an error message is shown by the fedlet saying Invalid Status code in Response. After investigating a bit further, the debug logs under ~/fedlet/debug/libSAML2 tell us what’s going on:

SPACSUtils.getResponse: got response=<samlp:Response xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" ID="_0b33f19185348a26fffe9c3a1aa6e652" InResponseTo="s2be040cf929456a64f444527dfc1d7413ce178531" Version="2.0" IssueInstant="2013-12-04T18:39:32Z" Destination=""><saml:Issuer xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion" Format="urn:oasis:names:tc:SAML:2.0:nameid-format:entity"></saml:Issuer><samlp:Status xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol">
<samlp:StatusCode xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol"
<samlp:StatusMessage xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol">
Unable to encrypt assertion

Now we can also look at the testshib logs (see the TEST tab), and they will tell us what the real problem was:

Could not resolve a key encryption credential for peer entity:

So this tells us that the Shibboleth IdP tries to generate an encrypted assertion for our fedlet instance but fails to do so, because it is unable to determine the public certificate for the fedlet. This happens because the basic fedlet metadata does not include a certificate by default. To remedy this, let’s do the following:

  • Acquire the PEM encoded certificate for the default OpenAM certificate:
    $ keytool -exportcert -keystore ~/openam/openam/keystore.jks -alias test -file openam.crt -rfc
  • Drop the BEGIN and END CERTIFICATE lines from the cert, so you only have the PEM encoded data, and then add some extra XML around it so that it looks something like this:
    <ds:KeyInfo xmlns:ds="">
  • Add the KeyDescriptor under the SPSSODescriptor as a first element.
  • Upload the updated SP metadata with the same filename ( again at the testshib site.
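Put together, the element to insert should look roughly like the following. The certificate body is truncated, and while the xmldsig/xmlenc namespaces and the aes128-cbc EncryptionMethod follow the usual fedlet metadata shape, double-check the exact structure against your own sp.xml:

```xml
<KeyDescriptor use="encryption">
    <ds:KeyInfo xmlns:ds="http://www.w3.org/2000/09/xmldsig#">
        <ds:X509Data>
            <ds:X509Certificate>
                MIIB...PEM body exported by keytool...
            </ds:X509Certificate>
        </ds:X509Data>
    </ds:KeyInfo>
    <EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#aes128-cbc">
        <xenc:KeySize xmlns:xenc="http://www.w3.org/2001/04/xmlenc#">128</xenc:KeySize>
    </EncryptionMethod>
</KeyDescriptor>
```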

Since the decryption process requires the private key of the certificate, we need to ensure that the private key is available, so let’s do the following:

  • Copy the ~/openam/openam/keystore.jks file to the fedlet home directory
  • Visit the page and enter changeit (the password of the default keystore and private key as well).
  • Grab the encrypted value and create ~/fedlet/.storepass and ~/fedlet/.keypass files containing only the encrypted password.
  • Open up ~/fedlet/sp-extended.xml and ensure that the encryptionCertAlias setting has the value of test.
  • Restart the container, so the changes are picked up.

At this stage we should retry the login process, and if nothing went wrong you can see all the nice details of the received SAML assertion from! It works! :)

Identity & Access Management: Give Me a REST

Give me a REST (or a two-week stay in a villa in Portugal if you're asking...).  RESTful architectures have been the general buzz of websites for the last few years.  The simplicity, scalability and statelessness of this approach to client-server communication have been adopted by many of the top social sites such as Twitter and Facebook.  Why?  Well, in their specific cases, developer adoption is a huge priority.  Getting as many Twitter clients or Facebook apps released increases the overall attractiveness of those services, and in a world where website and service competition is as high as ever, that is a key position to sustain.


Cute picture of RESTing lion [1]
The evolution and move to REST is quite a clear one from a benefits and adoption perspective.  REST re-uses many of the standard HTTP protocol verbs such as GET, POST and DELETE, applied to resource URLs.  These verbs are well understood and well used, so there's no new syntactic sugar to swallow.  Each component of the service owner's database is abstracted into neatly described resources that can be accessed using the appropriate URI.  Requests can then be made to return, say, a JSON or XML representation of the underlying database object.

The client, permission granted, can then in turn update or create a new object in the same way, by sending a new JSON object via a PUT or POST request.

What's This Got To Do With IAM?

Identity management has often been thought of as an enterprise or organizational problem, focusing on the creation and management of company email, mainframe and ERP system accounts.  This process brought with it all the complexity of business workflow definition, compliance, audit, system integration and so on.  Access management, on the other hand, has often been focused on single sign-on, basic authorization and web protection.  IAM today is a much more complex and far-reaching beast.

Organizations are reaching out into the cloud for services, APIs and applications.  Service providers and applications are becoming identity providers in their own right, reaching back out to consumers and businesses alike.  For once, identity management is on the tip of the tongue of even the most tech-avoiding consumers, concerned with privacy, their online identities and how they can be managed and consumed.

A RESTful Future

These new approaches to identity and access management require rapid integration, developer adoption and engine-like APIs that can perform in an agile, scalable and secure fashion.  Identity and access management services for consumers, such as being able to log in with a Facebook or Twitter account using OAuth or OAuth2 without having to create and manage multiple passwords for the other sites they interact with, do more than increase user convenience.  They also put pressure on business security strategies, which can struggle to cope with employees bringing their own identity to many now-popular business services such as Webex, Dropbox, Salesforce and the like.

As identity management is no longer solely concerned with siloed, business-unit or organisational boundaries, and is looking more towards being fully connected, integrated and focused on consumerization, developer adoption has never been more important.  Security in general has never been a high priority for application builders, who are more centred on features and end usability.

Identity and access management is making a big change to that area with many access management systems being easy to externalize from application logic using RESTful integration.

By Simon Moffatt

[1] - Image attribute Stock.Xchng

BYOID: An Identity Frontier?

[bee-oi]. [b-yoy]. [be-yo-eye]. [bee-oy-ed].  Whichever way you pronounce it, the concept of bringing your own identity to the party is becoming a popular one.  Just this week Amazon jumped on the identity provider bandwagon by introducing its 'Login With Amazon' API.  What's all the fuss?  Isn't that just the same as the likes of Twitter and Facebook exposing their identity repositories so that 3rd-party application and service developers can leverage their authentication framework without having to store usernames and passwords?

Well, in a word, yes.  With the continual threat of hacking and cracking of consumer and business user account repositories, why wouldn't you as a developer simply take advantage of someone else storing the password details on your behalf?  No need to worry about whether to use encryption or hashing, which algorithm to use, how to handle password reset management and so on.  Simply perform a call-out at authentication time, probably using your favourite language and a simple web standard, and away you go.  Simples.

Why It's Cool To Be An Identity Provider

So why have all of these identity providers sprung up?  Well, in essence they haven't.  The Security Assertion Markup Language (SAML) has been around for the best part of 12 years and, in the last 8 or so, has been implemented in numerous federation-related projects across the education, public and private sectors.  SAML has the concept of an identity provider, but that concept has been expanded in recent years to be more consumer orientated.  With the rise of social networking sites that house millions of user records, and newer protocols such as OAuth (1 and 2), the use and consumption of these vast repositories is simpler.  From a social platform perspective (Facebook) or from a consumer outreach perspective (Amazon), it makes pretty good business sense to get more applications, services and, in turn, users using the sites and services they manage.  If you make cricket bats, you want to help those who make the balls.  It's the same with being an identity provider.  The consumer has a familiar (and to an extent trusted) account they use to log in with.  It's single sign-on, without all the password hiding and syncing stuff.

If The Consumer's Happy The Orgs Are Happy Right?

Well, if you're an identity provider, I'd imagine you are pretty ecstatic.  You had to manage the user's password and account details in the past anyway, so in theory there should be only a marginal cost to exposing that information via an API to your app developers or associated service providers.  Consumers are content, as they don't have to create multiple accounts or remember lots of passwords each time they sign up to a new service or application.  But what about the non-consumer side of things?  Perhaps these users are also employees, using their newly found identity freedom to access services and applications that are being used for business purposes.  Perhaps CRM systems, hosting providers, calendars, software repositories, syncing apps and so on.  How can that be managed and controlled?

Business Evolution Now Revolves Around Identity Management

I wrote recently about how modern enterprises now face significant challenges as they expand into new business partnerships, with complex supply lines and an ever-evolving mobile, and now identity-aware, workforce.  For many organizations, continued evolution into new areas of revenue, especially within the business-to-consumer space, will rely heavily on being able to manage both internal employee access to cloud-based services and their own consumers' access to their own resources.  Both can be complex, requiring unprecedented scalability and web enablement.

'The consumer is king' now really translates to 'identity is king'.

By Simon Moffatt

Forget Firewalls, Identity Is The Perimeter

"It is pointless having a bullet-proof, double-locked front door if you have no glass in your windows."  I'm not sure who actually said that (if anyone...), but the analogy is pretty accurate.  Many organisations have relied heavily in the past on perimeter-based security.  That could be the network perimeter or the individual PC or server perimeter.  As long as the private network was segregated from the public one via a firewall, the information security manager's job was done.  Roll on 15 years and things are somewhat more complex.

"Identity as the perimeter" has been discussed a few times over the last year or so and I'm not claiming it as a strap line - albeit it is a good one at that.  But why is it suddenly becoming more important?

The Extended Enterprise - Mobile, BYOD, Consumer

Organizations of all sizes are no longer central places of work, with siloed business units headed up by managers in glass-walled offices.  Remote working is no longer limited to cool startups or the creative industries.  Desktop PCs are now in decline, with tablets and smartphones able to handle the majority of work-related use cases.  Many organisations now have complex supply chains, leveraging partners, sub-partners, clients and consumers.  Outsourced services and applications now make up a large percentage of an organization's delivery management process, with these services often keeping authentication and user management controls outside the standard corporate LDAP.

The increased use of BYOD, mobile workforces, and outsourced and consumer-led service interaction requires a much more integrated and agile view of an identity, but it also requires CISOs to view data protection and segregation in a much more user-centric way.

There Is No Network Separation - Everything is Connected

Everything is connected.  You can receive corporate email on your smartphone over a privately paid-for carrier network.  Remote backup and file-sync solutions allow sensitive files to be stored off site without the knowledge of a DLP solution.  There is no longer a 'corporate' network with strong demarcation lines.  Whilst this has obvious user benefits and efficiency gains, it has opened up new areas for security management.  The one thing staying relatively static is that of the identity driving this change.  I don't mean the role and concept of identity is static; quite the opposite.  But identities are still the driving force behind application interactions, network traffic analysis, DLP techniques, firewall management and so on.  Each transaction should be linked in some way to an identity.  This identity could be well masked through alias upon alias, but there are fewer and fewer chances for a truly anonymous computer interaction.

Extend the Enterprise or Stretch the Cloud?

These identities are developing in multiple directions.  The traditional corporate view of an identity originating from an authoritative source such as HR and flowing via provisioning systems to a target directory or database is still present.  Complex workflows and RBAC projects will keep many a consultant in work for years to come.  But with the onset of the extended enterprise and the increased use of social and cloud-based identity brokers and platforms (Google, Facebook et al.), there is a need for fusion: the ability for organisations to extend to the 'cloud', and for internet-based services and brokers to reach out securely to traditionally standalone organizations with their new apps and services, whilst still keeping the user experience convenient.  But where to start?  Traditional enterprise identity and access management solutions are often too static and unable to scale to internet-style proportions.  Internet-focused identity has been about single sign-on, federation and authentication via social platforms.  Organizations need to be able to manage interactions with 3rd-party service providers from a centralised, potentially policy-driven, authentication and authorization perspective.  It's pointless disabling a contractor's internal LDAP account if they still have an active Salesforce or Dropbox account after they've left.

Compliance doesn't just fade away in cloud and internet based scenarios.  There are still stringent controls that need to be adhered to, in order for an organization's identity management platform to be both convenient and effective.

By Simon Moffatt