What’s New in ForgeRock Access Management 5?

ForgeRock this week released version 5 of the ForgeRock Identity Platform of which ForgeRock Access Management 5 is a major component. So, what’s new in this release?

New Name

The eagle-eyed amongst you may be asking yourselves about that name and version. ForgeRock Access Management is actually the new name for what was previously known as OpenAM. And as all components of the Platform are now at the same version, this becomes AM 5.0 rather than OpenAM 14.0 (though you may still see remnants of the old versioning under the hood).

Cloud Friendly

AM5 is focused on being a great Identity Platform for Consumer IAM and IoT, and one of the shared characteristics of these markets is high, unpredictable scale. So one of the design goals of AM5 was to become more cloud-friendly, enabling a more elastic architecture. This has resulted in a simpler architectural model where individual AM servers no longer need to know about, or address, other AM servers in a cluster; they can act autonomously. If you need more horsepower, simply spin up new instances of AM.

DevOps Friendly

To assist with the casual “Spin up new instances” statement above, AM5 has become more DevOps friendly. Firstly, the configuration APIs to AM are now available over REST, meaning configuration can be done remotely. Secondly, there’s a great new tool called Amster.

Amster is a lightweight command-line tool which can run in interactive shell mode, or be scripted.

A typical Amster script looks like this:

connect http://www.example.com:8080/openam -k /Users/demo/keyfile
import-config --path /Users/demo/am-config
:exit

This example connects to the remote AM5 instance, authenticating using a key, then imports configuration from the filesystem/git repo, before exiting.

Amster is separately downloadable and has its own documentation too.
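Amster can also go the other way: exporting an instance's configuration so it can be checked into git or used to stamp out further instances. A sketch of such a script (the hostname and paths are illustrative, and assume Amster's export-config command):

```
connect http://www.example.com:8080/openam -k /Users/demo/keyfile
export-config --path /Users/demo/am-config
:exit
```

Running this on a schedule, or as part of a CI pipeline, keeps a versioned record of your AM configuration.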

Developer Friendly

AM5 comes with new interactive documentation via the API Explorer. This is a Swagger-like interface describing all of the AM CREST (Common REST) APIs and means it is now easier than ever for devs to understand how to use these APIs. Not only are the Request parameters fully documented with Response results, but devs can “Try it out” there and then.
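As a taste of what the API Explorer documents, the classic authentication endpoint can be called directly with curl (hostname and credentials here are illustrative, not defaults you should keep):

```
curl -X POST \
  -H "X-OpenAM-Username: demo" \
  -H "X-OpenAM-Password: changeit" \
  -H "Content-Type: application/json" \
  "http://openam.example.com:8080/openam/json/authenticate"
```

A successful call returns a JSON body containing a tokenId, the session token for subsequent requests.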

Secure OAuth2 Tokens

OAuth2 is great, and used everywhere. Mobile apps, Web apps, Micro-services and, more and more, in IoT.
But one of the problems with OAuth2 access tokens is that they are bearer tokens. This means that if someone steals one, they can use it to gain access to the services it grants access to.

One way to prevent this is to adopt a new industry standard approach called “Proof of Possession” (PoP).

With PoP the client provides something unique to it, which is baked into the token when it is issued by AM. This is usually the public key of the client. The Resource Server, when presented with such a token, can use the confirmation claim/key to challenge the client, knowing that only the true client can successfully answer the challenge.

Splunk Audit Handler

Splunk is one of the cool kids, so it makes sense that our pluggable Audit Framework supports a native Splunk handler.

There are a tonne of other improvements in AM5 that we don’t have time to cover, but you can read about some of the others in the Release Notes, or download it from Backstage now and give it a whirl.

This blog post by the Access Management product manager was first published @ thefatblokesings.blogspot.com, included here with permission.

ForgeRock Identity Platform 5.0 docs

By now you have probably read the news about the ForgeRock Identity Platform 5.0 release.

This major platform update comes with many documentation changes and improvements:

Hope you have no trouble finding what you need.

This blog post was first published @ marginnotes2.wordpress.com, included here with permission.

Have I Been Pwned Authentication Module

Bit busy at the moment but I wanted to write a quick blog on something rather cool. I am a big fan of Troy Hunt’s https://haveibeenpwned.com/. Troy performs a much needed public service by collating the results of the many, many data breaches that keep happening and making them searchable.
So if you want to check whether your account has been involved in a breach, you enter your email and can see all the data breaches involving it.

The site also sends notifications when your email is detected in a breach, allowing you to change your passwords and ensure your accounts are secure.

I thought it would be quite cool to write an HIBP module for OpenAM. My colleague Jon Knight was kind enough to wrap this up and put together a bit of UI for it.

So what this does is enable an optional authentication step: after you log in, we check to see if your email is in the HIBP database and, if it is, we can warn you that you may want to change your password. The module, along with instructions to configure it, can be found here:

https://github.com/jeknight/pwnedAuthModule

This is just a proof of concept; rather than warn the user, you could instead attach a higher level of risk to interactions with them, perhaps enforcing the use of 2FA. This can all be achieved easily with OpenAM.
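To sketch that risk-based idea in JavaScript (the function and field names here are hypothetical, not the module's actual code), the post-login decision might boil down to something like:

```javascript
// Hypothetical decision helper: given the list of breaches returned for
// an account, decide how the authentication flow should proceed.
function assessBreachRisk(breaches) {
  if (breaches.length === 0) {
    return { action: 'allow' };
  }
  // Breaches that exposed passwords are higher risk than, say, ones
  // that only leaked email addresses.
  const sensitive = breaches.some(b =>
    (b.dataClasses || []).includes('Passwords'));
  return sensitive
    ? { action: 'require2fa', reason: 'password exposed in a breach' }
    : { action: 'warn', reason: 'account seen in a breach' };
}

console.log(assessBreachRisk([]).action); // allow
console.log(assessBreachRisk([{ dataClasses: ['Passwords'] }]).action); // require2fa
```

In an OpenAM module the `require2fa` outcome would route the user to a second-factor step rather than simply warning them.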

This blog post was first published @ http://identity-implementation.blogspot.no/, included here with permission from the author.

Introducing IDM Workflow

ForgeRock Identity Management includes an OOTB workflow engine based on BPMN (Business Process Model & Notation). This isn’t unique; most identity management solutions have some form of workflow engine. However, in my experience they are typically based on some proprietary technology and/or very painful to work with.

I have recently had to build some workflows for various customer Proof of Concepts and I am really impressed by how quickly you can pull something together so I wanted to write up a blog entry.

So in this blog we are going to use a brand new instance of IDM (installed locally) and create a simple request and approval workflow which we will then test.

I am going to use Eclipse for this, though there are other BPMN editors. I am also not going to spend much time talking about BPMN beyond what we need to build a meaningful workflow. Much more information is available here. In the spirit of this blog I am just going to get on with it and walk you through the basic steps to build and test a simple workflow.

Additionally, the workflow samples that ship with IDM are a brilliant place to start. I highly recommend taking a look at them and using them as the basis of your workflows until you get comfortable building them yourself.

Getting Started

I am going to assume you have an installation of IDM already; if not, check out my IDM beginners series.

IDM ships with a built in version of the Activiti workflow engine: https://www.activiti.org/. We are going to use the free Eclipse Activiti Workflow Designer to build our workflows.

Firstly, download and install the Eclipse IDE.

When you have Eclipse installed, fire it up and navigate to Help -> Install New Software:

Click Add:

Enter the following location: https://www.activiti.org/designer/update/ and press OK.

Wait for the installation process to complete. Now that is all out of the way, let’s get started!

Create a New Project

Navigate to File -> New Project:
Then select General -> Project (we do not need an Activiti Project):
Give your project a name:

And press Finish.

Building a New Workflow

Right click on the new project, select New File:
Give it a name similar to the following:

.bpmn20.xml is the convention we use for workflow files in IDM.
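Under the hood, the file we are about to build up graphically is just BPMN 2.0 XML. A minimal skeleton (the IDs and names here are illustrative) looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
             xmlns:activiti="http://activiti.org/bpmn"
             targetNamespace="http://www.activiti.org/test">
  <process id="SimpleWorkflow" name="Request with Justification">
    <startEvent id="start"/>
    <!-- script tasks, user tasks, gateways and sequence flows
         added via the designer end up here -->
    <endEvent id="end"/>
  </process>
</definitions>
```

You rarely need to hand-edit this, but it is useful to know what the designer is generating when IDM complains about parsing.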
Finally right click on our new workflow, select Open With and Other…

And select Activiti Diagram Editor.
Ok, you should now have a blank canvas that looks a bit like this:
Let’s get started. The first thing we need is a Start Event. Drag one over from the menu on the right and drop it somewhere on the workflow canvas.
Now, we need some workflow steps. As we are building a simple request and approval workflow, we probably need:
  • A step to actually create a request for something (that is actually our StartEvent we just created).
  • A step to gather some information and determine who the request needs to go to for approval.
  • A step for the actual approval.
  • A step for processing the result. Typically you also want to send an email containing the response. In fact, we probably need two steps here, one for success and one for failure.
We will build the workflow steps and connect them together first, before implementing the actual logic. Select a Script Task from the menu on the right and drag it on to our canvas.
You should now have something like this:
We probably want to give our Script Task a name, click on it and give it a new name:
Just replace the Name value with something appropriate:
We also need to make sure that all Script Tasks have script defined so that IDM can parse them successfully. The easiest way to do this is to add a simple logging statement to the task. Select the Process Request task again, then the Main config tab, and add some simple script:
Now you can use either JavaScript or Groovy for scripting. I tend to use Groovy but that is just personal choice.
java.util.logging.Logger logger = java.util.logging.Logger.getLogger("")
logger.info("SimpleWorkflow - Process Request")
Make sure you save your work.
Now select the Start Event, you should see the following menu:
Click on the Create Connection arrow, but keep the mouse button held down and drag a connection over to our Process Request task. We now have a flow linking the two tasks in sequence:
Next we need a User Task, similar to before select User Task from the menu on the right, drag it into the canvas, give it a name and connect it to the Process Request task.
Ok, now things get a little more complicated, as a request could be either approved or rejected. So we need a Gateway, specifically an Exclusive Gateway:
We also need two new Script Tasks; build the workflow as below:
Remember to add some simple script to Main config for each new task, otherwise parsing of the workflow will fail.
Finally we need an EndEvent:
Put this to the right of the Approved and Rejected tasks, and connect them to it as below:
We now have a basic workflow outline, time to make it actually do something.

Workflow Logic

StartEvent

We are going to start with our StartEvent. What this actually translates to is the form that a user will complete in self service to make their request.
Click on the StartEvent and select the Form tab
Click New. You should see the following:
Fill it in exactly as I have below then press OK:

 

We should now have an attribute on our form:
Let’s add another one, justification is a common field when making a request, add it in exactly the same way:
One more thing before we test this in IDM. Click somewhere on the canvas until you can edit the process Name.
Change My process to something meaningful like Request with Justification. Make sure you save the workflow.

Testing the Workflow in IDM

Although our workflow isn’t really doing anything yet, this is a good time to quickly test what it looks like in IDM.
Fire up IDM and navigate to the openidm directory, create a new workflow directory:
Now copy the SimpleWorkflow.bpmn20.xml file into the new workflow directory, you should see IDM pick it up in the logs. In fact you will probably see a warning which we will ignore for now.
Log in to IDM as a user other than openidm-admin, someone you have created previously; I’m using a user called jbloggs. Remember to log in to user self service: http://localhost.localdomain.com:8080/#login/
You should see the dashboard, and our new process!
Click Details and you can see the form we created earlier! You can enter a request and justification but do not hit Start, because right now nothing will happen.

More Workflow Logic

Ok, so we can make a request now, but it doesn’t go to anyone. Let’s fix that, there are a few things to do here. Firstly we need to gather some more information about the requestor for the approver. Select the StartEvent then Main config and set Initiator as “initiatorId”:
Select the Process Request task and enter the following script as Groovy into Main config, then save your work.
java.util.logging.Logger logger = java.util.logging.Logger.getLogger("")
logger.info("SimpleWorkflow - Process Request " + initiatorId);
// find user
readStartUserFromRepoParams = [_queryId:'for-userName',uid:initiatorId]
qresults = openidm.query('managed/user', readStartUserFromRepoParams)
// get user details
users = qresults.result
execution.setVariable("userId", users[0]._id)
execution.setVariable("userName", users[0].userName)
execution.setVariable("givenName", users[0].givenName)
execution.setVariable("sn", users[0].sn)
execution.setVariable("mail", users[0].mail)
// set approver
execution.setVariable("approverId", "openidm-admin")
All we are doing here is retrieving the user data for the initiating user and setting it as variables in the workflow process. We are also assigning the user who will approve the request (simply as a static ID here, but you can easily make this dynamic, for example by assigning the task to a group or to a manager – I may cover this in a later blog).
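For instance, a dynamic assignment might look up the requester's manager instead. A sketch only: the query filter and the idea of a `reports` relationship are assumptions, and inside a real workflow script the `openidm` object is provided by the engine; it is stubbed here so the snippet stands alone:

```javascript
// Stub of IDM's scripting object so this sketch is self-contained;
// in a real workflow script `openidm` is injected by the engine.
var openidm = {
  query: function (resource, params) {
    return { result: [{ _id: 'mgr-1', userName: 'jdoe.manager' }] };
  }
};

// Look up the requester's manager and assign them as the approver,
// falling back to a static approver if no manager is found.
function resolveApprover(initiatorId) {
  var managers = openidm.query('managed/user',
      { _queryFilter: '/reports eq "' + initiatorId + '"' }).result;
  return managers.length > 0 ? managers[0].userName : 'openidm-admin';
}

var approverId = resolveApprover('jbloggs');
console.log(approverId); // jdoe.manager, with the stub above
```

The resulting `approverId` would then be set on the process with `execution.setVariable`, exactly as in the static version.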
A few more steps, we need to set the assignee for the approval task. Select the Approve Request task and enter the following for assignee:
We also need to configure the approval form as below to show the data we just collected:
We also need to add the request fields the user just filled in:
We also need to add an approvalResult to the form. This field is a little different as it is an enum:
Because this field is an enumeration we need to add some form values for the user to choose, press New:
And configure the Form Value configuration as below and press OK:
Do the same for “rejected”, and you should end up with the following:
Save your work.

Back to IDM

Again, copy the workflow file into IDM. Now logout, login as a user and select the workflow and populate the request:

 

Press Start. All being well you should see confirmation the workflow process has been started. Now log out and log back in as openidm-admin, you should see that there is a request to be approved on the dashboard:
And if you select Details, you can see the request itself and the additional information we put into it, as well as our approval drop down:

However as before do not click Complete just yet, as we need to actually make this do something.

Final Workflow Logic

Back to the workflow editor; let’s take a look at the exclusive gateway we added a bit earlier:
So what we need now is some logic, based on the approval result to send the workflow to the right place. Click on the flow to the Approved task:
Take a look at the Main config, and specifically the Condition field:
Enter the following:
${approvalResult=='approved'}
Do something similar for the Rejected flow:
${approvalResult=='rejected'}
You will notice these two conditions are based on the enum we defined earlier; depending on the selection made by the approver, the flow will go to either the Approved or Rejected task.
Now let’s finish the workflow. Select the Approved task, then Main config and Script, and enter the following:
java.util.logging.Logger logger = java.util.logging.Logger.getLogger("")
logger.info("SimpleWorkflow - Approved")

java.text.SimpleDateFormat formatUTC = new java.text.SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.S'Z'");     
formatUTC.setTimeZone(TimeZone.getTimeZone("UTC"));
requestDate = formatUTC.format(new Date());
               
def requesterNotification = [
   "receiverId":  userId,
   "requesterId" : "",
   "requester" : "",
   "createDate" : requestDate,
   "notificationType" : "info",
   "notificationSubtype" : "",
   "message" : "The access request was accepted"
];
                
openidm.create("repo/ui/notification/", null, requesterNotification)
This bit of script simply uses the IDM notification engine to let the user know that their request has been approved.
Save your work for the last time and copy the workflow into IDM.

Back to IDM for the Last Time

So now, we should have a complete workflow.
Login to IDM as jbloggs and make a request.
Login to IDM as openidm-admin to approve the request.
Finally, log back in as jbloggs and you should see a notification the request has been approved.
And that’s it: a very basic workflow, but hopefully you can begin to see what is possible. In future blogs I’ll look at assigning a role based on the approval and also enabling a request drop-down dynamically.

This blog post was first published @ http://identity-implementation.blogspot.no/, included here with permission from the author.

We’re hiring

ForgeRock still has open positions in many locations. One of those is for a writer based in our Vancouver, WA office (across the river from Portland, OR).

Be forewarned. This is challenging, hands-on technical writing and reverse engineering work. If you are looking for a gig where you can remain in your comfort zone, skip this one.

On the other hand, if you feel that most things worth doing are tough in the beginning, take a closer look and apply for the job.

This blog post was first published @ marginnotes2.wordpress.com, included here with permission.

Why Tim Berners-Lee Is Right On Privacy

Last week, the “father” of the Internet, Tim Berners-Lee, did a series of interviews to mark 28 years since he submitted his original proposal for the worldwide web.

The interviews were focused on the phenomenal success of the web, along with a macabre warning describing 3 key areas we need to change in order to “save” the Internet as we know it.

The three points were:

  1. We’ve lost control of our personal data
  2. It’s too easy for misinformation to spread on the web
  3. Political advertising online needs transparency and understanding

I want to primarily discuss the first point – personal data, privacy and our lack of control.

As nearly every private, non-profit and public sector organisation on the planet either has a digital presence or is in the process of transforming itself into a digital force, the transfer of personal data to service providers is growing at an unprecedented rate.

Every time we register for a service – be it for an insurance quote, to submit a tax return, when we download an app on our smart phones, register at the local leisure centre, join a new dentist or buy a fitness wearable – we are sharing an ever growing list of personal information or providing access to our own personal data.

The terms and conditions associated with such registration flows are often so full of “legalese”, or the app permissions or “scope” so large and complex, that the end user literally has no control or choice over the type, quality and duration of the information they share.  It is generally an “all or nothing” type of data exchange.  Provide the details the service provider is asking for, or don’t sign up to the service. There are no alternatives.

This throws up several important questions surrounding data privacy, ownership and control.

  1. What is the data being used for?
  2. Who has access to the data, including 3rd parties?
  3. Can I revoke access to the data?
  4. How long will the service provider have access to the data?
  5. Can the end user amend the data?
  6. Can the end user remove the data from the service provider – aka right to erasure?

Many service providers are likely unable to provide an identity framework that can answer those sorts of questions.

The interesting news is that there are alternatives and things are likely to change pretty soon.  The EU General Data Protection Regulation (GDPR) provides a regulatory framework around how organisations should collect and manage personal data.  The wide ranging regulation covers things like how consent from the end user is managed and captured, how breach notifications are handled and how information pertaining to the reasons for data capture is explained to the end user.

The GDPR isn’t a choice either – it’s mandatory for any organisation (regardless of its location) that handles data of European Union citizens.

Coupled with that, new technology standards such as User Managed Access (UMA), developed by a working group run by the Kantara Initiative, which look to give end users more control over, and consent to, data exchanges, will open doors for organisations who want to deliver personalised services, but do so in a more privacy preserving and user friendly way.

So, whilst the Internet certainly has some major flaws, and data protection and user privacy is a big one currently, there are some green shoots of recovery from an end user perspective.  It will be interesting to see what the Internet will look like another 28 years from now.

This blog post was first published @ www.infosecprofessional.com, included here with permission.
