Introduction to ForgeRock DevOps – Part 2 – Building Docker Containers

We have just launched Version 5 of the ForgeRock Identity Platform with numerous enhancements for DevOps friendliness. I have been meaning to jump into the world of DevOps for some time so the new release afforded a great opportunity to do just that.

Catch up with previous entries in the series:
http://identity-implementation.blogspot.co.uk/2017/04/introduction-to-forgerock-devops-part-1.html

I will be using IBM Bluemix here as I have recent experience of it, but nearly all of the concepts will be similar for any other cloud environment.

Building Docker Containers

In this blog we are going to build the Docker containers that will hold the ForgeRock platform components, tag them, and upload them to the Bluemix registry.

Prerequisites

Install all of the below (a quick sanity check of the installs follows the list):

Docker: https://www.docker.com
Used to build, tag and upload docker containers.
Bluemix CLI: http://clis.ng.bluemix.net/ui/home.html
Used to deploy and configure the Bluemix environment.
CloudFoundry CLI: https://github.com/cloudfoundry/cli
Bluemix dependency.
Kubectl: https://kubernetes.io/docs/tasks/tools/install-kubectl/
Used to deploy and manage Kubernetes clusters.
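
Once everything is installed, you can quickly confirm the tools are on your path (exact version output will vary):

docker --version
bx --version
cf --version
kubectl version --client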

Initial Configuration

1. Log in to the Bluemix CLI using your Bluemix account credentials:

bx login -a https://api.ng.bluemix.net

Note we are using the US instance of Bluemix here as it has support for Kubernetes in beta.

When prompted to select an account, just type 1. Once you have logged in successfully you can interact with the Bluemix environment just as you might if you were logged in via a browser.

2. Add the Bluemix Docker components:

bx plugin repo-add Bluemix https://plugins.ng.bluemix.net
bx plugin install container-service -r Bluemix
bx plugin install IBM-Containers -r Bluemix

Check they have installed:

bx plugin list

3. Clone (or download) the ForgeRock Docker Repo to somewhere local:

https://stash.forgerock.org/projects/DOCKER/repos/docker/browse
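
If you have Git installed you can clone it straight into the directory used in the rest of this post. Note the clone URL below is an assumption based on the usual Stash/Bitbucket convention; check the repository page if it does not work:

git clone https://stash.forgerock.org/scm/docker/docker.git /usr/local/DevOps/stash/docker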

4. Download the ForgeRock AM and DS component binaries from backstage:

https://backstage.forgerock.com/downloads

5. Unzip and copy ForgeRock binaries into the Docker build directories:

AM:

unzip AM-5.0.0.zip
cp openam/AM-5.0.0.war /usr/local/DevOps/stash/docker/openam/

DJ:

mv DS-5.0.0.zip /usr/local/DevOps/stash/docker/opendj/opendj.zip

Amster:

mv Amster-5.0.0.zip /usr/local/DevOps/stash/docker/amster/amster.zip

For those unfamiliar, Amster is our new RESTful configuration tool for AM in the 5 platform, replacing SSOADM with a far more DevOps-friendly tool. I'll be covering it in a future blog.

Build Containers

We are going to create three containers: AM, DJ & Amster:

1. Build and Tag the OpenAM container (don't forget the trailing dot):

cd /usr/local/DevOps/stash/docker/openam
docker build -t wayneblacklockfr/openam .

Note: wayneblacklockfr/openam is just a name to tag the container with locally. Replace it with whatever you like, but keep the /openam.

All being well, the build will run through each step and finish with a successfully built image ID. Congratulations, you have built your first ForgeRock container!

Now we need to get the namespace for tagging. This is usually your username, but check using:

bx ic namespace-get

Now let's tag it ready for upload to Bluemix. Use the container ID output at the end of the build process and your namespace:

docker tag d7e1700cfadd registry.ng.bluemix.net/wayneblacklock/openam:14.0.0
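
Alternatively, you can tag by image name rather than container ID; assuming the local tag used above, this is equivalent:

docker tag wayneblacklockfr/openam registry.ng.bluemix.net/wayneblacklock/openam:14.0.0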

Repeat the process for Amster and DS.

2. Build and Tag Amster container:

cd /usr/local/DevOps/stash/docker/amster
docker build -t wayneblacklockfr/amster .
docker tag 54bf5bd46bf1 registry.ng.bluemix.net/wayneblacklock/amster:14.0.0

3. Build and Tag DS container:

cd /usr/local/DevOps/stash/docker/opendj
docker build -t wayneblacklockfr/opendj .
docker tag 19b8a6f4af73 registry.ng.bluemix.net/wayneblacklock/opendj:4.0.0

4. View the containers:

You can take a look at what we have built with:

docker images

Push Containers

Finally we want to push our containers up to the Bluemix registry.

1. Log in again:

bx login -a https://api.ng.bluemix.net

2. Initiate the Bluemix container service; this may take a moment:

bx ic init

Ignore Option 1 & Option 2, we are not doing either.

3. Push your Docker images up to Bluemix:

docker push registry.ng.bluemix.net/wayneblacklock/openam:14.0.0

docker push registry.ng.bluemix.net/wayneblacklock/amster:14.0.0

docker push registry.ng.bluemix.net/wayneblacklock/opendj:4.0.0

4. Confirm your images have been uploaded:

bx ic images

If you log in to the Bluemix web app you should be able to see your containers in the catalog.

Next Time

We will take a look at actually deploying a Kubernetes cluster and everything we have to do to ready our containers for deployment.

This blog post was first published @ http://identity-implementation.blogspot.no/, included here with permission from the author.

Introduction to ForgeRock DevOps – Part 1

We have just launched Version 5 of the ForgeRock Identity Platform with numerous enhancements for DevOps friendliness. I have been meaning to jump into the world of DevOps for some time so the new release afforded a great opportunity to do just that.

As always with this blog I am going to step through a fully worked example. In this case I am using IBM Bluemix, however it could just as easily have been AWS, Azure, GKE or any service that supports Kubernetes. By the end of this blog you will have a containerised instance of ForgeRock Access Management and Directory Services running on Bluemix, deployed using Kubernetes. First off we will cover the basics.

DevOps Basics

There are many tutorials out there introducing DevOps that do a great job, so I am not going to repeat those here. Instead I will point you towards the excellent ForgeRock Platform 5 DevOps guide, which takes you through DevOps deployment step by step into Minikube or GKE:

https://backstage.forgerock.com/docs/platform/5/devops-guide

What I want to do briefly is touch on some of the key ideas that really helped me to understand DevOps. I do not claim to be an expert but I think I am beginning to piece it all together:

12 Factor Applications: Best practices for developing applications, superbly summarised at https://12factor.net. This is why we need containers and DevOps.

Docker: Technology for building, deploying and managing containers.

Containers: A minimal operating system plus the components necessary to host an application. Traditionally we host apps in virtual machines with full-blown operating systems, whereas containers cut all of that down to just what you need for the application you are going to run.

In Docker, containers are built from Dockerfiles, which are effectively recipes for building containers from different components, e.g. a recipe for a container running Tomcat.
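
To make that concrete, here is a minimal sketch of what such a Dockerfile might look like; it is purely illustrative and not the actual ForgeRock Dockerfile:

FROM tomcat:8.5
# Drop the AM war into Tomcat's webapps directory so it deploys as /openam
COPY AM-5.0.0.war /usr/local/tomcat/webapps/openam.war
EXPOSE 8080
CMD ["catalina.sh", "run"]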

Container Registry: A place where built containers can be uploaded, managed, downloaded and deployed from. You could have a registry running locally; cloud environments will also typically have registries from which they retrieve containers at deployment time.

Kubernetes: An engine for orchestrating the deployment of containers. Because containers are very minimal, they need extra elements provisioned, such as volume storage, secrets storage and configuration. In addition, when you deploy any application you need load balancing and numerous other considerations. Kubernetes is a language for defining all of these requirements and an engine for implementing them.

In cloud environments that support Kubernetes, such as AWS, Azure and IBM Bluemix, this effectively means that Kubernetes will manage the configuration of the cloud infrastructure for you, abstracting away all of the usual environment-specific configuration.

Storage is a good example: in Kubernetes you can define persistent volume claims, which are effectively a way of asking for storage. You do not need to be concerned with the specifics of how that storage is provisioned; Kubernetes will take care of it regardless of whether you deploy onto AWS, Azure or IBM Bluemix.
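
For example, a minimal persistent volume claim looks something like the below (the name and size are illustrative):

cat <<EOF | kubectl apply -f -
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: example-claim
spec:
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 1Gi
EOF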

This enables automated and simplified deployment of your application to any environment that supports Kubernetes! If you want to move from one environment to another, just point your script at that environment! Better still, Kubernetes gives you a consistent deployment management and monitoring dashboard across all of these environments!

Helm: An engine for scripting Kubernetes deployments and operations. The ForgeRock platform uses this for DevOps deployment. It enables scripting of Kubernetes functionality and configuration of things like environment variables that may change between deployments.
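
As an illustrative sketch (the chart name and value below are hypothetical), a Helm deployment then becomes a one-liner that you can parameterise per environment:

helm install ./openam-chart --set domain=example.com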

The above serves as a very brief introduction to the world of DevOps and helps to set the scene for our deployment.

If you want to follow along with this guide, please get yourself a paid IBM Bluemix account. Alternatively, if you want to use GKE or Minikube (for local deployment), take a look at the superb ForgeRock DevOps Guide. I will likely cover Azure and AWS deployment in later blogs; however, everything we talk about here will still be relevant for those and other cloud environments, as that is, after all, the whole point of Kubernetes!

In Part 2 we will get started by installing some prerequisites and building our first docker containers.

This blog post was first published @ http://identity-implementation.blogspot.no/, included here with permission from the author.

Making Rest Calls from IDM Workflow

I attended the Starling Bank Hackathon this weekend and had a great time. I will shortly be writing a longer blog post to talk all about it, but before that I briefly wanted to share a little bit of code that might be really helpful to anyone building IDM workflows.

The External Rest Endpoint

ForgeRock Identity Management (previously OpenIDM) has a REST API that effectively allows you to invoke external REST services hosted anywhere. You might use this, for example, to call out to an identity verification service as part of a registration workflow; I made good use of it at the hackathon.

With the following piece of code you can create some JSON and call out to a REST service outside of ForgeRock Identity Management:

java.util.logging.Logger logger = java.util.logging.Logger.getLogger("")
logger.info("Make REST call")

// Build the JSON payload for the payment request
def slurper = new groovy.json.JsonSlurper()
def result = slurper.parseText('{"destinationAccountUid": "a41dd561-d64c-4a13-8f86-532584b8edc4","payment": {"amount": 12.34,"currency": "GBP"},"reference": "text"}')

// Call the external service via IDM's external/rest endpoint
result = openidm.action('external/rest', 'call',
    ['body': (new groovy.json.JsonBuilder(result)).toString(), 'method': 'POST',
     'url': 'https://api-sandbox.starlingbank.com/api/v1/payments/local', 'contentType': 'application/json',
     'authenticate': ['type': 'bearer', 'token': 'lRq08rfL4vzy2GyoqkJmeKzjwaeRfSKfWbuAi9NFNFZZ27eSjhqRNplBwR2do3iF'],
     'forceWrap': true])
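
If you want to experiment with the endpoint outside of a workflow, you can also invoke it directly over IDM's REST interface. A minimal sketch, assuming a local IDM instance with the default admin credentials (the target URL is just an example):

curl --header "X-OpenIDM-Username: openidm-admin" \
     --header "X-OpenIDM-Password: openidm-admin" \
     --header "Content-Type: application/json" \
     --request POST \
     --data '{"url": "https://www.forgerock.com", "method": "GET"}' \
     "http://localhost:8080/openidm/external/rest?_action=call"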

A really small bit of code but with it you can do all sorts of awesome things!

This blog post was first published @ http://identity-implementation.blogspot.no/, included here with permission from the author.

Have I Been Pwned Authentication Module

Bit busy at the moment, but I wanted to write a quick blog on something rather cool. I am a big fan of Troy Hunt's https://haveibeenpwned.com/. Troy performs a much-needed public service by collating the results of the many, many data breaches that keep happening and making them searchable.
So if you want to check whether your account has been involved in a breach, you enter your email and can see all the data breaches involving it.

The site also sends notifications when your email is detected in a breach. Allowing you to change your passwords and ensure your accounts are secure.

I thought it would be quite cool to write an HIBP module for OpenAM. My colleague Jon Knight was kind enough to wrap this up and put together a bit of UI for it.

So what this does is enable an optional authentication step: after you log in, we check to see if your email is in the HIBP database and, if it is, we can warn you that you may want to change your password. The module, along with instructions to configure it, can be found here:

https://github.com/jeknight/pwnedAuthModule
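
Under the hood, a module like this essentially just queries the HIBP API for the user's email address. At the time of writing something like the following worked, though treat it as a sketch; the API has since been versioned and rate limits apply:

curl "https://haveibeenpwned.com/api/v2/breachedaccount/user@example.com"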

This is just a proof of concept. It may be that, rather than warn the user, you instead attach a higher level of risk to interactions with them, perhaps enforcing the use of 2FA. This can all be achieved easily with OpenAM.

This blog post was first published @ http://identity-implementation.blogspot.no/, included here with permission from the author.

Introducing IDM Workflow

ForgeRock Identity Management includes an OOTB workflow engine based on BPMN (Business Process Model & Notation). This isn't unique; most identity management solutions have some form of workflow engine. However, in my experience they are typically based on proprietary technology and/or very painful to work with.

I have recently had to build some workflows for various customer Proof of Concepts and I am really impressed by how quickly you can pull something together so I wanted to write up a blog entry.

So in this blog we are going to use a brand new instance of IDM (installed locally) and create a simple request and approval workflow which we will then test.

I am going to use Eclipse for this; there are other BPMN editors. I am also not going to spend much time talking about BPMN beyond what we need to build a meaningful workflow; much more information is available on the Activiti site: https://www.activiti.org/. In the spirit of this blog I am just going to get on with it and walk you through the basic steps to build and test a simple workflow.

Additionally, the workflow samples that ship with IDM are a brilliant place to start. I highly recommend taking a look at them and using them as the basis of your workflows until you get comfortable building them yourself.

Getting Started

I am going to assume you have an installation of IDM already; if not, check out my IDM beginners series.

IDM ships with a built in version of the Activiti workflow engine: https://www.activiti.org/. We are going to use the free Eclipse Activiti Workflow Designer to build our workflows.

Firstly, download and install the Eclipse IDE.

When you have Eclipse installed, fire it up and navigate to Help -> Install New Software:

Click Add:

Enter the following location: https://www.activiti.org/designer/update/ and press OK.

Wait for the installation process to complete. Now that is all out of the way, let's get started!

Create a New Project

Navigate to File -> New Project:
Then select General -> Project (we do not need an Activiti Project):
Give your project a name:

And press Finish.

Building a New Workflow

Right click on the new project, select New File:
Give it a name similar to the following, e.g. SimpleWorkflow.bpmn20.xml; .bpmn20.xml is the convention we use for workflow files in IDM.

Finally, right click on our new workflow, select Open With -> Other… and select Activiti Diagram Editor.
Ok, you should now have a blank canvas. Let's get started: the first thing we need is a Start Event. Drag one over from the menu on the right and drop it somewhere on the workflow canvas.
Now, we need some workflow steps. As we are building a simple request and approval workflow so we probably need:
  • A step to actually create a request for something (that is actually our StartEvent we just created).
  • A step to gather some information and determine who the request needs to go to for approval.
  • A step for the actual approval.
  • A step for processing the result. Typically you also want to send an email containing the response. In fact, we probably need two steps here, one for success and one for failure.
We will build the workflow steps and connect them together first, before implementing the actual logic. Select a Script Task from the menu on the right and drag it on to our canvas.
You should now have something like this:
We probably want to give our Script Task a name: click on it and just replace the Name value with something appropriate, e.g. Process Request.
We also need to make sure that all Script Tasks have script defined so that IDM can parse them successfully. The easiest way to do this is to add a simple logging statement to the task. Select the Process Request task, then the Main config tab, and add some simple script:
Now you can use either JavaScript or Groovy for scripting. I tend to use Groovy, but that is just personal choice.
java.util.logging.Logger logger = java.util.logging.Logger.getLogger("")
logger.info("SimpleWorkflow - Process Request")
Make sure you save your work.
Now select the Start Event, you should see the following menu:
Click on the Create Connection arrow, but keep the mouse button held down and drag a connection over to our Process Request task. We now have a flow linking the two tasks in sequence:
Next we need a User Task, similar to before select User Task from the menu on the right, drag it into the canvas, give it a name and connect it to the Process Request task.
Ok, now things get a little more complicated, as a request could be either approved or rejected. So we need a Gateway, specifically an Exclusive Gateway.
We also need two new Script Tasks; build the workflow as below:
Remember to add some simple script to Main config for each new task, otherwise parsing of the workflow will fail.
Finally we need an EndEvent:
Put this to the right of the Approved and Rejected tasks, and connect them to it as below:
We now have a basic workflow outline, time to make it actually do something.

Workflow Logic

StartEvent

We are going to start with our StartEvent. What this actually translates to is the form that a user will complete in self service to make their request.
Click on the StartEvent and select the Form tab. Click New, fill in the resulting form property dialog to add a request attribute, then press OK.
We should now have an attribute on our form. Let's add another one: justification is a common field when making a request, so add it in exactly the same way.
One more thing before we test this in IDM. Click somewhere on the canvas until you can edit the process Name.
Change My process to something meaningful like Request with Justification. Make sure you save the workflow.

Testing the Workflow in IDM

Although our workflow isn’t really doing anything yet, this is a good time to quickly test what it looks like in IDM.
Fire up IDM and navigate to the openidm directory, then create a new workflow directory. Now copy the SimpleWorkflow.bpmn20.xml file into it, as sketched below; you should see IDM pick the workflow up in the logs. In fact you will probably see a warning, which we will ignore for now.
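
A minimal sketch of that deployment step, assuming IDM is unzipped at /path/to/openidm and the workflow file is in your Eclipse workspace (adjust both paths to suit):

cd /path/to/openidm
mkdir workflow
cp ~/workspace/SimpleWorkflow/SimpleWorkflow.bpmn20.xml workflow/
# Watch IDM pick the workflow up
tail -f logs/openidm0.log.0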
Log in to IDM as a user other than openidm-admin, someone you have created previously; I'm using a user called jbloggs. Remember to log in to user self service: http://localhost.localdomain.com:8080/#login/
You should see the dashboard, and our new process!
Click Details and you can see the form we created earlier! You can enter a request and justification, but do not hit Start, because right now nothing will happen.

More Workflow Logic

Ok, so we can make a request now, but it doesn't go to anyone. Let's fix that; there are a few things to do here. Firstly, we need to gather some more information about the requestor for the approver. Select the StartEvent, then Main config, and set Initiator to "initiatorId".
Next, select the Process Request task, enter the following script (as groovy) into Main config, then save your work.
java.util.logging.Logger logger = java.util.logging.Logger.getLogger("")
logger.info("SimpleWorkflow - Process Request " + initiatorId);
// find user
readStartUserFromRepoParams = [_queryId:'for-userName',uid:initiatorId]
qresults = openidm.query('managed/user', readStartUserFromRepoParams)
// get user details
users = qresults.result
execution.setVariable("userId", users[0]._id)
execution.setVariable("userName", users[0].userName)
execution.setVariable("givenName", users[0].givenName)
execution.setVariable("sn", users[0].sn)
execution.setVariable("mail", users[0].mail)
// set approver
execution.setVariable("approverId", "openidm-admin")
All we are doing here is retrieving the user data for the initiating user and setting it as variables in the workflow process. We are also assigning the user who will approve the request (simply as a static id here, but you can easily make this dynamic, for example assigning the task to a group or to a manager; I may cover this in a later blog).
A few more steps. We need to set the assignee for the approval task: select the Approve Request task and enter ${approverId} (the variable we set in the script above) as the assignee. We also need to configure the approval form to show the data we just collected, and to add the request fields the user just filled in.
Finally, we need to add an approvalResult to the form. This field is a little different, as it is an enum; because it is an enumeration we need to add some form values for the user to choose from. Press New, configure a Form Value with the id approved, and press OK. Do the same for rejected, and you should end up with the two possible approval results.
Save your work.

Back to IDM

Again, copy the workflow file into IDM. Now log out, log back in as a user, select the workflow and populate the request, then press Start. All being well you should see confirmation that the workflow process has been started. Now log out and log back in as openidm-admin; you should see that there is a request to be approved on the dashboard:
And if you select Details, you can see the request itself and the additional information we put into it, as well as our approval drop down:

However as before do not click Complete just yet, as we need to actually make this do something.

Final Workflow Logic

Back to the workflow editor. Let's take a look at the exclusive gateway we added a bit earlier: what we need now is some logic, based on the approval result, to send the workflow to the right place. Click on the flow to the Approved task, take a look at the Main config, specifically the Condition field, and enter the following:
${approvalResult=='approved'}
Do something similar for the Rejected flow:
${approvalResult=='rejected'}
You will notice these two conditions are based on the enum we defined earlier; depending on the selection made by the approver, the flow will go to either the Approved or Rejected task.
Now let's finish the workflow. Select the Approved task, then Main config and Script, and enter the following:
java.util.logging.Logger logger = java.util.logging.Logger.getLogger("")
logger.info("SimpleWorkflow - Approved")

java.text.SimpleDateFormat formatUTC = new java.text.SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.S'Z'");     
formatUTC.setTimeZone(TimeZone.getTimeZone("UTC"));
requestDate = formatUTC.format(new Date());
               
def requesterNotification = [
   "receiverId":  userId,
   "requesterId" : "",
   "requester" : "",
   "createDate" : requestDate,
   "notificationType" : "info",
   "notificationSubtype" : "",
   "message" : "The access request was accepted"
];
                
openidm.create("repo/ui/notification/", null, requesterNotification)
This bit of script simply uses the IDM notification engine to let the user know that their request has been approved. Remember to add an equivalent script, with an appropriate message, to the Rejected task.
Save your work for the last time and copy the workflow into IDM.

Back to IDM for the Last Time

So now, we should have a complete workflow.
Login to IDM as jbloggs and make a request.
Login to IDM as openidm-admin to approve the request.
Finally, log back in as jbloggs and you should see a notification the request has been approved.
And that's it, a very basic workflow, but hopefully you can begin to see what is possible. In future blogs I'll look at actually assigning a role based on the approval and also enabling a request drop down dynamically.

This blog post was first published @ http://identity-implementation.blogspot.no/, included here with permission from the author.

Installing OpenIDM with a MySQL Database

I mentioned some time ago in the OpenIDM beginner series that although the built in OrientDB is more than sufficient for development purposes as soon as you start getting serious you will want to install a proper database backend.

This is a quick blog to talk you through how to install OpenIDM on a database. In my case I am going to use MySQL (MariaDB specifically); the steps for installing other databases are broadly similar.

Prerequisites

For this exercise please make sure you have downloaded OpenIDM (4.5 here) and have a MySQL or MariaDB server installed and running locally.

Steps

Installation

1. Create a directory for the environment, e.g. /usr/local/env/demo
2. Unzip OpenIDM:
unzip openidm-4.5.0.zip

Open up openidm/conf/repo.orientdb.json and take a quick look. What we have here is a set of queries for the OOTB embedded OrientDB. We are going to replace this with a compatible SQL-based version.
3. Run the following script to configure the OpenIDM schema:

mysql -u root -p < /usr/local/env/demo/openidm/db/mysql/scripts/openidm.sql

Here I run it against MySQL as the root user; you will be asked for the root password. This script will create an openidm schema.

4. Next we need to run some additional scripts.


mysql -D openidm -u root -p < /usr/local/env/demo/openidm/db/mysql/scripts/activiti.mysql55.create.engine.sql
mysql -D openidm -u root -p < /usr/local/env/demo/openidm/db/mysql/scripts/activiti.mysql55.create.history.sql
mysql -D openidm -u root -p < /usr/local/env/demo/openidm/db/mysql/scripts/activiti.mysql.create.identity.sql

These scripts set up the Activiti BPMN workflow engine in the openidm database instance we just created.
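
You can quickly confirm the Activiti tables were created (the ACT_ prefix is the Activiti naming convention):

mysql -D openidm -u root -p -e "SHOW TABLES LIKE 'ACT_%';"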


5. Now let's remove the original repo.orientdb.json file. Don't forget to do this step! I'd also suggest making a backup:

cd /usr/local/env/demo/openidm/conf
mkdir backup
mv repo.orientdb.json backup


6. Next we need to copy over the database-specific configuration files:

cd /usr/local/env/demo/openidm
cp db/mysql/conf/repo.jdbc.json conf
cp db/mysql/conf/datasource.jdbc-default.json conf

Again, let's take a quick look at these files, starting with repo.jdbc.json. You'll notice that what we have is an SQL version of the repo file we looked at earlier, ready to execute queries against a MySQL-like database. There is also one other notable difference: a useDataSource property, which the original repo file did not have. This line specifies the datasource to be used, i.e. our MySQL database, and it refers to the second file we copied over, datasource.jdbc-default.json.



7. Ensure that the useDataSource value (default) matches the datasource specification file, datasource.jdbc-default.json.


8. Configure the datasource.jdbc-default.json file for your MySQL database, specifically the jdbcUrl, databaseName, username and password.
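
For reference, the file you are editing looks something like the below; the values shown are purely illustrative, so substitute your own:

cat conf/datasource.jdbc-default.json
# Expect something along these lines:
# {
#     "driverClass" : "com.mysql.jdbc.Driver",
#     "jdbcUrl" : "jdbc:mysql://localhost:3306/openidm?characterEncoding=utf8",
#     "databaseName" : "openidm",
#     "username" : "openidm",
#     "password" : "openidm",
#     ...
# }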


9. Update workflow.json to use the same datasource, i.e. default.


To be very clear, if you named your datasource specification datasource.jdbc-example.json then your useDataSource value should be example in the files above.


11. That’s it. You should be able to start openidm up normally:


./startup.sh
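
Once it is up, a quick way to verify IDM is running against the new repo is the ping endpoint, assuming the default admin credentials and port:

curl --header "X-OpenIDM-Username: openidm-admin" \
     --header "X-OpenIDM-Password: openidm-admin" \
     "http://localhost:8080/openidm/info/ping"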