ForgeRock OpenIDM: Setting Up SSL With MySQL Internal Repository

This blog post was first published @, included here with permission.

If you’ve already seen the video demonstration on setting up ForgeRock OpenIDM to use a JDBC repository, you may now be interested in securing the traffic from ForgeRock OpenIDM to its JDBC repository. So in the video that follows, you will see:

– Setting up SSL in the MySQL database

– Configuring OpenIDM to use SSL to the MySQL database (its internal repository)
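For reference, the server-side setup boils down to pointing MySQL at a CA certificate and server key pair, then adding SSL parameters to the JDBC URL in OpenIDM’s repository configuration. A minimal sketch — the file paths, database name and connection properties below are illustrative examples, not taken from the video:

```shell
# my.cnf ([mysqld] section): point MySQL at the CA cert and server key pair
#   ssl-ca=/etc/mysql/certs/ca.pem
#   ssl-cert=/etc/mysql/certs/server-cert.pem
#   ssl-key=/etc/mysql/certs/server-key.pem

# After restarting MySQL, confirm that SSL support is active
mysql -u root -p -e "SHOW VARIABLES LIKE 'have_ssl';"

# In OpenIDM's JDBC repository configuration, append the Connector/J
# SSL properties to the JDBC URL, along these lines:
#   jdbc:mysql://localhost:3306/openidm?useSSL=true&verifyServerCertificate=true
```

The exact configuration file and property names depend on the OpenIDM and Connector/J versions in use, so treat this as a starting point and check the product documentation linked below.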

Like several other videos that I’ve already published on this blog space around ForgeRock products, this one also makes use of Linux containers.

Hope you’ll find the video log useful:

MySQL Product Documentation
ForgeRock Documentation

Setting Up ForgeRock OpenAM with HTTPS on Tomcat

This blog post was first published @, included here with permission.

This post is a demo version of the ForgeRock documentation on Setting Up OpenAM with HTTPS on Tomcat. I had earlier published a screen-cast on ForgeRock OpenAM deployment and configuration here. Below you’ll find the steps that I ran in my Ubuntu Linux Container to secure our OpenAM deployment:

– Create a certificate & store it in a keystore in a Linux Container
– Modify the Tomcat Server Configuration file (server.xml) to enable SSL (on port 8443)
– Deploy ForgeRock OpenAM
– Access OpenAM from the host OS and complete the configuration
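Condensed to commands, the first two steps look roughly like this; the keystore path, alias, hostname and passwords are illustrative placeholders, not values from the screen-cast:

```shell
# 1. Create a self-signed certificate in a new keystore
keytool -genkeypair -alias openam -keyalg RSA -keysize 2048 \
  -dname "CN=openam.example.com" -validity 365 \
  -keystore /opt/tomcat/conf/keystore.jks -storepass changeit -keypass changeit

# 2. In Tomcat's conf/server.xml, enable an SSL connector on port 8443,
#    along these lines:
#    <Connector port="8443" protocol="HTTP/1.1" SSLEnabled="true"
#               scheme="https" secure="true" clientAuth="false" sslProtocol="TLS"
#               keystoreFile="conf/keystore.jks" keystorePass="changeit"/>
```

Note that the certificate CN should match the hostname you’ll use to reach OpenAM from the host OS, otherwise the browser (and OpenAM itself) will complain.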

If it’s hard for you to visualize how the infrastructure looks, here’s an illustration to make life easy.


Now on to the action:

If you are not able to view the embedded video, please click here

ForgeRock OpenDJ Replication – Enabling Encryption

This blog post was first published @, included here with permission.

This is a sequel to my earlier blog update on ForgeRock OpenDJ Replication and is largely inspired by a question raised in the ForgeRock Community Website. So if you are not very familiar with the steps involved in configuring OpenDJ Replication, I suggest you read/watch it before watching the embedded video below:

A one-liner about the infrastructure used: two Linux Containers, each running an instance of ForgeRock OpenDJ, are already replicating the OpenDJ data, but the replication traffic is not secured. In the video demonstration that follows, we’ll tighten the security a bit by encrypting the replication traffic, and monitor it using Wireshark running on the host OS. The diagram below shows the end state of our screen-cast:
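For reference, replication traffic encryption in OpenDJ is controlled through the Crypto Manager. A sketch of the relevant command — the host, port and credentials are placeholders for whatever your own instances use:

```shell
# Turn on encryption of replication traffic via the Crypto Manager
dsconfig set-crypto-manager-prop --set ssl-encryption:true \
  -h localhost -p 4444 -X -D "cn=Directory Manager" -w password -n
```

Run it against each replicated instance; the video demonstrates the same change and then confirms with Wireshark that the replication port no longer carries cleartext LDIF.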



[If you are not able to see the embedded video, please watch it on YouTube here ]


A new security issue hit the streets this week: the POODLE SSL bug. We immediately received a question on the OpenDJ mailing list about how to remediate the vulnerability.
While the vulnerability is mostly triggered by the client, it’s also possible to prevent attacks by disabling the use of SSLv3 altogether on the server side. Beware that disabling SSLv3 might break old legacy client applications.

OpenDJ uses the SSL implementation provided by Java and, by default, will allow any of the TLS protocols supported by the JVM. You can restrict the set of protocols for the Java VM installed on the system (on the Mac, using the Java Preferences Panel in Advanced Mode), or with system properties at startup; I will let you search through the official Java documentation for the details.
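As a sketch of the JVM-wide option — the property names below are the ones documented for Java 7/8, so check your JVM’s release notes before relying on them:

```shell
# Disable SSLv3 for the whole JVM by adding this line to
# $JAVA_HOME/jre/lib/security/java.security:
#   jdk.tls.disabledAlgorithms=SSLv3

# Or restrict the client-side protocols for a single process at startup:
java -Djdk.tls.client.protocols=TLSv1,TLSv1.1,TLSv1.2 -jar myapp.jar
```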

But you can also control the protocols used by OpenDJ itself. If you want to do so, you will need to change settings in several places:

  • the LDAPS Connection Handler, since this is the one dealing with LDAP over SSL/TLS;
  • the LDAP Connection Handler, if the StartTLS extended operation is used to negotiate SSL/TLS establishment on the LDAP connection;
  • the HTTP Connection Handler, if you have enabled it to activate the RESTful APIs;
  • the Crypto Manager, whose settings are used by replication and possibly the Pass Through Authentication Plugin;
  • the Administration Connector, which also uses LDAPS.

For example, to change the settings in the LDAPS Connection Handler, you would run the following command:

# dsconfig set-connection-handler-prop --handler-name "LDAPS Connection Handler" \
    --add ssl-protocol:TLSv1 --add ssl-protocol:TLSv1.1 --add ssl-protocol:TLSv1.2 \
    -h localhost -p 4444 -X -D "cn=Directory Manager" -w secret12 -n

Repeat for the LDAP Connection Handler and the HTTP Connection Handler.

For the crypto manager, use the following command:

# dsconfig set-crypto-manager-prop \
    --add ssl-protocol:TLSv1 --add ssl-protocol:TLSv1.1 --add ssl-protocol:TLSv1.2 \
    -h localhost -p 4444 -X -D "cn=Directory Manager" -w secret12 -n

And for the Administration Connector:

# dsconfig set-administration-connector-prop \
    --add ssl-protocol:TLSv1 --add ssl-protocol:TLSv1.1 --add ssl-protocol:TLSv1.2 \
    -h localhost -p 4444 -X -D "cn=Directory Manager" -w secret12 -n

All of these changes will take effect immediately, but they will only impact new connections established after the change.
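One way to verify the change from the outside is an openssl handshake test against the LDAPS port (636 here; adjust host and port to your deployment):

```shell
# After the change, an SSLv3 handshake should be rejected...
openssl s_client -connect localhost:636 -ssl3 </dev/null

# ...while a TLS handshake still succeeds
openssl s_client -connect localhost:636 -tls1_2 </dev/null
```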


Protect Data Not Devices?

"Protect Data Not Devices" seems quite an intriguing proposition, given the increased number of smartphones in circulation and the issues that Bring Your Own Device (BYOD) seems to be causing for heads of security up and down the land. But here is my thinking. The term 'devices' now covers a multitude of areas. Desktop PCs of course (do they still exist?!), laptops and netbooks, smartphones and not-so-smart phones are all the tools of the trade for accessing the services and data you own, or want to consume, either for work or for pleasure. The flip side of that is the servers, mainframes, SANs, NAS appliances and cloud-based infrastructures that store and process data. The consistent factor is obviously the data that is being stored and managed, either in-house or via outsourced services.

The Smarter the Device, the More Reliant We Become

This is a pretty obvious statement and doesn't just apply to phones. As washing machines became more efficient and dishwashers became cheaper and more energy-saving, we migrated in droves, allowing our time to be spent on other essential tasks. The same is true for data-accessing devices. As phones morphed into micro desktop PCs, we now rely on them for email, internet access, gaming, social media, photography and so on. Some people even use this thing called the telephone on them. Crazy. As the features and complexity ramp up, we no longer need another device for listening to music, taking pictures or accessing Facebook. Convenience and service provision increase, as does the single-point-of-failure syndrome and our reliance on them being available 99.999% of the time, up to date and online.

The Smarter the Device, the Less Important It Becomes

Now this next bit seems a bit of a paradox. As the devices become smarter, greater emphasis is placed on the data and services those devices access. For example, a fancy Facebook client is pretty useless if only 100 people use Facebook. A portable camera is just that, unless you have a social outlet through which to distribute the images. The smartness of the devices themselves is actually driven by the services and data they need to access. Smartphones today come with a healthy array of encryption features, remote backup, remote data syncing for things like contacts, pictures and music, as well as device-syncing software like Dropbox. How much data is actually specifically related to the device? In theory, none. Zip. Lose your phone and everything can be flashed back down in a few minutes, assuming it was set up correctly. Want to replace a specific model and brand with a model of equivalent specification from a different vendor? Yep, you can do that too, as long as you can cope with a different badge on the box. Feature differentiation is becoming smaller as the technology becomes more complex.

Data Access versus Data Storage

As more and more services become outsourced (or, to use the buzzword, are moved to the 'cloud'), the storage part becomes less of a worry for the consumer. The consumer could easily be an individual or an organisation. Backup, syncing, availability, encryption and access management all fall to the responsibility of the outsourced data custodian. Via astute terms and conditions and service-level agreements, the consumer shifts responsibility across to the data custodian and service provider.

The process of accessing that data then starts to fall partly on the consumer.  How devices connect to a network, how users authenticate to a device and so on, all fall to the device custodian.  Access traffic encryption will generally require a combination of efforts from both parties.  For example, the data custodian will manage SSL certificates on their side, whilst the consumer has a part to play too.

So, to slightly contradict my earlier point (!), this is where the device is really the egress point to the data access channel, and it therefore requires important security controls for access to the device. The device itself is still only really a channel to the data at the other end, but once an individual (or a piece of software, malicious or not) has access to a device, they can in turn potentially open access channels to outsourced data. It is device access that should be protected, not necessarily the tin itself.

As devices become smarter and service providers more complex, that egress point moves substantially away from the old private organisational LAN or equivalent.  The egress point is the device regardless of location on a fixed or flexible network.

Data will become the ultimate prize not necessarily the devices that are used to access it.

By Simon Moffatt