Gartner Security Summit – IoT Review

This week saw the Gartner Security and Risk Management Summit held in London. A well-attended and respected event, it brought together the great and the good of the infosec world, providing attendees with vendor and analyst views on governance, malware, identity and firewall-related security topics.

The area that caught my attention, though, was the set of sessions on Internet of Things (IoT) related security.

The IoT world is fast becoming the catch-all bucket for any small device that connects to the internet but isn't a smartphone. There are some incredibly smart innovations taking place in this space, from consumer and health monitoring through to operational technology, smart grid and utility monitoring solutions. Tiny, fit-for-purpose devices perform a small, repeatable task, such as gathering data and sending it to a central hub or broker service. They often have very limited hardware capacity, minimal (if any) operating systems and very rarely ship with security out of the box.
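
For a flavour of that pattern, such a device often does nothing more than publish a reading to a message broker. A one-line sketch using the mosquitto_pub MQTT client (the broker host and topic names here are hypothetical):

$ mosquitto_pub -h broker.example.com -t sensors/device-0001/temperature -m "21.5"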

The main focus today is generally for IoT vendors to promote interoperability: great demos and showcases focusing on integration or data transfer under low-power or low-capacity constraints.

Topics such as device registration, claiming and association, data encryption and data sharing rarely get mentioned or focused upon.

Gartner's Earl Perkins introduced an intriguingly titled session called "Herding Cats and Securing the Internet of Things". Earl touched upon the need for a tiered approach to IoT security, covering infrastructure, identity and data. Whilst the devices themselves are often associated with data capture and replay, it's often the data owners - real people - who could be exposed in a data breach disaster.

Following Earl, Trent Henry discussed how public key infrastructure (PKI), the once expensive and seemingly legacy encryption approach, is having a new lease of life in the machine-to-machine (M2M) landscape, where username and password authentication is of limited use. It seems logical that things like asymmetric keys (perhaps minted at manufacture time) and certificate distribution could become the de facto standard in the M2M game.
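
To illustrate the "minted at manufacture time" idea, here is a minimal sketch of a manufacturer issuing a per-device key pair and certificate with the standard OpenSSL command-line tools (the CA file names, device name and validity period are hypothetical):

$ openssl ecparam -name prime256v1 -genkey -noout -out device-0001.key
$ openssl req -new -key device-0001.key -subj "/CN=device-0001" -out device-0001.csr
$ openssl x509 -req -in device-0001.csr -CA manufacturer-ca.crt -CAkey manufacturer-ca.key -CAcreateserial -days 3650 -out device-0001.crt

The device then ships with its key and certificate already in place, and can authenticate to M2M services over mutual TLS without ever handling a username or password.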

The increased popularity of things like NFC (near field communication) has opened the scope for smartphone payment technology through the implementation of secure elements within the phone's hardware. Such secure elements are likely to be seen within other, non-phone devices that have a requirement to store credentials, certificates and keys.

One of the major issues with the IoT landscape is basic identity management: how devices register with a service or authoritative source, and how the corresponding data owners are able to authorize and share data with trusted third parties. Whilst the devices themselves could be simple, the data captured is often of high value, and simple yet robust trust and privacy models need to be implemented.

Many of the newer authorization standards, such as OAuth2, OpenID Connect and User-Managed Access (UMA), may have a significant role to play here.
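
For illustration only, a standard OAuth2 client credentials request against a hypothetical authorization server looks something like this, with the resulting access token then presented on subsequent API calls:

$ curl -X POST https://as.example.com/oauth2/token \
    -d grant_type=client_credentials \
    -d client_id=device-0001 \
    -d client_secret=changeit \
    -d scope=telemetry

IoT-specific profiles would likely layer device registration and data-owner consent (the UMA piece) on top of this basic exchange.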

By Simon Moffatt


Noindex, nofollow for draft docs

When your web search turns up hits for ForgeRock documentation, the in-progress draft docs often top the list. Although this is handy for those of us working with nightly builds from trunk, it's not so great if you're working with a stable release, one that does not yet include the newest features and the latest compatibility changes. The finished docs have been more thoroughly reviewed and polished than the in-progress drafts, and aim to be technically accurate and complete with respect to the software as delivered.

With the move to the 2.1.4 release of the documentation tools, we have started publishing the in-progress, draft project documentation with headers and meta tags that encourage robots not to index those versions. The hope is that, over time, readers will find it easier to find the published docs at http://docs.forgerock.org/ that match the release they use.
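
For illustration, the standard ways of discouraging indexing are an HTML meta tag in the page itself and the equivalent HTTP response header (the exact markup on the draft pages may differ slightly):

<meta name="robots" content="noindex, nofollow">

X-Robots-Tag: noindex, nofollow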

If you are working with nightly builds and trying out the latest features, know that we are still keeping the draft docs as current as possible. Please do continue using the draft docs. Feel free to log issues. Feel free to link to them from blog posts about new features and so forth. When answering questions from someone using a stable release, however, do send them to the published docs for the release at http://docs.forgerock.org/.


ForgeRock OpenIG 3.0 – OIDC authentication example


My colleague, Simon Moffatt, has written a nice introductory article on some of the new features in OpenIG 3.0.

OpenIG is a Java-based reverse proxy server with a focus on solving identity management challenges. The release adds support for scripting in Groovy and JavaScript, along with new authentication and authorization filters for OpenID Connect and OAuth 2.0.

I like to describe OpenIG as the Swiss Army knife of identity proxy servers. It can perform arbitrary transformations on HTTP requests and broker them to a number of backend services.


If you want a "ready to run" sample OpenIG project that demonstrates the new OpenID Connect filter, have a look at example1 in https://github.com/wstrange/openig_examples


Hopefully the README.md clearly explains how this all works, but if not, drop me a note and I will improve the documentation.


If you have any OpenIG samples that you would like to share, please feel free to send a pull request.


How to determine NSS/NSPR versions on Linux

Reverse engineering is quite an important skill to have when working with OpenAM, and this is even more the case for the web policy agents. Determining the versions of the NSS and NSPR libraries may prove important when trying to build the agents, so here is a trick I've used in the past to determine the versions of the bundled libraries.

To determine the version for NSPR, create nspr.c with the following content:

#include <dlfcn.h>
#include <stdio.h>

int main() {
        /* Load the bundled NSPR library; adjust the path to match your agent installation */
        void* lib = dlopen("/opt/web_agents/apache24_agent/lib/libnspr4.so", RTLD_NOW);
        if (lib == NULL) {
                fprintf(stderr, "dlopen failed: %s\n", dlerror());
                return 1;
        }
        /* PR_GetVersion returns the NSPR version as a string */
        const char* (*func)(void) = (const char* (*)(void)) dlsym(lib, "PR_GetVersion");
        printf("%s\n", func());

        dlclose(lib);
        return 0;
}

Compile it using the following command (of course, make sure the path to the library is actually correct), then run the resulting binary:

$ gcc nspr.c -ldl
$ ./a.out
4.10.6

Here is the equivalent source for NSS, saved under nss.c:

#include <dlfcn.h>
#include <stdio.h>

int main() {
        /* Load the bundled NSS library; adjust the path to match your agent installation */
        void* lib = dlopen("/opt/web_agents/apache24_agent/lib/libnss3.so", RTLD_NOW);
        if (lib == NULL) {
                fprintf(stderr, "dlopen failed: %s\n", dlerror());
                return 1;
        }
        /* NSS_GetVersion returns the NSS version as a string */
        const char* (*func)(void) = (const char* (*)(void)) dlsym(lib, "NSS_GetVersion");
        printf("%s\n", func());

        dlclose(lib);
        return 0;
}

And an example output would look like:

$ gcc nss.c -ldl
$ ./a.out
3.16.3 Basic ECC

To determine the correct symbol names I’ve been using the following command:

$ nm -D libns{s,p}*.so | grep -i version
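
Since the two programs differ only in the library path and the symbol name, a small generic variant saves editing the source each time. This is just a sketch; the file name version.c and the minimal argument handling are my own:

#include <dlfcn.h>
#include <stdio.h>

/* Usage: ./a.out <library path> <version symbol> */
int main(int argc, char** argv) {
        if (argc != 3) {
                fprintf(stderr, "usage: %s <library> <symbol>\n", argv[0]);
                return 1;
        }
        void* lib = dlopen(argv[1], RTLD_NOW);
        if (lib == NULL) {
                fprintf(stderr, "dlopen failed: %s\n", dlerror());
                return 1;
        }
        /* Look up the requested version function by name and call it */
        const char* (*func)(void) = (const char* (*)(void)) dlsym(lib, argv[2]);
        if (func == NULL) {
                fprintf(stderr, "dlsym failed: %s\n", dlerror());
                dlclose(lib);
                return 1;
        }
        printf("%s\n", func());
        dlclose(lib);
        return 0;
}

$ gcc version.c -ldl
$ ./a.out /opt/web_agents/apache24_agent/lib/libnss3.so NSS_GetVersion
3.16.3 Basic ECC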

NOTE: While this code works nicely on Linux, on Windows and Solaris you will probably need a few adjustments, and there are potentially other, better ways to get information about the libraries.