2020: Machine Learning, Post Quantum Crypto & Zero Trust

Welcome to a digital identity project in 2020! You'll be expected to have a plan for post-quantum cryptography. Your network will be littered with "zero trust" buzzwords that will make you suspect everyone, everything and every transaction. Add to that, "machines" will be learning everything, from how you like your coffee through to every network, authentication and authorisation decision. OK, are you ready?

Machine Learning

I'm not going to do an entire blog on machine learning (ML) and artificial intelligence (AI). Firstly, I'm not qualified enough on the topic, and secondly, I want to focus on the security implications. Needless to say, within 3 years most organisations will have relatively experienced teams handling big data capture from an identity, access management and network perspective.

That data will be fed into ML platforms, either on-premise or via cloud services. Leveraging either supervised or unsupervised learning, data from events such as logins (authentication) for end users and devices, as well as authorization decisions, can be analysed not only to increase assurance and security, but also to improve user experience. How? Well, if the output from ML can be used to update existing signatures (a bit legacy, but still) whilst simultaneously identifying the less risky logins, end user journeys can be made less intrusive.

Step one is finding the correct data sources to feed into the ML "model". What data is available, especially within the sign-up, sign-in and authorization flows? Clearly, general auditing will capture events such as successful sign-ins and any metadata associated with them – time, location, IP, device data, behavioural biometrics and so on. Having vast amounts of this data available is the starting point, which in turn can be used to "feed" the ML engine. Other data points would be needed too. What resources, applications and API calls are being made to complete certain business processes? Can patterns be identified and tied to "typical" behaviour and to user and device communities? Being able to identify and track critical data and the services that process that data would be a first step, before extracting task-based data samples to help separate trusted from untrusted activities.
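
To make that first step concrete, here is a minimal sketch of feeding flattened login events into an unsupervised anomaly detector – assuming scikit-learn is available, with entirely hypothetical feature names standing in for whatever your audit pipeline actually captures:

```python
# A minimal sketch: scoring login events for risk with an unsupervised model.
# Feature names (hour, new_device, ip_changed, km_from_last_login) are
# hypothetical stand-ins for whatever your audit pipeline actually records.
from sklearn.ensemble import IsolationForest

# Each row is one authentication event, already flattened into numeric features.
login_events = [
    # hour, new_device, ip_changed, km_from_last_login
    [9,  0, 0,    2.0],
    [10, 0, 0,    0.5],
    [14, 0, 1,   15.0],
    [3,  1, 1, 8500.0],   # 3am, unknown device, new IP, other side of the world
]

model = IsolationForest(contamination=0.1, random_state=42)
model.fit(login_events)

# Lower scores = more anomalous; these could feed a step-up auth decision.
for event, score in zip(login_events, model.decision_function(login_events)):
    print(event, round(score, 3))
```

Events scoring low could be routed to step-up authentication, whilst high-scoring (typical) logins get the frictionless journey.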

 

Post Quantum Crypto

Quantum computing is coming. Which is great. Even if it isn't quite ready in 2020, you need to be ready for it. But, and there's always a but, the main concern is that the super-power of quantum will blow away the ability of existing encryption and hashing algorithms to remain secure. Why? Well, quantum computing ushers in the paradigm of "qubits" – bits that can exist in a superposition of the classic binary 1 and 0. Ultimately, that means the "solutioneering" of complex problems can be completed in a much more efficient and non-sequential way.

The quantum boxes can basically solve certain problems faster, and the mathematics behind cryptography is one of those problems. A basic estimate for the future effectiveness of something like AES-256 drops to 128 bits of security. Scary stuff. Commonly used approaches today for key exchange rely on protocols such as Diffie-Hellman (DH) or Elliptic Curve Diffie-Hellman (ECDH). Digital signatures and encryption are then handled by the likes of Rivest-Shamir-Adleman (RSA) and the Elliptic Curve Digital Signature Algorithm (ECDSA).
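
The arithmetic behind that AES estimate is Grover's algorithm, which searches an unstructured space of 2^n keys in roughly the square root of that many quantum operations – effectively halving the bit strength of symmetric keys:

```latex
% Grover's quadratic speedup for brute-forcing an n-bit key:
% classical search O(2^n)  ->  quantum search O(2^{n/2})
O\bigl(\sqrt{2^{256}}\bigr) = O\bigl(2^{128}\bigr)
```

Doubling symmetric key lengths is therefore a plausible mitigation; the asymmetric algorithms above fare far worse.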

In the post-quantum (PQ) world those asymmetric algorithms are basically broken: Shor's algorithm solves the integer factorisation and discrete logarithm problems they rely on in polynomial time. Clearly, the material impact on your organisation or services will largely depend on an impact assessment. There's no point putting a $100 lock on a $20 bike. But everyone wants encryption, right? All that data flying around is likely to need even more protection, both in transit and at rest.

Some of the potentially "safe" PQ algorithms include XMSS and SPHINCS for hash-based signatures – the former going through IETF standardization. Ring Learning With Errors (RLWE) is a lattice-based approach to public key cryptography that changes the mathematical structure of the keys; it is still under research, but no weaknesses have yet been found. NTRU is another algorithm for the PQ world, using a hefty 12,881-bit key. NTRU is also already standardized by the IEEE, which helps with the maturity aspect.
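
To give a feel for the hash-based family that XMSS and SPHINCS belong to, here is a toy Lamport one-time signature in Python – the simplest ancestor of those schemes, for illustration only (a real scheme adds Merkle trees, and a key pair must never sign more than one message):

```python
# Toy Lamport one-time signature: the primitive that hash-based schemes
# like XMSS and SPHINCS build on. Its security rests only on the hash
# function, which is why the family is considered quantum-resistant.
import hashlib
import os

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(message: bytes, sk):
    digest = int.from_bytes(H(message), "big")
    # Reveal one secret per bit of the message digest.
    return [sk[i][(digest >> i) & 1] for i in range(256)]

def verify(message: bytes, sig, pk) -> bool:
    digest = int.from_bytes(H(message), "big")
    return all(H(sig[i]) == pk[i][(digest >> i) & 1] for i in range(256))

sk, pk = keygen()
sig = sign(b"hello pq world", sk)
print(verify(b"hello pq world", sig, pk))   # True
print(verify(b"tampered", sig, pk))         # False
```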

But how to decide? There is a helpful body called the PQCRYPTO Consortium that is providing guidance based on current research. Clearly you're not going to build your own alternatives, but information assurance and crypto specialists within your organisation will need to start data impact assessments, in order to understand where cryptography is currently used for transport, identification and data-at-rest protection, and so where the future exposures lie.

Zero Trust Identities

"Zero Trust" (ZT) networking has been around for a while. The concept of organisations having a "safe" internal network versus the untrusted, "hostile" public network, separated by a firewall, is long gone. Organisations are perimeter-less.

Assume every device, identity and transaction is hostile until proven otherwise. ZT for identity, especially, will look to bind not only a physical identity to a digital representation (session ID, token, JWT), but also that representation to a vehicle – a mobile, tablet or other device. In turn, every transaction that tuple takes part in is then verified – checked for contextual or behavioural changes that could indicate malicious intent. That introduces a lot of complexity to transaction, data and application protection.

Every transaction potentially requires introspection or validation. Add to this mix an increased number of devices and data flows, and you pave the way for distributed authorization, coupled with continuous session validation.

How will that look? Well, we're starting to see the use of things like stateless JSON Web Tokens (JWTs) as a means for hyper-scale assertion issuance, along with the binding of tokens to sessions and devices. Couple that with fine-grained authentication processes that use 20+ signals of data to identify a user or thing, and we're starting to see the foundations of ZT identity infrastructures. Microservice or hyper-mesh application infrastructures are going to need rapid introspection and re-validation on every call, so the likes of distributed authorization look likely.
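
As a sketch of what token-to-device binding could look like at validation time – assuming the PyJWT library and an RFC 7800-style "cnf" (confirmation) claim carrying a device key thumbprint; the claim layout and endpoint names are illustrative, not a prescribed pattern:

```python
# Sketch: verifying a stateless JWT *and* its binding to a device.
# Assumes PyJWT (pip install pyjwt[crypto]); the "cnf"/"jkt" layout follows
# RFC 7800 proof-of-possession semantics, but treat the details as illustrative.
import jwt  # PyJWT

# Load your issuer's real public key here; the PEM body is elided.
ISSUER_PUBLIC_KEY = "-----BEGIN PUBLIC KEY-----\n...\n-----END PUBLIC KEY-----"

def validate_bound_token(token: str, presented_device_thumbprint: str) -> dict:
    # 1. Standard stateless validation: signature, expiry, issuer, audience.
    claims = jwt.decode(
        token,
        ISSUER_PUBLIC_KEY,
        algorithms=["ES256"],
        audience="https://api.example.com",
        issuer="https://idp.example.com",
    )
    # 2. Zero-trust extra: the token must be bound to *this* device.
    bound_thumbprint = claims.get("cnf", {}).get("jkt")
    if bound_thumbprint != presented_device_thumbprint:
        raise PermissionError("Token is not bound to the presenting device")
    return claims
```

A stolen token then fails step 2 on any other device, even though its signature and expiry are still perfectly valid.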

So the future is now. As always. We know that secure identity and access management functions have never been more needed, popular or advanced in their 20-year history. The next 3-5 years will be critical in defining a backbone of security services that can nimbly be applied to users, devices, data and the billions of transactions that will result.

This blog post was first published @ www.infosecprofessional.com, included here with permission.

Gartner Identity Summit London 2015 – Review

This week saw the Gartner Identity and Access Management Summit come to London town. The event brings together the great and good of the identity community, with a range of vendors, consultancies and identity customers all looking to analyse the current marketplace and understand the challenges and hot topics that can be applied in 2015 and beyond.

Hitting the Right Notes

The main external keynote came from the highly talented classical musician Miha Pogacnik. Miha delivered an inspirational 60-minute talk, translating the components of classical music into the realm of business transformation. He focused on organisational change and the many angles of repetition, aggression, questioning and responding that occur, and the new challenges they place on organisations – all whilst playing a piece of Bach on his violin! Fantastic.



Consumers Have Identities Too

From a strategic identity perspective, there were several presentations on the developing need for consumer identity management. Many organisations are embracing digital transformation in both the private and public sector, defining use cases and requirements for things like consumer registration, authentication and multi-factor authentication, all done within a highly scalable yet simple identity management framework.

Traditional identity management platforms, products and delivery approaches are often focused on small-scale, repeatable use cases centred on employees and workflow, and don't require the scale or rapid time-to-delivery that consumer-facing projects need.

Gartner's Lori Robinson went through the differences between customer and employee identity management: how features such as consumer registration map neatly to core provisioning and synchronization use cases, whilst authentication is being extended to include things like adaptive risk, device fingerprinting and one-time passwords, to help improve security when high-value consumer transactions take place, such as address changes.
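
A rough sketch of that kind of adaptive, risk-based step-up logic is below – the signals, weights and thresholds are hypothetical illustrations, not anything Gartner prescribed:

```python
# Sketch of adaptive, risk-based step-up authentication for a consumer
# transaction. Signal names and thresholds are hypothetical illustrations.
def transaction_risk(known_device: bool, ip_country_matches: bool,
                     high_value_change: bool) -> int:
    score = 0
    if not known_device:        # device fingerprint not seen before
        score += 40
    if not ip_country_matches:  # IP geolocation differs from the profile
        score += 30
    if high_value_change:       # e.g. an address change
        score += 30
    return score

def required_authentication(score: int) -> str:
    if score >= 60:
        return "password + one-time password"   # step-up
    if score >= 30:
        return "password"
    return "existing session"                    # frictionless

score = transaction_risk(known_device=False, ip_country_matches=True,
                         high_value_change=True)
print(required_authentication(score))  # password + one-time password
```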

The Identity of Things Headache

Another emerging area that not only Gartner but many consultants and customers were talking about was that of applying identity patterns to devices and things. Whilst the initial hype has been around consumer-focused things – fitness trackers, fridge monitors and so on – there is a great and developing need for identity and access patterns in the manufacturing, utilities, SCADA and energy sectors. Many devices are low-powered and have limited cryptographic processing capabilities, but still require registration and linking use cases to be fulfilled, as well as having the data they generate protected.

The linking, relationship building and data privacy concerns of the newly emerging internet of things landscape require heavy doses of identity and access management medicine to make them sustainable.

Newer emerging standards such as OpenID Connect and User-Managed Access were the main focus of the coffee chatter, particularly how they can provide federated authorization capabilities to both people- and things-based infrastructures.


Overall it was a well-attended and thought-provoking summit, with both traditional and emerging vendors sponsoring, and some great event party antics. It seems the identity management space is going from strength to strength, even after being around for over 15 years. The new challenges of devices, consumers, cloud and mobile are helping to drive innovation in both the vendor and delivery space.

By Simon Moffatt

IoT World Forum Review: Interop, Data & Security

This week saw the two-day Internet of Things World Forum conference take place in London. There is clearly a general consensus that the IoT market is a multi-trillion dollar opportunity, through the implementation of items such as consumer wearables, embedded predictive-failure components and data-collecting sensors.



The rapid rise in connected devices and IoT ecosystems is seemingly being driven by several key factors, including the falling cost of both connectivity and data storage. These lowering barriers to entry, coupled with more developer-friendly ecosystems and open platforms, are helping to fulfil new revenue-generating business opportunities in multiple verticals, including manufacturing and healthcare.

Matt Hatton from Machina Research started off by discussing the progression from local standalone projects (Intranets of Things) through to more internal, enterprise-focused deployments (Subnets of Things). David Keene from Google extended this further, saying the progression will reach the concept of a Web of Things, where accessibility and 'findability' will be key to managing and accessing data.

It was clear that data aggregation and analytics will be a major component in any successful IoT infrastructure, whether focused on consumer enhancements, such as the Jaguar connected car project described by Leon Hurst, or on smart healthcare, in the form of Fitbits or more advanced medical instrumentation.

APIs and machine processing were certainly referenced more than once. The new, more connected web will provide interaction touch points that only machines can understand, coupled with better data aggregation, distributed data storage and centralised querying. APIs of course need protection too, either via gateways or via token management integration for standards such as OAuth2.
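
As a sketch of the gateway pattern, here is what checking an inbound OAuth2 access token via RFC 7662 token introspection might look like in Python – the endpoint URL and client credentials are placeholders for whatever your authorization server exposes:

```python
# Sketch: an API gateway checking an inbound OAuth2 access token via
# RFC 7662 token introspection. Endpoint URL and credentials are
# placeholders for whatever your authorization server actually provides.
import requests

INTROSPECTION_URL = "https://auth.example.com/oauth2/introspect"

def token_is_valid(access_token: str) -> bool:
    resp = requests.post(
        INTROSPECTION_URL,
        data={"token": access_token},
        auth=("gateway-client-id", "gateway-client-secret"),  # client auth
        timeout=5,
    )
    resp.raise_for_status()
    body = resp.json()
    # RFC 7662: "active" is the one required field in the response.
    return body.get("active", False)
```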

One aspect that was conspicuous in its absence was data privacy, and identity and access management. The IoT landscape is creating vast amounts of data at stream-like speeds. The journey from little data (small devices in isolation) to big data (aggregated in cloud services) requires strong levels of authentication and authorization at the device, service and end-user levels. The ability to share data, and to know transparently where it is being accessed, will be a key concern in the consumer and healthcare spaces.

Dave Wagstaff from BSquare brought up the interesting point that many organisations are now subtly moving away from a product-based business model to a software-and-services-based approach. With the increased capability of devices, organisations can now perform much more in the way of remote monitoring, predictive failure detection and so on, where the end user really is just paying an insurance or subscription fee for their physical thing.

Bernd Heinrichs from Cisco followed a similar pattern, describing the German view of Industry 4.0 (or 4.1...), where innovative production concepts are helping to reduce energy consumption, increase uptime and generate better component output.

From a new market opportunity perspective, Francois Menuier from Morgan Stanley observed that 6% of all consumers now own a wearable, with 59% of them using that wearable daily. In addition, many wearable owners said this was an additional purchase rather than one replacing existing technology, solidifying the view that new market opportunities are available in the IoT world. However, many consumer wearables generate huge amounts of deeply personal data that needs to be protected and shared securely.

Jon Carter from Deutsche Telekom went through the seven steps for a successful IoT implementation, which ended with the two main points of applying a minimum-viable-product concept to design and leveraging a secure and open platform.

Dr Shane Rooney from the GSMA focused his thoughts on security within the mobile network operator's network, including device-to-device and device-to-service authentication, as well as the need for a greater focus on data privacy.

Overall, an interesting couple of days. Whilst most manufacturers and platforms are focused on interoperability and data management, identity and access management has a strong and critical role in allowing third-party data sharing and interactions to take place. It will be interesting to see whether 2015 and 2016 start to introduce these concepts by default.





Protection & The Internet of Things

The 'Internet of Things' is one of the technical heatwaves that has genuinely got me excited over the last 24 months or so. I've been playing with computers since I was 8 and like to think of myself as being pretty tech-savvy. I can code in a number of languages, understand different architectural approaches easily and pick up new technical trends naturally. However, the concept of the truly connected world, with 'things' interconnected and graphed together, is truly mind-blowing. The exciting thing for me is that I don't see the outcome. I don't see the natural technical conclusion of devices and objects being linked to a single unique identity, where information can flow in multiple directions, originating from different sources and being made available in contextual bundles. There is no limit.



There'll Be No 'Connected', Just 'On'

Today we talk about connectivity, wi-fi hotspots and 4G network coverage: the powerful difference between being online and offline. As soon as you're offline, you're invisible – lost, unable to get the information you need or to interact with your personal and professional networks. This concept is slowly dying. The 'Internet' is no longer a separate object that we connect with explicitly. Very soon, the internet will be so intrinsically tied to us that, without it, basic human interactions and decision making will become stunted. That is why I refer to objects just being 'on' – or maybe just 'being', but that is a little too sci-fi for me. Switching an object on, purchasing it, enabling it or checking in to it will make that device become 'smart' and tied to us. It will have an IP address and be able to communicate, send messages, register, interact and contain specific contextual information. A simple example is the many running shoe companies that now provide GPS, tracking and training support information for a new running shoe. That information is specific to an individual, centrally correlated and controlled, and then shared socially to allow better route planning and training techniques to be created and exchanged.


Protection, Identity & Context

But what about protection? What sort of protection? Why does this stuff need protecting in the first place? And from what? The more we tie individual devices to our own unique identity, the more information, services and objects we can consume, purchase and share. Retailers see the benefit in being able to provide additional services and contextual information to a customer, as it makes them stickier to their brand. The consumer and potential customer receives a more unique service, requiring less explicit searching and decision making. Everything becomes personalised, which results in the faster acquisition of services and products.

However, that information exchange requires protection. Unique identities need to be created – either for the physical person, or for the devices being interacted with. These identities will also need owners, custodians and access policies that govern the who, what and when of interactions. The running shoe example may seem unimportant, but apply that logic to your fridge – it seems great to be able to manage and monitor the contents of your refrigerator, and automatic ordering and so on seems like a dream. But how might that affect your health insurance policy? What about when you go on holiday and don't order any food for 3 weeks? Ideal fodder for a burglar. The more we connect to our own digital persona, the more those interactions need authentication, authorization and identity management.

Context plays an important part here too. Objects – like people in our own social graphs – have many touch points and information flows. A car is a simple example. It will have a manufacturer (interested in safety, performance and so on), a retailer (interested in usage and ownership years), an owner (perhaps interested in servicing and crash history) and then other parties such as governments and the police – not to mention potential future owners and insurance companies. The context an interacting party comes from will obviously determine what information they can consume and contribute to. That will also need managing from an authorization perspective.
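
A minimal sketch of what that context-driven authorization could look like for the car example is below – the parties, attributes and policy table are hypothetical illustrations of the who/what/when policies described above:

```python
# Sketch: context-driven authorization for a shared 'thing' such as a car.
# The parties, attributes and policy table are hypothetical illustrations.
CAR_POLICY = {
    "manufacturer": {"read": {"safety_telemetry", "performance"},
                     "write": set()},
    "retailer":     {"read": {"usage", "ownership_years"},
                     "write": set()},
    "owner":        {"read": {"service_history", "crash_history", "usage"},
                     "write": {"service_history"}},
    "insurer":      {"read": {"crash_history"},
                     "write": set()},
}

def is_authorized(party_context: str, action: str, attribute: str) -> bool:
    # Unknown parties or actions default to deny.
    policy = CAR_POLICY.get(party_context, {})
    return attribute in policy.get(action, set())

print(is_authorized("owner", "read", "crash_history"))     # True
print(is_authorized("retailer", "read", "crash_history"))  # False
```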


Whilst the 'Internet of Things' may seem like buzz, it has a profound impact on how we interact with physical, previously inanimate objects. As soon as we digitize and contextualize them, we can reap significant benefits when it comes to implicit information searching and tailor-made services. But for that to work effectively, a correct balance with identity and access control needs to be found.

By Simon Moffatt

Image courtesy of http://www.sxc.hu/photo/472281

The Road To Identity Relationship Management

The Problems With Identity & Access Management

I am never a fan of being the bearer of dramatic bad news – "this industry is dead!", "that standard is dead!", "why are you doing it that way, that is so 2001!". Processes, industries and technologies appear, evolve and sometimes disappear at their own natural pace. If a particular problem and its numerous solutions are under discussion, it probably means that at some point those solutions seemed viable. Hindsight is a wonderful thing. With respect to identity and access management, I have seen the area evolve quite rapidly in the last 10 years, pretty much the same way as the database market, the antivirus market, the business intelligence market, the GRC market and so on. They have all changed. Whether for the better or worse is open for discussion, but in my opinion that is an irrelevant discussion, as that is the market which exists today. You either respond to it, or remove yourself from it.



Like most middleware-based sectors, identity and access management has become a complex, highly optimized monster. Tools on top of tools, to help you get the most out of the tools you purchased long ago that sit at the bottom of the stack. Projects are long and complex. Milestones blurred. Stakeholders come from different spectrums of the organisation, with differing goals and drivers. Vendors have consolidated and glued together complex suites of legacy solutions, built on different frameworks and with different goals in mind. The end result? A confused customer and a raft of splinter point products that claim to offer speed and cost improvements over existing 'legacy' solutions.


The Modern Enterprise

I blogged recently about the so-called 'modern' enterprise, and how it has evolved to include facets from the mobile, social and outsourced worlds. Organisations have faced tremendous issues with profitability since 2008, with shrinking markets, lower revenues and more stringent internal cost savings. All of which have placed pressure on identifying new and more effective revenue streams, either by developing new products faster, or by extracting more revenue from existing customers – leveraging company brand and building better, more online-focused relationships. All of these avenues of change rely heavily on identity management: from allowing things like online client registration to occur rapidly and seamlessly, right through to allowing new approaches such as mobile and cloud to be integrated into a single revenue-focused platform.

The long and winding identity road - image taken by Simon Moffatt, New South Wales, AU. 2011
Gone are the days when identity management was simply focused on managing employee access to the corporate directory and email server. Organisations are now borderless, with a continually connected workforce. That workforce is not simply focused on employees either. The modern enterprise workforce will contain contractors, freelancers and even consumers themselves. Bloggers, reviewers, supporters, promoters, content sharers and affiliates, whilst not on the company payroll, help drive revenue through messaging and interaction. If a platform exists where their identity can be harnessed, a new, more agile go-to-market approach can be developed.


Scale, Agility and Engagement

But what does this all mean practically? New widgets, more sprockets and full steam ahead on the agitator! Well, not quite. It does require a new approach – not a revolution, but evolution. Modernity at all levels seems to mean big. Big data. Big pipes. Big data centres. Scale is a fundamental component of modern identity, and scale can come in many different flavours. Numbers, yes. Can you accommodate a million client registrations? But what about the processes, flows and user interfaces that will be needed to manage such scale? Modularity is key here. A rigid, prescribed system will result in a rigid and prescribed service. Flexibility and a loosely decoupled approach will allow system and user interface integration in a much more reusable way. Languages, frameworks and standards are now much less about vendor sponsorship and much more about usability and longevity. Modern identity is really about improving engagement, not just at the individual level, but also at the object and device level. Improved engagement will result in better relationships and ultimately more informed decision making.

Ultimately, economics is fundamentally based on clear, fully informed decision making, and if a modern enterprise can develop a service that fully informs and engages its client base, new revenue opportunities will swiftly follow.




