Infosecurity Europe 2013: Battling Cyber Crime Keynote

Cybercrime, whether for financial gain or hacktivist ends, is on the rise.  The US and UK governments have invested significant sums in the last 12 months in new defence measures and research centres.  The sci-fi talk of 'cyber war' is fast becoming reality, but what are the new attack vectors and what can be done to defend against them?

Changing Priorities, Changing Targets

Arnie Bates from Scotia Gas Networks described how freely available tools are now commonplace and can help a potential cyber attacker initiate distributed denial of service (DDoS) attacks simply and easily, without the complex development skills that would have been required only a few years ago.  This simplicity of attack initiation has led to 'simple' attacks having more sophisticated impact, as highlighted by Misha Glenny, writer and broadcaster, who pointed to the recent attack on the Associated Press' Twitter account.  The attack itself seemed simple, but the resulting impact on the NYSE was tangible.


Hacktivism -v- Financial Reward

DS Charlie McMurdie from the Met Police's cyber crime unit articulated the need to identify the true motive behind each cyber crime attack.  The majority of attacks being reported derive from a financial motive.  Whilst hacktivism is still an important protest tool, the greater complexity and rise in attacks is driven by monetary reward, either directly through theft or via indirect theft of identity credentials that in turn lead to a cash reward for a successful attacker.  From a government perspective, Adrian Price from the UK's MoD described how state-level espionage is still a major concern, as it has been for decades, but the attack vectors have simply moved online.  And whilst state-level attacks could ultimately lead to government involvement and ultimately war and loss of life, national defence related attacks still fall under the protest category if a government's political and foreign policy is openly objected to.

Defence Via Shared Intelligence

Whilst DS McMurdie described that there isn't a "single bullet to defend against" when it comes to cyber attacks, there equally isn't a silver bullet that will provide ultimate protection.  Private sector organisations still need to promote cyber awareness and education to generate a more cross-departmental approach to defence.  At the national and critical infrastructure level, shared intelligence initiatives will help provide a more adaptable and responsive defence mechanism.

By Simon Moffatt

Infosecurity Europe 2013: Embedding Security into the Business

A strong keynote panel discussed the best practices for embedding security into the business, and how the changing perceptions of information security are helping to place it as a key enabler to business growth.

Infosec Is The Oil Of The Car

Brian Brackenborough from Channel 4 best described information security as being "the oil in the car engine".  It's an integral part of the car's mobility, but shouldn't always be seen as the brakes, which can be construed by the business as restrictive and limiting.  James McKinlay, from Manchester Airports Group, added that information security needs to move away from being purely network and infrastructure focused and start to engage other business departments, such as HR, legal and other supply chain operators.

The panel agreed that information security needs to better engage all areas of the non-technical business landscape, in order to be fully effective.

Business Focused Language

Many information security decisions are made on risk management and how best to reduce risk whilst staying profitable and not endangering user experience.  A key area of focus is the use of a common, business focused language when describing risk, the benefits of its reduction and the controls involved in the implementation.  According to James, organisations need to "reduce the gap between the business and infosec teams' view of risk, and standardise on the risk management frameworks being used".

Education & Awareness

Geoff Harris from ISSA promoted better security awareness as a major security enabler.  He described how a basic 'stick' model of making offenders against basic infosec controls buy doughnuts for the team worked effectively when used to reduce things like unlocked laptops.  James also pointed to "targeted and adaptive education and training" as being of great importance.  Different departments have different goals, focuses and users, all of which require specific training when it comes to keeping information assets secure.

All in all, the panel agreed that better communication with regards to information security policy implementation, and better gathering of business feedback when it comes to information security policy creation, are essential.

By Simon Moffatt

Infosecurity Europe 2013: SCADA The Next Threat

Physical and industrial control systems are now all around us, in the form of smart grid electrical meters, traffic light control systems and even basic proximity door access control panels.  These basic computer systems can hold a vast array of sensitive data, with fully connected network access, central processing units and execution layers.  Many, however, lack the basic security management expected of such powerful systems.  Many 'don't get a quarter of the security governance an average corporate server gets', according to Greg Jones of Digital Assurance.

Characteristics and Rise In Use
Micro computers with closed control systems have been in use for a number of years in industrial environments, where they are used to collect processing data or execute measurement or timing instructions.  Their popularity in mainstream use has increased, with the likes of TV set-top boxes and games consoles following a similar design.  These more commercially focused devices, however, often have stronger security, due to their makers wanting to protect revenue streams, says Jones.

Lack of Security Management
Many of the control-type systems in use aren't manufactured or managed with security in mind.  Performance, durability and throughput are often of greater importance, with basic security controls such as secure storage, administrative lockdown and network connectivity all being potential weak points.

Protection Gaps
The main security focus of many smaller control devices is around physical protection.  Devices such as traffic light systems or metering boxes are generally well equipped to stave off vandalism and physical breaches, but much less so from a logical and access control perspective.

Data is often stored unencrypted, with limited validation being performed on any data collection and input channels.  This can open up issues with regards to data integrity, especially in the field of electrical meter reading.  This will certainly become of greater significance, as it is forecast that by 2020, 80% of European electricity supplier customers will be using a smart-style meter.
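
As a purely illustrative sketch of the kind of integrity control that's often missing, a reading could carry a message authentication code computed with a per-device key, letting the collection point detect tampering in transit.  Everything here (field names, key handling) is an assumption, not a description of any real metering protocol:

    import hmac
    import hashlib
    import json

    SHARED_KEY = b"per-device-secret"  # hypothetical key, provisioned at manufacture

    def sign_reading(meter_id, kwh, timestamp):
        # Canonicalise the reading, then attach an HMAC tag over it.
        payload = json.dumps({"meter_id": meter_id, "kwh": kwh, "ts": timestamp},
                             sort_keys=True)
        tag = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return {"payload": payload, "tag": tag}

    def verify_reading(message):
        # Recompute the tag and compare in constant time.
        expected = hmac.new(SHARED_KEY, message["payload"].encode(),
                            hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, message["tag"])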

By Simon Moffatt


Infosecurity Europe 2013: Analyst Panel Keynote: Future Risks

The end of day 1 of the Infosec Europe conference, on a wonderfully warm spring afternoon at Earls Court, saw the keynote theatre host an interesting panel discussion focusing on future risks.  Andrew Rose from Forrester, Wendy Nather from the 451 Research group and Bob Tarzey from Quocirca provided some interesting sound bites on what future threats may look like.

Hacktivism versus Financial Reward
All panelists acknowledged that hacktivism has been a major concern for the last few years, with Andrew pointing out that attacks are now becoming more damaging and malicious.  Bob produced a nice soundbite of "terrorists don't build guns they buy them", highlighting the fact that hacktivists can easily leverage available tools to perform sophisticated and complex attacks, without necessarily spending time and effort developing bespoke tools.  Wendy pointed out that attacks driven by financial reward have somewhat different attack patterns and targets, with new avenues such as mobile, smart grids and CCTV devices being identified as potential revenue streams for malicious operators.

Financial reward is still a major driver for many attacks, with new approaches likely to include mobile devices, to leverage potential salami style SMS attacks.  Intellectual Property theft is still a major obstacle at both a nation state and organisational level.

Extended Enterprises
Andrew commented on the increasing complexity many organisations now face from a structural perspective.  Increased outsourcing, supply chain distribution and 3rd party data exchanges make defensive planning difficult.  Bob also pointed out that the complexity of supply chain logistics means that smaller organisations, traditionally thought to be more immune to larger scale attacks, are now more likely to be breached, simply due to the impact a breach may have on their business partners.

Insider Threat and Privileged Account Management
Trusted employees can still be a major headache from a security perspective.  Non-intentional activity such as losing laptops, responding to malicious links and being the victim of spear-phishing attacks were all highlighted as being the result of poor security awareness or a lack of effective security policy.  Bob argued that privileged account management should be a high priority, with many external attacks utilising root, administrator and service accounts with their escalated permissions.

Data Chemistry and Context Aware Analysis
Whilst there is no 'silver bullet' to help protect against the known knowns and unknown unknowns, the use of security analytics can go some way to help detect and ultimately prevent future attacks.  Wendy used the term 'data chemistry' to emphasise the use of the right data and the right query to help provide greater detail and insight to traditional SIEM and log gathering technologies.  Bob promoted the use of greater profiling and context aware analysis of existing log and event data to further highlight exceptions and their relevance, especially from a network activity perspective.  Andrew also commented that information asset classification, whilst a well known approach to risk management, is still a key component in developing effective defence policies.

By Simon Moffatt

Infosecurity Europe 2013: Hall of Fame Shlomo Kramer & Mikko Hypponen

London, 23rd April 2013 - For the last 5 years, the medal of honour of the information security world has been presented to speakers of high renown with the ‘Hall of Fame’ at Infosecurity Europe. Voted for by fellow industry professionals, the recipients of this most prestigious honour stand at the vanguard of the technological age, and this year both Shlomo Kramer and Mikko Hypponen will be presented with the honour on Wednesday 24 April 2013, 10:00 - 11:00, in the Keynote Theatre at Infosecurity Europe, Earl’s Court, London.


Shlomo Kramer is the CEO and a founder of Imperva (NYSE:IMPV); prior to that, he co-founded Check Point Software Technologies Ltd. in 1993 (NASDAQ:CHKP). Kramer has participated as an early investor and board member in a number of security and enterprise software companies including Palo Alto Networks (NYSE:PANW), Trusteer, WatchDox, Lacoon Security, TopSpin Security, SkyFence, Worklight, Incapsula and SumoLogic.

Shlomo Kramer commented “I am delighted to have been chosen by Infosecurity for the “hall of fame” in 2013 – it’s a great honour to be recognised for the work that I have done in the IT security industry as a founder of companies such as Check Point and Imperva. I love nothing more than creating and fostering successful enterprise IT- focused businesses and will continue to put my energy into combating the ever increasing onslaught from the cyber-criminal world.”

Mikko Hypponen is the Chief Research Officer of F-Secure in Finland. He has been working with computer security for over 20 years and has fought the biggest virus outbreaks on the net.  He's also a columnist for the New York Times, Wired, CNN and BBC. His TED Talk on computer security has been seen by over a million people and has been translated into over 35 languages. Mr. Hypponen sits on the advisory boards of the ISF and the Lifeboat Foundation.

"I've worked in the industry for 22 years and haven't had a boring day yet. I'm honoured to be inducted to the hall of fame", commented Mikko Hypponen. "The enemy is changing all the time so we must keep up."

Previous speakers have included some of the world’s leading thinkers in information security including Professor Fred Piper, Professor Howard Schmidt, Bruce Schneier, Whitfield Diffie, Paul Dorey, Dan Kaminsky, Phil Zimmerman, Lord Erroll, Eugene Kaspersky, Charlie McMurdie, Stephen Bonner and Ed Gibson. To view all previous speakers, along with a short biography, you can visit the Infosecurity website:  http://www.infosec.co.uk/Education-Programme/fame/

The 2013 Hall of Fame will be conducted in the Keynote Theatre, where both Shlomo Kramer and Mikko Hypponen will join Professor Fred Piper in a panel chaired by Raj Samani from the CSA, addressing other industry professionals in what always proves to be a compelling and exhilarating event.
The speakers inducted into the Hall of Fame have met the following criteria:
  • Be an internationally recognised and respected Information Security practitioner or advocate 
  • Have made a clear and long-term contribution to the advancement of Information Security 
  • Have provided intellectual or practical input that has shifted the advancement of Information Security 
  • Be an engaging and revolutionary thought leader in Information Security 
The Hall of Fame has proven to be the highlight of previous shows and this year is no different. Setting the standard for other industry professionals and defining contemporary issues, the Hall of Fame speakers aim to challenge conventional thought with a mix of pragmatism and provocation. It really is the must see event of the year.

Microsoft Security Intelligence Report Volume 14

Yesterday, Microsoft released volume 14 of its Security Intelligence Report (SIRv14), which included new threat intelligence from over a billion systems worldwide.  The report focused on the 3rd and 4th quarters of 2012.
One of the most interesting threat trends to surface in the enterprise environment was the decline in network worms and the rise of web-based attacks.  The report found:

  • The proportion of Conficker and Autorun threats reported by enterprise computers each decreased by 37% from 2011 to 2H12.
  • In the second half of 2012, 7 out of the top 10 threats affecting enterprises were associated with malicious or compromised websites.
  • Enterprises were more likely to encounter the iFrame redirection technique than any other malware family tracked in 4Q12.
  • One specific iFrame redirection family, called IframeRef, increased fivefold in the fourth quarter of 2012 to become the number one malicious technique encountered by enterprises worldwide.
  • IframeRef was detected nearly 3.3 million times in the fourth quarter of 2012.

The report also takes a close look at the dangers of not using up-to-date antivirus software in an article titled “Measuring the Benefits of Real-time Security Software.”  New research showed that, on average, computers without AV protection were five and a half times more likely to be infected.

The study also found that 2.5 out of 10, or an estimated 270 million computers worldwide, were not protected by up-to-date antivirus software.

Whilst many of the findings surrounding real-time protection seem pretty obvious, the numbers are pretty startling.  As security is often best implemented using a defence-in-depth, or rings, approach, anti-virus and real time malware detection seem to be taking a back seat.  For mobile devices, or devices based on Linux, this can become a significant issue, especially if those devices carry email destined for Microsoft based machines.

By Simon Moffatt

Who Has Access -v- Who Has Accessed

The certification and attestation part of identity management is clearly focused on the 'who has access to what?' question.   But access review compliance is really identifying failings further upstream in the identity management architecture.  Reviewing previously created users, or previously created authorization policies, and finding excessive permissions or misaligned policies shows failings with the access decommissioning process or the business-to-authorization mapping process.

The Basic Pillars of Identity & Access Management


  • Compliance By Design
The creation and removal of account data from target systems falls under a provisioning component.  This layer is generally focused on connectivity infrastructure to directories and databases, either using agents or native protocol connectors.  The tasks, for want of a better word, are driven either by static rules or business logic, generally encompassing approval workflows.  The actual details and structure of what needs to be created or removed are often generated elsewhere - perhaps via roles, end user requests or authoritative data feeds.  The provisioning layer helps fulfill the system level accounts and permissions that need creating.  This could be described as compliance by design and would be seen as a panacea deployment, with quite a pro-active approach to security based on approval before creation.
  • Compliance By Control
The second area could be the authorization component.  Once an account exists within a target system, there is a consumption phase, where an application or system uses that account and associated permissions to manage authorization.  The 'what that user can do' part.  This may occur internally, or more commonly, leverage an external authorization engine, with a policy decision point and policy enforcement point style architecture.  Here there is a reliance on the definition of authorization policies that can control what the user can do.  These policies may include some context data, such as what the user is trying to access, the time of day, IP address and perhaps some business data around who the user is - department, location and so on.  These authorization 'policies' could be as simple as the read, write, execute permission bits set within a Unix system (the policy here is really quite implicit and static), or something more complex that has been crafted manually or automatically and is specific to a particular system, area and organisation.  I'd describe this phase as compliance by control, where the approval emphasis is on the authorization policy (see the sketch after this list).
  • Compliance By Review
At both the account level and authorization level, there is generally some sort of periodic review.  This review could be for internal or external compliance, or to simply help align business requirements with the underlying access control fulfillment layer.  This historically would be the 'who has access to what?' part.  This would be quite an important - not to mention costly, from a time and money perspective - component for disconnected identity management infrastructures.  It normally requires a centralization of identity data that has been created and hopefully approved at some point in the past.  The review is to help identify access misalignment, data irregularities or controls that no longer fulfill the business requirements.  This review process is often marred by data analysis problems, complexity, a lack of understanding with regards to who should perform reviews, or perhaps a lack of clarity surrounding what should be certified or what should be revoked.
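
As a rough sketch of the 'compliance by control' idea above, a policy decision point boils down to a function over subject, resource and context attributes, with a separate enforcement point asking it before serving a request.  The policy and attribute names below are invented for illustration:

    from datetime import time

    def decide(subject, resource, context):
        # Hypothetical policy: HR staff may read payroll data,
        # from the office network, during working hours only.
        if (resource == "payroll"
                and subject.get("department") == "HR"
                and context.get("ip", "").startswith("10.1.")
                and time(9) <= context.get("time", time(0)) <= time(17)):
            return "PERMIT"
        return "DENY"

    # The enforcement point (PEP) consults the PDP before releasing the data.
    print(decide({"department": "HR"}, "payroll",
                 {"ip": "10.1.4.7", "time": time(11, 30)}))  # PERMIT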

SIEM, Activities and Who Has Accessed What?

One of the recent expansions of the access review process has been to marry together security information and event monitoring (SIEM) data with identity and access management extracts.  Being able to see what an individual has actually done with their access can help to determine whether they still need certain permissions.  For example, if a line manager is presented with a team member's directory access which contains 20 groups, it could be very difficult to decide which of those 20 groups are actually required for that individual to fulfill their job.  If, on the other hand, you can quickly see that twelve of the 20 groups were not used within the last 12 months, that is a good indicator that they are no longer required on a day to day basis and should be removed.
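
To make that concrete, here's a minimal sketch of the join between an access extract and SIEM activity data - the data shapes are invented, but the logic is just 'granted minus recently used':

    from datetime import datetime, timedelta

    # Hypothetical extracts: groups granted vs. last usage seen in SIEM events.
    granted = {"grp-finance", "grp-vpn", "grp-sharepoint", "grp-payroll"}
    last_used = {
        "grp-finance": datetime(2013, 4, 1),
        "grp-vpn": datetime(2012, 1, 10),
    }

    cutoff = datetime.now() - timedelta(days=365)
    unused = {g for g in granted if last_used.get(g, datetime.min) < cutoff}
    print(unused)  # candidates for revocation at the next access review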

There is clearly a big difference between what a user can access and what they actually have accessed.  Getting this view requires quite low level activity logging within a system, as well as the ability to collect, correlate, store and ultimately analyse that data.  SIEM systems do this well, with many now linking to profiling and identity warehouse technologies to help create this meta-warehouse.  This is another move towards the generally accepted view of 'big data'.  Whilst this central warehouse is now very possible, the end result is still only really speeding up the process of finding failures further up the identity food chain.

Movement to Identity 'Intelligence'

I've talked about the concept of 'identity intelligence' a few times in the past.  There is a lot of talk about moving from big data to big intelligence, and security analytics is jumping on this band wagon too.  But in reality, intelligence in this sense is really just helping to identify the failings faster.  This isn't a bad thing, but ultimately it's not particularly sustainable or actually going to push the architecture forward to help 'cure' the identified failures.  It's still quite reactive.  A more proactive approach is to apply 'intelligence' at every component of the identity food chain, to help make identity management more agile, responsive and aligned to business requirements.  I'm not advocating what those steps should be, but it will encompass an approach and mindset more than just a set of tools, and rest heavily on a graph based view of identity.

By analyzing the 'who has accessed' part of the identity food chain, we can gain yet more insight into who and what should be created and approved within the directories and databases that underpin internal and web based user stores.  Ultimately this may make the access review component redundant once and for all.

By Simon Moffatt

Protect Data Not Devices?

"Protect Data Not Devices", seems quite an intriguing proposition given the increased number of smart phone devices in circulation and the issues that Bring Your Own Device (BYOD) seems to be causing, for heads of security up and down the land.  But here is my thinking.  The term 'devices' now covers a multitude of areas.  Desktop PC's of course (do they still exist?!), laptops and net books, smart phones and not-so-smart phones, are all the tools of the trade, for accessing the services and data you own, or want to consume, either for work or for pleasure.  The flip side of that is the servers, mainframes, SAN's, NAS's and cloud based infrastructures that store and process data.  The consistent factor is obviously the data that is being stored and managed, either in-house or via outsourced services.


Smarter the Device, The More Reliant We Become

This is a pretty obvious statement and doesn't just apply to phones.  As washing machines became more efficient and dishwashers became cheaper and more energy saving, we migrated in droves, allowing our time to be spent on other essential tasks.  The same is true for data accessing devices.  As phones morphed into micro desktop PCs, we now rely on them for email, internet access, gaming, social media, photography and so on.  Some people even use this thing called the telephone on them.  Crazy.  As the features and complexity ramp up, we no longer need another device for listening to music, taking pictures or accessing Facebook.  Convenience and service provision increase, as does the single-point-of-failure syndrome and our reliance on them being available 99.999% of the time, up to date and online.

Smarter the Device, The Less Important It Becomes

Now this next bit seems a bit of a paradox.  As devices become smarter, greater emphasis is placed on the data and services those devices access.  For example, a fancy Facebook client is pretty useless if only 100 people use Facebook.  A portable camera is just that, unless you have a social outlet through which to distribute the images.  The smartness of the devices themselves is actually driven by the services and data they need to access.  Smartphones today come with a healthy array of encryption features, remote backup, remote data syncing for things like contacts, pictures and music, as well as device syncing software like Dropbox.  How much data is actually specifically related to the device?  In theory nothing.  Zip.  Lose your phone and everything can be flashed back down in a few minutes, assuming it was set up correctly.  Want to replace a specific model and brand with a model of equivalent specification from a different vendor?  Yep, you can do that too, as long as you can cope with a different badge on the box.  Feature differentiation is becoming smaller as the technology becomes more complex.

Data Access versus Data Storage

As more and more services become outsourced (or, to use the buzz, are moved to the 'cloud'), the storage part becomes less of a worry for the consumer.  The consumer could easily be an individual or an organisation.  Backup, syncing, availability, encryption and access management all fall to the responsibility of the outsourced data custodian.  Via astute terms and conditions and service level agreements, the consumer shifts responsibility across to the data custodian and service provider.

The process of accessing that data then starts to fall partly on the consumer.  How devices connect to a network, how users authenticate to a device and so on all fall to the device custodian.  Access traffic encryption will generally require a combination of efforts from both parties.  For example, the data custodian will manage SSL certificates on their side, whilst the consumer has a part to play too.

So, to slightly contradict my earlier point (!), this is where the device really is the egress point to the data access channel, and so requires important security controls governing access to the device.  The device itself is still only really a channel to the data at the other end, but once an individual (or a piece of software, malicious or not) has access to a device, they in turn can potentially open access channels to outsourced data.  The device access is what should be protected, not necessarily the tin itself.

As devices become smarter and service providers more complex, that egress point moves substantially away from the old private organisational LAN or equivalent.  The egress point is the device regardless of location on a fixed or flexible network.

Data will become the ultimate prize, not necessarily the devices that are used to access it.

By Simon Moffatt


Passwords And Why They’re Going Nowhere

Passwords have been the bane of security implementers ever since they were introduced, yet they are still present on nearly every app, website and system in use today.  Very few web based subscription sites use anything resembling two-factor authentication, such as one-time-passwords or secure tokens.  Internal systems run by larger organisations implement additional security for things like VPN access and remote working, which generally means a secure token.


Convenience Trumps Security

Restricting access to sensitive information is part of our social make up.  It doesn't really have anything to do with computers; it just so happens that for the last 30 years they've been the medium we use to access and protect that information.  Passwords came before the user identity and were simply a cheap (in cost and time) method of preventing access by those without the 'knowledge'.  Auditing and better user management approaches resulted in individual identities, coupled with individual passwords, providing an additional layer of security.  All sounds great.  What's the problem then?  Firstly, users aren't really interested in the implementation of the security aspect.  They want the stuff secure; they don't care how that is done, or perhaps more importantly, don't realise the role they play in the security life cycle.  A user writing down the password on a post-it is a classic complaint of a sysadmin.  But the user is simply focused on convenience and performing their non-security related, revenue generating business role at work, or accessing a personal site at home.


Are There Alternatives & Do We Need Them?

The simple answer is yes, there are alternatives, and in some circumstances, yes, we do need them.  There are certainly aspects of password management that can help with security if alternatives or additional approaches can't be used or aren't available.  Password storage should go down the 'hash don't encrypt' avenue, with some basic password complexity requirements in place.  Albeit making those requirements too severe often results in the writing-down-on-a-post-it issue...
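
For what 'hash don't encrypt' can look like in practice, here's a minimal sketch using Python's standard library PBKDF2; the iteration count and salt size are illustrative, not a recommendation:

    import hashlib
    import hmac
    import os

    def hash_password(password):
        # Store the random salt and the derived key - never the password itself.
        salt = os.urandom(16)
        key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
        return salt, key

    def check_password(password, salt, key):
        # Re-derive the key and compare in constant time.
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
        return hmac.compare_digest(candidate, key)

    salt, key = hash_password("correct horse battery staple")
    print(check_password("correct horse battery staple", salt, key))  # True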

Practical alternatives seem to be few and far between (albeit feel free to correct me on this).  By practical, I'm referring to both cost (time and monetary) and usability (good type-I and type-II error rates, convenient).  Biometrics have been around a while: stuff like iris and fingerprint scanning, as well as facial recognition.  All three are pretty popular at most large-scale international airports, mainly as the high investment levels can be justified.  But what about things like web applications?  Any use of biometric technology at this level would require quite a bit of outlay for new capture technology, and quite possibly introduces privacy issues surrounding how that physical information is stored or processed (albeit hashes of the appropriate data would probably be used).

There are also things like one-time-passwords, especially using mobile phones instead of tokens.  But is the extra effort in deployment and training enough to warrant the outlay and potential user backlash?  This would clearly boil down to a risk assessment of the information being protected, which the end user could probably not articulate.
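
A phone-based one-time password scheme is often little more than the TOTP algorithm (RFC 6238): HMAC over a time-step counter, then dynamic truncation.  A compact sketch, assuming a pre-shared secret:

    import hashlib
    import hmac
    import struct
    import time

    def totp(secret, step=30, digits=6):
        # HMAC the current 30-second time-step counter with the shared secret.
        counter = struct.pack(">Q", int(time.time()) // step)
        digest = hmac.new(secret, counter, hashlib.sha1).digest()
        # Dynamic truncation: the low nibble of the last byte picks an offset.
        offset = digest[-1] & 0x0F
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(totp(b"shared-secret"))  # the code the phone and server both compute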


Why We Still Use Them...

Passwords aren't going anywhere for a long time, for several reasons.  Firstly, it's cheap.  Secondly, it's well known by developers, frameworks, libraries, but most importantly the end user.  Even a total IT avoider is aware of the concept of a password.  If that awareness changes, there is suddenly an extra barrier-to-entry for your new service, application or website to be successful.  No one wants that.

Thirdly, there are several 'bolt on' approaches to using a username and password combination - things like step-up authentication and knowledge based authentication.  If a site, or a resource within a site, is deemed to require additional security, further measures can be taken when a certain risk threshold is breached, without requiring a brand new approach to authentication.
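
The risk threshold idea can be sketched as nothing more than a score check sitting in front of the resource; the signals and weights below are entirely invented:

    def risk_score(request):
        # Accumulate hypothetical risk signals for this access attempt.
        score = 0
        if request.get("new_device"):
            score += 40
        if request.get("country") != request.get("usual_country"):
            score += 30
        if request.get("resource_sensitivity") == "high":
            score += 30
        return score

    def authentication_required(request, threshold=50):
        # Password alone below the threshold; step up to a second factor above it.
        return "second-factor" if risk_score(request) >= threshold else "password"

    print(authentication_required({"new_device": True, "country": "FR",
                                   "usual_country": "UK"}))  # second-factor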

As familiarity with password management matures, even the most non-technical of end users will become used to using passphrases, complex passwords, unique passwords per application and so on.  In turn, developers will become more familiar with password hashing and salting, data splitting and further storage protection.  Whilst all are perhaps sticking plaster approaches, the password will be around for a long time to come.

By Simon Moffatt


Optimized Role Based Access Control

RBAC.  It's been around a while - pretty much since access control systems were embedded into distributed operating systems.  It often appears in many different forms, especially at an individual system level, in the form of groups, role based services, access rules and so on.  Ultimately, the main focus is the grouping of people and their permissions, in order to accelerate and simplify user account management.


Enterprise RBAC
Enterprise role management has become quite a mature sub-component of identity and access management in the last few years.  Specialist vendors developed singularly focused products that acted as extensions to the provisioning tooling.  These products developed features such as role mining, role approval management, segregation of duties analysis, role request management and so on.  Their general feature set was that of an 'offline' identity analytics database that could help identify how users and their permissions could be grouped together, either based on business and functional groupings or system level similarities.  Once the roles had been created, they would then be consumed either by a provisioning layer or via an access request interface.  The underlying premise being that access request management would be simplified, due to business friendly representations of the underlying permissions, and the account creation and fulfillment process would be accelerated.

The Issues & Implementation Failures
The process of developing an RBAC model was often beset with several problems.  IAM encompasses numerous business motives and touch points - which is why many still argue identity management is a business enabler more than a security topic - and developing roles across multiple business units and systems is time consuming and complex.  A strong and detailed understanding of the underlying business functions, teams, job titles and processes is required, as well as the ability to perform analysis of the required permissions for each functional grouping.  Whilst this process is certainly mature and well documented, implementation is still an effort laden task, requiring multiple iterations and sign off before an optimal model can be released.  Once a model is then available for use, it requires continual adaptation as systems alter, teams change, job titles get created and so on.  Another major issue with RBAC implementation is the often mistaken view that all users and all system permissions must be included in such an approach.  Like any analytic model, exceptions will exist and they will need managing as such, not necessarily forced into the RBAC approach.

Speeding up Role Creation
Role creation is often accomplished using mining or engineering tools.  These tools take offline data, such as human resources and business functional mappings, as well as system account and permissions data.  Using that data, the process is to identify groupings at the business level (known as top down mining) as well as similarities at the permissions level (known as bottom up mining).  Both processes tend to be iterative in nature, as the results are often inconsistent, with difficulties surrounding agreement on user-to-function mapping and function-to-permissions mapping.

One of the key ways of speeding up this process is to use what is known as 'silent migration'.  This approach allows roles to be created and used without change to the users' underlying permission sets.  This instantly removes the need for continual approval and iteration in the initial creation step.  The silent migration consists of initially mapping users into their functional grouping - for example, job title, team, department and so on.  The second step is to analyse the system permissions for users in each functional grouping only.  Any permissions identified across all users in the grouping are applied to the role.  No more, no less.  With this, no changes are therefore made to the users' permissions.  The process is simply performing an intersection of each user's permissions.
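
The intersection step is trivial to express; a minimal sketch with invented users and permissions:

    # Target system extract: permissions currently held by each user
    # in one functional grouping (e.g. the 'finance analyst' job title).
    extract = {
        "alice": {"fin_read", "fin_write", "vpn"},
        "bob":   {"fin_read", "vpn"},
        "carol": {"fin_read", "vpn", "admin"},
    }

    # The role gets only the permissions held by every member: no more, no less.
    role = set.intersection(*extract.values())
    print(role)  # {'fin_read', 'vpn'}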

Focus on the Exceptions
Once the users of each functional grouping have had their permissions migrated into the role, it's important to identify any user associated permissions that are left over.  These can simply be known as exceptions, or permissions of high risk.  They're high risk, as they are only assigned to specific individuals and not the group.  This association could well be valid - a line manager, for example, may have different permissions - but as a first pass, they should be reviewed first.  To identify the exceptions, a simple subtraction can be done between the user's current permissions (as identified by their target system extract) and the permissions associated with their functional grouping.  Anything left needs reviewing.
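
Continuing the sketch above, the leftovers for review are just the per-user set difference against the role:

    # Anything a user holds beyond the role is an exception to be reviewed.
    exceptions = {user: perms - role for user, perms in extract.items()}
    print(exceptions)  # {'alice': {'fin_write'}, 'bob': set(), 'carol': {'admin'}}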

This approach can also help accelerate access review strategies.  Instead of looking to review every user, every permission and every functional grouping, simply analyse anything which is anomalous, either via peer comparison or functional grouping.

RBAC is a complex approach, but it can provide value in many access review and access request use cases.  It just isn't a catch-all, or an approach for every system and user.  Applying it to specific applications, using a more simplified approach, can reap rewards.

By Simon Moffatt