Thursday, March 31, 2011

More on the topic of NIST and Access Control

Continuing the discussion from NIST and Access Control, I disagree with some of the distinctions between Attribute and Policy Based Access Control (ABAC and PBAC). Most of this disagreement is really about where attribute- and policy-based controls begin and end, and what their characteristics are, rather than about the concepts themselves.

In the report surveying access control models, NIST indicated that "one limitation of the ABAC model is that ... there can be disparate attributes and access control mechanisms among the organizational units. It is often necessary to harmonize access control across the enterprise." This distinction between specific silos of attribute-based control and more enterprise-coordinated (but still attribute-based) control is the distinction between ABAC and PBAC in the report. I consider all of this policy-based access control, where the scope of "harmonization" varies (perhaps by conscious decision or priority of implementation).

It seems reasonable that an organization would work to fully harmonize its access to shared data or resources. That word, "shared", is important. For data and attributes specific to one organizational unit (or domain or application), it is reasonable to have individual policies (i.e., there is nothing to harmonize to). However, once data is shared across applications and org units, there is a need to harmonize - since subjects could be granted or denied access to the same data or resources by different units or applications.

So, there is a need to consider attribute-based control in specific areas AND in the enterprise as a whole. And, there is a need to be consistent with respect to the attributes and the policies. This seems true whether you call this ABAC or PBAC.

The NIST report goes on to say, PBAC "requires not only complicated application-level logic to determine access based on attributes, but also a mechanism to specify policy rules in unambiguous terms." This is not controversial, but I would clarify the term "application-level" in "application-level logic".

I agree that you need "guards" related to an application's need for and use of policy. However, you could obtain this by intercepting application/user attempts to access resources and then applying policy. Of course, you likely also have to map the application details into the attributes of your policies. This is where Semantic Web concepts (such as OWL's SubClassOf, EquivalentClasses, DisjointClasses, etc.) come into play. As for the "guards" themselves, there is a good example in the KAoS Guards, implemented in the KAoS infrastructure from the Florida Institute for Human and Machine Cognition (IHMC).
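To make that attribute-mapping idea concrete, here is a minimal sketch (in Python rather than OWL, and with entirely hypothetical attribute names) of aligning application-specific terms with a shared policy vocabulary, playing the role that owl:equivalentClass assertions would play in a real ontology:

```python
# Illustrative only: application-local attribute names aligned with a shared
# policy vocabulary, in the spirit of OWL's EquivalentClasses axioms.
# Every name below is hypothetical.
EQUIVALENT = {
    "hr_app.emp_level": "policy.clearanceLevel",
    "crm_app.user_tier": "policy.clearanceLevel",
}

def to_policy_attributes(app_attrs):
    """Translate an application's attribute names into the shared policy
    vocabulary, leaving unmapped names untouched."""
    return {EQUIVALENT.get(name, name): value
            for name, value in app_attrs.items()}

print(to_policy_attributes({"hr_app.emp_level": "secret"}))
```

A real implementation would, of course, draw these equivalences from an ontology rather than a hard-coded table, but the translation step at the guard is the same idea.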

The second part of the quoted sentence talks about the need to specify policies in "unambiguous terms". This was one of the key points of my earlier post - one where we need end-user and government/organizational participation. BTW, I will be so bold as to also endorse the ontology work of the KAoS team as a basis for both access control (authorization) and obligation policies (see the entity rendering of their OWL ontology at http://ontology.ihmc.us/policy.html).

So, where do I start to disagree a bit more with the NIST position? It is in the area of an "Authoritative Attribute Source" ("the one source of attribute data that is authorized by the organization"). I cannot argue with taking this as a first line goal. Fred Wettling also pointed this out in his comments to my earlier post ... that there needs to be "a rethinking of the hub and spoke models of access control of the past. This raises the question of where identity-based attributes are stored and how/when they are validated. Can peer-to-peer communications rely on attribute assertions from the peers? In some cases yes. In other cases, third-party validation may be required. In both cases a centralized infrastructure may NOT exist. That is a challenge that may be addressable through some level of standardization of vocabulary, policy, and rule structures."

While I agree that a single authoritative attribute source is required in some scenarios, it is not mandated in all scenarios, nor is it required for PBAC. As noted above, what is needed is a common understanding of the semantics of the attributes and their "qualities" (including where and how they are obtained). This can be managed and mitigated by semantic alignment and by policies about the data itself.

Just some more food for thought in the continuing dialog about access control and policy-based security/management/...

Next time, I promise to finally write about Risk Adaptive Access Control.

Andrea

Monday, March 21, 2011

Dialog on "NIST and Access Control", my previous post

I received a notification that the following comment was posted by Fred Wettling (from Bechtel) ... but it does not appear on the web site. So, I am re-posting it as a full entry and then adding a bit of dialog of my own.

My apologies to Fred that the site is not working quite right, and my appreciation for the additional insights! Here is Fred's post ....
Good post, Andrea ;-)

Here are a few initial thoughts...The overall NIST model can be viewed as moving from topology-based control (network boundaries) to policy-based control. RBAC, ABAC, PBAC... appear to be the evolution of how policies are instantiated and the level of granularity required for the target level of security. ACLs, firewall rules, and other topology-based controls will be around for a while and continue to serve as a coarse-grained access control for many hosted services within companies and also cloud providers that have requirements for zone-based controls.

But topologies are changing from at least three perspectives that must be addressed in tomorrow's security.
1. Resources and information are becoming more distributed as we move from a single source of access (monolithic applications) toward a "single source of truth", with mash-ups securely accessing and aggregating multiple authoritative information sources.
2. Information sources are becoming more distributed, including peer-to-peer communications.
3. Location-based services imply a geographic or physical context that may be relevant in security decisions.

There's also a needed (and implied) shift in WHERE security is applied. The trend must be toward applying security at the target resource or information itself, in order to operate in the expanding Internet of Things. The implication of these topology-related trends is that how information and resources are secured will have to change … along with a rethinking of the hub and spoke models of access control of the past. This raises the question of where identity-based attributes are stored and how/when they are validated. Can peer-to-peer communications rely on attribute assertions from the peers? In some cases yes. In other cases, third-party validation may be required. In both cases a centralized infrastructure may NOT exist. That is a challenge that may be addressable through some level of standardization of vocabulary, policy, and rule structures that you mentioned. The challenge, as usual, is the frequent vendor perception that product differentiation is achieved through proprietary technology.

I think there needs to be an industry awakening about the value of standard, policy-based access control. There is high value for organizations in having common policy definitions accessible by PDPs and PEPs provided by multiple vendors.
1. DMTF has done some great work collaborating with others in standardizing policy models.
2. Before it merged with The Open Group in 2007, the Network Applications Consortium (NAC) published the Enterprise Security Architecture. Link: https://www2.opengroup.org/ogsys/jsp/publications/PublicationDetails.jsp?catalogno=h071. A team has been established to update the 115-page document. It has some great content that is relevant to this thread.

NIST could certainly help in pushing this work forward and in promoting policy standardization in other standards organizations, including the two mentioned above. Unification would be a good thing.

Fred Wettling
Bechtel Fellow
And, here are my replies:

1. As for ACLs being around for a while, I totally believe that. However, I am not sure that they will remain out of necessity, so much as due to the persistence of legacy systems and the reluctance of vendors to move away from tested (and costly) proprietary technologies. I think that we can move fully to PBAC or RAdAC policies, at least at the declarative level. Let me explain ... If hardware and software products understand a "standard" policy ontology and vocabulary, then they can implement/act on declarative policy, as opposed to processing device/OS/software-specific ACLs. Now, a device may only be capable of coarse-grained control, or that may be the appropriate implementation for certain environments (for example, "zone-based control" in a cloud, where the fine-grained details are not/should not be known). But a different take, instead of requiring an ACL, is to say that a "standard" rule is translated into coarse-grained protection as part of the PBAC/RAdAC processing. Even in the worst case of using only ACLs, the policy infrastructure can relate each ACL to the declarative policy that it was designed to implement. Traceability is a wonderful thing!
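That translate-and-trace idea can be sketched quickly; the rule shape and field names below are purely illustrative, not drawn from any real product:

```python
# Hypothetical sketch: compiling one declarative rule into a device-level,
# firewall-style ACL entry while preserving a traceability link back to
# the declarative policy it implements.
from dataclasses import dataclass

@dataclass
class Rule:
    rule_id: str
    effect: str          # "permit" or "deny"
    subject_zone: str    # a coarse attribute a device can act on
    resource_cidr: str

def to_acl_entry(rule):
    """Render the declarative rule as a coarse-grained ACL entry; the
    'source_rule' field records which policy this entry came from."""
    return {
        "action": "allow" if rule.effect == "permit" else "deny",
        "src_zone": rule.subject_zone,
        "dst": rule.resource_cidr,
        "source_rule": rule.rule_id,   # traceability back to the policy
    }

print(to_acl_entry(Rule("R-42", "permit", "dmz", "10.0.0.0/24")))
```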

2. It is noted that a centralized infrastructure may not exist. I agree, but I also want to allow for centralized policy declaration (for example, from federal, state or local governments, or from a Governance/Compliance Policy-Setting Body in an enterprise) alongside decentralized, local policy definition and override. On the distribution side, it is necessary to disseminate the broad-coverage policies (such as legislation), as well as the rules for how/when/by whom they can be modified. Does this mean "centralized" infrastructure? Not necessarily, but there may/should be "standard" policy repositories. Regarding policy decision and enforcement, there may be security aspects that can/should be processed locally, others that require third-party validation, some that rely on sophisticated, controlled policy decision point analysis, etc. ... It eventually comes down to policy on policy. But, in any/all cases, security needs to work when offline/disconnected.
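A toy illustration of that "policy on policy" idea (all structures here are hypothetical): a centrally declared rule, a local rule that attempts to override it, and a meta-policy that decides whether the override is allowed:

```python
# Hedged sketch: central declaration with local override, governed by a
# meta-policy. Authorities, ids, and effects are all made up.
CENTRAL = {"id": "GOV-1", "authority": "federal", "effect": "deny"}
LOCAL = {"id": "SITE-7", "authority": "site", "effect": "permit",
         "overrides": "GOV-1"}

# Meta-policy: which authorities may override which (assumed for illustration).
MAY_OVERRIDE = {("site", "federal"): False, ("state", "federal"): False}

def effective(central, local):
    """The local rule wins only if the meta-policy authorizes the override;
    otherwise the centrally declared effect stands."""
    if (local.get("overrides") == central["id"]
            and MAY_OVERRIDE.get((local["authority"], central["authority"]), False)):
        return local["effect"]
    return central["effect"]

print(effective(CENTRAL, LOCAL))  # the site may not override, so "deny" stands
```

Because the meta-policy travels with the broad-coverage rules, the same decision can be made locally when disconnected.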

3. Fred lists work by DMTF and NAC. The NAC paper is very insightful and I encourage people to read it. As for the DMTF, they have indeed pushed the boundaries of policy-based management, although I find their model to be complex (due to the use of a query language in defining policy conditions and actions, and the disconnected instantiation of the individual rule components). Also, CIM requires extension to address all the necessary domain concepts and vocabularies, which will not be easy.

4. I was a bit unclear in my reference to NIST and helping with "practices and standards". One of my NIST colleagues pointed out that they do not set industry standards, but support the development of "open, international, consensus-based" ones. In addition, they develop guidance for Federal agencies. This was what I meant, although I did not use enough words. :-) Federal, state, and local governments are the ones that write legislation/statutes. But, NIST could encourage government agencies to publish their policies in a standard format, and that would help industry and promote competition. This was indeed my goal in mentioning NIST - not for them to create a standard, but to help drive one and ensure its utilization by Federal agencies. NIST is in a unique position to help because they are not a policy-setting agency. And, as Fred noted above "the challenge ... is frequent vendor perception that product differentiation is achieved through proprietary technology."

Well, I'm sure that this is too long already. Look for more in the next few days.

Andrea

Friday, March 18, 2011

NIST and Access Control

I ran across an excellent paper from NIST (the US National Institute of Standards and Technology), A Survey of Access Control Methods. The document is a component of the publication, "A Report on the Privilege (Access) Management Workshop". I highly recommend reading it, since the security landscape is evolving ... as the technology, online information, regulations/legislation, and "need to share" requirements of a modern, agile enterprise keep expanding.

Access control is discussed from the hard-core (and painfully detailed) approach of access control lists (ACLs) all the way through policy-based and risk-adaptive control (PBAC and RAdAC). Here is a useful image from the document, showing the evolution:



Reading the paper triggered some visceral reactions on my part ... For example, I strongly feel that role-based access control is no longer adequate for the real world. Yet, it is where most of us live today.

The problem is the need for agility. The world is no longer only about restricting access to specific, known-in-advance entities using a one-size-fits-all-conditions analysis ("need to protect" with predefined roles) - but also about granting the maximum access to information that is allowed ("need to share" considering the conditions under which sharing occurs).

Here are some examples ... Firefighters need the maximum data about the location and conditions of a fire that they can legally obtain (see my previous post, Using the Semantic Web to Fight Fires). Law enforcement personnel, at the federal, state or local levels, need all the data about suspicious activities that can be legally shared. An information worker needs to see and analyze all relevant data that is permitted (legally and within corporate guidelines). The word "legally" comes up a lot here ... more on that in another post.

So, how do you accomplish this with simple roles? You can certainly build new roles that take various situational attributes into account. But how far can you go with this approach? At some point, the number of roles (variations on a theme) spirals out of control. You really need attribute-based control. As the NIST paper points out, with attributes, you don't need to know all the requesters in advance. You just need to know about the conditions of the access.
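The contrast is easy to sketch. Assuming a made-up set of attributes (building on the firefighter example above), an attribute-based rule permits any requester whose attributes satisfy the condition, without enumerating requesters or roles in advance:

```python
# Minimal ABAC sketch; all attribute names are illustrative.
def permit(attrs):
    """One attribute-based rule: on-duty firefighters operating in the
    incident's jurisdiction may read the incident data."""
    return (attrs.get("role") == "firefighter"
            and attrs.get("on_duty") is True
            and attrs.get("jurisdiction") == attrs.get("incident_jurisdiction"))

# Any requester presenting these attributes is permitted, known in advance or not.
print(permit({"role": "firefighter", "on_duty": True,
              "jurisdiction": "county-12", "incident_jurisdiction": "county-12"}))
```

Doing the same thing with roles alone would require minting a role per combination of duty status and jurisdiction, which is exactly the explosion of "variations on a theme" described above.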

But, simply adding attribute data (data about the information being accessed, the entity accessing it, the environment where the access occurs or is needed, ...) can get quite complex. The real problem is figuring out how to harmonize and evaluate the attribute information if it is accessed from several data stores or infrastructures. Then, closely associated with that problem is the need to be consistent across an enterprise - to not allow access (under the same conditions) through one infrastructure that is disallowed by another.

Policy-based access control, the next concept in the evolution, starts to address some of these concerns. NIST describes PBAC as "a harmonization and standardization of the ABAC model at an enterprise level in support of specific governance objectives." It concerns the creation and administration of organization-wide rule sets (policies) for access control, using attribute criteria that are also semantically consistent across the enterprise.

Wow, reading that last sentence made my head hurt. :-) Let me decompose the concepts. For policy-based access control to really work, we need (IMHO, in order of implementation):

  1. A well-defined (dare I say "standard") policy/rule structure
  2. A well-understood vocabulary for the actors, resources and attributes
  3. The ability to use #1 and #2 to define access control rules
  4. The ability to analyze the rules for consistency and completeness
  5. An infrastructure to support the evaluation and enforcement of the rules (at least by transforming between local data stores and infrastructures, and the well-understood and well-defined vocabulary and policies/rules)
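As a rough illustration of items #1, #3 and #4 (with a made-up rule structure, not any standard's), conflicting rules over the same attribute conditions can be detected mechanically once rules share one structure and one vocabulary:

```python
# Illustrative sketch: a common rule shape (item #1) written against a shared
# attribute vocabulary (item #2) lets us define rules (item #3) and check
# them for consistency (item #4). Structures here are invented for the example.
from itertools import combinations

rules = [
    {"id": "P1", "effect": "permit",
     "condition": frozenset({("role", "analyst"), ("data", "shared")})},
    {"id": "P2", "effect": "deny",
     "condition": frozenset({("role", "analyst"), ("data", "shared")})},
]

def conflicts(rule_set):
    """Flag pairs of rules with identical conditions but opposite effects."""
    return [(a["id"], b["id"])
            for a, b in combinations(rule_set, 2)
            if a["condition"] == b["condition"] and a["effect"] != b["effect"]]

print(conflicts(rules))  # → [('P1', 'P2')]
```

Completeness checking (are there condition combinations no rule covers?) is harder, but it likewise depends on first agreeing on the structure and vocabulary.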

Some day, we will have best practices and standards for #1 and #2. Even better, we could have government-blessed renderings of the standard legislation (SOX, HIPAA, ...) using #1 and #2.

Can NIST also help with these activities? I hope that it can. In the meantime, there are some technologies, like the Semantic Web, that can help.

As you can imagine, I have lots more things to discuss about the specifics of PBAC and RAdAC, in my next posts.

Andrea