Monday, March 21, 2011

Dialog on "NIST and Access Control", my previous post

I received a notification that the following comment was posted by Fred Wettling (from Bechtel) ... but it does not appear on the web site. So, I am re-posting it as a full entry and then adding a bit of dialog of my own.

My apologies to Fred that the site is not working quite right, and my appreciation for the additional insights! Here is Fred's post ....
Good post, Andrea ;-)

Here are a few initial thoughts ... The overall NIST model can be viewed as moving from topology-based control (network boundaries) to policy-based control. RBAC, ABAC, PBAC ... appear to be the evolution of how policies are instantiated and the level of granularity required for the target level of security. ACLs, firewall rules, and other topology-based controls will be around for a while and will continue to serve as coarse-grained access controls for many hosted services within companies, and also for cloud providers that have requirements for zone-based controls.

But topologies are changing in at least three ways that must be addressed in tomorrow's security.
1. Resources and information are becoming more distributed as we move from a single source of access (monolithic applications) toward a "single source of truth" with mash-ups securely accessing and aggregating multiple authoritative information sources.
2. More distributed information sources, including peer-to-peer communications.
3. Location-based services imply a geographic or physical context that may be relevant in security decisions.

There's also a needed (and implied) shift in WHERE security is applied. The trend must be toward applying security at the target resource or information itself, in order to operate in the expanding Internet of Things. These topology-related trends will require changes to how information and resources are secured ... and a rethinking of the hub-and-spoke models of access control of the past. This raises the question of where identity-based attributes are stored and how/when they are validated. Can peer-to-peer communications rely on attribute assertions from the peers? In some cases, yes. In other cases, third-party validation may be required. In both cases, a centralized infrastructure may NOT exist. That is a challenge that may be addressable through some level of standardization of the vocabulary, policy, and rule structures that you mentioned. The challenge, as usual, is the frequent vendor perception that product differentiation is achieved through proprietary technology.

I think there needs to be some industry awakening about the value of standard, policy-based access control. There is high value to organizations in having common policy definitions accessible by PDPs and PEPs provided by multiple vendors.
1. DMTF has done some great work collaborating with others in standardizing policy models.
2. Before it merged with the Open Group in 2007, the Network Application Consortium (NAC) published the Enterprise Security Architecture. Link: https://www2.opengroup.org/ogsys/jsp/publications/PublicationDetails.jsp?catalogno=h071. A team has been established to update the 115-page document. It has some great content that is relevant to this thread.

NIST could certainly help in pushing this work forward and promoting policy standardization in other standards organizations, including the two mentioned above. Unification would be a good thing.

Fred Wettling
Bechtel Fellow
And, here are my replies:

1. As for ACLs being around for a while, I totally believe that. However, I am not sure that they will stay around out of necessity, so much as due to the persistence of legacy systems and the reluctance of vendors to move away from tested and costly proprietary technologies. I think that we can move fully to PBAC or RAdAC policies, at least at the declarative level. Let me explain ... If hardware and software products understand a "standard" policy ontology and vocabulary, then they can implement/act on declarative policy, as opposed to processing device/OS/software-specific ACLs. Now, a device may only be capable of coarse-grained control, or that may be the appropriate implementation for certain environments (for example, "zone-based control" in a cloud), where the fine-grained details are not/should not be known. But a different take on this, rather than requiring an ACL, is to say that a "standard" rule is translated to coarse-grained protection as part of the PBAC/RAdAC processing. Even in the worst case of using only ACLs, the policy infrastructure can relate the ACL to the declarative policy that it was designed to implement. Traceability is a wonderful thing!
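To make that last point concrete, here is a rough Python sketch of the kind of translation I have in mind. The rule structure, attribute names, and ACL format are all hypothetical, and the "standard" vocabulary is assumed rather than real; the only point is that a coarse-grained ACL entry can be generated from, and traced back to, the declarative rule it implements.

```python
# Hypothetical sketch: a declarative, vendor-neutral rule is translated into a
# coarse-grained, device-specific ACL entry while keeping a link back to the
# source policy for traceability. All names and formats are illustrative.

from dataclasses import dataclass

@dataclass
class DeclarativeRule:
    policy_id: str        # identifier of the governing policy
    subject_attr: dict    # attributes the requester must hold (ABAC-style)
    resource: str         # the protected resource
    action: str           # permitted action
    effect: str           # "permit" or "deny"

@dataclass
class AclEntry:
    device: str           # enforcement point that will hold the ACL
    match: str            # coarse-grained match (e.g., a network zone)
    action: str
    effect: str
    derived_from: str     # traceability: the declarative rule it implements

def translate_to_acl(rule: DeclarativeRule, device: str, zone_for_resource) -> AclEntry:
    """Collapse a fine-grained declarative rule into the coarse-grained
    control that a particular device can actually enforce."""
    return AclEntry(
        device=device,
        match=zone_for_resource(rule.resource),  # lose detail, keep intent
        action=rule.action,
        effect=rule.effect,
        derived_from=rule.policy_id,
    )

# Example: a zone-based firewall can only filter by zone, so the translated
# entry is coarser than the rule, but it still records which policy it serves.
rule = DeclarativeRule(
    policy_id="org-policy-42",
    subject_attr={"role": "clinician", "clearance": "restricted"},
    resource="patient-records-service",
    action="read",
    effect="permit",
)
entry = translate_to_acl(rule, device="edge-firewall-1",
                         zone_for_resource=lambda r: "clinical-zone")
print(entry.derived_from)  # -> "org-policy-42": the ACL is traceable to its policy
```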

2. It is noted that a centralized infrastructure may not exist. I agree, but I also want to allow for centralized policy declaration (for example, from federal, state, or local governments, or from a Governance/Compliance Policy-Setting Body in an enterprise) and decentralized, local policy definition and override. On the distribution side, it is necessary to disseminate the broad-coverage policy (such as legislation), as well as the rules for how/when/by whom it can be modified. Does this mean a "centralized" infrastructure? Not necessarily, but there may/should be "standard" policy repositories. Regarding policy decision and enforcement, there may be security aspects that can/should be processed locally, others that require third-party validation, some that rely on sophisticated, controlled policy decision point analysis, etc. ... It eventually comes down to policy on policy. But, in any/all cases, security needs to work when offline/disconnected.
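Here is a small, purely illustrative sketch of what I mean by "policy on policy": a broad, centrally declared rule, a local override, and a meta-rule that says whether the override is allowed. Everything in it (the names, the structure, the idea of evaluating against a cached copy while disconnected) is an assumption for illustration, not a proposed standard.

```python
# Illustrative only: a centrally declared rule, a local override, and a
# meta-policy that governs whether the override is permitted. A local cache of
# both layers lets a policy decision point keep working while offline.

central_policy = {
    "id": "state-privacy-statute-7",
    "scope": "patient-records",
    "default_effect": "deny",
    "modifiable_by": ["hospital-compliance-office"],   # the rules about the rules
}

local_override = {
    "id": "hospital-er-exception-3",
    "scope": "patient-records",
    "effect": "permit",
    "condition": {"context": "emergency"},
    "declared_by": "hospital-compliance-office",
}

def override_allowed(central: dict, override: dict) -> bool:
    """Meta-policy check: only bodies named by the central policy may override it."""
    return override["declared_by"] in central["modifiable_by"]

def decide(request: dict, central: dict, override: dict) -> str:
    """Merge the layers: an authorized local override wins within its condition;
    otherwise the centrally declared default applies."""
    if override_allowed(central, override) and \
       request.get("context") == override["condition"]["context"]:
        return override["effect"]
    return central["default_effect"]

# A cached copy of both layers would let these decisions run disconnected.
print(decide({"context": "emergency"}, central_policy, local_override))  # permit
print(decide({"context": "routine"}, central_policy, local_override))    # deny
```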

3. Fred lists work by DMTF and NAC. The NAC paper is very insightful and I encourage people to read it. As for the DMTF, they have indeed pushed the boundaries of policy-based management, although I find their model to be complex (due to the use of a query language in defining policy conditions and actions, and the disconnected instantiation of the individual rule components). Also, CIM requires extension to address all the necessary domain concepts and vocabularies, which will not be easy.

4. I was a bit unclear in my reference to NIST and helping with "practices and standards". One of my NIST colleagues pointed out that they do not set industry standards, but support the development of "open, international, consensus-based" ones. In addition, they develop guidance for Federal agencies. This was what I meant, although I did not use enough words. :-) Federal, state, and local governments are the ones that write legislation/statutes. But, NIST could encourage government agencies to publish their policies in a standard format, and that would help industry and promote competition. This was indeed my goal in mentioning NIST - not for them to create a standard, but to help drive one and ensure its utilization by Federal agencies. NIST is in a unique position to help because they are not a policy-setting agency. And, as Fred noted above "the challenge ... is frequent vendor perception that product differentiation is achieved through proprietary technology."

Well, I'm sure that this is too long already. Look for more in the next few days.

Andrea
