High-Tech Humanities

Stephen W. Cote's Blog

Rocket

A Tool for Project Management Governance

Introduction

Rocket is a collaborative project management tool designed to work with organizations' unique perspectives on project methodologies. As mentioned in the Authorization as a Service article, the Rocket library is quite versatile. This article discusses the principles and philosophy of the Rocket design.

Overview

I've experienced a variety of program management methodologies and tools in my two decades of high-tech employment. Over that time I've become fascinated with the ways in which organizations implement various solutions, and with both the fervor and the resentment that come from trying something different. I've had the pleasure of working for companies of all sizes, both as a full-time employee and as a consultant. As a technical leader, no matter the methodology, I often found myself preparing multiple versions of project materials for different situations. As a consultant, I saw customers struggling with the same difficulties, without any one tool able to adequately bridge multiple methodologies and styles. As a proof of concept, I conducted a few decomposition exercises to create an object model able to represent the different approaches and requirements I had encountered. Two notable examples were: first, mandates that project plans conform to and include elements of poorly articulated models; and second, enterprise-grade plans that cross methodologies.

In one common scenario, an organization wants to deliver something, stakeholders expect certain artifacts at certain stages, management expects schedules and projections in particular formats, and the implementation teams, constrained by organization-defined methodologies, are impeded from providing what stakeholders and leadership expect. The organization's methodology, to be kind, can be a bastardization of some standard, de facto or otherwise. Challenges that lead organizations to customize procedures include disparate goals, budgets, and schedules, with interdependent requirements, resources, and artifacts. One of the first questions I encounter as a consultant is: How do you plan to track the proposed project or projects while accommodating a customized methodology with proprietary requirements and milestones?

The trap I found myself stepping into was using one tool (e.g., Excel) to create or adapt a set of estimates, including information and formulas I didn't plan on sharing outside my immediate team. Then, a separate project plan would be created from the tasks defined in the estimates using a planning tool (e.g., Project), from which a project schedule is more readily consumable without hacking up custom reports. That plan only lasts until the project starts, when it is severely gutted and reworked into whatever a customer's organization or department uses, with additional steps that are pertinent to the organization but would not otherwise have been apparent to me at the beginning. For large multi-month or multi-year projects, it's pretty common to have significant priority changes that lead to subsequent rework on the plan. What I find amazing is the complete disregard for the gross waste of shelving work and reworking plans during these re-planning sessions. Priorities change, that's understood, and nobody wants to throw good money after bad, but I think there should be a clear indication that anything left on the shelf, call it your buzzword-debt, is waste.

Weary of re-entering the same estimate information in myriad formats, and growing more frustrated over wasted time and effort, I reviewed a number of existing tools, from process frameworks to mind-map tools to collaborative project management tools. Each had its own distinct features, but I still felt drawn to something that could combine the collaborative features with a process framework while addressing those process areas that I thought were too often overlooked. So, I did what any red-blooded developer-at-heart would do: I made a model from which I could transform the information into whatever esoteric or, wishful thinking, somewhat standardized methodology was desired.

I recognize there are a number of cloudy Web-based project management tools with added business intelligence capabilities, particularly some that popped up over the last few years. At the time I started down this path, not so many existed, and even now, I don't know that they would do what I wanted to do: be able to accommodate multiple and unique project planning methodologies. In the process, I created a general approach and a set of tools that, inadvertently, match a lot of what the commercial cloudy tools offer, plus my own secret sauce.

Why Rocket?

I chose to create a model for a Big Hairy Audacious Goal (BHAG): planning to build and launch a rocket. And thus the project got its name. Although I had a number of complex project plans to use for reference, I wanted to use the most complicated mess I could think of. Consider the sheer number of sub-projects involved in building, launching, and landing (or delivering) a rocket. No one methodology governs the components built by sub-contractors, yet certain types of planning and status information must be centralized. Somebody wants to know whether or not that candle is getting off the ground on time. The Rocket design represents this hybrid and dynamic structure.

The Rocket Philosophy

Accompanying the technical design is a philosophy for harmonizing methodologies (Waterfall, Iterative, et al.). I wrote an elaborate treatise on the subject, but that ran counter to the objective, which wasn't to introduce yet another process but to make better sense of what is already being used. The most important points, which I think are still applicable, are the first two:

  1. Have a warrior spirit mentality (Bruce Lee). Be like water, fluid and dynamic when wet, and hard and carved when ice. This is primarily for the managers and leaders. There are times when flexibility is key, just as there are times when rigidity is needed.
  2. Identify, acknowledge, and address waste. This is not so much a rewording of the theory of constraints as it is the observation that managers and leaders won't or can't acknowledge waste, and thereby allow it to accrue. Wasting time, resources, and money ruins morale and suppresses innovation.

Data Model

Rocket is built atop the Account Manager 5 library, which includes support for organization hierarchies, encrypted data, roles-based access, object-oriented bulk load operations, and more. The two primary objects for aggregating everything together are Lifecycles, which represent the aggregate of all projects against specified budgets and schedules, and Projects, which represent one or more stages of work and their work products. Beyond Lifecycles and Projects is a rich model for describing many atomic aspects of a project plan.
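As a rough illustration of how Lifecycles aggregate Projects, here is a minimal sketch in Java. The class and field names here are hypothetical stand-ins, not the actual Rocket schema.

```java
import java.util.ArrayList;
import java.util.List;

// A Project represents one or more stages of work and their work products.
class Project {
    final String name;
    final List<String> stages = new ArrayList<>();
    Project(String name) { this.name = name; }
}

// A Lifecycle aggregates projects against a specified budget and schedule.
class Lifecycle {
    final String name;
    final double budget; // aggregate budget across all contained projects
    final List<Project> projects = new ArrayList<>();
    Lifecycle(String name, double budget) { this.name = name; this.budget = budget; }
    void add(Project p) { projects.add(p); }
}
```

The point of the split is that budget and schedule constraints live at the Lifecycle level, while stages and work products stay with each Project.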

Features

In addition to the project modeling capabilities, Rocket provides the following features:

Authorization as a Service

Account Manager 5 Capabilities

This article continues the discussion started in the Authorization as a Service article.

Within Account Manager 5 is a versatile foundation for working with and authorizing against enterprise identity and entitlement information. This foundation begins with a flexible entity model partitioned by organization, and a generic authorization model that accommodates Role-Based Access Control (RBAC), Attribute-Based Access Control (ABAC), and Risk-Based Authentication (RBA). The internal authorization implementation is called Participation Access Control, on which the Effective Authorization Service and Authorization as a Service are based.

Authorization as a Service is exposed as the Policy Service REST interface. An authorization client requests a Policy Definition for a given Policy, which includes context information, expiry instructions, and a list of expected parameters. A Policy Request can be created from the Policy Definition, and parameterized values may be specified for the defined parameters. The Policy Evaluator translates the parameters into Facts, which are then used during the evaluation of the rules and patterns attached to the policy. At the end of evaluation, the Policy Response includes the response state, any risk score, and the pattern chain used to evaluate the policy. A key feature of the Policy Service is that authorization operations, such as asserting role participation or entitlement effect, are pre-calculated through the Effective Authorization Service at the time the data is loaded. Therefore, the Policy Service has access to an emulated view of a given set of enterprise identities, accounts, applications, permissions, roles, data, and data rights.
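The definition-request-response round trip described above can be sketched as a minimal in-memory model. All type names are hypothetical illustrations, not the actual Policy Service REST types, and the evaluator's rule here is a stand-in that simply checks parameter completeness.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// A Policy Definition names the policy and lists its expected parameters.
class PolicyDefinition {
    final String policyName;
    final List<String> expectedParameters;
    PolicyDefinition(String policyName, String... expectedParameters) {
        this.policyName = policyName;
        this.expectedParameters = List.of(expectedParameters);
    }
}

// A Policy Request is created from the definition and carries parameter values.
class PolicyRequest {
    final PolicyDefinition definition;
    final Map<String, String> parameters = new HashMap<>();
    PolicyRequest(PolicyDefinition definition) { this.definition = definition; }
    void setParameter(String name, String value) { parameters.put(name, value); }
}

// The response carries the decision state and a risk score.
class PolicyResponse {
    final String state;
    final double riskScore;
    PolicyResponse(String state, double riskScore) { this.state = state; this.riskScore = riskScore; }
}

// The evaluator translates parameters into facts and evaluates the policy;
// here the "rule" just permits when every expected parameter was supplied.
class PolicyEvaluator {
    PolicyResponse evaluate(PolicyRequest request) {
        boolean complete = request.definition.expectedParameters.stream()
                .allMatch(request.parameters::containsKey);
        return new PolicyResponse(complete ? "PERMIT" : "DENY", complete ? 0.0 : 1.0);
    }
}
```

In the real flow, the rules and patterns attached to the policy are evaluated server-side against pre-calculated authorization data; the client only sees the definition, the request, and the response.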

Each cluster of enterprise identity data is logically separated by organization, and then by group and/or parent. Account Manager requires that all entity objects be owned, limiting default access to the owner and administrator, and safeguarding it from accidental exposure. Furthermore, values and certain objects may be encrypted using one of several integrated encryption layers (cryptography currently supplied by Bouncy Castle). Persons, Accounts, and Groups are all scoped by Group and then Organization, while Roles and Permissions are scoped by parent and then Organization. A snapshot of enterprise data is represented by a group of Persons, a group of applications, and forks in the Role and Permission trees. Applications are represented as named groups, into which Accounts are stored. If the Person to Account relationship is provided, as well as Application entitlements and entitlement memberships, then the Effective Authorization Service will calculate the effective permissions for a person via a linked account and that account's permissions. If a role hierarchy is loaded, and application entitlements and person memberships specified, then the Effective Authorization Service can also show the base and effective role entitlements. Each type of entity supports multi-valued ad-hoc attributes, which then allow the Policy Service to evaluate attribute-based access.

The Policy Service's internal authorization procedures understand a set of predefined Factory types, including persons, accounts, users (not represented here), attributes, roles, groups, and permissions. When executing Authorization rules, the referenced Factory types are retrieved and then evaluated through the Effective Authorization Service. This allows for rules that assert indirect permissions and indirect role participations, without supplying any of those references. Authorization assertions are limited to discrete values because the Policy Definition and supplied parameter templates include the necessary context, and all non-parameter facts remain on the server. For example, consider a person with twenty accounts, some role memberships in a role hierarchy, and any number of application entitlements. Does a Person have an Account with a Permission (entitlement) inherited from a Role? Or, does a Person have an Account with a Permission and a custom attribute value on either the account or the permission? This type of policy can be written by bulk loading data and defining the policy rules, and executed with only a reference to the Person. No PIPs, no lengthy attribute lists, no custom data schemas, no complicated or abstract authorization languages (looking at you, XACML). For another example, this model has been used to rapidly bulk-synchronize with IBM Security Identity Manager, including extracting role hierarchy and entitlements. Once that data was loaded, it was immediately available (to an authorized user, of course) for use by the Policy Service.
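The pre-calculation idea behind those examples can be sketched as follows: account and role relationships are flattened into a per-person index of effective permissions at load time, so the assertion at evaluation time reduces to a discrete lookup keyed only by a person reference. The names and structure below are illustrative, not the actual Effective Authorization Service.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

class EffectiveAuthorizationIndex {
    // person -> effective permissions, pre-calculated when the data is loaded
    private final Map<String, Set<String>> effectivePermissions = new HashMap<>();

    // Called during bulk load: fold an account's permissions (direct, or
    // inherited through a role hierarchy) into the owning person's set.
    void precompute(String person, String... accountPermissions) {
        effectivePermissions.computeIfAbsent(person, k -> new HashSet<>())
                .addAll(Arrays.asList(accountPermissions));
    }

    // The policy-time assertion: no PIP lookups, no attribute lists,
    // just a discrete check against the pre-calculated view.
    boolean personHasPermission(String person, String permission) {
        return effectivePermissions.getOrDefault(person, Set.of()).contains(permission);
    }
}
```

Because the expensive traversal happens once at load time, the evaluation path stays cheap no matter how many accounts, roles, or entitlements sit behind the person.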

Authorization as a Service

AzaaS Lies Within

After having worked with identity and access management for a number of years now, including access governance, entitlements, and solving some very interesting challenges, I've been bereft of any decent solution for externalizing authorization into a cloud-based offering. There exist a number of standards and common industry practices for externalizing business rules and entitlements into policies and workflows, but Authorization as a Service has been vaguely defined. That uncertainty may stem from the deep overlaps between IdaaS and an entitlements or rules engine, while ignoring the necessary separation that must exist in a cloud-based offering. Then, sometime last night, while having one of those weird dreams of being in a forest of complex geometric patterns and formulae showing how the universe interconnects on a physical and spiritual level (everyone gets those, right?), I had an epiphany: I've already written most of an Authorization as a Service (AzaaS) solution.

I define Authorization as a Service as: A service able to correlate identity, context and operational facts with a rule-driven policy, produce a decision for an authorized claimant, provide a simple API to assert authorization, and optionally export entitlement and authorization policies back to identity and enforcement systems.
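One way to read that definition as a code contract is the interface below, with a toy implementation so the pieces fit together. The signatures and the single-fact rule are entirely hypothetical illustrations of the definition, not an existing API.

```java
import java.util.Map;

interface AzaasService {
    // Correlate identity, context, and operational facts with a rule-driven
    // policy, producing a decision for the claimant (e.g. "PERMIT" or "DENY").
    String decide(String claimantId, String policyName, Map<String, String> facts);

    // The simple API to assert authorization: a boolean wrapper over decide().
    default boolean isAuthorized(String claimantId, String policyName, Map<String, String> facts) {
        return "PERMIT".equals(decide(claimantId, policyName, facts));
    }

    // Optionally export the policy back to identity and enforcement systems.
    String exportPolicy(String policyName);
}

// A toy implementation: permit whenever a single expected fact is present.
class SingleFactService implements AzaasService {
    public String decide(String claimantId, String policyName, Map<String, String> facts) {
        return facts.containsKey("department") ? "PERMIT" : "DENY";
    }
    public String exportPolicy(String policyName) {
        return policyName + ": require fact 'department'";
    }
}
```

The export method is what distinguishes this definition from a plain decision point: the service can push its policies back out rather than only answering queries.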

Many Enterprise Service Bus (ESB) products such as Layer 7 or IBM DataPower include XACML processing engines that, when combined with Policy Information Point (PIP) configurations, meet this definition. And, a number of token implementations, particularly for OAuth, include framework features, such as access policies, that also meet this definition. Business Process Management and Business Rules Management systems (BPM and BRM respectively) provide industrial-strength process and rules management that are able to capture and articulate complex business workflows, and, in part, meet this definition. So, why are products like Layer 7 and DataPower, entitlement and token formats like XACML and OAuth, or BPM and BRM systems not all billing themselves as AzaaS?

Challenges

If one of my customers asked for an AzaaS solution, I'm not sure what I could sell them because I have not yet found a product or service offering that meets these challenges.

Providing Identity as a Service is hard enough. There is a veritable mountain of data needed to describe a person, their myriad accounts, and the policies and rules governing those accounts. One additional challenge is that an IdaaS must logically separate organization data, or the vendor must operate multiple cloud instances on a per-organization basis. Authorization as a Service shares that challenge. Another challenge is data sensitivity and protection. Authorization rules may need to digest sensitive information that must otherwise remain protected, and storing pan-organization sensitive data ranges from being undesirable right up to being illegal, depending on the location and the type of data. A third challenge is being able to cache the large volumes of identity and account data alongside, related to, but decoupled from, the rules data. A fourth challenge is being able to represent the different ways organizations and their applications describe entitlements. Technically, these challenges could be met with XACML, PIPs, and a nightmarish configuration, or with OAuth, a framework, and a lot of hand-rolled code. My concern with either of those would be that any such AzaaS solution is driven from an entitlement or token format.

Fixing an IdM Challenge

Identity Management (IdM) platforms tend to take a myopic view of entitlements. They define roles, map roles to entitlements, and evaluate the effective entitlements from the role hierarchy, but roles are only meaningful in a given context; a role in an IdM system only means something in that IdM system. That role doesn't mean anything to a managed application, even if that application defines its own role with the same name, because the application role is usually treated as an entitlement by the IdM system. Also, many organizations struggle to architect roles with respect to how roles differ between organization, business, and application, so it's no wonder that role-based access control (RBAC) is so difficult to implement, and why customers gravitate towards attribute-based access control (ABAC). In some cases, organizations that don't use RBAC are already using ABAC without realizing it. Not that that makes configuring the IdM system any easier.

While working through such a challenge, I created a proof of concept (POC) based on my Account Manager 5 and Rocket libraries. Account Manager 5 is a multi-organizational data and directory library. Rocket is a set of schemas and services that extend Account Manager for use as an object-oriented project management library. The POC integrated with a specific vendor's IdM product, and, in this case, the solution required ingesting multiple views of the same IdM person, application, account, entitlement, role, and attribute data, something that Account Manager and Rocket could already do.

Identities in Bulk

I wrote Account Manager to be a drop-in library for enterprise applications, but I also use it for less enterprise-y activities like sifting through my photos. One of the first demonstration applications I wrote, over ten years ago, was a JEE application for navigating through my digital photographs from a Web browser. Over time, as I rewrote the library from Java to .NET to Java again, I got tired of exporting and importing the database, and of transforming data around schema changes. Also, as the authorization features became more complex and the object model more normalized, adding one new object could result in several database operations. To make this more manageable, I added bulk factories that applications use to build up complex object relationships with security configurations, including id references, before actually adding the objects to the database or generating the id. The Bulk Session capability allows for rapidly persisting all types of objects and dependencies into the database.
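The bulk-factory pattern described above can be sketched as follows. BulkSession and its methods are illustrative stand-ins, not the actual Account Manager API; the key idea is that ids are handed out before anything is written, so pending objects can reference each other, and the whole graph is then persisted in one pass.

```java
import java.util.ArrayList;
import java.util.List;

class BulkSession {
    private long nextId = 1;
    private final List<String> pending = new ArrayList<>();
    private final List<String> persisted = new ArrayList<>();

    // Hand out an id immediately so other pending objects can reference it
    // before any database write occurs.
    long create(String description) {
        long id = nextId++;
        pending.add(id + ":" + description);
        return id;
    }

    // Flush every pending object and its dependencies in one operation,
    // returning how many objects were written.
    int write() {
        persisted.addAll(pending);
        int count = pending.size();
        pending.clear();
        return count;
    }

    List<String> persisted() { return persisted; }
}
```

Batching the writes this way avoids the several-operations-per-object cost that a normalized model would otherwise pay on every add.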

While creating the IdM POC, I created a set of integration tests that would delete and reload bulk sets of identity data into Rocket projects. This allowed me to replicate the entitlements evaluation of several enterprise IdM solutions N number of times, for N number of organizations.

AzaaS Solution

Over the years, I've been plinking away at a rules evaluation process, but I never had a reason to codify it as an extension. There are already many rules systems out there, so there is no point in reinventing that wheel, except to gain tighter integration and more direct data-level access. Thinking about an AzaaS solution, or rather, dreaming about one, I realized that if I redefined these rules to function principally for an authorization service, I would have a library that could reasonably serve as an AzaaS solution.

The following are features of my AzaaS solution.

The only parts I need to add to the existing libraries are extensions to the authorization service to accommodate a lightweight rules and policy mechanism, and an API to invoke the policy. The rest of the system is composed of CentOS, JBoss, and PostgreSQL.