CyberArk Project Maturity: How does your implementation compare?

Christopher Underwood, Senior IAM Consultant, SecureITsource, Inc.

Providing professional services gives consultants the opportunity to see firsthand how mature different organizations’ service offerings are. During my time as a CyberArk customer, I was unaware of where our deployment might fall in comparison to other organizations. I knew we were making great progress within the project and hitting our objectives…but the path we took getting there was rocky, a comical road map ending at project completion.

A scene from “A Goofy Movie”, an accurate representation of my first CyberArk project

The problem is that comparing implementations between organizations is a futile exercise: compensating controls and executive direction will not align between Company A and Company B. Instead of keeping up with the Joneses, we can self-reflect and identify where we stand as a service offering. I’ll be using Carnegie Mellon University’s Capability Maturity Model Integration (CMMI) to do so. CMMI generally applies to the organization as a whole… but it still has plenty of applicability to individual products.

Level 1: Initial

All organizations start here. The product was purchased without a realistic integration plan, and an individual or team sets out to learn how the various pieces fit together. Shoulder taps are common as the team begins to implement, maintain, and learn the product’s capabilities. Standards are created ad hoc and are quickly deviated from as new use cases make themselves known.

The success of the implementation is a result of “heroes” and trench work instead of proven, repeatable processes. The team can still be successful, but at the cost of minor chaos, a longer time-to-market, or additional resource costs.

We lack well-defined processes here, but THAT’S OKAY. This is our starting point, our opportunity to experiment and learn.

Level 2: Managed

We graduate to the second level as we begin defining processes for specific use cases or projects. The product owner should have established SLAs and customer agreements, along with documented processes for the support team to utilize. The help desk, support staff, and implementation team must be identified and provided with those standardized processes.

At this level, we create processes that suit the project and our needs. Creating ad hoc processes as we see fit does not consider the big picture, and it spawns temporary fixes whenever the implementation team is presented with an emergency or resource constraints.

Level 3: Defined

Replication of ad hoc process deployments will only get you so far. To reach the “Defined” level, we need to account for future use cases. This refined approach requires a proactive team that understands organizational objectives and plans for future integrations. Many organizations today are expanding to hybrid cloud solutions; do you have processes in place that can handle that expansion?

This level requires more buy-in and an overarching view of the enterprise. If you have trouble attaining this level, some product owner cheerleading may be called for.

Level 4: Quantitatively Managed

At this level, we establish quantitative objectives to identify opportunities to improve the quality and utilization of the product. Any process we can measure gives us an understanding of our process and product performance. Upgrades, ticket responses, component utilization, change management…measuring each defined process will highlight the value that each of these actually provides. In some cases, it will also provide us the opportunity to move on to the final stage, “Optimizing”.

This level is more appropriate to apply to the organization as a whole, so most implementations won’t actually reside here. Instead, think of this level as a reminder to re-evaluate how you communicate your product and process success.

One of the biggest challenges with CyberArk is effective maintenance of managed accounts. Broken or unsynced accounts tend to linger in the environment until a customer ticket surfaces or a cleanup activity begins. Monitor the number and types of broken accounts to identify process-improvement opportunities. This metric is a great way to demonstrate behind-the-scenes progress to leadership.
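To make that metric concrete, here is a minimal sketch that tallies broken accounts by platform, assuming the account inventory has already been pulled from the PVWA “Get Accounts” REST endpoint. The field names (`secretManagement`, `automaticManagementEnabled`, `status`, `platformId`) and the `"failure"` status value are assumptions about the response shape, so verify them against your version’s API documentation:

```python
from collections import Counter

def tally_broken_accounts(accounts):
    """Count accounts whose last CPM action failed, grouped by platform.

    `accounts` is a list of dicts shaped like a PVWA "Get Accounts"
    response; the field names used below are assumptions, not a
    guaranteed schema.
    """
    broken = Counter()
    for acct in accounts:
        mgmt = acct.get("secretManagement", {})
        # Only CPM-managed accounts with a failed status count as "broken"
        if mgmt.get("automaticManagementEnabled") and mgmt.get("status") == "failure":
            broken[acct.get("platformId", "unknown")] += 1
    return broken

# Hypothetical sample data for illustration only
sample = [
    {"platformId": "WinDomain",
     "secretManagement": {"automaticManagementEnabled": True, "status": "failure"}},
    {"platformId": "WinDomain",
     "secretManagement": {"automaticManagementEnabled": True, "status": "success"}},
    {"platformId": "UnixSSH",
     "secretManagement": {"automaticManagementEnabled": False, "status": "failure"}},
]
print(tally_broken_accounts(sample))  # → Counter({'WinDomain': 1})
```

Run a tally like this on a schedule and trend the counts over time; a shrinking number per platform is exactly the kind of behind-the-scenes progress leadership can digest.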

Level 5: Optimizing

Once we reach the Optimizing level, we should have plenty of data to reflect upon. We can optimize our under-performing processes and begin to exhibit true operational excellence, returning maximum value back to our organization.

If you are not using CyberArk’s REST API for any automated processes, you might realize the opportunity in this step. For those organizations already using PACLI or REST calls, we can still evaluate and optimize how they are utilized. Advanced usages may not be as obvious, and building out your network of other CyberArk product owners or partners will lead to truly innovative ways to implement the product and bring in that final bit of added value.
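As a rough illustration of the kind of automation this step enables, the sketch below authenticates to the PVWA, searches for accounts, and queues a CPM password change for each hit. The host, credentials, and search term are placeholders, and the endpoint paths are modeled on the v10+ REST API convention; check all of them against your own version’s documentation before use:

```python
# Sketch, not production: endpoint paths follow the v10+ PVWA convention
# and should be verified against your version's REST API docs.
import json
import urllib.request

BASE = "https://pvwa.example.com/PasswordVault"  # hypothetical PVWA host

def call(url, body=None, token=None, method="GET"):
    """Issue a JSON request and return the parsed response body."""
    data = json.dumps(body).encode() if body is not None else None
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = token
    req = urllib.request.Request(url, data=data, headers=headers, method=method)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def account_ids(payload):
    """Pull account IDs out of a Get Accounts response body."""
    return [acct["id"] for acct in payload.get("value", [])]

if __name__ == "__main__":
    # Logon returns the session token as a bare JSON string (v10+ behavior)
    token = call(f"{BASE}/API/Auth/CyberArk/Logon", method="POST",
                 body={"username": "svc_automation", "password": "CHANGE_ME"})
    listing = call(f"{BASE}/API/Accounts?search=broken", token=token)
    for acct_id in account_ids(listing):
        # Queue an immediate CPM password change for each matching account
        call(f"{BASE}/API/Accounts/{acct_id}/Change", token=token, method="POST")
```

The same skeleton extends naturally to onboarding, safe provisioning, or the broken-account cleanup discussed under Level 4; the optimization work is deciding which of those loops deserves automation first.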

Wrapping up…

These levels aren’t the only way to measure success, and they focus primarily on the processes used behind the scenes. Even if this model isn’t right for you, I hope you found a few valuable concepts to consider that could improve your overall CyberArk offering.