Recently a lot of us at IBM/Rational have been talking about something called MCIF. What is MCIF? It stands for Measured Capability Improvement Framework. You can read Per Kroll’s whitepaper on the overall concept, or just check out the IBM overview of MCIF. I often have to explain it to our customers (or even to people outside of work who have nothing better to do than listen to some software guy attempt to explain what he does), and I use this analogy:
When you drive a car, you use a variety of feedback mechanisms. Let’s take a trip to the grocery store to buy more coffee (so you can stay awake during this analogy). You plan out your route. You prepare. You grab your car keys, and you hop in the car. You rely on a core set of metrics to make sure that you reach your destination safely. You watch the road, and you adjust your steering to guide the car around obstacles. You watch the signs and signals, and stop for red lights and merging traffic. The key point here is that while you have an overall plan on how to get to where you are going (the route that you will take), you don’t just blindly steer, accelerate and brake the same way every time you make the trip. External events have an impact on the amount of time, the route, and the stopping and starting that you do on the way to your destination. This is an agile type of philosophy. Don’t plan your trip in intricate detail and blindly follow that plan. Have a vision of where you want to go, and then rely on continuous feedback to get to your destination in the most efficient manner.
So that works for a single trip. As a business, you don’t want single successful trips, you want a culture of excellence. You want governance over the trips being taken. You accept the fact that some of your projects will be failures, and some will be fantastic successes. The key is to be able to differentiate the winners from the losers as early as possible. You can only do this if you are measuring your projects, your people, and your organization. In the software world, we refer to these measurements as metrics. Let’s go back to our analogy.
So I drive my car a few times, but after about 8000 miles, the engine begins to run rough. My car requires more fuel to cover the same distances it used to. By keeping a service record, and by measuring things like gas mileage, I know that the car needs an oil change. Maybe it needs new tires, or a new battery. Is the temperature gauge climbing more often? Maybe I need to get my radiator flushed. The key here is that while I may take pride in being able to get to the pizza place and back in 10 minutes, over the long run I look at different measures to determine the relative health and capability of my car.
So this brings us back to MCIF. MCIF is a framework of industry-proven measures that can be used at a variety of levels. Some measure the long-term health and improvement of organizational performance. Some measure short-term project performance. Some gauge organizational health and the impact of changes to software development methods, teams, and tools. The key is to MEASURE things. Not all of the metrics called out in MCIF will be useful to you in your industry. That’s OK. No metric is perfect. Know that, and accept it. The measures just give you a way to quantify results, and the ability to ask more probing questions to get to the root of organizational issues.
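To make "quantify results" concrete, here is a minimal sketch of the kind of calculation a measurement framework implies. The sample data, field names, and the defect-density metric itself are illustrative assumptions, not something specified by MCIF:

```python
# Hypothetical example: track defect density across iterations so the
# trend can be compared over time. Data and field names are made up.
iterations = [
    {"name": "Iteration 1", "defects_found": 42, "kloc_delivered": 12.0},
    {"name": "Iteration 2", "defects_found": 30, "kloc_delivered": 10.5},
    {"name": "Iteration 3", "defects_found": 18, "kloc_delivered": 11.2},
]

def defect_density(iteration):
    # Defects reported per thousand lines of code delivered.
    return iteration["defects_found"] / iteration["kloc_delivered"]

densities = [defect_density(it) for it in iterations]

# A falling density suggests improvement; a rising one flags an area
# that deserves a closer look -- it does not, by itself, explain why.
trend = "improving" if densities[-1] < densities[0] else "needs attention"

for it, d in zip(iterations, densities):
    print(f"{it['name']}: {d:.2f} defects/KLOC")
print(f"Trend: {trend}")
```

The point of the sketch is the last comment: the number only tells you where to ask questions, not what the answer is.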
This is all pretty straightforward and simple. The old maxim, “You cannot manage what you cannot measure,” has been around for a long time. Is any of this new? Not really. A large portion of this is just the application of long-held business concepts to the wild, wild west of software development. During the 1990s, when there was a web startup around every corner, nobody asked about returns on investment, returns on capital, or some of the more basic productivity measures. Now that the industry has grown up, it gets to be treated just like any other facet of the business.
So what does all of this have to do with Jazz? Jazz technology is a means to an end. Teams perform better when they know they are being measured. That measurement needs to be done intelligently. The performance of teams and the organization as a whole needs to be available to the teams, their management, and the business executives. Teams that are measured, but never know how they are being measured, will never be able to improve. Having an environment where teams can see their performance, and where they can see the impact of their individual actions on that performance, begins to set up a virtuous cycle of excellence. The transparency and visibility that the Jazz technology gives to all members of an organization helps drive accountability. Jazz’s capability to bring together data from different disciplines, from different stakeholders, with different concerns, allows an organization to begin to make correlations between measures, and determine leading and lagging indicators of project success.
This is all great stuff. But there must be a dark side, a sinister evil secret hiding in this scenario. There is. This visibility and transparency can be misused. Using the enhanced transparency to punish teams will lead to negative behavior in the organization. Punishing teams for high defect rates? Guess what, they will stop reporting their defects, stop tracking potential issues, and your quality will suffer. You will lose customers, market share, or capability. Executives need to focus on demonstrating leadership. Metrics that fall short need to be addressed, but the teams need to be empowered to solve the issues. High defect rates? The first reaction shouldn’t be, “Cut them all”, instead it should be to ask the question, “Why are there high defect rates?”. Do they correlate to a decrease in quality? Do they impact our time to market? Are they a statistical anomaly? The metrics do not answer questions, they indicate areas where management focus is needed.
The most visible example of the benefit of the Jazz technology, paired with executive management that has a healthy attitude about measuring its teams, was given to me by one of the early Jazz adopters. The manager told me, half jokingly, that the biggest example of the success of Jazz was a horribly failed project that had just been killed. I was taken aback, but I asked him why this was. “In the past it would have taken us 6 to 9 months to determine that the project was doomed to be a failure. Using Agile development methods and Jazz, we were able to kill it in 2 months. We saved 4 months of development costs, and we were able to allocate those people to a project that provided a positive ROI for the company.”