Software Development and the Olympics

I am a sports nut.  So when the Olympics come on, they seem to be all that I watch for the entire month of February.  In some spare cycles, I begin to think about athletes, how they prepare and train for events like the Olympics, and some interesting parallels to software development.

Athletes are pretty normal people, but those at the highest level have one distinct difference from everyday people like you and me.  They love to compete and are fueled by a desire not only to compete, but to win.  The best athletes from any number of different sports share these traits.  They are competitive, committed, and hate to lose.  They WANT to be MEASURED against their competitors, and they MONITOR THOSE MEASUREMENTS.  Sprinters and racers strive for better split times and better finishing times.  Team sports athletes strive for better numbers in the measures of their sport (like goals, RBIs, touchdowns, interceptions, home runs, etc.).  That is why many athletes have a tough time retiring.  The competition, the single-minded purpose, is all gone in retirement, and they cannot replace it with other things in their lives.  This desire to be measured against the best in the world, with the confidence that you can do better, is what makes a world-class athlete special.

Software teams should try to be like these athletes.  Many software teams seem to fear being measured, and to me this is a sign of weakness.  It shows a lack of confidence, and a lack of accountability.  The key lies in WHAT you measure, and how you use this information.  When athletes find a weakness in their performance, they view it not as a personal failing, but as an opportunity to improve.  Coaches looking at poor scores or metrics don’t typically call their athletes worthless, or get rid of them.  They see a weakness in some measurement as an opportunity to coach, improve, and grow.

Software development management needs to adopt the same mentality that team sport coaches have.  Use the measures at your disposal to help you fine-tune the areas that you need to improve upon, and to tailor your organization to take full advantage of the strengths of your team.  Great coaches adapt their style to the specific personalities and talents of the members of their team.  Each team has players that perform different roles, and a good coach/manager should be doing the following:

  • Encouraging their players to improve in the areas where they are weak
  • Putting each team member in situations where they can excel
  • Allowing team members opportunities to grow, and encouraging that growth
  • Showing their players the ways that they are being measured, and allowing them to monitor their own progress and performance
  • Treating players as individuals

Doing these things allows team members to prove their worth, every day.  When people know how they are being measured, and when they are responsible not to some faceless concept (the “Company”), but to a group of teammates who have a shared purpose and mission, their performance improves.  Read some of the material out there on high-performing teams.  Is any of this original thought?  Much of it dovetails with the Principles of the Agile Manifesto.

So since this is a Jazz-based blog, how does all of this relate to Jazz?  Go out to jazz.net and look at how we use the Jazz technology to develop the products that utilize that framework.  You can see a large number of dashboards and iteration plans that detail what our teams are doing, as well as how well they are doing it.  This information is available to the teams, the management team, and our customers.  EVERYONE on the Jazz team is accountable, and it’s visible.  We can either get better, or get embarrassed.  The Jazz concept of a common repository, with easily accessed data that can be displayed in a number of different ways, works best with a high degree of information sharing.  It helps fuel the collaborative aspect of the tools that utilize Jazz, and this visibility enforces accountability.  I believe that this has a positive effect on the quality of our tools, and on the overall quality and capability of our software development teams.  I have seen this improve our responsiveness to our customers (check my earlier blog on Supporting Jazz).

So enjoy the Olympics, and watch the stories behind the athletes.  Many of them have been measured for years, and have used what could have been perceived as negative measurements as motivation for improving.  We should all strive to foster that view of measurements and metrics in our software development organizations, and quit being afraid of being measured.


Supporting Jazz – Everyone Needs a Good Beat

I have talked to a few customers since the turn of the new year, and they have been very honest and open with me about what they see and perceive in the current marketplace.  We discuss the future of software development, as well as industry trends.  One interesting discussion revolved around the issue of support for vendor tools, what our customers expect, and what they actually get.  It was particularly interesting because I realized that I have had this same conversation many times over the past 18 months, in different forms.  Perhaps one of the strongest arguments for an investment in Jazz is not being communicated well by IBM to our customers.

The main issue revolves around the differences between open source and commercial tools for software development.  I will not get into a discussion on differences in capability; there is already a large amount of information out on the web and on Jazz.net (and in my own blog entries on the differences between RTC and Subversion).  What I want to look at in this article is how these tools are supported, and how easy it is to get help and get issues resolved.  First I want to set some context, and go over a little bit of recent history.

In the past, our customers would buy licenses for software development tools, and would also pay a fee for support of the product.  This support fee helped to pay for the ongoing upgrades to the tools, as well as any patches or bug fixes (GASP!).  Yes, our software does have bugs.  It also helped support an on-call support team that would help out customers with issues that might not have involved bugs, but may have been tool usage issues.  Customers were not particularly impressed with customer support, from us or from any other vendor for that matter.  I almost always heard complaints about how it was difficult to see the status of their open tickets, a pain to get through the phone prompts, and tough to get quick responses from support engineers.  Support bashing was (and remains) a common thread among many software tools customers.

Many of our customers have been contemplating a move, or actually moving, to open source tools.  One of the reasons for this is the reduced cost of acquisition, or, perhaps mistakenly, the cost of ownership (see my earlier blog post on Jazz Hands – Administration and the Cost of Ownership).  Customers have pointed out that some open source tools have large communities of users that actively fix bugs, and that in some cases these communities are more responsive and quicker to react than existing commercial solutions.  I will not argue the relative merits of paid support and community support, since I think one could find good examples on either side of the argument.  It is just important to understand the context and the landscape of the software tools environment with respect to support.

Jazz solutions acknowledge this divide, and attempt to bridge it.  The Jazz tools are developed with what I would call a “Jazz mindset”.  This is a robust adoption of Agile principles, with a focus on making the development teams both transparent and accountable.  You can see our dashboards, iteration plans, bugs, enhancement requests, and just about EVERYTHING out on Jazz.net.  This level of transparency is unmatched in the commercial world, and it attempts to model (and improve upon) the best aspects of many open source projects.  Our development teams are responsible for monitoring the Jazz.net forums, and customer bug reports can be seen immediately by everyone: our development teams, our executives, and our customers.  It allows us to provide an environment where there is a focus on quality (nobody wants to look bad and have a large pile of bugs), and where customer issues are addressed in a timely manner, by the team members best suited to solve them.

Coupled with this is a series of forums on Jazz.net.  The answers to customer issues aren’t secret; you can search for them and follow usage issues and best practices conversations out in the Jazz forums.  These forums are a place where customers, thought leaders, and IBM developers can all come together to solve issues and discuss future product directions and enhancements.  This is a key component of having a transparent development effort, where all voices can be heard, and we are able to respond to the issues that are important to our customers right now (not 6 months ago).  There are journalists and analysts that make a living getting this kind of information from our competitors, and then guessing at future directions and features for their products.  (Note: I was dying to put a link to one of these articles in here, but I hate to single any one competitor out.)

So you get the responsiveness of an open source community, with a corporate commitment to continue to grow and evolve the Jazz technology.  This is critical, since many open source projects will have some initial momentum, but then seem to fall out of favor when the “next big thing” comes around.  IBM has made a significant commitment to Jazz, and continues to improve the Jazz foundation, as well as provide new capabilities that leverage this architecture.

Don’t just take my word for it; go out to Jazz.net and check this out for yourself.  Choose a couple of random bug reports from the Rational Team Concert project work item area.  Look at the discussion associated with the bug.  Check out the time stamp for when the defect report was created, and when the first response was created.  Look at the time stamps on subsequent discussion entries.  Often the turnaround time will be 24 hours or less, and the discussion will immediately move to the technical issue at hand.  Look at our development dashboards, iteration plans, and check out the forums.  It may not always be flattering for IBM, but you will be able to see exactly what is going on in Jazz development.
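If you want to do this check a bit more systematically, the arithmetic is simple: subtract the work item’s creation time stamp from the time stamp of the first discussion entry.  Here is a minimal sketch in Python; the timestamps and the `%Y-%m-%d %H:%M` format are hypothetical examples, not actual Jazz.net data, so adjust them to whatever you copy out of a real work item.

```python
from datetime import datetime

def turnaround_hours(created: str, first_response: str) -> float:
    """Hours between a work item's creation and its first response.

    Both arguments are timestamp strings in 'YYYY-MM-DD HH:MM' form
    (a hypothetical format -- adapt to the timestamps you actually see).
    """
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(first_response, fmt) - datetime.strptime(created, fmt)
    return delta.total_seconds() / 3600

# Hypothetical example: defect filed one morning, first response the next
print(turnaround_hours("2010-02-15 09:30", "2010-02-16 08:45"))  # 23.25
```

Anything under 24 hours, as in this made-up example, is the kind of turnaround the paragraph above describes.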