When metrics go wrong

  • “Tell me how you will measure me, and then I will tell you how I will behave. If you measure me in an illogical way, don’t complain about illogical behavior.” – Eli Goldratt
  • Dee Hock - the tyranny of measurement: “If we were to set out to deliberately design an efficient system for the methodical destruction of community, we could not do better than our present efforts to monetize all value, mechanize all societal organizations and reduce life to the tyranny of mathematical measurement, markets and the ever increasing centralization of power and wealth that result. Money, mechanism, measurement and markets have their place. They are useful tools indeed. We should use them carefully for beneficent ends. But useful tools are all they are. They do not deserve the deification the apostles of unrestrained acquisition insist that we give them. Only fools worship their tools.”
  • Some things are really hard to measure - e.g. the impact of saying good morning to people - and yet we can all still agree that it’s worth doing (this came from a conversation with Tito Sarrionandia, but I can’t remember whether he was quoting anyone).

Accelerate Metrics / DORA metrics

  • The four key metrics described in the “Accelerate” book
    • Martin Fowler’s foreword describes Accelerate as the book that explains why the State of DevOps reports (and the DORA metrics they use) are so compelling and trustworthy
    • DORA is also relevant to The DevOps Handbook, but it isn’t directly referenced there apart from a nod in the acknowledgements
      • The DevOps Handbook is more about the Three Ways
      • See my notes on DevOps-Handbook, here
  • aka the DORA metrics
    • DORA = DevOps Research and Assessment - a startup created by Gene Kim and Jez Humble with Dr. Nicole Forsgren at the helm (those three also wrote the book).
    • (downloaded for Clare in clare-tech/resources)
  • Originally came from DORA State of DevOps report
  • The metrics are:
    • Deployment frequency
    • Lead time
      • Lead Time for Changes measures the velocity of software delivery - the time from code committed to code successfully running in production
    • Time to restore, aka mean time to recovery / mean time to restore (MTTR)
    • Change failure rate (“a measure of how often deployment failures occur in production that require immediate remedy (particularly, rollbacks).”)
    • Reliability [added in 2021]
  • I saw a great presentation about this
    • by Tito Sarrionandia (slides available to Clare only here)
    • where he talked about four key DORA metrics
    • He made the following points:
      • Those four measurable areas are not tradeoffs
      • They represent measures of speed and of stability - they correlate highly with each other
      • Organisations moving faster are typically breaking less.
      • There is no evidence that organisations are able to optimise one of those things by trading off the others. High performers score highly across the board in all four metrics, low performers score badly across the board.
  • “Focusing on only these metrics … empower[s] organizations by having objective measures of determining if the changes they’re making have an actual impact”
    • The Accelerate book was very much focused on DevOps (infrastructure and deployment) and these metrics help you to focus on how good your infrastructure and deployment pipelines are.
    • My notes on Accelerate are here
    • More here
  • Blog post about the report
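
The four metrics above can be sketched as simple calculations over deployment records. This is a minimal illustration, not anything prescribed by the Accelerate book: the record structure, field names, and sample data are all hypothetical, and real pipelines would pull this from deployment and incident tooling.

```python
from datetime import datetime
from statistics import mean

# Hypothetical deployment records (field names are made up for illustration):
# when the change was committed, when it was deployed, whether the deployment
# failed in production, and (if so) when service was restored.
deployments = [
    {"committed": datetime(2024, 1, 1, 8), "deployed": datetime(2024, 1, 1, 10),
     "failed": False, "restored": None},
    {"committed": datetime(2024, 1, 1, 15), "deployed": datetime(2024, 1, 2, 10),
     "failed": True, "restored": datetime(2024, 1, 2, 11)},
    {"committed": datetime(2024, 1, 2, 20), "deployed": datetime(2024, 1, 3, 10),
     "failed": False, "restored": None},
]

days_observed = 3

# Deployment frequency: deployments per day over the observation window
deployment_frequency = len(deployments) / days_observed

# Lead time for changes: commit-to-deploy duration in hours, averaged
lead_time_hours = mean(
    (d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deployments
)

# Change failure rate: share of deployments needing immediate remedy
failures = [d for d in deployments if d["failed"]]
change_failure_rate = len(failures) / len(deployments)

# Time to restore: failure-to-restore duration in hours, averaged over failures
time_to_restore_hours = mean(
    (d["restored"] - d["deployed"]).total_seconds() / 3600 for d in failures
)

print(deployment_frequency)     # 1.0 deployments/day
print(change_failure_rate)      # 1/3 of deployments failed
```

Note how the first two are speed measures and the last two are stability measures - the point from Tito’s talk is that, empirically, high performers do well on all four at once rather than trading one group off against the other.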