
Thursday, December 31, 2015

Counterfactuals

Earlier this year, my house should have burned to the ground.  A CR2032 battery exploded and caught fire in a confined place dense with flammable objects.  But my house didn't burn down: at the moment the battery exploded, I was sitting a few feet away from it. I heard a loud bang, investigated, and stamped out the fire within a few seconds.

I wasn't planning to be there at the time. A series of minor reschedules and reprioritizations had me sitting in my home office at the moment the battery exploded. I happened to be in the right place at the right time to prevent something far worse from happening. It was business as usual for the rest of the day.

There is no law or reason governing why I was there at the moment the battery exploded, or why the battery exploded at precisely that moment. What if my schedule hadn't been rearranged and I hadn't been home when the battery exploded? What if it had happened hours earlier or later, when everybody in the house was asleep? I can't help but think that there are more quantum realities where something bad happened than there are where something bad did not.

But that isn't necessarily true. The battery could have exploded at a time when nobody was around to put out the fire, but the ensuing fire may simply have died out, causing no damage. And maybe the chemical process that triggered the explosion was itself an anomaly of circumstances set in motion by my presence in the room at that particular moment. I have to be content with the only certainty I have: a battery exploded and nothing bad happened.

Counterfactuals are messy because they're not provable. Strategy and investing, execution and operations are loaded with counterfactuals. They're each similarly messy because we don't really know what would have happened had we chosen another course of action. Conventional wisdom says that Kodak should have bet on the digital technology it invented instead of dithering with its legacy analogue business. But what if Kodak had invested heavily in digital, and its distribution partners had made good on their threats to shift the lucrative film processing business to Fuji? Would Kodak's bet on the future have imploded for being starved of cash flow? Hindsight isn't 20/20; it's narrative fallacy.

Just as it's difficult to have retrospective strategic certainty, it's next to impossible to forecast the outcomes of alternative paths, decisions, or circumstances. It sounds simple enough: invest in the highest-return opportunities, and consistently assess the performance of and the risks to delivery. But we're forced to interpret and project from incomplete and imperfect information. Our plans and forecasts are just as likely to be narrative fallacy as fact-based assessment. Microsoft paid dearly for Skype, but we can't know what might have been had Skype been acquired by Google or Facebook instead - and those possibilities may very well have justified the premium Microsoft paid.

The landscape of strategic software investing and governance is riddled with speculation. As much as we would like to think we've made the best decision available to us, or dodged a bullet, we rarely know with certainty. No matter how formulaic we're told portfolio management or delivery operations can be, they will never be as optimal or predictable as promised, because business is imprecise and untidy.