Is Microsoft the New IBM? A Tale of Two Pivots

Microsoft and the advent of the PC caused the hardware giant IBM to pivot away from their original core business, hardware, to professional services. Now Microsoft itself is forced to pivot because its core business is being disrupted: it used to be consumer-oriented software, but that game is now lost to Apple and Google. But what will Microsoft pivot to? The answer is not cloud computing, although they seem to think so themselves!

The beginning of the end of the machine pusher
In the 80s people had perms and danced around in coloured leather ties to “Hungry Like the Wolf” by Duran Duran, and IBM was an unrivalled giant in tech, famously portrayed in Apple’s 1984 commercial. Life was good. A near monopoly on computers, high barriers to entry and large margins made IBM a safe bet in any investment textbook. The name stood, and still stands, for International Business Machines. So, it is not too much to say that hardware was their core business. They famously also made the transition into the consumer world with the PC, and all was good. They didn’t think much of the stuff that made the machines functional and sensible to humans: the software. So they logically let a subcontractor supply their machines with the software. At that time the thinking was that whatever you could put on computers to help sell more computers was a good thing. After all, they were a machine company and they wanted to sell machines.

The subcontractor, of course, was Microsoft. They supplied the operating system and user interface for the machine. Microsoft gradually improved the interface, and since this was what humans perceived as their computer, it became central to the PC experience. Inevitably other companies started to produce cheaper versions of the desktop computer. Like IBM, they needed something to turn the computer from a big hunk of silicon and plastic into something that made sense to humans.

Since IBM had an early lead, the logical thing was to move consumers from the IBM world to a cheaper world. What better way to do that than to present potential customers with an interface identical to what they already knew? Thus Microsoft became the default platform for the PC, or desktop computer, for decades to come. Whether cheap or expensive, all computers had to have Microsoft operating systems and software. IBM was stuck in the innovator’s dilemma and couldn’t live off the thin margins of the desktop computer, so they had to abandon that division altogether.

As the computing power of the desktop computer grew, the market grew increasingly sour and started to threaten IBM’s heartland, the mainframe computer. In the end they had to completely re-orient their business towards professional services, and therefore they are not the same company they started out as. Changes in the underlying circumstances for their primary products had undermined those products’ viability in the market.

Is Microsoft the new IBM?
Now something similar is happening to Microsoft and it will be interesting to see where that leads them.

For years the PC and its familiar interface, Microsoft Windows, was the primary gateway to computing resources for consumers, but a decade ago cell phones started to be able to do some of the same things: you could play games on your Nokia 3210, you could calculate, write electronic messages, etc. Very simple things, still far from the desktop computer.

Then came the smartphone. I remember that the attraction of the first smartphones was to sync your calendar with your Outlook calendar, write notes and to-do lists, and eventually browse the internet, although very slowly. As processing power increased, something similar to the obsolescence of the mainframe happened to the PC. The smartphone could do more and more of the things the desktop computer used to do.

Then came the tablet in full force, and with increased processor power and screen size it could handle even more of the things you would normally do on the desktop computer. The only problem for Microsoft was that hardware producers had not selected Microsoft’s software for these new devices that consumers used. The interfaces were called Android and iOS. This is now what consumers are used to as portals to their computing power.

Microsoft has struggled to bring their user interface onto these new devices, much as IBM tried to keep pace with the new hardware producers, but so far with little success. The user interface between the computer and the consumer, which used to be Microsoft’s heartland business, now seems just as lost as the production of machines seemed to IBM when they made their big switch to professional services.

So, what will Microsoft do? The mirrored fates of IBM and MS
It is therefore interesting to follow Microsoft and see if they have the same courage IBM had: to completely rethink their business and reason for being. Will they be able to shift away from software and operating systems into something else, like IBM did?
So far it seems that their professional services could be cloud computing in general. The question is just whether their offering will be strong enough to secure a leading position.

IBM lost their heartland products but kept their customer segment. To this very day the same customers buy from IBM. They just buy something else: development services, software, the occasional strategy report, etc.

For Microsoft it will probably be even more uphill. They may not have lost out on the product, since many units are still shifted every second and Windows still runs on the majority of desktop computers. But they have lost something even more important, their heartland customer segment: the consumer. For decades now they have tried to build a lead in a new customer segment, SME businesses. Here they sell similar products (OS, software, etc.) to those they sold to the consumer.

A tale of two pivots
In short, it seems that IBM stuck with the customer segment and changed the products because they were disrupted by Microsoft, while Microsoft stuck with their products and found a new customer segment. This move is not complete, and it is too early to tell whether they will carry it through, but that could be what saves Microsoft. Not that Microsoft appears to be a company in need of saving, but let’s face it: no matter what they try, consumers will never flock to Microsoft for an operating system on their mobile devices. That battle is lost, just as the machine battle was lost for IBM years or decades before they realised it.

Borrowing a term from the lean startup movement, you could say that IBM made a product pivot and Microsoft is attempting a customer pivot. Only time will tell if they succeed.

Experimentation in product management

Traditionally, new products were developed from the founder’s idea: it was written down, and the engineers built it. Over the last few years this pattern has changed. Across the internet there has been a shift in mindset towards bringing the customer into what we are building. There is a growing awareness that we are wrong about what the customer wants most of the time. Therefore it is necessary to experiment to find out what customers want.

We talked to Teresa Torres about the role of experimentation in product management. The greater part of her career has been spent in pre-product/market fit internet startups, so if anyone should know how to experiment to find a successful product, it’s Teresa. Today she helps companies make better product decisions as a consultant and coach.

According to Torres it is better to start thinking about product development in terms of experiments rather than requirements. In Marty Cagan’s dual-track scrum article, he recommends using a discovery track and a delivery track. We should experiment in the discovery track first to identify what the right thing to build is; there should be a lot of experimentation there to inform what we build. Today there is a tendency to build any and every idea.

But real experiments require quite a bit of rigor and experience in designing the experiment.

“This is my primary focus as a coach. Many teams start to experiment but don’t have the experience to do it well. Most of us don’t have strong science or statistics backgrounds. What happens in practice is that instead of generating informed hypotheses and designing experiments to test those hypotheses, we are testing anything and everything. The problem with this approach is that we risk false positives. We are running tens and sometimes hundreds of experiments, many with way too many variations. This guarantees that we will see many false positives – changes that look good but really have no impact. As a result, we are making decisions on bad data. If we want to build better products, we need to understand how to run good experiments. The business side needs to be more scientific and the data science side needs to be more business oriented.”

According to Torres the ready availability of experimentation tools like Optimizely and Visual Website Optimizer opens up the possibility of experimenting, but you need resources and expertise, otherwise decisions will be made on faulty data. Part of the problem is the widespread “fear of math”. Most people shy away from concepts like statistical significance and power, but it is necessary for product managers to begin understanding them. Today there are many online resources that will teach you the basics of statistics. Another problem is that we need to be better at hypothesis design. If you have not properly designed your hypothesis before you start, you are not setting yourself up to get good data. We also need experimenters who can design experiments that actually test the hypotheses they are meant to test.
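
To make the significance point concrete, here is a minimal sketch of the kind of check Torres is talking about: a two-sided, two-proportion z-test on an A/B experiment. The ab_test helper and the conversion numbers are invented for illustration, not taken from any particular tool.

```python
# A minimal significance check for an A/B test: a two-sided
# two-proportion z-test. All numbers are made up for illustration.
from math import sqrt
from scipy.stats import norm

def ab_test(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Is variant B's conversion rate significantly different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * norm.sf(abs(z))                         # two-sided p-value
    return p_value, p_value < alpha

p, significant = ab_test(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"p-value: {p:.4f}, significant at 5%: {significant}")
```

Running many variations multiplies the chance that a check like this passes by luck, which is exactly the false-positive trap described in the quote above.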

I asked Torres if there are any simple rules of thumb or best practices for product managers who want to get started.

“Don’t trust results that are not statistically significant. Surprisingly, many teams are not even testing for significance. Define your hypotheses and decide upfront what decisions you will make if the test passes, fails, or is flat. Otherwise you will just find yourself rationalizing after the fact why your change is still good regardless of what the data tells you. Run the same experiment multiple times; it helps reduce false positives. There is no absolute truth. The world is always changing, and something that didn’t work in the past may work in the future. Always keep a mindset that everything evolves.”

For more tips, see her article on The 14 Most Common Hypothesis Testing Mistakes (And How to Avoid Them).

It is up to you whether you take Teresa Torres’ suggestion to start experimenting. In the meantime, visit her excellent blog Product Talk and sign up for her newsletter. It is always packed with interesting content about product management.

Big Data From a Product Perspective – Different Views

The hype surrounding big data is reaching a climax at the moment. While it is evident that we have more and more data and that valuable insights are hidden in it, the picture is different if we look at big data as the products actually on offer.

If we look at big data from a product perspective, I think the situation is a bit more mixed. As a product category, big data is not yet mature enough to warrant the huge valuations we are seeing, though that could change if a couple of things happen. But first let us look at how big data is viewed by different groups.

Different views on Big data
From inside the big data community the focus is on technologies like Hadoop, Hive and NoSQL databases, on the companies supporting these technologies, and on a plethora of other more or less obscure (to the uninitiated, of course) products that are part of the big data ecosystem. The community sees little connection to business intelligence, although big data is solving very much the same problem.

If we look at how the media sees it, we are looking at something similar to the invention of the wheel: something that will have a profound effect on human civilisation and the way we live for millennia to come.

Investors see big data investments like the quest for the holy grail (which might explain some of the silliness): Hortonworks has raised $248 million, Cloudera $1.2 billion, DataStax $189 million, Elasticsearch $104 million, Couchbase $106 million, etc. None of these companies has a proprietary product; they support open source products. The business model is one of building closed source tools that let customers run the open source better.

The CEOs who invest in big data really just want a big pile of money. They are not interested in the curious patterns you can find, like the correlation between search terms containing the word coconut and the migratory patterns of the African swallow. They see in big data a new way to make more money and just want to get to that immediately.

The CIO is usually completely sidelined in decisions involving big data. Maybe this is because he is increasingly becoming the custodian of legacy technologies; the need for big data often comes from isolated infrastructure projects or from business development.

Developers view big data as models of the real world with intricate detail, like The Matrix. Soon, with big data technology, we will be able to model the entire universe and predict what will happen.

What end users see is of an alarming complexity. You need semi-programming skills to run even simple queries, and you need to be adept at manoeuvring applications with hundreds of functions, much like a sysadmin. Usability often suffers in open source development because the community wants to take the product in different directions. Furthermore, developers are users too, and since they already know the product there is no real pressure to make it easy to use for the uninitiated.

What is it really? In the end big data may very well turn out to be just like the Segway. I am not saying that it will only be used by mall cops and tourists, but rather that it might end up serving very limited segments and industries with very specific needs.

Enter the genius – the five specialisations of the big data employee
The problem today is that in order to get any value out of big data you need to be a virtual genius. You need to master at least five areas that are usually separate specialisations:

  1. First of all you need to be a developer. You might not need to code an actual application if you are just using big data for analytical purposes, but you need to be able to write code to extract the information you need one way or another (see the sketch after this list).
  2. Second, you need to be an infrastructure architect and sysadmin because you need to set up a great number of servers and networks. You need to know about the multitude of different infrastructure elements.
  3. Third, you need to be a database administrator. You need to set up databases and maintain them, including ETL processes, sharding and the like (you do not have to worry about database schemas, though).
  4. Fourth, you need to be a data scientist since you need to know a fair amount about machine learning algorithms in order to extract patterns from the data.
  5. Fifth, you need to be a business analyst. If big data is to make sense from a business perspective it is necessary to understand the business model, the revenue streams, the cost structure etc. You also need to know a fair amount about the customers like what parameters to segment them by and what their pains are.
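
As a toy illustration of the first point, here is the classic MapReduce pattern that Hadoop jobs are built around, reduced to plain Python. The log lines and field names are invented for the example; a real job would run the same two phases distributed across a cluster.

```python
# A toy stand-in for a Hadoop-style extraction job: map each record to
# (key, 1) pairs, then reduce by summing the counts per key.
from collections import defaultdict

log_lines = [
    "user=42 action=search term=coconut",
    "user=7 action=search term=swallow",
    "user=42 action=click item=1234",
]

def map_phase(line):
    # Emit one (action, 1) pair per record.
    for field in line.split():
        if field.startswith("action="):
            yield field.split("=", 1)[1], 1

def reduce_phase(pairs):
    # Sum the counts per key, as a reducer would per partition.
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

pairs = (pair for line in log_lines for pair in map_phase(line))
print(reduce_phase(pairs))  # {'search': 2, 'click': 1}
```

Even this "simple" count is a programming exercise, which is the point: extracting anything from a big data stack assumes developer skills.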

Naturally you don’t have to have all of that in one person. In principle it can be spread across several employees, but you will quickly have to hire a complete team just to get started, and it is still difficult to find specialists who know even one or two of these things. On top of this you need very tight integration, because these roles are far more tightly coupled in a big data stack than in other technologies.

If you succeed with this, the problems are unfortunately not over. Most organisations already have established procedures where work is split up along the lines mentioned above. You have application developers, operations, DBAs, analysts and business developers. Each department has its own governance and procedures describing handoffs to other departments. Now you are asking the organisation to circumvent all of these established procedures.

So big data products still have a long way to go before they are ready for the mass market and the really big bucks.

Is Google Glass the Next Segway and What Can We Learn From It?

About a decade ago one of the most hyped products in tech was the Segway. It promised to revolutionize public transportation. The hype was almost hysterical. It was impossible to get your hands on one unless you were especially favored, and it was overpriced. The hype for Google Glass is similar today: only a select few can get it and it is overpriced. Google Glass could meet a similar fate to the Segway, confined to a niche domain like mall cops or tourist excursions, although it has yet to find this niche. This niche could be internet porn, surgery aid or factory repair.

However, the reason is not the overhyping, the price or the lack of accessibility. It is a much deeper problem, something as simple as product market fit. Let us look at a couple of reasons for this.

Product market fit
The Segway was by all accounts an impressive innovation and a well thought out idea. It was just not thought out for any particular person or problem. It was technology for technology’s sake. It seems that at no point did the product manager/designer/inventor stop to test whether this product actually addressed an existing pain in any significant customer group.

The reports about Google Glass are similar. We frequently hear that this is cool technology. People ask about it and are interested when they see someone wearing it. They want to try it. It works very well. But people have a hard time imagining what they would use it for.

“Glasshole” – how one feature can overturn the whole product
Lately Google has been catching a lot of bad press around Glass due to its camera function. A new term, “glasshole”, has appeared even before the product has been released. There is actually only one function that is the reason for this: the camera mounted on the glasses.

It appears that people other than the Google Glass users do not appreciate the possibility of being taped by the wearer. This one function is now threatening to take down Glass with it. This problem could have been detected much earlier and addressed if someone had bothered to listen to someone other than the overexcited test users.

What to do?
A product has to exist in an environment beyond its immediate users. Analysis of this environment and the humans that live in it could have revealed the emotional reactions. Google’s solution has been to ban certain types of camera applications. That is a good idea, but most people don’t know about it, and therefore it won’t make much of a difference in perception.

Google could have considered two other options that are even simpler and would have an immediate effect in the environment where the product is used. The basic problem is that people don’t want to feel that they can be monitored covertly, that is, taped without their knowing it. The first option is to introduce a visible light that is turned on when the camera is recording. This would let people know when they were being filmed, similar to when people film with cell phones. That is not a problem because it is out in the open: you can go to the person and complain.

The second option is to get rid of the camera altogether from the core product. It could be a visible add-on, like a GoPro camera, that could be taken off at the request of other people or simply out of courtesy. Again, it is important that the immediate environment can see that it is there. The problem today is that this one controversial feature has been embedded within the product.

Without the camera Google Glass would still be an impressive product that could show you the way, notify you of appointments and emails while driving, etc. Sometimes taking out a feature may actually increase the market potential.

Anyhow, I love the Google Glass idea because it shows that somebody out there has the courage to dream and build products that are beyond what we know. I hope that Google Glass will find its niche faster than the Segway did, though.

What product development strategy is right for your startup?

There are two primary product development strategies and multiple ways of executing them, but only one is right for your company. Finding that strategy and understanding its dynamics may be the difference between success and failure. It may also change your perception of whether you are doing well.

Nassim Nicholas Taleb has an interesting concept of antifragility. Antifragile strategies are those that benefit from shock or randomness; fragile strategies are the opposite.

A fragile strategy is the classical stock investment, or lending money: most likely you will get an OK return, but there is the odd chance that you could lose all your money.

An antifragile strategy counts on losing money, with the odd chance that you could gain a lot. Options trading would be an example. Here the unexpected is your friend.

Product development
I was thinking about how that could translate into product development strategies. Developing a product is similar in that you invest in it and expect a return.

The fragile strategy is to continually develop the product so it performs a bit better in the marketplace. This is the typical way, and we know it from techniques such as A/B testing and constant tweaking to increase performance on some central product parameter. But it is also risky, because one day you may wake up and find that your entire product line is obsolete even though you have continually improved it. This is what happens when companies are disrupted.

So, could an antifragile strategy be the solution and how would it look? I think it would be a strategy where you try a lot of things that are most likely going to fail, with the odd chance that you could hit it big time.

One scenario is launching a lot of products just to try and see. This is what Google and Richard Branson do; there is no end to what they launch. A cheaper way to do this is to probe for demand through market research, prototypes or some other way to get input from the customer about the perceived value. The lean startup movement is the champion of building cheap “fake” products that are just good enough to verify a potential demand.

Another scenario is to launch a product in a market where you feel there could be potential, but where no product has really had any success. Continuing to develop this product is almost certain to lose money, but something might happen: you may hit the right combination, or market conditions could change. You just need to make sure not to make the same mistake twice.

Choose the right strategy for you
Choosing one or the other strategy is not self-evident. Whereas the financial system seems to have most people following a fragile strategy, the startup “industry” seems to follow an antifragile one, which is great for the very few people who have the time, resources and luck to stumble upon a product that hits it big time. It seems to me that your choice of strategy should depend on what sort of life you want to live. Is it OK, although very unlikely, to one day not have a product? Can you live with continuing, protracted loss? Do you just want a predictable return? Or do you want the remote possibility of becoming a billionaire?

Why You Shouldn’t Try to be Like Steve Jobs or Wonder What Google Would Do

There is no end to the inspiration that the most hyped tech giants provide. Books like “What Would Google Do”, films like “The Social Network” and hundreds of daily blog posts and articles chronicle the amazing exploits of some of the world’s most successful companies. Nevertheless, you are better off ignoring everything they did if you want to succeed yourself.

Read about these companies as if they were great works of fiction. You may get moral encouragement or emotional energy, but what they did is not a recipe to follow. You would not try to drive like James Bond or attempt to move things with the power of your mind like Luke Skywalker (I hope). Similarly, you should look at the life of Steve Jobs as a great story and not try it at home.

In the world of business, big companies like Apple, Google, Facebook, Amazon and Twitter are anomalies. They are exceptions to the rule. As Malcolm Gladwell demonstrated in his book “Outliers”, extraordinarily successful people are typically more a product of particular circumstances than of their own skills. For example, it is hardly a coincidence that Bill Gates, Paul Allen, Steve Jobs, Larry Ellison and Sun Microsystems founders Bill Joy and Scott McNealy were all born within a few years of each other. To be sure, they were indeed incredibly talented, but above all they were at the right place at the right time.

From another angle, Nassim Nicholas Taleb has argued that just by sheer chance you would find superstar investors who appear to have otherworldly qualities in picking the right investments. They have repeatedly placed everything they owned in risky investments and yielded massive returns. But if the strategy is to place everything you own on stupidly risky financial bets, the simple laws of chance say that someone will have won the bet several times and made a gazillion dollars, whereas thousands following the same strategy will have lost the bet at some point and lost everything they owned. Guess who we hear about. Naturally it is more interesting to look at the one case where it paid off, but should you copy the investment strategy just because it paid off one time? Of course you shouldn’t.

Anyhow, entrepreneurs and business people get mesmerized by the extraordinary success of a few companies and don’t think of the thousands of companies that did the same thing but went bankrupt because they weren’t at the right place at the right time, or because they just didn’t have the same luck. Consequently, doing what Google does or trying to be Steve Jobs might be a bad idea and jeopardise your company’s existence.

Remember that Google and Facebook are still one-hit wonders (but what hits!). Google is still just an advertising network, although they are good at advertising themselves as something else. Facebook is still just a social network trying to be an advertising network. Apple started that way too and managed to survive long enough for the next fluke. All of them have been able to get away with massive amounts of incompetence and bad decisions that we just don’t notice because of the glare of success. Most “normal” companies could not get away with the same.

So, what we can learn from them is to be just as determined and work just as hard; not to be discouraged if we don’t succeed in quite the same way; and, most important of all, to seek out the right place and time for what we are doing.

From User Interface Design to Virtual Product Design

User interface design is very important for any company with virtual products, but when more and more people access virtual products through more and more different user interfaces, it becomes increasingly important to design the entire virtual product, not just the user interfaces.

Not just the user interface

I have been researching how different psychological theories can inform and improve virtual product development. One important idea is object permanence, a principle from Jean Piaget’s psychology. It describes how children learn that objects continue to exist when they are out of sight (to put it very simply).

The reason this is interesting is that it points to something quite fundamental (if you ask me). Object permanence is a quality of the user interface, but also of the relationship between user interfaces. When we leave something in a system, like a document in a filing cabinet, we expect it to be in the same place when we return to it. We expect the same object permanence from virtual products. A brilliant example of a company built entirely around object permanence is Dropbox. When you leave a file in your Dropbox, it is there in exactly the condition you left it when you return, regardless of the device you use to access it.

An example of the difference between when it works and when it doesn’t is Netflix versus HBO Nordic, HBO’s experimental online movie streaming competitor to Netflix. When you watch a movie on Netflix on your iPad and pause it, only to resume it later on your PlayStation, you will find it paused in the same place, right at the front of your screen. HBO Nordic also lets you watch movies and episodes, but every time you return you have to relocate the content you wanted: find the series you were watching, click through to the season, find the episode and then manually look for where you left off.
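
The mechanics behind the Netflix behaviour can be sketched very simply: keep the state on the server, keyed by user and content rather than by device. Here is a minimal sketch under that assumption; the PlaybackStore class and its method names are invented for the example, not any vendor’s actual API.

```python
# Object permanence across devices: playback state lives in one shared
# store keyed by (user, content), so any device can pick it up.
class PlaybackStore:
    def __init__(self):
        self._positions = {}  # (user_id, content_id) -> seconds watched

    def pause(self, user_id, content_id, position_s):
        # Persist the state when the user leaves, whatever the device.
        self._positions[(user_id, content_id)] = position_s

    def resume(self, user_id, content_id):
        # Any device asking later gets the object back as it was left.
        return self._positions.get((user_id, content_id), 0)

store = PlaybackStore()
store.pause(user_id="anna", content_id="s01e04", position_s=1325)  # on the iPad
print(store.resume(user_id="anna", content_id="s01e04"))           # 1325, on the PlayStation
```

The HBO Nordic experience corresponds to keying that state by device, or not persisting it at all, which is why the user has to reconstruct it manually.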

Designing in a virtual world
What we need to remember is that, even though the virtual world does not have the same constraints as the physical world, we bring the expectations of these very same constraints with us. This is why we need to stage virtual products in accordance with our expectations, and therefore implement some of the same features that we would find in the physical world. The more the user interfaces live up to our expectations, the more natural they feel.

So, what we can learn from this is that we have to broaden our view towards designing a virtual product that has multiple user interfaces instead of just one user interface in multiple versions. We have to think of the product as the same, but accessed from different windows into the virtual world.

Can Prediction Markets be Used to Prevent IT Projects From Failing?

Can the wisdom of crowds help improve the hit rate of IT projects? It is well known that many, if not most, large IT projects fail. It is also well known that they could have been saved if proper intervention had been undertaken at an earlier point. The question is just how to get an early warning. How will management or the project manager know when things are going wrong?

One way could be prediction markets. Prediction markets are based on the basic assumption that crowds are wiser than any individual, even an expert. The obvious question then is: why not just send out a regular questionnaire asking how people feel about the situation? The problem with questionnaires is that they are more easily distorted.

One example is the past few elections for President of the USA, where the outcome has been more precisely predicted by the bookmakers than by the polls. The reason is that an opinion in a poll is free, but when you bet you have a stake in it. It is not free.
Prediction markets are modelled on this dynamic. Participants in the market are given a virtual currency to invest in the likelihood of future events. This currency will eventually be turned into some sort of real-life reward.

The British company Qmarkets has developed software that allows companies to use prediction markets to predict the future*. Companies have used it to predict which new product to invest in, which drug to bring through to clinical trials, or which risks are most likely to occur. If applied properly, prediction markets are the most accurate way to predict uncertain events.

The project portfolio manager’s challenge
One of the most challenging parts of project portfolio management is to manage the uncertainty in the portfolio. It is inherently uncertain whether a project will be finished in time for the deadline, whether resources are available or whether the quality will be good enough.

Standard textbooks prescribe the use of complicated algorithms and reference class data to determine whether the project is underspending or overspending with regard to its burn-down rate. A project that burns fewer hours than expected may be experiencing resource bottlenecks or lack of commitment. A project that burns more hours than expected may have underestimated the amount of work. Either way, the likelihood that the project will be on time and on budget is diminished.

The challenge is to be able to say exactly when a project is going off track. This can be done if we know exactly when a project is overspending and when it is underspending. The problem, however, is that all projects are different. Some have a much slower start and an explosive finish; others start really energetically and then level off.
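
For concreteness, here is a hedged sketch of that traditional burn-rate check: compare actual cumulative hours against a planned curve and flag drift outside a tolerance band. The planned curve and the 15% band are illustrative assumptions, and the fixed band is precisely what breaks down when projects have different spending profiles.

```python
# Flag periods where cumulative actual hours drift outside a tolerance
# band around the plan. Plan, actuals and the 15% band are made up.
def burn_status(planned_hours, actual_hours, tolerance=0.15):
    """planned_hours/actual_hours: cumulative hours per reporting period."""
    flags = []
    for period, (plan, actual) in enumerate(zip(planned_hours, actual_hours), 1):
        ratio = actual / plan
        if ratio < 1 - tolerance:
            flags.append((period, "underspending: bottleneck or low commitment?"))
        elif ratio > 1 + tolerance:
            flags.append((period, "overspending: work underestimated?"))
    return flags

planned = [100, 250, 450, 700, 1000]  # an assumed ramp-up plan
actual = [90, 210, 340, 520, 760]
print(burn_status(planned, actual))   # flags the later periods as underspending
```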

I Predict
When traditional ways of detecting that a project is going off track don’t work, prediction markets may offer precisely the solution needed. Prediction markets are good at early detection of sentiments and information that are distributed among a group of people, and could therefore serve as an early warning system.

The prediction market could be implemented as a market where the question is whether each project in the portfolio, or at least the largest of them, will meet its deadline. All employees would be given a pool of virtual money to buy stocks in the projects. These stocks can be bought and sold at any time. The stock price therefore reflects the probability of a project meeting its deadline.
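
The article does not prescribe a trading mechanism, but as an illustration, here is a minimal sketch using Hanson’s logarithmic market scoring rule (LMSR), a standard way to run exactly this kind of internal market. The DeadlineMarket class, the liquidity parameter and the trade are assumptions for the example; the key property is that the current price of the “deadline met” share can be read directly as the crowd’s probability estimate.

```python
# A minimal LMSR prediction market for one project's deadline.
# Two outcomes: 0 = deadline met, 1 = deadline missed.
from math import exp, log

class DeadlineMarket:
    def __init__(self, liquidity=100.0):
        self.b = liquidity   # higher b = prices move more slowly per trade
        self.q = [0.0, 0.0]  # shares sold so far of [met, missed]

    def _cost(self, q):
        return self.b * log(exp(q[0] / self.b) + exp(q[1] / self.b))

    def price(self, outcome):
        """Current share price = the market's probability estimate."""
        total = exp(self.q[0] / self.b) + exp(self.q[1] / self.b)
        return exp(self.q[outcome] / self.b) / total

    def buy(self, outcome, shares):
        """Returns how much virtual currency the trade costs the employee."""
        new_q = list(self.q)
        new_q[outcome] += shares
        cost = self._cost(new_q) - self._cost(self.q)
        self.q = new_q
        return cost

market = DeadlineMarket()
market.buy(outcome=1, shares=40)  # employees quietly betting on a miss
print(f"P(deadline met) = {market.price(0):.2f}")  # drops below 0.50: early warning
```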

The portfolio manager and senior management could therefore, just by monitoring the market, spot projects with a high risk of failure much earlier. That would allow them to do one of two things: discontinue the project, or give it more attention to bring it back on track. Either option would benefit the company: it would either minimize losses or maximize the likelihood of delivering quality on time.

These are the hard benefits, but there are also soft benefits that would not follow from the traditional method. Since this is a game, it is a way for employees to take their mind off their work for a second and still be doing something relevant, instead of posting links to YouTube videos on Facebook.

It is also a way to create awareness of all the projects your company is running. Suddenly employees will know what other departments are doing, and they may even serendipitously discover synergies or redundant projects.

Since it is a game it can be entertaining, which may boost morale.

Further, it may create positive incentives to make the real projects finish on time. If you have invested in a project, you want to protect your investment (this is why it is probably a good idea to disable the possibility of shorting project stocks).

Future potential
Unfortunately this solution is not in use anywhere yet. It is only a possibility, but with the market maturing and ideas about collective intelligence becoming more widespread, it is probably only a question of time. The current economic climate calls for solutions that help you minimize losses and maximize success. The question is just whether the cultural gap of actively letting employees play, and even rewarding them for it, is too big for most companies.

The typical risk analysis is inherently flawed. It is based only on the project manager’s subjective opinion, and furthermore it is typically made with a 5-point scale that is idiosyncratically interpreted. This doesn’t guarantee a good estimate of the future.
Furthermore, we are natural-born optimists. We overestimate the likelihood of something good happening and underestimate the probability of negative events occurring.

* Best Buy used prediction markets to find out whether a new product idea would succeed (http://online.wsj.com/article/SB122152452811139909.html). They built a market where about 2,000 employees could trade imaginary stocks tied to questions about the future. If a question was very likely to be true, the price of the stock would go up; if it was likely to be false, the price would go down. This means that the price of the imaginary stock would match the probability of the answer being true.

References:
Justin Wolfers and Eric Zitzewitz: “Prediction Markets”
James Surowiecki: “The Wisdom of Crowds – Why the Many are Smarter than the Few”
Eduardo Miranda: “Running the Successful Hi-Tech Project Office”
Photo: thetaxhaven @flickr

“Heat” – How to Train Your Decision Making

In one of the greatest gangster movies of all time, “Heat”, lies hidden deep wisdom for all decision makers, big and small. It is not that you should change your business model and start robbing banks, hire Al Pacino, or grow a moustache and goatee to succeed. But our decision making is clouded by two problems that can be attributed to how our brains work, and they can be overcome by heeding the advice of “Heat”.

Cool feature…
Before deciding to do something, most people will investigate the possibilities. This is the first place where your decision making could go wrong. People have a tendency to attribute too much importance to the first information they get. This is sometimes called the anchoring bias. The effect is that all subsequent evaluations will be coloured by the first thing you see, even though that information may be irrelevant.

It could be that you are looking to invest in a new Customer Relationship Management system for your company and you hear about a vendor that has close integration with Facebook. Again, this is a cool feature, but essentially irrelevant. Yet all subsequent CRM systems will be reviewed in light of whether they have Facebook integration.

Jeans suck!
After having seen a couple of alternative solutions to your problem, you will start to develop hypotheses based on your gut feeling, for example that products from Germany are inherently more robust. You may have seen two or three examples of this. Here comes the next hurdle: the confirmation bias. The confirmation bias makes you look primarily for confirmatory evidence and attach more importance to it than to contrary evidence.

It could be that in your recruitment process for a new, ambitious account manager you encountered two polite applicants from, say, London. This has made you develop the hypothesis that people from London are polite. If an impolite applicant from London comes along, who perhaps forgets to shake your hand, you may not give his geographical origin any weight in your evaluation. Chances are that you will use another hypothesis, for example that people in jeans are impolite, and that this is why he is not as polite (at this point you have forgotten that the first applicant from London also wore jeans, but back then you didn’t pay attention to it). You just made that up on the spot to make sense of the evidence.

Psychologically it just feels a lot nicer to have your hypotheses confirmed, than having them contradicted, even to the point where you make up new ones just to have some confirmation.

30 seconds flat
Flash back to “Heat”. At one of the absolute highlights of the movie, the policeman Vincent Hanna, who hunts the gangster Neil McCauley, decides to have coffee and talk with Neil. At the climax of this conversation Neil McCauley says the sentence that has also given the movie its name: “Don’t let yourself get attached to anything you are not willing to walk out on in 30 seconds flat if you spot the heat around the corner.”

This is deep wisdom because, if applied with sufficient discipline, it will eliminate the decision problems named above. You should never get attached to any idea. Period. It doesn’t mean that you should change your opinion or hypotheses incessantly. Rather, it means that you should always be psychologically prepared to abandon an idea if you find that the evidence does not support it.

This is more difficult than you think. But so is pole vaulting. Yet, pole vaulting can be learned. This is done by training. And train, you can. Teach you, I will: Next time you are making a decision try the following.

  1. Before you start searching for information, write down your hypotheses. That way it will be easier to challenge them.
  2. When you research your alternatives, write down all the new hypotheses you develop.
  3. Make sure to continually test your list of hypotheses in your evaluation process.
  4. All previously evaluated alternatives should also be checked against the new hypotheses.
  5. Every time you find sufficient evidence against a hypothesis, strike it from the list.

You can train this on everyday decisions like choosing the right restaurant for your wedding anniversary (yes, it’s not enough to bring home Chinese takeaway), buying a new TV (we all know you need it) or even finding a new boyfriend/girlfriend (I know, not necessarily an everyday decision, but still…)

Photo by Michael Gil, MSVP @flickr

The Hexagon Framework for Selecting the Right IT System

Companies waste millions of dollars every year on failed Commercial Off The Shelf (COTS) system acquisitions. The symptoms are well known: budget overruns, missed deadlines, bad quality and bad client-supplier relations. Even when the acquisition project is itself a success, there is no guarantee that the system will ever deliver the expected benefits, because the expectations of functionality were never properly aligned with the business goals.

To avoid this we have developed an evaluation framework called the Sensor Six Hexagon Framework. It will guide your COTS evaluation through the most important criteria and increase the likelihood of success for your COTS acquisition. The framework operates with six groups of criteria (a minimal scoring sketch follows the list):

Functionality – This aspect considers how the software is to be used: for example, which business processes it is meant to support, its usability, and specific functions. The purpose of evaluation here is to ascertain the degree to which the functionality is useful. What does the system do?
Implementation – The purpose of this group is to evaluate the implementation process. How long will it take? How experienced is the supplier, and how culturally compatible are the two organisations? All criteria related to the project of making the COTS functional are considered here. How will we implement it?
Risk – As the name suggests, the risk group has to do with everything that could go wrong. Is the product immature? Is security weak? Is the supplier about to go bankrupt? These types of questions should be asked here.
Strategy – Has to do with long-term orientation, for the supplier as well as the customer. The customer might want to consider whether this supplier fits his strategy, but also whether he, as a customer, belongs to a key segment of the supplier.
Integration – Is about how this system will work with all other systems in the enterprise. Is there a well-known way to integrate it? Does it have a well-defined interface?
Operation – Once the system has been implemented, what are its operational aspects: support, performance, upgrades, etc.?
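
As a hedged illustration of how the six groups could be turned into a comparable score per candidate system: the framework itself does not prescribe weights or a scoring scale, so the 1-5 ratings and equal default weights below are assumptions, but the hard rule from the research (every group must be evaluated) is enforced explicitly.

```python
# Score one COTS candidate across the six Hexagon criteria groups.
GROUPS = ["functionality", "implementation", "risk",
          "strategy", "integration", "operation"]

def hexagon_score(scores, weights=None):
    """scores: dict mapping each group to a 1-5 rating for one candidate."""
    missing = [g for g in GROUPS if g not in scores]
    if missing:
        # The research finding: leaving any group unevaluated predicts failure.
        raise ValueError(f"unevaluated criteria groups: {missing}")
    weights = weights or {g: 1.0 for g in GROUPS}
    total_weight = sum(weights[g] for g in GROUPS)
    return sum(scores[g] * weights[g] for g in GROUPS) / total_weight

candidate_a = {"functionality": 4, "implementation": 3, "risk": 4,
               "strategy": 3, "integration": 5, "operation": 4}
print(f"Candidate A: {hexagon_score(candidate_a):.2f} / 5")
```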

The framework is based on empirical research from 10 client cases of COTS acquisition. The cases were grouped into successes and failures. Based on what was done in the evaluation process we were able to conclude the following:
1) All successful projects had evaluated criteria from all six groups of the Sensor Six Hexagon Framework.
2) All failures could be predicted from the failure to evaluate properly before acquiring the COTS system.
3) The more criteria that were evaluated, and the more thoroughly, the higher the likelihood of success.
4) No single criterion stood out as the most common cause of failure. Every failure was due to a missing evaluation of a unique combination of criteria.

The Sensor Six Hexagon is available as a template in Decision Orchestrator, and we also use it in consulting as the basis of our Accelerated COTS Acquisition Process (ACAP), but anyone can benefit from it. Contact us for a whitepaper describing how. The benefits of using the Hexagon Framework are a higher success rate, a quicker decision process, and alignment with your company’s strategic and operational goals.