Back in the 1950s, Jay Forrester, a computer scientist at the MIT Sloan School of Management, argued that a large corporation is a complex social system far too abstract for people to manage effectively without the aid of computers. He asserted that we needed technology to understand the relationships and interactions amongst executives in big organisations. In 1961, Forrester published his book ‘Industrial Dynamics’, which described an analytical, problem-solving methodology he had developed that uses computer-based simulations to help managers visualise and understand cause-and-effect relationships in decision making and business processes that would otherwise be invisible and inestimable. Forrester used the term ‘mental models’ to describe how people tend to make decisions based on instinct and interpretation rather than on fact. He believed that management decisions based only on mental models and human judgment are inferior to decisions derived from computer models that can represent complex relationships and predict outcomes that the human mind cannot.
Over the same period, Thomas Watson Jr took a big gamble: in 1952 he set out to transform the International Business Machines Corporation, now simply known as IBM, from making punch-card processors to making electronic computers. Little did he know that this decision would kick-start the information technology revolution in business and the beginnings of decision management in large corporations.
Fifty years on, both Forrester and Watson would be astounded by the impact of IT and the internet not just on business life but on all our lives.
The IT revolution was driven by technology: by the decreasing cost of data storage and the ability to transmit data instantaneously anywhere in the world; by the combination of ancient mathematics with modern computers to create algorithms that can potentially give companies the best management tools they have ever had; and by the software that has linked individual computers into one massive resource. The ability to handle masses of data has been the most staggering part of this transformation. The cost of storing or transmitting a kilobyte of data is now too cheap to measure. In 2004, Wal-Mart’s data warehouse reached 500 terabytes. By 2015, data flows on the internet in the United States alone will reach an annual total of 1,000 exabytes. Two exabytes equals the total volume of written information generated worldwide each year; five exabytes is all the words ever spoken by human beings.
And now, ten years into the twenty-first century, businesses need to think about this IT revolution in a different way. Executives need to see the company’s IT infrastructure not as technology to process data, as they have in the past, but as a strategic resource for making better decisions. The internet has also given businesses access to decision support capabilities through web-based tools – tools that help executives make more informed decisions and understand the trade-offs those decisions entail – without the resource investment that traditionally went with them: the high-priced ‘techies’ and software that are now the most expensive part of any IT infrastructure.
Web-based tools connected to an external support agency eliminate this internal infrastructure cost and put a decision tool on the desk of every executive.
In the 1980s, companies invested billions to put a PC on every desk. In this decade the investment will go into putting intelligence into every decision and every transaction. Decision management will be the new competitive advantage, closing the gap between the way your company thinks and the way your customers think, and the gap between strategy and execution.
It will be the deciding factor in determining which companies adapt to changes in consumer behaviour so that they survive and thrive in the future.