In the earliest days of computing, the game of chess posed a challenge to the science and research community, which sought to explore the calculating capabilities of these machines. Chess, while a highly structured and focused game, demands a level of intelligence that some humans never master.
IBM took on this challenge, and Friday, May 11, 2012, marks the 15-year anniversary of the victory of Deep Blue, IBM's chess-playing computer, over a reigning world chess champion.
Deep Blue was a highly powerful computer programmed to play the complex, strategic game of chess. But IBM's goal for Deep Blue was a much grander challenge: it enabled researchers to discover and understand the limits of massively parallel processing and high-performance computing.
Deep Blue’s legacy can be found in the way computer systems are used to automate and help humans with their decision making in tackling tough problems. If Deep Blue could explore up to 200 million possible chess positions per second, then couldn’t this deep computing capability be used to help society handle the kinds of complex calculations required in areas such as drug development, financial risk assessment, and extensive data mining? Deep Blue proved that industry could tackle these issues with smart algorithms and sufficient computational power.
Ultimately, the creation of Deep Blue helped pave the way for new kinds of advanced computers and breakthroughs such as IBM Blue Gene and IBM Watson. IBM Blue Gene, introduced in 2004, represented the next grand challenge in computing: it was both the most powerful and the most energy-efficient supercomputer of its time, and it was built to help biologists observe the invisible processes of protein folding and gene development. Deep Blue was also one of the earliest experiments in supercomputing, helping propel IBM to a market leadership position it holds to this day.
Fifteen years on, the world has seen epic growth in the volume and variety of data being generated — so much so that 90% of the data in the world today has been created in the last two years alone. Continuing along the trajectory of using science and technology to tackle challenges such as making sense of the trillions of bytes of data in our world and mining it for information and knowledge, IBM developed IBM Watson. IBM Watson can hold the equivalent of about one million books' worth of information. Yet its significance lies not solely in the amount of information it can process, but in a new generation of technology that uses algorithms to find answers in unstructured data more effectively than standard search technology, while also understanding natural language. The promise of IBM Watson is being explored by industry today: as an online tool to assist medical professionals in formulating diagnoses, and as a way to simplify the banking experience by analyzing customer needs in the context of vast amounts of ever-changing financial, economic, product and client data.
Deep Blue can be thought of as the earliest pioneer in a new era of computing, laying the groundwork for a generation of computers and software that do more than compute: they will be able to sense, learn and predict, in what can be termed the cognitive era of computing. These smart machines and systems will be better equipped to handle the vast amounts of data now pervading our society and to make sense of it to solve the latest challenges in our world, whether that is predicting outages across the power grid or exploring the origins of the universe.
May 11, 2012, marks the 15-year anniversary of the victory of IBM's chess-playing supercomputer, Deep Blue, over a reigning world chess champion. IBM Research scientist Dr. Murray Campbell, one of the original developers, talks about the challenges and breakthroughs of building Deep Blue. See the video on YouTube.
On May 11, 1997, an IBM computer called IBM® Deep Blue® beat the world chess champion in a six-game match: two wins for IBM, one for the champion, and three draws. The match lasted several days and received massive media coverage around the world. It was the classic plot line of man vs. machine. Behind the contest, however, was important computer science, pushing forward the ability of computers to handle the kinds of complex calculations needed to help discover new medical drugs; perform the broad financial modeling needed to identify trends and assess risk; handle large database searches; and carry out the massive calculations required in many fields of science.