The Wrong Side of History: The Lasting Lessons of Obsolete Technologies
What the failures of the past can teach us about the future of innovation
As the sun rose over the British countryside, a meticulously trained soldier mounted his horse and felt his heart race with anticipation — within moments, he would push the limits of the technology of his time. He charged across the open field, ready to put the newly designed Pattern cavalry sword to the test. With each stride, the threatening tip of steel drew ever closer to its intended target: a hay-stuffed dummy. In a seamless maneuver, the blade pierced the dummy’s imagined flesh. Unlike the curved sabers of yore, which were designed to slash their victims, the Pattern sword could be withdrawn from an enemy’s body with unparalleled ease, without the risk of bending, breaking, or becoming lodged.1 The rider, the horse, and the blade — a newly blended force of destruction — could continue galloping unimpeded to the next kill.
The Pattern was, and remains, one of the most efficient cavalry blades ever designed.2 There was, however, a significant problem: the Pattern was introduced in 1908, just six years before the trenches of World War I and newer means of destruction would render sword-wielding cavalry obsolete.
The Pattern sword exemplifies a common phenomenon. A technology often reaches its apex just as further refinement ceases to serve any purpose, and well before its replacement is evident to anyone except the most daring innovators. To stay ahead of the curve, one must constantly push the limits of what is possible.
From cutting-edge to obsolete
The Pattern’s short-lived heyday has countless modern-day parallels. For instance, the introduction of Blu-ray enabled information to be stored on optical discs at unprecedented densities; by then, however, the medium was already becoming obsolete as consumers and businesses shifted to digital downloads and online streaming.3 Much as post-World War I armies phased out their cavalry detachments, Apple began removing the built-in optical drive from its MacBook laptops in 2012.
The Pattern sword also came to mind when Microsoft unveiled the Surface Studio in 2016. This all-in-one desktop PC featured a large, high-resolution touchscreen display that could be tilted from a traditional upright position to a flatter angle reminiscent of a drafting table. It offered a fresh perspective on desktop computing at a time when most professionals had begun to shift away from desktops in favor of laptops and smartphones as their primary productivity devices.4
Admittedly, it is not particularly insightful to suggest that the best version of a technology arrives just before it becomes obsolete. In most cases, technologies continue to improve until they are no longer useful or relevant. A technology is not phased out because it cannot be improved further; indeed, if sword-wielding cavalrymen were still in use today, we would almost certainly have developed a cavalry blade superior to the Pattern. What, then, causes a once-dominant technology to fade into obscurity?
Unpacking the causes of technology obsolescence
Entire eras of human history have been named after the dominant tool- and weapon-making material of the time: stone, bronze, and iron. This highlights the crucial role that the mastery of these materials played in the survival and progress of societies during these periods. Stone tools served humanity for hundreds of thousands of years, until the invention of bronze metallurgy revolutionized agriculture and warfare, giving societies that harnessed bronze an insurmountable advantage over those that still relied on chipped stone.5 Similarly, iron supplanted bronze because it offered superior properties and benefits.
Bronze replaced stone, and iron superseded bronze, because improvements to tools and weaponry are ultimately limited by the physical properties of their material. Once a superior material was mastered, its predecessor became obsolete. No bronze blade, no matter how well designed or crafted, could defeat a steel-clad foe. Societies that were slow to adopt new advances were conquered by those that did.
The same principle applies to the many technological advancements that followed. History offers countless examples of once-dominant technologies that fell into disuse after being outclassed by their successors. For instance, a sword-wielding cavalryman would stand no chance against a soldier with a firearm, and physical discs cannot compete with the convenience of digital downloads. Similarly, commercial vessels no longer rely on sail or steam power; horse-drawn carriages are no longer used for everyday transportation; and Clark Kent would struggle to find a phone booth in which to transform into Superman now that smartphones have rendered this once-ubiquitous street fixture obsolete.
The lessons of creative destruction
The unforgiving pace of technological progress has profound implications for the business world. Companies that stand the test of time understand that making incremental improvements to existing product lines is insufficient to achieve long-term success. Instead, they embrace change and push into new frontiers, even if it means risking their current revenue streams.
Notable examples include Steve Jobs’s decision to introduce the iPhone despite the risk of cannibalizing iPod sales, which at the time accounted for as much as 40 percent of Apple’s revenue.6 Another example is Reed Hastings’s venture into online streaming, which risked drawing customers away from what was then Netflix’s core business: DVD mail-order rentals.7 Mark Zuckerberg’s acquisition of Instagram, at a time when most of Facebook’s traffic and revenue came from desktop and web rather than smartphones and apps, proved to be similarly prescient.
The pressures of creative destruction help explain the significant sums invested in emerging technologies that may seem like nothing more than gimmicks today. Consider augmented reality (AR) glasses. Unlike the more established virtual reality (VR) headsets that create fully immersive digital worlds, AR devices merely augment the wearer’s real-life surroundings with digitally conjured imagery, such as turn-by-turn navigation directly in the line of sight.8
Many major technology companies, including Microsoft, Meta, Alphabet, and Apple, have invested heavily in AR. According to market intelligence provider IDC, spending on AR and virtual reality is forecast to exceed $50 billion by 2026.9 As AR technology continues to improve and become more widely adopted, it has the potential to transform various industries, from retail and advertising to healthcare and education. In the future, ambient computing that seamlessly integrates into users’ lives may replace smartphones as humankind’s primary computing interface, just as smartphones have overtaken laptops and desktop PCs. Companies with existing investment and know-how in AR will be best positioned to capitalize on this transition and shape the future of computing.
The perils of underestimating emerging technologies
Only time will tell whether AR will become ubiquitous, or whether it will remain a niche curiosity. Whatever the outcome, it would be unwise to prematurely dismiss the emerging technology’s potential.
Pioneering innovations are easy targets of ridicule. Take, for instance, Google Glass, which faced widespread mockery upon its release in 2013. The device was derided by the mainstream media and the public alike for its dorky appearance, and those who wore the high-tech eyewear were dubbed, in less-than-flattering terms, “glassholes.”10 Granted, the poor reception was partly due to an overambitious launch that prioritized fanfare over a polished, bug-free product.11 Additionally, there were understandable privacy concerns from those who found themselves in the presence of eyeglass-mounted mini-cameras.12 However, it would be disingenuous to deny that a degree of dismissive schadenfreude accelerated the device’s decline.
Even humble inventions like the umbrella were initially met with hostility. Just like the Silicon Valley geeks parading their Google Glasses, the first Englishman to carry an umbrella on the streets of London was mocked.13 Despite its usefulness in London’s notoriously rainy weather, the waterproof canopy on a stick was at first shunned as effeminate and a sign of weak character.14 It took decades before the umbrella was considered indispensable throughout Britain. Today, it would be difficult to find a Londoner without one.
Throughout the history of technology, experts have repeatedly failed to recognize the potential of emerging innovations. One of the most famous examples of such myopia occurred in 1911, when French general Ferdinand Foch dismissed airplanes as “interesting scientific toys ... of no military value.”15 At the time, military planners still believed that an advanced sword design, one capable of piercing an enemy without slowing or dismounting the horseman wielding the blade, was the kind of technological development that would revolutionize warfare. Within a few short years, the First World War would demonstrate both the futility of cavalry and the necessity of air power in a military’s arsenal.
Despite the rapid advancements in aviation during his time, Foch remained skeptical of airplanes, believing that the true power in warfare lay in traditional infantry and cavalry tactics. This serves as a stark reminder of the dangers of complacency in the face of novel technological advancements.
In a more recent example of underestimating groundbreaking innovations in their early stages, Nobel Prize-winning economist Paul Krugman predicted in 1998 that “the Internet’s impact on the economy [would be] no greater than the fax machine’s.”16 At the time, Krugman’s words may have seemed reasonable; after all, the Internet was still in its infancy and few could have predicted that it would one day reshape the world. Fast forward to today, and we know just how wrong Krugman was.
Foch’s and Krugman’s blunders are just two examples of experts underestimating the transformative potential of emerging technologies. Their misjudgments underscore the crucial importance of recognizing the potential of nascent innovations, no matter how primitive or limited their initial versions may seem.
Hesitance toward change, even hostility to it, is understandable, especially as the technology of the day is constantly improving in ways that can be easily evaluated against our most convenient frame of reference: the status quo. However, it would be a fatal error to ignore nascent innovations or disparage those who push the frontiers of what is possible. Indeed, those who underestimate the revolutionary potential of emerging technologies may find themselves on the wrong side of history.
Ibid.
The Editors of Encyclopaedia Britannica (2023, February 2). “Blu-ray.” Encyclopedia Britannica. (Link).
Pettey, C., and Forni, A. (2017, January 11). “Gartner Says 2016 Marked Fifth Consecutive Year of Worldwide PC Shipment Decline.” Gartner. (Link).
The Editors of Encyclopaedia Britannica (2023, January 6). “Bronze Age.” Encyclopedia Britannica. (Link).
Molla, R. (2017, June 26). “How Apple’s iPhone changed the world: 10 years in 10 charts.” Vox. (Link).
Wingfield, N. (2009, June 24). “Netflix Boss Plots Life After the DVD.” The Wall Street Journal. (Link).
Shirer, M. and Torchia, M. (2022, November 22). “IDC Spending Guide Forecasts Strong Growth for Augmented and Virtual Reality.” International Data Corporation. (Link).
Ibid.
Bilton, N. (2014, February 19). “Google Offers a Guide to Not Being a ‘Creepy’ Google Glass Owner.” The New York Times. (Link).
Waters, M. (2016, July 27). “The Public Shaming of England’s First Umbrella User.” Atlas Obscura. (Link).
Ibid.