In the decade or so that’s passed [...], the wearable computer has continually ridden the crest of a “5 years out” wave, which in consumer electronics means “we have no idea how to do this, but maybe someone will figure it out by then.” By comparison, “10 years out” means never, so I was always hopeful. Unfortunately, references to wearable computing have slowly faded since 1999-ish.
Perhaps it was the death of relentless, monetization-be-damned optimism after the dot-com bust. Maybe it was just the simple reality of immature technology: head-mounted displays have always seemed a few ounces too heavy, or a few hundred dollars too expensive. Or maybe it was the growing 21st-century definition of a “computer” as something that’s constantly connected to the internet; a live, high-bandwidth internet connection has been very difficult to make portable, much less sustain for 16 hours at a time.
And, of course, there was the rise of the smartphone. The smartphone fills almost every potential application for head-mounted displays, offering glanceable information that’s as convenient to pull out as your billfold. As obtrusive as the smartphone feels to many people, it’s a far cry from the vision of technology that sci-fi has been offering since the ’80s. Why implant a chip in your head, or constantly wear expensive computer goggles and VR gloves, when a tiny little slab of carrier-subsidized technology can solve everything for you?
Maybe we are in another bubble, and when we finally face reality, Sergey and Larry will have to hide their toys away from the shareholders and get back to optimizing AdSense. But in the meantime, I think it’s worth looking at how we got here.
[Paul's] history lesson is presented in two parts:
1. This awesome video of Terminator heads-up displays
2. The academic wearable computing research of the past few decades, primarily performed at MIT or by its alumni
[Andy's take - read the full article for a great history and thoughtful commentary on wearable computing!]