
Beam Me Up, Scotty

May 15, 2009


In addition to being an entrepreneur and businessman, I also fancy myself something of a futurist. I think all entrepreneurs have a fascination with the future because they believe they are helping to shape its outcome. Since writers first began penning what we now call “science fiction,” people have dreamed of doing things long thought impossible, like traveling through time, flying like Superman, replicating food, or becoming invisible at will. Some science fiction dreams have actually come true, like satellite communications, but many have not. Star Trek fans, for example, were immediately drawn to the Enterprise’s “transporter” technology — a transportation system capable of beaming a group of people to and from the Starship Enterprise. Having spent plenty of time on airplanes flying around the world, I’d welcome a system that could beam me halfway around the world in the twinkling of an eye. Although that capability may be developed far in the future (if ever), some baby steps have been taken towards a form of teleportation [“A Leap for Teleporting, Between Ions Feet Apart,” by Kenneth Chang, New York Times, 2 February 2009]. Chang writes:

“Without quite the drama of Alexander Graham Bell calling out, ‘Mr. Watson, come here!’ or the charm of the original ‘Star Trek’ television show, scientists have nonetheless achieved a milestone in communication: teleporting the quantum identity of one atom to another a few feet away.”

Okay, but what does “teleporting the quantum identity of one atom to another” mean exactly? It doesn’t sound like transporting one atom from here to there — and it’s not. Chang explains:

“Quantum teleportation depends on entanglement, one of the strangest of the many strange aspects of quantum mechanics. Two particles can become ‘entangled’ into a single entity, and a change in one instantaneously changes the other even if it is far away.”

The phenomenon is sort of like the claims of identical twins who say that they can sense something has happened to their sibling. Only in this case, the change is instantaneous and, if you didn’t know better, you would swear that the changed atom is the original atom rather than a copy. It’s what makes quantum mechanics both weird and wonderful. Chang continues:
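The twin analogy can be made concrete with a deliberately simplified sketch of my own (not from Chang's article). It is a classical stand-in — a shared hidden value, which is precisely what real entanglement is *not*, since Bell's theorem shows quantum correlations exceed anything a pre-shared value can produce — but it captures the "always agree, instantly, at any distance" intuition:

```python
import random

def entangled_pair():
    """Toy model of an entangled pair: measuring one member
    instantly fixes the other. (A classical stand-in -- real
    entanglement produces correlations that no shared hidden
    value can reproduce, per Bell's theorem.)"""
    shared = random.choice([0, 1])
    return shared, shared

# The 'twins' always agree, no matter how far apart they are.
a, b = entangled_pair()
print(a == b)  # always True
```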

“Previously, physicists have shown that they could use teleportation to transfer information from one photon to another or between nearby atoms. In the new research, the scientists used light to transfer quantum information between two well-separated atoms.”

That’s wonderful, but it gets weirder.

“Present-day digital computers store information as zeroes and ones. In a future quantum computer, a single bit of information could be both zero and one at the same time. (In essence, a quantum coin toss would be both heads and tails until someone actually looked at the coin, at which time the coin instantly becomes one or the other.) In theory, a quantum computer could calculate certain types of problems much more quickly than digital computers.”
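Chang's coin-toss analogy can be sketched in a few lines. This toy simulation (my own illustration, not from the article) holds a qubit in an equal superposition and only forces a definite 0 or 1 when "someone looks," i.e., when it is measured:

```python
import random

# Equal superposition: amplitude 1/sqrt(2) for each of |0> and |1>.
AMP0 = AMP1 = 2 ** -0.5

def measure(a0=AMP0, a1=AMP1):
    """Collapse the state: outcome 0 occurs with probability |a0|^2."""
    return 0 if random.random() < a0 * a0 else 1

# Until measured, the 'coin' is both heads and tails; each call
# to measure() yields one definite face.
samples = [measure() for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5 for an equal superposition
```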

We can all appreciate that this research may lead to faster computers that could help solve some of the world’s most daunting challenges. So would we call research into quantum teleportation pure or applied research? Stephen Quake insists that cases like this demonstrate why trying to make a distinction is silly [“The Absurdly Artificial Divide Between Pure and Applied Research,” New York Times, 17 February 2009]. Quake writes:

“The snobbish idea that pure science is in some way superior to applied science dates to antiquity, when Plutarch says of Archimedes: ‘Regarding the business of mechanics and every utilitarian art as ignoble and vulgar, he gave his zealous devotion only to those subjects whose elegance and subtlety are untrammeled by the necessities of life.’ The reality appears to have been quite different, as Archimedes was not just the greatest mathematician of the ancient world, but also a clever inventor who drew inspiration from numerous practical problems — and based on the historical record, few historians today accept Plutarch at his word. … In today’s more specialized world, there are numerous artificial divisions between pure and applied work: different departments, different professional societies, and different journals. The stereotyped view is that the applied scientists control the lion’s share of funding, while the basic scientists control the most prestigious journals and prizes. The reality is more complicated and lies somewhere in between. What remains true is that practical problems can be equally compelling as fundamental ones, and often lead in turn to the discovery of new fundamental science.”

Chang’s article about quantum teleportation is accompanied by a great animation that helps explain what the researchers are doing. By the way, don’t hold your breath about being teleported in the future. Chang concludes, “Even in the far future, ‘Star Trek’ transporters will probably remain a fantasy.” Microsoft is, however, working on a personal assistant that reminds at least one reporter of the helpful computers on the cartoon show “The Jetsons” [“Microsoft Mapping Course to a Jetsons-Style Future,” by Ashlee Vance, New York Times, 1 March 2009]. Vance writes:

“Meet Laura, the virtual personal assistant for those of us who cannot afford a human one. Built by researchers at Microsoft, Laura appears as a talking head on a screen. You can speak to her and ask her to handle basic tasks like booking appointments for meetings or scheduling a flight. More compelling, however, is Laura’s ability to make sophisticated decisions about the people in front of her, judging things like their attire, whether they seem impatient, their importance and their preferred times for appointments. Instead of being a relatively dumb terminal, Laura represents a nuanced attempt to recreate the finer aspects of a relationship that can develop between an executive and an assistant over the course of many years.”

Microsoft, of course, is researching how to improve your future because it wants to ensure that it remains relevant in that future.

“Microsoft and its longtime partner, Intel, have accelerated their exploration of new computing fields. Last week at its headquarters near Seattle, Microsoft showed off a host of software systems built to power futuristic games, medical devices, teaching tools and even smart elevators. And this week, Intel, the world’s largest chip maker, will elaborate on plans to extend its low-power Atom chip from laptops to cars, robots and home security systems.”

Vance raises a good question. Is Laura just another prototype that will eventually end up on a virtual shelf or is it a program that people could actually see on PCs in the future?

“Whether the companies can really turn prototypes like Laura into real products remains to be seen. Microsoft and Intel both have a habit of talking up fantastic and ambitious visions of the future. In 2003, Microsoft famously predicted that we would soon all be wearing wristwatch computers known as Spot watches. Last year, the company quietly ended the project. This time around, however, the underlying silicon technology may have caught up to where both companies hope to take computing. For example, Laura requires a top-of-the-line chip with eight processor cores to handle all of the artificial intelligence and graphics work needed to give the system a somewhat lifelike appearance and function. Such a chip would normally sit inside a server in a company’s data center. Intel is working to bring similar levels of processing power down to tiny chips that can fit into just about any device. Craig Mundie, the chief research and strategy officer at Microsoft, expects to see computing systems that are about 50 to 100 times more powerful than today’s systems by 2013. Most important, the new chips will consume about the same power as current chips, making possible things like a virtual assistant with voice- and facial-recognition skills that are embedded into an office door.”

The fact that the ubiquitous computer chip will become even more ubiquitous means that ever more data will be generated and stored in the future. New York Times columnist Steve Lohr reports that General Electric claims to have made a breakthrough in storage technology just in time to meet the needs of this brave new world [“G.E.’s Breakthrough Can Put 100 DVDs on a Disc,” 26 April 2009].

“Optical storage experts and industry analysts who were told of the development said it held the promise of being a big step forward in digital storage with a wide range of potential uses in commercial, scientific and consumer markets. … The promising work by the G.E. researchers is in the field of holographic storage. Holography is an optical process that stores not only three-dimensional images like the ones placed on many credit cards for security purposes, but the 1’s and 0’s of digital data as well. The data is encoded in light patterns that are stored in light-sensitive material. The holograms act like microscopic mirrors that refract light patterns when a laser shines on them, and so each hologram’s recorded data can then be retrieved and deciphered. Holographic storage has the potential to pack data far more densely than conventional optical technology, used in DVDs and the newer, high-capacity Blu-ray discs, in which information is stored as a pattern of marks across the surface of a disc. The potential of holographic technology has long been known. The first research papers were published in the early 1960s. Many advances have been made over the years in the materials science, optics and applied physics needed to make holographic storage a practical, cost-effective technology.”
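The headline’s “100 DVDs” translates into roughly half a terabyte. A quick back-of-the-envelope check of my own, assuming the standard single-layer capacities (4.7 GB per DVD, 25 GB per Blu-ray disc):

```python
DVD_GB = 4.7      # single-layer DVD capacity
BLURAY_GB = 25.0  # single-layer Blu-ray capacity

holographic_gb = 100 * DVD_GB  # G.E.'s claim: 100 DVDs on one disc

print(f"Holographic disc: ~{holographic_gb:.0f} GB")
print(f"Equivalent single-layer Blu-rays: ~{holographic_gb / BLURAY_GB:.0f}")
```

In other words, one such disc would hold about as much as nineteen single-layer Blu-ray discs.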

This doesn’t mean that we will soon be entertaining ourselves in holographic rooms like those found in follow-on Star Trek series, but it does mean that we are likely to continue to become a YouTube society that records and shares experiences without too much concern about running out of storage capacity. Some pundits believe that this may not be a good thing. Kathleen Parker, for example, believes we already live in a society where there is “too much information” [“Turn Off, Tune Out, Drop In,” Washington Post, 1 April 2009]. Parker writes:

“The phrase ‘too much information,’ a now-cliched talk-to-the-hand deflection, isn’t just a gentle whack at someone who tells you more than you want to know about his Cialis experience. It’s a toxic asset that exhausts our cognitive resources while making the nonsensical seem significant. TMI may indeed be the despot’s friend. Keep citizens so overwhelmed with data that they can’t tell what’s important and eventually become incapable of responding to what is. Our brains simply aren’t wired to receive and process so much information in such a compressed period. In 2006, the world produced 161 exabytes (an exabyte is 1 quintillion bytes) of digital data, according to Columbia Journalism Review. Put in perspective, that’s 3 million times the information contained in all the books ever written. By next year, the number is expected to reach 988 exabytes.”
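Parker’s figures imply startling growth. Here is a small sanity check on the numbers in the quote (my own arithmetic, treating an exabyte as 10^18 bytes and taking “by next year” to mean 2010, four years after the 2006 figure):

```python
EXABYTE = 10 ** 18  # one quintillion bytes

eb_2006 = 161   # exabytes produced in 2006 (CJR figure cited by Parker)
eb_2010 = 988   # projected total "by next year" (2010)

growth = eb_2010 / eb_2006
annual = growth ** (1 / 4) - 1  # implied compound growth, 2006 -> 2010

print(f"2006 output: {eb_2006 * EXABYTE:.2e} bytes")
print(f"Overall growth: {growth:.1f}x (~{annual:.0%} per year)")
```

That projection implies the world’s digital output more than sextupling in four years, compounding at well over fifty percent annually.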

Because we can’t make sense of all that information, we may very well need a personal assistant like Microsoft’s Laura to help us sort through it. Parker may tease about dropping out and disconnecting, but she also realizes that as humans we actually hunger for information.

“The unknowableness of current circumstances, combined with a lack of trust in our institutions, may partly be to blame for our apparent info-insatiability. People sense that they need to know more in order to understand an increasingly complex world. And, of course, it’s fun. The urge to know and be known is a uniquely human indulgence. Being connected to friends and colleagues without having to inconvenience one’s gluteus maximus surely must stimulate our pleasure center or we wouldn’t bother.”

In a recent post entitled The Paradox of Internet Growth in Emerging Markets, I reported that people in developing countries are becoming part of the growing market for social networks. Even people who have been mostly disconnected throughout history are hungry to learn more about others and to be connected to the world around them. Parker’s fear is that “with so much data coming from all directions, we risk paralysis. Brain freeze, some call it. More important, we also risk losing our ability to process the Big Ideas that might actually serve us better. It isn’t only Jack and Jill who are tethered to the Twittering masses, after all. Our thinkers at the highest levels are, too.” I suspect that computers will eventually help us with brain freeze so that, like the characters on Star Trek, we can ask our computer to help us find answers to challenges that puzzle us. That will be helpful, but I’d still like to be teleported somewhere.
