Computing Power

Scientists have succeeded in a 40-year-long quest to create electronic components that “remember” their state after the power has been switched off. This will result in computers that start up instantly and retain sessions if the battery dies, and cellphones that go for weeks without charge.

Two such “memristors” linked together function as a transistor, but are much smaller (and the smaller they get, the better they perform). But because chip designers are unfamiliar with the memristor’s physical properties, which are quite different from those of the familiar transistor, it will be a while before we have them in our computers and cellphones. Meanwhile, we will just have to be satisfied with more horsepower, in the form of a 2.4-teraflop (2.4 trillion floating point operations per second) graphics chip now available for the PC. The new chip brings us “closer to eye-definition computing gaming and cinematic experiences,” according to its maker, AMD. (“Eye-definition computing” is the art and science of achieving visual computing experiences that seem optically real, the company explains.)

Not so long ago, 2.4 teraflops was supercomputer territory, but that was then. The Los Alamos National Laboratory has built a supercomputer called Roadrunner that is expected to achieve petaflop (1,000 trillion floating point operations per second) performance. The current world champion supercomputer, IBM’s Blue Gene/L, achieves only half that. Moreover, Roadrunner gets its power from chips already available in Sony’s PlayStation 3 video game machine.
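The scale of these figures is easy to check with a little unit arithmetic. A quick sketch (the performance numbers are the ones quoted above; the comparison is just illustrative):

```python
# Quick unit arithmetic for the FLOPS figures quoted above.
TERA = 1e12   # one trillion floating point operations per second
PETA = 1e15   # 1,000 trillion floating point operations per second

gpu = 2.4 * TERA          # AMD's 2.4-teraflop graphics chip
roadrunner = 1.0 * PETA   # Roadrunner's target performance
blue_gene_l = 0.5 * PETA  # "only half that"

# How many of the new graphics chips would it take, on paper,
# to match Roadrunner's target?
print(round(roadrunner / gpu))  # 417
```

In other words, the "supercomputer territory" of a few years earlier now fits on a single consumer graphics card, while the supercomputers themselves have moved roughly 400× further out.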

Roadrunner’s power is likely to be applied first to “grand challenges” in antibiotic drug design and HIV vaccine development, among other projects, before being devoted to a classified nuclear-weapons-program computing network.

Brain-Machine Interface Developments

Computing power of such orders of magnitude is going to be very helpful as we continue to refine methods of “jacking in” (wirelessly, of course) to the Net — and to each other. The US Army is spending US$4 million on research to enable soldiers to jack in to one another’s brains; that is, to communicate by telepathy. The research focuses on automatic speech recognition (ASR) and brain imaging technologies such as electroencephalography (EEG). The idea is that a person would “think” a message, the EEG would pick up the signals from the brain, and the ASR computer would decode the signals and send them to the intended recipient, where the reverse process would occur. The research has obvious applicability in medicine, for paralyzed patients.
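The decoding step in that pipeline is, at heart, a pattern-matching problem: map an observed brain-signal feature vector to the nearest stored template. The sketch below is purely illustrative — the feature vectors, templates, and the nearest-neighbor decoder are invented stand-ins, not the Army project's actual method:

```python
# Illustrative sketch only: a toy version of the think -> EEG ->
# decode -> transmit pipeline described above. All signal data and
# templates here are made up; real EEG decoding of inner speech
# remains an open research problem.

def nearest_template(features, templates):
    """Return the word whose stored EEG template is closest
    (squared Euclidean distance) to the observed feature vector."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda word: dist(features, templates[word]))

# Hypothetical per-word EEG feature templates, as might be learned
# during a calibration session.
templates = {
    "advance": [0.9, 0.1, 0.4],
    "hold":    [0.2, 0.8, 0.5],
    "retreat": [0.1, 0.3, 0.9],
}

# 1. "Think" a message -> 2. EEG yields a (noisy) feature vector ->
# 3. decoder maps it to a word -> 4. the word is sent to the recipient.
observed = [0.85, 0.15, 0.35]          # noisy reading near "advance"
message = nearest_template(observed, templates)
print(message)  # advance
```

The hard part, of course, is not the matching but producing feature vectors that reliably separate one "thought" word from another — which is exactly where the ASR and EEG research effort is aimed.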

We’ve long maintained that this is inevitable technology, following successes in getting monkeys and paralyzed patients to operate remote devices merely by thinking, and the private sector is proving us right. For example, one firm is about to market an EEG “neuro-headset” that interprets the wearer’s cognitive and emotive state, allowing the wearer to manipulate a game or virtual environment naturally and intuitively. The US$299 device (which will “begin shipping in late 2008”) translates the EEG signals into video game commands or into facial expressions on the wearer’s virtual world avatar. For instance, if the player smiles, winks, or grimaces, the avatar does the same.

The headset detects more than 30 different expressions, emotions, and actions. They include excitement, meditation, tension, and frustration; facial expressions such as smile, laugh, wink, shock (eyebrows raised), and anger (eyebrows furrowed); and cognitive actions such as push, pull, lift, drop, and rotate. Wearers are able to move objects in a video game just by thinking of the action. The company, Emotiv, is working with IBM to develop the technology for uses in “strategic enterprise business markets and virtual worlds.”

Taking a less direct, and in our view less interesting, route to manipulating objects in virtual worlds is a glasses-free 3-D display called HoloVizio, which allows users to manipulate 3-D objects through hand gestures. The aim of this EU-funded project is a networked, holographic, audiovisual platform to support real-time collaborative 3-D interaction between geographically distributed teams in medicine and in the automotive industry. The system can take raw data from medical imaging devices to create 3-D anatomical models, which clinicians and researchers can discuss and evaluate collaboratively over the Internet.

The technologies underlying all these devices are, of course, also rapidly improving. We’ve just seen how computing and graphics display chips are advancing; now, Canadian scientists have developed a more reliable brain-machine interface (BMI) incorporating a genetic algorithm, complex feature extraction and classification methods, and bipolar EEG signals, all designed to understand what the wearer is thinking. A functional system could be ready for practical application in two years.
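A genetic algorithm in this setting is typically used to search for the subset of EEG channels or features that best discriminates between mental states. The miniature sketch below shows the general shape of such a search; the fitness function is a stand-in (a real BMI would score a classifier's accuracy on calibration data), and the "informative channels" are invented for illustration:

```python
# Hedged sketch: a miniature genetic algorithm of the kind a BMI might
# use to pick which EEG channels best separate two mental states.
# All numbers are invented; this is not the Canadian team's algorithm.
import random

random.seed(0)

N_FEATURES = 8
INFORMATIVE = {1, 3, 6}  # pretend these channels carry the signal

def fitness(mask):
    """Reward masks that keep informative channels and drop noisy ones."""
    hits = sum(1 for i in INFORMATIVE if mask[i])
    noise = sum(1 for i in range(N_FEATURES)
                if mask[i] and i not in INFORMATIVE)
    return hits - 0.2 * noise

def evolve(generations=40, pop_size=20):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]       # selection: keep the fittest half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_FEATURES)
            child = a[:cut] + b[cut:]        # one-point crossover
            if random.random() < 0.1:        # occasional bit-flip mutation
                child[random.randrange(N_FEATURES)] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print([i for i, bit in enumerate(best) if bit])  # ideally [1, 3, 6]
```

Because the fittest half survives each generation unmutated, the best mask found is never lost — a simple form of elitism that makes this kind of search converge reliably on small problems like this toy one.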

The problem with EEG has been that, for optimal performance, electrode preparation, plus preprocessing and classification tailored to the individual (and even to the same individual at different times), can take nearly an hour. But dry electrodes, which cut electrode preparation time to one minute, and a new data analysis method, which likewise reduces calibration time to one minute, will help make the application of BMI technology more practical.

Flexible Chips for Implants

The alternative to EEG for brain-machine interfacing is to interface directly — physically — with the brain by implanting a chip in it. A flexible computing chip made from folds of ultra-thin silicon bonded to sheets of rubber should do the trick better than today’s stiff and hard silicon chips. Indeed, flexible chips will one day power a whole new generation of flexible electronic devices, from brain implants to health monitors and smart clothing. The developers of one crude but functioning prototype flexible chip hope to use it in a smart latex glove for surgeons. It would measure vital signs, such as blood oxygen levels, during an operation. They are also planning to develop a sheet of electronics which would lie on the surface of the brain and monitor brain activity in epileptics.

Taking a different approach to the same flexible end, Japanese researchers have developed a rubber infused with carbon nanotubes that conducts electricity while remaining elastic. Within “a few years,” they hope to develop elastic computer chips for use as artificial skin for robots, enabling them to sense their environment through touch; car steering wheels that measure the driver’s perspiration, body temperature, and other data to judge whether he or she is fit to drive; and mattress covers that monitor a sleeper’s posture and prevent bedsores.

US and Korean researchers have also jointly developed a similar haptic material that can wrap around the finger, or any other part of the body, like a band-aid. They think the technology could provide a means of communication for the visually impaired (for example, as a Braille “display”). It could also be used for a virtual reality keyboard, a tele-surgical glove, or other remote haptic applications. This material, too, is not yet ready for commercialization, however.

Holodeck 2.0

Not even Star Trek ventured out on the telepathy limb (except for the occasional advanced alien civilization), but it was bullish on holo-haptics, the supposedly science-fictional technologies underlying its “holodeck.” We’re ahead of Star Trek in telepathy, and catching up in holo-haptics: A 4.5m square (about 14×14 feet), 11-tonne, omni-directional treadmill called CyberCarpet enables people to walk or jog in any direction, while staying in one place. European scientists have combined the CyberCarpet with virtual reality to give users the impression of walking and running in 3-D worlds, including virtual re-creations of ancient Pompeii and Rome. The technology could be used in rehabilitation, among many other applications.

When (not if) this technology is combined with the haptic materials and telepathic technologies discussed above, then the holodeck will truly have arrived.

(Incidentally, those unwilling or physically unable to “take the treadmill” to visit virtual ancient Rome might instead take “The Lounger,” a hand-built magnetic-levitation chair that floats a few inches above its base. Its maker suggests the chair’s magnetic field could help relieve backache, muscle ache, and headache.)

Virtual Dissection Replaces Frogs in School

Turning from virtual ancient history to virtual modern biology: dissection using virtual reality software such as the Digital Frog, Froguts, and V-Frog can save high-school biology programs about US$6 per frog (and much more for bigger animals). Opponents complain that software cannot duplicate the smell, feel, and texture of cutting into a real frog, while proponents argue it enables students to spend more time on dissection outside of class and to recover from mistakes. The software’s animations and interactions can also teach students various aspects of anatomy and physiology. Virtual scent is admittedly a problem (for now); but as we have just seen, the problem of emulating the feel and texture of dissection will soon go away with flexible haptic materials.

Cloak of Invisibility

It is technically trivial to be invisible in virtual worlds, unless rules that forbid it and algorithms that prevent it are built into the virtual world. But soon it could actually be easier, and less preventable, to be invisible in the real world than in the virtual world, thanks to a “metamaterial” of engineered nanoscale structures that can bend light around objects, making the objects invisible — in theory. In practice, and for the moment, the material can only bend light at a wavelength close to but not within the visible part of the spectrum. The Harry Potter cloaking effect is not far off, the researchers believe; it’s now just a matter of engineering the metamaterial to handle the visible wavelengths.

Does anybody stop to consider the implications, and prepare for these things?

Cell Phones in Healthcare

The accelerating power of computing in ever-smaller form factors is why we often predict that cellphones will eventually play a major role in healthcare. BusinessWeek writer Olga Kharif has provided some examples that show they are well on their way:

  • A UCLA professor has built a stripped-down, portable electromagnetic scanner that transmits images through a regular cell phone to a remote doctor or hospital. The professor believes the cell phone will become an integral part of many diagnostic devices, and in the process cut the cost of the devices.

  • Seventeen of some 30 healthcare-related projects funded by Microsoft Research involve cell phones. One aims to take ultrasound readings using a cell phone and a TV. Another is a heart monitor that uses a cell phone to analyze the readings and dial emergency services if necessary, providing emergency responders with the patient’s location and a preliminary diagnosis.

  • Disposable wireless Band-Aids containing RFID (radio frequency identification) chips that transmit key information, such as drug allergies, to a cell phone are about to go on the market in Europe. They also monitor a patient’s temperature or glucose levels, alerting a nurse if there’s a spike.

  • A company called Life Record lets physicians view patients’ medical records, including electrocardiograms and brain scans, on an Apple iPhone. Doctors can also enter prescription orders on the iPhone, and patients can access their own medical records, including X-rays and physician notes, via an iPhone for US$50 a year.

  • BeWell Mobile lets asthma or diabetes sufferers enter their home test results into their cell phones and send them to the doctor daily. If a patient’s glucose levels spike, the software suggests the patient hold off on certain foods or try a different medication. Or a doctor may call the patient with more personal recommendations. None of the 50 asthmatic patients aged 12 to 20 who had previously visited an ER four or five times a year came to the emergency room during a two-year trial of BeWell Mobile, showing (the company believes) that its product “can actually change patients’ behavior, and that’s the big breakthrough.”