Computing

On October 12, 2003, in Computing
The antidote to Internet viruses may be a whole new Internet, already under construction. While we’re at it, why not a whole new Web as well? Today’s Web is a massive, publicly accessible information warehouse utilizing the combined storage capacity of the millions of computers connected to the Internet. Tomorrow’s Web — also already under construction, at some of the world’s premier research labs — will be a massive, publicly accessible grid computer utilizing the combined computing and storage capacity of those millions of machines to make sense and new knowledge (and, if we are really lucky, new wisdom) from the information base.

A new 2-million-PC grid computing initiative to help predict long-range climate change is just a taste of what might be achieved with grid technology. For anyone still wondering about the relevance of grid computing to healthcare, and why we make such a big deal of it in Health Futures Digest, consider that it could mean the end of animal studies and human clinical trials.

AMD has beaten Intel to the punch with a 64-bit chip for
the masses. If this is an exponential jump for consumer PCs, think of what it
does to the multi-million PC global grid initiative when the majority of those
PCs are all 64-bit machines, as they could well be in about five years. Or when
the chips are made from fast diamond instead of sluggish
silicon. Or when they have all been replaced by quantum computers — which, by
the way, in addition to the quantum leap mentioned in the Acceleration
section, have inched a step closer with the arrival of a new way to make and
handle quantum bits.

Meantime, back in the marginally more real world of silicon computing, Sun
Microsystems is poised to shake things up by making chips that intercommunicate
at speeds in excess of a terabit per second.

Machine-to-machine (M2M) communication and control is
unquestionably one of several Next Big Things, alongside grid computing. In
healthcare, it could mean a pacemaker keeping a hospital informed of its status
and activities, the MRI talking directly to the patient EMR and the hospital
billing system, and much more besides.

But that is mundane stuff compared to science’s real goal of getting to the bottom of life, which may take decades. Still, that will be impressive enough, given that it took Nature aeons. The remarkable thing is not so much the implication of playing God, but the fact that science is making any progress in that regard at all; which it is, through the field of DNA computing, and perhaps through the U.S. military scientists striving to create not merely holograms of patients’ bodies (as described in the Acceleration section), but emulations of their minds, as well.

Whole New Internet

PlanetLab is a collaborative effort among nearly a hundred leading computer
scientists and some large industrial sponsors to replace today’s creaking,
insecure, and unstable Internet infrastructure — upon which so much business
and social activity already and increasingly depends — with an improved
version. It will, for example:

  • Enable anyone to recreate their personal computer workspace, programs, and
    files, on any Internet terminal, instantly;
  • Detect and destroy viruses and worms before they get onto users’ computers;
  • Provide instant video on demand; and
  • Archive users’ personal information securely and indestructibly in a
    distributed database spread across the Internet.

PlanetLab has built and linked 175 smart nodes at 79 sites in 13 countries, with a goal of 1,000 nodes by 2006. It is based on Sun Microsystems’ slogan “The Network [not the box on your desktop] Is the Computer.”

Today’s Internet uses dumb routers to route data packets to the addresses in the packets’ headers, no questions asked. They can’t tell if a packet is part of a web page, an email, a file, or a virus. PlanetLab attaches PCs to the routers to create a “node” capable of running custom software uploaded by users, which could perform such analysis and route, or not route, packets accordingly.
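The idea can be illustrated with a toy sketch (hypothetical signatures and logic, not PlanetLab’s actual software): user-supplied code at a node inspects each packet and decides whether to forward it.

```python
# Toy "smart node": instead of forwarding every packet blindly, inspect
# each one and forward only those that pass analysis.

# Hypothetical virus signatures; a real node would use far richer analysis.
KNOWN_BAD_PAYLOADS = [b"X5O!P%@AP", b"MZ\x90\x00virus"]

def should_forward(packet: bytes) -> bool:
    """Return False for packets matching a known-bad signature."""
    return not any(sig in packet for sig in KNOWN_BAD_PAYLOADS)

def route(packets):
    """Forward only the packets that pass inspection."""
    return [p for p in packets if should_forward(p)]

clean = route([b"GET /index.html", b"X5O!P%@AP rest-of-worm", b"hello"])
# the worm packet is dropped; the other two are forwarded
```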

We wrote about PlanetLab in the August issue of HFD (but we misreported its name as “PlanetNet”). Wade Roush’s article in Technology Review provides a very detailed explanation of how it works and its potential.

Reference: Roush, Wade (2003). “The Internet Reborn.” Technology Review, October.

Quantum Computing Closer

The “qubits” (quantum bits, analogous to the 1s and 0s of conventional computing bits) required for quantum computing can theoretically be made from a single subatomic particle, such as an electron. But controlling a single electron is extremely difficult. US and Swiss researchers have proposed a qubit made from an odd-numbered group of electrons, which can be controlled as one unit.

Other researchers have proposed qubits that contain three constantly interacting electrons rather than a pair of electrons and an electrode. When the energetic state of the middle electron matches the others, the other two can exchange energy to produce an “on” or “1” signal.

The time frame for quantum computers is generally held to be between ten and 20 years from now.

References: Unknown (2003). “Electron Teams Make Bigger Qubits.” Technology Research News, September 16; Unknown (2003). “Quantum Computer Keeps It Simple.” Technology Research News, August 14.

Exotic Computing and the “Doctor in a Cell”

In February, Israeli scientists created three trillion self-contained computing devices out of DNA molecules, all in a single microliter of salt solution. Had that tiny drop been programmable, it would have performed 66 billion operations per second. Scientists are already thinking of harnessing this capacity to create a “doctor in a cell” that could be injected into a patient, diagnose any disease present by sensing and analyzing the biological data all around it, and synthesize and deliver the appropriate drug molecules on the spot.

Bio-computing is in its infancy. One enzyme-powered DNA computer called MAYA plays an infallible game of tic-tac-toe, as long as it makes the first move. Not much compared to a Pentium 4, but it was not so long ago that the Pentium 4’s earliest ancestors were mere calculators that could only add, subtract, multiply, and divide. MAYA could eventually be used to control nanodevices. The method differs from most DNA computing research efforts, which aim to tap DNA’s massively parallel nature to solve very large problems. One of those efforts successfully solved an instance of the Traveling Salesman problem.
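The DNA approach to such problems is generate-and-filter: produce every candidate route at once in the test tube, then weed out the losers. A serial sketch of the same idea, on a hypothetical four-city map (not the actual experiment’s graph):

```python
# Generate-and-filter search for a shortest tour, mirroring in serial code
# what DNA computing does massively in parallel: create every candidate
# route, then keep only the best.
from itertools import permutations

# Hypothetical symmetric distances between four cities.
DIST = {
    ("A", "B"): 3, ("A", "C"): 4, ("A", "D"): 2,
    ("B", "C"): 5, ("B", "D"): 6, ("C", "D"): 1,
}

def dist(a, b):
    # The table is symmetric; look up either orientation.
    return DIST[(a, b)] if (a, b) in DIST else DIST[(b, a)]

def tour_length(route):
    # Sum the legs, closing the loop back to the starting city.
    return sum(dist(a, b) for a, b in zip(route, route[1:] + route[:1]))

cities = ("A", "B", "C", "D")
# Fix the first city to avoid counting rotations of the same tour twice,
# then generate every remaining ordering -- the "all candidates at once"
# step that DNA performs chemically.
tours = [("A",) + p for p in permutations(cities[1:])]
best = min(tours, key=tour_length)
```

The catch, of course, is that the serial version takes time proportional to the number of routes, while the DNA version grows molecules for all of them simultaneously.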

That success, along with the very idea of DNA computing, belongs to a University of Southern California computer scientist who recognized that human cells and computers process and store information in much the same way — in one case, strings of voltage differentials labeled 0 and 1; in the other, strings of molecules labeled A, T, C and G. Much hard math and molecular biology later, test tubes of DNA-laden water called “machines” and “devices” by their creators are crunching algorithms and producing data.
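The parallel between the two alphabets can be made concrete. Below is a minimal sketch using a common two-bits-per-base convention; it is an illustration, not the USC team’s actual encoding.

```python
# Map each pair of bits to one of DNA's four bases, and back. Two bits per
# base means a strand of n bases carries 2n bits of information.
BITS_TO_BASE = {"00": "A", "01": "T", "10": "C", "11": "G"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(bits: str) -> str:
    """Binary string (even length) -> DNA strand."""
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> str:
    """DNA strand -> binary string."""
    return "".join(BASE_TO_BITS[base] for base in strand)

strand = encode("0110")        # -> "TC"
assert decode(strand) == "0110"
```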

These tiny computers could be the brains of the “Doctor in a Cell,” and because genetic material can self-replicate, they could each grow into a massively parallel supercomputer far more powerful than silicon-based machines — as they already do, in the form of human beings. Thus, in seeking to understand and control DNA computing, the researchers are doing no less than seeking to understand and control life itself.

Less ambitiously, NASA is funding a biology-based machine that will keep astronauts healthy on deep space missions.

References: Easen, Nick (2003). “Is life the key to new tech?” CNN, September 19; Unknown (2003). “DNA Plays Tic-Tac-Toe.” Technology Research News, August 20; Associated Press (2003). “DNA may be basis for power computing.” USA Today, August 18.

Wafer-scale Computing

Another approach to refining today’s technology to achieve a significant advance is Sun Microsystems’ new technology, developed under a defense supercomputing contract, of chips that communicate wirelessly with one another instead of through the relatively thick gold or aluminum wires soldered on circuit boards. Eliminating interconnecting wires would accelerate inter-chip data transfers by 60 to 100 times, and enable much more processing power to be installed per unit of space. The traditional circuit board would become obsolete.

The Intel Pentium 4 processor can transmit data at about 50 billion bits a second. Sun says its chip, which is still more of a concept than a product, will achieve speeds in excess of a trillion bits a second via built-in transmitters and receivers only a few microns wide that exploit the “capacitive coupling” effect to send and receive high-speed electrical pulses.
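To put those rates side by side, a back-of-the-envelope calculation (illustrative arithmetic only):

```python
# Transfer time for a 1-gigabyte payload at the two quoted rates.
PENTIUM4_BPS = 50e9      # ~50 billion bits/second (quoted above)
SUN_CHIP_BPS = 1e12      # >1 trillion bits/second (quoted above)

payload_bits = 8 * 1e9   # 1 GB expressed in bits

t_pentium = payload_bits / PENTIUM4_BPS   # 0.16 seconds
t_sun = payload_bits / SUN_CHIP_BPS       # 0.008 seconds
speedup = t_pentium / t_sun               # 20x at these two rates
```

(The larger 60-to-100-times figure above refers to inter-chip transfers freed from board wiring, a different comparison.)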

Some significant challenges remain to be overcome, but when they are, computing power will take a major step up.

Reference: Markoff, John (2003). “New Sun Microsystems Chip May Unseat the Circuit Board.” New York Times, September 22.

International Grid

Chicago’s famed FermiLab, Argonne National Laboratory, and the University of Chicago are part of an international data grid project that will harness most of the world’s number-crunching capacity along with enormous amounts of information, and make it publicly accessible. “It’s a democratization of science,” said a FermiLab scientist. “If someone wanted to run through a simulation for whatever purpose, betting on horses, playing the stock market or doing high-energy physics in Calcutta, they would get access to those services instead of having to go down and buy their own supercomputer,” said another.

Today, the Web is basically just a massive data warehouse. The grid project harnesses the massive computing power of the myriad machines that store the data. Formally, its goal is “to accelerate the handling of the dramatic increase in the amount of data scientists have to deal with,” according to an Argonne National Laboratory scientist.

Trial runs between FermiLab and other facilities in the United States and Europe have already been conducted.

Reference: Associated Press (2003). “Scientists Plan New Supercomputer System.” SiliconValley.com, September 2.

See also the March issue of HFD for a brief description and discussion of grid computing.

Another Grid Computing Initiative

A consortium of climatologists from the UK Meteorological Office, industry, and academia has launched a SETI@Home-type grid computing initiative that will turn the idle time of perhaps two million PCs around the world into a supercomputer to predict climate change over the next 50 years. The model will simulate decades of climate change at a time.

Reference: Best, Jo (2003). “Grid computing used to predict the future: Over two million users get ready to talk about the weather…” Silicon.com, September 12.

64-bit PCs Imminent

While awaiting realization of the potential of quantum, molecular, and other exotic forms of computing, there remain substantial gains to be made from conventional technologies. One is to bring the power of 64-bit computing, long available at the mainframe level and not uncommon in small servers, to the desktop; that is now happening with the introduction of Advanced Micro Devices’ (AMD) Athlon 64 processor. Existing 64-bit processors from Intel (the Itanium) and AMD (the Opteron) are installed in some servers, but not in end-user PCs. The Athlon 64 marks the beginning of pervasive 64-bit desktop computing, and although common desktop applications written to take advantage of the extra power of 64-bit chips won’t exist until next year, gamers are likely to adopt it quickly.
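What the extra bits buy is, above all, address space: a 32-bit chip can directly address 2^32 bytes (4 GiB), while a 64-bit chip can address 2^64 bytes. A quick check of the arithmetic:

```python
# Directly addressable memory: 32-bit vs 64-bit address space.
addr_32 = 2 ** 32            # 4,294,967,296 bytes = 4 GiB
addr_64 = 2 ** 64            # 18,446,744,073,709,551,616 bytes = 16 EiB

gib_32 = addr_32 // 2 ** 30  # 4 GiB
ratio = addr_64 // addr_32   # a 64-bit chip addresses 2**32 times more
```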

Reference: Abreu, Elinor Mills (2003). “AMD Unveils Athlon 64, Seen as Head Start Vs Intel.” Reuters, September 22.

Diamond to Replace Silicon

The conditions of heat and pressure under which carbon crystallizes into diamond are being replicated in Russian-designed machines now turning out 3-carat roughs 24 hours a day, seven days a week; diamond so good that one Belgian gemologist who examined the stones said it will put De Beers out of business. A second company, in Boston, has perfected a completely different process — chemical vapor deposition — for making near-flawless diamonds and plans to begin marketing them by year’s end.

Joshua Davis’ article in Wired is well worth the read just for general interest. Our particular interest, though, is that these processes make possible the development of inexpensive diamond semiconductors. Not only can diamond run much hotter than silicon; it also makes a perfect substrate for DNA chips.

Reference: Davis, Joshua (2003). “The New Diamond Age.” Wired, Issue 11.09, September.

Thinking Machines

Simantha, the simulated surgery patient (see article in the Practice section of this issue), and other artificial beings will one day benefit from work conducted at the Sandia National Laboratories to build computers that can accurately infer intent, remember prior experiences with users, and allow users to call upon simulated experts to help them analyze problems and make decisions.

The work began five years ago with DARPA funding and is fundamentally intended for national security. The original intent — apparently not abandoned — was to create software replicas of the minds of specific political leaders or entire populations, which could then be presented with hypothetical situations, to see how they might react.

For now, the focus is on creating computer models that can help people by acting more like them. The model will become “an aide tasked with watching everything you do, learning everything [it] could about you and helping you in whatever way [it] could.” Knowing what its user is thinking (because it has been given the same experiences as its user and taught to think the same way) lets a machine augment the user’s mental abilities by detecting discrepancies between what the machine is thinking about a given situation and what the user is thinking. If there is a cognitive dissonance between user and machine, “a discrepancy alert may be signaled.”
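At bottom, the discrepancy alert is a comparison of two assessments of the same situation. A deliberately simplified sketch (feature names, weights, and threshold are all hypothetical; Sandia’s cognitive models are far richer):

```python
# Toy sketch of a cognitive aide comparing its assessment of a situation
# with the user's, and signaling a discrepancy alert when they diverge.

def assess(beliefs: dict, situation: dict) -> float:
    """Score a situation by weighting its features with a set of beliefs."""
    return sum(beliefs.get(k, 0.0) * v for k, v in situation.items())

def discrepancy_alert(user_beliefs, machine_beliefs, situation, threshold=0.5):
    """True when the user's and machine's assessments diverge too far."""
    gap = abs(assess(user_beliefs, situation) - assess(machine_beliefs, situation))
    return gap > threshold

situation = {"equipment_age": 0.9, "error_rate": 0.2}
user = {"equipment_age": 0.1, "error_rate": 0.5}      # user discounts age
machine = {"equipment_age": 0.8, "error_rate": 0.5}   # model weighs it heavily
# discrepancy_alert(user, machine, situation) -> True: the aide flags the gap
```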

The researchers apparently believe the technology will become ubiquitous and “allow almost anyone to quickly configure and execute relatively complex computer simulations.” Not to replace, but to augment, the human mind. The team has “focused on replicating the processes whereby an individual applies their unique knowledge to interpret ongoing situations or events. This is a pattern recognition process that involves episodic memory and emotional processes but not much of what one would typically consider logical operations.”

Based on their progress over the past five years on methodologies that allow the knowledge of a specific expert to be captured in computer models, and that endow synthetic humans with episodic memory (memory of experiences), the Sandia folks think that cognitive machine technology will be embedded in “most computer systems” within the next ten years. Since the technology is a tool of national security, that can only mean “most computer systems associated with US national security.”

Reference: Delio, Michelle (2003). “Machine Thinks, Therefore It Is.” Wired News, August 27.

See also previous articles in HFD on: LifeLog, another DARPA project aimed at capturing everything about a person’s daily life; collaborative agents; and AI bots that help mothers cope with the stresses of caring for cancer-stricken children. The “Storage of Your Life” and “Spyglasses” articles in the Devices section of this issue are also relevant.

Machine-to-Machine Communications

M2M (aka machine-to-machine communication, machine networking, pervasive computing, and control networks) is, like grid computing, a revolutionary, paradigm-shifting technology once it achieves critical mass. A swimming pool chemicals supplier uses M2M to monitor the water quality of swimming pools and keep a central computer constantly updated via cell phone. It not only provides an accessible, permanent record of every pool’s temperature and cleanliness, but also automatically alerts maintenance staff if something needs changing. A fast-food restaurant chain, which already uses an M2M system to keep track of how long it takes to complete a customer’s order, plans to use it also to monitor temperatures inside refrigerators and freezers, to track electricity use, and to determine when it’s time to replace aging equipment. “How do I know when to replace an air conditioning system because it’s so old it’s costing money? With this network, I can make those comparisons,” said a company executive.
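These deployments share a simple telemetry-plus-alert pattern, sketched below (device names and thresholds are hypothetical):

```python
# Toy M2M monitor: devices report readings to a central record; readings
# outside their allowed range also trigger a maintenance alert.

# Hypothetical acceptable ranges per sensor type.
LIMITS = {"pool_ph": (7.2, 7.8), "freezer_temp_c": (-25.0, -15.0)}

log = []     # permanent central record of every reading
alerts = []  # maintenance notifications

def report(device: str, sensor: str, value: float) -> None:
    """Log a reading and alert maintenance if it is out of range."""
    log.append((device, sensor, value))
    lo, hi = LIMITS[sensor]
    if not lo <= value <= hi:
        alerts.append(f"{device}: {sensor}={value} outside [{lo}, {hi}]")

report("pool-17", "pool_ph", 7.5)          # in range: logged only
report("store-3", "freezer_temp_c", -9.0)  # too warm: logged and alerted
```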

Wireless utility meter reading is the biggest application so far. Many analysts predict rapid growth in the next few years. One survey of some two dozen M2M firms found they had 23 million devices linked by wireless networks. Forrester Group predicts that by 2005 there may be as many machines communicating on wireless networks as people. Intel is aiming to put radios on all chips. The move is intended to be a disruptive technology that will remake many existing business models, said a senior Intel executive.

Household appliances are definitely on the radar screen, ready to contact the maintenance firm with descriptions of themselves and their problems, so the repairman knows just what to bring and what to do.

Reference: Van, Jon (2003). “Machine-to-machine talk not stuff of fiction.” Chicago Tribune, September 2.

End of Animal Experiments and Clinical Trials in Sight

A life sciences executive predicts that in ten years grid computing will provide enough power at low enough cost to enable drugs to be modeled in silico against genomic data. The result will be the end of “all animal studies,” and of phases I and II clinical studies. Grid computing makes it unnecessary for institutions to buy supercomputers they might use only occasionally. Instead, they simply draw as much power as they need from the grid, over broadband Internet lines.

The growing computing power results in an avalanche of data — a terabyte of textual information is now being published daily in medical, clinical and scientific journals — with which traditional structured databases cannot cope. Unstructured information search tools under development will solve that problem.

Reference: Gifford, Adam (2003). “Computer modeling future of medicine.” New Zealand Herald, September 3.

See also the article in the August issue of HFD on in silico emulations of biological organisms; in the March issue, on in silico surgery on haptic holograms of a patient that may eliminate the need for experimental in utero surgeries on real mothers and fetuses and lower the human and financial costs of experimentation; in the August issue, on the data glut, which is partly the result of increasing use of in silico experimentation, where binary 1s and 0s simulate the biochemical compounds of traditional in-vitro petri-dish research; and in the June issue, on unstructured information searching.

 
