Computing

On March 21, 2004, in Computing
The first prototype of another serious effort to turn the Internet into a guru
able to answer almost any question put to it has been unveiled. If the question
is merely about someone’s interpersonal relationships as revealed by the
Internet, IBM already has the answer, but it will cost you.

In a less controversial application of massive computing power, Big Computing
is teaming up with Small Pharma to spur the design of new drugs using
computational techniques.

And on a lighter but no less serious note: Intel is poised to deliver the
world’s first mass-market photonic chip
— a development as epochal as the 8088 chip that launched the PC
revolution.

US government rules facilitating the use of the power grid for carrying broadband
communications will spur the development of an alternative to the phone and
cable oligopolies, as might a complete radar system on a speck-sized chip costing
a couple of dollars, which can serve as a broadband communications device as
well as a collision avoidance system for cars.

Speaking of opolies: A new theory rekindles concerns about the danger of the
Microsoft monopoly to security and the economy.

AI: The Vulcan Project

“Project Halo” aims to create an intelligent digital encyclopedia of all
scientific knowledge that will answer questions put to it much as a human expert
might, except it will answer in graphics, not words. Last year, a prototype
containing 70 pages of textbook chemistry knowledge scored 3 out of 5, slightly
better than the average student’s score of 2.8. It cost US$10,000 a page to
input, and it might have scored higher if the computer engineers who built it
had also been chemistry experts. It was good at answering quantitative questions
such as: “If you mix two chemicals in a solution, what will be its pH?” reports
Luke Timmerman in the Seattle Times, but was “not as sharp at
more-qualitative questions, such as: Why is tap water a better conductor than
pure water?”
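
To make that distinction concrete, here is the kind of quantitative question
the prototype reportedly handled well, worked as a minimal Python sketch. The
chemicals and concentrations are our own invented example, not taken from the
Halo test:

```python
import math

# Hypothetical instance of a "mix two chemicals, what is the pH?" question.
# Assumes a strong monoprotic acid and a strong base, complete dissociation,
# and additive volumes.

def ph_of_mixture(acid_molarity, acid_litres, base_molarity, base_litres):
    """pH after mixing a strong monoprotic acid with a strong base."""
    moles_h = acid_molarity * acid_litres    # H+ contributed by the acid
    moles_oh = base_molarity * base_litres   # OH- contributed by the base
    total_volume = acid_litres + base_litres
    net = moles_h - moles_oh
    if net > 0:                              # excess acid
        return -math.log10(net / total_volume)
    if net < 0:                              # excess base
        return 14.0 + math.log10(-net / total_volume)
    return 7.0                               # exact neutralisation

# 0.10 L of 0.1 M HCl mixed with 0.05 L of 0.1 M NaOH:
print(round(ph_of_mixture(0.1, 0.10, 0.1, 0.05), 2))  # ~1.48
```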

A first full edition will take at least a decade to complete. Presumably, if
it were combined with Cyc, a similar
system that has been learning the sum-total of human knowledge for a decade
already, it might be ready sooner.

Reference: Timmerman, Luke (2004). “Vulcan project aims to build ‘Digital
Aristotle.’” Seattle Times, February 12.

Next Generation Data Analysis

IBM collects half a terabyte of data from 250 million new Web pages, blogs,
and chat rooms every week and makes it available to customers of its
WebFountain data-mining and analytical software tool. WebFountain
mines the data to discover patterns. It can do market research without the need
for error-prone polls. It can do more sinister things, too: A security company,
writes Dean Takahashi in the Mercury News, “wanted to be able to predict
for banks whether customers depositing large amounts of cash are connected to
money launderers. WebFountain gathered publicly available information, as
well as a corporate client’s own internal files, about known money launderers.
It then searched through Web data — from newspaper wedding announcements to
high school reunion Web sites — to draw any association between bank customers
and known criminals. If the links show that someone’s wife has a best friend who
is a money launderer, then the bank may have reason to refuse the customer’s
money.”
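
WebFountain’s methods are proprietary, but the association-drawing the article
describes is, at heart, link analysis over a relationship graph: find a chain
of connections between a bank customer and anyone on a watch list. A minimal
sketch of the idea, with all names and relationships invented for illustration:

```python
from collections import deque

# Hypothetical relationship graph mined from public sources
# (wedding announcements, reunion sites, etc.). All names invented.
relationships = {
    "customer_a": ["spouse_b"],
    "spouse_b": ["customer_a", "friend_c"],
    "friend_c": ["spouse_b", "launderer_d"],
    "launderer_d": ["friend_c"],
}
known_launderers = {"launderer_d"}

def association_chain(person, max_hops=3):
    """Shortest chain (if any) linking `person` to a known launderer."""
    queue = deque([[person]])
    seen = {person}
    while queue:
        path = queue.popleft()
        if path[-1] in known_launderers:
            return path
        if len(path) > max_hops:
            continue
        for contact in relationships.get(path[-1], []):
            if contact not in seen:
                seen.add(contact)
                queue.append(path + [contact])
    return None

# ['customer_a', 'spouse_b', 'friend_c', 'launderer_d']
print(association_chain("customer_a"))
```

The bank-refusal scenario in the quote is exactly this: a three-hop chain from
customer to criminal, surfaced automatically from mined public data.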

WebFountain currently appears to be within the financial reach only of
governments, substantial corporations, and the very rich.

It is noteworthy, in the context of spying on American citizens, that a few
months following the demise of the Terrorist Information Awareness, or TIA,
data-mining initiative at the hands of a reportedly alarmed US Congress, the
Defense Advanced Research Projects Agency (DARPA) has now quietly canceled
LifeLog, an ambitious project that would have created a database
recording every event in an individual’s life — what you say, what and who you
see and hear and read, what you do. Computers would analyze relationships
between all the data elements, and give you (or your government or corporate
minder) a detailed memory of your past.

Beyond the contracts lost with LifeLog’s cancellation, AI researchers have
reason to be disappointed at the setback to AI and cognitive science, but not
to worry. “Related Darpa efforts concerning software
secretaries and mechanical brains are still moving ahead as planned,” according
to Noah Shachtman in Wired, and similar projects in the private sector —
such as Microsoft’s MyLifeBits
— continue.

Reference: Takahashi, Dean (2004). “Monster librarian at work.” Mercury News,
February 4.

Reference: Shachtman, Noah (2004). “Pentagon Kills LifeLog Project.” Wired
News, February 4.

Computing Drugs

Locus Pharmaceuticals is partnering with IBM to advance Locus’s
drug-development program for HIV/AIDS. Concurrent Pharmaceuticals is partnering
with Intel to apply machine-learning tools, which enable computers to learn
from the information they hold and analyze, to drug discovery and development.

Both Locus and Concurrent specialize in computational, or computer-assisted,
drug design. Locus’s cluster of 2,500 computers (among the world’s 50 biggest)
can accomplish in weeks what previously would have taken a decade, and now the
company will almost double its computing capacity by being able to access IBM’s
supercomputing-on-demand center.

Concurrent is initially using its computational drug-design technologies to
develop new treatments for hypertension, renal failure, and vascular diseases.
Animal studies of some of its most promising compounds are underway.

Reference: George, John (2004). “The marriage of tech, pharma: IBM, Intel join
with two local drug makers.” Philadelphia Business Journal, January 30.

Intel Goes Photonic

Intel scientists have built a prototype photonic chip out of silicon. It can
be mass-produced in conventional fabrication plants and should lead to
ultra-high-speed but low-cost photonic computing and communication devices by
the end of the decade. Among other things, it should advance the transmission
and functionality of interactive high-definition TV by a quantum step.

The chip encodes data onto a light beam for transmission at more than two
gigabits per second over a fiber network, shattering the previous record of 20
megabits per second for a silicon optical switching device. “The economics of
this have just fallen faster than any communications breakthrough under my
watch,” an expert told the New York Times’ John Markoff. Intel’s president
said: “Think of it either bringing us a tenfold decrease in costs of existing
communications, or 10 times the bandwidth for the same cost.” Other experts,
according to Markoff, think Intel has “achieved one of the rare performance
increases known in the industry as ‘powers of 10,’ or exponential change, that
has the potential to remake the modern computing world.”
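
The “powers of 10” framing checks out against the figures in the article: 20
megabits to 2 gigabits per second is a hundredfold jump, or two full orders of
magnitude. A quick verification in Python:

```python
import math

old_rate = 20e6   # previous silicon optical switch record: 20 Mbit/s
new_rate = 2e9    # Intel prototype: more than 2 Gbit/s

speedup = new_rate / old_rate
print(speedup)               # 100.0
print(math.log10(speedup))   # 2.0 -- two "powers of 10"
```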

Reference: Markoff, John (2004). “Intel Reports a Research Leap to a Faster
Chip.” New York Times, February 12.

Reference: Asaravala, Amit (2004). “Intel Sheds Light on Fiber Optics.” Wired
News, February 17.

Broadband Power Lines

The US Federal Communications Commission (FCC) has begun preparing rules
allowing delivery of the Internet through electric power lines, providing an
alternative to the cable and phone oligopolies — and the first available option
for rural communities not served by broadband providers. A number of trials have
been held, and while it will likely take at least two years for the service to
take hold, the new FCC rules will reduce regulatory uncertainty and encourage
investment.

Reference: Labaton, Stephen (2004). “F.C.C. Begins Rewriting Rules on Delivery
of the Internet.” New York Times, February 13.

Radar on a Chip

Miniaturization is a constant theme and an accelerating trend in this
publication. We have talked about “dust mote” sensors, RFID chips the size of a
grain of sand, and now there is a complete radar system on a single silicon chip
“about one fifteenth the size of a penny,” as New Scientist describes it,
connected to eight minuscule antennae that do not need to rotate as conventional
radar antennae do. Though incredibly complex, the chip is easy to mass-produce
and costs “no more than a few dollars” apiece.

The chip could be used in cars, to detect objects all around the vehicle —
other cars in front in thick fog, a small child hidden behind the car in a
parking lot, a deer approaching from the side at night. It could also be used
for broadband communications ten times faster than DSL, with bit rates of up
to a gigabit per second, making it a feasible wireless LAN alternative. Arrays of the
chips might be powerful enough for use in aviation systems. Having no moving
parts, the system is inherently more reliable than current electromechanical
radars, and more accurate.
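
The ranging principle at the heart of any radar, this chip included, is plain
time-of-flight arithmetic: distance is the echo delay times the speed of
light, halved for the round trip. A minimal illustration (the delay figure is
invented):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_echo(delay_seconds):
    """Target distance from the round-trip delay of a radar pulse."""
    return SPEED_OF_LIGHT * delay_seconds / 2.0

# A car about 15 m ahead returns an echo after roughly 100 nanoseconds:
print(range_from_echo(100e-9))  # ~15.0 metres
```

Steering the beam electronically across the eight antennae, rather than
rotating a dish, is what lets the whole system fit on one still chip.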

The Caltech engineers who made the breakthrough have demonstrated a prototype
with both the transmitter and receiver on the same chip. The remaining
roadblocks are regulatory, not technical.

Reference: Eisenberg, Anne (2004). “Piercing the Fog With a Tiny Chip.” New
York Times, February 26.

Reference: Biever, Celeste (2004). “Tiny radar could make driving safer.” New
Scientist, February 27.

Monoculture Computing

A theory gaining ground posits that Microsoft’s software represents a
“monoculture” “so dangerously pervasive that a virus capable of exploiting even
a single flaw in its operating systems could wreak havoc,” reports the Associated
Press. In biology, a “monoculture” is a species with little genetic variation
and therefore catastrophically vulnerable to epidemics. A single virus could
wipe out an entire monoculture species. Genetic diversity increases the odds
that at least some members of a species will survive any attack.

Microsoft counters that true diversity would require thousands of different
operating systems. We think that this, like many Microsoft arguments including
ones discredited by courts, is patent nonsense intended to obfuscate and
mislead. True diversity in software requires only that a single operating
system have many different variations and be open to further variation, just
as true diversity in a species requires only that the species have genetic
variation and be open to further genetic variation. Linux conspicuously meets
those requirements; Windows conspicuously does not.
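
The arithmetic behind the monoculture worry is blunt: the fraction of machines
a single exploit can reach is simply the market share of the vulnerable
platform. A toy sketch, with market-share figures invented purely for
illustration and the simplifying assumption that one flaw affects every
machine running the same software:

```python
# Worst-case reach of a single-flaw virus under two software ecosystems.
# All shares are invented for illustration.
monoculture = {"windows": 0.95, "mac": 0.03, "linux": 0.02}
diverse = {f"variant_{i}": 0.10 for i in range(10)}

def worst_case_exposure(shares):
    """Largest population share any single exploit could reach."""
    return max(shares.values())

print(worst_case_exposure(monoculture))  # 0.95 -- near-total havoc
print(worst_case_exposure(diverse))      # 0.10 -- damage is contained
```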

While all seem to agree that the monoculture analogy is not perfect, it has
sufficient isomorphism with reality to merit — and perhaps to explain — the
growing efforts to re-introduce software diversity and competition through the
open-source movement. It is a critically important issue for CIOs in healthcare,
as in other industries, because of the demonstrated and persistent vulnerability
of Microsoft software to systems-crippling breakdown and attack.

Reference: Associated Press (2004). “Warning: Microsoft ‘Monoculture.’” Wired
News, February 15.

