“Technology is intimidating. Physicians know they have to invest in it, but
they don’t understand it. And they go into it fearing that many of today’s tools
will be outdated in 18 months, or that prices will come down soon after they buy.”
— Rosemarie Nelson, Medical Group Management Association
Postmodern Healthcare
A Harvard professor says the important thing about the Human Genome Project (HGP) was
not that it led to an avalanche of fundamental discoveries about DNA (it did
not); rather, that it transformed much of biology into an information science
that has become known as “systems biology.” Before the rise of informatics,
biologists took years to unravel a single gene; with informatics, systems
biologists can analyze hundreds in a matter of days. Within ten years, an
individual’s entire genome might be analyzed in a matter of hours for less than
$1,000. For comparison, it took 13 years and $2.5 billion to sequence the first
complete human genome.
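To make the scale of that improvement concrete, here is a back-of-the-envelope calculation in Python. The only figures taken from the text are the $2.5 billion and 13 years for the first genome and the sub-$1,000, "matter of hours" prediction; the ten-hour and ten-year values are assumed placeholders used solely to carry the arithmetic.

```python
# Back-of-the-envelope only. Quoted in the text: $2.5 billion and 13 years for
# the first genome; under $1,000 and "a matter of hours" predicted "within ten
# years." The 10-hour and 10-year values below are assumed placeholders.
first_genome_cost = 2.5e9            # US dollars
predicted_cost = 1.0e3               # US dollars
first_genome_hours = 13 * 365 * 24   # 13 years expressed in hours
predicted_hours = 10                 # assumed reading of "a matter of hours"
horizon_years = 10                   # assumed reading of "within ten years"

cost_factor = first_genome_cost / predicted_cost
time_factor = first_genome_hours / predicted_hours
annual_improvement = cost_factor ** (1 / horizon_years)

print(f"Cost reduction:  {cost_factor:,.0f}x")                          # ~2,500,000x
print(f"Time reduction:  {time_factor:,.0f}x")                          # ~11,000x
print(f"Implied cost improvement per year: {annual_improvement:.1f}x")  # ~4.4x
```

On those assumptions, the prediction implies sequencing cost falling by roughly a factor of four every year, which is the kind of exponential pace the rest of this item takes for granted.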
Ever more powerful information-processing tools of systems biology are now
taking us beyond the genome to the very molecules of the proteome, where the
answers to basic questions about life seem to lie.
It is fifty years since Watson and Crick cracked the double helix. The
Harvard professor is reported to believe that in another 50 years, “doctors will
be treating disease by rewriting the genetic code, changing any gene in any cell
in the body at will.” But another top-drawer expert thinks the best we can
expect in 50 years is to be able to “look computationally at molecular profiles
of cancer patients and use that to tailor their treatment.”
Both predictions seem conservative, given:
- How far we have come in 50 years;
- That we are already able to re-write genetic code;
- That the head of the National Cancer Institute has predicted the effective
end of cancer by 2015; and
- The exponentially accelerating power of the computational, imaging, and
manipulative tools of systems biology summarized month after month in this
publication.
Another Harvard man points out that medical informatics’ ability to process
the vast amounts of data required for modern imaging and genetic tests is
already helping physicians quickly determine whether a cancer is growing out of
remission, sparing patients unnecessary treatments and exploratory surgery. He
adds that “the volume of data that will flow through a doctor’s office in the
not-too-distant future will explode,” as “data from genomic and proteomic
applications migrate from researchers’ lab benches and become standard patient
treatment protocols” in “much more tailored medical diagnosis, prescription and
treatment profiles.”
So who is right about the timing? Are we 50 years, or only ten, from
individually tailored genetic medicine? When it comes to making long-range,
strategic decisions affecting a career, an organization, or a major government
policy or program, it would seem important to know whether a predictable event
is ten years away, or 50.
References: McCaffrey, Pat (2003). “Technology’s Impact on
Medicine.” CIO Magazine, Fall/Winter; Enriquez, Juan (2003). “The Data That Defines
Us.” CIO Magazine, Fall/Winter.
The End of Evolution
Have you ever been vaccinated, fitted with an artificial hip or knee, patched
up with synthetic or hybrid biosynthetic skin or factory-produced real skin
grown from cells, or had a tooth crowned? Do you wear contact lenses, harbor a
pacemaker, a stent, or an artificial heart valve in your chest? If any of these
apply, you are a bioengineered being, according to Alan H. Goldstein of Alfred
University in New York, writing in Salon. Our regular readers will
recognize you as a cyborg.
But his topic is more than cyborgs: it is the End of Evolution, and the
beginning of two new species — Homo technicus and Materio
sapiens, whom readers of this Health Futures Digest editor’s articles from a
previous incarnation will recognize as Homo cyborgensis and
Machina sapiens, respectively. Both are engineered, rather than evolved
in the Darwinian way. Goldstein’s key messages are:
First, that bioengineering at nanoscale — atoms and molecules — will
ultimately open the door to creating totally new lifeforms so advanced we may no
more be able to communicate with them than bacteria are able to communicate with us.
Second, that “Bioethicists are disastrously underestimating the trajectory” of
these developments.
That trajectory will involve “completion of the molecular cell biology
lexicon,” which “will come as the culmination of billions of experiments that
integrate and lay bare the complete blueprint of biology.” That number, he
admits, sounds “extreme,” but “robotic combinatorial microchemistry can knock
off millions of reactions a day.” We would interpret Goldstein to mean the
complete understanding of the genome and the proteome, and would point out to
readers that analyzing millions of substances is almost trivial given the latest
microtiter plates, gene chips, and other “labs-on-a-chip” we have reported on
from time to time — not to mention the massive analytical power, speed, and
ubiquity now coming online thanks to the advances in grid computing and data
communications reported elsewhere in this issue. The exponential trajectory of
those supporting technologies makes it a sure thing that millions of experiments
a day will quickly become a billion, then a trillion, then whatever it takes to
arrive at Goldstein’s “complete blueprint of biology.”
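To illustrate how quickly an exponential trajectory closes that gap, here is a toy extrapolation in Python. The starting throughput of a million reactions a day is taken from Goldstein's figure quoted above; the 18-month doubling period is our assumption, chosen only to make the arithmetic concrete.

```python
# Toy extrapolation, not a forecast. Starting point: "millions of reactions a
# day" (taken here as one million). The 18-month doubling period is an
# assumption in the Moore's-law spirit, not a figure from Goldstein.
import math

start_per_day = 1e6
doubling_months = 18

def years_to_reach(target_per_day):
    """Years until throughput grows from start_per_day to target_per_day."""
    doublings_needed = math.log2(target_per_day / start_per_day)
    return doublings_needed * doubling_months / 12

print(f"A billion reactions/day in about {years_to_reach(1e9):.0f} years")    # ~15
print(f"A trillion reactions/day in about {years_to_reach(1e12):.0f} years")  # ~30
```

Even on that modest assumed doubling rate, the thousand-fold and million-fold increases arrive in decades, not a century.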
The only issue on which we would challenge Goldstein is his apparent belief
that it will take a century to get there. We share his urging that bioethicists
(indeed, all ethicists, as well as political, social, business, professional,
academic, and spiritual leaders) catch up with the acceleration and go beyond
tinkering with cloning and (as the President’s Council on Bioethics does in its recent report)
pre-implantation genetic diagnosis to look at the much, much bigger issue
staring us in the face.
Reference: Goldstein, Alan H. (2003). “Invasion
of the high-tech body snatchers.” Salon, September 30.
Boomer, MD
“A number of experts,” according to the Philadelphia Inquirer’s Marian
Uhlman, “fear that the nation is headed toward a doctor shortage” because demand
for healthcare will grow as the baby boomers (including many physicians) age and
are not replaced by younger doctors. Some statistics:
- In Pennsylvania, the percentage of young doctors dropped from 10.7 percent to 5.6
percent between 1989 and 2000 (source: Pennsylvania Medical Society)
- The percentage of young doctors in the workforce appears to have peaked
nationally in about 1980, according to American Medical Association data.
- The percentage of young doctors nationwide dropped from 22 percent in 1990
to 16.6 percent in 2001 (American Medical Association data)

The short-to-medium term (say, five years) situation may be serious; but
given advances in medical technologies that enable paramedics, nurses, and
patients themselves to handle more procedures — and physicians themselves to
become more productive — the long-term prediction that the shortage will amount
to about 85,000 physicians by 2020 (source: the executive director of the Center
for Health Workforce Studies at the State University of New York at Albany)
seems unduly pessimistic. Since it is on such predictions that long-range,
multimillion dollar projects to build new medical schools are based, we need to
be sure they are reasonably accurate and have taken the technological exponent
into account.
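As a rough illustration of that argument, the sketch below asks how much annual productivity growth, from technology and task-shifting combined, would be needed for today's physician workforce to absorb the projected 85,000-physician shortfall by 2020. The 800,000-physician workforce figure and the 17-year horizon are our assumptions for the sake of the arithmetic, not numbers from the article.

```python
# Illustrative arithmetic only. The 85,000 / 2020 shortage figure is quoted in
# the article; the current-workforce size and horizon are assumed ballparks.
physicians_today = 800_000      # assumed approximate US physician workforce
projected_shortage = 85_000     # figure cited for 2020
years_to_2020 = 17              # roughly 2003 to 2020

# Productivity growth rate at which the existing workforce could deliver the
# output of (physicians_today + projected_shortage) physicians by 2020.
capacity_needed = (physicians_today + projected_shortage) / physicians_today
annual_growth = capacity_needed ** (1 / years_to_2020) - 1

print(f"Extra capacity needed by 2020: {capacity_needed - 1:.1%}")    # ~10.6%
print(f"Annual productivity growth required: {annual_growth:.2%}")    # ~0.6%
```

On those assumptions, well under one percent of productivity growth per year would close the gap, which is why the technological exponent matters so much to such forecasts.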
Reference: Uhlman, Marian (2003). “As doctor workforce ages,
a fear of shortage.” Philadelphia Inquirer, October 12.
Dying Industry
“Have laptop, will travel” is a growing fact of life among professionals such
as songwriters and arrangers, who do much of their work sitting on a plane —
not in the studio. “Have laptop, will educate” is also a budding mantra at music
colleges, which see the computer as a better aid to music than the piano.
A laptop plus inexpensive software gives today’s musicians what, 20 years
ago, only a studio could afford and house. An industry magazine editor told
Wired’s Mark McClusky: “The sales of hardware aren’t what they used to be,
and they’re not going to come back. It adds up to big trouble for hardware
manufacturers” and for the studios, which are closing and auctioning off
equipment.
If it can happen to recording studios, why not to (say) medical laboratories?
In a few years laptops or palmtops may have a slot for inserting one’s own gene
chip, and powerful AI-based analytical software that would enable any patient to
do their own lab tests and diagnoses. We saw last month that a CD-ROM drive
could serve to perform DNA analyses for consumers. We also commended one lab
test company for getting ahead of the curve with a timely and successful genetic
analysis component to its business. Its next challenge is to figure out when
the tests only they can perform today will be easily and inexpensively done by
airplane passengers, to while away the time.
Reference: McClusky, Mark (2003). “The Incredible
Shrinking Studio.” Wired News, October 2.
The End of Work
The founder of HowStuffWorks.com urges us to take seriously the
possibility of being put out of work by robots. Extrapolating from the
exponential trends in computing and robotics technology, he asserts that “the
[robotics] revolution is about to accelerate rapidly” (see also “Accelerating
Adoption” in the Robotics section of this issue), and predicts no human
jobs at all by 2040. We could all take a perpetual vacation — but only if we
plan for it, as a civilization. “Unfortunately, in the structure of our current
economy, that is not what will happen,” he says, because the wealth being
generated by robots is more and more concentrated in the hands of fewer people,
with the rest increasingly reduced to living at the charitable or other whim of the
few.
The politics (at least in the United States) are tough, but they will have to
be faced.
Reference: Brain, Marshall (2003). “Relax,
Wage Slaves — Robots Promise You an Endless Vacation.” Los Angeles Times,
October 15.
Flowers of Innovation
Chandler, a program under development by the founder of Lotus, will
automatically assemble all the information on your computer — email messages,
spreadsheets, appointments, addresses, documents, photos, and so on — related to
the task you are performing, and put it on screen ready to use.
When you select a piece of information (an email message, an appointment,
etc.), the program of your choice will open to handle it automatically.
Furthermore, the calendar and contact sharing does not require a central
server.
If it succeeds, it will change the way we interact with our computing
devices. Its chances of success — already not bad, given the record of its
founder — are enhanced by its being an open-source project, with potentially
tens of thousands of gifted and dedicated volunteer programmers contributing to
its development by the proposed December 2004 release date.
Key to the new software is artificial intelligence. Software agents or “bots”
will do much of the work — “postal agents” will help with email, “travel
agents” with booking trips, and “secret agents” with software encryption.
Chandler gives a second wind to software that did something similar many
years ago but never caught on with ordinary PC users, because they had to enter
arcane commands at the Microsoft DOS C:\ prompt. With email, instant
messaging, graphical interfaces, and much more intelligence now built into
programs, it could catch on today. Considering that our interaction with
computers has hardly changed in essence since the first Macintosh, with its
innovative windows interface, rolled out in 1984, it may be high time.
Chandler also brings to mind Lotus HAL, a program that turned
the left-brained 1-2-3 spreadsheet into a right-brained conversationalist, of
sorts. Instead of “@SUM(C15..C23)” you could type “Move right two columns and add
it up,” and it would be done. Perhaps we can look forward to recapturing the
exciting sense of progress of those pre-monopoly days, when a hundred flowers of
innovation bloomed.
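For readers who never met HAL, here is a deliberately tiny Python sketch of the idea: a handful of plain-English commands mapped onto 1-2-3-style formulas. It is entirely hypothetical; HAL's real grammar and implementation are not described in the article.

```python
# A toy, hypothetical illustration of the Lotus HAL idea: translate a small
# set of plain-English commands into 1-2-3-style spreadsheet actions.
import re

NUMBER_WORDS = {"one": 1, "two": 2, "three": 3, "four": 4}

def interpret(command, current_col, rows):
    """Return (new column index, formula or None) for a limited English command."""
    cmd = command.lower()
    move = re.search(r"move (left|right) (\w+) columns?", cmd)
    if move:
        step = NUMBER_WORDS.get(move.group(2), 1)
        current_col += step if move.group(1) == "right" else -step
    formula = None
    if "add it up" in cmd or "sum" in cmd:
        col_letter = chr(ord("A") + current_col)
        formula = f"@SUM({col_letter}{rows[0]}..{col_letter}{rows[1]})"
    return current_col, formula

# Starting in column C (index 2), summing rows 15..23:
col, formula = interpret("Move right two columns and add it up", 2, (15, 23))
print(col, formula)  # 4 @SUM(E15..E23)
```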
Reference: Fitzgerald, Michael (2003). “Trash Your
Desktop.” Technology Review, November.
Open Source and Healthcare
“Open source” means that someone develops something (usually, software) and
licenses it to anyone, for free, to use or develop further. The only stipulation
is that if they do develop it further, they must allow anyone else to use or
further develop the enhanced version. You can see how this might snowball, as it
has in the case of the Apache webserver, the Linux operating
system, and other superb yet free or inexpensive software.
Open-source licensing and methods can be applied to anything, including
medical technologies. One collaborative team has developed an “open source”
intravenous (IV) system that is simple enough for untrained caregivers to use and
costs only about US$1.25 to manufacture. In cholera-stricken third-world villages,
life-saving US$2,000 computerized IVs are hard to find, but the new IVs could be
everywhere a year from now. Thus, open source is relevant to the future of
healthcare, and it is worth understanding it and tracking its progress. Thomas
Goetz’ review of open source in Wired would be a good place to start.
The ideals of open source are not new. They underlie all of post-Newtonian
science and academia; at least, they did, until whittled down by private
interests, patent and copyright laws, compliant politicians, and the Bayh-Dole
Act. That Act allows private entities to pick and choose from scientific and
medical research funded by the taxpayer to the tune of $57 billion a year and in
principle open source, plaster patent and copyright notices all over it, and make
substantial private profit. The taxpayer is thus deprived not just of a share of
the profit and of access to the research, but also of the benefit of innovation,
which the open-source model delivers so demonstrably better than the proprietary
model.
In that sense, open source is reactionary rather than revolutionary; and it
is not (as those threatened by it try to insinuate) anticapitalist. On the
contrary: it fosters “a return to basic free-market principles” of “competition,
creativity, and enterprise,” as Goetz puts it. Such principles are hardly a
threat to capitalism, but they are a threat to established capitalists
comfortable with the status quo, some of whom will stop at nothing to halt or
delay its progress, as we seem to be seeing when Microsoft attacks government
efforts to adopt open-source software, and when the SCO Group attempts to
enforce its strongly disputed claim to ownership of Linux code developed and
shared under an open source license.
But as a method, open source is indeed revolutionary. For example, one of the
founders of PLoS (the Public Library of Science, an open-source alternative to
the closed, proprietary world of scientific publishing) points out that “The
whole premise for that [old academic publishing] model just evaporated with the
Internet. . . . The rules of the game have changed, but the system has failed to
respond.”
“Open source can build around the blockages of the industrial producers of
the 20th century,” a Yale law professor told Goetz. “It can provide a potential
source of knowledge materials from which we can build the culture and economy of
the 21st century.”
Reference: Goetz, Thomas (2003). “Open Source
Everywhere.” Wired, Issue 11.11, November.
Open Standards in Med Tech
Consumer electronics such as TVs and DVD players made by different companies
work together because they all agree on and adhere to interoperability standards
open to all manufacturers. This is not true of CT, MRI, X-ray, and other
diagnostic imaging equipment, which all use different proprietary standards. A
small company called Emageon is trying to change that by selling equipment whose
software is based on open standards such as DICOM (Digital Imaging and
Communications in Medicine) for sharing diagnostic images over networks. The
company’s rapid growth, presumably at the expense of its much bigger rivals such
as GE Medical and Siemens, seems to show that the standards-based approach
resonates with the marketplace.
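To show what the open-standards point means in practice, here is a minimal Python sketch that reads vendor-neutral metadata from a DICOM file using the third-party pydicom library. The file name is hypothetical; any DICOM-conformant toolkit could extract the same fields, which is precisely the interoperability argument.

```python
# Minimal sketch: because DICOM attributes are defined by the standard rather
# than by the scanner vendor, any conformant reader extracts the same fields.
# pydicom is a third-party library; the file name below is hypothetical.
import pydicom

ds = pydicom.dcmread("chest_ct_slice.dcm")

print("Modality:    ", ds.Modality)       # e.g. "CT", regardless of vendor
print("Study date:  ", ds.StudyDate)
print("Manufacturer:", ds.Manufacturer)   # the vendor is just another field
print("Image size:  ", ds.Rows, "x", ds.Columns)
```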
Open standards are a trend, and a necessary one if the spiraling cost of
medical technology is to be moderated.
Reference: Bassing, Tom (2003). “Emageon
vies with big guns.” Birmingham Business Journal, September 26.
Pharmas, Med Tech Firms Become Providers
General Electric makes PET, MRI, and other scanners. Amersham, the company it
recently bought for $9.5 billion, makes solutions injected into patients to make
organs visible under scanners. So far so ordinary. But Amersham also makes
chemical reagents that could be used to develop tests to determine how patients
are responding on a cellular level to disease, and perhaps to predict which
drugs might help, and this is where the real story of the acquisition lies. GE
clearly and explicitly has anticipated the future. Its acquisition “represents
the future,” said a company spokesman. Its CEO said “We’re seeing the emergence
of molecular and personalized medicine,” and another GE spokesman described the
deal as the company’s entry into personalized medicine. Put these together, and
it is clear that GE is adding pharmacogenomics to its portfolio of business
lines.
Together with IBM’s entry
into the medical device business, GE’s entry into the pharmacogenomics
business alongside the traditional drug giants suggests interesting times ahead.
Reference: Herper, Matthew (2003). “GE’s Genomic
Future.” Forbes, October 10.