A fellow futurist asserts that while any specific future is dynamic and therefore unknowable, every decision and every plan involves prediction. Yet plans and decisions made by healthcare institutions are typically based on denial, bluster, or panic about the predictions.
Take proton beam therapy for cancer, for example. Everything coming out of cancer research today says that by the time you have spent US$125 million and five years building a proton beam therapy center, your patients and doctors will have moved on to targeted molecular approaches instead, but some institutions are still looking to buy proton beam therapy machines. Nobody can say with certainty that in five years targeted drugs will be ready and able to do what proton beams can do today, but there are tools to help deal with such decisions: scenario planning, “lean production” techniques, and constant learning practices. Scenarios in this example would certainly include ones in which a viable and less expensive alternative to proton beam therapy, a cheaper or better proton beam machine, or expanded indications for proton beam therapy arrives on the market within the next 5-10 years, and “constant learning” would enable the institution to assess dynamically (at every point in time) the likelihood of each scenario becoming a reality. Decisions on expensive, long-term capital investments should be flexible enough to be changed quickly if the situation changes.

* * *

Another healthcare writer asserts that a system of accessible standardized patient encounter records is a higher priority than a network of interoperable electronic medical records (EMRs). These, it seems to us, are essentially the same thing. What the writer seems to be driving at — and with this we would agree — is that an interoperable network of EMRs will enable a new kind of evidence-based medicine (EBM), one based on the results obtained by mining the EMR network database for evidence of what diagnoses and treatments work in practice, at much finer grain than can be achieved by EBM based on evidence from clinical studies.

Other practice-related news:
Mainframe Medicine
The University of Texas M.D. Anderson Cancer Center has a new $125 million Proton Therapy Center up and running. Occupying 94,000 square feet, it is the largest of four such centers in the US (the first hospital-based facility opened in 1990 at Loma Linda University Medical Center in California). The new center has five treatment rooms, behind three of which are what Juan Lozano of the Associated Press describes as “steel barrels 3 stories high and weighing 190 tons. They house bending magnets, electrical wires and monitors that work with a tubular device called an injector and a compact particle accelerator to create and energize the protons and send them into a patient’s tumor.” Proton beams can precisely target a tumor with higher levels of radiation than other technologies, yet with less damage to healthy tissue and fewer side effects. That makes them especially good for treating children, who are more sensitive than adults to the side effects of radiation. Proton therapy is covered by Medicare and most insurance companies, but is about three times more expensive than traditional radiation, and “Some doctors worry that the benefits to a few cancers don’t outweigh the enormous costs, especially when recent advances in traditional radiation make it safer to use,” writes Lozano. He cites an oncologist at Fox Chase Cancer Center in Philadelphia, who told him that while proton therapy has an advantage in treating relatively rare cancers such as those in children or of the spinal cord, more study is needed to find out whether it is more effective for common cancers, such as prostate and lung, than the newer, cheaper forms of traditional radiation. He notes also that despite evidence of some benefit to patients, two Johns Hopkins doctors questioned whether patients should accept the modest but real incremental risk of higher radiation doses for the uncertain ultimate benefit derived.
Understandably, a Florida Proton Therapy Institute official begged to differ, noting that reducing radiation’s side effects could translate into lower health care costs in the long run and opining that proton therapy would become “a part of mainstream radiation oncology if we fully embrace its advantages.”

[Editorial comment: It is not quite true, as healthcare consultant Joe Weber asserts, that “no one” is practicing the expanded kind of evidence-based medicine (EBM) that draws its evidence from electronic medical records (EMRs) as opposed to just study data. Mayo is doing it with IBM’s UIMA. Kaiser is doing it with David Eddy’s Archimedes program. I want my employer, the DMC, to prepare for doing it after we have fully implemented our EMR system and built up a sufficient volume of records. The collective wisdom also argues against Weber’s contention that records-based EBM is more important than a lifelong EMR. Both are important. The practical differences are: (1) Interoperable EMR technology is more feasible than records-based EBM, which requires sophisticated AI (like Archimedes, which is just a beginning) for the mining process, and (2) the kicker — records-based EBM must by definition rely on an interoperable EMR system for the data to mine. How else is a solo physician going to be able to mine data unless s/he has access to a much bigger record set than just his or her own patients? We find one more contradiction: Weber essentially says that a sufficiently advanced records-based EBM system could make better diagnoses and treatment decisions (which, surely, constitute the art of medicine) than a physician, and that this will free up the physician to practice the art of medicine! We agree with the first part of that statement, but fail to see how the latter part logically follows.
We have composed this critique — lengthy for a Digest commentary — because while Weber does a service in highlighting the benefits of records-based EBM, he risks a disservice to the extent his article persuades anyone to delay implementing an EMR system or an interoperable EMR network on the grounds that either one of these is not the highest priority. In our opinion, both are.]

“The overriding problem with health care is not the absence of universal access to patient information” [i.e., through a network of interoperable EMR systems], opines healthcare writer and consultant Joe Weber in an article in H&HN’s Most Wired online magazine. “Rather it is the absence of a sufficient amount of clinical knowledge available in the brain of the caregiver.” In other words, evidence-based medicine is more important than interoperable EMRs. “Focusing on interoperability is putting the cart before the horse,” he says. “The challenge is figuring out how to put a patient, a physician and a knowledge-enabled computer in a room and create optimal care.” His solution has patients documenting their “symptoms and other relevant information related to the current complaint” on the Web from home or in the doctor’s office, via a structured, branching questionnaire.
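A structured, branching questionnaire of the kind Weber describes can be pictured as a small decision tree: each node asks one question and routes to the next based on the answer, so every encounter yields uniformly codified data. The sketch below is purely illustrative — the questions, answer flow, and names are our invention, not Weber’s implementation:

```python
# Hypothetical sketch (not Weber's actual system) of a structured,
# branching patient questionnaire: each node asks one question and
# routes to the next based on the answer, yielding codified data.
from dataclasses import dataclass, field


@dataclass
class Question:
    text: str
    # Maps an answer to the follow-up Question, or to None if the branch ends.
    branches: dict = field(default_factory=dict)


def take_history(root, answers):
    """Walk the tree with the patient's answers; return the codified encounter."""
    record = []
    node = root
    while node is not None:
        answer = answers[node.text]
        record.append((node.text, answer))
        node = node.branches.get(answer)
    return record


# A toy fragment for a chest-pain complaint (questions and flow are invented).
exertion = Question("Is the pain worse with exertion?", {"yes": None, "no": None})
chest = Question("Do you have chest pain?", {"yes": exertion, "no": None})

encounter = take_history(chest, {"Do you have chest pain?": "yes",
                                 "Is the pain worse with exertion?": "yes"})
# encounter -> [("Do you have chest pain?", "yes"),
#               ("Is the pain worse with exertion?", "yes")]
```

Because every answer is recorded against a fixed question, such records are uniform across patients — the precondition for the records-based mining Weber’s argument depends on.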
This “patient-delivered history could be utilized for triage or even Web visits, leading to greater efficiency,” and it will “conserve considerable time during the encounter, … save the physician perhaps half of the documentation time … and result in a much more comprehensive and useful description of the present illness than the traditional patient-physician question-and-answer session is likely to yield.” This is interesting and probably viable, as so many (though by no means all) people are now Internet- and computer-savvy enough to enter their own health data, but the most important point (it seems to us) he makes is that if all that patient-entered data were mine-able, a computer could then effectively diagnose and prescribe. Such a system, he says, would require “a standardized clinical vocabulary, acquiring codified data on all encounters, mandating the documentation of outcomes and analyzing the results to learn from experience.” If a physician chose to deviate from the computer’s diagnosis or treatment plan, s/he would have to document the deviation and the reasons for it. So with the physician guided by the system on “what data to collect, what tests to order, what diagnoses to consider and what treatments to provide,” what’s left for her or him to do? Why, of course! — do “what they do best — practice the art of medicine . . . [and] do it from on top of a much higher platform.” “The challenge,” he admits, will be “convincing physicians. Their skills in taking a history are profoundly tied to their sense of professional identity. Nonetheless, if they are presented with irrefutable evidence that this new paradigm results in higher quality, more cost-effective care, many of them will be willing and able to alter their behavior.”

A small company in Missouri has created an evidence-based medicine (EBM) computer program to assist doctors making diagnostic and treatment decisions.
The “evidence” consists of results from studies fed to the computer by a staff of medical professionals, including pharmacists and physicians. Doctors enter patient information, and the system returns information on “tests that work, tests that don’t work, remedies that work, remedies that are used but don’t work, and even remedies that many doctors didn’t know would work,” writes Harry Jackson Jr. in the St. Louis Post-Dispatch. “Since one group of physicians started using this program a few years ago, their use of generic drug prescriptions jumped to 69 percent from 52 percent. They’ve cut medical costs for patients by more than [US]$1,200 per doctor per month by increasing the use of generic drugs and cutting down on tests and procedures that the science says generally don’t work or aren’t necessary,” he writes.

Prospects for Personalized Medicine

Cancer drug Herceptin is an example of personalized medicine, notes Jon Van in the Baltimore Sun. But it is only personal to the 20 to 30 percent of women whose tumors produce high levels of the proteins targeted by the drug. The executive director of the Personalized Medicine Coalition, whose members include biotech and big pharmaceutical firms, said “… we’re optimistic the rate of progress will accelerate in the near future.” Is the optimism justified? Currently, researchers rather than drug and biotech companies are taking the lead in getting personalized treatments to patients. Van quotes Dr. Edwin Stone, who studies genetic abnormalities associated with eye diseases: “We’d do our research, write the results and publish in medical journals, [and] . . . figured our job was done. Someone else would develop tests and market them to physicians.” But that never happened, “because the market is just too small.” So Stone and a colleague themselves created genetic tests and offered them to primary-care physicians.
Their laboratory charges only for materials and technician time needed to perform tests, but this will not be sustainable when the grants run out. Van also cites the case of bucindolol, a drug that failed clinical trials as a treatment for heart failure but did help a subset of patients with a particular genetic variation. A researcher therefore formed a company to market bucindolol in conjunction with a genetic test to identify such patients. The chief of a genetics advocacy and policy organization described the situation as “a Catch-22: The more we learn about an illness, the more its market shrinks, making it less profitable for a drug company to target.” A university biotech research director said that the balkanization of the drug market offers “the prospect of a great revolution in medicine,” but one “that’s not economically viable for industry.” “In the long run,” said another researcher, “we will find a business model to make this work, but it will take some creativity.”

“The United States requires airline pilots to step down at age 60, but there is no mandatory retirement age for surgeons, who — like pilots — hold life in their hands,” notes Associated Press reporter Carla Johnson, writing about a study which found that for complex surgeries requiring fine stitching — such as pancreas removal, heart bypass, and carotid endarterectomies — doctors older than 60 had higher patient death rates, especially if they did not perform many of the procedures. The study’s authors suggested that when approaching retirement age, surgeons should either stop doing such procedures or maintain a high caseload to keep their skills up to scratch, and that patients should ask how many procedures a surgeon does a year.
Hospital administrators should not rely on decennial recertification exams — which gauge knowledge, not physical skills or technical mastery — but should step in if a surgeon does not voluntarily stop practicing when his or her morbidity and mortality rates start to rise. The president of the American Board of Medical Specialties speculated that surgeons might in the future have to pass skills tests on surgical simulators. The percentage of working doctors 65 and older climbed from 13 percent in 1975 to 18 percent in 2004, according to data from the American Medical Association, and the percentage may rise as the Baby Boom strikes.

Large-scale outsourcing of radiology “reads” to India and other countries is a myth, reports David Leonhardt in the New York Times. One US senator had said on ABC TV that thousands of American radiologists would lose their jobs, an economist said on NPR that the pay of radiologists was already suffering, an adviser to President Bush suggested that fewer medical students would enter the field in the future, and a US congressman said “We’re losing radiologists.” But MIT economist Frank Levy and colleagues, intrigued by the dearth of stories about “actual Indian radiologists,” set out to investigate. “In the end,” writes Leonhardt, “they were able to find exactly one company in India that was reading images from American patients. It employs three radiologists.” There may be others, but 20 would probably be “an overestimate,” Levy told him. Radiology outsourcing has not taken off, writes Leonhardt, because “Unlike software engineers, textile workers or credit card customer service employees, doctors have enough political power to erect trade barriers, and they have built some very effective ones,” including the de facto requirement that radiologists usually must have been trained in the US before they can obtain board certification and a state license to practice.
“It’s as if,” he continues, “the law required cars sold here to have been made by the graduates of an American high school. Much as the United Automobile Workers might love such a law, Americans would never tolerate it, because it would drive up the price of cars and keep us from enjoying innovations that happened to come from overseas.” But that is “precisely what health care protectionism does . . . . It keeps out competition.” He concludes: “When factory workers have asked for that kind of protection, the country has told them no. So why does the answer change when the request comes from a wealthier, more influential group of workers?” It’s a fair question.

There is big money in heart devices, writes Barry Meier in the New York Times, for the companies that make them and the physicians who implant them. Increasingly, cardiologists are venturing into this rich turf once owned by their sub-specialty colleagues, electrophysiologists. In South Carolina, some cardiologists learn to implant defibrillators in ten patients by paying US$700 for weekend sessions sponsored by the Heart Rhythm Society, a professional medical organization, and taking a daylong written test afterwards. But others implant 75 devices for free in a training program tailored to their convenience by a defibrillator maker, and they generally do not bother to take the voluntary daylong competency test recommended and offered by the Heart Rhythm Society. Of 250 doctors who did take the test last year, one third failed it. Though patient deaths during defibrillator implants are extremely rare (about 0.03 percent), the New York Times found that of the first 45,000 implants reported to the Medicare database, the death rate for patients of nonelectrophysiologists was 1.5 times that for patients of the specialists, and that the complication rate was slightly higher as well.
On the one hand, the more practiced the physician, the better for the patient (probably); on the other hand, the suspect quality of free training from device makers, and the obvious conflict of interest the device makers create for doctors and hospitals that accept their largesse, are potentially bad for the patient.

Animal vs. In silico Experimentation

The number of experiments carried out on animals in UK laboratories in 2005 was up 1.4 percent over 2004, at 2.9 million. Most involved rodents, the rest mainly fish and birds, but the number of procedures using non-human primates was up 11 percent over 2004. Between 1974 and 1996, the number of procedures on animals fell annually. But since 2000, the number of tests has been rising by an average of 1-2 percent per year: a trend, reports the BBC, “in large part due to the use of genetically-modified animals.” The number of tests involving normal animals has been falling steadily since 1995. Animal welfare groups say that the downward trend in overall procedures that started in the 1970s is showing signs of a significant reverse. In 2004, the British government established a National Centre for the Replacement, Refinement and Reduction of Animals in Research, whose goals are to reduce suffering and find replacement methods that do not involve animals. The Centre’s chief executive told BBC News science reporter Rebecca Morelle that the Centre is considering at what point science might finally be able to give up using animals for research, but would not venture a guess. A Medical Research Council official told her, “If there were better models then of course one would use them, but they often provide the only way that we can get at certain information.” And just what is the scientific community doing to find these better models? Some progress has been made in computer modelling [systems biology] of virtual cells, organs, and even whole organisms.
There are currently computer models of pancreatic cells, various cells and tissues in the lung, kidney function, liver function, the musculoskeletal system, and nervous systems. One of the most advanced models is a virtual heart, created at Oxford University, that beats, can have a heart attack, and can reveal the effects potential drug compounds have on it. But it and other models are still a long way from an exact organ replica, a researcher told her: “Some people say to me when they see the virtual heart beating on the screen, ‘What is there left to do? It looks so realistic.’ Yes, it does look realistic in terms of reconstructing a heart attack, for example, but we suspect we have modelled only about 2% of the genes and proteins involved in that activity and there are 98% still to go.” In vitro work is another way to avoid animal experiments. A biologist described it as “an excellent way of replicating simple systems,” and said it was “also very cheap and cost effective. You can grow some cells in a lab and it will cost tens of pounds, as opposed to housing animals under strict conditions that can cost hundreds, if not thousands, of pounds.” Imaging is another alternative in some cases. MRI and PET scans can provide “a lot more information on many fewer animals than we would get through invasive procedures, and of course there is a lot less suffering,” a researcher said. For example, to understand what controls appetite and satiety — and thereby to understand obesity — researchers traditionally dissect animal (usually mouse) brains, but some are now using in vivo imaging instead to look at the areas of the brain related to hunger and satiety. Imaging also allows researchers to investigate the same animals over and over again, instead of using potentially hundreds of animals for just one experiment, and to follow them throughout their lives.
The least we can do, said an official from the Royal Society for the Prevention of Cruelty to Animals (RSPCA), is to minimise suffering in the animals used for testing, such as by giving them a comforting living environment.

Healthcare consultant and writer Joe Flower notes that the explosive growth in interventional cardiovascular services sprang “from a simple exercise in futurism.” The baby boomers are coming, with their heart problems in tow. So it’s “a slam dunk — build a cath lab, new surgery suites, more recovery space.” Not exactly. “Suddenly the market is filling with 64-slice CTs. Even faster 128-slice and 256-slice machines are under development. Suddenly many of the diagnostic tasks that filled those cath labs and recovery rooms are becoming noninvasive and nearly instant scans, many of them done in free-standing centers. At the same time, stenting is being strongly challenged by emerging pharmaceuticals. . . . At the very moment you open the doors on that new cardiovascular center, your utilization projections are curdling.” Every business plan, budget, new program, capital plan, building design, even every hire is a prediction, he says, that depends on a vision of the future — “often a vision that is overly simplistic, overly attuned to present needs, often, in fact, just an expanded version of the present.” Managed care and massive consolidation are examples of things that never came to pass to the extent or with the effect predicted. There is no absolute future, though we often act as though there were. We deny the possibility of one, or claim to be ready for it, or we panic. These are cries for absolute prediction, but there can be none. The future is dynamic: you cannot pin it down. Instead, you can and should do three things: scenario planning, “lean production,” and constant learning.
As Darwin pointed out, “It is not the strongest of the species that survive, nor the most intelligent, but those most responsive to change.”
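The “constant learning” practice advocated at the top of this issue — dynamically reassessing, at every point in time, the likelihood of each scenario becoming reality — can be sketched as a simple Bayesian update. The scenarios, priors, and likelihood judgments below are invented for illustration only:

```python
# Hypothetical sketch of "constant learning": keep a probability for each
# scenario and update it by Bayes' rule as evidence arrives. The scenario
# names, priors, and likelihoods below are invented for illustration.

def update(priors, likelihoods):
    """Posterior probability of each scenario, proportional to prior x likelihood."""
    unnormalized = {s: priors[s] * likelihoods[s] for s in priors}
    total = sum(unnormalized.values())
    return {s: p / total for s, p in unnormalized.items()}


priors = {
    "cheaper alternative emerges": 0.3,
    "proton indications expand": 0.3,
    "status quo holds": 0.4,
}

# Suppose a promising targeted-drug trial result is reported; judge how
# likely that evidence would be under each scenario.
likelihoods = {
    "cheaper alternative emerges": 0.8,
    "proton indications expand": 0.2,
    "status quo holds": 0.3,
}

posteriors = update(priors, likelihoods)
# The "cheaper alternative emerges" scenario rises from 0.30 to about 0.57.
```

Each time new evidence arrives — a trial result, a price announcement, a coverage decision — the institution reruns the update; when a scenario’s probability crosses a pre-agreed threshold, the flexible capital plan is revisited.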