
Jul 09 2014

Underground Railroad and Indian Tomahawk Express Cards from Clayton Curtis

Published under Uncategorized

Clayton Curtis Card

I got these photos of some of the original membership cards issued to Clayton Curtis, MD, who is currently with the VA's Health Informatics Knowledge Based Systems group and serves as the VHA-Indian Health Service Interagency Liaison for Health IT Sharing.

There is a long story to be told here, but the bottom line is that the VA and the Indian Health Service have been collaborating for 30 years now, while DoD worked really hard to make its systems incompatible. I was an informal consultant to the IHS while I was at the VA, and found it to be a very dedicated, but underfunded, agency. So the cooperation made a lot of sense.

When I went to work on the DoD version of the software (called Composite Health Care System), things were completely different.  They stripped out the communication capabilities I wanted to use to coordinate systems, and made other changes that would make the DoD version incompatible with the VA.

I wish this could be written off as ancient history, but I don’t think so.  DoD is continuing to do its thing with an $11B “rip and replace” waterfall effort, while VA seeks to take an evolutionary approach.

There has to be a better way. For starters, the folks in Washington should recognize the power of the informal organizations operating within their formal organization charts.


Jun 09 2014

The Dangers of Traffic Gridlock During Wildfire Evacuations in San Diego

Published under Wildfires

Cocos Fire from Olivenhain

The most remarkable aspect of the 2007 wildfire evacuations of San Diego was how orderly everything was. Drivers seemed to be more polite with each other than normal. Talk to someone who has been through a wildfire evacuation and they will likely have some stories to tell about community cooperation, a sense of pulling together. For example, in the May 2014 fires, the Helen Woodward Animal Center put out a request for horse trailers to move its horses to safety, and was immediately met with a flood of volunteers. The San Diego Union-Tribune (June 7: Cocos Fire Jam to be evaluated) quoted a San Elijo resident about his experience trying to evacuate during the recent Cocos Fire.

Longtime resident Dustin Smith said he packed up his pets and headed off about 4:15 p.m., but couldn’t leave his gated Promontory Ridge community. In front of him was a line of vehicles backed up even before the gate…. He said he gave up, tried again an hour later but found the same situation. Tried again shortly after 6 p.m. and finally found roads clear enough to leave.

Being blocked from leaving your home for two hours under any circumstances is a really bad thing, but it is particularly terrifying when there is a wildfire raging nearby and you don't know where it will go. My daughter was evacuated earlier in the day from her office at the corner of El Camino Real and Palomar Airport Road. Traffic on the road was so bad that it took her 30 minutes just to get out of her parking lot. She called 911 to see if they could send some traffic control police to help, but the dispatcher just said, "Sorry, all of our officers are busy with other aspects of the fire."

Google Maps traffic reporting was very helpful, and gave citizens a great way to see what was happening and adapt to the traffic flow dynamically. For example, my wife and I were babysitting our grandchildren May 14th, and I was driving to our normal rendezvous with my son-in-law for a 4:00 handoff in San Marcos. I was planning on driving over San Elijo Road to Twin Oaks when I got a call from him, saying he saw a fire starting near Twin Oaks (it would become the Cocos Fire). We both knew that this could block the road and cause havoc with the traffic flow between us. So we both turned around and went home, watching the fire expand, but also noticing on Google Maps that traffic on Del Dios Highway was clear. My daughter came by around 8 that night, and we had an uneventful handoff.

Unfortunately, the past few decades have seen an upsurge in NIMBY activists who fight roads in their areas, creating a patchwork of unconnected roads with long cul-de-sacs. Perhaps the risk of traffic gridlock might reverse some of these attitudes, or at least give fire safety officials a stronger position from which to demand better ingress and egress.


May 23 2014

Why you shouldn’t try to defend your house against a wildfire with a garden hose.

Published by under Wildfires

I live in an area where wildfires are part of nature. I am also an "early evacuator," happy to get out of the way of any potential fire hazard if one is coming my way. I know that there are others who want to stay back and defend their homes, typically with a garden hose.

Here is a photo sequence from the recent fires that proves my point. It shows an ember gaining a foothold on a hillside, then engulfing the whole hillside and creating a 100-foot-tall fire tornado within 15 minutes.

Jeff Anderson, Elfin Forest Recreational Reserve park ranger, took these remarkable images during the recent “Cocos Fire” in North San Diego County.  The fire started quickly in the late afternoon of May 14. The next morning (May 15), it seemed to be fairly tame until about noon.  Then it flared up with a vengeance.  My home was about 1000 feet downwind of the evacuation zone, and we could smell the smoke passing over us, so we were intensely focused on what was happening.

Jeff was on the ridge of the Elfin Forest reserve looking north towards Harmony Grove when he snapped this photo of an ember burning at 12:25:24pm on May 15:

Fire 5-14_254

Just two minutes later, at 12:27:30pm, the fire from the original ember had spread considerably, and another ember had jumped up the hill:

Fire 5-14_256

Four and a half minutes later, at 12:32:07, the fire had engulfed the whole side of the hill. At this point, the flames were probably burning at 1200-1600 degrees F.

Fire 5-14_262

Eight minutes later, at 12:40:05, the fire had generated a "fire tornado" about 100 feet high, with winds of 50-80 mph. The temperature at the base of the tornado was probably about 2000 degrees F, roughly one fifth the temperature of the surface of the sun, and the fire was generating its own wind:

Fire 5-14_275

I would ask those who would try to defend their house by playing Rambo with a garden hose how long they think they would last in the midst of that inferno. It's not a matter of your skill or machismo; it's simply a recognition of the overwhelming power of nature.

We also need to recognize that wildfires are a natural part of the ecosystem. We even have a flower, the Fire Poppy, that germinates after wildfires. Here is a picture I took of a fire poppy at Lake Poway, six months after the area had burned in the 2007 Witch Creek Fire:

Fire Poppy


Mar 10 2014

1986 Letter from House VA Committee calling for increased metadata sharing

Here is a 1986 letter from Rep. Sonny Montgomery, chair of the House VA Committee, to VA Administrator Thomas Turnage about metadata sharing.

Note that, even in 1986, the Committee on Veterans' Affairs was savvy to, and advocating, the use of metadata (then called the "data dictionary," a roadmap to the database). It understood its use in VistA (then called DHCP), its role in portability (then with the Indian Health Service), and its hoped-for use in the Department of Defense's Composite Health Care System.

Today, metadata is a household word, given the NSA’s use of it.  But it reflects an entirely different perspective on how we view complex systems.

Imagine a complex system, represented by millions of dots, with even more connectors between the dots.  We can think of the dots as representing the “data” in the system, and the connectors (links) representing the “metadata” in the system.

This perspective generates an overwhelming number of dots and links, well beyond any human capacity to understand.

One way to approach this complexity I'll call the "Dots-first" approach. It tries to categorize the dots, pigeonholing them into a predefined hierarchy of terms: "A place for every dot, and every dot in its place." This goes back to Aristotle and the law of the excluded middle: something is either A or not-A, but not both. We just keep applying this "law" progressively until we get a tidy Aristotelian hierarchy of categories. Libraries filed their books this way, according to the Dewey Decimal System. If you wanted to find a book, you could look in a card catalog by title, author, or subject, then just go to the shelves to find the book. The links between the dots are largely ignored. For example, it would be impossible to maintain the card catalog by all the subjects referenced in all the books, or all of the references to other books and papers. Order is maintained by ignoring links that don't fit the cataloging/indexing system.

An alternative I'll call the "Links-first" approach. It focuses on the links, not the dots. It revels in lots of links, and manages them at a metadata level, maintaining the context of the information. It can work with the Dots-first categorization schemes, but it doesn't need them. This is the approach taken by Google: it scans the web, indexing information, growing the context of each dot with every new link established.
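
To make the contrast concrete, here is a minimal sketch of a links-first store (a Python toy with made-up book data, not anything from VistA or Google): every fact is just a (dot, link, dot) triple, indexed from both ends, so a dot's context is simply the set of links that touch it.

    # A minimal "links-first" store: facts are (subject, link, object) triples.
    # Dots have no fixed pigeonhole; context accumulates as links are added.
    from collections import defaultdict

    class LinkStore:
        def __init__(self):
            self.by_dot = defaultdict(set)   # dot -> triples touching it

        def add(self, subject, link, obj):
            triple = (subject, link, obj)
            self.by_dot[subject].add(triple)
            self.by_dot[obj].add(triple)

        def context(self, dot):
            """Everything known about a dot is the set of links touching it."""
            return self.by_dot[dot]

    store = LinkStore()
    store.add("Moby-Dick", "author", "Herman Melville")
    store.add("Moby-Dick", "subject", "whaling")
    # A Dewey Decimal number, if one exists, is just one more link:
    store.add("Moby-Dick", "dewey", "813.3")

    for triple in sorted(store.context("Moby-Dick")):
        print(triple)

Note that the Dewey number is not the organizing principle here; it is demoted to one link among many, which is exactly the point.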

If a book had a Dewey Decimal System number assigned to it, Google would pick it up as just another piece of metadata. Users could search for the book using it, but why would they? Why revert to the "every dot in its place and a place for every dot" scheme when you can use the much richer contextual search that Google provides?

Sonny Montgomery – in 1986 – was advocating the “Links-first” approach that we pioneered in VistA.   This approach came up again in the metadata discussions of the PCAST report.

Bureaucracies typically favor focusing on the dots. If a Dewey Decimal System isn't working well enough, the solution is to add more digits of precision to it, more librarians to catalog the books, and larger staffs, standards committees, and regulations to ensure that the dots all stay in their assigned pigeonholes.

This is what is happening with ICD10 today. After the October 2014 rollout, we will have the ability to differentiate "W59.21 Bitten by turtle" and "W59.22 Struck by turtle" as two distinct dots in the medical information universe. Unfortunately, we lack dots to name tortoises, armadillos, or possums. Struck By Orca (the name of a book as well as an ICD10 code) provides some artistic insight into the new coding system.

The continued expectation that we can understand medicine from a "Dots-first" approach is a travesty in today's world of interconnection, rapidly growing knowledge, life-science discoveries, and personalization. People use Google, not card catalogs, to find their information, and do so in a much richer, quicker, and more informative way than anything before in human history.

The "Dots-first" thinkers will panic at the emergence of a "Links-first" metadata approach. How can we establish order if we don't have experts reviewing the books, applying international standards, and librarians carefully typing and filing the catalogs?

One of the criticisms in the early days of VistA was that its metadata-driven model would lead to "Helter Skelter" development, and that only centralization could make things orderly. (Helter Skelter was the name of the Charles Manson murder movie at the time, so the term carried a lot of linguistic baggage with it.) The critics could see only the Dots-first framework, and the ensuing failures of centralized, waterfall development on $100m+ megaprojects have continually proven that their approach doesn't work. Yet they continue to blame their failures on the decentralized, metadata-driven core of the system.

There are technologies that address this, such as the Semantic Web or Linked Data initiatives.  But I’m afraid that there is so much money to be made “improving” the medical Dewey Decimal Systems and patching up all the holes in the Dots-first kludges that it seems to be a tremendous uphill battle.


Dec 11 2013

Gutenberg, Genomics, and the Literacy/Literature Spiral

Published under genomics

Imagine someone at the time of Gutenberg seeing his converted wine press capable of printing books.  “Who needs all these books?  No one can read them, anyway,” they might think.

The full impact of the literacy/literature spiral this invention triggered would have been impossible to predict. The Bible would be translated into "vulgar" languages, no longer requiring the translation (and interpretation) services of the priestly class to control the flow of information to the lay people.

Medical Illustration, circa 1604.

This remarkable upward spiral of knowledge was a chicken-and-egg situation. It required literature for people to read, but it required literate people to produce the literature. It played a key role in the Age of Enlightenment as well as the Scientific Revolution.

Since Gutenberg's invention, millions of books have been printed. For better or worse, from Shakespeare to Hitler, many changed the world. Many books were redundant, many were just plain wrong, but the overarching principle of freedom of the press became a hallmark of what many consider a civilized society today.

Fast forward 550 years to today's genomic revolution. Like the printing press, we are witnessing a new technology that presents us with vast potential. And like the book, we are facing a new literacy/literature spiral. And we are facing a priestly class that seeks to interpose itself between our information and our use of it.

The notion that a federal agency can control information about our genes, and assign professional "gatekeepers" to control and interpret this information, is absurd.

Jonas Salk pointed out that we are the first generations of the first species to have evolved to understand our own evolutionary makeup.  In Survival of the Wisest, he points to the need for a new understanding of our role in evolution, what he called conscious evolution.

I don't think that this wisdom is going to come from federal agencies. Nor do I think that depriving people of access to their own genetic information is going to advance the cause of science, or continue the principles of the Age of Enlightenment that have contributed so much to the advance of civilization.

We need to embrace a new life science literature/literacy spiral, not inhibit it by handing it over to an information priestly class. Medicine today is largely "medicine by body part." Physicians may deal with ears, nose, and throat, but dentists deal with teeth, an entirely different academic discipline, using a different insurance system. As we discover the underlying genetic similarities of cancers, for example, we may find that the hierarchies with which we characterize them no longer hold (e.g., some colon and breast cancers being nearly the same). These hierarchical distinctions are not just nomenclatural, but also the source of many of the power struggles among the various professional associations, billing codes, and financial reimbursement systems.

Entangling our future medical knowledge with the perversity of today's bewildering political, economic, and administrative hierarchies is not going to advance our knowledge, but rather exacerbate an already complex situation.

Another huge issue is the presumption that reading genetic information is a medical issue, an FDA "test" to be controlled as such. But medicine today is largely a "fixit" enterprise, fixing what's wrong. We do knock-out studies, deleting a gene from a mouse and then looking to see what goes wrong. This is a bit like Martians trying to understand a 1950s television by removing vacuum tubes. Noticing that the set squealed after a certain tube was removed, they claim discovery of the "anti-squeal" tube. This is a highly replicable result, so it must be scientifically valid.

Our life science literature/literacy spiral must move beyond today's "negate the negative" assumptions. Fixing everything that is wrong with a living system does not necessarily make it right. Trying to put a tail back on a cat is likely to cause more harm than just letting the cat adapt to become a tailless cat. Understanding the cat's resilience and adaptation mechanisms is a far more complex form of information, and one that is critical to our full understanding of the life sciences. Regulating and restricting the flow of genetic information in terms of today's perversely incentivized disease model is a huge step backward in advancing our understanding of ourselves.

Oct 08 2013

Underground Railroad Banquet Oct 24 at VistA Expo in Seattle.

Published under Underground Railroad

I'll be holding the next Underground Railroad Banquet at the VistA Expo in Seattle on Oct 24. This is a continuation of the banquets I've been holding over the years, starting in 1982 with an award to Chuck Hagel for his support in the early roll-out of VA's VistA.

Here is some more information about the history of this group.

And here are some YouTube videos from previous banquets.

Oct 03 2013

A Brief History of the Underground Railroad.

Published under Underground Railroad, VistA

I was part of a small group of programmers (called Hardhats) recruited to the VA in the late 1970s to work on what would eventually become the VistA Electronic Health Record system. Ted O'Neill, who had supported the funding of the development of ANS MUMPS from NIH and later the National Bureau of Standards (now NIST), moved to the VA to develop an open, public domain version of a modular, decentralized hospital information system dedicated to improving clinical care in the VA.

This was in an era dominated by mainframe computers, managed by centralized data processing staffs doing largely batch processing of punched cards. The notion of a network of interactive terminals connected to decentralized minicomputers was radical at the time, and threatening to the centralized data processing department.

This led to a fierce bureaucratic battle between the decentralists and the centralists. Ted O'Neill and Marty Johnson hired MUMPS programmers under local hospital management, both to ensure that they worked closely with the actual end users and to shield them from the conflicts that raged in Washington. Eventually, Ted O'Neill was fired, several of the Hardhats were fired, and central office tried to shut down the MUMPS effort. I was demoted, and $500,000 worth of computers were locked up in my hospital basement, unused. The central data processing department told upper VA management that minicomputers could not possibly be used for large-scale computing, and that only a centrally managed mainframe approach could provide the necessary functionality.

The Hardhats continued to develop the software, cooperating on a peer-to-peer basis and working closely with hundreds of doctors, nurses, and other clinical personnel. By 1981, we had developed a toolkit (the File Manager and Kernel) that supported a core system that could handle packages for ADT (Admissions, Discharges, and Transfers), Pharmacy, Scheduling, and Laboratory.

In 1981, VA Chief Medical Director Donald Custis visited the Washington VA medical center to see our software in operation. He was surprised to find a working system, enthusiastically used by clinical staff, based on very economical minicomputers.  He quipped, “It looks like we have an underground railroad here.”   I grabbed the name, and started passing out 500 VA Underground Railroad business cards.

In 1982, I organized the first Underground Railroad banquet in Washington, DC, and presented then-Deputy VA Administrator Chuck Hagel with an “Unlimited Free Passage on the Underground Railroad” certificate.  I also started handing out certificates for “Outstanding Engineering Achievement” to programmers for their contributions to VistA, and special VIP membership cards, with a 1982-era Motorola CPU chip laminated to the engine of the logo.

I am planning the next banquet October 24, 2013 in conjunction with the VistA Expo meeting in Seattle.  I will be delivering a “State of the Underground Railroad” address, discussing how many of the original issues are still around, 31 years later.

For example, I had noted that in a bureaucracy, everyone wants things centralized below them and decentralized above them. Given the technology of the day, we focused on the hospital as the "anchor point." Today, however, this has moved up to Capitol Hill. Both Senate and House committees have discussed what language to use in EHR systems. The disastrous $1B Integrated Electronic Health Record effort is an exercise in mega-centralization. DoD continues its Humpty Dumpty systems development approach, breaking systems into pieces and then trying to integrate them back together again, even after a 40-year track record of failure.

VistA's patient- and provider-centric model has repeatedly proven its merit. Our approach of involving thousands of clinical users – not just a few IT "experts" – has also proven itself. Open source software, agile development, use of online fora, metadata-driven architectures, and email-based messaging are all innovations of VistA that are more current than ever.

VistA was much more than just a collection of programs.  It was a community of users, a framework for collaborative development, and a toolset for “meta” level programming that is rarely understood by outsiders who stare at the source code.  Just as one cannot understand Wikipedia and the Wikipedian community by staring at the source code driving the underlying wiki, we cannot understand VistA simply by looking at the source code.

I hope that the Underground Railroad Banquet can help communicate some of these broader implications of the VistA framework, as well as look forward to the next generation of VistA software.


Sep 07 2013

Adm. Harold Koenig at “RDF as Universal Health Language” workshop

Published under VistA

Vice Adm. (ret.) Harold Koenig, MD, discusses what doctors need from health IT at the New Health Project workshop on RDF as a Universal Health Language in Encinitas, CA. In his 34-year career with the Navy, Adm. Koenig was Deputy Assistant Secretary of Defense for Health Care Operations (1990-1994), Surgeon General of the Navy, and commander of the Balboa Naval Hospital in San Diego. The chair he is sitting on is a time machine I am building for my children's science activities. Unfortunately, it isn't completed yet, so it's only as good as your imagination.


Sep 07 2013

Open Letter to Maureen Coyle about VistA Evolution

Published under VistA

Dear Maureen,

It was good meeting you at the Second Annual OSEHRA Summit yesterday. It looks like you really have your hands full with your work on planning the evolution of VistA. I thought your question – in effect, "How do we decouple our information architecture from the organization chart?" – was right on target. I addressed this in my recommendations to Chuck Hagel in a previous open letter:

Decouple the IT architecture from the organization chart. The designs that I've seen coming from the DoD are enterprise-focused, "baking in" all of the stovepipes, organizational turf wars, and rice-bowl protections of the many political, economic, and professional constituencies hoping to influence the architecture. Instead of patching together an "integrated system" of point-to-point connections, we need to move to a broader vision of creating a common information space. Note the words of Tim Berners-Lee on his design of the World Wide Web:

What was often difficult for people to understand about the design of the web was that there was nothing else beyond URLs, HTTP, and HTML. There was no central computer "controlling" the web, no single network on which these protocols worked, not even an organization anywhere that "ran" the Web. The web was not a physical "thing" that existed in a certain "place." It was a "space" in which information could exist.

This is a continuation of my thinking from the time when we worked together on the Vvaleo Initiative with Dee Hock.

Group at Initial Vvaleo meeting in Seattle

I was pitching an idea called HealthSpace, a way of creating a "space for health information" akin to the way that Tim Berners-Lee created the web as a "space in which information could exist."

For example, a web user can drag a book’s URL from Amazon to Twitter, press send, and just assume that anyone, anywhere, and on any web-enabled device would have “interoperable” access to it.  We don’t need an interoperability agreement between Amazon and Twitter, and if I want to pass the information through Facebook or Gmail, that’s easily done.  I don’t have to re-engineer the whole system if I want to use a different routing, nor do I have to wait for some standards committee, government agency, or vendor to come up with the perfect standard for defining book information exchange.

The web created a large-scale, fine-grained network that used surprisingly few “moving parts” to do an amazingly large amount of information processing.  I’d like to do the same for health care.  This would also solve many of the political problems facing VA-DoD sharing.  The same information could be shared as a “flat” information space, but different agencies could superimpose their hierarchies or constraints on it as they see fit.  The agencies are not giving away the “family jewels” but are rather being given greater control over their information.

For example, a blood pressure measurement may seem like a fairly benign piece of information. It might come from a VA clinic, a WalMart convenience clinic, or a home smartphone gadget. However, if the patient is a Navy SEAL located in some remote mountain village, the information is in an entirely different context. The metadata about the blood pressure measurement – the time, location, etc. – is hugely different from that of the WalMart reading on a vet.

The information space model would allow the Navy to place restrictions on this information – from compartmentalizing it entirely, to applying whatever protocol they choose for that class of information.
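
As a sketch of how policy can hang off the metadata rather than the data (the field names and rules below are hypothetical, not any agency's actual schema), consider two identical readings whose contexts drive completely different handling:

    # Two blood pressure readings: identical data, very different metadata.
    # The policy attaches to the metadata context, not to the measurement.
    reading_a = {"systolic": 128, "diastolic": 82,
                 "source": "WalMart clinic", "location": "San Diego, CA"}
    reading_b = {"systolic": 128, "diastolic": 82,
                 "source": "field medic", "location": "remote mountain village"}

    def navy_policy(reading):
        """A hypothetical rule the Navy might superimpose on the shared space."""
        if reading["source"] == "field medic":
            return "compartmentalized"   # the context would reveal the mission
        return "shareable"

    for reading in (reading_a, reading_b):
        print(reading["source"], "->", navy_policy(reading))

The flat information space stays the same for everyone; each agency's hierarchy or constraint is just another layer of metadata superimposed on it.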

After Tim Berners-Lee invented the web, he moved on to design the Semantic Web, which is now called Linked Data:

Part of Tim's design genius in creating the web was allowing it to be broken – the "404 not found" error. Prior efforts (such as Doug Engelbart's hypertext system) required bidirectional referential integrity: if A pointed to B, then B must also point to A. Of course, in the best of all worlds, this would be preferable. But for a dynamic, constantly changing world wide web, the 404 error was a key design decision that gave the web its robustness.

As an aside, Tim played an interesting role in the creation of the My Health eVet system. In the very early days of the web (1996?), I had arranged a meeting with Rob Kolodner, Clayton Curtis, and others from VA to meet with Tim, Peter Szolovits from MIT, and Zak Kohane from Harvard. Rob Kolodner credits this meeting as the initial stimulus for the Health eVet program.

I think that your question about decoupling data from the organization is a very timely and important one – which could lead to a breakthrough in VA/DoD sharing efforts. I would be delighted to help you explore the issue.

Aug 25 2013

Open Letter to Rep. Mark Takano (D-Ca)

Published under AHLTA, VistA

Congratulations on your next step in public service as a member of Congress representing the 41st district, and your membership on the House Veterans Affairs Committee.


Your district was the site of the first VA/DoD health IT sharing, a system that I helped develop in 1983-85 when I was a Computer Specialist at the Loma Linda VA. I worked closely with the committee and Chair Sonny Montgomery's staff to demonstrate that the DoD could easily adopt the VA software, and that we could communicate between Loma Linda and March Air Force Base.
Tom Munnecke, Ingeborg Kuhn, George Boyden, Beth Teeple showing off the first VA/DoD Health IT interface

This demonstration was studied by GAO, VA and DoD staff, the Veterans Affairs Committee, and other consultants, and the reviews were favorable – even the DoD-hired consultant later told me that he had been hired "to make the system look bad, but when I saw it, it looked pretty good to me." Here is a 2011 conversation I had with Beth Teeple, who helped make it happen from the Air Force's side.

The committee noted that DoD had spent $250m (1980 dollars) to develop Initial Operating Capabilities (IOCs) at a few sites as standalone demonstrations, while VA was spending $82m (1980 dollars) to deploy those capabilities in production across 172 hospitals. None of the IOCs were compatible with each other, whereas the VA system (later to be called VistA) was built around a sophisticated "active metadata" system through which all sites could communicate by virtue of their shared metadata. It's a bit like solving problems with algebra rather than arithmetic: a single algebraic formula can simplify problems that would otherwise require an enormous array of arithmetic. Algebra is a "meta" level way of looking at things.
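
As a toy illustration of the "algebra" point (the field definitions below are hypothetical, not taken from the actual VA data dictionary), one generic routine driven by a data dictionary can stand in for a separate hand-written routine per field, per file, per site:

    # "Arithmetic": a hand-coded validator for every field of every file.
    # "Algebra": one generic routine that reads the data dictionary.
    DATA_DICTIONARY = {
        "PATIENT": {
            "NAME": {"type": str},
            "AGE":  {"type": int, "min": 0, "max": 130},
        },
        "LAB RESULT": {
            "TEST":  {"type": str},
            "VALUE": {"type": float},
        },
    }

    def validate(file_name, record):
        """Generic routine: works for any file the dictionary describes."""
        for field, spec in DATA_DICTIONARY[file_name].items():
            value = record[field]
            if not isinstance(value, spec["type"]):
                raise ValueError(f"{field}: expected {spec['type'].__name__}")
            if "min" in spec and not spec["min"] <= value <= spec["max"]:
                raise ValueError(f"{field}: {value} out of range")

    validate("PATIENT", {"NAME": "DOE,JOHN", "AGE": 62})
    validate("LAB RESULT", {"TEST": "GLUCOSE", "VALUE": 95.0})

Two sites that share the dictionary can exchange and check records without ever having seen each other's code, which is one sense in which the metadata is "active."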

This sharing effort, by the way, was made possible by the committee’s VA/DoD Sharing legislation championed by Sonny Montgomery.  This allowed VA and DoD sites to share resources, and keep the cost savings at their local level, rather than returning the funds to headquarters.

Sonny Montgomery wrote this 1984 letter to Secretary of Defense Caspar Weinberger:

Mr. Secretary, I cannot understand the DOD reluctance to try the VA system, which will provide on a timely basis the mandatory system compatibility between the two agencies.

The success of this demonstration (and a parallel one between Fitzsimons AMC and the Denver VA) led Congress to require that one of the competitors for the DoD's Composite Health Care System (CHCS) bid an adapted VA system. I left the VA in 1986 to work on the SAIC effort to propose the VA system. We won the CHCS "fly-off" competition with a bid about 60% of the competition's.

Unfortunately, the DoD dismantled the communications capabilities that would have allowed the graceful evolution of VA/DoD sharing (and the improved coordination of DoD facilities as well). It also took many steps to make the system incompatible with VA. Whereas VA was thriving on its "algebra" design ethos, the DoD continued thrashing about at its "arithmetic" level of thinking.

When I first saw the AHLTA architecture, my initial reaction was that it was a giant single point of failure. A decade later, while Congress was holding a hearing called "AHLTA is Intolerable," the system ironically went into global failover mode; the central node had failed again. AHLTA is a rich source of counterexamples on how not to develop systems, but one of the most significant is its over-centralized, single-point-of-failure architecture. NASDAQ has a similar vulnerability: it suspended trading for three hours last week due to a failure at a single point. All European BlackBerry users were locked out of email service for a week a while back, again due to a failure at a critical point. These systems were designed for efficiency, not resiliency. The brittleness that was "baked in" to their designs also shows up in their inability to adapt to changes or surges of activity.

When I hear of a single, integrated electronic health record for the VA and DoD,  I see brittleness, not efficiency.  I see it devolving to the DoD’s lowest common denominator – based on DoD’s “arithmetic” approach rather than the VA’s “algebraic” model.

The President's Council of Advisors on Science and Technology (PCAST) issued a report calling for greater use of metadata – the algebraic model – a positive step toward what has been a root cause of VistA's success over the years. Unfortunately, I have not seen these recommendations have much effect on future health IT designs.

I hope that you thrive in your service in Congress, and I hope that you can bring fresh insights to the never-ending problem of VA/DoD sharing. I would be happy to provide any insights that may be helpful to you.



Creative Commons License
Images by Tom Munnecke is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License.
Based on a work at munnecke.com.
Permissions beyond the scope of this license may be available at munnecke.com/license.