Oral-History: James J. Bailey and Rosalie Dunn

About James J. Bailey and Rosalie Dunn

Rosalie Dunn (left) and James J. Bailey (right)

James Bailey began evaluating computerized electrocardiography programs at NIH in the early 1970s. Rosalie Dunn became Hubert Pipberger's full-time statistician in 1970.

The interview surveys developments and people in electrocardiography from the 1950s to about 1980. Norman Jeff Holter, a privately wealthy researcher, developed ways to record ambulatory ECGs in the mid-1950s; his invention was in research use by the 1960s. Other scientists, Caceres and particularly the German émigré Hubert Pipberger, worked to develop methods and programs to interpret ECGs, to relate the data to the actual body, and to improve interpretation accuracy by doctors and by computers. Pipberger in particular developed the data set that allowed the testing of ECG results against non-ECG-based diagnoses. The first programs and machines were developed in the 1960s; a large number of competing models were being tested by the 1970s. Dunn helped apply statistical sophistication to the data and the machines; Bailey evaluated many results. The advance of computerization increased the speed and ease of getting results. By the early 1980s, ECGs were about to be fully integrated with the personal computer revolution. The interview ends with a reference to the field of exercise stress testing.

About the Interview

JAMES J. BAILEY & ROSALIE DUNN: An Interview Conducted by Frederik Nebeker, IEEE History Center, 25 April 2000

Interview #392 for the IEEE History Center, The Institute of Electrical and Electronics Engineers, Inc.

Copyright Statement

This manuscript is being made available for research purposes only. All literary rights in the manuscript, including the right to publish, are reserved to the IEEE History Center. No part of the manuscript may be quoted for publication without the written permission of the Director of IEEE History Center.

Request for permission to quote for publication should be addressed to the IEEE History Center Oral History Program, IEEE History Center at Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030 USA. It should include identification of the specific passages to be quoted, anticipated use of the passages, and identification of the user.

It is recommended that this oral history be cited as follows:

James J. Bailey and Rosalie Dunn, an oral history conducted in 2000 by Frederik Nebeker, IEEE History Center, Hoboken, NJ, USA.

Interview

INTERVIEW: James J. Bailey and Rosalie Dunn

INTERVIEWER: Frederik (Rik) Nebeker

DATE: 25 April 2000

PLACE: Bethesda, Maryland

Norman Jeff Holter's lab, 1950s; recording ambulatory ECG

Nebeker:

Let’s start with the fifties.

Bailey:

When I was in the army in the late fifties, I traveled to Montana, where my parents lived, and visited the laboratory of Norman Jeff Holter. Jeff had his own foundation, the Holter Foundation, in Helena, Montana, which owned his car and house as well as his laboratory. The foundation was formed from the estate he inherited from his grandfather, who built Holter Dam near there. At the time I visited him in his laboratory, he was attempting to record the magnetic field generated by a nerve impulse in a frog’s sciatic nerve (presaging magnetocardiography and magnetoencephalography). Holter developed the RECG (radio-electrocardiograph) to record and transmit ambulatory electrocardiograms (ECGs) via radio and AVSEP (audiovisual superimposed ECG presentation) to allow, for example, 10 minutes of ECG data to be reviewed in 7.5 seconds, thereby rapidly revealing transient changes in conduction, rhythm, and/or ST segment levels.

Nebeker:

Was he an independent researcher?

Bailey:

He had a bachelor’s degree in physics and did graduate work, but I don’t think he ever went for a graduate degree. He knew all the big guys, and I think he attended the bomb test at Eniwetok.

Nebeker:

He was self-supported in this research?

Bailey:

Yes. He inherited the estate from his grandfather and put it in this foundation. In the late fifties he actually conducted experiments with a series of patients in Great Falls, Montana. A private physician used his system to study 200 clinical cases. In one case AVSEP review confirmed ECG evidence for suspected exertional angina. A series of AVSEP frames revealed marked shifts in the ST segment during heavy lifting. Avionics took this idea and ran with it. During my residency in the late sixties, some hospitals had the Avionics device for playing back these analog tapes in the AVSEP format.

Nebeker:

Tapes of the ECG?

Bailey:

Yes, of the ambulatory ECG. It could be played back about 50 times faster than real time. With a normal sinus rhythm and a fairly regular rate it would make a smooth humming sound. If there were PVCs it would start making choppy sounds.

Nebeker:

It was actually read by the sound.

Bailey:

Then it could be stopped and played in real time across the oscilloscope so that the complex could be looked at in detail.

Nebeker:

Was it actually useful to be able to speed it up?

Bailey:

Yes, because it saved so much time.

Nebeker:

What was Norman Jeff Holter’s motivation for using this ambulatory ECG?

Bailey:

His idea was that taking a single resting ECG was too narrow a slice in time. The idea was to capture a much larger range of cardiac activity – throughout the day and night – to get a better picture of what was happening with the patient during his normal daily routine.

Nebeker:

Whereas a stress test ECG could be connected and done in a single location.

Bailey:

Right.

Use of the Holter Monitor for ambulatory ECG

Nebeker:

Do you think this was important?

Bailey:

Of course.

Dunn:

There was a classic study, often cited. It involved monitoring workmen – I can’t recall the industry; stevedores, electricians – wearing these Holters for at least one day, if not more. Spurious arrhythmias like PVCs and PACs were being thrown by these strapping, healthy men. This may have been the first study documenting significant “normal” variation. What did this mean prognostically? They didn’t know.

Nebeker:

Was this the first time somebody had done ambulatory ECGs?

Bailey:

Holter’s was the first, yes. And it was all analog when he started. Avionics was still analog even in the late sixties.

Nebeker:

Was it radioed to recording stations?

Bailey:

I believe he only tested the RECG to see if it could be done. The AVSEP required a huge backpack that held the recorder. It was eventually miniaturized so people could carry it comfortably.

Nebeker:

And this would be worn on the back?

Dunn:

Yes. Each patient’s file would be copied onto big reels. This data collection took up a large amount of space.

Nebeker:

Was this used widely?

Bailey:

It wasn’t widely used until the late sixties.

Nebeker:

A person would be equipped with this for diagnosis purposes and then these tapes would be looked at afterwards?

Dunn:

Not at that time. I don’t know of too many that could deal with this, unless it was a special study in a special research lab.

Nebeker:

I see. It was just a research project.

Dunn:

This was not ordinary patient treatment.

Bailey:

People had to be symptomatic. The causes of their symptoms weren’t apparent on ordinary routine electrocardiograms.

Dunn:

This would not have been available from the average cardiologist but was used in some hospitals.

Bailey:

Yes. Where I did my residency, the cardiology department used one. That was in 1966-69.

Nebeker:

Was the Holter Monitor something that was manufactured?

Bailey:

Yes. Avionics was the company that made it.

Dunn:

And they would play it back.

Bailey:

Their device would play it back fifty times faster than real time so that a 24-hour recording could be reviewed in an hour.

Dunn:

Avionics made some money doing this, but it wasn’t a big business. There wasn’t that much volume.

Nebeker:

However, it had value for understanding normal hearts and abnormal hearts, I guess. This was the first time ambulatory ECGs were being recorded.

Bailey:

For patients that were having palpitations or seizures or fainting spells, it was appropriate when the cause was not otherwise clear. It also probably unmasked coronary disease for some patients.

Nebeker:

Has this kind of diagnosis continued to the present?

Bailey:

Oh yes. It’s big business.

Computerization, digitization, and pattern recognition for the ECG

Nebeker:

Then there is a continuous history of these Holter Monitors. I suppose they have gotten transistorized and miniaturized.

Dunn:

Miniaturized hardware with compression software and more sophisticated analytic programs for diagnosis upon playback.

Nebeker:

Did Holter have enough engineering background to design this?

Bailey:

In his time physicists engineered a lot of their own lab work.

Dunn:

Even making their own glassware.

Bailey:

Back in the fifties computerization meant analog processing. Not digital. Holter first started working on the ECG around ’54 or ’55. AVSEP was first presented at the New York Academy of Sciences around 1959 or ’60.

Nebeker:

Was there anything else from the fifties that might be worthy of comment?

Bailey:

Computer analysis or processing of ECGs was a sporadic effort up until the time that Pipberger arranged for it to be digitized. That was the beginning of computer analysis of routine electrocardiograms.

Bailey:

The first interpretive program may have been Cesar Caceres’. I seem to recall Hubert Pipberger telling me his group programmed the R-wave finder on a contract for Caceres.

Dunn:

Yes. Caceres was not an engineer.

Hubert Pipberger’s group developed the appropriate electronics and initial pattern recognition triggers and computer software.

Nebeker:

What did Caceres do?

Bailey:

His was the first interpretive program widely distributed. He had a position with the Public Health Service at the Medical Development Systems Laboratory. Version D of the PHS program (PHS-D) was finalized in 1964.

Nebeker:

Was that after Pipberger’s digitization?

Bailey:

The digitization was a collaborative project with the National Bureau of Standards, now called NIST.

Dunn:

This was documented in the early published papers of Pipberger’s group.

Pipberger, Frank, and Simonson; accuracy of ECG readings

Nebeker:

What was Pipberger’s motivation to digitize the ECG?

Bailey:

I would have to take a guess at that, though I have had conversations with Hubert at various times. He had done basic research in electrocardiology, and I think he had the feeling that most cardiologists were subjectively floating when they interpreted ECGs. I think he wanted to put it on a scientific, objective basis.

Dunn:

At an early time Hubert Pipberger was influenced by Ernie Frank, who was doing interesting work in mapping and developing the vector model of propagation of the electrical signals through the heart. He viewed the ECG as the shadow on the wall, so to speak, of what was actually happening internally, and he wanted to scientifically determine the best ECG characteristics that would more accurately reflect the true cardiac condition. And he insisted that the “true cardiac condition” be documented by non-ECG criteria.

Bailey:

He was also influenced by Ernst Simonson. One of the things Simonson did in the sixties was to collect some well-diagnosed ECG cases and submit them to cardiologists all over the world. There were a dozen or two. He found that the mean interpretation accuracy of these cardiologists – reading the tracings – was about 50 percent.

Dunn:

Re-reading the same tracings.

Nebeker:

Oh, even their own.

Nebeker:

The idea was that it should be an objective process when the ECG is interpreted. And here in fact even the same person was coming up with a different reading at a later time.

Dunn:

Ernst Simonson was very methodical and meticulous about his collection of ECGs, not only with the tracings, but with the clinical documentation of the patient’s cardiac status. He worked on his projects, publishing scientific accounts, until a very old age.

Dunn:

Hubert Pipberger was interested in determining the physiological phenomena behind the production of the ECG signals, developing the model that could accurately reproduce such signals, establishing the external electronic system which could detect these signals, and by use of a validated model, properly interpret them with a high level of sensitivity and specificity.

Nebeker:

His way of creating that model was going to be on a digital computer?

Dunn:

He saw the computer as his basic tool, capable of reading tracings with higher resolution, consistent accuracy, and greater speed.

Harmonic analysis for ECGs; measuring higher frequencies

Nebeker:

There were quite a number of such efforts, for instance analog harmonic analyzers – people with weather data and other sorts of data trying to see periodicities.

Bailey:

Every ten years someone comes up with the idea of applying harmonic analysis to ECGs. The problem is that harmonic analysis doesn’t tell whether the T wave is before or after the QRS. That is a critically important loss of information.
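
That lost information is the phase of the spectrum. As a minimal numerical sketch of the point (ours, not from the interview; the sampling rate and Gaussian wave shapes are invented), reversing a beat in time puts the T wave before the QRS, yet the amplitude spectrum is unchanged:

```python
import numpy as np

# A crude one-beat "ECG": a narrow QRS spike followed by a broad T wave.
fs = 500                                            # assumed sampling rate, Hz
t = np.arange(fs) / fs                              # one second of signal
qrs = np.exp(-(((t - 0.30) / 0.01) ** 2))           # narrow spike at 300 ms
t_wave = 0.3 * np.exp(-(((t - 0.55) / 0.05) ** 2))  # broad bump at 550 ms
beat = qrs + t_wave

reversed_beat = beat[::-1]   # now the T wave comes *before* the QRS

# The amplitude spectra are identical: magnitude-only harmonic analysis
# discards the phase that encodes the temporal ordering.
assert np.allclose(np.abs(np.fft.rfft(beat)),
                   np.abs(np.fft.rfft(reversed_beat)))
print("Amplitude spectra match exactly")
```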

Nebeker:

Was the use of harmonic analysis ever fruitful with cardiographs?

Bailey:

I don’t think it has been helpful, except for the fact that in the early sixties people believed that all the information was contained below 60 hertz. It wasn’t until later that people found important information at the 100 hertz level. Some investigators have found important information at even higher levels than that.

Nebeker:

How did they learn that these higher frequencies were important?

Bailey:

They learned that by examining patients with pathology.

Nebeker:

Earlier they might have filtered out that part of the signal?

Bailey:

That’s right. The old PHS-D program of 1964 was based on the idea that the signal would be passed through a 50 hertz low-pass filter. It really distorted the QRS.

Dunn:

Even when I came into the field in the seventies, a large percentage of the papers at conferences dealt with filtering and massaging the signal.

Pipberger's heart modeling and patient categorizations

Nebeker:

I am surprised to hear that Pipberger was interested in the model of what the heart was doing. It seems like the first step would be to look at the signal itself and try to find signs of the diseased heart rather than worrying about modeling the heart. That seems like a very ambitious task.

Dunn:

His first work was in physiology in Prinzmetal’s California laboratory. Hubert Pipberger worked in basic cardiac physiology and electrical propagation there and came away with a really good background.

Bailey:

Looking at the signal to see if it was associated with a disease was work that preceded this. That work dates back even to Waller and Einthoven.

Nebeker:

One might imagine that the computer would be a great tool here. For instance if we digitize this signal we can then do cross-correlation or whatever with all these other tracings and try to see what is significant. And using the computer as a statistical aid.

Dunn:

I don’t agree completely. What these cardiologists were looking for was such a plethora of abnormalities, none of which are completely distinct from the others. There’s a gradation of patterns.

Nebeker:

I see. It does not fall into certain categories by nature.

Dunn:

Hubert’s approach was to group patients into categories as physiologically well-defined as possible, categorizing their electrocardiograms on a physiologic basis independent of the electrocardiogram itself. ECGs would be grouped into a certain diagnostic "pot," and then these "pots" were compared as diagnostic groups to see what would distinguish one group from the other.

Pipberger’s VA program and Caceres’ PHS program

Dunn:

The first ECG-diagnostic computer programs were decision-tree types of analyses imitating the thought process of a cardiologist reading a tracing. Whereas the average cardiologist eyeballed the tracings, Pipberger’s VA program and Caceres’ PHS program identified and measured the Q and R waves and the ST segments and much more. It should be pointed out that, whereas the PHS program analyzed the 12-lead ECGs, Pipberger – from his earlier work and his close association with Pierre Rijlant and Ernie Frank – had decided that the dipolar model was superior and used the 3 orthogonal lead system, or vectorcardiogram. Consequently, in addition to the scalar measurements common to 12-lead users, the VA program used measurements such as angles, vectors and areas. Often a single such vector measurement could contain the information of several scalar measurements. These measurements could be made easily by computer.

The VA program measured over 300 such entities for each tracing. By the early 70s, with his model and computer, Hubert was ready to step beyond the decision tree type analysis.
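
To make the distinction concrete, here is a sketch (ours, not the VA program’s code) of how vector quantities fall out of three simultaneous orthogonal-lead samples; the sample values are invented:

```python
import numpy as np

# Hypothetical peak-QRS samples (millivolts) from the X, Y, Z orthogonal leads.
x, y, z = 1.2, -0.4, 0.6   # illustrative values, not real data

magnitude = np.sqrt(x**2 + y**2 + z**2)       # spatial QRS vector magnitude
frontal_angle = np.degrees(np.arctan2(y, x))  # QRS axis in the frontal plane

# Area-type measurements integrate a lead over the QRS interval, e.g.:
#   area_x = np.trapz(lead_x[qrs_onset:qrs_offset], dx=1.0 / fs)
print(f"|QRS| = {magnitude:.2f} mV, frontal-plane angle = {frontal_angle:.1f} deg")
```

One magnitude-and-angle pair summarizes what several single-lead scalar measurements convey, which is Dunn’s point about the information density of vector measurements.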

Nebeker:

The decision tree was algorithmic?

Dunn:

And it used cutoff points, remember. Just slight variation in a measurement, real or artifact, could throw an ECG from normal into an abnormal category.
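
A toy example (ours; the criteria are invented, not any real program’s) of the brittleness Dunn describes, where a hair’s-width change in one measurement flips the label:

```python
def toy_decision_tree(r_amplitude_mv, qrs_duration_ms):
    # Cutoff-point logic in the spirit of the early decision-tree programs.
    if qrs_duration_ms > 120:
        return "conduction abnormality"
    if r_amplitude_mv > 2.0:
        return "possible hypertrophy"
    return "normal"

print(toy_decision_tree(1.99, 100))  # 'normal'
print(toy_decision_tree(2.01, 100))  # 'possible hypertrophy': tiny change, new label
```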

Nebeker:

When did they get to that point of being able to give a computerized interpretation?

Dunn:

That was in the early sixties.

Bailey:

One of the first interpretive programs was the PHS-D version in 1964, about the same time as Pipberger’s VA program.

Nebeker:

That’s the Public Health Service?

Bailey:

The Public Health Service version.

Dunn:

In 1970 the VA program was in decision-tree form. And by this time Hubert had amassed a database of nearly 2500 ECGs, what they called the “well-diagnosed file” – ECG tracings from patients categorized as to rhythm and according to seven diagnostic categories, including normal, hypertrophied and infarcted. Also by this time, Hubert and his collaborators in the VA Cooperative Study had produced a number of papers documenting the efficacy of computer classification in pair-comparisons, normal with some disease entity like infarct. And this was done using higher-powered statistical techniques.

Pipberger's background

Nebeker:

What can you tell me about Pipberger’s background? Was Pipberger an M.D.?

Bailey:

Hubert Pipberger was born May 29th, 1920; received his baccalaureate degree in 1938; served as a medic in the German air force during World War II; and received his doctor of medicine degree in 1951 from the Rheinische Friedrich-Wilhelms University in Bonn. He emigrated to the United States in 1955.

Dunn:

He would tell stories now and then about what happened to his family during the Nazi regime.

Bailey:

He was not a proper soldier. He didn’t have the right attitude. He got his basic medical training in Germany. He was thrown into the army and acted as a medic. He was captured and was a prisoner of war in France. He told amazing stories. One time he was in a cell with other German POWs. They were not supposed to shoot medics, but they were taking them out one by one and shooting them, apparently because they had nothing better to do. I don’t know what army it was at the time that had him captive. I think they were the French, because he could speak French. How he saved himself was he started telling them funny stories. He could be very entertaining. They spared him to listen to his stories.

Nebeker:

Like Scheherazade.

Bailey:

His internships, residency, and cardiology fellowships were pursued at various places, including the Zurich University Hospital, Switzerland; the Georgetown University Hospital in Washington, D.C.; and the Institute of Medical Research at the Cedars of Lebanon Hospital in Los Angeles. In 1957 he was appointed Chief of the Research Center for Cardiovascular Data Processing at the Veterans Administration Hospital in Washington, D.C., and joined the Medical Faculty at Georgetown University. Subsequently his long and distinguished career was marked by scores of honors, awards, appointments to faculties and committees, special recognition by international societies, and a publication list of over 170 articles and book chapters.

Nebeker:

Was Pipberger always interested in medical research?

Dunn:

Well, he studied with Prinzmetal. After that he came to Washington to work with Ed Freis at the Veterans Administration Hospital on the first studies for the treatment of hypertension. This was the Veterans Administration (VA) collaborative study which established the benefits of treating hypertension. Hubert did this work while pursuing his own interests in the dipole model of the heart and the orthogonal-lead model for collecting tracings, called vectorcardiograms or VCGs.

Bailey:

The model basically was that all the electrical forces of the heart could be collected into a single dipole that varied with time in magnitude and direction but not in location.

Nebeker:

A vector that characterizes the heart.

Dunn:

Considering the signals collected at the body surface as resultant vectors, Hubert started thinking how to collect these signals and proposed a research project to develop the lead system and hardware to record the tracings in a data cart. Meanwhile, Hubert was working on the hypertension project, so he asked Ed Freis, “Do you mind if I use some of my time to do this extra work?” His boss said, “Great, go ahead. You can do it. Let me know what happens.”

Nebeker:

Was this when he started to digitize the signals?

Bailey:

That was later in ’58 or so.

Dunn:

This was a classic project made for engineering and medical collaboration. There were the engineering specs for the data cart, but they also had to work out such things as optimal lead placement on the torso. Within a short time, Pipberger applied for his own grant, and he got it. He became principal investigator on his own project. I’m not sure where the initial support came from; it may have been the VA, which allowed him to hire some engineers like Al Berson, and within a short time the project was supported by the National Bureau of Standards, the VA, and the NIH. When Hubert received his first support money and became independent of the hypertension project, his boss, Ed Freis, came to him and asked if he would reimburse the hypertension project for the money spent on preparing the application for the ECG project. Hubert enjoyed telling that story.

Nebeker:

Was he at the VA Hospital?

Dunn:

He was at the VA Hospital and continued as an investigator there for many years.

Dunn's work with Pipberger's VA Collaborative Study, 1970s

Nebeker:

How did you come in contact with him?

Dunn:

By the time I met Hubert in 1970, his project had been ongoing for more than ten years. The VA Collaborative Study was underway, the well-diagnosed file was already used by a number of cardiologists—he lent the file for the asking. Hubert had a strong engineering team, led by Al Berson, which developed the original A-D converter and had a roomful of machinery for preprocessing ECG signals.

They were still studying sampling rates and filtering schemes. His group developed the first digitized ECG datacart. They had a CDC computer which read digitized ECGs from large tape reels, performed the wave recognition, calculated the measurements and made “computer” diagnoses, according to the program of the time. The cardiology fellows on the team were collecting patients’ ECGs and the medical documentation, which they reviewed in detail. Hubert’s wife, Hannah, was in charge of the library of tracings and documentation, and she labored over this so assiduously that she developed an ulcer.

Pipberger and his collaborators had a long list of publications, both on the engineering side and on the medical, dealing with diagnostic electrocardiographic classification by computer. Finally using the computer capabilities beyond simple decision-tree schemes, they had published studies using multivariate analysis to arrive at the diagnoses, but they dealt with only two diagnostic categories at a time, such as normal versus myocardial infarct, normal versus left ventricular hypertrophy. These were not very realistic since, in clinical practice, more than two diagnostic entities have to be taken into account at the same time.

At this time Jerry Cornfield joined his group as a consultant and I was Hubert’s full-time statistician. By 1973 we published the paper “Multigroup Diagnosis of Electrocardiograms” in Computers and Biomedical Research, where seven different diagnostic categories were considered simultaneously under a Bayesian probability model. Almost 2500 ECGs were used to develop the model. Different ECGs were used for testing. This concept in itself was unique at the time: using separate training and test sets.

Nebeker:

Would a cardiologist be trained to measure certain heights in a signal?

Dunn:

Yes, the cardiologist was expected to use calipers, but most of them eyeballed the measurement.

Nebeker:

Did Pipberger think that a better job of discriminating ECGs could be done by using some statistical measures?

Dunn:

Certainly.

Nebeker:

A calculation with this signal that a person can’t do in the head just by looking at it, but that the computer can do. Was it your job to try to come up with measures?

Dunn:

We had a list of more than 300 computer-generated measurements. One of the classic statistical problems at this point was to determine the most efficient set for the best separation. Jerry Cornfield published a neat result which gave a rule of thumb for the maximum number of variables which could realistically be used, and that was one-tenth the sample size. But that rule was determined looking at a two-group comparison. Using a greater number of variables resulted in an over-determined statistical model. We did some work to select the optimum set, and then we structured the seven-group comparison by combining the variance-covariance matrices of the seven groups. Jerry’s big contribution was the use of prior probabilities. This was quite novel to those working in ECG diagnosis at the time and was the subject of much discussion for some time.
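
A minimal sketch (ours, not the published VA program) of a multigroup Bayesian classifier of the kind Dunn describes: each diagnostic group gets a multivariate-normal likelihood with a pooled covariance, and posteriors combine the likelihoods with prior probabilities. The group means, covariance, and measurement vector below are invented toy values:

```python
import numpy as np

def multigroup_posteriors(x, means, pooled_cov, priors):
    """Posterior probability of each diagnostic group for measurement vector x,
    under multivariate-normal likelihoods with a shared (pooled) covariance."""
    inv = np.linalg.inv(pooled_cov)
    # Log-likelihood of x under each group (terms constant across groups cancel).
    log_lik = np.array([-0.5 * (x - m) @ inv @ (x - m) for m in means])
    log_post = log_lik + np.log(priors)   # Bayes' rule: likelihood times prior
    log_post -= log_post.max()            # for numerical stability
    post = np.exp(log_post)
    return post / post.sum()              # normalize so posteriors sum to 1

# Toy example: three groups, two measurements, the "equal priors" option.
means = [np.array([0.0, 0.0]), np.array([2.0, 1.0]), np.array([-1.5, 2.0])]
print(multigroup_posteriors(np.array([1.8, 0.9]), means,
                            pooled_cov=np.eye(2),
                            priors=np.array([1.0, 1.0, 1.0]) / 3))
```

With unequal priors, a borderline measurement vector can change groups while a typical one usually does not, which is the effect the speakers discuss below.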

Nebeker:

The assigning of the priors?

Dunn:

Right. The question centered around the method of assigning priors.

Nebeker:

It seems like in the last couple of decades the Bayesian approach has been all the rage. It sounds like an early use of it.

Bailey:

I had my disagreements with Hubert and with the Bayesian approach. For example, when they first started out, the prior probability was assigned depending on whether the ECG was taken in the emergency room or the cardiology department. That seemed the wrong way, because more sensitivity was called for in the emergency room and more specificity in the cardiology department; the prior probabilities did exactly the opposite. Then they changed the assignment of prior probabilities to correspond to symptoms, chest pain, etc., which was better.

Dunn:

There were different sets of optional priors available to the user. For users like Jim, who were uncomfortable using any priors, there was also the option of using equal priors.

Bailey:

If the priors were jiggled according to the way the population was structured, and that population was similar to the population of their priors, statistically a better result could be obtained.

Dunn:

That is the idea behind using prior probabilities. It can be shown statistically that use of priors will maximize the percentage of correct diagnoses across the population.

Bailey:

The more patients, the more accuracy. However if a population was not configured that way – with an unknown population where the priors were unknown – it could be a problem.

Dunn:

There were a number of studies done on the effect of variation of the priors on computerized diagnoses. These seemed to establish that priors didn’t make much difference if a tracing was fairly typical. Where it might change the end result was in borderline cases.

Cardiologists' use of Pipberger's work

Nebeker:

When did this program that would do an interpretation first become available to cardiologists?

Bailey:

The VA program with multivariate analysis was published and distributed after 1972. The 1964 PHS-D program was widely distributed by that time. It’s a single-lead, single-channel program that accepts each of the twelve leads one after another. The signals are low-pass filtered at 50 hertz. Physicians were using them in their offices. They would record on their apparatus, which would send it to some central processing location, and then the interpretation would come back on a teletype. Apparently one patient had wondered what this was and somebody gave him a brief explanation. This person probably thought that it had been written at NIH – which it wasn’t – and this guy made this paranoid production.

Nebeker:

This was called Phone-A-gram?

Bailey:

Phone-A-gram was the company that was marketing this service.

Nebeker:

Were a significant number of these sold?

Bailey:

Yes. I don’t know who made the apparatus they were using. That could have been a different company. Phone-A-gram was operating the service. The ECG cart would send the FM signal over the phone.

Dunn:

How would it transmit over a phone line? Would that work?

Bailey:

Yes, frequency modulated. They could have all the problems that are to be had with phones. The data were translated to a frequency-modulated signal, fed over the phone by acoustic coupler, and then demodulated and digitized by the ECG interpretive service at the other end.
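
As a rough sketch (ours; the carrier frequency, deviation, and rates are invented, not Phone-A-Gram’s actual specification) of how an ECG rides a phone line as an FM audio tone:

```python
import numpy as np

fs_audio = 8000     # audio sample rate, Hz
carrier = 1900.0    # center tone in the telephone voice band, Hz (assumed)
deviation = 400.0   # Hz of frequency deviation per millivolt (assumed)

def fm_modulate(ecg_mv, fs_ecg=500):
    """Turn an ECG sample stream into a voice-band FM tone for the coupler."""
    n_audio = int(len(ecg_mv) * fs_audio / fs_ecg)
    # Upsample the ECG to the audio rate, then integrate instantaneous frequency.
    ecg_up = np.interp(np.arange(n_audio) / fs_audio,
                       np.arange(len(ecg_mv)) / fs_ecg, ecg_mv)
    inst_freq = carrier + deviation * ecg_up
    phase = 2 * np.pi * np.cumsum(inst_freq) / fs_audio
    return np.sin(phase)

tone = fm_modulate(np.zeros(500))  # 1 s of flat baseline gives a steady 1900 Hz tone
```

The receiving end tracks the instantaneous frequency of the tone, subtracts the carrier, rescales, and digitizes, which is the demodulation step Bailey mentions.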

Nebeker:

Was this in the early seventies?

Bailey:

This actually began in the late sixties. It was fed to the PHS program, which came out with an interpretation that was sent back to the original location via teletype. One example was a company named Telemed in Chicago. Telemed had a couple of Sigma 5 mainframes back to back. They had a system whereby a private physician in western Montana could send his ECG to a cardiology group in Spokane, Washington; the cardiology group would in turn send it to Telemed in Chicago; Telemed would run its program on it, send the interpretation back to the cardiologists via teletype; the group in Spokane would then add their own diagnosis and send that back to the private practitioner in western Montana. That system actually existed.

Nebeker:

Was this a practice that was accepted by many physicians?

Bailey:

It was not widely accepted, but people were doing it and making money at it. Let’s put it that way.

Nebeker:

Were the leading cardiologists skeptical of this service?

Bailey:

I don’t know if this stuff was presented at the American Heart Association meetings. I don’t recall seeing it.

Dunn:

There was rivalry between the 12-lead proponents and the 3-lead or vectorcardiogram groups. For the practitioners using these results, it was easier to see how the decision was reached with the 12-lead programs, because basically the machine was doing what the cardiologist was trained to do. With the vectorcardiogram and sophisticated, computerized, statistical techniques, there was more skepticism.

Bailey:

I am skeptical of cardiologists taking interpretation from a computer program.

Dunn:

This became a serious issue. Pipberger himself always warned against “heart disease of computerized ECG origin.”

Nebeker:

“If the computer said it, it must be right.”

Bailey:

A lot of them had that kind of naiveté.

Dunn:

Some preferred a system whereby everything that came out of the computer should be overread—that some cardiologist should at least take a look at it and sign off on it.

Other programs for interpreting ECGs

Nebeker:

We got a little bit of that history from this first ’64 Public Health Service program. What were some of the next interpretive programs that became available?

Dunn:

Pipberger and Caceres vied for recognition as to being first, but it was a “by a nose” competition. Their lead systems were different, so perhaps we can confer the honor on both. By the mid-’70s, there were a number of programs – Hewlett-Packard, Telemed, Mayo Clinic, Mount Sinai-Cromed, LDS, and Bonner-IBM – as well as many others from Europe and Japan.

Bailey:

I was not aware of Hubert’s work until the multi-Bayesian program came out. I arrived at NIH in 1969 and didn’t get heavily involved in electrocardiography until about ’71, when Ray Bonner came out with his experimental IBM program.

Nebeker:

There was an experimental IBM program in ‘71?

Bailey:

Yes.

Dunn:

Hubert Pipberger developed the VA program for the XYZ leads, but the XYZ-lead machines weren’t commonly used; one might say it was more popular in Europe.

Bailey:

There was also Ralph Smith’s program from the Mayo Clinic. Ralph Smith’s program was an XYZ-lead program. The Mt. Sinai program in New York was developed by IBM in collaboration with Dr. Leon Pordy. Actually Ray Bonner wrote that program. Mt. Sinai subsequently decided that was their program. Then there was the controversy with Chromalloy. And of course IBM said, “We don’t admit that you have the right to that program,” but in the meantime they had Ray program a new twelve-lead program.

Nebeker:

You evaluated at least three of the first programs interpreting ECGs in ’71. Would you talk a little bit more about that?

Bailey:

I looked at the IBM, PHS-D and Mayo Clinic programs. That evaluation was published in ’74. At that same time there was a Mt. Sinai program that Ray Bonner had written before he wrote the ’71 programs, so he must have done that in ’68.

Nebeker:

Were all of these being made available to cardiologists at that time?

Bailey:

Yes, but the distribution was limited.

Dunn:

They were available, but it wasn’t easy to implement them. A twelve-lead program could easily be managed because there was the cart that collected the twelve-lead ECGs, but specially made three-lead carts were needed for the three-lead ECGs. Who made the carts?

Bailey:

Marquette and Hewlett-Packard.

Dunn:

Hewlett-Packard, but Marquette was the one really into the three-lead carts. They had a good business doing that for a while.

Nebeker:

Was there something of a conversion from twelve-lead to three-lead ECGs?

Dunn:

There were studies which attempted to reproduce one lead system from the other, but they were not totally successful. Especially difficult was reproducing the three-lead system from the 12 because of the necessity of accurately placed leads on the torso to pick up orthogonal signals.

Nebeker:

How did it look to you during the early seventies? I take it you were in the XYZ-lead camp.

Dunn:

I was in the three-lead camp, but Jim was in the other. This could get interesting.

Bailey:

That’s true. I don’t think that issue was ever settled. One of my problems with the XYZ-lead system was, for instance, in the case of a localized disease where an infarct has destroyed a small amount of tissue in the heart muscle. When there is destruction of a small amount of heart muscle, the assumption that there is a single dipole in space – varying only in magnitude and direction – begins to break down. This is due to local forces – dipole, quadrupole, octupole and so on. People tried to model that. I don’t think that modeling has ever really been successful in terms of diagnoses.

Nebeker:

Are you saying that the dipole model is reasonable for a healthy heart?

Bailey:

The approximation that is being made is not too bad.

Nebeker:

Is a twelve-lead ECG better when you have that kind of localized damage?

Bailey:

There is no question that the twelve-lead, because it has leads across the chest, could pick up some of that local information a little more sensitively.

Dunn:

I take it you are referring to infarcts; specifically, anterior infarct?

Bailey:

Yes.

Nebeker:

I’m very interested in the way this all developed. Programs were written for the XYZ-lead ECGs and programs were written for 12-lead ECGs?

Bailey:

What was beginning to evolve in the seventies was that leads were being simultaneously collected. In the beginning, in the late sixties and early seventies, there were carts that collected three channels of leads simultaneously. In the eighties that evolved into carts that collected twelve leads simultaneously. Between the twelve and the three, there were fifteen leads that could be done simultaneously.

Either analytic approach could be taken. With the controversy between those who believed in the twelve-lead and those who believed in the XYZ-lead, they were both collecting the same kind of data and making similar kinds of measurements. The logical question was, why not use the highly powerful multivariate techniques that the XYZ-lead people were using, as opposed to the basic statistical techniques the twelve-lead people were using? I think that’s where Jos Willems came into the picture.

Dunn:

Jos Willems had been a fellow in Hubert’s lab when I was there. He was the bridge between the two camps. And he showed that comparable results could be obtained, for the most part.

Bailey:

He started off in Caceres’ lab and left.

Nebeker:

Was Hubert Pipberger the reason that some of the more sophisticated statistics were applied to the three-lead ECGs?

Dunn:

Definitely. It was because of his expectations. He was looking for more powerful tools.

Nebeker:

When did Willems start working with it?

Dunn:

He was with Pipberger in the early to mid-seventies. By the late seventies he was back in Belgium and starting his own laboratory.

Bailey:

He didn’t do the multi-Bayesian thing until the eighties. By that time it could be tested with a CSE database.

Dunn:

In Pipberger’s lab Jos studied rhythm determination programs and variation – both in the ECGs and in the computer program’s diagnosis as a result of that variation.

ECG interpretation in medical practice, 1970s

Nebeker:

Were many cardiologists using these programs in the early seventies?

Bailey:

No. I implemented the IBM program at NIH. What was bad was trying to translate the Pipberger program from CDC Fortran to IBM FORTRAN. It was like the difference between 36- and 32-bit words.
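
One illustrative analogy in Python (ours, not Bailey’s actual porting problem; 36- and 32-bit word formats differed in mantissa bits, and 64- and 32-bit floats stand in for them below): the same constant rounds differently at different word sizes, which matters when a program compares measurements against diagnostic cutoffs.

```python
import numpy as np

wide = np.float64(0.1)      # the value as stored in a wider word
narrow = np.float32(wide)   # the same value re-rounded into a narrower word

cutoff = 0.1                        # a decision-tree style diagnostic cutoff
print(wide > cutoff)                # False: equal at this precision
print(np.float64(narrow) > cutoff)  # True: the ported value lands past the cutoff
```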

Nebeker:

Did the Mayo Clinic provide anything to you for their program?

Bailey:

I don’t remember how we got that from them, but we did. Ralph Smith was fairly tight with that program.

Dunn:

Did you send your data up there for them too?

Bailey:

No. We did that on our own data. Absolutely. We had a cart that did three channels times four for the twelve leads, and then three for the XYZ. We did the '74 evaluation with data from the Cardiology Branch on the NIH campus. I don’t remember how we got that program from Ralph Smith, except that some of these people were working on NIH grants. That helped them cooperate.

Comparative assessment of interpretive programs; computer analysis accuracy

Nebeker:

In comparing these programs, what did you determine?

Bailey:

In the 1974 publications? The IBM program came out with the best results.

Dunn:

Hubert Pipberger never agreed with that conclusion. If I recall, it was because your documentation of the actual ECG status was based solely on your cardiologists’ readings.

Nebeker:

Did you compare how typical cardiologists did against these programs? Was that an issue?

Bailey:

We had the same problem in terms of the diagnosis of cases. We had three readers that had to agree, and they had rules by which they had to agree. If they disagreed with each other or any of the programs, they had to go back and have that adjudicated. That was our so-called standard against which the programs were being measured. It was reasonable for what was available at the time.

Dunn:

Hubert picked apart your study from many points of view. One was that the patients must be classified from non-ECG criteria.

Nebeker:

Why should the practicing cardiologist care about these computer programs? There should be evidence that these do a good job compared to humans.

Bailey:

Yes. Studies had already been done along those lines when we came on the scene.

Nebeker:

Was it taken for granted these were useful programs?

Bailey:

One experience at NIH at that time was interesting. We were helping the cardiologists in the clinical center because all the normals could be just checked off, whereas some of the abnormals might need overreading and some of their diagnostic statements altered. Only one M.D. objected to that, and he decided to go through the measurement matrix and verify every measurement. He complained that it was costing him too much time to overread.

Nebeker:

The way it was actually implemented was that a cardiologist would run this program, overread it and then feel free to disagree if his interpretation was different?

Dunn:

And he would notate the tracing for differences. Rhythm determination was probably the first aspect the cardiologists became comfortable with.

Nebeker:

Maybe it was something the computer could do a more accurate job of measuring.

Dunn:

With proper wave recognition – the points of the QRST, the T and the baseline – the computer could make better measurements than the cardiologists and save time. As long as the computer just printed out the measurements it computed, the cardiologist was already provided some assistance.

Bailey:

When I first had to read ECGs in my residency, we had little calipers where we had to manually make measurements on magnitudes and intervals and so forth. Then we had to record them before offering an interpretation. We got four dollars for reading an ECG. The customer was charged sixteen dollars. The department took twelve and they paid us four.

Nebeker:

Was that the problem with the computer, that it had difficulty recognizing waveforms?

Dunn:

Part of Jim’s evaluation focused on the wave recognition aspect. The Pipberger program did very well on that in three-lead ECGs.

Bailey:

Oh yes.

Nebeker:

How well did these programs do wave recognition in general?

Bailey:

I did a funny little experiment. I had a battery-powered square wave generator and I fed that data to the computer programs. The IBM program said, “Inconsistent. No further analysis.” The Mayo Clinic and PHS programs read out all kinds of diagnoses on that.

Dunn:

Did Hubert’s VA program pass that test?

Bailey:

Yes. I don’t think I published on that program, but he thought that was really funny. Hubert liked that.

Nebeker:

Were the practicing cardiologists happy to have these programs available by and large?

Bailey:

In the seventies the programs weren't that widely distributed. They were mostly used in big medical centers because everything was done on mainframes at that time.

Nebeker:

Would they run on a minicomputer?

Bailey:

Later they went to minicomputers. Now they have a thing the size of a briefcase that will run everything. Telephone transmission was starting to burgeon at that time.

Dunn:

At the VA in the 70’s, the technicians would collect ECGs on data tapes and transfer them to a big reel which was read into the computer for analysis.

Bailey:

There was a report by Erica Drazen on how that went back around ’78.

Dunn:

Yes. It was the late seventies. I think that report was given at the TC-4 Working Conference in Halifax in ‘79.

Bailey:

I think it may have been presented at the first Engineering Foundation meeting.

The growth of ECG computer analysis was like an exponential curve that began sometime in the seventies.

Device improvements and research security

Nebeker:

Did the Hewlett-Packard device actually have the computer analysis in the cart?

Bailey:

Both Marquette and HP had analysis in the cart.

Dunn:

And IBM had a device.

Bailey:

IBM had a PC thing for a while that did it, but they were behind on the curve. Marquette and HP were out in front.

Nebeker:

When did HP and Marquette come out with carts that did that?

Bailey:

I’m thinking late seventies, early eighties. Hewlett-Packard started off with the PHS program. Then they did some major reprogramming. I think the same thing happened at Telemed. Telemed had the PHS program and then got Dave Mortara almost fresh out of his Ph.D. degree in physics. He made major changes in the Telemed program.

Dunn:

I thought he was with Marquette.

Bailey:

He was – after he left Telemed.

Nebeker:

I guess the microprocessor, which of course gives us the PC, makes it possible to do this on a cart. How did this area look to the researcher? Was there industrial secrecy, such that for instance HP didn’t want to reveal how they were doing it, or was this all in the open literature? Were people open about how they were analyzing ECGs?

Bailey:

Both Hewlett-Packard and IBM programs had published criteria, but the signal processing part of it might have been proprietary. They didn’t reveal how they found the QRS or what they did with their filtering for instance.
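
To make concrete what a “QRS finder” does, here is a generic textbook-style sketch (ours, not any vendor’s proprietary algorithm; the sampling rate, threshold, and refractory period are assumed):

```python
import numpy as np

def find_qrs(ecg, fs=500, refractory_s=0.25):
    """Toy slope-threshold R-wave detector: QRS complexes carry the
    steepest slopes in the signal, so flag threshold crossings of the
    first difference, at most one per refractory period."""
    slope = np.abs(np.diff(ecg))
    threshold = 4.0 * np.mean(slope)   # crude fixed threshold
    peaks, last = [], -np.inf
    for i in np.nonzero(slope > threshold)[0]:
        if i - last > refractory_s * fs:
            peaks.append(int(i))
            last = i
    return peaks                       # sample indices near each QRS onset
```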

Dunn:

Charlie Batchelor did one of the first wave recognition programs. It performed so well that it seemed all future programs were based upon his criteria. They came from the same progenitor.

Nebeker:

When people had a better idea of a way to improve the analysis, they would publish it rather than keep it a secret.

Dunn:

They might try a different filtering scheme, but they might not tell you.

Nebeker:

There was bound to be some of this tension in the medical equipment business. Marquette wants to sell its equipment, and if they’re completely open about every advance they’ve made then it’s going to be copied by other manufacturers. Was even Hewlett-Packard open about how they were analyzing the ECGs?

Bailey:

They had a list of the criteria.

Dunn:

Yes, but they wouldn’t give out the program.

Bailey:

It was on a Hewlett-Packard computer, which couldn’t be unraveled. With anything that was on a mainframe, usually what you got was an object program. We actually had the source program it was written in. The IBM program was originally written in PL-1.

Nebeker:

I remember PL-1.

Bailey:

And there was a PL-1 compiler. Then the systems people decided to get the PL-1 optimizer compiler, and this program was incompatible with it. It didn’t link up. Therefore we could not operate the program for a while. And it was a rental product. We had some fun over that. In a lot of circumstances what people got was basically the object program, and there was no way to unravel all that. That was when everything was on mainframes. Then things changed when it started going onto desktops and minicomputers and so forth.

Nebeker:

Then HP and Marquette came out with these carts that did the whole job in the late seventies. Did that take over? Did hospitals continue to use mainframes?

Bailey:

Mainframes went out fairly rapidly.

Dunn:

Absolutely. In the early seventies Hubert cut loose the Control Data computer that took up a whole room twice this size – a room with a built-up floor and air conditioning turned on so high that I had to wear a coat when I went in there. And all the data tapes were stored in there. He switched to a Varian computer. The Varian was the last mainframe.

Bailey:

There was Lynn Kyle who did the thing for PCs, right?

Dunn:

Yes, Lynn also worked for Ed Freis.

Nebeker:

In the course of the eighties this went to a cart that did the interpretation. Is that right?

Common Standards for Quantitative Electrocardiography Project

Bailey:

Yes. And that was when the CSE project was started. It was actually started in ’78 or ’79, and they published in the late eighties in the New England Journal of Medicine.

Nebeker:

What was the CSE project?

Bailey:

This was the Common Standards for Quantitative Electrocardiography.

Dunn:

The final report for that is the Standard Communications Protocol for Computerized Electrocardiography. Final Specifications and Recommendations. Jos Willems was the editor and it was published in Leuven, 1991.

Bailey:

Yes.

Exercise ECGs

Dunn:

We didn’t talk about the exercise ECG at all. That’s another interesting story.

Bailey:

Are you talking about Tom Sheffield down in Alabama?

Dunn:

Yes, and Bootsma, John Holt and Blomqvist.

Nebeker:

In what sense was this big business?

Dunn:

Exercise stress testing. That took off as a moneymaker at about the same time as ECG interpretation.

Nebeker:

The late seventies?

Dunn:

Maybe even earlier.