Oral-History:Andrew R. Molnar
About Andrew R. Molnar
A pioneer in computer-assisted education, Andrew Molnar earned his Ph.D. in psychology from the University of Maryland in 1959, after taking a year off from his studies in 1956 to work on the design team for the Pratt & Whitney J-75 turbojet engine. Following a brief period in which he developed command evaluation systems for System Development Corporation, Dr. Molnar joined the faculty of American University in 1961 as Professor of Research at the Center for Research and Social Systems. In 1966, he joined the U.S. Office of Education as Acting Director for Higher Education Research, where he oversaw the development of computer-assisted instruction technologies. Joining the National Science Foundation's Office of Computing Activities (OCA) in 1970, Dr. Molnar helped direct the continued development of CAI technologies, the deployment of internetworking for higher education, and the development of university curricula.
In this interview, Dr. Molnar discusses his career in the public sector, developing CAI and science education technologies, techniques, and curricula. He focuses on his work with the OCA in the 1970s and addresses both the office's strategic vision and its organizational challenges during a period of technological change and transition.
About the Interview
ANDREW R. MOLNAR: An Interview Conducted by William Aspray, Center for the History of Electrical Engineering, September 25, 1991
Interview #133 for the IEEE History Center, The Institute of Electrical and Electronics Engineers, Inc. and Rutgers, The State University of New Jersey
This manuscript is being made available for research purposes only. All literary rights in the manuscript, including the right to publish, are reserved to the IEEE History Center. No part of the manuscript may be quoted for publication without the written permission of the Director of the IEEE History Center.
Requests for permission to quote for publication should be addressed to the IEEE History Center Oral History Program, Rutgers - the State University, 39 Union Street, New Brunswick, NJ 08901-8538 USA. They should include identification of the specific passages to be quoted, anticipated use of the passages, and identification of the user.
It is recommended that this oral history be cited as follows:
Andrew R. Molnar, an oral history conducted in 1991 by William Aspray, IEEE History Center, Rutgers University, New Brunswick, NJ, USA.
INTERVIEW: Andrew R. Molnar
INTERVIEWER: William Aspray
DATE: September 25, 1991
PLACE: Washington DC
Background and Education
Let me get some personal information from you about your background before you came to the National Science Foundation (NSF). Tell me about your education.
I got a Bachelor's, a Master's, and a Ph.D. from the University of Maryland in 1952, 1955, and 1959, and I took out a year in 1956 to go to a design engineer program at Pratt & Whitney Aircraft. I did two years of engineering in one year and worked for a short time as a design engineer on the J-75 turbojet engine.
The degrees at Maryland were in what area?
All in psychology.
What was your career path then?
That is a little bit complicated. I had worked for a group called Psychological Research Associates, and we did military research for training systems. I directed a project interviewing Hungarian refugees after the 1956 Hungarian Revolution, was a member of a team that designed information requirements for an anti-submarine warfare tactic, and I did human factors design for training anti-submarine warfare operators. I worked on a project that evaluated the combat effectiveness of small unit infantry rifle squads. I was a graduate assistant at Maryland during this time and worked at Psychological Research Associates.
From there I went to the Carmody Corporation, where I was the Director for Research and Training. We were a small company that designed and built training and simulation equipment for business, industry, and the military. We designed and built simulators for military aircraft and the petrochemical industry and provided factory training for the operators. This was 1958-1959. I conducted a study of wide-angle, non-programmed, visual presentations for aircraft landing simulators.
From there I went to System Development Corporation, where I was a human factors scientist and helped develop an evaluation facility for decision making for the Strategic Air Command. We built the facility and ran experimental tests of the information displays for strategic decision making. I initially worked in the building that housed one of the early large computers for the SAGE Air Defense System. It was a machine that used vacuum tubes and filled the whole building, yet it could hardly do what a desktop can do today.
Then I went to American University, and for five years I was a Professor of Research at the Center for Research and Social Systems. I was in charge of a group that did research on underground, insurgent, and revolutionary warfare, and I wrote several books on the subject. We studied terrorism, insurgency, and counter-insurgency.
Then in 1966, I went to the then U.S. Office of Education (USOE). At the Office of Education, I was the Acting Director for Higher Education Research. I reported to Louis Bright, the Associate Commissioner for the Bureau of Research. He had worked at Westinghouse on Computer Assisted Instruction. I was also the Executive Secretary for the New Educational Media Program. I planned and held meetings for the Commissioner of Education, Harold Howe. We made the initial USOE award to the Children's Television Workshop to create the television program Sesame Street. We made awards for computer and television projects under Title VII of the National Defense Education Act.
Could you say a few words about what some of those programs were?
Oh boy. Good question. We supported Patrick Suppes at Stanford to develop and test Computer Assisted Instruction (CAI) for beginning mathematics. He also used the computer at Stanford to provide services through telephone lines and terminals for schools in Kentucky and Mississippi. One of my first tasks was to respond to a proposal from the Interuniversity Communications Council (EDUCOM). They had proposed a multimillion dollar project to create an information network to bring computing and television to colleges and universities. Herb Grosch from the National Bureau of Standards was also on the committee.
Oh gee. He is a character.
We were charged with making an interagency response to EDUCOM. We gave them a small award to begin a network project. What else were we doing at that time? I surveyed all of the USOE programs to find out what we were supporting in computers. I put these findings together and made a report to the Commissioner for Education, Harold Howe. At that time it took about a year for us to get anything printed by the Government Printing Office, so we gave the report to people on the proviso that they xerox three copies of it for us. I think we distributed several hundred copies that way. The report, The U.S. Office of Education Support of Computer Activities, was finally published in 1969. The early stages of computing were rather chaotic. Most were trying to demonstrate some application that could be used immediately. Pat Suppes was using small computers to teach mathematics. While audio and graphics would be nice, most felt that they were just beyond the reach of education and they did not want to wait for them to be widely available.
You were at the Office of Education at this time. Then what happened?
I was there for four years. A report by the National Academy of Sciences, Digital Computer Needs in Universities and Colleges, made a strong case for universities having access to computers for research, but said little about education. In 1967, the President's Science Advisory Committee (PSAC) commissioned a study of computers in higher education. John Pierce from Bell Labs was the chairman and held extensive hearings. They concluded that an undergraduate college education without adequate computing was as deficient as an undergraduate education would be without an adequate library. They also felt there was value in using computers for precollege education. These recommendations had a significant impact on educators. I think it was a trigger for the involvement of the National Science Foundation. The most significant event occurred when President Lyndon Johnson, in his February 28, 1967 speech, directed the National Science Foundation to work with the U.S. Office of Education to establish an experimental program to develop the potential of computers in education. In July of 1967, in response to the directive, NSF created the Office of Computing Activities (OCA) to provide Federal leadership in the use of computers for research and education. I joined after that.
When was it that you joined OCA?
1970. OCA reported directly to the NSF Director. Eventually the directive was put into the NSF Charter. So we had three groups. Don Aufenkamp directed the Computer Applications Program with Peter Lykos and Eric McWilliams. The Computer Science Program was directed by Kent Curtis with Tom Keenan and John Lehmann. Arthur Melmed directed the Computer Innovation in Education Program with Larry Oliver and me. When I arrived, Milt Rose had resigned and Glen Ingram was acting. John Pasta later came in, after Glen Ingram, to be Director of OCA. John was really outstanding. He was the Director of Computer Science at the University of Illinois. Previously he worked as a physicist at Los Alamos. Another interesting thing about him was that he was an ex-New York policeman. He could be very forceful, using skills he acquired as a policeman. He was very intelligent, personable, and open to all ideas.
You said that Melmed was responsible for running the program in education. What was your role?
I was one of the program officers.
And how many were there?
Just in education is fine.
That is hard to say. John Lehmann started the training programs, and then John moved over to the Computer Science Program, and I took John's place. So there was Arthur, Larry, and myself. The main core of the people in the education program came when we transferred from OCA to the Education Directorate in 1973. When Guy Stever, President of Carnegie Mellon University, became Director of NSF, we really thought we had it made. We had someone who understood computing. OCA had assisted him in acquiring a large computer at CMU. But his first action was to do away with equipment grants, because the computer at CMU had given him so many financial and technical problems. I believe he sold it. Stever took the Computer Innovation in Education Section and moved it over to the Education Directorate, because they were having trouble and he felt that this would strengthen their programs.
Once we got to the Directorate for Education, things changed. We were renamed The Technological Innovations in Education Program. Larry Oliver managed the Educational Computing in Minority Institutions Program. Eric McWilliams came on. Eric was running, I think, the Cornell regional computer network before joining NSF. Dorothy Deringer, an information scientist from Case Western Reserve University, joined us. So that was the main group. The main core of the group was Arthur Melmed, Eric McWilliams, Dorothy Deringer, Larry Oliver and myself.
That was a very critical time. In 1972, federal money for research and education started drying up. The real visionary things that went on in the 1960s and early 1970s, a period described as the Golden Age of Education, became hard to do. From then on it was never the same; financial problems limited our ability to do new innovative things at the national level.
About that time it was decided that we should do several major demonstrations, because we felt we were about ready to do something significant, and that if we did not do it then, we probably would not have the financial resources to do it in the future. So we focused on the PLATO (Programmed Logic for Automatic Teaching Operations) Project directed by Donald Bitzer at the Computer Based Education Research Laboratory at the University of Illinois. Bitzer wanted to demonstrate, with NSF support, that a large computer could be used to serve thousands of students, at many different geographic locations, in hundreds of different courses, at a reasonable cost. A unique feature of the system was the use of a plasma display that could provide high quality, low cost graphics.
Since we did not want to go with just one demonstration, we selected the Time Shared Interactive Computer Controlled Information Television System (TICCIT) directed by John Volk of the MITRE Corporation. TICCIT used a minicomputer and two-way television technology. Victor Bunderson from Brigham Young University developed college level courses in mathematics and English for the demonstration. These were two dramatically different instructional strategies. PLATO was a large centralized system that would send messages through networks to the plasma terminals. The TICCIT system used a minicomputer, which eventually became a microcomputer, and introduced two-way television so that students could see the teacher and the teacher could see the student while in different locations. The TICCIT approach was to make small, autonomous units rather than one large network. The Educational Testing Service was used to evaluate the two projects. Eric McWilliams became the project officer for these projects.
At that time we were also supporting Seymour Papert at the Massachusetts Institute of Technology (MIT) to develop LOGO, a programming language for children. Earlier NSF had supported Tom Kurtz and John Kemeny at Dartmouth to develop an easy to use programming language called BASIC. OCA helped to develop Curriculum '68 and Curriculum '78. At that time, a survey found that nationally there were about twenty-eight computer-oriented degree programs. They had different names but usually had "computer" in their title. The curriculum and the courses varied from institution to institution. The Curriculum '68 Committee's answer was really brilliant. They did not tell people what to do or how to do it. What they did was to survey what people were doing, group the results into logical areas, and say that if you were going to call yourself a "computer science department" you really ought to have something in each of these areas, and if you lacked course names, they suggested course descriptions. They gave the field the freedom to decide. The recommendations were quickly adopted and the community converged on an accepted definition of computer science and its curricula.
Then in Curriculum '78, IEEE and the Association for Computing Machinery (ACM) worked closely together to reexamine the curricula. And while there was general agreement on the curricula, there was a need for an engineering curriculum.
In June of 1970, we supported the Computer Conference on Undergraduate Curriculum (CCUC). Gerard Weeg organized the first conference at the University of Iowa. He proposed a conference to take a look at computer applications in the classroom. In other words, the ticket to the conference was a report on what people in the sciences were actually doing in their classrooms. He wanted people to come together and exchange practical ideas about what was going on. Tom Kurtz organized the second CCUC conference which was held at Dartmouth.
How long did this last?
Let me see, the National Education Computing Conference (NECC) was a spin-off from the CCUC. The NSF supported CCUC for a number of years. In 1979, in lieu of the tenth Computer Conference on Undergraduate Curricula, it was decided to create a consortium called the National Education Computing Conference (NECC). Since then I guess it has been about 12 years now. I could give you a speech that I made. At that time, I wanted to get together the people who were the organizers and movers in those activities. So I wrote a brief history identifying some of the thinkers and the doers in educational computing. What I tried to do was to present some thoughts on the history of computing and to single out people who had made major contributions in the field.
This document has some of the things you have already been telling me about?
Yes, in more detail, more explicit. The paper is "Computers in Education: A Historical Perspective of the Unfinished Task," published in the THE Journal (Technological Horizons in Education), Vol. 18, 1990. The wide scale use of the computer for educational activities is about 30 years old. It is not that people prior to that did not use computers for specific educational purposes, but for the most part such use was limited to a single course. For example, IBM in 1958 demonstrated the teaching of binary arithmetic by computer. System Development Corporation in 1959 had a teaching project called CLASS. It is not that people did not demonstrate the value of computers for instruction, but PLATO, in about 1959, became the first large scale use of a computer dedicated to instructional purposes in education.
The other thing that we did at that time was to create a number of regional computer networks for instructional purposes so that institutions could share computer resources and instructional materials. In all, there were some 30 regional computing networks that included about 300 institutions at all levels of education. The idea was to get computing out to as many people as possible, including minority institutions. In fact, I think some of those minicomputers that were given to those minority institutions may still be running and still being used. Nellouise Watkins at Bennett College and Jesse Lewis at Jackson State were some of the early pioneers in providing regional computing services.
But we were concerned that while some of the regional computing networks were extremely successful others were not. So we wanted to know why that was so, and more importantly whether we could build upon that knowledge. So Jim Johnson at Iowa along with Jerry Weeg at Iowa, Tom Kurtz at Dartmouth, Jim Parker at North Carolina, and people at Texas, and at Oregon State formed a consortium of five regional networks with approximately 100 colleges and universities, called CONDUIT, to devise better ways of sharing computer-based curricula.
Most networks at the time would be happy to send you their materials, but few would accept materials from other institutions, for a wide variety of reasons. The computer based materials were written in a variety of programming languages. This made transfer difficult. There was a lack of documentation.
About that time there was a study on the use of computers for statistical correlation. I think they went to about 30 institutions, gave them data, and asked them to compute a correlation. They got 30 different answers, with wide variations in accuracy. The concern was how do we get networks to cooperate, how do we get networks to exchange materials, and how do we get the materials into the classroom. And how do we validate the content, and how can we be sure that there are no programming errors. After all, in most cases, students wrote the programs and did not always document what they did.
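The accuracy problem that study uncovered is easy to reproduce even today. A common cause was the single-pass "computing formula" for Pearson's correlation, whose subtractions cancel catastrophically when the data values are large relative to their spread; a two-pass formula that subtracts the means first does not. This is an illustrative sketch under that assumption, not code from any of the programs in the study, and the data are hypothetical:

```python
import math

def pearson_naive(xs, ys):
    """Single-pass 'computing formula': r = (n*Sxy - Sx*Sy) / sqrt(...).
    Mathematically correct, but the subtractions cancel catastrophically
    when the data have a large mean relative to their spread."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    syy = sum(y * y for y in ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    num = n * sxy - sx * sy
    den = math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))
    return num / den

def pearson_two_pass(xs, ys):
    """Two-pass formula: subtract the means first, then accumulate.
    The deviations are small, so no catastrophic cancellation occurs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    dxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    dxx = sum((x - mx) ** 2 for x in xs)
    dyy = sum((y - my) ** 2 for y in ys)
    return dxy / math.sqrt(dxx * dyy)

# Hypothetical data with a large offset, like raw instrument readings.
xs = [1e8 + i for i in range(10)]
ys = [float(i * i) for i in range(10)]

r_naive = pearson_naive(xs, ys)
r_stable = pearson_two_pass(xs, ys)
```

On this data the two routines disagree by several percent, even in double precision; the institutions in the study, working in assorted languages on assorted word lengths, diverged far more.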
What CONDUIT did was to select about seven fields of science, exchange the course materials with each other, and certify that, in fact, they worked. And while we initially wanted to validate the effectiveness of the educational materials, we soon came to the brutal conclusion that certification was about all we could offer, mainly because the materials frequently had programming errors in them, or the programming was inefficient or contained unique problem-oriented language buried in the program, or the substantive materials were often in error. Only someone who had programming experience and who had expertise in the disciplines involved could detect those errors.
Very few people had thought about how best to introduce the materials to classroom teachers. So the centers created common documentation to accept and exchange materials. Everything that was written in FORTRAN was also written in BASIC, and whatever special equipment was required to run the materials was described. Then they moved the courseware into the classrooms and tried it out. We did research to find out how well people accepted it. What we found out was that merely mailing things to people did not work. The documentation was not very good. While the author understood the materials, the user probably did not. And you needed a knowledgeable expert in the field to run a workshop if you were looking for the most cost-effective way to distribute the materials.
More importantly, we wanted to compare the networks to see if there was some pattern that could be used to identify the successful adoptions from those that were not successful. Fred Weingarten and others did a study of the Regional Computer Networks that was published in 1973. They could not find any particular pattern that led to the successful transportation and adoption of computer based materials. We came to the informal conclusion, after the study, that some places had a critical mass of people who were involved in computing and courseware development, and in those places everything worked, documentation or not. And in other places, with equal reputations, equipment and people, that did not have a critical mass, very little worked.
About that time small, stand-alone minicomputers became popular. Some preferred to have a mini rather than access to a network with educational materials. They preferred the minicomputer because it was theirs and they controlled it. And students could not run up exorbitant bills, as they often did on the network. If the computer went down, you brought it down, not somebody else on the network. So, in spite of all the attractive incentives to join networks, people still wanted departmental minis.
What was the date for CONDUIT?
About 1975, the MITRE Corporation wired some homes outside of Washington so that they could demonstrate 2-way television and computing in the home. The idea was that they were now in a position to be able to network cable so that we could have 2-way interactive television and computing. Basically, for the computing they used a drop line via telephone, and they used cable for the 2-way television. They used some drill and practice programs and ran some classes in the home. After the NSF support for development, I believe it was later picked up in Buffalo, New York for a project with homebound handicapped students. A cable company ran some channels for teachers to work with handicapped kids who could not get to school. The objective was for the students to be able to look in on the class at school and for the teacher to look in on them while they were using the materials at home.
National Education Initiatives
Why was the move of education out of Office of Computing Activities into an education division? Why did that occur?
The Education Directorate was having troubles in Congress over MACOS (Man: A Course of Study). Other than Graduate Fellowships, the Foundation was never comfortable with education as an activity. Several Directors, including Stever, had surveyed the National Science Board (NSB) about doing away with the education function. In 1983, they actually did do away with it. President Ronald Reagan believed that education was best served at the state and local level. So Dr. Slaughter, the Director at the time, disestablished the education programs at NSF, and the National Science Board approved the action. At that time, after a Reduction in Force, I was the only program officer left. I had several hundred grants and was responsible for closing them. Some were interesting ones. We were supporting the University of Virginia program in CAD (Computer Aided Design). We were supporting the audio/visual engineering programs to create audio/visual materials for computer science and engineering courses.
At that time we had no money. However, before we closed down, I found that there was a provision in the Foundation's Charter that allowed us to accept gifts. Using this information, Dorothy Deringer solicited vendors for equipment. Five vendors gave us equipment systems which she used to make awards. They were IBM, Apple, Atari, Digital Equipment Corporation and Radio Shack. In 1982, Deringer initiated over 50 projects to develop computer based materials for science and engineering education using industry donated equipment.
In another initiative, the Defense Advanced Research Projects Agency (DARPA) and NSF offered a Very Large Scale Integration (VLSI) program. DARPA was very interested in the University of Southern California project for the fast fabrication of computer chips, which was underutilized. At the urging of DARPA, NSF set up a graduate and undergraduate program for computer architecture courses to permit free access to the fast fabrication facility (1983-1985). John Lehmann, Bernie Chern, and I were the screening committee for access to that system.
At that time, VLSI was new in universities. The program permitted computer and engineering departments to submit VLSI designs that were fabricated and returned to the students during the course for testing. The only requirement we placed on them was that once they had the unit fabricated they would have to test it and put the results on the network. That project, I believe, advanced engineering education by at least a decade. At that time, while many thought graduate students might benefit from the program, few believed that undergraduates would. I heard that patents and other awards were coming out of the undergraduate programs. The significance of the program was that while most universities did not have access to such facilities for classroom work, by disaggregating manufacturing from design, submitting designs through the network, and returning the chips by mail during the course, we could permit teachers and their students to design all sorts of things for minimal cost.
The White House became a little nervous about eliminating science education programs and called over and asked if we could create a Presidential Awards Ceremony for outstanding teachers of mathematics and science and arrange for the President to make the awards at the White House in two weeks. We said we could, but it might reflect badly on the White House if the program was hastily carried out. We felt we could do it in a month. Therefore, Robert Watson and I undertook the task of creating a Presidential Awards Program for the White House. I solicited support from several math and science associations. I contacted the National Science Teachers Association and the National Council of Teachers of Mathematics to set up a selection procedure to pick one math and one science teacher from each State, the District of Columbia, and Puerto Rico. I arranged for the teachers to come to Washington for four days and visit federal agencies involved in math and science, and for them to receive $5,000 for use in their classrooms. In October of 1983, the first Presidential Awards for Excellence in Science and Mathematics Teaching were made in the East Room of the White House. President Reagan addressed the group. He was a marvelous speaker and the teachers cheered him. The program continues to this day.
Also, during that time, I guess I made about 24 speeches from Alaska to Florida, and at that time I had no secretary, no new programs, and no travel budget. Yet, it was a very productive year. Later, we were "reestablished." A commission was formed, and it recommended that there was an urgent need to support mathematics, science, and technology in education.
In 1984, I became the Program Director for the Applications of Advanced Technologies (AAT). It is a program for research and development on advanced computer and telecommunications technologies and their application in mathematics, science, and engineering, at all levels of education.
The National Science Foundation, interestingly enough, never had a research program in education until 1975. It has always resisted educational research. However, in about 1975, the Education Directorate was created with a small Research in Science Education program (RISE) and a larger Development in Science Education program (DISE). The research program tended to support research, but not development. The development program tended to support development, but not research. Technology was not a high priority in either program. Program Directors could on occasion jointly fund projects, but timing and deadlines caused administrative difficulties.
Later we created the AAT Program, which supports both research and development at all levels of education, because we wanted to articulate technology applications throughout the whole educational system. Our concern is with things that are going to be important five to ten years from now. Prototype systems always exist prior to full scale development, and they could be used through proof of concept tests to lay the foundation for the decisions that educators would face at the time of adoption.
The program is concerned about the enormous increase in the level of complexity in science that is technology dependent for representation, manipulation, and cognitive understanding. Such areas as chaos theory, fractals, and computer models and animations of theoretical phenomena may require parallel processors or large databases and nonlinear applications. Finally, the Program is interested in supporting high gain, high risk projects.
Experiments in CAI
Around 1970, what was the attitude within OCA towards CAI?
Nobody liked the term Computer Assisted Instruction. Everybody felt it was a misnomer. But its use in newspapers was so widespread, and so many people were familiar with the term CAI, that it became apparent there was no way of converting them to anything else. Computer based instruction and computer managed instruction were used, but you could not change the popular usage of CAI. However, in the early days, CAI was not all that popular. You could add two and two, but you had to wait about 5 to 10 seconds to get the answer before you could go on to the next frame. Many of the kids became bored. Drill and practice was usually limited to 10 minutes, because that was the limit of the student's span of attention when doing drill and practice. While it improved performance, it could be boring.
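The drill-and-practice pattern described here is simple enough to sketch. This is a toy illustration in modern Python, not code from any actual CAI system of the era; the `respond` callback is a stand-in for the student typing at a terminal:

```python
import random

def drill_session(respond, num_items=5, seed=0):
    """Run a short arithmetic drill in the drill-and-practice style:
    present an item, check the student's answer, tally the score.
    `respond` is a callback standing in for terminal input."""
    rng = random.Random(seed)  # fixed seed so the item sequence is repeatable
    score = 0
    for _ in range(num_items):
        a, b = rng.randint(1, 9), rng.randint(1, 9)
        if respond(a, b) == a + b:   # frame: pose a + b, grade the reply
            score += 1
    return score

# A "student" who always answers correctly gets a perfect score.
perfect = drill_session(lambda a, b: a + b)
# A "student" who always answers 0 gets none right.
zero = drill_session(lambda a, b: 0)
```

Real systems of the era layered onto this loop exactly what the interview mentions: response latency of several seconds per frame, a roughly ten-minute session limit, and, in the more sophisticated systems, adaptive selection of the next item.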
Probably no other name is more closely associated with CAI than Suppes. For years they were synonymous. He was the Director of the Institute for Mathematical Studies in the Social Sciences at Stanford. Ken Arrow, the Nobel Laureate, and Suppes formed a laboratory to develop such a system. Later they were joined by Dick Atkinson, a psychologist at Stanford. With grants from NSF, Suppes and Atkinson established a research and development program to develop the use of CAI in math and reading. They also developed sophisticated mathematical models of student learning to help design the material and provide instructional strategies. Suppes wanted to demonstrate that computers could play an important and immediate role in education using existing equipment.
At that time, Suppes was able to go into very poorly run educational environments and produce results immediately. The State of California carried out a well controlled study to look at reading, arithmetic, and one other subject. The design used combinations of the three. They had single, two, and three course treatments. It ran for two or three years so they could look at the long term and short term effects. The arithmetic did work; it improved performance significantly, but the reading was not as successful.
In addition to developing a theoretical base for the presentation of the materials, Richard Atkinson was instrumental in bringing about the paradigm shift from theories of learning to theories of cognition. The new focus in education is on how people think. In addition, we should look beyond learning in the classroom and look at how professionals in math and science solve problems and organize their thoughts, and see how they differ from novices. That dramatically changed the focus of attention on how computers could be used in education. But it did not detract from the CAI efforts. Pilots, surgeons, all need drill and practice to develop automatic behavioral skills. Dick Atkinson became the Deputy Director of NSF in 1975 and two years later, he became the Director. In 1980 he left NSF to be the Chancellor of the University of California at San Diego.
In one of Suppes' early controlled experiments, he found one of the control groups was doing as well as the CAI group. Suppes went to the site and found that the teacher had told the class, we aren't going to let those computer kids beat us. And they did do as well. That story was used to sell CAI. Good teachers and motivated students could achieve as well as the CAI kids, but they had to exert extra effort. And in those cases where you did not have outstanding teachers or highly motivated students, you could use CAI and get higher achievement.
Suppes went on to develop a wide variety of CAI courses. He developed courses in logic, set theory, theorem proving, calculus, and foreign languages. One semester he was professor of record for eleven courses at Stanford. He just received the Medal of Science this year for his educational research. He is one of the few educators, and certainly the only one that I know of, who received the Medal for the uses of computing for educational purposes.
Don Bitzer’s approach was totally different. Bitzer felt that everybody knew what to teach, and therefore what you had to do was provide them with the resources to teach. I remember visiting the University of Illinois during student demonstrations and stopping at the PLATO learning center about 10 o’clock at night. While students were out demonstrating and tearing up the campus, the learning center was full of students working on PLATO. It was interesting, to say the least.
Can you tell me something about the popular impression of CAI in about 1970? Was there a sense that it was going to be a great saving grace, a powerful tool that would help education or — ? What was the general feeling outside of the agency?
I think the people in the Foundation and the people in the field had great optimism about what could be accomplished, but they were also realists about the costs. They knew what the problems were in education, they knew the difficulties associated with installing technologies and the limitations of the materials that were available. The costs would be huge. And to this day, there is still great skepticism about widespread use. In fact, if you look back historically, most of the government programs in the Department of Education and the National Science Foundation have been aimed at limiting support for the development of technology. The rise of technology in education is really a grassroots revolution. It began with parents who worked in the industries that were involved with technologies. They brought these tools to the students, and the students showed them to the teachers.
The Nation at Risk report was the first time that technology was accepted as a reasonable thing to consider in the curriculum. We distinguish among technology as a medium, technology as a tool, and technology as an object of study. We supported a wide range of projects in each of those areas. CAI to this day has its place. But while kids learn better and faster than with traditional methods, they also make the same mistakes as they do with traditional methods.
Excuse me if I belabor this point slightly, but I have what I regard as an unreliable source who has told me the following, and I would like to get your opinion on it. I have heard that a certain amount of pressure was placed on the Foundation by the White House, from the Johnson Administration through the Nixon Administration, regarding the value of CAI either as a way of educating the masses or of cutting the cost of education, but that within the Foundation there was a certain skepticism about these hyped-up claims for CAI, especially among the scientific advisory committees for the Foundation.
It was far more chaotic than that. That is, we prepared some testimony for congressional committees at that time that were interested in looking at alternative ways of doing education. You have to begin back with the Elementary and Secondary Education Act. After the war and the launching of Sputnik, there was a great concern that what was going on in education was dated, obsolete, or just wrong. There was a renewed interest in teaching creative problem solving and updating the curricula. At that time, Congress passed acts that permitted support for very innovative programs. It provided funding for schools to try many innovative things that they never had the resources to do. But the emphasis was to support state and local developments.
There was a tremendous concern about the role of the federal government in education. I remember a Congressman telling us at USOE, you have done nothing; you have done a splendid job. In other words, the concern was to keep the federal government out of education totally. Anything that smacked of federal leadership or intervention was looked upon very negatively. Therefore, the Elementary and Secondary Education Act permitted people to experiment with new ideas. Most of the funds were distributed to sites all over the country for local efforts. So a thousand flowers bloomed, but without technical support many of the projects failed or had limited impact. President Johnson proposed support for “knowledge networks”. Congress passed the legislation, but provided no money for it. About two years later, an evaluation found that nothing was accomplished, so they did away with it. Below national visibility, there were federal programs that could provide limited support for technology in education. However, there was no coordinated direction. The programs tended to provide training but no equipment; equipment but no software; software but no materials. Nationally, we spent something like half a billion dollars on technology in education, but had little to show for it. It was difficult to put together the resources from the many federal programs into one package to support the development of an entire comprehensive system for technology in education.
Outside of graduate fellowships, NSF was hesitant to support programs in science education. The Office of Computing Activities did support computers in education and provided an ideal environment for both the development of computer applications in science and their eventual use in the classroom. But when we proposed the two large-scale demonstrations of PLATO and TICCIT, the National Science Board was initially against it. Don Bitzer was invited to the NSF to make a presentation before the Board. Bitzer wanted to bring in terminals and demonstrate its use in the classroom. Back at Illinois, he conducted a pre-test and nothing worked. They made the necessary corrections and put on a demonstration for the Board. Initially the Board was extremely skeptical, since most were unfamiliar with the developments in instructional uses of computers. They demonstrated the use of PLATO in several disciplines. But Stan Smith, who was an outstanding chemist and a marvelous teacher, challenged the Board. He said, tell me anything you find hard to teach in chemistry. Board members would give him a problem, and he would type it into PLATO and the chemical symbols would come up on the screen. He then would solve the problem. The Board was very impressed. They did not believe this could be done. After the presentations the Board supported the demonstrations. Bitzer made presentations for the Congress. And Congress approved of the demonstrations, but did not provide any extra money. Therefore, we had to take the funds out of existing programs.
There was a wide array of courses and programs designed to introduce computers into education. In 1978, I wrote a paper entitled, “The Next Great Crisis in American Education: Computer Literacy”. It was reprinted in about a dozen publications and caused a minor stir. It is sort of ironic, because literacy programs were in vogue, but no one knew exactly what computer literacy was or should be. Several writers tried to define it, but there was no broad consensus about what all students should know about computers. We liked the term because it was broad enough that we could get all of the various introductory programs together under one roof. Therefore, we initiated a national conference and supported a number of computer literacy projects.
Back in 1979, we were concerned about setting a context for the research that we were interested in supporting. We wanted to anticipate the future so that we could encourage testing new developments in the field. We organized a panel, led by J.C.R. Licklider from MIT and John Seely Brown from the Xerox Palo Alto Research Center, with a distinguished group to look at what the potential for technology was in the foreseeable future. They produced a report called “Technology in Science Education: The Next Ten Years, Perspectives and Recommendations”.
We helped Alfred Bork at the Department of Physics at the University of California at Irvine to organize a conference to consider creating a system of intelligent videodiscs for education. In addition to inviting computer researchers, representatives from the Dutch electronics manufacturer Philips and the French company Thomson were invited. At the time, they had developed the optical disc technology. The vendors were asked what it would take to bring optical discs out commercially for education. And they said that while they did not see any educational uses, if we could promise them a million sales a year they would be happy to bring it out. So we went a step further and asked whether they could attach it to a small computer and make it an intelligent video system using the best features of both technologies. They could not see a market.
Al Bork and members of the conference set about designing an interactive computing system using optical discs. After the meeting we funded two proof-of-concept evaluations. Victor Bunderson, then at Brigham Young University in Utah, created an intelligent videodisc project to test teaching about DNA in biology, and another project at the University of Utah was supported for teaching physics and engineering using computer-based interactive videodiscs. We were really quite ahead of the field in doing research on the applications of advanced technology. Through proof-of-concept evaluations, we wanted to provide evidence that the systems were cost effective so that educators and vendors alike would have a sound basis for making decisions about adoption. Yes, we faced lots of skepticism about the value of technology in education.
We were constantly engaged in answering questions and defending our budget. At one time, I believe that we had to report to seven different oversight committees. But there was a strong community of researchers out there that did some outstanding things with little or no federal support. So again, it was essentially a grassroots movement that advanced the field.
Proving the Concept
In your estimation, were the TICCIT and PLATO projects successful demonstrations?
Yes, I think they were. That is, it was at a time when budgets were contracting, when there was a waning interest in doing innovations that might be too expensive. The demonstrations drew lots of attention. The Foundation wanted to provide a demonstration that would convince the educational community of its potential and have the business community take over for further development. And in fact that goal was accomplished. Control Data took over the PLATO development and, I am told, spent an estimated billion dollars in support of very creative activities that were still way ahead of their time. The MITRE Corporation took over the TICCIT project. So the goals were accomplished. PLATO is used worldwide. It is used in Saudi Arabia; the Department of Defense and the Internal Revenue Service use it; it is used by air traffic controllers. And studies found that it was cost effective for training in the military, where you have to pay a soldier a salary while he learns.
Bitzer is a marvelous entrepreneur. I remember at a conference in 1968, he came to me with a unit of four plasma cells and said he wanted to make a display out of these cells, and he plugged it in and it lit up. He said, see that light? Can you imagine a wall-to-wall screen? I could not imagine a small terminal, let alone a large screen. But he could. Bitzer believed that small stand-alone machines could not provide the necessary services and that only large, interactive networks could provide high-quality instructional materials. He faced many problems in developing PLATO. He tried to get the federal educational discount that telephone companies provided so that he could reduce his communications costs. Initially, the telephone company resisted, but he persisted and they eventually gave him the discount. He had to overcome many major obstacles, but he developed a really important system.
The perennial question about CAI is, does it really work? We wanted to be able to answer that question. We gave James Kulik, at the University of Michigan, an award to look at the cost effectiveness of computer-based education programs over the previous 15 years. He performed a meta-analysis of several hundred well-controlled studies in a wide variety of fields at the elementary, secondary, higher, and adult education levels. He found that computers did improve performance by 10 to 20 percentile points, could reduce time on task by one third, and that, in fact, kids liked working with them. These studies tended to cover the older applications and did not include newer studies utilizing more advanced technologies and newer educational paradigms. But he did answer the question: computer-based technologies do work. Similar results came out of the military and commercial sectors. We wanted to use these studies to establish a base for comparison with new approaches. We wanted to do things that were at least an order of magnitude better. One cost-effectiveness study out of the University of Chicago found that computer-assisted instruction is three times more cost effective than tutoring, which produces about a one-sigma difference. What we were aiming for was several-sigma differences.
In 1983, the Science and Technology Committee at the White House asked the National Academy to do a study of artificial intelligence in computing. We picked up this recommendation and decided to fund projects that included AI techniques, and we supported intelligent tutoring projects, or ICAI, in calculus, algebra, geometry, pre-algebra, and algorithmic problem solving at all levels of mathematics. We wanted to test ICAI in more than one application to see if we could generalize to its use in a lot of applications.
Suppes is now using intelligent systems. Suppes just reported results on calculus with seventh and eighth graders. He also did a pilot with kids in rural locations that do not have access to calculus teachers. Something like 970,000 kids take introductory calculus, but only about 140,000 graduate with a D or better, and most of them never take calculus again. Most countries throughout the world teach calculus at the high school level. About 7 percent of our kids take calculus. Most of them have to retake it after they go on to college. Therefore, remedial calculus is not an answer. It is too expensive. In other words, you have to begin back at the early stages to find out why students fail. ICAI might be cheaper and more effective than remedial education.
Suppes’ initial study was with 13 kids in grades seven and eight. They took the interactive calculus course and then took the Advanced Placement Test. He found that six of them got the highest score on the Advanced Placement Test, six got the next highest score, and one got a three, which is passing.
We have also supported ICAI in geometry. John Anderson at Carnegie Mellon University has developed a theory of learning and cognition, used it in ICAI, and demonstrated a one-sigma improvement in geometry. We are now testing it in both algebra and geometry in the Pittsburgh schools. While John is getting a significant improvement in his classes, an independent evaluation study found that teachers were more impressed with the fact that students were coming early to class and staying late and were highly motivated to use the system.
So we have also focused on creating intelligent tools and trying to anticipate the new technology-based curriculum coming down the line. But the support for technology has not always been strong. Beginning in about 1982, it was argued that computers should be as important as reading, writing, and arithmetic. Some wanted to make it a requirement. Now the arguments have shifted, and educators are seeking new justifications for why technology should be used in education while naysayers are trying to justify why it should not. And while we have had some dramatic results, it is clear that doing better is not the same as doing new and better things. With the increasing complexity of science and the development of new uses of computer-based technology in science, it follows that many advances in the quality of education will depend on new intelligent tools as an alternative approach to replace old paper-and-pencil technologies and many obsolete areas of study.
On the other hand, doing better is not the same thing as doing better things. Seymour Papert at the MIT Media Laboratory believes that we tend to underestimate the capabilities of children and has demonstrated that, with proper computer tools and a supportive educational environment, they are capable of performing tasks usually assigned to much older kids. He developed a computer programming language, LOGO, and microworlds that involve problem solving in math and science. He believed kids should not be taught mathematics, but should be taught to be mathematicians. He created game-like environments where they can develop their math and problem-solving skills by programming machines. He believes in constructivism, the idea that learning is a reconstruction rather than a transmission of knowledge. He uses LEGO toys to construct objects, believing that the computer provides a metaphor and a tool for thinking and problem solving. Papert, I think, has had a great impact on the innovative use of computers in education.
Let me ask you an entirely different question. As I told you, we were trying to pull together biographies of various people who have worked on the staff. One person who we have been having trouble finding material about is Don Aufenkamp. What can you tell me about him?
Don is a marvelous photographer. After leaving OCA, Don went into the physics directorate and then eventually into the Russian-U.S. program. Don learned Russian, went to the Soviet Union, and worked closely with the Soviets on Russian-U.S. programs. Don was a computer director before he came to the Foundation. He was extremely instrumental in those initial awards that put major mainframes into universities. I do not know; I think Glen Ingram could probably tell you more about him, and Arthur Melmed too.
I know where Ingram is. Where is Melmed these days?
Melmed is in town. He is affiliated with New York University now, but he is essentially retired and he is doing some projects for us on the future of fiber optics in the home and what it would take to scale up computer activities to serve millions of people. A major concern these days is how to scale up successful projects and experiments. Projects that are successful in the lab don't always succeed when they are scaled up. I really cannot help you too much about Don. He left the program and went into physics and then, as I say, went to the international program. He learned to speak Russian fluently.
Do you know what his education was in?
Physics, I believe. Most of the people, of course, were never trained in computer science, but came to the field from other scientific disciplines.
Probably one of the most significant documents that we turned out of the early conferences came from the conference on physics back in around 1970. This conference was held at the Illinois Institute of Technology, and Peter Lykos was involved. Peter was a rotator. Rotators were people that NSF hired to come into the various programs for a year or two and then return to their institutions. They are a major asset for the Foundation. They bring in their experiences and become familiar with work going on at the national level. Others involved with computing were John Hamblen from the University of Missouri, Doris Lidke from Towson University, Gerard Engle, Jesse Poore, and many others.
[end of taped interview]