Oral-History:Raymond Pickholtz


About Raymond Pickholtz

Pickholtz received his Bachelor’s from City College (1954?); worked at RCA on color TV receiver design (1954-57); worked at ITT Laboratories on spread spectrum radio systems for aerospace applications (1957-61); returned to graduate school at Brooklyn Polytechnic (1961-66); taught at Brooklyn Polytechnic (1966-72); consulted at IBM starting ca. 1967-68; taught at George Washington University (1972-). His research has been in data networking; modems; satellite communications; spread spectrum; spread spectrum multiple access/code division multiple access; fading channels; scattering; frequency hopping; and developments in digital radio (Orthogonal Frequency Division Multiplexing). He also discusses the future of the field and his involvement in IEEE.


Pickholtz's colleague Donald Schilling provides further explanation of spread spectrum systems and CDMA in Donald Schilling Oral History.


About the Interview

RAYMOND PICKHOLTZ: An Interview Conducted by David Hochfelder, IEEE History Center, 25 May 1999


Interview # 354 for the IEEE History Center, The Institute of Electrical and Electronics Engineers, Inc.



Copyright Statement

This manuscript is being made available for research purposes only. All literary rights in the manuscript, including the right to publish, are reserved to the IEEE History Center. No part of the manuscript may be quoted for publication without the written permission of the Director of the IEEE History Center.


Requests for permission to quote for publication should be addressed to the IEEE History Center Oral History Program, 39 Union Street, New Brunswick, NJ 08901-8538 USA. They should include identification of the specific passages to be quoted, anticipated use of the passages, and identification of the user.


It is recommended that this oral history be cited as follows:
Raymond Pickholtz, an oral history conducted in 1999 by David Hochfelder, IEEE History Center, New Brunswick, NJ, USA.

Interview

Interview: Raymond Pickholtz
Interviewer: David Hochfelder
Date: 25 May 1999
Place: George Washington University


Education and employment overview; spread spectrum technology

Hochfelder:
Can we start with your recent work? Or would you like to start with your graduate work and move forward to your work on CDMA?


Pickholtz:
Actually my work in communications started well before my graduate work. It started soon after I graduated with a bachelor’s degree from City College. It was in 1954 when I started working for RCA Laboratories. RCA, like Bell Laboratories, was a company I thought would never end. My first job was actually in color television receivers.
I worked on various aspects of color television receiver design from 1954 to 1957. That was the era of the development of a color television standard and the eventual development of practical receivers, which had just started to have a market for commercial purposes. My work there was mostly on things like modulation, demodulation, and various schemes for improving the color performance or getting rid of certain artifacts. At the time most of it was circuitry. There was some theory, but really this was a physical laboratory with soldering irons and so on.


In 1957, I changed jobs. I went to work for another company that does not exist anymore, ITT Laboratories, across the river in Nutley, New Jersey. That work was truly communications, including a lot of digital communication. All of it was classified for military purposes. Very shortly after I joined, I got involved in a project which, as far as I can tell, ended up being the first practical direct sequence spread spectrum radio system for digital communications and guidance for drone aircraft.


Hochfelder:
About what year was this?


Pickholtz:
This was around 1957. By 1958, we were really into that project, and although I was a relatively young engineer, I was essentially the systems architect of the project. We put together a lot of the ideas that we used for acquisition, tracking, and re-acquisition mechanisms. These included the circuits for generating the pseudo-random sequences, the phase-locked loops, and all of the other things that are very much a part of a modern digital system.
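
[To make the pseudo-random sequence generation concrete: such sequences are classically produced by a linear feedback shift register (LFSR). The Python sketch below uses a textbook 5-stage register with assumed taps, not the actual ITT design, which is not documented here.]

    # Maximal-length LFSR: the classic generator of pseudo-random (PN)
    # spreading sequences. Taps [5, 3] on a 5-stage register give a
    # maximal-length (m-)sequence: the output repeats only after
    # 2**5 - 1 = 31 chips.
    def lfsr(state, taps, n):
        out = []
        for _ in range(n):
            out.append(state[-1])              # output the last stage
            fb = 0
            for t in taps:                     # feedback = XOR of tapped stages
                fb ^= state[t - 1]
            state = [fb] + state[:-1]          # shift right, insert feedback
        return out

    chips = lfsr(state=[1, 0, 0, 0, 0], taps=[5, 3], n=62)
    assert chips[:31] == chips[31:]            # period is exactly 31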


The main difference from today is that they were all built with discrete components. Initially they had miniature vacuum tubes, but by the time 1958 came around they were being transistorized. Still, they had individual discrete components. I worked on that, but at the same time I worked on other things, such as satellite communications, which remains one of my current interests.
In 1957 when I joined, it was the year of Sputnik, and there was a lot of interest in the notion that you could use satellites for communications and surveillance, which is the mainstay of over-the-horizon telecommunications and data collection. Indeed, at the time the concept of the geosynchronous orbit satellite was out of reach because the booster rockets were not big enough. Therefore, some of the things I did focused on very low Earth orbit satellites; today we call them LEOS. I also worked on the question of coverage: how many satellites you need to get coverage in a certain area of the Earth.


As a result, a lot of the calculations involved astrodynamics and things that were not directly related to communications. But it was a communications problem, because one had to know what the visibilities were and how often you had to hand off the satellites to another user. Those were the two principal things I did at ITT: satellite communications and spread spectrum.


Undoubtedly, most of my time was spent on spread spectrum. At the time, spread spectrum had two principal functions. One was to minimize the effects of intentional jamming on the part of an adversary. The other had to do with low probability of intercept, which is really the other side of the coin. Instead of worrying about a large-power jammer disrupting your communications, you are concerned with using spread spectrum to hide your communications below the noise level, so that it would be difficult to intercept or detect what you are actually transmitting.


It would, of course, be very valuable to deny an enemy the ability to even perceive your presence. To a large extent that stayed classified and, for all I know, a lot of it may still be classified because it is a very sensitive subject. There is also now a lot of it written in the unclassified literature.
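
[A numerical illustration of the "hiding below the noise" idea: spreading one data bit over G chips lowers the transmitted power spectral density by the processing gain G, while a receiver that knows the sequence recovers that gain by correlation. All numbers below are assumptions for illustration, not parameters of any actual system.]

    # Toy direct-sequence spread spectrum link. The chip stream sits
    # about 12 dB below the noise, yet correlating each G-chip block
    # against the known spreading sequence recovers the data.
    import numpy as np

    rng = np.random.default_rng(0)
    G = 128                                   # chips per bit (processing gain)
    chips = 1 - 2 * rng.integers(0, 2, G)     # +/-1 spreading sequence
    bits = 1 - 2 * rng.integers(0, 2, 100)    # +/-1 data bits
    tx = np.repeat(bits, G) * np.tile(chips, bits.size)

    rx = tx + 4.0 * rng.normal(size=tx.size)  # per-chip SNR = 1/16 (-12 dB)

    decisions = np.sign(rx.reshape(-1, G) @ chips)   # despread and decide
    print("bit errors out of 100:", int((decisions != bits).sum()))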

Graduate studies; fading channels

Pickholtz:

By about 1961, I decided to go back to graduate school. I took a leave of absence from ITT, and went back to graduate school at the Polytechnic Institute of Brooklyn, after an interview and discussion with my subsequent thesis advisor, a man by the name of Mischa Schwartz.


Hochfelder:
We have one of his technical books.


Pickholtz:
Mischa has written a lot of books since then. By that time I had a family. My wife and I and our one child moved to Brooklyn, and I became an instructor at Brooklyn Poly to earn a living. By that time, due to Sputnik and President Kennedy's ambition to land on the moon, NASA had expanded, and the opportunities for a graduate student were plentiful. I was very fortunate to come along at a time when I could be a graduate student and, although I was an instructor, still not have to teach all that much. I think a normal teaching load would have been about four courses, but we got grants, and I worked under NASA and NSF funding.


Initially, I was trying to extend the work I had done at ITT Laboratories on phase-locked loops. I thought that would be the topic of my dissertation, because I had done a lot of work in that area. Then I ran into another student of Mischa's, Donald Schilling, who at that time was about a year ahead of me, and it turned out that he was doing his dissertation on phase-locked loops. I therefore abandoned that thought and started getting interested in things that were more theoretical. Eventually, I ended up writing a dissertation on something like "Demodulation of Signals Through a Randomly Fading Channel."


There was a lot of interest in fading channels at the time. The reason for that is very different from the reasons there is an interest now. At the time, the interest in fading channels surfaced because the only way of getting transoceanic communications was by ionospheric scatter signals, which are very faint and random. The other long-range mechanism was a thing called tropospheric scatter. This was where you use large antennas and you try to illuminate the troposphere, which is a turbulent region of the atmosphere, and scatter a little bit of energy over the horizon. The fading characteristics of those channels are well known today, but at the time they were studied intensively and there was a lot of literature.


The wire line channels were more or less dominated and occupied by Bell Laboratories, so these kinds of channels had a great deal of interest for military applications. A lot of the stuff in communications, except for telephony, was dominated by the military. There was a paper written by Price and Green, and it was actually about using spread-spectrum-like signals for combating multipath in random fading channels such as the ionosphere. Today I think we see that paper as a sort of cover for other projects they were doing. Nevertheless, it was very interesting because I was already familiar with many of the ideas, including things that were not in the paper.


I decided to do a thesis on modeling the channel and fun ways to communicate over it, which included spread-spectrum-like signals. In fact, I think I mentioned that in the early 1960s, when the IEEE was first formed out of the IRE and the AIEE, and shortly after the formation of what is now the Communications Society, there was the first major International Communications Conference (ICC). I think it was in Boulder, Colorado, and I delivered a paper there, which was part of the thesis I had been working on.


That was a very interesting conference because there were many papers of a similar nature, on things like using not only spread spectrum, but also dispersive channels and fading channels. During that period Seymour Stein had a great deal of influence on my thinking. He was the director of the Sylvania Research Labs, and he came on sabbatical to Brooklyn Poly to teach a course that I took. It turns out Seymour was a classmate of Mischa Schwartz's in the Harvard doctoral program.

Brooklyn Poly employment and research

Pickholtz:

When I finished my doctoral work, which was in 1966, it was a question of going back to industry or staying in academia. Brooklyn Poly made me an offer I could not refuse, not because of a lot of money but because it was just an ego trip to stay in an institution where you got your degree and where you had such high opinions of the people who were there. Additionally, the communications group around Mischa Schwartz was rather unique.


Shortly after I got my degree (a year after Don Schilling got his, I think), we wrote a proposal to NASA to do some work on space communications. A lot of that reinforced some of the things I had been doing at ITT. At this point, I do not even remember the details of all of the projects. I know there were projects on speech encoding, on phase-locked loops, on demodulation, and on estimation problems; ironically, they were all purely theoretical things. In fact, there were two members of the group, Ken Clark and Don Hess, who had actually built a fading channel model out of a water tank. I think they are still in a business relationship, but they left Brooklyn Poly before I did.


They built a water tank that was essentially an acoustic analog of the electromagnetic ionosphere or troposphere channel. In order to simulate the scattering, they might have had bubbles, but I think what they had were little wire meshes that rotated around randomly. As the signal propagated through the water, it was picked up by a microphone at the other end of the tank. This way you could simulate the characteristics of the channel quite well.


At that time, we did not have high-speed digital computers, so there was no way that you could actually make a simulation on a computer. Even an analog computer would have been very difficult. Therefore, this analog model was a convenient model of the electromagnetic channel. We tested many of the ideas we had on these channels at that time and this continued for a while.


At that time I also became interested in other theoretical things, which were offshoots of my work on the dissertation, particularly detection and estimation problems. I became interested in things like recursive detection theory. There were other things as well. We wrote a couple of papers in the Information Theory Transactions, and one of the things that evoked some interest had to do with what you could do with limited memory. It is interesting that all of these theoretical questions are really not very important today because of the availability of large amounts of memory, vast processing capability, and all kinds of other things. So these early theoretical things, while they had a very profound influence on my thinking, ultimately gave way to other things.


The interesting thing about the experience at Poly in the communications group is that it was an extremely active group. There were about seven full-time faculty members and many graduate students. Some did experimental work and others did theoretical work. We had a regular seminar at which anybody with any semblance of prominence in the field gave a talk. It included people like Bob Gallager, Elwyn Berlekamp, David Slepian, Landau, Wyner, Wozencraft, Kalman, and other prominent names in the field. Bell Labs was practically just around the corner, and many of their research staff gave courses at the school. Of course, the courses were mainly in the evening for graduate students. So, it was a period when you could get direct instruction from people who were very smart at thinking about ideas, formulations, and mathematical models; it was first-rate.


Since that time, in my experience, there has not been that high intensity of intellectual stimulation. In fact, in more recent times things may have become more diffuse. Today, there is a much more specialized interest in various aspects of communications, so when you read a journal paper, there might be forty papers of which you will have an interest in maybe only two in any given month. In those days, I would say I read through everything.


Hochfelder:
Can you talk a little bit about what made Brooklyn Polytech a dynamic environment to be in? Was it the people you referred to or were you talking about the EE department in general?


Pickholtz:
It was the EE department in general. I am a New Yorker, born and raised. New Yorkers are very provincial people. Therefore, I had very little choice: if I was going back to graduate school, it was Columbia or Brooklyn Poly. Columbia was very good; they had some excellent people like Ragazzini and Zadeh. But in communications, which I was interested in from my work at RCA and ITT, Mischa Schwartz was already a made person, and he had the enthusiasm, the energy, and the charisma to make a dynamic group in communications.


I do not want to imply that it was only communications that took me to Brooklyn Poly, because it was a very unusual place. It was housed in a razor blade factory, so physically it was in "Nowheresville." Nevertheless, the intellectual dynamic was outstanding. In electronics, they were second to none. In fact, of the textbooks that were being used all over the world (there was a series from McGraw-Hill in electronics), about eight or nine were from Brooklyn Poly. There were people like Athanasios Papoulis, who became my second mentor next to Mischa, and was just an outstanding intellect. He was able to impart the creativity of mathematical ideas in a way that was superb.
Additionally, there were very famous people in electrophysics. There were tremendous analysts like Dan Youla and Leo Felsen, who were in the electrophysics department but with whom I and others interacted. It was not a question of working together, but there was a ferment of ideas. As I said, there were weekly seminars in communications alone, which attracted people from all over the world.
I do not know what the mechanism was, except that it was the major technological institution in New York City. As a natural consequence, you are going to get a critical mass of people who are good, interesting, smart, and willing to shake up the world. In addition, and I think this was a factor for many of the people, many of the graduate students were either full-time or part-time students working at Bell Laboratories. Also, the people who had already gotten their degrees, whether from Poly, MIT, or Columbia, and who joined Bell Laboratories, very frequently came and gave lectures or courses. So, it was like being embedded in a boiling metropolitan bathtub of intellectuals. The advent of satellite communications and the money the government was adding through NASA and other agencies really stimulated this to a fever pitch. You have to remember that at the time, many universities, if they had engineering schools at all, mostly had electrical power departments.


A lot of this really gained momentum in the '70s. One of the people that came to Poly from NYU was Jack Wolf, who was very much interested in error-correcting codes. He brought another dimension of interest to this whole picture. By the end of the 1960s, there was a great deal of activity at Poly in all of these now traditional communications areas. What I mean when I say "traditional" is what came in the post-World War II era, which was started by theories from Shannon, Wiener, and other people like that, but also the practical aspects of communications which were necessitated by military communications.


Data networking

Consulting work, IBM

Pickholtz:

By 1967, 1968, I started doing consulting work at IBM Research in Yorktown Heights. At the time, IBM was very interested in interconnecting computers. First, I gave a series of lectures up there, and then I was hired as a consultant. It became evident to me that their true interests were data communications and data networking. I came back from my experience and had conversations with Mischa.


The wave of the future was in data networking. There were questions, however, of protocols and other things like that that were not covered by communication theory.

Brooklyn Poly

Pickholtz:

By the late 1960s and early 1970s, the direction of our work at Poly began to drift in that direction. That was also the beginning, the birth of the Internet via the ARPANET, initiated by Larry Roberts and carried forward by Bob Kahn and Vint Cerf. In fact, I think we were among the first schools to offer a course in computer communications. It was called "Computer Communication Networks."


Hochfelder:
Was this wired?


Pickholtz:
It was wired, though it was not thought of in those terms, because we concerned ourselves with the early computer communication networks, what were then called terminal-oriented networks. If you go back to the history, you find that computer networking started at IBM and at MIT (Bob Fano) with attempts to do multi-access to a computer, so that you did not have to go through punch cards and you had the ability to share one computer among a lot of people. To do this you would have a bunch of terminals, dumb terminals, in some adjoining room. Then there would be this big monster IBM machine and its air conditioning in another. The problems had to do with how you share these computers.


By the time the late 1960s came around, there was an interest in longer-range networks: networks that used modems or dedicated circuits for communicating over longer distances, say for the purposes of travel agents entering data to make reservations. The reservation computer might be located in Chicago. In fact, we have such systems now. These reservation systems are very important. To illustrate how pervasive this process has become, I have seen reports that in some years some companies, like American Airlines, made more on their reservation system, which is available to other airlines and agents, than they did on selling tickets.


Still, in the 1970s it was rather primitive, because you had to use modems, which were operating at about 600 bits per second. There was a great deal of interest in how you speed this process up and what the possibilities were to put more users on. The theoretical questions had to deal with these kinds of issues. A main question was what the overall performance of the network is. At the time, the only person I know of who was working on those kinds of issues from a theoretical point of view was Len Kleinrock, who had completed a dissertation at MIT on queuing analysis of those kinds of networking problems.


ARPANET

Pickholtz:

By late 1968, as you probably know, the so-called ARPANET was starting. But it was starting for reasons different from why today's Internet exists. The reasons were basically to share very expensive computers, to share the software that was resident on those computers, to connect the very specialized people who were located at those machines, and to reduce communications costs. Computers are not expensive now, so that is not the problem anymore. Due to this, the issues of the ARPANET and all these other things began to gel.


George Washington University

Pickholtz:

By 1972, when I left Poly and came to George Washington University (GWU), Mischa, a Poly colleague, Bob Boorstyn, and I wrote a paper for a special issue of the Proceedings on computer communications. It was the first issue on computer communications. The paper was on terminal-oriented computer communications, which focused on the question of using dumb terminals to access computers over long distances. There were various commercial systems already in place that attempted to do this, and all were on modems. In fact, when I was at IBM I worked on some of these problems, but I also worked on modem problems. On one of them, VSB modems, I got a patent. There was a lot of interest in how you could increase the speed of modems, which at the time worked at about 600 bits per second.


In 1972, I came here to Washington, to GWU, and there was nobody in communications, so I started the program. I got some funding from NASA and from NSF, I got some students, and we started hiring people. Pretty soon, I started to build up the interest here as well. In the early days when I came here, I had students and we published papers on computer communication networks: things like how you improve routing, queuing problems, protocol issues, and so on. Then, as the group grew and we got more people in, I was relieved of having to teach all the communications courses and supervise every doctoral student.

Satellite communications; spread spectrum communication

Pickholtz:

By the late '70s, when I became chairman of this department, I was already involved, either through grants or through consulting, in work on satellite communications. Primarily due to the location of Intelsat and Comsat, Washington eventually turned out to be the satellite capital of the world. That still holds today. But at that time it was really just two organizations, Comsat and Intelsat, that were directly involved in commercial satellite communications.


I continued my work in satellite communications and various other aspects of computer communications. Then I went on my sabbatical at UCSB in Santa Barbara, California, and started thinking about where communications was going, especially in light of all the things I had been doing, which were dispersed in a whole bunch of different directions. Basically it was still computer communications and spread spectrum, which I continued to work on. In fact, during the '70s a lot of the spread spectrum work started becoming declassified. So Don Schilling and I gave the first series of non-classified courses at GWU on spread spectrum communication. We gave them over a period of six or seven years. There must have been easily a thousand people or more who attended in that time.


SSMA and CDMA

Pickholtz:

I started getting interested in using spread spectrum as a multiple-access scheme. This was stimulated by Joe Aein and Jay Schwartz at IDA. Nowadays it is called code division multiple access (CDMA), but it was known then as spread spectrum multiple access (SSMA). CDMA is a relatively new term, new in the sense that it is maybe fifteen years old. Prior to that, it was SSMA, and the reason they changed is that some people were doing something else, called satellite switched multiple access, and calling it SSMA. So, it bifurcated the terminology, not unlike what has happened to course names in universities.


Here is an example of how terms get mixed up at universities, though I have seen it elsewhere. When you offer a course in communications, let us say communication theory, it turns out the English Department offers a course in communication theory, and the administration wants to know why you are duplicating courses. In fact, very often the English Department starts off with a kind of block diagram with a source and noise, very much like the Shannon model. But as they go on, the two become recognizably distinct. The word communications has become so commonplace in advertising, public relations, English, literature, and other things, that to some extent many places, including us, have reverted to the word telecommunications. That term has always been used in Britain to distinguish it from general communication. We, however, still have courses called communication theory, and it still causes confusion. People call me up and want to take the course. When they ask what it is about, they are befuddled when I read them the catalog description.

Commercial satellite communications, consulting work

Pickholtz:

Anyway, around this period of the 1980s, I started becoming interested again in satellite communications, and in various aspects of the peculiarities of satellite communications. I did some work in satellite communications, mostly for commercial satellite systems. At the time, I was also a consultant to the Institute for Defense Analyses (IDA), which was also doing work on spread spectrum and multi-access satellite systems. Most of it was classified.
By about the late 1980s we were publishing lots of papers on spread spectrum, mostly having to do with what had been declassified, but they were not papers based on classified work. It was an academic type of analysis, which turned out to be very useful later, dealing with jamming, interference, multiple access, and what the performance was in an environment with a lot of other users. There was a whole series of papers, both in the Transactions and in various conference records. Then it started taking off and became a subject in its own right. By the end of the 1980s, the Transactions actually had a separate editor for spread spectrum, and it still does. There are whole international conferences, not one or two, but maybe half a dozen, on spread spectrum, and maybe another half dozen on CDMA, as if to distinguish it.


Commercial applications of spread spectrum analysis, CDMA

Pickholtz:

Around that period, both Qualcomm and Don Schilling's company, which was called SCS at the time, started developing spread spectrum systems for possible commercial use. Don Schilling was doing it primarily for things like wireless local loop and spectral overlays, the idea that you could operate in an interference environment and still function. I was a consultant to Don on that, as was another colleague, Larry Milstein. Then I went away on sabbatical in 1990. Upon my return, my involvement was reduced to testing the systems that they had already built. At the same time, Qualcomm announced that they had a CDMA system for replacing the cellular system AMPS, which was basically an analog system. At this time there was another digital system being proposed, based on TDMA. It was asserted that CDMA could produce greater spectral efficiency.


By the early 1990s, there was a great deal of fervor about CDMA. I was no longer involved with Don Schilling, since I had been busy doing other things. In particular, I was doing work as a consultant to Motorola, who had decided to build a LEO (Low Earth Orbit) satellite system, now called Iridium, for providing cellular-quality service all over the world, no matter where you are.


Hochfelder:
What about the commercialization of spread spectrum in the ‘80s?


Pickholtz:
The 1990s were the era when spread spectrum was commercialized. I would have to give proper credit to Qualcomm, and in particular to Irwin Jacobs, for having pushed this to the fore where it is now. Now it is being considered as eventually becoming the universal air interface standard in future third or fourth generation cellular systems. There were many other commercialization projects in spread spectrum at the time, some that I worked on and some I did not. For example, there were spread spectrum systems being used for identifying things. There was also a company that is still in existence, which tracks down stolen vehicles.


Hochfelder:
Do you mean Lojack, the car theft retrieval company?


Pickholtz:
Yes. That is a spread spectrum application. Pretty soon, all of these different ways of using spread spectrum emerged, like the concept of reading meters remotely from a satellite. The data rate per user, however, is extremely low; it is something like a couple of hundred bits per month. Not only is it very low, but you could also use a considerable amount of error-correcting coding and still would not have to access everybody at the same time. You could poll and do all kinds of tricks. There are systems that actually do that sort of thing. There are messaging systems and so on. The scheme that I was involved in, with Don Schilling and the company called Millicom, used 1.9 gigahertz, and a total bandwidth of 80 MHz, for personal communications.


Hochfelder:
Was that when you did your field test in Orlando?


Pickholtz:
Yes, we did a field test in Orlando. The idea was to convince the FCC that we could live with the users that were already there, the incumbent microwave users. That was basically an attempt to make a commercial system that could live in an environment of other users, because spectrum was precious, and still is. The microwave users wanted a lot of money to relocate, so the FCC saw that they were going to have to either force them to relocate or compensate them. Eventually they came to evacuate the band.


This concept of overlay did not fly, probably for two reasons. First of all, although it did work, it was a really wide-band spread spectrum. I am talking about 80 megahertz: 40 megahertz going and 40 megahertz coming. The current wide-band CDMA, what is called WCDMA, is only 5 megahertz. The common CDMA used now is 1.25 megahertz; therefore, going from 1.25 to 5 is considered wide band. In retrospect, I think the wider the band for spread spectrum the better, because of the multimedia potential, as well as the advantages in a multipath environment.


You need a lot of bandwidth in order to get a lot of users on simultaneously, because there are technical issues of multiplexing gain and all kinds of other things like that. But to make a long story short, we tested the system and it was a perfectly workable system; for various reasons, though, the FCC decided not to go along with it. They did not like the idea of giving out 80 megahertz at a time, and there were other proposals from more influential people. The long and short of it was that CDMA started taking off for cellular. At the same time, I think, Qualcomm and others also proposed it for a LEO system.


I think the first proposal for a LEO system came out of Motorola with Iridium. The original Iridium system was a polar orbit satellite system, originally with seventy-seven satellites. It was called Iridium because the element iridium has atomic number seventy-seven. It is now down to sixty-six satellites for various reasons, and they decided that they were going to use TDMA.

TDMA

Hochfelder:
What is TDMA?


Pickholtz:
TDMA is Time Division Multiple Access. In other words, whether it is a satellite or a cellular system, you have what is called a physical-layer access problem. It is also called radio access or the air interface protocol; they are different names for the same thing. The air interface protocol is how you, as a user, get access to a particular chunk of the spectrum or resource. In traditional analog systems it was FDMA: you give each user a slice of bandwidth on demand. You go through a polling channel and you say, "I need a channel," and it would say, "Yes, use frequency 'x'." This is one among the hundreds of frequencies that might be chopped up, and you have it until you leave the cell area that you are working with.


You can make that digital too, but it was primarily the old analog AMPS system. In the USA, for TDMA, they thought of taking the same 30 kilohertz of band that AMPS uses per channel and chopping it up in time, giving each band three users where analog had only one. It was a multiplier effect. Eventually, they would go to six users when the voice coders got better. Consequently, there were 30 kilohertz channels with three users. This was called IS-54. Now it is called IS-136.


Alternatively, in Europe, they had planned the GSM system. It originally had a French name, Groupe Spécial Mobile; in English the initials are now read as Global System for Mobile communications. It is basically the universal European cellular system, and it was TDMA. But they did not start with the analog system and chop it up; they decided to go digital in a larger band, 200 kilohertz at a chunk. So, their system had eight users operating in any one frame of TDMA.


TDMA is basically sharing time. You get a chunk of time, and then you send your digits; then it stops and somebody else sends their digits, and so on. There were all kinds of issues of how you make sure you synchronize these things. There were guard times, because the differing distances caused you to lose some efficiency. Also, when you are not speaking and you are not transmitting data, your time slot lies fallow. In the time that you are not communicating, somebody else could have been communicating, which was the basic idea in computer communication networks.


In fact, there is no reason you cannot do this in TDMA, but it leads to a very complicated system that has to be done at the switch. In CDMA, you do not have that problem, because in CDMA the signals are noise-like. Therefore, when you are not communicating you generate less noise, so you are basically interference-limited. That way users do not use up time slots; they use up interference. At some point, the interference builds up to the point where everybody gets unusable signals.


Hochfelder:
Are you saying, as far as one channel is concerned, the more users the higher the noise floor?


Pickholtz:
As far as one channel is concerned, it raises the noise floor a little bit. In fact, if you do it right, maybe the first channel is below the noise floor, in terms of spectral density. Not in terms of total power, but in terms of spectral density. Therefore, as the number of users increases, the system begins to degrade.
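
[A hedged back-of-the-envelope version of this: with K users received at equal power P, processing gain G_p, and background noise power N, each user sees roughly

    $$\mathrm{SINR} \approx \frac{G_p\,P}{(K-1)\,P + N},$$

so every added user raises the effective noise floor by P, and quality degrades gracefully rather than hitting a hard blocking limit. This sketch ignores power-control error, voice activity, and other-cell interference.]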


Those ideas were fairly well known in the military, but to their credit, Qualcomm used them as part of their system and made a very good CDMA system. Motorola had a TDMA system because they were a worldwide system, with a network "in the sky" with on-board processing, and the networking parts were GSM compliant. That is because GSM at the time was the only digital networking standard for radio communications. GSM allowed you to roam all over the world and still communicate; you could not do that in the United States with any of their systems, except analog. Motorola, therefore, decided that GSM was going to be the thing, and I think that was the original reason for their choice of TDMA. Another important reason was market-driven.


I started doing consulting work for them, published quite a number of papers, and made presentations. They were mainly on the study of how CDMA works in a LEOS system. It was no secret that the purpose of my work was to see if TDMA was indeed the better system. It never got to the point of studying TDMA the same way. The studies only illustrated what the problems with LEOS were as opposed to terrestrial issues when using CDMA. And there are many.
CDMA terrestrial has many virtues. It is not clear that the same virtues are there for a satellite system, in particular a LEOS system, where you have fleeting satellites going overhead and the signals are very weak and are shadowed by buildings or trees or various other things. Also, if you are sharing spectrum, when you try to raise your power level, you will basically rob power from the other users. Because you are jacking your power up to get through that additional dB of loss, somebody else will be affected, because it is really a power-sharing scheme. Some people did not like my use of the term "power robbing." But my answer to that was: when you are using the spectrum within your own system, that is sharing; if you are using the spectrum with other systems, it is not sharing, it is robbing.


Hochfelder:
Are you talking about one channel interfering with another channel?


Pickholtz:
Yes. The idea was that you share the entire spectrum; you just pile things up on top, sort of like a skyscraper. But you can build a skyscraper only so high, and eventually it will collapse of its own weight. If you did not have a weight problem, if you had infinitely strong materials, you could keep building upwards. But that is not the real world.
In fact, some people are now writing articles in Forbes magazine saying scarcity of spectrum is an old idea: we do not have to worry about spectrum, there is no limit on spectrum with CDMA. That is not quite true, and part of the analysis that I did was to address these kinds of questions.


Modems and interference cancellation

Pickholtz:

In the meantime, various other things were beginning to happen in the modem field. Not modems for telephone lines, but modems for these radio links, which are very highly dispersive and rapidly fading. It was similar to my dissertation topic. There it was: satellites, wireless, and fading problems, all these things coming together. So in the last decade, I have been preoccupied with a convergence of these ideas, including the examination of whether new ideas that have developed since the 1990s could actually be implemented.


The limiting factors were things like hardware, or even software, complexity. Ideas like "multi-user detection" are a very hot topic in the academic literature. The idea was originally proposed somewhere in the 1950s but was revived by Sergio Verdu, who is now a professor at Princeton, as part of his dissertation. It tried to see what is possible and what the limits are. Instead of trying to extract one signal from, say, a CDMA signal, you would try to extract all of them simultaneously. That way you use all of the side information that you have available. In a commercial setting, unlike a military situation, you have all of the codes and details. Therefore, if you took advantage of them, many of the problems that were troublesome in CDMA, like interference when a large user comes along, begin to disappear. That was, at least, on paper.


Due to this, there was a flurry of activity in this area. In fact, in the last five years there must have been easily 300 papers written on the subject and the related topic of interference cancellation, some of mine included, and there have been all kinds of ideas about how to do that.


Multi-user detection

Hochfelder:
Now your new research is in multi-user detection?


Pickholtz:
Some of it is, but not all. We have been interested in multi-user detection, particularly for CDMA. Some of the work I have done has not yet been published because it was done for commercial interests. Eventually they will get the patents and allow publication. Like I said, the idea has been around for a while, although the theoretical side has been refined a lot by Verdu and the people who followed him.


As a consequence, those ideas, which had been worked out for channels that just gave you noise, had to be extended to fading channels. The questions we now faced were: What happens when you have fading? What happens when your signals are badly corrupted by variabilities? And what happens when you have fading and you want to cure the fading with coding? Then you have coding on top of multi-user detection and modulation schemes, all things I confronted early in my career. Most of these techniques seem to find their greatest value in CDMA, although nothing rules out their being used in principle with TDMA or any other kind of digital scheme for wireless communications.


Pickholtz:

My work today also deals with turbo codes, which are basically about how you find very powerful but still usable codes. What I mean is, you can do this even with modern signal-processing chips, though it will be kind of difficult. Within the last few years we have been able to do this, at least in non-fading regular channels; that is what turbo codes give you. It was an invention of only a few years ago by some French guys (Berrou et al.) who were working on a way to put something interesting on a chip; they were chip designers. To some extent, they were groping to find a problem for which they had a solution, and they hit on this idea that had escaped the attention of the people who had been working in coding theory for forty years.
Also, it was kind of an embarrassment because it is a simple idea: basically, you take two coders separated by an interleaver and you iterate the decoding process. In other words, you take the output of the first decoder and say, "Hey look, it's giving me some enhanced information, so I can use it to help the second decoder." Then the output of the second decoder is a little bit better, so you can use it to enhance the first decoder, and you iterate. As you iterate, things get better. Ironically, it is just an outgrowth of simple probability theory, and it has become a hot topic.
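
[The iteration he describes can be sketched concretely. The toy below applies the same extrinsic-information exchange to a tiny product code, with row and column parity checks standing in for Berrou's two convolutional coders; the code, noise level, and iteration count are illustrative assumptions only.]

    # Iterative ("turbo-style") decoding of a 3x3 product code in which
    # every row and column has even parity. Row and column decoders
    # exchange extrinsic log-likelihood ratios (LLRs), the feedback
    # loop described above.
    import numpy as np

    def spc_extrinsic(llr):
        # Soft-in/soft-out decoding of one single-parity-check constraint:
        # each bit's extrinsic LLR comes from the other bits (tanh rule).
        t = np.tanh(llr / 2.0)
        ext = np.empty_like(llr)
        for i in range(llr.size):
            p = np.prod(np.delete(t, i))
            ext[i] = 2.0 * np.arctanh(np.clip(p, -0.999999, 0.999999))
        return ext

    rng = np.random.default_rng(7)
    info = rng.integers(0, 2, (2, 2))          # four information bits
    code = np.zeros((3, 3), dtype=int)
    code[:2, :2] = info
    code[:2, 2] = info.sum(axis=1) % 2         # row parities
    code[2, :] = code[:2, :].sum(axis=0) % 2   # column parities

    sigma = 0.9                                # noise level (assumed)
    rx = (1 - 2 * code) + sigma * rng.normal(size=code.shape)  # BPSK + AWGN
    ch = 2.0 * rx / sigma**2                   # channel LLRs

    ext_col = np.zeros_like(ch)
    for _ in range(5):                         # iterate: rows <-> columns
        ext_row = np.apply_along_axis(spc_extrinsic, 1, ch + ext_col)
        ext_col = np.apply_along_axis(spc_extrinsic, 0, ch + ext_row)

    decoded = ((ch + ext_row + ext_col) < 0).astype(int)
    print("bit errors:", int((decoded != code).sum()))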


Basically, what you do is gain several dB. In a satellite system, where running out of capacity means another satellite, a factor of two could mean the equivalent of ten or twenty million dollars, maybe a hundred million if it is a big satellite. In a wireless system, with all of these problems, a factor of two means that you do not need as many base stations, and they cost a lot of money: between the real estate and the equipment, each base station costs about a million bucks. If you can double the distance covered by a base station, then you can make the number of base stations required about one quarter of what you would originally need. Ultimately, you are talking about real money. Whatever you were going to spend, say a billion dollars for the infrastructure, now you only need a quarter of a billion dollars. In principle, that is the idea.
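
[The "one quarter" figure is just coverage geometry: covering an area A with cells of radius r takes roughly

    $$N \approx \frac{A}{\pi r^2}$$

base stations, so doubling the radius cuts N by a factor of four. How many dB of link gain it takes to double the radius depends on the path-loss exponent of the environment.]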


That is why there is a lot of interest in that. There is also a lot of interest in multi-user detection for that same application. My interest has been basically not so much the invention or the discovery of some of these new ideas, but putting them into practically useful systems. In some cases, it is purely analytical, and in other cases, it is about actually helping people design these things. I would say "build" instead of "design," as I think about it, because that means devising the algorithm that is going to be stored on a ROM. There are other ideas in the works, such as "smart antennas," which can further improve performance.

Analog broadcasting, digital radio

Pickholtz:

That pretty much takes me up to the present time, in terms of my interests. But there is one other thing I am doing now, although I cannot talk about it in detail for non-disclosure reasons. A company called USA Digital Radio is proposing to revolutionize audio broadcasting, which now uses analog AM and FM. They have produced "digital radio" sharing the same bands as existing AM and FM stations.


AM and FM broadcasting are the last bastions of analog. The idea is to make "compatible" AM and FM radios that can also receive digital with "CD quality." That comes full circle too, because the attempt is to make things downward compatible. When I started my first job at RCA Labs, the big issue, and the reason RCA won its standard over CBS, was that it was called Compatible Color Television. In other words, if you only had a black-and-white set (you're too young to remember this), you could still receive it, but if you had a color set, suddenly things turned into color. It is not as if you had to throw out all those millions of sets. The idea here is to make a compatible system using the same spectrum. Therefore, you ask, how could you squeeze more signals into the same spectrum? In color television, you had the same problem: there is three times more information than there is in black and white.


But by clever use of the psycho-optical phenomena of the human eye, the brain, and various other aspects of color, you can squeeze it in. The new digital televisions will do this even better, because there is compression involved. In digital radio, there is a lot of compression, very sophisticated compression algorithms, but also various sophisticated modulation and coding schemes. The scheme is called OFDM, Orthogonal Frequency Division Multiplexing. What it does is squeeze little pieces of the digital signal in where you can fit them, at very low level, and heavily code them. That way, it fits right inside the same spectrum as the analog, and it does not disturb the analog perceptibly. All the while, the signal and reception clarity are not compromised. In the FM band, where you had a nice FM signal, you still have a nice FM signal; but the digital signal (if you have a digital receiver) will be CD quality.
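
[A minimal sketch of the OFDM mechanism described here: data symbols are placed on orthogonal subcarriers by an inverse FFT, and a cyclic prefix absorbs echoes. The subcarrier count, QPSK mapping, and prefix length below are generic assumptions, not the actual USA Digital Radio design.]

    # Minimal OFDM transmit/receive chain (generic parameters).
    import numpy as np

    rng = np.random.default_rng(0)
    n_sub, cp = 64, 16                        # subcarriers, cyclic-prefix length
    bits = rng.integers(0, 2, 2 * n_sub)
    # QPSK: map each pair of bits to one complex symbol per subcarrier
    syms = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])

    tx = np.fft.ifft(syms)                    # one OFDM symbol in the time domain
    tx = np.concatenate([tx[-cp:], tx])       # prepend the cyclic prefix

    rx_syms = np.fft.fft(tx[cp:])             # receiver: drop prefix, FFT back
    assert np.allclose(rx_syms, syms)         # subcarriers recovered (no noise here)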


Hochfelder:
Are you saying the sound is much improved?


Pickholtz:
Yes, and digital AM will be FM quality and digital FM will be CD quality. Over a period of ten years analog will disappear, at least from the consumer scene; there will always be some need for analog. One organization is actually doing this now with a satellite. The project is called "CD Radio," which I also helped work on, through Lucent.


Hochfelder:
So a lot of your research interests have come around full circle?


Pickholtz:
Yes. In fact, they have been consolidated, whereas before, my research interests were fragmented. You could see that in what I published. It was either on satellite systems, or some security aspect of satellites, or multiple access for satellites, or computer communications. Now, for most of the next-generation wireless systems, the interest is in how you get broadband Internet over wireless. That part has come full circle too, because many of the protocols that were designed for wire-line services were crummy for wireless, because the channel was bad. There are lots of people studying how to modify these protocols, and I am involved in that as well.


Fading channels

Hochfelder:
Could you go back over some of the more technical terms you have used, and maybe define them for other people who might use the transcript at some point? You talked a lot about fading channels: can you explain that?


Pickholtz:
The idea of fading channels has been known for over a hundred years. One of the defining characteristics of a fading channel is that the signal you receive is random: sometimes it is large, sometimes it is small. The detailed phenomenology of why it fades is complicated. But in very simple physical terms, it is because you receive the signal not only directly from the source, but also by multiple reflections from objects.
The multiple reflections collectively are called "scattering." When you receive the signal from the source and then reflections from one or many buildings, sometimes they enhance one another because they are in phase, and sometimes they cancel. As you locate yourself in different positions, you are going to get a distribution of this fading. If you do not get anything directly from the source, if it is all indirect and from many scatterers, the distribution is called Rayleigh fading, after Lord Rayleigh from a long time ago.
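
[This mechanism is easy to reproduce numerically: summing many reflected paths with random phases produces an envelope with Rayleigh statistics, including frequent deep fades. The path and sample counts below are arbitrary illustration values.]

    # Rayleigh fading as the sum of many randomly phased scattered paths.
    import numpy as np

    rng = np.random.default_rng(0)
    n_paths, n_samples = 50, 100_000
    phases = rng.uniform(0, 2 * np.pi, (n_samples, n_paths))
    env = np.abs(np.exp(1j * phases).sum(axis=1)) / np.sqrt(n_paths)

    # Deep fades are common: how often is the envelope 10 dB below its mean?
    thresh = env.mean() * 10 ** (-10 / 20)
    print(f"fraction faded more than 10 dB below the mean: {np.mean(env < thresh):.3f}")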


Fading can be static or dynamic. For example, if you are sitting on a park bench and you happen to be stuck in a shadow, you could simply move a couple of feet and very likely be out of it. But if you are driving along the highway at a hundred kilometers per hour, you have less control over the fading. If you look at the actual signal strength, it is going up and down very wildly, with some fades being very deep. When you get into a deep fade, you just lose the signal, and that is what happens very often when you are talking on a cell phone. It could happen because of fluctuations in signal strength caused by changing distance. It could also happen because of attenuation when you go under an underpass. Or it could happen because of a combination of those events.
In a digital system, this phenomenon can be deadly. In voice or analog, let us say, you might miss a syllable. But in digits, you will have missed a whole block. The protocols are usually designed so that if the system misses a block, it does not know whether it is because of congestion or because of a bad channel. It tries to retransmit, and by that time you might be elsewhere. And in real-time traffic, repeats may be useless. So, it is a major problem trying to transmit data over a rapidly fading wireless channel. Fortunately, there are many approaches to mitigating the fading on the physical channel, but it remains an issue nevertheless.


Hochfelder:
What about CDMA?


Pickholtz:
CDMA suffers from that too. In fact, in CDMA they try to correct for this by doing "instantaneous" power control, to whatever extent possible. If it is very rapidly fading, they do not have to worry about it as much, because one of the virtues of CDMA is that you can put in a lot of heavy error-correcting coding. That way, when you lose some of the signal, it is filled in, corrected by the code, as long as you do not lose it for a long time. Whereas in other systems, which are not spread spectrum, you cannot afford to do a lot of heavy coding, because you would be giving up spectrum for it.


Still, the fading problem remains. Indeed, the fading problem is very severe: if you go into a very deep null and stay there, you may have to rely on power control alone. But the power control mechanism, as I said, could end up raising the power so that you cause interference to other users. In that case, you basically steal some of their available interference budget; whatever you take of that interference budget is lost to someone else.


CDMA benefits from very rapid fading because of this coding that I mentioned to you. Does that make sense? It also benefits from the ability to have multipath actually help.


Frequency hopping

Hochfelder:
Yes. Also, in reading some of your survey papers from the early '90s on the state of the art of CDMA, I was intrigued by your explanation of frequency hopping. Can you explain this?


Pickholtz:
Frequency hopping is still another way to achieve spread spectrum, and it has its own virtues and problems. Direct sequence is the other main alternative. You might think, because I described the fading phenomena to you, that this is a kind of multipath. But the multipath that we usually worry about in spread spectrum is fairly large multipath, not a lot of scatterers that are within a couple of hundred nanoseconds of one another. The multipath is two, three, five, or even ten microseconds. In a TDMA system, it would not even fade, but you would get intersymbol interference. In other words, you get a signal and then you get an echo from a mountaintop. Five microseconds, at about a nanosecond a foot, means 5000 feet, or about a mile.


You get an echo, and the worst thing for digital systems is an echo, because the next signal that comes along will disrupt your ability to discern it. One of the ways of dealing with that is a thing called equalization. In fact, that is how you can get high data rates on telephone lines that have echoes, multipath, and time dispersion. Direct sequence CDMA has this beautiful property of being able to exploit time echoes; that is true, discrete multipath, which is larger than one chip. Where it suffers and has the same problem as other schemes is when you get lots of small multipaths, which can give you fading. Incidentally, you can have fading without having large time dispersion. The two should not be confused.
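
[The dividing line between resolvable echoes and fading can be put in one line: with spreading bandwidth W, the chip duration is T_c = 1/W, and two echoes are separable by the correlator (the basis of the RAKE receiver) only when their delay difference exceeds a chip:

    $$\Delta\tau > T_c = \frac{1}{W}.$$

For the 1.25 MHz CDMA signal mentioned earlier, T_c is roughly 0.8 microseconds, so multi-microsecond echoes are resolvable, while scatterers within a couple of hundred nanoseconds of one another are not; those combine into fading.]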


Time dispersion simply says that if I put out a short pulse, I will receive not only that pulse but another one later, and another one a little bit later, and so on. If they are far enough apart, I can discern them. Fading, on the other hand, is caused when I put something out and the radio-frequency wavelengths are very small. If the reflections are right, all you need is a couple of centimeters' difference in path and you have a fading problem, because the radio-frequency phases cause a net cancellation of the signal.


Shannon Limit

Hochfelder:
Also, another term you used is the Shannon Limit. Can you talk a little bit about that?


Pickholtz:
Yes. The Shannon Limit is a very famous limit. Shannon, in his famous 1948 paper, put forth an idea that said if you are clever enough, and you are willing to do a lot of processing (though he did not tell us how to do it), then it is possible to achieve virtually no errors at all, provided you do not try to transmit at too fast a rate. In other words, you can make the errors as small as you want. You just have to be a little bit more clever in selecting the code and doing the processing, provided that the data rate does not exceed some number which depends on a whole bunch of parameters, such as the signal-to-noise ratio, the bandwidth, and so on. That rate Shannon called the "capacity."


Shannon's ideas are very general. They apply when you have a fading channel, and they apply when you have multipath. Shannon produced a simple formula for the "Gaussian" channel, but his formulation was not just for the Gaussian channel; the famous formula is an illustration. A direct path from a satellite, for example, is a nice clean Gaussian channel. More often the channel has many other impairments.
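
[The "simple formula" referred to is the capacity of the bandlimited Gaussian channel from Shannon's 1948 paper,

    $$C = W \log_2\left(1 + \frac{S}{N}\right) \ \text{bits per second},$$

where W is the bandwidth in hertz and S/N is the signal-to-noise power ratio. As a rough worked example, a 3 kHz telephone channel at 30 dB SNR gives C of about 3000 times log2(1001), or roughly 30 kb/s, which is about where telephone-line modems eventually topped out.]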


Hochfelder:
So a Gaussian channel would be one where you just have noise, channel noise?


Pickholtz:
Pure noise, pure random noise. There is thermal noise from your receiver, and perhaps from cosmic sources. Shannon said you could try to make the data rate larger than the capacity, but then, he says, you are not only going to make errors, you are going to start making a lot of errors. So the capacity is a bound, if you will, on what is possible over a channel.


Basically, the goal is to reach the Shannon Limit and to be able to do it on a chip that fits on a thumbnail, one that does not cost too much and does not take an infinite amount of time to get the signal out. If you can do all of that, you have arrived.


I am not even sure Shannon would have dreamed that you could get all of those things accomplished simultaneously. That is why I say this recent development of turbo codes sort of took the community off guard, because it was done by people who were not in the communications theory community, and used relatively simple ideas in terms of processing. Nevertheless, we are approaching the Shannon limit with practical hardware and software.


Telecommunications field

Hochfelder:
Can you take a couple of minutes and talk about the state of the art and the future of telecommunications? In one of your writings, you claim that the 1990s would be known as the telecommunications decade, like the 1980s had been known as the computer decade. Do you think that prediction has come to pass?


Pickholtz:
Without any question. I was confident when we wrote that article that it would be. In fact (I would have to check with the economists on this), I think telecommunications is by far the fastest-growing industry in the world. Number one, it is not localized like the semiconductor business; it is not even localized like the software industry. It is worldwide. Number two, because of the Internet, it has changed the paradigm of business and even of human interaction.


If I were to venture a prediction, I would say that if the 1990s was the telecommunications decade, the next decade will be the decade of wireless telecommunications. Almost everything you can do with wires, you might be able to do with wireless. It is not going to be easy. It will take a lot of sophistication to overcome hostile channels. Wire lines, especially fiber optic channels, are very clean and very cooperative. But fiber optic lines are very expensive to deploy where you are only going to reach a village of forty people. In fact, it is even difficult to put them into a modern London or New York or Paris apartment building that is fifty to a hundred years old. As a result, there is a lot of incentive to provide a wireless mechanism for doing a lot of the things that were traditionally thought of as wired.


I will just end by saying this. I gave a lecture recently at George Mason University. They had an anniversary at the electrical and computer engineering department, when they opened a center for telecommunications. The provost was proudly saying that they were going to be one of the first fully wired universities on the East Coast. When I got up, in my opening remarks I said, "You missed the boat. You should have said you're going to be the first wireless university."


To illustrate my point, I will refer to a joke I once heard. Some archeologists were digging in Egypt, and they found strands of copper deep in the soil. They concluded that, along with all the other things the ancient Egyptians were capable of, they had telephony; they had wires. So as not to be outdone, the Chinese started digging, and they found strands of silicon, and they concluded that their ancients had fiber optics. The digging in Greece, however, found nothing, so they concluded that in Greek antiquity they had wireless.


IEEE, Communications Society

Hochfelder:
Can you talk a little about your activities in the IEEE Communications Society?


Pickholtz:
I did not get a chance to talk about the Communications Society, of which I was president for three years. I joined the IEEE as a student. Actually, it was the IRE, and I was a member of the Communications Society (then called the Communications Group) from its inception. In fact, as I said, I was present and gave a paper at the first ICC conference. I got involved with the society as a volunteer. I was presenting papers, chairing sessions or organizing them, and that was around the late 1960s, early 1970s.


At that time, there were various people I knew who were already deeply involved. Mischa Schwartz, Don Schilling, and Bob Lucky, each of whom was at one point president, were already involved. I was the founding chair of the technical committee on computer communications. At some point in the '70s, I started running for various elected offices. First, I was a member of the Board of Governors, where I served for a period of about six years.


At that time there was a kind of succession. You ran for vice president of technical affairs. After you finished that term, you ran for vice president. Then you ran for president. In fact, I believe I was the first one who did not run unopposed for president. I ran for president and served two years.


That was a period of very rapid growth for the Communications Society. By the time I finished my presidency, the late '80s had arrived. In that intervening time, we increased from maybe ten thousand members to close to forty thousand. I do not know what the number is now, but I have read that the current president, Tom Plevyak, intends to go to one hundred thousand, and you can see why. This is the decade of telecommunications. The society sponsors over a score of conferences and publishes three archival journals and several magazines.


We now have joint conferences with the Computer Society. We have INFOCOM and various other joint workshops. We now have a journal that we publish jointly with the Computer Society and the ACM. Some of those initiatives were started while I was still a vice president, so it took a while for them to gel.


When I was president, we started opening offices overseas, and we hired our first executive director. Initially we did not have an executive director; we had three or four paid staff, and we did not run our own conferences. It was the volunteers who made sure that there was a hotel, and the staff helped out in negotiating. So, the Communications Society has come a very long way.


I am still actively involved in the Society, but I am a firm believer in giving the next generation a chance. There has now been a non-US, non-Canadian president: Maurizio Decina. But I think of Maurizio not as an Italian, but as a fine communications engineer. The Society is deliberately transnational, and the leadership reflects that. Recently, my colleagues and I have begun to publish papers in journals other than the Transactions. There are two reasons for this: it takes a long time to get published there, and there is a need to reach other audiences. There is a new journal in which the IEEE Communications Society is a partner; it is a Korean journal. Steve Weinstein, another former Communications Society president, is the editor-in-chief, and I am on the editorial board, as is Mischa Schwartz. So the tentacles are spreading.