

you don't see on the drawing, which threads through all the cores,
will get an impulse at the instant that that core turns over. It'll get a
one or a zero type of impulse.

Figure 30 reminds us that memories are often thought of in a hier-
archical form. The core, or what is coming to replace cores nowadays,
the semiconductor memory, is really the fastest and the most expen-
sive. Drums are quite fast, but an order of magnitude or so slower
and so it goes down the chain of speed versus size. The bigger the size
of memory the slower the access to that memory in general.

This is a disk pack and you can see the removable aspect of it. In
that little container held in the girl's left hand is the stack of disks
which are like phonograph records. They come in various sizes but that
one typically happens to be about 5 inches high, and about 12 or 13
inches in diameter. It's easily transportable and you can store such
disks on the shelf and put them on a thing called the spindle which
is the drive mechanism which rotates them and some arms come in
from the side to read them, much like a phonograph needle reads a
phonograph.

Another example of storage mechanism shown in figure 32 is the
tape unit. Here a tape is being mounted on a tape transport. Figure
33 is an artist's rendition of an optical character reader device. They
are becoming somewhat more prevalent, as I mentioned earlier, and
I personally think that they will become an exceedingly important
aspect of our technology.

Figure 34 shows the basic hierarchy of costs associated with memory.
The main memory of a computer might cost something on the order
of 20 cents a bit and be accessible in one-millionth of a second; and
so it goes through the different mechanisms. The access time for a
tape might be 10 seconds, the access time to a disk is only a hundredth
of a second; and you can see the time getting smaller as the cost
goes up. Obviously there's a tradeoff between how much of any one
kind you can have in your system, and it's the management of these
hierarchies of memories which comprises a good deal of the intricacy
of a computer system.
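As a rough illustration of that tradeoff, here is a minimal sketch in Python. The only figures taken from the testimony are the 20 cents a bit and the access times; the other costs, and the idea of picking the cheapest level that is still fast enough, are assumptions added purely for illustration.

```python
# Rough sketch of the memory-hierarchy tradeoff described above.
# Only the main-memory cost and the access times come from the testimony;
# the disk and tape costs are invented order-of-magnitude stand-ins.
HIERARCHY = [
    ("main memory", 20.0,  1e-6),   # ~20 cents a bit, ~one-millionth of a second
    ("disk",         0.05, 1e-2),   # illustrative cost; ~a hundredth of a second
    ("tape",         0.001, 10.0),  # illustrative cost; ~10 seconds to reach data
]

def cheapest_fast_enough(max_access_seconds):
    """Pick the cheapest level whose access time meets the requirement."""
    candidates = [level for level in HIERARCHY if level[2] <= max_access_seconds]
    return min(candidates, key=lambda level: level[1]) if candidates else None

print(cheapest_fast_enough(1.0))    # disk: cheapest level that answers within a second
print(cheapest_fast_enough(1e-5))   # only main memory is fast enough for this deadline
```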

Figure 35 shows us that both man and computers perform the
function of arithmetic. People do it by counting on their fingers or
by figuring on a piece of paper. These are the simple arithmetic cal-
culations which you've learned in school.

The whole mechanism of counting stems from the Egyptian and
Roman times and earlier when a whole variety of number systems were
invented. Figure 36 shows two early systems.

Modern computers use what is called the binary number system.
You'll see a series of decimal numbers on figure 37.

For example, the decimal number zero is represented by three
binary digits, the digits zero, zero, zero, and the number 5, for ex-
ample, is represented by a one, zero, one. There is a set of eight
possibilities for those three binary digits, ranging from three zeros to
three ones. So, clearly, three binary digits is not quite enough to rep-
resent a full range of decimal numbers. It takes four binary digits to
do what we call binary coded decimal arithmetic. That is to say, for
each item that we wish to add, we have to deal with four binary digits.
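A minimal sketch of those two points, using Python purely for illustration: three bits yield exactly eight combinations, and binary-coded decimal spends a group of four bits on each decimal digit.

```python
# Three binary digits give exactly eight combinations, 000 through 111.
for n in range(8):
    print(n, format(n, "03b"))   # 0 -> 000, 5 -> 101, 7 -> 111

# Binary-coded decimal: each decimal digit gets its own group of four bits.
def to_bcd(number):
    return " ".join(format(int(digit), "04b") for digit in str(number))

print(to_bcd(59))   # '0101 1001' -- one four-bit group per decimal digit
```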



But four binary digits will carry you up to 16 so there's a certain
amount of overlap and a certain amount of built-in confusion which
has existed in this industry since day one; built-in confusion because
man wasn't born with four fingers on each hand. Had we been, we
would have counted in the octal system instead of the decimal system
and it would have been somewhat more straightforward to be in the
computer world.

This particular confusion now causes us sometimes to deal with
binary numbers, octal numbers, hexadecimal numbers, those based on
16, or binary-coded decimal numbers, those based on, say, four binary
digits representing a decimal number, and many combinations in
between. In fact, the variety of combinations is slightly greater than
the number of companies in the business because many companies have
two or three versions of codes that cause endless confusion.
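As a small illustrative sketch, the same value written in the several notations just mentioned; the value 40 is chosen only because it reappears later in the testimony as a storage location.

```python
# One value shown in the several notations mentioned: binary, octal,
# hexadecimal, and decimal.  (Illustrative only.)
n = 40
print(format(n, "b"))   # 101000  (binary)
print(format(n, "o"))   # 50      (octal: the bits read off in groups of three)
print(format(n, "x"))   # 28      (hexadecimal: the bits read off in groups of four)
print(n)                # 40      (decimal)
```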

When we were in grammar school we learned how to do arithmetic
with two kinds of tables. We learned how to add by memorizing an
addition table. We learned how to multiply by memorizing a multi-
plication table.

It might have looked a little bigger when we were looking at it as
kids, but basically, figure 38 shows the sets of numbers we had to memorize.

You memorized them. Nine and eight is 17. Six times five is 30. You
had to memorize that set of numbers. There's quite a few of them,
actually, a total of combinations.

In contrast, figure 37 shows us that the binary number system is
really rather simple. Addition consists of those four numbers shown.
When you add any one on the left to any one at the top, you get
the number in the box at the intersection. Similarly, multiplication
consists of four possible results, three of them zero and one of them a one.
So in the binary number system addition and multiplication are very,
very simple and that's why it has become the basis for the construc-
tion of computers. There are only eight total combinations and one
can build electronic circuits to do that simple arithmetic. It would
be very difficult and very expensive to build electronic circuits to do
arithmetic on the base 10, so we just don't do it. All computers work
on these basic binary digits.
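Those eight combinations can be written out directly; a minimal sketch, purely for illustration:

```python
# The whole of binary arithmetic at the single-digit level: four addition
# results and four multiplication results, eight combinations in all.
for a in (0, 1):
    for b in (0, 1):
        total = a + b                      # 1 + 1 is 10 in binary: digit 0, carry 1
        print(f"{a} + {b} = {total % 2} carry {total // 2},   {a} x {b} = {a * b}")
```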

Figure 40 is an example of the addition of numbers. You will
notice the binary columns add just exactly like the columns of a
decimal number, with carry, and that particular addition sum is cor-
rect, both in decimal and in binary.

Note the three binary digits, of zero, one, zero, which means a two,
and zero, one, one, which means a three. You just add one column
at a time and carry over to the left just like you do in decimal arith-
metic and you get the result five, which is one, zero, one.
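A short sketch of that column-at-a-time rule, reproducing the two-plus-three example of figure 40; the helper function here is an illustration, not part of the exhibit.

```python
# Column-by-column binary addition with carry, as in figure 40: 010 + 011 = 101.
# Assumes both numbers are written with the same number of digits.
def add_binary(x, y):
    result, carry = [], 0
    for a, b in zip(reversed(x), reversed(y)):   # rightmost column first
        s = int(a) + int(b) + carry
        result.append(str(s % 2))                # digit for this column
        carry = s // 2                           # carry over to the column on the left
    if carry:
        result.append("1")
    return "".join(reversed(result))

print(add_binary("010", "011"))   # 101, i.e. two plus three is five
```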

Now, we want to talk some about computer instruction because
arithmetic is pretty simple. What makes computers complicated? The
basic thing that makes them difficult to deal with, but at the same
time gives them their enormous power, is their instruction capability.
Figure 41 shows us that instructions are basically the line-at-a-time
information which you give to the computer, the directions you give to
the computer, to do its job. They tell what things the computer is to
do and what data it's to do it with. Let's look at that in a little more
detail.



Figure 42 is a programer, working from a series of instructions to
her about what problems to solve. She's created what we call a flow
chart on the desk in front and this is the beginning of the programing
process. She will be working in some language; some machine lan-
guage, or assembler language, or compiler language, or problem lan-
guage. These languages are ever-more complicated from the ma-
chine's point of view but ever simpler to use from the human's point
of view, and we are attempting in our industry to approach the use
of natural language as a programing tool. It may be many, many dec-
ades before that's successfully accomplished as we're a long ways
from it at this point. Nevertheless, we have come a long way in terms
of the capability to give instructions to computers in so-called higher
level languages. Figure 43 shows this progression.

Let's look at what has to happen inside of a machine. We've got
this arithmetic capability, shown in figure 44, that we talked about
and this arithmetic capability can add, subtract, multiply, and divide,
and there are some registers which hold the numbers that we want to
deal with. Those registers may have been loaded from a memory loca-
tion or they may be the memory location. In any event, we've got
something in the computer as well as the data.

We have something called a stored program, as shown in figure
45, and that program is a set of instructions which look exactly like
any other numbers in the computer. These numbers, however, have been
very carefully prepared so that those instructions which appear
like data will nevertheless cause the computer to fetch an item of data,
add it to another item, and put it back in a different memory location.
It is a sequence of very simple little steps like that that comprises the
instructions given to a computer.
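A toy sketch of that idea follows. The three-instruction program, the operation codes, and the storage locations are all invented for illustration; the point is only that instructions sit in memory as numbers and are carried out one simple step at a time.

```python
# A toy stored-program machine: instructions live alongside the data as
# numbers, and the machine executes them one at a time.  The operation
# codes (1 = load, 2 = add, 3 = store) are invented purely for illustration.
memory = {40: 7, 41: 5, 42: 0}                 # data locations
program = [(1, 40), (2, 41), (3, 42)]          # load 40, add 41, store into 42

register = 0
for opcode, address in program:                # fetch the next instruction in sequence
    if opcode == 1:                            # load memory into the register
        register = memory[address]
    elif opcode == 2:                          # add memory to the register
        register += memory[address]
    elif opcode == 3:                          # store the register back to memory
        memory[address] = register

print(memory[42])    # 12 -- the sum, put back in a different memory location
```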

Figure 46 shows us three language levels. Almost no one uses the
machine language level of programing anymore. That's the level at
which the machine works, but everyone at least uses an assembler.

An assembler has the capability of telling the computer in fairly
miserable detail what to do, but the programer can do it in a symbolic
way and let the computer translate that symbolism into its own in-
structions by means of a program called an assembler. A compiler is
a higher level version of an assembler. You can be even less precise
and somewhat more general about the kinds of statements. We'll look
at that a little bit in detail later.

Figure 47 shows a programer writing a series of COBOL state-
ments. COBOL is the common business language which has grown
up in our industry and become rather a sophisticated tool for using
computers.

Let's take a problem and look at it in the three different ways that
it might appear, as shown in figure 48. Suppose we just want to add
two numbers that are in storage and put them back. On the left you
see the bit patterns as they might appear on the computer. The first
line says to the computer, load the number at register 100,000, as
you see at the far right. It doesn't tell it what that number is. It just
says load that number into your register. And the next instruction
says add to it the number at 101,000. And the third instruction says
store the sum at 101,001. Those are the kind of bit patterns that would
appear useful and important to a computer. The assembler language
version of those would be the center column, as you see, and in a
compiler language, say, Fortran or COBOL, you would write just a
single line. In COBOL, for example, you'd say, add X to Y, giving Z,
and those X, Y, and Z addresses would be translated by the COBOL
compiler to find those pieces of data automatically wherever they're
stored in the computer, according to your preplan.
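A sketch of that expansion follows. The symbol table, the addresses, and the mnemonics are invented for illustration; they are not the exact columns of figure 48, only the shape of the idea that one compiler statement becomes a load, an add, and a store.

```python
# Sketch of the idea in figure 48: one high-level statement expands into a
# load/add/store sequence.  The symbol table and mnemonics are invented here.
symbol_table = {"X": 40, "Y": 41, "Z": 42}      # where the compiler placed the data

def compile_add(statement):
    # e.g. "ADD X TO Y GIVING Z" -> three machine-level steps
    words = statement.split()
    x, y, z = words[1], words[3], words[5]
    return [("LOAD",  symbol_table[x]),
            ("ADD",   symbol_table[y]),
            ("STORE", symbol_table[z])]

for mnemonic, address in compile_add("ADD X TO Y GIVING Z"):
    print(mnemonic, format(address, "015b"))    # the address written as a 15-bit field
```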

If one were to look at a single instruction word, as in figure 49,
let's say a 24-bit instruction word of a certain kind of computer, you
might find that the 9 left-most bits tell the machine what to do;
say, add or multiply, or fetch a number.

The righthand 15 bits tell you which location to go to. Fifteen bits
is enough to specify many thousands of address locations. It's not as
powerful as 15 digits, remember; it's 15 bits; and a bit is just a yes or
no, a 1 or a zero. But a combination of bits can give you a fairly large
number.
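A sketch of taking such a word apart with shifts and masks; the particular operation code and address packed into the word are invented for illustration.

```python
# Splitting a 24-bit instruction word into a 9-bit operation field and a
# 15-bit address field, as in figure 49.  The word below is invented:
# operation code 5 with address 40.
word = (5 << 15) | 40              # pack: 9 high bits of opcode, 15 low bits of address

opcode  = word >> 15               # the 9 left-most bits say what to do
address = word & ((1 << 15) - 1)   # the 15 right-hand bits say where the data is

print(opcode, address)             # 5 40
print(format(word, "024b"))        # 000000101000000000101000
print(2 ** 15)                     # 32768 -- locations a 15-bit address can name
```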

We looked at those instructions and, in fact, I was reading this as
101,000 on the righthand end. It's really the decimal location 40, as
shown in figure 50. That's the decimal number 40 when read from the
binary system and so that particular instruction said load the register
with the contents of that location. Figure 51 shows us that in an
assembler it would be written down this way and this is the simplest
level which man uses today. In a compiler, one would write a more
general statement, as I described earlier and as shown in figure 52.

The programer turns in the instructions on punched cards or on
written forms which get converted to punched cards by a key punch
operator, gets back a run, which then consists of things the computer
has done for the programer and a test of that program, and then she
examines the results and perhaps makes some modifications. Figure
53 shows the programer examining the results of a run.

The programer has submitted the card deck to the computer center
and gotten back the results as seen in figure 54. The computer operator
at the center loaded the cards, as seen in figure 55, and the computer
did a large amount of work to process that set of cards and give back
that little output to the programer.

For example, the computer will exercise many hundreds of thou-
sands of instructions in the Fortran compiler, and it may do this mil-
lions of times just to compile a very simple program. The process is
shown in figure 56. But that's the beauty and power of computers. They
can do many simple things and they can do them incredibly fast and
quite accurately.

We've talked about computer memory as being hierarchical and
having quite a spread in cost and performance, but let's look inside
of it a little bit and see what might be going on inside of the computer.
There might be, say, three programs in the machine, illustrated in
figure 57. The computer's memory is storing these programs and you
might wish to alternately give control of the computer to one or an-
other of these programs.

So the industry has invented what is called the operating system
shown in figure 58. The operating system is capable of controlling
which program is being run, which data is being fed to which pro-
gram, and keeping all these pieces going simultaneously at a much
higher rate than a man could do it sitting at a console and pushing
buttons one after another. The operating system is quite a sophisticated
concept, but it's basically again just the same simple extension of the
computer's capacity to do a lot of things, very simple steps but very,
very rapidly.
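A very rough sketch of that idea, not of any particular operating system; the three programs and their amounts of remaining work are invented stand-ins.

```python
# Very rough sketch of the multiprogramming idea of figures 57 and 58:
# control of the machine is handed to each program in turn for a short
# slice of work.  The three "programs" here are invented stand-ins.
programs = {"payroll": 5, "inventory": 3, "billing": 4}   # units of work remaining

while programs:
    for name in list(programs):
        programs[name] -= 1                 # give this program one slice of the machine
        print(f"ran one slice of {name}")
        if programs[name] == 0:             # finished programs leave the mix
            del programs[name]
```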

The operating system has some goals, noted in figure 59. It provides
some capabilities or else we wouldn't have them; but operating sys-
tems, in general, are not the purpose of computers. They are merely a
thing to make the computer more useful to the ultimate user.

Figure 60 tells us that man controls himself and initiates control of
the machines and indeed, man controls the machines in quite some detail by
the basic materials that he provides as a program. Without a program
the machine is doing nothing. The machine is a very complicated computer
consisting of tens of millions of parts and it is a useless collection
of electronic junk without a program. That program was written
by a man or a woman to provide the control of that machine so that
the data that's fed into it is dealt with in the way the person who
wants to use the computer intends it to be. The computer initiates none
of that programing.

Figure 61 reminds us that man's nervous system is the mechanism
that carries his signals around, but in the computer it's electronic
circuits as noted in figure 62. There's really nothing very mysterious
about them. They're very simple circuits like the circuit that you use
to turn on the lights when you throw the switch. In the one case it's
off; the other case it's on and by adding together the capability of
many, many thousands, many millions of those little bitty pieces which
are able to make those one or zero, on or off, yes or no kind of decisions
you can build these things we call computers, which are the most com-
plex devices known to man.

Man does his output through certain kinds of actions, shown in
figure 63, which we understand; but computers do their output
through screens, through printed listings, and on punched cards, as
shown in figure 64. They may issue voltage signals which control the
temperature. All sorts of output mechanisms exist for computers.
We'll look at a few of them.

Figure 65 is a card punch. The cards are being stacked up, and inside
they're punched in the same kind of a pattern as we saw for the
punched cards on the input. Figure 66 is a printer with a typical kind
of listing. Printers exist with the capability of making several copies,
but this hard-copy mechanism is really rather slow compared to the
tremendous speed of the inside of the computer.

This printer may print at 1,000 lines a minute. There have been
printers made that print 30,000 lines a minute, and you say, my good-
ness, how can anybody read that? But then when you think about
printing the label, say, for mailing a magazine, you might want to print
5 million address labels and it's read by 50,000 postal delivery men
in parallel, and so printing them at 30,000 lines a minute is not really
very fast, if you want to print lots of them and get them distributed
to all the people who are going to read them simultaneously.

Tape units are used for output, as seen in figure 67. We've talked
about tape units before, but many of the results of the run of a com-
puter are recorded on tape, then taken away to an archive and stored
with the tape containing the results. Again, they're only useful as
input to another computer run or to a machine which can read that
tape.



A goodly number of institutions are now making use of computer
graphics, as it's called. The display shown in figure 68 has on it a
graphical picture of an architectural drawing or a design of a piece
of machinery. A number of automotive companies are beginning to
use this technique.

Executives are beginning to get the capability of computers at their
fingertips, although some of the science fiction literature you read
about it would make it seem as though the electronic revolution is
here. It's not really here yet. It'll be a while. If it's used properly, with
the proper digestion of data, the management information system can
be tremendously powerful. It's mostly been abused rather than cor-
rectly used. Some executives and even some sales offices, for example,
have teletype connections to computers. Brokerage houses have them.
Examples are shown in figures 69 and 70.

The computer sometimes controls other machines, as noted in figure
71. For example, a milling machine or the trajectory of a ballistic
missile.

A computer in the ground somewhere may be controlling the bal-
listic missile shown in figure 72, or it may be a computer in the missile
itself. It may be in the submarine which launched the missile, or the
location of the submarine may itself be being computed by a computer,
as shown in figure 73.

The control of refineries, shown in figure 74, has become such an
esoteric art and it's so important to get that extra 1 percent of yield
out of millions of barrels of product going through that it's becoming
imperative to control refineries with computers because they can do
that control so much more precisely, so much more rapidly than people.

Computers are heavily in use in at least one automobile plant, the
Volkswagen plant in Germany, shown in figure 75, where many thou-
sands of tests are made on the vehicles as they're being assembled.
Those measurements are continuously fed into and monitored by a
computer. The inspection record for every vehicle is then known at the
end of the assembly line.

Traffic is being controlled by computers, shown in figure 76. There
are installations in a number of cities. Minneapolis has an installation
which monitors the traffic and meters the cars onto the freeway, de-
pending on the flow. Los Angeles has a system that even flashes warning
lights and overhead signals to people.

Figure 77 reminds us of one of the things that computers cannot
do: they cannot do any initiative reasoning, make any judgments, or
do any thinking to any degree of profundity. The anthropomorphic
aspects of computers have been highly overrated by the fiction writers,
and while computers can seem quite awesome and quite miraculous in
the things they do, they're really doing very simple things that they
have been instructed to do by people. A computer can't act on judg-
ment. It can't philosophize; it can't do any of the kinds of things that
people do. They can't set policy. Figure 78 lists these points.

A computer can't react like a man in the reasoning, thinking area.
It can only do the five things that man can do, as shown in figure 79,
so it's natural to ask the question, what makes a computer so great—
and the answer is quite simple.

Figure 80 tells us that a computer has the advantage over people
in speed, accuracy, capacity, and reliability. Now, it may not seem
to you, when you get a glitch on your department store bill, that the
computer has been very accurate or very reliable, but as a matter of
fact it was precisely accurate but it was given the wrong inputs by
some people or it was given the wrong program by some people, and
as a consequence it went ahead and did what it was told to do and it
produced an erroneous bill. That doesn't mean the computer made a
mistake. It means that the people providing the input or writing the
program made a mistake. Almost never does the computer make a
mistake — almost never. In fact, with the complexity of modern op-
erating systems, and the complexity of the modern programs which
are written for computers, the thing comes to a screeching halt if
anything goes wrong.

If a part fails inside the machine, it's doing so many millions of
operations per second that it immediately comes to a stop. It just
doesn't know what to do. It immediately goes out of business. So in
the unlikely event of a computer piece of equipment failure, which
is relatively rare, relatively compared to the millions of things that it
does do correctly, the computer's occasional apparent mistake is not
real. It's a mistake caused by erroneous input by people or an er-
roneous program that was written or erroneous data that was fed to
the machine.

So a computer has these advantages over people. It has the speed,
for example, to do problems which are unthinkable — which were un-
thinkable a few decades ago — and the number of those problems
which are opening up to being solvable by computers is getting larger
all the time. The speed of computers has made possible things like
nuclear energy or doing guidance of a trajectory of a shell while it's
in flight; things that are just unthinkably too fast for people to do
any calculations about. The computer can do calculations while the
event is happening.

Mr. Chumbris. I guess if a computer throws out a $1 million check,
it's a person that did it and not the machine?

Mr. Parkin. Almost without exception it's the operator who did it,
or the data that was prepared wrong; not the computer.

Most of the flight tickets that you buy these days are prepared
and kept track of by computers, as shown in figure 81. Almost every
place you go to reserve a flight would need whole rooms full of
girls answering telephones and trying to keep track of flight reserva-
tions. Can you imagine a room the size of this whole office building?
That's what it would take to keep track of the reservations for two
or three of the major airlines in this country today, if it weren't for
computers.

A number of the aircraft companies, United among others, are
using computers in their maintenance scheduling, as shown in fig-
ure 82.

They keep track of exactly which component on which plane has
flown how many hours and when it's scheduled for preventive maintenance
and when it's scheduled for overhaul. In fact, they worked out
the scheduling of which airplane goes on which route so that it ends
up back at the overhaul shop in Denver at the right time; and if they
didn't do this they might get much less utilization out of their airplanes
by scheduling them with people than by computer.

Figure 83 reminds us that Amtrak is issuing its tickets and reserva-
tions by computer. That whole system is a little creaky yet, but it'll get
there.

In the whole oil exploration business one of the things which is done
is the so-called seismic exploration. A crew will go out in the field
and they'll plant a half a dozen sticks of dynamite around in a circle,
maybe 5 miles in diameter, and they'll set them all off at once, and the
pattern of the little echo waves that come reflecting back from the un-
derground locations is recorded by a seismograph instrument, like the
pattern shown in figure 84. These little patterns in the past used to be
looked at by some human. Figure 85 shows what he was looking at,
miles and miles of paper with those little squiggles on it. It took a very
rare genius to understand those patterns. But with computers those
data can be digitized and analyzed, and they can turn up
some major oil finds that would never have been suspected before by
that technique. Every oil company is using it extensively.

In banking you'll find teleterminals at your friendly neighborhood
bank, spreading throughout the country quite rapidly, as shown in
figure 86. It's not universal yet but almost all banking will be done,
almost all banking internally is done, on computers today. Externally


