Computers and Society
By Marek A. Suchenek
Professor of Computer Science
CSUDH
http://csc.csudh.edu/suchenek/
Computers are everywhere
The
era of modern computers began during World War II when opposing forces
tried desperately to use computing devices to accomplish a prosaic
information-theoretic task: to break the enemy's codes that guarded (we
say: encrypted) sensitive military information. A few programmable
digital electro-mechanical computers were designed then, and several
inventors and scientists contributed to rapid progress in this new area
of technology shortly thereafter, but I would like to think that it was
John von Neumann's idea, outlined in 1945 in his "First Draft of a
Report on the EDVAC" (Electronic Discrete Variable Automatic Computer)
[1], of a processor with access to a memory holding both data and
program, that marked the conception of the modern computer.
Since
then, computers and computer-based technology have experienced explosive
progress and proliferated into perhaps every aspect of human life. From
scientific computations and modeling to electronic commerce and banking
to telecommunications to the entertainment industry to military
applications to schools and ordinary households (the list here is far
from being complete!), computers rooted themselves within our society
to the extent that many think it is no longer possible for us to
function without them. (As a matter of fact, if you know of a
significant area of human enterprise that has not utilized computer
technology yet, then I would like to hear from you.) They allow their
users to store, process, search, and retrieve unimaginable amounts of
digitized information quickly (well, relatively quickly) and reliably,
a critical activity that gave birth to a new discipline often referred
to as information technology - a computer-centered conglomeration of
science, engineering, and mathematics.
The recent
emergence and fast growth of the world-wide net of inter-connected
computers, and the accessibility of the Internet that runs on them,
created opportunities that were unimaginable just a decade ago, but
also brought some serious problems of an ethical, legal, and political
nature. Indeed, the virtually unlimited access to information
distributed over the Internet, and the ease and speed with which large
groups of people can share that information, have provided individual
members of society with real power at their fingertips, challenging that of the traditional media (press and TV) and the
government. Just think how convenient it is to Google-up an insightful
article from the Internet or e-mail a query to your professor, as
opposed to tediously searching library catalogues, never mind staying
ignorant instead and trying to make best guesses in the absence of
relevant information. (See [2] for an example of a recent tendency
towards replacing traditional personal computers with Internet-based
ones.)
For some, it looked like there were no limits on
what computers could do. In the mid-1950s, Herbert Simon and Allen
Newell created at Carnegie Mellon University what was acclaimed as the
first "thinking machine," and the era of "the sky is the limit" in use
of computers in problem solving in general, and in information
processing in particular, began. The advancement of computing since then, particularly in
such trendy areas as artificial intelligence and information
technology, has been spectacular.
Most of us are aware
of "intelligent" computers that are capable of winning chess games
against world champions and proving mathematical theorems, or recognizing the
face of a known terrorist in a digital photograph. Perhaps everyone is
familiar with speech recognition systems that replaced many directory
assistance operators, never mind a myriad of "smart" contraptions (for
instance, an automatic transmission in your car may belong to that
category) that are capable of learning what a particular user expects
from them, then "self-programming" and acting accordingly. But there are
also some facts that may make one skeptical about the extent to which
computers and the information they store and control can benefit us, or
even whether they are beneficial for humanity at all. Indeed, knowing the
limitations on and dangers of using computers is an indispensable part
of computer literacy that holders of today's college degrees should
possess.
Limitations
Interestingly,
a rigorous theory of computability that allows one, with
mathematical precision, to figure out (at least occasionally) what a
computer can do and what it cannot, was developed years before the
emergence of the modern computer. In 1936, Alan Turing invented a
model of a universal computer (nicknamed the Turing machine) and discovered
that his hypothetical machine was inherently incapable of solving some
seemingly straightforward, if tedious, problems. For instance, he
proved that there is no Turing machine that could correctly predict
whether or not an arbitrary given Turing machine will fall into an
endless loop on one of its inputs. And that discovery, although
disappointing for many of us today, was not quite surprising.
Those
familiar with the liar's paradox (for example: "Once I thought I was
wrong, but I was mistaken") or Goedel's sentence (that, in
paraphrase, says: "Einstein could not prove this sentence without
contradicting himself") will have no difficulty in recognizing a form
of diagonal reasoning that Turing used in his proof (if they study
Turing's proof, that is). And since Einstein could not prove the
sentence mentioned above, while every Math major worth his degree can
prove it in one paragraph (I urge everybody to try it, it's very
entertaining), that is quite a limitation, indeed, on what a
genius can accomplish. There is no reason to believe that computers
should do any better.
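To make the flavor of Turing's diagonal argument a bit more concrete, here is a minimal sketch in Python of the reasoning (not Turing's original construction; the names halts and contrarian are purely illustrative): if a program that always answers the halting question existed, one could build another program that contradicts it about itself.

    # A minimal sketch of the diagonal argument behind the halting problem.
    # The function `halts` is hypothetical: pretend it returns True exactly
    # when running program(data) eventually stops.

    def halts(program, data):
        """Hypothetical perfect halting tester (cannot actually exist)."""
        raise NotImplementedError("No such algorithm can exist.")

    def contrarian(program):
        """Do the opposite of whatever `halts` predicts about program(program)."""
        if halts(program, program):
            while True:      # loop forever if predicted to halt
                pass
        else:
            return           # halt at once if predicted to loop

    # Now ask: does contrarian(contrarian) halt?
    # If halts(contrarian, contrarian) says True, contrarian loops forever;
    # if it says False, contrarian halts immediately. Either way `halts`
    # gives the wrong answer, so no such `halts` can exist.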
Shortly after Turing's invention,
Alonzo Church postulated that anything that any effective computing
device can compute can also be computed by some Turing machine, albeit
not necessarily in the same time and fashion. This has often
been referred to as Church's Thesis, and despite many decades of
serious attempts, no one has been able to disprove it. As of today, it
seems, all the "authorities" in the area of computability believe that
Church's Thesis is true. It forms a conceptual foundation for a theory
of uncomputable functions, that is, things that no computer can
ever correctly compute. One of the best known examples of the
uncomputable arises in a surprisingly simple, albeit mathematical,
subject: the arithmetic of apples and oranges.
We have all
solved an untold number of problems like "John has 5 apples and Mary
has 3 apples, etc., etc., ...". They involve what are called
arithmetical sentences. Some of them are true, like "2 plus 2 equals 4"
or "1,234,567,891 is a prime number", and some of them are false, like
"5 times 7 equals 57" or "For every n, n is less than 17 or n is
greater than 17". It would be nice to have a computer (or a calculator)
which could accept any meaningful arithmetical sentence (like, say,
"Every even number greater than 2 is a sum of two primes") and, after
hesitating for a finite amount of time, tell its user whether the
sentence in question was true or false. (Certainly, having a
contraption like this in an arithmetic class would help many of us
improve our grade point averages, to the delight of our parents.)
Unfortunately,
as Kurt Goedel proved around 1931, no computer can ever accomplish this
(assuming - of course - that Church's thesis is true), no matter
how fast we make it, how much superfast memory we allocate to it,
and how ingenious a program in the best possible programming language
we equip it with. Not now, not 100 years from now, not even in a
trillion years, never. And this is one of many firm, if somewhat
theoretical, limitations on what computers can do.
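To see why no amount of raw speed helps here, consider a small illustrative sketch in Python (the sentence tested is the Goldbach conjecture quoted above; the helper names is_prime and is_sum_of_two_primes are mine, not part of any standard library): brute force can confirm the sentence over any finite range, but the sentence quantifies over all even numbers, and Goedel's result says there is no general shortcut.

    def is_prime(n):
        """Trial-division primality test (slow but adequate here)."""
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    def is_sum_of_two_primes(n):
        """Is the even number n a sum of two primes?"""
        return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

    # Quantifier-free sentences are easy to check mechanically:
    print(2 + 2 == 4)      # True
    print(5 * 7 == 57)     # False

    # We can confirm "every even number greater than 2 is a sum of two primes"
    # for as large a finite range as we have patience for...
    print(all(is_sum_of_two_primes(n) for n in range(4, 10_000, 2)))

    # ...but the sentence claims this for EVERY even number, and no finite
    # amount of testing settles that. By Goedel's theorem (together with
    # Church's Thesis), no algorithm correctly decides every arithmetical
    # sentence.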
Add to
the above all the errors (sometimes referred to as bugs) that
computers and the programs that control them have gained notoriety for.
One famous computer scientist (I think it was Edsger Dijkstra) put
it this way: "A human being can make a computational error once every
ten seconds; a computer can make a hundred thousand errors per second." Indeed,
the omnipresence of bugs in computer software, paired with an intrinsic
inability to effectively decide which programs are bug-free and which
are not (deciding programs' correctness is another example of a task that computers
cannot do), should make us really cautious about our growing dependency
on computers, and make us take the results of their
actions with a grain of salt.
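A sketch of why automatic correctness checking runs into the same wall (the checker is_correct below is hypothetical, and the reduction is only an informal illustration): a perfect correctness checker could be turned into a halting tester, which the earlier sketch showed cannot exist.

    def is_correct(program, spec):
        """Hypothetical perfect checker: does `program` meet `spec` on all inputs?"""
        raise NotImplementedError("No such algorithm can exist.")

    def make_wrapper(target, data):
        """Build a tiny program that runs target(data) and then returns 0."""
        def wrapper(_ignored):
            target(data)     # may or may not halt
            return 0         # reached only if target(data) halts
        return wrapper

    def halts(target, data):
        """A halting tester built from the hypothetical correctness checker."""
        # The wrapper meets the spec "always returns 0" exactly when
        # target(data) halts, so a perfect is_correct would answer the
        # halting question, which is impossible.
        return is_correct(make_wrapper(target, data), spec="always returns 0")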
One of the most notable examples of skepticism of this kind
that led to a heated controversy among scientists is the question of
whether predictions of future catastrophic climate changes over a
period of, say, several decades, based on computer models of Earth's
climate and computer simulations that use such models, are accurate.
Yes, I am talking about the global warming theory whose predictions
based on computer simulations are meaningful only if one trusts that
the software used in these simulations is error-free and that the
measurement error and the bias of the sample data used by that software
do not invalidate the results; a belief that - according to the
limitations I just mentioned above - may be all but impossible to prove
beyond a doubt (with mathematical precision, that is).
Possible dangers
But there are even grimmer aspects to the proliferation of computers than their
limitations and fallibility. With all their benefits for a civilized
society, computers may harm humans, too. We hear almost every day
about hackers who steal and destroy information stored in other
people's computers. Illegal drug dealers would not be able to
circumvent the efforts of law enforcement agencies if they hadn't learned
how to use computer-based technology (remember, a typical cellphone has
a computer inside that is more powerful than "professional" computers
of half a century ago) to synchronize their illegal activities and track the
activities of the DEA. Identity theft, committed through the use of
computers, is becoming one of the largest plagues of American society.
Add to this a frightening and dangerous aspect of computer use, the
massive governmental surveillance (of questionable constitutionality)
of ordinary citizens that would not be practical if it weren't for
computers with terabytes of memory and effective programs for search
and retrieval of information from such gargantuan memories, and you see
the gloomy picture I am painting here.
The Internet and
the information technology that it proliferates have transformed our society
into an information society where knowledge is power. But they have also shaken
the very foundations of the traditional ethical, legal, and political
systems that we and our ancestors used to take for granted. They have created
an illusion of a borderless global society within which individuals all
over the world can communicate and collaborate with each other as if
they were living in the same neighborhood. But the fact that the
Internet transcends physical boundaries of nation-states, cultural
regions, and zones of political influence, has not nullified these
boundaries, nor has it alleviated problems and conflicts that come with
the reality of a politically and culturally partitioned world.
For
instance, what in the United States may be considered a simple exercise
of a person's right to free speech on the Internet, guaranteed by the U.S.
Constitution, may become a criminal offense in the People's Republic of China, where the laws
are generally more restrictive than here in America, and governmental
censorship of the Internet is commonplace. This is not just an
abstract or imaginary problem, as some users of the Internet have
learned the hard way (see [3]).
Not even the
Golden Rule, the venerable "do unto others as you would have them do unto
you", is a universally safe approach to interacting with other people via the Internet
anymore, as what is desirable or fun to you may be hateful to a person
of a different culture, religion, or nationality. Just think of how
much resentment you might stir if you joined a blog in, say, Saudi
Arabia, and began introducing its users there to the blessings of equal
rights for women.
Computer harm
I
can't help mentioning one specific aspect of the harmful use of
computers: plagiarism and intellectual property theft. With the
Internet readily available to almost anyone who can point and click,
unauthorized and unacknowledged copying of someone else's intellectual
property has become a huge problem these days. As harmless as it might
seem, it is unethical behavior, in the least serious cases, or a
criminal act, or even a felony with penalties that can add up to
hundreds of thousands of dollars of fines or years of imprisonment (or
both) in the most serious cases (see footnote 1). And it's
not just the criminality of unauthorized or unacknowledged copying that
makes it wrong, for if it were so, we would just legalize unauthorized
and unacknowledged copying and the problem would be taken care
of (which clearly is not the case).
In my CSC 301 Computers and Society class, I
use this example of actual harm done to innocent people by those who
infringe on someone else's copyright. Imagine yourself, a student at
CSUDH, working nights in your rented garage on a software program
that would predict major earthquakes with, say, 90 percent accuracy,
based on past and current readings of a system of seismographs. In
addition to committing some of your resources to studying (tuition,
textbooks, etc.), you have to support yourself, your wife, and two
kids. But you couldn't study, work on your software project after
hours, and keep a regular job all at the same time. So you take the risk, say "goodbye" to
your employer, and borrow money, instead, hoping that the proceeds of
future sales of your program would allow you to pay off your debts and
provide a comfortable life for you and your family.
Well,
you have succeeded, at least partially. You managed to design and
implement
a reliable earthquake prediction software program. The problem is that
a jerk has managed to hack into your computer, copy your program, and
post it on the Internet. Now, no one wants to pay you anything for
the fruits of your laborious work, and you are stuck with a debt of a
hundred grand. And when you see the sadness in your wife's eyes and watch
your young children for whom you cannot provide the necessities they need,
you feel as though someone has ripped you off big time. And you are 100
percent correct.
It is amazing how many otherwise law-abiding
individuals, who would never steal someone else's wallet or a book, not
even a pencil, have hardly any problem with plagiarism or intellectual
property theft. It's not that these individuals are evil or immoral. In
many cases, they just cannot imagine the potential or real harm that their
unauthorized copying may cause to others. "We are not taking away
anything from anybody," they think. Well, it isn't quite so. This is
why we, college faculty members, must do a better job of sensitizing our
students to the potentially devastating effects of this seemingly harmless
activity.
A gift of fire, indeed
The
author of the textbook [4] that I use in my Computers and Society
class, Sara Baase, compares the invention of the computer to a gift of
fire. This comparison has a lot of merit. Computers today, like fire hundreds of
thousands of years ago, allow a civilization to achieve astonishing
technological progress without which its very survival would be
doubtful. As with fire, though, there are some serious limitations on what one
can accomplish with computers and what one cannot. Computers, like
fire, can only be utilized well by those who have learned how and when
to use them (think of a combustion engine as an "educated" application of
fire). And finally, computers, like fire, can inflict irreparable harm
on humans and property.
So we have our gift of fire
(figuratively speaking), and it's up to us whether, and how much, we will
benefit from it, or whether we will harm ourselves and others instead.
~~~
All
those prospective students interested in learning more about this
fascinating subject and related issues, for instance, about how to use
computers and information technology for the betterment of
society as well as for the enhancement of their own careers, or,
perhaps, how to contribute to the body of knowledge and progress in
computer and information technology and its applications, may wish to
visit the Department of Computer Science website.
We
currently offer two Bachelor's degree programs and one Master's degree
program, so there are plenty of interesting classes to choose from.
I hope to see you around there.
January 26, 2009
Acknowledgments
I would like to thank Dr. Vanessa Wenzel
and Dr. George Jennings for reading and
correcting a draft of this paper,
and Donna Cruz for an invitation to write it.
_____________________________________________
Footnote 1 Plagiarism, or
unacknowledged copying, is a violation of CSUDH academic honesty
policy, and can result in disciplinary action against the perpetrator
that may lead to expulsion from the class or even the university.
REFERENCES
[1] John von Neumann, First Draft of a Report on the EDVAC, 1945.
[2] Google plans to make PCs history.
[3] Yahoo 'gave the Chinese evidence to help jail dissident'.
[4] Sara Baase, A Gift of Fire: Social, Legal, and Ethical Issues for Computers and the Internet, 4th ed., Prentice Hall, 2012.