During the last one hundred years our department has attracted people with a passion for the electron or the bit--for things electrical or things computational. It has been an exciting time, and those of us who lived through part of it are lucky to have had the opportunity. We were in the right place, at the right time, to pursue our passion. This story of the department's first century is our story.
But what about those born too late for our experience? A hundred years from now their story will be told, and it is sure to be just as exciting as our story. The future of the department will be in their hands, and this history is dedicated to them.
My focus here, our educational activities, matches the department's priorities. MIT started the first electrical engineering program, and has emphasized its undergraduate mission ever since. Because an engineering discipline is determined by its degree programs, this department has, arguably, done nothing less than define what it means to be an electrical engineer, and, more recently, a computer scientist.
When should the story of MIT's Department of Electrical Engineering and Computer Science begin? In 1902 when it was formed? In 1882 when the electrical engineering degree program was started? Even earlier, I think, when the American electrical enterprise started to emerge.
Scientists, industrialists, and inventors all played a part. The scientist was curious about natural phenomena. The industrialist wanted to make money. The inventor liked to create useful innovations. The electrical engineer did not yet exist, but would, eventually, have all these motivations.
The leading electrical scientist in America in 1830 was Joseph Henry, teaching at Albany Academy in New York. He made the most powerful magnets available and carried out many scientific studies. His machines cleverly illustrated scientific ideas, but were not designed for practical applications.
The first industrial use of electricity was in mining. In 1831 Allen Penfield, who owned an iron mine in Crown Point, New York, used a magnet purchased from Henry to magnetize parts of an ore-separation drum, yielding ore of higher iron content.
Thomas Davenport, a Vermont blacksmith, had heard about magnets, at that time a scientific novelty. He visited Crown Point, and was so impressed that he bought one from Penfield (trading his horse in order to raise the money, so the story goes). By 1833 he had made a motor with continuous rotary motion, and in the process invented the commutator, an essential part of every DC motor. (William Sturgeon in England had independently made the same invention a few months earlier.)
Henry held Davenport's motor in disdain. His scientific arguments would not be considered convincing today, but his practical concern was that the motor could not compete with steam engines because the batteries then available were so cumbersome and expensive. He was right--this motor was ahead of its time. Although Davenport got a U.S. patent (no. 132) in 1837, the first issued for any electrical machine, he could not interest anyone in using motors. He went to his grave in 1851 a defeated man.
Electricity needed what would be known today as a "killer app"--an application so compelling that an underlying technology would be acquired just to run it. Henry concluded, correctly, that the killer app for electricity was the telegraph. As early as 1816 an electric telegraph line had been built in Europe, but it took advances in technology--the hand key and Morse code--by the American painter and inventor Samuel F. B. Morse, a native of Charlestown, Massachusetts, to make it practical. Henry gave Morse strong encouragement and sound scientific advice, so much so that they tangled later about who had really invented what.
Morse sent the famous message, "WHAT HATH GOD WROUGHT," from Washington, D.C., to Baltimore, Maryland, in 1844. The telegraph spread like wildfire. Electricity caught the fancy of the public. The mood was not unlike the Internet euphoria 150 years later.
Finally there was a bona fide electrical industry. Universities, including MIT, took note. MIT admitted its first students in 1865 and provided instruction in physics right from the start. Two MIT physicists recognized the importance of electricity. Edward C. Pickering, a member of the National Academy of Sciences, was extraordinarily energetic and eclectic, judging from the range of his projects. At MIT from 1867 to 1877, he beefed up laboratory instruction and, perhaps most importantly, persuaded one of his students, Charles R. Cross, to join the physics faculty and continue the electrical work.
It was Cross who in 1874 invited Alexander Graham Bell, then teaching at Boston University, to use MIT's acoustics and electrical laboratory. Bell did so, since the facilities were superior to what he had at BU, and demonstrated a working telephone in 1876. Three years later Thomas Edison invented the electric light bulb and in 1882 commissioned the first electric power plant. Had Davenport still been alive, he would have seen that his 1833 invention of the DC motor made this power plant possible: the generator was simply a motor running backwards.
By 1882 five major electrical devices and systems were of growing national importance: telegraph, telephone, rotating machines, illumination, and the power grid. These were the products of scientists, industrialists, and inventors. It was now time to define electrical engineering, and the way to do that was to design an educational program.
Charles Cross understood. He was the right person, in the right place, at the right time. He started, at MIT, the nation's first electrical engineering degree program.
In 1882 Cross, by then head of the MIT physics department, launched the program as Course VIII-B. Two years later, even before the first students graduated, it was redesignated Course VI. From 1882 to 1902 the electrical engineering program was run by Cross out of the physics department.
Then, as now, students could sniff out fields with a bright future. By 1892, 27% of all MIT undergraduates were in electrical engineering. Early graduates included Charles A. Stone '88 and Edwin S. Webster '88, who founded Stone and Webster, the firm that built MIT's new Cambridge campus in 1916; another was Alfred P. Sloan '95, who became president of General Motors and a major MIT benefactor.
Electrical engineering programs were also springing up at other universities. Of those that grew out of physics or mathematics departments, some were very scientific or theoretical. Others, designed to train people for the rapidly growing electrical industry, taught contemporary techniques but not the underlying science. MIT, whose programs had always maintained a balance between practice and theory, avoided both extremes. Cross himself fit the pattern, working in a science department but with a definite industrial and engineering bent.
In 1900 Cross began to press for a new Department of Electrical Engineering, but when it was established in 1902 he did not join. He assisted with the teaching, but stayed in the physics department, where he served as department head for another 15 years.
To lead the new department, MIT looked outside and recruited Louis Duncan. This selection did not work out too well; Duncan's real interests were with industry, and he left before long. But the next department head from outside, Dugald C. Jackson, was a spectacular success.
In its hundred-year history, the department had two leaders who exerted extraordinary influence on its educational mission: Dugald C. Jackson and Gordon S. Brown.
Jackson was recruited from the University of Wisconsin, where he had established the department of electrical engineering in 1891. He came to MIT in 1907 and served 28 years as department head, surely a record that will never be broken.
Jackson had a clear vision of what an electrical engineer should be, and understood that the way to make that vision reality was through an educational program. He respected both scientists and technicians, but thought engineers should be different. Technicians apply known techniques; engineers do that and also develop new techniques when needed. Scientists develop new science; engineers use known science. His philosophy was consistent with that already in place at MIT when he arrived.
In 1903, while still at Wisconsin, Jackson set forth his goals in a paper in Science: "Principles, principles, principles, and rational methods of reasoning...must be taught." He wanted his graduates to assume positions of leadership in industry by virtue of their good communication skills, scientific knowledge (mathematics, chemistry, physics, and applied mechanics), appreciation of how society works, and acquaintance with business practices. He was interested in educating practical industrial engineers rather than scientists or academicians. He admired "rational methods of reasoning" and had little patience for purely descriptive and superficial material, the "beauties of nature," skills that merely helped students get immediate jobs, or "empirical methods of practice that change almost before they can be put to useful account." Jackson definitely believed it was not his responsibility to produce finished engineers, but only graduates "with a great capacity for becoming engineers...after years of development in the school of life." His was a no-nonsense approach intended to equip students for 40 years, the full duration of an engineering career.
Two major educational milestones mark Jackson's years as department head.
First, the VI-A cooperative program (now called the Internship Program) was launched in 1917. Jackson wanted students to get work experience with educational value. VI-A was not a "summer job" program or a "work-study" program designed to pay for tuition, but a bona fide educational program. His philosophy has distinguished VI-A from other programs around the nation ever since. Over the decades, about 18% of the department's undergraduates have participated in the VI-A program.
Second, in the early 1930s, Jackson started a major curriculum revision, motivated in part by recent developments in radio and electronics. He ordered the writing of a new series of textbooks, which would come to be known as the "Blue Books" because of the color of the covers. The series was not actually finished for over a decade (World War II intervened) and two planned volumes were abandoned because of advances in technology. Jackson decreed that the books were team efforts and individual authors were not to be identified.
Jackson's vision can be interpreted today as having been based on three assumptions. First, the underlying science base as developed by scientists was in a form engineers could use. Second, science would not change much during a graduate's 40-year career. Third, society itself would not change much during that same time span. His vision guided the department until, in the late 1940s, it became apparent that two of these assumptions were no longer valid. It was the other great department head, Gordon Brown, who would recognize the problem and provide the remedy.
During the Second World War, MIT hosted the Radiation Laboratory, part of which was housed in the legendary Building 20. The major objective was to make radar into a practical wartime technology. This work required advances in microwaves, propagation, antenna design, vacuum-tube amplifiers, and a whole host of electronic components. The personnel included eminent engineers and scientists, some from MIT and Harvard. When the war ended in 1945, these people went back to a more normal life. A few came to the MIT Department of Electrical Engineering. And when they did, they had a strange tale to tell.
It seems that the major breakthroughs in the engineering of radar systems had been made not by engineers, but by scientists, mostly physicists. Engineers had played a valuable role, to be sure, but it was a supporting rather than a leading role--one of implementation, not innovation. Dugald Jackson would not have expected this. He believed he was training leaders. According to his vision, engineers would, as needed, develop new techniques based on known science.
Gordon Brown, among others, realized what was wrong. The problem was that the necessary science was not known, or at least was not in a form accessible to engineers. He agreed with Dugald Jackson that engineers must be able to apply known techniques and develop new techniques using known science. But he went further, saying that at least some of them should also be able to extend the relevant sciences in ways required by engineering. Brown called such activities "engineering science." He concluded that changes in the educational programs were needed.
In Brown's view, the science should be taught in the first years, followed by contemporary technology based on the science. Specialization and theses would come in the senior year. The best students would be encouraged to enter an expanded doctoral program, which would produce engineers able to extend engineering science. He served on a department curriculum committee and rallied support for these views. When he became department head in 1952, he immediately instituted a curriculum review to identify the underlying sciences in all areas, and relate them to engineering techniques. Six undergraduate textbooks, called the "Green Books" after the color of their covers, were written during the late 1950s.
Brown kept colleagues at other universities informed, giving them free access to MIT's most recent thoughts. Doctoral graduates from this era took teaching positions here and elsewhere and spread the word. This policy of free sharing of curricular material continues today with the MIT OpenCourseWare program announced in 2002. In 1959 Brown became MIT's Dean of Engineering and started to promote similar ideas for other engineering disciplines.
In short, Brown's update to Jackson's vision was the recognition that science changes rapidly, and that if engineers participate in the process, the changes will arrive in a form engineers can use.
The first major test of this engineering-science approach was provided by semiconductor circuits. The transistor was invented in 1947; circuit applications began in the 1950s; the integrated circuit came along in 1960. Universities had to include transistors and integrated circuits in their undergraduate programs. But how? Should devices be taught in terms of terminal characteristics or the internal physics? What if fields of science not previously thought relevant were needed? Could nonlinear circuits be covered? How much solid-state physics would be required?
MIT led the way in answering these questions. In the fall of 1960 Richard B. Adler and Campbell L. Searle organized the Semiconductor Electronics Education Committee (SEEC). By 1966, 31 people from nine universities and six companies had produced seven coordinated textbooks and related curricular material, aimed at third-year and fourth-year electrical engineering students. The books featured more solid-state physics than had ever before been used in teaching electronics. In the books, semiconductor-device models were derived from the solid-state physics, and these models in turn were used in transistor circuits.
SEEC was a triumph of engineering science, with a substantial, lasting impact. The basic ideas influenced many textbooks written in subsequent years. The approaches are still used in EE education throughout the world, even though the SEEC books themselves can no longer claim contemporary relevance because they were never updated to cover integrated circuits, MOS devices, or much on digital circuits.
Engineering science, as a paradigm for engineering education, survived its first major test. The next test, posed by the rising importance of computer science, would prove more challenging.
The first electronic digital computers were made during World War II. Their importance was recognized in academic circles in the 1950s, and their use became common during the 1960s. Two research groups devoted to computer science were established at MIT--the Artificial Intelligence Group in 1959 and Project MAC in 1963.
But to teach computer science at the undergraduate level according to the engineering-science paradigm, an appropriate science base needed to be identified. Here a difficulty was encountered. Unlike the case with electrical engineering topics, there were no natural laws or previously developed science to guide practical techniques in programming, architecture, or artificial intelligence. Mathematical theories of lambda calculus, Turing machines, algorithmic complexity, and Boolean algebra were not enough.
The approach taken was to develop the courses anyway and not worry about the science base. Some of the early courses were highly theoretical. Others were very practical, tightly coupled to contemporary hardware or mainstream computer languages. These courses were popular, and before long there were enough of them, with enough intellectual coherence, to constitute a degree program in computer science and engineering. The program was announced in 1969, and the first such degrees were awarded in 1975.
But what about the engineering-science paradigm? Was it still relevant? It had given electrical engineering two principal benefits: first, a foundation usable by graduates for their entire careers (40 years), and second, the opportunity for the engineering community to maintain its own intellectual underpinnings. In retrospect it can be seen that those who developed the computer science curriculum obtained these same benefits by other means. They identified fundamental, generic concepts that were commonly encountered in contemporary technologies, and taught them, adding examples from current practice. This body of generic knowledge was meant to last 40 years, and was in a form that met the needs of computer scientists. Some day, this body of knowledge may be recognized as a science in its own right, in which case the engineering-science paradigm will have survived in the long run.
The rise of computer science raised fundamental questions about its relation to electrical engineering. Were the two basically inseparable or were they different? How would they evolve? If the two fields were expected to diverge, then computer science should have its own separate department. If not, then establishing a separate department would be a costly mistake. Either way, the wrong departmental structure would seriously jeopardize MIT's position of technical leadership. The discussion of this issue in the early 1970s was surely the most important debate the department has ever had. The future of electrical engineering would be very different without a strong connection to computer science, and vice versa.
There were several arguments in favor of a split. The department was already big--in some minds too big. Because it was technically broad, no single department head could provide leadership in all technical areas (in fact, starting in 1971 there had been associate department heads, one from electrical engineering and one from computer science). The research and teaching styles of the two fields were different, because computer science was new, rapidly evolving, and more empirical. Some computer scientists felt their field should be free to develop in its own way, unencumbered by established engineering traditions (those making this argument tended to view computer science as a branch of mathematics). Finally, a split would be easy because the computer science people were already housed in a separate building.
But there were powerful arguments in favor of staying as one department. If computer science were not tied to an engineering discipline it could not benefit from proven approaches to engineering (those making this argument viewed computer science as a type of engineering). Some argued against the extra cost of separate administrations. Others feared that without the excitement of computer science, electrical engineering would stagnate, becoming less interesting to both faculty and students.
Perhaps the strongest argument in favor of a single department was based on the belief that computer hardware and software would eventually be indistinguishable from electrical systems. Of course computers were made of electronic components, but that was not the point. The point was that other electrical systems would, before long, include components that either resembled computers or were computers. EE graduates would not be able to design such systems if they did not understand computer science. Electrical engineering and computer science would not diverge, but would remain close together, and in effect act like a single discipline. This argument was repeatedly validated in later years, by advances in digital circuits and digital signal processing in the 1970s, VLSI in the 1980s, networking in the 1990s, and now embedded computing.
In 1974 the decision was made to remain one department. Soon after, in an informal poll conducted by Joel Moses, the department faculty voted to change the department's name to Electrical Engineering and Computer Science (EECS), in recognition of the permanence and importance of computer science.
With this decision made, the next task was to bring into harmony the two separate curricula, one in electrical engineering and one in computer science. A committee was formed to examine whether the two degree programs should have a common set of beginning courses, and concluded that they should. All EECS graduates needed to know about computer programming, electric and electronic circuits, signal processing, and computer architecture. These four courses already existed, and were adapted for this new role. They still serve as the common core of all departmental undergraduate programs.
The retention of computer science in the department's degree programs, however desirable, led to new problems. There was simply too much material in the curricula. And many students wanted to learn both EE and CS in more depth than either program permitted. These issues precipitated another curriculum revision during the 1990s.
The design of electronic circuits changed quickly in the 1970s and 1980s as digital circuits supplanted analog circuits. This revolution reinforced the growing computer field, and computer science gained importance at MIT and elsewhere. Soon it was hard to imagine electronic systems without digital computation or signal processing. So many technologies were essential that a growing number of students studied for a master's degree, either right away or after a few years. Their employers usually agreed and paid the bill.
MIT had correctly foreseen these developments and had decided to keep electrical engineering and computer science in the same department. This decision had many benefits, but one disadvantage: the technical scope made it difficult to design a single curriculum of adequate breadth without being too shallow. MIT's approach was to have two curricula, one in EE and one in CS, with a common core. This approach represented a compromise between depth and breadth.
During the 1980s some faculty, especially William M. Siebert, concluded that the department was not doing students a favor by restricting either depth or breadth. Both were needed.
In 1989 a curriculum committee was formed to consider how to combine technical depth and breadth. The conclusion was to retain both by extending the program to five years. After five years of study, students would deserve an additional degree, but it would not be the typical research-oriented Master of Science degree. Thus was born the MIT Master of Engineering degree.
The committee recommended that students who were capable of writing a master's thesis should have that opportunity. Although bachelor's degrees would still be available, the M.Eng. degree would be considered the department's flagship program, open only to department undergraduates. The S.M. program would be retained for students from outside.
The result, then, was a new model spanning the three degree levels: bachelor's, master's, and doctoral.
Curricula based on this model were approved in 1993, and the first M.Eng. degree was awarded in 1994. The M.Eng. program is consistent with Dugald Jackson's 1903 vision, with added research-oriented activities. It is consistent with the engineering-science model of Gordon Brown, with greater technical breadth than he envisioned. It is consistent with the students' demonstrated demand for education beyond the bachelor's degree. And it satisfies the desire of many students for more breadth than either the EE or the CS program alone provides.
The new degree has been popular with students; typically over half of the EECS undergraduates continue for the fifth year. This five-year model has not yet been widely adopted elsewhere. Other MIT engineering departments define their M.Eng. degrees as professional one-year programs open to graduates of other universities. Some let the better undergraduates pursue a five-year combined program, but only on an individually arranged basis. Only a few other universities have been able to put similar programs in place.
In 1998 the M.Eng. program was evaluated; the principal disadvantage identified was that first-year graduate courses had to be modified to accommodate the large number of M.Eng. students, who were on average less scholarly than the doctoral students.
The curriculum committee also reorganized the two bachelor's degree programs, "VI-1" for EE and "VI-3" for CS, and gave them the same structure, making it easy for students to design programs that combined EE and CS in novel ways. A new undergraduate program was added with greater breadth across EE and CS at the expense of some specialization. This "VI-2" program proved very popular. Apparently students want to keep their options open and prepare for a world in which the boundary between things electrical and things computational is at best fuzzy, and perhaps even nonexistent.
The MIT EECS M.Eng. and VI-2 programs are successful, but they are certainly not the final word in the evolution of the department's programs. What will motivate the next curricular changes? Perhaps more biological material will be needed. Perhaps cognitive science will merge with artificial intelligence. Perhaps quantum mechanics will become critical. Perhaps students will need a better appreciation of the social, economic, and political context of engineering. Perhaps our programs can be made more accessible and attractive, particularly to women and underrepresented minorities.
The study of our department for the past century has identified some general trends--in our technology, in our department, and in the global community. These trends suggest what awaits us in the next century. It is natural to ask how the department, especially its educational programs, might evolve. What challenges will our department and our professional field face? Here are five, this observer's personal choices.
The technical domain of interest to this department has steadily expanded. So far, it has been our practice to retain newly developed or emerging technologies. The most prominent example is keeping computer science within the department, but there are many others, including radio, control theory, microwaves, optics, system theory, artificial intelligence, and semiconductor fabrication. Any of these might, under different circumstances, have been considered candidates for specialized degree programs or new departments.
In the future the department will make other such decisions. For example, the future might see the development of a discipline--defined by its educational programs--called quantum engineering. Techniques for making physical devices--drawn from nanofabrication, photonics, MEMS, or nuclear imaging--would be combined with quantum computing, quantum communications, quantum control, quantum cryptography, and quantum information. Quantum systems, we know, can do things classical systems cannot. Who will define what it means to be a quantum engineer? Who will develop the degree programs? Will our department play a leadership role?
Our first challenge in the years ahead, then, will be to embrace the new specialties in a way that preserves the intellectual core of the department.
Our increased technical scope has been accompanied by higher and higher levels of abstraction. A hundred years ago the department had a "shop culture" that reflected the electrical industry at that time. Today there is a more abstract, theoretical character to what we expect our students to know. Many arrive at MIT without hands-on experience and never gain much familiarity with the concrete examples from which the abstractions have been drawn.
We cannot stop emphasizing the abstract, because our students need to understand concepts at that level. Our second challenge will be to ensure that students can intuitively comprehend and appreciate the links between concrete examples and the related abstractions.
Some advances in technology have the property that they are unarguably superior to earlier technology and as a result completely replace it. A new memory chip may be smaller, consume less power, operate faster, and be more reliable than a chip from the previous generation. The older technology is rendered obsolete. The replacement of vacuum tubes by transistors, and of discrete transistors by integrated circuits, are examples of such technology trends.
It is not easy to teach technology knowing that it will be replaced soon. How can we be sure that our graduates are prepared for the long run? Our third challenge will be to continue focusing on the fundamentals that will remain valid and relevant during a graduate's 40-year career.
Some trends in technology and its applications do not make older approaches obsolete but merely make them less dominant. The shift of emphasis from analog to digital circuits, described earlier, is one example from the past century: analog techniques lost their dominance but remain important.
These trends may or may not be reversible. The older ideas are not obsolete, or at least not yet. Our graduates should be able to evaluate competing ideas in particular circumstances. The fourth challenge to the department is to teach competing approaches and application areas without letting the new ideas crowd out older ideas that are still of substantial importance.
Gordon Brown's educational vision is 50 years old and Dugald Jackson's twice that. These visions have been expressed here in terms of what engineers should be able to do--apply known techniques, develop new techniques from known science, and develop new engineering science. It falls to us, as heirs to these visions, to see if they are still sufficient, or if more should be expected of engineers today. My conclusion is that at least some of our graduates should be prepared to undertake a higher level of social responsibility. A bit of history beyond our department's will help explain this fifth challenge.
In 1893 the University of Wisconsin was small, with only 61 professors. One of them was Dugald Jackson, who had just established Wisconsin's department of electrical engineering. Another was the historian Frederick Jackson Turner, who that year revolutionized the study of American history. In a talk delivered at the Columbian Exposition in Chicago, he said that the existence of America's western frontier was "the fundamental, dominating fact" that shaped the character of the American people and the nature of the nation's institutions. This "Turner thesis" soon became the most important paradigm in the study of American history. (Jackson also attended that Exposition, and while there he and others founded what is today the American Society for Engineering Education.)
Jackson and Turner had much in common. They were about the same age. Each had worked in Chicago before coming to Wisconsin, had clear vision, could express himself well, and would in time become a leader in his own field. At one point the two men served together on a faculty committee to "consider the condition of athletics in the University"--evidently football rowdiness had led Turner to fear that human values were "put in wrong perspective and the fundamental purpose of the University lost sight of." Three years after Jackson came to MIT, Turner moved to Harvard.
Turner, the historian, understood in 1893 that the western frontier was rapidly vanishing, though its influence would remain. But presumably he did not know what the next dominating influence on America's development would be. It turned out to be a different "frontier," one that would be familiar to his colleague Dugald Jackson.
Fifty years after Turner introduced his thesis, the Second World War was under way. Vannevar Bush, who had left the MIT electrical engineering department, was serving in Washington, D.C. In 1945 he wrote a seminal proposal for a system of federal support of scientific and engineering research, and called it "Science, the Endless Frontier." Bush had a right to use this title because his own field, electrical engineering, was on that frontier. A young, vibrant, immature discipline, it exploited scientific advances rapidly. The intellectual excitement of electrical engineering was a direct consequence of its proximity to the scientific frontier.
Besides being exciting, electrical engineering, and later computer science, have been essential to America's development. Their impact has been enormous. Consider the list of the ten "Greatest Engineering Achievements of the 20th Century," as judged by the National Academy of Engineering (NAE) in 2000. Half are based on EECS-related technologies--electrification, electronics, radio and television, computers, and the telephone. (The other five--the automobile, the airplane, water supply, agricultural mechanization, and refrigeration--are more closely connected to other engineering disciplines.)
Although I am not a historian, it seems to me that the exploitation of this scientific frontier, especially by electrical engineers and computer scientists, has shaped America as much in the 20th century as the western frontier did in earlier times. The successor to the Turner thesis, then, may be a similar thesis but one involving a different kind of frontier: the frontier of science.
Bush called the scientific frontier "endless." But is it, really? And will electrical engineering and computer science keep their privileged position on this frontier?
It does seem so. Many engineering achievements involving EECS technologies, including the Internet, the laser, the World Wide Web, the solar cell, embedded computation, signal processing, artificial intelligence, control systems, and MEMS, were not on the NAE top ten, but seem poised to shape the 21st century. Or consider Moore's law, the famous observation by Gordon Moore in 1965 that the number of devices on an integrated circuit doubles every year or two. This trend has continued to this day and there is no end in sight, short of the limitations imposed by quantum mechanics (and even those may represent opportunities rather than obstacles). Whenever people try to predict when Moore's law will expire, they forget about the inventiveness of modern engineers and their ability to get around all but the most fundamental limits of nature.
So the next century will, in my opinion, bring more and more exciting scientific advances to be exploited by our fields of engineering, and these technologies will exert a continuing influence on America and the rest of the world. In other words, in the 21st century, as in the 20th, we will continue to live and work on an important frontier.
Life on the frontier is exciting. Research thrives where there is ambiguity, where much is unknown; overturning a major principle or law is considered a success, an accomplishment worthy of distinction. The disruptive, somewhat chaotic, character of frontier life is one we engineers relish.
But most institutions in a civilized society need stability and predictability. Consider what happened to America's western frontier. Civilization arrived and brought with it law and order. For better or for worse, the frontier became a more predictable and less exciting place.
Is it our turn now? Our scientific and engineering frontier is of critical importance to America. Must our frontier become "civilized"? History suggests that it must. In fact, it is already happening.
We are already confronting, and will continue to confront in the years ahead, tensions between the ambiguity inherent in the scientific frontier and the predictability required by society. Every day newspapers report examples of legal, political, and economic institutions grappling with new technologies that they only vaguely understand, and often perceive as a threat. Think of the frictions between technological standardization and product differentiation. Or between intellectual property and information freedom. Think about why regulated monopolies resist new technology. Or why e-mail spam is such a problem.
The issue is not whether "law and order" will be established, but how. Will the crude tools available to America's legal, political, and economic systems be used to impose stability in a way that reduces the excitement that nourishes technological development? Will scientific studies of some types be restricted or even forbidden? Will long-established institutions resist the opportunities for improvement afforded by engineering advances? Or can society be persuaded to accept new technologies? Can the engineering community lead the movement for responsive and responsible change?
Dugald Jackson said in 1903 that engineers, besides knowing science, "must know men and the affairs of men...must be acquainted with business methods and the affairs of the business world." In 1911 he expanded on this point, saying that "it is the duty of engineers to do their share in moulding their various economic creatures [companies and even sectors] so that the creatures may reach the greatest practicable usefulness to society." But both Jackson and Brown stopped short of saying that engineers should help the nation's institutions change to accommodate new technology.
Today, the need is different. Both science and society are changing rapidly, partly because of advances in technology. Because the institutions of modern society need to adapt to modern technology, they need help from those who fully comprehend that technology. In other words, society will be best served if we engineers take an active role.
The fifth challenge to this department, then, is to educate students so that at least some of them are prepared to help the world understand and embrace rapid changes in technology, and use them wisely. In my judgment, this is our most important challenge of all. If we meet it, society will be better off, and we will have earned the right to continue to work on the scientific frontier with all the excitement that we so cherish.
This essay was written for a book celebrating the centennial of the MIT Department of Electrical Engineering and Computer Science. The author takes pleasure in acknowledging the helpful advice from the committee set up to produce this book: Ellen Williams, staff and chair; Fernando J. Corbató, Robert M. Fano, Paul E. Gray, John V. Guttag, J. Francis Reintjes, and the late Hermann A. Haus, members. Some information about Dugald Jackson's time at the University of Wisconsin was provided with the help of Bahaa Saleh, Christopher L. DeMarco, and Donald W. Novotny. Several present and former MIT colleagues contributed to the author's understanding of the events and trends covered here. Many helpful suggestions about the writing were made by Barbara B. Penfield.
A spoken version of this essay was presented at the centennial symposium, May 23, 2003. The visual images for that presentation were gathered and organized by Ellen Williams and Abigail Mieko Vargus.
Finally, the readability of the essay is in large part due to Ellen Williams, who served as a technical editor and insisted on clarity and consistency. The author is grateful for her efforts, without which many of the points would not be nearly as well thought out.