Responsible innovation

The 1970s, today, and the implications for equitable growth

Introduction

Consider the themes that predominate in American political discourse today. Political polarization. A seemingly endless land war in Asia. Anxieties about/celebration of changing sexual mores. Debate about race, gender, and class inequities. Disruptions of older industries and the rise of new ones thanks to offshoring and automation. Worries about the geopolitical, economic, and environmental implications of dependence on fossil fuels. A Cold War with Russia. Hot wars among the nations of the Middle East. Plus ça change!

Literally none of these are new. Indeed, in the years around 1970, every one of these themes was in the headlines on a daily basis, often treated in the same way as today. Much has changed since then, of course—back then, we had a military draft and daily casualty lists, the post-industrial society was still only a prediction, major cities were on their way down not up, the Baby Boom generation’s demographic bulge was centered on youth cohorts not retirees, and the terrorists appearing in the headlines were more often Marxists than Islamists. And even where we can discern similarities between today and the early ‘70s, we need to be skeptical of presentism or reductive lessons from the past. History may rhyme, but it doesn’t repeat.

Still, parallels with the past can remind us that what we think is new is not, and that what we think will work now has failed before. Histories of the past—much like scenario planning for the future—can help us identify the constellations of actors and interests that are party to different issues, and how those actors and interests might interact and evolve. And sometimes the past is genealogically, not analogically, related to the pressing issues of today and can therefore help us diagnose our situation.

This report will argue that science and technology policy is one of the areas where looking to the years around 1970 is especially useful in thinking about strategies to promote innovation-led equitable economic growth today. The late 1960s and early 1970s were a period of incredible stress for the nation’s research-and-development enterprise. Research budgets that had been climbing steadily since the early ‘50s—and which some leading scientists expected to continue rising forever—suddenly flat-lined for what turned out to be 15 years, from 1968 to 1983. Employment in some technical fields, especially in the engineering and physical sciences, cratered, leaving applicants with worse prospects than during the Great Depression.1 Student activists, politicians, community organizers, and many scientists themselves agitated—occasionally violently—for reforms to U.S. R&D processes.2 Who should fund it? What topics should receive attention? Who should do the research? How should researchers interact with a greater variety of stakeholders?

In and around 1970, the pipeline of undergraduate and graduate students in the sciences thinned dramatically—especially in comparison to the post-Sputnik space-race bulge—thanks to the poor job market awaiting those coming out and antipathy toward military-industrial sponsorship of research deterring those going in. Looking around at their declining budgets and enrollments, the bombing and burning of campus laboratories, and growing national interest in parapsychology and New Age religion, many scientists and engineers believed America was in the grip of an anti-science and anti-technology fever, though if they’d looked more carefully, they might have seen that many activists just wanted a different kind of science and technology than the kind forged in the early Cold War.

And a different kind of science they got, in part. Researchers in the United States did become less dependent on the national security state, though not to the extent demanded by campus radicals such as the Students for a Democratic Society. Representation of women and some ethnic minorities increased in many fields—dramatically in some; hardly at all in others—though not to the extent imaginable at the time. Policies encouraging or requiring researchers to connect to the market, to civil society at large, and to local communities in particular became standard at many granting agencies. The cavalier attitude of many researchers toward the environmental and occupational hazards of laboratory research waned. In science as much as in advertising, we no longer live in the world of Mad Men.

Moreover, despite (or in some cases because of) these stresses on the research enterprise of the late ‘60s and early ‘70s, that period saw some remarkable achievements by U.S. scientists and engineers across a variety of fields: the first successful predictions from the Standard Model in high-energy physics; the Apollo and Viking landings and Pioneer and Voyager flybys in space exploration; the invention of recombinant DNA techniques in molecular biology; the “microprocessor revolution” in electrical engineering; the discovery of hydrothermal vents in oceanography; the prediction of depletion of the ozone layer in climate science.

The 1970s also were the period when the field of science and technology studies emerged. The timing is no coincidence. Many of the early practitioners of this new field of study were reformist scientists. Some practitioners (including quite a few natural scientists) had come of age in the feminist movement (and to a lesser extent the gay rights, civil rights, and various Marxist, pacifist, and anticolonial movements) and approached science and technology studies as a place to apply epistemologies and social consciousness developed in those arenas. Others were more cautious reformers—some were antagonistic toward social theory, counterculture, and student activism—but helped organize programs in this new field of study out of commitment to the expansive vision of interdisciplinary collaboration that flourished in U.S. academic and science policy circles in this period. And yet another branch of early science and technology studies included social scientists eager to apply their field’s tools to science in an era when students, funding, and prestige were momentarily flowing from the natural sciences to the social sciences rather than the other way around.

In other words, the origins and early outlook of science and technology studies should be interpreted in the context of the upheavals sweeping the United States (and more generally the Anglophone and Western European) research enterprise around 1970. Science and technology studies should, in particular, be seen as inheriting pieces of several of the reform movements that arose in that era, which sought to improve the research enterprise by questioning its practitioners’ assumptions and broadening their perspectives.

One of the main preoccupations of science and technology studies then and today is the concept of “responsible innovation.”3 The promotion of innovation that aims for responsiveness to society and responsibility toward the environment is an important area of activity in this field. Yet the phrase “responsible innovation” has never been used with more frequency than in the early 1970s. So what was going on in those years? In part, this was simply a moment when Americans were being asked to recognize their “responsibility” for a lot of things: racism, sexism, colonialism, poverty, environmental degradation, the potential for nuclear annihilation, and the actuality of conflicts in Southeast Asia and elsewhere. But that can’t be all. “Responsibility” and “innovation” were paired so often in this period because of the multiple crises—both real and perceived—facing American science and society at the time. “Innovation,” as long as it was done responsibly, could offer solutions to a long list of problems that Americans believed they were besieged by.

Conversely, civic “responsibility” could offer legitimacy to a similarly besieged research-and-development enterprise. Examples of “irresponsible” innovation and research were continually in the public eye in this period. Activists on the left probably had a more expansive idea of what was irresponsible, but one of the hallmarks of this period is the speed with which research that the mainstream formerly deemed unproblematic—such as the racist Tuskegee Syphilis Study—suddenly became condemnable by all. Making innovation responsible—showing Americans that scientists and engineers were hard at work solving civil society’s ills—became a favored means to escape condemnation and regain science’s early postwar authority.

But what counted as “responsible innovation”? And how did it play into more stable and sustainable economic growth? If we want to foster it today, it seems to me that we need to understand how the referent of that term today is and is not different from what it meant in its former heyday. And, to the extent that it means the same thing today, we need to examine what happened to the proto-responsible innovation movement of yesteryear—what worked, what didn’t, and what was retained or washed away as the circa-1970 boom in this field faded so thoroughly that we have trouble remembering it today.

The rest of this report, therefore, is composed of a series of vignettes from this period that attempt to convey the multiple, overlapping meanings of responsible innovation in the late ‘60s and early ‘70s. I’ve presented some of these vignettes in longer form elsewhere.4 I will eventually present all of them in a book tentatively entitled “Through Change and Through Storm: U.S. Physical and Engineering Scientists in the Long 1970s.” The vignettes are not a randomly selected representative sample of American science around 1970. Rather, each case study exemplifies certain trends such that a U.S. scientist or engineer from that era could read the vignette and say “yes, that is the kind of thing that is happening in lots of places these days,” whether or not they agreed that “that” ought to be happening.

Some of the themes that emerge from the cases have receded or disappeared today; others are still going strong, but their origins have been forgotten. Collectively, the cases show that different aspects of reform were entangled, in ways that complicate reforms to innovation governance, then and now. Still, I’ll draw some concluding lessons from the cases for how responsible innovation today might be done with an eye to the past and to its implications for more sustainable and equitable economic growth.

A federation of bull sessions: Interdisciplinarity as a panacea at Stanford

Much has been written about how Stanford University became one of the outstanding winners of the early Cold War up to 1970, as well as how it became the paradigmatic entrepreneurial university from the early ‘80s to the present. Yet we know very little about how Stanford navigated the ‘70s, and how it transformed from a Cold War university into an entrepreneurial university. Like many Vietnam-era campuses, Stanford was riven by (sometimes violent) protests. Unlike most of their peers, though, Stanford’s protesters were particularly focused on research reform as a means to undermine the military-industrial complex, increase democratic engagement, and find solutions for poverty, racism, and environmental degradation.

Close inspection shows that two buzzwords dominated debates about research at Stanford in the early ‘70s: “problem-oriented” and “interdisciplinary.” Usually, these terms were paired. That is, the turmoil of the Vietnam era promoted research that could be presented as interdisciplinary and as applicable to civilian social problems—and preferably as both. Researchers who until the late ‘60s had been working almost entirely within their own disciplines, almost entirely dependent on national security funding, and almost entirely focused on basic research and/or applied research oriented to national security needs, now turned to new funders, new collaborators, and new topics. Electrical engineers began working with medical school faculty, computer scientists with musicians, aeronautical engineers with philosophers. People from the communities surrounding Palo Alto—especially poor and/or majority-minority neighborhoods such as East San Jose—began showing up in Stanford labs more often, as did blind, hard-of-hearing, and other individuals with sympathetic disabilities who could help Stanford show that it was responding to calls for reform.

With the end of the draft in 1973, though, the appetite for reform receded. What University of Oregon management professor Andrew Nelson and I have termed “radical interdisciplinarity” proved more difficult to fund and carry out than originally envisioned.5 Many Stanford researchers returned to more modest forms of interdisciplinary collaboration and to national security funders. Still, things did not return to the status quo ante. For instance, Nelson has shown that the number of degree-granting interdisciplinary centers at Stanford doubled in 1969 and then grew for 20 years at seven times the annual rate of the previous 20 years. Similarly, while national security research funding never disappeared entirely (nor should it have), after the late ‘60s, Stanford scientists and engineers increasingly turned—for better and worse—to funding from civilian federal agencies, state and local governments, and private firms and foundations.

Today’s aggressively entrepreneurial and interdisciplinary Stanford would be unthinkable without the disruptions of the Vietnam era, even if it bears little resemblance to the university imagined by that era’s activists.

An aerial view of Stanford’s campus. Credit: Jrissman, via Wikimedia Commons

Nothing fails like success: From the Moon to Earth at NASA

NASA’s Manned Spacecraft Center faced the seemingly enviable problem of “existential success” as it entered the 1970s. Almost from its founding, the center had been organized around the mission of putting a man on the moon—so now what? One answer was to develop partnerships outside the field of space exploration. Success in its existential mission showed other organizations that the Manned Spacecraft Center had problem-solving expertise that could help them contend with the era’s challenges. Moreover, completion of its existential mission—and the lull in operations before its next major undertaking (the space shuttle) entered service—encouraged the center to seek such collaborations while undermining its ability to avoid them.

In 1971, for instance, the Nixon administration arranged a shotgun marriage between the Manned Spacecraft Center and the Department of Housing and Urban Development that resulted in the half-decade oddity of an Urban Studies Projects Office appearing on the organization chart of a center devoted to manned spaceflight. The Urban Studies Projects Office had nothing to do with space colonies; rather, it attempted to bring NASA engineers’ expertise in space capsule “life support” to bear on the problem of supplying life support to humbler terrestrial residences such as mobile homes and apartment complexes. In a burst of initial enthusiasm, the office offered the Department of Housing and Urban Development proposals for the kind of architectural future posited in movies such as Sleeper and Logan’s Run—proposals which were unimplementable, if not unintelligible, to a federal housing agency mired in the universe of Serpico and Death Wish. As the decade wore on, the Manned Spacecraft Center’s contributions became tamer and its engineers and administrators less enthusiastic, until finally the looming shuttle program offered the grounds for escape.

This odd-couple partnership was hardly unique, either in objectives or trajectory. Within the Manned Spacecraft Center, something like wild-type responsible innovation experiments proliferated in this era: from converting the lunar rover for use by paraplegics; to adapting techniques for remotely monitoring astronauts’ vital signs in space to the problem of monitoring Native American communities’ health on reservations in New Mexico; to working with Meals on Wheels to bring modified Skylab dinners to poor, elderly Texas shut-ins. Even the center’s actual and planned space operations had a responsible innovation dimension that they had not had in the ‘60s and would lose again in the ‘80s—in particular the Earth Resources program, which used packages on Skylab and Landsat to monitor pollution, urban land management, and public health threats, as well as the proposed Solar Power Satellite, a permanent space colony designed to beam photovoltaic power directly into the U.S. electrical grid.

The dynamics of existential success made the Manned Spacecraft Center especially prone to such ventures, but similar engagement with civilian social problems swept across the military-industrial research complex in the early ‘70s. Other NASA centers, for instance, forayed into smog reduction and alternative energy (photovoltaic, photothermal, and wind turbine) development. Most of these programs receded in the 1980s, but it is fair to say that NASA, an agency born as a weapon of the Cold War, civilianized permanently if not completely in the 1970s.

Many leading aerospace firms and federal agencies tried to follow suit by branching into fields such as mass transit and solar power. Even the Atomic Energy Commission, the direct descendant of the Manhattan Project and thereby producer of the U.S. nuclear weapons stockpile, evolved in the ‘70s into the more civilian and diversified Energy Research and Development Administration and then into the Department of Energy. Many physicists at these agencies’ National Laboratories transitioned from weapons work to alternative energy research and biomedicine. Others left the National Labs and formed research clusters in universities and think tanks devoted to the same sorts of topics pursued at NASA and Stanford, such as, memorably, a group at Brookhaven National Laboratory who moved into the State University of New York system and formed a partnership with New York City’s Uniformed Sanitationmen’s Association and its Commissioner of Sanitation to help the city pick up its trash more efficiently.

Few of these projects prospered and most have been forgotten, in part because the philosophy of active state intervention in (certain) social problems waned during the Carter and Reagan administrations. But the civilianization of the ‘70s—both at federal agencies like NASA and universities like Stanford—was sticky enough to linger through the ‘80s, and to set the stage for a more expansive civilianization under the post-Cold War Clinton administration. More broadly, these changes restructured how Americans thought about research. Early Cold War policymakers and institutional entrepreneurs generated a vast stockpile of personnel, research infrastructure, and knowledge to help defeat the Soviets. By the late ‘60s, that system of innovation governance had succeeded to the point of failure. Not just in the space race, but all across the scientific and technological front of the Cold War, the Soviets now lagged too far to count as foes. Therefore, the U.S. research and development enterprise had to be transformed to contend with civilian domestic problems, with economic and technical competition with Western Europe and Japan, and with the more complex national security concerns of what would soon be the post-Cold War era.

President John F. Kennedy delivers his proposal to put a man on the Moon before a joint session of Congress, May 25, 1961. Credit: NASA, via Wikimedia Commons

Turn on, tune in, start up: The experimental life in Santa Barbara

In 1960, the city of Santa Barbara’s physics community was dominated by defense think tanks modeled on the RAND Corporation; by 1990, the think tanks had disappeared, replaced by a university physics department in the global first rank, and a thriving cluster of high-tech startup companies. That transformation was largely due to local physicists’ creative responses to the dislocations outlined in the previous vignettes.

The think tanks were the first to move, as the Southeast Asian conflict and the late Secretary of Defense Robert McNamara’s attempts to rein in defense spending brought a halt to the funding glut that had given rise to the think tank industry in the first place. Many defense think tanks diversified in the late ‘60s into the same civilian-minded topics highlighted above. Some think tanks also spun off startup companies that tried to commercialize their researchers’ discoveries for civilian markets such as pollution monitoring and health care.

The tendency of military-industrial manufacturing firms to seek civilian markets and spawn more civilian-minded startup manufacturing companies in the late ‘60s has been noted before, especially by historians of the semiconductor industry. And historians have noted the contemporaneous civilianization of research topics at defense think tanks as well. In Santa Barbara, though, declines in military research funding and shifts in national political culture encouraged defense think tanks to spin off startups dedicated to high-tech manufacturing, not just research.

That’s significant because within a couple of years, U.S. research universities would begin to give rise to high-tech manufacturing startups as well. And what, really, is the difference between a defense think tank and a Cold War research university? True, universities have students. But plenty of think tanks hosted graduate students, and plenty of Cold War university administrators neglected undergraduate education as much as they could—so the distinction is a blurry one.

Nowhere was that more true than in Santa Barbara’s physics and computer science communities. In those fields, think tank researchers taught courses at the University of California-Santa Barbara, while university faculty started, in the early ‘70s, to take jobs in think tank spinoffs and eventually to form their own high-tech startups. As at Stanford, the entrepreneurial university was a product of the dislocations of the Vietnam era. But at both UC-Santa Barbara and Stanford, academic entrepreneurship was only one of several avenues of reform in innovation governance. At the time, more energy was put into organizational innovations such as new interdisciplinary centers; pedagogical innovations designed to restock graduate and undergraduate enrollments (and stem campus activists’ critiques of science); and novel modes of community outreach. Then, as now, most faculty members quite reasonably tried to weather crisis by doing high-quality research as measured by the standards by which they had been trained (science in an early Cold War mode), while a few took the upheavals around them as an opportunity to try out a more precarious but personally meaningful kind of science.

Thus, in the late ‘60s and early ‘70s, members of the UC-Santa Barbara physics department formed a new Quantum Institute hosting applied, civilian, interdisciplinary research and a Physics Learning Center hosting local schoolchildren; began teaching courses on parapsychology and environmental science; and founded garage “companies” selling everything from divining rods to bookcases to transistorized acupuncture devices. The department even added a new degree program, a master’s in scientific instrumentation, out of fears that its Ph.D. program might be canceled. As one might expect given the previous vignettes, the master’s program advertised for students who wanted to do applied, interdisciplinary research in collaboration with off-campus civic institutions such as community hospitals. And that’s what they got—early student projects emphasized the same topics we’ve seen at Stanford and NASA: aids for the blind and hard-of-hearing; biomedical instrumentation; pollution monitoring equipment—as well as apparatus for measuring parapsychological phenomena.

The faculty members most involved with the master’s program were also those who had small lifestyle firms in their garages. And so, gradually, organically, and without much thought given to profit, student projects were transformed into commercial products of high-tech academic “startups,” albeit products sold in very low volumes at small markups and to unlikely customers such as parapsychology enthusiasts and schools for the deaf. Over time, the profits increased, the startups grew bigger, and the products and customers became more “serious.”

Once the Reagan-era buildup reversed the funding and enrollment crises of the ‘70s, the department canceled the master’s program, which had long been seen as an embarrassment for the more prestigious (if imperiled) Ph.D. program. In response, the faculty head of the master’s program left the university to run his startup full-time—a startup that soon became the core of a bustling local high-tech cluster, and an engine spitting out serial entrepreneurs and millionaire philanthropists.

That’s an endpoint that any American research university would claim to want, and UC-Santa Barbara has done its best to benefit from, and take credit for, its local high-tech cluster. But how did this university actually get there, and would or could any university willingly and willfully go down the path the university took? The most salient ingredients that went into this high-tech cluster were: precipitous declines in research funding, enrollments, and public legitimacy; a master’s program viewed by core faculty as embarrassing but temporarily necessary; and a cadre of alienated departmental faculty, students, and technicians more interested in private enthusiasms such as parapsychology, and public desiderata such as disability and pollution technologies, than in “serious” physics topics like superconductivity and the Standard Model. UC-Santa Barbara got a high-tech cluster despite, not because of, its best efforts—and it only got one because academic entrepreneurship was entangled with pedagogical reforms and institutional innovations driven by shifting federal budgetary priorities and the counterculture.

Credit: University of California, Santa Barbara

Burnt by the sun: Jack Kilby and the solar boom and bust

Several recent influential books have emphasized the links among U.S. scientists, engineers, the counterculture, and the New Left in the 1970s.6 As my other vignettes should make plain, those links were many, varied, and fascinating. But focusing too much on the counterculture may leave us blind to those scientists and engineers who were unsympathetic toward the New Left and the youth movement, yet who nevertheless hoped to incrementally adapt the institutions of the military-industrial-academic complex to address matters of civil, rather than national, security. Though I haven’t highlighted them much thus far, such middle-aged, white, male, middle-class, “square scientists” were abundant in each of the settings I’ve described. Some were agnostic or even quietly curious about the youth movement, but many others ranged from aggressively skeptical to furiously hostile—even as they were reforming their research in exactly the ways the New Left demanded. There was plenty of common ground between these camps, yet no one wanted to be seen to occupy it together.

One very square and very influential scientist was Jack Kilby, co-inventor of the integrated circuit, for which he shared a Nobel Prize. A rather famous story goes that Kilby invented the integrated circuit in his first month of work at Texas Instruments in 1958 while his colleagues were all on vacation. Much less well known is that Kilby took a leave of absence from the company in 1970 to work as an independent inventor and as a consultant to the company, the Air Force, and other stakeholders in microelectronics. From that perch, Kilby worked furiously to ensure that the U.S. military could still steer the domestic semiconductor industry even as its share of the market cratered relative to civilian markets for business computers and even consumer goods such as digital watches and the Kilby-designed Texas Instruments calculator.

Kilby’s inventions from the early ‘70s were for the most part quite conventional, except perhaps for an electronic teaching machine that he shopped to Texas Instruments and a number of other firms. Kilby’s interest in teaching machines is notable because similar devices were at the center of a cluster of countercultural intellectuals and engineers in the San Francisco Bay Area who are often credited as inventors of the mouse, the graphical user interface, and even the personal computer. Kilby’s teaching machines certainly didn’t lead to the PC, though I think it arguable that they did inspire Texas Instruments’ most memorable civilian product, the Speak & Spell.

Kilby’s interest in pedagogical innovations extended into undergraduate education as well. In particular, he and a former Texas Instruments colleague, Jay Lathrop, and Skip Porter, an electrical engineer at Texas A&M University, spent the early ‘70s developing ways for undergraduates to become familiar with industrial integrated circuit manufacturing techniques, which are common sense today but were fairly unorthodox thinking at the time. In late 1973, however, Kilby, Lathrop, and Porter diverted their attention to solar energy in response to the oil embargo by members of the Organization of the Petroleum Exporting Countries. And, for once, the scheme they came up with was attractive enough to Texas Instruments that it created “Project Illinois” (after Kilby’s alma mater) with the idea of transforming the company from a semiconductor firm into the predominant player in the relatively new solar energy industry.

Two things are striking about Kilby’s and Texas Instruments’ rush into solar power. First is the extent to which they borrowed from military-industrial resources in developing a system to supply electricity and hot water to suburban mansions: from elements of the core technology (such as a fuel cell), to the use of scenario planning (adapted from nuclear wargaming), to intellectual property terms on government contracts, to networks of contacts among personnel continually revolving between national security agencies and firms. And second is the extent to which Kilby and Texas Instruments disregarded and denigrated the technical abilities of pre-existing solar energy advocates, especially those with few links to the military-industrial complex. Kilby was as committed to solar power as any hippie, yet he and his colleagues evidently believed that only veterans of the military-industrial complex could be trusted to develop that technology. Indeed, rather than work with organizations such as the Solar Energy Research Institute (headed by Denis Hayes, organizer of the first Earth Day), Kilby tried to convince the Air Force to adopt the Illinois system as a backup power supply for its MX missile silos!

In many ways, the attitude of Kilby and Texas Instruments was reasonable. Military-industrial veterans knew a lot about R&D. The templates they had developed had already worked in transitioning Kilby’s most important invention, the integrated circuit, from military to civilian markets. Yet in forgoing alliances with countercultural solar power advocates, Kilby and Texas Instruments left themselves vulnerable to the military-industrial complex’s superficial commitment to alternative energy. Thus, when the price of oil dropped and the Reagan administration pulled federal dollars out of solar R&D, Texas Instruments was left with little choice but to abandon Project Illinois, leaving Kilby with little choice but to angrily abandon TI. As he put it in a letter to a friend, “this is not a very good time to be peddling a solar project. We need another middle east crisis, I guess.”

Kilby wasn’t exactly representative of square American scientists in the ‘70s, but he wasn’t such an outlier either. There were many scientists and engineers who, like Kilby, weren’t enamored of the counterculture or the reforms suggested by science’s liberal establishment but who also didn’t follow the neocon route of Edward Teller and Fred Seitz. That indicates to me that our understanding of responsible innovation, whether in the 1970s or today, contains a gaping excluded middle that could, in fact, be populous, active, and productive. The pitfall of the ‘70s, though, was that square scientists made themselves into that excluded middle because they themselves excluded the possibility of collaboration with the counterculture and the New Left.

Today’s high-tech excluded middle is configured differently, but it’s still there. Consider the gap between teachers and education “reformers,” or between old-line transportation activists and partisans of the “Uber economy.” In both cases, “reformers” claim to bring Silicon Valley-style “disruption” without building common ground with those being “disrupted,” with the unsurprising result that the disruptors’ plans are themselves disrupted.

Seated in the middle, Texas Instruments engineer and co-inventor of the integrated circuit, Jack Kilby (early 1960s). Credit: James R. Baird, via Wikimedia Commons

Netherlands, Inc.: Signetics as a bellwether of globalization

In 1970, the specters haunting research reform were population growth, environmental degradation, war and nuclear annihilation, bankrupt and riot-torn cities, and dependence on fossil fuels. By 1975, the anemic U.S. economy and surging competition from Japanese firms were crowding every other motivation for research reform out of public discourse. The 1970s saw Japanese firms near or overtake their U.S. counterparts in multiple industries: steel, shipbuilding, consumer electronics, and semiconductor manufacturing. State sponsorship of Japanese semiconductor firms, in particular, set off a moral panic among U.S. elites. Yet the two major U.S. semiconductor firms bought by foreign conglomerates in the ’70s were both bought by European, not Japanese, companies: Philips NV bought Signetics in 1973 and Schlumberger purchased Fairchild Camera and Instrument in 1978. These were important U.S. firms. Almost every Silicon Valley semiconductor firm ever founded is one of the so-called “Fairchildren.” And while Texas Instruments and Fairchild claimed to have invented the integrated circuit, the product was considered an immature technology until Signetics became the first firm to bet its entire product line on integrated circuits.

Both Signetics and Fairchild also pioneered the offshoring of semiconductor manufacturing to sites with cheap, pliant labor. For Signetics, that meant plants in countries ruled by friendly juntas, primarily in East Asia (South Korea, the Philippines, Thailand) but also Portugal, as well as in remote parts of the United States, such as New Mexico and Utah, and in Scotland in the United Kingdom. The transition to the “post-industrial society” outlined by Harvard University sociologist Daniel Bell didn’t start in the semiconductor industry, but by the late ’60s Fairchild, Signetics, and their peers were leading the way.7

Indeed, Silicon Valley firms had always been far more successful at fending off labor unions at home and abroad than any automaker or coal mine, in part by simply shipping jobs away (or threatening to do so) any time unions tried to organize their plants. So it is either ironic or fitting that Signetics was itself bought by Philips, a firm seeking to climb the “league tables” of semiconductor manufacturing through acquisitions of plants in a country (the United States) with a cheaper and less organized labor force.

Signetics was relatively immune to the types of socially relevant R&D that swept through the sites of my other vignettes. Its managers didn’t perceive any budgetary or cultural pressures to branch into biomedical, disability, or environmental technologies. And unlike Texas Instruments, it didn’t have the wherewithal (or a champion like Kilby) to leap from semiconductor manufacturing into novel markets in solar power and pedagogical computing. But Signetics’ refusal to be drawn into proto-responsible innovation also meant that it didn’t have to abandon anything when socially relevant R&D faded after the mid-’70s. The sites of my other vignettes all emerged from the ’70s as success stories, but only because they ditched many of their most innovative (and “responsible”) initiatives of that decade. Signetics didn’t have to do that.

Yet Signetics did change with the times. At the beginning of the ’70s, it was the kind of company where the employee newsletter published the swimsuit photos of women employees (“girls”) who entered the annual beauty contest, and where the winning “beauties” were white and blond. At the beginning of the decade, it was the kind of company where the president paternalistically called on employees to “try asking yourself the question, ‘Am I the bottleneck in my area?’” and to “give a damn” by voting–not for the forces of “disorder, riot, and disrespect of the democratic order,” but for Richard Nixon! Most tellingly, at the beginning of the ’70s, Signetics was the kind of company where management took great pride that its circuits controlled the United States’ nuclear arsenal–even as ordinary employees welcomed the “Age of Awareness” of “war, racism, poverty” that the youth culture had awakened.

And that “awareness” shone through, gradually. By 1973, for instance, the firm’s “girls” had become “women,” and some even attained promotions into middle management. At the same time, changes in the legislative and economic environment forced policy shifts that awareness alone could not–a conservation program in the wake of the OPEC embargo, greater insistence on safety practices in the wake of the founding of the federal Occupational Safety and Health Administration, programs to hire more workers with handicaps in response to equal employment opportunity laws, and more visible charity drives to reach out to local communities and influence local politics. The three articles featured on the cover of a 1984 newsletter encapsulate what happened to Signetics in the ’70s: a profile of a new deaf coworker; a report on employees’ contributions to a local blood drive; and news that Signetics and two neighboring firms (TRW, Inc. and Advanced Micro Devices, Inc.) were being held responsible by California’s Regional Water Quality Control Board for a giant underground plume of toxic solvents formed over the previous 20 years by leaks from poorly maintained storage tanks.

Signetics offers, therefore, a rather mixed message about the passage of U.S. physical and engineering scientists through change and through storm. Like its peers, Signetics civilianized–military markets vanished in the mid-’80s, leading to the closure of plants and contributing to the firm’s final disappearance into the Philips conglomerate in the early ’90s. Some moves toward responsible innovation took hold in the early ’70s, particularly conservation and more diverse hiring, but by the ’80s the idealism of “awareness” had morphed into expressions of corporate “concern” justified by cost-benefit analysis. Unionization was still strongly discouraged, in part by the practice of locating plants in friendly dictatorships, but by the ’80s most of those U.S.-backed dictatorships were starting to transition to democracy. Domestically, unions were kept out through generous benefits and cultivation of a “California ideology” of unfettered individual growth and expression–heavily drenched, though, with the buzzwords and vacuity of the “management philosophy” fads of the 1980s. In other words, we have all become Signetics, even as Signetics became Philips Semiconductors.

The Signetics 2513 was a character generator chip used in the Apple I computer, seen here on display at the Smithsonian. Credit: Ed Uthman, via Wikimedia Commons

Conclusion

The five vignettes I’ve presented aren’t entirely independent of each other, though I have yet to document any substantial overlap among them either. Rather, actors at these five places all independently saw and commented on the same suite of changes in American science and society going on around them. The fact that those actors responded to and participated in social change in similar ways despite the lack of direct personal connections among them tells us a lot about what kinds of diffuse factors can drive sweeping (if mostly temporary) research reform.

What I see in looking at these sites is a U.S. research enterprise and system of innovation governance that was capable of rapid, dramatic change, but also characterized by considerable inertia and complexity. The kinds of research that today’s proponents of responsible innovation encourage proliferated in the early ’70s: biomedical devices, environmental monitoring and remediation, disability technologies, alternative energy, and mass transit. The methods promoted by today’s responsible innovation advocates bloomed in the ’70s, too, especially a radical interdisciplinarity in which the natural, social, and engineering sciences would find common ground with the humanities. Yet the vogue for wild-type responsible innovation was brief, maybe six or seven years. The habits for research laid down in the early Cold War never broke entirely, and by the mid-’70s were ascendant once again: national security funding, a more limited notion of interdisciplinarity, and fewer and quieter calls for reform.

It should be clear, then, that the complexity and inertia of the U.S. system of innovation governance can hinder the aims of responsible innovation. I want to conclude with the argument, however, that complexity and inertia could be coopted as tools for responsible innovation, in at least four different ways–and in doing so foster an innovation environment that helps power more sustained and equitable economic growth.

First, inertia arises in part from justifiable skepticism about reform. Some of the trendy research topics of the early ’70s, such as parapsychology or zero population growth, were entirely worthy of skepticism. Sometimes, of course, self-proclaimed skeptics are nothing of the sort–as demonstrated today by well-organized cadres of climate change denialists. But under some conditions, skeptics can play an important quality control function.

Second, inertia isn’t the same as stasis. As we saw in several of the vignettes, many “square scientists” were open to moving in new directions, and were willing to adapt the considerable resources, habits, and knowledge of the military-industrial complex to civilian-oriented projects. In some cases, such as the development of consumer markets for integrated circuits, the inertia built up for military-industrial R&D imparted a substantial impetus to related civilian technologies. The trick is to find, create, and appeal to commonalities in the imagined futures of squares and reformers, while discouraging the view that either squares or reformers have all of the answers.

Third, the inertia that can hinder reform can sometimes transfer to reform itself. Many of the personnel and organizations who experimented with proto-responsible innovation in the early ’70s retreated from it by the middle of that decade–only to return to some of the same topics and methods later. The end of the Cold War and the massive shift in R&D funding from the physical to the life sciences in the 1990s marked, in particular, a moment when many of the experiments of the ’70s were resurrected. Even seemingly failed experiments involving responsible innovation can have very long and influential afterlives.

And finally, the inertia of U.S. innovation governance arises in part from its complexity. American science is steered, funded, and carried out by an astonishing variety of kinds of organizations, with only the loosest of centralized direction. Democratic desires for reform therefore take a long time to filter through the system, and often get washed out before full implementation. But that also means that almost any topic can find some niche in the complex ecology of American research. Even parapsychologists can get their work funded! And while that can lead to some dysfunctions, it also encourages scientists and engineers to make their projects as flexible and multivalent as possible so that they can morph to appeal to any number of stakeholders. Flexibility is vital because, as we’ve seen, different domains of reform in innovation governance are often entangled.

While the stars rarely align such that experiments prosper in every domain at once, the linkages among domains mean that reforms in one often contribute to reform in many. A new course in responsible innovation, a new product developed under its auspices, a newly inspired means for researchers to engage with the public–all of these are worth trying, because each on its own can find some constituency and all are likely to advance the prospects of the others. Since different individuals generate their most creative experiments in different domains, and we can’t know ahead of time which kinds of experiments in innovation governance will be most enduring or significant, the best strategy is to encourage experimentation in many domains and to stimulate cross-linkages between those experiments. The strategy that should be avoided is one in which one kind of reform–say, encouraging professors to found startup companies–is incentivized at the expense of others. In the end, such a narrow strategy is self-defeating.

The main lesson of the 1970s is that a move toward a more responsible innovation system in the United States, one that in turn provides the impetus for more equitable economic growth, is possible. Indeed, it’s happened before. Then as now, changes in innovation governance may be driven by sustained grassroots activism or by structural reform, but more likely by a combination of the two. Such changes in innovation governance will, of course, be met with skepticism and even hostility. In the 1970s, polarization between skeptics and activists hindered even those innovation reforms that had a broad base of support. The trick is to identify elements of envisioned futures that are shared by reformers and skeptics alike, and to allocate resources for moving toward those shared elements in a non-zero-sum manner.

About the author

Cyrus Mody is Professor and Chair in the History of Science, Technology, and Innovation in the Faculty of Arts and Social Sciences at Maastricht University in the Netherlands. He is the author of “Instrumental Community: Probe Microscopy and the Path to Nanotechnology” (2011) and the forthcoming “The Long Arm of Moore’s Law: Microelectronics and American Science,” both from MIT Press.

Acknowledgements

Research for this report has been supported by the National Science Foundation through the Center for Nanotechnology in Society at the University of California, Santa Barbara and through the National Nanotechnology Infrastructure Network. The views expressed are the author’s and do not reflect those of the NSF.

End Notes

1. David Kaiser, “Cold War Requisitions, Scientific Manpower, and the Production of American Physicists after World War II,” Historical Studies in the Physical Sciences 33 (2002): 131–59.

2. Stuart W. Leslie, “‘Time of Troubles’ for the Special Laboratories,” in Becoming MIT: Moments of Decision, ed. David Kaiser (Cambridge, MA: MIT Press, 2010), 123–44; Matthew Wisnioski, “Inside ‘the System’: Engineers, Scientists, and the Boundaries of Social Protest in the Long 1960s,” History and Technology 19 (2003): 313–33; and Kelly Moore, Disrupting Science: Social Movements, American Scientists, and the Politics of the Military, 1945–1975 (Princeton, NJ: Princeton University Press, 2008).

3. David H. Guston et al., “Responsible Innovation: Motivations for a New Journal,” Journal of Responsible Innovation 1.1 (2015): 1–8.

4. Cyrus C.M. Mody, “Santa Barbara, Physics, and the Long 1970s,” Physics Today 66.9 (September 2013): 31–37; Cyrus C.M. Mody, “Burnt by the Sun: Jack Kilby, TI, and the ‘70s Solar Boom,” IEEE Spectrum (submitted); Cyrus C.M. Mody, “‘An Electro-Historical Focus with Real Interdisciplinary Appeal’: Interdisciplinarity at Vietnam-Era Stanford,” in Investigating Interdisciplinary Research: Theory and Practice across Disciplines, ed. Scott Frickel, Barbara Prainsack, and Mathieu Albert (New Brunswick: Rutgers University Press, under review).

5. Cyrus C.M. Mody and Andrew J. Nelson, “‘A Towering Virtue of Necessity’: Computer Music at Vietnam-Era Stanford,” Osiris 28 (Music in the Laboratory) (2013): 254–277.

6. E.g., David Kaiser, How the Hippies Saved Physics: Science, Counterculture, and the Quantum Revival (New York: Norton, 2012); Fred Turner, From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism (Chicago: University of Chicago Press, 2006); Eric J. Vettel, Biotech: The Countercultural Origins of an Industry (Philadelphia: University of Pennsylvania Press, 2006).

7. Daniel Bell, The Coming of Post-Industrial Society (New York: Harper Colophon Books, 1974).