Would the creation of the Internet be allowed to happen today?
The networked society we live in is in large part a gift from the University of California to the world. In the 1980s, computer scientists at Berkeley working under contract for the Defense Department created an improved version of the Unix operating system, complete with an implementation of the TCP/IP networking protocols. Available for a nominal fee, the operating system and networking code grew popular with universities and became the standard for the military's Arpanet computer network. In 1992, Berkeley released its version of Unix and TCP/IP to the public as open-source code, and the combination quickly became the backbone of a network so vast that people started to call it, simply, "the Internet."
Many would regard giving the Internet to the world as a benevolent act fitting for one of the world's great public universities. But Bill Hoskins, who is currently in charge of protecting the intellectual property produced at U.C. Berkeley, thinks it must have been a mistake. "Whoever released the code for the Internet probably didn't understand what they were doing," he says.
Had his predecessors understood how huge the Internet would turn out to be, Hoskins figures, they would surely have licensed the protocols, sold the rights to a corporation and collected a royalty for the U.C. Regents on Internet usage years into the future. It is the kind of deal his department, the Office of Technology Licensing, cuts all the time.
Hoskins' "privatize it" attitude has become the norm among administrators at many universities and federal labs across the country. As a result, computer-science professors and researchers who want to release their work to the public as open-source software often face an uphill battle.
Some of those familiar with the situation say the problem is that universities and federal research labs have become more interested in making money than in serving the public interest.
Larry Smarr, a professor of computer science at U.C. San Diego and one of the country's top experts on supercomputing, is one of them. As former director of the National Center for Supercomputing Applications at the University of Illinois, where the original Mosaic Web browser was created, he's quite familiar with both sides of the debate.
"Some universities are dead set against giving [software code] away," says Smarr. "But I don't think universities should be in the moneymaking business. They ought to be in the changing-the-world business, and open source is a great vehicle for changing the world."
"Open source" describes program code that is made publicly available for anyone to copy, change or even sell. The best-known open-source programs, such as Linux and Apache, are the product of a collaborative development process that draws on the contributions of thousands of programmers all over the world. It's not only a cheap way to produce software; with so many eyes looking at the code, the theory goes, bugs are found and fixed more quickly than in proprietary software.
Over the past several years, open-source software development has won high-profile adherents in the business world -- including the likes of IBM and Sun Microsystems. But it has always had its strongest fans in the academic world, where open-source software is seen as a natural extension of the idea that the fruits of academic research should be shared with everyone.
But now some academic programmers on the cutting edge have found that the licensing office is proving a more formidable obstacle to progress than the limits of their imagination and skill.
Pete Beckman, formerly a senior computer scientist at the federal laboratory in Los Alamos, N.M., is a pioneer in creating clusters of servers that rival the power of mainframe supercomputers. He had to fight with lab lawyers for months before receiving permission to open-source his department's work on the clusters.
Part of the lab's reluctance stemmed from concern about letting computer technology fall into the hands of America's enemies, according to Beckman. "But the lab's other motive for keeping technology private is the misguided belief they can license it and make money on the lab system," he says. "They have whole departments dedicated to extracting intellectual property from the labbies."
Before Beckman led the fight at Los Alamos to establish a protocol for making lab software public, "the only way to get your code released [open source] was to declare it worthless," he says. Beckman won his fight back in 1999, but the old standard still applies at other federal labs.
"Some federal labs can release code, others can't," Beckman says. "There are whole departments that create valuable new technology, and they can't get it out to the world because [the lab] is trying to make money off it." Software for modeling global climate change, the behavior of viral epidemics and traffic patterns are among the programs researchers can't get released, he says.
In a white paper Beckman authored on the problem, he wrote, "Seeking to control computer-science research by putting intellectual property concerns before the goal of good science has destroyed countless projects."
Just how many is hard to say. Most researchers are reluctant to criticize their administrators. It is rare that universities flat out refuse a request to release software, but the hassle of getting permission can discourage those who might otherwise release their work. "It's tricky to find examples," says Rebecca Eisenberg, a law professor at the University of Michigan who specializes in intellectual property policy. "Because most technology fails, it's hard to say something would have succeeded" if only it had been put in the public domain.
Nevertheless, Eisenberg is convinced that university interest in licensing intellectual property for profit is often at odds with the advancement of science. "You can make a clear case that research is being slowed by intellectual property claims," she says.
"Universities aren't distinguishing between times when it's important to have a patent in place to get something disseminated and times when it's not," Eisenberg says. "They're just looking to see if they can make money. It retards innovation and taxes development."
It took Chris Johnson, a computer-science professor at the University of Utah, several years of negotiation with his technology transfer office to get permission to make public a program his team had worked on for years.
Called SCIRun (pronounced "ski run"), the program is a software platform for modeling and solving all sorts of complicated scientific problems. One of its most promising applications is as a tool for designing new medical devices. Because it is a foundation upon which other programs can be built, Johnson felt that making it an open-source-code project was fundamental to its value.
"The hope is people will take this and put in their own applications and share those back with the community," Johnson says. But to do that, they have to be able to see and use the code without having to pay for it or get permission. "A lot of smart people out there can show you new and better ways for you, if they can see under the hood," Johnson says.
But when he tried to explain to the university administration that the best way to maximize the value of SCIRun was to give it away, he ran into a roadblock. "We wanted to open-source it," Johnson says. "But they said that would undermine its commercial value."
The negotiations began, a clash of differing cultures and interests. "No one really knew what we were doing at the beginning," Johnson says. "We didn't really understand intellectual property law, and they didn't really understand open source. The university just didn't want to let commercial value go. We're academics who wanted to push the envelope."
After two years of haggling, they reached a compromise. In March, the software was released under a license that allows academics free access to the code but reserves the right to royalties if the code makes its way into a commercial software product.
It hasn't always been this way. In the eighties, U.C. Berkeley was a pioneer in giving away software for the betterment of society. The rapid dissemination of "BSD Unix" allowed Internet-connected computers to speak the same language, helping to make our networked world possible.
But now the University of California is often mentioned as one of the institutions that have taken the craze for exclusive patents and licenses too far. "It changed in the late eighties and early nineties," says Susan Graham, a professor of computer science at Berkeley. She doesn't remember there even being an Office of Technology Licensing back when the department gave away Unix and the Internet protocols.
If those innovations were developed today, Graham worries they would end up in corporate hands. "I don't know whether they would let us release software like TCP/IP today," she says. "If they thought it had monetary value, they would want a revenue stream. There would be companies who could pay for it. I'm not sure we would have the same outcome [as in the past], and that's what concerns me."
The trend at universities toward trying to profit from intellectual property began with the passage of the Bayh-Dole Act in 1980. Bayh-Dole allows institutions doing research for the federal government -- mostly universities -- to own the intellectual property they produce, and sell the rights to private companies. Because most cutting-edge research at both public and private universities involves some federal funding, Bayh-Dole allows universities to lay claim to many of their faculty's inventions. The same rights were later extended to the federal research labs.
The philosophy behind Bayh-Dole is economic stimulation through privatization. When the law passed, the federal government held roughly 28,000 patents, but fewer than 5 percent of these were licensed to industry for development of commercial products, according to the Council on Government Relations, a lobbying group for research universities. By giving contractors a chance to sell the rights to technology developed in the course of publicly funded research, Congress hoped to spark an economic boom with taxpayer-funded technology.
Overall, the model has been a dramatic success. The transfer of technology from university labs into offices, factories and stores was fundamental to the growth of Silicon Valley and the success of the new economy. Since 1980, university inventions licensed to the private sector under Bayh-Dole have spawned over 2,200 new companies that generate about $30 billion in economic activity every year, according to the Association of University Technology Managers.
Statistics like these explain the enduring enthusiasm among most policy experts for privatizing the public's intellectual property. But a few eloquent dissenters have begun to argue that taking privatization of the nation's intellectual property too far could stifle innovation and suffocate economic growth.
The champion of this broad thesis is Stanford law professor Larry Lessig, who has just outlined this argument in a new book, "The Future of Ideas." Lessig worries that the proper balance between private intellectual property (Microsoft) and the public good (the Internet) has been lost, and our society is blindly moving toward too much private control over intellectual property. "The shift is not occurring with the idea of balance in mind," he wrote; "instead, the shift proceeds as if control were the only value."
The most powerful evidence that privatizing technology does not always equal progress comes from public code like the Internet protocols and open-source software. These are technologies that derive their value from being public and free; fences kill them. "The open-source movement is an endorsement of the value of the public domain," Eisenberg says. "It's a striking counter-example to the bias of public policy: that the public domain dooms technology to obscurity."
The systemic bias toward privatization, which Bayh-Dole codified into law, has the scientists working on improved versions of the Internet worried.
"For the last 20 years, public money has backed proprietary systems software," says Rick Stevens, who is working on "grid computing" software at Argonne National Lab. "We're saying, stop putting public money there."
Ian Foster, another computer scientist working at Argonne, agrees. "I believe that in almost all cases, the interests of science and society alike are best served by free distribution of software produced in research labs and universities. Unfortunately, there are still institutions that place significant obstacles in the way of researchers who wish to follow this path. Agencies funding research could help things by making strong statements in favor of open source, so that this is the norm rather than the exception."
Some government agencies are starting to get the message. Open-source development for grid software and other supercomputing applications is getting some government funding. The Department of Energy, which runs Argonne, has been supporting open-source projects for years. In April, the National Security Agency announced it would help to make a version of the Linux operating system secure enough for the Defense Department to use.
Universities are starting to rediscover the value of open-sourcing software, too. Stanford, the institution at the hub of Silicon Valley, lets its faculty release software under a public license. "We pretty much go with what our faculty members want to do," says Kathy Ku, who heads the licensing office there. "We care about the academic mission more than the money."
Elsewhere, the struggle goes on. "It's trying to find a balance between the academic mission and commercialization," Johnson, the Utah professor, says. "This is a hot topic in universities right now, and everyone is really struggling with it. Some universities have really gone overboard. It's not going to be an easy thing to resolve."