The office of Meir “Manny” Lehman is a cozy one. Located on the outer edge of the Imperial College of Science, Technology and Medicine campus in South Kensington, London, it offers room for all the basic amenities: a desk, two chairs, a Macintosh G4 and a telephone. Still, for a computer scientist nearing the end of a circuitous 50-year career, the coziness can be a bit confining.
“You’ll have to forgive me,” apologizes Lehman at one point, sifting through a pile of research papers on a nearby shelf. “Since I lost my secretary, I can’t seem to find anything.”
The pile, a collection of recently published papers on software evolution, a field Lehman helped inaugurate back in the 1970s, is something of a taunting tribute. Written by professional colleagues at other universities, each paper cites Lehman’s original 1969 IBM report documenting the evolutionary characteristics of the mainframe operating system OS/360, or his later 1985 book “Program Evolution: Processes of Software Change,” which expands the study to other programs. While the pile’s growing size offers proof that Lehman and his ideas are finally catching on, it also documents the growing number of researchers with whom Lehman, a man with dwindling office space and even less in the way of support, must now compete.
“And to think,” says Lehman, letting out a dry laugh. “When I first wrote about this topic, nobody took a blind bit of notice.”
Software evolution, i.e. the process by which programs change shape, adapt to the marketplace and inherit characteristics from preexisting programs, has become a subject of serious academic study in recent years. Partial thanks for this goes to Lehman and other pioneering researchers. Major thanks, however, goes to the increasing strategic value of software itself. As large-scale programs such as Windows and Solaris expand well into the range of 30 to 50 million lines of code, successful project managers have learned to devote as much time to combing the tangles out of legacy code as to adding new code. Simply put, in a decade that saw the average PC microchip performance increase a hundredfold, software’s inability to scale at even linear rates has gone from dirty little secret to industry-wide embarrassment.
“Software has not followed a curve like Moore’s Law,” says University of Michigan computer scientist John Holland, noting the struggles of most large-scale software programs during a 2000 conference on the future of technology. “In order to make progress here it is not simply a matter of brute force. It is a matter of getting some kind of relevant theory that tells us where to look.”
For Lehman, the place to look is within the software development process itself, a system Lehman views as feedback-driven and biased toward increasing complexity. Figure out how to control the various feedback loops — i.e. market demand, internal debugging and individual developer whim — and you can stave off crippling over-complexity for longer periods of time. What’s more, you might even get a sense of the underlying dynamics driving the system.
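That feedback-driven picture can be caricatured in a few lines of code. The model below is purely illustrative (my sketch, not anything from Lehman's papers): demand pulls new functionality into each release, every feature added feeds complexity back into the system, and accumulated complexity taxes the productivity of the next release.

```python
def simulate(releases, demand=50.0, effort=100.0):
    """Toy feedback model of software growth (illustrative only).

    Each release adds functionality in proportion to market demand,
    throttled by accumulated complexity; every feature added feeds
    complexity back into the system, slowing future releases.
    """
    size, complexity = 0.0, 1.0
    history = []
    for _ in range(releases):
        # Market feedback pulls features in, but each unit of
        # accumulated complexity taxes developer productivity.
        added = demand * effort / (effort + complexity)
        size += added
        # Internal feedback: new code raises complexity unless
        # specific anti-regressive work is spent to reduce it.
        complexity += 0.5 * added
        history.append((size, complexity, added))
    return history

history = simulate(10)
```

Run for ten releases, the per-release gains shrink steadily, which is the point: unless some of the effort is diverted into reducing complexity, the feedback loop alone is enough to produce the progressive slowdown Lehman observed.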
Lehman dates his first research on the topic of software evolution back to 1968. That was the year Lehman, then working as a researcher at IBM’s Yorktown Heights facility, received an assignment to investigate IBM’s internal software development process. Managers at rival Bell Labs had been crowing about per-developer productivity, and IBM managers, feeling competitive, wanted proof that IBM developers were generating just as many lines of code per man-year as their AT&T counterparts.
Lehman looked at the development of OS/360, IBM’s flagship operating system at the time. Although the performance audit showed that IBM researchers were churning out code at a steady rate, Lehman found the level of debugging activity per individual software module to be decreasing at an equal rate; in other words, programmers were spending less and less time fixing problems in the code. Unless IBM programmers had suddenly figured out a way to write error-free code — an unlikely assumption — Lehman made a dire prediction: OS/360 was heading over a cliff. IBM, in stressing growth over source-code maintenance, would soon be in need of a successor operating system.
Although IBM executives largely ignored the report, Lehman’s prediction was soon borne out. By 1971, developers had encountered complexity problems while attempting to install virtual memory into the operating system, problems that eventually forced the company to split the OS/360 code base into two more easily manageable offshoots. The linear growth curve that seemed so steady in the 1960s suddenly looked like the trail of a test missile spiraling earthward.
Lehman’s report would eventually earn a small measure of fame when University of North Carolina professor and former OS/360 project manager Frederick P. Brooks excoriated the IBM approach to software management in his 1975 book “The Mythical Man-Month.” Using Lehman’s observations as a foundation for his own “Brooks’s Law” tenet — “adding manpower to a late software project makes it later” — Brooks argued that all software programs are ultimately doomed to succumb to their own internal inertia.
“Less and less effort is spent on fixing original design flaws; more and more is spent on fixing flaws introduced by earlier fixes,” wrote Brooks. “As time passes, the system becomes less and less well-ordered. Sooner or later the fixing ceases to gain any ground. Each forward step is matched by a backward one. Although in principle usable forever, the system has worn out as a base for progress.”
By 1975, Lehman, with the help of fellow researcher Laszlo Belady, was well on the way to formulating his own set of laws. A quarter century after their creation, the laws read like a mixture of old developer wisdom and common textbook physics. Take, for example, Lehman’s “Second Law” of software evolution, a software reworking of the Second Law of Thermodynamics.
“The entropy of a system increases with time unless specific work is executed to maintain or reduce it.”
Such statements put Lehman, who would leave IBM to take a professorship at Imperial College, into uncharted waters as a computer scientist. Halfway between the formalists, old-line academics who saw all programs as mathematical proofs in disguise, and the realists, professional programmers who saw software as a form of intellectual duct tape, Lehman would spend the ’70s and ’80s arguing for a hybrid point of view: Software development can be predictable if researchers are willing to approach it at a systems level.
“As I like to say, software evolution is the fruit fly of artificial systems evolution,” Lehman says. “The things we learn here we can reapply to other studies: weapon systems evolution, growth of cities, that sort of thing.”
That Lehman conspicuously leaves out biological systems is just one reason why his profile has slipped over the last decade. At a time when lay authors and fellow researchers feel comfortable invoking the name of Charles Darwin when discussing software technology, Lehman holds back. “The gap between biological evolution and artificial systems evolution is just too enormous to expect to link the two,” he says.
Nevertheless, Lehman aspires to the same level of intellectual impact. While he was in retirement during the early 1990s, his early ideas jelled into one big idea: What if somebody were to formulate a central theory of software evolution akin to Darwin’s theory of natural selection? In 1993, Lehman took an emeritus position at Imperial College and began work on the FEAST Hypothesis. Short for Feedback, Evolution and Software Technology, FEAST fine-tunes the definition of evolvable software, distinguishing “S-type” programs — specification-based programs and algorithms built to handle an immutable task — from “E-type” programs, which are built to handle real-world tasks that themselves evolve. Focusing his theory on the larger realm of E-type programs, Lehman has since expanded his original three software laws to eight.
Included within the new set of laws are the Law of Continuing Growth (“The functional capability of E-type systems must be continually increased to maintain user satisfaction over the system lifetime”) and the Law of Declining Quality (“The quality of E-type systems will appear to be declining unless they are rigorously adapted, as required, to take into account changes in the operational environment”). For added measure, Lehman has also thrown in the Principle of Software Uncertainty, which states, “The real world outcome of any E-type software execution is inherently uncertain with the precise area of uncertainty also unknowable.”
While the new statements still read like glossed-over truisms, Lehman says the goal is to get the universal ideas on paper in the hopes that they might lead researchers to a deeper truth. After all, saying “objects fall down instead of up” was a truism until Sir Isaac Newton explained why.
“Whenever I talk, people start off with blank faces,” Lehman admits. “They say, ‘But you haven’t told us anything we didn’t already know.’ To that I say, there’s nothing to be ashamed of in coming up with the obvious, especially when nobody else is coming up with it.”
For extra ammo, Lehman also has expanded the graphs and data from his original studies in the 1970s. Taken together, they show most large software programs growing at an inverse square rate — think of your typical Moore’s Law growth curve rotated 180 degrees — before succumbing to over-complexity.
Whether the curves serve as anything more than a conversation-starter is still up for debate. Chris Landauer, a computer scientist at the Aerospace Corporation and a fellow guest speaker with Lehman at a February conference on software evolution at the University of Hertfordshire, was impressed by the Lehman pitch.
“He has real data from real projects, and they show real phenomena,” Landauer says. “I’ve seen other sets of numbers, but these guys have something that might actually work.”
At the same time, however, Landauer wonders if the explanation for similar growth trajectories across different systems isn’t “sociological.” In other words, do programmers, by nature, prefer to add new code rather than substitute or repair existing code? Landauer also worries about whether the use of any statistic in an environment as creative as software development leads to automatic red herrings. “I mean, how long does it take a person to come up with a good idea?” Landauer asks. “The answer is we just don’t know.”
Michael Godfrey, a University of Waterloo scientist, is equally hesitant but still finds the Lehman approach useful. In 2000, Godfrey and a fellow Waterloo researcher, Qiang Tu, released a study showing that several open-source software programs, including the Linux kernel and fetchmail, were growing at geometric rates, breaking the inverse-square barrier constraining most traditionally built programs. Although the discovery validated arguments within the software development community that large system development is best handled in an open-source manner, Godfrey says he is currently looking for ways to refine the quantitative approach to make it more meaningful.
“It’s as if you’re trying to talk about the architecture of a building by talking about the number of screws and two-by-fours used to build it,” he says. “We don’t have any idea of what measurement means in terms of software.”
Godfrey cites the work of another Waterloo colleague, Rick Holt, as promising. Holt has come up with a browser tool for studying the degree of variation and relationship between separate offshoots of the original body of source code. Dubbed Beagle, the tool is named after the ship upon which Charles Darwin served as a naturalist from 1831 to 1836.
Like Landauer, Godfrey expresses concern that a full theory of software evolution might be too “fuzzy” for most engineering-minded programmers. Still, he credits Lehman for opening the software field to newer, more intriguing lines of inquiry. “It’s the gestalt ‘Aha’ of his work that I find more interesting than the numbers,” Godfrey says.
For Lehman, the lack of a scientific foundation to the software-engineering field is all the more reason to keep digging. Fellow researchers can quibble over the value of judging software in terms of total lines of code, but until they come up with better metrics or better theories to explain the data, software engineering will always be one down in the funding and credibility department. A former department head, Lehman recalls the budgetary battles and still chafes over the slights incurred. Now, as he sits in a cramped office, trying to recruit new corporate benefactors and a new research staff, he must deal once again with those who label software development a modern day form of alchemy — i.e. all experiment but no predictable result.
“In software engineering there is no theory,” says Lehman, echoing Holland. “It’s all arm flapping and intuition. I believe that a theory of software evolution could eventually translate into a theory of software engineering. Either that or it will come very close. It will lay the foundation for a wider theory of software evolution.”
When that day comes, Lehman says, software engineers will finally be able to muscle aside their civil, mechanical and electrical engineering counterparts and take a place at the grown-ups’ table. As for getting bigger offices, well, he sees that as a function of showing the large-scale corporations that fund university research how to better control software feedback cycles so their programs stay healthier longer. Until then, the search for a theory has rendered Lehman less of a Darwin and more of an Ahab — a man in search of both fulfillment and a little revenge.
Sam Williams is a freelance reporter who covers software and software-development culture. He is also the author of "Free as in Freedom: Richard Stallman's Crusade for Free Software."