Category Archives: Reference

Filing cabinet for a digitized world.

CREATION MYTH

It would be nice to work in an environment like Xerox PARC, with total freedom to pursue your research and build with no budget limitations. Unfortunately, in the ASIC world we are always squeezed by schedule and resource constraints and don’t get much room to innovate. I share the same feeling as the inventor of the laser printer: management is often short-sighted, so I have to develop much of my work behind the curtain. I can only unveil it when there is a working prototype with a clear benefit over the previous work.

Gladwell, Malcolm. “Creation Myth.” The New Yorker 87.13 (May 16, 2011).

In late 1979, a twenty-four-year-old entrepreneur paid a visit to a research center in Silicon Valley called Xerox PARC. He was the co-founder of a small computer startup down the road, in Cupertino. His name was Steve Jobs.

Xerox PARC was the innovation arm of the Xerox Corporation. It was, and remains, on Coyote Hill Road, in Palo Alto, nestled in the foothills on the edge of town, in a long, low concrete building, with enormous terraces looking out over the jewels of Silicon Valley. To the northwest was Stanford University’s Hoover Tower. To the north was Hewlett-Packard’s sprawling campus. All around were scores of other chip designers, software firms, venture capitalists, and hardware-makers. A visitor to PARC, taking in that view, could easily imagine that it was the computer world’s castle, lording over the valley below–and, at the time, this wasn’t far from the truth. In 1970, Xerox had assembled the world’s greatest computer engineers and programmers, and for the next ten years they had an unparalleled run of innovation and invention. If you were obsessed with the future in the seventies, you were obsessed with Xerox PARC–which was why the young Steve Jobs had driven to Coyote Hill Road.

Apple was already one of the hottest tech firms in the country. Everyone in the Valley wanted a piece of it. So Jobs proposed a deal: he would allow Xerox to buy a hundred thousand shares of his company for a million dollars–its highly anticipated I.P.O. was just a year away–if PARC would “open its kimono.” A lot of haggling ensued. Jobs was the fox, after all, and PARC was the henhouse. What would he be allowed to see? What wouldn’t he be allowed to see? Some at PARC thought that the whole idea was lunacy, but, in the end, Xerox went ahead with it. One PARC scientist recalls Jobs as “rambunctious”–a fresh-cheeked, caffeinated version of today’s austere digital emperor. He was given a couple of tours, and he ended up standing in front of a Xerox Alto, PARC’s prized personal computer.

An engineer named Larry Tesler conducted the demonstration. He moved the cursor across the screen with the aid of a “mouse.” Directing a conventional computer, in those days, meant typing in a command on the keyboard. Tesler just clicked on one of the icons on the screen. He opened and closed “windows,” deftly moving from one task to another. He wrote on an elegant word-processing program, and exchanged e-mails with other people at PARC, on the world’s first Ethernet network. Jobs had come with one of his software engineers, Bill Atkinson, and Atkinson moved in as close as he could, his nose almost touching the screen. “Jobs was pacing around the room, acting up the whole time,” Tesler recalled. “He was very excited. Then, when he began seeing the things I could do onscreen, he watched for about a minute and started jumping around the room, shouting, ‘Why aren’t you doing anything with this? This is the greatest thing. This is revolutionary!’ ”

Xerox began selling a successor to the Alto in 1981. It was slow and underpowered–and Xerox ultimately withdrew from personal computers altogether. Jobs, meanwhile, raced back to Apple, and demanded that the team working on the company’s next generation of personal computers change course. He wanted menus on the screen. He wanted windows. He wanted a mouse. The result was the Macintosh, perhaps the most famous product in the history of Silicon Valley.

“If Xerox had known what it had and had taken advantage of its real opportunities,” Jobs said, years later, “it could have been as big as I.B.M. plus Microsoft plus Xerox combined–and the largest high-technology company in the world.”

This is the legend of Xerox PARC. Jobs is the Biblical Jacob and Xerox is Esau, squandering his birthright for a pittance. In the past thirty years, the legend has been vindicated by history. Xerox, once the darling of the American high-technology community, slipped from its former dominance. Apple is now ascendant, and the demonstration in that room in Palo Alto has come to symbolize the vision and ruthlessness that separate true innovators from also-rans. As with all legends, however, the truth is a bit more complicated. After Jobs returned from PARC, he met with a man named Dean Hovey, who was one of the founders of the industrial-design firm that would become known as IDEO. “Jobs went to Xerox PARC on a Wednesday or a Thursday, and I saw him on the Friday afternoon,” Hovey recalled. “I had a series of ideas that I wanted to bounce off him, and I barely got two words out of my mouth when he said, ‘No, no, no, you’ve got to do a mouse.’ I was, like, ‘What’s a mouse?’ I didn’t have a clue. So he explains it, and he says, ‘You know, [the Xerox mouse] is a mouse that cost three hundred dollars to build and it breaks within two weeks. Here’s your design spec: Our mouse needs to be manufacturable for less than fifteen bucks. It needs to not fail for a couple of years, and I want to be able to use it on Formica and my bluejeans.’ From that meeting, I went to Walgreens, which is still there, at the corner of Grant and El Camino in Mountain View, and I wandered around and bought all the underarm deodorants that I could find, because they had that ball in them. I bought a butter dish. That was the beginnings of the mouse.”

I spoke with Hovey in a ramshackle building in downtown Palo Alto, where his firm had started out. He had asked the current tenant if he could borrow his old office for the morning, just for the fun of telling the story of the Apple mouse in the place where it was invented. The room was the size of someone’s bedroom. It looked as if it had last been painted in the Coolidge Administration. Hovey, who is lean and healthy in a Northern California yoga-and-yogurt sort of way, sat uncomfortably at a rickety desk in a corner of the room. “Our first machine shop was literally out on the roof,” he said, pointing out the window to a little narrow strip of rooftop, covered in green outdoor carpeting. “We didn’t tell the planning commission. We went and got that clear corrugated stuff and put it across the top for a roof. We got out through the window.” He had brought a big plastic bag full of the artifacts of that moment: diagrams scribbled on lined paper, dozens of differently sized plastic mouse shells, a spool of guitar wire, a tiny set of wheels from a toy train set, and the metal lid from a jar of Ralph’s preserves. He turned the lid over. It was filled with a waxlike substance, the middle of which had a round indentation, in the shape of a small ball. “It’s epoxy casting resin,” he said. “You pour it, and then I put Vaseline on a smooth steel ball, and set it in the resin, and it hardens around it.” He tucked the steel ball underneath the lid and rolled it around the tabletop. “It’s a kind of mouse.” The hard part was that the roller ball needed to be connected to the housing of the mouse, so that it didn’t fall out, and so that it could transmit information about its movements to the cursor on the screen. But if the friction created by those connections was greater than the friction between the tabletop and the roller ball, the mouse would skip. And the more the mouse was used the more dust it would pick up off the tabletop, and the more it would skip. 
The Xerox PARC mouse was an elaborate affair, with an array of ball bearings supporting the roller ball. But there was too much friction on the top of the ball, and it couldn’t deal with dust and grime. At first, Hovey set to work with various arrangements of ball bearings, but nothing quite worked. “This was the ‘aha’ moment,” Hovey said, placing his fingers loosely around the sides of the ball, so that they barely touched its surface. “So the ball’s sitting here. And it rolls. I attribute that not to the table but to the oldness of the building. The floor’s not level. So I started playing with it, and that’s when I realized: I want it to roll. I don’t want it to be supported by all kinds of ball bearings. I want to just barely touch it.”

The trick was to connect the ball to the rest of the mouse at the two points where there was the least friction– right where his fingertips had been, dead center on either side of the ball. “If it’s right at midpoint, there’s no force causing it to rotate. So it rolls.”

Hovey estimated their consulting fee at thirty-five dollars an hour; the whole project cost perhaps a hundred thousand dollars. “I originally pitched Apple on doing this mostly for royalties, as opposed to a consulting job,” he recalled. “I said, ‘I’m thinking fifty cents apiece,’ because I was thinking that they’d sell fifty thousand, maybe a hundred thousand of them.” He burst out laughing, because of how far off his estimates ended up being. “Steve’s pretty savvy. He said no. Maybe if I’d asked for a nickel, I would have been fine.” Here is the first complicating fact about the Jobs visit. In the legend of Xerox PARC, Jobs stole the personal computer from Xerox. But the striking thing about Jobs’s instructions to Hovey is that he didn’t want to reproduce what he saw at PARC. “You know, there were disputes around the number of buttons–three buttons, two buttons, one-button mouse,” Hovey went on. “The mouse at Xerox had three buttons. But we came around to the fact that learning to mouse is a feat in and of itself, and to make it as simple as possible, with just one button, was pretty important.”

So was what Jobs took from Xerox the idea of the mouse? Not quite, because Xerox never owned the idea of the mouse. The PARC researchers got it from the computer scientist Douglas Engelbart, at Stanford Research Institute, fifteen minutes away on the other side of the university campus. Engelbart dreamed up the idea of moving the cursor around the screen with a stand-alone mechanical “animal” back in the mid-nineteen-sixties. His mouse was a bulky, rectangular affair, with what looked like steel roller-skate wheels. If you lined up Engelbart’s mouse, Xerox’s mouse, and Apple’s mouse, you would not see the serial reproduction of an object. You would see the evolution of a concept.

The same is true of the graphical user interface that so captured Jobs’s imagination. Xerox PARC’s innovation had been to replace the traditional computer command line with onscreen icons. But when you clicked on an icon you got a pop-up menu: this was the intermediary between the user’s intention and the computer’s response. Jobs’s software team took the graphical interface a giant step further. It emphasized “direct manipulation.” If you wanted to make a window bigger, you just pulled on its corner and made it bigger; if you wanted to move a window across the screen, you just grabbed it and moved it. The Apple designers also invented the menu bar, the pull-down menu, and the trash can–all features that radically simplified the original Xerox PARC idea.

The difference between direct and indirect manipulation–between three buttons and one button, three hundred dollars and fifteen dollars, and a roller ball supported by ball bearings and a free-rolling ball–is not trivial. It is the difference between something intended for experts, which is what Xerox PARC had in mind, and something that’s appropriate for a mass audience, which is what Apple had in mind. PARC was building a personal computer. Apple wanted to build a popular computer.

In a recent study, “The Culture of Military Innovation,” the military scholar Dima Adamsky makes a similar argument about the so-called Revolution in Military Affairs. R.M.A. refers to the way armies have transformed themselves with the tools of the digital age–such as precision-guided missiles, surveillance drones, and real-time command, control, and communications technologies–and Adamsky begins with the simple observation that it is impossible to determine who invented R.M.A. The first people to imagine how digital technology would transform warfare were a cadre of senior military intellectuals in the Soviet Union, during the nineteen-seventies. The first country to come up with these high-tech systems was the United States. And the first country to use them was Israel, in its 1982 clash with the Syrian Air Force in Lebanon’s Bekaa Valley, a battle commonly referred to as “the Bekaa Valley turkey shoot.” Israel coordinated all the major innovations of R.M.A. in a manner so devastating that it destroyed nineteen surface-to-air batteries and eighty-seven Syrian aircraft while losing only a handful of its own planes.

That’s three revolutions, not one, and Adamsky’s point is that each of these strands is necessarily distinct, drawing on separate skills and circumstances. The Soviets had a strong, centralized military bureaucracy, with a long tradition of theoretical analysis. It made sense that they were the first to understand the military implications of new information systems. But they didn’t do anything with it, because centralized military bureaucracies with strong intellectual traditions aren’t very good at connecting word and deed. The United States, by contrast, has a decentralized, bottom-up entrepreneurial culture, which has historically had a strong orientation toward technological solutions. The military’s close ties to the country’s high-tech community made it unsurprising that the U.S. would be the first to invent precision-guidance and next-generation command-and-control communications. But those assets also meant that Soviet-style systemic analysis wasn’t going to be a priority. As for the Israelis, their military culture grew out of a background of resource constraint and constant threat. In response, they became brilliantly improvisational and creative. But, as Adamsky points out, a military built around urgent, short-term “fire extinguishing” is not going to be distinguished by reflective theory. No one stole the revolution. Each party viewed the problem from a different perspective, and carved off a different piece of the puzzle.

In the history of the mouse, Engelbart was the Soviet Union. He was the visionary, who saw the mouse before anyone else did. But visionaries are limited by their visions. “Engelbart’s self-defined mission was not to produce a product, or even a prototype; it was an open-ended search for knowledge,” Michael Hiltzik writes, in “Dealers of Lightning” (1999), his wonderful history of Xerox PARC. “Consequently, no project in his lab ever seemed to come to an end.” Xerox PARC was the United States: it was a place where things got made. “Xerox created this perfect environment,” recalled Bob Metcalfe, who worked there through much of the nineteen-seventies, before leaving to found the networking company 3Com. “There wasn’t any hierarchy. We built out our own tools. When we needed to publish papers, we built a printer. When we needed to edit the papers, we built a computer. When we needed to connect computers, we figured out how to connect them. We had big budgets. Unlike many of our brethren, we didn’t have to teach. We could just research. It was heaven.” But heaven is not a good place to commercialize a product. “We built a computer and it was a beautiful thing,” Metcalfe went on. “We developed our computer language, our own display, our own language. It was a gold-plated product. But it cost sixteen thousand dollars, and it needed to cost three thousand dollars.” For an actual product, you need threat and constraint–and the improvisation and creativity necessary to turn a gold-plated three-hundred-dollar mouse into something that works on Formica and costs fifteen dollars. Apple was Israel. Xerox couldn’t have been I.B.M. and Microsoft combined, in other words. “You can be one of the most successful makers of enterprise technology products the world has ever known, but that doesn’t mean your instincts will carry over to the consumer market,” the tech writer Harry McCracken recently wrote.
“They’re really different, and few companies have ever been successful in both.” He was talking about the decision by the networking giant Cisco Systems, this spring, to shut down its Flip camera business, at a cost of many hundreds of millions of dollars. But he could just as easily have been talking about the Xerox of forty years ago, which was one of the most successful makers of enterprise technology the world has ever known. The fair question is whether Xerox, through its research arm in Palo Alto, found a better way to be Xerox–and the answer is that it did, although that story doesn’t get told nearly as often.

One of the people at Xerox PARC when Steve Jobs visited was an optical engineer named Gary Starkweather. He is a solid and irrepressibly cheerful man, with large, practical hands and the engineer’s gift of pretending that what is impossibly difficult is actually pretty easy, once you shave off a bit here, and remember some of your high-school calculus, and realize that the thing that you thought should go in left to right should actually go in right to left. Once, before the palatial Coyote Hill Road building was constructed, a group that Starkweather had to be connected to was moved to another building, across the Foothill Expressway, half a mile away. There was no way to run a cable under the highway. So Starkweather fired a laser through the air between the two buildings, an improvised communications system that meant that, if you were driving down the Foothill Expressway on a foggy night and happened to look up, you might see a mysterious red beam streaking across the sky. When a motorist drove into the median ditch, “we had to turn it down,” Starkweather recalled, with a mischievous smile.

Lasers were Starkweather’s specialty. He started at Xerox’s East Coast research facility in Webster, New York, outside Rochester. Xerox built machines that scanned a printed page of type using a photographic lens, and then printed a duplicate. Starkweather’s idea was to skip the first step–to run a document from a computer directly into a photocopier, by means of a laser, and turn the Xerox machine into a printer. It was a radical idea. The printer, since Gutenberg, had been limited to the function of re-creation: if you wanted to print a specific image or letter, you had to have a physical character or mark corresponding to that image or letter. What Starkweather wanted to do was take the array of bits and bytes, ones and zeros that constitute digital images, and transfer them straight into the guts of a copier. That meant, at least in theory, that he could print anything.

“One morning, I woke up and I thought, Why don’t we just print something out directly?” Starkweather said. “But when I flew that past my boss he thought it was the most brain-dead idea he had ever heard. He basically told me to find something else to do. The feeling was that lasers were too expensive. They didn’t work that well. Nobody wants to do this, computers aren’t powerful enough. And I guess, in my naivete, I kept thinking, He’s just not right–there’s something about this I really like. It got to be a frustrating situation. He and I came to loggerheads over the thing, about late 1969, early 1970. I was running my experiments in the back room behind a black curtain. I played with them when I could. He threatened to lay off my people if I didn’t stop. I was having to make a decision: do I abandon this, or do I try and go up the ladder with it?” Then Starkweather heard that Xerox was opening a research center in Palo Alto, three thousand miles away from its New York headquarters. He went to a senior vice-president of Xerox, threatening to leave for I.B.M. if he didn’t get a transfer. In January of 1971, his wish was granted, and, within ten months, he had a prototype up and running.

Starkweather is retired now, and lives in a gated community just north of Orlando, Florida. When we spoke, he was sitting at a picnic table, inside a screened-in porch in his back yard. Behind him, golfers whirred by in carts. He was wearing white chinos and a shiny black short-sleeved shirt, decorated with fluorescent images of vintage hot rods. He had brought out two large plastic bins filled with the artifacts of his research, and he spread the contents on the table: a metal octagonal disk, sketches on lab paper, a black plastic laser housing that served as the innards for one of his printers.

“There was still a tremendous amount of opposition from the Webster group, who saw no future in computer printing,” he went on. “They said, ‘I.B.M. is doing that. Why do we need to do that?’ and so forth. Also, there were two or three competing projects, which I guess I have the luxury of calling ridiculous. One group had fifty people and another had twenty. I had two.” Starkweather picked up a picture of one of his in-house competitors, something called an “optical carriage printer.” It was the size of one of those modular Italian kitchen units that you see advertised in fancy design magazines. “It was an unbelievable device,” he said, with a rueful chuckle. “It had a ten-inch drum, which turned at five thousand r.p.m., like a super washing machine. It had characters printed on its surface. I think they only ever sold ten of them. The problem was that it was spinning so fast that the drum would blow out and the characters would fly off. And there was only this one lady in Troy, New York, who knew how to put the characters on so that they would stay.

“So we finally decided to have what I called a fly-off. There was a full page of text–where some of them were non-serif characters, Helvetica, stuff like that–and then a page of graph paper with grid lines, and pages with pictures and some other complex stuff–and everybody had to print all six pages. Well, once we decided on those six pages, I knew I’d won, because I knew there wasn’t anything I couldn’t print. Are you kidding? If you can translate it into bits, I can print it. Some of these other machines had to go through hoops just to print a curve. A week after the fly-off, they folded those other projects. I was the only game in town.” The project turned into the Xerox 9700, the first high-speed, cut-paper laser printer in the world.

In one sense, the Starkweather story is of a piece with the Steve Jobs visit. It is an example of the imaginative poverty of Xerox management. Starkweather had to hide his laser behind a curtain. He had to fight for his transfer to PARC. He had to endure the indignity of the fly-off, and even then Xerox management remained skeptical. The founder of PARC, Jack Goldman, had to bring in a team from Rochester for a personal demonstration. After that, Starkweather and Goldman had an idea for getting the laser printer to market quickly: graft a laser onto a Xerox copier called the 7000. The 7000 was an older model, and Xerox had lots of 7000s sitting around that had just come off lease. Goldman even had a customer ready: the Lawrence Livermore laboratory was prepared to buy a whole slate of the machines. Xerox said no. Then Starkweather wanted to make what he called a photo-typesetter, which produced camera-ready copy right on your desk. Xerox said no. “I wanted to work on higher-performance scanners,” Starkweather continued. “In other words, what if we print something other than documents? For example, I made a high-resolution scanner and you could print on glass plates.” He rummaged in one of the boxes on the picnic table and came out with a sheet of glass, roughly six inches square, on which a photograph of a child’s face appeared. The same idea, he said, could have been used to make “masks” for the semiconductor industry–the densely patterned screens used to etch the designs on computer chips. “No one would ever follow through, because Xerox said, ‘Now you’re in Intel’s market, what are you doing that for?’ They just could not seem to see that they were in the information business. This”–he lifted up the plate with the little girl’s face on it–“is a copy. It’s just not a copy of an office document.” But he got nowhere. “Xerox had been infested by a bunch of spreadsheet experts who thought you could decide every product based on metrics. 
Unfortunately, creativity wasn’t on a metric.”

A few days after that afternoon in his back yard, however, Starkweather e-mailed an addendum to his discussion of his experiences at PARC. “Despite all the hassles and risks that happened in getting the laser printer going, in retrospect the journey was that much more exciting,” he wrote. “Often difficulties are just opportunities in disguise.” Perhaps he felt that he had painted too negative a picture of his time at Xerox, or suffered a pang of guilt about what it must have been like to be one of those Xerox executives on the other side of the table. The truth is that Starkweather was a difficult employee. It went hand in hand with what made him such an extraordinary innovator. When his boss told him to quit working on lasers, he continued in secret. He was disruptive and stubborn and independent-minded–and he had a thousand ideas, and sorting out the good ideas from the bad wasn’t always easy. Should Xerox have put out a special order of laser printers for Lawrence Livermore, based on the old 7000 copier? In “Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer” (1988)–a book dedicated to the idea that Xerox was run by the blind–Douglas Smith and Robert Alexander admit that the proposal was hopelessly impractical: “The scanty Livermore proposal could not justify the investment required to start a laser printing business. . . . How and where would Xerox manufacture the laser printers? Who would sell and service them? Who would buy them and why?” Starkweather, and his compatriots at Xerox PARC, weren’t the source of disciplined strategic insights. They were wild geysers of creative energy.

The psychologist Dean Simonton argues that this fecundity is often at the heart of what distinguishes the truly gifted. The difference between Bach and his forgotten peers isn’t necessarily that he had a better ratio of hits to misses. The difference is that the mediocre might have a dozen ideas, while Bach, in his lifetime, created more than a thousand full-fledged musical compositions. A genius is a genius, Simonton maintains, because he can put together such a staggering number of insights, ideas, theories, random observations, and unexpected connections that he almost inevitably ends up with something great. “Quality,” Simonton writes, is “a probabilistic function of quantity.”
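Simonton’s claim that quality is “a probabilistic function of quantity” can be made concrete with a toy Monte Carlo sketch (my own illustration, not from the article): give every creator an identical per-idea quality distribution and vary only the number of ideas, and the prolific creator’s best work reliably dominates.

```python
import random

# Toy model: every creator draws ideas from the SAME quality
# distribution (uniform on [0, 1)) -- only the volume differs.
rng = random.Random(0)

def best_idea(n_ideas):
    """Generate n_ideas quality scores and keep only the best one."""
    return max(rng.random() for _ in range(n_ideas))

def expected_best(n_ideas, trials=10_000):
    """Average the best idea over many simulated careers."""
    return sum(best_idea(n_ideas) for _ in range(trials)) / trials

# A dozen ideas vs. a Bach-like thousand: same talent per idea,
# very different best work (analytically, E[max] = n / (n + 1)).
print(round(expected_best(12), 3))    # ~0.923
print(round(expected_best(1000), 3))  # ~0.999
```

The same model also captures Simonton’s other point: drawing a thousand ideas produces far more bad ones than drawing a dozen, so fecundity and failure rise together.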

Simonton’s point is that there is nothing neat and efficient about creativity. “The more successes there are,” he says, “the more failures there are as well”–meaning that the person who had far more ideas than the rest of us will have far more bad ideas than the rest of us, too. This is why managing the creative process is so difficult. The making of the classic Rolling Stones album “Exile on Main Street” was an ordeal, Keith Richards writes in his new memoir, because the band had too many ideas. It had to fight from under an avalanche of mediocrity: “Head in the Toilet Blues,” “Leather Jackets,” “Windmill,” “I Was Just a Country Boy,” “Bent Green Needles,” “Labour Pains,” and “Pommes de Terre”–the last of which Richards explains with the apologetic, “Well, we were in France at the time.”

At one point, Richards quotes a friend, Jim Dickinson, remembering the origins of the song “Brown Sugar”: “I watched Mick write the lyrics. . . . He wrote it down as fast as he could move his hand. I’d never seen anything like it. He had one of those yellow legal pads, and he’d write a verse a page, just write a verse and then turn the page, and when he had three pages filled, they started to cut it. It was amazing.” Richards goes on to marvel, “It’s unbelievable how prolific he was.” Then he writes, “Sometimes you’d wonder how to turn the fucking tap off. The odd times he would come out with so many lyrics, you’re crowding the airwaves, boy.” Richards clearly saw himself as the creative steward of the Rolling Stones (only in a rock-and-roll band, by the way, can someone like Keith Richards perceive himself as the responsible one), and he came to understand that one of the hardest and most crucial parts of his job was to “turn the fucking tap off,” to rein in Mick Jagger’s incredible creative energy.

The more Starkweather talked, the more apparent it became that his entire career had been a version of this problem. Someone was always trying to turn his tap off. But someone had to turn his tap off: the interests of the innovator aren’t perfectly aligned with the interests of the corporation. Starkweather saw ideas on their own merits. Xerox was a multinational corporation, with shareholders, a huge sales force, and a vast corporate customer base, and it needed to consider every new idea within the context of what it already had. Xerox’s managers didn’t always make the right decisions when they said no to Starkweather. But he got to PARC, didn’t he? And Xerox, to its great credit, had a PARC–a place where, a continent away from the top managers, an engineer could sit and dream, and get every purchase order approved, and fire a laser across the Foothill Expressway if he was so inclined. Yes, he had to pit his laser printer against lesser ideas in the contest. But he won the contest. And, the instant he did, Xerox cancelled the competing projects and gave him the green light.

“I flew out there and gave a presentation to them on what I was looking at,” Starkweather said of his first visit to PARC. “They really liked it, because at the time they were building a personal computer, and they were beside themselves figuring out how they were going to get whatever was on the screen onto a sheet of paper. And when I showed them how I was going to put prints on a sheet of paper it was a marriage made in heaven.” The reason Xerox invented the laser printer, in other words, is that it invented the personal computer. Without the big idea, it would never have seen the value of the small idea. If you consider innovation to be efficient and ideas precious, that is a tragedy: you give the crown jewels away to Steve Jobs, and all you’re left with is a printer. But in the real, messy world of creativity, giving away the thing you don’t really understand for the thing that you do is an inevitable tradeoff.

“When you have a bunch of smart people with a broad enough charter, you will always get something good out of it,” Nathan Myhrvold, formerly a senior executive at Microsoft, argues. “It’s one of the best investments you could possibly make–but only if you chose to value it in terms of successes. If you chose to evaluate it in terms of how many times you failed, or times you could have succeeded and didn’t, then you are bound to be unhappy. Innovation is an unruly thing. There will be some ideas that don’t get caught in your cup. But that’s not what the game is about. The game is what you catch, not what you spill.”

In the nineteen-nineties, Myhrvold created a research laboratory at Microsoft modelled in part on what Xerox had done in Palo Alto in the nineteen-seventies, because he considered PARC a triumph, not a failure. “Xerox did research outside their business model, and when you do that you should not be surprised that you have a hard time dealing with it–any more than if some bright guy at Pfizer wrote a word processor. Good luck to Pfizer getting into the word-processing business. Meanwhile, the thing that they invented that was similar to their own business–a really big machine that spit paper out–they made a lot of money on it.” And so they did. Gary Starkweather’s laser printer made billions for Xerox. It paid for every other single project at Xerox PARC, many times over.

In 1988, Starkweather got a call from the head of one of Xerox’s competitors, trying to lure him away. It was someone whom he had met years ago. “The decision was painful,” he said. “I was a year from being a twenty-five-year veteran of the company. I mean, I’d done enough for Xerox that unless I burned the building down they would never fire me. But that wasn’t the issue. It’s about having ideas that are constantly squashed. So I said, ‘Enough of this,’ and I left.”

He had a good many years at his new company, he said. It was an extraordinarily creative place. He was part of decision-making at the highest level. “Every employee from technician to manager was hot for the new, exciting stuff,” he went on. “So, as far as buzz and daily environment, it was far and away the most fun I’ve ever had.” But it wasn’t perfect. “I remember I called in the head marketing guy and I said, ‘I want you to give me all the information you can come up with on when people buy one of our products–what software do they buy, what business are they in–so I can see the model of how people are using the machines.’ He looked at me and said, ‘I have no idea about that.’ ” Where was the rigor? Then Starkweather had a scheme for hooking up a high-resolution display to one of his new company’s computers. “I got it running and brought it into management and said, ‘Why don’t we show this at the tech expo in San Francisco? You’ll be able to rule the world.’ They said, ‘I don’t know. We don’t have room for it.’ It was that sort of thing. It was like me saying I’ve discovered a gold mine and you saying we can’t afford a shovel.”

He shrugged a little wearily. It was ever thus. The innovator says go. The company says stop–and maybe the only lesson of the legend of Xerox PARC is that what happened there happens, in one way or another, everywhere. By the way, the man who hired Gary Starkweather away to the company that couldn’t afford a shovel? His name was Steve Jobs.

Xerox PARC, Apple, and the truth about innovation.

Too much information

Applying the concept of a sprint from Agile development helps me cope with information overload. I block off a period of two to three hours to concentrate on my work. I hide myself, disconnecting from email and instant messages to avoid any interruption. I have also learned that almost nothing cannot wait a few hours, or even a day or two. You just have to set the right expectation: people cannot demand an instant response from you all the time.

Jun 30th 2011, The Economist
How to cope with data overload

GOOGLE “information overload” and you are immediately overloaded with information: more than 7m hits in 0.05 seconds. Some of this information is interesting: for example, that the phrase “information overload” was popularised by Alvin Toffler in 1970. Some of it is mere noise: obscure companies promoting their services and even more obscure bloggers sounding off. The overall impression is at once overwhelming and confusing.

“Information overload” is one of the biggest irritations in modern life. There are e-mails to answer, virtual friends to pester, YouTube videos to watch and, back in the physical world, meetings to attend, papers to shuffle and spouses to appease. A survey by Reuters once found that two-thirds of managers believe that the data deluge has made their jobs less satisfying or hurt their personal relationships. One-third think that it has damaged their health. Another survey suggests that most managers think most of the information they receive is useless.

Commentators have coined a profusion of phrases to describe the anxiety and anomie caused by too much information: “data asphyxiation” (William van Winkle), “data smog” (David Shenk), “information fatigue syndrome” (David Lewis), “cognitive overload” (Eric Schmidt) and “time famine” (Leslie Perlow). Johann Hari, a British journalist, notes that there is a good reason why “wired” means both “connected to the internet” and “high, frantic, unable to concentrate”.

These worries are exaggerated. Stick-in-the-muds have always complained about new technologies: the Victorians fussed that the telegraph meant that “the businessman of the present day must be continually on the jump.” And businesspeople have always had to deal with constant pressure and interruptions—hence the word “business”. In his classic study of managerial work in 1973 Henry Mintzberg compared managers to jugglers: they keep 50 balls in the air and periodically check on each one before sending it aloft once more.

Yet clearly there is a problem. It is not merely the dizzying increase in the volume of information (the amount of data being stored doubles every 18 months). It is also the combination of omnipresence and fragmentation. Many professionals are welded to their smartphones. They are also constantly bombarded with unrelated bits and pieces—a poke from a friend one moment, the latest Greek financial tragedy the next.

The data fog is thickening at a time when companies are trying to squeeze ever more out of their workers. A survey in America by Spherion Staffing discovered that 53% of workers had been compelled to take on extra tasks since the recession started. This dismal trend may well continue—many companies remain reluctant to hire new people even as business picks up. So there will be little respite from the dense data smog, which some researchers fear may be poisonous.

They raise three big worries. First, information overload can make people feel anxious and powerless: scientists have discovered that multitaskers produce more stress hormones. Second, overload can reduce creativity. Teresa Amabile of Harvard Business School has spent more than a decade studying the work habits of more than 9,000 people. She finds that focus and creativity are connected. People are more likely to be creative if they are allowed to focus on something for some time without interruptions. If constantly interrupted or forced to attend meetings, they are less likely to be creative. Third, overload can also make workers less productive. David Meyer, of the University of Michigan, has shown that people who complete certain tasks in parallel take much longer and make many more errors than people who complete the same tasks in sequence.

What can be done about information overload? One answer is technological: rely on the people who created the fog to invent filters that will clean it up. Xerox promises to restore “information sanity” by developing better filtering and managing devices. Google is trying to improve its online searches by taking into account more personal information. (Some people fret that this will breach their privacy, but it will probably deliver quicker, more accurate searches.) A popular computer program called “Freedom” disconnects you from the web at preset times.

A second answer involves willpower. Ration your intake. Turn off your mobile phone and internet from time to time.

But such ruses are not enough. Smarter filters cannot stop people from obsessively checking their BlackBerrys. Some do so because it makes them feel important; others because they may be addicted to the “dopamine squirt” they get from receiving messages, as Edward Hallowell and John Ratey, two academics, have argued. And self-discipline can be counter-productive if your company doesn’t embrace it. Some bosses get shirty if their underlings are unreachable even for a few minutes.

Most companies are better at giving employees access to the information superhighway than at teaching them how to drive. This is starting to change. Management consultants have spotted an opportunity. Derek Dean and Caroline Webb of McKinsey urge businesses to embrace three principles to deal with data overload: find time to focus, filter out noise and forget about work when you can. Business leaders are chipping in. David Novak of Yum! Brands urges people to ask themselves whether what they are doing is constructive or a mere “activity”. John Doerr, a venture capitalist, urges people to focus on a narrow range of objectives and filter out everything else. Cristobal Conde of SunGard, an IT firm, preserves “thinking time” in his schedule when he cannot be disturbed. This might sound like common sense. But common sense is rare amid the cacophony of corporate life.

Slaying the Cable Monster: Why HDMI Brands Don’t Matter

I have kept saying that those who buy expensive HDMI cables are idiots, and now here is the proof.

By Will Greenwald, May 13 2011, PC Magazine
For the vast majority of HDTV owners, a $5 HDMI cable will provide the same performance as a $100 one.

You’ve probably experienced this when shopping for a new HDTV: A store clerk sidles up and offers to help. He then points you toward the necessary HDMI cables to go with your new television. And they’re expensive. Maybe $60 or $70, sometimes even more than $100 (You could buy a cheap Blu-ray player or a handful of Blu-ray discs for that price!). The clerk then claims that these are special cables. Superior cables. Cables you absolutely need if you want the best possible home theater experience. And the claims are, for the vast majority of home theater users, utter rubbish.

The truth is, for most HDTV setups, there is absolutely no effective difference between a no-name $3 HDMI cable you can order from Amazon.com and a $120 Monster cable you buy at a brick-and-mortar electronics store. We ran five different HDMI cables, ranging in price from less than $5 up to more than $100, through rigorous tests to determine whether there’s any difference between a dirt-cheap cable and one that costs a fortune.

HDMI Basics

The first thing to remember about HDMI is that it is a digital standard. Unlike component video, composite video, S-video, or coaxial cable, HDMI signals don’t gradually degrade, or get fuzzy and lose clarity as the signal fades or interference grows. For digital signals like HDMI, as long as there is enough data for the receiver to put together a picture, it will form. If there isn’t, it will just drop off. While processing artifacts can occur and gaps in the signal can cause blocky effects or screen blanking, generally an HDMI signal will display whenever the signal successfully reaches the receiver. Claims that more expensive cables put forth greater video or audio fidelity are nonsense; it’s like saying you can get better-looking YouTube videos on your laptop by buying more expensive Ethernet cables. From a technical standpoint, it simply doesn’t make sense.
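The all-or-nothing behavior described above can be illustrated with a toy model (a minimal sketch of threshold decoding in general, not of actual HDMI/TMDS signaling): as long as noise stays below the receiver's decision threshold, every bit, and therefore every pixel, is recovered exactly, so a pricier cable cannot make the picture look better.

```python
import random

random.seed(0)

def transmit_digital(bits, noise):
    """Toy digital link: each bit is sent as a 0.0 or 1.0 level plus
    bounded noise, and the receiver decides with a 0.5 threshold.
    Below that threshold, recovery is bit-perfect."""
    received = []
    for b in bits:
        level = b + random.uniform(-noise, noise)
        received.append(1 if level >= 0.5 else 0)
    return received

frame = [random.randint(0, 1) for _ in range(10_000)]

# Mild noise (think: a cheap but in-spec cable): every bit is recovered
# exactly, so the output is identical to an expensive cable's output.
assert transmit_digital(frame, noise=0.4) == frame

# Past the cliff, the link doesn't get gradually "fuzzy" -- bits simply
# flip, which shows up as blocking or blanking, not soft degradation.
errors = sum(a != b for a, b in zip(frame, transmit_digital(frame, noise=0.8)))
assert errors > 0
```

This is the key contrast with analog formats like component or composite video, where noise is added directly to the picture and every increment of interference makes the image slightly worse.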

This doesn’t mean that all HDMI cables are created equal in all cases. HDMI includes multiple specifications detailing standards of bandwidth and the capabilities of the cable.

The current HDMI specification, version 1.4a, requires all compliant cables to support 3D video, 4K resolution (approximately 4000-by-2000-pixel resolution, or about four times the detail of the current HD standard of 1080p), Ethernet data transmissions, and audio return channels. Each of these features requires more bandwidth, and considerably older HDMI cables (and all older HDMI-equipped devices) rated at HDMI 1.3b or lower can’t handle that much bandwidth. For most users, 3D is the only feature they’ll use. Ethernet over HDMI is used mostly for networking devices instead of connecting via pure Ethernet or Wi-Fi (the methods most consumer electronics products use). Audio return channels are only useful in certain situations with dedicated sound systems (and the same task can be accomplished by running an audio cable to the system). And there aren’t currently any consumer-grade displays or playback devices capable of handling 4K resolutions (the least-expensive 4K projector you’ll find is more than $75,000). In all of these cases, it’s a yes-or-no question: does it support these features? There is no question of clarity or superior signal.

That said, there are cases where higher-quality cables and taking steps to maintain signal quality are important. They just aren’t cases that apply to most HDTV owners. If you’re going to run an HDMI cable for lengths longer than 10 feet, you should be concerned about insulation to protect against signal degradation. It’s not an issue for 6-foot lengths of cable, but as the distance between media device and display increases, signal quality decreases and the signal becomes more susceptible to magnetic interference. In fact, for distances of over 30 feet, the HDMI licensing board recommends either using a signal amplifier or considering an alternate solution, like an HDMI-over-Ethernet converter. When you’re running up against the maximum length, the greater insulation and build quality of more expensive cables can potentially improve the stability of your signal. However, if there’s a 30-foot gap between your Blu-ray player and your HDTV, you might want to rearrange some furniture. Or just use a technology designed for long distances.

The second thing to know about HDMI cables is that they are almost always expensive when you buy them at brick-and-mortar stores. If you walk into a Best Buy or Radio Shack, you can expect to pay at least $40 for a 6-foot HDMI cable. Even at discount stores like Wal-Mart and Target, the cheapest, most generic HDMI cables retail for $15 and more. Online, you’ll do a lot better on prices. Amazon.com and Monoprice.com (the “ancient custom installer’s secret”) slash even Wal-Mart’s HDMI cable prices into tiny bits. Both sites sell several models of HDMI cables for as little as $1.50. These are generally generic HDMI cables, or seldom-heard-of brands, but they work just fine for most HDTV users. We can be certain of this, because we tested them in the PCMag Labs.

Testing the Cables

We tested five cables including Monster Cable’s 1200 Higher Definition Experience Pack, a combination HDMI/Ethernet bundle that lists for $119.95 but we found for $79.95 at Amazon.com, the Monster Cable HDMI 500HD High Speed Cable ($59.95 list, we got it at Amazon for $52.62), the Spider International E-HDMI-0006 E-Series Super High Speed HDMI with Ethernet cable ($64.99 list price and a $45.29 Amazon price), the Cables Unlimited 6-Foot HDMI Male to Male Cable (PCM-2295-06) that Amazon carries for $3.19, and an unbranded, OEM cable from Monoprice that was shipped in a Belkin bag but doesn’t match any of the company’s own HDMI cables (and retails for $3.28, or $2.78 if you buy 50 cables or more).

We’ve left out some of the more lavishly expensive HDMI cables, like the AudioQuest series of HDMI cables, because they retail for nearly $700. Unless those cables can let me eat the food I see on the Food Network, they’re not worth the price of an actual HDTV.

Based purely on the cables’ specs, Monster Cable’s HDMI cables are superior. Of course, that’s because Monster Cable is the only company of the four to offer any notable specifications. Spider International and Cables Unlimited offered very little information about their cables, and the generic cable had no specifications besides being 28 AWG (American Wire Gauge), a number that simply references the width of the wire used in the cable (28 AWG is a standard measurement, though some cables can be slightly thicker at 26 or 24 AWG). HDMI standards require that all HDMI 1.4 cables be able to handle a bandwidth of 10.2 gigabits per second (Gbps). The Monster Blu-Ray 1200 Higher Definition Experience Pack has a rated speed of 17.8 Gbps. Again, what really matters is whether the cable is HDMI-1.4-compliant and can support the necessary features mentioned above. The higher bandwidth doesn’t matter for HDTV signals. It might make a difference with 4K video, but since HDTVs currently top out at 1080p, that point is moot.
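Back-of-the-envelope arithmetic makes the point concrete. The figures below are the standard video timing for 1080p60 (a 148.5 MHz pixel clock) and TMDS encoding (10 bits on the wire per 8-bit symbol, across 3 data channels); they are general HDMI facts, not numbers from PCMag's testing.

```python
# Rough TMDS bandwidth needed for a 1080p60 signal.
pixel_clock_hz = 148_500_000   # 1080p at 60 Hz, including blanking intervals
tmds_channels = 3              # one TMDS data channel per color component
bits_per_symbol = 10           # 8 data bits expand to 10 bits on the wire

bandwidth_gbps = pixel_clock_hz * tmds_channels * bits_per_symbol / 1e9
print(f"1080p60 needs about {bandwidth_gbps:.2f} Gbps")

# Any HDMI 1.4 "High Speed" cable must carry 10.2 Gbps, so a 1080p
# signal uses less than half of the mandated capacity -- Monster's
# 17.8 Gbps rating buys nothing extra at this resolution.
assert bandwidth_gbps < 10.2
```

The calculation lands at roughly 4.5 Gbps, which is why a cable's rated speed above the 10.2 Gbps floor is irrelevant for any current HDTV signal.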

As long as the cable is HDMI-1.4 compliant and it can hit 10.2 Gbps, which it will if it’s 1.4-compliant, it will do the trick. Also, we couldn’t find a cable that wasn’t 1.4-compliant, so that shouldn’t be a problem.

For consistency, we used only 6-foot or 2-meter (6.6-foot) cables to ensure that cable length didn’t affect the results of the tests. We paired a Sony Bravia KDL-46EX720 3D HDTV with an LG BD670 Blu-ray player for all tests. The television was set to standard, default image settings, and the Blu-ray player was set to output only a 1080p video signal. We put the cables through three different tests: a technical quality evaluation, a blind video test, and a 3D-support test.

For the technical quality evaluation, we used the HQV video benchmark Blu-ray Disc. For each cable, we ran through the gamut of HQV video tests, which check for numerous image-processing, frame-rate-synchronization, and color-correction capabilities. The tests include numerous patterns and animations to expose possible display problems. All five cables passed HQV’s tests with flying colors, with a single exception, which was consistent across all of them (and thus more likely a flaw of either the HDTV or the Blu-ray player): 2:2 film pull-down looked a bit jerky, a minor issue that doesn’t affect the cables’ individual performance.

The blind video test involved the assistance of five volunteers in the PCMag Lab. They were shown the same scene from Predators on Blu-ray with different cables. They were not told which cable was which until the end of the test. No one saw any appreciable difference between the $3 cables and the $120 cable, or any of the cables in between. However, we did notice a curious phenomenon: the screen appeared slightly darker and a bit more saturated when connected to the Blu-ray player with the Monster Cable 1200 High Definition Experience Pack cable. The HDTV showed that it was receiving the same 12-bit color depth information through each cable, so the more-expensive Monster cable wasn’t pushing through more color detail. Again, the difference was minimal, and could be corrected by calibrating your HDTV.

Finally, we loaded the 3D Avatar Blu-ray to check that the cables could handle an HDMI 1.4 standard feature: 3D content. Again, every cable, including the cheap $3 cable, carried a 3D video feed to the HDTV easily.

If you’re like the vast majority of HDTV users and have a fairly simple setup that isn’t spread across a large area, there is absolutely no reason to spend more than $10 on an HDMI cable, never mind more than $100 on one. Any possible benefit that could come from an over-engineered, overpriced HDMI cable simply won’t show up in your home theater. If you’re running a 4K projector, or have a 25-foot hallway between your Blu-ray player and HDTV, or want to show off how big your home theater budget is, that’s one thing. If you just want to hook up your Blu-ray player, cable box, or video game system to your HDTV, bypass the big stores and big brands and reach into the Web bargain bin. Then use the money you save to buy more electronics that need to be connected to one another.

No Hell. Pastor Rob Bell: What if Hell Doesn’t Exist?

I am not as liberal as Rob Bell; I believe Hell does exist, but that it is reserved only for truly evil people like Mao Zedong or Muammar Gaddafi (maybe George W. Bush too). I definitely don’t agree that only Christians go to heaven and everybody else goes to hell.

I am joining a reading group, starting in May, on Rob Bell’s book “Love Wins: A Book About Heaven, Hell, and the Fate of Every Person Who Ever Lived.” For those who are interested, please register here.

By Jon Meacham, Thursday, Apr. 14, 2011, Time magazine

As part of a series on peacemaking, in late 2007, Pastor Rob Bell’s Mars Hill Bible Church put on an art exhibit about the search for peace in a broken world. It was just the kind of avant-garde project that had helped power Mars Hill’s growth (the Michigan church attracts 7,000 people each Sunday) as a nontraditional congregation that emphasizes discussion rather than dogmatic teaching. An artist in the show had included a quotation from Mohandas Gandhi. Hardly a controversial touch, one would have thought. But one would have been wrong.

A visitor to the exhibit had stuck a note next to the Gandhi quotation: “Reality check: He’s in hell.” Bell was struck.

Really? he recalls thinking.

Gandhi’s in hell?

He is?

We have confirmation of this?

Somebody knows this?

Without a doubt?

And that somebody decided to take on the responsibility of letting the rest of us know?

So begins Bell’s controversial new best seller, Love Wins: A Book About Heaven, Hell, and the Fate of Every Person Who Ever Lived. Works by Evangelical Christian pastors tend to be pious or at least on theological message. The standard Christian view of salvation through the death and resurrection of Jesus of Nazareth is summed up in the Gospel of John, which promises “eternal life” to “whosoever believeth in Him.” Traditionally, the key is the acknowledgment that Jesus is the Son of God, who, in the words of the ancient creed, “for us and for our salvation came down from heaven … and was made man.” In the Evangelical ethos, one either accepts this and goes to heaven or refuses and goes to hell.

Bell, a tall, 40-year-old son of a Michigan federal judge, begs to differ. He suggests that the redemptive work of Jesus may be universal — meaning that, as his book’s subtitle puts it, “every person who ever lived” could have a place in heaven, whatever that turns out to be. Such a simple premise, but with Easter at hand, this slim, lively book has ignited a new holy war in Christian circles and beyond. When word of Love Wins reached the Internet, one conservative Evangelical pastor, John Piper, tweeted, “Farewell Rob Bell,” unilaterally attempting to evict Bell from the Evangelical community. R. Albert Mohler Jr., president of the Southern Baptist Theological Seminary, says Bell’s book is “theologically disastrous. Any of us should be concerned when a matter of theological importance is played with in a subversive way.” In North Carolina, a young pastor was fired by his church for endorsing the book.

The traditionalist reaction is understandable, for Bell’s arguments about heaven and hell raise doubts about the core of the Evangelical worldview, changing the common understanding of salvation so much that Christianity becomes more of an ethical habit of mind than a faith based on divine revelation. “When you adopt universalism and erase the distinction between the church and the world,” says Mohler, “then you don’t need the church, and you don’t need Christ, and you don’t need the cross. This is the tragedy of nonjudgmental mainline liberalism, and it’s Rob Bell’s tragedy in this book too.”

Particularly galling to conservative Christian critics is that Love Wins is not an attack from outside the walls of the Evangelical city but a mutiny from within — a rebellion led by a charismatic, popular and savvy pastor with a following. Is Bell’s Christianity — less judgmental, more fluid, open to questioning the most ancient of assumptions — on an inexorable rise? “I have long wondered if there is a massive shift coming in what it means to be a Christian,” Bell says. “Something new is in the air.”

Which is what has many traditional Evangelicals worried. Bell’s book sheds light not only on enduring questions of theology and fate but also on a shift within American Christianity. More indie rock than “Rock of Ages,” with its videos and comfort with irony (Bell sometimes seems an odd combination of Billy Graham and Conan O’Brien), his style of doctrine and worship is clearly playing a larger role in religious life, and the ferocity of the reaction suggests that he is a force to be reckoned with.

Otherwise, why reckon with him at all? A similar work by a pastor from one of the declining mainline Protestant denominations might have merited a hostile blog post or two — bloggers, like preachers, always need material — but it is difficult to imagine that an Episcopal priest’s eschatological musings would have provoked the volume of criticism directed at Bell, whose reach threatens prevailing Evangelical theology.

Bell insists he is only raising the possibility that theological rigidity — and thus a faith of exclusion — is a dangerous thing. He believes in Jesus’ atonement; he says he is just unclear on whether the redemption promised in Christian tradition is limited to those who meet the tests of the church. It is a case for living with mystery rather than demanding certitude.

From a traditionalist perspective, though, to take away hell is to leave the church without its most powerful sanction. If heaven, however defined, is everyone’s ultimate destination in any event, then what’s the incentive to confess Jesus as Lord in this life? If, in other words, Gandhi is in heaven, then why bother with accepting Christ? If you say the Bible doesn’t really say what a lot of people have said it says, then where does that stop? If the verses about hell and judgment aren’t literal, what about the ones on adultery, say, or homosexuality? Taken to their logical conclusions, such questions could undermine much of conservative Christianity.

What the Hell?

From the Apostle Paul to John Paul II, from Augustine to Calvin, Christians have debated atonement and judgment for nearly 2,000 years. Early in the 20th century, Harry Emerson Fosdick came to represent theological liberalism, arguing against the literal truth of the Bible and the existence of hell. It was time, progressives argued, for the faith to surrender its supernatural claims.

Bell is more at home with this expansive liberal tradition than he is with the old-time believers of Inherit the Wind. He believes that Jesus, the Son of God, was sacrificed for the sins of humanity and that the prospect of a place of eternal torment seems irreconcilable with the God of love. Belief in Jesus, he says, should lead human beings to work for the good of this world. What comes next has to wait. “When we get to what happens when we die, we don’t have any video footage,” says Bell. “So let’s at least be honest that we are speculating, because we are.” He is quick to note, though, that his own speculation, while unconventional, is not unprecedented. “At the center of the Christian tradition since the first church,” Bell writes, “have been a number who insist that history is not tragic, hell is not forever, and love, in the end, wins and all will be reconciled to God.”

It is also true that the Christian tradition since the first church has insisted that history is tragic for those who do not believe in Jesus; that hell is, for them, forever; and that love, in the end, will envelop those who profess Jesus as Lord, and they — and they alone — will be reconciled to God. Such views cannot be dismissed because they are inconvenient or uncomfortable: they are based on the same Bible that liberals use to make the opposite case. This is one reason religious debate can seem a wilderness of mirrors, an old CIA phrase describing the bewildering world of counterintelligence.

Still, the dominant view of the righteous in heaven and the damned in hell owes more to the artistic legacy of the West, from Michelangelo to Dante to Blake, than it does to history or to unambiguous biblical teaching. Neither pagan nor Jewish tradition offered a truly equivalent vision of a place of eternal torment; the Greek and Roman underworlds tended to be morally neutral, as did much of the Hebraic tradition concerning Sheol, the realm of the dead.

Things many Christian believers take for granted are more complicated than they seem. It was only when Jesus failed to return soon after the Passion and Resurrection appearances that the early church was compelled to make sense of its recollections of his teachings. Like the Bible — a document that often contradicts itself and from which one can construct sharply different arguments — theology is the product of human hands and hearts. What many believers in the 21st century accept as immutable doctrine was first formulated in the fog and confusion of the 1st century, a time when the followers of Jesus were baffled and overwhelmed by their experience of losing their Lord; many had expected their Messiah to be a Davidic military leader, not an atoning human sacrifice.

When Jesus spoke of the “kingdom of heaven,” he was most likely referring not to a place apart from earth, one of clouds and harps and an eternity with your grandmother, but to what he elsewhere called the “kingdom of God,” a world redeemed and renewed in ways beyond human imagination. To 1st century ears in ancient Judea, Jesus’ talk of the kingdom was centered on the imminent arrival of a new order marked by the defeat of evil, the restoration of Israel and a general resurrection of the dead — all, in the words of the prayer he taught his disciples, “on earth.”

There is, however, no escaping the fact that Jesus speaks in the Bible of a hell for the “condemned.” He sometimes uses the word Gehenna, which was a valley near Jerusalem associated with the sacrifice of children by fire to the Phoenician god Moloch; elsewhere in the New Testament, writers (especially Paul and John the Divine) tell of a fiery pit (Tartarus or Hades) in which the damned will spend eternity. “Depart from me, you cursed [ones], into the eternal fire prepared for the devil and his angels,” Jesus says in Matthew. In Mark he speaks of “the unquenchable fire.” The Book of Revelation paints a vivid picture — in a fantastical, problematic work that John the Divine says he composed when he was “in the spirit on the Lord’s day,” a signal that this is not an Associated Press report — of the lake of fire and the dismissal of the damned from the presence of God to a place where “they will be tormented day and night for ever and ever.”

And yet there is a contrary scriptural trend that suggests, as Jesus puts it, that the gates of hell shall not finally prevail, that God will wipe away every tear — not just the tears of Evangelical Christians but the tears of all. Bell puts much stock in references to the universal redemption of creation: in Matthew, Jesus speaks of the “renewal of all things”; in Acts, Peter says Jesus will “restore everything”; in Colossians, Paul writes that “God was pleased to … reconcile to himself all things, whether things on earth or things in heaven.”

So is it heaven for Christians who say they are Christians and hell for everybody else? What about babies, or people who die without ever hearing the Gospel through no fault of their own? (As Bell puts it, “What if the missionary got a flat tire?”) Who knows? Such tangles have consumed Christianity for millennia and likely will for millennia to come.

What gives the debate over Bell new significance is that his message is part of an intriguing scholarly trend unfolding simultaneously with the cultural, generational and demographic shifts made manifest at Mars Hill. Best expressed, perhaps, in the work of N.T. Wright, the Anglican bishop of Durham, England (Bell is a Wright devotee), this school focuses on the meaning of the texts themselves, reading them anew and seeking, where appropriate, to ask whether an idea is truly rooted in the New Testament or is attributable to subsequent church tradition and theological dogma.

For these new thinkers, heaven can mean different things. In some biblical contexts it is a synonym for God. In others it signifies life in the New Jerusalem, which, properly understood, is the reality that will result when God brings together the heavens and the earth. In yet others it seems to suggest moments of intense human communion and compassion that are, in theological terms, glimpses of the divine love that one might expect in the world to come. One thing heaven is not is an exclusive place removed from earth. This line of thinking has implications for the life of religious communities in our own time. If the earth is, in a way, to be our eternal home, then its care, and the care of all its creatures, takes on fresh urgency.

Bell’s Journey

The easy narrative about Bell would be one of rebellion — that he is reacting to the strictures of a suffocating childhood by questioning long-standing dogma. The opposite is true. Bell’s creed of conviction and doubt — and his comfort with ambiguity and paradox — comes from an upbringing in which he was immersed in faith but encouraged to ask questions. His father, a central figure in his life, is a federal judge appointed by President Reagan in 1987. (Rob still remembers the drive to Washington in the family Oldsmobile for the confirmation hearings.) “I remember him giving me C.S. Lewis in high school,” Bell says. “My parents were both very intellectually honest, straightforward, and for them, faith meant that you were fully engaged.” As they were raising their family, the Bells, in addition to regular churchgoing, created a rigorous ethos of devotion and debate at home. Dinner-table conversations were pointed; Lewis’ novels and nonfiction were required reading.

The roots of Love Wins can be partly traced to the deathbed of a man Rob Bell never met: his grandfather, a civil engineer in Michigan who died when Rob’s father was 8. The Bells’ was a very conservative Evangelical household. When the senior Bell died, there was to be no grief. “We weren’t allowed to mourn, because the funeral of a Christian is supposed to be a celebration of the believer in heaven with Jesus right now,” says Robert Bell Sr. “But if you’re 8 years old and your dad — the breadwinner — just died, it feels different. Sad.”

The story of how his dad, still a child, was to deal with death has stayed with Rob. “To weep, to shed any tears — that would be doubting the sovereignty of God,” Rob says now, looking back. “That was the thing — ‘They’re all in heaven, so we’re happy about that.’ It doesn’t matter how you are actually humanly responding to this moment …” Bell pauses and chuckles ironically, a bit incredulous. “We’re all just supposed to be thrilled.”

Robby — his mother still calls him that — was emotionally precocious. “When he was around 10 years old, I detected that he had a great interest and concern for people,” his father says. “There he’d be, riding along with me, with his little blond hair, going to see sick folks or friends who were having problems, and he would get back in the truck after a visit and begin to analyze them and their situations very acutely. He had a feel for people and how they felt from very early on.”

Rob was a twice-a-week churchgoer at the Baptist and nondenominational churches the family attended at different times — services on Sunday, youth group on Wednesday. He recalls a kind of quiet frustration even then. “I remember thinking, ‘You know, if Jesus is who this guy standing up there says he is, this should be way more compelling.’ This should have a bit more electricity. The knob should be way more to the right, you know?”

Music, not the church, was his first consuming passion. (His wife Kristen claims he said he wanted to be a pastor when they first met early on at Wheaton College in Illinois. Bell is skeptical: “I swear to this day that that was a line.”) He and some friends started a band when he was a sophomore. “I had always had creative energy but no outlet,” he says. “I really discovered music, writing and playing, working with words and images and metaphors. You might say the music unleashed a monster.”

The band became central to him. Then two things happened: the guitar player decided to go to seminary, and Bell came down with viral meningitis. “It took the wind out of our sails,” he says. “I had no Plan B. I was a wreck. I was devastated, because our band was going to make it. We were going to live in a terrible little house and do terrible jobs at first, because that’s what great bands do — they start out living in terrible little houses and doing terrible little jobs.” His illness — “a freak brain infection” — changed his life, Bell says.

At 21, Rob was teaching barefoot waterskiing at HoneyRock Camp, near Three Lakes, Wis., when he preached his first sermon. “I didn’t know anything,” he says. “I took off my Birkenstocks beforehand. I had this awareness that my life would never be the same again.” The removal of the shoes is an interesting detail for Bell to remember. (“Do not come any closer,” God says to Moses in the Book of Exodus. “Take off your sandals, for the place where you are standing is holy ground.”) Bell says it was just intuitive, but the intuition suggests he had a sense of himself as a player in the unfolding drama of God in history. “Create things and share them,” Bell says. “It all made sense. That moment is etched. I remember thinking distinctly, ‘I could be terrible at this.’ But I knew this would get me up in the morning. I went to Fuller that fall.”

Fuller Theological Seminary, in Pasadena, Calif., is an eclectic place, attracting 4,000 students from 70 countries and more than 100 denominations. “It’s pretty hard to sit with Pentecostals and Holiness people and mainline Presbyterians and Anglicans and come away with a closed mind-set that draws firm boundaries about theology,” says Fuller president Richard Mouw.

After seminary, Bell’s work moved in two directions. He was recovering the context of the New Testament while creating a series of popular videos on Christianity called Nooma, Greek for wind or spirit. He began to attract a following, and Mars Hill — named for the site in Athens where Paul preached the Christian gospel of resurrection to the pagan world — was founded in Grand Rapids, Mich., in 1999. “Whenever people wonder why a church is growing, they say, ‘He’s preaching the Bible.’ Well, lots of people are preaching the Bible, and they don’t have parking problems,” says Bell.

Mars Hill did have parking problems, and Bell’s sudden popularity posed some risks for the young pastor. Pride and self-involvement are perennial issues for ministers, who, like politicians, grow accustomed to the sound of their own voices saying Important Things and to the deference of the flock. By the time Bell was 30, he was an Evangelical celebrity. (He had founded Mars Hill when he was 28.) He was referred to as a “rock star” in this magazine. “There was this giant spotlight on me,” he says. “All of a sudden your words are parsed. I found myself — and I think this happens to a lot of people — wanting to shrink away from it. But I decided, Just own it. I’m very comfortable in a room with thousands of people. I do have this voice. What will I say?”

And how will he say it? The history of Evangelism is in part the history of media and methods: Billy Sunday mastered the radio, Billy Graham television; now churches like Bell’s are at work in the digital vineyards of downloads and social media. Demography is also working in Bell’s favor. “He’s trying to reach a generation that’s more comfortable with mystery, with unsolved questions,” says Mouw, noting that his own young grandchildren are growing up with Hindu and Muslim friends and classmates. “For me, Hindus and Muslims were the people we sent missionaries off to in places we called ‘Arabia,’” Mouw says. “Now that diversity is part of the fabric of daily life. It makes a difference. My generation wanted truth — these are folks who want authenticity. The whole judgmentalism and harshness is something they want to avoid.”

If Bell is right about hell, then why do people need ecclesiastical traditions at all? Why aren’t the Salvation Army and the United Way sufficient institutions to enact a gospel of love, sparing us the talk of heaven and hellfire and damnation and all the rest of it? Why not close up the churches?

Bell knows the arguments and appreciates the frustrations. “I don’t know anyone who hasn’t said, ‘Let’s turn out the lights and say we gave it a shot,'” he says. “But you can’t — I can’t — get away from what this Jesus was, and is, saying to us. What the book tries to do is park itself right in the midst of the tension with a Jesus who offers an urgent and immediate call — ‘Repent! Be transformed! Turn!’ At the same time, I’ve got other sheep. There’s a renewal of all things. There’s water from the rock. People will come from the East and from the West. The scandal of the gospel is Jesus’ radical, healing love for a world that’s broken.”

Fair enough, but let’s be honest: religion heals, but it also kills. Why support a supernatural belief system that, for instance, contributed to that minister in Florida’s burning of a Koran, which led to the deaths of innocent U.N. workers in Afghanistan?

“I think Jesus shares your critique,” Bell replies. “We don’t burn other people’s books. I think Jesus is fairly pissed off about it as well.”

On Sunday, April 17, at Mars Hill, Bell will be joined by singer-songwriter Brie Stoner (who provided some of the music for his Nooma series) and will teach the first 13 verses of the third chapter of Revelation, which speaks of “the city of my God, the new Jerusalem, which is coming down out of heaven from my God … Whoever has ears, let them hear what the Spirit says to the churches.” The precise meaning of the words is open to different interpretations. But this much is clear: Rob Bell has much to say, and many are listening.

The Terror of Code in the Wrong Hands

Here is a new term: the software terrorist, someone who brings negative productivity to a team. I can attest that chasing bugs in poorly written code wastes far more time than rewriting the code from scratch myself.

By Allen Holub, May 2005, SD Times

The 20-to-1 productivity rule says that 5 percent of programmers are 20 times more productive than the remaining 95 percent, but what about the 5 percent at the other end of the bell curve? Consider the software terrorist: the guy who stays up all night, unwittingly but systematically destroying the entire team’s last month’s work while “improving” the code. He doesn’t tell anybody what he’s done, and he never tests. He’s created a ticking time bomb that won’t be discovered for six months.

When the bomb goes off, you can’t roll back six months of work by the whole team, and it takes three weeks of your best programmer’s effort to undo the damage. Meanwhile, our terrorist gets a raise because he stays late so often, working so hard. The brilliant guy who cleans up the debris gets a bad performance review because his schedule has slipped, so he quits.

Valuable tools in the hands of experts become dangerous weapons in the hands of terrorists. The terrorist doesn’t understand how to use generics, templates and casts, and so with a single click on the “refactor” button he destroys the program’s carefully crafted typing system. That single-click refactor is a real time saver for the expert. Scripting languages, which in the right hands save time, become a means for creating write-only code that has to be scrapped after you’ve spent two months trying to figure out why it doesn’t work.
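Holub is writing about Java- and C++-style generics, but the failure mode he describes translates to any language with an escape hatch around the type checker. Here is a minimal Python sketch of the same bomb — the names and scenario are my own construction, not from the article. `typing.cast` satisfies a static type checker but performs no runtime check at all, so a careless cast plants a wrong-typed value that only explodes much later, in code that trusted the annotation:

```python
from typing import cast

names: list[str] = ["Alice"]

def careless_add(item: object) -> None:
    # A careless "fix": cast() silences the type checker's complaint
    # but does nothing at runtime -- an int slips into a list[str].
    names.append(cast(str, item))

careless_add(42)  # no error here: the bomb is planted silently

# The explosion happens far from the cast, in code that trusts the
# annotation on `names`:
try:
    lengths = [len(n) for n in names]  # len(42) raises TypeError
except TypeError:
    print("type discipline defeated by a single cast")
```

The point is Holub's: the cast is a legitimate time-saver for someone who knows the invariant it bypasses, and a delayed-action bug for someone who does not.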

Terrorist scripts can be so central to the app, and so hard to understand, that they sometimes remain in the program, doubling the time required for all maintenance efforts. Terrorist documentation is a font of misinformation. Terrorist tests systematically destroy the database every time they’re run.

Terrorist work isn’t just nonproductive, it’s anti-productive. A terrorist reduces your team’s productivity by at least an order of magnitude. It takes a lot longer to find a bug than to create one. None of the terrorist code ends up in the final program because it all has to be rewritten. You pay the terrorists, and you also pay 10 times more to the people who have to track down and fix their bugs.

Given the difficulty that most organizations have in firing (or even identifying) incompetent people, the only way to solve this problem is not to hire terrorists at all; but the terrorists are masters of disguise, particularly in job interviews. They talk a good game, they have lots of experience, and they have great references because they work so hard.

Since the bottom 5 percent is indistinguishable from the rest of the bottom 95 percent, the only way to avoid hiring terrorists is to avoid hiring from the remaining 95 percent altogether.

The compelling reason for this strategy is that the 20-to-1 rule applies only when elite programmers work exclusively with other elite programmers. Single elite programmers who interact with 10 average programmers waste most of their time explaining and helping rather than working. Two elite programmers raise the productivity of a 20-programmer group by 10 percent. It’s like getting two programmers for free. Two elite programmers working only with each other do the work of at least 20 average programmers. It’s like getting 18 programmers for free. If you pay them twice the going salary (and you should if you want to keep them), you’re still saving vast amounts of money.
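The arithmetic behind those claims can be made explicit. The toy model below is my own reconstruction, chosen only to reproduce Holub's figures: assume an elite programmer embedded in an average team spends most of their time mentoring and so adds only about one extra unit of output, while a pair of elites working alone matches his conservative "at least 20 average programmers":

```python
AVG = 1.0  # output of one average programmer (arbitrary units)

baseline = 20 * AVG                    # a team of 20 average programmers
mixed_team = 18 * AVG + 2 * (AVG + 1)  # same headcount, two elites embedded:
                                       # each elite adds ~1 unit beyond average
elite_pair = 20 * AVG                  # two elites pairing alone: the article's
                                       # "work of at least 20 average programmers"

print(mixed_team / baseline)  # 1.1 -> the 10% boost, "two programmers for free"
print(elite_pair / baseline)  # 1.0 -> two salaries buy a 20-person team's output
```

Under this model the mixed team produces 22 units against the baseline's 20 (the 10 percent gain, i.e. two average programmers' worth for free), while the isolated pair delivers the full baseline with only two salaries — the "18 programmers for free" in the article.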

Unfortunately, it’s possible for a software terrorist to masquerade as an elite programmer, but this disguise is easier to detect. Programmers who insist on working in isolation (especially the ones who come to work at 4:00 p.m. and stay all night), the prima donnas who have fits when they don’t get their way, the programmers who never explain what they’re doing in a way that anyone else can understand and don’t document their code, the ones that reject new technologies or methodologies out of hand rather than showing genuine curiosity—these are the terrorists.

Avoid them no matter how many years of experience they have.

Software terrorism is on the upswing. I used to quote the standard rule that the top 10 percent were 10 times more productive. The hiring practices prevalent since the dot-com explosion—which seem to reject the elite programmers by design—have lowered the general skill level of the profession, however.

As the number of elite programmers gets smaller, their relative productivity gets higher. The only long-term solution to this problem is to change our hiring practices and our attitudes toward training. The cynic in me has a hard time believing that either will happen, but we can always hope for the best.