Thursday, October 27, 2011

Why Digital Talent Doesn’t Want To Work At Your Company

FC Expert Blog

BY FC EXPERT BLOGGER AARON SHAPIRO

Some digital companies are hiring--and in fact are in hot competition for certain types of employees. But you don't have to be Google to attract top-tier talent.



Why doesn't digital talent want to work at your company? It’s not because you’re a consumer packaged goods company, rather than Google. It’s not because you’re in Ohio instead of Silicon Valley. It’s not because your salaries are too low, or because you don’t offer free food and laundry services.
It’s because you’re not providing them the right opportunity. The talent you want would be happy to work in an un-air-conditioned garage in New Mexico if it meant the chance to change the world.
This, the opportunity to do great things, to make a real difference, is what drives most digital talent--whether they’re developers, designers, producers, marketers or business folks.
Most companies don’t offer this, so they skip your company and work somewhere that’s more innovative and exciting. End of story. But the good news is that you can offer them something exciting and great. The promise of changing a giant, behind-the-times organization into an Internet-savvy business is an incredibly exciting challenge and a big way for ambitious people to make an impact.
But it takes more than lip service to make the sale. Job candidates and new hires with digital chops must truly believe in the company’s dedication to digital transformation and they must see that they are empowered to make this change. Trouble is, many big businesses aren’t structured to deliver on this type of opportunity. The attributes of a soul-crushing, Sisyphean, anti-digital workplace run deep.
Digital talent won’t want to work at your company if:
Every element of their work will be pored over by multiple layers of bureaucracy. Even if that’s how the rest of the company operates, it can’t spill into the digital department. In a technology environment, new products and businesses spring up daily and a new endeavor can go from conception to launch in a matter of months. Reining in the momentum will be read as inaction and a clear signal the company isn’t willing to grasp the new way of the world.
Mediocre is good enough. While clocking out at 5 p.m. is attractive to some, it will discourage digital talent. They want to be expected to do something great. They want to be pushed. They care about their work. Their leadership, and those they rely on to get things done, must match their appetite for success.
Trial and error is condemned. The freedom to try out new ideas allows employees to take initiative, make decisions, and learn from their mistakes. It also demonstrates an attractive and inspiring entrepreneurial spirit.
Your company is structured so it takes a lifetime to get to the top, and as such there are no digital experts in company-wide leadership positions. Digital talent--often in their 20s and 30s--needs to see a clear path for uninhibited career development that's based on merit, not years spent, and that extends beyond the confines of the digital department. If they don't, they won't see a reason to stay with the company in the long term.
Your offices are cold, impersonal and downright stodgy. It may sound like this conflicts with the "you don't need to be in Silicon Valley" point, but appreciate the nuance. A traditional office layout is designed to communicate power among certain individuals and barriers between departments. This does not support the collaborative ethos that is intrinsic to the web. Companies should do everything possible to provide the digital team with friendlier, open office space. A location in a hip, young neighborhood (which surely exists in every mid- to large-sized city) is also a big plus.
When all of these digital-talent deterring points are addressed, company leadership has effectively and proactively demonstrated the company’s dedication to a digital transformation. It is at this time that their words, a broadly communicated firm stance on the significance of the company’s digital goals, will make the most impact. Without this conspicuous top-down support, politics in the organization or simply one influential disbeliever can hinder the effort, limit the extent of digital integration possible, and discourage valuable employees.
You need them more than they need you. Demand for their services is so high that they can afford to be picky. If they don't like where they're working, another firm with a more attractive culture and a grander opportunity will quickly snap them up. That could be your company. But it could just as easily be someone else.
Related articles:
Want To Keep (And Motivate) Your Best Employees? It's Not About The Money

How To Discover Amazing Talent For Your Startup

Your Next High-Paying Job May Be In An Industry That Was Never On Your Radar

What It Takes To Get A Job At Google
Adapted from Users Not Customers: Who Really Determines the Success of Your Business (Portfolio), by Aaron Shapiro, CEO of HUGE, a digital agency that helps companies including PepsiCo, Comcast, Target, HBO, and Unilever reimagine how they interact with their customers and manage their business in the online economy. Visit aaronshapiro.com.

Sunday, October 23, 2011

കാണാ മുള്ളാല്‍ (Salt and Pepper)

Musician: Bijibal
Lyricist: Santhosh Varma
Year: 2011
Singer(s): Shreya Ghoshal, Ranjith Govind

കാണാമുള്ളാല്‍ ഉള്‍ നീറും നോവാണനുരാഗം
നോവുമ്പോഴും തേനൂറും സുഖമാണനുരാഗം
എന്നില്‍ നീ, നിന്നില്‍ ഞാനും പതിയെ,
പതിയെ അതിരുകളുരുകി അലിയേ

ഏറെദൂരെയെങ്കില്‍ നീ എന്നുമെന്നെയോര്‍ക്കും
നിന്നരികില്‍ ഞാനണയും കിനാവിനായ്‌ കാതോര്‍ക്കും
വിരഹമേ...ആ ആ
വിരഹമേ നീയുണ്ടെങ്കില്‍ പ്രണയം പടരും
സിരയിലൊരു തീയലയായ്‌...
(കാണാ മുള്ളാല്‍ )

നീരണിഞ്ഞു മാത്രം വളരുന്ന വല്ലിപോലെ
മിഴിനനവില്‍ പൂവണിയും വസന്തമാണനുരാഗം
കദനമേ...
കദനമേ നീയില്ലെങ്കില്‍ പ്രണയം തളരും
വെറുതെയൊരു പാഴ്കുളിരായ്‌...

Friday, October 14, 2011

നീലക്കുറിഞ്ഞികൾ പൂക്കുന്ന വീഥിയിൽ

നീലക്കുറിഞ്ഞികൾ പൂക്കുന്ന വീഥിയിൽ
നിന്നെ പ്രതീക്ഷിച്ചു നിന്നു..
ഒരു കൃഷ്ണതുളസിക്കതിരുമായ് നിന്നെ ഞാൻ
എന്നും പ്രതീക്ഷിച്ചു നിന്നു..
നീയിതു കാണാതെ പോകയോ..
നീയിതു ചൂടാതെ പോകയോ...

ആഷാഢമാസ നിശീഥിനിതൻ വന സീമയിലൂടെ ഞാൻ
ആരും കാണാതെ.. കാറ്റും കേൾക്കാതെ..
എന്നെയും തേടി വരുന്നൂ എന്റെ മൺകുടിൽ തേടി വരുന്നൂ...
നീയിതു കാണാതെ പോകയോ...
നീയിതു ചൂടാതെ പോകയോ ...


ലാസ്യ നിലാവിന്റെ ലാളനമേറ്റു ഞാൻ ഒന്നു മയങ്ങീ...
കാറ്റും കാണാതെ.... കാടും ഉണരാതെ...
എന്റെ ചാരത്തു വന്നൂ...
എന്റെ പ്രേമ നൈവേദ്യമണിഞ്ഞൂ...
നീയിതു കാണാതെ പോകയോ....
നീയിതു ചൂടാതെ പോകയോ...

Film/Album: Neelakkadambu (നീലക്കടമ്പ്)
Lyricist: K. Jayakumar
Music: Raveendran
Singer: K. S. Chithra

Dennis Ritchie: The Shoulders Steve Jobs Stood On

By Cade Metz | October 13, 2011



Dennis Ritchie (standing) and Ken Thompson at a PDP-11 in 1972. (Photo: Courtesy of Bell Labs)
The tributes to Dennis Ritchie won’t match the river of praise that spilled out over the web after the death of Steve Jobs. But they should.
And then some.

“When Steve Jobs died last week, there was a huge outcry, and that was very moving and justified. But Dennis had a bigger effect, and the public doesn’t even know who he is,” says Rob Pike, the programming legend and current Googler who spent 20 years working across the hall from Ritchie at the famed Bell Labs.

On Wednesday evening, with a post to Google+, Pike announced that Ritchie had died at his home in New Jersey over the weekend after a long illness, and though the response from hardcore techies was immense, the collective eulogy from the web at large doesn’t quite do justice to Ritchie’s sweeping influence on the modern world. Dennis Ritchie is the father of the C programming language, and with fellow Bell Labs researcher Ken Thompson, he used C to build UNIX, the operating system that so much of the world is built on — including the Apple empire overseen by Steve Jobs.

“Pretty much everything on the web uses those two things: C and UNIX,” Pike tells Wired. “The browsers are written in C. The UNIX kernel — that pretty much the entire Internet runs on — is written in C. Web servers are written in C, and if they’re not, they’re written in Java or C++, which are C derivatives, or Python or Ruby, which are implemented in C. And all of the network hardware running these programs I can almost guarantee were written in C.

“It’s really hard to overstate how much of the modern information economy is built on the work Dennis did.”

Even Windows was once written in C, he adds, and UNIX underpins both Mac OS X, Apple’s desktop operating system, and iOS, which runs the iPhone and the iPad. “Jobs was the king of the visible, and Ritchie is the king of what is largely invisible,” says Martin Rinard, professor of electrical engineering and computer science at MIT and a member of the Computer Science and Artificial Intelligence Laboratory.

“Jobs’ genius is that he builds these products that people really like to use because he has taste and can build things that people really find compelling. Ritchie built things that technologists were able to use to build core infrastructure that people don’t necessarily see much anymore, but they use every day.”

From B to C

Dennis Ritchie built C because he and Ken Thompson needed a better way to build UNIX. The original UNIX kernel was written in assembly language, but they soon decided they needed a “higher level” language, something that would give them more control over all the data that spanned the OS. Around 1970, they tried building a second version with Fortran, but this didn’t quite cut it, and Ritchie proposed a new language based on a Thompson creation known as B.

Depending on which legend you believe, B was named either for Thompson’s wife Bonnie or BCPL, a language developed at Cambridge in the mid-60s. Whatever the case, B begat C.

B was an interpreted language — meaning it was executed by an intermediate piece of software running atop a CPU — but C was a compiled language. It was translated into machine code, and then directly executed on the CPU. But in those days, C was considered a high-level language. It would give Ritchie and Thompson the flexibility they needed, but at the same time, it would be fast.

That first version of the language wasn’t all that different from C as we know it today — though it was a tad simpler. It offered full data structures and “types” for defining variables, and this is what Ritchie and Thompson used to build their new UNIX kernel. “They built C to write a program,” says Pike, who would join Bell Labs 10 years later. “And the program they wanted to write was the UNIX kernel.”

Ritchie’s running joke was that C had “the power of assembly language and the convenience of … assembly language.” In other words, he acknowledged that C was a less-than-gorgeous creation that still ran very close to the hardware. Today, it’s considered a low-level language, not high. But Ritchie’s joke didn’t quite do justice to the new language. In offering true data structures, it operated at a level that was just high enough.

“When you’re writing a large program — and that’s what UNIX was — you have to manage the interactions between all sorts of different components: all the users, the file system, the disks, the program execution, and in order to manage that effectively, you need to have a good representation of the information you’re working with. That’s what we call data structures,” Pike says.

“To write a kernel without a data structure and have it be as consistent and graceful as UNIX would have been a much, much harder challenge. They needed a way to group all that data together, and they didn’t have that with Fortran.”

At the time, it was an unusual way to write an operating system, and this is what allowed Ritchie and Thompson to eventually imagine porting the OS to other platforms, which they did in the late 70s. “That opened the floodgates for UNIX running everywhere,” Pike says. “It was all made possible by C.”

Apple, Microsoft, and Beyond

At the same time, C forged its own way in the world, moving from Bell Labs to the world’s universities and to Microsoft, the breakout software company of the 1980s. “The development of the C programming language was a huge step forward and was the right middle ground … C struck exactly the right balance, to let you write at a high level and be much more productive, but when you needed to, you could control exactly what happened,” says Bill Dally, chief scientist of NVIDIA and Bell Professor of Engineering at Stanford. “[It] set the tone for the way that programming was done for several decades.”

As Pike points out, the data structures that Ritchie built into C eventually gave rise to the object-oriented paradigm used by modern languages such as C++ and Java.

The revolution began in 1973, when Ritchie published his research paper on the language, and five years later, he and colleague Brian Kernighan released the definitive C book: The C Programming Language. Kernighan had written the early tutorials for the language, and at some point, he “twisted Dennis’ arm” into writing a book with him.

Pike read the book while still an undergraduate at the University of Toronto, picking it up one afternoon while heading home for a sick day. “That reference manual is a model of clarity and readability compared to later manuals. It is justifiably a classic,” he says. “I read it while sick in bed, and it made me forget that I was sick.”

Like many university students, Pike had already started using the language. It had spread across college campuses because Bell Labs started giving away the UNIX source code. Among so many other things, the operating system gave rise to the modern open source movement. Pike isn’t overstating it when he says the influence of Ritchie’s work can’t be overstated, and though Ritchie received the Turing Award in 1983 and the National Medal of Technology in 1998, he still hasn’t gotten his due.

As Kernighan and Pike describe him, Ritchie was an unusually private person. “I worked across the hall from him for more than 20 years, and yet I feel like I didn’t know him all that well,” Pike says. But this doesn’t quite explain his low profile. Steve Jobs was a private person, but his insistence on privacy only fueled the cult of personality that surrounded him.

Ritchie lived in a very different time and worked in a very different environment than someone like Jobs. It only makes sense that he wouldn’t get his due. But those who matter understand the mark he left. “There’s that line from Newton about standing on the shoulders of giants,” says Kernighan. “We’re all standing on Dennis’ shoulders.”

Steve Jobs and the Seven Rules of Success

BY CARMINE GALLO
Steve Jobs' impact on your life cannot be overstated. His innovations have likely touched nearly every aspect -- computers, movies, music and mobile. As a communications coach, I learned from Jobs that a presentation can, indeed, inspire. For entrepreneurs, Jobs' greatest legacy is the set of principles that drove his success.

Over the years, I've become a student of sorts of Jobs' career and life. Here's my take on the rules and values underpinning his success. Any of us can adopt them to unleash our "inner Steve Jobs."

1. Do what you love. Jobs once said, "People with passion can change the world for the better." Asked about the advice he would offer would-be entrepreneurs, he said, "I'd get a job as a busboy or something until I figured out what I was really passionate about." That's how much it meant to him. Passion is everything.

2. Put a dent in the universe. Jobs believed in the power of vision. He once asked then-Pepsi President John Sculley, "Do you want to spend your life selling sugar water or do you want to change the world?" Don't lose sight of the big vision.

Related: Why Entrepreneurs Love Steve Jobs


3. Make connections. Jobs once said creativity is connecting things. He meant that people with a broad set of life experiences can often see things that others miss. He took calligraphy classes that didn't have any practical use in his life -- until he built the Macintosh. Jobs traveled to India and Asia. He studied design and hospitality. Don't live in a bubble. Connect ideas from different fields.

4. Say no to 1,000 things. Jobs was as proud of what Apple chose not to do as he was of what Apple did. When he returned to Apple in 1997, he took a company with 350 products and reduced them to 10 products in a two-year period. Why? So he could put the "A-Team" on each product. What are you saying "no" to?

5. Create insanely different experiences. Jobs also sought innovation in the customer-service experience. When he first came up with the concept for the Apple Stores, he said they would be different because instead of just moving boxes, the stores would enrich lives. Everything about the experience you have when you walk into an Apple store is intended to enrich your life and to create an emotional connection between you and the Apple brand. What are you doing to enrich the lives of your customers?

Related: 10 Things to Thank Steve Jobs For

6. Master the message. You can have the greatest idea in the world, but if you can't communicate your ideas, it doesn't matter. Jobs was the world's greatest corporate storyteller. Instead of simply delivering a presentation like most people do, he informed, he educated, he inspired and he entertained, all in one presentation.

7. Sell dreams, not products. Jobs captured our imagination because he really understood his customer. He knew that tablets would not capture our imaginations if they were too complicated. The result? One button on the front of an iPad. It's so simple, a 2-year-old can use it. Your customers don't care about your product. They care about themselves, their hopes, their ambitions. Jobs taught us that if you help your customers reach their dreams, you'll win them over.

There's one story that I think sums up Jobs' career at Apple. An executive who had the job of reinventing the Disney Store once called up Jobs and asked for advice. His counsel? Dream bigger. I think that's the best advice he could leave us with. See genius in your craziness, believe in yourself, believe in your vision, and be constantly prepared to defend those ideas.

Wednesday, October 12, 2011

Mandela: His 8 Lessons of Leadership

Wednesday, Jul. 09, 2008
By Richard Stengel

Nelson Mandela has always felt most at ease around children, and in some ways his greatest deprivation was that he spent 27 years without hearing a baby cry or holding a child's hand. Last month, when I visited Mandela in Johannesburg — a frailer, foggier Mandela than the one I used to know — his first instinct was to spread his arms to my two boys. Within seconds they were hugging the friendly old man who asked them what sports they liked to play and what they'd had for breakfast. While we talked, he held my son Gabriel, whose complicated middle name is Rolihlahla, Nelson Mandela's real first name. He told Gabriel the story of that name, how in Xhosa it translates as "pulling down the branch of a tree" but that its real meaning is "troublemaker."

As he celebrates his 90th birthday next week, Nelson Mandela has made enough trouble for several lifetimes. He liberated a country from a system of violent prejudice and helped unite white and black, oppressor and oppressed, in a way that had never been done before. In the 1990s I worked with Mandela for almost two years on his autobiography, Long Walk to Freedom. After all that time spent in his company, I felt a terrible sense of withdrawal when the book was done; it was like the sun going out of one's life. We have seen each other occasionally over the years, but I wanted to make what might be a final visit and have my sons meet him one more time.

I also wanted to talk to him about leadership. Mandela is the closest thing the world has to a secular saint, but he would be the first to admit that he is something far more pedestrian: a politician. He overthrew apartheid and created a nonracial democratic South Africa by knowing precisely when and how to transition between his roles as warrior, martyr, diplomat and statesman. Uncomfortable with abstract philosophical concepts, he would often say to me that an issue "was not a question of principle; it was a question of tactics." He is a master tactician.

Mandela is no longer comfortable with inquiries or favors. He's fearful that he may not be able to summon what people expect when they visit a living deity, and vain enough to care that they not think him diminished. But the world has never needed Mandela's gifts — as a tactician, as an activist and, yes, as a politician — more, as he showed again in London on June 25, when he rose to condemn the savagery of Zimbabwe's Robert Mugabe. As we enter the main stretch of a historic presidential campaign in America, there is much that he can teach the two candidates. I've always thought of what you are about to read as Madiba's Rules (Madiba, his clan name, is what everyone close to him calls him), and they are cobbled together from our conversations old and new and from observing him up close and from afar. They are mostly practical. Many of them stem directly from his personal experience. All of them are calibrated to cause the best kind of trouble: the trouble that forces us to ask how we can make the world a better place.

No. 1
Courage is not the absence of fear — it's inspiring others to move beyond it
In 1994, during the presidential-election campaign, Mandela got on a tiny propeller plane to fly down to the killing fields of Natal and give a speech to his Zulu supporters. I agreed to meet him at the airport, where we would continue our work after his speech. When the plane was 20 minutes from landing, one of its engines failed. Some on the plane began to panic. The only thing that calmed them was looking at Mandela, who quietly read his newspaper as if he were a commuter on his morning train to the office. The airport prepared for an emergency landing, and the pilot managed to land the plane safely. When Mandela and I got in the backseat of his bulletproof BMW that would take us to the rally, he turned to me and said, "Man, I was terrified up there!"

Mandela was often afraid during his time underground, during the Rivonia trial that led to his imprisonment, during his time on Robben Island. "Of course I was afraid!" he would tell me later. It would have been irrational, he suggested, not to be. "I can't pretend that I'm brave and that I can beat the whole world." But as a leader, you cannot let people know. "You must put up a front."

And that's precisely what he learned to do: pretend and, through the act of appearing fearless, inspire others. It was a pantomime Mandela perfected on Robben Island, where there was much to fear. Prisoners who were with him said watching Mandela walk across the courtyard, upright and proud, was enough to keep them going for days. He knew that he was a model for others, and that gave him the strength to triumph over his own fear.

No. 2
Lead from the front — but don't leave your base behind
Mandela is cagey. In 1985 he was operated on for an enlarged prostate. When he was returned to prison, he was separated from his colleagues and friends for the first time in 21 years. They protested. But as his longtime friend Ahmed Kathrada recalls, he said to them, "Wait a minute, chaps. Some good may come of this."

The good that came of it was that Mandela on his own launched negotiations with the apartheid government. This was anathema to the African National Congress (ANC). After decades of saying "prisoners cannot negotiate" and after advocating an armed struggle that would bring the government to its knees, he decided that the time was right to begin to talk to his oppressors.

When he initiated his negotiations with the government in 1985, there were many who thought he had lost it. "We thought he was selling out," says Cyril Ramaphosa, then the powerful and fiery leader of the National Union of Mineworkers. "I went to see him to tell him, What are you doing? It was an unbelievable initiative. He took a massive risk."

Mandela launched a campaign to persuade the ANC that his was the correct course. His reputation was on the line. He went to each of his comrades in prison, Kathrada remembers, and explained what he was doing. Slowly and deliberately, he brought them along. "You take your support base along with you," says Ramaphosa, who was secretary-general of the ANC and is now a business mogul. "Once you arrive at the beachhead, then you allow the people to move on. He's not a bubble-gum leader — chew it now and throw it away."

For Mandela, refusing to negotiate was about tactics, not principles. Throughout his life, he has always made that distinction. His unwavering principle — the overthrow of apartheid and the achievement of one man, one vote — was immutable, but almost anything that helped him get to that goal he regarded as a tactic. He is the most pragmatic of idealists.

"He's a historical man," says Ramaphosa. "He was thinking way ahead of us. He has posterity in mind: How will they view what we've done?" Prison gave him the ability to take the long view. It had to; there was no other view possible. He was thinking in terms of not days and weeks but decades. He knew history was on his side, that the result was inevitable; it was just a question of how soon and how it would be achieved. "Things will be better in the long run," he sometimes said. He always played for the long run.

No. 3
Lead from the back — and let others believe they are in front
Mandela loved to reminisce about his boyhood and his lazy afternoons herding cattle. "You know," he would say, "you can only lead them from behind." He would then raise his eyebrows to make sure I got the analogy.

As a boy, Mandela was greatly influenced by Jongintaba, the tribal king who raised him. When Jongintaba had meetings of his court, the men gathered in a circle, and only after all had spoken did the king begin to speak. The chief's job, Mandela said, was not to tell people what to do but to form a consensus. "Don't enter the debate too early," he used to say.

During the time I worked with Mandela, he often called meetings of his kitchen cabinet at his home in Houghton, a lovely old suburb of Johannesburg. He would gather half a dozen men, Ramaphosa, Thabo Mbeki (who is now the South African President) and others around the dining-room table or sometimes in a circle in his driveway. Some of his colleagues would shout at him — to move faster, to be more radical — and Mandela would simply listen. When he finally did speak at those meetings, he slowly and methodically summarized everyone's points of view and then unfurled his own thoughts, subtly steering the decision in the direction he wanted without imposing it. The trick of leadership is allowing yourself to be led too. "It is wise," he said, "to persuade people to do things and make them think it was their own idea."

No. 4
Know your enemy — and learn about his favorite sport
As far back as the 1960s, Mandela began studying Afrikaans, the language of the white South Africans who created apartheid. His comrades in the ANC teased him about it, but he wanted to understand the Afrikaner's worldview; he knew that one day he would be fighting them or negotiating with them, and either way, his destiny was tied to theirs.

This was strategic in two senses: by speaking his opponents' language, he might understand their strengths and weaknesses and formulate tactics accordingly. But he would also be ingratiating himself with his enemy. Everyone from ordinary jailers to P.W. Botha was impressed by Mandela's willingness to speak Afrikaans and his knowledge of Afrikaner history. He even brushed up on his knowledge of rugby, the Afrikaners' beloved sport, so he would be able to compare notes on teams and players.

Mandela understood that blacks and Afrikaners had something fundamental in common: Afrikaners believed themselves to be Africans as deeply as blacks did. He knew, too, that Afrikaners had been the victims of prejudice themselves: the British government and the white English settlers looked down on them. Afrikaners suffered from a cultural inferiority complex almost as much as blacks did.

Mandela was a lawyer, and in prison he helped the warders with their legal problems. They were far less educated and worldly than he, and it was extraordinary to them that a black man was willing and able to help them. These were "the most ruthless and brutal of the apartheid regime's characters," says Allister Sparks, the great South African historian, and he "realized that even the worst and crudest could be negotiated with."

No. 5
Keep your friends close — and your rivals even closer
Many of the guests Mandela invited to the house he built in Qunu were people whom, he intimated to me, he did not wholly trust. He had them to dinner; he called to consult with them; he flattered them and gave them gifts. Mandela is a man of invincible charm — and he has often used that charm to even greater effect on his rivals than on his allies.

On Robben Island, Mandela would always include in his brain trust men he neither liked nor relied on. One person he became close to was Chris Hani, the fiery chief of staff of the ANC's military wing. There were some who thought Hani was conspiring against Mandela, but Mandela cozied up to him. "It wasn't just Hani," says Ramaphosa. "It was also the big industrialists, the mining families, the opposition. He would pick up the phone and call them on their birthdays. He would go to family funerals. He saw it as an opportunity." When Mandela emerged from prison, he famously included his jailers among his friends and put leaders who had kept him in prison in his first Cabinet. Yet I well knew that he despised some of these men.

There were times he washed his hands of people — and times when, like so many people of great charm, he allowed himself to be charmed. Mandela initially developed a quick rapport with South African President F.W. de Klerk, which is why he later felt so betrayed when De Klerk attacked him in public.

Mandela believed that embracing his rivals was a way of controlling them: they were more dangerous on their own than within his circle of influence. He cherished loyalty, but he was never obsessed by it. After all, he used to say, "people act in their own interest." It was simply a fact of human nature, not a flaw or a defect. The flip side of being an optimist — and he is one — is trusting people too much. But Mandela recognized that the way to deal with those he didn't trust was to neutralize them with charm.

No. 6
Appearances matter — and remember to smile
When Mandela was a poor law student in Johannesburg wearing his one threadbare suit, he was taken to see Walter Sisulu. Sisulu was a real estate agent and a young leader of the ANC. Mandela saw a sophisticated and successful black man whom he could emulate. Sisulu saw the future.

Sisulu once told me that his great quest in the 1950s was to turn the ANC into a mass movement; and then one day, he recalled with a smile, "a mass leader walked into my office." Mandela was tall and handsome, an amateur boxer who carried himself with the regal air of a chief's son. And he had a smile that was like the sun coming out on a cloudy day.

We sometimes forget the historical correlation between leadership and physicality. George Washington was the tallest and probably the strongest man in every room he entered. Size and strength have more to do with DNA than with leadership manuals, but Mandela understood how his appearance could advance his cause. As leader of the ANC's underground military wing, he insisted that he be photographed in the proper fatigues and with a beard, and throughout his career he has been concerned about dressing appropriately for his position. George Bizos, his lawyer, remembers that he first met Mandela at an Indian tailor's shop in the 1950s and that Mandela was the first black South African he had ever seen being fitted for a suit. Now Mandela's uniform is a series of exuberant-print shirts that declare him the joyous grandfather of modern Africa.

When Mandela was running for the presidency in 1994, he knew that symbols mattered as much as substance. He was never a great public speaker, and people often tuned out what he was saying after the first few minutes. But it was the iconography that people understood. When he was on a platform, he would always do the toyi-toyi, the township dance that was an emblem of the struggle. But more important was that dazzling, beatific, all-inclusive smile. For white South Africans, the smile symbolized Mandela's lack of bitterness and suggested that he was sympathetic to them. To black voters, it said, I am the happy warrior, and we will triumph. The ubiquitous ANC election poster was simply his smiling face. "The smile," says Ramaphosa, "was the message."

After he emerged from prison, people would say, over and over, It is amazing that he is not bitter. There are a thousand things Nelson Mandela was bitter about, but he knew that more than anything else, he had to project the exact opposite emotion. He always said, "Forget the past" — but I knew he never did.

No. 7
Nothing is black or white
When we began our series of interviews, I would often ask Mandela questions like this one: When you decided to suspend the armed struggle, was it because you realized you did not have the strength to overthrow the government or because you knew you could win over international opinion by choosing nonviolence? He would then give me a curious glance and say, "Why not both?"

I did start asking smarter questions, but the message was clear: Life is never either/or. Decisions are complex, and there are always competing factors. To look for simple explanations is the bias of the human brain, but it doesn't correspond to reality. Nothing is ever as straightforward as it appears.

Mandela is comfortable with contradiction. As a politician, he was a pragmatist who saw the world as infinitely nuanced. Much of this, I believe, came from living as a black man under an apartheid system that offered a daily regimen of excruciating and debilitating moral choices: Do I defer to the white boss to get the job I want and avoid a punishment? Do I carry my pass?

As a statesman, Mandela was uncommonly loyal to Muammar Gaddafi and Fidel Castro. They had helped the ANC when the U.S. still branded Mandela as a terrorist. When I asked him about Gaddafi and Castro, he suggested that Americans tend to see things in black and white, and he would upbraid me for my lack of nuance. Every problem has many causes. While he was indisputably and clearly against apartheid, the causes of apartheid were complex. They were historical, sociological and psychological. Mandela's calculus was always, What is the end that I seek, and what is the most practical way to get there?

No. 8
Quitting is leading too
In 1993, Mandela asked me if I knew of any countries where the minimum voting age was under 18. I did some research and presented him with a rather undistinguished list: Indonesia, Cuba, Nicaragua, North Korea and Iran. He nodded and uttered his highest praise: "Very good, very good." Two weeks later, Mandela went on South African television and proposed that the voting age be lowered to 14. "He tried to sell us the idea," recalls Ramaphosa, "but he was the only [supporter]. And he had to face the reality that it would not win the day. He accepted it with great humility. He doesn't sulk. That was also a lesson in leadership."

Knowing how to abandon a failed idea, task or relationship is often the most difficult kind of decision a leader has to make. In many ways, Mandela's greatest legacy as President of South Africa is the way he chose to leave it. When he was elected in 1994, Mandela probably could have pressed to be President for life — and there were many who felt that in return for his years in prison, that was the least South Africa could do.

In the history of Africa, there have been only a handful of democratically elected leaders who willingly stood down from office. Mandela was determined to set a precedent for all who followed him — not only in South Africa but across the rest of the continent. He would be the anti-Mugabe, the man who gave birth to his country and refused to hold it hostage. "His job was to set the course," says Ramaphosa, "not to steer the ship." He knows that leaders lead as much by what they choose not to do as by what they do.

Ultimately, the key to understanding Mandela is those 27 years in prison. The man who walked onto Robben Island in 1964 was emotional, headstrong, easily stung. The man who emerged was balanced and disciplined. He is not and never has been introspective. I often asked him how the man who emerged from prison differed from the willful young man who had entered it. He hated this question. Finally, in exasperation one day, he said, "I came out mature." There is nothing so rare — or so valuable — as a mature man. Happy birthday, Madiba.



Find this article at:
http://www.time.com/time/magazine/article/0,9171,1821659,00.html

Monday, October 10, 2011

The Top Ten Lessons Steve Jobs Can Teach Us - If We'll Listen

A few years from now, your kids and grandkids will ask you what it was like to be alive when Steve Jobs was the CEO of Apple (AAPL). They will say: “Jobs was the best CEO in business. What was he like? What did you learn from him?”

What will your answer be?

It’s human nature to overlook the importance of the here and now. Those who are great and live among us seem more normal because they’re breathing the same air that we are.

But, make no mistake, once Steve Jobs is no longer with us, there will be an outpouring of emotion. The tributes will be endless. And there will be collective regret that we weren’t more awake, paying attention, while he was with us.

The wisdom he shared with us at every major speech, or on an earnings call, or in a casual chat put up on YouTube will seem 10 times wiser because he’s no longer with us.

So, let’s pause today and try to remind ourselves of some lessons Steve Jobs has taught us all — if we’ve been willing to pay attention:

1. The most enduring innovations marry art and science – Steve always pointed out that the biggest difference between Apple and every other computer (and post-PC) company in history is that Apple tried to marry art and science. Jobs noted that the original team working on the Mac had backgrounds in anthropology, art, history, and poetry. That has always been important in making Apple’s products stand out. It’s the difference between the iPad and every tablet computer that came before it or since. It is the look and feel of a product. It is its soul. But that importance is so difficult for computer scientists or engineers to see that any company needs a leader who sees it for them.

2. To create the future, you can’t do it through focus groups – There is a school of thought in management theory that — if you’re in the consumer-facing space building products and services — you’ve got to listen to your customer. Steve Jobs was one of the first businessmen to say that was a waste of time. Customers don’t always know what they want, especially if it’s something they’ve never seen, heard, or touched before. When it became clear that Apple would come out with a tablet, many were skeptical. When people heard the name (iPad), it was a joke in the Twitter-sphere for a day. But when people held one, and used it, it became a ‘must have.’ They didn’t know how they’d previously lived without one. It became the fastest-growing product in Apple’s history. Jobs (and the Apple team) trusted his own instincts more than others’. Picasso and other great artists had done that for centuries; Jobs was among the first to do it in business.

3. Never fear failure – Jobs was fired by the successor he picked. It was one of the most public embarrassments of the last 30 years in business. Yet, he didn’t become a venture capitalist never to be heard from again. He didn’t start a production company and do a lot of lunches. He picked himself up and got back to work following his passion. Eight years ago, he was diagnosed with pancreatic cancer and told he only had a few weeks to live. As Samuel Johnson said, there’s nothing like your impending death to focus the mind. From Jobs’ 2005 Stanford commencement speech:

No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.

Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma — which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

4. You can’t connect the dots forward – only backward – This is another gem from the 2005 Stanford speech. The idea behind the concept is that, as much as we try to plan our lives ahead in advance, there’s always something that’s completely unpredictable about life. What seems like bitter anguish and defeat in the moment — getting dumped by a girlfriend, not getting that job at McKinsey, “wasting” 4 years of your life on a start-up that didn’t pan out as you wanted — can turn out to sow the seeds of your unimaginable success years from now. You can’t be too attached to how you think your life is supposed to work out and instead trust that all the dots will be connected in the future. This is all part of the plan.

Again, you can’t connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

5. Listen to that voice in the back of your head that tells you if you’re on the right track or not – Most of us don’t hear a voice inside our heads. We’ve simply decided that we’re going to work in finance or be a doctor because that’s what our parents told us we should do or because we wanted to make a lot of money. When we consciously or unconsciously make that decision, we snuff out that little voice in our head. From then on, most of us put it on automatic pilot. We mail it in. You have met these people. They’re nice people. But they’re not changing the world. Jobs has always been a restless soul. A man in a hurry. A man with a plan. His plan isn’t for everyone. It was his plan. He wanted to build computers. Some people have a voice that tells them to fight for democracy. Some have one that tells them to become an expert in miniature spoons. When Jobs first saw an example of a Graphical User Interface — a GUI — he knew this was the future of computing and that he had to create it. That became the Macintosh. Whatever your voice is telling you, you would be smart to listen to it. Even if it tells you to quit your job, or move to China, or leave your partner.

6. Expect a lot from yourself and others – We have heard stories of Steve Jobs yelling or dressing down staff. He’s a control freak, we’ve heard – a perfectionist. The bottom line is that he is in touch with his passion and that little voice in the back of his head. He gives a damn. He wants the best from himself and everyone who works for him. If they don’t give a damn, he doesn’t want them around. And yet — he keeps attracting amazing talent around him. Why? Because talent gives a damn too. There’s a saying: if you’re a “B” player, you’ll hire “C” players below you because you don’t want them to look smarter than you. If you’re an “A” player, you’ll hire “A+” players below you, because you want the best result.

7. Don’t care about being right. Care about succeeding – Jobs used this line in an interview after he was fired by Apple. If you have to steal others’ great ideas to make yours better, do it. You can’t be married to your vision of how a product is going to work out, such that you forget about current reality. When the Apple III came out, it was hot and warped its motherboard even though Jobs had insisted it would be quiet and sleek. If Jobs had stuck with Lisa, Apple would have never developed the Mac.

8. Find the most talented people to surround yourself with – There is a misconception that Apple is Steve Jobs. Everyone else in the company is a faceless minion working to please the all-seeing and all-knowing Jobs. In reality, Jobs has surrounded himself with talent: Phil Schiller, Jony Ive, Peter Oppenheimer, Tim Cook, the former head of stores Ron Johnson. These are all super-talented people who don’t get the credit they deserve. The fact that Apple’s stock price has been so strong since Jobs left as CEO is a credit to the strength of the team. Jobs has hired bad managerial talent before. John Sculley ended up firing Jobs and — according to Jobs — almost killing the company. Give credit to Jobs for learning from this mistake and realizing that he can’t do anything without great talent around him.

9. Stay hungry, stay foolish - Again from the end of Jobs’ memorable Stanford speech:

When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960′s, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.

Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: “Stay Hungry. Stay Foolish.” It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.

Stay Hungry. Stay Foolish.

10. Anything is possible through hard work, determination, and a sense of vision – Although he’s the greatest CEO ever and the father of the modern computer, at the end of the day, Steve Jobs is just a guy. He’s a husband, a father, a friend — like you and me. We can be just as special as he is — if we learn his lessons and start applying them in our lives. When Jobs returned to Apple in the 1990s, it was weeks away from bankruptcy. It’s now the biggest company in the world. Anything’s possible in life if you continue to follow the simple lessons laid out above.

May you change the world.

[At the time of publication, Jackson was long AAPL.]

Friday, October 07, 2011

Innovation Starvation

Innovation Starvation

By Neal Stephenson


My lifespan encompasses the era when the United States of America was capable of launching human beings into space. Some of my earliest memories are of sitting on a braided rug before a hulking black-and-white television, watching the early Gemini missions. This summer, at the age of 51—not even old—I watched on a flatscreen as the last Space Shuttle lifted off the pad. I have followed the dwindling of the space program with sadness, even bitterness. Where’s my donut-shaped space station? Where’s my ticket to Mars? Until recently, though, I have kept my feelings to myself. Space exploration has always had its detractors. To complain about its demise is to expose oneself to attack from those who have no sympathy that an affluent, middle-aged white American has not lived to see his boyhood fantasies fulfilled.

Still, I worry that our inability to match the achievements of the 1960s space program might be symptomatic of a general failure of our society to get big things done. My parents and grandparents witnessed the creation of the airplane, the automobile, nuclear energy, and the computer to name only a few. Scientists and engineers who came of age during the first half of the 20th century could look forward to building things that would solve age-old problems, transform the landscape, build the economy, and provide jobs for the burgeoning middle class that was the basis for our stable democracy.

The Deepwater Horizon oil spill of 2010 crystallized my feeling that we have lost our ability to get important things done. The OPEC oil shock was in 1973—almost 40 years ago. It was obvious then that it was crazy for the United States to let itself be held economic hostage to the kinds of countries where oil was being produced. It led to Jimmy Carter’s proposal for the development of an enormous synthetic fuels industry on American soil. Whatever one might think of the merits of the Carter presidency or of this particular proposal, it was, at least, a serious effort to come to grips with the problem.

Little has been heard in that vein since. We’ve been talking about wind farms, tidal power, and solar power for decades. Some progress has been made in those areas, but energy is still all about oil. In my city, Seattle, a 35-year-old plan to run a light rail line across Lake Washington is now being blocked by a citizen initiative. Thwarted or endlessly delayed in its efforts to build things, the city plods ahead with a project to paint bicycle lanes on the pavement of thoroughfares.

In early 2011, I participated in a conference called Future Tense, where I lamented the decline of the manned space program, then pivoted to energy, indicating that the real issue isn’t about rockets. It’s our far broader inability as a society to execute on the big stuff. I had, through some kind of blind luck, struck a nerve. The audience at Future Tense was more confident than I that science fiction [SF] had relevance—even utility—in addressing the problem. I heard two theories as to why:

1. The Inspiration Theory. SF inspires people to choose science and engineering as careers. This much is undoubtedly true, and somewhat obvious.

2. The Hieroglyph Theory. Good SF supplies a plausible, fully thought-out picture of an alternate reality in which some sort of compelling innovation has taken place. A good SF universe has a coherence and internal logic that makes sense to scientists and engineers. Examples include Isaac Asimov’s robots, Robert Heinlein’s rocket ships, and William Gibson’s cyberspace. As Jim Karkanias of Microsoft Research puts it, such icons serve as hieroglyphs—simple, recognizable symbols on whose significance everyone agrees.

Researchers and engineers have found themselves concentrating on more and more narrowly focused topics as science and technology have become more complex. A large technology company or lab might employ hundreds or thousands of persons, each of whom can address only a thin slice of the overall problem. Communication among them can become a mare’s nest of email threads and PowerPoints. The fondness that many such people have for SF reflects, in part, the usefulness of an over-arching narrative that supplies them and their colleagues with a shared vision. Coordinating their efforts through a command-and-control management system is a little like trying to run a modern economy out of a Politburo. Letting them work toward an agreed-on goal is something more like a free and largely self-coordinated market of ideas.

SPANNING THE AGES

SF has changed over the span of time I am talking about—from the 1950s (the era of the development of nuclear power, jet airplanes, the space race, and the computer) to now. Speaking broadly, the techno-optimism of the Golden Age of SF has given way to fiction written in a generally darker, more skeptical and ambiguous tone. I myself have tended to write a lot about hackers—trickster archetypes who exploit the arcane capabilities of complex systems devised by faceless others.

Believing we have all the technology we’ll ever need, we seek to draw attention to its destructive side effects. This seems foolish now that we find ourselves saddled with technologies like Japan’s ramshackle 1960’s-vintage reactors at Fukushima when we have the possibility of clean nuclear fusion on the horizon. The imperative to develop new technologies and implement them on a heroic scale no longer seems like the childish preoccupation of a few nerds with slide rules. It’s the only way for the human race to escape from its current predicaments. Too bad we’ve forgotten how to do it.

“You’re the ones who’ve been slacking off!” proclaims Michael Crow, president of Arizona State University (and one of the other speakers at Future Tense). He refers, of course, to SF writers. The scientists and engineers, he seems to be saying, are ready and looking for things to do. Time for the SF writers to start pulling their weight and supplying big visions that make sense. Hence the Hieroglyph project, an effort to produce an anthology of new SF that will be in some ways a conscious throwback to the practical techno-optimism of the Golden Age.

SPACEBORNE CIVILIZATIONS

China is frequently cited as a country now executing on Big Stuff, and there’s no doubt they are constructing dams, high-speed rail systems, and rockets at an extraordinary clip. But those are not fundamentally innovative. Their space program, like all other countries’ (including our own), is just parroting work that was done 50 years ago by the Soviets and the Americans. A truly innovative program would involve taking risks (and accepting failures) to pioneer some of the alternative space launch technologies that have been advanced by researchers all over the world during the decades dominated by rockets.

Imagine a factory mass-producing small vehicles, about as big and complicated as refrigerators, which roll off the end of an assembly line, are loaded with space-bound cargo, and topped off with non-polluting liquid hydrogen fuel, then exposed to intense concentrated heat from an array of ground-based lasers or microwave antennas. Heated to temperatures beyond what can be achieved through a chemical reaction, the hydrogen erupts from a nozzle on the base of the device and sends it rocketing into the air. Tracked through its flight by the lasers or microwaves, the vehicle soars into orbit, carrying a larger payload for its size than a chemical rocket could ever manage, but the complexity, expense, and jobs remain grounded. For decades, this has been the vision of such researchers as physicists Jordin Kare and Kevin Parkin. A similar idea, using a pulsed ground-based laser to blast propellant from the backside of a space vehicle, was being talked about by Arthur Kantrowitz, Freeman Dyson, and other eminent physicists in the early 1960s.

If that sounds too complicated, then consider the 2003 proposal of Geoff Landis and Vincent Denis to construct a 20-kilometer-high tower using simple steel trusses. Conventional rockets launched from its top would be able to carry twice as much payload as comparable ones launched from ground level. There is even abundant research, dating back to the late 19th century and Konstantin Tsiolkovsky, the father of astronautics, showing that a simple tether—a long rope, tumbling end-over-end while orbiting the earth—could be used to scoop payloads out of the upper atmosphere and haul them up into orbit without the need for engines of any kind. Energy would be pumped into the system using an electrodynamic process with no moving parts.

All are promising ideas—just the sort that used to get an earlier generation of scientists and engineers fired up about actually building something.

But to grasp just how far our current mindset is from being able to attempt innovation on such a scale, consider the fate of the space shuttle’s external tanks [ETs]. Dwarfing the vehicle itself, the ET was the largest and most prominent feature of the space shuttle as it stood on the pad. It remained attached to the shuttle—or perhaps it makes as much sense to say that the shuttle remained attached to it—long after the two strap-on boosters had fallen away. The ET and the shuttle remained connected all the way out of the atmosphere and into space. Only after the system had attained orbital velocity was the tank jettisoned and allowed to fall into the atmosphere, where it was destroyed on re-entry.

At a modest marginal cost, the ETs could have been kept in orbit indefinitely. The mass of the ET at separation, including residual propellants, was about twice that of the largest possible Shuttle payload. Not destroying them would have roughly tripled the total mass launched into orbit by the Shuttle. ETs could have been connected to build units that would have humbled today’s International Space Station. The residual oxygen and hydrogen sloshing around in them could have been combined to generate electricity and produce tons of water, a commodity that is vastly expensive and desirable in space. But in spite of hard work and passionate advocacy by space experts who wished to see the tanks put to use, NASA—for reasons both technical and political—sent each of them to fiery destruction in the atmosphere. Viewed as a parable, it has much to tell us about the difficulties of innovating in other spheres.

EXECUTING THE BIG STUFF

Innovation can’t happen without accepting the risk that it might fail. The vast and radical innovations of the mid-20th century took place in a world that, in retrospect, looks insanely dangerous and unstable. Possible outcomes that the modern mind identifies as serious risks might not have been taken seriously—supposing they were noticed at all—by people habituated to the Depression, the World Wars, and the Cold War, in times when seat belts, antibiotics, and many vaccines did not exist. Competition between the Western democracies and the communist powers obliged the former to push their scientists and engineers to the limits of what they could imagine and supplied a sort of safety net in the event that their initial efforts did not pay off. A grizzled NASA veteran once told me that the Apollo moon landings were communism’s greatest achievement.

In his recent book Adapt: Why Success Always Starts with Failure, Tim Harford outlines Charles Darwin’s discovery of a vast array of distinct species in the Galapagos Islands—a state of affairs that contrasts with the picture seen on large continents, where evolutionary experiments tend to get pulled back toward a sort of ecological consensus by interbreeding. “Galapagan isolation” vs. the “nervous corporate hierarchy” is the contrast staked out by Harford in assessing the ability of an organization to innovate.

Most people who work in corporations or academia have witnessed something like the following: A number of engineers are sitting together in a room, bouncing ideas off each other. Out of the discussion emerges a new concept that seems promising. Then some laptop-wielding person in the corner, having performed a quick Google search, announces that this “new” idea is, in fact, an old one—or at least vaguely similar—and has already been tried. Either it failed, or it succeeded. If it failed, then no manager who wants to keep his or her job will approve spending money trying to revive it. If it succeeded, then it’s patented and entry to the market is presumed to be unattainable, since the first people who thought of it will have “first-mover advantage” and will have created “barriers to entry.” The number of seemingly promising ideas that have been crushed in this way must number in the millions.

What if that person in the corner hadn’t been able to do a Google search? It might have required weeks of library research—a long and toilsome slog through many books, tracking down many references, some relevant, some not—to uncover evidence that the idea wasn’t entirely new. When the precedent was finally unearthed, it might not have seemed like such a direct precedent after all. There might be reasons why it would be worth taking a second crack at the idea, perhaps hybridizing it with innovations from other fields. Hence the virtues of Galapagan isolation.

The counterpart to Galapagan isolation is the struggle for survival on a large continent, where firmly established ecosystems tend to blur and swamp new adaptations. Jaron Lanier, a computer scientist, composer, visual artist, and author of the recent book You Are Not a Gadget: A Manifesto, has some insights about the unintended consequences of the Internet—the informational equivalent of a large continent—on our ability to take risks. In the pre-net era, managers were forced to make decisions based on what they knew to be limited information. Today, by contrast, data flows to managers in real time from countless sources that could not even be imagined a couple of generations ago, and powerful computers process, organize, and display the data in ways that are as far beyond the hand-drawn graph-paper plots of my youth as modern video games are beyond tic-tac-toe. In a world where decision-makers are so close to being omniscient, it’s easy to see risk as a quaint artifact of a primitive and dangerous past.

The illusion of eliminating uncertainty from corporate decision-making is not merely a question of management style or personal preference. In the legal environment that has developed around publicly traded corporations, managers are strongly discouraged from shouldering any risks that they know about—or, in the opinion of some future jury, should have known about—even if they have a hunch that the gamble might pay off in the long run. There is no such thing as “long run” in industries driven by the next quarterly report. The possibility of some innovation making money is just that—a mere possibility that will not have time to materialize before the subpoenas from minority shareholder lawsuits begin to roll in.

Today’s belief in ineluctable certainty is the true innovation-killer of our age. In this environment, the best an audacious manager can do is to develop small improvements to existing systems—climbing the hill, as it were, toward a local maximum, trimming fat, eking out the occasional tiny innovation—like city planners painting bicycle lanes on the streets as a gesture toward solving our energy problems. Any strategy that involves crossing a valley—accepting short-term losses to reach a higher hill in the distance—will soon be brought to a halt by the demands of a system that celebrates short-term gains and tolerates stagnation, but condemns anything else as failure. In short, a world where big stuff can never get done.

*****
*****

Neal Stephenson is the author of REAMDE, a techno-thriller published in September, as well as the three-volume historical epic “The Baroque Cycle” (Quicksilver, The Confusion, and The System of the World) and the novels Anathem, Cryptonomicon, The Diamond Age, Snow Crash, and Zodiac. He is also founder of Hieroglyph, a project of science fiction writers to depict future worlds in which BSGD (Big Stuff Gets Done).

(Downloadable PDFs of individual World Policy Journal articles can be purchased through SAGE.)

[Image: Marshall Hopkins]
| Tags: Big Stuff Done, innovation, Neal Stephenson, science fiction, space, stagnation, United States
