Posts about beta

Beta-think (and ending malaria)

The organizers of the End Malaria campaign are putting on one more push to sell the book they organized (at least $20 from every sale goes to buying nets), and so they suggested that the contributors post their essays as an enticement that you’ll want to read more like (or better than) this. Here’s my contribution on beta-think:

Voltaire was half right. “Le mieux est l’ennemi du bien,” he said: The best is the enemy of the good. The best is also the enemy of the better. Striving for perfection complicates and delays getting things done. Worse, the myth of perfection can shut off the process of improvement and the possibility of collaboration.

That myth of perfection is a byproduct of the industrial revolution and the efficiencies of mass production, distribution, and marketing. A product that takes a long time to design and produce is sold to a large market with a claim of perfection. Its manufacturer can’t have customers think otherwise. The distribution chain invests in large quantities of the product and can’t afford for it to be flawed. Mass marketing is spent to convince customers it is the best it can be. Thus perfection becomes our standard or at least our presumption. But perfection is delusion. Nothing and no one is perfect.

The modern cure to Voltaire’s paradox—and a gift of the digital age—is the beta: the unfinished and imperfect product or process that is opened up so customers can offer advice and improvements. Releasing a beta is a public act, an invitation to customers to help complete and improve it. It is an act of transparency and an admission of humility. It is also an act of generosity and trust, handing over a measure of control to others.

Google vice-president Marissa Mayer tells the story of the launch of Google News. It was near the end of the week. No online product is ever released on a Friday (if it breaks, your Saturday is ruined). So the team had just enough time before the weekend to add one more feature. They were debating whether to add a function to sort the news by date or by place. They never got past debating. Come Monday, Google News came out as a beta (and stayed a beta for three years). That afternoon, Mayer says, the team received 305 emails from users, 300 of which begged for sort by date.

By admitting they weren’t finished, the company heard from customers what to do next. “We make mistakes every time, every day,” Mayer confesses. “But if you launch things and iterate really quickly, people forget about those mistakes and have a lot of respect for how quickly you build the product up and make it better.” Beta is Google’s way of never having to say they’re sorry.

Beta-think can benefit more than technology products. I see betas coming from companies in fashion, restaurants, even chocolate and automotive. I wish we’d see more beta-think—more innovation, experimentation and risk—from government, but bureaucrats and politicians are loath to admit imperfection. I also wish that education would operate under beta-think, encouraging learning by failure rather than teaching to a test and a perfect score of right answers. Beta-think can change how we think as managers. It can even change marriage (so much for trying to find the perfect husband or fix all his imperfections).

Beta-think opens an enterprise to the surprising generosity of the public. Look at the value users build in Wikipedia, TripAdvisor, Yelp and other services they control. Beta-think improves an institution’s relationship with its public. Making errors—and confessing and correcting them quickly—will enhance rather than diminish credibility. Once the fear of imperfection is taken out of the equation, innovation can flourish. Look at how Zappos improved customer service by letting employees make their own decisions (and mistakes). What does beta-think do to competitiveness? How can you show your hand to your rivals? That depends on where you see your real value: in keeping secrets from customers or in building strong relationships of trust by listening to and collaborating with them. Beta-think also brings speed. Even perfectionist Apple released its iPhone aware that it was incomplete, promising missing pieces in future updates.

Here’s the wonderful irony of beta-think: It says that we can make what we do ever-better because we are never done, never satisfied, always seeking ways to improve by working in public.

This essay, too, is a beta. It’s not perfect. I’m not done with it. So please come to www.buzzmachine.com/beta-think and help make it better.

Beta-think and ending malaria

Amazon, Seth Godin’s Domino, and other good folks collaborated to come out with a book of essays whose proceeds go to buy mosquito nets to end malaria. My essay for End Malaria Day is actually on the topic of the next book I was going to do until I got all hopped up on publicness and privacy and wrote Public Parts. The essay is on beta-think. Here’s a snippet from the start:

* * *

Voltaire was half right. “Le mieux est l’ennemi du bien,” he said: The best is the enemy of the good. The best is also the enemy of the better. Striving for perfection complicates and delays getting things done. Worse, the myth of perfection can shut off the process of improvement and the possibility of collaboration.

That myth of perfection is a byproduct of the industrial revolution and the efficiencies of mass production, distribution, and marketing. A product that takes a long time to design and produce is sold to a large market with a claim of perfection. Its manufacturer can’t have customers think otherwise. The distribution chain invests in large quantities of the product and can’t afford for it to be flawed. Mass marketing is spent to convince customers it is the best it can be. Thus perfection becomes our standard or at least our presumption. But perfection is delusion. Nothing and no one is perfect.

The modern cure to Voltaire’s paradox—and a gift of the digital age—is the beta: the unfinished and imperfect product or process that is opened up so customers can offer advice and improvements. Releasing a beta is a public act, an invitation to customers to help complete and improve it. It is an act of transparency and an admission of humility. It is also an act of generosity and trust, handing over a measure of control to others.

Editing your customers

“Almost everything you see in Twitter today was invented by our users,” its creator, Jack Dorsey, said in this video (found via his investor, Fred Wilson). RT, #, @, & $ were conventions created by users that were then—sometimes reluctantly—embraced by Twitter, to Twitter’s benefit. Dorsey said it is the role of a company to edit its users.

Edit. His word. I’m ashamed that I haven’t been using it in this context, being an editor myself and writing about the need for companies to collaborate with their customers.

I have told editors at newspapers that, as aggregators and curators, they will begin to edit not just the work of their staffs but the creations of their publics. But that goes only so far; it sees the creations of that public as contributions to what we, as journalists, do. And that speaks only to media organizations. Dorsey talks about any company as editor.

I have also told companies—it was a key moral to the story in What Would Google Do?—that they should become platforms that enable others to take control of their fates and succeed.

Twitter is such a platform. As Dorsey said in the video, it constantly iterates and that enables it to take in the creations of users. Months ago, when I wished for a Twitter feature, Fred Wilson tweeted back that that’s what the independent developers and applications are for. Indeed, Twitter enabled developers to create not only features but businesses atop it. But then when Twitter bought or created its own versions of these features created by developers, it went into competition with those developers, on whom Twitter depended to improve—to complete, really—its service. That’s a new kind of channel conflict—competing with your co-creators—that companies will also have to figure out as they become not just producers but editors.

Anyway, I like Dorsey’s conception of company as editor because it requires openness—operating and developing in public; it assumes process over product; it values iteration; it implies collaboration with one’s public; it still maintains the company’s responsibility for quality. An editor has nothing to edit if others haven’t created anything, so it is in the editor’s interest to enable others to create. And the better the creations that public makes, the better off the editor is, so it’s also in the company-as-editor’s interest to improve what that public creates through better tools and, often, training and economic incentives.

Privacy, publicness & penises

Here is video of the talk I gave at re:publica 2010 in Berlin on The German Paradox: Privacy, publicness, and penises. (Don’t be frightened by the first moments in German; it’s just an introduction and a joke — with fire extinguisher — about how I had threatened to Hendrix my iPad on the stage in Berlin.)

My subject is all the more relevant given this week’s letter to Google from privacy czars in a handful of countries trying to argue that Google Street View taking pictures in public violates privacy. In my talk, I argue that what is public belongs to us, the public, and efforts to reduce what’s public steal from us. Journalists should be particularly protective of what is public; so should we all. (The czars also argued, amazingly, that Google shouldn’t release betas. They come, you see, from an old world of centralized control — and the myth that processes can be turned into products, finished, complete, even perfect — instead of the new world of openness and collaboration.)

With so much discussion — even panic — about privacy today, I fear that we risk losing the benefits of publicness, of the connections enabled by the internet and our interconnected world. If we shift to a default of private, we lose much, and I argue that we should weigh that choice when we decide what to put behind a wall — and there are too many walls being built today. But we’re not discussing the benefits of the public vs. the private. I want to spark that discussion.

I use Germany as a laboratory and illustration of the topic not only because I was there but because they have something nearing a cultural obsession with privacy. What’s true there is true elsewhere, including the U.S., though to a different degree. I also only skim the surface of the topic in this video; there is so much more to talk about: how publicness benefits the ways we can and now must do business; how it affects government; how it alters education; how it changes our relationships; how young people bring a new culture that cuts across all national boundaries and expectations; how it multiplies our knowledge; how it creates value; how it leads to a new set of ethics; and much more. But that’s for another time and medium.

In the talk, this all leads up to the Bill of Rights in Cyberspace, which is really about openness and protecting that.

At the end of my time on stage, I invited the room to continue the discussion next door in the sauna. Four guys did show up. Here’s the proof.

If you prefer, here are my slides with the audio of my talk and discussion, thanks to Slideshare:

The coverage of the talk in German media amazes me. It made the front page of three papers, drew coverage in more, plus a prime-time TV show and radio. Coverage included Welt Kompakt and Welt, Welt again, Berliner Zeitung, Berliner Zeitung again, Zeit Online covers the talk, then Zeit feels compelled to respond and start a reader-debate, Spiegel, the German press agency, the Evangelical News Service, Berliner Morgenpost, Berliner Morgenpost again, Bild, Tagesspiegel, taz, taz again, taz again, Berliner Kurier, Berliner Kurier again, 3sat, Süddeutsche Zeitung, BZ, Frankfurter Rundschau, business magazine WirtschaftsWoche, L’Express in France, ORF TV in Austria, and more than one blog. And today add der Freitag. A week later comes an interview in the Berliner Zeitung.

Coverage of my re:publica talk

And here is a slice of an illustration of my talk by AnnalenaSchiller.com (who tweeted beforehand about having to draw a penis for the first time in her talk-illustration career) that appeared in the German paper Der Freitag this week:

der Freitag re re:publica

Yet more: Here’s an interview with dctp.tv in Berlin that summarizes my views:

: LATER: Penelope Trunk, who lives in public, writes: “The value of your privacy is very little in the age of transparency and authenticity. Privacy is almost always a way of hiding things that don’t need hiding. . . . And transparency trumps privacy every time. So put your ideas in social media, not email.”

: AND: I just got a message on Facebook from the woman I talk about in the sauna in Davos, the one I said was an American freaked by the mixed, nude crowd of sweaty Russians and me. She thought it was quite funny … especially because she’s French (living in America).

iPad danger: app v. web, consumer v. creator

I tweeted earlier that after having slept with her (Ms. iPad), I woke up with morning-after regrets. She’s sweet and pretty but shallow and vapid.

Cute line, appropriate for retweets. But as my hangover settles in, I realize that there’s something much more basic and profound that worries me about the iPad — and not just the iPad but the architecture upon which it is built. I see danger in moving from the web to apps.

The iPad is retrograde. It tries to turn us back into an audience again. That is why media companies and advertisers are embracing it so fervently, because they think it returns us all to their good old days when we just consumed, we didn’t create, when they controlled our media experience and business models and we came to them. The most absurd, extreme illustration is Time Magazine’s app, which is essentially a PDF of the magazine (with the odd video snippet). It’s worse than the web: we can’t comment; we can’t remix; we can’t click out; we can’t link in, and they think this is worth $4.99 a week. But the pictures are pretty.

That’s what we keep hearing about the iPad as the justification for all its purposeful limitations: it’s meant for consumption, we’re told, not creation. We also hear, as in David Pogue’s review, that this is our grandma’s computer. That cant is inherently snobbish and insulting. It assumes grandma has nothing to say. But after 15 years of the web, we know she does. I’ve long said that the remote control, cable box, and VCR gave us control of the consumption of media; the internet gave us control of its creation. Pew says that a third of us create web content. But all of us comment on content, whether through email or across a Denny’s table. At one level or another, we all spread, react, remix, or create. Just not on the iPad.

The iPad’s architecture supports these limitations in a few ways:

First, in its hardware design, it does not include a camera — the easiest and in some ways most democratic means of creation (you don’t have to write well) — even though its smaller cousin, the iPhone, has one. Equally important, it does not include a simple (fucking) USB port, which means that I can’t bring in and take out content easily. If I want to edit a document in Apple’s Pages, I have to go through many hoops of moving and syncing and emailing or using Apple’s own services. Cloud? I see no cloud, just Apple’s blue skies. Why no USB? Well, I can only imagine that Apple doesn’t want us to think what Walt Mossberg did in his review — the polar opposite of Pogue’s — that this pad could replace its more expensive laptops. The iPad is purposely handicapped, but it doesn’t need to be. See the German WePad, which comes with USB port(s!), a camera, multitasking, and the more open Android operating system and marketplace.

Second, the iPad is built on apps. So are phones, Apple’s and others’. Apps can be wonderful things because they are built to a purpose. I’m not anti-app, let’s be clear. But I also want to stop and examine the impact of shifting from a page- and site-based internet to one built on apps. I’ve been arguing that we are, indeed, moving past a page-, site-, and search-based web to one also built on streams and flows, to a distributed web where you can’t expect people to come to you but you must go to them; you must get yourself into their streams. This shift to apps is a move in precisely the opposite direction. Apps are more closed, contained, controlling. That, again, is why media companies like them. But they don’t interoperate — they don’t play well — with other apps and with the web itself; they are hostile to links and search. What we do in apps is less open to the world. I just want to consider the consequences.

So I see the iPad as a Bizarro Trojan Horse. Instead of importing soldiers into the kingdom to break down its walls, in this horse, we, the people, are stuffed inside and wheeled into the old walls; the gate is shut and we’re welcomed back into the kingdom of controlling media that we left almost a generation ago.

There are alternatives. I now see the battle between Apple and Google Android in clearer focus. At Davos, Eric Schmidt said that phones (and he saw the iPad as just a big phone… which it is, just without the phone and a few other things) will be defined by their apps. The mobile (that is to say, constantly connected) war will be won on apps. Google is competing with openness, Apple with control; Google will have countless manufacturers and brands spreading its OS, Apple will have media and fanboys (including me) do the work for it.

But Google has a long way to go if it hopes to win this war. I’m using my Nexus One phone (which I also had morning-after doubts about) and generally liking it but I still find it awkward. Google has lost its way, its devotion to profound simplicity. Google Wave and Buzz are confusing and generally unusable messes; Android needed to be thought through more (I shouldn’t have to think about what a button does in this use case before using it); Google Docs could be more elegant; YouTube’s redesign is halfway to clean. Still, Google and Apple’s competition presents us with choices.

I find it interesting that though many commercial brands — from Amazon to Bank of America to Fandango — have written for both Apple and Android, many media brands — most notably The New York Times and my Guardian — have written only for Apple, and they are now devoting much of their resources to recreating apps for the iPad. The audience on Android is bigger than the audience on iPad, but the sexiness and control Apple offers is alluring. This, I think, is why Salon CEO Richard Gingras calls the iPad a fatal distraction for publishers. They are deluding themselves into thinking that the future lies in their past.

On This Week in Google last night, I went too far slathering over the iPad and some of its very neat apps (ABC’s is great; I watched the Modern Family episode about the iPad on the iPad and smugly loved being so meta). I am a toy boy at heart and didn’t stop to cast a critical eye, as TWiG’s iPadless Gina Trapani did. This morning on Twitter, in compensation, I went too far the other way, kvetching about the inconveniences of the iPad’s limitations (just a fucking USB, please!). That’s the problem with Twitter, at least for my readers: it’s thinking out loud.

I’ll sleep with the iPad a few more nights. I might well rebox and return it; I don’t have $500 to throw away. But considering what I do for a living, I perhaps should hold onto it so I can understand its implications. And that’s the real point of this post: there are implications.

: MORE: Of course, I must link to Cory Doctorow’s eloquent examination of the infantilization of technology. I’m not quite as principled, I guess, as Cory is on the topic; I’m not telling people they should not buy the iPad; I don’t much like that verb in any context. But on the merits and demerits, we agree.

And Dave Winer: “Today it’s something to play with, not something to use. That’s the kind way to say it. The direct way: It’s a toy.”

: By the way, back in the day, about a decade ago, I worked with Intel (through my employer, Advance) on a web pad that was meant to be used to consume in the home (we knew then that the on-screen keyboard sucked; it was meant to be a couch satellite to the desk’s PC). Intel lost its nerve and didn’t launch it. Besides, the technology was early (they built the wireless on Intel AnyPoint, not wi-fi or even bluetooth). Here’s the pad in the flesh. I have it in my basement museum of dead technology, next to my CueCat.

: More, Monday: NPR’s related report and Jonathan Zittrain’s worries.