
Review: What Would Google Do?

July 20, 2009

I had a hard time focusing while reading Jeff Jarvis’ What Would Google Do? I found myself reading and re-reading paragraphs because I had gotten distracted.

And that was a good thing.

I wasn’t distracted because the book was boring — I was distracted because my mind was racing around thinking of ways to apply the ideas I was reading to my companies and my own blog. I was distracted because the book was working.

By now, many of the ideas in the book have been circulated, discussed, rehashed and remodeled in cyberspace, as Jarvis intended, and in support of the book’s thesis (you can see some of that discussion at his always-interesting blog). Therefore, I won’t spend too much time summarizing.

The first half of the book outlines the major tenets of a “Googly” company — giving control to (and trusting) your customers, being transparent, making mistakes in the right way, simplifying, etc. The second half of the book shows the results of brainstorming sessions, suggesting ideas for how those tenets could be applied in different industries and institutions — journalism, book publishing, airlines, restaurants, universities, governments.

Jarvis might put a little too much faith in how successful some of those second-half suggestions might be. The mass market isn’t quite dead yet. And some of his suggestions seem like stretches. For example, it seems to me that very few people want to (or have time to) custom-order soft drinks and discuss them with other soft-drink fans. However, to pick at his specific suggestions (in this post, at least) would be to miss the point of that part of the book — there are no wrong ideas in brainstorming. The details can be hashed out later, and some ideas will be discarded or tweaked. But this isn’t really a “how to change your industry” book. It’s a “how to change your thinking” book.

My biggest worry is that Jarvis often seems to misinterpret basic economic concepts. An increased supply of information and decreased barriers to entry do not change the economic definition of “scarcity” — resources, even ones not dragged down by “atoms,” will always be limited in some way. What Jarvis calls “scarcity” actually isn’t. Moving from a near-monopolistic model (e.g. newspaper classifieds) to a competitive one (e.g. Internet classifieds, Craigslist) does not “challenge” the law of supply and demand – in fact, it puts it into practice. Most people won’t be bothered by these little things. They don’t change the value of Jarvis’ core principles or even the validity of his conclusions. It worries me, though, that people are repeating things like “post-scarcity economy” — to me, it demonstrates a lack of understanding.

But if that — basically the content of a few short pages — is my biggest complaint, you had better believe that I would recommend that you read this book. There are some exciting ideas in here. More importantly, there are exciting ways at arriving at new ideas.

And rest assured, Google would never advocate wearing an ugly bracelet emblazoned with an acronym.

Our modern reading habits

July 19, 2009

A quote from What Would Google Do? by Jeff Jarvis that I, and I’m sure many of you, can identify with:

So though I do fret about the unread books on my shelves and the virgin New Yorker magazines on my desk — as well as a constant stock of unread tabs in my browser — I also know that I learn volumes online every day. Is what I do now better or worse? I’m not sure that judgment is meaningful. I learn differently, discuss differently, see differently, think differently. Thinking differently is the key product and skill of the Google age.

I hope to have a full review of the book up tomorrow. (UPDATE July 20: Here it is.)

Nitpicking will bring us together

July 18, 2009

Yesterday I wrote about feeling bad for ripping on movie adaptations. But I realized something while listening to my coworkers talk about Harry Potter and the Half-Blood Prince: For a lot of people, the nitpicking is one of the best parts.

Obviously, we say things like, “It doesn’t hold a candle to the book,” because, in our minds, it doesn’t. But why do we feel the need to say it out loud? Isn’t it enough to think it? You know what I’m going to say — it’s about social order and signaling. By discussing a movie in relation to the source material, you’re signaling that:

  1. You read.
  2. You read things that other people thought were good enough to make into movies.
  3. You care enough about the book to care if it’s adapted well.

And by signaling these things, you’re able to find other people who do, too. And you know you’re not alone in thinking that Keira Knightley was the wrong person to play Elizabeth Bennet or that Gambit deserved more screen time. You can form a network around that idea, and you feel a sense of belonging discussing it.

This might seem obvious. And it might be. But it’s funny how, when two people discuss the same thing, one can be apologetic and the other proud – sometimes for the same reason.

So maybe I’m wrong to try to keep mum. Maybe I shouldn’t be sorry for going over movie adaptations with a fine-tooth comb. In moderation, of course.

Is that you, Holmes?

July 17, 2009

Yesterday, I once again saw the trailer for the upcoming Sherlock Holmes movie.

I hate to be that person who complains about movie adaptations, so I try to forget my prejudices and to judge movies on their own merits, separately from their source material. I must admit that I often fail, but I do, at least, try to keep the whining to a minimum.

But this is Sherlock Holmes. Obviously, it’s totally possible (even probable) that the trailer is not a faithful representation of the movie. But what’s in the preview bears so little resemblance to the original stories of Sir Arthur Conan Doyle that I have to wonder if the writers and director have read any of his work at all.

If your adaptation is going to be this loose, why even put the name of the original franchise on it? I suppose that a familiar name draws people in. But it also gives people a false impression of what they’ll be seeing. No one likes to be misled.

I will, of course, probably end up seeing it anyway. Maybe that’s the real mistake.

The changing signals of what we read

July 12, 2009

Vanity Fair‘s August issue ponders how — in an age when more and more of our books, music and videos are on hard drives instead of in our hands and on our shelves — we will preen our plumage to impress the culture snobs:

Books not only furnish a room, to paraphrase the title of an Anthony Powell novel, but also accessorize our outfits. They help brand our identities. At the rate technology is progressing, however, we may eventually be traipsing around culturally nude in an urban rain forest, androids seamlessly integrated with our devices.

It reminds me of this reader’s tip at Marginal Revolution from a few months ago:

Reading with a Kindle, the signal is relatively constant and, at the moment, is something like “I’m an early technology adopter and I like to read”. As the Kindle gets more commonplace the efficacy of this signal will, I think, diminish. Compare this with the signaling effects of reading a traditional book, where you signal to people not only that you like to read, but crucially what you are reading.

I wonder if Kindle advocates are underestimating how important it is for people to show those around them not just that they like to read, but also what they like to read.

I like to think that I am not ashamed of anything I do (I still firmly believe I shouldn’t be). But there are books I will only read in the privacy of my own home. There are Web sites I won’t visit at work in case someone happens to be glancing over my shoulder. There are books that I flaunt when I read them in public, and there are books that I feel like I should flaunt, but I feel like a pretentious show-off even reading them inconspicuously.

As those signals are diminished by Kindles and the like, will what we read more closely match our true interests most of the time?

But as some signals diminish, others grow stronger. Thanks to the Internet, we can choose to share the articles that we read and sites that we visit, be it through Google Reader, StumbleUpon, Digg, Twitter — we still filter those, and they still reflect our ideal selves. We have the ability to tell people what music we’re listening to, what movies we’re watching, what food we’re eating, even when we’re not in their direct lines of sight (Note: We can also choose not to share these things, and that’s often a wise move). They’re a different audience than that on the subway, but certainly not a smaller or less important one.

Signals may change, but they are alive and well. Thoughts?

Hofstadter’s Law and the curse of absolutes

July 10, 2009

After I saw someone use the word “hofstadterially” today, my habit of looking things up led me to Hofstadter’s Law.

The law, defined by cognitive scientist Douglas Hofstadter, states:

It always takes longer than you expect, even when you take into account Hofstadter’s Law.

Oh dear.

The first part is often true. But the really interesting part of this law is the second part — the reference to itself. It turns the adage into a curse.

What it says is that if you know Hofstadter’s Law, you will factor it into your plans for completing a task, and therefore the task will take even longer. And it’s cyclical. If we accept the law as true, people with knowledge of the law are doomed to work on a single task forever.
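Taken literally, the curse looks like a recursion with no fixed point: each level of awareness of the law pads the estimate, and the law says the padded estimate is still too low. Here is a toy sketch of that spiral (the 50% padding factor and the function are made up for illustration, not anything Hofstadter wrote):

```python
def estimate(task_hours, awareness_depth):
    """A naive estimate, inflated once per level of awareness of the law."""
    if awareness_depth == 0:
        # Blissful ignorance: take the raw estimate at face value.
        return task_hours
    # Knowing the law, we pad the estimate by 50% --
    # but the law applies to the padded estimate, too.
    return 1.5 * estimate(task_hours, awareness_depth - 1)

print(estimate(10, 0))  # 10: the unaware estimate
print(estimate(10, 3))  # 33.75: three levels of awareness, and still "too low"
```

The deeper your awareness, the longer the estimate grows, without ever converging — which is the joke, and the curse.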

Ah, the danger of absolutes.

So, uh, a big thanks to my possibly-jinxed readers.

Self-imposed social media rules

July 7, 2009

There are de jure social media rules that come from technological constraints or company policies. There are social media norms that sometimes differ among social circles.

And then there are self-imposed social media rules. For better or for worse: self-censorship. Over time, I have crafted and adapted my own code of conduct. Here are some of the rules and guidelines I’ve given myself:

  • I try to wait at least 15 minutes between tweets on Twitter, usually longer. Replies don’t count, as only people following both parties receive them.  (I’m still probably clogging the feeds of my friends who follow 12 people, but I have decided that’s their problem.)
  • I don’t promote every blog post I write, only the ones I think are interesting or useful to a particular audience. Corollary: If I do promote it, I only promote it twice (usually once on Twitter and once on Facebook), because more than that turns my FriendFeed into a steady stream of annoying self-promotion.
  • If one of my friends has already shared an item in Google Reader, I don’t share it again in Google Reader. (That doesn’t necessarily stop me from sharing it elsewhere.)
  • I don’t write cryptic statuses, updates or away messages. If I can’t be specific, I don’t say it. (I suppose the exception would be song lyrics, but those usually just mean the song is stuck in my head. Interpret it as you will.)
  • I don’t invite people to applications on Facebook.
  • I don’t LOL. I just don’t use it, unless it’s in a mocking way. It doesn’t usually bother me when other people do, though some do abuse it. I typically write “haha” instead, which really isn’t better or even different, but for some reason I think it will make people take me more seriously.

So, am I the only one neurotic enough to worry about things like this? Do you have any self-imposed social media rules? How much are they influenced by other people’s standard practices? How much are they influenced by how you want people to see you?