Reinventing the Newsroom

What Wired Left Out of Its Web Eulogy

Posted in Cultural Change, Digital Experiments, Fun With Metaphors, iPad, Social Search by reinventingthenewsroom on August 19, 2010

This post originally appeared at Nieman Journalism Lab.

Maybe you heard: The web has been declared dead, and everybody’s mad about it.

I’ll get to checking the web’s vital signs in a moment, but one thing is clear: The hype and hucksterism of packaging, promoting, and presenting magazine articles is very much alive. I found Chris Anderson’s Wired article and Michael Wolff’s sidebar pretty nuanced and consistently interesting, which made for an awkward fit with the blaring headlines and full-bore PR push.

But looking past this annoyance, Anderson’s article makes a number of solid points — some I hadn’t thought of and some that are useful reminders of how much things have changed in the past few years. (For further reading, The Atlantic’s Alexis Madrigal has a terrific take on why the model of continuous technological revolution and replacement isn’t really correct and doesn’t serve us well, and Boing Boing nails why the graphic included in the Wired package is misleading.)

Still, Anderson almost lost me at hello. Yes, I like to use my iPad for email — and I frequently check out Facebook, Twitter, and The New York Times on it. But for the latter three, I don’t use apps but the browser itself (in my case, AtomicWeb). As I’ve written before, so far the iPad’s killer app is the browser — more specifically, the chance to have a speedy, readable web experience that doesn’t require you to peer at a tiny screen or sit down in front of a laptop or desktop. So going by Anderson’s own opening examples, the web isn’t dead for me — better to say that apps are in the NICU.

But I couldn’t argue with this: “Over the past few years, one of the most important shifts in the digital world has been the move from the wide-open web to semi-closed platforms that use the Internet for transport but not the browser for display.” That’s absolutely correct, as is Anderson’s observation that this many-platform state of affairs is “the world that consumers are increasingly choosing, not because they’re rejecting the idea of the web but because these dedicated platforms often just work better or fit better into their lives (the screen comes to them, they don’t have to go to the screen).”

That not-going-to-the-screen is critical, and — again — a big reason that the iPad has been a hit. But as my iPad habits show, that doesn’t necessarily imply a substitution of apps for the web. Nor, as Anderson himself notes, are such substitutions really a rejection of the web. It would have been less compelling but more accurate to say that the web isn’t dying but being joined by a lot of other contact points between the user and the sea of digital information, with points emerging for different settings, situations, and times of day. Sometimes a contact point is a different presentation of the web, and sometimes it’s something else entirely.

It’s also interesting to ask whether users of various devices care — and whether they should. Anderson brings up push technology and, with it, PointCast, a name that made me shudder reflexively. A long time ago, WSJ.com (like most every media company of the time) became infatuated with push, going as far as to appoint a full-time editor for it. It was tedious and horrible, a technology in search of an audience, and our entire newsroom was thrilled when the spell was broken and the damn thing went away. But Anderson notes that while PointCast didn’t work, push sure did. Push is now so ubiquitous that we only notice its absence: When I’m outside the U.S. and have to turn off push notifications to my phone, I have the same in-limbo feeling I used to get when I was away from my computer for a couple of days.

The problem with the first incarnation of push was that the only contact point was the computer screen, meaning information often wasn’t pushed close enough to you, or was being pushed down the same pipe you were trying to use for something else. Now, information is pushed to the web — and to smartphones and tablets and game consoles and social networks and everything else — and push has vanished into the fabric of How Things Are.

Generally, I think the same is true of the web vs. other methods of digital interaction — which is why the over-hyped delivery of the Wired article seemed so unfortunate. There isn’t a zero-sum game between the web and other ways of presenting information to customers — they all have their role in consumers’ lives, and increasingly form a spectrum to be tapped into as people choose. Even if apps and other methods of accessing and presenting that information take more parts of that spectrum away from the open web, I doubt content companies, telcos, or anybody else will kill the open web or even do it much damage.

Frankly, both Anderson and Wolff do a good job of showing how adherence to the idea of the open web has calcified into dogma. Before the iPad appeared, there was a lot of chatter about closed systems that I found elitist and tiresome, with people who ought to know better dismissing those who don’t want to tinker with settings or create content as fools or sheep. Near the end of his article, Anderson seems to briefly fall into this same trap, writing that “an entire generation has grown up in front of a browser. The exploration of a new world has turned into business as usual. We get the web. It’s part of our life. And we just want to use the services that make our life better. Our appetite for discovery slows as our familiarity with the status quo grows. Blame human nature. As much as we intellectually appreciate openness, at the end of the day we favor the easiest path.”

That’s smart, except for the “blame human nature” part. Of course we favor the easiest path. The easiest path to doing something you want to do has a lot to recommend it — particularly if it’s something you do every day! I’m writing this blog post — creating something — using open web tools. Since this post is getting kinda long, I might prefer to read it on my iPad, closed system and all. The two co-exist perfectly happily. Ultimately, the web, mobile and otherwise, will blend in consumers’ minds, with the distinction between the web and other ways of accessing digital information of interest only to those who remember when such distinctions mattered and/or who have to dig into systems’ technological guts. There’s nothing wrong with that blending at all — frankly, it would be a little disappointing if we stayed so technologically silo’ed that these things remained separate.

Even if “big content” flows through delivery methods that are less open and more controlled, anybody with bandwidth will still be able to create marvelous things on the open web using an amazing selection of free tools. As various technological kinks are worked out, traffic and attention will flow seamlessly among the various ways of accessing digital information. And social search and discovery will increasingly counteract industrial search and discovery, providing alternate ways of finding and sharing content beyond algorithms that reward popularity and scale. People who create good content (as well as a lot of content that’s ephemeral but amusing or diverting) will still find themselves with an audience, ensuring a steady flow of unlikely YouTube hits, Twitter phenomena, and hot blogs. The web isn’t dead — it’s just finding its niche. But that niche is pretty huge. The web will remain vigorous and important, while apps and mobile notifications and social networks grow in importance alongside it.

Where Does Brand Fragmentation End?

Posted in Branding, Paid Content, Social Search, The And World by reinventingthenewsroom on June 30, 2010

This post originally appeared at Nieman Journalism Lab.

Earlier this week Gawker’s Hamilton Nolan wrote that Rolling Stone has little hope of capitalizing on the notoriety of Michael Hastings’ profile of Gen. Stanley McChrystal to increase newsstand sales and drive more subscriptions. As Nolan writes, “[w]hereas once people would have rushed out to newsstands to pick up copies of Rolling Stone and read what all [the] fuss was about with McChrystal, now they either A) read that one single story on RS’s website, for free, or B) read it at the competition’s website for free, which is what happened in this case.” (Rolling Stone’s inability to get its own story online in a timely fashion remains frankly mind-boggling.)

Nolan argues that Rolling Stone, Esquire and Vanity Fair put out stories as good as those found in The New Yorker or The Atlantic, but magazines in the former group aren’t taken seriously as a whole because their good stuff is mixed in with so much fluff. He calls this “Good Stories, Bad Magazine Syndrome,” and laments that Rolling Stone and other sufferers “will never put out enough of those stories to make the types of people who care about those stories seriously consider reading the magazine on a regular basis.”

Good point, but Nolan isn’t really talking about the puzzle of how you brand a combination of get-everybody-talking journalism and cotton-candy features. He’s discussing a much larger problem:

Everyone knows that you don’t need to subscribe to Rolling Stone in order to read the five great stories they publish every year; just wait until you hear those stories mentioned elsewhere and check in then…The internet has split each and every story from every outlet into its own discrete item. Unless your publication is consistent enough to somehow pull all of these separate links into a coherent whole, you’ll never be a destination, per se. You’re just hosting writers and writing checks.

Nolan comes face-to-face with that problem, but I think he blinked. Because what if consistency isn’t enough? What happens to news organizations as we know them if this atomization of content is so thorough and irreversible that no publication can pull its discrete articles into a coherent whole? Without coherent brands, will any publication host writers and write checks?

In the months after I went freelance, I talked with a few organizations about potential newsroom jobs. During the first couple such conversations, I apologized for having read plenty of articles from Publication X without being familiar with its site, explaining that I mostly read individual articles that found their way to me. Later, I quit apologizing — because this is increasingly the reality of how more and more of us read. Among general-interest publications, I read The Atlantic and The New Yorker because they still show up at the house in print. I skim The New York Times because it’s the closest thing I have to a hometown paper, which is either nostalgia or dangerously close to it. For me, every other brand has been blown to fragments that arrive sifted by Twitter and Facebook, or are turned up by search. The future may belong to “bottom-up” brands designed to be encountered in bits and pieces — the home pages of companies such as Demand Media, About.com and YouTube are rarely glimpsed and for all intents and purposes irrelevant.

As the fragmentation of content continues, the importance of traditional brands’ section pages and home pages will continue to wane — which newsroom middle managers will find a lot more frightening than readers will. Section and home pages aggregate news for readers, yes, but readers are increasingly doing that themselves through personalization, or trusting their peers to do it for them. Too often, home pages are committee-built disasters anyway — a cacophony of news, features and corporate messaging from every internal constituency too big to be ignored. Readers, relentlessly trained to hunt for signal, rightly dismiss them as noise. When he was consulting for the Guardian, TBD.com’s Jim Brady shut down the Guardian America front page, explaining to PaidContent’s David Kaplan that “you’re better off putting your stories on Twitter and posting them on Digg and Facebook and pitching them to blogs that can move a lot of traffic, than posting them on a section front that’s getting no traffic anyway. One of the things I pushed for was that you have to get away from the idea of getting people to simply come to your home page. You have to get your home page to the people.”

If destination sites crumble, how do the bills get paid? I believe that people will pay for content [disclosure: I’m a consultant for Journalism Online], but paywalls and meters limited to a single site may be short-term solutions, because they’re ideas that spring from the old model of large brands and destination sites. Ultimately, what we may need is not paywalls but paytags — bits of code that accompany individual articles or features, and that allow them to be paid for. MTV’s Maya Baratz is ahead of the curve here, urging publishers to think of their products not as platforms, but as apps — which to Baratz means “not only allowing, but thriving off of, having your content live elsewhere.” But between wallet friction and the penny gap, the mechanics of paytags make paywalls and single-site meters look like comparatively simple problems to solve.
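The post doesn’t specify how a paytag would actually work, so here is one minimal sketch of the idea: metadata that travels with an article and lets any host page build a payment request, wherever the article is embedded. Every name here — the fields, the endpoint, the prices — is a hypothetical illustration, not a real system.

```python
from dataclasses import dataclass

@dataclass
class PayTag:
    """Hypothetical metadata attached to a single article."""
    article_id: str
    publisher: str
    price_cents: int
    payment_endpoint: str  # where a reader's client would settle the charge

def payment_request(tag: PayTag, reader_id: str) -> dict:
    """Build the charge a host page would submit, wherever the article lives."""
    return {
        "endpoint": tag.payment_endpoint,
        "charge": {
            "reader": reader_id,
            "publisher": tag.publisher,
            "article": tag.article_id,
            "amount_cents": tag.price_cents,
        },
    }

# The same tag works on the publisher's site, a blog embed, or an aggregator.
tag = PayTag("rs-mcchrystal-2010", "Rolling Stone", 25,
             "https://pay.example.com/charge")
req = payment_request(tag, "reader-123")
```

The point of the sketch is that the article, not the destination site, carries the billing information — which is exactly why wallet friction and the penny gap loom so large: every embed becomes a separate micro-transaction.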

As readers, we understand that publications have been atomized — our own habits increasingly show us that every day. But publishers need to face the consequences of what that means. And that won’t be easy: Their entire world, from planning to production to distribution to promotion to how to get people to pay for it, is built around a fundamentally different set of organizing principles. What if those organizing principles are already obsolete?


Exploring the Myth of the Average Blogger

Posted in Cultural Change, Social Media, Social Search by reinventingthenewsroom on April 19, 2010

David Eaves has a terrific post up about what he sees as myths held up by “old media” about new media. It’s well worth reading for seeing all the different places we’re talking past each other and the intersections where fear, uncertainty and doubt are choking off brighter possibilities.

What really jumped out at me was the first myth Eaves tackled — the myth of the “average blogger.” As he sees it, print journalists think they are competing against the average quality of online content, and when they see that most of that content is frankly poor, they are lulled into a false sense of security. In Missing the Link, a collaboration with Taylor Owen, Eaves wrote that “those in the print media who dismiss online writing because of its low average quality are missing an important point. No one reads the average blog.”

This is a critical insight, and I agree with Eaves that it leads to all sorts of misapprehensions. He cites a false sense of security. To that, I would add a retinue of sins imagined and overblown: Bloggers making errors that go uncorrected, Web writers driven by antisocial behavior and personal animosity, and the idea that these peddlers of hateful, subpar content are leading legions of readers astray.

The world has profoundly changed. Not so long ago, gatekeepers determined what would be published in newspapers, magazines and books, and if you didn’t pass muster with those gatekeepers, it was very difficult to reach an audience of any real size. This didn’t ensure that all content was excellent, or even good — insert the name of whatever trashy novel you thought foretold the death of literature here — but it did have the effect of confining a lot of dreck to fliers and mimeographs. Today, it’s child’s play to publish, and anyone who publishes has a huge potential audience awaiting them.

But the key word there isn’t huge or audience — it’s potential. As Eaves notes and many an eager new blogger has discovered to his or her dismay, no one reads the average blog. Print media aren’t competing against the average blog, but against the best ones. Writers of average blogs have discovered a hard truth: Publication does not guarantee an audience, and the existence of something online does not mean anyone is reading it. The failure to grasp this drives a lot of the hand-wringing about blogs and the Web.

One reason this misapprehension is hard to shake is basic human nature: We always think our enemies are united, powerful and implacable, when in fact most of them are every bit as divided, inefficient and careworn as we are. But as I’ve written before, another reason has to do with the way search works online.

In the physical world, commonly accepted information that a lot of people consume is easy to find, while obscure or problematic information is hard to find. But online, it’s all one. If what you search for is out there, you will find it very quickly, no matter how wrongheaded or cruel or otherwise flawed it is. And this instant response leads us to an error borrowed from our real-world experience: Because we quickly find what we’re looking for, we assume many other people are looking for that information too, and are reading it. (Particularly when it’s something erroneous or cruel about ourselves or something we care about.) But this isn’t necessarily true: It’s often the case that no one is reading that information at all. It is only our search that fit the lock that plucked it momentarily out of obscurity.

You can now do an end run around the old gatekeepers, but people don’t have substantially more time to consume information than they ever did. And so there are new gatekeepers springing up everywhere, to reduce the torrent of information to a manageable flow. Readers make use of technological tools to help them filter information. With the rise of the social Web, they are increasingly able to make the collective judgment of their peers serve as filters and gatekeepers. And journalists and other experts have an invaluable role to play here as well, curating information and bringing the good stuff to a wider audience. The gatekeepers now operate downstream of publication, but they still exist. If anything, their roles are more important than ever.


That Pew Report — and Other Monday Reads

Posted in Going Local, Social Media, Social Search, The And World by reinventingthenewsroom on March 1, 2010

There’s a new report out from Pew’s Project for Excellence in Journalism, and it’s a pretty fascinating snapshot of American news consumers and their habits. Nieman Lab has a good overview here, as does Pew itself.

Quick reaction: I found the report an interesting confirmation of how quickly news consumption is changing. Consider the following:

  • 92% of respondents use multiple platforms to get their news
  • 56% say they follow news all or most of the day
  • 37% say they’ve helped create news, commented on it or shared it

That’s a sea change — the old print-only, brand-loyal news consumer transformed into one who’s often looking for news, getting it from multiple sources and on multiple platforms, and then doing something with it if they aren’t creating it themselves.

A couple of points made me yearn for further exploration:

  • Some examining the study’s conclusions have noted that just 2% of respondents rely on the Internet exclusively for news, but I think that’s less surprising than it seems at first glance — few people are that absolutist in their consumption habits. I still get news from print sources and television, so I’d fall into the 98% category, but my print and TV consumption of news is a rounding error compared to what it was even five years ago.
  • I was interested that 57% of respondents said they relied on just two to five Web sites for their news, suggesting that while news consumers graze, they may not graze very far afield. But if I were a publisher, I’d want more information before I drew a conclusion from that. For instance, I’d want to know if that answer takes into account material people read through email sharing and social networking, which could bring many more sources into the mix.
  • When asked what they wanted more coverage of, respondents’ top choice was “science and discoveries,” at 44%. Bringing up the rear was local, at 38%. But when you look at the methodology, the gap falls within the survey’s margin of error — those numbers are essentially the same.
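Why a 44%-versus-38% gap can be statistically meaningless is simple arithmetic. A rough sketch, using a hypothetical subsample size (the post doesn’t give Pew’s n for this question), shows the two 95% confidence intervals overlapping:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of a 95% confidence interval for a sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

n = 1000  # assumed subsample size, for illustration only
science, local = 0.44, 0.38
moe = margin_of_error(0.5, n)  # worst-case margin, at p = 0.5

# If the intervals overlap, the six-point gap could be sampling noise.
overlap = (science - moe) < (local + moe)
```

With n = 1000 the margin works out to about 3.1 points each way, so the interval around 44% reaches down past the interval around 38% reaching up — exactly the “essentially the same” situation the methodology implies.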

Finally, the report clearly shows some opportunities for publishers.

  • The most popular online news subject is the weather (81%). As a callow journalist, I complained loudly about having to do weather stories. As a manager at an online news site, I never thought twice about our weather offerings. As a reader, I concur with the poll: Checking the weather is what I do the most. Moreover, I’d love to have more information about what the weather is doing, beyond forecasts. Case in point: My home just got socked by the weekend’s snowicane, and I found it very difficult to get an in-depth explanation of what this weird storm was and why it was doing what it did. The best explanation I found came from the Baltimore Sun, which has a wonderful weather blog by reporter Frank Roylance. Given what Pew has found and what our own experiences will tell us, why are weather blogs so rare?
  • Pew found that 23% of social-networking users follow news organizations or individual journalists within social-networking sites. That isn’t taking into account social-networking users who interact with the news through a degree of separation by reading what friends and peers pass along, which is a much higher percentage. These people are specifically making news organizations or individual journalists part of their social-networking habits. Next time a publisher wonders about the value of social networking, there’s a stat for them.
  • 70% of Pew’s respondents agreed that the amount of news and information out there available from different sources is overwhelming. There’s the case for curation and being a trustworthy gateway right there.

A couple of quick notes about other things:

My latest EidosMedia Web chat with my friend David Baker is available here. This time around, David and I are chatting about mobile strategy for news organizations. By the way, my thanks to David and Massimiliano Iannotta for their help with RTN’s recent redesign. (That cool image header is David’s doing as well.)

Over at the National Sports Journalism Center, I write about my spring-training news habits, and how I take in news from a huge number of sources that didn’t exist a few years ago. In discussing digital journalism, it’s easy to forget that while these are anxious times for publishers, consumers have never had it better in terms of how much information is available and how many choices they have.

How to Get Paid: Decrease Wallet Friction, Think Apps

Posted in Digital Experiments, Paid Content, Social Media, Social Search, Twitter by reinventingthenewsroom on February 9, 2010

So late last week I read Dave McClure’s slightly unhinged rant about subscriptions, and within a couple of paragraphs I started laughing. Not in that “Gee, anyone with an opposable thumb can now publish” way, but in that “This is beyond awesome” way. Posts like McClure’s remind me of why I love the Web, particularly the way it constantly brings new ideas and voices to my door that change the way I think. When I wasn’t laughing, I was nodding my head. First once, then twice, then lots of times.

McClure would benefit from an editor, he loves to swear, and his post reads a bit like it was composed while jumping on a trampoline, but don’t let that put you off: His thoughts on subscriptions, Web ads, transactions and passwords make for smart, bracing reading. I highly recommend getting your McClure direct and full-octane at the link above, but here’s my gloss anyway:

Web 2.0 companies have foolishly tried to follow Google and Yahoo’s lead in emulating ad-driven business models, wasting a decade trying out inefficient revenue models. McClure thinks that will change, in a rather dramatic way: “The default startup business model for 2010 and beyond will be subscriptions and transactions (e-commerce, digital goods).” He adds that “gradually we are discovering that the default revenue model on the internet should probably be the simplest one — that is: basic transactions for physical or digital goods, and recurring transactions (aka subscriptions) for repeat usage.” Or, as he then puts it more pungently, “Get Dem Bitches to *PAY* You, G.”

So what’s the problem with that? McClure isn’t interested in the ideology of free vs. paid or the link economy or Googlejuice. He does mention the problem of the penny gap (i.e. the hardest part of charging isn’t getting readers to pay you a certain price, but getting them to pay you anything at all), but then moves on — a bit too quickly, I thought — to “wallet friction.”

Here, he looks back to his time at PayPal, where the biggest customer-service problem by far was users not remembering their passwords. “Bingo, way to create the biggest HateStorm in Internet History: make it super simple for people to make their payment method unusable by simply forgetting their password,” he writes, adding that “PayPal was one of the classic stories of viral growth, however in this instance we also experienced viral growth in customer service: at one point more than 2 in 3 employees worked in customer service. And I’m guessing somewhere between 10-20% of first-time customers never used the service again, primarily because they forgot their password.”

So what passwords do people remember? The ones for services they use all the time — such as social networks, email and instant messaging, and sites for buying games, music and entertainment. From there, he reaches his conclusion: “In 2015, the default login and payment method(s) on the Web will be Facebook Connect, Google Gmail, or Apple iTunes.”

As I said, I would have liked to hear more from McClure on how we get across the penny gap. But I like that his examination of the problem focuses on consumer behavior, industry trends and practical issues, rather than supposedly immutable laws of the digital world. News organizations face a lot of problems that won’t be solved quickly or easily: There is a glut of commoditized content on many subjects, much of the content we produce isn’t good enough to ask anyone to pay for, and we have surrendered or drifted from our central place in our communities. Working through those issues will involve a lot more pain than what we’ve already experienced. But I don’t believe that it’s impossible for us to get paid for content that works for our readers, or to be rewarded if we can win back a valued place in our communities.

* * *

For another take on how we get paid, MTV Networks product manager Maya Baratz advises old-media companies to start thinking of themselves as apps.

Baratz characterizes apps as “not only allowing, but thriving off of, having your content live elsewhere” as opposed to platforms, which fuel their growth by attracting an audience to a destination. As an example of the former, take social games that don’t try to draw in users to a new site, but exist where the users already are, such as on Facebook. Her advice to news organizations is to turn the paywall argument on its head and get revenue through bits of content as they spread.

This gets at one of the central dilemmas of charging for content: By and large, news organizations seem to agree that paywalls need to be leaky to let content spread and be discovered through sharing and search. For instance, the New York Times has indicated that when its paywall arrives sometime this global epoch, articles found through sharing and search won’t count against readers’ monthly counts. (More on this here.) That seems wise, but the more we find content this way, the less paywalls will contribute to news organizations’ bottom lines. Combine Baratz’s approach with McClure’s decreased wallet friction and perhaps there’s a way forward that will remain viable as social media becomes more and more important.

Provided we can hurdle the penny gap, of course.
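The leaky-meter rule described above — visits arriving through sharing or search don’t count against the monthly allowance — reduces to a very small check. The limit and the referrer list below are assumptions for illustration, not the Times’s actual plan:

```python
FREE_ARTICLES_PER_MONTH = 10  # assumed limit; no real figure had been announced

# Illustrative referrer hosts treated as "sharing and search"
EXEMPT_REFERRERS = {"twitter.com", "facebook.com", "google.com", "bing.com"}

def allow_access(views_this_month: int, referrer_host: str) -> bool:
    """Leaky meter: exempt visits that arrive via sharing or search."""
    if referrer_host in EXEMPT_REFERRERS:
        return True  # does not count against the monthly meter
    return views_this_month < FREE_ARTICLES_PER_MONTH
```

The business tension in the post is visible right in the code: the more traffic arrives through the exempt branch, the less the meter ever fires, and the less the paywall contributes to the bottom line.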

* * *

On a rather different note, my latest column for Indiana University’s National Sports Journalism Center looks at the growing use of social media by athletes, and explores how it may change sportswriting as “digital natives” become star athletes. As is often the case with my NSJC columns, I think these questions are relevant to more than sportswriters. Sports, as part of the Web’s old-growth forest, is an excellent place to track changes that will soon impact the rest of journalism.

Oh, and my fervent wish for a New Orleans Saints win came true! Now if only something could be done about the Mets….

The Furor Over Content Farms

Posted in Content Farms, Social Search by reinventingthenewsroom on December 15, 2009

I’m glad to see I’m not alone — the agitation over content farms (my term was vapidmedia) is increasing among digital-media thinkers. Here’s a rundown of recent takes on the issue, what it means, and what — if anything — should or can be done about it. To reiterate my point of view: I don’t think Demand and other vapidmedia mills deliberately try to produce low-quality content, but I think their business models virtually ensure that they will do so. Nor is my primary objection that they turn content creators into Chinese factory workers. I don’t like that, but if the market wills it, so be it. Rather, my primary objection is that vapidmedia clutters up search with low-quality content designed to game Google’s algorithms, making better-quality information harder to find.

To review, the article that kicked the issue into high gear is Dan Roth’s Wired magazine profile of Demand Media. A related piece is Farhad Manjoo’s takedown of Associated Content, from Slate. (Richard McManus of ReadWriteWeb has also penned two good investigations of Demand Media.)

Here’s the fusillade I wrote about Demand Media after reading those two articles.

What’s new: Over the weekend TechCrunch’s Michael Arrington waded into the fray, warning that “I think there’s a much bigger problem lurking on the horizon than a bunch of blogs and aggregators disrupting old media business models that needed disrupting anyway. The rise of fast food content is upon us, and it’s going to get ugly. … These models create a race to the bottom situation, where anyone who spends time and effort on their content is pushed out of business.” Arrington’s conclusion is dour: Content creators need to “figure out an even more disruptive way to win, or die. Or just give up on making money doing what you do.”

New York City venture capitalist and blogger Fred Wilson is hopeful — as I was, albeit somewhat tentatively — that the antidote to vapidmedia is the rise of social search. Social search, he says, will help us decide what’s quality content and what isn’t, where search engines can’t: “It’s a lot harder to spam yourself into a social graph.” This fits with my own thinking that social search stands to eclipse the power of Google in relatively short order — Google’s empire is built on a clever recreation of social approval, hierarchy and relevance, but the Web has matured to the point where we can use those social tools instead of industrial substitutes for them. (This is also why, as I wrote, the drama starring Rupert Murdoch, Google, Bing and vengeance-minded publishers will make for great theater but not particularly matter before long.)

Jeff Jarvis spoke with Demand’s Steven Kydd about the company. Jarvis also sees social search as a way to prevent content farms from degrading search results, though he praises Demand’s algorithms as useful to discovering questions the public wants answered. His take is interesting, though I think he gives Demand too much credit by urging us “not to miss Demand’s key insight: that the public should assign the creators, including journalists.” I agree that Demand’s algorithms are smart and could be useful in spotlighting questions the public wants answered, but Demand isn’t part of any war of journalism ideology, and I think it does more public harm than good.

Coming nearly full-circle, ReadWriteWeb’s McManus looks at some ways Google can combat the content farms — and notes it’s quite likely that Google is already working to put some of these tactics in place. Here’s hoping.

Hey, Demand Media! Get Off My Lawn!

Posted in Content Farms, Digital Experiments, Social Search by reinventingthenewsroom on December 4, 2009

July 2010 Update: I have more thoughts on content farms here. I took a look at Demand Media’s travel articles for USA Today here. And here’s a roundup of posts about the issue.

I don’t know how I missed this Daniel Roth article in the October Wired about Demand Media the first time around, but it showed up in my Twitter queue this morning, and came on the heels of my reading and thinking about Farhad Manjoo’s evisceration of Associated Content in Slate. (I was kinder about Associated Content back in my Wall Street Journal days, but then I was mostly interested in them as a different way to build a brand.) From there, I read Sage Ross’s very good take (channeling Jay Rosen) on Demand Media vs. Wikimedia.

And then I tried and failed to calm myself down.

Journalists, the Web is not how our profession ends. The Web is a wonderful vehicle for storytelling, explaining, doing civic good and empowering readers who want to dig for information. If you want to know how our profession ends, look at Demand Media, starting with Roth’s poignant portrait of an experienced video journalist shooting noisy, out-of-focus footage for $20 a pop. This is the journalist as Chinese factory worker — except for a lot of rural Chinese the factory is a step up. You know the old joke about the sign that reads Good, Fast, Cheap — Pick Two? Demand Media took that and turned it into an irony-free business plan. The joke, unfortunately, is on the rest of us.

I’d encountered material from Demand before, along with stuff from other vapidmedia factories such as Associated Content and eHow. But I’d written it off as the usual Internet stupidity breaking the waterline thanks to an unfortunate alignment of search-engine tumblers. I hadn’t grasped that the visibility of this stuff — indeed, the sole reason for its existence — was the product of a Google-dependent strategy, or processed that its bland stupidity was a direct consequence of a pitiless, bottom-line business model. Wired’s Roth describes the consequences aptly: “To appreciate the impact Demand is poised to have on the Web, imagine a classroom where one kid raises his hand after every question and screams out the answer. He may not be smart or even right, but he makes it difficult to hear anybody else.”

Now that I’ve spluttered and raged, an attempt at perspective. It’s good to understand what information people are searching for, and by all accounts Demand Media has done a terrific job at that. Journalists have spent far too long uninterested in questions like that, maintaining and sometimes even cultivating an air of artistic disconnect from readers and the business side of their publications. It’s an understatement to say that hasn’t served them well in trying to adapt to the seismic changes in our industry. Smart algorithms like Demand’s are a way to bridge that disconnect, and a potential source of story ideas to boot. (Check out the interesting exchange about people donating cars in Dallas.)

Nor am I saying that you’ve got to be a member of the journalistic priesthood to impart useful information or tell good stories. I’m sure there’s some good, even great stuff produced by Demand Media and Associated Content, just as I rejoice that millions of people now produce commentaries, explainers and, yes, news stories without journalistic backgrounds or affiliations.

But Demand Media isn’t just an algorithm, and the confines of business models like Demand’s work against the production of good stuff. I’ll choose to believe Demand CEO Richard Rosenblatt that he wants to improve quality, but if he’s true to what’s made his company successful, he’ll have a lot of trouble doing that. Similarly, this article by Demand’s Steven Kydd, touting that The Future = Art + Science + Scale, has some valuable lessons for publishers, and it sounds reasonable enough. But the Demand equation sure feels more like The Present = Science + Scale – Art than what Kydd came up with. (See the sign up above.)

A couple of weeks back I had an interesting conversation with a first-class digital-media experimenter in which we talked about how systems are constructed, and how the starting points you choose will allow users to do interesting, unexpected things with those systems, or prevent that. Twitter is an obvious example — it’s slightly out of control, which has allowed its users to turn it into a hotbed of innovation. Demand’s system strikes me as so rigidly controlled that it’s a poor fit for any kind of innovation. Which would be fine if Demand weren’t the kid waving his hand in class with an obvious, not particularly edifying answer to everything.

Granted, it’s very early — too early, probably, for me to get as worked up as I have. As Manjoo notes, vapidmedia is basically an exploitation of a weakness in search engines, which suggests its success could be temporary — the vapidmedia business model is perilously close to that of spam blogs, which Google battles all the time. As Manjoo puts it, “once Google and co. wise up to [Associated Content]’s schemes, its business model is toast.” Still, I worry that’s wishful thinking. In class, the pushy kid with his hand up all the time would get pulled aside by the teacher and told to wait his turn. But there is no search-engine teacher. Google is hard on the crooked, but much as I dislike Demand Media and its peers, they aren’t crooked — and Google’s democratic, Hero Engineer mentality doesn’t lend itself to punishing the merely dumb.

A more hopeful sign, for me, lies in another Web truism: The cream rises, and over time talent wins out. As social search eclipses industrial search, the cream should rise faster. Right?

Well, maybe. Like a lot of current journalism debates, that becomes a referendum on one’s faith in people. Do you think people can produce accountability journalism without the framework of big journalistic institutions? Well, having thought about that a lot … I don’t know. Do you think if people move to the fore in finding information and sharing it we’ll get better information? I don’t know that one either.

This gets back to something said by Sage Ross about Wikimedia vs. Demand Media, which he describes rather poetically as “media driven by love versus media driven by money.” That’s a bit too simplistic for me, but I’d like to agree with his overall point. Now that I’ve calmed down some, I’d like to conclude that this too will pass, that people will make algorithms a complement to their own choices, that the cream will rise, the vapidmedia factories will be shuttered, and we’ll all be the better for it. I’d like to have faith, in other words. But media driven by love isn’t always so edifying, either. Have you been to Yahoo Answers lately?

More Google Sound and Fury

Posted in Fun With Metaphors, Social Media, Social Search by reinventingthenewsroom on December 2, 2009

In the last two days, Google has made some changes to Google News, allowing publishers more control over how articles can be viewed for free. Yesterday, Google said it will let publishers limit readers to five free articles per day, a modification to its First Click Free program, and offered to crawl and index preview pages publishers make available, labeling them in search results as “subscription.” This morning, Google unveiled a web crawler specifically for Google News, letting publishers tweak their robots.txt files to exclude Google News but not regular search, or to further slice and dice what’s visible where.
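The mechanics of that opt-out are simple: because the new Google News crawler announces itself under its own user-agent (Googlebot-News), a publisher can shut it out in robots.txt while leaving regular search untouched. A minimal sketch using Python’s standard robotparser against a hypothetical robots.txt (the file contents and URLs here are invented for illustration):

```python
from urllib import robotparser

# Hypothetical robots.txt: block the Google News crawler
# (user-agent Googlebot-News) while leaving every other crawler,
# including regular Google search, unrestricted.
rules = """\
User-agent: Googlebot-News
Disallow: /

User-agent: *
Disallow:
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The news crawler is shut out; the regular crawler is not.
print(rp.can_fetch("Googlebot-News", "/2009/12/some-story.html"))  # False
print(rp.can_fetch("Googlebot", "/2009/12/some-story.html"))       # True
```

The same mechanism works in reverse, of course — a publisher could allow Google News and block everything else, which is the “slice and dice” flexibility Google is touting.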

All very interesting given the war of words between Google and publishers calling the search engine giant all manner of nasty names (nobody likes being called an intestinal parasite), a charge now led by Rupert Murdoch, who’s elbowed the Associated Press aside to head the brigade. This war has intensified of late, with word of talks between Murdoch’s News Corp. and Microsoft that could see News Corp. remove its news from Google in favor of Bing, Microsoft’s new search engine — and mutterings that News Corp. might challenge whether fair-use laws apply to aggregators. Google has fired back, in its blandly live-and-let-live way — I was amused to note that Google couldn’t resist making publishers look backwards by noting that they’d already been able to request being left out of Google News.

This is interesting political theater, but like a lot of political theater I maintain it doesn’t mean much.

First off, publishers’ paywalls aren’t fixed now, but then they weren’t cracked before in any meaningful way. On Computerworld, Seth Weintraub notes that “it is only going to be slightly more difficult to get around paywalls using the Google trick” — for instance, you could evade the five-articles-per-day limit by using a different browser in which you’re not logged into your Google account. He adds that “you know your sneaky little trick of getting around the Wall Street Journal’s paywall is mainstream if they demonstrate it on the NBC primetime show the Office,” which naturally leads to an embed of the now-famous clip in which Jim gets through to a paywalled Journal article in seconds flat. All true, but I think this misses something: In the show, only two of the assembled Dunder Mifflin employees know the paywall trick. As long as those percentages hold up, publishers with paywalls aren’t actually concerned about leaky paywalls, except for their usefulness in crying woe and trying to extract something from Google. This is the same misconception I objected to when NBC consultant Jeff Gralnick recently raised the specter of “some smart 12-year-old” getting around technological barriers — folks interested in digital journalism like playing around with technology, and so we tend to forget that most people don’t. (And I bet Computerworld bloggers run rings around us.) The idea of technological barriers isn’t to keep out the Jim Halperts of the world — that never works. Rather, it’s to keep out the Oscars and Dwights.

Nor am I worried that alliances between publishers and Bing would lead to a world of Balkanized search, a scenario raised by Ken Auletta in a New York Times conversation between him and Fred Wilson, moderated by John Markoff. The reason is the growing power of social search, which I explored in my last post. Auletta discusses social search too, asking, “Would you rather have the advice of 20 friends whom you know and trust and who share their experience with cameras, or 20,000 or so links from a Google search?” He’s right that we’ll opt for the former, but it’s not an either-or scenario: As Wilson notes, “I don’t see search and social as disconnected islands. I see them as connected important features that complement each other.” I’d take the metaphor a step further and say social search is the water that will connect all the islands. The speed of social search is uncanny — a good Twitter news feed will deliver the desired information from a vast range of sources, making the question of which engine indexes that information irrelevant.

Comments Off on More Google Sound and Fury

Why the Spat Over Murdoch, Bing and Google Doesn’t Matter

Posted in Communities, Social Media, Social Search, Twitter by reinventingthenewsroom on November 25, 2009

I tried to resist the thought, but I couldn’t talk myself out of it: None of this furor over Bing and Google and Rupert Murdoch will matter very much, or for very long.

An astonishing number of pixels have been spilled over social media, with the usual digital mix of interesting insights and wild claims of revolution. But even amid the hype, what’s definitely true is that social media is remaking how we live our lives online. And in some vital ways, social media is back-to-the-future Web stuff, fulfilling the long-deferred promise of Web publishing and search.

The idea that the Web makes everybody a publisher has been around for more than a decade, but for a long time the possibilities weren’t sufficiently supported by the technology platforms for Web publishing to be a truly democratic phenomenon. Sure, you could be an online diarist or cataloger or critic in 1995, but practically speaking you needed coding chops that were beyond most people. Blogging changed that, simplifying the process of creating and maintaining Web pages so that a much larger group of people could become publishers. But even then, setting up a blog was a technological bridge too far for most people — practically speaking, being a Web publisher was still a relatively techie endeavor. MySpace and Facebook and other social-media platforms were what finally married the technology with its possibilities. Setting up a social-media account is dead easy, as is answering the question “What’s on your mind?” with a bit of typing and clicking SHARE. Finally, the idea that we can all be publishers doesn’t sound like rhetoric, but like a description of reality.

With social media, we’re not just publishers — we’re sharers. And this is back to the future, too. Google’s search algorithms were created to replicate something that literally dates back to the Stone Age: our finely honed sense of trust and social relationships. All things considered, even socially inept people are born with really good algorithms for figuring out social rank, influence and trustworthiness. Google did a remarkably clever job copying those — and they’ve earned billions upon billions from that foundation — but Google was needed because in the early days of the Web people’s natural social structures didn’t scale. There was too little participation for the Web to be truly representative, and truly participating — by creating information, assessing it and sharing it — was too technically difficult. Most of our meaningful social interactions took place in settings that were simpler — email, then IM and text-messaging. But that was primarily a one-to-one world that stood apart from the Web, which was a vast sea of information crying out for order. Few people had the technical chops to tackle that ordering (recall Yahoo supposedly stands for Yet Another Hierarchical Officious Oracle), the task was too big for people to handle anyway, and the results addressed the world in its vast entirety, not the fairly local world with which we naturally engage. Seen from this perspective, a lot of the problems and shortcomings of the Web feel like variations on this scaling problem: For years Google was a great tool for discovering weather patterns in Mongolia but a terrible way to find decent take-out within a couple of miles.
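The “industrial substitute” for social trust can be sketched in a few lines. PageRank, the algorithm at the heart of Google’s early success, treats a link as a vote of confidence and a page’s rank as the damped sum of the ranks of the pages vouching for it — a mechanical stand-in for asking people you trust. A toy illustration (the three-page link graph is invented):

```python
# Minimal PageRank power iteration: each page's rank is a damped
# sum of the ranks of the pages that link to ("vouch for") it.
links = {
    "a": ["b", "c"],  # a links to b and c
    "b": ["c"],
    "c": ["a"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal trust
damping = 0.85

for _ in range(50):  # iterate until the ranks settle
    new = {}
    for p in pages:
        # Trust flowing in: each linker splits its rank among its outlinks.
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new

print(sorted(rank, key=rank.get, reverse=True))  # most "trusted" pages first
```

Here “c” ends up on top because both other pages vouch for it — which is exactly the kind of judgment a well-connected peer group makes instinctively, without any iteration at all.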

That’s now changing. The Web is not, of course, truly representative yet — too much of the world is still left out because of economic inequity, illiteracy, the repression of women and other ills. But within vast swathes of societies such as ours, we’re beginning to at least be able to make the claim that it is, and to glimpse a Web that’s accessible from everywhere, not just desks. (Which taken together will really just be the starting gun for what the Web will become — it’s still so early!)

And with participation in social media increasingly becoming the norm, we are reclaiming some of the old ways we naturally sort ourselves out into peer groups and social hierarchies. The nature of these peer groups is changing, of course — we seek out like-minded folks world-wide and build communities of interest instead of geography, we maintain weak ties instead of severing connections, and we leverage friend-of-a-friend situations in ways that were once reserved for people with a natural gift for social connections. But the trend is to return to something much closer to the social ties for which we are hard-wired.

This is why search is changing. With the ability to create strong peer groups online, and to create and share within those groups, we increasingly can use our own innate algorithms for trust and influence instead of turning to Google’s replicas of them. And we are discovering — or, really, rediscovering — that we have an unconscious knack for assembling peer groups that are as good or better at delivering a reliable “feed” of news about not only the subjects we’re most interested in, but the subjects that cross peer-group lines. Peer groups chop the Web down to size, and make the old human ways of finding and exchanging information scalable again. If we have them, we have much less need for industrial search.

My A-Ha moment with Twitter was realizing that without even meaning to, by following people on Twitter I’d created a feed of information that was an excellent substitute not just for the sites I habitually visit about various subjects, but also for the aggregated home page I maintain for general news. I now routinely get my news from Twitter or Facebook, and reflexively turn to Twitter when news breaks. The combined efforts of all those people I follow add up to something that’s faster than news Web sites, covers more territory and is as reliable as, if not more reliable than, RSS feeds and mechanized aggregation. The college kid who told a focus group that “if news is that important, it will find me” wasn’t being breezy or lazy — he was describing what social media has increasingly made reality.

That same effect is being seen elsewhere, as people replace algorithms. “Do what you do best and link to the rest” is a strategy based around people, not search — it would work perfectly well without Google or Bing. Curation is about people, not search. Done right, aggregation is about people, not search. Email This and Digg and Share on Facebook are about people, not search.

This isn’t an unalloyed good — whether they’re centered on common interests or geography, our peer groups encourage us to create echo chambers of common creed and aligned opinion. We are correct to see this as a drawback, and to wonder which thin slice of news will find us — and if it will be news at all. But our dislike of the idea isn’t enough to prevent it from happening. We will vote — consciously or not, for good or for ill — for social search over mechanized search. It’s already starting to happen. And that means Rupert Murdoch and Dean Singleton and the AP and Microsoft and Google and everybody else are staking out positions in the last war. Theirs is a sideshow and a distraction. Whether we realize it or not, we’re already moving on.

The news will find me, because my peers will find it. It doesn’t matter whether the news gets indexed by Google or Bing or something else. My peers will find it, either through one of those search engines or more likely without visiting either. Murdoch may squeeze some millions out of Microsoft and wound Google and spark a million arguments about the civic value of how to index information, but none of that is going to make any difference to me. The news will find me.