October 30, 2009

TweetDeck Promises to Add Twitter Lists Support Soon

In July of 2008, when TweetDeck launched, it was the first Twitter client to let you group those you follow - so you could see like-minded folks in a single column and ensure you didn't miss their updates. Now that Twitter is rapidly rolling out its own Lists functionality, many have been curious as to how TweetDeck would adapt to the change. In a blog post issued today, the company promised that support for Lists will be coming soon, "at the heart of the application".

The post didn't say exactly how list support would be rolled out, or whether you could export the groups you had already created into these new lists, but it is easy enough to assume they are working on it. In fact, the post says, "we're not just planning any old run-of-the-mill integration...oh no. We think you'll find that what we have planned for Lists is going to take your social media experience with TweetDeck to new heights."

So if you are a TweetDeck user, wondering whether to make more groups or start your own lists, it sounds like Iain and team have you covered.

Update: Shortly after this was posted, Loic Le Meur of Seesmic said in a tweet that his desktop program would also support lists soon: "OF COURSE Seesmic will have user lists very soon. I have them on my Seesmic Desktop already testing."

October 29, 2009

The Blurry Picture of Open APIs, Standards, Data Ownership

Look beyond "real-time" and "social", and you'll easily find another pair of tech buzzwords that everybody wants attached to their product or service - "open" and "standards". Companies are practically falling over one another to show they have embraced developers or users, letting data stream in and out of their products, while avoiding words like "proprietary" and "closed", which are PR death. But as you might imagine, the very definition of "open" can vary depending on who you talk to, what the service's goals are, and how it may leverage existing standards on the Web. Following the much-discussed news of Facebook debuting its "Open Graph API" on Wednesday, I traded e-mails with a few respected, tech-minded developers, and found, unsurprisingly, that not everyone believes Facebook is fully "open". In fact, some believe certain companies are playing fast and loose with terms that should be better understood.

To quickly summarize the discussion, those I contacted agreed there are essentially three major ways to bucket "open" APIs.
  • The first, "open access", means that anybody can use the API, but all the data in or out of the services is owned or controlled by the company whose service you are using. The Facebook Open Graph API "is open insofar as you do not violate their ToS", one developer wrote. "Here, 'open' is superfluous -- no (question) you're giving people open access to it, how else would they use it?"
  • The second type is that of an API that leverages open standards, including those such as XML, HTTP, and others. But that doesn't mean APIs that leverage those standards are open by definition. For example, Twitter's API is proprietary, even though it is built on open standards. The developer adds, "Here 'open' is just saying they've tried to incorporate best practices from other engineers -- it would be stupid if they didn't."
  • The third type is the most "open", including open standard APIs like OpenSocial, OpenID, PubSubHubbub, AtomPub and others. These APIs have a clear definition that can be utilized by multiple providers in a way that is interoperable, decoupling providers and consumers.
In short, you have "open but we control the process", "standing on the backs of open" and "truly open", if this opinion is accepted. The developer adds, "In short, the first two mean nothing, the last one actually fits the dictionary definition. The Web is built on open standard APIs and protocols."
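The difference between the second and third buckets is concrete: a truly open protocol specifies the request on the wire, so the same client code works against any conforming provider, decoupled from any one company. As a minimal sketch of what that looks like in practice, here is roughly how a PubSubHubbub-style subscription request is built, per the protocol's published spec (the hub, feed, and callback URLs below are hypothetical placeholders):

```python
# A sketch of the "truly open" category: the PubSubHubbub subscription
# request. Because the protocol is an open standard, this same request
# works against ANY conforming hub -- no single provider controls it.
from urllib.parse import urlencode

def build_subscription_request(hub_url, topic_url, callback_url):
    """Build the form-encoded body a subscriber POSTs to a hub."""
    params = {
        "hub.mode": "subscribe",       # or "unsubscribe"
        "hub.topic": topic_url,        # the feed we want pushed to us
        "hub.callback": callback_url,  # where the hub delivers updates
        "hub.verify": "sync",          # how the hub confirms our intent
    }
    return hub_url, urlencode(params)

# All three URLs are made-up placeholders, not real endpoints.
hub, body = build_subscription_request(
    "https://hub.example.com/",
    "https://blog.example.com/feed.atom",
    "https://subscriber.example.com/callback",
)
print(body)
```

Any service implementing the spec will accept that POST; contrast this with a proprietary API, where the parameter names, auth scheme, and terms all belong to one company.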

Chris Saad, VP of Product and Community Strategy at JS-Kit, well known for his efforts in the data portability space, concurred, writing over e-mail:
"Facebook in particular has made a concerted effort to dilute the word open and use it in reference to a human/cultural thing when talking about the platform and their products."

He added, "In reality there is a VERY big difference between having an 'Open API', an 'Open Standards API' and an 'API'. An API is just a thing you poke and you get data back. When you get FaceBookPropietaryXMLData using FacebookPropietaryAuthMethod and you can only cache the data for 24 hours - that is NOT an open API - it is an API."
So who cares? Historically, services like Facebook and AOL have been characterized as walled gardens, meaning their information is sealed within, beyond the reach of the standard Web. Other services are known as "data roach motels", where data gets in, but never gets out. As the first developer said, the Web is built on open standard APIs and protocols, so sites can work well with each other, and activities operate in a similar manner, regardless of service.

Jesse Stay, a friend of mine, fellow blogger, and well-versed developer for both the Facebook and Twitter platforms, agreed that there is a tremendous amount of confusion around the definition of "open". In fact, just last month he wrote a post on his site, "The Open Web – Is it Really What We Think it is?"

Today he said Facebook's move gave full access to "users' walls, comments, likes and social graph... accessible from any Web site, desktop application or mobile application, using open API access protocols." Meanwhile, Facebook users can now opt into letting their status updates be indexed by search engines, and the company is open-sourcing infrastructure like the Tornado Web server (acquired as part of the FriendFeed buy) so other developers can build new platforms.

Jesse is more optimistic about Facebook's goals than Chris is. He said that the site lets users decide how open they want to be with their data, and that Facebook is "working to give users full power" in that regard. But he also voiced frustration with the company's restricted access to search, and the lack of access to the entire network in aggregate, with the exception of the fan page directory. And he didn't address the core issue of Facebook owning your data in both directions - and yes, having the option to block your access if it felt you had violated the terms of service. (Remember this one? Scobleizer: Facebook Disabled My Account)

Web standards are very well known, and we usually recognize them by their acronyms. JSON. HTTP. XML. POP3. Atom. Open means that developers can tap into the standard and use it as they wish, both procuring data and pushing it elsewhere. When we start to blur the definition of "open" and associate it with specific companies, like Twitter, Facebook, Yahoo! or others, you can usually guess that the solution is slightly less open. Somebody has the option to change their proprietary code and block you from having full access.
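To make that point concrete, here is a small sketch of what "tapping into the standard" means in practice: any Atom (RFC 4287) feed can be parsed with off-the-shelf tools, no company-specific permission or proprietary SDK required. The feed content below is made up for illustration:

```python
# Parsing an Atom feed with nothing but the Python standard library.
# Because Atom is an open standard, the same code reads any conforming
# feed from any producer. The feed below is a fabricated example.
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"  # Atom's XML namespace

feed_xml = """<feed xmlns="http://www.w3.org/2005/Atom">
  <title>louisgray.com</title>
  <entry>
    <title>The Blurry Picture of Open APIs</title>
    <link href="http://example.com/open-apis"/>
  </entry>
</feed>"""

root = ET.fromstring(feed_xml)
# Collect the title of every entry in the feed.
titles = [e.findtext(ATOM_NS + "title") for e in root.iter(ATOM_NS + "entry")]
print(titles)  # -> ['The Blurry Picture of Open APIs']
```

No terms of service governs who may write this consumer - which is exactly the property the developers quoted above say a proprietary API, however well-built, lacks.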

As stated more than a few times here, I have chosen to trust companies with my data. I put a lot of data into the Web and move it around. I expect standards to work the same way across sites, and I hope that those services that I use treat developers as well as they do their users. I recognize I am not as technical as the developers I pinged today, and thus, once a topic surpasses my expertise, I have to trust their comments. But we need to be more knowledgeable about what is "open" and what is "sorta, kinda open". Maybe Facebook can help us all understand their level of openness as time progresses.

Could A Real Apple Fan Completely "Go Google"?

As a Mac fan in the 1990s, I found it a lot easier to understand who the good guys were and who the bad guys were. Apple was very good. Intel was bad. Adobe was usually good. Microsoft was bad. Very bad. Evil. But over time, as we moved into the latter part of this decade, Intel switched teams and became good. Meanwhile, Adobe looked less like a close friend and more like a despised ex, as Microsoft went from hated bully and thief to playing the role of crazy uncle who nobody really likes but puts up with because he's not going to disappear. The hardest to label? Google, a younger cousin who everybody really likes, but just might be too smooth to be trusted, even as it gets too popular. Now the stage is set for an awkward family reunion - as Google and Apple overlap so much, tech fans have the option to choose between the two for practically their entire digital life, and the loyalty once sent Cupertino's way, exclusively, is getting some serious competition.

Over the last few years, if one can look beyond the striking hardware and arguable operating system differentiation between Mac OS X and Windows PCs, Apple has unquestionably led the way in terms of seamless integration between applications and devices. The company's iLife package ensures that media is treated in a similar way across multiple applications, and its user interface guidelines protect users from odd menu behaviors that change from program to program. Meanwhile, the company's iTunes/iPod/iPhone juggernaut has made managing media easier than ever before, especially when one considers the addition of the fast-growing App Store and the capable, if underappreciated, Apple TV, which brings the core of the store to the core of the home.

But while we Mac fans may have been resting comfortably as the Mac vs. PC commercials made us giggle with egotistical self-pride, and the company's balance sheet has grown ever stronger with quarter after very profitable quarter, Google has been changing its spots - morphing from search engine and advertising powerhouse to a Web services monolith that can go head to head with almost every single Apple product out there. As the company integrates its many different products, it too may offer the integration we have come to expect from Apple, but in an open, Web-focused way. And with every new announcement, Apple fans have to start wondering whether their future is Google as much as it ever was Mac - and whether "Going Google" would be that bad anyway.

Just as Mac OS X is the platform on which all Apple software starts, Chrome OS will be for Google. We know it's coming, and some sharp engineers are slaving away in Mountain View to capture the flexibility of the Web and make the cloud the equivalent of your hard disk.

Apple's Safari browser, the built-in Web browser for Mac and for iPhones, is equally matched by the Chrome browser on all major operating systems and on Android as well.

The iPhone and its 80,000 to 100,000 applications in the iTunes Store are being challenged by Android's new fleet of phones, led by the Droid from Motorola, and its rumored 10,000+ apps.

Apple's Mail? Easily matched by GMail. iCal? See Google Calendar. iChat? Google Chat. iMovie and iDVD? Well, it's not the same thing, but you would be hard-pressed to say YouTube doesn't win that battle. iWeb? Really? See Blogger.

On the professional side, Apple's iWork sports Keynote, Pages and Numbers. One has to wonder why they even released these apps, as they're not exactly keeping Microsoft at bay, and I don't know anybody who uses the last two. I use Pages once a year to do our Holiday letters home, and that's it! You better believe that Google's online office suite of Google Docs, Spreadsheets and Presentations is the real deal. Beyond that, do you expect Apple's iDisk to trump GDrive? Will Mac OS X Server beat out the Google File System (GFS) or can you expect XServes to replace Google's commodity rack servers in their datacenters around the globe? Not likely.

This isn't a rant stating that Apple is doomed. Far from it. After all, Google doesn't "yet" make excellent laptops. But I've tried the Motorola Droid with Android 2.0 and it's good enough that if iPhone were not an option, it would be an easy second choice. I find that I am using my Apple OS and my Apple Web browser to go Google, not just for the search engine, but all the downstream Google services. (10 of which I highlighted last month)

Google spokesperson and king of anti-spam Matt Cutts said his October goal was to avoid Microsoft software, a task made easier than ever now with Google providing an alternative just about everywhere. But I wonder if it's possible to do something very different - use ONLY Google software for a month. That would mean using the company's Web browser exclusively, and their office suite exclusively, and their mobile phone OS exclusively. That would mean using GMail and Google Talk and Google Wave and Google Calendar and Google Reader instead of Outlook or Mac Mail. I bet we're very close to this happening.

On Wednesday, Google also announced some of its first forays into music search. This is an area where Apple still has the clear advantage - with iTunes. But Pandora is available on the Android platform, so iTunes isn't needed. Maybe I could push Google to buy Spotify, and set up a killer alternative to iTunes with the Google logo? That would be something indeed.

I am a Mac guy. Maybe I'm less of a Mac guy than I once was, but I still trust Cupertino. That said, Google is growing on me in a big way, and they are the real alternative - something Microsoft never really was. Maybe soon I'll also be going Google in a way I never expected.

October 26, 2009

Cinch Puts Simple Podcasts In Your Pocket

In August, I suggested that Apple should find a way to record phone calls on the iPhone, leveraging its Voice Memo product, to make it drop-dead simple to create podcasts at any time. While the company hasn't achieved such a goal, an offering from BlogTalk Radio, called Cinch, has delivered an extremely easy-to-use product that lets you record audio clips and post them to your social networks, including Facebook or Twitter. I've been using it the last few weeks, and while I try, and discard, a huge number of different technologies, this is one I know I will be returning to often - as it meets a need not currently served by other providers.

The idea behind Cinch is to provide short-form audio updates, much as Twitter does for text and 12seconds.tv does for video. Twitter's ease of use stems largely from its short-form constraint, keeping us all in 140-character soundbites, and Cinch makes it just as easy to provide short updates, in audio form.

The Cinch Interface on the iPhone - Record and Publish

A free iPhone application, Cinch provides you with the option to record using the iPhone's built-in microphone - good for solo updates or quick one-on-one interviews - perfect for "people on the street" situations or for events. Once recorded, you can hit play to preview the Cinchcast, or hit Publish to send it off to destinations you have selected, including Twitter or Facebook. You can also add a photo to help tell the story, and can provide, yes, a 140-character update explaining what the Cinchcast is all about.

Should you want to, you can also search the service to find other Cinchcasts, or click Radio to see BlogTalkRadio's on-air schedule.

Cinch Shows My Published Updates and Those from Others on the Service

I never got into 12seconds.tv given its brevity and my lack of need for quick video shorts. But I can already see getting into regular updates on Cinch to augment my other blogging and social networking activity. As you can see on my Cinch page, I used the product to have a quick interview with Ethan Gahng of Lazyfeed last week, and earlier today, made some comments on the new report that once again, people are blaming social media for employees' lost productivity.

Now, whenever I want to speak directly to people on all the social networks, and have a follow-on discussion in the comments on Cinch, all I need to do is take out my iPhone and speak into the microphone. I will be looking forward to posting many more. You can find Cinch at http://www.cinchcast.com/.

See Also: Webtop Mania: Cinch: better than Twitter, better than Evernote.

October 25, 2009

Twitter Snags Platform Manager Josh Elman From Facebook

Twitter has made yet another high-profile addition to its executive ranks: tomorrow, Josh Elman joins the microblogging powerhouse after nearly two years as Facebook's Platform Program Manager, taking a role on Twitter's small team of product managers. The move is a big win for Twitter, which has been working to improve its interaction with its development community after running lean for the last year-plus.

Prior to joining Facebook in March of 2008, Elman headed product management at Zazzle for three years. He also has history at LinkedIn and RealNetworks dating back to 1997.

Pulling off the LinkedIn/Facebook/Twitter trifecta is a rare feat, but the Valley is dotted with tech geeks who count their current home as Twitter or Facebook, yet also sport both Google and Yahoo! on their resumes.

Twitter Welcomes Josh to the Team Via a List

Rumors about Elman's joining Twitter had been bubbling in the tech backchannels in recent days, and while he has not yet announced the move himself, thanks to Twitter's new Lists feature, you could see his account added to the company's official Twitter team late tonight. That Twitter "Team" list now sports 113 members, including part-time contractors.

Word of the News Has Been Out on the Street for A While

Elman was first reported to have left Facebook in a story on Inside Facebook posted in mid-September. The article said he and Facebook parted on positive terms.

There Is No "Osborne Effect" In Web Services

In the world of technology, practically no cautionary tale is better known than that of Adam Osborne's ill-fated promise that his next generation of computer models would outperform the current offerings. The story goes that his premature announcement caused a dramatic decline in sales that led to the company's death. (Even if the truth later proved the story somewhat inaccurate.) This, in combination with competitive pressures in practically all markets, has led to a culture of secrecy, undisclosed roadmaps and obfuscation in the industry, aimed at preventing a similar fate. But as I look at many of the products we use today, including Web services, which can be updated in place and don't require a specific point purchase, this mentality is overblown - especially for the market leaders, whose users are unlikely to switch to an alternative.

In May of 2008, I said that I believed a simple feature war between sites was "the wrong war." Users of products including top Web services like Google, Facebook, Twitter, LinkedIn and others absolutely benefit from the features offered, but they stick around thanks to their data being on each service, and the many connections they have cultivated - whether you define that as a community, or instead, as an audience.

If you are a hardware manufacturer, like Apple, Dell, EMC or Cisco, it makes a ton of sense to only discuss future products with potential customers who are not going to purchase in the current buying cycle, and do so under non-disclosure agreements, to prevent their whisperings from impacting your sales. But I think users of the many different Web services out there would benefit from gaining greater visibility into these companies' plans and priorities - which would serve as an early platform for feedback, provide guidance into how they could expect the community to evolve, and at the very least, show that they were continuing to improve the platform.

As you no doubt saw at the end of last week, and from coverage this weekend, Facebook introduced a new look for its news feed. Some people love it, some no doubt dislike it, and many are in between. But the hardest part for some is the element of surprise. Often when a site has a massive overhaul, it leaves up a link to the previous version for those not yet ready to make a move - even if it is clearly outdated.

But if you think about it, are people going to switch from Facebook to MySpace or Friendster because of a UI change? Probably not. Are they going to use the site less often? Maybe, but not in a dramatic way. So Facebook can be pretty secure in knowing that their users are going to stick around.

In contrast to that secrecy, I have been impressed of late by Twitter's transparency in rolling out feature enhancements and telling users in advance what is to come. The company has talked openly about its new ReTweet API, and also about the addition of Lists. Twitter has learned from previous mistakes, when abrupt changes blindsided users and stirred up something like a mob.

Twitter, despite incredible competition for mindshare from Facebook and others, is confident enough that their tipping their hand isn't going to create a competitive problem - and easing users into new features makes it seem much more collaborative. But not everybody believes in this model. During the hubbub around Facebook's future plans for FriendFeed, co-founder Paul Buchheit said "we don't pre-announce things, so for now all I can say is that there's good stuff on the way." His update to the FriendFeed community was both reassuring and not reassuring at the same time - showing they were not asleep at the switch, but giving no clarity at a time when many are looking for some. Would his telling us a few features on their plate for Facebook have upset the apple cart any?

As noted before, Feedly, the next generation start page powered by RSS, has a public roadmap. (Here is their 2009 offering) Feedly is confident enough to show you what they are working on months in advance, even if there is potential slippage, and even if there are competitors who might integrate similar features into their own plans. But there is no potential for an Osborne Effect here. You either use Feedly or you don't. It's very unlikely that you will look at their future plans and walk away because you don't like the product direction - and it's even less likely that you will write down their itinerary and build a competing offering.

If I were Apple, I would keep secrets. But if I were running a Web service, and was confident I could deliver on my promises, I would be sure to open up to the users and let them know what my priorities were early, rather than hiding under a cloak of mystery. Users need guidance and confidence that they are part of something that is continuing to improve, and won't be abandoned.

October 22, 2009

Google Reader's Magic Finds Personalized Highlights In Feeds

If you're a normal carbon-based life form and not an always-on robot like me, you probably don't want to spend the entirety of your day dialed in to the Web, reading every single article in the fear that you might miss something. It might make more sense instead that you get the best of the Web, tailored just for you, sent your way - be that through the use of human filters, or through software that can determine what you like, either through explicit or implicit actions. Following the lead of My6sense, which debuted earlier this year, Google Reader introduced a new feature today, called "Magic" that finds the best offerings in your subscriptions and brings them to the surface. And it works! The service also increased the visibility of recommended feeds, and showed the most popular stories from around the Web - all part of making the RSS reader more personal.

(Note I also asked for these features way back in March of 2007.)

As Google Reader outlined in a blog post this afternoon, "The goal of personalization at Google remains the same as ever: to help you find the best content on the web."

When Sorted by "Magic", You Can See I Share Those Items Most

When Sorted by "New", The Items Are Less Relevant

Many people are intimidated by how full Reader can get. Complaints about seeing (1000+) atop the stream are everywhere - and while there are ways to sort by time or by individual source, it has not always been easy to find the stories that are most relevant to you - until today. With the addition of "sort by magic", Reader presents atop your reading list the articles that best match your interests, no doubt gauged by your previous viewing history and explicit actions, such as sharing.

As mentioned often here, I share about 30 items a day from the nearly 900 I go through. With "magic" enabled, I found myself sharing not just 3% of the first few articles, but nearly half of them - and after having read through the offerings, displaying my activity in list mode showed that to be the case. No doubt, as I continue to use the product, it should only get better.

In parallel, while away from the Web browser, I have been using My6sense on the iPhone to similar effect, presenting me with the most relevant and interesting items atop my feed. But the company's approach does not rely on "explicit" actions, such as "likes" or "thumbs up" and "thumbs down", which many services use for personalization. Instead, it uses "implicit" actions - what I read, how long I spend reading it, whether I scroll to the end of the article, and whether I share it - to refine what it shows me.
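To illustrate the distinction, here is a toy sketch of implicit scoring of the kind described above. The signals and weights are entirely my own invention for illustration, not My6sense's or Google's actual algorithm:

```python
# A made-up implicit-relevance scorer: instead of asking for thumbs
# up/down, infer interest from passive reading behavior. All weights
# here are arbitrary illustrative choices.
def implicit_score(opened, seconds_read, scrolled_to_end, shared):
    score = 0.0
    if opened:
        score += 1.0
    score += min(seconds_read, 120) / 120.0  # cap dwell-time credit at 2 min
    if scrolled_to_end:
        score += 1.0
    if shared:
        score += 2.0                          # sharing is the strongest signal
    return score

items = [
    ("skimmed headline", implicit_score(True, 5, False, False)),
    ("read and shared", implicit_score(True, 90, True, True)),
    ("never opened", implicit_score(False, 0, False, False)),
]
# Sort the reading list the way a "magic" sort would: most relevant first.
ranked = sorted(items, key=lambda pair: pair[1], reverse=True)
print([name for name, _ in ranked])
# -> ['read and shared', 'skimmed headline', 'never opened']
```

The appeal of the implicit approach is that the user never has to do anything extra - the act of reading is itself the training data.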

Both approaches are looking to tackle information overload, making the feeds not so much "magic" as intelligent - which will become even more important as each of us subscribes to more streams of data.

Popular Items that Are Most Often Liked In Google Reader

You might also see similarities between Google Reader's "most popular" section and services I've pushed on this site since the beginning of 2008, including the dormant ReadBurner (where I am an advisor) and RSSmeme. One Google Reader employee said back in 2008 that this function would be "less interesting", as it would just highlight popular sources (including Engadget, the FAIL Blog and others), and so far, that looks to be the case - even if there may be an occasional pop from a lesser-known source.

I've recently begun an engagement with My6sense as part of the day job, and the more I talk with the company's founder and chairman, Barak Hachamov, the more the two of us believe that while there is a time for the wisdom of crowds, you can never overstate the importance of the individual. Both My6sense and Google Reader, especially with today's announcements, are working to do that.

So Was This The Item That Made My Head Explode? :)

FTC Disclosures: My6sense is a client of Paladin Advisors Group, where I am Managing Director of New Media. I am also an advisor to ReadBurner, and have met with the Google Reader team multiple times at their campus, where on at least two occasions, lunch was served. :)

October 21, 2009

Video: Trends and the Future of Social Networks (With TurnHere)

Morgan Brown of TurnHere and I sit down to chat about social networks, business and trends. All three videos were recorded earlier this summer.

Video: Leveraging Social Networks to Build Web Traffic

Courtesy of YourBusinessChannel, filmed while in the UK with Ecademy, some of my comments on how being active in social networking can aid business and Web sites' search engine visibility. (Apologies for looking and sounding tired. I was.)

Twitter Gives Bing Access to the Firehose, Promises More to Come

As previewed in a scoop by All Things Digital's Kara Swisher, Twitter has enabled Microsoft's Bing search engine to have access to the full firehose of all public tweets, adding these real-time elements to the company's data pool. In a post confirming the partnership, Twitter called the onslaught of updates an "overwhelming deluge", hoping that Bing could help you find those that make sense for your search query "right now".

Solving search and discovery has been extremely challenging for the San Francisco-based startup, and the company's incomplete index has led to a swarm of competition - most recently Searchtastic, which gave top billing to the fact that its index dives deeper than Twitter's own.

This is obviously no free transaction, so it is safe to say Twitter now has revenue. And more will come, as the company promises to develop meaningful relationships with companies that share its vision of creating value for users - be they big companies or small ones. More on the announcement can be seen on the Bing blog.

Update: (I just received this via e-mail from a Bing PR rep)
Hi Louis,

This morning at the Web 2.0 Summit in San Francisco, Qi Lu, President of Microsoft’s Online Services Division is announcing a new beta feature that enables people to easily search Twitter’s real-time information feed directly in Bing. This new feature helps people make better decisions and more fully understand Twitter conversations by collecting, analyzing and uniquely presenting real-time Twitter content.

More specifically, the new Twitter developments in Bing include:

A real-time index of the Tweets that match your search queries in results. This feature makes it easier to follow what’s going on by reducing the amount of duplicates, spam, and adult content.

Giving you the option to rank tweets either by most recent or by “best match,” where we consider a Tweeter’s popularity, interestingness of the tweet, and other indicators of quality and trustworthiness.

Providing the top links shared on Twitter around your specific search query by showcasing a few of the most relevant tweets. Additionally, Bing automatically expands those small URLs (like bit.ly) to enable you to understand what people are tweeting about. Instead of showing standard search result captions, we select 2 top tweets to give users a glimpse of the sentiment around the shared link.

You can try out the new Bing Twitter search beta here momentarily or learn more about it at the Bing blog. Please note that this is a U.S. only feature at this time.

Facebook Partnership

As part of his on-stage discussion at the summit, Dr. Lu is also announcing a global partnership with Facebook that will bring public Facebook status updates to Bing search results. The experience will be available at a later date.

Simler, the Interest-Based Microblog Network, Open to the Public

Simler, a fledgling interest-based social network, which I covered in early September (See: Find Similar People and Interests With Simler's Microblogging Platform) is now open to the public and no longer requires invites. You can find my account at: http://simler.com/user/louisgray/.

Simler lets you set up new discussions around tags, and like other real-time services, activity on specific topics bumps them to the top. Check it out if you don't already have a login.

Can Twitter Replace RSS for Sharing the Best of the Web?

On Monday, early adopter and Web provocateur Robert Scoble suggested that my use of Google Reader to share the best of the tech Web each day was antiquated. In fact, he called Reader "a dead product" compared to Twitter, which he believes will grow in importance for information discovery, especially as the lists feature is released more widely into the network. I respect Robert a great deal and we're good friends, so this kind of discussion doesn't bug me at all. As usual, it got me to thinking about why I do what I do, and whether it should change.

As discussed many times here, I share the best RSS items that enter my Google Reader inbox each day. Lately, I have been sharing upwards of 30 items daily, up from the previous 20 to 25. These hand-selected items are then available on my link blog, in Google Reader for comments from other connections on that service, and downstream on other networks, including FriendFeed, Facebook and Socialmedian.

Last month, I noted the introduction of a new PubSubHubbub-enabled application called Reader2Twitter, which makes it easier to share these items directly to Twitter as well. I even created a new Twitter ID for this, called @lgshareditems.

In parallel, Robert has been trying to do something similar, using not RSS, but Twitter, to share the best of the technology Web as it streams on his screen. While I have chosen to read 716 different feeds, Robert has chosen to follow more than 8,000 individual Twitter users. Similarly, his favorite Tweets are sent to an account called @scoblefaves, via FavStar.fm.

My Approach on the Left, Scoble's On the Right

In theory, both of us have the same goal. Both of us want to act as aggressive information filters, passing along the very best data to those downstream. But we are using different tools. My tools haven't changed much in the last two years, and Robert thinks that he is on to something. I like that he is being innovative, and once again, taking a chance by using a familiar tool in a new way, but there are more than a few reasons I won't be giving up the link blog in exchange for a Twitter favorites list any time soon. Not the least of which is that I have typically used my Twitter favorites to highlight positive mentions, similar to how I run my Delicious account, not to highlight news of the day.

See How Self-Centered I Am?

In Contrast, Robert Is Acting As a News Filter

In the discussion Monday, Robert said that Google Reader was "slower and lamer than Twitter is". That's been a common refrain from people who have said RSS "is dead". He also mentioned that Twitter doesn't have full text. So let's compare the two.

Advantages: Twitter Favorites
    1. Speed.
    Assuming that Robert is following the same people I am, but on Twitter instead of in Google Reader, Robert is correct that many people post their blog items to Twitter faster than those items reach Google Reader, where new posts can lag by 20 minutes or so. Many pieces in the ecosystem are PubSubHubbub-enabled, but not every leg, and therefore, there can be delays.
    2. Ease of Resharing.
    If Robert favorites a Tweet and that goes to his @scoblefaves account, it can be easily retweeted downstream, further into the network.
    3. Some Native Content Is Not Link-Based
    There are some interesting observations or comments on Twitter that are not links to a third-party site.
Advantages: Google Reader Link Blog
    1. Sharing of the Original Source
    A shared Google Reader item is one click away from the full source data. A favorited tweet is essentially a share of a share, as the original content is somewhere else.
    2. Full Content Beyond 140 Characters 
    Google Reader items contain as much data as presented in the RSS feed, going beyond the headline, but also including the body text, layout, etc. 
    3. Rich Media 
    Twitter today is still text. Pictures and video from third-party services are displayed as URLs, not as the content itself, with one key exception being the Brizzly Twitter client.
    4. Integrated Comments On Each Item
    Each shared item in Google Reader offers connections the option to have a parallel discussion away from the blog post - something impossible with Twitter, which would instead require a series of replies or retweets.
    5. Not All Blog Content Gets Sent To Twitter
    I cannot safely assume that every blogger I follow also posts links to their content on Twitter, nor can I assume that I am following them all there, or that I see their every update. Therefore, RSS cannot be replaced.
I am a big believer in RSS and in blogging. I believe Twitter is infrastructure, and that it has many uses. I believe that even if links to some content can be found as much as a half-hour earlier, it is worth seeing the full content from its original source, and sharing that content in its entirety. I also have two years' worth of inertia in the Google Reader link blog, which is powering my social graph everywhere else. Lastly, I believe that the speed discrepancy we see today is going to improve with greater adoption of tools like PubSubHubbub.

With the potential for FriendFeed to disappear, I fully understand Robert moving away from the "likes" initiative he had on the service, hoping that moving to Twitter favorites would fill that need. It's a noble approach. Maybe in time others will do the same, much as I followed him in cultivating an active link blog two years ago, a practice I continue today. But we are far from the point where I am going to trade out my current process for something that not only seems less useful, but would certainly be less fun. As long as Web sites still publish and bloggers still blog, there will be room for RSS and room for a great reader and room for sharing. Until that ends, I'll keep going.

October 17, 2009

At 5:04 P.M. On October 17, 1989, The Earth Moved

20 years ago today, a 6.9 magnitude earthquake hit the San Francisco Bay Area, taking 63 lives, postponing the World Series featuring the San Francisco Giants and the Oakland A's, and putting the entire region into disarray. Over time, the community rebuilt itself, and the entire Bay Area went on producing world-leading technology in Silicon Valley and training top students at Stanford, UC Berkeley and other area schools, helping to accelerate the digital age we live in now.

Nobody knows when the next big earthquake will happen, but practically everyone believes that we're due. Last year, I wondered aloud how the world might react to an earthquake that hit the Valley, perceived to be full of upper-class egocentric folks, not the more sympathetic low-income victims hit by natural disasters around the world. But there's no doubt that if there were a disaster to hit the Bay Area, again the region would need help.

Everybody in the area has their own story. "Where were you when the earthquake hit?"

At the time of the earthquake, I was only 12 years old, and on my way to soccer practice. I lived in Northern California, but far away from the damage of the quake. An A's fan, I had looked forward to seeing Game 3 of the World Series, but soccer practice had been on the calendar, so off I went.

On the way to practice, the only bumps and shakes we felt were the result of the coach's rickety van going down the semi-paved roads. We didn't know about the quake, and practice was set to start at 5.

Midway through the practice, as we were scrimmaging, one of the other kids' moms pulled up and said there had been a massive earthquake in San Francisco, saying that anybody who had family or friends in San Francisco could come with her, and she would take them home. The rest of the scrimmage, as we listlessly kicked the ball, our thoughts were somewhere else.

At the end of the practice, we went home and watched the news roll in on the TV, and we learned more over the next hours and days. A section of the Bay Bridge had collapsed. The Marina was on fire. The World Series was being postponed. Everything had stopped. The ensuing days were full of statistics and stories of individual heroism. The community rallied together and rebuilt.

Eventually, the A's came back and won the World Series. They swept. It's the last time they won, so it's been 20 years for that too. And even though it has been twenty years, I always remember the day of the big quake. October 17th, 1989. I remember the time. 5:04 p.m. If you were in the Bay Area, what was your story? What do you remember?

Gary Burd Exits Facebook Two Months After FriendFeed Acquisition

Following the August acquisition of FriendFeed by Facebook, the site's loyal users are still waiting for news about whether the social network and aggregator has a future, or barring that, when elements of the site will start populating Facebook. But for the most part, there has been little news, and some are pointing to reduced traffic and engagement there as signs the product will just fade away - even as I hear rumors that's not the case. But this week, we learned of the first high-profile defection from the FriendFeed ranks at Facebook, as highly-respected engineer Gary Burd, who also counts Google and Microsoft on his resume, quit the social networking giant this Wednesday.

Burd spent seven years at the Redmond monolith, where he helped develop the Trident HTML rendering engine, a main ingredient in Microsoft's Internet Explorer 4, and later contributed to Google projects including Google Talk during his four-year stint in Mountain View. He joined FriendFeed in June of 2008, after independently developing Mail2FF, a service that let users update the site via e-mail. Following his hire, the company rebranded it as "FriendFeed by e-mail" and made it an official feature of the service.

After seeing a tweet by Gary that said simply, "Last day.", I asked him about it, and he confirmed he had left Facebook because he does not enjoy telecommuting. He lives in the Seattle, Washington area, and will be looking for projects locally, after telecommuting with FriendFeed for nearly a year and a half. Not coincidentally, Gary also posted a tweet that read "Last Day!" when leaving Google in March of 2008. (By the way, don't read too much into his tweet mentioning a new MySpace profile, assuming he'll go there next.)

During his time at FriendFeed, Gary worked on projects including the service's real-time API, a dedicated IM client, the Simple Update Protocol (SUP) and most notably, real-time search by topic, a fix for the much-desired Twitter tool, Track.

Gary's track record should no doubt make him an extremely valuable recruit for companies in the Seattle area, and his leaving is absolutely Facebook's loss. While not as visible a defection as if any of FriendFeed's cofounders opted out of Facebook, it does tend to raise more questions in a time when many people are still looking for answers.

Proposed Salmon Protocol Aims To Unify Conversations on the Web

As comments on the Web become fragmented, conversations that occur on downstream aggregation sites often take place in a silo, disjointed from parallel discussions on the originating Web site. Over the last two years, many people have found this evolution controversial, hoping to unify the conversations in a central location - and some services, including JS-Kit's Echo and Disqus, have taken the first step by pulling external discussions back to the source. But a brand new proposal, authored by John Panzer of Blogger, called the Salmon Protocol, is looking to take advantage of PubSubHubbub to unify the conversations in all places, both upstream and downstream. And yes... the protocol is named Salmon because those fish manage to swim upstream, just like the comments.

An Initial Presentation on the Salmon Protocol

As discussed in Friday's panel at Blog World Expo on technology and the real-time Web, PubSubHubbub essentially works as a middle-man conduit, taking information from a data source and passing the changed data along to downstream destination sites. The proposed Salmon Protocol would similarly watch both source and destination sites for comments, and upon discovering new ones, send them to whichever site is missing the full conversation. If multiple downstream destinations are designated, the Salmon Protocol will populate them all.
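For those curious about the mechanics, the publisher side of PubSubHubbub is very lightweight: when a feed changes, the publisher simply pings its hub, and the hub takes care of fetching the feed and pushing new entries to subscribers. Here is a minimal Python sketch of that ping; the parameter names come from the PubSubHubbub spec, but the hub and feed URLs are placeholders.

```python
# Sketch: the "publish" ping a feed source sends its hub when new
# content appears. Parameter names follow the PubSubHubbub spec;
# the URLs below are placeholders, not real endpoints.
import urllib.parse

def publish_ping_body(topic_url):
    """Build the form-encoded body of the publisher's ping to the hub."""
    return urllib.parse.urlencode({
        "hub.mode": "publish",   # tells the hub this is an update notice
        "hub.url": topic_url,    # the feed (topic) that just changed
    })

body = publish_ping_body("http://example.com/feed.atom")
# The publisher would then POST this body to its hub, for example:
#   urllib.request.urlopen("http://hub.example.com/", body.encode())
```

The hub does the heavy lifting from there, which is why delays shrink as more legs of the ecosystem adopt it.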

In conjunction with PubSubHubbub, the Salmon Protocol leverages the newest iteration of WebFinger, enabling publishers to receive comments and verify who sent them - a form of true identity recognition, similar to how both Disqus and JS-Kit have you register for individual accounts with either service. An additional side benefit of leveraging WebFinger would be to dramatically reduce the potential for spam, assuming each individual has a unique ID.
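To make the upstream flow concrete, here is a rough Python sketch of the kind of Atom entry a downstream aggregator could push back to a blog's Salmon endpoint when a reader comments there. This is an illustration under assumptions, not the draft's exact wire format: the signing step is omitted, and the author name, comment text and post ID are invented.

```python
# Sketch: serializing a comment as an Atom entry to send "upstream".
# The Atom Threading Extensions' in-reply-to element links the comment
# back to the original post; the Salmon draft's signature step is
# omitted for brevity, and all values below are placeholders.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
THREAD = "http://purl.org/syndication/thread/1.0"

def comment_entry(author, text, post_id):
    """Build an Atom entry for a comment, pointing at its parent post."""
    ET.register_namespace("", ATOM)
    entry = ET.Element("{%s}entry" % ATOM)
    who = ET.SubElement(entry, "{%s}author" % ATOM)
    ET.SubElement(who, "{%s}name" % ATOM).text = author
    ET.SubElement(entry, "{%s}content" % ATOM).text = text
    reply = ET.SubElement(entry, "{%s}in-reply-to" % THREAD)
    reply.set("ref", post_id)  # ID of the post being commented on
    return ET.tostring(entry, encoding="unicode")

entry_xml = comment_entry("A Reader", "Great post!",
                          "tag:blog.example.com,2009:post-123")
# An aggregator would POST entry_xml to the blog's Salmon endpoint,
# discovered via a link in the blog's feed.
```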

The debate over fractured conversations has risen and fallen over the last two years. In September, I essentially said I was done listening to people complain about the issue after hearing complaints regarding Google SideWiki - as I believe people will want to have conversations where they are comfortable, and that they shouldn't be forced to come back to a single source. This is a point I have been hammering since the first major flareup back in April of 2008. (See: Should Fractured Feed Reader Comments Raise Blog Owners' Ire?)

Many people believe that transporting comments from one site to another and unifying the conversations could cause confusion, or even make potential commenters uncomfortable. With this in mind, John has suggested that users "be made aware of the publishing scope of the comments they leave," adding "For some aggregators, this may be implied (all data is public), for others a warning or a checkbox may be necessary." (See: Salmon Protocol (Draft) Protocol Summary)

A Test Comment from the Aggregator Via Salmon

The Resulting Comment Back On the Blog Via Salmon

There is a test playground for the Salmon Protocol, and I can verify that it already works. If you want to test it, one option is to take a testbed Blogger account and point the Salmon Protocol at it. The process is automatic, and comments that happen on the downstream aggregator make it back to the blog immediately, thanks to PubSubHubbub. Now, the quest becomes turning this brand-new protocol into a new standard - one that could pose a serious challenge to services like JS-Kit Echo and Disqus, even including threaded replies. If done well, the long debate over unified conversations could soon be over.

Learn more at: http://www.salmon-protocol.org/.

October 16, 2009

The Blog's Place In A World of Microblogging: Not Dead Yet!

Even as the microblogging space seems to be white hot these days, the world of longer-form blogging is still seeing impressive growth, with all major blogging platforms showing year-over-year growth of 20 to 40 percent or more, and record numbers of users, blogs and total readers, according to Compete.com data and a presentation from Google's Rick Klau, product manager for Blogger, who spoke at Blog World Expo this afternoon. Rick reported that his platform, Blogger, which I use, is now seeing nearly 300,000 words per minute, scaling to 417 million words per day, from more than 10 million content creators.

Yet, despite this high usage, many have challenged the platform, saying "blogging is dead". There are more than 360,000 results on Google for the phrase "blogging is dead", with many high-profile articles saying that disparate social networks like Facebook, FriendFeed and Twitter should be where people's attention is. But Rick said that the rise of microblogging didn't necessarily come at the expense of traditional blogging. In fact, he said these third-party sites actually serve to drive even more attention and traffic to the core blog content.

"Microblogs are complementary, not competitive," Rick said. "It is a driver of attention and engagement back to the blog."

Rick, who has run his own personal blog (at tins.rklau.com) since 2001, reported that Twitter has become the highest traffic generator for his site outside of search, with Facebook and FriendFeed also high on that list. He suggested that rather than trying to fight the flow of microblogging, you should embrace it, making sure your content is available to these disparate networks while remembering to engage wherever it lands.

"The blog tends to be visited by people interested in what you are saying, and the people on Twitter and Facebook are interested in you, and by proxy, what you are saying," he said. "I happen to believe that based on what I have seen with my blog for eight years, people are comfortable communicating in the environment they have chosen. If I force the conversation back to the blog, I will lose the audience who have the eagerness to engage, comfortable where they live."

Rick suggested that Facebook users pull their blog content into Facebook, making it available to this new audience, who may leave comments much different from those native to your site. He also recommended utilizing tools like bit.ly to track statistics and click-throughs to your site from the links you send through Twitter, to help you understand how much the microblogs are impacting your own blog.
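As a quick illustration of the bit.ly suggestion, here is a small Python sketch that builds a request to bit.ly's public stats API. The endpoint and parameter names reflect bit.ly's v3 API as best I understand it, and the login, API key and short URL are placeholders, so treat this as a starting point rather than gospel.

```python
# Sketch: constructing a bit.ly v3 "clicks" API request to check how
# many click-throughs a shortened link has received. The credentials
# and short URL below are placeholders.
import urllib.parse

def bitly_clicks_url(login, api_key, short_url):
    """Build the stats request URL for a given bit.ly short link."""
    params = urllib.parse.urlencode({
        "login": login,
        "apiKey": api_key,
        "shortUrl": short_url,
        "format": "json",
    })
    return "http://api.bit.ly/v3/clicks?" + params

url = bitly_clicks_url("mylogin", "R_abcdef0123456789", "http://bit.ly/4BbEqA")
# Fetching url and parsing the JSON response would yield the click
# counts, e.g. via json.load(urllib.request.urlopen(url)).
```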

Given his background at Google, Rick made it clear that the company's data-driven nature informs its decisions, and the company continues to see serious growth in traditional blogs.

"There are very few questions that get asked at Google when I don't have the data to back up an answer," Rick said. "You don't get many opportunities to say 'I feel' or 'it seems' at Google."

But in his experience, Rick suggested that bloggers not get locked into writing posts to chase specific statistics, such as page views. You are a multi-faceted person, he said, so you should feel comfortable covering more than a single topic.

"Don't become a slave to the focus of your blog at the expense of having fun. You can be passionate about a wide variety of subjects," he added.

Blogging and micro-blogging are not a zero-sum game; they can be complementary. Sending blog content to downstream networks makes that content available to those connections who are more comfortable in their own environments. As I have mentioned many times, your blog is your brand - which Rick echoed by saying that on your blog, you control every pixel, and therefore, the end user experience.

Bloggers need to adapt, but they are not antiquated in this new world. It makes sense to participate wherever the content lands and wherever your readers are, without pushing to centralize the conversation on your site, but there is no substitute for long-form conversations and being passionate. Rick communicated a simple formula: "Content + Passion + Engagement". And that will make the blog go, even in a world of change.

October 15, 2009

Cheezburger Network Scales Company By Focusing On Happy

The Web simply can't get enough funny cat pictures. Or dog pictures. Or sleeping cat pictures. Or pictures of people failing. Or creative approaches to fix things. The Cheezburger Network has grown dramatically, racing to the point where it sees a billion page views every four months, after it took nearly two years to reach the first billion. Growing the company at this rapid scale is no accident, as the network has focused its business objectives on its customers, aiming to remove distractions and obstacles that could hinder potential growth.

Ben Huh, CEO of Cheezburger, said the company focused not on how many features they could push into the network, but instead on simplicity - letting the users dictate how the business would grow. And the company is realistic about what it aims to do. They aren't out to make a billion dollars in profit, or to change the world.

"Our mission statement is to make people happy for five minutes a day," Ben said today at Blog World Expo. "We kept hearing from our audience members, 'You make us happy.' When you allow your users to dictate how your business will operate, and can develop a thick skin, that is how you grow."

If you have spent any amount of time on social networks, or received any e-mail outside of your office, you probably have encountered work from Cheezburger Network. From I Can Has Cheezburger and I Has a Hotdog to the FAIL Blog, GraphJam, There, I Fixed It, E-mails from Crazy People and It Made My Day, the company is trying to reach a broad audience through more sites and more content, based on user feedback. In fact, today they announced a new site featuring sleeping animals, called Daily Squee, which is reminiscent of the also-popular, off-network Cute Overload.

The sites' mantras are pretty simple: post funny pictures with custom captions. Ben said that human nature tends to admire complexity but rewards simplicity, and that introducing complexity inversely affects a business's ability to scale. Instead of investing in expensive custom software and hardware, Cheezburger utilizes standard products including WordPress, JS-Kit, YouTube, Google Apps, cloud storage and open source applications, keeping costs low.

While some of the network's sites may seem fanciful, they are thoughtfully planned out, new proposals are theorized frequently, and they target those who may not yet be avid fans of the network. Ben estimated that half the company's traffic was non-critical and transient, while 30 percent constituted the regulars, another 15 percent were fans, and an elite 5 percent were people who show up every day, multiple times a day. He suggested that businesses focus on the largest population that could migrate to a higher devotion level (in this case the 30 percent), as that will increase the total number of fans, rather than designing the site around the power users, who are typically edge cases.

Beyond keeping its audience happy for 5 minutes at a time, Ben said the company frequently thinks in small time allotments, saying, "If I had wanted to work four hours a week, what would I have to do?" and asking, "If my users had forty seconds on the site, what would they want to do?" Key to satisfying these short term visitors? Eliminating distractions and removing barriers.

The goal is to keep the company in touch with what users want, purging the common mistakes that come from within, including ego, pride, assumptions, sacred cows, secrets, coverups, and individual reputation. The result is one big fat happy network.

Communicating in the Age of Streams: Ubiquity, Multiplicity, Visibility

The rate at which information is being produced shows no sign of slowing down, and humans are adapting to the onslaught of this so-called firehose by having shorter attention spans and filtering out information they aren't much interested in, so that it fades into the background as noise. In parallel, information is getting ever decentralized, and conversations are taking place in an infinite number of places, which makes the task of participating in every relevant conversation a practical impossibility. As this has happened, the model of dedicated Web sites, and even blogs, can look extremely outdated - aiming to act as centralized destinations in a world of streams. Edelman vice president Steve Rubel, who recently moved his blog to a lifestreaming format, based on Posterous, explained at Blog World Expo how he takes on the stream and helps his clients gain visibility in the fast-moving world.

Rubel, who created the highly popular MicroPersuasion blog back in 2004 and updated it multiple times a day before moving to the lifestream earlier this year, said "we are reaching a critical breaking point" when it comes to the information firehose, adding, "information is going to continue to scale, but human attention doesn't scale, so we have to think about how each of us manages it."

You no doubt have seen some of the more aggressive ways to tackle information overload, including "Inbox Zero", "Mark All as Read Day" or even "e-mail bankruptcy" - all essentially differing forms of throwing in the towel and admitting failure. Steve quoted recent studies showing that the average person in the US visits 111 different domains and approximately 2,500 Web pages a month, as people make choices about where they spend their time and which pieces of information they respond to. And one of those places that usually isn't getting a lot of their attention? Company Web sites and old-fashioned blogs.

Steve suggested that one of the major reasons that Twitter exploded was because it centralized all these diverse conversations and put them in one place, also leaning on short forms of communication, adding that on average, people only read about 20 percent of the Web page before moving on. Much of the reason for their shorter attention spans? More data, coming ever more quickly.

"Everything is moving faster now, whether you like it or not," he said. "It's like a sushi boat moving past at 100 miles an hour with 1,000 different options. How do you make sure you get selected and stand out?"

Instead of fighting against the stream and forcing people to come back to the originating hub, Steve started to think that maybe his blog "didn't matter as much any more", and appeared "archaic". Now, he is posting content, via Posterous, to his lifestream and also to each of the spokes (like Flickr, Delicious, etc.) and participating wherever that content gains traction - essentially creating a very customizable hub-and-spoke model that carries his own personal brand and the flexibility to put the right content in the right place.

For companies and businesses looking to take on the streams, Steve highlighted three major imperatives for going beyond broadcasting to ensure quality engagement: ubiquity; multiplicity and diversity of message; and finally, discovery and visibility. The new lifestream-powered Web sites would enable companies and brands to be "everywhere stakeholders are spending time", and open the opportunity for different stories in different venues in different formats, avoiding a one-size-fits-all approach.

The stream is real. Whether you call it the flow, as Stowe Boyd has, or the River of News, as Dave Winer has, the firehose is pushing more data our way faster than ever. It could be that the lifestream is an answer.

For more on lifestreaming, make sure to check out Mark Krynsky's Lifestreamblog.com.

How To Rally Your Community, Leveraging Social Media

Social media is a tool. Last month I said social media is infrastructure, and I have compared Twitter to the new e-mail or a parallel Internet. Because of this, enterprising folks are finding ways to leverage these new tools for practically every facet of business, not just for social media marketing, or daily minutiae, but for rallying the community to support charity. I've talked a lot about the #BlameDrewsCancer phenomenon, featuring my good friend Drew Olanoff, but as we learned today at the Blog World Expo this morning, he is not alone in his efforts to leverage social media communities to take on cancer and other causes. The panelists all agreed it takes effort, persistence and consistency, making an abstract illness personal, real and tangible.

Drew has been relatively lucky in his challenge with cancer. His battle with stage 3 Hodgkin's Lymphoma is very possibly nearing its conclusion, with 10 of 12 chemo treatments behind him, and positive feedback from his doctors. But not everybody has a happy ending. Jay Scott, father of Alex from "Alex's Lemonade Stand", told us about how his daughter passed away at the tender age of 8 1/2 after fighting cancer her whole life, devoting her efforts to raising money the only way she knew how - selling lemonade on the family's front lawn. Jay and his team have rallied to continue her cause and make it into a movement that has inspired hundreds of thousands.

The tools that Jay, Drew and others have used to punch cancer in the mouth are the same tools we all use. Drew said, "It's not rocket science, it's word of mouth. If you have a passion, be passionate about it and use the tools in front of you." But he added it takes a ton of effort. He said, "It's not Field of Dreams here. You have to go after people and be aggressive. Whatever got your attention, I don't care what it is, I got your attention. All these tools are is platforms for people."

Meaghan Edelstein, who battled cervical cancer and won, said the main push from social networks is in the name itself: social. She said, "You have to get people excited. People are on social networks because they want to do something, so give them something to do."

But as you can imagine, fighting a faceless killer and rallying people to a cause is not as simple as setting up a Twitter account and Facebook page and watching the activity roll in. It takes serious work that is genuine and personal.

Meaghan said to "be authentic" and be a real person with real messages, and recognize that the smallest blog or least-followed Twitter account is just as important to embrace as the household names. "You are not too good for anybody else, and you have to remember that," she said. "You have to thank them and participate. That's what makes things authentic and social. If you don't have these things, you have to get out of the arena."

But all the effort in the world won't make a difference if the story doesn't sell.

Jay said "you have to have a good message, and if it is powerful, people will tell their friends about it," adding, "I can't believe how powerful retweets are. It helps if people have a million followers, but it also helps from those people who have ten followers."

When cancer took on Alex Scott, and took her away from the world much too soon, it may have been a short-term win for cancer at huge expense to her family and us all, but it turned out to be a massive mistake on cancer's part, because the cause has lived on and Alex has become a symbol for rallying for a good cause. Each of the day's panelists made it clear that their battle with cancer had gotten personal and their community has come to their aid.

Find out more about the causes:

Do You Trust Small Companies With Your Data More, Or Big Ones?

A few of this summer's acquisitions featured a scrappy upstart much beloved by the Web masses getting absorbed by a larger, more-established acquirer - with two of the more prominent examples being Intuit's buy of Mint.com and Facebook's takeover of FriendFeed. And amidst the ensuing responses, I saw two truly oppositional reactions - the first from people who swore they would never use the larger company or service because they hated it or didn't trust it, and the second, from people who now thought it was "safe" to use the smaller service as it finally had some parental supervision.

I recognize that some people have a greater tendency to accept risk in their lives, including risk to their data, than do others. Some lines of business and people operating those businesses are as a rule conservative - not venturing to buy one company's goods until they have done a full background check on the firm's financial stability, or have seen a flurry of similar use cases from peers. Others flock toward a series of early adoptions, where a personal relationship with a site's founders or employees is possible, thanks to the product's newness. And no doubt, the two sides rarely agree on a set strategy.

What are the underlying concerns both parties may have?

For Those Who Favor Big Companies Over the Upstarts
  • A small company may not have taken all necessary precautions to protect their data, making it vulnerable.
  • A small company may not have longevity, and if it expires, so too could your data.
  • A small company may grow desperate for funds and could sell your personal information.
For Those Who Favor Small Companies Over the Giants
  • A large company is more likely driven by sheer dollars than by customer service.
  • A large company may have a history that contains questionable moves.
  • A large company may act unilaterally in terms of how your data is used.
In parallel with the two acquisitions I mentioned, there have been a few isolated cases of a smaller company putting itself up for auction, essentially turning its user base into a marketing list for sale to the highest bidder - a list that may contain personally identifiable information, or possibly even passwords. At the same time, you can see people who strongly dislike Google, don't trust Microsoft, or think that Facebook is evil. I even saw a post go up yesterday saying that Cisco was evil. The bigger they are, the bigger a target they are.

I tend to trust companies rather than distrust them. I am an optimist. I think there is a point where personal relationships with the founders trump a robust multi-tier support system or a flashier GUI. But it's not for everyone. What are your thoughts, and do mega-mergers change the way you perceive your data being protected?

Hey Bloggers, Step Away from the Twitter for a Second... and Blog

There are a few shiny things here in Vegas that have bloggers' attention. No question about it. There are shows and clothes, lookers and hookers, drinks and winks. But it may be a shiny blue bird and the glow of mobile phones that have many bloggers' attention this year, even if they are in Vegas instead of wherever they may call home. Blog World Expo, a conference dedicated to blogging, about blogging, featuring bloggers and discussing all things blogging, should see a whole lot of blogging. Right? But there's a high possibility that many will forgo full-size blogging in favor of everyone's favorite microblogging application, favoring hashtags over Technorati tags.

Chris Pirillo, lifecasting maven and tech geek to the masses, this evening asked what many of us have been observing ourselves, when he said, "How many bloggers does it take to blog about Blog World Expo? None. They're all tweeting about it, instead." Now I will give bloggers the benefit of the doubt a bit for tonight only, given the show hasn't really kicked off. But as I noted at South By Southwest in the Spring, and at Blog World Expo last fall, a tendency has emerged for those attending events to "live tweet" the event, 140 characters at a time, and skip the wrap-up report of what sessions they attended, ideas they encountered and people they met. And I wish this were not the case.

I don't want to be putting down any quasi-moral ultimatums about how the event is about the sessions and not about the parties, even if that's how I feel most days. I don't think it's my place to tell people to stop having so much fun and focus. But I do hope that those of us who have the privilege of calling ourselves bloggers actually blog.

Over the next few days, I will be seeing many of my peers and respected tech leaders talking shop and I look forward to learning a lot. I will be blogging as much as I can where it makes sense, following the model I discussed in March of this year, when I showed how to blog live events and publish with lightning speed. While at South by Southwest this March, I was aggressive enough to add 14 separate posts in the four-day period. I can't promise the same amount this week, but I will do my best to bring the event to you.

With great tools like Lazyfeed available to help you follow blogs on specific topics, it is now easier than ever to see your posts on the week's event. As much fun as it will be to check in on Foursquare at whatever pub you are drinking at, or to quote tech luminaries with the #bwe09 hashtag all over Twitter, I am hoping I will see some great posts - lots of them - from all the bloggers who are here. I did say It's Twitter's World... but there is a big role for blogs in an age of microblogging, and I want to see all of us try to do both.

BlogWorld Expo 2009: The State of Technology & the Real Time Web

For the third time this year, I am back in Las Vegas. And per usual, I'm not here for anything resembling a vacation, as it's conference time. But instead of attending an event on behalf of a specific company, as I have done many times, I am going on behalf of the blog, and making many of the connections I have forged online since 2006 more concrete - through participating in Blog World Expo. This is my second year participating, and for 2009, I have the challenging, yet exciting, opportunity to talk about what has to be one of the biggest stories in technology this year - the real-time Web.

On Friday, the second day of the conference, at 11:30 a.m., I will be speaking solo - trying to discuss the impact the Web is seeing as real-time becomes further embedded in many of our daily online activities.

The subject of the real-time Web is near and dear to me. As an information consumer and producer, anything that lets me get my data out to more places faster than ever, or lets me get to more data faster than ever, is a big deal; reducing latency is huge. That's part of why I made the real-time Web central to my #1 prediction for the world of tech in 2009 back on New Year's Eve.

Excerpting from my post back on December 31:
"Delayed news will no longer be acceptable for early adopters, who will gravitate to the quickest sources of news, wherever they may be. As tools like Twitter Search and FriendFeed allow people to rapidly broadcast their updates, reactions and news with true immediacy, a segment of the population will adopt these real-time sources and favor them ahead of delayed or filtered engines, including RSS, and of course, edited mass media. At the same time, while many of us early adopters may be fairly noisy about this development, we will remain in the significant minority, even as the mainstream becomes more aware of these options."
I've been well known for getting my predictions wrong, but every once in a while, I feel like I am on to something, so this is gratifying.

If you are here for BlogWorld Expo, you can expect to hear a lot more about things like PubSubHubbub, Lazyfeed, Reader2Twitter, Facebook, FriendFeed, Twitter Search, and more at 11:30 on Friday. Just make sure you add the session to your own custom schedule. It will all go down in real time. Looking forward to seeing you there.

October 14, 2009

Technorati Roars Back To Life After Self-Imposed Slumber

Only a select few Web 2.0 companies have suffered the roller coaster of peaks and valleys that Technorati has. Once a clear industry leader for blog search, statistics, and individual site "authority", Technorati's influence withered away thanks to an aggressive push by Google into the blog search arena, statistical gaming by number-crazed bloggers, management changes, odd product launches, and inconsistent uptime. But with a major relaunch tonight, the company has tried to throw off the shackles of the old and rise again, armed with more cash in the bank, a talented editorial staff, and a new look. All of a sudden, the site looks relevant again.

Back in early 2007, Technorati was one of the favorite topics on this blog. You could see the tumult at the company, as then-CEO David Sifry wrote on a Tuesday in a comment here that he was "very very happy at Technorati", only to announce he was looking for a new CEO that Friday, three days later. You could see debate that summer over people trying to game the then much-watched "Authority", which counted up external links to your site in a six month period. But by early 2008, we were using phrases to discuss the company that included "totally toast", and the new Twitter generation, less than two years removed from Technorati's heyday, scarcely remembers the once respected innovator.

Rising and Falling Blogs On Technorati

But as of tonight, they are back in the game. They ditched the old metrics for attributing authority, as it was considered too static, and now will aim to reward authors for posting frequency, context and linking behavior. Interestingly, they have also introduced authority by topics, meaning that technology blogs can be compared to others in their sector, as can sports blogs, music blogs, and so on. This means that aside from the overall Top 100, sites like TechCrunch don't have to be measured head to head against Huffington Post, and we smaller blogs can get a better idea of who our peers are. (See: A Totally New Technorati.com & Technorati Media Rising)

It's Nice to Be Considered a Top 100 Tech Blog

This new ranking system is looking to be more dynamic - changing along with the real-time nature of the Web. Blogs will rise and fall, and be noted on the site. Blogs and individual posts will be featured, and "hot blogosphere items" of all topics can make the front page.

An Individual Blog's Technorati Profile

Occasional louisgray.com contributor and friend to the site, Eric Berlin, is the blogging channel editor, so we wish him well and look forward to hearing more about that role. Additionally, JS-Kit's Echo will enable comments to be placed underneath all blog listings and tags, possibly adding conversation to the data.

Technorati may not be the big giant we once thought they would be, and they will need to have some consistent successes to become a blogosphere darling again, but they are back in the conversation and worth watching.

See Also:
VentureBeat: Big changes coming at Technorati — the CEO’s perspective
TechCrunch: The New Technorati

October 12, 2009

Designing the Perfect Twitter Client Is Impossible. Tweetie Is Close.

Given Twitter's prominence, it comes as no surprise that there are many different clients out there. Some are designed to give you a single place to update multiple social networks at once. Others are designed to give you easy access to multiple accounts you may have. A few are optimized for your own personal groups or subsets of followers. And depending on which client you have selected, they may display images or rich media in line, may support retweeting (or not), or they could provide advanced functionality for direct messages. But good luck finding a single application that does everything best - one you can use on your desktop, through the Web browser, and on your iPhone too, because, guess what? It doesn't exist.

The flexibility of the Twitter platform and the wide variation, even among the most popular Twitter clients, has led to most users choosing a favorite, but still having multiple clients installed. It's rare for a single user's Tweet stream to go a few days without showing more than one client used - thanks to some clients being best at one utility or another.

This week has seen a great deal of interest in version 2 of Tweetie's iPhone application. Rebooted and reloaded with a boatload of new features and enhancements, Loren Brichter's offering is being lauded as the best on the iPhone, period. But even as I may agree that it's great, the discussion has actually led me to revisit the Mac desktop client of Tweetie, and I have been using it almost exclusively (with the exception of posts that flow from FriendFeed) for the last few weeks.

My Tweetie Desktop In Action (Showing the @Mentions Window)

On September 25th, I posted a note illustrating my then-current view of the Twitter client race, saying: "TweetDeck: Best at Groups. Tweetie: Best on iPhone. Seesmic: Best at DMs. Brizzly: Best at Retweets and Images."

I stand by that comment, because I believe that each of the different applications has forged a space for itself to be best at something. TweetDeck introduced the concept of groups to Twitter (See Review), and while others, like Seesmic, have adopted it, TweetDeck has maintained a usability lead. Tweetie's latest iteration on the iPhone has extended its lead over the stale Twitterific and the very busy TweetDeck. For direct messages, I have long found Seesmic's Web app to offer the best option for grouping conversations and seeing previous messages between accounts (See Review), and Brizzly makes retweeting a lot easier than Twitter's native client has.

But one major piece I left out from that comment was the use of multiple accounts. And when you think about multiple accounts, it's my opinion that two products support this capability extremely well. The first is Brizzly, which lets you operate under any of your accounts by clicking on the avatar, correctly carrying over saved searches and all other relevant data. The second is Tweetie, for both the desktop (on the Mac) and the iPhone.

As I've now inherited multiple Twitter accounts for clients, and have also added my @lgshareditems account, as well as a new @privatelg account that I am using primarily for a quieter Twitter experience, a better multiple accounts client was sorely needed, and this is what has pushed Tweetie over the top for me, not just on the iPhone, where everybody is talking about it, but on the Mac too.

Browsing Updates In Tweetie on My @privatelg Account

In April, when Tweetie for Mac debuted, I called it "Clean, Simple and Robust". That's all still true, but now that I am actively managing multiple accounts, and using the product's built-in capacity for retweeting, while enjoying the threaded direct message conversations I used to only enjoy in Seesmic Web, it has practically taken over my Twitter stream. I remain quite fond of Brizzly and Seesmic Web through the browser, but don't mind running Tweetie in the background, as it avoids the drag on my computing power that some RAM-hungry AIR-based clients impose.

At the end of last month, Lifehacker posted a list of what it called its "Five Best Twitter Clients", including TweetDeck, Brizzly, Seesmic, Tweetie and DestroyTwitter. I haven't used DestroyTwitter and don't use Echofon or others, but know that there are some good quality products in addition to the five I chose to focus on - the first four from Lifehacker's list, plus the native Twitter Web interface.

No Leading Twitter App Is On Every Platform

Of the five, no single client supports the desktop, the Web and the iPhone. None! TweetDeck and Tweetie lack Web versions, while Brizzly is Web-only, Seesmic doesn't yet have an iPhone app, and Twitter has no official desktop or iPhone application. So there's clear background for the splintering.

I Believe Each Twitter App Excels Somewhere

While each application has its bells and whistles, there are really four major elements I considered when looking at the top clients: Retweets, Direct Messages, Groups and Multi-User Support.

Retweets: Brizzly (reviewed here) has a handy "retweet" option next to every single tweet. Click "retweet" and it sets up a new message from you prefaced by RT. Couldn't be simpler. Tweetie lets you "Repost" a message as well, with the same functionality. TweetDeck and Seesmic also support retweeting, but I don't perceive them as leading in this functionality.
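The convention these clients automate is simple enough to sketch: prefix the original text with "RT @username: " and keep the result inside Twitter's 140-character limit. Here is a minimal illustration in Python; the truncate-with-ellipsis behavior is my assumption, since clients differed on exactly how they handled overlong retweets:

```python
def build_retweet(author: str, text: str, limit: int = 140) -> str:
    """Compose an old-style retweet in the form "RT @author: text".

    If the combined message would exceed the character limit, the
    original text is trimmed and an ellipsis appended (an assumed
    behavior - real clients varied on this detail).
    """
    prefix = f"RT @{author}: "
    if len(prefix) + len(text) <= limit:
        return prefix + text
    # Leave one character of room for the ellipsis.
    return prefix + text[: limit - len(prefix) - 1] + "\u2026"

print(build_retweet("louisgray", "Tweetie for Mac is clean, simple and robust."))
```

One click in Brizzly or a "Repost" in Tweetie amounts to exactly this transformation, pre-filled into the compose box for you to edit before sending.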

Direct Messages: Seesmic Web does a great job of sorting conversations by author, including how many messages, in a dedicated pane. If I am on the Web and want to respond to a DM, I will do so through Seesmic. But Tweetie's grouped direct messages are visually pleasing and are easily accessible. TweetDeck lumps all direct messages in an "In box" type of column, as does Twitter's native client and Brizzly, although Brizzly can show conversations in line if the activity is live.

Groups: Twitter promises that lists are coming soon, but TweetDeck has made a name for itself with groups, so much so that people wish you could export predefined groups for importing into other services. Brizzly and Seesmic also support groups, and Brizzly promises to integrate with Twitter's Lists option when it shows up. So far, Tweetie isn't doing any groups of any kind that I know of.

Multiple Accounts: As mentioned earlier, Tweetie and Brizzly make multiple account support simple. Seesmic Desktop's multiple account support is very good, but it hasn't yet migrated to the Web equivalent. TweetDeck's support of multiple accounts is functional, but I have seen many a slipup from people using TweetDeck who have posted to the wrong account, so it could be much more intuitive. Twitter's Web client would just ask you to log out and log in again.

Ignoring extraneous functions like the rich media and real-time definitions of trending topics (where Brizzly excels), the biggest missing aspect to Tweetie, in my opinion, is access to saved searches in the desktop app. They are already highlighted in the iPhone app, so bringing them into the Mac client would be a big benefit indeed.

Selecting your favorite Twitter client is a personal issue at this point. To each their own. There is clearly room for many players given the different permutations of each app. But Tweetie is making things real tough for anybody on the iPhone, and for those who don't need access to groups, it should have the Mac desktop space to itself. This doesn't mean I like Brizzly or Seesmic Web or even TweetDeck any less than any other time I talked about those apps, but today, my stream is full of "from Tweetie" and for good reason.