
Archive for July, 2003

Googleholes
Friday, July 25th, 2003

Steven Johnson’s Slate article on some weaknesses in Google created lots of noise in the discussion area. He explains his intentions on his blog.

The three so-called Googleholes are pretty well understood among people who really understand Internet Search, but not by most Internet users. Most people do tend to carry misconceptions, including: 1) there is a right answer to a search, 2) Google is better than other search engines by a long shot at coming up with the right answer, and 3) searching a universe of everything makes sense. Though the three Googleholes that SBJ points at don’t correspond exactly to these three points, his holes, along with the blog comments, cover these points and more.

Over time, as people get more road miles behind them, they are getting more sophisticated about the limitations of Internet Search engines, and thus will start to entertain solutions that involve a bit more knowledge on their part. I’ve been saying for a long time that users won’t do more than one-word queries, but it’s coming time to pull the other way. Somewhere between a one-word query and a bard’s 14-line depiction of injury at the hands of a lover is the possibility of some elaboration by the user.

Similarly, we can start to expect more from the systems. Why should they deliver back the myth of a uniform list of results from a universe of equivalent things? Especially when a system “knows” that it has lots of different kinds of things (not just several kinds of apple, but also many different formats, genres, ages, and orientations of sources and documents), and that even if there is one thing users most likely want, in fact many users will want one of many other things. Why not present this structure and diversity back to users in a way that lets them better understand the options and then help themselves?

Google is fanatically loved, but as much because the first round of Internet Search companies forsook search, and because, of the next round, Google executed better than anybody else, particularly in staying focused on search and in endearing themselves to their users. Hats off on that. That doesn’t mean they have solved the problem of finding what you are looking for in all cases. There are yet miles to go …

Blog Technologists Working Together
Friday, July 25th, 2003

Many blog & RSS technologists and designers are working together to develop a new standard in one place: a Wiki hosted by Sam Ruby, who works for IBM. Clearly that doesn’t stop him from demonstrating a great deal of 4th Vertex skill.

The community is, as the saying goes, planning the work and working the plan, all on the Wiki. Activity is orchestrated and the products of the effort are organized in an extremely lightweight way. A case in point: look at how they are deciding what to call the standard: not Echo, not Atom, maybe Pie. It quickly reminds you of meetings, in their functional and dysfunctional aspects, but meetings with, say, eight people in them. Meanwhile this Wiki has probably 50 active participants, and may have hundreds if it keeps going.

Stay tuned for more discussion of the 4th Vertex, a concept that has been having me for the last few months.

The 4th Vertex is now explained in the July 2003 issue of Information Flow.

Blogs – How Many?
Friday, July 25th, 2003

An article at Cyberatlas (a cousin of Jupiter Research) covers Blogging By The Numbers.

According to BlogCount, there are 2.4M-2.9M active blogs. This is based on reports of active blogs from the three biggest blog hosts (Blogger, LiveJournal, & DiaryLand) and estimations based on other data points (e.g. Bloggers in Poland, RadioLand & MT estimates, etc.).

According to Jupiter, 2 percent of the online world blogs, and 4 percent reads blogs. There is an interesting demographic contrast between readers and writers: the writers are relatively more male and higher income.

Always On Summit – Can wired tools make events interactive?
Thursday, July 17th, 2003

I’ve been at a lot of conferences with Wifi access in the last year, carrying two wireless devices: an old Motorola cell phone and a Blackberry for email. Mostly the devices were about being somewhere else while I was supposedly at the conference. The Always On Summit is my first serious immersion into connecting to the full range of raging lightweight shared text and interaction tools (chat, blogs, wikis) at the conference itself. First the devices:

  • I’m wired to a powerstrip daisy-chained along a row in the auditorium. I’m still using an ancient Thinkpad T21 with dying batteries. I want a new laptop with 10 hours of cordless life.
  • The Wifi was up and down the first day, with all the puns and smart lines along the lines of “always kind of on.” The moments of greatest irony were during the Wifi panel, when panelist lines like “wifi should be like oxygen” provoked synchronized retorts in the minds of many in the audience.
  • Palm is loaning out the new Tungsten C for the conference. Built-in Wifi. Between wifi hiccups, I tried a few sites (like mine), tested whether applets would run (nope), and played “Bejeweled” a couple of rounds. 300-pixel color screen. I’m not buying, even at the conference special of $250 vs. $500 list.
  • My Blackberry. I get too much spam (even with spam control), so at the conference I fetch email using wifi. The first day, out of twitchiness, I scanned my Blackberry whenever the wifi broke, which was most of the time.
  • I left my cell phone the first day in my car, in a parking lot at the other end of campus … when one of the panelists asked the audience how many didn’t regularly carry a cell phone, one person (of about 300 in the audience) raised a hand. I think he was the same guy whose cell phone went off loudly during a session and who searched his backpack for it for 30 seconds.

I remarked to Tony Perkins (Mr. Always On) that he shouldn’t worry too much about the Wifi snafu; better to be connected to the reality of mid-2003. The Wifi was fixed on the second day, and now it’s possible to attend to the other channels. Recently a number of conferences have had wikis, blogs, and chat available right at the conference (e.g. Supernova, PC Forum, O’Reilly conferences). All of these, along with live webcasting to the web, are part of Always On.

  • Chat. This is the most live of the alternative channels, even in the auditorium, because the running chat window is up on one of the two big screens most of the time. Most people mostly stay focused on the stage, but occasionally look over at the live chat screen and scan the text still showing. Most of the chatting is from a small number of people, a mixture of amending and commenting on speaker remarks. Every third or fifth line makes sense even if you haven’t been tracking it for a while.
  • Wiki. A wiki is a set of web pages that page visitors can edit (e.g. amend, add, delete content) from their web browser.

    The Always On wiki is a little hard to find, because it’s buried in the so-called “Webcast” section. Start here, click on “Low Bandwidth”, and the wiki pages are on the right. At the top it says “Edit This Page”, and you can. People don’t usually trash pages, but if they do the administrator can roll back the page (a minimal sketch of this edit-and-rollback model appears after this list).

    The AO Wiki has very little on it right now. I think [check] the SuperNova or PC Forum wikis have more on them, but I wonder whether that content appeared during or after the event. The idea is that the audience would do joint notetaking, and that people not at the event can link in relevant comments or links. The Wifi not working on the first day certainly threw a wrench into live notetaking.

  • Blogs. A number of people are live blogging from the audience on the second day. Check out Gulker. Most people apparently prefer to put longer, “more authored,” thoughts on their own blogs where their authorship is clear. But Wikis should be a good place to contribute quick giveaway pointers or comments.
  • Webcast. My Real player didn’t work, and it was crashing my browser when I went to the webcast page, so I uninstalled it so I could access the Wiki. These live tools for “enriching” conferences for the audience and for connecting in a web audience have a ways to go. Meanwhile, Tony is saying that some 4000 people are watching the event over the web. Part of the chat is people in the audience answering questions about the live setting for people not at the event.
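
As an aside, for readers who haven’t edited a wiki: here is a minimal sketch in Python of the edit-and-rollback model described above. The names are hypothetical and this is not the software behind the Always On wiki; the point is only that every edit appends a new revision of a page, so an administrator can always restore an earlier version if someone trashes it.

    # Minimal sketch of a wiki page's edit-and-rollback model.
    # Hypothetical names; not the software behind the Always On wiki.
    class WikiPage:
        def __init__(self, title, content=""):
            self.title = title
            self.revisions = [content]   # revision 0 is the initial content

        def current(self):
            return self.revisions[-1]    # the page always shows its latest revision

        def edit(self, new_content):
            # Any visitor's edit appends a new revision rather than overwriting.
            self.revisions.append(new_content)

        def roll_back(self, revision_index):
            # An administrator restores an earlier revision; the restore
            # itself becomes the newest revision, so history is never lost.
            self.revisions.append(self.revisions[revision_index])

    page = WikiPage("ConferenceNotes", "Day 1: Wifi is down.")
    page.edit("Day 1: Wifi is down. Day 2: Wifi fixed, chat on the big screen.")
    page.edit("someone trashed this page")   # a visitor trashes the page
    page.roll_back(1)                        # the admin rolls back to revision 1
    print(page.current())                    # the trashed content is gone

Real wiki software adds storage, diffs, and access control, but the append-only revision list is the core idea that makes “anyone can edit” safe.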

And now the obvious question. If my focus is so much on the devices and non-line-of-sight channels and on writing this entry and so on, how can I possibly be listening? Am I flowing or am I distracted? A bit of both. My interest is shifting to where it wants to go, trying to experience and understand this new way a conference can be. Nothing is working perfectly, and too much of my attention is going to the device-techno-tool-goo in the middle, at the cost of what’s being said.

If all this goo worked (meaning it mechanically worked, and also that people understood a refined, simplified, nicely designed version of the goo as well as they understand, say, web browsing), I think it would really begin to turn live events into interactive events.

A central aspect of interaction is a shift of control from author to reader, from the stage to the audience. With my devices, with a choice of several spaces and places of content and connection to others at or not at the event, I have more say over how to be *at* the event.