December 18, 2006
Google += Endoxon;
June 21, 2006
Links, Back-Button and other Web Principles in AJAX
Here is a presentation (in German) on Maintaining Web principles in AJAX that I'm going to give in about one hour at the EuroAjax 2006 conference. It is a co-presentation with Jürg Stuker, who covers the SEO aspects of AJAX.
I cover deep linking into AJAX applications and supporting the back button, both inside an AJAX application and when navigating back into the page. The described techniques should cover IE, Firefox and (recent) Safari. See also Jürg's post, which also offers a downloadable version of the talk.
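For context, the standard technique of that era was to mirror the application state in the URL fragment: changing the fragment creates a history entry without reloading the page, which gives you both deep links and a working back button. A minimal sketch, with hypothetical function names of my own (not from the talk):

```javascript
// Encode application state (e.g. map position and zoom) into a URL fragment,
// so that every view has a deep-linkable address.
function encodeState(state) {
  return Object.keys(state).sort()
    .map(function (k) { return k + '=' + encodeURIComponent(state[k]); })
    .join('&');
}

// Parse the fragment back into a state object when the user navigates.
function decodeState(hash) {
  var state = {};
  hash.replace(/^#/, '').split('&').forEach(function (pair) {
    if (!pair) return;
    var kv = pair.split('=');
    state[kv[0]] = decodeURIComponent(kv[1]);
  });
  return state;
}

// In a 2006-era browser there was no hashchange event, so the fragment
// would be polled, roughly like this (browser-only, hence commented out):
// var lastHash = location.hash;
// setInterval(function () {
//   if (location.hash !== lastHash) {
//     lastHash = location.hash;
//     render(decodeState(lastHash)); // re-render the app for that state
//   }
// }, 100);
```

The back button then simply steps through fragment history, and each step is picked up by the poller.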
February 10, 2006
Finally, I can announce that the first customer of my new project launched this week! Working together with Endoxon AG, the new online map for Germany at goyellow.de/map is now online. Please go and have a look!
You will find a lot of similarities to map.search.ch, with a couple of refinements here and there. It is a full AJAX map for Germany with hybrid and vector views. You can overlay a lot of information from GoYellow, an Internet yellow pages site for Germany, with direct door-to-door public transport directions available for every such point. Additionally, we have global data for the first few zoom levels, so you can get a decent view of the rest of the world.
We put a lot of work into the visuals, especially the image quality and the general UI behaviour. Please have a close look! Note the clarity of the colors and the labels. Or drag the zoom slider.
The project will move forward, adding more countries in this way. And soon we will have some new concepts waiting for you. Oh, and we will get an official name...
Classifieds from the Edges
After a few posts on aggregating "Web 1.0"-style classifieds, I finally have something to blog about a possible next generation of classifieds:
At last, there is some news on edgeio, Mike Arrington's new venture. As expected from the little we knew before, it is primarily an aggregator of classifieds on blogs. Somehow, I wondered why that took so long, since the idea has been on the table for quite a while; but on the other hand, there is not much selling-via-blogs going on yet anyway. I hope that edgeio and similar sites will change that.
From the description it sounds like a solid and complete implementation. They ask sellers to use the 'listing' tag to draw attention, and although the article claims that they will "constantly crawl millions of blogs", my guess is that they subscribe to the Technorati feed for that tag. In this way, they reuse all the anti-splog mechanisms Technorati implements. That would be a very simple and neat model for aggregating things, one that could be copied for many other purposes, too. In fact, many geoblogging sites rely in one way or another on such a mechanism. And for events it is commonplace now to agree on a tag in order to aggregate all related material. Edgeio would, if I guessed right, take this one step further by using the tag only as a means to find a URL, then going on to actually download and analyze the page.
Structured blogging and microformats seem like perfect matches for edgeio. I wonder whether they even require posters to implement one of these models in order to get some structure. But if they do, then they should really mine all blogs and use the listing tag only as a speedup mechanism (i.e., as the Web 2.0 equivalent of 'Submit URL' at the search engines).
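If the guess above is right, the discovery step reduces to filtering a tag feed for entries carrying the 'listing' tag and collecting their URLs for the actual crawl. A hypothetical sketch of that step (edgeio's real pipeline is unknown to me; the function and field names are my own):

```javascript
// Given feed entries of the shape {tags: [...], url: '...'},
// keep only those tagged 'listing' and extract the URLs to crawl.
// In the model described above, the heavy lifting (fetching and
// analyzing each page for microformats) would happen afterwards.
function listingUrls(feedEntries) {
  return feedEntries
    .filter(function (entry) { return entry.tags.indexOf('listing') !== -1; })
    .map(function (entry) { return entry.url; });
}
```

The neat part of the model is that all spam filtering happens upstream, in the tag feed itself.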
The article mentions an interesting usage of trackback. It sounds like they will automatically track back every post they find and include. While that sounds pretty intriguing, I fear it can quickly become very problematic. Essentially, the whole process is automatic mass-trackbacking of everyone using the listing tag. Sure, there is value in this trackback, but where do you draw the line and who decides what is useful? And what if there are 20 more aggregators? I think that every form of trackback that doesn't result from a human choice on the trackbacking site (usually hand-picking a link to the trackbacked site) borders on spam, especially if high volumes are in the game. Well, as the author of the article says, he is not sure that he got every detail right, and the people involved in the project are definitely deep thinkers, well aware of the good and bad potential of these techniques, so I'll stop speculating and have another look after the launch.
By the way, here in Switzerland, Kaywa is doing something in a similar market with ichiba. There aren't many details out yet, but from the little I've heard they seem to take a slightly different approach, one that might sit a little closer to the way classifieds work for users today.
February 06, 2006
The Perpetual Beta: Is Web Development Stuck with Small Incremental Updates?
The term "perpetual beta" sums up an important feature of web-based applications: you can roll out updates as often as you want, and all users immediately benefit from them.
But once a site attracts a non-trivial userbase, this might also be its Achilles heel: it can be hard and risky to impose major updates on your regular users. Sure, you can add a major set of new features, but that leads down the path of complicated sites. And if you want to progress by changing how a current feature works, you can get yourself into trouble. To limit risk, only small changes are considered, which is smart for many other reasons, too, of course.
But what if the web around you progresses and a major change of course is due? What if your userbase sits squarely in the early and late majority? With shrink-wrapped software you could jump on the next curve with the next release, sell it to the early adopters and give the rest some time to get comfortable.
You can't really do that with a website. Sure, you could add a special beta section and run that in parallel, but only real innovators go there, you'd have to keep it separate for quite a long time, you would lose even more pull on the majority of users, and if your site has social aspects you'd cut out the network effects.
So you have a choice: stay on top of major shifts and risk alienating your most faithful users, or eventually get caught from behind by a competitor starting from scratch or coming from another field.
It was impossible for Yahoo to radically cut down their homepage when the early adopters rallied around Google. Even now, their AJAXified Mail (guess why it is taking so long since the Oddpost acquisition!) will keep all major principles in place. Gmail was able to scrap folders and replace them with tags.
If you understand German, you can watch the dilemma play out in the comments on the immo.search.ch launch. While it was in beta, there were 100+ mostly positive comments, in fact extremely positive ones. Once it was officially launched, that is, once it replaced the old immo.search.ch (a pretty standard form -> results page -> detail page type of site) with a new AJAX-centric site, old users took the general tone in a very negative direction. They want the old system back. They obviously copy each other's arguments, which you can easily spot: one user mentions an actually quite obscure bug that appears when you search for a house to rent (and not a house to buy or a flat to rent, the majority cases) and that was also present in the previous version, as it is caused by bogus data, not code. But it serves them as an argument, so they use it. Fascinating to watch the dynamic in the comments.
Now, search.ch is bold enough to stick to the new concept. They believe that it is a genuinely new approach that will ultimately beat the older system. But real estate search was never really important for search.ch, so they can take the risk. The commenters in the blog are probably a vocal minority, and I would guess that a reasonable part of the current users will stay and even spread the word. But even if the complete userbase had to be replaced with this step, the payoff vs. risk made sense.
But not so for the other big real estate sites. For them, the current userbase is one of the core assets, and every step that might piss off a good part of it is a bet-the-company step. At the same time, they will surely lose users to immo.search.ch and other companies that are freer to innovate in bolder steps. The best option would probably be to time the transition to a new paradigm for a sweet spot where they can still win back a couple of users, keep some so-far loyal users who have at least seen the new style and will probably get used to it, and balance out the users they would lose. But they will come out hurt one way or the other.
Shrink-wrapped software, with its granular update cycles and a userbase that adopts new versions at its own pace, doesn't know any such risks. The only risk is missing an important change of direction, i.e., not innovating enough.
So what can a site do that built its success over the years on a model that is bound to be replaced over the next couple of years? Are there any good examples of a site that mastered a difficult transition gracefully? What could be done with clever communication in advance of and during the transition?
(All of this assumes that you know what you are doing and don't innovate in the wrong direction. That, of course, is a whole different topic to analyze. For small steps in the wrong direction, the perpetual beta is actually more forgiving, if you see the signs early enough. And read them correctly, which is not easy: just go and read the comments in the search.ch blog entry linked above.)
February 02, 2006
No Market for Mobile Location-Based Communication Systems?
After a data-rich and fascinating talk about the specialization of communication channels at LIFT yesterday, someone asked whether introducing position information about your communication partner would alter communication patterns. Stefana suggested that there would probably not be much change in the patterns, since you know the 4-6 people you do the bulk of your synchronous and nearly synchronous communication with well enough to know where they most probably are at any time anyway. Taking this further, these would of course be the same set of people who would probably trust you enough to expose their position to you. Thus the question: is there any market for all these mobile location-based communication systems?
Not in the obvious implementation, at least. Maybe either by restricting the types of locations, like Plazes does (which maps the places where I'm online, thus probably working, and almost never my spare time), or by coming up with a system that is anonymous enough, yet useful and not annoying (permission spam), so that you could extend the service to a much wider circle.
On the other hand, IM-style presence indicators (from "away" and "busy" to "phone", "meeting", "in the zone") could potentially fulfill a much more important role in this context. Maybe phones will start listening for things like how many voices there are in the room (and at what volume?), how fast the typing sounds are, etc. They could at least switch to silent mode in those cases. Hmm, how reliably could you detect that the owner is sitting in a movie or a talk? And why is there no "silence please" beacon installed in every cinema that mobile phones could pay attention to?
Network Latency as Boringness Indicator at Conferences
I'm currently at LIFT. And I have a theory:
The interestingness of a talk is inversely proportional to the latency on the conference wifi.
(This post will take particularly long to get to the server)
Update: 1. Most of the conference is actually quite interesting. 2. Anina is coming up; I expect low latencies this afternoon. 3. Dang, do I sound geeky today.