|A big part of Web 2.0 is data sharing and the site as an application.|
A true Web 2.0 site shares its content with non-browsers. What do I mean? Quite simply, Remote Procedure Calls (to use the traditional term).
Think of the Flickr or Google APIs, where you can use SOAP to query the server and obtain information. Your site is no longer the only place where your data will be displayed - it's about providing a service rather than providing a page. RSS feeds are a more basic version of this: data is available from another site, but that site is simply more restrictive in what it publishes (there are other pros and cons that warrant a thread of their own!).
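To make this concrete: consuming someone else's RSS feed takes only a few lines of PHP. A minimal sketch, using an inline sample feed (the titles and URLs below are made up; a real consumer would fetch the XML over HTTP and re-parse it periodically):

```php
<?php
// Sketch: consuming an RSS 2.0 feed with SimpleXML.
// The feed content is a made-up inline sample.
$rss = <<<XML
<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example site</title>
    <item>
      <title>First post</title>
      <link>http://example.com/posts/1</link>
    </item>
    <item>
      <title>Second post</title>
      <link>http://example.com/posts/2</link>
    </item>
  </channel>
</rss>
XML;

$feed  = simplexml_load_string($rss);
$items = [];
foreach ($feed->channel->item as $item) {
    // Each item carries only what the publishing site chose to share.
    $items[] = (string) $item->title . ' => ' . (string) $item->link;
}
print_r($items);
```

The point is how little the consuming site needs to know: a generic parser and the feed URL, nothing about the publisher's internals.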
The thought that your beloved site is not necessarily the primary location where all your hard work will be viewed probably scared half the people reading this; RPC requires a completely different mindset and a paradigm shift in thinking. Where will my revenue come from? Who's going to advertise or sponsor my site if no-one's ever going to visit it directly? However, there are a couple of business models for this:
1. subscription to RPC access - either for all access or for "advanced feature" access. I'd not advocate this for RSS, of course!
2. become the de facto site in your chosen area by utilising RPC calls to other Web 2.0 sites to tailor the user's visit and try to become their first port of call. It's almost going full circle back to the days of portals, except you don't host everything; instead you collate and filter from other sites as well as your own, and render pages specifically tailored for the current user.
Web 2.0 sites are heading down the "Excel" route.
Think how you use Excel - or most applications on your desktop. In Excel you can pull in data from other sources and display it (think SOAP, etc.). This link is also dynamic: Excel will refresh the data - it's not a one-off import.
Okay, so that's the same as above - Excel uses ODBC to access a database, which is effectively a specific sort of RPC - but I mention it to set the scene.
Think how you interact with a desktop application such as Excel. When you enter data or a formula into a cell, does the whole screen go blank and stay unrefreshed until it has recalculated the sheet? No, it's all done behind the scenes whilst you continue to work. For that you need a method to communicate with the server without changing the URL. This aspect will become - I predict - massively important in the near future. As Ajax-like sites become more pervasive, users simply won't put up with old-style page refreshes which stop their workflow.
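The server side of such a background call can be tiny. Here's a sketch in PHP, with purely illustrative names: imagine the browser sends the cell's formula via XMLHttpRequest in the background, and the server returns only the recalculated value as a small XML fragment, never a whole page:

```php
<?php
// Sketch of an Ajax-style server endpoint (names are illustrative).
// The browser posts a formula in the background; the response is a
// tiny <value> fragment (Content-Type text/xml), not a full page.

// A toy "recalculation": handle formulas like "=SUM(1,2,3)".
function recalculate(string $formula): string
{
    if (preg_match('/^=SUM\(([\d,\s]+)\)$/', $formula, $m)) {
        return (string) array_sum(array_map('intval', explode(',', $m[1])));
    }
    return 'ERR';
}

// In a real handler this would come from $_GET['formula'].
$formula = '=SUM(1,2,3)';
echo '<value>' . recalculate($formula) . '</value>';
```

The client-side script would swap that value into the page, so the user's workflow never stops for a reload.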
* Is php-classes a Web2.0 site?
So, IMHO, considering these two points php-classes is not quite Web 2.0; but then there aren't many sites that truly are. It has a lot of community aspects now, but it pretends there's no other script site. Add RPC access to the forums, etc. and you've got a Web 2.0 service site. Start using RPC to show data from "competitors" and you've got a true 2.0 site - a site that puts the user first.
* Is PHP Web2.0 ready?
Bearing these two additional points in mind, quite simply "Yes".
Violently agree, or maybe you completely disagree and think I'm an idiot on acid? I'd be interested in others' thoughts...
|2006-06-02 18:09:49 - In reply to message 1 from Matt|
|As I tried to explain, there is a big confusion between what is Web 2.0 and the means that are used to achieve a Web 2.0 site.|
A Web 2.0 site provides means to let users interact with the site somehow and so become relevant.
All those means of data sharing, having the site as an application, greater interactivity with AJAX, etc. are just that: means. A Web 2.0 site does not need to use all of them. As long as the users of the site can interact and become relevant, that is all that is necessary.
Furthermore, investing in the development of those resources for sharing data, providing APIs, AJAX interfaces and so on may turn out to be a total waste of time if the users end up not using them.
Let me give you an example. The PHPClasses site provides RSS feeds for the latest packages, latest reviews, latest blog posts and latest forum threads. That is a very common way of sharing data, and many users and companies republish it on their sites. That was implemented on the PHPClasses site in April 2002.
In October 2002, the site provided a search API that lets any other site search the content just as in the internal search pages. The results are returned in XML and include the page URL, an excerpt and a relevance percentage.
As a matter of fact, the adopted format is RSS 1.0 XML. It should be easier to adopt, as you only need a common RSS parser. The format was even published as a proposed RSS 1.0 module:
This is a common format used by member sites of the PHP Developers Network, so the results could be aggregated in PHUSE, the PHP Unified Search Engine:
That was a simple and effective API to let others search the sites. The problem is that nobody used it other than PHUSE. It was not for lack of advertising: it was announced on all of the PHPClasses site's internal search engine result pages.
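A consuming site could have handled those RSS 1.0 (RDF) results along these lines. This is only a sketch: the item contents are invented, and in particular the `<relevance>` element name is my assumption, since the actual proposed module may have named or namespaced it differently:

```php
<?php
// Sketch: parsing RSS 1.0 (RDF) search results with SimpleXML.
// The sample document and the <relevance> element are assumptions.
$xml = <<<XML
<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns="http://purl.org/rss/1.0/">
  <item rdf:about="http://example.com/page1">
    <title>Matching page</title>
    <link>http://example.com/page1</link>
    <description>An excerpt of the matched page...</description>
    <relevance>87</relevance>
  </item>
</rdf:RDF>
XML;

$doc = simplexml_load_string($xml);
$doc->registerXPathNamespace('rss', 'http://purl.org/rss/1.0/');

$results = [];
foreach ($doc->xpath('//rss:item') as $item) {
    // Read the children explicitly from the RSS 1.0 default namespace.
    $c = $item->children('http://purl.org/rss/1.0/');
    $results[] = [
        'url'       => (string) $c->link,
        'excerpt'   => (string) $c->description,
        'relevance' => (int) $c->relevance,
    ];
}
print_r($results);
```

As the text says, reusing RSS 1.0 meant a consumer needed little more than an ordinary RSS parser plus XPath - which makes it all the more telling that almost nobody did.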
In June 2004, Google became the default search engine for PHPClasses. Thanks to Google AdSense for search, the site could make additional revenue by redirecting users to Google co-branded search pages.
The internal search engine pages are now restricted to premium (beta) subscribers. It is being resurrected as a means to provide a better site search engine than Google can, as it will offer almost real-time indexing and more search options specific to this site.
As for the XML search API, it will be discontinued. It was good for experimenting with the concept, but apparently useless for the site users.
The same goes for any other API that the site could provide. What do you want a forums RPC API for?
Until somebody demonstrates that a certain API could be beneficial (and financially realistic) for the site by providing more means to let the users interact and become relevant, the investment in the site's development will continue to go to things that provide more obvious benefits.