The new web services: web 2.0 Essay

Web services in general have been characterised as having language that is informal, friendly, humorous, and snappy; design that is trendy, with lots of space; technology built on web standards, interoperability, and desktop-like responsiveness; and a culture that is open, transparent, peer-to-peer, and given to unlimited sharing. Likewise, web services have gone personal and global, become easy to read and write, and are both human-centric and machine-enabled (Wilson, 2007).

Lately, however, there has been a new and emerging concept of further developments in the way the World Wide Web is managed, created, and used by software developers and web users, which has been tagged as Web 2.0. In essence, “Web 2.0 is the business revolution in the computer industry caused by the move to the Internet as platform, and an attempt to understand the rules for success on that new platform.” (O’Reilly, 2007)

Basically, the development of Web 2.0 was fueled by the continuously changing expectations of web creators and end users about their hyperspace experience. For instance, feeds (syndicated HTML content) and podcasts (syndicated audio) have become the basic APIs. However, while web APIs have turned the web into a programmable platform, many major web companies still rely on the basic HTTP protocol.
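A feed of this kind is just an XML document fetched over plain HTTP, which is what makes it usable as a basic API. A minimal sketch in Python, using an invented two-item RSS 2.0 feed in place of a live network fetch:

```python
import xml.etree.ElementTree as ET

# An invented, minimal RSS 2.0 feed: the kind of syndicated content
# the text describes as a "basic API".
RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <item><title>Hello, Web 2.0</title><link>http://example.com/1</link></item>
    <item><title>Feeds as APIs</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(RSS)
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)  # ['Hello, Web 2.0', 'Feeds as APIs']
```

In practice the same parsing would run on the bytes returned by an ordinary HTTP GET of the feed's URL.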

On the client side, AJAX, which combines XML and JavaScript, has also become the order of the day. Mashups of recombinant software have enabled web service APIs to be combined in new ways applicable to both the enterprise and the web (Wilson, 2007). This is seen in the rise of user-generated content on the web and in how the "new platform" of Web 2.0 allows users to engage in critiquing, posting comments, and other forms of interaction over the internet.
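The mashup idea can be illustrated without any real APIs: take items already fetched from two different services and recombine them into one chronological view. The sources, titles, and dates below are invented:

```python
from datetime import datetime

# Invented items standing in for responses from two different feed APIs;
# a mashup recombines them into a single view.
photos = [{"title": "Sunset", "date": "2007-11-20", "source": "Flickr"}]
posts  = [{"title": "Web 2.0 notes", "date": "2007-11-22", "source": "Blog"},
          {"title": "Old post", "date": "2007-11-01", "source": "Blog"}]

# Merge both sources, newest first.
merged = sorted(photos + posts,
                key=lambda e: datetime.strptime(e["date"], "%Y-%m-%d"),
                reverse=True)
print([e["title"] for e in merged])  # ['Web 2.0 notes', 'Sunset', 'Old post']
```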

Thus, Web 2.0 allows individuals and organizations to discover opportunities to learn from and form networks and communities, to create and share work, to collect and remix, to collaborate, and to innovate and develop techniques (Wilson, 2007). In discovering opportunities, individuals and organizations alike go global with learning networks in which formal and informal learning episodes are combined, goals are shared in forging social identity, there is symmetry of experience in informal and formal discovery and action, and a global community of peers is present (Wilson, 2007).

Learning networks also allow learners to become part of a network before joining a course, to join a pre-existing community of peers, to benefit from an inversion of roles in which institutions become facilitators of learning networks instead of purveyors of courses, and to use publishing and sharing formats such as FOAF (feeds for people), XFN, and DOAP (Wilson, 2007).

Web 2.0 has enhanced the global creation and sharing of data and information in more ways than one can imagine, allowing end users to share anything from works of art, such as films and script drafts, to more personal aspects of their lives through blogs, wikis, Flickr (photo sharing), YouTube (video sharing), feeds, and podcasts; content is also created for re-use under the creative commons (Wilson, 2007). There is an entire global community that motivates individuals to be useful and creative, even for free.

In collecting and remixing, individuals and organizations may share playlists through RSS, Atom, OPML, or XSPF, where there is identity, priority, and understanding. They can also collaborate on their collections and remix them, following the pedagogy of constructivism and connectivism (Wilson, 2007). Web 2.0 also encourages innovation through trial and error with the perpetual beta: learning from mistakes and the willingness to make them. Individuals are likewise encouraged to own the technology and to develop techniques of their own (Wilson, 2007).
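An OPML reading list, one of the sharing formats named above, is simple enough to generate directly. A sketch using Python's standard XML tools, with invented feed titles and URLs:

```python
import xml.etree.ElementTree as ET

# Invented subscriptions; OPML is the format mentioned above for
# sharing reading lists and playlists.
subs = [("Example Blog", "http://example.com/rss"),
        ("Another Feed", "http://example.org/atom")]

opml = ET.Element("opml", version="1.0")
head = ET.SubElement(opml, "head")
ET.SubElement(head, "title").text = "My reading list"
body = ET.SubElement(opml, "body")
for title, url in subs:
    # Each subscription becomes one <outline> element.
    ET.SubElement(body, "outline", text=title, type="rss", xmlUrl=url)

doc = ET.tostring(opml, encoding="unicode")
print(doc)
```

Anyone importing this document into an aggregator would receive the same set of subscriptions, which is the point of the sharing formats the essay lists.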

It seems, however, that even with Web 2.0, the World Wide Web is still a work in progress, with new innovations unfolding by the minute. Big corporations such as Apple and Microsoft, along with other internet-driven businesses, continue to spend millions of dollars to enhance specific consumer experiences of the web.

Apple, assuming that individual or organizational users may already publish dynamic web pages based on databases and objects, perhaps an online catalog or a shipment-tracking system, suggests using WebObjects to unite functions from different company entities for display on a single page. In this instance, an online paystub might display pay information from the payroll department alongside vacation hours stored in a separate Human Resources database; WebObjects allows the creation of such smart web pages (Apple, Inc., 2007).
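The smart-page idea, one page drawing on records from several departmental databases, can be sketched with plain dictionaries standing in for those databases. All names and figures below are invented:

```python
# Invented records standing in for the payroll and HR databases.
payroll = {"emp42": {"name": "A. Smith", "net_pay": 2150.00}}
hr      = {"emp42": {"vacation_hours": 64}}

def paystub_view(emp_id):
    # Join the two departmental sources into the single paystub page
    # described in the text.
    return {**payroll[emp_id], **hr[emp_id]}

print(paystub_view("emp42"))
```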

With WebObjects 5.3, one can also publish data as a web service so that custom clients can access the information without needing to know the internals of the application. Dashboard in Mac OS X Tiger, for example, offers a custom view into information available on the web and takes advantage of web services for some of its functionality, while AppleScript Studio and Automator enable one to create clients for web services (Apple, Inc., 2007).

WebObjects 5.3 provides powerful tools that let one develop and deploy one's own web services without ever writing a single line of low-level SOAP, XML, or WSDL code. WebObjects uses Simple Object Access Protocol (SOAP) messages that wrap Extensible Markup Language (XML) documents and sends them over TCP/IP using the HyperText Transfer Protocol (HTTP). This means that, provided one already publishes using a web server, one does not need to invest in another server to take advantage of WebObjects web services. To allow clients outside one's organization to access one's web services, one need only publish their operations using the Web Services Description Language (WSDL) (Apple, Inc., 2007).
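The enveloping described here, a SOAP message wrapping an XML payload for transport over HTTP, can be sketched by hand, even though WebObjects generates this plumbing automatically. The operation name and namespace below are invented for illustration:

```python
# Build a SOAP 1.1 envelope wrapping an XML payload, the message
# format the text describes. Operation and namespace are invented.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_envelope(operation, namespace, params):
    # Serialise each parameter as a child element of the operation.
    args = "".join(f"<{k}>{v}</{k}>" for k, v in params.items())
    return (f'<soap:Envelope xmlns:soap="{SOAP_NS}"><soap:Body>'
            f'<{operation} xmlns="{namespace}">{args}</{operation}>'
            f'</soap:Body></soap:Envelope>')

msg = soap_envelope("getPaystub", "urn:example-payroll", {"employeeId": "42"})
print(msg)
```

A real client would POST this string to the service's HTTP endpoint; the interface contract that tells clients which operations exist is what the WSDL publishes.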

Apple also claims that it is easy to configure and test web services without writing code. WebObjects 5.3 uses Axis, an implementation of SOAP from the Apache Software Foundation, as its SOAP engine, code generator, and WSDL-processing tool, both to deploy and to consume web services. Likewise, WebObjects provides the innovative Web Services Assistant, a graphical interface for creating web services that access information in one's existing databases (Apple, Inc., 2007).

WebObjects services can interoperate with services written in many languages, such as Java, AppleScript, Perl, and .NET. One can even mix and match services written in different languages in a custom client. Moreover, script-level access to web services from languages such as AppleScript and Perl opens up enterprise application development to a new class of programmers (Apple, Inc., 2007).

Critics of non-RESTful web services often complain that they are too complex and biased towards large software vendors or integrators rather than open-source implementations. Another big concern of REST web service developers is that SOAP toolkits make it easy to define new interfaces for remote interaction, often relying on introspection to extract the WSDL and service API from Java or C# code. The SOAP stack authors (and many users) view this as a feature, but it is feared to increase the brittleness of systems, since a minor change on the server (even an upgrade of the SOAP stack) can result in different WSDL and a different service interface.

Client-side classes generated from the WSDL and XSD descriptions of a service are often similarly tied to a particular version of the SOAP endpoint and can break if the endpoint changes or the client-side SOAP stack is upgraded. Well-designed SOAP endpoints (with handwritten XSD and WSDL) do not suffer from this, but there remains the problem that a custom interface for every service requires a custom client for every service (Bray, 2004).

There are also concerns about performance owing to web services' use of XML as a message format and of SOAP and HTTP for enveloping and transport. At the same time, emerging XML parsing and indexing technologies, such as VTD-XML, promise to address those XML-related performance issues (Bray, 2004).

Several other approaches, both preceding and contemporary with web services, address the same set of problems. RMI is one of many middleware systems that have seen wide deployment, while more ambitious efforts such as CORBA and DCOM attempted distributed objects, which web services implementations sometimes try to mimic. Other basic efforts include XML-RPC, a precursor to SOAP that was capable only of RPC, and various forms of HTTP usage without SOAP (Bray, 2004).

Hedlund (2004) reported on Bloglines' announcement of a set of new web services APIs that allow developers to write applications for reading RSS and Atom feeds by drawing data directly from the Bloglines databases. This is significant in the landscape of RSS/Atom aggregators, the newsreading applications that have become more popular over the past few years.

Bloglines also announced that several desktop RSS/Atom aggregators, including FeedDemon, NetNewsWire, and Blogbot, will use these APIs to provide additional capabilities in their applications. Bloglines Web Services makes it easy for developers to use RSS and Atom content for many purposes, including easing the traffic pileup that aggregators are beginning to cause for many large RSS/Atom publishers.

eWeek recently reported on the bandwidth problems RSS/Atom aggregators have been causing for web publishers. Spurred in part by Microsoft's announcement that even it was having trouble keeping up with requests for its feeds, publishers have been discussing how much traffic a popular RSS/Atom feed can bring to bear. One publisher in the eWeek article said, "Any site that becomes popular is going to be killed by their RSS" (Hedlund, 2004).

While it is true that web sites have kept up with traffic from all over the world for years, and that web servers and protocols are very scalable, RSS/Atom readers present a new kind of challenge. With a web browser, users visit a site only while they are in front of their computer, reading that site or actively browsing. Users may visit some very large sites, such as Yahoo or Google News, repeatedly throughout the day, but such sites are usually commercially run and able to support large streams of traffic. The difference with an RSS/Atom aggregator is that it automatically pulls information from a publisher's site on a regular schedule, sometimes as often as once every five minutes, and keeps updating itself for as long as it is running so that it can present the latest information whenever the user calls on it (Hedlund, 2004).
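Back-of-the-envelope arithmetic shows why this polling model worries publishers. The subscriber count and feed size below are assumed figures, not data from the article:

```python
# Assumed figures for illustration only.
subscribers  = 10_000   # desktop aggregators polling one feed
poll_minutes = 5        # the aggressive schedule mentioned above
feed_kb      = 30       # assumed uncompressed feed payload size

# Every subscriber polls (24 * 60 / 5) = 288 times a day.
polls_per_day = (24 * 60 // poll_minutes) * subscribers
gb_per_day = polls_per_day * feed_kb / 1_000_000
print(polls_per_day, round(gb_per_day, 1))  # 2880000 86.4
```

Under these assumptions a single feed absorbs nearly three million requests and tens of gigabytes of transfer per day, whether or not anything in the feed has changed.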

Another point of contention is that traffic to a popular RSS site can be indistinguishable from a security attack. Hedlund (2004) explained that in security circles, a large number of clients repeatedly making requests to the point of overload is known as a distributed denial-of-service attack; attacks of this sort have taken down the largest sites on the web, including Yahoo, eBay, and Amazon. For a small web publisher, even a moderately popular RSS/Atom feed can cause serious bandwidth consumption, running up ISP bills and preventing users from reaching any part of the site, while for larger publishers, RSS/Atom feeds can bring in many more users but can also consume extensive resources.

Many in the RSS/Atom developer community have long recognized the bandwidth-overload problem, and the usual solutions require that nearly all aggregators adhere to a variety of "polite" practices to ensure that servers are not overwhelmed. Despite developers' determined efforts, not all aggregators have done so, and users, who want very fresh news, often configure their aggregators to poll very frequently (Hedlund, 2004).
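One widely recommended "polite" practice is the HTTP conditional GET: the aggregator echoes the `Last-Modified` and `ETag` values from its previous fetch, so an unchanged feed costs a bodiless 304 Not Modified response instead of the full document. A sketch of the request headers such a client would send (the aggregator name and header values are invented):

```python
# Build "polite" conditional-GET headers from values remembered
# from the previous fetch of the feed.
def conditional_headers(last_modified=None, etag=None):
    headers = {"User-Agent": "PoliteAggregator/0.1"}
    if last_modified:
        # Server replies 304 if the feed is unchanged since this date.
        headers["If-Modified-Since"] = last_modified
    if etag:
        # Server replies 304 if the feed's entity tag still matches.
        headers["If-None-Match"] = etag
    return headers

h = conditional_headers("Sat, 24 Nov 2007 08:00:00 GMT", '"abc123"')
print(h)
```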

According to Hedlund (2004), Bloglines differs from most other RSS/Atom aggregators: like NewsGator, it is a server-side aggregator. Bloglines maintains a database of RSS/Atom feeds much as Google maintains a database of web pages, so Bloglines users query that database instead of polling individual RSS/Atom publishers from their desktop machines. This means Bloglines appears to publishers, and consumes bandwidth, like a single RSS/Atom aggregator, yet it is able to serve tens of thousands of users. By offering web services APIs, Bloglines is opening up its database of feeds for anyone to use, so that any developer building an RSS/Atom-based application can draw from the Bloglines database, avoiding bandwidth overload for RSS/Atom publishers.

However, Hedlund (2004) emphasized that bandwidth savings are not the only reason to use Bloglines as a feed cache. RSS and Atom are emerging formats with many variations for developers to deal with, but because Bloglines draws feeds from its own database, it normalizes all of the feeds it collects before distributing their content, presenting developers with a single format.
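The normalisation step can be sketched abstractly: RSS and Atom name the same fields differently, and a cache can map both onto one schema before handing entries to clients. The field mapping below is a simplified assumption, not Bloglines' actual schema:

```python
# Map an entry from either feed format onto one common schema.
def normalize(entry, fmt):
    if fmt == "rss":
        # RSS 2.0 carries the URL as a plain <link> child.
        return {"title": entry["title"], "url": entry["link"]}
    if fmt == "atom":
        # Atom carries the URL in the href attribute of <link>.
        return {"title": entry["title"], "url": entry["link"]["href"]}
    raise ValueError(f"unknown feed format: {fmt}")

rss_item  = {"title": "A post", "link": "http://example.com/a"}
atom_item = {"title": "B post", "link": {"href": "http://example.com/b"}}
print([normalize(rss_item, "rss"), normalize(atom_item, "atom")])
```

A client written against the normalized schema never needs to know which format a publisher chose, which is the convenience Hedlund describes.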

In addition, Bloglines users enjoy synchronization across computers. For someone who reads news on one computer at work and another at home, a server-based aggregator provides the same set of feeds on both machines and lets those feeds be updated as they are read from any machine. Using the Bloglines Web Services, client-side (desktop) aggregators can provide this same functionality; one can even use, say, FeedDemon on Windows and NetNewsWire on Macintosh and share the state of feeds between them through Bloglines (Hedlund, 2004).

In sum, the emergence of Web 2.0 is simply a manifestation of the continually evolving technology of the internet. Thus, while the technical jargon is enriched as fast as the number of web-based applications increases, the basic infrastructure of the web remains very much the same. Although the experience has no doubt been greatly enhanced by these applications, what drives them is simply the human-centric goal of interaction and enabling participation.

Indeed, Web 2.0 may be just a preview of things to expect from the ever-growing list of functions and uses of the web, its activities a precursor of the culture and lifestyle of the future, in which end users demand not only ease and functionality in web navigation but also a high level of responsiveness and engagement and a balance between creativity and substance from web pages. In the end, it is these developments, in which machines become a virtual extension of human life and culture, that make the prospect of Web 3.0 more interesting and exciting for the internet-connected world.


Apple, Incorporated (2007). "Web Services at Your Service."

Bray, Tim (2004). "WS Pagecount." Oct. 28, 2004.

Hedlund, Marc (2004). "The New Bloglines Web Services."

O'Reilly, Tim (2006). "Web 2.0 Compact Definition: Trying Again." Retrieved Nov. 25, 2007.

Systinet Corporation (2005). "The Advantages of Using Web Services."

Wilson, Chris (2007). "Moving the Web Forward." Retrieved Nov. 25, 2007.



