John Wooden

John Wooden has worked on a diverse range of web projects for Fortune 500 companies and local, county, and state governments in his role as Fredrickson’s director of user experience services. He has led website redesign and information architecture efforts, and conducted hundreds of usability tests and heuristic evaluations on both websites and applications. Behind the scenes, John has developed usability guidelines and interface design standards for applications and websites.

John has taught classes in usability and user-centered design at the University of Minnesota and has presented dozens of seminars on usability and web-related topics.

John has a PhD in English and is a Certified Usability Analyst and member of the Usability Professionals’ Association. He has been with Fredrickson Communications since 2000.

A Web 2.0 Primer

by John Wooden, UX Director

As of this writing, “Web 2.0” returns 273 million results on Google and has already been the subject of two annual conferences in San Francisco. If asked, though, most web users couldn’t define what Web 2.0 is, even if they’ve heard or seen the term before. No surprise—it’s a slippery concept obscured by a certain amount of hype.

The hype, however, doesn’t mean there isn’t something happening that’s worth noticing and trying to understand. Web 2.0 isn’t a thing or a place—it’s an umbrella term to describe rapidly evolving tools and practices that are accelerating various types of decentralization, collaboration, sharing, and social networking. Not all of these tools and practices are completely new—in fact, many of them were used during the Web 1.0 era—but at a certain point in any evolution, a series of incremental changes results in something that is different enough to be noticed and even labeled, and such is the case with Web 2.0.

Interest in Web 2.0 increased following the first conference on the subject in October 2004, organized by O’Reilly Media (best known for its numerous books on topics in information technology). O’Reilly Media VP Dale Dougherty is credited with coming up with the idea to use Web 2.0 as the theme of the conference. Some people have suggested that this concept is or was nothing more than an attempt to market a conference and woo venture capital. Although there is some truth in this, it’s too reductive. So let’s look more closely at what Web 2.0 is (or isn’t) and touch on what some of the implications might be for business.

In “What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software” (arguably the most important article written about Web 2.0 so far), Tim O’Reilly explains that his group’s initial brainstorming sessions about Web 2.0 set out to contrast Web 1.0 sites, practices, and models with those of Web 2.0. Following this lead, we’ll begin by contrasting Wikipedia with Britannica Online.

Wikipedia

Founded by Internet entrepreneur Jimmy Wales, together with Larry Sanger, Wikipedia is an excellent example of what Tim O’Reilly calls “harnessing collective intelligence,” a trend that is accelerating with Web 2.0. A blend of “wiki” (an application that allows users to contribute and edit content collaboratively) and “encyclopedia,” Wikipedia is described on its home page as a “free encyclopedia that anyone can edit.” It is “written collaboratively by many of its readers. Lots of people are constantly improving Wikipedia, making thousands of changes an hour, all of which are recorded on article histories and recent changes. Inappropriate changes are usually removed quickly.”

Wikipedia now consists of more than a million user-created entries. The obvious risk of this approach is that it allows content to be published that is uneven, inconsistent, inaccurate, or heavily biased. However, thousands of readers help to monitor the quality of information on the site, an approach that has been called “self-healing.” Some entries are even preceded by a warning about “weasel words” that betray a particular bias. Wikipedia has thus created a model very different from the conventional, centralized, top-down approach of Britannica Online, a quintessentially Web 1.0 site. In contrast to Britannica Online, Wikipedia exemplifies many of the defining characteristics of Web 2.0:

  • Decentralization (dispersion or distribution of functions and powers to end users)
  • User collaboration (contributing, monitoring, and editing content)
  • Sharing (in this case, knowledge-sharing)

BitTorrent

Decentralization, collaboration, and (file) sharing are the essence of peer-to-peer systems. The first mainstream peer-to-peer network was Napster, and its well-publicized legal dispute with the record companies involved the decentralized distribution and sharing of digital music files. With Napster, individual users broke the monopoly on the packaging and distribution of popular music (not unlike the way in which Wikipedia challenges the conventional model of the encyclopedia represented by Britannica).

BitTorrent, another peer-to-peer pioneer, allows users to share music, video, software, and games, but unlike Napster—where one user would have to finish downloading an entire file before other users could request it from her—BitTorrent disperses the file-sharing process. Instead of a thousand users all swarming the same server to get the same file, BitTorrent enables all these users to collaborate in the download by dividing files into smaller bits that are then distributed by peers before being assembled again as a complete file. This has the effect of making the most popular files the fastest to download.
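
To make the piece-splitting idea concrete, here is a rough JavaScript sketch of the concept only—not the real BitTorrent protocol, piece size, or peer-selection logic. A file is cut into fixed-size pieces that can arrive from different peers in any order and be reassembled once every piece is present.

  // Conceptual sketch: fixed-size pieces, reassembled by index.
  // PIECE_SIZE is tiny here for illustration; real clients use 256 KB or more.
  var PIECE_SIZE = 4;

  function splitIntoPieces(data) {
    var pieces = [];
    for (var i = 0; i < data.length; i += PIECE_SIZE) {
      pieces.push({ index: pieces.length, bytes: data.slice(i, i + PIECE_SIZE) });
    }
    return pieces;
  }

  function reassemble(received) {
    // Pieces may arrive out of order, so sort by index before joining.
    received.sort(function (a, b) { return a.index - b.index; });
    var out = "";
    for (var i = 0; i < received.length; i++) out += received[i].bytes;
    return out;
  }

  var pieces = splitIntoPieces("The quick brown fox jumps");
  pieces.reverse(); // simulate out-of-order arrival from many peers
  console.log(reassemble(pieces)); // "The quick brown fox jumps"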

eBay

eBay has become another very well-known example of decentralization and user collaboration in which every consumer becomes a potential seller and distributor. eBay has no main storage facilities, apart from the basements, attics, closets, and garages of all the people who have been or might be sellers, and it has no real products of its own. It simply provides a service—a way for buyers and sellers to do business. It makes available, and helps people find, far more products than any one physical store could ever handle (Amazon, Netflix, and Rhapsody are similar in this regard), and in doing so, serves thousands of niches that together add up to something much bigger than would be the case if eBay dealt only in current mass merchandise.

This is related to what Wired magazine’s Chris Anderson has called “the power of The Long Tail,” and is another thread connecting several Web 2.0 businesses. The Long Tail describes a feature of statistical distributions in which “a high-frequency or high-amplitude population (the head) is followed by a low-frequency or low-amplitude population which gradually ‘tails off.’ In many cases, the infrequent or low-amplitude events—the long tail—can cumulatively outnumber or outweigh the head, such that in aggregate they comprise the majority” (Wikipedia).
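
A toy calculation can make the Long Tail concrete. The JavaScript below assumes Zipf-like demand (demand for the item at rank r proportional to 1/r); the catalog size and head cutoff are arbitrary assumptions for illustration, not figures from Anderson.

  // Compare aggregate demand for the "head" (the hits) vs. the long tail.
  var items = 1000000;  // size of an online catalog
  var headSize = 100;   // the top hits a physical store might emphasize

  var headDemand = 0, tailDemand = 0;
  for (var rank = 1; rank <= items; rank++) {
    var demand = 1 / rank;  // Zipf distribution with exponent 1
    if (rank <= headSize) headDemand += demand;
    else tailDemand += demand;
  }

  // Prints roughly "head: 5.2 tail: 9.2": the tail's aggregate demand
  // outweighs the head's, though no single tail item sells much on its own.
  console.log("head: " + headDemand.toFixed(1) + " tail: " + tailDemand.toFixed(1));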

Flickr

Like Wikipedia, BitTorrent, and eBay, the other standard-bearer sites of Web 2.0 are built on services characterized by decentralization, sharing, and collaboration. Flickr, for example, a widely used photo-sharing site that in 2004 began allowing users to upload personal photos into chats and instant messages, started off with a business model different from that of the Kodak site Ofoto (now called Kodak Gallery).

Whereas Kodak used Ofoto primarily as a channel for users to submit digital photos for printing (sharing was of secondary importance), Flickr’s model was based on photo sharing itself. Photo sharing has since become a web phenomenon and is central to the social networking that has become so pervasive in the Web 2.0 world. The best-known social networking site now is MySpace, a collection of user photos, profiles, forums, and blogs. As of January 2006, it was the seventh most popular site on the web, with 50 million users.

Flickr, del.icio.us, and folksonomy

In addition to photo sharing, Flickr is known for pioneering the practice of user “tagging” of content. Instead of shared photos being classified and categorized centrally by a group of information architects paid by Flickr, content is categorized (tagged with keywords) by Flickr’s users. Users can also subscribe to tags. Designers, for example, could subscribe to a tag for “illustrations” to see all activity on that tag, subscribe to the tag of a peer they admire, or both.

The social bookmarking site del.icio.us—where users can maintain and share an online list of their favorite sites, articles, blogs, music, and so on—also enables user tagging, as did Consumating, a now-defunct site dedicated to helping “geeks, nerds, hipsters, and bloggers find dates.”

This decentralized user-categorization of content has been called “folksonomy,” a blending of “folk” and “taxonomy” that means “classification by the people.” Just as Wikipedia and BitTorrent are bottom-up, decentralized forms of sharing, folksonomy is a bottom-up, decentralized form of information architecture that creates an alternative to centrally controlled vocabularies. Debates about the practical value of folksonomies continue, with some perceiving in them the “wisdom of crowds” (to cite The New Yorker columnist James Surowiecki’s book of that name), while others see in them the potential for confusion resulting from inexpert and idiosyncratic classification.
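
The mechanics behind tagging can be surprisingly simple. As a sketch (with invented item names and tags, not Flickr’s or del.icio.us’s actual implementation), the JavaScript below builds an inverted index from user-chosen tags to items; because there is no controlled vocabulary, any string a user types becomes a category.

  var taggings = [
    { item: "photo-101", tags: ["illustration", "poster"] },
    { item: "photo-102", tags: ["minneapolis", "skyline"] },
    { item: "photo-103", tags: ["illustration", "skyline"] }
  ];

  // Build the tag index bottom-up from whatever keywords users chose.
  var tagIndex = {};
  for (var i = 0; i < taggings.length; i++) {
    var tagging = taggings[i];
    for (var j = 0; j < tagging.tags.length; j++) {
      var tag = tagging.tags[j];
      if (!tagIndex[tag]) tagIndex[tag] = [];
      tagIndex[tag].push(tagging.item);
    }
  }

  // "Subscribing" to a tag amounts to watching this list for new entries.
  console.log(tagIndex["illustration"]); // ["photo-101", "photo-103"]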

Blogs and RSS

Another decentralized type of sharing and collaboration, blogs are very much part of Web 2.0, and the media term “blogosphere” suggests just how significant blogging has become. Blogs have been around for more than a decade, but only recently have they formed the vast, interrelated networks that allow users to comment on one another’s comments in endless conversations. Tim O’Reilly writes, “If an essential part of Web 2.0 is harnessing collective intelligence, turning the web into a kind of global brain, the blogosphere is the equivalent of constant mental chatter in the forebrain.”

Access to blog content has been greatly increased by the ability to subscribe to “feeds” through RSS (Really Simple Syndication). RSS delivers XML versions of content that can be presented on a range of sites and devices. Podcasts and video podcasts have extended this subscription system into a full multimedia distribution system, especially useful for content that might otherwise be marginal.
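
To give a sense of what a feed reader does with a feed, here is a small browser-side JavaScript sketch that parses a made-up RSS 2.0 document and pulls out each item’s title and link (real readers fetch the XML over HTTP first):

  var feedXml =
    '<rss version="2.0"><channel><title>Example Blog</title>' +
    '<item><title>First post</title><link>http://example.com/1</link></item>' +
    '<item><title>Second post</title><link>http://example.com/2</link></item>' +
    '</channel></rss>';

  // DOMParser is a standard browser API for turning XML text into a DOM tree.
  var doc = new DOMParser().parseFromString(feedXml, "text/xml");
  var items = doc.getElementsByTagName("item");
  for (var i = 0; i < items.length; i++) {
    var title = items[i].getElementsByTagName("title")[0].firstChild.nodeValue;
    var link = items[i].getElementsByTagName("link")[0].firstChild.nodeValue;
    console.log(title + " -> " + link); // e.g. "First post -> http://example.com/1"
  }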

Blogs, along with the use of digital video cameras and camera phones, have been an important part of the trend toward “citizen journalism,” yet another decentralized, bottom-up phenomenon.

Mashups and open APIs

The original context of “mashup” is popular music, where it refers to a digital mashing up or mixing together of sometimes very different songs to create something new. DJs (real and aspiring) used the large number of digital music files available on sites like Napster to create their own mashups. By analogy, a mashup now also refers to a “website or web application that seamlessly combines content from more than one source into an integrated experience” (Wikipedia).

Perhaps the best-known examples of this type of mashup are built on Google Maps and Google Earth, whose open APIs (Application Programming Interfaces) let developers add layers of information to an original map or satellite view. So, for example, someone can add comments about a village market in Mali or information about a restaurant in Minneapolis. Feed readers (such as FeedDemon), which gather together various user-selected feeds from news sites, blogs, and so on, are another type of mashup.
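
As an illustration, a minimal map mashup built on the Google Maps JavaScript API of this era (the GMap2 interface of API version 2, since retired) might look like the sketch below; the coordinates, element id, and restaurant details are invented, and the page is assumed to have loaded the Maps script and to contain a div with id "map".

  // Center the base map on downtown Minneapolis.
  var map = new GMap2(document.getElementById("map"));
  map.setCenter(new GLatLng(44.9778, -93.2650), 14);

  // Layer third-party information on top of Google's map: a marker whose
  // info window shows a user-contributed restaurant note.
  var marker = new GMarker(new GLatLng(44.9740, -93.2723));
  GEvent.addListener(marker, "click", function () {
    marker.openInfoWindowHtml("<b>Example Cafe</b><br>A user-contributed review could go here.");
  });
  map.addOverlay(marker);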

One of the reasons that companies like Google and Flickr are providing open APIs is to enable third parties to add features that drive traffic to their sites. Developers who aren’t on the company payroll end up adding features much faster than Google or Flickr developers could on their own. But the third-party contributors benefit too, by adding information or features that they want others to see and use.

Web apps and Ajax

Another indicator of the shift from Web 1.0 to Web 2.0 is a shift to the web as the platform for more and more applications. Web applications for inventory tracking, time reporting, sales planning, training, project management, and word processing (such as Writely) are all becoming increasingly common. The advantages of web apps over desktop apps are many. To note a few:

  • Ease and speed of maintenance and updates
  • Anytime, anywhere access
  • Ease of distribution
  • Larger audiences
  • Possibility for greater aesthetic appeal in a familiar web interface

However, web applications have often lacked the responsiveness and functional depth of desktop applications. For example, most of us are familiar with the inconvenient pause during data input while a web app saves data to a central server (during which time we watch the hourglass). But a programming approach that technology designer Jesse James Garrett labeled “Ajax” in early 2005 provides a way to solve this problem. The name Ajax—shorthand for Asynchronous JavaScript and XML—quickly caught on and has focused attention on expanding the capabilities of web-based applications.

Google applications are probably the best-known exemplars of Ajax. For example, Google Maps and Google Earth allow users to click and drag a map or satellite image rapidly to the north, south, east, or west with virtually no pause for the page to refresh after each motion. Google Suggest, still in beta, responds immediately to the first few characters a user types, providing a list of possible related search terms. Gmail, Google’s email application, is much faster than other web-based mail systems, in which messages and lists have to be downloaded again each time a user displays a new page.

Garrett explains how Ajax works:

An Ajax application eliminates the start-stop-start-stop nature of interaction on the Web by introducing an intermediary—an Ajax engine—between the user and the server. Instead of loading a web page, at the start of the session, the browser loads an Ajax engine—written in JavaScript and usually tucked away in a hidden frame. This engine is responsible for both rendering the interface the user sees and communicating with the server on the user’s behalf. The Ajax engine allows the user’s interaction with the application to happen asynchronously—independent of communication with the server. So the user is never staring at a blank browser window and an hourglass icon, waiting around for the server to do something. Every user action that normally would generate an HTTP request takes the form of a JavaScript call to the Ajax engine instead. Any response to a user action that doesn’t require a trip back to the server—such as simple data validation, editing data in memory, and even some navigation—the engine handles on its own. If the engine needs something from the server in order to respond—if it’s submitting data for processing, loading additional interface code, or retrieving new data—the engine makes those requests asynchronously, usually using XML, without stalling a user’s interaction with the application.
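
In browser terms, the heart of this pattern is the XMLHttpRequest object. The sketch below is a minimal illustration, not code from Garrett’s essay; the URL, field names, and status element are assumptions.

  // Save a form field asynchronously and update one element on the page,
  // with no full page reload and no hourglass.
  function saveField(name, value) {
    var xhr = new XMLHttpRequest(); // older IE used new ActiveXObject("Microsoft.XMLHTTP")
    xhr.open("POST", "/save", true); // true = asynchronous
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        // The user kept working while this round trip completed.
        document.getElementById("status").innerHTML = "Saved.";
      }
    };
    xhr.send(encodeURIComponent(name) + "=" + encodeURIComponent(value));
  }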

Although some people have suggested Web 2.0 is Ajax, or vice versa, it’s probably more accurate to understand Ajax simply as a programming approach that can create more possibilities for web apps. As the web becomes a platform for more applications, this in turn creates a challenge to the desktop model, especially as companies like Google provide open APIs, allowing others to add content and create hybrid applications. As Tim O’Reilly states, “Any Web 2.0 vendor that seeks to lock in its application gains by controlling the platform will, by definition, no longer be playing to the strengths of the platform. This is not to say that there are not opportunities for lock-in and competitive advantage, but we believe they are not to be found via control over software APIs and protocols. There is a new game afoot. The companies that succeed in the Web 2.0 era will be those that understand the rules of that game, rather than trying to go back to the rules of the PC software era.”

Open-source tools

Ajax isn’t proprietary technology. Similarly, many of the tools that have been used to run the sites and applications commonly identified as examples of Web 2.0 are open source, including:

  • Apache—a widely used open-source HTTP server
  • Linux—a Unix-like operating system
  • MySQL—a relational database management system that uses Structured Query Language (SQL)
  • Perl—a language commonly used for programming web applications
  • PHP—a server-side scripting language often used to generate dynamic pages and process complex web forms
  • Ruby on Rails—a web application framework that allows developers to create new applications quickly while implementing effective user interfaces

The open-source movement is motivated by the idea that source code should be made freely available to anyone who wants to use it, and it has had the effect of unleashing the collective talent of thousands of programmers to improve and build on existing code. SourceForge.net is a fascinating example of the open-source community in action: the world’s largest open-source software development website, it hosts more than 100,000 projects and has over 1,000,000 registered users.

Web standards, interoperability, and mobility

In addition to open-source software and scripting languages, Web 2.0 sites and applications are being developed using web standards: “CSS for layout, XML for data, XHTML for markup, JavaScript and the DOM [Document Object Model] for behavior” (Zeldman). Because these sites and applications have been built with web standards, thereby separating presentation from content, they are accessible to a greater variety of people and devices (such as handhelds). This interoperability and mobility is another notable characteristic of Web 2.0, and also another form of decentralization, because content is no longer “centered” or based in just one place. As McManus and Porter point out, “Designers have to start thinking about how to brand content as well as sites. It means designers have to get comfortable with Web services and think beyond presentation of place to APIs and syndication. In short, it means designers need to become more like programmers.”
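
As a small example of “JavaScript and the DOM for behavior,” the sketch below attaches behavior from a separate script rather than from inline onclick attributes, so the same markup still works on devices without scripting; the element ids are assumptions about the page’s markup.

  // Unobtrusive behavior: find the elements, then wire up the interaction.
  window.onload = function () {
    var toggle = document.getElementById("details-toggle");
    var panel = document.getElementById("details");
    if (!toggle || !panel) return; // the page still works without this script
    toggle.onclick = function () {
      panel.style.display = (panel.style.display === "none") ? "" : "none";
      return false; // suppress the link's default navigation
    };
  };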

Where does this road lead?

The short answer is that no one has a map. For now, Web 1.0 sites and applications far outnumber 2.0 sites and apps, and they will for some time. Although Web 1.0 opened up a new channel for organizations to communicate and sell, most sponsors of Web 1.0 sites were not especially concerned with creating a dialogue with users, or opening up possibilities for user sharing and collaboration; they just wanted a “presence” on the web. Most of these sites are digital brochures, though perhaps more informal and interactive than their paper counterparts.

Nothing is inherently wrong with this, especially if an organization does a good job of organizing and presenting the information it has. But as more organizations embrace the principles of Web 2.0, this approach may begin to seem old-fashioned in comparison with sites that facilitate dynamic forms of participation and collaboration that result in new ideas and new creations. More significant perhaps, organizations with Web 1.0 sites may miss out on opportunities to engage clients and prospects and create new business.

Even if some of the vanguard 2.0 sites and apps don’t last, decentralization, sharing, collaboration, and social networking will continue and take on new forms. This isn’t to say that people will no longer look for certain types of authoritative, top-down information, but it does mean that more managers, developers, designers, and writers will have to figure out how to navigate the Web 2.0 world to accomplish their goals, creating their own roadmaps as they go.

For now at least, consider a few questions about your own sites and apps to take a reading on your position.

  • Are you using web standards to write lean code that is accessible to different types of users and devices?
  • Are you taking advantage of open-source software and scripting languages?
  • Are you migrating applications to the web?
  • Are you evaluating whether your web content really matters? Is it content others would value and want to share? Is it easy to find? Are you thinking about how to brand your content? Are you adding, or considering adding, RSS to your sites (internal and external)?
  • Are you implementing, or thinking about, ways to make your site more dynamic by inviting user participation, sharing, and collaboration? Are you taking advantage of the knowledge and talent that your users (internal and external) can provide in a way that serves their needs as well as your own?
  • Are your programmers, designers, writers, and usability analysts working more closely together to create user experiences and content that serve the needs of customers and prospects in different niches and bring them back for repeat visits?

Things are moving fast—here’s looking forward to Web 3.0.