Tag Archives: 80/20

Ding, W., & Lin, X. (2010). Information Architecture: The Design and Integration of Information Spaces. San Rafael, CA: Morgan & Claypool Publishers.

Wei Ding and Xia Lin’s book grew out of their graduate classes on Information Architecture, and is “conveniently divided” into ten chapters for use during the ten weeks of an academic quarter. It offers a broad, basic introduction to the fledgling field of Information Architecture (IA), along with User-Centered Design (UCD) and Human-Computer Interaction (HCI).

The term Information Architecture was coined by Richard Wurman in the early 1970s – he saw it as gathering, organizing, and presenting information. According to Ding and Lin, the main goals of IA are to simplify information, to design and integrate information spaces, and to create ways for people to find and interact with informational content. Above all, they claim, IA aims to help people understand and manage information and “make right decisions.”

The authors emphasize that balance must be sought between user control and design, and that meeting user needs should always be the ultimate goal. They are careful to point out that even though the user is the center of focus, this does not mean ignoring business goals and market opportunities.

A successful Information Architect is able to align business goals with user needs.

The most useful parts of this book focus on user behavior online, citing several important theories that explain how and why users behave the way they do on the web. Zipf’s Law – essentially, that users will always take the path of least effort – is foundational to UCD online, as it helps tailor the ways IAs organize content. Fitts’ Law teaches IAs to make buttons large and to place them close to other, related buttons and icons, maximizing users’ speed and accuracy.
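Fitts’ Law can be made concrete: predicted movement time grows with the logarithm of target distance over target width, MT = a + b·log2(2D/W). A minimal sketch in Python – the coefficients a and b here are hypothetical placeholders, not measured values:

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predicted time (in seconds) to acquire a target of the given
    width at the given distance, per Fitts' Law: MT = a + b*log2(2D/W).
    The coefficients a and b are illustrative placeholders, not
    measured values."""
    return a + b * math.log2(2 * distance / width)

# A large, nearby button is quicker to hit than a small, distant one:
print(fitts_movement_time(distance=200, width=80))   # smaller
print(fitts_movement_time(distance=800, width=20))   # larger
```

Doubling a button’s width, or halving its distance, shaves the same amount off the predicted time – which is why big, nearby targets help users.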

The 80/20 rule is brought up here – the authors mention that 20% of the sources can generally be counted upon to provide 80% of the info. This ties right in with Barabasi’s Linked, and offers an explanation for the rich-get-richer phenomenon discussed in that text.

Barabasi, A. (2002). Linked: The New Science of Networks. Cambridge, MA: Perseus Publishing.

In his study of networks, Barabasi has put forth the dictum that everything touches everything. He says that the modernist practice of taking everything apart to examine its pieces does not guarantee that we will understand the way the pieces work when they are together – or that we will even ever understand how, exactly, to put them back together.

As one becomes familiar with different kinds of networks, certain shared characteristics become obvious. Networks look like webs without spiders: nodes are more or less connected, but no single node is solely responsible for maintaining connectivity.

Leonhard Euler is considered the grandfather of graph theory, and today we consider his work our basis for thinking about networks. His seven-bridges problem of Königsberg demonstrated that some questions depend only on a network’s structure: because more than two landmasses touched an odd number of bridges, no walk could cross every bridge exactly once.

Graphs or networks have properties, hidden in their construction, that limit or enhance our ability to do things with them.

Every network displays a separation between nodes of somewhere between two and fourteen degrees. Granovetter demonstrated the strength of weak ties, illuminating the importance of distributed connectivity.

Watts and Strogatz’s clustering model cracked Erdős and Rényi’s random worldview of networks: a random universe does not support connectors, the highly connected nodes that become network hubs.

A random network exhibits the familiar bell-curve pattern and is likened to a highway map: nodes are evenly distributed, with relatively equal numbers of paths leading to and away from each. A scale-free network, on the other hand, does not exhibit a bell curve; it may have two or three giants for every hundred small nodes. It resembles an airline route map, with some nodes handling far more traffic than others and servicing many, many smaller nodes. These scale-free networks adhere to power laws, not “natural” laws.
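The two shapes are easy to see in a quick sketch: degrees in a random network bunch around the average, while a heavy-tailed (power-law) draw produces occasional giants. The sample sizes, average degree, and tail exponent below are arbitrary choices, and the Pareto sample merely stands in for a real scale-free network’s degree sequence:

```python
import random

random.seed(42)
n, avg_deg = 1000, 6

# Random-network (Erdos-Renyi-style) degrees: binomially distributed,
# bunched around the average -- the bell curve.
random_degrees = [sum(random.random() < avg_deg / n for _ in range(n))
                  for _ in range(n)]

# Scale-free-style degrees: a heavy-tailed (power-law) draw, so a few
# giants coexist with many small nodes. The Pareto sample is a stand-in
# for a real network's degree sequence; the exponent is arbitrary.
scalefree_degrees = [int(2 * random.paretovariate(1.5)) for _ in range(n)]

print(max(random_degrees))     # stays close to the mean of 6
print(max(scalefree_degrees))  # far larger than its typical value
```

The highway map never strays far from the average; the airline map reliably produces a hub.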

Normally, bell curves (random networks) rule in nature, but when systems experience phase transitions, the move from chaos to order occurs as components begin to self-organize. This phenomenon produces scale-free networks containing hubs and adhering to power laws.

80/20 and rich-get-richer – in scale-free networks, the connectors with the most links become hubs. Almost without exception, 20 percent of the nodes are connected to 80 percent of the links, making them hubs. Once they’re hubs, they keep collecting links, which is what is meant by rich-get-richer.
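The 80/20 claim can be sanity-checked on a synthetic degree sequence. In the sketch below, link counts are drawn from a Pareto distribution whose tail exponent (1.16) is the classic value at which the top 20% hold roughly 80% of the total; the whole sequence is an illustrative assumption, not data from a real network, and the exact share fluctuates from run to run:

```python
import random

random.seed(7)

# Hypothetical link counts for 1,000 nodes, drawn from a heavy-tailed
# Pareto distribution. The tail exponent 1.16 is the classic value at
# which the top 20% hold roughly 80% of the total.
links = sorted((random.paretovariate(1.16) for _ in range(1000)),
               reverse=True)

top_fifth = links[:200]              # the best-connected 20% of nodes
share = sum(top_fifth) / sum(links)
print(f"Top 20% of nodes hold {share:.0%} of the links")
```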

Real networks are governed by two laws: growth and preferential attachment. But growth alone can’t explain the emergence of power laws. The Fitness Model explains how new nodes can still attract links when they come late to the game.
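Growth plus preferential attachment can be simulated in a few lines. This is a minimal Barabási–Albert-style sketch, not Barabási’s actual code; the network size and starting configuration are arbitrary:

```python
import random

random.seed(1)

# Growth + preferential attachment, in miniature: each new node links
# to one existing node chosen with probability proportional to degree.
# `stubs` holds every endpoint of every link, so a uniform draw from it
# is automatically degree-weighted -- the rich-get-richer step.
degrees = [1, 1]          # start with two nodes joined by one link
stubs = [0, 1]

for new_node in range(2, 2000):
    target = random.choice(stubs)
    degrees.append(1)             # the newcomer arrives with one link
    degrees[target] += 1          # ...and its target grows richer
    stubs.extend([new_node, target])

print(max(degrees))   # a hub, far above the average degree of ~2
```

Because well-connected nodes occupy more slots in `stubs`, they attract a disproportionate share of each newcomer’s links – hubs emerge without any central planning.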

Scale-free networks exhibit a high degree of disruption tolerance. The creators of ARPANET understood this, which is why they designed the network to keep functioning even if large portions of it were destroyed. When a web has no true spider, it can survive even devastating destruction.
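That tolerance can be demonstrated with a toy simulation – this is a sketch under arbitrary assumptions (network size, attachment rule, failure fraction), not the actual ARPANET design: build a small preferential-attachment network, knock out a fifth of its nodes at random, and check how much of it still hangs together.

```python
import random
from collections import deque

random.seed(3)

def build_scale_free(n, m=2):
    """Toy preferential-attachment network: each new node links to m
    existing nodes chosen in proportion to their current degree."""
    adj = {0: {1}, 1: {0}}
    stubs = [0, 1]                 # degree-weighted endpoint list
    for v in range(2, n):
        targets = set()
        while len(targets) < m:
            targets.add(random.choice(stubs))
        adj[v] = set(targets)
        for t in targets:
            adj[t].add(v)
            stubs.extend([v, t])
    return adj

def giant_component(adj, removed):
    """Size of the largest connected component once `removed` nodes fail."""
    alive = set(adj) - removed
    seen, best = set(), 0
    for start in alive:
        if start in seen:
            continue
        seen.add(start)
        queue, size = deque([start]), 0
        while queue:
            v = queue.popleft()
            size += 1
            for w in adj[v]:
                if w in alive and w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best

g = build_scale_free(1000)
failed = set(random.sample(sorted(g), 200))   # knock out 20% at random
survivors = giant_component(g, failed)
print(survivors)   # most of the 800 remaining nodes still hang together
```

Because most nodes are small, random failures rarely hit the hubs that hold the web together, so the bulk of the network stays connected.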

The structure of the web has an impact on everything – it “limits and determines our behavior in the online universe” (162).