Heading into a major site redesign, with a product team split among several divergent and firmly held opinions, I decided it was time to break out a card sort.
Card sorting is an old, and often overlooked, UX method for organizing information. In its simplest form, you hand a test subject a stack of index cards representing the site’s content and ask her to sort them into piles and name each pile. This is what’s known as an “open card sort” — the naming of the piles is left up to the test subject. This is a great way to discover mental models of an information space, natural groupings of content, and labeling. Once you have derived a set of generally accepted categories and labels, you can move on to a closed card sort. In a closed card sort, you will provide a set of category names and ask test subjects to place the content cards into the categories where they seem to fit best. This is a good way to test your nascent navigation scheme.
These tests are pretty straightforward. The difficulty comes when you go to collate the data. Measuring the results of a card sort comes down to similarity scores. Similarity scores measure the number of times different test subjects place the same card in the same pile. If all your test subjects sorted two cards into the same pile, then the two items represented by the cards would have 100 percent similarity. If half the users placed two cards together and half placed them in separate piles, those two items would have a 50 percent similarity score. Multiply that by 40 or 50 cards and a dozen or so test subjects, and you’re looking at a long, costly process of collation and counting.
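The arithmetic behind similarity scores is simple enough to sketch in a few lines of Python. This is just an illustration of the counting described above, not code from any card-sorting tool; the function and data names are made up for the example:

```python
from itertools import combinations

def similarity_matrix(sorts):
    """Compute pairwise similarity scores from card-sort results.

    `sorts` is a list with one dict per test subject, mapping each
    card name to the name of the pile that subject placed it in.
    Returns a dict mapping each pair of cards to the percentage of
    subjects who put both cards in the same pile.
    """
    cards = sorted(sorts[0])
    scores = {}
    for a, b in combinations(cards, 2):
        together = sum(1 for piles in sorts if piles[a] == piles[b])
        scores[(a, b)] = round(100 * together / len(sorts))
    return scores

# Two subjects agree on ("About", "Contact") but split on "Pricing".
results = [
    {"About": "Company", "Contact": "Company", "Pricing": "Shop"},
    {"About": "Info", "Contact": "Info", "Pricing": "Info"},
]
print(similarity_matrix(results))
# ("About", "Contact") scores 100; the pairs involving "Pricing" score 50
```

With real studies the same loop runs over 40 or 50 cards and a dozen subjects, which is exactly the collation work an online tool automates.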
That’s why I was very happy to find Optimal Sort, an online card sorting tool with great analysis components. Part of the Wellington-based Optimal Workshop, Optimal Sort allows you to create cards representing your content and then invite test subjects to sort and categorize the content by following a link to your survey. The real value, though, comes with the analysis. Optimal Sort provides a clutch of analysis tools, the most useful of which are the Similarity Matrix (shown below) and the Dendrograms. Both quickly highlight the obvious content clusters and suggest generally understood labeling. For me, those two features were worth the $110 monthly fee.
So far, we’ve had over 100 responses to our survey and some very clear patterns are emerging. It seems that when the humble old card sort is merged with some smart data analysis and presentation, we’ve got a great new UX tool.
An elegant new visualization of internet languages, tools and traffic was recently released by the Chrome team, Hyperakt and Vizzuality. It shows the converging streams of languages and viewers from 1990 to the present in The Evolution of the Web. Equally impressive is the growth of traffic from one petabyte a month in late 1995, to over 27 petabytes in 2011.
Netflix’s recent move to shift subscribers from DVDs by mail to streaming over the net holds some valuable lessons for the newspaper business, argues Ken Doctor, the author of Newsonomics. By making streaming roughly half the cost of DVDs by mail, Netflix is moving their customers to where the company needs them to be, Doctor writes in The Newsonomics of Netflix and the Digital Shift.
“Imagine 2020,” Doctor writes, “and the always-out-there question: Will we still have print newspapers? Well, maybe, but imagine how much they’ll cost — $3 for a local daily? — and consumers will compare that to the ‘cheap’ tablet pricing, and decide, just as they are doing now with Netflix, which product to take and which to let go.”
Of course, Netflix doesn’t have to contend with the huge revenue gap between print advertising and digital advertising as newspapers do. All that is still TBD, Doctor writes, but Netflix may point the way.
The May issue of The Atlantic has a funny-scary piece about a social engineering contest on Twitter. Titled Are You Following a Bot?, the brief article outlines a recently concluded experiment by the Web Ecology Project wherein socialbots (programs) were let loose on the Twitter network to try to win friends and influence people.
Turns out, they did pretty well. According to the Web Ecology post on the contest, “In under a week, Team C’s bot was able to generate close to 200 responses from the target network, with conversations ranging from a few back and forth tweets to an actual set of lengthy interchanges between the bot and the targets.”
Think of the labor that can be saved if you outsource all those boring tweets about what you ate for lunch and the cute thing your cat did today to a bot! Free from the chains of Twitter, regular people will have scads more time for walking around outside, or engaging in F2F conversations with other actual people. And, if socialbots can pass the Turing Test, marketers have gained a powerful new spamming tool.
Apparently, the applications of the tech may be a bit more sinister than that. As The Atlantic story noted, “A week after [the Web Ecology Project's] experiment ended, Anonymous, a notorious hacker group, penetrated the e-mail accounts of the cyber-security firm HBGary Federal and revealed a solicitation of bids by the United States Air Force in June 2010 for ‘Persona Management Software’—a program that would enable the government to create multiple fake identities that trawl social-networking sites to collect data on real people and then use that data to gain credibility and to circulate propaganda.”
In 2010, more Americans got their news from the net than from newspapers, according to the State of the News Media Report 2011, recently released by the Pew Trust. “The internet now trails only television among American adults as a destination for news, and the trend line shows the gap closing,” the report stated.
The study also identifies the structural changes underlying this shift in news consumption. “In the 20th century, the news media thrived by being the intermediary others needed to reach customers. In the 21st, increasingly there is a new intermediary: Software programmers, content aggregators and device makers control access to the public. The news industry, late to adapt and culturally more tied to content creation than engineering, finds itself more a follower than leader shaping its business.” Therefore, it’s not surprising that online ad revenue also exceeded ad revenue earned by newspapers in 2010.
Readers have certainly seen a change in the quality of news in the past decade and a half. Because the tools used to collect and disseminate information have become so inexpensive, anybody can be a reporter and a publisher. Consequently, there’s a lot more “content” available, but less actual news. At the same time, old media sources, like big metro papers, are engaged in a race to the bottom as they cut costs to adapt to the new business model. In between are a lot of so-called news organizations — AOL, Demand Media, Fox News — that are predicated on the idea that content must be cheap to be profitable.
The result is that the signal has been flooded with noise. Actual news has been replaced with conjecture, opinion and amateur reporting. A related effect of this shift in the business of news is that there is no longer a sense of consensual reality. As traditional media outlets chase more narrowly defined audiences, the idea of a “mainstream view”, or center, has vanished.
New technologies have always changed the way people learn and think. The printing press, steamships, the telegraph, radio and television have all caused upheavals in the news business. The internet is no different. Soon, I hope, people will find methods and sources to pull meaning from this muddle of information.
Here are a few predictions (more baseless opinion) about the shape of the news business in the future. Old media, newspapers and television, will continue to shrink budgets and cut staff until they arrive at a profitable business model — or fold. Radio is the exception to this, probably because radio’s cost of production is already low, but also because its mode of delivery is highly portable and convenient for news consumers.
The news field will become more crowded as amateurs pile in and various crowd-source schemes are tried and discarded. The successful model will probably be programmatic: an algorithm or faceted filter that pulls from dozens of live feeds to render a digest of current events. Google News is already doing something like this with their News For You filter.
The authoritative, professional sources of news that emerge will either be subsidized or collectivized. Subsidized news has been around a long time. The BBC and the CBC are leading examples. Newer players, like Al Jazeera, will thrive with this model too. Collectivized news, like the Associated Press and Reuters, will become more dominant as news outlets pool resources to get quality reporting. Citizen collectives like Democracy Now, Common Dreams and Wikileaks will also become more prominent.
And, finally, The New York Times’ new online subscription fees won’t add much to the company’s revenue. Although the method of implementing the new paywall is savvy, the price point is too high. At $35 a month for an all-access digital subscription, the NYT won’t see many non-institutional subscribers. For more info on that see nytimes.com/access.