This presentation by Sarah Bird was one of the highlights of #DefragCon. I really loved what she said and all the data she shared.

How to Build a B2B Software Company Without a Sales Team
Sarah Bird, CEO Moz — @SarahBird

  • Moz
    • $30M/year revenue
    • growing from 2007 to current day
    • Moz makes software that helps marketing professionals
  • Requirements for selling B2B software without a sales team
    • A nearly frictionless funnel
      • I hate asking for money
      • we made a company that rarely asks you for money
      • People find our community through our Google and social shares.
        • they enjoy our free content: helpful, beautiful.
        • Q&A section.
        • mozinars: webinars to learn about SEO, etc.
      • eventually, you may sign up for a free trial. 85% of customers sign up for a free trial.
      • customers visit us 8 times before signing up for a free trial.
      • moz subscription: $99/month is the most popular (and cheapest) plan
    • Large, Passionate Community
      • We’ve had a community for 10 years.
      • We were a community first. Started as a blog about SEO.
      • Content is co-created and curated by the community.
      • Practice what we preach.
      • 800k marketers joined moz community.
      • Come for the content, stay for the software.
      • No salespeople, but really good community managers.
        • Their job is to foster an inclusive and generous environment to learn about marketing.
    • Big Market
      • if you’re going after a small market, just hire someone to go talk to those people.
    • Low CAC & COGS business model
      • Cost of Customer Acquisition
      • Avg customer lifetime value: $980
      • average customer lifetime: 9 months
      • fully-loaded CAC: $137
      • approximate cost of providing service: $21/month
      • payback period: month 2
      • Customer Lifetime Value is on the low-end
        • moz: $980
        • constant contact: $1500
        • but we have the highest CLTV/cost ratio
        • cost
          • moz: $137
          • constant contact: $650
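The unit economics above can be sanity-checked with a little arithmetic. Here is a minimal sketch; the figures come from the talk, but the month-by-month payback logic and variable names are my own reconstruction:

```python
# Figures from the talk; the payback calculation is my own reconstruction.
monthly_price = 99      # most popular (and cheapest) Moz plan, $/month
cost_to_serve = 21      # approximate cost of providing service, $/month
cac = 137               # fully-loaded cost of customer acquisition
cltv = 980              # average customer lifetime value

monthly_margin = monthly_price - cost_to_serve  # $78 contribution per month

# Payback period: first month in which cumulative margin covers CAC.
cumulative, month = 0, 0
while cumulative < cac:
    month += 1
    cumulative += monthly_margin
print(month)  # 2, matching "payback period: month 2"

# CLTV/CAC ratio: Moz vs. Constant Contact (numbers from the slides).
print(round(cltv / cac, 1))   # Moz: 7.2
print(round(1500 / 650, 1))   # Constant Contact: 2.3
```

The ratio, not the absolute CLTV, is what makes the no-sales-team model work: Moz’s $980 CLTV is low, but it costs comparatively little to acquire each customer.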
    • Rethink Retention
      • Churn is very high in the first 3 months: 25% / 15% / 8%
      • But by month 4, churn stabilizes. Now you are a qualified customer.
      • Looking at first 3 months. composed of:
        • People I’m going to lose no matter what I do; they are not the target customer.
        • People I should be keeping, but I’m not.
        • People who I will keep even if I don’t spend effort on them; they “got it” right away.
      • Don’t worry about the first group; they are not the target customer. Let them go.
      • The second group keeps me up at night.
      • The third group: don’t worry about them either.
      • You must know how to tell these groups apart, especially with respect to their feedback. Feedback from the first group should be ignored!
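Those churn figures imply how much of a cohort survives to become qualified customers. A quick back-of-the-envelope sketch; the compounding arithmetic is mine, not from the talk:

```python
# Churn in months 1-3 from the notes: 25%, 15%, 8%, then stabilizing.
monthly_churn = [0.25, 0.15, 0.08]

retained = 1.0
for churn in monthly_churn:
    retained *= 1 - churn   # fraction of the original cohort still paying

print(round(retained, 2))   # 0.59 -> roughly 59% survive to month 4
```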
    • Heart-Centered, Authentic, Customer Success
      • Need awesome customer support team. we don’t have salespeople up front. Instead, we treat them really well once they are paying us.
      • We don’t try to use robots to save money.
      • We talk to the customers, visit their websites, suggest improvements.
      • We don’t have a storefront or physical presence. So how do we make the relationships longer, stronger? We send out happy packets of Moz fun stuff.
  • Benefits
    • Your community is a flywheel.
      • it takes time to get up to speed.
      • once the flywheel starts spinning, the community starts to create itself.
      • now moz is just the stewards of the community.
      • it’s like hosting a really great house-party of respectful guests.
      • it’s an incredible barrier to entry for competitors.
        • there’s no shortcut, no way to buy into this.
    • Low Burn rate helps when the economy goes in the shitter.
      • no sales team means less burn.
      • less capital required.
      • easier to self-fund.
      • no commissions to calculate.
    • the strategy generates lots of predictable recurring revenue: 96% of revenue is recurring.
    • risk is distributed across a broad customer base. even if the best customer leaves, it’s no big deal.
    • we can pour more dollars into R&D
  • Caveats
    • No magic growth lever: can’t just scale from 5 salespeople to 10 salespeople.
    • Will public markets and VCs continue to prize growth rate over burn rate?
  • Future of B2B Sales
    • Every business is a publisher.
    • Every business has a community.
    • Are you managing it?
    • Increased transparency around quality and pricing.
      • should lead to more corporate accountability.
    • Multi-channel, customer driven contact
    • customers want shorter contract cycles. Nobody wants to be locked into anything anymore.
    • Software sales begin with the people who use the software. They advocate to the C-suite.

Disclosure: I work for HP.

140 character customer support
Caroline McCarthy – covering social media for CNET news
Frank Eliason – Comcast Frank @comcastcares
Toby Richards – Microsoft. Everything that’s not phone support
Lois Townsend – @ltownsend, HP’s Social Media Strategy
Jeremiah Owyang – Altimeter Group
  • Recently in the news Kevin Smith, getting kicked off the Southwest plane. He twittered about it.
  • Is twitter really the best place for this?
    • Frank: It’s the customer’s platform. They can use it to affect policy.
    • Lois: It’s a way to connect with the customer, but not necessarily converse. Ideally you would reach them before they get so angry (like with Kevin). Our goal is to get a hold of them and get them help. Like a virtual concierge.
    • Frank: I disagree. It is a place for a dialogue. Kevin Smith’s problem could have gotten resolved right at the airport over twitter. But we don’t want to do different things just because someone is being loud. That isn’t customer service, that is just PR.
      • Everyone would like their cable to be free. But we can’t do that. But we can have a discussion about why it is what it is. We can become more transparent.
  • Does your PR department or your customer service department manage your twitter connection? How much should they be communicating?
    • Jeremiah: They should be wearing the same shoes. Customer support is PR. Customers don’t care what department you’re in, they just want their problem solved.
  • Is anyone from other parts of the company watching over you, telling you what you can and can’t say? (directed first at Toby)
    • Well, I was just part of the panel on “don’t be sued”. The attitude we have now is to be as transparent as possible. It doesn’t always mean saying yes to the customer. There’s no oversight on how we engage in customer support in the public domain. But there are core principles, and there is a conversation. You can create a harmonious working relationship with the different departments.
  • What happens if you get Kevin Smith’ed? He’s got millions of followers, and even fanboys. He can motivate a lot of people.
    • Frank: We’ve had it happen. We have some people who are very loud about it. But the biggest hit to the Comcast brand are two videos on YouTube that still show up first. They didn’t come from 1M+ followers, just regular people.
    • Frank: We had an issue with nudity during the Super Bowl. But we were talking about it and telling people what we knew in real time. It quickly became boring, which diffused it.
    • Toby: We’ve had lots of people come to forums with issues. On one big issue, we were able to respond quickly, get a patch out, and tell people about it, and the issue was resolved in a week.
    • Lois: We had one where our product appeared to be racist (referring to the webcam face detection); we responded quickly and explained the limitations of the technology. Many of our things are pretty boring… People were frustrated by lack of drivers; we explained what goes into getting a driver released and why it took time.
  • Q: What are the listening tools?
    • Toby: We use Blue Ocean. We have an internal process for escalation. We have call monitoring and community forum monitoring. We have a process to get an issue to the right person fast. Some issues are really hot and require an immediate response; some are more systemic issues that we drive over time.
    • Lois: The listening tools make our jobs so exciting. We can get this information from our customers, and bring it to our colleagues, and it is extremely compelling to be able to bring customer words. HP, Microsoft, we’re big companies, and there’s tons of metrics and reports, but they are dry compared to customer words.
    • Frank: We use Radian6. But you can start with Twitter search or Google Blog Search.
    • Jeremiah: We just published a report on social CRM tools. Social media is not scalable. Frank has how many managers… (10 community managers)… You can’t scale in real time.
      • Frank disagreed: it can be scaled. Not all 25M customers are saying Comcast, Comcast, Comcast all day. If they were, phones wouldn’t scale either. At one time Legal said it wouldn’t scale, but Legal doesn’t review everything that goes out.
  • Q: How large are your teams? Are they spread out or in one location? How do you manage expectations of responsiveness?
    • Lois: my direct reports are very small: 11 people. But beyond that, there is an incredible virtual network… we call it the HP Ambassador program… We have about 75 employees who are very active in responding to customer problems. It ranges from a dozen up to 100 people, depending on how you count.
    • Frank: We have 10 people, plus a manager. We review about 10,000 blog posts a day, about 2,000 twitter posts a day, about 200 facebook posts, and about 600 forum posts a day. The bulk of my team’s work is to put an email address in forums; we get about 6,000 emails a month, and the bulk of what we do is refer.
    • Toby: (missed some of the data…) we have about 200 people in low-cost labor markets that are present in various online communities. We connect with people in social media… the community influencers. We can get scale, and reach by working with these community influencers.
  • Q: How do you bridge the gap between the online world and the real world? (e.g. with Kevin Smith – on twitter, Kevin had received a response, but the gate agent didn’t know about it.)
    • Lois: sometimes the front-line person doesn’t have the same knowledge of what can be done for a customer. We had a customer who was in Spain… our twitter person was able to get him an agent who could help him in English, in his motor home, in Spain, with his English product.
    • Frank: but have you changed your process for your Spanish-speaking agents, so that they have a process for getting an English agent for a customer that needs it?
    • Lois: agreed, you solve the immediate problem, but you need to use that feedback to make improvements.
  • Q: What about using social CRM?
    • Jeremiah: We need social CRM to be able to tie customer records to these social interactions, so that we can track them.
    • Frank: it’s about changing the company culture to be customer service centered
  • Problem: Are you teaching your customers that the best way to get support is to yell to all their friends?
    • Frank: the customer was already doing this before we ever showed up. They were choosing to yell. But by responding, you change these people into advocates when you help them.
    • Frank: regarding spam, we’ll look at it once, and ask if they need help, but we won’t go on forever. We can agree to disagree, and be nice throughout the whole thing. You will sometimes have people that are very personal, hurtful, and angry. You have to have thick skin in this place.
    • Toby: Agree. We have a lot of customers, those customers don’t have access to Microsoft people, they don’t see us as people. But if we can engage, that changes things.
    • Lois: You’re not being measured on the cost of the phone call in this case, so you do have the ability to carry on more of a conversation. We can learn about what happens when customers have relatively unique configurations, and then share those lessons with other customers.
    • Lois: We are, to an extent, a minor part of the community. 95% of interactions are among the community, without HP being involved. Those community interactions are really rich. People are helping each other.
  • Q: Do you have policies about employees doing social media? Do you have a social media style guide about what those interactions should look like?
    • Frank: We do have a policy. It’s simple: don’t release proprietary information that isn’t available to the public, be nice, be honest, be truthful about who you are. But we don’t center it in one style… People should be themselves.
    • Toby: The people who work for you in the community become part of the community; they are recognized as individuals. We tell people to let the marketing team make the value proposition for new products, not to tell people what they think.
  • Q: First, when you introduced social media customer support, how did you leverage traditional channels? How did you get them integrated? Second, in a global environment, how does this play out in other countries where you might have support? Third, for B2B, what kind of social channels are you planning?
    • Jeremiah: tell people on phone hold about your community, show it on your website. on the website, it shows that there is a thriving community.
    • Lois: We did forums early, but they were very agent focused. Not much of a true community, just a different kind of agent support. When we relaunched, we worked hard to foster the community.
    • Toby: When we start a new community or venture, we have a dedicated team at first, and after the kinks are worked out, then we diffuse it into the rest of the organization. We believe in having a center of excellence, in making sure we have the skills to engage with customers online.
    • Lois: We started with English, then did six additional languages. Now we’re focused on improving those forums. When you do language localization for official support, it has to be perfect. When people communicate in forums, they are very forgiving of 2nd, 3rd levels – it just has to be helpful, it doesn’t have to be perfect.
  • Q: How do you deal with the dissonance of excellent customer service via twitter versus torturous customer service by phone?
    • Frank: we have to put it in the face of executives to show it. We’re measured on the number of customers we help, not how many emails we send. So we work to do something with the feedback we get. Not overnight, but we are changing. Comcast’s credo is now centered on best customer experience.

A few weeks ago, I posted about a framework for considering the customer support experience. The framework is composed of four phases: awareness, navigation, diagnosing, and solving.

Pete Hwang, a colleague of mine, asked if the framework could be shortcut by people using Google:

Framework makes sense. But could you argue that web savvy customers who start with a Google search shortcut the whole thing? The customer experience journey in this case: Customer recognizes they have a problem. Google for error code or best guess at describing problem. View hits on possible solutions… Corroborate entries and take educated guess at likely best solution. Hopefully, solve problem. Otherwise, look toward 2nd most likely answer. If still not solved, try a fresh google search with fresh search terms. The official HP support doc only comes into play as it receives high credibility (high ranking) on Google.

Pete’s point is certainly valid. The framework isn’t intended to dictate a specific support path down which the customer is forced, but instead is a thinking aide to understand their experience and compare the role that different tools play. The way that we’ve used the framework is by describing different support tools within an “AAA NNN DDD SSS” diagram. The letters stand for Awareness, Navigation, Diagnosis, Solution. We repeat the letters simply to remind ourselves that frequently there may be multiple steps within each phase, and that different tools may help get the customer through different phases.

Let’s look at an example. In the picture, row 1 shows what most companies’ idealized view of the web support experience looks like. In this idealized picture, the customer thinks first of the company’s web site, enters through the home page, navigates through the site using links, and ultimately finds a web document to solve their problem.

Row 2 shows a common alternative, and the one described by Pete. A customer does a Google search on a product, and again in the ideal world, the Google search’s first result is a link to the support document for that problem.

Row 3 is another alternative – a Google search leads to a web forum discussion. Web forums frequently do exceptionally well in search engine results, so if a forum exists, and the problem has previously been posted, odds are good that a forum result will be high up in the search results.

Row 4 is just a hint of another tool that companies have at their disposal, but frequently don’t take full advantage of. Error messages displayed by software applications or even hardware devices do much to make the customer aware of the problem, but usually do very little to help them find a solution. An error message that was linked back to a support document would directly make the link from problem to solution, and would bypass the need for navigation and diagnosis.
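As a toy illustration of that last idea, here is a hypothetical error type that carries a link to its own support document. The class name and URL are invented for the example, not drawn from any real product:

```python
# Hypothetical sketch: an error message that links straight to its fix,
# letting the customer skip the navigation and diagnosis phases entirely.
class PaperJamError(Exception):
    SUPPORT_URL = "https://support.example.com/docs/paper-jam"  # assumed URL

    def __str__(self):
        return (
            "Paper jam detected in the printer. "
            f"For step-by-step help, see {self.SUPPORT_URL}"
        )

print(PaperJamError())
```

The interesting design choice is that the link is part of the error itself, so every surface that displays the message (dialog box, log file, printed slip) automatically points the customer at the solution.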

Please let me know if you make use of this framework. I hope it will be of use to you.

In the technical support documentation space I’ve been recommending wikis as a way to enhance collaboration on support documentation, as an alternative to the traditional model of having a small cadre of technical writers and experts use a content management tool to publish documents to the web.

While I’m an advocate of opening up the wiki to customer input, there are levels of collaboration that may make it easier for companies to get their feet wet without going so far as to open it up to customer input. The wiki could be used, for example, to allow input from other employees across the company, from R&D engineers to call support agents.
However, whenever I propose this, the established parties usually say “Why don’t we just fix the content management process we have?” or “If all we want to do is collaborate inside the company, we can use the content management tools we have for that.”
Wikis aren’t just another content management tool however. Wikis embody design principles that encourage contributions. When Digg was first implemented, the earliest versions had a two-step process to submit a digg vote. Kevin Rose, founder of Digg, spoke about the impact that moving from a two-step to one-step process had on the site:
There was a huge shift in activity on Digg when we made the move to the one-click digg in November 2005. Once we added Ajax, activity went through the roof on [the number of] diggs. It was just insane. Just the ease of the “one-click and you’re done” made all the difference in the world. Once the users grasped that the content is syndicated to friends, friends’ activities then went through the roof. These small incremental steps in feature additions drove the growth.

The more direct and lightweight the process is for contributing, the greater the number of contributors. And it’s not just pure volume of contributors: a simple contribution at first can then lead a user from passive recipient to enthused contributor. The editors at Wikipedia that devote much of their lives to upholding the quality of Wikipedia all started their involvement with a single, simple contribution at some point in time.

Wikis are perhaps the purest embodiment of the design principles of directness and lightweight processes. Every page has an edit button, so contributions are never more than a click away. The act of adding a few words to a document is rarely more than clicking edit, inserting those words, and then clicking save.
Contrast that with a typical content management system. Suppose I’m browsing support documents on the web and spot an error in a document. If I’ve already used the content management system before, I then have to:
  1. find/launch the content management system
  2. login
  3. navigate to the document I was already viewing, usually by an obscure mechanism that isn’t the URL of the public document
  4. choose to edit the document
  5. make the edit
  6. save the document
  7. probably go through an edit review process, relying on other people to review the edit
  8. wait for notification that the edit is published
  9. check that the web document reflects the change
If I haven’t used the content management system, I would need to:
  1. Find out how the content is managed, probably by emailing peers until I get an answer
  2. Find out how to apply for a login
  3. Justify my need/right to modify the content (usually a lengthy process)
  4. Find out how to use the system
The two choices differ so significantly in effort involved, that the result is not just a quantitative difference in the number of contributions, but a qualitative one as well: true collaboration among a large group of contributors is unlikely using a traditional content management tool, because only those whose primary job it is to manage content are likely to invest the effort to use it.
By comparison, a wiki makes it clear that editing is possible, puts the edit tool only a click away, and removes the step of having to renavigate to the content to be edited. While these steps may seem small, as we saw with the Digg example, small reductions in effort correspond to large increases in contributions.
Side note: I’m currently reading Designing Web Interfaces: Principles and Patterns for Rich Interactions, which inspired some of these thoughts.


When large companies put technical support content online, the effort frequently comes with a variety of pitfalls. A large company may have dozens or hundreds of products, and each of those products may have dozens or hundreds of support documents, leading to many tens of thousands of support documents.

This can make it exceedingly difficult to find the right help content for any given product and problem. As a result, companies undertake a variety of web site improvements aimed at helping customers get their problems solved: from improving navigation to providing top-FAQ lists, interactive troubleshooting tools, and more.
But how do each of the potential tools affect the customer experience? How do they relate to each other? And what part of the customer experience are they really seeking to improve?
Here is a framework Steve DeRoos and I like to use to think about technical support experiences. To help customers get their problem solved using eSupport, there are four phases of the user experience to consider:
  1. Awareness
  2. Navigation
  3. Diagnosing
  4. Solving
Awareness: How do your customers become aware that you offer self-support help? If you’ve recently improved your self-support help (for example, if you’ve recently added forums), how will customers be aware of those changes? Most customers, particularly web-savvy customers, will choose a website visit over a phone call to get their problem solved. But less savvy customers, or customers who previously had a bad web support experience, are likely to gravitate directly to phone support.
However, the more web-savvy they are, the more likely they are to go direct to Google to get their problem solved.

Navigation: Once the customer has made the decision to use self-support, how do they find it? Do they need to navigate to a product specific area of the website? Can they search and find it? Are there multiple kinds of support content and tools – if so, how does a customer choose which one to use? One example of a tool designed to make support navigation easier is HP’s Automatic Product Detection, a tool that detects what HP products the customer has, and links directly to the support pages for those products.
Diagnosing: When the customer finally arrives at the support content, how do you narrow down the specific problem the customer is having? Is it an install problem or a use problem? If it is a paper jam on an all-in-one printer, is it a paper jam for the printer portion, or the scanner portion? With regular paper or something unusual like labels or card stock?
Solving: When the problem is finally known, what are the steps to solve it? How do you ensure that the customer follows through on the steps? If there are multiple ways to solve the problem, how do you lead the customer through each of the different ways? How do you know if the problem is resolved?
Having a framework like this helps ensure that all aspects of the user experience are considered. It also helps when considering proposed investments in eSupport: What problem is being solved? What percentage of the user base will it work for? And what is the business impact of improving that aspect of the user experience?
*Bridge photo used under Creative Commons license. Original photo by mozzercork. Shakey Bridge, Cork City, Ireland

Research from Stanford School of Business Professor Itamar Simonson and coauthor Chezy Ofir of Hebrew University in Jerusalem points out that telling customers they will be surveyed, or asking them about their expectations ahead of time, creates significantly more negative feedback. A quote from the article:

The researchers found that people who expect to evaluate are decidedly more negative. They also discovered that merely asking people to state their expectations before they receive a service made people more negative—even though their predispositions may have been quite positive. For example, people who are asked if they think they will like a movie before seeing it will be statistically more negative than people who were never asked that question.
Simonson and Ofir studied the responses of customers who had called for service at a major computer hardware and software company. The researchers divided the customers into four groups. Participants in the first group were told a technician would service their problems and that they would subsequently be asked about the service, such as whether the tech was on time, whether the employee was polite, and whether he or she solved the problem. A second group was not told there would be an evaluation, but the customers were asked to state their expectations, such as how long they thought it would take for a tech to arrive. A third group was told both: to state their expectations and to expect a survey. Members of a control group knew nothing but were later polled.
The result: People who expected to evaluate were significantly more negative than members of the control group. The same was true of the group asked to state their expectations ahead of time. Interestingly, the group that was the most dissatisfied was the one that was asked their expectations and also warned about a survey.

This has serious implications for customer satisfaction surveys, but also for product research groups. Showing product prototypes to customers in a research setting is a context in which participants will frequently both be asked about their expectations and expect a survey. The effect can be research that “finds” problems that aren’t really problems:

The researchers warn that while marketers must stay on top of customer desires and complaints, they must also be aware of the effects the mere expectation of filling out a survey can have on how customers view their experience. “It may not be realistic,” says Simonson. “They may be chronically more negative, pointing out problems that are not problems to the average consumer,” he says. “You want people who are representative of the marketplace.”

This suggests that if you have any opportunity for analysis that doesn’t rely on surveys, but instead relies on behavior, the results are likely to be more accurate. Social media buzz, word of mouth, and collective intelligence applications based on behavior may all be more accurate than survey responses.

As I mentioned in my last blog post, I recently had the privilege of seeing and talking with Ward Cunningham, inventor of the wiki. In 1995 he built the first wiki as a tool for collaboration with other software developers and created the Portland Pattern Repository.
Virtually everyone is familiar with wikis at this point. It’s the web you can edit. Wikis reached their widest audience with the creation of Wikipedia. For a wiki to work well, it is essential that there is a motivated critical mass of participants maintaining it. For Ward’s original Portland Pattern Repository wiki, the motivation for users was to advance the way software was developed. For Wikipedia contributors, the motivation is to build a comprehensive encyclopedia of knowledge. Both of these goals are important to their respective populations of users. Wikipedia contributors, for example, may spend dozens of hours per week in unpaid work to make Wikipedia a better encyclopedia.
Why is a motivated critical mass of contributors so important? Some of the key contributions a wiki needs to thrive:

  • the contribution of original material
  • improving or correcting material
  • building linkages between topics in the wiki
  • various forms of wiki gardening that include:
    • ensuring topics conform to good style
    • correcting mistakes
    • building and improving trailheads and maintaining trails
    • filling in critical missing gaps
    • removing spam and user errors (such as when a user accidentally deletes content from a post)
    • and monitoring changes to spot when any of the above gardening is needed

What if you want a wiki but lack a sufficiently large, sufficiently motivated group of contributors? This is a problem I’ve been thinking about for some time. Derek Powazek says that what we need to do is “smallify the task”. In part that can be done by breaking down big tasks into smaller tasks. It can also be achieved by finding ways to eliminate some of the bigger tasks.
In that case, could the gap between the minimum critical mass and motivation needed and whatever actual user population you might have, be at least partially mitigated by some kind of automation or collective intelligence? In other words, could some of the work normally done through the explicit contributions of a small group of committed users be instead done through implicit feedback and machine intelligence? Below I’ve captured my thinking in this space.
Let’s start by assuming that, at a minimum, raw contributions would still have to come from people, as would corrections to the material. But could we simplify building linkages, improving trailheads and maintaining trails, and spotting and removing mistakes or spam?
Automating linkages
Wiki topics are frequently characterized by many hyperlinks between topics. In fact, the rich hyperlinking between topics is very much a part of what makes wikis so effective. However, chasing down all the right links between topics can add to the effort of writing a topic in the first place, or of maintaining the wiki over time.
There is a technique that can be used to create a list of suggested topics that are related in some way to the topic a user is currently viewing. This technique relies on observing the behavior of previous users to determine what topics those users viewed, and in what order. For example, even if there isn’t a rich set of hyperlinks, a small subset of users will be motivated enough by their need for information to seek out other topics. They might do this by using search, the recent changes page on a wiki, or by navigating a convoluted set of hyperlinks to get to an ultimately useful destination.
The analysis then consists of using the clickstream data of these visitors to determine, for any given page A, what other pages also seem useful, considering, for example, the amount of time spent on each page and the order in which the pages were visited. For example, if I view the clickstream data, I may see a number of users who visit topic A, then go on to visit some intermediate topics very briefly, and then spend a significant amount of time reading topics D and G. I may then conclude that other visitors who read topic A may also be interested in topics D and G.
We would need the user interface of the wiki to support presenting a list of recommended topics. Then when any visitor views the topic A page, the wiki can present recommended links for topics D and G. This makes it more likely that subsequent users are able to find related topics without going through search, recent changes, or long navigation paths. The net effect is pretty similar to the effect achieved in good wiki gardening when topics are appropriately hyperlinked together. But since this technique can be automated, it becomes possible to increase the usefulness of the wiki while decreasing the effort needed to maintain it.
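The dwell-time analysis above can be sketched in a few lines. This is a hypothetical illustration, not a real wiki implementation: I’m assuming sessions are available as lists of (topic, seconds-spent) pairs from the access logs, and the 30-second threshold separating “read” from “passed through” is an arbitrary choice.

```python
from collections import defaultdict

# Hypothetical sketch: each session is a list of (topic, seconds_spent)
# pairs drawn from the wiki's access logs. Topics viewed only briefly are
# treated as pass-through navigation, not destinations.
MIN_DWELL = 30  # seconds; illustrative threshold, would need tuning

def related_topics(sessions, top_n=3):
    """For each topic, rank other topics that the same visitors went on to read."""
    scores = defaultdict(lambda: defaultdict(float))
    for session in sessions:
        for i, (topic, _) in enumerate(session):
            # Credit every later topic the visitor actually spent time reading.
            for later_topic, dwell in session[i + 1:]:
                if dwell >= MIN_DWELL and later_topic != topic:
                    scores[topic][later_topic] += dwell
    return {
        topic: [t for t, _ in sorted(related.items(), key=lambda kv: -kv[1])[:top_n]]
        for topic, related in scores.items()
    }

# Two visitors read topic A, skim intermediate topics B and C, and settle
# on topics D and G -- mirroring the scenario described above.
sessions = [
    [("A", 120), ("B", 5), ("D", 200), ("G", 150)],
    [("A", 90), ("C", 8), ("G", 180)],
]
print(related_topics(sessions)["A"])  # ['G', 'D']
```

Weighting by total dwell time is just one heuristic; counting distinct sessions, or discounting topics that appear many clicks later, would be reasonable variations.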
Automating Trailheads
A similar technique can be used to create and improve trailheads. The term trailhead as used for wikis comes from the trailheads associated with hiking trails. Typically there may be a large, interlinked network of hiking trails. A hiking trailhead is a place to enter the network of hiking trails. At a hiking trailhead there is frequently a map indicating what trails exist and how they connect. A wiki trailhead performs a similar role. For someone coming to the wiki from the greater Internet, the wiki trailhead helps them orient themselves to the organization of information on the wiki and decide where and how to start reading through the wiki based on their interests.
Increasingly, people get to destination websites from search engines such as Google. While Google is frankly amazing at matching search terms to useful webpages, it can sometimes drop you into the middle of the website experience. That is, while the destination page may be the one with the content that most closely matches the search term, there may be very useful and relevant information on other pages that are related to the current page. And it may not be obvious how to navigate to those other pages. This is very similar to the related topic analysis I described above.
However, in this context we have additional information that can be used to better predict what other pages or topics the user will be interested in. When Google, or another search engine, sends a visitor to our site, the referral field will tell us the URL that the user came from. For search engines, this referrer URL will include the search term the user was searching for. This means that when we do clickstream analysis and analyze how users visit the pages on our site, we can determine not just that readers of topic A are likely to be interested in topics D and G, but that if a reader of topic A comes to our site from a search engine having searched for a given term S, then they are most likely to be interested in topic G, but not D. This adds a level of refinement to our basic predictive algorithm and creates a better experience for users who come to our website from search engines.
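Extracting the search term from the referrer is straightforward, at least for the common case. A hedged sketch, assuming log records arrive as (referrer URL, topic read) pairs and that the engine puts the query in a `q` parameter as Google does; other engines use different parameter names:

```python
from collections import defaultdict
from urllib.parse import urlparse, parse_qs

def search_term(referrer):
    """Extract the search term from a search-engine referrer URL, if any."""
    qs = parse_qs(urlparse(referrer).query)
    return qs.get("q", [None])[0]

def recommendations_by_term(log, top_n=2):
    """Count which topics visitors went on to read, keyed by their search term."""
    counts = defaultdict(lambda: defaultdict(int))
    for referrer, topic in log:
        term = search_term(referrer)
        if term:
            counts[term][topic] += 1
    return {
        term: [t for t, _ in sorted(topics.items(), key=lambda kv: -kv[1])[:top_n]]
        for term, topics in counts.items()
    }

# Hypothetical log: visitors arriving on the same search term mostly
# ended up reading topic G rather than topic D.
log = [
    ("http://www.google.com/search?q=canonical+urls", "G"),
    ("http://www.google.com/search?q=canonical+urls", "G"),
    ("http://www.google.com/search?q=canonical+urls", "D"),
]
print(recommendations_by_term(log)["canonical urls"])  # ['G', 'D']
```

In practice the two signals would be combined: fall back to the general topic-A recommendations when the search term is missing or hasn’t been seen often enough to be predictive.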
Automating Weeding
We can also borrow a technique from sites such as Engadget, Gizmodo, and Slashdot to make spotting and removing bad content or spam much easier. The comments on Engadget and Gizmodo can be rated by viewers with +, -, or !. The plus means the comment is good, the minus means the comment is bad, and the exclamation point means the reader wants to “report the comment”, such as for bad language or spam. Many other sites utilize similar techniques for comments and discussion threads. Highly rated comments either float to the top, get presented in bold, or otherwise stand out. Low-rated comments float to the bottom, get grayed out, or otherwise are diminished in importance. Reported comments may vanish from the site entirely. All of this happens with no manual intervention. Instead, it relies on minimal input from many users.
A similar technique could be used on a wiki. If we allow users to rate a topic, or even sections within a topic, with a plus, minus, or report, then we can again apply some automated analysis to determine what to do. Topics that are “reported” by a certain percentage of viewers should relatively quickly go away (with repeated occurrences by the same contributor ultimately resulting in banning the contributor altogether). Topics that are rated down by a certain percentage of viewers should diminish in importance, which could be indicated by the way the topic is displayed (perhaps with grayed-out text), by eliminating incoming links to the topic, or by removing the topic entirely. Or, if there is another similar topic that is highly rated, perhaps the highly rated topic replaces the lowly rated one.
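The rules above reduce to a small decision function. This is a minimal sketch under assumed thresholds (10% reports to remove, a majority of downvotes to gray out); real values would have to be tuned against actual abuse patterns:

```python
# Hypothetical sketch of the weeding rules described above: each topic keeps
# tallies of plus, minus, and "report" votes, and its display state is derived
# automatically from the proportions. Thresholds are illustrative only.
REPORT_THRESHOLD = 0.10    # fraction of raters reporting that removes a topic
DOWNVOTE_THRESHOLD = 0.50  # fraction of raters downvoting that grays it out

def topic_state(plus, minus, reports):
    """Derive a topic's display state from its vote tallies."""
    total = plus + minus + reports
    if total == 0:
        return "normal"
    if reports / total >= REPORT_THRESHOLD:
        return "removed"      # likely spam or abuse; goes away quickly
    if minus / total >= DOWNVOTE_THRESHOLD:
        return "grayed-out"   # diminished; candidate for replacement
    return "normal"

print(topic_state(plus=40, minus=3, reports=0))  # normal
print(topic_state(plus=2, minus=10, reports=0))  # grayed-out
print(topic_state(plus=5, minus=1, reports=2))   # removed
```

A production version would also need rate limiting and per-user vote tracking, since a threshold this simple is easy for a single hostile visitor to game.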
Automating Good Style
Another area that gardeners of a wiki spend considerable time on is ensuring that pages conform to good style. Good style may vary by wiki and by group of collaborators, but essentially it is the conventions that ensure a certain degree of uniformity, usefulness, and accessibility of the information contained in wiki topics. It varies by group because different groups have different goals: Wikipedia’s contributors are historians, and they seek to document things from a neutral point of view. The Portland Pattern Repository’s contributors were software developers who were activists for a software development methodology, and they sought discourse and understanding.
A form of template, or gentle guidance, could help ensure that pages conform to good style without manual intervention. For example, a wiki that contains troubleshooting information might guide the user contributing a new topic towards organizing their information as a problem statement, and then a series of solution steps. Subsequent contributors to the topic might be guided to add comments on a section, add solutions to a problem, or qualify a solution step with the particular set of problem conditions.
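One way to represent such gentle guidance is as data rather than enforcement: a list of sections, each with a prompt shown to the contributor. This is purely a sketch of the troubleshooting example above; the section names and prompts are invented for illustration, and nothing prevents free-form text outside the template.

```python
# Hypothetical sketch: a gentle topic template for a troubleshooting wiki.
# Each section carries a prompt shown to the contributor. The template
# suggests structure but does not enforce it.
TROUBLESHOOTING_TEMPLATE = [
    ("problem", "Describe the symptom and the conditions under which it occurs."),
    ("solutions", "List the solution steps, one per entry."),
    ("applies_when", "Optionally note which conditions each solution covers."),
]

def new_topic(title):
    """Create a fresh topic pre-populated with empty, prompted sections."""
    return {
        "title": title,
        "sections": [
            {"name": name, "prompt": prompt, "text": ""}
            for name, prompt in TROUBLESHOOTING_TEMPLATE
        ],
    }

topic = new_topic("Printer jams on duplex pages")
print([s["name"] for s in topic["sections"]])
# ['problem', 'solutions', 'applies_when']
```

Because the template is just data, different wikis (or different topic types within one wiki) could carry different templates, which matches the observation that good style varies by group.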
The trick would be to balance this guidance with the necessary freedom to ensure that users are not too constricted in their options. Systems that are too constricted would likely suffer from several problems. One problem is that the site would not appear alive in the way that wikis frequently appear alive. (By comparison, Sharepoint sites are highly constricted in what information can be placed where, and they never display the sense of liveness that a wiki does.) Another problem is that contributors may feel stifled by the restrictions placed on them and choose either not to contribute at all, or not to contribute with their full creativity and passion. I can’t quite envision exactly how this guidance would work, but if it could be figured out, it would go a long way to further reducing the maintenance workload of the wiki.
In summary, what I’m trying to envision is a next-generation wiki that combines the editable webpage aspect of any other wiki, with collective intelligence heuristics that build upon the implicit feedback of many users to replace much of the heavy lifting required in the maintenance of most wikis. This will be useful anytime the intended users of a given wiki are not likely to have a critical mass of motivated contributors. It will not substitute for having no contributors, and it will not work in the case of the wiki with very few users (such as a wiki used by a single workgroup inside a closed environment). But it may help those groups that are on the borderline of having a critical mass of contributors, and have a sufficient mass of readers.
I’m very interested in hearing reactions to this concept, and in learning of any current efforts in this direction with wikis.
*Note: This post was updated 4/9/2009 in response to feedback. It is largely the same content with some additional clarifications. — Will Hertling

Get Satisfaction, the “people powered customer service company” is hosting a webcast on The Ten Commandments of Community Management on Wednesday, March 25th (tomorrow!). This is the first in a series of webcasts:
ALL WEBCASTS are at 10:00 am PDT
  • 25 March 2009 The 10 Commandments of Community Management
  • 8 April 2009 Reducing Customer Service Support Costs Dramatically (87%!?) by Turning to the Community
  • 22 April 2009 The “Duh” Paradox: Increasing the Connection with Your Customers Improves Retention and Extends Lifetime Loyalty
  • 6 May 2009 Rome Wasn’t Built by Itself: Harnessing Product Innovation Through Online Communities

While at SXSW, I picked up a copy of What Would Google Do?, the new book by Jeff Jarvis. As I usually do, I opened to a random page inside and started reading. I laughed out loud at something on the page, and I heard someone say “I love when someone does that.” I looked up, and saw Jeff Jarvis.

We got to talking, and he asked what I did. I told him about my role at HP, and how I’m trying to expand everyone’s mindset that for customer support, we have got to look past just social media and into the realm of implicit feedback. We chatted some more, and I ended up buying the book.
Only later did I realize that it was Jeff Jarvis who caused “Dell hell” by posting on his blog about Dell’s poor customer service, which totally turned Dell around and got them heavily involved in social media. Of course I knew all about Dell’s history; I had just forgotten the name of that one key individual who started it all: Jeff Jarvis.
I highly recommend his book. I’ve got enough annotations and folded pages for a few dozen blog posts. I will mention one right now. Jeff Jarvis has finally explained the term platform in the context of Web 2.0 in a way that made it very concrete for me. He writes:
“Google has many platforms: Blogger for publishing content, Google Docs and Google Calendar for office collaboration, YouTube for videos, Picasa for photos, Google Analytics to track site traffic, Google Groups for communities, AdSense for revenue. Google Maps is so good that Google could have put it on the web at maps.google.com and told us to come there to use it, and we would have. But Google also opened its maps so sites can embed them. A hotel can post a Google Map with directions. Suburbanites can embed maps on their blogs to point shoppers to garage sales. Google uses maps to enhance its own search and to serve relevant local ads; it is fast becoming the new Yellow Pages.”

Contrast this to a site like Yahoo: Yahoo creates and aggregates content to create a destination. Google doesn’t create content, it creates a platform for others to create, share, link, and network their own content. Jarvis writes, “A platform enables. It helps others build value. Any company can be a platform…. Platforms help users create products, businesses, communities and networks of their own.”