September 2009 Archives

Think of your favorite model or metrics for measuring social media activity. Flip through Olivier Blanchard's presentation on social media ROI. Now, with that in your head, read Tom Davenport's 2007 book, Competing on Analytics. How far do you get before realizing that the enterprise analytics crowd is asking some of the same questions as the social media crowd, but looking for answers in different data? What if the two groups met?

When we asked CIOs to identify their visionary plans for enhancing their enterprises' competitiveness, business intelligence and analytics was the top answer, selected by 83 percent of our sample... "Facts drive decisions," said an Insurance CIO. "Plans for imbedded analytics need to enable data capture at the customer touch point."
— IBM's 2009 Global CIO Study (via KDnuggets)

What would happen if you were to analyze social media data alongside operational data to look for insights in the interaction between what people do online and what they do with your company? You could measure the ROI of marketing in social media, but that's a defensive move (protecting your job/budget). Beyond learning what works and what doesn't, what would you learn by looking at the data together?

Are you doing this now? I'm looking for companies to interview for my research.

Measuring Social Media must be the new black. Everybody's doing it—the in-crowd is, at least, and the rest are starting to realize they're missing something. Just look at the agencies who've suggested their own take on the little black dress—that is, on how marketers should measure social media.

  • Conversation Impact, Ogilvy PR

    Ogilvy proposes a framework with three sets of metrics that correlate to the traditional marketing funnel: Reach and positioning, based on a combination of web analytics, media analysis, and search visibility; Preference, based on media analysis and traditional research; and Action, based on measurable client objectives (such as sales conversions).

  • Social Influence Measurement, Razorfish (with TNS Cymfony and Keller Fay Group)

    The SIM score, as introduced in the Fluent 2009 report, compares sentiment for a company to sentiment for its industry. The report also mentions share of voice and weighting for influence, although the formulas for the metric do not include them.

  • Digital Footprint Index, Zócalo Group (with DePaul University)

    The DFI evaluates a brand's online presence in three dimensions: Height, which represents the total volume of brand mentions; Width, based on consumer engagement with online content; and Depth, based on message saturation and sentiment.

Three agencies, three models that fit fairly neatly into measurement silos. I've grouped them on the somewhat arbitrary basis that they're all backed by marketing agencies, but they're not answering the same question, are they?

It was my understanding that there would be no math.
—Chevy Chase, as Gerald Ford
Breaking eggs, making omelets
Ogilvy's Conversation Impact tracks marketing effectiveness through its explicit alignment with the marketing funnel. I like that the framework intermingles different sources of data, including traditional research. The Action category, linking measurement to specific business outcomes, should help keep strategy and measurement on point.

Razorfish's SIM score is all about perception. How does the client look compared to its competitors and industry? Despite the "influence" in its title, this score is entirely about sentiment, with none of the usual indicators of influence. As a single metric, the SIM should be compared with the Net Promoter Score or MotiveQuest's Online Promoter Score, but I'll be honest here. I'm having trouble figuring out the significance of this ratio of ratios. I tried a few scenarios to get a sense for how the numbers change and got some strange results: divide by zero errors, negative scores... The intermediate Net Sentiment metric is the more meaningful number here.
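To see where the ratio of ratios goes wrong, here's a minimal sketch in Python. The net-sentiment formula (positive share minus negative share) and the brand-over-industry ratio are my assumptions for illustration, not Razorfish's published definitions, but they're enough to reproduce the odd results:

```python
def net_sentiment(positive, negative, total):
    # Assumed formula: share of positive mentions minus share of negative.
    # The actual Fluent 2009 definition may differ.
    return (positive - negative) / total

def sim_score(brand, industry):
    # Ratio of ratios: brand net sentiment over industry net sentiment.
    # Each argument is a (positive, negative, total) tuple.
    return net_sentiment(*brand) / net_sentiment(*industry)

# Brand and industry both mildly positive: a sensible-looking score.
print(sim_score((60, 40, 200), (300, 200, 1000)))  # 1.0

# Industry net sentiment turns negative: the score flips sign.
print(sim_score((60, 40, 200), (200, 300, 1000)))  # -1.0

# Industry sentiment perfectly balanced: the ratio is undefined.
try:
    sim_score((60, 40, 200), (250, 250, 1000))
except ZeroDivisionError:
    print("industry nets to zero; SIM is undefined")
```

Notice that the intermediate net_sentiment value stays well-behaved in all three scenarios; only the final ratio misbehaves.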

Zócalo's DFI addresses PR effectiveness, as telegraphed by the "earned media" headline in the announcement. The focus on presence, engagement, and sentiment picks up on important aspects of social media, but without more detail on the math behind the overall index value, this seems like another framework rather than a metric.

What are we measuring, exactly?
I'm not the first to say it: the golden metric that will answer every question does not exist. To be fair, the authors of these examples don't claim to have the ultimate answer, anyway. Social media initiatives can support diverse objectives, and so the metrics used to evaluate those initiatives or to answer questions will be equally diverse. But it is nice that we have people sharing their efforts to find appropriate metrics for some common objectives and questions. Thank you, and keep it up.

While working through the math, I was reminded of an old trick from undergrad science classes: if you're losing track of your formula, do the math with the units included. If the resulting units don't make sense (comments^2, for example), the value won't, either.
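The units trick can even be mechanized. This toy sketch (every name in it is mine, purely illustrative) carries unit exponents alongside the numbers, so a nonsensical combination like comments² announces itself:

```python
class Quantity:
    """A number that carries its units, for sanity-checking metric formulas."""

    def __init__(self, value, units=None):
        self.value = value
        self.units = dict(units or {})  # unit name -> exponent

    def _combine(self, other, sign):
        # Add (multiplication) or subtract (division) unit exponents.
        merged = dict(self.units)
        for name, exp in other.units.items():
            merged[name] = merged.get(name, 0) + sign * exp
            if merged[name] == 0:
                del merged[name]  # units cancel out
        return merged

    def __mul__(self, other):
        return Quantity(self.value * other.value, self._combine(other, +1))

    def __truediv__(self, other):
        return Quantity(self.value / other.value, self._combine(other, -1))

    def unit_str(self):
        parts = [n if e == 1 else f"{n}^{e}" for n, e in sorted(self.units.items())]
        return " ".join(parts) or "dimensionless"

comments = Quantity(500, {"comments": 1})
posts = Quantity(50, {"posts": 1})

rate = comments / posts
print(rate.value, rate.unit_str())  # 10.0 comments posts^-1 -- sensible

oops = comments * comments
print(oops.value, oops.unit_str())  # 250000 comments^2 -- nonsense, and so is the formula
```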

Photo by alist.

Five Modes of Listening


I'm working on a theme that's all about expanding our idea of listening—it's so much more than defensive monitoring, but we need to get beyond first steps. After the last post, Sam Flemming commented on the importance of distinct terms for communicating outside of the bubble, and he's right. After we expand the concept of listening, we need to break it into manageable pieces. Fortunately, the pieces will look familiar.

As a set of activities, listening breaks down into these five modes:

  • Searching
    Search is so familiar that we don't always think about it, but look at the advice on getting started in social media. The first step is to find out what people are saying, where they meet, and so on—you know, the 5 Ws. When you do that as a snapshot, that's search. Don't neglect the value of familiar methods.

  • Monitoring
    The usual starting point for a discussion of listening. Through automated methods (typically a dashboard or RSS reader), find and read new posts, comments, tweets, etc. that are relevant to your business. Focus on individual items for action.

  • Alerting
    Similar to monitoring, but the system notifies you through email, instant messaging or text when a new item is discovered. Alerts can also be based on measurement thresholds, such as a sudden increase in negative commentary. No requirement to revisit the platform to receive alerts.

  • Measuring
    Add a quantitative element to monitoring. Whatever your choice of metrics or measurement silo, measurement is about aggregation and numbers. For the purposes of this list, I use measurement to refer to the generation of regularly updated metrics.

  • Mining
    Add a quantitative element to search, and you have data mining, which looks for meaningful patterns in archival data. Although it has a lot in common with measurement (as used above), I'm seeing different practices and benefits that justify separating the two.
I know some knowledgeable people in the space will disagree with my definitions, but my point is not to start another semantics argument. And I'm certainly not discounting the importance of looking at the data and interpreting its significance. The point of making these fine distinctions is to point out areas where we may be missing some of the value in listening.

For example, if you're doing routine measurement—you're looking at meaningful metrics on a regular basis—is there an opportunity to find different value by taking a mining approach, looking for insight in a snapshot of historical data? A slim distinction, but the point is to step back, walk around a bit, and look at the data from another angle.

Actually, lots of other angles, but more on that later.

Photo by bdu.

National governments represent a special category of large organizations: they're far larger than any company, and they're in a funny kind of business. But their talent for generating documents occasionally leads to something of value in the business world. Would you believe a strategy document that frames the relationship between social media and Enterprise 2.0 in a sidebar?

Though it's not what most people will be looking for, the new 2009 [US] National Intelligence Strategy (PDF) neatly categorizes two types of objectives for the intelligence community (IC). If you squint a little, I think you'll see how these categories could be repurposed for the 2.0 crowd:

Mission Objectives

  • MO1: Combat Violent Extremism
  • MO2: Counter WMD Proliferation
  • MO3: Provide Strategic Intelligence and Warning
  • MO4: Integrate Counterintelligence
  • MO5: Enhance Cybersecurity
  • MO6: Support Current Operations
Enterprise Objectives
  • EO1: Enhance Community Mission Management
  • EO2: Strengthen Partnerships
  • EO3: Streamline Business Processes
  • EO4: Improve Information Integration & Sharing
  • EO5: Advance S&T/R&D
  • EO6: Develop the Workforce
  • EO7: Improve Acquisition
Identifying external and internal objectives
Obviously, I'm not suggesting that national security and social technologies are the same thing. If you're not in the national security business, then "combat violent extremism" isn't your first objective. Instead, look at the framework. I think that the distinction between mission objectives and enterprise objectives might just clarify the relationship between externally-focused social media and internally-focused Enterprise 2.0 initiatives.
  • Social Media for Mission Objectives
    Mission objectives are closely linked to the overall objectives of an organization. At the enterprise level, these are measured in terms of financial success; in marketing, they're the familiar product- and customer-oriented objectives that lead to financial success. These are the kinds of objectives we see in social media discussions (especially when social media for business is interpreted as social media marketing):
    • Combat negative impressions of the company
    • Improve customer communication and responsiveness
    • Increase brand visibility
    • Enhance customer loyalty
    • Integrate market intelligence
    ...add your favorite social media objective here. The social media focus on connecting with the worldwide conversation in support of the business reflects an emphasis on mission objectives.

  • Enterprise 2.0 for Enterprise Objectives
    The E 2.0 case is even easier to make—look at EO4 on the NIS list (improve information integration & sharing). Look at the list through your new technologies lens, and you'll hardly need to edit to start applying it. Enterprise objectives are about making the operation work better, so the prescriptions are generic, not specific to the organization's mission. Despite the idealistic rhetoric, improving the operation is the argument for E 2.0.
Aligning social media and Enterprise 2.0
I don't think the internal/external view of social media and E 2.0 is all that new, but I do think it's instructive to see the two types of objectives neatly linked in one document. If the evangelists of social business strategy succeed, I think we'll see more explicit alignment of these high-level categories.

Thanks to Andrew McAfee for pointing out the new document.

Everyone says that listening is central to social media success, but over time, we've fallen into a too-narrow interpretation of the metaphor. Think about it: if listening means monitoring, then we have too many words. Fortunately, they don't need to mean the same thing. We just need to expand the way we think about listening.

Here's the definition of listening implied by many posts and presentations:

Defensive keyword monitoring of social media for customer problems and complaints that need a communications or customer service response.
In the social media buzzword compendium, that's a great example of listening. But as a working definition, it leaves a lot out. Almost every word imposes a limitation on finding all of the value in a listening strategy. We can do more.

How can we expand the definition of listening?

  • From a defensive posture to developing valuable market intelligence.

  • From keyword monitoring to applying all of the technologies available to discover and analyze relevant online content and activity.

  • From monitoring to metrics, mining, and interpretation. It's a metaphor, so there's no reason to be stuck with the word's literal meaning.

  • From social media to all media and customer communications.

  • From a focus on problems and complaints to an interest in all relevant conversations.

  • From PR, marketing, and customer service to anywhere the information has value to the business.

  • By collaborating across measurement silos to find the right methodology for the task.
More formally, I think of listening as the application of intelligence and analytics to social media (and other sources), but that's so many syllables. If you don't mind, I'm going to continue to say "listening," and when I do, you'll know that I'm talking about a lot more than monitoring Twitter for your brand name. 'k?

Social Media on Healthcare Reform


Find a topic that a lot of people care about, and you'll find a great pool of data for social media analysis companies to use in a demonstration of their work. Over the last few years, we've seen examples based on Super Bowl ads, American Idol, and national elections. Now it's healthcare reform in the US, where discussions are—uh, generously seasoned with sentiment. Just the thing to show off your analysis chops.

Here's what's shown up so far:

Anyone else working on an analysis they'd like to share?

Photo by Rob Stemple.

Scaling Human Analysis


One thing about sentiment analysis: it really stirs up the opinions. Apparently, it's good for attention, too, because yesterday's post has gotten a lot of it. So what is it about automated analysis that's so controversial, and what can human analysts do to offset the advantages of automation?

Automation offers four basic benefits:

  • Scale
    Keeping up with all of the relevant conversations as volume grows.

  • Speed
    Processing new items sooner; computers "read" faster than humans do.

  • Consistency
    Software doesn't get less accurate with fatigue or mood, and it doesn't consider contextual knowledge if it's not supposed to. It just follows instructions, over and over.

  • Availability
    Automated systems don't sleep, so they won't be the limiting factor in determining your 24/7/365 operations plan.
The trade-off—or the development challenge, depending on your point of view—is in accuracy and interpretation. Most of the discussion focuses on these accuracy issues, so let's think instead about the less controversial benefits and how a software-assisted human analysis approach can compete with them (remember, SAHA is human analysis within a software-mediated environment for operational efficiency).

Competing with automated systems
Scale and speed are related, and the hands-on approach is simple: add more people to the process. Speed (latency) will still be limited by the ability of your analysts to read quickly. You won't compete in the sub-second latency market, but you can get ahead of the daily-update crowd.

Consistency in human scoring comes down to training and process. I won't pretend to teach the media analysis pros how to do that job, but it's going to be more formal than the eyeball ratings I gave out yesterday.

Availability is an interesting challenge, but it's not the first time companies have addressed the issue. If you're going to work through nights and weekends, your choices are to create some undesirable jobs at home or switch to follow-the-sun operations overseas.

The rise of offshore outsourcers
Combine the need for a lot of people (analysts) with the desirability of around-the-clock operations, and a lot of people will reach the same conclusion. From the beginning of the social media analysis business, some of the better-known vendors have had development and analyst groups in India. Now I'm starting to hear from companies offering offshore human analysis as a specialized service.

The interesting bit is that they're unbundling the content coding, so clients (or vendors) can add human-powered sentiment analysis to any platform that provides user coding or tagging.

This won't be an easy group to track. It's largely a traditional outsourcing approach, and any company that provides full-service social media analysis using human analysts could unbundle the coding piece, too. But if clients end up selecting human analysis over the automated version, expect more offshoring of the manual effort.

Since the recent New York Times piece on sentiment analysis, it seems everyone has an opinion on sentiment analysis (how appropriate, yes?). Without actually counting, I'm getting the impression that the overall score is negative, although with the colloquialisms and subtle innuendo, I'm not always sure. :-)

This is a round-up post, so I'm going to start linking to posts I've seen in a minute, but first, we have a problem: a buzzword alignment problem on what to call companies who monitor and analyze social media content. The article uses sentiment analysis to refer to the industry, but sentiment analysis is better understood as just one of the types of analysis used in the field.

This industry has a history of picking up a new label almost every time someone new writes about it. Forrester Research has called it brand monitoring and listening platforms, depending on which year and analyst you ask. I picked social media analysis when I had to choose, but even that is more limited than the state-of-the-art tools and services. I don't have an answer that makes me happy just yet.

Scoring the conversation
Oh, OK, I'll count. Really, how could I resist? Isn't this the obvious way to collect the posts on this topic?


This was an ongoing discussion long before the Times article. Mike Marshall made the case for automating large-scale analysis in the first guest post on SMA. I suggested additional models for the human vs. computer dichotomy in early 2007. I don't imagine we'll settle this any time soon.

This list is an example of document-level sentiment analysis by a human. Anyone want to make the case that it might not be 100% accurate?

All together, now: "Companies should listen to social media." We all know the advice, but do you have the impression that listening is a purely defensive strategy? It's not. You just have to move beyond the common, but limited, interpretation of listening.

How often does your defense score?
In a recent survey of management, marketing and HR executives in the US, Russell Herder and Ethos Business Law found a strong defensive leaning in respondents' current use of social media. The top reasons they use social media?

  1. Read what customers may be saying about our company (52%)
  2. Monitor a competitor's use of social media (47%)
  3. See what current employees may be sharing (36%)
  4. Check the background of a prospective employee (25%)
  5. None/personal use only (16%)
Not exactly the way I would put it, but this isn't entirely a bad start at listening. At least they've gotten some of the message, even if it's a little heavy on the fear motivation. The trouble is, it's only a start.

Put your listening on offense
Think about my earlier list of conversations you should care about, and let's come up with some things you can do with the information you find. Defensive ideas are easy (and rampant). Let's focus on putting some points on the board. I'll start:

  • Spot sales leads where prospects ask questions or contact you through public channels.

  • Figure out a competitor's plans from their public statements and personnel changes.

  • Figure out a customer's plans (and needs) from their public statements.

  • Identify a competitor's weakness in online complaints; launch a product or program to exploit it.

  • Identify a product or service opportunity in online discussions; fill the gap before competitors notice it.
That's a short list; what does putting listening on the offense make you think of?

Listening can be defensive—and if you're not monitoring for customer complaints and other problems, start. But don't stop with defense; think about how to apply it to advantage, too. Although it sounds passive, listening doesn't have to be either passive or defensive. Don't be satisfied until you find the path to profit for your business.

Thanks to Deni Kasrel and Bill Ives for pointing out the report and the defensive tone of responses to the question.

About Nathan Gilliatt

  • Voracious learner and explorer. Analyst tracking technologies and markets in intelligence, analytics and social media. Advisor to buyers, sellers and investors. Writing my next book.
  • Principal, Social Target

