Recently in Strategy Category

It's January, which means that I've been working on my annual posts on investments and M&A in the social media intelligence market. As always, I find myself mentally dividing the transactions into several buckets: the serial acquirers, the complementary products, the geographic expansions. While I was working on last year's post, I sketched out a matrix to summarize some of the logic of what I was seeing. This year, I'm sharing it with you.

Acquisition strategy matrix

The matrix is built on two variables: customer base and company capabilities. For each, are the merging companies the same or different? Combined, these two answers go a long way toward explaining the logic of a deal.

Customers
Start with customers, which might be characterized by industry, location, functional role, or (likely) a combination of these. If the companies serve the same customers, does the combination bring new capabilities to those customers, or is it more about scale or reducing competition? If the companies serve different customers, is the combination about access to new markets or diversifying more generally?

Capabilities
Second, compare the companies' capabilities, especially their products, services, and underlying technologies. If the companies have similar strengths, do they bring different customers or markets to the combination? If their capabilities differ, do their strengths combine to better serve an overlapping customer base, or do they do different things for different customers?

Analyzing the Matrix
At one extreme, companies that have the same capabilities and the same target customers combine to build scale and consolidate their position in the market. At the other, dissimilar companies serving different customers may combine as an investment strategy or to create something entirely new. The Same/Different corners, on the other hand, represent the most common deal logic, in which a larger company acquires a capability or customer segment.
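To make the matrix concrete, here's a minimal sketch in Python. The function and the four labels are my own illustrative shorthand, not a formal taxonomy; it just maps the two same/different questions onto the kinds of deal logic described above.

```python
# A minimal sketch of the acquisition strategy matrix.
# The two inputs and the four labels are illustrative, not a formal taxonomy.

def deal_logic(same_customers: bool, same_capabilities: bool) -> str:
    """Map the two same/different questions onto a rough deal rationale."""
    if same_customers and same_capabilities:
        return "Consolidation: build scale, reduce competition"
    if same_customers and not same_capabilities:
        return "Capability acquisition: new offerings for an existing customer base"
    if not same_customers and same_capabilities:
        return "Market expansion: existing strengths sold to new customers"
    return "Diversification: investment logic or something entirely new"

# Example: a buyer with overlapping customers but a different product line
print(deal_logic(same_customers=True, same_capabilities=False))
```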

Every year, I hear from people who expect a big increase in sector consolidation in the year ahead. Which types of deals do you expect to see?

Poisoning the Online Well

Garbage in, garbage out. The latest from the ongoing Snowden/Greenwald revelations is a reminder that interested parties know how to plant false information on the Internet, and that some of them are probably doing it. It has implications for anyone looking for good information online, anyone with a reputation to protect, and—potentially—for everyone invested in the online world.

The piece itself is worth a look (How Covert Agents Infiltrate the Internet to Manipulate, Deceive, and Destroy Reputations). The details are more disturbing than surprising, but as you read it, ignore the focus on the British intelligence agency GCHQ. It doesn't matter whether you trust your own government's actions, and the common distinction between a country's own citizens and everyone else is also irrelevant. The same tactics are available to every government—and any other motivated group. If they aren't doing it already, the newly released document gives them the idea.

For the government intelligence guys, this is just a continuation of the second oldest profession: Get your enemy's secrets; protect your own. Deceive your enemy; avoid deception. It's a challenge when multiple entities are simultaneously trying to (a) get useful information from open sources online and (b) plant deceptive information in the same sources. I wonder how much blue-on-blue deception happens between information operations and open-source intelligence gathering, anyway.

For everyone else, this latest report should serve as a reminder of some of the risks in social media:

  1. Data quality risk
    People tell lies online—I know, but it's true. Some of the false information out there may have been placed by a motivated adversary who wants to mislead you (maybe even you, specifically). The target may be your organization, a related organization, or someone who wants to work with you.

    The information you find online can be a useful source, but it's not the only source. If you're informing significant decisions, use all of your available resources, and be alert to the possibility of intentional deception.

  2. Reputation risk
    We're familiar with the concept of online reputation risk; corporate risk managers seem to think it's almost synonymous with "social media." If your business has potential exposure to government opposition (from whatever country), your risk may come from a better organized and funded source than the usual unhappy former customer.

  3. Target risk
    As people conduct their personal and political lives online, they expose themselves to snooping and more. The threats to personal privacy and freedom by government agencies have made the ongoing revelations newsworthy, but these public and semi-public channels are equally exposed to anyone who disagrees with you.

  4. Collateral damage risk
    Some of these information operations happen in the same online venues as normal personal use. As competing governments start viewing the online world through the cyber battlespace lens, normal users and the platforms themselves could take some damage. Off the top of my head, I'm thinking of legal, market, and technical risks, but that's probably just a start.

    It's too much to go into in one post, but companies with significant exposure to covert online tactics would be well served to chase down the implications of those tactics, without limiting the discussion to legal exposure. Beyond the specifics of any one program, the revelations of the last year indicate the willingness of government entities in multiple countries to use environments operated by private-sector companies in ways they weren't intended to be used. The safe assumptions are that governments are doing more than we know, and so are other types of organizations.

Politically, it matters very much who is doing what to whom and why. As a practical matter, who and why don't much matter. It's enough to know that someone, somewhere is developing and using methods to turn popular online tools against people and organizations they don't like. If you depend on online tools and don't have a basic literacy in the concept of cyberwar, it's time to learn, so you can recognize it if it comes to your neighborhood.

One of the great strengths of the Internet is the way it overcomes the limitations of distance. A side effect is that it also does away with the concept of a safe distance from danger.


Everyone loves a chart that answers a key question, but I particularly like the ones that make you think: Why did that happen? What changed? What are we missing? What happens next?

A spike on a chart is a big ol' why, waiting to be asked.
me, 2010

It's an old point, but a few examples came to me last week. Beyond the immediate interpretation of the numbers (e.g., big number good, small number bad), I think these patterns imply follow-up questions along the lines of "what happened here" and "why did it happen?"

  • Spike in a trend
    A sudden change means something happened. What? Why? Did the value then return to the usual range? Is the new value temporary or a new normal? Do you need to take some action as a result? The spike is the chart telling you where to look, which I suspect most people do instinctively. (A rough sketch of one way to flag spikes automatically follows this list.)

  • Smooth line on a historically bumpy trend
    A bumpy trend line that grows more stable is telling you something else, but the follow-up questions are similar. Did the data source stop updating, or is the change real? Remember to watch the derivatives of your metrics, too. If the metric keeps changing but the rate becomes constant, is that real or an artifact of the data collection? What happened, why, what action in response…

  • Crossing lines
    A is now bigger than B; does it matter? Obviously, it depends on what A and B represent, but it's a good place to understand: what happened, why, what it means, how much it matters, and whether to expect it to continue. If it's a metric that people care about, expect to discuss it.
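Here's the promised sketch for the spike bullet: a minimal detector that flags points sitting far outside the recent range. The window, threshold, and sample numbers are arbitrary assumptions for illustration, not recommendations.

```python
# Minimal spike detector: flag points that sit far outside the trailing window.
# The window, threshold, and sample data are illustrative assumptions.
from statistics import mean, stdev

def find_spikes(values, window=7, threshold=3.0):
    """Return indexes where a value deviates from the trailing window by
    more than `threshold` standard deviations."""
    spikes = []
    for i in range(window, len(values)):
        recent = values[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(values[i] - mu) > threshold * sigma:
            spikes.append(i)
    return spikes

daily_mentions = [102, 98, 110, 105, 99, 101, 97, 480, 103, 100]
print(find_spikes(daily_mentions))  # -> [7], the day worth asking "why?" about
```

The same comparison, inverted to flag unusually low variation, could catch the suspiciously smooth stretches from the second bullet.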
Beyond the numbers
Thinking beyond the graphs, I remembered two things from conceptual diagrams that always make me curious:

  • Empty boxes in a matrix
    If the framework makes sense, its boxes should be filled in, whether it's the consultant's standard two-by-two matrix or something much larger. An empty box may represent an impossible combination—but it could be a missed challenge or opportunity. I once found $12 million in sales in an empty box, and so empty boxes always get my attention.

  • Solid lines around a space
    A clear definition says as much about what something isn't as what it is. When the definition takes the form of a diagram—an org chart, a Venn diagram, a network graph—I wonder about what's just outside the diagram. The adjacent markets and competitors from the future; the people who are near—but not in—an organization. What does the white space represent, and what does that mean to you?
These came to me as I was getting ready to attend a lecture by Kaiser Fung (which was excellent—ask him about the properties of big data). I'm sure there are many more. Without wading into technical analysis waters, what other patterns make you stop and think?

Everybody is Learning

Another sketch from the whiteboard

A couple of years ago, a suggestion that I develop a maturity model for social media analysis led to a different kind of model. My approach to this space has always been to explore its edges, looking for what might be next. One effect I've noticed is that change circulates through the ecosystem of companies, their customers, and their suppliers. Where change keeps coming, everyone's learning together.

A linear maturity model defines development stages toward a known destination, but in a system where everyone is learning, the destination is still unknown. We react to others, and others react to us. Change reverberates through the system, and we don't yet know what maturity looks like.

What this means in social media analysis
If social media analysis were good for just one thing, we could have a simple maturity model. The products would progress toward a theoretical ideal, and clients would mature toward efficient, effective business practices. But the technology stack is built on areas of active research, new platforms are driving new consumer behaviors, more business functions are showing interest in how to use social media to do their jobs, and vendors are trying new ways to distinguish themselves.

Virtually every piece of the puzzle is moving.

Let's go to the whiteboard to see if we can visualize it.

Market learning cycle

There's a lot going on here, and this is the oversimplified version. Here's the basic dynamic: on the right, new capabilities become practices; on the left, new expectations become requirements. In the overall system, we expect more from our suppliers as we adapt to new capabilities and adopt new practices.

Think about what is being learned in each loop.

  • Tool vendors combine their own R&D with new capabilities from research labs and partner companies to expand their products' capabilities, enabling new tactics for their clients, who provide feedback and new requirements based on real-world use of the products.

  • Consumer-facing companies experiment with new tools and capabilities, and they learn from both operational results and customer reactions.

  • Customers react to companies' online tactics, adjusting their behavior to maximize their own benefit. When they find a practice they like, they may expect other companies to mimic it.
The catch is that this is all happening at the same time, and the companies, at least, are trying to predict what their customers will want next.

Who's learning fastest?
We know of some unintended lessons, such as teaching customers to complain publicly for a quicker response, and redefining "like." But where do we look if we want to get ahead of the market? Try these key areas:

  • Outside innovation - New research and inventions may answer questions you haven't been able to answer.

  • Product capabilities - What's possible keeps changing, but don't look only at existing suppliers. Look at adjacent markets for capabilities worth adapting to new applications.

  • Client requirements - It's always worthwhile to pay attention to what companies say they'll pay for.

  • Client capabilities - Watch what companies are actually using, too.

  • Competitor actions - Watch early adopters for practices that may become standard. Is there a better way to do it?

  • Customer expectations - How are people reacting to new business practices? What issues are being raised? What new expectations are forming?
Like any model, this one raises more questions than it answers. That's the point. What will it help you discover?



I'm sharing some of the frameworks that have been hiding on my whiteboard. Want to apply them in your business? Email me.

It started with a simple challenge: if I were to draw a big circle around the things I find interesting enough to follow and declare them to be one thing, how would I label it? To avoid flying completely off into pointless musing, assume that it's relevant professionally. Considering that the circle included social media, analytics, intelligence, geopolitics, and natural disasters—to pick a few—the label wasn't obvious. By declaring them to be one thing, though, it soon became clear that the theme was the importance—the value—of knowledge.

The label was Omniscience.

"That's pretty ambitious."
Yes, I'm aware of the definition of omniscience, and no, I'm not suggesting that I know everything or ever will. But among the unattainable goals, it's a good one. I mean, what could you do if you knew everything? You can't, but what if you knew a lot more about things that matter to your business?

What if you knew something that was there to be discovered, and your competitor didn't? Is it starting to sound reasonable yet? Maybe even something you'd want to do?

The framework
I've talked through the Omniscience framework with several folks for early reactions, mostly in person. It involved some handwaving, so I knew it wasn't ready to post. Some people suggested related books, but nobody really shot it down. Now, it's your turn (click for a larger view). I'm not sure I need a lot more assigned reading at the moment, but I'm definitely interested in your reaction.

Omniscience overview

A framework, not a recipe
This is the top-level view, and each section has a story, a purpose, and examples. But this is the gist of it: starting with a few simple observations on the nature of things, Omniscience is a challenge to expect more of your intelligence and analytics, drawing on a broader range of techniques to track and anticipate a wider range of things that matter.

Omniscience provides a thread. It links things you know with things you do—and with things you don’t do. It links the very large and the very small, the short-term and the long-term. The way you think and plan and the way you measure and evaluate. It provides a structure to identify missed opportunities and to evaluate new ideas. And although it looks highly theoretical, it's already suggested a practical application that I haven't seen on the market.

Naturally, I think it's a big deal. Does it make sense to you, so far?

In my last post, I suggested that intelligence and analytics are two angles on the same challenge: developing the information value in available data. You're probably already looking—sorry, listening—for useful information online. Rather than thinking of intelligence and analytics as separate specialties, let's approach them as two lenses that might help us find information in data.

I'm going to risk a small definition here; if I'm going to write about intelligence and analytics, it helps to establish that these aren't two words for the same thing. Proposing a formal definition isn't my point, so let's think about it this way: we do a lot of quantitative analysis these days. We care about the results because they present trends or aggregate data points in some way. For the purposes of this discussion, that's analytics. Other times we care about individual facts, regardless of the quantitative view. That's intelligence (cue James Bond theme).

For example, you might be interested in the most popular adjectives used to describe your product or brand. You care about the results because they represent mass opinion. That's analytics. Conversely, if you discover a death caused by your product, that fact is important regardless of how many people are talking about it. That's intelligence.

Yes, it's a little messy. The point is to notice what we've been missing, not to perfect the language.
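Here's a minimal sketch of the two lenses in code. The comments and the keyword list are made up for illustration, not a real monitoring setup: the analytics pass aggregates words across all comments, while the intelligence pass flags any single comment that mentions a critical fact, however rare.

```python
# Two lenses on the same comments: aggregate (analytics) vs. single-signal (intelligence).
# The comments and keyword list are made up for illustration.
from collections import Counter

comments = [
    "love the new blender, so quiet",
    "quiet and fast, love it",
    "blade detached and caused an injury",  # rare, but important on its own
]

# Analytics lens: what words dominate across the whole collection?
word_counts = Counter(word for c in comments for word in c.split())
print(word_counts.most_common(3))

# Intelligence lens: does any single comment contain a critical fact?
critical_terms = {"injury", "death", "recall"}
alerts = [c for c in comments if critical_terms & set(c.split())]
print(alerts)  # one comment matters regardless of volume
```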

What do people say?
Let's apply this to the familiar topic of listening in social media. People say all sorts of things online, but when we start analyzing their meaningful statements, they fall into two categories: statements of fact (which may be false) and statements of opinion.

We spend a lot of time on the notion of analyzing opinions. Most of the usual metrics help us understand trends in the opinions expressed in a large collection of comments. But what about facts? What do we do about them? They don't really fit into a market research paradigm, but some of them may be important to the business. We need to use a different lens.

It must be serious; he has a matrix
In proper consultant fashion, I decided to see what happens when we put these two ideas in a matrix. We use our intelligence and analytics lenses to look at statements of fact and statements of opinion online. Remember, analytics (in this discussion, at least) is about aggregate data, while the intelligence lens can pick up isolated signals. The examples in the boxes are illustrative; I'm sure you can think of more.

Intel analytics grid

Think about the usual discussion of listening in social media. How much of it focuses on measuring customer opinion and brand image (including every discussion of the accuracy of sentiment analysis)? How much more value could we uncover if we asked more questions of the same data? Are you looking for the important signals that don't show up in a Top 10 chart?

This is another piece of the Omniscience framework I'm working on. It starts with four simple thoughts, and it all comes together eventually—I hope.

House on silos

In a finite world, individuals specialize, but organizations don't have the same limitations. Given enough specialists, you can do it all. The challenge is in managing them. Somebody has to get on top of all these silos.

In my ten-minute pretend-keynote at last year's Defrag conference, I asked people to look beyond the existing silos of data and analytics to consider what more we could do. I challenged them with this simple idea:

Analytics + Intelligence → Strategic Value of Information

What I'm doing is applying "and," not "or," to analytics and intelligence: applying math where it works and finding individual facts where that works. Around here, the starting point for data is social media, but that's another boundary that turns out to be arbitrary. The same reasoning applies to other data sources.

We use labels like intelligence and analytics to divide the analysis of social media data into closely related specialties. In the process, we risk losing sight of the bigger goal, which all of these specialties support:

Uncover the information in the available data in order to develop insights that support the business.

We're all looking for useful information in data. In the social media realm, some of the data is unstructured content, and some of it is structured data generated by our activities. That distinction is driving some segmentation among the vendors, but it's worth remembering that intelligence vs. analytics isn't an or question; it's an and question—you need to consider both.

In the next post, I'll show you the model that applies intelligence and analytics to expand what we might find in what people say online. There's more to it than the usual summary of opinions.

Photo by Pablo David Flores.

Many Predictions, Only One Future

This XKCD cartoon places many predictions on the same timeline (click through for the full version—it's much too long to include here). This is what I mean by only one future: all predictions have to happen (or not) in the same future. If they're mutually inconsistent, then somebody's wrong.

More and more, I find myself applying this filter when I hear or read some breathless prediction of fantastic technological breakthroughs that are supposed to be just ahead. We have a lot of predictions floating around, and I'm pretty sure some of them are wrong.

Four Simple Thoughts

Since 2006, I've been learning about social media analysis—as a business, a set of technologies, and a set of business practices. If you read the blog, you've seen some of what I've figured out. Along the way, my professional interest in the information value of social media activity collided with some of my other interests, which has led to a rough draft of a strategy that I'm modestly calling Omniscience.

It's too early to publish the whole framework, but I want to share a few foundational thoughts that are shaping the way I look at things. I find myself referring back to these every day, whether the topic is business, current events, or long-term futures. As you read through the individual elements, think about how they interact.

  1. Everything is connected.
    A drought in China, floods in Australia, turmoil in the Middle East—which could affect economies in the US and Europe? Right. All of them. Cause-and-effect relationships circle the globe, and they don't respect the arbitrary domains of knowledge that we create. Energy, climate, economy, politics—they're all connected, and so is everything else.

  2. Everything is uncertain.
    Any useful prediction has an element of uncertainty that we like to ignore. It's easier that way, and anyway, uncertainty is interpreted as weakness. A better approach is to embrace the uncertainty—evaluate it, and consider the possibility of unexpected outcomes. Do you bet everything that you're right?

  3. Think and, not or.
    I see so many topics framed as false choices when the right answer is probably all of the above. Explore with and, focus with or, and never stop exploring.

  4. Only one future.
    We like focus, but focus projected into the future is tunnel vision. We have only the one planet. Everyone's predictions—on technology innovation, business growth, sovereign debt, energy supply, climate, demographics—have to play out in the same world. Everything is connected and uncertain, so predictions interact, even (especially?) when we would prefer to deal with one topic at a time.
Yesterday, I hinted at some of the other stuff I'm working on. This was the starting point. Wait 'til you see where it leads.

Update: the rest of the framework is up.

One of the first people to see the Omniscience framework suggested that I read The Black Swan, just in time for the Arab Spring uprisings that threatened so much of what had been described as "stability." Perfect timing.

Hidden Costs of Listening Silos


First, listen. Listen first. First, listen. Getting redundant yet? Is that also the way you've set up your own listening activities?

I recently talked with someone whose company has social media responsibilities divided among several executives. Each has a listening arrangement in place: one has an internal team, and the others have outside agencies doing the work. The kicker? All of them may be using the same platform.

I'm thinking there are some opportunities to rationalize costs there.

So here's a deceptively simple question for you:

How much are you paying to maintain multiple, independent listening posts?

  • How many times do you pay for the same software?
  • How many times do you pay for the same data?
  • How many times do you pay for the same analysis?
  • How many people are handling the same posts?
Getting started is great, but the pilot project approach has costs. Are you ready to manage yours?
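A back-of-the-envelope sketch, with entirely made-up numbers, shows how quickly independent listening posts add up compared with a shared arrangement.

```python
# Back-of-the-envelope cost of duplicated listening posts.
# All figures are made up for illustration; substitute your own.
teams = 3                     # groups running independent listening programs
platform_license = 30_000     # annual cost of the same platform, per team
data_fees = 10_000            # annual data fees, per team
analyst_cost = 60_000         # analysis (internal or agency), per team

independent = teams * (platform_license + data_fees + analyst_cost)
shared = platform_license + data_fees + teams * analyst_cost  # one license, one data feed

print(f"Independent posts: ${independent:,}")   # $300,000
print(f"Shared platform:   ${shared:,}")        # $220,000
print(f"Potential overlap: ${independent - shared:,}")
```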

Photo by Andrew Magill.
