Getting the Measurement Mad out of the Institution

Kevin Keohane – CommScrum London

There’s a probably apocryphal anecdote that in 1977, an internal audit of the Royal Artillery showed that 4 crew were required for certain cannons: a gun captain, someone to aim it, someone to load it, and someone … to hold the horses so they didn’t run off when the thing fired. Needless to say, gun crews were reduced to 3 (100 years after the fact).

Somehow I think the same principle applies to the state of employee research.

Twenty years ago, when internal communications/employee communications was just emerging with people like Roger D’Aprix and John Smythe as champions, one of the big questions was: How do we prove our value? How do we show what we do has merit and adds business value? Can I justify my existence as a professional communicator?

The IABC played an instrumental role in the development of the quantification of the value of employee communications (don’t even bother applying for a Gold Quill if you haven’t measured your project). A whole industry grew out of this, with companies like ISR (later bought by Towers Watson) delivering monolithic “annual employee surveys”. Harvard Business Review reset the state of play in 1998 with the seminal Sears Service-Profit Chain case study, and The Gallup Organisation later found great success with “The Gallup 12”.

In short, we were saying “We need numbers to be taken seriously by people who are serious about numbers.”  So we went out and found ways of generating data.  And we sold the thing in to management. They liked the numbers bit.

And now … employee research is serious, big business. But I think the tail is now wagging the dog. Clients are creaking under the cumbersome, slow process and the masses of marginally useful data it generates. Actions are planned around moving “drivers of employee engagement” in the right direction, successes are celebrated … and yet other business KPIs often fail to respond accordingly. In some cases the employee survey has crowded out all other forms of useful tactical measurement.

We’ve lost sight of why we were measuring these things in the first place. The last employee survey I looked at — and its data outputs — was pretty useless outside the context of comparing it to last year’s survey … some of the questions made no sense at all.

It’s not that measurement isn’t important; I’m just wondering whether we need to really reassess how we go about the whole thing.  Are we measuring the right things?  Are we measuring them in the right ways?  Are we connecting all the HR-ish internal employee stuff well enough to the external brand equity, customer experience and financial performance stuff (beyond The Gallup 12 + Human Sigma connection)?  For some reason there always seems to be a yawning chasm between HR-led measurement and Customer-experience-led measurement.

The good news is that the big annual surveys now seem to be seen as long in the tooth, and other, more focussed and timely approaches are emerging – employee panels, pulse checks, etc. And it’s refreshing to see people like Mark Schumann acknowledging that there is an issue to be addressed here.

Is it a myth that “What gets measured gets managed?”

Mike Klein – CommScrum European Railpass

It’s not just a case of measurement overkill for overkill’s sake, Kevin… There is a real downside to overmeasurement – it kills off an organization’s willingness to research itself in meaningful and targeted ways, and it often yields metrics only tenuously connected to the inputs they purport to measure.

I remember a few years ago when I met with a potential client and wanted to do a short, qualitative study to find out what the pressing issues were and the language used to describe them. The response was: “Nope! We just did the Q12 survey. That’s all we’re going to do.” Never mind that the Q12 survey had “sweet FA” to do with my proposed project. I was being asked to fly blind for fear of awakening the business’ terminal case of the dreaded “survey fatigue.”

If I were to advise communicators on their most important research needs, I’d make four recommendations, in this order:

1) Measure the direct relationship between what you and your team do and the specific actions and savings that more than make up for the cost of your team. This should give you enough headroom to do the other things you think need to be done.

2) Measure the stuff you think is important – even if it means using qualitative and small-sample approaches to avoid awakening the “survey fatigue” monster.

3) Carefully – and judiciously – measure the stuff your core sponsors think is important, but always remind them that you and your team have already “covered their costs”. This may get you some slack, and reduce your business’ fetish for overmeasurement.

4) Make sure you have a seat at the table where the measurement decisions are made. The last things you want are to be cut out of measurement activities that are going to happen anyway or, worse, to find your activities measured in a way that doesn’t respect the context in which they take place.

Dan Gray – CommScrum London

You’ve surprised me, Kevin – this is a lot more polite than I was expecting!

In response to your specific question at the end, I don’t think it is a myth that what gets measured gets managed, nor should it be. But, as you say, you’ve got to be measuring the right things in the first place.

I seem to remember reading something from Melcrum recently – a piece in which Mars’ global engagement director described how their use of the Gallup 12 survey had led to some appalling behaviours by managers, essentially blackmailing their subordinates into giving high scores. ’Nuff said really.

That’s what can happen when you elevate engagement to an end in itself, to be measured in its own right, rather than what it is – a delivery mechanism; a means to delivering improved business performance.

That’s the biggest problem with the mindset of the measurement-mad, ‘Human Capitalist’ crowd, as you so brilliantly characterised them in your IC taxonomy. It’s as if they took one look at the Service-Profit Chain and thought, “That’s it. Case proven. As long as we can show that people are ‘engaged’, everything else can be taken as read.”

So an employee has a best friend at work. So what? It might give you a broad brush indicator of an employee’s happiness and propensity for greater discretionary effort, but it tells you nothing about where that effort is directed and whether that’s contributing directly to the achievement of the organisation’s strategy and goals.

That’s what the C-suite wants and needs to know about – whether or not it’s creating value for the business – so, ultimately, if it isn’t linked to specific strategic objectives, then why are you measuring it?

Lindsay Uittenbogaard – Commscrumming from Canada this time 🙂

Agree with all of these measurement traps and opportunities. I’d just like to add that I feel it is the communicator’s job to keep folk anchored to the fact that not everything can be measured, and so measurement has its place. I once heard a senior business leader say “If you can’t measure it, don’t do it.” SO irritating.

If communication is fundamentally about two things – avoiding disconnects and avoiding misunderstandings – then how do you really know when that has happened well, or the cost of the loss when it hasn’t (compared with the cost of the communication effort involved in trying to bridge those gaps)? More importantly, when communication works well, how do you measure the costs that were saved as a result of the disconnects or misunderstandings that were avoided?

Of course, because we are individual people with different perspectives, interests, cultures, priorities and working hours / locations, there will always be varying degrees of communication success, much of which can be attributed to levels of attentiveness and opportunity on the part of the receivers / decoders / responders rather than the delivery of the communication itself.

I think we’ve got to accept the vague nature of our work, use data as a guide and encourage people to ‘believe’ in common sense-based lines of communication logic more than hinge their support on hard data.


24 thoughts on “Getting the Measurement Mad out of the Institution”

  1. kevinkeohane says:

    Brilliant! I plan to use this in a presentation soon…

  2. kevinkeohane says:

    Lindsay, there is another angle as well to the exec who says “show me the measurement.” I have been increasingly encountering the opposite – execs fed up with spending £200,000 on employee surveys with no evidence of ROI. Many are saying “Just spend the £200k on communicating with people.” Increasingly, the cost of measurement exceeds its value to the communication.

    There must be a better way.

  3. Adam Hibbert says:

    Lindsay, a couple of suggestions to help those grappling with your ‘anchoring’ challenge:

    a) ‘what’s your engagement score with your spouse?’ (and …. discuss!)

    b) If you really want to get in amongst it: Weick argues that “to learn is to disorganize and increase variety. To organize is to forget and reduce variety … the reluctance to grapple with the antithesis has led to derivate ideas and unrealized potential.” [from: ‘Organizational Learning: Affirming an Oxymoron’, in ‘Managing organizations: current issues, Part 2’ [eds] Clegg, Hardy, Nord, 1999, Sage]

    I’m increasingly convinced that employees don’t need to be overtly communicated with, much, to get to engagement – what moves us most is the message implicit in the frameworks within which the org expects us to operate (and especially, implicit in the changes the org makes to these frameworks). Oh dear. Does this mean I’m turning OD?

    • Indy says:

      Yes, you’re turning OD… but it’s unavoidable… just as we judge people on their appearance and body language as much as their words, so we judge corporations on who they give the pink slips and bonuses to as much as whatever “engagement initiative” is ongoing at the moment…

  4. Jon says:

    I hate annual employee engagement surveys. I dislike the habit and ritual that lies behind them. They are positively dangerous when driven by learners and they are massively overpriced and disruptive when driven by experts.

    I fear that they offer the illusion of proactivity and action to the executive team and the spectre of failed promises to everyone else. And by the time the analysis is done, the sanitised communications agreed and remedial workshops rolled out, it’s all a bit tired and out of date.

    Personally I’d rather spend the money on doing things I know that people will feel good about, such as Adam’s frameworks.

    Ask a farmer. If you want to fatten a pig you don’t weigh it, you feed the damn thing. If you still wish to weigh it but cannot because you don’t have big enough scales or you are not strong enough to lift the bugger, whatever you do, don’t stop feeding it….

  5. Dan Gray says:

    @jon – Now there’s an analogy I won’t be able to shake in a hurry! Love it!

    @adam – loving that youtube vid too, and the point you make in your last paragraph above, which I think is absolutely spot on.

    Look through the lens of organisational culture – particularly something like Edgar Schein’s ‘3 levels of culture’ theory – and I believe you’re offered some very interesting perspectives on brand and engagement, as I wrote about some time ago on my own blog (see It’s the culture, stupid!).

    Ultimately, the frameworks you describe are just further ‘surface manifestations’ of the culture which, consistently and positively experienced over time, will go a lot further towards shaping people’s fundamental beliefs about an organisation than any explicit communications.

  6. Sean Trainor says:

    Let me start with the positives on surveys – when they provide useful insight that leads to interventions that improve engagement and inspire innovation, they are valuable tools.
    Now let’s move on to the real world – sadly, this rarely happens.

    There are too many surveys that focus on the wrong questions (GIGO) and don’t focus on the outcomes required. The most useless are those that measure “drivers” of engagement based on the characteristics of high-performing teams. I refuse to single out the infamous organisation that sells one with 12 questions (on the basis that they refuse to get out of bed for £200k, believe me!)

    So I’ll use an analogy instead. I recently used a golfing analogy with an American ‘engagement expert’ (he’s written a book on it) that sent him into orbit. On that basis I’ve just got to use it again.

    By analysing the characteristics of the world’s top 10 golfers’ swings, you can model them into 12 features and use that as a yardstick for your book, “The Perfect Golf Swing”. You can then retire on the earnings from selling technical instruments that measure acceleration, angles, pressures, etc., to novice golfers who bought your book.

    What’s wrong with that? Well, it’s a great business model, but it won’t produce the next Tiger Woods.

    There are at least 4 fundamental problems for the golfer:
    1. You will never be able to replicate the ‘perfect’ swing
    2. Even if you did, it most probably wouldn’t guarantee success
    3. You forget the purpose of playing golf in the first place – having fun and striving for your personal best.
    4. You never find time for the 19th hole.

    The results of a recent global cultural survey are out now – it makes the same point.

  7. Wow! I’m surprised not to see some pro-measurement comments come in. Perhaps this indicates that we measurement-rebels are going mainstream 🙂

    I’m sure Angela Sinickas would have a balanced, more sobering word to say about all of this though.

    Just back to Adam Hibbert’s fascinating comments about the power of ‘implicit’ messages. I get the concept but would love to hear more…. and I bet those messages aren’t measurable.

    Right, I’m off to fatten a pig 🙂

    • Dan Gray says:

      Probably not, but to echo the last paragraph of your original response to the post, why should we (be made to) feel the need to justify absolutely everything with hard data? Surely some things are just common sense.

  8. Adam Hibbert says:

    Well, OK, I’ll confess I feel authorised to contribute on this topic because I’ve been responsible for implementing (inflicting?) a quarterly IC survey on my global colleagues. But I’ll just plead the Nuremberg defence on that, and move on: John Smythe told me to do it (and how).

    And as to measuring the implicit, Lindsay, oof, there’s a challenge – one requiring the Creative Paradox touch, perhaps?

    Slightly off-topic (but perhaps, by the power of suggestion …?) I met a comms player recently who’s agreed a very special metric with the relevant Exec sponsor: the sponsor counts the number of times any of his exec peers log a frustration with him that their vital, super-urgent, hyper-important announcement [by which, dear reader, we of course refer to value-destructive, vanity published, ‘tractor production is up’ apparatchik cack] has been impudently spam filtered by his report. Result: A truly excellent performance metric.

    • Adam Hibbert says:

      Lindsay, on a more serious note, I’ve really just started thinking about implicit messages, informed by the research I’m doing around my IC management MA at Kingston University. In particular, if you can pop this on expenses, I recommend:

      Hoogervorst J, van der Flier H, Koopman P (2004) ‘Implicit communication in organisations: The impact of culture, structure and management practices on employee behaviour’, Journal of Managerial Psychology, vol. 19 (3), pp. 288-311

      What I like about them is that they have a simple way of explaining what factors contribute to the “behaviour context” within an organisation – a context with which we are all intimately familiar as employees, and against which we measure, weigh up and judge the merits of explicit messages. If there’s any play-off between what [Corporate Mouthpiece x] tells us and our daily experience, it’s kinda obvious which one wins.

  9. Mes chers amis – I couldn’t let this pass without a word in defense of research. Most of your objections center on how the data are used, what goes into the research design, and what the purpose of the research is. Making note of those shortcomings is useful and valuable, and I encourage it.

    What I cannot abide is the assassination of research itself.

    When National City Bank did the Gallup Q12 in 2008, we could all well guess what would happen, as we were in the midst of the worst financial crisis in 70 years. We talked about cancelling it. But we did it anyway, because we’d done so for several years and line management were eager to see the results for planning purposes. You heard right — the management team overall wanted the data to help plan the coming year.

    By the way, the amount of discretionary effort expended by our employees during that terrible year was heroic. They became worried, but never lost faith. It was a grave disappointment to have the US Gov’t step in and demand the company’s sale to a competitor – not for the usual reasons, but because the employees had such pride in and commitment to the company. It was a great place to work, made more so by leadership that wanted a committed workforce (we’ll leave aside the “e” word).

    It’s cheesy to ask whether one has a best friend at work, but that’s just detail. The main thrust of such efforts needs to be toward learning and planning, and the right management of that process can forestall or eliminate “managing to the numbers.”

    Finally, social science research almost never reveals the cause-effect relationship as clearly as the finance folks want. But few executives will accept “trust me” as a valid business leadership tool. Just because the process isn’t perfect doesn’t mean we should abandon the effort. Babies and bath water, you know.


  10. Mon Ami, Sean

    Let’s just look at this ‘partial’ assassination of communication measurement as venting. Looks like a vent was needed…

    On a lighter note – perhaps we are seeing the back of the measurement mad folk who profess that internal communication research is the be all and end all – which is about as true as most of the data that is so laboriously excavated.


    PS – Adam – inspiring stuff indeed – you don’t happen to have a link to that simple explanation rather than to the whole book, do you? 🙂

  11. Indy says:

    Oh dear… No-one sober has arrived to defend measurement and I’m about to rant against surveys, in ways that usually really annoy the sober people.

    Once upon a time I was in the scientist/engineering streams of university life and I had to take a lot of statistics courses.

    What makes me really uneasy is that in the modern world, very few organisations are both large enough and homogeneous enough for statistical survey approaches to be reliable. Of course, if your organisation is way out at either end of the spectrum (triumph or disaster) that will show up… but most of the time, most organisations are in the middle…
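A back-of-envelope calculation makes the sample-size point above concrete (a sketch only – the function name and the numbers are illustrative, not drawn from any real survey). For a yes/no survey question, the 95% margin of error on a group’s score shrinks with the square root of the number of respondents, so small departments produce very noisy scores:

```python
import math

def margin_of_error(n_sample, n_population, p=0.5, z=1.96):
    """Approximate 95% margin of error for a survey proportion,
    with a finite-population correction for small groups."""
    se = math.sqrt(p * (1 - p) / n_sample)                        # standard error
    fpc = math.sqrt((n_population - n_sample) / (n_population - 1))  # correction
    return z * se * fpc

# A 40-person department where 30 people respond: any single question
# score carries roughly a +/- 9-point margin of error, so small
# year-on-year movements are mostly noise.
print(round(margin_of_error(30, 40) * 100, 1))  # → 9.1
```

Which is the nub of it: for most real organisational sub-units, a few points of year-on-year movement sit comfortably inside the noise – and that’s before non-response bias and heterogeneity make things worse.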

    This is before I think back to my experiences as an IT worker and manager – how employees really think/feel about the surveys in a lot of cases just depends so much on their existing relationship with the organisation… back to Adam’s implicit frameworks.

    This is not to say surveys cannot work, but the more general they are, the bigger the dangers.

    When it comes to those expensive and general products like the Trotting 13, all the IC and HR folk I talk to say (quite reasonably) – “well we don’t take it at face value, we modulate it with our understanding of particular situations…” – which is a smart way to deal with flawed information.

    Question is, why spend so much money to generate flawed information in the first place?

    There are cheaper and more effective ways to find answers.

    One-question approaches are better for the kind of headline numbers that less interested observers are looking for – and cheaper too. Beyond that, get more specific about outcomes, get more focused on who you’re asking, and save some money for qualitative techniques that might help you find the “why?” as well as the “what…”

    Of course, having ranted all that, the flipside is that things like the Trotting 13 are cultural symbols too, and can be positive ones… which is one of the lessons I draw from Sean Williams’ experiences…

  12. “There are three kinds of lies: lies, damned lies, and statistics.” [attributed by Mark Twain to Disraeli].

    I’m not sure I’d go that far [full disclosure: I’m married to an economist, so I’d be in big trouble if I did], but I do think that when measurement becomes the focus of what we do, we’re in trouble. Rather like playing a competitive sport like tennis while watching the scoreboard.

  13. Sean Trainor says:

    Mark Twain also said “you can do what you always did and get what you always got”. If measurement broke that cycle, I’d buy it.
    I said IF.

  14. Indy says:

    A more reasoned tour of surveys than my rant, and fuller criticism of the big company approaches can be found in Peter Hutton’s book, “What are your staff trying to tell you?”

    I don’t agree with everything in it, but it asks a lot of good questions and points the way to a better understanding of when and how surveys are useful.

  15. Adam Hibbert says:

    Sean W, I don’t think anyone’s arguing in favour of proceeding in blind ignorance of what employees think and feel. What’s in question is whether surveys produce ‘data’ of the actionable kind your NCB management team would be right to base their actions on.

    I think there has to be a question mark over that. Yes, having numbers beats ‘trust me’ in any organisational influence attempt. So if it is your privilege to be in a position to push a culture in one direction or another, having a survey in your back pocket does give you some ‘ammo’.

    Note: that’s in the circumstance you need to challenge the prevailing culture. The objective should be to arrive at a situation where the relationships that make up your organisation have no need of artificial devices to transmit how things are going, from one end to the other.

    I don’t survey my wife about how we’re doing (and not just to avoid the immediate threat of flying kitchen implements) but because it matters to me that I’m never so out of touch with her that such interventions in the relationship might be necessary. That’s the ‘BAU’ I think people are suggesting as the ideal, here.

  16. […] at all, you’ll know we’ve been giving a bit of a collective kicking to the “Measurement Mad” school of employee engagement of late. Not that measurement is bad per se – far from […]
