on packetpushers: influence, analysis, and the life

Ethan and Greg over at PacketPushers asked me to come on the podcast to talk about what it's like to be an analyst and grill me on some topics about analyst life and perceptions of the industry. Listen at Show 137 – Gartner Is Not for Sale With @Aneel Lakhani.

With Gartner’s blessing, Aneel came on the show and answered some hard questions frankly – even bluntly. Sure, Aneel doesn’t speak for all of Gartner, but we ended up with a lot of useful insight from him.

  • How does Aneel’s job work? What’s he do all day?
  • Who is a Gartner “customer”?
  • How does an analyst determine what products are interesting while avoiding bias?
  • How technically competent are Gartner analysts?
  • Most Gartner reports seem to represent the current state of affairs but don't look into the future. Why is that?
  • Why is longevity at Gartner something to be proud of?

Some highlights from me:

Most of my time is inquiry with customers. Most of the customers are end users and buyers of technology. As an analyst, I am the product.

Woe to anyone who tries to turn us one way or another [vendor influence] because that goes very badly for them. If I am not factually incorrect and they [vendors] don't like what I've written about their product or marketing or behavior or whatever... they should just do better.

In dealing with customers, I've found the reason Gartner commands the premium it does is because of the independence.

Like any large firm, Gartner has multiple divisions and business units... serving different customers, etc. You have to know how to use analyst firms. If you want a deeply technical analyst, you should go get a deeply technical analyst.

It takes a particularly tough personality to survive the process of research and writing and getting through peer review and getting published and wading through all the information you get from vendors... it's way way way more work than I expected, by easily an order of magnitude.

We'll see if I get into trouble for anything I said. :)

on startups one analyst-year in

It’s been a little over a year since I joined Gartner and some things about startups, especially the cloud platform and management variety, stick out one analyst-year in.

1. Many startups don't know what to do with their product.

They're pursuing the wrong market, promoting the wrong feature, using the wrong metaphor, blowing the product out of all proportion with itself, etc.

Say you have a great feature. Why build a not-great product around that great feature to compete in a market full of other non-great products built around more or less interesting features of their own? Just be honest: sell your bloody feature to someone who wants it or to a bigger player who needs it. But I probably don't know what I'm talking about, and this approach doesn't fit the path-to-exit models in effect.

2. The herd mentality is very much real.

In my tiny little specialization of cloud platforms and management software, there are 76 startups that I know of. And this number seems to grow monthly (I found number 76 yesterday). Seriously, why do entrepreneurs keep entering this (very crowded) space, and who are the financiers who keep throwing money at them? How many potential acquirers are there, and how big could any of these folks get independently? My view is dim.

Not only are they creating a phalanx of not-likely-to-exit entities all marching towards the same cliff, but they're locking up useful talent that could be doing something that might have some kind of impact.

3. Money has to be put to work.

Lots of it. And funding startups is one way to put it to work. But I've come to the conclusion that some (more than admitted) of what goes on is a group of people in an inbred ecosystem sustaining a particular lifestyle off of someone else's cash--everyone collecting a bigger or smaller slice of that more or less free pie as befits their nominal function.

I'm familiar with this pattern. I've seen it before in other realms of finance, which are more dominant where I live in NYC.

on cloud one analyst-year in

It's been a little over a year since I joined Gartner and some things about cloud stick out one analyst-year in.

1. Cloud consumption is fragmented.

I can find no single pattern or market or threshold function that tips any given organization over into using or building cloud services. If I take the aggregate of all the customers I've discussed this with--enterprises, state agencies, federal agencies, cloud service providers--it's all over the map.

There is no single cloud market. There are dozens. The interesting thing to see is whether vendors arise to serve all of them.

2. There's a small, growing huge-$$$-potential market for "real", "true", "webscale", whatever-you-want-to-call-it internal-private-not-hosted cloud.

Amazon has left that market to anyone who wants to take it. And that market is being addressed, though relatively quietly (marketing attempts notwithstanding).

3. A class of customer is leading, almost dragging, vendors into the future.

Some aren't even waiting. They're running into the future without their vendors and finding new ones there or making it up as they go along.  They're spinning out companies, products, teams, open-source projects.  This looks new to me, but maybe I just haven't been around long enough.

on analysis one analyst-year in

It's been a little over a year since I joined Gartner.

Some things about analysis stick out one analyst-year in.

1. Numbers don't lie, but beware averages, timescales, local min/max-es and the like.

You have to dig deep for the real story. Everyone knows this and nearly no one does it. I interact with too many people who say "the forecast says x" or "market share globally is y" without having looked at the breakdown of numbers below the abstract data point. The most surprising thing about this is how often that person is an investor.

2. Being an analyst is strange.

The weight carried by "Gartner" next to my name is more than I expected by an order of magnitude.

A bit of the job is just playing tech/vendor/market therapist.

It's hard work to not get sucked into one kind of a bubble or another that causes you to lose touch with what's going on in the real world. You have to constantly remind yourself of your biases, background, context, the biases and backgrounds and contexts of the people you're interacting with, and seek out things that check and balance you out.

3. Some significant portion of those engaging with analyst firms have no idea of how to do it or even what it is that they're getting.

They don't know what the range of services is that they have available to them. They have no strategy for engagement. Etc. And they don't ask. They don't seek to find out. I have to figure out that they don't know and then lay out the list of things that could be done differently. This may be as much a problem of customer management as anything, so I don't lay the blame all in one place.

a thought on oss cloud platforms

One will be Linux. One will be BSD. The rest don't matter.

reaching peak people

San Francisco, Silicon Valley, DC, Baltimore, Chicago. I've hit most of these cities multiple times this year and one thing has started to stand out clearly: there's a talent constraint.

It stood out most at VMworld last month + Surge this month. Then I read this:

Are we reaching “peak people”?

It seems like in a lot of companies we are. There’s a shortage of talent out there, and if there’s a shortage of resources, you want to conserve those resources.

That's Jason Fried of 37signals. I've been talking to technology companies, enterprises, startups, state agencies, federal agencies, defense organizations and this is becoming a recurring refrain. "How do we move the organization forward? What kind of people do we need? Where do we get them?" That's not even what I cover or talk about, yet the question keeps coming up.

I don't think it's a fundamental technology skill supply problem, though. I think it's an allocation problem. Too much money is chasing too few problems spread out over too many organizations all employing the same kinds of people to do the same kinds of things. And that's how it'll stay--until things move on + the field is decimated.

Peak people.

P.S. I do think there's a supply problem for a particular skillset: systems people who understand applications + vice-versa. But that has most (in my opinion) to do with the fact that the kind of education that takes seems to have dropped off the general computer science curriculum.

deliver better

There are physical limits to observation and action. Given equally matched adversaries with access to the same data and tools, both will hit absolute limits to how fast they can observe the environment or act on it.

But realistically, adversaries are not equally matched. They gather different amounts + kinds of data. They act slower or faster.

Observe - instrumentation, monitoring, data collection, etc.
Orient - analytics in all its forms, correlation, visualization, etc.
Decide - modeling, scenarios, heuristics, etc.
Act - provision, develop, deploy, fail, iterate, etc.

What does cloud speed up? And who has the advantage?

The proximate answer is obvious: operating in cloud models accelerates action. But the real benefit of being faster to act is upstream. It's so you can spend more time figuring out what's going on out there in the world and come up with the best--not the fastest--response and act on it at the optimal--not the fastest--time.

In Wait: The Art and Science of Delay, Frank Partnoy writes:
Because Connors and Evert needed less time to hit a return, they had more time to gather and process information. They saw, they prepared, and finally, only after they had processed as much information as possible, they hit. Their preconscious time management—what their brains did during the “prepare” period—was crucial to their success. Their talent enabled them to stretch out a split-second and pack in a sequence of interpretation and action that would take most of us much longer.
And also:
Sometimes there is a first-mover disadvantage. Sometimes the second mouse gets the cheese.
Which is what I was talking about in [pacing]:
It's not at all clear...that it's uniformly better to be functioning at a faster pace than competitors at all things.
Being fast to provision + market... is only worth anything if you can use that speed to deliver something better.

Act faster. Observe greater. Orient longer. Decide truer.
Deliver better.

pacing

In [change the game], I wrote:

With each iteration, you move further ahead in time.

What you iterate and how often, your pacing, is a big question. And not all paces are created equal.

What-- there's pace of: marketing, product, strategy, back office, ecosystem growth, ecosystem cannibalization.

How often-- there's pace relative to: competitors, customers, technology advance, commoditization, ecosystem.

It's not at all clear that all of these things should be the same or that it's uniformly better to be functioning at a faster pace than competitors at all things. For example, consider what happens when your strategy changes too quickly and your products change ever so slowly (this happens more often than you'd think)... constant changes of direction, executives, board members, etc., all the while without anything new being brought to market, any changes being made to the portfolio, or any response being made to competition. Dysfunction in the extreme.

It's also not clear that you should be moving at the same speed (or with the same strategy) throughout a particular business, product, or ecosystem cycle.

Eventually, people will realise that activities evolve and you need to use different methods depending upon the state of evolution.

— swardley (@swardley) August 14, 2012


change the game

In ooda x cloud, I wrote:

More: compressing Orient and Decide, the time between Observation and Action, enables you to change the operative environment such that the adversary is orienting towards and making decisions based on an outdated model representing a reality that no longer exists.

There are physical limits to observation and action. Given equally matched adversaries with access to the same data and tools, both will hit absolute limits to how fast they can observe the environment or act on it.

For orient and decide, no such limits. These are mental processes. Their analogs: algorithms, analytics, patterns, heuristics, models, etc.

If you orient and decide fast enough, then your action will change the competitive environment before the adversary has decided what to do. They will be drawing conclusions and acting on a model of the past, instead of the present. With each iteration, you move further ahead in time. You can literally..

Change the game.

ooda x cloud

Spending time lately discussing cloud in terms of mission objectives and operational leverage brought up John Boyd and OODA. The theory holds that:


"…to win in battle a pilot needs to operate at a faster tempo than his enemy…he must stay one or two steps ahead of his adversary: he must operate inside his adversary's time scale."

- Boyd: The Fighter Pilot Who Changed the Art of War.


More or less: whoever moves faster through the process of Observing the operative environment, Orienting themselves towards it, Deciding on a course of action, and Acting--wins.

More: compressing Orient and Decide, the time between Observation and Action, enables you to change the operative environment such that the adversary is orienting towards and making decisions based on an outdated model representing a reality that no longer exists.

The application to any competitive market is obvious, so instead here's an analogy:

Observe - instrumentation, monitoring, data collection, etc.
Orient - analytics in all its forms, correlation, visualization, etc.
Decide - modeling, scenarios, heuristics, etc.
Act - provision, develop, deploy, fail, iterate, etc.

What does cloud speed up? And who has the advantage?