Phil Gilbert Revisits the Next Decade of BPM

Scott Francis
I really like Phil’s talks on “the big picture.” Given his investment in process, his talks on BPM are particularly compelling (even if you disagree with him). His latest blog post, essentially an excerpt from his presentation at the BPM2010 Conference, drives home the point about how leveraged the IT resources at our major corporations are: for every true software developer, there are 5 people supporting their work (maintenance, deployment, requirements, management), and for those six people, 240 people are in “the business.” He could have added another dimension: outsourcing, which has reduced the number of IT employees relative to business employees even further.

Image courtesy of Phil's blog

Both of these trends have a cost: reduced agility. One answer would be to hire more software developers, thereby increasing the throughput of the current 6:240 ratio. Another is to provide better software to the 6 IT folks. Phil argues that BPM is, essentially, just better tooling for those same 6 people. An argument could be made that BPM extends its reach a bit further, allowing 7, 8, 9, or 10 people to get more directly engaged. But the point remains: a shockingly small number of people are the bottleneck to business agility.

Image courtesy of Phil's blog

So the other answer is to address the larger community of “the business” itself: the 240. Increase the scale from 240 to 480. This is the challenge that Phil is calling on BPM to address:
So the next great challenge is before us: how do we turn the notion of scale on its ear? How do we gain the involvement of the 240 real business people so that without training they can contribute their knowledge of how things work, their requests for how things should work into the explicit, unambiguous conversations required for automation? And, by doing so, how can we increase the velocity of communication for everyone, so that requirements can be more exact, more quickly and communicated more effectively, so that the scale of software can be enhanced even further.
Good stuff. Nick Malik elaborates from his perspective in a follow-up post. But rather than dig into that, we can jump straight to Phil’s response to Nick:
And then once you are on the cloud, the technologies of Social are, in essence, free. I think if Social is your value prop you will lose inside the enterprise. People have no time for abstract notions of “community” at work, any more than they have time for abstract notions of “process.” But if the value prop is unleashed via the cloud, then Social becomes possible. And Social then can become, in essence, your Center of Excellence. This is the ultimate democratization: a Social network that enables the communication that is your Center of Excellence. This will never happen with someone buying a “Center of Excellence in the Cloud” package. It will happen because they use a set of tooling that solves personal or small group problems… and that technology has as a secondary value prop the ability to communicate via the new Social. Which then, finally, leads to this: my Center of Excellence isn’t defined by expertise but, rather, by the velocity of communication that speeds through it. Expertise is so highly distributed and, for most interesting breakthroughs, so specialized, that the main hurdle to breakthrough isn’t knowing everything, but rather, knowing where the pointer to any one thing is. A Center of Excellence should not contain experts with answers but, rather, should be a vehicle through which I can pretty easily get to any answer. It is about communication. And the faster and more robust that communication can be, the quicker the person or artifact with the answer can be identified and reached, then the better the CoE is.
I would argue it isn’t knowing everything; it is knowing how to get to the answer. This concept is familiar to me as the son of a librarian who could seemingly find any bit of information or research paper I asked for. As computing resources evolved, the key improvement in “finding the answer” was search. The difference with “the new Social,” as Phil calls it, is that not only can I use search to find answers, I can also use my network of experts to find them. Phil calls this dis-intermediating the experts, but I would instead say it removes the official distinction between experts and non-experts: whoever has the answer can be the expert. True experts will still be immensely valuable in “the new Social.” They’ll be even more leveraged than they were in the old systems, and access to them is not likely to be as limited by governance, chains of command, and project charters. That’s probably a good thing. Phil leaves us with one last thought:
By the way… we may be closer than you think on some of this stuff :-)
I think I mentioned before that I suspected Phil might be priming the pump with his talk at BPM2010, while working on the answers back at the lab :) We’re looking forward to seeing the outcome.