Monday, December 31, 2012

Management, governance and a partridge in a pear tree

I do read quite a few other blogs and there must be something about the year end that elicits a desire in bloggers to report on what they did well during the past year, what they won't do in the next year, emerging trends and must-haves for the ensuing 12 months. You get the picture. I'm not so sure I have much to add to the existing canon, but it has spurred me to be just a shade more speculative than is the norm.

I include some points below which might be worthy of a moment's consideration in terms of positive change to the overall landscape of project management.


  1. Project Complexity Modelling. For clarity, this is an approach to modelling how complex a project is and there is some coverage of this topic in the (former) OGC's Portfolio, Programme and Project Offices – P3O literature here. However, it's subjective, qualitative and not (in my experience) used. I'd like to see something quantitative, something that was capable of analysing a network diagram and, ideally something that allowed you to manipulate a project's plan or scope to influence (reduce) any given project's complexity.
  2. Getting change practitioners a seat on the board. CEO, COO, CFO, CIO, C3PO? (you heard it here first). My view is that a business's ability to undertake change effectively and efficiently leads to a significant overall competitive advantage. Supplement this with 'poor governance' being an oft-cited cause of project failure and isn't it time the business of change took its seat at the top table?
  3. Configuration management (bit of a misnomer this one). I like to think of myself as a generalist PM, an aspiring 'project manager's project manager' if you will. I do acknowledge however that a lot of my thinking comes from 15 or so years working in assorted I.T. environments. In any I.T. environment, configuration management (the management of your configured items) is a cornerstone of a well managed I.T. estate. Get this right and you have the opportunity to get most other elements right too. Get this wrong, and this opportunity will elude you. In very succinct summary, the CMDB (configuration management database) should in most cases facilitate an answer to the question "what should I worry about if I change this specific item". It provides a map (of varying quality and granularity) of what is joined up to what. There probably is something similar in the world of project management - the portfolio view, the PMO, the integrated programme and project management tool-sets - call them what you will. There's never been one when I've needed it.
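To give a flavour of what 'quantitative' might mean for point 1, here's a toy sketch in Python. It scores a project network (task → successors) on size, interconnectedness and integration points. The weightings are entirely my own invention for illustration; the point is only that a score derived from a network diagram can be recomputed as you manipulate a plan's scope or dependencies.

```python
# Illustrative complexity score for a project network (task -> successors).
# The weightings below are invented for this sketch, not drawn from any standard.
def complexity_score(network):
    nodes = set(network)
    for succs in network.values():
        nodes.update(succs)
    n = len(nodes)
    e = sum(len(s) for s in network.values())
    # Density: how interconnected the plan is overall.
    density = e / (n * (n - 1)) if n > 1 else 0.0
    # Fan-in: tasks with many predecessors are integration points.
    preds = {t: 0 for t in nodes}
    for succs in network.values():
        for t in succs:
            preds[t] += 1
    max_fan_in = max(preds.values(), default=0)
    return round(n + 2 * e + 10 * density + 5 * max_fan_in, 2)

# A small diamond-shaped plan: two parallel strands converging on task D.
plan = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(complexity_score(plan))  # -> 25.33
```

De-scope one strand and re-run, and the score drops; which is exactly the kind of manipulation I'd like a proper tool to support.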
And I think three things will do nicely. Three's a good number in management - military organisations the world over have known this for quite a while; hence, there are three sections in a troop, three troops in a company, three companies (or thereabouts) in a battalion and so on.

And that's not a bad hopping off point for the next point. Ask five people in a room what management is and you'll get six different answers. I always come back to the point that if something isn't being planned, controlled and directed, then it isn't managed. And furthermore, that things that are managed are predictable, sustainable and controllable. I think most of this relates pretty well to points 1 to 3 above.

(Incidentally, if you ask five people in a room what governance is, you'll get 3 answers, 2 questions and a partridge in a pear tree - but that's a topic for another day).

So after all I might have something to add to the overall canon of project management rhetorical introspection. A bit more management in 2013 would just do nicely.


Wednesday, December 26, 2012

Forecasting part deux

Vicissitudinous. Now that isn't a word you'll hear every day, let alone every day in the field of project management. But it should be and, if it weren't such a challenge in terms of spelling and pronunciation, my feeling is that it would be a good deal more used than it is. In short - prone to adverse change.


As is my habit, in my last post I talked at some length about the shortcomings of the PERT weighted average. I'll try and suggest one or two improvements.

First however, a simple suggestion that quite apart from any estimating technique will help immediately with the accuracy of your plans and the forecasting effort.


Keep your tasks short in duration / work - never longer than a week, ideally to a maximum of 2-3 days. Decompose your tasks to a level that can be undertaken by one person.


Sounds obvious, doesn't it? I have however lost count of the number of project plans in which a few vague words (usually something like "Re-provision host environments" or some such) hide 300 man days of work and a 60 day duration. And when, as surely you must, you enquire as to the specifics of the host environments and quite what their 're-provisioning' entails, you'll often be greeted by a wall of jargon or a dismissive wave of the hand which suggests that if you don't know already, you shouldn't be asking.

Keep digging. How long does it take to change a tyre? Actually, I'm not certain - but I know the following questions are easier to answer.


  1. How long does it take to remove the spare from the boot?
  2. How long does it take to jack up the car?
  3. How long does it take to unscrew the wheel nuts?
  4. How long does it take to find the locking wheel nut adapter? Ah-ha - you say you don't know where the locking wheel nut adapter is? Well, that's very interesting isn't it...
So you get the picture: decomposing the task through what is sometimes called 'progressive elaboration' is time well spent. Estimating shorter, smaller tasks is easier and it tends to throw out details that might otherwise get overlooked.
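There's also an arithmetic pay-off to decomposition, which a quick sketch can show. All the figures below are invented for the tyre-change example, using the three-point inputs discussed in my last post: individual sub-task variances add, so the combined estimate is relatively tighter than one monolithic guess.

```python
import math

# Hypothetical (best, likely, worst) estimates in hours for the tyre-change
# sub-tasks; every figure here is invented for illustration.
subtasks = {
    "remove spare from boot":         (0.05, 0.10, 0.20),
    "jack up the car":                (0.10, 0.15, 0.30),
    "unscrew the wheel nuts":         (0.05, 0.10, 0.25),
    "find locking wheel nut adapter": (0.05, 0.25, 2.00),  # the surprise item
}

# Mean per task: (best + 4*likely + worst) / 6, summed across tasks.
total_mean = sum((b + 4 * m + w) / 6 for b, m, w in subtasks.values())

# Standard deviations don't add; variances do - so the combined spread is
# proportionally narrower than the sum of the individual spreads.
total_sd = math.sqrt(sum(((w - b) / 6) ** 2 for b, m, w in subtasks.values()))

print(round(total_mean, 2), round(total_sd, 2))  # -> 0.9 0.33
```

Notice also that the surprise item (the missing adapter) only surfaces at all because you decomposed the task.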

I don't think I want to get too side-tracked here but I will take the opportunity to address a concern. The drive towards productivity very often means that planning phases are curtailed both in terms of duration and the degree of rigour that is sponsored / supported. Equally, there's often the (somewhat incredible) imperative to spend money quickly (usually this is about allocated budgets and financial drivers). In short, the PM is busy trying to create product breakdown structures, product descriptions and plans that actually reflect the work to be done, and the project sponsor / board is saying - get on with it. Remarkable.

I'll spend some time on how to address this particular concern at a later date, but right now (for all you project sponsors who don't support detailed and structured planning) I offer the following. For your project to be any kind of success you will need to address the following.

  1. You will need to identify all requirements
  2. You will need to elaborate the work to deliver the requirements
  3. You will need to identify and broker the resources to do the work
  4. You will need to do the work to deliver the requirements
  5. You will need to undertake such quality assurance work as is necessary to find defects in products being delivered.
  6. You will need to fix any defects you find.
  7. You will need to pay for items 1 - 6 above
Accepting (as I hope you will) that 1 - 7 are somewhat inevitable, you can do all that at the end of a project if you so choose. It's a darn sight more expensive that way and tends to get up the noses of your stakeholders something rotten. Okey-doke - I concede that I've wandered a little off my intended track here - but surely worth it for the word vicissitudinous alone!






Sunday, December 16, 2012

Less poor forecasting (the first in a poorly planned and shambolic series of posts on forecasting)

Surely it can't just be me who thinks that estimating with the PERT weighted average is awful?

You know what I'm talking about: (1 x the worst) + (1 x the best) + (4 x the most likely), all over 6. I guess not many people can think it's that great, because I've never encountered anyone using it.
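For the avoidance of doubt, here's the formula in code form - the worked figures are invented for illustration.

```python
# The PERT weighted average as stated above:
#   (best + 4 * most_likely + worst) / 6
def pert_estimate(best, most_likely, worst):
    return (best + 4 * most_likely + worst) / 6

# Invented example: best 2 days, most likely 3, worst 8.
print(round(pert_estimate(2, 3, 8), 2))  # -> 3.67
```

Mechanically trivial, as you can see; the trouble, as we're about to discuss, lies entirely in where the three inputs come from.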

What then is the problem? Well first, what is the level of confidence in the output? Dunno. And I mean, really, you don't know. The principal problem is you don't know where the input figures came from. Likely as not, a bit of a stab in the dark in a workshop. At best, it will be sourced from experienced team members who have done something very much like the work before. This doesn't guarantee much as we shall see, but it's an improvement. Oh and the best bit, you're just as likely to corrupt a good figure as average out a bad one.

So what's wrong with a stab in the dark? It has its place surely? We can refine things as we go, can't we?

Let me cite two quick examples.

I'm working with a colleague over the phone - he's got the job of assembling a project plan. He's asking me about specific aspects of my work and putting 'durations' next to the work. We talk about a particular task and he offers that 2 days seems reasonable. I counter with 1 hour. Quite correctly he challenges this disparity. I'm able to have some level of confidence in my figure because I'd already done the work - and it took about an hour. We then quickly move on to a second activity which utilises the output from the first task. My colleague offers that the work will take around three weeks. I counter with 45 seconds. What he thought was a manual task was a bit of cut and paste with a spreadsheet.

If PERT weighted estimating is used in these two cases, then the first example will be off by around 800% and, as for the second, I'm not even going to waste time with a calculator.

So, two quick examples of how things can go off the rails. Now someone out there will be saying hold on, hold on; PERT weighted averages are only as good as the data that goes into them - same as any forecast. I agree up to a point, but there is no discussion or reference to this anywhere in the literature. Is there? Moreover, where do you get an optimistic estimate and a pessimistic estimate from? I've only got my best estimate, but usefully I'll tell you where it came from and often, what my level of confidence is in the figure. Oh - and not forgetting that there's no mention of how to deal with estimates that are either highly consistent or wildly divergent - and these two scenarios should be treated differently, shouldn't they?


Let's set out a few immutable truths. Or at least some points which I hope won't raise too much debate.

Good things arising from good forecasting


  1. Better project plans that build confidence
  2. Better stakeholder engagement and perception - they know what's coming, when and in what order
  3. Better management of commercials - you can sign those distracting multi-million pound contracts with a little less anxiety
  4. Better expenditure profiling - you know what costs are coming when
  5. Better resource planning, and importantly brokering. Just 'cos I said you could have 2 engineers this week doesn't mean you can have 'em next week...
  6. More productive project boards - you're not arguing about the schedule all the time
  7. A happier project sponsor
...and quite a bit more besides.

Bad things arising from bad forecasting
Quite apart from the absence of points 1 - 7 above are the following.

  1. You'll constantly be re-working the project plan and your monthly project board will become your monthly re-baselining meeting.
  2. Your project will quickly become a source of generalised uncertainty, consternation and (quite possibly) resentment
  3. Your chances of getting to the finish line and fulfilling anyone's definition of a successful project will fairly quickly diminish to nil
  4. And a real kicker here, you'll dilute your governance (who makes what decisions, when and to what criteria)
As is so often the case, my blog post has ended up in quite a different place than I anticipated (that of course is attributable to poor forecasting) but between this article and those yet to be published I'll try to provide adequate coverage of the following.

  1. Parametric or reference class forecasting
  2. Guessing, uncertainty and pragmatism
  3. Never mind 6 Sigma - 1 will do you quite nicely
  4. The PM's job in fighting for exactitude and rigour in the planning process
  5. Some helpful language and strategies to challenge sub-standard practice
  6. Some real life scenarios and tools consistent with other articles in this blog that I hope will add a bit of value


Wednesday, September 12, 2012

Ubiquitous constraints (and what to do about them)

For the record, when I say constraint I mean some circumstance or factor which limits an activity, benefit or the overall success of part or all of a project. (I do like to cast my net wide with these things!)

I personally think they're a bit under-nourished in the general scheme of things. I'm not at all certain that within most projects there's a consensus on what a constraint is. There's often little or no effort to record or track constraints.

I think most projects face constraints in various forms, but there are three in particular which are almost omnipresent.

Before I go too much further, I suggest that unless you're intimately acquainted with Eliyahu M. Goldratt's theory of constraints, there's no better time than now to hop on over to Wikipedia and brush up on what I think is some pretty smart thinking about the way the world works.

All the way back in 1984, Mr. Goldratt was putting pen to paper and publishing material that included the following:


Types of (internal) constraints

  • Equipment: The way equipment is currently used limits the ability of the system to produce more salable goods/services.
  • People: Lack of skilled people limits the system. Mental models held by people can cause behaviour that becomes a constraint.
  • Policy: A written or unwritten policy prevents the system from making more.

*Source - Wikipedia

I don't have too many unbreakable golden rules but I do take the time to reflect on these three constraints roughly speaking at the following junctures.
  1. Spend approval
  2. Project start-up and initiation and every subsequent stage boundary
  3. All discussions relating to project change
Equipment

The effective delivery of change can be hindered by simply not having the tools for the job.

Whether this is heavy lifting machinery, the right widgets, collaborative toolsets for the project team or specialist software tools, it matters not.

How to defend against deficiencies in equipment? Good planning, principally. Not altogether intuitively, a detailed product description can often identify the particular equipment requirements for the delivery of a product.

The PMO too (if you have one) should be a source of lessons learned, historical data, policies and standards of its own that illuminate requirements for equipment.

Lastly, subject matter expertise is often the differentiator between getting things right the first time or slowly learning the right way through time and patience.

Your approach (as ever) should be proportionate and informed by risk.

Policy

I've seen a few ways in which absence of policy can seriously undermine project success.
  1. The output of a project isn't used due to a lack of policy
  2. Resources cannot be brokered for a project due to lack of sponsorship
  3. Lack of stakeholder engagement (because no one's telling them any different)
I think this is my personal favourite, having been caught out on all three counts at one time or another.

The really simple answer (really far too simple) is simply to get a clearer-than-clear project mandate. I've yet to see one. The less simple and less effective answer is to wrap your project's mandate up in a terms of reference (PID probably, charter or brief maybe) and get that approved, authorised and sponsored.

If you're still worried, add it to the issues log - this potentially won't help very much, but you will have done your job as a project manager to the degree possible.

There's a bit of an addendum to this as well. There's (certainly in the UK public sector) an increasing drive toward the management and realisation of benefits (at last!). This can help the mandate / policy quagmire as all of a sudden, staff outside the project team are likely to be accountable for the delivery of benefits and this might get you an entry point to a discussion about policy / mandate or lack thereof.


People

You could spend a lot of time discussing the various human fallibilities that can constrain a project's success. I think there are pretty much three principal headings.
  • Management
  • Culture
  • Availability
Consider the following to assist with the management of staff assigned to projects.

Responsibility assignment matrices are (I think) very important in most projects. Projects are temporary, unique and time-bound (aren't they?). Thus, it is not reasonable to expect a team to know what is expected of them unless they're told. And, in my sometimes prescriptive and process-orientated world, telling people anything important should be documented, versioned and recorded.

I like the RACI chart - there are others.
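For what it's worth, a RACI chart is simple enough to sanity-check mechanically. Here's a sketch (roles and activities are invented for illustration) that flags the classic RACI failure: not having exactly one Accountable party per activity.

```python
# A RACI chart as a simple table: activity -> {role: code}, where the codes
# are R(esponsible), A(ccountable), C(onsulted), I(nformed).
# Roles and activities below are invented for illustration.
raci = {
    "Approve PID":     {"Sponsor": "A", "PM": "R", "Team Lead": "C"},
    "Build product X": {"Sponsor": "I", "PM": "A", "Team Lead": "R"},
}

def raci_problems(chart):
    """Return activities that don't have exactly one Accountable role."""
    problems = []
    for activity, assignments in chart.items():
        accountable = [r for r, code in assignments.items() if code == "A"]
        if len(accountable) != 1:
            problems.append(activity)
    return problems

print(raci_problems(raci))  # -> [] (this chart passes)
```

Twenty lines of checking won't replace the conversation the chart exists to provoke, but it does catch the most common drafting error before the document is versioned and issued.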

Use a skills inventory. A skills inventory is a system or tool that identifies the skills and skill levels required to deliver your project, programme, specific work packages or products. It may also specify the individuals who possess those skills. Skills inventories are most effective if they are aligned with a particular programme.
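A skills inventory can likewise start life as something very modest. The sketch below (names, skills and levels all invented, on a notional 1-5 scale) surfaces the gaps between what the project requires and what the assigned team actually holds - which is precisely the people constraint you'll need to broker around.

```python
# Required skill levels for the project vs what the assigned team holds.
# All names, skills and levels are invented for illustration (1-5 scale).
required = {"Prince 2": 3, "SQL": 2, "Stakeholder mgmt": 4}
team = {
    "Alice": {"Prince 2": 4, "SQL": 1},
    "Bob":   {"SQL": 3, "Stakeholder mgmt": 2},
}

def skill_gaps(required, team):
    # Best available level per skill across the whole team.
    best = {skill: max((person.get(skill, 0) for person in team.values()),
                       default=0)
            for skill in required}
    # A gap exists wherever the best available level falls short.
    return {skill: lvl for skill, lvl in required.items() if best[skill] < lvl}

print(skill_gaps(required, team))  # -> {'Stakeholder mgmt': 4}
```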

I don't think it will often be within the scope of project manager's accountabilities to influence or manage cultural change within a project or programme team. I think this is one of those situations where if the culture isn't (for instance) delivery focussed, then you'll need to adopt a suitably appropriate posture to ensure the project's objectives are still attainable.

As far as availability is concerned, I cite the following.

  1. Planning is critical, as is estimating and forecasting. If estimating and forecasting are to be worthwhile, they must be based on historical data. One of the jobs of resource planning must be to specify what is actually going to be required in terms of resource burn.
  2. Sponsorship from resource brokers (see policy above). This should be twofold. First, to release the resources for the period specified at the time specified. Secondly, to deliver (as far as possible) the scope of work agreed to schedule, cost and quality.
  3. Manage change and issues so that deviations from what is agreed are predictable, sustainable and planned.
On another occasion, I'll upload a RACI template and skills inventory tracker to the resources page.

Thursday, August 23, 2012

Something a bit fishy...

Well it's no good beating about the bush with this one. I've produced a double-headed Ishikawa diagram for the purposes of illustrating causal factors corresponding to the influence and repositioning of stakeholders. I think it's fair to say no one's done that before.


So what's the point?

First, if you need to get some background on Ishikawa / fishbone diagrams, pop along to Wikipedia. They are worth having some background information on whether or not you're a convert to my eccentric approach to presentation above.

When writing my post some weeks back "Influence - a pragmatic and effective approach" I recall a sense that I hadn't quite nailed the visuals. Sometime later I had to generate a communication management strategy for a project and, when re-visiting the topic with fresh perspective, I came up with the approach above.

So, with the combined knowledge supplied by Wikipedia and my initial post, I hope you can at least intuitively grasp what I'm trying to do here. 

But why? And more to the point, why include something like this in a communication management strategy?

Well first, I think this is a good workshop approach to elicit input. You're not always going to have enough back story on a client site to generate one of these yourself, but you can provide the footpath and fill in the information as (hopefully) the client supplies it. You can also record that input and articulate it in a way that makes sense.

Incorporating it into the communications management strategy (or other similar document) has the benefit that all communications undertaken under the aegis of the project have the opportunity to be informed by themes that should assist the overall stakeholder engagement effort. Better yet, it should do that whether or not you're there to review and edit prospective communications yourself. Generate one of these Ishikawa diagrams, provide a supporting narrative in the communications policy and hopefully, you'll see helpful references and positive themes woven throughout the project communications effort.
*Per my previous article - the credit for the better part of the approach to stakeholder influence goes to these guys.

Monday, August 13, 2012

Projects? Like poker? Surely you jest...?

If you've been around a project life-cycle a few times, you've probably stood on the periphery of some passable projects, some not so good projects, the odd biblical disaster and possibly, just possibly a success story or two.

I've often reflected on the various circumstances corresponding to success and failure and have (as doubtless we all do) a few thoughts on the matter. 

However, let's look elsewhere for inspiration other than our own personal project hurt lockers for a moment.

How do Google and Apple manage such consistent successes? What are their project management approaches that so consistently deliver commercial success amongst the most high risk, cut-throat fields imaginable?

Well here's the thing - Google and Apple fail just as much as (if not a good deal more than) us mere mortals.

So the distinction is certainly not simply one of success or failure.

I'm not going to prattle on too much at this juncture as most of this is done to death elsewhere in great detail. I can't help, however, drawing a distinction between projects and poker. Namely: when you lose, try and lose a little; when you win, try and win a lot.

You don't read a lot in the project management blogosphere about Prince 2 - admittedly, it can be a dryish cornerstone of what must seem to PMI / PMBOK advocates to be a project management oddity. I would even have a modicum of sympathy for the view that it is a project management methodology with no project management, due to its conspicuous (and quite deliberate) omission of earned value management or anything resembling it.

But, I will break the trend a little in the context of this post (possibly a web first to encompass poker and Prince 2 within a single blog post).

I had to put pen to paper in a professional setting recently and wrote the following. 

"Teams seeking to undertake projects via Prince 2 are challenged by constraints which often diminish the effectiveness of the Prince 2 methodology and consequently the overall success of their projects. These constraints relate to people, processes and systems. Specifically, the efforts to recruit appropriate staff, acquire Prince 2 knowledge, develop and implement appropriate policies and then execute the Prince 2 methodology are exceptionally demanding."

And, I know what I'm talking about here, having assembled the odd 180-line responsibility assignment matrix for the purposes of administering the full suite of Prince 2 processes. And, yes, just in case you were in any doubt, that's 180 activities that need to be undertaken in a fully compliant (admittedly non-tailored) Prince 2 project that are quite independent of actually delivering anything. I'll upload this matrix to the resources page accompanying this blog in due course.

But, I remain a fan albeit with one or two provisos. Consider the following.


  1. Continued business justification
  2. Learn from experience
  3. Defined roles and responsibilities
  4. Manage by stages
  5. Manage by exception
  6. Focus on products
Not too shabby a list of tenets for the project manager to abide by is it? Well those six points are 6 of the 7 Prince 2 principles. The seventh is "Tailor to suit the project environment" which goes some way to palliating the 180 line items in the responsibility assignment matrix.

Just in case some of you were sitting there scratching around a long ago Prince 2 practitioner course thinking you really don't recall anything about 7 principles (and seven additional themes for that matter) you're probably right. I'm not sure what preceded Prince 2 2005, but in 2009 the framework was overhauled with the imaginative re-branding of "Prince 2 2009". I judge it to be substantially improved.

And what's all this got to do with Apple and Google? Well, whatever those folks are doing with their project management methodologies I'm pretty sure it will incorporate the 6 principles above. I shall resist the urge to conclude that Apple and Google are Prince 2 houses but I am guessing they don't rely much on full-houses either.












Wednesday, July 18, 2012

Making life easier (and a bit of process stuff)

Projects generally have a lot of 'interconnectedness'. And please - I don't just mean railway projects.

Processes are rarely 'stand-alone'. The outputs from one process are, more often than not, the inputs for something else. (So help me) you'll start to hear the words ecosystem and (abandon hope all who enter) 'synergy'.

In an ideal (and possibly mythical) world you have ERP, CRM and EPM tools into which all your various risks, issues etc are included. But, from time to time, the enterprising PM may find themselves without these tools and forced to fall back on more prosaic mechanisms (by which I mean Excel).

I've provided a spreadsheet here. I call it ACRID (derived from Assumptions, Constraints, Risks, Issues and Dependencies). You'll also come across the term RAID (minus the constraints), and you'll probably not often come across a CORDIAL log which (of course) contains the lessons learned log. Any attempt to include a quality log is headed for the rocks.
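For anyone building their own ACRID log rather than downloading mine, the shape of the thing can be as simple as one sheet with one row per item and a type column covering all five categories. A sketch follows - the column names are my own invention, not any standard, and Python's csv module stands in for Excel:

```python
import csv
import io

# A minimal ACRID log layout - column names are my own invention.
FIELDS = ["id", "type", "description", "owner", "raised", "status"]
TYPES = {"Assumption", "Constraint", "Risk", "Issue", "Dependency"}

rows = [
    {"id": "R1", "type": "Risk", "description": "Supplier slips delivery",
     "owner": "PM", "raised": "2012-07-18", "status": "Open"},
    {"id": "D1", "type": "Dependency", "description": "Network team completes LAN",
     "owner": "PM", "raised": "2012-07-18", "status": "Open"},
]

# One sheet, five item types - the whole point of ACRID over five logs.
assert all(r["type"] in TYPES for r in rows)

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Keeping everything in one structure is what makes the cut-and-paste into the highlight report (below) a five-minute job rather than a trawl through five documents.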

We don't want to make life too hard for ourselves and, of course, we'll want to keep an eye on the whole process ecosystem angle so we're not duplicating effort left, right and centre.

I include here a template for a highlight report. I like highlight reports as I know of no better way to bridge the divide between the poets (for whom business transformation is a mere pen stroke) and the plumbers who lie awake all night worrying about it. I take some license but there's a sliding scale in there somewhere.

So, back to the whole inputs and outputs thing. If you take a look at the highlight report template (which is aligned with Prince 2) you'll notice there's quite a bit on work packages and products. Check back to previous posts and you'll have all you need to cut and paste into the highlight report.

In fact, I'll paste in the contents page from the highlight report template and annotate each section with where its content is derived from.

1. This reporting period - not much going on in here
1.1 Work Packages - ...or here
1.1.1 Pending authorisation - if you need work packages, you'll have been writing them and you'll know which ones are awaiting authorisation
1.1.2 Completed in this period - from the project plan, paste in relevant sections of the WBS
1.2 Products completed in this period - from the project plan, paste in relevant sections of the WBS
1.3 Products planned but not started - from the project plan, paste in relevant sections of the WBS
1.4 Corrective actions taken during the period - issue log or other sources as appropriate
2. Next reporting period - again, not much going on in here
2.1 Work Packages - ...or here
2.1.1 To be authorised - from the project plan, paste in relevant sections of the WBS
2.1.2 To be completed in the next period - from the project plan, paste in relevant sections of the WBS
2.2 Products to be completed in the next period - from the project plan, paste in relevant sections of the WBS
2.3 Corrective actions to be completed in the next period - could be anything; use your judgement to include what you feel is appropriate
3. Product and stage tolerance status - SPI / CPI figures as appropriate (detailed coverage will have to wait for another day)
4. Requests for change - from the ACRID log, so long as you raise all your changes as issues
5. Key Risks - from the ACRID log
6. Issues - from the ACRID log
7. Lessons Report - from the ACRID log
***************************************

Some points to bear in mind.
  1. Truncate (i.e. hide a few columns) the product descriptions, WBS elements, risks etc., as you'll not fit them all on a single landscape A4 and the detail is probably more than your audience will want
  2. I'll cover off a bit on the cost performance index (CPI) and schedule performance index (SPI) another day.
  3. Corrective action could mean almost anything - include what you think is appropriate
  4. Some stakeholders will want detailed information about resource burn, budget status or other detailed information not included above - this is a good sign and shows that the sponsor is 'on board' and giving the project focus.
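Ahead of that fuller coverage of CPI and SPI, the two indices themselves are just ratios of earned value (EV) to actual cost (AC) and planned value (PV) respectively - these are the standard earned value definitions, not anything of mine. A sketch with invented figures:

```python
# Standard earned value indices:
#   CPI = EV / AC  (cost efficiency: below 1.0 means over budget)
#   SPI = EV / PV  (schedule efficiency: below 1.0 means behind schedule)
def cpi(earned_value, actual_cost):
    return earned_value / actual_cost

def spi(earned_value, planned_value):
    return earned_value / planned_value

# Invented example: £80k of work earned, £100k spent, £90k planned to date.
print(round(cpi(80, 100), 2), round(spi(80, 90), 2))  # -> 0.8 0.89
```

Both under 1.0, so this hypothetical project is over budget and behind schedule - exactly the sort of at-a-glance figure section 3 of the report exists to carry.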
And, to wrap up the topic of the highlight report, I conclude with the following key points which I hope will impress upon you the benefit of producing it, even if your stakeholders are indifferent on the topic.
  • It is an excellent tool of communication to all stakeholders, both to relay issues and status concerns but also to keep all parties abreast of progress. It is a principal tool by which the project manager can relay, escalate and communicate issues and anxieties which require board / sponsor input.
  • I find it amongst the best ways of structuring a project board meeting, particularly for stakeholders less experienced in sitting on project boards
  • If (like me) you keep all your WBS elements, product descriptions and logs up to date, with practice, you can produce one of these in about 20 minutes.

Monday, July 16, 2012

Theories need evidence, facts need proof.

Always nice to stumble upon an academic bun fight. For the very tip of a very large iceberg on the relative merits of quantitative versus qualitative analysis see here, here or here.

But I'm a PM, not an academic, so what's the angle? My qualitative answer would be that knowing the difference will sometimes enable a project manager to make optimal decisions. My quantitative answer is about 45 degrees - which to be fair isn't much use in this context.

For reference.

Quantitative research consists of those studies in which the data concerned can be analysed in terms of numbers ... Research can also be qualitative, that is, it can describe events, persons and so forth scientifically without the use of numerical data ... Quantitative research is based more directly on its original plans and its results are more readily analysed and interpreted. Qualitative research is more open and responsive to its subject. Both types of research are valid and useful. They are not mutually exclusive. It is possible for a single investigation to use both methods. (Best and Khan, 1989: 89-90)

Qualitative research is harder, more stressful and more time-consuming than other types. If you want to get your MEd dissertation or whatever finished quickly and easily do a straightforward questionnaire study. Qualitative research is only suitable for people who care about it, take it seriously, and are prepared for commitment (Delamont, 1992: viii)

Both these excerpts are from "An introduction to the qualitative and quantitative divide"

Whether it be the generation of the initial business case, the management of risk, key design decisions or resource planning, the project manager is faced with a veritable zoo of decisions. Some are more critical than others and on a sliding (qualitative) scale we make decisions which if wrong have very limited consequences to those 'irreversible' decisions to which great heed must be paid.

Question: has anyone ever sat down with project stakeholders and asked them if they want quantitative risk management, qualitative risk management or both? Do you understand the question? Would your stakeholders? Does it matter?

Take the following examples. First, what I hope will look like a fairly typical excerpt from a fairly typical quantitative risk log. All with me so far?




Next, something from the qualitative end of the spectrum.

R1 - qualitative assessment


Failure to provide adequate fencing, or early warning mechanisms as appropriate, may result in injury or death to Donald Duck.

(A little digression here on the two approaches to risk - if you fail to communicate to your board the potential impact of a risk with numbers (quantitative), try words instead (qualitative). On more than one occasion I've managed to elicit a response by describing in detail the consequence of a risk occurring, having failed by ascribing it an impact and probability.)

The Beaufort Scale incidentally makes rather good use of both qualitative and quantitative approaches - that's meteorology for you.
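To illustrate that pairing, the Beaufort scale takes a quantitative measurement (wind speed) and returns a qualitative description. A minimal sketch, with band thresholds in knots (the exact cut-offs are abbreviated from the standard scale):

```python
def beaufort(knots):
    """Map a quantitative wind speed (knots) to a qualitative Beaufort term."""
    # Exclusive upper bound of each band (knots), paired with its description.
    bands = [
        (1, "calm"),
        (4, "light air"),
        (7, "light breeze"),
        (11, "gentle breeze"),
        (17, "moderate breeze"),
        (22, "fresh breeze"),
        (28, "strong breeze"),
        (34, "near gale"),
        (41, "gale"),
        (48, "strong gale"),
        (56, "storm"),
        (64, "violent storm"),
    ]
    for upper, description in bands:
        if knots < upper:
            return description
    return "hurricane"

beaufort(38)  # a 38-knot wind sits in the 34-40 band: "gale"
```

Either form can be reported to a board; the number suits the analyst, the word suits the audience.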

The PERT weighted average here is purely quantitative.
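For reference, the PERT weighted average combines an optimistic, most likely and pessimistic estimate as (O + 4M + P) / 6, with (P - O) / 6 as the conventional standard deviation. A minimal sketch:

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """PERT three-point weighted average: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

def pert_std_dev(optimistic, pessimistic):
    """Conventional PERT standard deviation: (P - O) / 6."""
    return (pessimistic - optimistic) / 6

# A task estimated at 4 days optimistic, 6 likely, 14 pessimistic:
# (4 + 24 + 14) / 6 = 7.0 days
estimate = pert_estimate(4, 6, 14)
```

No adjectives anywhere in sight - which is rather the point.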

Now I could rattle on at length here, but I don't know if a blog post is quite the place for it. So I'll leave you with a couple of summary points.

  1. Make a judgement of which type of data suits your needs and then go and get it
  2. Personally, I prefer numbers to adjectives (with the proviso that they're right)
  3. Look back at the quantitative risk assessment example above. Are your quantitative risk assessments based on analysis of the statistical likelihood of the event and the impact to cost and time should it occur? If not, then your quantitative analysis is in fact a qualitative analysis.
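On that last point, a genuinely quantitative risk assessment multiplies a statistical probability by a measured impact to give an expected exposure, which can then be summed across the log. A minimal sketch (the risk ids and figures are invented for illustration):

```python
# Each risk: probability of occurrence, plus impact in cost and time.
risks = [
    {"id": "R1", "probability": 0.10, "cost_impact": 50_000, "time_impact": 20},
    {"id": "R2", "probability": 0.30, "cost_impact": 8_000,  "time_impact": 5},
]

def expected_exposure(risks):
    """Expected monetary and schedule exposure across the whole risk log."""
    cost = sum(r["probability"] * r["cost_impact"] for r in risks)
    days = sum(r["probability"] * r["time_impact"] for r in risks)
    return cost, days

cost, days = expected_exposure(risks)
# cost = 0.1 * 50000 + 0.3 * 8000 = 7400
# days = 0.1 * 20 + 0.3 * 5 = 3.5
```

If the probabilities are plucked from the air rather than derived from data, the arithmetic still runs - but you're back in qualitative territory wearing quantitative clothes.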





Friday, July 13, 2012

Stacking the odds in your favour

A project manager benefits (and is sometimes disadvantaged) by having an itinerant pair of ears and an equally mobile mouth. It may also be the case that they have some experience which can usefully be applied to the customer's specific needs.


Before I go any further, let me re-state (and supplement) a few truths I hold dear.

  1. Most defects are introduced in the early phases of a project and get a good deal more costly to fix as time passes
  2. Most capital is committed relatively early in the project life-cycle
  3. Commercial contracts offer plenty of opportunity to write in haste arrangements that there will subsequently be plenty of time to leisurely repent
With this all said, how does the project manager take these self-evident truisms and instate a control that will serve to guard against mishaps?

I've talked about several elements that together can serve as a foundation for project delivery. There are a number of things that can cause instability, amongst which are the following.
  1. Poor scope definition
  2. Inadequate product descriptions
  3. Lack of stakeholder engagement
  4. Poor change management
I've covered quite a bit of detail in relation to the first three but very little on the topic of change. Incidentally, this isn't a complete list, but it's certainly some headlines.

The invidious thing about change is that it comes in many guises and there won't always be a governance product or person that can be called upon to respond appropriately.

With this said, you need a catch-all, something that informs all decisions, something that is quantifiable, and readily comprehensible to all stakeholders.

Say after me - "EVERY DEVELOPMENT STEP HAS A CORRESPONDING TEST ACTIVITY".

Put it in the brief, put it in the PID, the project charter, give it a slide in the project kick-off meeting, write it on the wall, include it in SOPs for the project team and champion it at every opportunity.

And when, as inevitably will be the case, someone deviates from this virtuous path and you've finished picking up the pieces, conduct a root cause analysis which will almost certainly find the absence of an appropriate test activity the principal culprit.

...and then of course, you can add it to the lessons learned log.


Wednesday, July 4, 2012

Conundrums in communications (part 2)

Two down, four to go.
  1. You might saturate communication channels
  2. Someone else might saturate communication channels 
  3. Your stakeholders 'lose sight' of the communication plan
  4. Your stakeholders actually tell you they should have read the communications, but they're sorry they didn't and now they're in a bit of a mess
  5. You've racked your brains but you can't work out a way to ascertain whether or not your stakeholders have read, retained and understood your communications.
  6. You're facing universal stakeholder apathy (this is on a sliding scale from mildly disinterested to venomous mischief making)

Your stakeholders have 'lost sight' of the communication plan


The symptoms


This is one of those damning euphemisms which you really don't want offered forth in a board meeting. It's usually a response to some adverse incident which an observer has laid at the door of a communications management deficiency. A more forthright assessment might find the issue wholly predicated upon human factors.

The cause


Either you don't have a communication plan or, you're not 'managing your stakeholders'*. Or, you do have a communications plan, you are appropriately engaged with stakeholders and the assessment is simply wrong.

* I take the stance that you don't treat stakeholders like farm stock, but not everyone maintains such an enlightened view.

The solution


Do not allow something that isn't a communications issue to be painted as a communications issue. If you don't have a communications plan, you'll have to bite the bullet. If you have one and the issue really is a communications issue, then use something like 5-whys or another root cause analysis approach and take the corrective action necessary.

Your stakeholders are in a bind because they've not read your communications


The symptoms


You've done your job. Really, you have. You can show unequivocally that accountability for the lapse (be it what it may) lies elsewhere - in this particular instance with someone who should have, would have, could have but didn't read any of the 6 carefully planned and executed communications bulletins. However, they can't work now, they're costing the business money and that's starting to make you look bad.

The cause


Insufficient quality management. The key here is not the fact that your stakeholders didn't read the communications, it's the impact of them not reading the communications. You cannot ensure your stakeholders read, understand and retain what is provided to them. In most instances, though, you can manage what happens if they don't.

The solution


Implement poka-yoke or similar. However you do it, don't let your stakeholders fall into a man trap because they 'didn't get the memo'.
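A poka-yoke here means building the safeguard into the process rather than relying on anyone's memory. As a hypothetical sketch, a gate that refuses to let a dependent activity proceed until the relevant bulletin has been acknowledged (the class and names are invented for illustration):

```python
class CommunicationGate:
    """Mistake-proofing sketch: block an action until its bulletin is acknowledged."""

    def __init__(self):
        self.acknowledged = set()

    def acknowledge(self, stakeholder, bulletin):
        self.acknowledged.add((stakeholder, bulletin))

    def may_proceed(self, stakeholder, bulletin):
        # The gate, not the stakeholder's memory, enforces the prerequisite -
        # nobody can act on a bulletin they haven't acknowledged.
        return (stakeholder, bulletin) in self.acknowledged

gate = CommunicationGate()
gate.may_proceed("alice", "cutover-notice")   # False - not yet acknowledged
gate.acknowledge("alice", "cutover-notice")
gate.may_proceed("alice", "cutover-notice")   # True - safe to proceed
```

The same thinking applies whatever the medium: a sign-off field on the form, a mandatory briefing before system access, a checklist item at the stage gate.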

You can't validate the efficacy of your communications (#5)


The symptoms


You've no qualified or quantified measure of the efficacy of your communications plan. Nagging doubts. Often at 2am.


The cause

You're a project manager, not a mind reader.

The solution


You've hopefully still got your communications stooges on talking terms from issue #1. They'll be a good help with this, from simply opening up a dialogue to engaging them in test activities. I'm not a fan of surveys for this purpose. If I instate a test activity it will be designed either to weed out deficiencies in communications or to serve as what might in fact turn out to be confirmation testing. I'm not sure a survey does either of these things well.

Apathy (and sometimes venom and mischief)


The symptoms


Rather self-evident, this one.

The cause


Not ever so likely to be a communications issue, but quite possibly one which is first encountered by those project staff engaged in communications activities.

The solution


It's not a communications issue. Communications probably isn't the root cause and certainly is unlikely to resolve this independently. Escalate to the board and sponsor. Use the risk and issue logs with due prejudice.




Thursday, June 28, 2012

Conundrums in communications (part 1)

I have a bit of a love-hate relationship with communications. They can suck up resource, they're fiendishly difficult to get right (I sometimes wonder what 'right' even means) and I can't be alone in having been burnt on one or two occasions by a nanosecond's lapse in concentration on the topic.


Conversely, they give the project team an opportunity to raise their profile, to champion their aims and objectives and, done right, they can be one of the more satisfying aspects of the job.


Here's a few headline manholes.

  1. You might saturate communication channels
  2. Someone else might saturate communication channels 
  3. Your stakeholders 'lose sight' of the communication plan
  4. Your stakeholders actually tell you they should have read the communications, but they're sorry they didn't and now they're in a bit of a mess
  5. You've racked your brains but you can't work out a way to ascertain whether or not your stakeholders have read, retained and understood your communications.
  6. You're facing universal stakeholder apathy (this is on a sliding scale from mildly disinterested to venomous mischief making)
At this point I could espouse the virtues of a carefully crafted communications plan or stakeholder engagement plan. But I'm not going to and here's why. If you're fortunate enough to have specialist communications input, or even just a chunk of resource to throw at communications you can spend all the time in the world on soft market research, repositioning key stakeholder groups and deciding the precise colour scheme to use across the breadth of your media interfaces. So not only will you have time and money to develop specific purpose built documents and strategies, you'll be able to underwrite the resources for them, manage them and (one assumes) see them through to a successful conclusion.

Meanwhile, back in the real world, you've fought like a tiger for the limited resource you've got, your fingers are bleeding from back-to-back authoring of the business case, mandate, project brief, project initiation document, configuration management plan, WBS and product descriptions, and the sponsor is wondering when you're going to start delivering something. So you make a note somewhere that the communications plan will be included in the RAM in stage 3 of the project (or something similarly vague).

This all said, you might find yourself facing problems 1-6 inclusive, or some uniquely quirky issue particular to your project and needing a bit of communication lubrication. Below I include my suitably pragmatic salve to the issues above.

You have saturated your communication channels

The symptoms

Communications have become less effective, stakeholders are grumbling, you hear the words '...not another project blah email...' in the lift. Paradoxically, because people are starting to disregard your communications you have to send out even more. People are setting up email rules to junk your mail.

The cause

You've overcooked it. You might even simply be following the communications plan (that needs an update by the way).  You've got no temperature check or feedback loop in your communications so you keep on spamming the entire stakeholder pool whenever you feel slightly anxious that your message might be getting lost in the corporate soup.

The solution

Prevention is better than cure. You need to recruit communications stooges (confederates in the stakeholder pool or business) who will primarily assess your communications from the receiver end of the equation as well as being your eyes and ears in those environments. If you've missed the prevention boat, take a two week communications holiday while you recruit your stooges - that should kill two birds with one stone.

Someone else has saturated your communications channels

The symptoms

Broadly the same as before. You have the satisfaction of knowing that you are not directly responsible for the blunder, but the annoyance that you will have to directly fix it.

The cause

Could be a lot of things but two spring to mind which I cite specifically because you'll approach the matter differently.
  1. Poor email etiquette from co-workers. You're getting a lot of reply-alls on emails, or other similar blanket channels of communication are springing up. You're not managing them (in fact no one is), messages are getting garbled and coherence lost.
  2. Someone in the customer's organisation, the stakeholder pool or simply someone you need to do business with is suddenly and deliberately fielding a great deal of communications, most of which is making your day a good deal harder than it needs to be.
The solution

In the first case, you're going to need to give that someone some feedback on email etiquette, or simply on the fact that communications for project blah start and end with the project team. That'll probably get it fixed.

The second case is a bit more of a curve ball and is correspondingly a little tougher to fix. Remember the communications plan we were going to write in stage 3? You need to get that thing written promptly. Don't kick yourself - having it earlier wouldn't have done you any good.

You're going to need to define your communications interfaces, media, authorised and approved communications providers and get the plan approved by your project board. You will of course ensure that you've specifically outlawed stakeholders exploiting project communications channels to prosecute their own agendas and you'll also include that breaches of the communications plan will result in a project issue being raised to the board.

I can't promise you this is the panacea for every stakeholder who decided their views superseded those of the project board or sponsor - but it does tend either to quiesce over-active stakeholder communications or, in the event it does not, to reduce your exposure as PM to the problem, as you've got some structure and a readily available channel of escalation.

I'll see off the remainder in the next post. If anyone has any communications issues not listed, do add a comment - I can't promise to have an answer, but my readership is almost into double figures - we'll crowd source an answer!


Wednesday, June 27, 2012

Sewing up a few loose ends and knitting it all together

Time to wrap up (for now at least) on the WBS and allied products.


I mentioned that sometimes there was a lot of work and not much product. If you need better control around this or simply better control period then I enclose a work package template here.


The first half is pretty much Prince 2 all the way. The second half incorporates specific test management activities intended to root out defects.


Two interesting points about work packages. First, resource pools sometimes respond really well to them; they can definitely accelerate delivery. Secondly, if you ask for written checkpoint reports from assigned resources you can rely on finding out about problems after they've happened. If you take the time to engage verbally with the assigned resources, you might be lucky enough to catch sight of problems before they occur.


Next I include a small but useful resource, distilled from the WBS (which we have talked about) and the project plan (which we haven't), which nicely sews up all the delivery dates for all the products.


For ease of administration I recommend including the WBS dictionary, the product descriptions and the product handover log on different work sheets in the same work book. I've never done anything clever with this using SharePoint / Excel Web Services but doubtless there'd be some value in exploring this.
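Kept as plain structured data, the product handover log is little more than a join of the WBS dictionary's products onto the plan's delivery dates. A hypothetical sketch (ids, product names and dates are invented for illustration):

```python
# Products from the WBS dictionary, keyed by WBS element id.
products = {"1.1": "Project brief", "1.2": "Communications plan"}

# Planned delivery dates from the project plan, keyed the same way.
delivery_dates = {"1.1": "2012-07-06", "1.2": "2012-07-20"}

def handover_log(products, delivery_dates):
    """Join each product onto its planned delivery date ("TBC" if unplanned)."""
    return [
        {"id": pid, "product": name, "due": delivery_dates.get(pid, "TBC")}
        for pid, name in sorted(products.items())
    ]
```

Whether you hold this in Excel, SharePoint or a script, the shared key (the WBS element id) is what keeps the worksheets honest.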


I'm a bit cautious with the illustration below but felt it was worth including even in its slightly flawed state. The PBS is a bit nomadic, and there are one or two abstractions too far. But I think there's more right with it than wrong so I include it here and welcome suggestions for how it can be neatened up a bit.


There's a heck of a lot more worth exploring, both central to the illustration above (testing, estimating, tracking and controlling progress) and peripheral (benefits management, value management, business case writing) to name but a few. But, that's all for another post on another day.



Tuesday, June 26, 2012

Defining the output via product descriptions

This post and other recent posts relate to one approach to progressively distilling the voice of the customer into a project plan.


It shouldn't be adopted in a vacuum and other posts here on quality and requirements are important as is (doubtless) a world of detail specific to your particular ecosystem.


I have focussed in detail on the WBS through elaboration of the WBS itself, a useful supporting narrative for the WBS and the WBS dictionary. This post will look at product descriptions and touch a little on the composition of work packages.


The schema for the product descriptions is quite large so I've transposed it as shown below. Again, it's Prince 2 aligned and you can download the template here.


  1. Identifier - needs to be unique and should be traceable to the corresponding WBS element
  2. Title - something meaningful
  3. Purpose - important. Be accurate, succinct and clear - what's this product for?
  4. Composition - what's in it? If it's a document, you might describe the particular section headings
  5. Derivation - how, or from what, will this be derived?
  6. Format and presentation - speaks for itself
  7. Development skills required - if you need a rocket scientist, here's where to state your case
  8. Quality criteria - how will the product's fitness for purpose be measured or ascertained?
  9. Quality tolerance - and how much can it deviate?
  10. Quality method - how will it be checked (review, inspection, external audit)?
  11. Quality skills required - anything specific? Subject matter expertise, certification or other
  12. Quality responsibilities - whose head is on the block?
  13. Status - draft, approved, completed, late and so on and so forth

There's probably more detail than is needed for most products and that's fine. Understandably, you're not likely to add additional columns just because you come across a specific product for which you wish to define additional detail. So include all the fields but leave them blank where appropriate. The additional structure can serve as a useful handrail for reviewers.
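The thirteen fields above translate directly into a record, with the optional ones left blank where they don't apply. A minimal sketch using a Python dataclass (the field names are my own rendering of the schema, not a standard):

```python
from dataclasses import dataclass

@dataclass
class ProductDescription:
    """One entry in the product description log, per the schema above."""
    identifier: str                      # unique, traceable to the WBS element
    title: str
    purpose: str
    composition: str = ""
    derivation: str = ""
    format_and_presentation: str = ""
    development_skills: str = ""
    quality_criteria: str = ""
    quality_tolerance: str = ""
    quality_method: str = ""
    quality_skills: str = ""
    quality_responsibilities: str = ""
    status: str = "draft"                # draft, approved, completed, late...

pd = ProductDescription("1.2", "Communications plan",
                        "Defines channels, media and approved providers")
```

Only identifier, title and purpose are mandatory here; everything else defaults to blank, mirroring the "include all the fields but leave them blank where appropriate" advice.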



Those of you with a keen analytical eye might have observed a discrepancy. All this work to elaborate and specify detail corresponding to products - fine. But what about all the work for which there isn't a product, or for which a disproportionate amount of time / money is committed for the given product? Where do we instate the control necessary to govern the activity? The answer is in a formal documented 'work package'. I have some views on this so will hold off going into too much detail and make it the topic of a future subject-specific post. I'll include a template and some particular content around testing which adds value.


In summary then. A while back I looked at the following abstraction and talked quite a bit about 'achieving quality', defining quality as 'meets requirements, fit for purpose'.


I laboured (but didn't finish) requirements and requirements management, talked in detail about the quality risk assessment and just touched incidentally on defect management and the quality log. (By the way, my quality log is the opposite of the defect log - i.e. it records stuff that's okay as opposed to defective. This is distinct from the Prince 2 quality register, which records 'quality activities'. Technically you don't need a quality log, but it can be quite a bleak experience going into project board meetings with a defect log and nothing to illustrate the more positive end of the equation.)


This post and the last few posts have dealt in the development and implementation space. Taking the customer's vision, I generate a WBS and an accompanying narrative and get it reviewed and signed-off. I then elaborate the WBS into a WBS dictionary and identify any products. Where products are identified, they are elaborated via product descriptions. Note too that work packages play a part in controlling work without obvious products or for which the cost was disproportionate in comparison to the output. For the WBS elements with products we can probably take items straight off the QRA and apply them to the product descriptions as appropriate.