PMO Labs is our way of getting attendees talking about different PMO themes together. We know that every single one of the PMO Flashmobbers has a story to share, an interesting insight from their careers that others would get a great benefit from.
In this PMO Lab, part of the session held at Parliament UK, we continued to look at the insights thrown up from the latest Inside PMO report on KPIs, Metrics and Measures.
This time around we were talking about metrics around a PMO service menu – service levels or SLAs. We started by confirming what a service menu is. Basically, this is a list of all the services that the PMO can perform – the most well-known version of this is Appendix F of the P3O manual.
Stuart Dixon led the table in the discussions. “We discussed whether we could put a service level on some of the services, such as setting up a project or the turnaround time for producing minutes. However, we thought this would be more difficult if the service were a bit more flexible, such as providing consultancy.
Being PMO practitioners sitting around the table, we got straight into an example one of the individuals was experiencing: measuring the work they were doing to set up some training courses.
We started with the most basic of measures: the number of courses delivered and the number of attendees. While these were easy to collect, they didn’t answer the question of whether the training had delivered value.
Looking at the measures around this, we suggested they could use a Net Promoter Score to capture what the people being trained thought of the course. Taking it up a level, we thought we could measure this at an individual level by doing a competency assessment for the person beforehand and then again a few months after the course, to see whether there had been an improvement in capability. Two assessments were mentioned (from APM & IPMA).
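The Net Promoter Score mentioned above follows a standard formula: the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6). As a minimal sketch, assuming hypothetical 0–10 feedback ratings collected from course attendees:

```python
# Sketch of a Net Promoter Score calculation for course feedback.
# The ratings below are illustrative, not real survey data.

def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6), as a whole number."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Example: feedback from one course cohort (made-up numbers).
scores = [10, 9, 9, 8, 7, 7, 6, 5, 9, 10]
print(net_promoter_score(scores))  # 5 promoters, 2 detractors out of 10 -> 30
```

Running the same calculation on each cohort over time would give the PMO a trend line rather than a one-off number.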
Taking it up a level again we looked at whether, collectively, the training had been beneficial for the company and we could do a maturity assessment on this (P3M3 was mentioned). This would allow us to see the overall effectiveness of the organisational training, although that may be impacted by people leaving/joining the organisation after the training was given.
Moving away from the specific service of training, the discussion moved to macro and micro measures on services.
What level should the measure be?
Is it at the macro level – the organisation – or down at the micro level – the service? We agreed that we probably needed both. That got us onto a discussion about why that may be. We concluded it was because we had different levels of stakeholder – PMO sponsor, accountable execs, PMs, other areas such as HR and Finance – who all had different views of what ‘value’ was.
We looked at different measures we could take at the macro level. These included measuring how successfully the Business as Usual areas adopted some of the project disciplines – while recognising that not everything is a project. Getting them not to rush into ‘doing stuff’ but to think first and ask the question “is it a project?” would be considered a win for the business.
We then turned our thoughts to which services we should measure, if we were going to measure at the service level.
We agreed we would do this for the most popular services, which brought us nicely to reporting, the one service probably every PMO does.
We looked at what measures we could have for reporting. These included simple ones like the number of reports produced, how long it took to produce a report, and so on. We then decided these might drive the wrong behaviour, so we looked instead at measures such as the number of people who read a report, or the number of key decisions made based on it. As one of the PMO Flashmobbers is using Agile across the business, they wondered how you would measure the success of stand-ups; unfortunately, we couldn’t find any answers for that one.
One of the PMO Flashmobbers took this to the next level by going out and asking the users what they wanted, then making adjustments based on what they heard. We wondered whether this could be done for other services and suggested the Voice of the Customer technique as a way of getting feedback.
Going back to the overview of how well a PMO was doing, one person suggested looking at whether it had moved from push to pull: do people come to the PMO for services (pull), or are services ‘forced’ on them (push)? A shift to pull showed that people trusted and valued the PMO. If stakeholders were coming to ask the PMO for guidance and advice, then it showed they were appreciated.
One of the PMO Flashmobbers used the survey approach when they were running a transformation programme. They surveyed PM job satisfaction before the change, immediately after the change and six months after the change, and saw a significant increase, which showed the change had been worthwhile. Could the same approach be used for the introduction or refresh of services offered by the PMO?
Overall, it was a complicated subject. The main conclusions from me are: know who you are measuring for, work out at what level you want to measure, and remember that measures have an impact on behaviour. If you work out which behaviours you want to change, then look to design measures that will encourage them.”
Stuart Dixon | Linkedin Contact
Other PMO Lab Sessions