Assessing our assessment [#ONSBeta]

Just before Christmas we, like our colleagues in the eQ team, made the pilgrimage to Aviation House in Holborn for our assessment against the ‘Digital by Default Service Standard’ by a team from the Government Digital Service.

Unlike the eQ team, this wasn't our first rodeo, and as it was for permission to move from Beta to Live, the pressure was heightened somewhat.

There is a great bit in Chris's blogpost about their attitude to the 'service standard' and the assessment:

Passing the assessment should be a by-product of working to the Service Standard, not the other way around, that is, if we’re working to the standard then the assessment should be straightforward and obvious.

I'm quite militant about this, probably even more so than Chris. We are passionate advocates of the 'standard' and of the 'design principles' that predate it. As the Service Manager for the Beta (even if I prefer the title Product Manager) I believe in being able to articulate what we are doing to meet the 'standard' without rehearsal; it is the very core of my job. The great thing about the assessment itself, though, is that you get to bring some help along to get into the detail when needed (thanks to David, our Tech Lead, and Jonathan, our User Researcher, for the backup). Anyway, because of this we didn't do mock assessments or rehearsals; we just had a couple of meetings to make sure we gathered all our evidence in one place and agreed on the best examples to give if asked.

By their very nature there is something stressful about the assessments: they are like a mutant hybrid of a job interview, a PhD viva and a Dragons' Den pitch. That said, the GDS team were very welcoming, friendly and, importantly, knowledgeable.

I can say that now as they gave us a glowing review 🙂

It wasn't altogether straightforward though. At the time of the assessment the team required a little more evidence from us on two particular areas we discussed, which of course I was happy to supply. In many ways this made me more confident in the whole experience: it's like when you arrange penetration testing and feel a little cheated if they don't find anything to fix, so being pushed for a little more detail in the assessment felt right.

Anyway, as I'm sure you have guessed, we passed. The assessment team said:

Their enthusiasm and commitment to creating a website based on user needs that could be iterated at pace was clear.

There is no clearer statement of what we set out to achieve, and I'm pleased our enthusiasm was still evident, as 2015 was a long year!

On a personal note, I was extremely pleased with the following feedback:

The panel were really impressed with how the service manager had approached the (recruitment) problem, including how they put job descriptions on a hackpad to be peer reviewed, and tailoring them to explain what it meant to work for the ONS, rather than using generic job descriptions.

I learned an enormous amount about hiring last year, and while it was often frustrating and a little difficult, I believe it was worth it: we recruited some great staff towards the end of the year, and I am now much more aware of the common mistakes, even if I'm not awash with solutions yet.

I've published the report from the assessment team on this blog if anybody is interested, and I'd like to thank the assessors and the organisers once again. More than that, though, I'd like to thank my team, who worked tirelessly (and continue to do so) to get us to this point.