Digital CREST

Helping a UK charity reach twice as many students with their flagship STEM awards programme.

Service blueprint • Supplier selection + management • Solution architecture • System integration + testing • User support


A self-service website makes it easy for teachers to enter and assess Discovery and Bronze CREST awards, freeing up capacity at the BSA to handle more independently assessed Silver and Gold awards.

The brief

The British Science Association is a charity on a mission to support, grow and diversify the community of people interested and involved in science. A core component of their programme for young people is the CREST Awards, recognising students’ project work in science, technology, engineering and maths (STEM). 

The BSA wanted to expand access to CREST, so that many more students can benefit. However, paper-based processes and dated IT systems were limiting their ability to scale the programme. 

The solution

By moving CREST onto a new digital platform with more self-service components, the BSA plan to double the number of students who can participate. 

To support the increased volume of awards, I worked with the BSA Education team to update the entire CREST service. As well as a bespoke digital platform for entering awards, we also implemented help-desk and finance systems, and outsourced certificate printing services.

This phase: 14 months.


Discovery

The project faced a number of constraints: the budget and go-live date were agreed with the funder before any scoping work had been done. In addition, the team had limited time to work on the project and low confidence in their ability to successfully deliver digital projects. 

I started with workshops to map the existing service. As the client team documented the user journey and their existing operational activities, they identified areas that could be improved for the user, as well as internal processes that were limiting their ability to scale.

As well as creating a shared vision of how the updated service should work, this also helped them understand that they needed to do more than build a new website. 

For instance, we uncovered bottlenecks in dealing with user queries, and difficulties in producing the personalised certificates that were a key feature of the awards.

The client had already carried out research with teachers and students who had participated in the CREST Awards. We used this research to create personas, a more concise way to communicate user needs to a development team.

I also led the client in identifying critical success factors for the project - these were helpful later when we needed to reduce the scope of the first phase of work.

The initial maps were created on the wall, with stickies.

We used the table top to capture options and questions for the new service.

As we refined the business rules, I moved the maps to Google Draw, which offered a collaborative workspace for adding more detail.

Design

Through a formal supplier selection process, the client chose SOON_, an independent London agency, to create a new bespoke digital platform.

Early in the design phase, it became clear we would not have budget to support all four award levels. We used MoSCoW prioritisation to rank the requirements, and also drew on the success factors identified earlier to decide where to cut scope. This led the client to limit this initial phase to just the first two award levels. These account for the highest volume of awards, and are assessed by teachers rather than the BSA, so could be entirely self-service. 

The tight timing also meant we couldn’t pause delivery while we did user testing on the mockups. I encouraged the client to do some testing anyway, and this threw up a couple of major issues with usability. I negotiated with the agency to accommodate the user feedback within timescales acceptable to the client. 

During Discovery, we’d identified that user support was a bottleneck for the team. One person handled most enquiries; the support mailbox made it difficult for work to be delegated or tracked. With the expected increase in queries caused by transition to a new system, as well as by the planned increase in user numbers, we needed a better option. 

I proposed to the client that they trial an off-the-shelf help desk package. I sourced one with a free tier adequate to their needs, and worked with their main user support person to configure the system and link the existing mailbox. I set this up early in the build, so they could get used to the software and learn to delegate workload, while still handling the familiar issues with their existing service.

Systems integration

While the agency worked on the new digital platform, I mapped out the rest of the solution and designed the integration points between new and old systems.

The client was making a number of other changes at the same time - for instance, implementing a new accounting system, and changing the payment structure for regional co-ordinators. This, combined with the phased transition from the old CREST systems to the new platform, meant we were designing for a moving target. 

To get around this, I decided to prototype each interface in Excel. These not only allowed us to test the designs early using real data - they also meant we could change the interfaces as needed to accommodate other projects. 

Clearly, these semi-manual interfaces meant increased workload for the support team, but we used the prototypes to size the effort required, and the team were prepared for the extra work. 

Later in the project, as funds became available, we automated some of the interfaces. Others became unnecessary due to subsequent business changes, so building them in this way saved precious development budget for more important features.

Testing and implementation

The client had limited experience with accepting digital products. I showed them how to plan the testing work and structure scripts and data so that we could run tests quickly and repeatably when the software was delivered. This was important as we only had two weeks between initial delivery and go-live to find and fix any bugs.

As we had taken an iterative development approach to other aspects of the solution, we were able to complete testing on time, and the new platform went into service just a few weeks after the initial planned delivery date. The successful delivery of the first phase of work meant the funder agreed to release further budget for the next phase. 

Initial feedback from users was positive, but as expected they also had many suggestions for improvement. I helped the client set up a backlog in Trello for feature requests and bugs, along with outstanding requirements from the initial Discovery phase, helping them to weave changes into future releases.

Emily O'Byrne