What’s your best-fit coaching style?

Our quality improvement project support officer, Kate Phillips, reflects on her learning from the West of England Academy Improvement Coach Programme…

I recently took part in a great two-day improvement coaching event hosted by the West of England AHSN, funded by The Health Foundation. The event was attended by 26 of the West of England Qs, a group of people who I am really enjoying getting to know as we share a passion for driving quality improvement (QI) in healthcare. Sue Mellor and Dee Wilkinson, our fabulous facilitators, guided us through three coaching approaches with an emphasis on finding our ‘best fit’ coaching style. This encouragement for honest reflection ensured I left with a bounty of personalised coaching tools.

We started the course by working out our Honey and Mumford personality type which led to conversations around team dynamics and how to make the most of individual talents. I felt a sense of belonging and of ‘finding my people’ as the room was buzzing with personality type ‘private’ jokes. A particularly comical moment was when three ‘activists’ were first up to grab the board pen, while the ‘theorists’ were still discussing the merits of the process!

I initially joined the ‘pragmatists’ as I thrive on finding evidence-based logical solutions. However, following an insightful conversation with a colleague, I scooted myself closer to the ‘reflectors’. She had noticed how I often approach tasks with a reflector mindset, which I reckon comes from a desire to learn best practice from more experienced colleagues (experienced in QI and identifying personality types!).

Having very recently made a jaunty sidestep away from a career in teaching, I am still finding my QI feet… Interestingly I think personality types are fluid and can change depending on the situation we find ourselves in.

For example, if I were to stroll back into a classroom and teach a class about displacement reactions (fire!) you would see a pragmatic Kate, but put me in the office answering the phone and you would first see me very flustered as I juggle the telephone voice, the demands of the caller and transferring the call. However, after my heart rate has returned to baseline, I will reflect on the success of the phone call and how I can make it less of an ordeal next time (more fire?).

As I’m sure a lot of QI projects involve taking people out of their comfort zones, I think it is important to recognise that personality types may take a detour away from ‘the norm’ during the changing situation. I can imagine this having quite a big impact on team dynamics.

As the two-day programme unfolded, Sue and Dee skilfully balanced theory-based learning with opportunities to ‘play’ with different coaching approaches, always with the focus on our own QI projects. We worked in triads to explore the benefits of three different coaching approaches:

GROW – Goal, Reality, Options, Will

CLEAR – Contracting, Listening, Exploring, Actions, Review

OSCAR – Outcome, Situation, Choices, Actions, Review.

As both coach and coachee, the chance to experiment with these approaches and to work with different Qs was an invaluable opportunity for me.

As a coach I grasped the power of suspending judgement, of allowing silence to fall in a conversation, and of the truth that can be discovered by tapping into the conversation’s energy as it peaked and troughed. My favourite approach was GROW, as I found the acronym easy to remember and the conversation often flowed quite naturally along this path.

In the position of a coachee I learnt to approach the conversation honestly and openly. As a result I was rewarded with multiple light bulb moments as QI ideas and feelings bubbled to the surface, simply drawn out with a few pertinent questions and some very active, active listening. I’d like to thank my triads for these delicious moments of clarity.

I left the programme feeling excited by the power of listening and empowered by the ability to harness a 15-minute time slot. My enthusiasm was echoed amongst the other delegates. “It’s powerful stuff for fostering change,” said one.

I’d love to hear your own thoughts and tips about using coaching to promote and accelerate QI projects. You’ll find me on Twitter at @IamKateP or @weahsn.

Using QI methodology to win Euro 2016

Natasha Owen, Quality Improvement Lead at the West of England AHSN, combines her passion for improvement science with her (basic) knowledge of football to get us in the mood for Euro 2016…

This year will see the Quality Improvement (QI) team at the West of England AHSN continue in its aim to increase the capacity and capability of colleagues in our member organisations, through the understanding and use of QI methodology and tools.

What better way to achieve this than by starting with our own teams here at the AHSN office?

Sometimes when using QI tools we have to step outside our own sector, in this case healthcare and the NHS, and develop people’s understanding of the concept using a more relatable topic. Say football for instance. The impending European Football Finals (Euro 2016) felt to us like the perfect opportunity to combine some office fun, in the form of a sweepstake, with an example of how to apply the Model for Improvement.

The QI team set about thinking: how would a football team apply the Model for Improvement to their tactical approach in the competition?

When specialist knowledge and QI skills are combined you can develop what Don Berwick called ‘the knowledge base for continual improvement’, which any team in any industry or field can strive for.

I mean who wants to stand still when you could improve?

As a QI expert or trainer, you are not expected to have the specialist knowledge. The key is allowing specialist teams to apply their knowledge to the Model.

In this case, I had the knowledge of applying the model combined with just enough football knowledge to make this example work!

The Model for Improvement requires a systematic approach to its application. It is a step by step process, which, when applied as described in the correct order, will provide a consistent approach to improving the quality of your performance, or processes, as a team.

Skipping a step, doing step three before step one, or taking steps out completely will not yield the same results. More importantly, it is not guaranteed to achieve an improvement every time.

However the ‘test small and quick’ method allows you to rule out bad change ideas as easily as identifying ideas that create an improvement. Both outcomes are essential to promote continuous change.

So back to our football team… How on earth can a methodology created for a healthcare environment help a football team win the Euro 2016 Final?

Picture the scene. It is the month before the finals begin, the football season has ended, and Roy has called up the England boys to play for their country. What an honour!

During the football season all the players play for different teams, where different tactics and skills are used. Bringing them together in the short term is comparable to creating a Quality Improvement team. Roy does not have long to get this team to gel together to be a high quality goal scoring machine: the finals start on 10 June!

Training as a team gets underway. In other words, the planning of the QI project begins. Step One of the Model for Improvement is to establish your aim: what are we trying to accomplish?

For this team the aim is to win the European Football Tournament by 10 July 2016. Aims should be specific. Note that the team want/need to achieve their aim by a certain date.

The next thing they need to do is decide what data they could use to decide whether an improvement has been made. This is Step Two of the Model: how will we know a change is an improvement?

Measurement is key to distinguishing between a change, and a change that makes an improvement. If we don’t know what the data looks like beforehand, the data we collect afterwards will be meaningless.

A football team may have many sources of data they can measure, from the number of goals they score to how fast each player runs during a game. However they need to decide which measures are applicable to their aim. Does running faster contribute to winning? I don’t think it would be a team’s primary concern.

Before you decide what to change, you need to decide if it can be measured. So Roy and the boys got together in the changing room and came up with the following measures…

Primary measure: number of points scored. Essentially this is how football is governed, so that measure is set for the team. This happens from time to time when undertaking improvement projects, where measures are set externally – by the CQC or NHS England, for example.

Secondary measures

  • Number of goals scored
  • Number of yellow cards given
  • Number of opponent goals saved or avoided.

Your primary measure is the main source of data you will use to establish if your aim has been achieved. Secondary measures can provide further data to indicate to what level a change is driving towards or away from making an improvement.

For example, where the team draws and only scores one point, this could be explained by the number of goals scored being equal to the opposing team’s; but an increase in yellow cards might suggest an underlying behavioural issue that led to a poorer performance, and ultimately to the lack of goals scored or saved.

Now the fun begins! Step Three is all about getting creative: what changes will make an improvement? It’s all about generating ideas, no matter how crazy they might seem, as long as they can be conducted within the rules of the game. I am pretty sure UEFA won’t allow players to wear rocket boosters on their shoes!

Finally we move into the testing phase. The team might decide what ideas to test using a prioritisation matrix. Remember, test one idea at a time. Test small, test quickly. This way you will limit the damage a change could cause and create less disruption in the full system, where a change could have a ripple effect in other areas or departments.

This style of testing is known as the PDSA cycle (Plan-Do-Study-Act).

The team decide their first test of change will be: players will only pass the ball five times before whoever has the ball shoots for the goal during the game against Turkey on 22 May.

Plan: your change. What will you do? What measures will you use? Who will do it? When will you do it? How will you do it?

| Action | Lead | Implement by | Measure of success |
| --- | --- | --- | --- |
| Install beeper on the ball that will beep after it is passed five times | Manager | 16 May | Number of times players shoot after hearing the beep; number of goals scored |
| Practice the five pass tactic in training | Players and Coach | 16 May | Number of times ball is aimed at the goal after five passes |
| Use the five pass tactic during the game | Players | 22 May | Number of points scored; number of goals scored |

Do: put it into practice. The timescale for the test will be the duration of the next game (around 90 minutes).

Study. Using the measures you set out, has an improvement been made? Run charts are the recommended way to present and analyse your data to indicate improvement.

Act. Did you see an improvement? Yes? Try it again in the next match to see if it continues to improve the team performance. No? Reflect on why it did not create an improvement and refine the idea, or scrap it and move on to the next idea.
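For readers who like to see the logic spelt out, the Study/Act comparison at the heart of a PDSA cycle can be sketched in a few lines of Python. The measure (goals per game), the data and the decision rule here are all invented for illustration; a real project would use the measures agreed in Step Two of the Model for Improvement.

```python
from statistics import median

def pdsa_cycle(baseline_goals, trial_goals):
    """One simplified PDSA cycle: compare a trial's results to baseline.

    baseline_goals: goals per game before the change idea (illustrative)
    trial_goals: goals per game while testing the change idea (illustrative)
    """
    # Plan and Do happen on the pitch; Study compares the data collected
    improved = median(trial_goals) > median(baseline_goals)
    # Act: adopt the idea if the measure improved, otherwise refine or abandon it
    return "adopt" if improved else "refine or abandon"

# Five-pass tactic tested over a handful of games (made-up numbers)
print(pdsa_cycle(baseline_goals=[1, 0, 2, 1], trial_goals=[2, 3, 1, 2]))  # → adopt
```

The point of the sketch is that Act is a decision driven by the measures, not by gut feel: the same comparison runs after every cycle.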

Now you have this knowledge, you might want to give Roy and the England boys* a call to see if you can help them with their tactics and WIN WIN WIN!

*or any other manager and team in the tournament

Anyone for tennis?

Anna Burhouse, Director of Quality at the West of England AHSN, explains a fun way to introduce teams to the basics of quality improvement.

I recently used this technique at one of our workshops with colleagues from across the health and social care sector, and it got such a positive response that I thought I’d share it more widely.

It’s a great exercise for introducing teams to a number of quality improvement (QI) approaches — including the Model for Improvement, PDSA (plan do study act) cycles, measurement using a run chart, variation, human factors, thinking differently and distributed leadership.

So what do you need? Not much really!

A group of people – at least 12 and the more the merrier, just add more balls! Each group will need one tennis ball, a recording sheet, a pen, a ruler and a way to record times – most people tend to use their mobile phones. You can download the recording sheet here. You’ll also need one person to act as facilitator.

Find out more about PDSA cycles and the model for improvement in our West of England Academy pages and there’s a great video on PDSA on YouTube.

And how do you do it?

Divide the group into teams of around four or five people.  In each team you’ll need a leader, a time keeper and a measurement recorder.

The facilitator gives each team a tennis ball and sets the task: “The ball needs to pass to each member in turn so that everyone passes it.” Be deliberately vague unless asked for clarification!

Ask the team leaders to start a round of passing the ball. Be explicit that the idea is to pass the ball as naturally as possible in the first round.

Each team times each round and records the turn number on the x axis and the number of seconds taken on the y axis of the chart. Record eight sets of “turns” on the graph. From the eight data points calculate the median and draw a line at this figure right across the graph, parallel to the x axis.
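As a rough sketch of the arithmetic behind that baseline, here is how the median of eight timed rounds could be calculated in Python (the timings are invented for illustration):

```python
from statistics import median

# Eight timed rounds of ball-passing, in seconds (invented numbers)
round_times = [12.4, 11.8, 13.1, 10.9, 12.0, 11.5, 12.7, 11.2]

# With eight points the median is the average of the 4th and 5th values
# once sorted; this figure becomes the baseline line on the run chart
baseline = median(round_times)
print(f"Baseline median: {baseline:.2f} seconds")  # → Baseline median: 11.90 seconds
```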

Explain that people have now established their baseline and their median.

Ask teams if they think they could improve on the median score. Suggest they identify just one change and note it on the graph and then test it another eight times. This is PDSA 1.

Then run further PDSA cycles until people feel they’ve reached a steady state, where the system is running as smoothly as possible, is maintainable and could be replicated by others.

Then when everyone is feeling confident, go back to each team and explain that the tennis ball was actually modelling how the A&E service was managing flow. Unfortunately it’s now mid-winter and demand has risen sharply. As a result of this the team now has to run another PDSA but this time they cannot use their hands.

End the session. Ask for final times and feedback about the process. Encourage discussion about what has worked.

What will teams learn?

  • Measurement: they have just completed a ‘Run Chart’. Typically a run of seven to eight consecutive points above or below the median is significant and can indicate a deterioration or an improvement.
  • QI Science: they have just completed a quality improvement project using the Institute for Healthcare Improvement’s ‘Model for Improvement’, with a Plan-Do-Study-Act (PDSA) cycle.
  • Variation: they have seen how each group has come up with a different way of approaching the task and there is variation in both the outputs and the way the team works together.
  • Human factors: they have seen human factors in action. For instance, the reaction speed of the person timing will often cause unintentional mistakes; unless the system is automated, it can only ever be as good as the person operating it – the ‘Theory of Constraints’ in quality improvement would also see this as a bottleneck.
  • Thinking differently: often the winter pressures exercise sees people get up and use their feet, and often this improves the time. Necessity is the mother of invention: by coming up with their own solutions and thinking differently, teams can often make improvements. This learning is about the power of distributed leadership, where the clinical microsystem can innovate local solutions.
  • Importance of a clear aim: often people ask for greater clarity on the task as the cycle progresses. The learning here is to establish with the facilitator at the beginning a clear aim and the quality measures required; after all, there is no point improving the wrong thing.
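The run-chart rule mentioned in the measurement point (a long run of consecutive points on one side of the median) is easy to check programmatically. This sketch is a hypothetical helper, not part of the exercise itself: it uses seven consecutive points as the threshold, and the timings are invented.

```python
from statistics import median

def has_shift(times, run_length=7):
    """Return True if `run_length` consecutive points sit entirely above
    or entirely below the median (points on the median break the run)."""
    m = median(times)
    above = below = 0
    for t in times:
        above = above + 1 if t > m else 0
        below = below + 1 if t < m else 0
        if above >= run_length or below >= run_length:
            return True
    return False

# Sixteen rounds: the first eight sit above and the last eight below the
# overall median, suggesting a genuine change rather than random variation
times = [13, 14, 12, 13, 15, 12, 14, 13, 10, 9, 10, 9, 8, 10, 9, 8]
print(has_shift(times))  # → True
```

A run chart with no such run is more likely showing ordinary variation than a real improvement, which is why the rule matters when teams judge their PDSA results.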

Ideally this exercise takes 45 minutes to do really well, but it can be scaled back to 20 minutes if you need to by reducing the number of PDSA cycles or leaving out the winter pressures section.