Keywords
Public Health, Public Health Practice, Qualitative research, Policy making
Public Health Intervention Responsive Studies Teams (PHIRST) are academic teams funded by the National Institute for Health and Care Research (NIHR) to provide rapid evaluations of public health interventions at no cost to local government. PHIRST Insight, a collaboration between the Universities of Bristol and Cardiff, was established in 2020 and completed 10 studies of a diverse range of local public health interventions over five years. This study examined local government partners’ and academic researchers’ experience of co-producing studies, and the perceived impact of PHIRST Insight studies.
This is a qualitative study. Semi-structured interviews were conducted with staff involved in each of the 10 Insight studies. Participants included local government staff (including some from third-sector organisations commissioned to deliver interventions) (N=13) and members of the PHIRST academic team (N=12). Interview transcripts were analysed using the Framework method of thematic analysis.
Research undertaken by a PHIRST team was perceived by local government staff to be more robust than that which they could undertake themselves, and the scheme successfully addressed the perceived lack of research skills, staff capacity, and funding within local government to undertake evaluation activity. Both research and local government respondents believed that they developed successful research partnerships through PHIRST, despite identified challenges of working with an external research team. The PHIRST Insight team’s three principles of co-production of studies with local partners, embedding public involvement across all aspects of the studies, and a focus on maximising impact for local government policy and practice were perceived by participants to have been largely met across all ten studies.
This study finds evidence that local government staff perceive a range of benefits from PHIRST support and that Insight studies have met the aim of the PHIRST scheme, providing timely evidence to inform local government practice or policy.
This article explores how a team of researchers has worked with local governments. Researchers from the Universities of Bristol and Cardiff formed a team called PHIRST Insight. PHIRST stands for ‘Public Health Intervention Responsive Studies Team’. Since 2020, the Insight team have completed ten studies of local government public health interventions. Local public health interventions are intended to protect and improve the health of communities. These studies were funded by the National Institute for Health and Care Research (NIHR). This article looks at how PHIRST Insight and local government staff have worked together on these studies, and what the impact of them has been for local policy and practice.
Interviews were held with 13 local government staff and 12 PHIRST Insight researchers involved in each of the ten studies. The study looked at the reasons why local government staff wanted support from the PHIRST team, the kind of relationship and partnership that developed between local government staff and Insight researchers, and what difference the local studies made.
Local government staff reported that the research undertaken by the PHIRST Insight team was of better quality than they could have undertaken themselves, because of a lack of time and funding. While there were challenges working together, often because the research team was unfamiliar with the local staff, the local area, and the intervention being studied, these challenges were usually overcome by developing a strong partnership. The study concludes that PHIRST studies have helped provide research evidence that has helped local government improve their public health policy and practice.
How to cite: Jessiman P, Campbell R, Robinson H et al. Building research partnerships with local government: Reviewing five years of a Public Health Intervention Responsive Studies Team (PHIRST) [version 1; peer review: awaiting peer review]. NIHR Open Res 2026, 6:21 (https://doi.org/10.3310/nihropenres.14221.1)
First published: 03 Mar 2026, 6:21 (https://doi.org/10.3310/nihropenres.14221.1)
Latest published: 03 Mar 2026, 6:21 (https://doi.org/10.3310/nihropenres.14221.1)
The relationship between evidence, policy, and practice is nuanced, dynamic, political, and contested.1 In England, local government is well placed to influence the determinants of public health and reduce inequalities following the transfer of responsibility from the NHS to local government in 2013 (though this differs in the devolved nations of the UK). However, opportunities for intervention are often missed.2 In addition, financial constraints on local government increase the need to identify interventions that are cost-effective in improving population health. Facilitating closer interaction between those working in public health policy, practice, and academic research is likely to increase the use of evidence in policy decisions to improve outcomes.
Previous research has identified numerous examples of the ways in which evidence may be used by local government, including but not limited to understanding the local context and local health need (including predicting future demands and planning public health services); evaluating the effectiveness of local services; informing policy and practice decisions; determining cost-effectiveness; and raising awareness of public health issues.3 Identified barriers to the use of research in policy-making (not limited to local government) include the absence of relevant research, lack of access to it, poor research dissemination, and lack of time to use research evidence.4 Research has also identified barriers to local authorities engaging in research activity, including limited time and analytical and financial resources; lack of staff for producing or managing research; and limitations in the capacity to align research with longer-term strategic needs or policy development.5,6 Funding models in the public sector frequently mean that research activity is framed around short-term project work, which can also make it difficult to retain the knowledge gained from such research.5 Partnering with public health academics and researchers may resolve some of these issues, but commissioning academic research is expensive, and there are often difficulties in identifying the relevant researchers to work with.7
To address this, Public Health Intervention Responsive Studies Teams (PHIRST) were established by the National Institute for Health and Care Research (NIHR) in 2020 to provide rapid evaluations of local public health interventions at no cost to local government. PHIRST is a national, competitive scheme open to local government public health teams across the UK (or equivalent organisations in the devolved administrations commissioning non-NHS, public health interventions). Criteria for prioritising interventions for support from a PHIRST team include that they: are likely to have important impacts on population health and health inequalities; have the potential to be widely generalisable; have an identified need; are readily evaluable; have funding identified and preferably secured; and for which evaluative research will provide timely evidence to inform practice or policy in local government.8 Four academic PHIRST teams were funded from 2020 and by 2025 ten PHIRST teams had been established, with over 80 studies underway or completed.
Several barriers to research partnerships between local government and academia have been identified in previous studies. Academic research has rigorous and time-consuming research governance demands around data sharing and ethics, and pressure to publish in peer-reviewed journals can make some research inaccessible.3,7 The political nature of local government also means that public health priorities may be subject to rapid and continual change, requiring researchers to have relevant research evidence ready to respond to policy and political changes.3,9,10 However, the lengthy time taken to produce academic evidence often means that it is perceived as outdated.3,11 Local government staff want evidence that is timely and relevant to current practice and not ‘removed’ from what is happening on the ground.2,5,12
Researchers’ existing commitments to long-term funding programmes may reduce their capacity to undertake rapid, localised studies,13 and building meaningful collaborations with local government practice and policy partners can come at a cost if the outputs of value to local government are not peer-reviewed journal articles needed for academic career progression (although this may be changing given the increased demand for research impact).12 Different types of evidence may be valued, with practitioners favouring localised and in-house research, while academics value rigorous and transferable evidence with (inter) national relevance.12 There is no guarantee that research commissioned to inform local government policy and practice will result in impact. The Local Government Association describes policy-making as evidence informed rather than evidence based.3 Factors influencing the use of evidence in local government include organisational, structural and staffing changes, financial pressures, availability of data and data sharing, and the national policy context and regulation.3 However, academics frequently cite a willingness to undertake solution-oriented research to address real-world problems, enabled by funding that supports collaboration between academia and practice.12
There are also drivers for local government public health teams to engage with academic research, including the perception that external evaluation may give greater credibility to their work12 and be more likely to influence policymakers. There is also an increasing emphasis on adopting an evidence-based approach to justify financial decisions5,7 and ‘proof-of-concept’ testing before embedding systemic change.5,14 Personal contact and collaboration between policymakers and researchers, good research dissemination, and research outputs that include clear recommendations have all been found to support the use of research evidence in policy making.4,11
PHIRST Insight, a collaboration between the Universities of Bristol and Cardiff, was established in 2020 and completed 10 studies (see Table 1 for details) with local government (or equivalent) in its first five years. After being allocated potential local government interventions for evaluation by the NIHR, PHIRST teams have six weeks to work with local government partners and other stakeholders to determine if the intervention is evaluable, and must develop a research protocol in collaboration with local partners within 12 weeks of allocation. Studies are not allocated based on geographical proximity to PHIRST academic teams, with the result that researchers are often unfamiliar with the local context, geography, socio-demographics and the public health priorities for the local government team involved. They are also building new relationships with public health commissioners, practitioners, and community members, creating a challenging context within which to develop a research study.
While the ten local interventions allocated to PHIRST Insight are diverse in terms of their intended health outcomes, target population, delivery method, and locality, Insight undertook these studies with three common principles: co-production of the research with local government partners, the embedding of public involvement across all aspects of the studies, and a focus on maximising the impact on local government policy and practice.
In 2024, the Insight team began a review of their work to date to determine whether these three principles had been implemented effectively. A full review of public involvement will be published in a separate paper. This article reports on local government partners’ (including third-sector agencies commissioned to deliver interventions) experience of co-producing studies with Insight, and the perceived impact of PHIRST Insight studies. The study also makes recommendations for improving research partnerships with local government and supporting impact within the context of the PHIRST scheme.
Two public members sit on the PHIRST Insight management group, and reviewed the overall aims of the current study and the proposed methodology. This included a review of researchers’ and local authority staff’s experiences of public involvement in PHIRST Insight studies. As a result, a full review of public involvement in Insight studies was conducted and reported separately from this paper. There was no public involvement activity in recruitment, data collection, analysis or reporting of this study.
This qualitative study consisted of semi-structured interviews with people involved in each of the 10 Insight studies. Participants were recruited from two groups: local government partners (including some from third-sector organisations commissioned to deliver interventions), and members of the PHIRST academic team. The data were collected between May 2024 and March 2025. The study was approved by the Research Ethics Committee of the University of Bristol Faculty of Health Sciences (ref: 17879). The COREQ criteria for reporting qualitative research were observed in the reporting of this study.15
Local government participants were identified and contacted by the Insight researcher who led each of the 10 studies and asked if they would be interested in taking part. If they agreed, their contact details were shared with a member of the Insight team undertaking fieldwork for this study who had not been involved in the original Insight study with them. Participants were sent a detailed participant information sheet (PIS) containing information about the study, its aims, confidentiality, use of data, and expected outputs. All participants signed an online consent form for the study prior to interview.
The lead academic researcher(s) from each of the 10 studies were also invited to participate, and a PIS was shared and a consent form completed in the same way. Again, interviews were undertaken by a researcher who had not been involved in the original Insight study.
A topic guide for local government respondents was developed and piloted which included questions on applying to the PHIRST scheme, experience of working with Insight, co-production of the study, public involvement, and the impact of the study. A similar guide was developed for academic participants, which omitted questions about applying for PHIRST support and included questions about working with local government partners within the context of PHIRST. Data collection was undertaken by TJ, HR, GW, CF, HL, JH and JM, all public health academics trained in qualitative research. All had previously worked on PHIRST Insight studies with the exception of GW. Data collection was conducted mostly on MS Teams, although a small number of interviews with academic researchers were conducted face-to-face. Interviews lasted between 40 and 90 minutes.
Interviews were recorded, transcribed verbatim, and analysed using the Framework method,16 a thematic analysis approach. The lead researcher (TJ) developed a draft analytical framework that included the key themes identified in the data and relevant to the research questions. This analytical framework was reviewed by other members of the research team (GW, who had no involvement in PHIRST prior to commencing this work, and HR, who had undertaken a small amount of work for one of the 10 Insight studies) and revised until agreement was reached. A systematic approach to data management was adopted, with the lead researcher coding the transcripts into the analytical framework using NVivo software.17 The analysis continued using Framework matrices as a detailed and accessible overview of the data populating each theme from every interview.
In total, 25 participants took part in the study. Thirteen were from local government or third sector partners, and 12 were academic researchers. At least one local partner and one researcher from each of the 10 Insight studies were interviewed; some researchers had been involved in multiple studies (see Table 2).
Findings are reported under the following themes: a) reasons local authority teams applied for PHIRST support, b) developing research partnerships, c) co-production, public and community engagement, involvement and participation, and d) outputs and impact. Local authority respondents (including those from third sector organisations) are identified as ‘LA’ when quotations are used, academic researchers as ‘R’. The numbers allocated to participant quotes bear no relation to the order in which studies are presented in Tables 1 and 2.
Local government staff reported two primary reasons for wanting support from an external team to evaluate programmes. The first was that this would ensure objectivity and increase the robustness of findings. The second was a lack of research skills, staff capacity, and funding to undertake evaluation activity themselves. Some local government partners also said they routinely bid for funding and resources from external programmes, and wanted to ‘test’ the PHIRST scheme to see if it would be useful for other interventions they were commissioning.
We were doing this big bit of work in [Place]. Certainly it was the first of its kind that we had done, right across a geographical area. So we kind of recognised that it was a big bit of work but it was also maybe slightly out of our scope of skill set, if you like. (LA3)
I mean, the benefit with having an external evaluator is always that they can dedicate more time. And sometimes, they have more resources, more experience. The fact that they’re external has got more credibility as well. So, I think there are different benefits. And if you can get the absolute benefit of having a university, an amazing team, and you don’t have to pay for it, I think it’s like, it’s, “Why not?” (LA11)
The most common outcome sought from the PHIRST studies was evidence of the impact of local interventions, often to inform future commissioning decisions. In some cases the intervention was thought to be especially novel and innovative, and respondents wanted to use the opportunity to generate evidence that could be used locally to inform programme development but also feed into policy and practice more widely, including nationally. While evidence of impact was most important, process outcomes, including barriers to participation or local residents’ perceptions of the programme, also mattered. For at least two projects it was hoped that the study would support better and closer relationships between separate departments of the local authority, including the sharing of budgets and resources.
Most respondents believed that, if local government teams had been unsuccessful in securing support through PHIRST, the interventions would not have been evaluated, or would have been subject to smaller ‘in-house’ monitoring and evaluation activity. Some would have considered commissioning independent evaluators, but were concerned about the time and cost required for procurement.
We probably wouldn't have had the budget to go and commission a research body to do the work for us. And then secondly, even if we did have a funding, the time that would have been involved in going through a kind of procurement process. You know, identifying potential research teams with, you know, the expertise….[]…it's really quite lengthy and time consuming. Whereas you know we were presented with an expert research team with variety, experience and skills to support us, which just circuited all of that. So that was a fantastic resource. (LA6)
Insight researchers’ unfamiliarity with many of the interventions and their locality was recognised as one of the disadvantages of working with an external research team, and of the PHIRST scheme itself.
And that’s probably one of the downsides of having an external evaluator anyway, because you have to cover that ground. You don’t know the council, you don’t know the back end. So, you know, you’ve got the benefits, but this I think is probably one of the drawbacks. (LA11)
It was a bit more difficult, because it was remote, and we had no existing contacts within the space. So, trying to understand their landscape of [policy area], and the way that [delivery organisations] work within local authorities, and how local authorities work with other local authorities, let alone how the local authority partner, who is [outside of local government] worked with and across all of them. (R12)
There were also some concerns about the time taken to agree the scope of each study and what could be achieved within the timescales and available data. This was exacerbated by intervention timescales not aligning well with local government applications for support from the PHIRST scheme, including delayed implementation and studies beginning post-implementation, which prevented the collection of baseline data. This impeded the capacity to undertake impact evaluation, and many of the studies necessarily adopted process evaluation designs. Research staff described this initial period as one in which they took care to manage the expectations of local partners about what could be achieved within the timescale of PHIRST studies (typically 12-24 months) and the resources available, while also supporting shared decision-making around the scope of the study.
It’s so important to come together to understand teams, different views and manage power dynamics. It would be very easy to say, “We know what we’re doing. We’re going to tell you what this evaluation looks like.” But you have to do not do that, and really listen hard to what it is they’re trying to do…[]…I think it is about being clear about what the remit of PHIRST is, and what we’re able to evaluate. I think making it clear that most of the projects have been between 12 and 18 months…[]… being clear about what’s achievable, but also, being absolutely clear about what it is the local authority want to know and when. (R6)
Researchers also experienced difficulties fully understanding how interventions were being implemented.
I think it’s a real PHIRST phenomenon that, for example, you find interventions aren’t actually happening as you thought they were….It took us a while to really work out what all the different bits of the intervention were. (R5)
Some local partners were frustrated by the time spent at the start of studies on research governance arrangements, including gaining research ethics approval and preparing data sharing agreements, which ate into time they felt could have been better spent on the research itself.
I felt like a lot of the research time, which should be going into the real asking questions, but having the time to go into depth, and understanding those, was…they’d spent on those governance and process aspects of it. (LA1)
In several studies, these delays were exacerbated by problems accessing data (particularly data held outside of local government, e.g. by the NHS), by poor data quality, or by finding that the type of data required was not available.
It’s taken a lot of time and a lot of effort. The key question there is who owns the data? It may not actually be who you think it is. So, for local authorities, they may own the data, or actually, it may be more about the people that are implementing the programme own the data. Or are you having to link to NHS data? So, it’s been very time-consuming. (R6)
Despite these challenges, both research and local government respondents believed that they developed strong working relationships as the studies progressed. These were achieved through regular meetings between research and local partners soon after studies were allocated, Insight staff conducting their own research about the local area, and exploring local contextual factors as part of the study itself. Researchers often spent time in the local area, both during the set-up phase of studies and conducting fieldwork during studies, which was highly valued by local staff. It also gave them an opportunity to explain local contextual factors influencing the intervention in person, supporting a better understanding among the research team.
I thought that was important for us, so they came [here]. So, even for people to make those journeys between surrounding villages, to really understand what we're dealing with when we, kind of, try to implement anything in a county like [x]… So, I think it's important to, kind of, get this context, even from a few visits. During the study and interviews with people like myself and my supervisors, they did actually make it part of the study itself. Like, exploring the reality of commissioning set pieces within the local authority. (LA8)
We organised a face-to-face meeting at the start of the project. …[]… we were brought up to [Place]. They kind of presented on their intervention. They also brought us for a walk around the borough, showing us all the different areas. And that was really helpful, to understand the context. They definitely did their best…[]…any points of clarity, we had the opportunity to follow up on that. So, yeah, I think they did a really good job at helping us understand the context. (R3)
Local government staff identified a further range of factors that facilitated a good working relationship. These included researchers’ regular communication with local staff, keeping them informed about the progress of the study. Insight took responsibility for project management, including arranging and chairing study meetings, accommodating local staff changes, and bringing new staff up to speed on the research study. Good interpersonal skills were also cited, with the research staff described as personable, interested, and motivated.
They were very proactive. They helped to drive the project along in terms of coordinating all the logistics of project meetings, having regular project meetings so that the study made progress when they started hitting barriers, they communicated really well…[]…It kind of the way it felt was that this was one of their absolutely primary research studies. And you know, nothing else was really taking a greater priority. So whether that was the case or not, that's how it came across. (LA6)
In co-producing each of the PHIRST studies, researchers worked in collaboration with a range of local partners, including those involved in commissioning, managing, and receiving the intervention being evaluated. In practice this meant involving staff from a range of local government departments beyond the public health team, and those involved in policy development, service commissioning, practice and delivery (including from the third sector). It also included public partners with an interest in the intervention or experience of the health outcome that was the focus of the study. Local government staff described the co-production of their PHIRST study as a collaborative process that supported engagement with the study from a wide range of local colleagues and partners not just at the development stage but throughout the duration of the study.
It was very much that partnership piece rather than just kind of being done to. I don't think at any point we've ever felt like we're having anything imposed upon us. (LA2)
Some local staff were keen to be involved in developing the research questions and outputs of the study, but felt less expert in research design and preferred to be led by the research team. They believed this also protected the objectivity and independence of the study.
In terms of the co-production, I wanted to set the outcomes, and the research goals, but I didn’t want to be overly influential in the actual research methodology. Because I didn’t want to affect the independence of it. (LA1)
This was reflected in the views of researchers, who valued co-producing study aims and research questions but were comfortable taking a lead role in study design.
I think one of the most important starting points is agreeing what the research questions are, and so that, I think, we have been really… That has been shared decision-making. Sometimes that's not always as clear as you think it should be, but working out: “What's the most useful thing to ask, given the time we've got?” …[]… I think that's genuine co-production on that, but I think it's still fair to say that we are leading on methods development but then getting input and advice from our local authority and other stakeholders along the way. (R2)
Other reported advantages of co-production included giving local staff insight into the rigour of academic research and the processes involved. It was also seen as an opportunity to gain expertise from a wide range of interested parties, including practitioners and public partners, which was believed to support better protocol development. In several studies, respondents noted the utility of involving practitioners and service managers at an early stage of study development to ensure the feasibility and acceptability of data-collection methods. For example, one local government participant commented on how researchers in one of the school-based studies collaborated closely with schools to consider accessibility.
There were accessibility needs, and they worked well with the school in terms of what are the research methodologies. What do we do in terms of presenting things in a visual way? How do we adjust the questions, so that it is accessible to families? I think all that was well done, in terms of designing that, and then, working with schools to co-produce it, to hold sessions within the schools, when they could find time. (LA1)
Local government staff also valued the input of public partners, and several believed this fitted well with their organisational values of engaging service users and co-designing services.
It was influenced not just by us, as commissioners, but also we had a study participation group which involved members of the public, so, obviously, they had some suggestions, as well. So, I think it reflected broader need and additional voices, what we ended up with. (LA8)
However, the co-production process also had disadvantages for local staff, principally around their capacity to fully engage. This was especially salient at the start of studies, when research staff needed a lot of information about the intervention and the availability of data.
You know we're a small enough team, we've lots of things on. So the capacity at times at the start trying to get all that information pulled together to send over. It was quite a bit to do, but actually it was OK. (LA2)
So many of the local authority staff I've worked with have been pretty up-skilled in research. So their problem is time and resource, not lack of skill or understanding. (R2)
In most cases logic models for the intervention were co-produced to inform the study, and local staff often reported requiring support to understand their purpose and how to develop one.
I will say, we did need a fair bit of help with the logic model, there were several of us that didn’t really understand what it was. But again, they were really good at explaining that and the purpose of it. There was a lot of support to get us to do a bit of that. Rather than them coming and saying, “This is what we think it should be.” (LA3)
Co-production also added to the workload of delivery organisations, particularly when they were involved in participant recruitment (e.g., distributing information sheets, or taking consent for contact details of potential participants to be passed to the research team). One respondent recognised that not all staff would value the expectation of co-producing research, instead preferring to defer responsibility for research to the academic team.
There’s something that I always struggle, and I know it’s part of NIHR, one of the key principles, about co-production. So, you always co-produce, and you want to do things together. And I think that is indeed the right way of doing things, but I think sometimes, especially with big meetings, and depending on how advanced the local authority is in terms of their understanding of logic models and having done that work before, it can be very tricky. And it can be very difficult for someone external, like a university, wanting to work under the co-production principle. Whereas, you have got people from the other side expecting you to tell them what to do and have that ready. (LA11)
Local government respondents had a range of experiences involving members of the public as part of their everyday work, from none through to those who reported habitually co-designing services with service users. There were few examples of strong public involvement in the design or commissioning of any of the ten interventions allocated to Insight for evaluation. This meant that, in most cases, the research team had to recruit and engage public partners and embed public involvement throughout the study with little existing PCIEP to build on. Most local partners were surprised by the extent to which the public were involved throughout PHIRST studies. Recruitment and facilitation of public partners was led by the research team, although local staff often signposted researchers to local organisations who could help. There were some local government staff who were sceptical about the role and representativeness of public partners.
I don’t want to go into the conversation about how to get public involved. We do enough consultation. We know the challenges of that, and who are the members of the public, who will want to be involved, is not necessarily a fully representative sample. And how do you get a representative is the biggest question, which I think everybody asks. (LA1)
This was reflected by other local respondents who welcomed the involvement of public partners as part of the study team, but felt that the way they were recruited (through national organisations, patient groups, or volunteers already working with the local authority) meant they were not representative of the target population the intervention was aimed at. There were also respondents who felt that public partners were often unable to fully engage and understand studies with more complex interventions or policy changes, or that involved interpreting large and complex datasets.
However, most respondents reported benefits of the involvement of public partners in each study. Commonly cited benefits were public partners’ involvement in developing research questions, and in planning dissemination activity. In some cases public partners were able to advise on the delivery of the intervention itself. Both researchers and local staff believed that the research team benefited from public partners co-designing or refining data collection tools, ensuring that they were fit for purpose and accessible to the target population.
I think it’s been really important to have that input, and just being able to talk to local people and familiarise ourselves with life in [Place] I think has been really, really important. (R3)
For those local staff who were less experienced in public involvement, there is some evidence that seeing it implemented throughout the PHIRST study had encouraged them to build involvement into subsequent work.
I do notice myself thinking about it more when we’re working at looking- Even developing new resources. Or just recently, we’ve built a new website …[]… when we were doing that, I was like, “You know, we should actually show this to older people, to the people that are going to be using this website.” Which I don't think is something that I would have thought about before. (LA2)
I think it probably has promoted the idea of co-production [with public partners] in quite unexpected ways. Well, we have a group of about 60 volunteers and we're currently changing their volunteer scheme and trying to improve it for them, whereas we when we've done that kind of thing in the past, it would be very much top down. We're gonna do this and we'll inform you of the changes. Now we're coproducing these changes with them and it's more an equal footing because ultimately it's their volunteer scheme. (LA4)
The NIHR requires PHIRST teams to prioritise study outputs in the format preferred by their local government partners. The Insight research team facilitate the discussion about this, and the impact the local government team aims to achieve, through the joint completion of a dissemination, impact, involvement, communications, and engagement (DIICE) plan at the start of each study. The plan is designed to consider who the research findings are intended to influence, the impact sought, and the audience(s) that should be engaged.
Local teams sought a range of outputs to suit different audiences, and across the ten studies these included short reports, research, policy and practice briefings, peer-reviewed papers, infographics, and narrated slide sets. Local partners expressed a strong need for accessible outputs targeting specific audiences; typically local policy makers and practitioners. They valued the discussion of different forms of outputs, and their potential utility.
One thing I really liked was they asked us what way did we want that information … who are the intended audience and what kind of format did we want that in? And I think yet again, that was a real strong point because yes, although we've put in kind of proposals or submissions for, for academic papers, which is great and both myself and another colleague were involved in that process…[]… With regards to the other report which we wanted to be yet again more accessible, user friendly to a wider audience and we said what way we wanted that not to be too lengthy or too academic. That was very much listened to and the report we got was very easy to digest, it was kind of full of the right quotes and the right data and the right impact. (LA2)
Many of these outputs were co-produced with local government partners, and all of the public-facing outputs with public partners. Local staff were also involved in co-presenting findings at research and practice conferences and to local and regional policy makers. Infographics were particularly valued as the most accessible form of output aimed at the general public, although some local government staff would also have liked to co-produce more video-based outputs that could be hosted on websites and social media.
Local government staff valued the production of academic outputs based on their work for reputational reasons, building their profile as an evidence-based team:
We have increased the evidence base in this particular area and one of the things that has happened is that we've got a publication from it so that we can show to our [elected] Members and the public that we are taking an evidence informed approach to the interventions we're promoting. (LA6)
However, there was some concern that academic papers, and related presentations, were too detailed and ineffective in communicating research findings clearly and succinctly.
When you look at the longer research paper. Actually, there’s two papers, but you have- with that, when you read it, you go, introduction, materials and methods, and then the pages of that, and then you have results. And then, you have a chart of results, or something like that, in the papers. And you go, “Okay, what is the executive summary of this?” (LA1)
In studies where findings indicated that the intervention had not been effective, or had unexpected outcomes, local partners had concerns about how these were presented and managing the reputational risk for their organisation. Researchers had to manage this carefully, balancing the need to preserve research integrity with managing the reputational risk for local partners. Many cited the strength of relationships built throughout the study period as helping with this.
But I'm reading this in in kind of with two hats on. Firstly, how does this inform my programme? And secondly, how does this affect the reputation of the local authority? I did have a bit of concern as to how some of that was worded, because I'm thinking how are other local authorities going to read this? How is my strategic director within the team going to read this? Is this going to end up with loads of media enquiries saying [place] Council has spent a lot of money on a project which isn't useful and that sort of thing? (LA7)
You’re always in a position where potentially you’re feeding some things back that aren’t what they want to hear…[]…And so, we had to deliver that message, but it was fine, because I think by then, we had built up a relationship through the meetings we’d had, and I think they’d obviously, hopefully realised that we had good intentions and we were trying to work transparently and collaboratively. So, that did mean we’ve not really had any tensions, even where there have been findings that might’ve been a bit more difficult. (R5)
The production of outputs in formats preferred by local authority partners was intended to maximise the impact of study findings, but respondents identified challenges faced by local government staff translating the research into impact. This included the risk that study outputs, while useful validation of their work, would not lead to any change or impact unless they came with a clear set of recommendations that were feasible and could be implemented by local government.
With other outputs that we’ve had from universities, it will be a case where you will get something really helpful, and then there will be a presentation, there will be an academic coming into an executive group, and making a presentation, and saying X, Y, and Z. And everyone is, sort of, like, “Yeah, great, thank you. That’s really amazing insight.” Not something that they didn’t necessarily know, but that validation I think is still useful. And then, they’re not going to use it ever again…[]… it might be useful for the final PHIRST output to have recommendations on what you could look at next. (LA12)
Local government staff may also struggle to mobilise research findings into local impact because of time and financial constraints.
I would expect it to be harder for local authorities to make good use of research findings and because it requires budget commitment, resource commitment and, and I suppose academics are going need to steer local authorities through that to some extent. (LA5)
Despite these constraints, respondents described a range of impacts that resulted from PHIRST studies. There were several examples of studies informing local government commissioning decisions. These included decisions to continue funding an intervention, to recommission using a different model, and, in one case, to cease funding (although the intervention is still ongoing through a self-funded model). PHIRST findings have also been used to inform the planning and commissioning of wider programmes of work, such as local travel plans and an integrated lifestyle service. In some instances, the research team brought forward the presentation of interim findings to support local teams in recommissioning.
They were about to re-commission the whole service, but obviously there's a lead-in time for writing the commissioning, the tender spec. So, we wrote interim findings. It was only a four-page document that just made some interim recommendations of things to think about when designing the tender spec for that re-commissioning. (R2)
In one area, local staff believed that the PHIRST Insight study had helped them build the case for longer-term, proactive planning decisions that reflected health and well-being goals rather than focusing only on immediate need.
It's very easy under the pressures that we're facing to lose sight of some of these big goals. Health and well-being is one of those, climate action, ecology. You know, these are things that we, in dealing with the daily perma-crises, it's very easy to lose sight of those things. It's too easy for local government to become reactive to situations and if we if we are proactive, we can get ahead…[]… I think that this [study] is going to be fundamental to helping us maintain that momentum of being proactive about this issue and not losing sight of it. (LA12)
PHIRST outputs have also been used to help influence commissioning decisions by local authorities not involved in the original study. For example, one local partner commented:
I think the impact will help to shape national work and hopefully be useful as well for other local authorities when they're thinking about their roll-out. … I think it's going to be of great interest not just to [place] but I think across the country. (LA7)
Several local staff reported being contacted by other local authorities for information and support following the publication and dissemination of Insight studies. An output from the universal free school meals study was cited by a neighbouring local authority when making the case for universal lunch provision in that borough. The findings from the NHS Digital Health Check study were closely followed by the national implementation team, and evidence from the Healthy Advertising study was shared with other public health teams across Wales and the Welsh National Government.
PHIRST findings also impacted on local intervention delivery, including by identifying barriers to participation in programmes which local teams could then address, and informing pathways into and out of interventions.
Because the study showed was that programme was working great…[]…but the aftermath of the programme was where we were lacking a little bit. So we then created these signposting packs which, when participants finished the programme, they got given a pack which had everything going on in their local area, things they could access, that would be suitable for those individuals ….[]… And that’s all down to the research. (LA9)
PHIRST studies have also had impact beyond policy and practice, building research capacity amongst local government staff. Several respondents reported that involvement in co-producing studies increased their understanding of evaluation, and led to the development of improved systems for monitoring and evaluating local interventions.
We have gained some understanding of how to go about future evaluation. There are some downsides in evaluating our own work, which is less preferrable, but gained some learning in how to do it. Some of the methodologies and the experience that we've gained from this exercise. (LA12)
In the last year or so, I’ve been looking at, what does the survey actually gather for us? What would we like to gather? Is the information we’re getting actually good? Can we use it? Does it show anything about [Organisation]? And I definitely wouldn’t have thought about that before the PHIRST project. (LA3)
There are also examples of local staff using their involvement with PHIRST studies as evidence of working with academic teams, which is one of the criteria for portfolio registration to become a public health consultant. Involvement in Insight studies has changed attitudes towards involvement and engagement with research.
We have applied to PHIRST [again], and have been successful, so that's very exciting. I think because it was such a positive experience it has motivated the team to engage more with research. (LA6)
Local government respondents believed that involvement with PHIRST had raised the profile of both the intervention and their organisation, enhancing their reputation through involvement in building evidence for public health policy and practice.
This review of the first five years of working with local governments through the PHIRST scheme has demonstrated that, despite the challenges of the scheme, it has been possible to develop strong and successful research partnerships and co-produce studies that have had an impact on policy, practice, and local government research capacity.
Across the ten Insight studies, the most common reason local government sought external research support was to evaluate the effectiveness of interventions. Many of these would not have been evaluated without PHIRST support, with local government staff citing barriers to evaluation also seen in other studies, including lack of time, staff, and financial resource.5,6 The PHIRST scheme has helped address these barriers, as well as removing the potentially burdensome tendering process for those local government teams who may have considered commissioning a research study from an external partner. This review also supports earlier studies in the finding that local government staff perceive evaluations conducted by external partners as providing greater rigour and credibility, and hence more likely to have influence and impact than those conducted ‘in house’.12 Furthermore, this study finds evidence that local governments seek partnerships with academia as a means of building their profile as research-engaged and evidence-based organisations.
The identified disadvantages of working with an external research team include that they may be unfamiliar with the local context, politics, public health challenges, and the staff involved in commissioning and delivering the intervention. The allocation of PHIRST projects is important here, as some projects are allocated some distance from where the PHIRST team is based. This often means that it is not possible to rely on pre-existing networks. As a result, local government staff have had to allocate considerable time and resource at the start of PHIRST studies to bring the academic team ‘up to speed’. Where feasible, the Insight team travelled to meet local teams in person, which was valued by local staff. Nevertheless, the time and resource required to explain intervention rationales, identify and share potential sources of data for studies, develop logic models, support public involvement, and agree the scope of each study might be reduced if PHIRST studies were allocated with greater regard to existing relationships between PHIRST academic teams and local governments. Such an approach would also reduce the travel time and costs of in-person meetings and fieldwork, and make the work more sustainable from an environmental perspective. It would also support better and more inclusive PCIEP, aiding the recruitment of public partners who are more representative of the population of interest. A further constraint identified across these first ten studies is that intervention timescales are frequently misaligned with local governments’ applications for support from PHIRST, often impeding the collection of baseline data and/or long-term outcome measures that would facilitate impact and cost-benefit evaluations.
Finally, the research governance constraints faced by academic teams, in particular gaining ethical approval and agreeing data sharing arrangements, are frustrating for local governments seeking timely evidence to support local commissioning and practice.3
This study identified several factors that have facilitated successful research partnerships between the Insight team and their local partners in such circumstances. These include Insight researchers facilitating regular, transparent and responsive communication; taking responsibility for project management; proactively researching local factors likely to influence the interventions under evaluation; good interpersonal skills; and accommodating local staff changes.
PHIRST Insight studies have been co-produced, with researchers working in collaboration with those involved in commissioning, delivering, and receiving the intervention, including members of the public. Oliver et al.’s review of the literature sets out four main arguments for co-producing research.18 There is a normative argument that co-production has intrinsic value, supporting the production of research in the public interest, and that sharing power and expertise is ethical. Certainly, there is evidence from this study that local partners perceive that Insight studies have been ‘done with’ rather than ‘done to’ them, supporting their engagement. Co-production can improve the quality of research by helping researchers and policymakers develop holistic understandings of the context, issues, and interventions designed as potential solutions. This has been particularly valuable to Insight researchers, who have sought to engage with and learn from local partners more familiar with the context and intervention, and has supported the development of relevant research questions and better research design (in particular, by co-producing data collection tools with public partners). Co-production may also help to increase the impact of research by building research capacity among non-academics and building engagement with research users. There is evidence in this study of local government staff improving their monitoring and evaluation skills, and increasing their motivation to engage with research in the future. The fourth argument for co-production is the intention for research to be inclusive and empowering, increasing a sense of ownership and thus making research findings more likely to be utilised.18 While there is some evidence of this from our study amongst local government staff, it is less clear that Insight studies have been successfully inclusive of the target populations of each of the ten interventions.
While public partners supported co-production in all studies, there is some scepticism amongst local government staff that the recruitment methods used resulted in the engagement of truly representative public members. Again, this is likely the result of the challenges of working remotely from the locations where interventions are being evaluated and the pace at which research protocols need to be developed within the PHIRST scheme. In addition, there was little evidence of public involvement in the design or commissioning of any of the ten interventions evaluated by Insight, meaning that researchers were building public involvement from scratch.
Oliver et al. also identify one of the challenges of co-production as developing study teams that are genuinely free of power imbalance, in which decision-making is not dominated by particular voices or interests.18 While we found little evidence in this study that any of the respondents, either within the research team or from local government partners (including public partners), felt disempowered, we did find evidence that some local partners did not want to co-develop research design for fear of damaging the integrity and independence of studies so valued by local government staff. This meant that the research design was led in large part by the research team rather than being genuinely co-produced. Furthermore, some local partners were new to logic model development and required significant support from the research team, which likely led to some power imbalance. A further risk of co-producing research is disagreement between the research team and local partners over the research findings and how they are framed and/or disseminated, with associated risks for research credibility.18 Some of the studies revealed unexpected findings or indicated that interventions were ineffective, and researchers had to be mindful of the reputational risk to local government partners while preserving research integrity. The development of supportive and trusted relationships between the research team and local partners has helped mitigate this challenge, although it remains a risk for future PHIRST studies.
We find evidence in this study that Insight studies have successfully provided evidence that has informed public health policy and practice in local government and more widely. This has been facilitated by early discussions with local partners about the impact they hope to achieve, who or what they hope to influence, and the format of research outputs most likely to achieve these aims. The co-production of DIICE plans is valued by local partners, and outputs have been produced in a range of accessible formats including short research and policy briefings, research reports, and infographics. Many of these were co-authored with local government staff and public partners. The production of peer-reviewed papers, while an important output for academic researchers, is also valued by most local partners as a means of building their reputation as an evidence-based organisation, although there is some scepticism about the utility of these outputs in communicating research findings effectively and accessibly. Reported impacts from Insight studies have included using evidence to inform commissioning decisions (including in local governments not involved in the original PHIRST study), influencing wider programmes of work (e.g., local travel plans) and influencing national policy. Barriers to impact remain, which include the resource and financial constraints on local government and rapidly changing public health priorities. Respondents in this study also identified that research outputs are more likely to have an impact if they make clear and feasible recommendations to local government, something that PHIRST teams should be mindful of in the future.
We make the following recommendations. We believe that the central NIHR PHIRST team should support local government (and equivalent organisations in the devolved nations) to make timely applications to the PHIRST scheme, to ensure that the period of evaluation aligns with intervention timescales. We also advocate allocating studies, where possible, with greater regard to the geographies of PHIRST teams; this would reduce the burden on research teams and local partners in scoping and developing research protocols and help reduce the carbon footprint of the research. As an academic team, Insight should continue to take responsibility for project management of PHIRST studies, but be mindful of the impact of this on co-production and shared decision-making. Local government partners often need support to understand and develop logic models, and Insight has developed more detailed guidance on this that will be shared in future studies. Researchers should also encourage local staff to fully engage with the co-development of research design and methodology in ways that do not risk the integrity or credibility of the study. Researchers should, where possible, maximise efforts to reduce the time taken to achieve ethical approval and agree data sharing arrangements for studies. We also identified a need to improve the recruitment of local public partners to ensure a more diverse, inclusive, and representative sample. Finally, Insight should further explore with local partners the production of more creative, audio-visual outputs that can be hosted on social media platforms.
This study is limited by the involvement of most of the authors (all except GW) in the PHIRST Insight studies. We sought to mitigate this by ensuring that interviews were conducted by researchers who were not involved in the individual study concerned, and through GW’s oversight of the analysis and interpretation. The study was strengthened by the recruitment of at least one researcher and one local government partner for each of the ten studies.
This review of the first five years of work as a PHIRST team, co-producing studies of a diverse range of local interventions, finds evidence that local government teams perceive a range of benefits from PHIRST support and that Insight studies have met the aim of the PHIRST scheme in providing timely evidence to inform local government practices or policies. Adopting the findings and recommendations made in this review is likely to support the maintenance of this success, improve the experience of local government partners supported by the PHIRST scheme, and maximise the impact of studies as the Insight team commences a further five years of work.
data.bris. PHIRST legacy https://doi.org/10.5523/bris.2imk1apz6485y2klhkdmdzk03l19
The project contains the following extended data:
• Consent form- PHIRST Insight legacy.docx (the consent form used for all study participants)
• PIS - PHIRST Insight legacy study.docx (the participant information sheet sent to all study participants)
• Topic Guide - PHIRST Insight – local authority.docx (the topic guide used for local authority respondents)
• Topic Guide – PHIRST Insight – researchers.docx (the topic guide used for researcher respondents)
• Researcher interviews (anon) (12 .docx transcripts of interviews with researchers involved in PHIRST projects interviewed for the study)
• Local authority interviews (anon) (11 .docx transcripts of interviews with local authority staff involved in PHIRST projects interviewed for the study)
There are no underlying data for this study.
Materials are available under the terms of the Creative Commons Attribution 4.0 International license (CC-BY 4.0).
We are grateful for the participation of staff from local government and related organisations who were interviewed for this study. We also thank public members of the PHIRST Insight management group Christina Stokes and Sian Harding for reviewing the overall aims and methodology of the study.