Does the integration of response services lead to meaningful change in healthcare activity? A case study evaluation

Sebastian Hinde (Centre for Health Economics, University of York, York, UK)
Jo Setters (York Health Economics Consortium, York, UK)
Laura Bojke (Centre for Health Economics, University of York, York, UK)
Nick Hex (York Health Economics Consortium, York, UK)
Gerry Richardson (Centre for Health Economics, University of York, York, UK)

Journal of Integrated Care

ISSN: 1476-9018

Publication date: 20 June 2019

Abstract

Purpose

The aim of the NHS England Vanguards of new care models was to improve healthcare provision and integration through the coordination of services, seeking to deliver the Five Year Forward View. The purpose of this paper is to report on an extensive analysis of one of the Vanguard programmes, exploring whether the implemented integrated response service (IRS) based in Harrogate, England, resulted in any meaningful change in secondary healthcare activity.

Design/methodology/approach

The authors used an interrupted time series framework applied to aggregate secondary care data, specifically emergency attendances for patients 65+, emergency bed days for all adults and non-elective admissions for 65+. Synthetic and geographic comparator data were employed to inform additional scenario analyses.

Findings

The majority of the analyses conducted found no statistically significant effect of the IRS team in either direction, suggesting that there was no change in the metrics that could be separated from natural variation. These findings were consistent with those of a qualitative analysis, with the challenges faced in staffing the team towards the end of the analysis period and with the eventual disbanding of the IRS.

Research limitations/implications

The analysis was hampered by data access challenges, which restricted it to poorly specified aggregate secondary care data, and by a poorly defined intervention. Furthermore, the follow-up period was cut short by the disbanding of the service.

Originality/value

This analysis indicates that the Harrogate-based IRS team is unlikely to have delivered any sustained quantifiable impact on the intended secondary care outcomes. While this does not necessarily demonstrate a failure of the core principle behind the drive for integrated care, it is an important exploration of the challenges of evaluating such a service.

Citation

Hinde, S., Setters, J., Bojke, L., Hex, N. and Richardson, G. (2019), "Does the integration of response services lead to meaningful change in healthcare activity? A case study evaluation", Journal of Integrated Care, Vol. 27 No. 3, pp. 193-203. https://doi.org/10.1108/JICA-03-2019-0009

Publisher: Emerald Publishing Limited

Copyright © 2019, Emerald Publishing Limited


Introduction

In 2014/2015, NHS England (2016) funded 50 “Vanguards” of new care models in the NHS with the aim of improving coordination and the level of care provided across primary, secondary and community services. The core premise was that a number of areas in England would spearhead the goals of the NHS Five Year Forward View (NHS England, 2014, 2017a) and stimulate a national shift towards more integrated care through local action (Iacobucci, 2014, 2015), an aspiration that has been restated in the NHS Long-Term Plan (NHS England, 2019). Five forms of new care model were selected: the integration of primary and acute care systems (PACS), multispecialty community providers, enhanced health in care homes, urgent and emergency care, and acute care collaborations (NHS England, 2016). While the funding for the three-year projects and some commissioning and evaluative support came from central government (NHS England, 2015a, b), the service design and implementation of each project were led by local decision makers and practitioners.

This study reports the findings of one of the nine integrated PACS models, the HaRD Vanguard (2018), which aimed to transform care provision through cooperative action of GPs, community services, hospitals, mental health and social care staff across six partner organisations, bringing them together as part of an integrated service. In this manuscript, we discuss attempts to conduct a quantitative evaluation of the Harrogate and Rural District (HaRD) programme, exploring the methodological and structural challenges of conducting such analyses, specifically the challenges relating to the availability of data and attributing causality.

Interventions aimed at facilitating integrated care are by no means new (Campbell et al., 1998) and have been identified through systematic reviews as having the potential to result in positive effects on service quality and efficiency (Allen et al., 2009). However, the broad nature of what constitutes such interventions, and the challenges of implementing (Starling, 2018) and evaluating (Tsiachristas et al., 2016) them, have led to limited progress at a local level, with inefficient and even duplicated care provision common (NHS England, 2014, 2016). Localised evaluations of real-world interventions have been proposed as a means of exploring what does and does not bring about effective change (Tsiachristas et al., 2016).

Structure of the HaRD Vanguard

The HaRD Clinical Commissioning Group (CCG) is responsible for commissioning care for a population of 160,000, geographically focused around a number of market towns (Harrogate, Knaresborough and Ripon). While the area is relatively affluent, with 58 per cent of it falling within the least deprived two quintiles of the Index of Multiple Deprivation (IMD), it contains pockets of significant deprivation that fall within the most deprived 10 per cent of England (Ministry of Housing, CLG, 2015). The population is also predominantly elderly, with 1 in 5 residents over 65 years of age, 10 years ahead of the national ageing curve and projected to increase to 1 in 3 by 2030 (HaRD CCG, 2015).

The HaRD CCG concluded that its existing model of service delivery was unsustainable (HaRD Vanguard, 2018), estimating that 20 per cent of acute admissions and 67 per cent of occupied acute beds had the potential to be managed in a less acute setting, including at home. A lack of joined-up care and the duplication of care further motivated a redesign of the core care pathway.

The HaRD Vanguard programme was launched with the key intention of removing existing boundaries between primary, community, acute, mental health and social care, with the expectation that this would improve the quality and sustainability of services both clinically and financially, largely by reducing unnecessary hospital admissions and attendances.

Throughout the funding period (2015/2016–2017/2018), the HaRD Vanguard attempted to facilitate integration of care through a number of activities, including the setting up of community care teams, training programmes and the creation of an integrated response service (IRS) pop-up team. From November 2016, four community care team hubs were created. Details of the evolution of the programme are published elsewhere (HaRD Vanguard, 2018; Ariss et al., 2018). This initial attempt at care integration was stopped in March 2017 for a number of reasons, including failures to engage GPs, a poorly defined set of aims for the teams and insufficient initial planning leading to poor buy-in from partner groups (HaRD Vanguard, 2018; Thomas, 2017).

On 15 May 2017, a new attempt at integration was launched, aiming to build on the learning of the community care team hubs. The IRS team was an attempt to implement a genuine integration of mental health, social care, primary care and community health services in a single team (HaRD Vanguard, 2018). The IRS team had shared access to patient records, daily care discussions, joint visits and co-location at a single site. It was piloted to accept referrals from three self-selecting GP practices within the HaRD area, covering 44,120 of the 160,000 registered patients in the HaRD area. The key aims of the IRS were to offer a seven-days-a-week rapid response intervention from an interdisciplinary team and, wherever possible, to avoid hospital admission. The skill mix included mental and physical health nursing, physiotherapy, pharmacy and social care. The team consisted of 15 active staff members, not including two roles that were never filled due to an inability to identify suitable candidates; the breakdown of staff is detailed in the Supplementary Appendix, published elsewhere (available at: http://eprints.whiterose.ac.uk/146055/). Patients identified as high need, with no specific criteria set, could be referred to the team by any of the local care partners, at which point an integrated care plan was initiated by the team. On 8 January 2018, referrals were stopped, as sustained staffing was no longer possible. Final estimates of the number of patient interactions were not recorded, but in the first 193 days (until 24 November 2017) of the 239 days of IRS funding, the team saw 279 patients.

Mixed method evaluation

Throughout the programme, the HaRD Vanguard was engaged with a multi-disciplinary team of academic researchers, contracted to conduct a mixed method evaluation of the programme, in line with guidance from NHS England (2017b) regarding Vanguard evaluation. This evaluation consisted of a theory-led qualitative evaluation and a quantitative economic evaluation. Due to the rapid evolution of the HaRD Vanguard, contractual challenges between the evaluation team and the Vanguard, and the failure of the initial community care team, the evaluation focused on the IRS portion of the programme.

Findings of the qualitative evaluation are published elsewhere (Ariss et al., 2018) but, in brief, the analysis aimed to understand the underlying programme theory of the Vanguard and explore the implementation process from a multi-stakeholder perspective through a series of interviews, surveys and observations. The analysis concluded that while the IRS team provided a “gold standard” level of integrated care, the ambiguity surrounding many staff roles, the lack of clear referral criteria or service specification, and uncertain onward referral pathways greatly limited the potential of the service and its ability to effect change. These limitations were reflected in low satisfaction and high levels of uncertainty among team members surveyed through a workforce dynamics questionnaire. Throughout, the evaluation team struggled to engage patient representatives or collect survey data, with only a single patient experience survey returned, limiting the scope of the qualitative analysis.

The methods and results of the quantitative analysis are reported below; the findings should be viewed in light of the challenges and limitations identified by the qualitative study.

Methods

This evaluation considers the role of the HaRD IRS in effecting a change in NHS utilisation, using available secondary care activity data. To explore the impact of the IRS team on NHS activity, we conducted an interrupted time series (ITS), or “segmented regression”, analysis which is considered a robust statistical approach in such settings (Linden, 2015; Bernal et al., 2016). We sought to test the hypothesis that:

H1.

The creation of the IRS team would lead to a decrease in secondary care activity levels in patients in the intervention area.

The analytical framework

The ITS method considers the trend in an outcome of interest over time, shown by the solid line in Figure 1, segmenting it into the period before the IRS team was active and the period after it. Using the framework described by Bernal et al. (2016), we defined the regression model a priori as being associated with a level and slope change in each of the analysed metrics, using the following regression as outlined elsewhere (Bernal et al., 2016; Linden, 2015):

Y_t = β_0 + β_1 T_t + β_2 X_t + β_3 X_t T_t + ε_t

In its simplest form, by assuming that the slope of the outcome data (β_1) during the historic control period is indicative of the expected trend had the intervention not been implemented (the dotted line in Figure 1), it is possible to estimate the impact of the intervention by comparing the two. By considering the change in the observed outcome at the point of intervention as both a one-off short-term change (β_2) and a longer-term change in the slope (β_3), the analytical method is flexible to different rates of impact. The analysis was conducted on a monthly time interval, where T_t denotes the time (in months) since the start of the series, X_t is a dummy variable indicating the post-intervention period and ε_t is the error term.

In addition to the “standard” ITS assumption regarding the counterfactual, it is possible to directly incorporate comparator data using the method reported in detail by Linden (2015), through the equation:

Y_t = β_0 + β_1 T_t + β_2 X_t + β_3 X_t T_t + β_4 Z + β_5 Z T_t + β_6 Z X_t + β_7 Z X_t T_t + ε_t

In the above equation, Z is a dummy variable distinguishing the intervention area (Z = 1) from the comparator (Z = 0), and the additional terms are interactions between Z and the time and intervention variables, describing the features of the comparator population. The Linden method allows us to compare the similarities between the intervention and control areas in the pre-intervention period, adjusting the outcome regression if needed to ensure a good match. Once a good match is achieved, we can use the post-intervention outcome in the control area to directly represent our counterfactual, and therefore estimate the impact of the IRS team.

In addition to these analyses, which assume linear trends on a natural scale, during the interim analysis period we explored different model specifications, including conducting the analyses on a log scale, removing the β_2 and β_3 elements of the ITS equation, exploring an exponential trend for the β_3 parameter, and lagging the effect of the IRS team launch date on the outcome. However, these specifications failed to provide more robust models and were not considered informative by the Vanguard team, so they were not carried through to the final analysis.

The data

As identified by Tsiachristas et al. (2016), the evaluation of integrated services, such as the IRS in this evaluation, is typically reliant on observational evidence with poor comparator data. In this analysis, the use of observational data was inevitable due to the limited funding and time over which the service could be conducted and evaluated, making any other form of data collection impossible. To overcome these issues, we sought to supplement the available observational data with additional data to inform the causal impact of the IRS team, including historical data on the intervention area to generate a historical comparator.

Data were provided directly by the HaRD Foundation Trust on three metrics agreed with the Vanguard programme prior to the commencement of our analysis, with those aged 65+ used as a proxy for the high-need referral criterion stipulated by the Vanguard team:

  • accident and emergency attendances for those aged 65+;

  • emergency bed days all adults (18+); and

  • non-elective admissions for those aged 65+.

Attempts were also made to collect data on GP and mental health activity; however, it was not possible to collect sufficient data to inform a meaningful ITS analysis, primarily due to a lack of readily available pre-intervention data.

The data consisted of aggregate summaries of a range of secondary care activities, stratified by the GP practice at which the patient was registered, specifically whether the patient was at a practice eligible for the IRS or in the rest of the HaRD area. Historic data were provided for 16 months before the launch of the IRS (from January 2016) and 8 months after the launch (until the end of the IRS); this level of data was available for the intervention area and the two comparator data sets detailed below.

Due to the lack of specific referral criteria to the IRS team and no flagging of patients who were referred in the secondary care data, it was not possible to directly identify patients who had been seen by the IRS. As a result, the intervention arm of our analysis is the resident population of the three GP practices that were covered by the IRS team, with the rest of HaRD used as a comparator region.

As the whole of HaRD was included in the original plans for the Vanguard, was subject to the initial community care team intervention, and experienced wider structural changes, including the launch of the Supported Discharge Service at the Harrogate Foundation Trust in August 2017, it may not represent a fair comparator to the IRS team. Therefore, a synthetic comparator was included in this analysis to explore an alternative realisation of the counterfactual. The synthetic area was created by NHS England (2015a) as part of their support for the local evaluations, matching the demographic profile of the HaRD region to areas not in the Vanguard programme over the period April 2012 to March 2015, using existing methods to produce an estimate of the counterfactual (Abadie et al., 2015; Crosbie, 2018; Crosbie, unpublished). Unfortunately, the synthetic comparator was not created to inform the emergency department attendances analysis, making only the analyses presented in Table I possible.

When using a comparator in an ITS analysis, the question of comparability of the observed data must be considered. For a comparator to be considered meaningful, the value of the metrics must be comparable between both comparator and intervention area, with any change that occurs in a metric that is not related to the intervention having equal impact on both areas. Furthermore, the launch of an intervention should only impact the metric of interest in the intervention area, and the pre-intervention line (β1) shown in Figure 1 must be statistically similar for both areas.

While both the “rest of HaRD” and “synthetic” controls were selected as theoretical matches to the GP practice areas covered by the IRS, significant differences were observed in the pre-IRS launch data for both controls. This was in spite of a comparison of the demographics listed in the Public Health England’s National General Practice Profile (Public Health England, 2018), suggesting the three GP practice areas were reasonable matches to the rest of HaRD. It was therefore concluded that the difference must be the result of unobserved factors, making their inclusion into any regression analysis challenging.

Therefore, in the base-case analyses, adjustment of all contemporaneous comparator data was deemed necessary to ensure the validity of these analyses, as the alternative was to exclude all comparator evidence (equivalent to the “ITS – no comparator” analysis) or to use evidence known to be biased. This crude method adjusts the observed level of the metric in the pre-intervention comparator area such that the estimated regression line is the same for both the intervention and comparator areas; it therefore artificially adjusts the observed data to create a well-matched comparator to the intervention area. In this paper, we focus on the outcome of the analyses after regression adjustment has been conducted, as in all cases the unadjusted comparison failed the test of comparability in the pre-IRS period; the unadjusted analyses are presented in the Supplementary Appendix, published elsewhere.

Cost impact of change in metric

The results of the ITS analyses conducted are primarily presented in terms of the change and statistical significance of the coefficient which estimates the change in the slope of the regression line from before to after the intervention. As an extension of our analysis, at the request of the HaRD Vanguard, we extended this approach to consider the net cost impact of the IRS over the analysis period. To do so, we used the regression analyses to calculate a predicted total increase or reduction in each metric, for example a count of non-elective admissions saved as a result of the IRS team launch. To these estimates, we applied a unit cost for each metric to calculate the net cost impact of the change in outcome to the NHS as a result of the IRS team.

The unit costs used to estimate the total cost impact are reported in Table II, using categories from the NHS Reference costs (2015/2016) (Department of Health and Social Care, 2016) which most closely match the recorded referrals made to the IRS team (44 per cent of all referrals were for falls and mental health related care needs).
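The cost-impact step is a simple weighted sum, which can be illustrated as follows. The unit costs are those in Table II, but the estimated metric changes below are invented for illustration rather than taken from the regression outputs:

```python
# Unit costs from Table II (NHS Reference costs 2015/2016)
unit_costs = {
    "ED attendances (65+)": 138.0,
    "NEL admissions (65+)": 1339.0,
    "Emergency bed days (18+)": 336.0,
}

# Hypothetical estimated changes in each metric over the analysis period,
# standing in for the predicted totals from the ITS regressions
estimated_change = {
    "ED attendances (65+)": 32,
    "NEL admissions (65+)": 27,
    "Emergency bed days (18+)": 28,
}

# Net cost impact to the NHS of the change in outcomes; note that this
# excludes the cost of commissioning and running the IRS team itself
net_cost_impact = sum(unit_costs[m] * estimated_change[m] for m in unit_costs)
```

A negative net_cost_impact would indicate a saving to the NHS from reduced activity, which would then need to be set against the cost of funding the service.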

Results

The results of the analyses are presented in Figure 2, with the outputs of the ITS regressions, the total change in the metrics over the analysis period and the estimated total cost presented in Table III. It is important to note that while it is possible to estimate the change in a metric from the ITS analyses, without statistical significance of the regression and a robust narrative behind the change, the estimates can be meaningless or potentially misleading. While we present the estimated change for all of the analyses conducted, they must be interpreted alongside the statistical significance of the regression and the nature of the comparator evidence used.

Across all of the analyses, statistically significant changes in the coefficients were observed in only two of the emergency bed day regressions: the analysis without a comparator (A3) and that with the synthetic control (C2). While these two bed day analyses result in different total impacts on the metric (an increase of 28 and a decrease of 288, respectively), the consistent statistical significance of the regression outputs indicates that the IRS team had some identifiable impact on the metric. However, when considered alongside the trend over time, as shown in Figure 2, and the known challenges faced by the IRS team over time, it becomes clear that any beneficial change is driven by an initial decrease in the bed days metric that was eroded over the analytical time period.

Furthermore, when the comparison with the rest of HaRD is considered, we see that a similar fall in bed days may have occurred throughout both the intervention and control areas in HaRD, suggesting that this may have been the result of some unobserved confounder, unrelated to the IRS team. Therefore, taken as a whole, we do not believe our analyses demonstrate any evidence of a statistically significant and narratively robust impact of the IRS team on the secondary care metrics of interest.

Discussion

In this analysis, we have explored the potential impact of an IRS team across a number of key metrics, using a range of sources of evidence to inform the counterfactual. The majority of the analyses conducted found no statistically significant effect of the IRS team, suggesting that there was no change in the metrics that could be separated from natural variation. Some evidence was observed of an initial impact of the team in reducing emergency bed days; however, any change was eroded over the course of the analysis and the change was not statistically significant when the rest of HaRD was used as a comparator area. When these findings are combined with the £500,000 cost of funding the IRS over the full period (HaRD IRS Manager, 2017), they highlight a lack of support for the initial hypothesis that the IRS team would cause a reduction in secondary care activity that would outweigh the cost of commissioning the service.

While this analysis suggests the team had no overall impact on the metrics analysed over the time period, it is important to consider the results alongside a qualitative understanding of the team. In all three metrics, an initial decrease after the launch of the IRS team appears to have been followed by a consistent increase to levels greater than the pre-launch baseline. Considered alongside the known issues faced by the team, including an inability to maintain the required level of staffing and a lack of clarity about referral criteria and pathways (HaRD Vanguard, 2018; Ariss et al., 2018), an overall picture emerges of an intervention that had the potential to lead to beneficial outcomes, as envisaged in the initial NHS England (2016) plan, but was unable to cause a sustained beneficial change to local care provision throughout the analysis period.

While this result may be evidence of a limited impact of the IRS as it was implemented, it may also be the result of limitations of this analysis to identify the full impact of the team. Tsiachristas et al. (2016) explored the challenges of conducting economic evaluations of integrated care interventions, recognising a number of challenges, including identifying the comparator, the observational nature of data generation relating to integrated care, the typically short period of evaluation, a lack of clarity of suitable outcome measures, and the measurement and valuation of costs. These are all challenges that were faced in this evaluation, and while some have been addressed through the construction of extensive comparator scenarios, some have not.

For example, regarding the nature of the data generation, all of the metrics analysed consist of secondary care activity levels defined by the HaRD CCG as of direct policy interest, but at a relatively poor level of disaggregation. This implies that any change seen at a primary or social care level would not be observed in this analysis, and that any change seen in secondary care may be lost in the poor specificity of the metrics. Efforts were made to improve the quality of the data in two ways. First, attempts were made to reduce the scale of the data provided by the NHS Trust to include only patients who were expected to have interacted with the IRS team. However, it was not possible to individually identify patients nor to reduce the eligible population, due to the wide scope of the IRS team and the lack of patient flagging within the data. Second, an exploration was made of the ability of members of the IRS care team to estimate the impact of their activity in avoiding unnecessary hospital admissions and attendances. However, as these data were self-reported by the team and lacked a comparator, they were not considered sufficiently robust to analyse in detail, although they indicated a perceived substantial impact of the team's work in reducing unnecessary admissions and lengthy hospital stays.

Conclusions

While the conclusion of this analysis must be that, given the available evidence, there is no indication that the IRS intervention in HaRD has beneficially affected the levels of healthcare activity, there are caveats. Limitations in the available evidence mean that any potential impact of the IRS may have been missed due to the insensitivity of the data available. Analyses using frameworks such as those discussed here ideally require either access to richer data, enabling the identification of specific patients who interacted with the service, or a bigger intervention population, which may be achieved by the national Vanguard evaluation.

Furthermore, the challenges faced on the ground by the IRS team are in many ways reflected in these findings, as possible initial reductions in the metrics were eroded over time, as the team began to struggle with workforce issues.

While there may be theoretical benefits of integrated care systems as envisaged in the Vanguard launch (NHS England, 2016; Starling, 2018) and by the HaRD Vanguard (2018) team, the reality at a local level is often different, with many barriers to such change, both financial and structural. It is vital, therefore, that any attempts to achieve the conjectured benefits in the future are accompanied by robust evaluation, in order to ensure that the costs of such service redesign can be justified by the gains to the healthcare sector and its patients. Such an evaluation should consist of detailed qualitative and quantitative analyses, with both integrated into the service delivery plan throughout the programme.

Figures

Figure 1: ITS analytical method

Figure 2: ITS regression results across all data sets on a scale of outcome of interest per 1,000 population

Table I: Analyses that were possible given the available data

| Metric | ITS analysis – no comparator | ITS analysis – rest of HaRD comparator | ITS analysis – synthetic comparator |
| ED attendances (65+) | Yes | Yes | No |
| NEL admissions (65+) | Yes | Yes | Yes |
| Emergency bed days (18+) | Yes | Yes | Yes |

Table II: Unit costs used to estimate the cost impact of the IRS

| Metric | Unit cost | Source and notes |
| ED attendances (65+) | £138 | Reference costs 2015/2016 (Department of Health and Social Care, 2016). Average of all adult emergency medicine categories weighted by the frequency of occurrence across the four types of emergency medicine centre (emergency department, A&E and other types such as minor injury units and walk-in centres). Does not include any follow-on care, which is considered under, e.g., admissions |
| NEL admissions (65+) | £1,339 | Reference costs 2015/2016 (Department of Health and Social Care, 2016). Average of: (1) a frequency-weighted average of all adult injury and fracture related needs (£1,054); and (2) frequency-weighted mental health HRGs (£1,634) |
| Emergency bed days (18+) | £336 | Reference costs 2015/2016 (Department of Health and Social Care, 2016). Average of: (1) a frequency-weighted average of all adult injury and fracture related needs (£284); and (2) frequency-weighted mental health HRGs (£387) |

Table III: Overall results of the different analytical approaches and resultant cost estimates

| No. | Scenario | Short-term change in outcome | Change in slope | Estimated change in metric over period | Estimated change in cost, not including cost of the team |
| GP only ITS | | | | | |
| A1 | ED attendances (65+) | −1.031956 | 0.4247 | 32 | £4,359 |
| A2 | NEL admissions (65+) | −1.083746 | 0.4206 | 27 | £36,146 |
| A3 | Emergency bed days (18+) | −4.042384** | 1.1778*** | 28 | £9,477 |
| GP and HaRD ITS comparator, regression adjusted | | | | | |
| B1 | ED attendances (65+) | 1.992762 | −0.0312 | 131 | £18,068 |
| B2 | NEL admissions (65+) | 2.225563 | 0.168 | 196 | £261,876 |
| B3 | Emergency bed days (18+) | 0.3514153 | 0.685 | 970 | £326,006 |
| GP and synthetic comparator, regression adjusted | | | | | |
| C1 | NEL admissions (65+) | −1.109747 | 0.3451 | 7 | £9,131 |
| C2 | Emergency bed days (18+) | −3.932133* | 0.8902** | −288 | −£96,824 |

Notes: *, **, *** significant at the 10, 5 and 1 per cent levels, respectively

References

Abadie, A., Diamond, A. and Hainmueller, J. (2015), “Comparative politics and the synthetic control method”, American Journal of Political Science, Vol. 59 No. 2, pp. 495-510.

Allen, D., Gillen, E. and Rixson, L. (2009), “The effectiveness of integrated care pathways for adults and children in health care settings: a systematic review”, JBI Library of Systematic Reviews, Vol. 7 No. 3, pp. 80-129.

Ariss, S., Bojke, L., Fowler-Davis, S., Hex, N., Hinde, S., Lowrie, K., Nasr, N., Richardson, G., Scott, E. and Setters, J. (2018), “Harrogate and Rural District new care model Vanguard – final evaluation report”, available at: http://clahrc-yh.nihr.ac.uk/our-themes/health-economics-and-outcome-measurement/resources (accessed 24 April 2019).

Bernal, J.L., Gasparrini, A. and Cummins, S. (2016), “Interrupted time series regression for the evaluation of public health interventions: a tutorial”, International Journal of Epidemiology, Vol. 46 No. 1, pp. 348-355.

Campbell, H., Hotchkiss, R., Bradshaw, N. and Porteous, M. (1998), “Integrated care pathways”, BMJ, Vol. 316 No. 7125, pp. 133-137.

Crosbie, J. (2018), RE: Personal Communication with James Crosbie, NHS England, Leeds, 19 February.

Crosbie, J. (unpublished), “Developing and monitoring performance of the NCM programme through the application of synthetic control regions and the amalgamation of statistical process control methodologies”.

Department of Health and Social Care (2016), “NHS reference costs 2015 to 2016”, Department of Health and Social Care, London.

HaRD CCG (2015), Equality Information-Protected Characteristics: Definitions, Demographics & Health Inequalities 2015-16, HaRD CCG, Harrogate.

HaRD IRS Manager (2017), “RE: personal communication with HaRD IRS manager”, 8 October.

HaRD Vanguard (2018), “Sharing the biscuits: lessons from Harrogate’s new care model vanguard experience”, HaRD CCG, Harrogate, available at: www.harrogateandruraldistrictccg.nhs.uk/data/uploads/integrated-care/sharing-the-biscuits-22.02.18-amended-version-final-250418-docx.pdf (accessed 30 May 2019).

Iacobucci, G. (2014), “NHS plan calls for new models of care and greater emphasis on prevention”, BMJ, Vol. 349 No. g6430, p. 1.

Iacobucci, G. (2015), “NHS England announces 29 sites to spearhead integrated care models”, BMJ, Vol. 350 No. h1362.

Linden, A. (2015), “Conducting interrupted time-series analysis for single- and multiple-group comparisons”, The Stata Journal, Vol. 15 No. 2, pp. 480-500.

Ministry of Housing, CLG (2015), “English indices of deprivation 2015”, Ministry of Housing, Communities & Local Government, London.

NHS England (2014), NHS Five Year Forward View, NHS England, London.

NHS England (2015a), The Forward View into Action: New Care Models: Support for the Vanguards, NHS England, London.

NHS England (2015b), The Forward View into Action: New Care Models: Update and Initial Support, NHS England, London.

NHS England (2016), New Care Models: Vanguards – Developing a Blueprint for the Future of NHS and Care Services, NHS England, London.

NHS England (2017a), Delivering the Forward View: NHS Planning Guidance 2016/17–2020/21, NHS England, London.

NHS England (2017b), Evaluation Strategy for New Care Model Vanguards, NHS England, London.

NHS England (2019), NHS Long Term Plan, NHS England, London.

Public Health England (2018), “Public health profiles”, Public Health England, London, available at: https://fingertips.phe.org.uk/ (accessed 1 March 2018).

Starling, A. (2018), “Implementing new models of care: lessons from the new care models programme in England”, International Journal of Care Coordination, Vol. 21 Nos 1-2, pp. 50-54.

Thomas, R. (2017), “Vanguard turned to plan B after first model ‘didn’t work’”, available at: www.hsj.co.uk/nhs-harrogate-and-rural-district-ccg/vanguard-turned-to-plan-b-after-first-model-didnt-work/7020192.article (accessed 7 May 2019).

Tsiachristas, A., Stein, K.V., Evers, S. and Rutten-Van Molken, M. (2016), “Performing economic evaluation of integrated care: highway to hell or stairway to heaven?”, International Journal of Integrated Care, Vol. 16 No. 4, pp. 1-12.

Acknowledgements

The research was part-funded by the NIHR CLAHRC Yorkshire and Humber (www.clahrc-yh.nihr.ac.uk). The views expressed are those of the author(s), and not necessarily those of the NHS, the NIHR or the Department of Health and Social Care. The authors would like to express their gratitude to the HaRD CCG, the Vanguard care team and the rest of the Vanguard evaluation team.

Corresponding author

Sebastian Hinde can be contacted at: sebastian.hinde@york.ac.uk