Consolidation of laboratories – how to capture benchmarking benefits and avoid the pitfalls
It has long been recognised that the medical laboratory industry can take advantage of economies of scale through greater purchasing power and better utilisation of capacity. The wave of consolidations in the industry began more than 30 years ago in the USA and is now taking place all over the world. In England, the NHS is committed to consolidating the pathology services of 105 hospitals into 29 pathology networks. In Finland, we have already witnessed the mergers and reorganisations of laboratories into large laboratory networks during the last decade. Each of the current five main networks provides end-to-end services across a large geographical area with multiple central hospitals, and with tens or hundreds of health centres or collection sites feeding the hub-and-spoke laboratories.
Laboratory services are at the heart of healthcare, as about 70 per cent of all care decisions involve laboratory diagnostics. Being such an essential part of care pathways, it is vital that consolidations lead to better quality and service levels, not just cost savings. However, there has been considerable controversy over what cost benefits can be expected from these consolidations and how they can be realised.
Besides the obvious cost benefits that follow directly from scale, there are potential pitfalls that are easily overlooked and that can lead to suboptimal outcomes. One of these concerns benchmarking. The networked structure provides an excellent opportunity to benchmark sites, identify areas for improvement, and formalise specialisation and centralisation strategies. However, it is important to realise that high-level metrics, such as cost per result by specialty area, can easily mislead decision making, because hospitals and specialty areas differ widely in the services they provide. These differences skew the metrics and introduce bias and noise into decision making.
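To illustrate the case-mix problem, consider the following minimal sketch in Python. The sites, tests and euro figures are purely hypothetical; the point is that an identical unit cost at every site can still produce a very different headline "cost per result":

```python
# A minimal sketch (hypothetical sites and figures) of how case mix skews
# a high-level "cost per result" benchmark even when unit costs are identical.

# Unit cost per test is the same at both sites (illustrative EUR figures).
unit_cost = {"basic_haematology": 1.50, "molecular_pathology": 45.00}

# The two sites serve different populations, so their test mixes differ.
volumes = {
    "Site A": {"basic_haematology": 90_000, "molecular_pathology": 2_000},
    "Site B": {"basic_haematology": 40_000, "molecular_pathology": 9_000},
}

for site, mix in volumes.items():
    total_cost = sum(unit_cost[t] * n for t, n in mix.items())
    total_results = sum(mix.values())
    print(f"{site}: {total_cost / total_results:.2f} EUR per result")

# Site B looks far more "expensive" per result (about 9.49 vs 2.45 EUR),
# yet neither site is less efficient: every individual test costs exactly
# the same at both. The headline metric reflects case mix, not performance.
```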
Tackling this issue requires benchmarking that is transparent, end-to-end and granular: not dealing in high-level metrics but understanding the true cost at the level of individual tests. This makes it possible to compare apples to apples and to identify the reasons for cost variation. On that basis, networks can build procedures for systematic cost comparisons of services produced within or outside the network.
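In practice, a granular benchmark boils down to comparing the unit cost of the same test across sites and flagging unexplained variation. The sketch below (again with hypothetical sites, tests and costs) shows one simple way to surface candidates for further investigation:

```python
# A sketch (hypothetical data) of a test-level, apples-to-apples benchmark:
# compare the unit cost of the same test across sites and flag outliers.

from collections import defaultdict

site_test_costs = {
    # (site, test) -> unit cost in EUR; figures are illustrative only
    ("Site A", "HbA1c"): 2.10,
    ("Site B", "HbA1c"): 2.20,
    ("Site C", "HbA1c"): 3.40,
    ("Site A", "TSH"): 1.80,
    ("Site B", "TSH"): 1.75,
    ("Site C", "TSH"): 1.85,
}

# Group the unit costs by test so each test is compared only with itself.
by_test = defaultdict(dict)
for (site, test), cost in site_test_costs.items():
    by_test[test][site] = cost

THRESHOLD = 1.25  # flag sites more than 25% above the network's best cost
for test, costs in by_test.items():
    best = min(costs.values())
    for site, cost in costs.items():
        if cost > best * THRESHOLD:
            print(f"{test} at {site}: {cost:.2f} EUR vs network best {best:.2f}")
```

A flagged test is not automatically a problem; it is a pointer to where the underlying cost drivers deserve a closer look.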
As test volumes grow, new tests and technologies emerge, new sites are consolidated into the network, and processes and operating models change, benchmarking becomes a continuous optimisation exercise. Up-to-date, transparent costing information then becomes one of the key inputs for reliable benchmarks and good decision making, driving optimal operating structures and realisable cost savings. Better yet, granular cost information can be used in forward-looking simulations, where different scenarios are modelled before any action is taken. This means modelling every aspect affected by the planned change, such as logistics, staffing, equipment and capacity utilisation.
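As a taste of what such a simulation involves, here is a deliberately simplified sketch of one scenario: centralising a single test from a spoke site to the hub. All names and numbers are hypothetical, and a real model would cover far more cost drivers:

```python
# A minimal scenario sketch (all names and numbers hypothetical): estimate the
# net cost effect of centralising one test from a spoke site to the hub,
# accounting for production cost, added logistics and analyser capacity.

def simulate_centralisation(volume, spoke_unit_cost, hub_unit_cost,
                            daily_transport_cost, working_days,
                            hub_spare_capacity):
    """Return the estimated annual saving (None if the hub lacks capacity)."""
    if volume > hub_spare_capacity:
        # Hub analysers cannot absorb the volume; new equipment would be
        # needed, so this simple sketch rejects the scenario outright.
        return None
    production_saving = volume * (spoke_unit_cost - hub_unit_cost)
    added_logistics = daily_transport_cost * working_days
    return production_saving - added_logistics

saving = simulate_centralisation(
    volume=50_000,              # annual tests moved to the hub
    spoke_unit_cost=4.20,       # EUR per test at the spoke site
    hub_unit_cost=2.90,         # EUR per test at the hub (scale effect)
    daily_transport_cost=120,   # extra courier run, EUR per day
    working_days=250,
    hub_spare_capacity=80_000,  # unused annual analyser capacity at the hub
)
print(f"Estimated annual saving: {saving:,.0f} EUR" if saving is not None
      else "Hub capacity insufficient")
```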
Given the sheer volume of data alone, this is clearly not an exercise for spreadsheets; it calls for a sophisticated costing tool that can handle transactional activity-based costing. With the right approach, laboratory networks gain deep insight at the right level of detail, enabling truly valuable benchmarking and transforming the networks so that they both generate cost savings and continuously meet the growing demands of modern healthcare.
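The core idea of transactional activity-based costing can be shown in a few lines, even though a production system must handle millions of such transactions. In this toy illustration the activities and rates are invented for the example; each result accumulates the cost of the activities its sample actually consumed, rather than receiving a volume average:

```python
# A toy illustration (hypothetical activities and rates) of transactional
# activity-based costing: each test result accumulates the cost of every
# activity it actually consumed, rather than receiving a volume average.

activity_rates = {           # EUR per occurrence of each activity
    "phlebotomy": 1.10,
    "transport": 0.35,
    "centrifugation": 0.15,
    "analysis_chemistry": 0.90,
    "analysis_molecular": 38.00,
    "validation": 0.25,
}

transactions = [
    # Each transaction lists the activities that the individual sample
    # actually went through on its path to a result.
    {"test": "Na", "activities": ["phlebotomy", "transport", "centrifugation",
                                  "analysis_chemistry", "validation"]},
    {"test": "SARS-CoV-2 PCR", "activities": ["phlebotomy", "transport",
                                              "analysis_molecular", "validation"]},
]

for tx in transactions:
    cost = sum(activity_rates[a] for a in tx["activities"])
    print(f"{tx['test']}: {cost:.2f} EUR")
```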
If you’d like to learn more about how we have helped networked pathology organisations like Fimlab and Nordlab, please visit our case studies.