How do trusts and local health economies migrate the container-loads of data from their legacy systems to newer LSP solutions, to enable joined-up patient and client records?
Historically, data migration from legacy systems has proven a headache for IT staff, with information missing, duplicated or locked in a proprietary format that makes working with the data difficult.
To provide data for LSPs, NHS organisations must extract data from their warehouses and legacy systems, validate it, transform it into the required format and port it across to the LSP solution.
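In outline, the extract-validate-transform steps look something like the sketch below. This is purely illustrative: the field names (`nhs_number`, `dob`), the validation rule and the date formats are assumptions for the example, not any trust's or LSP's actual schema.

```python
# A minimal sketch of the extract-validate-transform steps, using
# hypothetical record fields (nhs_number, dob) for illustration.
from datetime import datetime

def validate(record):
    """Reject records with a missing or malformed NHS number."""
    nhs_number = record.get("nhs_number", "").replace(" ", "")
    return nhs_number.isdigit() and len(nhs_number) == 10

def transform(record):
    """Normalise a legacy record into the target format."""
    return {
        "nhs_number": record["nhs_number"].replace(" ", ""),
        # Assume the legacy system stores dates as DD/MM/YYYY and the
        # target format wants ISO 8601 (YYYY-MM-DD).
        "date_of_birth": datetime.strptime(
            record["dob"], "%d/%m/%Y").date().isoformat(),
    }

def migrate(legacy_records):
    """Split extracted records into loadable rows and rejects."""
    loadable, rejects = [], []
    for record in legacy_records:
        (loadable if validate(record) else rejects).append(record)
    return [transform(r) for r in loadable], rejects
```

The key design point is that rejected records are kept, not silently dropped, so they can be corrected at source rather than polluting the target system.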
So how do organisations approach the migration? And at the same time, how can they use the migration to develop a data repository that can be used for fulfilling future requirements for clinical and management reporting?
Usually, this is done in one of two ways. Either data is passed to an external party that will extract, validate and supply the data back to the trust or organisation in the required format; or outsourced staff extract data into a database (e.g. MS Access, SQL Server), and the data is locally analysed, altered and reformatted for delivery to the LSPs.
The first option is pain-free, but incurs costs every time the service is used. What's more, the quality of the data returned is in the supplier's hands. The second option gives the trust more control, but can mean high contractor fees to cover any skills gaps within the trust.
In reality, primary care trusts will want the best of both worlds – data management with validation and transformation rules to make migration of legacy data seamless, supported by training and consultancy, to ensure the skills are transferred to internal IT staff. This would benefit the organisation both for transfer of data to LSPs and in other data management tasks. Here’s how trusts and organisations can strike the right balance.
The management component required for migration, transformation and validation is a separate data repository manager: a tool that handles data mapping and transform-and-load functions.
The repository manager provides an area in which validated and transformed data can be held, and helps to automate the process of loading data into the repository.
Dataflow into the repository should follow a 'quality loop', as follows:

1. Extract data from the legacy system or warehouse.
2. Validate it against the agreed rules, rejecting any records that fail.
3. Return rejected records to the source system for correction and re-submission.
4. Transform the validated data into the required format and load it into the repository.
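A quality loop of this kind can be sketched as a validate-correct-revalidate cycle. The rules, field names and the `correct` hook below are all hypothetical, chosen only to show the feedback shape of the loop.

```python
# A sketch of the quality loop: records that fail validation are fed
# back through a correction step and re-validated, until the batch is
# clean or a pass limit is reached. Rules and fields are illustrative.
RULES = {
    "nhs_number": lambda v: v.isdigit() and len(v) == 10,
    "surname": lambda v: bool(v.strip()),
}

def failures(record):
    """Return the names of fields that break a validation rule."""
    return [f for f, rule in RULES.items() if not rule(record.get(f, ""))]

def quality_loop(batch, correct, max_passes=3):
    """Validate, feed rejects back through `correct`, and repeat.

    Returns (repository, rejects): the records accepted into the
    repository, and those still failing at the last validation pass.
    """
    repository, rejects = [], []
    for _ in range(max_passes):
        rejects = []
        for record in batch:
            (rejects if failures(record) else repository).append(record)
        if not rejects:
            break
        batch = [correct(r) for r in rejects]  # back to source for fixing
    return repository, rejects
```

In practice the "correction" step is a person or a source-system fix rather than a function, but the shape is the same: nothing enters the repository without passing validation, and nothing that fails is lost.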
This approach – embodied in Ardentia's Data Management Repository Manager – meets the short-term requirement to transfer cleansed data to the LSP, and provides a data repository that can fulfil future requirements for clinical and management reporting.
This local repository can deliver historical reporting and trend analysis, adding value to the legacy data. For example, data from other systems, such as pathology, theatre and A&E systems, can be extracted and imported into the repository. This also enhances return on investment, combining data from multiple sources for more comprehensive analysis and reporting.
With these points in mind, trusts can be sure that they’re on the right migration path.