Part 1. Conceptualisation
1. Build a collaborative science team
No individual has all the skills and resources needed to make an AI healthcare project succeed, so a collaborative science team (CST) is essential. The team’s composition may vary but typically includes healthcare professionals (HCPs) and data scientists, and possibly statisticians, project managers and software engineers. This is the most important tip; the shared knowledge of the CST overcomes many of the barriers presented to an AI healthcare project. Some barriers are obvious to a data scientist but not to a HCP, and vice versa. We propose that such teams are the building blocks of what Cosgriff et al have described as the clinical AI departments of the future.6
It is also important to identify the project ‘gatekeepers’ from the outset: senior individuals who may be aware of organisational barriers to success. They could be heads of strategy, department leads or chief informatics officers. Early meetings with gatekeepers are essential; they may recommend additional team members and can access useful resources if a project proposes to change clinical pathways.
2. Engage frequently with the end user
HCPs often have little or no experience with AI. Many will be unfamiliar with the capabilities and limitations of AI,7 which can lead to either unachievable expectations or unfair dismissal of AI from the outset. In addition, HCPs may struggle to specify in advance how they want AI solutions to improve their workflow. Some may never even have mapped out or quantified their daily work. What they say they want may change as the project develops. There is a complex interaction of human factors at play which CSTs must not ignore.
Healthcare AI projects must therefore strive to offer digital transformation through ‘organisational, service and social innovation’ as suggested by Creswell et al.8 The clinical AI departments of the future must be prepared to offer in-reach into the other clinical departments. Defining the problem to be solved may take considerable time and may require CSTs to study clinical workflows carefully, perhaps even shadowing HCPs. In addition to building the solution, consideration should be given to how it will be employed. For example: how it is presented in a human-understandable way,9 its impact on other workflows and on patient–caregiver interaction, and the risks of automation complacency10 and of biasing clinical decision making.4
End user engagement is not a single event. We encourage CSTs to adopt an agile approach, where continuous user engagement with rapid modifications creates healthcare tools of real value. There is a tension here between agile development and ensuring patient safety as an AI solution is implemented and refined.
3. Build collaboration agreements early
Although healthcare organisations and universities frequently collaborate, overarching agreements for multisite working and secure data sharing are rare. Therefore, collaboration agreements need to be developed early to avoid major delays later. Conversely, generating new agreements for every project is inefficient. Creating communities of practice can facilitate dialogue between CSTs, supporting shared solutions or lobbying both organisations for high-level agreements.
Data-flow diagrams are an important part of any collaboration agreement (figure 2). They help design a transparent and detailed application for ethical review, and aid in the formation of a data management plan, which is a prerequisite for collaboration.
Figure 2 An example of a data flow diagram using the Gane and Sarson nomenclature. In this example, multiple disparate sources of data from individual patients in the ICU are aggregated into a single research database. In-built, role-based access controls allow the data to be accessed by multiple different users while meeting data privacy requirements. EPR, electronic patient record; HCP, healthcare professional; ICNARC, Intensive Care National Audit and Research Centre; ICU, intensive care unit; SQL, structured query language.
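The aggregation step shown in figure 2, where disparate patient-level sources flow into a single research database, can be sketched in code. The following is a minimal illustration only, using pandas with invented, anonymised identifiers and column names that are not part of the figure; a real pipeline would draw on the EPR, ICNARC and other sources under the agreed data management plan.

```python
import pandas as pd

# Hypothetical extracts from two disparate ICU data sources,
# keyed on an anonymised patient identifier (illustrative only).
epr_vitals = pd.DataFrame({
    "patient_id": ["p01", "p02", "p03"],
    "heart_rate": [88, 110, 75],
})
lab_results = pd.DataFrame({
    "patient_id": ["p01", "p02"],
    "lactate_mmol_l": [1.2, 3.8],
})

# Aggregate into a single research table. An outer join keeps
# patients present in only one source, so gaps between sources
# are surfaced explicitly rather than records being silently dropped.
research_db = epr_vitals.merge(lab_results, on="patient_id", how="outer")
print(research_db)
```

Note that the join strategy is itself a design decision for the CST: an outer join preserves incomplete records for later missingness analysis, whereas an inner join would quietly exclude them.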
Many healthcare organisations and universities have specific teams to help create collaboration agreements. Such teams can also advise on project costs, intellectual property, publication rights, data-sharing agreements and information governance. While drafting agreements, it is important to identify the data controller, who designs and supervises the project, and data processors, who carry out processing tasks under instruction from the controller.
4. Ethics: present a balanced view
Ethical oversight of AI-driven healthcare research can be problematic. There are numerous tension points that need to be resolved including the role of patient consent in the use of routinely collected healthcare data,11 the potential for under-representation of certain patient groups in AI projects12 and concerns about adaptive algorithms whose performance may change with new data. Most approval committees still have limited experience of AI and the review process may be ill-suited to AI-driven research.
The ethical review process can be improved if CSTs follow best practice such as the Standard Protocol Items: Recommendations for Interventional Trials-AI (SPIRIT-AI) protocol guidelines.13 A moral justification for transitioning to learning health systems supported by data science has been proposed,14 which may also help to inform the review process. A balanced view of AI healthcare projects should be offered; while AI presents ethical challenges, it also offers opportunities to enhance the quality of existing medical evidence in areas where evidence is lacking or subject to bias.12
CSTs should be aware that ethical review may be protracted and should consider seeking review by committees with experience of similar projects.
5. Invest in data science training for healthcare professionals in your team
There is currently a lack of data science training in healthcare education programmes. Where training programmes do exist, they are often optional. The knowledge demanded of HCPs focuses on interpretation of research evidence, screening tests and the output of randomised controlled trials. Although this is an excellent foundation for traditional medical research, it does not provide a common language for data-driven clinicians of the future. Online supplemental appendix 2 offers a glossary to help build this common language.
The Topol Review15 emphasised the need for data science training if NHS staff are to reap the benefits of the digital revolution. There are a range of resources that HCPs can use to learn about the principles of tidy data,16 how to categorise and address missingness in their data17 and to gain an overview of AI techniques and their limitations.7 There are communities of practice18 who can offer support and growing opportunities for additional self-directed learning.19
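As a small illustration of the kind of exercise such training enables, the sketch below quantifies missingness in a toy set of ward observations, a sensible first step before any imputation or modelling decision. The variable names and values are entirely hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical ward observations with the gaps typical of
# routinely collected data (all names and values are invented).
obs = pd.DataFrame({
    "heart_rate": [88, np.nan, 75, 102],
    "resp_rate": [16, 18, np.nan, np.nan],
    "temp_c": [37.1, 36.8, 38.2, 37.0],
})

# Quantify the fraction of missing values per variable, so the
# CST can discuss why data are missing before deciding how
# (or whether) to impute.
missing_fraction = obs.isna().mean()
print(missing_fraction)
```

A summary like this gives HCPs and data scientists a shared starting point for discussing whether values are missing at random or for clinically meaningful reasons, which determines how missingness should be addressed.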
We recommend that HCPs pay attention to these learning needs and use the expertise of data scientists in their team to guide them. Such preparation can prevent errors of understanding and improve interactions throughout an AI healthcare project.