# Assessment of automated clinical trial recruitment and enrolment using patient-facing technology

Naomi S Bardach, Regina Lam, Carolyn B Jasik

## Abstract

**Objective** Interactive patient care systems (IPCS) at the bedside are becoming increasingly common, but evidence is limited as to their potential for innovative clinical trial implementation. The objective of this study was to test the hypothesis that the IPCS could feasibly be used to automate recruitment and enrolment for a clinical trial.

**Methods** In medical-surgical units, we used the IPCS to randomise, recruit and consent eligible subjects. For participants who did not interact with the IPCS study materials within 48 hours, study staff initiated recruitment in person. The eligible study population included all caregivers and any patients >6 years old admitted to medical-surgical and oncology units from September 2015 to January 2016. Outcomes were: randomisation, assessed using between-group comparisons of patient characteristics; recruitment success, assessed by rates of consent; and paperless implementation, assessed by successful acquisition of an electronic signature and email address. We used χ2 analysis to assess success of randomisation and recruitment.

**Results** Randomisation was successful (n=1012 randomised, p>0.05 for all between-group comparisons). For the subset of eligible, randomised patients who were recruited, IPCS-only recruitment (2.4% of n=213 consented) was less successful than in-person recruitment (61.4% of n=87 eligible, p<0.001). For those consenting (n=61), 96.7% provided an electronic signature and 68.9% provided email addresses.

**Conclusions** Our results suggest that as a tool at the bedside, the IPCS offers key efficiencies for study implementation, including randomisation and collecting e-consent and contact information, but does not offer recruitment efficiencies. Further research could assess the value that interactive technologies bring to recruitment when paired with in-person efforts, potentially focusing on more intensive user-interface testing for recruitment materials.

**Trial registration number** NCT02491190.

* patient care
* health care
* computer methodologies
* medical informatics

### Summary

#### What is already known?

* Interactive patient care systems are becoming more popular and are used for a variety of patient-oriented interactions, including not only entertainment, but also delivery of patient education videos, survey questions, food ordering and communication with providers.

#### What does this paper add?

* This paper reports on the use of an interactive patient care system to automate clinical trial tasks, focusing particularly on the success of randomisation, recruitment, and collection of consent and contact information. We found that the system was successful for randomisation and for collection of consent and contact information, but performed poorly as a stand-alone recruitment method.
## Introduction

Consumer-facing health technology has the potential to revolutionise care, re-orienting traditional provider-centric models of care delivery [1, 2]. In the inpatient setting, interactive patient care systems (IPCS) at the bedside such as GetWell Network, myStation and OneView provide personal health information, educational materials and patient engagement features to optimise patient–provider communication, in addition to on-demand entertainment [3–6]. These systems have a substantial national presence, implemented in almost 40 000 beds in 2013 [7, 8]. IPCS adoption will likely increase as hospitals seek to meet federal and local demands for deeper and more meaningful patient engagement [9].

With increasing pressure to integrate point-of-care patient engagement technologies into the clinical workflow [1], the need for data on their usefulness is becoming more urgent [10]. To gather these data, we may be able to leverage the technologies themselves to automate trial implementation, realising efficiencies over traditional in-person research staffing. Though limited, prior research suggests that technology platforms can streamline research processes and perform as well as, if not better than, paper methods [11]. For instance, tablet-delivered digital multimedia study materials have improved understanding of clinical trials during paediatric patient recruitment [12], and electronic health record platforms have been used for recruitment [13]. However, no study to our knowledge has examined the use of an IPCS to automate the multiple aspects of a clinical study.

In this report, we present data on our experience with a pragmatic test of IPCS clinical trial implementation. The parent trial assessed the effect of a patient-facing and family-facing educational video on patient experience metrics. For implementation of the parent trial, we tested the IPCS for patient identification, randomisation, recruitment and consenting, with a secondary plan for in-person recruitment and consenting for patients who did not interact with the IPCS study materials. The objective of this study was to assess the feasibility of using the IPCS to automate aspects of a randomised clinical trial, including: (1) identifying and (2) randomising eligible patients, and (3) recruiting and (4) consenting participants, including gathering electronic signatures and disseminating consent forms via email.

## Methods

### Setting

The participating hospital opened in February 2015 with an IPCS at 183 beds at the new site. We worked with the IPCS vendor, OneView Healthcare, to plan the workflow (figure 1) and the features needed for a study assessing the effect of a patient engagement video.

Figure 1. Workflow for automated clinical trial implementation.

### Technology

The IPCS had several features already in place that supported the study implementation:

* a working interface with the hospital's admission, discharge and transfer (ADT) system;
* a patient education portal, available on the home page, which displayed a flag for assigned education until the education had been viewed;
* the ability, within the portal, to serve videos or weblinks;
* the ability to auto-assign patient education based on patient criteria.
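To make the auto-assignment capability concrete, the following minimal Python sketch illustrates how criteria-based education assignment with a "flag until viewed" behaviour might work. It is an illustrative sketch only: the class names, unit labels and item identifier are hypothetical assumptions, not OneView's implementation.

```python
"""Minimal sketch of criteria-based education auto-assignment (hypothetical names;
not the vendor's implementation)."""

from dataclasses import dataclass, field
from datetime import date


@dataclass
class Admission:
    """Admission event as it might arrive from the ADT interface."""
    patient_id: str
    unit: str
    admit_date: date


@dataclass
class EducationPortal:
    """Per-patient education queue; the home page shows a flag until items are viewed."""
    assigned: dict = field(default_factory=dict)  # item_id -> viewed (bool)

    def assign(self, item_id: str) -> None:
        self.assigned.setdefault(item_id, False)

    def mark_viewed(self, item_id: str) -> None:
        self.assigned[item_id] = True

    @property
    def show_flag(self) -> bool:
        # Flag stays on while any assigned item remains unviewed.
        return any(not viewed for viewed in self.assigned.values())


# Assumed assignment rule: patients admitted to study units get the study item.
ELIGIBLE_UNITS = {"medical-surgical", "oncology"}


def auto_assign(admission: Admission, portal: EducationPortal) -> None:
    if admission.unit in ELIGIBLE_UNITS:
        portal.assign("study_information")


portal = EducationPortal()
auto_assign(Admission("P001", "oncology", date(2015, 10, 1)), portal)
print(portal.show_flag)            # True: flag shown until the item is opened
portal.mark_viewed("study_information")
print(portal.show_flag)            # False
```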
IPCS features created for this study included:

1. automated identification of eligible participants using complex criteria (admission date, no prior admission during the study period, hospital unit);
2. automated randomisation of eligible participants to intervention or control;
3. delivery of study recruitment and consent materials to eligible participants, with passing of a patient identifier into the consent forms;
4. browser adjustments to enable web-collected e-signatures.

### Study population

Eligible population: all caregivers and any patients >6 years old admitted to medical-surgical and oncology units from 16 September 2015 to 9 January 2016. Exclusion criteria: no parent or guardian available, non-English speaking, being in foster care, and prior admission during the study period.

### Data

Data from the IPCS included user engagement with the IPCS standard features, engagement with the study-specific materials, length of stay and number of admissions. Data from the interactive study materials and from the consent process were collected and managed using REDCap, a secure, web-based electronic data capture tool hosted at the University of California San Francisco [14]. Study staff recorded recruitment attempts and reasons for exclusion for the patients they approached.

### Automated clinical trial implementation features

#### Patient identification

The IPCS assigned study information in the education portal for all eligible patients. For these patients, the IPCS home page displayed a flag in the education portal until the study information website was opened.

#### Randomisation

The IPCS randomised patients 1:1 to the educational video intervention or to control. All eligible patients were randomised at admission, owing to the technological limitation of communicating consent information from REDCap to the IPCS. Once patients were randomised, fidelity to the protocol was driven by the IPCS programming, which made the intervention video available only to those randomised to the intervention. Hence, fidelity of video availability was 100% (verified by periodic random checks of individual patient video assignments during the course of the study).

#### Recruitment

The study information website launched an interactive slide-deck presentation, made using Articulate software, describing the study. At the end of the presentation, viewers were asked to click one of three options: opt out, continue to consent or ask questions. For patients admitted for >48 hours who had not interacted with the IPCS recruitment materials, and who were available (eg, not off-unit, parent or guardian available, not busy with clinical staff), study staff used an in-person recruitment protocol on weekdays with a standard consenting process. Participants who received in-person recruitment still reviewed the IPCS recruitment materials with the in-person facilitator. Recruiting staff were blinded to allocation.

#### Consent and e-signature

Clicking on one of the options at the end of the interactive material (I'm interested, no thanks, or I have questions) opened one of three web-based REDCap surveys, which recorded the patient ID and the response. Those who were interested were then screened and consented using the REDCap survey. Consent forms were available for parents or age-eligible children, according to branching logic. We used the e-signature feature within REDCap and the IPCS bedside tablet interface to gather signatures from parents agreeing to release medical records (figure 2). The REDCap survey also optionally collected caregiver email addresses so that copies of the consent could be sent electronically.

Figure 2. E-signature interface.
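The sketch below ties together the identification and randomisation features described above: an admission-triggered hook screens for eligibility, allocates 1:1 and assigns the recruitment materials, plus the video for the intervention arm only. It is a simplified stand-in rather than the vendor's randomiser; the unit labels, item names and the use of `secrets.choice` for allocation are assumptions, and the record of previously randomised patients stands in for the "no prior admission during the study period" exclusion.

```python
"""Simplified sketch of admission-triggered eligibility screening and 1:1 randomisation
(hypothetical names; not the OneView randomiser)."""

import secrets
from datetime import date

STUDY_START = date(2015, 9, 16)
STUDY_END = date(2016, 1, 9)
ELIGIBLE_UNITS = {"medical-surgical", "oncology"}   # assumed unit labels

already_randomised = {}   # patient_id -> arm; also enforces the "no prior admission" exclusion


def is_eligible(patient_id, unit, admit_date):
    """Automated criteria: admission within the study window, eligible unit, no prior admission."""
    return (
        STUDY_START <= admit_date <= STUDY_END
        and unit in ELIGIBLE_UNITS
        and patient_id not in already_randomised
    )


def randomise(patient_id):
    """Simple 1:1 allocation at admission."""
    arm = secrets.choice(["intervention", "control"])
    already_randomised[patient_id] = arm
    return arm


def assign_education(patient_id, item_id):
    """Placeholder for the education-portal assignment shown earlier."""
    print(f"{patient_id}: assigned {item_id}")


def on_admission(patient_id, unit, admit_date):
    """ADT-triggered hook: randomise eligible patients and assign the matching materials."""
    if not is_eligible(patient_id, unit, admit_date):
        return
    arm = randomise(patient_id)
    assign_education(patient_id, "study_recruitment_materials")   # both arms see recruitment materials
    if arm == "intervention":
        assign_education(patient_id, "patient_engagement_video")  # video available to the intervention arm only


on_admission("P001", "medical-surgical", date(2015, 10, 1))
```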
### Measures

#### Randomisation success

To assess the adequacy of randomisation, we compared the study groups on the variables available from the IPCS: number of interactions with the IPCS, interaction with the study recruitment materials, average length of stay and mean number of admissions. We chose average length of stay and mean number of admissions because we hypothesised that they might be positively associated with patient or family member interactions with the study recruitment materials and with the intervention video, and we therefore wanted to verify that they were balanced between the groups.

#### Recruitment success

We assessed the success of recruitment modality (IPCS vs IPCS with in-person facilitation) by comparing enrolment rates by modality.

#### Feasibility of electronic signature and email

We assessed the feasibility of paperless implementation by the proportions of consenting participants who provided an electronic signature and an email address.

### Analysis

Statistical analysis focused on the success of automated randomisation and recruitment. Binary outcomes were compared using χ2 analysis, or Fisher's exact test for cell sizes <10. Analyses were conducted using Stata V.13.

## Results

### Randomisation

There were 1012 patients admitted to the eligible units during the study period, with 502 randomised to intervention (patient education video and recruitment materials) and 510 randomised to control (recruitment materials only). The randomisation was adequate, with no statistically significant differences between groups in characteristics from IPCS data (table 1).

Table 1. Randomisation success assessed through comparison of parent trial group participants.

### Recruitment, consent and e-signature collection

Figure 3 depicts the study recruitment flow diagram. Of 1012 patients, 21.0% (n=213) opened the study materials only through the IPCS patient education portal. Of those, 8.5% (n=18) completed the interactive materials, with five consenting to participate (29.4% of eligible patients completing the interactive materials; 2.4% of those opening materials). Of those who did not open the materials in the IPCS and who were subsequently recruited in person (n=176), 90 were ineligible due to being non-English speakers (n=58) or not having a guardian present (n=32). Of those recruited in person who were eligible (n=87), a larger proportion consented to participate than among the patients only opening study materials through the patient education portal (64.4% vs 2.4%, p<0.001 for comparison; figure 3).

Figure 3. Recruitment and consent flow chart. IPCS, interactive patient care system.

Of consented participants (n=59), 96.6% (n=57) gave an electronic signature to release medical records and 71.2% (n=42) of participating parents opted to give their email address.
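To make the planned enrolment-rate comparison concrete, the sketch below reruns it in Python with SciPy rather than Stata (an assumption; the study analyses used Stata V.13). The cell counts are those implied by the reported figures: 5 of 213 consenting via the IPCS alone and, inferred from 64.4% of 87, 56 consenting with in-person facilitation. The sketch also applies the stated rule of using Fisher's exact test when any cell is <10.

```python
"""Sketch of the enrolment-rate comparison, assuming the counts implied by the reported
percentages (56/87 consented in person, 5/213 via the IPCS alone); not the study's Stata code."""

from scipy.stats import chi2_contingency, fisher_exact

# Rows: recruitment modality; columns: consented, did not consent.
table = [
    [56, 87 - 56],    # in-person recruitment of eligible patients (inferred from 64.4% of 87)
    [5, 213 - 5],     # IPCS-only (opened study materials in the education portal)
]

# Stated analysis rule: chi-square, or Fisher's exact when any cell size is <10.
if min(min(row) for row in table) < 10:
    _, p_value = fisher_exact(table)
    test = "Fisher's exact"
else:
    _, p_value, _, _ = chi2_contingency(table)
    test = "chi-square"

print(f"{test} p = {p_value:.2e}")   # p < 0.001, consistent with the reported comparison
```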
## Discussion

This study offers the first look at the potential of the IPCS for supporting clinical trial implementation via automated methods. The IPCS was successful in identifying eligible subjects, randomising them, collecting electronic signatures for medical records release and capturing email addresses.

Study enrolment via a stand-alone IPCS process was successful for only a fraction of potential subjects. Failures were due either to a complete lack of engagement or to very minimal engagement with the IPCS-assigned materials. Ultimately, a staff member was required to complete consent and study enrolment.

Prior studies have assessed the success of using multimedia interactive materials to improve the informed consent process [15–21]. The evidence is mixed regarding their success, with most studies finding improved comprehension of study materials [15–17, 19, 21], a preference for multimedia materials [15, 20, 21], generally increased time spent on informed consent [20] and mixed effects on patient enrolment and retention [17, 18, 21]. While these studies demonstrate the potential benefit of interactive systems for informed consent, our study expanded the use of the system to identify eligible patients, randomise them, and recruit and consent participants.

We found that the IPCS identified eligible patients and adequately randomised them, as illustrated by the similarities across groups in recruitment and consent rates as well as across measured characteristics (table 1). This suggests that randomising IPCS features to assess their effects is feasible and could be considered for future studies. If such studies are implemented, blinding study staff to allocation assignment is necessary to avoid potential post-randomisation selection bias [22].

Our results suggest that the IPCS alone is not sufficient for patient recruitment. We saw IPCS recruitment failures at three points: (1) participants opening the assigned study materials; (2) participants completing the interactive study materials and (3) participants enrolling after viewing the materials. Failures at the first two stages imply that enhanced patient engagement features could potentially improve study recruitment success. Enhancements to improve interaction with the materials might include more noticeable indicators of the presence of study materials, such as interrupting programming or more prominent visual notifications to the user (eg, a blinking notification, a banner on screen once a day or at routine intervals until study completion, or text message reminders). Changing the study materials (we used an interactive slide deck, but there are other potential modalities, such as whiteboard animation or including videos of patient participants) might have improved participant completion of the interactive study materials. Testing these other options was beyond the scope of the study, would best be done with more intensive user-interface testing, and could be the focus of future work. Complementary qualitative research to explore barriers and facilitators to success would provide greater detail and context-specific information to explain successes and failures.

Our findings also suggest that when leveraging IPCS technology in trial implementation, staff for in-person recruitment should not be eliminated. Instead, IPCS efficiencies may be realised in identifying eligible patients, randomisation and data collection. For example, patients who were readmitted were excluded automatically from the study, and randomisation was built into the technology, eliminating that manual step. The IPCS was an efficient platform for gathering electronic consent, with 96% of parents providing e-signatures and 71% providing email addresses for optional follow-up. Finally, the electronic consenting process eliminated the potential for lost paper forms, decreasing the risk of privacy loss.
The branching logic in the electronic consent allowed for a tailored and shortened consent process, and data validation decreased errors. Assessment of the potential cost-effectiveness of using these automated functions for clinical research implementation could be the focus of further study.

### Limitations

While our use of the IPCS to give study information and enrol participants yielded low recruitment, the causes of low recruitment may not be directly attributable to the IPCS technology. IPCS study recruitment materials that were less easy to ignore (we used a passive reminder) might have led to different IPCS recruitment rates. Different interactive materials may have yielded different effects on recruitment. Generalisability should be understood within the context of the specific IPCS software and implementation. Our OneView Healthcare software included a randomiser function; others may not. We did not have sociodemographic data to explore whether IPCS interactions and recruitment success might have differed by patient characteristics. Finally, qualitative observations and more intensive user-interface testing in the future could give greater understanding of how to better adapt IPCS patient engagement features for recruitment.

## Conclusion

Interactive patient care systems are innovative new tools with the potential to support inpatient research. This study illustrates that technology, while potentially adding value in the healthcare context, does not inevitably replace human interactions. Our results suggest that as an electronic communication tool at the bedside, the IPCS offers key efficiencies for study implementation, including patient identification, randomisation and collecting e-consent and study contact information, but that it is limited in its ability to inform and recruit. Further research assessing whether patient engagement enhancements to the IPCS improve recruitment rates will better illuminate the potential value that interactive technologies bring when paired with in-person efforts.

## Acknowledgments

We thank Seth Bokser from OneView Healthcare for assistance with implementation; Stephen McGowan and Philip Urrea from OneView Healthcare for work on the development of the features in the interactive patient care system; Omada Health for allowing Dr Jasik to participate in this project; and Drs Alice Ainsworth and Nikki Suarez for their work on the development of the patient education video.

## Footnotes

* Twitter @naomibardach
* Contributors NSB conceptualised, designed and carried out the study, drafted the initial manuscript, and approved the final manuscript as submitted. RL drafted the initial manuscript and approved the final manuscript as submitted. CBJ contributed to the design and conduct of the study, reviewed and revised the manuscript, and approved the final manuscript as submitted.
* Funding This project was supported by the National Center for Advancing Translational Sciences, National Institutes of Health, through UCSF-CTSI grant number UL1 TR000004. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the NIH. NSB was supported by the National Institute of Child Health and Human Development (K23HD065836).
* Competing interests None declared.
* Patient consent for publication Not required.
* Ethics approval The Human Research Protection Program and Institutional Review Board at the University of California San Francisco approved this study.
* Provenance and peer review Not commissioned; externally peer reviewed.
* Data availability statement Data are available upon reasonable request.
* Received 31 May 2019.
* Revision received 13 March 2020.
* Accepted 12 June 2020.
* © Author(s) (or their employer(s)) 2021. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: [http://creativecommons.org/licenses/by-nc/4.0/](http://creativecommons.org/licenses/by-nc/4.0/).

## References

1. Singh A, Rhee KE, Brennan JJ, et al. Who's my doctor? Using an electronic tool to improve team member identification on an inpatient pediatrics team. Hosp Pediatr 2016;6:157–65. doi:10.1542/hpeds.2015-0164
2. Tang C, Lorenzi N, Harle CA, et al. Interactive systems for patient-centered care to enhance patient engagement. J Am Med Inform Assoc 2016;23:2–4. doi:10.1093/jamia/ocv198
3. Cerner. Interactive patient system, 2016. Available: www.cerner.com/solutions/medical_devices/interactive_patient_system/ [Accessed 6 Apr 2016].
4. GetWell Network. Interactive experience for patient and family engagement, 2016. Available: www.getwellnetwork.com/solutions/patient-experience/pediatric [Accessed 6 Apr 2016].
5. OneView. Revolutionizing the patient experience through innovative software for healthcare facilities, 2016. Available: www.oneviewhealthcare.com/about-us/ [Accessed 6 Apr 2016].
6. Collins SA, Rozenblum R, Leung WY, et al. Acute care patient portals: a qualitative study of stakeholder perspectives on current practices. J Am Med Inform Assoc 2017;24:e9–17. doi:10.1093/jamia/ocw081
7. FreshTracks Capital. GetWell acquired by Welsh, Carson, Anderson & Stowe, 2013. Available: www.freshtrackscap.com/getwell-acquired-by-welsh-carson-anderson-stowe [Accessed 6 Apr 2016].
8. Wicklund E. Hospitals find new benefits in 'interactive patient systems'. Healthcare IT News, 2011. Available: www.healthcareitnews.com/news/hospitals-find-new-benefits- [Accessed 6 Apr 2016].
9. HealthIT.gov. Step 5: achieve meaningful use stage 2, 2014. Available: www.healthit.gov/providers-professionals/step-5-achieve-meaningful-use-stage-2 [Accessed 6 Apr 2016].
10. Rudin RS, Bates DW, MacRae C. Accelerating innovation in health IT. N Engl J Med 2016;375:815–7. doi:10.1056/NEJMp1606884
11. Chalil Madathil K, Koikkara R, Obeid J, et al. An investigation of the efficacy of electronic consenting interfaces of research permissions management system in a hospital setting. Int J Med Inform 2013;82:854–63. doi:10.1016/j.ijmedinf.2013.04.008
12. Tait AR, Voepel-Lewis T, Levine R. Using digital multimedia to improve parents' and children's understanding of clinical trials. Arch Dis Child 2015;100:589–93. doi:10.1136/archdischild-2014-308021
13. Mastellos N, Bliźniuk G, Czopnik D, et al. Feasibility and acceptability of TRANSFoRm to improve clinical trial recruitment in primary care. Fam Pract 2016;33:186–91. doi:10.1093/fampra/cmv102
14. Harris PA, Taylor R, Thielke R, et al. Research electronic data capture (REDCap): a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009;42:377–81. doi:10.1016/j.jbi.2008.08.010
15. Jimison HB, Sher PP, Appleyard R, et al. The use of multimedia in the informed consent process. J Am Med Inform Assoc 1998;5:245–56. doi:10.1136/jamia.1998.0050245
16. Rhodes KV, Lauderdale DS, Stocking CB, et al. Better health while you wait: a controlled trial of a computer-based intervention for screening and health promotion in the emergency department. Ann Emerg Med 2001;37:284–91. doi:10.1067/mem.2001.110818
17. Flory J, Emanuel E. Interventions to improve research participants' understanding in informed consent for research: a systematic review. JAMA 2004;292:1593–601. doi:10.1001/jama.292.13.1593
18. Gesualdo P, Ide L, Rewers M, et al. Effectiveness of an informational video method to improve enrollment and retention of a pediatric cohort. Contemp Clin Trials 2012;33:273–8. doi:10.1016/j.cct.2011.11.010
19. O'Lonergan TA, Forster-Harwood JE. Novel approach to parental permission and child assent for research: improving comprehension. Pediatrics 2011;127:917–24. doi:10.1542/peds.2010-3283
20. Schlechtweg PM, Hammon M, Giese D, et al. iPad-based patient briefing for radiological examinations: a clinical trial. J Digit Imaging 2014;27:479–85. doi:10.1007/s10278-014-9688-x
21. Synnot A, Ryan R, Prictor M, et al. Audio-visual presentation of information for informed consent for participation in clinical trials. Cochrane Database Syst Rev 2014;5:CD003717. doi:10.1002/14651858.CD003717.pub3
22. Viera AJ, Bangdiwala SI. Eliminating bias in randomized controlled trials: importance of allocation concealment and masking. Fam Med 2007;39:132–7.