
Quality assurance of data collection in the multi-site community randomized trial and prevalence survey of the children’s healthy living program

Abstract

Background

Quality assurance plays an important role in research by assuring data integrity and, thus, valid study results. We aim to describe and share the results of the quality assurance process used to guide data collection in a multi-site childhood obesity prevalence study and intervention trial across the US Affiliated Pacific Region.

Methods

Quality assurance assessments following a standardized protocol were conducted by one assessor at every participating site. Results were summarized to examine and align the implementation of protocol procedures across diverse settings.

Results

Data collection protocols focused on food and physical activity were adhered to closely; however, protocols for handling completed forms and ensuring data security showed more variability.

Conclusions

Quality assurance protocols are common in the clinical literature but are limited in multi-site community-based studies, especially in underserved populations. The reduction in the number of QA problems found in the second data collection period as compared to the first for the intervention study attests to the value of this assessment. This paper can serve as a reference for similar studies wishing to implement quality assurance protocols for the data collection process to preserve data integrity and enhance the validity of study findings.

Trial registration: NIH clinical trial #NCT01881373

Background

Data quality is a key priority when planning a study to guarantee appropriate results and conclusions [1]. Detection and remediation of errors in the data collection process, whether they are made intentionally or not, promotes data integrity. This paper will describe and share the results of a quality assurance (QA) process used in a multi-site childhood obesity prevalence study and intervention trial across the US Affiliated Pacific (USAP) region.

QA is one approach to ensure the validity of study results and preserve data integrity during the data collection process [2]. QA plays an important role in the conduct of the research study by helping to ensure findings and conclusions are correct and justifiable [1]. QA is a process used to prevent problems in the data collection process and to support subsequent data quality [3].

The first step in QA is developing a well-written, comprehensive, and detailed procedure manual for data collection [3]. Poorly written manuals increase the chances of errors and risk the validity of the study [2]. The second is developing a rigorous and detailed recruitment and training plan to reinforce the value of collecting accurate data. The final step is to monitor and evaluate the process in the field and identify areas of improvement to strengthen the study’s protocol.

The available literature on QA focuses mostly on maximizing data quality, standardizing protocols, training personnel, and managing data systems in clinical trials [4–13]. For example, the National Drug Abuse Treatment Clinical Trials Network examined the effect QA had on procedures and outcomes of clinical trials in substance abuse treatment programs [14]. The authors noted the need for a community-based and coordinated system of comprehensive services to ensure the integrity of the data collected. This drug abuse treatment study developed a multi-level, systematic monitoring system that was accessible to multiple sites and efficient enough to meet the needs of each participating treatment program and of the research staff [14]. The intensity of QA monitoring was based on the particular trial, the risks of the interventions, and the staff’s experience with clinical research. Training sessions were held and flexible tools were developed at each site [14].

QA programs are also beneficial to community-based research studies. The goal of QA is to provide valid and reliable outcomes of a study [15], which applies to multi-site studies of any kind. The Girls Health Enrichment multi-site studies (GEMS) program addressed the needs of African–American girls through the development and evaluation of culturally appropriate obesity prevention approaches [16]. The study was conducted at four sites across the nation among similar study populations. A QA procedure was used to ensure the integrity of the data with respect to data management, eligibility violations, procedural errors, concordance in replicate evaluations, procedures not performed, field-center comparisons, and consistency among variables [17]. A multicenter, randomized clinical trial called the Dietary Approaches to Stop Hypertension (DASH)-Sodium study also used a QA process to ensure quality in procedures for participant screening, diet preparation and delivery, data collection, staff training, and monitoring activities across several clinical sites [18].

The USAP region, which includes the states of Hawai‘i and Alaska, the US territories of Guam and American Samoa, the US Commonwealth of the Northern Mariana Islands (CNMI), and the Freely Associated States of Micronesia (FAS: the states of the Federated States of Micronesia [Pohnpei, Chuuk, Yap, and Kosrae], the Republic of the Marshall Islands, and the Republic of Palau), participated in its first multi-site childhood obesity prevalence and community randomized intervention trial [19]. The USAP region is typically underserved and underrepresented in national, comprehensive nutrition and health research studies. These jurisdictions do not have national nutrition monitoring, such as the National Health and Nutrition Examination Survey, to support nutrition-related health prevention [20]. Limited data on diet, physical activity, obesity, and other health-related indicators within this region restrict the capacity to understand the care and action needed to control the epidemic of non-communicable chronic diseases present in the region [21]. The need for nutrition monitoring capacity is substantial because of deficient healthcare resources, limited access to primary care physicians, high infant mortality rates, and high poverty levels [22, 23]; yet these same factors challenge the collection of data.

The purpose of this paper is to describe the QA process developed for detection of procedural and data errors for the Children’s Healthy Living (CHL) Program multi-site trial in an underserved region [19]. In addition to describing the process, common findings will be shared as well as strategies used to correct errors. The QA model may serve as a template for other complex multi-site studies in underserved and isolated communities.

Methods

Study design

The CHL program included prevalence and intervention studies conducted in the USAP region. The multi-jurisdiction prevalence study and community randomized intervention trial (NIH clinical trial #NCT01881373 [clinicaltrials.gov]) were conducted between Fall 2012 and Spring 2015. CHL’s mission is to elevate the capacity of the region to build and sustain a healthy food and physical environment to help maintain healthy weight and prevent obesity among young children in the Pacific region.

The FAS jurisdictions listed previously participated in the prevalence survey only, while Alaska, American Samoa, CNMI, Guam, and Hawai‘i also implemented the community-randomized environmental intervention to prevent childhood obesity, with a baseline and a 24-month follow-up. CHL focused on six behavioral targets among the communities of the participating jurisdictions: increasing sleep, moderate-to-vigorous physical activity, fruit and vegetable intake, and water intake, and decreasing sedentary behavior and sweetened beverage intake [24]. Similar data collection instruments and procedures were used for the prevalence and intervention studies.

Collection of the participant-based measures, including anthropometry, diet, physical activity, sleep, and Acanthosis Nigricans (present or absent on the back of the neck), occurred in all study locations across the region [24]. Multi-day trainings on collecting all measures, including a standardization process for anthropometric data [25], were implemented in all jurisdictions prior to data collection. For further detail on the study design and protocol, please refer to Wilken et al. [24]. Collection of community-based measures of the environment (schools, stores, parks, churches, etc.) and jurisdiction-based measures (e.g., food prices in food stores) was also done but was not covered by the same QA methodology and is not reported in this paper.

Institutional Review Board (IRB) approval from the University of Hawai‘i, the University of Guam, the University of Alaska Fairbanks, and the Republic of Palau was granted for the CHL study. Other jurisdictions ceded IRB authority to the University of Hawai‘i. Additional IRB approval was not needed for the QA program, as no additional individual data was collected.

Quality assurance process

Prevention

Standard procedures were used to ensure accurate and consistent measurements throughout the study. Standardized training manuals were developed to document measurement protocols, detail procedures, and minimize errors. The detailed procedures related to anthropometry are reported by Li et al. [25]. Training for Acanthosis Nigricans detection included instructions and a photo array developed by an experienced pediatrician. Staff for each jurisdiction were hired and trained to conduct recruitment, measurement, and data collection for the CHL program. The staff participated in the QA training during scheduled dates. The training reviewed all data collection components and consisted of standardized measurements for anthropometric data to minimize error. The training held before the follow-up measurements in the intervention studies also included a mock data collection session.

Detection

The CHL Coordinating Center worked with each jurisdiction’s team to conduct an overall QA assessment at one point during each measurement collection period: once in the FAS region and twice (baseline and follow-up) in Alaska, American Samoa, CNMI, Guam, and Hawai‘i. The goal was to schedule the QA as close to the beginning of data collection as possible so that any errors could be corrected before the majority of the data were collected. QA across all jurisdictions was conducted by the CHL QA Lead (MKF), the Assistant Program Director at the CHL Coordinating Center, whose role was to coordinate overall CHL Program activities. The CHL QA Lead also served as the co-lead of CHL measurement training and standardization and was therefore familiar with data collection processes and protocols.

The QA site visit process centered on measurement-related activities. There were two parts to the QA site visit: (1) observation in the field and (2) observation of the organization and quality of data in the office. Specific details of each part of the QA process are outlined in Table 1. For each domain, each jurisdiction was assessed as to whether all of the outlined procedures and protocols were met (yes) or whether any were not met (no).

Table 1 Children’s Healthy Living (CHL) program “in the office” and “in the field” quality assurance (QA) process
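To illustrate how the yes/no domain assessments in Table 1 could be recorded and summarized, the sketch below uses hypothetical Python data structures. The domain names are paraphrased from Table 1 and the Results section; the code is illustrative only and is not the actual CHL data system.

```python
# Hypothetical sketch of recording the yes/no QA assessments outlined in
# Table 1; domain names and structures are illustrative, not the actual
# CHL data system.
from dataclasses import dataclass, field

QA_DOMAINS = [
    "form_storage", "consent_assent_forms", "accelerometer_log",
    "accelerometer_download_reset", "completed_forms_logs",
    "orientation_check_in", "food_activity_log_instruction",
    "anthropometry_acanthosis", "accelerometer_placement",
    "forms_station", "check_out", "transport_of_forms",
]

@dataclass
class QAVisit:
    jurisdiction: str
    period: str                                   # e.g. "baseline" or "follow-up"
    results: dict = field(default_factory=dict)   # domain -> True if all protocols met

    def record(self, domain: str, all_protocols_met: bool) -> None:
        if domain not in QA_DOMAINS:
            raise ValueError(f"Unknown QA domain: {domain}")
        self.results[domain] = all_protocols_met

    def domains_not_met(self) -> list:
        return [d for d in QA_DOMAINS if self.results.get(d) is False]

# Example usage for one (hypothetical) visit
visit = QAVisit(jurisdiction="Guam", period="baseline")
visit.record("form_storage", True)
visit.record("orientation_check_in", False)
print(visit.domains_not_met())  # ['orientation_check_in']
```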

Correction

The QA process concluded with a team debriefing of the measurement activity to review results, discuss corrections and provide clarifications. A QA report was generated to summarize findings across CHL. The QA report was submitted to the IRB to ensure oversight across the jurisdictions. Quality control procedures were established to monitor all CHL data collection outcomes, but are not covered here.

Results

The QA visits occurred between January 2013 and May 2015 across all jurisdictions. Alaska, American Samoa, CNMI, Guam, and Hawai‘i each had two visits: baseline and follow-up. Baseline visits were held from January to December 2013 and follow-up visits from January to April 2015. FAS QA visits were held from October 2013 to May 2015.

The QA results per visit at baseline and follow-up are presented in Table 2. The majority of jurisdictions followed all of the “in the office” and “in the field” protocol standards. The jurisdictions that participated in the 24-month follow-up measurement showed an overall improvement in QA scores from baseline. Form storage, the anthropometry/Acanthosis Nigricans station, the forms station, and the checkout station each improved by 20 %. Consent and assent forms and accelerometer downloading and resetting improved by 40 %. Orientation/check-in improved the most, from one jurisdiction (20 %) meeting all protocols at baseline to five (100 %) at the 24-month follow-up. The accelerometer placement station decreased by 20 % from baseline to follow-up. The accelerometer log (100 %), completed forms/logs (100 %), food and activity log/accelerometer instruction (60 %), and transport of forms remained constant between baseline and follow-up.

Table 2 Baseline and follow-up results of the Children’s Healthy Living (CHL) program “in the office” and “in the field” quality assurance (QA) process
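To make the percentages above concrete, the short illustrative snippet below shows the underlying arithmetic, assuming the five intervention jurisdictions (Alaska, American Samoa, CNMI, Guam, and Hawai‘i) as the denominator; it is not part of the CHL analysis code.

```python
# Illustrative arithmetic only; assumes five intervention jurisdictions
# as the denominator for the follow-up comparison.
def percent_meeting(n_meeting: int, n_jurisdictions: int = 5) -> float:
    """Percentage of jurisdictions meeting all protocols for a QA domain."""
    return 100.0 * n_meeting / n_jurisdictions

baseline = percent_meeting(1)   # one jurisdiction met all orientation/check-in protocols
follow_up = percent_meeting(5)  # all five met them at the 24-month follow-up
print(f"{baseline:.0f} % -> {follow_up:.0f} %")  # 20 % -> 100 %
```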

In the office

Form storage

All jurisdictions stored forms securely in locked cabinets behind a locked door. During baseline, some jurisdictions initially stored the protected health information and non-protected health information files within the same cabinets. This was promptly corrected during the QA visit. At follow-up, the majority of jurisdictions met all protocols.

Consent and assent forms

In certain jurisdictions, consent and assent forms were not filed properly in participant folders or were not collected from parents/caregivers at the appropriate time. These errors had been corrected by follow-up.

Accelerometer log

All jurisdictions had all accelerometers in-house except for those placed in the field. Methods of organization differed by jurisdiction but were in place at all locations to ensure that accelerometers were not misplaced.

Accelerometer downloading and resetting

The majority of jurisdictions downloaded and reset accelerometers correctly. The accelerometers are time sensitive and have a recommended time frame within which to download data. Some jurisdictions delayed downloading the data until after the QA visit, which increased the time it took to save the data because the devices continued recording during that period.

Completed forms/logs

All jurisdictions had their forms/logs accounted for but experienced some common data quality problems in completing them. Issues included not reviewing forms for thorough completion and not obtaining clarification from parents for unintelligible or illegible responses. For example, the reported number of people in the household did not equal the sum of the numbers of individuals by age group, or the child’s age did not agree with the parent-provided birthdate. Other points of misclassification included recording the child’s age when they were no longer breastfed in years instead of in months, the unit of time requested on the form.
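A minimal sketch of the kinds of consistency checks these issues call for is shown below. The field names and rules are hypothetical and do not reflect the actual CHL forms or data system.

```python
# Hypothetical form-review checks illustrating the issues described above;
# field names are assumptions, not the actual CHL forms or data system.
from datetime import date

def check_child_age(reported_age_years: int, birthdate: date,
                    measurement_date: date) -> list:
    """The reported age should agree with the age derived from the birthdate."""
    derived = (measurement_date - birthdate).days // 365  # approximate age in years
    if reported_age_years != derived:
        return [f"reported age {reported_age_years} y disagrees with "
                f"birthdate-derived age {derived} y"]
    return []

def check_breastfeeding_duration(value: float, unit: str) -> list:
    """Duration of breastfeeding should be reported in months, per the form."""
    if unit != "months":
        return [f"breastfeeding duration reported in {unit}; form asks for months"]
    return []

# Example: a parent reports age 3 y but the birthdate implies 2 y,
# and reports the weaning age in years instead of months.
print(check_child_age(3, date(2010, 6, 1), date(2013, 2, 1)))
print(check_breastfeeding_duration(2, "years"))
```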

In the field

In jurisdictions where languages other than English were used in the field, a native speaker assisted the CHL QA Lead in explaining food and activity log instructions and accelerometer placement through translation and appropriate hand gestures. Forms, excluding the food and activity log instructions, were provided in English and in other native languages, translated by a native speaker.

Orientation/check-in station

Most jurisdictions followed proper check-in procedures at baseline, and all did at follow-up. Issues included failing to ensure that consent and photo release forms were signed and that the child’s assent form was collected before measurement.

Food and activity log/accelerometer instruction station

The majority of jurisdictions followed proper procedures for food and activity log/accelerometer instruction. Jurisdictions often did not emphasize recording the project’s high-priority items, such as fruits, vegetables, beverages, start and end times for activities, and activity descriptions. Use and demonstration of instructional tools for parents, such as measuring cups and spoons, sample wrappers and labels, labeled Ziploc bags, and recipes, was inconsistent. A commonly overlooked procedure was labeling food and activity logs with ID numbers.

Anthropometry/Acanthosis Nigricans station

All jurisdictions completed assent forms prior to anthropometric measurement. Some jurisdictions did not calibrate equipment prior to every measurement session or did not attempt three measurements per child, as the protocol required. Minor measurement procedures were overlooked during sessions, such as removing children’s socks, hair ties, and heavy belts; reading measurements at eye level; facing the child away from the scale screen; placing the tape measure horizontally around the child’s waist; indicating diaper use; and verifying measurements between the measurer and recorder. Acanthosis Nigricans screening was generally followed properly, although staff occasionally failed to be discreet about observations to preserve participant confidentiality or failed to ensure that all positive screens were verified.

Accelerometer placement station

Overall, proper procedures were followed at the accelerometer placement station in all jurisdictions. However, the use of safety scissors for cutting bands off children needed to be emphasized. Jurisdictions also needed to be reminded to let the child choose their band color, to place ID labels onto the correct logs, and to remind parents of the extra bands provided for the child.

Forms station

All jurisdictions experienced problems with providing proper oversight at the forms station. Issues that needed to be addressed included preparing reference material for parents in advance, designating an area for children to play while parents filled out forms, reviewing forms with writing instruments in a different color from those used by parents/caregivers, and proactively identifying parents who needed assistance prior to form disbursement. CHL staff also needed to improve in reviewing all forms and verifying their completion prior to the parent/caregiver’s departure. Jurisdictions were reminded to review forms station procedures and to prepare forms properly prior to measurement sessions.

Check out station

Many jurisdictions did not complete the participant folder checklists or the logs needed for tracking purposes at the end of measurement sessions; these omissions were then corrected in the office. Organizing forms containing protected health information and non-protected health information into separate folders was generally completed by all jurisdictions.

Transport of forms and data

The majority of jurisdictions followed proper procedures for transport of forms with separate lockable repositories for protected health information and non-protected health information forms. All jurisdictions were reminded to review data transportation procedures to prevent protected health information classification breaches.

Discussion

The purpose of this paper was to document the QA processes for conducting a multi-site childhood obesity prevalence survey and intervention trial across the USAP region. The implications of geographic location and diverse cultural settings were considered during the development of standard protocol procedures. Based on QA visit findings, methods were revisited and adapted to the environment and culture while still meeting the standards of CHL. For example, locked boxes were expanded to include locked backpacks or locked plastic containers (e.g., action packers). Some sites had limited resources; as such, specific survey items, such as locked boxes, were not readily available. Also, items such as locked boxes could bring unwanted attention to data collection personnel.

Delays in accelerometer downloading and resetting delayed the devices’ reuse, which decreased the amount of children’s activity that could be measured. To resolve this issue, each jurisdiction developed a schedule to download accelerometer data within three days. A CHL staff member was designated to ensure adherence to the schedule. Weekly data/measurement calls were also held with all jurisdictions to give updates on accelerometer status as well as other data-related issues.
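As an illustration of how a three-day download window could be monitored, the hypothetical sketch below flags devices whose data are overdue for download. CHL’s actual scheduling tools are not described in the source.

```python
# Hypothetical tracker for the three-day accelerometer download window
# described above; device IDs and structure are illustrative only.
from datetime import date, timedelta

DOWNLOAD_WINDOW = timedelta(days=3)

def overdue_downloads(collected_on: dict, today: date) -> list:
    """collected_on maps accelerometer ID -> date the device was collected."""
    return [device_id for device_id, collected in collected_on.items()
            if today - collected > DOWNLOAD_WINDOW]

# Example usage
devices = {"ACC-014": date(2014, 3, 1), "ACC-027": date(2014, 3, 4)}
print(overdue_downloads(devices, today=date(2014, 3, 5)))  # ['ACC-014']
```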

QA had to be staggered across jurisdictions due to great travel distances. To ensure consistency in feedback and reduce bias, CHL had all QA site visits conducted by one individual. Each jurisdiction held a debriefing meeting after the QA assessment to discuss the team’s experience and solutions for protocols that were not met. Changes to the program’s procedures were made following the QA to correct the issues that occurred. Jurisdictions may have altered certain steps in collecting data to address the cultural factors within their respective communities. Demonstration of wearing an accelerometer by CHL staff was not required but proved effective in gaining a child’s assent to wear one. For families with two or more children recruited, ID labels on the accelerometer bands were used so parents could identify each child’s band. Where culturally appropriate, native languages were used to talk with parents/caregivers, and food examples were based on the jurisdiction’s usual diet. Verification of completed forms at every station was important to reduce oversight of response errors, since parents/caregivers may not have answered all questions or provided complete answers. For example, for the question asking for the number of people living in the household, if the tally marks and the total did not match, the data entry software would recognize an error and prevent continuation of entry until the problem was resolved. Identifying and correcting this issue in the field proved beneficial in preventing later data entry problems.
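The blocking behavior described for the data entry software might be implemented along the lines of the hypothetical sketch below; the actual CHL data entry system is not specified in the source.

```python
# Hypothetical data-entry rule that blocks continuation when the household
# tally marks do not sum to the reported total, mirroring the behavior
# described above; not the actual CHL software.
class EntryError(Exception):
    pass

def enter_household_section(total_reported: int, tally_by_age_group: list) -> None:
    if sum(tally_by_age_group) != total_reported:
        # Entry cannot continue until the discrepancy is resolved with the
        # parent/caregiver or in the field.
        raise EntryError("Household tally does not match reported total; "
                         "resolve before continuing data entry.")

# Example: a form reporting 4 household members but tallies summing to 5
try:
    enter_household_section(4, [1, 2, 2])
except EntryError as err:
    print(err)
```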

The greatest strength of the CHL QA program was the standardized protocol used throughout the multi-site study that limited confounding bias. The high level of collaboration and efficiency with jurisdiction sites where local staff were hired and trained resulted in an organized, novel and functional recruitment process as outlined in Fialkowski et al. [26]. The QA site visit process was an important component to ensure data integrity within the multi-site study in this underserved, underreported and underrepresented region. Similar to other community-based studies such as the Girls Health Enrichment multi-site studies (GEMS) [17], the CHL QA approach addressed the complexity of implementing a study of this nature across 11 environmentally, geographically, and culturally distinct jurisdictions through flexibility and adaptation. This required significant time, resources, coordination and collaboration across a team of more than 100 individuals. Future research endeavors in similar environments such as the USAP wishing to maintain standardized protocols should consider these factors when developing their QA process.

More work is needed in this area, as published QA processes for multi-site studies and obesity prevention research are limited. Unidentified systematic and non-systematic errors may have occurred from lack of measurement oversight and protocol neglect. However, sharing research protocols, including QA protocols, through the peer-reviewed literature is an opportunity to prevent or mitigate potential systematic and non-systematic errors.

Conclusions

Implementing a multi-site trial across a region as diverse and expansive as the USAP requires considerable effort to ensure consistency in data collection efforts. Although some procedures were overlooked in the office and in the field, corrections were made after the QA visit. This often included a review of the study protocol. The reduction in errors at the 24-month QA visit for the intervention sites attests to the success of the program. The USAP is an under-studied area, and therefore experience with research methodology is limited. The QA process used in this region demonstrated that large community-based multi-site trials can be conducted in this unique geographic setting, following standardized yet community-sensitive protocols. CHL provided a foundation for implementing a standardized research protocol that will serve as a framework for others to expand on. This paper provides a reference for others interested in conducting public health QA research in resource-limited and geographically isolated locations.

Abbreviations

CHL:

Children’s Healthy Living Program

CNMI:

Commonwealth of the Northern Mariana Islands

FAS:

Freely Associated States of Micronesia

IRB:

Institutional Review Board

QA:

Quality Assurance

USAP:

US Affiliated Pacific

References

  1. Moreno LA, De Henauw S, Gonzalez-Gross M, Kersting M, Molnar D, Gottrand F, et al. Design and implementation of the healthy lifestyle in Europe by nutrition in adolescence cross-sectional study. Int J Obes. 2008;32:S4–11.
  2. The Office of Research Integrity. Responsible conduct in data management. 2015. http://ori.hhs.gov/education/products/n_illinois_u/datamanagement/dctopic.html. Accessed July 2015.
  3. Knatterud GL, Rockhold FW, George SL, Barton FB, Davis CE, Fairweather WR, et al. Guidelines for quality assurance in multicenter trials: a position paper. Control Clin Trials. 1998;19(5):477–93.
  4. Freedland KE, Carney RM. Data management and accountability in behavioral and biomedical research. Am Psychol. 1992;47:640–5.
  5. Gassman JJ, Owen WW, Kuntz TE, et al. Data quality assurance, monitoring, and reporting. Control Clin Trials. 1995;16(Suppl 2):104S–36S.
  6. Prud’homme GJ, Canner PL, Cutler JA. Quality assurance and monitoring in the hypertension prevention trial. Hypertension prevention trial research group. Control Clin Trials. 1989;10(Suppl 3):84S–94S.
  7. Karrison T. Data editing in a clinical trial. Control Clin Trials. 1981;2:15–29.
  8. Marinez YN, Mahan CA, Barnwell GM, et al. Ensuring data quality in medical research through an integrated data management system. Stat Med. 1984;3:101–11.
  9. Bagniewska A, Black D, Molvig K, et al. Data quality in a distributed data processing system: the SHEP pilot study. Control Clin Trials. 1986;7:27–37.
  10. Severe JB, Schooler NR, Lee JH, et al. Ensuring data quality in a multicenter clinical trial: remote site data entry, central coordination and feedback. Psychopharmacol Bull. 1989;25:488–90.
  11. Sforza VA. Quality data: what are they? Ann Super Sanita. 1994;30:439–43.
  12. Hohnloser JH, Puerner F, Soltanian H. Improving coded data entry by an electronic patient record system. Methods Inf Med. 1996;35:108–11.
  13. Whitney CW, Lind BK, Wahl PW. Quality assurance and quality control in longitudinal studies. Epidemiol Rev. 1997;20(1):71–80.
  14. Rosa C, Campbell A, Kleppinger C, Sampson R, Tyson C, Mamay-Gentilin S. Quality assurance of research protocols conducted in the community: the National Institute on Drug Abuse Clinical Trials Network experience. Clin Trials. 2009;6(2):151–61.
  15. Fontana D, Matthys C, Engel P, Clevidence BA, Todd K, Ershow AG. Staffing needs for research diet studies. In: Well-controlled diet studies in humans: a practical guide to design and management. Chicago: American Dietetic Association; 1999. p. 299–322.
  16. Obarzanek E, Pratt CA. Girls health enrichment multi-site studies (GEMS): new approaches to obesity prevention among young African–American girls. Ethn Dis. 2002;13(Suppl 1):S1–5.
  17. Klesges RC, Obarzanek E, Kumanyika S, Murray DM, Klesges LM, Relyea GE, et al. The Memphis girls’ health enrichment multi-site studies (GEMS): an evaluation of the efficacy of a 2-year obesity prevention program in African American girls. Arch Pediatr Adolesc Med. 2010;164(11):1007–14.
  18. Most MM, Craddick S, Crawford S, Redican S, Rhodes D, Rukenbrod F, et al. Dietary quality assurance processes of the DASH-Sodium controlled diet study. J Am Diet Assoc. 2003;103(10):1339–46.
  19. Ng M, et al. Global, regional, and national prevalence of overweight and obesity in children and adults during 1980–2013: a systematic analysis for the Global Burden of Disease Study 2013. Lancet. 2014;384(9945):766–81.
  20. Murphy SP. Collection and analysis of intake data from the integrated survey. J Nutr. 2003;133(2):585S–9S.
  21. Novotny R, et al. The Pacific way to child wellness: the Children’s Healthy Living Program for remote underserved minority populations of the Pacific region (CHL). Hawaii J Med Public Health. 2013;72(11):406–8.
  22. Hawley NL, McGarvey ST. Obesity and diabetes in Pacific Islanders: the current burden and the need for urgent action. Curr Diab Rep. 2015;15(5):29.
  23. McLennan AK, Ulijaszek SJ. Obesity emergence in the Pacific islands: why understanding colonial history and social change is important. Public Health Nutr. 2015;18(8):1499–505.
  24. Wilken LR, Novotny R, Fialkowski MK, Boushey CJ, Nigg C, Paulino Y, et al. Children’s Healthy Living (CHL) Program for remote underserved minority populations in the Pacific region: rationale and design of a community randomized trial to prevent early childhood obesity. BMC Public Health. 2013;13(1):944.
  25. Li F, Wilkens L, Novotny R, Fialkowski M, Paulino Y, Nelson R, et al. Anthropometric standardization in the US Affiliated Pacific: the Children’s Healthy Living Program (1024.6). FASEB J. 2014;28(Suppl 1):1024.6.
  26. Fialkowski MK, Yamanaka A, Wilkens LR, Braun KL, Butel J, Ettienne R, et al. Recruitment strategies and lessons learned from the Children’s Healthy Living Program Prevalence Survey. AIMS Public Health. 2016;3(1):140–57.


Authors’ contributions

AY conceptualized and drafted the manuscript and approved the final manuscript as submitted. MF performed this study’s data collection, reviewed and revised the manuscript, and approved the final manuscript as submitted. LW and FL designed the study protocols and data collection procedures, reviewed and revised the manuscript, and approved the final manuscript as submitted. RE, TF, AB, JD, PC, and RLG supervised jurisdiction training and data collection, reviewed and revised the manuscript, and approved the final manuscript as submitted. RN is the CHL Principal Investigator and led the CHL program’s development, design, data collection and training, and reviewed and revised the manuscript, and approved the final manuscript as submitted. All authors read and approved the final manuscript.

Acknowledgements

The authors wish to acknowledge the support received by partners in the Region and the willingness of the parents and children to participate in the prevalence survey.

Competing interests

The authors declare that they have no competing interests.

Funding

This project is supported by the Agriculture and Food Research Initiative Grant no. 2011-68001-30335 from the USDA National Institute of Food and Agricultural Science Enhancement Coordinated Agricultural Program.

Author information


Corresponding author

Correspondence to Marie Kainoa Fialkowski.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Yamanaka, A., Fialkowski, M.K., Wilkens, L. et al. Quality assurance of data collection in the multi-site community randomized trial and prevalence survey of the children’s healthy living program. BMC Res Notes 9, 432 (2016). https://doi.org/10.1186/s13104-016-2212-2
