The outside world is watching how we acquire, maintain, validate, and report our research findings. Public mistrust of science has grown, fueled by a few instances of serious scientific misconduct, unfortunately including misconduct committed at Duke, and by many more examples of unreliable findings evidenced by “failures to replicate”. These issues have been highly publicized and demand remedies in both our practice and our culture.
Duke University is committed to maintaining the highest quality and integrity of its scientific enterprises. Because of this commitment, the School of Medicine (SOM) is required to have mechanisms to guarantee the responsible management and critical review of scientific data. This is analogous to the school’s obligation to ensure lab safety, proper clinical study procedures, and the appropriate use of animals in research. The SOM has opted to allow individual departments to adopt their own policies and procedures related to scientific accountability and integrity. For this reason, the Department of Ophthalmology is committed to ensuring that policies and procedures are in place to reflect the highest professional conduct and to promote a culture in which scientific results are critically reviewed and accountability for data integrity is clearly delineated. In addition, departmental policies must allow concerns about data integrity to be raised without hesitation and provide mechanisms by which these concerns can be addressed fairly and expeditiously.
Data provenance and integrity ensure that the knowledge we report is supported by the primary data, and that the primary data are retained in a form that allows us to be certain of the veracity of our knowledge. Scientific rigor ensures the proper application of the scientific method using the highest standards in the field. Scientific rigor is essential to conduct of the scientific enterprise.
All investigators should keep in mind the adage: “if it seems too good to be true, then it probably is too good to be true”. Scientific data is inherently messy and, as a corollary, data that is too clean may have been cleaned up. With this in mind, as principal investigators we should follow three general principles:
- We should know the location of the raw data generated by both laboratory members and any core facilities.
- We should know how data has been acquired, modified and analyzed.
- We value and encourage constructive critiques of research and allow open discussion of any concerns regarding research conduct or integrity.
Every member of the Department of Ophthalmology – faculty, trainees, staff and administrators – should understand and follow these principles.
Any questions, comments or concerns should be addressed to the following individuals:
- For suggestions, issues or concerns related to basic science, contact Vadim Arshavsky, Ph.D., Scientific Director.
- For suggestions, issues or concerns related to clinical science, contact Scott Cousins, M.D., Vice Chair for Research.
- You may also contact the Department Chair, Edward Buckley, M.D., directly, particularly when you feel a real or perceived conflict of interest may exist.
Additional resources at Duke include:
- Duke Clinical and Translational Research
- Duke Office for Institutional Equity
1. Recommended best practices for improving the culture of scientific accountability within individual laboratories
Laboratory research is defined as any investigation using “wet” or “dry” laboratory resources, typically incorporating, but not limited to, data derived from animals, tissues, cells, biochemical or molecular assays, images, informatics analyses of large datasets, novel devices, novel software and novel algorithms applied to data analysis. The principles below provide guidance for ensuring the integrity of your own data and for maintaining compliance with this plan. You should discuss these expectations with your research team, and develop explicit processes within your lab to monitor compliance with these policies. It is expected that you will review the steps you have taken to comply with this policy at your annual meeting with the Department Chair.
Best practices in experimental design
- Employ both positive and negative controls.
- Employ systematic random sampling for data collection, including, but not limited to, the selection of areas chosen for sampling (i.e., regions within cells or tissues selected for imaging or analysis).
- Strive to eliminate bias in experimental procedures and analysis. If practical, experimenters should be masked to treatment. Consider balancing the timing of experiments to account for sources of bias over time (e.g. evolution of surgical skills, fatigue, circadian rhythms in experimental animals).
- Consider multiple methods, techniques or analytic approaches for reproducing and comparing results from your experiments.
- Use replicate samples (to accommodate both technical and biologic variation) for experimental groups, when appropriate.
- Use validated and/or well-characterized reagents (such as antibodies and pharmacological agents), or conduct full validation.
- Consider inherent limitations of human, animal and cellular studies arising from possible contributions of genetic background, gender and other relevant factors.
- When using shared core facilities, both University-based and Department-based, always understand the methodology they employ and critically evaluate the raw data for any results they provide.
Best practices in data analysis and statistics
- If significant statistical analysis is needed, consult with a biostatistician both before and after data collection.
- When applicable, determine sample size by pre-experiment power analyses. Identify stopping points a priori to avoid testing to a foregone conclusion.
- When in doubt, cross-train laboratory personnel so that one person can independently verify the results of another.
- Use care in pooling data across experiments performed at different times or different experimental groups.
- Avoid arbitrary data exclusion. Exclude data only if there is a compelling, transparent and documented reason to do so (e.g., a documented error in solution composition, data for the same set erroneously collected at different temperatures, contaminated cell culture, etc.).
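To make the “pre-experiment power analysis” recommendation above concrete, the sketch below estimates the sample size per group for a two-sided comparison of two means. It is an illustrative example only, not a substitute for consulting a biostatistician: it uses the normal approximation (which slightly underestimates the sample size a t-test would require), and the effect size, alpha, and power values are assumptions you must justify for your own experiment.

```python
import math
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate sample size per group for a two-sided, two-sample
    comparison of means (normal approximation):
        n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2
    where d is Cohen's standardized effect size."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for the test
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A "large" effect (d = 0.8) at alpha = 0.05 and 80% power:
print(n_per_group(0.8))  # → 25 per group
```

Identifying the required n before the experiment, rather than adding animals or samples until significance appears, is exactly the a priori stopping point the bullet above calls for.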
Best practices in data management
- Develop well-defined and uniform standard operating procedures (SOPs) for documentation of experimental activities. This applies to keeping records in “data notebooks”, data storage, documenting protocols, data modification and analysis.
- Each laboratory member should read and understand the data management SOP. This should be acknowledged in writing prior to beginning research in the lab.
- Retain complete primary data, backed up, and protected against alterations. The U.S. Department of Health and Human Services requires that all project data be retained for at least 3 years after the funding period ends.
- Alterations and modifications of the primary data should be performed on copies of the data whenever possible, and should be tracked, dated, and described.
- Data notebooks should be available for viewing. Consider performing periodic audits of laboratory notebooks to ensure that a third-party reviewer would be satisfied with the level of documentation provided for an experiment.
- Digital archives should be properly organized and labeled so that they can be audited. The same applies to any data that comes from shared equipment or core facilities.
- Ensure integrity of the data obtained by your collaborators. Personally examine raw data and, when in doubt, perform an independent analysis of data generated by collaborators to verify accuracy.
- The level of information security should be appropriate for the data, especially for human subject protection and personal health information (PHI).
- Data should be accessible to all data owners and, when applicable, available to outside investigators after publication.
Best practices in publication
- Avoid “rushing” findings into publication without a full investigation and proper self-replication.
- Report full details on methods and experimental design, including technical and biological replicates, methods for randomization and masking, and self-replication efforts.
- Report complete results of all analyses done as part of an experiment, including statistical analyses. It is better for Methods sections to be too long rather than too short.
- Target appropriate journals for publication. Avoid pressure to publish in the most glamorous journal at the expense of following the best practices for experimental design, data analysis and statistics. If a paper requires a long methods section or many figures to document the science thoroughly, do not try to compress it into a short format, no matter how “important” the results seem.
- Attempt to publish well-controlled but negative, “uninteresting,” or “not novel” results in appropriate venues (e.g. in PLOS ONE).
- Consider submission to another journal if the peer review process demands additional experiments on an abbreviated timeline (an unfortunate emerging trend) because of the associated time pressure and potential for bias (i.e., if the results need to be interpreted to conform to previously-reached conclusions).
Creating a functional and proactive scientific culture
- Create a culture of open conversation and willingness to accept internal critiques and challenges of data without retribution.
- Understand that questioning data integrity does not constitute a misconduct accusation.
- Inform all Department staff that they may bring any concerns to the attention of the Department Chair, Vice Chair for Research, the Scientific Director, or the CRU Medical Director, without fear of retaliation or retribution. Staff should also be aware of the Duke Integrity Line to report concerns anonymously.
- Principal investigators and laboratory heads should be actively involved in laboratory procedures, should oversee some of the actual experimental work, and should “know” how things are done in their laboratory.
- Principal investigators and laboratory heads must recognize that although laboratory research is motivated by the pursuit of true knowledge, certain incentives or pressures (or the appearance thereof) may influence their staff to deviate from best practices, especially based on concerns about academic promotion, choice of publication venue, grant submission deadlines or competition with other labs.
- Issues of proper scientific conduct and scientific rigor should be discussed with staff regularly, in both private and group settings.
- Laboratory meetings with staff should include inspection of some primary data and discussion of detailed analysis procedures, as well as discussion of final publication-style figures.
2. Recommended best practices for improving the culture of scientific accountability for clinical research
Clinical research is defined as any investigation using data derived from patients, including observational research (data from archived sources, including cases, chart reviews, insurance databases and “big data” databases), interactive research (non-risk research on consented participants, including imaging studies, blood draws, tissue swabs, or surveys) and interventional research (potentially risk-involving research involving consenting participants with data derived from active comparison of therapeutic treatments or diagnostic tests). Some clinical research also incorporates principles of laboratory research if performed by Duke investigators (i.e., genetics, biomarkers, informatics analyses).
Best practices in clinical research are generally derived from the Declaration of Helsinki and subsequent guidance documents. The Duke Office for Clinical Research (DOCR) has extensive resources outlining “best clinical practices” for conduct of clinical research, including required CITI modules and Duke-mandated Human Subject Research (HSR) training, as well as many other web-based educational modules. In the Department of Ophthalmology, we have developed many specific “Policies and Procedures” for the Clinical Research Unit (CRU) and have developed a “Survival Guide for Clinical Research” highlighting the most relevant policies and procedures. All new clinical investigators should familiarize themselves with these resources and complete the required training. Established investigators should periodically review these policies. Clinical investigators are also encouraged to discuss their proposed studies with the Vice Chair for Research, the Medical Director for the CRU and/or the Clinical Research Practice Manager. The on-boarding process for initiating new studies at Duke incorporates many elements of good clinical practice, especially if the study is a multi-center, federally funded or industry-funded trial. However, the key issues highlighted below are especially relevant to principal investigators of observational or Duke-specific investigator-initiated studies:
Best practices in protocol design, data analysis and statistics
- Have a clearly identified research question or hypothesis.
- When using archived data, consider dividing your sample into a “training” data set and a “validation” data set.
- Consider inherent limitations and confounding variables arising from possible contributions of genetic background, gender and other relevant factors.
- When applicable, consult with a biostatistician both before and after data collection.
- When possible, employ randomization, masking, and/or appropriate controls. When applicable, perform a priori sample size calculations.
- Avoid arbitrary data exclusion. Exclude data only if there is a compelling, transparent and documented reason to do so.
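The training/validation split recommended above can be sketched in a few lines. This is a hedged illustration: the 70/30 split, the fixed seed, and the use of de-identified record IDs are assumptions for the example, and the validation set should be touched only once, to confirm a hypothesis developed on the training set.

```python
import random

def split_records(record_ids, validation_fraction=0.3, seed=2024):
    """Randomly partition de-identified record IDs into a 'training' set
    (used to develop a model or hypothesis) and a held-out 'validation'
    set (used only once, to confirm it). Fixing the seed makes the split
    reproducible and auditable."""
    ids = sorted(record_ids)           # deterministic order before shuffling
    random.Random(seed).shuffle(ids)
    n_val = int(len(ids) * validation_fraction)
    return ids[n_val:], ids[:n_val]    # (training, validation)

training, validation = split_records(range(100))
```

Recording the seed and the split fractions in the study protocol documents the partition a priori, so the validation result cannot be quietly re-run until it agrees with the training result.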
Best practices in data management
- Any investigator with an active IRB protocol must complete the appropriate CITI modules and web-based HSR training. Investigators (including trainees) are highly encouraged to complete DOCR’s Informed Consent Process, Data Integrity and Security, and Study Documentation training.
- Comply with all SOM and FDA regulatory requirements (i.e., regulatory binders, IRB approvals, etc.).
- All electronic data must be stored on a Duke-recognized/approved server. The location of the study data must be clearly reported in section 12.1 of the Duke e-IRB application, the Research Data Security Plan (RDSP), and amended whenever the location of the data changes. NO data should ever be stored on a non-Duke computer.
- For investigator-initiated studies, retain all primary data, back them up, and protect them against alteration. The U.S. Department of Health and Human Services requires that all project data be retained for at least 3 years after the funding period ends. Duke University has a six-year data retention policy for human research study data. Industry sponsors may have additional requirements.
- Perform any alterations and modifications of the primary data on copies of the data, and these changes should be tracked, initialed, dated, and described.
- Properly organize and label all digital archives so that they can be audited. The same applies to any data that comes from shared equipment or core facilities.
- Expect periodic internal audits performed by the CRU Research Practice Manager to ensure that a third-party reviewer would be satisfied with the level of documentation.
- Ensure the level of information security is appropriate for the data, especially for human subject protection and PHI.
- Facilitate accessibility of all data to co-investigators and when appropriate, to outside investigators.
Best practices in publication
- Attempt to publish all study results, even if there is a negative finding.
Creating a functional and proactive scientific culture
- Schedule regular meetings with the entire research team.
- Develop a culture of open conversation and willingness to accept internal critiques and challenges of data and research processes without retribution.
- Reinforce to all Department staff that they may bring any concerns to the attention of the Department Chair, Vice Chair for Research, CRU Medical Director or the Scientific Director, without fear of retaliation or retribution. Staff should also be aware of the Duke Integrity Line to report concerns anonymously.
- Understand the clinical trial protocol.
- Regularly discuss issues of best clinical practices and scientific rigor with staff, in both private and group settings.
3. Departmental safeguards to promote a culture of scientific accountability
The Department of Ophthalmology is committed to a culture of scientific accountability. Accordingly, the Departmental leadership is taking steps to support, guide and ensure a culture of scientific integrity, including the mechanisms and actions listed below:
1. Facilitate discussions of proper scientific conduct at all levels: faculty meetings, lab meetings, and courses, especially focusing on the potential pressures incentivizing deviation from best practices and poor conduct.
2. Expect all PIs to develop a “Data Management SOP” for each project that will provide specific guidelines for data acquisition, storage, and transparency. This SOP should cover the following components of data management:
- How and where are the data stored?
- How are laboratory notes or data collection sheets recorded and stored?
- How is analysis done, and how are sequential changes tracked, saved and stored?
- How are the published figures linked to the analysis and the original data?
3. Ensure all research staff in all laboratories read the Department’s Scientific Culture and Accountability Plan.
4. Expect all faculty active in laboratory investigation to present their findings at the Friday morning research conference, an interactive format allowing peer review and critical evaluation of data. The minimal presentation frequency should be annual for non-tenured faculty and biennial for tenured faculty. In addition, each postdoc and graduate student should present their work annually at the “Science Lunch” meetings. Attendance at both meetings is obligatory for research faculty, to promote an internal culture of mentoring trainees in best practices and open peer review.
5. Assign formal mentors to new investigators and emphasize the principles outlined above. Reinforce that adhering to these principles matters not only for the pursuit of high-quality basic science, but also because laboratory investigation often influences patient care, future studies in humans, and the development of biomarkers or new drugs.
6. Promote the sharing of best practices regarding data integrity through a central resource of documents and materials available to all Ophthalmology faculty and trainees. Provide software solutions, analytical support and other resources as appropriate.
7. Expect transparency and clear communication from the SOM regarding cases of scientific misconduct that occur at Duke. Understanding details of these cases is critical for preventing similar instances in the future.
8. Educate the faculty and staff about available resources, such as: