Provisional Guidelines: The Use of Generative Artificial Intelligence (AI) in Research at McMaster University – January 2025 by McMaster University is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License
Introduction and Context
McMaster’s Provisional Guidelines on the Use of Generative Artificial Intelligence in Research are structured to provide guidance for the use of generative AI across the three primary stages of the research lifecycle: preparation, conducting research, and dissemination. Each stage presents opportunities and risks, which these guidelines aim to anticipate and address.
Because generative AI technology is developing rapidly, these guidelines will require ongoing review and revision to remain current with technological advancements and emerging practices.
Questions, comments or suggestions about these Guidelines may be directed to the Special Advisor, Generative AI at macgenai@mcmaster.ca.
The audience for these Guidelines is ‘researchers’ at McMaster, a broad category intended to include faculty researchers, graduate students engaged in research activities, and staff and undergraduate students who may be taking part in research.
Moreover, while these guidelines are written to be broadly applicable, there are contextual differences across disciplines and research areas, as well as differences for graduate student researchers. Graduate students should attend to both these Provisional Guidelines and the Guidelines for the Use of Generative AI in Teaching and Learning, as both apply: for coursework, the Teaching and Learning guidelines are especially relevant; for graduate research, the guidelines here are pertinent.
Where possible, these guidelines note where contextual differences may alter their application; feedback is encouraged on how they may be further refined to reflect these differences.
Differences among disciplines are mirrored in differences among granting agencies and publication venues. Researchers should carefully review the expectations and guidelines of these external bodies before using generative AI in any stage of the research process.
Provisional Guidelines on the Use of Generative AI in Research
- Generative AI tools may create false, misleading, or biased outputs. Critically evaluate and personally verify any outputs used in the research process. Researchers are personally accountable for the accuracy and integrity of their work.
- Many legal and ethical debates around the appropriate use of generative AI, including the inputting of personal information or copyrighted texts into generative tools, have yet to be settled. Researchers should stay informed about these debates and make appropriate legal, ethical, and political decisions in their research.
- Researchers who use generative AI should complete this module to help them review and reflect on the broader societal implications of generative AI, including those relating to labour, copyright, bias, and environmental impact.
- Researchers who use generative AI in any context should cite or acknowledge its use, drawing on McMaster Libraries’ LibGuide, and follow any publication- or grant-specific instructions.
Preparation
All researchers should:
- Carefully assess whether generative AI is an appropriate tool for their specific research question, methodology, and goals. They should consider the potential benefits and drawbacks of using AI-generated content or analysis in their work.
- Evaluate the compatibility of generative AI with their discipline’s established research practices, theoretical frameworks, and epistemological assumptions. They should be mindful of potential tensions or contradictions that may arise from the use of generative AI in their field.
- Evaluate the generative AI tools being considered for use against intellectual property, data security, privacy, ethical, political, and democratic considerations, with support from the Copyright Office, Privacy Office, and Information Security. In this evaluation, researchers are advised to weigh the risks of inappropriate or unanticipated uses, intellectual property infringement, potential data leaks, profiling, statistical inferences, or broken tech promises for their use case.
- If using generative artificial intelligence in research activities, researchers should consider using institutionally supported tools that have a completed Privacy and Algorithmic Impact Assessment.
- McMaster has an enterprise license for Microsoft Copilot, which ensures that, when users are logged in with McMaster credentials, the data used is not shared with either Microsoft or McMaster; confidential, personal, or proprietary information can therefore be used. See: Start Here with Copilot for currently licensed tools and capabilities.
- If researchers intend to use other tools, they should consult with the Privacy Office and the Office of Legal Services before use.
- Establish clear protocols for documenting the use of generative AI throughout the research process, including recording the inputs and settings used, any modifications or adjustments made along the way, and any use in the preparation of research materials. This documentation supports transparency, reproducibility, and accountability; a minimal sketch of one possible record format is shown below. [Note: a protocol guide is in development.]
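Pending the protocol guide noted above, the sketch below illustrates one possible way a research team might keep a machine-readable log of generative AI use. It is an illustrative assumption only: the file name, field names, tool name, and example values are hypothetical and do not anticipate the forthcoming protocol.

```python
# Illustrative sketch only: a minimal, machine-readable log of generative AI use.
# The file name, fields, and values below are assumptions, not a prescribed format.
import json
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("genai_use_log.jsonl")  # hypothetical log kept with project records


def record_genai_use(tool, version, purpose, prompt, settings, output_summary, reviewed_by):
    """Append one documented generative AI interaction to the project log."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "tool": tool,                      # e.g., "Microsoft Copilot"
        "version": version,                # model or build identifier, if known
        "purpose": purpose,                # research stage or task the output supports
        "prompt": prompt,                  # the input provided to the tool
        "settings": settings,              # any parameters, modes, or adjustments used
        "output_summary": output_summary,  # what was produced and how it was used
        "reviewed_by": reviewed_by,        # researcher who verified the output
    }
    with LOG_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")


# Example use (hypothetical values):
record_genai_use(
    tool="Microsoft Copilot",
    version="unknown",
    purpose="brainstorming interview themes",
    prompt="Suggest themes for semi-structured interviews on remote work",
    settings={"mode": "chat"},
    output_summary="Five candidate themes; two retained after manual review",
    reviewed_by="J. Researcher",
)
```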
Graduate student researchers and supervisors should:
- Discuss the role of generative AI in the graduate student's research project and its alignment with the expectations and norms of the supervisor, discipline, and academic community; graduate researchers should seek and receive explicit, documented approval to use generative AI in their research activities. See Appendix A.
Conducting Research
All researchers should:
- Assess the quality and validity of AI-generated data and analysis. This may involve cross-referencing with other data sources or conducting manual checks or audits.
- Follow data security and privacy requirements for any generative AI tools being used.
- Mitigate any ethical risks and challenges associated with using generative AI in data collection and analysis in consultation with the McMaster Research Ethics Board.
- Make the use of generative AI in research as transparent and reproducible as possible. Keep a record and provide detailed information about the inputs, prompts, tools, techniques, and data sources used, as well as any code or scripts employed in the analysis. While the reproducibility of generative AI outputs poses challenges, documenting these steps helps make the research as reproducible and transparent as possible.
- Ensure that the use of generative AI does not circumvent or shortchange the learning and research processes associated with personal reading, reflection, analysis, and context, or the personal embodied experience and insight brought by human researchers and communities.
Graduate student researchers should:
- In discussion with their supervisor, plan and document how they will demonstrate their own intellectual contributions and mastery of the subject matter when using generative AI in their research. They should be prepared to answer questions and demonstrate appropriate knowledge and expertise about any aspect of their research and its relevance to their program learning outcomes.
Dissemination
All researchers should:
- Clearly disclose and describe the use of generative AI in their research outputs, including publications, presentations, and other dissemination materials. See McMaster Libraries’ LibGuide “How do I Cite Generative AI” and cite and acknowledge accordingly.
- Ensure that their use of generative AI in research dissemination aligns with the norms, standards, and expectations of their discipline or field.
- Be aware of any specific requirements or restrictions imposed by journals, conferences, granting agencies or other dissemination venues regarding the use of AI-generated content. They should carefully review the submission guidelines and editorial policies to ensure compliance and avoid potential issues in the publication process.
- Not use AI-generated content verbatim, and always critically evaluate the appropriateness and relevance of the generated text, images, or other media for their specific research context.
Further work is required to support researchers in deciding if and when to use generative AI in the research enterprise and in using it responsibly and well. Specifically, the AI Expert Panel on Research will work over the 2024-25 year to develop and share:
- A protocol for documenting the use of generative AI in all stages of research
- A module specific to researchers that describes the risks and challenges of using generative AI
Appendix A: Supervisor and Graduate Student Researcher and/or Research Team Conversation Guide
Graduate students and supervisors should also consult the “Communication tool for supervisory relationships” before discussing generative AI use.
What do you [graduate researcher/supervisor/team member] already know about generative artificial intelligence, and what might you need to learn before incorporating these tools into your graduate research? What is your individual approach to generative artificial intelligence? What do you believe about its value or risks?
Possible discussion prompts:
- How would you describe your ‘philosophy of AI use’? When, how, and why do you think AI should be used in research?
- How might generative AI intersect with, influence, or impact your professional goals?
Permitted Activities
What scholarly activities within my graduate research/research may benefit from the use of generative AI?
Possible discussion prompts:
- What value might generative AI bring?
- For graduate student researchers: What impact might using generative AI for [this task] have on my core learning experience as a graduate student researcher?
- What phases of the research process and research activities would most benefit from the inclusion of generative AI?
Examples: translation, copy-editing, brainstorming, concept explanation, drafting, coding, data analysis, data visualization, simulations, literature reviews
Prohibited Activities
What scholarly activities within my graduate research/research should not involve the use of generative AI?
Possible discussion prompts:
- What might be some of the risks of using generative AI to complete [this task] or [this part]?
- What could be some of the negative impacts on my work, colleagues or my disciplinary community if generative AI was used for [this part]?
- For graduate student researchers: What impact might using generative AI for [this task] have on my core learning experience as a graduate student researcher?
Examples: translation, drafting, data analysis, data visualization, interpretation and analysis, synthesis, literature reviews
What benefits or risks does the use of generative AI present for me as a graduate researcher/researcher?
Examples: accessibility features, data sovereignty, implicit bias, data protection, privacy, data contamination and international data agreements.
How should I document and disclose when I have used generative AI in my work? What level of use (e.g., brainstorming, drafting, copy editing, coding) warrants disclosure? How do I ensure everyone involved in the work I am doing understands how we will use (or not use) generative AI?
Possible supervisor strategies:
- Citation and disclosure practices vary by context. Check with colleagues, journals and funding agencies in your area to consider what emerging norms for citation or disclosure may be.
- A sample acknowledgement could read: “[Name of generative AI tool] was used in the creation/drafting/editing of this [scholarly output]. I have evaluated this document for accuracy.”
Possible discussion prompts:
- What research ethics implications and obligations do we have to consider?
- What might be some reasons our [key consulted groups] might need or want to be aware that generative AI was used in this [type of work]?
- How do we ensure that everyone involved in a project or process that uses generative AI is aware and agrees to the use?
- What possible risks to our credibility or expertise are present if we do not disclose use of generative AI in this [type of work]?
- What professional obligations do we have to be transparent with our use of generative AI in our area?