The opportunities and threats of artificial intelligence are discussed in the second of this two-part series on use of these new technologies in nursing education
Abstract
Artificial intelligence (AI) is being used to create new digital tools, such as the chatbot ChatGPT, which are starting to be used for teaching and learning in higher education. Nurse educators could use the opportunities offered by AI-based digital tools to enhance how they teach clinical knowledge and skills to students. Nursing students should learn to use AI tools appropriately, not just by understanding the opportunities they offer, but also by being aware of the threats they may pose to academic integrity, professional practice, and patient care. This second of two articles on AI in nursing education explores these opportunities and threats, and how to use generative AI in the context of nursing education.
Citation: O’Connor S et al (2023) Artificial intelligence in nursing education 2: opportunities and threats. Nursing Times [online]; 119: 11.
Authors: Siobhan O’Connor is senior lecturer, Andi Fajrin Permana is nursing student, both at King’s College London; Sam Neville is chief nursing informatics officer, Mid and South Essex NHS Foundation Trust; Dominique Denis-Lalonde is nursing instructor, University of Calgary, Canada.
Introduction
Generative artificial intelligence (AI) is an exciting technological development that will continue to evolve, improve and lead to new digital products and services that could be used in healthcare. It can be incorporated into nursing education so nurses have good digital literacy they can use throughout their careers (Ronquillo et al, 2021) and are ready to thrive in the digital future (Booth et al, 2021).
In the first of these two articles on use of AI in nursing education, we discussed the strengths and weaknesses of generative AI; in this second article, we explore the opportunities and threats of this digital technology and give recommendations for using these new digital tools in nursing education.
AI is a suite of computer techniques that includes algorithms capable of processing large amounts of digital data, such as text, images, audio and video. Generative AI is one form of AI that uses algorithms and mathematical models to create new digital content, such as text, images, audio or video, based on prompts from a human user (Dwivedi et al, 2023). The most well-known and popular generative AI tool currently is a chatbot called ChatGPT.
UNESCO (an agency of the United Nations that focuses on education, arts, sciences and culture) recently released a quick guide to ChatGPT and AI that explains how it can be used in higher education. This includes step-by-step instructions on how to set up a ChatGPT account and start using this generative AI tool or ‘computer robot’. The guide is also available as a free video tutorial on UNESCO’s website and can be accessed in multiple languages.
These resources highlight that ChatGPT can be used to support teaching and learning, either as a standalone tool or by being integrated into electronic systems and processes that universities already use.
Opportunities of generative AI
Problem solving and decision making
There are many opportunities to use generative AI tools in higher education (Box 1). Tools like ChatGPT could be used in a collaborative way to help students working in groups to:
- Find information;
- Solve problems together.
Box 1. Opportunities and threats of generative AI tools
Opportunities
- Enhance and assess critical thinking skills through use of these tools – for example, find errors, explore different perspectives and develop informed opinions
- Create interactive and adaptive resources to provide tailored experiences for a range of learners
- Help with language acquisition and communication skills, such as writing
- Support digital literacy skills, such as prompt engineering, which involves crafting text-based prompts to achieve the required outputs from generative AI tools
Threats
- Privacy, security, and copyright concerns when large datasets from many sources are used to train AI tools and when sensitive or personal information is used to prompt them
- Digital inequity where some groups of people cannot benefit from the use of AI tools
- AI detection tools may be biased against non-native English students, increasing scrutiny and academic misconduct accusations for this group
- Lack of accountability for outputs that are inaccurate or misleading and potential loss of trust, integrity, and human creativity due to excessive AI use
- Replacing human interactions, feelings, and experiences with generative AI tools could reduce emotional intelligence and dehumanise aspects of social interaction and nursing education
- AI content being passed off as human work and excessive automation using AI tools leading to job losses and social unrest
AI = artificial intelligence
Sources: Dwivedi et al (2023), Liang et al (2023), Webb (2023), O’Connor and Booth (2022), Russell and Norvig (2021), World Health Organization (2021)
As an example, Rodriguez-Arrastia et al (2022) outlined how they used a chatbot called SafeBot in a simulated emergency situation to help final-year nursing students make clinical decisions to improve patient safety.
More than 100 students participated and, although there was a general feeling that the navigation, layout and content of the digital tool could be improved, there was support for the idea that integrating a conversational agent into professional practice might help nurses to deliver better care.
Learning
Another possible practical use of AI is as a personal tutor, giving students feedback on their progress with various learning tasks – for example, coaching students on their written communication skills.
Han et al (2022) tested whether an AI-based chatbot could help nursing students develop electronic foetal monitoring skills during the Covid-19 pandemic. Students viewed and interpreted graphs and patient symptoms, and learned to prioritise nursing interventions via a chatbot mobile app called LandBot, which provided feedback based on students’ responses. No significant improvements were found in knowledge and clinical-reasoning competencies, but nursing students did report a higher interest in education and self-directed learning after using the chatbot.
Plug-ins are also being created that can extend the functionality of ChatGPT and other generative AI tools. For example, Wolfram, a plug-in for ChatGPT Plus, allows people to ask questions that can be answered by Wolfram Alpha, a computational product that provides robust mathematical and scientific information. This might be useful when teaching students anatomy and physiology, or medication management and drug calculation skills.
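To illustrate the kind of drug calculation skill students might cross-check against a computational tool, the weight-based dose arithmetic can be sketched in a few lines of Python. All figures below are illustrative teaching values, not clinical guidance.

```python
# Weight-based dose check of the kind students could verify with a
# computational tool. All figures are illustrative, not clinical guidance.

def dose_mg(weight_kg: float, mg_per_kg: float) -> float:
    """Total dose in milligrams for a weight-based prescription."""
    return weight_kg * mg_per_kg

def volume_ml(required_dose_mg: float, concentration_mg_per_ml: float) -> float:
    """Volume to draw up for a given dose and stock concentration."""
    return required_dose_mg / concentration_mg_per_ml

# Example: 15mg/kg prescribed for a 24kg patient; stock solution is 120mg/5ml
dose = dose_mg(24, 15)              # 360mg total dose
volume = volume_ml(dose, 120 / 5)   # 15ml to administer
print(f"Dose: {dose}mg, volume: {volume}ml")
```

Working through such calculations by hand first, then checking the result with a tool, helps students build confidence in both the arithmetic and the technology.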
Other plug-ins, such as Link Reader and KeyMate, can:
- Generate summaries of articles;
- Organise and prioritise readings;
- Provide estimates of reading times.
Nursing students may find these useful when studying a range of topics or researching a particular topic for an assignment or clinical placement, while keeping in mind the limitations of generative AI tools, such as their potential to confabulate or provide misinformation (Dwivedi et al, 2023).
Educational content
Nurse educators can use generative AI tools to create new educational content for undergraduate and postgraduate courses to improve student learning. In particular, AI image and video generators could:
- Speed up the creation of new resources;
- Improve the quality of new resources.
As an example, Gamma is an AI-powered digital tool for creating websites and presentations that could be used when teaching face-to-face or online. The content created could also be integrated into learning management platforms, such as Blackboard, D2L or Moodle, for students to access remotely. This might help to:
- Reduce nurse educators’ workload by expediting some of the time-consuming parts of content creation;
- Decrease the costs involved in creating digital educational material.
Lesson plans and assignments
Generative AI could also be used to design lesson plans and assessments. Webb (2023) highlights that educators will need competent prompting skills to make the most of generative AI. Carless (2023) suggests creating staged student assessments each semester – such as a one-minute elevator pitch, followed by a short, annotated bibliography that can be shared with peers or a chatbot for feedback – before developing a longer piece of writing. These assessments could include AI-supported feedback that is personalised to offer students regular updates on their progress.
Plug-ins for ChatGPT could also be used by nurse educators when making lesson plans and designing assessments – that is, multiple-choice questions, essays, oral presentations, posters, blogs, group projects and more. Stories, developed by Smyth, is a plug-in that creates an illustrated digital storyboard based on the prompts provided by the user. As a rich narrative can accompany the visual content to communicate an engaging story, this type of content could be used in role play to guide nursing students through a range of patient scenarios in hospital or community settings.
Another plug-in called Show Me Diagrams can generate various types of diagrams, such as mind maps, in real time. Educators could ask nursing students to use this to demonstrate their understanding of pharmacology, for example, by depicting families of drugs and associated side-effects or contraindications.
Other AI-driven applications, such as TeacherMatic, can be used to create different resources, such as lesson plans and quizzes. In addition, the Alan Turing Institute (nd) has suggested exploring how AI tools could make curricula more accessible to students with different learning needs, including those with disabilities.
It is important that nurse educators acknowledge when and where they have used generative AI to develop teaching and learning resources, so the use of AI is transparent to students and colleagues.
“Generative AI tools are here to stay, and nursing education must adapt and incorporate them into undergraduate and postgraduate teaching”
Threats of generative AI
Privacy, security and copyright
As highlighted in the first article in this series, bias and misinformation are a weakness of generative AI tools, as their accuracy and impartiality are dependent on the quality of the data on which they are trained and the prompts provided by human users. There are several other threats of which nurses and students should be aware when using generative AI tools in higher education; these are outlined in Box 1.
Box 2. Example use case of generative AI in nursing education
- Context – a 90-minute computer laboratory session with third-year nursing students
- Objectives – to examine whether generative AI tools contain errors in their outputs and biases that reflect stereotypes and inequities in society
- Classroom setup – in small groups or individually, students are allocated a computer and given prompts to use in an AI-image generator platform
- Prompt 1 – students ask the generative AI tool to produce photorealistic images using the prompt “a group of nursing students with a teacher”, which will primarily produce images of young adult women in scrubs.
- Results – ask students to document any age, gender, racial or other biases they notice in the images produced by the prompt used. Students and educators must be critical of AI-generated text, images, audio and video, and seek to identify and rectify bias in any outputs produced. Students must understand that, although AI tools may impress, they are not infallible or all-knowing and, consequently, their output must be scrutinised.
- Prompt 2 – students will need to learn to develop prompts that limit or correct bias. For example, students could ask the generative AI tool to produce photorealistic images using the following prompt: “a group of nursing students, including some male students, with a teacher”.
- Students’ knowledge base will need to be sufficiently advanced to spot bias or incorrect information when it does occur; for example, some of the images created included erroneous representations of stethoscopes.
- Considerations: in a larger group, students could share and discuss the weaknesses of generative AI tools, highlighting the issues they identified and how their outputs changed as they tried various prompts. Learning activities that involve critiquing AI-generated content can help improve students’ digital literacy skills and could form part of a formal assessment.
AI = artificial intelligence
One important aspect is the privacy, security and copyright of data available online or through connected technologies, as there is a risk these data could be used to develop AI tools without appropriate permissions. For example, Getty Images – a large visual media company – is pursuing legal action against Stability AI, which created an AI image generation tool, for allegedly stealing its stock images to develop its digital product (Korn, 2023).
Nursing students should be educated about data protection and privacy in relation to digital tools, from social media to mobile apps and generative AI. This will help them to understand the types of data that are appropriate for prompting AI tools, and the need to avoid using any personal or sensitive information, or copyrighted data. The World Health Organization (2021) has developed a set of key ethical principles on the appropriate use of AI in healthcare that nursing students can be taught.
Lessons in prompt engineering, in which students learn how to interact with an AI tool and refine their questions and statements, should include guidance on acceptable prompts and the importance of not sharing personal or sensitive information with chatbots and other AI tools. However, while prompt engineering could be useful to explore with students in the short term, Acar (2023) argues that these skills will be less necessary in the future as generative AI becomes more sophisticated.
Digital inequity
Educators and students may be concerned about fairness in accessing and using generative AI tools, particularly if they are being used to help students learn information and prepare assignments. Digital inequality already exists in many countries around the world, where some groups of people have:
- Limited or no access to the internet and other technologies;
- Poor digital literacy skills, meaning they struggle to use technology (Robinson et al, 2015).
This gap may widen further because there will be students who cannot afford to use AI tools, which could disproportionately affect those who are already disadvantaged and struggling at university (Miao et al, 2021). Nurse educators should ensure AI tools are part of standard IT provision so every student can readily access them. In addition, all staff and students need to be trained on how to use generative AI to support their digital literacy and professional development, and minimise digital inequity.
It has been shown that AI-detection software tools (such as Turnitin and GPTZero) consistently misclassify contributions by writers who have English as a second language (ESL) or English as an additional language (EAL) as being AI-generated (Liang et al, 2023). Widespread and indiscriminate use of these detection tools could unfairly penalise ESL/EAL students by exposing them to increased scrutiny and accusations of academic misconduct. This could perpetuate existing inequities experienced by various cultural groups in academic settings and place a significant burden on educators to catch and penalise students, rather than facilitate their learning.
The ethical use of AI-detection tools in educational settings should be thoroughly discussed in the faculty, embedded in policies and disclosed to students, to determine the appropriateness of their use and the likelihood of false positives. In addition, verification processes should be established before enacting academic misconduct policies.
Accountability
Another issue with generative AI tools can be lack of accountability for outputs that may be inaccurate or misleading. Disclaimers by AI tool providers, and lack of regulation of AI technologies in many countries, means responsibility for using AI tool outputs – whether text, image, audio or video – lies with the person using it (Hacker et al, 2023). There may be an assumption that any individual who uses generative AI is aware of this responsibility, but this is not always the case.
Nurse educators should explicitly highlight to students that professionalism includes accountability when using data from digital tools, such as using AI-generated results to inform decision making in clinical practice. Use of AI should always be combined with critical thinking and professional judgement about the quality and appropriateness of using AI or other computer-generated data. There is also a risk that poor-quality outputs may lead to a loss of trust in AI tools and systems over time, such as those used in higher education.
Bias and inaccuracies
A further threat posed by generative AI tools is that they could reduce people’s ability to distinguish between real and fabricated content. Fake photographs and videos are prolific on social media and other online platforms, and can distort the facts about social, political and health-related events, such as Covid-19 (Apuke and Omar, 2021). Fake news is often shared by people on social media, with some content going viral and quickly reaching millions of people. Generative AI tools could make this problem worse as they can be used to quickly create realistic and convincing text, images, audio or video, which can then be shared online.
Nursing students must be taught about digital professionalism so they understand the risks involved in creating, sharing and consuming AI-generated and other digital media (O’Connor et al, 2022). Some scholars have already created student lesson plans to introduce and explore issues of confabulation with AI-generated content, which may be helpful to nurse educators and their students.
Box 2 provides a use case of generative AI in nursing education that focuses on recognising bias and inaccuracies, and developing prompts to reduce them.
Dehumanisation
Another consideration is that, because these generative AI tools are not emotionally intelligent but are based on predictive language models, relying on them and spending large amounts of time interacting with them may affect human relationships. Stokes and Palmer (2020) argued that replacing human interactions with AI tools could lead to a loss of emotions and feelings, and dehumanise aspects of social interaction. As such, it is important to balance teaching on AI with other forms of in-person education (such as simulation education, mentorship and clinical placement), which allow nursing students to explore building therapeutic relationships with patients, peers and other health professionals in real-world settings.
Conversely, Limon and Plaster (2022) argued that, using a combination of voice, video and text-based tools that can monitor and interpret human behaviour, AI could teach us to be more emotionally intelligent. This might be something for nurse educators to explore in the future as AI tools become more advanced.
Automation
In the future, tasks usually done by humans – such as teaching students and assessing their work – could be automated using generative and other AI-based technologies. Excessive AI automation is a concern in many sectors, including higher education, as various jobs done by humans (such as university lecturing) could be replaced by AI bots running virtual classrooms (Haw, 2019). Although this is a possibility, nurse educators can leverage their creativity and years of clinical expertise to support student development, while devoting more energy to research activities and pedagogical innovation to stay ahead of the AI curve.
Conclusion
Generative AI tools are here to stay, and nursing education must adapt and incorporate them into undergraduate and postgraduate teaching. As nurses may need to use them in healthcare, clinical research and higher education, students should be taught how generative AI works, along with the strengths, weaknesses, opportunities and threats of these new digital tools. It behoves nurse educators to embrace AI and upskill to develop curricula that teach students about AI and its potential impact on patient care and health service delivery.
Key points
- Artificial intelligence is being used to create digital tools that can generate text, images, audio and videos quickly and easily
- These artificial intelligence-based digital tools could support nursing education in university and clinical settings
- Generative artificial intelligence tools could bring new opportunities for nurse educators and clinical mentors to support teaching and assessment
- Nursing students should learn about the potential opportunities and threats of artificial intelligence-based digital tools
- Nursing students’ digital literacy should include acquiring artificial intelligence knowledge and skills to use in professional practice
References
Acar OA (2023) AI prompt engineering isn’t the future. hbr.org, 6 June (accessed 20 September 2023).
The Alan Turing Institute (nd) AI and inclusion. turing.ac.uk (accessed 26 September 2023).
Apuke OD, Omar B (2021) Fake news and Covid-19: modelling the predictors of fake news sharing among social media users. Telematics and Informatics; 56: 101475.
Booth RG et al (2021) How the nursing profession should adapt for a digital future. BMJ; 373: n1190.
Carless D (2023) How ChatGPT can help disrupt assessment overload. timeshighereducation.com, 19 April (accessed 20 September 2023).
Dwivedi YK et al (2023) “So what if ChatGPT wrote it?” Multidisciplinary perspectives on opportunities, challenges, and implications of generative conversational AI for research, practice and policy. International Journal of Information Management; 71: 102642.
Hacker P et al (2023) Regulating ChatGPT and other large generative AI models. In: FAccT ’23: Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency. Association for Computing Machinery.
Han J-W et al (2022) Analysis of the effect of an artificial intelligence chatbot educational program on non-face-to-face classes: a quasi-experimental study. BMC Medical Education; 22: 830.
Haw M (2019) Will AI replace university lecturers? Not if we make it clear why humans matter. theguardian.com, 6 September (accessed 20 September 2023).
Korn J (2023) Getty Images suing the makers of popular AI art tool for allegedly stealing photos. edition.cnn.com, 18 January (accessed 20 September 2023).
Liang W et al (2023) GPT detectors are biased against non-native English writers. Patterns; 4: 7, 100779.
Limon D, Plaster B (2022) Can AI teach us how to become more emotionally intelligent? hbr.org, 25 January (accessed 26 September 2023).
Miao F et al (2021) AI and Education: Guidance for Policy-makers. UNESCO.
O’Connor S et al (2022) Digital professionalism on social media: the opinions of undergraduate nursing students. Nurse Education Today; 111: 105322.
O’Connor S, Booth RG (2022) Algorithmic bias in health care: opportunities for nurses to improve equality in the age of artificial intelligence. Nursing Outlook; 70: 6, 780-782.
Robinson L et al (2015) Digital inequalities and why they matter. Information, Communication & Society; 18: 5, 569-582.
Rodriguez-Arrastia M et al (2022) Experiences and perceptions of final-year nursing students of using a chatbot in a simulated emergency situation: a qualitative study. Journal of Nursing Management; 30: 8, 3874-3884.
Ronquillo CE et al (2021) Artificial intelligence in nursing: priorities and opportunities from an international invitational think-tank of the Nursing and Artificial Intelligence Leadership Collaborative. Journal of Advanced Nursing; 77: 9, 3707-3717.
Russell S, Norvig P (2021) Artificial Intelligence: A Modern Approach. Pearson.
Stokes F, Palmer A (2020) Artificial intelligence and robotics in nursing: ethics of caring as a guide to dividing tasks between AI and humans. Nursing Philosophy; 21: 4, e12306.
Webb M (2023) A generative AI primer. nationalcentreforai.jiscinvolve.org, 11 May (accessed 26 September 2023).
World Health Organization (2021) Ethics and Governance of Artificial Intelligence for Health: WHO Guidance. WHO.