AI & Medical Education Conferences 2026

Here are some major upcoming events in 2026 where artificial intelligence intersects with medical education, health sciences, and university-level research. These include conferences, summits, and academic gatherings that are either explicitly focused on AI in medical teaching and training, or on closely related fields (AI in healthcare, precision medicine, digital innovation) with strong relevance to universities and educators:

🎓 Medical Education and AI-Focused Conferences

1. Innovations in Medical Education Conference (University of Miami) – March 26-27, 2026
A dedicated two-day conference on integrating AI into medical education, with panels and workshops on curriculum design, teaching technologies, ethics, and AI tools for assessment and learning.

2. International Conference on Medical Education, Health Sciences & Patient Care – Paris, October 21-22, 2026
Broad medical education conference with sessions spanning innovation, pedagogy, and technological applications, including AI in training and healthcare delivery.

3. 23rd Innovations in Medical Education (USC Online Conference) – February 11-12, 2026
Annual medical education event hosted by the University of Southern California, offering virtual participation and global engagement in education innovation (likely includes AI-related topics).

4. 8th Midlands Medical Education Conference – 2026 (UK)
Regional medical education event including themes on embracing AI and digital tools in medical teaching across universities.

🧠 AI and Medicine / Healthcare Innovation Events

These gatherings are broader than pure medical education but are highly relevant to academics, researchers, and faculty exploring AI in healthcare teaching, research skills, and clinical workflows:

5. AIME 2026 – Artificial Intelligence in Medicine – Ottawa, Canada, July 7-10, 2026
International conference on AI and medicine hosted at the University of Ottawa, with deep ties to academic research and cross-disciplinary scholarship.

6. International Conference on Precision Medicine and AI Healthcare – Prague, July 27-28, 2026
Global forum for clinicians, researchers, and educators on AI’s role in precision medicine and data science, often including academic perspectives.

7. AI4Health: Improve Health Through Artificial Intelligence Conference – University of Florida campus (2026)
University-hosted event combining AI healthcare research, training opportunities, and academic discussion.

8. AI Summit (Mayo Clinic) – Rochester, MN, June 4-5, 2026
While broader in scope, this summit addresses AI in clinical settings and systems, complementing medical education with real-world applications.

🎓 Education & AI Broadly Relevant

9. AI + Education Summit 2026 (Stanford Accelerator for Learning & HAI)
Though not solely medical, this high-impact summit brings together AI and education thought leaders and covers applications relevant to health professions education and curriculum design.


📌 Tips for Academic Participation

  • Abstract submissions often open months ahead of events like AIME and HealthAI2026, so plan early.

  • Many conferences offer workshops or tracks on AI ethics, curriculum innovation, and experiential learning that are especially useful for faculty and graduate scholars.

  • Check university event calendars (e.g., Stanford HAI, medical school education departments) for unofficial symposia or local AI in medical education workshops throughout the year.

Hype to Hospital

This evening I attended “From Hype to Hospital: How AI is being used in Healthcare and Research,” an event hosted in British Columbia, Canada.

We’re surrounded by data in our healthcare system, but our ability to convert it into timely, trustworthy decisions is still limited by workflow, infrastructure, and governance. Provincial data collection continues to be labour intensive, often manual, and delayed.

As reported this evening, in trauma care there can be a 12-18 month lag between what happens in the Emergency Department and Trauma Service and what ultimately lands in registries, dashboards, and system-level reports. Check out the article “iROBOT: Implementing Real-time Operational dashBOards for Trauma care” to learn more: https://lnkd.in/gvBQKMgs

Other interesting points from presenters include:

+ Structured data is easy to analyze; narrative data holds the nuance that can change risk and interpretation.
+ AI can speed screening and reporting, reduce false positives, and support real-time dashboards, provided it is evaluated honestly.
+ In BC, common use cases are emerging: early warning (sepsis, deterioration), staffing and scheduling, operational intelligence.
+ The hard part is the pipeline from discovery to pilot to scaled deployment; many projects stall before impact.
+ Implementation risks are real: trust (confabulation, over-reliance), privacy, environmental cost, workforce disruption.

My takeaway: We have a responsibility to educate and train healthcare practitioners in the use of AI, and to start asking critical questions about how it will affect patient care.

Slides attached are from Graham Payette’s AI BC briefing.

Primary Compassionate Care Hub

Today I get to say something I have dreamed of saying for years: we did it!

The Primary Compassionate Care Initiative now has a place to call home. Our Primary Compassionate Care Hub is real, it is built, and it is open. It is bright, welcoming, and intentionally designed for what matters most: people, learning, and community.

This Hub has been a long time coming. It grew out of Dr. Aisha Liman’s simple belief that keeps guiding everything we do at the Primary Compassionate Care Initiative: care is not only a clinical act; it is a culture we build together. We wanted a home for that culture, a place where learners, mentors, clinicians, and community members can gather, teach, and grow side by side.

A space designed for learning that feels human

When you walk in, you can feel the purpose immediately. The Hub includes a training and teaching room set up for workshops, small group sessions, and hands-on learning. There are flexible chairs and work tables, a presentation area, and a layout that supports real interaction, not passive sitting.

We also created comfortable tiered seating with bright cushions, a simple detail that quietly changes the energy of a room. It invites discussion. It invites listening. It invites people to stay.

In other words, this is not a room designed only to deliver content. It is a room designed to build confidence, conversation, and competence.

A “community begins with us” moment, made physical

One of my favorite elements is the message at the entrance: “Community begins with us.” It captures what we are trying to do: not just teach skills, but create a place where belonging and responsibility are visible, shared, and practiced.

This Hub is meant to be a meeting point for learners and the communities they serve. It is a space where we can run mentorship sessions, leadership development, health education programming, team-building activities, and practical skill-building workshops grounded in compassion.

Yes, we even built the warmth into the walls

There is another sign in the space that made me smile because it is exactly right: “Compassion is Brewing.” It sits above a warm, welcoming café-style area that feels like a natural gathering place. Because sometimes the most meaningful learning happens before the session starts, during the conversations, the laughter, the quiet questions, and the “Can I ask you something?” moments.

We wanted the Hub to feel energizing and calm at the same time. The lighting, the wood tones, the bright colors, and the clean design all contribute to that balance. It feels professional, but not sterile. It feels modern, but still personal.

What this Hub will be used for

This Hub is a working space, and we have big plans for it, including:

  • mentorship and academic support for health practitioner learners
  • workshops on communication, teamwork, and patient-centered care
  • community health education sessions
  • faculty and preceptor development
  • collaborative project meetings and design sessions
  • leadership and professional identity development

Most importantly, it will be a place where learners can practice building trust, showing up with humility, and delivering care that is both competent and kind.

Gratitude, and what comes next

I am deeply grateful to everyone who helped make this possible. A space like this is never built by one person. It is built by shared values, steady work, and people who refuse to let good ideas stay only in the imagination.

This Hub is a milestone, and it is also a beginning.

Because now we get to do what we came here to do: grow a new generation of compassionate, community-grounded health professionals, and support them with the structure, mentorship, and environment they deserve.

Thank you for believing in this dream with us.

More soon, and if you are nearby and want to visit, collaborate, or support a workshop, I would love to hear from you.

With gratitude,
Jacqueline

Course: Trauma & Disaster Team Response

When Disaster Hits, Family Medicine Is Still the Front Door

Disasters and major trauma can feel like “someone else’s job” until the day your clinic, urgent care, or community hospital becomes the first place people arrive. In those moments, what matters most is not just clinical knowledge; it is teamwork, role clarity, and a shared plan.

A useful option is Trauma and Disaster Team Response, a free online course on SURGhub offered through the McGill University Health Centre’s Centre for Global Surgery. It is built around multidisciplinary trauma and disaster response, with lectures and quizzes, and it is designed to strengthen how teams function under pressure.

Why it matters for family medicine

Family physicians are often central to stabilization, triage, transfer decisions, and supporting staff and communities in the aftermath. This training can help build a common language for response, especially in rural and community settings where resources and staffing can shift quickly.

What you can take back to your team

  • Clearer roles during urgent resuscitation and surge situations
  • More confidence with transfer readiness and escalation
  • A framework for thinking about disaster response as a system, not just a single patient
  • A nudge to turn preparedness into practical clinic improvements (call trees, checklists, short drills)

Learn more here at McGill University’s SURGhub. And a big thank you to Ali for sharing this resource with me. His work in this area is incredibly admirable and much needed.

Bloom!

“How much do we really know about the plants and flowers in our gardens and vases? Beyond their beauty, many have surprising stories of exploration, exchange, and discovery. In Bloom takes visitors from Oxford across the world and back, tracing the journeys that some of Britain’s most familiar blooms travelled to get here. Featuring more than 100 artworks, including beautiful botanical paintings and drawings, historical curiosities and new work by contemporary artists, the exhibition follows the passion and ingenuity of early plant explorers and the networks that influenced science, global trade and consumption.  Visitors will learn how plants changed our world and left a legacy that still shapes our environments and back gardens today.” ~ Ashmolean Museum, Bloom Exhibit 2026

Art by Claire Desjardins

AI: Integration & Preparedness

“The General Medical Council (GMC) states that doctors ‘are responsible for the decisions they make when using new technologies like AI, and should work only within their competence.’15 This coincides with the World Medical Association calling for reviewing medical curricula and education for all healthcare stakeholders to improve understanding of the risks and benefits of AI in healthcare.16 It follows then that in fostering good medical practice, medical schools must prepare students for the clinical environment that awaits them through building competence and familiarity in this evolving domain.

With 2 in 3 physicians using AI in their clinical practice, an increase of 78% from 2023,17 enthusiasm for the technology is rapidly growing. Yet, despite this uptake, a 2024 international survey of over 4500 students across 192 medical, dental, and veterinary faculties found that over 75% reported no formal AI education in their curriculum, highlighting a critical gap between technological advancement and medical training.18 This discrepancy underscores the urgency for medical schools to proactively incorporate AI teaching to ensure graduates are ready for the realities of modern clinical practice.”

Read more on Artificial Intelligence in Medical Education: Promise, Pitfalls, and Practical Pathways here.

AI-Enabled Medical School

Succi, Chang, and Rao argue that medical education needs a deliberate redesign for an AI-rich clinical world, not just a bolt-on “AI lecture” or a new tool in the curriculum.

Core argument

  • AI, especially large language models (LLMs), is already strong at many clinical-adjacent tasks (documentation, communication support, test-style questions), but performance on benchmarks does not equal genuine clinical reasoning.
  • Medical education’s job is to teach reasoning processes and adaptability, not just factual recall or pattern recognition. LLMs can look convincing while still producing plausible but shallow outputs.

Why current LLM success is not the same as clinical reasoning

  • The authors emphasize that LLMs often operate via statistical pattern matching, so they can generate confident answers triggered by “buzzwords” or common feature clusters.
  • Real clinical reasoning is dynamic: new symptoms appear, data conflict, hypotheses evolve, uncertainty persists. Exams with single best answers do not capture that.

What needs to change in assessment and benchmarking

  • They call for new benchmarks that require models to reason step by step through complex cases, justify decisions, and iteratively refine a diagnosis or plan as information changes.
  • Validating AI in the education setting, where reasoning can be scrutinized, is presented as a pathway toward trustworthy clinical decision support later.

How AI could reshape teaching and learning

  • If LLMs become better at transparent reasoning, they could function as case-based learning partners: tutors, critics of student logic, graders, and discussion counterparts.
  • LLMs could help learners at all stages parse difficult materials, including curricula, textbooks, and biomedical literature, which supports lifelong learning in a fast-moving field.
  • AI could expand clinical exposure beyond “the patients you happen to see” by generating many varied presentations, including rare diseases and culturally distinct scenarios.

SP-LLMs (standardized patient LLMs)

  • The article highlights the idea of LLM-powered standardized patient interactions that can be used for practice and evaluation of communication skills, including exposure to rare and diverse presentations.
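
For educators wondering what such an interaction could look like in practice, here is a minimal, illustrative sketch (not from the article): a short Python loop in which a chat model plays a scripted standardized patient while a learner practices history-taking. The `call_llm` function and the case prompt are placeholders standing in for whatever institution-approved LLM service and case bank a program would actually use.

```python
# Illustrative sketch only: an LLM plays a standardized patient ("SP-LLM")
# while a learner practices history-taking. `call_llm` is a placeholder for
# an institution-approved chat service; the case prompt is invented here.
from typing import Dict, List

CASE_PROMPT = (
    "You are a standardized patient: a 58-year-old with two weeks of progressive "
    "exertional shortness of breath. Stay in character, answer only what the "
    "learner asks, and never reveal the intended diagnosis."
)

def call_llm(messages: List[Dict[str, str]]) -> str:
    """Placeholder: send the conversation to your LLM provider and return its reply."""
    # Replace this canned response with a real API call in practice.
    return "I've been getting winded just walking up the stairs for about two weeks."

def run_encounter() -> None:
    """Run a simple turn-by-turn interview; type 'end' to finish."""
    messages: List[Dict[str, str]] = [{"role": "system", "content": CASE_PROMPT}]
    while True:
        question = input("Learner: ").strip()
        if question.lower() in {"end", "quit"}:
            break
        messages.append({"role": "user", "content": question})
        reply = call_llm(messages)  # the SP-LLM responds in character
        messages.append({"role": "assistant", "content": reply})
        print(f"Patient: {reply}")

if __name__ == "__main__":
    run_encounter()
```

The same loop could log the transcript for faculty review or pass it to a second model for structured feedback on communication skills, which is the kind of practice-and-evaluation use the article points to.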

Equity and access

  • The authors argue LLMs could democratize medical education by distributing expertise at scale, supporting resource-limited settings and schools with lower patient diversity or volume.
  • They note that equitable access will require thoughtful licensing models and partnerships between well-resourced and resource-constrained institutions.

What the “AI-enabled physician” must become

  • As AI takes on routine tasks, physicians should shift toward higher-level responsibilities: strong clinical reasoning, data interpretation, and ethical oversight of algorithmic outputs.
  • Curricula should include “data systems literacy” so future physicians can critically appraise and safely integrate AI outputs into care.

A non-negotiable: dual competency

  • The authors stress that technical sophistication must not erode foundational clinical skills. Systems fail, downtime happens, breaches occur, and public health crises arise.
  • Training should explicitly reinforce operating both with and without AI, through exercises that require history, exam, and differential diagnosis without digital aids.

Bottom line

Medical schools should integrate AI in ways that strengthen, rather than replace, rigorous reasoning, empathy, and moral judgment. This requires honest engagement with AI limits, new forms of assessment, and collaboration between clinicians, educators, and machine learning experts.

Read more on Building the AI-Enabled Medical School of the Future by Succi, Chang, and Rao.