Dawn Patrol

Teaching With AI: Reflections From Our Dawn Patrol Series

I had a fascinating early-morning conversation with our UBC clinical preceptors about what happens when AI tools, especially scribes, enter our clinical learning spaces. So many questions came up about how their introduction is shifting the dynamics of patient care. Three key takeaways stood out:

Voice & Accuracy Matter
Clinicians note that AI-generated notes don’t reflect their own style or reasoning. These tools often “fill gaps” with information never said, which can distort the record and drive unnecessary tests and investigations. They’re also much longer and less focused. How do we prepare medical learners to build tools that amplify their voice and clinical reasoning rather than overwrite it?

Prepare for a New Patient Dynamic
Patients increasingly arrive with ChatGPT-style interpretations of their labs and expect explanations for why certain tests weren’t ordered or to clarify AI’s output. This shifts the power dynamic in the room. How can we equip clinicians and learners to respond transparently and confidently when patients bring AI into the conversation?

Patient Consent, Privacy & Ethics
From signage to informed consent, we must clearly communicate when AI is used in documentation, how data are stored, and what biases or commercial pressures may influence these tools and their use. How do we educate and onboard patients around the use of AI in their care?

For me, the central question remains: how do we make AI an ally that supports our clinical thinking and teaching, rather than one that quietly reshapes the clinician’s voice and patient narrative?

Voice

This morning I was perusing the latest on AI and medical research. I came across Dr. Sanjay Gupta’s podcast “Chasing Life” and his recent session with Dr. Yaël Bensoussan, assistant professor of otolaryngology and director of the USF Health Voice Center.

She’s exploring how artificial intelligence could one day help doctors detect conditions like Parkinson’s and cancer simply by listening. We already know that certain diseases, such as hypothyroidism, can alter vocal quality, and AI may take this knowledge even further.

Listen via CNN.

Graduation: Calabar

🌍✨ Celebrating Our Calabar Cohort ✨🌍
Today, our Primary Compassionate Care team proudly celebrated the graduation of our Calabar Cohort (2024–2025). This milestone marks the completion of an educational journey that explored what it truly means to deliver public health in a community setting. We are grateful for the courage, consistency, and compassion each of you demonstrated throughout the journey and showed to one another along the way.

Over the past year, this cohort has:
+ Participated in 12 lecture series and 7 international commemorations
+ Conducted a health needs assessment and health awareness campaigns
+ Carried out a community intervention project addressing childhood immunization
+ Reached over 1,000 people through advocacy and outreach

Beyond the numbers, what stands out most is the heart behind the work. From creating and launching health campaign videos to educating the public on diabetes awareness, the cohort poured creativity into every effort and demonstrated leadership that inspired others.

🎉 Special congratulations to our awardees:

🏆 Most Participatory: Etimita Oyonnonke Patrick
🏆 Best Team Player: Daniel Blessing Effiong
🏆 Most Socially Engaging: Yojorsam Okoi

To every graduate: Dr. Aisha Liman and I want to thank you for dedicating your time and energy to this experience. You proved that you can lead, advocate, and inspire, learning and overcoming your own challenges along the way.

We wish you the best!

Lagos!

🚀 A New Chapter Begins: Welcoming Our PCCI Lagos Cohort!

Yesterday, Dr. Aisha Liman, Favour Anyanwu, and I officially onboarded our newest Primary Compassionate Care Initiative (PCCI) cohort in Lagos, and the energy was electric!

From our very first gathering, it was clear that this group is ready to step into the challenge of shaping healthier communities with compassion, creativity, and courage.

🌍 What PCCI Is All About
At PCCI, our mission is simple but powerful: to mentor, empower, and inspire the next generation of public health leaders. Through experiential learning, fieldwork, and advocacy, our cohorts tackle pressing health challenges head-on while grounded in values of integrity, teamwork, and excellence.

What Awaits the Lagos Cohort? Over the next 6 months, our learners will:
+ Dive into foundations of public health, humanitarian principles, and human-centered design.
+ Gain hands-on experience in community health needs assessments and intervention projects.
+ Commemorate key international health days (from World Mental Health Day to World AIDS Day).
+ Build career-ready skills, from proposal writing to CV and cover letter development.

Why This Matters:
Every cohort strengthens our growing network of compassionate health leaders across Nigeria and beyond. From Jos to Maiduguri, Calabar to Lagos, each step expands the circle of impact. This Lagos group is part of that continuum and we couldn’t be more excited about the journey ahead.

Our Commitment: To our new cohort: your energy fuels this initiative. Your ideas, passion, and commitment will shape interventions that outlive the program itself. Together, we’ll learn and we’ll act to protect the health and well-being of future generations.

Welcome aboard, PCCI Lagos 2025–2026! The future of public health leadership is in great hands!

Exploitation

The Use of AI to Exploit Women: Inside the Mind of Elon Musk

“Evie, 21, was on her lunch break at her day job last month when she got a text from a friend, alerting her to the latest explicit content that was circulating online without her consent. ‘It felt humiliating.’”

Since Elon Musk’s release of “spicy” mode on X, I’ve observed a dramatic increase in deepfake sexual images of women being used on his platform for a variety of purposes. Whether it’s a doctored image of Taylor Swift or someone else’s daughter, the fact that Elon Musk’s AI platform is encouraging the generation of non-consensual deepfakes and branding it as “spicy mode” or “creative” is both incredibly disturbing and revealing about the man. It shows us exactly where Musk’s head is at: a Jeffrey Epstein–like mindset where women’s dignity is disposable, and profit is the name of the game.

As a woman, I’ve been warned by society that there will be people in this world who will use AI as an exploitative weapon to harm, to harass, and to make money at the expense of one’s identity and mental health. What I didn’t expect was that someone like Elon Musk, who loudly calls for deporting sexual predators and releasing the Epstein files, would be the one leading this charge.

As AI platforms expand into sexually explicit content, policy must keep pace: consent, transparency, and accountability should be non-negotiable.

So the next question is, where are we at with our Canadian policies? Currently, it’s all talk and no action. This silence speaks volumes about the role our politicians play in the exploitation of their constituents and their children.

1. Bill C‑63 (Online Harms Act): Proposed, died in 2025.
2. Bill S‑209 (Protecting Young Persons from Exposure to Pornography Act): Proposed, age-verification focus.
3. Artificial Intelligence and Data Act (AIDA) under Bill C‑27: Under consideration.

More on the topic:
Sex is getting scrubbed from the internet, but a billionaire can sell you AI nudes: https://www.theverge.com/internet-censorship/756831/grok-spicy-videos-nonconsensual-deepfakes-online-safety

Their selfies are being turned into sexually explicit content with AI. They want the world to know: https://www.usatoday.com/story/life/health-wellness/2025/07/22/grok-ai-deepfake-images-women/85307237007/

Patient Consent

What do patient consent, autonomy, and trust mean when AI is involved in the delivery of care?

In American Journal of Bioethics (Mar 2025), Y. Tony Yang’s commentary “Beyond Disclosure: Rethinking Patient Consent and AI Accountability in Healthcare” challenges the idea that simply telling patients “AI might be involved” is enough.
Yang urges us to unpack what the “Right to Notice and Explanation” means and to move beyond disclosure. It’s more than informing the patient that AI is used; it also includes:

1. Ensuring true patient understanding through opportunities to ask questions and seek clarification.
2. Identifying accountability by requiring that an AI system’s logic and functionality be documented and subject to audit.
3. Embedding explainability and trust at the core of AI-informed care through implementing ethical frameworks that emphasize fairness, transparency, and trustworthiness.

As the digital transformation of medicine speeds forward, our consent frameworks must be updated to reflect how the technology is being operationalized, while also considering how to minimize the disruption to current workflows and delivery of care.

Question for the network: How have you seen healthcare providers incorporate, or fail to incorporate, meaningful AI explanations into patient conversations?

Yang, Y. T. (2025). Beyond Disclosure: Rethinking Patient Consent and AI Accountability in Healthcare. The American Journal of Bioethics, 25(3), 151–153. https://lnkd.in/gq2kccuU

AI Bias

Last week, I had the opportunity to lead an academic session with incoming family medicine residents on one of the most pressing issues in modern healthcare: bias and confabulation in clinical AI tools.

We explored:
+ Real-world cases of AI bias, such as how LLMs alter triage and diagnostic suggestions based solely on patient demographics.
+ Confabulation traps where AI fabricates confident-sounding (but incorrect) medical guidelines.
+ Interactive bias testing: residents input identical chest pain cases into multiple AI tools, tweaking only the patient’s background to examine how different platforms analyze and articulate the patient’s management.
+ Ethical and legal dilemmas: including what happens when a chatbot contributes to chart notes, and whether disclosure is required.

We closed with this question:
+ What safeguard will you commit to using in your own practice to reduce the risk of AI misinformation entering the patient record?

Teaching AI literacy is about clinical discernment, ethical awareness, and training tomorrow’s physicians to engage AI with both curiosity and caution.
Grateful to this next generation of residents for their sharp thinking and thoughtful engagement.

Recruitment

🌍 Join Us in Shaping Compassionate Healthcare in Nigeria

The Primary Compassionate Care Initiative (PCCI) is seeking visionary leaders to join our Board of Directors.

Our mission is to strengthen community-based health and mentorship programs across Nigeria, ensuring every individual has access to compassionate, equitable care.

As a board member, you will:
• Guide strategy for impactful health initiatives
• Strengthen governance and accountability
• Build partnerships with communities and health leaders
• Advocate for compassionate care at every level

We welcome leaders with expertise in governance, finance, public health, advocacy, youth development, or community engagement.

If you’re passionate about creating lasting health impact, we’d love to connect. Together, we can build a healthier, more compassionate future.

📩 Interested? Please reach out to me on LinkedIn to learn more: https://www.linkedin.com/in/jacquelinepashby/

Check us out on Instagram too!