Cybersecurity & Family Medicine


When we talk about Artificial Intelligence (AI) in healthcare, our minds often go to diagnostic tools, scribe assistants, or chatbot-based triage systems. But there’s another sector that has been living with AI’s risks and rewards for much longer: cybersecurity. Recently, I attended a lecture by cybersecurity strategist Dr. Craig Jarvis on the growing use of AI in digital defence. His insights translate remarkably well to our clinical context because, at its core, both cybersecurity and healthcare depend on trust, accuracy, and human oversight.

Let’s look at some of the lessons medicine can borrow from AI in cyber defence.


1. AI Can Help, but Only If Humans Stay in the Loop

In cybersecurity, automated systems monitor threats, detect anomalies, and even block attacks. But when something unexpected happens, a human expert still needs to interpret, intervene, and decide.

In clinical practice, the same applies. AI tools can summarize patient notes, flag abnormal results, or even draft assessments — but they don’t understand context, patient nuance, or social determinants. We must always ensure the clinician stays in the loop.

“Speed is valuable, but not at the expense of human control.”


2. Reduce Toil, Not Thinking

In IT, AI is praised for reducing “toil,” the repetitive, low-value work that consumes time and mental energy. In medicine, the same promise is appealing: fewer administrative burdens, quicker charting, streamlined information retrieval.

But the key is toil reduction without cognitive erosion. If AI saves time, that time should be redirected toward deeper clinical reasoning, patient connection, or teaching moments, not simply faster throughput.


3. The System Is Only as Safe as Its Weakest Prompt

One cybersecurity slide Dr. Jarvis shared reported that 1 in 80 AI prompts carries a high risk of exposing sensitive enterprise data.

In healthcare, that translates to:

  • Be mindful of what information you input into AI tools.
  • Avoid typing identifiable patient data into any non-approved system.
  • Remember that AI systems may retain patterns from what you enter; once submitted, data may not be fully private.

Clinical AI safety begins with data awareness at the prompt level.
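To make “data awareness at the prompt level” concrete, here is a minimal sketch of a pre-submission check that flags obvious identifiers in a draft prompt. The function name and the regex patterns are illustrative assumptions, not a real de-identification tool; an approved redaction pipeline would be far more thorough.

```python
import re

# Hypothetical patterns for common identifiers. A real deployment would rely
# on an approved de-identification tool, not this illustrative list.
IDENTIFIER_PATTERNS = {
    "health card number": re.compile(r"\b\d{10}\b"),
    "date of birth": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
    "phone number": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def flag_identifiers(prompt: str) -> list[str]:
    """Return the names of identifier types detected in a draft prompt."""
    return [name for name, pattern in IDENTIFIER_PATTERNS.items()
            if pattern.search(prompt)]

draft = "Summarize: 54yo patient, DOB 1971-03-15, reachable at 416-555-0198."
findings = flag_identifiers(draft)
if findings:
    print("Review before sending; possible identifiers:", ", ".join(findings))
```

Even a crude check like this reinforces the habit the section describes: pause and inspect what is about to leave your hands before any AI tool sees it.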


4. Diversity Matters: No Single AI Does It All

Cybersecurity systems rely on multiple forms of AI:

  • Machine learning to detect patterns
  • Generative AI to summarize or report
  • Agentic AI to automate tasks and responses

In family medicine, this diversity principle also holds. One model may excel at summarizing notes, another at generating patient education materials, and a third at supporting evidence retrieval. Integrating these tools thoughtfully ensures resilience and balance, not dependence on a single system.


5. Guard Against “Inflated Expectations”

Dr. Jarvis highlighted the case of Cylance, a once-hyped AI cybersecurity company valued at over $1 billion that was later sold at a massive loss when its promise outpaced its performance.

In healthcare, inflated expectations can be equally dangerous. AI is powerful, but it’s not a replacement for judgment, empathy, or context. Adopting AI responsibly means piloting, evaluating, and refining tools before scaling, much like any new clinical guideline.


Bringing It Back to Practice

If we think like cybersecurity professionals, we can reframe how we approach AI in medicine:

Cybersecurity Principle → Clinical Analogue

  • Human-in-the-loop oversight → Clinician supervision of AI recommendations
  • Patch management → Regularly update clinical AI tools and policies
  • Threat detection → Identify AI misuse, bias, or data leakage
  • Governance frameworks → Clear clinical and ethical accountability

The Bottom Line

AI can help us practice smarter, not just faster, but only if we approach it with the same discipline, skepticism, and care that cybersecurity experts apply to digital defence.

As family physicians and educators, our role is to ensure that AI augments, not replaces, the human connection at the heart of care.

Trust the technology, but verify the outcome, always with compassion and clinical judgment.
