Monday, 2 June 2025

MCP + FHIR - Example MCP Server integrated with FHIR

In the ever-evolving landscape of healthcare data, interoperability is paramount. FHIR has emerged as a leading standard for exchanging healthcare information, but raw FHIR interactions can sometimes feel a bit… bare, especially when working with AI apps that use LLMs extensively. What if we could add a layer of context, logic, and user-friendly abstractions on top of FHIR? This is precisely where an MCP (Model Context Protocol) server comes into play.

In this post, we'll explore the implementation of an MCP server, built with TypeScript, that seamlessly integrates with a FHIR server (specifically tested with Aidbox). This server will empower applications to not only create FHIR resources with contextual understanding but also to read and interpret specific FHIR resources with ease.

The source code for this quick and dirty implementation of MCP + FHIR integration is available on my GitHub: https://github.com/j4jayant/mcp-fhir-server

This implementation is inspired by the Aidbox article https://www.health-samurai.io/articles/mcp-fhir-server and by the related articles from Dr. Pawan Jindal on LinkedIn.

I implemented the same approach and was able to achieve similar results.

This MCP server exposes two tools:

1. create-fhir-resource

    This tool takes two parameters:

  • ResourceType (string, e.g. Patient, Appointment)
  • ResourceBody (string containing the raw JSON of the resource)
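To make the create flow concrete, here is a minimal sketch of the request this tool would send to the FHIR server: a FHIR create is a POST to `{base}/{resourceType}` with the raw resource JSON as the body. The `FHIR_BASE_URL` value and the helper name are assumptions for illustration, not code from the repository.

```typescript
// Assumed base URL of the FHIR server (e.g. a local Aidbox instance).
const FHIR_BASE_URL = "http://localhost:8080/fhir";

interface FhirRequest {
  method: "POST" | "GET";
  url: string;
  headers: Record<string, string>;
  body?: string;
}

// Hypothetical helper: builds the HTTP request for create-fhir-resource.
function buildCreateRequest(
  resourceType: string,
  resourceBody: string
): FhirRequest {
  // Parse early so malformed JSON fails before we call the server.
  const parsed = JSON.parse(resourceBody);
  if (parsed.resourceType && parsed.resourceType !== resourceType) {
    throw new Error(
      `resourceType mismatch: ${parsed.resourceType} vs ${resourceType}`
    );
  }
  return {
    method: "POST",
    url: `${FHIR_BASE_URL}/${resourceType}`,
    headers: { "Content-Type": "application/fhir+json" },
    body: resourceBody,
  };
}
```

The actual MCP tool handler would perform this request (e.g. with `fetch`) and return the server's response to the LLM.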

    Sample Request and Output in Claude Desktop
2. read-fhir-resource

This tool takes two parameters:

  • ResourceType (string, e.g. Patient, Appointment)
  • ResourceID (string containing the ID of the resource)
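The read side is even simpler: a FHIR read is a GET to `{base}/{resourceType}/{id}`. As above, the base URL and helper name are illustrative assumptions, not code from the repository.

```typescript
// Assumed base URL of the FHIR server.
const FHIR_BASE = "http://localhost:8080/fhir";

// Hypothetical helper: builds the HTTP request for read-fhir-resource.
function buildReadRequest(resourceType: string, resourceId: string) {
  if (!resourceType || !resourceId) {
    throw new Error("resourceType and resourceId are both required");
  }
  return {
    method: "GET" as const,
    // encodeURIComponent guards against IDs with reserved characters.
    url: `${FHIR_BASE}/${encodeURIComponent(resourceType)}/${encodeURIComponent(resourceId)}`,
    headers: { Accept: "application/fhir+json" },
  };
}
```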

    

        Sample Request and Output in Claude Desktop
The MCP server receives requests from the LLM and forwards them to the FHIR server.

This implementation is tested with Claude Desktop and Aidbox FHIR server.

The Claude Desktop configuration to run this server locally can be copied from the Aidbox article.
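For reference, a Claude Desktop configuration for a local MCP server generally looks like the sketch below (in `claude_desktop_config.json`). The server name, file path, and `FHIR_BASE_URL` environment variable here are placeholders, not values from the repository.

```json
{
  "mcpServers": {
    "fhir": {
      "command": "node",
      "args": ["/path/to/mcp-fhir-server/build/index.js"],
      "env": {
        "FHIR_BASE_URL": "http://localhost:8080/fhir"
      }
    }
  }
}
```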


Thursday, 17 April 2025

Simple Healthcare Workflow Automation: Delivering Personalized Education Post Doctor Consultation with n8n

In today's healthcare landscape, empowering patients with relevant information is crucial for better health outcomes and engagement. What if we could seamlessly deliver diagnosis-specific educational materials to patients immediately after their doctor's appointment? 

This blog post explores how to implement an AI agent using the powerful automation platform n8n.io to achieve just that. By leveraging HL7 messaging, OpenAI's GPT-4o mini model, and SendGrid, we can create a truly personalized and efficient patient education workflow.

For this demo, I have used following tools:

  • n8n (to create an AI agent to automate the workflow)
  • OpenAI's GPT-4o mini model (to process the prompt and generate relevant educational content)
  • Mirth Connect (to simulate receiving HL7 SIU messages from the EHR with a checked-out status; to keep it simple, we assume the SIU feed has a DG1 segment with the encounter diagnosis)
  • SendGrid (to send a formatted email to the patient)


Here's a breakdown of the workflow:

n8n Workflow Execution Diagram

  • HL7 Trigger: The workflow (outside of our AI agent) begins with incoming SIU S14 HL7 messages on the Mirth Connect server. We configure this channel to look specifically for messages with the CHECKEDOUT status, which ensures the workflow only activates after a patient has completed their appointment. Mirth Connect then parses the minimum required fields from the HL7 message, transforms them into JSON, and posts the JSON to the n8n webhook.

  • n8n Webhook: The workflow (within our AI agent) begins with an n8n trigger node (Webhook) that listens for incoming messages from Mirth Connect. This webhook receives a request JSON like this:
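The original screenshot of the payload is not reproduced here; the sketch below shows an illustrative shape for the JSON that Mirth Connect posts, with field names inferred from the prompt template used later in this post. The exact payload in the demo may differ.

```typescript
// Illustrative shape of the JSON posted by Mirth Connect to the n8n webhook.
interface CheckoutPayload {
  firstName: string;
  lastName: string;
  dob: string;        // date of birth, e.g. from PID-7
  gender: string;     // e.g. from PID-8
  email: string;      // e.g. from PID-13
  diagnosis: string;  // encounter diagnosis from the DG1 segment
}

const sample: CheckoutPayload = {
  firstName: "John",
  lastName: "Doe",
  dob: "1980-04-12",
  gender: "M",
  email: "john.doe@example.com",
  diagnosis: "M54.5 - Low back pain",
};
```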

        


  • Patient Information Extraction: Once a relevant message is received, we use n8n's data manipulation nodes to parse the JSON message and extract key patient information, such as:

    • Patient Name, Gender, DOB
    • Patient Email Address
    • Diagnosis Code (e.g., ICD-10 code)
  • AI-Powered Prompt Generation (OpenAI GPT-4o mini): This is where the intelligence comes in. We'll integrate with the OpenAI API, specifically utilizing the GPT-4o mini model, and construct a dynamic prompt based on the extracted diagnosis. For example:

    Please suggest some patient education materials for Mr. {{ $json.body.firstName }} {{ $json.body.lastName }} suffering from {{ $json.body.diagnosis }} whose date of birth is {{ $json.body.dob }} and gender is {{ $json.body.gender }}. Please format the response in HTML that can be sent as an email to the patient.

    The model will then process this prompt and generate relevant educational content.
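n8n performs this interpolation with its `{{ $json.body.… }}` expression syntax; as a plain-code equivalent, the same prompt construction can be sketched like this (the function name is hypothetical):

```typescript
// Equivalent of the n8n expression template as a plain function.
function buildPrompt(p: {
  firstName: string;
  lastName: string;
  dob: string;
  gender: string;
  diagnosis: string;
}): string {
  return (
    `Please suggest some patient education materials for ` +
    `Mr. ${p.firstName} ${p.lastName} suffering from ${p.diagnosis} ` +
    `whose date of birth is ${p.dob} and gender is ${p.gender}. ` +
    `Please format the response in HTML that can be sent as an email to the patient.`
  );
}
```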

  • Email Sending (SendGrid Integration): Finally, we'll integrate with SendGrid, a reliable email delivery service. We'll configure the SendGrid node in n8n to send an email to the patient's extracted email address. The email body will contain the formatted educational materials generated by the AI. The email content looks like this:
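Under the hood, the SendGrid node submits a payload to SendGrid's v3 Mail Send endpoint (`POST https://api.sendgrid.com/v3/mail/send`). A minimal sketch of that payload follows; the sender address and subject line are placeholders, not values from the demo.

```typescript
// Hedged sketch of a SendGrid v3 Mail Send payload.
function buildSendGridPayload(toEmail: string, htmlBody: string) {
  return {
    personalizations: [{ to: [{ email: toEmail }] }],
    from: { email: "no-reply@clinic.example.com" }, // placeholder sender
    subject: "Your personalized patient education materials",
    content: [{ type: "text/html", value: htmlBody }],
  };
}
```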



The prompt and HTML output above are just examples; the output will only be as good as your prompt and LLM model.



Most of the text of this blog post was generated using Google Gemini and edited to suit the actual implementation.








Thursday, 13 March 2025

Appointment Scheduler AI Chatbot with FHIR Integration

In today's fast-paced world, convenience is paramount, especially when it comes to healthcare. Imagine being able to book a doctor's appointment simply by having a conversation with a chatbot. This is now a reality, thanks to the power of Dialogflow and FHIR (Fast Healthcare Interoperability Resources).



This blog post will explore how we can create a seamless doctor appointment booking experience using Dialogflow, a natural language understanding platform, and FHIR, a standard for exchanging healthcare information electronically.


Here I will showcase a simple AI chatbot for booking a doctor appointment, show how the integration works behind the scenes, and demonstrate how AI simplifies the process up front for a seamless user experience.


I have used the following to develop this demo: Dialogflow ES and the Medplum FHIR server.

Dialogflow ES simplifies the ML training and intent detection based on user inputs, and the Medplum FHIR server simplifies the FHIR-related API work.


Our chatbot is developed for an imaginary ABC Pain Clinic. This clinic has a couple of orthopaedic providers. For this demo we assume the patient is already registered (we won't capture patient demographics, etc.) and that provider schedules/slots are already configured on our FHIR server.
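Behind the bot, the booking flow boils down to two FHIR interactions: searching for free Slot resources on a provider's Schedule, and creating an Appointment that references the chosen slot and patient. The sketch below illustrates those two calls; the base URL and all IDs are placeholders, not values from the demo.

```typescript
// Assumed Medplum FHIR R4 base URL.
const MEDPLUM_BASE = "https://api.medplum.com/fhir/R4";

// GET {base}/Slot?schedule=Schedule/{id}&status=free&start=ge{date}
// finds open slots for a provider's schedule on or after a date.
function buildSlotSearchUrl(scheduleId: string, date: string): string {
  return `${MEDPLUM_BASE}/Slot?schedule=Schedule/${scheduleId}&status=free&start=ge${date}`;
}

// Builds the Appointment resource to POST once the user picks a slot.
function buildAppointment(
  patientId: string,
  slot: { id: string; start: string; end: string }
) {
  return {
    resourceType: "Appointment",
    status: "booked",
    slot: [{ reference: `Slot/${slot.id}` }],
    start: slot.start,
    end: slot.end,
    participant: [
      { actor: { reference: `Patient/${patientId}` }, status: "accepted" },
    ],
  };
}
```

In the demo, Dialogflow's fulfillment webhook would run this logic when the booking intent is matched, then confirm the appointment back to the user.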

In the screenshots, text right-aligned with a gray background is the user input, and left-aligned text/rich text is the bot response.

Here are the workflow images to explain the Bot Intents and FHIR Integration:
Here are the actual chatbot screenshots:
In the first step, the bot greets the user.