
Recently, Microsoft announced Copilot Health: a separate, secure space within Copilot that brings together your medical records, data from your wearables, and lab results, so you can get clearer, more personalized explanations of your health data.
Many initial reactions to this are understandable: Should I trust Microsoft with my health data? I can’t answer that for anyone else, but I do want to offer something the ad didn’t: a real, personal example of how I already use Copilot as a complement to my healthcare and how it changed the course of my treatment.
For years I have had intermittent pain in the upper right part of my abdomen. It started in my teens, flared up after pregnancies, and was repeatedly dismissed as hormonal. I was in clinics after hours, was told it was “probably” related to my cycle, and left without any meaningful investigation even though the pain was nowhere near my pelvic region.
Finally I got fed up and started asking Copilot questions, the kind of questions I didn't always feel confident asking a doctor.
It didn't jump to the worst-case scenario the way a web search engine often does by surfacing the most sensational results. It asked follow-up questions, looked for patterns in my symptoms, and eventually suggested I request an ultrasound. I used the phrasing Copilot suggested in my e-consultation so my GP could see exactly why I was requesting it, along with a consistent history of my symptoms and the things that had already been investigated and ruled out.
All of this led to an ultrasound that found multiple gallstones and long-standing digestive dysfunction that explained the pain I had had for two decades. My gallbladder is basically a bag of marbles with no negative space inside, stones that have been building since I was a teenager. The ultrasound technician even remarked on how difficult the scan was and was surprised that I had been referred so late.
I am now on the waiting list for surgery within the next 12 weeks. Throughout the process, I gave Copilot my blood tests and reports; it helped me demystify the results and gave me a clearer way to talk to my doctor. I'm not very good at advocating for myself when faced with a man in a white coat.
I'm not saying Copilot diagnosed me. I'm saying it helped me order the right test. A test that no doctor had ordered for me in twenty years.
People are already using Copilot for health issues: Microsoft's usage figures show more than 50 million health queries per day. The company frames this as an extension of how people already use AI to make sense of medical information, not as a replacement for doctors.
The product is explicitly consumer-facing and framed as a conversation aid: Copilot Health is pitched as a tool to help people prepare for appointments and translate medical jargon, helping you and your doctor figure out next steps rather than replacing clinical judgment.
Those points align completely with my experience. Copilot simply gave me guidance and told me what to ask for so I could move my investigation forward.
The concerns about data leaving its intended source are real and I don’t want to minimize them. Microsoft says Copilot Health will keep health data isolated in a secure environment and give users control over permissions, but those promises will need scrutiny as the product launches.
At the same time, the NHS and other public bodies have had their own data incidents, so the risk of a third party holding sensitive information is not exclusive to Big Tech. To me, the trade-off is worth it: a tool that genuinely helped me get a diagnosis after twenty years, weighed against the abstract fear of a data breach. That is a personal calculation for me and me alone; others will reasonably decide differently.
In our reader poll, attached to our news coverage on this so far, 33% of respondents said they would trust Microsoft with their health data, and 67% said they would not.
For patients, we shouldn't underestimate AI as a powerful preparation tool. If it helps you clarify your symptoms or formulate a request for a specific investigation, it can shorten the path to diagnosis. You can arrive at your appointment more prepared and more confident to be assertive and ask for a more specific approach to your symptoms.
Doctors should expect more patients to arrive with AI-generated summaries or next-step suggestions. In some cases this will save time; in others it could create friction if the doctor disagrees. But at the very least the patient has a record that they asked for something, and the doctor can document why that route wasn't chosen if they distrust the source. Either way, it changes the dynamics of the consultation.
As hyperbolic as it may seem, my experience with Copilot was life-changing (although time will tell once I’ve had surgery). That doesn’t settle the privacy debate, nor does it mean that Copilot Health is the right choice for absolutely everyone and every scenario, but I fully support it as a great starting point for better healthcare.