From Raw Medical Records to VA-Ready AI Summaries: How Abstractive Placed 3rd in the VA’s AI Tech Sprint
Dec 15, 2025
This article is based on our submission to the VA Artificial Intelligence Tech Sprint for Documenting VA Clinical Encounters and Integrating Community Care Data, where Abstractive Health earned 3rd place in Track 2 (Community Care Document Processing & Integration). The Tech Sprint itself was a 120-day, $1M prize challenge focused on clinician documentation and record interoperability.
The VA’s Problem
The Veterans Health Administration (VHA) delivers care to millions of Veterans across more than 1,300 facilities. Veterans often receive care both inside the VA and through community providers. That means their history can arrive fragmented across decades and across formats, including unstructured notes and scanned PDFs.
For clinicians, the pain is not just “getting” the record. It’s turning that record into something usable fast enough to matter in a real visit. The data exists, but it can be functionally unreadable under time pressure.
What the VA Tech Sprint asked teams to prove
The VA structured the Tech Sprint around two tracks: one focused on encounter documentation (ambient note generation), and one focused on community care records and interoperability. We competed in Track 2, which targets the outside-records problem.
Success was measured by the VA’s composite score: whether a solution could run reliably in a VA environment, produce VA-compatible outputs, and perform well under clinical and safety review.
The KPI: what was actually measured
The Tech Sprint used a gated evaluation (three “gates”), and each gate measured specific criteria. The overall KPI was the VA’s composite score and final ranking after Gate 3, built from these components:
Gate 1 (Pass/Fail): Technical readiness and minimum performance requirements
Could the system run on VA government-furnished equipment, generate required outputs (JSON/HL7), remain stable under interruptions, and return within the VA’s time threshold?
Gate 2 (Scored): Trust, correctness, and clinical usefulness
The VA evaluated three scored components:
- Trustworthy AI: questionnaire-based scoring tied to safety behavior, explainability, bias considerations, governance, monitoring, and risk management.
- Structured data extraction quality: whether diagnoses, medications, labs, vitals, and imaging findings (and their values) were extracted correctly versus validated controls, including whether errors could introduce harm.
- Clinician review: coherence, factual consistency, clinical completeness, and potential risk.
Only top performers advanced.
Gate 3 (Final scored ranking): Repeat scrutiny plus demo evaluation
Gate 3 repeated the structured-data and clinician evaluations with heightened scrutiny, and added qualitative scoring of the recorded demonstration. That included clarity of value, handling of edge cases, safety and error management, privacy and security posture, and workflow and integration readiness.
Our final score was a composite of trustworthiness, structured clinical data correctness, clinician-rated summary quality and safety, and integration readiness.
What we built for Track 2
Our goal for Track 2 was straightforward: take outside community-care documents and turn them into an integrated, usable clinical picture without requiring clinicians to manually sift through raw PDFs.
In practical terms, we built a system designed to:
- ingest non-VA documents, including unstructured notes and scanned formats,
- extract key clinical entities and values into structured outputs (the interoperability backbone),
- generate a coherent narrative summary that clinicians can review quickly,
- make the output navigable, including a linked table of contents, so it is not just summarized but reviewable (a simplified sketch of this flow follows the list).
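To make that flow concrete, here is a minimal, hypothetical sketch of the ingest → extract → summarize pipeline. The function names (ingest, extract_entities, summarize), the keyword-based extraction rules, and the JSON shapes are illustrative assumptions for this post, not Abstractive Health’s production system.

```python
# Hypothetical sketch of the Track 2 pipeline described above.
# Names, fields, and the toy extraction rules are illustrative assumptions,
# not the actual implementation.
import json
from dataclasses import dataclass, field


@dataclass
class StructuredRecord:
    """Structured clinical entities pulled from outside documents."""
    diagnoses: list = field(default_factory=list)
    medications: list = field(default_factory=list)
    labs: list = field(default_factory=list)
    vitals: list = field(default_factory=list)
    imaging: list = field(default_factory=list)


def ingest(raw_documents):
    """Normalize outside documents (typed notes, OCR'd PDF text) into plain text."""
    return [doc.strip() for doc in raw_documents if doc and doc.strip()]


def extract_entities(texts):
    """Pull key entities and values into a structured record.
    A real system would use clinical NLP / LLM extraction; the keyword rules
    here only illustrate the output shape."""
    record = StructuredRecord()
    for text in texts:
        lowered = text.lower()
        if "metformin" in lowered:
            record.medications.append({"name": "metformin", "dose": "500 mg"})
        if "a1c" in lowered:
            record.labs.append({"test": "HbA1c", "value": 7.2, "unit": "%"})
    return record


def summarize(record):
    """Build a navigable narrative: populated sections become the table of contents."""
    sections = {
        "Medications": record.medications,
        "Labs": record.labs,
        "Imaging": record.imaging,
    }
    toc = [title for title, items in sections.items() if items]
    body = "\n\n".join(
        f"{title}\n" + "\n".join(f"- {json.dumps(item)}" for item in items)
        for title, items in sections.items()
        if items
    )
    return "Contents: " + ", ".join(toc) + "\n\n" + body


if __name__ == "__main__":
    docs = ["Patient continues metformin 500 mg BID.", "A1c 7.2% drawn 2024-01-10."]
    structured = extract_entities(ingest(docs))
    print(json.dumps(structured.__dict__, indent=2))  # structured output for interoperability
    print(summarize(structured))                      # clinician-facing navigable summary
```

In a deployed system, the structured output would be mapped to the VA’s required formats (JSON/HL7) and the summary sections would link back to the source documents they were drawn from.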
The result: why “3rd place” is more than a headline
Abstractive Health advanced through all three gates and earned 3rd place out of 200+ companies. That placement reflects performance against the VA’s scoring rubric across trust, correctness, clinical coherence, safety, and integration readiness.
It means we:
- met the operational baseline required to run in a VA environment,
- performed strongly in the scored dimensions the VA prioritized for real clinical use,
- demonstrated workflow alignment and interoperability readiness, not just model output quality.
Our team attended the Golden Envelope Ceremony, held in the Holeman Room at the National Press Club in Washington, DC, on May 21, 2024. During the ceremony, the top five winners were announced, and the top three awardees presented their solutions.

Why this matters for clinicians and interoperability
There is a big difference between records being accessible and records being usable. The VA emphasized that distinction at scale.
When community care data becomes structured, coherent, and reviewable, clinicians start the encounter with a stronger baseline understanding. That translates into less time hunting, fewer missed details, and less cognitive burden doing manual synthesis.
We thank the Department of Veterans Affairs for its leadership, its commitment to responsible innovation, and the privilege of participating in a challenge dedicated to improving care for our nation’s veterans.