C4ISR-Med Battlefield Medical Demonstrations and Experiments
Lockheed Martin ATL, January 2012
PoC: Susan Harkness Regli, susan.regli@lmco.com

Overview

Lockheed Martin (LM) has built a demonstration prototype and human-in-the-loop simulation infrastructure, called C4ISR-Med, to address information flow challenges in combat casualty care. We have conducted human factors research at multiple sites that has shown the criticality of incident and treatment data capture at the point of injury (POI). In addition, triage of a casualty situation requires fast decision-making under stress with very little intelligence about patient status. In current operations, significant information exists that could 1) improve medical Situational Awareness (SA) and aid medics in assessing situations and 2) enable safe, nonintrusive electronic reporting of casualty care data to enhance patient outcomes and medical health records.

Battlefield trauma care is the first level of in-theater medical care. Field medics are responsible for triaging and treating wounded personnel. Currently, the decision about who is treated first is made quickly, under high stress, and with little information. Vital signs can be difficult to measure in the battlefield environment, leaving field medics to make treatment decisions with little data. At the POI, there is a significant lack of consistent documentation of medically relevant information; field medics resort to writing the details of treatment and medications on bandages or on medical tape applied to the patient's skin. As casualties move from the POI to combat support hospitals, each transfer point introduces a new risk of information breakdown.

C4ISR-Med addresses these battlefield medical information challenges by leveraging tactical intelligence technologies and LM system expertise, including a tactical intelligence collection tool that uses spoken language input to create digital versions of standard reports.[1] We use proven human factors techniques to incorporate subject matter expertise into the design of medic-specific tools and usage scenarios. We have a human-in-the-loop simulation and experimentation infrastructure at the LM Center for Innovation to evaluate the feasibility and effects of increased medical intel on the battlefield. And we will continue to conduct iterative demonstrations and experimentation of C4ISR-Med to engage the user community and evaluate new solutions for transition readiness.

The three key outcome goals of the C4ISR-Med effort are:
- A seamless user interaction for medic triage and casualty reporting
- Medical intel to all levels of care for patient and tactical benefit
- A flexible simulation infrastructure to plug-and-play new solutions

[1] The tactical intel research and development for this collection tool (I2W, or Interface to the Warfighter) was funded by the Office of Naval Research as well as SOCOM SORDAC S&T; the software runs on small devices running the Android OS due to guidance from SOCOM.

© 2013 Lockheed Martin Corporation. All Rights Reserved.

To guide our work towards these goals, we have depicted an overall C4ISR-Med vision (Figure 1). In the next section we describe each segment of the vision in detail.

C4ISR-Med Prototype and Simulation

Figure 1. C4ISR-Med Vision

The C4ISR-Med vision encompasses the battlefield situation from prior to injury, through the casualty incident(s) and Level 1 care, and on to Level 2 care at the field hospital. The C4ISR-Med project has created prototype software as well as simulation capabilities at each key point to enable demonstration and provide a testbed for experimentation with best-of-breed solutions. The C4ISR-Med software runs on a wide variety of Android OS-based hardware, providing multiple form factors to fit different contexts of use. The hardware used in the simulation includes small form-factor Android devices (complete with GPS and Wi-Fi capability), small and large phones, and tablets. For our simulation, the role-based allocation of equipment is as follows:
- All personnel wear physiological sensors or have simulated sensor data.
- Individual squad members wear small devices or carry phones for sensor data collection.
- The medic wears an Android device to receive alerts and to activate spoken language processing. The medic also carries a phone for review of vitals and for processing speech into combat casualty reports. Speech can be captured using a Bluetooth microphone, a wired microphone, or by speaking directly into the phone.
- The squad leader carries a phone for lightweight blue force tracking and status review as well as digital entry of the 9-line report.
- The transport medic can have a phone and/or tablet for review of vitals and reports as well as for creating additional reports.
- The field hospital has a tablet for review of incoming data.

Using Figure 1 as a guide, we will step through the usage of these tools, as well as the simulation elements that constitute our demonstration environment. The demonstration takes place in three staged areas: Triage (Point of Injury), Medevac, and Field Hospital.

Step 1: Pre-mission baseline vitals and readiness assessment

An underlying premise of the C4ISR-Med system is that warfighters will be wearing small, unintrusive physiological sensors to monitor important vital signs both before and during a mission. LM is investigating sensors available now as well as developing our own sensors that can be worn on the body or incorporated into existing clothing and equipment to avoid adding weight. For demonstration, the C4ISR-Med system can either simulate sensor data readings or integrate data from real sensors on a person or a medical training mannequin. The integration infrastructure makes it easy to incorporate new sensors for experimentation. In the CONOPs, the data from sensors would be collected wirelessly by Android devices for each warfighter. Prior to deployment, baseline measurements for each individual would be taken in varying operational conditions to determine vitals thresholds that might indicate injury.

Step 2: Injury incident triggers alert from body-worn sensors and small Android devices

When an incident occurs that produces casualties, the medic receives an audio alert that indicates he should look at his device to review who is injured (names in red) and where they are located relative to his position (Figure 2, left). For demonstration, the algorithms that determine which vitals thresholds indicate injuries are rudimentary, although they are based on interviews with subject matter experts that gave insight into what is likely to happen when a particular type of injury occurs. The development of more accurate and sensitive algorithms is an area of potential research and experimentation. (A minimal sketch of this kind of baseline-and-threshold check follows Step 3 below.)

Figure 2. (left) The medic reviews injured personnel identity and position. (right) The medic can use a larger interface to choose a name and review the vitals for that warfighter.

Step 3: Medic can review vitals info and blue force position

If the situation is rapidly evolving and the medic can get to the injured immediately, there is no need to review any other information before starting treatment. If, however, the medic has time while an area is being cleared or while in transit, he can take out his phone to review the vitals of the patients (Figure 2, right) or see where everyone is on a map (Figure 3). The map interface was developed as a tool for tactical intel[2] and would more likely be used by a squad leader, but the information is also available to the medic.

Figure 3. The medic or squad leader can review blue force position and status.

[2] The map interface also leverages capabilities from Interface to the Warfighter (I2W).
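The paper characterizes the demonstration's alerting logic only as rudimentary, threshold-based checks against pre-mission baselines (Steps 1 and 2). As a minimal sketch of that idea, and not the actual C4ISR-Med implementation, the Java fragment below captures a per-warfighter baseline and flags a possible injury when a current reading leaves a fixed tolerance band; the class names, field names, and tolerance values are all illustrative assumptions.

```java
import java.util.List;

/** Hypothetical vitals sample from a body-worn sensor (names assumed, not from C4ISR-Med). */
class VitalsSample {
    final double heartRate;      // beats per minute
    final double respRate;       // breaths per minute
    final double systolicBp;     // mmHg
    VitalsSample(double heartRate, double respRate, double systolicBp) {
        this.heartRate = heartRate;
        this.respRate = respRate;
        this.systolicBp = systolicBp;
    }
}

/** Per-warfighter baseline captured pre-mission (Step 1), with simple tolerance bands. */
class VitalsBaseline {
    final double meanHeartRate, meanRespRate, meanSystolicBp;
    final double heartRateTolerance, respRateTolerance, systolicBpTolerance;

    VitalsBaseline(List<VitalsSample> preMissionSamples) {
        double hr = 0, rr = 0, bp = 0;
        for (VitalsSample s : preMissionSamples) { hr += s.heartRate; rr += s.respRate; bp += s.systolicBp; }
        int n = preMissionSamples.size();
        meanHeartRate = hr / n;
        meanRespRate = rr / n;
        meanSystolicBp = bp / n;
        // Fixed bands here stand in for the operationally calibrated thresholds described in the paper.
        heartRateTolerance = 40;
        respRateTolerance = 10;
        systolicBpTolerance = 30;
    }

    /** Rudimentary Step 2 check: flag a possible injury when any vital leaves its baseline band. */
    boolean indicatesPossibleInjury(VitalsSample current) {
        return Math.abs(current.heartRate - meanHeartRate) > heartRateTolerance
            || Math.abs(current.respRate - meanRespRate) > respRateTolerance
            || Math.abs(current.systolicBp - meanSystolicBp) > systolicBpTolerance;
    }
}
```

In the demonstration, a positive check would correspond to the audio alert and the red name on the medic's device; a fielded system would replace the fixed bands with the more accurate and sensitive algorithms the paper identifies as an open research area.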

Step 4: Medic captures injury and treatment info at site via voice

When the medic begins treatment, he can begin or continue documentation at any point during the casualty care. To begin documentation, he taps twice anywhere on his device screen to start the speech processing on the phone. Note that he does not need to take the phone out of his pocket or interact with it in any way to turn the speech processing on or off; this is a deliberate technology advancement so that reporting uses the hands as little as possible, leaving them free to treat the patient. Information known about the warfighter from a pre-entered profile (e.g., unit, allergies) will be automatically populated in the report, as will the time of the report. If vital signs are available from sensors, they are pre-populated to reduce the amount of information that needs to be entered by the medic. This is an example set of utterances to create a report:

"Report for P-F-C Smith. Injuries due to IED blast. Lower left leg is amputated. AVPU is unconscious. Applied TQ. BP is 145 over 95, pulse is 165, respiratory rate is 14. Inserted a nasal pharyngeal, started IV of hextend 500 milliliters. Save report."

The information can be entered in shorter statements or in a different order. The system parses the speech into a report format based on the Tactical Combat Casualty Care card (Figure 4, left). (An illustrative parsing sketch follows Step 5 below.)

Step 5: Digital Medevac request sent

The squad leader uses manual text entry to populate a digital 9-line report, including information requested from the medic about the wounded and the equipment needed on the evacuation vehicle. The 9-line could accept speech input as well; however, for the demonstration we show text entry to highlight that there are multiple ways to interact with the system based on operational constraints for silent (using text) vs. hands-free (using speech) data entry.

Figure 4. (left) Report created using spoken language utterances. (right) Field Hospital review of reports on incoming wounded.
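Figure 4 (left) shows the report produced from the spoken utterances in Step 4, but the paper does not describe the parsing approach used by the I2W-derived tool. As a deliberately simplified illustration, assuming nothing about the actual implementation, the sketch below pulls a handful of TCCC card fields out of a transcript with regular expressions; the TcccReport and UtteranceParser names and the field set are our own.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Illustrative subset of Tactical Combat Casualty Care card fields (field names assumed). */
class TcccReport {
    String patientName, mechanismOfInjury, avpu;
    Integer systolicBp, diastolicBp, pulse, respRate;

    @Override public String toString() {
        return String.format("Report[name=%s, MOI=%s, AVPU=%s, BP=%s/%s, pulse=%s, RR=%s]",
                patientName, mechanismOfInjury, avpu, systolicBp, diastolicBp, pulse, respRate);
    }
}

/** Toy parser: pulls a few fields out of transcribed medic speech with regular expressions. */
class UtteranceParser {
    private static final Pattern NAME  = Pattern.compile("report for (.+?)\\.", Pattern.CASE_INSENSITIVE);
    private static final Pattern MOI   = Pattern.compile("injuries due to ([^.]+)", Pattern.CASE_INSENSITIVE);
    private static final Pattern AVPU  = Pattern.compile("AVPU is (\\w+)", Pattern.CASE_INSENSITIVE);
    private static final Pattern BP    = Pattern.compile("BP is (\\d+) over (\\d+)", Pattern.CASE_INSENSITIVE);
    private static final Pattern PULSE = Pattern.compile("pulse is (\\d+)", Pattern.CASE_INSENSITIVE);
    private static final Pattern RR    = Pattern.compile("respiratory rate is (\\d+)", Pattern.CASE_INSENSITIVE);

    /** Fills in whatever fields the transcript mentions; pre-populated fields are left untouched. */
    TcccReport parse(String transcript, TcccReport report) {
        Matcher m;
        if ((m = NAME.matcher(transcript)).find())  report.patientName = m.group(1);
        if ((m = MOI.matcher(transcript)).find())   report.mechanismOfInjury = m.group(1).trim();
        if ((m = AVPU.matcher(transcript)).find())  report.avpu = m.group(1);
        if ((m = BP.matcher(transcript)).find())  { report.systolicBp = Integer.valueOf(m.group(1));
                                                    report.diastolicBp = Integer.valueOf(m.group(2)); }
        if ((m = PULSE.matcher(transcript)).find()) report.pulse = Integer.valueOf(m.group(1));
        if ((m = RR.matcher(transcript)).find())    report.respRate = Integer.valueOf(m.group(1));
        return report;
    }
}
```

Fields already known from the pre-entered profile or from live sensors would arrive in the report before parsing, matching the pre-population behavior described in Step 4.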

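Step 5's digital Medevac request could be carried as a simple structured message. The container below is an illustration only: the class and field names are assumptions, and the line descriptions paraphrase the standard 9-line MEDEVAC request rather than any format documented for the demonstration.

```java
/** Illustrative container for a digital 9-line MEDEVAC request (class and field names assumed). */
class NineLineRequest {
    String line1PickupLocation;          // grid coordinates of the pickup site
    String line2FrequencyAndCallSign;    // radio frequency and call sign at the pickup site
    String line3PatientsByPrecedence;    // e.g., "1 urgent, 2 priority"
    String line4SpecialEquipment;        // hoist, ventilator, extraction equipment, etc.
    String line5PatientsByType;          // litter vs. ambulatory counts
    String line6SecurityAtPickupSite;    // enemy presence at the pickup site (wartime)
    String line7MarkingMethod;           // panels, smoke, IR strobe, etc.
    String line8PatientNationality;      // nationality and status of the patients
    String line9NbcContamination;        // CBRN contamination at the site, if any (wartime)
}
```

In the demonstration the squad leader fills these fields by manual text entry, though as noted above the same structure could be populated by speech when hands-free operation is preferred.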
Step 6: Medevac adds reports and transmits all reports to TOC and hospital

When the Medevac arrives, all reports are transferred to the C4ISR-Med tablet on the Medevac vehicle. For the demonstration the transfer occurs over a wireless network, but the simulation infrastructure is designed to allow testing over varying network conditions to evaluate performance in degraded environments. Reports are transferred to the field hospital and the TOC opportunistically so that preparations for the wounded and tactical responses can begin. (A sketch of one way to queue and forward reports opportunistically follows Step 8 below.) While on the Medevac, the transport medic can review the reports, review current vitals, and create new reports if the patient's status changes and/or additional treatment needs to be documented. For demonstration, the Medevac is simulated as a vehicle in an octagon-shaped simulation environment that provides an immersive experience of driving through a variety of environments (e.g., an Afghan countryside or town). The simulation provides options to test how equipment might be used in transit through loud environments, including gunfire.

Step 7: Field Hospital receives advance casualty data

As the reports come in, the field hospital can review all of the received reports (Figure 4, right) and begin to prepare resources and personnel to treat the incoming wounded upon arrival. For the demonstration we created a report review screen to be used on the Medevac and at the field hospital, but the report data could also be integrated into existing systems and electronic health records to enhance long-term record keeping with data about treatment at the POI or in transit. The field hospital is simulated by a life-size 3D virtual wall display of avatars operating in a hospital environment. The transport medic can deliver the patient (person or mannequin) to the field hospital and interact with the avatar to play out conversations that might occur upon arrival. This simulation provides the flexibility of multiple scenarios for experimentation or training.

Step 8: Incident added to EHR

Finally, the incident data can be added to the patient's electronic health record. While the demonstration does not reach as far as long-term electronic health records, it is important to note that data about POI incidents, treatments, and patient outcomes will be available in digital format. The information that is captured will not only be larger in volume, but will also be parsed into records with tagged fields indicating what was entered for injury type, treatment, and medication. These records will be valuable not only for the long-term care of individual patients, but also for data analysis of treatments and medicines versus patient outcomes after a battlefield injury.
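The paper says reports move to the TOC and field hospital "opportunistically" and that the testbed can vary network conditions, but it does not specify the transfer mechanism. One common pattern that fits that description is a store-and-forward queue that holds reports until the destination is reachable; the sketch below illustrates that pattern under that assumption, with ReportLink and ReportForwarder as invented names rather than C4ISR-Med components.

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Hypothetical transport abstraction; a real system would wrap the tactical radio or Wi-Fi link. */
interface ReportLink {
    boolean isReachable(String destination);
    boolean send(String destination, byte[] serializedReport);   // returns false on failure
}

/** Store-and-forward queue: hold reports until the TOC or field hospital is reachable (Step 6). */
class ReportForwarder {
    private final Deque<byte[]> pending = new ArrayDeque<>();
    private final ReportLink link;
    private final String destination;

    ReportForwarder(ReportLink link, String destination) {
        this.link = link;
        this.destination = destination;
    }

    /** Queue a report; nothing is lost if the network is currently degraded or unavailable. */
    synchronized void submit(byte[] serializedReport) {
        pending.addLast(serializedReport);
        flush();
    }

    /** Called on submit and whenever connectivity is (re)detected. */
    synchronized void flush() {
        while (!pending.isEmpty() && link.isReachable(destination)) {
            byte[] next = pending.peekFirst();
            if (link.send(destination, next)) {
                pending.removeFirst();   // delivered; drop from the queue
            } else {
                break;                   // transient failure; retry on the next flush
            }
        }
    }
}
```

Under a design like this, nothing is lost when the network degrades in transit; queued reports simply flow to the TOC or hospital as soon as connectivity returns.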

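Step 8 stresses that point-of-injury data arrives as records with tagged fields for injury type, treatment, and medication rather than as free text. Purely as an illustration of what such a tagged hand-off could look like (the tag names and the XML-style carrier are assumptions, not a documented C4ISR-Med or EHR interface):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Flatten a casualty report into tagged fields for hand-off to an electronic health record (Step 8).
    Tag names are illustrative assumptions, not a documented C4ISR-Med or EHR interface. */
class TaggedRecordWriter {
    String toTaggedRecord(Map<String, String> fields) {
        StringBuilder sb = new StringBuilder("<casualtyRecord>\n");
        for (Map.Entry<String, String> e : fields.entrySet()) {
            sb.append("  <").append(e.getKey()).append(">")
              .append(e.getValue())
              .append("</").append(e.getKey()).append(">\n");
        }
        return sb.append("</casualtyRecord>").toString();
    }

    /** Example fields corresponding to the spoken report in Step 4. */
    static Map<String, String> exampleFields() {
        Map<String, String> f = new LinkedHashMap<>();
        f.put("injuryType", "IED blast, lower left leg amputation");
        f.put("treatment", "tourniquet; nasal pharyngeal airway; IV Hextend 500 ml");
        f.put("vitals", "BP 145/95, pulse 165, RR 14");
        return f;
    }
}
```

Tagged fields are what make the downstream analysis the paper anticipates possible: treatments and medications can be queried and correlated with patient outcomes instead of being buried in narrative notes.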
Demonstration Controller

A single demonstration controller running on an Android device controls all the actions happening on the C4ISR-Med devices. This provides the simulation with a large degree of flexibility because we can move simulated blue force positions (or use actual GPS), change simulated sensor readings, and create injury events during the scenario enactment. We will continue to make design and demonstration enhancements to support quantitative assessment with human-in-the-loop experiments in 2013.
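The paper lists what the demonstration controller can do (move simulated blue force positions, change simulated sensor readings, inject injury events) without describing its message format. The sketch below models those three actions as a simple control event dispatched to listening devices; the enum, class, and interface names are assumptions for illustration.

```java
import java.util.Map;

/** Kinds of scenario control actions the paper attributes to the demonstration controller. */
enum ControlEventType { MOVE_BLUE_FORCE, SET_SENSOR_READING, INJECT_INJURY }

/** Hypothetical scenario control event sent from the controller to the C4ISR-Med devices. */
class ControlEvent {
    final ControlEventType type;
    final String targetWarfighterId;       // whose position, sensor stream, or status to change
    final Map<String, String> parameters;  // e.g., lat/lon, vital sign name and value, injury type

    ControlEvent(ControlEventType type, String targetWarfighterId, Map<String, String> parameters) {
        this.type = type;
        this.targetWarfighterId = targetWarfighterId;
        this.parameters = parameters;
    }
}

/** Devices implement this to react to scenario changes during an enactment. */
interface ScenarioDevice {
    void onControlEvent(ControlEvent event);
}
```

Modeling the controller's actions as discrete events is one way to support both scripted scenario enactments and ad hoc injections during an experiment, consistent with the flexibility described above.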