TITLE: Low Band Telemedicine Decision Support System for Disaster Situations


AD

Award Number: MIPR 0EC5DXM0079

TITLE: Low Band Telemedicine Decision Support System for Disaster Situations

PRINCIPAL INVESTIGATOR: Patricia Hastings

CONTRACTING ORGANIZATION: Tripler Army Medical Center
Tripler AMC, Hawaii 96859-5000

REPORT DATE: October 2001

TYPE OF REPORT: Final

PREPARED FOR: U.S. Army Medical Research and Materiel Command
Fort Detrick, Maryland 21702-5012

DISTRIBUTION STATEMENT: Approved for Public Release; Distribution Unlimited

The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department of the Army position, policy or decision unless so designated by other documentation.

REPORT DOCUMENTATION PAGE (Form Approved, OMB No. 074-0188)

Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington Headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302, and to the Office of Management and Budget, Paperwork Reduction Project (0704-0188), Washington, DC 20503.

1. AGENCY USE ONLY (Leave blank)
2. REPORT DATE: October 2001
3. REPORT TYPE AND DATES COVERED: Final (15 Feb 00 - 30 Sep 01)
4. TITLE AND SUBTITLE: Low Band Telemedicine Decision Support System for Disaster Situations
5. FUNDING NUMBERS: MIPR 0EC5DXM0079
6. AUTHOR(S): Patricia Hastings
7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): Tripler Army Medical Center, Tripler AMC, Hawaii 96859-5000
   E-Mail:
8. PERFORMING ORGANIZATION REPORT NUMBER:
9. SPONSORING / MONITORING AGENCY NAME(S) AND ADDRESS(ES): U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702-5012
10. SPONSORING / MONITORING AGENCY REPORT NUMBER:
11. SUPPLEMENTARY NOTES:
12a. DISTRIBUTION / AVAILABILITY STATEMENT: Approved for Public Release; Distribution Unlimited
12b. DISTRIBUTION CODE:
13. ABSTRACT (Maximum 200 Words):
14. SUBJECT TERMS:
15. NUMBER OF PAGES:
16. PRICE CODE:
17. SECURITY CLASSIFICATION OF REPORT: Unclassified
18. SECURITY CLASSIFICATION OF THIS PAGE: Unclassified
19. SECURITY CLASSIFICATION OF ABSTRACT: Unclassified
20. LIMITATION OF ABSTRACT: Unlimited

NSN 7540-01-280-5500          Standard Form 298 (Rev. 2-89), Prescribed by ANSI Std. Z39-18, 298-102

DHP RFS Final Report

Low Band Telemedicine Decision Support System for Disaster Situations

Proposal Number: 1999000221
Patricia Ruth Hastings, DO, FACEP

Abstract

Problems

1) Poor quality of commercial wireless Internet connectivity
We frequently experienced poor Internet connections with the commercial wireless Internet service provider, which frustrated users. This is anticipated to improve as service improves regionally and globally.

2) Little evidence for clinical decision making
We decided to develop a decision support system for crush injury because little epidemiological study is available for other disaster-unique medical diagnoses.

3) Narrow clinical application
Considering the clinical application of such decision support systems, more benefit would be provided by decision support tools for more common diseases and injuries.

Future direction

1) Improve connectivity
A more stable network connection is required for clinical use of the system. Stand-alone applications that can store the data on a handheld device need to be developed.

2) Security and confidentiality
Security and confidentiality measures, such as encryption, IP filtering, and authentication, should be considered. These measures are also relevant to HIPAA (Health Insurance Portability and Accountability Act) regulations.

3) Wider clinical application
Decision support systems (DSS) that cover much wider clinical applications will be beneficial. Clinical DSS work for further predictive model development is planned.

4) User interface
Development of an optimal user interface for the decision support system, together with its users, is necessary.

Deliverables

1) Predictive model development
We developed a predictive model for patients with crush injury using an existing dataset from the Hanshin-Awaji Earthquake. We studied 12 possible predictive factors that are available at the field examination: age and gender; respiratory rate, systolic blood pressure, and pulse rate; availability of a urine specimen with evaluation of urine color; rescue time (hours from earthquake impact to patient extrication) and time from extrication to initial patient examination; volume of intravenous fluid during the first three days following the earthquake; and injured anatomic sites. A logistic regression model was used to build a predictive model estimating deleterious outcomes, defined as hemodialysis and/or death (severe and fatal crush syndrome). A total of 330 patient records were split into two parts: a training data set (220 cases) and a test data set (110 cases). The training data set was used to construct the logistic regression model, and the test data set was used to test the validity of the model. The three prognostic factors retained in the final logistic regression model, with their odds ratios and 95% confidence intervals, are listed in Table 1.
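The final model has the standard logistic form. As an illustration only (the three retained factors and their fitted coefficients are reported in Table 1, which is not reproduced in this text), the estimated probability of a deleterious outcome can be written as

    \hat{p}(\text{deleterious outcome}) = \frac{1}{1 + e^{-(\beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_3 x_3)}}

where x_1, x_2, x_3 denote the three retained prognostic factors and each odds ratio in Table 1 corresponds to e^{\beta_i}.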

2) Web-based database system development
Figure 1 shows the structure of the web-based database system. An Oracle 8i application program was used to develop the web-based database system. As the illustration shows, a table in a relational database is organized in rows and columns. Each column, called a field, represents a specific type of data stored in the table. For example, in the PATIENT table, columns include the patient's ID, last name, and sex. Each row, called a record, represents a set of related data about a single entity, such as a person; many rows make up a table. Each row in the PATIENT table represents one patient's demographic information. The patient ID identifies each patient or contact in the PATIENT table and is used as a key in many tables, which means that it is used to refer to any information belonging to a patient (e.g., age, gender) from other tables. Since each patient may have several data entry points, the ID in the OBS table, named OBSID, is used as the unique identifier for each record stored in the database system. The OBSID is generated automatically and identifies each record stored in the database. Care providers, however, are not required to input the OBSID to retrieve stored data; they can input a patient ID, first name, or last name to identify the patient, and the database system then returns a list of possible patients so that the care provider can identify the correct one. An ID and password are required to access any data stored in the database system.
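The project source code is not included in this report; the following is only a minimal illustrative sketch of the lookup described above, assuming a JDBC 1.2-style connection to the Oracle 8i database and hypothetical connection details and column names (ID, LASTNAME, FIRSTNAME, SEX in the PATIENT table).

    import java.sql.*;

    /** Minimal sketch of the patient lookup described above (not the original project source). */
    public class PatientLookup {
        public static void main(String[] args) throws Exception {
            // Hypothetical Oracle 8i thin-driver connection; URL, account, and schema are assumptions.
            Class.forName("oracle.jdbc.driver.OracleDriver");
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@server:1521:dss", "dss_user", "dss_password");

            // Care providers may search by patient ID, first name, or last name.
            String search = args.length > 0 ? args[0] : "Smith";
            PreparedStatement stmt = conn.prepareStatement(
                    "SELECT ID, LASTNAME, FIRSTNAME, SEX FROM PATIENT " +
                    "WHERE ID = ? OR LASTNAME = ? OR FIRSTNAME = ?");
            stmt.setString(1, search);
            stmt.setString(2, search);
            stmt.setString(3, search);

            // Return the list of possible patients so the care provider can pick the right one.
            ResultSet rs = stmt.executeQuery();
            while (rs.next()) {
                System.out.println(rs.getString("ID") + "  "
                        + rs.getString("LASTNAME") + ", "
                        + rs.getString("FIRSTNAME") + "  "
                        + rs.getString("SEX"));
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }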
3) Decision support system development

Specification for the decision support system
Server: IRIX64
Operating system: IRIX 6.5
Network: Internet
Language: Java
Software on server:
  JDK: version 1.3
  Apache Jakarta-Tomcat: version 3.x
  JDBC driver: version 1.2
Client:
  Browser (PC): Internet Explorer / Netscape
  Browser (handheld): Palmscape
  Handheld hardware: Palm Vx and Visor Prism
  Handheld operating system: Palm OS 3.5
  Network: commercial wireless ISP (OmniSky Inc.)

A JSP (JavaServer Pages) program running on the server creates web pages that can be accessed from a PC web browser or a Palm browser. The program interacts with the user, retrieves the patient's demographic data directly from the database, and then collects and inserts the patient's observation data into the database. The system calculates the probability of crush injury syndrome and provides a suggestion of how to handle the patient.

Decision support system for handheld
On the initial screen of the decision support system, care providers must input their user ID and password in order to enter the system. In the next step, care providers are asked to input the patient's demographic information. The web-based database system automatically generates a patient ID, which care providers can put on the triage tag for future information retrieval (Figure 2). Care providers are then asked to input important physiologic information about the patient, including vital signs and the important predictive factors (Figure 3). All data input here are stored in the database for future retrieval. After these data are submitted, the server computer calculates the probability of a deleterious outcome based on the data input on the handheld and sends the result back to the care provider. Figure 4 shows the final decision support screen: the estimated probability of a deleterious outcome and its 95% confidence interval are visualized on the screen.
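The JSP source itself is not reproduced in this report; the sketch below is a hypothetical servlet-style equivalent of the flow just described: it reads the submitted predictor values, applies a logistic model with placeholder coefficients (the real coefficients correspond to Table 1), and returns a page that a PC or Palm browser can display. The class name, parameter names, coefficients, and suggestion text are illustrative assumptions, not the project's actual implementation.

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    /** Hypothetical sketch of the server-side risk calculation; not the original JSP. */
    public class CrushRiskServlet extends HttpServlet {

        // Placeholder coefficients only; the fitted values correspond to Table 1 of the report.
        private static final double B0 = -2.0, B1 = 0.05, B2 = -0.02, B3 = 1.2;

        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            // Physiologic predictors submitted from the handheld form (parameter names assumed).
            double x1 = Double.parseDouble(req.getParameter("rescueTimeHours"));
            double x2 = Double.parseDouble(req.getParameter("systolicBP"));
            double x3 = "no".equals(req.getParameter("urineAvailable")) ? 1.0 : 0.0;

            // Logistic model: probability of a deleterious outcome (hemodialysis and/or death).
            double logit = B0 + B1 * x1 + B2 * x2 + B3 * x3;
            double p = 1.0 / (1.0 + Math.exp(-logit));

            // Simple page that both PC browsers and the Palm browser can display.
            resp.setContentType("text/html");
            PrintWriter out = resp.getWriter();
            out.println("<html><body>");
            out.println("<p>Estimated probability of deleterious outcome: "
                    + Math.round(p * 100) + "%</p>");
            out.println(p >= 0.5
                    ? "<p>Suggestion: treat as possible severe crush syndrome.</p>"
                    : "<p>Suggestion: lower risk; continue monitoring.</p>");
            out.println("</body></html>");
        }
    }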

Expenditures

Element of Resource (EOR)        3Q FY00         4Q FY00         1Q FY01         2Q FY01         TOTALS
                                 Apr 1 - May 31  Jun 1 - Sep 30  Oct 1 - Dec 31  Jan 1 - Mar 31
Travel (2100)                    0.00            0.00            0.00            0.00            0.00
Shipping (2200)                  0.00            0.00            0.00            0.00            0.00
Rent & Communications (2200)     0.00            0.00            0.00            0.00            0.00
Contract for Services (2500)     0.00            0.00            0.00            0.00            0.00
Supplies (2600)                  0.00            0.00            0.00            0.00            0.00
Equipment (3100)                 0.00            0.00            0.00            0.00            0.00
GRAND TOTALS                     0.00            0.00            0.00            0.00            0.00

Financials: Default stated to be 0.0.

Final Results

1) Field test
The configured PDA and wireless connection to the server were user-tested by six emergency physicians, nurses, and medics from Tripler Army Medical Center. The individuals were asked to input fictitious patient data and send it wirelessly to the server. The events observed were 1) ease of use, 2) rapidity of use, 3) any user learning that was required, and 4) preferences in use. The test took place outdoors for maximum connectivity. Two units were unable to establish connectivity rapidly. The users found the PDA easy to use in its configuration and input fields. Rapidity of use depended on an individual's prior experience with a PDA. Those familiar with the PDA were rapid in inputting patient data and sending it to the server, and they were pleased to receive a calculated decision from the server within seconds. Those not familiar with the PDA were able to learn the process quickly (within 5 minutes), although the Graffiti feature distracted them and they preferred the keyboard to avoid mistakes. Most individuals preferred the keyboard for inputting patient data. The overall comments about the system were very positive, and the individuals agreed that this system could make important contributions to emergency healthcare.

2) Comparison between expert opinion and the decision support system for decision making with cross-sectional data

Expert opinion
Two medical doctors, two nurses, and two medics participated in this trial. The following information, derived from actual patients with crush injury at the Hanshin-Awaji Earthquake, was distributed to each responder: age, gender, injury site (head, chest, abdomen, right arm, left arm, right leg, and left leg), time until rescued, respiratory rate, systolic blood pressure, pulse rate, availability of urine, and urine color if available. One hundred twenty patient records were selected because they had nearly complete information. Each responder was asked to roughly estimate, for each patient, the probability of a deleterious outcome, defined as receiving hemodialysis or death.

Validation of the prognostic model
The remaining 110 cases (the test dataset) were used to validate the predictive model. The missing values in this data set were imputed in order to correctly evaluate the usefulness of the predictive model in a real-world situation.

Evaluation of diagnostic accuracy
All raw probabilities derived from both the decision support system and the experts were used to generate receiver operating characteristic (ROC) curves to evaluate performance. The ROC curve represents the relationship between sensitivity and specificity by plotting the true-positive rate (sensitivity) against the false-positive rate (1 - specificity) as the cutoff level of the model varies. The area under the ROC curve (AUC) is based on a non-parametric statistical sign test that compares the probability of events between pairs of patients who have the event and those who do not. The AUC may be interpreted as the probability that, given any two subjects, one who dies and one who survives, the model would assign a higher probability of death to the one who dies. It is a measure of the overall classification performance of a diagnostic test or prognostic model. All raw probabilities were then converted into binary categories, using a probability of 50% as the threshold between possible good and bad outcomes: any estimated probability less than 50% was classified as a "good outcome," and any probability greater than or equal to 50% was classified as a "deleterious outcome."
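To make these two calculations concrete, the following self-contained sketch (not taken from the project) computes the AUC by its pairwise interpretation and then dichotomizes the raw probabilities at the 50% threshold to obtain sensitivity, specificity, and accuracy; the example data are invented.

    /** Illustrative sketch of the AUC and threshold calculations described above. */
    public class DiagnosticAccuracy {

        // AUC as the probability that a randomly chosen event case receives a higher
        // predicted probability than a randomly chosen non-event case (ties count 1/2).
        static double auc(double[] probs, boolean[] event) {
            double pairs = 0, favorable = 0;
            for (int i = 0; i < probs.length; i++) {
                if (!event[i]) continue;
                for (int j = 0; j < probs.length; j++) {
                    if (event[j]) continue;
                    pairs++;
                    if (probs[i] > probs[j]) favorable += 1.0;
                    else if (probs[i] == probs[j]) favorable += 0.5;
                }
            }
            return favorable / pairs;
        }

        public static void main(String[] args) {
            // Invented example: predicted probabilities and actual deleterious outcomes.
            double[] p = {0.10, 0.80, 0.40, 0.65, 0.20, 0.90};
            boolean[] bad = {false, true, false, true, false, true};

            System.out.println("AUC = " + auc(p, bad));

            // Dichotomize at the 50% threshold and tally the 2x2 table.
            int tp = 0, fp = 0, tn = 0, fn = 0;
            for (int i = 0; i < p.length; i++) {
                boolean predictedBad = p[i] >= 0.5;
                if (predictedBad && bad[i]) tp++;
                else if (predictedBad && !bad[i]) fp++;
                else if (!predictedBad && !bad[i]) tn++;
                else fn++;
            }
            System.out.println("Sensitivity = " + (double) tp / (tp + fn));
            System.out.println("Specificity = " + (double) tn / (tn + fp));
            System.out.println("Accuracy    = " + (double) (tp + tn) / p.length);
        }
    }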
These results were compared to the actual outcome data for each responder, and for the computer-based decision support system, to calculate sensitivity and specificity.

Results

ROC curve and AUC
Figure 5 shows the ROC curves for responders A through F and for the decision support system. Table 2 shows the AUC of each ROC curve. The decision support system showed the best AUC, which is statistically significantly better than chance (coin-flipping) for decision making with cross-sectional data.

Sensitivity and specificity
Table 3 shows the overall accuracy, sensitivity, and specificity of the decision support system and of the experts' decision making. As the AUC suggests, the overall accuracy of the decision support system is the best of all. The sensitivity and specificity of the experts' decision making, however, show some interesting aspects of medical decision making in a triage situation. Specificity for experts B, C, and E was extremely high compared with the decision support system, which means that these experts tried to identify only the very severe patients who required immediate rescue activities (aggressive triage). In contrast, expert D seemed to try to increase sensitivity, identifying as many patients as possible to avoid false negatives (conservative triage). Experts A and F showed attitudes similar to the decision support system (neutral triage). This kind of decision depends entirely on the purpose, situation, and location of triage as well as on resource availability, and naturally the computer cannot make such a decision without input regarding this information. These results therefore showed an important difference between computer-based and expert decision making, and the role of computer-based decision support in triage. Computers may be able to provide good results if the care provider can give the system crucial information, such as the purpose, situation, and location of triage and resource availability, which the computer cannot determine on its own. The results also showed the difficulty of performing triage with cross-sectional data entry; care providers usually make decisions using time-sequential information. This, too, is an important role for a computer-based decision support system.

Projected Costs
o Current commercial wireless Internet service is not sufficient for this purpose. We need to use alternative technology to resolve this important limitation.
o We identified concrete solutions for this issue during the project and plan to implement them in future projects; therefore, we cannot calculate the cost at this time.

Comments
o We realized that technologies improved rapidly during this project. Some useful technologies that were not available when we submitted the proposal are now extensively applied.
o Considering this, we were able to try several advanced telemedicine technologies in order to solve current problems in disaster medicine.

TATRC Scientific Review
TATRC Acquisition Review

Supporting Graphs/Charts: See attached.

Table 3. Sensitivity and specificity
(Rows: predicted outcome; columns: actual outcome. Sensitivity and specificity are computed with a deleterious outcome as the positive class.)

Decision Support System
  Predicted        Actual Good   Actual Deleterious   Total
  Good                  14               7              21
  Deleterious            3              10              13
  Total                 17              17              34
  Accuracy 70.59%   Sensitivity 58.82%   Specificity 82.35%

Expert A
  Good                  36              21              57
  Deleterious           15              36              51
  Total                 51              57             108
  Accuracy 66.67%   Sensitivity 63.16%   Specificity 70.59%

Expert B
  Good                  53              45              98
  Deleterious            3              17              20
  Total                 56              62             118
  Accuracy 59.32%   Sensitivity 27.42%   Specificity 94.64%

Expert C
  Good                  49              53             102
  Deleterious            5              10              15
  Total                 54              63             117
  Accuracy 50.43%   Sensitivity 15.87%   Specificity 90.74%

Expert D
  Good                  10               8              18
  Deleterious           44              55              99
  Total                 54              63             117
  Accuracy 55.56%   Sensitivity 87.30%   Specificity 18.52%

Expert E
  Good                  47              45              92
  Deleterious            6              19              25
  Total                 53              64             117
  Accuracy 56.41%   Sensitivity 29.69%   Specificity 88.68%

Expert F
  Good                  41              23              64
  Deleterious           15              41              56
  Total                 56              64             120
  Accuracy 68.33%   Sensitivity 64.06%   Specificity 73.21%
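As a worked check of how the percentages in Table 3 follow from the counts (using the decision support system's block, with a deleterious outcome treated as the positive class):

    \text{Accuracy} = \frac{14 + 10}{34} \approx 70.59\%, \qquad
    \text{Sensitivity} = \frac{10}{17} \approx 58.82\%, \qquad
    \text{Specificity} = \frac{14}{17} \approx 82.35\%.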