Figure 1 - e-learning Research Roadmap. SSCP IRB Determination Date: 20 Apr 2015; SSCP IR-EP7-A; Expires: 19 Apr 2016


In 2008, the Commander, U.S. Naval Forces Europe (CNE) set a requirement for a multi-national, multilingual e-learning environment that would support Allies, Coalition partners, North Atlantic Treaty Organization (NATO) nations, Partnership for Peace (PfP) nations and selected African nations. This requirement was based on meeting an urgent demand to provide training and education capabilities that would meet the Combatant Commander's (COCOM) requirement to operate effectively in large areas of operations (AOs). The most difficult challenge is the ability to train and communicate across such vast AOs with limited connectivity and infrastructure. To meet this need, the Coalition Warfare Program (CWP) initiated a project, entitled the Multinational Virtual Learning Environment (MVLE) [1], which also included several science and technology research projects with the support of the Office of the Secretary of Defense (OSD) Advanced Distributed Learning (ADL) Co-Lab, the Office of Naval Research (ONR), ONR Global (ONRG), and the U.S. Air Force European Office of Aerospace Research and Development (EOARD). MVLE was based on an emergent requirement to establish a distance learning program that would assist key regions of Europe and Africa in training future military and civilian leaders. It was the first step in providing an e-learning capability within the global infrastructure, focused on creating and strengthening partnerships with key nations in the Black Sea Region (BSR)/South Eastern Europe (SEE). Other areas of focus included addressing DoD International Security Cooperation objectives and using regional expertise in the African, Baltic Sea, Caspian Sea, Black Sea and Mediterranean Sea regions to assist in developing an advanced e-learning capability.
This development and deployment effort involved ten countries [2] and was viewed as an important step in establishing a coalition e-learning environment with non-traditional partners based on common goals and objectives. The project's Proof of Concept (PoC) was successfully conducted in conjunction with the U.S. Department of Defense, U.S. European Command (USEUCOM) and particularly the U.S. Navy in key regions of Europe. The following shortfalls needed to be addressed in order to fully meet COCOM training requirements and the needs of the individual user: (1) more functionality, (2) ease of use, (3) up-to-date learning content, (4) better, faster connectivity and (5) more advanced navigation.

Figure 1 - e-learning Research Roadmap

[1] Multinational Virtual Learning Environment (MVLE) Concept Evaluation Operational Evaluation Report; 1 October 2008
[2] Azerbaijan, Bulgaria, Georgia, Germany, Moldova, Norway, Romania, Tadzhikistan, United Kingdom, United States

After the MVLE Project concluded, the DoD members [3] of the MVLE Working Group identified several approaches to addressing the aforementioned shortfalls. As the e-learning Research Roadmap (Figure 1) shows, science and technology research projects were initiated in three areas (i.e., general e-learning topics, speech-enabled learning, and mobile learning (m-learning)). Each area focused both on the short-term requirements for providing a capability using the low-hanging fruit of technology, and on the long-term capability requirements needed by the DoD that had not yet matured (e.g., automatic language translation, speech-enabled capabilities). In 2010, ONR and ONRG funded a two-year research project, entitled Multimodal Enabled Advanced Distributed Learning (ME-ADL), to assess the use of speech as an enabled capability. The results indicated that using speech as an enabled capability, by providing snippets of learning content, was an effective way of ensuring that core learning goals and objectives were being fulfilled, and that speech, as a learning medium, provided more interaction and engagement within multicultural environments. In 2011, the Coalition Warfare Program project entitled Mobile Learning Environment (MoLE) was initiated. It utilized state-of-the-art mobile technology to wirelessly link remote and highly mobile users directly with resources to obtain training and education. Additionally, it integrated m-learning into the Deputy Director, Joint Staff (J-7) for Joint and Coalition Warfighting (DD J7 JCW) Joint Knowledge Online (JKO) portal. This facilitated the sharing of educational content between U.S. and multinational partners.
[3] USEUCOM, Office of the Secretary of Defense for Personnel & Readiness (OSD P&R), Deputy Director for Joint Staff (J7) Joint and Coalition Warfighting (DD J7 JCW) Joint Knowledge Online (JKO), Commander Naval Forces Europe (CNE), U.S. Space & Naval Warfare Systems Center Pacific (SSC-Pacific), SSC-Europe, SSC-Charleston, Office of Naval Research (ONR), Office of Naval Research Global (ONRG), ONR Program Reserve Component (RC)

Since the conception of the MVLE Project, learning technologies have developed to a point where the technology can personalize education; rapidly assess student learning; support social learning, serious games, and intelligent tutoring; diminish learning boundaries (e.g., language training and cultural awareness); and provide alternative teaching methods and approaches to lifelong learning. To meet the goal for a technology-enabled capability, the Multichannel Learning System (MLS) Project was identified as a follow-on project to provide a more effective, personalized learning capability for U.S. and international partners. The MLS Project is a joint research collaboration effort on behalf of the Naval Education and Training Security Assistance Field Activity (NETSAFA), the Defense Institute of Security Assistance Management (DISAM), the U.S. Marine Corps Security Cooperation Group (MCSCG), and ONR/ONRG. The Defense Security Cooperation Agency (DSCA), which directs, administers and supervises execution of various Security Cooperation Programs managed at NETSAFA, MCSCG, and DISAM, has endorsed this project.

4. Identify the sponsor and, if known, future users of the data/results

a. Sponsor
Office of the Secretary of Defense (OSD) Coalition Warfare Program (CWP)

b. Future Users of Data/Results
1) Dr. Kristen Barrera, U.S. Air Force, Wright Patterson AFB, 711th HPW/RHAS
2) CDR Gary Anaya, ONR RC

The data users listed will publish the analyzed, aggregated results in a report and brief the results to the MLS Project Team. The extramural investigators will receive a copy of the final report and the briefing upon request. After the final report is published, the raw, de-identified data may also be provided to team investigators, if requested; however, the requirements and process to ensure the data are protected will be worked out individually.

5. Briefly describe the objectives of the project, the research plan, and methodology with particular emphasis on direct or indirect interaction with human subjects or their identifiable data. Describe why human subjects (or their data) must be in the research and if there are any alternatives.

a. Objectives of the project
The MLS Project objective is to evaluate multiple formats of learning content to support individual learning approaches and preferences. DSCA, in particular, has been interested in identifying and implementing the best methods for providing distance education in order to prepare international military students for a resident training experience in the United States. To meet this objective, content from the International Military Student Pre-Departure Briefing (IMSPDB) course will be provided in multiple formats (i.e., e-book, mobile app, video and e-learning) to military and civilian personnel, both U.S. and international, to assess these tailored learning methods. The goals for the work conducted under this protocol are to: (1) identify the best methods for providing distance education for international military students and (2) evaluate the effectiveness of having multiple learning formats to support SCETP requirements. At the end of this project, the U.S. will have an enhanced operational training capability with participating partner nations. These capabilities will be integrated into other Security Cooperation Education and Training programs managed by DSCA.

b. Research Plan and Methodology
This study includes three phases: Recruiting, Concept Evaluation and Data Collection, and Data Analysis.

i. Recruiting
Based on the MLS Project goals, the following groups of potential volunteers have been identified by the project team for the MLS Concept Evaluation study:

1) International military students (IMS) attending training at U.S. training sites. These students are selected because a) they are international military students and therefore specifically address DSCA's objective, b) they have a sufficient level of English language proficiency, and c) they can be readily recruited for participation since they will be in the U.S. at the time of recruitment. The recruitment process begins when the students are enrolled in a school and are at the U.S. training site.

2) U.S. government civilians or military personnel. This group has been selected because they can be readily recruited and can also serve as a baseline group to better understand the international military student responses.

The following describes in more detail how each group will be selected, what the basis will be for inclusion in the research, and what the volunteers will be doing as part of the study. In Section 9, specific steps are described to reflect how participants from each group will be recruited.

International Military Students (IMS)
International students enrolled at military training sites located in the U.S. will be identified based on a query of the Security Assistance Network (SAN) Web Training database. The database contains names and email addresses of those international students currently enrolled in training courses. Additional factors for inclusion/exclusion are the following: 1) students will be in the U.S. during the timeframe of the Concept Evaluation; 2) students have provided a non-military or unofficial designated email address, such as a .com address, so that the invitation to participate will not be blocked or filtered and go unnoticed by the recipient; and 3) the resultant potential participant list will include Army, Air Force and Navy students. The IMS volunteers selected from the various U.S. training sites will be sent an invitation email that asks them to participate, notifies them that their International Military Student Officer (IMSO) is their contact for any general questions or concerns regarding the study, and includes a preview of the Informed Consent plus Privacy Act Statement. A sample of the recruitment letter that will be sent is provided in Appendix B. The recruitment letter will be the same for all training sites. The IMSO for each training site is known to the students upon their arrival and prior to the recruitment letter being sent. The list of training sites and the associated IMSO/point of contact is provided in Appendix F. The Informed Consent plus Privacy Act Statement is provided in Appendix C.

U.S. Government Civilians and Military Personnel
Two U.S. organizations, NETSAFA and DISAM, will recruit U.S. civilian and military volunteers. An ALL HANDS email from each command's leadership will ask for voluntary participation in the study. Since the email will be the only correspondence from the respective command leadership about the study, interested individuals must contact the site Ombudsman with their name and email address.
The list of names and email addresses will be provided by the Ombudsman to Mr. Hodges, MLS Project Support. Questions or concerns about the study will be addressed by the site Ombudsman. Like the IMS volunteers, the U.S. civilian and military volunteers will be sent an invitation email by Mr. Hodges that asks them to participate, notifies them that their site Ombudsman is their contact for any general questions or concerns regarding the study, and includes a preview of the Informed Consent. Samples of the all-hands recruitment letters that will be sent are provided in Appendix B. The local points of contact (i.e., the ombudsmen) at the respective commands are included in the list provided in Appendix F. The Informed Consent plus Privacy Act Statement is provided in Appendix C.

ii. Concept Evaluation and Data Collection
All potential volunteers who receive an invitation email will also receive a registration email 3 days following the invitation email. The personalized Uniform Resource Locator (URL) and personal identification number (PIN) provided in the registration email will give each volunteer access to the study's web portal (known as the MLS Data Collection Portal). The individual can access the portal at any time during the data collection period per the registration email. If desired, the volunteer can opt out by not going to the URL or not entering his/her personalized PIN. At this point, no data will be collected; he/she will no longer be considered a volunteer in the study and will not be contacted any further. Volunteers who wish to participate in the study must enter the MLS Data Collection Portal using their URL and PIN. The following describes the three (3) general steps for the participants.

Step 1: Entering the portal automatically registers the participant in the MLS Data Collection system. The participant will be asked to read and accept the Informed Consent agreement by clicking on the Proceed button (see Section 11). The participant will then be asked to complete a demographics questionnaire and take a pre-test. This step should take approximately four (4) minutes.

Step 2: The participant will be asked to view and evaluate at least two different learning formats (i.e., e-book, mobile app, video or e-learning) of his/her choice. The participant will evaluate the formats by (a) clicking on the video format, (b) clicking on the e-learning format, (c) downloading the e-book in Android/Apple or Amazon Kindle format, and/or (d) downloading the DISAM Mobile App from the Android or Apple store using the Quick Response (QR) code. Each format contains material on the following topics: American and Military Cultures, Cultures, Individualism, Punctuality, Informality, Egalitarianism, Short Term Mentality and Daily Life in America. However, we will only be evaluating the Cultures, Individualism, and Punctuality topics in this study. The participant will be asked to view the material with the intent of learning the content, in addition to identifying his/her likes and dislikes about the formats being evaluated. This step should take approximately 18 minutes for two learning formats.

Step 3: After evaluating the learning formats, the participant will go back to the MLS portal and click the Proceed button, indicating he/she has completed the review of the different learning formats. At this point, the participant will not be allowed to return to the learning formats.
The participant will then be asked to complete (a) the Post-Test and (b) the final Evaluation Questionnaire. This step should take approximately 23 minutes. If, at any time during steps 1-3, the participant decides to withdraw, he/she needs only to stop and not participate; no notification to anyone is required. If, however, the individual would like his/her data to be removed from the MLS Data Collection Portal, he/she will need to contact Mr. Hodges by email and provide the PIN that he/she used to register for the study. Mr. Hodges will flag the responses correlated with that PIN to ensure data removal before the data analysis is performed. The study will use whatever portion of the evaluation has been completed, if feasible.

iii. Data Analysis
The data collected during the evaluation will not contain any personal data or identifiable characteristics. All data collection will be limited to only the essential information required to evaluate the success of the project. Siebenundvierzig ING GmbH & Co. KG will maintain the data collected during the MLS Concept Evaluation on the MLS Data Collection server. The system administrator of the server will establish an account for users requiring privileged access. For an approved user, privileged access to the database storing the de-identified data will only be possible with a username and password. Once the evaluation has been completed, the data collected during the evaluation will be downloaded by Mr. Hodges, who will be an approved user. The de-identified data will be checked by Mr. Hodges to ensure that the appropriately flagged data are removed and that the remaining data are usable by the Statistical Package for the Social Sciences (SPSS) analysis software. Flagged data are data collected from participants who subsequently requested to have their data removed from inclusion in the study.
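The flag-and-remove step described above could be sketched as follows. This is a minimal illustration, not the MLS implementation: the record layout, PIN format, and function name are assumptions.

```python
# Illustrative sketch of removing flagged responses before analysis.
# The record layout and PIN values are assumptions, not the MLS schema.

def remove_flagged(responses, withdrawn_pins):
    """Drop every response whose PIN matches a withdrawal request."""
    flagged = set(withdrawn_pins)
    return [r for r in responses if r["pin"] not in flagged]

responses = [
    {"pin": "1001", "post_test": 8},
    {"pin": "1002", "post_test": 6},
    {"pin": "1003", "post_test": 9},
]
# Participant 1002 emailed a withdrawal request quoting their PIN.
cleaned = remove_flagged(responses, withdrawn_pins=["1002"])
```

Because the PIN is the only link between a withdrawal request and the stored responses, removal can be performed without ever touching names or email addresses.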
These data will be deleted, and the resultant collected data will be analyzed by the MLS Testing & Evaluation (T&E) Team using the SPSS software. All data collected on the MLS Data Collection Portal will be destroyed within 7 days following the conclusion of the evaluation period. Subsequently, the non-attributable results will be sent to the MLS Project Sponsors to support further research related to the Security Cooperation Education and Training

Program (SCETP). All data collected during the Concept Evaluation and data used for analysis will only be accessible as a password-protected file. When the data are not in use with SPSS, they will be removed from the computer, stored on a password-protected USB drive and locked in a file cabinet. The data will be stored by, and only accessible to, Dr. Barrera, located at 711 HPW/RHAS, Warfighter Readiness Research Division, 2620 Q Street (Bldg 20852), Wright Patterson AFB, OH 45433-7955. Analysis will be focused on technical and learning effectiveness. Specific categories for analysis are technology format, usability, utility, learning, and interest, as described in Section 12.

6. Describe any anticipated benefits to the human subjects, the Navy and/or society.

1) There is no direct benefit to the participant other than personal knowledge and satisfaction that his/her evaluation input will help to identify improved learning methods for supporting international military students.
2) The Department of the Navy's International Security Cooperation Program (ISCP), managed by NETSAFA and the MCSCG, will be able to provide a more dynamic training capability to international students attending Navy- and Marine Corps-related schools.

7. To what other reviews, if any, is this study subject?

This study is not subject to any other U.S. reviews.

8. To what other regulations is this data collection effort subject (e.g., Privacy Act) and how will it/they be implemented?

A Privacy Act statement will be appended to the Informed Consent that will be online as part of the process to participate in the study. Requirements for privacy and informed consent are met by the document represented in Appendix C.
The statement pertaining to the Privacy Act will be as follows:

PRIVACY ACT STATEMENT

Authority: SECNAVINST 5211.5E, Department of the Navy Privacy Act (PA) Program, 10(a) Page 12, 10(d) Page 13, of 28 December 2005.

Purpose: Human performance data and other research information will be collected in the research project entitled Multichannel Learning System (MLS) Project.

Routine Uses: The Departments of the Navy and Defense, and other U.S. Government agencies, will use the resulting research data for learning methodology analyses and reports. Use of the information obtained may be granted to non-government agencies following the provisions of the Freedom of Information Act or contracts and agreements. I voluntarily agree to its disclosure to the agencies or individuals identified above, and I have been informed that failure to agree to this disclosure may make the research less useful. The Blanket Routine Uses that appear at the beginning of the Department of the Navy's compilation of databases also apply to this system.

Voluntary Disclosure: Participation in this study and provision of information is voluntary. However, failure to provide requested information may invalidate test data and/or test procedures and could therefore result in removal from the project. Dismissal from the research project will involve no reproach, prejudice or jeopardy to my job or status.

9. How will participants be recruited? Provide a description of the specific steps to be used to identify and/or contact prospective subjects and require the inclusion of any scripts and/or advertisements to be used for any telephone contacts, advertisements, verbal contact, etc.

The groups being recruited for this study will be contacted through email. Due to the makeup of the potential volunteers, there will be a different approach to recruiting for each group. The international military students will initially be identified through the following steps:

1. Based on the contract with DISAM, Lockheed Martin personnel will query the Security Assistance Network (SAN) and produce a data report of all International Military Training Lines enrolled in U.S. military schools during the planned study timeframe. For example, a review of the period 15 Dec 2014 to 28 Feb 2015 yields over 600 training lines (one training line = one student).

2. The list is initially filtered by removing training lines that do not have student email addresses. This is important since the only contact with a potential volunteer is via email.

3. The list is then filtered a second time to remove training lines with student email addresses that appear to be from U.S. Embassies or local government.
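The two-pass filtering in steps 2 and 3 amounts to a simple pipeline over the SAN report. A minimal sketch follows; the field names and the blocked-domain list are illustrative assumptions, not taken from the SAN schema.

```python
# Illustrative sketch of the two-pass filter over SAN training lines.
# Field names and the blocked-domain list are assumptions, not the SAN schema.

def filter_training_lines(training_lines, blocked_domains=(".gov", ".mil")):
    """Keep only training lines with a usable, non-government student email."""
    # Pass 1: drop training lines that have no student email address.
    with_email = [line for line in training_lines if line.get("email")]
    # Pass 2: drop addresses that appear to belong to an embassy or
    # local-government system, where the invitation could be blocked.
    return [
        line for line in with_email
        if not line["email"].lower().endswith(blocked_domains)
    ]

lines = [
    {"student": "A", "email": "a.student@example.com"},
    {"student": "B", "email": None},                      # no email: removed in pass 1
    {"student": "C", "email": "c.liaison@embassy.gov"},   # government: removed in pass 2
]
volunteers = filter_training_lines(lines)
```

In this sketch only student A survives both passes and would receive the invitation email.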
This is necessary to ensure the future notification email will not go unnoticed or be blocked by a government email system.

4. The result of the two previous filters is a list of potential volunteers. For example, during 15 Dec 2014 to 28 Feb 2015, there are Professional Military Education (PME) courses with 268 potential volunteers across three services (Army, Air Force, and Navy), including 1 Senior Enlisted PME course.

The list of potential volunteers will receive an invitation email (a sample is provided in Appendix B). No recruitment of international students will occur until the subjects are students at a U.S. military training site. The following is a listing of the eleven (11) U.S. military training sites that will participate in the study:

- U.S. Army Maneuver Center of Excellence, Fort Benning, GA
- U.S. Army Signal School of Excellence, Fort Gordon, GA
- U.S. Army 128th Aviation Logistics School, Fort Eustis, VA
- U.S. Army MANSCEN Center of Excellence, Fort Leonard Wood, MO
- U.S. Army War College, Carlisle Barracks Command, PA
- U.S. Navy Naval School Explosive Ordnance Disposal, Eglin AFB, FL
- U.S. Navy Center of Naval Aviation Technical Training Detachment, Milton, FL
- U.S. Navy NETSAFA International Training Center, Pensacola, FL
- U.S. Navy CNATT Naval Aviation Schools Command, Pensacola, FL
- U.S. Air Force 81st Training Group, Keesler AFB, Biloxi, MS
- U.S. Air Force DLIELC, Lackland AFB, San Antonio, TX

For U.S. government and military personnel, the recruitment process begins with an ALL HANDS email that will go out to the respective commands. The two (2) commands that will be recruiting are NETSAFA and DISAM. The ALL HANDS email will contain information about what the study is, the

timeframe of it, and who to contact if interested in participating. The emails are currently being drafted and will be provided upon request. The email will be the only correspondence from the respective command leadership about the study to recruit potential volunteers.

10. Describe the nature and extent of risk the collection of these data pose to the human subjects. Assess direct impact to the subject at the time of participation (physical, emotional), and possible future impact that the disclosure of the subject's responses could have on his/her financial standing, career, employability, insurability, reputation, etc. Describe procedures that will be implemented to minimize the risk.

The subjects will be advised that participation is voluntary and anonymous and that they are free to withdraw from the study at any time. Prior to participation, during the recruitment phase, a potential inferred risk to volunteers is pressure on the individual to participate as a result of undue influence by a supervisor or superior-ranked official. To mitigate this risk, the subjects will be advised that their supervisors will not know whether or not they participate. In the event that a subject has concern about supervisory influence, they will have access to their training site's or command's Ombudsman as a third party for report and counsel. Additionally, the project team lead (Ms. Howell) will contact the Ombudsmen to monitor the recruitment process. At the time of participation, the risks that have been identified are the same as those one would face when interacting with any educational online digital content.
These include: 1) possible anxiety if the participant does not complete the evaluation; 2) uneasiness if it takes the participant longer than he/she expects to complete the evaluation; and 3) apprehension due to misunderstanding of, or confusion over, some words, terms, or concepts in the material being evaluated. All participants are expected to have some level of proficiency in the English language, so risks of misunderstanding should be minimal. All of these risks are also listed in the Informed Consent for the volunteers' awareness. After participation, no physical, mental, or long-term risks are anticipated. A possible risk is the accidental release of the collected responses correlated to the individual participants, that is, the data files with the participants' emails and associated PINs. This risk is minimal since only one team member, the US Extramural Investigator (Mr. Hodges), will have access to the data files, and he will delete those files after the closing date of the evaluation period to ensure none of the findings/reports could possibly identify any subject's response. Only de-identified data will be used in the analysis phase of this study.

11. How will subjects be informed of their rights? Will informed consent be obtained?

All potential participants will be provided a preview of the Informed Consent with the MLS Invitation email. During the registration process (on the portal) to participate, each candidate will be provided an opportunity to re-read the Informed Consent. In order to participate in the study, candidates must Accept and Acknowledge the Terms and Conditions of the Informed Consent as shown in the representative example below.

12. Describe any questions/items that will be asked or data elements that will be collected or accessed from existing databases. (Attach a copy of questions, data elements, or survey/assessment instruments. If there are currently none available, provide a sample of representative items.)

The goals of the research study are to (1) identify the best methods for providing distance education for international military students and (2) evaluate the effectiveness of having multiple learning formats to support SCETP requirements. In meeting these goals, the data elements collected will be grouped into two areas: the Technical Effectiveness Evaluation and the Learning Effectiveness Evaluation. The Technical Effectiveness evaluation will determine whether the formats provided (i.e., e-book, mobile application, video, and e-learning) are an effective means for providing distance education for international students. Questions relating to this area will determine whether potential military students can access the different formats, whether the content is suited to the specific formats, and whether the format worked, was easy to read/listen to, and presented well-constructed learning content. The Learning Effectiveness evaluation will collect data elements to determine whether learning transfer occurred and whether the training provided an enjoyable experience. Questions related to this area will focus on usability, utility, usefulness, learning, and interest/desirability.
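As an illustration, the learning-transfer determination rests on comparing participants' pre- and post-test scores. The following is a minimal sketch of such a paired comparison using hypothetical scores; the study's actual analysis plan (and tooling, e.g., SPSS) may differ.

```python
from math import sqrt
from statistics import mean, stdev

def paired_gain(pre: list[float], post: list[float]) -> tuple[float, float]:
    """Return the mean score gain and a paired t statistic.

    Each index is one participant; scores are hypothetical (0-100 scale).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    gain = mean(diffs)
    # Paired t statistic: mean difference divided by its standard error.
    t_stat = gain / (stdev(diffs) / sqrt(len(diffs)))
    return gain, t_stat

# Hypothetical pre/post scores for four participants.
gain, t_stat = paired_gain([60, 55, 70, 65], [75, 70, 80, 72])
```

A positive mean gain with a large t statistic would be consistent with knowledge transfer having occurred.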
Data Element Information

(1) Data collection elements will focus on several types of formats; specifically:
- Demographic Questions will use tick boxes and fill-in-the-blank responses
- Technical Evaluation Questions will use a Likert scale and open-ended format
- Learning Evaluation Questions will use a Likert scale and open-ended format
- All questions will include a Decline to Answer option in order to meet research protocol requirements

(2) Terms of Reference
- Usability refers to the methods for evaluating ease-of-use during the design process. Usability is divided into five components (i.e., Learnability, Satisfaction, Ease of Use, Errors, and Memorability). Of the five categories, Errors and Memorability are not relevant to the MLS Evaluations. For the analysis process:
</gr-replace>

- Usability = how easy and pleasant the multiple learning formats are.
  o U_L Learnability: How easy it is for users to accomplish a basic task the first time they encounter the design.
  o U_S Satisfaction: How pleasant it is to use the designed multiple learning format capability.
  o U_E Ease of Use: How pleasant it is to use the features of the learning formats.
- Utility: whether or not the developed multichannel learning formats capability provides the features needed to support international training.
- Usefulness = Usability + Utility
- Learning = questions related to knowledge transfer based on the pre- and post-test as well as questions related to self-efficacy.
- Interest/Desirability = questions to determine if participants had an interest in the learning and the desire to pass the information on to their colleagues or other international students travelling to the United States.

(3) Category of Analysis
- The analysis of the two areas - Technical and Learning Effectiveness - will be based on the responses collected and stored from the questionnaires during the Concept Evaluation. The table below summarizes how the questions will be aligned to the particular areas and categories: Technical, Usability (U_L, U_S, U_E), Utility, Learning, and Interest. The list of representative questions is provided in Appendix D.

Questions
1. Which devices did you use for evaluating the learning formats? (Select all that apply)
2. Which learning formats did you evaluate? (Select all that apply)
3. I have control over the pace and sequencing of my learning process.
4. I may not have been able to learn the content if provided in only one format.
5. I feel confident that I have a good understanding of American Culture

based on the training.
6. I feel confident that I can help a colleague understand American Culture.
7. I am willing to share my experiences and lessons learned with those planning to train in the U.S.
8. When I had trouble understanding the material, I used another format to help clarify my understanding.
9. In addition to the testable content, I viewed the following content. (Select all that apply)
10. The goals of the training were clearly defined.
11. The User's Guide provided good information for downloading/accessing the learning formats.
12. Overall, I was satisfied with this course.
13. What did you like most about this learning experience?
14. What did you like least about this learning experience?
15. List five words that express your learning experience.

- Format-related Evaluations

(a) e-book Format
1. The e-book was easy to read.

2. The e-book format was easy to use.
3. The e-book was easy to understand.
4. The e-book hyperlinks allowed me to access additional learning material.
5. The e-book was well constructed.
6. The e-book is a valuable learning tool.

(b) Mobile App Format
1. The mobile app was easy to read.
2. The mobile app format was easy to use.
3. The mobile app content was easy to understand.
4. The mobile app hyperlinks allowed me to access additional learning material.
5. The mobile app was well constructed.
6. The mobile app is a valuable learning tool.

(c) Video Format
1. The video length was acceptable.
2. The video format was easy to use.
3. The video content was easy to understand.
4. The video was clearly presented.
5. A video is a valuable learning tool.

6. The video hyperlinks allowed me to access additional learning material.
7. It was easy to start/stop the video.

(d) E-Learning Format
1. Which browser did you use to evaluate the e-learning format?
2. The e-learning format was easy to use.
3. The e-learning content was easy to understand.
4. The e-learning content was clearly presented.
5. E-learning is a valuable learning tool.
6. The e-learning hyperlinks allowed me to access additional learning material.

13. Do any of the questions/items/data elements used in the research involve information that is private or sensitive? If yes, describe and assess the degree of potential risk or harm to the subjects, if disclosed.

The following demographic data elements will be asked of each participant and may be considered personal, private, or sensitive: 1) age group, 2) gender, and 3) profession. Participants are given the option to decline to answer each question. If the participant chooses to answer without declining, and if the data collected were disclosed, the risk to the individual would be minimal since the data are de-identified at this point of data collection.

14. What would be the impact to the research if private or sensitive information could not be collected?

Missing data may result in analysis that is more generalized and possibly less useful to the objectives of the study. The analysis methods may need to be modified to consider data clusters and trends rather than statistically significant tests.

15. Describe precautions that are being used to minimize risk to the subject and safeguard the data (e.g., limiting access, storage and destruction of data, password protection, network security, etc.)
As described in section 10, three types of risks have been identified: 1) pressure on volunteers to participate during the recruitment process before the evaluation, 2) concern by the volunteers during the evaluation over misunderstanding or incompletion, and 3) concern after the evaluation regarding the safeguarding and handling of the collected data.

Precautions to minimize the first risk are addressed in Sections 5 and 9, where the recruitment process is described. Safeguards include the use of an Ombudsman or third party should any fear of non-participation be a concern. Also, the Informed Consent states that the volunteer may discontinue his/her participation at any time without reprisal.

A mitigating factor for the second risk is that, because the individuals are recruited from a pool of volunteers with some English language proficiency, there should be little or no concern over understanding the presented questions and materials. Additionally, the questions presented in the Concept Evaluation will be assessed using the Flesch-Kincaid grade level test to ensure that they are readable at the eighth grade level or lower.

For the third risk, multiple safeguards in handling the collected data have been put in place. First, when the Extramural Investigator, Mr. Hodges, receives the volunteer email addresses from the recruiters or recruiter representatives, the information will be entered into a password-protected MLS Initial Registration Excel spreadsheet. This will be stored on a password-protected flash drive. The flash drive will be connected to the computer only when needed for MLS Project work and secured (and disconnected from the computer) when not in use. The information entered in the spreadsheet will be the email address, the assigned PIN, the PIN in words, and a hyperlink to the MLS Data Collection Portal. The password-protected drive will be locked in a file cabinet when not in use, located with Mr. Hodges, Lockheed-Martin International Training Team, at 129 Clarendon Place, Dover, England CT17 9QE. The file cabinet will only be accessible by Mr. Hodges.
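The Flesch-Kincaid readability screen mentioned above can be illustrated with a short sketch. The grade-level formula is standard; the syllable counter here is a naive vowel-group heuristic, not the exact tool the team will use.

```python
import re

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    # Naive syllable heuristic: count groups of vowels, minimum one per word.
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

# A question passes the readability screen if its grade level is 8 or lower.
grade = flesch_kincaid_grade("The e-book was easy to read.")
```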
After the Concept Evaluation has concluded, the spreadsheet containing the email addresses and PINs will be deleted. In addition, all emails will be deleted. The database on the server in Germany collecting and storing the data provided by the participants will only be accessible by the system administrator, who ensures the versioning, access controls, and Information Assurance protections are in place. The database backup will be encrypted. At the time of data collection, no Internet Protocol (IP) addresses will be collected, so there will be no linkage between the participant's PIN and the electronic device used by the participant. During the data collection period, only two Research Protocol representatives (Ms. Julie Howell and Dr. Andrea Loesch) and one Test and Evaluation Lead (Dr. Kristen Barrera) will have access to the MLS Data Collection Portal via a link requiring a password and username. Their view into the portal will only provide the status of MLS participants' progress (PIN, registered, finished pre-test, and finished evaluation). After the Concept Evaluation has completed, Mr. Jake Hodges will download the data directly from the server using a username and password assigned to him. At this time, the usernames and passwords of anyone having access will be turned off. All data files will then be deleted from the database on the server as well as from the database backup. A copy of the stored data (a Comma Separated Values (CSV) file) will be reviewed by the MLS Extramural Investigators to ensure that no individual identification information is present. When acceptable, it will be provided for analysis to the Testing & Evaluation Team. At the end of the Concept Evaluation, this de-identified raw data will be held by Dr. Kristen Barrera, the Testing and Evaluation Lead, at 711 HPW/RHAS, Warfighter Readiness Research Division, 2620 Q Street (Bldg 20852), Wright Patterson AFB, OH 45433-7955. Any requests for the data set will need to be made to Dr. Barrera.

16.
List all Appendices to this Protocol

A. CITI Certificates, CVs, and IIA
B. Invitation and Instructional Emails

Acronym Index

This acronym listing is based on all materials used throughout the MLS Project.

ADL - Advanced Distributed Learning
AFRL - U.S. Air Force Research Laboratory
AFSAT - U.S. Air Force Security Assistance Training Squadron
AO - Area of Operation
APAN - All Pacific Area Network
ARJOCS - Arab Jordanian Center for Studies
ASN RD&A - Assistant Secretary of the Navy (Research, Development & Acquisition)
ATRRS - Army Training Requirements and Resources System
BSR - Black Sea Region
C4ISR - Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance
CalTech - Centre for elearning Technology
CD - Counter Drugs
CDSA - Connecting Soldiers to Digital Applications
CDTS - Center for Defence Technologies Studies
CE - Concept Evaluation
CITI - Collaborative Institutional Training Initiative
CNE - Commander, Naval Forces Europe
COCOM - Combatant Commanders
CoI - Community of Interest
COTS - Commercial-off-the-shelf
CRA - Computing Research Association
CSV - Comma Separated Values
CTFP - Combating Terrorism Fellowship Program
CV - Curriculum Vitae
CWP - Coalition Warfare Program
DAUK - Defence Academy of the United Kingdom
DCTS - Defence Centre for Training Support
DD J7 - Deputy Director for Joint Staff
DISAM - Defense Institute of Security Assistance Management
DOD - Department of Defense
DoS - Department of State
DSAMS - Defense Security Assistance Management System
DSCA - Defense Security Cooperation Agency
E-book - Electronic book, an electronic publication in digital form
E-learning - Electronic learning, sometimes referred to as web-based learning
EOARD - European Office of Aerospace Research and Development
FAA - Foreign Assistance Act
FKIE - Fraunhofer Institute for Communications, Information Processing and Ergonomics
Flt Lt - Flight Lieutenant
GIRAF PM - Project Management of the GIRAF, an independent international company
GRENA - Georgian Research and Educational Networking Association
HPW/RHAS - U.S. Air Force Research Laboratory, Warfighter Readiness Research Division, Continuous Learning
HRPP - Human Research Protection Program

HTML - HyperText Markup Language
ICP - International Center for the Advancement of Research, Technology & Innovation
ICT - Information, Communication and Technology
IIA - Individual Investigator Agreement
IMET - International Military Education and Training
IMS - International Military Student
IMSO - International Military Student Officer
IMSPDB - International Military Student Pre-Departure Briefing
IO - Information Operations
IPO - International Program Office
IRB - Institutional Review Board
ISCP - International Security Cooperation Program
JCW - Joint and Coalition Warfighting
JKO - Joint Knowledge Online
JRMS - Jordanian Royal Medical Services
JSCET - Joint Security Cooperation Education and Training
LC - Learning Content Working Group
LET - Learning, Education and Training
LMCS - Learning Management Content System
LMITT - Lockheed Martin International Training Team
LMS - Learning Management System
MCSCG - Marine Corps Security Cooperation Group
ME-ADL - Multimodal Enabled Advanced Distributed Learning
MILDEP - Military Department
MLS - Multichannel Learning System
MoD - Ministry of Defence
MoLE - Mobile Learning Environment
MVLE - Multinational Virtual Learning Environment
NATO - North Atlantic Treaty Organization
NCC - Naval Component Commands
NETSAFA - Naval Education and Training Security Assistance Field Activity
ONR - Office of Naval Research
ONR RC - Office of Naval Research Reserve Component
ONRG - Office of Naval Research Global
OSD - Office of the Secretary of Defense
PA - Privacy Act
PAL - Personal Assistant for Learning
PDF - Portable Document Format
PfP - Partnership for Peace
PIN - Personal Identification Number
PLE - Personal Learning Environment
PME - Professional Military Education
POC - Points-of-Contact
PoC - Proof of Concept
QR - Quick Response
RC - Reserve Component
ROI - Return on Investment
RP - Research Protocol
RTO - Research and Technology Organization
S&T - Science & Technology

SA - Security Assistance
SAFTA - U.S. Army Security Assistance Training Field Activity
SAN - Security Assistance Network
SCETP - Security Cooperation Education and Training Program
SCO - Security Cooperation Offices
SCOLA - Sutton College of Learning Adults
SECDEF - Secretary of Defense
SECNAV - Secretary of the Navy
SEE - South Eastern Europe
SME - Subject Matter Experts
SNS - Social Networking Sites
SPSS - Statistical Package for the Social Sciences
SSC - Space & Naval Warfare Systems Center
SWF - Shockwave Flash (formatted video)
T&E - Testing and Evaluation
UDHAVI - Universal Database for Humanitarian Aid and Voluntary Institutions
URL - Uniform Resource Locator
US - United States
USCG/IA - U.S. Coast Guard International Affairs and Foreign Policy
USEUCOM - U.S. European Command
USG - United States Government
USMC - United States Marine Corps