
RESEARCH MEMORANDUM
Internal Distribution Only

THE ROLE OF USER SERVICE TESTS AND ARMY CONDUCTED SYSTEM TRAINING
IN ESTABLISHING TRAINING REQUIREMENTS FOR NEW WEAPONS SYSTEMS

U.S. Army Air Defense Human Research Unit
Fort Bliss, Texas

Under the Technical Supervision of
The George Washington University
HUMAN RESOURCES RESEARCH OFFICE
operating under contract with
THE DEPARTMENT OF THE ARMY

The U.S. Army Air Defense Human Research Unit is established under the command of the Commanding General, Continental Army Command. The Human Resources Research Office, The George Washington University, operating under contract with the Department of the Army, employs the Director of Research and other civilian staff members who are assigned to the Unit with the approval of Headquarters, Continental Army Command. The Human Resources Research Office provides the Unit with technical supervision in the planning and analysis of the research projects. Conclusions stated herein do not necessarily represent the official opinion or policy of Headquarters, Continental Army Command, or the Department of the Army.

RESEARCH MEMORANDUM
Internal Distribution Only

THE ROLE OF USER SERVICE TESTS AND ARMY CONDUCTED SYSTEM TRAINING
IN ESTABLISHING TRAINING REQUIREMENTS FOR NEW WEAPONS SYSTEMS

by
J. L. Williams, Jr., Milton A. Grodsky, and Harold C. Strasel

Approved:
Robert G. Smith, Jr.
Director of Research

U.S. Army Air Defense Human Research Unit
Fort Bliss, Texas

December 1959

Subtask UPSTREAM I

TABLE OF CONTENTS

INTRODUCTION
THE USER SERVICE TEST
    Observations of the NIKE HERCULES User Service Test
        Time Limits
        Hardware Emphasis
    Implications of User Service Test Observations
        System Modifications
        Evaluation of Training Requirements
ARMY CONDUCTED INSTRUCTOR TRAINING
    Observations of Instructor Training for the NIKE HERCULES System
        Assignment of Personnel to the User Service Test
        Course Evaluation by Student Questionnaire

INTRODUCTION

The problem with which HumRRO Task UPSTREAM is concerned is the development of a system, or set of procedures, to be used in anticipating the training requirements that will be imposed by new weapon systems. The rapid rate of developments in weapon technology and concurrent rapid obsolescence require speedy and efficient training of personnel if weapons system potentials are to be realized. With the weapons systems of today, training must begin before weapons system hardware comes off the production line. In the past, when weapons were less complex and time pressures were not as great as they are today, there was reasonable opportunity for trial and error in establishing early training programs. Also, just as systematic test procedures are used to evaluate new weapons system hardware, systematic procedures are required to evaluate the performance of personnel who are associated with the hardware.

The initial efforts of Task UPSTREAM have been directed toward the examination of the various stages through which a weapons system passes as it is being developed. The intent of these examinations has been to evaluate these stages as possible sources of information relevant to the establishment and evaluation of training requirements. The present study is concerned with two stages which occur late in the development cycle: the User Service Test and Army conducted instructor training.

The term "training requirements" will be used throughout this memorandum to mean a set of statements specifying the behaviors, and occasions for these behaviors, necessary for successful job performance.

THE USER SERVICE TEST

The User Service Test as defined in (2) is "a test of an item or system of materiel conducted under simulated or actual operational conditions to determine to what degree the item or system meets the stipulated MC's or the suitability of the item or system and its maintenance package for use by the Army." During this study the User Service Test of the NIKE HERCULES system as conducted by the U.S. Army Air Defense Board was observed. The reader is referred to references (3) and (4) for a detailed account of general Air Defense Board User Service Test activities.

OBSERVATIONS OF THE NIKE HERCULES USER SERVICE TEST

Observations of the NIKE HERCULES User Service Test as conducted by the U.S. Army Air Defense Board established two points. Although these points are based on the observations made of this one test, it is believed that they are generally applicable to the user service testing of other complex weapons systems.

Time Limits

As conducted today, the User Service Test occurs too late in the development cycle and too close to the operational date of the new equipment to provide a realistic basis for the development of training requirements information. Current and anticipated future efforts to compress the R&D cycle suggest that this situation will not change. Training of user personnel must begin before the User Service Test is completed.

The definition of the User Service Test, as noted above, implies that tested items may be found to be unsuitable for military use. Such a determination is unlikely, however, for the more complex and expensive items of equipment, which pose the greatest problem with respect to training requirements. Extensive coordination between the Army and the item developer will insure that the item at least approaches suitability for military use. For this reason, and for reasons of economy, it is most likely that the results of User Service Tests will provide a basis for system modifications to insure maximum suitability for military use. As some such changes will undoubtedly affect the system training requirements, the User Service Test may provide a basis for modifying elements of previously developed training requirements information.

Hardware Emphasis

Although there is a growing trend to consider the personnel who operate, maintain, and supervise a weapons system as part of that weapons system, this concept is given little explicit consideration in the User Service Test. In general, the operational or simulated operational conditions required of the User Service Test are met by taking the equipment into the field, away from a laboratory setting (where the engineering test is conducted). The quality of personnel associated with the equipment during the conduct of the test, however, is generally much higher than that to be expected under operational conditions. Maintenance technicians were assigned to the NIKE HERCULES test crew in numbers greater than those provided for in the tentative TOE, and some of these technicians, at least, were regarded by their peers and superiors as among the best in the Army.

IMPLICATIONS OF USER SERVICE TEST OBSERVATIONS

The User Service Test provides a vehicle for obtaining two kinds of information relative to the establishment and evaluation of training requirements that will be imposed by new weapons systems.

System Modifications

Prior to the beginning of the User Service Test, basic training requirements information will have been developed from schematics, mock-ups, and general design data. Once this training requirements information has been obtained, however, the weapons system being developed must be continually monitored for system changes that will affect training requirements. System modifications resulting from the User Service Test should be noted as they develop so that their effect on previously generated training requirements information may be determined. It is probable that this system modification information can be acquired through liaison with appropriate User Service Test agencies.

Evaluation of Training Requirements

Under current Army R&D procedures the User Service Test is the first occasion during the developmental phase of a new system where the system and its associated manpower can be tested together in an operational configuration. Although it represents the first opportunity for detailed evaluation of a man-weapons system, the User Service Test is frequently not completed until after the system under test has been placed in an operational status. The User Troop Test (defined in (2) as "A test conducted in the field for the purpose of evaluating operational organizational concepts, doctrine, techniques, procedures, or to gain further information on materiel") is unlikely to begin until some time after the system under test has been declared operational.

Although the User Service Test is not usually conducted by a TOE unit, as is more likely with the User Troop Test, trained personnel are required to perform all of the duties that TOE units may perform later. The fact that the execution of such duties is necessary for the conduct of the User Service Test provides the basis for establishing a test program, within the User Service Test, to evaluate the adequacy with which job duties are performed. In order for such an evaluation to be meaningful in terms of indicating how well training requirements have been developed, personnel whose training has been based solely on previously derived training requirements information must be assigned to participate in the test. In terms of aptitude and past experience these personnel should be representative of the troops who will operate, maintain, and command the weapon when it becomes operational.

If an explicit attempt is made to evaluate training requirements during the conduct of a User Service Test, changes will have to be made in test procedures. In general, the schema discussed below seems appropriate.

The plan of test for a new system should be written, as is done currently, by the agency that is responsible for conducting the test. The plan of test should categorize the specific tests into two major groupings.

(a) One grouping should consist of all tests designed to evaluate the expected operational capability of the equipment and which require system associated personnel to perform duties in much the same manner as they would be performed when the system becomes operational. Example: a test of the system's capability to destroy a target having specified characteristics under specified conditions.

(b) The second grouping should consist of all tests designed to evaluate peripheral characteristics of the equipment which require personnel action not expected to be a part of the job duties of system associated personnel when the system becomes operational. Example: a test designed to measure the gross physical characteristics of the system, i.e., size and weight.

The entire test should be conducted under the supervision of the agency charged with the User Service Test responsibility. For group (a) tests, all operational-type actions required should be conducted directly by personnel especially assigned for this purpose. The activities of User Service Test agency personnel should be limited to monitoring the conduct of test activities, interrupting where necessary for reasons of safety, and evaluating the performance of personnel conducting the test. For group (b) tests there is no requirement for the utilization of especially assigned personnel.

The evaluation of personnel performance to be carried out during the conduct of group (a) tests may take one of several forms.

(1) Personnel conducting the evaluation may merely note deficiencies in performance in terms of excessive time requirements and use of inappropriate procedures. Specific noted deficiencies may then be used in refining the training requirements for personnel associated with the system.

(2) To further refine the procedures discussed in (1) above, techniques may be developed for evaluating performance deficiencies in terms of their consequences. Possibilities in this area are suggested by Smith (5) and Office, Chief of Ordnance (1).

(3) If sufficient time is available, User Service Test agency personnel may become sufficiently familiar with the equipment to enable the development of the group (a) tests in proficiency test format. The development of proficiency tests by the Evaluation Division of the Deputy for Instruction, Air Defense School, may provide effective guidelines for such an endeavor. It is probable that proficiency tests, developed within a consequence analysis framework, would provide for the most precise evaluation of training requirements and would, in addition, provide detailed guidance for the refinement of training requirements.

To this point, all of the procedures discussed concerning the evaluation of training requirements have been directed toward determining the extent to which previously established training requirements are comprehensive with respect to required actual job performance. In addition to this aspect of the evaluation process, it is also necessary to determine if certain training requirements may be irrelevant to required job performance. This phase of evaluation can probably be accomplished by having test personnel (and possibly test supervisory personnel), as group (a) tests are completed, complete questionnaires evaluating the relevancy and importance of training requirement items with respect to required job performance.

It should be noted that these evaluation procedures, in addition to being sensitive to training requirements-job performance relationships, are also sensitive to the effectiveness with which training requirements have been implemented in the form of actual training. If job performance deficiencies are noted in areas covered by training requirements statements, then the deficiencies are presumably due to failure to cover these areas adequately in training.

ARMY CONDUCTED INSTRUCTOR TRAINING

Prior to the beginning of training of personnel who will form operational TOE units after training, service schools must train the instructors who will conduct the training. Training programs for instructors have been derived, at least in part, from courses conducted by the system contractor under the guidance of the contracting technical service. With the current emphasis on the development of task and skill analysis data concurrent with system development, it is likely that future instructor training programs will be based more and more on training requirements statements derived from task and skill analysis data. Key personnel, following their attendance at the contractor course, prepare the training programs for instructor training. These programs then become prototypes for courses to be given to personnel who will form operational TOE units. Graduates of instructor training courses, then, provide the first opportunity to determine if training for a new system meets the job demands imposed by the system.

OBSERVATIONS OF INSTRUCTOR TRAINING FOR THE NIKE HERCULES SYSTEM

Two UPSTREAM I staff members attended one of the first NIKE HERCULES IFC maintenance courses for instructors. Both of these staff members had acquired considerable experience in NIKE AJAX IFC maintenance work. Two approaches to the problem of utilizing instructor training courses as vehicles for validating training requirements information grew out of this activity.

Assignment of Personnel to the User Service Test

If the schedule for instructor training and the User Service Test schedule can be appropriately coordinated, graduates of the instructor training program could be assigned to participate in the User Service Test as discussed in a previous section of this report. In that the normal input to instructor training programs may differ in terms of aptitude and experience from personnel who will eventually be assigned to operational units, personnel expected to be assigned to participate in the User Service Test may have to be specially selected.

Course Evaluation by Student Questionnaire

Following the monitoring of the NIKE HERCULES course, questionnaires were developed and administered to a succeeding class. It was hypothesized that students who had field experience with a similar system (NIKE AJAX) before attending the NIKE HERCULES instructor course would be able to evaluate the course in terms of the preparation it afforded for operational maintenance duties. As the students without prior field experience would have no basis for judgement, it was assumed that a necessary (but not sufficient) test of the validity of field-experienced students' evaluations would require their evaluations to differ from those obtained from students lacking such previous field experience.

The primary intent of questionnaire administration was to determine if field-experienced students differed from students lacking such experience in their evaluation of the instructor course. The finding of differences would suggest the utility of expending further effort in developing techniques utilizing student evaluations as a basis for evaluating prototype courses for new equipment.

The questionnaires described below were administered to 28 students. Of these 28 students, 18 had no previous maintenance experience while 10 had previous experience ranging from 1 month to 3 years in duration. Questionnaire A was administered to each student at the close of each instructional period.

QUESTIONNAIRE A

1. In your judgement, how much of the time devoted to instruction on this subject was concerned with things that you - as a maintenance man - could actually do (troubleshoot, adjust, operate, etc.) with the NIKE HERCULES in the field? (answer in hours and/or minutes)

2. In your judgement, how much of the time devoted to instruction on this subject SHOULD HAVE BEEN concerned with things that you - as a maintenance man - could actually do (troubleshoot, adjust, operate, etc.) with the NIKE HERCULES in the field? (answer in hours and/or minutes)

Questionnaire B was administered to each student at the conclusion of the ten-week course.

QUESTIONNAIRE B

1. In the NIKE HERCULES Instructor Training Course:

   (a) What per cent of the classroom instruction time was concerned with things that you could actually do (troubleshoot, adjust, operate, etc.)?
       In computer instruction? _____ %
       In acquisition instruction? _____ %
       In tracking radars instruction? _____ %

   (b) What per cent of the practical instruction time was concerned with things that you could actually do (troubleshoot, adjust, operate, etc.)?
       In computer instruction? _____ %
       In acquisition instruction? _____ %
       In tracking radars instruction? _____ %

2. If you were going to construct a NIKE HERCULES course, how many classroom instruction periods would you devote to:
   (a) Computer? _____ periods. (Now approximately 100.)
   (b) Acquisition? _____ periods. (Now approximately 56.)
   (c) Track radars? _____ periods. (Now approximately 72.)

3. How many practical instruction periods would you devote to:
   (a) Computer? _____ periods. (Now approximately 54.)
   (b) Acquisition? _____ periods. (Now approximately 18.)
   (c) Track radars? _____ periods. (Now approximately 14.)

Data from Questionnaire A are presented in Table 1.

Table 1
Data from Questionnaire A
(Group means converted to percentages and shown by weeks)

            Question 1              Question 2
Week     Ex.        Inex.        Ex.        Inex.
  1      89.7       95.5         89.7       102.6
  2      87.9       95.0         89.2       105.9
  3      86.3       91.8         96.9       116.9
  4      91.9       92.7         99.6       101.0
  5      91.8       92.1         95.4       105.2
  6      85.3       91.2         96.5       100.8
  7      94.3       98.9         98.1       107.2
  8      87.1       94.1         89.6       104.8
  9      87.3       93.8         88.4       100.0
 10      87.4       94.2         88.0        98.5

It is clear from Table 1 (Question 1) that the inexperienced group of students consistently estimated a greater proportion of instructional time per topic to be relevant to actual maintenance behavior than did the experienced group. It is also clear that the differences between group estimates are not large, ranging from 0.3% of the instructional time in the fifth week to 7% in the eighth week. With respect to Question 2, Table 1 shows that the inexperienced students were equally consistent in making larger estimates than the experienced group with respect to the amount of time per instructional topic that should be devoted to actual maintenance behavior. Differences between group estimates are somewhat larger than those for Question 1, ranging from 1.4% of the instructional time in the fourth week to 20% in the third week.
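Because these comparisons reduce to simple differences between the two groups' weekly means, they can be recomputed directly from Table 1. The minimal Python sketch below does so; the data are merely transcribed from the table, and all variable names are illustrative rather than taken from the study.

    # Group means from Table 1 (percentages), transcribed for illustration:
    # week -> (Question 1 experienced, Question 1 inexperienced,
    #          Question 2 experienced, Question 2 inexperienced)
    table1 = {
        1:  (89.7, 95.5, 89.7, 102.6),
        2:  (87.9, 95.0, 89.2, 105.9),
        3:  (86.3, 91.8, 96.9, 116.9),
        4:  (91.9, 92.7, 99.6, 101.0),
        5:  (91.8, 92.1, 95.4, 105.2),
        6:  (85.3, 91.2, 96.5, 100.8),
        7:  (94.3, 98.9, 98.1, 107.2),
        8:  (87.1, 94.1, 89.6, 104.8),
        9:  (87.3, 93.8, 88.4, 100.0),
        10: (87.4, 94.2, 88.0, 98.5),
    }

    for label, ex_col, inex_col in (("Question 1", 0, 1), ("Question 2", 2, 3)):
        # Difference (inexperienced minus experienced) for each week of the course.
        diffs = {week: round(row[inex_col] - row[ex_col], 1)
                 for week, row in table1.items()}
        print(label, diffs)

The same pattern applies to Table 2 if the per cent and period estimates are substituted for the weekly means.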

Table 2
Data from Questionnaire B
(Group means)

Question                         Experienced Group    Inexperienced Group
1a (computer-classroom)               68%                   73%
1a (acquisition-classroom)            70%                   76%
1a (tracking-classroom)               71%                   79%
1b (computer-practical)               90%                   (illegible)
1b (acquisition-practical)            89%                   90%
1b (tracking-practical)               (illegible)           86%
2a (computer-classroom)               92 hrs.               111 hrs.
2b (acquisition-classroom)            52 hrs.               66 hrs.
2c (tracking-classroom)               74 hrs.               81 hrs.
3a (computer-practical)               54 hrs.               77 hrs.
3b (acquisition-practical)            24 hrs.               31 hrs.
3c (tracking-practical)               32 hrs.               36 hrs.

Data from Questionnaire B, shown in Table 2, support in general the findings from Questionnaire A. At the end of the course, the inexperienced students estimated a greater proportion of classroom instructional time per topic (computer, acquisition radar, tracking radars) to be relevant to maintenance behavior than did the experienced group (Question 1a). This trend does not hold, however, for practical instructional time, as shown in the responses to Question 1b. With respect to estimating the amount of time per instructional topic that should be devoted to actual maintenance behavior, the inexperienced group again made consistently larger time estimates than the experienced group for the three major instructional topics in both classroom and practical settings.

These findings are interpreted to mean that, within the sample studied, students with previous field maintenance experience have a basis for evaluating a new course that is different from that used by students having no such experience. Although these data cannot be supported statistically,¹ it is believed that they are sufficiently suggestive to warrant further exploration in this area if circumstances permit. In general, however, the less derived approaches to the problem of evaluating training requirements information, involving personnel testing within the context of the User Service Test, should be preferred.

¹ A limited number of subjects prevented the use of a design not requiring repeated observations of the same individuals. Use of the sign test, otherwise appropriate here, was negated by the correlation between observations introduced by use of the same subjects in each administration of the questionnaires.

REFERENCES

1. Office, Chief of Ordnance. A New Method for Testing Reliability Applied to the AAFCS M38 (Skysweeper). OCO Project No. TRI-1020, Aberdeen Proving Ground, Maryland, 1958.

2. Research and Development of Materiel (AR 705-5). Headquarters, Department of the Army, Washington, September 1958.

3. U.S. Army Air Defense Board: Mission, Organization and Function. U.S. Army Air Defense Board, Fort Bliss, Texas, no date.

4. U.S. Army Air Defense Board: Test Officers' Manual; Instructions for Test Officers and Plans of Tests, Sections I and II. U.S. Army Air Defense Board, Fort Bliss, Texas, October 1957.

5. Smith, R. G., Jr. Scales and Standards for Military Training Research. Research Memorandum, U.S. Army Air Defense Human Research Unit, Fort Bliss, Texas, 1959.