Frequently Asked Questions
2012 Workplace and Gender Relations Survey of Active Duty Members
Defense Manpower Data Center (DMDC)


The Defense Manpower Data Center (DMDC) Human Resources Strategic Assessment Program (HRSAP) has been conducting surveys of gender issues for the active duty military since 1988. HRSAP uses state-of-the-art statistical techniques to draw conclusions from random, representative samples of the active duty population. To construct estimates for the 2012 Workplace and Gender Relations Survey of Active Duty Members (2012 WGRA), DMDC used complex sampling and weighting procedures to ensure that estimates generalize accurately to the full active duty population. This approach, though widely accepted as the standard method for constructing generalizable estimates, is often misunderstood. The following addresses some common questions about our methodology in general and the 2012 WGRA in particular.

1. What was the population of interest for the 2012 Workplace and Gender Relations Survey of Active Duty Members (WGRA)?

The population of interest for the 2012 WGRA consisted of Army, Navy, Marine Corps, and Air Force members (excluding National Guard and Reserve members) who had at least six months of service at the time the questionnaire was first fielded and who were below flag rank. Fielding of the survey began September 17, 2012, and ended on November 9, 2012. Completed surveys were received from approximately 23,000 eligible respondents, and these responses were projected up to the full eligible active duty population of 1.35 million.

2. What was the survey question used to measure Unwanted Sexual Contact?

Below is the measure of unwanted sexual contact used in the 2006, 2010, and 2012 Workplace and Gender Relations Surveys of Active Duty Members (WGRA). Respondents were asked to indicate Yes or No to the following question:

In the past 12 months, have you experienced any of the following intentional sexual contacts that were against your will or occurred when you did not or could not consent, where someone...

- Sexually touched you (e.g., intentional touching of genitalia, breasts, or buttocks) or made you sexually touch them?
- Attempted to make you have sexual intercourse, but was not successful?
- Made you have sexual intercourse?
- Attempted to make you perform or receive oral sex, anal sex, or penetration by a finger or object, but was not successful?
- Made you perform or receive oral sex, anal sex, or penetration by a finger or object?

3. The term "Unwanted Sexual Contact" (USC) does not exactly match the categories of crime in the Uniform Code of Military Justice (UCMJ). Why is this? Is USC different from sexual assault?

The measure of USC used by the 2012 WGRA is behaviorally based. That is, the measure asks about specific behaviors the respondent experienced and does not assume the respondent has intimate knowledge of the UCMJ or of the UCMJ definition of sexual assault. The estimates created for the USC rate reflect the percentage of active duty members who experienced behaviors prohibited by the UCMJ.

The term "unwanted sexual contact" and its definition were created in collaboration with DoD legal counsel and experts in the field to help respondents relate their experience(s) to the types of sexual assault behaviors addressed by military law and the DoD Sexual Assault Prevention and Response program. The vast majority of respondents would not know the difference between the UCMJ designations of "sexual assault," "aggravated sexual contact," and "forcible sodomy" described in Articles 120 and 125, UCMJ. The term "unwanted sexual contact" was therefore created so that respondents could read the definition provided and readily understand the kinds of behavior covered by the survey. Three broad categories of unwanted sexual contact result: penetration of any orifice, attempted penetration, and unwanted sexual touching (without penetration). While these behaviors are analogous to UCMJ offenses, they are not meant to be exact matches. Many respondents cannot and do not consider the complex legal elements of a crime while being victimized by an offender, so forcing a respondent to categorize precisely which offense they experienced would not be productive. The terms, questions, and definitions of USC have been kept consistent across all WGRA surveys since 2006 to provide DoD with reliable data points over time.

4. The 2012 WGRA uses sampling and weighting. Why are these methods used and what do they do?

Simply stated, sampling and weighting allow data based on a sample to be accurately generalized to the total population. In the case of the 2012 WGRA, this allows DMDC to generalize to the full population of active duty military members who meet the criteria listed above. This methodology, covered in more detail in Q5 and Q6, meets the industry standards used by government statistical agencies including the Census Bureau, the Bureau of Labor Statistics, the National Agricultural Statistics Service, the National Center for Health Statistics, and the National Center for Education Statistics. In addition, private research firms including RAND, Westat, and RTI use this methodology, as do well-known polling firms such as Gallup, Pew, and Roper.
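To make the mechanics concrete, the following is a minimal sketch of how a design weight turns a sample into a population estimate. All strata, counts, and rates are invented for illustration (loosely echoing the 23,000 respondents and 1.35 million population quoted above); this is not DMDC's production code.

```python
# Minimal sketch of stratified design weights and a weighted estimate.
# All numbers are hypothetical, not actual 2012 WGRA values.

population = {"men": 1_150_000, "women": 200_000}   # known stratum sizes N_h
respondents = {"men": 11_000, "women": 12_000}      # completed surveys n_h

# Design weight: each respondent in stratum h stands in for N_h / n_h members.
weight = {h: population[h] / respondents[h] for h in population}

# Hypothetical stratum-level rates on a yes/no survey item.
rate = {"men": 0.01, "women": 0.06}

# Weighted estimate: each stratum contributes in proportion to its known
# population size, regardless of how many surveys it returned.
weighted_yes = sum(weight[h] * respondents[h] * rate[h] for h in population)
print(f"Estimated rate: {weighted_yes / sum(population.values()):.2%}")  # 1.74%
```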

5. Why don't the responses you received match the composition of the military population as a whole? For example, 51% of your respondents were women. How can you say your estimates represent the total military population when women make up only 15% of the active duty force? Aren't the data skewed?

The composition of the respondent sample (i.e., the surveys we receive back) is not always supposed to match the composition of the total population. This is intentional and is the only scientific way to generalize up to the full population. When conducting a large-scale survey, response rates vary across different groups of the population. These groups can also differ on core questions of interest to the Department of Defense, which can bias the data if not appropriately weighted. For example, if only a small percentage of responses to the 2012 WGRA came from junior enlisted members, we would not get a good picture of that group's experiences. To adjust for this potential bias, DMDC starts by oversampling known small reporting groups (e.g., female officers) and groups known to have low response rates. To construct accurate estimates weighted to the full population of military members, DMDC ensures during the sample design stage that enough respondents will be received within every subgroup of interest to make statistically accurate estimates. Many of these groups are underrepresented in the military population. This is the case with women: in 2012, women made up only 15% of the active duty population, so DMDC sampled more women to gather adequate numbers in the sample. It is scientifically sound, and quite intentional, that proportionally more women than men received invitations to take the survey.

In general, this technique has a proven record of providing accurate estimates for total populations. Most recently, national election polls used responses from small samples of individuals, typically around 2,000 or fewer, to accurately estimate the views of the U.S. voting population as a whole. A quick reference is the National Council on Public Polls' evaluations of the 2010 and 2012 elections, available on its website. In contrast, DMDC collected approximately 23,000 survey responses to estimate to the eligible active duty population of 1.35 million.
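The simulation below illustrates why a respondent pool that is roughly half women can still produce unbiased estimates for a population that is 15% women. It is a toy sketch: the population sizes and "true" rates are invented, and the point is only that the unweighted mean is biased toward the oversampled group while the weighted mean is not.

```python
import random

random.seed(1)

# Hypothetical force: 15% women, with invented true rates on a yes/no item.
N_WOMEN, N_MEN = 15_000, 85_000
women = [random.random() < 0.05 for _ in range(N_WOMEN)]
men = [random.random() < 0.01 for _ in range(N_MEN)]

# Oversample women: equal numbers from each group, so ~50% of returned
# surveys come from women even though women are 15% of the population.
sample_w = random.sample(women, 2_000)
sample_m = random.sample(men, 2_000)

# The raw respondent mean is biased toward the oversampled group...
unweighted = (sum(sample_w) + sum(sample_m)) / 4_000

# ...but weighting each respondent by N_h / n_h recovers the population rate.
weighted = (sum(sample_w) * N_WOMEN / 2_000
            + sum(sample_m) * N_MEN / 2_000) / (N_WOMEN + N_MEN)

true = (sum(women) + sum(men)) / (N_WOMEN + N_MEN)
print(f"true={true:.4f}  unweighted={unweighted:.4f}  weighted={weighted:.4f}")
```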

6. Are these estimates valid with only a 24% response rate?

Response rates to the 2012 WGRA are consistent with response rate levels and trends for both previous WGRA surveys and other active duty surveys conducted by DMDC (see Q8). Experts in the field have found that surveys with similar or lower response rates are able to produce reliable estimates.[1] While non-response bias due to low response rates is always a concern, DMDC knows, from administrative records, the characteristics of both survey respondents and survey non-respondents, and uses this information to make statistical adjustments that compensate for survey non-response. This important advantage, one that other survey organizations rarely have, improves the quality of estimates from DMDC surveys. DMDC uses accurate administrative records (e.g., demographic data) for the active duty population both at the sample design stage and during the statistical weighting process, to account for survey non-response and to post-stratify to known key variables or characteristics. Prior DMDC surveys provide empirical results showing how response rates vary by many characteristics (e.g., pay grade and Service). DMDC uses this information to estimate the optimum sample sizes needed to obtain sufficient numbers of respondents within key reporting groups (e.g., Army, female). After the survey is complete, DMDC makes statistical weighting adjustments, illustrated in the sketch below, so that each subgroup (e.g., Army, E1-E3, female, African American, deployed in the last 12 months) contributes to the survey estimates in proportion to the known size of the subgroup.

[1] For example, Robert Groves, the former Director of the Census Bureau, stated that "despite low response rates, probability sampling retains the value of unbiased sampling procedures from well-defined sampling frames." Groves, R. M. (2006). "Nonresponse Rates and Nonresponse Bias in Household Surveys." Public Opinion Quarterly, 70(5), pp. 646-675. http://poq.oxfordjournals.org/content/70/5/646.short
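The sketch below shows one common form of such an adjustment, a weighting-class non-response adjustment. The classes, design weights, and counts are invented for illustration; DMDC's actual procedures are more elaborate.

```python
# Sketch of a weighting-class non-response adjustment. Because administrative
# records describe everyone sampled, the weight carried by non-respondents can
# be shifted to respondents in the same class. All numbers are hypothetical.

# Weighting class -> (design weight, number sampled, number responding).
classes = {
    ("Army", "E1-E3", "female"): (12.0, 8_000, 1_200),  # low-response group
    ("Army", "O4-O6", "male"):   (20.0, 3_000, 1_500),
}

adjusted = {}
for cls, (design_w, sampled, responded) in classes.items():
    # With equal design weights inside a class, the adjustment factor is
    # simply (number sampled) / (number responding).
    adjusted[cls] = design_w * sampled / responded

for cls, w in adjusted.items():
    print(cls, round(w, 1))  # e.g., the low-response class goes 12.0 -> 80.0
```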

7. Is 24% a common response rate for other military or civilian surveys?

Response rates of less than 30% are not uncommon for surveys that use similar sampling and weighting procedures. Many civilian surveys do not have comparable knowledge about the composition of the total population and therefore cannot generalize results to the full population via sampling and weighting alone; such surveys often require much higher response rates to construct accurate estimates. For this reason, it is difficult to compare civilian survey response rates to DMDC survey response rates. However, many of the large-scale surveys conducted by DoD or civilian survey agencies rely on sampling and weighting procedures similar to DMDC's to obtain accurate and generalizable findings with response rates lower than 30% (see Q8). Of note, DMDC has a further advantage over these surveys: it maintains administrative record data (e.g., demographic data) on the full population. These rich data, rarely available to survey organizations, are used to reduce bias in the weighted estimates and to increase their precision and accuracy.

8. Can you give some examples of other studies with similar response rates that were used by DoD to understand military populations and inform policy?

The 2011 Health Related Behaviors Survey, conducted by ICF International on behalf of the TRICARE Management Activity, had a 22% response rate and was weighted up to the full active duty military population. That 22% represented approximately 34,000 respondents from a sample of about 154,000 active duty military members. In 2010, Gallup conducted a survey for the Air Force on sexual assault within the Service; Gallup weighted the results to generalize to the full population of Air Force members based on about 19,000 respondents, a 19% response rate. Finally, in 2011, the U.S. Department of Defense Comprehensive Review Working Group, with the assistance of Westat and DMDC, conducted a large-scale survey to measure the impact of repealing the "Don't Ask, Don't Tell" (DADT) policy. The DADT survey, which was used to inform DoD policy, was sent to 400,000 active duty and Reserve members; it had a 28% response rate and was generalized up to the full population of military members, both active duty and Reserve. The methodology for this survey, which used the DMDC sampling design, won the 2011 Policy Impact Award from the American Association for Public Opinion Research (AAPOR), which "recognizes outstanding research that has had a clear impact on improving policy decisions, practice, or discourse, either in the public or private sectors."

9. What about surveys that study the total U.S. population? How do they compare?

In addition to the previously mentioned election polls (see Q5), surveys of sensitive topics and rare events rely on similar methodology and response rates to project estimates to the total U.S. adult population. For example, the 2010 National Intimate Partner and Sexual Violence Survey, conducted by the Centers for Disease Control and Prevention, calculated population estimates on a variety of sensitive measures based on about 18,000 interviews, reflecting a weighted response rate of between 28% and 34%.

10. How much confidence can we have in the estimates when they have fluctuated between 2006, 2010, and 2012?

While Unwanted Sexual Contact (USC) rates for active duty women declined in 2010 and then increased in 2012, there were no statistically significant changes among active duty men or Reservists. In addition, core measurements of sexual harassment (and all of the items that make up the sexual harassment measure) did not show this type of increase between 2010 and 2012. If a methodological issue with the survey were artificially inflating estimates, we would expect to find inflation across the board. Additionally, members' perception of sexual assault in the military is worse now than in the previous four years: in 2012, 41% of active duty women indicated sexual assault in the military was a greater problem now than in previous years, 9 percentage points higher than in 2010.
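For readers curious what a "statistically significant change" means operationally, the following is a textbook comparison of two survey estimates. The rates and standard errors are invented, and DMDC's actual tests account for the complex sample design; this sketch shows only the basic idea.

```python
from math import sqrt

# Hypothetical estimates of the same rate from two survey years.
p_2010, se_2010 = 0.040, 0.005   # estimate and standard error, year 1
p_2012, se_2012 = 0.060, 0.006   # estimate and standard error, year 2

# Two-proportion comparison: difference divided by its standard error.
z = (p_2012 - p_2010) / sqrt(se_2010**2 + se_2012**2)
print(f"z = {z:.2f}")  # here z ~ 2.56; |z| > 1.96 -> significant at 5% level
```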

11. Can you infer trends with only two or three data points?

As we continue to survey this population, we will gain a better understanding of the trends within it and of what drives these fluctuations. The estimates themselves, however, and the calculations of significant differences across years, are valid. Again, it is important to note that the fluctuations between 2010 and 2012 did not appear across all measures related to sexual assault and sexual harassment.

12. Some of the estimates provided in the report show NR, or "Not Reportable." What does this mean?

Estimates are marked "Not Reportable" when they do not meet the criteria for statistically valid reporting. This can happen for a number of reasons, including high variability or too few respondents. This screening ensures that the estimates we provide in our analyses and reports are accurate within the margin of error.
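A sketch of what such a screening rule might look like follows. The thresholds (a minimum cell size and a maximum relative standard error) are common in survey practice, but the specific values here are illustrative assumptions, not DMDC's published criteria.

```python
def reportable(n_respondents: int, estimate: float, std_error: float,
               min_n: int = 30, max_rse: float = 0.30) -> bool:
    """Return False ("NR") when an estimate fails basic reporting criteria.

    Thresholds are illustrative: suppress a cell with too few respondents,
    or one whose relative standard error indicates high variability.
    """
    if n_respondents < min_n:
        return False
    if estimate > 0 and std_error / estimate > max_rse:
        return False
    return True

print(reportable(12, 0.05, 0.010))   # False: too few respondents -> NR
print(reportable(200, 0.02, 0.009))  # False: RSE = 45% -> NR
print(reportable(200, 0.05, 0.010))  # True: reportable
```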