Index

A

The Abilities of Man, 17

Administrators and administration,

see Test administration

Age Discrimination Act, 16

Air Force, 47, 61, 64, 78, 90–91, 92, 94–95

Black Americans in, 40

classification, personnel, 22–23, 187–188

occupations studied, 64

American Psychological Association, 19, 29

Analysis of variance, 122, 123, 124–126, 135

Apprenticeships, 16

Armed Forces Qualification Test (AFQT), 4, 47, 48, 50, 52–53, 54, 55, 62, 208

cost-performance assessment, 186, 188, 189, 198–199, 201–202

criterion-related validity, 160–164, 175–180

enlistment standards, 52–54

fairness, 12, 179

Armed Services Vocational Aptitude Battery (ASVAB), 3–4, 5, 11–13, 31, 49–52, 56, 57, 58, 60, 68, 73, 91, 93, 142

classical test theory, 58

classification, general, 51–52, 148, 169–170

composites, 51–52

cost-performance assessment, 184, 186, 196, 198, 201, 207

criterion-related validity, 148–149, 156–158, 162–163, 165–170, 172–174, 179, 183, 205

fairness, 44, 50, 52, 172–174, 179

high schools, use by, 34, 49

job performance, 12–13, 61–62, 104, 184, 186, 196, 198, 201, 205, 208

minorities, 44, 50, 52, 172–174

norms, 109–110

reliability, 121

subtests, 50

Army, 61, 67, 84, 95, 155, 157, 166, 188, 202

Black Americans in, 39–40

historical perspectives, 19–22, 39

occupations studied, 63

recruitment strategies, 37

task selection, 69–71

Army Alpha, 19–20

Army Beta, 20

Army Air Force Classification Program, 22–23

Army College Fund Program, 37

Army General Classification Test, 46, 47

Army Vocational Interest Career Examination, 158

Assessment of Background and Life Experiences, 158

Aviators, 22–23, 45

B

Benchmark measures, 60–61, 126, 147

Black Americans, 12, 173–174, 175–181

history of participation in Services, 39–43

C

Civil Rights Act, 171–172

Classical test theory, 10–11, 17–18

ASVAB, 58

history, 17–18, 19

JPM, 58

reliability, 117–118, 121–122

see also Generalizability theory

Classification and Assignment within Pride (CLASP), 188

Classification, personnel, 1, 3

Air Force, 22–23, 187–188

Army, 19–22, 39

ASVAB, 51–52, 148, 169–170

aviators, 22–23

computer-aided, 35, 38, 187

historical perspectives, 19, 22–23, 29, 46

Marine Corps, 188

minorities, 41–44

Navy, 188

recruits, 35

skill level, 41–43, 45

validity, 11–12, 141, 152

Cognitive assessment, 68, 170

history, 18, 19–22

paper-and-pencil tests, 61, 67, 121, 139–140, 142, 150–151, 153, 155–156, 164–166, 175, 196

see also Armed Forces Qualification Test; Armed Services Vocational Aptitude Battery; Job knowledge tests

Cognitive task analysis, 84, 85–88

College Board, 20–21, 126

Combat situations, 23, 69–70, 90, 98, 139, 197

minorities, 43–44

women, 37

Communication processes, 28

Competency testing, 4, 187, 188, 192–200

cutoff scores, 44, 46–48, 57, 99, 184, 189–190, 196, 198, 200, 204–205

minorities, 44

scales, 188–192, 203, 210

see also Mastery testing

Comprehensive Occupational Data Analysis Program (CODAP), 78, 90–91

Computers and computer science

expert systems, 28, 86

JPM data base, 102

personnel classification programs, 35, 38, 187

sampling, 134, 138

simulations, 67, 134, 140

task inventories, 58, 78, 90

testing assisted by, 140, 158, 187

Congress of the U.S.

AFQT, 53, 160, 163

demography and policy, 36, 37

oversight role, 33, 34, 54–56, 160, 163

Conscription, military, 3, 47, 48

Construct validity, 59–60, 73, 74–75, 143, 147, 153, 155

Content validity, 75–76, 90–91, 94, 96, 128–140, 147, 164

definition, 5–6, 128

work samples, 59–60

see also Tasks and task analysis

Cost factors, 6–7, 31, 60–61, 62, 208

AFQT, 186, 188, 189, 198–199, 201–202

ASVAB, 184, 186, 196, 198, 201, 207

cost-performance models, 12–13, 46, 55, 56, 60, 91, 184–185, 188–189, 193–203, 209–210

hands-on testing, 6, 208, 209

JPM Project, 71–72, 91, 184–185, 209

Rand model, 194, 196–200, 201–203

recruitment and retention, 37, 54, 184, 193–194, 197–199

sampling, 105, 106, 128, 132, 138, 139

social, 29–30

task analysis, 86, 91, 120, 128, 132, 138, 139

training failures, 141

utility difference approach, 194–196, 199–200, 201–204

wages and salaries, 194–195

Criterion issues, 11–12, 26, 28–30, 68, 73, 78, 91, 102

fairness, 29, 171–183

historical perspectives, 22–28, 142–143, 189

job knowledge tests, 150–152, 153, 155–156, 164–166, 168–169, 175

job performance, general, 19, 26–29, 30, 142–145, 152–153, 169–171, 172–173, 177, 183

minorities, 172–178

reliability, 158–159

sampling, 159–160

scores and scoring, general, 144

selection of personnel, general, 29–30, 141, 152, 159

social factors, general, 29

training outcomes, 141–143, 155–156, 170

work samples, 59–60

Criterion-related validity, 26, 59–60, 102, 141–183, 191, 196–197, 204, 205

AFQT, 160–164, 175–180

ASVAB, 148–149, 156–158, 162–163, 165–170, 172–174, 179, 183, 205

graphs and tables illustrating, 145–146

hands-on testing, 147–149, 153–155, 160–163, 168–172, 179

JPM project, 142–143, 147–183

predictive validity and, general, 26, 59–60, 102, 144–147, 149, 152, 156–171, 172, 174–178, 196–197, 204, 205

Cutoff scores, 44, 46–48, 57, 99, 184, 189–190, 196, 198, 200, 204–205

D

Data bases, JPM, 102

Delphi technique, 71, 192

Demography, 32, 36–38

historical perspectives, 15–16, 18–19, 32, 36, 39–44

recruits, 36–38, 198, 210

sampling, 106, 108

see also Gender factors;

Minorities

Department of Defense, 2, 4, 13, 32–33, 37–38, 207, 209

Congressional oversight role, general, 33, 34, 54–56

minorities, 39–43

standards, recruitment, 46–49, 58

see also Joint-Service Job Performance Measurement/Enlistment Standards Project;

specific Services

Department of Labor, 16, 50

Deskilling, 45

Differential validity/prediction, 169–171, 177, 201

Draft,

see Conscription, military

E

Econometrics, 185

Rand model, 194, 196–200, 201–203

utility difference approach, 194–196, 199–200, 201–204

Economic factors, 32, 198

military budgets, 33, 34

minority participation in military, 38, 43

recruitment incentives, 37

see also Cost factors

Educational testing, general, 15, 18, 22, 34, 49

Elementary and secondary education, 15, 18

ASVAB, use by high schools, 34, 49

Enlisted Personnel Assignment System (EPAS), 188, 202

Enlistment standards

cutoff scores, 44, 46–48, 99, 184, 189–190, 196, 198, 200, 204–205

enlistment, 12–13, 32, 44, 46–49, 52–55, 58, 61–62, 104, 184–188, 197–198, 208–209

jobs, general, 109–110, 128–140, 183, 184–206, 208

jobs, minimum, 35, 44, 49, 53–55, 57, 61–62, 77–78, 104, 187, 188, 192–200

jobs, minimum entry-level, 190, 192, 197–198, 201, 204–206, 210

jobs, multiple, 188, 200–205, 209

predictive validity, 108–109

Entry-level jobs, 5, 33–36, 57, 207, 208

minimum standards, 190, 192, 197–198, 201, 204–206, 210

Equal opportunity,

see Fairness analysis

Error of measurement, 3, 26–27

generalizability theory, 11, 122–127

item heterogeneity, 119

multiple sources, 121–127

purposive sampling, 132–133, 136

random vs. systematic errors, 123

range restriction, 160

rater error, 10–11, 113–115, 118, 126

sampling, 105, 106, 123, 132–133, 136

scoring, task performance, 98–99

standard deviation, 117–118, 135, 136, 175–176, 181–182, 195, 204

test administration factors, 111–114, 208

type II errors, 176–177

Experience, on job, 163–164, 196–200, 201, 202

Expert systems, 28, 86

F

Factor structure, 21

Fairness analysis, 12, 29

ASVAB, 44, 50, 52, 172–174, 179

criterion values, 171–183

job performance, general, 12, 172–173

predictive validity, 172, 174–178, 182

recruiting, 38–44

Fairness in Employment Testing, 179

Fidelity, 139–140

see also Simulations;

Surrogates

G

Gender factors, 47, 172–173, 175, 181–183

computer-aided classification, 35

see also Women

General Aptitude Test Battery (GATB), 16

Generalizability theory, 10–11, 122–127, 151, 204–205

Grade-point average, 57

Group mean, 17–18

Group testing, 18, 19–20

H

Halo effect, 26, 27

Hands-on testing, 7, 11–12, 30, 60, 65–67, 68–69, 75, 101, 103, 208, 209

as benchmark, 60–61, 126, 147

content representativeness, 129–138

cost factors, 6, 208, 209

fidelity, 139–140

reliability, 11, 119–120

scoring, 98–100

standardization, 110–115, 139–140

validity, 147–149, 153–155, 160–163, 168–172, 179

walk-through performance tests, 66–67, 94–95, 140, 152–153

Hispanics, 43

Historical perspectives, 1–2, 15–30, 78

armed services organizational structure, 32–33

Black Americans participation in Services, 39–43

classical test theory, 17–18, 19

classification, personnel, 19, 22–23, 29, 46

criterion problem, 22–28, 142–143, 189

fairness analysis, 12

intelligence testing, 18, 19–22

JPM project, 3–5, 8, 56–72

military demographics, 15–16, 18–19, 32, 36, 39–44

multiple-choice tests, 18, 19, 20, 21

psychometrics, 17–18

recruitment, general, 46–49, 53, 54, 55, 186

selection of personnel, 19, 22–23

statistical analyses, 17–18, 21, 22

work samples, 59

Human engineering design, 45

Human resource management, 184–210

Human-technology interface,

see Technological innovation

I

Individual Training Standards, 58, 77–78, 95–96

Intelligence tests, history, 18, 19–22

Internal consistency reliability, 119, 120–121

Interrater reliability, 118, 119–120, 121, 123, 126

Interviewing, 7, 61, 152–153

historical perspectives, 23–25

walk-through performance tests, 66–67, 94–95, 140, 152–153

Item analysis, 119, 205

J

Job Orientation Bank, 158

Job analysis, 74–91

Job knowledge tests, 67

criterion-related validity, 150–152, 153, 155–156, 164–166, 168–169, 175

paper-and-pencil tests, 61, 67, 121, 139–140, 142, 150–151, 153, 155–156, 164–166, 175, 196

Job performance, 25, 208–209

as a construct, 74–75

definitional issues, 58, 75, 93, 101;

see also Tasks and task analysis

enlistment standards and, 12–13, 55, 61–62, 104, 184–188, 208–209

experience and, 163–164, 196–200, 201, 202

salary-based, 194–195

standards, general, 109–110

Job performance measures

benchmarks, 60–61, 126, 147

competency scales, 188–192, 203

content representativeness, 128–140

development, 68–71, 73–102

differential validity/prediction, 169–171, 177, 201

fairness analysis and, 12, 172–173

peer ratings, 7, 23–25, 67, 167

scoring, general

supervisor ratings, 26–27, 67, 93–94, 104, 139, 153–155, 195, 196

walk-through performance tests, 66–67, 94–95, 140, 152–153

see also Criterion-related validity

Joint-Service Job Performance Measurement/Enlistment Standards Project (JPM), 2–11 (passim), 13, 30, 55, 56, 57–72, 103, 140, 184, 210

classical test theory, 58

cost-performance models, 71–72, 91, 184–185, 209

history, 3–5, 8, 56–72

Judges,

see Raters and ratings

K

Korean War, 39, 47

L

Labor unions, 16

Language skills, 20, 46, 150

Laws, specific federal

Age Discrimination Act, 16

Civil Rights Act, 171–172

Selective Service Act, 47

Universal Military Training and Service Act, 47

Leniency effects, 26, 27

Likert scales, 26–27

Linear regression, 146, 174

M

Manpower management, 7, 32–33, 45–46, 184–210

historical perspectives, 19, 26, 28

Manuals,

see Soldier's Manuals

Marine Corps, 64–65, 67, 68, 72, 91, 95, 114, 165–166, 168, 188

Black Americans in, 40

occupations studied, 64–65

reliability assessments, 124–126, 127

sampling, 107, 130, 133

task analysis, 77–78, 94, 95–98, 130, 147–149

Mastery testing, 8, 189

Military Entrance Processing Stations, 35, 187

Minorities, 12, 71, 172–180

ASVAB, 44, 50, 52, 172–174

Black Americans, 12, 39–43, 173–174, 175–181

classification, general, 41–44

combat participation, 43–44

computer-aided classification, 35, 38

differential validity/prediction, 172–

fairness, 172–181

racial bias, 19, 38–39

recruitment, 38–44

sampling, 106

Models

cost-performance, 12–13, 46, 55, 56, 60, 91, 184–185, 188–189, 193–203, 209–210

econometrics, 185, 194, 196–200, 201–203

Motivation

proficiency vs., 59

under testing conditions, 139

Multiple-choice tests, 1, 7

history, 18, 19, 20, 21

minorities, 44

Multistage sampling, 106

Multivariate analysis, 21, 134–135

N

National Intelligence Test, 20

National Research Council, 19, 25, 39–40

Naval Aviation Program, 22

Navy, 22, 47, 63–64, 66, 67, 69, 72, 188, 192

Black Americans in, 40

classification of personnel, general, 188, 192

occupations studied, 63–64

recruitment strategies, 37

reliability assessments, 119–120, 121, 124–126, 127

sampling, 106–108, 129–130, 135–137

task analysis, 92–93, 120, 129–130, 135–136

Norms and norming, 3–4, 50–51, 56, 91, 109–110, 208–209

O

Observation techniques, 28, 74, 81

test administration, 119–120

Organizational factors, 6

armed services organizational structure, 32–33

see also Human resource management

P

Paper-and-pencil tests, 61, 67, 121, 139–140, 142, 150–151, 153, 155–156, 164–166, 175, 196

see also Armed Services Vocational Aptitude Battery

Pearson's product-moment correlation, 18

Peer evaluation, 7, 23–25, 67, 167

Persian Gulf War, 37, 41

Personality traits, 25, 78, 81, 84, 88–91, 95, 102, 167, 209

interviewing, 152

Personnel Research Bureau, 22

Personnel selection,

see Selection

Placement,

see Classification, personnel

Position Analysis Questionnaire, 204–205

Predictive validity, 25–26, 61–62, 68, 72, 104

differential validity/prediction, 169–171, 177, 201

fairness, 172, 174–178, 182

standardization and, 108–109

work samples, 59–60

see also Criterion-related validity

Problem-solving skills, 142

Processing and Classification of Enlistees (PACE), 187–188

Procurement Management System (PROMIS), 187, 188

Profile of American Youth, 4, 50

Project

Psychometrics, general, 10

defined, 1–2

history, 17–18

see also Fairness analysis;

Reliability;

Validity

Psychomotor skills, 142, 157, 158

Purposive sampling, 131–132, 133–138

Q

Qualified Man Month (QMM), 196–200, 201, 202

R

Rand model, 194, 196–200, 201–203

Random sampling, 105–106, 123, 132–134, 159

purposive vs., 133–138

stratified, 106, 133, 138

Raters and ratings, 10, 11, 67–68, 153, 155

cognitive processes of, 28

effects on examinees, 139, 152

expertise, 9, 28, 69, 71

interrater reliability, 118, 119–120, 121, 123, 126

peer, 7, 23–25, 67, 167

supervisor, 26–27, 67, 93–94, 104, 139, 153–155, 195, 196

Reading ability, 46

Army Beta, 20

Recruit Distribution Model, 188

Recruit quality, 4, 52–55, 184–186, 207

Recruitment and retention, 32, 33–44, 46, 52

ASVAB, 57

conscription, military, 3, 47, 48

cost factors, 37, 54, 184, 193–194, 197–199

demography, 36–38, 198, 210

enlistment standards, 12–13, 32, 44, 46–49, 52–55, 58, 61–62, 104, 184–188, 197–198, 208–209

fairness, 38–44

historical perspectives, 46–49, 53, 54, 55, 186

incentives, 37

minorities, 38–44

quality control, general, 4, 52–56

specialization, 33, 34–35, 36, 53

technical schools, role, 36, 53–54, 57

volunteer army, 3–4, 33–36, 47, 49, 56, 57–58

Reliability, 10–11, 21, 26, 27, 116–127, 147

analysis of variance, 122, 123, 124–126

classical test theory, 117–118, 121–122

criterion reliability, 158–159

generalizability theory, 10–11, 122–127, 151, 204–205

hands-on testing, 11, 119–120

internal consistency reliability, 119, 120–121

interrater reliability, 118, 119–120, 121, 123, 126

JPM project, 119–121, 124–127

Marine Corps, 124–126, 127

Navy, 119–120, 121, 124–126, 127

paper-and-pencil tests, 150–151, 153, 155–156, 164–166, 175

test-retest reliability, 148

Reservists, 38

Restriction of range, 26, 159–160

S

Salaries,

see Wages and salaries

Sampling issues

Army techniques, 107

availability, 107

content representativeness, 129–138

cost factors, 105, 106, 128, 132, 138, 139

demography, 106, 108

errors, 105, 106, 123, 132–133, 136

JPM project, 104–105

Marine Corps, 107, 130, 133

multistage, 106

Navy, 106–108, 129–130, 135–137

personnel, 8–9, 71–72, 103, 104–108

purposive, 131–132, 133–138

size, 136, 137, 173, 175, 176

specialties, 8, 63–66

tasks, 9, 59–60, 62–66, 68–69, 74–75, 93, 96–97, 101, 126–127, 128–130, 131–138

see also Random sampling

Scholastic Aptitude Test, 20–21

Scores and scoring, 2, 3–4, 11, 98–100, 108, 109–111

computation of, 99–100, 114–115

cutoff, 44, 46–48, 57, 99, 184, 189–190, 196, 198, 200, 204–205

errors and error analysis, general, 98–99

hands-on testing, 98–100

job performance criteria, 144

JPM project, 99–100

weighting, 147–149

see also Raters and ratings

Selection, personnel, general, 1, 3, 5, 32–33, 46, 141, 207

aviators, 22–23

conscription, military, 3, 47, 48

historical perspectives, 19, 22–23

minorities, 44

performance criteria, 13, 56–72

validity, 11–12, 141

see also Recruitment and retention

Selective Service Act, 47

Simulations, 7, 9, 67, 138–140

benchmarks and surrogates, 60–61, 126, 147

computer, 67, 134, 140

fidelity, 139–140

see also Surrogates

Skill level, classification

deskilling, 45

minorities and women, 41–43

see also Specialization

Skill Qualification Test (SQT), 196, 197, 199–200

Soldier's Manuals, 58, 69, 76, 77–78

Spatial ability, 157–158

Standardized testing, general, 103, 108–115, 139

elementary/secondary education, 15

hands-on tests, 110–115, 139–140

norms and norming, 3–4, 50–51, 56, 91, 109–110

work samples, 59–60

see also Criterion issues;

Multiple-choice tests

Standards,

see Enlistment standards

Statistical analyses, 116

ANOVA, 122, 123, 124–126, 135

Bayesian, 105

criterion-related validity, 29, 145–146, 158–159

factor structure, 21

graphs and tables, 145–146

historical perspectives, 17–18, 21, 22

linear regression, general, 146, 174

multivariate analysis, 21, 134–135

random sampling, 105–106, 132–133

sampling, other, 104, 105

see also Analysis of variance;

Construct validity;

Predictive validity

Student Testing Program, 49

Subgroups

sampling, 106

see also Fairness analysis;

Minorities;

Women

Subjectivity

raters, 11, 153

supervisor ratings, 26

Superior Evaluation Technique, 195

Supervisors,

see Raters and ratings

Surrogates, 60–61

benchmarks and, 60–61, 126, 147

computer simulations, 67, 134, 140

paper-and-pencil tests, 61, 67, 121, 139–140, 142, 150–151, 153, 155–156, 164–166, 175, 196

T

Tasks and task analysis, 58–59, 62–66, 74, 76–98, 164, 208

cognitive task analysis, 84, 85–88

computer inventories, 58, 78, 90

cost factors, 86, 91, 120, 128, 132, 138, 139

Delphi technique, 71, 192

difficulty level, 8, 89, 134, 136, 137, 188, 204

editing, 94–95, 96

errors and error analysis, 134, 135, 136

expertise, general, 9, 28, 69, 71

frequency, 88–89, 134, 136, 137–138

inventories, 6, 58, 78, 80

JPM project, 90–91, 93, 99–102, 130–131

Marine Corps, 77–78, 94, 95–98, 130, 147–149

modifiability, 89

Navy, 92–93, 120, 129–130, 135–136

personality traits in, 78, 81, 84, 88–91, 95, 209

relative vs. absolute measures, 8, 189

sampling, 9, 59–60, 62–66, 68–69, 74–75, 93, 96–97, 101, 126–127, 128–130, 131–138

sequencing, 98, 139

specialists, 63–69, 78, 130, 167, 203, 204

task importance, 88, 89, 98, 101, 129, 130–131, 132–133, 138

variability, 89

weighting, 147–149

see also Content validity

Technical schools, military, 36, 53–54, 57

Technological innovation, 44–45, 90

Test administration, 9–10, 110–115

computer-aided, 140, 158, 187

error of measurement due to, 111–114, 208

history, 20

JPM project, 112–113, 114–115

location, 35, 111–113

observational techniques, 119–120

repeat, 118–119

time factors, 104–105, 111–112, 122

training for, 113–114, 119

Test construction, 21, 94–98, 100, 101

see also Tasks and task analysis

Test-retest reliability, 148

see also Reliability

Training criteria, 45, 141–142, 143, 155–156, 170, 197–198

cost factors, 141

expertise and, 188

manuals, 58

technical schools, 36, 53–54, 57

test administrators, 113–114, 119

Trait analysis, 25, 78, 81, 84, 88–91, 95, 102

Type II errors, 176–177

U

Universal Military Training and Service Act, 47

Utility difference approach, 194–196, 199–200, 201–204

V

Validity, 2, 11–12

classification and, 11–12, 141, 152

construct validity, 59–60, 73, 74–75, 143, 147, 153, 155

content validity/representativeness, 59–60, 75–76, 90–91, 94, 96, 128–140, 147, 164

differential, 169–171

historical perspectives, 18, 21, 22–30

Pearson's product-moment correlation, 18

selection, general, 11–12, 141

see also Criterion-related validity;

Predictive validity

Variability, tasks, 89

Vietnam War, 40, 48

Volunteer Armed Forces, 3–4, 33–36, 47, 49, 56, 57–58

W

Wages and salaries, 194–195

Walk-through performance tests, 66–67, 94–95, 140, 152–153

Wars,

see Combat situations;

Korean War;

Vietnam War;

World War I;

World War II

Women, 36, 37, 41–43, 44, 71, 181–183

computer-aided classification, 35

sampling, 106, 108

World War I, 18, 19–20

World War II, 22–25, 38, 46–47

Written tests,

see Paper-and-pencil tests
