- 6.410.9 Training Evaluation Policy
- 6.410.9.1 Program Scope and Objectives
- 6.410.9.1.1 Background
- 6.410.9.1.2 Authority
- 6.410.9.1.3 Roles and Responsibilities
- 6.410.9.1.4 Program Management and Review
- 6.410.9.1.5 Program Controls
- 6.410.9.1.6 Terms
- 6.410.9.1.7 Acronyms
- 6.410.9.1.8 Related Resources
- 6.410.9.2 Training Evaluation Overview
- 6.410.9.2.1 Level 1 Training Evaluation: Reaction
- 6.410.9.2.2 Level 2 Training Evaluation: Learning
- 6.410.9.2.3 Level 3 Training Evaluation: Behavior
- 6.410.9.2.4 Level 4 Training Evaluation: Results
- 6.410.9.3 Level 1 Training Evaluation Policies and Procedures
- 6.410.9.3.1 Administering Level 1 Training Evaluations
- 6.410.9.3.2 Analyzing Level 1 Training Evaluation Results
- 6.410.9.4 Level 2 Training Evaluation Policies and Procedures
- 6.410.9.4.1 Developing Level 2 Training Evaluations
- 6.410.9.4.2 Administering Level 2 Training Evaluations
- 6.410.9.4.3 Analyzing Level 2 Training Evaluation Results
- 6.410.9.5 Level 3 Training Evaluation Policies and Procedures
- 6.410.9.5.1 Developing Level 3 Training Evaluations
- 6.410.9.5.2 Administering Level 3 Training Evaluations
- 6.410.9.5.3 Analyzing Level 3 Training Evaluation Results
- 6.410.9.6 Level 4 Training Evaluation Policies and Procedures
Part 6. Human Resources Management
Chapter 410. Learning and Education
Section 9. Training Evaluation Policy
6.410.9 Training Evaluation Policy
Manual Transmittal
February 21, 2025
Purpose
(1) This transmits revised IRM 6.410.9, Training Evaluation Policy.
Material Changes
(1) IRM 6.410.9.1, Program Scope and Objectives, updated subsections as required in IRM 1.11.2.2.4, Address Management and Internal Controls.
(2) IRM 6.410.9.1.1, Background, updated with current information.
(3) IRM 6.410.9.1.2, Authority, added Title 5 USC 4103, Establishment of Training Programs and Executive Order 11348, Providing for the Further Training of Government Employees. Updated Title 5 CFR 410.202.
(4) IRM 6.410.9.1.3, Roles and Responsibilities, relocated section 6.410.9.3, Roles and Responsibilities, to current section.
(5) IRM 6.410.9.1.4, Program Management and Review, added new subsection as required in IRM 1.11.2.2.4, Address Management and Internal Controls.
(6) IRM 6.410.9.1.5, Program Controls, added new subsection as required in IRM 1.11.2.2.4, Address Management and Internal Controls.
(7) IRM 6.410.9.1.6, Terms, renamed and relocated section 6.410.9.2.1, Definitions and Resources, to define terms listed throughout this IRM.
(8) IRM 6.410.9.1.7, Acronyms, added new subsection as required in IRM 1.11.2.2.4, Address Management and Internal Controls.
(9) IRM 6.410.9.1.8, Related Resources, renamed and relocated section 6.410.9.2.2, Definitions and Resources, for clarity.
(10) IRM 6.410.9.2, Training Evaluation Overview, moved and renumbered from section 6.410.9.4. Subsection revises previous section to eliminate duplication of information and improve readability.
(11) IRM 6.410.9.2.1, Level 1 Training Evaluation: Reaction, moved and renumbered from section 6.410.9.4.1. Subsection revises previous section to eliminate duplication of information and improve readability.
(12) IRM 6.410.9.2.2, Level 2 Training Evaluation: Learning, moved and renumbered from section 6.410.9.4.2. Subsection revises previous section to eliminate duplication of information and improve readability.
(13) IRM 6.410.9.2.3, Level 3 Training Evaluation: Behavior, moved and renumbered from section 6.410.9.4.3. Subsection revises previous section to eliminate duplication of information and improve readability.
(14) IRM 6.410.9.2.4, Level 4 Training Evaluation: Results, moved and renumbered from section 6.410.9.4.4. Subsection revises previous section to eliminate duplication of information and improve readability.
(15) IRM 6.410.9.3, Level 1 Training Evaluation Policies and Procedures, moved and renumbered from section 6.410.9.5. Subsection revises previous section to eliminate duplication of information and improve readability.
(16) IRM 6.410.9.3.1, Administering Level 1 Training Evaluations, moved and renumbered from section 6.410.9.5.1. Subsection revises previous section to eliminate duplication of information and improve readability.
(17) IRM 6.410.9.3.2, Analyzing Level 1 Training Evaluation Results, moved and renumbered from section 6.410.9.5.2. Subsection revises previous section to eliminate duplication of information and improve readability.
(18) IRM 6.410.9.4, Level 2 Training Evaluation Policies and Procedures, moved and renumbered from section 6.410.9.6. Subsection revises previous section to eliminate duplication of information and improve readability.
(19) IRM 6.410.9.4.1, Developing Level 2 Training Evaluations, moved and renumbered from section 6.410.9.6.1. Subsection revises previous section to eliminate duplication of information and improve readability.
(20) IRM 6.410.9.4.2, Administering Level 2 Training Evaluations, moved and renumbered from section 6.410.9.6.2. Subsection revises previous section to eliminate duplication of information and improve readability.
(21) IRM 6.410.9.5, Level 3 Training Evaluation Policies and Procedures, moved and renumbered from section 6.410.9.7. Subsection revises previous section to eliminate duplication of information and improve readability.
(22) IRM 6.410.9.5.1, Developing Level 3 Training Evaluations, moved and renumbered from section 6.410.9.7.1. Subsection revises previous section to eliminate duplication of information and improve readability.
(23) IRM 6.410.9.5.2, Administering Level 3 Training Evaluations, moved and renumbered from section 6.410.9.7.2. Subsection revises previous section to eliminate duplication of information and improve readability.
(24) IRM 6.410.9.5.3, Analyzing Level 3 Training Evaluation Results, moved and renumbered from section 6.410.9.7.3. Subsection revises previous section to eliminate duplication of information and improve readability.
(25) IRM 6.410.9.6, Level 4 Training Evaluation Policies and Procedures, moved and renumbered from section 6.410.9.8. Subsection revises previous section to eliminate duplication of information and improve readability.
(26) Editorial changes are made throughout to update division and office names, references, hyperlinks and terminology.
Effect on Other Documents
This IRM supersedes IRM 6.410.9 dated August 16, 2021.
Audience
All business units.
Effective Date
(02-21-2025)
Traci M. DiMartini,
IRS Human Capital Officer
-
Purpose: This IRM establishes policy and guidance needed to conduct Levels 1-4 training evaluations. Read and interpret this policy in accordance with applicable federal laws, government wide regulations, Treasury Human Capital Issuance System Directives and other sources as appropriate.
-
Audience: Unless otherwise indicated, the policies, authorities, procedures, and guidance contained in this IRM apply to all business units.
-
Policy Owner: The IRS Human Capital Officer.
-
Program Owner: The Human Capital Office (HCO), Office of Human Resources Strategy (OHRS), IRS University.
-
Primary Stakeholders: All Learning and Education (L&E) organizations Servicewide.
-
Program Goals: IRS University’s goal is to support a culture of continuous self-development and mission-focused learning for employees. This IRM bolsters that goal by providing policies and procedures for measuring and reporting the overall effectiveness of training programs at the IRS.
-
The IRS, through the joint efforts of the learning and education community, develops and delivers a variety of training courses designed to meet organizational goals and contribute to mission accomplishment. Effectively evaluating training helps personnel determine course improvement opportunities and training’s overall contribution to business results.
-
The IRS follows established procedures outlined in the instructional design model known as the Training Development Quality Assurance System (TDQAS) to administer training evaluations. Specific procedures address the evaluation of learner reaction, learner achievement, job performance, and organizational impact. Adherence to TDQAS processes ensures procedural consistency in the overall quality of our training systems and products. IRM 6.410.1, Learning and Education Policy, and its subsections address the actions associated with the evaluation phase of TDQAS.
-
Laws:
-
5 USC 4103, Establishment of Training Programs.
-
-
Executive Orders:
-
Executive Order 11348, Providing for the Further Training of Government Employees, provides agency heads and the U.S. Office of Personnel Management with additional presidential direction on how training law is to be carried out.
-
-
Regulations:
-
5 CFR 410, Planning and Evaluating Training, Section 202, Responsibilities for evaluating training, at: https://www.law.cornell.edu/cfr/text/5/410.202.
-
-
The IRS Human Capital Officer is the executive responsible for this IRM and overall Servicewide policy.
-
The HCO, OHRS, Transformation, Policy and Engagement (TPE), Policy Office (PO) is responsible for developing and publishing content in this IRM.
-
The HCO, OHRS, IRS University Office is responsible for the operation and administration of this IRM.
-
The Servicewide Training Evaluation Program (STEP) within IRS University is responsible for:
-
Establishing Servicewide training evaluation policies and guidelines for the IRS.
-
Providing training evaluation resources for L&E professionals such as evaluation training, job aids, and webinars.
-
Administering the Evaluation Management System (EMS), which includes granting access as appropriate, contract support and billing, and troubleshooting.
-
Providing technical support and consultation for training evaluation development.
-
Reporting Servicewide training evaluation results.
-
Communicating evaluation information such as program and policy updates, evaluation results, etc.
-
Creating standardized Level 1 evaluations designed to consistently capture, compile, quantify and report data.
-
Maintaining the STEP SharePoint site and the HCO Servicewide Evaluation Team mailbox.
-
-
L&E is responsible for:
-
Determining Level 2 through 4 evaluation requirements for training courses, in conjunction with business unit customers.
-
Creating and maintaining items in the Learning Management System (LMS), to include updating evaluation requirement fields and associating the appropriate standardized Level 1 training surveys to items.
-
Communicating training evaluation responsibilities to individuals throughout their business unit (e.g., providing guidance to Subject Matter Experts (SMEs), training coordinators, instructors, etc.) regarding their duties in the training evaluation process.
-
Alerting Virtual Learning Management (VLM) or Classroom Learning Services (CLS) when standardized paper Level 1 training evaluations must be administered.
-
Communicating Level 2 training evaluation requirements to instructors, ensuring instructors complete Form 14156, Level 2 Instructor Data Capture (IDC), and either recording the IDC data in the LMS or submitting a support request to VLM/CLS via the Event Support Request System (ESRS) to request that they input the IDC data into the LMS.
-
Compiling and forwarding Level 3 training evaluation results not automatically captured in the EMS (i.e., alternative Level 3 training evaluations) to the STEP team at HCO Servicewide Evaluation Team.
-
Creating and maintaining course files for training within their purview.
-
Creating and maintaining consolidated reports in course files.
-
Adhering to all policies and procedures set forth for training evaluation in this IRM.
-
-
The VLM or CLS office is responsible for:
-
Administering paper Level 1 training evaluations.
-
Batching the results of paper Level 1 training evaluations and forwarding them to the EMS vendor for processing with an IRS-TEMPO-Batch-Header sheet, which provides the vendor contact information.
-
Entering data from Form 14156, Level 2 Instructor Data Capture (IDC), into the LMS.
-
Informing instructors or their L&E contact(s) of Level 2 training evaluation requirements for classes they support.
-
-
The program office gauges the effectiveness of this program based on feedback from customers and stakeholders and considers any statutory or regulatory changes. During review and publishing, in partnership with the policy office, IRM sections are revised, added or deleted based in part on this process.
-
Annual Review: An annual review is conducted to determine program effectiveness and feasibility.
-
The policy office is responsible for reviewing policies to ensure conformance with applicable laws and regulations.
-
The program office is responsible for implementing, monitoring and improving internal controls which are programs and procedures that ensure:
-
Program goals are established, and performance is measured to assess efficient and effective objective accomplishment.
-
Programs and resources are protected against waste, fraud, abuse, mismanagement and misappropriation.
-
Program operations are reviewed in conformance with applicable laws and regulations.
-
Reliable information is obtained and used in decision making and quality assurance.
-
-
Annual program review requirements include the program office:
-
Ensuring the internal controls are complete, accurate and reviewed at least annually to promote consistent program administration.
-
-
The following activities ensure program success:
-
Conducting annual policy reviews and
-
Publishing educational articles and other materials.
-
-
The following is a list of terms and definitions specific to this IRM.
- Consolidated Report: A report maintained in the course file that includes data analysis, a summary, and recommendations for improvement based on Level 1, 2, and 3 training evaluation results. A sample consolidated report can be found on the STEP SharePoint site.
- Course ob体育: A centralized location where records or documentation related to a training development project (such as course development agreements, TDQAS documentation, etc.) are maintained.
- Evaluation Management System (EMS): The official system of record for managing training evaluation data. The IRS’s current EMS is Training Evaluation & Measurement for Performance Optimization (TEMPO). Certain evaluation data captured in the Learning Management System (LMS) is automatically transferred to the EMS to streamline data collection and provide a more strategic approach to analyzing and interpreting evaluation data.
- Item: Any training event tracked in the LMS for learning history completions and/or for tracking usage. Items include, but are not limited to: Continuing Professional Education (CPE), courses, and workshops.
- Learning and Education (L&E): An organization within a business unit, or a designated area within HCO, that develops courseware and provides training and education support.
- Learning Management System (LMS): The official system of record for training. It offers development and delivery of course evaluations. The IRS’s current LMS is Integrated Talent Management (ITM). Access the LMS and resources at the Integrated Talent Management site.
- Mission Critical Training: Any training that is required to meet employee and organizational performance goals, or training that, if not continued, would result in significant loss to the agency measured in terms of lost taxes, customer confidence, or employee productivity. Examples of mission critical training include, but are not limited to:
  - Recruit and basic training for mission critical and non-mission critical positions,
  - Functional and cross-functional leadership training,
  - Classroom instructor training, On-the-Job Instruction (OJI), and instructor training workshops, and
  - Discretionary out-service training for unanticipated mission critical needs that require out-service funds.
- New World Kirkpatrick Model: A four-level blended approach to training evaluation that focuses on transferring learning to behavior and aligning training with organizational goals.
- Performance: Accomplishment of work assignments or responsibilities.
- Priority 1: Mission critical training that must be timely delivered or developed during the fiscal year (FY). This training is necessary for the employee to function in their basic position, such as filing season readiness, new hire, and tax law training.
- Priority 2: Mission critical training that must be delivered or developed during the FY. The employee can still function in their basic position but needs this training to perform at a higher job level, such as higher phase training, specialty training, and CPE.
- Priority 3: Training that should be delivered or developed during the FY but may be postponed until the latter part of the FY or the following FY without significantly impacting operations. This training enhances skills already possessed to enable employees to operate more effectively.
- Priority 4: Training that could be postponed to the following FY with no negative impact if there are insufficient training funds or resource limitations. This training enhances performance but is not necessary to perform daily activities.
- Rubric: A set of criteria for grading assignments. Rubrics usually have evaluative criteria, quality definitions for those criteria at particular levels of achievement, and a scoring strategy.
- Self-Directed Training (also known as asynchronous training): An instructional design and delivery method that allows students to access content or participate in learning outside of the classroom and at their convenience, or independent of the instructor.
- Training: The process of providing employees the programs, courses, or other instruction they need to develop new skills to perform a task or process, or to enhance or improve current skills in their individual job performance. Effective training may result in observably changed behavior.
- Training Development Quality Assurance System (TDQAS): The instructional design model used by the IRS that guides the processes of training assessment, analysis, design, development, implementation, and evaluation. The TDQAS development model ensures high-quality training products and services. For more information, see IRM 6.410.1, Learning and Education Policy, and its subsections.
- Training Event: Instruction that is conducted in a structured learning environment (eLearning or non-eLearning) and contains behavioral objectives linked to or derived from job competencies or tasks.
-
-
The following list provides commonly used acronyms and their definitions:
- AOL: Asynchronous On-Line
- CI: Criminal Investigation
- CLS: Classroom Learning Services
- CPE: Continuing Professional Education
- ELO: Enabling Learning Objectives
- EMS: Evaluation Management System
- ESRS: Event Support Request System
- ETD: Enterprise Talent & Development
- FY: Fiscal Year
- HCO: Human Capital Office
- IDC: Instructor Data Capture
- IRM: Internal Revenue Manual
- IRS: Internal Revenue Service
- ITM: Integrated Talent Management
- L&E: Learning and Education
- LMS: Learning Management System
- NTEU: National Treasury Employees Union
- O: Oral
- OJI: On-the-Job Instruction
- QL: Qualitative
- QN: Quantitative
- SEID: Standard Employee Identifier
- SL&E: Servicewide Learning & Education
- SME: Subject Matter Expert
- STEP: Servicewide Training Evaluation Program
- TDQAS: Training Development Quality Assurance System
- TEMPO: Training Evaluation & Measurement for Performance Optimization
- TLO: Terminal Learning Objectives
- VLM: Virtual Learning Management
-
Evaluation is the phase of the TDQAS in which data is systematically collected, analyzed, and interpreted to determine the effectiveness of training.
-
The IRS uses a four-level approach to training evaluation, based on one of the leading industry standards in the field of training. This systematic approach, known as the Kirkpatrick Model, with New World Kirkpatrick Model enhancements, ensures that the IRS gathers complete data that measures the effectiveness of training, while enabling course owners to identify course improvement opportunities. Using the Kirkpatrick Model enables the IRS to effectively demonstrate the strategic value that training programs add to business results and the organization’s mission.
-
The policies and procedures in this IRM apply to training owned, developed, or delivered by the IRS (i.e., LMS items with a CURR-xxx- domain). Programs that support or utilize vendor-owned or out-service training must establish appropriate procedures for evaluating these types of training.
-
Level 1 training evaluation is the degree to which participants find training favorable, engaging and relevant to their jobs. Its purpose is to assess the immediate reaction of learners to the training.
-
New World Kirkpatrick Model Level 1 dimensions include:
-
Customer Satisfaction - participants’ satisfaction with the training.
-
Engagement - the degree to which participants are actively involved in contributing to the learning experience.
-
Relevance - the degree to which participants believe they will have the opportunity to use or apply what they learned in training on the job.
-
-
The Level 1 evaluation process enables business units to gather and interpret participants’ reactions to the training they received.
-
Level 2 training evaluation is the degree to which participants acquire the intended knowledge, skills, attitude, confidence and commitment based on their participation in training. Level 2 training evaluation can be either knowledge-based or performance-based.
New World Kirkpatrick Model Level 2 dimensions include:
-
Knowledge - "I know it."
-
Skill - "I can do it right now."
-
Attitude - "I believe this will be worthwhile to do on the job."
-
Confidence - "I think I can do it on the job."
-
Commitment - "I intend to do it on the job."
-
-
The Level 2 training evaluation process enables business units to gather and interpret participants’ level of learning based on the training they received. Level 2 training evaluation can be quantitative or qualitative in nature.
-
Quantitative tests or assessments are those that express results in numbers. Examples of quantitative Level 2 evaluations include, but are not limited to:
-
Traditional graded knowledge tests (multiple choice, fill in the blank, calculate the equation, complete the form, etc.),
-
Graded knowledge checks, and
-
Graded individual case studies (i.e., L&E develops and provides a rubric for scoring the assignment).
-
-
Qualitative methods do not yield numerical results or are not easily scored. They are any means by which students receive input and guiding feedback on their relative performance to help them improve. Examples of qualitative Level 2 evaluations include, but are not limited to:
-
Group or team exercises,
-
Presentations,
-
Work reviews,
-
Pass or fail assessments,
-
Questionnaires,
-
Interviews,
-
Simulations,
-
Observations,
-
Self-assessments,
-
Pre or post assessments,
-
Role plays, and
-
Action learning.
-
-
-
Level 3 training evaluation is the degree to which participants apply what they learned during training when they are back on the job.
-
New World Kirkpatrick Model Level 3 training evaluation dimensions include:
-
Required Drivers: Processes and systems that reinforce, monitor, encourage and reward performance. Examples of required drivers include work review checklists, job aids, recognition, coaching, mentoring, etc.
-
Critical Behaviors: The few, specific actions that will have the biggest impact on the desired results if performed consistently on the job. For the IRS, critical behaviors are the same as enabling learning objectives (ELOs) or terminal learning objectives (TLOs).
-
On-the-Job Learning: A culture and expectation that individuals are responsible for maintaining the knowledge and skills to enhance their own performance.
-
-
The Level 3 training evaluation process enables business units to gather and interpret the level of impact the training had on participants’ behavior.
-
Level 4 training evaluation is the degree to which targeted outcomes occur as a result of training. It assesses the impact of the improvements in the trainees’ performance on the mission of the organization. Business units must be able to identify and provide data on measures that have organizational impact.
-
New World Kirkpatrick Model Level 4 training evaluation dimensions include leading indicators, which are short-term observations and measurements suggesting that critical behaviors are on track to create a positive impact on desired results. Examples of leading indicators include balanced measure ratings, compliance ratings, customer response ratings, etc.
-
The Level 4 evaluation process enables business units to gather and interpret the results the training had on the organization.
-
All training events require Level 1 training evaluations.
Exception:
Level 1 training evaluation policies and procedures may be applied to one-time events as resources allow; however, they are not mandatory for one-time events. A one-time event is defined as an event only delivered to one group of participants. The event may take place in multiple sessions, but they must all be within a sixty-day time frame. L&E must notate "One-time event" in the item comments field in the LMS.
-
L&E must document the Level 1 evaluation requirements for each item in the LMS.
| If the item: | Then populate the LMS "Level 1 Evaluation" item field with: |
| --- | --- |
| Meets the definition of a training event (Level 1 required) | Level 1 is required (Required) |
| Does not meet the definition of a training event | Level 1 is not required (Optional) |
| Is a one-time event (even if it meets the definition of a training event) | Level 1 is not required (Optional) |
-
L&E must associate the appropriate standardized online Level 1 training evaluation to each item in the LMS that requires a Level 1 training evaluation. The four standardized online Level 1 training evaluation forms currently available are:
| Standardized Online Level 1 Training Evaluation Form | Used to evaluate |
| --- | --- |
| L1-CLS-3 (Classroom Course Evaluation) | Instructor-led, classroom training events |
| L1-BLD-1 (Blended - Classroom, Computer and Self-Directed Evaluation) | Instructor-led training events with classroom and technology-based components |
| L1-VIR-1 (Virtual Evaluation) | Virtual training events |
| L1-SD-1 (Self-Directed Training Evaluation) | Self-directed or asynchronous training events |
-
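Taken together, the requirement table and the form table above amount to a simple decision rule plus a lookup. The minimal Python sketch below illustrates that logic only; the function names and delivery-type keys are hypothetical, the field values and form IDs are copied from the tables, and nothing here represents an actual LMS or EMS interface.

```python
# Illustrative sketch only: encodes the Level 1 decision rule and the
# standardized-form lookup from the two tables above. Names are hypothetical;
# this is not an LMS or EMS interface.

# Standardized online Level 1 forms, keyed by an assumed delivery-type label.
LEVEL1_FORMS = {
    "classroom": "L1-CLS-3",      # Instructor-led, classroom training events
    "blended": "L1-BLD-1",        # Classroom plus technology-based components
    "virtual": "L1-VIR-1",        # Virtual training events
    "self-directed": "L1-SD-1",   # Self-directed or asynchronous training events
}

def level1_requirement(is_training_event: bool, is_one_time_event: bool) -> str:
    """Return the value for the LMS "Level 1 Evaluation" item field."""
    if is_training_event and not is_one_time_event:
        return "Level 1 is required (Required)"
    # Items that are not training events, and one-time events, are optional.
    return "Level 1 is not required (Optional)"

def level1_form(delivery_type: str) -> str:
    """Return the standardized online Level 1 form to associate with the item."""
    return LEVEL1_FORMS[delivery_type.lower()]

if __name__ == "__main__":
    print(level1_requirement(is_training_event=True, is_one_time_event=False))
    # -> Level 1 is required (Required)
    print(level1_form("virtual"))  # -> L1-VIR-1
```
-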
Feedback from participants, instructors, L&E and the NTEU was considered when developing the standardized Level 1 training evaluations.
-
For participants, completing Level 1 training evaluations is voluntary but strongly encouraged.
-
Level 1 training evaluations must be completed during normal duty hours.
-
Level 1 training evaluations must be anonymous and confidential (i.e., no names, SEIDs, or other self-identifying data will be used).
-
Level 1 training evaluations are automatically issued through the LMS when a course with an associated standardized online Level 1 training evaluation is recorded in an employee’s learning history. The LMS automatically captures the responses.
-
Standardized paper Level 1 training evaluations are available for employees with restricted access to computers or as a reasonable accommodation.
-
The two standardized paper Level 1 training evaluation forms currently available are:

| Standardized Paper Level 1 Training Evaluation Form | Used to evaluate |
| --- | --- |
| L1-CLS-3 (Classroom Course Evaluation) | Instructor-led, classroom training events |
| L1-VIR-1 (Virtual Evaluation) | Virtual training events |
-
The VLM or CLS office must manually administer the standardized paper Level 1 evaluations.
-
The VLM or CLS office must send the results to the EMS vendor for input into the EMS.
-
L&E, through discussion with their business unit customer(s), will annually determine the priority of training events included on their training plans. Refer to IRM 6.410.10, Event Planning and Approval, for additional information.
-
L&E must obtain Level 1 training evaluation results from the LMS or the EMS.
-
L&E must analyze required Level 1 training evaluation results for all Priority 1 and Priority 2 training courses in their purview within one year of initial delivery. Analysis must be performed at least every three years thereafter, while the training remains active. Analyze other Level 1 training evaluation results (e.g., for Priority 3 and Priority 4 training events) as resources allow.
-
L&E must compile and document Level 1 training evaluation results analysis in a consolidated report within the course file.
-
Documenting Level 1 training evaluation results analysis must include:
-
Recommendations for course improvement or corrective actions, or
-
Annotation if there are no recommendations for improvement.
-
-
Additional documentation may include, but is not limited to:
-
Analysis and assessment of Level 1 reports with feedback.
-
Overall training satisfaction scores.
-
Evaluative information from instructors to ensure high-quality, effective and efficient delivery of training.
-
-
L&E, through discussion with their business unit customer(s), will annually determine which training courses require Level 2 training evaluations.
-
Level 2 training evaluations should focus on mission critical, Priority 1, and 2 training events, but may be required for any training program as determined above.
-
L&E must require Level 2 training evaluations on at least 5% of courses delivered for each business unit.
-
Consider training and program priorities, training course costs, and available resources when determining Level 2 training evaluation requirements.
-
-
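As a worked illustration of the five-percent floor above (which parallels the Level 3 floor in IRM 6.410.9.5), the sketch below computes the minimum number of courses that would need an evaluation requirement. Rounding up to a whole course is an assumption; the IRM states only "at least 5%" and does not prescribe a rounding rule.

```python
import math

def minimum_evaluated_courses(courses_delivered: int, floor_pct: float = 0.05) -> int:
    """Minimum number of courses that must carry an evaluation requirement.

    Assumes the "at least 5%" floor rounds up to a whole course; the IRM does
    not specify a rounding rule, so treat this as illustrative only.
    """
    return math.ceil(courses_delivered * floor_pct)

# Example: a business unit that delivers 47 courses in a fiscal year would need
# Level 2 evaluations on at least ceil(47 * 0.05) = 3 of them.
print(minimum_evaluated_courses(47))  # -> 3
```
-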
L&E must document Level 2 training evaluation requirements for each item in the LMS:
-
When items are created in the LMS, populate the "Level 2 Evaluation" field with "Level 2 is not required (Optional)" or "Level 2 is required (Required)," as appropriate.
-
If the item’s Level 2 training evaluation requirement changes after it is initially created in the LMS, update the "Level 2 Evaluation" item field in the LMS prior to the start of any associated training events.
-
-
Learners must complete Level 2 training evaluations during normal duty hours.
-
Level 2 training evaluation results used for reporting and analytical purposes must be anonymous and confidential (i.e., no names, SEIDs, or other self-identifying data will be recorded).
-
Level 2 training evaluation results for an individual may be included in a developmental guide, or similar document, for employee development purposes or to provide individualized coaching.
-
An individual’s Level 2 training evaluation results are not required to be anonymous when being used for employee development purposes or to provide individualized coaching but can only be shared with a training coordinator or the employee’s first level manager.
-
No adverse actions will be taken against learners based on Level 2 training evaluation results.
-
Information gathered through Level 2 evaluation will not be used to evaluate participants on their annual performance appraisals.
-
Since Level 2 training evaluations measure the knowledge obtained from the subject matter of each individual course, there are no standardized forms or questions required to be used in all Level 2 training evaluations.
-
If a course has a Level 2 training evaluation requirement, L&E must develop a Level 2 evaluation and include it in the course. Refer to IRM 6.410.9.4.2 for types of Level 2 evaluations.
-
L&E must develop and include in the course:
-
Methods or procedures for capturing results, and
-
A grading scheme, if appropriate.
-
-
If using Level 2 training evaluation results for employee development or coaching purposes, methods or procedures for capturing the employee’s individual Level 2 training evaluation results must be developed and included in the course.
-
Level 2 training evaluations may be administered online or by instructors.
-
If a training event has a Level 2 training evaluation requirement, the evaluation must be administered. Typically, Level 2 training evaluations are administered during course completion; however, there are instances when they might be administered after the course is completed.
-
Certain Level 2 evaluation data must be captured in the LMS for Servicewide analysis and reporting purposes. For instructor-led or blended training, the course instructor or business unit designee must:
-
Administer and score the Level 2 evaluation.
-
Complete Form 14156, Level 2 Instructor Data Capture (IDC).
-
Submit the form to VLM or CLS for input into the LMS by submitting a support request to VLM/CLS via the Event Support Request System (ESRS).
-
-
Data captured on the IDC Form includes (as applicable):
-
Whether or not a Level 2 evaluation was administered, including the type of delivery (e.g., online (AOL), oral (O), qualitative (QL), quantitative (QN)),
-
Reason if not administered,
-
Average pre-test score,
-
Average post-test score,
-
Percentage of improvement,
-
Average final test score,
-
Current trainee count,
-
Number of trainees completing, and
-
Percentage of trainees completing.
-
-
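The percentage fields captured on the IDC Form above are simple arithmetic. The sketch below shows one plausible way to compute them; the exact formulas (for example, whether the percentage of improvement is measured relative to the average pre-test score) are assumptions and are not prescribed by this IRM or by Form 14156.

```python
def percent_improvement(avg_pre_score: float, avg_post_score: float) -> float:
    """Percentage of improvement from average pre-test to average post-test.

    Assumption: improvement is expressed relative to the pre-test average;
    Form 14156 may define this differently.
    """
    return round((avg_post_score - avg_pre_score) / avg_pre_score * 100, 1)

def percent_completing(trainee_count: int, trainees_completing: int) -> float:
    """Percentage of trainees completing the Level 2 evaluation."""
    return round(trainees_completing / trainee_count * 100, 1)

# Example: average pre-test 62, average post-test 85, 28 of 30 trainees completed.
print(percent_improvement(62, 85))   # -> 37.1
print(percent_completing(30, 28))    # -> 93.3
```
-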
To capture comprehensive Level 2 training evaluation results for more extensive analysis, L&E must communicate any additional reporting requirements to instructors.
-
If a Level 2 evaluation is required but not developed or administered, the instructor or business unit designee must complete the IDC Form. The form must include the reason the Level 2 was not administered. Upon completion, the instructor or designee must submit the form to VLM or CLS via the Event Support Request System (ESRS) for input into the LMS.
Note:
For CPE courses where accreditation (continuing education units or credits) is sought, the courses must adhere to the test requirements and standards of the accrediting authority (i.e., the American Institute of Certified Public Accountants or the National Association of State Boards of Accountancy). Criminal Investigation (CI) training courses must adhere to the standards set forth by the Federal Law Enforcement Training Accreditation Board.
-
Limited Level 2 training evaluation results will be available from the LMS or EMS. Access additional Level 2 training evaluation results from data housed in alternative systems, in accordance with procedures established by the business unit. Use comprehensive results for analysis when available.
-
L&E organizations must analyze results from Level 2 training evaluations for all training events with a Level 2 training evaluation requirement in their purview within one year of initial delivery and annually thereafter, while the training remains active.
-
If Level 2 training evaluation results are available for courses without a Level 2 training evaluation requirement, analyze results as resources allow.
-
L&E must compile and document Level 2 training evaluation results analysis in a consolidated report and maintain in the course file.
-
Documentation of Level 2 training evaluation results analysis must include:
-
Recommendations for course improvements and/or corrective actions, or
-
Annotation if there are no recommendations for improvement.
-
-
Additional documentation may include, but is not limited to:
-
Test item analysis,
-
Average test scores,
-
Pre/post assessments,
-
Final test scores,
-
Performance-based results, and/or
-
Level 2 final results from pass/fail, complete/not complete assessments, etc.
-
-
L&E, through discussion with their business unit customer(s), will annually determine which training courses require Level 3 training evaluations.
-
Level 3 training evaluations must focus on mission critical, Priority 1 and 2 training events, but can be required for any training program, as determined above.
-
L&E must require Level 3 training evaluations on at least 5% of courses delivered for each business unit, and
-
Consider training and program priorities, training course costs, courses scheduled for revision and available resources when determining Level 3 training evaluation requirements.
-
-
L&E must document the Level 3 training evaluation requirements for each item in the LMS:
-
When items are created in the LMS, populate the "Level 3 Evaluation" ” field with "Level 3 is not required (Optional)" or "Level 3 is required (Required)," as appropriate, and
-
If the Level 3 evaluation requirement changes after it is initially created in the LMS, update the "Level 3 Evaluation" field in the LMS.
-
-
Administering Level 1 and Level 2 training evaluations is recommended (but not required) prior to administering Level 3 training evaluations for a course.
-
For participants, completing Level 3 training evaluations is voluntary, but strongly encouraged.
-
For participants’ managers, completing Level 3 training evaluation is mandatory.
-
Managers and participants must complete Level 3 training evaluations during normal duty hours.
-
Level 3 training evaluation results used for reporting and analytical purposes will be anonymous and confidential (i.e., no names, SEIDs, or other self-identifying data will be recorded).
-
No adverse actions will be taken against participants or managers based on results of Level 3 data reports.
-
Information gathered through Level 3 evaluation will not be used to evaluate participants on their annual performance appraisals.
-
If a training event has a Level 3 training evaluation requirement, Level 3 training evaluations must be developed. This should be completed during the design and development phase of TDQAS, but may be completed later, if necessary.
-
Level 3 training evaluations must be developed in the EMS.
-
Corresponding participant and manager Level 3 training evaluations, referred to as an evaluation pair, must be developed simultaneously.
-
All Level 3 training evaluations must include the mandatory training transfer questions to provide results for Servicewide reporting. The mandatory training transfer questions are:
-
Training Transfer Questions for Non-Leadership Courses.
-
Training Transfer Questions for Leadership Courses.
-
-
Level 3 evaluation instruments must be reviewed and approved by the STEP team prior to delivery.
-
Level 3 training evaluations must be conducted on all training events with a Level 3 training evaluation requirement.
-
Administer Level 3 training evaluations after completion of the training event and after employees have had an opportunity to apply course content back on the job. They are typically delivered one to six months after training but must be administered within nine months of the end of the training event.
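-
As an illustration of the timing rule above, the minimal sketch below derives the typical and latest administration dates from a training event's end date. Counting the one-, six-, and nine-month offsets as calendar months (clamped to month end) is an assumption; the IRM does not define how months are counted.

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    """Add calendar months to a date, clamping to the month's last day."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def level3_window(training_end: date) -> dict:
    """Typical and latest Level 3 administration dates for a training event.

    Per the text above: typically one to six months after training, and no
    later than nine months after the end of the training event.
    """
    return {
        "earliest_typical": add_months(training_end, 1),
        "latest_typical": add_months(training_end, 6),
        "must_administer_by": add_months(training_end, 9),
    }

print(level3_window(date(2025, 3, 31)))
# -> {'earliest_typical': datetime.date(2025, 4, 30),
#     'latest_typical': datetime.date(2025, 9, 30),
#     'must_administer_by': datetime.date(2025, 12, 31)}
```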
-
If a required Level 3 training evaluation has not been administered within the typical delivery time frame, L&E must document the scheduled administration date in the consolidated report within the course file.
-
Once a Level 3 training evaluation pair is reviewed and approved in the EMS, evaluation links will be sent to L&E, who will then issue them to participants and their managers. Electronic Level 3 training evaluation responses are automatically captured in the EMS.
-
If a paper Level 3 training evaluation is used, a PDF version will be created and sent to L&E, along with a batch sheet. The L&E office is responsible for:
-
Administering the Level 3 paper evaluation,
-
Compiling paper Level 3 training evaluation responses using the batch sheet, and
-
Submitting the paper responses and batch sheet to the EMS vendor for processing. The vendor contact information is included on the provided batch sheet.
-
-
L&E may delegate administering, compiling and submitting Level 3 paper evaluations; however, they must communicate the requirements to their designee and confirm completion.
-
Waiving the use of the EMS in its entirety for Level 3 training evaluations requires STEP written approval; complete and submit a Training Evaluation Waiver Request to detail the business need for an exception.
-
L&E must obtain Level 3 evaluation results from the EMS.
-
L&E must analyze results from Level 3 training evaluations for all training events with a Level 3 training evaluation requirement in their purview within one year of initial delivery and annually thereafter, while the training remains active.
-
If Level 3 evaluation results are available for courses without a Level 3 training evaluation requirement, analyze results as resources allow.
-
L&E must compile and document Level 3 training evaluation results analysis in a consolidated report within the course file.
-
Documentation of Level 3 training evaluation results analysis must include:
-
Recommendations for course improvements and/or corrective actions, or
-
Annotation if there are no recommendations for improvement.
-
-
Additional documentation may include, but is not limited to:
-
Level 3 participant training evaluation results.
-
Level 3 manager training evaluation results.
-
Analysis of Level 3 training evaluation results.
-
-
L&E will confer with their assigned business unit executives or designee(s) during the TDQAS assessment phase to determine:
-
Program expectations including the customer’s desired results of the training,
-
Whether the training warrants a Level 4 training evaluation and whether it is feasible,
-
Whether resources and funding are available and will be committed, and
-
The extent to which the training impact can be isolated from other factors, and whether customer-valued data on business results is available.
-
-
L&E must document the Level 4 training evaluation requirements for each course in the LMS.
-
When items are created in the LMS, populate the "Level 4 Evaluation" field with "Level 4 is not required (Optional)" or "Level 4 is required (Required)," as appropriate.
-
If the Level 4 evaluation requirement changes after it is initially created in the LMS, update the "Level 4 Evaluation" item field with "Level 4 is not required (Optional)" or "Level 4 is required (Required)," as appropriate.
-
-
If a Level 4 training evaluation is required, conduct Levels 1, 2 and 3 training evaluations. Feedback from these levels will provide important quantitative and qualitative data to support conclusions that link training to business results.
-
There are currently no Servicewide reporting requirements for Level 4 training evaluation results; however, any available results should be forwarded to the HCO Servicewide Evaluation Team.
-
Design Level 4 training evaluation executive reports by following guidelines established by each business unit.
-
Maintain Level 4 training evaluation results, if available, in a consolidated report within the course file.