Building a Certification Program, Step 8: Instructional design (Part 1)
Posted on April 1, 2019

NOTE: This is an ongoing series. To view all articles in the series, click here.

The journey to a functional certification program continues with an overview of instructional design.

Having now shared a proven method for preparing a learning/assessment blueprint, which provides a solid foundation for a certification program, our next step is to assimilate all the data from our earlier analyses, along with the task analysis and the blueprint. We will begin designing and developing a learning intervention and an assessment that follow our blueprint and lead to certification. That brings us to instructional design, also known as instructional systems design.

Those who are called instructional designers (IDs) or instructional systems designers (ISDs) are the professionals who can make a certification program successful. Most of what we have discussed in the previous installments of this series falls under the purview of the ISD, including the occupational analysis, the task analysis, and (most importantly) the blueprinting process.

Understanding the importance of this role helps us leverage these valuable resources to their fullest. But to fully understand the role of the ISD, we must understand the various models ISDs use to design and develop curriculum, and where certification fits into each model. In this installment, I will provide an overview of each model along with some prescriptive guidance on where certification applies. To conclude, I will offer a Level 1 process model that I have used for curriculum design and development in the certification programs I have managed.

Curriculum Design and Development Models: Where certification fits in

Before looking at the curriculum development models, I need to define a concept that helps one to understand these frameworks. The term is instructional systems design (ISD). ISD is a systems approach to the development of an instructional experience (Hodell, 1999). It is a systematic way to map out the design and development of instructional content from the initial concept to delivery. ISD grew out of a need in the military and in industry to develop large quantities of instructional materials for mastery learning (Alessi & Trollip, 2001). Hodell (1999) affirms there are many models that are systems oriented, including ADDIE.

The ADDIE Model

ADDIE is a popular, generic systems process for the creation of instructional content. The acronym ADDIE represents the five steps in the process: Analysis, Design, Development, Implementation, and Evaluation. Gagne, Wager, Golas, and Keller (2005) have called ADDIE the most basic model used by ISDs.

Molenda's (2003) position is that there does not seem to be an authoritative version of ADDIE and that those who claim to use ADDIE are actually creating their own model. From his research, he contends that ADDIE was born in the U.S. military in an attempt to develop a system for creating relevant instructional content. Most instructional designers, course developers, and curriculum developers use some form of ADDIE (Hodell, 1999).

There are exceptions, however, such as Michael Allen (2012), who advocates for a model he developed, Successive Approximation, which is iterative and has three steps at its core. According to Allen, ADDIE is generally a linear approach, meaning one step must be completed before another begins. Others, such as Gagné, Wager, Golas, and Keller (2005), depict ADDIE as iterative, with all processes interacting with the evaluation step.


No matter which position one takes, ADDIE begins with the analysis process, the step that helps an instructional designer define what is needed in systematically designed training (Roberts, 2006). Roberts goes on to explain that the methods used to collect data in the analysis phase of ADDIE include interviews, questionnaires, observations, literature reviews, group discussions, and panels of experts. When an occupational analysis is conducted and completed, the resulting profile is used as a foundation for developing role-based instructional content (Roberts, 2006).

Since ADDIE is a high-level model and does not define a specific phase or step for occupational analysis, an occupational analysis may be incorporated into the analysis process, based on the instructional goal(s) and the learner profile, so that the content is applicable to the target audience. Since ADDIE's purpose is to develop relevant instructional content, it is a model that curriculum developers could use to develop role-based curriculum, which could in turn be used to train people for certification.
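The two readings of ADDIE described above, linear versus iterative, can be sketched in a few lines of code. This is an illustrative sketch only; the function names and log format are my own, not part of any standard ADDIE tooling.

```python
# Illustrative sketch: ADDIE's five phases as data, with two readings of
# the model: Allen's linear view, and the Gagne et al. iterative view in
# which evaluation interacts with every other phase.
# All names here are hypothetical.

ADDIE = ("analysis", "design", "development", "implementation", "evaluation")

def addie_linear(phases=ADDIE):
    """Linear reading: each phase completes before the next begins."""
    return list(phases)

def addie_iterative(phases=ADDIE, cycles=2):
    """Iterative reading: pair every non-evaluation phase with evaluation,
    repeating the whole cycle for a number of revision passes."""
    log = []
    for cycle in range(1, cycles + 1):
        for phase in phases[:-1]:
            log.append((cycle, phase, "evaluation"))
    return log
```

The point of the contrast: in the linear reading the five phases appear once, in order; in the iterative reading every pass through the cycle touches evaluation again.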

The Successive Approximation (SAM) Model

Allen's (2003) proposed alternative to ADDIE, Successive Approximation (SAM), was initially recommended as a model for organizations that had an integrated design and development methodology. In this format, SAM is a simple, three-step, iterative process that starts and ends with the evaluate step, followed by design and develop. The evaluate step starts the process by asking who the learners are and what in their performance needs to be modified. Implicitly, this asks: what do the learners do in their occupation?

For larger projects, Allen (2012) proposed a three-phase approach, SAM 2, which includes a preparation phase, an iterative three-step design phase, and an iterative three-step development phase. The preparation phase explores an organization's background, prior instructional attempts, and current programs being leveraged. The design phase is composed of three steps: evaluate, prototype, and design. The development phase is also composed of three steps: evaluate, develop, and implement. Allen's primary criteria for both models are that they must be iterative, support collaboration, be efficient and effective, and be manageable.

No matter which SAM approach is examined, a background or evaluative analysis is required. Despite not explicitly defining a phase for occupational analysis, it would be easy to incorporate and justify one in either the preparation phase of SAM 2 or the evaluate step of the simpler SAM. An ISD could leverage SAM or SAM 2 to develop a role-based curriculum based on an occupational profile, and the occupational analysis could be used iteratively during curriculum development for a role-based certification program.
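SAM's three-step core, as described above, can be sketched as a loop that begins and ends with the evaluate step. The function name and log format are hypothetical, shown only to make the cycle concrete.

```python
# Illustrative sketch: SAM's three-step iterative core. Per Allen's
# description, the process starts and ends with the evaluate step,
# followed in each pass by design and develop.

def sam_cycle(iterations=3):
    """Return the ordered log of steps for a run of SAM's core cycle."""
    log = []
    for i in range(1, iterations + 1):
        log.extend((i, step) for step in ("evaluate", "design", "develop"))
    log.append((iterations + 1, "evaluate"))  # the process ends with evaluation
    return log
```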

The Systematic Curriculum Instructional Development (SCID) Model

Another take on ADDIE is the model developed by Norton and Moser (2007, 2008): the Systematic Curriculum Instructional Development (SCID) model. SCID has five phases, similar to ADDIE, broken out into 23 detailed components. The five major phases are curriculum analysis, curriculum design, instructional development, training implementation, and program evaluation. Program evaluation provides ongoing feedback to each of the other four phases.

The purpose of the curriculum analysis phase is to discover a job's makeup by dissecting it into its elements. The curriculum analysis phase is composed of six major components. These six are: (1) conduct a needs analysis, (2) conduct a job analysis, (3) conduct a task verification, (4) select the tasks for training, (5) perform a task analysis, and (6) develop a competency profile. Norton and Moser clearly identify a job analysis as part of their model for curriculum and instructional development. This is a model that I would promote for those wanting to develop role-based training, such as for a certification program, since it both follows the classic, systematic ADDIE logic and clearly defines the major steps for each phase.

Rothwell and Kazanas ISD Model

Rothwell and Kazanas (2008) advocate a ten-step instructional systems design (ISD) model presented in a circular format, indicating that any of the ten steps can start the process and any step can follow. The ten steps are: (1) conduct a needs assessment, (2) assess learner characteristics, (3) analyze the work setting, (4) analyze job, task, and content, (5) develop performance objectives, (6) develop performance measurements, (7) place performance objectives in an order, (8) develop strategies for instruction, (9) design instructional content, and (10) conduct an instructional evaluation.
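The circular format means the ten steps form a ring: the process can begin at any step and proceed around the circle. A minimal sketch, with hypothetical names and abbreviated step labels of my own:

```python
# Illustrative sketch: the Rothwell and Kazanas circle as a rotatable list.
# Step labels are abbreviated paraphrases of the ten steps listed above.

ROTHWELL_KAZANAS_STEPS = [
    "needs assessment", "learner characteristics", "work setting",
    "job/task/content analysis", "performance objectives",
    "performance measurements", "objective ordering",
    "instructional strategies", "instructional content design",
    "instructional evaluation",
]

def circular_order(steps, start):
    """Rotate the ring so the cycle begins at index `start`."""
    return steps[start:] + steps[:start]
```

Starting at index 3, for example, yields a cycle that begins with the job, task, and content analysis and wraps around through the work-setting analysis.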

In Step 4, Rothwell and Kazanas include conducting a job, task, and content analysis. These researchers state that instructional designers should conduct a job analysis when the information on hand is not sufficient to inform a task analysis. When properly conducted, a job analysis will help to steer an organization and instructional designers because it examines what people do on the job. Similarly, if an organization does not have a clear picture of what an employee is supposed to do, it would be very difficult to develop training that would help achieve organizational goals and therefore certify employees.

Morrison, Ross and Kalman Model

The nine-step, flexible instructional design plan presented by Morrison, Ross, and Kalman (2013) is configured in an oval format, with the nine overlapping steps enveloped by two outer rings of eight ongoing processes. The format is oval to indicate that there is no preferred order or sequence for completing the instructional design process. The outermost ring is the planning ring, composed of an implementation process, a project management process, and a support services process. The innermost ring is the revision ring, composed of a summative evaluation process, a formative evaluation process, and a confirmative evaluation process.

The nine core steps in this model are: (1) analyze instructional problems, (2) assess learner characteristics, (3) perform a task analysis, (4) develop instructional objectives, (5) sequence instructional content, (6) identify instructional strategies, (7) design the instructional message, (8) develop instruction, and (9) evaluate instructional instruments. This is one sequence for this model's steps. Others are possible, making this a model that is quite adaptable to an organization's needs.

According to Morrison, Ross, and Kalman (2013) a job analysis is a technique that is available during the analyze instructional problems step. A job analysis is “useful for developing a curriculum for training rather than identifying performance gaps or problem areas” (p. 43). This implies that the Morrison, Ross, and Kalman model is aligned with the concepts underlying certification and could be a solution for instructional designers needing to perform a job analysis for an organization that needs role-based training and wants to certify its employees.

Keller's ARCS Model

Keller's four-category model for the motivational design of instruction, called ARCS (Gagné, Wager, Golas, & Keller, 2005; Keller, 1983, 1987, 1999, 2009), is designed to help instructional designers comprehend the factors that influence student motivation and the kinds of motivational strategies that could be used in a learning intervention. "The objective of the ARCS model is to make the theory and research in the field of motivation more easily applied in actual instruction" (Gagné, Wager, Golas, & Keller, 2005, p. 114). Keller's (2009) model is made up of four categories, which describe motivational variables and strategies, and a systematic motivational design process.

The four categories, which align with the ARCS acronym, are: (1) attention, (2) relevance, (3) confidence, and (4) satisfaction. Attention addresses how an ISD can capture a learner's attention while stimulating a desire to learn. Relevance addresses how an ISD can make an instructional experience important for the learner while meeting the learner's needs and goals. Confidence addresses how an ISD can help a student, through instruction, succeed and control their success. Satisfaction addresses how an ISD can make learning rewarding so that students want to keep learning.

Keller (2009) believes that a complete instructional experience should address all motivational categories, but accepts that some learning situations require specific motivational categories to be emphasized. If an instructional experience is relevant, achievable, and satisfying to the learner, but inherently boring, activities would have to be incorporated that address the learner's attention.

Keller's (2009) systematic motivational design process is formulated to be non-prescriptive; it is primarily an experiential process used for problem solving. It has ten steps or activities: (1) gather course information, (2) gather audience information, (3) conduct an audience analysis and develop a motivational profile, (4) evaluate existing course materials, (5) identify objectives and assessment methods, (6) identify motivational tactics that are applicable, (7) choose appropriate design tactics, (8) join motivational and instructional plans, (9) choose and develop instructional materials, and (10) gather student feedback and revise as needed.

Because Keller's model is used to enhance existing learning products, it would seem ill-suited for ISDs contributing to a certification program. However, in step three, where an audience analysis is conducted in conjunction with a motivational profile, an occupational analysis conducted under a different instructional paradigm could tangentially contribute, helping to develop role-based training that is not simply didactic but also motivational.

Alessi and Trollip Model

The model for design and development published by Alessi and Trollip (2001) has three phases and includes three processes: standards, ongoing evaluation, and project management. Their model leverages many of the best features of other ISD models for creating robust multimedia products and is designed to be flexible enough to address a wide range of subject areas. It is run like a standard learning project, with good management of time, resources, and money, and relies on continual evaluation of deliverables during their development. This standards-based model has a planning phase, a design phase, and a development phase.

Each of the three phases has numerous steps. The steps included in the planning phase are: (1) the scope of the project is defined, (2) learner characteristics are identified, (3) constraints are established, (4) the cost of the project is established, (5) a project plan is created, (6) a style manual is created, (7) resources are defined and assembled, (8) initial brainstorming is conducted, (9) the look and feel of the project is defined, and (10) the client sign-off is obtained.

The steps included in the design phase are: (1) initial content ideas are identified, (2) concept and task analyses are conducted, (3) a preliminary program description is defined, (4) a prototype or straw man is created, (5) storyboards and flowcharts are developed, (6) for multimedia deliverables, scripts are created, and (7) the client sign-off is obtained.

The steps included in the development phase are: (1) text is prepared, (2) program code is written, (3) graphic art is developed, (4) audio and video are produced, (5) components are assembled, (6) support materials are developed, (7) an alpha test is conducted, (8) revisions are made, (9) a beta test is conducted, (10) revisions are made, (11) the client sign-off is obtained, and (12) the final deliverable is validated.

At first glance, the Alessi and Trollip (2001) model does not appear to have a slot in the planning phase for an occupational analysis. One might think that the step where learner characteristics are identified could accommodate a job analysis, but in typical project terminology that step is where you are scoping the project; a job analysis would come later in the project, to address a gap identified during planning. In the design phase, however, there is a step where an occupational or job analysis is implied: if needed, the step where concept and task analyses are conducted is where an ISD could conduct a job analysis.

According to Alessi and Trollip, a concept analysis is conducted to identify the verbal information, rules, and principles that need to be included in a learner's course of study, while a task analysis is conducted to better grasp the procedural skills required of learners. To fully comprehend how a learner performs a procedure, one must also understand what they do, i.e., an analysis of their role. The implication of the concept and task analysis step in the design phase is that, if needed, a job analysis could be conducted in order to meet the client's expectations for the instructional deliverables.

This model could be used by ISDs for certification. An occupational analysis could be a method used by curriculum developers for the development of instructional content for a role to be certified. The most interesting benefit of leveraging this model with certification is that Alessi and Trollip's model is geared toward multimedia instructional materials development. As more occupations move toward this type of instructional experience, this model will become more beneficial.

Smith and Ragan's Model

The three-phase instructional design process model offered by Smith and Ragan (2005) is one of the easiest and most flexible instructional design models currently leveraged. It is depicted both as a sequential series of phases and steps and as an interwoven, nonlinear set of processes. The three major phases of the model, also "termed a common model of instructional design" (p. 10), are analysis, strategy, and evaluation. The four major steps of the analysis phase are: (1) analyzing the learning contexts, (2) analyzing the similarities and differences between learners, (3) analyzing the tasks performed by learners, and (4) developing assessment items.

The strategy phase has two primary steps: (1) determine organizational, management, and delivery strategies, and (2) create and deliver instructional content. The two primary steps of the evaluation phase are: (1) conduct formative evaluation and (2) revise instruction. In the analysis phase, it is easy to assume that an occupational analysis may precede the learner task analysis in order to identify the tasks performed by the future learners. Because Smith and Ragan's (2005) model is such a compact framework, it is adaptable to the needs of instructional designers who may be designing content for occupations that have not been systematically analyzed but need to be certified, like health care chaplains.

Dick, Carey, and Carey Model


The ten-step systems approach model for designing instruction developed by Dick, Carey, and Carey (2009) has a long history of development and adoption by instructional designers. The earliest iteration of the model goes back to 1978. In its early conceptualization, it was heavily influenced by Gagné (1985). In its current iteration, this iterative model has ten steps. The ten steps are: (1) instructional goals are identified, (2) instructional analysis is conducted, (3) learners and their contexts are analyzed, (4) performance objectives are created, (5) assessment instruments are created, (6) the instructional strategy is created, (7) instructional materials are either selected or developed, (8) a formative evaluation of instruction is designed and executed, (9) instruction is revised, and (10) a summative evaluation of instruction is designed and executed.

From a practitioner's perspective, I find this model to be quite practical and flexible. It does not follow the historical ADDIE framework, but there is a systematic flavor to it that is quite easy to digest. During step two, where an instructional analysis is conducted, "you determine step-by-step what people are doing when they perform that goal and also look at the subskills that are needed for complete mastery of the goal" (Dick, Carey, & Carey, 2009, p. 6). Also in this step, you determine the knowledge, skills, and attitudes that learners need to be successful.

Without explicitly saying so, in step two they are describing an occupational analysis, and step ten, where a summative evaluation of instruction is executed, is in essence the administration of the certification exam for mastery. Because of the common-sense approach used in this model, it would be quite easy to adapt it for the creation of instruction for a certification program.

I have looked at a number of major models used for curriculum design and development which could potentially inform how curriculum could be developed for teaching those in a role to be certified. Several had a clearly defined process calling explicitly or implicitly for an occupational or job analysis. Other models only implied that an occupational analysis was part of their schema. Each model was flexible enough to be helpful to skilled instructional designers tasked with crafting curriculum for certifying a role.

My Model

Having examined a number of the most popular models, I came to the realization that only one or two had everything an ISD needs to craft curriculum for a certification program. I therefore reviewed the aforementioned models and developed the following 18-step process model (Figure 1). As you will note, the majority of the work falls on the shoulders of the ISD. The steps are similar to those used in the ADDIE process but more detailed and geared toward certification. The steps of my ISD model are:

Analysis/Assessment
1. Conduct a needs assessment where the role is identified
2. Conduct a learner assessment to determine what the learners need in their work environment
3. Conduct an occupational analysis to identify the duties and tasks of the role
4. Conduct a task verification to rate the tasks in importance, difficulty, etc.
5. Conduct a task selection to prioritize tasks for curriculum development
6. Conduct a task analysis to document steps required for each task

Design
1. Create a competency profile where tasks are grouped to form competencies
2. Develop learning and performance objectives using a blueprint instrument
3. Develop performance metrics to include the standards and rubrics
4. Define instructional strategies, e.g., ILT, VILT, eLearning, etc.
5. Finalize learning blueprint

Develop
1. Develop instructional materials, e.g., guides, job aids, PowerPoint decks, visuals, and handouts
2. Develop competency-based assessments, e.g., quizzes, exams, and certification tests
3. Import content into learning management system

Deliver
1. Deliver training, e.g., ILT, VILT, PBT, OJT, and eLearning
2. Deliver certification assessments

Evaluate
1. Evaluate learning interventions and assessments using Level 1 surveys and psychometric review
2. Monitor and revise learning and assessments as needed.


Figure 1: Level 1 Process Model for Instructional Systems Design (ISD)
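The 18 steps above group naturally by phase, so a team could capture the model as a plain data structure and track progress per step. The phase and step names follow the article; the dict layout itself is just one possible encoding, not part of the model.

```python
# The 18-step certification ISD model as a plain data structure.
# Phase names and abbreviated step labels follow the article's Figure 1.

CERT_ISD_MODEL = {
    "Analysis/Assessment": [
        "Conduct a needs assessment where the role is identified",
        "Conduct a learner assessment",
        "Conduct an occupational analysis",
        "Conduct a task verification",
        "Conduct a task selection",
        "Conduct a task analysis",
    ],
    "Design": [
        "Create a competency profile",
        "Develop learning and performance objectives",
        "Develop performance metrics",
        "Define instructional strategies",
        "Finalize learning blueprint",
    ],
    "Develop": [
        "Develop instructional materials",
        "Develop competency-based assessments",
        "Import content into learning management system",
    ],
    "Deliver": [
        "Deliver training",
        "Deliver certification assessments",
    ],
    "Evaluate": [
        "Evaluate learning interventions and assessments",
        "Monitor and revise as needed",
    ],
}

# Sanity check: the five phases should total 18 steps.
total_steps = sum(len(steps) for steps in CERT_ISD_MODEL.values())
```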

Now that we have looked at the lion's share of the workload, as done by the ISD, in our next installment we will look at the processes involved in creating targeted certification assessments.



REFERENCES

1. Allen, M. (2012). Leaving Addie for SAM: An Agile Model for Developing the Best Learning Experiences. Alexandria, VA: ASTD Press.

2. Allen, M. W. (2003). Michael Allen's Guide To E-Learning: Building Interactive, Fun, and Effective Learning Programs for Any Company. Hoboken, N.J.: John Wiley.

3. Alessi, S. M., & Trollip, S. R. (2001). Multimedia for Learning: Methods and Development (3rd ed.). Boston: Allyn and Bacon.

4. Brown, A., & Green, T. D. (2011). The Essentials of Instructional Design: Connecting Fundamental Principles with Process and Practice (2nd ed.). Boston: Prentice Hall.

5. Dick, W., Carey, L., & Carey, J. O. (2009). The Systematic Design of Instruction (7th ed.). Upper Saddle River, N.J.: Merrill/Pearson.

6. Gagné, R. M. (1985). The Conditions of Learning and Theory of Instruction (4th ed.). New York: Holt, Rinehart and Winston.

7. Gagné, R. M., Wager, W. W., Golas, K. C., & Keller, J. M. (2005). Principles of Instructional Design (5th ed.). Belmont, CA: Thomson/Wadsworth.

8. Hodell, C. (1999). Basics of instructional systems design. ASTD Info-line.

9. Keller, J. M. (1983). Motivational design of instruction. In C. M. Reigeluth (Ed.), Instructional-design Theories and Models: An Overview of Their Current Status (Vol. 1, pp. 383-434). Hillsdale, N.J.: Lawrence Erlbaum Associates.

10. Keller, J. M. (1987). The systematic process of motivational design. Performance and Instruction, 26(9-10), 1-8.

11. Keller, J. M. (1999). Using the ARCS motivational process in computer-based instruction and distance education. New Directions for Teaching & Learning, 1999(78), 39-47.

12. Keller, J. M. (2009). Motivational Design for Learning and Performance: The ARCS Model Approach (1st ed.). New York: Springer.

13. Morrison, G. R., Ross, S. M., Kalman, H. K., & Kemp, J. E. (2013). Designing Effective Instruction (7th ed.). Hoboken, NJ: Wiley.

14. Norton, R. E., & Moser, J. (2007). SCID Handbook (7th ed.). Columbus, OH: Center on Education and Training for Employment, The Ohio State University.

15. Rothwell, W. J. (2002). The Workplace Learner: How to Align Training Initiatives with Individual Learning Competencies. New York: American Management Association.

16. Rothwell, W. J., & Kazanas, H. C. (1988). Curriculum planning for training: The state of the art. Performance Improvement Quarterly, 1(3), 2-16.

17. Rothwell, W. J., & Kazanas, H. C. (2008). Mastering the Instructional Design Process: A Systematic Approach (4th ed.). San Francisco, CA: Pfeiffer.

18. Rothwell, W. J., & Sredl, H. J. (1992). The ASTD Reference Guide to Professional Human Resource Development Roles and Competencies (2nd ed.). Amherst, MA: HRD Press.

19. Smith, P. L., & Ragan, T. J. (2005). Instructional Design (3rd ed.). Hoboken, N.J.: J. Wiley & Sons.

20. Wyrostek, W., & Downey, S. (2016). Compatibility of common instructional models with the DACUM process. Adult Learning, 0(0), 1-7, doi:10.1177/1045159516669702

About the Author

Warren E. Wyrostek is a Solutions Oriented Educator and Leader, a Certified Trainer and Facilitator, an Experienced DACUM Enthusiast, and an innovative Certification and Assessment expert in demand. Warren holds a Doctorate in Education in Curriculum and Instruction. Currently Warren is an Adjunct at Valdosta State University and the owner of 3WsConsulting - Providing Efficient And Effective Top To Bottom Solutions To Learning Issues. Warren can be reached at wyrostekw@msn.com.
