

Adaptive Training for Combat Information Centers, Office of Naval Research

The goal of the Adaptive Training for Combat Information Centers (ATCIC) program was to develop methods, processes, and software components for a training system that adapts to the trainee's strengths and weaknesses, diagnosing training deficiencies and providing remediation through scenario interventions or feedback to the trainee or instructor.




Outcomes
To provide a testbed for this research, CRESST and its partner, the University of Southern California Center for Cognitive Technology (USC/CCT), developed an assessment system combining two products: the Tactical Action Officer (TAO) Sandbox, a simulation developed by USC/CCT that provides practice in planning for surface fleet TAOs, and the CRESST Assessment Application (CAA). The CAA uses Bayesian networks to produce real-time formative assessment information as well as summative after-action assessment information. It interacts with the Sandbox in real time as the student runs the scenario: the Sandbox sends reports of observable actions and events to the CAA, which uses the Bayesian network to assess the status of the student's knowledge and skill and sends the assessment back to the Sandbox. The Sandbox then adapts the scenario to meet the student's needs, e.g., by providing performance feedback, additional practice, or instructional resources; adding or changing practice tasks; and adding or changing affordances (resources and capabilities available to the student).
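The assess-and-adapt loop described above can be sketched in miniature. The code below is a hypothetical illustration, not the actual CAA: it models a single binary skill, updates the probability of mastery from each reported action using Bayes' rule (a one-node stand-in for a full Bayesian network), and maps the resulting estimate to a scenario adaptation. All class names, probabilities, and thresholds are invented for the example.

```python
class SkillNode:
    """One binary skill, updated from observed actions via Bayes' rule.

    Illustrative stand-in for a node in a larger Bayesian network;
    the conditional probabilities below are invented.
    """

    def __init__(self, prior=0.5):
        self.p_mastered = prior  # current P(skill mastered)

    def observe(self, action_correct,
                p_correct_if_mastered=0.9, p_correct_if_not=0.2):
        """Update P(mastered) given one reported observable action."""
        if action_correct:
            like_m, like_n = p_correct_if_mastered, p_correct_if_not
        else:
            like_m, like_n = 1 - p_correct_if_mastered, 1 - p_correct_if_not
        num = like_m * self.p_mastered
        den = num + like_n * (1 - self.p_mastered)
        self.p_mastered = num / den


def choose_adaptation(p_mastered):
    """Map the assessed skill state to a scenario intervention (toy rules)."""
    if p_mastered < 0.3:
        return "provide instructional resources"
    if p_mastered < 0.7:
        return "add practice tasks"
    return "increase scenario difficulty"


# Simulated stream of action reports from the scenario (True = correct action)
skill = SkillNode()
for event in [True, False, True, True]:
    skill.observe(event)
print(f"P(mastered)={skill.p_mastered:.3f} -> {choose_adaptation(skill.p_mastered)}")
```

In the real system the network would relate many skills to many observables; the point here is only the shape of the loop: observe, update the belief state, adapt.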

For more information please contact:
Bill Bewley, Ph.D., Assistant Director of Technology
Phone: 310-825-7995
Email: bewley@cse.ucla.edu


Training Models and Tools for Adaptive Learning, Office of Naval Research

The Navy Surface Warfare Officers School (SWOS) is the schoolhouse responsible for training Tactical Action Officers (TAOs). The TAO is responsible for tactical employment and defense of the ship. He or she manages use of the ship's weapons and sensors, directs the movements of the ship, and monitors the movements and actions of friendly and enemy ships, planes, missiles, and submarines in the region. The TAO integrates this information to form a tactical picture of the situation, selects appropriate responses, issues orders, and informs the commanding officer of actions and intentions. An important part of SWOS's TAO training is practice and testing in a simulation facility called the Multi-Mission Team Trainer (MMTT). TAO performance assessment is based in part on whether or not certain actions occurred, e.g., ordering queries and warnings, sending an airplane to visually identify a suspected hostile track, or defending against a threatening hostile track. In addition to these actions, SWOS was concerned with measuring the TAO's cognitive readiness – the thinking behind the actions.






Outcomes
CRESST developed an assessment system providing a series of screens mapped to the events of the MMTT scenario. Each screen contains items (statements or questions) linked to several descriptions of student actions or responses, e.g., What is this track? How do you know? What are your expectations regarding this track? Student responses are linked to four constructs identified as elements of TAO performance: comprehension of the meaning of the situation, predicting how the situation may change, implementing a plan of action, and communicating with other watchstanders and superiors. Results of a confirmatory factor analysis provide preliminary evidence that MMTT scenarios are valid measures of the constructs.

Publications
Bewley, W. L., Lee, J. J., Jones, B., & Cai, H. (in press). Assessing cognitive readiness in a simulation-based training environment. In H. F. O'Neil, R. S. Perez, & E. L. Baker (Eds.), Teaching and measuring cognitive readiness. New York, NY: Springer.

Lee, J. J., Bewley, W. L., Jones, B., Min, H., & Kang, T. (2009, December). Assessing performance in a simulated combat information center. Proceedings of the Interservice/Industry Training, Simulation and Education Conference, Orlando, FL.

For more information please contact:
John Lee, Ph.D., Senior Researcher
Phone: 310-794-9155
Email: johnjn@ucla.edu


Training Models and Tools for Adaptive Learning, Office of Naval Research

In support of Surface Warfare Officers School (SWOS) efforts to continuously improve its curriculum, CRESST has assisted SWOS in developing questionnaires of student opinion on the quality of the Department Head course, the Senior Officer Ship Material Readiness Course, and the Prospective Commanding Officer course. Also, in order to revise the Department Head curriculum to help develop improved Combat Information Center (CIC) leadership and tactical skills, CRESST and SWOS developed a questionnaire of Commanding Officer ratings of Tactical Action Officers during their first assignment following the Department Head course.





For more information please contact:
John Lee, Ph.D., Senior Researcher
Phone: 310-794-9155
Email: johnjn@ucla.edu


Adaptive Training for Combat Information Centers, Office of Naval Research

The AEGIS Training and Readiness Center (ATRC) trains personnel in the operation of the AEGIS combat system on the Navy's AEGIS cruisers and destroyers. Prior to development of the ATRC Assessment System, instructors used hardcopy check sheets to record student performance as students played the roles of Combat Information Center (CIC) watchstanders, including the Tactical Action Officer (TAO) and Anti-Air Warfare Commander (AAWC). After a training session, instructors manually entered data from the check sheets into a database. To semi-automate data collection, analysis, and reporting, CRESST developed a tablet PC-based performance assessment data entry system providing a series of screens mapped to scenario events. Each screen contains performance statements linked to Navy tactical tasks. At the end of the scenario, the data are synchronized with the Instructor Proficiency Analysis Tool (IPAT) on ATRC's server. The IPAT, developed by SAIC for ATRC, provides further reporting and analysis capabilities that deliver feedback to CIC teams across the Navy's training sites.





For more information please contact:
John Lee, Ph.D., Senior Researcher
Phone: 310-794-9155
Email: johnjn@ucla.edu


Training Models and Tools for Adaptive Learning, Office of Naval Research

To support the Navy Engineering Duty Officer (EDO) School's analyses of risk management problems, CRESST developed a series of applications that implement a multi-attribute model of utility estimation for decision making. For a given type of decision, users can define the attributes that determine the utility of outcomes and weight each attribute according to its contribution to overall utility. For each event that might result from a decision, they can estimate the probability of each possible outcome and the attribute values for each of those outcomes.
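The calculation this model performs can be sketched as a probability-weighted, multi-attribute expected utility. The code below is an illustrative reconstruction, not the EDO School applications themselves; the attributes, weights, and probabilities are invented for a toy version of the restaurant-choice decision mentioned among the examples.

```python
def expected_utility(weights, outcomes):
    """Probability-weighted multi-attribute utility of one decision option.

    weights:  {attribute: weight}, weights summing to 1
    outcomes: list of (probability, {attribute: value on a 0-1 scale})
    """
    eu = 0.0
    for prob, values in outcomes:
        utility = sum(weights[a] * values[a] for a in weights)
        eu += prob * utility
    return eu


# Invented attributes and weights for a "choosing a restaurant" decision
weights = {"food": 0.5, "price": 0.3, "wait": 0.2}

# Each option maps to its possible outcomes: (probability, attribute values).
# The bistro's wait time is uncertain; the diner's outcome is certain.
options = {
    "bistro": [
        (0.7, {"food": 0.9, "price": 0.4, "wait": 0.6}),
        (0.3, {"food": 0.9, "price": 0.4, "wait": 0.1}),
    ],
    "diner": [
        (1.0, {"food": 0.6, "price": 0.9, "wait": 0.9}),
    ],
}

best = max(options, key=lambda o: expected_utility(weights, options[o]))
print(f"best option: {best}")
```

Swapping in procurement-scale attributes and event probabilities gives the same computation the more complex examples would require; only the inputs grow.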

Outcomes
Three specific examples of the application of this tool to decisions discussed in the EDO Basic Course were developed. In order of increasing complexity, these applications are:

1. Choosing a restaurant;
2. Selecting a digital camera; and
3. Handling a Refueling at Sea System procurement problem, shown in the figure below.

In addition, an 'empty' version of the decision aid tool was developed for creating new decision trees from scratch.



Related Publications

Vendlinski, T. P., Munro, A., Pizzini, Q. A., Bewley, W. L., Chung, G. K. W. K., Stuart, G., & Delacruz, G. C. (2004, December). Learning complex cognitive skills with an interactive job aid. Proceedings of the Interservice/Industry Training, Simulation and Education Conference, Orlando, FL.


Vendlinski, T. P., Munro, A., Pizzini, Q. A., Bewley, W. L., Chung, G. K. W. K., Stuart, G., & Delacruz, G. C. (2006). Learning complex cognitive skills with an interactive job aid (CSE Tech. Rep. No. 694). Los Angeles: University of California, National Center for Research on Evaluation, Standards, and Student Testing.

For more information please contact:
Bill Bewley, Ph.D., Assistant Director of Technology
Phone: 310-825-7995
Email: bewley@cse.ucla.edu


Training Models and Tools for Adaptive Learning, Office of Naval Research

CRESST developed a web-based tool to be used for designing and evaluating distance learning courseware. The goal was to link research on learning to the task of courseware formative evaluation. Development was grounded in research-based design guidelines (O'Neil, 2005) that address areas critical to effective instruction, including instructional, multimedia, and assessment strategies.





Outcomes
Selected guidelines were transformed into a framework of questions to be answered by courseware evaluators. The framework was implemented in the tool with a rating scale and a facility for recording each rater's rationale for the assigned score, and was tested for usability and rater reliability.
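The description does not say which reliability statistic was used; as an illustration only, the sketch below computes Cohen's kappa, a common chance-corrected agreement measure for two raters on a categorical scale. The ratings are invented.

```python
from collections import Counter


def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(ratings_a)
    # Observed proportion of exact agreement
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Agreement expected by chance, from each rater's marginal distribution
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)


# Hypothetical ratings on a 1-4 scale for ten courseware guideline items
rater1 = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater2 = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]
print(f"kappa = {cohens_kappa(rater1, rater2):.3f}")
```

Values near 1 indicate agreement well beyond chance; values near 0 indicate agreement no better than chance.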

Related Publications

O'Neil, H. F. (Ed.). (2005). What works in distance learning: Guidelines. Greenwich, CT: Information Age Publishing.

Sylvester, R., O'Neil, H. F., & Bewley, W. L. (2005, April). A tool for applying research-based guidelines to courseware evaluation. Paper presented at the annual meeting of the American Educational Research Association, Montreal, Canada.



For more information please contact:
Bill Bewley, Ph.D., Assistant Director of Technology
Phone: 310-825-7995
Email: bewley@cse.ucla.edu