An introductory ML course for non-technical Navy learners, built on learn-by-doing theory and real-world scenarios.
Interactive Screens
400+
Learning Results↑
34%
Company
CMU TEELLAB
My Role
Learning Engineer
Timeline
4 months, Spring 2024
Team
2 Learning Engineers, 1 UX Designer
Context
AI is widely used in the Navy, in applications such as autonomous vehicle operation, intelligent decision support systems, control systems, and threat detection.
Yet many personnel lack basic AI knowledge, limiting their ability to leverage these technologies.
That's why we need to ask:
How might we create an intuitive ML intro course tailored for Navy AI users?
Analyze
AI applications in Navy settings
To understand how AI is applied in Navy settings, we consulted subject matter experts about common application scenarios. Here are the three major AI applications we identified:
Threat Detection and Assessment
Logistics and Maintenance
Intelligent Decision Support
Challenges for AI learners
We also needed to understand the biggest challenges facing first-time AI learners, so we conducted user interviews with 3 non-technical learners. Here are the insights we extracted from the interviews:
#1 Abstract Data Types: Learners struggle to see the role of data types in preprocessing and models.
#2 Complex Terms: Jargon like "regression" and "neural networks" confuses beginners.
#3 Training vs. Testing: Understanding how models learn and generalize is challenging.
Design
Map out the learning plan
Objectives
Working collaboratively with the subject matter experts, we settled on five final learning objectives targeting the three challenges identified above.
#LO1 Differentiate among common types of data (e.g., text, images, tables, audio, video), and contrast their properties.
#LO2 Contrast inputs and outputs of various ML models and their uses.
#LO3 Explain the process of training and evaluation of ML models, focusing on the role of gold standard data (ground truth).
#LO4 Demonstrate how real-world objects or concepts can be represented as numerical features (i.e., inputs to an ML model).
#LO5 Analyze the effects of using different inputs (features) or outputs (targets) on the usability of ML systems.
Activities
We mapped the learning objectives onto three learning tasks and one transfer task, then decomposed each task into detailed learning and practice events.
Instruction
To create an engaging and immersive learning experience, we designed a chat-based instruction system set in a real-world scenario. This approach allows learners to immerse themselves in the context while exploring concepts and solving problems interactively.
Develop
Building interactive activities for the learning objectives
Task 1: Identify and Represent Data Types
Task 1 focuses on gathering diverse plane data, cleaning it, and transforming it into numerical formats—laying the groundwork for training an ML model to power predictive maintenance and keep planes in optimal condition.
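For illustration, here is a minimal Python sketch of the kind of conversion Task 1 teaches: turning categorical maintenance records into numeric features. The column names and values are hypothetical, not taken from the actual course materials.

```python
import pandas as pd

# Hypothetical plane maintenance records (columns and values are illustrative only)
records = pd.DataFrame({
    "engine_type": ["turbofan", "turboprop", "turbofan"],
    "flight_hours": [1200, 340, 890],
    "last_inspection": ["pass", "fail", "pass"],
})

# One-hot encode the categorical columns so every feature is numeric,
# the form an ML model expects as input
numeric_features = pd.get_dummies(records, columns=["engine_type", "last_inspection"])
print(numeric_features)
```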
Task 2: Evaluate Outputs of AI Models
Task 2 has learners experiment with diverse data types, feeding them into four distinct ML models. These models produce varied outputs, driving improved efficiency in detection and prediction tasks.
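As an illustration of what "varied outputs" means in practice, the sketch below contrasts a regression model (continuous output) with a classification model (discrete label) on the same toy features. The features and labels are invented for the example and do not come from the course.

```python
from sklearn.linear_model import LinearRegression, LogisticRegression

# Toy numeric features: [flight_hours, engine_temperature]
X = [[100, 650], [900, 720], [1500, 780], [2200, 810]]

# A regression model outputs a continuous number (e.g., days until next fault)...
reg = LinearRegression().fit(X, [400, 180, 90, 30])
print(reg.predict([[1200, 760]]))   # -> a single continuous value

# ...while a classification model outputs a discrete label (e.g., "ok" vs "needs_service")
clf = LogisticRegression().fit(X, ["ok", "ok", "needs_service", "needs_service"])
print(clf.predict([[1200, 760]]))   # -> a category label
```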
Task 3: Train and Evaluate AI Model
Task 3 involves selecting inputs and applying feature engineering to predict an engine's remaining useful life using regression. This task navigates the full ML model-building process, fine-tuning performance through iterative comparisons with gold standard data.
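A minimal sketch of this train-then-evaluate loop, assuming synthetic engine features and remaining-useful-life labels (the real course data and model are not shown here): hold out test data, fit a regression model, and compare its predictions against the held-out gold-standard values.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Synthetic engine data: features are flight_hours and max_temperature,
# labels are remaining useful life in hours (all made up for illustration)
rng = np.random.default_rng(0)
X = rng.uniform([0, 600], [3000, 850], size=(200, 2))
y = 3500 - X[:, 0] - 2 * (X[:, 1] - 600) + rng.normal(0, 50, 200)

# Hold out test data so evaluation compares predictions against unseen gold-standard labels
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LinearRegression().fit(X_train, y_train)
print("MAE vs. gold standard:", mean_absolute_error(y_test, model.predict(X_test)))
```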
Transfer Task: Apply Your Knowledge
The transfer task positions a data analyst within the Navy’s data science team to develop an ML model from scratch. This image classification model, built through five stages of model building, aims to classify ship types and enhance rapid response to potential threats.
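The five stages are not named in this write-up, so the sketch below assumes a typical pipeline: collect data, represent it numerically, split into training and test sets, train, and evaluate against gold-standard labels. The ship images and labels here are synthetic placeholders, not the course's dataset or model.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Stage 1 - collect data: stand-in 32x32 grayscale "ship images" and labels (synthetic here)
rng = np.random.default_rng(1)
images = rng.random((300, 32, 32))
labels = rng.choice(["cargo", "destroyer", "patrol"], size=300)

# Stage 2 - represent images as numerical features by flattening pixels into vectors
X = images.reshape(len(images), -1)

# Stage 3 - split into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.2, random_state=1)

# Stage 4 - train a classifier
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Stage 5 - evaluate against held-out gold-standard labels
print("Accuracy:", accuracy_score(y_test, model.predict(X_test)))
```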
Develop
Add a Fun Task to Boost Learners’ Motivation
To introduce the airplane maintenance scenario, a motivational task is placed at the beginning of the course. This helps boost learners' familiarity and engagement with the course setting.
Implement
Coordinate with the dev team for smooth implementation
We used Notion to track tasks and ownership. I created 30+ QA cards to flag usability issues, collaborated with the dev team, and ensured all changes were seamlessly implemented.
Evaluation
Pilot test for learning effects and usability issues
We invited five learners to pilot test the course, which took them an average of one hour to complete. The results showed a 34% improvement in their post-learning test scores compared to their pre-learning test scores.
"Very informative content. I also enjoyed the interspersed activities."
-- Learner 1
"Showing exactly how categorical data is converted into numerical data in pre-processing was helpful."
-- Learner 2
Reflection
Interaction adds to engagement
This project highlighted the power of interactive learning. Animations and drag-and-drop features captivated learners, keeping them engaged and boosting their outcomes—a reminder of how design transforms learning.
Collaboration makes things happen
With 100+ touchpoints across engineers, SMEs, designers, and developers, this project showed the power of collaboration. Each role was key to turning big ideas into impactful solutions.