
Problems I solve

with embedded evidence

My work spans many roles and titles — instructional designer, curriculum developer, learning consultant, assessment developer — but the work itself consistently centers on solving specific learning and performance problems.

 

Below are representative problems I’ve been brought in to solve, with embedded examples of how that work shows up in practice. In some cases, deeper samples are available; in others, the experience itself is the evidence.

What I do: an overview in 15 seconds

  • Improve sales, service, and operational performance

  • Design performance-based learning and simulations

  • Build valid, defensible assessment systems

  • Translate complex subject matter into usable learning

  • Align learning to real-world outcomes and standards

  • Consult on scalable learning strategy and frameworks

Problems I solve

The work I do is defined less by job titles and more by the problems organizations need solved. These are the challenges I’m most often brought in to address, and each reflects ongoing work rather than one-off projects.

Sales, service, or operational performance is stalling.

I design practice-based learning and feedback systems that improve real-world decision-making, confidence, and performance at the point of work.

Certification or exam outcomes are low or falling. 

I align instruction, assessment, and feedback to tested standards, closing gaps between what is taught and what is measured.

Learners are recalling information but not performing complex skills.

I create simulations, scenarios, and performance assessments that verify competence in realistic conditions.

No one seems to understand complex or emerging topics that are critical for operational success.

I translate advanced or high-stakes subject matter into clear, actionable learning that supports informed decisions and application.

This organization needs learning strategy, not more courses.

I help teams clarify goals and design coherent learning frameworks that scale, sustain, and evolve over time.

Assessment systems aren't measuring real competence.

I design evidence-centered assessment models — including rubrics, scoring logic, and revision cycles — that produce defensible, actionable data.

The sections below provide a deeper look at how I approached these challenges and the results I produced.

Problem

Sales, service, or operational performance is stalling.

What was at stake

Organizations were investing in training, but front-line performance wasn't improving. Sales conversations felt scripted, customer interactions were inconsistent, and teams struggled to apply product or system knowledge in real situations. The problem wasn't a lack of information; the risk was missed revenue, lower customer satisfaction, and slow ramp-up for new and transitioning employees.

My approach

I focused on closing the gap between knowing and doing. Rather than leading with content delivery, I analyzed real workflows, common decision points, and moments where performance broke down. From there, I designed learning experiences that emphasized realistic scenarios, guided practice, and immediate feedback — helping learners build confidence and judgment in situations they actually encounter on the job.

 

My role typically included needs analysis, learning and assessment design, scenario development, stakeholder collaboration, and iterative refinement based on performance data.

What I delivered

Depending on the context, this work included:

  • Sales enablement and service curricula structured around real customer interactions

  • Scenario-based learning modules that required learners to make decisions, respond to objections, or navigate systems

  • Embedded practice and feedback loops to reinforce effective behaviors

  • Short, targeted lessons designed to fit operational realities and reduce cognitive load

  • Manager-aligned reinforcement tools to support transfer to the workplace

The emphasis was always on practical application, not content exposure.

Impact

Across multiple engagements, this approach led to measurable performance improvements, including:

  • Increased prospect-to-sale conversion following redesigned sales enablement programs

  • Higher customer satisfaction after simplifying and restructuring software training

  • Faster ramp-up and greater consistency among new and upskilling employees

  • Improved confidence and decision-making in customer-facing roles

In one national retail context, managers reported a 13% increase in prospect-to-sales transitions within three months of deploying the redesigned learning.

Selected evidence

Sales enablement learning module

Performance-based sales onboarding and upskilling.

Problem

Learners are recalling information but not performing complex skills.

What was at stake

In technical, safety-critical, or workforce development contexts, success depends on whether learners can do the work, not whether they can recall steps or definitions. Traditional training methods struggled to verify real-world readiness, leaving employers unsure of skill levels and learners underprepared for on-the-job demands.

My approach

I designed learning and assessment experiences that make performance observable. Rather than separating instruction from evaluation, I embedded assessment directly into realistic practice environments. This allowed learners to demonstrate competence in context while providing organizations with clearer evidence of readiness.

My work emphasized authentic scenarios, defined performance criteria, and structured feedback — ensuring that learning experiences reflected the complexity, sequencing, and decision-making required in real roles.

What I delivered

This work included:

  • Simulation-based learning experiences, including virtual reality environments, to support hands-on skill development

  • Performance tasks and decision-based scenarios that mirrored real operational conditions

  • Embedded assessment and scoring logic aligned to defined competencies

  • Rubrics and evaluation frameworks to support consistent measurement of performance

  • Documentation and guidance to help stakeholders interpret results and support learners’ progression

The focus was on validating skill, not just tracking completion.

Impact

These approaches led to outcomes that extended beyond the training environment:

  • Employers reported reduced onboarding time for new hires trained through simulation-based programs

  • Learners entered roles with greater confidence and job readiness

  • Workforce development initiatives demonstrated measurable regional impact, including improved employment outcomes

  • Organizations gained clearer insight into skill gaps and training effectiveness

In one community-college-led workforce initiative, local manufacturing employers reported that new employees required significantly less onboarding, while unemployment rates across several affected counties declined over time.

Selected evidence

Technical skills training via Virtual Reality

Simulation-based learning and assessment for workforce development.


Course sample: a video demonstrating the learner's point of view in the virtual reality space.


Course storyboard: showcases the instructional and learning design work that happens before involving Unity engineers.

Problem

Certification or exam outcomes are low or falling. 


What was at stake

In technical, safety-critical, or skills-based contexts, traditional instruction wasn’t enough. Learners needed to perform procedures correctly, make decisions under constraints, and demonstrate readiness, often before stepping into real-world environments. The risk of getting this wrong included safety issues, extended onboarding, inconsistent performance, and reduced employer confidence in training outcomes.

My approach

I aligned instruction, assessment, and feedback to the tested standards, closing the gaps between what was taught and what was measured. Rather than adding more content, I focused on where instruction and practice had drifted from what the exams actually assessed.

My role typically included needs analysis, learning and assessment design, stakeholder collaboration, and iterative refinement based on performance data.

Selected evidence

Customer service software training

Short-form, role-aligned system training for office staff.

Problem

No one seems to understand complex or emerging topics that are critical for operational success.


What was at stake

Organizations and professional audiences were grappling with complex, fast-evolving subject matter that was often technical, abstract, or high-stakes in nature. While expertise existed, it wasn’t accessible. Learners struggled to understand implications, apply concepts, or make informed decisions, limiting adoption and impact.


In these contexts, poor learning design didn’t just mean confusion. It meant missed opportunities, stalled innovation, and risk-averse decision-making.

My approach

I focused on translation, not simplification. My approach started by identifying what learners actually needed to understand, decide, or do with the information, then designing learning experiences that surfaced meaning, relevance, and consequences.


Rather than overwhelming learners with theory or jargon, I used structured narratives, case-based learning, and guided interpretation to help audiences connect complex ideas to real-world decisions. My role typically included content strategy, curriculum design, SME collaboration, and learning experience design.

What I delivered

This work included:

  • Case-based learning experiences grounded in real-world scenarios

  • Conceptual frameworks that organized complex information into usable mental models

  • Structured learning narratives that guided learners from context to application

  • Facilitated discussion prompts and reflection activities for expert audiences

  • Learning assets designed for professional or executive settings, not novice training

The goal was not mastery of every detail, but meaningful understanding and informed action.

Impact

These experiences helped organizations and professionals move from awareness to action:

  • Increased engagement with complex or emerging topics

  • Strong follow-on interest and adoption beyond the original learning event

  • More informed conversations, planning, and experimentation

 

In one national healthcare context, a case-based learning experience on machine learning in healthcare was presented to over 1,000 professionals. Within a year, 22 major and regional hospitals had initiated follow-on work with data scientists to explore applications related to cost reduction, process efficiency, and patient outcomes.

Selected evidence

Online Case Study: Machine Learning in 21st-Century Healthcare
Case-based learning for professional and executive audiences.

Problem

This organization needs learning strategy, not more courses. 

What was at stake


In several engagements, the challenge wasn’t a lack of content. Organizations already had courses, materials, and ideas, but no coherent structure to guide decisions about what to build, why, or how it all fit together. As a result, learning efforts were fragmented, difficult to scale, and hard to evaluate for impact.


The real risk was continued investment in training that looked productive on the surface but failed to drive clarity, consistency, or measurable outcomes.

My approach

I started by stepping back from solutions and focusing on intent and alignment. My approach was to understand the organization’s goals, audiences, constraints, and success criteria, and then design a conceptual learning framework that could guide decisions over time.


Rather than prescribing individual courses, I worked at the system level: defining learning goals, identifying meaningful progression, clarifying where assessment or practice mattered most, and creating a shared mental model for stakeholders. My role typically included discovery, synthesis, conceptual design, and early-stage curriculum architecture.

What I delivered

This work included:

  • High-level learning frameworks that clarified purpose, scope, and priorities

  • Conceptual models linking goals, audiences, content, and assessment

  • Learning pathways or architectures showing how pieces fit together over time

  • Design principles and guardrails to support consistency and scalability

  • Initial curriculum outlines or exemplars to demonstrate how the strategy translated into practice

These deliverables gave teams a way to move forward with confidence, even as needs evolved.

Impact

By shifting the focus from individual courses to learning systems, organizations could:

  • Make clearer, faster decisions about what to build and what to defer

  • Reduce duplication and misalignment across learning initiatives

  • Create shared understanding among stakeholders with different perspectives

  • Lay the groundwork for scalable, sustainable learning solutions

 

In consulting contexts, this work often became the foundation for subsequent design, development, and assessment efforts, ensuring that future investments were intentional rather than reactive.

Selected evidence

Learning strategy and conceptual framework

Consultative engagement focused on translating "How should we train all this?" into a coherent, theory-informed learning structure.


Analysis of learning needs: my analysis of the existing training framework and the architecture of a new one that fit the client's needs.


High-level outline: two lessons outlined according to the framework I designed, with the scalable template at the end.

Problem

Assessment systems aren't measuring real competence.  

What was at stake


In several organizations, assessments existed, but they weren’t doing their job. Results were noisy, misleading, or disconnected from real-world performance. Learners could pass without being ready, struggle despite strong understanding, or receive feedback that didn’t meaningfully guide improvement.


The risk was significant: poor decision-making based on unreliable data, reduced confidence in training programs, and missed opportunities to use assessment as a lever for learning and performance.

My approach

I approached assessment as an evidence problem, not a content problem. My work began by clarifying what competent performance actually looked like and identifying the observable behaviors that would justify claims of proficiency.


From there, I designed assessment systems that aligned tasks, scoring, and feedback to the intended construct. This often meant rethinking item design, rubric structure, scoring logic, and revision cycles—ensuring that assessments measured what mattered and produced data that stakeholders could trust.


My role typically spanned assessment design, rubric and scoring model development, SME calibration, and data-informed iteration.
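
As a simplified illustration of what aligned scoring logic can look like, the sketch below scores one observed performance against a weighted rubric and applies a readiness cutoff. The criteria, weights, and cutoff are hypothetical, included only to make the idea concrete; in practice, the models were tailored to each construct and calibrated with SMEs.

```python
# Hypothetical example of weighted rubric scoring: the criteria, weights, and
# readiness cutoff below are illustrative, not taken from any client system.

from dataclasses import dataclass

@dataclass
class Criterion:
    name: str        # observable behavior a rater judges
    weight: float    # relative importance in the overall score
    max_level: int   # top rubric level (e.g., 4 = exemplary)

RUBRIC = [
    Criterion("identifies_customer_need", weight=0.4, max_level=4),
    Criterion("selects_correct_procedure", weight=0.4, max_level=4),
    Criterion("documents_decision", weight=0.2, max_level=4),
]

READINESS_CUTOFF = 0.75  # share of the maximum weighted score required

def score_attempt(ratings: dict[str, int]) -> tuple[float, bool]:
    """Return (normalized score, ready?) for one observed performance."""
    earned = sum(c.weight * ratings[c.name] for c in RUBRIC)
    possible = sum(c.weight * c.max_level for c in RUBRIC)
    normalized = earned / possible
    return normalized, normalized >= READINESS_CUTOFF

# One rater's levels for one learner on one scenario
print(score_attempt({
    "identifies_customer_need": 4,
    "selects_correct_procedure": 3,
    "documents_decision": 2,
}))  # -> (0.8, True)
```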

What I delivered

Depending on the context, this work included:

  • Evidence-centered assessment models aligned to standards and real-world performance

  • Scenario-based and performance-based assessments designed to elicit meaningful evidence

  • Rubrics and scoring guides that supported consistency, transparency, and defensibility

  • Item and task revision cycles informed by performance data and bias review

  • Actionable feedback structures that helped learners understand not just results, but next steps

The goal was not more assessment, but better, more actionable measurement.

Impact

When assessment systems were redesigned around evidence and intent, organizations saw tangible improvements:

  • Clearer signals about learner readiness and skill gaps

  • Increased confidence in assessment results among educators, managers, and stakeholders

  • Improved alignment between instruction, assessment, and real-world expectations

  • More effective use of data to guide learning decisions and revisions

 

In high-stakes and regulated contexts, this work helped ensure that assessment outcomes were defensible, equitable, and instructionally meaningful. These outcomes supported both accountability and growth.

Selected evidence

Assessment design and measurement work

Performance- and scenario-based assessment system with aligned scoring, rubrics, and feedback.

