
Truth Layer

The Truth Layer is the badge system that tells you how trustworthy, directional, or risky a measure is.

Why it matters: It helps teams separate meaningful signals from vanity, misuse, or AI distortion before they optimize the wrong thing.

Example: A metric can be tagged Meaningful, Leading, or Vanity Risk.

KPI · UX · Meaningful · Leading

Error Rate

The percentage of user actions that result in an error, mistake, or unintended outcome.

Category: Usability
Measurement class: KPI

Measurement Class

A measurement class tells you what kind of measure something is, not just what topic it covers.

Why it matters: It stops teams from building a stack full of only KPIs while ignoring value, governance, or AI signals.

Example: Governance Metric and AI Signal are two different measurement classes.

Frequency: Continuous

Evaluation method

error_actions / total_actions × 100
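The evaluation method above can be expressed as a minimal sketch; the function name and the zero-actions guard are illustrative assumptions, not part of the entry:

```python
def error_rate(error_actions: int, total_actions: int) -> float:
    """Error rate as a percentage: error_actions / total_actions × 100."""
    if total_actions == 0:
        return 0.0  # no actions observed, so no measurable error rate
    return error_actions / total_actions * 100

# e.g. 34 validation errors across 100 field submissions → 34.0
```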

Signal type

Leading

What it is best for

Identifying specific UI elements that cause confusion

What it tells you

Where the interface is confusing, misleading, or poorly designed. A direct proxy for usability friction.

What it does not tell you

It cannot distinguish between user confusion and system bugs, and it does not explain why errors occur.

When to use it
  • Identifying specific UI elements that cause confusion
  • Measuring impact of form redesigns or input changes
  • Prioritizing usability fixes by frequency and severity
When not to use it
  • When error definitions are ambiguous or inconsistent
  • For comparing across very different types of interactions
How leaders misuse it
  • Treating all errors as equally important regardless of severity
  • Ignoring error recovery — some errors are easily corrected and not problematic
Anti-patterns
  • Only counting system-reported errors while missing user-perceived mistakes
  • Reducing errors by removing functionality rather than improving clarity
Companion entries

This entry is stronger when paired with:

Instrumentation or evaluation guidance

Define error types clearly: validation failures, wrong selections, undo events, back-button usage on form steps.

Sample events

form_validation_error, undo_action, back_button_on_form
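A minimal sketch of how those sample events could be tallied from an event log follows; the event names come from the list above, but the `error_rate_from_log` helper and the flat list-of-strings log format are assumptions:

```python
from collections import Counter

# Error-event names from the instrumentation guidance above.
ERROR_EVENTS = {"form_validation_error", "undo_action", "back_button_on_form"}

def error_rate_from_log(events: list[str]) -> float:
    """Percentage of logged actions that match a defined error event."""
    if not events:
        return 0.0
    counts = Counter(events)
    errors = sum(counts[name] for name in ERROR_EVENTS)
    return errors / len(events) * 100
```

For example, a log of two clicks, one validation error, and one undo yields an error rate of 50%.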
Examples

A checkout form shows a 34% validation error rate on the phone number field. Switching to a masked input reduces errors to 8%.

Suggested decisions
  • Rank errors by frequency × severity to prioritize fixes
  • If error rate exceeds 10% on a critical flow, treat as a P1 usability issue
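The two decision rules above can be sketched as follows; the function names, the (frequency, severity) input shape, and the severity scale are illustrative assumptions:

```python
def prioritize_errors(errors: dict[str, tuple[int, int]]) -> list[str]:
    """Rank error types by frequency × severity, highest impact first.

    `errors` maps an error name to a (frequency, severity) pair.
    """
    return sorted(errors, key=lambda name: errors[name][0] * errors[name][1],
                  reverse=True)

def is_p1(error_rate_pct: float, critical_flow: bool) -> bool:
    """Flag a P1 usability issue when the rate exceeds 10% on a critical flow."""
    return critical_flow and error_rate_pct > 10
```

So a rare-but-severe error (frequency 2, severity 10) outranks a frequent-but-trivial one (frequency 5, severity 1), and a 12% error rate is only a P1 when the flow is critical.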