IAEA-TECDOC-1329 Framework · HRO Theory Integration

Safety Culture Leadership Architecture

How safety culture lives or dies across organizational layers — and what happens when any layer stops holding the line.

CEO / EXECUTIVE
DIRECTORS
MANAGERS
FRONTLINE STAFF

Each layer sets the conditions for the layer below it. Safety culture flows downward — and trust flows upward.

CEO / Executive: Sets the moral and strategic tone of the entire safety system

In HRO theory, the CEO is the architect of the "preoccupation with failure" and "deference to expertise" norms. They control resource allocation, incentive structures, and what the organization signals it actually values versus what it says it values.

Ideal Behaviors
  • Publicly and repeatedly names safety as a non-negotiable organizational value — not just a priority that can shift
  • Allocates resources to safety proactively, not reactively after events
  • Participates visibly in safety reviews and walks the floor rather than delegating safety presence entirely
  • Creates and protects psychological safety at the top — tolerates bad news from directors without punishing the messenger
  • Holds self and direct reports accountable using safety culture indicators, not just lagging metrics (incident counts)
  • Champions a learning identity: errors are organizational learning, not individual failures to be hidden
  • Ensures the QA/safety function has organizational authority and direct access to themselves
Failure Indicators
  • Uses safety language in speeches but omits it from budget decisions
  • Rewards directors who hit production numbers regardless of safety shortcuts taken
  • Does not personally investigate near-miss events or significant incidents
  • Expresses impatience or dismissal when safety concerns slow operations
  • QA and safety functions are low-status, under-resourced, or lack organizational standing
  • Board-level safety metrics are lagging only — incident counts, not culture indicators
  • Learns about safety problems through the press or regulator rather than internal channels
Directors: Translate strategy into operational reality and norms

Directors are the transmission layer. They receive the culture from above and operationalize it into systems, expectations, and resource decisions that managers experience daily. They're often the most consequential — and least studied — layer.

Ideal Behaviors
  • Actively translate executive safety values into departmental processes, plans, and metrics
  • Resolve production-vs-safety conflicts explicitly in favor of safety — and explain why publicly
  • Seek out frontline feedback and protect managers who surface problems up the chain
  • Conduct structured safety culture reviews across their function — not just audit compliance reviews
  • Sponsors root cause analysis with a human factors lens, not just technical fault trees
  • Model questioning behavior — ask "why" about assumptions, not just "did we comply"
  • Manage the boundary with regulators with transparency and mutual respect
Failure Indicators
  • Passes executive pressure for performance downward without filtering or protecting the team
  • Discourages managers from escalating safety concerns that might make the director look bad to the CEO
  • Treats safety reviews as compliance theater rather than genuine inquiry
  • Creates an implicit (or explicit) message: "don't bring me problems, bring me solutions"
  • Rewards managers who keep operations running smoothly regardless of method
  • Silo behavior — does not coordinate safety across functions
Managers: The culture that frontline staff actually experience daily

The manager is ground zero for safety culture in lived experience. The IAEA framework is clear: employee perception of safety culture is almost entirely shaped by what their direct manager does — not what the CEO says. Managers are the daily signal of what is real.

Ideal Behaviors
  • Closes the loop on every safety concern raised — even if the answer is "not right now, and here's why"
  • Stops work without hesitation when a safety issue is identified and treats this as normal, unremarkable behavior
  • Praises and rewards reporting of near-misses explicitly and visibly
  • Coaches rather than disciplines when honest errors occur — preserves psychological safety
  • Is present in the work environment, not managing from a desk — sees conditions firsthand
  • Runs pre-work planning that genuinely integrates risk assessment — not a checkbox
  • Makes it safe for the least experienced person in the room to raise a concern
Failure Indicators
  • Communicates impatience or annoyance when safety concerns slow a job down
  • Dismisses or minimizes near-miss reports — "it's fine, nobody got hurt"
  • Disciplines or informally punishes staff who raise concerns that cause delays
  • Allows workarounds and shortcuts to become normalized because they work "most of the time"
  • Absent from the actual work environment; disconnected from real conditions
  • Does risk assessments as paperwork after the fact, not as genuine pre-work analysis
Frontline Staff: The last line of defense — and the richest source of safety intelligence

In HRO theory, frontline staff are uniquely positioned: they see risk before anyone else. Their willingness to speak up, report, and stop work is not just desirable — it is the system's early warning capability. The degree to which they do or don't is a direct readout of everything above them.

Ideal Behaviors
  • Reports near-misses routinely and without fear — sees this as part of the job, not a risk
  • Stops work and speaks up when conditions don't feel right — exercises genuine authority to halt
  • Participates actively in pre-work risk assessments — contributes real knowledge
  • Questions procedures when conditions don't match written instructions
  • Owns their own safety and that of their colleagues — not delegated to specialists alone
  • Shares lessons from errors with peers without fear of blame or stigma
  • Trusts that raising concerns will be taken seriously and responded to
Failure Indicators
  • Knows the right answer on a safety form — fills it in — then does the job the way it's actually always done
  • Sees reporting as a career risk, not a professional duty
  • Self-censors concerns based on past experiences of being ignored or penalized
  • Follows the group norm of silence around safety to preserve social standing
  • Views "stopping work" as something that only creates problems for everyone
  • Accumulated learned helplessness: "nothing changes anyway, why bother"
What breaks down when each layer fails to live the safety culture
The "Say-Do Gap" becomes the operating norm
When executives say safety is paramount but fund, reward, and promote based on throughput, the entire organization learns the truth instantly. Directors stop investing political capital defending safety decisions. The message sent downward is: safety talk is for appearances, performance is what's real. Every layer below reorganizes around this signal — often unconsciously.
Safety function loses organizational authority
When QA or safety staff lack status, budget, and access to leadership, their findings become advisory at best. Directors and managers learn to manage around them rather than with them. The organization develops two parallel systems: the formal safety system and the real one used to get work done.
Complacency becomes institutionalized
Good past performance becomes an enemy. The IAEA framework identifies this as Stage 1 of organizational decline: over-confidence born of success creates blindness to slowly accumulating risk. Without executive-level preoccupation with failure — an HRO core principle — the organization mistakes absence of events for presence of safety.
Pressure passes through unfiltered — and amplified
A director who simply transmits executive performance pressure downward without providing cover, context, or protection creates a compression effect. Managers receive the message that results matter more than method. This is the layer at which safety culture either gets reinforced or quietly gutted in day-to-day operations.
Cross-functional safety coordination collapses
Safety in complex systems requires information to flow across silos. When directors protect departmental interests over organizational safety, the handoffs between functions — where most events actually originate — go unmanaged. Each group optimizes for its own domain while the gaps between them become risk accumulation zones.
Managers stop escalating bad news
If directors respond to escalated problems with blame, impatience, or by shooting the messenger, managers rapidly learn to pre-filter what reaches the director level. Problems are managed locally, often without adequate resources or authority. By the time a serious issue surfaces, it has already been brewing for months.
Psychological safety at the work face collapses
The manager is the daily reality of safety culture for frontline workers. When a manager sighs, looks at their watch, or says "we'll deal with it later" when a concern is raised, it can take as little as one such interaction to extinguish reporting behavior in a team. The damage is lasting and spreads to peers who weren't even there.
Normalized deviation becomes embedded practice
When managers allow shortcuts to persist because they've worked 100 times before, those deviations are no longer deviations — they become the actual standard of practice. The written procedure becomes irrelevant fiction. This is the pathway James Reason identified as "latent conditions" accumulating in the system long before any triggering event.
Near-miss data disappears
If reporting a near-miss creates hassle, scrutiny, or informal punishment for the reporter, the near-miss never gets reported. The organization loses the most valuable safety intelligence it has — the leading signal before harm occurs. Management then believes the system is safer than it is, because they only see the incidents that were impossible to hide.
Last-line defenses erode silently
Note: frontline failures are almost always a symptom of system failure above, not individual character failure. When frontline workers comply visibly and deviate privately, they are responding rationally to an environment that penalizes honesty and rewards getting the job done. The failure mode isn't them — it's the conditions that produced the behavior.
Collective normalization of risk
Work groups develop shared understandings of what is "really" dangerous versus what is formally required. This group norm becomes extremely powerful — individuals who deviate from it (by actually stopping work or formally reporting) become social outliers. Peer pressure to conform to the real operating standard can be stronger than formal procedural requirements.
The Adaptive Behaviors Thesis: When the safety culture environment is suboptimal, human beings do not simply stop functioning. They adapt. These adaptations are rational responses to real conditions — and they are often invisible to the organization until they contribute to an event. Understanding adaptive behaviors is critical because they are simultaneously a symptom of cultural dysfunction and a masking mechanism that delays organizational recognition of the problem.
Adaptive Behaviors by Organizational Layer
Frontline
Selective Non-Reporting
Workers develop nuanced judgment about what to report, to whom, and when — based on past outcomes. They do not stop seeing risk; they stop surfacing it. The safety intelligence is there. It simply never enters the formal system.
Signal: Low near-miss report volume in high-complexity environments
Frontline
Parallel "Real" Operating Procedures
When written procedures don't reflect how work actually gets done, workers develop and share informal step-by-step processes within their peer group. These are often highly effective — but invisible to management, undocumented, and fragile (they leave when experienced workers leave).
Signal: New worker errors in tasks experienced workers do "easily"
Frontline
Control Reclamation in Small Domains
When workers feel powerless in the formal safety system, they assert control where they can: pace of work, sequencing, timing of breaks, tool selection. This is healthy self-efficacy — but it can also mean avoiding certain jobs, certain supervisors, or certain conditions without formally flagging why.
Signal: Unexplained task avoidance, overtime refusal patterns
Frontline
Collective Silence Agreements
Work groups develop unspoken norms about what is discussed with management versus what stays within the group. This is self-protective community building. It creates a group identity separate from the organizational hierarchy — with its own safety standards that may be higher or lower than formal requirements.
Signal: Uniform answers to safety surveys, no variation in responses
Frontline
Performative Compliance
Workers learn the language, the forms, and the answers that the system requires. They produce perfect compliance artifacts while doing the actual work differently. Pre-task risk assessments completed before the job begins — in the break room, not at the work face — are a textbook example.
Signal: Perfect paperwork, unexpected incidents
Manager
Upward Message Filtering
Managers who have learned that bad news is unwelcome become skilled editors. They translate problems into "managed situations" before they reach directors. This protects their position but deprives leadership of the information needed to allocate resources and make decisions. The organization goes dark.
Signal: Leaders describe being "surprised" by events that frontline saw coming
Manager
Metric Gaming
When managers are measured on safety metrics rather than safety culture, they optimize the metric. Incident rates drop — because incidents are re-classified, not because the environment is safer. Near-miss programs quietly atrophy because reported near-misses look bad on the dashboard.
Signal: Declining reports with no process improvements to explain it
Manager
Informal Authority Networks
When formal systems are unresponsive, managers build personal relationships with key staff in other functions to get things done — including safety issues. This works, but it makes the safety system dependent on individual relationships rather than robust processes. When those people leave, the network collapses.
Signal: High impact of specific individual departures on safety performance
Director
Safety Theater at the Portfolio Level
Directors in a suboptimal environment become skilled at producing the appearance of safety culture investment: sponsoring programs, attending summits, approving safety training budgets — while simultaneously protecting production schedules that make implementation impossible. The investment is real. The conditions for it to work are not.
Signal: Strong safety training completion rates, unchanged behaviors
Director
Regulatory Relationship Management
When directors know the organization's actual safety state is not what it appears, they invest heavily in managing regulator relationships — providing information strategically, framing findings positively, and limiting regulatory access to certain areas or populations. The regulator is managed rather than partnered with.
Signal: Regulator finds things in inspections that "surprised" the organization
CEO / Executive
Strategic Reframing of Decline
Executives in denial interpret early warning signals as anomalies, external factors, or temporary conditions. Each incident is treated as isolated rather than systemic. The IAEA framework describes this as Stage 3 of decline: "denial." The adaptation is a cognitive one — constructing a narrative that protects the existing worldview.
Signal: Leadership language of "isolated incident" for repeated event types
System-Wide
The Competency Exodus
The most safety-conscious workers — those with the highest standards, the strongest professional identity, and the lowest tolerance for performative safety — leave first. They are replaced by workers more comfortable with the existing culture. The organization loses its most sensitive internal safety detectors precisely when it most needs them. This accelerates decline.
Signal: Voluntary turnover concentrated among experienced, high-performing safety staff

High Reliability Organization (HRO) theory, developed by Weick, Sutcliffe, and Roberts, describes the cognitive and cultural practices of organizations that operate in high-hazard environments with very low failure rates. These five principles map directly onto the IAEA safety culture framework — and their absence at any leadership layer predicts cultural decline.

01
Preoccupation with Failure
HROs treat every near-miss, anomaly, and deviation as a signal, not noise. They are chronically uneasy with success, because success can mask accumulating fragility. Leaders actively seek out bad news.
When absent: Good performance produces complacency. Near-misses are celebrated as near-misses ("no one got hurt") rather than investigated as leading indicators. The IAEA's Stage 1 decline — over-confidence — is the direct result.
02
Reluctance to Simplify
HROs resist simple explanations for complex events. They maintain cognitive diversity — different perspectives on how things could go wrong — and resist the comfort of "human error" as a complete explanation.
When absent: Root cause analysis stops at the most proximal human failure. Systemic, organizational, and cultural contributors go unaddressed. The same event recurs in slightly different form. Frontline workers are blamed for system failures.
03
Sensitivity to Operations
Leaders in HROs maintain situational awareness of what is actually happening at the work face — not just what the dashboard says. They are present, connected, and genuinely curious about the gap between plan and reality.
When absent: Management operates on a model of operations that is increasingly disconnected from reality. Adaptive behaviors (parallel procedures, workarounds) go undetected. The surprise is always maximum when events occur.
04
Commitment to Resilience
HROs invest in the capacity to detect, contain, and recover from errors that inevitably occur. They don't just try to prevent errors — they build the capability to catch and correct them before they cascade.
When absent: Defense-in-depth becomes defense-in-appearance. Redundant safety systems exist on paper but are not maintained, tested, or trusted. When one defense fails, the next has quietly eroded too.
05
Deference to Expertise
In high-stakes moments, HROs shift decision-making authority to whoever has the most relevant knowledge — regardless of rank. The person closest to the problem and most expert in it has the authority to act. Hierarchy yields to competence when safety is at stake.
When absent: The most junior person who sees the problem has no real authority to stop work. They raise a concern, it goes up the hierarchy, loses fidelity and urgency at each level, and returns as "continue with caution." The frontline learns that their expertise is not valued. They stop offering it.
The HRO-Safety Culture Integration
Where HRO and IAEA Framework Align
  • Both treat frontline workers as primary safety intelligence assets, not simply rule-followers
  • Both identify management commitment as necessary but not sufficient — culture must be embedded in basic assumptions, not just espoused values
  • Both recognize that absence of events does not equal presence of safety
  • Both identify learning from near-misses as a leading indicator of organizational health
  • Both require psychological safety as a precondition for the entire system to function
The Core Diagnostic Question at Every Level
  • CEO: Does the worst safety news in the organization reliably reach me — fast?
  • Director: Would my managers rather tell me bad news than hide it from me?
  • Manager: Does my team report near-misses to me before I would ever discover them another way?
  • Frontline: Do I believe that raising a concern makes the situation better — for me personally?