Statistical Inspection Readiness Checklist
I. Overall Readiness
☐ Statistical lead / data owner clearly identified
☐ Full understanding of study design and primary/key secondary endpoints
☐ Clear awareness of which analyses support regulatory decision-making
☐ Clear distinction between primary, secondary, and exploratory analyses
II. Protocol and SAP Alignment
☐ All analyses are pre-specified in the SAP
☐ Final SAP approved before database lock
☐ SAP version control is clear (dates, approvals, change history)
☐ Any deviations from protocol or SAP are documented and justified
☐ Post hoc analyses are clearly labeled as exploratory in the CSR
Typical inspector question: "Why was this analysis not pre-specified?"
III. Data Traceability
☐ Every key result can be traced back to source data
☐ Clear linkage between ADaM and SDTM datasets
☐ Data derivation rules are fully documented
☐ Programming logic is consistent with data specifications
☐ No “black-box” variables or undocumented calculations
Result → Program → ADaM → SDTM → CRF → Source Data
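To illustrate the traceability chain above, the sketch below derives an ADaM-style change-from-baseline variable from SDTM-style records, with the derivation rule documented inline so a reviewer can follow each result back to its source columns. The dataset names, variable names, and derivation rule are illustrative assumptions, not requirements taken from any specific study.

```python
# Minimal sketch (Python/pandas): deriving an ADaM-style change-from-baseline
# variable from SDTM-style vital signs records. All dataset/variable names and
# the derivation rule below are illustrative assumptions for this example only.
import pandas as pd

# SDTM-style source: one row per subject per visit for one test (systolic BP)
vs = pd.DataFrame({
    "USUBJID": ["01-001", "01-001", "01-002", "01-002"],
    "VSTESTCD": ["SYSBP"] * 4,
    "VISITNUM": [1, 2, 1, 2],
    "VSSTRESN": [142.0, 135.0, 128.0, 131.0],
})

# Documented derivation rule (would live in the ADaM specification):
#   BASE = VSSTRESN at VISITNUM == 1 for each subject
#   CHG  = AVAL - BASE (equals zero at the baseline visit by construction)
baseline = vs[vs["VISITNUM"] == 1][["USUBJID", "VSSTRESN"]].rename(
    columns={"VSSTRESN": "BASE"}
)
advs = (
    vs.rename(columns={"VSSTRESN": "AVAL"})
      .merge(baseline, on="USUBJID", how="left")
)
advs["CHG"] = advs["AVAL"] - advs["BASE"]

# Each derived value is traceable: CHG <- AVAL, BASE <- VSSTRESN <- CRF/source
print(advs)
```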
IV. Programming and Reproducibility
☐ All statistical programs are under version control
☐ Programs can be rerun to reproduce identical results
☐ No manual editing of datasets or outputs
☐ QC evidence available (double programming, independent review, validation)
☐ Outputs are fully consistent with the CSR
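One common form of QC evidence is double programming: an independent programmer re-derives the same output and the two results are compared programmatically rather than by eye. The sketch below is a simplified illustration; the file names, merge keys, and output location are hypothetical assumptions, and a clean comparison run is itself archivable evidence of reproducibility.

```python
# Minimal sketch (Python/pandas): double-programming QC comparison.
# File names and key variables below are hypothetical assumptions.
import pandas as pd

KEYS = ["USUBJID", "PARAMCD", "AVISIT"]          # assumed record keys
production = pd.read_csv("production_advs.csv")   # production programmer's output
qc = pd.read_csv("qc_advs.csv")                   # independent QC programmer's output

# Sort and index both outputs identically so the comparison is cell-by-cell
production = production.sort_values(KEYS).set_index(KEYS)
qc = qc.sort_values(KEYS).set_index(KEYS)

# compare() reports only the cells that differ; an empty frame means a match
diff = production.compare(qc)
if diff.empty:
    print("PASS: QC output matches production output exactly.")
else:
    # Persist the discrepancy report as QC evidence for the inspection file
    diff.to_csv("qc_discrepancies.csv")
    print(f"FAIL: {len(diff)} mismatching rows written to qc_discrepancies.csv")
```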
V. Missing Data and Deviations
☐ Missing data handling methods pre-specified in SAP
☐ Methods align with ICH E9 / E9(R1) principles
☐ Sensitivity analyses support the primary conclusions
☐ Protocol deviations are clearly defined and categorized
☐ Deviation handling is not outcome-driven
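The sensitivity-analysis item above can be made concrete with a delta-adjustment ("tipping point") style check: missing outcomes in the treatment arm are imputed and then shifted by increasingly unfavorable amounts, and the analysis is repeated to see whether the conclusion would change. The sketch below is a simplified illustration with made-up data, column names, and a plain mean difference rather than the model a real SAP would pre-specify.

```python
# Minimal sketch (Python/pandas/numpy): delta-adjustment sensitivity analysis.
# Data, column names, deltas, and the mean-difference estimate are assumptions;
# a real SAP would pre-specify the imputation model and analysis method.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2024)
n = 100
df = pd.DataFrame({
    "ARM": np.repeat(["ACTIVE", "PLACEBO"], n),
    "CHG": np.concatenate([rng.normal(-6, 8, n), rng.normal(-2, 8, n)]),
})
# Introduce roughly 15% missing outcomes in the active arm
missing = (df["ARM"] == "ACTIVE") & (rng.random(len(df)) < 0.15)
df.loc[missing, "CHG"] = np.nan

for delta in [0, 2, 4, 6, 8]:
    work = df.copy()
    # Impute missing active-arm outcomes with the observed arm mean,
    # then penalize the imputed values by delta (a less favorable outcome)
    arm_mean = work.loc[work["ARM"] == "ACTIVE", "CHG"].mean()
    work.loc[missing, "CHG"] = arm_mean + delta
    effect = (work.loc[work["ARM"] == "ACTIVE", "CHG"].mean()
              - work.loc[work["ARM"] == "PLACEBO", "CHG"].mean())
    print(f"delta={delta}: estimated treatment difference = {effect:.2f}")
```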
VI. Statistical Soundness
☐ Statistical methods are appropriate for the study design
☐ Assumptions are reasonable and explainable
☐ Multiplicity control strategy clearly defined
☐ Analysis populations (ITT, PP, SAF, etc.) clearly specified
☐ No unnecessary complexity without scientific justification
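For the multiplicity item above, the adjustment procedure named in the SAP (e.g., Bonferroni, Holm, or a gatekeeping scheme) should be reproducible exactly as written. Assuming a simple Holm procedure and made-up p-values, the sketch below shows how such an adjustment can be re-run programmatically with statsmodels.

```python
# Minimal sketch (Python/statsmodels): Holm multiplicity adjustment.
# The p-values below are made up; the method must match what the SAP specifies.
from statsmodels.stats.multitest import multipletests

raw_pvalues = [0.012, 0.031, 0.048]  # e.g., primary + two key secondary endpoints
reject, adjusted, _, _ = multipletests(raw_pvalues, alpha=0.05, method="holm")

for p, p_adj, rej in zip(raw_pvalues, adjusted, reject):
    print(f"raw p = {p:.3f}  adjusted p = {p_adj:.3f}  reject H0: {rej}")
```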
VII. Documentation and Evidence
☐ Protocol (all versions)
☐ SAP (final and historical versions)
☐ TFL shells
☐ SDTM / ADaM specifications
☐ Program inventory with version history
☐ Audit trails and change logs
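A program inventory with version history is easier to defend when it is generated rather than maintained by hand. The sketch below, assuming the statistical programs live in a hypothetical programs/ directory, builds a simple inventory with file checksums and modification timestamps that can be cross-checked against the version-control log and the archived submission package.

```python
# Minimal sketch (Python): generating a program inventory with checksums.
# The programs/ directory and file extensions are assumptions for illustration.
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

PROGRAM_DIR = Path("programs")          # hypothetical location of study programs
EXTENSIONS = {".sas", ".R", ".py"}      # assumed program types

rows = []
for path in sorted(PROGRAM_DIR.rglob("*")):
    if path.is_file() and path.suffix in EXTENSIONS:
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        modified = datetime.fromtimestamp(path.stat().st_mtime, tz=timezone.utc)
        rows.append({
            "program": str(path),
            "sha256": digest,
            "last_modified_utc": modified.isoformat(timespec="seconds"),
        })

# The checksum column lets reviewers confirm that an archived program is
# byte-identical to the one that produced the CSR outputs.
with open("program_inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["program", "sha256", "last_modified_utc"])
    writer.writeheader()
    writer.writerows(rows)
print(f"Inventory written for {len(rows)} programs.")
```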
VIII. Inspection Interview Readiness
☐ Able to explain the rationale for each key analysis
☐ Able to clearly describe data sources and derivations
☐ Distinguishes facts from interpretations
☐ Answers based on documentation, not memory
☐ Avoids ad hoc analyses or on-the-spot commitments
IX. Common Red Flags
❌ SAP finalized after database lock
❌ Unexplained discrepancies between SAP and results
❌ Programs not reproducible
❌ Undocumented data derivations
❌ Modifying data or programs during inspection
X. Rapid Self-Check Questions
Summary
Inspection readiness for statisticians means that every result is
reproducible, traceable, and scientifically justified—at any time.
| Inspection Target | Possible Timing | Primary Purpose |
| --- | --- | --- |
| Clinical trial sites | During trial conduct / Pre-Approval Inspection (PAI) | Data integrity and GCP compliance |
| Sponsor | IND / NDA stages | Oversight systems and accountability |
| CRO | Any stage | Vendor oversight and delegated responsibilities |
| Statistics / data systems | NDA / PAI | Reproducibility and data integrity |
| Manufacturing facilities | NDA review / Post-approval | GMP compliance |
Inspection vs. Audit — Comparison Table
| Dimension | Inspection | Audit |
| --- | --- | --- |
| Conducted by | Regulatory authorities (e.g., FDA, NMPA, EMA) | Sponsor, Quality Assurance, or third-party auditors |
| Legal nature | Regulatory and enforcement activity | Quality management activity |
| Mandatory | Yes — cooperation is required | No — conducted by agreement or internal policy |
| Primary purpose | Determine regulatory compliance and impact on regulatory decisions | Identify risks and drive continuous improvement |
| Timing | Risk-based, during review, or triggered by signals | Planned, periodic, or risk-based |
| Scope | Focused on critical data, systems, and processes | Systematic review of processes and controls |
| Approach | Evidence-based, traceability-driven, result-oriented | Process-based, improvement-oriented |
| Notification | May be announced or unannounced | Typically pre-scheduled |
| Findings | Observations (e.g., FDA Form 483), inspection reports | Audit findings and internal observations |
| Consequences | May affect IND/NDA approval, licensing, or lead to enforcement actions | Leads to CAPA and internal corrective actions |
| Negotiability of outcome | No | Yes |
| Refusal allowed | No | Yes |
Key takeaway
An audit is performed to prevent inspection failures; an inspection is the ultimate regulatory test of audit effectiveness.