Tracked: Core AGI developer who privately acknowledged catastrophic risks while publicly advancing the technology — central to preparedness/advocacy contradiction analysis
EDITORIAL SUMMARY — AI-generated from public records
Co-founder and former Chief Scientist of OpenAI. Key figure in AGI development who reportedly advocated for bunker construction before AGI release. Co-founder of Safe Superintelligence Inc.
Confidence Tiers:
Primary Source — cross-referenced government/corporate filings
Pending Review — sourced but not independently verified
AI Inference — analytical hypothesis from cross-referencing
Raw Filing Records (1) — unsourced metadata
Pending Review
Former OpenAI Chief Scientist who reportedly told colleagues: "We're definitely going to build a bunker before we release AGI." This statement directly links AGI development milestones to personal survival preparation, representing an internal acknowledgement by a core developer that the technology they were building may pose a catastrophic threat.