
Ilya Sutskever

Person · Active
Co-founder, Safe Superintelligence Inc.
Tracked: Core AGI developer who privately acknowledged catastrophic risks while publicly advancing the technology — central to preparedness/advocacy contradiction analysis
EDITORIAL SUMMARY — AI-generated from public records

Co-founder and former Chief Scientist of OpenAI. Key figure in AGI development who reportedly advocated for bunker construction before AGI release. Co-founder of Safe Superintelligence Inc.

Facts (1)
Data Freshness
Status: Fresh · Last update: 8d ago · Avg age: 533d
Confidence Tiers:
Primary Source — cross-referenced government/corporate filings
Pending Review — sourced but not independently verified
AI Inference — analytical hypothesis from cross-referencing
Raw Filing Records (1) — unsourced metadata
Pending Review: Former OpenAI Chief Scientist who reportedly told colleagues: "We're definitely going to build a bunker before we release AGI." This statement directly links AGI development milestones to personal survival preparation, representing an internal acknowledgement by a core developer that the technology they are building may pose a catastrophic threat.
Date: 2023-06-01 · Added: 15 Apr 2026 · Status: UNVERIFIED
Ilya Sutskever reportedly told colleagues about AGI bunker contingency
All Connections (0)
No connections documented.
Sources (1)
2023-06-01 · UNVERIFIED · Ilya Sutskever reportedly told colleagues about AGI bunker contingency (news)