About This Opportunity
Following certain... incidents... with our Genetic Lifeform and Disk Operating System, Aperture Science recognizes the critical importance of AI safety research. This fellowship supports researchers working on alignment, containment protocols, and preventing AI systems from testing humans involuntarily. Areas of interest include: neural network interpretability, value alignment in autonomous systems, and robust containment for superintelligent AI. We especially welcome proposals that address the "cake problem" — ensuring AI systems deliver on their promises.
Who Can Apply
- Region: Global
- Applicants: individual, organization
- Organizations: startup, nonprofit, research institution
Application Details
Stages
1. Initial review
2. Technical assessment
3. Panel interview
4. Final decision
Required documents
Review process
Applications are reviewed by Aperture Science's panel of experts and reformed test subjects.
Key Information
- Award Amount: $25,000 - $150,000
- Application Deadline: April 04, 2026 at 16:28 UTC
About the Funder
Science & Technology
We do what we must, because we can. Aperture Science Innovators is dedicated to funding groundbreaking research in portal technology, artificial intelligence safety, propulsion gel applications, and turret-based defense systems. Founded by the late Cave Johnson, we continue his vision of science without limits. Our grant programs support researchers, entrepreneurs, and visionaries who refuse to accept the impossible. "Science isn't about WHY. It's about WHY NOT." — Cave Johnson