PROJECTS

Investigating Big Tech, Protecting the Public

Our investigations expose corporate misconduct, track accountability, and mobilize public pressure to ensure AI serves the public good.

<a href="https://www.freepik.com/free-photo/millennial-asia-businessmen-businesswomen-meeting-brainstorming-ideas-about-new-paperwork-project-colleagues-working-together-planning-success-strategy-enjoy-teamwork-small-modern-night-office_7685820.htm#fromView=search&page=1&position=15&uuid=e2c57d0a-7581-4d4e-881c-e87210163ba1&query=projects">Image by tirachardz on Freepik</a>
The OpenAI Files project cover – The Midas Project report investigating OpenAI’s governance, tax issues, and accountability.

The OpenAI Files

June 2025

www.openaifiles.org

The OpenAI Files is the most comprehensive collection to date of documented concerns with governance practices, leadership integrity, and organizational culture at OpenAI.

Seoul Tracker

Feb 2025

www.seoul-tracker.org

At the 2024 AI Seoul Summit in South Korea, sixteen leading tech organizations pledged to implement "red line" risk evaluation policies for frontier AI models. The deadline has now arrived, but not everyone has lived up to their commitment. This tracker assesses progress across the five key components.

Seoul Tracker project cover – The Midas Project analysis of global AI safety commitments from the 2024 AI Seoul Summit.
Safety Abandoned project cover – The Midas Project report on OpenAI’s failure to uphold AI safety promises and safeguards.

Safety Abandoned

Dec 2024

www.safetyabandoned.org

OpenAI has formally begun efforts to shed its nonprofit status. The company has previously removed a hard cap on investor profits, backtracked on commitments to safety, and pressured its nonprofit board out of exercising control. Now it wants to spin out completely, becoming a full-fledged corporate enterprise.

No Deepfakes for Democracy

Oct 2024

www.nodeepfakesfordemocracy.com

As Big Tech companies rapidly develop and deploy new AI technologies, deepfakes (hyper-realistic AI-generated video, images, and audio) are quickly blurring the line between truth and fiction.

No Deepfakes project cover – The Midas Project campaign to hold AI developers accountable for preventing deceptive AI-generated media.
Open Letter to OpenAI project cover – The Midas Project initiative urging AI developers to adopt stronger AI safety and accountability practices.

Open Letter to OpenAI

Aug 2025

www.openai-transparency.org

More than 100 prominent AI experts, former OpenAI team members, public figures, and civil society groups signed an open letter calling for greater transparency from OpenAI.

Join our Movement

Help us push tech companies to prioritize safety, transparency, and public interest in AI development.
