OpenAI Releases its First Open-Weight Reasoning Models

OpenAI yesterday released gpt-oss-120b and gpt-oss-20b, the company’s first open-weight LLMs that can run on consumer hardware and be easily modified by developers. Both models are available under the Apache 2.0 license and can be downloaded for free from the Hugging Face platform. OpenAI also made them available through various cloud partners, including Azure and AWS.

The release of these open models is significant given that OpenAI previously tried to convert to a for-profit structure, but ultimately backtracked following public backlash. The company says that it’s now committed to supporting a healthy open model ecosystem to “make AI widely accessible and beneficial for everyone.”

According to OpenAI, the gpt-oss-120b model offers performance comparable to its o4-mini model and can run on devices with 80 GB of memory. The smaller gpt-oss-20b model is said to match or exceed OpenAI o3-mini across standard academic benchmarks, and it requires only 16 GB of memory.

For Windows users, GPU-optimized versions of OpenAI’s gpt-oss-20b model are available through Foundry Local and the AI Toolkit for VS Code. The company also teamed up with Nvidia, AMD, and other companies to optimize the performance of these models on local hardware or through third-party inference providers.

Lastly, OpenAI said that it worked hard to mitigate the most serious safety issues and prevent attempts to modify its open-weight models for malicious purposes. The company will also be listening to developer feedback and may introduce API support for gpt-oss in the future.

Thurrott