Leak suggests OpenAI’s open-source AI model release is imminent | By The Digital Insider

A leak suggests that OpenAI is about to launch a powerful new open-source AI model, potentially within hours.

The evidence comes from a trail of digital breadcrumbs, eagerly pored over by developers. At the centre of it all are screenshots showing a series of model repositories with names like yofo-deepcurrent/gpt-oss-120b and yofo-wildflower/gpt-oss-20b. The repos have since been deleted, but the accounts that hosted them belong to OpenAI team members.

That gpt-oss tag is the real smoking gun, seemingly a clear signpost for ‘GPT Open Source Software’. For a company that has increasingly guarded its top-tier models, this would be something of a return to its roots. The fact that we are seeing multiple versions, with different codenames and sizes, suggests a well-planned family of models is about to make its debut.

Screenshot of alleged leaked repos for OpenAI's open-source AI model that is set for imminent launch.

Thanks to a leaked configuration file, we can even peek under the bonnet of the suspected 120 billion parameter version.

The model appears to be built on a Mixture of Experts, or MoE, architecture. Think of it less like a single, monolithic brain trying to know everything, and more like a board of 128 specialist advisors. When a query comes in, the system intelligently selects the four best experts for the job. This gives the model the vast knowledge of its huge parameter count, but the speed and agility of a much smaller system, as only a fraction of it is working at any one time.
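The routing idea can be sketched in a few lines. The 128-expert / 4-active split below matches the leaked figures, but the gating code itself is a generic top-k softmax illustration with random stand-in weights, not OpenAI's implementation:

```python
import math
import random

def moe_route(token, num_experts=128, top_k=4, seed=0):
    """Toy MoE router: score every expert, then activate only the top-k."""
    rng = random.Random(seed)
    # Hypothetical router weights: one learned score vector per expert
    # (random values here stand in for trained parameters).
    routers = [[rng.gauss(0, 1) for _ in token] for _ in range(num_experts)]
    logits = [sum(t * w for t, w in zip(token, r)) for r in routers]
    # Keep only the best-scoring experts and softmax over just those,
    # so 4 of 128 experts do any work for this token.
    top = sorted(range(num_experts), key=lambda i: logits[i], reverse=True)[:top_k]
    mx = max(logits[i] for i in top)
    exps = [math.exp(logits[i] - mx) for i in top]
    total = sum(exps)
    gates = [e / total for e in exps]
    return top, gates  # expert indices and their mixing weights

rng = random.Random(1)
token = [rng.gauss(0, 1) for _ in range(16)]  # a dummy token embedding
experts, gates = moe_route(token)
```

The payoff is in the last step: the model's output is a weighted blend of just four expert outputs, which is why per-token compute scales with the active experts rather than the full 120-billion-parameter count.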

This design puts OpenAI’s open-source AI model squarely in competition with the darlings of the scene, like Mistral AI’s Mixtral and Meta’s Llama family.

And the specs don’t stop there. OpenAI’s open-source AI model appears to boast a huge vocabulary, which should let it tokenise a wider range of languages more efficiently, and uses Sliding Window Attention, which caps how far back each token attends, to handle long streams of text without breaking a sweat. In practice, this all points to a model that is both powerful and practical to run.
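Sliding Window Attention amounts to a restricted attention mask: each token attends to itself and only a fixed number of preceding tokens, so memory grows with the window rather than the full sequence. A minimal sketch (the window size here is illustrative; the leaked config's actual value is not public):

```python
def sliding_window_mask(seq_len, window):
    """Build a causal sliding-window attention mask.

    mask[q][k] is True when query position q may attend to key position k:
    causal (k <= q) and within the window (q - k < window).
    """
    return [[k <= q and q - k < window for k in range(seq_len)]
            for q in range(seq_len)]

# With a window of 3, token 5 sees only positions 3, 4, and 5,
# no matter how long the sequence gets.
mask = sliding_window_mask(seq_len=6, window=3)
```

Stacking many such layers still lets information propagate beyond the window, which is how models keep long-range context while paying only a fixed per-layer attention cost.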

So, why would OpenAI make such a move now? For years, the company has faced gentle jabs and outright criticism for straying from its more open beginnings. Launching a powerful gpt-oss would be a massive charm offensive aimed directly at the developers and researchers who felt left behind.

Of course, it’s also a shrewd competitive play. Meta and Mistral have shown how a thriving open-source ecosystem can drive innovation. By dropping what appears to be a powerful open-source AI model into the mix, OpenAI isn’t just joining the race; it’s attempting to redefine the track.

Until we get the official word from OpenAI, this is all still, technically, rumour. But it’s a rumour with substance, backed by code and configuration files.

The launch of a high-performance, 120-billion-parameter open-source MoE model from the most famous name in AI would be nothing short of a landmark event, and it appears to be imminent.

(Photo by Mariia Shalabaieva)

See also: Zuckerberg outlines Meta’s AI vision for ‘personal superintelligence’

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Intelligent Automation Conference, BlockX, Digital Transformation Week, and Cyber Security & Cloud Expo.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.

Published on The Digital Insider at https://is.gd/G5JhTN.