
Tag: MoE

Why OpenAI’s New Open Weight Models Are a Big Deal

The smoke is still clearing from OpenAI’s big GPT-5 launch today, but the verdict is starting to come in on the company’s other big announcement this week: the launch of two new open weight models, gpt-oss-120b and g…

Snowflake Touts Speed, Efficiency of New ‘Arctic’ LLM

Snowflake today took the wraps off Arctic, a new large language model (LLM) that is available under an Apache 2.0 license. The company says Arctic’s unique mixture-of-experts (MoE) architecture, combined with its relat…

AWS Teases 65 Exaflop ‘Ultra-Cluster’ with Nvidia, Launches New Chips

AWS yesterday unveiled new EC2 instances geared toward tackling some of the fastest-growing workloads, including AI training and big data analytics. During his re:Invent keynote, CEO Adam Selipsky also welcomed Nvidia fo…
