With Chinese technology companies having long dominated the global open-source large-model market, American tech giants are attempting to regain ground through differentiated competition.
According to media reports, Google DeepMind CEO Demis Hassabis recently hinted on social media, via a four-diamonds icon, that the next-generation open-source model Gemma 4 is about to be officially released. This comes exactly one year after the previous generation, Gemma 3, matching Google's iteration pace in the large-model field.
Scale Upgrade: A New 120B Model Pushes the Limits of Local Operation
Compared with its predecessor, Gemma 4 reportedly takes a leap in parameter scale:
Four times the parameters: A 120B-parameter model is rumored to join the lineup this time, four times the size of the previous generation.
MoE architecture: To balance performance and efficiency, the model is expected to adopt a Mixture-of-Experts (MoE) architecture with only 15B active parameters. This means even the large-parameter model may still run locally, offline, on consumer-grade graphics cards.
Capability evolution: Predictions suggest Gemma 4's context-processing capacity will grow by a factor of one to two, along with deeper logical reasoning and stronger execution of complex tasks.
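To make the "120B total, 15B active" idea concrete, here is a minimal, purely illustrative sketch of MoE routing in NumPy: a gating network selects the top-k experts per token, so only a fraction of the layer's parameters participate in any forward pass. All sizes, expert counts, and names below are hypothetical assumptions for illustration, not Gemma 4's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # total expert feed-forward blocks in the layer
TOP_K = 1         # experts activated per token
D_MODEL = 16      # toy model width
D_FF = 64         # toy expert hidden width

# Independent weights for each expert, plus a gating projection.
W_in = rng.standard_normal((NUM_EXPERTS, D_MODEL, D_FF)) * 0.02
W_out = rng.standard_normal((NUM_EXPERTS, D_FF, D_MODEL)) * 0.02
W_gate = rng.standard_normal((D_MODEL, NUM_EXPERTS)) * 0.02

def moe_layer(x):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ W_gate                            # (tokens, NUM_EXPERTS)
    top = np.argsort(-logits, axis=-1)[:, :TOP_K]  # chosen expert ids
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, top[t]]
        w = np.exp(sel - sel.max())                # softmax over selected
        w /= w.sum()
        for weight, e in zip(w, top[t]):
            h = np.maximum(x[t] @ W_in[e], 0.0)    # expert FFN with ReLU
            out[t] += weight * (h @ W_out[e])
    return out

tokens = rng.standard_normal((4, D_MODEL))
y = moe_layer(tokens)

# Only TOP_K of NUM_EXPERTS experts run per token: 1/8 of the expert
# weights are active, analogous to a rumored 120B-total model running
# with roughly 15B active parameters (15/120 = 1/8).
active_frac = TOP_K / NUM_EXPERTS
```

The design point this illustrates: compute and memory bandwidth per token scale with the active parameters, not the total, which is why an MoE model far larger than a dense one can still fit a local-inference budget.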
Strategic Game: Containing "Chinese Power" in the Open-Source Community
Fast Tech analysis points out that although the American giants have shifted their focus to closed-source business models, Google is releasing its technology dividends at a measured pace to keep Chinese companies from fully occupying the open-source ecosystem:
Time-lag strategy: Google chose to release the open-source counterpart of its flagship closed-source Gemini 3.0 series more than half a year after that series launched, preserving the closed-source model's commercial revenue while maintaining influence in the developer community through open-source projects.
Localized moat: Gemma 4's core positioning remains "local services". By optimizing the performance of lightweight models, Google aims to compete head-on with Chinese open-source models on on-device experience, without touching its core business interests.
Google Hints at the Upcoming Release of the Open-Source Large Model Gemma 4: Parameter Count Quadrupled
