Sweating over DeepSeek, Google counters with a cheaper Gemini model

After DeepSeek caused a wave of fever-pitch emotions throughout the world (and wiped $589 billion off Nvidia's value, the biggest single-day loss of any public company in history), Google answers. The Alphabet giant has expanded its Gemini AI lineup with the affordable Gemini 2.0 Flash-Lite model, aiming to compete with low-cost rivals like China's DeepSeek.

As we reported recently, Google enhanced the Gemini app and upgraded it with the 2.0 Flash model, offering faster responses and enhanced image generation.

Now, Google introduces further updates to its Gemini family of AI models, expanding their capabilities and availability.

The Gemini 2.0 Flash, originally launched as an experimental model in December, is now generally available via the Gemini API in Google AI Studio and Vertex AI.

Designed for developers who require low latency and high efficiency, this model balances speed with enhanced performance and is optimized for large-scale, high-frequency tasks. Its multimodal reasoning ability allows it to process vast amounts of information within a 1-million-token context window, making it well-suited for diverse applications.
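For a sense of what using the newly available model looks like: below is a minimal sketch of assembling a request body for the Gemini API's `generateContent` REST endpoint. The endpoint URL, model identifier, and payload shape are assumptions based on Google's public API conventions, not details from this article, and the network call itself is left out.

```python
# Hypothetical sketch of a Gemini API request (endpoint and payload
# shape assumed; an actual call also needs an API key from AI Studio).
import json

MODEL = "gemini-2.0-flash"  # the model described as generally available
ENDPOINT = (
    "https://generativelanguage.googleapis.com/v1beta/models/"
    f"{MODEL}:generateContent"
)

def build_request(prompt: str) -> dict:
    """Assemble the JSON body the generateContent endpoint expects."""
    return {"contents": [{"parts": [{"text": prompt}]}]}

body = build_request("Summarize this article in one sentence.")
print(json.dumps(body))  # this JSON would be POSTed to ENDPOINT
```

In practice you would send `body` with an HTTP client (or use Google's official SDKs for Python, Node, and other languages), passing your API key alongside the request.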

In addition to this update, Google has released an experimental version of Gemini 2.0 Pro, which represents the company’s most advanced model for coding performance and handling complex prompts.

This model is designed to provide a stronger understanding of world knowledge, improved reasoning capabilities, and the ability to manage extensive data inputs. With a 2-million-token context window (the largest offered by Google's AI models so far), Gemini 2.0 Pro should be able to conduct deep analyses while also integrating tools such as Google Search and code execution.

This sounds pretty solid! The 2.0 Pro model is currently available in Google AI Studio and Vertex AI, as well as in the Gemini app for Gemini Advanced users.

For those who don't need the most powerful AI, there's now the option to pay less. That's where Gemini 2.0 Flash-Lite comes into play: a new model that prioritizes cost efficiency without sacrificing quality.

It offers improved performance over the previous 1.5 Flash model while maintaining the same speed and affordability, Google claims. With a 1-million-token context window and multimodal input capabilities, Flash-Lite is designed for high-volume tasks, such as generating concise captions for large datasets at minimal cost.

This model is now available in public preview through Google AI Studio and Vertex AI.

How about you, do you even use Gemini?
