
Google’s AI: Brilliant, Bloated, and Barreling Ahead


If OpenAI is lurking in the shadows and Rabbit stumbled publicly, Google is going full-throttle—unleashing a torrent of AI models everywhere at once.

Veo 3: Awesome and Terrifying

Veo 3 is real. Launched in May 2025, it’s Google DeepMind’s new text-to-video model that generates 8-second 720p clips with synchronized audio—everything from dialogue to ambient sound.

There’s a “Fast” variant too: more than twice the speed for Gemini Pro and Flow users. It's available via Gemini mobile, Flow, Google Vids, Vertex AI, and Workspace integrations.

Reactions range from dazzled (“eerie, realistic scenes”) to alarmed (“fabricating realistic riots or election fraud”), making it a potential tool for misuse despite watermarks and content filters.

480 Trillion Tokens/Month: Scale Gone Wild

At Google I/O, Pichai revealed that Google’s AI pipeline now processes approximately 480 trillion tokens per month, up from just 9.7 trillion a year earlier, a roughly 50× increase.
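The headline figures are easy to sanity-check. A quick back-of-envelope calculation (a sketch in Python, assuming a 30-day month) confirms the ~50× growth claim and translates the monthly volume into a per-second rate:

```python
# Back-of-envelope check on the token-volume figures quoted at Google I/O 2025.
tokens_now = 480e12        # ~480 trillion tokens/month
tokens_last_year = 9.7e12  # ~9.7 trillion tokens/month a year earlier

growth = tokens_now / tokens_last_year
per_second = tokens_now / (30 * 24 * 3600)  # assumes a 30-day month

print(f"growth: ~{growth:.0f}x")                    # ~49x, consistent with "roughly 50x"
print(f"rate: ~{per_second / 1e6:.0f}M tokens/sec")  # ~185 million tokens every second
```

That works out to roughly 185 million tokens processed every second, around the clock.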

That volume includes all modalities across Search (AI Mode), Workspace, Gemini APIs, Cloud, and mobile apps. The Gemini app alone serves over 400 million monthly users.

What’s not clear: how the tally breaks down between input tokens, instruction prompts, and output tokens, or across modalities (text vs. image vs. audio vs. video). And how much of that 480 T/month is video-related?

So What?

Google is deploying AI models across everything: Search, Docs, Android, Cloud, Gemini apps, Studios, the works. If you want AI in your workflow, Google’s got you. Ambitious, yes. But:

  • No clear prioritization: this is AI sprawl.
  • Costly modalities: video and audio tokens are expensive.
  • Hidden costs: compute, latency, and carbon footprint.
  • Risk of drowning in “AI slop”: low-value content dominates.

At Swept, the rule holds:

Humane shipped too soon. Rabbit shipped too light.

Google? It’s shipping “everything, everywhere” at unprecedented scale.

We’re watching the sanely useful signals inside the noise. Let’s see what actually sticks.

Join our newsletter for AI Insights