Meta shows off two big AI upgrades – one helps robots, the other helps you

AI that's making robots better at their jobs and apps smarter for you.

When we talk about AI, most people immediately think of Google or OpenAI, the company behind ChatGPT. But while those two often get all the headlines, Mark Zuckerberg’s Meta has been moving fast, too – and now it is making some serious noise with two big AI announcements.

Meta, the company behind Facebook, Instagram, WhatsApp and Threads, has just unveiled a new AI model that can actually think before it acts, and a new AI-powered video editing feature. The latter is clearly aimed at the billions of people using Meta's platforms every day.

But let's start with the one that sounds straight out of a sci-fi movie – the new model is called V-JEPA 2 and it is basically a brain for robots and AI agents. The idea? To help them understand the physical world and predict how it reacts to their actions, just like we humans do without even thinking about it.

When we walk through a crowded space, we are constantly predicting what is about to happen – avoiding people, dodging obstacles and moving toward our goal. We don't pause to analyze every move; we just know what's likely to happen. Well, according to Meta, V-JEPA 2 is designed to teach AI that same kind of intuition by building what's called a world model.

These world models allow AI to do three core things: understand, predict and plan. V-JEPA 2 is trained on video footage, which lets it learn how objects move, how people interact with those objects and how everything behaves in the physical world.
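If you want a rough mental picture of what "understand, predict and plan" means in practice, here is a deliberately tiny sketch. None of these names come from V-JEPA 2 or any Meta code – it's a made-up, one-dimensional "reach for an object" toy that only exists to show how the three steps fit together.

```python
# Purely illustrative toy, NOT Meta's V-JEPA 2 API.
# A world model turns observations into a state ("understand"),
# guesses what an action would do to that state ("predict"),
# and picks the action whose predicted outcome best matches a goal ("plan").

class ToyWorldModel:
    def understand(self, observation: float) -> float:
        # Encode a raw observation (here: where the gripper seems to be).
        # Real world models encode video into learned representations,
        # not a single number.
        return observation

    def predict(self, state: float, action: float) -> float:
        # Predict the next state if the gripper moves by `action`.
        return state + action

    def plan(self, state: float, goal: float, actions: list[float]) -> float:
        # Choose the action whose predicted outcome lands closest to the goal.
        return min(actions, key=lambda a: abs(self.predict(state, a) - goal))

model = ToyWorldModel()
state = model.understand(observation=0.0)            # gripper starts at position 0
best = model.plan(state, goal=0.5, actions=[-0.2, 0.1, 0.4])
print(best)  # 0.4 -> the move predicted to bring the gripper closest to the object
```

The real thing obviously works on video and learned representations rather than a single coordinate, but the loop – encode, predict, pick the best action – is the same basic idea.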

It builds on the original V-JEPA model Meta dropped last year, but now it is better at understanding unfamiliar environments – like when a robot encounters a brand-new space or task.

Meta says it tested V-JEPA 2 in the lab, and robots using the model were able to do stuff like reach out, grab things and move them around. That might sound basic, but in the world of robotics, that's a pretty big deal.

Of course, Meta isn't the only company chasing this type of AI. Google launched its Gemini 2.0 model last year, focused on making AI better at reasoning, remembering and planning. OpenAI is also in the game with its own AI agent that can manage tasks for you. However, Meta seems to be leaning into helpful use cases – but at the end of the day, nobody really knows how this all plays out.

It's clear we are heading into a future where AI doesn't just respond to prompts – it actually starts doing things for us. And yeah, it's both exciting and a little nerve-wracking. On one hand, these tools can help people who really need them. On the other, there is always that risk of us becoming too dependent. What happens when AI starts thinking instead of us?


So… AI can now think before it acts. How are we feeling about that?



Moving on.

You can now edit videos with Meta AI


While V-JEPA 2 is all about AI understanding the real world, Meta's second announcement is focused on how you can shape your digital one. It just rolled out a brand-new AI-powered video editing feature that is already live across the Meta AI app, Meta.AI website and a dedicated new app called Edits.

This tool lets you remix short-form videos using preset prompts that can completely change your outfit, background, vibe – even the entire style of the clip. It's now available in the US and more than a dozen other countries.



Inspired by Meta's Movie Gen models, this feature is just the beginning. Meta says that later this year, you'll be able to use your own text prompts to edit videos exactly how you want, right in Meta AI.

The editing process is simple: upload a video to one of the supported platforms, then browse through more than 50 editing prompts. For now, you can transform up to 10 seconds of your video for free – but that's a limited-time thing.



You can turn your clip into a retro comic book scene, complete with vintage-style illustrations. Or change up the mood of a cloudy video with dreamy sparkles and soft-focus lighting. You can even make it feel like a neon-soaked video game, with your clothes and environment matching the theme.

Once done, you can share your creation straight to Facebook or Instagram from the Meta AI app or Edits. If you're on Meta.AI or using the app, you can also post to the Discover feed.

And while this might sound like fun – and yeah, it definitely is – it's also another reminder of where we are headed. Just like Google's new Flow tool that can generate hyper-realistic videos, these kinds of AI-driven editors can blur the line between what's real and what's not. We've already seen deepfake-style videos go viral and trick people. And sure, Meta's tool is meant for creative edits, not deception – but I think it's still a step down that same path.
