AsiaTechDaily – Asia's Leading Tech and Startup Media Platform
Meta Platforms has quietly brought in the team behind Gizmo, a buzzy AI startup known for turning simple prompts into interactive mini-games and apps—marking what appears to be a deeper shift in how the company is thinking about the future of social media. A Meta spokesperson confirmed that the engineers joined earlier this year, though financial details were not disclosed. The team, originally part of Atma Sciences, includes former Snap engineers such as Josh Siegel and Daniel Amitay. Meta has also secured a non-exclusive license to the startup’s technology.
While the move may appear to be a routine acqui-hire, it points to something more strategic: Meta is not just investing in AI tools, but in new forms of digital interaction. Gizmo gained attention for a simple but novel idea. Instead of generating text, images, or videos, the app allows users to create small, interactive experiences—such as mini-games—by typing a prompt.
These creations are then shared in a feed that resembles short-form video platforms, but with a key difference: users do not just scroll and watch. They can tap, drag, and interact with the content itself. This approach is part of a growing trend often referred to as “vibe-coding,” where users describe what they want and AI generates a working experience without requiring traditional programming skills. The shift may seem incremental, but it changes the nature of content. What was once passive becomes participatory. Instead of consuming media, users engage with it.
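To make the idea concrete: a vibe-coding pipeline takes a natural-language prompt and returns a runnable interactive artifact rather than static media. The sketch below is purely illustrative, not Gizmo's actual system; `generate_experience` is a hypothetical stand-in for whatever code-generating model such a product would use. What it shows is the shape of the concept: prompt in, interactive spec out, and a feed client that lets users tap or drag to mutate the content rather than just watch it.

```python
from dataclasses import dataclass

# Hypothetical "vibe-coding" sketch: a prompt becomes a small interactive
# spec (state plus event handlers) instead of static text, image, or video.
# In a real system the spec would come from a code-generating model;
# a toy keyword match stands in for that model here.

@dataclass
class Experience:
    title: str
    state: dict
    handlers: dict  # event name -> function that mutates the state

def generate_experience(prompt: str) -> Experience:
    """Toy stand-in for an AI generator: maps a prompt to an interactive spec."""
    if "counter" in prompt.lower():
        return Experience(
            title="Tap Counter",
            state={"taps": 0},
            handlers={"tap": lambda s: s.update(taps=s["taps"] + 1)},
        )
    # Fallback: a trivial drag-to-move toy.
    return Experience(
        title="Draggable Dot",
        state={"x": 0, "y": 0},
        handlers={"drag": lambda s: s.update(x=s["x"] + 5, y=s["y"] + 5)},
    )

def interact(exp: Experience, event: str) -> dict:
    """Feed client: users don't just scroll; events change the content itself."""
    exp.handlers[event](exp.state)
    return exp.state

game = generate_experience("make me a tap counter mini-game")
interact(game, "tap")
interact(game, "tap")
print(game.title, game.state)  # → Tap Counter {'taps': 2}
```

The key design point the sketch captures is that the generated artifact behaves like software, not media: the same item in a feed can be in a different state for every user who touches it.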
Meta’s interest in Gizmo’s team comes at a time when the company is intensifying its focus on consumer AI. As platforms like Instagram and Facebook mature, growth is increasingly tied to new forms of engagement rather than new users. Short-form video, popularised by platforms like TikTok, has already reshaped user behaviour. But even that format is beginning to plateau.
Interactive, AI-generated content could represent the next step. By lowering the barrier to creation, tools like Gizmo allow more users to become creators—not just of posts, but of experiences. This aligns with Meta’s long-standing strategy of expanding the creator ecosystem while keeping content generation within its platforms. More importantly, it gives Meta a way to experiment with AI-native content formats, rather than simply layering AI on top of existing ones.
Gizmo is not alone in exploring this space. A number of startups are building similar tools that allow users to generate apps, games, or interactive content using natural language prompts. Early funding signals growing interest. Platforms like Wabi and Vibecode have attracted investor backing, suggesting that venture capital firms see potential in what could become a new category of consumer applications. Still, the space remains in its early stages.
Much of the content generated today is playful rather than practical—more akin to digital toys than essential tools. Whether this evolves into something more deeply integrated into daily digital behaviour remains an open question.
One of the key uncertainties surrounding Meta’s move is what happens next. The company has not indicated whether Gizmo itself will continue as a standalone product. The non-exclusive nature of the technology license also suggests that Meta is not betting on the product in its current form, but rather on the ideas and capabilities behind it.
This reflects a broader trend in Big Tech. Instead of acquiring fully developed platforms, companies are increasingly bringing in small teams working on experimental technologies, then integrating those ideas into their existing ecosystems.
In this case, Gizmo’s concepts could eventually surface as features within Meta’s platforms, new creator tools, or even entirely new products.
At its core, this move points to a larger transformation in how digital content is evolving. Social media has historically been defined by format—text, images, and then video. Each shift has changed not only how users consume content, but also how they create it.
Interactive, AI-generated experiences introduce a new layer. Content is no longer static. It behaves more like software—responsive, dynamic, and shaped by user input. This has implications beyond entertainment. It could influence education, e-commerce, and communication, where interaction adds value beyond passive consumption. But it also raises questions. Not all content benefits from interactivity, and user fatigue could become a factor if complexity increases.
Meta’s decision to hire the Gizmo team may not immediately result in a visible product. But it signals where the company believes the next frontier lies. Rather than focusing solely on improving existing formats, Meta appears to be exploring how AI can redefine them altogether. Whether “vibe-coding” becomes a mainstream behaviour or remains a niche experiment is still uncertain. But the direction is clear: the future of social platforms may be less about what users watch—and more about what they can create and interact with. And in that future, the line between content and application may begin to disappear.