Hwang Dong-hyuk, creator of “Squid Game,” has invested $3 million through his Firstman Studio in TwelveLabs, a San Francisco-based AI company specializing in video production technology. The deal is aimed at helping the entertainment industry use AI to speed up content creation while addressing concerns about technology replacing human creativity.
What you should know: TwelveLabs focuses on enhancing existing video content rather than generating new material from scratch.
- The company’s system “indexes and enriches video metadata down to the scene level, enabling editors, directors, and producers to work with unprecedented speed and precision while maintaining creative control.”
- Co-founded by Jae Lee and Soyoung Lee, TwelveLabs partners with studios, streamers, creators, and broadcasters to streamline production workflows.
Why this matters: The investment addresses a critical inefficiency in entertainment production where valuable footage remains underutilized.
- According to TwelveLabs, “less than 5% of that material is ever reused” from film and television archives containing “billions of dollars’ worth of untapped footage.”
- The technology can “search across hours of footage by visual, audio, and contextual cues simultaneously — making it possible to locate a specific scene, verify rights, and deliver usable files in a fraction of the time.”
What they’re saying: Hwang emphasized the technology’s role in supporting rather than replacing creative work.
- “Storytelling is becoming more global, more visual, and faster-paced,” Hwang said. “The creators who can adapt will shape the future of entertainment.”
- “AI tools are opening up new ways to create the power of cinema that we couldn’t have imagined just a few years ago. For me and Firstman Studio, it’s about giving filmmakers more time to focus on the art, the emotion, and the magic only they can bring to life.”
The bigger picture: TwelveLabs differentiates itself by working with existing material rather than generating new content.
- CEO Jae Lee explained their mission: “At the core, our mission is to index every video in the world and make it as easy to understand as text. More than 90% of the world’s data today is video, yet most of it remains locked away.”
- The company addresses creator concerns by emphasizing human control: “The decisions about what to use, how to frame it, what story to tell, that’s still all human.”
How it works: The technology aims to understand video content beyond surface-level recognition.
- TwelveLabs’ video foundation models can analyze “the emotional and narrative flow of a scene, not just surface-level cues like objects or dialogue.”
- The system helps reduce time spent “cataloguing dailies, restoring archival shots, or verifying rights” so creators can focus on storytelling decisions.
In plain English: Think of TwelveLabs as creating a super-smart search engine for video files that understands not just what objects appear on screen, but the mood, pacing, and story elements of each scene—like having an assistant who has watched every piece of footage and can instantly find exactly what you’re looking for.