
LYNX is an AI-generated short film made in 48 hours with Google Veo 2.
It tells the story of two hunters and the lessons they learn from nature. Aside from the story, which was written by Daniel Barak (the human), LYNX was made 100% with AI technologies, with no color correction, additional shots, cleanup, or VFX.
All shots in the film were generated through early beta access to Google Veo 2 (a text-to-video model), with thousands of generations and tests run before the right shots were selected.
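As a rough sketch of what a single generation loop can look like, here is text-to-video with Veo 2 through Google's current google-genai Python SDK. The film's beta access predated this public SDK, so this is illustrative rather than the actual workflow, and the prompt and config values are assumptions:

```python
import time

from google import genai
from google.genai import types

client = genai.Client()  # reads GOOGLE_API_KEY from the environment

# Start an asynchronous text-to-video job (hypothetical prompt).
operation = client.models.generate_videos(
    model="veo-2.0-generate-001",
    prompt=(
        "Cinematic wide shot: two hunters trudge through deep snow "
        "on a mountain ridge under an overcast sky, 35mm film look"
    ),
    config=types.GenerateVideosConfig(aspect_ratio="16:9"),
)

# Video generation is long-running; poll until the job completes.
while not operation.done:
    time.sleep(10)
    operation = client.operations.get(operation)

# Save the generated clip locally for review and shot selection.
video = operation.response.generated_videos[0].video
client.files.download(file=video)
video.save("lynx_shot_candidate.mp4")
```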
To achieve visual and auditory consistency, a new method of prompt management was used: a custom GPT that served as a governing writing partner over all the other AI tools. It helped with prompt writing and corrections at scale, so different approaches could be tested quickly. It was also used to make sure that assets generated with the AI sound tools matched the feel of the visuals generated in Veo 2.
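The same governing pattern can be sketched as a thin layer over the OpenAI API. This is a hypothetical reconstruction, not the production setup (the film used a custom GPT inside ChatGPT), and STYLE_GUIDE, govern_prompt, and the model name are illustrative:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical "story bible" shared by every downstream tool.
STYLE_GUIDE = (
    "You are the prompt director for LYNX, a short film about two "
    "hunters in snowy mountains. Rewrite each draft prompt for its "
    "target tool so it stays consistent with the film: cold palette, "
    "overcast light, quiet ominous tone, grounded naturalistic sound. "
    "Return only the corrected prompt."
)

def govern_prompt(draft: str, target_tool: str) -> str:
    """Rewrite a draft prompt so it matches the film's shared style."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": STYLE_GUIDE},
            {"role": "user", "content": f"Target tool: {target_tool}\nDraft: {draft}"},
        ],
    )
    return response.choices[0].message.content

# One governor feeds every platform, keeping prompts in context.
print(govern_prompt("a gunshot", target_tool="ElevenLabs sound effects"))
```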
To create the sound design for the film, multiple AI tools were used, including Replicate’s MMAudio video-to-sound model, which generated on-screen sound effects such as footsteps in the snow, the hum of the wind at the top of the mountain, avalanche sounds, and more.
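Calling a video-to-sound model on Replicate looks roughly like the sketch below; the model slug and input field names are assumptions, so check the model page for the current schema and pin a concrete version before relying on it:

```python
import replicate  # uses REPLICATE_API_TOKEN from the environment

# Pair a finished Veo 2 clip with a text prompt describing the sound.
# Model slug and input names are assumptions, not a confirmed schema.
with open("lynx_shot_candidate.mp4", "rb") as video:
    output = replicate.run(
        "zsxkib/mmaudio",
        input={
            "video": video,
            "prompt": "footsteps crunching in deep snow, distant howling wind",
        },
    )

# The client returns the generated result (typically a URL or file).
print(output)
```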
In addition, ElevenLabs’ text-to-sound-effects model was used to generate all off-camera sound effects, such as the gunshot and bird sounds.
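With the current ElevenLabs Python SDK, such an effect can be generated from text alone. A minimal sketch, assuming the SDK's sound-effects endpoint and an API key in the environment:

```python
from elevenlabs.client import ElevenLabs

client = ElevenLabs()  # reads ELEVENLABS_API_KEY from the environment

# Generate an off-camera effect from text alone; the prompt carries
# the story context written by the governing custom GPT.
audio = client.text_to_sound_effects.convert(
    text=(
        "a gunshot high up in the snowy mountains, as the wind howls, "
        "from hundreds of feet away"
    ),
    duration_seconds=4.0,
)

# The SDK streams audio bytes; write them out as an MP3.
with open("gunshot.mp3", "wb") as f:
    for chunk in audio:
        f.write(chunk)
```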
The music score was generated with the Udio text-to-music model and then corrected on the Udio platform to match the film down to the second.
The custom GPT ensured that prompts to all AI platforms were contextualized to the story and sounded right. For example, even the gunshot prompt read: “…a gunshot high up in the snowy mountains, as the wind howls, from hundreds of feet away...”
The film was upscaled to 4K with Topaz Labs Video AI, as Google Veo 2 generated only 720p clips.
It was edited using Adobe Premiere Pro.