I created a nostalgic-looking short film in under 16 hours using nothing but my voice, video editing software, and six different AI tools. But why? The video AI company Runway ML held a contest and opened up their suite of tools to anyone who registered. I thought, hey, I'm anyone, so I signed up as well. In the competition we were given 60,000 AI minutes, or 300,000 credits, to create a one-to-four-minute film in 48 hours. Additionally, we had to include three elements in the film from a list of numerous possibilities, one element for each of the three categories: character archetype, scene, and object.

But I didn't have 48 hours like everyone else; I only had 16 hours to do the competition, so I needed to use my time wisely. I immediately jumped into ChatGPT and started with this prompt: "I'm a filmmaker using the AI tool Runway ML to create a film for a 48-hour film festival. I'm going to add in the contest requirements, and you will be my writer's assistant, helping me to shape the story based on the contest prompt. Say 'ready' when you understand and are awaiting input."
ChatGPT came up with 15 different story combinations using the elements given by Runway, but I wasn't sold on any of them, so I asked again. Still, nothing felt like me. As I looked at ChatGPT's generations over and over again, an idea for a story popped into my head, so I entered the info from my brain and let ChatGPT do its magic to help lightly structure a story. This loose story direction would be good enough for now. The elements I chose for the story were a creature from space, a grocery store, and an exit door.

I then used a cinematic prompt in Midjourney to create hundreds of stills that I could use as starting points for Runway ML. Thanks to Cyberjungle for the amazing prompt structure; be sure to watch their video in the description below if you want to learn how to generate cinematic images. After I adjusted it, my prompt became: cinematic scene, dark aisle of a grocery store at night, 1984, wide shot, Kodak Portra 400, Steven Spielberg motivated lighting, style raw, aspect ratio of 16:9. Using my adjusted prompt, I kept generating and re-prompting until I came up with the look and feel for the grocery store that I liked.

After I finalized the look and feel of the setting for the 1980s grocery store, I used Midjourney again to come up with the creature design. After a few failed attempts at generating character sheets, I realized having a consistent creature would be very difficult, so I decided to change the story from one creature from outer space to a race of jellyfish-blob light creatures. This would give me a little leeway in the character design, so every shot that my character was in wouldn't have to match perfectly but would still be somewhat consistent. Once I found an image for a creature I liked, I used it as a reference for Midjourney and pasted it before my prompts. I kept prompting and re-prompting, varying subtle and varying strong, until I had enough images to bring the story to life in Runway's Gen-2 image-to-video.

Runway is an amazing tool. I didn't have to do too much except drop in the source image, play with the motion controls, and hit generate. The key to making good shots with AI is small tweaks in the settings, consistent rerolling and regeneration, and, most of all, patience. After about six hours of generating hundreds of videos, I finally downloaded 47 of the most usable shots. They were pretty small in file size, and the resolution wasn't quite what I'm used to working with in professional video, so I used Topaz AI to upscale the video files to 4K ProRes MOV files, and I had the software add some film grain to take away some of the AI smoothness that Runway's interpolation produces.

Finally, it was time for the edit, which took the least amount of time. I used Adobe Premiere Pro in my normal editing workflow and had the first cut in about an hour. I then added in some additional grain filters and film scratches to give it more of an '80s vibe and, again, to hide some of the smoothness that the current AI tools produce. I nested my complete timeline, duplicated it, and added a fast blur effect to give the film a bloom-like feel that we're used to with cinematic lenses from the '80s.

Once the visual story was locked into place, I wrote some voiceover based on the light outline I created with ChatGPT and then recorded the performance on my phone. I then used Altered AI to morph my voice a bit, as it does a much better job as a voiceover actor than me. Just see for yourself: "A race of explorers looking for something to fill our soul." "A race of explorers looking for something to fill our soul." "A race of explorers looking for something to fill our soul." Finally, I used Adobe's text-based AI tool to capture the VO in text form and create subtitles for me.

Sixteen hours and six AI tools later, I called it a day and was finished. Out of hundreds of submitted films, "Let Us Explore" was chosen as one of eight to win an award, for Best Character. With all the Runway credits I received as a prize, get ready for more fun AI films to come, and if you haven't yet watched "Let Us Explore," click here to do so.
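For anyone who wants to riff on the cinematic prompt structure mentioned above, it's really just an ordered list of ingredients (scene, era, shot type, film stock, lighting reference) plus a couple of rendering flags at the end. Here's a minimal, purely illustrative Python sketch of assembling prompts that way; the function name and exact flag syntax are my own assumptions for illustration, and the actual generation still happens inside Midjourney itself:

```python
# Illustrative sketch only: assemble a Midjourney-style cinematic prompt
# from ordered ingredients. The function name and flag syntax are
# assumptions for illustration, not an official API.

def build_cinematic_prompt(scene, year, shot, film_stock, lighting,
                           style="raw", aspect_ratio="16:9"):
    """Join prompt ingredients in a fixed, comma-separated order,
    then append parameter flags at the end."""
    parts = [
        "cinematic scene",
        scene,
        str(year),
        shot,
        film_stock,
        lighting,
    ]
    return ", ".join(parts) + f" --style {style} --ar {aspect_ratio}"

prompt = build_cinematic_prompt(
    scene="dark aisle of a grocery store at night",
    year=1984,
    shot="wide shot",
    film_stock="Kodak Portra 400",
    lighting="Steven Spielberg motivated lighting",
)
print(prompt)
```

Swapping out just one ingredient at a time (the scene, the film stock, the lighting reference) is what makes it easy to keep generating variations that still feel like the same film.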