I am walking around my apartment complex right now, but everything that's not me in this footage is all Unreal Engine. I would start by playing with maybe even this app and just try to rig a character, and then once you're done with that character, bring it into Unreal Engine and see it running around. You can literally make your own video game character. This is a podcast about this technology. The future of moviemaking is here: we are going to be able to generate characters with AI, and it's going to automatically rig them, so we can move around as near-humans and become aliens in other worlds, with many moving objects and other characters. Because with AI right now, we're getting to the point where all you're going to need to do is the performance, with your face, on your phone; you don't have to know exactly how to rig it up. For those of you that are excited about this, there's actually a website that works right now. I've had some struggles, but there's a website. So this is a podcast with Don Allen Stevenson III, and we discuss these topics right here. We talk about how to use AI to create these 3D avatars; we talk about four different AI VFX tools and a bunch of other tools; we do live demos directly in this episode; we talk about AI and Unreal Engine; and we also talk about how to create a video with your iPhone that looks exactly like the one I showed you at the beginning. I watched your last two videos. In one of them you had a full-on green screen, and you were walking around some deserted city
with your phone. Did you have a green screen behind you? I was actually holding an iPad, because I found that the frame rate was even better on the iPad, but it also works on an iPhone. It's using an app called Skyglass, and I am walking around my apartment complex right now. I'm walking around outside because I wanted to get the light changing on me; like when I was walking by a tree, I wanted the shadows to fall on me. But everything that's not me in this footage is an Unreal Engine environment running through the Skyglass app. The fire that you see over there is all particles based in Unreal Engine, and I didn't see them until I walked over to them. Everything I'm seeing is because I'm looking at my iPad screen, and then I'm like, oh, that's what's in front of me. You see me walking through tables, walking over grass, but my favorite part is when I'm running here: the motion blur is correct. It's taking all of that data, using an iPad, to not only map your position and, you know, rotoscope you, but
it's also looking at the position of the iPad or iPhone relative to the ground, and that's how it's getting the correct camera angles and correct motion blur. It's using the gyroscope inside the iOS device to decide how much to rotate or shift the background relative to the phone's movement, and it's all happening in real time, so I could see all this stuff as I was walking by it. You said gyroscope? You know how, when you lift an iPhone off the table, the screen turns on before you press a button? There's a tiny accelerometer and gyroscope in every iOS device that is acutely aware of the rotation and position of your iPhone relative to other things. If you've ever used augmented reality, where it might ask you to scan your environment to get an idea of the geometry of the space, or what the ground plane is, what it's doing is calibrating that gyroscope: it's getting an idea of what is up, what is down, what is left, what is right.
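A minimal sketch of that idea: integrate the gyroscope's angular rate each frame, then counter-rotate the virtual background so it appears locked to the world while the phone pans. The numbers and function names here are illustrative, not Skyglass's actual code.

```python
def update(angle_deg, gyro_rate_dps, dt):
    """Integrate the gyro's angular rate to track the phone's rotation,
    then rotate the virtual background the opposite way so it stays
    world-locked while the phone pans."""
    angle_deg += gyro_rate_dps * dt
    return angle_deg, -angle_deg  # phone angle, background offset

phone_angle = 0.0
# Ten frames at 60 fps while the phone pans at 30 degrees per second.
for _ in range(10):
    phone_angle, background_offset = update(phone_angle, 30.0, 1 / 60)
print(round(phone_angle, 6), round(background_offset, 6))  # 5.0 -5.0
```

Real AR frameworks fuse the gyro with the accelerometer and camera tracking, but the core loop is this same integration.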
As soon as the iPhone calibrates the angle the phone is in relative to the rest of the world, it can start doing some pretty tremendous spatial-computing animations. The fact that I can run with this is only because the gyroscope is enabled. So it's not just looking at the camera; it's looking at the relationship between the camera, you, and the rest of the scene? Bingo, you nailed it. You said Unreal Engine: can you actually import a scene from Unreal Engine into Skyglass to use in your videos? Do you know if you can do that? I haven't done it myself, but I 100% believe that you can. This is actually one of the founders of Skyglass, and this video was about adding an Unreal Engine environment to the Skyglass app. He shows the whole process of setting up an Unreal Engine file; at the very end he chooses the icon for the Skyglass app, and then that becomes a thing you can click on in Skyglass, and you can run that environment right through your phone. So you film
the video on Skyglass, and doesn't it take time, from when you film, for it to be fully loaded at its highest quality? Correct. It's like when you play a video game and there's a loading screen when you're going to a new level; that's the amount of time it takes to load up. I was seeing the world being updated as I walked around with my iPad. I've always been a huge believer that tools that let people use their camera, and then use AI to restylize it, are better than just text-to-video-type stuff. I think a lot of the growth for these tools, the rate of adoption, is going to come from people who are already storytellers and filmmakers with their nice cameras: they can use their cameras and very easily adjust the backgrounds and adjust the characters. Which brings me to your next video, if you want to pull that up. It's literally the same app, but instead of changing the background and keeping you, it's changing the background and the character. Let me pull that one up here. Okay, so this is using the same app, the Skyglass app, but the difference here is that instead of putting live-action me on top of the world, it's using my face performance to drive an Unreal Engine character, and the environment is replaced with an Unreal Engine environment. What this allows you to do is: all the virtual lights you set up in Unreal Engine can cast correct shadows and correct volumetric lighting. It also allows you to tap into the Unreal Engine camera settings,
like the depth of field, the shutter angle, all those camera features that are in the virtual camera; you can now tap into all of those. The setup required for this is interesting, though: you have to have a character with what's called a rig. When I was at DreamWorks we had to do basically the same thing; every character needs a rig. A rig is like a puppet: it gives a model the ability to move around. Think of a sculpture someone makes out of clay. That's kind of cool and all, but it usually can't move unless somebody puts metal wires inside the clay statue; then you can animate it. In that example of physical clay, the metal wires you put in the figurine are the rig, and now that it has a rig, you can bend the character into different shapes and tell different stories. Similarly with this character here: I didn't make this character, it was made by, I think, the Skyglass team, but you can have a static piece of geometry. You know, someone sculpted that goblin head, and then someone had to go in and put wires all throughout the nose, the mouth, the corners of the eyes, the jawline, the eyebrows. Every part of the face that you want to have animated has to have, essentially, wires added. Then, the way it's being animated is that it takes my face performance from the video below and uses the amount that my eyebrow moves to influence the amount that the metal eyebrow moves behind the rig of that goblin. And they did that for all of their faces.
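That mapping, from a measured eyebrow amount to mesh movement, can be sketched as a weighted sum of sculpted per-vertex offsets. The vertex data below is made up purely for illustration.

```python
import numpy as np

# A neutral "face" of three vertices, plus per-expression vertex offsets
# that a sculptor would author for each extreme pose.
neutral = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
shapes = {
    "brow_raise": np.array([[0.0, 0.0], [0.0, 0.0], [0.0, 0.2]]),
    "jaw_open":   np.array([[0.0, -0.3], [0.0, -0.3], [0.0, 0.0]]),
}

def evaluate(neutral, shapes, weights):
    """final mesh = neutral + sum(weight_i * offsets_i); the capture
    system supplies a weight in [0, 1] per expression on every frame."""
    mesh = neutral.copy()
    for name, w in weights.items():
        mesh += w * shapes[name]
    return mesh

# Eyebrow raised halfway, jaw closed:
print(evaluate(neutral, shapes, {"brow_raise": 0.5, "jaw_open": 0.0}))
```

The per-frame weights are exactly what a face tracker outputs, so driving the mesh is just re-evaluating this sum every frame.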
The technology they use specifically for it is called blend shapes, and you can make blend shapes for any character. If you use a free program like Blender, you can make blend shapes, and the result is unbelievable. Normally it's an extremely painful process to do face performance and motion capture, and usually you'd have to do it beforehand, as a pre-process; you would film all of this with your face first. Just to give you an example, have you ever watched Avatar? Yes, for sure. This is actually how they normally would do that. What you're seeing on the right of the screen is a person, an actor, with dots on their face. In the past you had to have a camera on your face, with dots, and that would be the information needed to influence the blend shapes of that character on the left. So in the past you had to film all that stuff beforehand, before you would see the final output quality. What's different now is there's enough AI on board, in their back end, that they don't even need dots on your face. It's discerning the dots on your face just from seeing regular video footage of your face: the natural details of pixels, color changes, the shadows. So you don't need the dots anymore, and instead of having to do all this beforehand, before seeing anything, we can now see it in real time. This camera data is capturing her face and adjusting the blend shapes of this character's performance, and you can do all of it in real time. This is all being rendered
in Unreal Engine, whereas I think when they filmed Avatar, they did all the face performance in a program called Maya, which later had to be sent to a different render engine, I think the Hydra render engine, to get all the photorealistic lighting. The result was that they wouldn't see this final-quality result at the time of filming. So what I'm saying is that an iPhone can now basically do what Avatar did, but in real time, and that opens up tremendous possibilities for storytellers. That's insane. First of all, that explanation with the clay, the metal wires, and the puppet, and how you'd need the metal wiring to actually create the movement, was probably the best explanation of that I've heard; it really lit up my brain. And the first thought I had is: how close are we to AI taking a lot of those character-and-wiring pairs and training a model on them? So that if you were to create a 2D character, it automatically gets converted into a 3D character that gets automatically wired up, so you can immediately use it. I think the first company to do that well is going to make millions to billions of dollars, from allowing people to create whatever character they want and then immediately control it with their camera. That's so fun. That's my first thought. You're spot on; one company has already done it, with Marvel. Have you seen any of the Marvel movies that featured Thanos? I have seen clips. I will say I gave up on Marvel after, like, the first Avengers.
There are just too many of them. Totally understood; you don't have to watch the whole story to understand this concept. They actually already infused AI into that process. Before all this, I used to teach all the software at DreamWorks, so I have a very deep knowledge of the 3D pipeline. I worked on How to Train Your Dragon, The Boss Baby, Trolls, The Croods, and The Bad Guys, and I had to teach all our different departments how to use our software. So you're talking to, like, the most extreme nerd of the nerds when it comes to VFX stuff. I love it; that's why I'm so excited to talk. Dude, this is awesome. So I'm going to show you this clip. Thanos was a mixture of human performance, with that actor wearing one of those motion-capture face rigs, but they also drove AI deformations on the face. Do you see how all these wrinkles and things are happening? Yes. Let me go frame by frame. Okay, so you see all these little poses: this pose with his eyebrows raised, and this pose where his eyebrows are all squinted. These are blend shapes. Usually what artists do for movies is have a sculptor sculpt the extremes of the different facial expressions a character can make, and then, based on those extreme poses, we blend between them based on the live capture, the live performance. So if I slowly raise my eyebrows, with a motion-capture camera on my face, it would blend between this pose and this pose, creating all the in-betweens. The artist normally only has to focus on building these really nice shapes: okay, what does his face look like when it's squinted? And you see how even his jaw moves down when his eyebrows move up; there's a lot of movement a face has to go through just to raise the eyebrows convincingly. Well, what they did for the Marvel movies is they actually embedded AI to do extra deformations, like sub-wrinkles, in their characters. What I mean by this is that it allowed for the
artist to only have to move a little bit of stuff. This is all the data it's reading, but it's interpolating that: they used AI to add more wrinkles and more of the muscle and fat underneath the skin. The result is insane. So what we'll probably see from creators next is, instead of just blending between a few face shapes, we'll move into a world where you can get this level of performance out of 3D characters in Unreal Engine, with AI doing the sub-movements of muscle instead of an artist having to sculpt every single way the muscle moves. They did it for the Marvel movies. So it's a mixture of, one, human performance; two, blend shapes that were sculpted by an artist; and three, tertiary or secondary face performance done by an AI that learned how skin wrinkles and changes. The way they trained that is they filmed lots of human faces going through all the subtle movements of expressions, taught an AI model how the face wrinkles, and then used that as part of its brain. You probably won't even need these suits anymore, because I bet there's going to be an update where it can estimate your body poses well enough from just a single phone camera, and people will look at this and be like, wow, remember when we had to wear suits and helmets and all those camera rigs just to get the performance for an AI-driven character? Have you heard of Move.ai? Isn't that kind of what
that does, where it basically uses one singular image, or one singular camera, to estimate your 3D position? You nailed it. This is an excellent app. I like using Move; it works better with multiple cameras, but it does work with a single camera now. What it's doing is essentially taking that performance and estimating what the rig is doing. It kind of imagines a character with bones in it and says, oh, I see a foot moving to the right, so I think the foot is moving to the right on the character; or, I see the character's left hand moving up, so maybe the left hand of the other character moves up. When you use one camera, there are some poses it understands and other poses it will miss. For example, I don't know if you can see me right now; I'm going to step back. If I had one camera, you can see both of my hands, right? So if you're trying to interpolate my pose, it's very easy, because you can always see my arms. But what about now, when all you can see is my left arm? You can't see my right arm over here. If we only used one camera to do the motion capture, it would incorrectly guess what this arm was doing while it's invisible. Humans would know: okay, I'm swinging my arm, and the human brain fills in the information that it's probably continuing that movement when it goes behind me. But an AI has to be taught those basics; with one camera, it may not know that this hand is still here.
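A pose tracker can encode that human intuition as a simple constant-velocity prior: when a joint disappears from view, keep it moving at its last observed velocity. A minimal sketch, with illustrative names and coordinates:

```python
def predict_occluded(last_two_positions, dt):
    """Constant-velocity prediction for a joint the camera can't see:
    assume it keeps the velocity measured over the last two frames."""
    (x0, y0), (x1, y1) = last_two_positions
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (x1 + vx * dt, y1 + vy * dt)

# A wrist seen at two frames (30 fps), then occluded behind the body:
nxt = predict_occluded([(0.0, 1.0), (0.1, 0.9)], dt=1 / 30)
print(nxt)  # roughly (0.2, 0.8)
```

Production systems use learned motion models rather than this naive prior, but the failure mode the speaker describes is exactly what happens when the prior guesses wrong during a long occlusion.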
What if the hand is actually, you know, like this? Or what if my hand's actually like this, or I actually had it down? You can't even see it. But if you add multiple cameras to Move AI, it gets really, really good at guessing where the arms and legs should be. You could do a full-on breakdance and it will track it. So are two cameras good, or would you want, like, a third? Eight. Oh wow, so eight cameras in a circle. Okay. And also, this one works with multiple people, so you can have, like, a dozen people all within that circle, and each of them gets custom motion capture; it's smart enough to handle all of the motion capture within the circle. Okay, okay, hold on, let's pause for a sec. I just want to restate so far: Move AI with one singular camera is basically using AI to guess your position in depth, but it's impossible to know for sure what you're doing, because when you turn, it literally can't see parts of you with one singular camera. If you add a second camera it'll know a little bit more, and if you add eight cameras it will have basically a 360-degree understanding of what's going on within that circle. But you're saying that not only can you have one person breakdancing and have it calculate where all his limbs are, you can have an entire scene of people doing, like, combat, and you can capture the 3D motion of all of the characters within it.
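Under the hood, the multi-camera version is classic triangulation: each camera contributes a ray toward a joint, and the joint's 3D position is the point closest to all of the rays in a least-squares sense. A sketch of that math (not Move.ai's actual code; the camera positions are made up):

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares intersection of camera rays: solve for the point
    minimizing its summed squared distance to every ray."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two phone cameras on tripods, both sighting a wrist at (0, 0, 2):
joint = triangulate(
    [np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])],
    [np.array([1.0, 0.0, 2.0]), np.array([-1.0, 0.0, 2.0])],
)
print(joint)  # approximately [0, 0, 2]
```

With eight cameras in a circle, every joint gets many rays, which is why occlusions stop mattering: some camera almost always still sees the hand.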
In theory, you then plug other characters onto those motions. Is that correct? You got it. That's awesome. You see this video here: look how none of them are wearing motion-capture suits. Instead there's an array of maybe eight or nine tripods around them, and each one has an iPhone on it. That's so cool, and it's tracking that level of motion; that's insane. And you see how it says actor 01, actor 02? Each one of those can be any 3D character you can find on the internet, or make yourself. That could be Thanos; that could be a goblin. The only limitation with this tool is that it's currently only good at animating bipedal characters, meaning two arms, two legs. So if you wanted to use this technique to animate, for example, a horse, a quadruped, you would have to use some other techniques, and there are other companies solving for non-human motion capture: wolves, ducks, fish, other creatures. Right. Everyone's talking about text-to-video, about
how this is going to change movies forever. To me, this seems way more likely to be how people actually create movies: they go out in the real world with their friends and film stuff, but they can become any character they want and be in basically any setting they want. I guess what I'll ask you right now, at a practical level, is: if I wanted to create a 30-second intense film with this, what's the stack? What are the requirements, and how would you go about creating this short movie with this technology? Maybe I would have to use pre-rigged characters, but what if I want to create my own character and then become it in 3D? Yeah, I'll show you another one where I did the mocap. There's a music video I did with Lil Nas X, and he has all these dancing robots. I did the motion capture for it, and then we translated that information to the characters. The process we used for these robots, at the time, this was in 2018, was that I had to get his choreographer to wear my mocap suit. I used a Rokoko Smartsuit Pro; this was before you could just use video for motion capture, so you had to put on an actual suit. The suit has all these sensors on it, and it tracks the acceleration of different parts of the suit. Once you have that data, it's usually stored in a format called FBX, and you can then upload that file to any 3D program, like
Blender, Unreal Engine, or Unity. Then you build a 3D character in a modeling app, like ZBrush or Nomad Sculpt or Cinema 4D or Blender, or you can just download one from Sketchfab. Then you try to get this thing called a T-pose. It's a weird pose, but essentially, when you do live-action performance capture, you try to get one pose where the person is standing in the shape of a T, with their arms directly out from their body and their body in a straight line, and then you try to model your characters in a T-pose as well. It's the infamous T-pose; everyone who works in animation or motion capture knows about it. So here's an example of a relaxed pose versus a T-pose. The reason we love T-poses in the 3D world is that they're the easiest for setting up a rig; they're the easiest for placing bones. So when you're in a tool like Mixamo, you can just tap on the shoulder and it knows, okay, that's going to be a shoulder. You tap on the elbow joint, you tap where the wrist is, and it knows: okay, that's where the wrist is, here's the knee, here's the groin, here's the middle of the rib cage. Just by tapping with circles, it can understand that, and as soon as you have a T-pose of any piece of geometry, you can drive it with the performance you captured on your phone. That's the magic. Without the T-pose, it's very, very hard to translate the performance you captured in the Move app or in the other apps.
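Once the joints are placed, each vertex of the mesh gets weights tying it to nearby bones, and the deformation is linear blend skinning: a vertex follows the weighted mix of its bones' transforms. A toy 2D sketch with made-up numbers:

```python
import numpy as np

def skin_vertex(vertex, bone_matrices, weights):
    """Linear blend skinning: v' = sum_i w_i * (M_i @ v), using 3x3
    homogeneous matrices for 2D rotation plus translation."""
    v = np.array([vertex[0], vertex[1], 1.0])  # homogeneous coordinates
    blended = sum(w * (M @ v) for M, w in zip(bone_matrices, weights))
    return blended[:2]

stay = np.eye(3)                    # a bone that doesn't move
lift = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.5],   # a bone translated up by 0.5
                 [0.0, 0.0, 1.0]])

# A vertex weighted half-and-half between the two bones:
out = skin_vertex((1.0, 0.0), [stay, lift], [0.5, 0.5])
print(out)  # x stays at 1.0, y rises to 0.25
```

This is also why a bad T-pose breaks things: if the capture skeleton and the model's bind pose disagree, every one of these matrices is wrong, and the mesh tears apart in exactly the ways described below.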
There's one called DeepMotion that does it, which I used to use. If you don't have a T-pose and you try to just send that data to the other character, you'll get what happens in some video games: what's called a broken rig, and they look hilarious. Have you ever seen things like this in a video game, where the rig of the character breaks? Yes, yes I have. This is like Rigging 101; all of us had to go through it and have a disturbing broken rig, where the eyes explode out of the face, or the teeth leave the body, or the neck ejects forward. These happen when you don't set up your rig properly, and usually the easiest fix is redoing the T-pose. For example, I can just tell from this one that whoever made it did not set up a T-pose at all. I know this looks totally random, but someone in 3D can explain what's happening here: the left arm does not have any T-pose, but the left hand does; the hand is in the correct spot. So if I were going to fix this person's rig, I would keep the hand data but redo the left arm and the right arm. In the second one, her chin doesn't have any blend shapes; that's why her chin is left behind at its origin point, while the rest of her face looks like it does have good blend shapes. And then the arms and legs, you know, these look so silly. I love broken rigs; they have a bit of nostalgia for me from my early days
of learning 3D. They can be really funny, but also really distracting if that's not part of your storytelling. It does look hilarious, I will say, because their movements start getting super chaotic. Yeah, this is a great one. Look at this image: these are the equivalent of those metal wires. Behind every 3D character is something like this. These are called bones, or joints; each one of these little paper-plane-looking shapes is a joint. If you place them strategically around your model, you can use them to bend and warp your character. For example, that little one in the middle that looks like a pointer: they would tie that to the pupil, and that would allow an animator to have the character's eyes look around. But I don't think people are going to need to know this anymore; that's the joy. With AI right now, we're getting to the point where all you're going to need to do is the performance, with your face, on your phone. You don't have to know exactly how to rig it up, or how the armature was set up, or where the bones were placed. And for those of you that are excited about this, there's actually a website, it works right now, I've had some struggles, but there's a website that can automatically rig any geometry for you, so you don't have to build those rigs in Blender or Maya or Cinema 4D. It's called Anything World. I just started using it last week, so I haven't even uploaded a video about it yet. Yeah, this is Anything World; it's nuts. So what processes does Anything World handle for you, or what part of
it? So it automatically creates the structure within any 3D character? Bingo, it's creating all those bones, and they have a button you can press that turns it into animation. They can rig almost any kind of body type, so you can have a beetle with three legs, or a quadruped like a horse with four legs, or you could rig a fish with no legs. The white skeleton you're seeing in there is what it's adding to your 3D model, and the animation you're seeing is what they're claiming they can do. I've struggled to get my characters to animate with Anything World so far, but I think that's just because I gave it bad models; that's what I'm trying to work on this week. Would you come back on the podcast with the team from Anything World? I'm sure we could get a hold of them. Oh yeah, we should do that; that sounds so fun. What I want this podcast to be about, what I really want, is to make things practical: give people something they can try. So I'm trying to figure out what the first steps are, because we've talked about Skyglass so far, which is basically a digital green screen that can completely change your character and background using Unreal Engine, and we've talked about Move AI, which lets you use your camera. I'm just trying to figure out: if there were one takeaway, one thing someone could try from watching this video, what would that be? I've tried Skyglass and I think it's awesome, but I want some sort of project to work on. I would say the main way, if you want to
feel this experience of making stuff in the 3D world, I would start by playing with maybe even this app, Anything World, and just try to rig a character. Then, once you're done with that character, bring it into Unreal Engine and see it running around. You can literally make your own video game character for Unreal Engine. Do you need a super powerful computer, a ton of memory? What's the deal there? Unfortunately, yeah, you do need to have a pretty high-end PC. I've tried running it on Mac computers and it doesn't seem to work as well, so I usually recommend a really nice graphics card, like an RTX 3090 or above, as the go-to, in a custom-built PC. You can run it on other laptops; it just won't run at the frame rates you'd like unless it's a really powerful machine. Here's a good measure: if you can play a PC video game at full graphics on your laptop, that laptop can also run Unreal Engine pretty well. And if you can't play a good, you know, AAA-quality game at full specs, that computer is likely not going to be great at running and building stuff in Unreal Engine. See, the first thing I think of is: what percentage of people have a computer that powerful? Probably not many, like 10% of people. The point I'm really excited for, just looking at the progress of all the chips, whether they're Apple or Nvidia, is that it seems like
within maybe a few years, all the new Apple MacBooks will be powerful enough to run Unreal Engine. Or am I tripping? What do you think? No, they are right now. Apple currently nerfs Unreal Engine on Mac computers, because I think they have a beef with Epic Games. They totally could run it. Really? Even the newest Mac, the M3 chip, you think could run it? Totally, yeah, because I can run other 3D applications on my Mac that are heavy and would chug on my PC, and the fact that they run tells me it would work. So when Apple chooses to have Unreal Engine not work properly on their devices, it's a choice. It's not because the hardware can't do it; it totally can. There's no excuse at this point; they're choosing not to. Yeah, it's really selfish. I'll keep using a PC for my Unreal Engine stuff, and I use a Mac for everything else; all the other 3D stuff I do is on my Mac. Think about all the people out there who create content: they can pull out their phone and just make content. It's so simple, it's so easy. And if you want to create really good content, you have to do a few other things, like edit a little bit. So I'm trying to think: at what point will you only have to do a little bit to create a movie in whatever scene you want? Because right now it seems like you still have to do quite a lot. You have to get a nice PC, right?
And so if you have a nice PC, you still have to do a lot: you have to learn Unreal Engine. I downloaded Unreal Engine before, granted, it was on my Mac, and I was like, oh, this seems really complicated and hard. But maybe I could get it in a week? Like, if I just put my head down for a week and tried to learn it, I could probably get decent; I could learn how to rig a character and use it in Unreal Engine. Do you think that's true? I still struggle with rigging in Unreal Engine. Oh, damn. Just kidding; you can do that, it's just a big time commitment. That's why I was so excited for Skyglass: I didn't have to think about rigs, I didn't have to think about cameras, I could just open an iPhone app and run it, and their interface let me choose. The only issue, in my opinion, is that you can't create your own character, and people want customization. The novelty of it is cool; the video you made is cool because you're doing something people couldn't previously do, or doing it more easily than they could in the past, and that's cool. But I want to create a character and then create a ton of content with that character. I've created characters using FaceFusion, which I think has been my favorite tool I've used yet. It's literally a tool for creating deepfakes, but it's the easiest way to put a face onto my face, and I created Chad. I don't know if you ever watched the videos with Chad. I've seen some
of your character, yes. Yeah, and then it's super easy to use ElevenLabs to change the voice; in fact, maybe in post I'll do it right now, as I'm speaking, as Chad. I think it's super engaging. I think the reason that technology doesn't get a lot more attention is because it's so dangerous. Like, we saw Biden deepfakes come out last week with the PBS logo on them, and then PBS came out and was like, we never released a video of Joe calling everyone a bunch of... There are a lot of dangers with it, but it's such a powerful tool, because you can literally generate a character on Midjourney, use that Midjourney character's face as a reference image, and put it over your face. But what you're talking about here, and what I'm excited about, is the full-body version of that, where you can just create a character using AI, natural language or a reference image, whatever, and you can design your own world, design all the characters, and then basically map it all over your video. That's where I think we're heading right now. I think that's the next huge era, and I think it's going to bring in a lot of people who currently hate AI, because they'll realize they can use their current equipment and tell stories in their own unique way. And there's still a ton of skill involved: no matter how good the technology is, if you push it further, there's always taste, skill, and the little extra things that make it amazing. That's the point I want to get to. I guess I'm excited for people
to just start creating content on it well I'm excited to say you can to your same point I wanted to make custom 3D characters on Unreal Engine but the learning curve was is still too steep so what I'm showing you right here is I built this character and I'm running it on an iPhone and I built that from scratch I painted that from scratch I Made that hair so you can also do this without Unreal Engine is what I'm trying to say this was running on the application lens studio if you ever use Snapchat and
have ever worn a face filter that's the barrier to having access to a tool like this lens studio is the software that I use and anybody who's built an effect on Snapchat had to use lens studio and that one's nice cuz it's like a really robust 3D program much easier to learn and they have Fantastic templates for everything this is lens studio so you could build a whole augmented reality character or experience and run it on any phone that can run SnapChat and then you hit publish and now anyone with SNAP can use your filter
or you can make it private and only you can use it and so this was this has been my way to create a repeat character that I can keep going back to again and again again without having to know how to set up stuff in unreal they Support face tracking they support finger and hand tracking they support body tracking so you can create full body characters and have complete control of the face the hands the body and you could add voice changers like you said with 11 Labs if you wanted to create different you know
voice characters out of your stuff. I've even done style transfers on top of this footage: using the augmented reality as a base and then doing a style transfer on top of that video with AI to create a completely new experience out of the same kind of content. Before you go on to examples, I just want to make sure I'm getting everything correctly. So Snapchat, to my knowledge, differs from a lot of these other companies' tools because they have an open SDK, so other companies can actually use Snapchat's video to power their tools. And you're saying that, in any tool that runs Snapchat, you can create these full-body filters? It's just a full-body filter of a character you create, and you used Lens Studio to create it? Correct, Lens Studio. Got it, fascinating. It gives you complete freedom to build anything. And to build this character, there's an even friendlier tool called Ready Player Me. Ready Player Me lets you build out that avatar without having to have a sophisticated 3D background, so you can go try the avatar creator. This is firing me up so much. Oh man, I'm about to make some fun content over the next month. Dude, it's endless. You can make anything at this point, and the limit will just be what people are able to imagine, so if you have a weak imagination, that will be the only problem. The excuse of the technical barrier, I mean, that's gone. I'm moving toward a world where my technical skills are going to be irrelevant. That's why I teach and train all of them for now. I still... you know McKay Wrigley? He's a coding guy on Twitter, and he posted this tweet, I wrote it down, and it's exactly what you're saying. It said: you will be less bound by your raw technical skills and more by your ideas, execution, and willingness to work hard. And I was like, that's true across the board, because I'm seeing this in the 3D community, in the general AI art community, and in the programming community, which is why
I was motivated to create this podcast. Because the other day, when I saw that thing render your website right there on the right side of your screen, I was like, oh my god, that's what this code is doing. And then you realize you can just brute-force create something, just because you have access to the information. It's all there, and you no longer need to go to a library, drive there, look for the right thing, find it, understand it. You can actually just brute-force ask AI to create anything, and then, if you want to go beyond that, you'll need to learn it. But you can create it and then learn it after. You can be like, okay, break down this code, or, okay, we just created this character. You can take a screenshot of your character, and pretty soon most of these models are actually going to have video inputs, so you'll be able to just record your screen for a minute, show them what you just created, and say, okay, this is where I'm at right now. Plug it into AI, and it'll use all the knowledge it has of the 3D software. You plug it into AI and ask, hey, what's the next step? Or, show me a visual diagram of the things I can do next, the things I can add to the character, and then the things I can do with that character. To me, it's the ADHD dream. I feel like ADHD people naturally seek novelty over checking off to-do-list boxes. Normal people like to get things done; ADHD people like to discover new things, they're stimulated by novelty. The people who can harness that search for novelty are going to be able to explore and get better at so many things. Anyway, I just wanted to say that because I feel it in my heart. I genuinely believe this is the era for dreamers to create whatever they want, which is why I bought the domain create.in. I'm going to build a whole bunch of stuff later. I'll talk
about that, but that's what fires me up. Anyway, you nailed it. I couldn't agree with you more: now is the time for creators and dreamers. This is the time they've been wanting. It's here, and we're in it right now as a society. I think this is important to talk about really quick, but I feel like a lot of people with raw technical skills feel threatened by AI. I just wish those people would try it out for 30 days. Even if they're cynical about the technology, the morality of it, whatever, just for 30 days say, you know what, I'm going to see what this is all about and push my creativity to the limits. I think so many of them would be like, oh my god, there are more possibilities, there's way more opportunity and skill in this than I previously thought, and I'm not defined by my niche little job of animating legs. You don't need to be defined by narrow roles; you can do more. Anyway, spot on. The only thing is, I've noticed specialists are terrified by all these tools, while generalists are empowered by them. I've gotten into a number of arguments with some of the people I trained at DreamWorks, because I was a specialist trainer, I trained our specialists, and then I'd make a video showing how you can now do a lot of those specialist things with different AI tools, and that's created some arguments in my professional life. And I explained to them that, you know, it shouldn't just be a specialist thing. Rigging shouldn't be available only to specialists. Rigging can allow anybody to bring characters to life. Why should that be a specialist skill? I feel like we have the tools now where anyone can make characters, so why should we not allow that? And their reactions would be, you know, mixed. Yeah, absolutely. And I noticed this actually with my most popular YouTube videos, my last three videos, where I was coding without knowing how to code,
and I noticed that the more senior developers were actually happy for me. They were like, oh my god, to see you reach that first point where you deployed a website with a backend, where you smiled... I created a notes app with a backend, so it would store the data, from scratch, without writing any code. It took me four and a half hours, but I brute-force learned it and created that website. And I've noticed the senior developers, like the people making probably $300k to $500k plus, are hyped. They're like, oh my god, seeing you reach that point is so cool, awesome job. Whereas the people at the beginning of their careers who feel threatened by it are the ones doing a lot of the hating in the comments. The people who are just starting out don't realize they can harness this tech, whatever field they're in, to push their way to the front. If you get angry at the system, well, this AI tech is a system-destroying technology, in the sense that it empowers all people. The super rich aren't going to have access to a better AI than everyone else, because there's no more profitable market than selling AI to the masses, to every single person on earth. In my opinion, you'll have some extra tools and hardware that are better, but ultimately this is a level playing field. So I think a group of a hundred, or a group of 30 motivated art students, can take on Disney in a certain movie or a certain category if they really put their hearts into it. So anyway, I digress, let's hop back in. Oh, I don't think that's a digression at all. Actually, I want to give you an example, one that proves that, with Disney. Have you heard of the small studio Kugali? Uh-uh. So they challenged Disney, and won. It started off as a small comic studio in Africa. They made a video a few years ago basically saying how these three guys started a comic studio, and then different people were bringing African lore and fantasy
and sci-fi stories and publishing them through Kugali, and they started getting a lot of traction. Their comics grew in popularity. I think they started off in Lagos, Nigeria, and they're doing really well now. And they made a video calling out Disney, saying Disney has nothing on how large and how wide-scale the storytelling in Africa can be, and, I quote, "we're going to kick Disney's ass." They said that in their video. Disney sees this, one of their lead writers and producers reaches out to Kugali, and it becomes basically the first time Disney has reached out to a third-party studio in a hundred years. They reached out, and Kugali made the most recent film that's on Disney+. It's called Iwájú. So Iwájú was made by Kugali and Disney together. They co-produced it, so it was animated by Disney artists, but the story, the culture, and the writing all came from that small comic studio in Lagos, Nigeria. What was so insane about this is that a small studio can take on Disney, and that's why I think Disney was like, oh crap, we'd better just work with them, otherwise we're going to get left in the dust over time, because of the tools. Before, I used to think, and this used to be true, that you had to go to the studios to get access to all these tools and software. That's not true anymore. Some of these programs I have on my desktop, like Adobe Substance Painter... oh, I guess I'm not sharing my full screen, just my window here. But with a lot of the 3D tools that I use, I was so confused at first when I went to DreamWorks. I'm like, wait, that's the same software I have at home? I thought you guys would have all your own software. And there is a lot of custom stuff, sure, but a lot of it is the same software I had at home. And I'm sitting there like, wait a second, you're telling me I can make a DreamWorks-quality film with stuff I have at home? That's interesting. And that same level of discovery, I think, is happening across
all of the creative industries right now. We're realizing that the stuff we thought was reserved for only the highest-end video games, films, TV shows, and books, the stuff we thought you needed to be at a specific place for... no, you can actually be anywhere, with any sized team. You could be a small team making quality that looks like what you would previously have assumed only came out of a large studio, and I give Iwájú as a perfect example of that. What I'm wondering is, what will the cultural epicenter of that be? Because if you think about it, this was a cultural phenomenon. This group created a film, and then Disney... it was probably the dream of their lives to work with Disney. But Disney is kind of hijacking the culture a little bit by swooping in and catching a great story, a good movement they'd created over there. And I'm wondering what happens when that's no longer necessary. It's almost like having your movie done with Disney is a stamp of approval, so what will the next stamp of approval be, once these movies get to the point where they're as high quality as Disney's? I think people will actually be able to create stories that connect with a much larger audience. I think they'll have a better sense of what's going on culturally, and they're going to make movies for the next generation better than Disney, because Disney has to adhere to its narrative rules, or whatever narratives they're trying to push. I'm not trying to get political whatsoever, I'm just saying I think there are going to be more culturally relevant films created by independent people. So what happens when you no longer need to go to Disney? How do you actually get that stamp of approval? Like, if there were a crowdsourced platform... do you know what I'm trying to say? Well, I think what you're saying is that Kugali having to work with Disney to get their film made at this level will
maybe be a thing of the past in the long run. Because I think the fact that Disney broke its own business practice, working with an outside studio for the first time in 100 years, means Disney saw this as, oh, we'd better not get left behind. That's how I saw it in my head. This was Disney trying to say: don't leave us behind, we can help make films, we're not done, we're still here. That was kind of my sense of it. But to your point, I think we're moving into an era, in the next 5 to 10 years, where you may not need to go to Disney Studios to get a film like this made. You'll be like, no, I can just use my open-source tools, publish it directly to my community through my social media platforms, and own all the distribution rights. You know who did this recently? Taylor Swift. Did you hear how she made her film? Yes, I did hear about it. The only difference is Taylor Swift is bigger than Disney. Well, she's not actually bigger than Disney, but Taylor Swift is not just a person or an artist, she's a platform. She is a community, a platform, and a brand, all combined into Taylor Swift. I think a group of kids in a different country, or a group of people in another country, can get as big as Taylor Swift, and I think she's paving the way for other creative people to come in and do the same thing, which is awesome. I'm on the same page with you on that. Maybe you're right, she is like a platform in that sense, but I was trying to get at the idea of owning your full distribution, owning the means of marketing, owning the channels. I think that's going to happen. Also, I think we'll start mixing things: eventually you'll have brain-computer interfaces you can wear on your wrist, and while you're watching something, you'll be able to create something based off your feelings. Right now we have a lot of these text-to-video, text-to-image, and text-to-audio models, but how cool would it be if the text were based off of
your own neuronal responses? I would love to send you a futuristic message, Riley, like a song I want you to think about. I just imagine the mood and the experience, it takes that as text and makes music out of it, and then it sends it to your headphones, and we almost telepathically send music to each other through technology. I see that world coming online, and not just for music or video. I would love to see whether someone without a storytelling background could do storytelling: could they just feel the story in their head, and as they're thinking, it generates a Disney-quality animation? And then they distribute it to a choose-your-own-adventure platform, so the story is never the same, depending on who's viewing it. I'm excited for there to be completely new kinds of media that just weren't possible before now, when everything had to be curated and custom made. Now we can probably get to a world where it could be anything we want, anytime. I'm fired up, my brain is lighting up in so many different ways. I think a solid way to end this would be... were you about to show something before we went off on that super long tangent with the McKay Wrigley quote? You pulled up a website, and I want to make sure we don't lose that thread, if you want to show it. Oh yeah, Ready Player Me, let me share that. Okay, yes. So that character I made here, I basically made all of it with Ready Player Me. The only difference is I made the hair myself with a different tool, and then I painted it, I painted the face separately, but everything else I made with Ready Player Me. The website is called Ready Player Me, and it's basically a video game avatar that you customize with different outfits. You can scroll left and right for different looks, we can go to the skin and change the skin color of your character with a click, and you can adjust the eye
colors, eye shape, and eyebrows. Let's give them red eyes over here. So you can make your own custom character, and they have a wardrobe section where you can give them different outfits. And then when you're done, I'm just going to hit next, it saves this as a 3D file. So yeah, there's my character: download avatar as a GLB file. I download that, and now I can start using it in my animations. And what app could you use that in right now? In any 3D app. Oh, any 3D app, got it. So we're in Blender now. I'm going to hit X to get rid of that cube, and then we'll go to File and import that file from Ready Player Me. It was a GLB file, so I go in and load a GLB. There it is, import. Okay, and what you're seeing here is the character, but you're going to see all sorts of other data in here too. So there's my whole character in Blender. And that's the character you just created? Wow, that's super detailed. Yeah, it has all that texture and stuff. By default it comes in with this as the preview, and in this mode you won't see all that cool stuff, you'll just see the wireframe. So I go in and turn on this one, which shows materials, lighting, and so on. And then I'll be like, cool, I like this character. You could literally start building stuff in Blender right now if you want, and Blender is a free program, so you can start building a world for this character. But I want to get to animating this character, so I go to File, then Export, and I'll export this as an FBX, which is usually the format that works really well with Mixamo. It's done. Now I go back to Mixamo, and under their Upload Character button I should be able to select that FBX. There it is, and we'll hit Open. Awesome, we got in, and, oh yeah, he's alive. So now you have all these motions.
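As an aside on that GLB-to-FBX hop: a .glb file is "binary glTF", which opens with a fixed 12-byte header (the ASCII magic glTF, a spec version, and the total byte length) followed by JSON and mesh chunks. A minimal sketch of reading that header in plain Python; the packed bytes below are a stand-in for a real Ready Player Me download, not an actual avatar file:

```python
import struct

def read_glb_header(data):
    """Parse the 12-byte binary-glTF header: the ASCII magic 'glTF',
    the glTF spec version, and the total file length in bytes."""
    magic, version, length = struct.unpack("<4sII", data[:12])
    if magic != b"glTF":
        raise ValueError("not a GLB file")
    return version, length

# A fake minimal header standing in for a real avatar download:
fake_glb = struct.pack("<4sII", b"glTF", 2, 12)
# read_glb_header(fake_glb) → (2, 12)
```

A quick check like this is handy when an import silently fails: if the magic isn't there, you grabbed the wrong file (or an FBX), not a broken avatar.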
You can just click through the motion library, but these are all pre-made animations that people have already made: they wore mocap suits, and that's how these animations were made. If you want to use this with your own motion instead... I'm just going to view the skeleton. There we go. Here's the skeleton. You'll always notice one skeleton bone coming straight out from the groin; that's usually the origin point for the character, so it knows where the origin is. But everything else is the character's bones. What I usually look for in here is the T-pose, so I go to T-pose and click on it, and you get that awesome T-pose. This is like a neutral character. What is that sphere? That sphere is, I think, a collision sphere. It's how they detect whether a character is touching another object. There are usually invisible objects around every video game entity. So when someone walks into a room and the door opens, they usually use a trigger, called a collision trigger: when this sphere collides with a door entity, it plays the door-open animation. Instead of having to calculate the actual character touching it, which is a really complex piece of geometry, they'll put spheres around different parts of the body. That's how I would use these, for example, to know when the knee, an elbow, or the top of the head is making contact. You know how in video games, if you shoot a character in the face it's a headshot, but if you shoot them in the body it's not? Mhm. That's because they have invisible collision spheres all throughout the character's body, and they assign different point values to colliding with each of them, if that makes sense. What I probably should have done is hit X on those spheres to remove them before importing the character. And I didn't just hide them away, they're actually deleted, they're not there anymore. Right, so when you export, they're not present? Got it, that makes sense. I would hope so. Let me see: left eye, right eye, body, hair. Yeah, all that looks good. Teeth. That's all I want.
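The sphere test he's describing is cheap on purpose: point-in-sphere is a single distance comparison, versus intersecting a ray against the full character mesh. A rough sketch in Python, with made-up sphere positions and body-part tags standing in for whatever a real engine stores:

```python
import math

def sphere_hit(center, radius, point):
    """Cheap containment test engines use instead of exact mesh collision."""
    dx, dy, dz = (p - c for p, c in zip(point, center))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius

# Hypothetical hit spheres tagged with body parts (positions invented):
HITBOXES = [
    ("head", (0.0, 1.7, 0.0), 0.15),
    ("torso", (0.0, 1.2, 0.0), 0.35),
]

def classify_shot(point):
    """Return the tag of the first sphere the shot lands in, else 'miss'."""
    for name, center, radius in HITBOXES:
        if sphere_hit(center, radius, point):
            return name
    return "miss"
```

The headshot-versus-body-shot scoring he mentions falls straight out of the tags: whichever sphere the hit point lands in decides the damage.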
Oh, there it is. Okay, so this time: after you've created your character, when do you place the joints in Mixamo? Or does that already happen in one of the previous steps? Because I know that when I've done this, you have to drag markers onto the elbows, the wrists, the knees, and so on. Or is that not part of this process? No, this is that process. I don't know why it skipped that section last time. So I hit next, and now you say: this is where my character's chin is, this is where the wrists are, this is where the elbows are going to bend from, these are the knees. Oh wait, is that the knee up here? She's wearing some weird knee guards, I think those are the knees. No, that's way too low to be the knees, I think it's right... yeah, right there. This one, or lower? She's wearing knee guards right there, probably. Okay, great, and then for the groin, just go ahead and... right there. They have different skeletons. If you ever want finger animation, you have to choose 65 bones, because that actually puts bones in the fingers of your character. If you don't care about finger animation, you can just say two-finger chain, and it will put in one bone to control all four fingers. You get a flap. Yeah, a little mitten. Little oven mitts. It just depends on what kind of performance you need, because the 65-bone rig will run slower and this one will run faster.
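To make the "bones in the fingers" idea concrete: a bone chain is a series of joints whose rotations accumulate down the chain, so more bones means finer control but more joints to evaluate every frame. A toy 2D forward-kinematics sketch in Python; the bone lengths and angles are invented, not Mixamo's values:

```python
import math

def forward_kinematics(bone_lengths, joint_angles):
    """Walk a 2D bone chain from the root: each child bone inherits its
    parent's accumulated rotation, which is what a rig hierarchy does."""
    x = y = angle = 0.0
    joints = [(x, y)]
    for length, theta in zip(bone_lengths, joint_angles):
        angle += theta  # child rotation stacks on the parent's
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        joints.append((x, y))
    return joints

# A straight three-bone "finger" pointing along +x (lengths made up):
straight = forward_kinematics([2.0, 1.5, 1.0], [0.0, 0.0, 0.0])
# Curling every joint 30 degrees bends the whole chain around:
curled = forward_kinematics([2.0, 1.5, 1.0], [math.radians(-30)] * 3)
```

The two-finger-chain option is the same math with one bone per hand doing the work of four fingers, which is why it animates like a mitten.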
But I don't know, let's go 65. We hit next, and now it's spinning around the character, rigging it up. I only do this so I can get that T-pose out, and as soon as you have that T-pose, you can use that Move One app on an iPhone, film yourself in any way, and get this character to do those movements. Okay, this is very actionable, because I have all these apps on my phone. So what you just said, basically, is: I go to Ready Player Me to create the shell of the character, then, if I can't export it straight into Mixamo, I go to Blender. In Blender I get rid of the spheres and that icosphere, so I select them and hit the letter X to delete them, then export the character as... what type of file? FBX. An FBX file. From there we go to Mixamo: in Mixamo, upload the character, and then you add the, what's it called, the joints, the chin... oh yeah, the rig, the skeleton, the bones. And from there, this is basically where we're at, right? I've done it on Move.ai: I created that character, and it automatically creates that white skeleton with whatever their default movement is. So what's the move there? How do we swap in our own characters? That's what I'm curious about. Yeah, so the key is the T-pose. In Mixamo I search for an actual T-pose, and they have this little witch dude, so I grab that character and download it. I don't even take the animations out of here, because we're going to make your own animations, but we do need that T-pose. And this is the character we created in Ready Player Me? Correct. Because the Ready Player Me one is in an A-pose, its arms are kind of in an A shape. That can work, it's just a little buggy sometimes, whereas T-poses work a lot better.
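The A-pose versus T-pose difference is essentially a shoulder rotation: in an A-pose the arms hang diagonally, and rotating each shoulder joint up levels them into a T. A small 2D sketch in Python; the 45-degree figure is an assumption for illustration, not a Ready Player Me spec:

```python
import math

def rotate2d(vec, degrees):
    """Rotate a 2D vector counterclockwise about the origin."""
    t = math.radians(degrees)
    x, y = vec
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t))

# A-pose: the arm points 45 degrees below horizontal (assumed angle).
a_pose_arm = (math.cos(math.radians(-45)), math.sin(math.radians(-45)))

# Rotating the shoulder +45 degrees levels the arm: a T-pose.
t_pose_arm = rotate2d(a_pose_arm, 45)
```

This is also why the retargeting later in the episode glitches on an A-posed character: if the rest poses don't match, every frame of motion is applied on top of the wrong baseline rotation.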
So now we get our motion capture data from Move One, the single-camera motion capture app. I tried the free iPhone app. Looks like I have to record some motion real quick. Do it. Cool, I have the phone set up on a tripod, and I'm going to set a 10-second timer so I can get my full body in. You want to make sure your full body is in the frame of the video: if your arm leaves the frame, that's not going to work. You need to be fully in the video frame the whole time if you're using one camera; if you're using multiple cameras, you can leave one frame because another camera will catch you. Okay, so I stand in a T-pose, move around a little bit, go over there, and then we're done. I hit stop, and now I'm going to upload that footage. It says: your video is uploading. So the way their app works, I guess, is that you upload a video, they do the motion capture on their cloud server, and then they send you back the data, and that data is what you need in Blender. Got it. Do you know how long that process takes? Usually like a minute. Nice. So you can download that data and plug it into Blender. Is that a super hard process, or relatively easy? That's what I'm double-checking. So, I actually don't use Blender; I was trying to give you a free alternative.
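That upload-process-download round trip is the standard cloud-job pattern: submit the clip, then poll a status endpoint until the result is ready. Move AI's real API will differ from this; the sketch below only shows the generic polling half, with the status check injected as a function rather than any real endpoint:

```python
import time

def wait_for_result(check_status, poll_seconds=5.0, timeout=120.0):
    """Poll a cloud job until check_status() returns a truthy result
    (e.g. a download URL) or the timeout expires. check_status stands
    in for whatever the real service's API actually provides."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = check_status()
        if result:
            return result
        time.sleep(poll_seconds)  # don't hammer the server between checks
    raise TimeoutError("mocap job did not finish in time")
```

The "usually like a minute" he quotes is exactly why a timeout and a polite poll interval matter: the data arrives quickly, but not instantly.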
The process is called binding: you bind the mocap data to that T-posed character. How much easier is it with your other, paid software? Super easy. Can you just show us that? I want the easiest method, not the cheapest method, and I know a lot of the audience might not like that. I want to create videos with this method. Whatever we talk about on these podcasts, I want to do it as soon as possible and get a feeling for it. So if it's somewhat easy in Blender and you can show us now, do it, but if it's very difficult... Blender? Yeah, it's still kind of difficult in there. To be honest, maybe an even easier approach is to do it entirely through Lens Studio, because they have a body-tracking template, and then you can reuse the character a billion times. Versus the way I'm showing you, where you record your motion data with Move One but can't reuse it; you'd be using the same data every time. Whereas in Lens Studio, this little yellow one here, by Snapchat, if we just brought in that T-posed character, he'd be ready to animate immediately, without us doing anything. Oh, so you can just take a video of yourself and be that character, basically? Yeah, it opens up your webcam and you immediately see it: if I move my hands, it should move its hands. Wow. Yeah, as I'm saying it out loud, I think that would be the easiest. So we're in Lens Studio. Up here in the top left they have a Templates tab, and if you hover your mouse over a template, you see a preview of what it is. This one is for face performance; this one is finger tracking, if you have a 3D model you want to track to your hand. Let me see which one is full body... there it is. This one's called Try-On, and it lets you wear virtual clothing. That one might work, but I think there's
one that's even easier. This is reminding me a little bit of CapCut templates, where people package up complex editing, but a 3D version of that, which is cool. That's a great metaphor for it, I agree. Effect House... I've heard a lot about TikTok's Effect House. I don't know much about it, but it seems somewhat similar. Kind of, maybe not, I don't know. This is the one you'll want, it's called Body Mesh, that's the default one. So I click on Body Mesh. But yeah, TikTok's Effect House is based off of Lens Studio, though it can currently do less; Lens Studio has been around longer, they've had more time to develop it, so there's more in it. It's loading the project. You've given me more homework than I can handle. I have to try like 11 different tools after this. I'm so sorry, I just got excited about the possibilities. It's funny, we didn't even get to Simulon, which is okay, we can talk about that another time. So, yay, we're in Lens Studio and we're running the Body Mesh template. Okay, so this is a default person, but I don't want the default. Okay, so here's what it does: they have these sample videos up here so you can test your effect on other people before you wear it yourself. I always do this while I'm testing. Then, once you're done testing, you can literally click Send to Snapchat, it sends you a test link, and you can film with it on your phone. It's that friendly; it doesn't get friendlier than that. So now we just have to swap out this template's character for ours. In this case, all of that is under this sci-fi suit body. So that is this template, and it's turned on, that's what we're seeing here. If I turn it off by unchecking it, everything vanishes. The second one I have is a Move One body, but it's a single body, so I'm going to turn that one on, and now I have one body in
here and you can see what it does all right so with each one of these the key is they have to have a body tracker on which we already have and all we need to do now is just input import our model into here so I'm going to go and import it click on add new you can choose import from files and then I'll grab I I'll try to grab the first one that we did even before we did the mixo stuff because I think that one might Just work by itself which would be even
easier you're just making stuff in Ready Player me the first you created a character in Ready Player me and so prior to doing any of the mixo stuff you think you might be able to just import that do GB file before you took it into blender and turned it into a fbx file correct that's my hope right now um cuz GB is usually great for that so let's see what happens I'll hit open it has a bunch of settings on here I ignore Everything so I'm just going to hit import and you're not going to
see anything update right away you'll just see that it's loading there's our file if we expand it we can see that all of our textures loaded up from Ready Player me so we didn't have to paint any of this and the mesh loaded up so these are all the different parts of our character the eyes the head the teeth the hair the body everything's in there so that's a good sign if this worked out as well as I hop we should be able to do is just drag it underneath this body 3D body tracking so
Right now there's a body mesh, but it's separate, so let me just drag it in as a child. Let's go to 3D Body Tracking, Match Hierarchy, ah yes, and then we grab the object. Oh, weird, I don't see the... oh, I guess, is it this object? Done. Nice. Now, some of the issues we're seeing are because the character is not in a T-pose. You see how our character came in as an A-pose? Yes, the arms were in this shape. That's why this is happening, where it's passing through itself like this.
If I had first moved the character's arms into a T-pose and then did the match, I think it would be fixed. So let me see if I can undo that, rotate along that red axis as close as I can, then go to the right arm, grab the red axis, and rotate that as best as I can. Okay, now I'm going to try the 3D Body Tracking Match Hierarchy again, choose the same Ready Player Me object, and hit OK. Okay, it's fixed! Let's go!

Wow.

Yeah, now that character is alive, and we can swap that footage out with my live-action camera. So now I'm in my office.

No way. So now I could be like, woo, and so could you. Could you, in theory, remove everything except for just that character, or any background? Okay, cool. And you could do a green-screen type thing?

Yeah, let me see. I think we should be able to do that in here. If I click on the plus button and add a mesh, I'm going to add a plane and drag that plane into the scene. Okay, there it is, and then we'll just scale it up.

Oh, I see what you're doing.

And I'll scale it this way, that way, then move it, and we'll move it back behind my character. I'll scale it up even more, and now we've kind of made a green screen, so now we're out of it. Then we just have to replace that with a different color. Under that material, which is set to Default, you can add materials.

Or you could even do, like, a static image, right? Like if you wanted to create a background, could you do that? I'll give you both angles for reference. Yeah, so that's working. Cool.

Yeah, let's see. That object, okay, it's called a Scene Object, and under the default material you could choose a material, so we could upload our own image. Here, just to be weird, I'm going to use the spacesuit as that image, and now that entire spacesuit is the background.

Nice. But that could be any image, as long as we import the image here, right?

Yeah. So if I hit the plus button and go to Import From Files, I'll go to Pictures. I had this AI image from earlier this week from Midjourney.

Nice.

And if you want your friends and your followers to try it themselves, you'd click Send to Snapchat. Now it's sending, and it should give me a temporary link. I could, in theory, send you that link right now through the chat, and you would be able to use this on your iPhone through Snapchat.

Through Snapchat? Wow.

Let me check, I'll test it out. So I'm clicking Sending Lens and waiting for that to turn into a green check mark.

It's going to send that whole thing? Like, is it going to have the background, or just the character?

Both. Everything that's in that scene will be part of it.

Wow, okay, so it's like a whole scene filter, basically.

Lens sent. Okay, it says your lens is ready, and now I open it. Okay, cool, it works. Then let me flip the camera. Awesome, now it's running on Snapchat. So you could, you know, make a full 3D character within a few seconds
and then deploy that character as a Snap lens and send it to your followers, which is mostly what I would do back in 2022 when I was using these. And then the fun feature that they removed, but maybe they'll bring back one day, is that after you published a Snap lens like this, you could use it in your video calls. When you're in a FaceTime or a Google Hangout or a Zoom call, you could appear as that character.

That's great. So could you run this in, like, OBS and keep that as a virtual camera? Because you can use your virtual camera as a webcam, so if you just had that running, you could literally take all of your calls as that character, which is hilarious.

The only thing that's missing here is I didn't show you how to get face performance in here; that's a separate process. But at least we have body tracking.

But you can do that?

Yeah, you can get faces in there. What I would recommend, if you do want to do the face, is to always just start from the templates, because I don't like to code or know how to do those things. But yeah, you can stack templates together and build a Frankenstein: this template's good for body tracking, this template's good for hand tracking, and this template's good for face tracking.

Yes, I would love to get a folder of those and throw them together. I can just put them on a website somewhere that people can go to and say, okay, I can use this for this, this for this, this for this. As a team, as viewers, as users of these tools, we could collect templates that make certain things easy, and I think that would be really worthwhile and fun for getting a lot of people using this tech. And then I think the other option, too, is you mentioned TikTok's Effect House. If more of your followers are on TikTok, they would prefer that one, because maybe they don't have Snapchat.

I think Snap's is the best of the free tools out there.

I love it. Wow.

Yeah, but sorry, that was like a little webinar.

No, I love this. This is exactly what I want for this podcast. So, in summary: we talked about Skyglass. Skyglass allows you to do green screen and have a background that is an Unreal Engine environment, and if you can figure out how to create an Unreal Engine environment, you can import it into Skyglass and use it. You can also change yourself into a different character; how to get your own character into Skyglass is something I'll have to discover and talk about in the future. We didn't talk about Simulon, and we can save that for another day, even though I was fascinated by your Simulon videos. We talked about Move AI and its different options, and how you can use multiple cameras to get more accurate movement of not just one person but many people. Then we used Lens Studio to create a character, just selecting the different attributes of the character that you want; there was no AI involved there. I think the next step on top of that, which will get very interesting, is when you can create exactly the character that you want, and I think that will be a huge need for the market. We created the character, then we put it into Blender, and then we put it into Mixamo to create the animation. We then went back into Lens Studio, where we brought in the Ready Player Me character that we exported, and that was the last thing you did to create the Snapchat filter. I think that's a pretty good summary of what we talked about. So if you're watching this, thank you for sticking through this episode. We went into depth on a lot of different things, but there are a lot of ways that you can actually get involved and try this, and it sounds like for pretty cheap or free, so get your hands dirty and try a bunch of stuff.

I think that's a great summary. Thanks for having me on your show, and sorry if I went over a bit, like an hour.

No, no, no, thank you so much for joining me. This was literally a blast. And yeah, thank you guys for watching.