In this video we are going to build an end-to-end GenAI project using the Llama 3.2 open-source model, the LangChain framework, Streamlit, and Groq. The project is called LinkedIn Post Generator. This tool will be useful to LinkedIn influencers such as Muskan, who writes posts on mental health, job search, scams, etc., and has 83,000 followers. What we will do is go over Muskan's previous posts, extract the key topics out of them, and then build this UI where she can select any of the topics she usually writes content on. She'll be
able to select the language and the post length, and when you click the Generate button it will use an LLM to create a new post that matches her style of writing. Folks, this is not a toy project; it's a real tool that will be useful to LinkedIn influencers, including myself. This project is perfect for your resume and project portfolio. Let's brainstorm how we are going to build this tool. When you look at a person like Muskan, she writes on topics like mental health, jobs, or scams, whereas Akshat Shrivastava, who is a finance influencer, writes posts on investment,
minimalism, India, etc. Everyone has their own expertise and writes on certain topics. So what we can do is, for each of these people, look at their posts and extract those topics, and then build this kind of tool using Streamlit where all the topics they have written past posts on are available. You can also provide post length, language, etc., and when you hit Generate it will create a post which matches that particular influencer's writing style. Now if you think about the technical architecture, there will be two
stages. In stage one we do pre-processing: you go through the posts of that particular person and collect them somehow. In this project we are going to collect them manually, but there are ways to collect this data automatically. Once you have these posts in, say, a JSON file, a database, or a CSV file (in our case we are using a JSON file), each record has a text field (the text is cut off on screen, but it's actually the entire post), an engagement number, and so on. Then you use an LLM; we are using Llama 3.2, but you can use any other LLM as well, and that LLM will extract these tags, this language, this line count. So it is enriching the JSON object: we had two fields initially, and it extracted these extra fields. Once this is done, in stage two you do post generation. For that you take the enriched JSON objects and fetch the unique topics. Right now each post, as you see here, has two tags,
right? Tags or topics, these two. So you go through all the topics from these posts, merge them, consolidate them, and create a unique list of topics that a given influencer usually writes posts on. Then you also have length: as you can see, post length can be short, medium, or long. And then there is language. By the way, one language is English, and the other is Hinglish. Hinglish is a combination of Hindi and English; some influencers from India use a mix of Hindi and English, and that is called Hinglish. So now you have built this Streamlit UI, you selected topic, length, and language, and when you hit the Generate button it is going to create a prompt. We will do some prompt engineering; using that it will create a nice prompt, and that will be given to Llama 3.2, which will eventually generate the post. To summarize: we go through these raw posts, then we do pre-processing, creating these enriched posts; an enriched post has topic, language, length, etc., and in
stage two a user can select topic, language, and length, and it will generate a prompt. We will use few-shot learning to retrieve some of the posts which belong to that particular topic, language, length, and so on, and then we will use Llama 3.2 to create the final post. Now folks, these raw posts you're seeing here, we are collecting them manually, but there is an automated way to get them, and that is by using Bright Data. Bright Data is a sponsor of this video, and they provide these kinds of ready-made datasets. Usually
you have to pay money to download this dataset, but Bright Data has provided a special link, which you can find in the video description below; using that, you will be able to download some of these datasets. Just go to Datasets, filter on LinkedIn datasets, and you will be able to download some of the records. Bright Data is also a web-scraping tool, and they use a proxy network to scrape data from the internet. If you're writing a plain Python script and scraping the internet, the website will detect you as a bot and block you, but here they are using this proxy network, so the data collection is smoother. Now, there are some benefits of using this kind of dataset from Bright Data: it is fresh data, it is pre-collected for you so you don't have to write any script, and you only pay for the data that you need. All right, as a next step we are going to set up a Groq account. We will use the Llama 3.2 model on Groq Cloud. Groq allows you to perform fast inference: you
don't have to download a 20 GB model onto your computer. It has a lot of benefits, and it is free. You can log in using email or Google, etc. I have already logged in. The first thing you're going to do here is create an API key, so click on API Keys, create an API key, and call it whatever you like. I have already created my key, so I'm just showing this here. Copy it; this is like a password, so keep it private. I'm going to delete this one because I have already created my key. In your case, you would have created a key, copied it, and saved it somewhere, because you can't access it again next time, so you need to save it. And if by mistake there is any problem, you can always delete it and create a new API key. I have created a new folder, linkedin-post-generator, under my code/learning directory to store my project source code. Now I will go to PyCharm; I hope you are familiar with the PyCharm code editor, and
open that particular folder. So I say File > Open, and I'm going to open the learning/linkedin-post-generator folder in a new window. Usually it will create this main.py file, and you can configure your Python interpreter to be, let's say, Python 3.10. Now I will first work on stage one, which is getting the raw data and then pre-processing it. So let me create a new folder here called data; in it I will have a file called raw_posts.json. JSON is the format we'll be using,
and here I want to put all those posts. Now, what I have done is: I went through Muskan's LinkedIn and literally copy-pasted her posts into a Notepad file. Then I used an AI tool (you can use ChatGPT or Meta AI; I used the meta.ai tool) and said: if I give you a text, return a JSON object where text is the content of the LinkedIn post. We want that LinkedIn post to be a single string, so if you have line breaks, see, these line breaks and spaces, they will be substituted with \n. That way I can have a single long string for each post. I will also have a parameter called engagement. Now, this engagement parameter we are not going to use, but later on, for a project enhancement, it can be very useful. So I'm asking it to create this kind of output, and then I copy-pasted the posts from Muskan's LinkedIn, and see, it gave me this kind of JSON object. I repeated that process for some 10 or 11 posts. I
mean, you can have more posts; the more the better. I'm using some 10 or 11 posts, and you can see that if there are two line breaks it will put \n\n. You can also give explicit instructions later on. See, I gave the instruction: one more instruction, if there are double quotes inside the text, convert them to single quotes. Correct, because if the string is delimited by double quotes and there is another double quote inside it, it will create a problem with the JSON format. So I did all of that and created this kind of JSON file. As you can see, there are, I think, 10 or 11 records, and this engagement value is nothing but the engagement number shown on the post itself. I did everything manually, but as I said before, if you want, you can use Bright Data's Python tooling, or you can buy a ready-made dataset from Bright Data for this purpose. Now we will create a file called preprocess.py, and here let me create a main function. I will have a function called process_posts.
So this is the function I want to create: process_posts. It should take two arguments, raw_file_path and processed_file_path. If someone doesn't give the processed file path, you can have a default path such as data/processed_posts.json. So we have a data folder which stores the raw JSON as well as the processed JSON. Here I will give these two file paths: raw_posts.json, and processed_posts.json, the new file we will create after processing. Now let's open that file. The code syntax is pretty simple: you're opening this file as f, and then you can use the json library; you might be aware: import json, and then json.load will load that file and give you this posts object. What I like to do usually is write some code, quickly test it, and then move on. So here you can simply print the posts. Let's do this: right-click, Run, and see, it is able to get the posts from the file.
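That read-and-print step can be sketched like this. The function and file names follow the ones used in the video; since I can't ship the real data here, a throwaway temp file stands in for data/raw_posts.json:

```python
import json
import tempfile

def load_raw_posts(file_path):
    # encoding="utf-8" matters: LinkedIn posts often contain emoji
    with open(file_path, encoding="utf-8") as f:
        return json.load(f)

# A throwaway file standing in for data/raw_posts.json
sample = [{"text": "Job scams are everywhere.\nStay alert!", "engagement": 213}]
with tempfile.NamedTemporaryFile(
    "w", suffix=".json", delete=False, encoding="utf-8"
) as f:
    json.dump(sample, f)
    path = f.name

posts = load_raw_posts(path)
print(posts[0]["engagement"])  # → 213
```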
And it has created this list of dictionary objects. See, if you scroll further, you will see all these posts are read fine. This posts variable is nothing but a list, and then for each of these posts, you'll say for post in posts, and you want to extract some extra information. I will call that function extract_metadata, and you will pass post["text"], because if you look at it, each of these posts (one post is this, the second post is this) has this element text, which gives you the actual text of that LinkedIn post. I will pass that to this new function, which I'm going to create, by the way. So let me create that function, extract_metadata, and it should return the line count, the language, the tags, and so on. Right now I'm just going to make it a skeleton function. Essentially, what we expect this function to return is, let's say, a line count (we will be using an LLM for it; I'm just mocking things up, and we will plug in the LLM later on), then a language, let's say English, and then the tags, let's say Mental Health and Motivation, for example, and that you will store in metadata. Now what you have is this post, which has the actual text (say ABC) and the engagement number (let's say 345 is the engagement), and this metadata will have all the other things. Let me just put it here: metadata has these tags. Now these are two objects, and you want to essentially join them, because at the end of the day you want a single dictionary which has all the keys: text, engagement, line_count, and so on. In Python you can do this using the pipe operator: if you do post | metadata, what you get is the post with its metadata. To store these individual posts with metadata I can create a separate list; I will call it enriched_posts.
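The merge described above, in a minimal sketch (sample values are made up; the dict-union operator requires Python 3.9+):

```python
post = {"text": "Stay alert on LinkedIn!", "engagement": 345}
metadata = {"line_count": 1, "language": "English",
            "tags": ["Mental Health", "Motivation"]}

# Python 3.9+ dict union: keys from the right-hand side win on conflicts
post_with_metadata = post | metadata

print(sorted(post_with_metadata))
# → ['engagement', 'language', 'line_count', 'tags', 'text']
```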
(You can give it any variable name, by the way; it doesn't have to be enriched_posts.) So, a post with metadata. Then what I will do is: for epost in enriched_posts: print(epost). I will just test this function; I know we are returning dummy values, but I'm testing our wiring, and we will plug in the LLM code later on. So right-click, Run, and when I do that, you see the individual posts: you have the post, then the engagement, and now see, you have line count, language, text, and this is the object you would like to store in the new file, which is processed_posts.json. Now we have to call the LLM, so we need to write some basic LLM code. Here I will create a new file; let's call it llm_helper. Here we write the LLM code. You'll also have to create a new file called .env. Now, .env is the environment file which will store your Groq key. So,
folks, create this GROQ_API_KEY variable in the environment file and enter the API key you created; every one of you will have a different key, and it usually starts with gsk_. I'm not going to show you my key, because it's sort of like a password, but just save it there. So I have saved my key in the environment file, and here I will call this function: from dotenv import load_dotenv. When you call load_dotenv(), it will go to the .env file and set that GROQ_API_KEY variable as an environment variable. I will also import a few other modules: we are going to use langchain_groq and import ChatGroq from there. Then we'll create this llm variable with ChatGroq, where groq_api_key is os.getenv("GROQ_API_KEY"); when you do this, you get the value of an environment variable, and the name of my environment variable is this one. Then you have model_name. Folks, make sure you give an appropriate model name; it can change, by the way. Right now I'm giving llama-3.2-90b-text-preview, the 90-billion-parameter model. If you're checking this video after 2 months or 6 months, see, they keep releasing new models; if there is a new model you want to try, don't say "okay, I'm getting an error", just try it, folks; use common sense. So whatever model you want to use from here; I'm using 90b-text-preview, so that's what I will give. Let me just copy-paste; so this is my model.
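Putting those pieces together, llm_helper.py might look roughly like this. The model name is the one used in the video and may well be retired by the time you read this; the lazy imports inside the function are my own choice so the file can be read (and the constant checked) without langchain-groq or python-dotenv installed:

```python
import os

# Model names on Groq rotate; check their console for the current list
MODEL_NAME = "llama-3.2-90b-text-preview"

def get_llm():
    # Imports live inside the function so importing this sketch does not
    # require langchain-groq / python-dotenv to be present yet
    from dotenv import load_dotenv
    from langchain_groq import ChatGroq

    load_dotenv()  # pulls GROQ_API_KEY out of the .env file
    return ChatGroq(
        groq_api_key=os.getenv("GROQ_API_KEY"),
        model_name=MODEL_NAME,
    )

# Usage (requires a valid GROQ_API_KEY in .env):
#   llm = get_llm()
#   print(llm.invoke("What are the two main ingredients in a samosa?").content)
```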
Now we will quickly test this code. What I do is write an individual Python file, create a main function, and then invoke the LLM. I will ask a very difficult and very important question: what are the two main ingredients in a samosa? A very crucial question for our economy. Then print response.content, right-click, Run, and when you do that: see, pastry and the filling, potatoes or whatever. So our wiring is perfect. When you do llm.invoke, it is using this API key, it is going to Groq Cloud;
you are not running your Llama 3.2 model locally; it is going to the cloud, fetching the response, and that is working. Okay, I'm back to my preprocess.py file. Here I will import some important libraries; I don't want to type them out, just to save recording time. By the way, you need to have all these modules installed: whatever modules we are using, pip install that module. Later on I will provide a requirements.txt file as well, and you can simply do pip install -r requirements.txt in your command prompt, but until then you can just say pip install langchain, pip install python-dotenv, and so on. Folks, again, these are all common sense; you need some Python basics to work on this project, so I'm not going super explicit on those fronts. Now that we have imported these important classes, let me implement this extract_metadata function. What is the purpose? From the post, which is just a block of text, we need to extract those three elements using an LLM call. So I will create a prompt. What is the prompt?
In the prompt you want to say something like this: you are given a LinkedIn post; you need to extract the number of lines, the language of the post, and the tags. Now, coming up with the right prompt requires some trial and error, some experimentation; you have to do some prompt engineering. I worked on it a few times, and after some iterations I came up with this prompt, where I'm giving explicit instructions. I'm saying: return a valid JSON, no preamble. What does "no preamble" mean? Say in ChatGPT, when I ask for a sample Python code, before it gives the actual answer it also writes this kind of text: "hey, nice question, here is your code, here is what I think". All that extra text is called a preamble, and I'm not interested in it; I want a plain JSON object. I'm also saying that tags is an array of text tags; extract a maximum of two tags. I don't want more than two tags, because this thing can go super open-ended and extract 100 tags; I want to restrict it to two. Then I'm saying the language should be English or Hinglish, and I'm giving it an understanding of what Hinglish means: Hinglish means Hindi plus English. And then here is the actual post. Now see, this post placeholder you write in curly brackets, and then we will use PromptTemplate from LangChain to replace that placeholder with the actual post. So let's use this class: PromptTemplate.from_template; this is my template, and let's say pt is the template.
Then my chain is pt | llm. What is llm? It's something we created already; if you remember, from llm_helper import llm. This will call Llama 3.2, and using this pipe-operator syntax you create a chain. A chain basically means you are supplying this prompt to the LLM, and then you just call invoke. So you say chain.invoke and then the input. What is the input? The post: here is a post, and you pass it there. The post you're giving here is whatever came in as the function argument, and it corresponds to the placeholder; if you write ABC here, it's as if you typed ABC there, and that is your response. Then I will create a JsonOutputParser, and using that I will say json_parser.parse(response.content). When the LLM returns, response.content will have the actual answer, so you want to parse that, and that is your result. I will just return it, so I can remove my dummy response. Now I
will also protect this code in a try/except block. See, I'm using try/except, and in case there is some problem with output parsing, you can handle it here. So now this looks good to me; let's test it out. We have already printed the enriched posts here, so I just want to see how they look. Okay, this error I got because, I think, the order should be the prompt template and then the LLM, so let me run it again; I hope this should work. It's taking time, so I'm assuming it's going to work this time. All right, it worked. Now let's check these things: you have the text, and for that text, see, it worked; the tags are LinkedIn Influencer, Organic Growth, or LinkedIn Scams, Job Scams. Now see, I'm getting some repetitive tags. I want unified tags, basically fewer tags, where things like LinkedIn Scams and Job Scams are just called Scams. Another thing: I want Career Advice, and then Job Search; just check that. Here it has a space
between two words and everything is lowercase, whereas here it is in capital case but there is no space. So there is non-uniformity in these tags. I want to post-process these tags and make them uniform. Essentially, let's say all these posts have different tags: job search, job hunt, job hunting; I will just unify them, map them to a single clean tag such as Job Search. This will be in title case, which means if there are two words there will be a space in between, and every word starts with a capital letter, and so on. So let's write a function called get_unified_tags; I will pass the posts as input, and it will return the unified tags as output: def get_unified_tags(posts_with_metadata). So we are passing all the posts. Now I will use a Python set to get the unique tags, because across the posts there might be repeated tags, and I want to remove all the duplicates. So I will say unique_tags = set(). Once again, you need Python fundamentals, folks: a set is used to remove repetitive elements. Then, for post in posts_with_metadata: unique_tags.update(post["tags"]); this is how you add a list to a set, and it keeps only the unique entries. Now, from this unique_tags set I want to create a comma-separated string, for example "Job Search, Mental Health, Scams", because I will give this to the LLM. I will make another query; I will say to the LLM
that I am giving you a list of tags; unify them. I will make this kind of LLM call so that I get the unified tags. So let's do that. How do you create this kind of string? You can just say ", ".join(unique_tags). I'll call the result unique_tags_list, and it will be a string containing all the tags separated by commas. Now folks, I tried a couple of prompts and came up with this kind of query that I make to the LLM: I will give you a list of tags; you need to unify them with the following requirements. Tags are unified and merged to create a shorter list; for example, "Jobseekers" and "Job Hunting" can all be merged into "Job Search", and "Motivation" and "Inspiration" can be merged into "Motivation". I give just a few examples here; the more examples you give, the better, so as this project evolves you will keep appending more examples. This is also called few-shot learning, where you help the LLM by giving some examples so that it can understand them and produce the output you expect. Each tag should follow the title case convention; title case means words are separated by spaces and the first letter of each word is capital. The output should be JSON, no preamble, and it should be a mapping of original tag to unified tag. So what I'm doing is: if you have "Jobseekers", then "Job Search"; if "Job Hunting", then "Job Search". I want this kind of dictionary, because if I have this dictionary I can apply it to my processed JSON object,
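The unification prompt can be sketched like this; the wording is paraphrased from the video, and note the doubled braces so that str.format leaves the example JSON in the prompt intact:

```python
# Sketch of the tag-unification prompt (wording paraphrased). The doubled
# braces {{ }} are needed so str.format leaves the example JSON alone.
UNIFY_PROMPT = """I will give you a list of tags. Unify them with these requirements:
1. Tags are unified and merged to create a shorter list. Example:
   "Jobseekers", "Job Hunting" are merged into "Job Search";
   "Motivation", "Inspiration" are merged into "Motivation".
2. Each tag should follow title case convention, e.g. "Job Search".
3. Output should be a valid JSON object, no preamble, mapping the original
   tag to the unified tag, e.g. {{"Jobseekers": "Job Search"}}.

Here is the list of tags:
{tags}"""

unique_tags_list = ", ".join(sorted({"Jobseekers", "job search", "scams"}))
prompt = UNIFY_PROMPT.format(tags=unique_tags_list)

print("{{" in prompt)  # → False: format collapsed the doubled braces
```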
and replace those tags which are not in good shape. Now folks, this code is the same as the code we wrote before, so I don't want to type it again and waste your time; I just copy-pasted it. It's pretty straightforward, same as the earlier function: you're calling the LLM with this particular prompt. So this unified_tags result will look something like this: a dictionary object. Now what I want to do is go through my enriched posts: for post in enriched_posts, I want to get the existing tags from that post, so I will say current_tags = post["tags"]. From those tags, say a current tag is "Jobseekers"; you want "Job Search", because it's the new, cleaner tag. So you create a new collection called new_tags using a comprehension: for tag in current_tags, you look into the dictionary with unified_tags[tag] to get the new tag. unified_tags is that dictionary: when you supply the old tag, say "Jobseekers", you get "Job Search" as the return value. Now, there is a chance you can get the same tag twice, so to avoid that, instead of a list comprehension I will use a set comprehension, so that everything is unique. Then new_tags is nothing but this. Now, this is a set, so I will just convert it into a list, and that's it, folks.
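That mapping-and-deduplication step, in a minimal sketch with made-up sample values:

```python
# The mapping the LLM returns (sample values), applied with a set
# comprehension so duplicate unified tags collapse into one entry
unified_tags = {
    "Jobseekers": "Job Search",
    "Job Hunting": "Job Search",
    "Motivation": "Motivation",
}

current_tags = ["Jobseekers", "Job Hunting", "Motivation"]
new_tags = list({unified_tags[tag] for tag in current_tags})

print(sorted(new_tags))  # → ['Job Search', 'Motivation']
```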
Now my enriched posts are ready to be written to an output file, so I will open that processed file path and just do json.dump; it will write everything to the file. So now let's right-click and run this whole thing. Let me just remove this unnecessary code from here. This is looking good: this is my input, and I expect it to create this file. The file does not exist right now; it's about to be created. So right-click, Run; it's going to take some time.
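The write step can be sketched as a round trip; a temp file stands in for data/processed_posts.json here so the snippet is self-contained:

```python
import json
import tempfile

enriched_posts = [
    {"text": "Stay alert!", "engagement": 213, "line_count": 1,
     "language": "English", "tags": ["Scams"]},
]

# A temp file stands in for data/processed_posts.json
with tempfile.NamedTemporaryFile(
    "w", suffix=".json", delete=False, encoding="utf-8"
) as f:
    json.dump(enriched_posts, f, indent=4)  # indent keeps the file readable
    processed_path = f.name

# Round-trip check: what we wrote is what we read back
with open(processed_path, encoding="utf-8") as f:
    reloaded = json.load(f)
print(reloaded == enriched_posts)  # → True
```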
Meanwhile, let me exit out of my meditation. Now you can see this new file, processed_posts (sometimes, you know, you have to say Reload From Disk and then it will show up). Look at this, wonderful: I had this raw file, just a LinkedIn post tagged with Influencer and Organic Growth, and it added these new fields. Initially I had just these two fields, but preprocess.py added all the new ones. So we are done with stage one. As a next step, we want to build this kind of user interface where we see all the topics in this dropdown. There is length and language as well, but those we can hard-code: length will be Short, Medium, Long; language will be English and Hinglish. But for the topics we need to look into this processed_posts file and get all these tags, basically combining them into a single list. What I'm going to do is create a new file called few_shot. Why is it called few_shot? Because we are going to use few-shot learning, which is: when we give the prompt to the LLM, we will
provide some sample posts. So let's say a person says "I want to write a short English post"; the line count is 1 and Job Search is the topic; then we will give a matching post as a sample to our LLM and say "write the post in this kind of format". That approach is called few-shot learning; therefore I'm calling this file few_shot. We will create a Python class called FewShotPosts, and in the __init__ we will provide a file path; the file path is that particular file, which is data/processed_posts.json. So now, what should we do? See, when you think about this programming logic, you need to think step by step. In our case we have this kind of UI where we need these three controls, and eventually we will search using language, length, and a particular topic. When you want to do this kind of search, you can run a SQL
query if you have a database, but here we have a JSON file, which we can load into a pandas DataFrame; if you load it into a pandas DataFrame, you will be able to run search queries. You can say: if topic is Job Search, length is Medium, and language is English, return me those posts, which we can then use for few-shot learning. So I'm thinking creating a DataFrame will be useful; we can load all those posts into a DataFrame so that querying becomes easier. We are definitely creating a DataFrame, so I will just initialize it to None for now. We are also creating a variable called unique_tags; these unique tags are nothing but the tags that populate this particular dropdown, so this one is also None. Then maybe I can have a function called load_posts, where I take a file path, and the purpose of this function is to populate those two variables based on that file path. So now I can read the file: with open(file_
path, encoding="utf-8") as f. And let me just import the json module; it's a JSON object, so I'm going to use json.load(f), and then we have the posts. Now, if you have a JSON object like this, a list of dictionaries, you can convert it into a pandas DataFrame using a function; let me see what that function is. You can use Google, ChatGPT, whatever, and we found that it is called json_normalize: you give it this JSON object and it creates the DataFrame for you. Now, you should obviously test this out, so I'm going to create a main function and we'll test it quickly. You create an object of this class, and when you create the object you can specify a file path, but by default the file path is that one, so we don't need to worry about it. And how about we do this in the constructor itself? We can say self.load_posts() and supply the file path; that way, the moment you construct the object, the file will be loaded and the DataFrame created.
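The loading step looks roughly like this; the sample records below are invented stand-ins for the real processed_posts.json:

```python
import pandas as pd

posts = [
    {"text": "Post one", "engagement": 213, "line_count": 1,
     "language": "English", "tags": ["Job Search"]},
    {"text": "Post two", "engagement": 90, "line_count": 7,
     "language": "Hinglish", "tags": ["Mental Health", "Motivation"]},
]

# json_normalize turns a list of dicts into one DataFrame row per post
df = pd.json_normalize(posts)
print(df.columns.tolist())
# → ['text', 'engagement', 'line_count', 'language', 'tags']
```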
This df, by the way, you can assign right here. Let me add a breakpoint here (or maybe here) and just say right-click, Debug; that's what I do. I like this approach: write a small piece of code, then test it. Now you see the posts are loaded properly, all the posts are here, and when you create the DataFrame, it is visible: you have text, engagement, line count, language, and tags. Perfect, this is looking good; you can stop the execution here. Now see, in that DataFrame we have a column called line_count, but if you look at our front end, we don't take line count as an input; we take length. We can have some intrinsic definition of length, such as: if it is between 1 and 5 lines it is a short post, 5 to 10 is medium, 10 to 15 is long. You know, we can change that. So what I did is:
I wrote a very simple function, folks; I don't need to write it in front of you, it's very straightforward: if you give it a line count, it tells you whether the post is short, medium, or long, and you can configure this based on your preference. But for the LinkedIn influencer Muskan, for whom we are creating this tool, let's say her definition of a short post is fewer than five lines. So we are defining this function, and then how about we add a column to our DataFrame? What if we add a column called length? What this will do is use the existing line_count column and apply a transformation on it; in the apply, you can just pass this function. I think I should say self., and that will return this. So let's verify it: as we did previously, we can put a breakpoint here, right-click, debug it, and monitor the variable to see if it worked. Yeah, see, it worked: there is a new length column.
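That length column can be sketched like this; the thresholds are the Muskan-specific assumptions from the video and should be tuned per influencer:

```python
import pandas as pd

def categorize_length(line_count):
    # Thresholds are assumptions from the video; tune them per influencer
    if line_count < 5:
        return "Short"
    elif line_count <= 10:
        return "Medium"
    return "Long"

df = pd.DataFrame({"line_count": [1, 7, 17]})
df["length"] = df["line_count"].apply(categorize_length)
print(df["length"].tolist())  # → ['Short', 'Medium', 'Long']
```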
If the line count is 1 it is Short, 7 is Medium, this one is Short, 17 is Long. Perfect, this is working as expected. The other thing I would like to do is collect the unique tags. Now, in the tags column there are multiple tags per row; I want to go over all the tags, collect them, and make a unique list. How do we do that? You can use a sum on the tags column: when you do sum, it goes through the individual cells and collects them all into a single array, and that is your all_tags. Then, for self.unique_tags, you need a set, because if there are duplicates you want to remove them, so pass all_tags there. But all_tags will be a pandas object; I forget the exact variable type, but we can check it, and when you do type casting with list() it converts to a list, and then you wrap it in a set.
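The collect-and-deduplicate step, as a minimal sketch (sample tag lists are made up; summing a column of lists concatenates them, which is convenient here though slow on large data):

```python
import pandas as pd

df = pd.DataFrame({
    "tags": [["Job Search", "Scams"], ["Scams", "Motivation"]],
})

# .sum() on a column of lists concatenates them into one flat list;
# wrapping the result in set() then drops the duplicates
all_tags = df["tags"].sum()
unique_tags = set(all_tags)

print(sorted(unique_tags))  # → ['Job Search', 'Motivation', 'Scams']
```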
The only purpose of the set is to remove duplicate elements: if your Job Search tag appears multiple times, you remove the repeats. What I will do next is: I'm thinking, maybe instead of doing self.df = df, we can assign it right here, so that we don't have to create one extra variable. I think that works; yeah, either way it's fine. So now we have the unique tags; let me just verify that this works. Right-click, Debug. See, it has these unique tags: look at that, Career, Influencer (I hope you can see it), Career, LinkedIn, Scams, Talent; I mean, there is a "Sapna" tag as well. It's an LLM; sometimes it will not work perfectly. You have to do some fine-tuning, testing, validation, heuristics, and so on, but for the purpose we wrote this code for, it is working all good. I can directly access self.unique_tags, or I can write a simple function, get_tags, that just returns self.unique_tags. Now, as a
next step, I would like to do this: see, if I call a function called get_filtered_posts with Medium, English, and Job Search, it will return me all the posts which have Medium, English, and that topic. This is again like a SQL query, where you are saying: if topic equals Job Search and language is English and length is Medium, return all those posts. So let's write that function: def get_filtered_posts, in which you take length, language, and tag as input. It's kind of straightforward if you have worked with a pandas DataFrame: all you need to do is supply the filter conditions. Here we have three conditions. First, df["language"] == language; I'll put them in parentheses, and you have to use self.df. I can use the power of copy-paste for the second: length == length, because now we have this length column. And then there is the third condition; you can take help from ChatGPT for this, folks. So here, the
tag that is coming in as a function input is just a single tag, but in the column you literally have an array of tags. So what you are doing is looking into that particular column and saying lambda tags: tag in tags. This condition, tag in tags, means: if you have a list — let me just show you — say my_list = [1, 3, 4], and you say 4 in my_
list, then it will say True. So if your tag is, say, Job Search, then tag in tags means Job Search in [first tag, second tag]: it will return True if it is part of the list and False otherwise. That is this particular condition, and with it you get the filtered rows. That filtered result is another DataFrame, so you can convert it to a dictionary, because we want a dictionary
with the orient argument. The orient argument controls how you want that dictionary shaped: if you want it to be a list of dictionaries, you say orient='records' (records — it's plural), and then you return it. So now let's test this. I'm looking for Medium, English, Job Search — right click, run. Okay, Medium English Job Search — probably there are no posts like that. But if you look for Short, English, and Job Search, then you should at
least find this one, right? So: Job Search, English, Short — see, you found it. You can test it for multiple scenarios and it is working. Let's say Job Search, Medium — since this 7-liner is Medium... okay, this 11-liner is Long, the 7-liner is Medium... yeah, there should be two. See, this is one; there are not two because the other one is in a different language. Okay, so you can test it using whatever filters you want to supply.
Okay, I'm not able to find two posts here, but anyway, this FewShotPosts class is working. Now, as a next step, we are going to build the UI, where you have three controls: the first one is topic, the second is length, the third one is language. Streamlit is a very quick way to build a UI for your generative AI application: you can do pip install streamlit and then import streamlit as st. Now, I'm doing this in my main.py file because that's the main entry point for this project, and here we
will write the Streamlit code and run it to render that UI. Let me add the main function — this is the main function, and inside it I will write all my front-end code. Just in case you're not aware, Streamlit makes building UI applications extremely easy. You can set the title of your application, which is LinkedIn Post Generator, so I will say st.title('LinkedIn Post Generator'). Then there are three controls which are in the same row, and whenever you want multiple controls in the same row you use something called columns: this is your first column, this is the second, this is the third. So we are having three columns — col1, col2, col3; st.columns returns a tuple as its return value, and to add controls within a column you say with col1. Now this one is a drop-down, and the way you render a drop-down is st.selectbox. The select box is titled Topic, and the options — all these options, which are basically your tags — are something you need to get here. What are the options we want? In few_shot.py we have the function get_tags, which is exactly what we need, so I will import it: from few_shot import FewShotPosts — this is the class I'm importing. I need to create an object of this class, and then fs.get_tags() will give me the tags.
Let's just run this much code and see what happens. Go to the terminal and say streamlit run main.py; it will launch the application in a browser, looking something like this. By the way, you can switch to the dark theme — right now it's using the system setting, but if you set the dark theme it will render everything in dark. You can see all the tags which I had in my processed_posts.json file — I have those tags here. Now let's add the two
other columns. I will say with col2: st.selectbox — this one is Length. Length will have fixed options — Short, Medium, Long — and language will have English and Hinglish, so that's actually very easy. For length, the options are Short, Medium, Long. Now folks, just to be a little more modular, I'll put those options in a variable, length_options, and pass that variable in; that way, if tomorrow you want to add any new option, you can just add it here and it will work. The same way we'll do col3. Now, the return value: whatever you select from a drop-down goes into the corresponding variable — selected_length, selected_tag (the variable names can be anything), selected_language. When someone makes a selection, Streamlit saves that value in these variables. Then, at the bottom — if you look at our mock-up, I think it was somewhere here — you need the Generate button, and once again Streamlit makes these things easy: you can just say st.button('Generate'). Actually it's not with, it is if: if means if the button is clicked, then it goes inside this block. Here I'm just putting a dummy output — "generated post for..." — so you can check whether things are working with this kind of dummy code: the selected tag is this, the selected length is this, and the selected language is this. You don't need to rerun the app from your command prompt; you can rerun it directly by clicking on this Rerun button.
See, now it looks good, right? I can select any topic here from all these options, and when I hit Generate, those values are passed to the three variables. Now we can perform our wiring — let's write the final, most important piece of code, which is the actual post generation. In main.py, once we have these three parameters, we need to pass them to some function, and what if we have that function in a separate file? That will make our code more modular. So I will create a new file called post_generator.py, and here what if
we have a function called generate_post, which takes length, language, and topic as input and returns the generated post? For now let's say it returns some placeholder text. We can call this function from main.py: from post_generator import generate_post, and then generate_post(selected_length, selected_language, selected_tag). That creates the post, which you can simply st.write here, and it should work. This is a good way to write modular code. So
now let's go to post_generator.py. Here we are obviously going to import the LLM, because we'll use it for the post generation, and then we can just say llm.invoke(prompt) — the prompt is the actual query — which returns a response, and response.content is what we return back. I hope this is not overwhelming, folks; we have covered this previously in this video, so I'm not spending too much time on it, otherwise the tutorial gets too long. Essentially you are calling the LLM and invoking it. By the way, I have a syntax problem — it should be from llm_helper import llm. This looks good now. And sometimes you see this kind of warning — it is just the PEP 8 convention: you need two blank lines between top-level definitions. That's all it is, nothing big. All right, now what is my prompt? If you look at our presentation, we'll write this kind of prompt: generate a LinkedIn post using this information, no preamble, with topic, length, and language.
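The generate_post flow can be sketched like this. Note the _StubLLM is a stand-in I'm adding so the sketch runs without API keys; in the project the real model comes from llm_helper (a LangChain chat model hosted on Groq, whose .invoke() returns a message carrying the text in .content):

```python
# Sketch of post_generator.py with a stub in place of the real model.
from types import SimpleNamespace


class _StubLLM:
    """Stand-in for the Groq-hosted Llama model the project uses."""
    def invoke(self, prompt):
        # A LangChain chat model's invoke() returns a message object
        # whose .content attribute holds the generated text.
        return SimpleNamespace(content="(stub) " + prompt)


llm = _StubLLM()  # in the project: from llm_helper import llm


def generate_post(length, language, tag):
    prompt = f"Generate a LinkedIn post about {tag}, {length}, in {language}."
    response = llm.invoke(prompt)
    return response.content          # the generated text


print(generate_post("Short", "English", "Job Search"))
```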
We'll also have few-shot learning, where we supply some examples, but let's not worry about examples yet; let's first generate a simple prompt like this, which has topic (something you get from the drop-down), length (also from the UI), and language (same). Now, one thing about length: when you say Medium or Long, the LLM doesn't understand exactly what that means. Instead of Medium, what if I say generate a post which is 6 to 10 lines? If I give exact line counts, it will be able to help me better — I might think Medium is 6 to 10 lines while the LLM thinks Medium is 10 to 20 lines, so it's better to be explicit. The question is: the length you get here is Short, Medium, or Long, so how do you convert that into lines? We can write a small Python function, and since writing it is super easy I'm not going to type the whole thing in front of you; all this function does is take a length and return an explicit line range, like 1 to 5 lines, 6 to 10 lines, and so on. We can call that function, get_length_str, and then write the prompt like this: generate a LinkedIn post using the below information, no preamble; topic (I'll call it tag, since that's what comes from the function arguments); length (instead of Short/Medium/Long you get explicit line counts); and language, which is either English or Hinglish. Now, Hinglish it may not understand properly, so we can add one more line explaining what Hinglish means. I'll add these two lines: if the language is Hinglish, it means a mix of Hindi and English, and the script of the generated post should always be English — I don't want it to actually write in the Hindi script; I want the writing script to be English always. We can add all these explicit instructions, supply this prompt, and it should kind of work.
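A sketch of that length-mapping helper and the prompt, with the Hinglish instruction included. The exact line ranges and wording here are approximations of what the video describes, not a verbatim copy, and I've wrapped the prompt in a small function for readability:

```python
def get_length_str(length):
    """Map the UI choice to an explicit line count the LLM can follow."""
    return {
        "Short": "1 to 5 lines",
        "Medium": "6 to 10 lines",
        "Long": "11 to 15 lines",   # illustrative range
    }[length]


def get_prompt(length, language, tag):
    length_str = get_length_str(length)
    prompt = f"""Generate a LinkedIn post using the below information. No preamble.

1) Topic: {tag}
2) Length: {length_str}
3) Language: {language}
If Language is Hinglish then it means it is a mix of Hindi and English.
The script for the generated post should always be English.
"""
    return prompt


print(get_prompt("Medium", "Hinglish", "Job Search"))
```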
Now, we still don't have the few-shot examples it can use to understand my writing style, but let me just test things as they are. Maybe we don't need to test it through Streamlit; we can just write a quick block here: if __name__ == '__main__': generate_post(...). Let's say I want a short post — okay, what is the first parameter? Length. Then language — I want English — and for the tag I want Mental Health, or let's say Job Search. I just want to
test how this thing works: right click and run. Perfect — see, it is actually working fine: "Boost your job search with these tips". Right now it is using just general LLM knowledge to write this post; it has no knowledge of how Muskan writes. Muskan is the LinkedIn influencer whose posts we are using here in processed_posts.json, so if we supply some of those posts as example posts, it will create the final post according to Muskan's writing style. Right now, without that writing style, it is
working okay. Now we need to add those examples, and once again we can use few_shot.py: if you call fs.get_filtered_posts and give it the topic, it will give you those posts, correct? So let's use exactly that function. I will go to post_generator.py and — you know what — let me append those examples dynamically to the prompt, because what if you don't find any examples? If there are no examples for a given topic, length, etc., you should still generate a post, just without Muskan's writing style. So here I need to import that class: from few_shot import FewShotPosts. Let me create the instance right here at module level, outside the function — I could create it inside too, but then every call would create a new instance of the class, which is why I'm doing it this way. Then get_filtered_posts for length, language, and tag — those three I can pass straight through — and these are my examples. Now, if len(examples) > 0 — see, if it is not greater
than zero, if you don't find any examples, then just go ahead with the base prompt, that's fine. But if you do find some examples, then you can do that few-shot learning where you supply them. So I will append to the prompt — prompt += this line — and then add the actual examples. You would say for example in examples, but actually I need the index as well, so what I'll do is for i, post in enumerate(examples); when you do enumerate in Python you get the index of the
post as well as the post itself. The post text is post['text'], correct? Because what you get is this whole object, which has text, engagement, everything — you just want to grab the text. Then for the actual prompt: prompt += '\n\n' — let me just show you in the presentation — if you have posts like these, you need a blank line, then Example 1, then the actual post, then Example 2, and so on. So I will say f'Example {i}' — this is a format string — then a blank line and then the actual post text. We need a maximum of two examples, so I will say if i == 1: break. i starts at zero, so it goes through iteration zero and iteration one, and when i is 1 at the end of the second iteration, you break. Okay, now I think this prompt looks good; let me just read through the code. Okay, this is looking good.
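The example-appending loop just described might look like this. The sample posts are placeholders for what get_filtered_posts would return, and the examples are numbered from 1 in the final form:

```python
# Append up to two example posts to the prompt for few-shot learning.
examples = [
    {"text": "Job hunting is like online dating.", "engagement": 500},
    {"text": "Three tips that got me interviews.", "engagement": 300},
    {"text": "This third example should be skipped.", "engagement": 100},
]

prompt = "Generate a LinkedIn post using the below information. No preamble.\n"

if len(examples) > 0:
    prompt += "4) Use the writing style as per the following examples."
    for i, post in enumerate(examples):
        post_text = post["text"]              # grab just the text field
        prompt += f"\n\nExample {i + 1}:\n\n{post_text}"
        if i == 1:                            # keep at most two examples
            break

print(prompt)
```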
To make it more modular, we can actually create a function called get_prompt, which takes the same arguments, and move all this code into it. The responsibility of get_prompt is to take length, language, and tag and just return the prompt. Then here, prompt = get_prompt(...) with those three as input. My code looks much better now: the generate_post function gets a prompt, then
it invokes the LLM, and get_prompt just builds your prompt. So we can test this function — let me test get_prompt for Short, English, Job Search. If you look at Job Search, English, and Short, it should at least include this particular post in my few-shot prompt. So: post_generator.py, right click, run. Perfect — see what it did: it prints the prompt — generate a LinkedIn post using this information, topic
is Job Search, Short, English — perfect — use the writing style as per the following examples. Okay, it's saying Example 0; maybe I should do i + 1, and there's a space and a colon I need to add. So: "Example ", then the number, then a colon, and then the sample post — I'm just following that convention. This part looks good; run it, and now it looks much better — see, and if you have more examples it will say Example 1, Example 2, and so on. Now I can
test my generate_post function. My prompting looks good, and after that you are just passing it to your LLM. So for Job Search, Short, English, what does it give us? "Job hunting can be a roller coaster — one day you are on top of..." — see, it generated this particular post, and it reads like a one-liner-style post, because if you look at your few-shot example — the one comparing looking for a job to online dating — this generated post is kind of matching the style you have there.
So this looks good; this function looks good, and I can complete my wiring. I go to main.py — I'm already calling generate_post, so the wiring is already done. Why don't we go to our UI and just test that code? Here I will hit Rerun, and after rerunning I need to make sure I'm actually using some of the examples. Let's say you are writing a post on Job Search but with medium length — it should use that kind of style. So let me use Job
Search, Medium, and English, and hit Generate. "Job seekers..." — see, since my example has "job seekers" in it, it is using that as an example, which is good. Actually, if you want more examples, you can change your few-shot from two examples to three: the break point we put in — if you use i == 2 it will actually take three examples — so you can customize that. That's fine-tuning you can do, but overall my code seems to be working. I can also try a
different language, like Hinglish. So for a Hinglish short post — I can read this particular post; it generates the LinkedIn post in Hinglish, and people who know Hindi will be able to understand it. Let me try that: Hinglish, Short, and the topic is Job Search. See — for the purpose for which we designed this, it is working. Obviously we can fine-tune it, but as a learning project this looks good. I think we have completed everything, I
mean, there might be corner cases, folks; there might be bugs. You can go ahead and fix those — do thorough testing. I did some sanity testing, so there shouldn't be any obvious bugs — it's not like I intentionally introduced bugs — but this has not been thoroughly stress-tested, so I encourage you to test it thoroughly, and if you find any issues, fix them. As far as the project is concerned, the coding is complete; you will find the code in the video description below. Now folks, I
want to give you some exercises, so that if you are adding this project to your resume or portfolio it doesn't look like a copy-paste project from a YouTube tutorial. The project we just did you use for learning, but then you need your own customization, so I will give you some ideas. As an exercise, you can add a drop-down called Influencer. Right now we created this tool for one influencer, Muskan, but what if you had a drop-down with a couple of influencers? You could
grab their posts and save them in a database — right now we used a JSON file as our data store; maybe you use MySQL or MongoDB and store the posts there. Then these topics become dynamic: when you select Muskan, the topics are Muskan's; when you select another influencer — or, say, me, since I also write posts on LinkedIn — the topics would be data science, career guidance, and so on. These topics should be populated dynamically based on your selection of the influencer. Length and
language work similarly — I think length can stay fixed as Short, Medium, and Long, but language, for example: I don't usually write posts in a Hinglish mix, so that is something you can keep dynamic per influencer. You can also add customizations: do you want to add some emojis? Do you want a last line asking viewers to engage with your post? You can add those kinds of customizations and generate these LinkedIn posts. Now folks, just be aware of privacy concerns. If you are using some influencer's posts and scraping their data, you need to give due credit. You can of course take inspiration from their writing style — that's perfectly fine — but don't just build this tool and blindly copy-paste those posts onto your own LinkedIn. If you're doing that, you need to give credit, because you are using someone's writing style and someone's knowledge. We need to be responsible citizens on social media and give due credit. I hope you liked this project, folks; if you did, please share it
with your friends who are learning GenAI. Also, whatever you learned from this project you can post on LinkedIn — create a LinkedIn post and say, hey, I learned these GenAI concepts from this particular project. You can also tag me, and I will engage if I have time. I hope you had a good time learning this; if you have any questions, post them in the comment box below. Thank you for watching.