How I Scrape Everything with Apify & Make.com

Captions
Hey everyone, Nick here, and in this video I'm going to be covering advanced web scraping. I've had dozens of people over the last few weeks ask me, "Nick, how do I scrape this resource?" or "How do I scrape LinkedIn?" or "What do I do if I need Instagram posts?" In this video I'm going to show you how to scrape the previously un-scrapable using a very simple third-party tool called Apify. I'm not sponsored by Apify at all, but I really love their tool and I find myself using it on basically every web scraping project nowadays. I'm going to run you through it from top to bottom, and then I'm going to use Apify along with Make.com to build out a very simple parasite SEO (or parasite social media) tool in about five minutes. So if that sounds like something you want to learn more about, if you've always wondered what rests in the highest echelons of advanced web scraping, then this is the video for you. Let's get into it.

Okay, first things first: what is Apify? Apify is essentially a big library, which has now turned into a store, of scrapers that other people have built, with an interface that lets you plug a scraper into whatever flow you want in seconds. It's probably one of the most cost-effective scraping platforms out there, and it's really evolved into an ecosystem for web scraping in general, where you can create your own scraper, put it up online, and sell access to it for recurring monthly revenue. I've considered building my own web scrapers here; I know a few people that have done so quite successfully and made a fair amount of money. What I find so surprising is that so many people don't know about this, and if you don't know about it, so much of the web is unfortunately inaccessible to you. So over the next couple of minutes I'm going to go through Apify from top to bottom.

This isn't a programming video; I won't be running you through programming principles or fundamentals. Obviously, if you want to build a reliable web scraper fast, you're probably going to need some programming skills, but in this video I'm only going to be using pre-created scrapers, and there are a ton of them. Using pre-created scrapers isn't limiting whatsoever, because there are something like 10,000 of them covering all sorts of data sources. I'll show you some examples of how to use them in your flows, I'll show you how to connect to Make.com for the aspiring automation engineers out there, and at the end I'm going to take one of these scrapers and use it to build out a Twitter parasite social media campaign.

Let's dive right into it. First and foremost, here's how Apify works: if I click "Sign up for free" and go through that whole rigmarole, I'm taken to a page that looks something like this. This is the back end of Apify, and there are a couple of things to keep in mind. Number one, you're billed on Apify through usage. They offer $5 of free usage per month to any user on the free plan (I also think they have a fairly generous trial, although I might be mistaken there). That $5 of free usage is more than enough for you to, one, run a lot of the scrapers you'd want to run and get the data you'd want to get, and two, test everything out before you actually make a purchase of any kind. You're also given a certain amount of memory. All this really means is that a web scraper runs on a server hosted in the cloud, and you have an allocation of how much of that server memory you can use at any one point in time. On the free plan it's 8 gigabytes, and you can upgrade your plan and go substantially higher if you'd like.

If I go to Pricing, you'll see you're given a certain number of monthly compute units, a certain number of dataset reads and dataset writes, and so on; this stuff is a little more advanced. Essentially, if you got on the $49/month plan, for instance, you'd get 32 GB of RAM for your actors (their term for the hosted scraper programs), you'd be able to run 32 things concurrently, and you'd have access to proxies and all that fun stuff. I personally subscribe to the Starter plan and it's more than enough for literally all of my projects, and I probably have over 10 projects running constantly on Apify, so it works out to about $5 a project. Considering the average automation client might be paying you $500 to $2,000 a month for maintenance or some type of retainer, you can see why, mathematically, Apify works out as a solution here. But anyway, that's just the logistics.

In terms of how to actually use Apify if you don't have any programming experience, the simplest and most straightforward way is to head over to the Store. Give that a click and you'll be confronted with a page that looks like this. The Store is just an ecosystem of scrapers that developers have created to do the things most people want to do. If I go to "All actors" and click "View all", let's take a look at some of the potential here: there's a Google Maps scraper, a Google Search results scraper, an Instagram hashtag scraper, a website content crawler, a Facebook post scraper, a Twitter scraper, a TikTok data extractor, a contact detail scraper, a GPT scraper, an Amazon product scraper, an AI product matcher, an Airbnb scraper, a YouTube scraper, a Cheerio scraper, a Zillow search scraper, a TripAdvisor scraper. Obviously the options are virtually unlimited, and that's why I mentioned earlier that you don't actually need programming experience to use Apify and make a ton of money when you integrate it with Make.com: most of the heavy lifting has already been done for you by these developers, like quacker, for instance.

Okay, I don't just want to show you this; I want to dive in and illustrate how simple it is to use. So I'm going to go over to the Twitter scraper, and I've got one of my buddies' Twitter pages over here. This guy's named Matt Larson, a very intelligent fellow; a lot of my followers have heard of him, he's in the agency space, feel free to give him a follow if you'd like. Essentially, I'm going to plug his profile into this Twitter scraper and we'll get a bunch of results back. I'm going to click Start and let's see what happens. As you'll see, this is just so incredibly easy: a lot of the time the fields are literally just "put in a Twitter URL", and you can imagine how, using Make.com, you could consult a Google Sheet with a list of Twitter URLs, take every one of those URLs, and pump them into the actor. What Apify is doing behind the scenes right now is setting up a cloud server with a script; that script goes onto Twitter, extracts posts and post data (date, time, copy, all that stuff), and then makes it available to us here. Now, I set my scraper at 100 tweets and, yeah, it just scraped the hell out of it. If we look at it now, there are literally 100 tweets, 50 per page across two pages, including the full text, the number of replies, the retweets, the favorites, the views, and even the URL.
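If you're curious what that "put in a Twitter URL" form actually sends to the actor, here's a minimal sketch of building the input document yourself. The field names (`startUrls`, `tweetsDesired`) are assumptions modeled on the UI shown here; check the actor's own input schema before relying on them.

```python
import json

def build_twitter_input(profile_urls, max_tweets=100):
    """Build the JSON input document for a profile-scraping actor run.

    Field names are illustrative; real actors document their own schema.
    """
    return {
        "startUrls": [{"url": u} for u in profile_urls],
        "tweetsDesired": max_tweets,
    }

payload = build_twitter_input(["https://twitter.com/some_profile"], max_tweets=100)
print(json.dumps(payload, indent=2))
```

The same dict could just as easily be assembled from a Google Sheet column of handles, which is exactly the Make.com pattern described above.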
Which is fantastic, because there's just so much we can do with this. I mentioned that later on I'm going to create a little repurposing (parasite) social media tool that takes every one of these tweets, rewrites them using AI, and maybe adds them to a content calendar or something. These sorts of flows are so incredibly easy to build with Apify; all you need to do is hook it up to Make.com with a one-click module, which I'll show you in a moment. Pretty neat, huh? You even get the images as well. Not that I think that's super relevant, but you get a lot.

That's just the Twitter scraper. If we go back to the Store, there are way more; there's an Instagram hashtag scraper, for instance, and I've received probably three questions in the last couple of weeks about scraping Instagram. All you need to do to make this work is pump in a hashtag; the default here is "webscraping". You can scrape a certain number of posts per hashtag; let's say 10, just to make this a little faster. You have some run options and all that, but generally speaking I don't ever really use the advanced ones; usually there's an actor that does exactly what I want, so I don't have to muck around in the back end or the code. This is now, again, spinning up a server; the server has a script, and that script scrapes Instagram (well, maybe not logging into Instagram, but scraping it) for posts that include the hashtag. It's going to get the post URL, the number of comments, the first comment, and a bunch of other data for us, which is pretty interesting. You can even get the author, and if you pay close attention to a lot of the other stuff I talk about in my videos, you'll see there are tons of other tools out there that let you take, say, an Instagram author and then scrape their posts. So we can weave three or four of these pre-built scrapers together super easily: maybe Apify on the front end scrapes hashtags and compiles a list of users, then PhantomBuster takes in that list of users and gets their posts, then you feed those post images to AI to tell you something about them, and then, I don't know, your next step is a pre-drafted DM sent directly to the person. The options here are virtually unlimited, but that's just a quick example of the Instagram scraper.

Let's check out another one I find myself using quite often. I used this YouTube one in a previous video, and it was super useful. I also have a lot of people asking me for job post scrapers, so one thing I'm going to do over the next couple of weeks is show you how to apply to jobs automatically, or, if not apply automatically, then template out your application, CV, and cover letter so it's 99% of the way there. Let's say I want to scrape a job listing website, Indeed, the largest job listing website in the world, and maybe I'm looking for web developer or web scraper jobs. This is a very quick and easy way to do so. All you need to do is put in the location (maybe I'm in Seattle, searching in the United States), say I only want 20 jobs for right now, and click "Save and start". Again, within probably 30 seconds to a minute, this goes out onto Indeed, scrapes the data, and transforms it into JSON, essentially a completely platform-agnostic format that we can plug into Make or into any other tool on the planet. In this way we can build very complex chains of scraping that enable us to do super cool things.

So I love Apify to death. I've shown it to tons of other people in the Make.com/Zapier automation space, and one thing they always say to me is, "Oh man, I wish I could use this, but I don't know how to program." I feel like there's a connotation, some kind of cultural meme, where people who try Apify are put off by the way it's pitched to developers. But you really don't need to be a developer to use it. As you can see, I just got a bunch of Indeed job posts with job titles and a bunch of information about these companies, and I didn't really have to do anything: all I did was plug in three parameters (the country to search in, the job title to search for, and the city), and it pulled up a bunch of data I could obviously use to do some pretty cool stuff with.

I can imagine some very interesting flows here where I scrape a list of job URLs and then check whether there's another scraper that gets me more information about every job. I'm just going to type "indeed", and you'll see there are actually multiple scrapers for pretty much every platform, because this is a store and developers are competing to build the best product, the best scraper. This one looks like it extracts detailed job listings directly from Indeed, which is pretty exciting, and it has a few additional parameters, I think, which is neat. You can plug that puppy in, and while that's running I'm going to head back to the Store and look at some other scrapers. You can scrape booking.com listings, Google Trends data, Facebook ads, which is pretty cool.

You can even monitor a website for content changes by screenshotting it (something somebody requested a couple of days ago). Say you have three or four competitors in your niche: you can track a website every single day, and any time there's a change, say a new blog post is added, the screenshot will be different and you can receive a quick little email. Or you can use it in a flow: every time the website gets updated, a web scraper runs, goes to the website, maybe scrapes the new blog post, rewrites it, and posts it on your own site. The opportunities here are, quite frankly, ridiculous, and that's why I'm convinced these sorts of tools are real superpowers; they're kind of insane. If you're into coding, you can look at the back end here and you'll see a bunch of log messages left by the scraper. I don't want to dive too deep into the code, so I'm just looking at the tabular data that's output after every actor run, but you have a real wealth of potential here with Apify, and I'd highly encourage you to jump on and give it a go.

I don't really use anything except the Store and the Actors page unless I'm creating my own scraper. I have created a bunch of my own; to do so, you click "Develop new" in the top right-hand corner, and for the people who are more code-inclined than the rest of us, I usually use Crawlee with Puppeteer and Chrome. I always start with a template and then go and write my scraper.
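As a quick sketch of the change-monitoring idea a couple of paragraphs up: the core of it is just hashing each day's screenshot (or page HTML) and comparing against yesterday's hash. The screenshot bytes would come from a screenshot actor; here they're placeholder byte strings.

```python
import hashlib

def digest(content: bytes) -> str:
    """SHA-256 fingerprint of a screenshot or HTML payload."""
    return hashlib.sha256(content).hexdigest()

def has_changed(previous_digest: str, new_content: bytes) -> bool:
    """True when today's content no longer matches the stored digest."""
    return digest(new_content) != previous_digest

yesterday = digest(b"<html>old blog index</html>")
print(has_changed(yesterday, b"<html>old blog index</html>"))  # False
print(has_changed(yesterday, b"<html>new post added</html>"))  # True
```

In a real flow you'd store the previous digest in a Make data store or a Google Sheet and trigger the rewrite-and-post steps only when `has_changed` comes back true.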
When you write your scraper, you define which elements on the page you want to extract and where you want to send the data. I've built a number of really interesting ones: I'll scrape some job listing data source, look for the company name somewhere on the page, use that company name to fire another scraper that looks on Indeed or LinkedIn, and then use that to enrich a bunch of emails related to the business so I can send cool outreach. There are tons of flows you can build out if you know a little bit of code as well, but, as I mentioned, there are more than enough pre-built actors for most people.

Aside from the Store and the Actors page, some other sections may be of interest to you. There's Schedules: if you have an actor, you can schedule it automatically here. I personally don't use this, because I usually schedule things in Make.com, which I'll show you how to do in a second. There's Storage, a sort of database that keeps track of all the outputs from your actors over time. And there's Proxy. I was getting questions from people about whether you can scrape with proxies in Make.com; I've found a couple of ways to use proxies in Make, but they're very difficult. So what I usually do is run any scraping request that needs a proxy through Apify. I'll still trigger it from Make, but the request itself runs in Apify, and I use their billing for a datacenter or residential proxy. This lets me scrape essentially undetected and rotate the location I'm scraping from, so I never really get pigeonholed or blocked. But okay, I think that's enough of just talking about Apify; let me show you how to implement and use it in Make.com.
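Before moving on, here's a sketch of that proxy idea for anyone calling Apify Proxy from their own code. The username format (`groups-...`, `country-...`) follows Apify's proxy documentation as I recall it; verify the exact values against your own console before using it.

```python
def apify_proxy_url(password: str, group: str = "RESIDENTIAL", country: str = "US") -> str:
    """Build the proxy connection string for Apify Proxy.

    `group` and `country` select the exit-node pool; rotating countries is
    what keeps you from getting pigeonholed to one location.
    """
    username = f"groups-{group},country-{country}"
    return f"http://{username}:{password}@proxy.apify.com:8000"

# A requests-style usage sketch (the actual HTTP call is commented out):
proxies = {
    "http": apify_proxy_url("MY_PROXY_PASSWORD"),
    "https": apify_proxy_url("MY_PROXY_PASSWORD"),
}
# requests.get("https://example.com", proxies=proxies)
```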
I think that's just as important, if not more so. First, I'm going to call this scenario something like "Apify test bed". Then, in your new scenario, all you need to do is type "apify", and you'll see there are a bunch of built-in modules (you can also use Apify via its API, but the built-ins are usually more than enough). The two most important triggers are "Watch actor runs" and "Watch task runs". I believe a task is a specific instantiation of an actor, whereas an actor is the more general thing; I only ever use "Watch actor runs". So you could set a schedule in Apify, say to run every day, and then every day when the run is done and has a bunch of data, you just watch that actor run and get an event. Even though you're running it outside of Make.com, you still end up with all the data and can use it to move the next part of your flow forward. Alternatively, you can run an actor from Make.com, and that's what I'm going to do here.

Once you open the module, you need to make a connection. To do that, go to console.apify.com, then Account, then Integrations; there's an API key there that you can copy and paste in. I just did that off screen, because I don't want to show anybody my API key. Once that's done, click the little dropdown and you have access to all of the actors you've used from the Store: I could scrape Indeed, I could scrape Twitter (which I'll do in a second), the web scraper, the website content crawler, the YouTube comment scraper, the YouTube scraper, and so on. There are also a couple of additional options: input JSON, build, timeout, and memory.

To use this with Make.com you have to find the specific actor; in my case I want the Twitter scraper. You'll see there's a regular form view and a JSON view. The form fields we were filling in earlier are just a user interface Apify has whipped up for us; if you look at the actual input provided to the actor, it's just a bunch of JSON that looks like this. So all we need to do is go over to the JSON tab and paste it in. And if you wanted to pull this data from an external data source, that'd be pretty easy, right? You'd add, I don't know, a Google Sheets module, maybe search rows once a day, make that the input, and then, where we're feeding in the Twitter profiles of the people we want to scrape, you'd grab them from the Google Sheets module back here. Pretty simple, straightforward stuff, but there are a couple of things that may trip you up if you're unfamiliar with how it works. I want my stuff to look pretty, so I'm going to run this through a JSON formatter, go back, and get a nice list of handles. For the purposes of this, I'm just going to scrape my own personal Twitter profile, which I believe is just my handle. Yes, it is.

Now, we can specify the build down here, but if you look at the little light bulb icon, it says that by default the run uses the build specified in the actor's default run configuration, which is typically "latest". So if you don't include anything, it'll always use the latest build. This is more of a programming concept than anything else and you don't really need to worry about it; just know that I always leave the build blank. I also always leave the timeout and the memory blank. Basically, if you leave them blank, it just uses the default run configuration.

Okay. When you click "Run once", it's not like the actor is running live inside Make and waiting for you; clicking "Run once" just sends a signal over to Apify telling it to run. In Make you just receive a thumbs-up saying "hey, this is now running", which isn't actually super useful on its own, because we can't do anything with it. So here's what I'd recommend. If you're scheduling it like this, schedule it using Make at whatever interval you want; maybe you run it every day at 12:00 p.m. (I think I have to hard-code that in here; yeah). Then have a separate scenario that works like this: watch actor runs, then get the data. So let's label the first scenario "trigger Apify run"; it just triggers Apify to run the specific scraper we want once a day. Then, in the other scenario, we go to Apify and choose "Watch actor runs". We create a webhook using the connection we just set up (in my case, my Apify connection), pick the Twitter actor, name the webhook "finished Twitter scraper", and save. Now, any time my actor finishes running, I'll receive a notification in that other window. So in the first scenario I just triggered the Apify run; it's now running in the back end, and the other scenario is just waiting. If I go back to Apify, I can see the run happening live: I triggered it in Make, it's executing in Apify, and we have access to the user interface here. I know it's going to finish scraping in the next 10 seconds or so, and I just wanted to show you what it looks like on Apify's end.

There's one additional step we need to make this work: watching actor runs isn't actually enough to pull the data into Make. We need to add a second module called "Get dataset items". Basically, once we have the run's ID, we ping Apify again with a GET request, and then we have access to the whole dataset. This is just a quirk of Apify's API; every API is a little different in how it works, and Apify has a sort of two-step. The first thing that happens when the actor is done is that you receive a notification like this. It says "succeeded", but you'll see there's actually no data here; it just says two requests, two succeeded, zero failed. We can't access the results themselves yet. So you add "Get dataset items", connect it to the flow, and if you click on it you'll see the field that really matters is the dataset ID. For the purposes of this, I'm just going to hard-code it. There's a "dataset ID" over here, and I wasn't sure at first whether to use it or something in the meta options, but it's actually "defaultDatasetId"; that's the one you use. I don't want to wait for the actor to run again, so I'm going to copy and paste the default dataset ID in for now. You can also set your own parameters here: maybe you only want JSON, maybe you want CSV, maybe you're outputting HTML for whatever reason.

Now I'm going to run this, and you'll see we've received a list of all those posts. The "Get dataset items" module has a built-in iterator, so the outputs come through in bundle form, one bundle per post (at least for the Twitter scraper). This is pretty exciting, because now we have all this data in Make.com. And if you think about it, what we can do now, say I want to create that parasite campaign, is go to OpenAI, add a completion module, plug that puppy in, and set the model to GPT-4 Turbo. I'll do a system prompt first: I'll say you're a helpful, intelligent writing assistant. Now, I'll admit this is sort of cheeky.
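If you're wondering what Make is doing under the hood, that two-step (start a run, then fetch the run's default dataset) maps to two calls against Apify's v2 REST API. A minimal sketch; the actor ID and token here are placeholders, and the actual HTTP calls are left commented out.

```python
API = "https://api.apify.com/v2"

def run_actor_url(actor_id: str, token: str) -> str:
    """POST here (with the input JSON as the body) to start an actor run."""
    return f"{API}/acts/{actor_id}/runs?token={token}"

def dataset_items_url(dataset_id: str, token: str, fmt: str = "json") -> str:
    """GET here to pull the run's results once it has finished."""
    return f"{API}/datasets/{dataset_id}/items?token={token}&format={fmt}"

# Usage sketch with the requests library (network calls commented out):
# run = requests.post(run_actor_url("someuser~twitter-scraper", TOKEN), json=payload).json()
# dataset_id = run["data"]["defaultDatasetId"]
# items = requests.get(dataset_items_url(dataset_id, TOKEN)).json()
```

The `defaultDatasetId` hard-coded into the Make module above is exactly the `dataset_id` in the second call.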
This is a parasite campaign, so morally and ethically speaking, do with this what you will. I'm not doing this because I want everybody to run parasite campaigns or because I think they're amazing; I'm showing you how because it's a very common request that people are willing to spend money for, and it's also something you can use to keep an eye on competitors or improve your own business standing. So use this sort of thing with caution; don't go creating parasites for literally everybody. What I'll do is write a system prompt that defines what the model is and how it self-identifies, then a user prompt where I give it instructions. After that comes a user prompt, then an assistant prompt, alternating; these are examples of the input and output I want from the model before I finally run it on real live data.

So I'm going to say: "You are a content spinner. Your job is to take a post from Twitter and rewrite it in a way that encapsulates the original theme in different words. Output in JSON." Now I'll give it examples: a user message, then an assistant message, and I'm actually just going to do one example, that's it. I'll go on my own Twitter and find some posts I wrote myself. If I go to Highlights, it's probably all stuff I wrote. Yeah, it's just me bragging about my YouTube, which is pretty funny. Here's a good one: "I made $210,000 in the last 12 months on Upwork. Work two to three hours per day, spend a big chunk of the year traveling. Would there be interest in a long thread detailing how I did it?" Awesome. I'm going to paste this in as the example post.

For the example output I'll write: "Over the last 12 months I made $210,000 on..." Actually, let's use a different word than "made", since "made" is in the original. "I profited $210,000 on Upwork"? No, that's not really accurate; "made" is different from "profited", since there's a revenue share. "Generated"? That sounds kind of tacky. Okay, whatever, let's finish it with "with minimal work and a travel-heavy lifestyle. Want to learn how?" That looks good. Okay, great. And actually, I don't even need to output in JSON; I can grab the text directly as-is. Now I'll feed in the actual text from the flow, which is the "full text" field, set max tokens to 496, and that's that.

Next, let's add a Sheets module. I'm going to create a sheet called, I don't know, "Parasite Twitter Campaign", share it with the account I know has access, then come back here and add a Google Sheets module. Every time we generate a new rewrite, we add a row to the sheet. Simple, easy, beautiful. Now we select the spreadsheet; it's probably in "Shared with me", now that I think about it. Let's see. Oh boy, that icon's taking its sweet time. Yeah, "Shared with me", there you go: "Parasite Twitter Campaign". Then we feed in the sheet name, which is "Sheet1", and tick "Table contains headers"... oh yeah, I guess we need some headers. Let's do "Source Post" and "Post Copy"; this is going to be the laziest Google Sheets build I've ever done. Now I feed the results in: Source Post in column A gets the full text from the scraper, and Post Copy in column B gets the AI output, so we'll be able to see the rephrased version alongside the original.
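The few-shot structure I just described (a system message, one user/assistant example pair, then the live tweet) looks like this as a chat-completion payload. This is a sketch assuming the OpenAI-style messages format; the example strings are paraphrased from the tweets above, and the commented-out call is illustrative.

```python
def build_messages(tweet_text: str) -> list:
    """Assemble the few-shot prompt: system, one example pair, live input."""
    return [
        {"role": "system",
         "content": ("You are a content spinner. Your job is to take a post "
                     "from Twitter and rewrite it in a way that encapsulates "
                     "the original theme in different words.")},
        {"role": "user",
         "content": ("I made $210,000 in the last 12 months on Upwork. Work "
                     "two to three hours per day, spend a big chunk of the "
                     "year traveling. Would there be interest in a long "
                     "thread detailing how I did it?")},
        {"role": "assistant",
         "content": ("Over the last 12 months I made $210,000 on Upwork with "
                     "minimal work and a travel-heavy lifestyle. Want to "
                     "learn how?")},
        {"role": "user", "content": tweet_text},  # the scraped full_text field
    ]

# messages = build_messages(full_text)
# reply = client.chat.completions.create(model="gpt-4-turbo", messages=messages)
```

Adding two or three more example pairs (more `user`/`assistant` entries before the final message) is the easiest way to push the model past superficial word swaps, which becomes relevant below.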
Let me just make sure everything else looks good. There may be some issues with "full text", because some of the tweets being scraped from this profile are replies to people who aren't me, so they have an "@" at the start; those are me responding to somebody. That's okay; let's handle it so we don't go through the process of rewriting posts that suck. I'll add a filter on the text: "contains (case insensitive)" an "@". So if the full text contains an "@", it's a reply... sorry, we want "does not contain", my bad. We'll name the filter "not reply".

So the logic is as follows. We run the Apify actor over here (which I just ran), and when it finishes scraping, this module fires. (I think there's a very loud motorcycle outside, so hopefully you can hear me all right.) Next, we re-ping Apify with the dataset ID to fetch the data. Once we have it, it's a list of all the relevant posts from that person's Twitter profile. We then pass it through the filter called "not reply", which looks for the presence of an "@" symbol; that's just a quick little hack to filter out replies. After that, we send each post over to ChatGPT (GPT-4), which rewrites it for us, and then we go into Google Sheets and add a new row with the source post and the rewritten version. I'll be real, I'm not expecting any Nobel prizes here, but I think there's a fair amount you can do with something like this. Let me expand this and see what's going on.
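The "not reply" filter is trivial to express in code, which also makes its limitation obvious: it's a blunt heuristic that drops any tweet containing "@", including original tweets that merely mention someone. Fine for a quick campaign like this one.

```python
def is_reply(full_text: str) -> bool:
    """Heuristic: treat any tweet containing an @ as a reply."""
    return "@" in full_text

def keep_originals(tweets: list[str]) -> list[str]:
    """Drop replies, keep standalone posts."""
    return [t for t in tweets if not is_reply(t)]

print(keep_originals(["@bob thanks!", "Shipped a new video today"]))
# -> ['Shipped a new video today']
```

A tighter version might only drop tweets that *start* with "@" (`full_text.startswith("@")`), which is closer to how Twitter itself distinguishes replies from mentions.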
shorter here, because this is happening pretty fast, and then I'm also going to unbold this, and because I'm fancy I'm going to use the Inter font. Looks like the original is... it's kind of a copycat, you know? It's using the same emojis, which kind of sucks. Let's check out the second one. "Programming is one of those weird careers that gets exponentially easier. Your first job is extremely difficult to acquire, but every job after that pays you more for less work. When you're 10 years in, it's not uncommon to juggle two or more offers while having a job." And this one's: "Programming is an unusual career where things get noticeably easier over time. Landing the first job can be tough, but subsequent positions often come with better pay for better hours. A decade in, and it's not rare to handle multiple job offers simultaneously, even while employed." This one's good, to be fair. "Television shows are interesting primarily because the characters don't have their lives together." "Honestly, TV shows are captivating largely because the characters are still figuring out their lives." I think that's a pretty clear illustration that we're basically saying the same thing, just in different words. If I could go back in time and change the prompt here, I would. I think using the term "spinning" is really hurting us; when I say the word "spinner", it sort of pattern-matches to just changing a couple of words as opposed to the meaning. "Rewrite each message in a way that encapsulates the original theme but uses different words." And let's change this a little further: "This required very minimal work. I got to do a bunch of fun stuff like travel, enjoy my life, etc. I'm curious as to whether anyone wants a detailed breakdown of how I did it." Let's do that; that should be sufficient. You know, if we had another example here it'd probably be better, but it is what it is. Okay, I'm going to run this here, and then I'm actually just going to go through and delete this
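The "if we had another example here it'd probably be better" idea above is just few-shot prompting: prepend example original/rewritten pairs to the message list so the model sees how aggressively you want it to rewrite. A hypothetical sketch of how those examples would slot into the chat messages:

```python
def build_messages(system_prompt: str, examples: list, new_post: str) -> list:
    """Prepend few-shot (original, rewritten) pairs before the real post."""
    messages = [{"role": "system", "content": system_prompt}]
    for original, rewritten in examples:
        messages.append({"role": "user", "content": original})
        messages.append({"role": "assistant", "content": rewritten})
    messages.append({"role": "user", "content": new_post})
    return messages

examples = [
    ("Television shows are interesting primarily because the characters "
     "don't have their lives together.",
     "Honestly, TV shows are captivating largely because the characters "
     "are still figuring out their lives."),
]
msgs = build_messages(
    "Rewrite each message in a way that encapsulates the original theme "
    "but uses different words.",
    examples,
    "Programming gets exponentially easier over time.",
)
print(len(msgs))  # 4: one system, one example pair, one new post
```

Three or four example pairs, as the video suggests, would just mean three or four entries in `examples`.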
because I just want it to output with better data. Sometimes in Make you have to do a sort of annoying hack with Google Sheets where you delete all of the rows, so I just right-click the rows here and click delete. This doesn't actually do anything visible to change the layout, because I'm deleting rows 2 through 16, but then rows 17 through 32 or whatever just bump down to 2 through 16. But I find that sometimes there's an index stored, or maybe it's cached or something, so if I don't do that, the results will start populating down here, and I'd rather they start populating immediately under Source post at the top. Anyway, I just made a quick little change here, because I wasn't very satisfied with the quality of that prompt; I think we needed to change it a little more. Ideally, what we'd actually do is add three or four examples as opposed to just one, but I was trying to be a little cheeky and I'm short on time. So let's see what we got. Yeah, this is a good one: "Television shows are interesting primarily..." blah blah blah, "Honestly, TV shows are captivating mostly because their characters are a mess." "It always seemed weird to me that the majority of conversations never include a single question. When I talk to people, I ask them tons of questions." So this is a pretty cut-and-dry parasite campaign. I think there's probably a lot you could do to improve this: you might take the post copy from the second step and feed it back into the AI, or you could increase the temperature, or add a few more examples of just how intensely you want to change the original wording. But I think it's clear at this point that this is the stem you could use to build out more or less any high-performing flow of your own. And we did it all using Apify, which is a platform that I think a lot
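If you'd rather do that row-deletion hack programmatically instead of right-clicking, the Google Sheets API v4 equivalent is a `batchUpdate` with a `DeleteDimensionRequest`. A sketch of just the request body (the `sheetId` of 0 is an assumption about the first tab; API row indices are 0-based and end-exclusive, so UI rows 2 through 16 become `startIndex` 1, `endIndex` 16):

```python
def delete_rows_request(sheet_id: int, first_row: int, last_row: int) -> dict:
    """Build a Sheets API v4 batchUpdate body that deletes spreadsheet
    rows first_row..last_row (1-based and inclusive, as shown in the UI)."""
    return {
        "requests": [{
            "deleteDimension": {
                "range": {
                    "sheetId": sheet_id,
                    "dimension": "ROWS",
                    "startIndex": first_row - 1,  # 0-based, inclusive
                    "endIndex": last_row,         # 0-based, exclusive
                }
            }
        }]
    }

body = delete_rows_request(0, 2, 16)  # the rows 2-16 deleted in the video
print(body["requests"][0]["deleteDimension"]["range"]["startIndex"])  # 1
```

You'd send this body via an authenticated client (e.g. the Google API Python client's `spreadsheets().batchUpdate(...)`); inside Make, the manual right-click delete accomplishes the same thing.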
of people watching this video had no idea existed before, you know, 20 or 30 minutes ago. So, quite exciting. What I'm going to do next is save these blueprints so you guys can use them on your own. Obviously, you're going to have to hook up your own Apify actors, and if you're not doing a Twitter parasite SEO campaign, or parasite social media campaign, you're going to have to change the prompt and that sort of thing. But I think this is probably sufficient to at least get you started, and the great news about Apify is that you can get very far with just the $5 monthly budget that's provided to you, along with the default scrapers. You don't really need to know a ton of programming to do this. Awesome, thanks so much for watching this video, guys. I had a lot of fun putting it together. If you have any requests for similar videos, just drop a comment down below with what you'd like to see. I source tons of my videos from comments left on random clips I recorded two and a half months ago, but I'm working my way down the list now, and it's really helping me become more consistent and not have to think too much about what to post; I can just jump in and create something of value for you. Otherwise, please like, subscribe, do all that fun YouTube stuff. I'll catch you on the next video. Thanks!
Info
Channel: Nick Saraev
Views: 8,248
Keywords: automation, make.com, content creation, ai content, google sheets, chatgpt, wordpress, openai, gpt-3.5, blogging, integromat, make, automating, automate, gpt-4, gpt, openai api, indie hacking, small business, $20K/mo, cold email, make money online, make.com for people who want to make real money, make.com money, make.com entrepreneur, make.com guide, make.com tutorial, make.com money guide, make.com course, apify, apify tutorial, how apify works, make.com apify, apify make.com, apify make
Id: wWDQOqcRT48
Length: 34min 9sec (2049 seconds)
Published: Fri May 10 2024