My new step-by-step program to get you your first automation customer is now live!
Results guaranteed. Apply fast: skool.com/makerschool/about. Price increases every 10 members 🙏😤
Thank you Nick! Curious why you did not use Apify's email extractor. Again thank you so for this walk through. Very nice!
🎯 Key points for quick navigation:
00:00:00 *🏢 Introduction to Scraping Local Business Emails*
- Overview of the video's focus on scraping local business emails quickly.
- Introduction to the tools and initial setup required for the scraping process.
00:02:18 *🔍 Setting Up the Scraping Process*
- Explanation of the process to locate and target businesses on Google Maps.
- Introduction to using Apify for web scraping without coding.
00:03:40 *⚙️ Utilizing Apify for Efficient Scraping*
- Advantages of using out-of-the-box scrapers from the Apify store.
- Steps to begin using a pre-built Google Maps extractor for targeted searches.
00:05:36 *💡 Customizing the Scraping Process*
- Explanation of deeper city scraping for more extensive data collection.
- Use-case scenarios such as different city and location-based scraping to maximize data retrieval.
00:07:01 *📊 Reviewing Scraped Data Output*
- Details on what the scraped data includes, such as business images and URLs.
- Methods to filter data for more effective email enrichment.
00:08:52 *🔗 Integration with Make.com for Automation*
- Initiating the scraping run through Make.com.
- Explanation of integrating Apify with scenarios in Make.com for automation and data processing.
00:11:21 *🖥️ Automation Workflow Design*
- Setting input parameters in JSON for targeted scraping automation.
- Using variables in Make.com to generalize the scraping process across different locations.
00:14:18 *🏁 Finalizing Automation and Data Filtering*
- Running the scraping module through Make.com and retrieving collected data.
- Setup of data filters to refine output based on the presence of website URLs.
00:16:57 *🛠️ API Setup and Filter Configuration*
- Introduction of the Anymail Finder API module for email scraping.
- Setting up a filter to ensure the website field exists before the email search.
00:19:15 *📄 Google Sheets Data Preparation*
- Creation of a Google Sheet for storing business data (name, website, phone number).
- Plan to add emails using a formula after data retrieval.
00:22:05 *🎨 Formatting and Presentation*
- Emphasis on the importance of presenting data attractively for clients in Google Sheets.
- Basic formatting changes to improve the aesthetic of the sheet.
00:23:55 *🚀 Running the Data Flow*
- Testing the functioning of the scraping and data insertion into the Google Sheet.
- Handling errors with formatting and logic in Make.com for email extraction.
00:28:28 *📧 Handling Multiple Emails*
- Dealing with scenarios where multiple emails are found for a single business.
- Strategies for splitting and organizing email lists in the worksheet.
00:31:08 *⚙️ Improvements and Final Adjustments*
- Filtering for entries with emails to avoid rows with no email data.
- Hardcoding the first email in a list for consistent data formatting.
00:35:15 *⚡ System Functionality and Optimization*
- Brief overview of deriving email data efficiently from cached responses.
- Techniques to streamline the data extraction process for large datasets.
00:36:17 *🔄 Scenario Separation and Scheduling*
- Separating tasks for efficiency: running an Actor and processing local business data.
- Scheduling runs daily for an automated process-management system.
00:37:17 *📡 Implementation of Webhooks and Data Handling*
- Use of webhooks to monitor specific processes and extract data in steps.
- Mapping datasets and controlling data processing to maintain formatting consistency.
00:38:27 *🔍 Data Extraction Outcomes and Adjustments*
- Execution and evaluation of data extraction, showcasing examples.
- Proposing deeper searches and larger datasets for comprehensive results.
00:39:21 *🚀 Exporting and Application of Blueprints*
- Exporting scenarios as blueprints for further use.
- Encouragement to apply learned methods for large-scale data scraping.
Made with HARPA AI
did you just scrape a whole ass video and summarise it with AI😂
@@vora8491 absolutely 😎🤣
Nick, I rarely if ever comment and like content on youtube, I believe I have been on youtube for over 16 years now?(not this account in particular) but I love your content to the extent I am just liking every video I watch because you are providing liquid gold in my opinion, thank you so much for all the incredible videos! Hopefully soon I'll be able to join your network! Thank you again!
I feel honored Adham 🙏 thank you man
Awesome video!
One thing I learned: in Make, you can pass a Google Sheets formula in the Add a Row step. I haven't tested it with this specific example, but it would be something like =split(5. Email list, ",") in Make, and once it gets added to Google Sheets, it will split automatically.
Loving the content, thanks Nick!
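The formula pass-through described above could look roughly like this in the Add a Row field (untested here; `5.Email list` stands for the mapped output pill from the earlier module, and the exact mapping label depends on your scenario):

```
=SPLIT("{{5.Email list}}", ",")
```

Quoting the mapped value matters, since the email list itself contains commas.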
My heartfelt thanks Nick.
Quick question, do we know how anymail does what they do? Just seeing if there's a way to avoid using them because for extreme bulk it can get costly.
I have a couple of guesses: they always check their cache first to see if a contact already exists, since that would be simplest. Then, if you feed in a domain name, they'll run variants of firstname + lastname @ companydomain.com until one sticks. If you only give the company name, they'll run a SERP check with some tool (probably internal) to find the most likely affiliated domain for that company, then push it through the same name-based email flow. If none of that works, they probably spin up a scraper and hit the website directly, looking for emails that match a regex. And if that doesn't work, perhaps they have some sort of lead-share partnership with other databases where they buy the record (this is a stretch, but possible). Hope this helps man.
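If you wanted to replicate just the last-resort step yourself (the regex-scrape guess above, not Anymail Finder's actual implementation), a minimal Python sketch might look like this. The pattern is deliberately loose, not a full address validator:

```python
import re

# Loose email pattern: local part, "@", domain, dot, 2+ letter TLD.
# Good enough for harvesting from raw HTML; not RFC 5322 compliant.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(html: str) -> list[str]:
    """Return unique email-like strings found in raw HTML, in first-seen order."""
    seen: list[str] = []
    for match in EMAIL_RE.findall(html):
        if match not in seen:
            seen.append(match)
    return seen

page = '<a href="mailto:info@acmeplumbing.com">Email us</a> or sales@acmeplumbing.com'
print(extract_emails(page))  # ['info@acmeplumbing.com', 'sales@acmeplumbing.com']
```

You'd fetch each site's homepage (and likely its About/Contact pages too) and feed the HTML through this before falling back to a paid lookup.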
what is the mic you are using and how much does it cost
Samson Q2U. I think it cost me ~$150, so on the pricier end, but not crazy. 10/10 would recommend.
you're such a meme lol
I love it. really wholesome
You always read my mind Nick 🤯
Great video! Can you tell me if building the scraper the old way requires more operations compared to using Apify?
Does this scrape through the entire website for the email? I see that some websites have their email on an About page instead of the main home page.
Great video Nick, thanks!
Could you make a video creating something similar to scrape leads but on n8n?
Great video Nick. I wish you'd elaborate on how to run multiple Maps scrapes and "use cases". Example use cases: (Location 1/Search Criteria 1) + (Location 1/Criteria 2) + (Location 2/Criteria 1), etc. It appears they all use the same Apify Actor, but the inputs for this Actor would not be the same. In your video the Apify Actor is set up for a specific geography and search criteria. But what if you want to scrape multiple criteria AND multiple locations separately? Do you configure the Actor in Apify for each use case, or do you configure it in Make? In your video it appeared that you FIRST configured the Apify Actor for a specific geography (AB) and criteria (plumber). How would you create separate and distinct use cases, say (Plumbers in Wisconsin) + (Carpenters in Quebec)? How do you configure it in Apify, and how do you configure it in Make, to create multiple separate scenarios?
At 3:32 it says the Google Scraper App is $12.00 per 1000 results? Is 1 result the whole page of leads or each lead? If it's each lead, that is 4x more expensive than just using a service.
The specific one I used was $12.00 per 1K listings, yeah. Def on the pricier end of scrapers; think I just used it because it was pay-per-result vs others that were rental/platform usage. Realistically you can probably get to ~1/3 the cost at scale. What service are we talking about?
Excellent. Thank you. Could you show us how to modify the input for multiple cities. I’m a newbie and didn’t understand the syntax I need for JSON or how to add an input sheet with more cities
“I sold this system for $1000-$2000” dude tf actual legend stuff just showing us how to do it for free just mad really😂
Appreciate you bro 🙏
@vora8491 Could you tell us where you sold it? How and where did you find the client? 😊
How do you get to sell it, buddy?
What kinda businesses need this?
G'day, Nick. Thank you for providing amazing content!
You provide excellent insight on creating and selling AI automation, but rarely on maintenance and training after the product has been sold.
Once a client has received the automation template, how are training and maintenance/support handled? Are you charging a one-time fee or will you be kept on retainer? How did you scale while keeping your customers happy?
Thanks Patrick 🙏 a core part of the deliverable is an in-depth video (30 mins or so) that goes through how the system works, where to change things (if necessary), etc. I treat it as documentation, which lets the business operator just copy and paste the link to send to their team as needed for training purposes. If the team needs more training, I use that as a negotiation tool for my retainer: namely, that by retaining me they'll get defined Slack availability their team can use for Q&A, as well as a weekly consulting call that doubles as team training time. Goal is always to go retainer (recurring $). I scaled by leveraging a few near-productized offers, like cold email systems, proposal systems, one-click project management templates, etc., and then structuring my ongoing relationships in a way that maximized value per unit time and positioned me less as a "builder" and more as a "partner". Hope this helps 🤝
I got one question: how do you sell these things? Like, where do you go to find people?
So many lead sources! You can 'buy' thousands of pretty high quality emails on sites like LeadsRapidly or scrape them using approaches like what I show in this vid. You can also source from free communities (Skool, Slack, Discord) or DM people on social media platforms like LinkedIn, Twitter. Offers take a fair amount of work to refine, of course, but eventually you'll get a positive reply rate > 1% and can book meetings selling your service. Check out my copywriting/offer videos for more on that. Hope this helps 🙏
how do you program the google sheets at the first step to pull the different cities?
If you wanted to run this on multiple cities, you'd use a Google Sheet as an input with a column like "City", maybe another one called "Query". Then, you'd use that as the trigger and feed it into an Apify "Run Actor" module that takes JSON as input (which includes mapped city and query values). You'd then iterate through all of the rows in your Google Sheet, launching individual actors for each combination of city & query. Hope this helps 🙏
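As a sketch, the JSON body mapped into the "Run an Actor" module might look like the following. The field names here are illustrative and must match whatever input schema your chosen Google Maps actor actually defines; `{{1.City}}` and `{{1.Query}}` stand for the mapped Google Sheets columns from the trigger module:

```json
{
  "searchStringsArray": ["{{1.Query}}"],
  "locationQuery": "{{1.City}}",
  "maxCrawledPlacesPerSearch": 50
}
```

Check the actor's "Input" tab in Apify to see the exact keys it expects before wiring this up.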
I have almost successfully completed this scenario, but for some reason the "Phone" numbers scraped do not copy over to my spreadsheet... everything else does. Is there anything you can suggest I try? Thank you!
When I'm connecting my API key and I try to run it, the application tells me it received an error because there is no API endpoint at this URL. How do I fix this?
BOOM! Super useful content, as always - thank you, Nick!
My pleasure Nick #2 😈
Did this exactly as shown, and when I set up the Apify "get company's email" step I get BundleValidationError:
Validation failed for 1 parameter(s).
Missing value of required parameter 'domain'. This is such a pain in the ass!
You are Cam right?
Can you make a beginner video? How do you decide which parameters to use when building automations? For example: if somebody fills out a form, everything automatically gets transferred to ClickUp, and a project folder automatically gets created in Drive with a unique number so there can't be any confusion? I'm not sure how to connect everything 😅
Nick, a quick question
There’s a law that prohibits sending emails without the recipient’s consent (both Canada and USA). Scraping tools can help collect emails, but as I understand it, I could face serious fines for sending emails to such a list without permission. Or am I wrong?
That law applies to consumer emails, at least in the U.S.
Really great video. Thank you! Is there a free software to scrape email other than anymail finder?
So, what if the customer requests the ability to pass unique parameters to the scraper API call? E.g. city name, etc.
So useful - thanks Nick🙌
If the email enrichment could timeout on a big data set, do we need to split that step off in to a new scenario too?
Couldn't you, instead of manually adding it to Google Sheets, just tell Make to put that formula into Email 1?
Yup, good catch, would make things easier.
My api key is giving me issues when trying to connect the any mail finder. Any help?
Nick one question, do you host your client scenarios in your make account or you do it in their account giving them the blueprint you previously did?
I get the client to sign up for (and pay for) Make.com, then send me their email & password. I do both of these steps on a kickoff call, which lets me answer questions as they come up; minimizes complexity and ensures no 2FA shenanigans.
So do you generally set all these processes up for the client under their own accounts and give them access afterwards, or do you ever run these as a service ?
@@mredark these are the questions I'm having. Wouldn't it be hard to sell recurring revenue on automation when they already own everything?
They have access to everything from Day 1! Since the client creates their own accounts, everything is owned by them by default. I use their credentials to build automations in their own Make.com account (and whatever other services we’re using). I’ve tried running it as a gated service where I pay for the infrastructure and just charge my clients a fee, but in practice it ends up being a headache and I don’t like the liability. I’m sure you could make it work, though.
@@nicksaraev but do you use any password manager for their credentials, or do they just give them to you and trust that you don't do anything with them?
PS: thanks for taking time to answer me!
Does Anymail Finder require an active subscription to get the emails from the Apify dataset? Mine is asking to add a URL; I don't know which URL.
Thank you so much ✅🥰🥰🥰
I love your video quality. What camera do you use, and what settings?
Oh awesome, just what I need, still finishing yesterday's tutorial
Cheers from BR. Ty!
Nick, thanks for these great videos. They have been so good 👍 I really learned Make from watching your channel and putting a lot of hours into the platform!...
PLEASE can you shed some light on how I could scrape or use an API for TikTok creator search insights? Thanks again
You lost me around 19:14 with the API required for the third module. Where is that?
From what I can tell, Nick already had an account for the paid tool (Anymail Finder) he used to search each company site for email addresses, so he just had to grab his API key. If you're not already using Anymail Finder (which I'm not), you can choose Google Maps Email Extractor (lukaskrivka/google-maps-with-contact-details) from Apify. It seems to combine the first two steps of this process. It scrapes Google Maps, then also scrapes the website of each listing to find email addresses.
Always informative, no-BS videos 👌
Keep getting an error code on my Anymail Finder saying that I am not inputting a valid domain.
can't beat the black shirt, mf out here looking like a billionaire CEO 😤♥
Great video, very useful. Is there any reason to choose this over say RapidAPI? I am new to APIs so I don't know what is easiest or best to integrate with make. Thanks
On point, Nick!
YOU ARE MY HERO 🐐
What mic you using? Sounds incredible.
Samson Q2U + Auphonic in post. One of my community members recommended the latter and it changed the game!
@@nicksaraev That's awesome. What Auphonic filters do you usually apply?
@@nicksaraev Wow Auphonic is amazing
Nick, what do you mean you are not American?😻
I'm Canadian! Live in Calgary AB x)
You skip some parts, so as a beginner I get confused sometimes, but overall thank you.
I built a free Google Maps scraper Chrome extension.
Hey Nick, really nice video! I was wondering if I could help you with higher-quality editing on your videos, make highly engaging thumbnails, and help with your overall YouTube strategy and growth! Please let me know what you think!