The simplest way to automate your browser for FREE
- Published: May 28, 2024
- Get Automa! Not sponsored
automa.site
Join the FREE Skool Community to get a free call with me!
skool.com/ai-automation-mastery
Want a full AI and Automation Audit for your business?
calendly.com/horizonlabsai/audit
Join my newsletter!
mikepowers.beehiiv.com/subscribe
DM me on social media!
/ mikepowershd
/ itsmikepowers
/ itsmikepowers
/ itsmikepowers
Send me an email!
mikepowersofficial@gmail.com
Intro 0:00
Automa Overview 1:18
Google Search 6:00
Amazon Scraper 11:24
Wow, this is sooo useful! Please provide more tutorials on Automa. I love it!
This is SUPER POWERFUL! Please post more videos on Looping and how this could be used for Regression Testing.
The explanation is so nice... it looks so simple, but when we try to do it ourselves we feel the difficulty. Still, nicely done, crisp and to the point. We need some tutorials that help scrape within a workflow and use the results as input for different controls like dropdown lists, radio buttons, and checkboxes, expand JavaScript links to expose hidden data, and then scrape.
Great tool. Will join you on Skool. All the best.
This extension is so powerful !!!
Your content is great man, I've learned a lot from your videos. I know you mentioned n8n previously (it's also one of my favorite tools). Do you see this replacing your n8n use cases? Or would you say you mostly use Automa for scraping and n8n for all your other automations?
Great job! Thank you 🙏
Thank you for your video, it really helps me a lot!
Super useful ty for sharing
Insane, thanks mate!
What a great explanation!
This is very useful! thank you!
Mike! u are a diamond gem of youtube! ❤
Wow, this is sooo useful!
Thanks for the video. That tool is amazing. I recently watched your Google Maps scraping tutorial. It would be great to figure out how to use this tool in conjunction with a spreadsheet full of search criteria and generate a spreadsheet of the resulting leads.
I love your content. Great!
Great video. Maybe some of the good marketplace automations you found would make a good video, and also ones to use as ideas for templates or sequence templates.
Brilliant - yes, more web scraping with multiple cards on one page and pagination, please.
Thank you for putting up the video. Could you show an example of how to click on a drop-down selection?
you are a gem
Quite the playground. Reminds me of using Windows Macro tools a lot in the past... thanks
Thanks 😎👌🏽
I tried it, it's a really wonderful tool❤
Looks like a powerful, useful tool! Just a few thoughts.
1. Like all Chrome extensions, you have to be careful. You don't want to run unsafe flows.
2. I dislike those "page load wait" actions. There's just so much overhead. Can't wait to build automation using AI prompts instead.
3. The browser providers are in a good position to provide this "agent" functionality, since they could train AI models on all the user interactions.
All the points you mentioned have existed for years. For the "wait" action, you have to: if you go to a website and try to get a specific form field called "name" in a form called "theForm", you have to ensure that the form itself has already loaded on the page, otherwise the workflow will fail saying the element does not exist. Also, for the last point, a lot of websites already have "page analyzers" that can generate heatmaps to see user behavior, etc.
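The reply above describes the core pattern behind every "wait" block: poll for a condition until it becomes true or a timeout expires. A minimal sketch in Python, where the lambda is a hypothetical stand-in for however your automation tool looks up an element:

```python
import time

def wait_for(condition, timeout=10.0, interval=0.25):
    """Poll `condition()` until it returns a truthy value or `timeout` expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError("element did not appear in time")

# Simulate a form that "loads" half a second after the page opens.
loads_at = time.monotonic() + 0.5
form = wait_for(lambda: "theForm" if time.monotonic() >= loads_at else None)
print(form)  # prints "theForm" once the simulated element exists
```

This is why a wait action fails fast with "element does not exist" when the timeout is too short: the loop simply gives up before the page finishes loading.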
Of course, typing your passwords into this type of workflow must be avoided; prefer developing this kind of logic in code that lets you put credentials in "environment files" with key-value pairs and reference them directly in the code.
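The credentials-outside-the-code advice above can be sketched like this in Python; the variable names `SCRAPER_USER` and `SCRAPER_PASS` are made up for illustration:

```python
import os

# Read credentials from the environment instead of hard-coding them.
# Set them beforehand, e.g.:  export SCRAPER_USER=me SCRAPER_PASS=secret
username = os.environ.get("SCRAPER_USER", "")
password = os.environ.get("SCRAPER_PASS", "")

def masked(secret: str) -> str:
    """Never print the raw credential; show only its length."""
    return "*" * len(secret)

print(f"user={username!r} pass={masked(password)}")
```

Loading the variables from a `.env` file at startup (rather than exporting them manually) is a common variation of the same idea.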
Finally someone does a tutorial on this
I really like it, but I'm not sure what to use it for. It seems powerful, but can it do loops and run through IDs in an external Excel file, for example, to scrape data for more than just one ID? Because I don't see how this does anything faster than I can do it manually. I definitely look forward to seeing more.
great vid
Great video! Thank you for this awesome content
Epic, mate! How can I run multiple scrapes fully automated?
Man, this is so cool. Thank you for sharing this tool!
This is very powerful.
We're in April, but Mike is probably already my best YouTube discovery of 2024... thanks for the great stuff
Please make a full tutorial video for this extension
Looks extremely interesting I'm definitely going to look into the automa.
PS: Do you know anyone that might be interested in a thumbnail design? I've seen a lot of creators that are trying to get their ideas across, but don't have the time to focus on portraying it nor learning about the details and skills that it takes. I'm offering to do a complete analysis of the target audience, already existing audience, personality and idea of the creator. So that I can create this ideal thumbnail with my existing skills, but also the things that I still have to learn.
It's really a great extension.
I am not a professional in this but I want something to make my work easier. Basically I have a paid subscription to a Yoga video website that uses YouTube streaming and embeds it into their site. I can go into developer mode and easily get the YouTube video ID and that's the manual way I have been doing so far. I would like to automate that and get the final video playing on YouTube itself. This is because the platform that I am supposed to watch it on is buggy and disconnects if the streamer has errors, whereas YT is much more robust. It is super annoying cuz it takes like 2 minutes to refresh and get to the stream again while on YT it stays Live.
Wow, you are a hero
Mike you are blowing up would love to collab and beta test your products for my online solar biz!!!
Hi, is it possible to extract into Excel after creating the flow?
thx
This is gold man! Especially for scraping, I really have to try it out. Can it work on headless mode?
What is headless mode for a browser?
@@BGdev305 Browser headless mode is a feature that allows a web browser to operate without a graphical user interface (GUI). In headless mode, the browser can perform automated tasks without displaying the web page to the user.
Damn the potential of this is huge. Does this also work with websites where you have to be "logged in"?
Yes, you need to be logged in already, then automate it. It uses your browser, where everything is already saved, so when you open a website you'll already be logged in, because you did that before.
Excellent video, I am subscribed and I clicked thumbs up!
Question:
Does this tool send anything to the Internet?
I want to make sure that all of my data is local and not going off into some cloud provider running this automation.
Can you confirm?
I also want to know this @Mikepowers
We need more tutorials, bro (scrolling, applying specific filters, and more stuff)
Can I make click on an extension icon from my browser navbar using automa?
Been using Automa for the last 2 months. Can you make a video about loops? I can't seem to utilize the loop block in Automa, so I use JavaScript instead.
And also one more thing: as you can see, as soon as you run it, it does not update the existing row, it creates a new one. For example, the first run only extracts the name of the product, but when you run it again it creates a new row and puts the rest of the details, like the name and all, in the next row. Is there a way to not let that happen? Hope you get what I am trying to say.
Simply add a module to GET the table or SHEET's next available row
I've got it to open my Google Sheets via URL... how do I get it to go to cell A1, which has a URL in it, open that URL, and then go to A2 and perform the same task again (open the URL)?
It's not a good idea to access a Google Sheet by URL; you can access it with Automa's built-in feature, you can see it from @browserautomate
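The loop the question describes, visiting each URL stored in a column one row at a time, boils down to iterating over the rows of an exported CSV. A hedged Python sketch, where `visit` is a hypothetical stand-in for handing the URL to the automation tool and the sheet content is simulated:

```python
import csv
import io

def visit(url):
    """Hypothetical stand-in for 'open this URL in the browser'."""
    print("visiting", url)

# Simulated export of column A; in practice, download the sheet as CSV first.
sheet_csv = io.StringIO("https://example.com/a\nhttps://example.com/b\n")
for row in csv.reader(sheet_csv):
    if row:              # skip blank lines
        visit(row[0])    # cell A1, then A2, ...
```

The same row-by-row structure is what a loop block in a visual tool produces; the CSV here just makes the iteration explicit.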
Is it good for buying concert tickets?
Great video, but why does your video lag behind your voice?
Also, I have a suggestion.
Can you make an automation where, if we provide a topic, it creates images and copy for different social media platforms like LinkedIn, Instagram, and Facebook, and automatically uploads them to the respective platforms?
This is crazy... do you think it's possible to add an extra set of nodes to that Amazon scraper automation, to bring that data onto a website somehow? Like, to have dynamic product information updated periodically with this automation, to update the website's data. Not sure how efficient this would be; that was just the first idea that came to mind when I saw you explain that Amazon scraper lol.
You could have the scraper save the data to a database or even create a simple API... that would be your source, and from there you could create ANYTHING from it.
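The "save to a database" route suggested above can be as small as Python's built-in SQLite; the table and column names here are invented for illustration:

```python
import sqlite3

# Persist scraped rows so a website or API can read them later.
conn = sqlite3.connect(":memory:")  # use a file path for real persistence
conn.execute(
    "CREATE TABLE products (asin TEXT PRIMARY KEY, title TEXT, price REAL)"
)
scraped = [("B000EXAMPLE", "Example Widget", 19.99)]
# INSERT OR REPLACE updates an existing row instead of adding a duplicate.
conn.executemany("INSERT OR REPLACE INTO products VALUES (?, ?, ?)", scraped)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM products").fetchone()[0])  # prints 1
```

Keying on a stable ID like the ASIN also answers the duplicate-row complaint elsewhere in the thread: re-running the scraper updates rows in place rather than appending new ones.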
It's great. I have some use cases in mind, but first I have some questions:
1. Does it have any loop function? Like the one you showed as a global variable: can I set an array and then loop through it, using each value? If so, that would be helpful; changing the variable manually each time isn't a realistic use case.
2. Does it offer an if/else function? Like, if a variable matches a pattern, do something, else do something else?
Did you pay attention to the very first part of the video? It has a loop module. It also lets you use a JSON object with key:value pairs, at @13:09.
Can you do one to Scrape and Store into a Google Sheet?
Hey brother super sick video. This is actually super clutch for my business.
That said, do you know if it’s possible to set it up where it can scrape multiple pages at a time. Each page it scrapes would have the same data, so it would need to be able to create multiple rows and put multiple rows into the storage.
Did you even examine the modules it has? Loop is one
@@BGdev305 Yes, and I actually figured out it straight up does not work at all for the site I was pulling data from.
saved
So it's basically like puppeteer but with a visual UI
yeah, but how safe is it in privacy terms?
Can you cover loops and pagination
Yes, would be very nice to see how to scrape for example a blog with the same text structure (heading, date, body etc.) on one and the following sites.
Drinking game: take a shot every time he says workflow
(You will blackout)
I love this
He's undercover Austin Powers
Do scheduled workflows work if you don't have the browser open?
Yes, it'll work
@@browserautomate How? I tried it and it doesn't work
The scroll down is not working
So, it's something like puppeteer without the code?
Not really. You can use Puppeteer in the context of an entire application, which makes it vastly more capable. This thing is cool though.
Have you tested this in Firefox? The element selector does not run in Firefox. Your headline says 'Browser', not Chrome. And in Chrome you don't click the nodes, you double-click them. Your mistake? Assuming your audience.
The plugin looks interesting but the company that makes it looks sketchy. Be careful who you give your trusted data.
Isn't this open source tho?
Its an open source project
@@jabirjaleel375 Did you read through the code? Open source means the code is just public, not that it's safe and not malicious. Are you sure they are hosting the same code as their open-source one? Maybe they injected some tracking code. The term "open source" means nothing by itself.
This comment is 2 weeks old atm. The guy in the video did mention that it was open source, and people mention that in replies to your comment as well. Is that a random comment out of paranoia, or do you have something you can argue with?
@@sapient4474 I mean, tbf, open source doesn't always mean no sketchy stuff - take for example the xz backdoor
Frittata
Useful things that can be done this way?
Better than Selenium?
I think yes, but Selenium is only as good as your programming. The best part: there is no captcha and no login each time. I do scrape with it and with undetected_webdriver, and it's so much pain. It does the job, but building it and maintaining the errors each time... the more complex your scraper, the more the pain. I think with this tool a lot of that pain is already reduced.
I want to automate building websites in wordpress.
No code automation. Ah shit here we go again
You never say where ANY of this data used or retrieved is "saved". If one is using one or numerous login credentials for a site, where is this login:password data stored while running, and where is it "saved" to?
When creating these data tables, where are these tables of data being saved? And who has access to them?
I can't imagine a company providing the resources for thousands or millions of people to be using this for "free"!
It is called ASIN :-D
Firefox version?
i like more alfred =/
It's such a shame you think everyone knows what you know; however, it's just going over people's heads because you use tech jargon that most people do not understand. Most of us will have to find another video on Automa to find out what it is and what you can do on Automa and then use Google to translate all the jargon into English - that's nearly every word you used.
seems like you’ve got plenty of time to google other videos Charlie
Based on the subject, how would he use any other words than what something actually is called. There was absolutely no tech jargon whatsoever in this video.
If you think “trigger”, “node” or “cookies” etc is too difficult to understand, then I don’t see how you even would benefit from using Automa?
I think you should learn some basic web grammar first and then move on to automation; then no word would even be remotely close to "tech jargon".
Another option you have is to use AI: just ask ChatGPT or Edge AI to explain the content in the video "for a 10 year old" (it's a technique to get something explained in easy-to-understand language).
Doing it the way you do, Googling each word you don't understand, isn't very effective.
In my opinion this was an excellent overview of Automa for beginners.
I'm waiting for new videos about it on different social networks.
Thanks! How can I run it in the cloud?