Really good, simple tutorial on how to use Cheerio to scrape pages and set up an API endpoint, straight to the point. Kudos to that! One concern I have is whether scraping the data from these pages is legal or not.
You have to be careful, it's a bit of a grey area. Your scraping tool should respect the robots nofollow/noindex meta tags, and your API should only provide a summary and a link to the full content. If your API delivers the full content, or you're making money from someone else's content without their permission, there's always some risk of a lawsuit.
Isaac Newton said, "If I have seen further, it is by standing on the shoulders of giants." Today, Ania, we are standing on your shoulders to see further. Thanks for giving us that support.
@James Pickering Yes, that's true. My question is about the financial aspect. Can I ask for money in exchange for someone else's publicly available stuff? I mean, can I charge for providing an API which, for example, scrapes your FB account?
@@thedevdavid Just because something is "publicly available" doesn't mean it's not copyrighted. In fact, this has been a massive debate in the US over the past 3-5 years... But here's a rundown on your social media content: if you own the photo (you're the one who took it, or you paid someone to take it and have purchased the rights to it), it is absolutely illegal for someone else to scrape that data and sell it. However, I will say good luck fighting large companies on that kind of stuff... They'll typically just run you through the courts until you don't have any money left to fight it, unfortunately. I hope this helps :)
@@thedevdavid Haha, I didn't mean to imply that you were going to ;) I was mostly pointing out how ridiculous it is that large companies can get away with that kind of stuff with sleazy tactics, lol.
This is fantastic, thank you so much for this. I just started getting my head wrapped around APIs. Now can you do a video on how to use this API, or any API, in your app?
Very informative; there were a few packages I didn't know :) Don't you have a problem with the articles getting out of date? Or does Heroku restart your app at certain intervals? Otherwise you have an array that is populated only when the server starts. The other endpoints are alright, but you might run into IP blocking if your API becomes too popular. The best approach would be to load the articles at regular intervals and only filter them for each endpoint.
@@Jibs-HappyDesigns-990 The server that stores the actual content the API is forwarding. It may consider your API calls a DDoS or other malicious behaviour.
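A minimal sketch of that interval-based approach; `scrapeAllSources` and its output here are hypothetical stand-ins for the axios + Cheerio logic from the video:

```javascript
// In-memory cache, refreshed on a timer instead of only once at server start.
let articles = [];

// Hypothetical stand-in for the real axios + Cheerio scraping logic.
async function scrapeAllSources() {
  return [
    { title: 'Sea levels rising', source: 'guardian' },
    { title: 'Heatwave warning', source: 'times' },
  ];
}

async function refresh() {
  try {
    articles = await scrapeAllSources();
  } catch (err) {
    // Keep serving the stale cache rather than crashing the API.
    console.error('refresh failed:', err.message);
  }
}

refresh(); // populate once at startup
setInterval(refresh, 15 * 60 * 1000).unref(); // then re-scrape every 15 minutes

// An Express endpoint would then only filter the cache, e.g.:
// app.get('/news/:newspaperId', (req, res) =>
//   res.json(articles.filter((a) => a.source === req.params.newspaperId)));
```

Since the endpoints only read the cache, the newspapers see one scrape per interval no matter how many API calls come in, which also helps with the IP-blocking concern.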
Long-time Node dev here… and this video showed me that I'm 3 years behind in not needing the "--save" option for npm install 😂. Also, I have seen and even used a couple of RapidAPIs before, but never tried it as a creator. I was glad to get a quick look inside, and I have a few projects that I might be able to use with it. This is the first video of yours that has come up for me on YT; I think I would like to see more!
Doesn't this lead to copyright infringement on the newspapers content?
Thank you for a great tutorial, Ania. It really inspires me to dig a bit deeper and explore what more can be built on top of such an app. One question regarding the code, though: why, in the "/:newspaperId" route, do you make a new axios GET request instead of filtering matching results from the existing articles array based on the ID provided in params?
Although that is possible, this approach seems more desirable for optimising performance and avoiding code complexity issues: an initial GET request for all items followed by a higher-order function like filter would be bulky if there were around 10,000+ items in the array. At the end of the day, JavaScript is single-threaded (:
Because a RESTful API is normally expected to serve /foo for "get all" and /foo/:id for "get one". Also keep in mind that you normally send articles "in packages", e.g. in sets of 50 (because you might have 3 billion stored). So if you want some very old ID, you might need to reload 50 articles at a time in a loop on the backend, over and over, until you find the correct one with the filter function.
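A rough sketch of those two route shapes, with hypothetical data and a page size of 50 (a real API would query a datastore, not an in-memory array):

```javascript
const PAGE_SIZE = 50;

// Hypothetical store with many articles; stands in for a database table.
const articles = Array.from({ length: 120 }, (_, i) => ({ id: i, title: `story ${i}` }));

// GET /articles?page=N -> one "package" of results, never the whole table.
function getPage(page) {
  const start = page * PAGE_SIZE;
  return articles.slice(start, start + PAGE_SIZE);
}

// GET /articles/:id -> a direct lookup, so asking for an old id doesn't
// force paging through package after package on the backend.
function getOne(id) {
  return articles.find((a) => a.id === id);
}

console.log(getPage(0).length); // 50
console.log(getPage(2).length); // 20 (the last, partial page)
console.log(getOne(117).title); // story 117
```

The direct-lookup route is exactly why /foo/:id exists as its own endpoint: it avoids the repeated "reload 50 and filter" loop described above.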
Thank you, Ania. I discovered your videos and I am amazed by the quality of the content! Regarding selling APIs built from web-scraped data: shouldn't we first contact those websites before selling their data, or is a link back to them enough?
Thanks to you, Mother of Dragons, for this helpful tutorial
👑🐉
😂😂😂🙂
I was trying to figure out why I recognized her!
😂😂😂😂
Lmao🤣😘
One thing I really love about your tutorials is how you explain what might go wrong for someone who is watching in the future. No other tutorials I've watched explain in such detail. Thank you and keep up!
I have watched many programming videos, from some of my favorite programming channels with large followings, but she explains the key concepts and nuanced details many leave out or barely mention.
First video of hers I have watched...instant sub
You should also watch Mada. She, from the future, knows the past.
@@mybusinessnotyours2051 Yeah, okay buddy. You have never programmed in your life; you are here for one reason.
I came to watch something educational while I was eating my breakfast, and I didn't even realize that I watched the whole tutorial. Excellent 🥳
Am I the only one who feels it's a unique experience being taught by Daenerys Targaryen on how to make money from selling APIs?
🐉👑
correct
As an Indonesian it feels even more bizarre, actually, because API in Indonesian means fire 😂
@@aniakubow Could you convert this to a web scraper (for specific words, etc., not a specific site)?
@@bfunkydunk Off the top of my head, I think you would use something like Python for doing that on large datasets, as it will be much more optimized and multithreaded for searching lots of documents at once.
That being said, it seems like the parser she is using here (Cheerio) is just searching the DOM, so if the word is referenced in the DOM (static) or loaded onto the page, I don't see why you couldn't search for a different/specific word.
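For instance, a plain-JavaScript sketch of the word-search idea, assuming an `articles` array has already been scraped into memory (the titles and URLs here are made up):

```javascript
// Hypothetical articles array, as produced by a Cheerio scrape elsewhere.
const articles = [
  { title: 'Climate summit opens in Glasgow', url: 'https://example.com/a' },
  { title: 'Markets rally after tech earnings', url: 'https://example.com/b' },
  { title: 'New climate report warns of rising seas', url: 'https://example.com/c' },
];

// Case-insensitive search for a specific word across everything scraped.
function searchArticles(items, word) {
  const needle = word.toLowerCase();
  return items.filter((item) => item.title.toLowerCase().includes(needle));
}

console.log(searchArticles(articles, 'Climate').length); // 2
```

The same filter could back a `/search/:word` endpoint, so the keyword comes from the URL instead of being hard-coded.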
One year after the video's release, but it's never too late: you won my subscription, thank you :D
Although I completely disagree with your approach of scraping data at runtime, I must say I have watched every second of this video, for the sole reason that you made it so perfectly precise: not too long, not too short. I wish every programming YouTuber made this kind of video with a precise goal in mind. In fact, I have forwarded this video to people who were asking me about a good intro to Node.js.
Won't speak for others, but this really helped me understand a lot of things, even though I'm not looking to sell APIs anywhere. Please continue with your thorough teaching style; these days it's a rare art. Thanks much.
You make some of the best tutorials I have ever seen on YT. You have a great teaching style of anticipating what the student will have questions about and making sure not to skip over details that would be confusing. Thanks so much!
True and I especially like her face which is very fuckable btw.
What a concept! How did I not think of this??
Great work. Ania!
Yep, please create a Django version of this. I know the deployment-to-Heroku part, but transferring a Django API to RapidAPI is where I am struggling.
Yeah dude, definitely make the Django API version too!
Fantastic outline of the concept :)
new tools 4 the future!!
Madalina knew. But she didn't want to say.
I randomly bumped into this video and so glad I did. Learned heaps and feel less scary about Node and the packages used. Thank you Ania.
This is the best tutorial on making your own API. Ania, you are too AWESOME!!
Ania, your tutorials are excellent.
Very well delivered and executed.
Your elocution is perfect, so I can watch them at 2x speed and still catch every word.
No waffle, all the details without diverging. Concise.
You should have WAY more views than you do.
One of the best.
thank you - keep them coming.
simp, she's reading
Congratulations, your accent is perfect!
She's done it omg. She's finally delved into the untapped market of selling API's hahahaha. Keep it up!
So that's sarcasm or what?
I like the way you insert a pause at the end of the sentence, before the final word. It gives off a vibe of TV news.
Easily one of the best tutorials I ever saw.
An original idea and a great, explanation, step by step, nothing left out. Subbed!
I really enjoy this. Not only does she look great, but I can tell for sure she knows how to explain something and has a good way of doing so, without too much blabla... straight to the point.
It's hard, in a way, to get things explained well, because there is just so much to talk about...
Awesome work Ania :) will follow from now on
You make it look so easy! Thank you for such a detailed video!
It is a perfect starter/intermediate video. You include a lot (a LOT) of details, but in the end, everything I need is in this video, and I don't have to open 10 new tabs to understand what is what! As an extra, I also get to learn a few tricks!
Thx again ;)
Loved the video!!
Advice for anyone following along: what she wrote is not production-ready code. It is merely for demonstrative purposes. Please do write production-ready code when you're working on your API, for better support.
Nice! I watched at 1.25x speed and forgot I was at that speed when thinking "Wow, she talks as fast as I do"
I literally just finished your async/sync video series, and this video is the perfect follow-up. Thank you for all your videos! Please keep making them!
Straight to the concepts without beating around the bush !! Wonderful 👏
Your English accent is touching my heart ❤️😍 If I were to learn coding from scratch again, I would definitely learn it with you.
Finally I found a beautiful and knowledgeable coder with whom I can sharpen my skills.
Hi Ania,
Thank you for this tutorial.
I first followed your two-part series about building a simple web scraper, and then, to get it working on mobile (it didn't work because it used localhost), I followed this tutorial and got my web scraper working!
A tutorial about what RapidAPI and Heroku are actually doing, to understand this a bit better, would also be really nice.
Keep up the good work!
I could watch her all day. Thanks!
This is one of the most interesting topics I have seen all year. I have for so long tried to demystify the true power of marketing APIs.
Are you coming from metal forging ages?
You're a great teacher. You've earned me as a follower.
Damn, the idea is so brilliant. Didn't know about API marketplace, by the way.
Thanks a lot!
the idea is so brilliant.
You have an amazing presence and such a clear way of explaining
Thank you Bella!
Been a front-end dev for five years, with one year of back end on ServiceNow. I had no idea this was how you built an API. I've always used API data, but never knew how it was built. I am surprised at how easy it is with Express. Maybe not easy for beginners. I have some basic tutorials on my channel, but this is nice. Just as good as, if not better than, a Udemy course.
Bro, I feel a beginner would follow without any issues, as long as they know the basics of JS.
@@prymestudio I have worked with a lot of entry level devs and they do not understand these things. They might be able to follow the code along, but they are not going to understand every bit of it. Callbacks, promises, GET, POST, these are not beginner level aspects. Express makes this easier obviously.
@@nick_jacob I see what you mean
Realistically, this is an extremely simple API, with the majority of the heavy-lifting API aspects handled by RapidAPI.
This is more about creating the requests that a custom API uses to return data; it's not building an API itself.
And I'm not saying this like it's a problem, but I just want you to know that you probably shouldn't confuse API development with building the endpoints that an API then uses.
Very happy that I have discovered this channel.
You're doing something positive and creative for others :) Don't stop! Especially since you can see the positive reaction from people.
Is this the best channel on YT ? why do i even ask ?! ofc it is !
Your videos are so well put together - you obviously work so hard on these 👍👍👍
You are one of the greatest developers.
Something I have been meaning to look into for a long time, the video is great and very clear. Thank you
the best teacher on RUclips 🥰
💚💚💚
Great tutorial. I've been doing JS since 2001. I liked what they have done with the language, but those lines without the ending ';' freak me out.
You can add them to feel safe.
@@helloukw I know that. Even in the past you could write with or without those ';'. For me, who programs in C and a lot of other languages, it just feels wrong, like something is missing. Whatever; today there aren't JavaScript programmers, just developers. Most of them just use NPM and don't know what happens at build time; they have never even seen the built package, or wouldn't be able to read it.
@@LuizFernandoSoftov Yeah, as a C developer too, I agree it's a bad habit.
@@LuizFernandoSoftov Thankfully Prettier adds them automatically in case I forget!
Yes...Truly.. Ania is Great..
Love from India.
Simply brilliant, thanks for explaining things crystal clear 👍
For Madalina it's crystal too. But she drank from it.
Thank you, Ania. You are so wonderful. You are really a good instructor, especially for a newbie like me.
I wish you knew the NUMBER of sites I have scraped after this tutorial. Thanks, and stay BLESSED Ania 🥰
Scrape responsibly. A lot of sites are putting up paywalls and bot deterrents because the bots are consuming too much bandwidth.
@@gdolphy Yes bro, definitely am doing it responsibly.
Thanks Ania for this fantastically interesting, valuable and well-crafted tutorial. You are a supernova in the ferment of programming tutorials.
You speak very similarly to Christopher Barnatt from Explaining Computers. I appreciate your content and love what you do! Your intellect is beautiful.
Great video with a great accent....
Keep it up....🤝🤝🤝🤝
I have become a big fan of you Ania both professionally and clearly you are a wonderful person.
Thanks so much David! You rock!
Finally, I published my API... Now it's time to learn more javascript and build more... Thanks
Okay, I can't lie, I was a bit sceptical about your video quality, but this is just beyond helpful. It's so easy to follow and understand. I hate JS, but this motivates me to use JavaScript. Thank you for this tutorial and the amazing idea.
Proud of you, dear sister; your contribution to computer science is appreciated.
Big thank you! Can't imagine how much joy and fun I always have coding along with you. Keep it up.
Great voice for a teacher: soothing but engaging. I likey.
API in Malay/Indonesian means fire. Khaleesi, you control the fire (API) and taught us the way to use APIs. Thank you, Khaleesi.
🐉👑 =>🔥🔥🔥🔥🔥🔥
it also means "bees" in Italian; maybe she controls the fire bees as well 😯
Ania is actually very great 👍
Subscribed!!!
Really interested into this one...
why? can you not figure it out?
Thanks!
So kind of you!! Thanks so much Michal!
This is the first video from Ania I've watched. Currently I am learning PHP/CSS/HTML and will start JS soon. I just wanted to say: 'Ku bo' means 'for/with you' in my language (Papiamentu), so basically this comment is gonna lead into a proposal: Ania, tell me, can I be your main simp? 🤣
all your explanations are spot on
Great video. I know it might not be easy to nail down, but it would've been good to see how you can best judge the cost of hosting your API and how much to charge for access (also, I wasn't sure if you covered tokenised API access or whether that's handled elsewhere)?
Thank you so much for sharing your expertise on how to build an API. Thanks again.
Great tutorial, Rapid API wasn’t even on my radar. Thanks 👍!
You are an amazing teacher.
I wonder if there's any legal concern when we create a scraping-tool API?
Less legal and more policy.
You can bribe the law with the money you made from APIs
Ania I am amazed on how valuable your channel is. Keep up the good job.
if she is my coding professor, I will never skip her lectures
Awesome!!! So easy to understand, and so important and productive for work!!! Thanks Ania for sharing your knowledge. I'm becoming a fan of all your videos. I just found this channel a few days ago and have watched a lot of your code-alongs!!!!
In addition, don't forget you can take advantage of communicating with subprocesses in the OS (for things that run at that level), take deeper advantage of collaborative multitasking, and use queueing data structures to stash highly blocking requests that you might not want to store as pure javascript objects in case your application crashes/has bugs.
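A minimal in-memory sketch of that queueing idea; `runJob` is a hypothetical stand-in for the blocking work, and a real setup would persist the queue (e.g. in Redis) so jobs survive a crash, as the comment suggests:

```javascript
// Minimal FIFO job queue for deferring heavy work off the request path.
const queue = [];
const completed = [];
let busy = false;

// Hypothetical stand-in for the slow part (a scrape, a big parse, etc.).
async function runJob(job) {
  return `done: ${job.name}`;
}

function enqueue(job) {
  queue.push(job);
  drain(); // fire-and-forget; drain() guards itself against re-entry
}

async function drain() {
  if (busy) return; // process one job at a time to keep the event loop responsive
  busy = true;
  while (queue.length > 0) {
    const job = queue.shift();
    completed.push(await runJob(job));
  }
  busy = false;
}

enqueue({ name: 'scrape-guardian' });
```

An endpoint can then enqueue a job and return immediately, instead of holding the connection open while the blocking work runs.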
Excellent tutorial and practical overview on how to get an api up and running. Thank you.
Really great video; never thought of this. The only issue I can see would be the policies of the newspaper sites being scraped. Many already have their own API, or probably forbid others from storing their data like this.
This was intended to be an educational video and web scraping is a quick way to achieve that. When you create your own API, you are responsible for making sure that you are legally able to do what you intend your API to do which is beyond the scope of this tutorial.
@@dokgu I didn't think of that 😮
You are maybe one of the prettiest dev I have ever seen. 😅
Ania, you explain things very well. I can manage even with my level of English ;-) Thank you 🙂
I'm so glad i stumbled upon you. Subscribed! I'm extremely new to this and trying to learn.
Ania! Thanks for this tutorial. Exactly what I was looking for to do multisite scraping.
One question for you, how would you save those results in firebase or any other type of db, so they don't get lost?
And how would you then make an API from that db, so they're all there no matter the date?
Such a good question; I am battling with the same dilemma.
Once you have the data that you need, you can create some code to filter out exactly what you want and store it in a hash/object, an array, or an array of objects. Then, instead of creating yet another API, you can create some more code that displays the information you want, rather than having the APIs display the information immediately.
If you want, you can also use the Date object (new Date() gives you the current date) and store all of the information you've gathered based on date and other specifications. If you'd like to keep all of the data regardless of date, you can simply combine everything you scraped into an object of objects/arrays; and since you store the date along with each piece of data, you can always search for it by date when you finally need to pull it up.
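For example, a small sketch of the date-keyed storage idea (the scraped items are made up, and a real app would write `archive` to a database like Firebase rather than keep it in memory):

```javascript
// Hypothetical results from one scraping run.
const scraped = [
  { title: 'Flood defences approved', source: 'guardian' },
  { title: 'Drought hits harvest', source: 'times' },
];

// Archive keyed by ISO date, so older runs are never overwritten.
const archive = {};

function storeByDate(items, when = new Date()) {
  const key = when.toISOString().slice(0, 10); // e.g. '2024-05-01'
  archive[key] = (archive[key] || []).concat(items);
  return key;
}

const key = storeByDate(scraped, new Date('2024-05-01T09:30:00Z'));
console.log(key, archive[key].length); // 2024-05-01 2
```

An endpoint like `/articles/:date` would then just return `archive[req.params.date]`, and omitting the date parameter could return every run combined.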
It seems easy the way she explains it! Thank you!
Great video and definitely learned a lot.
When doing the :newspaperId section, couldn't we just filter the articles array instead of using filter/find on the newspapers array? That way we wouldn't be using time and resources to duplicate the effort if we already have all the articles.
.filter() returns an array of the objects that match the search criteria (and .find() returns the first matching object itself), so there's no need to run another request; just get whichever property you want from the result.
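A quick sketch of the difference between the two, with hypothetical newspaper data: `.filter()` returns an array of matches, while `.find()` returns the first matching object itself.

```javascript
// Hypothetical newspapers array like the one in the tutorial.
const newspapers = [
  { name: 'guardian', address: 'https://www.theguardian.com/uk' },
  { name: 'times', address: 'https://www.thetimes.co.uk' },
];

// .filter() always returns an array, even for a single match...
const filtered = newspapers.filter((n) => n.name === 'guardian');
console.log(Array.isArray(filtered), filtered.length); // true 1

// ...while .find() returns the matching object itself (or undefined).
const found = newspapers.find((n) => n.name === 'guardian');
console.log(found.address); // https://www.theguardian.com/uk
```

So for a "get one by id" route, `.find()` is usually the more direct choice, since it avoids indexing into a one-element array.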
thank you queen of free coders , the breaker of bugs and mother of dragons.
RapidAPI looks cool and easy but, dang, 20% goes to them? And your users get locked into endpoints with their domain? I think learning Stripe's Metered Billing API + auth tokens might be better.
Well, it may be easier to get clients in the RapidAPI ecosystem, since there will be people looking for APIs there. But yeah, you could potentially make much more money on your own.
It is a little steep. However, it comes down to your business model... 20% at certain scales might be worth it if you want to avoid the marketing costs to reach that audience.
Especially at the nascent stage of your business, you might tell everyone to go to RapidAPI so you don't have to deal with all the infra to bill and rate limit requests, but later use your mailing list to incentivize people to move to your new version after you have the volume to justify the work.
One more platform trying to create a captive market. For 20%, they should at least host the API, and that would still be expensive.
How might it be better? How can you offer your services and earn money?
Otherwise you have to spend extra money on marketing, and know how to do marketing. The one who owns the flow of customers owns the business; that's an axiom.
You are a very good teacher, thank you, I am learning node and especially npm and you are by miles the best tutor I have watched. Also I love your accent!
Can't wait for 26 hours 😭😭... Btw great content Ma'am
how come?
@@Shoe_On_Head I mean, I posted this comment a day ago... At that point it was showing "premiering in 26 hours"
Thanks to the universe that wonderful people like you still exist in this world. I share your goal: knowledge must not be concentrated. Thanks to people like you, the world can be a better place. A big thanks and a huge embrace to you from South America, from Chile.
The most beautiful programmer I've ever seen. I'm still a beginner and not an expert in JavaScript or HTML5, but seeing your explanation made me very enthusiastic LOL
Lowl
Thanks so much Ania! You are great! love love love this tutorial!
Nice idea. However, you should consider test automation to catch changes in the data source's schema. A second point: the site will register the frequent access and may start blocking it (which may even happen automatically via AWS routing services), which will impact "your API"'s availability and result in bad reviews.
You're right, but I guess she aimed for an easy-to-understand example, since she covered many different topics in this tutorial. Besides, I know it's just an example, but she should have pointed out that this API is in no way legal to run, let alone provide as a public API, considering the newspaper outlets own the copyright to these articles.
Simple, helpful, insightful and fantastic. Thank you so much for this course, learnt a lot
Great tutorial Ania, thanks :) My only question is whether it is 100% legal to make an API that scrapes data from webpages like The Guardian (the one you show in the tutorial) and then post the API publicly?
There are no specific laws prohibiting web scraping/crawling... and as long as you are not simply copying the "presentation" of publicly available data, e.g. cloning a website and reusing it for commercial purposes, you are not in breach of copyright law either. One might also add that the scraped data links back to the source, so credit is given. If anything, such an API will drive traffic TO the scraped sites; I honestly can't see who'd complain about that :)
always check the ToS of the website you're scraping from and consider if you want to be moral or morally flexible 😄
Thank you so much for sharing!
Hopefully I'll be able to start now! (finally!)
Really good simple tutorial on how to use Cheerio to Scrape pages and Setup an API Endpoint, straight to the point. Kudos to that!
One concern I have is whether scraping the data from these pages is legal or not.
You have to be careful; it's a bit of a grey area. Your scraping tool should respect the robots nofollow/noindex meta tags, and your API should only provide a summary and a link to the full content. If your API delivers the full content, or you're making money from someone else's content without their permission, there's always some risk of a lawsuit.
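As a rough illustration of the "respect the site's crawling rules" point, here is a deliberately simplified check of a URL path against the `Disallow` rules in a robots.txt file for the `*` user-agent. A real scraper should use a full robots.txt parser (and also honour the per-page robots meta tags mentioned above); this sketch only handles plain prefix rules.

```javascript
// Very simplified robots.txt check: honours Disallow prefixes under "User-agent: *"
function isPathAllowed(robotsTxt, path) {
  const lines = robotsTxt.split('\n').map((line) => line.trim());
  let inStarGroup = false;
  const disallowed = [];
  for (const line of lines) {
    const [rawKey, ...rest] = line.split(':');
    const key = rawKey.toLowerCase();
    const value = rest.join(':').trim();
    if (key === 'user-agent') inStarGroup = value === '*';
    else if (inStarGroup && key === 'disallow' && value) disallowed.push(value);
  }
  return !disallowed.some((prefix) => path.startsWith(prefix));
}

const robots = 'User-agent: *\nDisallow: /private/\nDisallow: /admin';
console.log(isPathAllowed(robots, '/news/article-1')); // true
console.log(isPathAllowed(robots, '/private/page')); // false
```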
Isaac Newton said "If I Have seen further, It is by standing on the shoulders of giants", Today Ania, we are standing on your shoulders to see further. Thanks for giving us that support.
Absolutely cool tutorial! Question: Is it legal to ask for money for API usage if you get the data from these sources (Guardian, etc.)?
@James Pickering Yes, that's true. My question is about the financial aspect: can I ask for money in exchange for someone else's publicly available stuff? I mean, can I charge for providing an API that e.g. scrapes your FB account?
@@thedevdavid Just because something is "publicly available" doesn't mean it's not copyrighted. In fact, this has been a massive debate in the US over the past 3-5 years... But here's a rundown on your social media content: if you own the photo (if you're the one who took it or you're the one who paid for someone to take it and have purchased the rights to that content), it is absolutely illegal for you to scrape that data and sell it. However, I will say good luck fighting large companies on that kind of stuff... They'll typically just run you through courts until you don't have any money left to fight it, unfortunately. I hope this helps :)
@@thedevdavid Haha, I didn't mean to imply that you were going to ;) I was mostly pointing out how ridiculous it is that large companies can get away with that kind of stuff with sleazy tactics, lol.
You can bribe the law with the money you made from APIs 😂
Beautiful AND intelligent!
The program too.
This is fantastic, thank you so much. I just started getting my head wrapped around APIs. Now can you do a video on how to use this API, or any API, in an app?
You are doing great work... I love you beauty with brain 😘😘😘
Very informative; there were a few packages I didn't know :) Don't you have a problem with the articles getting out of date? Or does Heroku restart your app at certain intervals? Otherwise you have an array that is populated only when the server starts. The other endpoints are alright, but you might run into IP blocking if your API becomes too popular. The best approach would be to load the articles at regular intervals and only filter them in each endpoint.
that would be a second iteration... make it more "prod ready"
**might run into IP blocking if your API becomes too popular.** I don't understand: who or what do you think would be blocking which IP?
@@Jibs-HappyDesigns-990 The server that stores the actual content the API is forwarding. It may consider your API's calls as DDoS or other malicious behaviour.
This is great!!
Long time Node dev here… and this video showed me that I'm 3 years behind in not knowing the "--save" option is no longer needed for npm install 😂.
Also, I have seen and even used a couple of RapidAPIs before, but never tried it as a creator. I was glad to get a quick look inside, and I have a few projects that I might be able to use with it. This is the first video of yours that has come up for me on YT, and I think I would like to see more!
Doesn't this lead to copyright infringement on the newspapers content?
Google works in a similar way too.
Why? You are just sending traffic to their sites. You are not copying anything.
Nice nice..thank you
@@AlejandroVivas You scrape only the text and don't show their ads, so they don't get any revenue when you show their content in text form.
I love you Ania. Your videos tutorials are excellent.
Thank you for a great tutorial Ania. It really inspires me to dig a bit deeper and explore what more can be built on top of such an app. One question about the code, though: why does the "/:newspaperId" route make a new axios GET request instead of filtering matching results from the existing articles array, based on the ID provided in params?
The other big question is how the articles array gets updated. Answer: only on server reload. BTW, good demo anyway.
Although that's possible, this approach seems more desirable for optimising performance and avoiding code-complexity issues: an initial GET request for all items followed by a higher-order function like filter would be bulky if there were around 10,000+ items in the array.
At the end of the day, JavaScript is single-threaded (:
@@vladbocharov3246 With every HTTP request to the server... a frontend setInterval would do the automation.
@@vladbocharov3246 Great point! What about using node-cron? It seems like that could get the job done, e.g. by setting a daily or weekly schedule.
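A sketch of the periodic-refresh idea from this thread, using plain setInterval (node-cron would work the same way, with a cron schedule instead of a timer). The `scrape` function here is a hypothetical stand-in for the tutorial's axios + cheerio logic.

```javascript
// Cache that the endpoints read from; refreshed on a timer instead of only at startup
let cachedArticles = [];

async function refresh(scrape) {
  try {
    cachedArticles = await scrape(); // swap in the real axios + cheerio scraper here
  } catch (err) {
    // Keep serving the stale cache if the source site is unreachable or blocking us
    console.error('refresh failed, keeping stale cache:', err.message);
  }
}

// Hypothetical scraper stand-in for demonstration only
const fakeScrape = async () => [{ title: 'Fresh headline' }];

refresh(fakeScrape).then(() => {
  console.log(cachedArticles.length); // 1
});

// In a real server you would schedule it, e.g. hourly:
// setInterval(() => refresh(realScrape), 60 * 60 * 1000);
```

Keeping the interval modest also helps with the IP-blocking concern raised earlier in the thread, since the source site only sees one scrape per interval rather than one per API request.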
Because a RESTful API is normally expected to serve /foo for "get all" and /foo/:id for "get one".
Also keep in mind that you normally send articles "in pages", e.g. in sets of 50 (because you might have 3 billion stored). So if you want some very old ID, you might need to load 50 articles at a time in a loop on the backend, over and over, until you find the correct one with the filter function.
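The "sets of 50" point above can be sketched as simple offset pagination over the scraped array. The field names and response shape are illustrative, not the tutorial's.

```javascript
// Return one page of results instead of the whole array
function paginate(items, page = 1, pageSize = 50) {
  const start = (page - 1) * pageSize;
  return {
    page,
    total: items.length,
    results: items.slice(start, start + pageSize),
  };
}

// 120 fake articles -> pages of 50, 50, and 20
const all = Array.from({ length: 120 }, (_, i) => ({ id: i }));
console.log(paginate(all, 1).results.length); // 50
console.log(paginate(all, 3).results.length); // 20
```

An Express route could read the page from the query string, e.g. `paginate(articles, Number(req.query.page) || 1)`.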
You explain things very well, in a way that anyone can understand... you're amazing; never stop!!
Thank you Ania, I discovered your videos and I am amazed by the quality of the content!
Regarding selling APIs built on web-scraped data: shouldn't we first contact those websites before selling their data, or is a link back to them enough?
You should read the terms and conditions of the website you are scraping; many websites have terms against this practice.
Great work Ania, you are my superhero.