GitHub code?
Thank you so much for sharing this. I was trying to get file uploads working a while back, but ended up putting it on pause because I was having too much trouble (first time ever implementing file uploads; I was trying multer, fs, etc.). I got file upload up and running in just a few minutes with this video, with the added benefit of learning about ProgressBar, classnames, react-dropzone, and busboy! 🙏
awesome to hear that it was helpful! 😄
Yes! can you show more of these type of videos! this is amazing!
Cool stuff! Can you please tell me how you set up the queue?
Thanks a lot for the great video.
I do have a question: how do you sequentially upload multiple files to S3, one at a time? What I did was make a function that uploads the images to S3, retrieves the S3 URLs, and inserts them into the POST form data sent to the backend. The problem I ran into was that it didn't upload the images one at a time, and only the last file's S3 URL was retrieved.
Is there a good way to upload multiple files stored in an array to S3, using map, a for loop, or anything else?
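A sketch of one way to do this, in case it's useful (not code from the video): the usual gotcha is that files.map(async ...) kicks off every upload at once, and it's easy to end up capturing only the last result. A for...of loop with await uploads one file at a time and collects each URL. This assumes the AWS SDK v2; files, file.name, file.data, and bucketName are placeholder names:

```js
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

async function uploadFilesSequentially(files, bucketName) {
  const urls = [];
  // for...of + await uploads one file at a time, in order;
  // files.map(async ...) would fire every upload at once instead.
  for (const file of files) {
    const result = await s3
      .upload({ Bucket: bucketName, Key: file.name, Body: file.data })
      .promise();
    urls.push(result.Location); // the object's S3 URL
  }
  return urls; // one URL per file, in the original order
}
```

If strict one-at-a-time ordering doesn't actually matter, Promise.all(files.map(...)) also returns all the URLs, just with the uploads running in parallel.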
Can you please share the source code?
How much does it cost to operate monthly +/-?
I uploaded files to an AWS S3 bucket and fetched them to show on the frontend, but when I inspect the image tag, the file URL is visible. Anyone can copy that URL and use it to download the images. I want to prevent users from copying the link or downloading through it. How can I do that? Can you please make a video on this?
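A hedged note on this, since it isn't covered in the video: if a browser can display an image, a user can always save it somehow, so downloads can't be fully prevented. What you can do is keep the bucket private and serve short-lived presigned GET URLs, so a copied link stops working shortly after. A minimal sketch, assuming the AWS SDK v2 (bucketName and key are placeholders):

```js
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

function getTemporaryUrl(bucketName, key) {
  // The URL stops working after 60 seconds, so a copied link quickly goes stale.
  return s3.getSignedUrl('getObject', {
    Bucket: bucketName,
    Key: key,
    Expires: 60,
  });
}
```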
Great demo! Keep em coming :)
What happens if the network fails while a file upload is in progress? Will it resume from where it left off, or will the file have to be uploaded again?
I'm not certain of the behavior across the range of interruption states you might encounter. I'll see if I can simulate it to find out 👍
@@holodeck_run Did you find anything? I just saw your video and the same question came to my mind.
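While we wait on a tested answer, here's the general picture as I understand it: a plain single PUT upload has to start over after a network failure, but S3 multipart uploads split the file into parts, and the SDK retries individual parts, so a brief blip doesn't restart the whole file. A rough sketch, assuming the v3 @aws-sdk/lib-storage package (all names below are placeholders):

```js
const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');

async function resilientUpload(bucketName, key, body) {
  const upload = new Upload({
    client: new S3Client({}),
    params: { Bucket: bucketName, Key: key, Body: body },
    partSize: 5 * 1024 * 1024, // 5 MB parts (the S3 minimum part size)
    leavePartsOnError: true, // keep already-uploaded parts if we give up
  });
  // progress fires per part, so you can watch it advance across retries
  upload.on('httpUploadProgress', (p) => console.log(`${p.loaded}/${p.total}`));
  return upload.done();
}
```

Note this only helps on the server-to-S3 leg; if the browser-to-server leg drops, that request typically has to start over.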
Why not use a signed URL to upload directly to S3?
That's an option indeed. Since I need to operate on my server anyway, handling the upload to S3 from my server code isn't a big deal. But if I were going direct from browser to S3 and then nothing after that, I think the signed URL method would make sense.
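For anyone curious what the signed URL route looks like, here's a minimal sketch of how it's typically wired (an assumption, not code from the video; AWS SDK v2 on the server, placeholder names throughout):

```js
// server side: hand the client a short-lived URL it can PUT to
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

function getUploadUrl(bucketName, key) {
  return s3.getSignedUrlPromise('putObject', {
    Bucket: bucketName,
    Key: key,
    Expires: 300, // the URL is only valid for 5 minutes
  });
}

// browser side: send the file straight to S3, bypassing the server
async function uploadDirect(file, signedUrl) {
  await fetch(signedUrl, { method: 'PUT', body: file });
}
```

The trade-off is exactly the one mentioned above: the file never touches your server, so you lose the chance to process it there.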
How is it different from Udemy or Skillshare (subscription-based)?
for the student it may not be too different. for the author it's quite different because CourseLift doesn't take a cut of revenue. Instead, it offers software and guidance for the author to build and sell their courses for just a software fee.
Hello, can you give me the code snippet?
I am also trying to solve this problem with axios (not using the signed URL method, which is another story). My logic is pretty much the same: sending file data to the backend, which is under the /pages/api directory in my case since I am using Next.js. I used the onUploadProgress function to show progress, but that function only tracks uploading the file to my backend, not the Amazon S3 upload step. Therefore, the progress finishes quickly, even before the file gets uploaded to the Amazon S3 bucket. How would you solve this one?
is the issue that when you're working locally, it appears that the file goes to 100% immediately? if so, that will just be an artifact of working locally. you could try throttling your network in dev tools (choose 3G for simulated network latency, for example).
@@holodeck_run Thanks for the quick reply.
If I were saving the file data in the backend file system, or using the signed URL method, it would 100% make sense. But what I am trying to say is that by the time the file data gets passed into Amazon S3's upload method, the progress function in axios has already reached 100%, because the data has finished uploading to my backend. The Amazon S3 upload phase is still in progress, though. Please let me know if there is something I am missing. I am learning a lot from your videos. Cheers mate :)
@@janghanpark320 do you have any code I can look at? that would be the best way to assess :) you can DM on twitter: twitter.com/ryanchenkie
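For anyone else hitting this two-leg progress problem, one approach (a sketch, not the video's exact code) is to avoid buffering the file on the server: pipe the incoming busboy stream straight into s3.upload. The browser-to-server and server-to-S3 legs then overlap, so the browser-side onUploadProgress tracks the end-to-end transfer much more closely. Assumes busboy 1.x and the AWS SDK v2, with placeholder bucket and handler names:

```js
const Busboy = require('busboy');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

function handleUpload(req, res) {
  const busboy = Busboy({ headers: req.headers });
  busboy.on('file', (fieldname, fileStream, info) => {
    // s3.upload accepts a readable stream as Body, so bytes flow on to S3
    // as they arrive from the browser instead of after the request ends.
    s3.upload(
      { Bucket: 'my-bucket', Key: info.filename, Body: fileStream },
      (err, data) => {
        if (err) return res.status(500).json({ error: err.message });
        res.status(200).json({ url: data.Location });
      }
    );
  });
  req.pipe(busboy);
}
```

It still won't be exact (S3 buffers chunks internally), but the progress bar no longer jumps to 100% while the S3 leg is only starting.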
Can I get the GitHub code, sir?
good job
Awesome!
Thanks a lot
I can't understand. Please explain it line by line.