Classification of Satellite Imagery With Deep Learning Model Using Google Earth Engine & TensorFlow

  • Published: Jan 22, 2025

Comments • 23

  • @reginaldotoo3155
    @reginaldotoo3155 3 months ago +1

    Hi, I would appreciate it if you could share how you collected your training data, or a link to a video that explains that. Thanks!

    • @MuddasirShah
      @MuddasirShah  3 months ago

      @@reginaldotoo3155 Hi, I simply visualized the image as a true-colour composite (TCC), randomly picked a few points for each class, and exported them to GCS :)
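
The sampling step described in the reply above (a few random points per class over the area of interest) can be sketched in plain Python. The bounding box and class names below are placeholders, not values from the video; in Earth Engine itself the points would become an `ee.FeatureCollection` and be exported with `Export.table.toCloudStorage`.

```python
import random

def sample_points(bbox, classes, n_per_class, seed=42):
    """Draw n random (lon, lat) points per land-cover class inside a bounding box.

    bbox: (min_lon, min_lat, max_lon, max_lat); classes: list of label names.
    """
    rng = random.Random(seed)  # seeded for reproducible samples
    min_lon, min_lat, max_lon, max_lat = bbox
    samples = []
    for label in classes:
        for _ in range(n_per_class):
            samples.append({
                "class": label,
                "lon": rng.uniform(min_lon, max_lon),
                "lat": rng.uniform(min_lat, max_lat),
            })
    return samples

# Placeholder AOI and classes, 5 points per class -> 15 labelled points.
points = sample_points((73.0, 33.0, 74.0, 34.0),
                       ["water", "urban", "vegetation"], 5)
print(len(points))  # 15
```

In practice you would inspect each point against the TCC and correct any mislabeled ones before exporting.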

  • @IAKhan-km4ph
    @IAKhan-km4ph 9 months ago +1

    nice

  • @taffyelian
    @taffyelian 1 year ago +2

    Would you make a video about how to cloud-mask a single image rather than an ImageCollection? I am processing a Landsat 9 image using an expression. I recently found out that the clouds are affecting the min, max, and mean values, but a cloud-masked ImageCollection can't be processed with the expression. Where can I contact you? Thanks.
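
For a single Landsat 9 Collection 2 image, the cloud mask is just bit logic on the `QA_PIXEL` band (bit 3 = cloud, bit 4 = cloud shadow), so it can be applied directly with `image.updateMask(...)` instead of mapping over a collection. A plain-Python sketch of that bit test (the sample QA codes below are illustrative):

```python
CLOUD_BIT = 1 << 3         # QA_PIXEL bit 3: cloud
CLOUD_SHADOW_BIT = 1 << 4  # QA_PIXEL bit 4: cloud shadow

def is_clear(qa_pixel: int) -> bool:
    """True if neither the cloud nor the cloud-shadow flag is set."""
    return (qa_pixel & CLOUD_BIT) == 0 and (qa_pixel & CLOUD_SHADOW_BIT) == 0

qa_values = [0, 8, 16, 24, 2]   # clear, cloud, shadow, cloud+shadow, dilated-cloud-only
clear = [is_clear(q) for q in qa_values]
print(clear)  # [True, False, False, False, True]
```

In Earth Engine the equivalent would be something like `image.updateMask(image.select('QA_PIXEL').bitwiseAnd(24).eq(0))` applied to the single image before computing min/max/mean, so no `ImageCollection.map` is needed.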

  • @aprilianidiar
    @aprilianidiar 11 months ago +1

    Thank you for the tutorial! Just curious, will it work if we use a T4 GPU instead of an A100 GPU? It seems that the A100 is not free.

    • @MuddasirShah
      @MuddasirShah  11 months ago

      Yes, it will work. It will even work on a CPU, but that will take more time.

  • @RaymondG7
    @RaymondG7 1 year ago +1

    Great tutorial, and thank you so much!

    • @RaymondG7
      @RaymondG7 1 year ago +1

      Questions were sent via the email box :)

  • @pratikdhende3117
    @pratikdhende3117 1 year ago

    Hi sir, can you please guide me through the procedure to use Google Drive instead of Google Cloud Storage?

    • @MuddasirShah
      @MuddasirShah  1 year ago

      You just have to define the I/O paths. I am sorry, I am very occupied these days; otherwise I would have loved to set up a meeting with you.
      Just remove the gcloud credentials and authentication part, i.e. manually remove each variable associated with gcloud, and provide the Google Drive folder path by copying it from the Colab sidebar. It's easy; I am sure you can do it.
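
A minimal sketch of the path swap described in the reply above; the folder and file names are placeholders, not taken from the video:

```python
# Swap a GCS bucket path for a mounted Drive path in Colab.
# In Colab you would first run:
#   from google.colab import drive; drive.mount('/content/drive')
USE_DRIVE = True

if USE_DRIVE:
    base = "/content/drive/MyDrive/gee_exports"   # hypothetical Drive folder
else:
    base = "gs://my-bucket/gee_exports"           # hypothetical GCS bucket

# Every downstream read/write then uses these paths instead of gcloud variables.
train_path = f"{base}/train.tfrecord.gz"
test_path = f"{base}/test.tfrecord.gz"
print(train_path)
```

The point is that only the base path changes; the TensorFlow reading/writing code stays the same, since `tf.data` can read both local (Drive-mounted) paths and `gs://` URIs.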

    • @pratikdhende3117
      @pratikdhende3117 1 year ago +1

      @@MuddasirShah Thank you so much for your valuable response!

  • @xiaoenli
    @xiaoenli 1 year ago +2

    Thank you so much for sharing this valuable learning resource.

  • @OlawaleArowolo
    @OlawaleArowolo 1 year ago +1

    Thank you for this tutorial; it is my guide for a project I am currently working on. But I have a question: in a situation where the query returns many tiles, with multiple tiles covering the same spot within the AOI but with varying image quality, how do I identify the image covering a given part of my AOI? Is there a script to display the tile number on the map when it is visualized?

    • @MuddasirShah
      @MuddasirShah  1 year ago

      Are you talking about the .TFRecord files or the filtered images?

    • @OlawaleArowolo
      @OlawaleArowolo 1 year ago +1

      Thanks for your prompt response. I am talking about the filtered images. In your case you have two tiles, but in my case I have 2041 tiles, and I want to select specific tiles based on their position over my AOI.

    • @MuddasirShah
      @MuddasirShah  1 year ago

      @@OlawaleArowolo Email me at muddasirshah@outlook.com with screenshots of the error you're facing. I'll suggest a solution, but right now I don't clearly understand your question.

  • @MuddasirShah
    @MuddasirShah  1 year ago

    Update:
    Watch this awesome tutorial to understand the TensorFlow part of this code
    ruclips.net/video/HMR_2VkDE9s/видео.html&ab_channel=RobinCole

  • @ChristhianSantanaCunha
    @ChristhianSantanaCunha 1 year ago +1

    Hi, how are you?
    Congratulations on the tutorial; it is a great help to the remote sensing community. I would like to know whether you paid anything to run the code in the cloud and for the bucket? I need to know because I want to test a classification on my study area. Thanks.

    • @MuddasirShah
      @MuddasirShah  1 year ago

      Yes, I paid a very small amount for the GCS bucket; it's about $0.003 / GB / month, I believe.
      Also, purchasing Colab Pro was entirely intentional, to speed up the process.
      One last thing: you can do this without GCS buckets by mounting Google Drive and exporting the imagery and the train/test samples to Drive as TFRecords.
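
At the per-GB rate quoted in the reply above (roughly $0.003 / GB / month; check current GCS pricing for your storage class, since rates vary), the storage cost is simple arithmetic:

```python
def monthly_storage_cost(gb: float, rate_per_gb: float = 0.003) -> float:
    """Monthly storage cost in USD at the per-GB rate quoted above (assumed)."""
    return gb * rate_per_gb

# e.g. 10 GB of exported imagery and TFRecords
print(round(monthly_storage_cost(10), 3))  # about $0.03 per month
```

Exported TFRecord stacks for a small AOI are typically in the single-digit-GB range, which is why the bucket cost stays negligible.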

    • @ChristhianSantanaCunha
      @ChristhianSantanaCunha 1 year ago +1

      @@MuddasirShah Thanks! I will try to run a similar code!