Azure End-To-End Data Engineering Project for Beginners (FREE Account) | SQL DB Tutorial

  • Published: 11 Jan 2025

Comments • 119

  • @jaihindreviews8730
    @jaihindreviews8730 11 days ago +1

    Hi @Luke, I went through a lot of videos, some of which went right over my head, then accidentally bumped into this one, and honestly I got a really good overview of Azure. Thanks man. Keep doing the great work.

    • @DataLuke
      @DataLuke  10 days ago

      Thank you Jai, I’m glad it helped 🙌

  • @DonNadie05
    @DonNadie05 2 months ago +6

    What great content you have, truly; I'm glad I found you. I really appreciate the raw narrative. I've passed the DP-900, and this is great hands-on practice which I'm going to use on some data that I have. I was also thinking of using this project to learn Terraform, since I was using the Azure CLI in a bash script to create things, and looking through your channel I've found great Terraform content that I'm excited to check out. You definitely deserve more views; keep up the good work.

    • @DataLuke
      @DataLuke  2 months ago +1

      Hey man, thank you - I really appreciate your comment. I'm glad it helped.
      If you have anything else you wanna see just let me know :)

  • @apawanstevemichael5239
    @apawanstevemichael5239 10 days ago +4

    Thank you! Best way to start 2025 -- I've been searching through countless Azure videos and this one really sums it up. Keep it up, good sir!
    Hoping to be a DE soon, you have my support all the way!

    • @DataLuke
      @DataLuke  9 days ago

      Thank you very much, and a happy new year!
      Good luck mate, you've got this - keep us posted on your progress :D

    • @sahilpar9876
      @sahilpar9876 2 days ago

      @@DataLuke Clear explanation. Is it possible to make one on Power BI as well, covering more scenarios like refresh, bursting, stuff like that?

  • @johnpanagiotopoulos4142
    @johnpanagiotopoulos4142 7 days ago

    Thank you so much Luke for these end to end projects. Will definitely like and subscribe.

    • @DataLuke
      @DataLuke  6 days ago

      I'm glad, you're very welcome! Thanks :)

  • @rakshithuk7414
    @rakshithuk7414 4 months ago +2

    1st view, 1st like. All the best. Your video is added to my playlist and will be watched soon. Thanks for the project.

    • @DataLuke
      @DataLuke  4 months ago

      Thanks Rakshith 🙏 I hope it helps.

    • @apdoelepe7937
      @apdoelepe7937 4 months ago

      Could you please put the GitHub link for the projects in the description? I want to see the documentation.

    • @DataLuke
      @DataLuke  4 months ago +1

      In description now mate!

    • @jay_wright_thats_right
      @jay_wright_thats_right 2 months ago

      LOL, you didn't even watch the video and said you were the first view and like. Shameless!

  • @juandavidpenaranda6136
    @juandavidpenaranda6136 23 days ago

    I just love your work! Greetings from Latin America!

    • @DataLuke
      @DataLuke  19 days ago

      Thank you very much!
      Wow, Latin America, that's awesome - whereabouts?

    • @juandavidpenaranda6136
      @juandavidpenaranda6136 16 days ago

      @cloudconsultant Bogotá, Colombia. Love your vids

    • @DataLuke
      @DataLuke  1 day ago

      Wow that’s so cool. What’s the best thing about Colombia? Thank you Juan 🙏

  • @sangeethagopinath1585
    @sangeethagopinath1585 5 days ago

    Hi Luke, if possible could you please share a method to extract email attachments to Azure Blob Storage without using Power Automate & Logic Apps?

  • @ReactPotato
    @ReactPotato 1 month ago +1

    Thank you so much. Seeing you cover the SQL Server parts cleared things up for me - I was confused and didn't know what to do.

  • @rafaelcarvalho315
    @rafaelcarvalho315 1 month ago +1

    Awesome content! Thanks for sharing

    • @DataLuke
      @DataLuke  1 month ago

      Thank you, and you're welcome 🫡

  • @sangeethagopinath1585
    @sangeethagopinath1585 19 days ago +1

    Really great work, thank you sooooo much

    • @DataLuke
      @DataLuke  19 days ago

      Thank you. You are very welcome! :)

  • @Gius3pp3K
    @Gius3pp3K 2 days ago

    Thanks for this video. So the resource group is like a project, with all the apps used within that specific project?
    Secondly, is this a realistic project that you could carry out as a Data Engineer? I am looking to start a career as an Azure Data Engineer, having worked as a SQL Server Developer for the last 13 years.

  • @albertocardenas5164
    @albertocardenas5164 27 days ago

    Hi, the web app where I'm working has the following Power BI architecture: SQL Server on a VM hosted in Azure; several SQL jobs then transform the data, generating new Power BI tables for each tenant; and finally the Power BI reports are created using the datasets fed by those tables. My question is whether this approach can be optimised using some Azure tools. Thanks

    • @DataLuke
      @DataLuke  19 days ago

      Hi!
      You can optimise your current Power BI architecture by replacing SQL Server on a VM with Azure SQL Database or Managed Instance for reduced management overhead.
      Use Azure Data Factory to replace SQL jobs, simplifying ETL and enhancing scalability.
      Consolidate tenant data into a multi-tenant database or elastic pools for efficiency, and leverage Power BI Dataflows and DirectQuery to minimise duplication and streamline dataset management.
      For advanced analytics, consider Azure Synapse Analytics to integrate data transformation, warehousing, and reporting.
      These changes reduce operational complexity, improve scalability, and better align with Azure’s cloud-native tools.

  • @francischristophur6057
    @francischristophur6057 1 month ago +3

    I'm not able to connect SQL Server and ADF. When you changed Encrypt to optional you were able to connect, but it never worked for me.

    • @ajinkyabele1824
      @ajinkyabele1824 1 month ago +1

      Same for me. Is there any solution for this?

    • @ajinkyabele1824
      @ajinkyabele1824 1 month ago

      @cloudconsultant

    • @jonnichols_
      @jonnichols_ 1 month ago

      Problems with linking:
      Open SSMS.
      Select Security.
      Then, navigate to Logins and search for your user.
      Right-click on the user and select Properties.
      Click on the Server Roles tab.
      Choose Public and Sysadmin roles.
      Click OK to apply the changes.
      This process should resolve the linking issues.
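      If you'd rather script this than click through SSMS, the same change can be made in T-SQL, here driven from Python with pyodbc. A rough sketch only: the connection string and [your_login] are placeholders, and granting sysadmin is a blunt fix best kept to local dev machines.
      ```
      import pyodbc

      # Sketch: add an existing SQL login to the sysadmin server role,
      # mirroring the SSMS steps above. [your_login] is a placeholder.
      conn = pyodbc.connect(
          "DRIVER={ODBC Driver 18 for SQL Server};SERVER=localhost;"
          "Trusted_Connection=yes;Encrypt=no;",
          autocommit=True,  # autocommit so the role change applies immediately
      )
      conn.execute("ALTER SERVER ROLE [sysadmin] ADD MEMBER [your_login];")
      conn.close()
      ```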

    • @DataLuke
      @DataLuke  1 month ago

      Thanks Jon

    • @francischristophur6057
      @francischristophur6057 1 month ago

      @@DataLuke ??? Can you help me on this?

  • @Sajid-i3z
    @Sajid-i3z 8 days ago

    Hi @Luke, the source is on-premises SQL; what is the destination in this project, ADLS Gen2 or Power BI?

    • @DataLuke
      @DataLuke  7 days ago +1

      Hey Sajid, the destination for the data is ADLS Gen2, since that's where the data actually resides. Power BI then simply reads the data from there for display.

  • @mr.ktalkstech
    @mr.ktalkstech 3 months ago +11

    Would've been nice to get some credit for using my video as a reference. Please do it in the future. Good luck with yours, though :)

    • @DataLuke
      @DataLuke  3 months ago +5

      Hi, at 00:20 I give you credit. Not having your vid in description was simply an oversight - in there now. Likewise :)

    • @mr.ktalkstech
      @mr.ktalkstech 3 months ago +1

      @@DataLuke Thank you :)

    • @MexiqueInc
      @MexiqueInc 3 months ago +7

      I originally developed this project with your video. I came here because all the code is available for review if you get stuck, which makes a huge difference. He also does a great job addressing SQL Server connection issues, creating the resources, and other things you fast-forwarded; those took lots of hours to figure out. Wishing success to both.

    • @mr.ktalkstech
      @mr.ktalkstech 3 months ago

      @@MexiqueInc That's great to know, happy his video is adding more value :)

    • @DataLuke
      @DataLuke  3 months ago

      @@MexiqueInc Thank you 🙏

  • @jonathans3653
    @jonathans3653 28 days ago

    Thanks for the vid! I have a question: is it normal practice to use ADF, Databricks, and Synapse for a single project? Wouldn't it be simpler and cheaper to do everything using Databricks, for example?

    • @DataLuke
      @DataLuke  24 days ago

      Hey Jonathan, you're welcome!
      It's common to use ADF, Databricks, and Synapse together in complex production environments due to their complementary strengths and the need for specialised capabilities.
      ADF: Excels in data integration and orchestration, making it ideal for managing complex data pipelines.
      Databricks: Provides a powerful platform for big data processing, machine learning, and data science workloads.
      Synapse: Offers a comprehensive data warehousing solution with SQL-based analytics and advanced data management features.
      While Databricks can handle many tasks independently, using all three tools can often be more cost-effective in the long run. This is because they can be optimised for specific workloads, reducing overall resource consumption. Additionally, using specialised tools can improve performance, scalability, and maintainability.

  • @Harshavardhanreddy-i4h
    @Harshavardhanreddy-i4h 2 months ago

    Hi Luke.
    I recently went through your end-to-end data engineering project. You explained it very well. Loved the content; it was very easy to understand the whole process.
    I just have a small query: can we build the whole pipeline in Synapse rather than ADF, as Synapse is built on top of data lakes?

    • @DataLuke
      @DataLuke  2 months ago

      Hey Harsha, thank you very much! :)
      Yes, exactly. I believe you could, as Synapse integrates both data lake and data pipeline functionalities, making it an all-in-one platform for both data storage and orchestration.

  • @abdulkaderomer
    @abdulkaderomer 2 months ago +1

    Hi Luke,
    Thanks for this. Really well explained.
    In terms of the business request: if the company wants to understand the gap in customer demographics, what are the additional reasons for running this particular solution and utilising Azure Cloud, as opposed to running automated pipelines on-prem?
    Thanks.

    • @DataLuke
      @DataLuke  2 months ago +1

      Thank you 🙏
      With Azure, you get flexibility to scale up or down as needed, reach customers globally with reliable compliance, and use powerful analytics and AI tools for real-time insights. Plus, Azure's built-in security, backups, and DevOps tools make it way easier and faster to manage without the massive costs and hassle of on-prem setups. It’s like getting enterprise-level power without the usual infrastructure headaches!

  • @faisalrahman6373
    @faisalrahman6373 2 months ago

    I can't load data from the data lake into Databricks. What did you do to give Databricks access to the data lake?

    • @DataLuke
      @DataLuke  2 months ago

      Hey Faisal, try the below and let me know how you get on:
      1. Create a Service Principal in Azure Active Directory (AAD):
      - Set up a Service Principal for Databricks to authenticate and access the Azure Data Lake Storage (ADLS).
      2. Assign Permissions to the Service Principal:
      - Grant the Service Principal access to your ADLS account by assigning it the "Storage Blob Data Contributor" role through the "Access Control (IAM)" section in the Azure portal.
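      As a rough sketch of what those two steps look like from a Databricks notebook once the Service Principal exists - the storage account, application ID, directory (tenant) ID, and secret scope/key names below are all placeholders:
      ```
      # Databricks notebook sketch: authenticate to ADLS Gen2 with a service principal (OAuth).
      service_credential = dbutils.secrets.get(scope="<scope>", key="<service-credential-key>")

      spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "OAuth")
      spark.conf.set("fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net",
                     "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
      spark.conf.set("fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net",
                     "<application-id>")
      spark.conf.set("fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net",
                     service_credential)
      spark.conf.set("fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net",
                     "https://login.microsoftonline.com/<directory-id>/oauth2/token")

      # Quick check: list the container you are trying to read.
      display(dbutils.fs.ls("abfss://bronze@<storage-account>.dfs.core.windows.net/"))
      ```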

  • @peterbizik224
    @peterbizik224 22 days ago

    Nice video, but it really looks like quite a mess in the Azure cloud (my opinion only) - anything but simple; still, the guide is quite helpful. How much does each step cost? I didn't go through the whole video in detail.

  • @ryany420
    @ryany420 29 days ago

    Can I ask why we need Azure Synapse Analytics? Can we just connect Power BI directly to Databricks?

    • @DataLuke
      @DataLuke  24 days ago

      If your use case is relatively simple (e.g., clean data in Databricks, directly used in Power BI for visualization), you can skip Synapse. But for large-scale data solutions requiring enterprise-level capabilities like data warehousing, governance, and optimal query performance, Synapse becomes a critical component.
      So we use Synapse just to get you familiar with the tool :)

  • @narjesatia
    @narjesatia 5 days ago

    Hello Luke, thanks a lot for this great work! But I have an error I couldn't fix when running the pipeline in Synapse (within the stored procedure) - the ForEach1 activity fails with:
    {
    "errorCode": "ActionFailed",
    "message": "Activity failed because an inner activity failed; Inner activity name: Stored procedure1, Error: Incorrect syntax near 'VIEWCustomer'.",
    "failureType": "UserError",
    "target": "ForEach1",
    "details": ""
    }
    Could you help me please?
    Thanks again

    • @kainatelahi1322
      @kainatelahi1322 5 days ago

      Hi, from the error message, it looks like the issue is happening in the stored procedure that's being called inside the ForEach activity. Specifically, it's complaining about incorrect syntax near the 'VIEWCustomer' part.
      Here are a few ideas that might help track down the culprit:
      Double check the stored procedure code inside that ForEach, especially around anything mentioning 'VIEWCustomer'. See if there are any missing commas, parentheses, or semicolons that could be throwing things off.
      Make sure 'VIEWCustomer' is actually defined correctly in your database. Is it a view? If so, double check the view definition for typos.
      See if you can run that stored procedure directly in SQL Server Management Studio. If it runs okay there, the problem is probably in how the ForEach is calling it. If it fails there too, then you can focus your debugging efforts on the stored proc itself.
      Check that the parameters being passed from the ForEach to the stored procedure match what it's expecting in terms of number, order, and data types.
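      One more idea, since the message fuses 'VIEW' and 'Customer' into a single token: if the stored procedure builds its CREATE VIEW statement by string concatenation, a single missing space produces exactly this error. A tiny Python illustration of the failure mode (the table and schema names here are hypothetical); the same applies to the + / CONCAT inside the T-SQL proc itself:
      ```
      # The bug: concatenating the keyword and the view name without a space.
      table = "Customer"  # one of the tables ForEach1 iterates over

      bad = "CREATE OR ALTER VIEW" + table + " AS SELECT * FROM gold." + table
      good = "CREATE OR ALTER VIEW " + table + " AS SELECT * FROM gold." + table
      # note the trailing space after VIEW in the second string

      print(bad)   # CREATE OR ALTER VIEWCustomer ... -> "Incorrect syntax near 'VIEWCustomer'"
      print(good)  # CREATE OR ALTER VIEW Customer ...
      ```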

    • @narjesatia
      @narjesatia 1 day ago

      @@kainatelahi1322 Hello, thanks for your time and your reply, but that's not the case, since the error isn't only for a single table - I just put in a single example, near 'VIEWCustomer'. I've tried to drop the procedure and recreate it, and changed the code to drop the proc if it exists and create a new one, but it's still not working :( Any help please! @DataLuke

  • @rahhulkumardas
    @rahhulkumardas 2 months ago

    Hi Luke, at 53:52 my debug step is failing with this error:
    ErrorCode=Unknown,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=An unknown error occurred.,Source=Microsoft.DataTransfer.Common,''Type=JNI.JavaExceptionCheckException,Message=Exception of type 'JNI.JavaExceptionCheckException' was thrown.,Source=Microsoft.DataTransfer.Richfile.HiveOrcBridge,'
    I searched the internet but I'm unable to understand what the issue is. Can you please share what it might be?
    The data is being read from on-prem SQL but not being written to the data lake.
    It works when I select CSV as the data format in the Sink section.

    • @DataLuke
      @DataLuke  2 months ago

      Hmmmm...
      The error you're encountering, JNI.JavaExceptionCheckException, often suggests an issue related to Java dependencies or the Java Runtime Environment (JRE) within your integration setup in Azure Data Factory (ADF) or Databricks, particularly when dealing with formats like ORC or Parquet for data transformations. This exception can stem from:
      Missing or Incorrect Java Version
      Integration Runtime Configuration
      ORC vs. CSV Compatibility
      Parquet and ORC Format Settings
      Try running the same operation using Databricks or another tool that provides more specific error logging to narrow down the cause. If the error persists, providing these logs might offer deeper insights into the Java exception.
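      If you want to quickly rule the JRE in or out on the machine running the self-hosted integration runtime (it needs Java to write ORC/Parquet), here's a small Python check - sketch only:
      ```
      import shutil
      import subprocess

      # Is a Java runtime visible on this machine's PATH?
      java = shutil.which("java")
      if java:
          subprocess.run([java, "-version"])  # prints the installed Java version to stderr
      else:
          print("java not found on PATH - install a JRE, then retry the copy activity")
      ```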

    • @yrisheet
      @yrisheet 2 months ago +1

      I got the same error, and I didn't have Java installed on my machine. I went ahead and installed it, and now the error is gone. Please check if the JRE is installed on your machine.

    • @2011djdanny
      @2011djdanny 18 days ago

      @@yrisheet Installing the JRE will do the trick.

  • @11_manalichaudhari20
    @11_manalichaudhari20 1 month ago

    Hey Luke, I am facing an error while doing the integration runtime setup. I clicked on the express setup option and it showed me the error 'Invalid length for a Base-64 char array or string'. Can you please help me out?

    • @DataLuke
      @DataLuke  1 month ago

      The error "Invalid length for a Base-64 char array or string" during the integration runtime setup (likely in Azure Data Factory or Synapse) usually occurs due to incorrect handling of Base64 encoding/decoding. Here's how you can troubleshoot and resolve this issue:
      1. Check the Key or Token
      - If you're using a key or token (e.g., for authentication), ensure that it's properly formatted and not corrupted during copying or pasting.
      - Base64 format requirements:
      - Length must be a multiple of 4.
      - Only valid Base64 characters are allowed: A-Z, a-z, 0-9, +, /.
      - It may also end with = or == for padding.
      Solution: Re-copy the key/token from the source and ensure no extra spaces or invalid characters are included (see the sketch at the end of this list).
      2. Check the Installation Steps
      - During express setup, the integration runtime auto-generates configurations like linked service credentials. If these configurations are malformed, it could result in a Base64-related error.
      - Ensure you're following these steps correctly:
      1. Download the Integration Runtime Installer.
      2. During installation, choose Express Setup.
      3. Paste the authentication key when prompted.
      3. Recreate the Authentication Key
      - If the key/token provided during setup seems corrupted, try recreating it:
      - Go to Integration Runtimes in the Azure portal.
      - Select your self-hosted runtime and regenerate the key.
      - Use the newly generated key during setup.
      4. Verify System Requirements
      - Ensure that the system hosting the Integration Runtime meets all prerequisites:
      - .NET Framework 4.7.2 or later.
      - Supported OS versions (e.g., Windows Server or recent versions of Windows).
      - Outdated systems or missing dependencies might cause encoding/decoding errors.
      5. Reinstall Integration Runtime
      - Uninstall the existing Integration Runtime from your machine.
      - Re-download the latest installer from the Azure portal.
      - Follow the setup process again.
      6. Check Network and Proxy Settings
      - If you're behind a corporate network or proxy, ensure the Integration Runtime has access to the required Azure endpoints. Misconfigured network settings can sometimes lead to incomplete responses or malformed Base64 strings.
      7. Inspect Logs for More Details
      - Check the installation or runtime logs for more specific error messages. Logs can be found in:
      - Default installation directory: C:\Program Files\Microsoft Integration Runtime\Logs.
      8. Try Manual Setup
      - If the Express Setup continues to fail, opt for the Manual Setup:
      - Download and install the runtime.
      - Use the key generated from the Azure portal during the setup process.
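      As a quick sanity check for step 1, you can verify that a pasted string is even decodable Base64 before digging deeper. A minimal Python sketch - the key below is a placeholder, and splitting on the last '@' assumes the usual IR@...@... key layout:
      ```
      import base64

      def is_valid_base64(segment: str) -> bool:
          # Valid Base64 has a length that's a multiple of 4 and only legal characters.
          segment = segment.strip()
          if len(segment) % 4 != 0:
              return False
          try:
              base64.b64decode(segment, validate=True)
              return True
          except ValueError:  # binascii.Error is a subclass of ValueError
              return False

      key = "<paste-your-authentication-key-here>"
      print(is_valid_base64(key.rsplit("@", 1)[-1]))
      ```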
      Let me know if these steps help or if you need further clarification!

  • @sfaaa7196
    @sfaaa7196 2 months ago

    I am struggling with the Databricks quota. It keeps saying that I can't create any clusters using the free Azure trial. Any idea how to fix that? I tried a couple of ways, but it says the Azure trial doesn't allow upgrading the subscription, etc.

    • @DataLuke
      @DataLuke  2 months ago

      Hey, when you make a new compute cluster, try using the single-node option at the top of the page, not multi-node.

  • @yashwantsingh3641
    @yashwantsingh3641 1 month ago

    great work

  • @LadyM827
    @LadyM827 3 months ago

    Hi Luke, at 57:13 I can't see the SQL script. Can you please write it down here in the comments?

    • @DataLuke
      @DataLuke  3 months ago +1

      Hi! Yes, of course:
      SELECT
      s.name AS SchemaName,
      t.Name AS TableName
      FROM sys.tables t
      INNER JOIN sys.schemas s
      ON t.schema_id = s.schema_id
      WHERE s.name = 'SalesLT'

  • @AurelieAKA
    @AurelieAKA 3 months ago

    Hi Luke,
    Can I load my data directly into Azure Synapse without going through Databricks, considering that my data is already structured from SQL Server?

    • @DataLuke
      @DataLuke  2 months ago

      Yes, absolutely - since your data is already structured. Depending on your SQL Server version, you might use Azure Synapse Link for seamless integration.

  • @ashoklinga3072
    @ashoklinga3072 19 days ago

    Hi Luke, I'm seeing an error while creating the pipeline and setting up the data source; my test connection keeps failing.

    • @DataLuke
      @DataLuke  19 days ago

      Hey, have you made sure you've got the relevant IAM permissions set?

    • @ashoklinga3072
      @ashoklinga3072 19 days ago

      @ Yes, I did. I've tried multiple times but I'm still facing the issue.

  • @ajinkyabele1824
    @ajinkyabele1824 1 month ago

    Where can I get the resources? A resource link like a GitHub repo where all the scripts are present would help me out.

    • @DataLuke
      @DataLuke  1 month ago +1

      Hi! The resources are the first link in the description. Now, as for your connection issue:
      Regarding the issue you’re experiencing with connecting SQL Server to ADF, it’s helpful to review a few key points related to the tutorial you’ve referenced.
      SQL Server Authentication Mode:
      Ensure that the SQL Server instance is configured to allow SQL Server Authentication. In SQL Server Management Studio (SSMS):
      Right-click the server name > Properties > Security.
      Select SQL Server and Windows Authentication mode.
      Restart the SQL Server service via SQL Server Configuration Manager.
      Integration Runtime Setup:
      Since this is an on-premises SQL Server, you need the Self-Hosted Integration Runtime (SHIR):
      Verify SHIR is running and correctly configured.
      Open Integration Runtime Configuration Manager and ensure the status is Online.
      Key Vault Access:
      Double-check that the Azure Key Vault secrets for your SQL username and password are accessible.
      Confirm the Managed Identity (or other service principal) used by ADF has the Key Vault Secret User role assigned in the Access Control (IAM) settings for the Key Vault.
      Firewall Rules:
      If your SQL Server is local, verify that:
      The local firewall allows inbound connections on port 1433.
      The IP of the machine hosting SHIR is allowed in the SQL Server firewall rules.
      Encryption Settings:
      If encryption issues persist:
      In ADF Linked Service, set Encrypt connection to Optional.
      If required, configure a proper SSL certificate on the SQL Server instance.
      Permissions:
      Ensure the user in SQL Server has the necessary permissions:
      ```
      GRANT CONNECT TO [username];
      GRANT SELECT ON SCHEMA::[schema_name] TO [username];
      ```
      Java Runtime for SHIR:
      If you encounter JRE Not Found errors, install Java Runtime Environment (JRE) and configure it with SHIR:
      Install via Chocolatey on Windows (e.g. `choco install jre8`) or Homebrew on macOS (e.g. `brew install openjdk`).
      Testing Connectivity
      Before retrying in ADF:
      Test the SQL connection using SSMS with the same credentials.
      If you encounter any error logs in ADF, review them for specific error codes or messages to narrow down the issue.
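      If SSMS isn't handy, here's a minimal way to test the same credentials from Python with pyodbc (the server, database, and login are placeholders; Encrypt=no mirrors setting Encrypt connection to Optional in the linked service):
      ```
      import pyodbc

      # Sketch: confirm the SQL login works before blaming ADF or the SHIR.
      conn_str = (
          "DRIVER={ODBC Driver 18 for SQL Server};"
          "SERVER=localhost,1433;"   # same host/port the SHIR will use
          "DATABASE=<your-database>;"
          "UID=<sql-user>;PWD=<password>;"
          "Encrypt=no;"              # equivalent of Encrypt connection = Optional
      )
      conn = pyodbc.connect(conn_str, timeout=10)
      print(conn.execute("SELECT @@SERVERNAME, SUSER_SNAME();").fetchone())
      conn.close()
      ```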
      Additional Resources
      If there’s a step in the tutorial that remains unclear, let me know the specific point, and I’ll clarify it further. For any debug logs from ADF or SQL, feel free to share them for a more targeted solution!

    • @ajinkyabele1824
      @ajinkyabele1824 1 month ago

      @@DataLuke Thanks for helping. I only have one doubt: I want your GitHub link. Can you please paste it in the chat? I'm not getting it via the resource link.

  • @subrahmanyeswarraopuppala3721
    @subrahmanyeswarraopuppala3721 13 days ago

    Can we also use the Power BI that is available in Azure services?

    • @DataLuke
      @DataLuke  13 days ago

      Hey Subrahmanyeswar, we can't make new reports given our data source, unfortunately. If you want to practice Power BI on a non-Windows machine (i.e. on a Mac like me), follow along with my Microsoft Fabric End-to-End Data Engineering tutorial, as Fabric allows you to use Power BI in the browser.

    • @subrahmanyeswarraopuppala3721
      @subrahmanyeswarraopuppala3721 8 days ago

      @@DataLuke Ok, got it. And are we able to connect ADF with Databricks Community Edition?

    • @DataLuke
      @DataLuke  7 days ago

      ​@@subrahmanyeswarraopuppala3721 Yes, you can technically connect ADF with Databricks Community Edition, but it requires using REST APIs rather than the native Databricks Linked Service, as that's designed for Azure-hosted Databricks. Keep in mind that Community Edition is limited in features and isn't ideal for production use.
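      For reference, a bare-bones sketch of that REST route (the workspace URL, token, and job ID are placeholders - and whether Community Edition will issue you a personal access token at all is worth checking first, so treat this as illustrative only):
      ```
      import requests

      # Trigger an existing Databricks job via the Jobs API (run-now),
      # e.g. from an ADF Web activity or any HTTP client.
      host = "https://<databricks-instance>"
      token = "<personal-access-token>"

      resp = requests.post(
          f"{host}/api/2.1/jobs/run-now",
          headers={"Authorization": f"Bearer {token}"},
          json={"job_id": 123},  # hypothetical job ID
          timeout=30,
      )
      resp.raise_for_status()
      print(resp.json())  # contains run_id, which you can poll for status
      ```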

    • @subrahmanyeswarraopuppala3721
      @subrahmanyeswarraopuppala3721 6 days ago

      Ok, got it. Can you please explain Spark architecture in a video?

  • @portiseremacunix
    @portiseremacunix 1 month ago

    subbed!

  • @hpmouse3385
    @hpmouse3385 1 month ago

    Where can I get the resources??

    • @DataLuke
      @DataLuke  1 month ago +1

      First link in the description mate

  • @subrahmanyeswarraopuppala3721
    @subrahmanyeswarraopuppala3721 23 days ago

    Can we get those ADB files for practice?

  • @_PranitRansingh
    @_PranitRansingh 1 month ago

    Where can I find the GitHub link?
    Thanks!

    • @DataLuke
      @DataLuke  1 month ago

      Hey Pranit, first link in the description

    • @_PranitRansingh
      @_PranitRansingh 1 month ago

      @ Thanks Luke! I have also sent you a LinkedIn connection request. Great content 🙌🏻

    • @DataLuke
      @DataLuke  1 month ago

      @@_PranitRansingh Thank you :)

  • @MohammedAzhar-g2q
    @MohammedAzhar-g2q 2 months ago

    Write or paste the scripts in the description.
    Thanks

    • @DataLuke
      @DataLuke  2 months ago

      Hey Mohammed, the scripts are on GitHub.

  • @appliedaiLive
    @appliedaiLive 6 days ago

    time to practice Databricks

  • @amanahmed6057
    @amanahmed6057 3 months ago

    Hi bro, can this project be done with an Azure free account?

    • @DataLuke
      @DataLuke  3 months ago +1

      Sure can mate, I did it all on a free account.

  • @sanilkumarbarik9151
    @sanilkumarbarik9151 14 days ago

    There should be timestamps.

    • @DataLuke
      @DataLuke  14 days ago +1

      There are now timestamps 🙏

  • @amanahmed6057
    @amanahmed6057 3 months ago +3

    Hi bro, if you want more engagement, change the title to "Azure End-To-End Data Engineering Project for Beginners (Free Account) | Real Time Tutorial".
    That way you're telling people, without them even opening the video, that this project is done on a free account - and make the thumbnail like this as well.
    Thanks (^_-)

    • @DataLuke
      @DataLuke  3 months ago +1

      Let's give it a try ;)

    • @DataLuke
      @DataLuke  2 months ago +1

      After 4 days, the new title and thumbnail with "FREE" in them got 90% of the watch time vs. 10% without. Thanks bro 🙏

  • @khanleader
    @khanleader 2 months ago

    😍🦸🦸🥰 Captivating and first-class. Thanks so much!

    • @DataLuke
      @DataLuke  2 months ago

      Thank you v much! Glad you enjoyed 😁