Simple but effective process, and great explanation as well. Great job!
Glad you liked it!
Simply super, bro. Awaiting more videos from you on DB-related activities in Databricks
Hi Avinash, thank you. Sure, will post more videos on DB-related activities
Simply Awesome !! We are learning a lot from your videos.
Thanks Omprakash
Great video! Just one question, I saw you defined the jdbcDriver...but I didn't see it used after in jdbcUrl? What is it for?
Yes, I have also observed it. Later, in other videos, I found out that we have to add it as an option when reading data into the data frame, like .option("driver", jdbcDriver)
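For anyone following along: a minimal sketch of where the driver option goes (the hostname, database, and credentials below are placeholders, not values from the video):

```python
# Placeholder connection details -- substitute your own server/database.
jdbcHostname = "myserver.database.windows.net"
jdbcPort = 1433
jdbcDatabase = "mydb"
jdbcDriver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"

jdbcUrl = f"jdbc:sqlserver://{jdbcHostname}:{jdbcPort};database={jdbcDatabase}"

# Without the driver option, Spark can raise
# "java.sql.SQLException: No suitable driver".
# In a Databricks notebook:
# df = (spark.read.format("jdbc")
#       .option("url", jdbcUrl)
#       .option("driver", jdbcDriver)      # <-- this is where jdbcDriver is used
#       .option("dbtable", "SalesLT.Product")
#       .option("user", "sqladmin")
#       .option("password", "<password>")
#       .load())
```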
Awesome, Very nice explanation...
Thank you
Can we specify the type of authentication while connecting? What if we only have an Azure Active Directory password? How do we specify that?
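For reference, the Microsoft SQL Server JDBC driver accepts an `authentication` property in the connection string. A sketch (server, database, and user below are placeholders; the cluster also needs the MSAL/ADAL libraries the driver requires for AAD auth):

```python
# Sketch: Azure AD password authentication via the JDBC URL.
# Hostname, database, and user are placeholder values.
jdbcUrl = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=mydb;"
    "authentication=ActiveDirectoryPassword"
)

# In a Databricks notebook:
# df = (spark.read.format("jdbc")
#       .option("url", jdbcUrl)
#       .option("dbtable", "SalesLT.Product")
#       .option("user", "user@mytenant.onmicrosoft.com")  # AAD user, not a SQL login
#       .option("password", "<aad-password>")
#       .load())
```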
What about just being able to access those same tables via an existing databricks catalog in the hive metastore structure for example? Is there a way to do that?
I want to run a query against SQL and load the result into a variable. Can we do that? For example, select max(id) from a SQL table. I am using this id for comparison in the next steps
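This can be done with the JDBC `query` option and then pulling the first row out of the DataFrame. A sketch (the table name, column, and credentials are placeholders):

```python
def get_max_id(spark, jdbc_url, table, user, password):
    """Sketch: run an aggregate query over JDBC and return a scalar."""
    df = (spark.read.format("jdbc")
          .option("url", jdbc_url)
          .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
          .option("query", f"SELECT MAX(id) AS max_id FROM {table}")
          .option("user", user)
          .option("password", password)
          .load())
    return df.first()["max_id"]

# In a Databricks notebook:
# max_id = get_max_id(spark, jdbcUrl, "dbo.my_table", "sqladmin", "<password>")
# then use max_id in the comparison for the next step
```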
Nice explanation bro 👍
You didn't use the jdbcDriver. What is the purpose of having jdbcDriver?
Why do you say I didn't use the JDBC driver?
Look at 7:30 in the video
@rajasdataengineering7585 I meant that you didn't pass the jdbcDriver value into the JDBC URL
Like in Unix, can I save all these details in one file and call it at the beginning of the scripts?
Yes, we can use a YAML or JSON configuration file to save the details; at run time, Spark can read the configuration file and process accordingly
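A minimal sketch of the JSON approach (the file path and key names are made up for illustration):

```python
import json

# Hypothetical config file, e.g. /dbfs/config/jdbc.json:
# {
#   "hostname": "myserver.database.windows.net",
#   "port": 1433,
#   "database": "mydb",
#   "user": "sqladmin"
# }
def load_jdbc_config(path):
    """Read connection details from a JSON file and build the JDBC URL."""
    with open(path) as f:
        cfg = json.load(f)
    url = f"jdbc:sqlserver://{cfg['hostname']}:{cfg['port']};database={cfg['database']}"
    return url, cfg["user"]

# jdbcUrl, jdbcUser = load_jdbc_config("/dbfs/config/jdbc.json")
```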
Hi Raja, I am following this tutorial step by step but I got an error while running the 2nd cell (getting the Product table). The error is "java.sql.SQLException: No suitable driver". Can you please help in this case?
Now I got it; something was wrong in preparing the connection. I can connect and get the data from Azure SQL Server. Thanks Raja.
Glad to hear you fixed the issue 👍🏻
How do we write this Product table data to blob storage in Parquet format from a Databricks notebook? Please help
We can use the Databricks DataFrame writer: df.write.format("parquet").save(location)
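A slightly fuller sketch; the storage path is a placeholder and assumes the cluster already has credentials configured for the storage account:

```python
def write_parquet(df, path):
    """Sketch: write a DataFrame to Azure storage as Parquet."""
    (df.write
       .format("parquet")
       .mode("overwrite")   # replace any existing data at the path
       .save(path))

# In a Databricks notebook, with a placeholder container/account:
# write_parquet(df, "abfss://container@account.dfs.core.windows.net/product")
```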
Thanks worked perfectly.
Great
I really appreciate this video thank you🙏
Thanks Prathap! Glad you find it useful
Can I join for a paid project? Or how can I contact you?
Hello Sir, while importing data, how do we know which data is modified and which is the latest? I mean, how do we handle any updated data? Please reply
What if the data is huge, like 100 GB? Is it still recommended?
JDBC connections have performance issues when handling huge amounts of data, but there are options to improve performance that can be applied depending on the use case
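The usual tuning options are Spark's JDBC partitioning and fetch-size settings. A sketch (the partition column, bounds, and partition count are placeholder values; pick a numeric, evenly distributed column from your own table):

```python
def read_partitioned(spark, jdbc_url, table, user, password):
    """Sketch: parallel JDBC read using Spark's partitioning options."""
    return (spark.read.format("jdbc")
            .option("url", jdbc_url)
            .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
            .option("dbtable", table)
            .option("user", user)
            .option("password", password)
            .option("partitionColumn", "id")  # reads are split on this column
            .option("lowerBound", "1")
            .option("upperBound", "1000000")
            .option("numPartitions", "8")     # 8 concurrent JDBC connections
            .option("fetchsize", "10000")     # rows fetched per round trip
            .load())
```

Larger `fetchsize` values reduce round trips; `numPartitions` should stay within what the database can handle concurrently.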
Sir, can we also create, view, alter, and run stored procedures from Databricks?
Hi Arnab, stored procedures can't be created in Databricks. Views can be created and altered as well
Could you help me establish a connection string using Azure Active Directory authentication mode?
Super nice video
Thank you
How to get that IP address? For me it was not visible
How to get multiple tables from Azure SQL into a Databricks notebook?
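One common approach is a simple loop over table names. A sketch (the table names and credentials are placeholders):

```python
def read_tables(spark, jdbc_url, table_names, user, password):
    """Sketch: read several tables into a dict of DataFrames, keyed by name."""
    dfs = {}
    for t in table_names:
        dfs[t] = (spark.read.format("jdbc")
                  .option("url", jdbc_url)
                  .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
                  .option("dbtable", t)
                  .option("user", user)
                  .option("password", password)
                  .load())
    return dfs

# tables = ["SalesLT.Product", "SalesLT.Customer", "SalesLT.SalesOrderHeader"]
# dfs = read_tables(spark, jdbcUrl, tables, "sqladmin", "<password>")
# dfs["SalesLT.Product"].display()
```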
Simple and effective
Sir, how can we connect using secrets from Key Vault?
We need to create a secret scope in Databricks first to set up the integration between Key Vault and Databricks
@rajasdataengineering7585 Thank you, will check and do 💪
Hard-coding the password in the code is not recommended. Can we get the password from Azure Key Vault? Can you please let us know the steps for that?
We need to integrate Azure Key Vault with Databricks by creating a secret scope
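Once a Key Vault-backed secret scope exists, the password can be fetched at run time with `dbutils.secrets.get`. A sketch (the scope and key names below are placeholders):

```python
def get_password(dbutils, scope="my-keyvault-scope", key="sql-password"):
    """Sketch: fetch the SQL password from a Databricks secret scope.

    In a Databricks notebook, dbutils is available as a built-in;
    scope/key names here are placeholder values.
    """
    return dbutils.secrets.get(scope=scope, key=key)

# Usage in a notebook:
# jdbcPassword = get_password(dbutils)
# df = (spark.read.format("jdbc")
#       .option("url", jdbcUrl)
#       .option("dbtable", "SalesLT.Product")
#       .option("user", "sqladmin")
#       .option("password", jdbcPassword)
#       .load())
```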
How to get that IP address? I did not find it while logging in. Can you please say?
You can get it from the command prompt using the ipconfig command
I want to connect to my localhost but the connection is getting refused. Can you please make a video on it?
great video
Glad you enjoyed it
Sir, is there any way to hide the password instead of exposing it in the code?
Yes Arnab, we can use Azure Key Vault
Also we can use Databricks secret scope
Yes, we can use a Databricks secret scope
Thank you so much ❤Sir....
Most welcome! Hope you find it useful
@rajasdataengineering7585 Yes 💯
@rajasdataengineering7585 Sir, getting "No suitable driver" on running the df
Simple and superb
Can you make a video on how to create an account in Databricks Community Edition for free?