I have made a mistake and corrected it; let's see how many of you can spot it :). If you want to take a challenge, please implement this with a Cloud Function. 👍
Will the serverless connector work with Cloud Composer for running SQL scripts?
@15:20 you did not delete the last letter "l" of the project name
Perfect explanation, and all in just 17 minutes. I spent hours on the Google docs trying to figure out how to achieve this. Thank you man 👍
Such a better explanation than the video produced by Google. Subscribed!
Very good and helpful, Cloud Advocate!! The only thing is that the red letters on the black background were not very clear. Congrats!
Hi GK, thanks for sharing this video. Connecting to Cloud SQL from App Engine via private IP is the best approach. Let's assume Cloud SQL is already configured with only a private IP. Can you please suggest the best approach to connect from Cloud Shell in this case? It would also be great if you could share a sample Java project for the same scenario. Thanks in advance.
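Since Cloud Shell sits outside your VPC, one common pattern (a sketch, not something tested against this exact setup) is to tunnel through a small bastion VM that lives in the same VPC as the Cloud SQL private IP. The VM name, zone, network, and the `10.10.0.3` address below are placeholders for your own values:

```shell
# Create a tiny jump VM in the VPC that can reach the Cloud SQL private IP
# (names, zone, and network below are placeholders for your own values).
gcloud compute instances create bastion-vm \
    --zone=asia-south1-a \
    --machine-type=e2-micro \
    --network=my-vpc --subnet=my-subnet

# From Cloud Shell, open an SSH tunnel via IAP that forwards local port 3306
# to the Cloud SQL instance's private IP (replace 10.10.0.3 with yours).
gcloud compute ssh bastion-vm --zone=asia-south1-a --tunnel-through-iap \
    -- -N -L 3306:10.10.0.3:3306

# In a second Cloud Shell tab, connect through the tunnel as if it were local.
mysql --host=127.0.0.1 --port=3306 -u root -p
```

A Java app run from Cloud Shell could then point its JDBC URL at `127.0.0.1:3306` while the tunnel is open.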
BTW, I think the error was that you added a 1 at the end of the project name, right?
Hello GK, can you help with connecting an App Engine flexible environment to Cloud SQL via private IP?
Hello sir, is there any way to initiate a Dataproc workflow template (Spark job workflow) using Cloud Functions or Workflows (beta)?
Hi GK, I have a doubt, please clarify if possible:
I have a Cloud Run service attached to a serverless VPC connector. My Cloud SQL instance enforces SSL certificates, so to bypass that I pass /cloudsql/${connection_name} as the DB_HOST variable to my application instead of the Cloud SQL private IP. However, my Cloud SQL instance has both a private and a public IP address for various reasons. In this setup, does Cloud Run use the private IP or the public IP to connect to Cloud SQL?
Thanks in advance
Great question 🙋 Most likely it will use the private IP, but I personally haven't tested this.
@@CloudAdvocate Thanks, do we have any reference docs related to this from GCP? I tried but couldn't find any such information on their official pages 🙂
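For what it's worth, here is a small sketch of how the `/cloudsql/${connection_name}` value is typically turned into a Unix-socket DSN in application code. The helper name and the Postgres URL shape are illustrative, not from the video; which IP the managed socket path uses underneath is exactly the open question in the thread above:

```python
def build_pg_dsn(db_user: str, db_pass: str, db_name: str, connection_name: str) -> str:
    """Build a Postgres DSN that connects over the Cloud SQL Unix socket.

    When a Cloud SQL instance is attached to a Cloud Run service, a Unix
    socket is mounted at /cloudsql/<connection_name>. Connecting through the
    socket avoids host-based SSL checks because no TCP host is involved.
    """
    socket_dir = f"/cloudsql/{connection_name}"
    # Postgres accepts a directory as "host" and looks for the socket file inside it.
    return f"postgresql://{db_user}:{db_pass}@/{db_name}?host={socket_dir}"


print(build_pg_dsn("appuser", "secret", "orders", "my-proj:asia-south1:orders-db"))
# → postgresql://appuser:secret@/orders?host=/cloudsql/my-proj:asia-south1:orders-db
```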
How about from Cloud Functions?
How can I do this with Elasticsearch in a Cloud Function? Any help?
Hi there, very helpful video. But I'm stuck at one point. After enabling the API, when I create the connector with exactly the same details as you explained, it gets created with an error: 'Connector is in a bad state, manual deletion recommended.' The region I'm using is asia-south1. I've tried multiple times since yesterday with no luck! What could be the possible issue?
Hi Ekta, it looks like you are in fact using one of the supported regions: cloud.google.com/vpc/docs/configure-serverless-vpc-access#supported_regions. Could you please create a support ticket with Google Cloud?
Don't you lose the security advantage of having a private IP address if you also assign a public IP?
Yes! I must have enabled it to connect from Cloud Shell, but you are right.
thank you!!
While creating the VPC connector, I am getting the error: "Operation failed: Insufficient CPU quota in region." What should I do?
Please try after some time, or create a support ticket.
@@CloudAdvocate I have been getting this error for 3-4 days, so I will create a support ticket now. Thanks for the reply 👍.
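In case it helps others hitting the same quota error: you can inspect the region's quota usage from Cloud Shell before retrying or filing a ticket (a sketch; the exact quota metric that governs connectors may vary by project):

```shell
# Dump quota metrics, limits, and current usage for the connector's region.
gcloud compute regions describe asia-south1 --format="yaml(quotas)"

# If a limit is the blocker, request an increase in the console:
# IAM & Admin -> Quotas, filtered to region asia-south1.
```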
I think the if-else condition is swapped in main-pip.py.
Good explanation :)
Thanks
This method is outdated; you have to set the environment variables before deploying the Cloud Run service.
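In case it helps, setting the env vars and the connector up front at deploy time might look like this (the service name, image, connector, and variable values are placeholders, not the exact commands from the video):

```shell
# Deploy (or redeploy) the Cloud Run service with env vars and the
# serverless VPC connector set in one step; all names are placeholders.
gcloud run deploy my-service \
    --image=gcr.io/my-proj/my-app:latest \
    --region=asia-south1 \
    --vpc-connector=my-connector \
    --set-env-vars=DB_HOST=10.10.0.3,DB_NAME=orders,DB_USER=appuser
```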
Wearing an AWS t-shirt and teaching GCP..
😜