Kamil Data Geek - Azure explained
  • Videos: 39
  • Views: 161,252
Get started with Data Warehouses in Microsoft Fabric | Lab 06
Data warehouses are analytical stores built on a relational schema to support SQL queries. Microsoft Fabric enables you to create a relational data warehouse in your workspace and integrate it easily with other elements of your end-to-end analytics solution.
The video is based on the Microsoft lab that supports the "Get started with data warehouses in Microsoft Fabric" module in Microsoft Learn (see links below).
In this video, you'll learn how to:
00:00 Intro
02:34 Create a data warehouse
03:42 Create tables and insert data
05:06 Preview data
07:19 Define a data model
10:57 Query data warehouse tables
18:30 Create a view
21:38 Create a visual query
26:05 Visualize your data
30:14 Summary
*** Useful links: *...
Views: 130

Videos

Rob Sewell: What is his recipe for having a good speech? | Ask SQL Family 012
Views: 16 • 16 hours ago
Rob is a SQL Server DBA with a passion for PowerShell, Azure, Automation & SQL. He is an MVP & an officer for the PASS PowerShell VG & has spoken at, organised & volunteered at many events. He is a member of the committee that organises SQL Saturday Exeter, PSDayUK & also the European PowerShell Conference. He is a proud supporter of the SQL & PowerShell communities. He relishes sharing & learn...
Ingest Data with Dataflows Gen2 in Microsoft Fabric | Lab 05
Views: 153 • 14 days ago
Data ingestion is crucial in analytics. Microsoft Fabric's Data Factory offers Dataflows for visually creating multi-step data ingestion and transformation using Power Query Online. The video is based on Microsoft Lab which supports "Ingest Data with Dataflows Gen2 in Microsoft Fabric" module in Microsoft Learn (see links below). In this video, you'll learn how to: 00:00 Intro 01:00 Create a ne...
Grant Fritchey: Why "Scary DBA" and what's his favourite Redgate tool? | Ask SQL Family 011
Views: 22 • 21 days ago
Grant Fritchey is a Microsoft SQL Server MVP with over 20 years of experience in IT, including time spent in support and development. He has worked with SQL Server since 6.0, back in 1995. He has developed in Visual Basic, Visual Basic .NET, C# and Java. He is the author of the books SQL Server Execution Plans (Simple-Talk), SQL Server Query Performance Tuning (Apress) and co-author “Expert Per...
Organize a Fabric Lakehouse using Medallion Architecture Design
Views: 450 • 28 days ago
Let’s explore the potential of the medallion architecture design in Microsoft Fabric. Organize and transform your data across Bronze, Silver, and Gold layers of a Lakehouse for optimized analytics. The video is based on Microsoft Lab which supports "Organize a Fabric Lakehouse using medallion architecture design" module in Microsoft Learn (see links below). In this video, you'll learn how to: 0...
Dejan Sarka - is Python in SQL Server good idea? | Ask SQL Family 010
Views: 40 • a month ago
Dejan Sarka, MCT and SQL Server MVP is an independent trainer and consultant that focuses on the development of database & business intelligence applications. Besides projects, he spends about half of his time on training and mentoring. He is the founder of the Slovenian SQL Server and .NET Users Group. Dejan Sarka is the main author or co-author of sixteen books about databases and SQL Server....
Use Data Factory pipelines in Microsoft Fabric | Lab 04
Views: 163 • a month ago
Microsoft Fabric includes Data Factory capabilities, including the ability to create pipelines that orchestrate data ingestion and transformation tasks. The video is based on Microsoft Lab which supports "Use Data Factory pipelines in Microsoft Fabric" module in Microsoft Learn (see links below). In this video, you'll learn how to: 00:00 Intro 01:27 Create a new lakehouse 02:47 Create a new pip...
Pedro Lopes - should DBAs worry about their job? | Ask SQL Family 009
Views: 40 • a month ago
Pedro Lopes was a Senior Program Manager in the Database Systems Group, based in Redmond, WA. He has over 20 years of industry experience. He was also responsible for Program Management of database engine features for in-market versions of Microsoft SQL Server, with a special focus on the Relational Engine. He worked closely with several Tier 1 SQL Server customers, as well as partners in the f...
Work with Delta Lake tables in Microsoft Fabric | Lab 03
Views: 222 • a month ago
Tables in a Microsoft Fabric Lakehouse are based on the Delta Lake storage format commonly used in Apache Spark. By using the enhanced capabilities of delta tables, you can create advanced analytics solutions. The video is based on Microsoft Lab which supports "Work with Delta Lake tables in Microsoft Fabric" module in Microsoft Learn (see links below). In this video, you'll learn how to: 00:00...
Michael Rys - father of U-SQL | Ask SQL Family 008
Views: 24 • 2 months ago
Michael has been doing data processing and query languages since the 1980s. Among other things, he has represented Microsoft on the XQuery and SQL design committees and has taken SQL Server beyond relational with XML, Geospatial, and Semantic Search. This talk took place during PASS Summit in Seattle, WA, on 2nd November 2017 (Thursday). Useful links: Transcript: azureplayer.net/2018/0...
Use Apache Spark in Microsoft Fabric | Lab 02
Views: 337 • 2 months ago
Apache Spark is a core technology for large-scale data analytics. Microsoft Fabric provides support for Spark clusters, enabling you to analyze and process data in a Lakehouse at scale. The video is based on Microsoft Lab which supports "Use Apache Spark in Microsoft Fabric" module in Microsoft Learn (see links below). In this video, you'll learn how to: 00:00 Intro 01:21 New Workspace 02:00 Ne...
MVP Summit 2024 - my video recap
Views: 110 • 2 months ago
👀 Do you find yourself in the video? 👉 Leave a comment with a timestamp and the city you come from. 🎵 Music: freesound.org/people/GregorQuendel/ Socials: - 📓 GitHub: github.com/NowinskiK/ - 📢 Twitter: NowinskiK - 📢 Twitter: Azure_Player - 👥 Facebook: TheAzurePlayer/ - 🏙️ LinkedIn: www.linkedin.com/in/kamilnowinski/ - 🖼️ Instagram: kamilnowgeek...
Cathrine Wilhelmsen - Women in IT | Ask SQL Family 007
Views: 23 • 3 months ago
Cathrine loves teaching and sharing knowledge. Currently, she builds end-to-end data platforms using technologies such as Microsoft Fabric, Azure Synapse Analytics, Azure Data Factory, Azure Data Lake, Azure SQL, and Power BI. Outside of work, she’s active in the SQL Server and PASS communities as a Microsoft Data Platform MVP, BimlHero Certified Expert, author, speaker, blogger, organizer, and ...
Get started with Lakehouses in Microsoft Fabric | Lab 01
Views: 365 • 3 months ago
Lakehouses merge data lake storage flexibility with data warehouse analytics. Microsoft Fabric offers a lakehouse solution for comprehensive analytics on a single SaaS platform. The video is based on Microsoft Lab which supports "Get started with lakehouses in Microsoft Fabric" module in Microsoft Learn (see links below). In this video, you'll learn how to: 00:00 Intro 01:58 Create workspace 02...
Bob Ward - SQL Server during the last 30 years | Ask SQL Family 006
Views: 69 • 3 months ago
We have a real star of Microsoft here today: Bob Ward, Principal Architect, who recently celebrated his 30-year anniversary working at Microsoft. Bob spent this time working on our favourite product, which is SQL Server... And that's one of the reasons why this product is so great, mature, and has high performance. Let's dive into the conversation we recorded a few years ago and se...
Why tech certifications are still important in 2024?
Views: 433 • 3 months ago
Why tech certifications are still important in 2024?
Marcin Szeliga - Being Microsoft MVP for almost 20 years | Ask SQL Family 005
Views: 29 • 3 months ago
Marcin Szeliga - Being Microsoft MVP for almost 20 years | Ask SQL Family 005
Mark Broadbent | Ask SQL Family 004 | The times when the exam cost $20k
Views: 29 • 4 months ago
Mark Broadbent | Ask SQL Family 004 | The times when the exam cost $20k
Stephanie Locke Shares Her Data Science Journey and Insights | Ask SQL Family 003
Views: 68 • 5 months ago
Stephanie Locke Shares Her Data Science Journey and Insights | Ask SQL Family 003
Install & Import PowerShell module into a session
Views: 609 • 5 months ago
Install & Import PowerShell module into a session
Greg Low | Ask SQL Family 002 | Where do 70% of SQL Server design problems come from?
Views: 57 • 5 months ago
Greg Low | Ask SQL Family 002 | Where do 70% of SQL Server design problems come from?
Prepare and pass Databricks Data Engineer Associate Exam
Views: 2.3K • 6 months ago
Prepare and pass Databricks Data Engineer Associate Exam
Brent Ozar | Ask SQL Family 001 | How did I become Microsoft Certified Master?
Views: 118 • 6 months ago
Brent Ozar | Ask SQL Family 001 | How did I become Microsoft Certified Master?
Sending an attachment via REST API using ADF/Synapse pipeline
Views: 1K • 9 months ago
Sending an attachment via REST API using ADF/Synapse pipeline
Using PowerShell in Visual Studio Code
Views: 33K • 2 years ago
Using PowerShell in Visual Studio Code
Database deployment with Flyway in 5 minutes
Views: 4.2K • 2 years ago
Database deployment with Flyway in 5 minutes
@LEGO 76139: 1989 Batmobile - Speed Build in less than 3 minutes
Views: 254 • 3 years ago
@LEGO 76139: 1989 Batmobile - Speed Build in less than 3 minutes
How to install Self-hosted Windows agent for Azure DevOps
Views: 14K • 3 years ago
How to install Self-hosted Windows agent for Azure DevOps
Azure Data Factory with Databricks setup connection
Views: 11K • 3 years ago
Azure Data Factory with Databricks setup connection
Azure Data Factory | Publish from code with one task in DevOps Pipeline (2/2)
Views: 2.9K • 3 years ago
Azure Data Factory | Publish from code with one task in DevOps Pipeline (2/2)

Comments

  • @reach2puneeths
    @reach2puneeths 2 days ago

    Hi Kamil, great video. I have a few doubts: can we define user-defined scalar functions and use the cursor command in Fabric? If not, what would be the workaround? Please let me know. Thank you

    • @KamilNowinski
      @KamilNowinski 2 days ago

      Thanks. No, these features are not supported in this version of SQL (it's not full SQL Server or Azure SQL Database). While I can understand the need for UDFs, I'm not sure why you need cursors. In the last 20 years I've used cursors only a few times (in SQL Server), and it was always in OLTP databases.
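Kamil's set-based point can be sketched in plain Python (made-up data; this is an illustration, not code from the video): the cursor pattern touches one row at a time, while the set-based form expresses the whole transformation at once, analogous to a single UPDATE statement.

```python
rows = [{"id": 1, "price": 10.0}, {"id": 2, "price": 20.0}]

# Cursor-style: fetch and transform one row at a time,
# which is what a T-SQL cursor loop does.
updated_cursor = []
for row in rows:
    new_row = dict(row)
    new_row["price"] = new_row["price"] * 1.1
    updated_cursor.append(new_row)

# Set-based: one declarative operation over the whole set,
# analogous to a single UPDATE statement.
updated_set = [{**r, "price": r["price"] * 1.1} for r in rows]

print(updated_set == updated_cursor)  # True
```

The engine can optimize the set-based form as a whole, which is why row-by-row cursors are usually the slower choice in SQL.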

  • @thefootballking10
    @thefootballking10 5 days ago

    Thanks for posting this video. I got stuck on the notebook part too. Microsoft Learn needs to update the instructions on that part.

    • @KamilNowinski
      @KamilNowinski 4 days ago

      Thanks for watching! True, a Lakehouse not being assigned to a notebook can be confusing the first time.

  • @user-dy8xu7uj8k
    @user-dy8xu7uj8k 7 days ago

    I have a SQL Server stored procedure which updates, deletes, and merges data into a table. How do I convert the stored procedure to a PySpark job? Is it possible to update a table in Fabric using PySpark? Please make a video on this topic.

    • @KamilNowinski
      @KamilNowinski a day ago

      It's possible to do MERGE with PySpark as well, but it requires rewriting the code and might be more difficult when you have more complex logic in there. In this case, you may want to consider a Data Warehouse in Fabric - watch my latest video: ruclips.net/video/Z_HoGqR3pxw/видео.html
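For readers unfamiliar with MERGE semantics, here is a hedged sketch in plain Python (not PySpark; the data and the `op` flag are made up) of what a MERGE does: update matched rows, insert unmatched ones, and delete rows the source marks for removal. On a Delta table in PySpark this maps roughly to `DeltaTable.merge` with `whenMatchedUpdate`, `whenNotMatchedInsert`, and `whenMatchedDelete` clauses.

```python
def merge(target, source, key="id"):
    """Emulate MERGE semantics: update matched rows, insert new ones,
    and delete rows the source flags with op == 'delete'."""
    result = {row[key]: dict(row) for row in target}
    for row in source:
        if row.get("op") == "delete":
            result.pop(row[key], None)  # WHEN MATCHED ... THEN DELETE
        else:
            # WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT
            result[row[key]] = {k: v for k, v in row.items() if k != "op"}
    return sorted(result.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 10}, {"id": 2, "qty": 5}]
source = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}, {"id": 1, "op": "delete"}]
print(merge(target, source))
# [{'id': 2, 'qty': 7}, {'id': 3, 'qty': 1}]
```

The hard part Kamil alludes to is not the merge itself but translating the surrounding procedural T-SQL logic into this declarative shape.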

  • @user-dy8xu7uj8k
    @user-dy8xu7uj8k 7 days ago

    Great video, thank you for making it.

  • @Power888w
    @Power888w 15 days ago

    Is it possible to restore a Dataflow Gen2 that was deleted by the user, perhaps via some version control?

    • @KamilNowinski
      @KamilNowinski 11 days ago

      No, it's not possible at the moment. A dataflow is permanently deleted and can't be restored from the UI. A potential option could be version control, but dataflows are not yet supported within Git repository integration, so even that option is not possible currently, though it will be in the future.

  • @janhallholm9043
    @janhallholm9043 16 days ago

    😂😂😂Microsoft sucks get hell out of from open source suckers

  • @matheusduarte8022
    @matheusduarte8022 22 days ago

    Great video! Thanks. You have great knowledge of coding.

    • @KamilNowinski
      @KamilNowinski 22 days ago

      Thanks! I like to think coding is my second language, right after emoji 😉

  • @bigfarmerUK
    @bigfarmerUK 26 days ago

    Thanks for Sharing - Great Reason(s): #1 - Keep skills up to Date #2 - Help Learning #3 - Demonstrate your commitment #4 - Build a case for promotions #5 - Help Job Seekers #6 - Gain new Networking Opportunities #7 - Validate a Higher Salary #8 - Gain Problem solving skills #9 - Build Self Confidence #10 - Show Professional Credibility 👍

  • @BarneyLawrence
    @BarneyLawrence a month ago

    Thanks Kamil, just what I needed!

  • @moeeljawad5361
    @moeeljawad5361 a month ago

    Very interesting Kamil, please keep it going. Waiting for more videos.

  • @anjumanrahman1468
    @anjumanrahman1468 a month ago

    👍

  • @ReptilianLaserbeam
    @ReptilianLaserbeam a month ago

    Thanks, it was easier than I thought. I installed PowerShell 7 and some things were not working, so I had to install the cmdlets again - no big deal. Way faster than using the ISE.

    • @KamilNowinski
      @KamilNowinski a month ago

      You're welcome. I'm glad you overcame the issue.

  • @ybj139
    @ybj139 a month ago

    Thanks, Kamil, very interesting.

  • @alessandrobeninati1583
    @alessandrobeninati1583 a month ago

    yea

    • @KamilNowinski
      @KamilNowinski a month ago

      Yea, thanks for dropping a comment!

  • @codecolon
    @codecolon 2 months ago

    How do we manage logins and user permissions when deploying to multiple environments? For example: if we have users User1 & User2, User1 should not be present on Dev and User2 should not exist on UAT.

    • @KamilNowinski
      @KamilNowinski 2 months ago

      Check these two examples from my commercial course (the code is free): github.com/NowinskiK/ssdt-training/blob/master/src/Variables/ContosoRetailDW/Deployment/Environment-specific.sql github.com/NowinskiK/ssdt-training/blob/master/src/PrePostDeployment/ContosoRetailDW/Scripts/CreateUsers.sql If you're interested in the whole course: learn.azureplayer.net/ssdt-essentials
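The pattern behind those examples is driving environment-specific objects with a SQLCMD variable. A hedged sketch (the variable name, server, and file names are placeholders, not Kamil's actual setup) of passing such a variable at publish time via sqlpackage's `/v:` switch:

```python
# Sketch: pass an environment name into the deployment as a SQLCMD variable,
# so a pre/post-deployment script can create only that environment's users.
# sqlpackage's /v: switch sets SQLCMD variables; all names below are placeholders.
def publish_args(environment):
    return [
        "sqlpackage",
        "/Action:Publish",
        "/SourceFile:ContosoRetailDW.dacpac",
        "/TargetConnectionString:Server=myserver;Database=ContosoRetailDW;",
        f"/v:Environment={environment}",  # referenced as $(Environment) in the script
    ]

print(publish_args("Dev")[-1])  # /v:Environment=Dev
```

Inside the post-deployment script, an `IF '$(Environment)' = 'Dev'` branch then creates only the users that belong to that environment.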

  •  2 months ago

    Super :)

  • @datavibe
    @datavibe 2 months ago

    This is a cool video Kamil. Thanks for compiling all our MVP Summit memories in a nutshell 😊

    • @KamilNowinski
      @KamilNowinski 2 months ago

      Thanks! I'm glad you like it!

  • @KamilNowinski
    @KamilNowinski 3 months ago

    🥇 Certifications not only validate your knowledge and skills but also signify your commitment to professional growth. Employers, seeking competent and qualified individuals, often prioritize candidates who possess relevant certifications. 🤔 How much do you agree or disagree? Share your thoughts and experiences.

  • @TresorBaseva-lb3vf
    @TresorBaseva-lb3vf 3 months ago

    Hi dear Kamil, trust you are doing great. I'm Tresor, writing from Kinshasa, DR Congo. I love your videos and really want to learn more about Databricks. I do not have much experience with Databricks, but I would like you to help me or work with you so that I can prepare for my exam... Please, I am waiting for feedback, sir.

    • @KamilNowinski
      @KamilNowinski 3 months ago

      Thanks for your honest feedback. While I'm not doing any 1-to-1 sessions, I'd suggest starting with the many videos about Databricks available on RUclips and taking these free courses: www.databricks.com/resources/learn/training/lakehouse-fundamentals

  • @videomolasy
    @videomolasy 3 months ago

    Interesting and convincing video! Thank you for that.

    • @KamilNowinski
      @KamilNowinski 3 months ago

      Thank you, Mateusz. I appreciate your feedback, especially considering your many years of experience as a university lecturer.

  • @gingergrant8550
    @gingergrant8550 3 months ago

    I always learn something when I take a cert exam. Sometimes you just need the motivation to learn and certs provide the push. Nice video.

    • @KamilNowinski
      @KamilNowinski 3 months ago

      Thank you, Ginger. Couldn't agree more. Exams gently force us to learn and fill our knowledge gaps. I find it a good way to learn. No one says it's easy.

  • @AlexSmith-fz1ll
    @AlexSmith-fz1ll 4 months ago

    Great video Kamil - really useful for final preparations for the exam. Thanks

  • @verrigo
    @verrigo 5 months ago

    Hello Kamil. What permissions did you grant to the local agent user? Thanks

  • @notoriousft
    @notoriousft 5 months ago

    Thanks for the video. Do you still need to import the module after V3?

    • @KamilNowinski
      @KamilNowinski 5 months ago

      Yes, unless a module is preloaded.

  • @sparshforu
    @sparshforu 5 months ago

    Hi @kamil, I'm getting the error "A project which specifies SQL Server 2016 as the target platform cannot be published to Microsoft Azure SQL Database v12." I tried the solution you provided but am still getting the same error. Can you please suggest something?

    • @KamilNowinski
      @KamilNowinski 5 months ago

      You must change the target platform: ruclips.net/video/4N_fv6d3KQY/видео.html It should help. If not, create a question on GitHub with more details: github.com/NowinskiK/ssdt-training/discussions

  • @AI_superminds
    @AI_superminds 5 months ago

    Very Insightful video. Can you please let me know what trainings I would never do to become a Databricks Data Engineer? I visited the Databricks website and got confused. I am beginner in Databricks. Thank you!

    • @KamilNowinski
      @KamilNowinski 5 months ago

      I guess you meant you WOULD LIKE to be a Databricks Data Engineer...? If so, go to Databricks Academy - check this page out: docs.databricks.com/en/getting-started/free-training.html

  • @chweetyananya
    @chweetyananya 6 months ago

    How does it work for transferring a large number of XML files from a file share to a REST endpoint using ADF with certificate authentication?

    • @KamilNowinski
      @KamilNowinski 6 months ago

      I did not test it from a performance perspective. Let us know once you've done that. Thx!

  • @plusvision100
    @plusvision100 7 months ago

    How can we implement a CI/CD pipeline for a multi-tenant DB?

    • @KamilNowinski
      @KamilNowinski 6 months ago

      Yes. Do you mean multi-databases (one per tenant) or different Azure Directory tenants?

    • @plusvision100
      @plusvision100 6 months ago

      @@KamilNowinski Multi-databases (one per tenant). Can you please do a tutorial on this?

    • @KamilNowinski
      @KamilNowinski 6 months ago

      The concept is exactly the same, but you need to repeat it against n databases: 1) build (generates a DACPAC), 2) generate a migration script (against the target database), 3) execute the migration script, 4) repeat for the next database. Did I miss something?
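The four steps above can be sketched as a small driver that prepares the per-tenant commands: build once, then script and publish per database. A hedged illustration - the sqlpackage `Script`/`Publish` actions and switches are its documented ones, but the server, database, and file names here are placeholders:

```python
# Sketch of the per-tenant loop: one DACPAC build, then a migration
# script plus a publish for each tenant database. Connection details
# are placeholders, not a real environment.
def deployment_commands(dacpac, server, databases):
    cmds = []
    for db in databases:
        conn = f"Server={server};Database={db};"
        # 2) generate the migration script against this target
        cmds.append(["sqlpackage", "/Action:Script",
                     f"/SourceFile:{dacpac}",
                     f"/TargetConnectionString:{conn}",
                     f"/OutputPath:migrate_{db}.sql"])
        # 3) execute the migration by publishing
        cmds.append(["sqlpackage", "/Action:Publish",
                     f"/SourceFile:{dacpac}",
                     f"/TargetConnectionString:{conn}"])
    return cmds

cmds = deployment_commands("Contoso.dacpac", "myserver", ["tenant1", "tenant2"])
print(len(cmds))  # 4 - a script and a publish command per tenant
```

In a real pipeline each command would be run via subprocess or a dedicated pipeline task, with the Script step reviewed or archived before Publish.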

  • @deepakpatil5059
    @deepakpatil5059 7 months ago

    Hi @Kamil, I have set up the CD pipeline successfully. I deleted an existing trigger in Dev and published successfully. The new artifact doesn't have that deleted trigger - I mean, the parameter.json file doesn't have it - but when I create a release deployment to UA, this trigger is not getting deleted. Does the deletion not move through the release? Do we need to delete it manually? Thanks, Deepak

    • @KamilNowinski
      @KamilNowinski 6 months ago

      First of all, your DACPAC file should not contain the deleted trigger; then, during the deployment (publish action), the script should generate a DROP TRIGGER when you set the appropriate publish options.
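The publish option Kamil most likely means is `DropObjectsNotInSource`, a documented sqlpackage publish property; a hedged sketch of setting it on a publish invocation (all paths and connection details are placeholders):

```python
# Building a sqlpackage Publish invocation that is allowed to drop objects
# (such as a deleted trigger) present in the target but absent from the DACPAC.
# /p:DropObjectsNotInSource is a documented publish property; everything else
# below is a placeholder.
args = [
    "sqlpackage",
    "/Action:Publish",
    "/SourceFile:MyDatabase.dacpac",
    "/TargetConnectionString:Server=myserver;Database=UA;",
    "/p:DropObjectsNotInSource=True",
]
print(" ".join(args))
```

Use it with care: with this property enabled the publish will also drop any other target-only objects, so it is commonly paired with properties that exclude users and permissions from dropping.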

  • @subhraz
    @subhraz 8 months ago

    Thank you. The demos were explained very well and in a simple manner. It was useful and informative.

  • @Presinov
    @Presinov 9 months ago

    Awesome. Would also be good if you could share a few use cases where a client may find this useful.

    • @KamilNowinski
      @KamilNowinski 9 months ago

      Good idea, thanks for the feedback, Conrad. It may sometimes be difficult to do while not disclosing the name of the customer and its specific use case. But I know what you mean.

  • @santypanda4903
    @santypanda4903 9 months ago

    Great video!! Can we have the same demo showcasing how to upload different file formats, like PDF and Excel? That would be great! Also, what is the alternative solution to add the contents of the file instead of using the variables?

    • @KamilNowinski
      @KamilNowinski 3 months ago

      The alternative solution is mentioned in the video. Please watch it again carefully.

  • @marklobbezoo1464
    @marklobbezoo1464 10 months ago

    @Kamil, thanks for the video - really helpful. It works fine, but when I delete a table in DEV it doesn't get deleted in prod. In VS SSDT I created a publish.xml which states: <DropObjectsNotInSource>True</DropObjectsNotInSource> However, it still doesn't drop the tables in SQL. I tried to find this .xml file in the publish profile in my release pipeline but I can't select it... Anybody have a clue?

    • @KamilNowinski
      @KamilNowinski 9 months ago

      Try adding your problem here: github.com/NowinskiK/ssdt-training/discussions and provide more details about your configuration and what you are doing.

  • @bigtp25pierce
    @bigtp25pierce 11 months ago

    What if the file you need to change is multiple levels down within a folder? I.e.: managedVirtualNetwork\default\managedPrivateEndpoint - how would you go about changing that in the environment CSV to use a different endpoint?

    • @KamilNowinski
      @KamilNowinski 10 months ago

      I hope you'll find the answer in the documentation: github.com/Azure-Player/azure.datafactory.tools/

  • @user-ne2sk8og5g
    @user-ne2sk8og5g 11 months ago

    Hi everyone, I created a new table in a Visual Studio project and pushed it to the repo. CI/CD succeeded, but the table is not deployed to the target database. Does anyone know the reason and can help me? Note: I am able to add new columns to an existing table and they get deployed to the target database.

    • @KamilNowinski
      @KamilNowinski 6 months ago

      Please check out my other video on this channel and follow it step by step: ruclips.net/video/4N_fv6d3KQY/видео.html I guess you have some small mistake in your DB project.

  • @shaxxie5237
    @shaxxie5237 a year ago

    Your video at 7:34 just completely jumps to another topic and misses the main bit of inserting the tasks into the release pipeline. 😢

    • @KamilNowinski
      @KamilNowinski 9 months ago

      Sorry about that. You're right - it might be a bit confusing. Please check my other video about #adftools which hopefully explains it better: ruclips.net/video/E0yXXegenTM/видео.html

  • @dipalinalawade1318
    @dipalinalawade1318 a year ago

    Thanks for the video. It has resolved all my queries. Explained in simple way and to the point. Very helpful!

  • @SandeepSingh-vo3hd
    @SandeepSingh-vo3hd a year ago

    Hi @kamil, thanks for the videos, they help us a lot, but I'm getting this error when trying to execute the release: SQL72014: .Net SqlClient Data Provider: Msg 50000, Level 16, State 127, Line 6 Rows were detected. The schema update is terminating because data loss might occur. SQL72045: Script execution error. The executed script: RAISERROR (N'Rows were detected. The schema update is terminating because data loss might occur.', 16, 127) WITH NOWAIT; A reply would be appreciated.

    • @KamilNowinski
      @KamilNowinski 10 months ago

      Your target table already contains some data, hence the error. Do check the publish options.

  • @kothahemanth4065
    @kothahemanth4065 a year ago

    If I create a new view, push the update to the repo, and the pipeline deploys that change to the target DB, and then I revert that new-view commit, will the release pipeline delete that view, or do I need to create a new script in SSDT for dropping it?

    • @KamilNowinski
      @KamilNowinski a year ago

      It depends on the publish options. Check this out (free): learn.azureplayer.net/ssdt-tips

  • @rajarajan7738
    @rajarajan7738 a year ago

    Hi @Kamil, I have a question… I have a bunch of SQL scripts in a repo folder, and I want to deploy only those script files which have changes… Thanks

    • @KamilNowinski
      @KamilNowinski a year ago

      ...changes... in comparison to what?

    • @LongLe-qz1ui
      @LongLe-qz1ui 9 months ago

      @@KamilNowinski Compared to the current version of those scripts existing in the database.
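One common way to implement "deploy only what changed compared to the database" - a hedged sketch, not something from this thread - is to record a checksum per deployed script and redeploy only scripts whose checksum differs, which is essentially how migration tools such as Flyway track applied scripts:

```python
import hashlib

def scripts_to_deploy(repo_scripts, deployed_hashes):
    """Return names of scripts whose content differs from what was last
    deployed, tracked via a SHA-256 checksum stored at deployment time.
    repo_scripts: {name: sql_text}; deployed_hashes: {name: hex_digest}."""
    changed = []
    for name, sql in sorted(repo_scripts.items()):
        digest = hashlib.sha256(sql.encode()).hexdigest()
        if deployed_hashes.get(name) != digest:
            changed.append(name)
    return changed

repo = {"001_init.sql": "CREATE TABLE t (id int);",
        "002_view.sql": "CREATE VIEW v AS SELECT id FROM t;"}
# Pretend only the first script was deployed previously.
deployed = {"001_init.sql": hashlib.sha256(b"CREATE TABLE t (id int);").hexdigest()}
print(scripts_to_deploy(repo, deployed))  # ['002_view.sql']
```

In practice the `deployed_hashes` table would live in the target database itself, so each environment tracks its own state.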

  • @clodola1
    @clodola1 a year ago

    Great explanation, thanks for taking the time to put in all the remedial steps - generally where I get frustrated. I got as far as generating the script in VS that I want to publish (59 min). The repo version has a view with a comment --TEST, and I have an existing DB in Azure I want to update with the change (alter the view with a comment --TEST). When I generate the script, no code is generated to make the update, but if I were to create a new DB, all object code is created. What am I doing wrong? I want to update existing DBs with new or altered objects.

    • @KamilNowinski
      @KamilNowinski a year ago

      A comment is not a real change to a database object, which is likely why you see no changes in the generated migration script.

  • @markcook8516
    @markcook8516 a year ago

    what is the key sequence to start PS?

  • @STKT_VLOGS
    @STKT_VLOGS a year ago

    I am getting this error 2023-02-27T09:39:37.6420094Z ##[error]*** Verification of the deployment plan failed. 2023-02-27T09:39:37.6483928Z ##[error]Error SQL72031: This deployment may encounter errors during execution because changes to [***] are blocked by [db_ddladmin]'s dependency in the target database.

    • @KamilNowinski
      @KamilNowinski a year ago

      Please raise the issue here: github.com/NowinskiK/ssdt-training/issues - I will take a look when I have a spare moment.

  • @STKT_VLOGS
    @STKT_VLOGS a year ago

    Hi @Kamil, how can we deploy with user permissions?

    • @KamilNowinski
      @KamilNowinski a year ago

      github.com/NowinskiK/ssdt-training/blob/master/src/PrePostDeployment/ContosoRetailDW/Scripts/CreateUsers.sql

    • @STKT_VLOGS
      @STKT_VLOGS a year ago

      @@KamilNowinski Thank you for the update. I will try it.

  • @hugo9618
    @hugo9618 a year ago

    I'm getting a PowerShell error. It says it cannot find type [System.IO.Compression.ZipFile] when the UseDotNet@2 task is executed.

    • @KamilNowinski
      @KamilNowinski a year ago

      Raise the issue here: github.com/microsoft/azure-pipelines-tasks/issues

  • @alexrisk3342
    @alexrisk3342 a year ago

    Hi Kamil, thanks for the video! I have a question: how is network permission to Azure SQL granted from the DevOps agent? And in the case of deployment to a SQL VM, how could I grant the network access? Thanks.

    • @KamilNowinski
      @KamilNowinski a year ago

      You need to configure the Azure SQL server appropriately in the Security/Networking tab. One of the easiest options is to check the option "Allow Azure services and resources to access this server", which you can find at the bottom of the page.

  • @janamyugender3432
    @janamyugender3432 a year ago

    Very good information and a great session, thanks a lot. I am looking for the same using Jenkins - could you please make one using Jenkins? It would be greatly helpful for me. Thanks in advance.

    • @KamilNowinski
      @KamilNowinski a year ago

      Janam, one of my friends, Gavin, wrote this article a few years ago, which might help: gavincampbell.dev/post/jenkins-windows-git-ssdt-profit/ However, if there is no ready-to-go task for publishing a DACPAC, you can use a PowerShell task to execute SqlPackage.exe to do this. I explained this in the following lesson of my course: learn.azureplayer.net/view/courses/database-projects-with-ssdt-dacpac/734805-module-6/2417631-publishing-with-sqlpackage-powershell The above lesson is free! HTH. Enjoy!

  • @yoshida0007
    @yoshida0007 a year ago

    Is there any way to link Databricks with the same permissions in Databricks? I allow my user A to use only folder A in Databricks - can I allow access only to folder A when user A uses Data Factory to create a pipeline with Databricks?

    • @KamilNowinski
      @KamilNowinski a year ago

      When you use a TOKEN as the authentication method, the connection will be established with the user's permissions. Hence, it all depends on the Databricks side and the settings there.

  • @andrewbrady5839
    @andrewbrady5839 a year ago

    Very helpful tutorial, thank you. I wonder if you could answer a question I have about self-hosted agents... If I want to deploy code to several servers in my environment, is it possible to have just one self-hosted agent that deploys the code to all my target servers, or do I need a self-hosted agent on each target server? The reason being, I need to limit external connections from my environment to a minimum. So if I could have an 'intermediate' server that connects to Azure DevOps with the self-hosted agent and deploys to other servers that don't need a direct connection to Azure DevOps, that would be my ideal solution. Many thanks

    • @KamilNowinski
      @KamilNowinski a year ago

      It is possible to have just one self-hosted agent in your environment that deploys the code to multiple target servers. You can configure the agent to run on a dedicated machine or a virtual machine and use it to deploy the code to the target servers using appropriate deployment tools and scripts. However, a scenario that may require separate agents per environment is when the target servers in each environment have different configurations or requirements. For example, if you have development, staging, and production environments, the servers in each may have different hardware or software configurations, security requirements, or network settings.