Rob Kerr AI
  • 26 videos
  • 24,452 views
Windows Fine Tuning Combined Streams
LLM Fine-Tuning is one of the go-to techniques for making LLMs perform better in specific scenarios. In this post I'll show you how to prepare a local Windows-based machine with an Nvidia GPU to serve as a development-scale fine-tuning workstation.
This video is based on my blog post you can find here:
robkerr.ai/fine-tuning-llms-using-a-local-gpu-on-windows/
0:00 Intro
2:33 Install WSL
6:01 Install g++
6:30 Install Miniconda
8:02 Install VS Code
9:35 Create Python Environment
10:15 Install Unsloth Components
14:15 Create Training Notebook
15:20 Fine-tune the model
16:48 Save the Adapter
17:18 Summary
80 views
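The notebook steps in the chapter list above (create a Python environment, install Unsloth, fine-tune, save the adapter) follow the usual Unsloth/LoRA pattern. This is a minimal sketch only, assuming the Unsloth, TRL, and Datasets APIs; the model name, dataset file, and hyperparameters are illustrative, not the video's exact code. The work is wrapped in a function because it needs an Nvidia GPU and the installed packages to actually run:

```python
def finetune_sketch():
    # Assumed packages from the video's setup steps; requires a CUDA GPU.
    from unsloth import FastLanguageModel
    from trl import SFTTrainer
    from transformers import TrainingArguments
    from datasets import load_dataset

    # Load a 4-bit quantized base model (model name is an example).
    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name="unsloth/mistral-7b-bnb-4bit",
        max_seq_length=2048,
        load_in_4bit=True,
    )

    # Attach LoRA adapters so only a small set of weights is trained.
    model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)

    # "train.jsonl" is a placeholder training file with a "text" column.
    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=load_dataset("json", data_files="train.jsonl",
                                   split="train"),
        dataset_text_field="text",
        max_seq_length=2048,
        args=TrainingArguments(output_dir="outputs", max_steps=60,
                               per_device_train_batch_size=2),
    )
    trainer.train()

    # Save only the small LoRA adapter, not the full base model.
    model.save_pretrained("lora_adapter")
```

See the linked blog post for the exact environment setup and training code used in the video.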

Videos

Mirroring Snowflake data to Microsoft Fabric using Change Data Capture (CDC)
248 views · 1 month ago
This video is a quick walk-through using Fabric mirroring to replicate data from a Snowflake Data Warehouse into a Fabric Data Warehouse. Mirroring with Change Data Capture provides near real-time replication of Snowflake data. 0:00 Introduction 1:16 Enable Mirroring 1:58 Create the Mirror 3:10 Select Tables 4:05 Mirror Monitoring 4:55 Review Data 6:03 Add new Data 6:11 Summary
Importing Snowflake data to Microsoft Fabric using a Pipeline
225 views · 1 month ago
This video is a quick walk-through using a Fabric pipeline to import data from a Snowflake Data Warehouse to a Lakehouse Delta Table. Snowflake data sources are supported out-of-the-box in Fabric, making this type of integration straightforward! 0:00 Introduction 0:25 Create a Pipeline 0:55 Create a Snowflake Connection 2:13 Select Snowflake Data 3:00 Choose Destination 3:51 Run Pipeline 4:24 R...
Unleash Your Model's Potential: Guide to Deploying a Fabric ML Model on Azure ML Inference Endpoint
181 views · 3 months ago
Microsoft Fabric's Data Science workload leverages Synapse ML for training, and models trained there are ideally suited to enriching data stored in a data lake. But models developed in Fabric can also be deployed in other environments like Azure Machine Learning to take advantage of real-time inference endpoint deployment and other techniques. This video walks through the steps to export a model from Fabric, import it ...
Deploying and Consuming Large Language Models from the Azure ML Model Catalog
130 views · 3 months ago
Explore Azure Machine Learning's selection of pre-trained models from leading sources like OpenAI, Microsoft, and Hugging Face. In this video I'll guide you through the process of choosing a model, deploying it on Azure's compute services, and accessing it through a REST API. This video is a demo of this blog post content: robkerr.ai/deploy-azure-model-catalog-endpoint 0:00 Create Resource...
How to create and install custom Python libraries in Microsoft Fabric
648 views · 3 months ago
You've probably used %pip install to include open-source libraries in your Fabric notebooks, but did you know you can also easily install your own custom Python libraries? This end-to-end walk-through shows you how to create and package your own code in custom libraries in a Fabric Lakehouse. Packaging your code in libraries encourages reusability and reduces the need to paste commonly used cod...
How to incorporate AI-Driven Sentiment Analysis models in a Fabric Data Lake Solution
132 views · 4 months ago
This video shows you how to use Azure AI sentiment analysis machine learning models within a Fabric Jupyter notebook to add user sentiment features to Data Lake tables, providing rich new quantitative data elements that help data analysts discover new insights in unstructured data. For links to source code and similar articles, visit the blog version of this post: robkerr.ai/fabric-ai-sentiment-...
Incorporating Azure OpenAI Services with Fabric to create insights from unstructured data
694 views · 4 months ago
In this video we'll use Azure OpenAI Services from a Fabric Jupyter notebook to automatically summarize large collections of semi-structured user review data. Topics covered include prompt engineering, secure secret storage in Azure Key Vault, and reading/writing data to Delta tables in a Fabric Lakehouse. For links to source code and similar articles, visit the blog version of this post: robke...
Using Azure AI Document Intelligence with Microsoft Fabric
936 views · 4 months ago
This video shows you how to build a Document Intelligence solution within Microsoft Fabric that incorporates Azure AI to extract text from scanned documents. Using a Spark Notebook and Synapse ML, we can easily ingest scanned document data into a Data Lake solution. For links to source code and similar articles, visit the blog version of this post: robkerr.ai/fabric-azure-ai-document-intelligen...
Incorporating Azure AI Translation into a Fabric Data Lake Solution
196 views · 4 months ago
Using Fabric's embedded Synapse ML features, we can leverage Azure AI services directly in Spark notebooks to enrich and transform data using Language AI Services. Learn how to use these powerful services using a Python Jupyter notebook stored in a Fabric workspace. For links to source code and similar articles, visit the blog version of this post: robkerr.ai/fabric-azure-ai-translation-notebook/
Calling Azure AI Document Intelligence using the REST API
3.7K views · 4 months ago
Azure AI Document Intelligence can read images and PDF scans of forms, extracting data for later use in data solutions. While various language SDKs are available, it's also possible to call these services directly using the REST API. This tutorial walks through the REST API process. For links to source code and similar articles, visit the blog version of this post: robkerr.ai/azure-ai-document-...
Committing a Microsoft Fabric Workspace to a Git Repository
557 views · 4 months ago
Microsoft Fabric supports connecting Fabric Workspaces with Git repositories. Watch this video to learn how to set up and use Git repos with Fabric. We'll cover configuration, syncing notebooks and other content, as well as how to use Git to recover unintended changes to notebooks. This video is a demonstration of a related post at robkerr.ai/fabric-git-repo-configuration 0:00 Introduction 0:5...
Create Gemini Chatbot
3.8K views · 5 months ago
Last week Google shipped the first version of its Gemini large language model and made APIs available to developers. In this post we'll build a fully functional Gemini Chatbot using Streamlit! This video is a walk through of the content in my blog post: robkerr.ai/how-to-create-a-google-gemini-chatbot/ 0:00 Introduction 1:05 Anaconda 1:21 Install Dependencies 2:43 Initialize Gemini SDK 3:05 Cre...
Creating a Custom Generative AI Microsoft Teams Copilot using Copilot Studio
6K views · 5 months ago
This video is an end-to-end walk-through using Microsoft Copilot Studio to create a custom Microsoft Teams copilot that includes a range of features, including: • Generative answers using a web site as grounding knowledge • Generative answers using PDF documentation as grounding knowledge • Custom topics with branching logic The walk-through starts from scratch and ends with a finished Copilo...
Display a Power BI Report in a Microsoft Fabric Jupyter Notebook
289 views · 5 months ago
We can use the Power BI Python client to display a Power BI report right in a Jupyter notebook in Fabric. This video shows how to do this voodoo magic! This is a video walk-through of the following blog post: https://robkerr.ai/display-power-bi-report-fabric-jupyter-notebook Quick Links: 0:00 Introduction 0:17 The Power BI Report 0:35 Create a Notebook 0:54 Import Python Dependencies 01:27 Authe...
Generative AI using Azure AI Integrated Search Vector Embeddings
282 views · 6 months ago
Generative AI in action: how I use Bing Image Creator and Photoshop Generative Fill together
198 views · 6 months ago
Fabric Semantic Link: Power BI as a Machine Learning Data Source
3.3K views · 6 months ago
Creating a Machine Learning Model with IBM Watson Studio and Jupyter
212 views · 6 months ago
Build a Generative AI Chatbot Using a Vector Database for Custom Data (RAG)
1.1K views · 6 months ago
Machine Learning with Fabric Data Science and Jupyter Notebooks
268 views · 6 months ago
Bring your own data to Azure OpenAI Service Large Language Models
429 views · 6 months ago
Azure ML Deploy Inference Endpoint
519 views · 7 months ago
Exporting Azure AI Vision Models to run on iOS Devices
45 views · 7 months ago
Using AI Vision to Detect Packaging Defects
112 views · 7 months ago
Enriching Images with Azure AI Vision for Power BI Analysis
72 views · 7 months ago

Comments

  • @SarahJohnson-ef6lc · 10 hours ago

    Keep it up, really nice explanations in short.

  • @fredlaxTalenz · 6 days ago

    As I am very new to the Microsoft universe, I was trying to do it the hard way. With your guide I saved several days of pain! Many, many thanks!

  • @tlubben972 · 8 days ago

    This really helped. I couldn't find a good video all weekend that showed me how to do this, but I would recommend not cutting out scenes, because people are watching the video and following along, and with literally a glance left and right you've already done another step that I didn't even notice. I had to watch it about 4 times, super slow, to catch all the steps. Also, I don't know Linux, so skipping the part of making the code/finetune folders had me thinking I'd missed something again, and that had me turned around for a while. Also, when people pause it, the red bar, play/pause buttons, subtitles, etc. cover up the bottom of the screen, so we can't even see if you do something quick at the bottom, because it's covered up when we pause. Thanks again.

    • @robkerrai · 8 days ago

      Thanks for the feedback! Sorry you got tripped up in some places; I'll work harder to make sure the flow is clean and doesn't get interrupted in the future.

  • @stephenwong6843 · 8 days ago

    Thanks Rob! I have 2 questions: 1. Can I create a ticketing system in Copilot Studio? 2. When I publish via Teams to a whole office of 60+ colleagues, they don't have to pay extra, right? Many thanks.

    • @robkerrai · 8 days ago

      Possibly! It depends on the complexity you need. I'd suggest prototyping the functionality you need to see how it can fit. Regarding cost, Copilot Studio is billed based on the number of messages that flow through the copilot app you create, so I don't think you would see per-user charges for Teams users. Take a look at: www.microsoft.com/en-us/microsoft-copilot/microsoft-copilot-studio#Pricing

  • @datatalkshassan · 12 days ago

    Hey Rob, how much time does it take to spin up the Spark cluster when running a new job? Or did you already have a cluster running, so that when you ran the job, it simply requested resources from that running Spark cluster?

    • @robkerrai · 12 days ago

      If you use the default pool (I almost always do) a cluster will start processing Fabric jobs in 4-5 seconds since Fabric has cluster nodes already running and waiting to be assigned.

    • @datatalkshassan · 12 days ago

      @robkerrai I never had a chance to work with Fabric, but by "default pool" do you mean that a cluster is always up and running in your environment and your job simply takes resources from there? Or do you mean that the default pool is serverless, it will only charge me for the time I use it, and it will also spin up within 4-5 seconds? That would be extremely helpful if true. We are working with HDInsight and need to have our nodes running all the time only because we can't afford a wait of 10-15 minutes for a node/cluster to spin up when a new user starts a notebook.

    • @robkerrai · 12 days ago

      Yes to both questions. Fabric is SaaS (not PaaS), so it works differently in that you pay for a "capacity" that you choose (your peak usage per second), which is a relatively fixed cost, rather than for clusters you manage yourself billed as a variable cost. The default pool is a large group of always-running nodes waiting to be assigned to your work as needed: you don't pay for compute per se, you pay for the maximum nodes you can use at any given time. The default pool would probably solve the cluster-startup latency issue you have. It is possible in Fabric to design your own cluster configuration, which would probably have the same startup delay as in a PaaS environment, but needing to do that is the exception, not the rule. The default pool meets most general computing needs.

    • @datatalkshassan · 11 days ago

      Perfect, thank you! That clarified a lot of things. Thanks for your time, really appreciate it.

  • @clnguye · 12 days ago

    Thanks for sharing. You sure made it look easy.

    • @robkerrai · 12 days ago

      Thanks for watching!

  • @fschlz · 13 days ago

    Thanks for the video!

  • @OneNI83 · 17 days ago

    Nice share. Is there a way to decrypt a wheel file obtained from elsewhere so we can see the .py scripts locally? Would you have some resources on this?

    • @robkerrai · 12 days ago

      If I'm not mistaken, .whl files aren't encrypted. Try changing the extension to .zip and using unzip to get at the code within the .whl file.
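The reply's point is that a wheel is just a ZIP archive, so any ZIP tool can open it without renaming. A minimal sketch using Python's standard-library zipfile; the wheel name here is a placeholder, and the first block only builds a tiny stand-in wheel so the demo is self-contained:

```python
import zipfile

# Build a tiny stand-in wheel so the demo is self-contained
# ("example_pkg-1.0-py3-none-any.whl" is a placeholder name).
with zipfile.ZipFile("example_pkg-1.0-py3-none-any.whl", "w") as whl:
    whl.writestr("example_pkg/__init__.py", "VERSION = '1.0'\n")

# A .whl is a standard ZIP archive, so zipfile opens it directly,
# no renaming to .zip required.
with zipfile.ZipFile("example_pkg-1.0-py3-none-any.whl") as whl:
    names = whl.namelist()               # the .py files and metadata inside
    whl.extractall("example_pkg_src")    # unpack to read the source

print(names)  # ['example_pkg/__init__.py']
```

The same inspection works for any downloaded wheel, since wheels contain plain .py source unless the publisher shipped only compiled artifacts.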

  • @akshay7450 · 19 days ago

    Hey Rob, this was really helpful. I was looking to apply information from a form like this onto a prebuilt data-entry tool with certain sections for each of the matching sections on the form. Would this be possible? Could you possibly make a video on this? Thanks

    • @robkerrai · 12 days ago

      You can use Azure AI to train a custom model for your own forms. I'd like to make a video about this in the future so stay tuned!

  • @DylanBrownWork · 29 days ago

    Any chance you could reply to this comment with the fetch? Thank you, very informative video and useful so far.

    • @robkerrai · 8 days ago

      Sorry Dylan, I don't understand what you mean by "fetch". Could you clarify?

  • @vishnutelakapalli6080 · 1 month ago

    Hi Rob, the content was very useful. Do you have a complete tutorial to learn Fabric, please? If you've already created one, please share the link. Thank you :)

    • @robkerrai · 27 days ago

      I don't have a complete series. I'd recommend Will's series on his @LearnMicrosoftFabric channel: www.youtube.com/@LearnMicrosoftFabric

  • @user-dx8ko2rs4g · 1 month ago

    Thank you for your work, Rob. I appreciate this video!

    • @robkerrai · 1 month ago

      So glad it was helpful!

  • @dataengineernerd27 · 1 month ago

    What are the use cases for this? Can you help with that?

    • @robkerrai · 1 month ago

      It's a good question. The benefit I see with this feature is viewing Power BI output within a notebook during development. For example, if the data in a Direct Lake model is being updated in the notebook, the visual is available within the notebook context.

  • @TheBas1202 · 1 month ago

    Hi Rob, thanks for this video. How would you go about separate lakehouses for dev and prod, in relation to the notebook you sync? I tried doing exactly what you did here, but then in prod the notebook is trying to talk to the dev lakehouse. It seems the default lakehouse for the notebook is not put in Git?

    • @robkerrai · 1 month ago

      Microsoft's vision as presented at FabCon 24 was that different branches in a CI/CD environment are connected to different workspaces. The Git features as I write this are pretty basic, but the design set out at FabCon is moving toward a more automated approach where creating a branch creates a workspace with all artifacts. In that scenario you would probably have a lakehouse in the Dev workspace, another lakehouse (with the same name) in the Prod workspace, etc.

  • @deeptinirwal9672 · 1 month ago

    Nice video

  • @Nalaka-Wanniarachchi · 1 month ago

    Good video. I hope this flow happens automatically behind the scenes when mirroring happens.

    • @robkerrai · 1 month ago

      This is a pipeline (not mirroring). You are correct that when using mirroring, the integration runs behind the scenes.

  • @chacemccallum983 · 1 month ago

    This video has been fantastic, but I have a possibly stupid question: when you ran the code, a job was automatically created; what did you do to get that? When I run mine, that does not happen.

    • @robkerrai · 1 month ago

      Not sure I understand. Which portion of the video are you referring to?

  • @mtagab007 · 1 month ago

    Thank you, very helpful.

    • @robkerrai · 1 month ago

      Glad it was helpful!

  • @narendraparmar1631 · 2 months ago

    Thanks Rob

  • @shanewarnakula7149 · 2 months ago

    Great video! I had a question: how do I call a custom model that I created in Azure Document Intelligence?

    • @robkerrai · 12 days ago

      You can use Azure AI to train a custom model for your own forms. I'd like to make a video about this in the future so stay tuned!

  • @spectraphantom9433 · 2 months ago

    Thank you for this tutorial, sir!

    • @robkerrai · 1 month ago

      Glad it was helpful!

  • @daan7355 · 2 months ago

    Thank you, this helped a lot. How would it work with custom prompts for the bot?

  • @rommelsantizo · 2 months ago

    Hi Rob, great video about how to use packages in Fabric. In fact, this video makes me think that we can create a new Workspace and Lakehouse called SourceCode or SharedCode, as in the example, so we will have the code isolated from the Bronze, Silver, and Gold data.

    • @robkerrai · 12 days ago

      I think you could accomplish this using Fabric environments. Each environment can have different .whl files attached to it, so when you choose an environment to run your code, that code will use whatever .whl files are pre-installed in the environment.

  • @kathleenomollo7357 · 2 months ago

    This was so helpful. The step-by-step guide helped me solve an issue that had stumped our developers for weeks. This was the breakthrough I needed.

  • @miles21k_ · 2 months ago

    Excellent

  • @adilmajeed8439 · 2 months ago

    Thanks for sharing your video. Is it possible to share the Retail Analysis Power BI dataset along with the measures, so we can replicate the same on our side?

    • @robkerrai · 8 days ago

      This would be difficult to share as it has some content I don't own.

  • @MegaNatsirt · 2 months ago

    You the real MVP in OCR.

  • @AFOAkanbi · 2 months ago

    Thanks for this

  • @skywalknotpossible · 2 months ago

    Mate, you have some really good content here. Subscribed.

    • @robkerrai · 12 days ago

      Thanks for subbing!

  • @manu_pasta · 2 months ago

    Thank you for this amazing demo ❤ I didn't know about python usage possibilities in Fabric 🔥

  • @baklava2tummy · 3 months ago

    Thanks, this has been really useful. One issue I found was that the model was showing as ‘resource not found’ on the initial post request. I had to change the ‘project settings’ to ‘endpoint and key’ and then create a new model. Not sure if I had done something wrong but maybe other people have been puzzled by this too! Thanks again for the video, really helpful!

  • @JohnHenry2009 · 3 months ago

    Thanks for this Rob - very useful.

    • @robkerrai · 1 month ago

      Glad it was helpful!

  • @gcarboni1 · 3 months ago

    Thanks for this demo. It was very useful and clear. I think the only bad point of an Azure ML endpoint is that you can add only one model to a single deployment, and it becomes very expensive, because you need a dedicated VM for each model. Do you know of any alternatives in Azure ML? Thanks again, Gabriele

    • @robkerrai · 3 months ago

      The GUI-based tools are definitely geared around one model per endpoint. Multiple models are possible but require more configuration work. I can't paste URLs here, but if you search the Learn site for "multiple local models deployment" it should turn up some doc pages on how to get it working.

    • @gcarboni1 · 3 months ago

      @robkerrai I see a solution inside a container instance. But I'll search again, thank you.

  • @agentDueDiligence · 3 months ago

    Hey Rob! Thanks for the introduction. I have had some really great experiences with the layout API. This document AI service is kind of groundbreaking, in my opinion. It hasn't really hit the mainstream yet. What is your take on the model?

    • @robkerrai · 3 months ago

      These models are most commonly part of enterprise deployments and don't get a lot of PR coverage. Microsoft published a case study, which you can probably find via a web search, where Volvo uses Doc Intel to speed processing of supply chain documents. These types of models can have a big impact on cost savings.

  • @gabscar1 · 3 months ago

    Nice video, thanks! Just wondering if there is a way for the model to ingest user data: PDFs, spreadsheets, pictures, etc.?

    • @robkerrai · 3 months ago

      Image attachments are supported by the API when formulating a prompt, but I don't see support for documents like PDFs and spreadsheets in the docs.

  • @marvnl · 4 months ago

    I find it vague how to enter the source URL. I want to build one that generates responses from our SharePoint communication sites as a documentation source, because that is the live source that gets updated if needed. I do not want to put any source material in the bot itself, because that is something you can easily forget to edit if policy changes. Where exactly can I enter the URL as a source?

    • @robkerrai · 3 months ago

      To use linked documents, search the learn documentation for "Use content on SharePoint or OneDrive for Business for generative answers". I think this should point you in the right direction. Good luck with your project!

  • @user-ce6jb5pm5f · 4 months ago

    Hi, any idea how I can add a date to the vector DB so the answer will be based on the most up-to-date document?

    • @robkerrai · 3 months ago

      If you don't want old content in the vector DB, the best approach is probably not to add it to the vector DB (or remove old entries as new ones are added, or truncate/reload the vector DB so it only contains relevant docs). The alternate approach is when you receive the results of a vector query, you will have the JSON metadata for each entry, so you can filter out only the most recent results that way. HTH.
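The second approach in this reply (filtering vector query results by their JSON metadata) can be sketched like this. The result shape and field names (`text`, `score`, `metadata`, `date`) are assumptions for illustration, since each vector database returns results differently:

```python
from datetime import date

# Hypothetical results from a vector similarity query, each carrying
# JSON metadata with the source document's date.
results = [
    {"text": "old policy",  "score": 0.91, "metadata": {"date": "2022-03-01"}},
    {"text": "new policy",  "score": 0.88, "metadata": {"date": "2024-01-15"}},
    {"text": "draft notes", "score": 0.80, "metadata": {"date": "2023-06-30"}},
]

cutoff = date(2023, 1, 1)

# Keep only entries recent enough, then re-rank by similarity score.
fresh = [r for r in results
         if date.fromisoformat(r["metadata"]["date"]) >= cutoff]
fresh.sort(key=lambda r: r["score"], reverse=True)

print([r["text"] for r in fresh])  # ['new policy', 'draft notes']
```

Note the trade-off the reply describes: filtering after the query can discard your best matches, so if stale content should never be served, removing it from the vector DB is the cleaner option.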

  • @609rohith4 · 4 months ago

    Hi sir, when we enter 1 in the input it gives an error. How do we resolve it?

    • @robkerrai · 3 months ago

      Hard to say from the information you posted. Make sure to obtain a Gemini key from Google--that would be the most likely reason for a request not succeeding.

  •  4 months ago

    Great video with powerful explanations!

    • @robkerrai · 3 months ago

      Glad you liked it!

  • @mudassirsyedrashidali9787 · 4 months ago

    Great way to explain Copilot Studio. Keep more coming!

    • @robkerrai · 3 months ago

      Thanks, will do!

  • @DBBBB · 4 months ago

    Rob, David here, a developer from Australia. Thanks for the great video, exactly what I needed. I have a Base64-encoded file ready to go, an attachment extracted from Outlook Exchange, so it makes a lot more sense to send that Base64 string directly, rather than store the file to disk and then pass its location. All the Microsoft documentation shows examples of passing a string with the document's location. This doesn't make much sense to me, since we are using a REST API rather than an SDK, and the service most likely will not have access to the file location. P.S. Subbed and notified. In just 20 videos on your channel, you have provided so much great content in this space. Please keep making videos! If you make courses anywhere, I would also happily buy them. Great stuff.

    • @robkerrai · 3 months ago

      Glad the videos are helpful! The feedback is encouraging. In your scenario I tend to agree: you probably don't have a good way to give Azure AI access to your data via a URL since it's the content of an email. You're right that the examples usually give a URL that's publicly available on the Internet (not very likely in an enterprise scenario). However, the passed-URL approach could work in an enterprise solution if, for example, you had a file in Azure Blob storage and generated a short-lived SAS token to append to the URL passed to AI Services. I prefer Base64 for "small" input payloads, but I would go for a SAS token if the input file was large, to reduce the size of the request to Azure.
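The Base64 option discussed here amounts to embedding the attachment bytes directly in the JSON request body instead of passing a URL. A minimal sketch with the standard library; the attachment bytes are a placeholder, and the request-body field name is modeled on the Document Intelligence analyze request, so verify it against the API version you call:

```python
import base64
import json

# Placeholder for attachment bytes as they might arrive from an email
# integration; in practice this is the real PDF or image content.
attachment_bytes = b"%PDF-1.4 example content"

# Base64-encode the bytes and embed them in the JSON body, so no
# publicly reachable URL (or SAS token) is needed.
body = {"base64Source": base64.b64encode(attachment_bytes).decode("ascii")}
payload = json.dumps(body)  # this string becomes the POST body

print(body["base64Source"])  # JVBERi0xLjQgZXhhbXBsZSBjb250ZW50
```

Base64 inflates the payload by about a third, which is why the reply suggests switching to a SAS-token URL for large files.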

  • @shanksingh7613 · 4 months ago

    Nice, keep making videos!

    • @robkerrai · 4 months ago

      Thanks!

    • @stefanobucciol4279 · 23 days ago

      Genius! Can you please do a video on how the OpenAI model in Azure was configured in order to give you these results? Thanks

  • @jyhdjhyjdhjyj5058 · 4 months ago

    Can you GitHub that code?

    • @robkerrai · 4 months ago

      The code is at the blog post link (see link in the description). Thanks!

    • @jyhdjhyjdhjyj5058 · 4 months ago

      @robkerrai thank you!

  • @ApoorvKhandelwal · 4 months ago

    Kindly make a video on deploying this code to a server like Vercel or Cloudflare.

    • @robkerrai · 4 months ago

      I have a post on my blog on how to deploy a Streamlit app to a Docker container. This technique would apply to this scenario as well. robkerr.ai/publish-streamlit-app-docker-azure-container/ Thanks!

  • @amethyst1044 · 5 months ago

    Very helpful video, thank you!

    • @robkerrai · 4 months ago

      Glad it was helpful!

  • @EricB1 · 5 months ago

    Very useful demo. Gives me many ideas. You may save effort if you use the clipboard more. There is a great utility called "Clipboard Help+Spell" that can help.

  • @presonator · 5 months ago

    Super smooth, informative, and nicely paced intro to Copilot Studio. Thanks very much. Rob J

    • @robkerrai · 4 months ago

      You're welcome! Glad it was helpful.

  • @sdr7883 · 5 months ago

    Can we create a custom copilot that could summarize data based on the data we pass (for example, a Teams copilot that would summarize a meeting based on transcripts)?

    • @robkerrai · 4 months ago

      Possibly. There are a lot of connectors available. At some point Microsoft's PromptFlow becomes potentially a better approach, since it's open to any custom code you want to incorporate in the solution. Thanks!

  • @khavea · 6 months ago

    Is it possible to do the same outside of the PBI workspace?

    • @robkerrai · 6 months ago

      According to the MSFT docs, Semantic Link is only supported within Fabric, so while SemPy could be used outside a Pandas notebook, it would still need to be within the context of Fabric.

  • @benscanlan3223 · 6 months ago

    Dude, sweet video!

    • @robkerrai · 6 months ago

      Thanks! Glad you enjoyed it.