Hey all! How's it going with Dataflows and Fabric? Have you started using either yet?! Share your experiences below 👇😃
Thanks for this fab video. It is so so easy to understand and so useful to actually go through use cases end to end!
Hey thanks for watching and for the feedback! I’ll be doing more end-to-end projects in the future for sure ☺️💪
Very nice, easy way of explaining it, very helpful!!
Hey, awesome vid! Thanks, I follow you!
With Dataflow, can I target a specific schema on the warehouse destination? Or should I combine a pipeline and a dataflow for that? Thx
Thanks for the wonderful videos on Microsoft Fabric.
I see that all the data imports are "one-time activities" from a source.
How can we get the delta data (e.g. new records, deleted records) in the source synced with the lakehouse periodically?
i.e., what about the continuous CRUD operations being done on the source (e.g. SQL DB)? How can those be synced with the data in the lakehouse?
Thanks for your comment! Yes, you're right, they are one-time activities (though you can schedule them to run every day/hour etc.), but it's never going to be 'real-time'.
The feature you are describing is more like OneLake shortcuts, which let you create a real-time link to FILES in external locations (ADLS and Amazon S3).
Microsoft is also releasing a feature called Database Mirroring soon, which will do the same thing but for databases (Snowflake, Cosmos DB, Azure SQL, etc.). This feature is currently in private preview; I believe they will release it for public preview soon!
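In the meantime, one common workaround (not covered in the video, so treat this as a sketch) is to land the changed source rows in a staging table on a schedule and upsert them into the lakehouse table with a Delta MERGE in a Fabric notebook. The table names "customers" / "staging_customers" and the key column "customer_id" below are hypothetical:

```python
from delta.tables import DeltaTable

# Hedged sketch: periodically upsert changed source rows into a lakehouse Delta table.
# "customers", "staging_customers" and "customer_id" are hypothetical names;
# `spark` is the session a Fabric notebook provides automatically.
target = DeltaTable.forName(spark, "customers")    # existing lakehouse table
changes = spark.table("staging_customers")         # latest batch of changed rows

(
    target.alias("t")
    .merge(changes.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()       # apply updates to existing rows
    .whenNotMatchedInsertAll()    # insert brand-new rows
    .execute()
)
```

Deleted rows can be handled in the same merge with a whenMatchedDelete clause if the staging data carries a delete flag.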
Nice. Quick question: why don't we have the dataflow UI that we had in ADF and Synapse?
Because it’s a different product!
Hi Wills, for the report can I import a background design, maybe from Figma or Canva?
yes
Great video, thanks
Thanks for watching!
Thanks for the awesome video. How do you add the folder names like BronzeLayer/SilverLayer? It all got created into the same workspace for me.
Hey thanks for the question, which part of the video are you referring to?
The folders refer to data lakes, so he has a data lake for bronze data and a data lake for silver data.
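If you'd rather keep one lakehouse and just separate the layers with folders, here's a minimal sketch (the folder names are hypothetical, using the mssparkutils helper available in Fabric notebooks):

```python
# Hedged sketch: create bronze/silver subfolders under the lakehouse Files area.
# "BronzeLayer" and "SilverLayer" are hypothetical folder names.
from notebookutils import mssparkutils

mssparkutils.fs.mkdirs("Files/BronzeLayer")
mssparkutils.fs.mkdirs("Files/SilverLayer")
```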
Dataflow Gen2 cannot be deployed in a Fabric pipeline on my side, why is that?
Hey! Could be a number of reasons! Does the dataflow run ok outside of the pipeline? I would look in Monitoring Hub and analyze the error message. Good luck!
Thank you for sharing! I didn't load my data into the Azure storage account - I used DFg2 to read/upload the CSV file.
Do you guys experience slow Delta writing to the lakehouse? Can I do something to speed it up?
Hi
Yes, in general DFg2 is quite slow at the moment during the public preview. I'm sure the write speeds will increase as we move closer to GA (general availability) of Fabric.
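If the Dataflow Gen2 write speed is the blocker, one alternative worth trying (a sketch, not something from the video) is to upload the CSV to the lakehouse Files area and load it with a Spark notebook instead; the file path "Files/sales.csv" and table name "sales" are hypothetical:

```python
# Hedged sketch: ingest a CSV with a Fabric Spark notebook instead of Dataflow Gen2.
# Assumes the file is already in the lakehouse Files area; names are hypothetical.
# `spark` is the session a Fabric notebook provides automatically.
df = (
    spark.read
    .option("header", "true")        # first row holds the column names
    .option("inferSchema", "true")   # let Spark work out the column types
    .csv("Files/sales.csv")
)

# Save as a managed Delta table in the attached lakehouse.
df.write.format("delta").mode("overwrite").saveAsTable("sales")
```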