DotPi
  • Videos: 34
  • Views: 139,506
Python log vs Excel log
Excel log and Python log work in different ways. Learn about the differences so you can avoid getting caught out
Views: 26
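The difference comes down to default bases: Excel's LOG function defaults to base 10 (LN gives the natural log), while Python's math.log defaults to base e. A minimal sketch:

```python
import math

# Python's math.log defaults to the natural log (base e),
# while Excel's LOG function defaults to base 10.
natural = math.log(100)         # base e, roughly 4.6052
base10 = math.log(100, 10)      # base 10, matches Excel's LOG(100)
base10_exact = math.log10(100)  # dedicated base-10 log, better precision

print(natural, base10, base10_exact)
```

Excel's LN(100) corresponds to Python's math.log(100); mixing up the two defaults is the usual way to get caught out.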

Videos

Is uv pip install the new pip install?
Views: 2.5K • 2 months ago
uv is an ultra-fast package manager and installer; keep up to date on the latest developments here: github.com/astral-sh/uv
Load data from Azure Blob Storage into R
Views: 2K • 3 months ago
# install packages
#install.packages("AzureStor")
library(AzureStor)

# enter credentials
account_url = ACCOUNT_URL
account_key = ACCOUNT_KEY
container = CONTAINER

# connect to blob storage
blob = storage_endpoint(endpoint = account_url, key = account_key)

# connect to the container
cont = storage_container(blob, container)

# get a list of files in the container
files = list_storage_files(cont)
...
What is Power BI's Query Folding?
Views: 98 • 8 months ago
Learn about the magic of query folding in Power BI and the difference between private, organizational and public privacy settings
How much does Snowflake cost?
Views: 937 • 1 year ago
Learn about Snowflake's cost structure in an easy-to-follow video. More information and specifics on the Snowflake cost structure can be found here www.snowflake.com/pricing/
A spreadsheet that tracks the value of your collector's items
Views: 917 • 1 year ago
Make an Excel spreadsheet that keeps track of your collector's items so you can keep track of what is in your collection and what the current value is. This video uses Pokemon cards as an example.
Connect Goodreads to PowerBI
Views: 115 • 1 year ago
Create a connection between Goodreads and PowerBI so you can start making cool dashboards that show your reading progress.
Create a dynamic pivot function with multiple aggregate functions in Snowflake
Views: 1.6K • 1 year ago
This video follows on from ruclips.net/video/FxNDJdZRrcI/видео.html The SQL used in this video is below. Just one thing to note: RUclips does not allow less-than and greater-than signs in descriptions, so I've replaced that part with LESSTHANGREATERTHAN. That will need to be put back before you use the script. execute immediate $$ declare column_list_in string default (select listagg(distinct c...
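For comparison outside Snowflake: pandas' pivot_table discovers the distinct categories at runtime by itself, which is the effect the LISTAGG + EXECUTE IMMEDIATE script recreates. The column names below are hypothetical stand-ins for the table used in the video:

```python
import pandas as pd

# Hypothetical data mirroring the CATEGORY/VALUE layout used in the video.
df = pd.DataFrame({
    "ID": [1, 1, 2, 2],
    "CATEGORY": ["AB", "BA", "AB", "BA"],
    "VALUE": [10.0, 20.0, 30.0, 40.0],
})

# pandas finds the distinct categories itself -- the effect the Snowflake
# script recreates with LISTAGG(DISTINCT ...) + EXECUTE IMMEDIATE.
pivoted = df.pivot_table(index="ID", columns="CATEGORY",
                         values="VALUE", aggfunc="sum")
print(pivoted)
```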
Connect Azure Blob Storage To Excel (Without Extensions)
Views: 6K • 1 year ago
Learn how to get your data from Azure Blob Storage into Excel without the need for any extensions or subscriptions.
View and Edit Azure Tables
Views: 1.6K • 1 year ago
Learn how to create, view and edit tables in an Azure Storage Account. Download the Azure Storage Explorer here: azure.microsoft.com/en-us/products/storage/storage-explorer/
Load data from Azure Blob Storage into Python
Views: 33K • 1 year ago
Code below:

from datetime import datetime, timedelta
from azure.storage.blob import BlobServiceClient, generate_blob_sas, BlobSasPermissions
import pandas as pd

# enter credentials
account_name = 'ACCOUNT NAME'
account_key = 'ACCOUNT KEY'
container_name = 'CONTAINER NAME'

# create a client to interact with blob storage
connect_str = 'DefaultEndpointsProtocol=https;AccountName=' + account_name + ';Acc...
Create a Dynamic Pivot Stored Procedure in Snowflake
Views: 2.3K • 2 years ago
To do a regular Snowflake pivot:

SELECT *
FROM "EXAMPLE_DB"."EXAMPLE_SCHEMA"."EXAMPLE_TABLE"
PIVOT(sum("VALUE") for CATEGORY in ('AB', 'BA', 'AC', 'CA'));

To run a dynamic pivot as a query:

execute immediate $$
declare
  column_list_in string default (select listagg(distinct category, ',')
    from "EXAMPLE_DB"."EXAMPLE_SCHEMA"."EXAMPLE_TABLE"
    WHERE CONTAINS("CATEGORY", 'A'));
  table_in string default...
FIFO in Snowflake
Views: 662 • 2 years ago
Code is on the way
Connect Snowflake to MS Access, and then to other software
Views: 2.1K • 2 years ago
Create 32-bit Python Environments from a 64-bit Conda Installation
Views: 7K • 2 years ago
Log in to Snowflake and set up new user accounts using Azure AD
Views: 2.3K • 2 years ago
Use Snowflake’s Pivot Function with a Dynamic List of Columns
Views: 12K • 2 years ago
Connect Azure Blob Storage to ArcGIS Pro to create a live map layer
Views: 1.7K • 2 years ago
Connect Azure Blob Storage to QGIS to create a live map layer
Views: 2.2K • 2 years ago
Install GDAL and run GDAL Programs from the Command Line (with Anaconda)
Views: 3.6K • 2 years ago
Connect Snowflake to QGIS to create a live map layer
Views: 1.7K • 2 years ago
Connect Python to Snowflake (with password and external authentication)
Views: 12K • 2 years ago
Connect Snowflake to ArcGIS Pro to create a live map layer
Views: 1.4K • 2 years ago
Connect Excel to Snowflake (with password and external authentication)
Views: 5K • 2 years ago
Order the Stacks of a Line and Stacked Column Chart in Power BI
Views: 13K • 2 years ago
Connect R to Snowflake (with password and external authentication)
Views: 2.9K • 2 years ago
Changing Virtual Environments in Anaconda and Spyder
Views: 16K • 3 years ago
PyQt5 App Layouts
Views: 2.7K • 3 years ago
Why is Super().__init__() needed in PyQt5 Apps?
Views: 513 • 3 years ago
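The question can be answered without PyQt5 at all: if a subclass defines __init__ and never calls super().__init__(), the base class's setup never runs, so none of its attributes exist. A minimal sketch with a hypothetical stand-in base class playing the role QWidget does in the video:

```python
class Window:                       # hypothetical stand-in for QWidget
    def __init__(self):
        self.children = []          # base-class state Qt would normally set up

class BadApp(Window):
    def __init__(self):
        pass                        # forgot super().__init__() -- base setup skipped

class GoodApp(Window):
    def __init__(self):
        super().__init__()          # base state is initialised first
        self.title = "hello"

print(hasattr(BadApp(), "children"))   # False -- the attribute was never created
print(GoodApp().children)              # [] -- base-class setup ran
```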
Using Super() in Python, Part 3 - Multiple Inheritance
Views: 869 • 3 years ago
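With multiple inheritance, a single super().__init__() call per class walks the entire method resolution order, so each base runs exactly once. A minimal self-contained sketch:

```python
class A:
    def __init__(self):
        self.trace = getattr(self, "trace", []) + ["A"]
        super().__init__()   # continue along the MRO

class B:
    def __init__(self):
        self.trace = getattr(self, "trace", []) + ["B"]
        super().__init__()

class C(A, B):
    def __init__(self):
        super().__init__()   # one call walks the whole MRO: C -> A -> B

c = C()
print(c.trace)                           # ['A', 'B']
print([k.__name__ for k in C.__mro__])   # ['C', 'A', 'B', 'object']
```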

Comments

  • @barnestim
    @barnestim 11 days ago

    I didn't manage to get this working using the AZURE_STORAGE_ACCESS_KEY method but did get it working using 2 environment variables: AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_ACCESS_KEY with their relevant values. Also, a shapefile didn't work but a COG raster did. Just to re-highlight also, that adding/modifying of the environment variables requires restarting of QGIS- I found that some older values I had tried were being retained if I didn't close all instances of QGIS. The best way to check is to examine the 'Current environment variables (read-only - .....) section and checking the AZURE_....values are as expected.
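The environment-variable mechanism described here can be sketched in Python. GDAL, which QGIS uses under the hood, reads AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_ACCESS_KEY when resolving /vsiaz/ paths; all values below are placeholders, not real credentials:

```python
import os

# The two variables mentioned in the comment; values are placeholders.
os.environ["AZURE_STORAGE_ACCOUNT"] = "mystorageaccount"
os.environ["AZURE_STORAGE_ACCESS_KEY"] = "base64-key-here"

# GDAL reads these when resolving a /vsiaz/ path such as:
path = "/vsiaz/mycontainer/raster.tif"
print(path)
```

As the comment notes, QGIS only picks up changed environment variables after a full restart of every instance.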

  • @kiddjami25
    @kiddjami25 1 month ago

    Thank you for this video. I hope you can assist me on a problem I am having. After I refresh the data, under queries & connections, it shows I have errors. I do not know how to check these or fix them. Can you help? Thanks!

  • @dakotakristenthornton4614
    @dakotakristenthornton4614 1 month ago

    Love the video! I'm trying to do this for my MTG collection. How would you go about inputting cards when pricecharting doesn't have the number available in the url of the specific card? Example would be the plain jane Consuming Blob.

  • @pavanpenugonda4924
    @pavanpenugonda4924 1 month ago

    Simple, straightforward video, really helpful

  • @sammyvr94
    @sammyvr94 1 month ago

    When we do this, it appears to use a security integration of type SAML instead of the account parameter

  • @LimitedWard
    @LimitedWard 2 months ago

    I'm lost. Why wouldn't the maintainers of pip simply apply the same optimizations as uv then if it doesn't actually need to download all that extra content?

    • @dotpi5907
      @dotpi5907 2 months ago

      Good question, I think pip is still considered the more stable package manager, I imagine pip will adopt some things from uv pip once it reaches version 1.0.0

  • @rogercarlson2319
    @rogercarlson2319 2 months ago

    My issue is I want to link one of those tables that are duplicated in separate databases (as you showed in the beginning). This produces an error in Access. "Cannot define field more than once." I suspect the quick and dirty solution is to create a Snowflake Role that can only see that database. Thus it will appear only once. The cleaner solution is to link the table programmatically. This will require building an odbc connection string. I'm looking into that now. If I get a chance, I'll post a solution.

    • @dotpi5907
      @dotpi5907 2 months ago

      Thanks! That would be great to hear what you find

  • @mak448a
    @mak448a 2 months ago

    It’s way faster when making venvs!

    • @dotpi5907
      @dotpi5907 2 months ago

      That's good to know!

  • @RoamingAdhocrat
    @RoamingAdhocrat 2 months ago

    ooh. I support a SaaS app and _hate_ Azure Storage Explorer with a burning passion. if I can access logs etc from Python instead of ASE that would be a very happy rabbithole to go down. I suspect I don't have access to those keys though

  • @RoamingAdhocrat
    @RoamingAdhocrat 2 months ago

    how can it be faster, it's three extra keystrokes

  • @thecowmilk4857
    @thecowmilk4857 2 months ago

    Rust is the new programming contagion

    • @dotpi5907
      @dotpi5907 2 months ago

      The Rustaceans are sweeping the nations

  • @muhammadraza5108
    @muhammadraza5108 2 months ago

    Hi, a very good tutorial. I am using SQL Server 2019, where the ORDER BY command is not allowed with a CTE. Please suggest an alternative. Also, which environment are you working in? Regards

  • @ZuperPotato
    @ZuperPotato 2 months ago

    You're so underrated

    • @dotpi5907
      @dotpi5907 2 months ago

      Thanks! The support means a lot

  • @clockblower6414
    @clockblower6414 2 months ago

    Hate using pip because of venvs. At least pipx exists when it's just Python programs

    • @TheStanglehold
      @TheStanglehold 2 months ago

      Pray tell why does venv make pip difficult?

    • @clockblower6414
      @clockblower6414 2 months ago

      It's not, Stangie, it's just extra steps

  • @EwaneGigga
    @EwaneGigga 2 months ago

    Thanks a lot for this very clear video. I spent hours trying to do this until I luckily stumbled across your video. I agree that this video should definitely have more views!!

    • @dotpi5907
      @dotpi5907 2 months ago

      I'm glad it helped. Thanks for the support!

  • @adasalda
    @adasalda 2 months ago

    Hey, do you know by any chance how to connect to Microsoft Azure Data Lake Storage Gen 2? I can't do it the same way as you are doing it. It needs the vsiadls file system handler (your method uses vsiaz), but I don't know how to set it :(

    • @dotpi5907
      @dotpi5907 2 months ago

      Hi, thanks for the comment. I can have a go this weekend; I don't have QGIS installed on my computer at the moment. My only thought would be to set the Type as HTTP/HTTPS/FTP in the data source manager window and pass in a URL to your files, rather than selecting Microsoft Azure Blob. It looks like the authentication for vsiadls and vsiaz should be the same, so I don't imagine there would be any change needed to the environment variables. Let me know if you figure it out.

    • @barnestim
      @barnestim 11 days ago

      There's a provider/Protocol Type now in the 'Add layer' dialogue: 'Microsoft Azure Data Lake Storage'. Try that?

  • @vardhanreddy435
    @vardhanreddy435 3 months ago

    Hi, thanks for your info. I want to display column headings without single quotes before and after in the output. You displayed 'BA', 'AB', 'CA', and I aim to dynamically achieve BA AB CA. Can you explain how to achieve that?

  • @latebloomer1020
    @latebloomer1020 3 months ago

    This is not working these days. Don't waste your time.

    • @dotpi5907
      @dotpi5907 3 months ago

      Hi, which part doesn't work for you and what is the error message you get?

  • @TaylorDezio
    @TaylorDezio 3 months ago

    How do you add ungraded, along with all the graded prices?

  • @TravelChroniclesHD
    @TravelChroniclesHD 3 months ago

    Could you put the code somewhere?

    • @dotpi5907
      @dotpi5907 3 months ago

      Thanks for the reminder! I remembered I had trouble adding the code to the description because you can't add < and > signs in a video description; I'll paste the code in this comment until I figure it out

    • @dotpi5907
      @dotpi5907 3 months ago

--create a FIFO schema
create schema EXAMPLE_DB.FIFO;

--create a BUY table
create or replace table EXAMPLE_DB.FIFO.BUY ("INDEX" integer, QUANTITY float, PRICE float);

--create a SELL table
create or replace table EXAMPLE_DB.FIFO.SELL ("INDEX" integer, QUANTITY float);

--add values to the initial tables
insert into EXAMPLE_DB.FIFO.BUY ("INDEX", QUANTITY, PRICE) values (1, 200, 10), (2, 150, 12), (3, 225, 16);
insert into EXAMPLE_DB.FIFO.SELL ("INDEX", QUANTITY) values (1, 100), (2, 300);

CREATE OR REPLACE VIEW INVENTORY ("INDEX", QUANTITY, PRICE, TOTAL_COST) as
--get the total number of units sold
WITH total_units_sold AS (
    SELECT SUM(QUANTITY) TOTAL_UNITS_SOLD
    FROM EXAMPLE_DB.FIFO.SELL
),
--CUMULATIVE_SUM table: the inventory table with a cumulative sum cost column
cumulative_sum AS (
    SELECT *,
        SUM(QUANTITY) OVER (PARTITION BY NULL ORDER BY "INDEX") CUMULATIVE_SUM
    FROM EXAMPLE_DB.FIFO.BUY
    --do a cross join to add TOTAL_UNITS_SOLD as a new column
    cross join total_units_sold
),
updated_quantity AS (
    SELECT *,
        CUMULATIVE_SUM - TOTAL_UNITS_SOLD BUY_SELL_DIFFERENCE,
        --whatever is left over from the buy-sell difference needs to be subtracted
        IFF(BUY_SELL_DIFFERENCE > 0 AND LAG(BUY_SELL_DIFFERENCE) OVER (PARTITION BY NULL ORDER BY "INDEX") < 0,
            BUY_SELL_DIFFERENCE, QUANTITY) UPDATED_QUANTITY
    FROM cumulative_sum
)
SELECT "INDEX", UPDATED_QUANTITY QUANTITY, PRICE, UPDATED_QUANTITY*PRICE TOTAL_COST
FROM updated_quantity
--purchases that get completely sold out will have a negative buy-sell difference, so get removed from the inventory
WHERE BUY_SELL_DIFFERENCE >= 0
;

--Cost of goods sold
CREATE OR REPLACE VIEW cogs ("INDEX", cogs) as
WITH cumulative_units_sold AS (
    SELECT *,
        SUM(QUANTITY) OVER (PARTITION BY NULL ORDER BY "INDEX") CUMULATIVE_SELL
    FROM EXAMPLE_DB.FIFO.SELL
),
--work out the total amount of stock sold as of the previous sale
previous_sales AS (
    SELECT SELL.INDEX SELL_INDEX,
        SELL.QUANTITY SELL_QUANTITY,
        BUY.INDEX BUY_INDEX,
        BUY.QUANTITY BUY_QUANTITY,
        BUY.PRICE BUY_PRICE,
        CUMULATIVE_SELL - SELL_QUANTITY SOLD_PREVIOUSLY,
        --deal with what was sold previously first, then subtract the remainder
        SUM(BUY_QUANTITY) OVER (PARTITION BY SELL_INDEX ORDER BY BUY_INDEX) CUMULATIVE_BUY,
        --how much is left in the st
        CUMULATIVE_BUY - SOLD_PREVIOUSLY BUY_SELL_DIFFERENCE
    FROM cumulative_units_sold SELL
    FULL OUTER JOIN BUY
    ORDER BY SELL_INDEX, BUY_INDEX
),
--update the stocks to what we would have after the previous sale
--rows with negative (or 0) buy-sell differences are removed in the next table
updated_inv_quantity AS (
    SELECT SELL_INDEX, SELL_QUANTITY, BUY_INDEX, BUY_QUANTITY, BUY_PRICE, BUY_SELL_DIFFERENCE,
        --remaining_inv_rank will be 1 for any partially remaining stock, which will come after some of the previous or same rows have used up stock
        rank() over (partition by sell_index order by iff(buy_sell_difference <= 0, NULL, buy_index)) remaining_inv_rank,
        --make an adjustment for any partially depleted stock; keep non-partially depleted stock at the same number as what was bought
        iff(remaining_inv_rank = 1, BUY_SELL_DIFFERENCE, BUY_QUANTITY) updated_inv --updated holdings after the previous sale
    FROM previous_sales
    ORDER BY SELL_INDEX, BUY_INDEX
),
cogs_whole AS (
    SELECT SELL_INDEX, SELL_QUANTITY, BUY_INDEX, BUY_PRICE, updated_inv,
        SUM(updated_inv) OVER (PARTITION BY SELL_INDEX ORDER BY BUY_INDEX) CUMULATIVE_inv,
        CUMULATIVE_inv - SELL_QUANTITY UPDATED_BUY_SELL_DIFFERENCE,
        --if a buy record has been completely depleted, that is counted as a whole cogs
        IFF(UPDATED_BUY_SELL_DIFFERENCE <= 0, UPDATED_inv*BUY_PRICE, 0) COGS_WHOLE
    FROM updated_inv_quantity
    WHERE BUY_SELL_DIFFERENCE > 0
    ORDER BY SELL_INDEX, BUY_INDEX
),
cogs_part as (
    select *,
        rank() over (partition by sell_index order by iff(cogs_whole = 0, buy_index, NULL)) cogs_part_rank,
        IFF(cogs_part_rank = 1, (UPDATED_inv - UPDATED_BUY_SELL_DIFFERENCE)*BUY_PRICE, 0) cogs_part,
        cogs_whole + cogs_part cogs_all
        --IFF(UPDATED_BUY_SELL_DIFFERENCE>SELL_QUANTITY)
        --LAG(COGS_WHOLE) OVER (PARTITION BY SELL_INDEX ORDER BY BUY_INDEX) TEST
    from cogs_whole
    ORDER BY SELL_INDEX, BUY_INDEX
)
select sell_index, sum(cogs_all) cogs
from cogs_part
group by sell_index
;
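The FIFO logic in the SQL above can be cross-checked with a short Python sketch over the same example data (three buys, two sells); working it by hand, the cost of goods sold is 1000 for the first sale (100 units at 10) and 3600 for the second (100 at 10, 150 at 12, 50 at 16):

```python
from collections import deque

def fifo_cogs(buys, sells):
    """FIFO cost of goods sold per sale.

    buys:  list of (quantity, unit_price), oldest first
    sells: list of quantities sold, in order
    """
    inventory = deque(buys)
    cogs = []
    for qty in sells:
        cost, remaining = 0.0, qty
        while remaining > 0:
            lot_qty, price = inventory[0]
            used = min(lot_qty, remaining)   # consume the oldest lot first
            cost += used * price
            remaining -= used
            if used == lot_qty:
                inventory.popleft()          # lot fully depleted
            else:
                inventory[0] = (lot_qty - used, price)  # lot partially depleted
        cogs.append(cost)
    return cogs, list(inventory)

# Same example data as the SQL: three buys, two sells.
cogs, inventory = fifo_cogs([(200, 10), (150, 12), (225, 16)], [100, 300])
print(cogs)       # [1000.0, 3600.0]
print(inventory)  # [(175, 16)] -- 175 units left at price 16
```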

  • @TravelChroniclesHD
    @TravelChroniclesHD 3 months ago

    Love it!

    • @dotpi5907
      @dotpi5907 3 months ago

      Glad it helped!

  • @kevinduffy2428
    @kevinduffy2428 4 months ago

    What if you did not want to bring the files down to the local machine? How would you process the files up on Azure, and run the Python code on Azure? For instance, the files were placed in blob storage and now you want to process them, clean them up and then save the results out in blob storage. The Python code is not complicated; it's just the pieces/configuration up on Azure.

    • @busydoingnothing2677
      @busydoingnothing2677 1 month ago

      You'd probably need a VM

    • @ruigerd
      @ruigerd 18 days ago

      I'm trying to do this in an Azure Data Factory; a custom activity in the Data Factory can execute Python files that are saved on a blob. You will need an Azure Batch account with a pool that has Python installed on it (pools are based on VMs, and some VMs have Python pre-installed). Another way could be Azure Function Apps, but I have not tried that enough

  • @hipotenus6154
    @hipotenus6154 4 months ago

    Can I create different layouts and size them in a specific area without them stepping outside of the limits I gave them?

  • @10052941
    @10052941 4 months ago

    Nice video! I completed what you did but I want to make an addition. I would like to have a column with the grade, so that the Power Query gets the web info for that specific grade, i.e. to have the "Ungraded" between square brackets replaced with a value from a column in the source file: = Table.AddColumn(#"Added Custom", "Value", each Web.Page(Web.Contents([urls]))[Data]{0}[Ungraded]{0}) But after trying I'm not getting the desired results, only errors that the specific column can't be found in the table of the URL (understandable). Do you have an idea how to fix this?

  • @gilissantos6532
    @gilissantos6532 4 months ago

    I'm having connection problems, but this video is solving them, thank you very much!

  • @MarcGKiwi
    @MarcGKiwi 5 months ago

    Hi, can you write to Blob storage from ArcGIS Pro using these connections?

  • @cargouvu
    @cargouvu 5 months ago

    How do you remove the single quotes on the column headers after the pivot? Also, I am unable to retrieve them in another select statement once they have the single quotes.

  • @georgetong
    @georgetong 5 months ago

    Just saying thanks for this. The online docs for how to authenticate with cloud providers are absolutely trash, and this was the only one that showed you need to go to settings to add env vars. Helped me figure out how to authenticate with GCS

    • @dotpi5907
      @dotpi5907 2 months ago

      Thanks! Yes this took me ages to figure out, so had to share

  • @dhruvajmeri8677
    @dhruvajmeri8677 6 months ago

    Thank you for this video! It saved me time

  • @k2line706
    @k2line706 6 months ago

    Fantastic video. Very clear explanation and clean code for us to follow. Thank you!

  • @ryansylvester1993
    @ryansylvester1993 6 months ago

    Dude, thank you so much!!!

  • @beemac79
    @beemac79 6 months ago

    What about Snowflake roles? How do you assign them to a user?

  • @paulopraca6197
    @paulopraca6197 6 months ago

    Thank you for this video! It's exactly what I'm looking for. Thanks again!

  • @VikneshKoodalingam-qu2bl
    @VikneshKoodalingam-qu2bl 7 months ago

    It is a very detailed and clear explanation! Great work :)

  • @dashathomas9395
    @dashathomas9395 7 months ago

    What happens to the connection when I share the workbook with colleagues? Will it just not refresh, or does the data revert to errors of some sort? (Users who don't have a Snowflake driver or creds)

  • @b3llydrum
    @b3llydrum 7 months ago

    Notice the decorator function is *returning a function*, not simply calling it. That's a key part to understand.
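A minimal sketch of that point: the decorator builds and returns a wrapper function; the wrapped function only runs later, when the decorated name is called. The names here are hypothetical:

```python
def shout(func):
    # the decorator returns a new function -- it does not call func here
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs).upper()
    return wrapper

@shout
def greet(name):
    return f"hello {name}"

print(greet("world"))  # HELLO WORLD
```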

  • @Kay-qg5td
    @Kay-qg5td 7 months ago

    I need to establish a direct connection; I do not want to export data to Excel, as my data tables are larger in size. How do I achieve that?

  • @saltrocklamp199
    @saltrocklamp199 7 months ago

    I had to put "dsn=my-db-name" in the "Database" field, instead of just "my-db-name". Not sure why.

  • @remorabay
    @remorabay 7 months ago

    I have a Python script that reads an EDI file and, from there, creates unique data tags and elements (basically a CSV file with one tag and data field per line). I need a process to load this into Azure and, for the outbound, to extract into the same tags+data. This looks close. Anyone interested in giving me a quote for this (can you show it working?). Thanks.

  • @o.chatri
    @o.chatri 7 months ago

    Thanks a lot for this! It's working for me.

  • @luismedina8746
    @luismedina8746 8 months ago

    My database already has 2 columns: Lat and Long. Can I just call these two columns straight into GIS Pro and the software will identify it? Or I have to convert it using the same format as you?

  • @Justice429
    @Justice429 8 months ago

    Thanks! That was very helpful

    • @dotpi5907
      @dotpi5907 8 months ago

      Thanks for watching!

  • @hebasamir7431
    @hebasamir7431 8 months ago

    thanks☺

    • @dotpi5907
      @dotpi5907 8 months ago

      Thanks for watching!

  • @subhamchakraborty6822
    @subhamchakraborty6822 8 months ago

    Hi, how do I download that data and store it in Excel or CSV? Thanks in advance.

  • @CapitanFeeder
    @CapitanFeeder 9 months ago

    Videos like yours should have way more views. Thank you for what you do.

    • @dotpi5907
      @dotpi5907 8 months ago

      Thanks so much! I really appreciate the support

  • @kartikgupta8413
    @kartikgupta8413 9 months ago

    thank you for this video

    • @dotpi5907
      @dotpi5907 8 months ago

      Thanks for watching!

  • @ericbixby
    @ericbixby 9 months ago

    Do you have any suggestions for how to then write a file in a similar fashion to the storage blob?

  • @investing3370
    @investing3370 9 months ago

    What will happen when you have a SAS token on hand? Can it be replaced with the account key?

    • @dotpi5907
      @dotpi5907 9 months ago

      Hi @investing3370, try changing line 16 to:
      sas_token = 'your sas token'
      connect_str = 'DefaultEndpointsProtocol=https;SharedAccessSignature=' + sas_token + ';EndpointSuffix=core.windows.net'
      You won't need lines 11 to 13. Let me know if that works.
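A sketch of the two connection-string variants being discussed (account key vs SAS token); all credential values are placeholders, not real secrets:

```python
# Placeholder credentials, as in the video.
account_name = 'ACCOUNT NAME'
account_key = 'ACCOUNT KEY'
sas_token = 'your sas token'

# Variant 1: authenticate with the account key.
key_str = ('DefaultEndpointsProtocol=https;AccountName=' + account_name +
           ';AccountKey=' + account_key + ';EndpointSuffix=core.windows.net')

# Variant 2: authenticate with a SAS token instead.
sas_str = ('DefaultEndpointsProtocol=https;SharedAccessSignature=' + sas_token +
           ';EndpointSuffix=core.windows.net')

print(sas_str)
```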

  • @Kekybra
    @Kekybra 9 months ago

    Great video! I have tried to replicate this but I failed at some point. At the timestamp 13:46, you are changing the data type of the card value to a decimal number and your value stays 1.38. When I do this my value changes from 38.00 to 3800, so it deletes the ".". Could you maybe help me with this and explain how I can correct it?

    • @Kekybra
      @Kekybra 9 months ago

      I have fixed it. Thanks for the great video. You explained it very well!

    • @dotpi5907
      @dotpi5907 9 months ago

      Hi @Kekybra, thank you! And well done figuring it out, what did you have to do to fix it?

    • @Kekybra
      @Kekybra 9 months ago

      Hi @dotpi5907, I had to change my country in the main settings of Excel, and also, when changing from text to decimal number, I had to specify a different country. My country uses a comma in decimal numbers but pricecharting uses a dot in decimal numbers.

  • @sclem32
    @sclem32 10 months ago

    Thanks man, amazing content, keep going!

    • @dotpi5907
      @dotpi5907 9 months ago

      Thanks for the support!