SQL Portfolio Project - Part 2 - Faaso's | Analytics | Ashutosh Kumar

  • Published: 10 Dec 2022
  • The best way to learn a skill is to complete a project based on a real-life scenario, and when it comes to analytics, SQL is one of the most-asked skills in interviews. So, friends, in this video I have created an end-to-end data analysis project using SQL. It will be relevant to everyone who wants to build a career in data analytics, business analytics, or data science. In this video I analyse Faaso's dataset. You can find the code attached below -
    DATASET CODE LINK - github.com/ashutoshkr103/Micr...
    ----------------------------------------------------------------------------------------------------------------------
    Check out some more relevant content here
    👉 SQL Portfolio Project Indian Census- Part 2
    • SQL Portfolio Project ...
    👉 SQL Portfolio Project - Shark Tank India
    • SQL Portfolio Project ...
    👉 How to Learn SQL
    • How to learn sql for b...
    👉 Complete playlist on SQL interview questions and answers
    • HackerRank SQL problem...
    👉 Top free and best resources to learn analytics -
    • Free and Best resource...
    👉 How to become a data analyst complete roadmap-
    • Data Analyst Complete ...
    👉 Data analytics internships and job stipend-
    • 70,000 + per month dat...
    👉 Top 3 YouTube channels to learn SQL for free for beginners
    • Video
    👉 How to get data analytics internships-
    • How to apply for data ...
    👉Check out the roadmap to become a business analyst - • Business Analyst Compl...
    👉All about analytics playlist-
    • All about analytics
    👉 Top 3 YouTube channels to learn Excel for free for beginners
    • Top 3 you tube channel...
    ____________________________________________________________________
    Fill in the form below to subscribe to the analytics jobs mailing list and receive regular job-opening updates - docs.google.com/forms/d/e/1FA...
    Why you should definitely fill in the analytics job updates Google form - • Job Openings into busi...
    _______________________________________________________________________
    Connect with me
    📸Instagram - / ashutoszh
    💻Linkedin- / ashutosh.analytics
    _____________________________________________________________________
    Comment below if you have any doubts
    Please leave a LIKE 👍 and SUBSCRIBE ❤️ to my channel to receive more amazing content in data analytics and data science.
    _____________________________________________________________________
    🏷️ Tags
    sql,
    sql for analytics,
    sql tutorials for beginners,
    portfolio project sql,
    sql portfolio project,
    portfolio project on sql for resume,
    sql interview portfolio project,
    sql interview questions and answers,
    sql for data analytics,
    sql data science,
    interview questions on sql,
    sql hard questions,
    sql problem solving,
    ashutosh,
    ashutosh kumar,
    ashutosh kumar analytics
    🏷️ HashTags
    #sql #portfolio #project

Comments • 37

  • @balaroxx2700
    @balaroxx2700 9 days ago

    Awesome energy level, with a smiling face till the end 🎉 Why varchar for hours?? 34:25

  • @Suzi_06
    @Suzi_06 A year ago +1

    For question no 6.
    select order_id , count(roll_id) as total from customer_orders
    group by order_id
    order by total desc
    limit 1;
    I tried this query instead of using a subquery

  • @vipinc4533
    @vipinc4533 A year ago +5

    Thank you for choosing some of the hard-level problems... keep doing such projects

  • @ayushsahu632
    @ayushsahu632 A year ago +1

    The word "thanks" is too small to appreciate your work.

  • @lavanyavaish101
    @lavanyavaish101 A year ago +2

    Hi, please note at timestamp 44:04: in the last 4 records, how can the pickup date be in the year 2020 when the order date is in 2021?

  • @simranjitkaur6388
    @simranjitkaur6388 7 months ago +1

    Had a blast, brother 🤩🤩🤩🤩 amazing project

  • @ranarajbansh1911
    @ranarajbansh1911 A year ago +2

    Sir... you are dope 😎

  • @Quickknow-it5wi
    @Quickknow-it5wi 2 months ago

    I'm not getting output for problem 10; DATENAME is not available. Please suggest alternative code for this.
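
    DATENAME is SQL Server-specific; MySQL offers DAYNAME(), MONTHNAME(), and DATE_FORMAT() for the same job. A minimal sketch, assuming problem 10 asks for order counts by weekday (the exact problem wording may differ):

    ```sql
    -- MySQL alternative to SQL Server's DATENAME(dw, ...):
    -- DAYNAME() returns the weekday name directly.
    SELECT DAYNAME(order_date) AS order_day,
           COUNT(DISTINCT order_id) AS total_orders
    FROM customer_orders
    GROUP BY DAYNAME(order_date);
    ```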

  • @swapnildas29
    @swapnildas29 A year ago +1

    For the average time in minutes question:
    I am using MySQL Workbench, where DATEDIFF works differently (it only returns whole days). So I used the following command:
    select *, case when M1
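
    The comment above is cut off, but the same idea can be done without CASE arithmetic: MySQL's TIMESTAMPDIFF() returns minutes directly. A minimal sketch, assuming the question asks for the average minutes between order placement and pickup:

    ```sql
    -- TIMESTAMPDIFF(MINUTE, start, end) yields the gap in minutes.
    SELECT d.driver_id,
           AVG(TIMESTAMPDIFF(MINUTE, c.order_date, d.pickup_time)) AS avg_prep_minutes
    FROM customer_orders c
    JOIN driver_order d ON d.order_id = c.order_id
    WHERE d.pickup_time IS NOT NULL
    GROUP BY d.driver_id;
    ```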

  • @karishmasewraj6437
    @karishmasewraj6437 A year ago

    I think you can use ROW_NUMBER to get specific row numbers.
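
    For reference, a minimal ROW_NUMBER() sketch on this dataset (syntax assumes an engine with window functions): it numbers rows within each order, and filtering on the number picks specific rows per group.

    ```sql
    SELECT *
    FROM (
        SELECT co.*,
               ROW_NUMBER() OVER (PARTITION BY order_id
                                  ORDER BY order_date) AS rn
        FROM customer_orders co
    ) t
    WHERE rn = 1;   -- keeps one row per order_id
    ```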

  • @karishmasewraj6437
    @karishmasewraj6437 A year ago +1

    At 47:31, where all the null values are in the cancellation column, the diff comes out as -527030 right down to the last null row in the cancellation column. Why is that?

  • @playtrip7528
    @playtrip7528 10 months ago

    with cte as (
    SELECT *,
    CASE
    WHEN LENGTH(not_include_items) > 0 THEN 'changes'
    WHEN LENGTH(extra_items_included) > 0 AND extra_items_included <> 'NaN' THEN 'changes'
    ELSE 'no changes'
    END AS xx
    FROM customer_orders
    where order_id not in (select order_id from driver_order where length(cancellation) > 5)
    )
    select
    customer_id, xx, count(*)
    from cte
    group by 1, 2;

  • @hhiiuubb
    @hhiiuubb A year ago +2

    Bro part 3? Eagerly waiting

  • @harshshah9006
    @harshshah9006 A year ago +2

    For those who want the dataset:
    drop table if exists driver;
    CREATE TABLE driver(driver_id integer,reg_date date);
    INSERT INTO driver(driver_id,reg_date)
    VALUES (1,'01-01-2021'),
    (2,'01-03-2021'),
    (3,'01-08-2021'),
    (4,'01-15-2021');
    drop table if exists ingredients;
    CREATE TABLE ingredients(ingredients_id integer,ingredients_name varchar(60));
    INSERT INTO ingredients(ingredients_id ,ingredients_name)
    VALUES (1,'BBQ Chicken'),
    (2,'Chilli Sauce'),
    (3,'Chicken'),
    (4,'Cheese'),
    (5,'Kebab'),
    (6,'Mushrooms'),
    (7,'Onions'),
    (8,'Egg'),
    (9,'Peppers'),
    (10,'schezwan sauce'),
    (11,'Tomatoes'),
    (12,'Tomato Sauce');
    drop table if exists rolls;
    CREATE TABLE rolls(roll_id integer,roll_name varchar(30));
    INSERT INTO rolls(roll_id ,roll_name)
    VALUES (1 ,'Non Veg Roll'),
    (2 ,'Veg Roll');
    drop table if exists rolls_recipes;
    CREATE TABLE rolls_recipes(roll_id integer,ingredients varchar(24));
    INSERT INTO rolls_recipes(roll_id ,ingredients)
    VALUES (1,'1,2,3,4,5,6,8,10'),
    (2,'4,6,7,9,11,12');
    drop table if exists driver_order;
    CREATE TABLE driver_order(order_id integer,driver_id integer,pickup_time datetime,distance VARCHAR(7),duration VARCHAR(10),cancellation VARCHAR(23));
    INSERT INTO driver_order(order_id,driver_id,pickup_time,distance,duration,cancellation)
    VALUES(1,1,'01-01-2021 18:15:34','20km','32 minutes',''),
    (2,1,'01-01-2021 19:10:54','20km','27 minutes',''),
    (3,1,'01-03-2021 00:12:37','13.4km','20 mins','NaN'),
    (4,2,'01-04-2021 13:53:03','23.4','40','NaN'),
    (5,3,'01-08-2021 21:10:57','10','15','NaN'),
    (6,3,null,null,null,'Cancellation'),
    (7,2,'01-08-2020 21:30:45','25km','25mins',null),
    (8,2,'01-10-2020 00:15:02','23.4 km','15 minute',null),
    (9,2,null,null,null,'Customer Cancellation'),
    (10,1,'01-11-2020 18:50:20','10km','10minutes',null);
    drop table if exists customer_orders;
    CREATE TABLE customer_orders(order_id integer,customer_id integer,roll_id integer,not_include_items VARCHAR(4),extra_items_included VARCHAR(4),order_date datetime);
    INSERT INTO customer_orders(order_id,customer_id,roll_id,not_include_items,extra_items_included,order_date)
    values (1,101,1,'','','01-01-2021 18:05:02'),
    (2,101,1,'','','01-01-2021 19:00:52'),
    (3,102,1,'','','01-02-2021 23:51:23'),
    (3,102,2,'','NaN','01-02-2021 23:51:23'),
    (4,103,1,'4','','01-04-2021 13:23:46'),
    (4,103,1,'4','','01-04-2021 13:23:46'),
    (4,103,2,'4','','01-04-2021 13:23:46'),
    (5,104,1,null,'1','01-08-2021 21:00:29'),
    (6,101,2,null,null,'01-08-2021 21:03:13'),
    (7,105,2,null,'1','01-08-2021 21:20:29'),
    (8,102,1,null,null,'01-09-2021 23:54:33'),
    (9,103,1,'4','1,5','01-10-2021 11:22:59'),
    (10,104,1,null,null,'01-11-2021 18:34:49'),
    (10,104,1,'2,6','1,4','01-11-2021 18:34:49');
    select * from customer_orders;
    select * from driver_order;
    select * from ingredients;
    select * from driver;
    select * from rolls;
    select * from rolls_recipes;

  • @karishmasewraj6437
    @karishmasewraj6437 A year ago +1

    What does the "a" mean next to all the subqueries?
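
    The "a" is a derived-table alias: most engines require every subquery in a FROM clause to be named so the outer query can refer to it. A minimal illustration:

    ```sql
    SELECT a.order_id, a.total
    FROM (
        SELECT order_id, COUNT(roll_id) AS total
        FROM customer_orders
        GROUP BY order_id
    ) a;   -- without "a", engines such as MySQL raise
           -- "Every derived table must have its own alias"
    ```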

  • @karishmasewraj6437
    @karishmasewraj6437 A year ago +1

    At timestamp 49:47,
    why couldn't the AVG aggregate function have been used?

  • @SK-wp4tm
    @SK-wp4tm A year ago

    Please make videos in Hindi 🙏

  • @rupalidhotre6204
    @rupalidhotre6204 11 months ago +1

    Can you share the entire code?

    • @AshutoshKumaryt
      @AshutoshKumaryt  11 months ago +2

      There's a reason I don't share the code: just copy-pasting it won't help anyone.

  • @husnabanu4370
    @husnabanu4370 A year ago +1

    Wow, problem 9 was interesting to solve! However, I did not understand the purpose of the conversion to varchar; why was it important?? Here is my solution to it:
    select buckt, count(roll_id)
    from (select *, concat(datepart(hh, order_date), '-', datepart(hh, order_date) + 1) as buckt
          from customer_orders) a
    group by buckt;

    • @AshutoshKumaryt
      @AshutoshKumaryt  A year ago

      Some functions can only be applied to varchar columns, just as others can only be applied to int columns.
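
      Concretely, the concatenation builds a text label like '18-19' from the hour, which is an int. A sketch in SQL Server syntax (as used in the video), with the implicit conversion written as an explicit CAST:

      ```sql
      -- In SQL Server, '+' string concatenation needs varchar operands,
      -- so the int hour is cast before building the bucket label.
      SELECT CAST(DATEPART(HOUR, order_date) AS VARCHAR(2)) + '-' +
             CAST(DATEPART(HOUR, order_date) + 1 AS VARCHAR(2)) AS hour_bucket
      FROM customer_orders;
      ```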

  • @tausifraza9528
    @tausifraza9528 A year ago

    Useless without the dataset; I couldn't download it through GitHub.

    • @harshshah9006
      @harshshah9006 A year ago

      It's in copy-paste format.

    • @harshshah9006
      @harshshah9006 A year ago

      Here's the dataset - it's the same script as in my top-level comment above.

  • @swapnildas29
    @swapnildas29 A year ago +1

    Alternate solution to Q.7.
    select customer_id, ch_no_ch, count(order_id) as at_least_1_change
    from
    (select *, case when new_not_include_items = 0 and new_extra_items_included = 0 then 'No Change' else 'Change' end as ch_no_ch
    from
    (select order_id,roll_id, customer_id, case when not_include_items is null or not_include_items = '' then 0 else not_include_items end as new_not_include_items,
    case when extra_items_included is null or extra_items_included = '' or extra_items_included = 'NaN' then 0 else extra_items_included end as new_extra_items_included
    from
    (select c.order_id, c.roll_id, customer_id, not_include_items, extra_items_included, order_cancel_details
    from customer_orders c
    inner join
    (select * from
    (select *, case when cancellation in ('Cancellation', 'Customer Cancellation') then 'Cancelled' else 'Success' end as order_cancel_details
    from driver_order)a
    where order_cancel_details = 'Success')b
    on b.order_id = c.order_id)d)e)f
    group by 1,2;
    Output:
    customer_id ch_no_ch at_least_1_change
    101 No Change 2
    102 No Change 3
    103 Change 3
    104 Change 2
    104 No Change 1
    105 Change 1