Lesson 6 - Theoretical Foundations of Copulas (Sklar's Theorem)

  • Published: 17 Nov 2024

Comments • 13

  • @nickmillican22 · 1 year ago +9

    Glad somebody is filling this gap on YouTube! Much appreciated

    • @KiranKarra87 · 1 year ago +4

      Thank you for your kind comments! I'd be happy to dive into more detail on any of these topics and create videos for them if you have any suggestions of specific material related to copulas that you'd like to see. Feel free to leave comments on this!

  • @swP-z4v · 1 year ago +3

    This has been a great help in my study of copula functions. Thank you very much for sharing.

  • @vahidnikoofard2939 · 2 months ago

    Excellent course! I'm enjoying it a lot

  • @Tyokok · 1 month ago

    Hi Kiran, first, thank you so much for the GREAT lecture and example! Second, if I may, some questions:
    1. A copula is just a more advanced kind of joint distribution that allows more flexibility (e.g. different marginals, configurable parameters, etc.), right?
    2. On how to use copulas: the Gaussian copula is pretty much our traditional multivariate normal distribution, with the correlation dependency information embedded. Using a Gaussian copula means you ASSUME the marginal variables have multivariate-normal correlation dependence, but you can apply marginal distributions other than the normal after you generate random samples from this copula, right?
    3. On how to use other copulas (e.g. Archimedean copulas): you first ASSUME the dependence between the marginal variables is embedded in that particular copula, and then you ASSUME marginal distributions based on your best knowledge and apply them to the random samples you generate from that copula, right?
    4. So all we can use are the existing, well-defined copula libraries (until some new copula is defined), right? Do they cover all kinds of dependence relations, such as square, log, ...?
    Thank you so much for your advice in advance!

    • @KiranKarra87 · 1 month ago +1

      Hey! Great questions.
      1 - I would consider the copula to be a general representation of a joint distribution. The generalization lets you represent any joint distribution as two separate components: a) a dependence structure / coupling function (i.e. the copula), and b) the marginal distributions.
      2 - The Gaussian copula itself is not the same as a multivariate joint distribution, but a Gaussian copula with Gaussian marginals is a multivariate Gaussian joint distribution.
      3 - You can assume a dependence structure if you have prior knowledge about the underlying mechanics of the data. The alternative is to sample data from the process you want to model and then fit a copula to it. The hard part there is getting fully representative data.
      4 - This is a pretty open-ended question, but the properties of a copula are well defined. We may choose to name copulas because they have some useful properties, but you can always use whatever coupling function you want.
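A minimal sketch of point 2 above (a Gaussian copula combined with non-Gaussian marginals via Sklar's theorem). The correlation value, sample size, and the choice of exponential marginals are illustrative assumptions, not from the video:

```python
import numpy as np
from scipy.stats import norm, expon

rng = np.random.default_rng(0)
P = np.array([[1.0, 0.9],
              [0.9, 1.0]])   # Gaussian-copula correlation (assumed value)
n = 1000

# 1) Draw correlated standard normals.
Z = rng.multivariate_normal(mean=[0, 0], cov=P, size=n)

# 2) Push through the normal CDF: uniform margins with Gaussian dependence.
U = norm.cdf(Z)              # a sample from the Gaussian copula

# 3) Apply any marginal inverse CDFs you like (here, exponential).
X = expon.ppf(U, scale=2.0)  # exponential marginals, Gaussian dependence

print(X.shape)               # → (1000, 2)
```

The same `U` could be pushed through any other inverse CDF; the dependence structure stays Gaussian while the marginals change freely.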

    • @Tyokok · 1 month ago

      @KiranKarra87 Hey Kiran, great explanation as well. Really appreciate it!
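Point 3 in the reply above, fitting a copula to sampled data, can also be sketched briefly. The synthetic data-generating process, the sample size, and the choice of a Gaussian copula estimated from pseudo-observations are all assumptions for illustration:

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(1)
# Synthetic "observed" data with an unknown dependence structure.
x = rng.standard_normal(2000)
y = 0.8 * x + 0.6 * rng.standard_normal(2000)
data = np.column_stack((x, y))

# 1) Pseudo-observations: ranks scaled into (0, 1) stand in for the
#    unknown marginal CDFs applied to the data.
n = data.shape[0]
U = rankdata(data, axis=0) / (n + 1)

# 2) Map to normal scores and estimate the Gaussian-copula correlation.
Z = norm.ppf(U)
rho_hat = np.corrcoef(Z.T)[0, 1]
print(rho_hat)
```

As the reply notes, the estimate is only as good as how representative the sampled data is of the true process.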

  • @emmanuelandre2807 · 1 year ago +1

    Hi, there is an issue with how you generated the Gaussian copula from the normal distribution; here is revised code that works better:

    import numpy as np
    from scipy.stats import norm
    import matplotlib.pyplot as plt

    # Correlation matrix and parameters
    P = np.asarray([
        [1, 0.90],
        [0.90, 1]
    ])
    d = P.shape[0]
    n = 500
    alpha = 6

    # Cholesky decomposition for the Gaussian copula
    A = np.linalg.cholesky(P)

    # Generate uniform samples shared by both copulas
    U_samples = np.random.rand(n, d)

    # Gaussian copula:
    # transform the uniform samples to Gaussian using the inverse CDF (norm.ppf)
    Z = norm.ppf(U_samples)
    # apply the correlation by multiplying with the Cholesky factor
    X_Gauss = np.dot(Z, A.T)
    # convert the correlated Gaussian samples back to uniform
    U_Gauss = norm.cdf(X_Gauss)

    # Clayton copula:
    # transform the same U_samples via the conditional inverse CDF
    u = U_samples[:, 0]
    v = ((U_samples[:, 1] / u**(-alpha - 1))**(-alpha / (1 + alpha)) - u**(-alpha) + 1)**(-1 / alpha)
    U_Clayton = np.column_stack((u, v))
    # transform the uniform samples from the Clayton copula to Gaussian
    X_Clayton = norm.ppf(U_Clayton)

    # Visualization
    plt.figure(figsize=(12, 6))
    plt.subplot(1, 2, 1)
    plt.scatter(X_Gauss[:, 0], X_Gauss[:, 1])
    plt.xlabel(r'$\mathcal{N}(0,1)$')
    plt.ylabel(r'$\mathcal{N}(0,1)$')
    plt.title(r'Gaussian Copula $\rho=%0.02f$' % P[0, 1])
    plt.subplot(1, 2, 2)
    plt.scatter(X_Clayton[:, 0], X_Clayton[:, 1])
    plt.xlabel(r'$\mathcal{N}(0,1)$')
    plt.ylabel(r'$\mathcal{N}(0,1)$')
    plt.title(r'Clayton Copula $\alpha=%0.02f$' % alpha)
    plt.tight_layout()
    plt.show()

  • @italocegatta · 1 year ago +2

    thanks! great video

  • @heshaniachinthika5327 · 7 months ago +1

    Can you suggest some references for further reading?

    • @KiranKarra87 · 4 months ago

      Hi - yes, sorry for the late reply as I'm catching up on everything. The main reference for copulas is Roger Nelsen's "An Introduction to Copulas" - you can find it on Amazon or in other bookstores!

  • @toastedsesamebun · 1 year ago +1

    Thank you for the videos!

  • @MegaMatzzz · 1 year ago

    i love u bro this was great!