SEM: Avoid improper solutions!

  • Published: 27 Oct 2024

Comments • 10

  • @kristinaloderer2283 · 4 months ago

    Hi Christian, extremely helpful video, as always - thank you! I have a question regarding negative residual variances in ESEM models run with Mplus. Is it possible to fix a negative residual variance for a given item to zero (or to some non-negative value)? Trying to do so yields an error message from Mplus ("cannot be done with EFA"), but based on the Mplus Forum, I feel like there might be an option for doing so. Using "model constraint" doesn't seem to be possible. I just want to see whether the model runs when I fix this one error variance, as I am getting the message that the model does not converge because of the item in question.
    Thank you!

  • @05Jhubbard · 1 year ago

    Hi Christian, this video was really helpful, along with your others. I've run a five-factor model and a five-factor hierarchical model for compassion (5 factors loading onto an overarching compassion factor). The fit indices look good for both models, although they indicate that the five-factor model is the better fit. When I run the five-factor hierarchical model, I get the common error message about the latent variable covariance matrix not being positive definite / a negative residual variance. The standardized loading of the factor is greater than 1 (1.026), and the residual variance is negative (-0.052, significance = 999.000). I'm assuming this is due to inhomogeneity, with some variables being more highly correlated than others (correlations ranging from .494 to .996)? Or could this be a case of bad luck with sampling error, as mentioned in one of your videos, where standardized loadings close to one cause the residual variance to become negative?
    In this case, should I assume that this model is not a good fit, or should I fix the error variance to a meaningful non-negative, nonzero value?
    Any advice would be greatly appreciated! Do you have an e-mail address I could contact?

  • @zahrashahrokhshahi1967 · 2 years ago

    That was really useful :) thanks. I also have another model with improper results. In my model, everything looks acceptable except that all loadings are non-significant. What could be the potential reasons?

    • @QuantFish · 2 years ago

      Hi Zahra, Thank you for watching. I would say a model in which none of the factor loadings are significant is not typically "acceptable." You should consult with a statistician who is knowledgeable in SEM to figure out what's going on. It is impossible to know without seeing the output and data. Best, Christian Geiser

  • @jiseulsophiaahn796 · 2 years ago

    Thanks for such a helpful practical video. In case I want to fix my residual variance to a "meaningful" positive value informed by reliabilities from past research, are you referring to the reliability of the scale that my problematic item belongs to? Or are you referring to the factor loading of my item on the factor it is purported to measure? Thanks in advance!

    • @QuantFish · 2 years ago

      Hi Jiseul, Thank you for watching! You would have to figure out a reasonable estimate of the reliability of the *item* (not the entire scale!) that applies to your study so that you can compute and fix the item's residual variance accordingly. Best, Christian Geiser
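
As a side note on the arithmetic behind this reply: given a defensible reliability estimate for the item, the value to fix the residual variance to follows directly from classical test theory, where reliability is the share of observed variance due to the true score. A minimal sketch (the numbers below are hypothetical, not from this thread):

```python
def item_residual_variance(observed_variance: float, reliability: float) -> float:
    """Residual (error) variance implied by an assumed item reliability.

    Classical test theory: reliability = true-score variance / observed
    variance, so error variance = (1 - reliability) * observed variance.
    """
    if not 0.0 < reliability <= 1.0:
        raise ValueError("reliability must be in (0, 1]")
    return (1.0 - reliability) * observed_variance


# Hypothetical numbers: observed item variance 2.5, assumed reliability .80.
# The implied residual variance is 0.5, which could then be fixed in the
# model (e.g., y1@0.5; in Mplus syntax).
print(item_residual_variance(2.5, 0.80))
```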

  • @SM-ut3yl · 1 year ago

    Hi Christian, thanks so much for this clear and very helpful video. Any reference to cite for fixing the error variance? I am one of the unlucky ones (I'm referring to what you say around 12'30") and I have a very reliable indicator (three repeated assessments of GPA - they correlate > .85), which forces the error variance to a negative value. Thanks!!

    • @QuantFish · 1 year ago

      Have you tried constraining the loadings to equality (e.g., fixing them all to 1.0)? Since you are using the same measure (GPA) repeatedly, this may be a reasonable/meaningful constraint (implying measurement equivalence/invariance) that often solves the issue of negative residual variances.
      In terms of a reference, I don't recall for sure but perhaps you can find something in
      Chen, F., Bollen, K. A., Paxton, P., Curran, P. J., & Kirby, J. B. (2001). Improper solutions in structural equation models: Causes, consequences, and strategies. Sociological Methods & Research, 29(4), 468-508.
      Best, Christian Geiser
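
For readers wondering what the suggested constraint looks like, here is a sketch in Mplus model syntax (the factor and variable names are hypothetical, not taken from this thread):

```
MODEL:
  ! Fix all loadings of the three repeated GPA assessments to 1,
  ! implying measurement equivalence/invariance across occasions.
  gpa BY gpa1@1 gpa2@1 gpa3@1;
```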

    • @SM-ut3yl · 1 year ago

      @@QuantFish Hi Christian, thank you for your answer! I ran an RI-CLPM in which GPA is one of the three constructs - so I'm interested in within-person fluctuations in GPA. So yes, the factor loadings are fixed to 1 by design. That is not helping. Fixing the error variance of the GPA random intercept does partly solve the problem - no negative variance, but still negative covariances. Any other ideas? I will check the reference. Thanks!!

    • @QuantFish · 1 year ago

      @@SM-ut3yl It is difficult to say without knowing more specifics about your model, but typically, a Heywood case indicates that the model is misspecified (e.g., over- or underparameterized) in some way. In your case, perhaps the model is overparameterized. GPA scores are rather stable across time (not a lot of intra-individual fluctuation). Have you tried an intercept-only (pure trait) model?
      Best, Christian Geiser