Stats question

Discussion in 'Technology' started by Superfluous, Jan 9, 2015.

#1 - Jan 9, 2015 at 1:54 PM
Superfluous (Retired Global Moderator)

Something weird is going on with my OLS regressions. I can't disclose what the data are about, but here's the scenario.

If I run the regression "regress depVar on v1 v2 v3 vOfInterest v4 ..." (depVar = dependent variable, v1...vn = independent vars), I get a coefficient on vOfInterest of roughly 0.4. If I run the same regression on ten mutually exclusive and collectively exhaustive subsets of the data, the coefficients on vOfInterest all exceed 0.4. No observations or independent variables are dropped when I do this, and there is no correlation between the dependent and independent variables. A weighted average of the coefficients from the subset regressions is approximately 0.79, yet the combined regression still yields 0.4. Does anyone know what else can cause this?
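
    To make the pattern concrete, here's a simulated sketch in Python/NumPy. It isn't my data and it collapses everything down to a single regressor, but it reproduces the same behaviour: every one of the ten subset regressions recovers a slope near 0.8, while the pooled regression on the exact same observations reports a slope near 0.3. In the simulation this happens because the subsets differ in both their mean of x and their intercept; all names and numbers below are made up for illustration.

    ```python
    import numpy as np

    # Hypothetical stand-in for the real data: 10 subsets with the same
    # within-subset slope, but subset means of x and subset intercepts
    # move in opposite directions. All names (x, y, group) are made up.
    rng = np.random.default_rng(0)
    n_groups, n_per = 10, 500
    slope_within = 0.8

    x_parts, y_parts, g_parts = [], [], []
    for g in range(n_groups):
        xg = rng.normal(loc=2.0 * g, scale=1.0, size=n_per)    # mean of x rises across subsets
        yg = -1.0 * g + slope_within * xg + rng.normal(scale=1.0, size=n_per)  # intercept falls
        x_parts.append(xg)
        y_parts.append(yg)
        g_parts.append(np.full(n_per, g))

    x = np.concatenate(x_parts)
    y = np.concatenate(y_parts)
    group = np.concatenate(g_parts)

    def ols_slope(xv, yv):
        """Slope from a simple OLS of yv on a constant and xv."""
        X = np.column_stack([np.ones_like(xv), xv])
        beta, *_ = np.linalg.lstsq(X, yv, rcond=None)
        return beta[1]

    pooled = ols_slope(x, y)
    subset_slopes = [ols_slope(x[group == g], y[group == g]) for g in range(n_groups)]
    weights = [np.sum(group == g) for g in range(n_groups)]

    print(f"pooled slope:                  {pooled:.2f}")
    print(f"smallest subset slope:         {min(subset_slopes):.2f}")
    print(f"weighted avg of subset slopes: {np.average(subset_slopes, weights=weights):.2f}")
    ```

    In the sketch, the pooled slope gets pulled toward the slope implied by the subset means rather than the slope inside each subset, which is why it can sit below every single subset estimate. Whether something like that is happening in the real data is exactly what I'm trying to figure out.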
     