  1. regression - When is R squared negative? - Cross Validated

    With linear regression with no constraints, R² must be positive (or zero) and equals the square of the correlation coefficient, r. A negative R² is only possible with linear …
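
    As a quick check of this identity, the sketch below fits an ordinary least-squares line with an intercept to made-up data and compares R² (computed as 1 − SS_res/SS_tot) with the squared Pearson correlation. NumPy and the simulated data are my own additions, not part of the linked thread.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 2.0 * x + 1.0 + rng.normal(size=100)   # made-up data

    # Ordinary least-squares fit with an intercept
    slope, intercept = np.polyfit(x, y, 1)
    y_hat = slope * x + intercept

    ss_res = np.sum((y - y_hat) ** 2)          # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)       # total sum of squares
    r_squared = 1.0 - ss_res / ss_tot

    r = np.corrcoef(x, y)[0, 1]                # Pearson correlation of x and y
    print(r_squared, r ** 2)                   # the two values agree for this model
    ```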

  2. correlation - What is the difference between linear regression on y ...

    The Pearson correlation coefficient of x and y is the same, whether you compute pearson(x, y) or pearson(y, x). This suggests that doing a linear regression of y given x or x given y should be …
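
    A small NumPy check (simulated data, my own, not from the thread) makes the contrast concrete: the correlation is symmetric in x and y, but the slope of y-on-x and the slope of x-on-y differ, and their product is r² rather than 1.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.normal(size=200)
    y = 0.5 * x + rng.normal(size=200)

    r = np.corrcoef(x, y)[0, 1]            # same value whichever order you use
    slope_yx = np.polyfit(x, y, 1)[0]      # slope of the regression of y on x
    slope_xy = np.polyfit(y, x, 1)[0]      # slope of the regression of x on y

    print(r, np.corrcoef(y, x)[0, 1])      # identical correlations
    print(slope_yx, slope_xy)              # two different regression lines
    print(slope_yx * slope_xy, r ** 2)     # the two slopes multiply to r^2
    ```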

  3. Newest 'regression' Questions - Cross Validated

    Q&A for people interested in statistics, machine learning, data analysis, data mining, and data visualization

  4. regression - Converting standardized betas back to original …

    Where β* are the estimators from the regression run on the standardized variables and β̂ is the same estimator converted back to the original scale, Sy is the sample standard …
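
    For slope coefficients the conversion being described is usually written β̂_j = β*_j · S_y / S_xj, where S_y and S_xj are sample standard deviations. A minimal NumPy sketch on simulated data (not taken from the thread) checking that the back-converted slope matches a fit on the raw variables:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(loc=5.0, scale=3.0, size=300)
    y = 1.7 * x + rng.normal(scale=2.0, size=300)

    # Regression on standardized variables (z-scores)
    zx = (x - x.mean()) / x.std(ddof=1)
    zy = (y - y.mean()) / y.std(ddof=1)
    beta_std = np.polyfit(zx, zy, 1)[0]          # standardized slope

    # Convert back to the original scale: beta_hat = beta_std * (S_y / S_x)
    beta_hat = beta_std * y.std(ddof=1) / x.std(ddof=1)

    beta_raw = np.polyfit(x, y, 1)[0]            # slope from the raw-data fit
    print(beta_hat, beta_raw)                    # the two slopes match
    ```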

  5. Regression with multiple dependent variables? - Cross Validated

    Nov 14, 2010 · Is it possible to have a (multiple) regression equation with two or more dependent variables? Sure, you could run two separate regression equations, one for each DV, but that …
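
    For ordinary least squares, the point estimates from one joint fit with a multi-column response are identical to those from separate per-DV fits; what a genuinely multivariate model adds is a joint error structure for inference. A minimal NumPy sketch with simulated data (my own, not from the thread):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # intercept + 2 predictors
    Y = np.column_stack([                                        # two dependent variables
        X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n),
        X @ np.array([0.5, -0.3, 4.0]) + rng.normal(size=n),
    ])

    # One joint least-squares fit with a two-column response ...
    B_joint, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # ... versus two separate fits, one per dependent variable
    B_sep = np.column_stack(
        [np.linalg.lstsq(X, Y[:, j], rcond=None)[0] for j in range(2)]
    )

    print(np.allclose(B_joint, B_sep))   # True: the point estimates coincide
    ```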

  6. How to derive the standard error of linear regression coefficient

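    The usual derivation starts from Var(β̂) = σ²(XᵀX)⁻¹ and plugs in σ̂² = RSS/(n − p). The sketch below (NumPy, simulated data, not from the linked thread) computes that matrix formula and checks it against the familiar simple-regression special case SE(β̂₁) = σ̂ / √(Σ(xᵢ − x̄)²):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 50
    x = rng.normal(size=n)
    y = 3.0 - 2.0 * x + rng.normal(size=n)

    # OLS fit of y = b0 + b1 * x via the design matrix [1, x]
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - 2)                 # residual variance on n - 2 df

    # General formula: Var(beta_hat) = s^2 * (X'X)^{-1}
    cov_beta = s2 * np.linalg.inv(X.T @ X)
    se_slope_general = np.sqrt(cov_beta[1, 1])

    # Simple-regression special case: SE(b1) = s / sqrt(sum((x - xbar)^2))
    se_slope_simple = np.sqrt(s2) / np.sqrt(np.sum((x - x.mean()) ** 2))

    print(se_slope_general, se_slope_simple)     # identical
    ```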

  7. regression - Trying to understand the fitted vs residual plot?

    Dec 23, 2016 · A good residual vs fitted plot has three characteristics: The residuals "bounce randomly" around the 0 line. This suggests that the assumption that the relationship is linear is …
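
    A minimal sketch of the kind of plot being described, using NumPy and Matplotlib on made-up data that actually satisfies the linearity and constant-variance assumptions (none of this comes from the thread itself):

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(5)
    x = rng.uniform(0, 10, size=150)
    y = 1.0 + 0.8 * x + rng.normal(scale=1.0, size=150)   # linear, constant variance

    slope, intercept = np.polyfit(x, y, 1)
    fitted = slope * x + intercept
    residuals = y - fitted

    # Residuals should "bounce randomly" around the horizontal zero line
    plt.scatter(fitted, residuals, s=15)
    plt.axhline(0.0, color="red", linewidth=1)
    plt.xlabel("Fitted values")
    plt.ylabel("Residuals")
    plt.title("Residuals vs fitted")
    plt.show()
    ```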

  8. regression - What does negative R-squared mean? - Cross Validated

    Nov 24, 2015 · For the top set of points, the red ones, the regression line is the best possible regression line that also passes through the origin. It just happens that that regression line is …
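
    One way this can happen: force the line through the origin on data with a large offset, and the constrained fit can be worse than the horizontal line at the mean of y, so R² = 1 − SS_res/SS_tot goes negative. A NumPy sketch on made-up data (note that some software reports a differently defined R² for no-intercept models):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    x = rng.uniform(0, 1, size=100)
    y = 10.0 - 5.0 * x + rng.normal(scale=0.5, size=100)   # large offset, negative slope

    # Best least-squares line through the origin: y ≈ b * x
    b = np.sum(x * y) / np.sum(x * x)
    y_hat = b * x

    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    print(1.0 - ss_res / ss_tot)   # negative: the origin-constrained line fits
                                   # worse than the horizontal line at mean(y)
    ```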

  9. How should outliers be dealt with in linear regression analysis?

    I've published a method for identifying outliers in nonlinear regression, and it can also be used when fitting a linear model. HJ Motulsky and RE Brown. Detecting outliers when fitting data …

  10. Why is the intercept negative, and what does my regression show?

    In a regression model, a negative intercept implies that the model is, on average, overestimating the y values, so a negative correction in the predicted values is needed.
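
    For context, the intercept is simply the fitted value at x = 0, which may lie outside the observed range of x. A minimal NumPy sketch on made-up data whose true intercept is negative (my own example, not from the thread):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    x = rng.uniform(10, 20, size=100)
    y = -4.0 + 1.5 * x + rng.normal(size=100)   # true intercept is negative

    slope, intercept = np.polyfit(x, y, 1)
    print(intercept)                  # estimate close to -4, up to sampling noise
    print(intercept + slope * 0.0)    # fitted value at x = 0 is the intercept itself
    print(intercept + slope * 15.0)   # predictions inside the observed x range stay positive
    ```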