Member
Hi David,
Trust you are doing great!
I came across some old Schweser material which stated that overestimating the regression's fit is a problem that arises when a higher R^2 is the result of an increased number of independent variables in the model, rather than of how well those independent variables actually explain the dependent variable.
Can this problem be overlooked if, through trial and error, you have found that a newly introduced independent variable genuinely improves the explanation of the dependent variable?
Is it right to eliminate from the model those independent variables that explain the dependent variable poorly before using the regression model?
Can "adjusted R^2" also mean correcting the overestimation by eliminating some independent variables and introducing ones that explain the dependent variable well, or should you strictly stick to using the adjusted R^2 statistic itself?
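For reference, the adjusted R^2 the question is asking about is a single formula, not a variable-selection procedure. A minimal sketch (the numbers below are made up purely for illustration) showing how it penalizes extra regressors:

```python
# Adjusted R^2 penalizes R^2 for the number of regressors:
#   adj R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)
# where n = number of observations, k = number of independent variables.

def adjusted_r2(r2: float, n: int, k: int) -> float:
    """Adjusted R^2 for a model with n observations and k regressors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Illustrative (hypothetical) numbers: adding a 6th variable nudges R^2
# up only slightly, yet adjusted R^2 falls -- the extra regressor does
# not "pay for itself".
base = adjusted_r2(0.800, n=30, k=5)   # about 0.7583
extra = adjusted_r2(0.805, n=30, k=6)  # about 0.7541
print(round(base, 4), round(extra, 4))
```

So adjusted R^2 does not itself remove or add variables; it is a statistic you compare across candidate models, and a drop in adjusted R^2 when you add a variable is the signal that the variable may not belong.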