Degrees of Freedom: Search Cost and Self-consistency
Model degrees of freedom (df) is a fundamental concept in statistics because it quantifies the flexibility of a fitting procedure and is indispensable in model selection. The df is often intuitively equated with the number of independent variables in the fitting procedure. But for adaptive regressions that perform variable selection (e.g., best subset regression), the model df is larger than the number of selected variables. The excess part has been defined as the search degrees of freedom (sdf) to account for model selection. However, this definition is limited: it does not cover fitting procedures in an augmented space, such as splines and regression trees, and it does not use the same fitting procedure for df and sdf. For example, the lasso's sdf is defined through the relaxed lasso's df instead of the lasso's own df. Here we propose a modified search degrees of freedom (msdf) to directly account for the cost of searching in the original or augmented space. Since many fitting procedures can be characterized by a linear operator, we define the search cost as the effort required to determine such a linear operator. When we construct a linear operator for the lasso via iterative ridge regression, msdf offers a new perspective on its search cost. For some complex procedures, such as multivariate adaptive regression splines (MARS), the search cost needs to be pre-determined to serve as a tuning parameter for the procedure itself, but this pre-determined value might be inaccurate. To investigate such an inaccurate pre-determined search cost, we develop two concepts, nominal df and actual df, and formulate a property named self-consistency, which holds when there is no gap between the nominal df and the actual df.
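To make the "linear operator" idea concrete, the sketch below (not taken from the paper) uses the standard fact that a linear smoother with fit y_hat = S y has df = trace(S), and builds a lasso-like linear operator by iterative ridge regression (a local quadratic approximation of the L1 penalty). The function names, the tuning values lam and eps, and the simulated data are all illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: df = tr(S) for a linear smoother, and a lasso-like
# linear operator obtained by iterative (weighted) ridge regression.
# All names and parameter choices here are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta_true = np.r_[np.ones(3), np.zeros(p - 3)]   # 3 active variables
y = X @ beta_true + rng.normal(scale=0.5, size=n)

def ridge_hat(X, lam):
    """Hat matrix of ridge regression; its trace is the ridge df."""
    return X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)

print("ridge df:", np.trace(ridge_hat(X, lam=5.0)))

def iterative_ridge_operator(X, y, lam, n_iter=200, eps=1e-8):
    """Approximate the lasso by repeatedly solving a weighted ridge problem
    (local quadratic approximation of the L1 penalty) and return the implied
    linear operator S together with the coefficient estimate."""
    # Start from a plain ridge solution.
    beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    for _ in range(n_iter):
        D = np.diag(1.0 / np.maximum(np.abs(beta), eps))
        beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
    D = np.diag(1.0 / np.maximum(np.abs(beta), eps))
    S = X @ np.linalg.solve(X.T @ X + lam * D, X.T)
    return S, beta

S, beta_hat = iterative_ridge_operator(X, y, lam=10.0)
# The trace of this operator is one way to read off a df-like quantity
# for the lasso fit, which is the kind of object the search-cost
# discussion above is concerned with.
print("trace of lasso-like operator:", np.trace(S))
```

In this toy run the trace of the lasso-like operator is close to, but not exactly, the number of selected variables; the gap between such operator-based quantities and naive variable counts is the kind of search cost the abstract refers to.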