%0 Journal Article
%T Revisiting Akaike's Final Prediction Error and the Generalized Cross Validation Criteria in Regression from the Same Perspective: From Least Squares to Ridge Regression and Smoothing Splines
%A Jean Raphael Ndzinga Mvondo
%A Eugène-Patrice Ndong Nguéma
%J Open Journal of Statistics
%P 694-716
%@ 2161-7198
%D 2023
%I Scientific Research Publishing
%R 10.4236/ojs.2023.135033
%X In regression, Akaike's Final Prediction Error (FPE) and the Generalized Cross Validation (GCV) selection criteria, though both aimed at estimating the Mean Squared Prediction Error (MSPE), are usually derived from two quite different perspectives. Here, settling on the most commonly accepted definition of the MSPE as the expectation of the squared prediction error loss, we provide theoretical expressions for it, valid for any linear model (LM) fitter, under random or non-random designs. Specializing these expressions, we derive closed formulas of the MSPE for some of the most popular LM fitters: Ordinary Least Squares (OLS), with or without a full column rank design matrix; and Ordinary and Generalized Ridge regression, the latter embedding smoothing splines fitting. For each of these LM fitters, we then deduce a computable estimate of the MSPE which turns out to coincide with Akaike's FPE. Using a slight variation, we similarly obtain a class of MSPE estimates coinciding with the classical GCV formula for those same LM fitters.
%K Linear Model
%K Mean Squared Prediction Error
%K Final Prediction Error
%K Generalized Cross Validation
%K Least Squares
%K Ridge Regression
%U http://www.scirp.org/journal/PaperInformation.aspx?PaperID=128049
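
As a concrete illustration of the two criteria discussed in the abstract, below is a minimal sketch (not code from the paper) computing the classical GCV score and an FPE-style score for ridge regression via its hat matrix H(λ) = X(XᵀX + λI)⁻¹Xᵀ. The substitution of the parameter count p by the effective degrees of freedom tr(H) in Akaike's FPE is an assumption made here in the spirit of the paper, not a quotation of its formulas.

```python
# Illustrative sketch: classical GCV and an FPE-style criterion
# for ridge regression, using the hat matrix H = X (X'X + lam I)^{-1} X'.
import numpy as np

def ridge_criteria(X, y, lam):
    """Return (GCV, FPE-style score) for ridge regression at penalty lam."""
    n, p = X.shape
    # Hat (smoother) matrix of the ridge fitter.
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    resid = y - H @ y
    rss = float(resid @ resid)
    df = float(np.trace(H))                # effective degrees of freedom
    gcv = n * rss / (n - df) ** 2          # classical GCV formula
    fpe = (rss / n) * (n + df) / (n - df)  # Akaike's FPE with p -> tr(H) (assumed generalization)
    return gcv, fpe

# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
y = X @ np.array([1.0, 0.5, 0.0, -0.5, 2.0]) + rng.standard_normal(50)
for lam in (0.01, 0.1, 1.0, 10.0):
    gcv, fpe = ridge_criteria(X, y, lam)
    print(f"lambda={lam:>5}: GCV={gcv:.4f}  FPE={fpe:.4f}")
```

Both scores are then minimized over λ to select the penalty; the paper's point, in this notation, is that FPE and GCV can be read as two closely related estimates of the same MSPE target.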