%> @brief Partial Least Squares
%> PLS according to [1]. <b>This PLS works with one response variable only</b> (which is typically the
%> class; this variant is sometimes called PLSDA (PLS Discriminant Analysis)).
%> [1] Hastie, Tibshirani, Friedman. The Elements of Statistical Learning, 2001. Algorithm 3.2, p. 68.
%> <b>Important: X-variables (columns of X) need to be standardized; otherwise the
%> function will give an error.</b>
%> @param X Matrix of "predictor" variables
%> @param Y "Response" variable. <b>This PLS works with one response variable only!</b>
%> @param no_factors Number of factors (latent variables) to calculate.
%> @return <code>[loadings]</code> or <code>[loadings, scores]</code>
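Algorithm 3.2 of [1] (single-response PLS with deflation) can be sketched as follows. This is an illustrative NumPy translation, not the toolbox code; the function and variable names here are my own, and the sketch returns only the score vectors.

```python
import numpy as np

def pls_single_response(X, y, no_factors):
    """Single-response PLS in the spirit of Hastie et al., Algorithm 3.2.

    Assumes the columns of X are already standardized.
    Returns the score vectors z_m, one column per factor.
    """
    X = X.astype(float).copy()
    y = y.astype(float)
    n, p = X.shape
    scores = np.zeros((n, no_factors))
    for m in range(no_factors):
        # Weights: inner products of each (deflated) column of X with y
        phi = X.T @ y
        # m-th PLS direction
        z = X @ phi
        scores[:, m] = z
        # Deflate: make every column of X orthogonal to z
        X = X - np.outer(z, (z @ X) / (z @ z))
    return scores

# Usage: successive score vectors come out mutually orthogonal
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize columns
y = rng.standard_normal(50)
Z = pls_single_response(X, y, 3)
print(np.round(Z.T @ Z, 8))  # off-diagonal entries are ~0
```

The deflation step is what makes each new direction orthogonal to all previous score vectors, which is the key structural property of the algorithm.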
if ~exist('no_factors', 'var')
% Y = data_normalize(Y, 's');
% According to Hastie, the X-variables need to be standardized. This is not a general
% requirement of PLS, but it applies to the version implemented here.
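The file also lists an `assert_standardized(in X, in tolerance)` helper. A minimal NumPy equivalent of such a check could look like the sketch below; it assumes "standardized" means zero mean and unit standard deviation per column within a tolerance, which may differ from the toolbox's actual convention.

```python
import numpy as np

def assert_standardized(X, tolerance=1e-8):
    """Raise if any column of X deviates from zero mean / unit std.

    Assumption: 'standardized' means mean 0 and std 1 per column;
    the actual toolbox helper may use a different convention.
    """
    means = X.mean(axis=0)
    stds = X.std(axis=0)
    if np.any(np.abs(means) > tolerance) or np.any(np.abs(stds - 1) > tolerance):
        raise ValueError('X-variables (columns of X) must be standardized')

# A standardized matrix passes; an unscaled one raises
X = np.random.default_rng(1).standard_normal((20, 3))
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
assert_standardized(Xs)          # OK
try:
    assert_standardized(X * 5)   # wrong scale and offset -> error
except ValueError as e:
    print(e)
```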
p = min(no_factors, nf);
% Krämer, Boulesteix, Tutz: Penalized Partial Least Squares Based on B-Splines Transformations
% -- a nice and clear paper
dcoeff = [1, 10, 10]*no;
for i = 1:length(dcoeff)
    P = P + dcoeff(i)*(D'*D);
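The loop above appears to accumulate a roughness penalty P from difference operators, as in the penalized-PLS paper cited above. A NumPy sketch under that reading follows; it assumes `diff_operator(nf, order)` returns the order-th finite-difference matrix and that each penalty coefficient pairs with one difference order, and it omits the `* no` scaling from the source, whose meaning is not shown here.

```python
import numpy as np

def diff_operator(nf, order):
    """Order-th finite-difference matrix of shape (nf - order, nf).

    D @ v computes the order-th discrete difference of v.
    """
    return np.diff(np.eye(nf), n=order, axis=0)

# Assumed pairing: one difference order per penalty coefficient
nf = 8
dcoeff = np.array([1.0, 10.0, 10.0])  # weights; '* no' scaling omitted
P = np.zeros((nf, nf))
for i, c in enumerate(dcoeff, start=1):
    D = diff_operator(nf, i)
    P += c * (D.T @ D)

# P is symmetric positive semidefinite, as a roughness penalty should be
print(np.allclose(P, P.T), np.min(np.linalg.eigvalsh(P)) > -1e-10)
```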
% Coefficients are correlations between X and Y
% Make the columns of X orthogonal to the newly found z (deflation).
X(:, j) = X(:, j) - z*(z'*X(:, j)/(z'*z));
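The deflation line above projects each column of X onto the orthogonal complement of z. A small NumPy check of that identity, vectorized over the columns j (variable names mine):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((30, 4))
z = rng.standard_normal(30)

# Same update as the MATLAB line, for all columns j at once:
# X(:, j) = X(:, j) - z*(z'*X(:, j)/(z'*z))
X_defl = X - np.outer(z, (z @ X) / (z @ z))

# Every column is now orthogonal to z
print(np.round(z @ X_defl, 10))  # all ~0
```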
% [XL, YL, XS, YS] = plsregress(X, Y, no_factors);
function adjust_unitnorm(in L)
function assert_standardized(in X, in tolerance)
function diff_operator(in nf, in order)
function irootlab_pls(in X, in Y, in no_factors)