Applied math

The numerical estimate of matrix rank

The most commonly used function to calculate the matrix rank is probably the one in the MATLAB software package [1]. It is based on the calculation of the singular values of the matrix A, and the number of singular values that are greater than some tolerance defines the rank. The algorithm is simple: r = svd(A); …
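
The excerpt only hints at the computation, so here is a minimal MATLAB sketch of a tolerance-based rank estimate; the tolerance formula is an assumption modelled on the documented default of MATLAB's rank(), not code taken from the post:

    % Estimate the numerical rank of A from its singular values.
    % The tolerance mimics the default used by rank(); treat it as an
    % assumption, since the post does not show the exact rule.
    function r = rank_estimate(A)
        s = svd(A);                        % singular values, largest first
        tol = max(size(A)) * eps(max(s));  % assumed default tolerance
        r = sum(s > tol);                  % count values above the tolerance
    end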


A numerically stable PLS algorithm

During 2011, I tried many different ways of making a PLS algorithm (PLS1) as stable as possible. The goal was to make it numerically as good as the SVD implemented in MATLAB. The pseudocode that solves the PLS1 case for one y-variable is shown below, and it includes the addition of null vectors to …
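
For reference, a minimal textbook PLS1 sketch for a single y-variable follows (MATLAB, deflation-based). It is only a plain baseline, not the numerically stabilized variant with added null vectors that the post describes:

    % Basic PLS1 with deflation: weights W, scores T, loadings P, y-loadings q.
    % X is n-by-m (centered), y is n-by-1 (centered), ncomp is the number of components.
    function [W, T, P, q] = pls1_basic(X, y, ncomp)
        [n, m] = size(X);
        W = zeros(m, ncomp); T = zeros(n, ncomp);
        P = zeros(m, ncomp); q = zeros(ncomp, 1);
        for a = 1:ncomp
            w = X' * y;  w = w / norm(w);   % weight vector
            t = X * w;                      % score vector
            p = X' * t / (t' * t);          % X-loading
            q(a) = y' * t / (t' * t);       % y-loading
            X = X - t * p';                 % deflate X
            y = y - t * q(a);               % deflate y
            W(:, a) = w; T(:, a) = t; P(:, a) = p;
        end
    end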


Numerically good null spaces

There are many methods to obtain null vectors for a given set of vectors. One of the numerically best methods is implemented in MATLAB and is based on singular value decomposition, SVD. Null vectors can be used for many different things; one of them is to remove properties that are orthogonal to another, given property. Such …
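
As a rough illustration of the SVD approach, here is a sketch in the spirit of MATLAB's null(); the tolerance rule is an assumption:

    % Orthonormal basis for the numerical null space of A, computed from the SVD.
    function N = null_svd(A)
        [m, n] = size(A);
        [~, S, V] = svd(A);                  % full SVD so that V is n-by-n
        s = diag(S);
        tol = max(m, n) * eps(max([s; 0]));  % assumed tolerance
        r = sum(s > tol);                    % numerical rank
        N = V(:, r+1:n);                     % null vectors span the rest of V
    end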


A fast Householder bidiagonalization algorithm

Please note that this is not regarding the Lanczos or the PLS bidiagonalization algorithm. This is about the Householder bidiagonalization algorithm. The difference is huge. For example, there are no y-variables in the Householder decomposition. It is a decomposition of X that results in a factorization with a bidiagonal matrix in the middle, similar to …
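
To make the distinction concrete, below is a compact, unoptimized Golub-Kahan-style Householder bidiagonalization in MATLAB, assuming X has at least as many rows as columns. It is only a plain reference version, not the fast algorithm of the post: it factors X = U*B*V' with an upper bidiagonal B and orthogonal U and V.

    % Plain Householder bidiagonalization: X = U*B*V', B upper bidiagonal.
    % Assumes size(X,1) >= size(X,2); no blocking or other speed tricks.
    function [U, B, V] = householder_bidiag(X)
        [m, n] = size(X);
        U = eye(m); V = eye(n); B = X;
        for k = 1:n
            % Left reflector: zero out B(k+1:m, k)
            [v, beta] = house(B(k:m, k));
            B(k:m, k:n) = B(k:m, k:n) - (beta * v) * (v' * B(k:m, k:n));
            U(:, k:m)   = U(:, k:m) - (U(:, k:m) * v) * (beta * v)';
            if k <= n - 2
                % Right reflector: zero out B(k, k+2:n)
                [v, beta] = house(B(k, k+1:n)');
                B(k:m, k+1:n) = B(k:m, k+1:n) - (B(k:m, k+1:n) * v) * (beta * v)';
                V(:, k+1:n)   = V(:, k+1:n) - (V(:, k+1:n) * v) * (beta * v)';
            end
        end
    end

    % Householder vector v and scalar beta such that (I - beta*v*v')*x = alpha*e1.
    function [v, beta] = house(x)
        sigma = norm(x);
        v = x;
        if sigma == 0
            beta = 0; v(1) = 1;
            return;
        end
        if x(1) >= 0
            alpha = -sigma;                  % sign chosen to avoid cancellation
        else
            alpha = sigma;
        end
        v(1) = x(1) - alpha;
        beta = 2 / (v' * v);
    end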


PLS regression — NEW animation

During the conference on Near Infrared Spectroscopy in Cape Town, NIR2011, I gave a keynote in the chemometrics session. I presented a comparison of three different methods for making calibrations, including the LOCAL concept by Shenk and Westerhaus, and following that, I presented another comparison, of different PLS algorithms. As a link between these two …


How about orthogonality? How about differences between PLS algorithms?

When good predictive ability is the one and only goal, it is perhaps less important to think about orthogonality. The reason is simple: we decide to look only at the predicted values, and we do not care about anything else in the models we have created. When talking about differences between different linear algorithms …
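
If orthogonality does matter, it is easy to check numerically. For a score matrix T produced by any of the algorithms (the name T is just an assumption here), the off-diagonal part of T'*T should be at the level of rounding errors:

    % How far the columns of the score matrix T are from mutual orthogonality;
    % a result near machine precision means the scores are orthogonal.
    G = T' * T;
    max_offdiag = max(max(abs(G - diag(diag(G)))))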


Code for analytical separation videos

My old videos about analytical separation processes should be added to this site. The scripts are still good and easy to modify to simulate different scenarios by changing a few parameters. When I first made them, in 1996 and 1999, I saved them in the MPEG-1 format. Nowadays, other formats are more popular, especially MP4 that …

