"Numerous and large dimensional data is now a default setting in modern machine learning (ML). Standard ML algorithms, starting with kernel methods such as support vector machines and graph-based methods like the PageRank algorithm, were however initially designed out of small dimensional intuitions and tend to misbehave, if not completely collapse, when dealing with real-world large datasets. Random matrix theory has recently developed a broad spectrum of tools to help understand this new curse of dimensionality, to help repair or completely recreate the sub-optimal algorithms, and most importantly to provide new intuitions to deal with modern data mining"-- Provided by publisher.