SYSTEMS AND METHODS FOR ROBUST LARGE-SCALE MACHINE LEARNING
Main authors:
Format: Patent
Language: eng; fre; ger
Subjects:
Online access: Order full text
Abstract: The present disclosure provides a new scalable coordinate descent (SCD) algorithm and associated system for generalized linear models whose convergence behavior is always the same, regardless of how much SCD is scaled out and regardless of the computing environment. This makes SCD highly robust and enables it to scale to massive datasets on low-cost commodity servers. According to one aspect, by using a natural partitioning of parameters into blocks, updates can be performed in parallel a block at a time without compromising convergence. Experimental results on a real advertising dataset are used to demonstrate SCD's cost effectiveness and scalability.
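The abstract describes updating the model parameters a block at a time, with the coordinates inside a block updated in parallel, without compromising convergence. As a rough, hedged illustration of that general idea only (not the patented SCD system, whose partitioning scheme, update rule, and distributed architecture are not specified in this record), the sketch below runs block coordinate gradient descent on a logistic regression model. The contiguous blocking, fixed step size, and single-machine NumPy setup are all illustrative assumptions.

```python
# Illustrative sketch only: block coordinate (gradient) descent for logistic
# regression. Parameters are partitioned into blocks, and one block is updated
# per step; the per-coordinate updates within a block are independent, so they
# could be computed in parallel. This is NOT the patented SCD implementation.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def block_coordinate_descent(X, y, num_blocks=4, num_epochs=10, step=0.1):
    """Update one block of coordinates at a time (assumed blocking and step size)."""
    n, d = X.shape
    w = np.zeros(d)
    # Assumed "natural" partitioning: contiguous blocks of parameter indices.
    blocks = np.array_split(np.arange(d), num_blocks)
    for _ in range(num_epochs):
        for block in blocks:
            # Logistic-loss gradient restricted to this block's coordinates.
            residual = sigmoid(X @ w) - y               # shape (n,)
            grad_block = X[:, block].T @ residual / n   # shape (len(block),)
            # Coordinates within the block are updated independently,
            # so this inner update is embarrassingly parallel.
            w[block] -= step * grad_block
    return w


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))
    true_w = rng.normal(size=8)
    y = (sigmoid(X @ true_w) > 0.5).astype(float)
    w_hat = block_coordinate_descent(X, y)
    print("learned weights:", np.round(w_hat, 2))
```

In this toy setting the blocks are processed sequentially on one machine; the scalability claim in the abstract concerns distributing the per-block work across many workers while keeping the update sequence, and hence the convergence behavior, unchanged.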