A Common Lisp machine learning library focusing on neural networks, Boltzmann machines, and Gaussian processes with BLAS and CUDA support.
MGL is a comprehensive machine learning library for Common Lisp, covering backpropagation neural networks (feed-forward and recurrent), Boltzmann machines, and Gaussian processes. It emphasizes performance and power over ease of use, leveraging BLAS and CUDA for efficient computation. The library provides low-level control and flexibility for advanced machine learning tasks, with built-in support for dataset utilities, gradient-based optimization, monitoring, and model persistence.
Common Lisp developers and researchers who need high performance and low-level control over neural network architectures and other machine learning models, particularly those working on tasks like regression, classification, and unsupervised learning. It suits users who prioritize computational efficiency and are comfortable with a more complex, flexible API.
Developers choose MGL for its focus on performance via BLAS and CUDA integration, its support for a range of neural network types (feed-forward, recurrent, Boltzmann machines), and its emphasis on low-level control and flexibility. Unlike higher-level libraries, MGL exposes batch processing, gradient-based optimization algorithms, and extensive monitoring tools directly, making it well suited to custom or research-oriented machine learning projects in Common Lisp.
Leverages BLAS and CUDA for efficient matrix operations, with automatic fallback to CPU if GPU is unavailable, ensuring performance across hardware setups.
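MGL's matrix operations go through its MGL-MAT backend. A minimal sketch of the GPU-with-fallback behavior, based on names from the MGL-MAT documentation (`with-cuda*`, `make-mat`, `gemm!`); treat exact lambda lists as assumptions rather than a definitive reference:

```lisp
(ql:quickload :mgl-mat)

;; WITH-CUDA* runs its body on the GPU when CUDA is available and
;; falls back to the BLAS/Lisp backends otherwise, so the same code
;; works across hardware setups.
(mgl-mat:with-cuda* ()
  (let ((a (mgl-mat:make-mat '(2 3) :initial-element 1))
        (b (mgl-mat:make-mat '(3 2) :initial-element 2))
        (c (mgl-mat:make-mat '(2 2))))
    ;; C := 1*A.B + 0*C (BLAS-style GEMM)
    (mgl-mat:gemm! 1 a b 0 c)
    (print c)))
```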
Implements backpropagation networks (feed-forward and recurrent), Boltzmann machines, and Gaussian processes, covering a wide range of machine learning tasks from supervised to unsupervised learning.
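For feed-forward networks, the MGL manual's tutorials build models declaratively from "lumps". A hedged sketch along those lines (constructor names such as `build-fnn`, `->input`, `->activation`, `->relu`, and `->softmax-xe-loss` are taken from the manual, but the exact arguments here are assumptions):

```lisp
(ql:quickload :mgl)

;; Sketch of a small classifier: 2 inputs -> 16 ReLU units ->
;; 2-way softmax with cross-entropy loss.
(defclass tiny-fnn (mgl-bp:fnn) ())

(mgl-bp:build-fnn (:class 'tiny-fnn)
  (inputs (mgl-bp:->input :size 2))
  (hidden (mgl-bp:->relu
           (mgl-bp:->activation inputs :name 'hidden :size 16)))
  (prediction (mgl-bp:->softmax-xe-loss
               (mgl-bp:->activation hidden :name 'prediction :size 2))))
```

The resulting network is then trained with one of MGL's gradient-descent optimizers rather than a built-in `fit` method, which is where the library's low-level flavor shows.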
Provides built-in monitors, measurers, and counters for tracking training statistics, classification metrics, and confusion matrices, facilitating detailed model debugging and analysis.
Includes utilities for data partitioning, cross-validation, bagging, and stratified sampling, enabling robust model evaluation and resampling techniques.
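These utilities live in the MGL-RESAMPLE package. A sketch of k-fold cross-validation (the `cross-validate` function and its `:n-folds` keyword appear in the MGL documentation; the exact arguments passed to the fold function are an assumption):

```lisp
(ql:quickload :mgl)

;; Sketch: 5-fold cross-validation over a toy dataset. The function
;; is called once per fold with the test and training splits.
(mgl-resample:cross-validate
 '(0 1 2 3 4 5 6 7 8 9)
 (lambda (test training)
   ;; In real code: train a model on TRAINING, evaluate on TEST.
   (list :test test :training training))
 :n-folds 5)
```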
Requires manual installation of dependencies like CL-CUDA and MGL-MAT, which are not in Quicklisp, and configuration of BLAS/CUDA libraries, making initial setup cumbersome.
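In practice this means cloning the out-of-Quicklisp dependencies into a place ASDF can find them before loading. A hedged sketch of one common setup (the `local-projects` convention is standard Quicklisp practice; repository locations are assumptions based on the upstream GitHub projects):

```lisp
;; Assuming the dependencies have been cloned into Quicklisp's
;; local-projects directory first, e.g.:
;;   git clone https://github.com/takagi/cl-cuda ~/quicklisp/local-projects/cl-cuda
;;   git clone https://github.com/melisgl/mgl-mat ~/quicklisp/local-projects/mgl-mat
;;   git clone https://github.com/melisgl/mgl ~/quicklisp/local-projects/mgl
;; CUDA support additionally requires the CUDA toolkit and a
;; working BLAS to be installed and discoverable.
(ql:quickload :mgl)
```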
Built for Common Lisp, which has a smaller community and fewer integrations with mainstream ML tools, limiting access to resources, tutorials, and collaborative support.
Emphasizes low-level control and flexibility, requiring users to have a deep understanding of machine learning concepts and Common Lisp, which can be a barrier to entry for rapid development.