A Python library for constructing differentiable convex optimization layers in PyTorch, JAX, and MLX using CVXPY.
CVXPYlayers is a Python library that creates differentiable convex optimization layers for deep learning frameworks like PyTorch, JAX, and MLX. It allows developers to embed convex optimization problems as trainable components within neural networks, solving parametrized problems during forward passes and computing gradients during backward passes. This enables end-to-end learning of systems that require optimization-based reasoning or hard constraints.
Machine learning researchers and engineers working on models that incorporate optimization constraints, control systems, portfolio optimization, physics-informed learning, or any application requiring differentiable optimization layers within neural networks.
Developers choose CVXPYlayers because it provides a clean interface between CVXPY's expressive convex optimization modeling and popular deep learning frameworks, with GPU acceleration support and the ability to differentiate through complex optimization problems including log-log convex programs.
Differentiable convex optimization layers
Enables embedding convex optimization problems as trainable layers in neural networks, with gradients computed for backpropagation, as demonstrated in the PyTorch and JAX usage examples.
Provides functionally equivalent layers for PyTorch, JAX, and MLX, allowing integration into diverse deep learning ecosystems without framework lock-in.
Supports GPU-accelerated solving via Moreau and CuClarabel backends, improving performance for large-scale problems, though setup can be complex.
Returns constraint dual variables (Lagrange multipliers) alongside primal solutions, useful for sensitivity analysis and advanced applications like control systems.
Extends differentiation to log-log convex programs and geometric programs via the gp=True flag, broadening applicability beyond standard convex optimization.
Setting up GPU acceleration with CuClarabel requires installing Julia and multiple additional packages (e.g., juliacall, cupy), a setup that is more cumbersome and error-prone than installing a standard deep learning library.
Only supports convex problems that satisfy CVXPY's disciplined parametrized programming (DPP) ruleset, so non-convex problems that broader machine learning applications may need are out of scope.
Relies on external solvers like SCS or Clarabel, which may have convergence issues or performance overhead compared to native neural network operations, as noted in the solver arguments section.