A Rust performance profiler and monitoring toolkit for real-time analysis of time, memory, and async data flow.
hotpath-rs is a performance profiling and monitoring toolkit for Rust applications. It provides detailed insights into where code spends time and allocates memory, helping developers identify and optimize bottlenecks efficiently. The tool supports both static reporting and live monitoring through a TUI dashboard.
Rust developers building performance-critical applications, especially those using async runtimes like Tokio, who need to identify and optimize time and memory bottlenecks. It is also suitable for teams integrating performance regression detection into CI/CD pipelines.
Developers choose hotpath-rs for its zero-cost design when disabled via feature flags, low-overhead profiling for both synchronous and asynchronous code, and an integrated live TUI dashboard for real-time monitoring. Its unique selling points include a built-in MCP server for AI agent integration and comprehensive monitoring of async data flow, channels, streams, and Tokio runtime metrics.
Rust Performance Profiler & Channels Monitoring Toolkit (TUI, MCP)
Profiling is fully gated behind feature flags, eliminating both compile-time and runtime overhead when disabled ("Zero-cost when disabled", as the README puts it).
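A minimal sketch of how such feature gating is typically wired in Cargo.toml; the feature and dependency names here are assumptions for illustration, not taken from the hotpath README, so check the project's docs for the real setup:

```toml
# Hypothetical Cargo.toml fragment (names illustrative, not verified
# against hotpath's actual documentation).
[dependencies]
hotpath = { version = "*", optional = true }

[features]
# Turning this feature on pulls in the profiler; leaving it off means
# no hotpath code is compiled into the binary at all.
hotpath = ["dep:hotpath"]
```

With this layout, profiling builds opt in via `cargo run --features hotpath`, while default builds carry no instrumentation.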
Measures time and memory allocations with minimal impact, supporting both synchronous and asynchronous code, making it suitable for production use without significant performance degradation.
Instruments channels, streams, futures, and Tokio runtime to monitor message flow, poll counts, and worker thread utilization, providing detailed async data flow metrics as highlighted in the features list.
Built with ratatui, it offers real-time monitoring of performance metrics via an interactive terminal interface, allowing developers to observe data flow and bottlenecks dynamically.
Includes a Model Context Protocol server that enables LLMs to query profiling data in real-time, facilitating AI-assisted performance analysis and unique tooling integration.
The project is under active development, and while core APIs are stable, implementation details like JSON report formats and TUI/MCP internals may change between releases, as noted in the Status section.
Requires annotating functions with #[hotpath::measure] macros, which can be cumbersome in large codebases and means modifying source directly, unlike sampling profilers that attach non-invasively.
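A hedged sketch of what that annotation pattern looks like in practice. The #[hotpath::measure] attribute is named in this entry; wrapping it in cfg_attr so it only applies when a (assumed) `hotpath` Cargo feature is enabled is a common convention, but the exact recommended wiring may differ from hotpath's own docs:

```rust
// Hypothetical usage sketch: the `hotpath` feature name and gating
// style are assumptions, not verified against hotpath's README.

// When the feature is off, cfg_attr strips the attribute entirely,
// so this compiles to a plain, uninstrumented function.
#[cfg_attr(feature = "hotpath", hotpath::measure)]
fn parse_record(line: &str) -> usize {
    // Stand-in for real work: count comma-separated fields.
    line.split(',').count()
}

fn main() {
    let fields = parse_record("a,b,c");
    println!("{fields} fields");
}
```

The cost of this approach is that every function you want measured needs its own attribute, which is the invasiveness the caveat above describes.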
Primarily designed for Rust with Tokio runtime; support for other async runtimes like async-std is not emphasized, potentially limiting its utility in diverse ecosystems.
Instrumentation adds some runtime overhead; although low, it can skew measurements in extremely performance-sensitive scenarios and distort benchmark results.