A sub-millisecond full-text search library and multi-tenancy server written in Rust, designed for high performance and real-time indexing.
SeekStorm is an open-source, high-performance full-text search library and multi-tenancy server implemented in Rust. It delivers sub-millisecond query latency, true real-time search, and efficient multi-tenancy, enabling fast, scalable keyword search on commodity hardware without proprietary accelerators or clusters.
Developers and organizations building applications that require fast, scalable, and real-time keyword search, such as search engines, e-commerce platforms, document search systems, and data-intensive services. It is particularly suited for those needing to handle billion-scale indices on a single server with consistent performance.
Developers choose SeekStorm for its algorithmic performance: sub-millisecond latency and high throughput on commodity hardware, with no need for expensive clusters or proprietary accelerators. True real-time search, a sharded architecture, and broad language support make it a reliable, efficient choice for exact keyword matching in latency-sensitive applications.
SeekStorm: vector & lexical search - in-process library & multi-tenancy server, in Rust.
Delivers sub-millisecond query latencies and high throughput through sharded indexing and SIMD acceleration, as demonstrated in benchmarks against other search libraries.
Indexed documents become searchable immediately with negligible performance impact during indexing, avoiding delays from segment merges common in other systems.
Includes a built-in REST API server with API key management and an embedded web UI, enabling easy deployment of search services for multiple tenants or applications.
Supports stemming for 38 languages and Chinese word segmentation with dedicated tokenizers, making it versatile for global applications without extra configuration.
Lacks ready-made client SDKs for languages other than Rust, and features like distributed clustering and vector search are still on the roadmap, limiting immediate adoption in polyglot or advanced setups.
Requires Rust toolchain installation, environment variable configuration (e.g., MASTER_KEY_SECRET), and system-level tuning, which can be a barrier for teams not experienced with Rust or server administration.
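For teams unfamiliar with this kind of setup, the environment step might look like the following sketch. Only the MASTER_KEY_SECRET variable name comes from the notes above; using openssl to generate the value is an illustrative assumption, not a documented requirement.

```shell
# Generate a random 256-bit secret (64 hex characters) and export it under the
# variable name the server setup expects; the server derives API keys from it.
export MASTER_KEY_SECRET="$(openssl rand -hex 32)"
echo "secret length: ${#MASTER_KEY_SECRET}"
```

Once the variable is set, the server binary can be started against an index directory; the exact command-line flags are documented in the project README and are not reproduced here.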
As the project's own faceted search example notes, assembling features from the documentation alone can be tedious, suggesting a learning curve and potential frustration for new users.