Google's TurboQuant paper is rebutted point by point by the authors of the prior algorithm

BlockBeatNews

According to monitoring by 1M AI News, Gao Jianyang, a postdoctoral researcher at ETH Zurich, has published an open letter accusing Google's ICLR 2026 paper TurboQuant of three serious issues in its treatment of his prior work, RaBitQ. Gao is the first author of RaBitQ, which was published in 2024 at SIGMOD, a top database conference. Its core method applies random rotations (a Johnson-Lindenstrauss transform) before quantization and has been rigorously proven to achieve asymptotically optimal error bounds; Gao was invited to present the work at a workshop of FOCS, a top theoretical computer science conference.
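To make the shared idea concrete, here is a minimal sketch of "random rotation before quantization" — not the actual RaBitQ or TurboQuant implementation, just an illustration using a 1-bit sign quantizer and the standard SimHash-style angle estimator. All names and parameters below are assumptions for the demo.

```python
import numpy as np

# Illustrative sketch only (not the RaBitQ or TurboQuant code).
rng = np.random.default_rng(0)
d = 256  # dimensionality, chosen arbitrarily for the demo

# A uniformly random rotation via QR decomposition of a Gaussian matrix:
# a Johnson-Lindenstrauss-style transform that spreads each vector's
# energy evenly across coordinates before quantization.
G = rng.standard_normal((d, d))
Q, _ = np.linalg.qr(G)

def encode(x: np.ndarray) -> np.ndarray:
    """Rotate, then keep only the sign of each coordinate (1 bit/dim)."""
    return np.sign(Q @ x)

# Two correlated unit vectors, so the true inner product is far from 0.
x = rng.standard_normal(d)
x /= np.linalg.norm(x)
z = rng.standard_normal(d)
z /= np.linalg.norm(z)
y = x + 0.3 * z
y /= np.linalg.norm(y)

true_ip = float(x @ y)

# Fraction of coordinates where the sign codes agree; for a random
# rotation this concentrates around 1 - angle(x, y) / pi, so for unit
# vectors cos(pi * (1 - agreement)) estimates the inner product.
agree = float(np.mean(encode(x) == encode(y)))
est_ip = float(np.cos(np.pi * (1.0 - agree)))

print(f"true inner product: {true_ip:+.3f}  1-bit estimate: {est_ip:+.3f}")
```

The rotation is what makes the crude 1-bit code usable: without it, vectors whose energy is concentrated in a few coordinates would be quantized very poorly.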

The three accusations are as follows:

  1. Obscuring method similarity: TurboQuant's core method also relies on random rotations, yet the paper classifies RaBitQ as "grid-based PQ," systematically omitting the direct connection between the two methods. ICLR reviewers independently pointed out that both methods use random projections and requested additional discussion, but the TurboQuant team not only failed to add it but also moved the description of RaBitQ from the main text to the appendix.
  2. Misrepresentation of theoretical results: The paper dismisses RaBitQ's theoretical guarantees as "suboptimal" without evidence, attributing this to "loose analysis." In fact, the extended RaBitQ paper proves that its error bounds match the asymptotically optimal bounds given by Alon and Klartag (FOCS 2017).
  3. Unfair experimental comparison: TurboQuant benchmarked RaBitQ using the team's own Python translation on a single CPU core (with multithreading disabled) while benchmarking its own algorithm on an NVIDIA A100 GPU, making RaBitQ appear several orders of magnitude slower; this setup was not disclosed in the paper.

Gao Jianyang disclosed that Majid Daliri, the second author of TurboQuant, proactively contacted the RaBitQ team in January 2025 to request help debugging the Python version he had translated from the RaBitQ C++ code. In a May 2025 email, Daliri personally confirmed the unfair experimental setup and said he had relayed the RaBitQ team's theoretical clarifications to all co-authors. Nevertheless, none of these issues were corrected throughout submission, review, acceptance, and Google's official large-scale promotion of the paper.

The RaBitQ team has published an open comment on ICLR OpenReview and filed a formal complaint with the ICLR conference chairs and the ethics committee. Amir Zandieh, the first author of TurboQuant, responded that he was willing to correct the second and third issues but refused to add the discussion of method similarity, and agreed to make corrections only after the ICLR 2026 conference. Third-party researcher Jonas Matthias Kübler also independently pointed out on OpenReview that the paper is inconsistent with Google's blog post on both speed benchmarks (PyTorch vs. JAX) and quantization baselines (FP32). After being heavily promoted through Google's official channels, TurboQuant had earlier triggered a collective drop in the stock prices of storage-chip companies such as Micron and Western Digital.
