Recursive Computations in Rel
We are excited to announce the latest enhancements to our Relational Knowledge Graph System (RKGS) that substantially improve the performance of certain recursive computations.
As an example, consider the following recursive computation:
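The Rel program from the original figure is not reproduced here. As an illustrative stand-in (an assumption, not the program from the post), a typical recursive computation of this kind is single-source shortest paths over the (min, +) semiring, evaluated by naive fixpoint iteration:

```python
# Illustrative sketch only: naive fixpoint evaluation of
# single-source shortest paths, a common example of a recursive
# computation over the (min, +) semiring.

INF = float("inf")

def naive_sssp(edges, source):
    """Repeatedly apply the full immediate-consequence step
    until the distance relation stops changing (a fixpoint)."""
    dist = {source: 0.0}
    while True:
        new = dict(dist)
        for (u, v), w in edges.items():
            if u in dist:
                cand = dist[u] + w
                if cand < new.get(v, INF):
                    new[v] = cand      # keep the minimum (semiring "+" is min)
        if new == dist:                # fixpoint reached
            return dist
        dist = new

edges = {("a", "b"): 1.0, ("b", "c"): 2.0, ("a", "c"): 5.0}
print(naive_sssp(edges, "a"))   # {'a': 0.0, 'b': 1.0, 'c': 3.0}
```

Each round re-derives *every* distance from scratch, which is exactly the inefficiency the optimization described below removes.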
When certain algebraic conditions are satisfied, this program is automatically rewritten by our optimizer into the following:
In the rewritten version, we exploit specific algebraic properties of F to define and construct the "addition" and "subtraction" operators shown in the figure above, so that computing F in two steps is faster than computing it in one step, as in the original formulation.
More specifically, the improvements come from implementing the Generalized Semi-Naive Evaluation (GSNE) presented in our paper on the Convergence of Datalog over (Pre-) Semirings. We are honored that this research received the Best Paper Award at PODS 2022.
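The core idea behind semi-naive evaluation can be sketched as follows (a hedged Python model of the technique, not RKGS's actual implementation, which operates on Rel programs): each round joins only the *delta*, the tuples whose value improved in the previous round, and the semiring "subtraction" keeps only genuine improvements.

```python
# Sketch of the semi-naive idea on the shortest-path example.
# Only changed tuples (the delta) are joined with the edge relation,
# rather than re-deriving every distance each round.

INF = float("inf")

def seminaive_sssp(edges, source):
    dist = {source: 0.0}
    delta = {source: 0.0}             # tuples that changed last round
    while delta:
        new_delta = {}
        for (u, v), w in edges.items():
            if u in delta:            # join only against the delta
                cand = delta[u] + w
                if cand < dist.get(v, INF):
                    dist[v] = cand
                    new_delta[v] = cand   # "subtraction": keep improvements only
        delta = new_delta
    return dist

edges = {("a", "b"): 1.0, ("b", "c"): 2.0, ("a", "c"): 5.0}
print(seminaive_sssp(edges, "a"))   # {'a': 0.0, 'b': 1.0, 'c': 3.0}
```

The result is the same fixpoint, but the work per round shrinks as the delta shrinks; the generalized form in the paper makes this rewrite valid over (pre-)semirings beyond plain set semantics.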
Experimentally, we measured the speedup factors for two computations on a collection of SNAP datasets: single-source shortest path (SSSP) and all-pairs shortest path (APSP).
Note that for the SSSP computation, the ca-GrQc dataset is too small for the speedup to show: the optimization adds some overhead, and the input must be large enough for the savings to outweigh that overhead before a net speedup appears.