
What Do Shannon-type Inequalities, Submodular Width, and Disjunctive Datalog Have to Do with One Another?

RelationalAI

01 January 2017



This paper connects semantic query optimization, physical query optimization, and cost estimation to information theory, with provable bounds.

Authors: Mahmoud Abo Khamis, Hung Q. Ngo, and Dan Suciu. 2017.

In Proceedings of the 36th ACM SIGMOD-SIGACT-SIGAI Symposium on Principles of Database Systems (PODS '17). Invited to the Journal of the ACM.

Recent works on bounding the output size of a conjunctive query with functional dependencies and degree bounds have shown a deep connection between fundamental questions in information theory and database theory. We prove analogous output bounds for disjunctive datalog rules, and answer several open questions regarding the tightness and looseness of these bounds along the way. The bounds are intimately related to Shannon-type information inequalities. We devise the notion of a “proof sequence” of a specific class of Shannon-type information inequalities called “Shannon flow inequalities”. We then show how a proof sequence can be used as symbolic instructions to guide an algorithm called PANDA, which answers disjunctive datalog rules within the size bound predicted. We show that PANDA can be used as a black-box to devise algorithms matching precisely the fractional hypertree width and the submodular width run-times for aggregate and conjunctive queries with functional dependencies and degree bounds.
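To make the flavor of these bounds concrete, here is the canonical example from this line of work (an illustration on our part, not an excerpt from the paper): the triangle query and its AGM output bound, which is certified by a single Shannon-type inequality.

```latex
% Triangle query (illustrative example, not quoted from the paper):
%   Q(x,y,z) :- R(x,y), S(y,z), T(z,x)
% AGM output bound:
\[
  |Q| \;\le\; |R|^{1/2}\, |S|^{1/2}\, |T|^{1/2}.
\]
% Why a Shannon-type inequality certifies it: let $h$ be the entropy
% function of the uniform distribution over the output tuples of $Q$.
% Shearer's lemma (a Shannon-type inequality, derivable from
% submodularity of entropy) gives
\[
  h(XYZ) \;\le\; \tfrac{1}{2}\bigl( h(XY) + h(YZ) + h(ZX) \bigr),
\]
% and each marginal is bounded by the log-size of the input relation
% containing it: $h(XY) \le \log|R|$, $h(YZ) \le \log|S|$,
% $h(ZX) \le \log|T|$. Chaining these inequalities yields
\[
  \log |Q| \;=\; h(XYZ) \;\le\; \tfrac{1}{2}\bigl( \log|R| + \log|S| + \log|T| \bigr).
\]
```

The “Shannon flow inequalities” of the paper generalize this style of argument, and a proof sequence for such an inequality is exactly what PANDA replays as symbolic instructions for an evaluation plan.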
