Functional Aggregate Queries with Additive Inequalities
This paper gives a more formal development of the tensor-decomposition framework for semantic optimization.
Authors: Mahmoud Abo Khamis, Ryan R. Curtin, Benjamin Moseley, Hung Q. Ngo, Xuan Long Nguyen, Dan Olteanu, Maximilian Schleich. 2020.
In ACM Transactions on Database Systems (TODS ’20). Vol. 45, No. 4, Article 17.
Motivated by fundamental applications in databases and relational machine learning, we formulate and study the problem of answering functional aggregate queries (FAQ) in which some of the input factors are defined by a collection of additive inequalities between variables. We refer to these queries as FAQ-AI for short. We present three applications of our FAQ-AI framework to relational machine learning: k-means clustering, training linear support vector machines, and training models using non-polynomial loss.
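To make the FAQ-AI setting concrete, here is a minimal sketch (with made-up relations and a hypothetical threshold `tau`) of one of the simplest instances: counting the tuples in a two-way join that satisfy an additive inequality between variables from different relations. This naive nested-loop evaluation is only an illustration of the problem statement, not the paper's algorithm, which shows how such queries can be answered far more efficiently.

```python
# Toy FAQ-AI instance: count tuples in the join R(a, b) ⋈ S(b, c)
# satisfying the additive inequality a + c <= tau.
# Relation contents and tau below are illustrative example data.

R = [(1, 10), (2, 10), (5, 20)]   # tuples (a, b)
S = [(10, 3), (10, 7), (20, 1)]   # tuples (b, c)
tau = 6

def faq_ai_count(R, S, tau):
    """Naive evaluation: enumerate the join, then filter by the inequality."""
    return sum(
        1
        for (a, b) in R
        for (b2, c) in S
        if b == b2 and a + c <= tau
    )

print(faq_ai_count(R, S, tau))  # → 3
```

The inequality couples variables from different relations, so it cannot be pushed down as an ordinary per-relation filter; handling such additive constraints efficiently inside aggregate evaluation is the technical core of the paper.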
Read the PDF: Functional Aggregate Queries with Additive Inequalities
Related Posts
Human in the Loop Enrichment of Product Graphs with Probabilistic Soft Logic
Product graphs have emerged as a powerful tool for online retailers to enhance product semantic search, catalog navigation, and recommendations. Their versatility stems from the fact that they can uniformly store and represent different relationships between products, their attributes, concepts, abstractions, and so on, in an actionable form.
Learning Models over Relational Data using Sparse Tensors and Functional Dependencies
Integrated solutions for analytics over relational databases are of great practical importance as they avoid the costly repeated loop data scientists have to deal with on a daily basis: select features from data residing in relational databases using feature extraction queries involving joins, projections, and aggregations; export the training dataset defined by such queries; convert this dataset into the format of an external learning tool; and train the desired model using this tool.