Sagar

Research

“Good work is not done by ‘humble’ men.
A man who is always asking ‘Is what I do worth while?’
and ‘Am I the right person to do it?’ will always be ineffective.
He must shut his eyes a little
and think a little more of his subject and himself than they deserve.”

-- G.H. Hardy, A Mathematician's Apology

Research Interests

Tractable Probabilistic Inference in Relational Domains
I am interested in probabilistic models over relational structures. I have worked in settings where the relational structures are defined in a first-order logic language. In such settings, the probabilistic inference task reduces to weighted model counting of a first-order logic formula, which is intractable in general. Hence, my work has focused on identifying tractable fragments of first-order logic where model counting can be performed in polynomial time. I am particularly interested in analytical approaches to weighted model counting, which yield closed-form formulae. My first research contribution was identifying closed-form formulae for model counting in the two-variable fragment of first-order logic (FO$^2$) and its extensions with counting quantifiers.
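To make the counting task concrete, here is a minimal brute-force sketch of weighted model counting for a toy FO$^2$ sentence over a three-element domain. The sentence, predicate names, and weights are illustrative placeholders rather than anything from the papers, and the enumeration is exponential in the domain size, which is exactly what the closed-form formulae avoid.

```python
# Brute-force weighted model counting (WFOMC) for a toy FO^2 sentence.
# Everything here (predicates S and F, the weights, the sentence) is a
# made-up illustration of the task, not the method from the papers.
from itertools import product

DOMAIN = [0, 1, 2]                       # a small finite domain
W_POS = {"S": 1.5, "F": 1.0}             # weight of a true ground atom, per predicate
W_NEG = {"S": 1.0, "F": 1.0}             # weight of a false ground atom

def satisfies(smokes, friends):
    """The sentence: forall x, y. F(x, y) & S(x) -> S(y)."""
    return all(not (friends[(x, y)] and smokes[x]) or smokes[y]
               for x in DOMAIN for y in DOMAIN)

def wfomc():
    """Sum the weights of all models, enumerating every interpretation."""
    total = 0.0
    for s_bits in product([False, True], repeat=len(DOMAIN)):
        smokes = dict(zip(DOMAIN, s_bits))
        for f_bits in product([False, True], repeat=len(DOMAIN) ** 2):
            friends = dict(zip(product(DOMAIN, DOMAIN), f_bits))
            if satisfies(smokes, friends):
                weight = 1.0
                for v in smokes.values():
                    weight *= W_POS["S"] if v else W_NEG["S"]
                for v in friends.values():
                    weight *= W_POS["F"] if v else W_NEG["F"]
                total += weight
    return total

print(wfomc())
```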

More recently, we extended the aforementioned fragment with graph-theoretic constraints, allowing relations in the language to be axiomatized as directed acyclic graphs, trees, forests, or connected graphs. We show that weighted model counting remains tractable under these additional constraints.
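The flavour of these constraints can also be illustrated by brute force. The sketch below counts the interpretations of a single binary relation on a small domain that satisfy an acyclicity or a connectivity axiom; it is a naive enumeration over all relations, whereas the point of the work is that such counts can be obtained in polynomial time.

```python
# Naive model counting under graph-theoretic axioms on a binary relation R:
# "R is a DAG" and "the undirected version of R is connected". Illustration
# only; the enumeration is exponential in the domain size.
from itertools import product

def is_dag(edges, n):
    """Acyclicity check via Kahn's algorithm (repeatedly remove sources)."""
    indeg = [0] * n
    adj = {u: [] for u in range(n)}
    for (u, v) in edges:
        indeg[v] += 1
        adj[u].append(v)
    stack = [v for v in range(n) if indeg[v] == 0]
    removed = 0
    while stack:
        u = stack.pop()
        removed += 1
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                stack.append(v)
    return removed == n            # all vertices removable <=> no directed cycle

def is_connected(edges, n):
    """Connectivity of the undirected version of the relation."""
    adj = {v: set() for v in range(n)}
    for (u, v) in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, frontier = {0}, [0]
    while frontier:
        u = frontier.pop()
        for v in adj[u] - seen:
            seen.add(v)
            frontier.append(v)
    return len(seen) == n

def count_models(constraint, n):
    """Count irreflexive binary relations on n elements satisfying the axiom."""
    pairs = [(u, v) for u in range(n) for v in range(n) if u != v]
    return sum(constraint([p for p, bit in zip(pairs, bits) if bit], n)
               for bits in product([False, True], repeat=len(pairs)))

print(count_models(is_dag, 3))          # 25 labelled DAGs on 3 vertices
print(count_models(is_connected, 3))
```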

Consistency and Generalization of Probabilistic Inference in Relational Domains
This line of work concerns projectivity: whether the marginal distribution that a model induces over a fixed set of domain elements is independent of the size of the domain on which the model is defined. The problem originates in the work of Shalizi and Rinaldo on Exponential Random Graph Models (ERGMs) and was generalized to the probabilistic logic setting by Jaeger and Schulte. Together with Luciano Serafini, I have produced some results on this problem for Markov Logic Networks.

However, projective models, although theoretically interesting, fail to capture many important properties of real-world data, e.g. sparsity and the small-world phenomenon. I am therefore trying to understand whether some notion of generalization can be formalized for learning non-projective models that still perform relatively well when transferred to domains of different sizes.
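A toy version of this distinction can be checked by brute force. The sketch below assumes a hypothetical MLN-style exponential family over undirected graphs with an edge weight and a triangle weight: with the edge weight alone, the marginal probability of a fixed edge is the same for every domain size (a projective family), while adding the triangle weight makes that marginal drift with the domain size (non-projective), in the spirit of the ERGM observations above.

```python
# Brute-force projectivity check for a toy exponential family over graphs:
# P_n(G) is proportional to exp(w_edge * #edges + w_tri * #triangles).
# The weights and the model are illustrative, not from the cited work.
from itertools import combinations, product
from math import exp

def edge_marginal(n, w_edge, w_tri):
    """P(edge {0, 1} is present) under P_n, by full enumeration."""
    pairs = list(combinations(range(n), 2))
    z = present = 0.0
    for bits in product([0, 1], repeat=len(pairs)):
        edges = {p for p, bit in zip(pairs, bits) if bit}
        triangles = sum(1 for a, b, c in combinations(range(n), 3)
                        if (a, b) in edges and (a, c) in edges and (b, c) in edges)
        w = exp(w_edge * len(edges) + w_tri * triangles)
        z += w
        if (0, 1) in edges:
            present += w
    return present / z

# Edge weight only: the marginal is identical for every n (projective).
print([round(edge_marginal(n, 0.5, 0.0), 4) for n in (2, 3, 4)])
# With a triangle weight the marginal depends on n (non-projective).
print([round(edge_marginal(n, 0.5, 0.5), 4) for n in (2, 3, 4)])
```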

Neuro-Symbolic Integration

Future Interests
These are topics I have dabbled in but could not make headway on. I hope to return to (at least some of) them soon: approximate weighted model counting with guarantees, a minimum description length based approach to probabilistic inductive logic programming akin to this, and defining the notion of a graphon and estimating it in the probabilistic logic setting.

Preprints and Publications

Alessandro Daniele, Tommaso Campari, Sagar Malhotra and Luciano Serafini. Deep Symbolic Learning: Discovering Symbols and Rules from Perception. 2022
Under Review
Arxiv

Sagar Malhotra and Luciano Serafini. On Projectivity in Markov Logic Networks
Proceedings of Machine Learning and Knowledge Discovery in Databases: European Conference, ECML PKDD 2022 (Research Track)
Conference Page Arxiv

Sagar Malhotra and Luciano Serafini. Weighted Model Counting in FO$^2$ with Cardinality Constraints and Counting Quantifiers: A Closed Form Formula.
Accepted for oral presentation at the $36^{th}$ AAAI Conference on Artificial Intelligence, 2022.
Slides Poster Arxiv Proceedings

Sagar Malhotra and Luciano Serafini. A Combinatorial Approach to Weighted Model Counting in the Two Variable Fragment with Cardinality Constraints.
Proceedings of the $20^{th}$ International Conference of the Italian Association for Artificial Intelligence, 2021
Proceedings

Workshop Publications

Sagar Malhotra and Luciano Serafini. On Projectivity in Markov Logic Networks
$9^{th}$ International Workshop on Probabilistic Logic Programming, 2022
Preprint [updated] Arxiv

Sagar Malhotra and Luciano Serafini. Weighted Model Counting in FO$^2$ with Cardinality Constraints and Counting Quantifiers: A Closed Form Formula.
$10^{th}$ International Workshop on Statistical Relational AI, 2021 Link

Talks

  • Weighted Model Counting in the Two Variable Fragment. FBK @ AAAI 2022
    Slides: an accessible overview, with a video @ 1:02:35
  • On Projectivity in Markov Logic Networks. FBK 2022
    Slides
  • Weighted Model Counting in First Order Logic. Doc In Progress.
    Dept. of Mathematics. University of Trento. 2022
    Slides: a much better version of the AAAI slides.