Siddharth Bhaskar’s rudimentary website


This is mostly to serve as a clickable list of things that I’ve written: refereed publications as well as preprints and unpublished notes.

Cons-free programs

Implicit computational complexity in the tradition of Neil Jones; “cons-free” refers to programs that only destruct their input (a.k.a. read-only data). This pair of papers addresses two outstanding open questions from earlier work, viz.: (i) what is cons-free running time? and (ii) how do we extend cons-free programs to capture functions, not just relations? Here we find simple, satisfactory answers to these questions over string data: cons-free running time is related to boolean circuit depth, and to extend cons-free programs from relations to functions, we add a second, write-only string type.

  1. RW-factorizable programs, with Jakob Grue Simonsen. Journal of Functional Programming, vol. 33, June 2023, e5
  2. Subclasses of Ptime interpreted as programming languages, with Cynthia Kop and Jakob Grue Simonsen. Theory of Computing Systems, June 2022.
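For intuition, here is a minimal sketch of the cons-free discipline (a hypothetical Python illustration, not the papers’ formalism): the program may take its input apart with destructors, but never builds new data, so every value it handles is a substructure of the input.

```python
# Cons-free discipline, sketched: only destructors (head/tail) on the
# input string, no data construction. (In Python, s[1:] physically copies,
# but conceptually it names a suffix of the read-only input.)

def head(s):
    return s[0]        # first symbol of a nonempty string

def tail(s):
    return s[1:]       # the remaining suffix

def even_parity(s):
    """Decide whether a bit string contains an even number of 1s,
    inspecting the input one symbol at a time."""
    parity = True
    while s:
        if head(s) == "1":
            parity = not parity
        s = tail(s)
    return parity

assert even_parity("1001") is True
assert even_parity("1011") is False
```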

Complexity theory

The Union Theorem says, in particular, that the complexity class P can be identified with DTIME(t(n)) for some function t that grows barely super-polynomially, but is a priori hard to compute. Here I show that t can be computed in quasi-polynomial time.

  1. A refinement of the Meyer-McCreight Union Theorem. Electronic Colloquium on Computational Complexity TR21-134, 2021.
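In symbols, the specialization of the Union Theorem described above says that P, a union of infinitely many deterministic time classes, collapses to a single one:

```latex
% P is the union of the classes DTIME(n^k); the Union Theorem supplies a
% single recursive time bound t with
\[
  \mathrm{P} \;=\; \bigcup_{k \ge 1} \mathrm{DTIME}(n^k) \;=\; \mathrm{DTIME}(t(n)),
\]
% where t grows faster than every polynomial, but only barely. The
% refinement above shows that t may be chosen computable in
% quasi-polynomial time, i.e. in time 2^{(\log n)^{O(1)}}.
```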

Programming language semantics

A very recent and unpolished note that endows simple while-programs with a transfinite operational and denotational semantics, and proves them equivalent. Transfinite programs can take any ordinal number of steps to run; note that I do not mean “corecursive.”

  1. A transfinite operational semantics for while-programs
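For reference, here is a minimal interpreter for while-programs (a hypothetical Python sketch with an ad hoc syntax, covering ordinary finite runs only; the transfinite extension is the subject of the note above).

```python
# A toy operational semantics for while-programs over a variable store.
# Statements: ('assign', x, f) sets x to f(store); ('while', cond, body)
# iterates body while cond(store) holds. Finite runs only.

def run(prog, store):
    """Execute a list of statements, mutating and returning the store."""
    for stmt in prog:
        if stmt[0] == 'assign':
            _, x, f = stmt
            store[x] = f(store)
        elif stmt[0] == 'while':
            _, cond, body = stmt
            while cond(store):
                run(body, store)
    return store

# x := 5; y := 1; while x > 0 do (y := y * x; x := x - 1)   -- factorial
prog = [
    ('assign', 'x', lambda s: 5),
    ('assign', 'y', lambda s: 1),
    ('while', lambda s: s['x'] > 0, [
        ('assign', 'y', lambda s: s['y'] * s['x']),
        ('assign', 'x', lambda s: s['x'] - 1),
    ]),
]
assert run(prog, {})['y'] == 120
```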

Graph traversals

Three variations on the theme of graph traversals. One is a descriptive complexity-theoretic characterization of L and NL using traversal-invariance, a refinement of order-invariance. One is an analysis of order types of well-ordered traversals of infinite graphs. (I am currently working on a substantial generalization of this work.) And the last is a category-theoretic characterization of breadth-first and depth-first traversals as functors from the appropriately categorified input graph.

  1. Graph traversals as universal constructions, with Robin Kaarsgaard. Mathematical Foundations of Computer Science, 2021.
  2. Algorithmic traversals of infinite graphs, with Jay Kienzle. arXiv:1810.09974
  3. Traversal-invariant characterizations of logarithmic space, with Steven Lindell and Scott Weinstein. arXiv:2006.07067
    (Repeated below under “descriptive complexity”)
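To fix terminology: breadth-first and depth-first traversal differ only in the discipline of the frontier (queue versus stack). A generic Python sketch, not code from any of the papers:

```python
# Generic graph traversal: BFS uses a queue (popleft), DFS a stack (pop).
from collections import deque

def traverse(graph, start, breadth_first=True):
    """Return the order in which vertices are first visited."""
    frontier = deque([start])
    visited, order = {start}, []
    while frontier:
        v = frontier.popleft() if breadth_first else frontier.pop()
        order.append(v)
        for w in graph[v]:
            if w not in visited:
                visited.add(w)
                frontier.append(w)
    return order

g = {1: [2, 3], 2: [4], 3: [4], 4: []}
assert traverse(g, 1, breadth_first=True) == [1, 2, 3, 4]
assert traverse(g, 1, breadth_first=False) == [1, 3, 4, 2]
```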

Least fixed-point logic and descriptive complexity

The traversal-invariance paper is also listed here. The other paper is an analysis of model-theoretic dividing lines (OP, IP, sOP, etc.) in the context of least fixed-point logic over classes of finite structures, with an application to the problem of separating LFP from first-order logic over such classes.

  1. Tameness in Least Fixed-Point Logic and McColm’s Conjecture, with Alex Kruckman. Logical Methods in Computer Science, vol. 17, Jan 2021.
  2. Traversal-invariant characterizations of logarithmic space, with Steven Lindell and Scott Weinstein. arXiv:2006.07067
    (Repeated above under “graph traversals.”)

Model-theoretic ranks

The Sauer-Shelah Lemma states that the shatter function associated with a set system grows at most polynomially or at least exponentially, depending on whether the VC dimension of said system is finite or infinite. Littlestone dimension is another set-system dimension; I discovered an analogous “thicket” shatter function and a dichotomy for this quantity, and proved that the growth rate of this shatter function (“thicket density”) can be identified with a well-known model-theoretic rank. This is all in the JSL paper; the other note formulates a question relating the Littlestone dimension of a set system to the depth of binary decision trees whose queries come from that system.

  1. Thicket Density. The Journal of Symbolic Logic, vol. 86, Mar. 2021, pages 110-127.
  2. A note on Littlestone dimension and decision trees
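For small systems the objects in the Sauer-Shelah setting are easy to compute directly; here is a hypothetical Python sketch of traces, shattering, and the shatter function (an illustration of the classical notions only, not the thicket variants from the paper).

```python
# Traces, shattering, and the shatter function pi(n) of a set system.
from itertools import combinations

def traces(sets, pts):
    """All intersections of member sets with the point set pts."""
    return {frozenset(S & set(pts)) for S in sets}

def shatters(sets, pts):
    """pts is shattered iff all 2^|pts| traces are realized."""
    return len(traces(sets, pts)) == 2 ** len(pts)

def shatter_function(sets, universe, n):
    """pi(n): the maximum number of traces on any n-point subset."""
    return max(len(traces(sets, pts)) for pts in combinations(universe, n))

# Initial segments of {0,...,4}: VC dimension 1, so pi(n) grows linearly.
universe = range(5)
sets = [set(range(k)) for k in range(6)]
assert shatters(sets, (2,)) is True        # a single point is shattered
assert shatters(sets, (1, 3)) is False     # no two points are
assert shatter_function(sets, universe, 3) == 4   # pi(3) = 3 + 1
```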

Mathematical phonology

The well-known class of regular languages on strings lifts to at least three distinct function classes. Here we give descriptive complexity-theoretic characterizations of those classes in terms of simple recursive programs, refining known results that use second-order logic. This formalism has found some practical use among linguists who study sub-regular function classes.

  1. Rational functions as recursive schemes, with Jane Chandlee and Adam Jardine. arXiv:2302.03074
  2. Boolean monadic recursive schemes as a logical characterization of the subsequential functions, with Jane Chandlee, Adam Jardine, and Christopher Oakden. LATA 2020.
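As a toy illustration of the kind of object involved (not the papers’ BMRS formalism), here is a subsequential string function written as a simple recursive program in Python: output is emitted left to right, using only a bounded amount of remembered state.

```python
# A subsequential string function as a simple recursive scheme:
# replace every 'a' that immediately follows a 'b' with 'c'.
# The single boolean parameter is the bounded state of the transducer.

def rewrite(s, after_b=False):
    """Process s left to right; after_b records whether the previous
    symbol was 'b'."""
    if not s:
        return ""
    head, tail = s[0], s[1:]
    out = "c" if (head == "a" and after_b) else head
    return out + rewrite(tail, after_b=(head == "b"))

assert rewrite("aba") == "abc"
assert rewrite("bbab") == "bbcb"
```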

Recursion over abstract structures & program schematology

This comes from an old tradition in computer science of considering programs that operate over data that is either uninterpreted (program schematology) or interpreted by “non-standard” mathematical structures (recursion over abstract structures). I consider both computability- and complexity-theoretic questions; “complexity” here means abstract complexity measures.

The two published papers come out of my thesis. One shows that for certain structures, the question of whether iteration is as expressive as general recursion is provably hard; the other shows that there are structures where iteration and general recursion are extensionally equivalent, but general recursion is much more efficient.

The note shows that a syntactic class of programs, the linear recursive ones, is complete for the property that computation size is bounded above by a polynomial in computation depth.

  1. Recursion versus Tail Recursion over Fp. Journal of Logical and Algebraic Methods in Programming, vol. 94, Jan. 2018, pages 68-90.
  2. A Difference in Complexity Between Recursion and Tail Recursion. Theory of Computing Systems, vol. 60, Mar. 2016, pages 299-313.
  3. Computation size versus depth in linear recursion
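To illustrate the distinction between the two program forms (over ordinary numbers only; in the papers’ abstract-structure setting the efficiency comparison can come out quite differently), here is the same function written general-recursively and tail-recursively in Python:

```python
# One extensional function, two program forms.

def fib_rec(n):
    """General recursion: two self-calls, exponentially many steps."""
    return n if n < 2 else fib_rec(n - 1) + fib_rec(n - 2)

def fib_iter(n):
    """Tail-recursive / iterative form: linearly many steps."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Extensionally equal, despite very different cost behavior.
assert all(fib_rec(n) == fib_iter(n) for n in range(15))
```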