We introduce a method based on Kolmogorov complexity to prove lower bounds on communication complexity. The intuition behind our technique is close to that of information-theoretic methods [1,2]. Our goal is to gain a better understanding of how information-theoretic techniques differ from the family of techniques that follow from Linial and Shraibman's work on factorization norms. The latter family extends to quantum communication, which prevents it from being used to prove a gap between the quantum and randomized settings.
We use Kolmogorov complexity in three ways: first, to give a general lower bound in terms of Kolmogorov mutual information; second, to prove an alternative to Yao's minimax principle based on Kolmogorov complexity; and finally, to identify worst-case inputs.
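As a reference point for the first item, the Kolmogorov mutual information between two strings can be written as follows; this is the standard definition from the algorithmic information theory literature, and the paper's exact notation may differ:

```latex
% Standard definition of Kolmogorov mutual information (the paper's
% notation may differ). C(x) denotes the Kolmogorov complexity of x,
% and C(x | y) the complexity of x given y. By symmetry of information,
% the three expressions agree up to O(log) additive terms, written =^+.
I(x : y) \;=\; C(x) - C(x \mid y)
\;\stackrel{+}{=}\; C(y) - C(y \mid x)
\;\stackrel{+}{=}\; C(x) + C(y) - C(x, y)
```

Intuitively, $I(x : y)$ measures how much knowing $y$ helps in compressing $x$, which is what a communication lower bound in terms of mutual information exploits.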
We show that our method implies the rectangle and corruption bounds, which are known to be closely related to the subdistribution bound. We apply our method to the hidden matching problem, a relation introduced to prove an exponential gap between quantum and classical communication. We then show that our method generalizes the VC dimension and shatter coefficient lower bounds. Finally, we compare one-way communication and simultaneous communication in the case of distributional communication complexity and improve the previously known result.
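For concreteness, the hidden matching problem is usually stated as the following relation; this is the standard formulation, and the variant studied in the paper may differ in details:

```latex
% Hidden matching (standard formulation; the paper's variant may differ).
% Alice holds a string x \in \{0,1\}^n (n even), Bob holds a perfect
% matching M on [n]; any edge of M labeled with the parity of its
% endpoints in x is a valid output.
\mathrm{HM}_n = \bigl\{ \, (x, M, (i, j, b)) \;:\; (i, j) \in M,
\ b = x_i \oplus x_j \, \bigr\}
```

Since any edge of $M$ yields a correct answer, $\mathrm{HM}_n$ is a relation rather than a function, which is essential to the known exponential quantum-classical gap.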