=== Information Complexity ===
A powerful approach to the study of distributional complexity is information complexity. The approach was initiated by Bar-Yossef, Jayram, Kumar and Sivakumar<ref>{{cite journal |last1=Bar-Yossef |first1=Ziv |last2=Jayram |first2=T. S. |last3=Kumar |first3=Ravi |last4=Sivakumar |first4=D. |date=2004 |title=An information statistics approach to data stream and communication complexity |url=https://people.seas.harvard.edu/~madhusudan/courses/Spring2016/papers/BJKS.pdf |journal=Journal of Computer and System Sciences |volume=68 |issue=4 |pages=702–732 |doi=10.1016/j.jcss.2003.11.006 |access-date=1 December 2023}}</ref> and codified in the work of Barak, Braverman, Chen and Rao<ref>{{cite journal |last1=Barak |first1=Boaz |author-link1=Boaz Barak |last2=Braverman |first2=Mark |author-link2=Mark Braverman |last3=Chen |first3=Xi |last4=Rao |first4=Anup |date=2013 |title=How to Compress Interactive Communication |url=https://www.boazbarak.org/Papers/directsum.pdf |journal=SIAM Journal on Computing |volume=42 |issue=3 |pages=1327–1363 |doi=10.1137/100811969}}</ref> and of Braverman and Rao.<ref>{{cite journal |last1=Braverman |first1=Mark |author-link1=Mark Braverman |last2=Rao |first2=Anup |title=Information equals amortized communication |year=2014 |journal=IEEE Transactions on Information Theory |volume=60 |issue=10 |pages=6058–6069 |doi=10.1109/TIT.2014.2347282|arxiv=1106.3595 }}</ref>

The (internal) information complexity of a (possibly randomized) protocol {{mvar|R}} with respect to a distribution {{mvar|μ}} is defined as follows. Let <math>(X,Y) \sim \mu</math> be random inputs sampled according to {{mvar|μ}}, and let {{mvar|Π}} be the transcript of {{mvar|R}} when run on the inputs <math>X,Y</math>. The information complexity of the protocol is

:<math> \operatorname{IC}_\mu(R) = I(\Pi;Y|X) + I(\Pi;X|Y), </math>

where {{mvar|I}} denotes [[Information_theory#Mutual_information_(transinformation)|conditional mutual information]].
The first summand measures the amount of information that Alice learns about Bob's input from the transcript, and the second measures the amount that Bob learns about Alice's input. The {{mvar|ε}}-error information complexity of a function {{mvar|f}} with respect to a distribution {{mvar|μ}} is the infimum of the information complexity over all protocols for {{mvar|f}} whose error (with respect to {{mvar|μ}}) is at most {{mvar|ε}}.

Braverman and Rao proved that information equals amortized communication: the communication cost of solving {{mvar|n}} independent copies of {{mvar|f}} is roughly {{mvar|n}} times the information complexity of {{mvar|f}}. This is analogous to the well-known interpretation of [[Entropy_(information_theory)|Shannon entropy]] as the amortized bit-length required to transmit data from a given information source. Their proof uses a technique known as "protocol compression", in which an information-efficient protocol is "compressed" into a communication-efficient protocol.
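As a concrete check of the definition, the following sketch (an illustrative example, not from the literature; the helper name <code>cmi</code> is hypothetical) computes the internal information complexity of the trivial one-message protocol in which Alice sends her uniformly random input bit to Bob:

```python
import math
from itertools import product
from collections import defaultdict

def cmi(joint, a_idx, b_idx, c_idx):
    """Conditional mutual information I(A;B|C) in bits, given a joint
    distribution as a dict {tuple: probability}; the index arguments say
    which tuple positions play the roles of A, B and C."""
    pc = defaultdict(float)
    pac = defaultdict(float)
    pbc = defaultdict(float)
    pabc = defaultdict(float)
    for t, p in joint.items():
        a, b, c = t[a_idx], t[b_idx], t[c_idx]
        pc[c] += p
        pac[(a, c)] += p
        pbc[(b, c)] += p
        pabc[(a, b, c)] += p
    # I(A;B|C) = sum over (a,b,c) of p(a,b,c) * log2( p(a,b,c) p(c) / (p(a,c) p(b,c)) )
    return sum(p * math.log2(p * pc[c] / (pac[(a, c)] * pbc[(b, c)]))
               for (a, b, c), p in pabc.items() if p > 0)

# Toy protocol: X and Y are independent uniform bits, and Alice simply
# sends her bit, so the transcript Pi equals X.  Tuples are (X, Y, Pi).
joint = {(x, y, x): 0.25 for x, y in product([0, 1], repeat=2)}

# IC = I(Pi; Y | X) + I(Pi; X | Y); tuple positions: 0 = X, 1 = Y, 2 = Pi.
ic = cmi(joint, 2, 1, 0) + cmi(joint, 2, 0, 1)
print(ic)  # 1.0: Bob learns Alice's whole bit, Alice learns nothing about Y
```

Here the first summand is zero (the transcript tells Alice nothing about Bob's input) and the second is one bit, so the internal information cost of this protocol is exactly one bit, matching the two summands in the definition above.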
Information complexity techniques have been used to compute the exact (up to first order) communication complexity of set disjointness, which is <math>1.4923\ldots n</math>.<ref>{{cite conference |title=From information to exact communication |last1=Braverman |first1=Mark |author-link1=Mark Braverman |last2=Garg |first2=Ankit |last3=Pankratov |first3=Denis |last4=Weinstein |first4=Omri |date=June 2013 |publisher=ACM |book-title=STOC '13: Proceedings of the forty-fifth annual ACM symposium on Theory of Computing |pages=151–160 |location=Palo Alto, CA |doi=10.1145/2488608.2488628|isbn= 978-1-4503-2029-0}}</ref> They have also been used to analyze extended formulations, proving an essentially optimal lower bound on the complexity of algorithms based on [[linear programming]] that approximately solve the [[clique problem|maximum clique problem]].<ref>{{cite conference |url=https://eccc.weizmann.ac.il/report/2012/131/ |title=An information complexity approach to extended formulations |last1=Braverman |first1=Mark |author-link1=Mark Braverman (mathematician) |last2=Moitra |first2=Ankur |date=1 June 2013 |publisher=ACM |book-title=STOC '13: Proceedings of the forty-fifth annual ACM symposium on Theory of Computing |pages=161–170 |location=Palo Alto, CA |doi=10.1145/2488608.2488629}}</ref> A 2015 survey by Omri Weinstein covers the subject.<ref>{{cite journal |last=Weinstein |first=Omri |date=June 2015 |title=Information Complexity and the Quest for Interactive Compression |url=https://eccc.weizmann.ac.il/report/2015/060/ |journal=ACM SIGACT News |volume=46 |issue=2 |pages=41–64 |doi=10.1145/2789149.2789161 |access-date=1 December 2023}}</ref>