
[Computational Complexity] STOC Business Meeting, Part I

      More from Victoria by James Lee.

      Jeanette Wing, the new director of CISE, kicks off the business meeting with an overview of the funding situation for theory at NSF. I think I discern two clear messages. First, NSF is part of the executive branch, so there is one clear way we can affect the budget this November. CISE has requested a 19.5% funding increase for 2009, with a 25.5% increase requested for CCF. Second, the best way to expand the amount of funding for theory is for algorithms people to look for money outside of pure theory venues. The opportunity to do this will hopefully be improved by having Sampath and Jeanette on our side at NSF.

      Dick Karp wins the SIGACT Distinguished Service Award for his tireless dedication to promoting and expanding TCS. Prasad and Harald are given their best paper awards. Then Cynthia gives her report on the program committee and its decision process.

      80 papers accepted out of 325 submitted (that's about 24.6%). Some notable results: Congestion and game theory goes 5/13 (38.5%), and metric embeddings goes 0/13 (0.0%). Before the committee met, they agreed on having a more open mind toward conceptual papers which might be otherwise overlooked because they lack technical depth. The following paragraph was added to the call:

      Papers that broaden the reach of theory, or raise important problems that can benefit from theoretical investigation and analysis, are encouraged.
      This paragraph has been kept for FOCS'08.
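
The acceptance percentages above are simple ratios; a minimal sketch recomputing them, using only the counts quoted in the post:

```python
# Recompute the acceptance rates quoted above (counts taken from the post).
stats = {
    "overall": (80, 325),
    "congestion and game theory": (5, 13),
    "metric embeddings": (0, 13),
}
for area, (accepted, submitted) in stats.items():
    rate = 100 * accepted / submitted
    print(f"{area}: {accepted}/{submitted} = {rate:.1f}%")
# overall: 80/325 = 24.6%
# congestion and game theory: 5/13 = 38.5%
# metric embeddings: 0/13 = 0.0%
```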

      The committee sought to appreciate simplicity as a virtue; no longer "I like the ideas, but the proofs are simple"; instead, "I like the ideas, and the proofs are simple!" I don't know if "They changed the model so as to trivialize the problem" is also replaced by "They changed the model, and now the problem is trivial!" I think responsible analysis of a paper is probably a bit more nuanced.

      Later, Madhu Sudan spoke of instances where a well-known problem had an easy solution, and this prevented a journal or conference from publishing it. This is certainly ridiculous, and I have a hard time believing that it's a frequent occurrence (of course, I have about 1% of Madhu's experience). I've seen examples where the community considered it "embarrassing" that the solution was so simple, but not where the paper itself was derided.

      Personally, I love the beautiful intricacies of hard, technical proofs. It's like a little universe sprung out of the human effort poured into developing a deep understanding of some problem. There are often recurring characters, a unique language, a sense of history, twists and turns, all mounting toward a resounding conclusion that one only fully comprehends after multiple readings and intense effort. But in our field, the beauty of complexity only makes sense in contrast to our search for simplicity. Simplicity is certainly a virtue.

      When I have criticized a paper based on "technical simplicity," it's not because I wish the authors had purposely obfuscated their arguments. Rather, one has to understand the primary goals of a theoretical field: To approach understanding through rigor. What we are trying to understand is computation in all its forms. Toward this end, we often consider idealized versions of problems, and in this respect modeling becomes incredibly important. It comes up in algorithms: What happens if the traveling salesman wants to minimize the average latency, and not the total travel time? And it happens in complexity: What if we allow our constant-depth circuits to have mod gates with composite moduli?

      In both cases, we are not confronting the actual problem we want to solve; real-life instances of salesman problems (e.g. satellite movement) probably involve other practical constraints, and (uniform) poly-size circuits can probably do a lot more than AC_0[m]. So often I have to measure the importance of a new model by how it differs technically from the old one. If simple modifications of the old TSP algorithms suffice for the minimum-latency version, it's not clear that we have learned something new (even though one could argue independently that the min-latency version is practically important!). And if AC_0[m] circuits could be simulated in a simple way by AC_0[p] circuits, then I wouldn't think as highly of a paper proving lower bounds against AC_0[m].

      Maybe we can be a little more appreciative of the subtlety involved in the reviewing process, and agree that "simplicity is a virtue" is a bit too simplistic to be the motto for a program committee.

      --
      Posted By Lance to Computational Complexity at 5/20/2008 03:08:00 PM
