
[Computational Complexity] Final STOC Post

      James Lee finishes up from Victoria.

      Still at the business meeting (with a 15-item agenda), Cynthia explains that another rule of thumb was in place for STOC’08: If a program committee member said "this is my favorite submission," then the paper was marked for acceptance, barring a severe negative reaction from the other committee members.

Next, Bobby Kleinberg tells us about the "TheoryWiki" project, whose goal is to organize a community-wide effort to improve the presence and quality of TCS on Wikipedia. This seems like a fantastic goal. To rant tangentially while I have the chance: Unfortunately, a large contingent of our community recently contributed to the Springer Encyclopedia of Algorithms. There were various area editors who put together a list of possible articles, and then solicited authors to write them. The area editors were not paid. The authors were not paid. On the other hand, the default was for the copyright on all materials to be handed over to Springer, who will create a huge and potentially useful volume, and then sell it at a very high price. I agreed to write my article, but I complained quite a bit first. I was told that the process was too far along to change anything. I asked Springer if I could put it on Wikipedia. They said no. Finally, I said: I am writing an article. I am putting it in the public domain. I will give you permission to distribute it however you like; take it or leave it. They took it. Unfortunately, most authors did not make similar deals, and a tremendous amount of time and effort has been wasted (or, at the least, vastly underutilized).

      Adam Kalai announces that STOC 2010 will be in Boston (where he and Yael will be joining the newly formed MSR New England). Allan Borodin reads the conceptual manifesto. Despite a lot of dissent expressed in private conversations, Mikkel Thorup is the only one to speak up. He argues that, while new models should be valued, we might also want to appreciate the kind of algorithms that are running on millions of computers around the world right now.


      I’ll end with some talks I enjoyed from the rest of the conference:
      • Chris Umans gives a near-linear time algorithm to compute the composition of two multivariate polynomials over finite fields of small characteristic. This leads to asymptotically faster algorithms for factoring univariate polynomials over finite fields. The composition algorithm is inspired by the Parvaresh-Vardy and Guruswami-Rudra codes. (According to Chris, the “small characteristic” assumption has recently been lifted.)
• Ishai, Kushilevitz, Ostrovsky, and Sahai disprove a well-known conjecture of Mansour, Nisan, and Tiwari by showing that there are 2-universal families of hash functions (from n bits to n bits) computable by circuits of linear size. Expander codes are the primary tool. (Their ultimate goal is more efficient crypto primitives; a toy 2-universal family, for contrast, is sketched after this list.)
      • Adam Kalai gave an entertaining talk on "The myth of the folk theorem," joint work with Borgs, Chayes, Immorlica, Mirrokni, and Papadimitriou. The "Folk Theorem" is a collection of results from game theory which describe the Nash equilibria in repeated games, i.e. where the same one-shot game is played over and over. The "myth" of the folk theorem is that finding Nash equilibria in repeated games is easy, and it's true for two players: There is a poly-time algorithm. The authors show that once the repeated game has three players, though, finding a Nash equilibrium becomes PPAD-complete, just as in the one-shot case. The reduction from 2-NASH is pretty simple, and comes with the fantastically apt name of the "Actor-Critic game" (which, in the talk, was played by actors Jason and Nicole, and critic Lance).
• Shachar Lovett gives an explicit construction of pseudorandom generators against low-degree polynomials over finite fields, improving over the work of Bogdanov and Viola, who did this for d=2 and d=3. Lovett uses the sum of 2^d eps-biased generators (these are pseudorandom against linear functions). His analysis involves the Gowers norms, which measure the bias of random "derivatives" of a function. In very recent work, Viola has shown that one need only sum d eps-biased generators. Viola's work does not use the Gowers norms, and is simply based on the bias of the polynomial to be fooled. (A toy version of the summing step is sketched after this list.)
• Spielman and Srivastava show that every n-vertex graph can be sparsified to a (weighted) subgraph containing only O(n log n) edges, where the sparse version preserves all Rayleigh quotients of the Laplacian up to a (1+eps) multiplicative error. In particular, the weights of all cuts in the sparse version are preserved up to a factor of 1+eps. They also give a near-linear time randomized algorithm to sample the sparse subgraph. The sampling probability of an edge is proportional to its effective resistance in the electrical network defined by the graph. They analyze the sampling procedure using work of Rudelson on central limit theorems for sums of rank-one matrices. (A brute-force version of the sampling procedure is sketched after this list.)
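
For readers who want to see the object in the hashing item above concretely, here is a minimal sketch of the textbook Toeplitz-matrix family over GF(2). To be clear, this is my own toy illustration of what "2-universal from n bits to n bits" means, not the paper's expander-code construction: evaluated naively it takes roughly n^2 gates, and the point of the result is that some family needs only linear-size circuits.

```python
import numpy as np

def random_toeplitz_hash(n, rng):
    """Draw h(x) = T x + b (mod 2), with T a random n-by-n Toeplitz 0/1 matrix."""
    diag = rng.integers(0, 2, size=2 * n - 1)        # one random bit per diagonal of T
    b = rng.integers(0, 2, size=n)
    T = np.array([[diag[n - 1 + j - i] for j in range(n)] for i in range(n)])
    def h(x):                                        # x: length-n 0/1 vector
        return (T @ x + b) % 2
    return h

# Usage: for any fixed x != y, the pair (h(x), h(y)) is uniform over the random
# choice of (T, b) -- pairwise independence, which in particular gives 2-universality.
rng = np.random.default_rng(42)
h = random_toeplitz_hash(8, rng)
x = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y = np.array([0, 1, 1, 0, 0, 1, 0, 1])
print(h(x), h(y))
```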
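
Similarly, here is a toy rendering of the summing step in Lovett's and Viola's constructions. The eps-biased generator below is the classical "powering" construction of Alon, Goldreich, Hastad, and Peralta over GF(2^8); the parameters are purely illustrative, and the claim that XOR-ing 2^d (or, per Viola, d) independent copies fools degree-d polynomials is what their analyses establish, which this sketch does not attempt to reproduce.

```python
import random

K, IRRED = 8, 0x11B             # GF(2^8) with the AES irreducible polynomial

def gf_mul(a, b):
    """Multiply in GF(2^8): carry-less product reduced mod IRRED."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & (1 << K):
            a ^= IRRED
        b >>= 1
    return r

def small_bias_bits(m, x, y):
    """AGHP 'powering' generator: output bit i is <x^i, y> (inner product mod 2)."""
    out, p = [], x
    for _ in range(m):
        out.append(bin(p & y).count("1") & 1)
        p = gf_mul(p, x)
    return out

def xor_of_copies(m, copies, rng=random.Random(0)):
    """Bitwise XOR of 'copies' independent small-bias outputs; Viola's result says
    d copies suffice to fool degree-d polynomials over F_2 (Lovett used 2^d)."""
    acc = [0] * m
    for _ in range(copies):
        bits = small_bias_bits(m, rng.randrange(1 << K), rng.randrange(1 << K))
        acc = [a ^ b for a, b in zip(acc, bits)]
    return acc

print(xor_of_copies(m=24, copies=3))   # candidate PRG output for degree-3 polynomials
```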
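
Finally, to make the Spielman-Srivastava sampling procedure concrete, here is a brute-force sketch. It is not their near-linear-time algorithm: effective resistances are computed exactly through a dense pseudoinverse of the Laplacian (which by itself costs far more than near-linear time), and the number of samples below is an arbitrary toy choice rather than the O(n log n / eps^2) of the paper.

```python
import numpy as np

def laplacian(n, edges, w):
    """Weighted graph Laplacian L = D - A."""
    L = np.zeros((n, n))
    for (u, v), wt in zip(edges, w):
        L[u, u] += wt
        L[v, v] += wt
        L[u, v] -= wt
        L[v, u] -= wt
    return L

def effective_resistances(n, edges, w):
    """R_uv = (e_u - e_v)^T L^+ (e_u - e_v), via a dense pseudoinverse."""
    Lp = np.linalg.pinv(laplacian(n, edges, w))
    R = []
    for u, v in edges:
        chi = np.zeros(n)
        chi[u], chi[v] = 1.0, -1.0
        R.append(chi @ Lp @ chi)
    return np.array(R)

def sparsify(n, edges, w, q, rng=np.random.default_rng(0)):
    """Draw q edges i.i.d. with p_e proportional to w_e * R_e, adding weight
    w_e / (q * p_e) each time an edge is drawn, so that E[L_H] = L_G."""
    w = np.asarray(w, dtype=float)
    p = w * effective_resistances(n, edges, w)
    p /= p.sum()
    H = {}
    for i in rng.choice(len(edges), size=q, p=p):
        H[edges[i]] = H.get(edges[i], 0.0) + w[i] / (q * p[i])
    return H                       # sparse weighted edge list

# Usage on a small random graph: far fewer distinct edges survive.
rng = np.random.default_rng(1)
n = 40
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < 0.4]
w = [1.0] * len(edges)
H = sparsify(n, edges, w, q=6 * n)
print(len(edges), "edges in G ->", len(H), "weighted edges in H")
```
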
      There were many other beautiful results presented. I suggest using the comments section to highlight your favorites.

      --
      Posted By Lance to Computational Complexity at 5/22/2008 05:02:00 AM