# Tag Archives: princeton

It’s been many months since I’ve posted here, due mostly to an extreme influx of work from classes and my jobs. But that doesn’t mean I’ve given up on this blog! Rather, I have a huge number of updates, which I will describe only briefly for now. In no particular order:

After working at the Summer School in Theoretical Computer Science at Princeton, I went to one more conference the week after the school closed out. It was the annual RANDOM/APPROX dual conference, also held at Princeton. It was extremely interesting, with riveting talks by a variety of excellent speakers. I particularly enjoyed the keynote address by David Williamson, who spoke of a new generation of approximation algorithms called “lightweight algorithms” that he believes should and will define the future of the field. Having taken a graduate course on approximation algorithms last semester, I was able to follow a good portion of most of the talks.

Next, I have begun my fall semester (actually, by this point it’s more than half done)! I ended up taking two grad courses: Advanced Complexity Theory with Sanjeev Arora and an Advanced Topics course (i.e., a new course that may or may not be kept in future semesters) on Information Theory with new professor Mark Braverman. Both are extremely interesting and have gotten very challenging. Complexity theory started off much harder for me since I hadn’t taken the undergraduate version of the course first, but as we moved on to new material, I feel like I’m keeping up fairly well with the class. Information theory, by contrast, started off introducing really cool ideas that were pretty straightforward to follow, but our most recent topic, communication complexity, is extremely difficult for someone with no prior exposure to it. The class has a couple of PhD candidates who are very familiar with this material, which helps somewhat, but it moves very fast and I need to constantly review. Nonetheless, it’s a ton of fun!

I’m also an undergrad grader for COS 340, Reasoning about Computation. It’s sort of an introduction to theoretical CS through a variety of topics including probability, hashing, algorithmic analysis, approximation algorithms, graph theory, and more. Lots of super interesting material. It’s by no means a trivial course (either to take or to grade), but it’s an excellent jumping-off point for anyone interested in theory. I never took it myself due to scheduling issues, so I’m enjoying following the material along with the class.
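
One of the straightforward-but-cool ideas at the start of an information theory course is Shannon entropy. As a tiny illustration (my own sketch, not anything from the course itself), here is the entropy of a discrete distribution in Python:

```python
from math import log2

def entropy(p):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(q * log2(q) for q in p if q > 0)

# A fair coin carries exactly one bit of uncertainty; a biased coin, less.
entropy([0.5, 0.5])   # 1.0
entropy([0.9, 0.1])   # about 0.469
```

The convention that terms with zero probability contribute nothing is handled by the `if q > 0` filter.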

We also had a Quantum Computing Day mini-workshop at Princeton. I’m hugely interested in this subject although I have little formal background in it, so I decided to attend anyway. The keynote talk was by Scott Aaronson of MIT, a well-known theorist in this field. The talk was, as you might expect, brilliant. I’m pretty intrigued by how matrix determinants and permanents keep popping up in so many different places. Aaronson showed how those concepts apply to systems of fermions and bosons, which, given my limited quantum physics knowledge, was new to me. There was also another great talk by an IBM researcher; this one was more applied, about actually building larger multi-qubit gates while dealing with the immensely complicated problem of maintaining superposition. The talk was very positive, and the speaker showed that IBM has made a lot of progress in pinning down some of the problems. Although many believe QC is just a fad and will never actually prove useful, I’m optimistic to see and follow real progress being made. If for nothing else, for science.
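
To make the determinant/permanent contrast concrete (a throwaway illustration of my own), the two are defined by nearly identical sums over permutations, differing only in a sign; yet the determinant is computable in polynomial time while the permanent is #P-hard. A brute-force version that computes both:

```python
from itertools import permutations

def sign(perm):
    """Parity of a permutation, computed by counting inversions."""
    inv = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm))
              if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def det_and_perm(m):
    """Determinant and permanent of a small square matrix, by the
    definition: sum over all permutations of products of entries,
    with (determinant) and without (permanent) the sign factor."""
    n = len(m)
    det = perm = 0
    for p in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= m[i][p[i]]
        det += sign(p) * prod
        perm += prod
    return det, perm
```

This runs in n! time, of course; the point is only how small the definitional difference is.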

So those are the big news items from the past few months. Hopefully, I will have time to keep this updated more regularly hereon.

EDIT: One more thing! The research that I worked on with a few other undergraduates and my professor from the summer of freshman year has finally been published and presented at IMC 2011 in Germany early in November! The paper is also now online here (I think this might be behind a paywall).

A direct link is also available here.

Posted by on November 23, 2011 in Uncategorized

## Approximation Algorithms Workshop, Princeton 2011

These signs were all over the place!

This past week, I’ve been kept busy by attending a workshop on approximation algorithms, hosted by the Center for Computational Intractability at Princeton University. The particular workshop is this one, its purpose being to recap the last decade of advances in this field and to set the direction for the future. It was a five-day workshop with something like thirty speakers in total, with talks ranging from 35 minutes to an hour. It was the first major workshop I have ever attended! It happened somewhat coincidentally: I had emailed someone from the department about something totally unrelated and got a reply from my professor for the approximation algorithms class I took last semester; he encouraged me to attend the workshop this week if I was interested, and so I did. There were maybe three or four undergraduate students apart from me at the workshop, and it was pretty amusing to have every conversation start with “So which year of the PhD program are you in?”

There were some really quite amazing talks, such as one by Vijay Vazirani, who gave an overview of how much progress has been made on the twenty-one problems he predicted a decade ago would be the major ones in the field (at least 16 of them have either been conclusively solved or seen substantial progress). It was a pretty exciting lecture, and what was really cool was that the name “Vazirani” had come up again and again when we studied approximation algorithms in class, so it was pretty amazing to see him in person. He was also one of the three authors of the famous KVV (Karp–Vazirani–Vazirani) theorem, which gave an optimal competitive ratio for online bipartite matching. That theorem came up over and over again in other talks too.
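
The algorithm behind the KVV result is the remarkably simple Ranking algorithm: fix one random permutation of the offline vertices up front, then match each arriving online vertex to its highest-ranked still-unmatched neighbor. A rough sketch (the function name and input format are my own inventions):

```python
import random

def kvv_ranking(offline, online_neighbors, seed=0):
    """Ranking algorithm for online bipartite matching.

    offline: list of offline vertices, all known in advance.
    online_neighbors: one set of offline neighbors per arriving
    online vertex, in arrival order.  Returns a list of
    (online_index, offline_vertex) matched pairs.
    """
    rng = random.Random(seed)
    # one random ranking of the offline side, chosen before any arrivals
    rank = {v: i for i, v in enumerate(rng.sample(offline, len(offline)))}
    matched = {}   # offline vertex -> online index it was matched to
    matching = []
    for u, nbrs in enumerate(online_neighbors):
        avail = [v for v in nbrs if v not in matched]
        if avail:
            v = min(avail, key=rank.__getitem__)  # highest-ranked neighbor
            matched[v] = u
            matching.append((u, v))
    return matching
```

The KVV theorem says this achieves a 1 − 1/e competitive ratio in expectation, and that no online algorithm can do better.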

Another cool talk was by Subhash Khot, who introduced the Unique Games Conjecture in his seminal paper, “On the Power of Unique 2-Prover 1-Round Games.” He reported on progress relating the UGC to other problems. This conjecture, too, made its way into a good number of other talks. I had only very vaguely heard of the UGC before this, but after hearing so much about how powerful it is, it’s pretty amazing. It ties into a bunch of other theorems (“If the UGC is true, then …”), and all of the results we’ve seen so far (that I’m aware of) derive reasonable conclusions under the assumption of the UGC. That’s more evidence that it’s probably true, and if it were ever proven, it would represent a major step forward in this field (and as one speaker said, if it were false, we’d basically be back at square one).

My professor gave a talk on finding dense subgraphs inside graphs. He reframed the problem as a decision problem: can we distinguish graphs that contain a dense $k$-subgraph (basically a set of $k$ vertices with a lot of edges between them) from those that don’t? It was mostly understandable because I have a tiny bit of background in graph theory, but it is nonetheless a very difficult problem. The talk was really interesting, and it was a shame it was so short.
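
For intuition about what is being asked, here is the underlying search problem at toy scale: exhaustive search over all $k$-subsets, which is exponential in general (my own sketch for illustration, not anything from the talk):

```python
from itertools import combinations

def densest_k_subgraph(vertices, edges, k):
    """Brute-force densest k-subgraph: return the k-subset of vertices
    inducing the most edges, together with that edge count.
    Only feasible for tiny graphs, since it tries all C(n, k) subsets."""
    edge_set = {frozenset(e) for e in edges}
    best, best_count = None, -1
    for subset in combinations(vertices, k):
        count = sum(1 for pair in combinations(subset, 2)
                    if frozenset(pair) in edge_set)
        if count > best_count:
            best, best_count = subset, count
    return best, best_count
```

The hardness results discussed at the workshop are about whether anything substantially better than guessing is possible for the distinguishing version of this problem.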

There were other talks by people from Google Research, Microsoft Research, Bell Labs, and several universities around the world. I really wish I could explain every single talk given over the five days, but unfortunately my memory does not serve so well. The take-home message for me was that this is a truly exciting field, one filled with many, many brilliant people (duh) and being very actively developed. It was an overall awesome experience to be in the company of such great researchers. I just wish I knew more than I did so I could have gained more from attending. But that’s what the future is for!

Posted by on June 18, 2011 in Uncategorized

## Spring Semester Update

Today marks the beginning of Spring Break for us and the midway point of the semester as well. This is most definitely the hardest semester I’ve ever had, and although it’s very stressful at times, I’m really enjoying it a lot. I guess in some ways this semester was a way for me to determine how much I actually loved computer science at its core, to see how much I enjoyed it if I fully immersed myself in it. The results so far are better than what I had expected: even though the great majority of my homework and assignments are about computer science, and even though I am studying or attending class on some CS subject nearly every single day, my interest in it has only increased. I feel very at home when learning about the theoretical aspects of computation and complexity, or knee-deep in the analysis of some algorithm. Even though the workload is only just barely manageable, I’m glad I am taking what I decided to take, and I’d like, to the extent possible, to complete all six courses I am enrolled in this semester. Six courses is a bit on the high side for an engineer here, but I guess it’s more about how interesting you find them. As I’ve said before, it only feels like work if you can actually distinguish it from play. In any case, most of the work that I have fits comfortably in the category of play, so it’s basically an excuse to do something I’d probably be doing, or wishing I could be doing, during that time anyway.

I had a few goals for the year: to take at least one graduate class before the end of the year; to study lesser-known data structures and algorithms; to learn at least one more popular language and one more obscure language; and to get an introduction to quantum computation. Of those, I am in the process of accomplishing the first few. Thanks to my class on approximation algorithms, which is both tough and very enjoyable, I am satisfying the first one. In my Theory of Algorithms course, we are covering quite a few interesting data structures and algorithms. Although these are not exactly “lesser known” in the greater sense of computer science, they’re also not the ones you usually learn in an introductory course. Specifically, I was very happy to see that we covered Fibonacci heaps, rank-pairing heaps, AVL trees, and RAVL trees. These, especially the first, are among the more advanced data structures used for very large data sets in industry. If I recall correctly, I believe Google Maps uses some implementation of F-heaps in storing data. In general, self-adjusting data structures are extremely interesting, and it’s so neat to get a pretty bound on their performance regardless of how convoluted the implementation and procedures may be. As for the other goals, I’m making (slow) progress on picking up Python, and we’re using C++ for our computer graphics course. As for lesser-known languages, our compilers course is taught strictly in ML, a functional language. I have yet to fully wrap my head around those exotic things. More thought is necessary on that front.
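
As a taste of this family of heaps, here is a minimal pairing heap, a simpler cousin of the Fibonacci and rank-pairing heaps mentioned above (my own sketch, not course code). Insert and meld are constant time; delete-min does a two-pass pairing of the children and is O(log n) amortized:

```python
class PairingHeap:
    """A self-adjusting min-heap.  Each node is [key, list_of_children]."""

    def __init__(self):
        self.root = None

    def insert(self, key):
        self.root = self._meld(self.root, [key, []])

    def find_min(self):
        return self.root[0]

    def delete_min(self):
        key = self.root[0]
        self.root = self._merge_pairs(self.root[1])
        return key

    @staticmethod
    def _meld(a, b):
        """Link two heaps: the larger root becomes a child of the smaller."""
        if a is None:
            return b
        if b is None:
            return a
        if a[0] <= b[0]:
            a[1].append(b)
            return a
        b[1].append(a)
        return b

    def _merge_pairs(self, children):
        """Two-pass pairing: meld adjacent pairs left to right, then
        fold the resulting heaps together right to left."""
        if not children:
            return None
        merged = [self._meld(children[i],
                             children[i + 1] if i + 1 < len(children) else None)
                  for i in range(0, len(children), 2)]
        result = None
        for h in reversed(merged):
            result = self._meld(result, h)
        return result
```

What I find neat is exactly what the course emphasizes: the structure does no explicit balancing at all, yet the amortized bounds still come out clean.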

Although I’m satisfied with the coverage of algorithms for trees and graphs so far, I’m now also interested in learning about probabilistic algorithms. I was turned in this direction after seeing the power of probabilistic rounding techniques for LP relaxations of very common problems such as vertex cover. Although they may not always provide an optimal bound, they do sometimes make the code and analysis much simpler and offer better expected performance. And they are everywhere too: from the example I just mentioned to an implementation of MST in expected linear time (!) in the number of edges, randomized and probabilistic algorithms play a huge role. And looking at it from a purer angle, it’s also particularly interesting to see how the nature of random numbers can reveal insight into the structure of computation. I’m starting to develop some ideas of my own; perhaps the better word for them would be “questions”, or maybe just situations and combinations of known elements in a way I haven’t seen anywhere. Hopefully I will soon gain the mathematical and analytical machinery to actually see if these ideas make any sense at all.
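
As a concrete instance of the rounding idea for vertex cover (a sketch of mine that assumes a fractional LP solution is already in hand from a solver; for this particular problem, the deterministic threshold x_v ≥ 1/2 already gives a 2-approximation, with true randomized rounding shining more for problems like set cover):

```python
def round_vertex_cover(edges, x):
    """Round a fractional vertex-cover LP solution x (vertex -> [0, 1],
    with x[u] + x[v] >= 1 on every edge) by taking every vertex whose
    value is at least 1/2.  The result covers all edges, and since each
    chosen vertex had x-value >= 1/2, the cover has size at most
    2 * (LP optimum) <= 2 * OPT."""
    cover = {v for v, val in x.items() if val >= 0.5}
    assert all(u in cover or v in cover for u, v in edges)
    return cover

# On a triangle the LP optimum sets every x to 1/2 (LP value 1.5), so
# rounding takes all three vertices, within 2x of the true optimum of 2.
triangle = [(0, 1), (1, 2), (0, 2)]
cover = round_vertex_cover(triangle, {0: 0.5, 1: 0.5, 2: 0.5})
```

The correctness argument is one line: every edge constraint x[u] + x[v] ≥ 1 forces at least one endpoint to be ≥ 1/2, so that endpoint lands in the cover.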

In other news, my copy of Knuth’s classic Art of Computer Programming, Vol. 1–4A, finally arrived this week! As you may know, the fourth volume was just released after something like 38 years. I’m very excited to check it out. It’s currently sitting, still shrink-wrapped, on the table beside me. During the next week I’ll see if I can make sense of it. On another note, I am very pleasantly surprised by the quality of CLRS, another excellent algorithms book. The analysis is surprisingly clean and the writing is precise, which makes it a pleasure to read. The authors tackle complicated concepts with ease, making it look like it takes hardly any effort, which is impressive. Over winter break, I had started on Michael Sipser’s Introduction to the Theory of Computation, another classic text. It turns out this gave me a very nice edge when we were covering DFAs/NFAs, regular expressions, and context-free grammars in compilers. Although I had had an introduction to these concepts in a previous course, I credit Sipser’s book for teaching them to me in a rigorous sense. Once again, I see the same ease with which Sipser explains these concepts and proofs, and it’s quite impressive. It makes it as painless as possible to cover these proofs with sufficient depth. So I will have to continue with that text, seeing as it coincides very closely with what I am interested in. As for the last goal, it is yet unfulfilled. I need a whole lot more knowledge before I go near that subject. Perhaps next semester or next spring. There’s just so much to learn.
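
The DFA formalism that Sipser develops so rigorously is also what a lexer boils down to in practice: simulating a DFA is just one table lookup per input character. A toy sketch of my own:

```python
def run_dfa(transitions, start, accept, s):
    """Simulate a DFA.  transitions maps (state, symbol) -> state;
    accept is the set of accepting states.  Returns True iff the
    machine ends in an accepting state after reading s."""
    state = start
    for ch in s:
        state = transitions[(state, ch)]
    return state in accept

# DFA accepting binary strings with an even number of 1s:
# state E = "even so far", state O = "odd so far".
even_ones = {('E', '0'): 'E', ('E', '1'): 'O',
             ('O', '0'): 'O', ('O', '1'): 'E'}
run_dfa(even_ones, 'E', {'E'}, '1100')  # True
run_dfa(even_ones, 'E', {'E'}, '1101')  # False
```

The empty string is accepted too, since the start state E is accepting, which matches the formal definition exactly.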

Posted by on March 12, 2011 in Uncategorized

## Spring Semester 2011

Today is Friday (well, it was about an hour ago), and that means this week is done. It also happens to be the first week of my spring semester. And it was a hell of a ride. My current mood is a combination of extreme tiredness, sleepiness, and a great deal of excitement for the weeks to come. [Pretty sure that last statement is grammatically incorrect, but I can’t be bothered to fix it because it conveys my feelings well.] Going into the planning stages for this semester, I sought to take on a major challenge: to outdo anything I’ve done before and push myself beyond any boundaries I had previously thought were the limits of my abilities. Obviously, classes here are really hard, but I see them as springboards rather than roadblocks. And the springboard metaphor is quite fitting: you jump on board and you sink and sink for a bit, very well aware of your weight and limitations and how far they’re dragging you down. But that’s only the first half. Once you reach that critical state when things start clicking, it’s all upward from there. Suddenly your own weight is actually helping you go higher. And guess what? You’ll end up higher than you could’ve ever jumped otherwise.

The moral of the story is obviously this: Avoid falling off on the way down and it’ll be worth it on your way up.

And that’s how it was for a few of my classes last semester too. I took an upper-level math class on numerical analysis and found it pretty hard at first. It was intimidating to be in a room full of people a year or more older than me, with a professor who would walk into the room, give a half-hi gesture, and almost immediately begin lecturing without a break for the entirety of the class. Although I enjoy math a great deal, that in no way implies that I’m particularly good at it. A math TA from last year put it aptly when he said mathematics was the process of banging your head on a table until 1) you passed out, in which case you called it a day, or 2) you eventually made a breakthrough. As convenient and elegant as it may be to think that there’s some magical state beyond which the marginal difficulty of learning the next theorem/definition/proof/algorithm falls off to some small, constant value, I really am starting to doubt that’s the case. The more progress I make in my education on the fronts of both math and computer science, the more I think that what instead happens is that we, either by consciously looking for them or by our minds’ own doing, start seeing the very same patterns and paradigms again and again. Of course this isn’t a new idea, but it hits you hard when you make the realization for yourself. It’s interesting because it’s almost as if my brain is developing some form of auto-complete: give it some properties of a new entity and it “predicts” some of the others that will follow. There are obviously tons of exceptions to this, and that’s where the fun comes in and keeps things interesting enough to continue pursuing (although the first time I heard that matrix multiplication doesn’t commute was jarring to my pristine understanding of the world). And it’s this same notion of “auto-complete”, or intuition, that gives a better grip on the weird world out there and thus provides the illusion that the marginal difficulty is indeed decreasing.
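
That jarring non-commutativity takes only a couple of lines to witness (a throwaway example of mine, not from any course):

```python
def matmul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Two shear matrices: AB and BA disagree, so multiplication does not commute.
A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
matmul(A, B)  # [[2, 1], [1, 1]]
matmul(B, A)  # [[1, 1], [1, 2]]
```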

Another metaphor I particularly like derives from the use of the term “structure” when thinking about a problem: namely, in the context of a phrase like “now, after X years of research, we have a better understanding of the inner structure of [complexity class/research problem/concept/etc.]…”. In my mind, I see each of these concepts not quite as black boxes, but as dark rooms, the size of which is sometimes unknown from our current perspective. And so long as Erdős was just being metaphorical about his Book, there aren’t any lighting fixtures in the room. All we are given is a bunch of little candles. In fact, we do have a few matches, but it’s much harder to light a candle with a match than with an already lit candle. And so we go about setting up tiny little candles all about this room. They each brightly illuminate things in their immediate vicinity, but the falloff of light is pretty drastic at times. And sometimes different candles illuminate the same portion of the room. Ideally, we’d like to use exactly one candle, so perfectly positioned that it lights the entire room, but finding that position is almost definitely at least NP-hard or something… The idea is that there are rooms with candles from other people, in fact from all of the other people in the world. And then there’s your own room, where you have to discover which candles are lit by yourself. You don’t necessarily have to light them yourself, but you have to discover that they do indeed exist. But of course, the room is too large to light up fully. So instead, we attack a certain direction. Perhaps we like what the candles so far have illuminated, or perhaps we think we’re on the edge of a major push forward. Either way, we are forced to narrow down our search. It’s pretty amazing how much brighter my room has gotten in just the past few months. (Baseless prediction: the room is actually cylindrical. Interpret that as you will.)

And all of these thoughts are what follow me around these days. I love a good mystery, and this is exactly that. I am consistently more amazed and in a state of awe than I ever expected. And I find the intricate complexity of the surface lit thus far extremely beautiful. Although theoretical computer science has a bit less of a natural feel (that is to say, closeness to the ways of nature and the universe) than mathematics, it’s still astonishing to see how things fit together. Yes, computer science is a man-made field consisting of arguably arbitrary dichotomies, depending on whom you ask. And yes, this field is still very much in its infancy. But nonetheless, these things reveal something deeper than the definitions we have given them. To put it briefly, there’s still some magic left which lies undiscovered, waiting for us. As frustrating as it is that we do not understand some seemingly elementary relationships between things, it’s exactly that which gives the field its charm. I was sitting in class this week, with the professor writing theorem after theorem on the board, each of which had to do in some way with P vs. NP. And I thought how much more boring the class would have been if we did know the answer. Or how even more boring it would be if we knew $P \neq NP$. As much as I hope it’s resolved soon, it’s the idea of not knowing which is incredible in some strange way. It keeps the magic alive, and I like it.

I considered what courses I wanted to take this semester. There are lots of things I want to learn about in computer science, with time being the only limitation. I decided to go forward with a bold move: taking two very difficult theory classes together. They are both on algorithms: one on the theory of algorithms taught by the great R.E. Tarjan, and the other a graduate course on advanced algorithm design, specifically approximation algorithms. They are fast-moving, and the latter is extremely difficult (I don’t doubt the former will soon become so too!). But I’m not getting off the springboard, no matter how tempting it may be. I will continue to push forward until that pivotal moment when things finally start making sense. I’m learning an insane amount every single day, and it’s amazing that a lot of things I had read about casually in the past are suddenly coming together with a much brighter luminance. It’s hard, and I anticipate lots and lots of banging heads on tables ahead, but it’ll be worthwhile. This is one of those utterly invaluable experiences that I wouldn’t give up for anything.

I started the week inspired and now I am more inspired than I recall ever being. I live in an amazingly intricate and beautiful world and all I want to do is keep lighting candles.

Posted by on February 5, 2011 in Uncategorized

## PUMaC in < 24 Hours!

Wow, crazy. PUMaC is set to begin in under 24 hours! As I’ve mentioned before, PUMaC is Princeton University’s annual mathematics competition for high-schoolers (and brilliant middle-schoolers) from around the country and the world. It’s been an exciting journey thus far, and I’ve had a great time working with the director and the other core PUMaC staff to bring it all together. I haven’t seen the problems yet, but I’ve taken a peek at the Power Round test (already handed out to teams last Saturday) and it looks pretty interesting! The topic this year is in the land of graph theory: specifically, graph minors.

According to the latest numbers, we’re expecting 500+ middle- and high-school students on campus tomorrow! It’s pretty exciting, recalling how much I looked forward to PUMaC, HMMT, and others when I was in high school. Regardless of how fun it is to organize everything, nothing beats actually competing. 🙂 I hope everyone has a great time tomorrow and we see some fierce competition for the top spots!

Posted by on November 19, 2010 in Uncategorized


## Congrats to Prof. Elon Lindenstrauss!

I just wanted to say congratulations to Professor Elon Lindenstrauss, my professor for MAT 215, Analysis of a Single Variable, at Princeton.

As of yesterday morning, he is one of the recipients of the 2010 Fields Medal, and the first Israeli mathematician to receive it!

It is truly an honor to have been able to be taught by such a great mathematician.

Posted by on August 20, 2010 in Uncategorized

## PUMaC Site is Live!

Just wanted to point out that the Princeton University Math Club site is up and running! It’s at a temporary location right now, but the content should not change much when it is migrated over to the actual domain. The current URL is: http://www.alexogier.com/pumac/wp/. Please check it out and let me know your suggestions. Thanks!