# Tag Archives: thoughts

## Spring Semester Update

Today marks the beginning of Spring Break for us and the midway point of the semester as well. This is most definitely the hardest semester I’ve had so far, and although it’s very stressful at times, I’m really enjoying it. I guess in some ways this semester was a way for me to determine how much I actually loved computer science at its core – to see how much I enjoyed it if I fully immersed myself in it. The results so far are better than what I had expected: even though the great majority of my homework and assignments are about computer science, and even though I am studying or attending class on some CS subject nearly every single day, my interest in it has only increased. I feel very much at home when learning about the theoretical aspects of computation and complexity, or knee-deep in the analysis of some algorithm. Even though the workload is only just barely manageable, I’m glad I’m taking what I decided to take, and I’d like, to the extent possible, to complete all six courses I am enrolled in this semester. Six courses is a bit on the high side for an engineer here, but I guess it’s more about how interesting you find them. As I’ve said before, it only feels like work if you can actually distinguish it from play. In any case, most of the work that I have fits comfortably in the category of play, so it’s basically an excuse to do something I’d probably be doing – or wishing I could be doing – during that time anyway.

I had a few goals for the year: to take at least one graduate class before the end of the year; to study lesser-known data structures and algorithms; to learn at least one more popular language and one more obscure language; and to get an introduction to quantum computation. Of those, I am in the process of accomplishing the first few. Thanks to my class on approximation algorithms, which is both tough and very enjoyable, I am satisfying the first one. In my Theory of Algorithms course, we are covering quite a few interesting data structures and algorithms. Although these are not exactly “lesser known” in the greater sense of computer science, they’re also not the ones you usually learn in an introductory course. Specifically, I was very happy to see that we covered Fibonacci heaps, rank-pairing heaps, AVL trees, and RAVL trees. These, especially the first, are among the more advanced data structures used for very large data sets in industry. If I recall correctly, I believe Google Maps uses some implementation of F-heaps in storing data. In general, self-adjusting data structures are extremely interesting, and it’s so neat to get a pretty bound on their performance regardless of how convoluted the implementation and procedures may be. As for the other goals, I’m making (slow) progress on picking up Python, and we’re using C++ for our computer graphics course. And as far as lesser-known languages go, our compilers course is taught strictly in ML – a functional language. I have yet to fully wrap my head around those exotic things. More thought is necessary on that front.
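Since AVL trees came up in class, here’s a minimal sketch of the core idea – my own illustration in Python, not course code: after every insertion, a node whose subtree heights differ by more than one gets rebalanced with a rotation, which keeps the tree’s height logarithmic.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 1  # height of the subtree rooted here

def height(n):
    return n.height if n else 0

def update(n):
    n.height = 1 + max(height(n.left), height(n.right))

def rotate_right(y):
    x = y.left
    y.left, x.right = x.right, y
    update(y)
    update(x)
    return x

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    update(x)
    update(y)
    return y

def insert(root, key):
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    update(root)
    balance = height(root.left) - height(root.right)
    if balance > 1:                      # left-heavy
        if key >= root.left.key:         # left-right case: pre-rotate child
            root.left = rotate_left(root.left)
        return rotate_right(root)
    if balance < -1:                     # right-heavy
        if key < root.right.key:         # right-left case: pre-rotate child
            root.right = rotate_right(root.right)
        return rotate_left(root)
    return root

# Insert 0..14 in sorted order; without rotations this would be a height-15 path.
root = None
for k in range(15):
    root = insert(root, k)
print(root.height)  # stays logarithmic (4 here) rather than 15
```

Deletion (and the relaxed rebalancing that distinguishes RAVL trees) is where things get more interesting, but insertion alone already shows the self-balancing flavor.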

Although I’m satisfied with the coverage of algorithms for trees and graphs so far, I’m now also interested in learning about probabilistic algorithms. I was turned in this direction after seeing the power of probabilistic rounding techniques in solving LP relaxations of very common problems, such as vertex cover, in approximation algorithms. Although they may not always provide an optimal bound, they do sometimes make the code and analysis much simpler and offer better worst-case performance. And they are everywhere, too: from the example I just mentioned to an MST algorithm running in expected linear time (!) in the number of edges, randomized and probabilistic algorithms play a huge role. And looking at it from a purer standpoint, it’s also particularly interesting to see how the nature of random numbers can reveal insight into the structure of computation. I’m starting to develop some ideas of my own; perhaps the better word for them would be “questions”, or maybe just situations and combinations of known elements in a way I haven’t seen anywhere. Hopefully I will soon gain the mathematical and analytical machinery to see if these ideas make any sense at all.
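To make the rounding idea concrete, here’s a toy sketch of my own (with a hand-picked fractional solution standing in for an actual LP solver): the classic threshold rounding for vertex cover takes every vertex whose LP value is at least 1/2. Since each edge’s constraint x_u + x_v ≥ 1 forces at least one endpoint to 1/2, the result is always a valid cover, and its cost is at most twice the LP optimum. Fully probabilistic rounding – including each vertex with probability related to its LP value – works similarly for problems like set cover.

```python
def round_vertex_cover(edges, x, threshold=0.5):
    """Round a fractional vertex-cover LP solution x (dict: vertex -> value).

    Every edge (u, v) satisfies x[u] + x[v] >= 1 in a feasible LP solution,
    so at least one endpoint has value >= 1/2 and ends up in the cover.
    Each chosen vertex has x_v >= 1/2, so the cover costs at most 2 * OPT_LP.
    """
    cover = {v for v, val in x.items() if val >= threshold}
    assert all(u in cover or v in cover for u, v in edges), "not a cover"
    return cover

# Triangle graph: the optimal fractional solution puts 1/2 on every vertex
# (LP cost 1.5); rounding yields a cover of size 3 <= 2 * 1.5.
edges = [(1, 2), (2, 3), (1, 3)]
x = {1: 0.5, 2: 0.5, 3: 0.5}
print(sorted(round_vertex_cover(edges, x)))  # [1, 2, 3]
```

The triangle also shows the integrality gap at work: the integral optimum is 2, the LP optimum is 1.5, and rounding lands within the factor-2 guarantee.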

In other news, my copy of Knuth’s classic Art of Computer Programming, Vols. 1–4A, finally arrived this week! As you may know, the fourth volume was just released after something like 38 years. I’m very excited to check it out. It’s currently sitting, still shrink-wrapped, on the table beside me. During the next week I’ll see if I can make sense of it. On another note, I am very pleasantly surprised by the quality of CLRS, another excellent algorithms book. The analysis is surprisingly clean and the writing is precise, which makes it a pleasure to read. The authors tackle complicated concepts with ease, making it look like it takes hardly any effort, which is impressive. Over winter break, I had started on Michael Sipser’s Introduction to the Theory of Computation, another classic text. It turns out this gave me a very nice edge when we were covering DFAs/NFAs, regular expressions, and context-free grammars in compilers. Although I had had an introduction to these concepts in a previous course, I credit Sipser’s book with teaching them to me in a rigorous sense. Once again, I see the same ease with which Sipser explains these concepts and proofs, and it’s quite impressive. It makes it as painless as possible to cover these proofs with sufficient depth. So I will have to continue on with that text, seeing as it coincides very closely with what I am interested in. And as for the last goal, it is as yet unfulfilled. I need a whole lot more knowledge before I go near that subject. Perhaps next semester or next spring. There’s just so much to learn.

Posted by on March 12, 2011 in Uncategorized

## Spring Semester 2011

Today is Friday (well, it was about an hour ago), and that means this week is done. This week also happens to be the first week of my spring semester. And it was a hell of a ride. My current mood is a combination of extreme tiredness, sleepiness, and a great deal of excitement for the weeks to come. [Pretty sure that last statement is grammatically incorrect, but I can’t be bothered to fix it because it conveys my feelings well.] Going into the planning stages for this semester, I sought to take on a major challenge: to outdo anything I’ve done before and push myself beyond any boundaries I had previously thought were the limits of my abilities. Obviously, classes here are really hard, but I see them as springboards rather than roadblocks. And the springboard metaphor is quite fitting: you jump on board and you sink and sink for a bit, very well aware of your weight and limitations and how long they’re dragging you down – but that’s only for the first half. Once you reach that critical state when things start clicking, it’s all upward from there. Suddenly your own weight is actually helping you go higher. And guess what? You’ll end up higher than you could’ve ever jumped otherwise.

The moral of the story is obviously this: Avoid falling off on the way down and it’ll be worth it on your way up.

And that’s how it was for a few of my classes last semester too. I took an upper-level math class on numerical analysis and found it pretty hard at first. It was intimidating to be in a room full of people a year or more older than me, with a professor who would walk into the room, give a half-hi gesture, and almost immediately begin lecturing without break for the entirety of the class. Although I do enjoy math a great deal, that in no way implies I’m particularly good at it. A math TA from last year put it aptly when he said mathematics was the process of banging your head on a table until 1) you passed out, in which case you called it a day, or 2) you eventually made a breakthrough. As convenient and elegant as it may be to think that there’s some magical state beyond which the marginal difficulty of learning the next theorem/definition/proof/algorithm falls off to some small, constant value, I really am starting to doubt that’s the case. The more progress I make in my education on the fronts of both math and computer science, the more I think that what instead happens is that we, either by consciously looking for it or by our minds’ own doing, start seeing the very same patterns and paradigms again and again. Of course this isn’t a new idea, but it hits you hard when you make the realization for yourself. It’s interesting because it’s almost as if my brain is developing some form of auto-complete: give it some properties of some new entity and it “predicts” some of the others that will follow. There are obviously tons of exceptions to this, and that’s where the fun comes in and keeps things interesting enough to continue pursuing (although the first time I heard that matrix multiplication doesn’t commute was jarring to my pristine understanding of the world). And it’s this same notion of “auto-complete”, or intuition, that gives a better grip on the weird world out there and thus provides the illusion that the marginal difficulty is indeed decreasing.

Another metaphor I particularly like derives from the use of the term “structure” when thinking about a problem: namely, in the context of a phrase like “now, after X years of research, we have a better understanding of the inner structure of [complexity class/research problem/concept/etc.]…”. In my mind, I see each of these concepts not quite as black boxes, but as dark rooms, the size of which is sometimes unknown from our current perspective. And so long as Erdős was just being metaphorical about his Book, there aren’t any lighting fixtures in the room. All we are given is a bunch of little candles. In fact, we do have a few matches, but it’s much harder to light a candle with a match than it is to use an already lit one. And so we go about setting up tiny little candles all about this room. They each brightly illuminate things in their immediate vicinity, but the falloff of light is pretty drastic at times. And sometimes different candles illuminate the same portion of the room. Ideally, we’d like to use exactly one candle, so perfectly positioned that it lights the entire room, but finding that position is almost definitely at least NP-hard or something… The idea is that there are rooms with candles from other people – in fact, all of the other people in the world. And then there’s your own room, where you have to discover which candles are lit by yourself. You don’t necessarily have to light them yourself, but you have to discover that they do indeed exist. But of course, the room is too large to light up fully. So instead, we attack a certain direction. Perhaps we like what the candles have illuminated so far, or perhaps we think we’re on the edge of a major push forward. Either way, we are forced to narrow down our search. It’s pretty amazing how much brighter my room has gotten in just the past few months. (Baseless prediction: the room is actually cylindrical. Interpret that as you understand it.)

And all of these thoughts are what follow me around these days. I love a good mystery, and this is exactly that. I am consistently more amazed and in a state of awe than I ever expected to be. And I find the intricate complexity of the surface lit thus far extremely beautiful. Although theoretical computer science has a bit less of a natural feel (that is to say, closeness to the ways of nature and the universe) than mathematics, it’s still astonishing to see how things fit together. Yes, computer science is a man-made field consisting of arguably arbitrary dichotomies, depending on who you ask. And yes, this field is still very much in its infancy. But nonetheless, its concepts reveal something deeper than the definitions we have given them. To put it shortly, there’s still some magic left which lies undiscovered, waiting for us. As frustrating as it is that we do not understand some seemingly elementary relationships between things, it’s exactly that which gives the field its charm. I was sitting in class this week, with the professor writing theorem after theorem on the board, each of which had to do in some way with P vs. NP. And I thought how much more boring the class would have been if we did know the answer. Or how even more boring it would be if $P \neq NP$. As much as I hope it’s resolved soon, it’s the idea of not knowing which is incredible in some strange way. It keeps the magic alive, and I like it.

I considered what courses I wanted to take this semester. There are lots of things I want to learn about in computer science, with time being the only limitation. I decided to go forward with a bold move by taking two very difficult theory classes together. They are both on algorithms: one on the theory of algorithms taught by the great R. E. Tarjan, and the other a graduate course on advanced algorithm design – specifically, approximation algorithms. They are fast-moving, and the latter is extremely difficult (I don’t doubt the former will soon become so too!). But I’m not getting off the springboard, no matter how tempting it may be. I will continue to push forward until that pivotal moment hits when things finally start making sense. I’m learning an insane amount every single day, and it’s amazing that a lot of things I had read about casually in the past are suddenly coming together with a much brighter luminance. It’s hard, and I anticipate lots and lots of banging heads on tables ahead, but it’ll be worthwhile. This is one of those utterly invaluable experiences that I wouldn’t give up for anything.

I started the week inspired and now I am more inspired than I recall ever being. I live in an amazingly intricate and beautiful world and all I want to do is keep lighting candles.

Posted by on February 5, 2011 in Uncategorized

## Jolicloud 1.0 + Dell Mini 10v

In my continuous pursuit of trying out cool new software and hardware, I recently decided to give Jolicloud a go. To be sure, this isn’t something I just decided one day. In fact, I had seen and done a bit of research into Jolicloud many months ago, long before any announcement of the 1.0 release. The one major factor that stopped me from pursuing it any further was the fact that it was a cloud-based OS, and I therefore assumed that without a persistent connection I would be helpless. This, however, is not necessarily true with Jolicloud.

The device I tried Jolicloud on (and am writing this post on) is a stock Dell Mini 10v. It’s your very typical netbook, with 1.0GB of RAM, a 160GB HDD, and an Intel Atom N270 clocked at 1.60GHz. Initially, I ordered it downgraded to Windows XP. Soon after I got it, I upgraded it to Windows 7 Home Premium. And about 6 hours after that, I partitioned the disk and installed Ubuntu Linux.

If you’re keeping track, that now means there are three OS installations on this netbook. But that’s not strictly accurate, since Jolicloud is currently installed on a virtual partition within the Windows partition (think Wubi with Ubuntu). So it shows up under the Windows chainloader and not directly in the GRUB bootloader. Of course, once you actually boot into Jolicloud, it’s virtually impossible to tell.

So about three weeks ago, I finally took the leap and installed it. Right off the bat, the interface felt familiar. Recall, this was pre-1.0, so the interface was completely different from the new 1.0 HTML5 interface. The reason it felt familiar is that it was nearly identical to UNR (Ubuntu Netbook Remix). This brings me to a good point: why did I even bother trying this OS? Well, of course I am a huge fan of Ubuntu, and I think it’s pretty apparent that it’s one of the easiest Linux distros to use. It does work great on my netbook in terms of performance, although not so much in terms of usability, in the sense that it’s painfully clear that stock Ubuntu is not written with the netbook in mind. Interacting with the OS is a whole lot easier with Gnome-Do, so a good universal launcher is a must. Although the Mini 10v is an otherwise great netbook that has satisfied all of my needs, the touchpad is absolutely horrendous in every conceivable way. Touch-based left- and right-click buttons are a tremendous mistake. In theory it sounds like a great idea, except that even the portions of the touchpad where you’re supposed to push down to initiate a click are touch-sensitive. Thus, unless you have a truly unwavering hand (in which case you should just use this skill alongside a magnetized needle to write all your code), you will definitely move the cursor while clicking, and so click accuracy sucks. This was an important consideration, since I want to use the touchpad as little as possible. Jolicloud 1.0 makes that possible.

I decided to hold off on writing this post till I got chosen for the upgrade, since I didn’t feel it was fair to make a judgment before seeing what the latest version held. And I’m glad I did, because the 1.0 release is in most ways a huge step up from before.

First of all, the well-known and much-talked-about HTML5 app launcher interface is just great. It’s much, much cleaner and easier to use than the previous layout, which mimicked UNR. Indeed, the line between file browser, app browser, settings, and app “store” is very nicely and elegantly merged into one intuitive interface. I would have liked mouseover descriptions of the icons in the top menubar, but that’s okay. Installing a new app (either an actual desktop application or a thin wrapper for a web-based client, based on the application shortcuts available in Chromium) is quite literally a one-click procedure. It’s very plain and simple to see when an application is being downloaded, and even though some of the icons are not quite recognizable as to their purpose, it all makes sense in context.

When you first boot into Jolicloud, it asks you to set up your wireless connection yourself. Not altogether too shabby, since the underlying OS does pretty much all of the work for you, but it’d be nice to see that process integrated a bit better into the OS startup – like having the list of available networks clearly visible in the center of the screen instead of hidden away behind a mouseclick on the Wifi icon. Not a big deal, though. If you think Ubuntu is good at getting drivers and other hardware configuration done for you, you’ll be really happy with Jolicloud too. I didn’t have to touch one thing, and it in fact loaded my proprietary Broadcom driver automatically too (and informed me that it did so). Another interesting gnome-panel-applet-esque icon allows you to underclock your processor and set up power plans. This is interesting, but I haven’t looked into it. Of course, the underclocking is to save on battery life, but I haven’t tested that out (the Mini 10v with the 3-cell battery has one of the worst battery lives I’ve ever come across – about 2-3 hours, which really sucks for a netbook – so I may actually look into this later on).

The apps I had downloaded were still there when I completed the upgrade. The Launcher, mapped to the Windows key, allows very quick access back to the home screen from within any application. Hitting the Tab key brings focus to the search box, which lets you search Google or your downloaded apps/friends/other apps in the store. This is a really neat combo which I’ll be using a whole lot more. Overall, the interface is really great. Icons are large and crisp on my standard 1024×600 display.

There’s one thing in particular that truly annoys me, though. And that’s the demotion of my most useful and favorite built-in applications, which were available before, to a “Legacy Apps” submenu. These include gnome-terminal and gedit. This makes me mad because, outside of the popular console-based text editors, I still find gedit superior to the other editors. And of course, how can I function without gnome-terminal? These apps are also not indexed in the Launcher search bar, nor are they available in the app store (neither is Google Chrome, as it was before, but Chromium is…). If I could find some way to put these back in my main app list, then I could be okay with that.

I’ll continue to give more of my impressions of Jolicloud and info about my experiences with it as time goes on. All in all, I’m quite impressed with where it is now, and it’s a pleasure to use. If it continues to fare well, I might end up using it a whole lot more alongside Windows 7 on my netbook.


Posted by on July 31, 2010 in Uncategorized

## The Start of Another Week

It’s late Sunday night, which means this week has already started. But I’m not too worried, because so far I’m off to a pretty good start, work-wise. I’m really glad I took the time today to be as productive as possible; I know this will make getting through the workload this week much easier. The more CS I read, the more interested in it I become.

I got through 100+ pages of the Bryant book (this one), which I’m pretty proud of, because that stuff is hard. It took me a while to really understand what was going on, and I can’t say I understand it all, but I get the overarching concepts pretty well. It’s pretty incredible how much a computer does even when interpreting a simple input from the user, or even just a routine OS exception that a program throws. Reading about all the different ways a computer maximizes its output while minimizing the time taken – the key optimizations necessary to let programs run at acceptable speeds – I can’t help but wonder how these ideas ever first occurred to someone decades ago. Just understanding what they’re saying, after several levels of simplification, is a challenge for me. I’ve wondered for a really long time how the lowest-level interactions between hardware and software are handled. We’ve cringed over how messy even simple C programs involving calls to malloc() and free() can get, whereas even these are very high-level compared to how things are actually done. Of course, reading about Knuth’s contributions to the development of efficient malloc() implementations (of first-fit as compared to best-fit) has only served to increase my admiration of him. It’s magic, what these guys do.
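To illustrate the first-fit vs. best-fit distinction, here’s a tiny toy sketch of my own – nothing like a real allocator, just the two placement policies scanning a list of free block sizes:

```python
def first_fit(free_blocks, request):
    """Return the index of the first free block large enough, else None."""
    for i, size in enumerate(free_blocks):
        if size >= request:
            return i
    return None

def best_fit(free_blocks, request):
    """Return the index of the smallest free block that still fits, else None."""
    best = None
    for i, size in enumerate(free_blocks):
        if size >= request and (best is None or size < free_blocks[best]):
            best = i
    return best

free = [40, 16, 100, 24]
print(first_fit(free, 20))  # 0 (the first block that fits: 40 bytes)
print(best_fit(free, 20))   # 3 (the tightest fit: 24 bytes)
```

First-fit stops scanning as soon as anything fits (fast, but can fragment the front of the list), while best-fit scans everything to minimize leftover space – the trade-off the book’s allocator chapter is all about.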

Posted by on April 11, 2010 in Uncategorized


## It’s Been a While

Yeah, yeah, it’s been a while since I posted. There’s no need for me to apologize to an anonymous audience, because that’s self-deprecating and meaningless anyway. Instead of jumping onto broad resolutions that barely last moments, I’ll just play this one by ear. I’ll try to update this a bit more often, but I’m not gonna let myself worry if I don’t. I’ve wanted a place to blog for a while, and now that I have it, I’ll write – especially given that I’ve recently been re-energized about blogging by friends. Thanks, guys; this may actually be the start of something good.

I’m gonna take this blog in a different direction. I know it started off with the intention of being a technophile’s xanadu – I just don’t have the time to do software reviews and predictions/opinions. Instead, I’m gonna make this a bit more personal and about me. Truly though, it’s not much different from my original idea anyway – tech stuff represents who I am as well as anything else you’re ever likely to hear in this place. The past few years have all been about figuring out what it is that I can relate myself to; probably not the most unexpected thing. But I really feel like I might’ve found it now. That’s good stuff.

Before college, I was kinda stuck between studying math, physics, and comp sci. There were aspects of all of them that I really liked, but I couldn’t quite pinpoint what in particular made me tick. One semester having passed and neck-deep in the second, I’ve got it now. I have no idea how long this’ll last, but I sure hope it stays this way, considering it’s about time I started shifting my life towards it. What I’m talking about is computer science. I’ve found that it is this that inspires me more than anything else I’ve studied thus far at Princeton. If there’s one thing I’ve realized, it’s that you really learn a LOT here. Classes go by really fast, professors are always eager to move on to the next big topic, and it’s sometimes hard to catch a break. I was thinking the other day and realized how I’d put my ideas about Princeton: Princeton, for me, is about tiredness, stress, exhaustion, pain, and absolute awe. It’s the last one that makes it all worthwhile. The best example I can currently think of is my comp sci class itself. Early in the morning, I have to bike uphill quite a bit, and quite fast, in order to get to class on time. I get there and I’m always gasping for breath, aching from riding so fast, and if it’s cold or rainy, I’m fairly miserable. But when I actually do walk into class and my professor gets into the lecture, I realize why it is that, in my mind, the gains outweigh the costs. There’s something quite quirky, strange, and magical about the world, and I’m beginning to see it unraveled piece by piece. More to come.