Thursday, January 27, 2011

Social Change is the Greater Challenge

Here is a clip from yesterday's episode of The Daily Show:

State of the Union 2011 - Night of Too Many Promises (The Daily Show with Jon Stewart, Mon - Thurs 11p / 10c)

It really amazes me how going to the moon is seen as more difficult than having a fast train network, electric cars and teachers who are seen as nation builders.

We have already been to the moon. We did that in the 60s with computers that were barely faster than a pocket calculator. On the other hand, it is now 2011 and we still haven't managed to put food on every plate, are headed towards an environmental disaster and income inequities are increasing rapidly. Yet, only high technology is seen as a challenge, as something worth achieving. Social change is something mundane, something that doesn't quite fire our imagination.

In my opinion, social change is a far greater challenge than going to the moon (or even mars) could ever hope to be.

Tuesday, January 25, 2011

What is Linux

Despite its meager market share, it's surprising how rich and varied the Linux community is. There are differences in philosophies and outlook. In fact, the very definition of what Linux is varies.

Here is what I think Linux is:

Technology: Of course, first and foremost, Linux is the technology. There is the Linux kernel. And then there is the giant world of open source software that runs on that kernel to create a wonderful computing environment. Some people consider only the kernel to be 'Linux'. I don't agree. The entire software ecosystem for me is Linux.

Community: Linux would be nothing without the community. There are the developers who create such wonderful software mostly on a voluntary basis. Then there are designers and bug reporters and packagers and maintainers and webadmins and forum moderators. The community is huge. And last but not the least, the users. The entire body of people is an inseparable part of 'Linux'. For me, it is not possible to see people as separate from the technology they create and use.

Philosophy: Although differences exist, most philosophies in the Linux world are variations on the same theme - that software should be free (as in freedom) and open. The technology and community are built around these core beliefs and would have little meaning if the philosophy were given up. Some people believe that the philosophy isn't important, or that it's too militant. I don't think that's the case. If the philosophy is not important, why have Linux at all? Why not just Windows or OS X?

For me, Linux is the inseparable trio of these three components.

Monday, January 24, 2011


It is said that ignorance is bliss. True perhaps for the ignoramuses, but for those of us who know better, it certainly is no bliss to deal with them. I admit, with some shame, that as a schoolkid I had little or no patience with classmates who weren't as 'smart' as I was. When I went to college, I wasn't the smartest guy around anymore. Yet I still harbored a mild disdain for those who knew less than me.

I'm certainly not alone. I've often seen people complain about ignorance. Videos of how stupid Americans are abound. Fellow bloggers complain about the poor level of knowledge among the people they interact with.

I have come to realize that it's not so much ignorance that pisses me off as the lack of willingness to learn. A person who didn't know something wouldn't piss me off as long as they said - hey, I don't know, tell me. Or, hey, I don't know, let's find out.

But it angers me when people say 'I don't care', or simply believe the wrong thing.

Sunday, January 23, 2011

Packaging Linux as an OS

Linux is radically different from proprietary OSes - Windows and OS X. And I don't just mean in terms of philosophy. I mean technically too. Linux has a radically distributed model of development. Various parts of the OS (and here I define the OS as the entire stack up to the desktop environment) are created by different teams. They are designed to be independent and swappable (well, mostly). You can use the vanilla kernel or one of the custom kernels for your platform. You can use the ubiquitous X server or swap it with something like Wayland. You can choose from one of the many desktop environments - something unique to Linux.

The entire software stack is always in flux. Different projects have different release schedules, different compatibilities, different goals, functionality and philosophies. As you can imagine, packaging Linux as an operating system is a nightmare.

The situation becomes especially difficult for application developers. When you are writing an application for Windows or OS X, you just have to make sure it works for the latest few versions of the OS. When you write an application for Linux, what do you write for? What kernel version, what X server version, Gnome or KDE? Which distribution?

This is a major problem for Linux and one of the reasons why commercial application support is so bad on Linux.

Canonical, with its flagship distro Ubuntu, is trying to solve this problem. They first did this by introducing a predictable, stable, six-month release cycle. Then they tried to convince various communities to practice what Mark Shuttleworth called cadence - predictable release cycles that synchronize with other projects. However, that suggestion didn't go down well with most communities. And now they are doing it by taking more control of Ubuntu, swapping Gnome Shell for Unity and the X server for Wayland.

I'm not entirely sure that Canonical taking control of Ubuntu in this way is entirely benign. Of course, they are still open source. But as we've seen in the past with the development of the Ubuntu font or the Ubuntu One service, Ubuntu's development processes aren't always very open or democratic. A whole lot of people feel that this harms the community.


Saturday, January 22, 2011

Why Ubuntu Switched to Unity and Wayland

There is a major barrier to adoption of Linux on the desktop - marketing. An average customer goes to a store (real or on-line) and buys what's available. Usually this means a computer preloaded with Windows or OS X. In the rare case that a computer is being sold with Linux pre-installed, the customer has little idea what he's getting into. A couple of days later he returns the computer and feels rather cheated.

Canonical, the company behind Ubuntu, is trying to solve this problem. They are talking to computer manufacturers to ship Ubuntu on their products. However, they are faced with a unique challenge as they do so.

When Microsoft approaches a computer manufacturer (in a manner of speaking, I don't think they need to approach anyone anymore) they are able to show an OS that they have total control over. Since they have total control over their OS, they can work with the manufacturer to deliver the best possible experience for that hardware. Drivers can be bundled with Windows, software can be bundled to run with uncommon hardware such as touchscreens or fingerprint readers.

This is something Canonical cannot do with Ubuntu. They have very little control over the OS. Various parts of the Linux OS stack - kernel, X, desktop environment, etc. - are created by different communities. Canonical may have little or no say in these communities. As a result, Canonical is ill-placed to cater to any special demands that manufacturers may have.

No wonder, then, that radical changes are coming to Ubuntu in the near future. First, they ditched Gnome Shell in favor of Unity, their home-grown shell for Gnome. Then, they plan to switch to Wayland, a replacement for the X server. Wayland, being a fairly new project, is ideal for Canonical to infiltrate and control. These moves are clearly meant to exercise more control over Ubuntu as an OS.

Note that while I've painted Canonical in a somewhat negative light, it is still operating within the confines of the open source philosophy. However, we all know that the word of the law might not be its spirit. And many questions have been raised about Canonical's open source spirit in recent times.

Friday, January 21, 2011

Man vs. Machine

It seems rather curious to me that while various US universities are embracing the internet like a second skin - getting facebook pages and twitter accounts and what not - IITK should be thinking about limiting internet access and mobile phones. Why this difference in outlook towards technology?

One group of people think that technology is just a tool. These people see technology as something inert, of itself neither good nor bad. For them, it is we, the humans, who decide how to use technology and define whether it is beneficial or not. The classic example is nuclear technology, which can be used both to produce electricity and to bomb innocent people.

The other group of people are inherently suspicious of technology. They believe that the use of technology inevitably corrupts. Such people believe that technology is not inert, that it affects human behaviour to the extent that it leaves us with no choice, no control over how we use it.


Thursday, January 20, 2011

Bad (?) UI Design on the Nook

There is a UI design flaw in the nook that's been bugging me ever since I bought one. It has been about six months of using it regularly (every day) and it still bugs me. So I thought I'd rant about it.

Given below is a picture of the nook showing a hand holding it in a typical use position. You will notice that the nook has two buttons on each side for turning pages. The bottom buttons on each side move the page forward. The top buttons move it backward.

I find this very counter-intuitive. Much computerized software and hardware has the equivalent of back/forward buttons. The tasks may be different - back/forward page browsing in Firefox, or previous/next song in iTunes. However, one visual principle remains the same - the left button is back, the right button is forward.

The nook breaks this visual principle. Here, the top button is back and the bottom button is forward. This would not be half as bad if the buttons weren't marked with left and right arrows. This creates a mentally inconsistent model - top/bottom is mapped to left/right.

Even after six months, I haven't gotten used to it and often keep pressing the left bottom button to go back a page.

From another point of view, this design makes sense. As you will notice from the picture, in the typical use scenario, the thumb of the user rests on the bottom button. Since moving forward is the most common operation, it is provided at the most ergonomically convenient location.

Wednesday, January 19, 2011

Why We Like Post-Apocalyptic Science Fiction

Post-apocalyptic science fiction is a staple sub-genre of science fiction. It typically portrays a world after an apocalypse of some sort. Most of humanity dies and the few survivors must make it against enormous odds. The ending is not always happy in the Hollywood sense. The show Battlestar Galactica comes to mind, as does the Resident Evil franchise and 'The Windup Girl' by Paolo Bacigalupi.

I wasn't a big fan of post-apocalyptic science fiction to begin with. I was a Star Trek fan, which meant that I had a very positive, almost utopian vision of the future. I liked science fiction in which humanity expands beyond the solar system and thrives throughout the galaxy. My favorite author was Asimov.

However, gradually I began to like post-apocalyptic stories. And today I wondered why that was. I think the reason is that as I grew older, I learnt more about our social, political and economic systems. I learnt about global warming and neo-colonialism. In such an environment, it's hard not to become pessimistic and cynical. It is hard not to feel that we deserve an apocalyptic future.

Reading post-apocalyptic science fiction is an act of self-flagellation.

It isn't unlike the concept of heaven in most mythologies. The idea that most of humanity is corrupt and will be destroyed for it (either by a god or by the natural course of this corruption) is very old. Of course, some of us will survive, and they will, essentially, be the righteous. Heaven is ancient post-apocalyptic science fiction.

Of course, in most modern stories, you don't go to heaven anymore. Instead you're chased by Cylons or zombies (or both: hey, if you can have cowboys and aliens, why not cylons and zombies). There is no reprieve.

So perhaps modern fiction is more like Hell. There is no redemption and you must suffer till eternity.

Tuesday, January 18, 2011

Ragging and Abuse

When I entered college in 2003, ragging was very much in vogue in almost all Indian colleges. Understandably, this can get rough for a first year student. Compared to other colleges, ragging at IIT Kanpur can be described as mild, or even feeble. Which was good for me. I was shy and reclusive as a school student and intense ragging would have done little good for me.

One reason ragging was so controlled at IITK was the counseling service. They put a whole lot of time and effort into creating a student culture that was sensitive to issues surrounding ragging. They didn't want to put a complete ban on it because it was seen as a ritual practice wherein the seniors got to know the juniors. But they also realized that it was intensely abusive and sought to bring down the abuse factor.

Non-physical abuse is difficult to define. There were first-year students who'd get extremely uncomfortable the moment the seniors started to talk to them. And then, where does conversation stop and abuse begin? There are hundreds of variations on just saying 'hello'. You could be cordial or you could be abusive in just that single word.

The counseling service adopted the view that it was up to the first year student to define abuse for himself. Ragging could go on, but only up to the point that the student felt comfortable with it. If anyone saw that a first year was getting uncomfortable, it was supposed to stop.

Abuse gets even more complicated between two people who are emotionally related to each other. A verbal match can be playful teasing between a couple or it could be a nightmare of verbal violence. Where does one draw the line?

It is even more complicated in the case of parents and children. While individuals in a relationship are often capable of drawing the line of discomfort, children are not. Only adults can draw this line for them and these adults are often the very ones perpetrating the abuse.

The Expansion of the Noteworthy

The 90s in India can be called the decade of the rise of quizzing. When we were kids, we had the fabled Bournvita Quiz Contest. A dear friend of mine went on to participate at the national level. He was the first friend of mine to appear on television. The second friend to appear on television, several years later, was on the show Kaun Banega Crorepati Junior. In the intervening years, quizzing had evolved from a prodigious duel of minds to a game show.

I never liked quizzing. In the beginning, it seemed easy enough. I was what Americans call a straight-A student. I knew things. So I should have been good at quizzing. It did seem like that for a while. All I had to do was keep up with the news, read those ‘general knowledge’ books that I’ve talked about earlier, and combine all of that with my already voracious reading habits to become a quizzing genius. Alas, it didn’t work that way.

For soon I discovered that what was worth knowing was ever expanding. It wasn’t confined to the news, current affairs or general knowledge books. I was scandalized to find that they asked you questions about movies, things that I knew little of. To make it worse, they asked questions about Hollywood movies, things that I had little or no access to. Movies weren’t knowledge! They were time-pass, entertainment!

I found that I wasn’t interested in keeping up. I had a very narrow field of interests – science and fiction – and didn’t feel much need to step out of it.

With the advent of mass media, what is worth knowing has expanded tremendously. And it has really exploded with the internet. The Encyclopedia Britannica article on the Barbie doll is about 700 words long. Compare that to the Wikipedia article on the doll. Imagine how much more there is to know today.

In such an environment, all of us feel as if we aren’t keeping up. That we are missing out on stuff. Hence this strong urge to be plugged in to our social streams. We don’t want to be people who don’t know. We don’t want to feel knowledge-inadequate.


Sunday, January 16, 2011

Does Meditation Work for Me?

I get this question a lot these days - does meditation work for you? I'm not entirely sure what people mean by that. Perhaps they mean if it is changing my life in any way. Well, yes, it is - I now do about 30 minutes of meditation each day and wake up at 6:30 every Saturday to go to my meditation group.

The question will get even more puzzling if you start exploring what kind of meditation I do. I do Zazen, which is a sitting meditation practiced within the Zen Buddhist community. It is a peculiar practice because it promises you absolutely nothing. It doesn't say that it will increase your concentration or memory or make you feel better. In fact, to do Zazen with any goal in mind is not doing it properly. It is a goal-less practice.

If the practice is goal-less, why do it?

I'm not sure about that either. There is something that draws me towards it.

Okay, that sounded mysterious. And I hate mysterious. That's precisely the reason I started doing this practice - that it was NOT mysterious.

I do this practice because it brings some semblance of structure to my life. As a post-modern, liberal atheist I find it hard to believe in anything. My life seems to be without any anchor at all. Morals are up for negotiation, good and bad a matter of definition. This lack of structure in life can be extremely disorienting.

Enter rituals. Rituals are important. Rituals are powerful. And I'm not talking about religious rituals, although they share this quality. I'm talking about everyday rituals. Waking up at a certain time, brushing your teeth, following a daily routine. Celebrating certain festivals at certain times of the year. They all bring structure to life. In and of themselves they have no meaning. But we choose to give them meaning. Because that's what human beings do.

Zazen is one such ritual for me. It's just something I do. It gives structure to my life.

Why Zazen, why not something else? Well, because Buddhism is atheist and rational, and I can stand it.

Saturday, January 15, 2011

Inventors and Inventions

When I was a kid, something called 'general knowledge' was all the vogue. It consisted of committing myriad facts and figures to memory, which were then parroted out to impress your fellow students. I remember one classmate who memorized the capital cities of all the countries of the world.

This so-called general knowledge had various topics. One of the common ones was 'inventors and inventions'. Given an invention you had to name the inventor, and vice versa. Light bulb - Edison, steam engine - James Watt, airplane - Wright Brothers, telephone - Alexander Graham Bell, and so on. Until, of course, it got to the computer. It's hard to define who really made the computer. You could name Babbage and his difference engine, but then what about Turing and his hypothetical Turing machine? And what about the fact that computers were made from vacuum tubes until the transistor came along? So which one do we talk about -- the idea, or the mechanical, electrical or electronic computer?

But I digress.

I was totally into science as a kid. I had several books at home about scientists and inventors, narrating stories of how they did a particular piece of science or invention. The story was always about individual effort, hard work and genius. Moral of the story -- if you locked yourself away in your office and were a genius and worked and worked, you'd make a great invention some day.

Despite my earlier lampooning of 'Guns, Germs and Steel', I did find this gem in there.

In reality, even for the most famous and apparently decisive modern inventions, neglected precursors lurked behind the bald claim “X invented Y.” For instance, we are regularly told, “James Watt invented the steam engine in 1769,” supposedly inspired by watching steam rise from a teakettle's spout. Unfortunately for this splendid fiction, Watt actually got the idea for his particular steam engine while repairing a model of Thomas Newcomen's steam engine, which Newcomen had invented 57 years earlier and of which over a hundred had been manufactured in England by the time of Watt's repair work. Newcomen's engine, in turn, followed the steam engine that the Englishman Thomas Savery patented in 1698, which followed the steam engine that the Frenchman Denis Papin designed (but did not build) around 1680, which in turn had precursors in the ideas of the Dutch scientist Christiaan Huygens and others. All this is not to deny that Watt greatly improved Newcomen's engine (by incorporating a separate steam condenser and a double-acting cylinder), just as Newcomen had greatly improved Savery's.

Similar histories can be related for all modern inventions that are adequately documented. The hero customarily credited with the invention followed previous inventors who had had similar aims and had already produced designs, working models, or (as in the case of the Newcomen steam engine) commercially successful models. Edison's famous “invention” of the incandescent light bulb on the night of October 21, 1879, improved on many other incandescent light bulbs patented by other inventors between 1841 and 1878. Similarly, the Wright brothers' manned powered airplane was preceded by the manned unpowered gliders of Otto Lilienthal and the unmanned powered airplane of Samuel Langley; Samuel Morse's telegraph was preceded by those of Joseph Henry, William Cooke, and Charles Wheatstone; and Eli Whitney's gin for cleaning short-staple (inland) cotton extended gins that had been cleaning long-staple (Sea Island) cotton for thousands of years. All this is not to deny that Watt, Edison, the Wright brothers, Morse, and Whitney made big improvements and thereby increased or inaugurated commercial success. [...] All recognized famous inventors had capable predecessors and successors and made their improvements at a time when society was capable of using their product.

Science is social. Why was this never taught to me as a kid?

Friday, January 14, 2011

Why Americans Love Technology and Indians Hate It

As I walk down the street that I live in, in the US, I'm amazed by the sheer number of cars I see. I'm amazed too by the number of different manufacturers that I see and the number of different models. And I am amazed by the number of iterations of each model that I see. You can pick a popular car model and find all of its generations, often going back to the 70s or 80s.

The first ATMs were installed in the US during the 70s. But various prototype cash dispensing machines were tried out even back in the 60s. I saw my first ATM machine in 'Terminator 2: Judgment Day' during the mid 90s and it wasn't until the 2000s that I saw a real one.

Ebook readers have really taken off in the US during the past three years or so. There are two popular brands, backed by mega booksellers, and innumerable smaller ones. We have already seen three generations of the Kindle. The market is competitive. Ebook sales have already crossed hardcover sales this year. Yet not many people in India have even heard of ebook readers. My nook really got stared at during my India trip this November.

Technology doesn't ever iterate in India. It just appears, magically and fully formed. As a result it always has that slightly alien feel to it. We don't know who made it and how. Instead of the local lad we knew as a toddler with a runny nose, we get a suave foreigner who we can't quite get around to trusting fully.

This is why Americans love technology and Indians not quite as much. (Okay, we don't really hate it, do we? That title is just a link-bait.)

(One possible exception is cell phones. Cell phone market in India rocks.)

Thursday, January 13, 2011

Guns, Germs and Steel by Jared Diamond

I finished reading Jared Diamond's godawful book, Guns, Germs and Steel, a couple of days ago. In fact, I gave up before the last 100 pages or so. I just could not take it anymore. So I might actually have missed out on something vital that he says towards the end. If that is the case, please correct me.

Let me say at the outset that I'm in no position to challenge his scholarship, though others have done so. I don't even have much of a problem with his central thesis - that environment has a big role to play in human history. It is rather the way he says it that bugs me.

I think the book is nauseatingly Eurocentric. Large parts of the analyses are about the European destruction of native American civilizations and his personal experiences in New Guinea. I was very puzzled by his complete neglect of India and China, where one-third of the world's population now resides. He did devote one whole chapter to the Chinese in the end. I'm puzzled why he clubs Eurasia together in one giant lump when there were so many clashes of civilizations going on within the continent. (Whatever happened to the whole east vs. west divide?) What about the modern colonization of the east?

The underlying assumption throughout the book is that the modern western civilization is "better" and that that's what all human societies should have achieved. But they didn't. So why not? His repeated use of the term 'better' or 'more advanced' defined purely in terms of an industrial-capitalist mindset bugs me.

I was struck by the passage where the author describes the war between the Inca king Atahuallpa and the Spanish as 'well-known' because it had been recorded by several Spaniards. Since when did history written purely by the victors become 'well' known?

Finally, his selling of his thesis as something profound and fundamental is really over the top. I agree that environment has a role to play in history. But there are other factors - cultural and accidental - that play a role too.

Wednesday, January 12, 2011

Parents and Verbal Abuse

Why Chinese Mothers are Superior is doing the rounds on the interwebs. Seems to have caused much uproar. Here is one article denouncing the author as an abuser and racist. Here is another refuting the claim that all Chinese mothers are like this.

Amy Chua relates to us the story of a dinner where one of the guests actually broke down and cried when she heard that Amy had called her daughter 'fatty' or 'garbage'. Clarissa seems equally outraged. This seemed rather odd to me.

I think this is because the definition of verbal abuse is different in different cultures. My mother has called me names in anger, as she has done to my brother and our cousins (yes, cousins too). So have I to my younger siblings. I never saw this as 'abuse' and I doubt if any of my siblings did either. I don't think this had any bad effect on me as a kid. It was common practice at school for teachers to say things. We said things back to them during lunch hour and got on with our lives.

Words and behaviours do have different connotations in different cultures. Unlike physical abuse, verbal abuse is subjective. It hurts only to the extent that it's defined to hurt you. A child in America grows up learning that a parent calling you 'garbage' is bad and abusive. A child in India might not even think about it for more than a minute.

This is not to say that abuse doesn't exist. A couple of my cousins are short and dark-skinned and hear no end of it. I find that immensely abusive. Another cousin hears no end of it for being fat. (It is funny that while I can't imagine her being truly hurt at being called 'fatty', she does get very upset when family members discuss her weight in calm, contained voices. So it's the way of saying things that counts too.) I grew up with a complex that I was short and not socially adept because my family constantly reminded me of this. This did have a bad effect on me as a kid.

In short, one does have to be culturally sensitive in defining verbal abuse. A word or tone of voice in one culture/language doesn't mean the same in another culture/language.

Tuesday, January 04, 2011

Sharing an Experience Changes It

One thing to note about blogging is that it changes the way you experience things. The experience is not pure, untainted any more. Instead it's marred by the urge to share it later on.

No longer do you see a movie without thinking - oh, I can write about that. No longer do you visit a place without thinking - oh, I need to point this out in my blog post. No longer do you read a book without a notebook by your side to take notes.

The internet and the sharing of experiences that it enables is changing the way we experience life. People now click photographs to share them on facebook, make family videos to upload on youtube and record their singing at parties to create a podcast. Each experience is experienced with the acute awareness that it is going to be seen and heard by, if not thousands, then at least hundreds of people.

The internet is turning all of us into artists in some sense. Artists are often likened to voyeurs and prostitutes. They make a living by leeching off the experiences of others and by sharing what is most intimate to themselves. The internet is turning all of us into voyeurs and prostitutes.

Monday, January 03, 2011

What is Explanation?

I've started reading this book called 'Guns, Germs and Steel' by Jared Diamond. The beginning was a bit irritating because the author spends the entire preface stating that it is NOT a racist and/or eurocentric book. Sounded a bit like the old 'I'm not racist, many of my friends are black' argument to me. But I'm giving him a chance. Will reserve my final opinion until I'm done with the book.

The author's central thesis is this -- different societies developed differently over the course of history. He wants to know why. And he repeatedly claims that he is looking for the 'ultimate explanation' of these differences. This puzzles me slightly.

What could one mean by the 'ultimate explanation' in the context of historical differences between societies? History is an endless chain of cause and effect. A web would be a better word to describe it. There are multiple causes and multiple effects for each historical incident. Many of these are well studied. We keep finding new ones, which is good because our knowledge advances that way. We also keep questioning and perhaps throwing out the old ones which is also good because skepticism is the fundamental characteristic of the scientific attitude. In short, it's business as usual for historians.

But the author is not talking about these chains or webs of historical causes and effects. He is looking for an explanation that goes beyond this. I do not know what exactly he might mean by this.

History does not lend itself to 'explaining' in the same way that physical phenomena do. Physical explanations are concerned with precise mathematical relationships between phenomena. Newton could observe apples falling and come up with the law of gravitation. That is, he could write down a precise mathematical relationship between the masses of the objects and the gravitational pull between them. He could also write down the relationship between the amount of pull and the speed that an object attained under that force.
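To spell those two relationships out, they are the familiar ones from any physics textbook:

```latex
% Law of universal gravitation: the pull between masses m_1 and m_2 a distance r apart
F = G \, \frac{m_1 m_2}{r^2}

% Second law: the speed an object gains under that pull follows from its acceleration
F = m a \quad \Longrightarrow \quad a = \frac{F}{m}
```

Both are exact, quantitative statements -- precisely the kind of thing history refuses to give us.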

But that is not all; physical explanations are also reductionist. That is, you can isolate effects, or 'variables' as fancy scientists like to call them. An apple falling under the bright blue sky is being attracted not only by the earth, but also by the moon, the sun and the entire galaxy. The beauty of gravitation is, you can isolate the effect of the earth, the sun, the moon and the galaxy and measure them independently. Then you can add all of those effects up and you will get the total effect. The whole IS the sum of the parts. This is called the principle of superposition. It is important to know where it applies and where it doesn't. Physics books explicitly state where it applies.
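In symbols, superposition says the total pull on the apple is just the vector sum of the individual pulls, each of which can be measured on its own:

```latex
\vec{F}_{\text{total}}
  = \vec{F}_{\text{earth}} + \vec{F}_{\text{moon}}
  + \vec{F}_{\text{sun}} + \vec{F}_{\text{galaxy}}
  = \sum_i \vec{F}_i
```

That simple sum is exactly what fails to hold for food, or for history.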

It is lucky for us engineers that so many physical phenomena follow the principle of superposition. This allows us to isolate effects, study and quantify them, and then put them together in the end to do all those magical things we do.
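To make the apple example concrete, here is a minimal sketch of superposition at work. It uses Newton's law of gravitation with rough textbook masses and distances, and (purely for simplicity) treats each pull as acting along the same line, so the "sum" is ordinary addition rather than a vector sum:

```python
G = 6.674e-11  # gravitational constant, in m^3 kg^-1 s^-2

def pull(m1, m2, r):
    """Magnitude of the gravitational attraction between two masses
    separated by distance r (Newton's law of gravitation)."""
    return G * m1 * m2 / r**2

apple = 0.1  # kg, a reasonably sized apple

# Isolate each effect and measure it independently
# (rough mass and distance figures for each body):
earth = pull(apple, 5.97e24, 6.37e6)    # ~1 N
moon  = pull(apple, 7.35e22, 3.84e8)    # tiny
sun   = pull(apple, 1.99e30, 1.50e11)   # small

# Superposition: the total effect is just the sum of the parts.
total = earth + moon + sun

print(f"earth: {earth:.3f} N, moon: {moon:.2e} N, sun: {sun:.2e} N")
print(f"total: {total:.3f} N")
```

The point is exactly the one above: each body's pull can be computed in isolation, and the whole really is the sum of the parts. This is what makes the engineer's divide-and-conquer approach work, and what (as the next paragraphs argue) cannot be assumed outside physics.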

However, it is hazardous to bring this same reductionist thinking to other spheres. Food and nutrition is one. Michael Pollan points out in 'The Omnivore's Dilemma' that food doesn't quite follow the principle of superposition. Drinking wine and eating carrots together does NOT produce the same effect as drinking wine and eating carrots separately.

The same applies to history. The whole is so much greater than the sum of its parts that it makes little sense to isolate effects. Did human beings develop number systems because they needed to keep accounts, or could they keep accounts because they had number systems? What role did the hard wiring of the brain play in this? What role did the underlying philosophy of the culture play? For example, the Indians may have invented the zero because they were so very concerned (philosophically) with nothingness or shunyata. All these various factors are so strongly interconnected that it makes little sense to isolate the effect of one particular variable.

But that's precisely what Jared Diamond seems to be aiming at. And while I'm not opposed to such analysis as an intellectual exercise, touting it as some 'ultimate explanation' seems to be a bit too much to me.

Sunday, January 02, 2011

What is Minimalism

There seems to be some debate about minimalism going on around the blogosphere. The first is a post berating minimalism as a useless intellectual strategy. The second is a passionate counter-argument to the same. The third is a post covering these two posts.

Being an enthusiastic minimalist, let me add my own two cents to this discussion. I think the problem with all these posts is that they are trying to identify a singular definition for a term that is being used to encompass a whole lot of unrelated concepts. For example, single-tasking, distraction elimination, un-consumption and minimalism as a design philosophy are all being clubbed together. They are disparate things and require disparate treatment.

Single Tasking
While the modern corporate worker is encouraged to multi-task and be good at it, juggling many things together comes at the cost of efficiency. The best work is done when you have long chunks of uninterrupted time. This is especially true of creative work, where the most productive strategy is to stare at the wall. In order to get these long chunks of uninterrupted time, minimalists advocate disconnecting, clubbing communication together and defragmenting talk time.

I don't think there's anything wrong with this strategy. Shutting yourself up in a room / outhouse is an accepted strategy to get a book written. Creative spaces are full of junk but they often have a "keep out" notice on the door. The minimalists are trying to achieve the same effect in a digital work space.

Un-consumption
Some minimalists have elevated this to the level of an extreme sport, trying to live off 33 things or 55 things. I'm not sure about such athletics, but I do think there's some virtue to consuming less. It is often healthier for your body and your environment, leads to less worry and stress, and keeps you emotionally happier and financially more secure. Minimalism takes the form of minimizing here. If you can live without something, you probably should.

Minimalist Design Philosophy
Minimalism is a good design philosophy for particular use cases. Specifically, it's a good design when you're a "consumer" and not a producer. I love when a media player is really simple and gets out of my way and lets me watch my movie. I hate it when a programming IDE is minimalist. It's a matter of having the right tool for the right job.

Minimalism as a Way of Living
This, I think, is hokum. Leo Babauta is a minimalist. He advocates reducing work hours and works something like four hours a day. But what does he do with these saved hours? He spends time with family, runs and thinks. Is he doing less than a corporate minion working 12-hour days? There's no way to measure that. They both live 24 hours every day, just like the rest of us.