Tuesday, March 01, 2011

the mind/body divide is meaningless

We often think of our minds as something that inhabits a body. We think of the mind as separate from the body it inhabits. Literature is full of this idea. In Freaky Friday, a mother and her daughter exchange bodies. Apparently, the only thing that changes is their bodies; their personalities remain intact. In the Harry Potter books, wizards and witches can transfigure, turning themselves into an animal or a piece of furniture. Prof. McGonagall is known to turn herself into a tabby cat. But even as a tabby cat, she still remains Prof. McGonagall. Her essence, her mind, remains intact in the new body.

This kind of mind/body divide is artificial. Minds cannot be separated from bodies in this fashion. Let us do a thought experiment. Suppose that you suddenly acquire a sixth finger on one of your hands. Do you remain the same person, albeit with six fingers on one hand? I think not. You are now a person who has six fingers, a person who knows how to use this sixth finger. When you hold something, your perception of that object has changed. Your perception of the abilities of your body has changed. All of this is also part of your 'mind'. Bodily sensation cannot be separated from the mind.


Monday, February 28, 2011

Minimalism does not work with food

Leo Babauta has this interesting article on the minimalism of a pizza Margherita. I enjoy minimalist food. I like to eat fruits plain and I enjoy minimal meals with only one or two items. (My all-time favorite is daal-chawal with pickle.) However, food is a classic example where minimalism does not work. You cannot capture the rich taste of a Hyderabadi Biryani or the subtle flavors of chana masala with minimal ingredients. All the dozen ingredients have to come together in the right proportion to create that magic. There is no way you can capture that beauty with just two or three ingredients.

Thus, minimalism cannot be some universal principle governing all aspects of your life. It has to be a deliberate choice, used only when it fits the context.

Sunday, February 27, 2011

How to Use Experts

Noreena Hertz gives a passionate talk on how experts are fallible and how we, as a society, should develop a healthy mistrust of experts.

I have some gripes with the talk. In the opening lines, she would have us believe that we have always relied on experts to make decisions. This is not true. Relying on experts to make decisions is a fairly recent phenomenon. Take the decision on what to eat, for example. In most cultures, there are elaborate rules, rituals, traditions and folklore surrounding what to eat, when and how. It represents the collective knowledge and experience of a culture about a very important aspect of human life - eating food. Relying on nutritionists is a fairly new phenomenon, a by-product of the modern capitalist/consumerist culture.

Second, I do not think the problem lies with experts themselves. I think experts are often very aware of the assumptions that they make and the limitations of their methodologies. There is often ample dissent within an expert group, at least during the early stages of development of a field. Experts also know that they are only human and make mistakes all the time.

The problem doesn't lie with non-experts either. People are often inquisitive and cautious. They will always challenge authority whenever they feel empowered to do so.

The problem lies with communication. Experts' opinions reach the public as pithy slogans. The media loves taglines - eating chocolate decreases the risk of cancer, drinking coffee can help you live longer. Experts themselves will seldom make claims like that. The media fails to deliver the nuances and the details of expert knowledge to the general public.

The communication problem also runs the other way round. In most cases, there is no mechanism for experts to listen to 'user' feedback. Experts are often working with limited eyes and ears. This is perhaps a folly on the part of experts. With the advances in communication technology, this shortcoming can perhaps be overcome.

See this great talk by Thomas Goetz about how to redesign medical data to better communicate with patients.

Tuesday, February 22, 2011

Getting Carried away by the Internet

The Internet is perhaps the single most powerful change that has occurred in the developed world. It is indeed a wonderful invention that is changing us in ways we never knew were possible. New possibilities are opening up while the old world order is breaking down.

And yet, it is very easy to get carried away. Consider for example the recent revolution in Egypt, which was widely referred to by the western media as a 'twitter revolution'. This term, I think, does a disservice to the people of Egypt. Revolutions are created by people, not twitter. Twitter was not the cause of the uprising. It was only a medium. Calling it a twitter revolution is akin to calling the Renaissance a printing press revolution.

The danger here lies in ignoring the complexities of human interaction and reducing it down to a single focal point.

I'm reading this book called 'Is the Internet Changing the Way You Think?', a collection of essays on the topic by leading thinkers of today. The book is compiled by John Brockman of Edge.org.

The book contains an essay titled 'A Level Playing Field' by Martin Rees, president of the Royal Society. His thesis is that the Internet is leveling the playing field. As an example, he cites the work of Dr. Manindra Agrawal and his two students, Nitin Saxena and Neeraj Kayal (all of whom belong to my alma mater). Dr. Agrawal and his associates posted their work on the internet, thus receiving instant peer review and recognition of their work. Rees then compares this to the relative anonymity of Ramanujan until he was discovered by Hardy and presented to the western world.

What this simplistic analysis ignores is the contexts that these two researchers worked in. Dr. Agrawal works at the premier institute of free, independent India, set up with American collaboration back in the 60s and deeply steeped in the Western tradition of science. Ramanujan was born in pre-independence colonial India and had little or no western education. Once you factor these social differences in, the playing field doesn't look so level after all.

The point is not that the internet is not making a difference. The point is that it is not the magic wand that many people claim it is. Old power structures still exist and will continue to exist. In fact, if we are not careful, they will soon devour the freedom and democracy that come with the internet. Let us not get too carried away with the charms of the backlit screen.

Thursday, February 17, 2011

The Problem with Stereotypes

Supporters of stereotypes often take refuge in facts and statistics. For example, people might support the stereotype that children of Asian immigrants in America are high academic achievers. These supporters will dig up data from spelling bees, olympiads and standardized test scores to establish their arguments. Often, little fault can be found with their data.

The problem lies with the inference. In emphasizing ethnicity when analysing academic achievement, the implication is always that ethnicity is the cause of said achievement: these kids are good at spelling bees and olympiads and standardized tests because they are Asian immigrants. Correlation is confused with causation (if not explicitly, then at least implicitly). And that is problematic.

Sunday, February 13, 2011

Introducing - The Indian Cow

Today I want to introduce you all to The Indian Cow - an India-centric content aggregator.

We all love to share links on facebook and twitter. Often these status updates and tweets are private and meant only for our friends. But we, the folks at The Indian Cow, thought, why not make it a thing? Of course, the idea is nothing new; there are at least a thousand, if not a million, blogs out there sharing nothing but links. But hey, we're special! **insert random marketing jargon here**

We believe that a void has been left behind in the wake of (now defunct) Desipundit and (almost defunct) BlogBharti. We aim to fill this void. We are also...

Bah, we're just doing it for fun. We hope you will enjoy the links we share and have good conversations about them. That's all.

So head over now to our website and subscribe via RSS or email. Or follow us on twitter @theindiancow. We even have a facebook page!

Friday, February 11, 2011

Secularism is Not Anti-God?!

Read this news item where the Gujarat High Court has declared that secularism is not "anti-god" and that it is perfectly okay to perform a religious ceremony to mark the start of construction of a public building. (Via: The Indian Cow.)

Secularism is interpreted in different ways in different countries. In the US, the guiding principle is 'separation of church and state'. That is, the government does no religion. In India, the government does all religions. They sponsor both the Kailash Mansarovar and the Hajj pilgrimages.

However, the problem with the Indian interpretation is that of favoritism. If the state does all the religions, what stops it from having a favorite one? In this particular case, why was a Hindu ceremony performed, as opposed to, say, reading verses from the Quran?

Why is no consideration given to how the minorities feel about a Brahminical ceremony being performed for a public building that they have an equal stake in?

Thursday, February 10, 2011

The Thing to Understand about Internet Privacy

Is that the internet is a public space. Just like real spaces, the level of your privacy varies. You could be exchanging emails with your spouse, and just like bedroom conversations, these will most probably remain private unless someone really comes digging for them. You could shout something out on facebook or twitter. Only your friends will hear it, but you never know who is gossiping to whom and where the word will reach. Finally, you may write something on your blog, and it's there for everyone to read.

Whatever you write, the internet is a public place. Do not write something that you would not say in public.

The second, also important, thing to understand is that the internet is not akin to speaking. It is akin to putting up printed notices and fliers. Everything is at least semi-permanent and hard to destroy or deny. So whenever you post something on the internet, think of it as posting something on the notice board in your own handwriting.

Sunday, February 06, 2011

The Three C's of Being Human

There are three human traits that are as fundamental to being human as any other.

We all feel compassion for others. Some of us feel compassion for every being in this universe. Some people feel compassion for the environment or for animals. Some feel compassion for fellow human beings. But even if you do not, you might at least feel compassion for your family, your parents, spouse or children. And if not even that, then at least you feel compassion for yourself. Or have, at some time. Compassion is a deeply human emotion.

All of us have felt curiosity at some time or another. As children, we were intensely curious. That curiosity helped us learn how to talk, what to like and what not to like. Even as adults we retain a fair amount of that curiosity. You might not be at the frontiers of science asking the cutting-edge questions, but you at least read a new word in the newspaper and wonder what it means. Being curious defines us as human beings.

Human beings are makers. Over the course of our lives, we make a thousand things. Creativity can manifest itself in myriad ways. Perhaps you came up with a clever way to plug that leak in your plumbing. Or maybe you devised a smart shorthand while taking notes in class. As humans we are infinitely creative.

Saturday, February 05, 2011

The Power of Choosing

I recently read a self-help book called 'The Now Habit' by Neil Fiore. The book intends to help you overcome procrastination. Like all self-help books, it doesn't quite help you as much as you'd like it to. But it does have a takeaway message that I really liked.

The author asks us to replace "have to" with "choose to". Day in and day out we find ourselves saying to ourselves - I have to go to work today. Or, I have to get this report done by evening. I should be exercising more. I should eat less fatty food. And so on.

The author asks us to replace this internal dialog with a choice statement. I choose to go to work today. I choose to get this report done by evening. I will exercise more. I will eat less fatty food.

When you say "have to", you are putting someone else in charge: your boss, your teacher, or your doctor. When you say "choose to", you are in charge. This can be a liberating feeling.


Friday, February 04, 2011

UI Design and Choice

Good UI design is easy to use, among other things. But ease of use is often in conflict with the amount of choice given to the user.

More choice requires more effort from the user. Imagine walking into a supermarket where they sell a hundred different varieties of bread. Which one do you buy? Instead of being empowering, this abundance of choice can be intimidating. The user is likely to just pick up a loaf at random and walk away with the feeling of not having chosen the best option, or of not having chosen at all.

But having choice is good too. Everyone likes versatile tools. Wouldn't it be a joy to discover that your blender can also juice your oranges?

Designers of computer interfaces have to balance these two concerns. Most go either one way or the other. The Linux desktop environments Gnome and KDE are almost diametrically opposite examples. Gnome gives the user as little choice as possible and believes in having "good defaults" that the user need not change. Gnome is known to have actually reduced the number of things the user could do over time. KDE, on the other hand, throws every possible option at the user, resulting in monstrosities like this.

To compound the problem even further, different use cases need different levels of choice. As I've said earlier, if I just want to watch a video, a play/pause button and a seek bar are enough. If I'm writing code, I need all sorts of sophisticated options.

I'm not sure what the solution to this conundrum is. Most good designs solve it by limiting their target audience and figuring out what is best for them.
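One common compromise between Gnome-style simplicity and KDE-style configurability is to expose a minimal surface with sensible defaults while keeping every knob reachable for those who want it. Here is a small illustrative sketch of that idea in Python; the function and its parameters are invented for the example, not from any real player.

```python
# Sketch of the "good defaults, full control available" compromise:
# casual callers touch nothing; power users can override everything.

def play_video(path, speed=1.0, subtitles=None, loop=False):
    """Describe a playback request; every option beyond the path has a default."""
    return {"path": path, "speed": speed, "subtitles": subtitles, "loop": loop}

# The casual case needs one argument...
simple = play_video("movie.mp4")
# ...while the power user can reach every knob.
custom = play_video("movie.mp4", speed=1.5, subtitles="en.srt", loop=True)
print(simple["speed"], custom["speed"])  # -> 1.0 1.5
```

The design point is that the defaults, not the options, define the beginner's experience: the hundred loaves of bread are still in the shop, but one is already in the basket.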

Thursday, February 03, 2011

Tell-Tale of Poverty, Detective Novels, Letters in Faces, the Yearning to Become Someone Else and the Hidden Meaning of Words

Today we feature a guest post by Vaibhav Rathi about the strange allure of Orhan Pamuk.

A fictitious article, an essay, a book, or just a random conversation with a person you have never met. When you read some of these (or other) things again and again, it will often happen that the old meaning, which you were all too sure of, starts to melt away, giving way to the obvious meaning which you have long suspected.

So when you are reading a good book (cue: the kind that makes you read the world, rather than just itself) which boasts of being a religious-political thriller, and you are assaulted with a chapter titled We're Not Stupid, We're Just Poor, it's only natural that you immediately think of the perception that our society is arranged in decreasing order of intelligence. This is how we think of the poor: it's because they are stupid that they haven't figured out how to make money and live a good life. Rich folks think this of poor folks, rich countries of poor countries, and so on. But most of all you'd be haunted by all the new types of Secularists, Islamists, Nationalists, and Leftists you'd find in there. Although a political thriller, as it claims to be, it doesn't have any government, government agents, corporations, establishments, or even spies. All there is is a bunch of poor people stranded in a small town, completely cut off from the outside world by snow. It's only then, among those individuals with dangerously conflicting ideas, that you get to realize what politics is all about: ideas and beliefs. No sooner have you felt this very basic fact than you are assaulted by secularists who are not as unbiased as they seem, Islamists whose only mistake is that they have self-respect, nationalists who live in the past, leftists, and two young boys who are discovering what it's like to have faith. Well into the book, you'd know that there is no way this story can ever end.

When you, rather complacently, would figure that it might be the most influential book you'll ever read, soon enough you'll stumble upon another one which will mock you for holding such deluded notions. This new book would look like a detective novel and read like a detective novel, but the problem is it wouldn't feel like one. No murder, no criminals, no pursuit, ever-morphing identities; you wouldn't know who is searching for what, or even if there is anything to search for.

Early in the book, you'll read: Galip had once told Ruya that the only detective book he'd ever want to read would be the one in which not even the author knew the murderer's identity. Instead of decorating the story with clues and red herrings, the author would be forced to come to grips with his characters and his subject, and his characters would have a chance to become people in a book instead of just fragments of their author's imagination.

You'd feel nervous, intimidated, when the protagonist says that he finds it incredibly artificial when everything just falls into place at the end of a detective book, and at this precise moment the author whispers all the clues into the ears of the detective, who is till now seemingly ignorant, and who solves the mystery then and there. Just as you read this, you'd know the following story would not be a simple one, would not be one story, but a story becoming another story, yearning to become another story, and most of the time passing itself off as some other story, just as people do.

You shouldn't be bothered if you are required to solve all the mysteries of the world in the course of your search, as everything will become a clue: you, your wife, all the possessions of your wife, all the columns her ex-lover writes, the mystery behind mannequins influencing people's expressions, all the faces around you, the letters engraved in those faces: two brow lines, four eyelashes, and one hairline, seven in all, at least until the face is divided in two by a 'late arriving nose', and then the letters engraved number fourteen. Then you'll take into account the more poetic real and imaginary lines, the number doubles again, leaving you with a new twenty-eight-letter language to decipher. Not to mention the entire history, all the fables that were ever created, the parables that were ever told, and the secret meanings hidden behind all the words used. The world will be a sea of clues; every drop will bear the salty taste of the mystery behind it. Each and every thing will morph into a clue pointing you to another world which is bare of mysteries, and wherein you can become someone else.

When you see in the faces of people the yearning to become someone else, to pass off as someone else, it is imminent that you'll think of every time you had already known this. When people wish you Merry Christmas with an unusual zeal, and perhaps lack that zeal even for their own festivals, and conveniently forget the other ones. When people mention Boxing Day in a place where there is no such thing, and when those same people wish each other Happy Thanksgiving, you'll be almost sure that they have already become someone else. By this time, even their voices will have changed, and so will their speech; they will look different, but you'll easily be able to tell the truth just by having one look at the conflict on their faces. They have long forgotten who they really are; they have forgotten their history, their culture, their identity.

It is at this precise moment that you'd be assaulted by two questions:

Do you have trouble being yourself?

Is there a way a man can only be himself?

Each of these clues would lead you to another mystery that is waiting to be unveiled: the mystery behind the hidden meaning of words. Words contain second, third, or even more meanings that are hidden; you just have to look for them. You have to convince yourself that they will reveal themselves eventually. Over time, you'll come to know of the mystery words harbor, the secrecy they enjoy, their ability to hide an infinite amount of meaning. When you are conversing with somebody you know, or don't, in the absence of special circumstances, it is natural that everything will seem obvious. But you'll know what they really mean; it's not that they are lying, they just mean something else. Just by paying attention to words you'll know that, since they mean something else, their choice of words will invariably be biased. And sometimes it will also happen that you'll come to know what the other person hasn't even realized yet. All this just by paying attention to words and their hidden meanings. As when the accused becomes upset on being accused, you'd know of the prime thing that makes humans upset: guilt. Now think how you will feel when, later, you get to know that the hidden meaning, which you sought after peeling away all the layers and which was never said or implied by the unknown person, is indeed the only truth, not just your delusional fantasy. So it is only natural that eventually you'd stop looking at the meaning that is all too obvious and look instead at the hidden meaning, because only it is the truth. Words will never reveal their true meaning, and it is only after a number of attempts that you can hope to get at the meaning that is always hidden layers beneath.

In the end you know that your gut feeling, your intuition, is the only real meaning left in things.

But mostly, even without realizing it, you wouldn't want to look for the true hidden meanings, because you'd be afraid of what you might find.

It is at this moment, when you are lost among, and overwhelmed by, hidden meanings, that you'd try to make sense of things, and this is all that will come out (by that green pen of yours):

I wonder: do people always mean what they are saying, or are they pointing you to an entirely different meaning? Often I feel maybe people are speaking in codes, with some subtle meaning; if only I could understand the meaning, I'd be able to talk to them. Of course I would also say some things while, at the same time, implying other things. Then we would finally be able to have a conversation. To an onlooker, we'd be talking normally, perhaps about something relevant to the current time: a movie, a book, or some music we both just happen to like. But all this time I'd know the other person is reading all the signs from the things I am saying; more often he'll also get the meaning of things I had thought I'd say subtly but didn't. When we are talking about a thing, quite normally, it's natural that we'd both be aware of the other thing that this thing signifies, because we are indeed only talking about the other thing. When we are talking about a movie, we both would immediately know that the conversation is not about this movie but about some other movie whose fate is linked with it, maybe because the director was only able to make this movie because he had made the movie which we are really talking about. Or perhaps we would really be talking about an actor who is not in this movie, but only because he didn't want to be. A song would really mean an incident woven into its lyrics, a book would refer to the thing which has really affected us, a cat would really mean a dog, while a dog would mean the book which has a dog on its cover, and he would mean she, and so on. In our conversation, things will refer to their real meanings, not to the meanings that people take for granted. Then, looking down, we'll laugh at the world, but maybe we'll really be frowning at their innocence and shallowness, because of which they will never get the true meaning that things signify.

When you are searching very hard for something, after long enough the search will become more important than the thing itself. You won't be sure anymore of the thing you really wanted to seek.

In the end, you'd know the real question is: did apes feel the same solidarity towards Darwin as he did towards them? He had no way to communicate with them to confirm. Even if he'd tried, it would have been, literally, like talking to chimps.

All the real meanings are hidden and only the hidden meanings are real.

Thursday, January 27, 2011

Social Change is the Greater Challenge

Here is a clip from yesterday's episode of The Daily Show:

State of the Union 2011 - Night of Too Many Promises (The Daily Show with Jon Stewart)

It really amazes me how going to the moon is seen as more difficult than having a fast train network, electric cars and teachers who are seen as nation builders.

We have already been to the moon. We did that in the 60s with computers that were barely faster than a pocket calculator. On the other hand, it is now 2011 and we still haven't managed to put food on every plate, we are headed towards an environmental disaster, and income inequality is rising rapidly. Yet only high technology is seen as a challenge, as something worth achieving. Social change is something mundane, something that doesn't quite fire our imagination.

In my opinion, social change is a far greater challenge than going to the moon (or even mars) could ever hope to be.

Tuesday, January 25, 2011

What is Linux

Despite its meager market share, it's surprising how rich and varied the Linux community is. There are differences in philosophy and outlook. In fact, the very definition of what Linux is varies.

Here is what I think Linux is:

Technology: Of course, first and foremost, Linux is the technology. There is the Linux kernel. And then there is the giant world of open source software that runs on that kernel to create a wonderful computing environment. Some people consider only the kernel to be 'Linux'. I don't agree. The entire software ecosystem for me is Linux.

Community: Linux would be nothing without the community. There are the developers who create such wonderful software, mostly on a voluntary basis. Then there are the designers and bug reporters and packagers and maintainers and webadmins and forum moderators. The community is huge. And last but not least, the users. This entire body of people is an inseparable part of 'Linux'. For me, it is not possible to see people as separate from the technology they create and use.

Philosophy: Although differences exist, most philosophies in the Linux world are variations on the same theme - that software should be free (as in freedom) and open. The technology and community are built around these core beliefs and would have little meaning if the philosophy were given up. Some people don't believe the philosophy is important, or find it too militant. I don't think that's the case. If the philosophy is not important, why have Linux at all? Why not just Windows or OS X?

For me, Linux is the inseparable trio of these three components.

Monday, January 24, 2011


It is said that ignorance is bliss. True perhaps for the ignoramuses, but for those of us who know better, it certainly is no bliss to deal with them. I admit, with some shame, that as a schoolkid I had little or no patience with classmates who weren't as 'smart' as I was. When I went to college, I wasn't the smartest guy around anymore. Yet I still harbored a mild disdain for those who knew less than me.

I'm certainly not alone. I've seen people often complain about ignorance. Videos of how stupid Americans are abound. Fellow bloggers complain about the poor level of knowledge among the people they interact with.

I have come to realize that it's not so much ignorance that pisses me off as the lack of willingness to learn. A person who didn't know something wouldn't piss me off as long as they said - hey, I don't know, tell me. Or, hey, I don't know, let's find out.

But it angers me when people say, I don't care or just believe the wrong thing.

Sunday, January 23, 2011

Packaging Linux as an OS

Linux is radically different from the proprietary OSes - Windows and OS X. And I don't just mean in terms of philosophy; I mean technically too. Linux has a radically distributed model of development. Various parts of the OS (and here I define the OS as the entire stack up to the desktop environment) are created by different teams. They are designed to be independent and swappable (well, mostly). You can use the vanilla kernel or one of the custom kernels for your platform. You can use the ubiquitous X server or swap it for something like Wayland. You can choose from one of the many desktop environments - something quite unique to Linux.

The entire software stack is always in a flux. Different projects have different release schedules, different compatibilities, different goals, functionality and philosophies. As you can imagine, packaging Linux as an operating system is a nightmare.

The situation becomes especially difficult for application developers. When you are writing an application for Windows or OS X, you just have to make sure it works for the latest few versions of the OS. When you write an application for Linux, what do you write for? What kernel version, what X server version, Gnome or KDE? Which distribution?

This is a major problem for Linux and one of the reasons why commercial application support is so bad on Linux.
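To make the fragmentation concrete, here is a toy Python sketch of the very first decision an application vendor faces: figuring out which distribution family the user is running, typically by reading /etc/os-release, just to know whether to ship a .deb or an .rpm. The function name and the mapping are my own illustrative inventions, not from any real packaging tool, and the mapping is deliberately incomplete.

```python
# Toy sketch: guessing a distribution's native package format from the
# contents of /etc/os-release. Illustrative only; the real landscape of
# distributions, versions, and desktop environments is far messier.

def package_format(os_release_text):
    """Map os-release ID/ID_LIKE fields to a likely package format."""
    fields = {}
    for line in os_release_text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            fields[key.strip()] = value.strip().strip('"')
    # ID_LIKE lets derivatives fall through to their parent family
    # (e.g. an Ubuntu derivative declares ID_LIKE=debian).
    candidates = [fields.get("ID", "")] + fields.get("ID_LIKE", "").split()
    for distro in candidates:
        if distro in ("debian", "ubuntu"):
            return "deb"
        if distro in ("fedora", "rhel", "centos", "suse", "opensuse"):
            return "rpm"
    return "unknown"

sample = 'NAME="Ubuntu"\nID=ubuntu\nID_LIKE=debian\nVERSION_ID="10.10"\n'
print(package_format(sample))  # -> deb
```

And this only settles the package format; the kernel, X server, and desktop environment versions each add another branch to the decision tree, which is precisely the nightmare described above.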

Canonical, with its flagship distro Ubuntu, is trying to solve this problem. They first did this by introducing a predictable, stable, six-monthly release cycle. Then they tried to convince various communities to practice what Mark Shuttleworth called cadence - predictable release cycles that synchronize with other projects. However, that suggestion didn't go down well with most communities. And now they are doing it by taking more control of Ubuntu: swapping Gnome Shell for Unity and the X server for Wayland.

I'm not entirely sure that Canonical taking control of Ubuntu in this way is entirely benign. Of course, it is still open source. But as we've seen in the past with the development of the Ubuntu font or the Ubuntu One service, Ubuntu's development processes aren't always very open or democratic. A whole lot of people feel that this harms the community.


Saturday, January 22, 2011

Why Ubuntu Switched to Unity and Wayland

There is a major barrier to the adoption of Linux on the desktop - marketing. An average customer goes to a store (real or on-line) and buys what's available. Usually this means a computer preloaded with Windows or OS X. In the rare case that a computer is sold with Linux pre-installed, the customer has little idea what he's getting into. A couple of days later he returns the computer, feeling rather cheated.

Canonical, the company behind Ubuntu, is trying to solve this problem. They are talking to computer manufacturers to ship Ubuntu on their products. However, they are faced with a unique challenge as they do so.

When Microsoft approaches a computer manufacturer (in a manner of speaking; I don't think they need to approach anyone anymore), they are able to offer an OS that they have total control over. Since they have total control over their OS, they can work with the manufacturer to deliver the best possible experience for that hardware. Drivers can be bundled with Windows; software can be bundled to support uncommon hardware such as touchscreens or fingerprint readers.

This is something Canonical cannot do with Ubuntu. They have very little control over the OS. Various parts of the Linux OS stack - kernel, X, desktop environment etc. - are created by different communities. Canonical may have little or no say in these communities. As a result, Canonical is ill-placed to cater to any special demands that manufacturers may have.

No wonder then, that radical changes are coming to Ubuntu in the near future. First, they ditched Gnome Shell in favor of Unity, their home-grown shell for Gnome. Then, they plan to switch to Wayland, a replacement for the X server. Wayland, being a fairly new project, is ideal for Canonical to infiltrate and control. These moves are clearly meant to exercise more control over Ubuntu as an OS.

Note that while I've painted Canonical in a somewhat negative light, it is still operating within the confines of the open source philosophy. However, we all know that the word of the law might not be its spirit. And many questions have been raised about Canonical's open source spirit in recent times.

Friday, January 21, 2011

Man vs. Machine

It seems rather curious to me that while various US universities are embracing the internet like a second skin - getting facebook pages and twitter accounts and what not - IITK should be thinking about limiting internet access and mobile phones. Why this difference in outlook towards technology?

One group of people think that technology is just a tool. These people see technology as something inert, of itself neither good nor bad. For them, it is we, the humans, who decide how to use technology and define whether it is beneficial or not. The classic example is nuclear technology, which can be used both to produce electricity and to bomb innocent people.

The other group of people are inherently suspicious of technology. They believe that the use of technology inevitably corrupts. Such people believe that technology is not inert, that it affects human behaviour to the extent that it leaves us with no choice, no control over how we use it.

Thursday, January 20, 2011

Bad (?) UI Design on the Nook

There is a UI design flaw in the nook that's been bugging me ever since I bought one. It has been about six months of using it regularly (every day) and it still bugs me. So I thought I'd rant about it.

Given below is a picture of the nook showing a hand holding it in a typical use position. You will notice that the nook has two buttons on each side for turning pages. The bottom buttons on each side move the page forward. The top buttons move it backward.

I find this very counter-intuitive. Much computerized software and hardware has the equivalent of back/forward buttons. The tasks may be different - back/forward page browsing in firefox or previous/next song in iTunes. However, one visual principle remains the same - the left button is back, the right button is forward.

The nook breaks this visual principle. Here, the top button is back and the bottom button is forward. This would not be half as bad if the buttons weren't marked with left and right arrows. This creates a mentally inconsistent model - top/bottom is mapped to left/right.

Even after six months, I haven't gotten used to it and often keep pressing the left bottom button to go back a page.

From another point of view, this design makes sense. As you will notice from the picture, in the typical use scenario, the thumb of the user rests on the bottom button. Since moving forward is the most common operation, it is provided at the most ergonomically convenient location.

Wednesday, January 19, 2011

Why We Like Post-Apocalyptic Science Fiction

Post-apocalyptic science fiction is a staple sub-genre of science fiction. It typically portrays a world after an apocalypse of some sort. Most of humanity dies and the few survivors must make it against enormous odds. The ending is not always happy in the Hollywood sense. The show Battlestar Galactica comes to mind, as does the Resident Evil franchise and 'The Windup Girl' by Paolo Bacigalupi.

I wasn't a big fan of post-apocalyptic science fiction to begin with. I was a Star Trek fan, which meant that I had a very positive, almost utopian vision of the future. I liked science fiction in which humanity expands beyond the solar system and thrives throughout the galaxy. My favorite author was Asimov.

However, gradually I began to like post-apocalyptic stories. And today I wondered why that was. I think the reason is that as I grew older, I learnt more about our social, political and economic systems. I learnt about global warming and neo-colonialism. And in such an environment, it's hard not to become pessimistic and cynical. It is hard not to feel that we deserve an apocalyptic future.

Reading post-apocalyptic science fiction is an act of self-flagellation.

It isn't unlike the concept of heaven in most mythologies. The idea that most of humanity is corrupt and that we'd be destroyed for it (either by a god or by the natural course of this corruption) is very old. Of course, some of us will survive, and they will essentially be the righteous. Heaven is ancient post-apocalyptic science fiction.

Of course, in most modern stories, you don't go to heaven anymore. Instead you're chased by Cylons or zombies (or both: hey, if you can have cowboys and aliens, why not cylons and zombies). There is no reprieve.

So perhaps modern fiction is more like Hell. There is no redemption and you must suffer for all eternity.

Tuesday, January 18, 2011

Ragging and Abuse

When I entered college in 2003, ragging was very much in vogue in almost all Indian colleges. Understandably, this can get rough for a first year student. Compared to other colleges, ragging at IIT Kanpur can be described as mild, or even feeble. Which was good for me. I was shy and reclusive as a school student and intense ragging would have done me little good.

One reason ragging was so controlled at IITK was the counseling service. They put a whole lot of time and effort into creating a student culture that was sensitive to the issues surrounding ragging. They didn't want to put a complete ban on it because it was seen as a ritual practice wherein the seniors got to know the juniors. But they also realized that it could be intensely abusive and sought to bring down the abuse factor.

Non-physical abuse is difficult to define. There were first year students who'd get extremely uncomfortable even as the seniors merely started to talk to them. And then, where does conversation stop and abuse begin? There are hundreds of variations on just saying 'hello'. You could be cordial or you could be abusive in just that single word.

The counseling service adopted the view that it was up to the first year student to define abuse for himself. Ragging could go on, but only up to the point that the student felt comfortable with it. If anyone saw that a first year was getting uncomfortable, it was supposed to stop.

Abuse gets even more complicated between two people who are emotionally related to each other. A verbal match can be playful teasing between a couple or it could be a nightmare of verbal violence. Where does one draw the line?

It is even more complicated in the case of parents and children. While individuals in a relationship are often capable of drawing the line of discomfort, children are not. Only adults can draw this line for them and these adults are often the very ones perpetrating the abuse.

The Expansion of the Noteworthy

The 90s in India can be called the decade of the rise of quizzing. When we were kids, we had the fabled Bournvita Quiz Contest. A dear friend of mine went on to participate at the national level. He was the first friend of mine to appear on television. The second friend to appear on television, several years later, was on the show Kaun Banega Crorepati Junior. In the intervening years, quizzing had evolved from a prodigious duel of minds to a game show.

I never liked quizzing. In the beginning, it seemed easy enough. I was what Americans call a straight A student. I knew things. So I should have been good at quizzing. It did seem like that for a while. All I had to do was keep up with the news, read those ‘general knowledge’ books that I’ve talked about earlier, and combine all of that with my already voracious reading habits to become a quizzing genius. Alas, it didn’t work that way.

For soon I discovered that what was worth knowing was ever expanding. It wasn’t confined to the news, current affairs or general knowledge books. I was scandalized to find that they asked you questions about movies, things that I knew little of. To make it worse, they asked questions about Hollywood movies, things that I had little or no access to. Movies weren’t knowledge! They were time-pass, entertainment!

I found that I wasn’t interested in keeping up. I had a very narrow field of interests – science and fiction – and didn’t feel much need to step out of it.

With the advent of mass media, what is worth knowing has expanded tremendously. And it has really exploded with the internet. The Encyclopedia Britannica article on the Barbie doll is about 700 words long. Compare that to the Wikipedia article on the doll. Imagine how much more there is to know today.

In such an environment, all of us feel as if we aren’t keeping up. That we are missing out on stuff. Hence this strong urge to be plugged in to our social streams. We don’t want to be people who don’t know. We don’t want to feel knowledge-inadequate.

Sunday, January 16, 2011

Does Meditation Work for Me?

I get this question a lot these days - does meditation work for you? I'm not entirely sure what people mean by that. Perhaps they mean if it is changing my life in any way. Well, yes, it is - I now do about 30 minutes of meditation each day and wake up at 6:30 every Saturday to go to my meditation group.

The question will get even more puzzling if you start exploring what kind of meditation I do. I do Zazen, which is a sitting meditation practiced within the Zen Buddhist community. It is a peculiar practice because it promises you absolutely nothing. It doesn't say that it will increase your concentration or memory or make you feel better. In fact, to do Zazen with any goal in mind is not doing it properly. It is a goal-less practice.

If the practice is goal-less, why do it?

I'm not sure about that either. There is something that draws me towards it.

Okay, that sounded mysterious. And I hate mysterious. That's precisely the reason I started doing this practice - that it was NOT mysterious.

I do this practice because it brings some semblance of structure to my life. As a post-modern, liberal atheist I find it hard to believe in anything. My life seems to be without any anchor at all. Morals are up for negotiation, good and bad a matter of definition. This lack of structure in life can be extremely disorienting.

Enter rituals. Rituals are important. Rituals are powerful. And I'm not talking about religious rituals, although they share this quality. I'm talking about everyday rituals. Waking up at a certain time, brushing your teeth, following a daily routine. Celebrating certain festivals at certain times of the year. They all bring structure to life. In and of themselves they have no meaning. But we choose to give them meaning. Because that's what human beings do.

Zazen is one such ritual for me. It's just something I do. It gives structure to my life.

Why Zazen, why not something else? Well, because Buddhism is atheist and rational and I can stand it.

Saturday, January 15, 2011

Inventors and Inventions

When I was a kid, something called 'general knowledge' was very much in vogue. It consisted of committing myriad facts and figures to memory, which were then parroted out to impress your fellow students. I remember one classmate who memorized the capital cities of all the countries of the world.

So this so-called general knowledge had various topics. One of the common ones was 'inventors and inventions'. Given an invention you had to name the inventor and vice versa. Light bulb - Edison, steam engine - James Watt, airplane - Wright Brothers, telephone - Alexander Graham Bell, and so on. Until, of course, it got to the computer. It's hard to define who really made the computer. You could name Babbage and his difference engine, but then what about Turing and his hypothetical Turing machine? And what about the fact that computers were made from vacuum tubes until the transistor came along? So which one do we talk about -- the idea, the mechanical, the electrical or the electronic computer?

But I digress.

I was totally into science as a kid. I had several books at home about scientists and inventors, narrating stories of how they did a particular piece of science or invention. The story was always about individual effort, hard work and genius. Moral of the story -- if you locked yourself away in your office and were a genius and worked and worked, you'd make a great invention some day.

Despite my earlier lampooning of 'Guns, Germs and Steel', I did find this gem in there.

In reality, even for the most famous and apparently decisive modern inventions, neglected precursors lurked behind the bald claim “X invented Y.” For instance, we are regularly told, “James Watt invented the steam engine in 1769,” supposedly inspired by watching steam rise from a teakettle's spout. Unfortunately for this splendid fiction, Watt actually got the idea for his particular steam engine while repairing a model of Thomas Newcomen's steam engine, which Newcomen had invented 57 years earlier and of which over a hundred had been manufactured in England by the time of Watt's repair work. Newcomen's engine, in turn, followed the steam engine that the Englishman Thomas Savery patented in 1698, which followed the steam engine that the Frenchman Denis Papin designed (but did not build) around 1680, which in turn had precursors in the ideas of the Dutch scientist Christiaan Huygens and others. All this is not to deny that Watt greatly improved Newcomen's engine (by incorporating a separate steam condenser and a double-acting cylinder), just as Newcomen had greatly improved Savery's.

Similar histories can be related for all modern inventions that are adequately documented. The hero customarily credited with the invention followed previous inventors who had had similar aims and had already produced designs, working models, or (as in the case of the Newcomen steam engine) commercially successful models. Edison's famous “invention” of the incandescent light bulb on the night of October 21, 1879, improved on many other incandescent light bulbs patented by other inventors between 1841 and 1878. Similarly, the Wright brothers' manned powered airplane was preceded by the manned unpowered gliders of Otto Lilienthal and the unmanned powered airplane of Samuel Langley; Samuel Morse's telegraph was preceded by those of Joseph Henry, William Cooke, and Charles Wheatstone; and Eli Whitney's gin for cleaning short-staple (inland) cotton extended gins that had been cleaning long-staple (Sea Island) cotton for thousands of years. All this is not to deny that Watt, Edison, the Wright brothers, Morse, and Whitney made big improvements and thereby increased or inaugurated commercial success. [...] All recognized famous inventors had capable predecessors and successors and made their improvements at a time when society was capable of using their product.

Science is social. Why was this never taught to me as a kid?

Friday, January 14, 2011

Why Americans Love Technology and Indians Hate It

As I walk down the street that I live on, in the US, I'm amazed by the sheer number of cars I see. I'm amazed too by the number of different manufacturers that I see and the number of different models. And I am amazed by the number of iterations of each model that I see. You can pick a popular car model and find all of its generations, often going back to the 70s or 80s.

The first ATMs were installed in the US during the 70s. But various prototype cash dispensing machines were tried out even back in the 60s. I saw my first ATM machine in 'Terminator 2: Judgment Day' during the mid 90s and it wasn't until the 2000s that I saw a real one.

Ebook readers have really taken off in the US during the past three years or so. There are two popular brands, backed by mega booksellers, and innumerable smaller ones. We have already seen three generations of the kindle. The market is competitive. Ebook sales have already crossed hardcover sales this year. Yet, not many people in India have heard of ebook readers. My nook really got stared at during my India trip this November.

Technology doesn't ever iterate in India. It just appears, magically and fully formed. As a result it always has that slightly alien feel to it. We don't know who made it and how. Instead of the local lad we knew as a toddler with a runny nose, we get a suave foreigner who we can't quite get around to trusting fully.

This is why Americans love technology and Indians not quite as much. (Okay, we don't really hate it, do we? That title is just link-bait.)

(One possible exception is cell phones. The cell phone market in India rocks.)

Thursday, January 13, 2011

Guns, Germs and Steel by Jared Diamond

I finished reading Jared Diamond's godawful book, Guns, Germs and Steel, a couple of days ago. In fact, I gave up before the last 100 pages or so. I just could not take it anymore. So I might actually have missed out on something vital that he says towards the end. If that is the case, please correct me.

Let me say at the outset that I'm in no position to challenge his scholarship, though others have done so. I don't even have much of a problem with his central thesis - that environment has a big role to play in human history. It is rather the way he says it that bugs me.

I think the book is nauseatingly Eurocentric. Large parts of the analyses are about the European destruction of native American civilizations and his personal experiences in New Guinea. I was very puzzled by his complete neglect of India and China, where one-third of the world's population now resides. He did devote one whole chapter to the Chinese in the end. I'm puzzled why he clubs Eurasia together in one giant lump when there were so many clashes of civilizations going on within the continent. (Whatever happened to the whole east vs. west divide?) What about the modern colonization of the east?

The underlying assumption throughout the book is that the modern western civilization is "better" and that that's what all human societies should have achieved. But they didn't. So why not? His repeated use of the term 'better' or 'more advanced' defined purely in terms of an industrial-capitalist mindset bugs me.

I was struck by one passage where the author describes the war between the Inca king Atahuallpa and the Spanish as 'well-known' because it had been recorded by several Spaniards. Since when did history written purely by the victors become 'well' known?

Finally, his selling of his thesis as something profound and fundamental is really over the top. I agree that environment has a role to play in history. But there are other factors - cultural and accidental - that play a role too.

Wednesday, January 12, 2011

Parents and Verbal Abuse

Why Chinese Mothers are Superior is doing the rounds on the interwebs. Seems to have caused much uproar. Here is one article denouncing the author as an abuser and racist. Here is another refuting the claim that all Chinese mothers are like this.

Amy Chua relates to us the story of a dinner where one of the guests actually broke down and cried when she heard that Amy had called her daughter 'fatty' or 'garbage'. Clarissa seems equally outraged. This to me seemed rather odd.

I think this is because the definition of verbal abuse is different in different cultures. My mother has called me names in anger, as she has done to my brother and our cousins (yes, cousins too). So have I to my younger siblings. I never saw this as 'abuse' and I doubt if any of my siblings did either. I don't think this had any bad effect on me as a kid. It was common practice at school for teachers to say things. We said things back to them during lunch hour and got on with our lives.

Words and behaviours do have different connotations in different cultures. Unlike physical abuse, verbal abuse is subjective. It hurts only to the extent that it's defined to hurt you. A child in America grows up learning that a parent calling you 'garbage' is bad and abusive. A child in India might not even think about it for more than a minute.

This is not to say that abuse doesn't exist. A couple of my cousins are short and dark-skinned and hear no end of it. I find that immensely abusive. Another cousin hears no end of it for being fat. (It is funny that while I can't imagine her being truly hurt at being called 'fatty', she does get very upset when family members discuss her weight in calm, contained voices. So it's the way of saying things that counts too.) I grew up with a complex that I was short and not socially adept because my family constantly reminded me of this. This did have a bad effect on me as a kid.

In short, one does have to be culturally sensitive in defining verbal abuse. A word or tone of voice in one culture/language doesn't mean the same in another culture/language.

Tuesday, January 04, 2011

Sharing an Experience Changes It

One thing to note about blogging is that it changes the way you experience things. The experience is not pure, untainted any more. Instead it's marred by the urge to share it later on.

No longer do you see a movie without thinking - oh, I can write about that. No longer do you visit a place without thinking - oh, I need to point this out in my blog post. No longer do you read a book without a notebook by your side to take notes.

The internet and the sharing of experiences that it enables is changing the way we experience life. People now click photographs to share them on facebook, make family videos to upload on youtube and record their singing at parties to create a podcast. Each experience is experienced with the acute awareness that it is going to be seen and heard by, if not thousands, then at least hundreds of people.

The internet is turning all of us into artists in some sense. Artists are often likened to voyeurs and prostitutes. They make a living by leeching off the experiences of others and by sharing what is most intimate to themselves. The internet is turning all of us into voyeurs and prostitutes.

Monday, January 03, 2011

What is Explanation?

I've started reading this book called 'Guns, Germs and Steel' by Jared Diamond. The beginning was a bit irritating because the author spends the entire preface stating that it is NOT a racist and/or eurocentric book. Sounded a bit like the old 'I'm not racist, many of my friends are black' argument to me. But I'm giving him a chance. Will reserve my final opinion until I'm done with the book.

The author's central thesis is this -- different societies developed differently over the course of history. He wants to know why. And he repeatedly claims that he is looking for the 'ultimate explanation' of these differences. This puzzles me slightly.

What could one mean by the 'ultimate explanation' in the context of historical differences between societies? History is an endless chain of cause and effect. A web would be a better word to describe it. There are multiple causes and multiple effects for each historical incident. Many of these are well studied. We keep finding new ones, which is good because our knowledge advances that way. We also keep questioning and perhaps throwing out the old ones which is also good because skepticism is the fundamental characteristic of the scientific attitude. In short, it's business as usual for historians.

But the author is not talking about these chains or webs of historical causes and effects. He is looking for an explanation that goes beyond this. I do not know what exactly he might mean by this.

History does not lend itself to 'explaining' in the same way that physical phenomena do. Physical explanations are concerned with precise mathematical relationships between phenomena. Newton could observe apples falling and come up with the laws of gravitation. That is, he could write down a precise mathematical relationship between the masses of the objects and the gravitational pull between them. He could also write down the relationship between the amount of pull and the speed that an object attained under that force.

But that is not all; physical explanations are also reductionist. That is, you can isolate effects, or 'variables' as fancy scientists like to call them. An apple falling under the bright blue sky is being attracted not only by the earth, but also by the moon, the sun and the entire galaxy. The beauty of gravitation is that you can isolate the effect of the earth, the sun, the moon and the galaxy and measure them independently. Then you can add all of those effects up and you will get the total effect. The whole IS the sum of the parts. This is called the principle of superposition. It is important to know where it applies and where it doesn't. Physics books explicitly state where it applies.
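This superposition can be sketched in a few lines of Python. The masses and distances below are rough textbook values I've plugged in for illustration, not figures from this post:

```python
# Each body's pull on a falling apple is computed independently with
# Newton's law, a = G * M / r^2, then the contributions are summed.
# Masses and distances are rough textbook values (my assumption).

G = 6.674e-11  # gravitational constant, in m^3 kg^-1 s^-2

def pull(mass_kg, distance_m):
    """Acceleration toward a single body, isolated from all the others."""
    return G * mass_kg / distance_m ** 2

pulls = {
    "earth": pull(5.972e24, 6.371e6),   # apple at the earth's surface
    "moon":  pull(7.342e22, 3.844e8),   # mean earth-moon distance
    "sun":   pull(1.989e30, 1.496e11),  # mean earth-sun distance
}

# The whole IS the sum of the parts (directions ignored for simplicity;
# a full treatment would add the pulls as vectors).
total = sum(pulls.values())
```

The earth's term comes out to about 9.8 m/s² while the moon's and the sun's terms are tiny, which is why we can usually ignore them. The point, though, is that each effect can be measured in isolation and then simply added up.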

It is lucky for us engineers that so many physical phenomena follow the principle of superposition. This allows us to isolate effects, study and quantify them, and then put them together in the end to do all those magical things we do.

However, it is hazardous to bring this same reductionist thinking to other spheres. Food and nutrition is one. Michael Pollan points out in 'The Omnivore's Dilemma' that food doesn't quite follow the principle of superposition. Drinking wine and eating carrots together does NOT produce the same effect as drinking wine and eating carrots separately.

The same applies to history. The whole is so much greater than the sum of its parts that it makes little sense to isolate effects. Did human beings develop number systems because they needed to keep accounts, or could they keep accounts because they had number systems? What role did the hard wiring of the brain play in this? What role did the underlying philosophy of the culture play? For example, the Indians may have invented the zero because they were so very concerned (philosophically) with nothingness or shunyata. All these various factors are so strongly interconnected that it makes little sense to isolate the effect of one particular variable.

But that's precisely what Jared Diamond seems to be aiming at. And while I'm not opposed to such analysis as an intellectual exercise, touting it as some 'ultimate explanation' seems to be a bit too much to me.

Sunday, January 02, 2011

What is Minimalism

There seems to be some debate about minimalism going on around the blogosphere. The first is a post berating minimalism as a useless intellectual strategy. The second is a passionate counter-argument to the same. The third is a post covering these two posts.

Being an enthusiastic minimalist, let me add my own two cents to this discussion. I think the problem with all these posts is that they are trying to identify a singular definition for a term that is being used to encompass a whole lot of unrelated concepts. For example, single-tasking, distraction elimination, un-consumption and minimalism as a design philosophy are all being clubbed together. They are disparate things and require disparate treatment.

Single Tasking
While the modern corporate worker is encouraged to multi-task and be good at it, juggling many things together comes at the cost of efficiency. Best work is done when you have long chunks of uninterrupted time. This is especially true of creative work where the most productive strategy is to stare at the wall. In order to get these long chunks of uninterrupted time, minimalists profess disconnecting, clubbing together communication and defragmenting talk time.

I don't think there's anything wrong with this strategy. Shutting yourself up in a room / outhouse is an accepted strategy to get a book written. Creative spaces are full of junk but they often have a "keep out" notice on the door. The minimalists are trying to achieve the same effect in a digital work space.

Un-consumption
Some minimalists have elevated this to the level of extreme sports, trying to live off 33 things or 55 things. I'm not sure about such athletics, but I do think there's some virtue to consuming less. It is often healthier for your body and your environment, leads to less worry and stress, and keeps you emotionally happier and financially secure. Minimalism takes the form of minimizing here. If you can live without something, you probably should.

Minimalist Design Philosophy
Minimalism is a good design philosophy for particular use cases. Specifically, it's good design when you're a "consumer" and not a producer. I love it when a media player is really simple and gets out of my way and lets me watch my movie. I hate it when a programming IDE is minimalist. It's a matter of having the right tool for the right job.

Minimalism as a Way of Living
This I think is hokum. Leo Babauta is a minimalist. He advocates reducing work hours. He works something like four hours a day. But what does he do with these saved hours - he spends time with family, runs and thinks. Is he doing less than a corporate minion working 12-hour days? There's no way to measure that. They both live 24 hours every day, just like the rest of us.