Saturday, September 30, 2006

THE PENICILLIN BEFORE PENICILLIN


We who live in the era of so many effective antibiotics would find it hard to imagine a time when even a simple cut or abscess could lead to a frequently fatal infection. It's hard to imagine the distress of doctors and family members when they saw a patient simply die of such an apparently minor affliction. The story of penicillin, which finally emerged to fight these infections, has become the stuff of legend. What's probably still living in the shadows is the equally crucial discovery of the sulfa drugs, which were the penicillin of their time; perhaps not as effective, but almost miraculous by virtue of being first.

Now Thomas Hager has come out with a book that should rescue these heroic stories from being forgotten. Hager is a fine writer; I have read his comprehensive biography of Linus Pauling. His new book, 'The Demon under the Microscope', is a history of the sulfa drugs discovered by German chemists in the 1920s and 30s. The New York Times gave it a favourable review, and I am looking forward to reading it. The NYT reviewer compared it to 'Microbe Hunters', a classic which has inspired many famous scientists, including Nobel laureates, in their childhood. I too was quite taken by that book, which reads like a romantic account of microbiology. Of course, truth is always harsher than such accounts, but it does no harm to initiate a child into science with them.

It was interesting for me to read that the German chemists had taken out a patent on the azo part of the first sulfa drug. They did not know that it was in fact the sulfa part which conferred activity, and they were soon scooped by French chemists who discovered that sulfanilamide alone has potency.

Sulfa drugs inhibit dihydropteroate synthase, an enzyme in the bacterial folate pathway that ultimately feeds nucleotide synthesis, and they come quite close to the ideal of the 'magic bullet': a molecule that is potent, has few to zero side effects, and most importantly, is selective for the microorganism. In this case, the target enzyme is expressed only in bacteria; humans get their folate from the diet instead of making it. That does not necessarily mean that there will be no human side effects- after all, every molecule can target more than one protein- but it seems to work well in this particular case. Sulfa drugs spurred further research on the folate pathway, including the enzyme dihydrofolate reductase (DHFR), which led to methotrexate, a compound that is even today a standard component of anti-cancer therapy.

Friday, September 29, 2006

NO MELLOW HELLO

"Hello" is probably the world's most common greeting, especially because of the telephone. But I think that it's slowly becoming outdated or at least rare. The reason is that most phones (and all cell phones) now have caller ID, so that you can already see who is calling you. As a result, I have the experience of picking up the phone and hearing "Yes Ashutosh go ahead" or something similar to that. Since I have never owned a phone with caller ID, I tend to still always say 'Hello'.

But as I was pondering this, I remembered that Punekars had established the tradition of not saying hello eons before caller ID and cell phones. This is clearly explained by Pu La Deshpande in his masterpiece 'Punekar Mumbaikar Nagpurkar', where he says that Punekars, instead of saying hello, usually respond as if someone has woken them from a comfortable afternoon siesta, and shriek "Kon aahe?!!" (Translation: "Who the *expletive deleted* is this??!") loudly into the phone.

So that's another tradition that we Punekars taught the world. The more important tradition, of course, as Pu La says, is to argue incessantly and aimlessly about everything under the sun.

Thursday, September 28, 2006

TWO MONTHS FROM THE LIFE OF A TECHNOLOGY CASUALTY

1. Sometime during August 1-10, 2006: Laptop starts doing weird things. Weirder reason discovered.

2. August 15th: Laptop ships out to the Honourable Certified Apple Service Providers AIS Systems (Damn them!)

3. August 30th: Delivery of laptop expedited by courier (40$) after painful wait of 15 days. Laptop is back with logic board apparently replaced. Airport (Wireless Internet) not working. One memory chip missing. Blame game begins.

4. September 2nd, 2006: Laptop ships back to AIS (Curse them!). Logic board turns out to be defective. Blame game begins.

5. September 20th: After 20 phone calls, I reach the stage (called 'Nirvana' in Hindu philosophy) where I just stop caring. I am in a completely serene place, beyond frustration or agitation. I like this place, but not the fact that I don't have my laptop yet.

6. September 27th: I am surprised by drumrolls announcing the arrival of the laptop. Surprised because I had stopped caring so much that I had forgotten about the 17 inch, 2800$ beast. Memory chip still absent. I call AIS. Blame game begins.

7. September 28th, Morning: I discover that I cannot hear audio on YouTube. While a small nuisance for others, it means the end of the world for me. Luckily, the electronics technician agrees.

8. September 28th, Afternoon and Evening, 5 hours in all: Flash Player uninstalled and installed 5 times. Laptop booted from external hard drive. Laptop booted from another laptop...I can hear folks on YouTube now.

Mac OS X reinstallation pondered. Just then, prodigy from the one-room College of Computing finds on the web what the snag could be. YouTube is broadcasting, but I cannot hear it because my audio output sample rate is set to something the broadcast cannot handle. I keep thinking how we are still in the age of radio. Geek on the Internet had the same problem, was advised by Megageek to simply launch GarageBand. That resets the audio output sample rate...uhh...identification system.
Probably the most bizarre solution to a bizarre problem that I have encountered. I console myself by remembering that there are things more bizarre in the world, such as people eating Alphonso Mango Juice mixed with Rice.

And then there was sound. And sound was good. Attempts to recover memory chip abandoned, cost procured from multimillion dollar drug windfall.

9. September 28th, Just before leaving for home: Mass email sent to everyone advising never to do business with AIS systems.

P.S. This is actually in no way a criticism of technology, only of the human beings who service it, and those who insist on using it.

Tuesday, September 26, 2006

MEDALS AND CHAMPAGNE FIZZ

It's that time of the year again, when champagne floods the otherwise noxious chemical shelves in the stockroom. Yes, the Nobel prize in chemistry will be announced on October 4, and many hopefuls will be dreaming about it, although it will probably go to someone who is not. Harvard usually stocks champagne in the stockroom "just in case", and not without realistic expectations this year, since three of its faculty have been slotted to win the prize for some time now.

So what do I predict about the whole chemistry prize festivities and deliberations? Here are my bets, placed while my computer calculation drones on, although I am not as sanguine about them as one may expect, because I don't win any prizes for predicting prizes.

1. Nanotechnology: George Whitesides, Harvard. Pioneer in everything nano, from enzymes to surface lithography, you name it. More than a thousand publications. Started out as a 'pure' chemist doing NMR spectroscopy at Caltech.

Also, J. Fraser Stoddart of UCLA, for his remarkable and dogged pursuit of nanomachines, including nanomotors and nanopropellers. Many have since jumped on this bandwagon, but no one has been as prolific as Stoddart. And it's not just about making fancy toys, but about using them to generate prototypes for some very novel chemistry. In the process, Stoddart has created a 'new' type of bond, the 'mechanical bond'.

2. Chemical Biology/Bioorganic/Bioinorganic chemistry: Stuart Schreiber, Harvard. Pioneered the field of 'chemical genetics'- controlling the workings of genes and proteins at a fundamental level in the body using 'small' organic molecules. Schreiber metamorphosed from a purely synthetic organic chemist into a force to reckon with in chemical biology. There is a racy description of him in The Billion Dollar Molecule.

Some other contenders: Peter Dervan of Caltech, who did magic with DNA as a chemical. Harry Gray of the same institution, who discovered untold riches in electron transfer in proteins. Stephen Lippard, MIT, who did groundbreaking work in understanding metal-mediated enzymatic reactions, especially the notorious methane monooxygenase, which converts methane to methanol. If we could do that on a large scale at room temperature, given the amount of methane around (a plentiful and potent greenhouse gas), what more could humankind want? World peace, perhaps.

3. Organic Synthesis: If there's one metal that has had singular success in aiding the synthesis of complex molecules, it's palladium, for all its toxicity in everyday life. Three palladium-catalysed reactions- Heck, Suzuki, Sonogashira (there's a fourth, except that its discoverer, Stille, is no more)- have become ubiquitous in the art and science of synthesis. If anybody should get a prize for metal-mediated reactions, it should be these three. I wouldn't be optimistic though, because it was just last year that a prize was given for a special kind of organic reaction, olefin metathesis, mediated by ruthenium and molybdenum catalysts.

4. X-ray crystallography/Biochemistry: One of the best ways to try to get a Nobel is to devote your life to solving the structure of some protein that is crucial to life; you may succeed, or you may spend your entire life at it and fail- that being the tradeoff. Many such Nobels have been awarded, including the latest in 2003 to Roderick MacKinnon, who after painstaking work resolved the structure and action of the potassium channel, a protein that is one of the fundamental workhorses of all living organisms, involved in ion conduction in the nervous system as well as elsewhere. I always get a kick out of such discoveries because, unlike even other Nobel discoveries, these involve truly looking into the heart of nature, as Watson and Crick did with DNA. If you are talking about fundamental science that lifts the veil from nature's face, this is as fundamental as you can get.

So my contenders for such work: the Indian-born 'Venki' Ramakrishnan and Ada Yonath, who cracked the structure of the ribosome, a machine of RNA and protein that is, if anything, even more fundamental than the potassium channel.

5. Computational organic chemistry: Of course my favourite, because of my own disposition. Kendall Houk of UCLA, who more than anyone else in the last twenty years has helped expand our knowledge of organic reactions and syntheses using computational insights.

6. Computer simulation of biomolecules: Again a personal favourite, and a big contender- Martin Karplus of Harvard, the last student of two-time laureate Linus Pauling. The scope and depth of Karplus's computational work in chemistry, physics, and biology are almost comparable to Pauling's, and especially his work in simulating proteins with computers has been pioneering in every sense. He has been a possibility for many years. I have written about his visit to Emory a few months ago here.

Another big fish from the same pond is David Baker of the University of Washington in Seattle, who has arguably come closer than anyone to one of the greatest unsolved problems in biology- the protein folding problem. I presented a paper by Baker last year, about a program called ROSETTA, which was used to predict the folding of a small protein from first principles, giving a result of extraordinary accuracy that agreed with experiment. It was the first time anyone had done something like that, and the work represented a triumph for computational scientists. The way in which Baker's programs capture the complex interactions in protein folding is diabolically clever. It may be a little early for Baker to get felicitated, though the earlier the BaTer of course (that was a bad one).
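
For the curious, here is a minimal sketch of the Metropolis Monte Carlo search that lies at the heart of such folding programs: propose a small random change to the conformation, always accept it if the energy drops, and occasionally accept it even if it rises, so that the search can climb out of local minima. The 'conformation' below is just a list of torsion-like angles and the energy function is invented for illustration- a cartoon of the principle, not ROSETTA's actual fragment-assembly algorithm.

```python
import math
import random

def energy(angles):
    # An invented rugged energy surface: many local minima from the cosine
    # term, plus a weak quadratic term that favours small angles.
    return sum(math.cos(3.0 * a) + 0.1 * a * a for a in angles)

def metropolis(n_angles=10, steps=20000, kT=0.5):
    # Start from a random "conformation" of torsion-like angles.
    angles = [random.uniform(-math.pi, math.pi) for _ in range(n_angles)]
    e = energy(angles)
    for _ in range(steps):
        trial = list(angles)
        trial[random.randrange(n_angles)] += random.gauss(0.0, 0.2)
        e_trial = energy(trial)
        # Metropolis criterion: downhill moves are always accepted; uphill
        # moves are accepted with Boltzmann probability, to escape traps.
        if e_trial <= e or random.random() < math.exp((e - e_trial) / kT):
            angles, e = trial, e_trial
    return e

print(metropolis())  # a low-energy "fold" found by the random search
```

Real folding codes replace my toy energy function with physics- and knowledge-based terms, but the accept-or-reject loop is the same basic idea.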

One thing I think I can say for certain; the prize will definitely be connected with either biology or materials science, the two most fertile scientific paradigms of the twenty-first century.

I believe that many of these scientists should get a Nobel prize at least sometime in their lifetime, an opinion echoed by many in the scientific community. But that only shows how hung up we all are on the Nobel. Of course, everyone who gets the Nobel is brilliant. The real problem is that there are many times that number of scientists who are of Nobel caliber who never get it. In the scientific community, their names are as well recognized, but in the public eye they somehow rank lower in the genius category. This is surely unfair, because a Nobel prize is, after all, a prize instituted by human beings, and it reflects personal preferences and tastes. It can be given to only three people at a time, and in the end the exclusion of the fourth person is often unfair. Such cases abound, probably the best known in public memory being that of Rosalind Franklin, although she unfortunately died before the relevant Nobels were awarded. The bottom line is that prestigious as the Nobels are, and exceptional as the laureates are, there are many more fine researchers around who will never get one, yet are on par with their Nobel colleagues in intellect and achievement. All of us would do well to remember this. In the end, it is a truism that getting a Nobel is as much a matter of timing and luck as of innate factors. That should make for a realistic appraisal of its prestige.

On a personal note, as you progress in your own field and career, it is a pleasure to see how you graduate from never having heard the names of that year's laureates, to having actually studied their research in classes and otherwise, to perhaps actually using their research in your own work. If I had been twenty-five in 1998 (ceteris paribus), I would have been pleased to be using some of the methods developed by computational chemists John Pople and Walter Kohn, who received that year's prize, and I did study the reaction for which chemists received the prize last year. That is a very satisfying feeling, the feeling of an aspiring artisan using the tools of an accomplished one.

Thursday, September 21, 2006

WATCHING GANDHI AGAIN

I watched Gandhi again yesterday after a long time, and realised a few things- or rather, things I already knew were underscored more strongly.

The British were gentlemen. I am not trying to downplay their brutalities, their cruel suppression of the 1857 revolt, or the inhuman massacre at Jallianwala Bagh. Not at all. But it's still true that, on a relative basis, we perhaps got the most benign ruler. If we had been ruled by the French, Japanese, Portuguese, or Spanish, I shudder to think how it would have been, and Gandhi may not have lived to fight. As it was, the British were reasonably courteous to Gandhi; they made sure he was not treated too harshly in prison (his spare frame and frail body helped here), and they used armed escorts to take him to prison (I cannot believe the escorts were employed to stop him from escaping). Perhaps they were not doing this out of genuine respect, but I believe they did it out of a minimum of respect.

The fact is that non-violence places an enormous moral burden on the perpetrator, because he knows he is suppressing and inflicting injustice on an individual who is doing absolutely nothing to stop him. It suddenly deprives him of his usual excuse- that his adversary is retaliating against him. He is in a position where he cannot justify his reprehensible actions on any grounds. Sooner or later, every perpetrator has to crack under such a tremendous moral burden. The British, in my opinion, were the sort who would give in to it sooner or later. And perhaps Gandhi was astute enough to realise this fact about them.

I always like to believe that the reason for this minimum moral sentiment of the British was democracy. Although they often flouted this ideal abroad under a hypocritical guise, the fact remains that it was they who gave democracy to the world, who tolerated, if not encouraged, dissent. From one side they were facing this potent weapon of non-violence wielded by us, and on the other side they had a lofty tradition of democracy, independence, and free speech behind them. In the British parliament there were Lords, after all, who complained about and chastised British actions in India, if only out of political ambition. The combination must have been a thorny source of discomfort to The Crown. The great advantage of democracy is that because it allows dissent, policy makers are always answerable to counterpoints, and their conscience is therefore constantly challenged by the viewpoints of others. It surely must be harder in such a situation to hold tenaciously to your own viewpoint, which many already regard as morally reprehensible. For this reason, I believe we were relatively lucky that it was the British who were our rulers.

One quality of Gandhi that caught my attention is that in spite of having such profound will, ideals, and conviction, he was not brooding, distant, or utterly philosophical. By many accounts he was an extremely simple person, and quite a playful one, ready to indulge in jokes and self-effacing comments. He could communicate as an ordinary person with the simplest of people, the poor and the young. He believed deeply in God, yet was not overly religious in his daily life, principles, and talk. In his heart he really seems to have been a peace-loving person of the highest morals. I always believe that Gandhi did appreciate the good things about Britain- the language, culture, and history, the laws (not necessarily as applied to India), and other things typically English; after all, he was an English-educated lawyer. One thing he said struck me as sincere, and I believe he meant it: "When the British leave our country, we want them to leave us as friends." But he knew all too well that when friends stay for a disproportionately long time, their presence becomes a nuisance.

Tuesday, September 19, 2006

TUESDAY MORNING 1$ BOOK WINDFALL

If I saw a sign saying "Book sale" and then saw an open petri dish filled with sarin kept next to it, I think I would still make a dash for the sale. Even if my mind pleaded my low bank balance, my legs would probably still drift automatically towards the event. And so it was today (sans the sarin, for now), in spite of the fact that this sale was of surplus books from the Theology library.

Lateral thinker Edward de Bono: Teaching Thinking, The Mechanism of Mind, Practical Thinking. I still cannot top my father, who has about 90% of de Bono's published titles.

Chronicle of the 20th Century: a >1000-page extravaganza.

Thank you.

Monday, September 18, 2006

LESSONS FROM RESEARCH 1

As they say about fine wines, you can never know what drinking them is like until you actually take a sip. I may know all their chemistry, have a smokin' knowledge of the detailed geography of all the wines in the world, and read and assimilate a hundred texts on wine composition and flavour. But I, the non-winer, still won't know what it is that inspired 'in vino veritas' until I let the spirits flow down my throat and prance around in my bloodstream.

And so it is for research. All the books you can read about Newton, Darwin, telescopes, DNA, and nuclear fission won't prepare you for research. I am not implying that research is something for the chosen few, only that, as with other endeavors, you won't really know which details matter until you do it. For some reason, none of the books I read about science told me about the gory details involved- for good reason, because that may have scared the wits out of my already feeble mind. But now that I am here, I have learnt a few things they did not mention in the books. They matter precisely because they are so trivial that they actually turn out to be important. So here are a few humble words from an almost broken spirit (not the liquid one, again).

1. It's the simple things that matter:
Of course we knew this. It's so simple, isn't it? That's precisely why it is forgotten. More than once I have had the entire plan for a project or calculation well organised in my mind, ticked off all the assumptions, requirements, and protocols in my head, and begun the calculation or experiment, only to realise that I have forgotten the simplest possible thing- to note down a value, to check or uncheck some box, or that I have used the wrong unit in adding two numbers. While that makes me want to tear my hair out, it's also a sober reminder that there is great beauty in simplicity, as Einstein said- or in this case, great frustration. One small guard against the unit blunder is sketched below.
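
Since then, one habit has saved me from the wrong-unit blunder: let the computer carry the units. Here is a minimal sketch using the third-party pint library (assuming it is installed); the quantities themselves are invented for illustration.

```python
import pint  # third-party unit-tracking library

ureg = pint.UnitRegistry()

# Two energies from different sources, quoted in different units.
barrier = 12.5 * ureg.kilojoule / ureg.mol
correction = 1.2 * ureg.kilocalorie / ureg.mol

# pint converts the units before adding, so the classic blunder is impossible.
total = barrier + correction
print(total.to(ureg.kilojoule / ureg.mol))  # ~17.5 kilojoule / mole

# Adding dimensionally incompatible quantities raises a DimensionalityError
# instead of silently producing garbage:
# nonsense = barrier + 300.0 * ureg.kelvin  # would raise
```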

2. Physical and mental presence in the lab is not an option:
The bad thing about drudgery is that it's boring. The good thing is that you can do it mechanically, like a robot, without really thinking about it. The bad thing about the good thing is that you may end up doing the wrong thing. Case in point: I was delighted that my results for one molecule looked exactly the same as those for another molecule, a desired objective. But then I began to wonder... the results look exactly the same. No wonder! I was actually looking at the same molecule in both cases. Oranges will certainly turn out to be apples if what you are looking at are apples in the first place. Another three hours consigned to glare from the monitor. (A cheap guard against this particular blunder is sketched below.) Sometimes, with enough practice, you can actually get away with doing tedious, routine work in a zombie-like manner. But it's hard work becoming a zombie. A well known corollary to this principle: always ntuple (the nth power of double) check your results if they look too good to be true. I don't want to become another Bengu Sezen, albeit a dramatically less attractive non-Middle Eastern one.
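
A two-line guard has since spared me that particular zombie error: before comparing two result files, check that they are not literally the same bytes. The filenames below are hypothetical.

```python
import hashlib
from pathlib import Path

def digest(path):
    # Hash the file contents so byte-identical inputs are caught immediately.
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

file_a = "molecule_A_results.out"  # hypothetical output files
file_b = "molecule_B_results.out"

if digest(file_a) == digest(file_b):
    raise SystemExit("The two 'different' result files are byte-identical: "
                     "you are probably looking at the same molecule twice.")
```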

3. Write it down:
Another simple piece of advice, often ignored because of the lag between writing speed and thinking speed. But it actually turns out to be refreshingly easy to simply state in plain English what you have found (or not found). I could summarize the last two months' work in two pages today (and no, I won't answer your question about how much work I actually accomplished in two months). I am notoriously prone to not documenting my work regularly and accurately. But now I have found that not only will it save me a nightmare later, it's actually not too difficult. The language in such summaries need not be technical, although it helps to formulate accurate statements; that gives you practice in writing scientific papers.

4. Common sense really is uncommon. No, really. No, really:
And somehow, I always find myself on the receiving end of this statement. More often than not, such a lapse in common sense happens simply because you (actually, I) are too eager and enthusiastic to obtain results, run your protocol, or make your point. The remedy is, yet again, simple: surveillance should precede saltation (or, Look Before You Leap).

More to come soon, since I will keep on making mistakes.

Saturday, September 16, 2006

COMIC SANS TOO COMIC FOR SCIENCE


I won't go so far as to punch any bunnies, but yes, I agree completely that Comic Sans looks too childish for scientific presentations. Yet for some inexplicable reason, one of my advisors insists on using it in all its glory- in titles, bodies of text, and footnotes. As a result, all his slides, though quite informative and entertaining, end up looking like

"Free Food! Dusty's Barbecue!!!
Peach Cobbler and blueberry pie!!
Graduate Picnic on Chandler Chapel Lawn today at 4.00 p.m.!! Come one, come all!!"

As a side note, the gratuitous excess of exclamation marks in such announcements makes me feel either that they are desperate for us to come and eat their food (otherwise they may soon have the USDA on their heels), or that they think there's no other way to rouse us from our grad-school slumber except by slapping us in the face with exclamation marks. The second reason turns out to be the right one, although given the quality of food at most of these events, the first may not be completely unwarranted.

But back to the point. I have now completely renounced the use of this comic font. I stick with Arial, Times New Roman, Verdana, or, as I have most recently discovered, the elegant Gill Sans.

Friday, September 15, 2006

TOO STIRRED AND SHAKEN

I am a big fan of the James Bond movies, and have seen almost all of them at least ten times each, with the exception of 'On Her Majesty's Secret Service'. In fact, when I was in school, my cousin, who had all the classics taped at home, and I were so obsessed with Bond movies that we used to spend a lot of time enacting the end scenes played out between Bond and the villain (e.g. "Your time is running out, Stromberg." "Yours too, Mr. Bond. Yours too... and faster than you think..." Yes, I know we had no life.)

In any case, I have always thought that after Sean Connery, Pierce Brosnan is the best James Bond (an opinion seconded by Roger Ebert). Many of my friends disagree with me and place him third, after Connery and Roger Moore. Now I realise why I hold that opinion. James Bond, along with his suave charm and the usual personality paraphernalia, needs to have a special kind of detached look on his face when he either kills someone or escapes the usual unbelievably intractable situation with his life by a hair's breadth. This look is a combination of puzzlement, self-assurance, and detachment. But it is not cruel; instead, it is a look of utterly innocent detachment, as if Bond himself is pretending that nothing happened. And all these subtle nuances have to be combined in the right doses.

Moore is a fine actor and was a great James Bond, but the problem, in my opinion, was that his face was too fluid, and at times displayed more surprise than it should have. He did a fairly good job, but he often did not get the doses quite right. He could not appear indifferent enough without appearing cruel.
Connery, I realise, was the unbeatable master of this innocent detached look (and he retains the ability for it in his 70s). After him, Brosnan does it the most competently. Given Brosnan's lifelong fascination with James Bond, starting from his childhood, it's not surprising that he has studied every small aspect of the role and the character. So for me, after Connery, Brosnan will always be the best James Bond.
Shaken, but not stirred.

Thursday, September 14, 2006

KILLING THE HYDRA

As an addendum to the previous post, I want to note that the 'hockey stick' graph by Michael Mann and others which was so much in the spotlight recently has been endorsed in its general features by many bodies, including the National Academy of Sciences. (Link: Climateaudit)


Once, somebody asked me a curious question: Mann's graph shows that the 20th century's temperature anomaly is the highest of the last 1000 years. What if we are looking at a cycle that repeats itself every, say, 2000 years? Wouldn't the anomaly we see then be only part of a cycle?

I guess this is an objection that many people have about global warming: what if it is only a cycle? To my knowledge, the answer to this objection is now clear; computer models can reproduce the temperature variation that would have existed had mankind and its greenhouse gases not been around, and this natural variation is far lower than what we observe. Also, there is the simple and telling fact that the temperature rise tracks the rise of CO2. And it is a plain law of nature that CO2 absorbs certain wavelengths of infrared light. Taken together, these three facts constitute as good a chain of reasoning as any for me. The naysayer's objection is absurd for another reason: if the data had been collected for 10,000 years, he could still have claimed that it was not collected for 20,000, and renewed his objection. I am not sure it makes sense to play these childish games till the world comes to an end. As I have noted before, do we really want to be one hundred percent certain about an event that could well mean the end of humanity? Maybe then we should also stop vaccinating ourselves.
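
That chain of reasoning can even be made semi-quantitative on the back of an envelope. Here is a sketch of the textbook zero-dimensional energy-balance 'toy climate': sunlight in, grey-body infrared out, with a one-layer atmosphere of infrared emissivity eps standing in for greenhouse gases. The constants are standard values; this is a pedagogical cartoon, not a climate model.

```python
# Zero-dimensional energy balance: absorbed sunlight equals emitted infrared.
# A one-layer grey atmosphere with emissivity eps re-radiates half of the
# infrared it absorbs back down to the surface, warming it.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALBEDO = 0.30      # planetary albedo

def surface_temperature(eps):
    absorbed = S0 * (1.0 - ALBEDO) / 4.0            # averaged over the sphere
    return (absorbed / (SIGMA * (1.0 - eps / 2.0))) ** 0.25

print(surface_temperature(0.0))   # ~255 K: no greenhouse at all
print(surface_temperature(0.78))  # ~288 K: roughly today's mean surface temperature
```

Crank eps up a little, as adding CO2 effectively does, and the surface temperature climbs; that is the greenhouse effect in one line of algebra.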

Frankly, given our nature, I don't think any amount of moral reasoning, no matter how true, is going to sway public opinion soon. People don't even stop smoking cigarettes when they know cigarettes can kill them, so it may be naive to expect them to suddenly care about global warming, a phenomenon that probably won't directly affect them in their own lifetime. No matter how lofty the moral pillars of reasoning seem, the one thing that can finally force people to pay attention is the mundane allure of economic incentives. Consider this: from this semester onwards, almost nobody from my lab is going to drive to work; they are all going to take one of the three new shuttles that Emory University has started. I don't believe for a moment that they are doing this for the environment. The simple reason is that Emory is also going to double the annual parking fee to 700$ a year. If you can't show them the tree, just don't make it free. Do this more often- create new bus services, make wireless internet available on the shuttles, make parking fees prohibitive- and people will obediently avail of the service. It's surprising how the simplest of daily incentives can change people's minds about the most profound objectives. But evidently, things like public transportation are not that simple, because they are not being implemented.

The Kyoto Protocol is riddled with blame games, with the US saying they won't sign until India and China do, and India and China saying they won't sign until the US does. China is second only to the US in greenhouse emissions, and if anything, it's going to spew many times more into the atmosphere in the near future. (Link: BBC)


I for one think this is another childish game that can be played till eternity. Even though the arguments are valid, somebody has to get out of the loop. After all, deep down, does it really matter that the US has increased emissions over the last few decades? It's we who have to suffer the consequences to our environment if we don't cut them down. It is true that, as of now, we cannot achieve a high standard of living without conventional energy sources. But at some point in the future we are going to run out of oil anyway. How does it harm us to start early and found our new standard of living on unconventional energy sources, an effort that could actually turn out to be profitable in the event of a likely oil crisis? At least we have the nuclear deal with the US. Let's avail of it. On the part of the US, I think that if they want us to sign, they should also be ready to sell us some of their already developed research into unconventional sources of energy at low cost, and save us the expense of doing that research from scratch. But finally, they must also be prepared to give their citizens incentives to cut down on their standard of living, a standard that neither they nor the world can realistically sustain in the near future. I agree that governments are to blame, but ordinary Americans need to pitch in too, and they are going to do it only if they are given incentives of the kind mentioned above regarding public transportation. As Al Gore says, we have the technology to stop global warming; in a different sense, I think that statement can also refer to the technology needed to give people incentives to combat global warming.

The media, as always, has a very important role to play in this situation. As noted below, the media has many times embellished global warming research by spuriously connecting it with specific weather events. However, much-needed public awareness did come out of this misused connection. Now the media needs to highlight the general effects of global warming that are becoming so certain. For example, climate experts predict that the intensity, not the number, of hurricanes will increase, and this has been borne out. A good connection has also been established between mean sea-surface temperature and hurricane intensity. The media needs to highlight such facts, which have been extensively investigated using sound science. In the US, the media has a considerable hold on the public psyche. For once, they should exploit this hold for a good purpose.

Combating ignorance and galvanizing official policy and public opinion about global warming is like fighting the Hydra: when one of its heads is cut off, another grows in its place. But the truth remains that Hercules did kill the Hydra and win the battle; so we too should know that we can, and fight it with all honesty and sincerity.

THE DISCOVERY OF GLOBAL WARMING
The Discovery of Global Warming- Spencer Weart

Any new scientific theory comes into the world kicking and fighting for breath. That's because scientists are inherently skeptical and, in the opinion of one of my colleagues, also inherently mean. Whenever a revolutionary new fact is presented to them, their first reaction is one of incredulity, partly because skepticism is a reflex action for them, and partly because another reflex causes them to be galled that they weren't the ones who came up with the new idea.
If 'pure' scientific ideas themselves have so much trouble coming up for air, what would the scenario be for a revolutionary new idea that also has a gory heap of political controversy written all over it? Messy, to say the least. And so it is for the idea of global warming.

Spencer Weart has penned a lively, informative, and concise history of the discovery of global warming, one that demonstrates precisely how difficult it is for such an idea to take root in the public mind and affect public policy. What is more fascinating is how research in climate change was spurred on by unseemly government and military interests, and by misunderstood media coverage and inquiry. Weart starts with stalwarts from different fields in the nineteenth and early twentieth centuries, and how they were intrigued by a fascinating phenomenon- the ice ages- which served as the driving force for suspecting the role of greenhouse gases in changing the temperature of the planet. If one singular fact emerges from the history of global warming, it is the public's extreme skepticism that puny humankind could change the mighty earth's enormous environs, and scientists' reluctance to accept that small changes, caused by humans or natural forces, could trigger violent climate change ('The Day After Tomorrow' notwithstanding).

The discovery of global warming was a painful endeavor, often occupying entire scientific lifetimes. Almost everyone who wondered about it faced opposition in terms of opinion and funding. Almost no one could prove global warming alone, without extensive collaboration; not surprising, given the interdisciplinary nature of climate. Scientists had to grudgingly forge alliances with other scientists whose fields they would hardly have considered respectable. They had to beseech the government for funding and support. One of the most interesting facts is that government funding of climate studies in the 50s and 60s was fuelled entirely by military purposes related to the Cold War. More than anyone else, defense forces were interested in controlling the weather for military purposes, and they couldn't have cared less about global warming. But this was one of those fortuitous times in history when a misguided venture proved beneficial for humanity. Just as building the atomic bomb produced a bonus of insights into the behaviour of matter as a side effect, so the military's interest in the weather, absurd as it was in many ways, proved to be a godsend for scientists hungry for funding and facilities. Weart makes it quite clear how scientists found an unexpected asset in the military's interest in climate. Secretly, they must have laughed in the faces of paranoid cold warriors. Publicly, they appeared most grateful- and in fact were, for the funding they got.

If the military unknowingly contributed to our knowledge of climate change by supporting dubious studies in the field, the media contributed by miscommunicating the facts on many occasions. During the first few years, the general public was neither concerned about nor believing in climate change- again, because they could not accept that a puny entity such as mankind could disturb the grand equilibrium of nature. But as events such as hurricanes, floods, and droughts began to be linked with climate change in the 70s, the media began to pay more attention to scientific studies, and began to exaggerate the connection between man's contribution to the environment and violent weather phenomena. As with the military's venture, even though this one was largely misguided (even today, we cannot pin specific events on global warming), the unexpected effect of the media's spin doctoring was that people began to believe that man could change the climate. Of course, the media was also not afraid to point out, and again exaggerate, when scientists' predictions and explanations failed. But for better or worse, people for the first time in history began to take serious notice of global warming and mankind's contribution to it. In the 1960s, Rachel Carson's 'Silent Spring' provided yet another impetus for the public to consider the general relationship between technology and its effects on the environment.

And yet, as Weart narrates, the road was tortuous. At every stage, speculative as the scientists' predictions were, they were opposed and overwhelmed by powerful lobbyists who had influence in Congress and much more money with which to thwart their opponents' efforts. Whenever a new study linked greenhouse gases with warming, industrial lobbyists would launch massive campaigns to rebut the scientists and reinforce public faith in the propriety of what they were doing. As Joel Bakan says in The Corporation, one of the main methods corporations use to maximize profits is to 'externalize' costs. Suddenly becoming responsible for environmental pollution, previously externalized, would put their profit-making dreams in jeopardy. Until the 80s, scientists could not do much: firstly, there was not enough evidence for global warming, and secondly, computer models were not powerful and reliable enough to help them make their case. Matters were made worse by the Reagan administration, which had one of the worst track records in history when it comes to environmental legislation. So, unfortunately for scientists, just when their efforts and computer models were gaining credence, they faced a looming pall of government and corporate opposition, against which their fight was feeble.

The scientists who researched climate change were, and are, an exemplary lot. They built computer models, wrote reams of code, and ran simulations for weeks and months. They went to the coldest parts of Antarctica and the deepest parts of the ocean to gather samples- climate 'proxies' such as pollen, ice cores, and tree rings- that record past ages which thermometers never measured. They spent lifetimes searching for the contribution of mankind's actions to climate change, even though they knew their results could disprove their convictions. As far as dedication to science and policy is concerned, you could not wish for a more committed lot of investigators.

Slowly, in the face of opposition, predictions began to become more credible, and enough data accumulated to allow reasonable analyses and predictions. The case for global warming was clinched in the late 90s, but the turning point came in the late 80s. During those few years, droughts and rain deficits around the US again brought media attention to climate change. Computer models became much more reliable. When Mount Pinatubo exploded in 1991, computer models accurately predicted the temperature drop caused by the accumulation of sulfate particles in the atmosphere (a drop later more than compensated by the rise in greenhouse gases). Scientists began to appear before Congress to testify. The Intergovernmental Panel on Climate Change (IPCC) was formed to produce authoritative reports on climate change and the 'anthropogenic' contribution to it. The evidence became too widespread to mock or reject outright; global warming had to be given at least serious consideration. However, because of the uncertainties inherent in predicting something as complex as climate, government officials could always cherry-pick and convince the public of the speculative nature of the whole framework. Here they were making a fundamental mistake, of the kind that opponents of evolution make: just because a theory has uncertainties does not mean it is completely wrong, as these officials would have the public believe. Of course nothing is certain. But in the case of global warming, enough data had accumulated by the 90s to make one thing absolutely clear at the minimum: that we were altering the climate of the earth in unpredictable ways. Studies of past climates had also reinforced the conclusion (with some startling impetus from chaos theory) that very small perturbations in the earth's climate and ocean systems can have huge effects on the climate (the so-called 'butterfly effect'). Man's contributions to the earth's environment are now eminently more than a 'small perturbation'.
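
The 'butterfly effect' itself takes barely a dozen lines to demonstrate. Here is a sketch of Lorenz's 1963 toy convection system, integrated with a crude Euler step (fine for illustration): two trajectories that start one part in a billion apart end up bearing no resemblance to each other.

```python
def lorenz_step(state, dt=0.001, s=10.0, r=28.0, b=8.0 / 3.0):
    # One Euler step of the Lorenz (1963) equations.
    x, y, z = state
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

a = (1.0, 1.0, 1.0)
b_ = (1.0, 1.0, 1.0 + 1e-9)  # perturbed by one part in a billion

for _ in range(50000):       # integrate for 50 time units
    a, b_ = lorenz_step(a), lorenz_step(b_)

print(a)   # the two final states are completely different,
print(b_)  # even though they started nearly identical
```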

However, when it comes to the fickle palette of politics, every colour can be shaded to suit one's interests. There was, and will always be, great hope from the fact that the opposition to CFCs worked and all nations successfully signed the Montreal Protocol. But in 1997 the US Senate preemptively rejected the Kyoto Protocol, in spite of Clinton and Gore (naturally) supporting it. After this, it was but a formality for George W. Bush to bury the treaty by refusing to back Kyoto in 2001, citing the grave economic damage it would bring.

Today, there is no doubt that global warming is real. It has been endorsed by every major scientific body in the world. Its effects are many, and each one of them is devastating. Enough data has now accumulated to reinforce the relation between greenhouse gases and global warming. Individual details do remain ambiguous in certain respects, but they will be quantified soon. And as I noted in this post, does it matter that we don't know everything with one hundred percent certainty? The repercussions of global warming are the biggest mankind will ever face, and even a 30% certainty about them should be enough for us to make serious efforts to stop it. In my opinion, the unfortunate thing about global warming is that it is a relatively slow killer. And because individual events cannot be attributed to it, people are not going to be flustered even by Hurricane Katrina and think it was caused by global warming. They will just consider it an unfortunate incident and move on. If they knew for sure that Katrina was caused by global warming, they would be lined up on the steps of the Capitol in Washington. But what they want is certainty. Strange that they don't seem to want it when it comes to terrorist attacks.

Weart's book is not an eloquent appeal to stop global warming. But that is what makes it striking: the facts, as revealed by the dispassionate hand of science, make the phenomenon clear. That restraint, however, is probably also the only problem I would find with the book. Weart is a good writer, but not a particularly poetic or eloquent one; I believe he could have made the book much more sobering and dramatic. He essentially weaves a history in the true sense of the word, even if he falls short of making it read like a novel. The human drama is there, but kept to a minimum. He writes like a true scientist, making the facts matter. The science on global warming is now sound. What is not is human nature.

I cannot help putting in this cartoon again


Note: Weart has an informative site based on his book, which contains many updated essays not in the book. Worth a look.

...And since we are on the subject of cartoons, here's another side-splitting one.


Tuesday, September 12, 2006

Sudo Buy This T-Shirt

I just learned about the sudo command a few months ago, but those who know Linux and do sysadmin work will appreciate this. It's priceless.


This and more T-shirts

Thursday, September 07, 2006

OBESITY AND ANTI-OBESITY DRUGS

Derek Lowe analyses how anti-obesity research is still not considered in the same high-profile league as cancer, Alzheimer's, and other life-threatening conditions. To some extent, I feel that obesity is like global warming: many of us accept it as a part of life, for ourselves and for our friends and family. It's not going to matter until it hits home hard and frequently. Even then it may be difficult to see, because unlike cancer, it is hard to say that someone (except in the most pathological case) died exclusively of obesity.

And while anti-obesity drugs could definitely help, I foresee some juicy economic controversies if people start using them regularly. The discovery of drugs that reduce appetite might spark another conflict with corporate America. One of the big and very unnatural achievements of the era of processed food is that it turned the demand for food into an elastic one. Through diabolically clever advertising, the previously held notion that people don't necessarily eat more if you produce more was rendered obsolete. Many of us, if not most, give in to advertising and often end up eating more than we actually need. Michael Pollan's 'The Omnivore's Dilemma' makes this quite clear. If drugs that really reduce appetite became widespread (and started to be used by parents on their small children, for example), it would mean untold losses for McDonald's, Hershey's, and KFC, to name a few conglomerates, all of whom want us to eat as much as we can. Since the primary aim of both the food industry and the pharmaceutical industry is to make money, it should be very interesting to see how this conflict turns out.

SILLY POINT AT 12 O' CLOCK

I have posted very few photos of myself on my blog, and have taken care to post only the respectable ones. The insidious Patrix seems to have known this, and so now, very cruelly, he has tossed a silly tag at me. As much as I am embarrassed to do this, the Blogocratic oath compels me to get it done and put the painful memory behind me. Relief comes from the compensatory fact that I have at least been captured doing something I like, viz. playing my keyboard.
So here goes. I will not forget this, Patrix boy. Lightning will strike, the seven seas will part, the sky will be aflame with crimson... and... and... forget it. How much weight can these lofty words bear, coming as they do from someone looking so silly :-(



And now for the dubious honour of passing it on. I tag Hirak, Sumedha, Madhura, Prakash, Chetan, and Sumeet- mostly to prod them to post, but also to find out if they are as shy as I think they are. ;)

Wednesday, September 06, 2006

MOVIES, ORWELL, AND MAHATMAS

Lage Raho Munnabhai is proof that Bollywood directors can make movies with typical Hindi storylines and far-fetched visuals and themes that are still intelligent and carry important messages. Munnabhai may be the most successful such venture in many years.

I truly enjoyed the movie. The comedy is well-crafted and never cheap. The performances are all respectable; Arshad Warsi and Sanjay Dutt have both done a good job, and Dilip Prabhavalkar is very believable as Gandhi.
If movies like Munnabhai bring about a resurgence of interest in positive Gandhian values, if not the values themselves, we will have a new phenomenon. This paradigm seems to be emerging in Bollywood recently, but I have seldom seen genuine moral values and good philosophy included in a typical Hindi movie- one that first makes the audience laugh, but then also makes them scratch their heads. Munnabhai seems a good augury for such future attempts. It strikes a remarkable balance between vacuous masala movies and movies with genuine messages. The great thing about it is that it does not preach this philosophy, but lets it emerge as part of the situation and the comedy.

Many of Gandhi's ideals and ideas were deep and profound. Whether they would work in each and every situation is a different question, but I have no doubt that they were largely responsible (along with other important factors) for getting us independence. I also always think that non-violence as a general framework is the best possible one for world peace and progress. For me, Gandhi's greatest achievement probably was in realising that armed aggression would not work in the India of those times. That also shows what an insightful politician the man was.

It was a coincidence that I happened to come across George Orwell's very readable essay on Gandhi. Orwell wrote it in 1949, and almost all of it is still valid. What is nice about the essay is that Orwell realises and appreciates Gandhi's essential qualities even though he did not instinctively like Gandhi much. Orwell's honesty is to be appreciated, and the essay is a fine example of the word 'objective'. He does raise questions about Gandhi's philosophy, and does not fail to criticise it- rightly, in my opinion. When Gandhi was asked what the Jews of Hitler's Third Reich should have done, his reply was that they should have committed mass suicide to alert the world to their fate. The implicit assumption in this statement is one gleaned from hindsight- that they were all murdered anyway. Any sane person listening to this opinion could be forgiven for thinking it the rant of a madman. But Orwell's contention is not that it was correct; it is that it was honest. Gandhi's ideals as a universal framework may not be accepted by all, but there is no doubt that he stuck to the truth as much as anyone ever has.

I wholeheartedly agree that it takes great courage to speak the truth, to restrain oneself rather than express one's anger, to exercise so much tolerance that your adversary's reserves of anger are exhausted. But the logical question is: are these actions an end in themselves, or should they achieve some end? The problem Gandhi's critics have pointed out is that these actions have no utility if they do not achieve the short-term benefit expected of them, and he seemed to regard them as ends in themselves. In the case of India, these actions were aimed at an immense, long-term objective- persuading the British to let go of their crown jewel- and they succeeded spectacularly. But they may remain mere ideals when a quick result is desired, with no patience for sacrifice and gradual change. Yet that is precisely the quality that makes them sane for an ultimate plan of peace for the world. World peace is probably the longest-term ideal we can aspire to. If non-violence is the ideal philosophy for this ideal aim, then Gandhi saw further and deeper into the future of humanity than anyone else.

I also find it remarkable that although Gandhi believed in God, his philosophy is refreshingly free, at least to a large extent, of religious underpinnings. That makes it more universal than any religious parable could become. The other truly commendable thing is that even the British, who were not his friends, never questioned his integrity and other virtues, which could easily have been brought under suspicion. As Orwell says, nobody accused him of being ambitious in a vulgar way, or of being driven by fear or malice, or of being corrupt. The fact that the British could not accuse him of such things, even when they could have done so with the right propaganda, definitely says something about his inner self.

About Gandhi's adherence to the ascetic ideas of non-attachment, Orwell has a memorable paragraph:
"Many people genuinely do not wish to be saints, and it is probable that some who achieve or aspire to sainthood have never felt much temptation to be human beings. If one could follow it to its psychological roots, one would, I believe, find that the main motive for “non-attachment” is a desire to escape from the pain of living, and above all from love, which, sexual or non-sexual, is hard work. But it is not necessary here to argue whether the other-worldly or the humanistic ideal is “higher”. The point is that they are incompatible. One must choose between God and Man, and all “radicals” and “progressives”, from the mildest Liberal to the most extreme Anarchist, have in effect chosen Man."

As for whether he was a great man, I agree with Orwell that the very fact that his role in India's independence movement is still fiercely debated indicates his stature. His influence is undeniable. The social activist Martin Luther King and the scientist Linus Pauling are two emblematic and diverse symbols of the reach of his ideas. Non-violence was not an option for the British, whose country was being ravaged by the Blitz in World War II. But non-violence may be the only long-term option for a world whose character will be ravaged by untold miseries yet to descend upon it.

In any case, the fact that a movie named Lage Raho Munnabhai led me to this digression on non-violence and values means that you should watch it whenever you get a chance.

Friday, September 01, 2006

KHAN'S NONSENSE CONTINUES

Salman Khan's arrogance and breathtaking inanity continue. In an interview with the BBC, he says:
"They wanted to set an example out of me... Who knew the black buck? I mean today because of me, people know there's an endangered species of deer called black buck, well it's actually an antelope."

Yes. Right. Shooting endangered species is the best way to bring them to the public's attention. What a great public service this man has done. National Geographic and the Discovery Channel should have taken a lesson from him, and instead of painstakingly making documentary films, should just have gone on an animal-killing spree to make people aware of them.

In any case, I am sure many people knew about the black buck; I myself have seen it dozens of times on television. Also, he still does not realise that the issue is not just the killing of the black buck, but the blithe attitude of blatantly trampling on law and nature. So the judge was absolutely right to want to make an example of him.

So no, Mr. Khan, we were not ignorant of the black buck. But you were, and still are, ignorant of kindness and responsibility.