Friday, April 27, 2007


Some bugs never seem to die, especially those of the Presidential type. The missile defense bug is surely one of these. It has bitten almost every President from JFK to GWB, sapped billions of taxpayer dollars from the nation's coffers, and regularly survived the attempts of dozens of eminent experts to expose it as futile and dangerous.

It started in 1957, when Sputnik blazed across the sky. For the next thirty years or so, US Presidents projected a false 'missile gap' to the nation, and devoted manpower and an immense amount of money to building bigger and better missiles that could carry thermonuclear warheads across continents and initiate global nuclear conflict. This was in spite of the fact that the perceived missile gap never actually existed during the middle years of the Cold War. By banning the development of ballistic missiles as well as nuclear weapons, the United States could have retained a clear advantage over its opponents. But bugs, as we know, can be all-pervasive and recalcitrant. In the 60s, the Cold War reached new heights with the Cuban Missile Crisis and the testing of many new missiles. Mercifully, nuclear tests in the atmosphere, underwater, and in space were prohibited by the landmark 1963 Partial Test Ban Treaty.

The Chinese had been engaged in nuclear weapons research since the 1950s, and Sino-American hostilities had already come to a head during the Korean War. In 1964, China detonated its first atomic bomb. After the Soviet Union, it was seen as the biggest threat to the US. Sometime in the 1960s, the bug caught the imagination of Washington, and plans were made to deploy a huge anti-ballistic missile (ABM) system that could deflect a potential Chinese nuclear attack. The plans were first drawn up in secret, and then, even before China really had the requisite technology to mount such attacks, were heavily publicised by Defense Secretary Robert McNamara, thus giving the Chinese carte blanche to go ahead.

Missile defense can be a tricky concept. On the face of it, it seems to guarantee a nation's safety from a first nuclear strike. But this sense of safety is misleading, for several reasons. First, the technical ones: it was shown repeatedly, most strikingly in a 1968 Scientific American article by the great physicist Hans Bethe and Richard Garwin, that almost any countermeasure the US could take against such an attack could be defeated by 'counter-countermeasures' from the enemy. These could include any number of decoys, from aluminium foil to fake explosions and warheads, to mislead defensive missiles. On a very local scale, missile defense could be partially successful, but the authors showed that the marginal expense of defending against missile attacks was much greater for the US than the cost of mounting them was for the enemy. In other words, it was relatively easier for the enemy to thwart defensive missiles than for the US to thwart offensive ones.
The more important problem with missile defense is political. By deploying such a defense, the US sends a signal to other countries that, since it is now securely defended, it may not have a problem launching a first nuclear strike itself. The result of such an impression is predictable: the enemy would pour even more resources into developing more and better missiles and weapons to penetrate the system. If there is a good way to initiate another nuclear arms race, this one would be close to the top of the list. For example, in 1968 one could not have blamed the Chinese for accelerating their own missile development after hearing of US plans to develop such a defense system. In fact, it would have been a convenient excuse for them.

In any case, the 1960s system fortunately did not work out, but not before millions of dollars had been spent on it. A respite came in 1972, when Richard Nixon signed the Anti-Ballistic Missile Treaty, which sharply limited the defensive missile systems each side could deploy. But like other treaties, this one also contained a slippery slope: the arms-control agreements of the era said nothing about developing more sophisticated missiles, only about deploying them, and nothing about the number of nuclear warheads that a single missile could carry. This loophole (probably intentionally left in) led to one of the most dangerous developments of the Cold War: Multiple Independently targetable Re-entry Vehicles (MIRVs), in which a single missile such as the Minuteman could carry up to 10 independently targetable warheads. This probably was not much better than the situation before the treaty.

But the icing on the cake was applied by the charismatic Ronald Reagan, with his espousal of a truly ridiculous, unfeasible, and provocative system: the famous "Star Wars", or Strategic Defense Initiative. Star Wars envisioned hundreds of missiles and weapons based in space. It was rescued from an early demise by a new invention, the X-ray laser, which could supposedly shoot its gigawatt beam across miles of space and blow missiles and thermonuclear warheads into oblivion.
However, after Reagan enthusiastically took up the gospel, it became apparent that the project had been oversold, and that scientists who worked on it had been pressured into remaining silent about its limitations. While the hawkish Edward Teller lobbied for it, the resourceful Hans Bethe and his colleagues again rose to the cause and published another article in Scientific American along the same lines as before, arguing yet again that the enemy could always employ suitable decoys to defeat the system, and that it would again lead to an accelerated arms race.

Nuclear weapons have been brought somewhat in check by various treaties since then. But needless to say, George Bush has inherited the mantle from his esteemed Cold War predecessors and has gloriously validated it. In 2002, after honouring the Anti-Ballistic Missile Treaty for 30 years, the administration withdrew from it in an ominous development. Part of the reason was a short-lived but dangerous resurgence of "small" nuclear weapons R&D. These "low-yield" weapons, called "bunker busters", were intended to destroy underground enemy bunkers. However, it was convincingly argued that, contrary to what their proponents would have everyone believe, they were no "safer" or more "contained" than conventional nukes, and in some respects even less so.

The most important reason for withdrawing from the treaty obviously is the renewed interest in another missile defense system, "Son of Star Wars", this time ostensibly against North Korea and Iran, and this time based in both Europe and the US.
Yet another article was published by Bethe's collaborator Richard Garwin in Scientific American in 2004, arguing against the proposed system. Again, the most important objection is political. What kind of signal is the US giving to N. Korea and Iran? It is clearly an invitation for these nations to become even more suspicious of the US, and it gives them the perfect pretext for developing their own missiles.

Perhaps the US wants to adopt the Cold War strategy of economically bleeding these nations by making them spend huge sums on missiles, a justification that is often offered in support of this endeavor. But this strategy is not only dubious but misguided. Firstly, there is no guarantee that a nation facing economic collapse would refrain from attacking the US in some other way, so the overall risk to the US increases. Secondly, even if it does not attack, it could develop any number of weapons and missile systems and possibly help them proliferate. This brings us to the third and most pertinent point. In this age of nuclear terrorism, it is highly implausible that Iran or N. Korea would use nuclear warheads on missiles to attack the US if they really wanted to wage war against it; it would be suicide for them. Instead, as the late Carl Sagan, an outspoken opponent of missile defense, used to point out, such states would smuggle in a low-yield or dirty bomb in a suitcase or a diplomatic pouch. Graham Allison affirms this in his highly readable Nuclear Terrorism. In fact, even that is unlikely, since it would threaten to drop the curtain on their own existence. The real threat is from independent terrorist groups with no return address and no fear of retaliation, and it is almost inconceivable that they would use missiles to attack the US. However, nations like N. Korea and Iran could happily and anonymously hand the technology they have developed to these terrorists.

In the latest developments, the US is pressing European leaders to accept missile defense installations in Europe, over strong Russian objections. I commend Vladimir Putin and those leaders who have refused to give in to this inane and misguided demand. One must not forget that it was the presence of Jupiter missiles, deployed in Turkey in 1961, that threatened the Soviet Union and became one of the causes of the Cuban Missile Crisis. By again deploying missiles in Europe, the US is only going to make itself and the world a less safe place.

Times have changed, but the US still very much seems to be living in the Cold War. It still has about 10,000 warheads, many on hair-trigger, 15-minute launch alert. The threatening nature of this state of affairs has been attested by both conservatives and liberals. The missile defense strategies it is trying to employ are Cold War-era tactics; they were unworkable and dangerous during the Cold War, and they are dangerous right now. One would think that statesmen would have learnt from such a long experience of dealing with possible death and destruction. But, like bacteria, these bugs don't die out. They can be resisted and rooted out, but statesmen lack the intelligence and conviction to resist them, and would rather let the bugs turn them into zombies.


Tuesday, April 24, 2007

DECIDE HOW YOU WANT TO DIE... scurvy or DNA damage. Hopefully, neither.

An interesting study has appeared in the British Journal of Nutrition (2007, 97, p. 639) which seems to confirm what many of us may have suspected: that nature knows best. The study investigates the effect of Vitamin C as an antioxidant when taken alone in the form of pills, or as a component of oranges or orange juice. It concludes that Vitamin C in the form of oranges or orange juice may be much better than Vitamin C in the form of a pill or supplement.

The study had volunteers drink Vit C dissolved in water, orange juice, or sugar water as a control. After the Vit C levels in all the volunteers' blood were equalised, samples were taken and exposed to hydrogen peroxide, a known DNA-damaging agent. Strikingly, the levels of DNA damage were much lower in the orange-juice volunteers than in the Vit C-water volunteers.

I have not had access to the full paper, but the authors conjecture that it may be the other substances in oranges which protect against DNA damage. As I see it, these other substances may be acting as "sacrificial" molecules, themselves getting oxidised and thereby protecting both DNA and Vit C.

Actually, these results should not be surprising. Nature has evolved intricate packages of chemicals that play interlocking roles in organisms. Oranges are not just containers for holding Vit C and other compounds, but intricate systems with an essential synergistic interplay of all these substances as well as their container. Taking one out of context creates the same problems as when politicians take words out of context. As in many aspects of nature, the whole is much more than the sum of the parts. In light of this, I sometimes wonder how we have so many effective drugs that are isolated from natural sources and used separately. But then, the record is not perfect, is it? Think of how many side effects they have; this may well be because we are not providing a holistic environment for them to act in. Sometimes the leaf really is better than the pill. Validation for old Ayurveda and herbal medicine?

Next time, maybe we can think twice before we substitute a Vitamin tablet for its natural source.


Monday, April 23, 2007



A post about metaphors and analogies on a fellow blogger's page reminded me of a quote by a remarkable mathematician whose name is known only to aficionados now, but who stands in the front rank of the brilliant mathematicians and physicists of the twentieth century: Stanislaw Ulam. Here's a quote from him about analogies:
"Great scientists see analogies between theorems or theories. The very best ones see analogies between analogies."
Indeed. And Stan Ulam could very well have put himself in the second category, although his modest nature would never have let him do so.

Ulam was born in Poland and grew up in a romantic time, the 20s and 30s, when great discoveries in mathematics and physics were being made in small, enchanting roadside cafes by small groups of people working intensely together. One of these, the Scottish Cafe in Lwow, was a focal point for the best pure mathematicians in Europe. Equations were scribbled on the tables there, and the waiters were told never to erase them. Marathon sessions were common, fueled by black coffee and interrupted only by occasional meals and trips to the bathroom; one non-stop session lasted 17 hours. The mathematician Gian-Carlo Rota said this about Ulam's fascinating mind:
"Ulam's mind is a repository of thousands of stories, tales, jokes, epigrams, remarks, puzzles, tongue-twisters, footnotes, conclusions, slogans, formulas, diagrams, quotations, limericks, summaries, quips, epitaphs, and headlines. In the course of a normal conversation he simply pulls out of his mind the fifty-odd relevant items, and presents them in linear succession. A second-order memory prevents him from repeating himself too often before the same public."
Ulam was invited to visit the US as a lecturer several times during the 1930s by a fellow famous emigre from Europe, arguably the smartest man of his generation: John von Neumann. Within a short time, the romantic days came to a tragic end. Ulam held out in Poland much longer than many other brilliant European scientists and mathematicians, and in 1939, on the eve of World War II, escaped to America with his brother Adam. The rest of the Ulam family perished in the Holocaust.

After coming to the US, Ulam was secretly invited to join the Manhattan Project at Los Alamos, where he was known as a problem solver and a jovial team worker. At Los Alamos, he tried to recreate the idyllic atmosphere of his young years in Europe by installing a coffee machine outside his office where scientists could talk shop. You can see Ulam in The Day After Trinity. Here is a photo of three prodigies from those days, (from L to R) Ulam, Richard Feynman, and John von Neumann:

[Photo: Ulam, Feynman, and von Neumann]

While at Los Alamos, Ulam made what was probably the most important contribution of his career: the Monte Carlo method, a way of calculating the outcome of complex processes using random numbers. This method is now so important and deeply rooted in physics, chemistry, and engineering that many students forget that somebody invented it. It is implemented as a black box in many computer programs, such as those I use for calculating the structures of organic molecules, so people sometimes use it without knowing they are using it.
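The core idea is easy to illustrate. Here is a minimal sketch in Python, not Ulam's original neutron-diffusion application but the standard textbook example: estimating π by scattering random points over a unit square and counting how many land inside the quarter circle.

```python
import random

def estimate_pi(n_samples, seed=0):
    """Monte Carlo estimate of pi.

    Points are sampled uniformly in the unit square; the fraction
    falling inside the quarter circle x^2 + y^2 <= 1 tends to pi/4.
    """
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

print(estimate_pi(1_000_000))  # close to 3.14159; error shrinks like 1/sqrt(n)
```

The slow 1/√n convergence hardly matters for high-dimensional problems like neutron transport or molecular conformations, where deterministic quadrature is hopeless, and that is exactly why the method became so deeply rooted.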

In 1946, Ulam suffered an attack of encephalitis; he could not remember events surrounding the attack, and after the operation federal agents questioned him to make sure he had not given away atomic secrets during his loss of recollection. Afterwards, Ulam seemed to some to have become even more brilliant than before.

However, Ulam probably became best known to a wider audience through his participation in the development of the hydrogen bomb. After the war, he and fellow scientist Cornelius Everett embarked on a series of tedious calculations to prove that the then accepted and widely touted design of the hydrogen bomb would not work. This was a significant result, as President Harry Truman had earlier been prodded into announcing a crash effort to develop the bomb based on that design. Within a short time, however, Ulam came upon the essential breakthrough that enabled the infamous Edward Teller to develop the design used in hydrogen bombs ever since. The breakthrough involved separating the fission and fusion parts of the weapon, and using compression from the fission bomb to ignite the fusion stage. After this design was invented, everybody assumed that the Soviets were doing the same, and the program was pursued with vigour. Every country that has since developed thermonuclear weapons has used this so-called "Teller-Ulam" design or a variant of it.

The imperious Teller essentially took much of the credit for the invention, and later tried to expunge Ulam's name from that part of history. Hans Bethe liked to joke that Ulam was really the "father of the H-bomb" while Teller was the mother, since he carried the baby for so long. Ulam, for his part an unassuming and docile man, stayed away from these disputes, though he could rightly have done more to assert his claim to fame. Ulam and Teller parted ways after the discovery, Ulam returning to his world of pure science, and Teller becoming increasingly belligerent and disliked by his fellow scientists, pushing for new and "better" nuclear weapons, thus becoming what Richard Rhodes calls the "Richard Nixon of American science". Till the end of his life in 2003, at the age of 95, Teller gave hawkish and wrong advice to Presidents (famously about "Star Wars" to Ronald Reagan) and believed that he was advancing peace by building more hydrogen bombs.

During his professional career, Ulam spent time at the University of Wisconsin, the University of Southern California, and the University of Colorado at Boulder. His wife, Francoise, was always a loving supporter and admirer. She remembered one defining moment from their lives, when she found her husband staring out of the window after he had had the idea for a workable hydrogen bomb. "I have just discovered the idea that will change history," he presciently said.

Ulam died in 1984. An astonishingly versatile scientist, he was equally at home with the most abstruse reaches of set theory and with the details of thermonuclear fusion. His memoir, Adventures of a Mathematician, paints a fascinating and delightful portrait of the golden age of physics and mathematics, as well as of the dawn of the nuclear age. In it we hear anecdotes about famous mathematicians and physicists, many of whom were good friends of Ulam's.

Ulam once said:
"It is still an unending source of surprise for me how a few scribbles on a blackboard or on a piece of paper can change the course of human affairs."
Ulam was certainly one of the select few who scribbled.


Saturday, April 21, 2007



Raid "Earth Options" Flying Insect Killer, the supposedly environment-friendly fly spray, is easily the worst-designed fly killer I have come across to date. I doubt the manufacturers themselves knew what composition and materials they put in, and even if they did, they don't seem to have actually tested it. For one thing, perhaps because it's supposed to be benign, its sheer potency is just lousy. You have to spray it directly on the fly, and more often than not the little critter ends up flying around before it suffers a direct hit, so that you mostly end up spraying everywhere except on it. Even when it is hit, it usually struts around for a few random centimeters before finally falling dead, parading microscopic globs of the chemical all over the place. And in some death-defying instances, I have even seen flies get up, dust themselves off as if nothing had happened, and resume their flying antics.

But the most annoying thing about Raid Earth Options is the aerosol formulation, which is extremely poorly designed. The stuff does not perform even its basic function: getting finely aerosolized. I don't know what exactly was circulating in the bloodstream of the chemist or engineer who designed it. When you spray it, the particle size is quite large, so the droplets quickly drop like stones onto whatever surface is below. Because of this, not only does the spray not hang around long enough in the fly's flying space, but you can never use it on tubelights, where flies usually sit, because the tubelights are usually right above your desk and everything on it. The first few times I used it, I had to discard some papers on my desk, clean up the whole surface, and yes, throw away a box of cookies that was actually sitting quite far from where I sprayed. I don't think environment-friendly means you can use it as hot sauce for your fried rice.
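There is simple physics behind this complaint: for small droplets, Stokes' law says the terminal settling velocity grows with the square of the droplet radius, so a coarse spray falls out of the air roughly a hundred times faster than a fine mist. A back-of-the-envelope sketch (the droplet sizes below are illustrative assumptions, not measurements of the actual product):

```python
def settling_velocity(radius_m, rho_drop=1000.0, rho_air=1.2,
                      mu_air=1.8e-5, g=9.81):
    """Stokes-law terminal velocity (m/s) of a small sphere falling in air.

    v = 2 * (rho_drop - rho_air) * g * r**2 / (9 * mu)
    Valid only at low Reynolds number, i.e. for small droplets.
    """
    return 2.0 * (rho_drop - rho_air) * g * radius_m ** 2 / (9.0 * mu_air)

# Illustrative droplet sizes: a fine aerosol vs a coarse spray.
fine = settling_velocity(5e-6)     # 10 um diameter: a few mm/s, lingers in the air
coarse = settling_velocity(50e-6)  # 100 um diameter: ~0.3 m/s, rains onto the desk
print(fine, coarse)
```

A droplet ten times larger settles a hundred times faster, which is consistent with the spray coating the desk instead of hanging in the fly's airspace.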

This problem dictates that, to avoid a thin and rather long-lasting coating on everything in my room, I always lure the fly into the bathroom by turning off all the lights except the one there, and then turn the whole bathroom atmosphere into a Raid fest. And don't even think of using it anywhere in your kitchen. Plus, the smell is not exactly enticing, to humans or flies. Dismal. Others seem to agree.

S.C. Johnson and Co., for all those heartwarming commercials that portray three generations of dedicated product manufacturers, shame on you for selling us Raid Earth Options. I do like your Ziploc and Saran Wrap.


Friday, April 20, 2007


This was one week when, after a very long time, I really wished that I did not own a computer or TV and had crawled into a cave.

But anyway, here are some perceptive comments from The Economist on the quagmire that is the US gun politics saga.
"Cho Seung-hui does not stand for America's students, any more than Dylan Klebold and Eric Harris did when they slaughtered 13 of their fellow high-school students at Columbine in 1999. Such disturbed people exist in every society. The difference, as everyone knows but no one in authority was saying this week, is that in America such individuals have easy access to weapons of terrible destructive power. Cho killed his victims with two guns, one of them a Glock 9mm semi-automatic pistol, a rapid-fire weapon that is available only to police in virtually every other country, but which can legally be bought over the counter in thousands of gun-shops in America. There are estimated to be some 240m guns in America, considerably more than there are adults, and around a third of them are handguns, easy to conceal and use. Had powerful guns not been available to him, the deranged Cho would have killed fewer people, and perhaps none at all."

On a Commander-in-Chief who cannot get enough of waxing eloquent about peace, freedom and dignity:
"Mr Bush however, has done active damage. On his watch the assault-weapons ban was allowed to lapse in 2004. New laws make it much harder to trace illegal weapons and require the destruction after 24 hours of information gathered during checks of would-be gun-buyers. The administration has also reopened debate on the second amendment, which enshrines the right to bear arms. Last month an appeals court in Washington, DC, overturned the capital's prohibition on handguns, declaring that it violates the second amendment. The case will probably go to the newly conservative Supreme Court, which might end most state and local efforts at gun control."

On some simple, practical, measures:
"The assault-weapons ban should be renewed, with its egregious loopholes removed. No civilian needs an AK-47 for a legitimate purpose, but you can buy one online for $379.99. Guns could be made much safer, with the mandatory fitting of child-proof locks. A system of registration for guns and gun-owners, as exists in all other rich countries, threatens no one but the criminal. Cooling-off periods, a much more open flow of intelligence, tighter rules on the trading of guns and a wider blacklist of those ineligible to buy them would all help."

On why gun control is not a partisan issue and so probably won't ever be settled, and why the NRA may be the most senseless organisation that ever existed:
"Harry Reid, the Senate's Democratic majority leader, warned against a “rush to judgment”. There is little danger of that. The blood-letting in Blacksburg is unlikely to shift the debate about guns...the Democrats are convinced that gun control helped them lose elections in 1994 and 2000. The reason is that, no matter how often the Democrats promise not to take away hunters' rifles, the NRA treats any curb on gun rights as a first step towards complete disarmament. And without their 240m guns, it argues, Americans will be defenceless not only against criminals but also against tyranny. The NRA draws on history to support its arguments. The first European settlers conquered America with guns; British soldiers tried to confiscate them, but the Americans revolted and shot off the superpower's yoke....This may be a selective view of history, but it is still relevant, for two reasons. One is that the notion of a right to bear arms is enshrined in the constitution. The other is that the NRA constantly exaggerates threats to gun-owners. It sells books such as “Thank God I Had a Gun: True Accounts of Self-Defence”. It relentlessly publicises the fact that police in New Orleans, during the looting spree that followed Hurricane Katrina, confiscated some legally-held guns. And its chief, Wayne LaPierre, has peddled for years the absurd theory that the United Nations is plotting to take away Americans' guns."

It's unfathomable that the NRA still pitches its drivel based on the 18th-century American state of affairs, and this is an organisation which, in addition, has a crackpot chief. It's appalling how Americans fall back on the Constitution as if it conferred an objectively determinable, God-given right, a sacrosanct book applicable in every time and circumstance.

And it's woeful how the standard argument, that giving people guns would enable citizens to protect themselves from deranged killers, is part of a greater lack of problem-solving ability increasingly prevalent in a moribund and militant society, where people's only vision for the future of their grandchildren is a world full of violence.




Curiously, Beverley is Professor of Apologetics at Tyndale University College and Seminary. His book is going to be published in July.

I have not yet written a review of Dawkins's book, by the way, which I finished reading ages ago. The reason is not easy to explain in a few words. For now, let me say that I was waiting to read Edward O. Wilson's The Creation, which I am in the process of doing, before I wrote about The God Delusion.


Wednesday, April 18, 2007


This is a horse that has long been dead, has been beaten so hard that it has been pounded into the ground, and will no doubt continue to be beaten. But I have been reading a few blogs and commenting, and it seems to me that some proclamations on gun control are pretty misguided. I know that no one can possibly say the last word on a subject that has been paraded and trampled and swallowed and regurgitated a thousand times, but some observations stuck out.

First of all, we have to face it: someone as demented beyond imagination as Cho Seung-Hui could have done anything at any time with any weapon, and nothing could have predictably prevented him from doing it. But let's also face the facts: he committed his horrific deed with two rapid-fire semi-automatic handguns. The fact remains that if he had been denied access to those guns, it is very likely that the damage would have been mitigated. The standard argument is "Guns don't kill people... and he would have done it anyway". But let's think from a practical point of view. Cho could have used a knife, or explosives, or even a chainsaw; Timothy McVeigh did not use guns. But it is in general much harder to use these things to kill 30 people. Blowing up buildings needs quite sophisticated planning and execution, and knives are much weaker than guns. Sure, a demented man may take lives in quite a bloody manner with knives, but guns are designed to do what they do: to be fast, deadly, concealable, and efficient at long range. It's quite clear that if guns are around, they will be used by such a person. If they are not around, they cannot be, and maybe killers might even rethink their actions because they cannot give instant vent to their impulses. As a fellow blogger said, "A gun needs little thought to discharge. It's hard to say Cho wouldn't have calmed down by the time he loaded his U-Haul up with fertilizer".
Guns provide very little protection for the victims, almost by definition: they are intended to kill their victims, not to shield them. So I believe that limiting access to guns (quite apart from how and to what extent it can be done) can surely alleviate the tragedy of such disasters, if not avoid them.

The second question is that of limiting access to the guns themselves. Proponents of guns were quick to say, "Look, this is what happens when everyone on campus does not have legal access to guns". But think of what the culture of a college campus would be like if you kept thinking that all your classmates and teachers might be carrying concealed weapons at any given time. I personally cannot imagine such an environment matching anyone's idea of what a college or university should be. Most importantly, where before you had one person who turned lethal when he lost his mind, now you potentially have a hundred people who can turn lethal when they lose theirs. Can we count how many incidents have been averted because a person lost his mind but did not have easy access to a gun, and that delay made him think and perhaps calm down? Imagine a society where everyone is armed to protect themselves from everyone else. Assuming human beings to be rational can be the biggest mistake in such thinking.

However, I think we all understand now how real and gruesome it gets when such a person is on campus and no one has the means of protection. When I was thinking about this issue, I thought that maybe there could be some compromise: maybe you could designate certain college officials and give them weapons. Maybe you could have one weapon per building, with a hierarchy of people allowed to use it in such an emergency.
But on second thought, I am not sure this would work. For one thing, those officials would use the weapons only if they were attacked themselves. The first reaction of any human being in such a crisis is to try to save himself, and risking martyrdom by sprinting to the scene of the incident to protect others, even if that scene is right next door, is just not something that even righteous people are programmed to do. Moreover, if these weapons fell into the wrong hands, or if the designated persons did not use them properly (after all, unlike the killer, they have not turned into cool-headed zombies and are still quite human), they could cause more harm than good.

The "solution" of giving everyone on campus a gun has always seemed to me one of those quintessential "solutions" that human beings are so fond of: solutions they create because they have created the problems and don't want to nip them in the bud, thus necessitating a new solution for every new problem, ad infinitum. I don't know what the solution to this problem is, but it has always seemed to me that some simple measures could lessen the risk of such permanently scarring incidents. These measures remind me of the nuclear treaties signed over the last fifty years, which largely forged a compromise between the nuclear hawks and doves.

Limit the number and especially the kind of weapons that ordinary citizens can purchase. Like I said, it's not about "freedom", it's about proportionality. Make background checks extensive, and don't let the absence of a criminal record alone be an adequate reason for selling AK-47s to sixteen-year-olds. Restrict the sale of certain kinds of guns by putting a cap, say, on the number of rounds that an automatic weapon can fire continuously before it needs to be reloaded. When a gun needs to be purchased, make the approval of two adult referees necessary, and charge a reasonable fee for getting a license. Such strategies seem to have worked in New Zealand and have made both pro- and anti-gun camps happy, and they can work in the US if people stop having their love affair and obsession with "freedom". All this is not going to stop people like Cho from carrying out their morbid tasks, but who said a cure for such a problem was ever a realistic objective? In the case of an incident such as the above, even five lives saved are more than worth it.

In light of the above thoughts, I find it a flawed argument to advocate giving guns to everyone, or even to the majority. The reason I have is similar to the one I have against nuclear proliferation. If everyone who had nuclear weapons were rational, then deterrence would play out superbly, and I would be all for putting a few nukes into everyone's hands. But with nuclear terrorism the equation changes, because even the rational condition of self-preservation no longer applies. Under such conditions, nuclear proliferation can only pose unknown dangers. I see an analogy between global nuclear proliferation and national gun proliferation, with inherently irrational people being the key deciding factor against proliferation. There is a compromise possible, but only if both camps stop sticking to their extreme positions. It needs moderate thinking, and unfortunately in the US, "moderation" seems to have become anathema these days.

P.S. By the way, notice how this event is rightly getting all the attention it deserves, but daily events in Iraq involving equally innocent civilians (like the 200 dead in the bombing today) are considered routine and boring.


Tuesday, April 17, 2007


For once, George W said it well:

"Schools should be places of safety and sanctuary and learning. When that sanctuary is violated the impact is felt in every American class and in every American community."

And that's what it's been. I personally have been feeling really uneasy since yesterday. Maybe it's in part because I grew up around a college campus, visiting Fergusson College regularly almost since I was born, with both my parents teaching there. College for me was synonymous with home and my parents. But the unease is also because of a general image that we have of college campuses, where peace is propagated and pure knowledge is disseminated. Most academics are gentle people, and this fact also adds to the feeling of loss. Part of the grief also comes from the fact that two Indians were killed in the massacre, although this grief is by no means exclusively reserved for our own. And of course, a major part of the despair comes from coming to grips with incomprehensibility, an inherently paradoxical task. The Holocaust still remains incomprehensible, but at least we know its dominant cause: the grotesque ideology of one man. But what happened yesterday defies reason to a much larger extent. Who can we blame? And how do we make sure it doesn't happen again? When we are confronted with a tragedy, even blaming something or someone can provide solace, because that provides a way forward to prevent it from happening again. But it is very difficult to do either of these things in light of this event.

Being a student and a grad student is a common culture that so many of us have experienced and are experiencing, and that cements the common feeling of pain like nothing else. All students without exception have tales to tell of how they cut classes, had secret crushes, survived on and made forays for departmental free food and the like, got into trouble more than once with authorities, and read PhDcomics. Even those for whom college was uneventful share in this common heritage. College provides a shared experience for us like few other things in life. And just like some other common aspects of life, such as relationships, we can all feel deep inside what it means when such a tragedy befalls college students. When my grandfather passed away two years ago, I found myself aimlessly wandering among the library stacks, probably to feel at home in my own personal sanctuary. Perhaps it gives me the feeling that whatever's in those shelves provides a powerful source of hope to guide us from this moment on into the future. Strangely, I found myself doing the same thing yesterday, and that more than anything else made me realise the common thread of personal grief that ran through all of us, even if to different extents and for different reasons.

The breaking point came yesterday, when everybody was waiting for news about 26-year-old Minal Panchal; today everybody's worst fears were confirmed. Minal was an architecture student at Virginia Tech. Her Orkut profile is that of any person who loves and lives life, enjoying movies, long walks and old Hindi film songs. She seemed to be deeply interested in her work, and also in sustainability and in saving the environment. But of course we don't need to grieve for her because she was somebody special. We need to grieve for us, because she was just like us, and I or you could have easily been in her place. Within two hours, almost 3,000 scraps (up to 15,000 now in 12 hours) had been written in her Orkut scrapbook; first they were of hope and prayers for her safety, then of despair and prayers for peace. Almost every one of these offers prayers and wishes for her soul to rest in peace. I was going to pen a line, then realised the futility because she would never read it, but then realised that if she can't, at least her friends and mother can, and went ahead anyway. I did not wish for peace for a soul that I don't believe exists, but all of us can wish for peace for those beings who are still alive. That is another common thread that runs through us; one does not need to believe in God to share in hopes and wishes.

Monday, April 16, 2007

Q: Why was 2002 an exceptionally good year for the US?

A: Because it seems to be the only year in which there was no school or university shooting.

And now, the worst one in US history.

I love the standard line touted by pro-gun Americans: "It's not the gun, it's the human being".

That's what they also said during the Rwandan genocide. When US officials asked to block the radio frequencies that were spewing vehement propaganda against the Tutsi, higher officials shut them up by sternly saying "Radios don't kill people, people do".

Yes people, it's always the human beings; that's precisely why you keep them away from guns and radios. And please, it's not about "freedom of choice". It's about access to guns and proportionality. Why not let citizens carry mini Sarin cylinders or Anthrax shots then?

What an amazingly and sickeningly warped and inverted acknowledgment of culpability, as well as assertion of freedom.


Thursday, April 12, 2007


It's strange that just yesterday I was reading a few horrifying chapters from Inferno: The Devastation of Hamburg, 1943 by Keith Lowe, and today we hear about the death of a man whose claim to fame rests on a novel that's based on the devastation of Dresden. Hamburg, Dresden, Tokyo, and finally Hiroshima, all dark chapters in the history of the twentieth century. But Lowe's book really reminded me why opposition to dropping the atomic bomb cannot be completely unequivocal, albeit for a heartbreaking reason.

Lowe's book details the destruction of Hamburg in July-August 1943, which was the most successful operation of the war for British Bomber Command. Hamburg, just like almost any other city, was engaged in some manufacturing of war material, and that pretext was enough for the Allies to cross the thin line between bombing military bases and bombing civilians, as it was enough for almost everybody else in the second world war.
Lowe devotes a full chapter to the unique and grotesque phenomenon that materialised in Hamburg on July 27, 1943, that was like nothing seen before. Technically, it was called a firestorm. On a personal level, it burnt memories in the minds of witnesses that are comparable to those evoked by any of the twentieth century's other excesses.

A firestorm begins when fires caused by bombing unite to form a giant conflagration. The bombing has to be intense enough, and the atmospheric conditions have to be right, for a firestorm to start. Under warm, dry conditions, fires initiated by incendiary bombing burn bright and hot. They combine to form one giant inferno, which superheats the air above and around it. The hot air rises with such fury that the surrounding air at ground level is sucked into the center of the fire at speeds exceeding 100 kmph. In Hamburg on that day, the resulting winds reached almost 150 kmph.

The testimonies of witnesses that Lowe narrates describe what can only be called hell on earth. The winds suck people into the fire and raze their bodies to bone in minutes. The wind, having no easy way to escape because of the buildings, forms mini hurricanes around corners that toss people around like paper dolls before they are swept into the conflagration. In addition, the firestorm consumes all the surrounding oxygen, and people die gasping several dozen meters away. The temperatures in the firestorm in Hamburg reached 800 degrees Celsius. Researchers later estimated that this was more than in any fire in the history of the world, including the great fires of London and Chicago. On the ground, the concrete and glass melted, and witnesses recall people who were running getting their shoes stuck in the molten material. When they desperately removed their shoes, they faced an even more gruesome situation: their legs became mired in the hot tar, after which they actually started to burn upwards from the bottom, just like flies that fall into hot wax, as someone said. We don't really expect people to catch fire as quickly and easily as paper does, but that's what happened in Hamburg. People who were close to phosphorus bombs had burning phosphorus fall on their eyes and skin, where it burnt incessantly and bored its way into bone and internal organs. As they blindly ran, they were hit by flying debris that was being sucked into the fire. Others had their hair catch fire, and ran around madly as their faces melted, before they finally gave a scream and flopped dead. Fathers, husbands, wives, sons and daughters and brothers and sisters saw their loved ones dying every imaginable horrible death in front of their eyes.

At the end of the nightmare, 45,000 people were dead. After the fires subsided, workers could not get into air-raid shelters to drag decomposing bodies out; the maggots were so thick that they slid on them.

The reason that my thoughts turn to the atomic bomb is this: while 45,000 people died in the most horrifying manner in Hamburg, millions were dying on the Eastern front. The Russo-German war was the single most horrible and devastating war in history, with Stalingrad topping the list of killing by numbers. The end left 20 million Russians dead, including 10 million civilians. Thousands of miles away, in the Pacific, equally horrific battles were being fought; Okinawa, Iwo Jima, Guadalcanal. Curtis "Iron Ass" LeMay decided he could do better than the British, and in March 1945, burnt 100,000 men, women and children in a single night in Tokyo. And we don't even have to start recounting the fate of the millions who were being slaughtered in the Holocaust.

The main point in all this is that by 1945, everyone had lost their morality. Everyone had sinned. Hitler may have been a monster, but on the ground, the British citizens in the Blitz and the German citizens in Hamburg and Dresden suffered equally. The estimate of one million soldiers who were going to die in the planned invasion of Japan in November 1945 may have been inflated, but no one can doubt that the toll would have been terrible, and would have been yet another act of madness in a world gone half-mad. In Japan, twelve-year-old girls were being taught to fight with bamboo spears, and the Japanese had been programmed almost like zombies to kill or be killed. Under these circumstances, almost any end to the war would have been a pitifully welcome one.

That's why I don't agree that dropping the atomic bomb was wrong for moral reasons. Or rather, I think it's a trivial point; sure, of course it was wrong for moral reasons. But was it any worse than the unparalleled killing that had gone on before? At least the people close to the epicenter in Hiroshima had an instant and painless death, unlike almost everyone who suffered horribly in Hamburg and Tokyo. On the other hand, atomic radiation caused scars that would proliferate through genes and generations. But again, is that really any more immoral than burning alive a hundred thousand people in one night by sustained bombing? In my opinion, once the Allies took the decision to bomb civilians in large numbers, the moral line was crossed and the decision to drop the atomic bomb was already taken.

The real reason why I think dropping the bomb was wrong is that the Japanese would probably have surrendered if they had been allowed to keep their Emperor, which was a very important symbolic necessity for them. This would have been easy to grant, as the Emperor had virtually no practical powers, in spite of being treated like a god. But there was a big cultural divide between Japanese thinking and American understanding. Even though there were officials who understood the mentality of the Japanese, Washington did not take cognizance of this desire. What bothers me, and this was not the first time it happened, is the constant tendency of the Allies to fail to empathize with their enemy.

And here we really get to the heart of the matter, which turns away from morality and back again to plain old politics. After fifty years, it is increasingly clear that Truman was too preoccupied with thoughts of the superiority that his country would have with this new weapon, and with preventing the Russians from entering Japan. The atomic bomb would serve as the ultimate diplomatic gambit. A million Russians had already amassed on the Manchurian border; Truman had already given them Berlin, and he was not about to lose Japan to the Reds too. The atomic bomb would ensure an instantaneous Japanese surrender, and the Americans could quickly move in before the Russians. Preemptive diplomacy, that's what the bomb really was. I don't mean to say that Truman cared nothing about the Japanese, or that his concerns about loss of American life in the invasion of Japan were unwarranted. But the real driving force was quickly ending the war, and stopping the Russians from taking over Japan.

It is a sobering fact that, not for the first time in history, the fate of hundreds of thousands was decided on military and political grounds. If anything about dropping the bomb strikes a raw nerve, it's that Truman did not exercise the patience that could have saved not just American lives but also all those in Hiroshima. As for the bomb, the Americans were superior to the Russians for many years afterwards, and the failure of almost every President after Truman to realise this ironically leveled the playing field. The dropping of the Nagasaki bomb was even more impulsive and premature, but again was done for exactly the same reasons.

Apart from this regret, Hiroshima was really another cauldron of brutality in a long line of cauldrons. Killing people by strategic bombing and killing people with atomic bombs: the principle is really the same. And that is one reason why, contrary to the hopes of those such as Niels Bohr and Robert Oppenheimer, the atomic bomb has not ended war. Perhaps it provides an element of swift and horrific shock like nothing else, but conventional war and genocide and ammunition can still uproot and kill millions. China, Vietnam, Cambodia, Rwanda, and Bosnia all bear testimony to this. Atomic bombs can provide deterrence, but they cannot stop us from killing each other. In this respect, the atomic bomb and the Hamburg phosphorus bomb both fit into the same paradigm. The least we can do is hope that these dark chapters of our past have not been completely futile in giving us wisdom.


Wednesday, April 11, 2007



Some idiot on a bicycle slammed into me yesterday. Fortunately I did not break anything, but the bruises have been giving me an uncomfortable time since then. After rinsing both knees with chlorhexidine and iodine, I was not concerned; if there was an infection, antibiotics would take care of it.

But it wouldn't have been that way seventy years ago, when the most you could do to prevent a wound from getting infected...was wait, and perhaps apply some crude remedies. That was how it had been for two hundred years. For all the progress we had made, bad bugs still mostly got the better of us. It is appalling that about fifty percent of deaths in WW1 were from infections that riddled shrapnel wounds, and not from explosives or gunfire themselves. Once infection set in and gas gangrene made its hideous appearance, all one could do was wait, and maybe hope that the suffering would end soon...until sulfa drugs appeared on the scene.

That era of sulfa drugs, and not the one of penicillin, was the first heroic age of antibiotics. Most of us, if asked to name the first wonder-drug antibiotic, would name penicillin. But long before penicillin, sulfa saved thousands of lives. Without sulfa around, Coolidge's son died. With sulfa, FDR's son, and Winston Churchill, survived. Thomas Hager has done an excellent job of bringing this forgotten but extremely important story to life in "The Demon Under the Microscope". The former biographer of Linus Pauling shows us how different it was to suddenly have a drug that cured infections that previously would have almost certainly killed you. The time until the 1930s was a scary one, with every kind of Strep and Staph waiting to kill you after entering your body through the slightest cut, and diseases whose names we don't even remember now rampant and much feared. It was sulfa that first declared war on, and largely eradicated, these infections.

At the center of the sulfa story is the remarkable doctor and biochemist Gerhard Domagk. Domagk was an officer in WW1 and saw thousands needlessly die around him in agony, all because nobody could prevent the infection that set in after they were hit. After the war, Domagk went through a succession of jobs and finally ended up at Bayer, where he had a trailblazing career in the discovery of new cures for old infections. Building upon Paul Ehrlich's convictions about azo dyes as bactericidal agents, he and his colleagues tested hundreds of analogs, until he hit on the right one. This was the beginning of SAR as we know it today. And here we can see the chemist's tragedy. Domagk tested the compounds, but it was two chemists who actually made them. Yet they were excluded from the prize that Domagk would gather. This was not his fault, but really the workings of the Swedish committee, which behaved this way neither for the first nor the last time. Patriotic and yet conscientious, Domagk stayed put after Hitler came to power, losing himself in his work to distract himself from the injustice that was taking place around him. In 1939 he was awarded the Nobel prize, but the Nazis did not allow him to accept it. Bayer itself became connected with the notorious IG Farben, whose affiliate manufactured the hydrogen cyanide product (Zyklon B) used in the gas chambers.

There is much in the book that is eye-opening, and sulfa is only one chapter in a book that also deals with medical history and the social history of science. There were several things I was unaware of; one revelation was that the modern American university model is based on the German model. The Germans were the world leaders in both industry and academia, and the modern and highly successful trend of close collaboration between industry and academia was already widespread in Germany. For all their philosophical bent, the Germans never saw any contradiction between pure and applied research, and the university-industry collaboration and connection led to very fruitful research in engineering and medicine. The modern patent regime too was pioneered by German industry.

The most important fact which I was not aware of was the pivotal albeit unfortunate role that sulfa played in revitalizing the FDA and granting it powers to implement laws that made it mandatory for manufacturers to display warnings and ingredient labels on their products. Before that, almost anyone could set up shop and sell metals, elixirs, and liquids that promised cures for everything from syphilis to baldness, a practice that went back two hundred years. But in the 1930s, through a series of unfortunate events, a concoction of sulfa in, of all things, diethylene glycol was sold extensively in many states. Today, we would be horrified at such large-scale use of an industrial solvent for mixing a drug. But at the time, there were almost no laws that required manufacturers to list such petty things as solvents on their bottles. The FDA was a skimpy and ineffectual agency at the time, with a few dozen agents scuttling around mainly to keep a check on excessive profit making. After the sulfa-diethylene glycol concoction was sold, a wave of death began that did not stop until more than a hundred people had died, and public outrage changed the face of the FDA, and the way in which drugs are developed, manufactured and sold in the US, forever. After the tragedy, the FDA acquired new powers that it could have only dreamt of before. Of course, it took the thalidomide tragedy to produce the kind of strict FDA regime that we have today, but the sulfa tragedy started it all, and made drugs substantially safer for the public.

An amusing and ironic chemical fact also accompanies the discovery of sulfa. Even though it was the Germans who pioneered its development, it was a French group that discovered the most important fact about the drug: that it was not the azo chemical linkage but the sulfonamide fragment that was key to its action. Once they discovered this fact, all bets were off for the Germans, because the potent part of sulfa turned out to be sulfanilamide, a cheap bulk chemical that could not be patented! Even though the Germans tried to quickly get past this handicap by synthesizing new derivatives at a terrific pace to outnumber their French colleagues, the cat was out of the bag, and they could never top their initial success.

Gradually, sulfa made it everywhere, and into the United States through the perspicacity and interest of two Johns Hopkins researchers. It began to be marketed in every form and colour and flavour, as every derivative and analog. In the 1930s, it became the drug of choice for treating every imaginable kind of Strep or Staph infection, most of which it effectively tackled. Sulfa was touted as a miracle cure, with its relentless and wondrous effect on cases that only ten years earlier would have been totally hopeless. But as a drug, sulfa had already fallen behind. Penicillin had arrived on the scene. In due course, resistance would develop to both drugs, albeit relatively gradually to sulfa.

Domagk spent the last days of his life in gloomy peace, distraught by his country's destruction, and somewhat validated by the thousands of lives he had saved. Sulfa is still used for topical purposes.

We now know that sulfa competes with PABA, the natural substrate that bacteria use to synthesize dihydrofolate. Sulfa and further related research led to, among other things, methotrexate, a drug widely used in cancer therapy today. But in the end, what befell sulfa has befallen other antibiotics: the bugs have become resistant. When sulfa and penicillin were discovered, they were regarded as miracles. Perhaps we need another miracle for the bad bugs of today, and the age of fervent antibiotic research may need to return. But it should not be forgotten that sulfa, not penicillin, was the first miracle drug.
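Sulfa's mechanism, competing with PABA for the same bacterial enzyme, is the textbook picture of competitive inhibition, and the arithmetic is easy to sketch. The kinetic constants below are made-up illustrative numbers, not measured values for any real enzyme; only the rate equation itself is the standard one.

```python
def rate(S, I, Vmax=1.0, Km=1.0, Ki=1.0):
    """Michaelis-Menten rate in the presence of a competitive inhibitor:
    v = Vmax*S / (Km*(1 + I/Ki) + S)
    Here S plays the role of the substrate (PABA) and I the inhibitor
    (sulfanilamide). All constants are illustrative, not measured."""
    return Vmax * S / (Km * (1.0 + I / Ki) + S)

# With no drug present, the enzyme runs at half its maximum rate when S = Km.
print(rate(S=1.0, I=0.0))            # 0.5
# Adding the competing drug cuts the rate sharply...
print(rate(S=1.0, I=4.0))            # 1/6 ~ 0.167
# ...but the hallmark of competitive inhibition is that enough substrate
# always wins: at very high S the rate approaches Vmax again.
print(round(rate(S=1000.0, I=4.0), 3))  # 0.995
```

This is also why sulfa works at all: the bacterium cannot simply flood itself with PABA, so the drug's occupancy of the enzyme starves it of dihydrofolate.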


Thursday, April 05, 2007


Supramolecular chemistry is defined as chemistry "beyond the molecule". Just as ordinary molecules are aggregates of atoms, 'supramolecules' are aggregates of molecules. The bonds holding together atoms in an ordinary molecule are strong, covalent bonds. The bonds holding molecules together in a supramolecule are 'weak' bonds such as hydrogen bonds; bonds formed because of the attraction of a hydrogen bonded to an electronegative element like nitrogen or oxygen, to other oxygen or nitrogen atoms.

Supramolecular chemistry is immensely important for understanding life. That's because all the action in living organisms is mediated by such molecular assemblies connected by weak hydrogen bonds. Even though they are weak, just like many individual strands in a rope, they add up to a very strong force when they are present in large numbers. Also, in biological systems, strength is not as important as timing. If a bond is too strong, it may be useless for biological interactions because once formed, it will be very difficult to break in order to proceed to the next step. Molecular interactions in biological systems are multiple-step, dynamic interactions, and it's only the right strength and combination of weak and strong bonds that optimizes the function of biological systems.
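The "strands in a rope" argument is easy to put in numbers. The bond energies below are rough textbook ballpark figures (a hydrogen bond is on the order of a few kcal/mol, a C-C covalent bond around 80-85 kcal/mol), used only to illustrate the point:

```python
# Rough, illustrative bond energies in kcal/mol (ballpark textbook values).
H_BOND = 5.0        # one hydrogen bond: easily broken by thermal motion
COVALENT_CC = 83.0  # one carbon-carbon single bond, for comparison

def assembly_energy(n_hbonds, e_per_bond=H_BOND):
    """Total stabilization from n weak bonds acting together."""
    return n_hbonds * e_per_bond

print(assembly_energy(1))                 # 5.0  -- a single 'strand' is weak
print(assembly_energy(30))                # 150.0 -- thirty together are not
print(assembly_energy(30) > COVALENT_CC)  # True
```

Thirty hydrogen bonds, each individually feeble, collectively out-pull a covalent bond, yet any one of them can still be broken and re-formed on its own, which is exactly the combination of strength and timing that biology exploits.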

If the molecules in biology are human beings, then weak bonds such as hydrogen bonds are the relationships and interactions that human beings have with each other and with the outside world in general. As with the human world, biology would not function without these relationships. Hence their pivotal importance for actually making things work in living organisms. And hence the importance of supramolecular chemistry for understanding, and hopefully learning to modulate, these interactions for practical purposes.

Supramolecular chemistry is also very important for materials science. It is obvious that every material around us consists of molecules which interact with each other. Understanding these interactions through supramolecular chemistry is helping us devise new materials for electronics, plastics, and biopolymers which may one day replace vital elements in our body. Because weak bonds are not as strong as covalent bonds, they enable us to have subtle and fine control over the synthesis of new materials, which will help us to sensitively manipulate their properties. Think of biosensors which will detect negligible concentrations of pesticides in our food, or toxins in the soil, or chemical warfare agents. Think of 'smart' materials inside the body which will respond to temperature and the presence of nutrients, enabling us to selectively and periodically release drugs in the body only under certain conditions and only in certain cells or organs. All these applications and many more will require an understanding of the weak interactions that confer such exquisite sensitivity on biological systems.

Last but not least, supramolecular assemblies look pretty, and provide an artistic edge to science. For example, here is a supramolecular shape I made from just two molecules, cyanuric acid and melamine, both of which are useful molecules in their own right. Melamine might be familiar as a material used in plastic kitchen utensils. Modifications of cyanuric acid are used as disinfectants.

[Image: the cyanuric acid-melamine hydrogen-bonded assembly]

The purple atoms are carbon, the red and blue ones are oxygen and nitrogen respectively, and the white atoms are hydrogens. The yellow hydrogen bonds are clearly seen. The beauty and symmetry of such structures is striking, and countless geometrical possibilities abound for combining any number of such molecules into symmetrical shapes. The economy of construction- just two molecules in this case- is also striking.

And if one is bored (as I was), one can also create whatever shapes one wants. For instance, here's a "supramolecular dog" that I made. In the first view, every atom is represented as a big sphere, which reflects its actual size.

[Image: the "supramolecular dog", space-filling view]

And here's the real character of the supramolecule, a hydrogen bonded assembly like that pictured above. It's clear that even though every hydrogen bond may be weak, many such bonds are quite enough to preserve the shape of the dog.

[Image: the "supramolecular dog" with its hydrogen bonds shown]

As in many other matters, don't underestimate the strength of the 'weak', and the strength of numbers.


Tuesday, April 03, 2007


One of the favourite tactics of religious people who want to use science as firepower for their arguments, is to proclaim that both Albert Einstein and Isaac Newton were religious. They think that marshalling the tacit support of two dead scientists who are among the greatest in history would help them fight for their cause.

However, there are a number of points that refute such disingenuous arguments. In the first place, such an argument is an appeal to authority, which by itself provides no 'proof' whatsoever. Now, as far as Einstein is concerned, it's quite clear that he used 'God' as a metaphor for the ultimate mysteries of the universe, for those awe-inspiring truths in the cosmos which we can't yet comprehend. Invocations of God or something similar abound in Einstein's famous and oft-quoted phrases and writings, yet to my knowledge there is not an iota of evidence that he believed in the personal deity which most people associate with the word 'God'. In his later years, Einstein was a strong supporter of Zionism and the creation of Israel, yet it's clear that even these concerns of his were more humanitarian than religious, and did not attest to any deep religious Jewish faith within him. So, for religious people to claim Einstein as their own is dishonest, and shows a simple ignorance of historical facts. At most, Einstein can be called a spiritual philosopher, but not a religious person by the common definition of the term.

But among all the scientists in history, the genius who appears the most mystical to future generations is Newton. This is partly because of the sheer and astonishing breadth of his imagination, which still defies comprehension. He single-handedly laid the foundations for all future physical science, and also invented the mathematical tools necessary to describe nature. In spite of the two great achievements of twentieth century physics, quantum mechanics and relativity, we still live in a largely Newtonian world. Purely as a scientist, Newton's abilities do appear mystical and almost magical to all of us. This image of Newton is cemented by the way he lived his life: as a solitary and obsessed man who toiled for months in his laboratory and rooms without once appearing in front of the outside world, as a recluse who was so paranoid about his creations, pinnacles of human thought, that he sought to keep them secret for years, deciding to publish them only later in a burst of revelation. As Alexander Pope said, 'God said, let Newton be, and all was light'.
Religious people's fascination with Newton and their tendency to claim him as their own cannot be entirely disparaged, however. Newton in fact saw himself less as a scientist whose job was to document facts about nature and weave them into elegant theories, and more as a solver of puzzles, puzzles whose clues were laid by God for man to unravel. He saw God as the ultimate riddler, and man as the being whose duty was to lay bare His conundrums. He was a Unitarian who believed in the oneness of divine existence. All these beliefs of Newton explain his later intense forays into theology and alchemy.

Yet Newton's beliefs cannot belie the facts. The first fact is that the laws of nature which Newton discovered don't need a divine explanation to justify their elegance, power, and use. The world is governed by the laws that he discovered, and it makes as much sense to ask for the 'laws behind the laws' as it does to ask what came before time began. Even if we do discover some ultimate laws behind these laws, there is no reason to suppose that those laws would not be mathematical.

The second fact may be harder to digest, but it is also true. Newton's later obsession with alchemy and theology was largely crackpot nonsense. His reams of writings on religion and theology seem more like figments of a magic kingdom constructed by the mind of a deluded person, although, just as in the writings of a deluded person, there are some interesting conclusions that he draws. It is difficult to imagine what made Newton give up his spectacular study of natural law and start searching for cryptic clues in the Bible. But one thing is for sure: whatever the driving force, it is the products of his scientific studies that have survived the test of time, and that guide the conduct of science in the modern world. Just because Newton was a great scientist does not automatically give him authority over deciphering ancient texts, as some religious people would have us believe. One can be exalted in one field and totally misguided in another, and history has many examples which demonstrate this. Newton the natural philosopher was invaluable to mankind, but Newton the alchemist and theologian was at worst a deluded mortal, and at best an amusement of interest to historians, not to mention psychologists.

The third and most important fact that needs to be kept in mind when we talk about Newton is the need to accurately gauge the times in which he lived. This is perhaps the greatest fallacy which religious people commit, that of analyzing Newton outside the context of his times. We cannot forget that this was the seventeenth century. It was a time when religion did provide the best 'explanation' of many perplexities in the world. Apart from astronomy and mathematics, no other science was well-developed, and both astronomy and mathematics needed the spark of differential and integral calculus that Newton breathed into them to come to life. Newton may have been an extraordinary thinker, but he was probably awed as much as anyone else by the astonishing diversity and workings of life. Biology was not even a formal science then, and absolutely nothing was known about cells and organelles, although Newton's contemporary Robert Hooke would soon coin the term 'cell'. We knew nothing about microbes and their role in disease, about genes, about the transmission of hereditary characteristics. Most importantly, the world had no inkling of the great revolution that would lead us to redefine our origins and existence: the theory of evolution by natural selection, another gigantic intellectual revolution fomented by Newton's fellow Englishman two hundred years later. It is easy, even today, to look at life around us and think that a supernatural being created it. In the seventeenth century, Isaac Newton was as ignorant of and smitten by all these mysteries as anyone else, and it was much easier for him and everyone else to believe in a divine provenance for all things in the world.

This was also a time when the grip of religion was very strong, and one needed courage to make contrary views known. In fact, Newton's views on the oneness of God would have been heretical if he had made them public. The liberal King Charles II made a special concession and allowed Newton to become a fellow of Trinity College, Cambridge in spite of these views. Newton kept his side of the bargain and never published his religious views. That of course did not stop him from poring over ancient texts in private. He believed that theology, alchemy, and the laws of physics were all manifestations of the divine power of God. But his saying so does not make it so. Within the context of his times and his genius, one can be sympathetic towards Newton's beliefs, but that says nothing about the facts, which show that the laws of physics do not need to be combined with theological views in order to attain consistency.

Lastly, the most important and simplest truth about both Newton and Einstein cannot be forgotten: they may have held beliefs akin to religious ones, but they chose to marshal their intellect and energies toward unraveling the mysteries of the universe through science, not religion. Newton may have turned to religion in his later life, but as noted above, for him, his religious excursions were a natural extension of his scientific excursions. As for Einstein, he never gave up his scientific pursuits, although God continued to be a common metaphor in his writings.

Einstein and Newton: the stature of both these men was such, and their creations so lofty, that one cannot help but apply religious or spiritual connotations to them. But it should never be forgotten that they looked towards science, not religion, as a means to understand the universe and our place in it.

As for being in awe of the universe, the fact that the atoms you and I are made of were manufactured billions of years ago in furnaces in the innards of blazing and dying stars is far more profound and spiritual to me than contemplating a deity whose existence is questionable by any standards.


Monday, April 02, 2007


The "graduate kitchen", just like the "graduate lounge" and the "rabbit hole" of "Alice", is a non-existent entity. But you fantasize about it, especially in light of the dismal thing down the hall that tries to approximate it.

Every day that I trot down the hall into it to heat my lunch or get my daily coffee, I am assailed by smells at once familiar and yet so unfamiliar that even the Flying Spaghetti Monster would not be able to unravel their composition and sources. The dingy place consists of a table with unidentified food scraps on it, a giant something in a corner approximating a fridge, and a stodgy white appliance right next to it, apparently the kitchen microwave.

Each one of these objects has its own story. If you open the microwave oven door, you see what looks like an extra layer of insulation, which turns out to be a one-inch-thick coating of food that has been spilled and splattered about it over the last couple of years. But this is anything but disgusting, at least not if you don't find finely carbonized ash disgusting. That's what that layer looks like: carbon...or something that resembles it. It's not exactly carbon though, because it still has an 'organic' smell, neither pleasant nor repulsive. One dreams of what molecules could be in there...the cure for cancer or AIDS, perhaps. Looking at the specks and layers, I had an idea: scientists could use this as a model system for studying what happens to organisms and organic molecules when they are cast aloft into space and transported over long distances, bombarded by the ultraviolet and other kinds of radiation rampant in the galaxy. The specks in the microwave bear witness to this cruel assault; having been fried countless times over by microwaves, they could perhaps be similar to the molecules that arrived on early earth after a perilous journey to sow the seeds of life. Obviously, Fred Hoyle and Francis Crick must have anticipated the modern grad student lifestyle.

Then there's the fridge. Even my home fridge was a great experimental system; the Chinese roommate who vacated the apartment when I came had a crab in there that was at least six months old. The immense benefits of bleach immediately dawned upon me. A similar, but much more exalted, scenario exists in the "graduate kitchen" fridge. There is some black goo on the fridge floor, long since petrified, that reminds me of a liquid alien form in an X-Files episode. Dig deeper if you dare, and you find foods of different shapes and sizes that once actually formed part of international cuisine. It was two years ago that I stuck a notice on the fridge: "Please do not leave food inside for more than a month...or until it starts to stink". I should have realised: especially when it comes to international cuisine, the stink is in the nose of the beholder. Needless to say, I did not want my own gallon of milk to suffer such a fate. That's why I gulp it down within a week, and post a sign saying "Poison, do not ingest" on it to prevent theft. Then there's the top freezer compartment, which tests the resilience and shelf life of frozen dinners more than their manufacturers could have imagined in their wildest dreams.

In the corner of the "kitchen" sit two transparent lunchboxes. I vaguely remember two students long since graduated, once eating lunch from those lunchboxes. I want to inform them that their fungal growth experiment has worked.

But of course, it's the microwave that really interests me. I wish I could scrape off some of those particles of unknown composition and submit them to chemical analysis and spectroscopic deconvolution. Then at least, my daily visits to the "graduate kitchen" would have become a little colourful.

Having said all this, just like an unpleasant relative, the kitchen becomes part of your daily life, and so actually manages to somewhat endear itself to you. The random journal issues and magazines tossed on the table entice you to no end to stay. Some brave souls, with our department chair in the lead, occasionally head search-and-rescue operations into the kitchen. At least I wipe something up when the source of the spill is my own lunchbox or mug. But I am not one of those brave souls.


Sunday, April 01, 2007


Gaurav points us towards Bill Maher's latest segment, where he interviewed Republican presidential candidate Ron Paul. Maher is one of my favourite 'comedian-intellectuals', and I was disappointed to hear him take such a one-sided stand against Ron Paul. Most of what Paul said was right on the mark. As Gaurav pointed out, he also shows a sound understanding of American history, including the many dark deeds that the CIA has orchestrated. It was also disappointing that Maher did not agree with this accurate assessment of the role of the CIA; that makes it clear that he was there mainly to bash Paul.

Paul was also quite correct about the Civil War.
The Quakers in England had been campaigning to end slavery for many years, purely out of personal conviction, without any political agenda. Their pacifism was not the political pacifism of Gandhi. Yet, they were very effective in ending England's slave trade. The Quakers accomplished their objectives because they had limited goals. They realised that ending slavery itself was a goal better left to future generations. But ending the slave trade was a goal they could realistically achieve. They also did not see the end of slavery as punishment for slave owners. In fact, they crucially persuaded the government to actually pay compensation to the slave owners for giving up the acquisition of new slaves. The result was that the English made a smooth transition to a society free of the slave trade. The contrast with America could not be more striking. One can only speculate that a similar strategy would have been possible in America too, and that the Americans could have been spared the horrors of the Civil War. England's example bears testament to this possibility. But as Paul points out, the American leaders, and especially Lincoln, were not as interested in ending slavery as in unifying the nation. He and his associates traded blood for unification, when it could have been achieved more smoothly, if slowly.

The one place where Paul does not get it quite right is global warming. He deflects the issue by asking whether the government should invade China or try to stop volcanoes to suppress global warming. Both of these questions are extremes, and there is much that the government can do in other areas; as an aside though, Paul still recognised the key oil-induced predicament in the Middle East that the US has brought upon itself. As of now, it is extremely difficult, if not impossible, to envisage in practice how the free market can regulate global warming. As I mentioned in a past post, there are so many sources of CO2 emissions that the cap-and-trade programs that worked so well for sulfur emissions cannot be expected to work for CO2 emissions, especially in the short term. In such a case, a government tax may be the only optimal solution to curb such emissions.

I have something to say to Maher: stop calling yourself a libertarian. And I have something to say to Paul: stop calling yourself a Republican (although he is a moderate Republican by any standards). Assigning political labels consigns us to following textbook definitions in their entirety. Social and political questions demand that we do better than deal in absolutes. If one dons a political label for too long, one faces the danger of becoming a slave of that label. Long after taking such a position, changing it even for a justifiable reason could get one called a hypocrite. To avoid such pitfalls, it is better not to take any absolutist position. Not the free market, not government, not any single entity or system can be the solution to all problems. One needs to find a balance. Just like a well-made recipe, the correct political solutions need to reflect at least an effort to mix all the ingredients in the right proportion. No matter how much I like a particular dessert, an excess of sweetness (that quality which is, after all, the sine qua non of the dessert) nevertheless spoils the whole act. As in other aspects of life, moderation is the key here too.
