Showing posts with label science. Show all posts

Saturday, December 3, 2011

Freeze and thaw

Winter is a funny time of the year. There is a deluge of family-centered holidays, and shoppers flock to the 'best deals' that open up only at this time of the year. In India, nobody looks slim anymore, as toddlers are bundled in their layers of sweaters and the women bring out their expensive cardigans and shawls to the envy of the neighbors. As for the weather, depending on your geographical location, winter can be either the best time of the year or an absolute nightmare.

For me, 5 years in Mumbai destroyed whatever winter tolerance I had built up in 17 years of living in Jhansi. Winters in Mumbai were the best time of the year: no fear of dehydration, and humidity at its absolute low. Having swung between these two extremes, I was underprepared when I landed in the desert. I had been here twice in the hot, dry summer, and I knew Tucson was not very far from the west coast, so nothing prompted me to pack enough warm clothes. It turns out Tucson is more like Jaisalmer than Pune, and the cold wave outside makes me wonder about the other assumptions that go wrong all the time.

For example, we assume that politicians are committed people who will, in the end, put aside their differences and work out a mutual agreement for the greater benefit of the nation. Ironically, the educated US senators perform no better on this benchmark than the semi-literate, identity-bred politicians back home. Glance over other nations as well, and it becomes clear that politics is a dirty business to the core. Another assumption that often goes wrong is that the economy is going downhill because of reduced global spending. Never has there been more investment activity across the globe; you simply need to look at the right sector. Be it clean energy or the more conventional oil exploration, investors are flocking to the few seemingly reliable areas in both sectors. If anything, the real reason might be the lopsided nature of investments, where little money is spent on human development and more on swelling the already inflated MPLAD funds. We spend when we feel good about ourselves, or even when we want to feel good about ourselves. Having recently witnessed the Black Friday euphoria in the States, the latter certainly seemed to be the case.

The third assumption, and this one is quite close to my heart, is that Science holds the answers to every conceivable question, and that if Science cannot explain something, it is not possible. The other day, I was having a conversation with a friend about religion and faith (you know where this is going), and I put forward my best argument: that some of the practices seem to lack any logic whatsoever. Then comes the news that the OPERA project was able to reproduce its result of neutrinos breaking the Einsteinian speed-of-light barrier. We might very well be rewriting all those Physics textbooks 10 years down the line. Who knows what else might become explainable with better and more sophisticated instruments!

The lesson I take from all this is that while it is fine to assume certain things, it is imprudent to rule out the alternatives. With that, I wish my assumption of a warm day on December 11 turns out to be correct!

PS: Nobody remained young at heart longer than Dev Anand did. He was the foremost style icon the industry ever had, and a wonderful observer of new talent where others would have seen none. RIP!


Saturday, September 17, 2011

The learning of science or the science of learning?

How would you define learning? I know, probably not the best idea to ask a question right off the bat, but I want you to think along, because the question is open-ended. If the orthodox definition is to be considered, the development of new memory fits the paradigm of learning. So when people use phrases like 'learning from your mistakes', it basically means that you have developed an imprint of that misstep in your mind. However, if you gravitate towards science, the idea of learning cannot be limited to the development of new memory. Sure, one of the objectives behind the teaching and learning of science is the development of memory, based on the give-and-take between the teacher and the student. But memory alone does not amount to intuition or thinking. That would be like saying that I could watch a Nat Geo video of a Mt. Everest expedition, memorize every single step and climb, and then set out on my own to conquer it. Learning therefore goes beyond the simple idea of memory development.

So how can you define learning in a more pragmatic sense? A reasonable assertion is that uncoupling learning from mere fact-retention is what makes room for creativity. And surely we can agree that creativity is a benchmark for learning, as a well-learnt concept leads to more creative output. Also, in terms of science education, sufficient challenge to redundant concepts and hypotheses, which are taken for granted more often than not, is a must in order to develop independent thinking, or as the jargon goes, a critical bent. Therefore, as you proceed up the educational ladder, the questions should get tougher and the hypotheses should become bullet-proof.

Personally, the initiation into grad school has been a pretty exciting experience so far, because as grad students we are supposed to challenge any idea, no matter how big or small, prominent or subtle, irrespective of its publishing pedigree. Scientific education has suddenly been remolded from taking copious notes and reading the material a day before the exam, which I was doing as recently as 5 months ago (I officially stopped studying for endsems like most of us did, except for THAT one course!). Now, we go in well prepared to discuss publications, formulate ideas, and grasp concepts by linking disparate ideas in class. And this, I believe, is learning in the true sense, because intuition and creativity are both integral parts of the process. If we were able to merge this system into our school and undergraduate coursework, students would be better primed for taking up higher studies, or at least they wouldn't have the fear of the unknown. This modicum of reform could truly lead to a shake-up, or at least do some good in satisfying the innate creative urge so essential to our existence as humans.

PS: If you are a Max Payne fan (the video game, not the movie, which I still believe does not exist), brace yourself for Max Payne 3. If not, it is never too late to start, for this game is more than just fancy guns in third-person style. It is about a man and his search for vengeance!



And the obligatory music video. How about some REM?

Monday, September 5, 2011

Think like a scientist, feel like a diplomat

'Publish or Perish' seems to be the general code of law as far as the scientific community goes. The idea, originally a well-meaning push to make knowledge available to all, has come to be known as the single most dreaded facet of building a career in research. As you go up, the impact factor of the journals you publish in is expected to show a concordant rise. The instrument of knowledge is bypassed by the instrument of conformity. Is the evil of globalization to blame, or is the competitive edge that has overtaken research the main reason? A bit of both, in my opinion.


Scientific literature has seen a reversal of fortunes with advancements in information technology, and it suffices to say that it has been one of the cornerstones of the knowledge economy in the last 20 years. Digitization has resuscitated a field where journals would otherwise be forever lost on a library shelf of some university. Articles are now accessible in any corner of the earth at any time (internet connectivity is mandatory, of course, but downloaded e-journals can make up for the lack of it). The positives are there for all to see. A scientist working in a remote lab in Africa now has access to research coming out of the hallowed portals of MIT. Digitization has provided scope for constant feedback on research, and errata are highlighted far more prominently than a letter to the editor would have managed. But somehow, some journals, either by sheer luck or careful selectivity, have risen to prominence as carrying more venerable research than others.


This has led to an era of competition, where scientists vie for limited print space in journals. Some research is therefore considered more publishable than other research, and it is considered fashionable to print only in the big 4 because they seem to matter more. Sadly, many researchers face a dilemma: prolong their work and conduct more experiments to improve their chances of conquering the holy grail, or publish instantly while the results are novel and exciting. The element of curiosity is being pushed to the back-burner, as a post-college career depends more on the publications on one's resume than on the importance of the work. Now, this does not imply that the exclusivity of some journals is necessarily a bad thing. They are considered the cornerstone of cutting-edge research, a sort of benchmark, not just in science but also in the disciplines of engineering. What matters more is changing the perception, being able to differentiate between the quality of the research and the impact of the journal, especially in countries where the publication record takes a backseat to everything else: the candidate's proficiency as a team player, extra-curricular pursuits, and basic qualities like presentation skills.


Admit it: people go to graduate school because they love science. That is how it should remain throughout their careers. More and more scientists are reaching out and collaborating in order to better their chances of enhancing the quality of their work. More such ideas are certainly worth probing.


A must watch show coming this fall:



And if you haven't heard of Arcade Fire, here you go:

Saturday, August 20, 2011

A better way

As I sit here in my living room watching Watchmen, I cannot help but think about Dr. Manhattan's words: "The world is changing, and this new world is going to be hard to adapt to." The world is indeed at a precipice, where old ideas are being chucked out more rapidly than ever before. It is therefore prudent to identify the positives in this, as well as to understand how some things remain unchangeable.

The first idea that springs to my mind is the communication revolution. Much has been said, and writers have waxed eloquent, about the ever-changing face of communication and its implications for the globe. I would like to go more basic than an optical-fibre-facilitated enhancement. 20 years ago, chances are that if you wrote a letter to a person in the next state, it may or may not have been delivered. May or may not! That uncertainty was compounded by the fact that there was no tracking mechanism to check on the port-to-port delivery. So if I were applying for a PhD in the United States 20 years ago, there is the possibility that I might have missed my shot due to a sorting error. It is therefore safe to say that manual errors are slowly becoming a thing of the past. The era of the machines is here, or almost.

What are the other ways in which the world is changing? Ironically, the communication revolution and the dismantling of barriers were supposed to unite the world into a more harmonious place. Yet we have seen more wars and insurgencies in the last 50 years than in any documented time in history. The way I see it, with information so easily available, a nation is abetted in securing its interests in ways that were unheard of earlier. So you have things like cyber attacks on a country's information databases, which are slowly replacing conventional warfare as a more potent weapon. This is the stuff of science fiction for now, but with the way technology is seeping into our daily lives, it is not ludicrous to believe that the future might be significantly different for the coming generations. One thing is for sure: education is going to be the currency of development, and investments will be earmarked for national intellectual wealth.

And then, to the part which has remained static. The right to expression, our ability to speak our mind, has remained untouched, or rather, has been bolstered by the availability of tools both scientific and rational in nature. People today want to study tiny archaea because they believe the answers to significant questions about life lie in there, and they go about doing it like nobody's business. That basic science has braved the storm of application and commercialization is a fact oft overlooked. We as global citizens are also more empowered today, mostly because every vote is documented and every opinion is recorded. The dissemination of ideas is easier, and their harnessing is slowly becoming more inclusive. There is no single ownership of any idea anymore, a fact well illustrated by the IP wars being fought around the world.

So the changes are there to be seen and exploited. The end user will not care, the proprietor does not exist.

The Who and their generation!