
Tuesday, August 15, 2023

Brain: the long and short of it requires a very long response.

 

Whenever any discussion, debate, or detailed write-up happens, especially about the brain, irrespective of the author’s scholarship, erudition, or expertise, it ends up throwing up more doubts and generating more questions than it answers with clarity in a broader perspective.

It stimulates further thinking and seeking; so far, so good.

It may be so because, fundamentally, the approaches, analyses, and attempts to understand how the brain functions are stuck either in studying its purely physiological aspects, or in studying its impact or involvement in terms of hyper-generalized functions, or, as in the case of this author, in studying how to infer relationships that emanate from some parts of the brain or end up being assigned to certain parts of the brain.

Brain, mind, consciousness, memory, knowledge, thoughts, feelings, emotions, understanding, various types of intelligences, instincts, intuitions, reasoning, analysis, imaginations, illusions, dreams, fantasies, fictions, logical thinking, scientific studies, mathematically decipherable patterns: each of these has multiple dimensions, each dimension has its own dynamics, and they acquire new dimensions and generate new dynamics in the course of functioning, interacting, or relating with one another or with any external thing, event, person, issue or circumstance. I may have missed out many things here. Artistic creativity as well as unexplainable heinous crimes may be the result of some of these aspects.

We cannot conflate and confuse things further by trying to describe the left and right sides of the brain in isolation [in my opinion]. This is like a soup made from multiple vegetables, fruits, nuts and flours, all well powdered and mixed, with multiple tastes, aromas and colours, where the individual ingredients are tough to notice, much less to credit for their specific contribution to the soup.

 

Therefore, trying to shrink the whole gamut of faculties and functions into intellectual silos, and, still worse, attempting to communicate them verbally and convincingly, is not only a difficult task but a futile exercise.

That is why scientists and philosophers have taken up those sections that they could study, simplify and explain.

 

Religions and rules have cropped up to fill certain functions.

Life is after all not mere chronology of existence but a meaningful topology of relationships.

 

Most write-ups on the brain and its functions remind me of what Voltaire said: “Four thousand volumes of metaphysics will not teach us what the soul is”.

 

On religion

[Erich Fromm: “humans have a need for a stable frame of reference. Religion apparently fills this need. In effect, humans crave answers to questions that no other source of knowledge has an answer to, which only religion may seem to answer. However, a sense of free will must be given in order for religion to appear healthy. An authoritarian notion of religion appears detrimental.”


William James said it well: "The science of religions would forever have to confess, as every science confesses, that the subtlety of nature flies beyond it, and that its formulas are but approximations."



William James also writes "That the mind itself has a higher state of existence, beyond reason, a superconscious state, and that when the mind gets to that higher state, then this knowledge beyond reasoning comes.... All the different steps in yoga are intended to bring us scientifically to the superconscious state or samadhi.... Just as unconscious work is beneath consciousness, so there is another work which is above consciousness, and which, also, is not accompanied with the feeling of egoism.... There is no feeling of I, and yet the mind works, desireless, free from restlessness, objectless, bodiless. Then the Truth shines in its full effulgence, and we know ourselves- for Samadhi lies potential in us all- for what we truly are, free, immortal, omnipotent, loosed from the finite, and its contrasts of good and evil altogether, and identical with the Atman or Universal Soul." *



* My quotations are from VIVEKANANDA, Raja Yoga, London, 1896. The completest source of information on Yoga is the work translated by VIHARI LALA MITRA: Yoga Vasishta Maha Ramayana, 4 vols., Calcutta. 1891-99.]

 

 

 

We cannot experience life through past templates or future projections; we can only experience it as a live activity in the present situation or context.

 

[The human brain’s ingenuity in hacking or hijacking any system or control, and its ability to circumvent any regulation, is its unique feature; whether it is right or wrong is judged by the outcomes or intentions.

 

There are many such factors, like imagination, dreams, creativity, and the confluence one can experience under anaesthesia (I experienced it in December), when there is collusion between forgotten or unknown subconscious feelings and semi-consciously aware conditions, all moving around vividly as if in a fiction movie; yet even the vague memory of it all fades away once the effect of the anaesthesia slowly wears off.

 

Whatever little I can recollect of that one hour is like a real fiction movie.]

 

[Kathleen Taylor’s ‘The Brain Supremacy: Notes from the Frontiers of Neuroscience’ starts off:

 

“Science and technology are also changing their nature—and ours.”

 

“Brain research is already changing our sense of what being human involves, rejecting the age-old idea of a spiritual essence in favour of an organic approach. This is what the feared materialism of modern science tells us. Brains are the pieces of meat which give us our selves, allowing you and me to exist as the people we are. Without them there would be no music, beauty, poetry, or science. There would be no vicious murder or despairing suicide either; but also no joy of sex, no delight in nature, no pleasure in getting lost in a really good book.

Everything meaningful in your life and mine needs a cranial pudding to express itself, and each of those puddings is unique, irreplaceable and still mysterious. Brains are astonishing, beautiful, intricate, delicate marvels. Like human lives, they are good things in and of themselves.

 

If you were ill, and needed a heart transplant to save your life, would you accept one? Most people would; they feel that having a different heart wouldn’t disrupt their sense of personal identity. How about a brain transplant? If your brain were removed and put into storage to make room for a new, younger cerebrum, would you be in the body or in the storage? What if all your former synaptic settings were copied across to the new brain? Or if only part of it—the cortex—were transplanted? These thought-experiments and others suggest that we identify ourselves with our brains in a way we don’t with other parts of our bodies. Practical experiments, ethical and otherwise, suggest that we are right to do so. We can swap hearts, lose a kidney, cope without hands or eyes, and still be human, but remove the brain and what’s left is a kind of desecration: manmade meat.”

 

 

“The power of self-fashioning.

As well as shaking up our ideas of what we are, the brain supremacy promises unparalleled techniques for changing brains directly: not with language or images or drugs, or new gadgets to play with, but by altering the behaviour of neurons and the function of their genes.

 

Of course, brain manipulation isn’t novel; we do it indirectly all the time and we always have. The social power which bends others to your will is so greatly valued that pursuing it is one of humanity’s great occupations.

 

With tongues and guns, ideals and incentives, persuasion and pressure and sheer propaganda, human beings have had a lot of practice in treating others, pace the strictures of Immanuel Kant, instrumentally: as means to an end, objects to be utilized and adjusted, rather than individuals who are ends in themselves. And the methods we use affect our brains and bodies. Drugs change your genes. So do stressful events, meals eaten, and conversations. Yet we often fail to achieve the changes we want. To date, attempts to control other human beings have faced a mighty obstacle: the bony castle of the skull. That barrier has never been invincible—bullets or an axe will penetrate it—but it has kept out many less violent and crude attacks. Barred from the inner sanctuary of the brain, we were left with the evolved skills of social interaction and the knowledge built upon them: psychology, anthropology, history, literature. That, plus rare neurological patients, years of detailed observation of human behaviour, and what we had learned from studying the brains and behaviours of other species. The idea of an equivalent capacity to that of, say, modern chemistry applied to the management of other human beings is therefore a tremendously attractive prospect, particularly for those people and institutions tasked with managing or predicting human behaviour.”]

 

[BUILDING A BETTER BRAIN

Life magazine - July 1994, page 62
By Daniel Golden and Alexander Tsiaras
Editor’s Note: This is an excerpt from an article in Life magazine pointing to the research being done on ways to stimulate and increase brain power into old age. This is the article that was the original impetus for creating this web site.

Evidence is accumulating that the brain works a lot like a muscle -- the harder you use it, the more it grows. Although scientists had long believed the brain’s circuitry was hard-wired by adolescence and inflexible in adulthood, its newly discovered ability to change and adapt is apparently with us well into old age. Best of all, this research has opened up an exciting world of possibilities for treating strokes and head injuries -- and warding off Alzheimer’s disease.
The party last year was as rowdy as it gets in a convent. Celebrating her 100th birthday, Sister Regina Mergens discarded her habit in favor of a daring red gown, downed two glasses of champagne and proclaimed her intention to live to 102. She didn’t quite make it. Now, at vespers on a March afternoon in Mankato, MN, dozens of nuns file past the open casket where Mergens, 101, lies, rosary beads in her hands.
Concealed from view is an incision in the back of Mergens’s head through which her brain has been removed. Mergens and nearly 700 elderly sisters in her order are the largest group of brain donors in the world. By examining these nuns, as well as thousands of stroke victims, amputees and people with brain injuries, researchers are living up to the promise of a presidential proclamation that the 1990s be the Decade of the Brain. Scientists are beginning to understand that the brain has a remarkable capacity to change and grow, even into old age, and that individuals have some control over how healthy and alert their brains remain as the years go by. The Sisters of Mankato, for example, lead an intellectually challenging life, and recent research suggests that stimulating the mind with mental exercise may cause brain cells, called neurons, to branch wildly. The branching causes millions of additional connections, or synapses, between brain cells. Think of it, says Arnold Scheibel, director of UCLA’s Brain Research Institute, as a computer with a bigger memory board: "You can do more things more quickly."
The capacity of the brain to change offers a new hope for preventing and treating brain diseases. It helps explain why some people can:
• Delay the onset of Alzheimer’s disease symptoms for years. Studies show that the more educated a person is, the less likely he or she is to show symptoms of the disease. The reason: Intellectual activity develops brain tissue that compensates for tissue damaged by the disease.
• Make a better recovery from strokes. Research indicates that even when areas of the brain are permanently damaged by stroke, new message routes can be created to get around the roadblock or to resume the function of that area.
New knowledge about the brain may emerge from the obscure convent in Minnesota, a place where Ponce de Leon might have been tempted to test the waters. Mankato is the site of the northwest headquarters of the School Sisters of Notre Dame, where a long life is normal. In part because the nuns of this order don’t drink much, smoke or die in childbirth, they live to an average age of 85, and many live far beyond that. Of the 150 retired nuns residing in this real-life Cocoon, 25 are older than 90.
But longevity is only part of the nuns’ story. They also do not seem to suffer from dementia, Alzheimer’s and other debilitating brain diseases as early or as severely as the general population. David Snowdon of the Sanders-Brown Center on Aging at the University of Kentucky, the professor of preventive medicine who has been studying the nuns for several years, has found that those who earn college degrees, who teach, who constantly challenge their minds, live longer than less-educated nuns who clean rooms or work in the kitchen. He suspects the difference lies in how they use their heads.
Within the human brain each neuron contains at one end threadlike appendages called axons, which send signals to other nearby neurons. At the other end of the neuron are similar threadlike appendages called dendrites, which receive messages from nearby cells. Axons and dendrites tend to shrink with age, but experiments with rats have shown that intellectual exertion can spur neurons to branch like the roots of a growing tree, creating networks of new connections. Once a skill becomes automatic, the extra connections may fade, but the brain is so plastic that they can be tapped again if needed. Like the power grid of an electric company, the branching and connections provide surplus capacity during a brownout. Snowdon and some neuroscientists believe that people with such surplus who find their normal neural pathways blocked by the tangles that characterize Alzheimer’s disease can reroute messages. To be sure, every brain is limited by genetic endowment, and flexibility does decrease with age. But new thinking in brain science suggests that whether someone hits that wall at age 65 or at age 102 may be partly up to the individual.
Professor Snowdon says the nuns of Mankato demonstrate this. He expects to prove that the better-educated sisters have significantly more cortex and more synaptic branching of neurons than their less-educated counterparts, which would allow the former to cope better with Alzheimer’s disease, dementia and stroke. Brain exercising is a way of life at the nunnery, where the sisters live by the principle that an idle mind is the devil’s plaything. They write spiritual meditations in their journals and letters to their congressmen about the blockade in Haiti, and do puzzles of all sorts.... One 99-year-old, Sister Mary Esther Boor, takes advantage of slow minutes while working as the complex’s receptionist to solve brainteasers -- some with words in Spanish.
What can the average person do to strengthen his or her mind? The important thing is to be actively involved in areas unfamiliar to you, says Scheibel, head of UCLA’s Brain Research Institute. "Anything that’s intellectually challenging can probably serve as a kind of stimulus for dendritic growth, which means it adds to the computational reserve in your brain."
So pick something that’s diverting and, more important, unfamiliar. A computer programmer might try sculpture, a ballerina might try marine navigation. Here are some other stimulating suggestions from brain researchers:
"Do puzzles. I can’t stand crosswords," says neuroscientist Antonio Damasio of the University of Iowa, "but they’re a good idea." Psychologist Sherry Willis of Pennsylvania State University says, "People who do jigsaw puzzles show greater spatial ability, which you use when you look at a map."
And remember, researchers agree that it’s never too late. Says Scheibel: "All of life should be a learning experience, not just for the trivial reasons but because by continuing the learning process, we are challenging our brain and therefore building brain circuitry. Literally. This is the way the brain operates."
This article also discusses the enigma of phantom limbs and how the brain continues to register impulses due to synaptic connectivity long after the limb itself is gone. If you are interested, you can probably pick up a copy of this article at any library. The pictures are excellent, and this is information that everyone should be aware of.]

 

 

[Brain culture: interview in the Indian Express, Chennai
Monday, September 19, 2005, 11:49 IST

Dr Vilayanur S Ramachandran

One of the most efficient scientists in the world today, Dr Vilayanur S Ramachandran is Director, Center for Brain and Cognition, and Professor of Neurosciences at the University of California. He has received many honours, including a Fellowship at All Souls College, Oxford, the Ramon y Cajal award from the International Neuropsychiatry Society, the Presidential Lecture Award from the American Academy of Neurology, and two honorary doctorates. Newsweek named him a member of the Century Club, one of the 100 most prominent people to watch in the 21st century. En route to London, where he will be awarded the 2005 Sir Henry Dale Prize and a life Fellowship by the Royal Institution, Dr Ramachandran spoke to The New Sunday Express in Chennai.

Who would you name in particular as your scientific heroes?

Michael Faraday and Thomas Huxley. Faraday moved a magnet to and fro within a coil of wire and linked two entire fields of physics: electricity and magnetism. I learnt that there’s no correlation between the sophistication of methodology or technology and the importance of the result. Maybe this is what has given me my perverse streak. I like doing experiments which make my colleagues go: ‘‘Why didn’t I think of that?’’ I also admire Huxley - for his overall approach, for his wit and pugnacity, and for bringing science to ‘‘the common people’’ (his phrase) without dumbing it down.

Also the unknown Indian genius in the first millennium BC who combined the use of place value, base 10 (which was more practical than the Sumerian 60) and, most importantly, zero as an independent number and place holder. This marks the dawn of mathematics.

Any modern scientific heroes?

Norm Geschwind and Francis Crick, both of whom have had more sheer FUN doing science than anyone else I know.

What’s the best advice you’ve ever got for doing research?

All from Francis Crick. First, sheer intellectual daring - chutzpah. It is better to tackle 10 fundamental problems and solve one than tackle 10 trivial ones and solve all. Fundamental problems are not NECESSARILY more difficult, inherently, than trivial ones. Nature isn’t conspiring against us to make fundamental problems more difficult.

Second, don’t become trapped in small cul-de-sacs of specialization because you feel comfortable or your immediate peers reward you for it. Strive not for pats on the back from the majority of colleagues but only for the respect and admiration of those few exceptional people at the top of your field whom you genuinely hold in high esteem.

Do you enjoy writing about science?

Yes, and I feel I am reviving a venerable tradition. In Victorian times, it was not merely acceptable but fashionable to make science accessible to the common people. Lay audiences flocked to listen to Humphry Davy, Huxley and Faraday lecturing at the Royal Institution. Inevitably one runs the risk of inadvertently oversimplifying some of the concepts and offending some experts.

Which of your own scientific achievements are you especially proud of?

It’s hard to name a favourite. I have pursued two parallel careers - one in visual perception and the other in neurology. In vision, I have always tried to link psychology to biology, especially to known anatomy and physiology and to evolution. I discovered several new visual illusions which have excited the interest of physiologists and researchers and resulted in a renaissance of interest in perceptual illusions - a neo-gestalt revolution in vision.

But in the last 12 years, my work has mainly been in behavioral neurology or cognitive neuroscience. Our overall strategy has been to bring old clinical ‘‘curiosities’’ from the clinic to the lab and show that they can provide fundamental insights into normal brain function. For example, our work on phantom limbs has shown that a massive reorganization of sensory pathways occurs after amputation, and we were able to correlate phenomenology with brain imaging data.

More recently, we studied synaesthesia, a condition in which people get their senses muddled up. They ‘‘see sounds’’. We showed that it was a genuine sensory effect, discovered the brain regions involved, and pointed out its relevance to understanding metaphor and creativity. Indeed, one can go from synesthesia genes to brain anatomy to perceptual phenomenology all the way to metaphor and Shakespeare in a single ‘‘preparation’’.

And what honours or awards are you especially proud of?

I’d have to say the BBC Reith Lectureship. Many previous Reith Lectures have been turning points in Western civilization and were given by my boyhood heroes - Bertrand Russell, Peter Medawar, Robert Oppenheimer and Arnold Toynbee.

What would you say are the key problems in your field?

Well, first, what is the basis of abstract thinking? How did it evolve? I mean, how do we use neurons to sequentially juggle ideas in our heads? As when you say A is bigger than B, B is bigger than C, therefore A must be bigger than C. Is that transitivity, a deduction, learned through induction and empirical observation? If so, is this acquired through learning or hardwired through natural selection?

The second big question is consciousness. Crick and Koch galvanized the scientific community by daring to suggest (correctly, I believe) that it is a tractable scientific question.

But I disagree with their specific view that there are ‘‘consciousness neurons’’. I think consciousness arises not from individual neurons or from the entire brain but from small specialized circuits unique to (or very highly developed in) humans. This allows the brain to create an explicit metarepresentation of earlier sensory representations that we share with lower primates. This is accompanied by a sense of agency and self and especially to that uniquely human feature - knowing that you know or that you see, or knowing that you don’t know. These abilities are all closely interdependent in a way that we don’t yet clearly understand.

What are the major ethical issues raised by biology today?

Well, there are the usual ones, but there’s no need to go into them in any detail. There’s the fear of cloning, but that’s absurd, because clones already exist - they are called identical twins! There’s also the fear that recent advances in molecular biology might lead to ‘‘genetic engineering’’, but that too is absurd. Because, as pointed out by Medawar, we have already had the resources and capacity to do this for centuries, using the far simpler technique of selective breeding. We haven’t done it on humans for ethical reasons, and the same ethical reasons will prevail when we consider chemically induced genetic engineering in the future.

Few people realize that the Nazi movement - the desire to create an albino, blue-eyed master race - didn’t begin in Germany. It began in Cold Spring Harbor in America two decades earlier and was later adopted by Hitler. The eugenics movement, spearheaded by Charles Davenport, resulted in the mass sterilization of thousands of ‘‘imbeciles, prostitutes, criminals, homosexuals and epileptics’’, especially in Virginia. Even alcoholics were sterilized. If such a law were in place today, it would involve the sterilization of 8 per cent of Americans (the incidence of alcoholism in the US), and that includes the president!

The ‘‘research’’ that all this eugenics was based on was funded by the Rockefeller Foundation. The Goddard commission also administered IQ tests to immigrants in the 1920s and 1930s and forbade the immigration of thousands of Jews on the grounds that they were ‘‘feeble minded’’.

It's a bit ironic that, since that time, the ratio of Jewish vs. non-Jewish Nobel prize-winners has been about 10 to 1. Combine this with the fact that Jews constitute only 5 per cent of the US population, and you will see that if you are Jewish, you have 200 times greater chance of winning a Nobel than if you are a WASP or indeed any other ethnic strain!

Davenport and Goddard would have had a far more positive impact on the US had they measured nose sizes instead of IQ and only admitted people with large noses. Their enterprise wasn’t all that different, if you think about it, from Mengele’s attempt to make Jews more intelligent by injecting blue dye into their irises. Yet Mengele was considered a monster to be hunted down, whereas Davenport and Goddard have been conveniently erased from the American conscience. Even though the Holocaust began with them and not with Hitler. Nor are other countries immune to such crimes. Equally heinous acts are committed in India in the name of ethnic origin, caste and other misguided views on intrinsic superiority. There is simply no excuse for the way ‘‘mainstream’’ Indians treated ‘‘tribals’’ - Adivasis - in the past.

So what is the difference, then, since it is not genetic?

I think the difference is cultural. After all, Jews and Arabs are genetically almost identical, so if there are very few Arab Nobelists, it is almost certainly a temporary historical phenomenon. They had their heyday between the 5th and 12th centuries, when they were vastly superior to Europeans. Baghdad was the most civilized city in the world - a great center for culture and learning. Al-Khwarizmi (who translated Aryabhatta’s mathematics texts into Arabic and from whose name the word ‘‘algorithm’’ is derived) coined the word ‘‘algebra’’. Omar Khayyam, the poet who wrote the Rubaiyat, also discovered the famous binomial theorem. A single quatrain from the Rubaiyat is sufficient evidence for the glory of the Persian civilization.

My reason for bringing up nose size was to parody unidimensional measures of human ability such as IQ tests. My goal is to undermine WASP ideas on racial inequality on their own terms, using their own arguments. My point is that extraordinary achievement correlates much more highly with ‘‘nose size genes’’ than with albinism genes!

But isn't that kind of racism a thing of the past? And aren’t universities such as those you represent insulated from it?

No, it is in fact quite widespread. Even now and even in the hallowed halls of academia. While most of our faculty (within the University of California) are enlightened, there are many failed academics and bigots who get themselves into administrative positions and committees on campus and become a real nuisance, denying opportunities to younger colleagues. As Sherlock Holmes told Dr Watson: ‘‘Mediocrity knows nothing higher than itself. It requires talent to recognize genius’’.

Are there any special circuits that are unique to humans?

My point is that the human brain is primarily an organ of cultural sophistication and diversity. It is this trait above all that makes us absolutely unique in the animal kingdom. Let us not forget that war, imperialism, racism and neocolonialism are also examples of neuropsychiatric ailments. The best way to cure them is not through political intervention alone but by treating them as brain disorders - something that can only be achieved through a deeper understanding of the brain.

I have been reading a biography of Gen. Dyer, who was celebrated as a national hero by the British for having massacred 700 innocent people, including women and children, during a peaceful meeting in Amritsar. Judging from his symptoms, Dyer, in my humble medical opinion, probably had a form of sociopathic behavior caused by frontal dysfunction. For example, biographers say he was shy, had great difficulty making eye contact, had a perpetual blank stare and a volatile temper - all characteristic of frontal dysfunction. He may have had a genetic deficit, or he may have been born with congenital neuro-syphilis from his mother if she had slept with her pankha waala! (Although I have examined photos of Dyer, and there is no depressed nasal bridge caused by erosion of cartilage by spirochetes, one of the cardinal signs of congenital syphilis.)

Are there any other bioethical issues that might emerge more directly from brain research?

More directly relevant to neuroscience is a second ethical dilemma that will emerge 300-500 years from now, when we completely understand the brain. Imagine a neuroscientist can transplant your brain into a vat - a culture medium - and artificially create patterns of activity which will make you feel like you are living the lives of Francis Crick, Bill Gates, Hugh Hefner, Mark Spitz and a dash of Mohandas Gandhi, while at the same time retaining your identity.

Given a choice, would you rather pick this scenario or just be the real you? Ironically, most people I know, even scientists who are not religious, pick the latter on the grounds that it is real. Yet there is absolutely no rational justification for this choice, because in a sense you already are a ‘‘brain in a vat’’ - a vat called the cranial vault, nurtured by cerebrospinal fluid and bombarded by photons. All I’m asking you is: ‘‘Which vat do you want?’’ and you pick the crummy one!

There is a sense in which this is the ultimate ethical dilemma. I confess I would myself pick the ‘‘real’’ me, perhaps because of a foolish sentimental attachment to my present reality or because I secretly believe there is something more, after all.

I gave the Alfred Deakin lectures in Melbourne this year; the other lecturers were E O Wilson and Sir Gustav Nossal. Wilson gave an eloquent lecture. The gist of it was that we have spent three billion dollars on the Human Genome Project and should therefore allocate at least the same amount to identify, catalog and classify all the species on the planet - as a prelude to preserving the diversity of the gene pool for posterity.

I disagree. Or let me rephrase; I think Wilson’s goal is worthwhile but a bit skewed. I find it ironic that there is so much time, effort and money spent on protecting obscure species in the interest of genetic diversity, in stark contrast to what’s spent to protect cultural diversity on the planet.

Everywhere in the West I hear talk of resurrecting the mammoth or the recently extinct Tasmanian tiger. Cloning the latter from a dead tissue sample might cost a few million. Sure, it would be fun to revive the Tasmanian tiger, but what about the Tasmanians themselves? I mean the humans! The last one was paraded as an ethnological curiosity in London in Victorian times. We know nothing of their culture - evolved over thousands of years - of their poems, their Gods, their religion, their customs, their art or their language. And we never will. My point is not that the Tasmanians were the proud bearers of a sophisticated culture; I don’t know if they were. My point is that we will never know.

In America hundreds of thousands of dollars have been spent on protecting the bald eagle, but not even a fraction of that on the Cherokees or Apaches. Not to pick, again, on Wilson - he is a distinguished scientist and well-meaning gentleman. But I would wager that while he has cried his heart out for the ivory-billed woodpecker, he has shed not a single tear for the proud Apaches.

India has hundreds of such tribes, all of which will become extinct - like the Tasmanians - or assimilated and homogenized by ‘‘mainstream’’ Western lifestyles. I am not saying this to be politically correct; I have no interest in politics. I am saying it because cultural diversity is arguably the single most important trait that makes us human. Through it we have become Lamarckian rather than Darwinian creatures. We are actually hardwired to acquire culture.

The brain has become symbiotic with culture. Such is the power of cultural innovation and transmission that a single idea such as using zero in computation and as a place marker combined with the notion of place value can represent a turning point in human civilization. (The average Roman scholar or Englishman, an Ostrogoth or Visigoth during the 2nd century AD, would have required an entire wall and about half an hour, using Roman numbers, to multiply 329 by 219; an Indian peasant could have done it in 20 seconds.)

A child would become barely human if it is raised by wolves in a cave or in a culture-free environment like Texas. To be human is to be cultured, and nothing takes priority over that. It is not for moral reasons, but in the interest of our species that we need to preserve and celebrate the cultural diversity of humans.

It’s a sad thing that every Indian child learns of Shakespeare in school, yet not a single schoolboy at Eton or Winchester has even heard of Kalidasa’s Shakuntala, let alone read it in the Sanskrit original! (It may be recalled that Justice William Jones, the founder of linguistics and one of the few truly enlightened Englishmen, pointed out that Latin and Greek are effete offshoots of Sanskrit.)

Consider what has happened to American Indians since cowboys took over their land. Many thousands died, but even worse, they were stripped of their culture, their identity, the very meaning of their existence. Corralled into reservations and stripped of their dignity, many became alcoholic - their lives had no meaning anymore. All this has nothing to do with their intellectual ability; they are of Mongoloid stock whose IQ is, if anything, 10 points higher than that of cowboys! The progressive erosion of their lifestyle and taking to drink was a direct consequence of marauding cowboy hordes with guns, yet many an American has had the gall to suggest to me - based on the flimsiest evidence - that they have ‘‘alcoholism genes’’.

My deepest fear is that there will come a time when the whole world succumbs to the inevitable onslaught of corporate homogenization - the modern equivalent of cowboys driving Indians close to extinction. We need a new word for cultural genocide - the word doesn't exist yet, but the phenomenon itself is extremely widespread. Everyone will be wearing Nike, playing Pokemon and consuming hamburgers.

What we now call ‘‘cultures’’ or even ‘‘countries’’ or ‘‘civilizations’’ will shrink progressively into little theme parks and museums for our children to gawk at. When I first came to the US, I was surprised that American Indians were kept in ‘‘reservations’’ and fed alcohol. (‘‘By their own choice,’’ one well-meaning American lady told me!) I thought reservations were for animals! I have yet to see a real live American Indian outside a reservation, although I have seen plenty of wooden statues of them in antique shops.

If you want to know what this brave new corporate world culture will be like 50 years from now, just look at Las Vegas today. It's a microcosm of America - the ultimate monument to corporate greed, vanity and sterility of the imagination. (Compare it, if you will, with Ellora or the Taj in India, or Venice in Europe.) They have a mock-up Disneyland version of Egypt and even one of Venice - and I am told there’s one on the Taj in progress. (And remember, these are for adult Americans, not children!) There is more beauty, more poetry and intrinsic value in a single Chola bronze or a Carnatic music concert than all the hotels of Las Vegas.

This complete lack of cultural sophistication was even more true of the British in Colonial India. While the small ‘‘first wave’’ of Englishmen were truly enlightened, most of those who came later were barely educated lower-crust tradesmen and aspiring ‘‘white nabobs’’ who looted India’s fabulous treasures and exploited her natural resources. They rationalized to themselves that they were ‘‘civilizing’’ the natives by building railway lines and bridges.

The much-admired Victorian statesman and orator Lord Thomas Babington Macaulay is said to have remarked that the goal of colonial expansion should be to ‘‘convert every Indian to an Englishman’’. Little did he realize that if Aryabhatta or Kalidasa had visited England in the early first millennium AD, they would have said the same thing about Macaulay’s ancestors, little realizing that the descendants of these very same albino savages would one day give birth to Shakespeare and Newton.

Actually, India did have a civilizing influence - on the whole of South-East Asia, Thailand, Burma and even China, starting from the time of Ashoka. Buddhism and Hinduism spread to all these countries and resulted in the ‘‘Indianization’’ of their artistic and cultural values. But all this was achieved through peaceful means by blending Indian and Far Eastern cultures. The result was the harmonious, many-splendoured, multi-cultural ethos that we now see in Indonesia, Thailand and Burma - not a homogenization or Disneyfication.

Are you saying that the West - especially the US - has contributed nothing to civilization?

No, I’m not saying that. America had its halcyon days - the golden age of Edison, Franklin and Tesla, when Yankee innovation was the envy of the world. Later, in the 1950s and 1960s, American universities were the best in the world, with more Nobel laureates per capita than any other country except the UK. (The master of my college in Cambridge often pointed out that Trinity alone has had more Nobels than the whole of France!) My institution has had 11, Berkeley 15.

But those days are long gone, no more a part of the present reality of America than the Gupta period is of India. So in a sense my remarks should be seen as a clarion call to young Americans and Indians to revive the Golden Age. It is still entirely within their reach, so long as they escape the clutches of corporate interests which now govern the media.

My comments are also intended to set the record straight and put things in perspective in order to neutralize the Eurocentric bias of Western historians. Contrary to what American schoolchildren learn, curry, holy men, yoga and snake charmers are not the only things India gave the world.

In singing the glory of India, however, I have no desire to provide ammunition to ultra right-wing Hindutvas. Yes, to India we owe linguistics as a science, chess, and much of mathematics - including the number system, trigonometry and algebra. But they didn't invent everything. Geometry, for example, was mainly Greek. Even though the Indians ‘‘knew’’ the Pythagoras Theorem a millennium before Pythagoras, they did not bother to prove it. To Euclid, and Euclid alone, we owe the concept of proof - of deducing a whole edifice from a set of simple axioms.

Even more amazingly, neither Indians nor Greeks understood the concept of an experiment. We had to wait for an Italian, Galileo, for that. Until he came along, people simply didn’t understand that the only way to unravel Nature’s secrets is to vary only one thing at a time, keeping everything else constant. Both Indians and Greeks were arrogant; they considered themselves too smart to get their hands dirty. Why do experiments when you can figure it all out in your head?

Aristotle relied on ‘‘common sense’’ to claim that heavy objects fall faster to the ground than light ones. People accepted his view for two millennia even though anyone - even a schoolboy - could have disproved it in 10 minutes by dropping a heavy stone and a pea simultaneously from the top of a building. Yet no one, until Galileo, tried the experiment.

Indeed the very notion of an experiment is alien to the human mind, whether Greek, Indian, English or Chinese, and for that reason, Galileo is rightly regarded as one of the supreme geniuses of the human race. I realize that the ‘‘great man’’ theory of history is unfashionable among left-wing social scientists and historians, but I can assure them that without the likes of Euclid, Panini, Aryabhatta and Galileo, there would be no modern science. If Kanaka’s ship - which carried Aryabhatta’s treatise on board and sailed to Baghdad from India in the 6th century - had capsized in a storm, what would the world be like today?]

 

[Brain plasticity or growth



 

This was in response to the April 3, 1997, issue of Nature, which has an article by Bruce McEwen of Rockefeller University. The article states, "...significantly more new neurons exist in the dentate gyrus of mice exposed to an enriched environment compared with littermates housed in standard cages." The Nature article suggests that this is biological confirmation of the importance of education and contradicts the previous dogma that the number of active brain cells is essentially fixed early in life. Similar tests were performed in the 1970s by psychologist William Greenough at the University of Illinois, and they reached the same conclusion.

Roger Penrose in his book The Emperor's New Mind describes the relevance of synaptic firing in the phenomenon of brain plasticity. He states, "It is actually not legitimate to regard the brain as simply a fixed collection of wired-up neurons. The interconnections between neurons are not in fact fixed but are changing all the time. I am referring to the synaptic junctions where the communication between different neurons actually takes place. Often these occur at places called dendrite spines, which are tiny protuberances on dendrites at which contact with synaptic knobs can be made. Here, 'contact' means not just touching, but leaving a narrow gap (synaptic cleft) of just the right distance - about one forty-thousandth of a millimeter. Now under certain conditions, these dendrite spines can shrink away and break contact, or they (or new ones) can grow to make new contact."
It is estimated that you have about one hundred billion neurons in your brain, about ten billion of which are in your neocortex. It has been speculated that you lose about one thousand neurons each day after you reach forty. Research is finding that this loss can be offset by stimulating the brain regularly. A neuron is not like a simple relay circuit: whether it fires or not depends on a complex interplay of many inputs. These can be excitatory or inhibitory influences from the neurons surrounding it, or from the extracellular fluid that fills the synaptic gap. If a neuron doesn't get enough excitatory input from the neurons connected to it, or gets too many neurotransmitters that inhibit neural action, it will do nothing.
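The summing behavior described above can be sketched as a toy threshold unit. To be clear, the weights, inputs and threshold below are illustrative numbers, not physiological values; a real neuron integrates inputs in far more complex ways.

```python
def neuron_fires(inputs, weights, threshold=1.0):
    """Toy integrate-and-threshold unit: excitatory inputs carry positive
    weights, inhibitory inputs negative; the cell fires only if the
    weighted sum of its inputs clears the threshold."""
    net = sum(w * x for w, x in zip(weights, inputs))
    return net >= threshold

# Enough excitatory drive -> the neuron fires
print(neuron_fires([1, 1, 1], [0.5, 0.5, 0.5]))   # True (net = 1.5)

# One inhibitory input pulls the sum below threshold -> it stays silent
print(neuron_fires([1, 1, 1], [0.5, 0.5, -0.5]))  # False (net = 0.5)
```

The point of the sketch is only that firing is a function of the *balance* of inputs, not of any single connection - exactly why the text says a neuron "is not like a simple relay circuit."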
Other research has found that if a neuron is being used, it secretes substances that affect nearby cells responsible for the neuron's nourishment. These cells, in turn, produce a chemical that appears to preserve the neuron from destruction. If the neuron does not get that substance, it dies.
In concert with this effect, Leif Finkel and Gerald M. Edelman of Rockefeller University have discovered that neurons do not act randomly but as a network. They tend to organize themselves into groups and specialize for different kinds of information processing. For example, when a touch stimulus comes in from the finger, it first enters the neural network. The information activates some groups of neurons more than others, and this high level of activity causes the connections among the group of excited neurons to be reinforced. As more and more similar patterns come through the network, the connections among the activated group of neurons become stronger and stronger, and eventually the group becomes specialized for processing that one finger's sense of touch.
As far back as 1949 Canadian neurophysiologist Donald Hebb proposed in his work Organization of Behavior that, "When an axon of cell A is near enough to excite a cell B, and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency as one of the cells firing B is increased." In other words, if one neuron sends a lot of signals that excite another neuron, the synapse between the two neurons is strengthened. The more active the two neurons are, the stronger the connection between them grows; thus, with every new experience, your brain slightly rewires its physical structure.
In working with nerve tissue, scientists have also found that if two connected neurons are stimulated at the same time, the amount of signal passing from one neuron to the other can double. This is known as long-term potentiation, or LTP. Whether this is permanent or not has yet to be verified. But work with Aplysia, a sea slug, by Eric Kandel of Columbia University confirmed that the animal's neurons grew stronger as it learned to associate a food it disliked with the presence of a beam of light.
The internet is replete with more information on neural networks and brain plasticity. A simple search engine inquiry into either of these subjects will give more detailed information and lead to specific scientific articles.
The purpose of this site is to provide a simple method to 'exercise' the brain daily and make new connections. The brain's plasticity is becoming more apparent in cognitive science, and more and more evidence is surfacing to validate the idea of "use it or lose it." Though this is something common sense might dictate, there are very few mechanisms that allow us to use our brains in unfamiliar ways each day. Doing different puzzles will produce different kinds of thought processes as you search for solutions. Puzzles are useful because they do have solutions; you can test your ability to find one because one exists.
The ability to flex the mind in whatever direction is needed to find a solution is what leads to true creative thinking. Creativity is not just coming up with something different, but with something coherent, useful and relevant to whatever stimulated the need for a creative thought. Learning to think creatively is a skill that anyone can learn. Test yourself and see how flexible your mind is. Try this method for six months and see if you are able to think more clearly and apply either logical or analogical thought at will to any situation that arises.]

 

[ https://contentwriteups.blogspot.com/2013/02/is-this-how-brain-waves-and-neurons.html ]

 

[Brains

http://contentwriteups.blogspot.in/2016/03/brain-and-its-puzzling-beauty.html

http://contentwriteups.blogspot.in/2016/01/brainbrain-brain.html

http://contentwriteups.blogspot.in/2013/07/building-better-brain.html

http://contentwriteups.blogspot.in/2011/12/brain-in-news.html

http://contentwriteups.blogspot.in/2011/12/our-brain-and-mind.html

http://contentwriteups.blogspot.in/2013/06/brain-culture-interview-in-2005-indian.html

http://contentwriteups.blogspot.in/2013/06/brain-plasticity-or-growth.html

http://contentwriteups.blogspot.in/2013/02/is-this-how-brain-waves-and-neurons.html ]

 

[Life is an unmapped atlas consisting of unexplored territories, and a jigsaw puzzle where we are not even sure whether all the pieces are available in the box or what shape to create.

Reality and the real reasons for life or living are not uniform or predefined, and hence are not that easy to grasp, because, besides what has been stated above, life is involved in the constant process of churning called living, which plays out dynamically in the cauldron of social life [the multiple shapes, sizes and materials of which this cauldron is made are also not fully known].

Add to this that the kirn-staff or churning stick [that is, the evaluation tool used to churn] is also not known.

So, reality is reality with all its multiple facets and faces; dimensions and dynamics; polarities and probabilities; positives and negatives; victors and victims and so on. We try to grasp it, give it various names depending on how we perceive and relate based on our attitudes and perspectives.

After all life is a constant process of learning and adjustment wherein we can neither deny nor defy the importance of anything or anyone.

Evolution goes about its job unmindful of whatever we do or do not do; it decides and determines everything, including our birth and death as biochemical organisms, which are bound eventually to decay, to die, to reorganize, to be reborn or to be recreated into something else.

While our innards and inherent qualities are inevitable and beyond our choice, at least our utilization of strengths and our reactions to shortcomings - in short, our MINDSET - can be sane and more sensible, and that is what must be the attempt of any subject, be it philosophy, spiritual science or pure science.

Evolution has ensured that human beings as a species have moved from kuru-contracting cannibals to cyber gurus.

 

So, it becomes obvious that we cannot analyse anything through any templates of ideological fixations or subject it to satisfy socio-political or socio-cultural or religious justifications.

 

We must also know that everything has its own inherent attributes, intrinsic values, internal mechanisms and logic for its existence - besides, beyond, exclusive of and unmindful of human intellectual justifications, acceptance, acknowledgement, social or religious approvals and political support - and therefore it is purely absurd to extrapolate anything with specific ideological fixations.]

 

No comments: