Monday, October 28, 2013

I Am A Conservative, But I Am Not A Zombie.

Do you believe in zombies?


The prestigious Archaeology Magazine has posted research detailing evidence of a zombie outbreak in ancient Egypt near Hierakonpolis approximately 5000 years ago.


Yes, Archaeology Magazine!


The intrepid archaeologists posited that these zombie attacks were caused by a viral culprit:




The idea that zombies are supernatural beings needs to be discarded. They are not the Spawn of Hell, although they certainly look the part. They are, or were, people who were infected by the Solanum virus. The virus creates a zombie by eating away the frontal lobe of the brain for replication, destroying it in the process. It mutates the brain, allowing it to remain alive but dormant, without the need for oxygen. Once the mutation is complete (approximately 23 hours from infection to fully functioning zombie), the ghoul will be on an unending search for living human flesh, thus spreading the infection.


If you don’t believe, have you at least noticed that zombies are the new black? They’re everywhere, and they have a plethora of fans. On TV there are Bite Me, Ugly Americans, and Team Daryl, the last spawned by The Walking Dead (chosen as the #1 show on television).


A Google search of ‘zombies’ yields 117 million hits.



There are actually three different kinds of zombies. All of them are like humans in some ways, and all of them are lacking something crucial (something different in each case).


Hollywood zombies. These are found in zombie B-movies. Their defining feature is that they are dead, but “reanimated”. They are typically rather mean, and fond of human flesh. The zombies pictured on this page are mostly Hollywood zombies (though I’m informed that the one at the bottom is really a ghost demon). An expert tells me that the name should be “Pittsburgh zombies”, since the most important zombie movies were made in Pittsburgh, but somehow it doesn’t have the same ring.


Haitian zombies. These are found in the voodoo (or vodou) tradition in Haiti. Their defining feature seems to be that they lack free will, and perhaps lack a soul. Haitian zombies were once normal people, but underwent zombification by a “bokor” through spell or potion, and are afterwards used as slaves.


Philosophical zombies. These are found in philosophical articles on consciousness. Their defining feature is that they lack conscious experience, but are behaviorally (and often physically) identical to normal humans.



Both the fear of contagion and the fear of predation are hard-wired into the human central nervous system. In other words, these are the cross-cultural, ‘instinctive,’ pre-cognitive and pre-linguistic buttons that the modern zombie pushes.


The word zombie, meaning the living dead, has its roots in West African and Haitian vodou religion. Historians and anthropologists trace the origin of zombies to the folklore of several tribes in western Africa, from Ghana to Nigeria. During the slave trade of the late 1500s through the 1800s, persons from these regions were spirited away from their homes to till the plantations of the Caribbean and the European colonies, bringing with them the voodoo culture of magic and spells.


The zombie as represented in film symbolizes a mindless, soulless being with the desire to consume the living, as opposed to the vodou zombie who exists to do the bidding of its master at all costs. At its most basic definition, the term zombie refers to a person who is no longer thinking for themselves. In the case of the West African/Haitian culture, this person is a slave.


Like humans, zombies came out of Africa. There they led rich lives being worshipped as Congolese snake gods (nzambi). The ability of certain snakes to use poison to paralyze their prey was ritualistically imitated by tribal priests, who then proclaimed themselves able to resurrect the dead as well. In such vodoun or Obeah cults, the term nzambi migrated in meaning to “spirits of the dead.”



Transported to the Americas, vodoun took root in Caribbean slave culture, mating with indigenous religions to spawn zombies, zumbies, jumbies, and duppies and spreading northward to the continent. By the 17th century vodoun was strong enough to trigger the Salem witch hysteria of 1692. Tituba, a Carib Indian slave bought by Samuel Parris in Barbados and brought to Salem, filled her young mistresses’ heads with vodoun notions like invocation of the devil, possession, trances, animal familiars, and the sticking of pins into “poppetts” (dolls) made to resemble enemies. The girls’ psyches broke down, alternating between hysterics and catatonia. Tituba was among the first arrested and was the first to confess, in lurid detail—yet she survived while 24 others did not.


To this day, voodoo is prominent in western Africa, Haiti, New Orleans, and parts of the Caribbean Islands.


Although most cultures would consider the zombie to be a fictional creature, zombiism (i.e., being a zombie) is rather common in Haiti, with instances of people being reported dead by loved ones, only to be spotted fully reanimated and wandering around town several weeks to several years later. In Haitian and African culture, zombification is a punishable offense on the same order of severity as murder.


What is lacking from the historicist or contextualist account of zombies is an accurate understanding of the psychology that underlies the fascination and repulsion that zombies engender. All cultural concepts are engaged in a struggle for survival, but that struggle is not fought in some disembodied ether – it’s fought in people’s minds.



What’s on people’s minds is determined by their experience and their culture, certainly, but also constrained and, in the first place, enabled by genetics. People are disposed to be interested in a limited range of things, to be afraid of a limited number of things.


Although we fear the zombies we see portrayed in film, most of us may, on some level, subconsciously relate to the symbol being communicated. We use the term “mindless zombie” to defame those who are perceived as blindly following leaders not deemed fit to follow. If a leader or organization is not one to which a person can pledge loyalty, it is seen as manipulative and unworthy of anyone’s fealty, and its followers are regarded as mindless for failing to perceive the true nature of the organization that others so easily recognize. The zombie as defined here is not one to be feared, but one some may pity or, even worse, scorn.


Exploring the zombie as a cultural symbol forces us to confront the mind/body problem that many films fail to address. Where religion wrestles with the mind/body question as a lived reality, science approaches it as a philosophical exercise, one whose results bear on the plausibility of consciousness arising in artificial lifeforms, resulting in an artificial intelligence. The transference of consciousness from one physical vessel to another is a widely accepted theme in science fiction television and film–from the alien technology in Stargate Universe that allows a person to swap bodies millions of light-years apart to the transference of human consciousness into a cybernetic form in the television series Caprica.



This concept has remained a major topic of religious and philosophical study for much of humanity’s history. Both the concept of a “mindless zombie” and the theory of a dualistic human state (the separation of mind and body) are at the core of Joss Whedon’s television series Dollhouse. This line of inquiry raises the question not only of whether consciousness can remain viable apart from its original form, but also of whether the body can exist without consciousness. If the body is capable of existing without a consciousness, is the vessel or shell still a person? The theoretical plausibility of zombies raises the much deeper issue of personhood, and as such makes our understanding of the cultural symbolism of zombies even more crucial.


The zombie taps into deep-rooted, ancient fears that extend far back into our hominid lineage and beyond: notably the fear of contagion and the fear of predation. Humans are equipped with ‘elementary feature detectors geared to respond to biologically relevant threats,’ as Arne Öhman has spent a life of research demonstrating, and we react strongly and predictably to features that seem to represent ancestral dangers, even when the source is only a fleeting shadow in the twilight, flickering images on the silver screen, or indeed mental images conjured by ink on paper.


There is a clear pop-culture fascination with zombies. Forget Halloween costumes. They’re dragging themselves along on a hit show, “The Walking Dead,” on AMC, holding conventions, taking part in protests and lurching in “zombie walks” through cities from Toronto to Omaha, Nebraska.


Part of this, I’m sure, is just an expression of our culture’s enjoyment of seeing violence performed on seemingly deserving subjects: Zombies can be killed in a variety of creative ways, and since they don’t feel pain and are already dead, there’s apparently no need to feel guilty about it.



But what if this fascination is about more than just gross-out gore and action thrills?


What if it represents a subtle, subconscious understanding that something is wrong—spiritually wrong—with our culture?

Zombies represent the appetite divorced from everything else. They are incapable of judgment, self-awareness, or self-preservation. Though they still move and act, they are not really alive. They hunger and are never filled. And they aren’t just hungry for anything—they specifically want to eat the living, and even more specifically the brain, seat of rationality and self-control.


In Pauline terms, they are the sarx in its purest form. Without a soul to control it, the flesh is a slave to its own desires. The rise in popularity of zombies, then, may reflect a rise in anxiety over the elevation of appetite in modern life, a popular recognition that appetite has gotten out of control, and that unchecked, unreflective, and immoderate appetite is a form of death.


It’s this symbolic potential that seems to be behind the recent zombie film resurgence.


Zombies may inspire fear within those who witness them in popular culture, and this fear can be compared with the same emotions that people might experience when they encounter the unknown. Some of the fears brought on by zombies include fear of brain dysfunction, fear of death, and feelings of hopelessness. Zombies, in turn, make these fears into something concrete, something we can reflect upon from a safe distance, as opposed to more active methods of facing our fears, such as high-risk activities like sky diving or bungee jumping.



It’s not always subconscious, actually; Romero’s Dawn of the Dead overtly uses zombies to satirize consumerism. The humans are besieged by the walking dead in a shopping mall, and one of them says that the zombies have gathered there because that’s where they always went in life. Shaun of the Dead uses zombies in the same way, though more humorously. It takes a very long time for Shaun to realize that all of the shambling, vacant-eyed, disgusting people around him have actually become zombies, as their behavior really hasn’t changed all that much. (At the movie’s end, Shaun’s friend Ed’s lifestyle doesn’t seem to have changed at all after his own transformation into a zombie.)


The zombie phenomenon is very interesting theologically, as it’s sort of a “return of the repressed” way of recognizing the deadness of appetite-driven modern culture. As we become more and more zombified, as our culture becomes ever more adept at amplifying our desires through advertising, pornography, and a media culture obsessed with gratifying every appetite, we can see the inevitable results of that process shambling along on their rotting legs.

Another fascinating feature of most modern zombie stories is that, most of the time, the zombies themselves are not actually all that dangerous.


They’re usually slow and clumsy, almost never use weapons, and are too mindless to formulate any tactics. They just plod forward toward their victims, and only their numbers, persistence, and resilience to damage make them much of a threat.

No, what really makes things scary for the protagonists in a zombie story is not the zombies’ power, but the humans’ own weakness.



The survivors in Night of the Living Dead could easily have withstood the besieging zombies if they had stayed cool-headed and followed their most intelligent member’s plans. But instead they degenerated into infighting and hysteria, and that gave the zombies an opening to overwhelm them.

The theological lesson here is that it’s the frailty of our human wills that gives the sarx its power over us.


When we’re faced by naked appetite, we are all too often defenseless and paralyzed. And of course, the worst fate that can befall the victim of a zombie—far worse than being eaten—is to be turned into a zombie oneself. What seems at first like merely an external physical threat can get inside us, corrupt our humanity, and turn us into just another mindless, ravenous drone.


There is never just one meaning to a symbol as rich as the zombie. People have feared many things in many different guises over the centuries, but some fears are eternal and universal.


Zombies have come to occupy a very prominent spot in North American popular culture. This popularity has spilled over into other aspects of everyday life, making zombies a recurring metaphor in politics and economics, as well as the natural sciences and mathematics.



As a sub-genre of post-apocalyptic stories, zombie fiction since WWII has reflected society’s concern with crises such as political conflict, social and cultural change, and economic decline. Yet since the crystallization of the modern zombie in George A. Romero’s Night of the Living Dead (1968), zombies have also carried an undercurrent of environmental anxiety in addition to political, social, and economic anxieties.


Zombies are historically contingent, and stand in for specific types of environmental anxieties that shift and evolve to reflect the times.


Mainstream interest in zombies has steadily risen over the past 40 years.


Vampires have become sexy, mummies CG, monsters sympathetic, but no horror baddie remains as au courant as the lowly, lurching zombie.


It’s not just television, books, and films that are cashing in on zombies. Preppers, as seen on National Geographic’s Doomsday Preppers, are spending millions to survive a zombie apocalypse (and other end-of-the-world scenarios). They are buying land, growing their own food, stockpiling weapons, and designing zombie-proof bunkers. In today’s culture, zombies are major players and big business.


Zombies are a value stock. They are wordless and oozing and brain dead, but they’re an ever-expanding market with no glass ceiling. Zombies are a target-rich environment, literally and figuratively. The more you fill them with bullets, the more interesting they become.



Roughly 5.3 million people watched the first episode of “The Walking Dead” on AMC, a stunning 83 percent more than the 2.9 million who watched the Season 4 premiere of “Mad Men.” This means there are at least 2.4 million cable-ready Americans who might prefer watching Christina Hendricks if she were an animated corpse.
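For the number-minded reader, the percentages in that paragraph check out. Here is a quick sanity check in Python, using only the viewer counts quoted above:

```python
# Quick check of the viewership arithmetic quoted above.
walking_dead = 5.3  # millions of viewers, "The Walking Dead" series premiere
mad_men = 2.9       # millions of viewers, "Mad Men" Season 4 premiere

pct_more = (walking_dead - mad_men) / mad_men * 100
print(f"{pct_more:.0f} percent more")                       # -> 83 percent more
print(f"{walking_dead - mad_men:.1f} million difference")   # -> 2.4 million difference
```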


In most films, zombies are created when man-made events produce genetic mutations in normal humans; in the older tradition, a sorcerer’s spell or potion does the work. In either scenario, we have a person who is no longer in control of his or her own life. Helpless, and at the mercy of their new nature, zombies’ only option, as depicted in modern film, is to band together, rise up, and devour the living. In many cases this act does not free the zombies but only adds to their number those who now shamble forth and fight with them.


A person who has been zombified, or transformed into a zombie, can have a blunt affect, dull gaze, and almost stuporous behavior, characterized by a lumbering gait and simple, repetitive vocalizations and movements. Most medical evaluations would characterize victims of zombification as having mental disorders such as catatonic schizophrenia. The aforementioned traits have been incorporated into the current interpretation of zombies found in modern film and media.


What makes that measured amplification curious is the inherent limitation of the zombie itself: You can’t add much depth to a creature who can’t talk, doesn’t think, and whose only motive is the consumption of flesh. You can’t humanize a zombie unless you make it less zombie-esque. There are slow zombies, and there are fast zombies—that’s pretty much the spectrum of zombie diversity. It’s not that zombies are changing to fit the world’s condition; it’s that the condition of the world seems more like a zombie offensive.



People instinctively know to avoid the kind of toxic substances that over evolutionary time constituted a lethal threat to our ancestors, such as rotting meat.


That’s because natural selection has fine-tuned our perceptual apparatus to be on alert for such substances: those of our ancestors who cried yuck at the sight of decomposing flesh were more likely to propagate their genes than the ones who dug in happily.


Over time, the rot-lovers became extinct, and the human population today is united in its innate aversion to spoiled meat. This is an experiment you can do at home: purchase a packet of steaks, let it sit on the kitchen counter for a week and a half, and then open it and smell the roses. If your response is less than enthusiastic, that’s natural selection protecting your genetic material from a potent threat, right there.


New thinking tells us that consciousness itself is an evolutionary adaptation, and it is easy to see how being self-aware helped us survive.


But consciousness also can limit what we see. It allows us, even encourages us, to live in denial of the biggest new threats. For ruling establishments especially, the pressure to keep people oblivious can be all the more acute because admitting to big danger directly threatens legitimacy.



As Stephen King has pointed out on numerous occasions, horror fiction is so often about ordinary people trapped in extraordinary circumstances, and about their efforts to cope.


Humans care supremely about humans, and the motives and thoughts of other people are an everlasting well of interest to most of us. Just witness the prevalence of gossip anywhere, or the contents of most fiction throughout the ages. It’s all about what makes people tick, about human nature.


Zombie stories, too; zombies are attention-grabbing and salient in themselves, to be sure, but concerns and speculations regarding human nature usually make up the bulk of the thematic structure of zombie stories. It’s hard to imagine a story pitting zombies against squirrels or groundhogs being much of a blockbuster or bestseller (not to mention zombies vs. polyatomic ions, or the Zombie War on the Fibonacci Sequence). People are interested in the human element.


So zombies tell us more than just that Hollywood likes to come up with new ways to show gore. They also tell us about our own souls. When we watch or read or play a story about them, we see ourselves as both the zombies and the victims. We know it, but we don’t realize it. What we need to realize is that we’re already undead, and that the only cure is regeneration.


Live and Learn. We All Do.


Thanks for reading. Please share ☺


Please don’t forget to leave a comment.








Sunday, October 13, 2013

The Only Thing That Interferes With My Learning Is My Education.

The new federal education standards, known as Common Core, are stirring up a big argument around the nation.


The standards have been in existence for a while and are now becoming an issue because they are just recently being put into effect across the country.


If you are not aware of what this is: Common Core refers to a set of standards intended to provide clear goals for what students are expected to learn, including a series of benchmarks in English and math that all students will have to meet by the next school year.




The standards apply to students from kindergarten through 12th grade.


This cure-all wonder drug, the Common Core (short for the Common Core State Standards Initiative), was cooked up by the National Governors Association and the Council of Chief State School Officers. And this magic potion promises to cure America’s education ills, according to its mission statement:


The Common Core State Standards provide a consistent, clear understanding of what students are expected to learn, so teachers and parents know what they need to do to help them. The standards are designed to be robust and relevant to the real world, reflecting the knowledge and skills that our young people need for success in college and careers. With American students fully prepared for the future, our communities will be best positioned to compete successfully in the global economy.


Before these standards were set, states came up with education benchmarks that were unique to each and every state.



Common Core tries to make sure all students, nationally, are on the same level, and was created by governors from across the nation as well as education commissioners.


Forty-five states have adopted the Common Core standards.


In 2014, there will be testing to correspond with the Common Core standards, after they take full effect.


Specifically, the Common Core claims to cure the ills that have long plagued America’s education: inequality and inefficiency. “Common standards will help ensure that students are receiving a high quality education consistently, from school to school and state to state. Common standards will provide a greater opportunity to share experiences and best practices within and across states that will improve our ability to best serve the needs of students.”


So how wonderful is this wonder drug? There is no empirical evidence at the moment to make any judgment since no one has taken it yet. But common sense can help.


It is important to remember that although they are referred to as Federal or National education standards, the Federal Government did not create them. They are referred to as this only because the majority of states have chosen to adopt them, and they are aimed at being a national set of standards.



The misnamed “Common Core State Standards” are not state standards. They’re national standards, created by Gates-funded consultants for the National Governors Association (NGA). They were designed, in part, to circumvent federal restrictions on the adoption of a national curriculum, hence the insertion of the word “state” in the brand name. States were coerced into adopting the Common Core by requirements attached to the federal Race to the Top grants and, later, the No Child Left Behind waivers. (This is one reason many conservative groups opposed to any federal role in education policy oppose the Common Core.)


Like so many education reform initiatives that seem to arise out of nowhere, the Common Core State Standards is another of these sweeping phantom movements that have gotten their impetus from a cadre of invisible human beings endowed with inordinate power to impose their ideas on everybody.


For example, the idea of collecting intimate personal data on public school students and teachers seems to have arisen spontaneously in the bowels of the National Center for Education Statistics in Washington. It required a small army of education psychologists to put together the data handbooks, which are periodically expanded to include more personal information.


Nobody knows who exactly authorized the creation of such a dossier on every student and teacher in American public schools, but the program exists and is being paid for by the taxpayer.



Already hailed as the “next big thing” in education reform, the Common Core State Standards are being rushed into classrooms in nearly every district in the country. Although these “world-class” standards raise substantive questions about curriculum choices and instructional practices, such educational concerns are likely to prove less significant than the role the Common Core is playing in the larger landscape of our polarized education reform politics.


The curriculum replaces the classics with government propaganda. According to the American Principles Project, “They de-emphasize the study of classic literature in favor of reading so-called ‘informational texts,’ such as government documents, court opinions, and technical manuals.” Over half the reading materials in grades 6-12 are to consist of informational texts rather than classical literature. Historical texts like the Gettysburg Address are to be presented to students without context or explanation.


The Common Core, however dressed, shares the fundamental spirit with NCLB: standardization of curriculum enforced with high-stakes testing. In fact, the Common Core comes with more force on a larger scale. The side effects will be even more significant.


“If you had a stomach ache, if you were nervous, if you were lethargic, if you needed energy, if you had tuberculosis, if you had asthma, all sorts of things. It was going to cure what you had.” That was historian Dr. Howard Markel talking about cocaine, a wonder drug praised by medical researchers, doctors, and great minds of the 1880s, including the likes of Thomas Edison, Queen Victoria, and Pope Leo XIII. “I take very small doses of it regularly against depression and against indigestion and with the most brilliant of success,” wrote Sigmund Freud.



“And today begins a new era, a new time in public education in our country. As of this hour, America’s schools will be on a new path of reform, and a new path of results.” That was President George W. Bush talking about the No Child Left Behind Act in 2002. “Our schools will have higher expectations,” he continued, “Our schools will have greater resources to help meet those goals. Parents will have more information about the schools, and more say in how their children are educated. From this day forward, all students will have a better chance to learn, to excel, and to live out their dreams.”


Today, we know that cocaine is indeed potent, in fact, so potent that there is an ongoing expensive battle against it.


And Bush’s NCLB? Every state is trying to get out of it, some even willing to trade it for a worse set of demands from Arne Duncan.


All medicine has side effects. When it cures, it can harm the body as well. Put another way, there is no free lunch. Everything comes at a cost.


Education cannot escape this simple common sense law of nature for a number of reasons. First, time is a constant. When one spends it on one thing, it cannot be spent on others. Thus when all time is spent on studying and preparing for exams, it cannot be spent on visiting museums. By the same token, when time is spent on activities not necessarily related to academic subjects, less time is available for studying the school subjects and preparing for exams. Second, certain human qualities may be antithetical to each other.



When one is taught to conform, it will be difficult for him to be creative. When one is punished for making mistakes, it will be hard for her to take risks. When one is told she is wrong or inadequate all the time, it will be difficult for her to maintain confidence. In contrast, when students are allowed the freedom to explore, they may question what they are asked to learn, and may decide not to comply. Finally, resources are finite as well.


When a school or society devotes all resources to certain things, they don’t have them for others. For example, when all resources are devoted to teaching math and language, schools will have to cut out other programs. When more money is spent on testing students, less will be available for actually helping them grow.


Diane Ravitch has exposed many cases of education wonder drugs or silver bullets in her outstanding must-read book The Death and Life of the Great American School System: How Testing and Choice Are Undermining Education and other writings. She writes, “…in education, there are no shortcuts, no utopias, and no silver bullets.”


The Common Core has not been tested. If anything, standards and testing in the U.S. have not amounted to much in curing the ills of inequality and inefficiency.



When I first read about the Common Core State Standards, I cheered. I believe that our schools should teach all students (except for those who have severe learning disabilities) the skills, habits, and knowledge that they need to be successful in post-secondary education.


That doesn’t mean that every teenager must be prepared to enter Harvard, but it does mean that every young adult, with few exceptions, should at least be prepared to enter their local community college. That is how we give students a real choice.


I confess that I was naïve. I should have known that, in an age in which standardized tests direct teaching and learning, the standards themselves would quickly become operationalized by tests. Testing, coupled with the evaluation of teachers by scores, is driving the Core’s implementation. The promise of the Common Core is dying, and teaching and learning are being distorted. The well that should sustain the Core has been poisoned.


Written mostly by academics and assessment experts—many with ties to testing companies—the Common Core standards have never been fully implemented and tested in real schools anywhere. Of the 135 members on the official Common Core review panels convened by Achieve Inc., the consulting firm that has directed the Common Core project for the NGA, few were classroom teachers or current administrators. Parents were entirely missing. K–12 educators were mostly brought in after the fact to tweak and endorse the standards—and lend legitimacy to the results.



The standards are tied to assessments that are still in development and that must be given on computers many schools don’t have. So far, there is no research or experience to justify the extravagant claims being made for the ability of these standards to ensure that every child will graduate from high school “college and career ready.” By all accounts, the new Common Core tests will be considerably harder than current state assessments, leading to sharp drops in scores and proficiency rates.


We have seen this show before. The entire country just finished a decade-long experiment in standards-based, test-driven school reform called No Child Left Behind.


The tests showed that millions of students were not meeting existing standards. Yet the conclusion drawn by sponsors of the Common Core was that the solution was “more challenging” standards.


This conclusion is simply wrong.


Don’t judge teachers by their students’ scores. Test scores are a poor measure of a child’s quality and an even worse measure of the quality of teaching. Moreover, students’ performance on tests is the result of many factors, many of which are beyond the control of the teacher. Thus it is not only unfair to judge a teacher based on test scores, but also ineffective—research has shown that test-based incentive programs do not lead to improvements in student achievement.



There has been no bigger change in ten thousand years of recorded human history than the overwhelming transformation of society and commerce and health and civilization that was enabled (or caused) by industrialization.


We’re so surrounded by it that it seems normal and permanent and preordained, but we need to lay it out in stark relief to see how it has created the world we live in.

In just a few generations, society went from agrarian and distributed to corporatized and centralized.


In order to overhaul the planet, a bunch of things had to work in concert: Infrastructure changes, including paving the earth, laying pipe, building cities, wiring countries for communication, etc. Government changes, which meant permitting corporations to engage with the king, to lobby, and to receive the benefits of infrastructure and policy investments. “Corporations are people, friend.”


Education changes, including universal literacy, an expectation of widespread commerce, and most of all, the practice of instilling the instinct to obey civil (as opposed to government) authority.



None of this could have happened if there had been widespread objections from individuals. It turns out, though, that it was relatively easy to enforce and then teach corporate and educational obedience. It turns out that industrializing the schooling of billions of people was a natural fit, a process that quickly turned into a virtuous cycle: obedient students were turned into obedient teachers, who were then able to create even more obedient students. We’re wired for this stuff.


A hundred and fifty years ago, adults were incensed about child labor. Low-wage kids were taking jobs away from hard-working adults.


Sure, there was some moral outrage about seven-year-olds losing fingers and being abused at work, but the economic rationale was paramount. Factory owners insisted that losing child workers would be catastrophic to their industries and fought hard to keep the kids at work—they said they couldn’t afford to hire adults. It wasn’t until 1918 that nationwide compulsory education was in place.


Part of the rationale used to sell this major transformation to industrialists was the idea that educated kids would actually become more compliant and productive workers. Our current system of teaching kids to sit in straight rows and obey instructions isn’t a coincidence—it was an investment in our economic future. The plan: trade short-term child-labor wages for longer-term productivity by giving kids a head start in doing what they’re told.



Large-scale education was not developed to motivate kids or to create scholars. It was invented to churn out adults who worked well within the system. Scale was more important than quality, just as it was for most industrialists.

Of course, it worked. Several generations of productive, fully employed workers followed.

But now?


Nobel prize–winning economist Michael Spence makes this really clear: there are tradable jobs (doing things that could be done somewhere else, like building cars, designing chairs, and answering the phone) and non-tradable jobs (like mowing the lawn or cooking burgers). Is there any question that the first kind of job is worth keeping in our economy?


Alas, Spence reports that from 1990 to 2008, the U.S. economy added only 600,000 tradable jobs.


If you do a job where someone tells you exactly what to do, he will find someone cheaper than you to do it. And yet our schools are churning out kids who are stuck looking for jobs where the boss tells them exactly what to do.


Do you see the disconnect? Every year, we churn out millions of workers who are trained to do 1925-style labor.



The bargain (take kids out of work so we can teach them to become better factory workers as adults) has set us on a race to the bottom.


Over the last three generations, the amount of school we’ve delivered to the public has gone way up—more people are spending more hours being schooled than ever before. And the cost of that schooling is going up even faster, with trillions of dollars being spent on delivering school on a massive scale.


We spend a fortune teaching trigonometry to kids who don’t understand it, won’t use it, and will spend no more of their lives studying math. We invest thousands of hours exposing millions of students to fiction and literature, but end up training most of them to never again read for fun (one study found that 58 percent of all Americans never read for pleasure after they graduate from school).


As soon as we associate reading a book with taking a test, we’ve missed the point.


The industrialized mass nature of school goes back to the very beginning, to the common school and the normal school and the idea of universal schooling. All of which were invented at precisely the same time we were perfecting mass production and interchangeable parts and then mass marketing.



The common school (now called a public school) was a brand-new concept, created in the mid-1800s. “Common” because it was for everyone: for the kids of the farmer, the kids of the potter, and the kids of the local shopkeeper. Horace Mann is generally regarded as the father of the institution, but he didn’t have to fight nearly as hard as you would imagine—because industrialists were on his side.


The normal school (now called a teacher’s college) was developed to indoctrinate teachers into the system of the common school, ensuring that there would be a coherent approach to the processing of students. If this sounds parallel to the notion of factories producing items in bulk, of interchangeable parts, of the notion of measurement and quality, it’s not an accident.


The SAT, the single most important filtering device used to measure the effect of school on each individual, is, almost without change, a lower-order-thinking test.


The reason is simple. Not because it works.


No, we do it because it’s the easy and efficient way to keep the mass production of students moving forward.


School’s industrial, scaled-up, measurable structure means that fear must be used to keep the masses in line. There’s no other way to get hundreds or thousands of kids to comply, to process that many bodies, en masse, without simultaneous coordination.



And the flip side of this fear and conformity must be that passion will be destroyed.


There’s no room for someone who wants to go faster, or someone who wants to do something else, or someone who cares about a particular issue. Move on. Write it in your notes; there will be a test later. A multiple-choice test.


Do we need more fear? Less passion?


The notion that an organization could teach anything at all is a relatively new one.

Traditionally, society assumed that artists, singers, artisans, writers, scientists, and alchemists would find their calling, then find a mentor, and then learn their craft. It was absurd to think that you’d take people off the street and teach them to do science or to sing, and persist at that teaching long enough for them to get excited about it.


Now that we’ve built an industrial solution to teaching in bulk, we’ve seduced ourselves into believing that the only thing that can be taught is the way to get high SAT scores.


We shouldn’t be buying this.


We can teach people to make commitments, to overcome fear, to deal transparently, to initiate, and to plan a course.



We can teach people to desire lifelong learning, to express themselves, and to innovate.


And just as important, it’s vital we acknowledge that we can unteach bravery and creativity and initiative. And we have been doing just that.


School has become an industrialized system, working on a huge scale that has significant byproducts, including the destruction of many of the attitudes and emotions we’d like to build our culture around.


In order to efficiently jam as much testable data as possible into a generation of kids, we push to make those children compliant, competitive zombies.


Human beings have, like all animals, a great ability to hide from the things they fear.


The universal truth is beyond question—the only people who excel are those who have decided to do so. Great doctors or speakers or skiers or writers or musicians are great because somewhere along the way, they made the choice.


Why have we completely denied the importance of this choice?


It’s clear that the economy has changed. What we want and expect from our best citizens has changed. Not only in what we do when we go to our jobs, but also in the doors that have been opened for people who want to make an impact on our culture.



At the very same time, the Internet has forever transformed the acquisition of knowledge. Often overlooked in the rush to waste time at Facebook and YouTube is the fact that the Internet is the most efficient and powerful information delivery system ever developed.


The change in the economy and the delivery of information online combine to amplify the speed of change. These rapid cycles are overwhelming the ability of the industrialized system of education to keep up.


As a result, the education-industrial system, the one that worked very well in creating a century’s worth of factory workers, lawyers, nurses, and soldiers, is now obsolete.


I don’t think it’s practical to say, “We want what we’ve been getting, but cheaper and better.” That’s not going to happen, and I’m not sure we want it to, anyway.


We need school to produce something different, and the only way for that to happen is for us to ask new questions and make new demands on every element of the educational system we’ve built. Whenever teachers, administrators, or board members respond with an answer that refers to a world before the rules changed, they must stop and start their answer again.



No, we do not need you to create compliance.


No, we do not need you to cause memorization.


And no, we do not need you to teach students to embrace the status quo.


Anything a school does to advance those three agenda items is not just a waste of money, but actually works against what we do need. The real shortage we face is dreams, and the wherewithal and the will to make them come true.


No tweaks. A revolution.


Unfortunately there’s been too little honest conversation and too little democracy in the development of the Common Core. We see consultants and corporate entrepreneurs where there should be parents and teachers, and more high-stakes testing where there should be none. Until that changes, it will be hard to distinguish the “next big thing” from the last one.


Whatever positive role standards might play in truly collaborative conversations about what our schools should teach and children should learn has been repeatedly undermined by bad process, suspect political agendas, and commercial interests.



Transparency in the traditional school might destroy it. If we told the truth about the irrelevance of various courses, about the relative quality of some teachers, about the power of choice and free speech—could the school as we know it survive?


What happens when the connection revolution collides with the school?


Unlike just about every other institution and product line in our economy, transparency is missing from education. Students are lied to and so are parents. At some point, teenagers realize that most of school is a game, but the system never acknowledges it. In search of power, control and independence, administrators hide information from teachers, and vice versa.


Because school was invented to control students and give power to the state, it’s not surprising that the relationships are fraught with mistrust.


The very texture of the traditional school matches the organization and culture of the industrial economy. The bottom of the pyramid stores the students, with teachers (middle managers) following instructions from their bosses.



Changing school doesn’t involve sharpening the pencil we’ve already got. School reform cannot succeed if it focuses on getting schools to do a better job of what we previously asked them to do. We don’t need more of what schools produce when they’re working as designed. The challenge, then, is to change the very output of the school before we start spending even more time and money improving the performance of the school.


The simple way to make something different is to go about it in a whole new way. In other words, doing what we’re doing now and hoping we’ll get something else as an outcome is nuts.


What’s the point of testing someone’s ability to cram for a test if we’re never going to have to cram for anything ever again? If I can find the answer in three seconds online, the skill of memorizing a fact for twelve hours (and then forgetting it) is not only useless, it’s insane.


In a crowded market, it’s no surprise that people will choose someone who appears to offer more in return for our time and money. So admissions officers look for the talented, as do the people who do the hiring for corporations. Spotting the elite, the charismatic, and the obviously gifted might be a smart short-term strategy, but it punishes the rest of us, and society as a whole.



The opportunity for widespread education and skills improvement is far bigger than it has ever been before. When we can deliver lectures and lessons digitally, at scale, for virtually free, the only thing holding us back is the status quo (and our belief in the permanence of status).


School serves a real function when it activates a passion for lifelong learning, not when it establishes permanent boundaries for an elite class.


If the new goal of school is to create something different from what we have now, and if new technologies and new connections are changing the way school can deliver its lessons, it’s time for a change.


Here are a dozen ways school can be rethought:


• Homework during the day, lectures at night

• Open book, open note, all the time

• Access to any course, anywhere in the world

• Precise, focused instruction instead of mass, generalized instruction

• The end of multiple-choice exams

• Experience instead of test scores as a measure of achievement

• The end of compliance as an outcome

• Cooperation instead of isolation

• Amplification of outlying students, teachers, and ideas

• Transformation of the role of the teacher

• Lifelong learning, earlier work

• Death of the nearly famous college



In an open-book/open-note environment, the ability to synthesize complex ideas and to invent new concepts is far more useful than drill and practice. It might be harder (at first) to write tests, and it might be harder to grade them, but the goal of school isn’t to make the educational-industrial complex easy to run; it’s to create a better generation of workers and citizens.


The best tactic available to every taxpayer and parent and concerned teacher is to relentlessly ask questions, not settling for the status quo.


“Is this class/lecture/program/task/test/policy designed to help our students do the old thing a little more efficiently, or are we opening a new door to enable our students to do something that’s new and different?”


Parents were raised to have a dream for their kids—we want our kids to be happy, adjusted, and successful. We want them to live meaningful lives, to contribute and to find stability as they avoid pain.


School is at its best when it gives students the expectation that they will not only dream big, but dream dreams that they can work on every day until they accomplish them—not because they were chosen by a black-box process, but because they worked hard enough to reach them.


What do you think?


Live and Learn. We All Do.


Thanks for reading. Please share ☺


Please don’t forget to leave a comment.








Wednesday, October 9, 2013

Raw! Raw! Raw! That’s The Spirit! Right?

The raw food lifestyle has inspired an enthusiastic, soul-stirring movement across the globe and much of this excitement can be credited to Cherie Soria, who instructed and encouraged a host of devoted followers, entrepreneurs, and fledgling chefs.


Fueled by her desire to bring good health, weight loss, energy, and a youthful constitution to millions, Cherie joined with Brenda Davis and Vesanto Melina, both registered dietitians, to lead the way toward a raw food health revolution.




The raw-food movement continues to make converts, thanks to a devoted group of individuals and celebrities who embrace the belief that an all-raw-food diet is the best diet. The idea that stirs the most enthusiasm for this diet is the contention that cooking destroys about fifty percent of the nutrients in food and all or most of the life-promoting enzymes. Raw-food enthusiasts commonly make the claim that “cooked foods are dead foods.”


But, are cooked foods really dead foods?


Many people advocate eating raw food because animals eat raw food and stay healthy, or because raw foods contain a little more of some nutrients. However, the subject is more complex.


We are not like the animals. We think more, we worry, we go to work and do not sleep enough, and most people’s digestion is weak, unlike that of the animals.



Vegetarian animals such as cows and goats, in particular, often have very complex or multiple stomachs in order to digest raw vegetation. Human beings lack these.


For hundreds of thousands of years the evolving human race had eaten its food raw, but at some time between the first deliberate use of fire (in Africa around 1,400,000 BC or in Asia around 500,000 BC, depending on which theory happens to be the flavor of the month) and the appearance of the Neanderthals on the prehistoric scene, cooking was discovered.


Whether or not it came as a gastronomic revelation can only be guessed at, but since heat helps to release protein and carbohydrate as well as break down fiber, cooking increases the nutritive value of many foods and makes edible some that would otherwise be inedible.


Improved health must certainly have been one result of the discovery of cooking, and it has even been argued, by the late Carleton Coon, that cooking was the decisive factor in leading man “from a primarily animal existence into one that was more fully human.”


Whatever the case, by all the laws of probability roasting must have been the first method used, its discovery accidental. The concept of roast meat could scarcely have existed without knowledge of cooking, nor the concept of cooking without knowledge of roast meat.



Eating raw food is necessary for good health and is an important feature of a healthy diet. But that does not mean that one’s entire diet has to be raw to be in excellent health. It also does not mean eating an all-raw diet is the healthiest way to eat. It is healthier to expand your nutrient density, your absorption of plant protein and your nutrient diversity with the inclusion of some conservatively cooked food in your diet.


Generally speaking, the larger the mammal, the larger its brain will be. Humans are a bit of an anomaly among primates, however, because we have the largest brain and number of neurons, but not the largest body. Great apes, for instance, have much bigger bodies than humans, yet much smaller brains.


How humans came to be so well endowed in the brain department has long been a mystery – but theories abound, including the predominant one of access to animal-based omega-3 fats from seafood.


Another theory suggests it may, in fact, be cooking that allowed humans to develop so much brainpower.



Your brain is a major consumer of the calories you take in each day. Even though it makes up only about 2 percent of your body mass, it uses 20 percent of your calories!


The size and number of neurons in your brain are, therefore, largely dependent on the number of calories you can consume in a day. Ancient humans had to graze constantly to find enough calories to live on, much the way apes and gorillas do today. There are only so many hours in a day, and raw, mostly vegetable, foods do not contain many calories, which together put a metabolic limitation on how big the brain could grow.
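To make that 2-percent/20-percent disparity concrete, here is a rough back-of-the-envelope sketch in Python. The 2,000-calorie daily intake is my illustrative assumption; the two percentage figures come from the text above:

```python
# Rough energy budget for the human brain, per the figures in the text:
# ~2 percent of body mass consuming ~20 percent of daily calories.
daily_intake_kcal = 2000    # assumed (illustrative) daily caloric intake
brain_calorie_share = 0.20  # from the text above
brain_mass_share = 0.02     # from the text above

brain_kcal = daily_intake_kcal * brain_calorie_share
print(f"Brain's share: ~{brain_kcal:.0f} kcal/day")  # -> ~400 kcal/day
print(f"Per unit of mass, the brain burns "
      f"{brain_calorie_share / brain_mass_share:.0f}x the body-wide average")
```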


Researchers believe that it was the shift to a cooked-food diet that gave humans the extra calories they needed to allow their brains to get bigger.


“Absent the requirement to spend most available hours of the day feeding, the combination of newly freed time and a large number of brain neurons affordable on a cooked diet may thus have been a major positive driving force to the rapid increase in brain size in human evolution,” the researchers noted.


They speculated that gorillas would need to spend another two hours a day eating to gain the extra caloric intake that would allow their brains to grow as big as humans’, and pointed out that cooked foods were likely easier to chew and digest, and may have released more calories in some cases.



In 2008, researchers similarly concluded that human brains “smartened up” – allowing for the use of tools and the creation of art and religion – due to the extra calories that became available when cooked food became widespread. Eating cooked meals, they said, would have lessened the energy needs of the human digestive system, thereby freeing up calories for the brain.


Gathered around a blazing fire, our ancient ancestors probably huddled to pass the archaic kebab, munching cooked meat and figuring out how they might share it and plan to get more of it. Eating cooked food allowed these early hominids to spend less time gnawing on raw material and digesting it, providing time–and energy–to do other things instead, like socialize. The strenuous cognitive demands of communicating and socializing forced human ancestors to develop more powerful brains, which required more calories–calories that cooked food provided. Cooking, in other words, allowed us to become human.


A new paper examines the metabolic restrictions of a raw diet, and suggests that our primate cousins are limited by their inability to heat their dinners. It bolsters the cooking hypothesis of Richard Wrangham, a primatologist and professor of biological anthropology at Harvard who believes cooking is our legacy.



Brazilian biomedical scientists Karina Fonseca-Azevedo and Suzana Herculano-Houzel note that the largest primates do not have the largest brains, a perplexing observation. Encephalization (a larger brain size per body size than you’d expect) has long been thought to be a key feature setting humans apart from other primates, and mammals as a whole, but there is no consensus on how or why this happened.


“We consider this disparity to be a clue that, in primate evolution, developing a very large body and a very large brain have been mutually excluding strategies, probably because of metabolic reasons,” the authors write. They’re the first to try and quantify these limits.


“You would think, ‘Surely people have thought about this stuff before,’” Wrangham said in an interview. “But nobody has ever thought about the fact that cooking gives you more energy.”


This is a central thesis of Wrangham’s 2009 book, “Catching Fire.” He argues that the control of fire allowed early hominids to not only cook their food, but obtain warmth, allowing them to shed body hair and in turn run faster without overheating; to develop calmer personalities, enabling social structures around the hearth; and even to form relationships among men and women–in short, to become human.



“My day job is studying chimpanzees in the wild, and I have often studied feeding behavior. I have tried to survive on what chimps eat,” he said.


Really?


“If I don’t have any food with me, I just eat what they eat. And that told me that what they eat is totally unsatisfying,” he continued. “I thought about what would happen if humans had to live like chimps. And that took me very rapidly to the conclusion, within a few minutes, that as long as we’ve been human, it’s hard to imagine how we could live on raw food.”


Wrangham’s ideas follow the expensive-tissue hypothesis. That concept predicts an inverse relationship between brain size and gut size: to accommodate a large, human-sized brain, our guts shrank relative to those of our primate cousins. Imagine the potbelly of a gorilla, Wrangham notes. This paper doesn’t even address gut size, just the requirements of our hungry brains.


“In order to be able to apply a sufficient number of calories to the brain, you have to be able to cook your food,” Wrangham said. “You can only afford to have a brain if you can supply a lot of energy to it.”



The idea is that raw food just doesn’t provide enough calories. You have to get out more than you put in, and raw food takes a lot more work (meaning calories) for your muscles and organs to chew and digest, resulting in a net decrease in the amount of calories available for the rest of your cells.


But you can only spend so many hours of the day eating–there must be time to sleep, forage and procreate, too. This limits the amount of calories you can get per day, and it turns out this is directly related to how many neurons you can grow, according to Fonseca-Azevedo and Herculano-Houzel.


The duo crunched numbers to figure out the metabolic costs of a human-sized brain, which is the third most energy-expensive organ in the human body, ranking below only skeletal muscle and the liver in terms of metabolic needs. The more neurons the brain has, the more energy it needs.


To maintain both the body size we humans possess and the number of neurons our brains contain on an exclusively raw diet, they found, people would have to eat for more than nine hours per day.
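

To make the scale of that result concrete, here is a minimal back-of-envelope sketch. The per-neuron cost (roughly 6 kcal per billion neurons per day) follows Herculano-Houzel’s published estimates, but the non-brain energy budget and the net calorie yield per hour of raw foraging are illustrative assumptions, not figures from the paper:

    # Rough sketch of the feeding-time argument. The intake rate and body
    # budget below are assumed values, chosen only for illustration.
    NEURON_COST_KCAL_PER_BILLION_PER_DAY = 6.0   # Herculano-Houzel's estimate
    HUMAN_NEURONS_BILLIONS = 86.0                # approximate human brain
    BODY_KCAL_PER_DAY = 1500.0                   # assumed non-brain energy budget
    RAW_FORAGE_KCAL_PER_HOUR = 220.0             # assumed net yield of raw foraging

    brain_kcal = NEURON_COST_KCAL_PER_BILLION_PER_DAY * HUMAN_NEURONS_BILLIONS
    total_kcal = brain_kcal + BODY_KCAL_PER_DAY
    feeding_hours = total_kcal / RAW_FORAGE_KCAL_PER_HOUR

    print(f"Brain alone: ~{brain_kcal:.0f} kcal/day")       # ~516 kcal/day
    print(f"Raw feeding time: ~{feeding_hours:.1f} h/day")  # ~9.2 hours/day

With these assumed figures the sketch lands at roughly nine hours of daily feeding – the same order of magnitude the paper reports.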


Cooking does some of the work of digestion for us, as Wrangham puts it.



“Molecules are moving faster under the influence of heat; they are breaking up or shaking apart from each other, and that’s essentially what happens in digestion, the denaturing of proteins,” he said. “They lose their structure, and become more accessible.”


As an example, he and others have investigated the effects of cooking on starch molecules and humans’ ability to digest cooked versus raw grains. Simply cooking starchy foods increases the net energy gain by 30 percent, he said.
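

As a quick illustration of what a 30 percent net gain means (the calorie figures here are invented purely to make the arithmetic concrete; they are not Wrangham’s measurements):

    # Hypothetical portion of grain, with assumed numbers for illustration only.
    gross_kcal = 400.0           # calories locked in the raw grain (assumed)
    digestion_cost_raw = 100.0   # calories spent chewing/digesting it raw (assumed)

    net_raw = gross_kcal - digestion_cost_raw   # 300 kcal actually gained
    net_cooked = net_raw * 1.30                 # 30% more: ~390 kcal
    print(net_raw, net_cooked)

Nothing about the grain itself has changed; cooking simply lowers the digestion toll, so more of the same calories reach your cells.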


“The grains themselves represent long chains of glucose, which are very difficult to digest until they have been gelatinized; you are opening up these chains,” he said.


Take, for example, a simple white sauce of flour and butter. You have to stir constantly over even heat, letting the water in the butter invade the starch molecules in the grain. “Then you get this change in consistency, where the whole thing becomes a continuous colloid, and the starch grains have become gelatinized. The result is that it will be easier to digest,” Wrangham said. “Our body pays fewer calories for the digestion.”


Certainly, there are benefits to consuming plenty of raw fruits and vegetables. These foods supply us with high nutrient levels and are generally low in calories, too. Eating lots of raw foods is a key feature of an anti-cancer eating style and a long life. But are there advantages to eating a diet of all raw foods and excluding all cooked foods?



The answer is a resounding “No”.


In fact, eating an exclusively raw-food diet is a disadvantage. Excluding all steamed vegetables and vegetable soups from your diet narrows your nutrient diversity and tends to reduce the percentage of calories from vegetables in favor of nuts and fruits, which are lower in nutrients per calorie.


Raw vegetables are dramatically low in calories; we probably absorb only about 50 calories a pound from them. Our caloric needs cannot be met on a raw-food diet without consuming large amounts of fruits, avocado, nuts and seeds.


Unfortunately, sloppy science prevails in the raw-food movement. Raw-food advocates mistakenly conclude that because many cooked foods are not healthy for us, all cooked foods must be bad. This is not true.


The idea that stirs the most enthusiasm for this diet is the contention that cooking destroys about fifty percent of the nutrients in food, along with all or most of the life-promoting enzymes. It is true that when food is baked at high temperatures—and especially when it is fried or barbecued—toxic compounds are formed and many important nutrients are lost.



Enzymes are proteins that work to speed up or “catalyze” chemical reactions. Every living cell makes enzymes for its own activities. Human cells are no exception. Our glands secrete enzymes into the digestive tract to aid in the digestion of food.


However, after they are ingested, the enzymes contained in plants do not function as enhancements or replacements for human digestive enzymes. These molecules exist to serve the plant’s purpose, not ours. The plant enzymes get digested by our own digestive juices along with the rest of the food and are absorbed and utilized as nutrients.


Contrary to what many raw-food websites claim, the enzymes contained in the plants we eat do not catalyze chemical reactions in humans, nor do they aid in their own digestion once inside the human body. These plant enzymes are simply broken down into simpler molecules by our own powerful digestive juices, and even the fragments absorbed as peptide-sized pieces (or with some biologic function) do not catalyze human functions.


So it is not true that eating raw food demands less enzyme production by your body, and dietary enzymes inactivated by cooking have an insignificant effect on your health. A healthy body produces precisely the amount of enzymes needed to digest the ingested food, and the enzymes our bodies use for other processes are unique to human needs and are not present in plants. We make what we need from the proper raw materials; we are not ill-equipped to digest normal food.


Recent studies confirm that the body absorbs much more of the beneficial anti-cancer compounds (carotenoids and phytochemicals—especially lutein and lycopene) from cooked vegetables compared with raw. Scientists speculate that the increase in absorption of antioxidants after cooking may be attributed to the destruction of the cell matrix (connective bands) to which the valuable compounds are bound.


In many cases, cooking actually destroys some of the harmful anti-nutrients that bind minerals in the gut and interfere with the utilization of nutrients. Destruction of these anti-nutrients increases absorption. Steaming vegetables and making vegetable soups breaks down cellulose and alters the plants’ cell structures so that fewer of your own enzymes are needed to digest the food, not more. On the other hand, the roasting of nuts and the baking of cereals does reduce the availability and absorbability of their protein.


Only small amounts of nutrients are lost with conservative cooking like making a soup, and many more nutrients are made more absorbable; these are nutrients that would have gone unabsorbed had those vegetables been consumed raw. When we heat, soften and moisten vegetables and beans, we dramatically increase the potential digestibility and absorption of many beneficial and nutritious compounds.



Many vitamins are water-soluble, and a significant percentage can be lost with cooking, especially overcooking. Similarly, many plant enzymes function as phytochemical nutrients in our body and are useful for maximizing health. They, too, can be destroyed by overcooking. However, we cannot paint every form of cooking with this brush of negativity.


When food is steamed or made into a soup, the temperature is fixed at 100 degrees Celsius (212 degrees Fahrenheit)—the temperature of boiling water. This moisture-based cooking prevents food from browning and forming toxic compounds. Acrylamides, the most widely recognized of the heat-created toxins, are not formed with boiling or steaming; they are formed only with dry cooking. Most essential nutrients in vegetables are made more absorbable after being cooked in a soup, and water-soluble nutrients are not lost, because we eat the liquid portion of the soup too.


Cooking this way also increases the plant protein in the diet, which is especially important for those eating a plant-based diet with limited or no animal products.


Multiple studies have demonstrated that the beneficial antioxidant activity of cooked tomatoes is significantly higher than that of uncooked tomatoes, for the same reason: cooking breaks down the cell matrix to which the valuable compounds are bound.



It is true that vitamin C, folate, B vitamins, and certain minerals are water-soluble and can be destroyed by cooking; but vitamin C contributes less than one percent to the total antioxidant activity of fruits and vegetables. For example, the main antioxidant activity in apples is provided by classes of chemicals called phenolics and flavonoids, both of which are made more available by cooking.


If you compare raw broccoli to steamed or frozen broccoli, about 25 percent of the vitamin C and about 20 percent of the selenium are lost during cooking, but the other 20 commonly measured nutrients show only an insignificant change. Raw-food advocates are not accurate when they claim that 50 percent of nutrients are lost with steaming; a closer estimate would be 10 percent.


Cooking corn also has been shown to significantly boost its antioxidant activity, despite the reduction in vitamin C. When the ability to quench free radicals was measured, cooked corn outperformed raw corn by 25 to 50 percent. Cooking corn releases a compound called ferulic acid, which provides anti-cancer health benefits. Ferulic acid, a phytochemical, is unusual in that it is found in only very low amounts in fruits and vegetables but in very high amounts in corn. Its availability to the body can be increased by 500 to 900 percent by cooking the corn.


In conclusion, eating lots of raw foods is only one feature of a healthy diet. Like so many fads, the raw-food diet is often practiced on blind faith, as though it were the be-all and end-all.


This is NOT true.


Live and Learn. We All Do.


Thanks for reading. Please share ☺


Please don’t forget to leave a comment.







via WordPress http://hermeticahealth.me/2013/10/09/raw-raw-raw-thats-the-spirit-right/

Friday, October 4, 2013

You Can Kill A Man But Not An Idea

What is Hajj?


Hajj in the Arabic language means aim, destination or purpose (qasd). The reason is clear: Hajj is the ultimate journey of loving submission (‘ubūdīyah) and conscious surrender (riq) to Allāh.


Hajj is the fifth pillar of Islam and it is incumbent upon every able Muslim to carry out this journey in order to purify himself for God.




Every year more than two million Muslims, from 70 different countries, travel to Makkah and Medina to undertake the great obligation of Hajj. The pilgrims’ gathering for Hajj is a striking display of equality and unity.


Muslims who belong to different nations, cultures, and social and economic statuses are all dressed in two pieces of unsewn cloth. All perform the same rites. There is no distinction between rich and poor; all stand before their Lord in submission and humility.


Hajj provides a unique opportunity for Muslims to meet each other, understand each other, increase in love, grow closer, and improve and resolve relationships. It is from the blessings of God during Hajj that one has continuous opportunity to gain good deeds by treating one’s Muslim brethren in the best way, and by aiding the poor and needy, which is also a means of achieving great rewards from God.



The Hajj is a journey full of symbolism, for it represents the soul’s journey towards God. Each stage and each aspect of the pilgrimage is replete with profound meanings about life, worship and realities of faith, especially the love and awe of God.


The hajj carries immense symbolic significance, representing the oneness of the Islamic community (the Koran speaks of the umma wahida, the one community of the faith). The ritual cuts across all sects, with some differences, and suggests common bonds that stretch geographically from the Middle East to Africa and Asia and into the Muslim minority communities of Europe, the Americas and Australasia, and temporally from now to Judgment Day.


The hajj is also viewed as an affirmation of the equality of all believers. The simple white cloth that pilgrims don, and that some retain to use as a burial shroud, represents humility before God but also negates the hierarchies and inequalities that otherwise seem important in life.



Linguistically, Hajj means, “He prepared, or betook himself, to or towards a person… or towards an object of reverence, veneration, respect or honor.” [E.W. Lane, Arabic-English Lexicon (Cambridge, England: The Islamic Text Society, 1984), vol. 1, p. 513]


In the Sharee’ah, Hajj means a specific journey to Makkah during the designated month of Dhul-Hijjah, for the performance of Hajj as an act of worship to God: “The Hajj is (in) the well-known months (i.e. the 10th month, the 11th month and the first ten days of the 12th month of the Islamic calendar).”


Most people do the pilgrimage for religious reasons of one kind or another. For many people, the pilgrimage is done as an act of devotion. For others, it is done to ask for benefits, for themselves or others, living or dead. Still others do it as a quest for enlightenment, or at least what they see as possible progress along that path. All of these are religious ideas.


Whatever their specific reason for doing the pilgrimage, nearly all pilgrims see it as a sacred activity. Many take vows: to refrain from anger, alcohol, or sex during the pilgrimage, abstaining from what they perceive as “worldly activities” that seem out of place in the “sacred” realm of the pilgrimage.



‘Pilgrim’ and ‘pilgrimage’ are words that have carried a range of meanings over the centuries.


Perhaps the most significant pilgrimage that any of us will ever undertake, however, is the spiritual pilgrimage.


The English term ‘pilgrim’ originally comes from the Latin word peregrinus (per, through + ager, field, country, land), which means a foreigner, a stranger, someone on a journey, or a temporary resident.


It can describe a traveler making a brief journey to a particular place or someone settling for a short or long period in a foreign land.


‘Pilgrimage’ is a term that can be used to portray an inner spiritual journey through prayer, meditation or mystical experience. In some faiths and cultures, withdrawal from the everyday world into a monastery or hermit’s cell, choosing to enter into a physically restricted life of isolation and silence, is seen as a way of setting the soul free to travel inwardly.


The people of the world are usually aware of two kinds of journey.



One journey is that which is made to earn livelihood. The second one is that which is undertaken for pleasure and sightseeing.


In both of these journeys, a man is impelled to go abroad by his need and desire. He leaves home for a purpose of his own and spends money and time on his own requirements; therefore, no question of sacrifice arises in such a journey.


But the position of this particular journey, which is called Hajj, is quite different from that of other journeys.


This journey is not meant to gain any personal end or any personal desire. It is intended solely for God, and for fulfillment of the duty prescribed by God.


[You] will be unable to appreciate fully the benefits of Hajj unless you keep in view the fact that each and every Muslim does not perform Hajj individually but that only one single period has been fixed for Hajj for the Muslims of the whole world, and, therefore, [hundreds of thousands] of Muslims jointly perform it.



Historically, the Holy Prophet (S.A.W.) performed the Hajj towards the end of his life. It can thus be seen as the conclusion of his mission, or as the conclusion to his life. It can also be interpreted as a summary of one’s life on earth.


On the one hand there is the physical Hajj, which is specific to those who embark on the physical journey towards Makkah. On the other hand, there is another level of Hajj, one that we are all a part of: the “Hajj of Life.” This is our journey through life. In it there are various stations, challenges, decisions and so on. Such is life on earth.


That is, we are to see ourselves as travelers or wayfarers embarking on a journey. Wherever we stop, our stay is never permanent. Such is our journey in this world: temporary, never permanent.


The journey inevitably comes to an end, as we must return home, our true abode. Our hearts must therefore not be too attached to this world, as it is not the end itself but rather the means towards the end.


Journeying to a place of special significance plays a part in almost all cultures and religions. The goal may be a site given prominence by particular events, the shrine of a saint or other significant figure, or a remarkable geographical feature.



The journey of Hajj can be considered a spiritual and physical healing journey, because it touches upon different aspects of the human self.


Every nation and society has a center of unity where they get together to worship God. They see prosperity and culture as relics of unity. People of the society get to know each other and understand each other’s difficulties. They form a unified front to remove these difficulties and achieve their goals.


Unity is vividly observed in the great pillar of Hajj, which is repeated every year and for which millions of Muslims gather from all over the world. They represent the Muslim ummah with all its different races, countries, colors, and languages. They gather in one place, at the same time, wearing the same garment and performing the same rites.


They stand together at the same sacred sites. They proclaim the oneness of the Lord of the worlds, submit themselves to His law, and show their unity under His banner. They announce to the whole world that they are one nation although they come from different countries and homes. They perform the rites and stand in the open areas of Makkah, where bodies become close to each other, faces meet, hands shake, greetings are exchanged, tongues communicate, and hearts reconcile. They meet for the same purpose and intention.



Differences in social class, wealth, race, and color vanish within these feelings and rites. A pure and solemn atmosphere of brotherhood, serenity, affection, and love prevails. In a world engulfed in dispute and division, it is a great blessing for a person to be able to enjoy this atmosphere of complete peace. In a world where inequality is the prevailing system, pilgrims enjoy an atmosphere of equality. In the face of the world’s grudges, hatred, and disputation (feelings all too characteristic of modern life), pilgrims experience love and harmony.


Though the facilities and surroundings of Mecca and its vicinity have altered in modern times, the rites of pilgrimage and the bonds of national and international brotherhood among pilgrims have remained unaltered through the centuries. This adds to the uniqueness of Hajj.


In this great fusion of the Muslim community, often under the most intense conditions, the pilgrims, once together, finally separate and return to their respective homes with hearts filled with light and minds filled with new concepts. The spiritual benefits of Hajj can be clearly gauged when the modern mind returns home utterly transformed.

In Islam there is a wisdom and purpose behind every ritual. Some of these wisdoms we know and some we don’t. The rituals we don’t understand, or that are unclear to us, are a test of our obedience to the Omnipotent Lord. It is just like a corporation where a boss might give a worker a task that he doesn’t understand. This doesn’t mean that the task is not important; there might be many reasons behind assigning such a task, one of which could be a test of obedience and loyalty. The boss might want to know the level of obedience of the worker before promoting him to a higher position in the company.



Every action and ritual in Islam has a purpose. If we lose the essence and purpose of a ritual, then it becomes an empty shell with no fruit inside—meaning the benefit of the action is lost. Hajj, like any other pillar of Islam, has purpose, manners, virtues, values, rewards and benefits.


The Rite of the Wanderer, or the Symbolic Pilgrimage, is entirely puerile and unmeaning, unless we have learned in what ideas it originated, and what its authors intended to represent by it.


This symbolic journey is also emblematic of the pilgrimage of life, which, man soon enough discovers, is often dark and gloomy, surrounded by sorrow, and fear, and doubt. It teaches him that over this dark, perplexed, and fearful course lies the way to a glorious destiny; that through night to light must the earth-pilgrim work his way; that by struggle, and toil, and earnest endeavor, he must advance with courage and hope until, free of every fetter, and in the full light of virtue and knowledge, he stands face to face with the mighty secrets of the universe, and attains that lofty height, whence he can look backward over the night-shrouded and tortuous path in which he had been wandering, and forward to sublime elevation—to more glorious ideals, which seem to say to him, “On, on for ever!”



Such, then, is the grand and inspiring lesson, which this Symbolic Pilgrimage is perpetually repeating to the brethren. Let them study it well, and labor with faith; for it announces a progress in science and virtue, which will reach through eternity.


As long as man continues to live in this world, his soul and body are not separate. Man’s body is a manifestation of his soul, and the acts of the body are manifestations of his inner feelings. In the same way that physical acts represent spiritual acts, physical acts also push the soul along its spiritual journey.


The Hajj consists of the Hajj of the Body (walking, standing, collecting and throwing), the Hajj of the Mind (performing the rites with understanding) and the Hajj of the Heart (performed in total submission to The Almighty).


The Ka’bah is not the destination; it is the starting point of one’s commitment to cast away one’s bad ways and to begin afresh a new God-centered life. 

The pilgrim is like a drop of water that has become part of the river that is flowing to its origin, the ocean of Eternity.


In essence, hajj is man’s evolution toward God; his return to Him. It is a symbolic demonstration of the philosophy of creation of Adam, the first man. To further illustrate this, it may be stated that the performance of hajj is a simultaneous show or exhibit of many things.



It is a show of creation. It is a show of history. It is a show of unity. It is a show of Islamic ideology. It is a show of Ummah, the community of Muslims. That is why it is said in the Quran: “And proclaim unto mankind the hajj. … That they may witness things that are of benefit to them.” (Quran 22:27-8)


Our modern mind is at times so engaged in the material pursuits of life that we scarcely find time to respond to the yearning of our soul. The retreat and solace found in Hajj fill this void, bringing spiritual bliss and peace to our body, mind and soul.


The pilgrimage is daily life. The way you live your life during the pilgrimage becomes the way you live your daily life afterwards. Pilgrim lore is full of stories of miracles, of reformed sinners, of people who have changed for the good. I’ve never yet heard a story of a pilgrim who became worse after doing the pilgrimage. Of course, some people probably have returned from the pilgrimage unchanged or changed for the worse, but they aren’t part of pilgrimage lore, precisely because the pilgrimage is seen as a positive transforming experience. That’s what people expect to happen — change for the better, one way or another.


Progress is one of the basic themes of the pilgrimage. This idea of progress, progress within and of the mind, is central to ideals of the pilgrimage. Whatever your current level of mind, you can progress to the next level.



Though the pilgrimage is cast in terms of sacred activity, the sacred and the secular are so thoroughly blended that the distinction between the two breaks down. This teaches the lesson that there is no essential difference between the two. As a result, the improved person who has finished the pilgrimage goes back to that other everyday life, ready for further progress.


While the Persian mystic Mansur al-Hallaj famously thought that the hajj could be done in one’s own home as an inner journey, other centers have developed throughout the Muslim world. Karbala and Qum for the Shia, Nizamuddin/South Delhi in India and Dewsbury in Yorkshire for the Tablighi Jamaat, and Kaolack in Senegal for the Niassene Tijaniyya Sufis are just a few examples of the many other forms of pilgrimage that exist.


Like the great pilgrimage to Mecca, they assume a spiritual significance beyond the act of travel and, like it, are subject to political and social contestation.


While pilgrimage obviously involves physical movement from one place to another, it is pre-eminently a journey of the mind, projecting believers across space and time and often overcoming barriers of gender and politics in the process.


The external acts of Hajj symbolize the spiritual stages of the prophets and the Imams. Hajj is a display of the spiritual journey of the devotees and the stages of servitude.


All of these behaviors will lead you to enlightenment. It doesn’t matter whether you do them during the pilgrimage, at work, at school, at home, or when going about your life in town. In fact, what matters is not when or where you live this way. What matters is that you do live this way.


Live and Learn. We All Do.


Thanks for reading. Please share ☺


Please don’t forget to leave a comment.







via WordPress http://hermeticahealth.me/2013/10/05/you-can-kill-a-man-but-not-an-idea/