Tuesday, March 3, 2015

STEM and the Arts


I’ve heard a lot of talk recently about the problematic practice, especially in education, of identifying someone as a “science” person or an “art” or “humanities” person.

With the widening career gap in the sciences for women and people of color, much attention has been paid to how to increase the numbers of underrepresented groups in the sciences. Predominant focus falls, as usual, on our education system and how K-12 institutions can change these disparities. Programs targeted at increasing youth exposure to science, technology, engineering and math (STEM) fields abound. But at what cost?

I, for one, when faced with the question “Are you a science person or an art person?” legitimately can’t answer.

Partly because I think it’s a meaningless distinction: an arbitrary label based on our limited understanding of people and how they think about the world.

But partly it’s because I consider myself both a science person and a humanities person. I love learning about what science tells us about the world and each other. But I also love singing, books, and writing. And it’s the combination of these that I find truly fascinating.

I think it’s why I love psychology and qualitative research so much. Those two fields marry the scientific method and the humanities in a way that is both logically appealing and creatively fascinating to me. The study of words, relationships, and people acknowledges the beautiful complexity of the human condition while also seeking to understand it more fully.

But if I had never had the opportunity to deeply study BOTH science and arts, I know I wouldn’t have ended up where I am today. If I had been labeled as a “science” person when I was young, just because I was a girl who happened to be good at math, I would have turned out a poor scientist. Similarly, if I had been labeled an “art” or “humanities” person because I happened to be a good singer and liked reading Jane Austen novels, I would have turned out a poor humanities scholar. I am neither one nor the other. I am both. And I know I’m not alone in that.

The abundant focus on STEM at the expense of the arts leaves us with an “either/or” scenario, not a “both/and.”

But I’ve found that really understanding the human condition requires both a willingness to acknowledge what science tells us about ourselves and a willingness to acknowledge that we don’t know everything. Not everything is as neat and quantifiable as science would like. Nor is everything always as achingly complex and mysterious as the arts might suggest. Reality is somewhere in the middle.

I think it’s unfortunate that we as a society are still trying to label our youth as one or the other and intrinsically placing more value on one (science) over the other (humanities). This does our youth a disservice because it does not foster their own intellectual curiosity. It perpetuates the tendency toward stereotypes, labeling and judgment.

In reality we would all be better off if we viewed this as a spectrum, acknowledging that everyone is at least a little bit of a scientist and a little bit of an artist. Scientists can be good artists just as artists can be good scientists.

Life shouldn’t be about whether you’re a “science” person or a “humanities” person. It’s about wonder, curiosity, passion, and creativity. Those are quintessentially human traits that have spanned generations and brought the sciences and humanities to where they are today: a better understanding of the world and the human condition.

So focus on developing children’s wonder, curiosity, passion, and creativity, and let them discover where their fascination will take them.

We’ll all be better off in the long run.


What do you think? How can we enhance interest in science while also teaching the value of humanities and arts?

Wednesday, October 16, 2013

Finding the Middle, Early

I read an interesting Huffington Post blog post today about the need for early childhood to be a major worldwide focus in order to ensure positive global development in the future.


The article points out that in the last 20 years, there have been major increases in children attending primary schools across the world. However, there has been less progress with enrolling children in “proven early childhood development programs” worldwide.

The authors then go on to discuss the importance of a strong early childhood to build a foundation for later healthy growth and development, and the links to a successful global workforce and economy. They outline the importance of healthcare, parental support, and basic nutrition as being major factors early in life that can substantially impact later academic, behavioral, and global economic success.

All of this is fairly straightforward to me. It’s the party line I’ve heard (and have generally subscribed to) from day one of my studies in child development.

However, something about this post smacked a little too much, for my taste, of the idea that there is only “one right way” to do something.

Though I certainly understand the benefits and value of early childhood supports and education, and fully support the need to expand resources for families at risk for highly preventable mental and physical health conditions, the idea that “proven early childhood development programs” are the one, and perhaps only, way to go makes me a little leery.
Especially when we take a worldwide view of the issue, focusing on early childhood programs, which in many places are nationally funded and/or monitored, seems a little short-sighted. Though I fully understand the importance of an evidence base in developing programs and interventions, I also know that science is one, but not the only, way of knowing something is good for kids.

Don’t get me wrong; it drives me nuts when people assume that they know everything that I know about child development just because they have kids. But it also drives me nuts when scientists claim to know everything about children when they’ve never worked directly with them outside of a laboratory setting one day in their lives.

There are generalizable realities in the human condition, to be sure. And science can help elucidate some of that generalizability.

But it is not, nor should it be, the be-all and end-all, largely because the field of child development has not studied everything in and across every world culture. There are nuances, differences, and beautiful variations in the ways humans approach and experience the world around them. Science has a tendency to trample on that beauty and chalk it up to “variations” or “outliers” that either aren’t important enough to explore or are just noise in an otherwise coherent data set.

This just in: Science doesn’t know it all.

And that’s okay. Frankly, it’d be a pretty boring world if we did know it all. Plus, all the scientists out there would have to pack their bags and go home if there was nothing else to learn.

My point here is that though science has incredible value and can shed light on some of the most complex things in our world, it is not the only answer.

“Evidence-based” isn’t, and shouldn’t be, the only way.

There are a lot of ways to do things right. There are a lot of ways to know the right thing to do. There are a lot of ways to get to the same positive outcome. ‘Because science says so’ is one of many valid reasons to do something.

I’d just hate to think that we’re getting to a place where nations institute a “one and only right way” to raise children, based solely on what science tells us. That mindset is limiting, and squashes the beautiful variations inherent in the human race.

Now, I don’t mean to be an alarmist. Nor do I mean to suggest that the authors of this article don’t recognize the value in other ways of knowing, or the variation in culture that may be lost in a more one-size-fits-all approach.

But I worry that there are people who may read this article, and who may subscribe fully to the belief that universal, evidence-based early childhood education is the only way to “correctly” raise children. And the more early childhood education gets entangled with state and federal policy and funding, the more this belief will become intractably instilled in our collective psyche.

But instead, let’s call this what it is. Early childhood education programs are a very western idea of child-rearing. And that's okay. But that doesn't make it suitable for the whole world. 

So I fear we are again getting into a situation where the West thinks it knows the “right way” to raise children, to the detriment of other less powerful, but equally vibrant cultures throughout the world that might raise their children differently.  

It’s a hard line to walk because on the one hand, there are certainly universal things that are very detrimental to the successful development of a child. Poor health and nutrition, traumatic experience, lack of family support and connection, mental illness, violence, the list goes on. And there are things that early childhood supports can do to help buffer against those negative early life experiences and enhance the lives of young children. Some of those supports are early interventions and quality early childhood care. Raising global awareness of these things is a laudable goal. 

But on the other hand, where do we stop? Quality is important, to be sure, but if certain cultural beliefs and practices are not in the “evidence-based curriculum” does the funding stop? Where does the western world draw the line on what’s an acceptable way to spend taxpayer dollars? What does it deem as “okay”? What’s defined as “not good enough”? Who decides?

These are the questions I fear we haven’t discussed enough.

And I worry that it will be the loud, powerful voices of western thinking demolishing all others in their path without careful consideration of what might be lost in the process.

And that sounds eerily familiar, and feels a little like Groundhog Day.


What do you think? Are we just repeating history all over again? Or will we find a middle ground, early?

Monday, October 14, 2013

On Failing to Give My Best, Every Time

Do you ever feel like you’re never giving things as much time as they deserve? As much time as you’d like to give them?

I feel this way ALL. THE. TIME.

(Especially right now as I realize it's been a MONTH since I last blogged.)

It’s partly the plight of being a bit (okay, more than a bit) Type A. Checking things off the list is sometimes the most important thing in my world, but only if I can check them off with a satisfactory exclamation of “Yes! I gave that my full and best effort, and could not have done it any better.”

It doesn’t even have to be a perfect product. I can live with mistakes. That’s just a part of the process. 

No, it doesn’t have to be perfect, but I just have to have given it my best, every time.

The problem is that my definition of “my best” is always very high. Sometimes unattainably so.

Part of the problem is that I think I should be able to get things done in a certain amount of time. Sometimes I’m right about how long something will take. Sometimes I’m horribly, horribly wrong. Then things get shuffled, pushed off, and neglected, to the point where, when I am forced to address them, I don’t have enough time to give them my best effort and attention.
Then the guilt sets in. Guilt that I didn’t give it my all. That I let other stuff get in the way of doing my best work, of living my best life. That I let “good enough” for other people be my “good enough.”

And that, for whatever neurotic, perfectionist reason, is unacceptable to me. That, my friends, is failure.

It’s not about failing to do something. It’s about failing to do it well.

And I fail to do things well all the damn time. And it drives me crazy.

But here’s the funny thing. To the outside world, I’m sure most everything seems fine. I still get the things done I need to get done. I’m still a productive worker and a “have it together” wife, daughter, and friend (most of the time). But inside, it’s a different story.

Where the sausage gets made is sometimes a guilt-ridden, imperfect, hideous mess.

And I know what you’re thinking, “give yourself a break, you’re only human.”

Yes. I am.

But the world is constantly throwing out messages of “do more, be better, you should be able to do it all, all the time, while keeping it together and being perfectly happy.” And frankly, I expect more out of myself than is probably reasonable, due in part, but not entirely, to these messages.

The other part of it is just me. Plain, old, failed me.

I’ve always been this way. The drive to do and be more, to give my best no matter what, is a characteristic I’ve always had. It's a large part of what's gotten me where I am today, no question.

A friend was recently reflecting on her own life and said that despite the perception of the outside world that everything is fine and dandy, sometimes what’s going on behind the scenes is a total mess. But that accepting and living in that mess is often the hardest part.

But living in the mess, where things seem to be falling apart, where nothing goes as planned, and where pity is a more common bedfellow than joy, well, that’s part of the process of life.

I realize I’ve got a lot of things in my life together. And there’s a lot of great stuff happening in my life right now. I’m thankful and grateful for all of the opportunities, successes, support, and just plain luck I’ve had in life.

But that doesn’t stop me from feeling like a failure when I can’t put 125,000% into everything I do all the time.

I know it’s not realistic to expect that, but here I am.

Does anyone else out there feel like this? Or am I, as I often feel, alone in my perfectionist-driven guilt-fest?


…and now on to the next task, to give it my all, and nothing less…

Friday, September 13, 2013

Why the brain is not enough

Recently a colleague passed along a really interesting article about prevention, neuroscience, and mental illness. (Find it here: http://chronicle.umbmentoring.org/prevention-neurobiology-and-childrens-mental-health/).

This article highlights some really interesting considerations when it comes to the role of brain science in treating mental illness. Particularly, given the recent federal granting efforts around the Brain Initiative, it’s interesting to think about the value placed on brain sciences, and how that has come to eclipse many other forms of knowledge.

The article’s author quotes a recent piece from the New York Times by Benjamin Fong, who writes about how psychology has come to fully embrace cognitive neuroscience, and the potential pitfalls of such complacent bedfellows. He highlights many of the arguments surrounding the pros and cons of the Brain Initiative, and astutely reflects:

“The real trouble with the Brain Initiative is not philosophical but practical. In short, the instrumental approach to the treatment of physiological and psychological diseases tends to be at odds with the traditional ways in which human beings have addressed their problems: that is, by talking and working with one another to the end of greater personal self-realization and social harmony.” (see full article here)

This got me thinking.

Isn’t it weird that we’ve gotten to a point in history where “fixing the brain” is perceived as the easiest way to treat mental illness?

It wasn’t always this way. Historically, and even as recently as 20-30 years ago, the brain was viewed as an enigma, a black box that could not be understood. And it was early neuroscientists who, to their credit, started pushing the limits of what we knew about the brain and advanced the field by leaps and bounds to the point where we as a society have come to place high value on understanding the role of the brain in our lives – arguably to a fault.

Historically, humans relied on each other and on home remedies to treat mental and physical illnesses. Trusted home remedies and medicinal traditions were passed down through the generations.

Care was kept within community.

And yes, with scientific discoveries and medical advancements, we found out that a lot of the things we’d been doing were just plain wrong. Some of the traditional methods of treatment were doing more harm than good. Anyone who’s been to a museum display of medical history can attest to this.

Scientific progress has brought humanity crucial advancements that have saved lives. There’s no doubt about that. But when applying science to the human psyche, things get a little trickier.

Traditional reductionist methods of scientific inquiry do not fit well when trying to examine human psychology. Our lives, and our understanding of them, are determined not just by our genetic makeup, but by our interactions with our surrounding environments. Those environments have the power to shape the expression of our genetics and physical biology.

It’s nature and nurture in a never-ending tango. Each taking the lead from time to time, in somewhat unpredictable ways.

The problem with psychology embracing cognitive neuroscience so fully and completely is that traditional, reductionist perspectives on how the world works don’t fit psychology well, because of its inherent complexity and unpredictability.
Unfortunately, recent scientific priorities have pushed us to favor quick fixes to change the brain, because the brain is easy. It’s an organ. It can be reduced to classic scientific fields of biology and chemistry; scientific fields that are easy (or easier anyway) to understand and manipulate in ways that science can predict.

And this is where Benjamin Fong’s point is well taken. The Brain Initiative has little practical value for treating psychological problems. Understanding how the brain works is great, but humans, despite being made up of chemicals, are not just chemicals. Humans are social beings who rely on trust and relationships with one another to survive. Without that social structure, relationships, and community, humans cease to thrive. Chemical treatments to fix deep psychological wounds created by the pain of broken communities and shattered relationships just won’t cut it. Those broken communities and shattered relationships will continue to do damage until the factors contributing to that brokenness are addressed.

This brings to mind a book called The Ghost Map, which is lauded in public health circles as a true triumph of human ingenuity and the scientific method in stopping a massive cholera epidemic in London, England. Instead of using a solely medical-model approach to find an effective treatment for everyone who was sick, and then accepting the fact that cholera would continue to be epidemic throughout London, the emphasis was placed on figuring out WHY cholera was spreading in the first place. What was it that people were doing that was contributing to the spread of illness?

And wouldn’t you know it, it was a lifestyle thing. The choices people were making about where to deposit bodily waste put it too near the local water supply, spreading the bacteria that causes cholera.

Because of this discovery, people living in community with one another changed where they deposited waste, to promote their own health, and subsequently stopped the epidemic. Instead of sinking resources and money into treating everyone who had the illness with medicine, one environmental change prevented the spread of the disease, halting the epidemic in its tracks.

They could have treated cholera with medicine alone – treating the illness.

But they wouldn’t have saved as many lives as they did by changing the behaviors and systems that were causing the epidemic in the first place.

Now, I’m not saying that some of our most pervasive and complicated societal problems, like poverty, violence, and racism, are as easily solved as figuring out where to deposit waste. Nor am I saying that all mental illness is attributable only to environmental factors.

What I’m saying is that the answers to the bigger problems with mental illness do not lie in the human brain alone. That the Brain Initiative, for all of its strengths and laudable endeavors, is not the silver bullet to end mental illness.

Because when it comes down to it, we can treat the symptoms of mental illness by altering brain chemistry all we want, but until we grapple with some of the bigger factors that contribute to mental illness, we’re only putting Band-Aids over the gaping wound.

Fixing the brain is easy.

“Talking and working with one another to the end of greater personal self-realization and social harmony” as Benjamin Fong suggests?


That’s hard, but it’s what needs to be done.