Modern Myths: Science vs. Religion

“Religious belief systems prefer a universe with mankind firmly at its center. No wonder Cosmos is so threatening.” This is the subheading to a recent article on the controversy over Neil deGrasse Tyson’s revamping of Carl Sagan’s famous Cosmos series, which began airing earlier this year on Fox. Alternet‘s Adam Lee examines the public outcry of many fundamentalist Christians over the show’s portrayal of the history of the universe; since the show is hosted by an astrophysicist and focused on the scientific exploration of outer space, it is unsurprising that the creation stories related in Genesis are not discussed. For those who insist on a literal reading of Scripture, of course, this is a thrown glove, an invitation to ideological combat. Lee, however, sees the issue in much broader terms: for him, this debate between scientists and fundamentalists is really the manifestation of a much deeper, indeed absolute, tension between science and religion as a whole.

Massive volumes have been penned on the idea that science and religion are locked in existential combat, and I have neither the space nor the expertise to go into detail here. A Google search or a perusal of Wikipedia’s article on the subject can provide a better introduction to the scholarly debate on this narrative than I ever could. The short summary of what I think you will find in those investigations is this: the idea that religion as such and science as such are locked in some unavoidable ideological war is, simply put, a myth–in the full meaning of that word. It is not only mythical in that this narrative is untrue in many respects (e.g. many scientists are religious, many believers fully accept science, and historically, a vast amount of scientific discovery has been achieved by people who were deeply religious and spiritual) but also in the more pernicious sense: this narrative is mythical in that it forms the backbone of a polemical stance that thinkers committed to a certain vision of modernity employ to discredit their opponents and give the impression that readers and listeners must pick a side in this great battle between progress and knowledge, on the one hand, and ignorant superstition, on the other.

But here, in the small space of a single blog post, I want to focus on one particular claim that Lee makes–let’s return to his subtitle: “Religious belief systems prefer a universe with mankind firmly at its center. No wonder Cosmos is so threatening.” Many readers will likely find this claim barely worth mentioning, because the assumptions behind it are largely accepted as obviously true. The uncontroversial nature of this claim only drives home how successfully the “conflict thesis” has been accepted in contemporary thought, for the claim is, theologically and biblically, simply untrue. What Lee is describing here–the idea that humanity is ontologically located at the center of reality–can be called anthropocentrism, an idea which is actually closely tied to Enlightenment humanism–not biblical religion. The assumption of human importance in the universe is the bedrock of social contract Liberalism and of the application of scientific knowledge to the development of industrialism through technology, all tied up in the modern assumption of historical “progress” towards brighter and better futures. But this view is simply not central to Jewish or Christian religious thought. Somehow, however, many people today–even highly intelligent and well-educated people–seem to think that Abrahamic theology is tied deeply to an anthropocentric vision of reality.

This modern confusion is more complex than a simple historical and philosophical misattribution, though. Anthropocentrism’s consequences are meted out to various ideologies in a specific and ideologically guided way. What we tend to see as the good aspects and achievements of an anthropocentric culture are attributed to science, technology, and liberal democracy, while the bad aspects or failures of anthropocentrism are attributed to religion or traditional culture. Thus, vaccines, air-conditioning, airplanes, computers, and the moon landings are all proof of the glories of scientific living, while the atom bomb, global warming, and the indignities of modern life are attributable to reactionary, unenlightened religious or tribal thought.

But the problem isn’t just that there is ideological cherry-picking here; there is also a mass of unexamined and baseless claims. There are few, if any, sections of the Bible that lend themselves to an anthropocentric reading. It is true that Jewish and Christian Scripture broadly claim that human life is purposeful and inherently meaningful–that God, the creator of all that is, somehow cares for us–but humans are by no means placed at the center of creation. Indeed, the Bible is a theocentric, rather than anthropocentric, text. God, not humanity, is at the center of the biblical universe, and it is only in relation to God that humanity can “cash in” its potential, so to speak. Far from being preoccupied only or primarily with the immediate material concerns of human beings, the Bible stresses that only in consciousness of and service to the Reality that transcends immanent being can humanity understand its true identity. Human life is recognized as fleeting and, in immanent and material terms, almost trite:

Consider, for example, Psalm 103:14-17 (note that I have left all gendered pronouns referring to God in place; the reader should not take this as my approval of such pronouns. All quotes are from the English Standard Version):

For [God] knows our frame; he remembers that we are dust. As for man, his days are like grass; he flourishes like a flower of the field; for the wind passes over it, and it is gone, and its place knows it no more. But the steadfast love of the LORD is from everlasting to everlasting on those who fear him, and his righteousness to children’s children.

Humanity is “dust” and “grass”, formless and fleeting: it is only by God’s continual creating and sustaining act that humans exist, and it is only in God’s loving act that we can have any hope. This same theme is present throughout the Bible; consider Isaiah 29:15-16:

Ah, you who hide deep from the LORD your counsel, whose deeds are in the dark, and who say, “Who sees us? Who knows us?” You turn things upside down! Shall the potter be regarded as the clay, that the thing made should say of its maker, “He did not make me”; or the thing formed say of him who formed it, “He has no understanding”?

Here again, any sense of human autonomy from the Ground of Being and Becoming from which humanity sprang is quashed–only through understanding of the meaning of existence, which is objectively determined apart from humanity, can the particular human harmonize themselves with the reality in which they live. Again, humans are not the center here, but rather a periphery offered meaning and importance precisely to the degree that they conform themselves to the Center, which gives them being in the first place. The writer of Deutero-Isaiah continues this theme, and even the same metaphor, in Isaiah 45:9:

“Woe to those who quarrel with their Maker, those who are nothing but potsherds among the potsherds on the ground. Does the clay say to the potter, ‘What are you making?’ Does your work say, ‘The potter has no hands’?

Job also takes up this theme of human impermanence and seeming unimportance, even demanding that God leave him alone to enjoy what little passing pleasure he might:

“Man who is born of a woman is few of days and full of trouble. He comes out like a flower and withers; he flees like a shadow and continues not. And do you open your eyes on such a one and bring me into judgment with you? Who can bring a clean thing out of an unclean? There is not one. Since his days are determined, and the number of his months is with you, and you have appointed his limits that he cannot pass, look away from him and leave him alone,
that he may enjoy, like a hired hand, his day.

This lament, the opening of chapter 14, is later met with God’s reproof in the closing chapters, which affirm humanity’s determined and circumscribed existence while maintaining both God’s transcendent Otherness and sovereignty. Again, there is little room for a hubristic, anthropocentric reading here.

The Christian New Testament leans heavily on these images, continuing the insistence that the meaning–and indeed salvation–of humanity can come only through the human’s willingness to recognize and follow God, not through any human action itself. Thus James warns his comrades:

Let the lowly brother boast in his exaltation, and the rich in his humiliation, because like a flower of the grass he will pass away. For the sun rises with its scorching heat and withers the grass; its flower falls, and its beauty perishes. So also will the rich man fade away in the midst of his pursuits.

Here, even the rich person who seems in control of his or her life, with independent means and social power, is revealed as limited and contingent: the process of becoming will roll over them just as surely as it will over any other particular being in the world; death comes for all life. Similarly, the first letter of Peter directly quotes Isaiah’s nearly identical image of withering grass and fading flowers (Isaiah 40:6-8). And we have only skimmed the surface of this theme’s presence throughout Scripture: a number of other Psalms (e.g. 22, 90, 92) as well as Ecclesiastes explore the theme in greater breadth. But let’s not get carried away with quoting the text–I hope the point has been made.

This, of course, does not mean that religious people are never anthropocentric in their thinking–but it does suggest, and I would say decisively so, that the source of this anthropocentrism is not their religiosity, but rather the humanism that informs modern social thought. And here lies the interesting yet often unexplored tension within modern life. For it is modern social thought–social contract Liberalism in the Locke/Rousseau vein–that collides headlong with the scientific determinism of the 19th and 20th centuries–especially the Behaviorism of, say, Skinner. The idea that human beings are largely determined and powerless in an often hostile universe is not at all threatening to a biblical view of the world–as we saw above, the Bible itself repeatedly asserts this very fact! It is humanism that finds this view of the world unacceptable and oppressive, for it suggests that humans, despite all our inventive cleverness and power, are ultimately unable to liberate ourselves from our material constraints. This would suggest that it is the largely unspoken yet ubiquitous classical Liberal view of humanity that, paradoxically, leads fundamentalist Christians to so vociferously reject scientific claims that seem to challenge an anthropocentric view of the universe, whether the Big Bang or evolutionary theory.

This will likely strike many readers as an odd claim, but, as counterintuitive as it is, I think it’s a much more accurate reading than the credulous pigeon-holing that Lee employs in his Alternet article. Fundamentalism, after all, is a religious movement that came about precisely to counter the rise of a robust science in the 19th century; though its roots can be seen in resistance to early historical-critical work on the Bible, it was not until geology and then biology undermined traditional readings of the creation stories that a full-throated ‘fundamentalism’ arrived on the scene. But this reaction already shows a major shift in the reading of Scripture, for creation stories that had been meant to point to the mystery of creation had been bent–under the guiding rubric of humanism–to instead provide a firm basis for absolute human knowledge of the physical world. It is easy to forget that the basis for Evangelical and fundamentalist theology was and is (primarily) the Swiss Reformation–led by John Calvin, himself an avowed humanist. It was humanism that led to the idea that the Bible could be simply and directly translated–freeing the laity to interpret the text on their own, without ecclesiastical guidance (or interference, depending on one’s view). Although this certainly gave lay Christians more direct access to the Bible, it also meant that this highly diverse and ancient text was divorced from the cultural, philosophical, and hermeneutical context in which it was written–leading easily to a literalistic mode of reading in which the vast majority of the significance of the text is lost. This is, I think, precisely how the theocentric Bible can come to be read in a highly subjective, self-indulgent, anthropocentric mode.

Thus, the real conflict here is not between religion and science, but between humanism and science. This seems paradoxical, precisely because we are so used to thinking of science as both the product of and servant to the humanist project: and again, medical, transportation, and information technology breakthroughs fit this narrative well. The last two centuries have witnessed an incredible surge in human knowledge, and that knowledge has led to human manipulation of the material environment in ways that have expanded human lifespan and quality of life (at least for those lucky enough to live in “developed” nations and of sufficient class status). This is an obvious historical fact. But of course, this same knowledge has also had its dark ramifications–not only in the development of military technology and the various negative “side effects” of industrialized society (pollution, rising levels of obesity, displacement of traditional culture, etc.) but also in the increased recognition of human beings as more object than subject. The Cartesian/Kantian view of human personhood as based in transcendental reason has collapsed. Humans now are immanent things, objects of science. This has resulted in massive medical breakthroughs–for only as a physical object can the human body be studied and healed scientifically–but it has also raised profound and unsettling questions: can an object have intrinsic ethical value? When a person was understood as an immaterial soul, the separation of value and fact was not fatal to the maintenance of ethical realism. But if a person is no more than a body, how is a robust ethics to be maintained?

Herein lies the collision of what we might call “early” and “late” modernity: Locke, Rousseau, Descartes and even Kant inhabited a world of transcendent souls animating material bodies. Since the 19th century, this view of humanity has been challenged to the point of obsolescence: yet it remains the basis of our social thought. This philosophical view is necessary so long as we want to maintain modern liberal values of freedom and human rights, as well as the sense of human dignity, because there is no scientific basis for such claims. One cannot show, empirically, the existence of human rights. Nietzsche recognized that the collapse of transcendence meant the end of the old ethics just as much as it challenged old religious views of Divinity; Heidegger too saw the implications for classical Liberalism–and post-structuralist and postmodern thought has continued to drive home these conclusions. Yet, modern society remains caught in the limbo between humanist social thought and scientific determinism. It seems to me that the resistance to evolutionary thought and an “old” universe is as much about salvaging a sense of meaning for human existence–any meaning, religious or otherwise–as it is about defending a particular religious position. Fundamentalist Christians strike me more as humanist canaries in the coal mine than as real countercultural traditionalists. And of course, they are far from the only people frightened by the conclusions of modern science and its manifestation in technology; lamentations of consumerism, the banality of industrialism, and the “disenchanted” nature of modern self-consciousness are common, especially among the artistic Left. What is perhaps unique about fundamentalists, however, is a recognition, even if it is pre-critical, that there is no “going back”–once one accepts the scientific view of the world, the realm of value and meaning seems forever lost.

I am not suggesting that we side with the fundamentalists in their angry, anti-intellectual assault. I am, however, suggesting two things that I think Lee–and many others–have missed. First, fundamentalist angst is primarily humanist, not religious. Second, fundamentalists aren’t wrong to sound the alarm–their outbursts point to a central contradiction in modern life, the gap between our social/ethical and scientific/technological endeavors. The haughty, cultured dismissal of fundamentalist fear only proves the uncritical complacency of many pundits and commentators. In the final analysis, if we are going to affirm science and ethical realism, we will need a new synthesis, a new way forward, that is not dependent on the transcendental individualism of humanistic classical Liberalism. The alternative to developing such a new way forward can only be the ethical relativism that post-structuralism and postmodernity promise. I would argue that such a synthesis can actually only come by reaffirming the theocentric view we briefly touched on above–but that’s a topic needing its own post.

Epistemology and the Dialectic of Hope

So, in my last post, I promised that with the end of schoolwork, I’d be posting more here. That was three months ago, and I’ve been revealed as a liar. I spent this summer, for the most part, preparing for the dreaded GREs, and spent the rest of the time enjoying the green and quiet of Richmond, which, after the urine-soaked madness of Manhattan, I clung to with a nearly religious vigor. School will be starting again in about a week, and I will try to learn from my previous hubris and make no promises to post more often. Part of the problem is that, whereas in the past, theological musings would often lead to a post on here, now I immediately wonder if such musings should be turned into an academic paper. So being a (wannabe) academic ends up nipping my blogging impulse in the bud.

However, after coming across Paul Burkhart’s blog, I’ve been blog-invigorated, and want to “enter the conversation” (to use a trite but useful expression). Although my guess is that Paul and I may not see eye to eye on everything, I’ve found his posts extraordinarily thoughtful and thought-provoking, and am glad to see another Christian who is simultaneously concerned with orthodoxy and systematic thought. It was some comments on a joint post he did with an atheist writer that prompted me to write here and now. A somewhat trollish commenter dismissed Paul’s faith, and his defense of it, rather summarily. The comments (by the user “meat”, whose gender is unclear, so I will refer to them with “they” and “them”) can be seen at the bottom of this post (I should note that I don’t particularly agree with much of what Paul had to say in the post itself, but that’s immaterial to what I want to talk about here).

Essentially, meat argued that Christianity was a priori indefensible. They seem to think that, if one simply analyzes Christianity according to a given set of historical, metaphysical, and existential methodologies, one should conclude without controversy that the faith is false. Fair enough, there’s nothing irrational about this. Perhaps Christianity is false. But meat went a step further, and suggested that such an approach is akin to “set[ting one's presuppositions] aside and then make [one's] determination from naught”. In other words, meat seems to think that their preferred methodology rests on no biases or presuppositions, that they are operating from pure reason alone, like some 21st century Kant. So, according to meat, Christians assume a host of problematic presuppositions, but meat does not; their approach is Reason Manifest.

Of course, the reality is that such a hermeneutic of pure reason is impossible; human beings always process data according to some pre-arrived-at set of assumptions. And this isn’t a bad thing, because otherwise, we could never draw any conclusions from given data or experiences. Each new moment would be completely unique, a new instance of pure becoming, which we could not link to any previous data or experience. In order to make sense of existence, we have to draw connections between past and present, and that means emphasizing some data over others, and making assumptions about how the world functions. I don’t doubt that if I am holding a brick in my hand and then release it, it will fall. I assume gravity will function–and such an assumption seems well-validated! But it’s an assumption, nonetheless. There is always a chance, however slim, that gravity might not function in this new moment. Yet few (if any) of us actually live our lives open to the practical possibility of such unpredictable world-states.

Such an assumption about gravity probably seems rather innocuous, but we engage in a similar sort of assuming in all of our critical reflection. Whenever we evaluate a historical claim, for example, we try to fit that claim into an already-existing body of historical knowledge and assumptions. Critiques of Christianity are keen on pointing out that Christians certainly do this–and they are quite right, we certainly do. However, they are often not so willing to admit this about themselves. So, meat is adamant that Paul should be aware of his having been brainwashed as a child: “I’m being serious when I discuss childhood brainwashing, it takes a lot to overcome and yet you seem fully aware of your being affected and simultaneously unable to set it, which is the reason for your presuppositions, aside.” Paul has assumptions that he should question, but meat doesn’t seem to think that they have assumptions they might need to put aside.

A great example of how this functions is the Historical Jesus Movement, which indeed Paul mentions by way of a Ross Douthat column (I’m generally not a fan of Douthat, but I think he hits the nail squarely on the head here). The Historical Jesus crowd assumes a boatload of foundational ideas about how history works, what is possible, and what Jesus’ life could or should mean, and then crafts a version of Jesus’ history that fits these pre-arrived-at (and ideologically-entangled) views. Of course, this in and of itself is fine, but what’s problematic is that they seem to think that their image of Jesus is just the correct, historical one, not an ideologically and methodologically-tinged one. My opponents have biases, but not me! I’m honest and open-minded!

Now, I want to be clear: I am not arguing that while meat, Reza Aslan, and D.F. Strauss are chock-full of bias, I am some bias-free machine of logic. We all have biases and worldviews, and, as I suggested above, this is good! Worldviews are our attempts to make sense of the world; without them, our experience would be a disorganized jumble of sensations. Seeking truth means taking the risk of ordering, and oftentimes being wrong. We have to order, even while admitting that our ordering will likely be wrong, and therefore be prepared to correct our ordering, to try again. This entails constant risk-taking, constant vulnerability to past positions being revealed as erroneous. We are always ready, indeed gleefully so, to point out when our opponents fall into error. We seem less prepared for our own eventual failures on this score.

In other words, everyone in any debate has entered with presuppositions, assumptions, biases, ideologies, and worldviews. Someone who tries to argue that they are working from no presuppositions but just reason and logic is lying either to themselves or to you, quite possibly both. Human knowledge always implies some system of ordering information, a complex set of rules for how knowledge can be received and validated. Theologians can’t deny this about themselves, but neither can their critics. We are all biased. Indeed, even the reliance on reason itself reveals a bias: that the world is indeed an orderly place governed by a strict causality, laws that are fully laws (not just rules) that are wholly consistent and homogeneous through space and time, and that humans possess the capacity to discern all necessary truths about the world to understand it. This is actually an incredibly credulous package of assumptions, which not only theologians but indeed the whole of continental philosophy (and Hume, and classical skeptics and the Cynics…) has called into question. So even reasonability itself implies a very specific worldview, a set of assumptions and commitments which should be open to question, not relied on as a prima facie foundation for all thought.

Indeed, humans are not just logical agents, but beings-in-the-world who are constantly affected by their emotional and physical needs and urges as well as by their capacity for reason. In fact, this capacity is often overwhelmed by the former two sets of motivations. Even people who value reason highly still have to navigate their own existential, emotional reality. And so, having discussed the ridiculousness of any debater ignoring their own biases and worldviews while pointing out their opponents’, I’d like to discuss one of the most fundamental worldview-determinants. The post that spawned meat‘s comments was a sort of back-and-forth between Paul and an atheist friend, Dan. Dan seemed interested in plumbing Paul’s reasons for maintaining his Christian faith, and Paul presented a number of scenarios which would cause him to question that faith.

For me, though, the whole premise of this debate/discussion is deeply problematic. I don’t think it gets at the fundamental set of assumptions that really fuels people’s belief or non-belief. Ultimately, physical or historical evidence for Jesus’ Resurrection, or metaphysical reason applied to the Event in the abstract, are not the foundational causes for belief or non-belief. We enter this debate, as we do all debates, already carrying a set of presumptions, assumptions, biases, ideologies, and worldviews, as stressed above. The critical issue in our existential e/valuation of a claim like the Resurrection lies extremely deep in us as persons. It is, in an important sense, pre-rational (on the personal level).

Developing into a person–not just a human being, but a full person–is, above all else, the process of developing self-consciousness. We have been thinking, feeling beings for years before we begin to have any awareness of the fact that we think. I can feel hungry without being aware that there is a process of feeling hungry. There comes a point when a thinking being recognizes that it is not just a detached subject taking in sensory data about an alien world, but actually also an object-body in that world. My thinking is something that my brain/body complex does. I am an object in the world. And this means I am vulnerable. I can die. I can be harmed. I can experience pain. These possibilities, I come to realize, are not just events that might occur in the process of my thoughts and feelings–they will or will not occur depending on what happens to my mind/body complex. I don’t have control over my own future, at least not in a final sense. I am contingent.

Such a realization is the beginning of existential reflection. It is the occasion of a sense of self. I have to navigate my own existence in the world; I am not just a sovereign consciousness sensing things. My future is tied up with the future of the world: my world. An awareness of my vulnerability introduces a new dialectic to human thought. Though it may have existed in embryonic form before, the rise of self-conscious realization forces it fully into our awareness. I have taken to calling this the Dialectic of Hope. Once we recognize ourselves as beings-in-the-world, we recognize that we have a future, that we will (or at least might) experience new thoughts and feelings in the future, and that what we will experience is largely going to be determined by forces out of our control. We, at this point, cannot help but feel hope: we hope that our future will be pleasant, pleasurable, peaceful, fulfilling. And we fear that it will not be, that we will die, that we will suffer, that we will be unfulfilled, that we will experience ugliness. This is the Dialectic of Hope. And our expectations along this dialectic–whether we are more likely to trust in hope or not–will greatly influence our credulity vis-a-vis claims like the Resurrection.

An essentially hopeful person will find the story of the Resurrection, at least some tiny kernel of it, reasonable, possible, and meaningful. The Resurrection is the vindication of self over other, of life over death, of subject over a deterministic object-order. Conversely, someone who is predisposed towards non-hope or fear will likely find the event unreasonable, impossible, and meaningless or even deceitful. What’s important here is that such a stance of hopefulness or non-hopefulness is brought to the event prima facie. In other words, no one simply evaluates the historical data about the Resurrection in a cool, detached fashion. Everyone has an axe to grind, a dog in the fight, because everyone is already committed, I believe, to a stance of hopefulness or non-hopefulness (I don’t think this is a binary, but rather a spectrum strung upon the dialectic; two people could both be hopeful in general, with one more, and the other somewhat less, hopeful).

Being or not being hopeful is, in and of itself, not some totally independent position. Obviously, our previous life experiences, our understanding of our family’s, ethnicity’s, nation’s, and species’ history will help to form our sense of hopefulness or non-hopefulness. But, that said, in each moment of evaluation of a given event, our stance of non/hopefulness is a prima facie stance that will color our evaluation. So, in an important sense, faith in Christianity is, even before faith in the Resurrection, a willingness to hope, in general.

Of course, this neither proves nor disproves the Resurrection; I am not making any objective claim about the truthfulness or lack thereof of the Resurrection claim. I am pointing to the hermeneutical and epistemological bases from which all of us–believers and non-believers alike–make our evaluations. There is no neutral ground; there is no pure reason. There are only living, self-aware beings with complex histories struggling to understand, to live, and to thrive. Too often, modernism’s static, lifeless, narrow epistemology is asserted as some sort of necessary starting-ground for serious thought. But such a starting-ground already rests on a mountain of assumptions. It may turn out that such assumptions are correct–but serious philosophical, historical, and metaphysical reflection demands a willingness to analyze, critique, and deconstruct them. The idea that only those thoughts consonant with a given framework of modern thought are even worth considering is itself intellectually naive and embarrassingly credulous.

A Response to Michael Klare’s “The Race for What’s Left”

I haven’t posted in weeks–with final exams and papers taking up my time at the end of the semester, I really couldn’t spend any time blogging. And indeed over the whole semester my output was quite restricted. But the semester’s over and I hope to be back to blogging form for the summer. Over the last two days I read Michael Klare’s The Race for What’s Left. Klare describes the scramble for fossil fuel, mineral, and even agricultural resources that we are seeing in the 21st century, as old sources–often still drawing on wells and mines from the 50s and 60s–are drying up. He describes in detail the various energy and mineral substances that modern industry needs and how companies are trying to meet demand that is rising ever faster today, in part due to China’s and India’s economic growth. Companies like BP, Exxon, Rio Tinto, and others can obviously profit by supplying this demand, and are struggling to find new sources of energy and minerals in particular that can be extracted economically.

In the energy sector, this demand is not only leading companies to seek new supplies of oil, coal, and gas, but also to develop “unconventional” sources, such as tar sands, shale oil, and oil shale (these last two are, it turns out, actually quite different). Klare describes these efforts in chapter 4, after revealing how the Arctic, in particular, is becoming the site for conventional oil and natural gas extraction (in chapter 3). Chapters 5 and 6 outline the mining industry’s efforts to maintain supplies of both relatively common minerals, like copper and iron, as well as more exotic ones, including gold, platinum, and the buzzworthy rare earth minerals. It’s this discussion of the rare earth minerals that, I think, is most crucial to the book as a whole.

After discussing agricultural land-grabs, especially in Africa, in chapter 7, Klare concludes the book in chapter 8. He calls for an end to the “race for what’s left”–which he sees as fueling military conflict, global warming, and ecosystem poisoning–and calls instead for a “race to adapt”. Klare argues that the old industrial order is coming to a close, and that the future will be defined by those countries and companies that can develop new, clean technologies (pp.227-234). Klare seems to think that these two “races” are totally distinct, that working to develop future technologies is somehow totally divorced from the old industrial approach.

The problem is, of course, that even supposedly green technologies, like photovoltaic solar cells and wind turbines, though they may be able to wean us off fossil fuels (at least in theory), still require huge natural resource inputs. Klare himself discusses this in chapter 6, as mentioned above: those rare earth minerals are crucial inputs for “green” technology. This is important, and worrisome, for two principal reasons. First, rare earth minerals are, at least when compared to minerals like copper and iron and as their name suggests, relatively rare. Klare presents US Geological Survey data from 2001 suggesting that total worldwide reserves amount to 114 million metric tons. The question is: is this a lot, or a little? That of course depends on how much we need to develop and produce various technologies. Smartphones, for example, require rare earths and other critical metals such as gallium and tantalum, but only in tiny amounts–far less than an ounce per device. So even if we manufacture billions of smartphones, it seems unlikely that we’ll run out of the necessary minerals.
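To make that back-of-envelope reasoning concrete, here is a minimal sketch of the arithmetic. The reserve figure is the one Klare cites from the USGS; the per-phone mass and the handset count are illustrative placeholders of my own, not sourced data.

```python
# Rough sanity check of the smartphone claim above.
# Reserve figure: the 114 million metric tons Klare cites (USGS, 2001).
# Per-phone content and handset count: hypothetical placeholders, not sourced data.

RESERVES_TONNES = 114_000_000      # cited worldwide rare earth reserves
GRAMS_PER_PHONE = 1.0              # assumed rare earth content per phone ("far less than an ounce")
PHONES = 10_000_000_000            # assume ten billion handsets manufactured

demand_tonnes = PHONES * GRAMS_PER_PHONE / 1_000_000   # grams -> metric tons
share = demand_tonnes / RESERVES_TONNES

print(f"{demand_tonnes:,.0f} t needed, {share:.4%} of cited reserves")
# ~10,000 t, i.e. under a hundredth of a percent of the cited reserves.
```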

However, other technologies require much larger inputs. Even a relatively light electric-vehicle battery pack requires many pounds of lithium. Luckily, lithium is relatively common compared to materials like gallium and tantalum; nonetheless, it’s not clear whether there are global supplies sufficient to replace all existing vehicles with electric ones. And when we look at photovoltaic cells and wind turbines, the problem seems even more acute. These require substantial amounts of some of the rarer minerals, and a “green energy future” would require millions upon millions of turbines and solar panels. I’m not saying that we are sure that we won’t have enough rare earths to manufacture these–I am saying that no one seems to know whether we will.

The second main concern around this is that the extraction of rare earth minerals, so necessary for “green” technology, is itself extremely energy-intensive and environmentally damaging. The rare earths themselves have to be separated from their ore, which involves the use of powerful acids, which, once used, are simply discarded as tailings in massive reservoirs. If these tailings were to leak into water supplies, the impact on human health and the environment could be catastrophic. In short, their reliance on such minerals means that even supposedly “green” technologies aren’t at all environmentally friendly or neutral. They may be better, overall, than the burning of coal or petroleum, but they still come with vast health and environmental risks.

In other words, even the brightest technological future seems dimmed by serious concerns over scarcity and environmental degradation. Klare rightly laments the scramble for resources, but then, rather incredibly, announces a highly optimistic confidence that future technologies can deliver us from these problems, if only we invest now. The conclusion of the book reads more like an ad for a solar-panel or lithium-ion start-up than a sober, well-thought-out assessment. The reality seems to be that even the best-case scenario for the future of technology is rather bleak. Klare seems to hope–and believe–that with sufficient research, we could develop technologies that use either no rare earths, for example, or so few that we can limit the economic and environmental risks discussed above. While one can’t dismiss that possibility, right now such a future is by no means guaranteed. The reality seems to be that the development of green technology will lead to different resource extraction than previous technologies, with different economic and environmental risks–but still plenty of economic and environmental risks. In short, new technology is not going to solve the conundrums that old technology poses. This doesn’t mean that the only ethical response is neo-Ludditism, but I do think it’s important that we’re honest about the economic and environmental realities we face. Klare, as far as I can see, gets the diagnosis of the current problems right, but in his over-enthusiastic plugging of technological research, falls into the very error of those he was just criticizing.

On Mountains, Trees, and Concrete


View, looking west, of the ridge north of Chungju; taken from Gye Myung San, NE of Chungju itself.

Two years ago today I lived in Korea–in a small city named Chungju. Almost exactly in the middle of South Korea, it is flanked on its north, east, and south with mountains. And I mean immediately flanked–I could walk out of my apartment building and be hiking within 10 minutes. This was one of my favorite things about living in Korea: I would generally go on a hike every week or so; it was basically a mini-vacation from work and the city that I could take whenever I wanted. I could spend an hour up there, or a whole day. I could get into the silence and be among the trees and really get into nature without having to plan it at all. It was glorious.

One year later I was back in Richmond, VA. There aren’t any mountains near Richmond, but the city itself is full of parks and has the James River running right through it–and a state park along both banks of the river. So, if you live near downtown, you are never more than a few miles from a forest. It’s a small forest, granted, but an honest-to-God forest nonetheless, and there’s a large, unoccupied island, and a river full of smaller islands–all covered with trees. Again, living there, I could just walk out my door, hop on my bike and be out-of-the-city–while still in the city–in just a few minutes. And again, I would avail myself of this resource at least once a week. Indeed, last summer I was out in the parks or at the river almost every day reading or just walking.

Now I live in Manhattan. From the beginning I was less than enthusiastic about living here–it’s so densely crowded, almost every inch covered in concrete (which is itself often covered in dog shit or garbage–or both). Every direction you walk, there are people, and more people, a never-ending crowd of people. There’s nothing wrong with these people, but there are just too many of us packed into one space. The few available strips of trees or grass are (a) also covered in people, because these bits of land are such a rare and valued commodity, and/or (b) right next to the West Side Highway, and thereby very loud, un-soothing places.

I had noticed, when I first arrived, that the relative lack of really large stretches of virgin nature was stressing me out. Over the winter, I had decided that I had gotten over this and adjusted to Manhattan life. But now, with the trees blooming and the grass stretching up to the daily-higher sun, I’m itching to get back out into the fresh air. The problem is that there really isn’t any fresh air here to speak of.

But school will be over in a few weeks and I should be back in Richmond in June. Although I’ll be back up here in August as school starts again, I’m already looking forward to some quiet streets and some forested places not filled with people, to the smell of soil and the sound of cicadas. I’m not writing this to complain about Manhattan–I know lots of people love this place. But I’m really feeling my separation from nature these days and noticing how it’s beginning to depress me again. I’m ready to take a break from this city.

Also: hopefully, once school is over I will actually start to write on this blog with some regularity again. These past few months, school has pretty much monopolized my brain power; but I’m hoping to post about twice weekly over the summer. So if you like reading about systematic theology, congratulations! The internet will have slightly more of that starting in June.

Scott Lipscomb:

This is an absolutely wonderful article. After recently reading about the Sandberg/Slaughter debacle, I found myself intensely frustrated that this sort of capitalist apologism was being passed off as feminism in the 21st century. I was working myself up to post about it myself, but I think you really hit the nail on the head–by focusing on what racism, sexism, classism, etc. etc. are all methods for achieving: power. The problem with opposition movements–reformist or revolutionary–is that once individuals or institutions actually acquire power, they are unlikely to want to give it up, whatever their ideologies. This is perhaps even truer for the sorts of people who get involved in politics and even organizing in the first place. But instead of focusing on a critique of the fundamental issue at hand–how we humans manage the possession and application of power in societies–mainstream feminist discourse begs the question, assuming that of course power will be vested in the hands of a very few–so let’s at least make sure a few of them are women.

So the fundamental question is whether we actually trust powerful women like Sandberg to fight the good fight and advance the cause of real racial, class, and sex equality. Who, with a straight face, can actually answer in the affirmative? Women are just as capable of cynical, self-serving power-wrangling as anyone else. Cleopatra, Queen Elizabeth, Indira Gandhi–did any of these women usher in some era of grand liberation? I don’t raise these examples to suggest that any of these women was particularly evil or a failure, but rather to suggest that women in power are not intrinsically different from men in power. And I do think that here we run into a rough spot with feminist analysis; the reality of patriarchy is de facto: it doesn’t mean that men in power sit in their oak-lined offices all day thinking, “how can I help my fellow men get *even more* power over women?” Power is, I think, assigned to men because that structure is the most efficient way for men at the top to maximize their own power, and for a host of economic and social reasons.

Likewise, I don’t think that most women in power sit in *their* oak-lined offices wondering how they can advance the cause of women. I’m sure some genuinely do, but I think the vast majority sit in their leather-upholstered chairs wondering how they can maximize their *own* power, wealth, influence, and prestige. If helping some women, or all women, will serve those ends, then, sure, they will. If, on the other hand, blocking paid sick leave for female workers will help their chances in the next election, then…guess what? That’s what 99% of women will do, because they are just as human as male leaders.

Originally posted on tressiemc:

This is one of those posts that can go nowhere but down.

There are things you simply cannot do in this life and slaying unicorns is one of them.

What do I mean by “slaying unicorns”? It’s an old Livejournal term. It means providing evidence that one’s sacred emotional belief or object is either not a) universal b) all that great or c) grounded in reality or supported by empirical evidence.

I am really, really bad about this. I tend to slay unicorns even when I only mean to make an observation or intend to honor my own truth or even when I just mean to get through the day. I end up slaying unicorns way more than I’d like. My hands are filthy with their rainbow blood.

So, I wanted to leave alone The Atlantic article about women having it all.

An initial tentative reaction about not seeing my…


Profiting Non-Profits: The Capitalization of Charity

Pallotta has written a book–Charity Case–about his vision for marketizing the non-profit sector.

I just came across a TED talk by Dan Pallotta entitled “The way we think about charity is dead wrong”. Pallotta essentially lines up a criticism of traditional non-profit culture by comparing it to for-profit business models. He emphasizes that all the tools that for-profit businesses have–advertising, high salaries for CEOs and other decision-makers, investment capital, etc.–are essentially unavailable to non-profit organizations. Pallotta outlines a reform proposal: non-profits need to think and act like for-profits if they are going to succeed. He points to a number of large campaigns in which aggressive marketing resulted in vast donations being given to a variety of causes (his two examples were races and rides for AIDS and breast cancer research). For Pallotta, the future of charities lies in using the tactics and tools of the business world to make non-profits more competitive and successful in securing funding.

Such a view of non-profits seems totally consistent with the culture of the Technology, Entertainment, Design conferences (TED). TED’s talks seem to focus almost exclusively on technological and market-based solutions to the world’s problems, with a heavy dose of self-congratulation over the successes of “social innovators”. Missing from every TED talk I’ve ever seen is any discussion of the extent to which technology and neoliberalism themselves are parts–perhaps even core parts–of the very problems TEDers seem so resolved to solve. Pallotta can stand up on stage, pointing out that most MBAs in the private sector make $400k a year 10 years after college while most non-profit CEOs make half of that or less, and not recognize that incomes of this level are one of the driving causes of the very problem of poverty he supposedly wants to combat!

In other words, instead of recognizing that capitalism is largely the source of the evils that non-profits try to fight against, Pallotta only notices the material successes of private business and figures non-profits will have to adopt their tactics to be successful. But what does successful mean here? Pallotta focuses solely on raising funds–but the important issue is what those funds actually go to support. It’s certainly great–and impressive–that, as an organizer of public campaigns for non-profits, he was able to raise hundreds of millions of dollars. But how much of that money actually went to the causes it was given for? Pallotta himself admits that such tactics may lock up at least 40% of funding in overhead (though he hates this word and is clearly annoyed at its prominence in discussions around non-profits). But he goes further, suggesting that it should be no great scandal if a non-profit were taking in funds for six years without distributing any money to the causes it was championing. Pallotta is arguing that non-profits should adopt such an advertising-heavy, market-focused, infrastructural approach that their overhead would be 100% for years on end.

Such an approach would not only probably quash the source of donations–what giver wants to fork over their hard-earned money to help March of Dimes develop a slicker image?–but would also likely invite irreversible mission creep. An organization that becomes nothing but a literal self-promotion machine is not going to be able to turn from that course and capitalize, as it were, on its popularity, and start directing funds towards the charitable causes it ostensibly would be assisting. An organization that was trying to compete with for-profit businesses for market-share and advertising attention would never be able to stop playing that game.

In fact, though Pallotta seems to think his suggestion is totally novel and innovative, many are already criticizing a host of non-profits for falling into this mode of operation. The film Pink Ribbons, Inc. highlighted this trend by calling attention (among other things) to the fact that breast cancer awareness has become more of a public advertising campaign than an actual research- or care-funding community. A huge number of businesses engage in pink ribbon advertising while actually producing goods or services that are carcinogenic(!). So while awareness of the reality of breast cancer may be expanding, the adoption of for-profit tactics–and allies–seems to have totally compromised charities like Susan G. Komen for the Cure.

But my concerns with Pallotta’s approach run much deeper. As I discussed a few weeks ago, the logic of capitalism is itself the crucial issue at hand. Instead of seeing poverty, for example, as a problem that strikes us out of the blue, and one which can be combated by implementing market-oriented strategies, we need to recognize that the behavior of market-oriented firms and individuals is a huge cause of poverty in the first place. The old saying goes, “fight fire with fire,” but of course if you are actually fighting a fire, you need to use water. More fire won’t help. Likewise, trying to combat the collateral damage of capitalism with more capitalism is a pointless, even tragic endeavor. That smart, committed people like Pallotta suggest such solutions only underscores how deeply the logic of capitalism has penetrated the core of both our public and private cultures and consciousness.

I would suggest that it is this reality–that we increasingly lack the capacity to think outside the box of neoliberal economic and social assumptions–that is the really pressing issue. If we can’t develop a new political ethics and build a society committed to environmental health, economic equality, and functioning communities, then all the non-profit strategies in the world are next to worthless; the need for help with medical costs, housing, and even just food is already skyrocketing here in the US, and as income and wealth inequality continue to soar, this will only be more true. Using corporate tactics to expand non-profits’ market-share in such an environment is akin to someone with breast cancer smoking a pack a day, hoping that the lung cancer might fight the tumor in their chest. It’s ridiculous, pathetically so; it’d be Quixotic if it weren’t so pathologically frightening.

At a time when humans need to fundamentally readjust their understanding of their own personhood, their place in human societies, and their place in the broader ecosystem, Pallotta’s suggestions call for the lemmings to continue to throw themselves over the cliff–maybe eventually, we’ll fill the land below with enough bodies to build a bridge to the glorious future that he–and so many other TED presenters–see for humanity.

What we actually need is an economy that is owned equitably–this means strong unions; universal access to good education, medical care, and social insurance; worker-owned businesses; and an appreciation for and conservation of our “environmental capital”. Instead, Pallotta seems to want to double down and go all-or-nothing. But he’s playing a game that, by its very design, only about 5% of any population can win. A few wealthier non-profit CEOs and a greater public image for a few fortunate causes that manage to outspend others–these things won’t fix the serious problems we face. They’ll just accelerate the rising tragedies of the modern world.

Though I embedded this in the previously mentioned post, it’s worth re-sharing. I think Slavoj Zizek here really captures the corrupted nature of the current non-profit as social-enterprise trend. If you didn’t watch this when I posted it before, definitely watch it now:

Albert V. Krauss; or, A Few Well-Known Scientists’ Ignorance of Science

Larry Krauss, author of A Universe From Nothing

Last year, Lawrence Krauss wrote A Universe from Nothing, in which he explained how physicists’ current understanding of quantum mechanics suggests that quantum probability fields relate to one another to create matter, and how, therefore, a state in which no matter existed could yield a state in which matter exists: in some arrangements, quantum fields yield no particles, but if their arrangements shift, matter would basically “appear”. So something (that is, matter) could appear from “nothing”. Perhaps unsurprisingly, Krauss pressed this point to argue that theological and philosophical arguments about cosmogenesis were now obsolete–through the mechanism described above, physics showed how the universe came into being.

Except that…it doesn’t. What’s so surprising about Krauss’ claim is how jaw-droppingly misguided it is. He seems to think that a quantum state in which fields exist but do not generate matter is “nothing”. But of course such a state isn’t nothing–the fields still exist–and since these fields are the very basis of just about everything, at least as quantum mechanics tells it, this means that essentially the same something that is now present would be present in such a state–just in a different configuration. This is like arguing that while a diamond is something, a piece of coal is nothing, because the carbon atoms are arranged differently.

This is obviously ridiculous, but I don’t think Krauss is pulling a Joaquin Phoenix here–he seems to be dead serious. This reveals that he basically doesn’t understand what the word “nothing” means, which is pretty scandalous, because I’m pretty sure he’s a native English speaker. Now, Krauss’ book ended up eliciting a scathing review from one David Albert in the New York Times–to which Krauss responded with a bitter interview with Ross Andersen in The Atlantic. I would really recommend reading both as they provide the central material of the debate. Neither is horribly long, and they give some real insight into the two camps that have basically lined up in this debate.

As for those two camps, lots of people have been posting on blogs and elsewhere, flocking to this academic dust-up. Two articles I found particularly worthwhile were Adam Frank’s post in NPR’s 13.7 blog and Massimo Pigliucci’s post in Rationally Speaking. Both seek to understand why Krauss–and some other scientists–seem so hostile to philosophy in general. The fact that Krauss’ book was plastered with an enthusiastic blurb in which Richard Dawkins compared the work with Darwin’s The Origin of Species only underscores this theme: Dawkins is well known for his hatred of religion and his dismissive attitude towards theology, but he’s equally hostile to–and ignorant of–philosophy.

Both Krauss and Dawkins are, essentially, dogmatic scientists. Science, as a methodology, has finally been around long enough that people can practice it without understanding what it actually is and how it works. The philosophy of science attempts to continue analyzing science as a discipline, but this often leads to philosophers pointing out that scientists themselves overreach in their claims–which understandably frustrates those scientists. But instead of giving well-thought-out defenses of their positions, increasingly scientists just dismiss philosophy altogether.

This rhetorical tactic is as psychologically interesting as it is logically deficient. You can’t actually win a debate by simply saying that your opponent’s opinion is wrong a priori because they have one doctoral degree instead of another; but this is exactly what Krauss and Dawkins do. Albert’s central critique of Krauss’ book is the one outlined above: the English word nothing means just that–nothing. No thing, no structure, no space, no time, no matter–no nothing! Krauss explains how one state of existence, one structure of being–quantum fields aligned in a given way–can lead to a different state of existence–quantum fields aligned in such a way as to bring matter into existence. But he seems to think he’s actually explained how something came from nothing–when in reality he’s explained how one thing came from something else, or, really: how one set of relationships resulted in a different set of relationships.

This is really no different from claiming that when the first sufficiently heavy star exploded in a supernova and generated really heavy elements–like gold and uranium–for the first time, this was creatio ex nihilo–the creation of something (i.e. gold and uranium) out of nothing (i.e. hydrogen, iron, oxygen, and incredible amounts of energy). Obviously hydrogen, iron, etc. are not nothing. Perhaps less obviously–but no less truly–quantum fields are not nothing. They are, as said above, actually everything–they and their relating account, as far as we currently know, for everything we see (although I gather that the laws simply assume space-time rather than account for it–but I’m not sure). They are, actually, the opposite of nothing.

It’s hard not to wonder if the materialist realism that underpins modern science hasn’t so uncritically ingrained itself in the minds of folks like Krauss that they can’t even understand the implications of their own work. I’m not saying this unequivocally, I’m really wondering. But he seems to think that the absence of matter-as-particles is equivalent to “nothing”, even as he spearheads research in a field that seeks to explain particles as composite realities determined by something other than particles. In other words, his field assumes that particles are generated by the interaction of something more fundamental than particles. And yet he still seems to think that the absence of particles signifies the absence of anything. Which raises the question: so, quantum physics is the study of…nothing?

In short, Krauss knows how to pursue empirical methodology–and he seems very good at it; I’m not aiming here to criticize his actual theoretical and experimental work as a physicist–but he doesn’t seem to actually understand it. He seems essentially to be “ontologizing” his epistemology. In its radically skeptical form, empiricism basically accords no reality to anything except sense perceptions. Hume certainly held to this view. But quantum mechanics, crucially, claims–and with lots of indirect empirical evidence–that the reality we see and experience is governed by forces that we cannot directly detect. In other words, empiricism has itself led to a denunciation of its most extremely skeptical variety. This is in no way a refutation of science, but rather science itself progressing, developing, and critiquing itself in a very healthy way. Note that while Hume hewed to a very skeptical empiricist view, Newton and Bacon did not.

Interestingly, quantum physics seems to have almost as much–perhaps even more–in common with objective idealism as it does with materialist realism. Traditional materialism along Newtonian lines simply took particles, space, and time as givens–particles were completely simple, not composites of anything more fundamental. Modern physics has dramatically overturned this view; now it is forces, laws, and probabilities that are most fundamentally real. This actually mimics, at least in broad outlines, the thought of people like Immanuel Kant and G.W.F. Hegel–not to mention the likes of Spinoza and even Plato. The various forms of idealism that each developed tended to see logic, mathematics, and reason as fundamental to the reality of the universe (though each differed from the others in crucial ways, and obviously quantum mechanics is not idealism–nonetheless, the parallels are, I think, intriguing). It might be better, though, to say that quantum mechanics seems to be affirming some of the fundamental insights of both objective idealism and material realism.

This sort of discussion is precisely the sort of thing that philosophers of science do, and it’s important to the practice of science because science, like all epistemological methods, only works when it is properly understood and well-guided. Once scientists themselves lose sight of how their discipline works, they are unlikely to be able to advance their field as quickly, and are also likely to make erroneous and wild conclusions–just as we have seen Krauss do. This isn’t to say that somehow philosophy should “rule over” science, but rather that its voice and its discipline are valuable. It’s also well worth pointing out that many, many scientists are interested and educated in philosophy and even theology–I am not here criticizing all scientists en masse, most of whom actually do take these questions seriously. There is, however, a troubling trend within popular science writing of dismissing philosophy in favor of a pure sort of empiricist materialist realism that does a disservice to the public and damages the credibility and progress of science itself.