An essay donated by James B. Gray
"The Love Command: Further Thoughts"


For a number of years now the following question has been uppermost in my
mind: Given that Christianity is the dominant religion in the United States, and
that the central teaching of the putative founder of the religion is that one
should love the neighbor (indeed, that one loves God by loving the neighbor),
why is it that Christianity emphasizes correct belief (i.e., orthodoxy) rather
than behavior in accord with the love command? At times, over the years, I have
come to believe that I have found at least a part of the answer to that
question. But I have yet to sense that I have obtained the complete answer to
the question—and, in fact, doubt that I ever will. Still, I keep searching for a
more complete answer, and am excited when I encounter a book or article that
stimulates me to further develop my thinking about the matter.
Such an occasion arose recently while I was surfing the internet for a review of
a book that I had just ordered on early nineteenth-century naturalists. I
“chanced” (?) across
“Ethical Implications of the Laws of Pattern Abundance Distribution” by Stephen
R. P. Halloy and Jeffrey A. Lockwood—which pointed out that scientists have, at
times, argued that “laws” that they have supposedly found in the human realm
should be interpreted as indicating what should be. That is, scientists have at
times made the argument that a lawful “is” can be regarded as a guide to what
“ought” to be.
Such an argument can be convincing. What, then, is the basis for accusing the
argument of committing the “naturalistic fallacy”? Do Halloy and Lockwood
present a convincing argument that exposes the fallacy of such an argument? I
agree with Halloy and Lockwood that knowledge about (lawful) “is” can help us
decide what can be done; and that what should be done is necessarily limited by
the options of what can be done. But, personally, I am not convinced that they
succeed in presenting an argument that satisfactorily disposes of the claim that
“is” should be regarded as the basic guide to “ought.”
Because I did not find their presentation entirely satisfying, I was prompted
to see what I could do toward providing a better answer—one which would
simultaneously give me more insight on the question that I raised in the first
paragraph above. And as I let my unconscious mind grapple with the matter last
night while sleeping, I was “given” an answer that I do find—at the moment, at
least—rather satisfying. My purpose here, then, is to present that answer,
formulating it as well as I can while writing “off the top of my head” this
morning.
My starting point is the assertion—one that I believe to have a rather sound
basis—that prior to the Agricultural Revolution of millennia ago the ways of
life that humans had were concordant with their biological nature. Given this,
one might say that they had ways of life that were in accord with “human
nature”—i.e., were “natural.” With the Agricultural Revolution, however, ways of
life began to change—slowly at first, but at an accelerating rate especially
after the Industrial Revolution of 250 years ago. While ways of life were
changing, however, human biology remained basically unchanged; as David P.
Barash has put it, there occurred the “hare” of way-of-life change combined with
the “tortoise” of human biology’s relative fixity. The direct result of this
uneven (to say the least!) change was a growing “discrepancy”—best conceived as
a discrepancy between the way of life lived and the way of life “designed for”
(as a result of the operation of various selection mechanisms—such as sexual
selection, and not, I cannot emphasize too strongly, Darwinian “natural
selection”).
I strongly believe not only that the prophetic movement was precipitated by this
growing discrepancy (the goal of the prophets being the restoration of a more
“natural” way of life), a point on which I comment further later, but also that
virtually all of the problems that humans have faced over the centuries are
rooted in this discrepancy. Given this latter belief, the question arises: Why
is such a belief not widespread?
The quick answer to that question is that although various scholars have
undertaken research that provides evidence in support of that thesis, two
problems are associated with that evidence. First, that evidence is not
well-known—because it has not been well-publicized (by the media, by university
professors in their teaching, etc.). And second, no scholar has yet developed a
comprehensive body of evidence in support of the thesis.
These facts lead us to the question: Why do these deficiencies exist in the
research supportive of this thesis? And here, it seems to me, the answer again
lies in developments associated with the Agricultural Revolution; let me
explain.
One of the most important developments that occurred in conjunction with the
period of the Agricultural Revolution was the development of social classes.
Some differentiation had existed within human groups prior to this Revolution:
certain functions tended to be associated with adult females, certain other
functions with adult males; and one’s role within a group tended to change as
one became older. But social classes were unknown among humans until the
Agricultural Revolution got underway. Not only did social classes emerge, but
class membership became a basically hereditary matter: one tended to remain a
member of the social class into which one was born.
Class societies have, of course, varied greatly in their specific
characteristics through history, but what they have in common is that upper
classes are always parasitical and predatory with reference to lower classes.
This is not to say, of course, that the upper classes in all societies have
been, and are, equally exploitative: e.g., during Jesus’s time “honor” was
highly prized within Roman society, and members of the upper classes even
competed with one another for honor—doing so by trying to outdo others in
beneficence directed toward “lowers.” Still, the generalization is true that the
“office” (as Thorstein Veblen would put it) of those in the upper classes of a
society is to “live off” those occupying lower positions.
Despite this fact, it is also true that the exploitative function of the upper
classes typically is not at all well-understood within any given class society.
Quite the contrary, indeed! Why hasn’t this fact been recognized as such? The
answer is that as social class systems were developing, so was ideology—the
function of ideology being to provide intellectual support to the Existing
Order. The content of ideology has, of course, changed greatly over historical
time—in part reflecting the fact that the nature of elites has changed
historically. What’s interesting about ideology viewed historically, however, is
the important role that the “God” construct has played in ideologies.
Ideology seems to have first appeared on the scene with the rise of the
institution of kingship—the original “myth” created being that the king was an
agent of God, a myth later developed such that the king was himself a god. Given that
(as Karl Marx astutely observed) the ruling ideas of a society are those of its
ruling class, if the elite of a society promotes the myth that its members
partake of divinity in some sense, and this myth “catches on” (which it likely
will), the Existing Order will be sanctified as expressing God’s will. That being so,
who would think of disturbing that Order? Given that such disturbance is rare,
the result is a (relatively) stable society (the rise of a Spartacus type of
person being rather rare).
After Jesus’s death, in the early years of the Common Era, a number of Jesus
movements arose, but the one that became successful was one that bore little
relation to Jesus’s ministry—and may not even have had a genetic connection with
the disciples that Jesus attracted (despite claims by the Roman Catholic church
to the contrary). It became successful, not because it had authenticity relative
to Jesus’s ministry but, rather, because (a) it had developed a theology which,
drawing its basic ideas from pagan Mystery religions, made it familiar (if not
attractive) to “gentiles,” and (b) the emperor Constantine supported its
development—and in 380 CE the emperor Theodosius made it the official religion
of the Roman Empire (so that one had better “convert” to Christianity, or else .
. . .).
Christianity, in this “successful” form, made an ideological claim not unlike
the “divine right of kings” doctrine that had been common before. It claimed
that Jesus was a god—the one and only son of God, in fact—and that Jesus, prior
to his death (and alleged resurrection, then ascension), had passed his
authority along to the disciple Peter who, in turn, passed it along to . . .,
etc. Thus the current Roman Catholic pope, although not a divine Being,
ostensibly derives his authority from a line of succession leading back to
Jesus—and, ultimately, God Himself (or is it Herself?!).
This ideology enabled popes, and others with high positions in the Church, to
rule for a long period (with occasional disturbances—the Montanists being a
notable early example)—until, in fact, the rise of nationalism, with its
associated rise of secular leaders to positions of power, enabled a challenge to
that authority. As the rise of secular leaders was occurring, again the “divine
right of kings” doctrine was “resurrected” to provide intellectual support for
their position in society. (Armies are useful, of course, but ideology is even
more useful—because it enables rule without the use of much external force. For
the way ideologies “work” is that they become internalized—meaning that people
accept the ideas contained in an ideology and thereby in effect rule
themselves.)
The struggle between Church and State in Europe occurred over a rather long
period of time, and can perhaps be thought of as not coming to a close until the
Renaissance (with the Reformation basically just increasing the number of
Churchly centers of power). The challenge represented by the Renaissance had a
secular basis in that it “raised up” Reason as a source of authority: one should
believe what was reasonable to believe rather than what someone in authority
told one to believe. What’s particularly interesting regarding the Renaissance
is that its thrust was such that, in paving the way to the Enlightenment, it not
only provided a basis for questioning the authority of the Church(es) but lent
support to the development (during the Enlightenment) of more egalitarian
ideas—most notably the idea of “natural” rights, an idea that was initially
applied only to certain segments of society but over time became more and more
universalized. (I might add that Americans are fortunate that our Founding
Fathers were living at a time when there was somewhat of an egalitarian thrust
to the secular thinking of the time.)
The rise of rational thought initially involved the use of logic, but logic was
increasingly joined by attention to empirical research—this
development attributable perhaps especially to Francis Bacon (and even Roger
Bacon earlier). That development, unfortunately, with its search for lawfulness
in nature, inadvertently opened the door for ideology once more—enabling a
reversal in any trends in the direction of increasing egalitarianism.
The first major “advance” on this new front was Adam Smith’s The Wealth of
Nations, a work that is best regarded as one that took (unbeknownst to Smith?)
the physical “laws” of Isaac Newton and applied them to the social realm. Smith
argued that certain forces were operating in the economy which tended to bring
well-being to everyone—if, that is, these forces were allowed to operate without
interference. Whence would interference possibly come? From
government—hence the use of the term laissez-faire to refer to an economic
system within which government basically adopted a “hands off” approach. Smith
himself was by no means an anti-government sort of person, but his book gave
those who did have such a mentality a basis for arguing that a “hands off”
policy by government serves the interests of all. The Great Depression of the
1930s, e.g., should provide sufficient evidence of the fallacy of such a
position, but laissez-faire thinking is still with us. However, this is not so
much because it has a solid empirical basis but, rather, because of the lack of
a well-developed alternative.
Laissez-faire thinking recognizes that a social class hierarchy will develop,
and adds that this is to be expected: A laissez-faire system, if operating
without interference, will, the theory goes, enable justice to be realized. That
is, one will be rewarded on the basis of desert: the more one deserves, the more
one will receive. Conversely, the more one receives, the more one deserves—for
the latter can be inferred from the former, given the first-stated relationship.
What determines desert? The more one contributes to the society, the more one
deserves. And how does one measure the “contribution” of a given person?
Initially it was argued that the harder one worked, the more one contributed
(Smith’s labor theory of value—to be supplanted later by supply/demand theory).
But as the elite came to realize that such a “theory” did not fit their needs
(given that their “contribution” was primarily of a negative nature!—and they
could not fool themselves about this), they began to promote the idea that
“contribution” could—and should—be inferred from income! This was a sleight of
hand on their part that, they hoped, would not be noticed by non-elite members of the
society. (And they were basically correct about this!)
Further ammunition was added to the elite’s cause by Charles Darwin’s theory of
natural selection—which he presented as a virtual law of nature. A “law” which,
in his The Origin of Species (1859), he discussed in the context of non-human
species, but with the implication that it applied to humans as well. (To be
fair, in his later The Descent of Man Darwin gave sexual selection a significant
role.) Darwin cannot be credited with originating Social Darwinism, but it must
be admitted that that theory has its basis in Darwin’s theory of natural
selection. For natural selection theory asserts that (a) excess births are the
rule in nature (an idea drawn from Rev. Thomas Malthus), (b) given that there is
such a thing as “carrying capacity,” not all individuals of a species that are
born can survive, (c) intraspecific competition inevitably therefore occurs, (d)
that competition results in those individuals surviving who are best equipped,
by nature, to win in this competition, and (e) this process occurring year after
year results in a slow, steady, progressive increase in the attribute(s) that
confers survival advantage in competition with one’s fellows (along with any
other attributes that, by chance, are correlated with those “survival”
attributes).
Social Darwinian theory, with its “survival of the fittest” phrase (drawn from
Herbert Spencer), contended that this principle operated throughout nature,
including within the human realm. In the latter realm, this meant—using a
sleight of hand—that success (rather than survival per se) indicated
fitness. There was no pretense here that survivors were ones who contributed
most to the society, just that they had attributes—ones with a biological
basis—that enabled them to achieve their success. Because it was “natural”
attributes that explained their success, one could think of the class system
that existed as expressing the operation of natural factors; and that, in turn,
implied a certain inevitability regarding the Existing Order—one that it thereby
would be foolish to try to change, and therefore that attempts should not be
made to change it.
“God” was evidently conceived, by Darwin, as primarily, if not solely, a
Creator; and because his evolutionary thinking could not admit of “special
creation,” “God” therefore played no role in Darwin’s evolutionary thought
(except for political reasons). Given this, the ideology of Social Darwinism was
also godless in its youth. However, some thinkers began to think of Social
Darwinism more in terms of the lawfulness associated with it, so that they could
then think of the Existing Order not only as natural but also as in accord with
God’s wishes—given that the laws of nature had been established by God. Once
this development occurred, people such as (Baptist) John D. Rockefeller could be
Social Darwinists who also taught Sunday School!
It is interesting, then, that in the history of ideology “God” (or gods) has
(have) played an important role in supporting the Existing Order—whatever it
happens to be at the time. However, as I noted earlier, the Agricultural
Revolution, with its beginnings of the “discrepancy,” also gave rise to a
prophetic movement—the portion of it most familiar to us being that presented
in Hebrew Scripture and the “Old Testament” of the Christian
Bible. What was notable about this movement is that (a) it occurred during a
time of kings, (b) the prophets directed their comments at royalty, and (c) the
basic thrust of their argument was that the one God was the true King, and that
His Law is what should govern the society—the “message” of that Law being that
the elite was not to exploit others but, rather, had an obligation to do for
them (see my “Worship” on this site). Unfortunately, the anti-ideological
perspective of the prophets never “took root” to any significant degree—and
certainly has never been a significant part of Christianity (except to a degree
during the “social gospel” period of a century and more ago). Meaning—to allude
now to my original question—that the “love of neighbor” command has never played
a central role within Christianity.
The fact that Christianity has done a poor job of continuing the prophetic
tradition does not mean, however, that that tradition died long ago. The
tradition has, in fact, continued since Bible times but—ironically—primarily in
the “secular” realm. Granted, many of those who have continued the
tradition—such as Charles Fourier, Karl Marx, and Thorstein Veblen—have not
looked to God (or claimed to) as the “author” of their ideas. Their orientation,
however, has been to human well-being—which is why it is proper to place them in
the prophetic tradition. Besides, their choosing to work outside the “religious”
realm can be interpreted, in part, as an indication of their disgust with
conventional religion for its traitorous neglect of the prophetic tradition.
At present, the “God” construct plays little role in rationalizing the Existing
Order, but as the Halloy-Lockwood article cited at the beginning indicates, the
notion of lawfulness is, rather, used for that purpose—the argument being that
what “is” can, and should, guide our “ought” thinking and, therefore, our
behavior. I stated earlier that I did not think that Halloy and Lockwood did a
satisfactory job of disposing of this idea—of demonstrating that the
“naturalistic fallacy” is an actual fallacy. Let me now present an argument
that, in my mind, does enable us to reject this view of “ought.”
If we think of the way of life lived by people in the period prior to the
Agricultural Revolution (AR) as “natural,” and also perceive that a
“discrepancy” began to develop with—and especially after—the Agricultural
Revolution, it follows that it is an error to view human life since that time as
“natural.” Therefore, any empirical regularities that are discovered with humans
during post-AR times are not descriptions of what’s “natural” for humans.
Rather, the benchmark for deciding what’s “natural” for humans is the period
prior to the AR.
Therefore, any “ought” arguments that argue from “is” to “ought,” but do so on
the basis of post-AR times, are using an inappropriate benchmark, and should
therefore be rejected out of hand. This does not mean, though, that arguments
based on “facts” pertaining to pre-AR times should be accepted at face value,
without evaluation. One should keep in mind that most in our society are
“possessed” by the ruling ideology (one supportive of the Existing Order), and
this includes scholars; and that insofar as scholars are “possessed” by the
ruling ideology, they may be prone to interpret pre-AR times through the lens of
the ruling ideology. Some such scholars—perhaps the majority, even—perceive with
an ideological bias, and are unaware of that fact. But other scholars warrant
the label “prostitutes” because they have sold their souls to a Foundation or
Institute established with the specific purpose of producing “research” in
support of the Existing Order.
Would it be helpful to The Cause if the “discrepancy” concept were better
developed, and then publicized? Perhaps—but, then, perhaps not. I say the latter
because it is to be expected that if this perspective on historical development
were further developed, and then publicized extensively, the elite would
(rightly) recognize such developments as a threat. The elite would then, one
would reasonably expect, direct its lackeys in Foundations, Institutes—and
universities—to “refute” the developing ideas and findings. The elite might even
use character assassination—and actual assassinations—to nip such a development
in the bud.
Rather than working on the development and publicizing of “discrepancy” ideas, I
suggest that we recognize that insofar as societal system change has resulted
from conscious efforts to bring it about, this has occurred via the actions of a
small “vanguard” group. Therefore, those of us interested in advancing the
cause—i.e., the prophetic movement—are perhaps better advised to act on our
ideas rather than take the first-mentioned course, being sure, it goes without
saying, to act in a non-violent manner that evinces wisdom.


Originally posted: 2008-APR-06
Latest update: 2008-APR-06
Author: James B. Gray
