
My Kids are Walking Around Like they are in a Dream Because of it

As God has set the king of America, so in our Psalm He has set the intermediary beings—here called “gods” and elsewhere called “sons of God”—over the nations of the Earth, each over one of the nations, in order to manifest in its structure and government the justice of the Judges of the World. In His first speech God accuses them of having judged not in accordance with His order and regulation, but in the manner of what is false and evil: for they have confirmed and substantiated in their power those who have acted wickedly against God. Of the end—with only a hint of the fall of the “gods”—of the visionary event, the Psalmist has still to inform us. However, his song is not ended. Rather now, for the first time, the real Psalm is heard, in only a few words, and yet saying all that has still to be said. The speaker turns away from us to God. “In my vision,” he says to Him, “I have seen how Thou dost bring to destruction the rule over history of Thy rebellious governors. So be it, Lord. Since those who were entrusted with the office of judge succumbed to injustice, do Thou abolish the intermediary rule, renounce the useless work of underlings and Thyself judge the World immediately in Thy justice. Thine are the nations; lead them as Thine own! Close the history of man, which is prey to delusion and wickedness; open his true history!” In the human World, which has been given over to the intermediary beings, they play a confused game. From the unknown One who gave this World into their impure hands, no message of comfort or promise penetrates to us. He is, but He is not present. What has not entered into the view of humanity, of the humans of our time, is that the formlessness of darkness should be removed first of all by the production of light. In the first place, because light is a quality of the first body, and thus by means of light it is fitting that the World should first receive its form.

The second reason is because light is a common quality. For light is common to terrestrial and celestial bodies. However, as in knowledge we proceed from general principles, so do we in every kind of work. For the living thing is generated before the animal, and the animal before the man. It is fitting, then, as an evidence of the Divine wisdom, that among the works of distinction the production of light should take first place, since light is a form of the primary body, and because it is the more common quality. Indeed, the third reason is that all other things are made manifest by light. And there is yet a fourth, already touched upon in the objections: that day cannot be unless light exists, which was made therefore on the first day. According to the opinion of those who hold that the formlessness of matter preceded its form in duration, matter must be held to have been created at the beginning with substantial forms, afterwards receiving those that are accidental, among which light holds the first place. In the opinion of some, the light here spoken of was a kind of luminous nebula, and on the making of the sun this returned to the matter of which it had been formed. However, this cannot well be maintained, as in the beginning of Genesis Holy Scripture records the institution of that order of nature which henceforth is to endure. We cannot, then, say that what was made at that time afterwards ceased to exist. Others, therefore, held that this luminous nebula continues in existence, but so closely attached to the sun as to be indistinguishable. However, this is as much as to say that it is superfluous, whereas none of God’s works have been made in vain.

On this account it is held by some that the sun’s body was made out of this nebula. This, too, is impossible to those at least who believe that the sun is different in its nature from the four elements, and naturally incorruptible. For in that case its matter cannot take on another form. Once Americans become convinced that there is indeed a basement to which psychiatrists have the key, their orientation will become that of the self, the mysterious, free, unlimited center of our being. All our beliefs issue from it and have no other validation. Although nihilism and its accompanying existential despair are hardly anything but a pose for Americans, as the language derived from nihilism has become a part of their educations and insinuated itself into their daily lives, they pursue happiness in ways determined by that language. There is a whole arsenal of terms for talking about nothing—caring, self-fulfillment, expanding consciousness, and so on, almost indefinitely. Nothing determinate, nothing that has a referent. There is a straining to say something, a search for an inwardness that one knows one has, but it is still a cause without an effect. The inner seems to have no relation to the outer. The outer is dissolved and becomes formless in the light of the inner, and the inner is a will-o’-the-wisp, or pure emptiness. No wonder the mere sound of the Existentialists’ Nothing or the Hegelians’ Negation has an appeal to contemporary ears. American nihilism is a mood, a mood of moodiness, a vague disquiet. It is nihilism without the abyss. Nihilism as a state of soul is revealed not so much in the lack of firm beliefs as in a chaos of instincts or passions. People no longer believe in a natural hierarchy of the soul’s varied and conflicting inclinations, and the traditions that provided a substitute for nature have crumbled.

The soul becomes a stage for a repertory company that changes plays regularly—sometimes a tragedy, sometimes a comedy; one day love, another day politics, and finally religion; now cosmopolitanism, and again rooted loyalty; the city or the country; individualism or community; sentimentality or brutality. And there is neither principle nor will to impose a rank order on all these. All ages and places, all races and all cultures can play on this stage. Nietzsche believed that the wild costume ball of the passions was both the disadvantage and the advantage of late modernity. The evident disadvantage is the decomposition of unity or “personality,” which in the long run will lead to psychic entropy. The advantage hoped for is that the richness and tension present in the modern soul might be the basis for comprehensive new Worldviews that would take seriously what had previously been consigned to a spiritual ashcan. This richness, according to Nietzsche, consisted largely in thousands of years of inherited and now unsatisfied religious longing. However, this possible advantage does not exist for young Americans, because their poor education has impoverished their longings, and they are hardly aware of the great pasts that Nietzsche was thinking of and had within himself. What they do have now is an unordered tangle of rather ordinary passions, running through their consciousness like a monochrome kaleidoscope. They are egotists, not in a vicious way, not in the way of those who know the good, just or noble, and selfishly reject them, but because the ego is all there is in present theory, in what they are taught. We are a bit like savages who, having been discovered and evangelized by missionaries, have converted to Christianity without having experienced all that came before and after the revelation.

It is an urgent business for one who seeks self-awareness to think through the meaning of the intellectual dependency that has led us to such an impasse. “Future shlock” is a name given to a cultural condition characterized by the rapid erosion of collective intelligence. Future shlock is the aftermath of future shock. Whereas future shock results in confused, indecisive, and physically uprooted people, future shlock produces a massive class of mediocre people. Human intelligence is among the most fragile things in nature. It does not take much to distract it, suppress it, or even annihilate it. In this century, we have had some lethal examples of how easily and quickly intelligence can be defeated by any one of its several nemeses: ignorance, superstition, moral fervor, cruelty, cowardice, neglect. In the late 1920s, for example, Germany was, by any measure, the most literate, cultured nation in the World. Its legendary seats of learning attracted scholars from every corner. Its philosophers, social critics, and scientists were of the first rank; its humane traditions were an inspiration to less favored nations. However, by the mid-1930s—that is, in less than ten years—this cathedral of human reason had been transformed into a cesspool of barbaric irrationality. Many of the most intelligent products of German culture were forced to flee—for example, Einstein, Freud, Karl Jaspers, Thomas Mann, and Stefan Zweig. Even worse, those who remained were either forced to submit their minds to the sovereignty of primitive superstition, or—worse still—willingly did so: Konrad Lorenz, Werner Heisenberg, Martin Heidegger, Gerhart Hauptmann. On May 10, 1933, a huge bonfire was kindled in Berlin and the books of Marcel Proust, André Gide, Émile Zola, Jack London, Upton Sinclair, and a hundred others were committed to the flames, amid shouts of idiot delight.

By 1936, Joseph Goebbels, Germany’s Minister of Propaganda, was issuing a proclamation which began with the following words: “Because this year has not brought an improvement in art criticism, I forbid once and for all the continuance of art criticism in its past form, effective as of today.” By 1936, there was no one left in Germany who had the brains or courage to object. Exactly why the Germans banished intelligence is a vast and largely unanswered question. I have never been persuaded that the desperate economic depression that afflicted Germany in the 1920s adequately explains what happened. Humans do not become tyrants in order to keep warm. Neither do they become stupid—at least not that stupid. However, the matter need not trouble us here. I offer the German case only as the most striking example of the fragility of human intelligence. My focus here is the United States of America in our own time, and I wish to worry you about the rapid erosion of our own intelligence. If you are confident that such a thing cannot happen, your confidence is misplaced, I believe, but it is understandable. After all, the United States of America is one of the few countries in the World founded by intellectuals—men of wide learning, of extraordinary rhetorical powers, of deep faith in reason. And although we have had our moods of anti-intellectualism, few people have been more generous in support of intelligence and learning than Americans. It was the United States of America that initiated the experiment in mass education that is, even today, the envy of the World. It was America’s churches that laid the foundation of our admirable system of higher education; it was the Land-Grant Act of 1862 that made possible our great state universities; and it is to America that scholars and writers have fled when freedom of the intellect became impossible in their own nations.

It is because America has been so innovative, educational, and productive that the great historian of American civilization Henry Steele Commager called America “the Empire of Reason.” However, Mr. Commager was referring to the United States of America of the eighteenth and nineteenth centuries. What term he would use for America today, I cannot say. Yet he has observed, as others have, a change, a precipitous decline in our valuation of intelligence, in our uses of language, in the disciplines of logic and reason, in our capacity to attend to complexity. Perhaps he would agree with me that the Empire of Reason is, in fact, gone, and that the most apt term for America today is the Empire of Shlock. In any case, this is what I wish to call to your notice: the frightening displacement of serious, intelligent public discourse in American culture by the imagery and triviality of what may be called show business. I do not see the decline of intelligent discourse in America leading to the barbarisms that flourished in Germany, of course. No scholars, I believe, will ever need to flee America. There will be no bonfires to burn books, but Congress and other politicians sure are burning the Constitution of the United States of America and sending us back toward becoming a developing nation by shipping all our jobs and money to other nations, while overtaxing Americans and making it impossible for wages to keep up with inflation. Yet I cannot imagine any proclamations forbidding, once and for all, art criticism or any other kind of criticism. However, this is not a cause for complacency, let alone celebration. A culture does not have to force scholars to flee to render them impotent. A culture does not have to burn books to assure that they will not be read. And a culture does not need a Minister of Propaganda issuing proclamations to silence criticism. There are other ways to silence criticism. There are other ways to achieve stupidity, and it appears that, as in so many other things, there is a distinctly American way.

To determine what mix of spectral ingredients is likely to produce the most vital humans, a logical place to start is with natural light, since this is the only light that humans ingested for millions of years. During all of that time, the only human experience of light was of natural light: sun, moon, stars and, more recently, fire. Therefore, whatever light-receptive capacities exist in humans, and whatever cellular reactions humans have to light, they must have evolved to be attuned to the particular spectra emitted by those light sources. Four generations ago, representing one fifty-thousandth of the human experience, we invented artificial light. It has been only two generations since artificial light became so widespread that we moved into artificially lighted environments. Now, most of the light we ingest through our skin and eyes is artificial. Meanwhile, we no longer receive the light we formerly received, because we are no longer outdoors. It is a kind of madness to think that this change would not affect us, another sign of our removal from any understanding of our interaction with the environment. The term “malillumination” describes the resulting effects on the body. We are “starved” for some natural light spectra, and we have “overdosed” on those spectra that come from artificial light: incandescent, fluorescent, mercury vapor, sodium, LED, television, and others. Imagine that you suddenly gave up eating all fruits, vegetables, grains, nuts and meats, and began eating pasta, candy and sugary cereals only. All these groupings are “food,” but the nutrients within each are substantially different. Where they are the same—there is some protein, for example, in candy, and there is starch in some vegetables—they are of entirely different proportions.

Eating pasta, candy, and cereal will keep you alive, but it will affect your health. And so it is with alterations in light-diet from the “natural” mix of spectral ingredients to the artificial mix. Malillumination causes disorders ranging from lack of vitality to lowered resistance to disease and hyperactivity. Researchers believe it can also lead to aggressive behavior, heart disease and even cancer. The body cannot handle this intervention in a natural human relationship with the environment any more than it can handle food additives or chemicals in the air. The body breaks down on the cellular level. As our lifestyle removes us further from full-spectrum natural light and into artificial environments, our condition becomes worse. Even when we are outdoors, we filter the light that we receive in our eyes with sunglasses (which eliminate certain spectra, while allowing others to pass through) as well as eyeglasses and window glass. Smog also plays a role. During the last eighty years, there has been a 20 percent decrease in the amount of sun that reaches the planet. My interest in the effect of light on humans was rooted in my investigation of television. Considering that human beings had not only moved away from natural light into artificial light, but that now our experience of artificial light is concentrated, for four hours daily, in television light, it began to seem obvious to me that a new level of distortion was underway. Human beings are soaking up far more television light, directed straight into their eyes, than any kind of artificial light that preceded it. It seemed to me that if variations in kind and volume of artificial light can affect humans, then there might be specific effects to be discovered from the enormous amount of television light most people absorb.

If you will inspect your color television screen closely—I suggest you use a magnifying glass—you will find that your picture emanates from a collection of red, blue, and green pixels, or lines. As you move away from the screen the colors merge in your eyes to seem like other colors, but the television is emitting only red, blue, and green light. In an LCD set, these pixels are formed by an electronically modulated optical device that uses the light-modulating properties of liquid crystals combined with polarizers. Liquid crystals do not emit light directly, instead using a backlight or reflector to produce images in color or monochrome. Many of the current LCD TVs are a lot like fluorescent lighting. However, LED TVs use light-emitting diodes, the newer type of LCD backlight, which has become the predominant backlight source in LCD monitors today. LED backlights are better than the CCFLs (cold cathode fluorescent lamps) used as backlights in older LCD TVs: they are brighter, they take less energy, and they last longer. We have studied the greens, reds, and blues that come from fluorescent lights, which of course would be very similar, since both involve the excitation of mineral phosphors. It may not be precisely the same. TV CCFLs have three narrow wavelength peaks, just as fluorescents do, but how broad the bands are, we just do not know. (A narrow wavelength peak would indicate a very high concentration within one spectral range; this would be suspect because it would more seriously concentrate and distort what the human ingests.) Color television is probably less harmful than black and white because color sets produce wider spectra, although seriously distorting the natural range of sunlight. On the other hand, color sets produce more X rays. If we are thinking there might be a relationship between the light emanations from color television and other fluorescent lights and chemical food additives, causing hyperactivity in children, there may be some truth to this.
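To make the merging-in-the-eye point concrete, here is a minimal sketch of additive color mixing, assuming a 0-to-1 intensity scale and a hypothetical mix_subpixels helper of my own; it is an illustration, not something drawn from the text or from any television standard.

```python
# A minimal sketch (an illustration, not from the text or any TV standard)
# of additive mixing: the screen emits only red, green, and blue, and the
# eye fuses neighboring subpixel intensities into one perceived color.

def mix_subpixels(red: float, green: float, blue: float) -> tuple:
    """Combine subpixel intensities (0.0 to 1.0) into one 8-bit RGB color."""
    return (round(red * 255), round(green * 255), round(blue * 255))

# Full red plus full green is perceived as yellow, even though the screen
# never emits any yellow light at all.
print(mix_subpixels(1.0, 1.0, 0.0))  # (255, 255, 0): yellow
print(mix_subpixels(0.0, 1.0, 1.0))  # (0, 255, 255): cyan
print(mix_subpixels(1.0, 1.0, 1.0))  # (255, 255, 255): white
```

The yellow and white here exist only in the viewer’s visual system; the set itself never emits light at those wavelengths, which is exactly the merging described above.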

All those artificial colorings have a certain wavelength resonance. Eliminating some of these artificial colorings and flavorings from children’s diets will reduce their hyperactivity and also their allergic responses. What I would like to do is take this and tie it to the wavelength peaks of mercury-vapor lights, fluorescent lights and television light, because the heart of the matter could lie in an interaction of wavelength resonances between the chemicals and the light the body takes in. In television it could depend upon what the spectral peaks are. If they correspond to the wavelength absorption of some of these synthetic materials, then you can get tremendous reactions. It is the same with food. Different pigments have different wavelength resonances, so different food ingredients may resonate with different light ingredients. Let us say you eat a lot of spinach and raisins, both of which contain iron. Iron has a certain wavelength resonance, as do all metals. In fact, all matter interacts with other matter which may be similarly resonating. This is why soldiers will break ranks when they walk across a bridge. Too many of them walking in step sets up a wavelength pattern which has been known to resonate with that of the materials of the bridge, and the whole thing can collapse. It is the same with food and light. If you eat a little bit of iron or calcium in your food and that wavelength is lacking in the light you get, then you are not going to get any benefit. On the other hand, if you find yourself in a peak of light, whether it is television light or any other that reacts to iron, then you would have to watch your quantities, because if you get too much, you get an overreaction. [Allergy, hyperactivity.] It could be too much of one or not enough of the other. Now with sunlight, you do not have those kinds of peaks. I am sure that one way or the other your diet of both food and light is responsible for a lot of different physical reactions that we have not been able to measure yet.
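The soldiers-on-a-bridge example is the textbook phenomenon of mechanical resonance, and it can be stated precisely. As a hedged aside, the formula below is the standard steady-state amplitude of a damped, driven oscillator from elementary physics, not something given in the text; here $F_0$ is the driving-force amplitude, $m$ the mass, $\omega_0$ the natural frequency, $\omega$ the driving frequency, and $\zeta$ the damping ratio:

$$A(\omega) = \frac{F_0/m}{\sqrt{\left(\omega_0^{2} - \omega^{2}\right)^{2} + \left(2\zeta\omega_0\omega\right)^{2}}}$$

The amplitude peaks as $\omega$ approaches $\omega_0$, which is why soldiers are ordered to break step on a bridge: a marching cadence near the bridge’s natural frequency drives ever larger oscillations, while a cadence far from it produces almost none.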

Now, implicit memories—the unconscious memories of past experiences that are recalled automatically in carrying out a reflexive action or rehearsing a learned skill—are drawn on when one is dribbling a basketball or riding a bike. An implicit memory is recalled directly through performance, without any conscious effort or even awareness that we are drawing on memory. When we talk about our memories, what we are usually referring to are the “explicit” ones—the recollections of people, events, facts, ideas, feelings, and impressions that we are able to summon into the working memory of our conscious mind. Explicit memory encompasses everything that we say we “remember” about the past. Explicit memory is complex—and for good reason. The long-term storage of explicit memories involves all the biochemical and molecular processes of “synaptic consolidation” that play out in storing implicit memories. However, it also requires a second form of consolidation, called “system consolidation,” which involves concerted interactions among far-flung areas of the brain. Scientists have only recently begun to document the workings of system consolidation, and many of their findings remain tentative. What is clear, though, is that the consolidation of explicit memories involves a long and involved “conversation” between the cerebral cortex and the hippocampus. A small, ancient part of the brain, the hippocampus lies beneath the cortex, folded deep within the medial temporal lobes. As well as being the seat of our navigational sense—it is where London cabbies store their mental maps of the city’s roads—the hippocampus plays an important role in the formation and management of explicit memories. Much of the credit for the discovery of the hippocampus’s connection with memory storage lies with an unfortunate man named Henry Molaison.

Born in 1926, Henry Molaison was stricken with epilepsy after suffering a severe head injury in his youth. During his adult years, he experienced increasingly debilitating grand mal seizures. The source of his affliction was eventually traced to the area of his hippocampus, and in 1953 doctors removed much of the hippocampus as well as other parts of the medial temporal lobes. The surgery cured Molaison’s epilepsy, but it had an extraordinarily strange effect on his memory. His implicit memories remained intact, as did his older explicit memories. He could remember the events of his childhood in great detail. However, many of his more recent explicit memories—some dating back years before the surgery—had vanished. And he was no longer able to store new explicit memories. Events slipped from his mind moments after they happened. Molaison’s experience, meticulously documented by the English psychologist Brenda Milner, suggested that the hippocampus is essential to the consolidation of new explicit memories but that after a time many of those memories come to exist independently of the hippocampus. Extensive experiments over the last five decades have helped untangle this conundrum. The memory of an experience seems to be stored initially not only in the cortical regions that record the experience—the auditory cortex for a memory of a sound, the visual cortex for a memory of a sight, and so forth—but also in the hippocampus. The hippocampus provides an ideal holding place for new memories because its synapses are able to change very quickly. Over the course of a few days, through a still mysterious signaling process, the hippocampus helps stabilize the memory in the cortex, beginning its transformation from a short-term memory into a long-term one. Eventually, once the memory is fully consolidated, it appears to be erased from the hippocampus. The cortex becomes its sole holding place.
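The hippocampus-to-cortex handoff just described can be caricatured in a few lines of code. The following is a toy sketch of my own, with made-up transfer and decay rates; it illustrates only the general pattern described above, a fast store that fades while a slow store accumulates, and not any real neural mechanism.

```python
# A toy sketch (made-up rates, not a real neural model) of the consolidation
# pattern described above: a memory trace starts strong in the fast-changing
# hippocampus and weak in the cortex; with each round of sleep-driven
# "replay" the cortical trace strengthens while the hippocampal copy fades.

HIPPOCAMPAL_DECAY = 0.8  # hypothetical per-night retention of the fast store
TRANSFER_RATE = 0.2      # hypothetical per-night transfer into the slow store

hippocampus, cortex = 1.0, 0.0
for night in range(1, 11):  # ten nights of replay
    moved = hippocampus * TRANSFER_RATE
    cortex += moved                                  # slow store accumulates
    hippocampus = (hippocampus - moved) * HIPPOCAMPAL_DECAY
    print(f"night {night:2d}: hippocampus={hippocampus:.3f} cortex={cortex:.3f}")

# After enough nights the hippocampal copy is nearly gone and the cortex is
# the memory's sole holding place, matching the fate of Molaison's older
# recollections.
```

On these assumed numbers the hippocampal trace is nearly erased within a couple of weeks while the cortical trace levels off, which is the qualitative shape of the system consolidation sketched above.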

Fully transferring an explicit memory from the hippocampus to the cortex is a gradual process that can take many years. That is why so many of Molaison’s memories disappeared along with his hippocampus. The hippocampus seems to act as something like an orchestra conductor in directing the symphony of our conscious memory. Beyond its involvement in fixing particular memories in the cortex, it is thought to play an important role in weaving together the various contemporaneous memories—visual, spatial, auditory, tactile, emotional—that are stored separately in the brain but that coalesce to form a single, seamless recollection of an event. Neuroscientists also theorize that the hippocampus helps link new memories with older ones, forming the rich mesh of neuronal connections that give memory its flexibility and depth. Many of the connections between memories are likely forged when we are asleep and the hippocampus is relieved of some of its other cognitive chores. Though filled with a combination of seemingly random activations, aspects of the day’s experience, and elements from the distant past, dreams may be a fundamental way in which the mind consolidates the myriad explicit recollections into a coherent set of representations for permanent, consolidated memory. When our sleep suffers, studies show, so, too, does our memory. Now, enforced clerical celibacy is a principle Church theologians cherish largely on the specious grounds of tradition, despite an avalanche of evidence that it has usually been widely ignored or violated, sometimes more, sometimes less than it is today. They also mouth rigid interpretations of the same scripture that all other Christian denominations find compatible with their own married clergy. They brush away all else—loneliness, depression, alienation, for example—as featherweight concerns.

Yet in the mid-1990s, a new situation crushed all their previous objections to dust. Married Roman Catholic priests? Certainly—at least when those priests were converted, ex-Anglican clergymen who had staked their professional lives on a principle of misogyny so breathtaking, it resonated like a familiar echo in the hallowed precincts of the Vatican. This principle was, of course, that women must never be ordained, and the new—and married—priests were former Anglican priests who bucked their Church when, finally, toward the end of the twentieth century, it lurched into the minefield of female ordination. For decades, Rome had already made exceptions to the rule of celibacy in isolated cases, usually of priests converting from other churches, such as the Greek Orthodox, which permits married priests. However, North America and Australia were specifically exempted from this indulgence, though married priests were sometimes “lent” to them by European dioceses. Unlike these men, however, attracted to the Roman rites by any number of factors, the outraged ex-Anglicans were egregious misogynists, united in their rage over female ordination. Many of the ex-Anglican priests who quit their church because of its decision to permit women into the priestly ranks turned to Rome as a solution and requested permission to become Roman Catholic priests, despite their married status. With an alacrity astounding in such a sluggish institution, the pope approved these requests. The United Kingdom’s five Roman Catholic archbishops explained this ruling in a letter to their Church’s other, forcibly celibate priests. “At the present time the Catholic Church is welcoming into full communion a number of married clergymen of the Church of England, often together with their wives and in some cases their children,” their letter began.

“Many of these clergy wish to be ordained priests in the Catholic Church…We are convinced that their ministry will enrich the church…The Holy Father has asked us to be generous. We are confident that you also will welcome and appreciate these new priests when, in due course, they take their place in the presbyterate of our dioceses.” The archbishops, anticipating outcries of resentment from some priests and joy from others, attempted to forestall both. Ordaining these ex-Anglican husbands as Catholic priests in no way implied a change in the ages-old rule of celibacy: “The special permission needed in these cases is by way of exception from the general practice of accepting only single men for the priesthood.” The new priests would be required to “accept the general norm of celibacy and will not be free to marry again”—in fact, the bishops would investigate the stability of their new members’ marriages and evaluate the strength of their wives’ support for this new priestly venture. As well, the Church limited the admissions procedure to four years, so that the dissidents had to act relatively quickly. It also prohibited the suddenly Catholic clergymen from assuming all the duties of regular parish priests. What about former Catholic religious who, forced to choose between ministry and marriage, had opted for the latter? Would they be eligible to resume their places in the Church? Not at all. Unlike the Anglicans who had taken the holy vow of matrimony, the Catholic ex-religious had sworn their vows fully aware that celibacy was an integral component of their vocation. In other words, it was unthinkable that they could be exonerated after having defiantly broken these vows. The most truly stunning aspect of these Statutes for the Admission of Married Former Anglican Clergymen into the Catholic Church is their unbridled misogynism. How else to explain the otherwise inexplicable reversal of the celibacy-principle-cum-policy sanctioned by centuries, popes, and canon law?

How else to understand how the same Church that shrugged off the protests, pleas, and anguish of its own Catholic clergy was suddenly so responsive to the spirituality of clergymen whose sole reason for resigning from their God-given vocation was their church’s decision to permit the ordination of women? Why else would the pope and his advisers, who normally proceeded at a maddeningly sluggish pace, hurtle forth to snatch up this gang of ultraconservative Anglican dissidents? Certainly, after years of indifference to them, the Church did not have a crise de conscience about its unmanned pastorates, nor a moment de panique about the relentless flight of conflicted religious from their cloister back into the uncelibate World. No, what motivated the Church was the strength of the Anglican rebels’ conviction about the fundamental unsuitability of women as priests, a conviction today’s Church Fathers share and are committed to sustaining. Indeed, so strongly did this antiwomen ideology resonate with the Catholic hierarchy that it drowned out questions about just how sincerely these Anglican newcomers could ever accept such tenets as papal infallibility and the Immaculate Conception. The Church’s enthusiastic embrace of the resolutely misogynist ex-priests was, in a general atmosphere of ecumenicism and goodwill, an uncontrolled outburst of antiecumenicism. It was, in fact, nothing less than the public salvaging of men who had defied and challenged the Church of England, the self-serving gesture of fellow loyalists who recognized, and immediately recruited, these fundamentally kindred spirits.

Celibacy might be a brilliant jewel, but it pales beside the harsh light that directs women away from the path humans have trodden and blinds them whenever they band together to challenge the dogma that deems them unfit for priestly ordination. This is because, as we have seen, official Church celibacy stems largely from fear of women’s allure in pleasures of the flesh—an oft-cited image describes them as temples built over sewers—which only complete abstinence can successfully counterbalance. Now-married former priest and Johns Hopkins Medical School lecturer Richard Sipe writes in A Secret World: Sexuality and the Search for Celibacy: “It is hard to overestimate the importance of antifeminism in the foundation of celibate conscientiousness and priestly development for over two centuries when the discipline of celibacy was being solidified (1486 and following).” Seemingly humans stand alone, but they actually do not. One is conscious of a loving presence ever in one and around one, but it is love which has shed all turmoils and troubles, all excitements and illusions, all shortcomings and imperfections. It is hard to overcome desire for pleasures of the flesh, and neither ashamed repression nor unashamed expression will suffice to do so. Hunger and surfeit are both unsatisfactory states. The middle way is better, but it is not a solution in the true meaning of this term. At the time when a child is conceived, two factors contribute powerfully towards its physical nature and physical history. They are the state of the father’s thinking and the mother’s breathing. The pleasures-of-the-flesh urge—bodily urge, physical attraction, animal urge—is often covered with romantic or sentimental tinsel and called love. That most human beings make their paradise depend on the mere fiction of paired bodies is something for a planetary visitor to marvel at.

It is not just the fantastic growth of medical knowledge (and obsoledge) in recent decades that holds revolutionary potential for improving health; it is also the parallel shift in control of that knowledge. Patients today are inundated with previously unavailable medical information, instantly accessible on the Internet and on news programs, many of which routinely feature segments with a physician as host. A degree of background knowledge is also conveyed, with varying degrees of accuracy, by popular TV dramas with titles like Chicago Med and The Good Doctor. When a medical show did an episode about human papillomavirus (HPV)—which, though barely known by that name, happens to be the most common sexually transmitted disease in the United States of America—more than five million viewers learned something about it in one night. Health documentaries abound. A program about cochlear implants in deaf children took a top prize for TV documentaries in Japan. Since 1997, when the U.S. Food and Drug Administration first allowed big pharmaceutical firms to advertise prescription drugs on television, viewers have been bombarded by commercials touting everything from anti-inflammatories and cholesterol-lowering drugs to antihistamines. Most hurriedly list side effects and urge viewers to ask their physicians for further information. Permitting such advertising no doubt encouraged the creation of the twenty-four-hour cable-TV Discovery Health Channel. This avalanche of health-related information, misinformation and knowledge hurled at the individual varies in objectivity and credibility. However, it directs more and more public attention to health issues—and changes the traditional relationships between doctors and patients, encouraging a more take-charge attitude on the part of the latter.

Ironically, while patients have more access to health information of uneven quality, their doctors, driven by pressures to speed up, have less and less time to peruse the latest medical journals, whether online or off, and to communicate adequately with relevant specialists—and with patients. Moreover, while doctors need knowledge about many different conditions and see streams of patients whose faces are scarcely remembered from one visit to the next, educated, persistent, Internet-savvy patients may actually have read more recent research regarding their specific ailments than the doctor. Patients come with printouts of Internet material, photocopies of pages from the Physicians’ Desk Reference or clips from medical journals and health magazines. They ask questions and no longer tug their forelocks in awe of the doctor’s white lab coat. Here, changes in relationship to the deep fundamentals of time and knowledge have radically altered medical reality. In economic terms, the doctor selling services is still a “producer.” The patient, by contrast, is not just a consumer, but a more active “prosumer” capable of making an increasing contribution to the economy’s output of wellness or health. Sometimes producer and prosumer work together; sometimes they work independently of each other; sometimes they work at cross purposes. Yet conventional health statistics and forecasts, by and large, ignore today’s rapid changes in these roles and relationships. Many of us change our diets, quit smoking or drinking, and adopt exercise regimens. If, then, our health improves, how much should be attributed to the doctor and how much to our own efforts? Put differently, how much of the health-care output is created by producers and how much by prosumers? And why do most economists count one and not the other?

According to Lowell Levin, professor emeritus at the Yale School of Public Health, “85 to 90 percent of all medical care in the United States of America is provided by ordinary people.” Self-treatment remedies, he says, include aspirin for headaches, ice packs for sprains, ointments for burns, and much more. Dr. Levin, according to an interview in The World & I, “views all doctors and hospitals as a necessary but undesirable social evil, like jails.” Whatever the actual percentage now, the combination of demography, cost pressures and knowledge all points toward a radical increase in the prosumer factor. However, all this so far ignores what may yet turn out to be the most important change of all: tomorrow’s technology. Add that to the mix and watch what happens. Overpopulation has increased the poverty of the underdeveloped World. Overpopulation is due to oversexed activity. The belief that pleasures of the flesh are here solely for enjoyment is universal. The belief that they are here solely to produce wanted children, with pleasure thrown in as an inducement, is usually rejected. However, the second belief is the correct one, along with a marriage clause. Humans have abused their instinct for pleasures of the flesh so that only its exaggerated, continual exercise is considered normal and proper! The human who is called to the spiritual quest is also called to engage in battle with the terrestrial instincts. If they are allowed to rule one, one will never know peace. And pleasures of the flesh being one of the most powerful of such instincts, it must necessarily be brought under control and disciplined. This is true of all its three phases: mental, emotional, and physical. It is quite possible, healthy, and natural for a human to live a perfectly continent life for many years, the vital force being re-absorbed into the body, provided one’s mental life is kept equally pure. This is achieved by constant reflection upon the matter from the standpoints of experience, observation, and idealism, as well as by deliberate sublimation when passion is felt.
