
For the most part we understand only gradually the decisive experiences we have in our relation with the World. First we accept what they seem to offer us, we express it, we weave it into a “view,” and then think we are aware of our World. In time, however, we come to see that what we look upon in this view is only an appearance. Not that our experiences have deceived us; rather, we have turned them to our use without penetrating to their heart. As a result, the material World came to be condemned as a carefully laid trap, set up by the Devil for the purpose of deluding and ensnaring man. Christians were warned by the priesthood to beware of the delights of material existence, lest they be sidetracked from their true Earthly purpose, the preparation for a spiritual afterlife. All joys came under the rule of the Church, to be dispensed as the clergy saw fit. The pleasures of the flesh, people were informed, were to be indulged in solely for the purpose of propagating Christian babies. The ecclesiastical severance of man from nature went so far that in the twelfth century a book entitled Hortus Deliciarum issued an ominous warning that the joys of gardening might well be harmful to the redemption of the soul. Yet while these ideals of asceticism and self-denial were taking root in Christian dogma, the inequities of the Church’s administration were apparent to all who had eyes to see. The hypocrisy of the clergy was flagrantly displayed, and on this account many had ceased to listen to long, tedious sermons dealing with the inevitable damnation of sinners. Priests were constantly seducing female members of their own congregations, and as early as the eighth century many convents had been condemned by high ecclesiastical figures as “hotbeds of filthy conversation, drunkenness, and vice.”

As the peasant masses were considered to be the property of the feudal lords whose lands they occupied, they found themselves subject to the caprice of these lords and felt crushed beneath the injustices of the system. The serfs were taxed unmercifully and were subject to the foraging raids frequently made by the soldiers of the nobility, who came to carry off their grain or their women, depending on their inclination. One feudal law went to the extreme of declaring that before a peasant could consummate his marriage, he had to bring his bride to his lord in order that the lord might have the “first fruits” of the marriage. The peasant might forestall this, thereby preserving his wife’s virginity (at least temporarily), by making a prescribed payment to the noble, but the payment was usually too large for the peasant to afford. These injustices usually had the weight of the Church behind them, for the Church was a de facto part of the aristocracy. The social order was presented to the people as being ordained by Heaven and therefore immutable; the Church took the attitude that if God had wanted the situation changed, then He would change it. As long as they were on top of the heap, the Church and the nobility had little impetus to rock the boat. Soon, however, the peasantry grew resentful of the oppression forced on them from above. Peasant revolts spread across Europe and one by one were put down by force of arms. The Church backed up the efforts of the aristocracy to quell the rebels, deeming them anarchists inspired by the forces of Hell trying to topple God’s empire. The result was that the lower classes ceased to listen to a God who had become in their eyes a solemn and unfeeling hypocrite, who dealt fortune to those least deserving and inflicted punishment on those most virtuous. God was the friend and ally of the corrupt nobility, the enemy of the common people.

Disaffection with Christian hypocrisy was not only widespread among the peasantry but was shared by many members of the nobility themselves. It was no wonder, then, that the organized heretical movements which had begun to trickle into the Empire from the south took root in the minds of many. New ideas, carried from the east and south by the Crusaders, who had been sent, ironically enough, to stamp them out, began to take seed in certain Christian circles. Many of these movements, such as the Knights Templar and the various Gnostic heresies, were clear-cut reactions against the corruption rampant in the Church, and they instituted strict vows of chastity and poverty among their priesthoods. By the thirteenth century, the officials in Rome, exhausted by the senseless bloodshed and humiliating defeats in the Holy Land, began to call the troops home and turn their attention to more domestic problems. When they opened their eyes, they not only found blasphemous idolatry being carried on by the peasant masses but also saw their own position being undermined by heresies that had grown too big to be taken lightly. Christianity was rotting beneath its own weight. If the kingdom of God were to be saved from the jaws of Hell, these devout and pious men saw that something would have to be done, and done quickly. The heretics’ own laws, meanwhile, grew steadily tougher, for they felt that man’s only possibility of redemption lay in expurgating evil by denying all forms of carnality. Christianity, to them, was a corrupt force leading humans into sin and degradation, and the rabid opposition they expressed to the Church caused the papacy grave concern.

Manichaean sects, such as the Albigenses and the Cathari, which had been brought into northern and central Europe from Bulgaria, had attained such power that they managed to send out missionaries to various parts of Europe to gather converts. In many areas they were successful. To the Manichaeans, procreation was the ultimate sin, since it was the propagation of materiality. Realizing that the lower elements of the movement would not be able to expunge their terrestrial drives entirely, the elect of these sects may have given their unofficial sanction to pleasures of the flesh that would not result in reproduction. At any rate, under the direction of Innocent III these pockets of dissension were exterminated, many of the groups going underground to carry on their opposition to Christianity. By the time of the suppression of the Manichaeans, the death penalty had come to be used freely in cases of heresy. Today, as in the past, hundreds of thousands of Catholic priests question and rail against the discipline of celibacy. Those tormented by doubt have options. They may take leaves of absence of up to a year, to meditate, pray, and resolve their personal crises. They may seek counseling within the Catholic framework. They may also abandon their vocations and return to the World, a course of action once unthinkable and even now difficult to negotiate successfully. No Catholic religious doubts that celibacy freely chosen, or granted by God’s grace, has the power of infusing the priest with a profound love and serenity that strengthens his ministry and enriches his relations with parishioners. Coerced celibacy, on the other hand, weakens and saddens, embitters and alienates.

Some priests simply endure, with loneliness their overriding companion. Many cheat and take lovers they either disguise as housekeepers or friends or flaunt as mistresses. Others find the struggle intolerable and finally leave; of those who do so, 94 percent identify their discomfort with celibacy as their prime motive. Since Vatican II, over one hundred thousand have joined the exodus, more than one every two hours, nearly one-quarter of the World’s working priests. In the United States, 42 percent of priests leave within twenty-five years of their ordination, which translates into the bleak statistic that half of American priests under the age of sixty have already gone. In Canada in the past two decades alone, the number of priests and nuns has dwindled by a quarter, though 46 percent of Canadians are Catholic and the vast majority—84 percent, versus 71 percent in the United States—are amenable to ministry by married priests. At the same time that droves of priests have been defecting, far fewer novices have felt called to join holy orders. Swiftly and steadily, the World’s complement of Catholic clergy is eroding dangerously, so that nearly half of all parishes have no priest at all. At the root of this phenomenon is compulsory celibacy, the issue that has racked the Church since its earliest days. “It was mainly about celibacy,” said Dominic, an American ex-priest, about his decision to leave. “I was spending too much of my time, my energy, my inner strength, on coping with it. Celibacy was preventing me from being the priest I wanted to be…the Christian I wanted to be. Celibacy had become an end rather than a means. That is the case with any number of priests.” Another priest, returning to his empty house after the emotional outpouring of Sunday mass, felt he “was like a diver who had gone down deep and did not have a decompression chamber when he came up.”

William Cleary, a Jesuit for twenty years, left the priesthood when he became convinced that celibacy is not a virtue and, in its denial of God-given sexuality, may even be “a vague kind of sin.” After all, Cleary argues in “A Letter to My Son: The Sin of Celibacy,” sexuality is God’s vehicle for perpetuating life on Earth and “reveals the Divine Being, and…reveals who we humans are, the incredible depths of this World’s goodness…[and] helps toward prayer, contemplation, and all the religious and human values.” No wonder that Dean R. Hoge notes, in The Future of Catholic Leadership, that “the celibacy requirement is the single most important deterrent to new vocations to the priesthood, and if it were removed, the flow of men into seminaries would increase greatly, maybe fourfold.” It should also be emphasized that traditionally the Church has treated apostates with brutal indifference; leaving is often a miserable experience, emotionally, psychologically, professionally, and financially. In the United States they were “solitary, shuffling pariahs, from whom Christian society asked only that they should disappear off the face of the Earth.” Italians refer to them as the Church’s “White Homicides” because, despite years of service, they are pushed out into the World jobless, scorned, with only about $300 to their name, bereft of friends, ostracized by former colleagues. Even in countries where the Church does not actively persecute those who renounce their vows, the transition from religious to secular life is almost always trying and often frightening. Ex-priests who continue to work for the Church—as pastoral assistants, for example—report that they are underpaid, stigmatized, and humiliated, and that celibate coworkers are unfriendly, hostile, and envious of them.

Too often, ex-priests are dismissed as if they belonged in the same category as disbarred lawyers or other disgraced professionals. However, the majority remain devout Catholics who long to continue practicing their profession of serving God. Thousands of them attempt to do so through Rent-A-Priest, a nonprofit agency begun in 1992 in Framingham, Massachusetts, which now has branches in Canada and South Africa. In the U.S., Rent-A-Priest claims a database of over two thousand estranged priests available to conduct baptisms, funerals, and other sacraments. The majority of these men are now married and call themselves “married Catholic priests.” However, the Church does not acknowledge them as such—the Canadian diocese of Toronto, for instance, refers to them as “laicized priests,” an ambiguous and unsatisfactory classification. Worldwide, the Church deplores Rent-A-Priest’s swelling ranks as a manifestation of the raging conflict over celibacy. More seriously, it refuses to sanction the sacraments these men perform. No wonder, then, that so many priests prefer to remain in holy orders and violate the vow of chastity they find so onerous. But how many is “so many”? Approximately 40 percent of American priests are chronically uncelibate, 20 percent in stable relationships with one or more adult women, nearly as many with adult men. Subtract this cheating 40 percent from the priesthood and 60 percent remain. Are its members at least celibate? Not necessarily. Many of them indulge in occasional erotic adventures, but the Church’s slack tallying of carnal lapses permits about four such episodes annually before it labels their perpetrator sexually active. In other words, about half the Catholic clergy sworn to celibacy is uncelibate. Underneath all the tales, does there lie something different from the tales? And how different?

People must learn to love the soul, and especially seek to purify their thought-life. There are different requirements about the extent and nature of discipline over the pleasures of the flesh at different stages of the path. One’s own innermost promptings are the best guide here, for they come from the higher self. They need, however, to be separated from bodily impulses and emotional broodings, which is difficult to do. It is immaterial for the adept whether one lives a celibate or married life. The attitude toward the pleasures of the flesh will always depend upon individual circumstances. A celibacy reached through insight and not by institutional behest, or an asceticism practised within marriage—in both cases as immaculate in thought as in deed—shows its value in peace and strength. For those who cannot arrive at this admittedly difficult condition, there should be periods of temporary withdrawal from such activity, ranging from a few weeks to a few years. For single persons and dedicated married ones it is a voluntary inner self-discipline. Under the urge of intimate passions, men will form undesirable relationships which bring mental and emotional sufferings, or fall into unpleasant habits, or behave quite ridiculously under the delusion that they are finding happiness. To gratify the desire of the moment without thought of its possible distant, but undesirable, consequences is the mark of immaturity. Those who wish to become truly adult should cultivate the needful qualities. The price of excess pleasure has to be paid in the end. It is paid in unwanted children, unhappy castaways, unpleasant diseases, lost health, and premature ageing. Strength is squandered in the undisciplined pursuit of the pleasures of the flesh. When the mating urge descends on humans, they develop a temporary but immense capacity for glorifying the beloved person, seeing beauties and virtues which may be quite slight or even non-existent. With the eyes so widely out of focus, nature achieves her purpose with ease.

The point lies in this—that the thing involved is a thing of a different nature, however it may put on a human appearance or indulge in its servants their human appetites. It is cold, it is hungry, it is violent, it is illusory. The warm blood of children and the meetings at the Sabbath do not satisfy it. It wants something more and other; it wants “obedience,” it wants “souls,” and yet it pines for matter. It never was, and yet it always is. To explain what I am getting at, I find it helpful to refer to two films, which taken together embody the main lines of my argument. The first film is of recent vintage and is called The Gods Must Be Crazy. It is about a tribal people who live in the Kalahari Desert plains of southern Africa, and what happens to their culture when it is invaded by an empty Coca-Cola bottle tossed from the window of a small plane passing overhead. The bottle lands in the middle of the village and is construed by these gentle people to be a gift from the gods, for they have never seen a bottle before, nor indeed glass. The people are almost immediately charmed by the gift, and not only because of its novelty. The bottle, it turns out, has multiple uses, chief among them the intriguing music it makes when one blows into it. Gradually, however, a change takes place in the tribe. The bottle becomes an irresistible preoccupation. Looking at it, holding it, thinking of things to do with it displace other activities once thought essential. But more than this, the Coke bottle is the only thing these people have ever seen of which there is only one of its kind. And so those who do not have it try to get it from the one who does. And the one who does refuses to give it up. Jealousy, greed, and even violence enter the scene, and come very close to destroying the harmony that has characterized their culture for a thousand years.

The people begin to love their bottle more than they love themselves, and are saved only when the leader of the tribe, convinced that the gods must be crazy, returns the bottle to the gods by throwing it off the top of a mountain. The film is great because it can be read as a metaphor for how leaders in the Catholic Church need to set an example by remaining celibate, refusing to let the false idol of the pleasures of the flesh overcome them and the entire Earth. It also raises two questions of extreme importance to another situation: How does a culture change when new technologies are introduced to it? And is it always desirable for a culture to accommodate itself to the demands of new technologies? The leader of the Kalahari tribe is forced to confront these questions in a way that Americans have refused to do. And because his vision is not obstructed by a belief in what Americans call “technological progress,” he is able with minimal discomfort to decide that the songs of the Coke bottle are not so alluring that they are worth admitting envy, egotism, and greed to a serene culture. The second film relevant to my argument was made in 1967. It is Mel Brooks’s first film, The Producers. The Producers is a rather raucous comedy that has at its center a painful joke: An unscrupulous theatrical producer has figured out that it is relatively easy to turn a buck by producing a play that fails. All he has to do is induce dozens of backers to invest in the play by promising them exorbitant percentages of its profits. When the play fails, there being no profits to disperse, the producer walks away with thousands of dollars that can never be claimed. Of course, the central problem he must solve is to make sure that his play is a disastrous failure. And so he hits upon an excellent idea: he will take the story of Adolf Hitler—and make it into a musical.

Because the producer is only a crook and not a fool, he assumes that the stupidity of making a musical on this theme will be immediately grasped by audiences and that they will leave the theater in dumbfounded rage. So he calls his play Springtime for Hitler, which is also the name of its most important song. The song begins with the lyrics: Springtime for Hitler and Germany; winter for Poland and France. The melody is catchy, and when the song is sung it is accompanied by a happy chorus line. (One must understand, of course, that Springtime for Hitler is no spoof of Hitler, as was, for example, Charlie Chaplin’s The Great Dictator. The play is instead a kind of denial of Hitler in song and dance; as if to say, it was all in fun.) The ending of the movie is predictable. The audience loves the play and leaves the theater humming Springtime for Hitler. The musical becomes a great hit. The producer ends up in jail, his joke having turned back on him. However, Brooks’s point is that the joke is on us. Although the film was made years before a movie actor became President of the United States, Brooks was making a kind of prophecy about that—namely, that the producers of American culture will increasingly turn our history, politics, religion, commerce, and education into forms of entertainment, and that we will become as a result a trivial people, incapable of coping with complexity, ambiguity, uncertainty, perhaps even reality. We will become, in a phrase, a people amused into stupidity. For those readers who are not inclined to take Mel Brooks as seriously as I do, let me remind you that the prophecy I attribute here to Brooks was, in fact, made many years before by a more formidable social critic than he. I refer to Aldous Huxley, who wrote Brave New World at the time that the modern monuments to intellectual stupidity were taking shape: Nazism in Germany, fascism in Italy, communism in Russia.

However, Huxley was not concerned in his book with such naked and crude forms of intellectual suicide. He saw beyond them, and mostly, I must add, he saw America. To be more specific, he foresaw that the greatest threat to the intelligence and humane creativity of our culture would not come from Big Brother and Ministries of Propaganda, or gulags and concentration camps. He prophesied, if I may put it this way, that there is tyranny lurking in a Coca-Cola bottle; that we could be ruined not by what we fear and hate but by what we welcome and love, by what we construe to be a gift from the gods. And in case anyone missed his point in 1932, Huxley wrote Brave New World Revisited twenty years later. By then, George Orwell’s 1984 had been published, and it was inevitable that Huxley would compare Orwell’s book with his own. The difference, he said, is that in Orwell’s book people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. The Coke bottle that has fallen in our midst is a corporation of dazzling technologies whose forms turn all serious public business into a kind of Springtime for Hitler musical. Television is the principal instrument of this disaster, in part because it is the medium Americans most dearly love, and in part because it has become the command center of our culture. Americans turn to television not only for their light entertainment but for their news, their weather, their politics, their religion, their history—all of which may be said to be their serious entertainment. The light entertainment is not the problem. The least dangerous things on television are its junk. What I am talking about is television’s preemption of our culture’s most serious business.

It would be merely banal to say that television presents us with entertaining subject matter. It is quite another thing to say that on television all subject matter is presented as entertaining. And that is how television brings ruin to any intelligent understanding of public affairs. Political campaigns, for example, are now conducted largely in the form of television commercials (and you will notice that the party with the smaller number of commercials usually loses). Candidates forgo precision, complexity, substance—in some cases, language itself—for the arts of show business: music, imagery, celebrities, theatrics. Indeed, political figures have become so good at this, and so accustomed to it, that they do television commercials even when they are not campaigning, as, for example, Geraldine Ferraro for Diet Pepsi and former Vice-Presidential candidate William Miller and the late Senator Sam Ervin for American Express. More recently, President Joe Biden has endorsed masks on TV, and Vice President Kamala Harris vaccines. Furthermore, political figures appear on variety shows, soap operas, sit-coms, and YouTube videos. Barack Obama appeared on Armin van Buuren’s A State of Trance, episode 1000, I believe. George McGovern, Ralph Nader, Ed Koch, and Jesse Jackson have all hosted “Saturday Night Live.” Henry Kissinger and former President Gerald Ford have done cameo roles on “Dynasty.” Tip O’Neill and Governor Michael Dukakis have appeared on “Cheers.” Michelle Obama has appeared on “Ellen.” Richard Nixon did a short stint on “Laugh-In.” The late Senator from Illinois, Everett Dirksen, was on “What’s My Line?,” a prophetic question if ever there was one. What is the line of these people? Or, more precisely, where is the line that one ought to be able to draw between politics and entertainment? I would suggest that television has annihilated it.

There was a time while working on this report that we became thrilled about the implications of the human ingestion of light. As we began to understand for the first time that there is a concrete relationship between our bodies and light, and that light is a kind of thing we ingest for nourishment and growth, like food, we began to feel that humans probably hunger for and seek light the way plants do. We know that humans seek food. A lot of life is spent in this process. We can say that seeking food is instinctive in all humans. Even babies know how to do it, within their limits. If light is also food, then might we not seek it, as plants do? Is this why we look at the moon? Is this why we gaze at fire? Is there an innate longing for light, like a kind of cellular hunger? If so, then I suppose Anne Waldman could be right. With natural light gone, we seek a surrogate light: television. Every culture and religion in history has placed light at the center of its cosmology. “Receive the light.” “Seek enlightenment.” “The truth always comes to the light.” “The mind of light.” “The luminescent soul.” The Hopi Indians speak of light as entering them through the tops of their heads. It is a goal of theirs to keep the tops of their heads open for light. Of course they are speaking in spiritual terms. It is very efficient and sensible to develop religions around natural processes which are the bases of survival. Most indigenous cultures do that. The Bolivian Indians have a meditational routine every day at the same time, sitting high on a cliff facing the sun. They call it “taking light.” They give it the same kind of meaning as “taking waters.” They claim it has medicinal value, as well as stimulating spontaneous insight. There is hardly a medicine/healing system in the World where light is not used for health purposes…physical, mental, spiritual.

Chinese healing systems coordinate treatments of various organs with foods of specific colors. For lung disorders, for example, white foods like turnips and onions will be prescribed. Heart disorders are assisted by eating red foods such as beets and pomegranates. These might be combined with meditational practices in which the patient is asked to keep a certain color in mind. A spleen problem is considered to be caused partly by the body’s insufficient absorption of nutrients found in green vegetables. Intestinal problems may be caused by an insufficiency or an overabundance of foods containing pink light. In Mahayana Buddhism, each chakra (energy center) of the body is described as processing certain parts of the color spectrum, while also intermixing the colors processed by other energy centers. In acupuncture, the two principal light-reception glands, the pineal and the pituitary, are the subject of specific light treatments, designed to keep them in balance. Many cultures consider the body’s experience of color, which is to say spectra, a prime factor in health. When faced with this kind of evidence, however, this culture places it all in a “primitive” category. We consider it superstition or mythology rather than knowledge or science. Another fascinating scientific question is how much remains to be learned about the workings of explicit and even implicit memory; much of what we now know will be revised and refined through future research. However, the growing body of evidence makes clear that the memory inside our heads is the product of an extraordinarily complex natural process that is, at every instant, exquisitely tuned to the unique environment in which each of us lives and the unique pattern of experiences that each of us goes through.

The old botanical metaphors for memory, with their emphasis on continual, indeterminate organic growth, are, it turns out, remarkably apt. In fact, they seem to be more fitting than our new, fashionably high-tech metaphors, which equate biological memory with the precisely defined bits of digital data stored in databases and processed by computer chips. One salient lesson to emerge is how different biological memory is from computer memory. Governed by highly variable biological signals, chemical, electric, and genetic, every aspect of human memory—the way it is formed, maintained, connected, recalled—has almost infinite gradations. Computer memory exists as simple binary bits—ones and zeros—that are processed through fixed circuits, which can be either open or closed but nothing in between. The process of long-term memory creation in the human brain is one of the incredible processes that is so clearly different from “artificial brains” like those in a computer. While an artificial brain absorbs information and immediately saves it in its memory, the human brain continues to process information long after it is received, and the quality of memories depends on how the information is processed. Biological memory is alive; computer memory is not. Those who celebrate the “outsourcing” of memory to the Web have been misled by a metaphor. They overlook the fundamentally organic nature of biological memory. What gives real memory its richness and its character, not to mention its mystery and fragility, is its contingency. It exists in time, changing as the body changes. Indeed, the very act of recalling a memory appears to restart the entire process of consolidation, including the generation of proteins to form new synaptic terminals.

Once we bring an explicit long-term memory back into working memory, it becomes a short-term memory again. When we reconsolidate it, it gains a new set of connections—a new context. The brain that does the remembering is not the brain that formed the initial memory. In order for the old memory to make sense in the current brain, the memory has to be updated. Biological memory is in a perpetual state of renewal. The memory stored in a computer, by contrast, takes the form of distinct and static bits; you can move the bits from one storage drive to another as many times as you like, and they will always remain precisely as they were. The proponents of the outsourcing idea also confuse working memory with long-term memory. When a person fails to consolidate a fact, an idea, or an experience in long-term memory, one is not “freeing up” space in one’s brain for other functions. In contrast to working memory, with its constrained capacity, long-term memory expands and contracts with almost unlimited elasticity, thanks to the brain’s ability to grow and prune synaptic terminals and continually adjust the strength of synaptic connections. Unlike a computer, the normal human brain never reaches a point at which experience can no longer be committed to memory; the brain cannot be full. The amount of information that can be stored in long-term memory is virtually boundless. Evidence suggests, moreover, that as we build up our personal store of memories our minds become sharper. The very act of remembering appears to modify the brain in a way that can make it easier to learn ideas and skills in the future. We do not constrain our mental powers when we store new long-term memories. We strengthen them. With each expansion of our memory comes an enlargement of our intelligence.
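The “distinct and static bits” point can be made concrete with a small sketch (Python, purely illustrative; the byte string and copy count are invented for the example). Digital data can be copied any number of times and remain bit-for-bit identical, which is precisely what a reconsolidating biological memory never does:

```python
import hashlib

# The "memory" here is just a sequence of bytes: distinct, static bits.
original = b"a decisive experience, recorded once"
fingerprint = hashlib.sha256(original).hexdigest()

# Simulate moving the bits between storage locations many times.
copy = original
for _ in range(1000):
    copy = bytes(copy)  # each "transfer" reproduces the bits exactly

# After any number of copies, the data is precisely as it was.
assert hashlib.sha256(copy).hexdigest() == fingerprint
print("identical after 1000 copies:", copy == original)  # prints True
```

Storage hardware can of course fail, but at the logical level a digital copy is always exact; biological recall, by contrast, rewrites what it retrieves.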

A great way to encourage students to study is to tell them that one day they might be as rich as Bill Gates, who is said to have spent some six hours a day studying. Nowadays, the person to emulate might be Elon Musk. Nonetheless, the Web provides a convenient and compelling supplement to personal memory, but when we start using the Web as a substitute for personal memory, bypassing the inner processes of consolidation, we risk emptying our minds of their riches. In the 1970s, when schools began allowing students to use portable calculators, many parents objected. They worried that a reliance on the machines would weaken their children’s grasp of mathematical concepts. The fears, subsequent studies showed, were largely unwarranted. No longer forced to spend a lot of time on routine calculations, many students gained a deeper understanding of the principles underlying their exercises. Today, the story of the calculator is often used to support the argument that our growing dependence on online databases is benign, even liberating. In freeing us from the work of remembering, it is said, the Web allows us to devote more time to creative thought. However, the parallel is flawed. The pocket calculator relieved the pressure on our working memory, letting us deploy that critical short-term store for more abstract reasoning. As the experience of math students has shown, the calculator made it easier for the brain to transfer ideas from working memory to long-term memory and encode them in the conceptual schemas that are so important to building knowledge. The Web has a very different effect. It places more pressure on our working memory, not only diverting resources from our higher reasoning faculties but obstructing the consolidation of long-term memories and the development of schemas. The calculator, a powerful but highly specialized tool, turned out to be an aid to memory; still, think how much sharper one’s brain might be without a calculator’s assistance.
The Web, by contrast, is a technology of forgetfulness.

Patients do not prosume health just by exercising more or quitting tobacco. They invest their money in technologies that can help them better care for themselves and their families. In 1965, about the only equipment available in the home consisted of canes, crutches, walkers, and beds. In 1980, when we first called attention to it in The Third Wave, the market for home-use medical instruments was still relatively tiny. Today, patients handle 99 percent of diabetes-management responsibilities, and sales of home-use diabetes-management products total $25.31 billion and are expected to reach $41.88 billion by 2028. Home-care technology, however, is no longer limited to a few basic products such as insulin-infusion kits, blood-pressure machines, and pregnancy-test kits. An ever-widening array of technologies is springing up to help prosumers care for themselves or their loved ones. Anyone going online today can buy self-testing equipment for the detection of everything from allergies to COVID-19, prostate cancer to hepatitis. Trouble with your hand? The FlagHouse catalog for "special populations" offers a "finger goniometer" for measuring range of motion in metacarpophalangeal finger joints. Along with it you can get a hydraulic dynamometer and hydraulic pinch gauge for other measurements of the hand. Trouble breathing? You can buy an ultrasonic nebulizer, a spirometer, even a lifesaving ventilator unit. You can lay in a neurologist's hammer or your own pediatric stethoscope. Women can regularly monitor their estradiol, testosterone, and progesterone levels. There are new home tests for osteoporosis and colon cancer. According to the Food and Drug Administration (FDA), home-care systems are "the fastest growing segment of the medical device industry." Yet all this empowering self-help technology is still primitive compared with what lies ahead.

An FDA magazine reports that "the list of planned and imagined medical devices reads like a work of science fiction…imagine a toothbrush with a biosensing chip that checks your blood sugar and bacteria levels while you are brushing…computerized eyeglasses with a tiny embedded display that can help those who wear them remember people and things…a smart bandage…that could detect bacteria or virus in a wound and tell the wearer if treatment with antibiotics is warranted and which to use." There is a "smart T-shirt" that monitored the vital signs of climbers on a recent expedition to Mount Everest. Upcoming developments also include a hands-free device that allows a disabled person to operate machines by blinking an eye or by brain activity. Imagine home CAT scans in the privacy of your own marble bathroom. Automatic urine analysis with every flush. Computerized life-expectancy projection updated after each meal. As with any such forecast, not all of these products will see the light of day or prove affordable, practical, and safe. However, they represent only the first drops of a technology torrent to come. It will change the economics of both self-care and paid care. And it represents yet another way in which the mostly unmeasured prosumer economy interacts with the money economy. Prosumers invest money to buy capital goods that will help them perform better in the non-money economy—which will then reduce costs in the money economy. Would not the overall "output" of health be increased by recognizing the essential role of prosuming and changing the ratio of input by the doctor, on one side, and the patient, on the other? Whether we look at demographics or costs, changes in the amount and availability of knowledge or coming breakthroughs in technology, it is clear that prosumers will play an even bigger role in the massive health economy of tomorrow.

It is, therefore, time for economists, instead of regarding the non-money economy as irrelevant or unimportant, to systematically track the most significant ways in which these two economies feed each other and integrate with one another to form an overarching wealth-creating, health-creating system. If we understood these relationships better, they would cast important light on the global health crisis. At a minimum, they might introduce vital new questions into today's relentlessly predictable political debates about health in many countries. If prosumers make huge, unpaid contributions to the level of health everywhere, and if they invest their own money to make those contributions, might it not make sense to reduce overall health costs by educating and training prosumers as we now train producers? One of the investments governments could make in health care would be to instruct schoolchildren on how to be better health prosumers. That would include teaching some of the same things we teach in medical school—basic human anatomy and physiology and the causes and treatment of disease…Teach them to diagnose and treat common minor health problems. We should also teach them which kinds of health problems really require professional help. Imagine a community-based, predictive game for children with type 1 diabetes, designed to help them develop mental models of their own physiologies and to motivate them to check their glucose levels more frequently. The game would encourage diabetic kids, linked together wirelessly, to play on a computer to predict their own and others' glucose levels. The idea is to leverage untapped social dynamics rather than relying entirely on doctor-patient instructions or parental nagging. In a densely cross-connected knowledge-based economy, why continue to think of the health crisis and the educational crisis as separate, rather than as interlinked?
Can we not use imagination to revolutionize the ideas and institutions in both fields? Millions of prosumers are ready to help.
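To make the game mechanic concrete, here is a minimal sketch of how such a prediction-and-leaderboard loop might work. Everything in it is invented for illustration: the class names, the scoring rule (mean absolute error in mg/dL), and the example readings are assumptions, not a description of any real product.

```python
# Hypothetical sketch of a community glucose-prediction game: each child
# predicts their own next reading before checking a real meter, and the
# group is ranked by prediction accuracy to drive the social dynamic.
from dataclasses import dataclass, field


@dataclass
class Player:
    name: str
    # Each round is stored as a (predicted, actual) pair in mg/dL.
    rounds: list = field(default_factory=list)

    def record_round(self, predicted_mgdl: float, actual_mgdl: float) -> None:
        self.rounds.append((predicted_mgdl, actual_mgdl))

    def score(self) -> float:
        """Mean absolute prediction error in mg/dL (lower is better)."""
        if not self.rounds:
            return float("inf")
        return sum(abs(p - a) for p, a in self.rounds) / len(self.rounds)


def leaderboard(players):
    """Rank players by prediction accuracy, best predictor first."""
    return sorted(players, key=lambda p: p.score())


# One round of play with two hypothetical children.
alice, bob = Player("alice"), Player("bob")
alice.record_round(predicted_mgdl=110, actual_mgdl=118)  # off by 8
bob.record_round(predicted_mgdl=140, actual_mgdl=112)    # off by 28
print([p.name for p in leaderboard([alice, bob])])       # ['alice', 'bob']
```

The point of ranking on prediction error rather than on the glucose values themselves is that the only way to climb the leaderboard is to check often and learn one's own physiology, which is exactly the behavior the passage above wants to encourage.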

