Randolph Harris II International


Information Can be Used to Increase Money

An ultramarine sky. Mountains in the distance. The clatter of hoofbeats. A solitary rider draws closer, sun glinting from his spurs… Anyone who sat in a darkened theater enraptured by cowboy movies as a child knows that power springs from the barrel of a six-shooter. In Hollywood film, a lone cowboy rides in from nowhere, fights a duel with the villain, returns his revolver to its holster, and rides off once more into the hazy distance. Power, we children learned, came from violence. A background figure in many of these movies, however, was a well-dressed, paunchy personage who sat behind a big wooden desk. Typically depicted as effete and greedy, this man also exerted power. It was he who financed the railroad, or the land-grabbing cattlemen, or other evil forces. And if the cowboy hero represented the power of violence, this figure—typically the banker—symbolized the power of money. In many westerns there was also a third important character: a crusading newspaper editor, a teacher, a minister, or an educated woman from the “East.” In a World of gruff men who shoot first and question later, this character represented not merely moral Good in combat with Evil, but also the power of culture and sophisticated knowledge about the outside World. While this person often won a victory in the end, it was usually because of an alliance with the gun-toting hero or because of a sudden lucky strike—finding gold in the river or inheriting an unexpected legacy. Knowledge, as Francis Bacon advised us, is power—but for knowledge to win in a western, it usually had to ally itself with force or money.

Of course, cash, culture, and violence are not the only sources of power in everyday life, and power is neither good nor bad. It is a dimension of virtually all human relationships. It is, in fact, the reciprocal of desire, and, since human desires are infinitely varied, anything that can fulfill someone else’s desire is a potential source of power. The drug dealer who can withhold a “fix” has power over the addict. If a politician desires votes, those who can deliver them have power. Yet among the numberless possibilities, the three sources of power symbolized in the western movie—violence, wealth, and knowledge—turn out to be the most important. Each takes many different forms in power plays. Violence, for example, need not be actual; the threat of violence can also lurk behind the law. (We use the term violence in these pages in a figurative, rather than literal, sense—to include force as well as physical coercion.) Indeed, not only modern movies but also ancient myths support the view that violence, wealth, and knowledge are the ultimate sources of social power. Thus Japanese legend tells of sanshu no jingi—the three sacred objects given to the great sun goddess, Amaterasu-omi-kami—which to this day are still the symbols of imperial power. The meanings of the sword and jewel are clear enough; the mirror’s, a bit less so. However, the mirror, in which Amaterasu-omi-kami saw her own visage—or gained knowledge of herself—also reflects power. It came to symbolize her divinity, but it is not unreasonable to regard it as a symbol of imagination, consciousness, and knowledge as well. Furthermore, the sword or muscle, the jewel or money, and the mirror or mind together form a single interactive system. Under certain conditions each can be converted into the others. A gun can get you money or can force secret information from the lips of a victim.

Money can buy you information—or a gun. Information can be used to increase either the money available to you (as Ivan Boesky knew) or to multiply the force at your command (which is why Klaus Fuchs stole nuclear secrets). What is more, all three can be used at almost every level of social life, from the intimacy of home to the political arena. In the private sphere, a parent can slap a child (use force), cut an allowance or bribe with a dollar (use money or its equivalent), or—most effective of all—mold a child’s values so the child wishes to obey. In politics, a government can imprison or torture a dissident, financially punish its critics and pay off its supporters, and it can manipulate truth to manufacture consent. Like machine tools (which can create more machines), force, wealth, or knowledge, properly used, can give one command over many additional, more varied sources of power. Thus, whatever other tools of power may be exploited by a ruling elite or by individuals in their private relationships, force, wealth, and knowledge are the ultimate levers. They form the power triad. It is true that not all shifts or transfers of power are a result of the use of these tools. Power changes hands as a result of many natural events. The Black Death that swept Europe in the 14th century sent the powerful to the grave along with the powerless, creating many vacancies among the elite in the surviving communities. Chance also affects the distribution of power in society. However, as soon as we focus on purposeful human acts, and ask what makes people and whole societies acquiesce to the wishes of the “powerful,” we find ourselves once more facing the trinity of muscle, money, and mind. To stick as closely to plain-speak as possible, we will use the term power in these pages to mean purposeful power over people.
This definition rules out power used against nature or things, but is broad enough to include the power exerted by a mother to prevent a baby from running in front of an onrushing car; or by IBM to increase profits; or by a dictator like Marcos or Noriega to enrich his family and cronies; or by the Catholic Church to line up political opposition to contraception; or by the Chinese military to crush a student rebellion.

In its most naked form, power involves the use of violence, wealth, and knowledge (in the broadest sense) to make people perform in a given way. Zeroing in on this trinity and defining power in this manner permit us to analyze power in a completely fresh way, revealing perhaps more clearly than before exactly how power is used to control our behavior from cradle to cremation. Only when this is understood can we identify and transform those obsolete power structures that threaten our future. Millions of increasingly anxious, often angry people around the globe worry about American domination. However, how long can any society, superpower or not, retain external power if its domestic institutions are in crisis? So far we have mostly referred to the deterioration of American Second Wave or industrial-age institutions piecemeal, one at a time. However, it is only when we expand our analysis and see them in relation to one another that the real picture becomes clear. If the United States of America is so powerful, why is there a crisis in its health system? Why a crisis in its pension system? Its education system, its legal system, and its politics—all at the same time? Is America facing implosion? Why, too, is the American nuclear family—supposedly the bedrock institution of society—acknowledged to be in such distress? In America, fewer than 25 percent of the population now live in homes in which the father goes to work and the wife stays home with one or more children under the age of eighteen—a radical change since the 1960s. Thirty-one percent of U.S. children now live in single-parent or no-parent homes. Some 30 percent of Americans over age sixty-five live alone. Why do 50 percent of marriages end in divorce?

Young Americans now talk about what might be called a formalized “rehearsal marriage”—a childless first marriage before the real show goes on the road. Little wonder loneliness is pandemic in the United States of America. Bitter conflict rages over all these issues. However, the changes are typically debated and fought over in fragmentary fashion, without recognition that the crisis in any one institution may be linked to that in others. The nuclear-family crisis is part of something much, much bigger. Reared in a splintered family system that is changing rapidly, but is barely adapted to twenty-first-century requirements, fifty-one million American kids each day are marched into an education system that is not as efficient as it should be and desperately needs more resources. As we have noted, the United States of America spends $762 billion, or $14,891 per pupil, on public education. Yet 60 percent of high school students cannot read well enough to get through their textbooks, a third of graduates cannot do the basic math required of a beginning carpenter, and nearly a third of young adults cannot locate the Pacific Ocean on a map. Shootings, violence, and drugs in the schools make news whenever a Columbine-style massacre takes place. People blame it on the school system, but it could be because of advice from the Frankford Slasher. The goal should be to neutralize problems, listen to students, and encourage them to work together to find a solution, instead of making problems worse by ignoring them, trying to cover up abusive situations, or calling students crazy. Some children come from very abusive homes. One should read the stories of what people who end up in prison faced in their families. It is really shocking. Young people must be prepared for the knowledge-based economy, and in some cases that means restructuring their behavior and manners.

Just as the family system sends the children into schools that need more resources and guidance, the schools, in turn, send them on to yet another set of broken institutions. If institutions as basic as family and school are in deep trouble in the United States of America, why should it come as a shock to discover that key parts of its economy, too, are malfunctioning? Employers throughout America lament the failure of parents to inculcate good work values in their children, and of schools to teach them twenty-first-century skills. The failure of one institution affects the operations of another. For generations, Americans prided themselves on possessing the World’s cleanest, most efficient financial system, the one most capable of allocating capital to its most productive uses. Having grown up in broken homes and gone through a broken education system, America’s baby boom workers—many of them also investors—should not have been stunned by the chain reaction of scandals that followed the spectacular crash of Enron in 2001. Caught up in an unprecedented flurry of corporate and executive scandals, failures, excesses, number-juggling, and lies were WorldCom, Tyco, Rite Aid, Adelphia Communications, Qwest, Xerox, and a lengthy list of other giant American firms, together with their ever-obliging investment bankers. All followed by more layoffs. Meanwhile, America’s main accounting firms, supposedly there to audit companies’ books and keep them clean, were themselves soon sweating under investigative spotlights. Arthur Andersen, Enron’s auditor, quickly perished, and as Fortune put it, “the Big Four—which together audit a staggering 78 percent of the nation’s 15,000 publicly traded companies—continue to careen from one humiliating headline to the next.”

Satirists pictured ten thousand chief executive officers fleeing across the border into Mexico. Duped investors screamed. Trust in American stock markets and the American business system as a whole sluiced down the sewer. And with it went the jobs and retirement savings of hundreds of thousands of employees. Slowly changing regulatory and enforcement methods, along with legal and social norms, were left in the dust by accelerating changes in business, creating turbulence, confusion—and, for some, irresistible new opportunities at the blurred edges of once-clear boundaries, in yet another manifestation of the de-synchronization effect. At the same time, an additional crack has been widening in the sole superpower’s institutional infrastructure as its companies and their employees struggle to pay the skyrocketing costs of health care insurance. How, one might ask, can the American health system be in dire need of intensive care when in 2021 it spent $4.1 trillion, or $12,530 per person, on health care, compared with, say, Japan’s $4,360 per person? Definitions of a crisis vary, of course, but the facts are that some forty million Americans lack health insurance, deadly errors are daily occurrences in the World’s most heavily funded hospitals, and recurrent health manias spread viruslike through society—anti-tobacco first, then anti-obesity and low-carb diets. What next? On top of that, a health-care executive warns a congressional subcommittee that “the U.S. health care system is about to implode, and Alzheimer’s disease will be the detonator” because the baby boomer generation is reaching the age of the onset of that terrible illness. The fact that health conditions in most other countries are worse does not change the reality. The World’s most expensive health-care system is deeply dysfunctional—and getting more so. One way of dealing with the World’s problems is to turn to God.

Nietzsche was ineluctably led to meditation on the coming to be of God—on God-creation—for God is the highest value, on which the others depend. God is not creative, for God is not. However, God as made by man reflects what man is, unbeknownst to himself. God is said to have made the World of concern to us out of nothing; so man makes something, God, out of nothing. The faith in God and the belief in miracles are closer to the truth than any scientific explanation, which has to overlook or explain away the creative in man. Moses, overpowered by the obscure drives within him, went to the peak of Sinai and brought back tables of values; these values had a necessity, a substantiality more compelling than health or wealth. They were the core of life. There are other possible tables of values—one thousand and one, according to Zarathustra—but these were the ones that made this people what it was and gave it a lifestyle, a unity of inner experience and outer expression of form. There is no prescription for creating the myths that constitute a people, no standardized test that can predict the man who will create them or determine which myths will work or are appropriate. There is the matter and the maker, like stone and sculptor; but in this case the sculptor is not only the efficient cause but the formal and final cause as well. There is nothing that underlies the myth, no substance, no cause. No search for the cause of values, either in the rational quest for knowledge of good and evil or in, for example, their economic determinants, can result in an accurate account of them. Only an openness to the psychological phenomena of creativity can bring any clarity.

This psychology cannot be like Freud’s, which, beginning from Nietzsche’s understanding of the unconscious, finds causes of creativity that blur the difference between a Raphael and a finger painter. Everything is in that difference, which necessarily escapes our science. The unconscious is a great mystery; it is the truth of God, and it—the id—is as unfathomable as was God. Dr. Freud accepted the unconscious, and then tried to give it perfect clarity by means of science. Dr. Freud’s procedure is like trying to determine God’s essence or nature from what he created. God could have created an infinity of Worlds. If He had been limited to this one, He would not have been creative or free. If one is to understand creativity, understanding all of this is necessary. The id is the source; it is elusive and unfathomable and produces World interpretations. Yet natural scientists, among whom Dr. Freud wished to be counted, do not take any of this seriously. Biologists cannot even account for consciousness within their science, let alone the unconscious. So psychologists like Dr. Freud are in an impossible halfway house between science, which does not admit the existence of the phenomena he wishes to explain, and the unconscious, which is outside the jurisdiction of science. It is a choice, so Nietzsche compellingly insists, between science and psychology. Psychology is by that very fact the winner, since science is the product of the psyche. Scientists themselves are gradually being affected by this choice. Perhaps science is only a product of our culture, which we know is no better than any other. Is science true? One sees a bit of decay around the edges of its good conscience, formerly so robust. Books like Thomas Kuhn’s The Structure of Scientific Revolutions are popular symptoms of this condition.

This is where what I called the bottomless or fathomless self, the last version of the self, makes its appearance. Id, Nietzsche named it. The id mocks the ego when a man says, “It occurred to me.” The sovereign consciousness waits on something down below, which sends up its food for thought. The difference between this version and the others is that they began from a common experience, more or less immediately accessible, that all men share, which establishes, if only intersubjectively, a common humanity that can be called human nature. Fear of violent death and desire for comfortable self-preservation were the first stop on the way down. Everybody knows them, and we can recognize one another in them. The next stop was the sweet sentiment of existence, no longer immediately accessible to civilized man but recoverable by him. When under its spell, we can with certainty say to ourselves, “This is what I really am, what I live for,” with the further conviction that the same must be so for all other men. This, allied with a vague, generalized compassion, makes us a species and can give us guidance. At the next stop there turns out to be no stop, and the descent is breathtaking. If one finds anything at all, it is strictly one’s own, what Nietzsche calls one’s fatum, a stubborn, strong individuality that has nothing to say for itself other than that it is. One finds, at best, oneself; and it is incommunicable and isolates each from all the others, rather than uniting them. Only the rarest individual finds their own stopping point from which they can move the World. They are, literally, profound. Though the values, the horizons, the tables of good and evil that originate in the self cannot be said to be true or false, cannot be derived from the common feeling of mankind or justified by the universal standards of reason, they are not equal, contrary to what vulgar teachers of value theory believe.

Nietzsche, and all those serious persons who in one way or another accepted his insight, held that inequality among humans is proved by the fact that there is no common experience accessible in principle to all. Such distinctions as authentic-inauthentic, profound-superficial, creator-created replace true and false. The individual value of one man becomes the polestar for many others whose own experience provides them with no guidance. The rarest of men is the creator, and all other men need and follow him. Imagine the existence of a whole population of individuals employing a certain strategy, and a single mutant individual employing a different strategy. If the mutant can get a higher payoff than the typical member of the population gets, the mutant strategy is said to invade the population. Put in other terms, the whole population can be imagined to be using a single strategy, while a single individual enters the population with a new strategy. The newcomer will then be interacting only with individuals using the native strategy. Moreover, a native will almost certainly be interacting with another native, since the single newcomer is a negligible part of the population. Therefore, if the newcomer gets a higher score with a native than a native gets with another native, the new strategy is said to invade the native strategy. Since natives make up virtually the entire population, the concept of invasion is equivalent to the single mutant individual being able to do better than the population average. This leads directly to the key concept of the evolutionary approach: if no strategy can invade it, a strategy is collectively stable. The biological motivation for this approach is based on the interpretation of the payoffs in terms of fitness (survival and number of offspring). All mutations are possible; and if any could invade a given population, this mutation presumably would have the chance to do so.
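The invasion test described here can be sketched in a few lines of code. The following is a minimal Python illustration, assuming the standard iterated Prisoner's Dilemma payoffs (3 for mutual cooperation, 1 for mutual defection, 5 and 0 for unilateral defection) and two well-known example strategies, TIT FOR TAT and ALL-D; these specifics are illustrative additions, not taken from the text above.

```python
# Sketch of the invasion test: a lone mutant invades a native population
# if it scores more against a native than a native scores against another
# native. Setting: the iterated Prisoner's Dilemma (an assumed example).

def payoff(my_move, their_move):
    """Return my payoff for one round ('C' = cooperate, 'D' = defect)."""
    table = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}
    return table[(my_move, their_move)]

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return opponent_history[-1] if opponent_history else 'C'

def all_d(opponent_history):
    """Always defect, no matter what."""
    return 'D'

def average_score(strategy_a, strategy_b, rounds=200):
    """Average per-round payoff to A when A plays B repeatedly."""
    hist_a, hist_b, total = [], [], 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b)  # A sees B's past moves
        move_b = strategy_b(hist_a)  # B sees A's past moves
        total += payoff(move_a, move_b)
        hist_a.append(move_a)
        hist_b.append(move_b)
    return total / rounds

def invades(mutant, native):
    """The invasion criterion: V(mutant | native) > V(native | native)."""
    return average_score(mutant, native) > average_score(native, native)

print(invades(all_d, tit_for_tat))  # prints False: ALL-D cannot invade
```

Run as written, this reports that a lone ALL-D mutant cannot invade a TIT FOR TAT population: the mutant averages 1.02 per round against natives, while natives average 3.0 with one another. Swapping the arguments shows TIT FOR TAT likewise cannot invade ALL-D, so both count as collectively stable under this criterion, and only clustering of newcomers changes the picture.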

For this reason, only a collectively stable strategy is expected to be able to maintain itself in the long-run equilibrium as the strategy used by all. Collectively stable strategies are important because they are the only ones that an entire population can maintain in the long run in the face of any possible mutant. The motivation for applying collective stability to the analysis of people’s behavior is to discover which kinds of strategies can be maintained by a group in the face of any possible alternative strategy. If a successful alternative strategy exists, it may be found by the “mutant” individual through conscious deliberation, or through trial and error, or through just plain luck. If everyone is using a given strategy and some other strategy can do better in the environment of the current population, then someone is sure to find this better strategy sooner or later. Thus only a strategy that cannot be invaded can maintain itself as the strategy used by all. A warning is in order about this definition of a collectively stable strategy: it assumes that the individuals who are trying out novel strategies do not interact too much with one another. If they do interact in clusters, then new and very important developments are possible. In the last few years, appropriation has been practiced with certain limits; the art category as a whole is left intact, though inner divisions such as those between stylistic periods are breached. The model of Francis Picabia is relevant here. However, twenty-five years ago appropriation worked on the more universalizing model of Duchamp. In this case, the artist turns an eye upon preexisting entities with apparent destinies outside the art context, and, by that turning of the eye, appropriates them into the art realm, making them the property of art. This involves a presupposition that art is not a set of objects but an attitude toward objects, or a cognitive stance (as Oscar Wilde suggested, not a thing, but a way).

If we were to adopt such a stance toward all of life, foregrounding the value of attention rather than issues of personal gain and loss, we would presumably have rendered life a seamlessly appreciative experience. Art then functions like a kind of universal awareness practice, not unlike the mindfulness of the southern Jesus Christ or the “Attention!” of a miracle. Clearly there is a residue of Romantic pantheistic mysticism here, with a hidden ethical request. However, there is also a purely linguistic dimension to the procedure, bound up with the nominalist attitude. If words (such as “art”) lack rigid essences, if they are, rather, empty variables that can be converted to different uses, then usage is the only ground of meaning in language. To be this or that is simply to be called this or that. To be art is to be called art, by the people who supposedly are in charge of the word—artists, critics, curators, art historians, and so on. There is no appeal from the foundation of usage, no higher court on the issue. If something (anything) is presented as art by an artist and contextualized as art within the system, then it is art, and there is nothing anybody can do about it. Conversely, the defenders of the traditional boundaries of the realm will be forced to reify language. They will continue to insist that certain things are, by essence, art, and certain other things, by essence, are not art. However, in an intellectual milieu dominated by linguistic philosophy and structural linguistics, the procedure of appropriation by designation, based on the authority of usage and the willingness to manipulate it, has for a while been rather widely accepted. During this time the artist has had a new option: to choose to manipulate language and context, which in turn manipulate mental focus by rearrangement of the category network within which our experience is organized.

Try to remember a time when you first read a book or heard a radio show or podcast, and then later saw a film or a television program of the same work. If you read, say, The Queen of the Damned, Pamela: Or, Virtue Rewarded, Roots, Marjorie Morningstar, or From Here to Eternity, or heard a radio show such as “The Lone Ranger” first, you created your own internal image of the events described while you read or listened. You imagined the characters, the events, and the ambiance. You made pictures in your mind. These pictures were yours. Of course they were influenced by the author—what he or she told you—but the creation of the actual image was up to you. Marjorie Morningstar was an image in your mind before you saw the film. Then you saw the film with Natalie Wood playing Marjorie. Once you had seen Natalie Wood in the role, could you recover the image you had made up? Marjorie became Natalie Wood from that point on. So we can say that when your self-produced image was made concrete for you, your own image disappeared. When you listened to “The Lone Ranger” on the radio, you created a picture of him and Tonto. When you saw them later on television, could you retain your own image, or did you get stuck with the actors? It was almost certainly the latter. If you then heard the radio program again, what image of the characters were you left with? In any competition between an internally generated image and one that is later solidified for you via moving-image media, your own image is superseded. Moses is Charlton Heston. The Sundance Kid is Robert Redford. Isis is a Sunday morning cartoon. Woodward and Bernstein are Redford and Hoffman. Buffalo Bill is Paul Newman. McMurphy is Jack Nicholson. (When Carlos Castaneda was offered an enormous sum of money to sell the screen rights to the Don Juan series, he refused, saying, “I don’t want to see Don Juan turned into Anthony Quinn.”)

Let me ask the question in reverse. If you saw the movie version of The Queen of the Damned before you read the book, could you develop your own image of Queen Akasha? Or did she remain Aaliyah Haughton? Did you see Natalie Wood in the part before you read Marjorie Morningstar? I doubt it very much. Once the concrete image is in you, it stays. The power that television images have to replace imaginary images that you created yourself operates in all realms of external-image information. All of our minds are filled with images of places and times and people and stories with which we have never had personal contact. In fact, when you receive information from any source that does not have pictures attached to it, you make up pictures to go with it. They are your images. You create the movie to go with the story. You hear the word “America” and a picture comes to mind. These internal movies can be of historical events and periods, such as the signing of the Declaration of Independence or the age of dinosaurs. They may be of happenings to which we have no direct access, such as life in a primitive village, or of exotic places we have never been—Borneo, China, and the Moon. The question is this: once television provides an image of these places and times, what happens to your own image? Does it give way to the TV image or do you retain it? Try this: I am going to mention names and places, and you see what you come up with. How about life under the sea? The Winchester Mystery House? Life in an Inuit village? Disneyland? A preoperation conference of doctors? An American farm family? The war room in the Pentagon? The Battle of Little Big Horn? The Old South? The Crusades? The landing of the Pilgrims? The flight of Amelia Earhart? An emergency ambulance crew? A Stone Age tribe? The Old West?

Were you able to come up with images for any or all of them? It is extremely unlikely that you have experienced more than one or two of them personally. Obviously the images were either out of your own imagination or else they were from the media. Can you identify which was which? Most of the people in America right now would probably say that the images they carry in their minds of the Old South are from one of two television presentations: Gone With the Wind and Roots. These were, after all, the two most popular television shows in history, witnessed by more than 130 million people each. And none of the 130 million was actually in the Old South. Historical periods like the Crusades or the Old West are frequently pictured on television and in films. If I asked people to bring those pictures to mind, I have little doubt that most would call upon their film or TV images. How could it be otherwise? The same applies to the depictions of lifestyles. What images do you use to understand the quality of life for same-gender couples? Or artists? Or farm laborers? Or members of the American Liberation Party? What images do you carry of farmers in the Middle West of America, or the nomads in the Sahara, or Indians of the Amazon? Like historical periods, or groups of people with whom you are not in personal contact, most current events are also removed from your direct participation. You watch news reports in which anchor John Doe tells you what is happening in China. You watch congresswoman Jane Doe explain events in Chile, and then you see a street in Santiago. You see pictures of grounded oil tankers or fighting in Angola or elections in Sweden or scientific testimony on nuclear power. You do not participate in these things and you cannot see them for yourself. The images you have of them are derived from the media, and this becomes the totality of your image bank.

Technology has been moving toward greater control of the structure of matter for millennia. For decades, microtechnology has been building ever-smaller devices, working toward the molecular size scale from the top down. For a century or more, chemistry has been building ever-larger molecules, working up toward molecules large enough to serve as machines. The research is global, and the competition is heating up. Since the concept of molecular nanotechnology was first laid out, scientists have developed more powerful capabilities in chemistry and molecular manipulation. There is now a better picture of how those capabilities can come together in the next steps, and of how advanced molecular manufacturing can work. Nanotechnology has arrived as an idea and as a research direction, though not yet as a reality. Naturally occurring molecular machines exist already. Researchers are learning to design new ones. The trend is clear, and it will accelerate because better molecular machines can help build even better molecular machines. By the standards of daily life, the development of molecular nanotechnology will be gradual, spanning years or decades, yet by the ponderous standards of human history it will happen in an eyeblink. In retrospect, the wholesale replacement of twentieth-century technologies will surely be seen as a technological revolution, as a process encompassing a great breakthrough. Today, we live in the end of the pre-breakthrough era, with pre-breakthrough technologies, hopes, fears, and preoccupations that often seem permanent, as did the Cold War. Yet it seems that the breakthrough era is not a matter for some future generation, but for our own. These developments are taking shape right now, and it would be rash to assume that their consequences will be many years delayed.

To get a picture of the future, we must picture what nanotechnology can do. This can be hard to grasp because past advanced technologies—microwave tubes, lasers, superconductors, satellites, robots, and the like—have come trickling out of factories, at first with high price tags and narrow applications. Molecular manufacturing, though, will be more like computers: a flexible technology with a huge range of applications. And molecular manufacturing will not come trickling out of conventional factories as computers did: it will replace factories and replace or upgrade their products. This is something new and basic, not just another twentieth-century gadget. It will arise out of twentieth-century trends in science, but it will break the trend-line in technology, economics, and environmental affairs. Calculators were once thousand-dollar desktop clunkers, but microelectronics made them fast and efficient, sized to a child’s pocket and priced to a child’s budget. Now imagine a revolution of similar magnitude, but applied to everything else. There are already watches that allow one to pay the balance for purchases made at the store, threatening to make cash, checks, and credit cards obsolete. As the spirit of Thamus reminds us, tools have a way of intruding on even the most unified set of cultural beliefs. There are limits to the power of both theology and metaphysics, and technology has business to do which sometimes cannot be stayed by any force. Perhaps the most interesting example of a drastic technological disruption of a tool-using culture is the eighth-century use of the stirrup by the Franks under the leadership of Charles Martel. Until this time, the principal use of the horse in combat was to transport warriors to the scene of the battle, whereupon they dismounted to meet the foe. The stirrup made it possible to fight on horseback, and this created an awesome new military technology: mounted shock combat.
The new form of combat, as Lynn White, Jr., has meticulously detailed, enlarged the importance of the knightly class and changed the nature of feudal society.

Landholders found it necessary to secure the services of cavalry for protection. Eventually, the knights seized control of church lands and distributed them to vassals on condition that they stay in the service of the knights. If a pun will be allowed here, the stirrup was in the saddle, and it took feudal society where it would not otherwise have gone. To take a later example: I have already alluded to the transformation of the mechanical clock in the fourteenth century from an instrument of religious observance to an instrument of commercial enterprise. That transformation is sometimes given a specific date—1370—when King Charles V ordered all citizens of Paris to regulate their private, commercial, and industrial life by the bells of the Royal Palace clock, which struck every sixty minutes. All churches in Paris were similarly required to regulate their clocks, in disregard of the canonical hours. Thus, the church had to give material interests precedence over spiritual needs. Here is a clear example of a tool being employed to loosen the authority of the central institution of medieval life. Technologies can make life easier, but they can also create complications. That is why it is important to distinguish between apparent and true happiness, and to have great discernment to penetrate into the profundities of true happiness and to feel it more passionately. Virtue is its own reward. The self-enjoyment of a moral person is something everyone should experience. The conduct of a moral human’s life, and one’s happiness in one’s nature, transcend the realm of ethics as well as that of self-consciousness. Both are to be understood only from a human’s intercourse with God, which is the basic theme of Christian life. Humans who follow the ways of the wicked will go their own way and learn somewhere or other, at some point in their journey, that what they had all along taken to be a way is no way, that this alleged way leads nowhere.
And now they can see neither before nor after; their life now is wayless.
