Human lifestyles have undergone many major changes in the past twelve thousand years, as we adapted from nomadic groups to sedentary villages to denser towns and then cities.

In that process, we also expanded our technologies from the Upper Paleolithic (the end of the Old Stone Age) to the Information Age, completely changing how we interact with the world around us.

As our predecessors went through these transformations, they adopted myths about how life should be lived. Many of the myths that arose in the past few hundred years have been helpful to technological progress, but harmful to the individual and sometimes to the culture.

This page questions some of those lifestyle myths by examining where they came from and the context in which they arose. The descriptions given are overly brief and are meant to be just the beginning of an explanation. Each explanation can easily be validated by doing your own research.

See also: Unhealthy Myths

Myth: Nuclear families are self-sufficient.

Most humans through most of our existence have organized in groups based on extended family and/or a close-knit tribe.

The concept of two parents and their children as the basic organizing unit of society seems to have arisen in England in the 1200s. That primacy of the nuclear family was somewhat advantageous as cities grew and then as industrial society developed, leading to further expansion of the lifestyle.

This nuclear family lifestyle was advantageous for industrial development, and sometimes advantageous to the individual. But the primary benefits went first to the lords and later to the corporations.

The effect on individuals has often been quite negative, as the complexities and demands of child-rearing are a heavy burden for only two individuals to take on alone. Parents often find themselves choosing between overextending themselves and neglecting their children.

In a culture where there is a larger social unit that takes some collective responsibility for all the children, parents are less burdened and children are stronger.

Myth: Children should be in classrooms.

Our modern school system seems to have its roots in the Prussian educational system of the nineteenth century, which arose to meet the demands of the industrial revolution. It was developed “in close cooperation with the churches and tried to impose a strict ethos of duty, sobriety and discipline.”

That ethos is exactly what industrial interests needed to train a class of workers for their factories and offices. But that ethos is not necessarily what healthy children need to become healthy adults.

Children have little natural inclination to sit still and “behave” for hours on end. School trains them to behave that way so that they will keep behaving that way in their adult jobs.

In land-connected cultures, young children play around the adults as the adults work. As the children get older, a combination of prodding by adults and the child’s natural curiosity leads them to get more and more involved in the work and to learn all the complexities of doing it.

Myth: Work, play, and exercise are distinct.

The main reason white-collar workers exercise is that their daily lives contain so little physical activity.

People who work in physically laborious jobs are much less likely to do exercise activities after work. Similarly, people in non-industrial cultures don’t do exercise activities except in the form of play or practice.

Play is something that modern culture relegates to children. Conventional norms seem to suggest that adults are allowed to play only within structured activities like sports.

This aversion to adult play is probably linked to the ethic taught in schools, as noted above. After one or two decades of being forced to “behave like an adult,” most people have been broken of their “childish” habits.

Myth: Buildings need lawns.

Modern landscaping focuses on large, unbroken stretches of grass with few or no edible or fruiting plants. Such lawns have their origin in the Victorian era, when “people of means” could flaunt that they were so wealthy they did not have to grow their own food.

There are some advantages. Lawns can be appealing to the eye, they provide a space for gathering and play, and they provide a buffer between domestic space and wilderness. Keeping fruiting plants away from the lawn also keeps it free of rotting fruit.

But we have taken this idea much too far. Most homes have no wilderness near them to buffer against, and most lawns serve no functional purpose. The lawns in front of houses are rarely used, and the ones around commercial buildings are purely “decorative.”

So now most people reside on fallow land, with no access to food except through grocery stores fed by long, fragile supply chains from far-away farms.

Myth: Farming is for peasants.

Several thousand years ago, some agricultural cultures began specializing individuals into roles such as warrior, statesman, and merchant. Those specialists started looking down on farmers as less skilled and as bringing less innovative force to the culture.

Those farmers were also unable to defend themselves against specialized warriors and the people those warriors served, and so were often attacked, displaced, conscripted, taxed, and otherwise mistreated.

As cultures evolved, the food-producers usually remained at the bottom of any hierarchy. And today mainstream culture continues that prejudice, looking at farming as dirty, low-skilled, low-value work.

And farming happens so far away from urban culture that many people never even see what a farm looks like. Food is thought of as something that just appears in grocery stores, and the actual touching of food is left to the lowest-paid migrant workers.

This bias is deep-rooted, though it has no real objective basis. It carries the fallacy that food-growing is something to be performed only by others, never by oneself and one’s community. And it helps lock us into a state in which almost all of us are alienated from our food and utterly dependent on distant strangers to feed us.

Myth: More comfort/ease is always better.

Modern consumer culture sells comfort and convenience under the principle that anything you can afford that makes your day easier is inherently good.

While there is some appeal to this principle, it leaves out the cost of that convenience. A convenience may be good for you in the moment, but its effects on other people and on our world can be enormous.

As an example, rechargeable batteries make possible conveniences like wireless headsets and robotic vacuums, but the cobalt used to make rechargeable batteries comes at the price of substantial human suffering (particularly in the DRC) while laying large tracts of land to waste.

In fact, the destruction involved in making almost every consumer product is enormous. As another example, see a detailed look at the environmental costs of manufacturing aluminum.

Myth: All costs can be measured by money.

Our culture puts forward the idea that all value and all cost can be reduced to money. And yet it is clear that monetary valuations overlook many different kinds of value.

Nature is generally given no value in such calculations, other than in terms of the resources that can be extracted from it. The natural value of a parcel of land is defined by the amount of wood that can be logged from the trees, the amount of nutrients in the soil, and the amount of minerals underneath.

And when calculations of value are turned to people, their value is assessed by the amount of capital they can return. In these calculations, tasks that have traditionally been done by women (child rearing, teaching, cleaning, cooking, caring for the sick and elderly) are often greatly undervalued.

Myth: Waste is an inevitable nuisance.

Trash (including things that we recycle) and sewage are looked at as incidental and inevitable products of life that grudgingly need to be dealt with.

A thing becomes trash when it is not economically worthwhile to recover materials from it, and (as the practice goes) it should therefore be disposed of along with everything else that is not worth recovering.

This approach works well enough for other animals, whose waste consists of small amounts of simple organic matter. Humans, with our vastly more sophisticated technologies, cannot afford to think this way.

Yet when plastics started moving into the economy in the 1950s, consumer demand and corporate merchandising led to an increasingly “throw-away” culture. Disposable diapers were introduced in the early 1960s, and by the late 1970s major soda manufacturers were shifting from returnable glass bottles to disposable plastic ones.

Now people are trained by the culture to believe that all foods come wrapped in disposable containers, and that any product that breaks cannot be repaired or recovered but instead also becomes trash.

Myth: Toilets are like garbage cans.

Human waste, both solid and liquid, is highly valuable fertilizer. It can also be a disease vector, and much of human progress in the past millennium has come from improving sanitation.

However, the solution has always been to get the waste far away from people and get rid of it. No thought is given to recovering any nutrients, even though the pathogens will die off within a year or two if the waste is appropriately applied to fallow land.

But historically we have always looked at human waste as essentially toxic, and so we have had no qualms about adding more toxic substances to it.

We now regularly contaminate our sewage with countless cleaning chemicals, detergents, plastics (including those in shampoos), pharmaceuticals, and whatever other liquids and small solids we can fit in a toilet.

The end result is that even when we try to process and sterilize sewage, it still makes horrible fertilizer: contaminated with countless plastics and chemicals.

Myth: Scent is innocuous.

Studies have shown that more than 1 in 10 people have an allergic reaction to fragrances. And yet modern consumer culture is saturated with fragrances in every walk of life.

In places like the US, people typically sleep on laundry-scented sheets, wash with scented soaps, wear clothing with laundry scent, and then layer on various scented personal care items, all while being surrounded by the residue of heavily-scented cleaning products.

All of these fragrance chemicals add to the overall inflammatory and immunological burden on our bodies. Whether or not one is “allergic” to any particular fragrance, this constant exposure is generally detrimental to one’s overall health.

And this constant bath of fragrances dulls our sense of smell to near-uselessness. In non-industrial cultures, the sense of smell is used to predict weather, diagnose ailments, communicate social cues, and much more.


Return to: Unhealthy Myths

Modified: