Summary
A participant in a recent writing workshop used the phrase "a golf ball to the forehead" to describe what her executive clients hoped for. That sudden flash of insight is what I'm looking for, too.
I'm a bit of a writing workshop junkie. Some workshops have been amazing, so I keep getting drawn back, hoping for more. A writer/actor friend recently told me that the two groups of people who will buy anything are writers and actors. I think it's because both groups deal in the unrestrained world of infinite possibilities; it's much easier to get them to believe.
To be honest, I'm not sure if I've ever actually had the flash of insight in the way that it's typically described. For me it has always been preceded by months or even years of effort. When it happens, though, I get big endorphin releases that can last for days.
I'm beginning to think that, to lay the ground for these insights, you must go around knocking out foundational precepts -- things that are such fundamental truths that you don't even think about them, much less question them. This activity is disturbing and difficult at first. Perhaps it is even impossible for many, who can't imagine doing it, just as those of us who do it can't imagine not doing it.
Fortunately, more resources are appearing beyond traditional ones like modern art museums. Seth Godin is one such resource, as he is constantly seeking different ways to look and think. He recently recommended Radiolab, a podcast and public radio show that is outstanding not just because of its amazing content (it is subtitled "on a curiosity bender"), but also because of its production -- they are always adding sound effects and doing other things to tighten up and densify the flow of information, to the point where your attention simply cannot wander. After an hour of listening at this intensity, you're hungry for another one. It makes you realize that most of the problems in our education system come from taking information and presenting it in the most stultifying way possible. In contrast, you could listen to Radiolab-style lectures all the time. The big problem with the show, as Seth Godin points out, is that they don't make episodes fast enough; I'm going to run out of back episodes soon. But I don't care, I can't stop.
Radiolab gives you a steady stream of golf balls to the forehead. There's a downside, as people who have come to open-spaces conferences can attest: it ruins you for the old, tedious way of doing things. I suspect you just become aware that you live bored, but this awareness is disruptive and you can't un-open the can of worms. Go in with your eyes open.
I had a big golf ball to the forehead during the writing workshop. I will probably process this insight for years. It came during a rapid-writing exercise, where one of the things that appeared on my page was this:
We are irrational. Proof: We believe we are rational.
I've been unconsciously struggling with this issue for a long time. Why don't we just behave sensibly? Every time I see people -- including and especially myself -- do dumb things, the same question comes up: we know the right answers, or at least how to find them, why do we adamantly continue down the path of stupid?
If you assume people are rational creatures, then our behavior is crazy and frustrating. But why did there need to be a "dawn of reason?" And just because we have discovered reason and the scientific method, does that mean that it permeates our brains?
It turns out that we are not really wired for reason per se. It's not the way we absorb and internalize information, or make decisions. I get this not only from my own experiences, but from reading (Daniel Pink is a good resource) and also from the Memory and Forgetting and Sleep episodes of Radiolab.
Our brains are wired to seek patterns. Brains are such subtle pattern detectors that they easily cross into the realm where the patterns don't actually exist. This is why we need something like the scientific method, because our brains are a little too good at establishing cause-and-effect relationships -- so good that we need mantras such as "correlation is not causation."
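The "correlation is not causation" point can be made concrete with a minimal sketch in plain Python (all numbers, seeds, and thresholds here are illustrative choices of mine, not claims from the text): two random walks generated completely independently will often show a strong correlation, which is exactly the kind of pattern-where-none-exists our brains love to find.

```python
import random

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def random_walk(steps, rng):
    """A series built from independent coin flips: no outside cause at all."""
    pos, path = 0.0, []
    for _ in range(steps):
        pos += rng.choice((-1.0, 1.0))
        path.append(pos)
    return path

# Generate many pairs of mutually independent walks and record how
# strongly each pair "correlates" despite having no connection whatsoever.
strongest = 0.0
for seed in range(50):
    rng = random.Random(seed)
    a = random_walk(200, rng)
    b = random_walk(200, rng)
    strongest = max(strongest, abs(correlation(a, b)))

print(f"strongest spurious correlation found: {strongest:.2f}")
```

Running this, at least one pair of unrelated walks typically correlates quite strongly; a pattern-seeking brain (or a careless analyst) would confidently read a cause-and-effect story into that number.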
I think the age of reason had to "dawn" because, in simpler and more brutal times, the basic correlations were correct enough. It was enough to see that getting hit by a stick or a sword would hurt you, and that putting seeds in the ground would grow food. Believing that performing a ritual to the earth mother would increase the crops might not be functional, but it was probably harmless and likely increased the number of babies.
It was only when we had advanced enough to start investigating subtler things that we needed to apply reason, to distinguish when we had over-matched a pattern. More important to the dawn of reason was having enough people in that situation. Obviously there were earlier pockets of reason, such as the Greek philosophers, but those were small bubbles in a society where the vast majority simply toiled.
So our basic mode is pattern matching, and only sometimes do we get to the point where we must apply reason to sort out the good pattern matches from the bad. But the other important issue about how we work is memory.
The most powerful way to remember something is through pictures that tell stories. The reason Radiolab is so effective is that they make you see what they're talking about, not only by telling the story well, but by adding sounds and other things that make you feel the reality of it. All memory systems, most of them ancient, utilize these pictures-as-stories techniques, because our brain works that way. The easiest way to remember something is to make pictures out of what you want to remember, and then connect those pictures in some story-like fashion.
If you've used these techniques, or just tried to memorize things through repetition, you'll discover that there's a certain squirreliness to memory. According to the Radiolab episode on memory, each time you remember something you do a certain amount of re-creation of that thing. Over time your memories change (this may suggest that a positive mental attitude will cause bad memories to become less bad, and a negative mental attitude will cause good memories to go sour). Memories can even be created and manipulated.
Even more important: as you remember things over time, the stories behind them can become more real to you. Thus, something that begins as a fairy tale can, over time, become a belief. The opposite is also true, that something you believed at one time can fade away. The difference seems to be in repetition and affirmation. Tell a story and assert its truth enough times, and people will believe it.
To summarize:
We are not rational
We seek patterns, and discover them where none exist
Our memories change over time
Given repetition and affirmation, we will learn to believe any story
As with any discovery, this information can be used for great good or great evil, and it seems to me we have primarily seen it used for evil. And, more heartbreaking, things that started out good have been turned to evil. Basically, it's a formula for controlling people, and that's how it's used, especially when you throw in one other factor, described in Radiolab's "Sleep" episode:
We dream to solve problems that threaten us
Things that happen in life that we can't seem to work out, and that seem important to us, cause our dreaming mind to go to work. But the most compelling motivator is fright; if something scares us, it's more likely to remove us from the gene pool so we'd better get on it while we're asleep, figuring out what to do the next time the scary thing shows up (interesting connection to visualization here -- like an athlete visualizing their actions to perform better, the dreamer learns to solve the problem in dream-world first).
So the formula for manipulating people is:
Find something that scares them, or invent something that does
Create a story and repeat it until the scary thing is believed
Create a "solution story" that alleviates the scary thing
Make sure that following the "solution story" performs the desired manipulation
I'll leave it to you to notice how this formula has been applied again and again throughout history to manipulate people into acting against their own self interest. Maybe not just throughout history, but right now. Also note that evil has a big advantage in that it has no hesitation using the manipulation formula.
Using the formula for good is much more constraining. In particular, you must stick to the truth, and that's problematic because we rarely get the truth right the first time (evil is quick to point fingers and say, "See, they're evil, they weren't being truthful"). The scientific method can never prove anything, but any theory must be disprovable to be legitimate, so evil can always claim that the truth is never being told. Alas, that's the difficult nature of the path; you can only attempt to tell the truth, knowing that you will fail, but not give up nonetheless. An intelligent doctor realizes that half of what they know is wrong, and they don't know which half, and yet they cannot give up trying to help.
One trap is the desire to have something be true. See my previous article on business management, a subject which is by and large merely wishful thinking. People want to believe that management is a science, so they create a story around that and try to make it true, and cause a lot of suffering in the process. Although it can appear well-intentioned, there is a foundation of greed which corrupts the outcome. This doesn't mean science can't help discover useful management techniques, but a company is not a machine, and never will be.
The world is made of stories, and you can choose which ones to believe and disseminate. You can proliferate stories that will control people and bring you power, and you will get that power at the same time that it spoils the world you live in. Or you can create stories that raise up the people that engage with them. This will not bring you power, but it will create a better world where you live. That's called enlightened self-interest.
What kind of experiences have you had that have given you "a golf ball to the forehead?"
Corollary drawn from the admission of human irrationality.
When you are irrational you have to become the target of constant feedback, monitoring, auditing, controlling, self-assessment, ... all in the very name of your own self-interest. On a political scale this is implemented as the unity of liberalism and the surveillance state. Everyone now admits that homo economicus is an empty fiction and does not exist. Nevertheless, it remains the ultimate model. If it is absent, it has to be established by a new bureaucracy that speaks in the name of individualism, human values, and free markets. When you say today, "I know on my own what is good for me," in defense of your presumed individuality and rationality, which is finally unprovable, no one will believe you.
It would be interesting to follow this line of thought and confront it with the methodology debates in software engineering we have led over the last decade. In fact, I believe those debates foreshadow ones that we'll experience in other sectors of society as well, such as education.
Have a nice day, and may no golf ball hit your forehead. Even if you are lucky and it doesn't cause a concussion, it won't make you more thoughtful.
Alternative theorem: behaviour is always rational; it's just that the pressures behind any given behaviour are not always visible or acknowledged.
Case: humans burn obscene tons of fossil fuel, corrupting air, land, and water. That can't be rational. But it is if: 1) immediate external (social) costs are not paid by the polluters, and 2) (corollary) future costs are not paid by the polluters. So, if you're in Ohio (or China), it makes perfect sense to burn coal to generate electricity, since the pollution is sent downstream (water) and downwind to others. You, the polluter, get the benefit of cheap electricity and few, or even none, of the costs. This is perfectly rational.
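The point is just arithmetic once the two assumptions are written down. A toy sketch with invented numbers (every figure below is hypothetical, chosen only to show the structure of the argument):

```python
# All numbers are hypothetical; only the structure of the ledger matters.
benefit_cheap_electricity = 100.0  # value captured by the coal burner
private_cost = 60.0                # fuel, plant, labor the burner pays
downstream_cost = 70.0             # pollution exported by water and wind (assumption 1)
future_cost = 80.0                 # costs pushed onto later generations (assumption 2)

# What the polluter sees: both external cost terms drop out entirely.
polluter_ledger = benefit_cheap_electricity - private_cost

# What everyone combined sees: the same benefit against all the costs.
social_ledger = benefit_cheap_electricity - (
    private_cost + downstream_cost + future_cost
)

print(polluter_ledger)  # positive, so burning coal is "perfectly rational"
print(social_ledger)    # negative, so collectively it is a losing trade
```

With these numbers the polluter nets +40 while the world as a whole nets -110; the behaviour is rational or irrational depending entirely on which costs appear in the ledger.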
(Social) Darwinism is essentially a complex tautology: whatever exists today is the result of those bettering others that existed yesterday. Will those that pollute, and breed obsessively, be dominant 50 years hence? Or will the dominants be the few remaining pastoral peoples we've not yet discovered? From the point of view of immediate gratification (another way of describing either form of Darwinism) China is winning. China is also most certainly condemning future generations to extinction. From a Darwinist perspective, this is perfectly OK. (Social) Darwinism admits to no time scale other than immediate action. It's only when looking back through the lens of history that Darwinism can say a more rational course could have been taken, but Darwinism provides no mechanism to discover such alternate courses.
Only with man-made laws, imposed globally, can external costs, both immediate and temporal, affect decisions. Given the nationalist/rightist drift in all Western countries, good luck with that.
> People want to believe that management is a science, so
> they create a story around that and try to make it true,
> and cause a lot of suffering in the process.
I think continually asserting the same thing despite being informed that you misunderstand the basic concepts related to it is proof of irrationality.
I have so many ideas, so many golf balls to the forehead, that I don't bother to write them down. I hope that they dance freely in my mind, so the portrait gets created by itself.
When you become used to this kind of thinking, you don't mind going in circles; the starting point is always new, and you get a different vision every time.
I could give some very simple rules for constructing ideas.
For example: 1) Take a step 2) Take another step
3) Realize you should be able to think in a bigger step.
If you are not able to get to 3), you should concentrate on what kind of step you are able to take.
I think you can find your first step if you are able to find the time to relax, the moment for your first step.
> The most powerful way to remember something is through
> pictures that tell stories.
I don't think so.
If I were to say something interesting to someone, I would construct a model of my idea that has some real influence on that person: he would see himself first running in one direction, and then suddenly in front of a wall. The thing this person would learn is a device to get past the wall.
Hey, it looks like you knew the solution before I posed the problem?
The answer is that to know someone is also to know the walls that person will be surrounded by.
I am reading Peter Drucker's "Drucker essentials". I bought this book first because it was cheap (6 Euros / 860 pages), and also because of a simple idea I thought worth exploring: he says that you should hire an employee for his best strength, not for overall quality.
Reading this book, I can see that any rule or idea requires some hidden information (a lifetime working in business) to appreciate the risk, the potential, the market, and so on.
So I should say that the words in economics are nonsense; they are only meaningful to those that share a lot of experience and the same viewpoint.
Also, there is no way to measure risk. There is no grammar for generating stories from the past to which you could apply your framework.
Choosing some examples to illustrate things is not science in any way.
You can say that there is a lot of information in a story. But that story must be a sound one; hindsight is not a method of selecting stories for the future.
If you are able to convince people that your stories are the sound ones, then you can give a lot of valuable information. That information has economic value, and there can be a lot of science in the development of your ideas, but the core is faith.
A blurb from Douglas Hofstadter's upcoming book "Surfaces and Essences":
"In The Essence of Thought [apparently the title has changed], Hofstadter and Sander show how analogy-making pervades our thought at all levels—indeed, that we make analogies not once a day or once an hour, but many times per second. Thus, analogy is the mechanism that, silently and hidden, chooses our words and phrases for us when we speak, frames how we understand the most banal everyday situation, guides us in unfamiliar situations, and gives rise to great acts of imagination."
I agree with much of what you say, but would suggest that patterns can be defined as descriptive constructs that don't imply any reference to reality. In other words, people do not discover patterns, they create them based on observed, imagined or postulated phenomena.
So instead of "Brains are such subtle pattern detectors that they easily cross into the realm where the patterns don't actually exist.", I would say "Brains create patterns, some of which are unfounded."
In this light, the scientific method can be seen as a testing framework for patterns. It defines a priori criteria for "good" and "bad" patterns and compares them to reality as best it can.
It also removes the burden of rationality from the pattern or its creator and places it squarely on the method. In effect, the method defines rationality.
Finally, the "brainwashing" techniques you're describing can just as easily describe regular educational techniques, the kind you use to teach your kids the difference between right and wrong, how to appreciate beauty, how to learn "correctly", and most importantly, how to identify (which is to say, create) patterns. It's not all nefarious.
> I agree with much of what you say, but would suggest that
> patterns can be defined as descriptive constructs that
> don't imply any reference to reality. In other words,
> people do not discover patterns, they create them based
> on observed, imagined or postulated phenomena.
>
> So instead of "Brains are such subtle pattern detectors
> that they easily cross into the realm where the patterns
> don't actually exist.", I would say "Brains create
> patterns, some of which are unfounded."
>
> In this light, the scientific method can be seen as a
> testing framework for patterns. It defines a priori
> criteria for "good" and "bad" patterns and compares them
> to reality as best it can.
Well, but it can't; or it can, but only under the assumption that there is a neutral background, called "nature", which doesn't change due to our observations, assumptions, and attitudes.
Once we position ourselves inside the scenario, even the best methodological rationality becomes delusional. We willingly bind ourselves to this sort of rationality as to an oracle, and this leads to a fatal shift where the model becomes invalid and/or loses predictive power because it has an essential singularity. This leads to the paradox that we can trust a model only by not trusting it too much. People have rightly observed that the global financial crisis had no subject, no black knight, and the trust in a modeling rationality that has no subject is part of the reason. The model has to exclude the singularity, because it can't treat it while making definite assertions about what is happening. Therefore people believe the singularity is absent and underestimate risks, and precisely this caused havoc. Irrationality gets fed with rationality, and rationality attempts to smooth irrationality using statistics, but finally can't avoid the excess.
Of course, one can also take Bruce's logic and turn quants, like managers, into story-telling bozos, some sort of marketing fluff and tricksters like Bernie Madoff. I would understand this. The ordinary man wants revenge.
> > In this light, the scientific method can be seen as a
> > testing framework for patterns. It defines a priori
> > criteria for "good" and "bad" patterns and compares
> > them to reality as best it can.
>
> Well, but it can't; or it can, but only under the assumption
> that there is a neutral background, called "nature", which
> doesn't change due to our observations, assumptions, and
> attitudes.
I disagree. Within the context of this pragmatic discussion, I'm assuming there's something called "reality" which needs no definition or explication. There's nothing neutral about it.
In a more general (neutral?) sense, a pattern is compared to something. To carry out a comparison, you have to have some conception of the things you're comparing. That something may be arbitrary, as in a thought experiment or fantasy novel, or it may be what you called "nature".
In any case, the comparison is performed by someone with the a priori capacity to perform the comparison. The scientific method is one way in which that someone can test patterns and construct models.
> This leads to the paradox that we can trust a model
> only by not trusting it too much.
A model is an approximation, by definition. Trusting it too much is an error, not a paradox.
> A model is an approximation, by definition.
> Trusting it too much is an error, not a paradox.
The error is constitutive, though, and it happens by means of an ironic twist in which reality undertakes to approximate the model, whereas the model promises to do it the other way round. There might still be science in the end, and the scientific method, but it doesn't work as expected.