Tonight I went home by train. I usually try to get into a car that is as empty as possible, because that way I can sleep or, even better, read the newspapers that passengers from the previous train have left behind. Yeah, I'm cheap. Or you could say I'm "green"? Or a "new economy" practitioner?... Nah, I'm just cheap.
Interestingly, when I was checking for newspapers, I saw a stack of papers held together by a clip, left there intentionally or not, I don't really know.
It was a financial paper called "Irrationality always happens", by Structural Logic, which, I found out later, is an independent research company that deals in corn equities and similar evaluations.
You could say it was divine providence (insert your favorite anthropomorphous supra-human law)! Obviously, a little bit of thinking made me realize that I was in front of a "Friday the 13th" phenomenon: because you are paying more attention to a particular factor, you notice it more than you would if you weren't paying attention. I will probably talk about this in the future, as it is one of the main errors in reasoning, and the subtler versions of it are pretty sneaky.
But what I wanted to talk about is the train of thought suggested by the title of that article. It was not an article about thinking, but about trading corn and other commodities, and all the related factors, like how to predict how the markets are going to react to certain characteristics of the environment. One of their points was that even though people react in irrational ways to changes, you can still use this characteristic to predict the market, because the irrationality always happens.
I have to admit that as soon as I saw their point I thought it was a brilliant idea, so I proceeded to evaluate why I had that reaction... mmm... probably using "thought" for my first impression was not the most precise choice, as it was more of a first-pass impression... I think there is another topic there for another day. So pardon me the imprecision this time, and let's continue (and, importantly, this lack of precision is actually relevant to the rest of the thought pathway!).
So is this real? Does irrationality always happen? If we expand this concept from economics to other thought domains, I would say yes, but with some caveats. My main reasoning basis was statistical. If you have a certain thought pathway, that is, a series of connected thought processes where each one depends on the previous ones either serially or in parallel, there is a non-zero probability at every step that you will make a mistake. If it's a logical thought pathway, that is, a group of thought processes based only on logical statements (logical as in the philosophical/reasoning methodology, not as in "oh yeah, that makes sense"), you have a probability at every step of misapplying the rules of logic, or, even more commonly, of expanding your thought pathway outside the realm of logic into another realm of thought, like subjective value judgments. You also have a non-zero probability of passing from one sub-part of logic to another that has different rules without noticing it, either because it's a very subtle detail or simply because you don't know all the precepts of that sub-part of logical analysis.
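Just to make the statistical point concrete: even a tiny per-step error rate compounds quickly over a long pathway. A minimal sketch (the rate 0.02 is a made-up number, and the steps are assumed to fail independently, which real thinking surely doesn't):

```python
# Probability of at least one mistake in a thought pathway of n steps,
# assuming each step independently fails with probability p.
def pathway_error_probability(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

# Even a 2% per-step error rate compounds fast:
for n in (1, 10, 50):
    print(n, round(pathway_error_probability(0.02, n), 3))
```

This is also why, as discussed further on, short thought pathways tend to fare better than long ones.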
Even more prone to this is reasoning based on real-world data, that is, using as the input for your reasoning not logical rules but information you got from a source. Here the sources of error multiply, ranging from an error at the source itself, to a communication mistake between the source and you, to a misunderstanding in the definition of the terms (my favorite kind of error, and the type I see happening most around me). And this is in addition to the mistakes you can make just in applying logical principles to the analysis of your data.
Another factor I have to consider in my thought pathways is that a lot of times I have an "intuitive" solution in mind when I start, "intuitive" meaning that I reached a conclusion from the premises of the problem without going through any steps. This sadly introduces biases, because I don't like to be wrong, even though I have learned that it happens quite frequently. This bias can affect my thought processes to the point that the pathway gets deviated towards my "intuitive" solution even when it's not the right one, because "obviously" I have to be right, so elaborate reasoning should reach the same result as my "intuitive" thinking did. The fact that this sometimes turns out to be the adequate pathway doesn't mean it should work that way. But it is hard to avoid... hopefully your egos are more rational than mine!
Is this impossible to avoid? An imprecise simile that could apply here is entropy. You can't really decrease the total entropy of a system, but by increasing it in one sub-part, you can significantly decrease the entropy of another part. We should be thankful for that, because we wouldn't be here discussing this topic if this feature of reality didn't exist. In the same way, I suppose you can decrease the irrationality of a certain group of thought processes, so that a specific thought pathway reaches an adequate ending. This probably explains why short thought pathways tend to be less susceptible to mistakes.

The use of a "repair" thought process running parallel to your main pathway is always good, too, just like a good editor is good for a writer, because he/she will see errors that the author can't, being too involved in the process. Obviously, the repair process introduces another source of errors, but after testing it multiple times you can at least evaluate how much this "repair" function costs you in thought precision. It's similar to the way DNA replication works: DNA polymerases proofread the strand for gaps or mismatched nucleotides, with a certain rate of error of their own, but low enough that the cell is able to replicate, and controlled enough that if the error rate rises above what's acceptable, the cell undergoes apoptosis (cell suicide).
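The proofreading analogy can be put in the same back-of-the-envelope terms: a repair pass is worth it only while its own error rate stays below the errors it removes. A toy sketch, with all the rates made up for illustration:

```python
# Sketch of a "repair" pass: the main process errs at rate p; the repair
# catches a fraction c of those errors but introduces its own errors at
# rate q. (All numbers here are illustrative, not measured.)
def effective_error_rate(p: float, c: float, q: float) -> float:
    return p * (1 - c) + q

# A 90%-effective repair with a small error rate of its own helps a lot...
print(effective_error_rate(0.02, 0.9, 0.001) < 0.02)
# ...but a sloppy repair (q too high) makes things worse than no repair.
print(effective_error_rate(0.02, 0.9, 0.05) > 0.02)
```

The cell's version of "testing it multiple times" is exactly the apoptosis threshold: if the residual error rate is too high, the whole pathway gets discarded.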
But the errors will always persist, mainly in reality-based thinking, because of definition problems. Think about this post itself: I have discussed a lot of things about irrationality, but I have not even defined the concept (I told you this was my favorite type of error!), so I am sure that if you review my reasoning with a different definition of irrationality you could reach different conclusions, or just plainly disagree with what I wrote... Even I, on rereading my reasoning, realized that my internal definition of irrationality was not well defined, and that I was actually using two different concepts of irrationality, not that far from each other, but enough to take some precision away from my reasoning... well, another day, another learned concept. Most of my propositions are still conceptually sound anyway, and the internal discussion this topic raises made me realize that I have to take active measures to limit the irrationality in my thought processes, either by tightening my definitions, or by restarting the thought pathway after obtaining a conclusion so I can check whether I am biasing my processes through preconceptions or just misapplying logical concepts.
Well, time to think another thought [and use error checking mechanisms too ;-) ]