Tuesday, November 04, 2008

Entropy: Entropy Reduction In Depth, Part 1

My series of posts on Entropy in Human Achievement is growing, so I will refer you to this category link for previous posts.

I recently discussed Productive Entropy and how to add to it. We are truly blessed in this digital age to have so many sources of information available to us, from the familiar books, TV, and radio to podcasts, blogs, Google searches, and more. Properly deployed, the huge amount of input we receive can be streamlined: the Productive Entropy of all these inputs can be further turned into the Negentropy of organized behavior. What tools do we have to reduce entropy, to magically transform all of our inputs, ideas, new knowledge, speculation, and advice into “the capacity for useful work”? I recently listed 50 techniques. Here is an in-depth discussion of a few.

Debrief/Review
A key tenet of modern military practice is the debrief, or after-action review. The faster our “turnaround cycle” for learning from previous experience, the faster we can make progress. By scrutinizing, categorizing, and contemplating recent experiences from the outside world, we can elucidate general principles, practices we want to implement next time, and new pathways for productive activity. The next time we go out into battle, we will be taking a little less entropy with us. A great book that includes the debrief process is James Murphy’s Flawless Execution: Use the Techniques and Systems of America's Fighter Pilots to Perform at Your Peak and Win the Battles of the Business World.

Metaphor
“An atom is like a little solar system.” Person “A” is “like” person “B”. Centrifugal force is “like” a ball twirling on a string. A biological cell is “like” a microprocessor. “As you stretch, visualize yourself as a marionette bending on a string.” Using metaphor can temporarily transform the raw data of our senses into a form which allows us to implement the data productively. Quantum physics suggests to us that the physical world does not, in a sense, even exist until we measure it. The equations of physics are replete with representations of “forces”, “fields”, “spin”, “charm”, and other designations, about which scientists actually know very little, except that they make sense mathematically. Yet the “standard model” of quantum physics is the most accurate model of any physical process yet discovered. And it’s all metaphor. What metaphor does is highlight selected and essential characteristics of an object or process, temporarily focusing on the similarity to other objects or processes. By eliminating other (dissimilar) aspects, the mind is able to concentrate and operate on the object in an “as if” state. For instance, a “heart patient” is similar to other “heart patients” even though the patient may be totally different from other heart patients in a variety of respects. But, for the purposes of “right now”, we are not treating the patient’s race, religion, occupation, or reading habits. We are focusing on the “heart patient” metaphor.

Reduce “Local” Decision-making
What if you decided one day to abandon all habits and made new decisions for each action you took? When to get up, when to brush your teeth, how much toothpaste to squeeze onto the toothbrush, whether you would go to work today, which route you would take, and so on? In all likelihood, your life would come to a standstill, overwhelmed by entropy. We make sense out of our lives by reducing that entropy through categorical decision-making. This does not mean we cannot take plenty of time and thought to make decisions, but that, once we do, we should use the power of those decisions to reduce the level of “disorder” (entropy) in our lives. Making frequent decisions carries a “cost” in time and also in probability. For instance, if you know the donut place runs out of sprinkled donuts by 9am, you know your chances are better if you get there before 9am. Sure, you can take a chance and sleep late, and, from time to time, they will still have the sprinkled donuts at 9:10. But you have re-sequenced your whole routine, spent time thinking it over, perhaps gotten a little uneasy “feeling” about the outcome of the new decision, and also reduced the probability of the sprinkled donut being there when you arrive. Remember, you don’t have a probability series for getting there at 9:05, or another one for 9:10, and so on. You actually don’t know the probability of anything but 9am. So you have sacrificed the probability, changed your routine, and spent time on a decision when you could have been putting on your shoes and heading out the door already.

This kind of “local” decision-making adds up. If you continually re-schedule meetings, skip classes, change your exercise routine, buy items on a whim, or buy stocks on “hot tips”, it will continue to cost energy, time, money, and efficiency. A classic example is compound interest. All those people looking for a quick way to wealth, who change their ideas, careers, and investments rapidly, and without a “categorical” decision on what they are going to do long-term, are at a disadvantage compared to the slow, steady investor who consistently makes a good return and takes fewer losses. Compound interest requires a removal of “entropy”. When this entropy is removed, and the investor sticks to their categorical decision over a period of decades, the slow, steady, mathematical process of wealth accumulation proceeds like a juggernaut. I am indebted to the ideas of Thomas Sowell’s Knowledge and Decisions, which goes into these concepts in great depth, and at the highest possible level.
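To make the compound-interest arithmetic concrete, here is a minimal sketch in Python. All the numbers (the 7% return, the 10% loss in "hot tip" years, the $100 switching cost) are my own illustrative assumptions, not anything from Sowell; the point is only to show how "local" re-decisions eat into the base that future returns compound on.

```python
# Illustrative sketch: steady compounding vs. frequent "local" re-decisions.
# All rates, costs, and horizons below are assumptions for illustration only.

def steady(principal, annual_return, years):
    """Compound a fixed return every year -- the 'categorical' decision."""
    for _ in range(years):
        principal *= 1 + annual_return
    return principal

def restless(principal, annual_return, years, switch_cost, loss_years):
    """Same average return, but churning strategies adds costs and occasional losses."""
    for year in range(years):
        if year in loss_years:        # a bad "hot tip" year (assumed 10% loss)
            principal *= 0.90
        else:
            principal *= 1 + annual_return
        principal -= switch_cost      # assumed fixed cost of each re-decision
    return principal

if __name__ == "__main__":
    start = 10_000
    print(round(steady(start, 0.07, 30)))                      # ~76,123
    print(round(restless(start, 0.07, 30, 100, {5, 12, 20})))  # noticeably less
```

The exact figures do not matter; what matters is that every deviation from the categorical decision shrinks the base on which the next thirty years of returns compound.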

Abandon Processes if Measurements Dictate
As the old saying goes, “if you find yourself in a hole, stop digging”. The cessation of measurably bad processes is a positive act: the reduction of entropy. There are plenty of creative ways to try new processes, new people, new careers, new approaches. The beauty of “Productive Entropy” is that the current digital age provides a nearly unlimited (and nearly free) supply of possible solutions to any problem. We want to remove the entropy around productive processes, to allow them to flourish; we want to discard processes (no matter how ordered or determined) that, by measurement, show a lack of productivity. Discarding processes that measure poorly is discarding entropy.

I will consider more entropy reduction processes in depth in part 2.


