Saturday, January 17, 2015

Nature of Reality - Another Imperfect Option

Assuming my free will is really free, and not some farcical illusion of a pre-ordained mind forced to believe thusly, I still have a few complaints.  What kind of sad joke is it to live in a universe where free will enables the broad manipulation of future reality, yet we are not only limited to a few relatively crude sensory organs for comprehending current reality, but also have to make guiding decisions using a seriously flawed computing engine: one equipped with logically inconsistent reasoning routines, a habit of substituting "beliefs" where "facts" should apply, essentially no effective engine for statistics, a highly limited short-term memory (about 7 items, really???!?!), an error-prone and malleable long-term memory (without change tracking), and a hormonally-attuned "emotional override" circuit that is likely many millennia out of date.
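If you were to caricature that flawed engine in code, it might look something like the little sketch below. Everything here is tongue-in-cheek and invented for illustration (the class name, the overwrite-without-history behavior, the belief-beats-fact rule); the only number borrowed from the paragraph above is the famous ~7-item working-memory limit.

```python
from collections import deque

SHORT_TERM_CAPACITY = 7  # the much-complained-about ~7-item limit


class FlawedMind:
    """A playful caricature of the 'seriously flawed computing engine' above."""

    def __init__(self):
        # Bounded buffer: pushing an 8th item silently evicts the oldest one.
        self.short_term = deque(maxlen=SHORT_TERM_CAPACITY)
        # Error-prone and malleable, with no change tracking.
        self.long_term = {}

    def notice(self, item):
        """Add something to short-term memory, quietly losing older items."""
        self.short_term.append(item)

    def remember(self, key, value):
        """Commit to long-term memory; overwrites the old value with no history."""
        self.long_term[key] = value

    def recall(self, key, belief=None):
        """Faulty retrieval: a strongly held belief wins over the stored fact."""
        fact = self.long_term.get(key)
        return belief if belief is not None else fact


if __name__ == "__main__":
    mind = FlawedMind()
    for n in range(10):               # try to hold 10 things in mind...
        mind.notice(f"item {n}")
    print(list(mind.short_term))      # ...only the last 7 survive
    mind.remember("meeting time", "3 pm")
    print(mind.recall("meeting time", belief="4 pm"))  # belief beats fact
```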

I understand the nature of engineering trade-offs, and the self-learning, auto-correlation, and memory-pruning tools are pretty impressive, so maybe some other areas had to give, but I think a little more attention to the conscious decision components would be helpful.  It would be SO much easier to make decisions if a few more criteria could be weighted in ad-hoc mode, and it would help ever so much if intentionally committing info to long-term memory could be more efficient (even if you had to intentionally purge some other memories to make room).  A better interface to that impressive differential-math coprocessor (the one that lets NBA players shoot 3-pointers on the move, yet makes it hard for most students to master calculus) would be nice.
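If that "weight a few more criteria in ad-hoc mode" interface actually existed, it might reduce to something like this toy scoring function. The option names, criteria, and weights are all made up purely to show the idea of a weighted ad-hoc decision.

```python
def decide(options, criteria):
    """Pick the option with the highest weighted sum of ad-hoc criteria.

    `criteria` maps a criterion name to a (weight, scoring_function) pair;
    the names and weights here are invented for illustration only.
    """
    def score(option):
        return sum(weight * fn(option) for weight, fn in criteria.values())
    return max(options, key=score)


jobs = [
    {"name": "startup", "salary": 90, "commute_min": 45, "interesting": 9},
    {"name": "bigco",   "salary": 120, "commute_min": 20, "interesting": 5},
]

choice = decide(jobs, {
    "salary":   (0.4, lambda j: j["salary"] / 120),        # normalized pay
    "commute":  (0.2, lambda j: 1 - j["commute_min"] / 60), # shorter is better
    "interest": (0.4, lambda j: j["interesting"] / 10),     # gut appeal
})
print(choice["name"])
```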

But it's really the high-level algorithm engine, with the logical fallacies it so readily allows, that needs work.  Why have a machine that can self-analyze to find faults, yet pre-wire sections that are difficult or impossible to modify (let alone give the desire to actually make changes such weak weighting)?  Maybe the root flaw is self-certainty -- people are just a little too sure of their obviously error-prone decisions to drive change.  Or maybe that derives from the questionable "respect" circuit, where we more heavily weight the opinions of people who seem certain over those who seem uncertain, even in non-crisis situations.  And that crisis circuit needs work too -- what's this deal about dumping adrenaline into routine business situations, thereby invoking archaic hormonal overrides where cold logic would better apply (and we won't even get into the unnecessary stress effects that causes)?
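That "respect" circuit, rendered as code, might be no more than the toy below: a credibility weight that leans far too hard on how certain the speaker sounds. The specific weights are pure invention for the sake of the joke, not a claim about actual cognition.

```python
def weigh_opinion(opinion_value, speaker_confidence, speaker_track_record):
    """Toy 'respect circuit': confidence counts for more than it should.

    A saner version would lean on track record; the flawed wiring described
    above multiplies mostly by how certain the speaker *sounds*.
    Both weights (0.8 and 0.2) are invented for illustration.
    """
    flawed_weight = 0.8 * speaker_confidence + 0.2 * speaker_track_record
    return opinion_value * flawed_weight


# Two advisors, same track record, same advice: the confident one wins.
print(weigh_opinion(1.0, speaker_confidence=0.95, speaker_track_record=0.5))
print(weigh_opinion(1.0, speaker_confidence=0.40, speaker_track_record=0.5))
```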

Let's see if we can't make some wiring and programming changes, shall we?  And fix at least the more egregious processor limitations?
