
Logic and Problem Solving

We say that a conclusion is "logical" when it follows from what we already believe. If we believe that all people are created equal, then it follows logically that there should not be kings or slaves. If we believe that light travels faster than sound, then it follows logically that we should see lightning before we hear thunder. If we believe that the only way women can get pregnant is by having sexual intercourse, then it follows logically that a pregnant woman is not a virgin. In everyday life, logical conclusions simply make explicit what is already implicit in one's beliefs.

Notice that logical conclusions may or may not be true. In the preceding examples, one could argue that humans are really like bees with queens and workers, we probably accept the scientific evidence about the speed of light and sound, but a virgin woman could become pregnant through artificial insemination (and perhaps through divine intervention). Hence, our beliefs are really premises, statements from which we draw conclusions. A conclusion that follows logically from the premises is VALID, but it is TRUE only if the premises are true. Conversely, faulty logic could lead to a true statement.

A good way to illustrate these ideas is with the hypothetical syllogism. A "syllogism" is composed of three statements: a major premise, a minor premise, and a conclusion. In the "hypothetical" syllogism, the major premise is an "If" hypothesis. For example:
Major premise: If you study hard, then you will get an "A".
Minor premise: You have studied hard.
Conclusion: You will get an "A".
There are two parts to the major premise in a hypothetical syllogism. The "if" clause is called the antecedent and the "then" clause is the consequent. In the preceding example, the minor premise affirmed the antecedent (said it is true). Whenever the minor premise affirms the antecedent in a hypothetical syllogism, the conclusion is valid, but it is true only if the major premise is also true. Because we know that studying hard does not guarantee getting an "A", we may arrive at a false conclusion even though the reasoning is sound.

If the minor premise denies the antecedent (says it is false), then one cannot logically draw any conclusion. Continuing the above example:

Major premise: If you study hard, then you will get an "A".
Minor premise: You have NOT studied hard.
Conclusion: None

As you know, there are ways to get an "A" other than by studying hard, including improper ways such as cheating. People frequently make the mistake of thinking that "if" means "if and only if," and do draw a conclusion from denying the antecedent. Logicians often use the Latin expression non sequitur for a conclusion that does not follow from the premises. For example:

Major premise: If she's at home, she's not out with someone else.
Minor premise: She's not at home.
Conclusion: She's out with someone else. (NOT VALID)

Hence, affirming the antecedent affirms the consequent, but denying the antecedent does not deny the consequent. There is another minor premise that does lead to a logical conclusion, namely denying the consequent. Return to the first example:

Major premise: If you study hard, then you will get an "A".
Minor premise: You did not get an "A".
Conclusion: You did not study hard.

Again, the logic is valid even though the conclusion may be false because the major premise is false. Although denying the consequent implies denial of the antecedent, affirming the consequent does not lead to any logical conclusion:

Major premise: If you study hard, then you will get an "A".
Minor premise: You got an "A".
Conclusion: None

This is an even more common source of faulty reasoning:

Major premise: If he's out with someone else, he won't be home.
Minor premise: He's not at home.
Conclusion: He's out with someone else. (NOT VALID)

It is well worth your time to make up a number of hypothetical syllogisms and practice the four possible minor premises. Use the following table to structure your examples:

Major Premise: If antecedent >>> then consequent

                      Antecedent      Consequent
    Minor Premise:
        Affirm        Valid           Not Valid
        Deny          Not Valid       Valid

In making up syllogisms, try to think of other ways to state the major premise. For example, "You can't pass calculus unless you can pass algebra" is another way to say, "If you can't pass algebra, then you can't pass calculus."

Let me summarize this section by distinguishing between the two words "infer" and "imply." You, the thinker, draw inferences; you infer conclusions from the premises. For their part, the premises may or may not imply a conclusion. You may fail to infer conclusions that are actually implied by the premises, or more commonly, you may infer conclusions that are not implied by the premises. Reasoning is valid only when you infer conclusions that are indeed implied by the premises. Whether or not a valid conclusion is true depends on whether or not the premises are true.
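The four minor-premise cases in the table can be checked mechanically with a truth table: a form is valid when the conclusion holds in every case where all the premises hold. Here is a minimal sketch in Python (the helper names `implies` and `valid` are my own, not from the text):

```python
from itertools import product

def implies(p, q):
    # The major premise "if p then q" is false only when p is true and q is false
    return (not p) or q

def valid(premises, conclusion):
    # Valid means: in every world where all premises hold, the conclusion holds
    return all(conclusion(p, q)
               for p, q in product((False, True), repeat=2)
               if all(prem(p, q) for prem in premises))

major = implies  # "If you study hard (p), then you will get an A (q)"

forms = {
    "affirm the antecedent (modus ponens)": ([major, lambda p, q: p], lambda p, q: q),
    "deny the antecedent": ([major, lambda p, q: not p], lambda p, q: not q),
    "affirm the consequent": ([major, lambda p, q: q], lambda p, q: p),
    "deny the consequent (modus tollens)": ([major, lambda p, q: not q], lambda p, q: not p),
}
for name, (premises, conclusion) in forms.items():
    print(f"{name}: {'valid' if valid(premises, conclusion) else 'not valid'}")
```

Running it reproduces the table: affirming the antecedent and denying the consequent come out valid; the other two do not.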

On Probability

The probability of an outcome is determined by the ratio:

    probability = (number of positive outcomes) / (number of possible outcomes)

For example, the probability of drawing an ace from a deck of cards is 4/52 = 1/13 = .077. The probability of drawing a spade is 13/52 = 1/4 = .25. The probability of drawing a black card is 26/52 = .50. These figures are objective probabilities determined from the actual numbers of positive and possible outcomes.
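The card examples above can be computed directly from the ratio; a small sketch using exact fractions (the `probability` helper is mine, just wrapping the ratio):

```python
from fractions import Fraction

def probability(positive, possible):
    # Objective probability: positive outcomes over possible outcomes
    return Fraction(positive, possible)

deck = 52
p_ace = probability(4, deck)      # 4/52 = 1/13, about .077
p_spade = probability(13, deck)   # 13/52 = 1/4 = .25
p_black = probability(26, deck)   # 26/52 = 1/2 = .50
print(p_ace, round(float(p_ace), 3))
```

Using `Fraction` keeps the ratios exact, so 4/52 really reduces to 1/13 rather than a rounded decimal.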

In like fashion, there is a fifty-fifty chance that a coin will turn up tails. It is interesting that most people call heads when a coin is tossed. If that call is correct, about half of them will call heads again for the next toss. If that toss is also heads, most people will now call tails for the third toss. During this time, the coin did not change; the chance of a tails was fifty-fifty on every toss. But most people have acquired guessing habits that are, at best, wrong, and at worst, very expensive.

Actually, there is nothing "wrong" with having a preference for calling heads. Indeed, a very good strategy is always to call heads (or tails) because you can't do better than fifty-fifty anyway. What is wrong is changing your call depending on what happened on the last toss or series of tosses. The true odds don't change, and the belief that they do is known as the "gambler's fallacy."

Suppose you had a jar containing ten marbles, five white and five black. You shake the jar and draw out a marble; the odds are 5 out of 10 (50%) of drawing a black marble. If you draw a black marble and then shake-and-draw again without putting the first marble back, the odds have now changed to 4 out of 9 (about 44%) of drawing another black marble. If you draw a second black marble, and shake-and-draw a third time without putting either marble back in the jar, the odds have changed to 3 out of 8 (about 37%). If this continued until you happened to draw out all 5 of the black marbles, the next draw would have to be white.

Suppose instead that you put the marble back after each draw. Obviously, the odds wouldn't change even if you happened to draw five black marbles in a row. On every draw there would be five black and five white marbles in the jar. This is what it's like when you are tossing coins, rolling dice, or spinning wheels. It is very hard to resist the feeling that, after several blacks in a row, a white is "due," but it is an important lesson to learn about probability.
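The shifting odds in the marble example can be computed exactly, which makes the contrast between drawing with and without replacement concrete. A minimal sketch:

```python
from fractions import Fraction

black, white = 5, 5

# Without replacement: each black marble kept out of the jar shifts the odds
odds = []
remaining_black, total = black, black + white
for _ in range(3):
    odds.append(Fraction(remaining_black, total))
    remaining_black -= 1   # a drawn black marble stays out of the jar
    total -= 1
print([f"{float(o):.3f}" for o in odds])  # 0.500, then 0.444, then 0.375

# With replacement: the jar is restored, so every draw is 5 out of 10
odds_replaced = [Fraction(black, black + white) for _ in range(3)]
```

The second list is what coin tosses, dice, and wheels are like: the odds never move, no matter the streak.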

The general point is this: People base decisions on their subjective probability of the outcomes, which is often different from the objective probability. Most of us tend to overestimate the likelihood of low-probability events (people bet on long-shots at the race track when there is almost no chance of their winning); conversely, we tend to underestimate the likelihood of high-probability events (people drink and then drive thinking nothing will happen to them).

On Problem-Solving

What makes a problem a problem? In general, a problem exists when you start in some condition or position, you want to move to some other condition or position, and the solution is not immediately apparent. It would not be a problem, for example, to weigh out any number of grams of sugar (up to 30) using a simple balance scale if you have thirty weights weighing 1, 2, 3, etc. grams each. But suppose I confine you to using only five weights. You now have a problem because you must decide which five weights will enable you to measure any number from 1 to 30.

It is obvious that you will have to combine weights in some way, and you quickly see that you don't need a 3-weight because you can get 3 by combining 1+2. If you then have a 4-weight, you can get weights up to 7 grams (4, 4+1, 4+2, 4+2+1), so next you need an 8-weight. If you are familiar with number sequences, you will probably now figure out that each required weight is double the preceding one, so that your five chosen weights are 1, 2, 4, 8, and 16 grams. If you were carrying these around as a salesperson, that would be quite a saving in the number of weights to carry.

Can you do it using only four weights? To solve that problem, you will have to break your cognitive set. A cognitive set is simply a way of thinking about a problem that may hinder seeing all of the possible alternatives. In this case, your cognitive set is likely to be that you always put the weights on one side of the scale and the sugar on the other side. But there is no rule against putting weights on both sides of the scale, so that you can use subtraction of weights as well as addition. For example, you don't need a 2-weight if you have a 1 and a 3, because you can put the 1-weight on the sugar side and balance that side against the 3-weight. Cognitive set is one common reason that people may fail to solve problems.
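Both versions of the weight puzzle can be checked by brute force. The sketch below enumerates every way of placing weights; note that it reveals one four-weight answer (the powers of three), which the subtraction hint above points toward, so treat it as a spoiler:

```python
from itertools import product

# On a pan balance, each weight can sit with the sugar (-1), stay off the
# scale (0), or sit opposite the sugar (+1): subtraction as well as addition.
four_weights = [1, 3, 9, 27]  # powers of three: the four-weight answer
reachable = {sum(sign * w for sign, w in zip(signs, four_weights))
             for signs in product((-1, 0, 1), repeat=len(four_weights))}
print(all(n in reachable for n in range(1, 31)))  # four weights suffice

# With weights on one side only, you need the five powers of two from the text
five_weights = [1, 2, 4, 8, 16]
reachable_one_side = {sum(sign * w for sign, w in zip(signs, five_weights))
                      for signs in product((0, 1), repeat=len(five_weights))}
print(all(n in reachable_one_side for n in range(1, 31)))
```

For example, 2 grams comes out as 3 - 1: the 1-weight goes on the sugar's pan and balances against the 3-weight, exactly as described above.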
A parent whose child misbehaves may have a set to punish misbehavior and never even think of the alternative of rewarding good behavior. A person who takes pills to treat an ailment may never try out alternatives such as exercise or better eating habits. Students having difficulty in school may think that they have to spend many more hours studying rather than searching for better learning methods and strategies. Sometimes there is only one way to try to cope with a problem, but usually there are alternatives that are obvious once a cognitive set is broken.

Research on problem-solving has not been very instructive about how to solve problems. In general, we draw on past experiences with similar problems to generate possible solutions and then try them out in the present situation. When one hypothesis fails, we search for another alternative, and so on until we "hit upon" the solution. When that happens, we often have the familiar "Aha" experience. Most of what I have to say concerns natural tendencies that frequently block us from hitting upon the solution. Avoiding these tendencies does not guarantee success, but it does serve to minimize failure.

Impulsive Action. Just being in a problem situation arouses a degree of tension or uneasiness that encourages impulsive action. We are so eager to solve the problem that we start testing hypotheses before we have really analyzed the problem. The adage "look before you leap" is good advice when attempting to solve a problem. There may be information that is not immediately obvious but that can be derived or determined and that might lead one toward the solution.

Here is a well-known problem. Think of the dots and dashes in the figure as sticks. You are to move two sticks, and replace them in the picture so as to end up with four squares instead of the original five. No tricks.

[Figure: sixteen sticks arranged to form five squares]

Most people just start right off moving sticks around, which is what is meant by impulsive action.
Knowledgeable problem-solvers are able to inhibit that impulse and analyze the problem before making any moves. They first review the instructions to be sure that there are no ambiguous words. They may ask questions such as, "Do all of the squares have to be the same size?" In this problem, there are no ambiguous words, and the resulting four squares are all the same size.

One heuristic (problem-solving strategy) that is useful in many types of problems is to count the number of objects. Frequently that number will lead you away from bad alternatives and toward good ones. In this case, there are 16 sticks. Knowing that there are 4 sides to a square, we know that we could make 4 completely separate squares. We therefore know that we cannot let any one stick be a part of two squares. Now look back at the picture and count the sticks that are parts of two squares. All 4 of the inside sticks count. Because we can only move two sticks, each one must undo two of the four problem sticks. This may take a bit of searching, but you now know precisely what you are looking for; as a result, it is much more likely that you will remove the sticks as shown on the left and replace them as shown on the right.

[Figure: left, the two sticks to remove; right, the resulting four separate squares]

The advice to avoid impulsive action, and to analyze the problem by looking for ambiguities and counting the objects, has generality. Another familiar puzzle is this: Three people each give a minister $10 to go to the market and buy them a turkey. When he got there, he found that a turkey only cost $25. Because he couldn't divide the $5 change into 3 parts, he gave $2 to charity and returned $1 to each of the three people. This means that each person actually paid $9, for a total of $27, which added to the $2 for charity makes $29. What became of the missing dollar?

I have known people to wrestle with that problem for hours over many days without solving it because they impulsively begin searching for the "missing dollar." There must be some ambiguity somewhere. If you stop to count, you ask yourself what happened to the original $30. You will see that $25 went to the market, $2 went to charity, and $3 was returned. The $2 that went to charity is thus seen to be a part of the $27 that the people paid, not something to be added to it. You can subtract the $2 to get the $25 that went for the turkey, or you can add the $3 that was returned to get back to the $30.

Working backwards from the goal. We have defined a problem as a situation where you are in one position and want to move to another. The former is the start position, the latter is the goal position, and the problem is to find a way to get there. Very frequently, there are many fewer alternatives to be explored working backward from the goal toward the start than in working forward toward the goal. Hence a useful heuristic is to reverse the order of solution.

On the top half of the adjacent page is a maze consisting of a number of T-shaped choice points. Trace your path through the maze using some rule. For example, you might alternate turning right, then left, then right again, and so on, retracing your steps when you get to a dead end. Repeat this top maze using some other strategy, such as turning right (or left) at every choice point. What you will see is that you make many errors going from the start to the goal. This same maze is repeated on the bottom half of the page. This time, work backward from the goal with a single rule: turn at every opportunity. You will chart the correct path on the first try. It is not always that easy to work backward from the goal in everyday types of problems, but it is frequently a good heuristic to try it in reverse order.
Find a position where you can say, "If I were at that place, I could get to the goal; how can I get there?" If you can get there from some other place to which you can get from the start, you have found one solution to the problem. Here is an example: You enter your sophomore year knowing that you have to take a very difficult-for-you course X sometime in order to graduate. Should you take it now or later? Suppose your goal is to get into a professional school, and to do so, you have to take course Z when you're a senior. A prerequisite for Z is course Y as a junior, and X is a prerequisite for Y. There is no longer any problem; you have to take X now so you can take Y and then Z to reach your goal.

Wrong question. The problem about the missing dollar illustrated asking the wrong question. Another example is this: "How can one build a house with an all-southern exposure?" You are not likely to get to the solution so long as you put the question that way. But if you simply change the question to, "WHERE can one build a house with an all-southern exposure?" the solution becomes obvious: at the North Pole. Hence, another heuristic is to re-examine the question if you are having difficulty answering it. Perhaps it is the wrong question, sometimes one that can never be answered directly.

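Working backward from the goal, as in the course X/Y/Z example above, amounts to chasing prerequisites from Z back to a course you can take now. A minimal sketch; the `prereqs` mapping is a hypothetical encoding of the three courses named in the text:

```python
# Hypothetical prerequisite map: Z requires Y, Y requires X, X requires nothing
prereqs = {"Z": ["Y"], "Y": ["X"], "X": []}

def plan(goal):
    """Work backward from the goal, then list courses in the order to take them."""
    order = []
    def visit(course):
        for prerequisite in prereqs[course]:
            visit(prerequisite)        # resolve what the course itself requires
        if course not in order:
            order.append(course)
    visit(goal)
    return order

print(plan("Z"))  # ['X', 'Y', 'Z']: take X now
```

Starting from the goal, there is exactly one chain to explore; searching forward from every course you might take, there would be many more.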
Another familiar fable: A king has three sons, all of whom are very intelligent. Still, he wants to leave his kingdom to the most intelligent son. So he slips into their bedroom one night while they are all asleep and paints a spot on each of their foreheads. He then awakens them and says, "If you can see at least one spot, raise your hand." Of course, all three raise their hands because they each actually see two spots. The king then says, "If you can figure out rationally whether or not you have a spot, lower your hand and explain your reasoning. If you're right, you will inherit my kingdom." After a few minutes of thought, one son lowers his hand, says he has a spot, and justifies his answer. How did he figure it out?

In this case, you will never solve the puzzle as long as you ask the question, "Do I have a spot?" If you have a spot, the other two would have their hands raised, which they do, but they would have their hands raised anyway because they could see each other's spots. The sooner you realize you cannot solve the problem that way, the sooner you are likely to ask an answerable question: "Do I NOT have a spot?" If I did not have a spot, the other two would still have raised their hands, to be sure, but each would quickly figure out that he himself had a spot. That is because there have to be at least two spots for everyone to see at least one, and if either brother saw that I didn't have one, he would surely know that he did. Since neither has figured it out right away, I can answer "Do I not have a spot?" in the negative. If I do not-not have a spot, I do have a spot.

Many questions are easier to answer by rephrasing them in some way. "Should I ask for help?" can be changed to "Is there any reason not to ask for help?" "Can I afford an expensive coat?" may be harder to answer than, "Is there something else that I'd rather do with the money?" It may be just as easy to answer, "Do I have to take the harder course?" as, "Should I take the harder course?"
but you will have a better attitude answering the latter question. And in many situations, you may not be able to answer, "Is this the right thing to do?" but you can answer, "What are the wrong things to do?"

Functional fixedness. Let me just briefly mention one more good heuristic for solving problems. We tend to see objects as useful for their intended purpose and fail to think of other uses to which they could be put. This is a "fixedness" of our perception of an object's function. The classic problem is to tie together two strings hanging from the ceiling, but they are too far apart to reach one while still holding the other. On a table are some paper, glue, and scissors. Do you see at least three solutions? You could tie the scissors to one string as a weight, start it swinging as a pendulum, grab the other string, and catch the swinging one when it comes back. You could use the table as a platform, or you could make a paper chain.

The general point is to be versatile in seeing ways in which familiar objects can be used for alternative functions. A pen, for example, is a good weapon for gouging at the eyes of a person who is attacking you. A dime can often be used as a screwdriver, and you can bend a coat-hanger into a bookrest. Many things should be used "only as directed," but most things can function in many more ways than we customarily recognize.
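The king-and-spots fable earlier in this section also yields to plain enumeration: list every arrangement of spots consistent with what everyone did, and strike out the ones that can't be. A sketch (the world encoding and helper names are mine):

```python
from itertools import product

def sees_spot(world, i):
    # Son i sees a spot if some OTHER forehead has one (he can't see his own)
    return any(world[j] for j in range(3) if j != i)

# Worlds consistent with all three raising their hands (everyone sees a spot)
worlds = [w for w in product((0, 1), repeat=3)
          if all(sees_spot(w, i) for i in range(3))]

def could_announce(i, world):
    # Could son i deduce his own forehead at once, from what he sees?
    consistent = [w for w in worlds
                  if all(w[j] == world[j] for j in range(3) if j != i)]
    return len({w[i] for w in consistent}) == 1

# The clever son (son 0) sees spots on both brothers...
candidates = [w for w in worlds if w[1] == 1 and w[2] == 1]
# ...and neither brother announced at once, so strike any world where one could have
remaining = [w for w in candidates
             if not any(could_announce(i, w) for i in (1, 2))]
print(remaining)  # only the all-three-spots world survives
```

In the spotless-me world (0, 1, 1), each brother would see exactly one spot and could deduce his own immediately; their silence eliminates that world, which is exactly the "Do I NOT have a spot?" argument in the text.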


Derek Hamilton
2000-09-05