Our ancestors have contemplated the world ever since they had the capacity for abstract thought. But higher thought evolved as an untamed instinct, mixed in with older, animalistic modes of thinking. Only in the last few millennia did philosophers start to think about thinking. They recognized that the human mind is most accurate when imagination is disciplined by reason. Reason can reveal universal truths that do not depend on culture, opinion, or coercion. Sometimes people have changed the world just by changing the way they think about it.
Logic is the study of truth as captured by language. It is highly abstract, so logical rules apply equally to apples, oranges, and thoughts about oranges. In a sense, logic is a study of the mind itself, and thus an essential first step toward reasoned thought.
Ancient Greeks, most notably Aristotle, systematized formal logic. The word “formal” here means that the validity of an argument can be analyzed based on its form, no matter what the argument is actually about. Much like the physical universe, language can be reduced to a few primary elements. The basic unit of language is the sentence, which typically asserts something true or false. Statements can be connected with logical terms like and, or, and not. They can be qualified and quantified with terms such as sometimes and for every. They can be conditionally connected to each other, for instance by saying “If A happens, then B happens.” These simple connections create rules: rules about how the mind works, but above all about how reality works.
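The conditional “If A, then B” can be treated as a mechanical truth function: it fails only when A holds and B does not. A minimal illustrative sketch (the helper name `implies` is my own, not standard terminology from the text):

```python
# Illustrative sketch: the logical connectives described above,
# treated as mechanical truth functions.

def implies(a: bool, b: bool) -> bool:
    """'If A, then B' is false only when A holds but B does not."""
    return (not a) or b

# Enumerate every combination of truth values, as in a truth table.
for a in (True, False):
    for b in (True, False):
        row = (a, b, a and b, a or b, implies(a, b))
        print(row)  # prints one row per combination of A and B
```

Tables like this show the “formal” character of logic: the rows hold no matter what concrete statements A and B stand for.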
Aristotle analyzed formal logic during the Athenian democracy. In a democracy, citizens debate to persuade each other to action. Aristotle recognized that clever rhetoricians could convince audiences of false facts with illogical tactics: by appealing to emotions or popular opinion, or by making faulty arguments sound convincing. [1] His main contributions were the analysis of syllogisms (correct reasoning) and logical fallacies (bad reasoning).
A syllogism connects two quantified statements together to deduce a third one. A classic valid pattern is: all A are B; all B are C; therefore, all A are C. Compare it with a superficially similar pattern: all A are B; all C are B; therefore, all A are C.
Any argument that follows the first pattern will be correct. It’s harder to see, but an argument following the second pattern is faulty, even when it happens to reach a true conclusion. For instance, “all squares are shapes; all rectangles are shapes; therefore all squares are rectangles” arrives at a true statement through bad logic. It’s easy to get fooled this way: many people accept fallacious arguments of just this form without consciously noticing how they are connecting the thoughts together.
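The “mechanical” character of such patterns means they can even be checked by brute force. A small illustrative sketch of my own, assuming each term can be modeled as a subset of a tiny universe:

```python
from itertools import combinations, product

# Illustrative sketch: checking syllogism patterns by brute force.
# Each term (A, B, C) is modeled as a subset of a tiny universe,
# and "All X are Y" means X is a subset of Y.

def powerset(universe):
    items = list(universe)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

SUBSETS = powerset({0, 1, 2})

def is_valid(pattern):
    """A pattern is valid if no assignment makes both premises true
    and the conclusion false (i.e., no counterexample exists)."""
    for A, B, C in product(SUBSETS, repeat=3):
        premise1, premise2, conclusion = pattern(A, B, C)
        if premise1 and premise2 and not conclusion:
            return False
    return True

# Valid: all A are B; all B are C; therefore all A are C.
barbara = lambda A, B, C: (A <= B, B <= C, A <= C)
# Faulty: all A are B; all C are B; therefore all A are C.
faulty = lambda A, B, C: (A <= B, C <= B, A <= C)

print(is_valid(barbara))  # True
print(is_valid(faulty))   # False
```

Finding a counterexample is conclusive proof of invalidity; for simple set patterns like these, a three-element universe happens to be enough to expose the faulty one.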
An orderly system of logic gave philosophers and scientists much better guidance for making intellectual progress. Logical statements have a “mechanical” flavor to them, in that they lead to conclusions automatically. In recent centuries, engineers learned how to build logical sequences into machines, a development that culminated in computers. Even the phone in your pocket is part of a long tradition going back to Aristotle!
Logic alone is not enough to understand the world. Consider this argument:
Oranges are rich in vitamin C.
Vitamin C effectively prevents colds.
Therefore, oranges effectively prevent colds.
This syllogism is logically correct, but that only means that its conclusion is as good as its premises. We won’t actually know whether an orange is truly effective at preventing colds until both premises are proven true. First, we can’t ascertain whether oranges are rich in vitamin C just by thinking about oranges. We have to take some oranges into a laboratory and study them.
It’s even more challenging to test whether vitamin C effectively prevents a cold. Inductive reasoning may suggest a hypothesis: “Other vitamins are effective at preventing colds, so maybe vitamin C is too.” An experiment is the best way to test this prediction. By giving some people vitamin C and comparing them to similar people who did not take vitamin C, a scientist can get a pretty good idea of its efficacy. The clearest and most convincing results will be mathematical, for example by describing percentages and time durations. Observations, induction, experiments, and mathematical modeling are key elements of what we now call the scientific method.
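The comparison at the heart of such an experiment can be sketched numerically. A hypothetical simulation (the group sizes and cold rates below are invented for illustration, not real data):

```python
import random

# Hypothetical simulation: a controlled trial comparing cold rates in
# a vitamin C group against an untreated control group. The baseline
# and treated rates are invented for this sketch.

random.seed(0)

def run_trial(n, p_cold):
    """Each of n participants independently catches a cold with
    probability p_cold; return the fraction who got sick."""
    return sum(random.random() < p_cold for _ in range(n)) / n

control = run_trial(10_000, 0.30)   # assumed 30% baseline cold rate
treated = run_trial(10_000, 0.24)   # assumed reduced rate with vitamin C

print(f"control: {control:.1%}, treated: {treated:.1%}")
```

The larger the groups, the less likely an observed gap between the two percentages is a fluke, which is exactly the kind of mathematical result the scientific method aims for.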
This method (which is as much art as science) was refined very gradually. The ancient Greeks showed some awareness of induction and observation, but they were not good scientists by today’s standards. Aristotle stated many laws of nature that were flat-out false, and scholars believed them for millennia just because Aristotle was so esteemed. Medieval Muslim scientists were more experimental and mathematical, and started overturning some classical ideas. [2] Renaissance Europeans continued developing science as a form of inquiry.
By 1650, it was well established that the quality of knowledge is measured not by the stature of the person who expounds it but by the method followed. Science was seen as an evolving body of hypotheses that lost or gained strength based on evidence. A hypothesis should also be falsifiable: it should make predictions that can be tested as true or false. [3]
In the 16th century, Polish astronomer Nicolaus Copernicus used meticulous observations to propose the radical notion that the Earth goes around the sun. Copernicus’ model was imperfect, so much so that we might call his discovery a lucky guess. Nevertheless, it was confirmed, refined, and mathematized after another century of scientific observations. The power of science to uncover such profound secrets of nature made a great impression throughout Europe. The movement it inspired is now neatly summarized as the scientific revolution.
Since astronomy is mostly observational, it lent itself to early rapid development as a body of science. The same was true of anatomy. With dissections, scientists such as Vesalius and Harvey were able to discover basic realities about the structures and functions of organs.
The Renaissance’s trailblazing experimental scientist was Galileo. He used man-made experimental setups to systematically force nature to give him answers. For instance, it is difficult to take precise measurements of freely falling balls, so Galileo slowed them down by rolling them down ramps. His observations led to broad laws of nature and mathematical descriptions of motion and of solid and fluid mechanics.
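Galileo’s central ramp result, that distance grows with the square of time, implies that the distances covered in successive equal time intervals follow the odd numbers 1, 3, 5, 7, … A short sketch (the acceleration value is arbitrary, since only the ratios matter):

```python
# A short check of Galileo's law of fall (distance proportional to
# time squared), using an assumed constant acceleration a.

a = 2.0  # acceleration along the ramp (arbitrary units)

# Distance traveled after t seconds under constant acceleration:
positions = [0.5 * a * t**2 for t in range(6)]

# Distances covered in each successive second follow the odd numbers
# 1, 3, 5, 7, ... (scaled by a/2) -- the pattern Galileo measured.
intervals = [positions[t + 1] - positions[t] for t in range(5)]
print(intervals)  # [1.0, 3.0, 5.0, 7.0, 9.0]
```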
The generation after Galileo saw a flurry of activity in many fields. Revolutionary discoveries were made in hydraulics, optics, electricity and magnetism, geology, and biology. The mathematics to describe nature matured very quickly into calculus and other advanced methods. Scientific academies and journals formed around each discipline. This accelerated the pace of science well beyond what isolated medieval thinkers had accomplished.
Isaac Newton tied together many strands of scientific progress with mathematical principles. His universal laws of motion and gravitation were the most revolutionary, for multiple reasons. Aristotle had taught that objects behave according to their nature or “purpose”, and that earthly objects have very different natures from heavenly ones. Newton showed that the same laws apply to motion in the heavens as on Earth. In particular, the moon’s orbit is really just a combination of two motions: falling toward Earth (gravity, like an apple from a tree) and flying off tangentially into space (inertia, like a puck on frictionless ice).
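Newton’s picture can be sketched numerically: give a body only a tangential velocity (inertia) and a continual fall toward the center (gravity), and an orbit emerges. A minimal illustrative simulation in normalized units (setting GM = 1 is an assumption for convenience):

```python
import math

# Illustrative sketch of Newton's insight: an orbit is nothing but
# inertia (straight-line coasting) continually bent by gravity
# (falling toward the center). Units are normalized so GM = 1.

GM = 1.0
r0 = 1.0
x, y = r0, 0.0
vx, vy = 0.0, math.sqrt(GM / r0)  # tangential speed for a circular orbit

dt = 0.001
steps = int(2 * math.pi / dt)     # roughly one full orbit

for _ in range(steps):
    r = math.hypot(x, y)
    ax, ay = -GM * x / r**3, -GM * y / r**3  # gravity: fall toward center
    vx += ax * dt                 # update velocity first (semi-implicit
    vy += ay * dt                 # Euler, which keeps the orbit stable)
    x += vx * dt                  # inertia: coast in a straight line
    y += vy * dt

print(math.hypot(x, y))  # stays close to 1.0: falling + coasting = orbit
```

Remove the fall (set GM to 0) and the body flies off in a straight line; remove the tangential speed and it drops straight into the center. Only the combination produces an orbit.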
Mathematical laws also left little room for “willpower” in nature. If particles and planets moved in patterns that could be predicted on paper, nature’s laws seemed more like indifferent properties of numbers than any divine plan. Post-Newtonian scientists spoke of a “clockwork” universe, one that did not operate at God’s whim but that could be precisely understood with close study.
Although Europe’s leading scientists up to and including Newton were devoutly Christian, some discoveries were starting to conflict with scripture [4] or church doctrine. [5] Churches banned numerous scientific works for heresy. [5] Europe was entering a confusing period when reason and authority gave different answers.
Outside of clockwork mechanics and astronomy, most phenomena are complicated and difficult to predict. Probability is the reasoned way to deal with the unknown. In medieval thought, understanding the unknown was a matter of interpreting signs from nature. To determine whether a sick man would survive, an oracle might say that a flock of ravens foretells his death, whereas a doctor might say that death is more likely when the patient’s nose perspires. Obviously, some signs were more accurate than others. Signs led to the concept of evidence, and their reliability, when measured, evolved into probability. [6]
17th century mathematicians [6] realized that some probabilities could be quantified by counting equally likely events. Tools of chance – dice, playing cards, roulette wheels – provide the simplest illustrations. There are 2,598,960 ways to deal a five-card poker hand. 5,108 of these hands constitute a “flush”, with five cards of the same suit (setting aside straight flushes). Thus, the probability of getting a flush is 5,108 / 2,598,960, or about one in 509. Interpretation 1: in a casino holding about five hundred gamblers, roughly one of them would hold a flush. Interpretation 2: as applied to any one gambler, it is rational for him to bet $1 for a $509 prize in the event he is dealt a flush. Understanding these interpretations helped make lotteries, insurance, law, and annuities more fair and efficient. Managing risk wisely became an integral part of capitalistic investment, wealth management, and state budgets.
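The flush figure comes from exactly the kind of counting of equally likely events described above, and takes only a few lines to reproduce:

```python
from math import comb

# Counting equally likely poker hands. comb(n, k) counts the ways
# to choose k items from n.

total_hands = comb(52, 5)         # 2,598,960 five-card hands
flushes = 4 * comb(13, 5) - 40    # same-suit hands, minus the
                                  # 40 straight flushes
probability = flushes / total_hands

print(total_hands, flushes)       # 2598960 5108
print(round(1 / probability))     # about 1 in 509
```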
An essential step in calculating probabilities is to have accurate tallies of data. For instance, if the state issues annuities to millions of men, it needs to know the probability that each man will survive each year to collect his payment. This probability is determined by examining voluminous record books. Statistics, the study of data, was pioneered concurrently with probability. The earliest known case study was John Graunt’s analysis of England’s Bills of Mortality, a tome listing christenings, deaths, and migrations over many decades. Graunt was able to determine fundamental facts such as that England’s children were born 51% male, and that plagues rarely lasted more than two years.
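The annuity issuer’s calculation can be sketched by chaining yearly survival rates into a small life table. The rates and payment below are invented for illustration (Graunt and his successors built real tables from mortality records):

```python
# Hypothetical life-table sketch. Chaining yearly survival rates gives
# the probability a man lives long enough to collect each payment.

yearly_survival = [0.99, 0.98, 0.97, 0.96, 0.95]  # assumed rates, years 1-5

alive = 1.0
payment = 100.0          # annuity paid at the end of each year survived
expected_cost = 0.0
for rate in yearly_survival:
    alive *= rate                     # probability of surviving this far
    expected_cost += alive * payment  # each payment weighted by survival

print(f"P(survive 5 years) = {alive:.3f}")
print(f"expected payout = {expected_cost:.2f}")
```

Weighting each promised payment by the probability of living to collect it is what lets a state or insurer price an annuity fairly instead of guessing.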
Without examining data, such valuable but subtle truths would remain invisible. Most people get their understanding of the world through direct personal experience. Any one person’s experience, though, is too limited and biased to give the full picture of reality. Conventional wisdom, too, can be mistaken. For instance, it was once believed that plagues struck whenever a new king was crowned. Data proved that this superstition was “most false”. [7] Science and society have become increasingly data-driven ever since – a characteristic that is only accelerating in the computer age.
1. Shenefelt and White, If A, Then B: How the World Discovered Logic, Columbia University Press (2013), ISBN 978-0-231-53519-9, https://www.amazon.com/If-Then-World-Discovered-Logic/dp/0231161050 . Chapters 1 and 2 give a unique discussion of how Aristotle’s formulation of logic was influenced by his environment. Section 2.5 in particular discusses “The Separation of Logic from Rhetoric”.
2. For example, Alhazen (who lived in Egypt around the year 1000) wrote a treatise on optics in which he demonstrated experimentally that light enters the eye from objects, contradicting Euclid, Ptolemy, and Aristotle. Lindberg, D.C., Theories of Vision from al-Kindi to Kepler, University of Chicago Press (1976), https://books.google.com/books/about/Theories_of_Vision_from_Al_kindi_to_Kepl.html?id=-8A_auBvyFoC , pp. 60–67.
3. 20th century philosopher Karl Popper defined science by falsifiable hypotheses. Thornton, Stephen, “Karl Popper”, The Stanford Encyclopedia of Philosophy, Edward N. Zalta (ed.), https://plato.stanford.edu/entries/popper/ (accessed 12/11/16) is a thorough, readable account of Popper’s philosophy with an extensive bibliography.
4. The Bible strongly implies a fixed, central Earth without stating it outright. God created Earth before the sun according to Genesis 1:1-19. Joshua 10:12–13 states, “Then spake Joshua to the Lord … ‘Sun, stand thou still upon Gibeon; and thou, Moon, in the valley of Ajalon.’ And the sun stood still, and the moon stayed, until the people had avenged themselves upon their enemies …. So the sun stood still in the midst of heaven, and hasted not to go down about a whole day.” In other words, the sun was the body normally in motion that had to be stopped. Furthermore, Ecclesiastes 1:5 states, “The sun also ariseth, and the sun goeth down, and hasteth to his place where he arose.” (Both quotes are from the King James Version of the Bible.) For further scriptural arguments in the mouths of modern geocentrists themselves, see Hall, Marshall, “True Science Confirms Bible Geocentrism”, Fair Education Foundation, Inc., http://www.fixedearth.com/true-science-confirms-bible-geocentrism.html (accessed 12/18/16).
5. A Vatican panel condemned Copernicus’ theory as “heretical from a religious standpoint, inasmuch as it contradicts the tenets of Holy Scripture in many places, both according to the plain meaning of the words and according to the universal interpretation of the holy fathers and learned theologians.” Quoted by Krause, Ernst in “The Struggle Regarding the Position of the Earth”, The Open Court, vol. 14 no. 8 (August, 1900), http://opensiuc.lib.siu.edu/cgi/viewcontent.cgi?article=1214&context=ocj (accessed and saved 12/18/16). Other scientists on the Catholic church’s Index of Prohibited Books included Kepler, Galileo, Descartes, Francis Bacon, and Pascal.
6. Hacking, Ian, The Emergence of Probability, Cambridge University Press (2006), ISBN 978-0-521-86655-2, Chapter 5, “Signs”.
7. Graunt, John, Natural and Political Observations Made Upon the Bills of Mortality, London (1662), pp. 9–15, reprinted at http://www2.sunysuffolk.edu/westn/mortality.html (accessed 12/23/16).