Tuesday, March 26, 2013

Part I: The Problem of Faith in the Modern World (cont.)


The Role of Science in Western Culture (cont.)

Technology as the Fruits of Science

Up to this point, I’ve talked about the impact of science on our lives in terms that are pretty abstract and obscure. Even though we may vaguely grasp the way science shows us how the atom works or how the universe began, that understanding has little to do with how we go about our daily routine. In fact, though, science plays a key role in almost every moment of each day because of the critical part it has played in creating the technology that defines how we live in the modern world. And because we recognize this relationship, our dependence on technology strengthens the authority of science in our minds.
A revolution began at the end of the nineteenth century when science was applied more and more to the creation of new technology. Before that time, advances in technology were mostly evolutionary, the result of practical experimentation: New things were tried and, if they turned out to improve how something worked, they continued to be used. If they didn’t, then they were discarded. Although this sounds a lot like the scientific method, it is different because it was purely a trial-and-error process. The goal was to make the thing itself better, not to learn about the basic principles of nature that allowed the new thing to work. For this reason, invention was more craft than science, that is, the pursuit of knowledge for its usefulness, not for its own sake. Even the best-known inventor, Thomas Edison, developed most of his products through trial and error instead of applying scientific theory. His famous saying that genius is one percent inspiration and ninety-nine percent perspiration shows this: There’s no mention of any use of scientific theory, just new ideas and effort.
In a way, Edison was among the last of the “old-school” inventors. As advanced scientific knowledge began to flower in the nineteenth and early twentieth centuries (Einstein published his groundbreaking papers in 1905), the discoveries that resulted began to be used to improve old technologies and to create new ones. This trend was not entirely new, of course. After all, Benjamin Franklin studied the nature of lightning and showed that it is a flow of electricity, so he was able to invent the lightning rod as a way to safely channel the electricity from the air to the ground through a wire instead of through the building itself. But this way of applying scientific discovery to the art of invention continued to be more the exception than the rule until the twentieth century.
Physical theories of electricity and magnetism inspired the greatest wave of invention at the beginning of the twentieth century. Early nineteenth-century scientists like Michael Faraday, Luigi Galvani, Alessandro Volta, André-Marie Ampère, and Georg Simon Ohm developed the scientific knowledge of electromagnetism that became the foundation for technologies created by inventors and engineers such as Samuel Morse (the telegraph), Alexander Graham Bell (the telephone), Nikola Tesla (electric motors), and George Westinghouse and Thomas Edison (electrical generation and transmission). In the following century, that same foundation made possible such inventions as radio, the vacuum tube, and the cathode ray tube used in the first TVs and computer monitors.
Electronic gizmos are the most obvious product of science, but just about every invention of the twentieth century was at least helped along by science. For example, the Wright Brothers drew on aerodynamic principles such as the Bernoulli Principle (the unequal pressure that gives wings their lift and propellers their thrust) in inventing the airplane. The theories of James Clerk Maxwell and Ludwig Boltzmann led to the invention of refrigeration. And on and on.
Most of us are at least vaguely aware of how the scientific method has sped up the development of the technology that we use every day, and this helps give the scientific process a greater credibility than can be found in any other area of human activity. Religion, on the other hand, appears regressive by comparison because it is mostly based on the authority of writings and traditions from the ancient past. It’s no surprise, then, that as technology becomes more and more central to the way we live, religion is losing its influence, receding into the background and fading in importance.

Thursday, March 14, 2013

Part I: The Problem of Faith in the Modern World (cont.)

The Role of Science in Western Culture (cont.)

The Scientific Method and Truth

One really important reason that the scientific worldview has taken over Western culture, despite feeble resistance by believers, is that the method it uses is so successful at figuring out what is true and what is false. And it is successful because, unlike other ways to seek truth and describe reality, it is truly progressive. Before the scientific method took charge, Truth was handed down by people with authority. And because we didn’t have bumper stickers back then to tell us to, we didn’t question authority. So, if you believed that everything that Aristotle wrote was true, you wouldn’t look for evidence that his worldview was flawed precisely because it was your worldview as well. No wonder people thought that reality couldn’t change.
On the other hand, the scientific view of the world is based on the idea that whatever we think is true is only partial and ever-changing. Every discovery, no matter how firmly supported and widely accepted, is assumed to be just one step on a journey, one piece of an always-growing puzzle. So even when a theory is “proved” (a term scientists don’t like to use, except maybe when talking down to us laypeople), it simply becomes the basis for new hypotheses. Hypotheses that aren’t confirmed by experiments, meanwhile, are thrown out, and scientists who continue to cling to them are pushed to the margins of the scientific community and lose its respect.
Because of this forward-moving quality, the scientific method gets better and better at predicting the future, that is, what develops from current conditions and events. This is most often seen in the laboratory, of course, but it applies to everyday happenings as well. The weather report is the most obvious example of this, but such areas as medicine and engineering are also based largely on the ability of the scientific method to predict an outcome even when the exact conditions and events leading to that outcome have never been seen before.
Even beyond these everyday examples, science was able to predict such (literally) earth-shaking developments as the splitting of the atom long before they were accomplished. Not even Newton, much less the prophets and philosophers, could have foreseen such a discovery, but the scientific method that Newton pioneered created the conditions that eventually led to the theories that resulted in nuclear fission. In other words, scientists didn’t just stumble upon nuclear fission. Instead, the ideas of such visionaries as Albert Einstein led scientists to perform experiments that, through the process of constant fine-tuning, enabled them to create the first controlled nuclear reaction and then, sadly, the uncontrolled nuclear reaction of the atomic bomb.
Such developments as these have led us to give the scientific method a level of authority far greater than any other. With the possible exception of the most willfully closed-minded religious believers, we have come to view science as the most reliable way of finding the truth, the most consistent and powerful way to learn how the world really works. While we may embrace scraps of our ancestors’ faith, the reality is that we place far more trust in scientists than we do in prophets and priests, at least on a day-to-day basis.

Tuesday, March 5, 2013

Meanwhile, back at the book...

Part I: The Problem of Faith in the Modern World (cont.)

The Role of Science in Western Culture

The ultimate irony of the debate between science and faith is that the argument is already over, and science has won. The scientific worldview has conquered all others, at least in America and the more modern parts of the world. It’s everywhere, influencing practically everything we do. Seriously. Planning a party? Do you ask your pastor to say a prayer to beg God not to rain on your event? Or instead do you check your favorite weather report to find out what the odds of fair weather are for your special day? The fact that the second approach is the only one just about any of us would seriously consider shows how much of what we believe is based on science, not on religion. We no longer believe that weather is caused by the whim of a fickle God who can be coaxed to do what we want, but rather that it is the result of natural forces that scientists can measure and predict the effects of. Even the most religious among us watch the weather report to plan our day, not to laugh at how impertinent it is.
Or think of what we do when someone we love is critically injured. If we’re not Christian Scientists, we call 911 or rush them to the hospital, then we might pray for God to get involved, often asking him to help the doctors and nurses practice their science well. In a way, we’ve managed to demote God to the role of physician’s assistant.
Every day, we do things that people a few generations ago would have thought impossible, mainly because science has shown that it can be done. We climb aboard an airplane and, despite its great weight, we trust that it will lift off and fly us to where we’re going. We watch events happening on the other side of the planet, probably unaware that the picture is being carried by satellites and lasers over glass “wires.” We point our phones at an interesting scene, record a video, and then send it to five friends around the country. Each of these activities began with scientific theory that was confirmed by experiments and then applied to technology. This technology is possible only because the scientific process has given us such a deep understanding of how our world works.
Although we are not aware of it—and in fact, we probably would deny it—we in the Western world have come to believe more firmly in Isaac Newton than in God. We go about our days assuming that the physical rules that Newton discovered control how the world around us will behave, in part because those rules have been declared to be laws that seem to be more reliable than any of God’s moral laws. We understand that things fall when we let go of them, not just out of habit or because of supernatural forces pulling them down, but because they are obeying a law of gravitation that Newton said controls the whole universe. When we feel how hard it is to move something heavy, we know that Newton’s laws of motion (in particular, inertia) explain why it’s so difficult, not that the darn thing is just being stubborn. Even though we may be only vaguely aware of how Newton’s ideas came to be accepted as laws (by first being worked out as mathematical formulas that were then verified by countless experiments), we have come to accept their authority without quibble. On the other hand, when someone is caught doing something bad, even something truly evil, the best we can do is pray that justice will be done, and usually we assume that it is up to us as a society to make sure that it is.
The place that Newton holds in our culture is mainly because he stood at the beginning of the scientific revolution that came to define how we look at the world. Newton’s theories were among the first to be systematically tested, both by analyzing the math behind them and through scientific experimentation. This turned into a virtuous cycle of sorts, where the truth of Newton’s groundbreaking insights was ever more firmly supported and the process of scientific experimentation itself came to be increasingly accepted as the most reliable way of revealing the truth. What happened as a result was the birth of the Enlightenment, when Western society turned away forever from blind obedience to authority and embraced the scientific method as the most trusted path to knowledge. In the Christian world, even the Bible became the subject of scientific study by scholars who often reached conclusions that disturbed the faithful and sparked a backlash against using science to study things that should be kept securely within church walls.
In spite of this limited backlash, though, even the most religious people in America accept the basic validity of the methods and assumptions of scientists. Yes, some people are offended by certain scientific theories because those theories seem to deny something that they believe God revealed through the Bible—in effect, calling God a liar—and yet those same people will often try to use other scientific theories to disprove the ones they disagree with. For example, fundamentalist Christians who think that Darwin’s theory of evolution contradicts the story of the creation of life in the book of Genesis will often claim that the law of entropy (the second law of thermodynamics) proves that evolution is impossible. After all, they say, that law says that everything is moving from order to disorder, so evolution must be impossible because the new life forms that supposedly result from evolution are usually more complex than the ones that came before. These “proofs” rarely stand up to close examination, though, because they almost always twist the law or just flat out apply it wrongly. Even so, the fact that fundamentalists attempt to use science to discredit science shows how much the scientific point of view is actually accepted by people who, if asked, would heatedly deny that their worldview depends more on science than on revelation.

About Me

I am a former Presbyterian minister (and hence a holder of a Master of Divinity degree) and presently a technical writer for a Very Large Software Company (yes, you guessed right). My academic background is in things religious, but I have just enough interest in things scientific to support the delusion that I can write about them. In other words, I am a veritable salt shaker of dubious propositions.