As made evident in their writings, both Johnston and the Huesemann authors would find fault with the notion that technological fixes are “necessary” to solve social ills. In his essay The Technological Fix as Social Cure-All, Johnston argues that “Modern problems cannot be reduced to mere engineering solutions” (Johnston, pg. 54). He takes this view because both mankind and society are prone to constant change: what seemed a proper correction to a problem in the past may not meet the standards of today, and may itself become a new problem to be fixed. A similar view is proposed by the authors Huesemann and Huesemann, who believe that any attempt to solve complex social problems with technological remedies is doomed either to delay the inevitable or to create a worse outcome by forestalling it. They proclaim in their book that “The negative and sometimes irreversible consequences brought about by the application of science and technology are not only inherently unavoidable but also intrinsically unpredictable” (Huesemann & Huesemann, pg. 15). This belief is founded on the idea that nature, and by extension humanity, is shaped as much by the bonds that interconnect it as by the individual facets of its being; a reductionist view therefore cannot fully comprehend it.
It is these problems with the technological fix that make its failings inevitable. Nature, and the social constructs that constantly develop and grow among those who live within it, cannot be fully understood or categorized in a traditional scientific way: science is by its nature rational, whereas any construct formed between beings with sentience and free will is deeply irrational. Thus, while it is not wrong to solve a problem with technology, such an endeavor should be undertaken with great caution.
Great blog, I fully agree with your last statement that humans need to take more caution when releasing potentially dangerous solutions to the public. This, however, raises the question of how we do that. It seems to me that no matter how much research is done on something, there is still a chance that it will backfire, because we can’t see the future and the possible outcomes of our actions. My question to you is whether or not there is a way to be 100% sure that a technological fix won’t have unintended consequences.
I also agree. I would love to see the science types take more care, and possibly have a little more foresight, before a decision is made that could potentially damage the ecosystem that many animals and people rely on. Although more research would help, it’s probably impractical to think we could foresee everything that would have a negative impact on something else, so being prepared for something to happen would be the next best step.
Well written post! Great summary of the text and a sound conclusion on how a reductionist, rational view cannot be used to understand and solve the irrational problems that exist in human social constructs. As you mentioned, it is impossible to fully understand irrational human societies through technology, especially as both exhibit constant change and variance, and therefore great caution is necessary when using a technological fix. As advised by Johnston, designers need to pay close attention to the scope and longevity of their solutions (Johnston p. 54). Even Douthwaite admits to this and states that the terrible temptation of a technological fix is to assume it will last (Douthwaite p. 32). Unfortunately, hiring multiple specialists to examine the cost of every new item becomes unfeasible in practice, as the rate at which new technologies are released to the market far exceeds our ability to determine the societal and environmental costs of each invention.
Yes. The limits of reductionism seem to suggest a limit to tech fixes. But so much is not reducible; do we give up on tech fixes altogether?