Disequilibrium Makes the World Go Round

Douthwaite argues that technological fixes are necessary to solve social problems. The other two authors would disagree with that claim, at least to some degree. Johnston believes that modern problems cannot be reduced to mere engineering solutions over the long term because human goals are diverse and constantly changing. This makes sense: different groups of people face different problems and have different ways of solving them. Huesemann discusses scientific reductionism, and how scientific findings can actually hurt people and create new social problems, producing an overall negative effect. In these ways Douthwaite differs from the other two authors on whether technological fixes are necessary for solving social problems. All three authors make good points about the causes and effects of technology on our future.

Technological fixes to social and environmental systems have negative repercussions, because putting things out of equilibrium can disrupt how we act and approach the world. Whenever humans in particular are pushed into disequilibrium, negative effects are bound to follow. What most people fear is what they know little about. However, this should not stop the flow of technological discovery, even if that means there will be negative effects. You could even argue that a discovery that creates problems now will eventually shed light on problems in the future. In my mind, it makes sense to always look toward the future without any regrets about the past. We wouldn't have nuclear energy today without the frightening advancement of nuclear power. You can't change the past, but you can always improve the future.

 

3 thoughts on “Disequilibrium Makes the World Go Round”

  1. I think it is interesting how you connected what was talked about in the Johnston article and the Huesemann article in terms of humanity and individual equilibrium rather than just environmental equilibrium. This, being in your title, is what made me click on your post. I think I could support the argument in your second paragraph by saying it's healthy to be challenged and for our scientists to always have new challenges to face, but I could also argue that we are digging our hole deeper and deeper, and what happens when we reach the end? Do we run out of natural resources? Does the sun fry us through the hole in the ozone? Do countries press the "red button"? I agree you can't change the past, and regret can be a useless emotion, but I think we should still learn from obvious mistakes and focus on reparation, filling in the hole we're digging rather than blatantly disregarding it and making it deeper.

  2. I most definitely agree with your idea that we need to continue further research and push to progress in order to solve the issues we face. I believe it is very important not to quit after failures if there is a goal one is working toward. We must not stop and regret our downfalls and possibly bad decisions but instead, as Emelyn said, "learn from obvious mistakes and focus on reparation". I would argue with Douthwaite and say that technological fixes are not necessary in order to fix our social problems; there are other ways to do so. While these fixes have some helpful and positive uses, I would not consider them "necessary" to our social or environmental well-being. While all three authors make very thought-provoking claims, I personally find myself somewhere in the middle of all three.

  3. I agree that humans should always strive toward progress, but there has to be a limit. At what point do we stop "progressing" and create something we as humans can no longer control? Take, for instance, the creation of nuclear power, as you stated. While I agree that nuclear energy is very beneficial to humanity, the discovery of nuclear fission also created weapons that could wipe most of humanity off the planet. Sure, a nuclear holocaust has not occurred, but we have come close, and we cannot put the cat back in the bag at this point. From now on, every human will be born into a world where a nuclear tragedy is possible. A newer technology being developed is artificial intelligence. If it succeeds, it will surely be a great technological feat, but what if we create something that we cannot stop and that can turn against humanity? What if, in the future, a society decides to use artificial intelligence to take over tasks we once trusted to people, such as policy making or the education of the young? The horrible environmental effects of some technological fixes have already begun to show in seemingly irreversible ways. Cost-benefit analysis should be taken seriously. Humanity and the natural environment are not worth gambling on the assumption that someone will surely create a technological fix later on.
