Wiseacre
Retired USAF Chief
So I'm sitting here with the wife watching Stargate SG-1 DVDs (she has a thing for Richard Dean Anderson), and got to wondering about ET a little bit. Stephen Hawking has postulated that an alien species advanced enough to achieve interstellar travel would likely also be, shall we say, less than friendly. If they arrived and chose to do so, we could be conquered, maybe exterminated, maybe even used as a food source. Kinda puts the next election into a little more perspective, eh?
Anyway, do you think he's right? Would aliens who are that advanced not also be more advanced ethically, to the point where taking sentient life would be forbidden? Today we have animal rights groups trying to preserve many species of plants and animals before they go extinct. Have we not advanced somewhat ourselves in our regard for the sanctity of life? Where will we be in a thousand years on this issue, assuming we still exist?
Granted, it is certainly possible for an alien civilization to advance technologically far beyond its ability to handle that new power ethically. I would admit such has been the case here, but will it always be so? What are the odds that, should humans develop such huge tech advances, we would destroy ourselves if we cannot also develop sociologically and psychologically enough to use them wisely? I think we would be better served working harder on the workings of the mind and on our relationships with one another.