Going Against The Grain

Yes, but AI is the future for humanity, especially for science and exploration. To travel the stars, we will need to supplant these bodies. Bacteria go dormant in space, but don't die. Dr. Hawking stated that we have approximately 100 years left to leave Earth to save humanity from an extinction event.

Perhaps AI will surpass us, but not our Humanity!
 
Originally Posted By: Onetor
Dr. Hawking stated that we have approximately 100 years left to leave Earth to save humanity from an extinction event.


What extinction event was he talking about?

Humans should be smart enough never to have to leave Earth unless there were a known major natural catastrophe coming sometime in the future, not for reasons caused by our own actions and stupidity that the human race could otherwise control.
 
Originally Posted By: ZeeOSix
Originally Posted By: Onetor
Dr. Hawking stated that we have approximately 100 years left to leave Earth to save humanity from an extinction event.


What extinction event was he talking about?

Humans should be smart enough never to have to leave Earth unless there were a known major natural catastrophe coming sometime in the future, not for reasons caused by our own actions and stupidity that the human race could otherwise control.


I think Hawking was just looking at the math and history. Pandemics, meteors, supervolcanoes, or worldwide drought/flooding could partially or completely wipe mammals off the face of the earth. I might argue a bit about the 100-year estimate, but probably not a 1000-year estimate. Posit a new ice age after the current warming: mankind will be forced into ever-narrowing latitudes, and arable land may become scarce. Food and water are going to supplant oil as the reasons wars are fought in the future, IMO. Of course, we are talking geologic time for the ice age, and probably children/grandchildren for mass starvation.
 
Originally Posted By: JHZR2
Originally Posted By: AZjeff
Something I'm curious about: it seems acceptable for a human to make a mistake while driving, but we seem to expect these cars to be perfect. At least the media seems to. Or I guess it's accepted that humans make mistakes, but not these machines.
I think it’s more the matter of transfer of liability in a sue happy society.

If you’re in control of the maintenance and operation of your machine, then you’re 100% liable for its outcome. If you’re buying safety-critical software that is operating your machine, then there’s an expectation that the vendor made it foolproof. Otherwise, the question is whether the owner can pass liability on to the vendor.
The US lawyers are going to have a field day with this. A huge feeding frenzy of epic proportions.

Just look at the recent crash in Florida where two teens were killed and one was thrown from the car. The 'father' blamed Tesla rather than accepting responsibility for repeatedly continuing to grant the keys to his immature, 'need-for-speed' son.
 
Originally Posted By: sleddriver
Just look at the recent crash in Florida where two teens were killed and one was thrown from the car. The 'father' blamed Tesla rather than accepting responsibility for repeatedly continuing to grant the keys to his immature, 'need-for-speed' son.


If Autopilot was driving at twice the posted speed, they should blame Tesla.

If the kids were in charge of the vehicle, it's got little to do with this thread or with the subject of liability of the programmers of the Autopilot.
 
Originally Posted By: Shannow
Originally Posted By: sleddriver
Just look at the recent crash in Florida where two teens were killed and one was thrown from the car. The 'father' blamed Tesla rather than accepting responsibility for repeatedly continuing to grant the keys to his immature, 'need-for-speed' son.


If Autopilot was driving at twice the posted speed, they should blame Tesla.

If the kids were in charge of the vehicle, it's got little to do with this thread or with the subject of liability of the programmers of the Autopilot.
I was viewing this from a much wider perspective.
 
AI would be nice for driving in heavy traffic, or after having a few drinks, but otherwise I enjoy driving...
 
Originally Posted By: ZeeOSix
What extinction event was he talking about?

Humans should be smart enough never to have to leave Earth unless there were a known major natural catastrophe coming sometime in the future, not for reasons caused by our own actions and stupidity that the human race could otherwise control.


I've recently started hearing the current phase called the sixth (Holocene) mass extinction event... I don't subscribe to that belief.
 
Originally Posted By: Shannow
Originally Posted By: ZeeOSix
What extinction event was he talking about?

Humans should be smart enough never to have to leave Earth unless there were a known major natural catastrophe coming sometime in the future, not for reasons caused by our own actions and stupidity that the human race could otherwise control.

I've recently started hearing the current phase called the sixth (Holocene) mass extinction event... I don't subscribe to that belief.


I had to look that one up ... https://en.wikipedia.org/wiki/Holocene_extinction

I think it will take a major natural disaster to wipe out mankind, like a large asteroid hit resulting in an ice age and no way to raise animals or plants (hence major starvation and survival chaos), or some incurable plague (Omega Man). When mankind realizes he's creating his own demise, there should be people smart enough to take action and steer away from that. One thing is for sure: continued population growth is a key factor in increasing strife on this rock, and will play a major role in the direction of mankind's future. Human population growth is also something the Holocene extinction article talks about.
 