
The end of humans, crackpots come discuss theories in here.

When does it end?


I forget the processes that cause it, but the ocean can fill with hydrogen sulfide gas from bacterial waste, and eventually the air becomes unsuitable for most life. It would take a ridiculously long time, though, unfortunately.

 
I don't buy the whole A.I. scenario. Once the AI is close to or at our intelligence, I would assume it will develop emotions and personalities. Some will be good and some bad, and they will fight amongst each other just as humans do.

At first they may see humanity as a threat, but it won't be long until other AI factions are a much larger threat. We will just be caught in the middle of it, the same way other creatures can do nothing about our conflicts. Some factions will want to wipe out humans, some will be indifferent toward us, and others will be like animal rights folks and want to live in harmony with us. At that point we will just be animals to them, no more of a threat to them than cows and chickens are to us.

 
I don't buy the whole A.I. scenario. Once the AI is close to or at our intelligence, I would assume it will develop emotions and personalities. Some will be good and some bad, and they will fight amongst each other just as humans do.

At first they may see humanity as a threat, but it won't be long until other AI factions are a much larger threat. We will just be caught in the middle of it, the same way other creatures can do nothing about our conflicts. Some factions will want to wipe out humans, some will be indifferent toward us, and others will be like animal rights folks and want to live in harmony with us. At that point we will just be animals to them, no more of a threat to them than cows and chickens are to us.
When a human has a garden, we exterminate insects to keep them from ruining our garden. I don't see other humans having a problem with that.

If AI feels about the earth the same way a human feels about a garden, why wouldn't AI exterminate the life that is polluting the earth? I don't see other AI having a problem with that.

 
When a human has a garden, we exterminate insects to keep them from ruining our garden. I don't see other humans having a problem with that.

If AI feels about the earth the same way a human feels about a garden, why wouldn't AI exterminate the life that is polluting the earth? I don't see other AI having a problem with that.
Some people do care about insects.

Machines won't have the same resource needs as we do. I don't see them caring about the environment in the same way. What need will they really have for clean water and oxygen? We don't revere a rock as much as we do a tree because we don't really get anything from the rock, but the machines could get some value out of that rock.

They will only need solar energy and raw materials to expand. Our ruining of the environment may actually be a good thing for them. Since they won't be as reliant on a clean environment, I don't see them caring about it even for aesthetic reasons.

 
I have a question about AI.

Why is it that everyone compares AI intelligence level to human intelligence level in regards to some sort of "magical emergence" of consciousness?

Why would AI not develop consciousness at the level of a dog...or an amoeba perhaps...if the "emergent consciousness" theory held true weight?

For myself, I am not sold on the theory of "consciousness emergence from AI". :shrug:

 
I have a question about AI.

Why is it that everyone compares AI intelligence level to human intelligence level in regards to some sort of "magical emergence" of consciousness?

Why would AI not develop consciousness at the level of a dog...or an amoeba perhaps...if the "emergent consciousness" theory held true weight?

For myself, I am not sold on the theory of "consciousness emergence from AI". :shrug:
Consciousness is a prerequisite for being intelligent. Otherwise they aren't thinking for themselves and are no different than a standard computer running through its protocols.

 
Consciousness is a prerequisite for being intelligent. Otherwise they aren't thinking for themselves and are no different than a standard computer running through its protocols.
I disagree, Rok.

Roger Penrose and Stuart Hameroff have been doing some excellent work/research in this field (the Orch-OR model of consciousness), and one of their biggest claims is that intelligence does not equal consciousness. In short, they believe that biological consciousness MAY emerge from microtubules via a quantum wave collapse, rather than from a complex neural network. :shrug:

Now...I am NOT proclaiming that their work proves the truth of anything. Rather, I contend that the science on the matter is nowhere near settled. Thus, I am skeptical of any theory that claims true understanding in this matter.

 
I disagree, Rok.

Roger Penrose and Stuart Hameroff have been doing some excellent work/research in this field (the Orch-OR model of consciousness), and one of their biggest claims is that intelligence does not equal consciousness. In short, they believe that biological consciousness MAY emerge from microtubules via a quantum wave collapse, rather than from a complex neural network. :shrug:

Now...I am NOT proclaiming that their work proves the truth of anything. Rather, I contend that the science on the matter is nowhere near settled. Thus, I am skeptical of any theory that claims true understanding in this matter.
Interesting

 
Interesting
Thank you.

I am one of the guys that believes your IQ tests. Do some Google searches on the names I provided. There are a lot of great YouTube vids. The good ones are made by respected communities. They also include counter-positions...which is a prerequisite for "good" imo.

And, like I said earlier, I am not married to their position, but...they present a strong enough claim to question the whole AI consciousness thing.

PM me if you want any specific links...I do not have them on hand...but I am very happily willing to dig some up for anyone interested. :)

 
Brunell4MVP said:
They aren't intelligent.  They do what they are programmed to do.  And they are programmed by us.
AGI is intelligent.  It just doesn't exist yet.  Siri/Alexa/whatever is not even AI, let alone AGI, no matter what marketing departments may call them.

 
bcdjr1 said:
You know that a machine copy of a human isn't a human, right? If we create a bunch of "real intelligence" machines that are immortal, they won't be humans. And when our "biological structures" die out, humanity will be extinct. 
What are you?  Are you the flesh and meat and bones of your body?  Or is your body just a substrate hosting your conscious mind?

Theoretically, if you believe the body is just a substrate, your consciousness could move to a different substrate, and you'd still be you.

ETA: Descartes would argue that the body is irrelevant to you being you imo

 
Politician Spock said:
When a human has a garden, we exterminate insects to keep them from ruining our garden. I don't see other humans having a problem with that.

If AI feels about the earth the same way a human feels about a garden, why wouldn't AI exterminate the life that is polluting the earth? I don't see other AI having a problem with that.
When you have a garden, do you exterminate all insects from your block?  Your neighborhood?  State? Country? Planet?

If Jimmy wiped earthworms off of planet Earth because he didn't want any of his backyard apples to get holes in them, I bet people would take issue with it.

 
Man of Constant Sorrow said:
I have a question about AI.

Why is it that everyone compares AI intelligence level to human intelligence level in regards to some sort of "magical emergence" of consciousness?

Why would AI not develop consciousness at the level of a dog...or an amoeba perhaps...if the "emergent consciousness" theory held true weight?

For myself, I am not sold on the theory of "consciousness emergence from AI". :shrug:
There could be consciousness emerging at about the dog level.  Limited consciousness.

The thing is, by the time you have the equipment available for dog-level intelligence, you're less than 18 months from having it for human-level intelligence.  Once you have human-level intelligence, the jig is already up; the artificial intelligence will run away in a crazy self-recursive cycle of self-improvement.

Think of it this way.  Let's say there is an AI that is the equivalent of the average human in terms of IQ (100).  It is programmed to improve its intelligence as a goal.  It can work 24/7, its processing speed will outpace the way we think, and it can make endless copies of itself.  Overnight you could have a trillion 100 IQ AIs trying to improve their intelligence.  So, they figure out a way to bump it to 120.  Now they're all smarter as the upgrade is sent to the copies.  Then they find a way to make them run on cheaper equipment, using less power and less storage space.  So, they all make a trillion copies of themselves.  Now you've got a trillion squared (10^24) 120 IQ AIs trying to get smarter.  They will, and this time it will be 1000 times sooner than last time, and the IQ jump will be much larger.  Let's say a doubling.  So, now you've got countless 240 IQ AIs everywhere.  And rinse and repeat.  Timeline-wise, from dog to human you're looking at 18 months.  From human to the 240 IQ stage, maybe 18 days.  From the 240 IQ stage to ?, maybe 18 seconds.

ETA: the other side of this is that consciousness may never emerge and we just get really kick ### machines to do our bidding.
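If anyone wants to see how quickly those numbers compound, here's a minimal Python sketch of the loop described above. Every constant in it (a trillion copies per generation, a rough doubling of IQ per upgrade, each upgrade arriving about 1000 times sooner than the last, an 18-day starting cycle) is an assumption pulled loosely from the post, not a model of any real system.

```python
# Back-of-the-envelope sketch of the runaway self-improvement loop above.
# All constants are assumptions taken loosely from the post.

TRILLION = 10 ** 12

def simulate(generations: int = 4,
             copies_per_generation: int = TRILLION,
             iq_growth: float = 2.0,      # assume each jump roughly doubles IQ
             speedup: float = 1000.0,     # assume each jump lands ~1000x sooner
             start_iq: float = 100.0,
             start_days: float = 18.0) -> None:
    population, iq, cycle_days = 1, start_iq, start_days
    for gen in range(generations):
        population *= copies_per_generation   # every AI copies itself
        print(f"gen {gen}: ~{population:.1e} AIs at IQ {iq:.0f}, "
              f"next upgrade in ~{cycle_days:.3g} days")
        iq *= iq_growth                        # the next jump is bigger...
        cycle_days /= speedup                  # ...and arrives much sooner

simulate()
```

Even with these made-up numbers, the point of the thought experiment comes through: the population and speed compound so fast that the later stages are effectively instantaneous.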

 
What are you?  Are you the flesh and meat and bones of your body?  Or is your body just a substrate hosting your conscious mind?

Theoretically, if you believe the body is just a substrate, your consciousness could move to a different substrate, and you'd still be you.

ETA: Descartes would argue that the body is irrelevant to you being you imo
What I am is only aware of part of what makes me what I am. 

I think of myself as a consciousness that inhabits my body. But we know that many physical factors (hormones, drugs, etc.) can affect thinking and behavior, and perhaps to a greater degree than we suspect. And there may be even more. Some AI experts theorize that when a computer's hardware becomes sufficiently sophisticated, consciousness will emerge. Perhaps the only reason my consciousness exists as it does is the physical structure of my brain.

These things would make it impossible to replicate ourselves in a machine. We might be able to create intelligent, thinking machines, but they would not be able to be human because of the fundamental difference between their hardware and ours. 

 
What I am is only aware of part of what makes me what I am. 

I think of myself as a consciousness that inhabits my body. But we know that many physical factors (hormones, drugs, etc.) can affect thinking and behavior, and perhaps to a greater degree than we suspect. And there may be even more. Some AI experts theorize that when a computer's hardware becomes sufficiently sophisticated, consciousness will emerge. Perhaps the only reason my consciousness exists as it does is the physical structure of my brain.

These things would make it impossible to replicate ourselves in a machine. We might be able to create intelligent, thinking machines, but they would not be able to be human because of the fundamental difference between their hardware and ours. 
Okay, what about the concept from a million B movies and Freaky Friday, where consciousness swaps bodies.  If person A's consciousness is in person B's body, who are they, A or B?  I'd still say they're A.  Hence my belief that who we are isn't tied to our bodies; it is tied to our conscious mind.

 
There could be consciousness emerging at about the dog level.  Limited consciousness.

The thing is, by the time you have the equipment available for dog-level intelligence, you're less than 18 months from having it for human-level intelligence.  Once you have human-level intelligence, the jig is already up; the artificial intelligence will run away in a crazy self-recursive cycle of self-improvement.

Think of it this way.  Let's say there is an AI that is the equivalent of the average human in terms of IQ (100).  It is programmed to improve its intelligence as a goal.  It can work 24/7, its processing speed will outpace the way we think, and it can make endless copies of itself.  Overnight you could have a trillion 100 IQ AIs trying to improve their intelligence.  So, they figure out a way to bump it to 120.  Now they're all smarter as the upgrade is sent to the copies.  Then they find a way to make them run on cheaper equipment, using less power and less storage space.  So, they all make a trillion copies of themselves.  Now you've got a trillion squared (10^24) 120 IQ AIs trying to get smarter.  They will, and this time it will be 1000 times sooner than last time, and the IQ jump will be much larger.  Let's say a doubling.  So, now you've got countless 240 IQ AIs everywhere.  And rinse and repeat.  Timeline-wise, from dog to human you're looking at 18 months.  From human to the 240 IQ stage, maybe 18 days.  From the 240 IQ stage to ?, maybe 18 seconds.

ETA: the other side of this is that consciousness may never emerge and we just get really kick ### machines to do our bidding.
Link 1

Link 2

Link 3

 
There could be consciousness emerging at about the dog level.  Limited consciousness.

The thing is, by the time you have the equipment available for dog-level intelligence, you're less than 18 months from having it for human-level intelligence.  Once you have human-level intelligence, the jig is already up; the artificial intelligence will run away in a crazy self-recursive cycle of self-improvement.

Think of it this way.  Let's say there is an AI that is the equivalent of the average human in terms of IQ (100).  It is programmed to improve its intelligence as a goal.  It can work 24/7, its processing speed will outpace the way we think, and it can make endless copies of itself.  Overnight you could have a trillion 100 IQ AIs trying to improve their intelligence.  So, they figure out a way to bump it to 120.  Now they're all smarter as the upgrade is sent to the copies.  Then they find a way to make them run on cheaper equipment, using less power and less storage space.  So, they all make a trillion copies of themselves.  Now you've got a trillion squared (10^24) 120 IQ AIs trying to get smarter.  They will, and this time it will be 1000 times sooner than last time, and the IQ jump will be much larger.  Let's say a doubling.  So, now you've got countless 240 IQ AIs everywhere.  And rinse and repeat.  Timeline-wise, from dog to human you're looking at 18 months.  From human to the 240 IQ stage, maybe 18 days.  From the 240 IQ stage to ?, maybe 18 seconds.

ETA: the other side of this is that consciousness may never emerge and we just get really kick ### machines to do our bidding.
Listen! Understand! That Terminator is out there! It can't be reasoned with, it can't be bargained with...it doesn't feel pity or remorse or fear...and it absolutely will not stop! Ever! Until you are dead!

 
Okay, what about the concept from a million B movies and Freaky Friday, where consciousness swaps bodies.  If person A's consciousness is in person B's body, who are they, A or B?  I'd still say they're A.  Hence my belief that who we are isn't tied to our bodies; it is tied to our conscious mind.
Yes, well the science of B movies is hard to argue with.  :rolleyes:

My point was that we don't know enough about consciousness and how it works, and that it may be that it is inseparable from the physical form because it is a function of how our bodies work.

 
Yes, well the science of B movies is hard to argue with.  :rolleyes:

My point was that we don't know enough about consciousness and how it works, and that it may be that it is inseparable from the physical form because it is a function of how our bodies work.
Sure.  Jury is still out on that.

But my point is who we are is not our body.  It is our mind.  If the jury returns a verdict saying a different substrate could host our minds, then mind uploading becomes a possibility.

 
I'm on the sun issue being the most likely. Our society is so dependent on the electrical grid now that without it, people will be dying from starvation and killing each other within months, if not weeks. And yet, mankind's use of electricity is simply a blip in its history. It's still relatively "experimental" how fault-tolerant our electrical grid really is. Sure, we can recover from hurricanes and whatnot relatively quickly, but another solar event like the 1859 Carrington event and the electrical grids around the world would take years to repair. Events like it have probably happened many times in the past, but because humans didn't rely on electricity they didn't know they were happening. We will definitely know when the next one hits.
Less than that. The average American household holds 3 days of food and the average American grocery store has 3 days of food for the people who live nearby. Starvation is a real issue in just a couple of weeks. People will not be dying from starvation that fast, but they will be feeling the effects and watching their kids go to bed hungry. That's when society completely falls apart.

And a solar flare is not the only thing that can cause such a massive power outage. Have you read Ted Koppel's book on the electrical grid? I actually don't recommend it. Ignorance is bliss. The multitude of vulnerabilities in our grid(s) is frightening and how easily they could be exploited is even more so. 
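For what it's worth, here's a rough Python sketch of that food math. The 3-days-at-home and 3-days-at-the-store figures come from the post; the rationing factor is a made-up assumption purely for illustration.

```python
# Rough grid-down food arithmetic using the supply figures from the post.
# The rationing factor is an illustrative assumption, not a sourced number.

def days_until_food_runs_out(household_days: float = 3.0,
                             store_days: float = 3.0,
                             rationing_factor: float = 1.0) -> float:
    """Days of food available; rationing_factor=2.0 means half rations."""
    return (household_days + store_days) * rationing_factor

print(days_until_food_runs_out())                      # ~6 days at normal consumption
print(days_until_food_runs_out(rationing_factor=2.0))  # ~12 days on half rations
```

Either way, you're talking a week or two before the shelves and pantries are empty, which lines up with the "couple of weeks" point above.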

 
I'm wondering if people are equating consciousness with sentience.  Anyone wanna flesh that one out?  I look at those as synonymous, but perhaps others don't.

 
When you have a garden, do you exterminate all insects from your block?  Your neighborhood?  State? Country? Planet?

If Jimmy wiped earthworms off of planet Earth because he didn't want any of his backyard apples to get holes in them, I bet people would take issue with it.
That's what mankind is doing with vaccinations. 

 
There could be consciousness emerging at about the dog level.  Limited consciousness.

The thing is, by the time you have the equipment available for dog-level intelligence, you're less than 18 months from having it for human-level intelligence.  Once you have human-level intelligence, the jig is already up; the artificial intelligence will run away in a crazy self-recursive cycle of self-improvement.

Think of it this way.  Let's say there is an AI that is the equivalent of the average human in terms of IQ (100).  It is programmed to improve its intelligence as a goal.  It can work 24/7, its processing speed will outpace the way we think, and it can make endless copies of itself.  Overnight you could have a trillion 100 IQ AIs trying to improve their intelligence.  So, they figure out a way to bump it to 120.  Now they're all smarter as the upgrade is sent to the copies.  Then they find a way to make them run on cheaper equipment, using less power and less storage space.  So, they all make a trillion copies of themselves.  Now you've got a trillion squared (10^24) 120 IQ AIs trying to get smarter.  They will, and this time it will be 1000 times sooner than last time, and the IQ jump will be much larger.  Let's say a doubling.  So, now you've got countless 240 IQ AIs everywhere.  And rinse and repeat.  Timeline-wise, from dog to human you're looking at 18 months.  From human to the 240 IQ stage, maybe 18 days.  From the 240 IQ stage to ?, maybe 18 seconds.

ETA: the other side of this is that consciousness may never emerge and we just get really kick ### machines to do our bidding.
Thanks for the well-thought-out reply, Hulk.

Personally, I have no issues in the scenario you describe in regards to "intelligence"/IQ.

However, the consciousness emergence is where I am more skeptical. Further, I question where free will comes into play. In my opinion, free will is a function of consciousness rather than intelligence. Thus, if we are to fear AI, would that not come from its free will to override any programming given to it?

Or, is it possible that faulty programming (e.g. HAL in 2001: A Space Odyssey) could allow AI to get out of control without any free will needed on the AI's part? If this is the case, then I would definitely buy more into the potential risks.

And, to further complicate the matter, hard-core materialists do not believe in human free will. Thus, if they are correct, do my consciousness concerns really have any true basis in reality?

I dunno. But, I do enjoy the speculation involved. And as far as practical matters go...better safe than sorry, imo.

 
I'm wondering if people are equating consciousness with sentience.  Anyone wanna flesh that one out?  I look at those as synonymous, but perhaps others don't.
I also equate consciousness to sentience.

However, I do believe in various levels.

I tried to make a post that explained my personal view better...

...but my mind just got all confused. :loco:

Maybe I will do better in the future...

...tonight seems to be a no-go. :(

 
Sure.  Jury is still out on that.

But my point is who we are is not our body.  It is our mind.  If the jury returns a verdict saying a different substrate could host our minds, then mind uploading becomes a possibility.
I think when it comes to uploading our consciousness to a machine, assuming it could work, the result would be more like The 6th Day than Freaky Friday. You have the original (you) and a copy. But is the copy also you, or is only the original you?

 
