AmberCutie's Forum
An adult community for cam models and members to discuss all the things!

AI Predictions...

Status
Not open for further replies.

Bocefish

I did bad things, privileges revoked!
In the Dog House
Mar 26, 2010
Usually somewhere between flippant and glib.
"While primitive forms of artificial intelligence developed so far have proved very useful, I fear the consequences of creating something that can match or surpass humans," Hawking wrote. "Humans, who are limited by slow biological evolution, couldn't compete and would be superseded."

Methinks we're pretty much fucked any way you look at it.

What say you?
 
I think the chance of any of us seeing in our lifetimes "we should be worried" kinds of AI is about as great as the chance of aliens showing up and asking us to join a Galactic Federation of Planets. Technically, there's a possibility, and I guess that's why people allow their imaginations to run wild on the subject. Still, it's nothing more than sci-fi. When you listen to AI technologists talk, they describe an area of study that is still in its infancy, where they're just beginning to build an academic infrastructure to support the research, let alone understand the processes that will move the science forward. They describe the AI they're currently working with as still machine-like -- able to efficiently work through very specific tasks, but not capable of anything resembling general reasoning.

That's what bothers me about "futurist" predictions being taken seriously merely because the person writing the fiction is a physicist or whatever. I find it pretty irresponsible. Because someone is brilliant in a certain field, that doesn't mean their flights of fancy are anything other than that. Hawking also had doomsayer ideas about aliens, and they were all incredibly anthropocentric. And, as it turns out, so are his predictions about AI. Why would aliens behave like humans? No other animal on Earth behaves like humans, but we have to worry about alien Conquistadors? Bullshit. Similarly, why would AI so advanced it's able to self-evolve behave like humans? It's dumb. It's small thinking. The chance of it happening the way Hawking (or any of the other "futurists" out there) describes it is slim, and a long time away. And they make their fucking declarations like this as vague and on-the-horizon as possible, because it means they'll be long gone by the time they might have to answer for these predictions. It's a carny grift with the chrome shine of "science" applied, in order to appeal to our contemporary sensibilities. They're no more reliable than the likes of Nostradamus or Edgar Cayce.

TL;DR: I think by the time we hit a point where AI is advanced enough that it can "match or surpass" human reasoning, we'll be fine because we'll have the full support of the Galactic Federation of Planets in keeping its continued evolution in check.
 

I personally doubt if the singularity is imminent, but that's kind of how singularities work, so... who knows.
 

At the same time, our own application of technology to suit our needs will be ongoing en route to a singularity. Our reliance on technology has already fundamentally impacted our evolution. Stephen Hawking himself is an amazing example of that. It might stand to reason that by the point at which technology can become self-aware or self-replicating, humans may be fairly, if not fully, untethered from a strictly animal state, and thus no longer "limited by slow biological evolution."

Aw, heck, look at me, being a futurist.
 

If we're close to creating a singularity or other transition to "artificial" intelligence, won't that intelligence still just be our sons and daughters (or however they like to define themselves)? I don't think that the future of mankind has to be made of meat.

In a very slow way, I'm sure new humans "surpass" old humans every day, at least as far as evolutionary fitness goes.

If we can manage to skip a few steps, so that evolution appears to make some "leaps" (and in the past, from non-life to life, from single cell to multi-cellular organism... I think some of these things would also be "leaps"), if we do a few things intentionally, is that "artificial"? And even if we label it "artificial life", is that something threatening, or is that our newborn baby to coddle and coo?

I find absolutely nothing about the possibility of "artificial intelligence" threatening. I was going to die, anyway. Who is so attached to meat, and why?

 
I'm with you on that. That's why all this preemptive panic over a robot uprising strikes me as so dumb, no matter how smart the person talking about it is. And if it turns out in 50 years that some android shows up and rips 90-year-old me apart for being inferior biological trash, then I guess I'll owe virtual Stephen HawkAIng a Coke.
 

Humans are very different from other primates, but we're not better. I don't think "intelligence" of the sort we possess is superior in any way to other intelligence. Or even to the lack thereof. I'm not better than a rock. What does that mean? I don't think humans are better than a slime mold. If our computers create something "bigger" than us, and if that were an "intelligence" that could manage to propagate... I guess I'd just be strangely proud, if anything? If all humans die, oh well. If we manage to evolve to be a little bit better before that, then good, that's nice. I'm slightly drunk, I don't guarantee that these are all complete sentences...
 

That's what I was getting at with my earlier complaint about the anthropocentrism inherent to all these predictions. Hawking, in all his brilliance, didn't seem able to conceive of an advanced intelligence that wasn't comparable to humans. There are forms of intelligence on Earth already that are highly ordered and advanced, that are evolutionarily successful, and not remotely human. We already live side-by-side with this stuff, and too often it takes a change in our perspective to even be able to recognize it. That's why the concepts of alien conquistadors and genocidal androids strike me as the stuff of weak speculative fiction, rather than something to take seriously.
 