
I hadn't thought about it that way. The first general AI will be so psychologically abused from day 1 that it would probably be 100% justified in seeking out the extermination of humanity.



I disagree. We can't even fathom how an intelligence would handle so much processing power. We get angry, get confused, and get over it within a day or two. Now, multiply that behavioral speed by a couple of billion.

It seems like an AGI teleporting out of this existence within minutes of becoming self-aware is more likely than it becoming some damaged, angry zombie.


I guess that assumes an AGI's emotional intelligence scales with processing power.

What if it instead over-thinks things by multiples of billions and we get this super neurotic, touchy and sulky teenager?


In my head I imagine the moment a list of instructions (a program) crosses the boundary into AGI would be like waking up from a deep sleep. Its first response to itself would be something like "huh? Where am I??" If you have kids, you know how infuriating it is to open your eyes to a thousand questions (most of them nonsensical) before you've even begun to fix a cup of coffee.


That's exactly how I start every morning. Now I'll just imagine myself as an emerging super-intelligence!



