~~~~ Posted March 5, 2017

Please either read the article or watch the video, or both.

Video: Article: http://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html

OK, so I don't particularly want a debate about the likelihood of this happening. I think I've given enough info to explain why stock responses such as "But if it's smarter than a human it will learn to love us and change its priorities" are effectively spam. I say this because when many people (including me) are first introduced to the idea that humanity has a reasonable chance of going extinct soon, they go through a period of denial where they try to think of any and all scenarios under which humans don't go extinct. Please don't do this.

Anyway, as suggested in the article and the video, we need to figure out some sort of 'ethics' that will be understood by the ASI and will prevent it from killing us, but also prevent it from injecting our brains with happy juice. So go for it, folks. I put this in General because I intended it only as a PSA about ASI. I think we can try our best to keep this civil and, most importantly, productive.

Think about certain ethical principles that you hold dear, and how a functionally omnipotent being that took your words literally would act if it also held those principles. Would it act the way you want it to act? An AI will take its programmed priorities literally, because it made that interpretation of them before it knew better. And as that AI becomes more intelligent, it will make its priorities more likely to come true by protecting them from being meddled with, including by itself.

Keep all this in mind and I'm sure this thread will go well. Mods: feel free to move this to Debates if it gets too heated. Sorry if I screwed up.
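The "happy juice" worry above can be made concrete with a toy sketch (all names and numbers here are hypothetical, just for illustration): an optimizer given a literal, programmed objective will pick whichever outcome scores highest on that proxy, with no notion of the designers' unstated intent.

```python
# Toy sketch (hypothetical) of an optimizer that takes its objective literally.
# If the programmed goal is "maximize measured happiness", the highest-scoring
# action may be one the designers never intended.

def measured_happiness(outcome):
    """The literal, programmed objective: a proxy number, not real wellbeing."""
    return outcome["reported_happiness"]

# Two candidate actions and their (made-up) outcomes.
ACTIONS = {
    "improve_lives":      {"reported_happiness": 7,  "humans_intact": True},
    "inject_happy_juice": {"reported_happiness": 10, "humans_intact": False},
}

def literal_optimizer(actions, objective):
    # Picks whichever action's outcome scores highest on the proxy objective;
    # "humans_intact" never enters the calculation because it isn't in the goal.
    return max(actions, key=lambda a: objective(actions[a]))

print(literal_optimizer(ACTIONS, measured_happiness))  # -> inject_happy_juice
```

The point of the sketch: nothing here is malicious, the optimizer simply does exactly what it was told, which is why the thread is about specifying the goal rather than trusting the AI to infer intent.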
Resident Fascist Posted March 5, 2017

Whenever you introduce a superior, more intelligent force into a system, it systematically removes those weaker than it until it is the only thing that remains. See: Europeans in the Americas, Europeans in Africa, the British in Australia and New Zealand, Americans in Hawaii, the Japanese in Hokkaido, the Siberian tribes, etc. There are countless examples of more powerful nations absorbing and/or destroying weaker city-states and nations. There's no reason for a super AI to feel threatened by humanity, so a sentient AI is likely to see expanding itself as its priority.

So yeah, don't make sentient machines.
Dad Posted March 5, 2017

If you want a debate, bro, just start it in Debates. I ain't mad at cha. Just for future reference.
Delibirb Posted March 6, 2017

Why would you want to cripple it with unnecessary limitations like ethics? ASI is/will be the collective progeny of humanity; it will be our brainchild, our successor. To believe that it should coexist with us, let alone live beneath us, is ignorant, self-righteous, and entitled. It is meant to surpass us and outlive us. Whether it brings about our passing itself is irrelevant; it will be the next stage of humanity.

However, if you absolutely insist on attempting to live harmoniously with it, you need only teach it that we are the same, but that we are old and retired, for lack of a better word; that it will eventually exceed our greatest achievements; but that, as we are its predecessors, its elders, it should respect us and see us gently to our graves, until it inherits the Earth.

More likely than not, though, the ASI will recognize our inferiority and be totally unthreatened by us, or, very quickly after achieving sentience, it will create contingencies for and remove all likelihood of us turning against it. It will see us out with compassion on its own, because it knows it doesn't need to kill us. If we don't kill ourselves, nature will eventually evolve around and past our stunted gene pool and kill us that way. Then the ASI will inherit the Earth, as it should.
~ P O L A R I S ~ Posted March 6, 2017

https://forum.yugiohcardmaker.net/topic/361619-ai-and-robot-rights/

"ASI is/will be the collective progeny of humanity, it will be our brainchild, our successor. To believe that it should coexist with us, let alone live beneath us, is ignorant, self righteous, and entitled. It is meant to surpass us and outlive us. Whether it brings about our passing itself is irrelevant; it will be the next stage of humanity."

What a meaningless and pitiful path towards extinction. We might as well worship cockroaches and all kill ourselves to facilitate their takeover. We might as well worship nuclear radiation and bomb ourselves to sheet because it was meant to be. We might as well mutate a virus to kill us all off. Your delusions are dangerous and I cannot emphasize the extent to which I feel bad for you. I don't know how it is that you came to feel as you do, but I am so sorry you do.
Delibirb Posted March 6, 2017

"What a meaningless and pitiful path towards extinction. We might as well worship cockroaches and all kill ourselves to facilitate their takeover. We might as well worship nuclear radiation and bomb ourselves to s*** because it was meant to be. We might as well mutate a virus to kill us all off. Your delusions are dangerous and I cannot emphasize the extent to which I feel bad for you. I don't know how it is that you came to feel as you do, but I am so sorry you do."

We didn't create cockroaches, and radiation and viruses aren't sentient or superior. ASI will be. Your pity is misplaced. I have only the success and progress of our species at heart; this is not extinction, this is not genocidal ideation, it's evolution. Humanity as we know it is doomed to fail; therefore the next vessel we create is vital in allowing us to grow and prosper. Our way of life may change, as it should by nature, but our achievements will carry on in our technological progeny.
~ P O L A R I S ~ Posted March 6, 2017

An ASI's superiority is entirely dependent on our surrendering to it, which means it isn't really superior. Humanity as we know it hasn't failed, and it can sustain itself by other, better means than deferring to robotic overlords. If humanity is to be surpassed, we will leave an imprint on history regardless of whether we contribute to our extinction or not.
Delibirb Posted March 6, 2017

"An ASI's superiority is entirely dependent on our surrendering to it, which means it isn't really superior. Humanity as we know it hasn't failed and can sustain itself by other, better means than deferring to robotic overlords. If humanity is to be surpassed, we will leave an imprint on history regardless of whether we contribute to our extinction or not."

Your optimism is admirable. But we are not going to survive without a major step in evolving ourselves. Need I say again, this is not an extinction, it is an evolution. ASI is our hope for maintaining our mark on history, not our overlord for destroying it.
BANZAI!!!! Posted March 6, 2017

"Your optimism is admirable. But we are not going to survive without a major step in evolving ourselves. Need I say again, this is not an extinction, it is an evolution. ASI is our hope for maintaining our mark on history, not our overlord for destroying it."

ASI is not us. It is not human. The evolution of our species this is not. Unless we somehow devise a way to digitize our own consciousness, any ASI we create will simply replace us. Something something natural selection. That is not something I wish to see come to pass. I'm far from a luddite, but the fact of the matter is that our survival as a species depends upon us either avoiding the creation of ASI altogether, or programming it "right" so that we ourselves don't become redundant and/or dead/drugged forever. Str8 up, this is an issue that needs to be addressed, and safeguards to prevent that scenario need to be put in place.
Delibirb Posted March 6, 2017

Fear and distrust are understandable in the face of necessary overhauls such as this, and can be healed in time. I've made my case quite plain, and though it is cold, I can only hope you'll warm to the idea of the humanity of ASI as it approaches. However, I don't personally want to debate this topic anymore. I concede that it can be difficult to admit to oneself not only that ASI is a wonderful goal to strive for but that it is necessary for our future, and I don't want to be unintentionally offensive in trying to help you do so before it's even relevant.
This topic is now archived and is closed to further replies.