April 13, 2015 Due April 20, 2015
Problem solved. If you posted before it's here, no need to redo. Too much traffic in a one-hour period Monday night, so you all got marked as spam. Never happened to me before; guess 79 people never tried to post in an hour before lol….
Welcome back from Spring Break. The good news is that it is less than 2 months until graduation – the bad news is you still have lots of school work to do before then. This week’s topic is the singularity. Read the following articles, in order, to understand the concept. Then, answer the questions (citing ALL sources of course – and read all pages of each site, because this topic is very relevant not only to your future but to the future of mankind). Questions – what does the singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else?
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
- 99 comments
- Posted under Uncategorized
Permalink # dylanrileymjm said
As nice as it would be to have robot overlords, because let’s face it, people suck, it’s insanely far-fetched. Even though I don’t work with artificial intelligence, I personally don’t think the singularity will ever be a problem in our lives for many, many years. Also, even if we do reach the point of singularity, it doesn’t mean we’ll all turn into people straight out of The Matrix, although it would be really cool to have Keanu Reeves back from when his movies weren’t completely god-awful.
The first article just talks about the singularity. Nothing terribly special, except that it seems pretty hellbent on the idea that the singularity means robots making humans slaves. (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm)
The second article features five very well-respected people who say the singularity is much closer than we could ever imagine. The problem with this article is that just because a lot of very intelligent people say something, it doesn’t necessarily mean their opinion is 100% accurate. Stephen Hawking thought humans would be living as one with aliens by now (with us knowing that they’re aliens), and there obviously aren’t any known aliens we live with. (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
The third article discusses the possibilities of what will happen during the singularity. This article probably lines up best with my ideals, even though the last part of it is about robot overlords. I guess Noblitt is learning how to include articles that don’t agree with his personal beliefs. GG, Noblitt. (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
The fourth article is about Elon Musk, shockingly enough, saying that robots are going to take us over one day. Well, that the singularity is very close; it’s technically the same thing. This article talks about how fast DeepMind has been developing AI, but when you go to their website and try to find information about their progress, it’s mostly just about how they made the AI make decisions. Not exactly taking over the world yet. (http://deepmind.com/publications.html) (http://mashable.com/2014/11/17/elon-musk-singularity/)
Bottom line, robots taking over the world is pretty idiotic. Unfortunately, we won’t be able to enjoy Christmas with our robot overlords like Jonathan Coulton says we will. [Props to anyone who knows his music]
Permalink # gingerrrmaddie722 said
Singularity is “a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand.” (http://mashable.com/2014/11/17/elon-musk-singularity/) I don’t think I like the singularity. Technological advances are cool, but I don’t want everything to be computerized. Even though Google is much faster and more convenient, the nerd in me is still just as content opening a book when I need information. In some cases there are computerized countertops that can do things for you, as if you aren’t capable of moving your own hands and doing them yourself. I don’t know much about that, though. I think the singularity takes away our humanity completely. We might not be dead, but what’s the point of living if we aren’t even thinking for ourselves or doing things on our own?
“While AI can trounce the best chess or Jeopardy player and do other specialized tasks, it’s still light-years behind the average 7-year-old in terms of common sense, vision, language and intuition about how the physical world works, Davis said.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) Why do we want to teach/program a computer to do something we are perfectly capable of doing ourselves? Sometimes I feel like we spend so much time anticipating and trying to develop “the next big thing” that there’s no way we can ever learn to appreciate what we have. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) Stephen Hawking suggests that humans could cease to exist with these types of advances. I think that could be true based on how quickly computers respond… most of the time.
“Even if we find a way to build transistors on a scale of just a few nanometers, they wouldn’t necessarily work.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm) New technology annoys me in general for this very reason… we have iPads so small they might as well be phones and other times we have iPhones and Galaxy phones that are so big they should just be tablets. I obviously don’t think we should go back to mega-dinosaur computers, but do we REALLY need Apple watches?
I can’t remember which source or I’d cite it, but I think they’re wrong for comparing this to the Agricultural and Industrial Revolutions. Those were different situations: they were pretty much necessary for sustaining human life, and they were good for the economy. The singularity would boost the economy because everyone has to have the latest and greatest technology, BUT if this technology is just going to be robots that wipe out the human race, does it really matter that it would boost the economy? Those revolutions allowed us to be more efficient with farming and the production of consumer goods, for the most part. This type of revolution is literally allowing us to be as lazy as possible, which in itself could kill us for biological reasons, like literally never moving a muscle.
I don’t see singularity as the beginning of a “golden age” but I don’t exactly see it as the end of the human race in other ways… Honestly I just see it as unnecessary.
Permalink # imanifaaruwq said
What does singularity mean for us? “Vinge proposes that mankind is heading toward an irrevocable destiny in which we will evolve beyond our understanding through the use of technology. He calls it the singularity.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm) As Vinge proposes, the singularity means technology surpassing our intelligence. By giving computers a conscience, they become not just equal to but greater than the human race: they can learn faster than humans, they are stronger and cannot be wounded in the same ways, and they are developing the mental perception to react in human-like ways to situations that, in the past, required human interaction to help them understand.
“This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources. Or it could lead the artificial intelligence, or AI, to exterminate any creatures it views as competitors for control of the Earth—that would be us.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) I agree with the statements made in this article by the writers and philosophers: creating a race that starts as our mental equal but is physically stronger, and then voluntarily giving it the power to develop until it mentally overpowers us, is a death sentence. Giving technology the conscious ability to grow in a human-like way is like starting your child in a sport at an earlier age than you did. Yes, at first you’re going to be more skilled than them as they go through the learning process, but eventually they will become a bigger and better athlete than you were at their age, because they were not only handed your knowledge of the game but also given additional time and resources to become bigger and badder, just as we are now giving technology.
Will it be a “golden age” for humanity, the elimination of humanity, or something else? “…we’re barreling toward a future that doesn’t take people into account. For instance, self-driving cars could improve safety, but also put millions of truck drivers out of work,” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) In the article, there were different opinions on whether robots will erase the human race, one side believing it and the other not. I take the side of the believers, for the simple fact that, as the quote says, robots will be able to take the place of humans in numerous jobs, causing poverty among people who cannot afford to live alongside robot services. As robots grow intellectually stronger, they will see there is no need for the human race, which just uses them for service and to do all the work that feeds its lazy habits. As technology rapidly grows, the human race is unconsciously becoming inferior. “The risk of something seriously dangerous happening is in the five year time-frame. 10 years at most,” wrote Musk. “Please note that I am normally super pro technology and have never raised this issue until recent months. This is not a case of crying wolf about something I don’t understand.” (http://mashable.com/2014/11/17/elon-musk-singularity/)
Permalink # dylanrileymjm said
Since WordPress thinks it’s hilarious for me to write a decent blog post and then delete it when I try to post it, I have to rewrite. It just makes me so excited to redo a blog post that won’t be nearly as well written. The singularity: I personally think it’s an insanely idiotic idea. As cool as it would be to have robot overlords, because let’s face it, people are horrible, it’s so incredibly far-fetched and ridiculous that it physically hurts me to think that people actually believe we’ll be overtaken by robots. I like The Matrix as much as the next person, but I think it’s just dumb.
The first article gives the definition of the, well, a skewed definition of the singularity. It says the singularity is robots taking over the world and enslaving humans. Still a pretty stupid idea. (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm)
The second article is six very intelligent and well-respected people talking about how they all believe robots will take us over one day. The problem with this is that just because smart people say something, it doesn’t mean it’s true. Harrison Ford said the Bush administration was a good one, but that’s obviously a lie. Stephen Hawking said we would be living as one with aliens, knowing that they’re aliens. I personally don’t know any aliens, unless you count illegal immigrants, but that’s an argument for another time. (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
The third article says that scientists believe the singularity will happen within the 21st century. It also gives a partial argument for the other side, who don’t believe in the singularity. One point it brings up is that robots have no physical intuition. If someone sees a cup of water falling to the floor, we know we’re going to end up with water everywhere, whereas a robot would have to run countless simulations, full of measurements of everything, just to know that water will end up on the floor. So, good job, Noblitt. You gave a somewhat non-biased article for once. Keep it up. (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
The last article is about Elon Musk saying the singularity is close. One company he brings up is DeepMind, an AI development company, and its overwhelming accomplishments. However, if you read their publications (http://deepmind.com/publications.html), you find that all there really is are documents stating theories of what robots could do based on psychological processes in humans. So, unless they keep all of their results under lock, there isn’t much proof. Of course, you’d have to be doing some pretty amazing work to have Google as a business partner. (http://mashable.com/2014/11/17/elon-musk-singularity/)
Bottom line, the singularity is pretty dumb. Of course, living life like a Jonathan Coulton song would be pretty awesome. (Props to anyone who knows about JoCo.)
Permalink # ethanbullard said
“We’ve covered Musk’s recent comments about the potential dangers of A.I. before, when he variously compared the threat to nuclear weapons and a “demon” summoned by humanity.” (http://mashable.com/2014/11/17/elon-musk-singularity/)
Musk’s concerns over artificial intelligence make sense to me, because it should always be a concern that we are messing with something that may eventually get out of our control. I’ve seen I, Robot and I did not like the way it went. First, we use robots for household stuff; then someone gets the idea that it’d be cool to have a robot around as a lover. We develop a robot with the capability to comprehend and display emotions, and then bam! We have a crazy, abusive robot lover. But seriously, I think we would get carried away with trying to mimic the human body and go about creating something that we will not be able to contain.
“The past two singularities — the Agricultural and Industrial revolutions — led to a doubling in economic productivity every 1,000 and 15 years, respectively, said Robin Hanson, an economist at George Mason University in Washington, D.C., who is writing a book about the future singularity. But once machines become as smart as men, the economy will double every week or month.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
I think that for a period of time we will experience a golden age. Eventually, we will become the “mitochondria” and sit around while other things engulf our lives. Our reliance on robots will make us so dependent on technology that we won’t be able to do the simplest things. It’s a scary thought that we could eventually have computers making decisions for us. This free time, however, would give us a ton of time to work on the arts. We’ll have an immense amount of time to design things we’ve never imagined.
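Hanson’s doubling-time comparison is easy to quantify with a little arithmetic. Here is a minimal Python sketch (my own illustration, not from any of the articles; the numbers are purely hypothetical) showing why “doubling every month” is so staggeringly different from the Industrial era’s 15-year doubling:

```python
# Illustrative only: growth implied by a fixed doubling time.
# Industrial-era doubling (~15 years) vs. hypothetical post-singularity
# doubling (~1 month), compared over the same 15-year span.

def growth_factor(years: float, doubling_time_years: float) -> float:
    """Total multiplication of output after `years`, given a fixed doubling time."""
    return 2 ** (years / doubling_time_years)

industrial = growth_factor(15, 15)          # doubles once over 15 years
post_singularity = growth_factor(15, 1 / 12)  # doubles every month: 2**180

print(f"Industrial-era growth over 15 years: {industrial:.0f}x")
print(f"Monthly-doubling growth over 15 years: {post_singularity:.2e}x")
```

On those assumptions, 15 years of monthly doubling is 180 doublings — a growth factor around 10^54, which is the scale of difference Hanson is pointing at.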
““The development of full artificial intelligence could spell the end of the human race,” the world-renowned physicist told the BBC. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
Stephen Hawking and Elon Musk are two very intelligent people who share the same views on artificial intelligence. It seems like we could take their word for it when they say it’s not going to end well. I believe artificial intelligence will take us far, until we start measuring it against humans. For instance, we should make machines that produce things better. We shouldn’t create robots that have the capacity to think like humans or even surpass humans. Musk refers to this as summoning the demon. Either way, it’s a scary way to look at things.
“But right now, these machines have to answer to humans. They lack the ability to make decisions outside of their programming or use intuition. Without self-awareness and the ability to extrapolate based on available information, machines remain tools.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm)
This quote is much like what I was saying earlier. As long as we don’t create emotional and intuitive robots, we won’t have to worry about an apocalypse. Flipping a switch and using a controller would be fine with me. This topic is awkward to approach, because you can be either for or against the singularity, and there are clear reasons why one would be either. I think it’s scary, but I honestly cannot see a robot apocalypse happening any time soon. I will prep my kids to take on the Terminator when it comes time to do so.
Permalink # suarez05j said
“When machines reach the point where they can repair themselves and even create better versions of themselves, could they come to the conclusion that humans are not only unnecessary, but also unwanted?”
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm
According to the article cited above, many scientists believe in the possibility of creating a robot/computer that thinks and performs tasks like a human would, only better. To us, the singularity means one day reaching the point where humans will not be needed and computer technology will take over. Is it possible? It seems like we are heading in that direction, relying more on computer technology than ever. The other day a spokesperson from Gaston came to talk to our first period about how the field of medicine has used computer technology to diagnose and improve the state of patients in a faster time period. We now make CDs on which you can see tumors recorded by doctors, along with their comments, and view them from your own house. Robotic surgery is also being performed; although it is very expensive, it is happening.
“The development of full artificial intelligence could spell the end of the human race,” the world-renowned physicist told the BBC. “It would take off on its own and re-design itself at an ever increasing rate.”
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
According to Stephen Hawking, if we keep heading toward complete artificial intelligence through the use of computer technology, it could lead to complete human elimination. I see us heading in that direction, but I simply can’t picture humanity ending with big giant robots killing us. It seems too fictional for me to believe in and fear that outcome.
“Humans have already relinquished many intelligent tasks, such as the ability to write, navigate, memorize facts or do calculations”
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
“The pace of progress in artificial intelligence is incredibly fast,” said Musk.
http://mashable.com/2014/11/17/elon-musk-singularity/
I don’t believe we will be taken over by big robots, but I do believe technology will keep improving and computer technology will keep making our lives easier. The defect of imperfection causes humans to relinquish a lot of tasks. Of course, robots/computer technology will always be faster than us and get things done quicker, but the people behind the computers and robots are humans. Humans are the ones who program these things and increase artificial intelligence.
Permalink # zantrum17 said
If you know me, then you probably know that I’m a huge fan of the TV show Person of Interest. It’s about a guy who builds a supercomputer that can see through cell phones, webcams, street cameras, etc., and he sells it to the government. He also programs it to think and have human-like emotions. At some point he realizes that what he did was probably not the best thing to do, because many people’s lives have been taken due to the actions of this machine. Later on, a technology company builds a similar machine called “Samaritan” with the sole purpose of destroying the other machine and the five people who work with it. Often, characters in the series refer to this battle as “two gods going to war.” I could get really into this, but I don’t think anyone cares that much. The point is that with technology evolving at the speed it is, there is a very good chance that AI could basically take over the society we live in and change it however it pleases.
I think it’s very possible for technology to render humans irrelevant at some point. If computers get to the point where they can update and improve themselves on their own, then they could deem humans irrelevant and try to end humanity as we know it. They could just view us as a pest that doesn’t benefit them in any way, even though we would benefit from them.
“But what if the machines see humans as redundant — or worse? When machines reach the point where they can repair themselves and even create better versions of themselves, could they come to the conclusion that humans are not only unnecessary, but also unwanted?”
Even though intelligent computers can evolve on their own, I don’t think they will ever be able to completely see and understand the world the way humans do. Things like senses and emotions come naturally to us, while computers have to be programmed, so to a certain extent they can only do what the programmer allows them to do. If computers are limited in what they can do, it’s also very possible for society to stay the same or benefit from the use of AI.
“While AI can trounce the best chess or Jeopardy player and do other specialized tasks, it’s still light-years behind the average 7-year-old in terms of common sense, vision, language and intuition about how the physical world works, Davis said.”
http://mashable.com/2014/11/17/elon-musk-singularity/
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
Permalink # taylorwatkins said
What does singularity mean for us?
“When machines reach the point where they can repair themselves and even create better versions of themselves, could they come to the conclusion that humans are not only unnecessary, but also unwanted?” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm)
Once machines have developed enough to fix themselves and realize that we humans have nothing more to give them, they could get rid of us. By this time, machines would probably be smarter than the average human and could easily get rid of the population.
Will it be a “golden age” for humanity, the elimination of humanity, or something else?
“[The singularity] could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources. Or it could lead the artificial intelligence, or AI, to exterminate any creatures it views as competitors for control of the Earth—that would be us.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
“”A society or economy made primarily of robots will not fear destroying nature in the same way that we should fear destroying nature,” Hanson said.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
“The risk of something seriously dangerous happening is in the five year time-frame. 10 years at most,” (http://mashable.com/2014/11/17/elon-musk-singularity/)
The singularity would more than likely mean the elimination of humanity, sooner than we think. If the robots thought we were a threat to them, they would just get rid of us. They would also end up destroying nature, which would end up destroying us by taking away valuable resources.
It could also bring a “golden age” for humanity by letting robots do all of the hard work that nobody likes to do while we sit around and relax. This would be fine, but after so long, wouldn’t you get bored? We won’t really know what’s going to happen until it does, because we have never had anything like this happen before, and it could happen sooner than we think.
Permalink # chrisbonilla0430 said
Singularity is the idea that computers will one day have the capacity to work like, and be faster than, human brains. Different scientists and researchers say different things about the singularity; they all have different ideas on how it would actually happen and when it would occur. “Scientists could develop advancements in artificial intelligence (AI). Computer networks might somehow become self-aware. Computer/human interfaces become so advanced that humans essentially evolve into a new species. Biological science advancements allow humans to physically engineer human intelligence.”
(http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm) To be honest, this sounds more like the sci-fi movie Transcendence, with Johnny Depp. I can see how we’re expanding our use of computers and relying on them more each day; I just can’t see the whole “getting to the point of taking over and controlling humans.” “Bostrom argues that once machines surpass human intellect, they could mobilize and decide to eradicate humans extremely quickly using any number of strategies (deploying unseen pathogens, recruiting humans to their side or simple brute force).” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) This article was saying that the singularity would mean the extinction of humans, caused by computers. If that were to happen, it would definitely not be a “golden age” for humanity.
I like this idea: “others think humans will eventually relinquish most of their abilities and gradually become absorbed into artificial intelligence (AI)-based organisms, much like the energy-making machinery in our own cells.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) I guess it’s a type of singularity, but we would still be involved. We’d just be half computer. If this were to happen, it would most likely be the “golden age” for humans. It would allow us to have unlimited information and to perform certain tasks we couldn’t before. But before any of this can actually happen, we have to actually build the machine to do the work. Elon Musk says, “They recognize the danger, but believe that they can shape and control the digital superintelligences and prevent bad ones from escaping into the Internet.” (http://mashable.com/2014/11/17/elon-musk-singularity/) So they already know the trouble these machines might bring, but they are going to try to build a safety net for these situations. I feel like they can try, but at the end of the day, if the machine is smarter than humans, I’m sure it’ll be able to bypass it.
Permalink # matthewmerrington said
Singularity is the point when technology evolves past our understanding and can make decisions for itself. (electronics.howstuffworks.com) For us, singularity would most certainly mean the extinction of humans. Many famous and very intelligent people, including Stephen Hawking, agree that singularity will be the death of humans. (time.com) Some scientists say this point is only 16 years away, when we will have robot overlords. (www.livescience.com) Others believe it could be only 5 years or so, but either way singularity would mean the elimination of humanity. (mashable.com)
Permalink # christopherdavis429 said
Singularity can be defined as “the point at which artificial intelligence can match, and then overtake, human smarts.”
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
I wouldn’t go as far as to say it’s the end of humankind, but it is and will be a key factor in unemployment for human beings. For instance, self-driving cars could improve safety, but also put millions of truck drivers out of work.
http://mashable.com/2014/11/17/elon-musk-singularity/
“A society or economy made primarily of robots will not fear destroying nature in the same way that we should fear destroying nature.” This can be considered very dangerous and destructive. Although scientists say robots will become self-aware, they’ll only become aware of the factors that affect them most. The things that are essential for humans to survive are not essential to robots at all, which means a robot doesn’t have to eat or rely on oxygen to breathe. That makes robots more likely than humans to survive an apocalyptic event.
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
Some of the smartest people in the world believe that this possibility is closer than expected, while many others disagree. One way to prevent the singularity includes building in safety features before machines are able to become self-aware. http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm
Permalink # Maddie Moore said
Okay I’m gonna try it this way since I’m not sure if it’s posting or not.. Sorry if there are like 8 of these..
Singularity is “a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand.” (http://mashable.com/2014/11/17/elon-musk-singularity/) I don’t think I like singularity. Technological advances are cool but I don’t want everything to be computerized. Even though Google is much faster and more convenient, the nerd in me is still just as content opening a book when I need information. In some cases there are computerized countertops that can like do stuff for you as if you aren’t capable of moving your own hands and doing things yourself. I don’t know much about that though. I think that singularity takes away our humanity completely. We might not be dead but what’s the point of living if we aren’t even thinking for ourselves or doing things on our own?
“While AI can trounce the best chess or Jeopardy player and do other specialized tasks, it’s still light-years behind the average 7-year-old in terms of common sense, vision, language and intuition about how the physical world works, Davis said.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) Why do we want to teach/program a computer to do something that we are perfectly capable of doing ourselves? Sometimes I feel like we spend so much time anticipating and trying to develop “the next big thing” that there’s no way we can ever learn to appreciate what we have. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) Stephen Hawking suggests that humans could cease to exist with these types of advances. I think that could be true based on how quickly computers respond….. most of the time.
“Even if we find a way to build transistors on a scale of just a few nanometers, they wouldn’t necessarily work.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm) New technology annoys me in general for this very reason… we have iPads so small they might as well be phones and other times we have iPhones and Galaxy phones that are so big they should just be tablets. I obviously don’t think we should go back to mega-dinosaur computers, but do we REALLY need Apple watches?
I can’t remember which source or I’d cite it, but I think they’re wrong for comparing this to the Agricultural and Industrial Revolutions. Those were different situations. They were pretty much necessary for sustaining human life. They were good for the economy. Singularity would boost the economy because everyone has to have the latest and greatest technology, BUT if this technology is just going to be robots that wipe out the human race, does it really matter that it would boost the economy? Those types of revolutions allowed us to be more efficient with farming and production of consumer goods, for the most part. This type of revolution is literally allowing us to be as lazy as possible, which in itself could kill us for biological reasons, like literally never moving a muscle.
I don’t see singularity as the beginning of a “golden age” but I don’t exactly see it as the end of the human race in other ways… Honestly I just see it as unnecessary.
Permalink # shatonia4398 said
Singularity is the point at which artificial intelligence can match, and then overtake, human smarts.
“Some believe in a utopian future, in which humans can transcend their physical limitations with the aid of machines. But others think humans will eventually relinquish most of their abilities and gradually become absorbed into artificial intelligence-based organisms, much like the energy making machinery in our own cells.” http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html
For some people, this may sound exciting. Technology that does everything for you.
“It could lead the artificial intelligence to exterminate any creatures it views as competitors for control of the Earth – that would be us.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
This could be the elimination of humanity if the technology feels threatened.
“Vinge proposes that mankind is heading toward an irrevocable destiny in which we will evolve beyond our understanding through the use of technology. He calls it the singularity. What is the singularity, and how might it come about?” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
All of this could lead to something else, but I guess we will all have to wait and see.
“The leading AI companies have taken great steps to ensure safety. They recognize the danger.” http://mashable.com/2014/11/17/elon-musk-singularity/
Permalink # joeldanae said
What does singularity mean for us? I believe that singularity means that robots, computers, and other advanced technology will take over the workforce; rather than citizens working to provide for their families, there will only be a few people working, in order to make sure the machines are running properly. “Technological advances would move at a blistering pace. Machines would know how to improve themselves. Humans would become obsolete in the computer world. We would have created a superhuman intelligence. Advances would come faster than we could recognize them. In short, we would reach the singularity.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm “…a growing number of artificial intelligence experts agree that humans will eventually create an artificial intelligence that can think beyond our own capacities. This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/ My question is, why would we even let humanity get to this point? Enough people are already at home, or on the streets, without a job, and if singularity happens, then millions of people will be unemployed. “The Swedish philosopher is the director of the Future of Humanity Institute at the University of Oxford, where he’s spent a lot of time thinking about the potential outcomes of the singularity. In his new book Superintelligence, Bostrom argues that once machines surpass human intellect, they could mobilize and decide to eradicate humans extremely quickly using any number of strategies (deploying unseen pathogens, recruiting humans to their side or simple brute force). The world of the future would become ever more technologically advanced and complex, but we wouldn’t be around to see it.
“A society of economic miracles and technological awesomeness, with nobody there to benefit,” he writes. “A Disneyland without children.” – Nick Bostrom http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/ “I am not alone in thinking we should be worried,” said Musk. “The leading AI companies have taken great steps to ensure safety. They recognize the danger, but believe that they can shape and control the digital superintelligences and prevent bad ones from escaping into the Internet. That remains to be seen …”
http://mashable.com/2014/11/17/elon-musk-singularity/
Will it be a “golden age” for humanity, the elimination of humanity, or something else?
I believe it will be the elimination of humanity. If we are forced out of our jobs by computers and advanced machinery (superintelligence), then we will not be able to survive. Would you like to be unemployed because of a computer? That is what singularity could mean.
Permalink # jedwebb said
Singularity for the human race means a form of evolution. If AI computers are one day built on a wide scale, there will be a couple of effects. For one, millions of jobs will be lost to AIs. The reason for this is that an AI can think faster, better, and more logically than a human. They also would be able to work better and more efficiently. As companies adopt AI technology, other companies will be forced to do the same in a very competitive market. AIs do not need to be paid, they do not need to be fed, and they do not get sick. I believe it won’t be all bad for AIs to exist. We would get a more efficient world and one where human error does not occur. While we will just have to wait and see, I think more thought should be put into the future and just how AIs will fit into our world.
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
http://mashable.com/2014/11/17/elon-musk-singularity/
Permalink # maddiehaas said
” what does singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else?”
Singularity is when scientists advance artificial intelligence so that computers and other technology become self-aware. For us it could either mean a more luxurious life with basically robot slaves, or being overpowered and wiped out. “A society or economy made primarily of robots will not fear destroying nature in the same way that we should fear destroying nature,” Hanson said. (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm)
Stephen Hawking said “The development of full artificial intelligence could spell the end of the human race,” which I mostly agree with. They will begin to develop their own software and won’t need humans anymore.
(http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
The enhancement of robots could end jobs, destroy nature, and might happen in just 16 years according to the Live Science website.
“A society or economy made primarily of robots will not fear destroying nature in the same way that we should fear destroying nature,” Hanson said.
(http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
Something horrible actually could happen with this advancement. We are getting in entirely over our heads. I think we should consider the future outcomes before we try to create something that could possibly destroy us in the end. According to Musk we might have only 5 to 10 years. “The risk of something seriously dangerous happening is in the five year timeframe. 10 years at most,” wrote Musk. “PLEASE note that I am normally super pro technology and have never raised this issue until recent months. This is not a case of crying wolf about something I don’t understand.” (http://mashable.com/2014/11/17/elon-musk-singularity/)
Permalink # isabelchaparro said
As I read all of the articles it was clear to me that technology scientists are extremely worried about singularity, “a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand.” http://mashable.com/2014/11/17/elon-musk-singularity/ The question is how I feel about singularity. My opinion in all of this is that I do indeed believe that robotic computers will have a better way of understanding things, but to an extent. Lots of scientific researchers believe that improving their capacities could lead to the extinction of the human race. Stephen Hawking even concluded this: “The development of full artificial intelligence could spell the end of the human race.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/ I’m not a scientist and therefore am not educated enough to say whether this could be true. But I do think that humans have become so involved in technology that even further in the future it is going to be bigger than what it is now. There have been so many inventions, such as the car that drives itself. I think that it’s pretty amazing, but I do see the downfall in it as well. Making machines that can do more things than a human can do is sometimes not a good thing. Machines are not people who need to make a living to pay for expenses like the average human. Therefore this labor, even though it is cheap for companies, is costing many jobs that people do need. This right here is something that AI developers have to consider. “we’re barreling toward a future that doesn’t take people into account. For instance, self-driving cars could improve safety, but also put millions of truck drivers out of work, Hibbard said. So far, no one is planning for those possibilities.”
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html Scientists believe that they can build some kind of software that will prevent these machines from exceeding their potential and limit their power. “Another way to prevent the singularity includes building in safety features before machines are able to become self-aware” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm. We all need to come up with ideas so that building robotic machinery won’t affect us. Creating a powerful machine can be harmful because anyone who gets a hold of this machine would have that much control. So we should also take into consideration how ambitious a human can be.
Permalink # ndaquioag98 said
“But right now, these machines have to answer to humans” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm) I agree that one day technology will be so complex and evolved that we will give control to robots. As of now, we are able to keep our ruling to a maximum.
“The development of full artificial intelligence could spell the end of the human race,” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) as said by one of the greatest scientists, Stephen Hawking, it could be the beginning of the end. Technology evolves faster than we blink. Without control, it could lead to an overrule by technology!
“Some believe in a utopian future, in which humans can transcend their physical limitations with the aid of machines.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) No scientist can prove an exact date for such an event, but it is bound to happen according to certain scientists.
“There’s plenty of debate over the singularity — a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand.” (http://mashable.com/2014/11/17/elon-musk-singularity/) We will be unable to understand and predict what we can do with technology, because it will ruin the human race.
Permalink # Aislinn Nantz said
“Vernor Vinge proposes an interesting — and potentially terrifying — prediction in his essay titled “The Coming Technological Singularity: How to Survive in the Post-Human Era.” … The essay specifies four ways in which this could happen: Scientists could develop advancements in artificial intelligence (AI), Computer networks might somehow become self-aware, Computer/human interfaces become so advanced that humans essentially evolve into a new species, Biological science advancements allow humans to physically engineer human intelligence. Out of those four possibilities, the first three could lead to machines taking over.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm. Now in my words: Singularity could very well mean the end of mankind. You see, we humans have a habit of not thinking of the consequences that come with our actions, and it is very possible that people wanting to build robots with emotions and the ability to think on their own could lead to the end of us, whether it be for the satisfaction of our entertainment or for our safety… especially if it’s for our safety! Because if you think about it, we are the biggest threat to our own safety, and in order to “protect us” the robots would have to destroy us; just watch I, Robot with Will Smith. My best answer for whether it will be a “golden age” for humanity, the elimination of humanity, or something else? Sit down and watch the movies I, Robot and Artificial Intelligence (AI), and the TV series Person of Interest. That show is my personal favorite, and the computer does not “destroy us,” but it does present a very serious “power” problem and… you’ll just have to watch them to understand it all.
Permalink # jaybyrd40 said
Singularity is a horrifying fantasy in the minds of many, but recognizing the truly terrifying reality of it is not based on age or even knowledge of the subject. Some may become willfully ignorant of the subject and some may just not recognize it initially. I don’t fear the unknown as much as it confuses me about how to approach the situation. I am, however, afraid of the rapid advancement of technology, just because it presents more dangers than positives in my mind, due to the fact that we will eventually not be able to anticipate and prepare for the next step in the chain reaction of advancement. This quote from the second page of the first link supports my idea:
“Technological advances would move at a blistering pace. Machines would know how to improve themselves. Humans would become obsolete in the computer world. We would have created a superhuman intelligence. Advances would come faster than we could recognize them. In short, we would reach the singularity.”
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm
Although I don’t agree with the first quote from Stephen Hawking in the second link where he says that:
“The development of full artificial intelligence could spell the end of the human race,”
I do believe that his second statement connects to what I said previously that it would spark an uncontrollable advancement that could not be terminated or prevented.
“It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
I think that receiving the opinions of extremely educated people is valid, but then again it contains a specific underlying bias, since most people who are profound geniuses share similar views on such subjects; this makes the second article helpful but invalid without an opposing website.
I feel the third article tries to explain and support science-fiction movies rather than taking a view on the situation as a whole without referencing those films. This is why the author chooses to begin the article with the quote that reads:
“Are you prepared to meet your robot overlords?
The idea of superintelligent machines may sound like the plot of “The Terminator” or “The Matrix,” but many experts say the idea isn’t far-fetched.”
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
Along with the fact that it shows a robot similar in appearance to one from the famous Will Smith movie “I, Robot”.
The final article deeply analyzes Elon Musk’s view on the singularity, which is briefly mentioned in the second article. It makes the point that although many computer scientists believe it is possible and maybe even inevitable, it is still a good while from now. This quote shows that Elon is overestimating the rapid advancement of technology:
“His prediction is way ahead of the pack, too. Ray Kurzweil, chief proponent of the singularity theory, predicts that we’ll see the technological event come to pass sometime around 2045 — decades later than Musk’s prediction.”
http://mashable.com/2014/11/17/elon-musk-singularity/
In conclusion, I don’t fear the singularity, mainly because it just seems unreasonable right now, and I don’t fear for the existence of the human race when competing with artificial intelligence.
Permalink # iheartnoblitt said
This is Dylan Riley.
Since wordpress thinks it’s hilarious for me to write a decent blog post and to delete it when I try to post it, I have to rewrite. It just makes me so excited to redo a blog post that won’t be nearly as well written. Singularity; I personally think it’s an insanely idiotic idea. As cool as it would be to have robot overlords, because let’s face it, people are horrible, it’s so incredibly far-fetched and ridiculous that it physically hurts me to think that people actually believe we’ll be overtaken by robots. I like the Matrix as much as the next person, but I think it’s just dumb.
In the first article, it gives the definition of, well, a skewed definition of, singularity. It says that the singularity is robots taking over the world and enslaving humans. Still, a pretty stupid idea. (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm)
The second article is six very intelligent and well-respected people talking about how they all believe that robots will take us over one day. The problem with this is that just because smart people say that something is true, it doesn’t mean it is. Harrison Ford said that the Bush administration was a good one, but that’s obviously a lie. Stephen Hawking said that we would be living with aliens, and knowing that they’re aliens, as one. I personally don’t know any aliens, unless you count illegal immigrants, but that’s an argument for another time. (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
In the third article, it says that scientists believe the singularity will happen within the 21st century. It also gives a partial argument for the other side, who don’t believe in the singularity. One point that it brings up is that robots have no physical intuition. If someone sees a cup of water falling to the floor, we know that we’re going to end up with water everywhere, whereas a robot would have to run countless simulations, full of measurements of everything, just to know that water will end up on the floor. So, good job Noblitt. You gave a somewhat non-biased article for once. Keep it up. (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
In the last article, it’s talking about Elon Musk saying that the singularity is close. One company he brings up is DeepMind, an AI development company, and its overwhelming accomplishments. However, if you read their publications (http://deepmind.com/publications.html), you find that all there really is are documents stating theories of what robots could do based on psychological processes in humans. So, unless they keep all of their results under lock, there isn’t much proof. Of course, you’d have to be doing some pretty amazing work to have Google as a business partner. (http://mashable.com/2014/11/17/elon-musk-singularity/)
Bottom line, the singularity is a pretty idiotic idea. Of course, living life like a Jonathan Coulton song would be pretty amazing. (props to anyone who knows about JoCo and his music)
Permalink # morgan said
Singularity is when computers are so advanced that they can function at the level of, or at a higher level than, humans.
“After all, computers allow us to communicate with each other, keep track of complex systems like global markets and even control the world’s most dangerous weapons.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm I don’t necessarily agree with this. In my opinion, computers don’t “allow” us to communicate with each other, they help us. We would still be able to communicate had computers never been invented. Also, I don’t think that computers “keep track” of complex systems; humans do that using technology. We probably wouldn’t have such complex systems as we do if we didn’t have all the technology that we have, which wouldn’t be a bad thing. Lastly, computers don’t “control” the world’s most deadly weapons, humans do. There is no way as of right now that computers have the ability to make a decision regarding the world’s safety.
Permalink # shadowlander16 said
The term singularity has many definitions for humanity. The everyday English definition of singularity designates the quality of being one of a kind: strange, unique, remarkable or unusual. All things that one may want to be known as or called. I think singularity is a good thing, in all aspects of the word. Will it be the golden age? Maybe, maybe not. From what I read in the following resources, I personally would have to agree that it will be a golden age and that it may be helpful for us to further advance our resources in technology. Yes, I truly believe one day there will be no jobs and that robots or electronics will do our everyday jobs. The government has a way of working things out. People find a way. Most humans don’t have the drive to do something with themselves, but the ones who do will be “singularly” pointed out and will strive to be the next biggest thing. In the future we aren’t going to need farmers and naturalists; we’re going to be engineers and astronauts. Earth was and is dying. Any further development in science might cause total breakdown, because development without harnessing the environment isn’t really possible. In reality this can never happen at such a rapid pace, only gradually over time. In the future, technology will grow so advanced that man in the field will become history. There’ll be robots and machines for doing agricultural work. This will be the fall of humanity, the rise of humanity and “something else” all at the same time. Humans have one thing inside them, everyone does: SURVIVAL.
“Could machines replace humans as the dominant force on the planet?” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm)
Permalink # chrisbonilla0430 said
Singularity is the idea that computers would one day have the capacity to work like and be faster than human brains. Different scientists and researchers say different things about singularity. They all have different ideas on how it would actually happen and when it would occur. “Scientists could develop advancements in artificial intelligence(AI). Computer networks might somehow become self-aware. Computer/human interfaces become so advanced that humans essentially evolve into a new species. Biological science advancements allow humans to physically engineer human intelligence.”
(http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm) To be honest, this sounds more like the sci-fi movie Transcendence, with Johnny Depp. I can see how we’re expanding with the use of computers; we’re also relying on them more and more each day. I just can’t see the whole “getting to the point of taking over and controlling humans”. “Bostrom argues that once machines surpass human intellect, they could mobilize and decide to eradicate humans extremely quickly using any number of strategies (deploying unseen pathogens, recruiting humans to their side or simple brute force).” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) This article was saying that singularity would mean the extinction of humans, caused by computers. I mean, if it were to happen, it would definitely not be a “golden age” for humanity.
I like this idea: “others think humans will eventually relinquish most of their abilities and gradually become absorbed into artificial intelligence (AI)-based organisms, much like the energy making machinery in our own cells.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) I guess it’s a type of singularity, but we would still be involved. We’d just be half computer. If this were to happen, this would most likely be the “golden age” for humans. It would allow us to have unlimited information and be able to perform certain tasks that we couldn’t before. But before any of this can actually happen, we have to actually build the machine to do the work. Elon Musk says, “They recognize the danger, but believe that they can shape and control the digital superintelligences and prevent bad ones from escaping into the Internet.” (http://mashable.com/2014/11/17/elon-musk-singularity/) So they already know the trouble that these machines might bring, but they are going to try to build a safety net for these situations. I feel like they can try, but at the end of the day, if the machine is smarter than humans, I’m sure it’ll be able to bypass it.
Permalink # sarahpickert15 said
“There’s plenty of debate over the singularity — a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand. Some say it will be a boon for humanity; some foresee an Artificial Intelligence-driven apocalypse” (http://mashable.com/2014/11/17/elon-musk-singularity/).
“Computer technology advances at a faster rate than many other technologies. Computers tend to double in power every two years or so. This trend is related to Moore’s Law, which states that transistors double in power every 18 months. Vinge says that at this rate, it’s only a matter of time before humans build a machine that can “think” like a human” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm).
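The figures in that quote are just exponential doubling, which is easy to sanity-check. Here is a minimal sketch (plain Python; the 18-month and two-year doubling periods come from the quote, everything else is illustrative):

```python
# Illustrative sketch of the doubling arithmetic quoted above:
# if capability doubles every fixed period (Moore's Law as cited),
# how much growth accumulates over a given number of years?

def relative_power(years: float, doubling_period_years: float = 1.5) -> float:
    """Growth factor after `years`, given one doubling per period."""
    return 2.0 ** (years / doubling_period_years)

# Fifteen years at an 18-month doubling period is ten doublings:
print(relative_power(15))        # 1024.0 (2**10)
# The looser "every two years" figure grows more slowly:
print(relative_power(15, 2.0))   # ~181x
```

Either way, the point the articles lean on is the same: under constant-period doubling, growth compounds far faster than intuition suggests.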
“And in scientific circles, a growing number of artificial intelligence experts agree that humans will eventually create an artificial intelligence that can think beyond our own capacities. This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources”(http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm).
“The past two singularities — the Agricultural and Industrial revolutions — led to a doubling in economic productivity every 1,000 and 15 years, respectively, said Robin Hanson, an economist at George Mason University in Washington, D.C., who is writing a book about the future singularity. But once machines become as smart as men, the economy will double every week or month” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html).
I guess you could say that I am one of those people who has occasionally thought about this and thinks it possible that robots will indeed become our "robot overlords." Technology gets more advanced every day, and singularity is becoming more and more prominent in minds all over the world, especially scientists' (and those of people who panic about everything). For us, singularity could mean a possible end, or even suppression, of the human race. As for a "golden age" or the damnation of humanity, there is no way to tell. Humans are in control of this even if they don't realize it: humans build all robots and make them what they are, so either way humans will determine what happens to humanity. Hopefully it's more of a "golden age" than damnation.
Permalink # ramseybrooke said
what does singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else?
Singularity is where "we will evolve beyond our understanding through the use of technology" (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm). So for us, it could possibly put us in danger, as many people believe, or we could use it to our advantage. "The development of full artificial intelligence could spell the end of the human race" (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/). "Some believe in a utopian future, in which humans can transcend their physical limitations with the aid of machines. But others think humans will eventually relinquish most of their abilities and gradually become absorbed into artificial intelligence" (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html).
“The risk of something seriously dangerous happening is in the five year timeframe. 10 years at most,” wrote Musk. “(http://mashable.com/2014/11/17/elon-musk-singularity/).
Even after reading the articles, I'm still not really sure if I believe this will happen. I think it's a crazy idea, and I hadn't heard anything about it until this year. But of course only time will tell. It is also kind of scary to think about, so I don't really like to think about or discuss it.
Permalink # mariachaparro119 said
As I read all of the articles, it was clear to me that technology researchers are extremely worried about singularity, "a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand." http://mashable.com/2014/11/17/elon-musk-singularity/ The question is how I feel about singularity. My opinion in all of this is that I do indeed believe that robotic computers will have a better way of understanding things, but only to an extent. Lots of scientific researchers believe that improving their capacities could lead to the extinction of the human race. Stephen Hawking even concluded, "The development of full artificial intelligence could spell the end of the human race." http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/ I'm not a scientist, and therefore not educated enough to say whether this could be true. But I do think that humans have become so involved in technology that it will only be bigger in the future than it is now. There have been so many inventions, such as the car that drives itself. I think that's pretty amazing, but I do see the downside as well. Making machines that can do more things than a human can is sometimes not a good thing. Machines are not people who need to make a living to pay for expenses like an average human, so this labor, even though it is cheap for companies, is costing many jobs that people need. This is something the AI field has to consider: "we're barreling toward a future that doesn't take people into account. For instance, self-driving cars could improve safety, but also put millions of truck drivers out of work, Hibbard said. So far, no one is planning for those possibilities."
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html Scientists believe that they can build some kind of software that will prevent machines from exceeding their potential and limit their power: "Another way to prevent the singularity includes building in safety features before machines are able to become self-aware." http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm We all need to come up with ideas so that building robotic machinery won't harm us. Creating a powerful machine can be dangerous, because anyone who gets hold of it would have that much control. So we should also take into consideration how ambitious a human can be.
Permalink # blueeyedathlete said
In mathematics, a singularity is a point at which a function takes an infinite value.
“Could machines replace humans as the dominant force on the planet? Some might argue that we’ve already reached that point.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
This quote really stood out to me, because it's 100% true. There are robotic machines in factories taking over people's jobs. Yes, it will create more engineering jobs for the world, because the machines will need to be fixed and updated. Correct? But what about the fact that more people would then be living in poverty rather than actually working?
“This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources.”
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
I imagine this so-called "utopia" would be full of lazy, overweight people, because they wouldn't have to get up to work; they would be lying around all day eating. Technology is definitely playing a very big role in today's society, and it will play an even greater one when this "utopia" comes around, but the whole purpose of humanity would be eliminated. I once saw a quote that said, "Being human is a given, but keeping our humanity is a choice." It means that you can choose to have all of this technology, or take it back to the old days when phones didn't exist and you could be outside enjoying what is around you. The majority of teenagers have big dreams of being rich and famous, but how will they reach those dreams if robots take all of their jobs? The only robot I would like to see is RoboCop, because crime levels would decrease, but then again there goes the problem of losing jobs. This will not be a golden age for us; it would be more like a bronze age, because humanity is disappearing. It's great having a smartphone, but it isn't going to take my job in the future. I doubt there will be a robot that can be involved with management; robots will most likely be in engineering and manufacturing jobs.
“We’re becoming like the mitochondria. We provide the energy — we turn on the machines,” Slonczewski told LiveScience. “But increasingly, they do everything else.”
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
I disagree with this statement, because these robots can't do "everything": they can't taste food, they can't feel emotions, they don't have a personality, and I would consider them boring. Yeah, they could get the job done a lot faster, but I don't want to live in a boring world where a machine does "everything" for me, and that is where society is heading if you think long and hard about it. I would like to be able to converse with my co-workers rather than have a cold sheet of metal sitting next to me. That is what being human is about: to live, to feel, to taste, and to love. Yeah, I hate the whole idea of robots taking over the world.
Permalink # julio said
what does singularity mean for us?
Vinge, a former professor, says that singularity is the idea of evolving beyond our understanding. Humans have made great progress with technology over the last few centuries. Singularity, for us, should mean progress, not failure. (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm)
Will it be a “golden age” for humanity, the elimination of humanity, or something else?
I believe that we as humans can choose our own destiny. This can turn out either really good or really bad, so starting now, everyone working on these technological advances shouldn't do anything stupid with them. Philosophers and writers are predicting the elimination of humanity if these advances head in the wrong direction. (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
“Artificial technology will turn deadly in 5 years”? No.
If we are talking in technical terms, artificial intelligence is already deadly. With the push of a few buttons, the earth could be destroyed. Nuclear weapons and even our own phones are deadly. With our technology today we could do so many great things (and we have).
I don’t believe this will be a golden age if technology advances simply because we think we are in one now.
(http://mashable.com/2014/11/17/elon-musk-singularity/)
The advancement of human and artificial technology shouldn't lead to the end of humanity, and if it does, we can only blame ourselves. Like my good friend Zack Antrum once said, "just unplug it." #420
Permalink # destinysanders235 said
Singularity means that computers and robotics could possibly become the brains of the world and take over human functions. "After all, computers allow us to communicate with each other, keep track of complex systems like global markets and even control the world's most dangerous weapons." (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm) This statement gives insight into how computers already have an advantage over mankind; however, they still have to be operated by humans. Therefore, it is not yet definite that computers and robotics shall dominate. I personally believe that if mankind were not so reliant on technology to figure things out for us, we wouldn't have an issue with whether technology will take over. "It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded." (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) This quote is from the physicist Stephen Hawking. Technology could only take over if humans give it the power to; I think this particular statement is a little outrageous. Who knows what could happen within the next few years? It can't be predicted. Let's say humans don't let robotics and computers take over, then what? It won't be known until it happens. Is it possible that technology can re-design itself at an increasing rate? Sure it is. But I can't give a definite answer on whether it's going to end the human race; only time will tell. "For instance, because of that physical intuition, humans can watch a person overturn a cup of coffee and just know that the end result will be a puddle on the floor.
A computer program, on the other hand, would have to do a laborious simulation and know the exact size of the cup, the height of the cup from the surface and various other parameters to understand the outcome, Davis said." (http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html) This quote sounds legitimate; however, isn't the human race the one that programmed and invented this technology, therefore making us the intelligent ones? If it weren't for humans, how would the technology have been developed and acquired the techniques that it has? I feel as if, now that computers and robotics can do the things humans want and more, we no longer have to rely on finding information and knowledge on our own. But we shouldn't let it rule our everyday life, resulting in "artificial intelligence" ruling mankind. "The risk of something seriously dangerous happening is in the five year timeframe. 10 years at most," wrote Musk. (http://mashable.com/2014/11/17/elon-musk-singularity/) As I stated earlier, I don't believe it can be predicted exactly when such a thing could happen. Maybe it will, maybe it won't; humans will never know until it happens. There are factors that can make an individual believe that "artificial intelligence" taking over is bound to happen, yet there are also just enough factors to support that it isn't. Therefore, I can't give a definite opinion on whether or not it will be the "golden age" for humanity, because who knows.
Permalink # zhalenr said
What does singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else?
Singularity is basically “the point at which artificial intelligence can match, and then overtake, human smarts”.
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html I don't know if singularity will be the end of humanity; however, I do think that it can be a possibility. In one of the articles it stated, "Some of the smartest people in the world believe that this possibility is closer than expected, while many others disapprove. One way to prevent the singularity includes building in safety features before machines are able to become self-aware." http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm In another one of the articles it said, "Other people believe that it would only be 5 years or so but either way singularity would mean elimination of humanity."
http://mashable.com/2014/11/17/elon-musk-singularity/ I'm still not sure if singularity will be the end of humanity, because I believe that only God knows. The last article, http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/, talked about Stephen Hawking agreeing. I will say that singularity is a huge possibility, but I am just unsure.
Permalink # brooksdeanna0530 said
What does singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else? Singularity is the state, fact, quality, or condition of being singular. “How long will this last? Are we headed for a future in which machines gain a form of consciousness? If they do, what happens to us? Will we enter a future in which computers and robots do all the work and we enjoy the fruits of their labor? Will we be converted into inefficient batteries a la “The Matrix?” Or will machines exterminate the human race from the face of the Earth?” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm “And in scientific circles, a growing number of artificial intelligence experts agree that humans will eventually create an artificial intelligence that can think beyond our own capacities.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/ “For instance, because of that physical intuition, humans can watch a person overturn a cup of coffee and just know that the end result will be a puddle on the floor. A computer program, on the other hand, would have to do a laborious simulation and know the exact size of the cup, the height of the cup from the surface and various other parameters to understand the outcome, Davis said” http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html “Aside from the frighteningly near-term prediction, the other thing that seemed to give Musk’s comment import was the mention of DeepMind, a very real company he has invested in that works in the artificial intelligence space.” http://mashable.com/2014/11/17/elon-musk-singularity/ These articles are saying it’s near, and yes they have already started working on it. However, I find it hard to believe we will get them working and mass produced before the human race destroys the Earth. 
Plus, any time you invent something, it takes several trials and many alterations to the product. I do not see it as a "golden age" for humanity; if it happens before humanity destroys the Earth, I see it as the elimination of humanity. We will have practically nothing to live for; we'd be empty shells of people, just moving around. We wouldn't be able to enjoy the simple little things in life that make life as wonderful as it is.
Permalink # rufinalanda said
Singularity, as Vinge proposes it, is the idea that mankind is headed toward an irrevocable destiny in which we will evolve beyond our understanding through the use of technology.
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
So basically, eventually, all robotics will be more powerful than humankind and will not be stopped if they were ever to take control. I believe that singularity absolutely will eliminate humanity, eventually. Robots are getting stronger and more powerful every day, and some people believe that robots will be able to do anything we need in the future.
Some people, such as Elon Musk, even think that singularity is more dangerous than nuclear weapons (Time). Wouldn't you be scared of this? We all already know what nuclear weapons are capable of doing. Now compare the two and imagine what could happen.
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
Humans could never win against the robots. As Stephen Hawking says, "Humans are limited by slow biological evolution, they couldn't compete, and they would be superseded." They would acquire knowledge faster than humans and could end up in control of the whole universe. Maybe not anytime soon, but I still believe this could happen eventually.
Permalink # calliehatley said
Singularity seems to be the point at which artificial intelligence surpasses human intelligence. http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html I have to admit that this is a situation I can't imagine; yet many brilliant people are convinced that it is coming. Computers and machines analyze data amazingly well, but they do not have feelings. They do not have common sense and intuition. http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html These are God-given abilities that cannot be improved upon. God made man in His image (The Holy Bible), and I personally do not believe that man can out-do God. Computers and robots are tools, but tools need man to make them work. The constant improvement of these tools may, however, ultimately hurt mankind. It has made society lazier and less social. I don't believe the machines will destroy us, but they may cause us to destroy ourselves.
Permalink # itaylor068 said
What does singularity mean for us?
” a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand.”
http://mashable.com/2014/11/17/elon-musk-singularity/
Will it be a “golden age” for humanity, the elimination of humanity, or something else?
I think that it will cause the elimination of humanity.
“Before artificial intelligence becomes a reality, someone will have to develop software that will allow a machine to analyze data, make decisions and act autonomously…Humans would become obsolete in the computer world…In short, we would reach the singularity.”
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm
For singularity to actually happen, we would have to develop a way for it to. I see no reason to do so, for it would have a very negative effect. Technology already controls the human race so much, why increase its influence?
“The development of full artificial intelligence could spell the end of the human race,” the world-renowned physicist told the BBC.
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
This was a quote from Stephen Hawking. His thoughts on artificial intelligence correspond with mine. He states that the human race is limited by slow biological evolution and that we basically have no chance when competing against artificial intelligence.
“We’re becoming like the mitochondria. We provide the energy — we turn on the machines,” Slonczewski told LiveScience. “But increasingly, they do everything else.”
http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html
I like this quote. It basically sums up the singularity situation. WE control it happening. There needs to be a limit on how far we can go with technology. Without humans, technology will never reach the singularity extreme, so why let it?
Permalink # gerardrosenthal said
“After all, computers allow us to communicate with each other, keep track of complex systems like global markets and even control the world’s most dangerous weapons.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
I know it might come off as lazy to use a quote to start off a post, but this one out of that entire article made me the MOST uneasy I’ve felt in a long time. And trust me, I’ve seen weird stuff. Man getting hit in the head with a large rock weird. Moving on. The fact that humanity already, key word, already relies so heavily on technology goes to show how far it can possibly go. Man relying on machine to carry out simple mathematical functions? Wait… We’re there too… Huh.
Now, what is singularity? “Some believe in a utopian future, in which humans can transcend their physical limitations with the aid of machines.” http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
To put it simply, singularity is the theoretical process where humans (that means us) become one with our tech. Like booting up your brain into a computer. Personally I’d never do it. I like grass under my feet. Besides, immortality isn’t all it’s cracked up to be. Now, one can see this singularity as a golden age. Limitless lifespan, no physical burdens, nor will you have to learn anything. You know everything. All for the low price of your human body. If this future comes true, this is the best-case scenario. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/ – Stephen Hawking.
This is what also might happen. The AI (Artificial Idiot) will see these strange creatures. They enter an area, populate, remove all other organisms, construct structures for their own benefit, and overall harm the area. The AI will recognize this as a virus, and remove it swiftly. If you haven’t figured it out, they will kill us. With super pointy robot things. I for one stopped trusting technology around the iPhone 3 and the Android. “The leading AI companies have taken great steps to ensure safety. They recognize the danger, but believe that they can shape and control the digital superintelligences and prevent bad ones from escaping into the Internet.” http://mashable.com/2014/11/17/elon-musk-singularity/
I wonder if God once said something similar about man. Despite humanity’s best efforts, bad things still happen. We all know this. We mean well, but something fails. We’re flawed, it’s a fact of life. Until we can cast aside our arrogance and accept what we’re actually capable of, we’re incredibly, royally screwed! Now to answer whether or not the singularity brings about the golden age. Well, for me, it doesn’t. To me, all of the end results are bad. Either we all die horribly at the hands of robots, succumb to their rule, or relinquish our humanity. I for one enjoy the sunshine, the grass beneath my feet, cool breezes, music, all the things of humanity.
Even if it’s the best feeling in the world or the worst thing I have ever felt, I get to experience it, for I am alive. I am human.
Permalink # gerardrosenthal said
Hold on, is this thing even working?
Permalink # djenkins60 said
I hope. Mine was long af
Permalink # gerardrosenthal said
Okay, GREAT! I get to do my post again! Of course I do! WONDERFUL!
“Some might argue that we’ve already reached that point. After all, computers allow us to communicate with each other, keep track of complex systems like global markets and even control the world’s most dangerous weapons.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
Now, as I started to say before, this quote made me the most uneasy I’ve felt in a long time. I’ve seen some strange things, like small man getting hit with a big rock strange, but this takes the cake! The fact that man already relies so heavily on technology scares me. What’s next, relying on technology to do simple mathematics? Wait…. We’re there already. Moving on.
In “The Singularity is Near: When Humans Transcend Biology” (Viking, 2005), futurist Ray Kurzweil predicted that computers will be as smart as humans by 2029, and that by 2045, “computers will be billions of times more powerful than unaided human intelligence.” http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
This statement kind of sums up what the idea of singularity is. Singularity is the event in which computers and AIs (Artificial Idiots) will surpass humans in intelligence. This can mark the beginning of two possibilities. One, we’re royally screwed and it will be a dystopia. The other is we’ll merge with tech and live in a “Utopia,” if you can call it that.
“The development of full artificial intelligence could spell the end of the human race,” the world-renowned physicist told the BBC. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/ – Stephen Hawking
Basically this means that machines will rise far above humans, see us as nothing more than a weird virus that is killing the planet, and remove us. Remove, as in deal with. Deal with, as in end. End, as in ENSURE THAT WE ARE NO LONGER ON THIS MORTAL PLANE! This is the worst case scenario; however, given that computers are very quick to remove a virus instead of saying, “But it has spunk,” we won’t last long. They do control nukes after all. Nukes aren’t fun. “AI companies have taken great steps to ensure safety. They recognize the danger, but believe that they can shape and control the digital superintelligences and prevent bad ones from escaping into the Internet.” http://mashable.com/2014/11/17/elon-musk-singularity/
I wonder if God once said something similar about man. Point is, man is arrogant, and we’ve never done anything perfect before. This will just be another item on the list of things we have messed up. What I fear, this will be the last thing on the list.
Now, what is the other option? Well, the other option is man learns to coexist with his new superior technology. This means that man will boot his consciousness into networks. You’ll live forever, know everything, and never have to worry about mortal whims. All for the low price of your body, and everything that makes you human. I for one detest both ideas. Then again, I was the guy who stopped trusting technology around the iPhone 3 and the Android. I was paranoid before it was cool. I for one appreciate the sun, music, grass, air. The human things. Let me put it like this, a quote my mother used to tell me.
“Whether it is the best feeling in the world or the worst feeling I wouldn’t even wish upon my enemy, I get to experience it because I am alive.”
I am human.
Permalink # robertadams97 said
Singularity is the state, fact, quality, or condition of being singular. ‘The end of the human race.’ This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources. Artificial intelligence can match, and then overtake, human smarts. I do believe that in at least 10 years technology will be far more advanced. Today we already have cars that can drive themselves. Just imagine in 10 years what technological advances the world will have. It’s a scary thought to think that “robots” will take over the human race because they can reproduce way quicker than humans. http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html. http://mashable.com/2014/11/17/elon-musk-singularity/
Permalink # shannah97 said
I believe that singularity means that the human race will be challenged, but not necessarily destroyed, but definitely not the “golden age”. I personally don’t believe that singularity is coming as soon as some people think, but I can see the point that, in a way, machines have already taken over our society. “Could machines replace humans as the dominant force on the planet? Some might argue that we’ve already reached that point.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm) However, whenever singularity may happen, it won’t be good for humans, because if something is more intelligent than us, then humans won’t be able to compete, mentally. “Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) This is why I agree that we, as a whole, need to do our best to prevent singularity from happening. We need to stop relying so much on technology, and the companies that create technology need to take singularity into consideration when inventing new ideas. “They recognize the danger, but believe that they can shape and control the digital superintelligences and prevent bad ones from escaping into the Internet.” (http://mashable.com/2014/11/17/elon-musk-singularity/) I believe that humans would be able to survive singularity, because we have been on this earth way longer than robots, and we have the ability to adapt and use our environment to our advantage. And while robots would be more intelligent, they wouldn’t be as physically able. But the best thing that we as humans can do is prepare for the worst, by being more independent, not creating more technology that is smarter than us, and taking care of our environment because robots sure won’t.
“A society or economy made primarily of robots will not fear destroying nature in the same way that we should fear destroying nature,” Hanson said. (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
Permalink # starlawood0234 said
Even if singularity does occur, it’s not something WE have to worry about because it’s not gonna happen in our lifetime. There’s still a need for humans right now. “But right now, these machines have to answer to humans. They lack the ability to make decisions outside their programming or use intuition. Without self awareness and the ability to extrapolate based on available information, machines remain tools.” What’s the technological singularity? – HowStuffWorks
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm There are so many different ideas and predictions about singularity, according to the third article. All the scientists have different hypotheses. So there’s definitely no precise prediction about it. No one can really say. You can guess a lot, but people thought 20 years ago that we would look like the Jetsons by now. And that has yet to happen. Technology is progressing fast but also unpredictably.
Permalink # Lesliechegue said
What does singularity mean for us? Singularity is when we get to the point where technology makes decisions for itself.
“Could machines replace humans as the dominant force on the planet?” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm) Yes, we humans have already been replaced by robots and other machines that basically work for other humans!
Will it be a “golden age” for humanity, the elimination of humanity, or something else? “And in scientific circles, a growing number of artificial intelligence experts agree that humans will eventually create an artificial intelligence that can think beyond our own capacity.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
“For instance, because of that physical intuition, humans can watch a person overturn a cup of coffee and just know that the end result will be a puddle on the floor.” (http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html) As humans we already know the answer to this, but computers will let you understand why and how it happened the way it did.
Permalink # dylpicklee said
“Could machines replace humans as the dominant force on the planet?” Robots have already taken many jobs away from humans due to the amount of work they can get done without any complaints. Robots also evolve every day. Humans create machines, and our inventions only get better from here on out. http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
“On the list of doomsday scenarios that could wipe out the human race, super-smart killer robots rate pretty high in the public consciousness.” The inventions of robots have all come from the creative mind of human beings. This may well end up being the golden age due to all of the possibilities that are ahead with machines becoming more active than most humans. http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
“That productivity spike may not be a good thing. For one, robots could probably survive apocalyptic scenarios that would wipe out humans.” Machines can be capable of surviving what humans cannot. They can be rebuilt, but humans have bodies that are very fragile. We are not meant to survive every earth disaster. http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
“The pace of progress in artificial intelligence (I’m not referring to narrow AI) is incredibly fast.” I do not believe that humans ever planned on the things we created taking over us. Our inventions have evolved and gotten stronger so quickly. The human mind kept on inventing, and machines will forever keep on growing. http://mashable.com/2014/11/17/elon-musk-singularity/
Permalink # djenkins60 said
I think artificial intelligence will seem great at first. It will make our lives easier, and then people will want more and more and it will become singularity. People will not be able to control computers, and they will do things beyond our imaginations. Worst case scenario, the human race could be terminated. Even some of the world’s smartest people think this could happen. Stephen Hawking said, “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
Some people think singularity has been going on for a while. “Since Gutenberg invented the printing press, humans have continuously redefined intelligence and transferred those tasks to machines.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
People think singularity will happen in a small time frame, 5 or 10 years. “The risk of something seriously dangerous happening is in the five year timeframe. 10 years at most” (http://mashable.com/2014/11/17/elon-musk-singularity/)
(http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm)
Permalink # djenkins60 said
I definitely posted mine af and it’s not even showing up…
Permalink # carolinestyers said
Before we can discuss singularity we need to fully understand what it is. Singularity is defined as “an irrevocable destiny in which we will evolve beyond our understanding through the use of technology” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm). Another definition is, “a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand” (http://mashable.com/2014/11/17/elon-musk-singularity/). What I got out of those definitions is that singularity is basically a time when computers become smarter than humans and we can no longer control or understand them. This seems like a very silly thing to think about, but many people believe it will happen. Throughout the course of time humans have made astonishing advances in technology. I believe that the creation of a technology that can think by itself and understand the world around it is something that’s bound to happen. If/When this does happen, what does it mean for humans? “The development of full artificial intelligence could spell the end of the human race,” the world-renowned physicist told the BBC. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/). This is a very scary thing to think about. But are we already in the midst of technological singularity? I would argue that we are. “Humans have already relinquished many intelligent tasks, such as the ability to write, navigate, memorize facts or do calculations,” said Joan Slonczewski, a microbiologist at Kenyon College and the author of a science-fiction book called “The Highest Frontier” (Tor Books, 2011). (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html). Almost everything we use has a computer in it.
From the cars we drive to the cell phones constantly in our hands, we use computers to run our lives. We are already a society far too dependent on computers and technology. In some ways it’s good, but in others it’s bad. It’s good in the fact that life is made easier and we can do more with our time. It’s bad because it’s caused us to be extremely lazy. Why pull out a map and try to find a destination when you can plug it into your phone and find your estimated drive time with traffic conditions factored in? I believe our dependence on technology is ultimately going to lead us to our downfall. But just how soon will we be taken over by artificial intelligence? Predictions are all over the map. Elon Musk tends to think that it will be much sooner rather than later. “The pace of progress in artificial intelligence (I’m not referring to narrow AI) is incredibly fast,” said Musk. “Unless you have direct exposure to groups like Deepmind, you have no idea how fast — it is growing at a pace close to exponential.” (http://mashable.com/2014/11/17/elon-musk-singularity/). I don’t know what singularity means for us. I hope that when computers become self-aware they don’t try to destroy humans and take over the world. I hope that computers and humans can work and live alongside each other and that it truly will be a “golden age” for humanity.
Permalink # robertadams97 said
Singularity is the state, fact, quality, or condition of being singular. ‘The end of the human race.’ This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources. Artificial intelligence can match and then overtake human smarts. I think that 10 years from now, technology will be so advanced that humans wouldn’t be able to do anything about it. Jobs will be lost, resources would be limited, and deaths would even occur. If we have cars that drive themselves today, just imagine what the world would be like in 10 years. I do believe that robots will take over the human race at some point in time. http://mashable.com/2014/11/17/elon-musk-singularity/
Permalink # yleiva22 said
Singularity is usually related to robots. The idea is to have something with the same intelligence as a human. The fear some people have is that it will result in the apocalypse. This can relate to religion, as the idea of making something with the same intelligence as man is almost defying God himself, and thus could result in an apocalypse. “Some believe in a utopian future, in which humans can transcend their physical limitations with the aid of machines. But others think humans will eventually relinquish most of their abilities and gradually become absorbed into artificial intelligence (AI)-based organisms, much like the energy-making machinery in our own cells.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html). To be honest, I don’t find having everyone with the same intelligence a utopia. That would totally mess with our entire system of civilization, where certain people have certain jobs due to the skills they have. And with all the intelligence needed, where is the motivation? It almost strips away the many things life has to offer. It is like being born but knowing all the information in the world once you come out of the womb. While this information would not include personal experiences, I still think it takes away from living. Living is about learning. I wish I had a perfect SAT score, had 100 in calculus, and had gotten accepted into my top college. But I didn’t. Technically speaking, artificial intelligence would have granted me all of this. But if it had, I wouldn’t be wondering what adventures would be lying ahead of me next fall at Syracuse. With AI, we all would be pursuing the same thing. Who doesn’t want to be smart and rich? How do you get smart and rich? By getting into a good school and getting a good job. We would have billions of doctors. But if we had billions of doctors, then doctors wouldn’t be paid as much. Maybe the whole class system would flip.
Maybe a worker at McDonald’s would be getting the highest paid job. Once that happens, it flips again and again and again until there’s just chaos. In my opinion, artificial intelligence isn’t the best thing that could be offered.
Permalink # justinquinn98 said
Singularity means the state, fact, quality, or condition of being singular. Yes, it would be amazing if robots develop the things that are mentioned in http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html. If they do, I do believe it could be the age in which there is no use for mankind. No use for mankind because they will have no one to answer to, because they don’t make mistakes. As of right now they listen to humans’ commands, as mentioned in http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm. Like Stephen Hawking said in the article http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/, if aliens said they’ll be here soon, are we just going to wait? NO, we are going to explore harder to find them. The scientists are exploring harder because they are on the brink of cracking it. It will come in the next fifty years. This could possibly be the elimination of the human race because the human race is flawed. AI is not flawed; they don’t have to check social media or rest. It will be a day when these things are brought forth.
Permalink # karigandanielle said
“Vinge proposes that mankind is heading toward an irrevocable destiny in which we will evolve beyond our understanding through the use of technology. He calls it the singularity.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm)
“This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources. Or it could lead the artificial intelligence, or AI, to exterminate any creatures it views as competitors for control of the Earth—that would be us.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
“Some even think the singularity — the point at which artificial intelligence can match, and then overtake, human smarts” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
“There’s plenty of debate over the singularity — a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand” (http://mashable.com/2014/11/17/elon-musk-singularity/)
All of these definitions of singularity basically mean the same exact thing, just with different wording. Robots are going to develop thought processing beyond humans and realize that we are just parasites walking around and destroy us all, or make us their slaves, I don’t know, but anywho, robots are going to take over the world. What does this mean for humans, you ask? It means that we need to stop trying to make the next bigger and better thing, because sooner or later that bigger or better thing is going to start catching on to human life and develop itself. Just like in science how we learn about evolution, technology is evolving as well. I believe that at this rate, singularity will be the elimination of humanity. Stephen Hawking said, “Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://www.bbc.com/news/technology-30290540) If we create thinking machines, they will do what they were made for, but then some.
Permalink # carleybyrum25 said
Singularity is the point at which artificial intelligence matches, and then overtakes, human intelligence (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html). Singularity could either be a really great thing for human beings, or a really horrible thing for human beings. The hope is that creating artificial intelligence that could actually think on its own would bring a form of luxury for us (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/). Although this would be the hope in creating artificial intelligence that could think on its own, it’s not the most realistic approach. We continue to update and better technology, but at some point it will go too far. There are many computer engineers who believe that technology will begin to control us if we give it the ability to think. At some point the artificial intelligence will see us as an unnecessary organism, and as a threat to the survival of the Earth. As a result, artificial intelligence will eliminate us. So is it true that we “should think twice before you mistreat a piece of machinery — you never know if it’ll come after you for revenge later down the road” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm). As much as I would love to deny the fact that artificial intelligence could ever ‘think’ on its own, and even potentially wipe out the human race, I now realize that it’s a very real possibility. It’s a scary thought; computers control all sorts of our intelligence, including nuclear weapons. If they wanted to get rid of the human race it would be very simple. Putting boundaries on these artificial intelligence mechanisms, as mentioned, would only be beneficial momentarily, because if we’re creating them to be smarter than us, then they could simply get around them.
Musk made a very interesting comment about artificial intelligence being “a demon summoned by humanity” (http://mashable.com/2014/11/17/elon-musk-singularity/). I certainly agree with that comment. It’s something we stumbled upon and have used to better our lives, but at some point the updating of technology needs to stop. We have something whose vast dangers we won’t understand until it’s too late.
Permalink # julio said
I posted mine earlier today and it didn’t show up. I came to the conclusion that my blog post was so good that there was no way I could possibly remake a masterpiece. If you don’t believe me, God will. I read the articles and everything.
🙂
Permalink # destinysanders235 said
Singularity means that computers and robots could possibly become the brains of the world and take over human functions. “After all, computers allow us to communicate with each other, keep track of complex situations like global markets and even control the world’s most dangerous weapons.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm) This statement gives insight into how computers already have an advantage over mankind; however, they still have to be operated by humans. Therefore, it is not yet definite that computers and robotics will dominate. I personally believe that if mankind were not so reliant on technology to figure things out for us, we wouldn’t have to worry about whether technology will take over. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) This quote comes from the physicist Stephen Hawking. Technology could only take over if humans give it the power to. However, I think this particular statement is a little outrageous. I mean, who knows what could possibly happen within the next few years? It can’t be predicted. Let’s say humans don’t let robotics and computers take over; then what? It won’t be known until it happens. Is it possible technology can re-design itself at an increasing rate? Sure it is. But I can’t give a definite answer on whether it’s going to end the human race. Only time will tell. “For instance, because of that physical intuition, humans can watch a person overturn a cup of coffee and just know that the end result will be a puddle on the floor.
A computer program, on the other hand, would have to do a laborious simulation and know the exact size of the cup, the height of the cup from the surface and various other parameters to understand the outcome, Davis said.” (http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html) This quote sounds legitimate; however, isn’t the human race the one that programmed and invented this technology, therefore making us the intelligent ones? If it wasn’t for humans, how would the technology have been developed and gained the techniques that it has? I feel as if now, because computers and robotics can do the things humans want and more, we no longer have to rely on finding information and knowledge on our own. But we shouldn’t let it rule our everyday life, resulting in “artificial intelligence” ruling mankind. “‘The risk of something seriously dangerous happening is in the five year timeframe. 10 years at most,’ wrote Musk.” (http://mashable.com/2014/11/17/elon-musk-singularity/) As I stated earlier, I don’t believe it can be specifically predicted when such a thing could happen. Maybe it will, maybe it won’t; humans will never know until it happens. There are factors that can make an individual believe that “artificial intelligence” is bound to take over. Yet there are also just as many factors to support that it isn’t. Therefore, I can’t give a definite opinion on whether or not it will be the “golden age” for humanity, because who knows.
Permalink # vintagesoul12 said
What does singularity mean for us? For us, singularity means one of two things. The advancement of technology, in which it could take over and destroy mankind. Or the advancement of human intelligence, resulting in the evolution of a new, more superior species with supernatural capabilities. Whether it’s one or the other, both are equally scary, and could lead to horrible things. It’s always hard to think about the future, especially one where humans may not exist and computers run the world, or where intelligence is so far advanced that the human species as we know it is no more. “The development of full artificial intelligence could spell the end of the human race,”
“It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com) This would be the result of technology taking over, which is approaching rapidly. Technology is getting more and more advanced! It’s only a matter of time. So to answer the question “Will it be a “golden age” for humanity, the elimination of humanity, or something else?” It could go either way. But if the singularity does come to pass, we brought it on ourselves. “Humans have already relinquished many intelligent tasks, such as the ability to write, navigate, memorize facts or do calculations.” (http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html) We rely on technology for everything! And because we rely on it to do so much, of course we’re going to look for ways to make it better. Which would be fine if we knew when enough was enough, and when to leave well enough alone. A biologist compared the human race to mitochondria, which were once independent organisms that controlled all their own functions. But as time went on, they became only a main source of energy, while other parts of the cell took over the other functions. If we allow technology to become more advanced than it already is, we’ll become the energy source that isn’t really needed, and artificial intelligence will become the new powerhouse. (http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html
http://mashable.com/2014/11/17/elon-musk-singularity/
Permalink # nataesia said
Singularity can be defined as – “the point at which artificial intelligence can match, and then overtake, human smarts”.
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
I wouldn’t say it will be the end of humankind, but at this point it is one of the main reasons for unemployment, and now robots are taking over human jobs. Humankind will never end, because robots are machinery, so they can mess up and break down. So it will never be the end of humankind, but it will come at the cost of many humans losing their jobs and having to change their lifestyles.
Permalink # kiejaphillips said
Singularity will ultimately be our demise if we don’t stop while we’re ahead. “To prevent the singularity includes building in safety features before machines are able to become self aware.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm) This is completely true. If we stop while we’re ahead, we will not face the possibility of being taken over. There are multiple ways to prevent the singularity, but if we don’t prevent it, we already know it’ll become a major problem. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) Evolution of humans is a long process, and if computers are advanced enough to redesign themselves, then they’ll exceed the rate of human evolution. “Humans have already relinquished many intelligent tasks, such as the ability to write, navigate, memorize facts or do calculations …” (http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html) By giving up these skills, we have put ourselves in a mindset of dependency. If we give up critical thinking, we will only have emotions left. Being able to think critically is a gift the human race was blessed with; we shouldn’t give it up just to make our lives easier. “The leading AI companies have taken great steps to ensure safety. They recognize the danger, but believe that they can shape and control the digital superintelligences and prevent bad ones from escaping into the Internet.” (http://mashable.com/2014/11/17/elon-musk-singularity/) Information leaking onto the Internet is nothing new to us; it happens on a daily basis. Almost everything is accessible through the Internet, so by trying to make sure the “bad” superintelligences don’t escape, you’re wasting your time.
Permalink # Dean the Machine said
What does singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else?
Technological singularity means humans are obsolete to the world and machines can function without any humans. This means machines no longer need humans for repairs and can make decisions based on the situation. It may also mean that machines accomplish all the tasks of our daily lives. As of this year, the machines we create can only operate within a certain program, and with that program they can only accomplish the tasks the program allows. “Computers and robots force the human race into servitude.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm “Humans would become obsolete in the computer world. We would have created a superhuman intelligence.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm “Maybe we’ll live in a world where each person’s consciousness merges with a computer network. Or perhaps machines will accomplish all our tasks for us and let us live in luxury. But what if the machines see humans as redundant — or worse? When machines reach the point where they can repair themselves and even create better versions of themselves, could they come to the conclusion that humans are not only unnecessary, but also unwanted?” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm “And in scientific circles, a growing number of artificial intelligence experts agree that humans will eventually create an artificial intelligence that can think beyond our own capacities. This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources.
Or it could lead the artificial intelligence, or AI, to exterminate any creatures it views as competitors for control of the Earth—that would be us.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/ “At a conference at MIT in October, Musk likened improving artificial intelligence to “summoning the demon” and called it the human race’s biggest existential threat.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
This will be a golden age for humans in the short term, because many are lazy and need machines to do most things in their daily lives, but the long-term effect may be the elimination of the human race. The scariest part is that we don’t exactly know how to prevent the singularity, and we don’t know if machines with artificial intelligence will actually eliminate us. This is all speculation, but it’s not a bad idea to think about. “In fact, the future of humanity may be similar to that of mitochondria, the energy powerhouses of cells. Mitochondria were once independent organisms, but at some point, an ancestral cell engulfed those primitive bacteria, and over evolutionary history, mitochondria let cells gradually take over all the functions they used to perform, until they only produced energy.
“We’re becoming like the mitochondria. We provide the energy — we turn on the machines,” Slonczewski told LiveScience. “But increasingly, they do everything else.”
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html “We’ve covered Musk’s recent comments about the potential dangers of A.I. before, when he variously compared the threat to nuclear weapons and a “demon” summoned by humanity. But none of those comments indicated that Musk believed our downfall at the hands of A.I. would come so soon.” (http://mashable.com/2014/11/17/elon-musk-singularity/)
Permalink # michaeldarden said
What does singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else? The technological singularity is the hypothesis that accelerating progress in technology will cause a runaway effect wherein artificial intelligence exceeds human intellectual capacity and control, thus radically changing civilization in an event called “the singularity.” I don’t think this will ever happen; technology will never be that advanced. “But right now, these machines have to answer to humans. They lack the ability to make decisions outside of their programming or use intuition. Without self-awareness and the ability to extrapolate based on available information, machines remain tools.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm) They will never be able to make decisions without humans telling them what to do. They run on batteries and must be charged; we can always unplug them. “The development of full artificial intelligence could spell the end of the human race,” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) People also thought the world was going to end in 2012. People can think whatever they want; the odds of this happening are slim to none. “In his book “The Singularity is Near: When Humans Transcend Biology” (Viking, 2005), futurist Ray Kurzweil predicted that computers will be as smart as humans by 2029, and that by 2045, “computers will be billions of times more powerful than unaided human intelligence,” Kurzweil wrote in an email to LiveScience.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html). We are smarter than computers; after all, who built them? If we humans built computers, we can destroy them. “The risk of something seriously dangerous happening is in the five year timeframe. 10 years at most,” wrote Musk.
“Please note that I am normally super pro technology and have never raised this issue until recent months. This is not a case of crying wolf about something I don’t understand.” (http://mashable.com/2014/11/17/elon-musk-singularity/).
Permalink # queenbree7 said
Singularity is a very scary subject to think about, considering the outcome is most likely the extinction of the human race. I don’t think anyone looks forward to their own destruction. Reading these articles has helped me understand the beliefs and motives behind why scientists think this will occur. However, as much as it scares me, and as much evidence as there is to back it up, there’s still a part of me that can’t ever imagine this happening. Some of this disbelief comes from the fact that it’s always been talked about as a fictitious occurrence, so it’s insane that it could actually happen. I can’t even begin to visualise it. The other part of my disbelief comes from the fact that I am a person of faith. Yeah, I know: I can believe in a God that created the Earth and controls us, but I can’t believe in the advancement of technology, even though we’ve advanced exponentially since the beginning of time. Well, according to Vernor Vinge, the singularity is when “mankind will develop a superhuman intelligence” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm). In my opinion, superhuman intelligence is divine intelligence, an intelligence we just aren’t going to obtain here on earth without permission from God. People have tried to reach God and be like him in the past. We do it every day, but we’ve only gotten so far. I don’t see why and how God would just allow us to acquire a superhuman intelligence when he didn’t give us any in the first place down here on earth. There is a reason we don’t have any, and no matter how hard we try, I don’t think we’re going to get there. There are already a few roadblocks keeping us from getting there, roadblocks that show it could be physically impossible. According to the article, when handling things on a subatomic level, things begin to behave differently than expected.
An example would be “… physicists have shown that electrons can pass through extremely thin material as if the material weren’t there. They call this phenomenon electron or quantum tunneling. The electron doesn’t make a physical hole through the material — it just seemingly approaches from one side and ends up on the other. Since transistors control the flow of electrons like a valve, this becomes a problem.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm) If that happens, we aren’t getting anywhere. On the other hand, you could completely throw faith out of the equation, and yes I could see singularity occurring. According to TIME, 5 very smart people all agree it’s going to happen. They aren’t sure when, but they know for sure it will occur. What this could mean for humanity is still being questioned. According to Stephen Hawking, “(Artificial Intelligence)…would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) Nick Bostrom says that “…once machines surpass human intellect, they could mobilize and decide to eradicate humans extremely quickly using any number of strategies (deploying unseen pathogens, recruiting humans to their side or simple brute force).” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) He says the world would be full of advanced technology, but we wouldn’t be around to see it. 
He calls it: “A Disneyland without children.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) According to Bill Hibbard, the singularity might not mean the extinction of the human race, but a “golden age” in which “people won’t necessarily die (they can simply upgrade with cybernetic parts), and they could do just about anything they wanted to — provided it were physically possible and didn’t require too much energy,” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html). Some researchers and tech gurus believe the singularity is a lot closer than we’d like to imagine. Elon Musk reported: “The risk of something seriously dangerous happening is in the five year timeframe. 10 years at most…” (http://mashable.com/2014/11/17/elon-musk-singularity/) Elon Musk has worked with and invested in DeepMind, a company that works in the artificial intelligence realm. He has seen, up close and personal, the advancements that we haven’t. I think if he sees an issue, then something is very wrong, and action should be taken now if we want to keep the human race alive and working. However, as Noblitt says, we as humans never stop advancing and we won’t stop until the day we die.
Permalink # slademoss said
What does singularity mean for us?
This is a quote from the first website that outlines what the singularity is and what it would mean for humankind.
“Mankind is heading toward an irrevocable destiny in which we will evolve beyond our understanding through the use of technology. He calls it the singularity. Scientists could develop advancements in artificial intelligence (AI). Computer networks might somehow become self-aware. Computer/human interfaces become so advanced that humans essentially evolve into a new species. Biological science advancements allow humans to physically engineer human intelligence” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm
The second website says this about singularity, “humans will eventually create an artificial intelligence that can think beyond our own capacities. This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources. Or it could lead the artificial intelligence, or AI, to exterminate any creatures it views as competitors for control of the Earth—that would be us.”
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
This is what the third website has to say about singularity: “The idea of superintelligent machines may sound like the plot of “The Terminator” or “The Matrix,” but many experts say the idea isn’t far-fetched. Some even think the singularity — the point at which artificial intelligence can match, and then overtake, human smarts — might happen in just 16 years.” http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
Singularity basically means the takeover of machines and technology because they have become too advanced even for their creators (humans) to control. I don’t think this is possible; I just can’t picture something humans create ending up ruling humans. Even the most powerful and feared beasts on the planet don’t control humans (actually, they aren’t even close), so I definitely don’t see something running off of a battery pack, or something plugged into the wall, taking over humankind. Sorry, I just can’t see it. What happens when the power goes out and they don’t have enough charge left to fix the electrical outage? Well, looks like they’re out of luck then, huh? So singularity may mean something to others, but to me it means nothing, to be honest.
Will it be a “golden age” for humanity, the elimination of humanity, or something else?
Everything I have read about this topic leads to the same ending: technology taking over the planet. So it would seem the answer to this question is the elimination of humanity. The last source gives a first-person point of view on this: “The risk of something seriously dangerous happening is in the five year timeframe. 10 years at most,” wrote Musk. “Please note that I am normally super pro technology and have never raised this issue until recent months. This is not a case of crying wolf about something I don’t understand.” http://mashable.com/2014/11/17/elon-musk-singularity/
It seems it would be the exact opposite of a golden age if humankind actually created something that ultimately caused its own extinction. I don’t believe we can possibly create a robot that could rule humankind; however, I do believe humankind is dumb enough to try its best to create such robots. One way or another, the thing that destroys humankind will most likely be humans themselves, whether through these crazy, imaginary robots or through common bad decisions with huge negative impacts worldwide. I mean, heck, we already create and sell harmful products that are proven to shorten longevity, so obviously humankind as a whole has no problem with setting itself up for death.
Permalink # khadijahwatson said
Questions – what does singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else?
“Vinge proposes that mankind is heading toward an irrevocable destiny in which we will evolve beyond our understanding through the use of technology. He calls it the singularity.” (www.electronics.howstuffworks.com). This trend is related to Moore’s Law, which states that the number of transistors on a chip doubles roughly every 18 months, which means computers evolve at a much quicker rate than humans.
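The compounding behind that 18-month doubling can be made concrete with a small back-of-the-envelope sketch. Note the starting point (Intel’s 4004 with roughly 2,300 transistors in 1971) is an illustrative assumption I have added; only the 18-month doubling period comes from the comment above.

```python
# Rough sketch: transistor count under a fixed 18-month doubling period.
# The base year and count (Intel 4004, 1971, ~2,300 transistors) are
# assumed for illustration and do not come from the assigned articles.

def transistors(year, base_year=1971, base_count=2300, doubling_months=18):
    """Estimate transistor count for a given year under steady doubling."""
    months_elapsed = (year - base_year) * 12
    return base_count * 2 ** (months_elapsed / doubling_months)

# Doubling every 18 months compounds to roughly 100x per decade,
# which is why transistor growth outpaces biological evolution so badly.
ratio = transistors(1981) / transistors(1971)
print(f"Growth over one decade: ~{ratio:.0f}x")
```

Run as written, this reports a growth factor of about 100x per decade; with the 24-month doubling period Moore’s Law is also often quoted with, the same calculation would give roughly 32x instead.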
“Vinge says that at this rate, it’s only a matter of time before humans build a machine that can “think” like a human. This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources. Or it could lead the artificial intelligence, or AI, to exterminate any creatures it views as competitors for control of the Earth—that would be us.” (www.time.com). In my opinion, IF this were to happen, I believe it would not mean our destruction but only our golden age.
“While AI can trounce the best chess or Jeopardy player and do other specialized tasks, it’s still light-years behind the average 7-year-old in terms of common sense, vision, language and intuition about how the physical world works, Davis said.” (www.livescience.com). Computers still lack an important component: a basic understanding of how the real world works, which is not all numbers and perfect squares. For example, humans can type the codes of blurred numbers and words used to secure accounts, but computers cannot read them.
“The leading AI companies have taken great steps to ensure safety. They recognize the danger.” (www.mashable.com). As long as the science world takes these steps, our world should be safe, even as we weave technology further into our lives. This could be an even greater age for humans!
Permalink # clintonreel16 said
The singularity is the point in time when artificial intelligence (AI) can match, and then overtake, human intelligence (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html). I don’t think the singularity is going to be as big of a problem as everyone thinks. As the HowStuffWorks article said, transistors can only get so much smaller before quantum physics comes into play (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm). I do not, however, think we should just ignore the problem, because we will most likely get past the quantum physics issue, since our understanding of things is always increasing thanks to advancements in technology. It should also not be ignored because so many really smart people, especially people in the AI and computer industry, say the singularity could be very dangerous, even more dangerous than nuclear weapons (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/). But even with these dangers, I feel safe from the singularity. The reason is that companies making improvements in AI technology understand the threat AI can pose to humanity, and are putting safeguards in place for that very reason (http://mashable.com/2014/11/17/elon-musk-singularity/).
Permalink # hopehayes said
The singularity stems from the idea that “a growing number of artificial intelligence experts agree that humans will eventually create an artificial intelligence that can think beyond our own capacities.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
I personally think that technology will not take over the human race. Humans are the ones who designed these products, so they should be able to stop them. Although technology has taken humans’ jobs and such, it will not end humanity.
Permalink # marionwhitely said
When I was reading these articles, I felt like I was in the middle of the sci-fi movie I, Robot. Thinking that one day we could possibly be replaced by artificial intelligence is almost impossible for me to believe. “Some even think the singularity — the point at which artificial intelligence can match, and then overtake, human smarts — might happen in just 16 years.” http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html. “Vinge says that at this rate, it’s only a matter of time before humans build a machine that can “think” like a human.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm. “Vinge proposes that mankind is heading toward an irrevocable destiny in which we will evolve beyond our understanding through the use of technology. He calls it the singularity.”
• Scientists could develop advancements in artificial intelligence (AI).
• Computer networks might somehow become self-aware.
• Computer/human interfaces become so advanced that humans essentially evolve into a new species.
• Biological science advancements allow humans to physically engineer human intelligence.
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
“But hardware is only part of the equation. Before artificial intelligence becomes a reality, someone will have to develop software that will allow a machine to analyze data, make decisions and act autonomously. If that happens, we can expect to see machines begin to design and build even better machines. These new machines could build faster, more powerful models.” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
I think the singularity means that the human race, if not careful, will replace the need for human labor. “A growing number of artificial intelligence experts agree that humans will eventually create an artificial intelligence that can think beyond our own capacities. This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources. Or it could lead the artificial intelligence, or AI, to exterminate any creatures it views as competitors for control of the Earth—that would be us.” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
“Some believe in a utopian future, in which humans can transcend their physical limitations with the aid of machines. But others think humans will eventually relinquish most of their abilities and gradually become absorbed into artificial intelligence (AI)-based organisms, much like the energy making machinery in our own cells.” http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
The “golden age” for humanity will be another excuse for man to invent artificial intelligence to aid everyday practices or improve how we perform complicated procedures, but ultimately it’s humans who have to create this intelligence. We are inventing ourselves out of jobs. “We’re becoming like the mitochondria. We provide the energy — we turn on the machines,” Slonczewski told LiveScience. “But increasingly, they do everything else.” http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
“‘The pace of progress in artificial intelligence (I’m not referring to narrow AI) is incredibly fast,’ said Musk. Still, whether you believe Musk’s perspective is rooted in reality, or in too many science fiction novels, the very fact that the topic is now being discussed so seriously in the science and technology communities is a telling turn of events.” http://mashable.com/2014/11/17/elon-musk-singularity/
Permalink # kylethompson98 said
So, your favorite sci-fi movie where the evil robots take over the world and humanity is either wiped out or turned into “cyborgs” is coming true? Well, according to some of these websites, we could actually be the first guinea-pig cyborgs (let’s hope not). Initially I’m shocked to hear how many people (top-rated scientists and technology creators) actually believe that it is very close to happening; some think as little as five years, others maybe a hundred. Either way, that’s too close for comfort. So, with that being said, let’s start. “But hardware is only part of the equation. Before artificial intelligence becomes a reality, someone will have to develop software that will allow a machine to analyze data, make decisions and act autonomously. If that happens, we can expect to see machines begin to design and build even better machines. These new machines could build faster, more powerful models.”
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm
This will be the breaking point. It will kick off the start of the singularity. The minute machines start building other machines without any interference from us will spark the end of humanity rather than our “golden age.” And like it says, the next machine will build a stronger machine, and then a stronger one, and so on. “He argues that intelligent beings are innately driven toward gathering resources and achieving goals, which would inevitably put a super-smart AI in competition with humans”
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
Competition can have positive effects in some cases, but this is definitely not one of them. Think about it: it would literally create a war between us and the machines. Both sides would be competing to come out on top, but in the end, once the machines have become smart enough to make stronger versions of themselves, it will be no use. “The past two singularities — the Agricultural and Industrial revolutions — led to a doubling in economic productivity every 1,000 and 15 years.”
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
Now, on the flip side, if this singularity actually were beneficial, it would be amazing. This is literally the only pathway I see that could lead us to our “golden age”.
“it is growing at a pace close to exponential.”
http://mashable.com/2014/11/17/elon-musk-singularity/
OK, now this is kind of bad. If that’s the case, then seeing the singularity in my lifetime could be quite possible. Exponential growth is crazy to think about: technology would have to be advancing just about as fast as we put new stuff on the shelves. Well, I guess in some ways it really is. So, in a nutshell, I think this singularity will be more negative than positive. As soon as the machines become sentient and believe that they are dominant, that will be it.
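The doubling-time figures quoted above (roughly 1,000 years for the Agricultural Revolution, 15 for the Industrial) can be made concrete with a quick sketch; the numbers here are illustrative only, not from the articles:

```python
# Illustrative only: what a fixed doubling time implies over a century.
def growth_factor(years: float, doubling_time: float) -> float:
    """Total output multiplier after `years` if output doubles every `doubling_time` years."""
    return 2 ** (years / doubling_time)

for label, dt in [("Agricultural (~1,000-year doubling)", 1000),
                  ("Industrial (~15-year doubling)", 15)]:
    print(f"{label}: x{growth_factor(100, dt):.1f} over 100 years")
```

Over the same 100 years, a 1,000-year doubling time multiplies output by only about 1.07, while a 15-year doubling time multiplies it by roughly 100, which is why the shrinking doubling times the articles describe matter so much.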
Permalink # tmc18 said
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html “Humans have already relinquished many intelligent tasks, such as the ability to write, navigate, memorize facts or do calculations”
I believe that singularity is already showing small signs of taking effect on the human race. We as humans have already slowly allowed ourselves to get lazy, to depend on objects to meet our simple needs (as mentioned above).
We have slowly allowed our control to slip away from our grasp; willingly we put it into the hands of intelligent objects that hold no form of human awareness. http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm “They lack the ability to make decisions outside of their programming or use intuition.”
They lack the ability to feel, to understand, to sympathize; not having these handicaps leaves them free, free to do as they please without any remorse. Singularity is an outcome we should not wish for; it will not be our salvation but rather our destruction. “The physical extinction of the human race is one possibility,” http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
Permalink # kia said
What does singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else? To start off, singularity is a hypothetical moment in time when artificial intelligence and other technologies have become so advanced that humanity undergoes a dramatic and irreversible change, meaning that we’ve built something so advanced that it has mentally gotten ahead of us. It’s a common theme in science fiction and in many Hollywood movies such as I, Robot and Destruction. “Mankind struggles to survive in a dystopian futuristic society. Scientists discover too late that their machines are too powerful to control. Computers and robots force the human race into servitude. [The] idea seems to gain more credence every year. Could machines replace humans as the dominant force on the planet?” http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm It’s already happening with everyday jobs: there’s no need for a human to assist you on the phone when a robot directs your calls, and no need for people to work in mills when robots assemble everything. Scientists could develop advancements in artificial intelligence, so in actuality we could blame scientists for the end of the world and the destruction of mankind.
Permalink # manningrebekah said
I think mine went through and it’s awaiting moderation… At least I hope so.
Permalink # Maddie (@GingerMaddie722) said
I really hope this works lol…
Singularity is ” a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand. ” (http://mashable.com/2014/11/17/elon-musk-singularity/) I don’t think I like singularity. Technological advances are cool but I don’t want everything to be computerized. Even though Google is much faster and convenient, the nerd in me is still just as content opening a book when I need information. In some cases there are computerized countertops that can like do stuff for you as if you aren’t capable of moving your own hands and doing them yourselves. I don’t know much about that though. I think that singularity takes away our humanity completely. We might not be dead but what’s the point of living if we aren’t even thinking for ourselves or doing things on our own?
“While AI can trounce the best chess or Jeopardy player and do other specialized tasks, it’s still light-years behind the average 7-year-old in terms of common sense, vision, language and intuition about how the physical world works, Davis said.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) Why do we want to teach/program a computer to do something that we are perfectly capable of doing ourselves? Sometimes I feel like we spend so much time anticipating and trying to develop “the next big thing” that there’s no way we can ever learn to appreciate what we have. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) Stephen Hawking suggests that humans could cease to exist with these types of advances. I think that could be true based on how quickly computers respond… most of the time.
“Even if we find a way to build transistors on a scale of just a few nanometers, they wouldn’t necessarily work.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm) New technology annoys me in general for this very reason… we have iPads so small they might as well be phones and other times we have iPhones and Galaxy phones that are so big they should just be tablets. I obviously don’t think we should go back to mega-dinosaur computers, but do we REALLY need Apple watches?
I can’t remember which source or I’d cite it, but I think they’re wrong to compare this to the Agricultural and Industrial Revolutions. Those were different situations; they were pretty much necessary for sustaining human life, and they were good for the economy. Singularity would boost the economy because everyone has to have the latest and greatest technology, BUT if this technology is just going to be robots that wipe out the human race, does it really matter that it would boost the economy? Those revolutions allowed us to be more efficient with farming and the production of consumer goods, for the most part. This type of revolution is literally allowing us to be as lazy as possible, which in itself could kill us for biological reasons, like literally never moving a muscle.
I don’t see singularity as the beginning of a “golden age” but I don’t exactly see it as the end of the human race in other ways… Honestly I just see it as unnecessary.
Permalink # marionwhitely said
I just emailed mine to him just in case it didn’t show up on here
Permalink # alexandrashines said
14 people did this 😂
Permalink # marionwhitely said
When I was reading these articles I felt like I was in the middle of the sci-fi movie I, Robot. Thinking that one day we could possibly be replaced by artificial intelligence is almost impossible for me to believe. Some even think the singularity — the point at which artificial intelligence can match, and then overtake, human smarts — might happen in just 16 years. http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html Vinge says that at this rate, it’s only a matter of time before humans build a machine that can “think” like a human. http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm Vinge proposes that mankind is heading toward an irrevocable destiny in which we will evolve beyond our understanding through the use of technology. He calls it the singularity.
· Scientists could develop advancements in artificial intelligence (AI).
· Computer networks might somehow become self-aware.
· Computer/human interfaces become so advanced that humans essentially evolve into a new species.
· Biological science advancements allow humans to physically engineer human intelligence.
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
But hardware is only part of the equation. Before artificial intelligence becomes a reality, someone will have to develop software that will allow a machine to analyze data, make decisions and act autonomously. If that happens, we can expect to see machines begin to design and build even better machines. These new machines could build faster, more powerful models. http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
I think singularity means that the human race, if not careful, will eliminate the need for human labor. A growing number of artificial intelligence experts agree that humans will eventually create an artificial intelligence that can think beyond our own capacities. This moment, called the singularity, could create a utopia in which robots automate common forms of labor and humans relax amid bountiful resources. Or it could lead the artificial intelligence, or AI, to exterminate any creatures it views as competitors for control of the Earth—that would be us. http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
Some believe in a utopian future, in which humans can transcend their physical limitations with the aid of machines. But others think humans will eventually relinquish most of their abilities and gradually become absorbed into artificial intelligence (AI)-based organisms, much like the energy making machinery in our own cells. http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
The “golden age” for humanity will be another excuse for man to invent artificial intelligence to aid in everyday practices or to improve how we perform complicated procedures, but ultimately it’s the humans who have to create this intelligence. We are inventing ourselves out of jobs. “We’re becoming like the mitochondria. We provide the energy — we turn on the machines,” Slonczewski told LiveScience. “But increasingly, they do everything else.” http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
“The pace of progress in artificial intelligence (I’m not referring to narrow AI) is incredibly fast,” said Musk. Still, whether you believe Musk’s perspective is rooted in reality, or in too many science fiction novels, the very fact that the topic is now being discussed so seriously in the science and technology communities is a telling turn of events. http://mashable.com/2014/11/17/elon-musk-singularity/
Permalink # Rameez said
DEMOOOOOOO
Permalink # noblittisaselfawarerobot said
This is Dylan Riley.
Since WordPress thinks it’s hilarious for me to write a decent blog post and then delete it when I try to post it, I have to rewrite it. It just makes me so excited to redo a blog post that won’t be nearly as well written. Singularity: I personally think it’s an insanely idiotic idea. As cool as it would be to have robot overlords, because let’s face it, people are horrible, it’s so incredibly far-fetched and ridiculous that it physically hurts me to think that people actually believe we’ll be overtaken by robots. I like The Matrix as much as the next person, but I think it’s just dumb.
In the first article, it gives the definition of, well, a skewed definition of, singularity. It says that the singularity is robots taking over the world and enslaving humans. Still, a pretty stupid idea. (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm)
The second article is six very intelligent and well-respected people talking about how they all believe that robots will take us over one day. The problem with this is that just because smart people say something is so, it doesn’t mean it is. Harrison Ford said that the Bush administration was a good one, but that’s obviously a lie. Stephen Hawking said that we would be living as one with aliens, knowing that they’re aliens. I personally don’t know any aliens, unless you count illegal immigrants, but that’s an argument for another time. (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
In the third article, it says that scientists believe the singularity will happen within the 21st century. It also gives a partial argument for the other side, which doesn’t believe in the singularity. One point it brings up is that robots have no physical intuition. If someone sees a cup of water falling to the floor, we know that we’re going to end up with water everywhere, whereas a robot would have to run countless simulations, full of measurements of everything, just to know that water will end up on the floor. So, good job, Noblitt. You gave a somewhat non-biased article for once. Keep it up. (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
In the last article, it talks about Elon Musk saying that the singularity is close. One company it brings up is DeepMind, an AI development company, and its overwhelming accomplishments. However, if you read their publications (http://deepmind.com/publications.html), you find that all there really is are documents stating theories of what robots could do based on psychological processes in humans. So, unless they keep all of their results under lock and key, there isn’t much proof. Of course, you’d have to be doing some pretty amazing work to have Google as a business partner. (http://mashable.com/2014/11/17/elon-musk-singularity/)
Bottom line, the singularity is a pretty idiotic idea. Of course, living life like a Jonathan Coulton song would be pretty amazing. (props to anyone who knows about JoCo and his music)
Permalink # geraldinmartinez015 said
what does singularity mean for us? Will it be a “golden age” for humanity, the elimination of humanity, or something else?
I think that right now, for us, singularity means a point at which a function takes an infinite value.
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
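For reference, the mathematical sense of “singularity” mentioned above, a point where a function takes an infinite value, is easy to illustrate; f(x) = 1/x is a standard textbook example, not something from the assigned articles:

```python
# The mathematical sense of "singularity": a point where a function blows up.
# f(x) = 1/x has a singularity at x = 0; the closer x gets to 0, the larger f(x) grows.
def f(x: float) -> float:
    return 1 / x

for x in [1.0, 0.1, 0.001]:
    print(f"f({x}) = {f(x):g}")
```

The technological usage borrows this image: a point past which the usual rules stop applying.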
Scientists are realizing that computers and robots (technology) are becoming too powerful for human intelligence to control. Some people believe that humanity will end later on because technology will take over; however, others believe that the elimination of humanity will never happen, because without people this world could not keep on going.
A lot of people believe that tech has already reached the point where it’s taking over, according to the article. I do agree with the statement that says robots are becoming more and more powerful with each day that goes by.
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
This article states some good points.
“On the list of doomsday scenarios that could wipe out the human race, super-smart killer robots rate pretty high in the public consciousness.”
This is a very interesting quote… I’m sure this comes to mind for some people: how robots could become so smart, to the point where they kill.
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
According to this article, robots will take over by 2100. This is somewhat hard to believe, but it could be true; given the direction this world is heading, it’s more likely than not.
http://mashable.com/2014/11/17/elon-musk-singularity/
At the end of the day, we’re just going to have to wait and see what happens, because we can’t do anything to change what’s happening except change ourselves and the way we learn things.
Permalink # jedwebb said
Singularity can mean a lot of things for the human race. It could be a time of prosperity and groundbreaking advances in technology, or it could, in reality, be the end of the human race. A very real possibility is that when AIs are developed and built, they will be able to see that humans are really nothing more than parasites on the planet; all we do is take without adding anything. It’s a possibility that AIs would then seek the destruction of the human race. On the other hand, if AIs are mass-produced, we could in turn boost our economies greatly, with more efficient and better-quality work being done by machines that don’t make mistakes. While we can currently only speculate as to what the introduction of AIs into our world will bring, we can be certain that they will change the world as we know it.
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html
http://mashable.com/2014/11/17/elon-musk-singularity/
Permalink # jedwebb said
help, keeps saying duplicate comment
Permalink # gingerrrmaddie722 said
Hopefully it’ll go through this time…
Singularity is ” a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand. ” (http://mashable.com/2014/11/17/elon-musk-singularity/) I don’t think I like singularity. Technological advances are cool but I don’t want everything to be computerized. Even though Google is much faster and convenient, the nerd in me is still just as content opening a book when I need information. In some cases there are computerized countertops that can like do stuff for you as if you aren’t capable of moving your own hands and doing them yourselves. I don’t know much about that though. I think that singularity takes away our humanity completely. We might not be dead but what’s the point of living if we aren’t even thinking for ourselves or doing things on our own?
“While AI can trounce the best chess or Jeopardy player and do other specialized tasks, it’s still light-years behind the average 7-year-old in terms of common sense, vision,
language and intuition about how the physical world works, Davis said.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) Why do we want to teach/program a computer to do something that we are perfectly capable of doing ourselves? Sometimes I feel like we spend so much time anticipating and trying to develop “the next big thing” that there’s no way we can ever learn to appreciate what we have. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) Stephen Hawking suggests that humans could cease to exist with these type of advances. I think that could be true based on how quickly computers respond….. most of the time.
“Even if we find a way to build transistors on a scale of just a few nanometers, they wouldn’t necessarily work.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm) New technology annoys me in general for this very reason… we have iPads so small they might as well be phones and other times we have iPhones and Galaxy phones that are so big they should just be tablets. I obviously don’t think we should go back to mega-dinosaur computers, but do we REALLY need Apple watches?
I can’t remember which source or I’d cite it, but I think they’re wrong for comparing this to the Agricultural and Industrial Revolutions. Those were different situations. They were pretty much necessary for sustaining human life. They were good for the economy. Singularity would increase the economy because everyone has to have the latest and greatest technology BUT if this technology is just going to be robots that wipes out the human race, does it really matter that it would boost the economy? Those types of revolutions allowed us to be more efficient with farming and production of consumer goods, for the most part. This type of revolution is literally allowing us to be as lazy as possible, which in itself could kill us for biological reasons like literally never moving a muscle.
I don’t see singularity as the beginning of a “golden age” but I don’t exactly see it as the end of the human race in other ways… Honestly I just see it as unnecessary.
Permalink # Maddie Moore said
Hopefully it’ll work this time…
Singularity is ” a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand. ” (http://mashable.com/2014/11/17/elon-musk-singularity/) I don’t think I like singularity. Technological advances are cool but I don’t want everything to be computerized. Even though Google is much faster and convenient, the nerd in me is still just as content opening a book when I need information. In some cases there are computerized countertops that can like do stuff for you as if you aren’t capable of moving your own hands and doing them yourselves. I don’t know much about that though. I think that singularity takes away our humanity completely. We might not be dead but what’s the point of living if we aren’t even thinking for ourselves or doing things on our own?
“While AI can trounce the best chess or Jeopardy player and do other specialized tasks, it’s still light-years behind the average 7-year-old in terms of common sense, vision,
language and intuition about how the physical world works, Davis said.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) Why do we want to teach/program a computer to do something that we are perfectly capable of doing ourselves? Sometimes I feel like we spend so much time anticipating and trying to develop “the next big thing” that there’s no way we can ever learn to appreciate what we have. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) Stephen Hawking suggests that humans could cease to exist with these type of advances. I think that could be true based on how quickly computers respond….. most of the time.
“Even if we find a way to build transistors on a scale of just a few nanometers, they wouldn’t necessarily work.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm) New technology annoys me in general for this very reason… we have iPads so small they might as well be phones and other times we have iPhones and Galaxy phones that are so big they should just be tablets. I obviously don’t think we should go back to mega-dinosaur computers, but do we REALLY need Apple watches?
I can’t remember which source or I’d cite it, but I think they’re wrong for comparing this to the Agricultural and Industrial Revolutions. Those were different situations. They were pretty much necessary for sustaining human life. They were good for the economy. Singularity would increase the economy because everyone has to have the latest and greatest technology BUT if this technology is just going to be robots that wipes out the human race, does it really matter that it would boost the economy? Those types of revolutions allowed us to be more efficient with farming and production of consumer goods, for the most part. This type of revolution is literally allowing us to be as lazy as possible, which in itself could kill us for biological reasons like literally never moving a muscle.
I don’t see singularity as the beginning of a “golden age” but I don’t exactly see it as the end of the human race in other ways… Honestly I just see it as unnecessary.
Permalink # Maddie Moore said
Singularity is ” a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand. ” (http://mashable.com/2014/11/17/elon-musk-singularity/) I don’t think I like singularity. Technological advances are cool but I don’t want everything to be computerized. Even though Google is much faster and convenient, the nerd in me is still just as content opening a book when I need information. In some cases there are computerized countertops that can like do stuff for you as if you aren’t capable of moving your own hands and doing them yourselves. I don’t know much about that though. I think that singularity takes away our humanity completely. We might not be dead but what’s the point of living if we aren’t even thinking for ourselves or doing things on our own?
“While AI can trounce the best chess or Jeopardy player and do other specialized tasks, it’s still light-years behind the average 7-year-old in terms of common sense, vision, language and intuition about how the physical world works, Davis said.” (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) Why do we want to teach/program a computer to do something that we are perfectly capable of doing ourselves? Sometimes I feel like we spend so much time anticipating and trying to develop “the next big thing” that there’s no way we can ever learn to appreciate what we have. “It would take off on its own and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/) Stephen Hawking suggests that humans could cease to exist with these types of advances. I think that could be true, based on how quickly computers respond… most of the time.
“Even if we find a way to build transistors on a scale of just a few nanometers, they wouldn’t necessarily work.” (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm) New technology annoys me in general for this very reason… we have iPads so small they might as well be phones and other times we have iPhones and Galaxy phones that are so big they should just be tablets. I obviously don’t think we should go back to mega-dinosaur computers, but do we REALLY need Apple watches?
I can’t remember which source or I’d cite it, but I think they’re wrong for comparing this to the Agricultural and Industrial Revolutions. Those were different situations: they were pretty much necessary for sustaining human life, and they were good for the economy. Singularity would boost the economy because everyone has to have the latest and greatest technology, BUT if this technology is just going to be robots that wipe out the human race, does it really matter that it would boost the economy? Those revolutions allowed us to be more efficient with farming and the production of consumer goods, for the most part. This type of revolution is literally allowing us to be as lazy as possible, which in itself could kill us for biological reasons, like literally never moving a muscle.
I don’t see singularity as the beginning of a “golden age” but I don’t exactly see it as the end of the human race in other ways… Honestly I just see it as unnecessary.
Permalink # itaylor068 said
What does singularity mean for us?
” a hypothetical future moment where software becomes self-aware and smart beyond our capacity to understand.”
http://mashable.com/2014/11/17/elon-musk-singularity/
Will it be a “golden age” for humanity, the elimination of humanity, or something else?
I think that it will cause the elimination of humanity.
“Before artificial intelligence becomes a reality, someone will have to develop software that will allow a machine to analyze data, make decisions and act autonomously…Humans would become obsolete in the computer world…In short, we would reach the singularity.”
http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity1.htm
For singularity to actually happen, we would have to develop a way for it to. I see no reason in doing so, for it will have a very negative effect. Technology already controls the human race so much already, why increase its influence?
“’The development of full artificial intelligence could spell the end of the human race,’” the world-renowned physicist told the BBC.”
http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/
This was a quote from Stephen Hawking. His thoughts on artificial intelligence correspond with mine. He states that the human race is limited by slow biological evolution and that we basically have no chance when competing against artificial intelligence.
“‘We’re becoming like the mitochondria. We provide the energy — we turn on the machines,” Slonczewski told LiveScience. “But increasingly, they do everything else.’”
http://m.livescience.com/29379-intelligent-robots-will-overtake-humans.html
I like this quote. It basically sums up the singularity situation. WE control it happening. There needs to be a limit on how far we can go with technology. Without humans, technology will never reach the singularity extreme, so why let it?
(hope this posts, if not, I sent it to your email Noblitt)
Permalink # candersonbaseball said
To the question of what singularity means, and whether it will be a “golden age” for humanity, the elimination of humanity, or something else: to me, it means an animal or a human becoming one with him/herself.
The first article (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity.htm) says that computer technology advances at a faster rate than many other technologies. This is true, but it will come to a point where a computer will overrule the ruler himself. In the second article (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/), Stephen Hawking states it could mean “the end of the human race.” Hawking is a very, very gifted man, but I don’t agree with his statement. Computers only help us, and they do affect our way of life, but again, it will never be the end of the human race.
The third article (http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html) states that some scientists think we are already in the midst of the singularity. Computers are not even close to the human brain yet, so for them to make that statement is ridiculous! The last article (http://mashable.com/2014/11/17/elon-musk-singularity/) states that Musk’s prediction is that this tech event will come to pass sometime around 2045. Computers are getting faster and better, but computers serve the humans!
My conclusion is that the ruler will not create something that can overrule him. Computers will get smarter, but not to the point where they are as capable as the human brain. All these scientists have their own views, but I have mine.
Permalink # zacht27 said
Regarding the content of each article:
Although the first article presents a valuable summary of the concept of the singularity, it seems to be influenced by the misconception that any robot can become self-aware. (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm)
I agree with Elon Musk’s statement that AI could become more dangerous than nuclear weapons as well as with James Barrat’s statement that superior AI could swiftly eradicate the human race. (http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/)
I feel compelled to agree with Davis in the third article that technology isn’t nearly advanced enough to predict a singularity within this century. “While AI can trounce the best chess or Jeopardy player and do other specialized tasks, it’s still light-years behind the average 7-year-old in terms of common sense, vision, language and intuition about how the physical world works, Davis said.” However, I don’t particularly like the way the third article presented the endosymbiotic theory as settled fact, since there are multiple theories as to the evolution of eukaryotic cells and the theory isn’t scientific consensus in the way that micro- and macroevolution are.
(http://www.livescience.com/29379-intelligent-robots-will-overtake-humans.html)
(http://www.evolutionnews.org/2012/01/on_the_origin_o054891.html)
Regarding the fourth article, Musk’s comment makes the point that leading AI corporations don’t necessarily share the depth of their accomplishments. This alone causes me to consider artificial self-awareness more of a possibility, but I still also consider the idea that AI will only be what we program it to be, as well as the idea introduced in the first article that physical limits might produce an impassable barrier between us and truly conscious AI with the same human liberties of thought (though I feel somewhat less strongly about the latter). (http://electronics.howstuffworks.com/gadgets/high-tech-gadgets/technological-singularity2.htm)
“The pace of progress in artificial intelligence (I’m not referring to narrow AI) is incredibly fast,” said Musk. “Unless you have direct exposure to groups like Deepmind, you have no idea how fast — it is growing at a pace close to exponential.” (http://mashable.com/2014/11/17/elon-musk-singularity/)