Jeff Bezos (Amazon CEO) Quotes:
“It is a renaissance, it is a golden age. We are now solving problems with machine learning and artificial intelligence that were … in the realm of science fiction for the last several decades. And natural language understanding, machine vision problems, it really is an amazing renaissance.”
“Artificial Intelligence and Machine Learning ... will empower and improve every business, every government organization, every philanthropy … there is not an institution in the world that cannot be improved with Machine Learning.”
“[A] lot of the value that we’re getting from machine learning is actually happening beneath the surface. It is things like improved search results. Improved product recommendations for customers. Improved forecasting for inventory management. Literally hundreds of other things beneath the surface.”
Sundar Pichai (Google CEO) Quotes:
"AI holds the potential for some of the biggest advances we are going to see.”
“In an ‘AI first’ world we are rethinking all our products and applying Machine Learning and AI to solve user problems. We are doing that across every one of our products.”
"AI is one of the most important things humanity is working on. It is more profound than, (I dunno,) electricity or fire"
"Well, it kills people, too," Pichai says of fire. "We have learned to harness fire for the benefits of humanity but we had to overcome its downsides too. So my point is, AI is really important, but we have to be concerned about it."
Elon Musk (Tesla, SpaceX CEO) Quotes:
"AI is a fundamental risk to the existence of human civilization.”
"AI will be the best or worst thing ever for humanity"
"If one company or small group of people manages to develop god-like superintelligence, they could take over the world,"
"We are rapidly heading towards digital superintelligence that far exceeds any human, I think it's very obvious."
"We have five years. I think digital superintelligence will happen in my lifetime, 100%."
"AI doesn't have to be evil to destroy humanity — if AI has a goal and humanity just happens in the way, it will destroy humanity as a matter of course without even thinking about it, no hard feelings. It's just like if we're building a road and an anthill happens to be in the way, we don't hate ants, we're just building a road, and so goodbye anthill."
"Governments don't need to follow normal laws. They will obtain AI developed by companies at gunpoint, if necessary."
Stephen Hawking (Physicist, Futurologist) Quotes:
"The development of full artificial intelligence could spell the end of the human race."
"It would take off on its own, and re-design itself at an ever increasing rate."
"Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."
"Success in creating effective AI, could be the biggest event in the history of our civilization. Or the worst. We just don't know. So we cannot know if we will be infinitely helped by AI, or ignored by it and side-lined, or conceivably destroyed by it.”
"Unless we learn how to prepare for, and avoid, the potential risks, AI could be the worst event in the history of our civilization. It brings dangers, like powerful autonomous weapons, or new ways for the few to oppress the many. It could bring great disruption to our economy."
"I am an optimist and I believe that we can create AI for the good of the world. That it can work in harmony with us. We simply need to be aware of the dangers, identify them, employ the best possible practice and management, and prepare for its consequences well in advance."