Quote of the Day
The world got enamored with smartphones and tablets, but what's interesting is those devices don't do everything that needs to be done. Three-D printing, virtual-reality computing, robotics are all controlled by PCs.
I have to admit it: I'm not a huge fan of the cloud computing concept.
Cloud computing offers individuals access to data and applications from nearly any point of access to the Internet, offers businesses a whole new way to cut costs for technical infrastructure, and offers big computer companies a potentially giant market for hardware and services.
If someone asks me what cloud computing is, I try not to get bogged down with definitions. I tell them that, simply put, cloud computing is a better way to run your business.
Computing should be taught as a rigorous - but fun - discipline covering topics like programming, database structures, and algorithms. That doesn't have to be boring.
One reason you should not use web applications to do your computing is that you lose control.
Information theory began as a bridge from mathematics to electrical engineering and from there to computing.
I can see a day soon where you'll create your own college degree by taking the best online courses from the best professors from around the world - some computing from Stanford, some entrepreneurship from Wharton, some ethics from Brandeis, some literature from Edinburgh - paying only the nominal fee for the certificates of completion.
In computing, turning the obvious into the useful is a living definition of the word 'frustration'.
Computing is not about computers any more. It is about living.
They don't call it the Internet anymore, they call it cloud computing. I'm no longer resisting the name. Call it what you want.
By 2020, most home computers will have the computing power of a human brain. That doesn't mean that they are brains, but it means that in terms of raw processing, they can process bits as fast as a brain can. So the question is, how far behind that is the development of a machine that's as smart as we are?
In 2000, when my partner Ben Horowitz was CEO of the first cloud computing company, Loudcloud, the cost of a customer running a basic Internet application was approximately $150,000 a month.
The interesting thing about cloud computing is that we've redefined cloud computing to include everything that we already do.
I suppose many people will continue moving towards careless computing, because there's a sucker born every minute.
A typical smart phone has more computing power than Apollo 11 when it landed a man on the moon.
In the practical world of computing, it is rather uncommon that a program, once it performs correctly and satisfactorily, remains unchanged forever.
Technological innovation has dramatically lowered the cost of computing, making it possible for large numbers of consumers to own powerful new technologies at reasonably low prices.
The way Moore's Law occurs in computing is really unprecedented in other walks of life. If the Boeing 747 obeyed Moore's Law, it would travel a million miles an hour, it would be shrunken down in size, and a trip to New York would cost about five dollars. Those enormous changes just aren't part of our everyday experience.
Mobile communications and pervasive computing technologies, together with social contracts that were never possible before, are already beginning to change the way people meet, mate, work, war, buy, sell, govern and create.
The most important application of quantum computing in the future is likely to be a computer simulation of quantum systems, because that's an application where we know for sure that quantum systems in general cannot be efficiently simulated on a classical computer.
If you think about computing, there isn't just one way to compute, just like there's not just one way to move around. You can have shoes, you can have a car, you can have a bicycle, submarine, rocket, plane, train, glider, whatever. Because you have one doesn't mean you get rid of another one... But PCs continue to be important.
Computing is kind of a mess. Your computer doesn't know where you are. It doesn't know what you're doing. It doesn't know what you know.
I wouldn't be surprised if tomorrow was the Final Dawn, the last sunrise before the Earth and Sun are reshaped into computing elements.
Computer games tend to be boys' games, warlike games with more violence. We have not spent enough time thinking through how to encourage more girls to be involved in computing before coming to college so they can see a possible career in information technology.
© 2001 - 2015 BrainyQuote