"The concept of machine learning is simple in itself: it's about systems and applications that analyze the results of their actions and adjust the next operation accordingly. So we get systems that go through a sort of learning process, without the need for further human intervention. The system collects data, checks the information against a certain objective and takes it into account in a subsequent action. But even though the concept may seem simple, in practice it's anything but. And yet we're already using it. The smartphone that recognizes your fingerprint, for instance, uses a simple form of machine learning. The software stores a table of 30 to 50 measurements per finger. When you log in, it looks for a correspondence in this table. That's not really self-learning, but learned behavior. A smart thermostat is a little more complex: it 'learns' by taking account of the days of the week and the time when you come home. Connecting the system to your diary or other data sources – traffic information, for instance – makes it smarter."
"For a machine to learn independently, a computer needs to imitate the human brain. The IBM Watson project was a first step in this direction, based on an analysis of human language and millions of documents. When asked a question (“Why does a cat always land on its feet?”), the computer model can find the right answer by analyzing all these documents and texts. But it takes 90 servers, 2,880 processor threads and 16 terabytes of RAM to go through 200 million pages of text in one second. IBM is now going a step further by building chips that simulate the working of the brain. Its TrueNorth chip currently contains 4,096 cores that implement 1 million neurons and 256 million synapses on 5.4 billion transistors. These chips can be placed in networks, the aim being to build 4,096 chips into one rack, with 4 billion neurons and 1 trillion synapses altogether. By way of comparison: the human brain has around 100 billion neurons and 100 trillion synapses. Roughly put, at the moment IBM can ‘pour’ 1 to 5% of the brain into a computer. To program these chips, the company developed a new programming language, new algorithms and new program libraries."
"If we really do achieve machine learning, the advantages are clear. Doctors can be assisted by computers in making a correct diagnosis. Robots can learn where something is kept in a storage facility. Smartphones will be able to recognize our emotions using sensors and cameras, then play appropriate music or guide us to the web information that is most suitable at the time. But what if these self-learning systems were to take decisions independently – or develop emotions? We need to keep a close eye on that side of the development as well. Who wants to see self-learning, independent computers take control?"