by Josie Wales
July 20, 2017

from TheAntiMedia Website

Many people don't realize that some of the most significant technological breakthroughs in recent years, such as

  • voice and facial recognition software

  • autonomous driving systems

  • image recognition software,

...have not actually been designed by humans, but by computers.

 

All of these advanced software programs are the result of artificial neural networks, an approach popularly referred to as "deep learning."
 

Neural networks are loosely modeled on the human brain and learn in a similar way, by processing large amounts of data along with algorithms fed to them by programmers.

 

A neural net is then able to teach itself to perform tasks by analyzing the training data.

"You essentially have software writing software," says Jen-Hsun Huang, CEO of graphics processing leader Nvidia.

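To make that idea concrete, here is a minimal sketch, in Python with NumPy, of a tiny neural network teaching itself a task from nothing but example data (the classic XOR problem). The dataset, network size, and training settings are toy assumptions for illustration and have no connection to the research described in this article.

```python
import numpy as np

# Toy training data: the network must learn XOR purely from examples,
# with no hand-written rules for the task itself.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 8))   # weights from the inputs to a small hidden layer
W2 = rng.normal(size=(8, 1))   # weights from the hidden layer to the output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 1.0
for step in range(5000):
    # Forward pass: the network's current guesses for every example.
    hidden = sigmoid(X @ W1)
    output = sigmoid(hidden @ W2)

    # Backward pass: nudge every weight to shrink the error.
    # This repeated self-adjustment is the "learning."
    error = output - y
    grad_out = error * output * (1 - output)
    grad_W2 = hidden.T @ grad_out
    grad_hid = (grad_out @ W2.T) * hidden * (1 - hidden)
    grad_W1 = X.T @ grad_hid

    W2 -= learning_rate * grad_W2
    W1 -= learning_rate * grad_W1

# After training, the outputs should sit close to the targets [0, 1, 1, 0].
print(np.round(sigmoid(sigmoid(X @ W1) @ W2).ravel(), 2))
```

The weight updates inside the loop are the "software writing software" Huang describes: no programmer hand-codes the rule the network ends up implementing.
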
Research in deep learning is advancing so quickly that neural networks can now generate dream-like imagery and even communicate with one another in cryptographic languages indecipherable to humans and to other computers.

 

The main drawback of the technology is that the networks require a lot of memory and power to operate. But MIT associate professor of electrical engineering and computer science Vivienne Sze and her colleagues have been working on a solution that could enable the powerful software to run on cell phones.

 

Sze and her team made a breakthrough last year in designing an energy-efficient computer chip that could allow mobile devices to run powerful artificial intelligence systems.

 

The researchers have since taken a different approach, developing an array of new techniques to make neural nets more energy efficient.

"First, they developed an analytic method that can determine how much power a neural network will consume when run on a particular type of hardware.

 

Then they used the method to evaluate new techniques for paring down neural networks so that they'll run more efficiently on handheld devices," MIT News reports.
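
The article does not spell out the MIT model itself, but the general shape of such an analytic estimate can be sketched: count the multiply-accumulate operations and memory accesses each layer needs, then weight them by an assumed energy cost per operation. In the Python sketch below, the per-operation costs, the conv_layer_energy helper, and the layer shapes are all illustrative placeholders rather than the team's actual figures.

```python
# Illustrative sketch of an analytic per-layer energy estimate for a
# convolutional net. The per-operation costs and layer shapes are
# assumed placeholders, not values from the MIT model.

# Assumed energy cost per operation, in arbitrary units. Moving data to
# and from memory typically costs far more than the arithmetic itself.
ENERGY_PER_MAC = 1.0
ENERGY_PER_MEMORY_ACCESS = 20.0

def conv_layer_energy(in_ch, out_ch, kernel, out_h, out_w):
    """Estimate the energy of one convolutional layer."""
    # Multiply-accumulate operations needed to produce every output value.
    macs = in_ch * out_ch * kernel * kernel * out_h * out_w
    # Crude memory model: every weight and every output is touched once
    # (a real model would account for data reuse in detail).
    weights = in_ch * out_ch * kernel * kernel
    outputs = out_ch * out_h * out_w
    accesses = weights + outputs
    return macs * ENERGY_PER_MAC + accesses * ENERGY_PER_MEMORY_ACCESS

# A hypothetical three-layer network.
layers = [
    ("conv1", conv_layer_energy(3, 32, 3, 112, 112)),
    ("conv2", conv_layer_energy(32, 64, 3, 56, 56)),
    ("conv3", conv_layer_energy(64, 128, 3, 28, 28)),
]

for name, energy in layers:
    print(f"{name}: ~{energy:,.0f} energy units")
print(f"total: ~{sum(e for _, e in layers):,.0f} energy units")
```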

The team will present a paper on their research next week at the Conference on Computer Vision and Pattern Recognition (CVPR) in Honolulu.

 

There, they will describe how "energy-aware pruning" can reduce neural networks' power consumption by as much as 73 percent compared with the standard implementation and by 43 percent compared with the best previous method.
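
The team's actual procedure is more involved, but the core of the "energy-aware" idea can be illustrated roughly: rank layers by an energy estimate like the one above and prune the most energy-hungry ones hardest, zeroing out their smallest-magnitude weights. Everything in this sketch, from the layer shapes to the pruning fractions, is an assumption made for illustration, not the published algorithm.

```python
import numpy as np

# Rough illustration of energy-aware pruning: visit the most
# energy-hungry layers first and zero out their smallest weights.
# Layer shapes, energy figures, and pruning fractions are assumptions
# for illustration, not the authors' published procedure.

rng = np.random.default_rng(0)

# Hypothetical layers: (name, weight matrix, estimated energy).
layers = [
    ("conv1", rng.normal(size=(27, 32)), 5.0e8),
    ("conv2", rng.normal(size=(288, 64)), 3.0e8),
    ("conv3", rng.normal(size=(576, 128)), 1.0e8),
]

def prune_smallest(weights, fraction):
    """Zero out the given fraction of weights with the smallest magnitude."""
    threshold = np.quantile(np.abs(weights), fraction)
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

# Prune more aggressively where the energy estimate says it pays off most.
for name, weights, energy in sorted(layers, key=lambda layer: -layer[2]):
    fraction = 0.5 if energy > 2.0e8 else 0.2
    pruned = prune_smallest(weights, fraction)
    kept = np.count_nonzero(pruned) / pruned.size
    print(f"{name}: estimated energy {energy:.1e}, kept {kept:.0%} of weights")
```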

 

According to Hartwig Adam, the team lead for mobile vision at Google:

"Recently, much activity in the deep-learning community has been directed toward development of efficient neural-network architectures for computationally constrained platforms.

 

However, most of this research is focused on either reducing model size or computation, while for smartphones and many other devices energy consumption is of utmost importance because of battery usage and heat restrictions."

Adam added:

"This work is taking an innovative approach to CNN (convolutional neural net) architecture optimization that is directly guided by minimization of power consumption using a sophisticated new energy estimation tool, and it demonstrates large performance gains over computation-focused methods.

 

I hope other researchers in the field will follow suit and adopt this general methodology to neural-network-model architecture design."