Yahoo open-sources CaffeOnSpark deep learning framework for Hadoop.

Yahoo today is releasing some key artificial intelligence (AI) software under an open-source license. Last year the company built a library called CaffeOnSpark to perform a popular type of AI called “deep learning” on the vast stores of data kept in its Hadoop clusters, the open-source framework for storing and processing big data. Now it’s becoming available for anyone to use under an open-source Apache license on GitHub.

Startups Aim to Exploit a Deep-Learning Skills Gap.

The latest machine-learning techniques promise to transform whole industries by making it easier for computers to recognize patterns in data, to make accurate predictions, and to generally behave more intelligently. Unfortunately, the experts capable of crafting and optimizing the code needed to make this magic possible are in pretty short supply.

Deep learning startup Nervana raises $20.5M Part of the Nervana team.

Nervana Systems, one startup building artificial intelligence systems that companies can use to make their applications smarter, announced today a $20.5 million round of funding.

The startup specializes in a type of AI called deep learning, which involves feeding lots of data to artificial neural networks and then throwing new data at them to receive inferences in response. Researchers developed the technique decades ago, and it has since waxed and waned in popularity. Thanks to improvements in computing power and the availability of large sets of data, the technique has become popular for processing images, videos, text, and speech.

Y Combinator-backed Atomwise scores $6M to use deep learning for drug discovery.

Atomwise, a startup that relies on an increasingly popular type of artificial intelligence called deep learning to identify drug candidates, announced today a $6 million round of funding.

The promise of Atomwise lies in its algorithms. Instead of relying on more traditional machine learning approaches, Atomwise employs deep learning, which involves training artificial neural networks on large quantities of data, such as billions of pictures, and then giving them new data to receive inferences, or predictions, in response. These systems can become more accurate over time as they receive more data.
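The train-then-infer pattern these articles describe can be illustrated with a toy example. The sketch below is purely illustrative and uses none of the companies' actual code: a single artificial neuron (the building block of deep networks) is fitted to labeled points by gradient descent, then asked for predictions on data it has never seen. Real deep learning stacks many such neurons into layers, but the workflow is the same.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    # Squashes any real number into (0, 1), read here as a probability.
    return 1.0 / (1.0 + math.exp(-z))

# Toy training data: label whether x + y > 1 for random points in [0, 1]^2.
train = [(random.random(), random.random()) for _ in range(200)]
labels = [1.0 if x + y > 1.0 else 0.0 for x, y in train]

# One neuron: two weights and a bias, trained by stochastic gradient descent.
w1, w2, b = 0.0, 0.0, 0.0
lr = 0.5
for _ in range(1000):
    for (x, y), t in zip(train, labels):
        p = sigmoid(w1 * x + w2 * y + b)
        err = p - t              # gradient of the cross-entropy loss
        w1 -= lr * err * x
        w2 -= lr * err * y
        b -= lr * err

# Inference: hand the trained model new, unseen points.
def predict(x, y):
    return sigmoid(w1 * x + w2 * y + b) > 0.5

print(predict(0.9, 0.8))  # a point well above the x + y = 1 boundary
print(predict(0.1, 0.2))  # a point well below it
```

As the articles note, accuracy tends to improve with more data: feeding the loop above more labeled points sharpens the learned decision boundary.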
