
Appeared on: Friday, June 15, 2018
Wolfram Research Releases Neural Net Repository

Wolfram Research, the legendary maker of Mathematica, Wolfram Alpha and the Wolfram Language, officially launched the Wolfram Neural Net Repository at its European Wolfram Technology Conference.

Providing immediate access to neural networks for AI and machine learning applications, the repository further builds a layer of computational intelligence upon the Wolfram tech stack. These models are suitable for immediate evaluation, training, visualization, and transfer learning, for both experts and those new to applying AI to their work.

"A huge amount of work has gone into training or converting around 70 neural net models that now live in the repository, and can be accessed programmatically in the Wolfram Language via NetModel," said Wolfram Research.

Neural nets (NNs) are at the core of deep learning. They are loosely inspired by the way neurons in the human brain operate -- connecting with other neurons and processing input in a networked way. The term deep learning describes algorithms built from many layers of such neurons.
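
For readers new to the framework, a minimal sketch of what "many layers of neurons" looks like in the Wolfram Language (the layer sizes here are arbitrary):

    (* A small fully connected network: two linear ("dense") layers of neurons
       separated by a ReLU nonlinearity, ending in a softmax over 10 classes. *)
    net = NetChain[
      {LinearLayer[64], ElementwiseLayer[Ramp], LinearLayer[10], SoftmaxLayer[]},
      "Input" -> 784]

    (* NetTrain would then fit the layer weights to labeled training data. *)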

Building on the neural net framework introduced in 2016 with Version 11 of the Wolfram Language, the repository provides a uniform system for storing and deploying neural network models in an immediately computable form. The repository is built to be a global resource for neural net models, including those from the latest research papers, as well as ones trained or created at Wolfram Research.

The Wolfram team has invested considerable effort in converting publicly available models from other neural net frameworks, such as Caffe, Torch, MXNet, and TensorFlow, into the Wolfram Neural Net format. In addition, it has trained a number of NNs itself.

"Machine learning is a field in hypergrowth right now-with interesting new results being published every week. Our goal with the Wolfram Neural Net Repository is to let people immediately integrate the latest research neural nets into their work," said Stephen Wolfram, founder and CEO of Wolfram Research. "Like everything we do with Wolfram Language, our goal is to make everything as smooth and automated as possible, so it's immediate to include a new neural net from the repository, with all encoding, decoding, etc. handled automatically."

The neural net models are accessible in the Wolfram Cloud and can be exported in the popular MXNet framework. These nets are suitable for a variety of applications including classification, feature extraction, image processing, language modeling, speech recognition, and regression. The growing number of models work with text, image and numerical inputs for quick prototyping of neural net solutions for multiparadigm data science applications.
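
Exporting a repository net to MXNet is a short operation; the following is a sketch in which the file name is illustrative and Export writes the architecture as JSON alongside a parameter file:

    (* Fetch a net and write it out in MXNet format. *)
    net = NetModel["LeNet Trained on MNIST Data"];
    Export["lenet.json", net, "MXNet"]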

The ease of access and use of the repository on the desktop, in the cloud, or on a mobile device brings flexibility to training and deploying AI solutions for a wide range of problems in business, research, data science, software development, and beyond, whether in labs, classrooms, or the enterprise.
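
One hypothetical deployment sketch (the form field and cloud object names are illustrative, not taken from the announcement): a repository net wrapped in a form and pushed to the Wolfram Cloud as an instant web service.

    (* Deploy an image classifier backed by a repository net as a public web form. *)
    net = NetModel["LeNet Trained on MNIST Data"];
    CloudDeploy[
      FormFunction[{"image" -> "Image"}, net[#image] &],
      "digit-classifier", Permissions -> "Public"]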

Wolfram sees the degree of automation as the single most compelling reason to use its neural nets over the alternatives. The framework handles variable-length sequences automatically, for example, so users can focus on building the net rather than on esoteric details such as converting and preprocessing the input data into the right form, tracking batch dimensions, and applying sequence masking -- chores that other frameworks often foist onto developers.
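
A concrete way to see this automation is that every repository net ships with its input preprocessing attached as a NetEncoder, so raw data can be fed in directly (a sketch; the exact ports and encoder settings depend on the model):

    net = NetModel["LeNet Trained on MNIST Data"];

    (* Inspect the encoder attached to the input port: it resizes and
       normalizes images automatically, so no manual preprocessing is needed. *)
    NetExtract[net, "Input"]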

And since the nets are part of the Wolfram Language, they are completely cross-platform.

Neural nets have generated a lot of interest recently: they form the basis for state-of-the-art solutions to a dizzying array of problems, from speech recognition to machine translation, from autonomous driving to playing Go. The Wolfram Language now has a neural net framework. This has made possible a whole new set of Wolfram Language functions, such as FindTextualAnswer, ImageIdentify, ImageRestyle and FacialFeatures.
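
For instance (img below is a placeholder for any image the user supplies, and the passage of text is merely illustrative):

    (* Built on the framework: identify the main object in an image... *)
    ImageIdentify[img]

    (* ...or pull a short answer out of a passage of text. *)
    FindTextualAnswer[
      "Wolfram Research launched the Neural Net Repository in June 2018.",
      "When was the repository launched?"]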

However, training state-of-the art neural nets often requires huge datasets and significant computational resources that are inaccessible to most users. A repository of nets gives Wolfram Language users easy access to the latest net architectures and pre-trained nets, representing thousands of hours of computation time on powerful GPUs.
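
Both flavors are exposed through NetModel; the sketch below assumes the "UninitializedEvaluationNet" property, which returns the architecture alone so it can be retrained on a user's own data:

    (* Pre-trained weights, ready for immediate evaluation. *)
    trained = NetModel["LeNet Trained on MNIST Data"];

    (* The same architecture without weights, for training from scratch. *)
    untrained = NetModel["LeNet Trained on MNIST Data", "UninitializedEvaluationNet"];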

A great thing about the deep learning community is that it's common for researchers to make their trained nets publicly available. These are often in the form of disparate scripts and data files using a multitude of neural net frameworks. A major goal of Wolfram's repository is to curate and publish these models into a standard, easy-to-use format soon after they are released. In addition, Wolfram is providing its own trained models for various tasks.

Although much of this functionality will eventually be packaged as official Wolfram Language functions, the repository provides early access to a large set of capabilities that until now were simply not available in the Wolfram Language.

Pre-trained nets can be used as powerful FeatureExtractor functions throughout the Wolfram Language's other machine learning functionalities, such as Classify, Predict and FeatureSpacePlot. This gives users fine-grained control over incorporating prior knowledge into their machine learning pipelines.
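
A minimal sketch of that pattern, with assumptions: the repository entry, the number of layers dropped, and the trainingData variable are all illustrative, and in practice the model's actual layer structure should be checked before truncating it.

    (* Use a repository net, minus its final classification layers, as a
       feature extractor for Classify. *)
    lenet = NetModel["LeNet Trained on MNIST Data"];
    extractor = NetDrop[lenet, -2];   (* illustrative: drop the last two layers *)

    (* trainingData is assumed to be a list of image -> label rules. *)
    classifier = Classify[trainingData, FeatureExtractor -> extractor]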

Access to carefully designed and trained modules unlocks a higher-level paradigm for using the Wolfram neural net framework. This paradigm frees users from the difficult and laborious task of building good net architectures from individual layers and allows them to transfer knowledge from nets trained on different domains to their own problems.
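
In practice this higher-level paradigm often amounts to a few lines of transfer learning. The sketch below rests on assumptions: the repository entry, the number of dropped layers, and the myClasses / myExamples variables are illustrative.

    (* Reuse a pre-trained net as a frozen feature stage and train only a new head. *)
    base = NetModel["LeNet Trained on MNIST Data"];
    newNet = NetChain[<|
        "features" -> NetDrop[base, -2],                  (* pre-trained layers *)
        "classifier" -> LinearLayer[Length[myClasses]],   (* new, untrained head *)
        "softmax" -> SoftmaxLayer[]|>,
      "Output" -> NetDecoder[{"Class", myClasses}]];

    (* Freeze the pre-trained part; only the new head is updated during training. *)
    trained = NetTrain[newNet, myExamples,
      LearningRateMultipliers -> {"features" -> 0}]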

An important but indirect benefit of having a diverse and rich library of nets available in the Wolfram Neural Net Repository is to catalyze the development of the Wolfram neural net framework itself. In particular, the addition of models operating on audio and text has driven a diverse set of improvements to the framework; these include extensive support for so-called dynamic dimensions (variable-length tensors), five new audio NetEncoder types and NetStateObject for easy recurrent generation.
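
As a small sketch of one of those additions (the language-model entry named below is an assumption, not confirmed by the announcement): NetStateObject keeps a net's recurrent state between calls so it can generate sequentially.

    (* Hypothetical repository entry for a character-level language model. *)
    lm = NetModel["Wolfram English Character-Level Language Model V1"];

    (* NetStateObject remembers the recurrent state across evaluations. *)
    state = NetStateObject[lm];
    state["Neural nets are "]   (* predicts the next character, updating the state *)
    state["a"]                  (* continues from where the previous call left off *)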


