Google is hosting its first-ever TensorFlow Dev Summit in Mountain View, California. The event has brought a batch of new announcements from the company, chief among them the release of version 1.0 of TensorFlow, its open-source framework for deep learning.

The company has also announced a slew of new tools to accompany version 1.0. Google is packing the release with higher-level machine learning building blocks: version 1.0 ships with artificial neural networks that can be trained on your own datasets, and classic algorithms like k-means clustering and support vector machines (SVMs) have also been brought in with this version.
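To give a sense of what the new k-means tooling actually computes, here is a minimal plain-Python sketch of the algorithm itself (this is not the TensorFlow API, just the underlying technique): repeatedly assign each point to its nearest centroid, then move each centroid to the mean of its cluster.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain-Python sketch of k-means clustering (illustrative only)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # pick k distinct points to start
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to the nearest centroid (squared distance)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        for i, cluster in enumerate(clusters):
            if cluster:  # keep the old centroid if its cluster emptied
                centroids[i] = tuple(sum(xs) / len(xs)
                                     for xs in zip(*cluster))
    return centroids

data = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
centers = sorted(kmeans(data, 2))
print(centers)  # one centroid near (0.1, 0.05), one near (5.05, 4.95)
```

TensorFlow's version runs the same iteration as graph operations, which is what lets it scale across hardware.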

Google has also pushed support for the Python-based Keras library. What’s more, the addition of “canned estimators,” including simple neural networks, means that you can get models up and running quickly.
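A “canned” neural network estimator essentially pre-packages layers like the fully connected one sketched below. This is an illustrative plain-Python example of a single dense layer with a sigmoid activation, not TensorFlow or Keras code; the weights are made-up values chosen for the example.

```python
import math

def dense(inputs, weights, biases):
    """One fully connected layer: output_j = sum_i(inputs_i * W[i][j]) + b_j."""
    return [sum(x * w for x, w in zip(inputs, col)) + b
            for col, b in zip(zip(*weights), biases)]

def sigmoid(xs):
    """Squash each value into (0, 1)."""
    return [1.0 / (1.0 + math.exp(-x)) for x in xs]

# two input features -> one output unit, with example (made-up) weights
x = [1.0, 2.0]
W = [[0.5], [-0.25]]  # shape: inputs x units
b = [0.0]
y = sigmoid(dense(x, W, b))
print(y)  # [0.5], since 1*0.5 + 2*(-0.25) + 0 = 0 and sigmoid(0) = 0.5
```

A canned estimator hides this wiring (plus training loops and evaluation) behind a few constructor arguments, which is what makes it quick to fire up.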

In case you are unaware of it, TensorFlow is basically an open-source software library for numerical computation using data flow graphs. The term “tensor” is significant here: nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays, or tensors, that flow between those nodes. Since its launch by Google, several other companies have also published their own take on the platform, most notably Yahoo.
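The nodes-and-edges idea above can be sketched in a few lines of plain Python (this is a toy illustration, not the TensorFlow API): each node wraps an operation, edges are references to upstream nodes, and evaluating the output node pulls values through the graph.

```python
class Node:
    """A node in a toy data flow graph: an operation plus incoming edges."""
    def __init__(self, op, *inputs):
        self.op = op          # function that computes this node's value
        self.inputs = inputs  # upstream nodes whose outputs flow in

    def eval(self):
        # pull values along the incoming edges, then apply the operation
        return self.op(*(n.eval() for n in self.inputs))

def constant(value):
    """A source node that simply emits a fixed value."""
    return Node(lambda: value)

# build a graph for (2 + 3) * 4; nothing runs until eval() is called,
# much like defining a TensorFlow graph before running a session
a, b, c = constant(2), constant(3), constant(4)
add = Node(lambda x, y: x + y, a, b)
mul = Node(lambda x, y: x * y, add, c)
print(mul.eval())  # 20
```

In real TensorFlow the values flowing along the edges are multidimensional arrays rather than single numbers, and the deferred-execution structure is what lets the runtime distribute the graph across CPUs, GPUs, and other devices.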

Google also said that it will soon open-source code that can speed up TensorFlow as much as 58-fold. That is a pretty exact figure, and assuming it holds, models would train that much faster. Google is also pushing support for the Hexagon digital signal processor (DSP) to TensorFlow. The processor, in case you are unaware of it, is the same one present on board Qualcomm’s Snapdragon 820 mobile chip and its DragonBoard 820c board.

Interestingly, the changes come just a day after Yahoo open-sourced TensorFlowOnSpark to promote deep learning on big data. Not that there is any real competition here, considering that everything involved is open source.
