AI & ML Frameworks (Static and Dynamic)
In this article we present some of the most popular frameworks used in AI and ML.
AI & ML
ML (Machine Learning) is considered a branch of AI (Artificial Intelligence), and its applications and implications in our daily lives keep growing. DL (Deep Learning) and Big Data are popular in many areas of industry. These algorithms make it possible to process very large volumes of structured or unstructured data and turn them into actionable information: predictions, decisions, suggestions and other “divinations”.
For example, content companies like Netflix use Machine Learning to recommend content to their users, and commercial sites such as Amazon use it to suggest new products to their customers. If you want to build tools of this kind to exploit data, you will need a machine learning framework to help you create your own computation graphs.
It should be remembered that a machine learning algorithm is a mathematical function – of a statistical kind – that processes the data it receives and delivers a result. A computation graph is ultimately just a visual representation of this function; it helps to understand how the data are processed by the algorithm.
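To make the idea concrete, here is a minimal pure-Python sketch of a computation graph – nodes are operations, edges carry data, and evaluating the output node walks the graph. The `Node` class and the function it computes are illustrative inventions, not any framework's API.

```python
# A computation graph is just a function drawn as nodes (operations)
# and edges (data). Each node stores an operation and the upstream
# nodes feeding into it; evaluating the output node walks the graph.

class Node:
    def __init__(self, op, inputs=()):
        self.op = op          # callable that computes this node's value
        self.inputs = inputs  # upstream nodes feeding into it

    def evaluate(self):
        return self.op(*(n.evaluate() for n in self.inputs))

# Graph for f(x, w, b) = x * w + b, the core of a linear model.
x = Node(lambda: 2.0)
w = Node(lambda: 3.0)
b = Node(lambda: 1.0)
mul = Node(lambda a, c: a * c, (x, w))
out = Node(lambda a, c: a + c, (mul, b))

print(out.evaluate())  # 7.0
```

Real frameworks build exactly this kind of structure, only with tensors flowing along the edges and gradients computed by walking the graph backwards.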
Static and dynamic graphs
Graphs – and the frameworks built around them – fall into two main categories: static and dynamic. Frameworks using static graphs, such as TensorFlow, push developers to create fixed, reusable structures. The static graph is defined before execution, somewhat like a compilation phase, which greatly facilitates the parallelization of tasks across several machines. Frameworks that use dynamic graphs – PyTorch, for example – rely on a computation graph that is defined implicitly at runtime and can change between executions. The computation is therefore more flexible and also easier to debug, which is far from negligible. This is likely why dynamic graphs are well suited to building neural networks.
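The contrast can be sketched in pure Python, with no framework required. The names below (`static_graph`, `run_static`, `run_dynamic`) are illustrative, not real framework APIs: the point is only that a static graph is fixed up front and replayed, while a dynamic one is shaped by ordinary control flow as the data arrives.

```python
# Static ("define then run"): the graph is fixed up front as a list
# of operations, then executed any number of times with new inputs.
static_graph = [lambda v: v * 3, lambda v: v + 1]

def run_static(graph, value):
    for op in graph:
        value = op(value)
    return value

print(run_static(static_graph, 2))   # 7
print(run_static(static_graph, 10))  # 31

# Dynamic ("define by run"): the operations executed depend on the
# data itself, via ordinary Python control flow, so the "graph" can
# differ on every call. This is what makes debugging feel like
# debugging normal Python code.
def run_dynamic(value):
    if value > 5:          # branch chosen at runtime
        value = value * 3
    else:
        value = value + 1
    return value

print(run_dynamic(2))   # 3
print(run_dynamic(10))  # 30
```

Because the static graph never changes, a framework can analyze it once and distribute its operations across machines; the dynamic version trades that away for flexibility and easier step-through debugging.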
TensorFlow (https://www.tensorflow.org/) is the framework that won the static-graph battle, and that is hardly in dispute. Published as open source by Google in 2015, it was historically used by the American group to power its image search and voice recognition services. It is also with this framework that the Silicon Valley giant developed Google Duplex, an artificial intelligence able to hold a phone conversation while posing as a human in a quite impressive way, we must admit. TensorFlow uses a static graph as well as abstraction libraries, such as the Keras, Sonnet and TFLearn neural network APIs. The main programming language for working with TensorFlow is Python, but other languages are supported, such as C++, Java and Go, and an API written in C even opens the door to further languages. In addition to a large community that offers many tutorials, documents and projects, TensorFlow provides a tool, TensorBoard, to visualize the graphs you create directly in a browser.
As for dynamic frameworks, PyTorch (https://pytorch.org/) won that battle.
Created by Facebook’s Artificial Intelligence research team in January 2017, it certainly owes its popularity to its good use of dynamic graphs and GPU acceleration. Successor to Torch, an open source Machine Learning library based on the Lua programming language and launched in 2002, PyTorch is based on the Python language, whose main libraries it can exploit. Python developers can therefore pick it up more easily and build complex algorithms such as recurrent neural networks. PyTorch is unfortunately not compatible with Keras; instead, one can fall back on other APIs such as Ignite or skorch.

PyTorch combines the production-oriented modular capabilities of the Caffe2 framework (also designed by Facebook) with ONNX (Open Neural Network Exchange), the deep learning model format developed by Facebook together with Amazon and Microsoft. PyTorch brings a flexible, research-oriented design approach that lets developers work faster, and PyTorch 1.0 supports the ONNX format.

AWS (Amazon Web Services) plans to support PyTorch 1.0 in SageMaker, its service for helping developers create and reuse Artificial Intelligence models. Google has planned support on its Google Cloud Deep Learning virtual machine. Microsoft, for its part, has announced an integration between PyTorch 1.0 and Azure, and it is already possible to deploy PyTorch models on the Azure Machine Learning (Azure ML) cloud service. PyTorch also has the support of several leading AI chip makers, such as Arm, IBM, Intel, Nvidia and Qualcomm. It is hard to be more popular.
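The recurrent neural networks mentioned above are a good illustration of why dynamic graphs matter: the number of computation steps depends on the input sequence's length, so the graph "unrolls" differently for every input. A pure-Python sketch of the idea, with a toy recurrent update invented for illustration (not a real framework API):

```python
# Why dynamic graphs suit recurrent networks: one "cell" application
# per sequence element means the computation graph grows with the
# input, which a define-by-run framework handles with a plain loop.

def recurrent_sum(sequence, state=0.0):
    for x in sequence:
        state = 0.5 * state + x   # a toy recurrent update
    return state

print(recurrent_sum([1.0, 2.0]))        # 2.5
print(recurrent_sum([1.0, 2.0, 4.0]))   # 5.25
```

A static-graph framework has to unroll such a loop to a fixed length (or add special looping constructs) before execution; a dynamic framework simply runs the Python loop and records the graph as it goes.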