The objective of deep learning is to imitate the human thinking process. It is based on the assumption that a class of mathematical algorithms can enable machines to extract and "deeply" learn the patterns in a given set of data. "Deeply" here refers to the detail the algorithms pick up from the data while making few assumptions and approximations. By discovering and formulating the complex relationships found in the data, machines can rival human intelligence. Internally, deep learning techniques use layers of processing to simulate the human brain. Modern deep learning techniques have come a long way and can understand speech, process structured data, and detect objects.

Neural networks are one of the most popular ways to implement deep learning. Inspired by the working of neurons in the human brain, the concept of neural networks was proposed by Frank Rosenblatt in 1957. He devised a technique in which digital information could be processed in layers to solve complex mathematical problems. His initial attempt at designing a neural network was quite simple and looked similar to a linear regression model. It was modelled after the human brain, in which the dendrites act as sensors that detect a signal. The signal is then passed on to an axon, which is a long, slender projection of a nerve cell. The function of the axon is to transmit this signal to muscles, glands, and other neurons. The signal travels through interconnecting tissue called a synapse before being passed on to other neurons. Through this organic pipeline, the signal keeps traveling until it reaches the target muscle or gland, where it causes the required action. It typically takes seven to eight milliseconds for the signal to pass through the chain of neurons and reach its destination.
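To make Rosenblatt's idea concrete, here is a minimal sketch of a perceptron in Python with NumPy: a single unit that computes a weighted sum of its inputs and fires if the sum crosses a threshold. The toy data (the logical AND function) and the training loop details are illustrative assumptions, not part of the original design:

```python
import numpy as np

# Toy, linearly separable data: the logical AND function (an assumption
# chosen for illustration; any linearly separable data would work).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)  # weights, one per input
b = 0.0          # bias (the threshold, moved to the left-hand side)

for _ in range(10):                        # a few passes over the data
    for xi, target in zip(X, y):
        pred = int(np.dot(w, xi) + b > 0)  # step activation
        w += (target - pred) * xi          # perceptron learning rule
        b += (target - pred)

print([int(np.dot(w, xi) + b > 0) for xi in X])  # -> [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron learning rule is guaranteed to converge to a correct set of weights after a few passes over the data.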

Implementing deep learning through neural networks was not an overnight success; it took decades for neural networks to prove their worth. The earliest neural networks consisted of a single processing unit called a perceptron, which was found to have serious limitations. In 1969, Marvin Minsky and Seymour Papert published research concluding that a perceptron is incapable of learning complex logic. In fact, they showed that it could not learn even a logical function as simple as XOR. That led to a decline in interest in machine learning in general, and neural networks in particular, and started an era now known as the AI winter, during which researchers around the world did not take AI seriously, thinking it incapable of solving complex problems.

One of the primary reasons for the so-called AI winter was the limitation of the hardware available at the time: the necessary computing power either did not exist or was prohibitively expensive. Toward the end of the 1990s, advances in distributed computing provided easily available and affordable infrastructure, which resulted in the thaw of the AI winter and reinvigorated research in AI. This eventually turned the current era into what can be called the AI spring, in which there is great interest in AI in general and neural networks in particular.
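The XOR limitation mentioned above is easy to see in code. A single perceptron draws one straight line through the input space, and no single line separates XOR's outputs; adding just one hidden layer fixes this. The weights below are hand-picked for illustration (an OR unit and a NAND unit feeding an AND unit) rather than learned by training:

```python
import numpy as np

step = lambda z: (z > 0).astype(int)  # step activation

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# Hidden layer: two units with hand-picked (illustrative) weights.
W1 = np.array([[ 1,  1],    # OR unit:   x1 + x2 - 0.5 > 0
               [-1, -1]])   # NAND unit: -x1 - x2 + 1.5 > 0
b1 = np.array([-0.5, 1.5])

# Output layer: AND of the two hidden units.
W2 = np.array([1, 1])
b2 = -1.5

hidden = step(X @ W1.T + b1)
print(step(hidden @ W2 + b2))  # -> [0 1 1 0], i.e. XOR
```

No choice of weights for a single unit can produce [0 1 1 0] here, which is precisely what Minsky and Papert showed; the hidden layer is what makes the difference.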

In neural networks, the processing required during the learning and inference phases is performed by a set of computing layers called hidden layers. These hidden layers sit between the first layer, called the input layer, and the final layer, called the output layer. Each layer consists of processing units called neurons. Depending on the nature of the use case, these neurons are connected in different ways, giving us different types of specialized neural networks. One of the most popular specialized types is the convolutional neural network (CNN), which is mainly used for computer vision, image processing, and object detection. CNNs have special processing layers that reduce the resolution of the images with minimal impact on accuracy, as the sketch below illustrates. Another important type is the recurrent neural network (RNN). RNNs use sequential data or time-series data for ordinal or temporal problems and are extensively used in natural language processing applications; they use training data to learn the inherent sequences. Common use cases of RNNs include Google Translate, image captioning, and Siri.
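Here is a minimal CNN sketch in PyTorch. The layer sizes and the 28x28 grayscale input are illustrative assumptions; the pooling layers are the "special processing layers" described above, halving the spatial resolution at each stage while the convolutions extract features:

```python
import torch
import torch.nn as nn

# Input, hidden, and output layers of a small CNN (sizes are assumptions).
cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 1x28x28 -> 8x28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 8x28x28 -> 8x14x14 (resolution halved)
    nn.Conv2d(8, 16, kernel_size=3, padding=1),  # -> 16x14x14
    nn.ReLU(),
    nn.MaxPool2d(2),                             # -> 16x7x7 (halved again)
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),                   # output layer: 10 classes
)

image = torch.randn(1, 1, 28, 28)  # one dummy grayscale image
print(cnn(image).shape)            # -> torch.Size([1, 10])
```

Each pooling step discards spatial detail but keeps the strongest feature responses, which is why the loss of resolution has minimal impact on accuracy in practice.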

Deep learning techniques can also be used to represent data in a very efficient way. For representation learning, autoencoders are used: they learn to reconstruct their input at the output layer, which forces them to discover a compact internal representation of the data, and they are used to solve unsupervised learning problems. They are applied in areas such as image processing and pharmaceutical research.
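A minimal autoencoder sketch in PyTorch might look as follows; the 784-dimensional input (a flattened 28x28 image) and the 32-dimensional bottleneck are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Encoder compresses the input to a 32-dim code; decoder reconstructs it.
autoencoder = nn.Sequential(
    nn.Linear(784, 128), nn.ReLU(),    # encoder
    nn.Linear(128, 32),  nn.ReLU(),    # 32-dim bottleneck "code"
    nn.Linear(32, 128),  nn.ReLU(),    # decoder
    nn.Linear(128, 784), nn.Sigmoid()  # reconstruction scaled to [0, 1]
)

x = torch.rand(16, 784)                 # a batch of dummy inputs
loss = nn.MSELoss()(autoencoder(x), x)  # reconstruction error vs. the input itself
print(loss.item())
```

Because the target is the input itself, no labels are needed, which is what makes autoencoders suitable for unsupervised problems; the 32-dimensional code is the efficient representation.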

A combination of factors has made the implementation of deep learning through neural networks one of the most important machine learning techniques available today. These factors include the need to solve increasingly complex problems, the explosion of data, and the emergence of technologies, such as readily available and inexpensive compute clusters, that provide the computing power necessary to train very complex models. In fact, this research area is evolving rapidly and is responsible for most of the major advances claimed by leading-edge fields such as robotics, natural language processing, and self-driving cars.

At present there is a great deal of interest in machine learning techniques. In the era of COVID-19, the need for reliable predictions is greater than ever, from forecasting the spread of the virus to capacity planning for hospitals. Interest in deep learning is correspondingly strong; in fact, the further evolution of artificial intelligence depends on it. Deep learning is constantly evolving and improving.
