CSE Dept Engineering Topics List

Automata and Regular Expressions - Theory of Computation

Subject and UNIT: Theory of Computation: Unit I: Automata and Regular Expressions

A formal proof can be constructed using either a deductive proof or an inductive proof. A deductive proof consists of a sequence of statements, each justified by logical reasoning, that starts from an initial statement (the hypothesis) and leads to the conclusion.
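
As a sketch of such a sequence of justified statements, here is the classic textbook example (not taken from this document): given that x is the sum of the squares of four positive integers, we deduce that 2^x >= x^2.

```latex
% Each statement follows from the hypothesis or from earlier statements;
% the final line is the conclusion. (Illustrative example, assumed.)
\begin{enumerate}
  \item $x = a^2 + b^2 + c^2 + d^2$ \hfill (given)
  \item $a \ge 1,\; b \ge 1,\; c \ge 1,\; d \ge 1$ \hfill (given)
  \item $a^2 \ge 1,\; b^2 \ge 1,\; c^2 \ge 1,\; d^2 \ge 1$ \hfill ((2) and squaring)
  \item $x \ge 4$ \hfill ((1), (3), and arithmetic)
  \item $2^x \ge x^2$ \hfill ((4) and the theorem: $x \ge 4 \Rightarrow 2^x \ge x^2$)
\end{enumerate}
```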

In automata theory and computability, the word computability refers to computation by mathematical models. Hence this subject deals with the study of the various mathematical models that are required for computation.

Neural Networks - Artificial Intelligence and Machine Learning

Subject and UNIT: Artificial Intelligence and Machine Learning: Unit V: Neural Networks

The Multilayer Perceptron (MLP) model features multiple layers that are interconnected in such a way that they form a feed-forward neural network.
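
As a minimal sketch of this feed-forward structure (the layer sizes, weights, and sigmoid activation below are assumed for illustration, not taken from the source):

```python
import math

def sigmoid(x):
    # Logistic activation squashes a value into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # One fully connected layer: weighted sum plus bias, then activation,
    # computed for each neuron in the layer
    return [sigmoid(sum(w * v for w, v in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Hypothetical 2-2-1 MLP: 2 inputs -> 2 hidden neurons -> 1 output
hidden_w = [[0.5, -0.6], [0.1, 0.8]]   # assumed example weights
hidden_b = [0.0, 0.1]
out_w = [[1.2, -0.4]]
out_b = [0.05]

x = [1.0, 0.5]
h = layer(x, hidden_w, hidden_b)   # hidden layer activations
y = layer(h, out_w, out_b)         # network output (one value)
print(h, y)
```

Each layer feeds only forward into the next, which is what makes the network "feed-forward".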

When a model tries to capture every minute feature of the input data, it also picks up irregularities in the extracted features, which introduces noise into the output. This is referred to as "overfitting".

Normalization is a data preparation technique that is frequently used in machine learning. The process of transforming the columns in a dataset to the same scale is referred to as normalization.
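
A common form of this rescaling is min-max normalization, which maps a column to the [0, 1] range; a minimal sketch (the sample column is made up for illustration):

```python
def min_max_normalize(column):
    # Rescale every value to [0, 1] via (x - min) / (max - min)
    lo, hi = min(column), max(column)
    return [(x - lo) / (hi - lo) for x in column]

ages = [18, 22, 30, 45, 60]        # hypothetical dataset column
scaled = min_max_normalize(ages)
print(scaled)                      # smallest value maps to 0.0, largest to 1.0
```

After this step, columns measured in very different units (e.g. age vs. income) contribute on the same scale.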

Hyperparameters are parameters whose values control the learning process and determine the values of model parameters that a learning algorithm ends up learning.
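
The distinction can be sketched with a toy training loop (all values assumed for illustration): the weight w is a model parameter learned from data, while the learning rate and epoch count are hyperparameters fixed before training that control how well w is learned.

```python
def train_constant_model(data, lr, epochs):
    # w is the learned model parameter; lr and epochs are hyperparameters.
    # We fit w to minimize squared error against the data, whose optimum
    # is the mean of the data.
    w = 0.0
    for _ in range(epochs):
        grad = sum(2 * (w - x) for x in data) / len(data)
        w -= lr * grad
    return w

data = [2.0, 4.0, 6.0]   # hypothetical data; mean is 4.0
w_good = train_constant_model(data, lr=0.1, epochs=200)    # converges near 4.0
w_bad = train_constant_model(data, lr=0.001, epochs=200)   # under-trained
print(w_good, w_bad)
```

The same algorithm with a poorly chosen learning rate never reaches the optimum, which is why hyperparameter tuning matters.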

The Rectified Linear Unit (ReLU) helps solve the vanishing gradient problem. ReLU is a non-linear, piecewise linear function that outputs the input directly if it is positive; otherwise, it outputs zero.
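
The definition translates directly into code; a minimal sketch:

```python
def relu(x):
    # Output the input directly if positive, otherwise output zero
    return x if x > 0 else 0.0

values = [-2.0, -0.5, 0.0, 1.5, 3.0]
print([relu(v) for v in values])   # -> [0.0, 0.0, 0.0, 1.5, 3.0]
```

Because the gradient is 1 for all positive inputs (rather than shrinking toward zero as with sigmoid), gradients survive passage through many layers.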

The vanishing gradient problem is a problem encountered when training neural networks using gradient-based methods such as backpropagation.
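
The mechanism can be sketched numerically: during backpropagation the gradient is multiplied by one activation derivative per layer, and for the sigmoid that derivative is at most 0.25, so the product shrinks geometrically with depth (the 10-layer count below is an assumed example):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: s * (1 - s), which peaks at 0.25
    s = sigmoid(x)
    return s * (1.0 - s)

grad = 1.0
for _ in range(10):                # one multiplication per layer
    grad *= sigmoid_grad(0.0)      # 0.25 even at the steepest point
print(grad)                        # 0.25 ** 10, under one millionth
```

With deeper networks the early layers receive gradients too small to drive meaningful weight updates.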

Deep learning is a newer area of machine learning research, which has been introduced with the objective of moving machine learning closer to one of its original goals: artificial intelligence.

The terms shallow and deep refer to the number of layers in a neural network; a shallow neural network has a small number of layers, and is usually regarded as having a single hidden layer.

Backpropagation is a training method used for a multi-layer neural network. It is also called the generalized delta rule. It is a gradient descent method which minimizes the total squared error of the output computed by the net.
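
A minimal sketch of one training loop on a tiny 1-1-1 network (the single training example, starting weights, and learning rate are assumed for illustration): the forward pass computes the output, then the error gradient is propagated backwards layer by layer and the weights are updated by gradient descent.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w1, w2, x):
    h = sigmoid(w1 * x)    # hidden activation
    y = sigmoid(w2 * h)    # output activation
    return h, y

x, target, lr = 1.0, 0.8, 0.5    # assumed example values
w1, w2 = 0.3, 0.3                # assumed starting weights (no biases)

for _ in range(1000):
    h, y = forward(w1, w2, x)
    # Output-layer delta: dE/dy * dy/dnet, with E = (y - target)^2 / 2
    delta_out = (y - target) * y * (1 - y)
    # Hidden-layer delta: the output delta propagated back through w2
    delta_hid = delta_out * w2 * h * (1 - h)
    # Gradient-descent updates (the generalized delta rule)
    w2 -= lr * delta_out * h
    w1 -= lr * delta_hid * x

h, y = forward(w1, w2, x)
print(round(y, 3))   # close to the 0.8 target
```

The squared error of the output shrinks over the iterations, which is exactly the quantity the text says the method minimizes.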

Gradient Descent is an optimization algorithm in machine learning used to minimize a function by iteratively moving towards the minimum value of the function.
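
A minimal sketch on a one-dimensional function (the function, starting point, and learning rate are chosen for illustration):

```python
def gradient_descent(grad, start, lr=0.1, steps=100):
    # Repeatedly step in the direction opposite the gradient,
    # which moves x toward a minimum of the function
    x = start
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2; its gradient is 2 * (x - 3),
# and its minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), start=0.0)
print(x_min)   # close to 3.0
```

Each step shrinks the distance to the minimum by a constant factor here, so the iterates converge quickly for this convex function.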