
NeuralNetworks

REGRESSION

Regression helps us predict a dependent output variable based on the values of one or more independent input variables.

LINEAR REGRESSION

Linear regression finds the relationship between the input and output variables by fitting a line that best describes the given data. The equation used for prediction is a straight-line equation: y = b0 + b1 * x.

| x   | y  |
| --- | -- |
| 140 | 20 |
| 150 | 40 |
| 160 | 30 |
| 170 | 55 |

What will the value of y be at x = 165? By drawing a line of best fit through the data points, we can predict the value at 165, as in the sketch below.
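A minimal sketch of this fit, assuming NumPy's `polyfit` as the least-squares tool (any straight-line fit would do):

```python
import numpy as np

# Data from the table above
x = np.array([140, 150, 160, 170])
y = np.array([20, 40, 30, 55])

# Fit a straight line y = b1*x + b0 by least squares
b1, b0 = np.polyfit(x, y, deg=1)

# Predict the output at x = 165 (about 45.75 for this data)
print(b1 * 165 + b0)
```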

LOGISTIC REGRESSION

Logistic regression predicts the output class for a given input; its output is 0 or 1 (yes or no). It does this by passing a linear combination of the input, b0 + b1 * x, through the sigmoid function, which squashes it into a probability between 0 and 1.

b0 is the intercept and b1 is the coefficient (slope) learned from the data.
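A minimal sketch of this prediction step, with hypothetical values for b0 and b1 chosen purely for illustration:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the (0, 1) range
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical intercept and coefficient, for illustration only
b0, b1 = -4.0, 0.8

x = 6.0
p = sigmoid(b0 + b1 * x)           # probability of class 1
prediction = 1 if p >= 0.5 else 0  # threshold at 0.5 -> yes/no
print(p, prediction)
```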

PARAMETERS

ACTIVATION FUNCTION

The activation function determines whether the neuron should be "activated" (fire) or not, based on the input it receives.

| Activation Function | Description | Use Cases |
| --- | --- | --- |
| Sigmoid | Squashes input values between 0 and 1 | Estimating probabilities; outputting values between 0 and 1 |
| Tanh | Squashes input values between -1 and 1 | Introducing non-linearities; normalizing data between -1 and 1 |
| ReLU | Keeps positive values as they are, turns negatives to 0 | Hidden layers in deep neural networks; faster learning and efficient training |
| Leaky ReLU | Similar to ReLU, but allows a small negative slope | Preventing dead neurons; improved training with negative inputs |
| Softmax | Converts values into a probability distribution | Multi-class classification; identifying the most probable class |
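A quick NumPy sketch of each activation in the table (the 0.01 leak slope for Leaky ReLU is a common default, assumed here for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))       # output in (0, 1)

def tanh(z):
    return np.tanh(z)                      # output in (-1, 1)

def relu(z):
    return np.maximum(0, z)                # negatives become 0

def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)   # small negative slope instead of 0

def softmax(z):
    e = np.exp(z - np.max(z))              # subtract max for numerical stability
    return e / e.sum()                     # sums to 1: a probability distribution

z = np.array([-2.0, 0.0, 3.0])
print(softmax(z))  # the largest input gets the highest probability
```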
