Part 1 :- Students, in this part we will see a little theory about learning paths, the difference between AI, ML, and DL, as well as regression and the types of regression.
CONTENT 1: Learning Paths
Hey Data Scientist,
Simple Way to Learn Machine Learning is bringing you a new learning experience. We know how difficult it is to carve out a career track, so we're introducing the Simple Way to Learn Machine Learning to guarantee your way to success.
This Skill Track is a perfect fit if you:
Struggle to determine the skills you need to succeed in this field,
Are unsure which courses are right for you,
Desire to arrange your learning curve efficiently and on your schedule.
Built to deliver streamlined on-the-job success, the Simple Way to Learn Machine Learning provides structured curricula for in-demand Machine Learning skills.
After completing all parts of the track, students will walk away with the required Machine Learning skills and a complete portfolio of work to showcase in competitive job interviews.
CONTENT 2: ML vs. DL vs. AI - What’s the Difference?
Dear Friends,
'ML vs. DL vs. AI — What's the Difference?' is one of the most popular questions we hear from students, and hopefully this part clarifies a few lingering questions for you too.
Artificial Intelligence :-
The science that empowers computers to mimic human intelligence, such as decision making, text processing, and visual perception. AI is the broader field (i.e., the big umbrella) that contains several subfields such as machine learning, robotics, and computer vision.
Machine Learning :-
Machine Learning is a subfield of Artificial Intelligence that enables machines to improve at a given task with experience. It is important to note that all machine learning techniques are classified as Artificial Intelligence techniques. However, not all Artificial Intelligence counts as Machine Learning: some basic rule-based engines can be classified as AI, but because they do not learn from experience, they do not belong to the machine learning category.
Deep Learning :-
Deep learning, an advanced Artificial Intelligence technique, has become increasingly popular in the past few years, thanks to abundant data and increased computing power. It is the main technology behind many of the applications we use every day, including online language translation and automated face-tagging on social media.
CONTENT 3: Regression Types
1. Linear Regression :-
Linear regression is a model that posits a linear connection between two variables: the input variable, represented by x, and the single output variable, represented by y. In other words, y is a linear combination of the input variable x.
Formula :- Yi = f(Xi, B) + ei
Where :-
- Yi = a dependent variable
- f = a function
- Xi = an independent variable
- B = the unknown parameters
- ei = error
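To make the formula concrete, here is a minimal sketch of fitting a simple linear regression with scikit-learn. The numbers below are made-up toy data purely for illustration; any equivalent library would work the same way.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data: y is roughly 2*x + 1 plus a little noise (illustrative values only)
X = np.array([[1], [2], [3], [4], [5]])    # independent variable Xi
y = np.array([3.1, 4.9, 7.2, 9.0, 11.1])   # dependent variable Yi

model = LinearRegression()
model.fit(X, y)                            # estimates the unknown parameters B

print("slope b1:", model.coef_[0])
print("intercept b0:", model.intercept_)
print("prediction for x=6:", model.predict([[6]])[0])
```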
2. Logistic Regression :-
Logistic regression is both a classification and a regression technique. It helps predict a binary outcome, such as yes/no, 1/0, or true/false. In logistic regression the dependent variable is categorical, which means it can only take integer values indicating different classes. It belongs to the Generalized Linear Model family of algorithms. A small code sketch follows the list of types below.
Formula :- logit(p) = ln(p / (1 - p)) = b0 + b1X1 + b2X2
Types of Logistic Regression :-
Binomial Regression :-
In binomial logistic regression, there can be only two possible types of the dependent variable, such as 1 or 0, Pass or Fail, etc.
Multinomial Regression :-
In multinomial Logistic regression, there can be 3 or more possible unordered types of the dependent variable, such as "lion", "sheep", or "goat".
Ordinal Regression :-
In ordinal logistic regression, there can be 3 or more possible ordered types of the dependent variable, such as "Low", "Medium", or "High".
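As a quick illustration of the binomial case above, here is a minimal sketch using scikit-learn's LogisticRegression. The pass/fail data is invented purely for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hours studied (X) vs. pass/fail outcome (y) -- made-up numbers for illustration
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])   # 0 = fail, 1 = pass

clf = LogisticRegression()
clf.fit(X, y)

# predict_proba returns [P(fail), P(pass)]; internally the model works with logit(p)
print("P(pass | 4.5 hours):", clf.predict_proba([[4.5]])[0, 1])
print("predicted class:", clf.predict([[4.5]])[0])
```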
3. Polynomial Regression :-
Polynomial Regression is a regression algorithm that models the relationship between a dependent variable (y) and an independent variable (x) as an nth-degree polynomial.
Formula :- y = b0 + b1x + b2x^2 + b3x^3 + ... + bnx^n
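Here is a minimal sketch of polynomial regression, assuming scikit-learn's PolynomialFeatures to build the x, x^2, ... terms and a plain linear model on top. The data is synthetic, generated from a quadratic just for illustration.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic data following y = 1 + 2x + 3x^2 (illustrative only)
X = np.linspace(-3, 3, 20).reshape(-1, 1)
y = 1 + 2 * X.ravel() + 3 * X.ravel() ** 2

# degree=2 adds the x^2 column; the linear model then fits b0, b1, b2
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)

print("prediction at x=4:", model.predict([[4]])[0])
```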
4. Support Vector Regression :-
Support Vector Regression is a supervised learning algorithm used to predict continuous values. Support Vector Regression uses the same principle as SVMs. The basic idea behind SVR is to find the best-fit line; in SVR, the best-fit line is the hyperplane that contains the maximum number of points within its margin of tolerance.
Formula :- y = wx + b
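Here is a minimal sketch of Support Vector Regression, assuming scikit-learn's SVR. The kernel, C, and epsilon values below are arbitrary choices for illustration, and the data is a toy noisy sine wave; in practice you would tune these settings.

```python
import numpy as np
from sklearn.svm import SVR

# Noisy sine wave as toy regression data (illustrative only)
rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(40, 1), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.randn(40)

# epsilon defines the margin of tolerance around the fitted function
svr = SVR(kernel="rbf", C=100, epsilon=0.1)
svr.fit(X, y)

print("prediction at x=2.5:", svr.predict([[2.5]])[0])
```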
In our next part (Part 2), we will look at data preprocessing tools and how to import some important Machine Learning libraries. So stay with us.