Can anyone give a real life example of supervised learning and unsupervised learning? [closed]

I recently studied supervised and unsupervised learning. In theory, I know that supervised learning means learning from labeled datasets, while unsupervised learning means finding structure (such as clusters) in data without any labels.

The problem is that during my studies I often get confused about whether a given example is supervised or unsupervised learning.

Can anyone please give a real-life example?

Hamner answered 3/10, 2014 at 16:29 Comment(0)

Supervised learning:

  • You get a bunch of photos, each with information about what is in it, and you train a model to recognize new photos.
  • You have a bunch of molecules, with information about which ones are drugs, and you train a model to answer whether a new molecule is also a drug.

Unsupervised learning:

  • You have a bunch of photos of 6 people, but without information about who appears in which one, and you want to divide this dataset into 6 piles, each with the photos of one individual.
  • You have molecules, some of which are drugs and some of which are not, but you do not know which are which, and you want the algorithm to discover the drugs.
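
A minimal sketch of both settings (the feature values, labels, and photo embeddings below are invented for illustration; in practice the molecules and photos would first be turned into numeric feature vectors):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Supervised: molecules described by (hypothetical) numeric features,
# each labeled 1 = drug, 0 = not a drug.
X_molecules = np.array([[0.2, 1.1], [0.4, 0.9], [2.5, 3.1], [2.8, 2.9]])
y_is_drug   = np.array([0, 0, 1, 1])

clf = LogisticRegression().fit(X_molecules, y_is_drug)
print(clf.predict([[2.6, 3.0]]))    # predict whether a new molecule is a drug

# Unsupervised: photo feature vectors with no names attached;
# ask for 6 groups, hoping each group ends up being one person's photos.
X_photos = np.random.rand(60, 128)           # 60 photos, 128-dim embeddings (made up)
piles = KMeans(n_clusters=6, n_init=10).fit_predict(X_photos)
print(piles[:10])                            # cluster id assigned to each photo
```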
Byplay answered 4/10, 2014 at 21:31 Comment(2)
Thanks... that means in unsupervised learning we cluster data into groups without any external knowledge or labels, right?Hamner
Clustering is part of unsupervised learning, but not the only kind. The only distinction between supervised and unsupervised learning is access to labels (supervised) or the lack of it (unsupervised).Byplay

Supervised Learning:

  • is like learning with a teacher
  • training dataset is like a teacher
  • the training dataset is used to train the machine

Example:

Classification: the machine is trained to assign an input to one of a set of classes.

  • classifying whether a patient has a disease or not
  • classifying whether an email is spam or not

Regression: the machine is trained to predict a continuous value such as price, weight, or height.

  • predicting house/property price
  • predicting stock market price
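
A minimal regression sketch along these lines, assuming each house is described by invented features (area and number of bedrooms) with made-up prices:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Training data: [area in sq. ft, number of bedrooms] -> price (values are invented)
X = np.array([[1000, 2], [1500, 3], [2000, 3], [2500, 4]])
y = np.array([200_000, 280_000, 340_000, 420_000])

model = LinearRegression().fit(X, y)
print(model.predict([[1800, 3]]))   # predicted price for an unseen house
```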

Unsupervised Learning:

  • is like learning without a teacher
  • the machine learns through observation and finds structures in the data

Example:

Clustering: a clustering problem is one where you want to discover the inherent groupings in the data

  • such as grouping customers by purchasing behavior

Association: an association rule learning problem is one where you want to discover rules that describe large portions of your data

  • such as people who buy X also tend to buy Y
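
A tiny sketch of the "people who buy X also tend to buy Y" idea using plain co-occurrence counts (the baskets are invented; a real association-rule miner would use something like Apriori):

```python
from itertools import combinations
from collections import Counter

# Invented market baskets
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"bread", "jam"},
    {"butter", "milk"},
]

item_counts = Counter()
pair_counts = Counter()
for basket in baskets:
    item_counts.update(basket)
    pair_counts.update(combinations(sorted(basket), 2))

# Confidence of the rule X -> Y: P(Y in basket | X in basket)
for (x, y), n_xy in pair_counts.items():
    print(f"{x} -> {y}: confidence {n_xy / item_counts[x]:.2f}")
    print(f"{y} -> {x}: confidence {n_xy / item_counts[y]:.2f}")
```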

Read more: Supervised and Unsupervised Machine Learning Algorithms

Procne answered 8/9, 2017 at 14:51 Comment(0)

Supervised Learning

These are simple, everyday cases you will have encountered a number of times, for example:

  1. Cortana or any automated speech system on your phone is trained on your voice and then works based on that training.
  2. Based on various features (past head-to-head record, pitch, toss, player-vs-player matchups), WASP predicts the winning percentage of each team.
  3. Train an OCR system on your handwriting; once trained, it can convert images of your handwriting into text (with some level of accuracy, obviously).
  4. Based on prior knowledge (when it's sunny, the temperature is higher; when it's cloudy, the humidity is higher, etc.), weather apps predict the parameters for a given time.
  5. Based on past information about spam, filtering a new incoming email into the Inbox (normal) or the Junk folder (spam); a minimal sketch follows this list.

  6. Biometric attendance or ATM systems: after you train the machine with a couple of inputs of your biometric identity (thumb, iris, ear lobe, etc.), it can validate your future input and identify you.
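
A minimal sketch of example 5, the spam filter, using a bag-of-words model and naive Bayes (the emails and labels are invented):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Tiny invented training set: 1 = spam, 0 = normal
emails = [
    "win a free prize now",
    "cheap loans click here",
    "meeting agenda for monday",
    "lunch with the project team",
]
labels = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(emails)          # bag-of-words counts
clf = MultinomialNB().fit(X, labels)

new_email = vectorizer.transform(["free prize waiting, click now"])
print(clf.predict(new_email))                 # 1 -> route to Junk, 0 -> Inbox
```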

Unsupervised Learning

  1. A friend invites you to his party, where you meet total strangers. You will group them using unsupervised learning (no prior knowledge), and the grouping can be based on gender, age group, dress, educational qualification, or whatever criterion you like. Why is this learning different from supervised learning? Because you did not use any past/prior knowledge about the people and grouped them "on the go".

  2. NASA discovers new heavenly bodies, finds them different from previously known astronomical objects (stars, planets, asteroids, black holes, etc.), i.e. it has no knowledge about these new bodies, and groups them however it likes (by distance from the Milky Way, intensity, gravitational force, red/blue shift, or whatever).

  3. Suppose you have never seen a cricket match before and happen to watch a video on the internet. You can still group the players by different criteria: players wearing the same sort of kit in one group, players of one style in one group (batsmen, bowlers, fielders), players grouped by batting hand (right-handed vs left-handed), or however else you would observe and group them.

  4. We are conducting a survey of 500 questions to estimate the IQ level of students in a college. Since this questionnaire is too big, after 100 students the administration decides to trim it down to fewer questions, and for this we use a statistical procedure such as PCA (a sketch follows this list).
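
A minimal sketch of the PCA step from example 4, with random numbers standing in for the real survey answers:

```python
import numpy as np
from sklearn.decomposition import PCA

# 100 students x 500 questions (random numbers stand in for real answers)
responses = np.random.rand(100, 500)

# Keep enough components to explain ~90% of the variance
pca = PCA(n_components=0.90, svd_solver="full")
reduced = pca.fit_transform(responses)

print(responses.shape, "->", reduced.shape)   # far fewer columns than 500
```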

I hope these examples explain the difference in detail.

Barrada answered 27/4, 2017 at 15:12 Comment(0)

Supervised learning has inputs and the corresponding correct outputs. For example, we have data on whether each person liked a movie or not; based on interviewing people and gathering their responses, we can predict whether the movie is going to be a hit.

Raw Data

Let's look at the picture in the link above. The restaurants I have visited are marked with red circles; the restaurants I have not visited are marked with blue circles.

Now, if I have two restaurants to choose from, A and B, marked in green, which one will I choose?

Simple: the data can be separated linearly into two parts, meaning we can draw a line segregating the red and blue circles. Look at the picture in the link below:

Learned By supervised learning

Now we can say with some confidence that the chance of my visiting B is higher than that of A. This is a case of supervised learning.
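
A minimal sketch of this idea, with invented 2-D coordinates for the visited (red) and not-visited (blue) restaurants; a logistic-regression classifier effectively learns the separating line:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented restaurant coordinates (e.g. distance from home, average price)
visited     = np.array([[1.0, 1.2], [1.5, 0.8], [2.0, 1.5]])   # red circles
not_visited = np.array([[5.0, 4.8], [5.5, 5.2], [6.0, 4.5]])   # blue circles

X = np.vstack([visited, not_visited])
y = np.array([1, 1, 1, 0, 0, 0])                 # 1 = visited, 0 = not visited

clf = LogisticRegression().fit(X, y)             # effectively learns the separating line

A, B = [5.2, 5.0], [1.8, 1.1]                    # the two candidate restaurants
print(clf.predict_proba([A, B])[:, 1])           # probability that I would visit each
```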

Unsupervised learning has only inputs, with no labeled outputs. Suppose a taxi driver has the option of accepting or rejecting bookings. We have plotted the locations of his accepted bookings on a map with blue circles, as shown below:

Raw data for unsupervised learning

Now the taxi driver gets two bookings, A and B; which one will he accept? If we look at the plot, we can see that his accepted bookings form a cluster in the lower-left corner. That is shown in the picture below:

Unsupervised Learning
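
A minimal sketch of the taxi example, with invented map coordinates for the accepted bookings; k-means discovers the dense lower-left cluster, and we check which new booking falls into it:

```python
import numpy as np
from sklearn.cluster import KMeans

# Invented (x, y) map locations of previously accepted bookings,
# mostly concentrated in the lower-left corner
accepted = np.array([[1.0, 1.1], [1.2, 0.9], [0.8, 1.3], [1.1, 1.0], [7.5, 8.0]])

kmeans = KMeans(n_clusters=2, n_init=10).fit(accepted)

A, B = [7.0, 7.8], [1.0, 1.2]      # two new booking requests
print(kmeans.predict([A, B]))      # B lands in the dense lower-left cluster,
                                   # so it is the likelier one to be accepted
```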

Brittaneybrittani answered 3/1, 2018 at 21:20 Comment(0)

Supervised Learning

Supervised learning is fairly common in classification problems because the goal is often to get the computer to learn a classification system that we have created. Digit recognition, once again, is a common example of classification learning. More generally, classification learning is appropriate for any problem where deducing a classification is useful and the classification is easy to determine. In some cases, it might not even be necessary to give pre-determined classifications to every instance of a problem if the agent can work out the classifications for itself. This would be an example of unsupervised learning in a classification context.

Supervised learning is the most common technique for training neural networks and decision trees. Both of these techniques are highly dependent on the information given by the pre-determined classifications. In the case of neural networks, the classification is used to determine the error of the network and then adjust the network to minimize it, and in decision trees, the classifications are used to determine which attributes provide the most information for solving the classification puzzle. For now, it should be sufficient to know that both of these examples thrive on having some "supervision" in the form of pre-determined classifications.
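
A minimal sketch of the decision-tree case, with invented labeled data; the split thresholds printed by export_text are chosen purely to separate the given labels:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy labeled data: [hours studied, hours slept] -> passed exam (labels are invented)
X = [[1, 4], [2, 5], [8, 7], [9, 6]]
y = [0, 0, 1, 1]                      # the pre-determined classifications

tree = DecisionTreeClassifier().fit(X, y)

# The learned splits exist only to separate the given labels as cleanly as possible
print(export_text(tree, feature_names=["hours_studied", "hours_slept"]))
```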

Speech recognition using hidden Markov models and Bayesian networks relies on some elements of supervision as well in order to adjust parameters to, as usual, minimize the error on the given inputs.

Notice something important here: in the classification problem, the goal of the learning algorithm is to minimize the error with respect to the given inputs. These inputs, often called the "training set", are the examples from which the agent tries to learn. But learning the training set well is not necessarily the best thing to do. For instance, if I tried to teach you exclusive-or, but only showed you combinations consisting of one true and one false, but never both false or both true, you might learn the rule that the answer is always true. Similarly, with machine learning algorithms, a common problem is over-fitting the data and essentially memorizing the training set rather than learning a more general classification technique.
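
A minimal sketch of the exclusive-or point above, using a decision tree trained only on the "one true, one false" cases:

```python
from sklearn.tree import DecisionTreeClassifier

# Exclusive-or, but the training set only contains the "one true, one false" cases
X_train = [[0, 1], [1, 0]]
y_train = [1, 1]                      # XOR is true for both training examples

model = DecisionTreeClassifier().fit(X_train, y_train)

# The model has only ever seen "true", so it also answers 1 for these:
print(model.predict([[0, 0], [1, 1]]))   # wrong: XOR(0,0) and XOR(1,1) are both 0
```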

Unsupervised Learning

Unsupervised learning seems much harder: the goal is to have the computer learn how to do something that we don't tell it how to do! There are actually two approaches to unsupervised learning. The first approach is to teach the agent not by giving explicit categorizations, but by using some sort of reward system to indicate success. Note that this type of training will generally fit into the decision problem framework because the goal is not to produce a classification but to make decisions that maximize rewards. This approach nicely generalizes to the real world, where agents might be rewarded for doing certain actions and punished for doing others.

Often, a form of reinforcement learning can be used for unsupervised learning, where the agent bases its actions on the previous rewards and punishments without necessarily even learning any information about the exact ways that its actions affect the world. In a way, all of this information is unnecessary because by learning a reward function, the agent simply knows what to do without any processing because it knows the exact reward it expects to achieve for each action it could take. This can be extremely beneficial in cases where calculating every possibility is very time consuming (even if all of the transition probabilities between world states were known). On the other hand, it can be very time consuming to learn by, essentially, trial and error.
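
A minimal sketch of this reward-driven idea as a two-action bandit with epsilon-greedy action selection (the reward probabilities are invented); the agent only tracks the expected reward of each action, not how its actions affect the world:

```python
import random

# Two actions with reward probabilities unknown to the agent
true_reward_prob = {"A": 0.3, "B": 0.7}

estimates = {"A": 0.0, "B": 0.0}      # learned expected reward per action
counts    = {"A": 0, "B": 0}
epsilon   = 0.1                       # exploration rate

for _ in range(1000):
    # Mostly pick the action with the best estimate, sometimes explore
    if random.random() < epsilon:
        action = random.choice(["A", "B"])
    else:
        action = max(estimates, key=estimates.get)

    reward = 1.0 if random.random() < true_reward_prob[action] else 0.0

    # Incrementally update the running average reward for this action
    counts[action] += 1
    estimates[action] += (reward - estimates[action]) / counts[action]

print(estimates)   # the estimate for "B" ends up higher, so the agent prefers it
```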

This kind of reward-driven learning can be powerful because it assumes no pre-discovered classification of examples. In some cases, for example, our classifications may not be the best possible. One striking example is that the conventional wisdom about the game of backgammon was turned on its head when a series of computer programs (Neurogammon and TD-Gammon) that learned through unsupervised learning became stronger than the best human backgammon players merely by playing against themselves over and over. These programs discovered some principles that surprised the backgammon experts and performed better than backgammon programs trained on pre-classified examples.

A second type of unsupervised learning is called clustering. In this type of learning, the goal is not to maximize a utility function, but simply to find similarities in the training data. The assumption is often that the clusters discovered will match reasonably well with an intuitive classification. For instance, clustering individuals based on demographics might result in a clustering of the wealthy in one group and the poor in another.
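
A minimal clustering sketch along these lines, with invented demographic data; hierarchical (agglomerative) clustering is used here just for variety:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

# Invented demographics: [age, annual income in $1000s]
people = np.array([[25, 30], [30, 35], [28, 32],      # lower-income group
                   [45, 180], [50, 200], [48, 190]])  # wealthier group

labels = AgglomerativeClustering(n_clusters=2).fit_predict(people)
print(labels)   # the two clusters roughly separate the wealthy from the rest
```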

Brianbriana answered 10/2, 2016 at 16:39 Comment(0)

Supervised learning: in simple terms, you have certain inputs and expect corresponding outputs. For example, you have historical stock-market data with known outcomes; a model trained on it can take present inputs and give you the needed output, such as predictions for the next few years.

Unsupervised learning: you have parameters like the colour, type, and size of items, but no labels telling you whether each item is a fruit, a plant, an animal, or whatever it is, and you want a program to group similar items together on its own.

Styrene answered 7/6, 2018 at 13:46 Comment(1)
The main difference is that supervised learning has labeled training samples.Barrada
