Dataset columns: video_id (string, length 11), text (string, length 361-490), start_second (int64, 0-11.3k), end_second (int64, 18-11.3k), url (string, length 48-52), title (string, length 0-100), thumbnail (string, length 0-52)
5cySIwg49RI
classification from researchers at Facebook's AI research lab. This animation from Facebook's blog post on billion-scale semi-supervised learning shows the idea of their semi-supervised training framework before integrating weak supervision. In this case their take on semi-supervised learning is different from other definitions of semi-supervised learning, such as
38
54
https://www.youtube.com/watch?v=5cySIwg49RI&t=38s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
rotating an image and then predicting the rotation angle, or something like word2vec or wav2vec where you mask out certain parts of the sequence and then train the model to predict the missing part of the sequence. This idea of semi-supervised learning is to have a labeled dataset such as the ImageNet dataset, train a large-capacity model like a ResNeXt-101 32x48d grouped-
54
74
https://www.youtube.com/watch?v=5cySIwg49RI&t=54s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
convolution ResNeXt architecture, and then use this massive high-capacity model to predict the labels on an unlabeled dataset. Then you would use the softmax distribution of these predictions to pre-train the target model, i.e. as in model distillation, such as what powers models like Hugging Face's DistilBERT. Then you'll fine-tune the model that's been
74
95
https://www.youtube.com/watch?v=5cySIwg49RI&t=74s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
trained with model distillation on the labeled dataset, and this is your new model (a minimal sketch of this distillation step is shown below). So one of the interesting things we already see about this is the novel use of model distillation as semi-supervised learning; it's not really common to use these two terms together, semi-supervised learning and model distillation. Also interestingly, we can see this kind of model compression
95
113
https://www.youtube.com/watch?v=5cySIwg49RI&t=95s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
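A minimal sketch of the teacher-student distillation step described above, assuming a PyTorch setup. The names `teacher`, `student`, and the temperature value are placeholders and illustrative assumptions, not the paper's exact recipe.

```python
import torch
import torch.nn.functional as F

def distill_step(teacher, student, optimizer, images, temperature=2.0):
    """One distillation step: the student is trained to match the teacher's
    softmax distribution on a batch of unlabeled images."""
    teacher.eval()
    with torch.no_grad():
        # Soft labels: the teacher's softened class distribution.
        soft_targets = F.softmax(teacher(images) / temperature, dim=1)
    student_log_probs = F.log_softmax(student(images) / temperature, dim=1)
    loss = F.kl_div(student_log_probs, soft_targets, reduction="batchmean")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

After this distillation pre-training on the unlabeled images, the student would be fine-tuned on the labeled ImageNet data, as the transcript describes.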
5cySIwg49RI
that arises from this framework: you can have a really high-capacity teacher model like the ResNeXt-101 32x48d, and then you can have a lower-capacity, more manageable model with probably faster inference time and lower storage costs, like the ResNet-50, that you could deploy on mobile and IoT and these kinds of things. This animation shows the extension from the semi-supervised training framework
113
132
https://www.youtube.com/watch?v=5cySIwg49RI&t=113s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
to semi-weakly supervised training. In this case, instead of having some massive unlabeled dataset we have a weakly supervised dataset, so instead of just having a collection of images we have a weak label, such as the hashtags on Instagram images. The thing with the weakly supervised hashtags on Instagram images is that they're really subjective, they're noisy, and not really
132
150
https://www.youtube.com/watch?v=5cySIwg49RI&t=132s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
as precisely labeled as, say, the data from ImageNet. So in this model we're going to pre-train the teacher model, same idea of having some larger-capacity teacher model and smaller-capacity student network: we're gonna pre-train it with the weakly supervised dataset, fine-tune it with ImageNet, then we're gonna use the fine-tuned model, after having been pre-
150
167
https://www.youtube.com/watch?v=5cySIwg49RI&t=150s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
trained, to predict the softmax distribution over the weakly supervised dataset. Then we use this model distillation, knowledge distillation, in order to train our student network, we fine-tune the student network, and then we have our trained model. One of the interesting issues the research paper raises is this idea of class imbalance in unlabeled and weakly supervised datasets, so
167
184
https://www.youtube.com/watch?v=5cySIwg49RI&t=167s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
one characteristic of machine learning models, and, you know, deep learning convolutional image classifiers, is that class imbalance can really destroy the performance. For example, if you're training a cat-versus-dog image classifier and 80% of your training data is cats and 20% is dogs, your trained model is going to want to predict cats; it's going to be
184
202
https://www.youtube.com/watch?v=5cySIwg49RI&t=184s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
biased towards the imbalanced data. So in these weakly supervised datasets, such as hashtags on Instagram images, when you then try to transfer that into ImageNet classification, there's going to be a long-tail distribution where you're not gonna have as many of the really specific ImageNet classes contained in the weakly labeled dataset. So another interesting idea is just incorporating
202
222
https://www.youtube.com/watch?v=5cySIwg49RI&t=202s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
model distillation in the semi-supervised learning, this teacher-student model compression. This framework is gonna achieve 81.2% ImageNet top-1 accuracy with a ResNet-50, and they're gonna scale this up to the ResNeXt-101 32x16d for the student network capacity and achieve 84.8%, and this is up from 84.2% from their
222
241
https://www.youtube.com/watch?v=5cySIwg49RI&t=222s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
previous research on doing a lot of label engineering for the weakly supervised Instagram dataset. In that previous study they had achieved 85.4%, but with a larger-capacity model, the 48d, and they probably didn't test the 48d here because, you know, it's expensive and time-consuming to train these kinds of models. This research from Facebook's AI
241
259
https://www.youtube.com/watch?v=5cySIwg49RI&t=241s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
research lab is in line with a lot of their other work, such as using weakly supervised pre-training for video action recognition using over 65 million images from Instagram, as well as using billions of images in their weakly supervised pre-training of an ImageNet image classifier. In that case they do achieve a slightly higher performance, but they do have larger-capacity models,
259
277
https://www.youtube.com/watch?v=5cySIwg49RI&t=259s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
and also interestingly, in that case they are manually removing some of the noise from the weakly supervised dataset, whereas this framework is gonna present an automated way of doing this. An interesting characteristic of the new semi-weak supervised training framework is that they're going to use an explicit algorithm to balance the
277
294
https://www.youtube.com/watch?v=5cySIwg49RI&t=277s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
classes in the predicted distribution from the weakly supervised dataset. This teacher model has been trained on the ImageNet classification task, but the weakly supervised dataset probably isn't as balanced as ImageNet classification; it's probably heavily skewed towards some classes more than others, such as dogs and things like that, compared to these really specific
294
310
https://www.youtube.com/watch?v=5cySIwg49RI&t=294s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
things like, I don't know, a tiger shark, things like this. This visualization shows the top-K scoring of examples (a rough sketch of this selection is shown below): as the teacher model predicts a distribution of class labels over the unlabeled data or the weakly labeled data, the images are going to be ranked according to their class probability and then they're going to be balanced in this way so that
310
330
https://www.youtube.com/watch?v=5cySIwg49RI&t=310s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
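A rough sketch of the per-class top-K selection described above, assuming we already have the teacher's class probabilities for every unlabeled image; variable names and the exact ranking details are assumptions rather than the paper's code.

```python
import numpy as np

def select_top_k_per_class(probs, k):
    """probs: array of shape (num_images, num_classes) of teacher softmax scores.
    Returns a dict mapping class index -> indices of the k highest-scoring images,
    so every class contributes the same number of distillation examples."""
    selected = {}
    for c in range(probs.shape[1]):
        ranked = np.argsort(-probs[:, c])  # images ranked by probability of class c
        selected[c] = ranked[:k]
    return selected
```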
5cySIwg49RI
each class has an even number of training samples. In this visualization you see how at rank 1,000 the image is very close to the class, like the leaf beetle image really looks like a beetle, whereas by rank 10,000 or 16,000 it's not really a beetle anymore, but the teacher model has given some probability to beetle when predicting the distribution for that image. Another
330
350
https://www.youtube.com/watch?v=5cySIwg49RI&t=330s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
interesting issue with this framework that's raised in the paper is the idea of inference time with model distillation. In this case we're predicting over a billion unlabeled or weakly labeled images with our teacher model, so we want fast inference time; this prediction cost is much more important here than it is for typical examples of model distillation
350
369
https://www.youtube.com/watch?v=5cySIwg49RI&t=350s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
where the datasets aren't that large; when you have billions of images you want to make sure the teacher model has fast inference. I think with the rising popularity of knowledge distillation, model distillation techniques, such as Hugging Face's DistilBERT and now Facebook's semi-weakly supervised training paradigm, we're gonna see these kinds of inference accelerators like NVIDIA's
369
384
https://www.youtube.com/watch?v=5cySIwg49RI&t=369s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
TensorRT accelerator become more and more important, because usually these frameworks are developed for inference once the model has been deployed, but now we're seeing inference be a part of the training loop as well in this teacher-student model distillation paradigm. In their paper the researchers at Facebook give these six guideline recommendations for large-scale
384
401
https://www.youtube.com/watch?v=5cySIwg49RI&t=384s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
semi-supervised learning. The first idea is really interesting, this teacher-student model distillation paradigm; also really interestingly and uniquely in this paper, they're going to test model distillation when the student and the teacher have the same architecture or the same capacity. The second idea is to fine-tune the model with true labels only; this is a
401
416
https://www.youtube.com/watch?v=5cySIwg49RI&t=401s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
pretty intuitive idea; the weakly supervised label dataset has a ton of noise in it compared to the ImageNet dataset or, you know, other more precisely labeled datasets. The third idea is that large-scale unlabeled data is key to this performance; naturally the key driver behind this algorithm is that they have a billion images from Instagram that they're using to train
416
435
https://www.youtube.com/watch?v=5cySIwg49RI&t=416s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
this model. The fourth idea is really interesting: they use a large number of training iterations for their pre-training with the weakly supervised learning, you know, more pre-training iterations compared to normal supervised learning. The fifth idea is a novel contribution of this paper, the idea of having a balanced distribution for inferred labels, so when
435
453
https://www.youtube.com/watch?v=5cySIwg49RI&t=435s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
you're doing the model distillation you don't want to have class imbalance in the distribution of these labels. And the sixth idea is that pre-training the high-capacity teacher model with weak supervision further improves the results, the idea of adding the weak supervision to make this the semi-weak supervised learning framework. Now we'll get into some of the results of their
453
471
https://www.youtube.com/watch?v=5cySIwg49RI&t=453s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
research report. You can check out their repository on GitHub, semi-supervised-ImageNet1K-models, where you have the pre-trained models that you can load with Torch Hub (a small loading sketch is shown below). They also present some of the results: you see the semi-supervised learning framework with different model architectures, from ResNet-18 and 50 up to the ResNeXt group-convolution architectures, and then
471
489
https://www.youtube.com/watch?v=5cySIwg49RI&t=471s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
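A small sketch of loading one of the pre-trained models via Torch Hub, along the lines the transcript mentions. The exact hub path and entry-point name used here ("resnet50_ssl") are assumptions from memory of that repository, so check its README for the identifiers that actually exist.

```python
import torch

# Assumed hub path and model name; the repo's README lists the real entry points
# (e.g. "_ssl" for semi-supervised and "_swsl" for semi-weakly supervised variants).
model = torch.hub.load(
    "facebookresearch/semi-supervised-ImageNet1K-models",
    "resnet50_ssl",
)
model.eval()

# Dummy forward pass over one 224x224 RGB image.
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.shape)  # expected: torch.Size([1, 1000])
```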
5cySIwg49RI
you see up to the 84.8% accuracy when using the semi-weakly supervised learning framework with 193 million parameters on the ResNeXt-101 32x16d architecture. The first set of results they present shows the success of the semi-supervised learning framework with different student models: first we're looking at the ResNet-18, the ResNet-50, and then higher-capacity versions of the
489
508
https://www.youtube.com/watch?v=5cySIwg49RI&t=489s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
ResNeXts, the 50 and 101 higher-capacity ResNeXt variations. We see that the fine-tuned semi-supervised learning framework is always outperforming the fully supervised baseline, where you just train the student model on ImageNet classification. Then they present this idea of varying the complexity of the teacher model, showing how increasing the capacity of the teacher model
508
528
https://www.youtube.com/watch?v=5cySIwg49RI&t=508s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
increases the accuracy of the student model; we see the gains increasing every time as we scale up the capacity of the teacher model while holding the student model constant. This plot shows the results when the teacher and the student model have the same architectural capacity; interestingly, we still see these gains when we're using the same capacity for each model. This
528
546
https://www.youtube.com/watch?v=5cySIwg49RI&t=528s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
plot shows how the top-1 accuracy changes as a function of the unlabeled dataset size used as the, you know, unlabeled or weakly supervised data in this pipeline. We see that the performance continues to increase as the dataset gets larger and larger, following the recommendation from their guideline that having massive unlabeled datasets is key to this
546
565
https://www.youtube.com/watch?v=5cySIwg49RI&t=546s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
framework. This plot shows how the accuracy improves as a function of the number of pre-training iterations; as stated in their recommendations, they use a much larger number of pre-training epochs than supervised learning epochs. Really interestingly, they show that four billion training iterations achieves the highest accuracy in this plot. Then they show the results of
565
584
https://www.youtube.com/watch?v=5cySIwg49RI&t=565s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
increasing the K parameter. The K parameter shown here is this idea of scoring the class label distributions when you're doing the model distillation from the teacher network. Basically the idea is, if you increase the K from say 8K to 16K and then you're looking at a specific class such as leaf beetle, as you get towards the end of the 8K, and then especially from say
584
605
https://www.youtube.com/watch?v=5cySIwg49RI&t=584s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
the top eight thousand and one up to the top sixteen thousand, the images are going to look less and less like beetles; they've just been assigned some probability of beetle and now they're part of the balanced dataset because you've increased this K parameter. So they show how there is a limiting effect to how far you can increase K, because
605
623
https://www.youtube.com/watch?v=5cySIwg49RI&t=605s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
naturally as you increase K past a certain threshold you're making the knowledge distillation dataset for your student network really imbalanced, and deep learning, machine learning, these kinds of decision-boundary models do not respond well to class imbalance, although they don't show things like random over-sampling (a small sketch of that idea is shown below), a lot of the techniques
623
641
https://www.youtube.com/watch?v=5cySIwg49RI&t=623s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
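A tiny illustration of the random over-sampling idea mentioned above (a common imbalance technique, not something the paper itself applies): minority-class examples are re-sampled with replacement until every class count matches the largest one. The data layout is a made-up example.

```python
import random
from collections import defaultdict

def random_oversample(samples):
    """samples: list of (example, label) pairs.
    Returns a new list where every label appears as often as the most frequent one."""
    by_label = defaultdict(list)
    for example, label in samples:
        by_label[label].append((example, label))
    target = max(len(group) for group in by_label.values())
    balanced = []
    for group in by_label.values():
        balanced.extend(group)
        # Duplicate random minority examples until this class reaches the target count.
        balanced.extend(random.choices(group, k=target - len(group)))
    return balanced
```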
5cySIwg49RI
commonly used to overcome class imbalance. This table shows the evidence for balancing the distribution, with these ablation studies, such as a balanced dataset versus an unbalanced dataset, showing a 0.8% accuracy improvement, which is very significant for ImageNet classification, and then also the idea of using the Instagram tags versus ranking the list
641
659
https://www.youtube.com/watch?v=5cySIwg49RI&t=641s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
of the predicted distributions, and then comparing all these performances to supervised learning. This table shows the big highlights of the paper: you see they achieve 81.2% accuracy with the ResNet-50 architecture, and how this is state of the art compared to previous works trying to achieve weakly supervised learning at the same model capacity. Then you can see the
659
679
https://www.youtube.com/watch?v=5cySIwg49RI&t=659s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
head-to-head comparison with their previous work on, you know, label engineering the weakly supervised learning, and you do see how the accuracy starts to saturate at the higher model capacities of the ResNeXt architecture. Another really interesting characteristic of this framework is its success on transfer learning: when they transfer this pre-trained model
679
696
https://www.youtube.com/watch?v=5cySIwg49RI&t=679s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
from ImageNet to the bird image classification task, they achieve a really high level of transfer learning performance compared to previous approaches. This chart shows the difference between fine-tuning just the fully connected layer at the end of the network compared to the full network, and the performance achieved after doing this (a small sketch of the two options is shown below). They also test this model on
696
712
https://www.youtube.com/watch?v=5cySIwg49RI&t=696s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
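A brief sketch of the two transfer-learning options compared in that chart, assuming a torchvision-style model whose classifier is the `fc` attribute; the backbone, class count, and weights source are placeholder assumptions, not the paper's released checkpoint.

```python
import torch.nn as nn
from torchvision import models

NUM_BIRD_CLASSES = 200  # placeholder for the downstream bird-classification task

def build_finetune_model(full_network: bool):
    model = models.resnet50(pretrained=True)  # stands in for the distilled checkpoint
    model.fc = nn.Linear(model.fc.in_features, NUM_BIRD_CLASSES)
    if not full_network:
        # Option 1: freeze the backbone and train only the new fully connected layer.
        for name, param in model.named_parameters():
            param.requires_grad = name.startswith("fc")
    # Option 2 (full_network=True): leave everything trainable and fine-tune end to end.
    return model
```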
5cySIwg49RI
video classification with the DeepMind Kinetics dataset, and they show a significant improvement, achieving 75.9% accuracy using this technique compared to the previous research achieving 74.8% accuracy. Thanks for watching this presentation of semi-weak supervised learning from Facebook's AI research lab. This research paper has
712
729
https://www.youtube.com/watch?v=5cySIwg49RI&t=712s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
5cySIwg49RI
presented a really interesting framework for semi-supervised learning and weakly supervised learning, integrating this model distillation paradigm. There are some really interesting ideas presented in this paper, such as the importance of having a balanced class distribution for the model distillation dataset used to train the student network. Also interesting is this idea
729
746
https://www.youtube.com/watch?v=5cySIwg49RI&t=729s
Semi-Weak Supervised Learning
https://i.ytimg.com/vi/5…axresdefault.jpg
AuqZ4recf0s
hello, my name is Krish Naik and welcome to my YouTube channel. Today we are basically going to discuss how to learn data science for free. Now when I say for free, as you know there are a whole lot of materials available on the internet with respect to the Python programming language, with respect to data science, with respect to machine learning, AI, deep learning, and a whole lot of stuff. So what
0
20
https://www.youtube.com/watch?v=AuqZ4recf0s&t=0s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
I will do in this particular session is show you a systematic way you can basically complete your data science syllabus within three months, so that your transition towards data science can be possible just within three months, and after that you can also attend interviews by updating your resume. Now the resume part will be discussed later on
20
38
https://www.youtube.com/watch?v=AuqZ4recf0s&t=20s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
in the upcoming videos, but today we will just try to focus on how to learn data science for free. I'll show you the systematic way, what YouTube channels you can basically follow, you know, because there are a lot of YouTube channels that provide free materials with respect to Python, machine learning, deep learning and all. Apart from that, I'll also be mentioning various
38
57
https://www.youtube.com/watch?v=AuqZ4recf0s&t=38s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
blogs, and finally I'll also be mentioning the best machine learning book that you can basically use in order to learn data science and machine learning very easily. Now to begin with, guys, I have already prepared a Word doc over here on my laptop, and this particular Word doc I will actually upload to my Google Drive and share with all of you, and that will basically
57
76
https://www.youtube.com/watch?v=AuqZ4recf0s&t=57s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
be given in the description box. Now in this particular video I am basically going to tell you how we are going to learn data science with respect to machine learning and deep learning, considering the Python programming language. The reason I'm talking about the Python programming language, guys, is because I am an expert in the Python programming language;
76
94
https://www.youtube.com/watch?v=AuqZ4recf0s&t=76s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
I've referred to a lot of materials, I have referred to a lot of things, and I've also done a lot of self-study, so that is the reason why I'm actually telling you this. For the R programming language I need to do a little bit more research on which are the best materials, but apart from that, for all the machine learning and deep learning techniques, for learning purposes you can basically use these materials, use
94
113
https://www.youtube.com/watch?v=AuqZ4recf0s&t=94s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
these links that I'm basically giving you, but for the practical applications you should be able to search through various internet resources. Okay, so to begin with, first of all, as this is for the Python programming language, the first topic that I will take is basically from where we can learn Python. So there are two channels that I like;
113
137
https://www.youtube.com/watch?v=AuqZ4recf0s&t=113s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
most of the time I referred to these channels whenever I wanted to learn about Python, whenever I had some queries, and that's the best part of these two channels for Python. As you know, if you are learning data science you should not just know Python, you should also know the object-oriented features in Python; apart from that, you should also know some
137
156
https://www.youtube.com/watch?v=AuqZ4recf0s&t=137s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
other frameworks like Flask and Django, because these two frameworks are very, very important for deployment of machine learning models and deep learning models. During the deployment stage, you know, you create a Flask or Django app (a minimal Flask sketch is shown below) and you just upload it to some server, let it be a platform-as-a-service server, it may be an infrastructure-as-a-service
156
176
https://www.youtube.com/watch?v=AuqZ4recf0s&t=156s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
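A minimal Flask sketch of the kind of prediction micro-service described above; the pickled model file name, input format, and port are placeholder assumptions for illustration.

```python
# Minimal sketch of serving a trained model behind a Flask endpoint.
import pickle
from flask import Flask, jsonify, request

app = Flask(__name__)
with open("model.pkl", "rb") as f:   # hypothetical pickled scikit-learn model
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]   # e.g. {"features": [5.1, 3.5, 1.4, 0.2]}
    prediction = model.predict([features])[0]
    return jsonify({"prediction": str(prediction)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Once the app works locally, the same code can be pushed to a platform-as-a-service or infrastructure-as-a-service server, as mentioned in the video.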
AuqZ4recf0s
server like an EC2 instance of AWS, or the Heroku platform, and many more platforms are there, but initially the web framework, the micro-service framework, is basically created with the help of Flask or Django. So the first channel that I would like to mention is Corey Schafer; he is a wonderful person, he used to work in an IT company before but later on he moved
176
196
https://www.youtube.com/watch?v=AuqZ4recf0s&t=176s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
into teaching on the YouTube channel itself. He has some of the best Python videos, guys; the link is basically given in the description, I mean in the Word doc in the description, so you can basically refer to his YouTube channel link, and I would suggest, if you have any queries, go and see that particular channel with respect to Python, and it starts from the basics, from
196
219
https://www.youtube.com/watch?v=AuqZ4recf0s&t=196s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
the basic installation part. Now in that particular channel you'll also find playlists on Flask and Django, so this was one of the favorite channels that I also refer to for learning Python; that is Corey Schafer. The second person is basically sentdex. sentdex is one of the oldest YouTubers who uploads videos on machine learning, deep learning, Python, natural language processing, so he
219
242
https://www.youtube.com/watch?v=AuqZ4recf0s&t=219s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
is also one of my favorite YouTubers and he's a very simple guy, very, very, you know, if I see him I really get that motivation, because he provides all the materials, all the videos that he uploads; he does not have any online tutorial sites, he just has some sites where he'll basically be writing blogs about whatever he's doing on his YouTube channel. So that was
242
265
https://www.youtube.com/watch?v=AuqZ4recf0s&t=242s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
sentdex, and again the link will be given in that particular Word doc itself. Now once you learn the Python programming language, okay, for the Python programming language I think, guys, if you're planning to cover this in three months, make sure you give three to four hours daily, okay, give three to four hours, and when I'm saying give three to four hours, that
265
286
https://www.youtube.com/watch?v=AuqZ4recf0s&t=265s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
should be three to four productive hours, okay. Now after this, once you finish the Python programming language, the next thing is that you move towards machine learning. Now I know that many of you will ask me a question, saying where is the math part, where is the linear algebra part, where should we learn it from, where should we learn the differential
286
308
https://www.youtube.com/watch?v=AuqZ4recf0s&t=286s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
calculus and many more things, right, the statistics parts and all. Guys, don't go in that particular way; we need to complete the data science syllabus within three months. So what you do is pick up the machine learning algorithm and, through reverse engineering, understand the maths and try to derive it, take a use case, figure out how to solve it, and finally solve that particular use
308
326
https://www.youtube.com/watch?v=AuqZ4recf0s&t=308s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
case, try to optimize that particular use case, try to increase the accuracy. So while you're doing all these steps, you will be learning statistics, you will be learning linear algebra, you will be learning differential calculus wherever it is required; always do that reverse engineering, get that knowledge, you know. Now suppose I want to solve linear regression; when I am
326
344
https://www.youtube.com/watch?v=AuqZ4recf0s&t=326s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
learning linear regression, I know that there will be an equation of a straight line coming into that particular algorithm, like y = mx + c; then after that I'll be deep diving into how I find out that coefficient value, then over there gradient descent comes into existence, then I learn about how this value is
344
362
https://www.youtube.com/watch?v=AuqZ4recf0s&t=344s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
basically calculated through differential calculus. I'm just taking linear regression as one example (a small gradient-descent sketch for this is shown below); similarly you have to learn with respect to each and every algorithm. So for this I have again selected three channels. One, you need to understand the maths behind each and every algorithm, so you can basically refer to the machine learning course by Andrew Ng on
362
382
https://www.youtube.com/watch?v=AuqZ4recf0s&t=362s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
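As a concrete example of the reverse-engineering approach described above, here is a small sketch of fitting y = mx + c by gradient descent on made-up data; the learning rate, iteration count, and toy dataset are arbitrary illustrative choices.

```python
import numpy as np

# Toy data roughly following y = 2x + 1 with a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2 * x + 1 + rng.normal(0, 0.5, size=100)

m, c = 0.0, 0.0          # coefficients to learn
lr = 0.01                # learning rate
for _ in range(2000):
    y_pred = m * x + c
    error = y_pred - y
    # Gradients of the mean squared error with respect to m and c.
    grad_m = 2 * np.mean(error * x)
    grad_c = 2 * np.mean(error)
    m -= lr * grad_m
    c -= lr * grad_c

print(f"learned m={m:.2f}, c={c:.2f}")  # should come out near 2 and 1
```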
AuqZ4recf0s
applied AI, sorry, not applied AI, the deeplearning.ai channel, and again the link is basically given in the Word doc itself. The other thing is that I have also uploaded many videos on machine learning, and some of the feedback that I got is that the machine learning playlist is not that ordered, you know, so what you can do is that whenever you are
382
403
https://www.youtube.com/watch?v=AuqZ4recf0s&t=382s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
searching my videos, suppose you are learning simple linear regression, just search that keyword and put my name in front of that; you will be getting the whole explanation. Apart from that, I have also uploaded videos with respect to the practical application, okay, so you'll be able to do that. Now I am trying to order that particular playlist and I'm making sure that whatever videos I
403
423
https://www.youtube.com/watch?v=AuqZ4recf0s&t=403s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
upload in the future will also be ordered. So I would like to say one is Andrew Ng, one is my channel with respect to machine learning; if you just want to know the maths about each and every machine learning algorithm, go and see Andrew Ng from deeplearning.ai, and then you also have the sentdex channel. Again I'm referring to sentdex because he has uploaded videos on Python, he's
423
445
https://www.youtube.com/watch?v=AuqZ4recf0s&t=423s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
uploaded videos on Flask and Django; apart from that he's also uploaded videos with respect to machine learning algorithms, okay, and the best part is that he's not only uploaded machine learning, he has uploaded deep learning also, so I'm going to refer to him again in the later links when I'm discussing deep learning. So for machine learning, three things: one is my channel, one is sentdex,
445
468
https://www.youtube.com/watch?v=AuqZ4recf0s&t=445s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
and the other one is Andrew Ng from deeplearning.ai, and, um, you know, Andrew Ng has explained the complete maths. Sometimes what happens is that you will not be able to follow it, but again if you just watch it again and again you'll be able to follow it, because the math is pretty much simple. But what I am making sure in my channel is that I will be uploading a lot of
468
487
https://www.youtube.com/watch?v=AuqZ4recf0s&t=468s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
maths things whenever I am explaining some specific algorithms, and that will keep going on. In Andrew Ng's channel on deeplearning.ai you will not find any practical application, so for the practical application you can either refer to my videos or you can refer to sentdex's videos. Over there sentdex does not explain the maths behind any machine learning algorithm; sentdex is more
487
507
https://www.youtube.com/watch?v=AuqZ4recf0s&t=487s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
focused towards implementation of machine learning algorithms. Okay, so this was one. Now I have also seen some people who don't really get attracted just by seeing written equations; they like some animation kind of explanation, so if you want some animation kind of explanation there is one channel that I went through, StatQuest with Josh Starmer. This is one of the good channels where
507
532
https://www.youtube.com/watch?v=AuqZ4recf0s&t=507s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
they'll show a lot of animation to explain each and every machine learning algorithm, along with the theoretical explanation of how it is basically done, including all the statistics, linear algebra, differential calculus and different kinds of maths formulas. Again the link is basically given in the docx file itself. Okay, then after that you have natural language
532
554
https://www.youtube.com/watch?v=AuqZ4recf0s&t=532s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
processing. For natural language processing you can basically go through my playlist, because I have uploaded around eight to nine videos with respect to machine learning, and I'm planning to upload with respect to deep learning also, where I'll be implementing a lot of NLP things, where I'll be implementing with the help of word2vec and all the different kinds of tools or libraries that are basically
554
572
https://www.youtube.com/watch?v=AuqZ4recf0s&t=554s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
present in natural language processing. The other channel is basically again sentdex; sentdex has uploaded around twenty to thirty videos with respect to natural language processing, and you can refer to that also. Now let us go to deep learning. For deep learning I have selected two channels: one is Andrew Ng, again from deeplearning.ai; again,
572
591
https://www.youtube.com/watch?v=AuqZ4recf0s&t=572s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
any theoretical components, any theoretical things that you need to understand about deep learning, can be seen there; for that link again just check the Word doc file, again in that I mentioned the link also. The second channel is my channel, because the complete deep learning playlist that I have created is completely in order, okay: tutorial
591
611
https://www.youtube.com/watch?v=AuqZ4recf0s&t=591s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
one, tutorial two, tutorial three, like that I have actually created up to tutorial 22, and I'm including both the maths understanding of the algorithms, understanding about neural networks, and how to implement that with the help of Keras and Python, right. So I have done that, I have ordered it, you know; first of all I explain to you all about artificial
611
630
https://www.youtube.com/watch?v=AuqZ4recf0s&t=611s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
intelligence, artificial neural networks, and in artificial neural networks I have explained a lot of things like backpropagation, how to update weights and biases, all those things, and I am also showing it practically, how you can basically do the practical implementation with the help of Keras (a tiny Keras sketch is shown below), and apart from that how you can basically optimize your problem statement also. Now
630
648
https://www.youtube.com/watch?v=AuqZ4recf0s&t=630s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
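A tiny Keras sketch of the kind of artificial neural network implementation mentioned above; the layer sizes, input dimension, and randomly generated training data are placeholders, not taken from any particular tutorial.

```python
import numpy as np
from tensorflow import keras

# Placeholder data: 1,000 samples with 20 features and a binary label.
X = np.random.rand(1000, 20)
y = np.random.randint(0, 2, size=(1000,))

model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(20,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
# Backpropagation updates the weights and biases via the optimizer chosen here.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```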
AuqZ4recf0s
apart from that you can basically refer to my channel. I'm also going to complete the whole deep learning playlist; there are still 15 to 20 videos remaining, so the total overall playlist will be having 40 videos, okay, which will be including LSTMs, RNNs and CNNs, everything, okay, everything, and I'm serious about it, because I started that playlist and still not all the
648
668
https://www.youtube.com/watch?v=AuqZ4recf0s&t=648s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
videos have been completed; I still have plans to upload more and more. Now after learning all this you'll be having a lot of ideas, and this whole thing, whatever I have explained, is with respect to YouTube channels right now, okay, the whole lot of materials that are available on YouTube. The next thing is that you refer to GitHub links, you know, GitHub links; so suppose you have a
668
689
https://www.youtube.com/watch?v=AuqZ4recf0s&t=668s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
problem about linear regression, go and search "linear regression github", okay; you'll be getting abundant materials in the Google search itself, and you can basically take one of the problems and start solving it. Now the next thing is that after learning all the skills you have to do a lot of practice projects, right, so I have created one deep learning,
689
709
https://www.youtube.com/watch?v=AuqZ4recf0s&t=689s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
sorry, data science project playlist wherein I have uploaded more than 50 videos, 50 different use cases, and those are specifically Kaggle use cases that I have taken and solved with the help of Python and machine learning and deep learning, so you can basically refer to those projects and try to solve them. Again, all the code is basically given in the GitHub itself; you can refer to my GitHub
709
730
https://www.youtube.com/watch?v=AuqZ4recf0s&t=709s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
and you can get all the details. Okay, so after learning all these things, the last part is data science projects; you should be able to implement various data science projects. Finally, create your resume, you know, include everything that you have learnt in your resume, that's it. Now, first we have discussed YouTube channels; the second thing is that I'm also going to refer to some of the
730
748
https://www.youtube.com/watch?v=AuqZ4recf0s&t=730s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
blogs which have almost all of the problem solutions for every machine learning algorithm or deep learning algorithm: one is the Towards Data Science blog and the other one is Medium, with respect to machine learning and deep learning. All the links are basically given in the Word doc itself. Okay, so in short this was there, and the last part is basically about books. Now one of the
748
771
https://www.youtube.com/watch?v=AuqZ4recf0s&t=748s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
best machine learning books that I have read, you know, the link is basically given in the description; it is basically written for the O'Reilly publisher. If you go and see that particular book, it's the best book, guys, the best book on machine learning and deep learning. I think every fresher, every fresher who wants to make that transition towards
771
796
https://www.youtube.com/watch?v=AuqZ4recf0s&t=771s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
data science and machine learning, they should go and read this particular book, and this book is basically, you know, a boon to all the data scientists, because the author is basically Aurélien Géron, sorry if I'm pronouncing it wrong, but this particular person has written this book and the book name is Hands-On Machine Learning with Scikit-Learn and
796
820
https://www.youtube.com/watch?v=AuqZ4recf0s&t=796s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
TensorFlow. Okay, again the link is basically given in the description and the publisher name is O'Reilly, and I'll tell you this book is very cheap, guys. I see there are a lot of free PDFs also available for this book; I don't want to share those PDFs, I don't even want to research and find out the free PDFs over there, because this guy, this
820
842
https://www.youtube.com/watch?v=AuqZ4recf0s&t=820s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
author, has written this book so nicely and I don't want to disrespect him by just taking the free PDFs and distributing them to you. Okay, I have not even researched and found out whether there is any PDF or not; I basically bought this book, the paperback version, and if you want I can also review this particular book in one of my next videos,
842
865
https://www.youtube.com/watch?v=AuqZ4recf0s&t=842s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
but it's an awesome book, you know; you have Python, you have feature engineering, you have machine learning, you have everything. Okay, and apart from that, guys, you can buy this book and it is hardly around 1,500 rupees INR; if you consider this in terms of dollars, hardly ten to fifteen dollars, fifteen to twenty dollars, so I think, yes, fifteen to twenty dollars, you
865
887
https://www.youtube.com/watch?v=AuqZ4recf0s&t=865s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
can basically buy this particular book; again the link is basically given in the description, go ahead and buy it. I know this particular session is all about learning data science for free, but I'd suggest this particular book, you know; don't search for free PDFs, keep this book handy because it will help you for a lifetime, okay, whenever you want you can basically read it. Now the
887
907
https://www.youtube.com/watch?v=AuqZ4recf0s&t=887s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
next thing is that there are small, small parts like feature engineering and feature selection. What I'll do is share a very good GitHub link about feature engineering and feature selection which I found through the internet, and I'll be sharing that link in this Word doc itself; what you have to do is go inside that link, there are many materials present inside, just refer to
907
929
https://www.youtube.com/watch?v=AuqZ4recf0s&t=907s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
each and every notebook file; it is clearly written what feature engineering is all about, how feature engineering is basically done. Similarly there are around 10 to 20 materials, notebook files, Jupyter notebook files; you just have to read them, just have to execute them, and by that you will be able to understand a lot of things, and similarly for the feature selection. So
929
947
https://www.youtube.com/watch?v=AuqZ4recf0s&t=929s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
I'll be sharing those two GitHub links, and thanks to the author, I'll also be mentioning the author over there who had actually provided those materials, and it is available on the internet completely for free, so I'll be providing those two things to you. And yes, that is all about this particular preparation, guys, and I think if you are able to give around two to three to four hours, within
947
968
https://www.youtube.com/watch?v=AuqZ4recf0s&t=947s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
three months you'll be able to complete this whole data science syllabus, and after three months you'll also be giving interviews, because you have learnt a lot of things, done data science projects, you have practiced things on Kaggle, and make sure you practice a lot of projects after completing all these things, through these YouTube channels, through blogs, through this particular
968
985
https://www.youtube.com/watch?v=AuqZ4recf0s&t=968s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
AuqZ4recf0s
book that I have told you about from the O'Reilly publisher, which is basically Hands-On Machine Learning with Scikit-Learn and TensorFlow. That was all about this particular video; I hope you liked it, share it with all your friends, do subscribe to this channel if you're not already subscribed. I'll see you all in the next video, have a great day ahead, thank you.
985
1,010
https://www.youtube.com/watch?v=AuqZ4recf0s&t=985s
How To Learn Data Science by Self Study and For Free
https://i.ytimg.com/vi/A…axresdefault.jpg
1sJuWg5dULg
hello, welcome to lecture 8 of Deep Unsupervised Learning. Today we are going to talk about the strengths and weaknesses of the various generative models and representation learning methods that we've seen so far. So the brain has 10 to the power 14 synapses and we only live for 10 to the power 9 seconds, and so we have a lot more parameters than the data we ingest. This motivates that we
0
31
https://www.youtube.com/watch?v=1sJuWg5dULg&t=0s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
1sJuWg5dULg
should do a lot of unsupervised learning, because in order to provide sufficient fodder for the number of parameters that we have in our brain, we should be able to predict a lot more bits from the data that we ingest, which is five orders of magnitude smaller, right. This was a statement made by Geoff Hinton in 2014 in a Reddit AMA. So first, a summary of the course so far: we've
31
61
https://www.youtube.com/watch?v=1sJuWg5dULg&t=31s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
1sJuWg5dULg
looked at autoregressive models, PixelRNN and PixelCNN, Gated PixelCNN, PixelCNN++ and PixelSNAIL; we looked at flow models, the RealNVP family of models, and also the connection between autoregressive flows and inverse autoregressive flows. Next we covered latent variable models, models with approximate density estimates using the variational lower bound, and various variations of
61
88
https://www.youtube.com/watch?v=1sJuWg5dULg&t=61s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
1sJuWg5dULg
that, like the VAE, the importance weighted autoencoder, VQ-VAE, PixelVAE and so forth. We also then jumped into a different class of generative models that don't work with the likelihood principle, the implicit density models: GANs, energy-based models, and the moment matching principle. And finally we questioned the idea of whether we even need to learn generative models if all we care about
88
115
https://www.youtube.com/watch?v=1sJuWg5dULg&t=88s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
1sJuWg5dULg
is extracting useful features from unlabeled data, and that got us into this topic, because self-supervised learning provides representations, and we saw that with the right kind of simple self-supervised principles and a lot of data and compute we can learn really useful representations of unlabeled images that are competitive with supervised representations. So let's look at autoregressive
115
142
https://www.youtube.com/watch?v=1sJuWg5dULg&t=115s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
1sJuWg5dULg
models. In 2015 the main paper was MADE, which introduced this idea of a masked autoencoder for density estimation, and it was able to produce these MNIST digits, which were reasonable looking but very jittery. This idea was extended to much more expressive architectures well suited for image modeling, like masked convolutions (a small sketch of an autoregressive mask is shown below),
142
172
https://www.youtube.com/watch?v=1sJuWg5dULg&t=142s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
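A small PyTorch sketch of the masked convolution idea referenced above: the kernel is zeroed out so each pixel only sees pixels above and to its left, which is what makes the convolution autoregressive. The "A"/"B" mask-type convention here follows the usual PixelCNN description; layer sizes are illustrative.

```python
import torch
import torch.nn as nn

class MaskedConv2d(nn.Conv2d):
    """Conv2d whose kernel is masked so a pixel never sees itself (type 'A')
    or anything below / to the right of it (both types)."""
    def __init__(self, mask_type, *args, **kwargs):
        super().__init__(*args, **kwargs)
        assert mask_type in ("A", "B")
        kh, kw = self.kernel_size
        mask = torch.ones(kh, kw)
        mask[kh // 2, kw // 2 + (mask_type == "B"):] = 0  # center (type A) and right of center
        mask[kh // 2 + 1:, :] = 0                          # all rows below the center
        self.register_buffer("mask", mask[None, None])

    def forward(self, x):
        self.weight.data *= self.mask  # zero out the forbidden connections
        return super().forward(x)

# Example: first layer of a tiny PixelCNN-style stack on 1-channel images.
layer = MaskedConv2d("A", in_channels=1, out_channels=16, kernel_size=7, padding=3)
out = layer(torch.randn(4, 1, 28, 28))
print(out.shape)  # torch.Size([4, 16, 28, 28])
```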
1sJuWg5dULg
introduced in the PixelRNN and PixelCNN family of models, and you certainly started seeing generative models working for higher-dimensional and much more diverse data like ImageNet. These are samples from ImageNet 64 by 64; you can see that the structure across the roughly 4,000 pixels is pretty coherent, but the color is not that good and therefore you're not actually able to
172
198
https://www.youtube.com/watch?v=1sJuWg5dULg&t=172s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
1sJuWg5dULg
identify any visible class from ImageNet, but this was a big, big jump from the quality you saw in MADE. This idea of masked convolutions has also been applied to one-dimensional data like audio, and in order to model long-range coherence in audio samples the idea of using dilated convolutions was introduced (a small dilated-convolution sketch is shown below), and this was also applied for a text-to-speech system where you're going
198
230
https://www.youtube.com/watch?v=1sJuWg5dULg&t=198s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
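A brief PyTorch sketch of the dilated causal convolution idea referenced above: dilation grows exponentially with depth, so the receptive field over past audio samples doubles with every layer. The channel sizes and layer count are illustrative, not WaveNet's actual configuration.

```python
import torch
import torch.nn as nn

class DilatedCausalStack(nn.Module):
    """Stack of 1D convolutions with exponentially growing dilation."""
    def __init__(self, channels=32, num_layers=6):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.Conv1d(channels, channels, kernel_size=2, dilation=2 ** i)
            for i in range(num_layers)
        )

    def forward(self, x):
        for conv in self.layers:
            # Left-pad so the convolution is causal: output at time t only sees t and earlier.
            pad = conv.dilation[0] * (conv.kernel_size[0] - 1)
            x = torch.relu(conv(nn.functional.pad(x, (pad, 0))))
        return x

stack = DilatedCausalStack()
audio = torch.randn(1, 32, 16000)   # batch of 1, 32 channels, 16k timesteps
print(stack(audio).shape)           # torch.Size([1, 32, 16000])
```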
1sJuWg5dULg
to convert linguistic and text features to raw audio, and that can be used in a digital assistant like the Google Assistant; this was the WaveNet architecture that was commercially deployed after a year. The same idea of using masked convolutions with autoregressive pixel-level modeling has also been applied for video prediction, where you're looking at
230
256
https://www.youtube.com/watch?v=1sJuWg5dULg&t=230s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
1sJuWg5dULg
the past frames and encoding them with a convolutional LSTM, and then you're taking the embedded representation as conditioning information for a PixelCNN decoder that generates the next frame pixel by pixel, and it's able to produce coherent video that looks like a robot arm moving around. So over time the autoregressive modeling community has expanded further and further in
256
286
https://www.youtube.com/watch?v=1sJuWg5dULg&t=256s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
1sJuWg5dULg
terms of the level of engineering and architectural innovation. On the left you can see the subscale pixel networks, which have very coherent samples because of the clever conditioning scheme they use; on the right you see hierarchical autoregressive image models with auxiliary decoders, where the idea of using latent-space autoregressive models was introduced by quantizing representations
286
310
https://www.youtube.com/watch?v=1sJuWg5dULg&t=286s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg
1sJuWg5dULg
of autoencoders and modeling the latent space with a PixelCNN, which is also similar to the VQ-VAE idea that you've seen in the VAE lecture. Apart from images and audio and video, autoregressive models have had immense success in language; these are samples from GPT-2, right, it would actually produce a coherent story about unicorns, like a story of how
310
339
https://www.youtube.com/watch?v=1sJuWg5dULg&t=310s
L8 Round-up of Strengths and Weaknesses of Unsupervised Learning Methods -- UC Berkeley SP20
https://i.ytimg.com/vi/1…axresdefault.jpg