Dataset schema: video_id (string), text (string), start_second (int64), end_second (int64), url (string), title (string), thumbnail (string)
VKnoyiNxflk
and as you do that, you will see that some parts of the drawing become more visible than others, right? And that process of sliding that stencil across this drawing is essentially the process of convolution. Now that we've explained what a filter is and introduced the concept of convolution, let's use an analogy of written language to understand the relationship between the filters and the hierarchical structure of the layers in a CNN.
158
179
https://www.youtube.com/watch?v=VKnoyiNxflk&t=158s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
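The sliding-stencil picture above maps directly onto code. Here is a minimal NumPy sketch of that convolution process, purely illustrative (the function name and toy shapes are made up; deep-learning libraries actually compute cross-correlation and call it convolution):

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide `kernel` (the stencil) across `image`, one position at a time."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + kh, j:j + kw]
            # The output is large where the patch lines up with the stencil.
            out[i, j] = np.sum(patch * kernel)
    return out
```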
VKnoyiNxflk
We will simplify the explanation by using an analogy: a written document. In order to communicate through writing, we organize it as a series of paragraphs, which are composed of sentences; those sentences are composed of words, and the words of letters. So reading a document requires assessing the relationship of letters to one another
179
202
https://www.youtube.com/watch?v=VKnoyiNxflk&t=179s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
in increasing layers of complexity, which is a kind of "deep" hierarchy, like the hierarchy in image analysis. Continuing with our analogy, let's say we're looking for the phrase Ada Lovelace in a paragraph. Ada Lovelace was a mathematician and writer who lived in the 19th century. And she holds the honor of having published the very first algorithm intended to be used
202
223
https://www.youtube.com/watch?v=VKnoyiNxflk&t=202s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
by a machine to perform calculations, which makes her the first ever computer programmer. In the first layer of the network, a CNN looks for the basic building blocks of an image. The basic building blocks of written language are letters. So in this analogy, the filters the CNN uses in the first layer would be letters. Let's zoom in on the word "Ada." Here is what the convolution process would look like for the letter A. When the
223
247
https://www.youtube.com/watch?v=VKnoyiNxflk&t=223s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
"A" filter overlies the letter "A" in the original image, the convolution output would generate a strong signal. This signal would then be mapped onto something called a feature map. The feature map represents how well elements in the image align with the filter. If something is there, the signal outputs white. If nothing is there, the signal outputs black. CNNs generate a feature map for every filter.
247
275
https://www.youtube.com/watch?v=VKnoyiNxflk&t=247s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
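A hedged sketch of the "one feature map per filter" step just described, using random stand-in filters (scipy's correlate2d computes the same sliding-window operation as the sketch above):

```python
import numpy as np
from scipy.signal import correlate2d  # same sliding-window operation as above

image = np.random.rand(8, 8)                        # toy "paragraph" image
filters = [np.random.rand(3, 3) for _ in range(3)]  # stand-ins for letter stencils
# One feature map per filter; stacking them forms the input to the next layer.
feature_maps = np.stack([correlate2d(image, f, mode="valid") for f in filters])
print(feature_maps.shape)  # (3, 6, 6): one 6x6 map per filter
```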
VKnoyiNxflk
So in our analogy, there would be a feature map for every letter. These feature maps would then become the input for the second layer. In this layer, the CNN would spatially align and "stack" all those maps from the previous layer. This would allow the CNN to then look for short, specific sequences of letters in all the feature maps simultaneously. So the CNN would use a new set of filters to look for specific letters that are adjacent
275
301
https://www.youtube.com/watch?v=VKnoyiNxflk&t=275s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
to one another in particular sequences. In our analogy, the second layer would look for places where the letters A, D, and A are in sequence together making the word "ADA". It would also look for places where letters A, C, E, L, O and V are adjacent to one another using filters for LOVE and LACE. The output of the second layer would be the feature maps for those three sequences of letters.
301
328
https://www.youtube.com/watch?v=VKnoyiNxflk&t=301s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
In other words, in those feature maps, strong signals would be present where the sequences ADA, LOVE and LACE are located in the original paragraph. In the third layer, the CNN would stack and align these three new maps and perform more convolutions, this time identifying where longer words and groups of words are located. So the CNN could at this point identify where in the original paragraph the sequences of letters
328
352
https://www.youtube.com/watch?v=VKnoyiNxflk&t=328s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
and words making the phrase "ADA LOVELACE" are located. In our analogy, we were looking for a phrase consisting of only two words. Had we been looking for a longer sentence or even a paragraph, the CNN would deal with the greater complexity by having more layers. We've omitted quite a few details about CNNs for simplicity, but this captures the essence of the model.
352
375
https://www.youtube.com/watch?v=VKnoyiNxflk&t=352s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
But what does this look like for actual images, like identifying diabetic retinopathy from an ocular photograph? Images are made of pixels rather than letters. In a digital context, a pixel is the smallest controllable unit of an image represented on a display. Each pixel is a representation of a tiny portion of the original image. Think of pixels as creating a drawing with dots, where every dot has a color
375
398
https://www.youtube.com/watch?v=VKnoyiNxflk&t=375s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
value and an intensity. The more dots used, the clearer the image becomes. The filters a CNN uses in that first layer are small squares of pixels that correspond to things like textures, contrast between two colors, or edges. These are the image-analysis equivalents of the letters used in our analogy. And as a CNN goes up in the hierarchy, it looks for combinations of these filters, getting more and more complex with each layer.
398
422
https://www.youtube.com/watch?v=VKnoyiNxflk&t=398s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
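As a concrete, hand-picked stand-in for those first-layer filters (a trained CNN learns its own), a classic Sobel kernel responds to vertical edges:

```python
import numpy as np

# Hand-crafted vertical-edge kernel; learned first-layer CNN filters often look similar.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

img = np.zeros((5, 5))
img[:, 3:] = 1.0  # dark left half, bright right half
response = np.array([[np.sum(img[i:i+3, j:j+3] * sobel_x)
                      for j in range(3)] for i in range(3)])
print(response)  # strongest responses in the columns adjacent to the edge
```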
VKnoyiNxflk
As the complexity increases, the CNN gets closer to identifying what it's looking for. So the specific features analyzed at each layer help put the whole thing together. So, for example, some of the earlier work showed that some layers tend to be better at extracting, sort of like, edge-like information. Meaning that, for example, if you combine different kinds of horizontal edges,
422
449
https://www.youtube.com/watch?v=VKnoyiNxflk&t=422s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
we might get a continuous line that resembles the retinal blood vessels. And as you combine more of those, you start to encode higher-level concepts such as, you know, is there a micro-aneurysm here, is there bleeding over here, are there other lesions in the image? And right at the very end, after these multiple layers, the network will try to condense all of that information down into a final prediction.
449
477
https://www.youtube.com/watch?v=VKnoyiNxflk&t=449s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
In this case, severe diabetic retinopathy. Developing a CNN to help identify diabetic retinopathy was motivated by the fact that many patients with diabetes are not getting screened frequently enough. We have to screen diabetic patients once a year, or we should, and there are some barriers to getting that done. Some of it is just, you know, not having enough trained professionals to do that task.
477
500
https://www.youtube.com/watch?v=VKnoyiNxflk&t=477s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
It's also not having that expertise available where the patient is. It's not that, you know, there aren't retina specialists in a metropolitan city four hours away; it's that there isn't a retina specialist at your grocery store. And CNNs could facilitate the integration of diabetic retinopathy screening and other screening programs into primary care. But before that happens, more research,
500
522
https://www.youtube.com/watch?v=VKnoyiNxflk&t=500s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
especially prospective clinical trials, is needed. The way we approach these things is really the way that medicine usually works, which is to say, "let's do validations of the method again and again and again until we're reasonably confident that it really works on many kinds of images, in many settings, for many different patient populations." And
522
543
https://www.youtube.com/watch?v=VKnoyiNxflk&t=522s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
so from my perspective that's really at the end of the day what's most important: does it work on real patients and is it reliable? The excitement generated by early results has already spurred several research groups to look into the efficacy of CNNs in clinical practice, which could potentially finally get CNNs from the bench to the bedside. I think we're on the third or fourth technological revolution
543
567
https://www.youtube.com/watch?v=VKnoyiNxflk&t=543s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
where neural networks are coming to the forefront, and I really hope that this time we'll get it right. But there were failures in the past where people used the technology in suboptimal ways, and we don't want that to happen again. One has to make sure that we have appropriate and sufficient data for development, validation and testing, and that we're solving actual clinical problems.
567
594
https://www.youtube.com/watch?v=VKnoyiNxflk&t=567s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
At the end of the day, one thing to take away is that even if, as a clinician, it can be hard to understand exactly how a CNN arrives at its diagnosis, it can still be a useful tool. And this is similar to how many clinicians use other widely adopted technologies. Consider antibodies: you know, as a clinician I may not know exactly where that part of an antibody binds, but I'm comfortable, after looking at some
594
619
https://www.youtube.com/watch?v=VKnoyiNxflk&t=594s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
VKnoyiNxflk
of this clinical validation, with using Lucentis, for example, for an injection, right? This is kind of like any new breakthrough technology: it needs validation and it needs transparency, but I think, you know, the medical community in general responds very well to new technologies that have been validated. This video is meant to be an introduction to the topic of CNNs.
619
639
https://www.youtube.com/watch?v=VKnoyiNxflk&t=619s
Machine Learning For Medical Image Analysis - How It Works
https://i.ytimg.com/vi/V…axresdefault.jpg
xamzdNUN1o0
So, CS 287, fall 2019. Sorry to have missed you last week: on Tuesday you covered contact-invariant optimization with Igor, and on Thursday I believe you covered motion planning with one of the TAs, Harry. There were a couple of slides that were left uncovered at the end of last lecture; I'm going to let you study those on your own. Essentially there's the topic of LQR trees. It's not going to come up in the
0
30
https://www.youtube.com/watch?v=xamzdNUN1o0&t=0s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
homework, but it's quite interesting. It's a way to combine motion planning with LQR to get, effectively, a more efficient motion planner, because motion planning tends to be very local when sampling-based, while LQR is not that local: it has a basin of attraction. So combining both of those into LQR trees is pretty interesting. And then the other thing you didn't cover was shortcutting.
30
54
https://www.youtube.com/watch?v=xamzdNUN1o0&t=30s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
Typically, when you use a sampling-based motion planner, you will end up with a path that's very jagged. Shortcutting is the idea that you just check whether any of the steps along the way can be skipped, because they're just a little bit of a detour, and use a straight line instead. Of course, this depends on the dynamics of your system: if your system has peculiar dynamics, you cannot just do straight
54
76
https://www.youtube.com/watch?v=xamzdNUN1o0&t=54s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
lines. But if your system is able to follow any path except where there are obstacles, then you can do straight lines wherever there's no obstacle. So that's shortcutting; there's a little more detail in the slides. And often you also run nonlinear optimization for control at the end, to smooth out your path and get a locally optimal path. So those are the highlights of what you didn't cover.
76
97
https://www.youtube.com/watch?v=xamzdNUN1o0&t=76s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
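A minimal sketch of the shortcutting idea just described, assuming a hypothetical collision checker segment_is_free(a, b):

```python
def shortcut(path, segment_is_free):
    """Greedily replace jagged sub-paths with straight segments.

    `path` is a list of waypoints from a sampling-based planner;
    `segment_is_free(a, b)` is an assumed collision checker. Only valid
    for systems that can actually follow straight lines between waypoints.
    """
    result = [path[0]]
    i = 0
    while i < len(path) - 1:
        # Find the farthest waypoint reachable in a straight, collision-free line.
        j = len(path) - 1
        while j > i + 1 and not segment_is_free(path[i], path[j]):
            j -= 1
        result.append(path[j])
        i = j
    return result
```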
xamzdNUN1o0
We're not going to cover it in more detail, but the slides have the details. So now we'll switch to the second part of the course. The first part of the course was all about finding optimal actions: given a dynamical system, some environment, how do we find a sequence of actions or a policy that optimizes expected reward? Now we're going to look at the complementary part,
97
123
https://www.youtube.com/watch?v=xamzdNUN1o0&t=97s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
which is trying to make sense of our sensor readings so that we actually even know what the state of the environment might be, and can then act on that. We'll need a lot more probability here, so we'll start with a bit of probability review, then we'll look at Bayes filters, which are going to be the core of what we'll be doing. We'll look at Bayes filters today in the
123
142
https://www.youtube.com/watch?v=xamzdNUN1o0&t=123s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
simplest setting, which is just discrete states and discrete observations, and then we'll look at Gaussians, which will allow us next lecture to do it in continuous state spaces. In many ways, similar to how LQR gives us local solutions, with Gaussians for nonlinear systems we'll be able to find local approximations to the probability distributions. All right,
142
169
https://www.youtube.com/watch?v=xamzdNUN1o0&t=142s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
any questions? All right, then let's get started. So why might we care about probability in robotics? Well, often the state of the robot and the state of its environment are unknown, and only noisy sensors are available. We don't have a sensor that just says "here is the state"; instead you measure something else. Probability provides a framework to fuse that sensor information into a reasonable
169
204
https://www.youtube.com/watch?v=xamzdNUN1o0&t=169s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
estimate of what the state might be. And so what we would compute would be a distribution over the possible states that the environment and robot could be in. There's another reason we care about probability, which is something we've already covered a little bit: the dynamics are often stochastic. So we cannot optimize for a particular outcome; we can only optimize to
204
224
https://www.youtube.com/watch?v=xamzdNUN1o0&t=204s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
obtain a good distribution over outcomes, and again probability provides a framework to deal with that. We've done that in simple settings so far, but we'll expand on that later. We'll actually bring both of those together in a future lecture, where we'll look at the notion that the actions you take could be chosen to reduce uncertainty about the world: you actively go seek out sensor
224
246
https://www.youtube.com/watch?v=xamzdNUN1o0&t=224s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
information that could help you better understand what the state of the world is, and then collect more reward. Those formulations are called POMDPs, partially observable Markov decision processes, and that will be somewhat the culmination of bringing together what we're covering in the next few lectures and everything we've covered so far. Let's look at an example. How
246
267
https://www.youtube.com/watch?v=xamzdNUN1o0&t=246s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
about a helicopter? What would be the state? Position of the helicopter, orientation, velocity, angular rate. What would be the sensors you have? You don't have direct access to position, orientation, velocity, angular rate; you have access to some of those in a noisy way, and others you don't have access to at all. GPS gives you a noisy estimate of position, sometimes also velocity, typically only up to a
267
289
https://www.youtube.com/watch?v=xamzdNUN1o0&t=267s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
couple of meters' accuracy, so not super precise, but it's a noisy estimate of position. Typically you put inertial sensing on your robot. For a helicopter, maybe you have a three-axis gyro; a gyro is an angular rate sensor, so it gives you three numbers at any given time, measuring the angular velocity around the three main axes of the helicopter, if you've
289
311
https://www.youtube.com/watch?v=xamzdNUN1o0&t=289s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
mounted it axis-aligned. Then a three-axis accelerometer. Accelerometers measure acceleration, but not exact acceleration; it's a little trickier than that. It actually measures all zeros in freefall: if you're free-falling, an accelerometer measures all zeros, and in anything you do that's not free-falling, for example if you're standing on the ground, your accelerometer will measure 9.81 m/s², opposite to gravity, essentially, because you're resisting gravity.
311
339
https://www.youtube.com/watch?v=xamzdNUN1o0&t=311s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
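In symbols, the standard accelerometer model behind that remark (biases and noise omitted): the sensor measures specific force, not acceleration.

```latex
f_{\text{meas}} = a - g
\qquad \text{free fall: } a = g \Rightarrow f_{\text{meas}} = 0
\qquad \text{at rest: } a = 0 \Rightarrow \|f_{\text{meas}}\| = 9.81\ \mathrm{m/s^2}
```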
xamzdNUN1o0
That's what it's measuring at that time. So in some sense an accelerometer is, in fact, much more about measuring orientation, very often, than it is about measuring your actual acceleration, because any resistance to gravity is something you'll be able to measure, and from that you can, in some sense, especially
339
361
https://www.youtube.com/watch?v=xamzdNUN1o0&t=339s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
if we're on the ground, understand where gravity is pointing, which gives you a lot of information about your orientation. Then a three-axis magnetometer. Well, what is a magnetometer? It measures the Earth's magnetic field. You might think of the Earth's magnetic field as north, but actually, if you measure it in 3D, it's not really north: it's roughly north, but it actually also
361
384
https://www.youtube.com/watch?v=xamzdNUN1o0&t=361s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
points into the Earth. But it's known which direction it points, and since you are measuring, in the frame of your sensor, where the magnetic field is pointing, that gives you information about the orientation of your system. There are some tricky things, of course: if you have something that generates its own magnetic field, like maybe some magnets on your system, or you have
384
406
https://www.youtube.com/watch?v=xamzdNUN1o0&t=384s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
some high currents that induce magnetic fields, then that might perturb those measurements, and you don't really get the Earth's magnetic field but something else. Or if you're flying near a power line or something, it might perturb the measurements you're getting. As for the dynamics: there is noise from the wind, there are unmodeled dynamics in the engine, in the servo motors, and in the blades of the helicopter. So overall we don't
406
429
https://www.youtube.com/watch?v=xamzdNUN1o0&t=406s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
really have access to the state directly, but we have things that relate pretty closely to the state, and we should be able to distill from that a likely estimate of the state, or a distribution, around some mode maybe, of what the state might be. How about a mobile robot inside a building? The state could be position and heading; let's say it's a slow robot that's just slowly moving, so you don't care about
429
449
https://www.youtube.com/watch?v=xamzdNUN1o0&t=429s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
velocity, just the position and the direction it's facing. Then the sensors: odometry, which is sensing the motion of the actuators, for example wheel encoders. On your wheels you might measure how much your wheel has turned; if you know the diameter of your wheel, you know how much you have moved. Well, that's approximately true, because the wheel might have slipped, or there might have been
449
472
https://www.youtube.com/watch?v=xamzdNUN1o0&t=449s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
bumps in the road, where, if it's not flat, you don't move as far forward as you thought you would have, because you've really been going down and back up. But it gives you some kind of measurement of how much you've moved. Then often people put a laser rangefinder on a mobile robot. Why is that? It sends out a laser beam and then it measures how long it takes for that
472
491
https://www.youtube.com/watch?v=xamzdNUN1o0&t=472s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
beam to come back, and that gives you a measurement of twice the distance to the obstacle in that direction. So it's a nice way to directly measure geometry in 3D. It's a little bit problematic at times, because sometimes you have mirrors, and then the beam doesn't come back, it reflects off another way; or you might have glass that acts much like a mirror, and the same thing will happen: you don't get the measurements back.
491
510
https://www.youtube.com/watch?v=xamzdNUN1o0&t=491s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
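The time-of-flight relationship being described: the round-trip time Δt covers twice the distance, so

```latex
d = \frac{c\,\Delta t}{2}
```

with c the speed of light.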
xamzdNUN1o0
But if you have nice matte surfaces, it's a really nice way to measure how far away things are. The dynamics are noisy because of wheel slippage: you might say my wheel turned this much, but what if it slipped? Then you don't know what your next state is going to be, if you don't know how much it slipped. And there could be unmodeled variations in the floor that affect
510
531
https://www.youtube.com/watch?v=xamzdNUN1o0&t=510s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
essentially where you end up. All right, so those are two examples, just to motivate why we might want to deal with probability distributions here. If this is what we're faced with, or the helicopter is what we're faced with, we're not going to have a deterministic notion of "this is the state"; we're just going to have a bunch of sensor information, and the best result
531
551
https://www.youtube.com/watch?v=xamzdNUN1o0&t=531s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
will be a distribution over possible states. Okay, so today, what we're going to do: we're going to do a probability review; hopefully that's a review. If the probability review section is not review for you, you should go study to make sure it feels like review as soon as possible. Then we'll do Bayes filters, where we will look at the foundation of a lot of what we'll cover, in the discrete
551
579
https://www.youtube.com/watch?v=xamzdNUN1o0&t=551s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
setting, and then we'll start looking at Gaussians, which will form the foundation for what we'll do next lecture, to do it in continuous spaces. All right. So probability theory, like a lot of math, starts with some axioms, and everything follows from that. The probability axioms here are that you have some outcome A, and there could be many possible outcomes. For an outcome A, the
579
603
https://www.youtube.com/watch?v=xamzdNUN1o0&t=579s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
probability of the outcome is a number between 0 and 1; maybe, I don't know, the probability of being in the correct room for lecture is some probability assigned to it. Then the union of all possible outcomes, Omega, is essentially everything that could possibly happen, and the probability of everything that could possibly happen, the probability for something in that set
603
624
https://www.youtube.com/watch?v=xamzdNUN1o0&t=603s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
to have happened, should be 1; otherwise that's not the proper Omega that captures everything. And that also means nothing can be assigned a number higher than 1; probabilities don't go above 1. And then the probability of the empty set: well, the empty set never contains the outcome that happened; whatever happened is something, and it's not in the empty set, so the probability of that is
624
647
https://www.youtube.com/watch?v=xamzdNUN1o0&t=624s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
0. And then the main thing here, essentially, is how you keep track of probability: the probability of the union of two possible outcomes A and B is the probability of A plus the probability of B; but it's possible that A and B happen at the same time, and then you're double counting, so you subtract back out the probability that A and B happened at the same time, which is A intersection B.
647
670
https://www.youtube.com/watch?v=xamzdNUN1o0&t=647s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
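Written out, the axioms and the union rule as stated in this passage:

```latex
0 \le P(A) \le 1, \qquad P(\Omega) = 1, \qquad P(\emptyset) = 0,
\qquad P(A \cup B) = P(A) + P(B) - P(A \cap B)
```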
xamzdNUN1o0
And that gives you the probability of the union. Pictorially, it looks like this: you have your Omega space, which has all possible outcomes in it; then there are outcomes that make A true, there are outcomes that make B true, and then there are outcomes that make A and B true, the intersection. And the probability of A union B is the probability of A plus the probability of B minus the probability of
670
698
https://www.youtube.com/watch?v=xamzdNUN1o0&t=670s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
the intersection. One way to think of it, abstract or concrete depending on your mindset, is that everything that can possibly happen is a point in this rectangle. Pictorially, every possible state the world could be in is a point in that rectangle. We might never know the exact details of it, but every possible state is a point in that rectangle, and
698
720
https://www.youtube.com/watch?v=xamzdNUN1o0&t=698s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
then A is a property of the state of the world that holds true for all the ones that lie in that blue circle; B is a property of the state that holds true for all the states of the world in the yellow circle; and then A intersection B is a property of the world that holds true for all the points that are in the intersection. So you can think of A and B just as properties or descriptions,
720
747
https://www.youtube.com/watch?v=xamzdNUN1o0&t=720s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
of, you know, abstractions of the state of the world, whereas individual points in that rectangle are, in some sense, the full state of the world, which we probably never observe and will never really talk about, because we don't care about all the details; we'll talk about A and B. But in terms of how it's all set up, that's an easy way to think of it. Then, when you use these axioms, you can
747
768
https://www.youtube.com/watch?v=xamzdNUN1o0&t=747s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
come to new conclusions. That's kind of how math works: you posit some axioms, you do some derivations, and you have a new thing that's useful to use. So, for example, take the probability of A union (Omega minus A). On the left we just look at what's in the parentheses: well, A union with Omega minus A, that's Omega, so the probability of Omega is 1. That's what we
768
794
https://www.youtube.com/watch?v=xamzdNUN1o0&t=768s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
derived up to here. On the right-hand side we use the fact that the probability of a union is the probability of each individual event minus the probability of the intersection; we've worked that out. Then we know that this intersection probability is zero, and now we have that the probability of A plus the probability of the complement of A has to be equal to one. So that's the kind of property you would derive. All right.
794
818
https://www.youtube.com/watch?v=xamzdNUN1o0&t=794s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
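The derivation just walked through, in one line:

```latex
1 = P(\Omega) = P\big(A \cup (\Omega \setminus A)\big)
  = P(A) + P(\Omega \setminus A) - P\big(A \cap (\Omega \setminus A)\big)
  = P(A) + P(\Omega \setminus A) - 0
```

so P(A) + P(Ω∖A) = 1.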
xamzdNUN1o0
So there are discrete random variables and continuous random variables. Here we have Omega again, the rectangle, and again think of every point in the rectangle as representing one possible state of the entire world. Then X is the random variable we work with; X can take on four values. If the world is in a state that's in that leftmost rectangle,
818
847
https://www.youtube.com/watch?v=xamzdNUN1o0&t=818s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
we say X equals x1; if it's in the triangle above, X equals x2; and so forth. So there are only four values X can take on. The probability associated with X equals x1 we can think of as, essentially, the mass associated with that region in the rectangle, and we call P a probability mass function. A simpler example than the abstract rectangle would be a coin flip: heads and
847
872
https://www.youtube.com/watch?v=xamzdNUN1o0&t=847s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
tails, probability one half for each. That's another example of a distribution. Again, you can think of it in the context of that rectangle: you can say, I have a rectangle, I split it in half, there are heads and tails; half the states of the entire world have my coin coming out heads, half the states of the entire world have my coin coming out tails. We're ignoring everything else about the
872
894
https://www.youtube.com/watch?v=xamzdNUN1o0&t=872s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
world; we're just looking at heads versus tails and not the details of everything else in the world. Continuous random variables: X takes on a value in some continuum. You cannot now associate a probability with X taking on a specific value, because there are infinitely many values, it's a continuum; if you want to assign finite probability to each, it'll sum up to something higher than 1,
894
917
https://www.youtube.com/watch?v=xamzdNUN1o0&t=894s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
which we can't have; it needs to sum to 1. So instead, you have to integrate now. What we'll do is this: a probability density function is defined by saying that you look at the mass under the curve, so the probability of X being in an interval a to b is the integral from a to b of p(x) dx. For example, if we look from zero here, maybe to this point, we could take the probability mass.
917
948
https://www.youtube.com/watch?v=xamzdNUN1o0&t=917s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
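The definition being described, in symbols:

```latex
P\big(X \in [a, b]\big) = \int_a^b p(x)\,dx, \qquad \int_{-\infty}^{\infty} p(x)\,dx = 1
```

and P(X = x) = 0 for any single point x.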
xamzdNUN1o0
And then the area under the curve would denote how much probability we associate with X landing in that interval. So again, we can't really talk about X taking on a specific value here; the best we can do is give the probability of X lying in an interval. You could say the probability of X taking on a specific value is zero, but then to have something
948
969
https://www.youtube.com/watch?v=xamzdNUN1o0&t=948s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
meaningful, you take intervals and assign probabilities to the intervals. Any questions so far? All right. Then the most important thing, and what we'll be working with, is distributions that involve multiple variables. Why is that? Well, the reason, as already discussed, is that we'll have things like the real state that we care about and sensor measurements,
969
1,001
https://www.youtube.com/watch?v=xamzdNUN1o0&t=969s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
so we want to somehow talk about the joint distribution between the two, so that if we measure one of them, we can say something about the other one. Or we'll deal with dynamics of the world, and so we want to relate the distribution now with the distribution at the next time. Both of these involve joint distributions over two random variables, rather than just looking at a single
1,001
1,021
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1001s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
variable. All right. So the joint distribution over X and Y is denoted this way. In simple scenarios, X and Y might be independent, meaning the probability of X and Y taking on specific values, small x and small y, is the probability of X taking on value x times the probability of Y taking on value y. In that case, actually, knowing X does not tell you anything about Y, or
1,021
1,059
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1021s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
the other way around. So it's actually not such an interesting case; it mathematically simplifies things, but if you had a sensor that is independent of the state of your system, the sensor could not inform you about the system. Sometimes the property holds true, though, and then it's good to know and good to take advantage of mathematically, but you don't want to
1,059
1,077
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1059s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
build your systems that way necessarily. Then P(X given Y) is the probability of X taking on value small x given that Y has value small y. What we have here, as a definition, is that P(x given y) is the probability of x and y divided by the probability of y. Think again about that rectangle: there's a region where Y takes on the value small y, and that has a certain surface area, and
1,077
1,110
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1077s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
that's P(y). And we look at the region where X takes on value small x within that region; that's P(x, y). And then the fraction of the surface it takes up, that's the conditional. We can also rewrite this as: the probability of x comma y is the probability of y times the probability of x given y. This will become very useful in many, many situations.
1,110
1,136
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1110s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
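The definition and its rewritten (chain-rule) form from this passage:

```latex
P(x \mid y) = \frac{P(x, y)}{P(y)}
\quad\Longleftrightarrow\quad
P(x, y) = P(y)\,P(x \mid y)
```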
xamzdNUN1o0
Let's say we know the distribution for Y and want to know something about X, which might be the distribution at the next time slice. If we know something about X given Y, which may be the dynamics of the system going from Y to the next-time X, then this equation tells us something about the joint, and hence tells us something about X. Then, if X and Y are independent, the conditional of X given Y is the same as
1,136
1,175
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1136s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
the marginal for X; P of X we'll call the marginal for X. That means that sometimes in our math we'll be able to simplify things: if we have an assumption that X is independent of Y and we see X given Y appear, we can just get rid of the Y and simplify to just P(x). All of this is also true for probability densities; it's just that for densities often it's a small p, but other than that you can write the exact
1,175
1,199
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1175s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
same math, and it will say something about the densities instead of the actual probability mass. Now let's work through the equations we'll be using the most. There are essentially two equations we'll be using a lot, so let's explicitly step through them and see what they are. One is called the law of total probability. What does it say? It says that, and we'll write
1,199
1,266
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1199s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
it for the discrete case: the probability distribution for X, so the probability that X equals x, is the sum over all values Y can take on of P(x, y). The reason that will be so important is that often we'll want to get rid of some variables and move to another variable, and the way we're going to do that is by constructing a joint distribution over those two variables
1,266
1,296
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1266s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
somehow, and then summing out the one we don't want anymore. More specifically, the way this will typically be used is: we'll say, well, P(x) is equal to a sum over y, and imagine we already had access to the distribution for Y, and we need to construct a joint because we want to bring in X. Now, we might have a model for how X relates to Y, so we'll have P(x given y). And so this equation here is one that we'll find ourselves using a lot.
1,296
1,328
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1296s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
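The law of total probability as just stated, discrete and density versions:

```latex
P(X = x) = \sum_y P(x, y) = \sum_y P(x \mid y)\,P(y),
\qquad
p(x) = \int p(x \mid y)\,p(y)\,dy
```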
xamzdNUN1o0
It's one of the two equations we'll be using over and over. The same thing can of course be done for densities; for densities this summation would become an integral, because you have to integrate over all the continuous values the variable can take on, whereas for discrete variables you just sum. Otherwise it's the same thing. In general, whenever there's
1,328
1,351
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1328s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
a summation, you can put in an integral; the same math goes through. So that's one of the things we'll use a lot. The other thing we'll use a lot is Bayes' rule. Bayes' rule hopefully sounds familiar. How do you derive it? We actually derive it by looking at the expression for the joint: P(x, y) is P(x given y) times P(y). Also, because the roles of X and Y are arbitrary, it
1,351
1,387
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1351s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
doesn't matter which one comes first or second; in this equation here we can swap the roles, and we have P(y given x) times P(x). From that, using just this part over here, we can write P(x given y) as P(y given x) times P(x) over P(y). Let's interpret this equation for a moment. What are we doing here? We're interested in finding the distribution over X given Y, and we can associate a story with this.
1,387
1,433
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1387s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
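The derivation of Bayes' rule just given, in symbols:

```latex
P(x \mid y)\,P(y) = P(x, y) = P(y \mid x)\,P(x)
\quad\Longrightarrow\quad
P(x \mid y) = \frac{P(y \mid x)\,P(x)}{P(y)}
```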
xamzdNUN1o0
Imagine X is the state of the system and Y is a sensor reading we might obtain, and we want to know the distribution over possible states of the system given that we have a sensor reading Y. Well, it's not easy to come up with a table for that; it's not easy to just say, oh, you know what, I'm just going to build a table for P(x given y), because that's not really how nature works. The
1,433
1,456
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1433s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
way nature works is that if there is a state, that state causes the readings. So this model here is really the causal model that we have available: given a state, it will cause a distribution over readings. And when you build a sensor and you try to sell it to people, this is the sensor model that you would provide; it says, this is the distribution over outcomes given this situation, this is
1,456
1,484
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1456s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
the distribution of outcomes given this other situation, and so forth. For example, for a laser rangefinder you might say, well, if there's a matte surface at a certain distance, the reading you'll get back is within maybe two centimeters, like maybe with a deviation of 2 centimeters around the proper distance reading, something like that. So that thing is your calibrated sensor, and that you can get a
1,484
1,510
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1484s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
distribution for. But what the actual distribution over state given that reading will be depends on the prior over X, what you thought ahead of time might be likely states of your system. So they combine together, and then of course there's a normalization to make sure things sum to one; that's all this is. Often we'll write this as just 1 over Z times P(y given x) times P(x), or, as it turns out in the
1,510
1,538
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1510s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
Probabilistic Robotics book notation, they love to use eta, so it's often written as eta times P(y given x) times P(x). So we don't have to worry too much about the thing at the bottom; that's just a normalization. We just want to understand the thing at the top, which brings together the thing for which we are able to build a model, which is the relationship from X to Y, and how we can then use it, assuming we have a prior over X.
1,538
1,563
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1538s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
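A tiny numeric sketch of that normalization, with a made-up two-state prior and sensor model (the numbers are purely illustrative):

```python
import numpy as np

prior = np.array([0.7, 0.3])       # P(x): e.g. P(door open), P(door closed) -- made up
likelihood = np.array([0.6, 0.2])  # P(y|x): probability of the observed reading in each state

unnormalized = likelihood * prior  # top of Bayes' rule, computed for every value of x
Z = unnormalized.sum()             # the normalizer: whatever the top sums to
posterior = unnormalized / Z       # P(x|y) = (1/Z) * P(y|x) * P(x)
print(posterior)                   # [0.875, 0.125]
```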
xamzdNUN1o0
We use it to get the distribution we really want, which is the distribution over state. Any questions about this? These two equations are essentially the ones we'll use everywhere in what we'll do today, next lecture, and the lecture after that; if you fully understand those, you should be in really good shape. Now, we can have a new version of this: the law of total
1,563
1,607
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1563s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
probability again, but with conditioning. What does that mean? For both of these, imagine we already had something to condition on; maybe there was a measurement in the past that was already there. So we're really not going just from Y to X; there's some Z that's already present somewhere. What can we do? Well, we can say, okay, P(x)
1,607
1,656
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1607s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
given z is equal to, and we write the same equation, the sum over y of P(y) times P(x given y), but since we conditioned on Z, we need to everywhere condition on Z. And notation-wise, conditioning on Y and Z is just "y comma z"; you don't need to draw another bar. So what we see happen is that the exact same equation can be written with just additional conditioning on Z, or any number of variables, as long as you consistently put them everywhere.
1,656
1,691
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1656s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
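The conditioned law of total probability being described:

```latex
P(x \mid z) = \sum_y P(x \mid y, z)\,P(y \mid z)
```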
xamzdNUN1o0
So you can always add more conditioning, and often that's the version we'll be using: we'll have many things we've already observed in the past that we want to condition on, and then we'll want to apply the equation on the left, the law of total probability, and we'll just carry along everything. We do our math
1,691
1,713
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1691s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
as if it doesn't exist; we just need to consistently carry it along. Very straightforward, but you've got to be careful: you can't drop it anywhere. The same thing is true with Bayes' rule. What if you have multiple things you are conditioning on, or you already condition on something else? No problem: Bayes' rule with conditioning. And the reason this shouldn't really be a surprise is, you go back to the original
1,713
1,746
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1713s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
picture of probability: we have this rectangle, and the rectangle is really the foundation of everything, with a point in the rectangle corresponding to the full world state, and regions corresponding to values the random variables take on. If you condition on something, essentially we're just redefining the rectangle as being only, let's say, this part; that's our new rectangle, and
1,746
1,766
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1746s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
that's the only part of the world we're working with, the possible states it can take on. And once we've done that, we put all probability on this thing, we kind of renormalize it so everything's there; that's our new rectangle. And the math is not going to change, because we're just looking at this sub-part of the whole rectangle, but of course you need to do it consistently every step along the way:
1,766
1,785
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1766s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
we need to consistently look at that sub-part where we have Z, this would be Z equals z; we need to keep it consistently to that part, we can't sometimes have it and sometimes not when we do this. So then, the same thing with Bayes' rule: initially it was for the full rectangle, now we've restricted attention to some sub-region; no problem, we just condition everywhere.
1,785
1,805
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1785s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
P(x given y comma z) is equal to, well, what do we have? We had P(y given x), but of course also Z, because Z needs to be in the back everywhere now; then P(x), but we need to carry along the Z; and then we normalize by P(y), again given Z. But again, the bottom here doesn't really matter much; in Bayes' rule we usually know that we just need to compute the top part for every possible value of x.
1,805
1,844
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1805s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
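Bayes' rule with conditioning, as stated:

```latex
P(x \mid y, z) = \frac{P(y \mid x, z)\,P(x \mid z)}{P(y \mid z)}
```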
xamzdNUN1o0
And once we've computed the top part for every value of x, we can see what it sums to, and whatever it sums to is what goes in the bottom, right? So applying Bayes' rule is about doing this for all x and then summing to know what the bottom is. And here we see that we can incorporate new evidence: we already had evidence, we already had P(x given z), we had a past observation,
1,844
1,870
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1844s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
a distribution over x given that observation; then a new thing came in for Y, and we have a model of how Y acts given X and Z. In practice often the Z will disappear here, because Y will only depend on X, not on Z, but in general Y can depend on X and Z. And then we can combine that together to know what the new distribution over state X is now. Then another concept that will be very
1,870
1,903
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1870s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
important is conditional independence. Conditional independence is a lot like independence, but a little less strict. Independence is a property that sometimes holds, and it's nice because it simplifies the math whenever you run into it, but if your variables are all independent, then you can't really infer anything from one variable about another variable, and so ultimately you get no interesting
1,903
1,935
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1903s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
results out; you need the variables to interact to get something interesting out. Now, it could be that they interact but in limited ways, and that's what conditional independence tries to capture. So, conditional independence: we have X conditionally independent of Y given Z if and only if P(x comma y given z) equals P(x given z) times P(y given z). So it's like independence.
1,935
1,975
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1935s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
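The definition as stated:

```latex
X \perp Y \mid Z
\quad\Longleftrightarrow\quad
P(x, y \mid z) = P(x \mid z)\,P(y \mid z)
```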
xamzdNUN1o0
The joint between X and Y is the product of the marginal for X and the marginal for Y, except that it's only true when you condition on Z. Once you know Z, X and Y are independent; but as long as you don't know Z, they might have a relationship. For example, Z might be, I don't know, the weather, and X might be "is somebody carrying an umbrella", and Y might be "is the pavement wet", or
1,975
2,004
https://www.youtube.com/watch?v=xamzdNUN1o0&t=1975s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
something, and once you know it's raining, you already know that the rain will cause X and Y independently; whether the pavement is wet and whether somebody carries an umbrella are not related directly, it's through the fact that it rains, which is the common cause for both of them, and that will make X and Y independent given Z. Another example, and the one that we'll see most
2,004
2,026
https://www.youtube.com/watch?v=xamzdNUN1o0&t=2004s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
often in this class, is: if you know the state of the world, then every sensor reading might very often be independent of every other, because if your sensor just acts on the real state of the world, then the reading that happens first and the reading that happens next will be unrelated, and we'll be able to simplify things. Okay, so this is kind of really everything we need. We just need three
2,026
2,056
https://www.youtube.com/watch?v=xamzdNUN1o0&t=2026s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
things: we need the law of total probability, the regular version and the conditioned version; we need Bayes' rule, the regular version and the conditioned version; and we need the notion of conditional independence, which will allow us to simplify equations. Conditional independence is also equivalent to writing, and we'll also see it that way, P(x given z and y), but once you know Z, X is independent of Y, being
2,056
2,086
https://www.youtube.com/watch?v=xamzdNUN1o0&t=2056s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
equal to P(x given z); and it's also equivalent to P(y given z comma x) being equal to P(y given z): once you know Z, X does not tell you anything about Y. Now, to be fair, the conditional independence assumptions that we'll be making might sometimes not be exactly true in the real world, but they'll often be reasonable approximations, and we might be willing to make them.
2,086
2,115
https://www.youtube.com/watch?v=xamzdNUN1o0&t=2086s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
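The three equivalent statements of conditional independence from this passage:

```latex
P(x, y \mid z) = P(x \mid z)\,P(y \mid z)
\;\Longleftrightarrow\;
P(x \mid y, z) = P(x \mid z)
\;\Longleftrightarrow\;
P(y \mid x, z) = P(y \mid z)
```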
xamzdNUN1o0
That's because they drastically simplify the math and the computation we need to do to then run the algorithm in practice. So often these things are assumptions we make to approximate things that are almost independent in the real world. Questions so far? Yes? Mm-hmm. So typically the way you would do it is this: typically what you'd be given, from a previous calculation maybe, or
2,115
2,159
https://www.youtube.com/watch?v=xamzdNUN1o0&t=2115s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg
xamzdNUN1o0
just given, would be this thing here, which is the prior over X, the prior over X given Z. You'd be given a model of how X, and Z in this case, or just X in that case, cause Y; that's your sensor calibration model, for example. And that together allows you to compute this product for every possible value of x, because we're conditioning on Z, for every possible value of x. Yeah, we're
2,159
2,185
https://www.youtube.com/watch?v=xamzdNUN1o0&t=2159s
Lecture 11 Probability Review, Bayes Filters, Gaussians -- CS287-FA19 Advanced Robotics
https://i.ytimg.com/vi/x…o0/hqdefault.jpg