video_id (string, length 11) | text (string, length 361–490) | start_second (int64, 0–11.3k) | end_second (int64, 18–11.3k) | url (string, length 48–52) | title (string, length 0–100) | thumbnail (string, length 0–52)
---|---|---|---|---|---|---
njKP3FqW3Sk | model on so in the class example the test set is you so you want to understand how likely you are to pass this class you're the test set now what this means is that we want to find the W's that minimize that total loss function which we call the objective function J of W now remember that W is just an aggregation or a collection of all of the individual w's from all of | 1,908 | 1,941 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=1908s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | your weights so here this is just a way for me to express this in a clean notation but W is a whole set of numbers it's not just a single number and you want to find this all of the W's you want to find the value of each of those weights such that you can minimize this entire loss function it's a very complicated problem and remember that our loss function is just a simple | 1,941 | 1,965 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=1941s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | function in terms of those weights so if we plot in the case again of a two-dimensional weight problem so one of the weights is on the x-axis one of the weights is on this axis and on the z axis we have the loss so for any value of w we can see what the loss would be at that point now what do we want to do we want to find the place on this landscape what are the values of W | 1,965 | 1,992 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=1965s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | that we get the minimum loss okay so what we can do is we can just pick a random W pick a random place on this this landscape to start with and from this random place let's try to understand how the landscape is changing what's the slope of the landscape we can take the gradient of the loss with respect to each of these weights to understand the direction of maximum | 1,992 | 2,017 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=1992s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | ascent okay that's what the gradient tells us now that we know which way is up we can take a step in the direction that's down so we know which way is up we reverse the sign so now we start heading downhill and we can move towards that lowest point now we just keep repeating this process over and over again until we've converged to a local minimum now we can summarize this | 2,017 | 2,043 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2017s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | algorithm which is known as gradient descent because you're taking a gradient and you're descending down that landscape by starting to initialize our weights randomly we compute the gradient of J with respect to all of our weights then we update our weights in the opposite direction of that gradient and take a small step which we call here eta of that gradient and this is | 2,043 | 2,070 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2043s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | referred to as the learning rate and we'll talk a little bit more about that later but eta is just a scalar number that determines how much of a step you want to take at each iteration how strongly or aggressively do you want to step towards that gradient in code the picture looks very similar so to implement gradient descent is just a few lines of code just like the pseudocode | 2,070 | 2,092 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2070s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | you can initialize your weights randomly in the first line you can compute your loss given those predictions and your data and given that gradient you just update your weights in the opposite direction of that gradient vector right now the magic line here is actually how do you compute that gradient and that's something I haven't told you and that's something that's not easy at all so the | 2,092 | 2,122 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2092s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | question is given a loss and given all of our weights in our network how do we know which way is good which way is a good place to move given all of this information and I never told you about that but that's a process called back propagation and let's talk about a very simple example of how we can actually derive back propagation using elementary calculus so we'll start with a very | 2,122 | 2,148 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2122s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | simple network with only one hidden neuron and one output this is probably the simplest neural network that you can create you can't really get smaller than this computing the gradient of our loss with respect to W 2 here which is that second weight between the hidden state and our output can tell us how much a small change in W 2 will impact our loss so that's what the gradient tells us right | 2,148 | 2,172 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2148s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | if we change W 2 differentially like in a very minor manner how does our loss change does it go up or down how does it change and by how much really so that's the gradient that we care about the gradient of our loss with respect to W 2 now to evaluate this we can just apply the chain rule in calculus so we can split this up into the gradient of our loss with respect to | 2,172 | 2,200 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2172s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | our output Y multiplied by the gradient of our output Y with respect to W 2 now if we want to repeat this process for a different weight in the neural network let's say now W 1 not W 2 now we replace W 1 on both sides we also apply the chain rule but now you're going to notice that the gradient of Y with respect to W 1 is also not directly computable we have to apply the chain | 2,200 | 2,228 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2200s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | rule again to evaluate this so let's apply the chain rule again we can break that second term up into a gradient with respect to now the state Z ok and using that we can kind of back propagate all of these gradients from the output all the way back to the input that allows our error signal to really propagate from output to input and allows these gradients to be computed in | 2,228 | 2,251 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2228s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | practice now a lot of this is not really important or excuse me it's not as crucial that you understand the nitty-gritty math here because in a lot of popular deep learning frameworks we have what's called automatic differentiation which does all of this back propagation for you under the hood and you never even see it which is incredible it made training neural | 2,251 | 2,275 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2251s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | networks so much easier you don't have to implement back propagation anymore but it's still important to understand how these work at the foundation which is why we're going through it now ok obviously then you repeat this for every single weight in the network here we showed it for just W 1 and W 2 which is every single weight in this network but if you have more you can just repeat it again | 2,275 | 2,297 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2275s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | keep applying the chain rule from output to input to compute this ok and that's the back prop algorithm in theory very simple it's just an application of the chain rule in essence but now let's touch on some of the insights from training and how you can use the back prop algorithm to train these networks in practice optimization of neural networks is incredibly tough in practice | 2,297 | 2,323 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2297s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | so it's not as simple as the picture I showed you on the colorful one on the previous slide here's an illustration from a paper that came out about two or three years ago now where the authors tried to visualize the landscape of a of a neural network with millions of parameters but they collapsed that down onto just two-dimensional space so that we can visualize it and you can see that | 2,323 | 2,346 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2323s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | the landscape is incredibly complex it's not easy there are many local minima that the gradient descent algorithm could get stuck in and applying gradient descent in practice in these types of environments which is very standard in neural networks can be a huge challenge now recall the update equation that we defined previously with gradient descent this is that same equation we're going to update our weights in the | 2,346 | 2,373 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2346s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | direction in the opposite direction of our gradient I didn't talk too much about this parameter eta I pointed it out this is the learning rate it determines how much of a step we should take in the direction of that gradient and in practice setting this learning rate can have a huge impact on performance so if you set that learning rate too small that means that you're not really trusting your gradient on each step so if eta is | 2,373 | 2,400 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2373s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | super tiny that means on each step you're only going to move a little bit in the opposite direction of your gradient just in little small increments and what can happen then is you can get stuck in these local minima because you're not being as aggressive as you should be to escape them now if you set the learning rate too large you can actually overshoot completely and diverge which is even more undesirable | 2,400 | 2,423 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2400s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | so setting the learning rate can be very challenging in practice you want to pick a learning rate that's large enough such that you avoid the local minima but small enough such that you still converge in practice now the question that you're all probably asking is how do we set the learning rate then well one option is that you can just try a bunch of learning rates and see what works best | 2,423 | 2,445 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2423s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | another option is to do something a little bit more clever and see if we can try to have an adaptive learning rate that changes with respect to our loss landscape maybe it changes with respect to how fast the learning is happening or a range of other ideas within the network optimization scheme itself this means that the learning rate is no longer fixed but it can now increase or | 2,445 | 2,472 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2445s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | decrease throughout training so as training progresses your learning rate may speed up you may take more aggressive steps you may take smaller steps as you get closer to the local minima so that you really converge on that point and there are many options here of how you might want to design this adaptive algorithm and this has been a huge or a widely studied field in | 2,472 | 2,494 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2472s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | optimization theory for machine learning and deep learning and there have been many published papers and implementations within tensor flow on these different types of adaptive learning rate algorithms so SGD is just that vanilla gradient descent that I showed you before that's the first one all of the others are all adaptive learning rates which means that they change their learning rate during training itself so they can increase or | 2,494 | 2,520 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2494s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | decrease depending on how the optimization is going and during your labs we really encourage you again to try out some of these different optimization schemes see what works what doesn't work a lot of it is problem dependent there are some heuristics that you can you can get but we want you to really gain those heuristics yourselves through the course of the labs it's part | 2,520 | 2,542 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2520s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | of building character okay so let's put this all together from the beginning we can define our model which is defined as this sequential wrapper inside of this sequential wrapper we have all of our layers all of these layers are composed of perceptrons or single neurons which we saw earlier the second line defines our optimizer which we saw in the previous slide | 2,542 | 2,569 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2542s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | this can be SGD it can also be any of those adaptive learning rates that we saw before now what we want to do is during our training loop it's very it's the same stories again as before nothing's changing here we forward pass all of our inputs through that model we get our predictions using those predictions we can evaluate them and compute our loss our loss tells us how | 2,569 | 2,595 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2569s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | wrong our network was on that iteration it also tells us how we can compute the gradients and how we can change all of the weights in the network to improve in the future and then the final line there takes those gradients and actually allows our optimizer to update the weights and the trainable variables such that on the next iteration they do a little bit better and over time if you | 2,595 | 2,616 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2595s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | keep looping this will converge and hopefully you should fit your data now I want to continue to talk about some tips for training these networks in practice and focus on a very powerful idea of batching your data into mini batches so to do this let's revisit the gradient descent algorithm this gradient is actually very computationally expensive to compute in practice so | 2,616 | 2,646 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2616s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | using the backprop algorithm is a very expensive idea in practice so what we want to do is actually not compute this over all of the data points but actually compute it over just a single data point in the data set in most real-life applications it's not actually feasible to compute on your entire data set at every iteration it's just too much data so instead we pick a single | 2,646 | 2,668 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2646s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | point randomly we compute our gradient with respect to that point and then on the next iteration we pick a different point and we can get a rough estimate of our gradient at each step right so instead of using all of our data now we just pick a single point I we compute our gradient with respect to that single point I and what's a middle ground here so the downside of using a single point | 2,668 | 2,694 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2668s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | is that it's going to be very noisy the downside of using all of the points is that it's too computationally expensive is there some middle ground that we can have in between so that middle ground is actually just very simple instead of taking one point and instead of taking all of the points let's take a mini batch of points so maybe something on the order of 10 20 30 100 maybe | 2,694 | 2,714 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2694s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | depending on how rough or accurate you want that approximation of your gradient to be and how much you want to trade off speed and computational efficiency now the true gradient is just obtained by averaging the gradient from each of those B points so B is the size of your batch in this case now since B is normally not that large like I said maybe on the order of tens to a hundreds | 2,714 | 2,740 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2714s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | this is much faster to compute than full gradient descent and much more accurate than stochastic gradient descent because it's using more than one point more than one estimate now this increase in gradient accuracy estimation actually allows us to converge to our target much quicker because it means that our gradients are more accurate in practice it also means that we can increase our | 2,740 | 2,763 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2740s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | learning rate and trust each update more so if we're very noisy in our gradient estimation we probably want to lower our learning rate a little more so we don't fully step in the wrong direction if we're not totally confident with that gradient if we have a larger batch of data to aggregate our gradients with we can trust that learning rate a little more | 2,763 | 2,784 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2763s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | increase it so that it steps more aggressively in that direction what this means also is that we can now massively parallelize this computation because we can split up batches on multiple GPUs or multiple computers even to achieve even more significant speed ups with this training process now the last topic I want to address is that of overfitting and this is also known as the problem of | 2,784 | 2,812 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2784s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | generalization in machine learning and it's actually not unique to just deep learning but it's a fundamental problem of all of machine learning now ideally in machine learning we want a model that will approximate or estimate our data or accurately describes our data let's say like that said differently we want to build models that can learn representations from our training data | 2,812 | 2,838 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2812s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | that still generalize to unseen test data now assume that you want to build a line that best describes these points you can see on the screen underfitting describes when our model does not fully capture the complexity of this problem or if we can't really capture the true complexity of this problem while overfitting on the right starts to memorize certain aspects | 2,838 | 2,864 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2838s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | of our training data and this is also not desirable we want the middle ground which ideally we end up with a model in the middle that is not too complex to memorize all of our training data but also one that will continue to generalize when it sees new data so to address this problem of regularization in neural network specifically let's talk about a technique of regularization | 2,864 | 2,884 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2864s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | which is another way that we can deal with this and what this is doing is it's trying to discourage complex information from being learned so we want to eliminate the model from actually learning to memorize the training data we don't want to learn like very specific pinpoints of the training data that don't generalize well to test data now as we've seen before this is | 2,884 | 2,907 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2884s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | actually crucial for our models to be able to generalize to our test data so this is very important the most popular regularization technique deep learning is this very basic idea of drop out now the idea of drop out is well actually let's start with by revisiting this picture of a neural network that we had introduced previously and drop out during training | 2,907 | 2,930 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2907s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | we randomly set some of these activations of the hidden neurons to zero with some probability so say our probability is 0.5 we're randomly going to set the activations of some of our hidden neurons to 0 each with probability 0.5 the idea is extremely powerful because it allows the network to lower its capacity it also makes it such that the network can't build these memorization channels | 2,930 | 2,957 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2930s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | through the network where it tries to just remember the data because on every iteration 50% of that data is going to be or 50% of that memorization or memory is going to be wiped out so it's going to be forced to to not only generalize better but it's going to be forced to have multiple channels through the network and build a more robust representation of its prediction now we | 2,957 | 2,980 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2957s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | just repeat this on every iteration so on the first iteration we dropped out one 50% of the nodes on the next iteration we can drop out a different randomly sampled 50% which may include some of the previously sampled nodes as well and this will allow the network to generalize better to new test data the second regularization technique that we'll talk about is the notion of early | 2,980 | 3,002 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=2980s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | stopping so what I want to do here is just talk about two lines so during training which is the x-axis here we have two lines the y-axis is our loss curve the first line is our training loss so that's the green line the green line tells us how our training data how well our model is fitting to our training data we expect this to be lower than the second line which is our | 3,002 | 3,025 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=3002s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | testing data so usually we expect to be doing better on our training data than our testing data as we train and as this line moves forward into the future both of these lines should kind of decrease go down because we're optimizing the network we're improving its performance eventually though there becomes a point where the training data starts to diverge from the testing data now what happens is that the training data | 3,025 | 3,049 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=3025s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | should always continue to fit or the model should always continue to fit the training data because it's still seeing all of the training data it's not being penalized from that except for maybe if you drop out or other means but the testing data it's not seeing so at some point the network is going to start to do better on its training data than its testing data and what this means is | 3,049 | 3,069 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=3049s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | basically that the network is starting to memorize some of the training data and that's what you don't want so what we can do is well we can perform early stopping or we can identify this point this inflection point where the test loss starts to increase and diverge from the training loss so we can stop the network early and make sure that our test loss is as low as possible | 3,069 | 3,094 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=3069s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | and of course if we actually look at on the side of this line if we look at on the left side that's where a model is under fit so we haven't reached the true capacity of our model yet so we'd want to keep training if we didn't stop yet if we did stop already and on the right side is where we've over fit where we've passed that early stopping point and we need to like basically we've started to | 3,094 | 3,115 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=3094s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | memorize some of our training data and that's when we've gone too far I'll conclude this lecture by just summarizing three main points that we've covered so far first we've learned about the fundamentals of neural networks which is a single neuron or a perceptron we've learned about stacking and composing these perceptrons together to form complex hierarchical | 3,115 | 3,135 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=3115s | MIT 6.S191 (2020): Introduction to Deep Learning | |
njKP3FqW3Sk | representations and how we can mathematically optimize these networks using a technique called back propagation using their loss and finally we address the practical side of training these models such as mini batching regularization and adaptive learning rates as well with that I'll finish up I can take a couple questions and then we'll move on to office lecture | 3,135 | 3,159 | https://www.youtube.com/watch?v=njKP3FqW3Sk&t=3135s | MIT 6.S191 (2020): Introduction to Deep Learning | |
La9oLLoI5Rc | It's a scientific fact that the hormones of stress downregulate genes and create disease. Long-term effects. Human beings because of the size of the neocortex, we can turn on the stress response just by thought alone as I think about our problems and turn on those chemicals That means then our thoughts Could make us sick So if it's possible, that our thoughts could make us sick then it is possible then our thoughts could make us well, the answer is absolutely yes | 0 | 32 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=0s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Everybody welcome to Impact Theory our goal with this show and company is to introduce you to the people and ideas that will help you Actually execute on your dreams Alright today's guest is a New York Times bestselling author and one of the most sought-after speakers in the world He's lectured and given advanced workshops in more than 30 countries Across five continents all with the aim of helping people better understand and unlock the power of their mind | 32 | 57 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=32s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | His expertise is the intersection of the fields of neuroscience Epigenetics and quantum physics and he's partnered with other scientists across multiple disciplines to perform extensive research on the effects of meditation Using advanced technologies such as epigenetic testing brain mapping with EEG s and gas-discharge visualization technology. Through his work | 57 | 78 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=57s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | He is endeavouring to help advance both the scientific community and the public at large as understanding of mind derived health optimization, a topic he covered extensively in his groundbreaking book, You are the placebo. His teaching has had such a profound impact on the way that people perceive a wide range of brain related topics around Mindfulness and well-being that he's a faculty member at the quantum University in Hawaii the Omega Institute for holistic studies in New York | 78 | 104 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=78s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | And the Kripalu Center for Yoga and Health in Stockbridge, Massachusetts He's also an invited chair of the research committee at Life University in Atlanta As well as a corporate consultant where he delivers his lectures and workshops for businesses So, please help me in welcoming the man who has appeared in such films as Heal, People v. the State of Illusion and Unleashing Creativity | 104 | 126 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=104s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | The author of the recent book Becoming supernatural. Dr. Joe Dispenza Thanks for being here So, diving into your world and how you perceive the sense of self and the way that you marry science to - the way that we form memories the way that we live in a perpetual state of Reliving our past and things like that It's really, really incredible and I want to dive into the whole notion of you sort of being a habitual | 126 | 157 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=126s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Construct like what? What is that? What is the habit of you? Well a habit is a redundant set of Automatic unconscious thoughts, behaviors and emotions that's acquired through repetition The habit is when you've done done something so many times that your body now knows how to do it better than your mind So if you think about it people wake up in the morning they | 157 | 179 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=157s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Begin to think about their problems Those problems are circuits, memories in the brain, each One of those memories are connected to people and things at certain times and places and if the brain is a record of the past The moment they start their day, they're already thinking in the past. Each one of those memories has an emotion Emotions are the end product of past experiences | 179 | 203 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=179s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | So the moment they recall those memories of their problems, they all of a sudden feel unhappy, they feel sad, they feel pain Now how you think and how you feel creates your state of being. So the person's entire State of being when they start their day is in the past. So what does that mean? The familiar past will sooner or later be predictable future so if you believe that your thoughts have something to do with your destiny and | 203 | 229 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=203s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | You can't think greater than how you feel Or feelings have become the means of thinking by very definition of emotions you're thinking in the past And for the most part you're going to keep creating the same life, so then people grab their cell phone They check their WhatsApp. They check their texts. They check their emails. They check Facebook They take a picture of their feet. They post it on Facebook. They tweet something, they do Instagram | 229 | 252 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=229s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | they check the news and now they feel really connected to everything that's known in their life And then they go through a series of routine behaviors They get out of bed on the same side. They go to the toilet. They get a cup of coffee They take a shower, they get dressed, they drive to work the same way. They do the same things They see the same people that pushed the same emotional buttons and that becomes the routine and it becomes like a program | 252 | 275 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=252s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | So now they've lost their free will To a program and there's no unseen hand doing it to them. So when it comes time to change the Redundancy of that cycle becomes a subconscious program. So now 95% of who we are by the time we're 35 years old is a Memorized set of behaviors, emotional reactions, unconscious habits, hardwired attitudes, beliefs and perceptions that function like a computer program | 275 | 303 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=275s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | So then person can say with their five percent of their conscious mind. I want to be healthy I want to be happy. I want to be free but the body's on a whole different program So then how do you begin to make those changes? Well? you have to get beyond the analytical mind because what separates the conscious mind from the Subconscious mind is the analytical mind and that's where meditation comes in because you can teach people | 303 | 328 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=303s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | through practice how to change their brainwaves, slow them down and when they do that Properly they do enter the operating system where they can begin to make some really important changes. So Most people then wait for crisis or trauma or disease or diagnosis, you know, they wait for loss some tragedy to make up their mind to change and my message is why wait and and | 328 | 349 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=328s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | You can learn and change in a state of pain and suffering or you can learn and change in a state of joy and inspiration I think right now the cool thing is that people are waking up that's really interesting and where I found the the deepest hooks into how powerful this can be for somebody is when you talk about trauma and you've talked about how People experience a traumatic event, but they then basically rehearse it and how that then has this knock-on effect. So, what is that? | 349 | 376 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=349s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Why do people find it so hard to get past trauma? Well? the the stronger the emotional reaction You have to some experience in your life the higher the emotional quotient The more you pay attention to the cause and the moment the brain puts all of its attention on the cause It takes a snapshot and that's called a memory. So long-term memories are created from very highly | 376 | 402 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=376s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Emotional experiences. So what happens then is that people think neurologically within the circuitry of that experience and they feel chemically within the boundaries of those emotions and So when you have an emotional reaction to someone or something most people think that they can't control their emotional reaction Well, it turns out if you allow that emotional reaction, it's called a refractory period to last for hours or days | 402 | 428 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=402s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | That's called a mood. I say to someone, hey, what's up? They say, I'm in a mood. Well, why are you in a mood? well I had this thing happen to me five days ago and I'm having one long emotional reaction if you keep that same emotional reaction going on for weeks or months That's called temperament. Why is he so bitter? I don't know. Let's ask him. Why is he so bitter? Why are you bitter? | 428 | 449 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=428s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Well, I had this thing happened to me nine months ago And if you keep that same emotional reaction going on for years on end that's called a personality trait And so learning how to shorten your refractory period of emotional reactions is really where that work starts So then people when they have an event what they do is they keep recalling the event because the | 449 | 473 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=449s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Emotions of stress hormones the survival emotions are saying pay attention to what happened Because you want to be prepared if it happens again Turns out most people spend 70% of their life living in survival and living in stress. So they're they're always Anticipating the worst-case scenario based on a past experience and they're literally out of the infinite | 473 | 497 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=473s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | potentials in the quantum field they're selecting the worst possible outcome and they're beginning to emotionally embrace it with fear and their Conditioning their body into a state of fear do that enough times Body has a panic attack without you you you can't even predict it because it's programmed subconsciously So then you say to the person why are you this way? | 497 | 518 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=497s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | And they'll say I am this way because of this event that happened to me 15 or 20 years ago and what that means from biological standpoint is that they haven't been able to change since that event So then the emotions from the experience tend to give the body and the brain a rush of energy So people become addicted To the rush of those emotions and they use the problems and conditions in their life to reaffirm their limitation | 518 | 546 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=518s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | So at least they can feel something. So now when it comes time to change you say to the person why are you this way? Well, every time they recall the event they're producing the same chemistry in their brain and body as if the event is occurring firing and wiring the same circuits and Sending the same emotional signature to the body. Well, what's the relevance behind that? Well your body is the unconscious mind | 546 | 568 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=546s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | It doesn't know the difference between the experience that's creating the emotion and the emotion that you're creating by thought alone So the body's believing it's living in the same past experience 24 hours a day seven days a week 365 days a year and so then when those emotions influence certain thoughts and they do and Then those thoughts create the same emotions and those same emotions influence the same thoughts | 568 | 592 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=568s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Now the entire person's state of being is in the past. So then the hardest part about change is not making the same choice as you did the day before a period and The moment you decide to make a different choice get ready because it's going to feel uncomfortable It's going to feel unfamiliar. It's there's gonna be something so why does it feel so uncomfortable? Is it because of the the | 592 | 616 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=592s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | neurons that fire together wire together so I've there's like an Easiness to that loop just because literally and you've talked very eloquently about this the way that the neurons connect in the brain how rapidly I've seen you show footage of how Rapidly those connections happen, which is pretty incredible Is is that what makes it so? discomforting for people I think that I think that the bigger thing is that we we keep | 616 | 641 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=616s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Firing and wiring those circuits they become more hardwired. So there you have a thought and then the program runs but it's the emotion that follows the thought if you have a if you have a Fearful thought you're gonna feel anxiety the moment you feel anxiety your brain's checking in with your body and saying yeah, you're pretty anxious so then you start thinking more corresponding thoughts equal to how you feel | 641 | 664 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=641s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Well the redundancy of that cycle conditions the body to become the mind. So now when it comes time to change the person steps into that river of change and they make a different choice and all of a sudden They don't feel the same way So the body says well you've been doing this for 35 years Well, you're gonna just stop feeling suffering and stop feeling guilty and stop feeling shameful and you're not gonna complain or blame or make excuses | 664 | 691 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=664s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Or feel sorry for yourself long The body's in the unknown so the body says I want to return back to familiar territory so the body starts influencing the mind then it says Start tomorrow, you're too much like your mother. You'll never change. This isn't gonna work for you. This doesn't feel right And so if you respond to that thought as if it's true that same thought will lead to the same choice | 691 | 717 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=691s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Which will lead to the same behavior, which will create the same experience which produce the same emotion I want to talk about that notion of Give me a little more detail. We mean by the body becomes the mind or the unconscious mind. What do you mean by that exactly? Well, those are two different things your body is your unconscious mind in a sense if you're sitting down and you start thinking about | 717 | 742 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=717s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Some future worst-case scenario that you're conjuring up in your mind and you begin to feel the emotion of that event your body doesn't know the difference between The event that's taking place in your world outer world and what you're creating by emotion or thought alone. So most people then They're they're constantly reaffirming their emotional states So when it comes time to give up that emotion | 742 | 768 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=742s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | they can say I really want to do it but really the body is stronger than the mind because it's been conditioned that way so The servant now has become the master and the person all of a sudden once they step into that unknown They'd rather feel guilt and suffering because at least they can predict it being in the unknown Is a scary place for most people because the unknown is uncertain people say to me. Well, I can't predict my future | 768 | 794 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=768s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | I'm in the unknown and I always say the best way to predict your future is to create it Not from the known but from the unknown what thoughts do you want to fire and wire in your brain? what behaviors do you want to demonstrate in one day? the act of mental rehearsal closing your eyes and rehearsing the action or the reaction of what you want by closing your eyes and mentally rehearsing some action if | 794 | 820 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=794s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | You're truly present. The brain does not know the difference between what you're imaging and what you're experiencing in 3d world so then you begin to install the Neurological Hardware in your brain to look like the event has already occurred Now your brain is no longer a record of the past now It's a map to the future and if you keep doing it priming it that way | 820 | 839 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=820s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | the hardware becomes a software program and who knows you just may start acting like a happy person and then I think the hardest part is To teach our body emotionally what the future will feel like ahead of the actual experience. So, what does that mean? You can't wait for your success to feel empowered. You can't wait for your wealth to feel abundant you can't wait for your your new relationship to feel love or | 839 | 865 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=839s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Your healing to feel whole I mean that's the old model of reality of cause and effect, you know Waiting for something outside of us to change how we feel inside of us and when we feel better inside of us We pay attention to ever or whatever caused it But what that means then is that from the Newtonian world that most people spend their whole life living in lack | 865 | 884 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=865s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Waiting for something to change out there what do you mean the Newtonian world? The Newtonian world is all about the predictable It's all about predicting the future But the quantum model of reality is about causing an effect the moment you start feeling Abundant and worthy you are generating wealth the moment you're empowered and feel it You're beginning to step towards your success the moment You start feeling whole | 884 | 909 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=884s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Your healing begins and when you love yourself and you love all of life You'll create an equal and now you're causing an effect and I think that's that the difference between living as a victim in Your world saying I am this way because of this person or that thing or this experience They made me think and feel this way when you switch that around you become a creator of your world and you start | 909 | 931 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=909s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | My thinking and my feeling is changing an outcome in my life And now that's a whole different game and we start believing more that were creators of reality. So, how do we go from? Okay, I have this negative emotion. It's controlling my life. It's got me in this cycle of I think about this emotion which triggers a chemical reaction which trains my body to feel that way which makes it easier more likely I will do it again and | 931 | 953 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=931s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | so now I'm in this vicious cycle unconsciously and it's unconscious right and you You said does your thinking create your environment or does your environment create your thinking which I thought was really really interesting. So how do we then go from that like mechanistically To begin this visualization process of something that's empowering it's me in a different state. It's my future self | 953 | 976 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=953s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Is it meditation is it what does that look like if you're not being defined by a vision of the future? Then you're left with the old memories of the past and you will be predictable in your life and If you wake up in the morning and you're not being defined by a vision in the future as you see the same people and you go to the same places and You do the exact same thing at the exact same time | 976 | 1,001 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=976s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | It's no longer that your personality is creating your personal reality Now your personal reality is affecting or creating your personality Your environment is really controlling how you think and feel? unconsciously because every person every thing every place every experience has a neurological network in your brain every Experience that you have with every person produces an emotion. So some people will use their boss to reaffirm their addiction to judgment | 1,001 | 1,026 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=1001s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | They'll use their enemy to reaffirm their addiction to hatred to use their friends that we affirm their addiction to suffering So now they need the outer world to feel something. So To change them is to be greater than your environment to be greater than the conditions in your world and the environment Is that seductive so then why is meditation the tool well? | 1,026 | 1,048 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=1026s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Let's sit down. Let's close our eyes. Let's disconnect from your outer environment So if you're seeing less things is less stimulation going to your brain if you're playing soft music or you have earplugs in Less sensory information coming to your brain. So you're disconnecting from environment if you can sit your body down and Tell it to stay like an animal stay right here. I'm gonna feed you when we're done | 1,048 | 1,073 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=1048s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | You can get up and check your emails You can do all your texts, but right now you're gonna sit there and obey me So then when you do that properly and the you're not eating anything or smelling anything or tasting anything? You're not up experiencing and feeling anything. You would have to agree with me that you're being defined by a thought, right? So when the body wants to go back to its emotional past | 1,073 | 1,097 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=1073s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | And you become aware that your attention is on that emotion And where you place your attention is where you place your energy? you're siphoning your energy out of the present moment into the past and you become aware of that and You settle your body back down in the present moment because it's saying well, it's eight o'clock You normally get upset because you're in traffic around this time and here you are sitting and we're used to feeling anger and you're off | 1,097 | 1,120 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=1097s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Schedule. Oh, it's 11 o'clock and usually check your emails and judge everybody. Well, the body is looking for that that predictable chemical state every time you become aware that you're doing that and your body is craving those emotions and You settle it back down into the present moment. You're telling the body it's no longer the mind that you're the mind and now your will is | 1,120 | 1,142 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=1120s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Getting greater than the program and if you keep doing this over and over again over and over again over and over again Just like training a stallion or a dog. It's just gonna say I'm gonna sit and the moment that happens when the body's no longer the mind when it finally surrenders There's a liberation of energy We go from particle to wave from matter to energy and we free ourselves from the chains | 1,142 | 1,167 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=1142s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza | |
La9oLLoI5Rc | Of those emotions that keep us in the in the familiar past and we've seen this Thousands of times. In fact, we can actually predict it now on a brain scan. That's so interesting Let's go a little bit harder on Metacognition the notion that you don't have to believe everything you think I love the way that you talk about that. Hmm Yeah, and we have a huge frontal lobe. It's 40% of our entire brain and | 1,167 | 1,191 | https://www.youtube.com/watch?v=La9oLLoI5Rc&t=1167s | How To BRAINWASH Yourself For Success & Destroy NEGATIVE THOUGHTS! | Dr. Joe Dispenza |
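
The gradient-descent loop narrated in the MIT 6.S191 segments above (initialize the weights randomly, compute the gradient of the loss J with respect to the weights, step in the opposite direction scaled by the learning rate eta, repeat until convergence) can be sketched as follows. This is a minimal NumPy sketch, not the lecture's actual code; the quadratic toy loss, the function names, and the default values of `eta` and `n_steps` are illustrative assumptions.

```python
# Minimal sketch of vanilla gradient descent: w <- w - eta * dJ/dw.
import numpy as np

def gradient_descent(grad_fn, w_init, eta=0.1, n_steps=100):
    """Repeatedly step opposite the gradient returned by grad_fn(w)."""
    w = np.array(w_init, dtype=float)   # initialize weights (randomly, in practice)
    for _ in range(n_steps):
        g = grad_fn(w)                  # gradient of the loss w.r.t. the weights
        w = w - eta * g                 # move downhill: opposite the gradient
    return w

# Toy example: J(w) = ||w||^2 has gradient 2w and its minimum at w = 0.
w_star = gradient_descent(lambda w: 2 * w, w_init=[3.0, -1.5], eta=0.1)
print(w_star)  # close to [0, 0]
```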
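The chain-rule backpropagation walked through above for the tiny network with one hidden neuron and one output (dJ/dW2 = dJ/dy · dy/dW2, and dJ/dW1 expanded further through the hidden state z) might look like this when written out by hand. A sketch only: the sigmoid activation and squared-error loss are assumptions the lecture does not specify.

```python
# Hand-derived backprop for x -> z1 = w1*x -> a = sigmoid(z1) -> y = w2*a,
# with loss J = 0.5 * (y - t)^2.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_backward(x, t, w1, w2):
    # forward pass
    z1 = w1 * x
    a = sigmoid(z1)
    y = w2 * a
    J = 0.5 * (y - t) ** 2
    # backward pass: apply the chain rule from output back to input
    dJ_dy = y - t                     # dJ/dy
    dJ_dw2 = dJ_dy * a                # dJ/dw2 = dJ/dy * dy/dw2
    dJ_da = dJ_dy * w2                # dJ/dy * dy/da
    dJ_dz1 = dJ_da * a * (1 - a)      # * da/dz1 (sigmoid derivative)
    dJ_dw1 = dJ_dz1 * x               # * dz1/dw1
    return J, dJ_dw1, dJ_dw2

print(forward_backward(x=1.0, t=0.0, w1=0.5, w2=-0.3))
```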
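The training loop summarized above (define a Sequential model, pick an optimizer such as SGD or an adaptive method, forward-pass the inputs, compute the loss, backpropagate the gradients, and let the optimizer update the trainable variables) is sketched below in TensorFlow 2 / Keras style, which the lecture's snippets appear to use. The layer sizes, the mean-squared-error loss, the learning rate, and the toy data shapes are illustrative assumptions.

```python
# Minimal sketch of a custom training step with tf.GradientTape.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=1e-2)  # or Adam, Adagrad, ...

def train_step(x, y):
    with tf.GradientTape() as tape:
        preds = model(x)                             # forward pass
        loss = tf.reduce_mean(tf.square(y - preds))  # how wrong were we?
    grads = tape.gradient(loss, model.trainable_variables)        # backprop
    optimizer.apply_gradients(zip(grads, model.trainable_variables))  # update
    return loss

# Toy usage with random data (shapes are illustrative only).
x = tf.random.uniform((32, 10))
y = tf.random.uniform((32, 1))
print(float(train_step(x, y)))
```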
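The mini-batch idea above — estimate the true gradient by averaging per-example gradients over a batch of B points, rather than one point (too noisy) or the whole dataset (too expensive) — could be sketched like this; `grad_example` is a hypothetical helper returning the gradient of the loss at a single (x, y) pair.

```python
# Sketch of a mini-batch gradient estimate averaged over B sampled points.
import numpy as np

def minibatch_gradient(grad_example, xs, ys, batch_size=32, rng=None):
    rng = rng or np.random.default_rng()
    idx = rng.choice(len(xs), size=batch_size, replace=False)   # sample B points
    grads = [grad_example(xs[i], ys[i]) for i in idx]           # per-point gradients
    return np.mean(grads, axis=0)                               # average over the batch
```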
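Dropout as described above — on each training iteration, zero out each hidden activation with some probability (0.5 in the lecture's example), sampling a fresh random subset every time — reduces to a masking step like the following sketch. The rescaling by 1/(1-p) is the common "inverted dropout" convention, an assumption beyond what the lecture states.

```python
# Sketch of a dropout mask applied to a layer's activations during training.
import numpy as np

def dropout(activations, p=0.5, rng=None):
    rng = rng or np.random.default_rng()
    mask = rng.random(activations.shape) >= p   # keep each unit with probability 1 - p
    return activations * mask / (1.0 - p)       # rescale kept units (inverted dropout)
```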
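Early stopping as described above — watch the training and test/validation loss curves, and stop at the inflection point where the test loss starts to rise while the training loss keeps falling — can be sketched as a patience-based loop; `train_one_epoch`, `eval_loss`, and the `patience` value are hypothetical placeholders, and in practice you would also keep a copy of the best weights.

```python
# Sketch of patience-based early stopping on a validation/test loss.
def train_with_early_stopping(train_one_epoch, eval_loss, patience=3, max_epochs=100):
    best_loss, best_epoch, bad_epochs = float("inf"), 0, 0
    for epoch in range(max_epochs):
        train_one_epoch()
        val = eval_loss()
        if val < best_loss:
            best_loss, best_epoch, bad_epochs = val, epoch, 0
        else:
            bad_epochs += 1              # validation loss stopped improving
            if bad_epochs >= patience:   # inflection point reached: stop training
                break
    return best_epoch, best_loss
```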