Session: QMSum_194
Uh.
Mm-hmm.
But, yeah, actually the TIMIT noises are sort of a range of noises, and they're not so much the stationary driving kind of noises, right? It's pretty different, isn't it?
Uh, there is a car noise. So there are just four noises. Um, "Car", I think, "Babble",
"Babble."
"Subway", right? And
"Street" or "Airport" or something.
And "Street" isn't
Or "Train station".
"Train station", yeah.
Yeah.
So it's mostly Well, "Car" is stationary,
Mm-hmm.
"Babble" is a stationary background plus some voices,
Mm-hmm.
some speech over it. And the other two are rather stationary also.
Well, I think that if you run it Actually, maybe you remember this. In the old experiments, when you ran with the neural net only and didn't have this side path with the pure features as well, did it make things better to have the neural net?
Mm-hmm.
Was it about the same?
It was a little bit worse.
Than?
Than just the features, yeah.
So, until you put the second path in with the pure features, the neural net wasn't helping at all.
Mm-hmm.
Well, that's interesting.
It was helping, uh, if the features were bad,
Yeah.
I mean, just plain PLPs or MF
Yeah.
CCs. As soon as we added LDA, on-line normalization, and all these things, then
They were doing similar enough things. Well, I still think it would be sort of interesting to see what would happen if you just had the neural net without the side thing.
Yeah,
And the thing I have in mind is, uh, maybe you'll see that the results are not just a little bit worse.
Mm-hmm.
Maybe they're a lot worse, you know? And, um But if, on the other hand, it's, say, somewhere in between what you're seeing now and what you'd have with just the pure features, then maybe there is some problem of a combination of these things, or correlation between them somehow.
Mm-hmm.
If it really is that the net is hurting you at the moment, then I think the issue is to focus on improving the net.
Yeah,
Um.
Mm-hmm.
So what's the overall effect I mean, you haven't done all the experiments, but you said it was somewhat better, say, five percent better, for the first two conditions, and fifteen percent worse for the other one? But of course that one's weighted lower,
Yeah, oh. Yeah.
so I wonder what the net effect is.
I think it was one or two percent. That's not that bad, but it was like two percent relative worse on SpeechDat-Car. I have to check that. Well, I will.
Well, overall it will still be better even if it is fifteen percent worse, because the fifteen percent worse is given a weight of, like, point two five.
Right.
Mm-hmm. Hmm.
Right. So the worst it could be, if the others were exactly the same, is four,
Is it like
and, uh, in fact since the others are somewhat better
Yeah, so it's four. So either it'll get cancelled out, or you'll get, like, almost the same.
Uh.
Yeah, it was slightly worse.
Slightly bad. Yeah.
Um,
Yeah, it should be pretty close to cancelled out.
Yeah.
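A quick back-of-the-envelope check of the trade-off being weighed here. The 0.40/0.35/0.25 condition weights are an assumption, following the Aurora-style weighting implied by the "point two five" above; the +5/+5/-15 figures are the rough numbers quoted in the exchange.

```python
# Net effect of per-condition relative changes under assumed Aurora-style weights.
weights = {"well_matched": 0.40, "medium_mismatch": 0.35, "high_mismatch": 0.25}
rel_change = {"well_matched": +5.0, "medium_mismatch": +5.0, "high_mismatch": -15.0}

net = sum(weights[c] * rel_change[c] for c in weights)
print(f"net weighted change: {net:+.2f}%")  # 0.40*5 + 0.35*5 - 0.25*15 = 0.00

# Worst case, if the other conditions were exactly the same:
# 0.25 * 15 = 3.75, i.e. the "four" percent mentioned above.
```

This is consistent with the conclusion in the dialogue: the gains and the loss should come out pretty close to cancelled.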
You know, I've been wondering about something.
Mm-hmm.
In, um, a lot of the Hub-five systems recently, they have been using LDA, and they run LDA on the features right before they train the models. So the LDA is right there before the HMM
Yeah.
So, you guys are using LDA, but it seems like it's pretty far back in the process.
Uh, this LDA is different from the LDA that you are talking about. The LDA that you're describing is, like, you take a block of features, like nine frames or something, and then do an LDA on it,
Yeah. Uh-huh.
and then reduce the dimensionality to something like twenty-four or something like that.
Yeah, you can.
And then feed it to HMM.
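A minimal sketch of the frame-stacking LDA just described: splice a nine-frame window of cepstra, fit a class-discriminant projection down to twenty-four dimensions, and hand the result to HMM training. The data, phone labels, dimensions, and use of scikit-learn are illustrative stand-ins, not the actual Hub-5 recipe.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def stack_frames(feats: np.ndarray, context: int = 4) -> np.ndarray:
    """Stack +/-context frames around each frame (9-frame window for context=4)."""
    padded = np.pad(feats, ((context, context), (0, 0)), mode="edge")
    return np.hstack([padded[i : i + len(feats)] for i in range(2 * context + 1)])

feats = np.random.randn(1000, 13)             # stand-in for per-frame cepstra
labels = np.random.randint(0, 40, size=1000)  # stand-in for per-frame phone labels

stacked = stack_frames(feats)                 # (1000, 117) = 9 frames * 13 dims
lda = LinearDiscriminantAnalysis(n_components=24).fit(stacked, labels)
hmm_inputs = lda.transform(stacked)           # (1000, 24), fed to the HMM
```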
I mean, it's you know, you're just basically
Yeah, so this is like a two-dimensional tile.
You're shifting the feature space. Yeah.
So this is a two-dimensional tile. And the LDA that we are applying is only in time, not across frequency. So it's more like a filtering in time, rather than doing a
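Under that "filtering in time" reading, the temporal-only LDA amounts to running a learned FIR filter along each feature dimension's trajectory, with no mixing across frequency. A sketch with placeholder taps; in the real system the taps would come from an LDA on temporal trajectories, not from a window function.

```python
import numpy as np

def temporal_filter(traj: np.ndarray, taps: np.ndarray) -> np.ndarray:
    """Convolve each feature channel's time trajectory with the same filter."""
    return np.stack(
        [np.convolve(traj[:, d], taps, mode="same") for d in range(traj.shape[1])],
        axis=1,
    )

taps = np.hamming(21) / np.hamming(21).sum()  # placeholder low-pass taps
smoothed = temporal_filter(np.random.randn(1000, 13), taps)  # (1000, 13)
```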
Ah, OK. So what about, um I mean, I don't know if this is a good idea or not, but what if you ran the other kind of LDA on your features right before they go into the HMM?
Uh, it
Mm-hmm. No, actually, I think
Well, what we do with the ANN is something like that, except that it's not linear. It's like a nonlinear discriminant analysis.
Yeah. Right. So it's sort of like
But.
The tandem stuff is kind of like nonlinear LDA.
Yeah. It's
Yeah.
Yeah.
Yeah.
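A minimal sketch of the tandem idea being likened to nonlinear LDA: a net trained on phone targets acts as a nonlinear discriminant transform, and its log posteriors, decorrelated, become the HMM's input features. The shapes, labels, and the scikit-learn MLP are illustrative assumptions, not the meeting's actual setup.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.decomposition import PCA

feats = np.random.randn(2000, 117)            # stand-in: 9 stacked frames of cepstra
phones = np.random.randint(0, 40, size=2000)  # stand-in: per-frame phone labels

# Discriminative training on phone targets: the "nonlinear LDA" step.
net = MLPClassifier(hidden_layer_sizes=(200,), max_iter=50).fit(feats, phones)
log_post = np.log(net.predict_proba(feats) + 1e-8)  # (2000, 40) log posteriors

# Decorrelate and reduce before handing the features to the HMM.
tandem = PCA(n_components=24).fit(log_post).transform(log_post)  # (2000, 24)
```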
But I mean, the other features that you have, um, the non-tandem ones,
Uh. Mm-hmm. Yeah, I know. Well, in the proposal, they were transformed using PCA, but
Uh-huh.
Yeah, it might be that LDA could be better.
The argument and it's not like we really know, but the argument anyway is that, um, discriminative things are good. LDA, neural nets, they're good.
Yeah.
Uh, they're good because you learn to distinguish between the categories that you want to be good at distinguishing between. And PCA doesn't do that. Low-order PCA throws away pieces that are maybe not gonna be helpful, just because they're small, basically.
Right.
But, uh, the problem is, training sets aren't perfect and testing sets are different. So you face the potential problem with discriminative stuff, be it LDA or neural nets, that you are training to discriminate between categories in one space, but what you're really gonna be getting is something else.
Uh-huh.
And so, uh, Stephane's idea was: let's feed in both this discriminatively trained thing and something that's not. So you have a good set of features that everybody's worked really hard to make,
Yeah.
and then, uh, you discriminatively train it, but you also take the path that doesn't have that,
Uh-huh.
and put those in together. So it's kind of like a combination of what, uh, Dan has been calling feature combination versus posterior combination. You have the posterior combination, but then you get the features from that and use them as a feature combination with these other things. And that seemed At least in the last one, as he was just saying, when he only did the discriminative stuff, it actually didn't help at all in this particular case.
Yeah.
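A sketch of the two-path combination being described: the discriminatively trained (tandem) stream is concatenated with a non-discriminative, PCA-transformed copy of the baseline features, so the HMM sees both. The arrays here are stand-ins; `tandem` plays the role of the features from the previous sketch, and `baseline` stands in for the noise-robust "pure features."

```python
import numpy as np
from sklearn.decomposition import PCA

tandem = np.random.randn(2000, 24)    # stand-in: discriminative (tandem) stream
baseline = np.random.randn(2000, 45)  # stand-in: LDA-filtered, normalized cepstra

pure_path = PCA(n_components=28).fit_transform(baseline)  # non-discriminative path
combined = np.hstack([tandem, pure_path])  # (2000, 52): both paths go to the HMM
```

The design intent, as argued above, is a hedge: when train/test mismatch makes the discriminative stream unreliable, the plain stream is still there to fall back on.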
There was enough of a difference, I guess, between the testing and training. But by having them both there The fact is, some of the time the discriminative stuff is gonna help you.
Mm-hmm.
And some of the time it's going to hurt you,
Right.