Schema (one row per sliding window over the transcript):
  prev1, prev2, next1, next2 — string, 1 to 7.9k characters
  is_boundary — bool, 2 classes
  session_id — string, 7 to 14 characters
Session: QMSum_194 (all windows)

Right? In that case you wouldn't necessarily expect it to be better at all.
Oh, yeah, I wasn't necessarily saying it should be better. I'm just surprised that you're getting fifteen percent relative worse on the wel
Uh-huh.
But it's worse.
On the highly mismatched condition.
On the highly mismatch.
Yeah, I
Yeah.
So, "highly mismatched condition" means that in fact your training is a bad estimate of your test.
Uh-huh.
So having having, uh, a g a l a greater number of features, if they aren't maybe the right features that you use, certainly can e can easily, uh, make things worse. I mean, you're right. If you have if you have, uh, lots and lots of data, and you have and your your your training is representative of your test, then getting more sources of information should just help. But but it's It doesn't necessarily work that way.
Huh.
Mm-hmm.
[is_boundary = true for the window spanning the two lines above and the two below; every other window is false]
So I wonder, um, Well, what's your what's your thought about what to do next with it?
Um, I don't know. I'm surprised, because I expected the neural net to help more when there is more mismatch, as it was the case for the
Mm-hmm.
So, was the training set same as the p the February proposal? OK.
Yeah, it's the same training set, so it's TIMIT with the TI-digits', uh, noises, uh, added.
Mm-hmm.
Um
Well, we might uh, we might have to experiment with, uh better training sets. Again. But,
Mm-hmm.
I The other thing is, I mean, before you found that was the best configuration, but you might have to retest those things now that we have different The rest of it is different, right? So, um, uh, For instance, what's the effect of just putting the neural net on without the o other other path?
Mm-hmm.
I mean, you know what the straight features do.
Yeah.
That gives you this. You know what it does in combination.
Mm-hmm.
You don't necessarily know what
What if you did the Would it make sense to do the KLT on the full set of combined features? Instead of just on the
Yeah. I g I guess. Um. The reason I did it this way is that in February, it we we tested different things like that, so, having two KLT, having just a KLT for a network, or having a global KLT.
Oh, I see.
And
So you tried the global KLT before
Well
and it didn't really
Yeah. And, uh, th Yeah.
I see.
The differences between these configurations were not huge, but it was marginally better with this configuration.
Uh-huh. Uh-huh.
But, yeah, that's obviously another thing to try,
Um.
since things are things are different.
Mm-hmm. Mm-hmm.
And I guess if the These are all so all of these seventy-three features are going into, um, the, uh the HMM.
Yeah.
And is are i i are are any deltas being computed of tha of them?
Of the straight features, yeah.
n Not of the
So. But n th the, um, tandem features are u used as they are.
Are not.
So, yeah, maybe we can add some context from these features also as Dan did in in his last work.
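The deltas mentioned here (computed for the straight features but not for the tandem ones) are standard regression coefficients over a short frame context. A minimal sketch, assuming an HTK-style ±2-frame regression window — the window size is not stated in the meeting:

```python
import numpy as np

def deltas(feats: np.ndarray, window: int = 2) -> np.ndarray:
    """Regression (delta) coefficients over a +/-`window` frame span.

    feats: (num_frames, num_dims) feature matrix.
    """
    num_frames, _ = feats.shape
    # Pad by repeating the edge frames so every frame has a full context.
    padded = np.pad(feats, ((window, window), (0, 0)), mode="edge")
    denom = 2 * sum(k * k for k in range(1, window + 1))
    out = np.zeros_like(feats, dtype=float)
    for k in range(1, window + 1):
        out += k * (padded[window + k : window + k + num_frames]
                    - padded[window - k : window - k + num_frames])
    return out / denom

# Deltas of a constant signal are zero; deltas of a unit-slope ramp are one
# in the interior frames.
print(np.allclose(deltas(np.ones((10, 3))), 0))  # True
```

Adding "context" in the sense discussed would mean stacking a few neighboring frames of the tandem outputs (or their deltas) before the HMM, rather than using each frame in isolation.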
Could. i Yeah, but the other thing I was thinking was, um Uh, now I lost track of what I was thinking. But.
What is the You said there was a limit of sixty features or something?
Mm-hmm.
What's the relation between that limit and the, um, forty-eight uh, forty-eight hundred bits per second?
Oh, I know what I was gonna say.
Um, not no relation.
No relation.
So I I I don't understand,
The f the forty-eight hundred bits is for transmission of some features.
because i I mean, if you're only using h
And generally, i it s allows you to transmit like, fifteen, uh, cepstrum.
The issue was that, um, this is supposed to be a standard that's then gonna be fed to somebody's recognizer somewhere which might be, you know, it it might be a concern how many parameters are use u used and so forth. And so, uh, they felt they wanted to set a limit. So they chose sixty. Some people wanted to use hundreds of parameters and and that bothered some other people.
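As rough arithmetic only — the 100 frames/s analysis rate is an assumption, and the actual DSR standard allocates bits through vector quantization rather than uniformly — a 4800 bit/s budget works out to about three bits per coefficient for fifteen cepstra, which is why the transmission constraint and the sixty-feature limit are independent:

```python
# Back-of-envelope bit budget for the 4800 bit/s feature channel.
bitrate = 4800       # bits per second (from the meeting)
frame_rate = 100     # frames per second (assumed, not stated)
num_coeffs = 15      # "like, fifteen cepstrum"

bits_per_frame = bitrate / frame_rate
bits_per_coeff = bits_per_frame / num_coeffs
print(bits_per_frame, bits_per_coeff)  # 48.0 3.2
```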
Uh-huh.
u And so they just chose that. I I I think it's kind of r arbitrary too. But but that's that's kind of what was chosen. I I remembered what I was going to say. What I was going to say is that, um, maybe maybe with the noise removal, uh, these things are now more correlated. So you have two sets of things that are kind of uncorrelated, uh, within themselves, but they're pretty correlated with one another.
Mm-hmm.
And, um, they're being fed into these, uh, variance-only Gaussians and so forth, and and, uh,
Mm-hmm.
so maybe it would be a better idea now than it was before to, uh, have, uh, one KLT over everything, to de-correlate it.
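The "one KLT over everything" idea is, in effect, a single PCA basis fit on the concatenated straight-plus-tandem feature vector, so that correlation *across* the two streams is removed as well before the diagonal-covariance Gaussians. A minimal sketch with synthetic stand-in data (the two correlated streams below are only a stand-in for the real features):

```python
import numpy as np

def fit_klt(feats: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Estimate a KLT (PCA) basis from training features: (mean, eigenvectors)."""
    mean = feats.mean(axis=0)
    cov = np.cov(feats - mean, rowvar=False)
    # Eigenvectors of the covariance, sorted by decreasing eigenvalue.
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    return mean, eigvecs[:, order]

def apply_klt(feats: np.ndarray, mean: np.ndarray, basis: np.ndarray) -> np.ndarray:
    return (feats - mean) @ basis

rng = np.random.default_rng(0)
# Two correlated "streams" stand in for straight features and tandem-net outputs.
straight = rng.normal(size=(1000, 4))
tandem = straight @ rng.normal(size=(4, 3)) + 0.1 * rng.normal(size=(1000, 3))
combined = np.hstack([straight, tandem])

mean, basis = fit_klt(combined)
decorrelated = apply_klt(combined, mean, basis)
# Off-diagonal covariance of the transformed features is ~0.
cov = np.cov(decorrelated, rowvar=False)
off_diag = cov - np.diag(np.diag(cov))
print(np.abs(off_diag).max() < 1e-8)  # True
```

A per-stream KLT, by contrast, would fit and apply this transform to `straight` and `tandem` separately, leaving the cross-stream correlations intact.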
Mm-hmm. Yeah, I see.
Maybe. You know.
What are the SNRs in the training set, TIMIT?
It's, uh, ranging from zero to clean? Yeah. From zero to clean.
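"From zero to clean" refers to the SNR at which the TI-digits noises were mixed into the TIMIT utterances (0 dB up to no added noise). A sketch of the usual mixing procedure, scaling the noise to hit a target SNR — the exact scaling and level conventions used for this training set are not specified in the meeting:

```python
import numpy as np

def add_noise_at_snr(speech: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Scale `noise` so the mixture has the requested SNR (in dB), then add it."""
    speech_power = np.mean(speech ** 2)
    noise_power = np.mean(noise ** 2)
    gain = np.sqrt(speech_power / (noise_power * 10 ** (snr_db / 10)))
    return speech + gain * noise

rng = np.random.default_rng(0)
speech = rng.normal(size=16000)   # stand-in for one second of 16 kHz speech
noise = rng.normal(size=16000)

noisy = add_noise_at_snr(speech, noise, snr_db=0.0)
# Measure the SNR actually achieved by the scaling.
added = noisy - speech
achieved = 10 * np.log10(np.mean(speech ** 2) / np.mean(added ** 2))
print(abs(achieved) < 1e-6)  # True: the mixture is at 0 dB as requested
```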
Mm-hmm.
Yeah. So we found this this, uh this Macrophone data, and so forth, that we were using for these other experiments, to be pretty good.
Mm-hmm.
So that's i after you explore these other alternatives, that might be another way to start looking, is is just improving the training set.
Mm-hmm.
I mean, we were getting, uh, lots better recognition using that, than Of course, you do have the problem that, um, u i we are not able to increase the number of Gaussians, uh, or anything to, uh, uh, to match anything. So we're only improving the training of our feature set, but that's still probably something.
So you're saying, add the Macrophone data to the training of the neural net? The tandem net?
Yeah, that's the only place that we can train.
Yeah.
We can't train the other stuff with anything other than the standard amount,
Right.
so. Um, um
What what was it trained on again? The one that you used?
It's TIMIT with noise.
Uh-huh.
Yeah.
So, yeah, it's rather a small
How big is the net, by the way?
Um, Uh, it's, uh, five hundred hidden units. And
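For reference, the tandem front end under discussion is an MLP with 500 hidden units whose (log-)phone posteriors are taken as features for the HMM. A toy forward pass with random, untrained weights; the 9-frame × 13-cepstra input context and the 46 phone targets are illustrative assumptions, not values from the meeting:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes: 9 frames of 13 cepstra as input context (assumed),
# 500 hidden units (from the meeting), 46 phone targets (assumed).
n_in, n_hidden, n_phones = 9 * 13, 500, 46
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_phones))
b2 = np.zeros(n_phones)

def tandem_features(x: np.ndarray) -> np.ndarray:
    """One forward pass: sigmoid hidden layer, softmax output, log-posteriors."""
    h = 1.0 / (1.0 + np.exp(-(x @ W1 + b1)))
    logits = h @ W2 + b2
    logits -= logits.max(axis=-1, keepdims=True)  # numerical stability
    post = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
    # Log-posteriors are what a tandem system typically feeds to a KLT next.
    return np.log(post)

x = rng.normal(size=(4, n_in))  # a batch of 4 context windows
feats = tandem_features(x)
print(feats.shape)  # (4, 46)
```

The point of the exchange about network size is that 500 hidden units sat near a plateau: shrinking the net hurt, while growing it bought little.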
And again, you did experiments back then where you made it bigger and it and that was that was sort of the threshold point. Much less than that, it was worse,
Yeah.
and
Yeah.
much more than that, it wasn't much better. Hmm.
Yeah. @ @ ?
So is it is it though the performance, big relation in the high ma high mismatch has something to do with the, uh, cleaning up that you that is done on the TIMIT after adding noise?
So it's i All the noises are from the TI-digits,
Yeah.
right? So you i