prev1: string (lengths 1 to 7.9k)
prev2: string (lengths 1 to 7.9k)
next1: string (lengths 1 to 7.9k)
next2: string (lengths 1 to 7.9k)
is_boundary: bool (2 classes)
session_id: string (lengths 7 to 14)
Yeah. So if we if we can live with the latency or cut the latencies elsewhere , then then that would be a , uh , good thing.
Yeah. Yeah.
Um , anybody has anybody you guys or or Naren , uh , somebody , tried the , uh , um , second th second stream thing ? Uh.
Oh , I just I just h put the second stream in place and , uh ran one experiment , but just like just to know that everything is fine.
false
QMSum_194
Uh - huh.
So it was like , uh , forty - five cepstrum plus twenty - three mel log mel.
Yeah.
And and , just , like , it gave me the baseline performance of the Aurora , which is like zero improvement.
Yeah. Yeah.
So I just tried it on Italian just to know that everything is But I I didn't export anything out of it because it was , like , a weird feature set.
Yeah.
So.
Yeah. Well , what I think , you know , would be more what you 'd want to do is is is , uh , put it into another neural net. Right ?
Mm - hmm.
Yeah , yeah , yeah , yeah.
And then But , yeah , we 're we 're not quite there yet. So we have to figure out the neural nets , I guess.
Yeah.
The uh , other thing I was wondering was , um , if the neural net , um , has any because of the different noise con unseen noise conditions for the neural net , where , like , you train it on those four noise conditions , while you are feeding it with , like , a additional some four plus some f few more conditions which it hasn't seen , actually ,
Mm - hmm.
from the f f while testing.
Yeah , yeah. Right.
Um instead of just h having c uh , those cleaned up t cepstrum , sh should we feed some additional information , like The the We have the VAD flag. I mean , should we f feed the VAD flag , also , at the input so that it it has some additional discriminating information at the input ?
Hmm - hmm ! Um
Wh - uh , the the VAD what ?
We have the VAD information also available at the back - end.
Uh - huh.
So if it is something the neural net is not able to discriminate the classes
Yeah.
I mean Because most of it is sil I mean , we have dropped some silence f We have dropped so silence frames ?
Mm - hmm.
No , we haven't dropped silence frames still.
Uh , still not. Yeah.
Yeah. So
Th
the b b biggest classification would be the speech and silence. So , by having an additional , uh , feature which says " this is speech and this is nonspeech " , I mean , it certainly helps in some unseen noise conditions for the neural net.
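The idea proposed here, giving the net an explicit speech/nonspeech cue alongside the cleaned-up cepstra, amounts to appending the per-frame VAD value as one extra input dimension. A minimal NumPy sketch, under that assumption (all names and shapes here are hypothetical, not from the original system):

```python
import numpy as np

def add_vad_input(features, vad_prob):
    """Append a per-frame VAD value as one extra input dimension.

    features: (n_frames, n_dims) array of per-frame features
    vad_prob: (n_frames,) array of speech probabilities (or 0/1 flags)
    """
    return np.hstack([features, vad_prob[:, None]])

# e.g. a hypothetical 45-dimensional cepstral stream becomes a
# 46-dimensional net input once the VAD flag is appended
feats = np.random.randn(100, 45)
vad = (np.random.rand(100) > 0.5).astype(float)
net_input = add_vad_input(feats, vad)
assert net_input.shape == (100, 46)
```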
What Do y do you have that feature available for the test data ?
Well , I mean , we have we are transferring the VAD to the back - end feature to the back - end. Because we are dropping it at the back - end after everything all the features are computed.
Oh , oh , I see.
So
I see.
so the neural so that is coming from a separate neural net or some VAD.
OK. OK.
Which is which is certainly giving a
So you 're saying , feed that , also , into the neural net.
to Yeah. So it it 's an additional discriminating information.
Yeah. Yeah. Right.
So that
You could feed it into the neural net. The other thing you could do is just , um , p modify the , uh , output probabilities of the of the , uh , uh , um , neural net , tandem neural net , based on the fact that you have a silence probability.
Mm - hmm.
Right ?
Mm - hmm.
So you have an independent estimator of what the silence probability is , and you could multiply the two things , and renormalize.
Yeah.
Uh , I mean , you 'd have to do the nonlinearity part and deal with that. Uh , I mean , go backwards from what the nonlinearity would , you know would be.
Through t to the soft max.
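The combination scheme discussed above (scale the tandem net's posteriors by an independent silence estimate, renormalize, then go back through the nonlinearity by taking logs) can be sketched as follows. This is an illustrative reading of the idea, not the actual system; the silence-class index and the log step as the "inverse softmax" are assumptions:

```python
import numpy as np

def combine_with_vad(posteriors, p_sil, sil_idx=0, eps=1e-10):
    """Multiply tandem posteriors by an independent silence/speech
    estimate, renormalize, and return log-domain values.

    posteriors: (n_frames, n_classes), rows sum to 1 (softmax outputs)
    p_sil:      (n_frames,) independent silence probability from a VAD
    sil_idx:    index of the silence class (assumed to be 0 here)
    """
    scaled = posteriors.copy()
    scaled[:, sil_idx] *= p_sil                        # weight silence class
    speech = np.arange(scaled.shape[1]) != sil_idx
    scaled[:, speech] *= (1.0 - p_sil)[:, None]        # weight speech classes
    scaled /= scaled.sum(axis=1, keepdims=True) + eps  # renormalize
    # "going backwards through the softmax": take logs so the result
    # looks like pre-nonlinearity values for downstream processing
    return np.log(scaled + eps)
```

For example, a frame with posteriors [0.5, 0.3, 0.2] and an independent silence probability of 0.1 ends up with its silence posterior pushed down from 0.5 to 0.1 after renormalization, which is the point: the VAD evidence reweights the net's own decision rather than replacing it.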
But but , uh
Yeah , so maybe , yeah , when
But in principle wouldn't it be better to feed it in ? And let the net do that ?
Well , u Not sure.
Hmm.
I mean , let 's put it this way. I mean , y you you have this complicated system with thousands and thousand parameters
Yeah.
and you can tell it , uh , " Learn this thing. " Or you can say , " It 's silence ! Go away ! " I mean , I mean , i Doesn't ? I think I think the second one sounds a lot more direct.
What what if you
Uh.
Right. So , what if you then , uh since you know this , what if you only use the neural net on the speech portions ?
Well , uh ,
That 's what
Well , I guess that 's the same. Uh , that 's similar.
Yeah , I mean , y you 'd have to actually run it continuously ,
But I mean I mean , train the net only on
but it 's @ @ Well , no , you want to train on on the nonspeech also , because that 's part of what you 're learning in it , to to to generate , that it 's it has to distinguish between.
Speech.
But I mean , if you 're gonna if you 're going to multiply the output of the net by this other decision , uh , would then you don't care about whether the net makes that distinction , right ?
Well , yeah. But this other thing isn't perfect.
Ah.
So that you bring in some information from the net itself.
Right , OK. That 's a good point.
Yeah. Now the only thing that that bothers me about all this is that I I I The the fact i i It 's sort of bothersome that you 're getting more deletions.
Yeah. But So I might maybe look at , is it due to the fact that um , the probability of the silence at the output of the network , is , uh ,
Is too high.
too too high or
Yeah. So maybe So
If it 's the case , then multiplying it again by i by something ?
It may not be it
Yeah.
Mm - hmm.
Yeah , it it may be too it 's too high in a sense , like , everything is more like a , um , flat probability.
Yeah.
Oh - eee - hhh.
So , like , it 's not really doing any distinction between speech and nonspeech
Uh , yeah.
or , I mean , different among classes.
Yeah.
Mm - hmm.
Be interesting to look at the Yeah , for the I wonder if you could do this. But if you look at the , um , highly mism high mismat the output of the net on the high mismatch case and just look at , you know , the distribution versus the the other ones , do you do you see more peaks or something ?
Yeah. Yeah , like the entropy of the the output ,
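The diagnostic suggested here, comparing how peaked versus flat the net's output distributions are on the high-mismatch case, is naturally measured as per-frame entropy of the posteriors. A small sketch of that idea (the function and its use are illustrative, not part of the original system):

```python
import numpy as np

def posterior_entropy(posteriors, eps=1e-10):
    """Per-frame entropy of the net's output distribution, in bits.

    High average entropy means flat posteriors (the net is not
    discriminating); low entropy means peaked, confident outputs.
    """
    p = posteriors + eps  # avoid log(0)
    return -(p * np.log2(p)).sum(axis=1)

# e.g. compare mean entropy on a matched test set versus the
# high-mismatch set: a jump in entropy would indicate the net is
# producing near-flat distributions on the unseen noise conditions
```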
Yeah.
Yeah , for instance.
or
But I bu
It it seems that the VAD network doesn't Well , it doesn't drop , uh , too many frames because the dele the number of deletion is reasonable. But it 's just when we add the tandem , the final MLP , and then
Yeah. Now the only problem is you don't want to ta I guess wait for the output of the VAD before you can put something into the other system ,