Dataset preview: topic-boundary detection over meeting-transcript utterances.
Columns: prev1 (string, 1–7.9k chars), prev2 (string, 1–7.9k chars, nullable ⌀), next1 (string, 1–7.9k chars), next2 (string, 1–7.9k chars, nullable ⌀), is_boundary (bool, 2 classes), session_id (string, 7–14 chars).
All of the sample below comes from session QMSum_194.
Mm-hmm.
And some of the time it's going to hurt you,
Right.
and by combining two information sources if, you know if if
So you wouldn't necessarily then want to do LDA on the non-tandem features because now you're doing something to them that
That i i I think that's counter to that idea.
Yeah, right.
Now, again, it's we're just trying these different things. We don't really know what's gonna work best. But if that's the hypothesis, at least it would be counter to that hypothesis to do that.
Right.
Um, and in principle you would think that the neural net would do better at the discriminant part than LDA.
Right. Yeah. Well y
Though, maybe not.
Yeah. Exactly. I mean, we uh we were getting ready to do the tandem, uh, stuff for the Hub-five system, and, um, Andreas and I talked about it, and the idea w the thought was, "Well, uh, yeah, that i you know th the neural net should be better, but we should at least have, uh, a number, you know, to show that we did try the LDA in place of the neural net, so that we can, you know, show a clear path.
Right.
You know, that you have it without it, then you have the LDA, then you have the neural net, and you can see, theoretically. So. I was just wondering I I
Well, I think that's a good idea.
Yeah.
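The "try LDA in place of the neural net" baseline discussed here has a closed-form version: Fisher LDA estimated from class-labeled feature vectors. The sketch below uses invented data, class counts, and dimensionalities (it is not the Hub-five setup); it only illustrates the linear discriminant transform itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for frame-level acoustic features: three
# phone-like classes, 20-dimensional vectors (all sizes invented).
n_classes, dim, per_class = 3, 20, 200
means = rng.normal(0.0, 2.0, size=(n_classes, dim))
X = np.vstack([m + rng.normal(size=(per_class, dim)) for m in means])
y = np.repeat(np.arange(n_classes), per_class)

# Fisher LDA: within-class scatter Sw and between-class scatter Sb.
mu = X.mean(axis=0)
Sw = np.zeros((dim, dim))
Sb = np.zeros((dim, dim))
for c in range(n_classes):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)
    Sw += (Xc - mc).T @ (Xc - mc)
    Sb += len(Xc) * np.outer(mc - mu, mc - mu)

# Discriminant directions: leading eigenvectors of Sw^-1 Sb
# (at most n_classes - 1 of them carry class information).
evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
order = np.argsort(evals.real)[::-1][: n_classes - 1]
W = evecs.real[:, order]

Z = X @ W  # linearly transformed, discriminant features
print(Z.shape)  # (600, 2)
```

The point of the baseline is exactly this: the transform is linear and closed-form, so any gain the tandem neural net shows beyond it can be credited to the nonlinearity.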
Did did you do that
Um. No.
or tha that's a
That's what that's what we're gonna do next as soon as I finish this other thing. So.
Yeah. Yeah. No, well, that's a good idea. I I
We just want to show.
i Yeah.
I mean, it everybody believes it,
Oh, no it's a g
but you know, we just
No, no, but it might not not even be true.
Yeah.
I mean, it's it's it's it's it's a great idea. I mean, one of the things that always disturbed me, uh, in the the resurgence of neural nets that happened in the eighties was that, um, a lot of people Because neural nets were pretty easy to to use a lot of people were just using them for all sorts of things without, uh, looking at all into the linear, uh uh, versions of them.
Yeah. Mm-hmm. Yeah.
And, uh, people were doing recurrent nets but not looking at IIR filters, and You know, I mean, uh, so I think, yeah, it's definitely a good idea to try it.
Yeah, and everybody's putting that on their systems now, and so, I that's what made me wonder about this,
Well, they've been putting them in their systems off and on for ten years,
but.
but but but, uh,
Yeah, what I mean is it's it's like in the Hub-five evaluations, you know, and you read the system descriptions and everybody's got, you know, LDA on their features.
And now they all have that. I see.
And so.
Yeah.
Uh.
It's the transformation they're estimating on Well, they are trained on the same data as the final HMM are.
Yeah, so it's different. Yeah, exactly. Cuz they don't have these, you know, mismatches that that you guys have.
Mm-hmm.
So that's why I was wondering if maybe it's not even a good idea.
Mm-hmm.
I don't know. I I don't know enough about it,
Mm-hmm.
but Um.
I mean, part of why I I think part of why you were getting into the KLT Y you were describing to me at one point that you wanted to see if, uh, you know, getting good orthogonal features was and combining the the different temporal ranges was the key thing that was happening or whether it was this discriminant thing, right? So you were just trying I think you r I mean, this is it doesn't have the LDA aspect but th as far as the orthogonalizing transformation, you were trying that at one point, right?
Mm-hmm.
I think you were.
Mm-hmm. Yeah.
Does something. It doesn't work as well. Yeah. Yeah.
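The KLT (orthogonalizing transformation) mentioned here is, in this form, just an eigendecomposition of the feature covariance: it decorrelates the combined features without using any class labels, which is exactly what separates it from the discriminant (LDA or neural-net) variants. A minimal numpy sketch with invented data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical correlated feature stream (e.g. features concatenated
# from different temporal ranges); all sizes here are invented.
n, dim = 500, 12
A = rng.normal(size=(dim, dim))
X = rng.normal(size=(n, dim)) @ A  # columns are correlated

# KLT: eigendecomposition of the sample covariance. Projecting onto
# the eigenvectors orthogonalizes (decorrelates) the features; no
# class labels, hence no discriminant aspect.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (n - 1)
evals, evecs = np.linalg.eigh(cov)
Z = Xc @ evecs

# The covariance of the transformed features is (numerically) diagonal.
cov_z = Z.T @ Z / (n - 1)
off_diag = cov_z - np.diag(np.diag(cov_z))
print(np.max(np.abs(off_diag)) < 1e-8)  # True
```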
[topic boundary: is_boundary = true]
So, yeah, I've been exploring a parallel VAD without neural network with, like, less latency using SNR and energy, um, after the cleaning up. So what I'd been trying was, um, uh After the b after the noise compensation, n I was trying t to f find a f feature based on the ratio of the energies, that is, cl after clean and before clean. So that if if they are, like, pretty c close to one, which means it's speech. And if it is n if it is close to zero, which is So it's like a scale @ @ probability value. So I was trying, uh, with full band and multiple bands, m ps uh separating them to different frequency bands and deriving separate decisions on each bands, and trying to combine them. Uh, the advantage being like it doesn't have the latency of the neural net if it if it can
Mm-hmm.
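The energy-ratio feature described above can be sketched as follows. The signal values and the 0.5 threshold are invented for illustration; finding a threshold that transfers across databases is, as the discussion goes on to note, the actual difficulty.

```python
import numpy as np

# Toy illustration of the energy-ratio VAD feature: a noisy frame
# sequence with speech in the middle, a "cleaned" version with the
# noise floor removed, and the per-frame ratio
# cleaned_energy / noisy_energy used as a speech score.
n_frames = 100
noise_energy = np.full(n_frames, 1.0)
speech_energy = np.zeros(n_frames)
speech_energy[40:60] = 20.0           # speech region
noisy = noise_energy + speech_energy  # energy before noise compensation
cleaned = speech_energy + 0.05        # after (imperfect) noise removal

# ~1 where speech dominates, ~0 in noise-only frames: a scaled
# probability-like value, as described above.
ratio = cleaned / noisy
is_speech = ratio > 0.5  # the threshold is the hard part in practice

print(is_speech[50], is_speech[10])  # True False
```

The multi-band variant would compute the same ratio per frequency band and combine the per-band decisions, e.g. by voting.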
g And it gave me like, uh, one point One more than one percent relative improvement. So, from fifty-three point six it went to fifty f four point eight. So it's, like, only slightly more than a percent improvement,
Mm-hmm.
just like Which means that it's it's doing a slightly better job than the previous VAD,
Mm-hmm.
uh, at a l lower delay.
Mm-hmm.
Um, so, um
But i d I'm sorry,
so u
does it still have the median filter stuff?
It still has the median filter.
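The median-filter smoothing being asked about can be sketched as a centered running median over the binary frame decisions: isolated flips are removed, at the cost of lookahead latency (a centered filter of length L needs L//2 future frames; at an assumed 10 ms frame step, the hundred-millisecond smoothing discussed below would correspond to roughly ten frames of lookahead). The filter length here is an invented parameter, not the system's actual setting.

```python
import numpy as np

def median_smooth(decisions, length=11):
    """Centered running median over binary VAD frame decisions."""
    d = np.asarray(decisions, dtype=int)
    half = length // 2
    padded = np.pad(d, half, mode="edge")  # replicate edges
    return np.array([
        int(np.median(padded[i:i + length])) for i in range(len(d))
    ])

# Isolated spurious frames (the lone 1 at index 2, the lone 0 at
# index 11) are smoothed away.
raw = np.array([0, 0, 1, 0, 0, 0, 1, 1, 1, 1, 1, 0, 1, 1, 0, 0])
smooth = median_smooth(raw, length=5)
print(smooth.tolist())  # [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0]
```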
so u
|
does it still have the median filter stuff ?
|
It still has the median filter.
|
So it still has most of the delay ,
| false |
QMSum_194
|
does it still have the median filter stuff ?
|
It still has the median filter.
|
So it still has most of the delay ,
|
So
| false |
QMSum_194
|
It still has the median filter.
|
So it still has most of the delay ,
|
So
|
it just doesn't
| false |
QMSum_194
|
So it still has most of the delay ,
|
So
|
it just doesn't
|
Yeah , so d with the delay , that 's gone is the input , which is the sixty millisecond. The forty plus twenty.
| false |
QMSum_194
|
So
|
it just doesn't
|
Yeah , so d with the delay , that 's gone is the input , which is the sixty millisecond. The forty plus twenty.
|
Well , w i
| false |
QMSum_194
|
it just doesn't
|
Yeah , so d with the delay , that 's gone is the input , which is the sixty millisecond. The forty plus twenty.
|
Well , w i
|
At the input of the neural net you have this , uh , f nine frames of context plus the delta.
| false |
QMSum_194
|
Yeah , so d with the delay , that 's gone is the input , which is the sixty millisecond. The forty plus twenty.
|
Well , w i
|
At the input of the neural net you have this , uh , f nine frames of context plus the delta.
|
Oh , plus the delta ,
| false |
QMSum_194
|
Well , w i
|
At the input of the neural net you have this , uh , f nine frames of context plus the delta.
|
Oh , plus the delta ,
|
Mm - hmm.
| false |
QMSum_194
|
At the input of the neural net you have this , uh , f nine frames of context plus the delta.
|
Oh , plus the delta ,
|
Mm - hmm.
|
right. OK.
| false |
QMSum_194
|
Oh , plus the delta ,
|
Mm - hmm.
|
right. OK.
|
Yeah. So that delay , plus the LDA.
| false |
QMSum_194
|
Mm - hmm.
|
right. OK.
|
Yeah. So that delay , plus the LDA.
|
Mm - hmm.
| false |
QMSum_194
|
right. OK.
|
Yeah. So that delay , plus the LDA.
|
Mm - hmm.
|
Uh , so the delay is only the forty millisecond of the noise cleaning , plus the hundred millisecond smoothing at the output.
| false |
QMSum_194
|
Yeah. So that delay , plus the LDA.
|
Mm - hmm.
|
Uh , so the delay is only the forty millisecond of the noise cleaning , plus the hundred millisecond smoothing at the output.
|
Mm - hmm. Mm - hmm.
| false |
QMSum_194
|
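The latency bookkeeping in this exchange adds up as follows. The figures are the ones quoted in the discussion; the unquantified LDA delay is left out, and the attribution of the sixty milliseconds to the nine-frame context window plus deltas is as stated above.

```python
# Delay components quoted in the discussion (milliseconds).
noise_cleaning_ms = 40      # noise compensation
nn_input_context_ms = 40 + 20  # "the forty plus twenty": nine frames
                               # of context plus the delta computation
output_smoothing_ms = 100   # median smoothing at the output

# Neural-net VAD pays for its input context (plus an unquantified LDA
# delay, omitted here); the SNR/energy VAD drops that term.
nn_vad_latency = noise_cleaning_ms + nn_input_context_ms + output_smoothing_ms
snr_vad_latency = noise_cleaning_ms + output_smoothing_ms

print(nn_vad_latency, snr_vad_latency)  # 200 140
```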
Um. So. Yeah. So the the di the biggest The problem f for me was to find a consistent threshold that works well across the different databases, because I t I try to make it work on tr SpeechDat-Car
Mm-hmm.
and it fails on TI-digits, or if I try to make it work on that it's just the Italian or something, it doesn't work on the Finnish.
Mm-hmm.
So, um. So there are there was, like, some problem in balancing the deletions and insertions when I try different thresholds.
Mm-hmm.
So The I'm still trying to make it better by using some other features from the after the p clean up maybe, some, uh, correlation auto-correlation or some s additional features of to mainly the improvement of the VAD. I've been trying.
Now this this this, uh, "before and after clean", it sounds like you think that's a good feature. That that, it you th think that the, uh the i it appears to be a good feature, right?
Mm-hmm.
What about using it in the neural net?
Yeah.
Yeah, eventually we could could just
Yeah, so Yeah, so that's the Yeah. So we've been thinking about putting it into the neural net also.
Yeah.
Because they did that itself
Then you don't have to worry about the thresholds and
There's a threshold and Yeah.
Yeah.
but just
Yeah. So that that's, uh
Yeah. So if we if we can live with the latency or cut the latencies elsewhere, then then that would be a, uh, good thing.
Yeah. Yeah.
Um, anybody has anybody you guys or or Naren, uh, somebody, tried the, uh, um, second th second stream thing? Uh.