---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:295744
- loss:MultipleNegativesRankingLoss
base_model: intfloat/e5-large-unsupervised
widget:
- source_sentence: 'query: NH농협금융지주 임원후보추천위원회는 금융지주 대표이사 회장 최종후보로 누구를 추천했나?'
sentences:
- 'passage: 검찰이 고용노동부가 삼성전자서비스의 불법파견 여부를 조사할 당시 고위 공무원들이 조사과정에 부당하게 개입했다는 의혹에 대해
강제수사에 나섰다. 서울중앙지검 공공형사수사부(김성훈 부장검사)는 13일 오전 정부세종청사에 있는 노동부 노동정책실 노사협력정책과 등지에 검사와
수사관들을 보내 삼성전자서비스 근로감독 관련 기록과 컴퓨터 하드디스크 등을 확보하고 있다. 노동부는 삼성전자서비스 AS센터에 대해 불법파견
의혹이 제기되자 2013년 6∼7월 수시 근로감독을 벌였다. 노동부는 근로감독을 한 차례 연장한 끝에 같은 해 9월 불법파견이 아니라는 결론을
내렸다. 이 과정에서 정현옥 당시 차관이 노동정책실장에게 노동부 출신 삼성전자 핵심인사와 접촉을 지시하는 등 노동부 고위직들이 삼성 측과 유착해
근로감독 결과에 부당한 영향력을 행사했다는 의혹이 제기됐다. 검찰은 이런 의혹을 뒷받침하는 정황을 일부 포착하고 이날 강제수사에 나선 것으로
전해졌다. 고용노동행정개혁위원회는 최근 이런 내용의 조사결과를 발표하고 검찰 수사를 촉구했다. 삼성전자서비스 노조 등은 정 전 차관 등 노동부
전.현직 공무원 11명을 직권남용권리행사방해 등 혐의로 검찰에 고발한 바 있다.'
- 'passage: 2018 평창동계올림픽이 열리는 2월 한 달 간 개최지인 강원지역 고속도로와 국도 등에서 각국 선수단과 임원 등 관계자들의
이동을 돕기 위해 올림픽 전용차로(O·L :Olympic Lane)가 운용된다. 21일 도로교통공단에 따르면 평창올림픽(2월9∼25일) 기간을
낀 2월1∼28일 영동고속도로 인천방향 대관령나들목(IC)→강릉분기점(JC) 약 19.8㎞ 구간, 국도 6호선·지방도 456호선 태기사거리∼대관령IC
39.6㎞ 구간 등 전체 59.4㎞ 구간 1차로에 올림픽 전용차로가 설정된다. 차로 운용 시간대는 오전 7시부터 다음날 오전 1시까지다. 올림픽에
이은 동계 패럴림픽(3월9∼18일) 기간에는 영동고속도로 구간만 전용차로를 둔다. 이 기간 고속도로 구간 전용차로는 9인승 이상 승용·승합자동차와
올림픽조직위원회 행사차량임을 표시한 차량만 운행할 수 있다. 국도·지방도 구간 전용차로는 올림픽 차량과 36인승 이상 대형 승합차량 등만 이용
가능하다. 이를 제외한 일반 차량의 전용차로 통행은 제한되며, 위반하면 승합차 7만원·승용차 6만원의 범칙금과 벌점 30점이 부과된다. 평창군에
있는 장평터널과 속사터널은 올림픽 기간에는 전용차로로만 운용되므로 일반 차량은 다른 길로 우회해야 한다. 올림픽 전용차로 구간에는 운전자 안내를
위해 표지판을 설치하고, 도로 위에 전용차로임을 뜻하는 청색 점선과 오륜마크를 표기한다.'
- 'passage: 김광수 전 금융정보분석원(FIU) 원장(사진)이 NH농협금융지주 회장에 내정됐다. 19일 NH농협금융지주 임원후보추천위원회는
김광수 전 원장을 금융지주 대표이사 회장 최종후보로 추천했다고 밝혔다. 당초 후보자 3명에 대해 심층면접을 거쳐 최종 후보자를 추천할 계획이었으나
윤용로 현 코람코자산신탁 회장의 후보 고사와 당일 김용환 현 회장의 후보 사퇴로 김광수 단독 후보자에 대해 심층면접을 했다. 심층면접 결과
농협금융 임추위는 만장일치로 김 전 원장을 최종 후보로 결정하고 주주총회에 추천하기로 결정했다. 김 전 원장은 행시 27회로 관직에 입문한
옛 재정경제원 관료 출신으로 현 정부 들어 금융위원장, 금융감독원장 등으로 하마평에 올랐던 인물이다.'
- source_sentence: 'query: 비욘세는 아이가 몇이야?'
sentences:
- 'passage: 두산중공업은 한국중부발전이 발주한 신서천 화력발전소 탈황설비(FGD, Flue Gas Desulfurization) 공사를
약 1240억원에 수주했다고 27일 밝혔다. 신서천 화력발전소는 충남 서천군 서면 마량리에 건설중인 1000MW급 발전소다. 두산중공업은 지난
2015년 이 발전소와 터빈.발전기 공급계약을 맺은 데 이어, 금호산업과 컨소시엄을 이뤄 탈황설비 공사를 수행하게 됐다. 두산중공업이 맡은
탈황설비 기자재 공급 금액은 700억원 규모다. 탈황설비는 화력발전소의 배기가스 내 대기오염 원인 물질인 황산화물(SOx)을 제거함으로써 발전소
굴뚝에서 나오는 연기를 수증기 수준으로 정화시켜주는 친환경 설비다.'
- 'passage: {IMG:1}더불어민주당 오중기 경북도지사 후보가 2일 포항에서 선거사무실에서 비전선포식을 갖고 한반도 핵심은 경북이라고
강조했다. 오 후보는 비전 선포식 자리에서 남북정상회담의 성공적 개최로 향후 남북 경제협력에 있어 핵심적인 역할을 할 지역이 포항을 중심으로
한 동해안이다며 동해안 일대는 ‘한반도 신 경제지도’를 수행할 전력, 해양 등 인프라가 구축돼 있어 남북경협을 주도할 위치에 있다고 밝혔다.
또, 한반도 신경제지도의 핵심은 남북 철도, 도로를 이어 한반도에 H라인을 구축, 대륙과 해양으로 뻗어나갈 인프라를 조성하는 것이다며 동해안
지역은 부산~금강산~원산~나진으로 연결되는 ‘동해안벨트’에 포함된다고 설명했다. 특히,동해선을 통해 남북철도가 연결되면 시베리아 횡단철도(TSR)와
연계해 남한의 물류가 유라시아까지 공급될 수 있다”며 정상회담 성공적 개최에 따른 기대감을 드러냈다. {IMG:2}오 후보는 이와 함께 문재인
정부의 국정철학인 지역균형발전과 연계한 지역발전 전략으로 5대 정책과 15대 실천 방안을 내놓았다. 5대 정책은 대기업과 상생협력 생태계 조성.청년
창업 메카 실현, 전통역사와 문화자원을 활용한 인간 중심의 문화 르네상스, 인구소멸 위기의 대응, 경북의 미래 먹거리 창출과 안전한 경북 등이다.
이를 위한 15대 실천방안으로는 원전해체 연구센터를 포함한 국제 원자력 안전 연구단지 조성과 연구·교육·체험 산업 클러스터 조성, 백두대간
임업 클러스터구축, 남북 6축 고속도로 조기 착공, 미래형 자동차 철강소재 부품산업 벨트육성 등을 제시했다. 이에 앞서 오 후보는 지난달 30일
경북 영덕군 강구항에서 남북정상회담의 성공을 축하하고 후속조치의 중요성을 강조하는 2018 남북정상회담과 경북의 역할이란 제목으로 기자회견을
가졌다. 오 후보는 청와대 균형발전실 선임행정관으로 국정에 참여했고 중앙당 지도부 경험이 있는 저 오중기가 도민들과 더불어 경북 미래의 문을
여는 열쇠가 되겠다. 문재인 정부의 국정철학을 경북에서 완성하겠다고 말했다.'
- 'passage: 아이와 함께 특별한 행사에 참석하는 한 엄마를 상상해보시라. 아이 뒤치다꺼리를 해야 할 지도 모른다는 생각에 되도록 편한
옷을 입을 거라고 생각하게 된다. 노출이 있는 옷은 언감생심이다. 아이를 곁에 둔 엄마와 ‘섹시’는 어울려서는 안 되는 조합같이 느껴지기 때문이다.
그러나 팝스타 비욘세의 선택은 달랐다. 통념을 완벽하게 깬 드레스를 입고 아이와 함께 행사장에 등장했다. 비욘세 28일(현지시간) 뉴욕 매디슨
스퀘어 가든에서 열린 제60회 그래미 시상식에서 남편 제이지와 딸 블루 아이비와 함께했다. 비욘세, 제이지 부부는 할리우드 여배우들의 성폭력
고발 캠페인 ‘미투’에 동참한 듯 모두 검은색 의상을 입었다. 6살 된 딸은 흰색 긴 재킷을 입었다. 특히 아이 옆에 앉은 비욘세의 드레스는
시선을 강탈하기 충분했다. 자리에 앉아 있을 때 양옆으로 쫙 벌어지는 긴 드레스였다. 허벅지까지 노출되는 과감한 드레스였다. 아이와 동석한
행사에서 나온 모습이라고 상상하기 어렵다. 비욘세는 지난해 6월 낳은 쌍둥이에 바로 옆에 동석한 딸까지, 세 아이의 엄마다.'
- source_sentence: 'query: 2017년 말산업실태조사 결과에 따르면 승마시설 수는?'
sentences:
- 'passage: 승마시설 512곳으로 늘어.. 말 사육 수 2만7210마리 귀족 레포츠라는 이미지가 짙은 승마를 즐기는 국민이 늘고 있다.
승마인구 증가로 말산업 또한 확대되면서 말산업으로 인한 경제적 효과는 3조4000억원에 이르는 것으로 분석됐다. 6일 농림축산식품부가 발표한
2017년 말산업실태조사 결과에 따르면 승마시설 수는 전년 대비 6.9% 증가한 512개소다. 이는 2012~2016년 제1차 말산업육성 종합계획상
목표인 500개소를 초과 달성한 수치다. 말산업 사업체 수는 전년 대비 8.4% 증가한 2470개소다. 이 중 말 보유 사업체 수는 9.5%
증가한 2146개소다. 말 사육마릿수는 전년 대비 0.3% 증가한 2만7210마리다. 사육용도는 승용이 1만1458마리(42.1%)로 가장
많고 경주용 7303마리(26.8%), 번식용 4549마리(16.7%), 육용 865마리(3.2%) 등이다. 특히 승마인구는 1만292명 증가한
94만8714명으로 지속적인 성장 추세다. 이 중 정기적으로 승마를 하는 즐기는 정기 승마인구는 전년 대비 3.9% 증가한 4만9312명이다.
체험 승마인구는 전년 대비 0.8% 증가한 89만9402명이다. 말산업 종사자는 전년 대비 0.1% 증가한 6946명이다. 다만 경마분야 종사자는
전년 대비 4.2% 감소한 9315명을 기록했다. 이는 무인발매기 도입으로 인한 발매인원이 감소한 데 따른 것으로 분석됐다.'
- 'passage: {IMG:1}블랙리스트 타파와 공공성 회복을 위한 연극인회의(블랙타파)가 한국연극협회 이사장을 향해 재차 사퇴할 것을 촉구했다.
블랙타파는 1일 정대경 이사장은 더 이상 거짓말 하지 말고 사퇴하라는 논평을 통해, 정 이사장이 언론 등을 통해 밝힌 해명이 거짓이라고 지적했다.
지난달 29일 블랙타파가 한국연극협회 이사회는 정대경 이사장을 탄핵하라! 는 제목의 성명서를 발표하자, 이후 정 이사장은 언론을 통해 예술위
안에서 지원 배제를 막기 위해 가장 열심히 싸웠다 블랙리스트의 존재를 알고부터 이를 원만하게 해결하게 위해 최선을 다했는데 억울하다고 호소하며,
자신사퇴를 거부했다. 블랙타파는 정 이사장의 입장은 이미 밝혀진 사실을 왜곡하는 행위라고 말하는 동시에 언론을 향해 더 이상 정대경 이사장의
거짓말을 실어나르지 말라고 주장했다. 블랙리스트 진상조사 및 제도개선위원회의 진상조사(진상조사위) 결과, 정 이사장은 한국문화예술위원회 위원으로서
임무를 수행하던 기간에 블랙리스트 실행을 묵인 방조하고, 블랙리스트 피해자 배제에 깊이 관여한 것으로 드러났다. 하지만 연극인이 더욱 분노하는
지점은 그가 한국연극협회 이사장 선거에 문체부 등에 요청하여 관권을 개입시킨 범죄사실이다. 블랙타파는 진상조사위가 지난 8일 발표한 종합결과보고
자료집에는 정대경 이사장이 예술위 위원의 직위를 이용해 문체부, 예술위에 선거 개입을 요구한 사실이 적시되어 있다며 정 이사장의 선거개입 요청에
대한 다수의 진술이 있으며, 정대경 이사장은 블랙리스트 실행을 막기 위해 양해조치를 요구했다고 주장하고 있지만 그것이 선거에 도움이 될 것이라고
생각했다고 스스로 진술하였다고 했다. 이어 블랙타파는 블랙리스트 묵인, 방조, 협조 등의 행위도 문제적이지만 선거에 관권을 끌어들인 행위는
그 자체가 범죄행위라며 자신의 당선을 위해 민간협회 독립성을 스스로 무너뜨린 파렴치한 행위가 아닐 수 없다고 지적했다. 끝으로 블랙타파는 정
이사장을 향해 지난 과오를 반성하고 사죄하며 최소한의 책임이라도 다하는 것이 연극인으로서 자신의 명예를 지키는 일이다고 충고하고, 한국연극협회에
정 이사장의 범죄 사실에 대한 정확한 입장을 밝혀주기 바란다고 촉구했다. 블랙타파는 지난해 4월, 120개 단체와 600여 명의 연극인·예술가의
지지를 통해 결성된 단체이다. 블랙타파는 정 이사장이 스스로 물러나지 않거나, 이사회가 그를 탄핵하지 않을 시 직권남용권리행사방해죄의 공범
및 직무유기죄 등으로 형사고발한다는 입장이다.'
- 'passage: 보령제약은 필리핀 제약사인 인듀어메디칼과 3년간 50억원 규모의 항암제 이피마이신주, 에이디마이신주, 옥살리플라틴주, 네오타빈주
4개 품목에 대한 독점공급계약을 체결했다고 12일 밝혔다. 계약식은 지난 11일 스페인 마드리드 이페마(IFEMA) 컨벤션센터에서 개최된 세계의약품전시회(CPhI
Worldwdie) 현장에서 진행됐다. 이번 계약으로 보령제약은 3년 간 총 50억원(400만 달러) 규모의 의약품을 공급하게 됐으며 1차
항암제에 대한 동남아 시장 확대를 가속화하게 됐다. 보령제약은 태국, 인도네시아, 베트남 등 동남아시아 지역에 항암제를 수출하고 있다. 이를
기반으로 아프리카, 중남미지역 등으로 시장을 더욱 넓혀갈 계획이다. 한편, 2002년 창립된 인듀어메디칼은 필파마웰스그룹의 7개 계열사 중
하나로 대표 품목은 페니실린계 의약품 및 흡입마취제 등이다. 필리핀과 싱가포르 제조시설을 거점으로 글로벌 파트너사와 함께 60여개의 의료기기
및 의약품을 통해 시장을 선도하고 있는 중견 제약사다.'
- source_sentence: 'query: LG유플러스 새로운 사령탑에 누가 올랐어?'
sentences:
- 'passage: 현재 코스모화학(005420)은 전일대비 8.44% 상승하여 25,050원 선에서 거래가 이루어지고 있다. 상장주식수 대비
거래량은 2.55%로 적정수준 최근 한달간 코스모화학의 상장주식수 대비 거래량을 비교해보니 일별 매매회전율이 2.55%로 집계됐다. 그리고
동기간 장중 저점과 고점의 차이인 일평균 주가변동률을 4.32%를 나타내고 있다. 외국인 보유량 감소, 기관 증가 (5일 누적) 최근 5일간
동종목의 외국인 투자자는 3,346주 순매도하였고 반면 기관은 4,650주 순매수하였다. 전날 두 주체 각각 9,693주, 4,650주 순매수하며
보유 비율을 늘렸다. 다음 표는 동종목의 외국인, 기관의 일별 순매수량을 나타낸 것이다. fnRASSI는 증권전문 기업 씽크풀과 파이낸셜뉴스의
협업으로 로봇기자가 실시간으로 생산하는 기사입니다.'
- 'passage: 현재 엠에스오토텍(123040)은 전일대비 9.39% 상승하여 1,980원 선에서 거래가 이루어지고 있다. 주가등락폭이 너무
커 최근 한달간 엠에스오토텍의 상장주식수 대비 거래량을 비교해보니 일별 매매회전율이 0.34%로 집계됐다. 동기간 일평균 주가변동률은 17.52%로
장중 주가등락이 심하기 때문에 투자시 주의가 요망된다. 외국인/개인 순매수, 기관은 순매도(한달누적) 전일 외국인이 4,382주를 순매수했다.
4주간을 기준으로 보면 외국인이 매매관점을 매도에서 매수로 전환하면서 3,781주를 순매수했고, 개인들도 매기를 떨어뜨리면서도 10,851주를
순매수했다. 하지만 기관은 순매수에서 순매도로 반전되면서 14,632주를 순매도했다. fnRASSI는 증권전문 기업 씽크풀과 파이낸셜뉴스의
협업으로 로봇기자가 실시간으로 생산하는 기사입니다.'
- 'passage: {IMG:1}LG유플러스 새로운 사령탑에 하현회 부회장이 올랐다. 구광모 LG회장 취임 뒤 그룹 내 첫 인사이자, 5G 통신
상용화, 케이블TV 인수합병 등 굵직한 사안들을 앞둔 만큼 그 변화에 관심이 쏠린다. LG유플러스는 16일 이사회를 열고 하 부회장을 신임
최고경영자(CEO)로 선임했다. 부산대 사학과, 일본 와세다대 경영학 석사를 마친 하 부회장은 지난 1985년 LG금속에 입사한 뒤, LG디스플레이에서
전략기획담당, 중소형사업부장, IT사업부장을 지냈다. 지난 2012년부터 2년간 ㈜LG 시너지팀장을 맡은 그는, 모바일과 차세대 디스플레이,
에너지 솔루션과 친환경 자동차부품 등 그룹 주력사업과 차세대 성장사업이 본궤도에 진입할 수 있도록 기반을 확립했다. 아울러 LG전자 HE사업본부장을
맡아 울트라 올레드 TV를 세계 최초로 출시해 차세대 TV 부문에서 선도적으로 사업기반을 구축했다. 특히 지난 2015년부터 ㈜LG 대표이사를
맡으면서 계열사 실적 개선을 이끌며 LG그룹이 사업 성과를 거두는 데 핵심적인 역할을 수행했다. 올해 그는 ㈜LG 부회장으로 승진했다. 하
부회장은 지난 2015년부터 LG유플러스 이사회 멤버로 주요 의사결정에 참여해오기도 했다. 모바일 비디오, IoT, AI, 5G 등 통신산업의
변화에 대한 이해가 깊고 미래 먹거리 사업에 힘을 기울여왔다는 평가다. 하 부회장은 LG그룹 내 재무통으로 불리는 권영수 부회장과 달리 전략통으로
통하기 때문에 향후 투자를 늘리는 방향에도 관심이 쏠린다. 다만, 하 부회장의 어깨는 무겁다. 그가 이끌어야 할 LG유플러스 앞에는 풀어야할
숙제가 산적해 있기 때문이다. 5G 상용화를 위한 장비사 선정이 가장 시급한 과제다. 전임 권영수 부회장은 5G 장비로 중국 화웨이를 사용하겠다고
밝혔지만, 이에 국내 여론은 싸늘한 편이다. 실제 5G에서도 LG유플러스가 화웨이 장비를 도입한다면 비판 여론을 풀어나가는 것은 하 부회장의
몫이 될 예정이다. LG유플러스가 이미 추진하고 있는 케이블TV MA에서도 하 부회장의 경영 능력이 시험대에 오를 전망이다. LG유플러스는
미디어 플랫폼 확대를 위해 CJ헬로를 비롯한 케이블TV MA에 공개적으로 관심을 드러낸 바 있다. LG유플러스가 케이블TV MA를 위해 다양한
사업자들과 접촉했던 만큼 하 부회장의 결정은 방송·통신 시장에서 새로운 변화의 바람으로 작용할 것으로 보인다. 세계 최대 온라인동영상서비스(OTT)
넷플릭스와의 제휴도 관심사다. 전임 권 부회장이 넷플릭스와 제휴를 의욕적으로 추진했지만, LG유플러스 내부적으로는 조심스러운 분위기가 감지됐던
것이 현실이다. 회사 내부적으로는 미디어 사업 1위 목표를 내세우고 있다. 콘텐츠 대가 산정 비율이나 망 사용료 등은 비단 LG유플러스에게만
적용되는 것이 아니기 때문이다. 논란을 피해 어느 정도 수준의 협상력을 발휘할지는 하 부회장의 조율 능력이 중요한 것으로 예상된다. 한편,
전임 CEO인 권영수 부회장이 ㈜LG 대표이사로 옮기더라도 이전까지 하 부회장의 위치처럼 LG유플러스 이사회 내 기타비상무이사로 남는 점도
주목된다. 이에 따라, 하 부회장 체제의 조직 재정비는 연말 그룹 인사에 맞춰 진행되고, 올해 남은 기간 동안 기존 경영 기조를 이어갈 것이란
전망이 지배적이다. 통신업계에서는 LG유플러스가 새로운 CEO를 맞이한 만큼 케이블TV 인수나 넷플릭스 제휴 확대와 같은 현안을 두고 새판을
짜기 위한 결정이 머지 않아 나올 것으로 내다보고 있다. LG유플러스는 관계자는 하 부회장이 ㈜LG에서 계열사간 시너지를 창출한 경험을 바탕으로
LG유플러스의 본원적 경쟁력을 지속 강화하는 동시에 미래시장 개척을 위한 신성장동력 발굴에 집중해 나갈 예정이라고 강조했다.'
- source_sentence: 'query: 쿠첸은 적자행진에서 어떻게 벗어났어?'
sentences:
- 'passage: 쿠첸이 4분기만에 적자행진에서 벗어났다. 수익성 회복을 위한 판매 채널 확대, 원가율 개선 노력과 신제품 출시 등 호재가
맞물렸다. 16일 생활가전업계 및 증권가에 따르면 쿠첸은 지난 1.4분기 619억원의 매출액과 13억원의 영업이익을 기록했다. 매출액은 1년전
685억원에 비해 9.7% 줄었지만 영업이익은 6억원에서 13억원으로 두배이상 늘었다. 이에 따라 쿠첸은 지난 2.4분기부터 시작된 분기 기준
적자 행진에서 벗어나게 됐다. 쿠첸은 지난해 1.4분기 6억원의 영업이익을 기록한 이후 2.4분기에 57억원 손실, 3.4분기 12억원 손실,
4.4분기 20억원 적자 등 작년 한 해 83억원 가량의 손실을 입었다. 금액 기준으로 역대 최대 규모였다. 회사 측은 올 1.4분기 설 연휴로
호실적이 가능했다고 본다. 실제로 지난해 4.4분기 대비로 IH압력밥솥, 열판압력밥솥, 일반밥솥, 전기레인지, 기타품목 모두 매출이 상승했다.
특히 쿠첸의 야심작이었던 IR밥솥과 전기레인지가 함께 나아지고 있는 점이 특징이다. 프리미엄급 IR밥솥 비중이 순차적으로 증가하고 있고, 전기레인지
인지도 확대에 따라 홈쇼핑.B2B 채널에서 판매 호조를 나타냈다고 회사 측은 설명했다. 판매 채널을 수수료가 높은 오프라인에서 온라인 위주로
개선하고, 원가율을 낮추는 등 노력을 기울이고 있다.'
- 'passage: 현대에이치씨엔(126560)은 52주신고가를 기록하고 있어 주목할만하다. 동종목의 현재 주가는 4,650원 선에서 이루어지고
있다. 거래일을 기준으로 최근 3일간 평균 거래량은 31만 주이다. 이는 60일 일 평균 거래량 9만 주와 비교해보면 최근 거래량이 급격히
늘어났다는 것을 알 수 있다. 거래비중 기관 30.5%, 외국인 28.18% 최근 한달간 주체별 거래비중을 살펴보면 개인이 전체 거래량의 41.31%를
차지하며 가장 활발한 참여를 보였고, 기관은 30.5%, 외국인은 28.18%의 거래 참여율을 보였다. 그리고 최근 5일간 거래비중은 개인이
전체 거래의 35.43%를 차지하며 가장 적극적인 모습을 보여줬고, 외국인은 33.83%를 나타냈으며 기관은 30.73%를 나타냈다. 외국인/기관
순매수, 개인은 순매도(한달누적) 전일까지 기관이 2일 연속, 외국인이 10일 연속 각각 동종목을 순매수 했다. 4주간을 기준으로 보면 외국인이
초반에 동종목을 순매수한 이후에 기세를 이어가며 158,133주를 순매수했고, 기관도 초반에 동종목을 순매수한 이후에 최근에는 순매수량이 줄어들고
있음에도 불구하고 3,362주를 순매수했다. 반면 개인들은 매도쪽으로 방향을 잡으면서 161,495주를 순매도한 것으로 나타났다. fnRASSI는
증권전문 기업 씽크풀과 파이낸셜뉴스의 협업으로 로봇기자가 실시간으로 생산하는 기사입니다.'
- 'passage: 세계적인 복서 플로이드 메이웨더 주니어 (Floyd Mayweather Jr.)가 전용기를 타고 국내 입국할 예정이다. 열렬한
힙합 마니아로 잘 알려진 메이웨더는 본인의 공식 브랜드 TMT 산하 음악 레이블 TMT MUSIC GROUP을 런칭. 래퍼 릴 웨인(Lil
Wayne), 떠오르는 힙합 루키 릴 제임스(Lil Jamez)와 함께 TMT MUSIC을 한국에 알리기 위해 첫 내한을 결정했다는 후문이다.
메이웨더는 이번 일정 중에 26일에는 서울 강남구 신사동 ‘바이브레이트 플래그쉽 스토어’에서 팬들과 만난다. 팬들과 가깝게 만나는 팬사인회
일정으로 진행될 예정이다. 특히 이날 행사에는 세계 챔피언 복서 플로이드 메이웨더 주니어(Floyd Mayweather Jr. 이하 메이웨더)를
비롯해 래퍼 릴 웨인과, 릴 제임스가 함께할 예정으로 더욱 기대를 모은다. 메이웨더는 총 자산 규모가 1조가 넘을 정도로 소위 얘기하는 ‘거물급’
스포츠 스타이다. 경제지 포브스에 따르면 2015년 기준 세계에서 가장 많은 돈을 번 스포츠 스타며 메이웨더의 별명은 ‘머니’로 그의 남다른
돈자랑은 SNS는 물론 언론에서까지 항상 화제를 낳은 바 있다. 메이웨더는 1996년 데뷔 이후 49전 ‘무’패 최근 UFC역사상 가장 핫한
‘코너 맥그리거’와의 결투에서 승리한 스포츠 스타다. 이번 팬미팅은 국내 팬들을 위해 메이웨더 측에서 준비한 깜짝 이벤트로 한국스타들의 뜨거운
사랑을 받고 있는 감각적인 스트릿 브랜드 바이브레이트에서 주최해 팬들에게 잊지 못할 감동과 추억을 선사할 예정이다. 한편 메이웨더 팬미팅에
대한 상세 정보 및 이벤트는 바이브레이트 공식 SNS를 통해 확인이 가능하다.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on intfloat/e5-large-unsupervised
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/e5-large-unsupervised](https://huggingface.co/intfloat/e5-large-unsupervised). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/e5-large-unsupervised](https://huggingface.co/intfloat/e5-large-unsupervised) <!-- at revision 15af9288f69a6291f37bfb89b47e71abc747b206 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
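For reference, since the final `Normalize()` module in the architecture below outputs unit-length vectors, the cosine similarity used as the similarity function reduces to a plain dot product between embeddings $u$ and $v$:

$$
\operatorname{sim}(u, v) = \frac{u \cdot v}{\lVert u \rVert\,\lVert v \rVert} = u \cdot v \quad \text{when } \lVert u \rVert = \lVert v \rVert = 1.
$$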
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
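If the Sentence Transformers library is not an option, the Transformer → mean Pooling → Normalize stack above can be approximated with plain `transformers`. The following is a minimal sketch, assuming `torch` and `transformers` are installed; it is not an official alternative API for this model.

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

model_id = "seongil-dn/e5-large-filtered-our-neg5-9240"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentences = ["query: 쿠첸은 적자행진에서 어떻게 벗어났어?"]
batch = tokenizer(sentences, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq_len, 1024)

# Mean pooling over non-padding tokens, mirroring pooling_mode_mean_tokens=True
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# L2-normalize, mirroring the Normalize() module
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([1, 1024])
```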
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("seongil-dn/e5-large-filtered-our-neg5-9240")
# Run inference
sentences = [
'query: 쿠첸은 적자행진에서 어떻게 벗어났어?',
'passage: 쿠첸이 4분기만에 적자행진에서 벗어났다. 수익성 회복을 위한 판매 채널 확대, 원가율 개선 노력과 신제품 출시 등 호재가 맞물렸다. 16일 생활가전업계 및 증권가에 따르면 쿠첸은 지난 1.4분기 619억원의 매출액과 13억원의 영업이익을 기록했다. 매출액은 1년전 685억원에 비해 9.7% 줄었지만 영업이익은 6억원에서 13억원으로 두배이상 늘었다. 이에 따라 쿠첸은 지난 2.4분기부터 시작된 분기 기준 적자 행진에서 벗어나게 됐다. 쿠첸은 지난해 1.4분기 6억원의 영업이익을 기록한 이후 2.4분기에 57억원 손실, 3.4분기 12억원 손실, 4.4분기 20억원 적자 등 작년 한 해 83억원 가량의 손실을 입었다. 금액 기준으로 역대 최대 규모였다. 회사 측은 올 1.4분기 설 연휴로 호실적이 가능했다고 본다. 실제로 지난해 4.4분기 대비로 IH압력밥솥, 열판압력밥솥, 일반밥솥, 전기레인지, 기타품목 모두 매출이 상승했다. 특히 쿠첸의 야심작이었던 IR밥솥과 전기레인지가 함께 나아지고 있는 점이 특징이다. 프리미엄급 IR밥솥 비중이 순차적으로 증가하고 있고, 전기레인지 인지도 확대에 따라 홈쇼핑.B2B 채널에서 판매 호조를 나타냈다고 회사 측은 설명했다. 판매 채널을 수수료가 높은 오프라인에서 온라인 위주로 개선하고, 원가율을 낮추는 등 노력을 기울이고 있다.',
'passage: 현대에이치씨엔(126560)은 52주신고가를 기록하고 있어 주목할만하다. 동종목의 현재 주가는 4,650원 선에서 이루어지고 있다. 거래일을 기준으로 최근 3일간 평균 거래량은 31만 주이다. 이는 60일 일 평균 거래량 9만 주와 비교해보면 최근 거래량이 급격히 늘어났다는 것을 알 수 있다. 거래비중 기관 30.5%, 외국인 28.18% 최근 한달간 주체별 거래비중을 살펴보면 개인이 전체 거래량의 41.31%를 차지하며 가장 활발한 참여를 보였고, 기관은 30.5%, 외국인은 28.18%의 거래 참여율을 보였다. 그리고 최근 5일간 거래비중은 개인이 전체 거래의 35.43%를 차지하며 가장 적극적인 모습을 보여줬고, 외국인은 33.83%를 나타냈으며 기관은 30.73%를 나타냈다. 외국인/기관 순매수, 개인은 순매도(한달누적) 전일까지 기관이 2일 연속, 외국인이 10일 연속 각각 동종목을 순매수 했다. 4주간을 기준으로 보면 외국인이 초반에 동종목을 순매수한 이후에 기세를 이어가며 158,133주를 순매수했고, 기관도 초반에 동종목을 순매수한 이후에 최근에는 순매수량이 줄어들고 있음에도 불구하고 3,362주를 순매수했다. 반면 개인들은 매도쪽으로 방향을 잡으면서 161,495주를 순매도한 것으로 나타났다. fnRASSI는 증권전문 기업 씽크풀과 파이낸셜뉴스의 협업으로 로봇기자가 실시간으로 생산하는 기사입니다.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
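Note that the examples above follow the E5 convention of prefixing queries with `query: ` and documents with `passage: `; keeping these prefixes at inference time should match how the model was trained. As a small follow-on sketch (same model, hypothetical passage list), the embeddings can be used for semantic search by ranking passages against a query:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("seongil-dn/e5-large-filtered-our-neg5-9240")

query = "query: LG유플러스 새로운 사령탑에 누가 올랐어?"
passages = [
    "passage: LG유플러스는 16일 이사회를 열고 하현회 부회장을 신임 최고경영자(CEO)로 선임했다.",
    "passage: 쿠첸이 4분기만에 적자행진에서 벗어났다.",
]

# Encode the query and passages, then rank passages by cosine similarity
query_emb = model.encode([query])
passage_embs = model.encode(passages)
scores = model.similarity(query_emb, passage_embs)  # shape: (1, len(passages))

best = int(scores.argmax())
print(best, passages[best])
```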
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 40
- `learning_rate`: 0.0001
- `adam_epsilon`: 1e-07
- `num_train_epochs`: 5
- `warmup_ratio`: 0.05
- `bf16`: True
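As a rough sketch of how these non-default values could be wired into a Sentence Transformers v3 training run with `MultipleNegativesRankingLoss` (the loss listed in the tags); the `anchor`/`positive` pairs below are hypothetical placeholders, since the card does not name the training dataset, and this is not the original training script:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("intfloat/e5-large-unsupervised")

# Hypothetical (anchor, positive) pairs; the actual 295,744-pair dataset is not published in this card
train_dataset = Dataset.from_dict({
    "anchor": [
        "query: 쿠첸은 적자행진에서 어떻게 벗어났어?",
        "query: NH농협금융지주 임원후보추천위원회는 금융지주 대표이사 회장 최종후보로 누구를 추천했나?",
    ],
    "positive": [
        "passage: 쿠첸이 4분기만에 적자행진에서 벗어났다.",
        "passage: 김광수 전 금융정보분석원(FIU) 원장이 NH농협금융지주 회장에 내정됐다.",
    ],
})

args = SentenceTransformerTrainingArguments(
    output_dir="e5-large-finetuned",
    per_device_train_batch_size=40,
    learning_rate=1e-4,
    adam_epsilon=1e-7,
    num_train_epochs=5,
    warmup_ratio=0.05,
    bf16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),  # in-batch negatives ranking loss
)
trainer.train()
```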
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 40
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.0001
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-07
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.0005 | 1 | 3.9557 |
| 0.0011 | 2 | 3.9207 |
| 0.0016 | 3 | 3.9878 |
| 0.0022 | 4 | 3.8356 |
| 0.0027 | 5 | 3.8777 |
| 0.0032 | 6 | 3.9141 |
| 0.0038 | 7 | 3.8113 |
| 0.0043 | 8 | 3.9641 |
| 0.0049 | 9 | 3.9577 |
| 0.0054 | 10 | 3.8921 |
| 0.0060 | 11 | 3.9589 |
| 0.0065 | 12 | 3.9007 |
| 0.0070 | 13 | 3.9295 |
| 0.0076 | 14 | 3.8336 |
| 0.0081 | 15 | 3.823 |
| 0.0087 | 16 | 3.8653 |
| 0.0092 | 17 | 3.8996 |
| 0.0097 | 18 | 3.8196 |
| 0.0103 | 19 | 3.8609 |
| 0.0108 | 20 | 3.7583 |
| 0.0114 | 21 | 3.823 |
| 0.0119 | 22 | 3.7987 |
| 0.0124 | 23 | 3.7744 |
| 0.0130 | 24 | 3.7279 |
| 0.0135 | 25 | 3.735 |
| 0.0141 | 26 | 3.713 |
| 0.0146 | 27 | 3.6465 |
| 0.0152 | 28 | 3.7064 |
| 0.0157 | 29 | 3.5865 |
| 0.0162 | 30 | 3.6699 |
| 0.0168 | 31 | 3.5554 |
| 0.0173 | 32 | 3.5139 |
| 0.0179 | 33 | 3.4726 |
| 0.0184 | 34 | 3.4577 |
| 0.0189 | 35 | 3.4131 |
| 0.0195 | 36 | 3.3885 |
| 0.0200 | 37 | 3.3399 |
| 0.0206 | 38 | 3.3106 |
| 0.0211 | 39 | 3.2043 |
| 0.0216 | 40 | 3.1255 |
| 0.0222 | 41 | 3.2178 |
| 0.0227 | 42 | 3.0531 |
| 0.0233 | 43 | 3.1042 |
| 0.0238 | 44 | 2.9103 |
| 0.0244 | 45 | 2.9482 |
| 0.0249 | 46 | 2.7755 |
| 0.0254 | 47 | 2.7953 |
| 0.0260 | 48 | 2.8629 |
| 0.0265 | 49 | 2.7599 |
| 0.0271 | 50 | 2.6131 |
| 0.0276 | 51 | 2.666 |
| 0.0281 | 52 | 2.6593 |
| 0.0287 | 53 | 2.6734 |
| 0.0292 | 54 | 2.4722 |
| 0.0298 | 55 | 2.5322 |
| 0.0303 | 56 | 2.3349 |
| 0.0308 | 57 | 2.2357 |
| 0.0314 | 58 | 2.3634 |
| 0.0319 | 59 | 2.0923 |
| 0.0325 | 60 | 2.1504 |
| 0.0330 | 61 | 2.0468 |
| 0.0335 | 62 | 1.9572 |
| 0.0341 | 63 | 2.0988 |
| 0.0346 | 64 | 2.1403 |
| 0.0352 | 65 | 1.9407 |
| 0.0357 | 66 | 2.0625 |
| 0.0363 | 67 | 1.8134 |
| 0.0368 | 68 | 1.6808 |
| 0.0373 | 69 | 1.9427 |
| 0.0379 | 70 | 1.82 |
| 0.0384 | 71 | 1.6899 |
| 0.0390 | 72 | 1.6761 |
| 0.0395 | 73 | 1.6193 |
| 0.0400 | 74 | 1.9625 |
| 0.0406 | 75 | 1.7979 |
| 0.0411 | 76 | 1.7376 |
| 0.0417 | 77 | 1.8423 |
| 0.0422 | 78 | 1.681 |
| 0.0427 | 79 | 1.4283 |
| 0.0433 | 80 | 1.7009 |
| 0.0438 | 81 | 1.6515 |
| 0.0444 | 82 | 1.5507 |
| 0.0449 | 83 | 1.7057 |
| 0.0455 | 84 | 1.4614 |
| 0.0460 | 85 | 1.6146 |
| 0.0465 | 86 | 1.5025 |
| 0.0471 | 87 | 1.4857 |
| 0.0476 | 88 | 1.3809 |
| 0.0482 | 89 | 1.6986 |
| 0.0487 | 90 | 1.572 |
| 0.0492 | 91 | 1.5139 |
| 0.0498 | 92 | 1.6796 |
| 0.0503 | 93 | 1.6394 |
| 0.0509 | 94 | 1.6096 |
| 0.0514 | 95 | 1.4346 |
| 0.0519 | 96 | 1.1299 |
| 0.0525 | 97 | 1.3578 |
| 0.0530 | 98 | 1.3381 |
| 0.0536 | 99 | 1.3463 |
| 0.0541 | 100 | 1.2956 |
| 0.0547 | 101 | 1.7073 |
| 0.0552 | 102 | 1.4074 |
| 0.0557 | 103 | 1.3459 |
| 0.0563 | 104 | 1.4079 |
| 0.0568 | 105 | 1.2535 |
| 0.0574 | 106 | 1.3232 |
| 0.0579 | 107 | 1.4095 |
| 0.0584 | 108 | 1.2362 |
| 0.0590 | 109 | 1.3326 |
| 0.0595 | 110 | 1.3922 |
| 0.0601 | 111 | 1.1261 |
| 0.0606 | 112 | 1.204 |
| 0.0611 | 113 | 1.1924 |
| 0.0617 | 114 | 1.4995 |
| 0.0622 | 115 | 1.2202 |
| 0.0628 | 116 | 1.2755 |
| 0.0633 | 117 | 1.3031 |
| 0.0639 | 118 | 1.4851 |
| 0.0644 | 119 | 1.2415 |
| 0.0649 | 120 | 1.521 |
| 0.0655 | 121 | 1.3321 |
| 0.0660 | 122 | 1.2061 |
| 0.0666 | 123 | 1.3673 |
| 0.0671 | 124 | 1.2265 |
| 0.0676 | 125 | 1.2672 |
| 0.0682 | 126 | 1.0618 |
| 0.0687 | 127 | 1.1874 |
| 0.0693 | 128 | 1.0718 |
| 0.0698 | 129 | 1.1261 |
| 0.0703 | 130 | 1.0751 |
| 0.0709 | 131 | 1.2649 |
| 0.0714 | 132 | 1.0998 |
| 0.0720 | 133 | 1.0999 |
| 0.0725 | 134 | 1.1897 |
| 0.0731 | 135 | 1.0635 |
| 0.0736 | 136 | 1.104 |
| 0.0741 | 137 | 0.9697 |
| 0.0747 | 138 | 1.201 |
| 0.0752 | 139 | 1.0408 |
| 0.0758 | 140 | 1.0786 |
| 0.0763 | 141 | 0.8686 |
| 0.0768 | 142 | 0.9676 |
| 0.0774 | 143 | 0.9179 |
| 0.0779 | 144 | 1.0294 |
| 0.0785 | 145 | 1.0401 |
| 0.0790 | 146 | 1.1051 |
| 0.0795 | 147 | 0.9318 |
| 0.0801 | 148 | 1.0279 |
| 0.0806 | 149 | 0.9114 |
| 0.0812 | 150 | 0.9359 |
| 0.0817 | 151 | 0.8913 |
| 0.0823 | 152 | 0.9284 |
| 0.0828 | 153 | 1.0118 |
| 0.0833 | 154 | 1.1165 |
| 0.0839 | 155 | 1.1369 |
| 0.0844 | 156 | 0.8976 |
| 0.0850 | 157 | 1.0009 |
| 0.0855 | 158 | 0.9414 |
| 0.0860 | 159 | 0.868 |
| 0.0866 | 160 | 1.1353 |
| 0.0871 | 161 | 0.9101 |
| 0.0877 | 162 | 0.7246 |
| 0.0882 | 163 | 1.0012 |
| 0.0887 | 164 | 0.9211 |
| 0.0893 | 165 | 0.9428 |
| 0.0898 | 166 | 0.8289 |
| 0.0904 | 167 | 0.9593 |
| 0.0909 | 168 | 1.2092 |
| 0.0915 | 169 | 0.7338 |
| 0.0920 | 170 | 0.8588 |
| 0.0925 | 171 | 0.9816 |
| 0.0931 | 172 | 0.9474 |
| 0.0936 | 173 | 1.0677 |
| 0.0942 | 174 | 1.117 |
| 0.0947 | 175 | 0.898 |
| 0.0952 | 176 | 0.8501 |
| 0.0958 | 177 | 1.0569 |
| 0.0963 | 178 | 1.0522 |
| 0.0969 | 179 | 0.8532 |
| 0.0974 | 180 | 0.8668 |
| 0.0979 | 181 | 0.8636 |
| 0.0985 | 182 | 0.7857 |
| 0.0990 | 183 | 0.8385 |
| 0.0996 | 184 | 0.7104 |
| 0.1001 | 185 | 0.9013 |
| 0.1006 | 186 | 0.7508 |
| 0.1012 | 187 | 0.6771 |
| 0.1017 | 188 | 0.8624 |
| 0.1023 | 189 | 0.8862 |
| 0.1028 | 190 | 0.9327 |
| 0.1034 | 191 | 0.8809 |
| 0.1039 | 192 | 0.8578 |
| 0.1044 | 193 | 0.833 |
| 0.1050 | 194 | 0.8461 |
| 0.1055 | 195 | 0.7682 |
| 0.1061 | 196 | 0.8086 |
| 0.1066 | 197 | 0.9925 |
| 0.1071 | 198 | 0.9014 |
| 0.1077 | 199 | 0.879 |
| 0.1082 | 200 | 0.8851 |
| 0.1088 | 201 | 0.7543 |
| 0.1093 | 202 | 0.6934 |
| 0.1098 | 203 | 0.716 |
| 0.1104 | 204 | 0.9915 |
| 0.1109 | 205 | 0.7734 |
| 0.1115 | 206 | 0.6162 |
| 0.1120 | 207 | 0.7771 |
| 0.1126 | 208 | 0.7243 |
| 0.1131 | 209 | 0.7599 |
| 0.1136 | 210 | 0.8286 |
| 0.1142 | 211 | 0.6429 |
| 0.1147 | 212 | 0.6823 |
| 0.1153 | 213 | 0.7784 |
| 0.1158 | 214 | 0.635 |
| 0.1163 | 215 | 0.7964 |
| 0.1169 | 216 | 0.7031 |
| 0.1174 | 217 | 0.6108 |
| 0.1180 | 218 | 0.6102 |
| 0.1185 | 219 | 0.5939 |
| 0.1190 | 220 | 0.8973 |
| 0.1196 | 221 | 0.7167 |
| 0.1201 | 222 | 0.6575 |
| 0.1207 | 223 | 0.7441 |
| 0.1212 | 224 | 0.6632 |
| 0.1218 | 225 | 0.7348 |
| 0.1223 | 226 | 0.6358 |
| 0.1228 | 227 | 0.7601 |
| 0.1234 | 228 | 0.6086 |
| 0.1239 | 229 | 0.6904 |
| 0.1245 | 230 | 0.7876 |
| 0.125 | 231 | 0.7261 |
| 0.1255 | 232 | 0.8477 |
| 0.1261 | 233 | 0.6625 |
| 0.1266 | 234 | 0.7851 |
| 0.1272 | 235 | 0.7571 |
| 0.1277 | 236 | 0.8736 |
| 0.1282 | 237 | 0.7984 |
| 0.1288 | 238 | 0.6757 |
| 0.1293 | 239 | 0.6295 |
| 0.1299 | 240 | 0.7864 |
| 0.1304 | 241 | 0.8343 |
| 0.1310 | 242 | 0.6804 |
| 0.1315 | 243 | 0.5932 |
| 0.1320 | 244 | 0.6748 |
| 0.1326 | 245 | 0.4666 |
| 0.1331 | 246 | 0.6346 |
| 0.1337 | 247 | 0.6092 |
| 0.1342 | 248 | 0.7029 |
| 0.1347 | 249 | 0.5889 |
| 0.1353 | 250 | 0.5261 |
| 0.1358 | 251 | 0.6361 |
| 0.1364 | 252 | 0.769 |
| 0.1369 | 253 | 0.7193 |
| 0.1374 | 254 | 0.6804 |
| 0.1380 | 255 | 0.6696 |
| 0.1385 | 256 | 0.5272 |
| 0.1391 | 257 | 0.6706 |
| 0.1396 | 258 | 0.6941 |
| 0.1402 | 259 | 0.6801 |
| 0.1407 | 260 | 0.6171 |
| 0.1412 | 261 | 0.6859 |
| 0.1418 | 262 | 0.8151 |
| 0.1423 | 263 | 0.7727 |
| 0.1429 | 264 | 0.7049 |
| 0.1434 | 265 | 0.6862 |
| 0.1439 | 266 | 0.6732 |
| 0.1445 | 267 | 0.6814 |
| 0.1450 | 268 | 0.6222 |
| 0.1456 | 269 | 0.7482 |
| 0.1461 | 270 | 0.6293 |
| 0.1466 | 271 | 0.5959 |
| 0.1472 | 272 | 0.5825 |
| 0.1477 | 273 | 0.6173 |
| 0.1483 | 274 | 0.5604 |
| 0.1488 | 275 | 0.5701 |
| 0.1494 | 276 | 0.7433 |
| 0.1499 | 277 | 0.6079 |
| 0.1504 | 278 | 0.5344 |
| 0.1510 | 279 | 0.5373 |
| 0.1515 | 280 | 0.7143 |
| 0.1521 | 281 | 0.563 |
| 0.1526 | 282 | 0.6148 |
| 0.1531 | 283 | 0.7291 |
| 0.1537 | 284 | 0.6242 |
| 0.1542 | 285 | 0.6281 |
| 0.1548 | 286 | 0.6458 |
| 0.1553 | 287 | 0.4476 |
| 0.1558 | 288 | 0.5321 |
| 0.1564 | 289 | 0.6032 |
| 0.1569 | 290 | 0.5471 |
| 0.1575 | 291 | 0.6062 |
| 0.1580 | 292 | 0.6591 |
| 0.1585 | 293 | 0.5937 |
| 0.1591 | 294 | 0.6237 |
| 0.1596 | 295 | 0.687 |
| 0.1602 | 296 | 0.6427 |
| 0.1607 | 297 | 0.5706 |
| 0.1613 | 298 | 0.7336 |
| 0.1618 | 299 | 0.6361 |
| 0.1623 | 300 | 0.4598 |
| 0.1629 | 301 | 0.5767 |
| 0.1634 | 302 | 0.4932 |
| 0.1640 | 303 | 0.4852 |
| 0.1645 | 304 | 0.7043 |
| 0.1650 | 305 | 0.5403 |
| 0.1656 | 306 | 0.5415 |
| 0.1661 | 307 | 0.7011 |
| 0.1667 | 308 | 0.4772 |
| 0.1672 | 309 | 0.5594 |
| 0.1677 | 310 | 0.6951 |
| 0.1683 | 311 | 0.4775 |
| 0.1688 | 312 | 0.5433 |
| 0.1694 | 313 | 0.6717 |
| 0.1699 | 314 | 0.6976 |
| 0.1705 | 315 | 0.5541 |
| 0.1710 | 316 | 0.5422 |
| 0.1715 | 317 | 0.4612 |
| 0.1721 | 318 | 0.5404 |
| 0.1726 | 319 | 0.5641 |
| 0.1732 | 320 | 0.4292 |
| 0.1737 | 321 | 0.4008 |
| 0.1742 | 322 | 0.4958 |
| 0.1748 | 323 | 0.5326 |
| 0.1753 | 324 | 0.5201 |
| 0.1759 | 325 | 0.6857 |
| 0.1764 | 326 | 0.602 |
| 0.1769 | 327 | 0.5668 |
| 0.1775 | 328 | 0.4316 |
| 0.1780 | 329 | 0.4176 |
| 0.1786 | 330 | 0.6182 |
| 0.1791 | 331 | 0.5139 |
| 0.1797 | 332 | 0.5957 |
| 0.1802 | 333 | 0.4649 |
| 0.1807 | 334 | 0.5304 |
| 0.1813 | 335 | 0.3467 |
| 0.1818 | 336 | 0.4935 |
| 0.1824 | 337 | 0.4455 |
| 0.1829 | 338 | 0.4838 |
| 0.1834 | 339 | 0.5371 |
| 0.1840 | 340 | 0.4221 |
| 0.1845 | 341 | 0.5174 |
| 0.1851 | 342 | 0.4481 |
| 0.1856 | 343 | 0.4876 |
| 0.1861 | 344 | 0.5236 |
| 0.1867 | 345 | 0.5481 |
| 0.1872 | 346 | 0.4591 |
| 0.1878 | 347 | 0.692 |
| 0.1883 | 348 | 0.6094 |
| 0.1889 | 349 | 0.5291 |
| 0.1894 | 350 | 0.5513 |
| 0.1899 | 351 | 0.491 |
| 0.1905 | 352 | 0.5299 |
| 0.1910 | 353 | 0.5158 |
| 0.1916 | 354 | 0.5136 |
| 0.1921 | 355 | 0.4322 |
| 0.1926 | 356 | 0.4036 |
| 0.1932 | 357 | 0.5907 |
| 0.1937 | 358 | 0.6236 |
| 0.1943 | 359 | 0.3994 |
| 0.1948 | 360 | 0.4724 |
| 0.1953 | 361 | 0.4455 |
| 0.1959 | 362 | 0.5442 |
| 0.1964 | 363 | 0.416 |
| 0.1970 | 364 | 0.431 |
| 0.1975 | 365 | 0.3939 |
| 0.1981 | 366 | 0.4713 |
| 0.1986 | 367 | 0.5071 |
| 0.1991 | 368 | 0.3929 |
| 0.1997 | 369 | 0.4262 |
| 0.2002 | 370 | 0.4844 |
| 0.2008 | 371 | 0.2995 |
| 0.2013 | 372 | 0.6043 |
| 0.2018 | 373 | 0.4498 |
| 0.2024 | 374 | 0.5729 |
| 0.2029 | 375 | 0.5476 |
| 0.2035 | 376 | 0.4076 |
| 0.2040 | 377 | 0.4285 |
| 0.2045 | 378 | 0.3416 |
| 0.2051 | 379 | 0.4576 |
| 0.2056 | 380 | 0.421 |
| 0.2062 | 381 | 0.508 |
| 0.2067 | 382 | 0.6678 |
| 0.2073 | 383 | 0.3123 |
| 0.2078 | 384 | 0.4111 |
| 0.2083 | 385 | 0.5635 |
| 0.2089 | 386 | 0.4951 |
| 0.2094 | 387 | 0.4918 |
| 0.2100 | 388 | 0.391 |
| 0.2105 | 389 | 0.5256 |
| 0.2110 | 390 | 0.3937 |
| 0.2116 | 391 | 0.5605 |
| 0.2121 | 392 | 0.5048 |
| 0.2127 | 393 | 0.4383 |
| 0.2132 | 394 | 0.4155 |
| 0.2137 | 395 | 0.4795 |
| 0.2143 | 396 | 0.403 |
| 0.2148 | 397 | 0.5965 |
| 0.2154 | 398 | 0.4061 |
| 0.2159 | 399 | 0.3772 |
| 0.2165 | 400 | 0.4482 |
| 0.2170 | 401 | 0.4261 |
| 0.2175 | 402 | 0.4683 |
| 0.2181 | 403 | 0.435 |
| 0.2186 | 404 | 0.5485 |
| 0.2192 | 405 | 0.4479 |
| 0.2197 | 406 | 0.4882 |
| 0.2202 | 407 | 0.4688 |
| 0.2208 | 408 | 0.3496 |
| 0.2213 | 409 | 0.3889 |
| 0.2219 | 410 | 0.4166 |
| 0.2224 | 411 | 0.4191 |
| 0.2229 | 412 | 0.4672 |
| 0.2235 | 413 | 0.529 |
| 0.2240 | 414 | 0.4893 |
| 0.2246 | 415 | 0.4869 |
| 0.2251 | 416 | 0.3827 |
| 0.2256 | 417 | 0.4547 |
| 0.2262 | 418 | 0.5323 |
| 0.2267 | 419 | 0.5286 |
| 0.2273 | 420 | 0.4395 |
| 0.2278 | 421 | 0.4486 |
| 0.2284 | 422 | 0.5122 |
| 0.2289 | 423 | 0.3213 |
| 0.2294 | 424 | 0.4675 |
| 0.2300 | 425 | 0.5319 |
| 0.2305 | 426 | 0.4171 |
| 0.2311 | 427 | 0.4579 |
| 0.2316 | 428 | 0.427 |
| 0.2321 | 429 | 0.4374 |
| 0.2327 | 430 | 0.4414 |
| 0.2332 | 431 | 0.4929 |
| 0.2338 | 432 | 0.4007 |
| 0.2343 | 433 | 0.4919 |
| 0.2348 | 434 | 0.5 |
| 0.2354 | 435 | 0.4636 |
| 0.2359 | 436 | 0.4317 |
| 0.2365 | 437 | 0.4267 |
| 0.2370 | 438 | 0.3405 |
| 0.2376 | 439 | 0.5068 |
| 0.2381 | 440 | 0.3878 |
| 0.2386 | 441 | 0.6178 |
| 0.2392 | 442 | 0.3604 |
| 0.2397 | 443 | 0.4737 |
| 0.2403 | 444 | 0.4166 |
| 0.2408 | 445 | 0.4893 |
| 0.2413 | 446 | 0.467 |
| 0.2419 | 447 | 0.5191 |
| 0.2424 | 448 | 0.4461 |
| 0.2430 | 449 | 0.3629 |
| 0.2435 | 450 | 0.457 |
| 0.2440 | 451 | 0.4321 |
| 0.2446 | 452 | 0.4438 |
| 0.2451 | 453 | 0.4121 |
| 0.2457 | 454 | 0.4597 |
| 0.2462 | 455 | 0.5864 |
| 0.2468 | 456 | 0.5268 |
| 0.2473 | 457 | 0.4403 |
| 0.2478 | 458 | 0.4184 |
| 0.2484 | 459 | 0.532 |
| 0.2489 | 460 | 0.4448 |
| 0.2495 | 461 | 0.5599 |
| 0.25 | 462 | 0.4662 |
| 0.2505 | 463 | 0.3816 |
| 0.2511 | 464 | 0.4811 |
| 0.2516 | 465 | 0.5688 |
| 0.2522 | 466 | 0.4477 |
| 0.2527 | 467 | 0.4768 |
| 0.2532 | 468 | 0.5138 |
| 0.2538 | 469 | 0.4371 |
| 0.2543 | 470 | 0.4301 |
| 0.2549 | 471 | 0.5701 |
| 0.2554 | 472 | 0.4021 |
| 0.2560 | 473 | 0.3638 |
| 0.2565 | 474 | 0.4867 |
| 0.2570 | 475 | 0.2128 |
| 0.2576 | 476 | 0.5129 |
| 0.2581 | 477 | 0.4493 |
| 0.2587 | 478 | 0.398 |
| 0.2592 | 479 | 0.403 |
| 0.2597 | 480 | 0.4101 |
| 0.2603 | 481 | 0.4842 |
| 0.2608 | 482 | 0.4045 |
| 0.2614 | 483 | 0.376 |
| 0.2619 | 484 | 0.3203 |
| 0.2624 | 485 | 0.4024 |
| 0.2630 | 486 | 0.3525 |
| 0.2635 | 487 | 0.432 |
| 0.2641 | 488 | 0.4276 |
| 0.2646 | 489 | 0.5266 |
| 0.2652 | 490 | 0.4979 |
| 0.2657 | 491 | 0.4799 |
| 0.2662 | 492 | 0.4078 |
| 0.2668 | 493 | 0.3757 |
| 0.2673 | 494 | 0.4429 |
| 0.2679 | 495 | 0.431 |
| 0.2684 | 496 | 0.3557 |
| 0.2689 | 497 | 0.4048 |
| 0.2695 | 498 | 0.3873 |
| 0.2700 | 499 | 0.4281 |
| 0.2706 | 500 | 0.4625 |
| 0.2711 | 501 | 0.3313 |
| 0.2716 | 502 | 0.4699 |
| 0.2722 | 503 | 0.4945 |
| 0.2727 | 504 | 0.5019 |
| 0.2733 | 505 | 0.5058 |
| 0.2738 | 506 | 0.3148 |
| 0.2744 | 507 | 0.3969 |
| 0.2749 | 508 | 0.3188 |
| 0.2754 | 509 | 0.3656 |
| 0.2760 | 510 | 0.4077 |
| 0.2765 | 511 | 0.4514 |
| 0.2771 | 512 | 0.3854 |
| 0.2776 | 513 | 0.2781 |
| 0.2781 | 514 | 0.3847 |
| 0.2787 | 515 | 0.5575 |
| 0.2792 | 516 | 0.4726 |
| 0.2798 | 517 | 0.544 |
| 0.2803 | 518 | 0.4915 |
| 0.2808 | 519 | 0.3097 |
| 0.2814 | 520 | 0.446 |
| 0.2819 | 521 | 0.4743 |
| 0.2825 | 522 | 0.4088 |
| 0.2830 | 523 | 0.237 |
| 0.2835 | 524 | 0.4589 |
| 0.2841 | 525 | 0.446 |
| 0.2846 | 526 | 0.2436 |
| 0.2852 | 527 | 0.38 |
| 0.2857 | 528 | 0.3016 |
| 0.2863 | 529 | 0.4377 |
| 0.2868 | 530 | 0.368 |
| 0.2873 | 531 | 0.4198 |
| 0.2879 | 532 | 0.4686 |
| 0.2884 | 533 | 0.3889 |
| 0.2890 | 534 | 0.3036 |
| 0.2895 | 535 | 0.4019 |
| 0.2900 | 536 | 0.3801 |
| 0.2906 | 537 | 0.5282 |
| 0.2911 | 538 | 0.4505 |
| 0.2917 | 539 | 0.3379 |
| 0.2922 | 540 | 0.491 |
| 0.2927 | 541 | 0.5154 |
| 0.2933 | 542 | 0.3878 |
| 0.2938 | 543 | 0.311 |
| 0.2944 | 544 | 0.4869 |
| 0.2949 | 545 | 0.5065 |
| 0.2955 | 546 | 0.4938 |
| 0.2960 | 547 | 0.3523 |
| 0.2965 | 548 | 0.3807 |
| 0.2971 | 549 | 0.4386 |
| 0.2976 | 550 | 0.5617 |
| 0.2982 | 551 | 0.3759 |
| 0.2987 | 552 | 0.3336 |
| 0.2992 | 553 | 0.4209 |
| 0.2998 | 554 | 0.3269 |
| 0.3003 | 555 | 0.4954 |
| 0.3009 | 556 | 0.3328 |
| 0.3014 | 557 | 0.4135 |
| 0.3019 | 558 | 0.3406 |
| 0.3025 | 559 | 0.3276 |
| 0.3030 | 560 | 0.3642 |
| 0.3036 | 561 | 0.319 |
| 0.3041 | 562 | 0.2571 |
| 0.3047 | 563 | 0.4727 |
| 0.3052 | 564 | 0.5308 |
| 0.3057 | 565 | 0.2908 |
| 0.3063 | 566 | 0.5056 |
| 0.3068 | 567 | 0.3943 |
| 0.3074 | 568 | 0.424 |
| 0.3079 | 569 | 0.4812 |
| 0.3084 | 570 | 0.2392 |
| 0.3090 | 571 | 0.3348 |
| 0.3095 | 572 | 0.3481 |
| 0.3101 | 573 | 0.3635 |
| 0.3106 | 574 | 0.4103 |
| 0.3111 | 575 | 0.374 |
| 0.3117 | 576 | 0.4625 |
| 0.3122 | 577 | 0.3743 |
| 0.3128 | 578 | 0.3774 |
| 0.3133 | 579 | 0.4476 |
| 0.3139 | 580 | 0.331 |
| 0.3144 | 581 | 0.2418 |
| 0.3149 | 582 | 0.2395 |
| 0.3155 | 583 | 0.3169 |
| 0.3160 | 584 | 0.42 |
| 0.3166 | 585 | 0.3919 |
| 0.3171 | 586 | 0.3291 |
| 0.3176 | 587 | 0.3608 |
| 0.3182 | 588 | 0.3042 |
| 0.3187 | 589 | 0.4197 |
| 0.3193 | 590 | 0.3123 |
| 0.3198 | 591 | 0.3353 |
| 0.3203 | 592 | 0.3026 |
| 0.3209 | 593 | 0.3119 |
| 0.3214 | 594 | 0.4534 |
| 0.3220 | 595 | 0.2781 |
| 0.3225 | 596 | 0.3318 |
| 0.3231 | 597 | 0.3868 |
| 0.3236 | 598 | 0.3877 |
| 0.3241 | 599 | 0.4554 |
| 0.3247 | 600 | 0.4672 |
| 0.3252 | 601 | 0.3281 |
| 0.3258 | 602 | 0.4016 |
| 0.3263 | 603 | 0.2576 |
| 0.3268 | 604 | 0.4192 |
| 0.3274 | 605 | 0.4367 |
| 0.3279 | 606 | 0.353 |
| 0.3285 | 607 | 0.3884 |
| 0.3290 | 608 | 0.3427 |
| 0.3295 | 609 | 0.2705 |
| 0.3301 | 610 | 0.3073 |
| 0.3306 | 611 | 0.2957 |
| 0.3312 | 612 | 0.4355 |
| 0.3317 | 613 | 0.3335 |
| 0.3323 | 614 | 0.3807 |
| 0.3328 | 615 | 0.2344 |
| 0.3333 | 616 | 0.3681 |
| 0.3339 | 617 | 0.3595 |
| 0.3344 | 618 | 0.4536 |
| 0.3350 | 619 | 0.3564 |
| 0.3355 | 620 | 0.3127 |
| 0.3360 | 621 | 0.3122 |
| 0.3366 | 622 | 0.371 |
| 0.3371 | 623 | 0.4385 |
| 0.3377 | 624 | 0.1756 |
| 0.3382 | 625 | 0.4321 |
| 0.3387 | 626 | 0.3797 |
| 0.3393 | 627 | 0.3364 |
| 0.3398 | 628 | 0.3783 |
| 0.3404 | 629 | 0.3627 |
| 0.3409 | 630 | 0.3808 |
| 0.3415 | 631 | 0.3031 |
| 0.3420 | 632 | 0.3321 |
| 0.3425 | 633 | 0.4365 |
| 0.3431 | 634 | 0.2778 |
| 0.3436 | 635 | 0.4142 |
| 0.3442 | 636 | 0.4154 |
| 0.3447 | 637 | 0.4557 |
| 0.3452 | 638 | 0.3573 |
| 0.3458 | 639 | 0.4492 |
| 0.3463 | 640 | 0.3779 |
| 0.3469 | 641 | 0.3539 |
| 0.3474 | 642 | 0.3162 |
| 0.3479 | 643 | 0.3476 |
| 0.3485 | 644 | 0.3196 |
| 0.3490 | 645 | 0.3256 |
| 0.3496 | 646 | 0.3995 |
| 0.3501 | 647 | 0.3244 |
| 0.3506 | 648 | 0.3697 |
| 0.3512 | 649 | 0.3228 |
| 0.3517 | 650 | 0.36 |
| 0.3523 | 651 | 0.2631 |
| 0.3528 | 652 | 0.4452 |
| 0.3534 | 653 | 0.3325 |
| 0.3539 | 654 | 0.3141 |
| 0.3544 | 655 | 0.3839 |
| 0.3550 | 656 | 0.3551 |
| 0.3555 | 657 | 0.3729 |
| 0.3561 | 658 | 0.4533 |
| 0.3566 | 659 | 0.477 |
| 0.3571 | 660 | 0.3208 |
| 0.3577 | 661 | 0.3636 |
| 0.3582 | 662 | 0.3888 |
| 0.3588 | 663 | 0.3271 |
| 0.3593 | 664 | 0.3646 |
| 0.3598 | 665 | 0.4552 |
| 0.3604 | 666 | 0.3981 |
| 0.3609 | 667 | 0.2399 |
| 0.3615 | 668 | 0.2871 |
| 0.3620 | 669 | 0.3694 |
| 0.3626 | 670 | 0.3998 |
| 0.3631 | 671 | 0.2942 |
| 0.3636 | 672 | 0.3286 |
| 0.3642 | 673 | 0.3785 |
| 0.3647 | 674 | 0.3538 |
| 0.3653 | 675 | 0.2747 |
| 0.3658 | 676 | 0.3183 |
| 0.3663 | 677 | 0.4174 |
| 0.3669 | 678 | 0.343 |
| 0.3674 | 679 | 0.3336 |
| 0.3680 | 680 | 0.4012 |
| 0.3685 | 681 | 0.3695 |
| 0.3690 | 682 | 0.4255 |
| 0.3696 | 683 | 0.3505 |
| 0.3701 | 684 | 0.2937 |
| 0.3707 | 685 | 0.298 |
| 0.3712 | 686 | 0.367 |
| 0.3718 | 687 | 0.5086 |
| 0.3723 | 688 | 0.418 |
| 0.3728 | 689 | 0.4873 |
| 0.3734 | 690 | 0.3697 |
| 0.3739 | 691 | 0.4506 |
| 0.3745 | 692 | 0.3652 |
| 0.375 | 693 | 0.444 |
| 0.3755 | 694 | 0.2769 |
| 0.3761 | 695 | 0.3379 |
| 0.3766 | 696 | 0.2788 |
| 0.3772 | 697 | 0.2828 |
| 0.3777 | 698 | 0.3301 |
| 0.3782 | 699 | 0.2256 |
| 0.3788 | 700 | 0.3264 |
| 0.3793 | 701 | 0.3045 |
| 0.3799 | 702 | 0.3137 |
| 0.3804 | 703 | 0.2509 |
| 0.3810 | 704 | 0.2266 |
| 0.3815 | 705 | 0.3861 |
| 0.3820 | 706 | 0.2714 |
| 0.3826 | 707 | 0.3494 |
| 0.3831 | 708 | 0.3505 |
| 0.3837 | 709 | 0.3442 |
| 0.3842 | 710 | 0.4278 |
| 0.3847 | 711 | 0.3468 |
| 0.3853 | 712 | 0.288 |
| 0.3858 | 713 | 0.3128 |
| 0.3864 | 714 | 0.2871 |
| 0.3869 | 715 | 0.3202 |
| 0.3874 | 716 | 0.3707 |
| 0.3880 | 717 | 0.3227 |
| 0.3885 | 718 | 0.3043 |
| 0.3891 | 719 | 0.4236 |
| 0.3896 | 720 | 0.3186 |
| 0.3902 | 721 | 0.3004 |
| 0.3907 | 722 | 0.2342 |
| 0.3912 | 723 | 0.2424 |
| 0.3918 | 724 | 0.259 |
| 0.3923 | 725 | 0.2754 |
| 0.3929 | 726 | 0.3282 |
| 0.3934 | 727 | 0.3854 |
| 0.3939 | 728 | 0.3611 |
| 0.3945 | 729 | 0.3386 |
| 0.3950 | 730 | 0.2737 |
| 0.3956 | 731 | 0.2659 |
| 0.3961 | 732 | 0.3486 |
| 0.3966 | 733 | 0.3053 |
| 0.3972 | 734 | 0.3747 |
| 0.3977 | 735 | 0.2901 |
| 0.3983 | 736 | 0.3935 |
| 0.3988 | 737 | 0.4183 |
| 0.3994 | 738 | 0.3442 |
| 0.3999 | 739 | 0.387 |
| 0.4004 | 740 | 0.3109 |
| 0.4010 | 741 | 0.3422 |
| 0.4015 | 742 | 0.2999 |
| 0.4021 | 743 | 0.3035 |
| 0.4026 | 744 | 0.368 |
| 0.4031 | 745 | 0.3345 |
| 0.4037 | 746 | 0.3881 |
| 0.4042 | 747 | 0.2789 |
| 0.4048 | 748 | 0.3644 |
| 0.4053 | 749 | 0.4616 |
| 0.4058 | 750 | 0.3665 |
| 0.4064 | 751 | 0.2822 |
| 0.4069 | 752 | 0.3898 |
| 0.4075 | 753 | 0.3016 |
| 0.4080 | 754 | 0.382 |
| 0.4085 | 755 | 0.2288 |
| 0.4091 | 756 | 0.2529 |
| 0.4096 | 757 | 0.3091 |
| 0.4102 | 758 | 0.3199 |
| 0.4107 | 759 | 0.3183 |
| 0.4113 | 760 | 0.322 |
| 0.4118 | 761 | 0.3454 |
| 0.4123 | 762 | 0.292 |
| 0.4129 | 763 | 0.322 |
| 0.4134 | 764 | 0.412 |
| 0.4140 | 765 | 0.2618 |
| 0.4145 | 766 | 0.3343 |
| 0.4150 | 767 | 0.2618 |
| 0.4156 | 768 | 0.2666 |
| 0.4161 | 769 | 0.3625 |
| 0.4167 | 770 | 0.3668 |
| 0.4172 | 771 | 0.242 |
| 0.4177 | 772 | 0.3288 |
| 0.4183 | 773 | 0.2378 |
| 0.4188 | 774 | 0.4182 |
| 0.4194 | 775 | 0.2503 |
| 0.4199 | 776 | 0.3372 |
| 0.4205 | 777 | 0.376 |
| 0.4210 | 778 | 0.3128 |
| 0.4215 | 779 | 0.4245 |
| 0.4221 | 780 | 0.2659 |
| 0.4226 | 781 | 0.2981 |
| 0.4232 | 782 | 0.371 |
| 0.4237 | 783 | 0.3893 |
| 0.4242 | 784 | 0.2595 |
| 0.4248 | 785 | 0.3101 |
| 0.4253 | 786 | 0.3973 |
| 0.4259 | 787 | 0.3702 |
| 0.4264 | 788 | 0.3141 |
| 0.4269 | 789 | 0.3044 |
| 0.4275 | 790 | 0.2422 |
| 0.4280 | 791 | 0.3277 |
| 0.4286 | 792 | 0.3433 |
| 0.4291 | 793 | 0.1456 |
| 0.4297 | 794 | 0.2729 |
| 0.4302 | 795 | 0.2927 |
| 0.4307 | 796 | 0.2207 |
| 0.4313 | 797 | 0.3188 |
| 0.4318 | 798 | 0.3016 |
| 0.4324 | 799 | 0.3031 |
| 0.4329 | 800 | 0.3304 |
| 0.4334 | 801 | 0.3583 |
| 0.4340 | 802 | 0.3126 |
| 0.4345 | 803 | 0.3465 |
| 0.4351 | 804 | 0.2763 |
| 0.4356 | 805 | 0.2691 |
| 0.4361 | 806 | 0.2807 |
| 0.4367 | 807 | 0.2768 |
| 0.4372 | 808 | 0.2153 |
| 0.4378 | 809 | 0.2691 |
| 0.4383 | 810 | 0.253 |
| 0.4389 | 811 | 0.2523 |
| 0.4394 | 812 | 0.2883 |
| 0.4399 | 813 | 0.1915 |
| 0.4405 | 814 | 0.279 |
| 0.4410 | 815 | 0.2919 |
| 0.4416 | 816 | 0.197 |
| 0.4421 | 817 | 0.2595 |
| 0.4426 | 818 | 0.3054 |
| 0.4432 | 819 | 0.2602 |
| 0.4437 | 820 | 0.3319 |
| 0.4443 | 821 | 0.2588 |
| 0.4448 | 822 | 0.2652 |
| 0.4453 | 823 | 0.3324 |
| 0.4459 | 824 | 0.3651 |
| 0.4464 | 825 | 0.3622 |
| 0.4470 | 826 | 0.3292 |
| 0.4475 | 827 | 0.3236 |
| 0.4481 | 828 | 0.2803 |
| 0.4486 | 829 | 0.3327 |
| 0.4491 | 830 | 0.2954 |
| 0.4497 | 831 | 0.2895 |
| 0.4502 | 832 | 0.3304 |
| 0.4508 | 833 | 0.1671 |
| 0.4513 | 834 | 0.3462 |
| 0.4518 | 835 | 0.2386 |
| 0.4524 | 836 | 0.2957 |
| 0.4529 | 837 | 0.3055 |
| 0.4535 | 838 | 0.3499 |
| 0.4540 | 839 | 0.2705 |
| 0.4545 | 840 | 0.2859 |
| 0.4551 | 841 | 0.2883 |
| 0.4556 | 842 | 0.3168 |
| 0.4562 | 843 | 0.3809 |
| 0.4567 | 844 | 0.3675 |
| 0.4573 | 845 | 0.3149 |
| 0.4578 | 846 | 0.3214 |
| 0.4583 | 847 | 0.3034 |
| 0.4589 | 848 | 0.2892 |
| 0.4594 | 849 | 0.3221 |
| 0.4600 | 850 | 0.4045 |
| 0.4605 | 851 | 0.3136 |
| 0.4610 | 852 | 0.219 |
| 0.4616 | 853 | 0.2452 |
| 0.4621 | 854 | 0.3133 |
| 0.4627 | 855 | 0.2659 |
| 0.4632 | 856 | 0.3777 |
| 0.4637 | 857 | 0.3309 |
| 0.4643 | 858 | 0.5038 |
| 0.4648 | 859 | 0.2956 |
| 0.4654 | 860 | 0.3282 |
| 0.4659 | 861 | 0.2139 |
| 0.4665 | 862 | 0.3297 |
| 0.4670 | 863 | 0.2965 |
| 0.4675 | 864 | 0.2847 |
| 0.4681 | 865 | 0.3407 |
| 0.4686 | 866 | 0.2593 |
| 0.4692 | 867 | 0.3223 |
| 0.4697 | 868 | 0.2832 |
| 0.4702 | 869 | 0.2503 |
| 0.4708 | 870 | 0.2626 |
| 0.4713 | 871 | 0.2828 |
| 0.4719 | 872 | 0.3443 |
| 0.4724 | 873 | 0.2684 |
| 0.4729 | 874 | 0.2216 |
| 0.4735 | 875 | 0.2828 |
| 0.4740 | 876 | 0.2433 |
| 0.4746 | 877 | 0.3476 |
| 0.4751 | 878 | 0.3005 |
| 0.4756 | 879 | 0.3113 |
| 0.4762 | 880 | 0.1482 |
| 0.4767 | 881 | 0.3019 |
| 0.4773 | 882 | 0.278 |
| 0.4778 | 883 | 0.2292 |
| 0.4784 | 884 | 0.4266 |
| 0.4789 | 885 | 0.3373 |
| 0.4794 | 886 | 0.1929 |
| 0.4800 | 887 | 0.347 |
| 0.4805 | 888 | 0.314 |
| 0.4811 | 889 | 0.4008 |
| 0.4816 | 890 | 0.2961 |
| 0.4821 | 891 | 0.2727 |
| 0.4827 | 892 | 0.3792 |
| 0.4832 | 893 | 0.2094 |
| 0.4838 | 894 | 0.2639 |
| 0.4843 | 895 | 0.2567 |
| 0.4848 | 896 | 0.3415 |
| 0.4854 | 897 | 0.3522 |
| 0.4859 | 898 | 0.1885 |
| 0.4865 | 899 | 0.2762 |
| 0.4870 | 900 | 0.2533 |
| 0.4876 | 901 | 0.2415 |
| 0.4881 | 902 | 0.2657 |
| 0.4886 | 903 | 0.3201 |
| 0.4892 | 904 | 0.2741 |
| 0.4897 | 905 | 0.2998 |
| 0.4903 | 906 | 0.2234 |
| 0.4908 | 907 | 0.3571 |
| 0.4913 | 908 | 0.3524 |
| 0.4919 | 909 | 0.2165 |
| 0.4924 | 910 | 0.1526 |
| 0.4930 | 911 | 0.3463 |
| 0.4935 | 912 | 0.3997 |
| 0.4940 | 913 | 0.3133 |
| 0.4946 | 914 | 0.2519 |
| 0.4951 | 915 | 0.1886 |
| 0.4957 | 916 | 0.253 |
| 0.4962 | 917 | 0.391 |
| 0.4968 | 918 | 0.2562 |
| 0.4973 | 919 | 0.3635 |
| 0.4978 | 920 | 0.3886 |
| 0.4984 | 921 | 0.2713 |
| 0.4989 | 922 | 0.3201 |
| 0.4995 | 923 | 0.3354 |
| 0.5 | 924 | 0.2783 |
| 0.5005 | 925 | 0.3165 |
| 0.5011 | 926 | 0.2497 |
| 0.5016 | 927 | 0.264 |
| 0.5022 | 928 | 0.3656 |
| 0.5027 | 929 | 0.2733 |
| 0.5032 | 930 | 0.2213 |
| 0.5038 | 931 | 0.1856 |
| 0.5043 | 932 | 0.1603 |
| 0.5049 | 933 | 0.2943 |
| 0.5054 | 934 | 0.2369 |
| 0.5060 | 935 | 0.2739 |
| 0.5065 | 936 | 0.1494 |
| 0.5070 | 937 | 0.3658 |
| 0.5076 | 938 | 0.2276 |
| 0.5081 | 939 | 0.2999 |
| 0.5087 | 940 | 0.2988 |
| 0.5092 | 941 | 0.2326 |
| 0.5097 | 942 | 0.2243 |
| 0.5103 | 943 | 0.1794 |
| 0.5108 | 944 | 0.3145 |
| 0.5114 | 945 | 0.3793 |
| 0.5119 | 946 | 0.2787 |
| 0.5124 | 947 | 0.1787 |
| 0.5130 | 948 | 0.2505 |
| 0.5135 | 949 | 0.3307 |
| 0.5141 | 950 | 0.2711 |
| 0.5146 | 951 | 0.3728 |
| 0.5152 | 952 | 0.3181 |
| 0.5157 | 953 | 0.3111 |
| 0.5162 | 954 | 0.3536 |
| 0.5168 | 955 | 0.381 |
| 0.5173 | 956 | 0.3083 |
| 0.5179 | 957 | 0.2131 |
| 0.5184 | 958 | 0.3132 |
| 0.5189 | 959 | 0.2763 |
| 0.5195 | 960 | 0.3422 |
| 0.5200 | 961 | 0.2276 |
| 0.5206 | 962 | 0.3179 |
| 0.5211 | 963 | 0.3098 |
| 0.5216 | 964 | 0.194 |
| 0.5222 | 965 | 0.2941 |
| 0.5227 | 966 | 0.2221 |
| 0.5233 | 967 | 0.392 |
| 0.5238 | 968 | 0.4026 |
| 0.5244 | 969 | 0.2968 |
| 0.5249 | 970 | 0.2378 |
| 0.5254 | 971 | 0.3238 |
| 0.5260 | 972 | 0.3739 |
| 0.5265 | 973 | 0.2882 |
| 0.5271 | 974 | 0.1461 |
| 0.5276 | 975 | 0.2001 |
| 0.5281 | 976 | 0.2706 |
| 0.5287 | 977 | 0.2886 |
| 0.5292 | 978 | 0.2045 |
| 0.5298 | 979 | 0.2768 |
| 0.5303 | 980 | 0.2673 |
| 0.5308 | 981 | 0.4411 |
| 0.5314 | 982 | 0.3067 |
| 0.5319 | 983 | 0.1799 |
| 0.5325 | 984 | 0.2184 |
| 0.5330 | 985 | 0.1722 |
| 0.5335 | 986 | 0.2289 |
| 0.5341 | 987 | 0.4321 |
| 0.5346 | 988 | 0.3145 |
| 0.5352 | 989 | 0.3244 |
| 0.5357 | 990 | 0.236 |
| 0.5363 | 991 | 0.2328 |
| 0.5368 | 992 | 0.361 |
| 0.5373 | 993 | 0.2481 |
| 0.5379 | 994 | 0.3673 |
| 0.5384 | 995 | 0.288 |
| 0.5390 | 996 | 0.3288 |
| 0.5395 | 997 | 0.3113 |
| 0.5400 | 998 | 0.2939 |
| 0.5406 | 999 | 0.257 |
| 0.5411 | 1000 | 0.1692 |
| 0.5417 | 1001 | 0.2646 |
| 0.5422 | 1002 | 0.2235 |
| 0.5427 | 1003 | 0.2998 |
| 0.5433 | 1004 | 0.3219 |
| 0.5438 | 1005 | 0.2948 |
| 0.5444 | 1006 | 0.2504 |
| 0.5449 | 1007 | 0.2156 |
| 0.5455 | 1008 | 0.1564 |
| 0.5460 | 1009 | 0.3906 |
| 0.5465 | 1010 | 0.1858 |
| 0.5471 | 1011 | 0.2905 |
| 0.5476 | 1012 | 0.4178 |
| 0.5482 | 1013 | 0.3966 |
| 0.5487 | 1014 | 0.3453 |
| 0.5492 | 1015 | 0.3134 |
| 0.5498 | 1016 | 0.3551 |
| 0.5503 | 1017 | 0.3492 |
| 0.5509 | 1018 | 0.2311 |
| 0.5514 | 1019 | 0.3288 |
| 0.5519 | 1020 | 0.41 |
| 0.5525 | 1021 | 0.3142 |
| 0.5530 | 1022 | 0.2731 |
| 0.5536 | 1023 | 0.2918 |
| 0.5541 | 1024 | 0.2448 |
| 0.5547 | 1025 | 0.2518 |
| 0.5552 | 1026 | 0.2748 |
| 0.5557 | 1027 | 0.258 |
| 0.5563 | 1028 | 0.3213 |
| 0.5568 | 1029 | 0.2204 |
| 0.5574 | 1030 | 0.2341 |
| 0.5579 | 1031 | 0.2834 |
| 0.5584 | 1032 | 0.2263 |
| 0.5590 | 1033 | 0.3555 |
| 0.5595 | 1034 | 0.1856 |
| 0.5601 | 1035 | 0.3415 |
| 0.5606 | 1036 | 0.2918 |
| 0.5611 | 1037 | 0.3452 |
| 0.5617 | 1038 | 0.2668 |
| 0.5622 | 1039 | 0.3271 |
| 0.5628 | 1040 | 0.3698 |
| 0.5633 | 1041 | 0.2592 |
| 0.5639 | 1042 | 0.216 |
| 0.5644 | 1043 | 0.2528 |
| 0.5649 | 1044 | 0.3076 |
| 0.5655 | 1045 | 0.2379 |
| 0.5660 | 1046 | 0.2075 |
| 0.5666 | 1047 | 0.339 |
| 0.5671 | 1048 | 0.2381 |
| 0.5676 | 1049 | 0.2966 |
| 0.5682 | 1050 | 0.2599 |
| 0.5687 | 1051 | 0.2928 |
| 0.5693 | 1052 | 0.2545 |
| 0.5698 | 1053 | 0.265 |
| 0.5703 | 1054 | 0.2329 |
| 0.5709 | 1055 | 0.2876 |
| 0.5714 | 1056 | 0.2826 |
| 0.5720 | 1057 | 0.1941 |
| 0.5725 | 1058 | 0.2631 |
| 0.5731 | 1059 | 0.2492 |
| 0.5736 | 1060 | 0.2736 |
| 0.5741 | 1061 | 0.232 |
| 0.5747 | 1062 | 0.2157 |
| 0.5752 | 1063 | 0.2275 |
| 0.5758 | 1064 | 0.1625 |
| 0.5763 | 1065 | 0.2494 |
| 0.5768 | 1066 | 0.2295 |
| 0.5774 | 1067 | 0.2406 |
| 0.5779 | 1068 | 0.3003 |
| 0.5785 | 1069 | 0.3435 |
| 0.5790 | 1070 | 0.3305 |
| 0.5795 | 1071 | 0.3354 |
| 0.5801 | 1072 | 0.2647 |
| 0.5806 | 1073 | 0.1936 |
| 0.5812 | 1074 | 0.2621 |
| 0.5817 | 1075 | 0.2469 |
| 0.5823 | 1076 | 0.412 |
| 0.5828 | 1077 | 0.2876 |
| 0.5833 | 1078 | 0.2778 |
| 0.5839 | 1079 | 0.2909 |
| 0.5844 | 1080 | 0.2386 |
| 0.5850 | 1081 | 0.2368 |
| 0.5855 | 1082 | 0.241 |
| 0.5860 | 1083 | 0.2734 |
| 0.5866 | 1084 | 0.2463 |
| 0.5871 | 1085 | 0.2316 |
| 0.5877 | 1086 | 0.2558 |
| 0.5882 | 1087 | 0.1824 |
| 0.5887 | 1088 | 0.2948 |
| 0.5893 | 1089 | 0.2599 |
| 0.5898 | 1090 | 0.4162 |
| 0.5904 | 1091 | 0.2369 |
| 0.5909 | 1092 | 0.2311 |
| 0.5915 | 1093 | 0.1987 |
| 0.5920 | 1094 | 0.3138 |
| 0.5925 | 1095 | 0.2675 |
| 0.5931 | 1096 | 0.2982 |
| 0.5936 | 1097 | 0.1959 |
| 0.5942 | 1098 | 0.1799 |
| 0.5947 | 1099 | 0.2605 |
| 0.5952 | 1100 | 0.206 |
| 0.5958 | 1101 | 0.231 |
| 0.5963 | 1102 | 0.2363 |
| 0.5969 | 1103 | 0.2812 |
| 0.5974 | 1104 | 0.2339 |
| 0.5979 | 1105 | 0.3221 |
| 0.5985 | 1106 | 0.2915 |
| 0.5990 | 1107 | 0.3309 |
| 0.5996 | 1108 | 0.2318 |
| 0.6001 | 1109 | 0.2752 |
| 0.6006 | 1110 | 0.1957 |
| 0.6012 | 1111 | 0.265 |
| 0.6017 | 1112 | 0.3649 |
| 0.6023 | 1113 | 0.1695 |
| 0.6028 | 1114 | 0.2018 |
| 0.6034 | 1115 | 0.2359 |
| 0.6039 | 1116 | 0.2122 |
| 0.6044 | 1117 | 0.2057 |
| 0.6050 | 1118 | 0.204 |
| 0.6055 | 1119 | 0.23 |
| 0.6061 | 1120 | 0.2983 |
| 0.6066 | 1121 | 0.2853 |
| 0.6071 | 1122 | 0.2189 |
| 0.6077 | 1123 | 0.237 |
| 0.6082 | 1124 | 0.2299 |
| 0.6088 | 1125 | 0.2976 |
| 0.6093 | 1126 | 0.2604 |
| 0.6098 | 1127 | 0.3122 |
| 0.6104 | 1128 | 0.1538 |
| 0.6109 | 1129 | 0.2575 |
| 0.6115 | 1130 | 0.2361 |
| 0.6120 | 1131 | 0.239 |
| 0.6126 | 1132 | 0.2127 |
| 0.6131 | 1133 | 0.1329 |
| 0.6136 | 1134 | 0.2488 |
| 0.6142 | 1135 | 0.1767 |
| 0.6147 | 1136 | 0.2685 |
| 0.6153 | 1137 | 0.1614 |
| 0.6158 | 1138 | 0.219 |
| 0.6163 | 1139 | 0.1816 |
| 0.6169 | 1140 | 0.3093 |
| 0.6174 | 1141 | 0.2004 |
| 0.6180 | 1142 | 0.1699 |
| 0.6185 | 1143 | 0.1965 |
| 0.6190 | 1144 | 0.2868 |
| 0.6196 | 1145 | 0.2281 |
| 0.6201 | 1146 | 0.2095 |
| 0.6207 | 1147 | 0.2903 |
| 0.6212 | 1148 | 0.2944 |
| 0.6218 | 1149 | 0.2932 |
| 0.6223 | 1150 | 0.2656 |
| 0.6228 | 1151 | 0.2554 |
| 0.6234 | 1152 | 0.2429 |
| 0.6239 | 1153 | 0.1766 |
| 0.6245 | 1154 | 0.2101 |
| 0.625 | 1155 | 0.2743 |
| 0.6255 | 1156 | 0.2446 |
| 0.6261 | 1157 | 0.3523 |
| 0.6266 | 1158 | 0.2639 |
| 0.6272 | 1159 | 0.2127 |
| 0.6277 | 1160 | 0.2253 |
| 0.6282 | 1161 | 0.3194 |
| 0.6288 | 1162 | 0.2306 |
| 0.6293 | 1163 | 0.2249 |
| 0.6299 | 1164 | 0.1971 |
| 0.6304 | 1165 | 0.2229 |
| 0.6310 | 1166 | 0.2953 |
| 0.6315 | 1167 | 0.2222 |
| 0.6320 | 1168 | 0.2327 |
| 0.6326 | 1169 | 0.2714 |
| 0.6331 | 1170 | 0.2899 |
| 0.6337 | 1171 | 0.3335 |
| 0.6342 | 1172 | 0.3338 |
| 0.6347 | 1173 | 0.2522 |
| 0.6353 | 1174 | 0.2157 |
| 0.6358 | 1175 | 0.2871 |
| 0.6364 | 1176 | 0.1949 |
| 0.6369 | 1177 | 0.2765 |
| 0.6374 | 1178 | 0.2376 |
| 0.6380 | 1179 | 0.3025 |
| 0.6385 | 1180 | 0.2696 |
| 0.6391 | 1181 | 0.2499 |
| 0.6396 | 1182 | 0.1471 |
| 0.6402 | 1183 | 0.2394 |
| 0.6407 | 1184 | 0.2604 |
| 0.6412 | 1185 | 0.2578 |
| 0.6418 | 1186 | 0.2392 |
| 0.6423 | 1187 | 0.1738 |
| 0.6429 | 1188 | 0.1667 |
| 0.6434 | 1189 | 0.196 |
| 0.6439 | 1190 | 0.2591 |
| 0.6445 | 1191 | 0.308 |
| 0.6450 | 1192 | 0.192 |
| 0.6456 | 1193 | 0.2112 |
| 0.6461 | 1194 | 0.2094 |
| 0.6466 | 1195 | 0.2245 |
| 0.6472 | 1196 | 0.3237 |
| 0.6477 | 1197 | 0.2541 |
| 0.6483 | 1198 | 0.2046 |
| 0.6488 | 1199 | 0.2107 |
| 0.6494 | 1200 | 0.2381 |
| 0.6499 | 1201 | 0.2709 |
| 0.6504 | 1202 | 0.2133 |
| 0.6510 | 1203 | 0.2962 |
| 0.6515 | 1204 | 0.2465 |
| 0.6521 | 1205 | 0.1885 |
| 0.6526 | 1206 | 0.2204 |
| 0.6531 | 1207 | 0.2382 |
| 0.6537 | 1208 | 0.2423 |
| 0.6542 | 1209 | 0.1693 |
| 0.6548 | 1210 | 0.193 |
| 0.6553 | 1211 | 0.1997 |
| 0.6558 | 1212 | 0.2023 |
| 0.6564 | 1213 | 0.3033 |
| 0.6569 | 1214 | 0.3238 |
| 0.6575 | 1215 | 0.272 |
| 0.6580 | 1216 | 0.1372 |
| 0.6585 | 1217 | 0.2613 |
| 0.6591 | 1218 | 0.1268 |
| 0.6596 | 1219 | 0.1873 |
| 0.6602 | 1220 | 0.2099 |
| 0.6607 | 1221 | 0.2877 |
| 0.6613 | 1222 | 0.2513 |
| 0.6618 | 1223 | 0.1897 |
| 0.6623 | 1224 | 0.2419 |
| 0.6629 | 1225 | 0.1538 |
| 0.6634 | 1226 | 0.2188 |
| 0.6640 | 1227 | 0.3067 |
| 0.6645 | 1228 | 0.1894 |
| 0.6650 | 1229 | 0.1053 |
| 0.6656 | 1230 | 0.2598 |
| 0.6661 | 1231 | 0.1724 |
| 0.6667 | 1232 | 0.3421 |
| 0.6672 | 1233 | 0.2734 |
| 0.6677 | 1234 | 0.2917 |
| 0.6683 | 1235 | 0.2467 |
| 0.6688 | 1236 | 0.2112 |
| 0.6694 | 1237 | 0.202 |
| 0.6699 | 1238 | 0.1947 |
| 0.6705 | 1239 | 0.245 |
| 0.6710 | 1240 | 0.1832 |
| 0.6715 | 1241 | 0.3078 |
| 0.6721 | 1242 | 0.2293 |
| 0.6726 | 1243 | 0.2039 |
| 0.6732 | 1244 | 0.2496 |
| 0.6737 | 1245 | 0.2263 |
| 0.6742 | 1246 | 0.1352 |
| 0.6748 | 1247 | 0.2791 |
| 0.6753 | 1248 | 0.2378 |
| 0.6759 | 1249 | 0.2485 |
| 0.6764 | 1250 | 0.1771 |
| 0.6769 | 1251 | 0.1891 |
| 0.6775 | 1252 | 0.1827 |
| 0.6780 | 1253 | 0.2964 |
| 0.6786 | 1254 | 0.1814 |
| 0.6791 | 1255 | 0.2227 |
| 0.6797 | 1256 | 0.2982 |
| 0.6802 | 1257 | 0.2574 |
| 0.6807 | 1258 | 0.1615 |
| 0.6813 | 1259 | 0.2493 |
| 0.6818 | 1260 | 0.2416 |
| 0.6824 | 1261 | 0.1774 |
| 0.6829 | 1262 | 0.2875 |
| 0.6834 | 1263 | 0.2445 |
| 0.6840 | 1264 | 0.2029 |
| 0.6845 | 1265 | 0.1615 |
| 0.6851 | 1266 | 0.3034 |
| 0.6856 | 1267 | 0.2166 |
| 0.6861 | 1268 | 0.243 |
| 0.6867 | 1269 | 0.3196 |
| 0.6872 | 1270 | 0.1158 |
| 0.6878 | 1271 | 0.2559 |
| 0.6883 | 1272 | 0.2497 |
| 0.6889 | 1273 | 0.2773 |
| 0.6894 | 1274 | 0.2716 |
| 0.6899 | 1275 | 0.2651 |
| 0.6905 | 1276 | 0.2193 |
| 0.6910 | 1277 | 0.1634 |
| 0.6916 | 1278 | 0.227 |
| 0.6921 | 1279 | 0.2465 |
| 0.6926 | 1280 | 0.2472 |
| 0.6932 | 1281 | 0.2083 |
| 0.6937 | 1282 | 0.2587 |
| 0.6943 | 1283 | 0.1974 |
| 0.6948 | 1284 | 0.2369 |
| 0.6953 | 1285 | 0.2944 |
| 0.6959 | 1286 | 0.1743 |
| 0.6964 | 1287 | 0.2496 |
| 0.6970 | 1288 | 0.2777 |
| 0.6975 | 1289 | 0.2424 |
| 0.6981 | 1290 | 0.2735 |
| 0.6986 | 1291 | 0.2028 |
| 0.6991 | 1292 | 0.2498 |
| 0.6997 | 1293 | 0.2076 |
| 0.7002 | 1294 | 0.2811 |
| 0.7008 | 1295 | 0.1824 |
| 0.7013 | 1296 | 0.2319 |
| 0.7018 | 1297 | 0.1685 |
| 0.7024 | 1298 | 0.1954 |
| 0.7029 | 1299 | 0.1111 |
| 0.7035 | 1300 | 0.1637 |
| 0.7040 | 1301 | 0.2823 |
| 0.7045 | 1302 | 0.1236 |
| 0.7051 | 1303 | 0.183 |
| 0.7056 | 1304 | 0.1348 |
| 0.7062 | 1305 | 0.2877 |
| 0.7067 | 1306 | 0.1855 |
| 0.7073 | 1307 | 0.2708 |
| 0.7078 | 1308 | 0.3753 |
| 0.7083 | 1309 | 0.1908 |
| 0.7089 | 1310 | 0.2376 |
| 0.7094 | 1311 | 0.1913 |
| 0.7100 | 1312 | 0.3606 |
| 0.7105 | 1313 | 0.2773 |
| 0.7110 | 1314 | 0.2697 |
| 0.7116 | 1315 | 0.1975 |
| 0.7121 | 1316 | 0.1538 |
| 0.7127 | 1317 | 0.1859 |
| 0.7132 | 1318 | 0.2542 |
| 0.7137 | 1319 | 0.2822 |
| 0.7143 | 1320 | 0.2801 |
| 0.7148 | 1321 | 0.283 |
| 0.7154 | 1322 | 0.2867 |
| 0.7159 | 1323 | 0.2219 |
| 0.7165 | 1324 | 0.1737 |
| 0.7170 | 1325 | 0.2847 |
| 0.7175 | 1326 | 0.1857 |
| 0.7181 | 1327 | 0.2882 |
| 0.7186 | 1328 | 0.1674 |
| 0.7192 | 1329 | 0.2335 |
| 0.7197 | 1330 | 0.2781 |
| 0.7202 | 1331 | 0.2415 |
| 0.7208 | 1332 | 0.1948 |
| 0.7213 | 1333 | 0.1982 |
| 0.7219 | 1334 | 0.1902 |
| 0.7224 | 1335 | 0.2258 |
| 0.7229 | 1336 | 0.2633 |
| 0.7235 | 1337 | 0.2871 |
| 0.7240 | 1338 | 0.1722 |
| 0.7246 | 1339 | 0.2522 |
| 0.7251 | 1340 | 0.2276 |
| 0.7256 | 1341 | 0.1596 |
| 0.7262 | 1342 | 0.1786 |
| 0.7267 | 1343 | 0.2323 |
| 0.7273 | 1344 | 0.2323 |
| 0.7278 | 1345 | 0.2707 |
| 0.7284 | 1346 | 0.3087 |
| 0.7289 | 1347 | 0.2989 |
| 0.7294 | 1348 | 0.2444 |
| 0.7300 | 1349 | 0.2781 |
| 0.7305 | 1350 | 0.2269 |
| 0.7311 | 1351 | 0.1858 |
| 0.7316 | 1352 | 0.2539 |
| 0.7321 | 1353 | 0.2256 |
| 0.7327 | 1354 | 0.175 |
| 0.7332 | 1355 | 0.1923 |
| 0.7338 | 1356 | 0.279 |
| 0.7343 | 1357 | 0.2356 |
| 0.7348 | 1358 | 0.223 |
| 0.7354 | 1359 | 0.172 |
| 0.7359 | 1360 | 0.2756 |
| 0.7365 | 1361 | 0.3298 |
| 0.7370 | 1362 | 0.1619 |
| 0.7376 | 1363 | 0.1524 |
| 0.7381 | 1364 | 0.2297 |
| 0.7386 | 1365 | 0.2593 |
| 0.7392 | 1366 | 0.183 |
| 0.7397 | 1367 | 0.2744 |
| 0.7403 | 1368 | 0.2599 |
| 0.7408 | 1369 | 0.1933 |
| 0.7413 | 1370 | 0.2624 |
| 0.7419 | 1371 | 0.2616 |
| 0.7424 | 1372 | 0.1848 |
| 0.7430 | 1373 | 0.2129 |
| 0.7435 | 1374 | 0.2072 |
| 0.7440 | 1375 | 0.2606 |
| 0.7446 | 1376 | 0.2894 |
| 0.7451 | 1377 | 0.1475 |
| 0.7457 | 1378 | 0.2554 |
| 0.7462 | 1379 | 0.247 |
| 0.7468 | 1380 | 0.272 |
| 0.7473 | 1381 | 0.169 |
| 0.7478 | 1382 | 0.2267 |
| 0.7484 | 1383 | 0.2598 |
| 0.7489 | 1384 | 0.2338 |
| 0.7495 | 1385 | 0.1499 |
| 0.75 | 1386 | 0.2574 |
| 0.7505 | 1387 | 0.2158 |
| 0.7511 | 1388 | 0.2934 |
| 0.7516 | 1389 | 0.2076 |
| 0.7522 | 1390 | 0.1381 |
| 0.7527 | 1391 | 0.2106 |
| 0.7532 | 1392 | 0.417 |
| 0.7538 | 1393 | 0.2188 |
| 0.7543 | 1394 | 0.1214 |
| 0.7549 | 1395 | 0.2531 |
| 0.7554 | 1396 | 0.3031 |
| 0.7560 | 1397 | 0.3141 |
| 0.7565 | 1398 | 0.1921 |
| 0.7570 | 1399 | 0.2428 |
| 0.7576 | 1400 | 0.2414 |
| 0.7581 | 1401 | 0.1741 |
| 0.7587 | 1402 | 0.1865 |
| 0.7592 | 1403 | 0.1522 |
| 0.7597 | 1404 | 0.2314 |
| 0.7603 | 1405 | 0.2073 |
| 0.7608 | 1406 | 0.2321 |
| 0.7614 | 1407 | 0.3196 |
| 0.7619 | 1408 | 0.2849 |
| 0.7624 | 1409 | 0.275 |
| 0.7630 | 1410 | 0.132 |
| 0.7635 | 1411 | 0.2076 |
| 0.7641 | 1412 | 0.1814 |
| 0.7646 | 1413 | 0.2447 |
| 0.7652 | 1414 | 0.1721 |
| 0.7657 | 1415 | 0.1308 |
| 0.7662 | 1416 | 0.2891 |
| 0.7668 | 1417 | 0.1335 |
| 0.7673 | 1418 | 0.2577 |
| 0.7679 | 1419 | 0.1922 |
| 0.7684 | 1420 | 0.2042 |
| 0.7689 | 1421 | 0.294 |
| 0.7695 | 1422 | 0.2194 |
| 0.7700 | 1423 | 0.3001 |
| 0.7706 | 1424 | 0.1939 |
| 0.7711 | 1425 | 0.2993 |
| 0.7716 | 1426 | 0.2047 |
| 0.7722 | 1427 | 0.2043 |
| 0.7727 | 1428 | 0.2916 |
| 0.7733 | 1429 | 0.1857 |
| 0.7738 | 1430 | 0.1699 |
| 0.7744 | 1431 | 0.1641 |
| 0.7749 | 1432 | 0.22 |
| 0.7754 | 1433 | 0.2524 |
| 0.7760 | 1434 | 0.1864 |
| 0.7765 | 1435 | 0.33 |
| 0.7771 | 1436 | 0.1361 |
| 0.7776 | 1437 | 0.2796 |
| 0.7781 | 1438 | 0.2411 |
| 0.7787 | 1439 | 0.2349 |
| 0.7792 | 1440 | 0.2283 |
| 0.7798 | 1441 | 0.2401 |
| 0.7803 | 1442 | 0.2155 |
| 0.7808 | 1443 | 0.2473 |
| 0.7814 | 1444 | 0.1364 |
| 0.7819 | 1445 | 0.1476 |
| 0.7825 | 1446 | 0.1989 |
| 0.7830 | 1447 | 0.2432 |
| 0.7835 | 1448 | 0.1938 |
| 0.7841 | 1449 | 0.1552 |
| 0.7846 | 1450 | 0.3058 |
| 0.7852 | 1451 | 0.1666 |
| 0.7857 | 1452 | 0.1908 |
| 0.7863 | 1453 | 0.2164 |
| 0.7868 | 1454 | 0.2417 |
| 0.7873 | 1455 | 0.1969 |
| 0.7879 | 1456 | 0.2211 |
| 0.7884 | 1457 | 0.2758 |
| 0.7890 | 1458 | 0.1517 |
| 0.7895 | 1459 | 0.1566 |
| 0.7900 | 1460 | 0.1986 |
| 0.7906 | 1461 | 0.2431 |
| 0.7911 | 1462 | 0.2759 |
| 0.7917 | 1463 | 0.1769 |
| 0.7922 | 1464 | 0.1896 |
| 0.7927 | 1465 | 0.2472 |
| 0.7933 | 1466 | 0.1607 |
| 0.7938 | 1467 | 0.1684 |
| 0.7944 | 1468 | 0.2032 |
| 0.7949 | 1469 | 0.2087 |
| 0.7955 | 1470 | 0.1662 |
| 0.7960 | 1471 | 0.156 |
| 0.7965 | 1472 | 0.179 |
| 0.7971 | 1473 | 0.2237 |
| 0.7976 | 1474 | 0.1914 |
| 0.7982 | 1475 | 0.1255 |
| 0.7987 | 1476 | 0.2033 |
| 0.7992 | 1477 | 0.2079 |
| 0.7998 | 1478 | 0.1628 |
| 0.8003 | 1479 | 0.2411 |
| 0.8009 | 1480 | 0.2189 |
| 0.8014 | 1481 | 0.2319 |
| 0.8019 | 1482 | 0.277 |
| 0.8025 | 1483 | 0.2753 |
| 0.8030 | 1484 | 0.2106 |
| 0.8036 | 1485 | 0.2039 |
| 0.8041 | 1486 | 0.256 |
| 0.8047 | 1487 | 0.1349 |
| 0.8052 | 1488 | 0.3065 |
| 0.8057 | 1489 | 0.2778 |
| 0.8063 | 1490 | 0.1481 |
| 0.8068 | 1491 | 0.2212 |
| 0.8074 | 1492 | 0.2786 |
| 0.8079 | 1493 | 0.3017 |
| 0.8084 | 1494 | 0.2164 |
| 0.8090 | 1495 | 0.1563 |
| 0.8095 | 1496 | 0.2038 |
| 0.8101 | 1497 | 0.1745 |
| 0.8106 | 1498 | 0.2678 |
| 0.8111 | 1499 | 0.1744 |
| 0.8117 | 1500 | 0.2337 |
| 0.8122 | 1501 | 0.1605 |
| 0.8128 | 1502 | 0.2224 |
| 0.8133 | 1503 | 0.1323 |
| 0.8139 | 1504 | 0.1817 |
| 0.8144 | 1505 | 0.1549 |
| 0.8149 | 1506 | 0.2425 |
| 0.8155 | 1507 | 0.3393 |
| 0.8160 | 1508 | 0.2422 |
| 0.8166 | 1509 | 0.1692 |
| 0.8171 | 1510 | 0.2741 |
| 0.8176 | 1511 | 0.2612 |
| 0.8182 | 1512 | 0.2111 |
| 0.8187 | 1513 | 0.2094 |
| 0.8193 | 1514 | 0.2591 |
| 0.8198 | 1515 | 0.1785 |
| 0.8203 | 1516 | 0.2598 |
| 0.8209 | 1517 | 0.2414 |
| 0.8214 | 1518 | 0.2205 |
| 0.8220 | 1519 | 0.2149 |
| 0.8225 | 1520 | 0.3023 |
| 0.8231 | 1521 | 0.1663 |
| 0.8236 | 1522 | 0.1992 |
| 0.8241 | 1523 | 0.2077 |
| 0.8247 | 1524 | 0.2264 |
| 0.8252 | 1525 | 0.2379 |
| 0.8258 | 1526 | 0.2704 |
| 0.8263 | 1527 | 0.2075 |
| 0.8268 | 1528 | 0.2057 |
| 0.8274 | 1529 | 0.1929 |
| 0.8279 | 1530 | 0.2734 |
| 0.8285 | 1531 | 0.231 |
| 0.8290 | 1532 | 0.1712 |
| 0.8295 | 1533 | 0.1946 |
| 0.8301 | 1534 | 0.2674 |
| 0.8306 | 1535 | 0.283 |
| 0.8312 | 1536 | 0.2043 |
| 0.8317 | 1537 | 0.324 |
| 0.8323 | 1538 | 0.1983 |
| 0.8328 | 1539 | 0.2451 |
| 0.8333 | 1540 | 0.1818 |
| 0.8339 | 1541 | 0.1934 |
| 0.8344 | 1542 | 0.187 |
| 0.8350 | 1543 | 0.1651 |
| 0.8355 | 1544 | 0.1877 |
| 0.8360 | 1545 | 0.2054 |
| 0.8366 | 1546 | 0.134 |
| 0.8371 | 1547 | 0.1894 |
| 0.8377 | 1548 | 0.1858 |
| 0.8382 | 1549 | 0.2407 |
| 0.8387 | 1550 | 0.1549 |
| 0.8393 | 1551 | 0.1154 |
| 0.8398 | 1552 | 0.2316 |
| 0.8404 | 1553 | 0.201 |
| 0.8409 | 1554 | 0.1 |
| 0.8415 | 1555 | 0.1839 |
| 0.8420 | 1556 | 0.1961 |
| 0.8425 | 1557 | 0.2293 |
| 0.8431 | 1558 | 0.3026 |
| 0.8436 | 1559 | 0.0939 |
| 0.8442 | 1560 | 0.1977 |
| 0.8447 | 1561 | 0.2081 |
| 0.8452 | 1562 | 0.3763 |
| 0.8458 | 1563 | 0.2351 |
| 0.8463 | 1564 | 0.1679 |
| 0.8469 | 1565 | 0.2331 |
| 0.8474 | 1566 | 0.1929 |
| 0.8479 | 1567 | 0.2727 |
| 0.8485 | 1568 | 0.2027 |
| 0.8490 | 1569 | 0.2342 |
| 0.8496 | 1570 | 0.1408 |
| 0.8501 | 1571 | 0.1504 |
| 0.8506 | 1572 | 0.2081 |
| 0.8512 | 1573 | 0.3252 |
| 0.8517 | 1574 | 0.1838 |
| 0.8523 | 1575 | 0.2124 |
| 0.8528 | 1576 | 0.2678 |
| 0.8534 | 1577 | 0.2415 |
| 0.8539 | 1578 | 0.1886 |
| 0.8544 | 1579 | 0.2611 |
| 0.8550 | 1580 | 0.1391 |
| 0.8555 | 1581 | 0.2831 |
| 0.8561 | 1582 | 0.2946 |
| 0.8566 | 1583 | 0.1974 |
| 0.8571 | 1584 | 0.2054 |
| 0.8577 | 1585 | 0.2207 |
| 0.8582 | 1586 | 0.2402 |
| 0.8588 | 1587 | 0.1869 |
| 0.8593 | 1588 | 0.1488 |
| 0.8598 | 1589 | 0.2525 |
| 0.8604 | 1590 | 0.2148 |
| 0.8609 | 1591 | 0.2051 |
| 0.8615 | 1592 | 0.1768 |
| 0.8620 | 1593 | 0.167 |
| 0.8626 | 1594 | 0.216 |
| 0.8631 | 1595 | 0.1512 |
| 0.8636 | 1596 | 0.1886 |
| 0.8642 | 1597 | 0.1712 |
| 0.8647 | 1598 | 0.2057 |
| 0.8653 | 1599 | 0.2316 |
| 0.8658 | 1600 | 0.2064 |
| 0.8663 | 1601 | 0.2309 |
| 0.8669 | 1602 | 0.2696 |
| 0.8674 | 1603 | 0.2742 |
| 0.8680 | 1604 | 0.2463 |
| 0.8685 | 1605 | 0.2402 |
| 0.8690 | 1606 | 0.1479 |
| 0.8696 | 1607 | 0.1613 |
| 0.8701 | 1608 | 0.2113 |
| 0.8707 | 1609 | 0.2015 |
| 0.8712 | 1610 | 0.1901 |
| 0.8718 | 1611 | 0.1712 |
| 0.8723 | 1612 | 0.2733 |
| 0.8728 | 1613 | 0.2314 |
| 0.8734 | 1614 | 0.1676 |
| 0.8739 | 1615 | 0.2809 |
| 0.8745 | 1616 | 0.1739 |
| 0.875 | 1617 | 0.2759 |
| 0.8755 | 1618 | 0.245 |
| 0.8761 | 1619 | 0.2195 |
| 0.8766 | 1620 | 0.1651 |
| 0.8772 | 1621 | 0.2656 |
| 0.8777 | 1622 | 0.2605 |
| 0.8782 | 1623 | 0.2615 |
| 0.8788 | 1624 | 0.2298 |
| 0.8793 | 1625 | 0.2668 |
| 0.8799 | 1626 | 0.1399 |
| 0.8804 | 1627 | 0.2897 |
| 0.8810 | 1628 | 0.1913 |
| 0.8815 | 1629 | 0.1974 |
| 0.8820 | 1630 | 0.2694 |
| 0.8826 | 1631 | 0.1627 |
| 0.8831 | 1632 | 0.2016 |
| 0.8837 | 1633 | 0.2794 |
| 0.8842 | 1634 | 0.1409 |
| 0.8847 | 1635 | 0.199 |
| 0.8853 | 1636 | 0.136 |
| 0.8858 | 1637 | 0.2331 |
| 0.8864 | 1638 | 0.2306 |
| 0.8869 | 1639 | 0.1683 |
| 0.8874 | 1640 | 0.2567 |
| 0.8880 | 1641 | 0.2492 |
| 0.8885 | 1642 | 0.1601 |
| 0.8891 | 1643 | 0.1808 |
| 0.8896 | 1644 | 0.1861 |
| 0.8902 | 1645 | 0.2098 |
| 0.8907 | 1646 | 0.1862 |
| 0.8912 | 1647 | 0.195 |
| 0.8918 | 1648 | 0.2567 |
| 0.8923 | 1649 | 0.2327 |
| 0.8929 | 1650 | 0.2139 |
| 0.8934 | 1651 | 0.1437 |
| 0.8939 | 1652 | 0.2254 |
| 0.8945 | 1653 | 0.2138 |
| 0.8950 | 1654 | 0.181 |
| 0.8956 | 1655 | 0.201 |
| 0.8961 | 1656 | 0.2941 |
| 0.8966 | 1657 | 0.2677 |
| 0.8972 | 1658 | 0.1932 |
| 0.8977 | 1659 | 0.1534 |
| 0.8983 | 1660 | 0.1721 |
| 0.8988 | 1661 | 0.2207 |
| 0.8994 | 1662 | 0.1859 |
| 0.8999 | 1663 | 0.1869 |
| 0.9004 | 1664 | 0.1898 |
| 0.9010 | 1665 | 0.29 |
| 0.9015 | 1666 | 0.19 |
| 0.9021 | 1667 | 0.2346 |
| 0.9026 | 1668 | 0.1738 |
| 0.9031 | 1669 | 0.1867 |
| 0.9037 | 1670 | 0.1852 |
| 0.9042 | 1671 | 0.1846 |
| 0.9048 | 1672 | 0.1879 |
| 0.9053 | 1673 | 0.1733 |
| 0.9058 | 1674 | 0.27 |
| 0.9064 | 1675 | 0.2005 |
| 0.9069 | 1676 | 0.1661 |
| 0.9075 | 1677 | 0.1841 |
| 0.9080 | 1678 | 0.1545 |
| 0.9085 | 1679 | 0.1856 |
| 0.9091 | 1680 | 0.1827 |
| 0.9096 | 1681 | 0.2856 |
| 0.9102 | 1682 | 0.3265 |
| 0.9107 | 1683 | 0.2755 |
| 0.9113 | 1684 | 0.1453 |
| 0.9118 | 1685 | 0.2113 |
| 0.9123 | 1686 | 0.3191 |
| 0.9129 | 1687 | 0.1785 |
| 0.9134 | 1688 | 0.1874 |
| 0.9140 | 1689 | 0.2242 |
| 0.9145 | 1690 | 0.1746 |
| 0.9150 | 1691 | 0.2292 |
| 0.9156 | 1692 | 0.1993 |
| 0.9161 | 1693 | 0.2461 |
| 0.9167 | 1694 | 0.258 |
| 0.9172 | 1695 | 0.1919 |
| 0.9177 | 1696 | 0.2632 |
| 0.9183 | 1697 | 0.2513 |
| 0.9188 | 1698 | 0.2052 |
| 0.9194 | 1699 | 0.2297 |
| 0.9199 | 1700 | 0.218 |
| 0.9205 | 1701 | 0.2182 |
| 0.9210 | 1702 | 0.2033 |
| 0.9215 | 1703 | 0.2081 |
| 0.9221 | 1704 | 0.1975 |
| 0.9226 | 1705 | 0.2011 |
| 0.9232 | 1706 | 0.1438 |
| 0.9237 | 1707 | 0.2041 |
| 0.9242 | 1708 | 0.1605 |
| 0.9248 | 1709 | 0.2261 |
| 0.9253 | 1710 | 0.1288 |
| 0.9259 | 1711 | 0.1856 |
| 0.9264 | 1712 | 0.1708 |
| 0.9269 | 1713 | 0.1735 |
| 0.9275 | 1714 | 0.2421 |
| 0.9280 | 1715 | 0.225 |
| 0.9286 | 1716 | 0.172 |
| 0.9291 | 1717 | 0.1651 |
| 0.9297 | 1718 | 0.2198 |
| 0.9302 | 1719 | 0.1956 |
| 0.9307 | 1720 | 0.2471 |
| 0.9313 | 1721 | 0.1807 |
| 0.9318 | 1722 | 0.0867 |
| 0.9324 | 1723 | 0.2634 |
| 0.9329 | 1724 | 0.2098 |
| 0.9334 | 1725 | 0.176 |
| 0.9340 | 1726 | 0.1865 |
| 0.9345 | 1727 | 0.1886 |
| 0.9351 | 1728 | 0.2151 |
| 0.9356 | 1729 | 0.1689 |
| 0.9361 | 1730 | 0.2577 |
| 0.9367 | 1731 | 0.2183 |
| 0.9372 | 1732 | 0.1771 |
| 0.9378 | 1733 | 0.1784 |
| 0.9383 | 1734 | 0.2513 |
| 0.9389 | 1735 | 0.2508 |
| 0.9394 | 1736 | 0.2319 |
| 0.9399 | 1737 | 0.1689 |
| 0.9405 | 1738 | 0.1158 |
| 0.9410 | 1739 | 0.245 |
| 0.9416 | 1740 | 0.2763 |
| 0.9421 | 1741 | 0.2143 |
| 0.9426 | 1742 | 0.1567 |
| 0.9432 | 1743 | 0.1456 |
| 0.9437 | 1744 | 0.1858 |
| 0.9443 | 1745 | 0.2407 |
| 0.9448 | 1746 | 0.1501 |
| 0.9453 | 1747 | 0.2261 |
| 0.9459 | 1748 | 0.201 |
| 0.9464 | 1749 | 0.1816 |
| 0.9470 | 1750 | 0.0865 |
| 0.9475 | 1751 | 0.296 |
| 0.9481 | 1752 | 0.1989 |
| 0.9486 | 1753 | 0.2225 |
| 0.9491 | 1754 | 0.1898 |
| 0.9497 | 1755 | 0.1394 |
| 0.9502 | 1756 | 0.2071 |
| 0.9508 | 1757 | 0.2008 |
| 0.9513 | 1758 | 0.1961 |
| 0.9518 | 1759 | 0.1865 |
| 0.9524 | 1760 | 0.2111 |
| 0.9529 | 1761 | 0.2847 |
| 0.9535 | 1762 | 0.2014 |
| 0.9540 | 1763 | 0.1629 |
| 0.9545 | 1764 | 0.1237 |
| 0.9551 | 1765 | 0.2331 |
| 0.9556 | 1766 | 0.2458 |
| 0.9562 | 1767 | 0.1752 |
| 0.9567 | 1768 | 0.2897 |
| 0.9573 | 1769 | 0.134 |
| 0.9578 | 1770 | 0.2517 |
| 0.9583 | 1771 | 0.1504 |
| 0.9589 | 1772 | 0.2288 |
| 0.9594 | 1773 | 0.2404 |
| 0.9600 | 1774 | 0.1955 |
| 0.9605 | 1775 | 0.1972 |
| 0.9610 | 1776 | 0.1971 |
| 0.9616 | 1777 | 0.1297 |
| 0.9621 | 1778 | 0.2005 |
| 0.9627 | 1779 | 0.1679 |
| 0.9632 | 1780 | 0.2738 |
| 0.9637 | 1781 | 0.2213 |
| 0.9643 | 1782 | 0.1673 |
| 0.9648 | 1783 | 0.1889 |
| 0.9654 | 1784 | 0.1642 |
| 0.9659 | 1785 | 0.2065 |
| 0.9665 | 1786 | 0.2161 |
| 0.9670 | 1787 | 0.2 |
| 0.9675 | 1788 | 0.1734 |
| 0.9681 | 1789 | 0.1763 |
| 0.9686 | 1790 | 0.1258 |
| 0.9692 | 1791 | 0.1605 |
| 0.9697 | 1792 | 0.1668 |
| 0.9702 | 1793 | 0.1435 |
| 0.9708 | 1794 | 0.2365 |
| 0.9713 | 1795 | 0.2994 |
| 0.9719 | 1796 | 0.1712 |
| 0.9724 | 1797 | 0.3761 |
| 0.9729 | 1798 | 0.1148 |
| 0.9735 | 1799 | 0.2062 |
| 0.9740 | 1800 | 0.1609 |
| 0.9746 | 1801 | 0.1481 |
| 0.9751 | 1802 | 0.1487 |
| 0.9756 | 1803 | 0.1926 |
| 0.9762 | 1804 | 0.1305 |
| 0.9767 | 1805 | 0.2725 |
| 0.9773 | 1806 | 0.2248 |
| 0.9778 | 1807 | 0.1632 |
| 0.9784 | 1808 | 0.2191 |
| 0.9789 | 1809 | 0.1921 |
| 0.9794 | 1810 | 0.3105 |
| 0.9800 | 1811 | 0.1878 |
| 0.9805 | 1812 | 0.1709 |
| 0.9811 | 1813 | 0.2087 |
| 0.9816 | 1814 | 0.1767 |
| 0.9821 | 1815 | 0.284 |
| 0.9827 | 1816 | 0.1389 |
| 0.9832 | 1817 | 0.094 |
| 0.9838 | 1818 | 0.2162 |
| 0.9843 | 1819 | 0.1478 |
| 0.9848 | 1820 | 0.1421 |
| 0.9854 | 1821 | 0.2339 |
| 0.9859 | 1822 | 0.2406 |
| 0.9865 | 1823 | 0.2425 |
| 0.9870 | 1824 | 0.2132 |
| 0.9876 | 1825 | 0.2314 |
| 0.9881 | 1826 | 0.1829 |
| 0.9886 | 1827 | 0.1538 |
| 0.9892 | 1828 | 0.1794 |
| 0.9897 | 1829 | 0.1931 |
| 0.9903 | 1830 | 0.2618 |
| 0.9908 | 1831 | 0.1501 |
| 0.9913 | 1832 | 0.2641 |
| 0.9919 | 1833 | 0.1706 |
| 0.9924 | 1834 | 0.1356 |
| 0.9930 | 1835 | 0.2405 |
| 0.9935 | 1836 | 0.1464 |
| 0.9940 | 1837 | 0.1476 |
| 0.9946 | 1838 | 0.1413 |
| 0.9951 | 1839 | 0.1958 |
| 0.9957 | 1840 | 0.0958 |
| 0.9962 | 1841 | 0.2388 |
| 0.9968 | 1842 | 0.1871 |
| 0.9973 | 1843 | 0.182 |
| 0.9978 | 1844 | 0.1875 |
| 0.9984 | 1845 | 0.1762 |
| 0.9989 | 1846 | 0.2412 |
| 0.9995 | 1847 | 0.233 |
| 1.0 | 1848 | 0.2743 |
| 1.0005 | 1849 | 0.2034 |
| 1.0011 | 1850 | 0.1463 |
| 1.0016 | 1851 | 0.1942 |
| 1.0022 | 1852 | 0.1194 |
| 1.0027 | 1853 | 0.1526 |
| 1.0032 | 1854 | 0.1372 |
| 1.0038 | 1855 | 0.1601 |
| 1.0043 | 1856 | 0.1654 |
| 1.0049 | 1857 | 0.1777 |
| 1.0054 | 1858 | 0.1739 |
| 1.0060 | 1859 | 0.1391 |
| 1.0065 | 1860 | 0.1468 |
| 1.0070 | 1861 | 0.1877 |
| 1.0076 | 1862 | 0.1617 |
| 1.0081 | 1863 | 0.1823 |
| 1.0087 | 1864 | 0.105 |
| 1.0092 | 1865 | 0.233 |
| 1.0097 | 1866 | 0.2381 |
| 1.0103 | 1867 | 0.2033 |
| 1.0108 | 1868 | 0.1243 |
| 1.0114 | 1869 | 0.1527 |
| 1.0119 | 1870 | 0.1327 |
| 1.0124 | 1871 | 0.1931 |
| 1.0130 | 1872 | 0.1131 |
| 1.0135 | 1873 | 0.2039 |
| 1.0141 | 1874 | 0.1232 |
| 1.0146 | 1875 | 0.2341 |
| 1.0152 | 1876 | 0.143 |
| 1.0157 | 1877 | 0.1361 |
| 1.0162 | 1878 | 0.217 |
| 1.0168 | 1879 | 0.1394 |
| 1.0173 | 1880 | 0.2238 |
| 1.0179 | 1881 | 0.1121 |
| 1.0184 | 1882 | 0.1581 |
| 1.0189 | 1883 | 0.1386 |
| 1.0195 | 1884 | 0.1528 |
| 1.0200 | 1885 | 0.1286 |
| 1.0206 | 1886 | 0.1667 |
| 1.0211 | 1887 | 0.115 |
| 1.0216 | 1888 | 0.1584 |
| 1.0222 | 1889 | 0.1274 |
| 1.0227 | 1890 | 0.1612 |
| 1.0233 | 1891 | 0.2046 |
| 1.0238 | 1892 | 0.206 |
| 1.0244 | 1893 | 0.2367 |
| 1.0249 | 1894 | 0.1461 |
| 1.0254 | 1895 | 0.2551 |
| 1.0260 | 1896 | 0.1411 |
| 1.0265 | 1897 | 0.1284 |
| 1.0271 | 1898 | 0.2468 |
| 1.0276 | 1899 | 0.1183 |
| 1.0281 | 1900 | 0.2364 |
| 1.0287 | 1901 | 0.1429 |
| 1.0292 | 1902 | 0.1653 |
| 1.0298 | 1903 | 0.1711 |
| 1.0303 | 1904 | 0.1445 |
| 1.0308 | 1905 | 0.2109 |
| 1.0314 | 1906 | 0.2101 |
| 1.0319 | 1907 | 0.1864 |
| 1.0325 | 1908 | 0.1393 |
| 1.0330 | 1909 | 0.1086 |
| 1.0335 | 1910 | 0.218 |
| 1.0341 | 1911 | 0.1591 |
| 1.0346 | 1912 | 0.1739 |
| 1.0352 | 1913 | 0.2054 |
| 1.0357 | 1914 | 0.1545 |
| 1.0363 | 1915 | 0.2139 |
| 1.0368 | 1916 | 0.1676 |
| 1.0373 | 1917 | 0.1097 |
| 1.0379 | 1918 | 0.2515 |
| 1.0384 | 1919 | 0.1582 |
| 1.0390 | 1920 | 0.1195 |
| 1.0395 | 1921 | 0.196 |
| 1.0400 | 1922 | 0.1554 |
| 1.0406 | 1923 | 0.1702 |
| 1.0411 | 1924 | 0.1224 |
| 1.0417 | 1925 | 0.1329 |
| 1.0422 | 1926 | 0.1243 |
| 1.0427 | 1927 | 0.1067 |
| 1.0433 | 1928 | 0.1712 |
| 1.0438 | 1929 | 0.202 |
| 1.0444 | 1930 | 0.2256 |
| 1.0449 | 1931 | 0.2148 |
| 1.0455 | 1932 | 0.234 |
| 1.0460 | 1933 | 0.1218 |
| 1.0465 | 1934 | 0.1635 |
| 1.0471 | 1935 | 0.1917 |
| 1.0476 | 1936 | 0.151 |
| 1.0482 | 1937 | 0.2025 |
| 1.0487 | 1938 | 0.141 |
| 1.0492 | 1939 | 0.1329 |
| 1.0498 | 1940 | 0.2148 |
| 1.0503 | 1941 | 0.1527 |
| 1.0509 | 1942 | 0.1631 |
| 1.0514 | 1943 | 0.1642 |
| 1.0519 | 1944 | 0.1426 |
| 1.0525 | 1945 | 0.116 |
| 1.0530 | 1946 | 0.1159 |
| 1.0536 | 1947 | 0.1373 |
| 1.0541 | 1948 | 0.2117 |
| 1.0547 | 1949 | 0.1005 |
| 1.0552 | 1950 | 0.149 |
| 1.0557 | 1951 | 0.1956 |
| 1.0563 | 1952 | 0.1471 |
| 1.0568 | 1953 | 0.1377 |
| 1.0574 | 1954 | 0.1306 |
| 1.0579 | 1955 | 0.1737 |
| 1.0584 | 1956 | 0.1636 |
| 1.0590 | 1957 | 0.1634 |
| 1.0595 | 1958 | 0.1845 |
| 1.0601 | 1959 | 0.2313 |
| 1.0606 | 1960 | 0.1796 |
| 1.0611 | 1961 | 0.1918 |
| 1.0617 | 1962 | 0.1715 |
| 1.0622 | 1963 | 0.162 |
| 1.0628 | 1964 | 0.1783 |
| 1.0633 | 1965 | 0.2089 |
| 1.0639 | 1966 | 0.1724 |
| 1.0644 | 1967 | 0.2609 |
| 1.0649 | 1968 | 0.2761 |
| 1.0655 | 1969 | 0.1313 |
| 1.0660 | 1970 | 0.1492 |
| 1.0666 | 1971 | 0.1864 |
| 1.0671 | 1972 | 0.1653 |
| 1.0676 | 1973 | 0.1798 |
| 1.0682 | 1974 | 0.171 |
| 1.0687 | 1975 | 0.1591 |
| 1.0693 | 1976 | 0.1092 |
| 1.0698 | 1977 | 0.1211 |
| 1.0703 | 1978 | 0.0976 |
| 1.0709 | 1979 | 0.205 |
| 1.0714 | 1980 | 0.1753 |
| 1.0720 | 1981 | 0.1722 |
| 1.0725 | 1982 | 0.1498 |
| 1.0731 | 1983 | 0.1297 |
| 1.0736 | 1984 | 0.1941 |
| 1.0741 | 1985 | 0.1709 |
| 1.0747 | 1986 | 0.1441 |
| 1.0752 | 1987 | 0.1204 |
| 1.0758 | 1988 | 0.1962 |
| 1.0763 | 1989 | 0.1346 |
| 1.0768 | 1990 | 0.2437 |
| 1.0774 | 1991 | 0.1461 |
| 1.0779 | 1992 | 0.1496 |
| 1.0785 | 1993 | 0.1689 |
| 1.0790 | 1994 | 0.1335 |
| 1.0795 | 1995 | 0.1109 |
| 1.0801 | 1996 | 0.1266 |
| 1.0806 | 1997 | 0.1488 |
| 1.0812 | 1998 | 0.167 |
| 1.0817 | 1999 | 0.188 |
| 1.0823 | 2000 | 0.2053 |
| 1.0828 | 2001 | 0.1242 |
| 1.0833 | 2002 | 0.1813 |
| 1.0839 | 2003 | 0.0957 |
| 1.0844 | 2004 | 0.1556 |
| 1.0850 | 2005 | 0.1699 |
| 1.0855 | 2006 | 0.1698 |
| 1.0860 | 2007 | 0.1316 |
| 1.0866 | 2008 | 0.1902 |
| 1.0871 | 2009 | 0.1008 |
| 1.0877 | 2010 | 0.2238 |
| 1.0882 | 2011 | 0.1906 |
| 1.0887 | 2012 | 0.2006 |
| 1.0893 | 2013 | 0.1401 |
| 1.0898 | 2014 | 0.1908 |
| 1.0904 | 2015 | 0.1655 |
| 1.0909 | 2016 | 0.1524 |
| 1.0915 | 2017 | 0.0981 |
| 1.0920 | 2018 | 0.1616 |
| 1.0925 | 2019 | 0.1413 |
| 1.0931 | 2020 | 0.1809 |
| 1.0936 | 2021 | 0.1699 |
| 1.0942 | 2022 | 0.1203 |
| 1.0947 | 2023 | 0.191 |
| 1.0952 | 2024 | 0.2139 |
| 1.0958 | 2025 | 0.1012 |
| 1.0963 | 2026 | 0.183 |
| 1.0969 | 2027 | 0.1957 |
| 1.0974 | 2028 | 0.118 |
| 1.0979 | 2029 | 0.1974 |
| 1.0985 | 2030 | 0.2015 |
| 1.0990 | 2031 | 0.1445 |
| 1.0996 | 2032 | 0.1731 |
| 1.1001 | 2033 | 0.1958 |
| 1.1006 | 2034 | 0.1431 |
| 1.1012 | 2035 | 0.1691 |
| 1.1017 | 2036 | 0.1705 |
| 1.1023 | 2037 | 0.1075 |
| 1.1028 | 2038 | 0.1893 |
| 1.1034 | 2039 | 0.1411 |
| 1.1039 | 2040 | 0.1824 |
| 1.1044 | 2041 | 0.1304 |
| 1.1050 | 2042 | 0.1285 |
| 1.1055 | 2043 | 0.1826 |
| 1.1061 | 2044 | 0.1263 |
| 1.1066 | 2045 | 0.1473 |
| 1.1071 | 2046 | 0.119 |
| 1.1077 | 2047 | 0.1101 |
| 1.1082 | 2048 | 0.1834 |
| 1.1088 | 2049 | 0.1625 |
| 1.1093 | 2050 | 0.1775 |
| 1.1098 | 2051 | 0.1728 |
| 1.1104 | 2052 | 0.1672 |
| 1.1109 | 2053 | 0.156 |
| 1.1115 | 2054 | 0.1605 |
| 1.1120 | 2055 | 0.1568 |
| 1.1126 | 2056 | 0.1133 |
| 1.1131 | 2057 | 0.1167 |
| 1.1136 | 2058 | 0.1604 |
| 1.1142 | 2059 | 0.1963 |
| 1.1147 | 2060 | 0.2073 |
| 1.1153 | 2061 | 0.1011 |
| 1.1158 | 2062 | 0.1714 |
| 1.1163 | 2063 | 0.174 |
| 1.1169 | 2064 | 0.2817 |
| 1.1174 | 2065 | 0.1342 |
| 1.1180 | 2066 | 0.2115 |
| 1.1185 | 2067 | 0.2947 |
| 1.1190 | 2068 | 0.166 |
| 1.1196 | 2069 | 0.2678 |
| 1.1201 | 2070 | 0.1649 |
| 1.1207 | 2071 | 0.1117 |
| 1.1212 | 2072 | 0.1927 |
| 1.1218 | 2073 | 0.1119 |
| 1.1223 | 2074 | 0.1288 |
| 1.1228 | 2075 | 0.1147 |
| 1.1234 | 2076 | 0.214 |
| 1.1239 | 2077 | 0.2483 |
| 1.1245 | 2078 | 0.1222 |
| 1.125 | 2079 | 0.1394 |
| 1.1255 | 2080 | 0.1832 |
| 1.1261 | 2081 | 0.1499 |
| 1.1266 | 2082 | 0.1717 |
| 1.1272 | 2083 | 0.181 |
| 1.1277 | 2084 | 0.2072 |
| 1.1282 | 2085 | 0.1094 |
| 1.1288 | 2086 | 0.1827 |
| 1.1293 | 2087 | 0.1416 |
| 1.1299 | 2088 | 0.1325 |
| 1.1304 | 2089 | 0.1111 |
| 1.1310 | 2090 | 0.1922 |
| 1.1315 | 2091 | 0.1604 |
| 1.1320 | 2092 | 0.1957 |
| 1.1326 | 2093 | 0.0706 |
| 1.1331 | 2094 | 0.1451 |
| 1.1337 | 2095 | 0.1604 |
| 1.1342 | 2096 | 0.1272 |
| 1.1347 | 2097 | 0.2064 |
| 1.1353 | 2098 | 0.1528 |
| 1.1358 | 2099 | 0.1618 |
| 1.1364 | 2100 | 0.2424 |
| 1.1369 | 2101 | 0.1323 |
| 1.1374 | 2102 | 0.1658 |
| 1.1380 | 2103 | 0.1931 |
| 1.1385 | 2104 | 0.1471 |
| 1.1391 | 2105 | 0.1847 |
| 1.1396 | 2106 | 0.1788 |
| 1.1402 | 2107 | 0.161 |
| 1.1407 | 2108 | 0.1455 |
| 1.1412 | 2109 | 0.1814 |
| 1.1418 | 2110 | 0.1753 |
| 1.1423 | 2111 | 0.1535 |
| 1.1429 | 2112 | 0.1411 |
| 1.1434 | 2113 | 0.1738 |
| 1.1439 | 2114 | 0.1174 |
| 1.1445 | 2115 | 0.1593 |
| 1.1450 | 2116 | 0.1888 |
| 1.1456 | 2117 | 0.1683 |
| 1.1461 | 2118 | 0.1687 |
| 1.1466 | 2119 | 0.1092 |
| 1.1472 | 2120 | 0.1563 |
| 1.1477 | 2121 | 0.1915 |
| 1.1483 | 2122 | 0.1882 |
| 1.1488 | 2123 | 0.1486 |
| 1.1494 | 2124 | 0.1101 |
| 1.1499 | 2125 | 0.101 |
| 1.1504 | 2126 | 0.1735 |
| 1.1510 | 2127 | 0.134 |
| 1.1515 | 2128 | 0.2412 |
| 1.1521 | 2129 | 0.1631 |
| 1.1526 | 2130 | 0.184 |
| 1.1531 | 2131 | 0.2054 |
| 1.1537 | 2132 | 0.1637 |
| 1.1542 | 2133 | 0.1782 |
| 1.1548 | 2134 | 0.2421 |
| 1.1553 | 2135 | 0.1616 |
| 1.1558 | 2136 | 0.2233 |
| 1.1564 | 2137 | 0.2235 |
| 1.1569 | 2138 | 0.1829 |
| 1.1575 | 2139 | 0.1902 |
| 1.1580 | 2140 | 0.0983 |
| 1.1585 | 2141 | 0.1359 |
| 1.1591 | 2142 | 0.1726 |
| 1.1596 | 2143 | 0.1889 |
| 1.1602 | 2144 | 0.234 |
| 1.1607 | 2145 | 0.1312 |
| 1.1613 | 2146 | 0.1824 |
| 1.1618 | 2147 | 0.1329 |
| 1.1623 | 2148 | 0.1895 |
| 1.1629 | 2149 | 0.1778 |
| 1.1634 | 2150 | 0.1244 |
| 1.1640 | 2151 | 0.1498 |
| 1.1645 | 2152 | 0.1797 |
| 1.1650 | 2153 | 0.1602 |
| 1.1656 | 2154 | 0.1562 |
| 1.1661 | 2155 | 0.1582 |
| 1.1667 | 2156 | 0.1164 |
| 1.1672 | 2157 | 0.1242 |
| 1.1677 | 2158 | 0.1829 |
| 1.1683 | 2159 | 0.1881 |
| 1.1688 | 2160 | 0.1889 |
| 1.1694 | 2161 | 0.1942 |
| 1.1699 | 2162 | 0.1931 |
| 1.1705 | 2163 | 0.126 |
| 1.1710 | 2164 | 0.0994 |
| 1.1715 | 2165 | 0.1428 |
| 1.1721 | 2166 | 0.1164 |
| 1.1726 | 2167 | 0.1632 |
| 1.1732 | 2168 | 0.1801 |
| 1.1737 | 2169 | 0.144 |
| 1.1742 | 2170 | 0.116 |
| 1.1748 | 2171 | 0.1019 |
| 1.1753 | 2172 | 0.1493 |
| 1.1759 | 2173 | 0.1446 |
| 1.1764 | 2174 | 0.1436 |
| 1.1769 | 2175 | 0.1424 |
| 1.1775 | 2176 | 0.2089 |
| 1.1780 | 2177 | 0.1138 |
| 1.1786 | 2178 | 0.0714 |
| 1.1791 | 2179 | 0.1793 |
| 1.1797 | 2180 | 0.1501 |
| 1.1802 | 2181 | 0.1725 |
| 1.1807 | 2182 | 0.1864 |
| 1.1813 | 2183 | 0.1533 |
| 1.1818 | 2184 | 0.1663 |
| 1.1824 | 2185 | 0.2463 |
| 1.1829 | 2186 | 0.2357 |
| 1.1834 | 2187 | 0.1838 |
| 1.1840 | 2188 | 0.1901 |
| 1.1845 | 2189 | 0.1689 |
| 1.1851 | 2190 | 0.1931 |
| 1.1856 | 2191 | 0.1183 |
| 1.1861 | 2192 | 0.1785 |
| 1.1867 | 2193 | 0.1615 |
| 1.1872 | 2194 | 0.1501 |
| 1.1878 | 2195 | 0.0892 |
| 1.1883 | 2196 | 0.1803 |
| 1.1889 | 2197 | 0.1245 |
| 1.1894 | 2198 | 0.0738 |
| 1.1899 | 2199 | 0.1709 |
| 1.1905 | 2200 | 0.1645 |
| 1.1910 | 2201 | 0.1593 |
| 1.1916 | 2202 | 0.1975 |
| 1.1921 | 2203 | 0.2131 |
| 1.1926 | 2204 | 0.1472 |
| 1.1932 | 2205 | 0.0742 |
| 1.1937 | 2206 | 0.1322 |
| 1.1943 | 2207 | 0.1259 |
| 1.1948 | 2208 | 0.1585 |
| 1.1953 | 2209 | 0.166 |
| 1.1959 | 2210 | 0.1678 |
| 1.1964 | 2211 | 0.128 |
| 1.1970 | 2212 | 0.1572 |
| 1.1975 | 2213 | 0.0965 |
| 1.1981 | 2214 | 0.2441 |
| 1.1986 | 2215 | 0.1489 |
| 1.1991 | 2216 | 0.1268 |
| 1.1997 | 2217 | 0.1961 |
| 1.2002 | 2218 | 0.138 |
| 1.2008 | 2219 | 0.1104 |
| 1.2013 | 2220 | 0.1459 |
| 1.2018 | 2221 | 0.135 |
| 1.2024 | 2222 | 0.1342 |
| 1.2029 | 2223 | 0.1693 |
| 1.2035 | 2224 | 0.1841 |
| 1.2040 | 2225 | 0.1487 |
| 1.2045 | 2226 | 0.1781 |
| 1.2051 | 2227 | 0.1734 |
| 1.2056 | 2228 | 0.1982 |
| 1.2062 | 2229 | 0.1352 |
| 1.2067 | 2230 | 0.1363 |
| 1.2073 | 2231 | 0.2269 |
| 1.2078 | 2232 | 0.1859 |
| 1.2083 | 2233 | 0.1542 |
| 1.2089 | 2234 | 0.1913 |
| 1.2094 | 2235 | 0.1076 |
| 1.2100 | 2236 | 0.1504 |
| 1.2105 | 2237 | 0.1712 |
| 1.2110 | 2238 | 0.129 |
| 1.2116 | 2239 | 0.2339 |
| 1.2121 | 2240 | 0.1976 |
| 1.2127 | 2241 | 0.1131 |
| 1.2132 | 2242 | 0.2167 |
| 1.2137 | 2243 | 0.1433 |
| 1.2143 | 2244 | 0.1365 |
| 1.2148 | 2245 | 0.0818 |
| 1.2154 | 2246 | 0.139 |
| 1.2159 | 2247 | 0.1316 |
| 1.2165 | 2248 | 0.1259 |
| 1.2170 | 2249 | 0.2142 |
| 1.2175 | 2250 | 0.1683 |
| 1.2181 | 2251 | 0.1861 |
| 1.2186 | 2252 | 0.1742 |
| 1.2192 | 2253 | 0.1828 |
| 1.2197 | 2254 | 0.1052 |
| 1.2202 | 2255 | 0.1411 |
| 1.2208 | 2256 | 0.1898 |
| 1.2213 | 2257 | 0.1668 |
| 1.2219 | 2258 | 0.1103 |
| 1.2224 | 2259 | 0.1355 |
| 1.2229 | 2260 | 0.1652 |
| 1.2235 | 2261 | 0.2448 |
| 1.2240 | 2262 | 0.1566 |
| 1.2246 | 2263 | 0.1658 |
| 1.2251 | 2264 | 0.1502 |
| 1.2256 | 2265 | 0.1073 |
| 1.2262 | 2266 | 0.1469 |
| 1.2267 | 2267 | 0.1264 |
| 1.2273 | 2268 | 0.2297 |
| 1.2278 | 2269 | 0.0971 |
| 1.2284 | 2270 | 0.1473 |
| 1.2289 | 2271 | 0.1029 |
| 1.2294 | 2272 | 0.2075 |
| 1.2300 | 2273 | 0.1333 |
| 1.2305 | 2274 | 0.1758 |
| 1.2311 | 2275 | 0.1407 |
| 1.2316 | 2276 | 0.1762 |
| 1.2321 | 2277 | 0.1632 |
| 1.2327 | 2278 | 0.1301 |
| 1.2332 | 2279 | 0.2175 |
| 1.2338 | 2280 | 0.1289 |
| 1.2343 | 2281 | 0.1981 |
| 1.2348 | 2282 | 0.1272 |
| 1.2354 | 2283 | 0.1438 |
| 1.2359 | 2284 | 0.2245 |
| 1.2365 | 2285 | 0.1525 |
| 1.2370 | 2286 | 0.0951 |
| 1.2376 | 2287 | 0.1818 |
| 1.2381 | 2288 | 0.1064 |
| 1.2386 | 2289 | 0.132 |
| 1.2392 | 2290 | 0.1825 |
| 1.2397 | 2291 | 0.1968 |
| 1.2403 | 2292 | 0.1237 |
| 1.2408 | 2293 | 0.1315 |
| 1.2413 | 2294 | 0.1654 |
| 1.2419 | 2295 | 0.1523 |
| 1.2424 | 2296 | 0.1485 |
| 1.2430 | 2297 | 0.1291 |
| 1.2435 | 2298 | 0.1961 |
| 1.2440 | 2299 | 0.2214 |
| 1.2446 | 2300 | 0.1815 |
| 1.2451 | 2301 | 0.1756 |
| 1.2457 | 2302 | 0.1108 |
| 1.2462 | 2303 | 0.1121 |
| 1.2468 | 2304 | 0.143 |
| 1.2473 | 2305 | 0.1653 |
| 1.2478 | 2306 | 0.123 |
| 1.2484 | 2307 | 0.1277 |
| 1.2489 | 2308 | 0.1297 |
| 1.2495 | 2309 | 0.1453 |
| 1.25 | 2310 | 0.1333 |
| 1.2505 | 2311 | 0.1833 |
| 1.2511 | 2312 | 0.1243 |
| 1.2516 | 2313 | 0.2028 |
| 1.2522 | 2314 | 0.1313 |
| 1.2527 | 2315 | 0.1412 |
| 1.2532 | 2316 | 0.3006 |
| 1.2538 | 2317 | 0.2261 |
| 1.2543 | 2318 | 0.1033 |
| 1.2549 | 2319 | 0.2435 |
| 1.2554 | 2320 | 0.1172 |
| 1.2560 | 2321 | 0.1952 |
| 1.2565 | 2322 | 0.1153 |
| 1.2570 | 2323 | 0.1229 |
| 1.2576 | 2324 | 0.1816 |
| 1.2581 | 2325 | 0.1654 |
| 1.2587 | 2326 | 0.1326 |
| 1.2592 | 2327 | 0.1637 |
| 1.2597 | 2328 | 0.1848 |
| 1.2603 | 2329 | 0.1432 |
| 1.2608 | 2330 | 0.1274 |
| 1.2614 | 2331 | 0.1793 |
| 1.2619 | 2332 | 0.1296 |
| 1.2624 | 2333 | 0.1342 |
| 1.2630 | 2334 | 0.163 |
| 1.2635 | 2335 | 0.183 |
| 1.2641 | 2336 | 0.1334 |
| 1.2646 | 2337 | 0.1922 |
| 1.2652 | 2338 | 0.0816 |
| 1.2657 | 2339 | 0.1201 |
| 1.2662 | 2340 | 0.1736 |
| 1.2668 | 2341 | 0.1229 |
| 1.2673 | 2342 | 0.154 |
| 1.2679 | 2343 | 0.1338 |
| 1.2684 | 2344 | 0.1851 |
| 1.2689 | 2345 | 0.1511 |
| 1.2695 | 2346 | 0.1176 |
| 1.2700 | 2347 | 0.1294 |
| 1.2706 | 2348 | 0.3477 |
| 1.2711 | 2349 | 0.2173 |
| 1.2716 | 2350 | 0.163 |
| 1.2722 | 2351 | 0.1413 |
| 1.2727 | 2352 | 0.1932 |
| 1.2733 | 2353 | 0.1359 |
| 1.2738 | 2354 | 0.1892 |
| 1.2744 | 2355 | 0.1468 |
| 1.2749 | 2356 | 0.1895 |
| 1.2754 | 2357 | 0.1587 |
| 1.2760 | 2358 | 0.2297 |
| 1.2765 | 2359 | 0.172 |
| 1.2771 | 2360 | 0.1886 |
| 1.2776 | 2361 | 0.0901 |
| 1.2781 | 2362 | 0.1537 |
| 1.2787 | 2363 | 0.1367 |
| 1.2792 | 2364 | 0.1551 |
| 1.2798 | 2365 | 0.1262 |
| 1.2803 | 2366 | 0.1679 |
| 1.2808 | 2367 | 0.1803 |
| 1.2814 | 2368 | 0.1392 |
| 1.2819 | 2369 | 0.1615 |
| 1.2825 | 2370 | 0.1452 |
| 1.2830 | 2371 | 0.2015 |
| 1.2835 | 2372 | 0.1464 |
| 1.2841 | 2373 | 0.2512 |
| 1.2846 | 2374 | 0.0882 |
| 1.2852 | 2375 | 0.1218 |
| 1.2857 | 2376 | 0.1998 |
| 1.2863 | 2377 | 0.101 |
| 1.2868 | 2378 | 0.1925 |
| 1.2873 | 2379 | 0.2355 |
| 1.2879 | 2380 | 0.1798 |
| 1.2884 | 2381 | 0.0998 |
| 1.2890 | 2382 | 0.0768 |
| 1.2895 | 2383 | 0.1737 |
| 1.2900 | 2384 | 0.1655 |
| 1.2906 | 2385 | 0.1315 |
| 1.2911 | 2386 | 0.2109 |
| 1.2917 | 2387 | 0.1166 |
| 1.2922 | 2388 | 0.1382 |
| 1.2927 | 2389 | 0.1373 |
| 1.2933 | 2390 | 0.1124 |
| 1.2938 | 2391 | 0.1168 |
| 1.2944 | 2392 | 0.1112 |
| 1.2949 | 2393 | 0.2328 |
| 1.2955 | 2394 | 0.174 |
| 1.2960 | 2395 | 0.1809 |
| 1.2965 | 2396 | 0.1731 |
| 1.2971 | 2397 | 0.1129 |
| 1.2976 | 2398 | 0.1251 |
| 1.2982 | 2399 | 0.1205 |
| 1.2987 | 2400 | 0.0934 |
| 1.2992 | 2401 | 0.1622 |
| 1.2998 | 2402 | 0.1858 |
| 1.3003 | 2403 | 0.1721 |
| 1.3009 | 2404 | 0.2051 |
| 1.3014 | 2405 | 0.2213 |
| 1.3019 | 2406 | 0.1108 |
| 1.3025 | 2407 | 0.1732 |
| 1.3030 | 2408 | 0.1884 |
| 1.3036 | 2409 | 0.1981 |
| 1.3041 | 2410 | 0.0638 |
| 1.3047 | 2411 | 0.1423 |
| 1.3052 | 2412 | 0.1627 |
| 1.3057 | 2413 | 0.2067 |
| 1.3063 | 2414 | 0.1867 |
| 1.3068 | 2415 | 0.1922 |
| 1.3074 | 2416 | 0.16 |
| 1.3079 | 2417 | 0.1534 |
| 1.3084 | 2418 | 0.2114 |
| 1.3090 | 2419 | 0.1709 |
| 1.3095 | 2420 | 0.1028 |
| 1.3101 | 2421 | 0.159 |
| 1.3106 | 2422 | 0.1139 |
| 1.3111 | 2423 | 0.1301 |
| 1.3117 | 2424 | 0.14 |
| 1.3122 | 2425 | 0.1532 |
| 1.3128 | 2426 | 0.1839 |
| 1.3133 | 2427 | 0.1341 |
| 1.3139 | 2428 | 0.1906 |
| 1.3144 | 2429 | 0.2064 |
| 1.3149 | 2430 | 0.1377 |
| 1.3155 | 2431 | 0.1129 |
| 1.3160 | 2432 | 0.1553 |
| 1.3166 | 2433 | 0.1107 |
| 1.3171 | 2434 | 0.2374 |
| 1.3176 | 2435 | 0.173 |
| 1.3182 | 2436 | 0.1393 |
| 1.3187 | 2437 | 0.1996 |
| 1.3193 | 2438 | 0.2028 |
| 1.3198 | 2439 | 0.1058 |
| 1.3203 | 2440 | 0.118 |
| 1.3209 | 2441 | 0.1954 |
| 1.3214 | 2442 | 0.1117 |
| 1.3220 | 2443 | 0.1739 |
| 1.3225 | 2444 | 0.1886 |
| 1.3231 | 2445 | 0.1918 |
| 1.3236 | 2446 | 0.1234 |
| 1.3241 | 2447 | 0.2296 |
| 1.3247 | 2448 | 0.21 |
| 1.3252 | 2449 | 0.1268 |
| 1.3258 | 2450 | 0.1602 |
| 1.3263 | 2451 | 0.1095 |
| 1.3268 | 2452 | 0.1128 |
| 1.3274 | 2453 | 0.2092 |
| 1.3279 | 2454 | 0.122 |
| 1.3285 | 2455 | 0.1113 |
| 1.3290 | 2456 | 0.1518 |
| 1.3295 | 2457 | 0.1603 |
| 1.3301 | 2458 | 0.128 |
| 1.3306 | 2459 | 0.1966 |
| 1.3312 | 2460 | 0.0971 |
| 1.3317 | 2461 | 0.0837 |
| 1.3323 | 2462 | 0.1994 |
| 1.3328 | 2463 | 0.2612 |
| 1.3333 | 2464 | 0.2003 |
| 1.3339 | 2465 | 0.1551 |
| 1.3344 | 2466 | 0.1201 |
| 1.3350 | 2467 | 0.1919 |
| 1.3355 | 2468 | 0.1063 |
| 1.3360 | 2469 | 0.178 |
| 1.3366 | 2470 | 0.1441 |
| 1.3371 | 2471 | 0.1678 |
| 1.3377 | 2472 | 0.1992 |
| 1.3382 | 2473 | 0.1306 |
| 1.3387 | 2474 | 0.1432 |
| 1.3393 | 2475 | 0.117 |
| 1.3398 | 2476 | 0.1718 |
| 1.3404 | 2477 | 0.1765 |
| 1.3409 | 2478 | 0.1689 |
| 1.3415 | 2479 | 0.1251 |
| 1.3420 | 2480 | 0.1965 |
| 1.3425 | 2481 | 0.1371 |
| 1.3431 | 2482 | 0.1109 |
| 1.3436 | 2483 | 0.1393 |
| 1.3442 | 2484 | 0.1509 |
| 1.3447 | 2485 | 0.2105 |
| 1.3452 | 2486 | 0.1 |
| 1.3458 | 2487 | 0.1758 |
| 1.3463 | 2488 | 0.0925 |
| 1.3469 | 2489 | 0.1438 |
| 1.3474 | 2490 | 0.0885 |
| 1.3479 | 2491 | 0.149 |
| 1.3485 | 2492 | 0.1008 |
| 1.3490 | 2493 | 0.1565 |
| 1.3496 | 2494 | 0.092 |
| 1.3501 | 2495 | 0.1398 |
| 1.3506 | 2496 | 0.1215 |
| 1.3512 | 2497 | 0.1339 |
| 1.3517 | 2498 | 0.1932 |
| 1.3523 | 2499 | 0.2007 |
| 1.3528 | 2500 | 0.1387 |
| 1.3534 | 2501 | 0.139 |
| 1.3539 | 2502 | 0.1869 |
| 1.3544 | 2503 | 0.1169 |
| 1.3550 | 2504 | 0.1666 |
| 1.3555 | 2505 | 0.11 |
| 1.3561 | 2506 | 0.1603 |
| 1.3566 | 2507 | 0.1224 |
| 1.3571 | 2508 | 0.15 |
| 1.3577 | 2509 | 0.1826 |
| 1.3582 | 2510 | 0.1905 |
| 1.3588 | 2511 | 0.2343 |
| 1.3593 | 2512 | 0.1025 |
| 1.3598 | 2513 | 0.1937 |
| 1.3604 | 2514 | 0.1251 |
| 1.3609 | 2515 | 0.1046 |
| 1.3615 | 2516 | 0.1991 |
| 1.3620 | 2517 | 0.1331 |
| 1.3626 | 2518 | 0.1246 |
| 1.3631 | 2519 | 0.1467 |
| 1.3636 | 2520 | 0.1125 |
| 1.3642 | 2521 | 0.124 |
| 1.3647 | 2522 | 0.1453 |
| 1.3653 | 2523 | 0.1513 |
| 1.3658 | 2524 | 0.1077 |
| 1.3663 | 2525 | 0.2036 |
| 1.3669 | 2526 | 0.1631 |
| 1.3674 | 2527 | 0.1868 |
| 1.3680 | 2528 | 0.1858 |
| 1.3685 | 2529 | 0.1464 |
| 1.3690 | 2530 | 0.2517 |
| 1.3696 | 2531 | 0.147 |
| 1.3701 | 2532 | 0.1402 |
| 1.3707 | 2533 | 0.1397 |
| 1.3712 | 2534 | 0.1579 |
| 1.3718 | 2535 | 0.1185 |
| 1.3723 | 2536 | 0.1605 |
| 1.3728 | 2537 | 0.1716 |
| 1.3734 | 2538 | 0.1915 |
| 1.3739 | 2539 | 0.1535 |
| 1.3745 | 2540 | 0.1473 |
| 1.375 | 2541 | 0.218 |
| 1.3755 | 2542 | 0.2183 |
| 1.3761 | 2543 | 0.1595 |
| 1.3766 | 2544 | 0.1402 |
| 1.3772 | 2545 | 0.1311 |
| 1.3777 | 2546 | 0.1801 |
| 1.3782 | 2547 | 0.1141 |
| 1.3788 | 2548 | 0.1241 |
| 1.3793 | 2549 | 0.1097 |
| 1.3799 | 2550 | 0.1691 |
| 1.3804 | 2551 | 0.178 |
| 1.3810 | 2552 | 0.198 |
| 1.3815 | 2553 | 0.1357 |
| 1.3820 | 2554 | 0.1069 |
| 1.3826 | 2555 | 0.1486 |
| 1.3831 | 2556 | 0.0818 |
| 1.3837 | 2557 | 0.154 |
| 1.3842 | 2558 | 0.0773 |
| 1.3847 | 2559 | 0.0949 |
| 1.3853 | 2560 | 0.1535 |
| 1.3858 | 2561 | 0.2195 |
| 1.3864 | 2562 | 0.1177 |
| 1.3869 | 2563 | 0.156 |
| 1.3874 | 2564 | 0.1452 |
| 1.3880 | 2565 | 0.1732 |
| 1.3885 | 2566 | 0.1268 |
| 1.3891 | 2567 | 0.1624 |
| 1.3896 | 2568 | 0.1591 |
| 1.3902 | 2569 | 0.1043 |
| 1.3907 | 2570 | 0.134 |
| 1.3912 | 2571 | 0.0903 |
| 1.3918 | 2572 | 0.1704 |
| 1.3923 | 2573 | 0.1621 |
| 1.3929 | 2574 | 0.1576 |
| 1.3934 | 2575 | 0.1911 |
| 1.3939 | 2576 | 0.1693 |
| 1.3945 | 2577 | 0.2511 |
| 1.3950 | 2578 | 0.1229 |
| 1.3956 | 2579 | 0.1656 |
| 1.3961 | 2580 | 0.1982 |
| 1.3966 | 2581 | 0.1768 |
| 1.3972 | 2582 | 0.1636 |
| 1.3977 | 2583 | 0.1113 |
| 1.3983 | 2584 | 0.2305 |
| 1.3988 | 2585 | 0.188 |
| 1.3994 | 2586 | 0.1406 |
| 1.3999 | 2587 | 0.1359 |
| 1.4004 | 2588 | 0.1229 |
| 1.4010 | 2589 | 0.1566 |
| 1.4015 | 2590 | 0.1895 |
| 1.4021 | 2591 | 0.1502 |
| 1.4026 | 2592 | 0.0882 |
| 1.4031 | 2593 | 0.1216 |
| 1.4037 | 2594 | 0.1069 |
| 1.4042 | 2595 | 0.1295 |
| 1.4048 | 2596 | 0.1498 |
| 1.4053 | 2597 | 0.1438 |
| 1.4058 | 2598 | 0.126 |
| 1.4064 | 2599 | 0.1066 |
| 1.4069 | 2600 | 0.1444 |
| 1.4075 | 2601 | 0.145 |
| 1.4080 | 2602 | 0.2424 |
| 1.4085 | 2603 | 0.195 |
| 1.4091 | 2604 | 0.1649 |
| 1.4096 | 2605 | 0.1384 |
| 1.4102 | 2606 | 0.1266 |
| 1.4107 | 2607 | 0.08 |
| 1.4113 | 2608 | 0.1649 |
| 1.4118 | 2609 | 0.1079 |
| 1.4123 | 2610 | 0.1614 |
| 1.4129 | 2611 | 0.1425 |
| 1.4134 | 2612 | 0.1319 |
| 1.4140 | 2613 | 0.1249 |
| 1.4145 | 2614 | 0.1189 |
| 1.4150 | 2615 | 0.1661 |
| 1.4156 | 2616 | 0.0971 |
| 1.4161 | 2617 | 0.1107 |
| 1.4167 | 2618 | 0.1393 |
| 1.4172 | 2619 | 0.1279 |
| 1.4177 | 2620 | 0.1569 |
| 1.4183 | 2621 | 0.1605 |
| 1.4188 | 2622 | 0.1487 |
| 1.4194 | 2623 | 0.1173 |
| 1.4199 | 2624 | 0.2162 |
| 1.4205 | 2625 | 0.1033 |
| 1.4210 | 2626 | 0.2085 |
| 1.4215 | 2627 | 0.0873 |
| 1.4221 | 2628 | 0.2022 |
| 1.4226 | 2629 | 0.1676 |
| 1.4232 | 2630 | 0.1519 |
| 1.4237 | 2631 | 0.0961 |
| 1.4242 | 2632 | 0.0628 |
| 1.4248 | 2633 | 0.1306 |
| 1.4253 | 2634 | 0.1886 |
| 1.4259 | 2635 | 0.1843 |
| 1.4264 | 2636 | 0.077 |
| 1.4269 | 2637 | 0.175 |
| 1.4275 | 2638 | 0.1868 |
| 1.4280 | 2639 | 0.1218 |
| 1.4286 | 2640 | 0.2478 |
| 1.4291 | 2641 | 0.1802 |
| 1.4297 | 2642 | 0.2111 |
| 1.4302 | 2643 | 0.1129 |
| 1.4307 | 2644 | 0.1931 |
| 1.4313 | 2645 | 0.0944 |
| 1.4318 | 2646 | 0.1672 |
| 1.4324 | 2647 | 0.2231 |
| 1.4329 | 2648 | 0.1799 |
| 1.4334 | 2649 | 0.1549 |
| 1.4340 | 2650 | 0.171 |
| 1.4345 | 2651 | 0.1677 |
| 1.4351 | 2652 | 0.071 |
| 1.4356 | 2653 | 0.1344 |
| 1.4361 | 2654 | 0.2055 |
| 1.4367 | 2655 | 0.1168 |
| 1.4372 | 2656 | 0.1869 |
| 1.4378 | 2657 | 0.1315 |
| 1.4383 | 2658 | 0.14 |
| 1.4389 | 2659 | 0.1324 |
| 1.4394 | 2660 | 0.1585 |
| 1.4399 | 2661 | 0.1021 |
| 1.4405 | 2662 | 0.1565 |
| 1.4410 | 2663 | 0.2229 |
| 1.4416 | 2664 | 0.1702 |
| 1.4421 | 2665 | 0.192 |
| 1.4426 | 2666 | 0.1337 |
| 1.4432 | 2667 | 0.1344 |
| 1.4437 | 2668 | 0.11 |
| 1.4443 | 2669 | 0.2235 |
| 1.4448 | 2670 | 0.1436 |
| 1.4453 | 2671 | 0.0971 |
| 1.4459 | 2672 | 0.114 |
| 1.4464 | 2673 | 0.1235 |
| 1.4470 | 2674 | 0.2 |
| 1.4475 | 2675 | 0.157 |
| 1.4481 | 2676 | 0.124 |
| 1.4486 | 2677 | 0.155 |
| 1.4491 | 2678 | 0.1662 |
| 1.4497 | 2679 | 0.1604 |
| 1.4502 | 2680 | 0.2262 |
| 1.4508 | 2681 | 0.2025 |
| 1.4513 | 2682 | 0.1411 |
| 1.4518 | 2683 | 0.1726 |
| 1.4524 | 2684 | 0.1342 |
| 1.4529 | 2685 | 0.2061 |
| 1.4535 | 2686 | 0.1509 |
| 1.4540 | 2687 | 0.2183 |
| 1.4545 | 2688 | 0.2041 |
| 1.4551 | 2689 | 0.0994 |
| 1.4556 | 2690 | 0.148 |
| 1.4562 | 2691 | 0.1623 |
| 1.4567 | 2692 | 0.1274 |
| 1.4573 | 2693 | 0.2095 |
| 1.4578 | 2694 | 0.1514 |
| 1.4583 | 2695 | 0.1145 |
| 1.4589 | 2696 | 0.0957 |
| 1.4594 | 2697 | 0.118 |
| 1.4600 | 2698 | 0.1297 |
| 1.4605 | 2699 | 0.1252 |
| 1.4610 | 2700 | 0.1048 |
| 1.4616 | 2701 | 0.1426 |
| 1.4621 | 2702 | 0.1744 |
| 1.4627 | 2703 | 0.1726 |
| 1.4632 | 2704 | 0.1098 |
| 1.4637 | 2705 | 0.1609 |
| 1.4643 | 2706 | 0.1613 |
| 1.4648 | 2707 | 0.1378 |
| 1.4654 | 2708 | 0.104 |
| 1.4659 | 2709 | 0.0919 |
| 1.4665 | 2710 | 0.0982 |
| 1.4670 | 2711 | 0.0757 |
| 1.4675 | 2712 | 0.194 |
| 1.4681 | 2713 | 0.1172 |
| 1.4686 | 2714 | 0.0796 |
| 1.4692 | 2715 | 0.1449 |
| 1.4697 | 2716 | 0.1091 |
| 1.4702 | 2717 | 0.1279 |
| 1.4708 | 2718 | 0.1883 |
| 1.4713 | 2719 | 0.1467 |
| 1.4719 | 2720 | 0.1419 |
| 1.4724 | 2721 | 0.1615 |
| 1.4729 | 2722 | 0.162 |
| 1.4735 | 2723 | 0.1166 |
| 1.4740 | 2724 | 0.0876 |
| 1.4746 | 2725 | 0.2048 |
| 1.4751 | 2726 | 0.1356 |
| 1.4756 | 2727 | 0.1148 |
| 1.4762 | 2728 | 0.0901 |
| 1.4767 | 2729 | 0.1307 |
| 1.4773 | 2730 | 0.1621 |
| 1.4778 | 2731 | 0.1018 |
| 1.4784 | 2732 | 0.1039 |
| 1.4789 | 2733 | 0.1947 |
| 1.4794 | 2734 | 0.1781 |
| 1.4800 | 2735 | 0.1412 |
| 1.4805 | 2736 | 0.1924 |
| 1.4811 | 2737 | 0.1357 |
| 1.4816 | 2738 | 0.1839 |
| 1.4821 | 2739 | 0.097 |
| 1.4827 | 2740 | 0.149 |
| 1.4832 | 2741 | 0.1708 |
| 1.4838 | 2742 | 0.1674 |
| 1.4843 | 2743 | 0.1969 |
| 1.4848 | 2744 | 0.1105 |
| 1.4854 | 2745 | 0.179 |
| 1.4859 | 2746 | 0.1822 |
| 1.4865 | 2747 | 0.2475 |
| 1.4870 | 2748 | 0.134 |
| 1.4876 | 2749 | 0.145 |
| 1.4881 | 2750 | 0.1061 |
| 1.4886 | 2751 | 0.1669 |
| 1.4892 | 2752 | 0.1459 |
| 1.4897 | 2753 | 0.19 |
| 1.4903 | 2754 | 0.132 |
| 1.4908 | 2755 | 0.133 |
| 1.4913 | 2756 | 0.1593 |
| 1.4919 | 2757 | 0.1909 |
| 1.4924 | 2758 | 0.1954 |
| 1.4930 | 2759 | 0.1048 |
| 1.4935 | 2760 | 0.1093 |
| 1.4940 | 2761 | 0.1388 |
| 1.4946 | 2762 | 0.108 |
| 1.4951 | 2763 | 0.1291 |
| 1.4957 | 2764 | 0.1642 |
| 1.4962 | 2765 | 0.1346 |
| 1.4968 | 2766 | 0.1042 |
| 1.4973 | 2767 | 0.1155 |
| 1.4978 | 2768 | 0.1995 |
| 1.4984 | 2769 | 0.113 |
| 1.4989 | 2770 | 0.0875 |
| 1.4995 | 2771 | 0.1985 |
| 1.5 | 2772 | 0.1174 |
| 1.5005 | 2773 | 0.1252 |
| 1.5011 | 2774 | 0.1098 |
| 1.5016 | 2775 | 0.1313 |
| 1.5022 | 2776 | 0.1377 |
| 1.5027 | 2777 | 0.0997 |
| 1.5032 | 2778 | 0.2054 |
| 1.5038 | 2779 | 0.1398 |
| 1.5043 | 2780 | 0.1485 |
| 1.5049 | 2781 | 0.1765 |
| 1.5054 | 2782 | 0.1246 |
| 1.5060 | 2783 | 0.1427 |
| 1.5065 | 2784 | 0.1339 |
| 1.5070 | 2785 | 0.0913 |
| 1.5076 | 2786 | 0.1398 |
| 1.5081 | 2787 | 0.1649 |
| 1.5087 | 2788 | 0.1541 |
| 1.5092 | 2789 | 0.1101 |
| 1.5097 | 2790 | 0.1547 |
| 1.5103 | 2791 | 0.1665 |
| 1.5108 | 2792 | 0.1867 |
| 1.5114 | 2793 | 0.1584 |
| 1.5119 | 2794 | 0.1263 |
| 1.5124 | 2795 | 0.2127 |
| 1.5130 | 2796 | 0.178 |
| 1.5135 | 2797 | 0.1675 |
| 1.5141 | 2798 | 0.1076 |
| 1.5146 | 2799 | 0.1465 |
| 1.5152 | 2800 | 0.1361 |
| 1.5157 | 2801 | 0.165 |
| 1.5162 | 2802 | 0.1563 |
| 1.5168 | 2803 | 0.244 |
| 1.5173 | 2804 | 0.0679 |
| 1.5179 | 2805 | 0.1022 |
| 1.5184 | 2806 | 0.1718 |
| 1.5189 | 2807 | 0.1726 |
| 1.5195 | 2808 | 0.0862 |
| 1.5200 | 2809 | 0.1164 |
| 1.5206 | 2810 | 0.1277 |
| 1.5211 | 2811 | 0.1606 |
| 1.5216 | 2812 | 0.1198 |
| 1.5222 | 2813 | 0.2201 |
| 1.5227 | 2814 | 0.1492 |
| 1.5233 | 2815 | 0.1403 |
| 1.5238 | 2816 | 0.1502 |
| 1.5244 | 2817 | 0.1191 |
| 1.5249 | 2818 | 0.1601 |
| 1.5254 | 2819 | 0.1758 |
| 1.5260 | 2820 | 0.1203 |
| 1.5265 | 2821 | 0.1514 |
| 1.5271 | 2822 | 0.0597 |
| 1.5276 | 2823 | 0.1668 |
| 1.5281 | 2824 | 0.1662 |
| 1.5287 | 2825 | 0.1558 |
| 1.5292 | 2826 | 0.1427 |
| 1.5298 | 2827 | 0.0877 |
| 1.5303 | 2828 | 0.1245 |
| 1.5308 | 2829 | 0.1079 |
| 1.5314 | 2830 | 0.126 |
| 1.5319 | 2831 | 0.1625 |
| 1.5325 | 2832 | 0.129 |
| 1.5330 | 2833 | 0.1623 |
| 1.5335 | 2834 | 0.1375 |
| 1.5341 | 2835 | 0.1273 |
| 1.5346 | 2836 | 0.2176 |
| 1.5352 | 2837 | 0.1161 |
| 1.5357 | 2838 | 0.0927 |
| 1.5363 | 2839 | 0.1432 |
| 1.5368 | 2840 | 0.1039 |
| 1.5373 | 2841 | 0.2348 |
| 1.5379 | 2842 | 0.1927 |
| 1.5384 | 2843 | 0.1702 |
| 1.5390 | 2844 | 0.1271 |
| 1.5395 | 2845 | 0.1283 |
| 1.5400 | 2846 | 0.1174 |
| 1.5406 | 2847 | 0.1601 |
| 1.5411 | 2848 | 0.1597 |
| 1.5417 | 2849 | 0.1748 |
| 1.5422 | 2850 | 0.1178 |
| 1.5427 | 2851 | 0.183 |
| 1.5433 | 2852 | 0.1614 |
| 1.5438 | 2853 | 0.112 |
| 1.5444 | 2854 | 0.0992 |
| 1.5449 | 2855 | 0.1018 |
| 1.5455 | 2856 | 0.1623 |
| 1.5460 | 2857 | 0.1498 |
| 1.5465 | 2858 | 0.1584 |
| 1.5471 | 2859 | 0.1663 |
| 1.5476 | 2860 | 0.1347 |
| 1.5482 | 2861 | 0.1226 |
| 1.5487 | 2862 | 0.1089 |
| 1.5492 | 2863 | 0.1369 |
| 1.5498 | 2864 | 0.1582 |
| 1.5503 | 2865 | 0.0919 |
| 1.5509 | 2866 | 0.1124 |
| 1.5514 | 2867 | 0.1275 |
| 1.5519 | 2868 | 0.1782 |
| 1.5525 | 2869 | 0.1158 |
| 1.5530 | 2870 | 0.123 |
| 1.5536 | 2871 | 0.1179 |
| 1.5541 | 2872 | 0.1201 |
| 1.5547 | 2873 | 0.0779 |
| 1.5552 | 2874 | 0.1242 |
| 1.5557 | 2875 | 0.1437 |
| 1.5563 | 2876 | 0.1064 |
| 1.5568 | 2877 | 0.1473 |
| 1.5574 | 2878 | 0.1177 |
| 1.5579 | 2879 | 0.1504 |
| 1.5584 | 2880 | 0.1215 |
| 1.5590 | 2881 | 0.201 |
| 1.5595 | 2882 | 0.1298 |
| 1.5601 | 2883 | 0.279 |
| 1.5606 | 2884 | 0.1175 |
| 1.5611 | 2885 | 0.1104 |
| 1.5617 | 2886 | 0.1613 |
| 1.5622 | 2887 | 0.2022 |
| 1.5628 | 2888 | 0.1348 |
| 1.5633 | 2889 | 0.1828 |
| 1.5639 | 2890 | 0.2425 |
| 1.5644 | 2891 | 0.0972 |
| 1.5649 | 2892 | 0.1734 |
| 1.5655 | 2893 | 0.1251 |
| 1.5660 | 2894 | 0.1192 |
| 1.5666 | 2895 | 0.193 |
| 1.5671 | 2896 | 0.1686 |
| 1.5676 | 2897 | 0.1816 |
| 1.5682 | 2898 | 0.1086 |
| 1.5687 | 2899 | 0.1395 |
| 1.5693 | 2900 | 0.0545 |
| 1.5698 | 2901 | 0.1862 |
| 1.5703 | 2902 | 0.119 |
| 1.5709 | 2903 | 0.0937 |
| 1.5714 | 2904 | 0.1355 |
| 1.5720 | 2905 | 0.1425 |
| 1.5725 | 2906 | 0.1946 |
| 1.5731 | 2907 | 0.2243 |
| 1.5736 | 2908 | 0.1655 |
| 1.5741 | 2909 | 0.1414 |
| 1.5747 | 2910 | 0.1791 |
| 1.5752 | 2911 | 0.1408 |
| 1.5758 | 2912 | 0.1142 |
| 1.5763 | 2913 | 0.1321 |
| 1.5768 | 2914 | 0.1374 |
| 1.5774 | 2915 | 0.1151 |
| 1.5779 | 2916 | 0.1474 |
| 1.5785 | 2917 | 0.1843 |
| 1.5790 | 2918 | 0.0971 |
| 1.5795 | 2919 | 0.109 |
| 1.5801 | 2920 | 0.0696 |
| 1.5806 | 2921 | 0.0988 |
| 1.5812 | 2922 | 0.1479 |
| 1.5817 | 2923 | 0.1 |
| 1.5823 | 2924 | 0.1616 |
| 1.5828 | 2925 | 0.1384 |
| 1.5833 | 2926 | 0.1229 |
| 1.5839 | 2927 | 0.1367 |
| 1.5844 | 2928 | 0.1779 |
| 1.5850 | 2929 | 0.2113 |
| 1.5855 | 2930 | 0.151 |
| 1.5860 | 2931 | 0.146 |
| 1.5866 | 2932 | 0.116 |
| 1.5871 | 2933 | 0.074 |
| 1.5877 | 2934 | 0.1773 |
| 1.5882 | 2935 | 0.1139 |
| 1.5887 | 2936 | 0.131 |
| 1.5893 | 2937 | 0.1166 |
| 1.5898 | 2938 | 0.1399 |
| 1.5904 | 2939 | 0.1721 |
| 1.5909 | 2940 | 0.0795 |
| 1.5915 | 2941 | 0.1845 |
| 1.5920 | 2942 | 0.1869 |
| 1.5925 | 2943 | 0.1412 |
| 1.5931 | 2944 | 0.1531 |
| 1.5936 | 2945 | 0.1797 |
| 1.5942 | 2946 | 0.1439 |
| 1.5947 | 2947 | 0.0901 |
| 1.5952 | 2948 | 0.1432 |
| 1.5958 | 2949 | 0.0736 |
| 1.5963 | 2950 | 0.1257 |
| 1.5969 | 2951 | 0.1171 |
| 1.5974 | 2952 | 0.1228 |
| 1.5979 | 2953 | 0.1039 |
| 1.5985 | 2954 | 0.175 |
| 1.5990 | 2955 | 0.1883 |
| 1.5996 | 2956 | 0.1484 |
| 1.6001 | 2957 | 0.1449 |
| 1.6006 | 2958 | 0.1511 |
| 1.6012 | 2959 | 0.1256 |
| 1.6017 | 2960 | 0.1617 |
| 1.6023 | 2961 | 0.0991 |
| 1.6028 | 2962 | 0.0934 |
| 1.6034 | 2963 | 0.1146 |
| 1.6039 | 2964 | 0.1277 |
| 1.6044 | 2965 | 0.1761 |
| 1.6050 | 2966 | 0.0871 |
| 1.6055 | 2967 | 0.1361 |
| 1.6061 | 2968 | 0.1079 |
| 1.6066 | 2969 | 0.1347 |
| 1.6071 | 2970 | 0.1103 |
| 1.6077 | 2971 | 0.1854 |
| 1.6082 | 2972 | 0.1594 |
| 1.6088 | 2973 | 0.1764 |
| 1.6093 | 2974 | 0.1445 |
| 1.6098 | 2975 | 0.1491 |
| 1.6104 | 2976 | 0.1394 |
| 1.6109 | 2977 | 0.138 |
| 1.6115 | 2978 | 0.1538 |
| 1.6120 | 2979 | 0.1726 |
| 1.6126 | 2980 | 0.1285 |
| 1.6131 | 2981 | 0.1367 |
| 1.6136 | 2982 | 0.1563 |
| 1.6142 | 2983 | 0.2036 |
| 1.6147 | 2984 | 0.1713 |
| 1.6153 | 2985 | 0.1012 |
| 1.6158 | 2986 | 0.1957 |
| 1.6163 | 2987 | 0.1519 |
| 1.6169 | 2988 | 0.1345 |
| 1.6174 | 2989 | 0.118 |
| 1.6180 | 2990 | 0.1283 |
| 1.6185 | 2991 | 0.0942 |
| 1.6190 | 2992 | 0.0931 |
| 1.6196 | 2993 | 0.1119 |
| 1.6201 | 2994 | 0.1374 |
| 1.6207 | 2995 | 0.1308 |
| 1.6212 | 2996 | 0.0962 |
| 1.6218 | 2997 | 0.1228 |
| 1.6223 | 2998 | 0.1718 |
| 1.6228 | 2999 | 0.1455 |
| 1.6234 | 3000 | 0.1624 |
| 1.6239 | 3001 | 0.1567 |
| 1.6245 | 3002 | 0.1591 |
| 1.625 | 3003 | 0.1764 |
| 1.6255 | 3004 | 0.1648 |
| 1.6261 | 3005 | 0.1203 |
| 1.6266 | 3006 | 0.1064 |
| 1.6272 | 3007 | 0.1487 |
| 1.6277 | 3008 | 0.1023 |
| 1.6282 | 3009 | 0.1281 |
| 1.6288 | 3010 | 0.2009 |
| 1.6293 | 3011 | 0.1376 |
| 1.6299 | 3012 | 0.1217 |
| 1.6304 | 3013 | 0.1479 |
| 1.6310 | 3014 | 0.1045 |
| 1.6315 | 3015 | 0.1687 |
| 1.6320 | 3016 | 0.161 |
| 1.6326 | 3017 | 0.1451 |
| 1.6331 | 3018 | 0.0737 |
| 1.6337 | 3019 | 0.1111 |
| 1.6342 | 3020 | 0.0804 |
| 1.6347 | 3021 | 0.1343 |
| 1.6353 | 3022 | 0.1796 |
| 1.6358 | 3023 | 0.1528 |
| 1.6364 | 3024 | 0.0981 |
| 1.6369 | 3025 | 0.2169 |
| 1.6374 | 3026 | 0.1623 |
| 1.6380 | 3027 | 0.1037 |
| 1.6385 | 3028 | 0.1422 |
| 1.6391 | 3029 | 0.0804 |
| 1.6396 | 3030 | 0.1164 |
| 1.6402 | 3031 | 0.0939 |
| 1.6407 | 3032 | 0.1909 |
| 1.6412 | 3033 | 0.142 |
| 1.6418 | 3034 | 0.1845 |
| 1.6423 | 3035 | 0.2078 |
| 1.6429 | 3036 | 0.1088 |
| 1.6434 | 3037 | 0.1114 |
| 1.6439 | 3038 | 0.1677 |
| 1.6445 | 3039 | 0.1398 |
| 1.6450 | 3040 | 0.1678 |
| 1.6456 | 3041 | 0.1421 |
| 1.6461 | 3042 | 0.1253 |
| 1.6466 | 3043 | 0.1557 |
| 1.6472 | 3044 | 0.1857 |
| 1.6477 | 3045 | 0.1474 |
| 1.6483 | 3046 | 0.1437 |
| 1.6488 | 3047 | 0.1687 |
| 1.6494 | 3048 | 0.1955 |
| 1.6499 | 3049 | 0.155 |
| 1.6504 | 3050 | 0.1629 |
| 1.6510 | 3051 | 0.1783 |
| 1.6515 | 3052 | 0.121 |
| 1.6521 | 3053 | 0.1587 |
| 1.6526 | 3054 | 0.0774 |
| 1.6531 | 3055 | 0.1532 |
| 1.6537 | 3056 | 0.1134 |
| 1.6542 | 3057 | 0.1077 |
| 1.6548 | 3058 | 0.2061 |
| 1.6553 | 3059 | 0.0508 |
| 1.6558 | 3060 | 0.1057 |
| 1.6564 | 3061 | 0.1099 |
| 1.6569 | 3062 | 0.116 |
| 1.6575 | 3063 | 0.1121 |
| 1.6580 | 3064 | 0.1309 |
| 1.6585 | 3065 | 0.1025 |
| 1.6591 | 3066 | 0.1192 |
| 1.6596 | 3067 | 0.0882 |
| 1.6602 | 3068 | 0.1284 |
| 1.6607 | 3069 | 0.1441 |
| 1.6613 | 3070 | 0.1619 |
| 1.6618 | 3071 | 0.1776 |
| 1.6623 | 3072 | 0.1249 |
| 1.6629 | 3073 | 0.2089 |
| 1.6634 | 3074 | 0.1519 |
| 1.6640 | 3075 | 0.1589 |
| 1.6645 | 3076 | 0.1198 |
| 1.6650 | 3077 | 0.1468 |
| 1.6656 | 3078 | 0.1669 |
| 1.6661 | 3079 | 0.096 |
| 1.6667 | 3080 | 0.1229 |
| 1.6672 | 3081 | 0.1307 |
| 1.6677 | 3082 | 0.2001 |
| 1.6683 | 3083 | 0.0936 |
| 1.6688 | 3084 | 0.1411 |
| 1.6694 | 3085 | 0.1243 |
| 1.6699 | 3086 | 0.1443 |
| 1.6705 | 3087 | 0.162 |
| 1.6710 | 3088 | 0.1873 |
| 1.6715 | 3089 | 0.1229 |
| 1.6721 | 3090 | 0.1629 |
| 1.6726 | 3091 | 0.159 |
| 1.6732 | 3092 | 0.1772 |
| 1.6737 | 3093 | 0.1739 |
| 1.6742 | 3094 | 0.1169 |
| 1.6748 | 3095 | 0.1753 |
| 1.6753 | 3096 | 0.1166 |
| 1.6759 | 3097 | 0.1657 |
| 1.6764 | 3098 | 0.1075 |
| 1.6769 | 3099 | 0.0926 |
| 1.6775 | 3100 | 0.183 |
| 1.6780 | 3101 | 0.0923 |
| 1.6786 | 3102 | 0.1186 |
| 1.6791 | 3103 | 0.1287 |
| 1.6797 | 3104 | 0.16 |
| 1.6802 | 3105 | 0.1739 |
| 1.6807 | 3106 | 0.1833 |
| 1.6813 | 3107 | 0.0903 |
| 1.6818 | 3108 | 0.1352 |
| 1.6824 | 3109 | 0.203 |
| 1.6829 | 3110 | 0.1534 |
| 1.6834 | 3111 | 0.178 |
| 1.6840 | 3112 | 0.1323 |
| 1.6845 | 3113 | 0.1022 |
| 1.6851 | 3114 | 0.173 |
| 1.6856 | 3115 | 0.095 |
| 1.6861 | 3116 | 0.1932 |
| 1.6867 | 3117 | 0.1326 |
| 1.6872 | 3118 | 0.1623 |
| 1.6878 | 3119 | 0.1761 |
| 1.6883 | 3120 | 0.2711 |
| 1.6889 | 3121 | 0.1168 |
| 1.6894 | 3122 | 0.1221 |
| 1.6899 | 3123 | 0.1119 |
| 1.6905 | 3124 | 0.1486 |
| 1.6910 | 3125 | 0.12 |
| 1.6916 | 3126 | 0.1934 |
| 1.6921 | 3127 | 0.1454 |
| 1.6926 | 3128 | 0.1667 |
| 1.6932 | 3129 | 0.1224 |
| 1.6937 | 3130 | 0.0719 |
| 1.6943 | 3131 | 0.1296 |
| 1.6948 | 3132 | 0.1708 |
| 1.6953 | 3133 | 0.1507 |
| 1.6959 | 3134 | 0.1656 |
| 1.6964 | 3135 | 0.1595 |
| 1.6970 | 3136 | 0.1159 |
| 1.6975 | 3137 | 0.1458 |
| 1.6981 | 3138 | 0.1114 |
| 1.6986 | 3139 | 0.1317 |
| 1.6991 | 3140 | 0.1181 |
| 1.6997 | 3141 | 0.1627 |
| 1.7002 | 3142 | 0.1402 |
| 1.7008 | 3143 | 0.1928 |
| 1.7013 | 3144 | 0.1342 |
| 1.7018 | 3145 | 0.1767 |
| 1.7024 | 3146 | 0.0987 |
| 1.7029 | 3147 | 0.1176 |
| 1.7035 | 3148 | 0.1816 |
| 1.7040 | 3149 | 0.0761 |
| 1.7045 | 3150 | 0.1893 |
| 1.7051 | 3151 | 0.2155 |
| 1.7056 | 3152 | 0.1736 |
| 1.7062 | 3153 | 0.1795 |
| 1.7067 | 3154 | 0.123 |
| 1.7073 | 3155 | 0.1005 |
| 1.7078 | 3156 | 0.1892 |
| 1.7083 | 3157 | 0.1595 |
| 1.7089 | 3158 | 0.1261 |
| 1.7094 | 3159 | 0.0977 |
| 1.7100 | 3160 | 0.1679 |
| 1.7105 | 3161 | 0.0966 |
| 1.7110 | 3162 | 0.1665 |
| 1.7116 | 3163 | 0.0877 |
| 1.7121 | 3164 | 0.0971 |
| 1.7127 | 3165 | 0.1638 |
| 1.7132 | 3166 | 0.1345 |
| 1.7137 | 3167 | 0.1729 |
| 1.7143 | 3168 | 0.164 |
| 1.7148 | 3169 | 0.1755 |
| 1.7154 | 3170 | 0.0961 |
| 1.7159 | 3171 | 0.1285 |
| 1.7165 | 3172 | 0.1019 |
| 1.7170 | 3173 | 0.1181 |
| 1.7175 | 3174 | 0.1911 |
| 1.7181 | 3175 | 0.109 |
| 1.7186 | 3176 | 0.1131 |
| 1.7192 | 3177 | 0.2062 |
| 1.7197 | 3178 | 0.1409 |
| 1.7202 | 3179 | 0.1724 |
| 1.7208 | 3180 | 0.1846 |
| 1.7213 | 3181 | 0.1084 |
| 1.7219 | 3182 | 0.1459 |
| 1.7224 | 3183 | 0.1078 |
| 1.7229 | 3184 | 0.0871 |
| 1.7235 | 3185 | 0.0964 |
| 1.7240 | 3186 | 0.1168 |
| 1.7246 | 3187 | 0.1347 |
| 1.7251 | 3188 | 0.1683 |
| 1.7256 | 3189 | 0.1724 |
| 1.7262 | 3190 | 0.1386 |
| 1.7267 | 3191 | 0.0963 |
| 1.7273 | 3192 | 0.063 |
| 1.7278 | 3193 | 0.1902 |
| 1.7284 | 3194 | 0.2066 |
| 1.7289 | 3195 | 0.0827 |
| 1.7294 | 3196 | 0.1261 |
| 1.7300 | 3197 | 0.0949 |
| 1.7305 | 3198 | 0.1344 |
| 1.7311 | 3199 | 0.1464 |
| 1.7316 | 3200 | 0.1216 |
| 1.7321 | 3201 | 0.1894 |
| 1.7327 | 3202 | 0.0701 |
| 1.7332 | 3203 | 0.2015 |
| 1.7338 | 3204 | 0.0811 |
| 1.7343 | 3205 | 0.1138 |
| 1.7348 | 3206 | 0.1147 |
| 1.7354 | 3207 | 0.1097 |
| 1.7359 | 3208 | 0.1294 |
| 1.7365 | 3209 | 0.1124 |
| 1.7370 | 3210 | 0.1545 |
| 1.7376 | 3211 | 0.1388 |
| 1.7381 | 3212 | 0.1753 |
| 1.7386 | 3213 | 0.1394 |
| 1.7392 | 3214 | 0.1262 |
| 1.7397 | 3215 | 0.1536 |
| 1.7403 | 3216 | 0.1569 |
| 1.7408 | 3217 | 0.1194 |
| 1.7413 | 3218 | 0.1683 |
| 1.7419 | 3219 | 0.1358 |
| 1.7424 | 3220 | 0.1602 |
| 1.7430 | 3221 | 0.1973 |
| 1.7435 | 3222 | 0.189 |
| 1.7440 | 3223 | 0.0941 |
| 1.7446 | 3224 | 0.1522 |
| 1.7451 | 3225 | 0.2571 |
| 1.7457 | 3226 | 0.107 |
| 1.7462 | 3227 | 0.1244 |
| 1.7468 | 3228 | 0.1371 |
| 1.7473 | 3229 | 0.1221 |
| 1.7478 | 3230 | 0.1079 |
| 1.7484 | 3231 | 0.1406 |
| 1.7489 | 3232 | 0.1388 |
| 1.7495 | 3233 | 0.119 |
| 1.75 | 3234 | 0.1542 |
| 1.7505 | 3235 | 0.1077 |
| 1.7511 | 3236 | 0.1279 |
| 1.7516 | 3237 | 0.1235 |
| 1.7522 | 3238 | 0.1132 |
| 1.7527 | 3239 | 0.1546 |
| 1.7532 | 3240 | 0.2056 |
| 1.7538 | 3241 | 0.2005 |
| 1.7543 | 3242 | 0.1408 |
| 1.7549 | 3243 | 0.1861 |
| 1.7554 | 3244 | 0.1402 |
| 1.7560 | 3245 | 0.1131 |
| 1.7565 | 3246 | 0.0999 |
| 1.7570 | 3247 | 0.0895 |
| 1.7576 | 3248 | 0.1367 |
| 1.7581 | 3249 | 0.1588 |
| 1.7587 | 3250 | 0.1706 |
| 1.7592 | 3251 | 0.1647 |
| 1.7597 | 3252 | 0.1164 |
| 1.7603 | 3253 | 0.1058 |
| 1.7608 | 3254 | 0.1601 |
| 1.7614 | 3255 | 0.1444 |
| 1.7619 | 3256 | 0.1502 |
| 1.7624 | 3257 | 0.1709 |
| 1.7630 | 3258 | 0.1653 |
| 1.7635 | 3259 | 0.0546 |
| 1.7641 | 3260 | 0.1181 |
| 1.7646 | 3261 | 0.1238 |
| 1.7652 | 3262 | 0.1282 |
| 1.7657 | 3263 | 0.1592 |
| 1.7662 | 3264 | 0.1494 |
| 1.7668 | 3265 | 0.2134 |
| 1.7673 | 3266 | 0.1356 |
| 1.7679 | 3267 | 0.1612 |
| 1.7684 | 3268 | 0.1143 |
| 1.7689 | 3269 | 0.2423 |
| 1.7695 | 3270 | 0.0893 |
| 1.7700 | 3271 | 0.1639 |
| 1.7706 | 3272 | 0.2058 |
| 1.7711 | 3273 | 0.0952 |
| 1.7716 | 3274 | 0.1513 |
| 1.7722 | 3275 | 0.1767 |
| 1.7727 | 3276 | 0.2127 |
| 1.7733 | 3277 | 0.1141 |
| 1.7738 | 3278 | 0.0878 |
| 1.7744 | 3279 | 0.1457 |
| 1.7749 | 3280 | 0.1459 |
| 1.7754 | 3281 | 0.1895 |
| 1.7760 | 3282 | 0.1511 |
| 1.7765 | 3283 | 0.1375 |
| 1.7771 | 3284 | 0.2081 |
| 1.7776 | 3285 | 0.0627 |
| 1.7781 | 3286 | 0.1062 |
| 1.7787 | 3287 | 0.1456 |
| 1.7792 | 3288 | 0.1291 |
| 1.7798 | 3289 | 0.1396 |
| 1.7803 | 3290 | 0.1765 |
| 1.7808 | 3291 | 0.0801 |
| 1.7814 | 3292 | 0.1904 |
| 1.7819 | 3293 | 0.128 |
| 1.7825 | 3294 | 0.2003 |
| 1.7830 | 3295 | 0.106 |
| 1.7835 | 3296 | 0.181 |
| 1.7841 | 3297 | 0.1609 |
| 1.7846 | 3298 | 0.1436 |
| 1.7852 | 3299 | 0.1645 |
| 1.7857 | 3300 | 0.0934 |
| 1.7863 | 3301 | 0.1491 |
| 1.7868 | 3302 | 0.1468 |
| 1.7873 | 3303 | 0.199 |
| 1.7879 | 3304 | 0.1303 |
| 1.7884 | 3305 | 0.153 |
| 1.7890 | 3306 | 0.283 |
| 1.7895 | 3307 | 0.0757 |
| 1.7900 | 3308 | 0.1467 |
| 1.7906 | 3309 | 0.1875 |
| 1.7911 | 3310 | 0.1424 |
| 1.7917 | 3311 | 0.2265 |
| 1.7922 | 3312 | 0.1117 |
| 1.7927 | 3313 | 0.0587 |
| 1.7933 | 3314 | 0.1304 |
| 1.7938 | 3315 | 0.1024 |
| 1.7944 | 3316 | 0.0639 |
| 1.7949 | 3317 | 0.153 |
| 1.7955 | 3318 | 0.1512 |
| 1.7960 | 3319 | 0.0981 |
| 1.7965 | 3320 | 0.1406 |
| 1.7971 | 3321 | 0.1339 |
| 1.7976 | 3322 | 0.1035 |
| 1.7982 | 3323 | 0.1035 |
| 1.7987 | 3324 | 0.1349 |
| 1.7992 | 3325 | 0.129 |
| 1.7998 | 3326 | 0.1245 |
| 1.8003 | 3327 | 0.1629 |
| 1.8009 | 3328 | 0.2134 |
| 1.8014 | 3329 | 0.1188 |
| 1.8019 | 3330 | 0.0807 |
| 1.8025 | 3331 | 0.0875 |
| 1.8030 | 3332 | 0.0821 |
| 1.8036 | 3333 | 0.1032 |
| 1.8041 | 3334 | 0.0954 |
| 1.8047 | 3335 | 0.2133 |
| 1.8052 | 3336 | 0.1519 |
| 1.8057 | 3337 | 0.1515 |
| 1.8063 | 3338 | 0.1382 |
| 1.8068 | 3339 | 0.1435 |
| 1.8074 | 3340 | 0.1128 |
| 1.8079 | 3341 | 0.1777 |
| 1.8084 | 3342 | 0.0946 |
| 1.8090 | 3343 | 0.2011 |
| 1.8095 | 3344 | 0.093 |
| 1.8101 | 3345 | 0.0973 |
| 1.8106 | 3346 | 0.1845 |
| 1.8111 | 3347 | 0.1429 |
| 1.8117 | 3348 | 0.1615 |
| 1.8122 | 3349 | 0.0991 |
| 1.8128 | 3350 | 0.1783 |
| 1.8133 | 3351 | 0.1302 |
| 1.8139 | 3352 | 0.1255 |
| 1.8144 | 3353 | 0.1599 |
| 1.8149 | 3354 | 0.1773 |
| 1.8155 | 3355 | 0.1269 |
| 1.8160 | 3356 | 0.1722 |
| 1.8166 | 3357 | 0.1461 |
| 1.8171 | 3358 | 0.1114 |
| 1.8176 | 3359 | 0.1353 |
| 1.8182 | 3360 | 0.0718 |
| 1.8187 | 3361 | 0.1533 |
| 1.8193 | 3362 | 0.1608 |
| 1.8198 | 3363 | 0.1557 |
| 1.8203 | 3364 | 0.108 |
| 1.8209 | 3365 | 0.1231 |
| 1.8214 | 3366 | 0.1569 |
| 1.8220 | 3367 | 0.1251 |
| 1.8225 | 3368 | 0.1815 |
| 1.8231 | 3369 | 0.0905 |
| 1.8236 | 3370 | 0.0907 |
| 1.8241 | 3371 | 0.1921 |
| 1.8247 | 3372 | 0.1161 |
| 1.8252 | 3373 | 0.1267 |
| 1.8258 | 3374 | 0.0967 |
| 1.8263 | 3375 | 0.1281 |
| 1.8268 | 3376 | 0.1678 |
| 1.8274 | 3377 | 0.1545 |
| 1.8279 | 3378 | 0.1062 |
| 1.8285 | 3379 | 0.0841 |
| 1.8290 | 3380 | 0.1217 |
| 1.8295 | 3381 | 0.1513 |
| 1.8301 | 3382 | 0.0934 |
| 1.8306 | 3383 | 0.0874 |
| 1.8312 | 3384 | 0.1124 |
| 1.8317 | 3385 | 0.068 |
| 1.8323 | 3386 | 0.1045 |
| 1.8328 | 3387 | 0.0846 |
| 1.8333 | 3388 | 0.1375 |
| 1.8339 | 3389 | 0.1433 |
| 1.8344 | 3390 | 0.0742 |
| 1.8350 | 3391 | 0.1374 |
| 1.8355 | 3392 | 0.1718 |
| 1.8360 | 3393 | 0.1349 |
| 1.8366 | 3394 | 0.1852 |
| 1.8371 | 3395 | 0.1461 |
| 1.8377 | 3396 | 0.172 |
| 1.8382 | 3397 | 0.1809 |
| 1.8387 | 3398 | 0.1176 |
| 1.8393 | 3399 | 0.0811 |
| 1.8398 | 3400 | 0.144 |
| 1.8404 | 3401 | 0.1064 |
| 1.8409 | 3402 | 0.1666 |
| 1.8415 | 3403 | 0.1239 |
| 1.8420 | 3404 | 0.0793 |
| 1.8425 | 3405 | 0.1357 |
| 1.8431 | 3406 | 0.1008 |
| 1.8436 | 3407 | 0.1187 |
| 1.8442 | 3408 | 0.1157 |
| 1.8447 | 3409 | 0.1793 |
| 1.8452 | 3410 | 0.0997 |
| 1.8458 | 3411 | 0.121 |
| 1.8463 | 3412 | 0.0953 |
| 1.8469 | 3413 | 0.1071 |
| 1.8474 | 3414 | 0.137 |
| 1.8479 | 3415 | 0.1033 |
| 1.8485 | 3416 | 0.1874 |
| 1.8490 | 3417 | 0.0786 |
| 1.8496 | 3418 | 0.1177 |
| 1.8501 | 3419 | 0.145 |
| 1.8506 | 3420 | 0.0869 |
| 1.8512 | 3421 | 0.096 |
| 1.8517 | 3422 | 0.1259 |
| 1.8523 | 3423 | 0.1003 |
| 1.8528 | 3424 | 0.1524 |
| 1.8534 | 3425 | 0.1191 |
| 1.8539 | 3426 | 0.1265 |
| 1.8544 | 3427 | 0.09 |
| 1.8550 | 3428 | 0.1057 |
| 1.8555 | 3429 | 0.139 |
| 1.8561 | 3430 | 0.2413 |
| 1.8566 | 3431 | 0.2455 |
| 1.8571 | 3432 | 0.1225 |
| 1.8577 | 3433 | 0.1909 |
| 1.8582 | 3434 | 0.0837 |
| 1.8588 | 3435 | 0.1787 |
| 1.8593 | 3436 | 0.1265 |
| 1.8598 | 3437 | 0.1458 |
| 1.8604 | 3438 | 0.126 |
| 1.8609 | 3439 | 0.1263 |
| 1.8615 | 3440 | 0.1139 |
| 1.8620 | 3441 | 0.1025 |
| 1.8626 | 3442 | 0.115 |
| 1.8631 | 3443 | 0.0745 |
| 1.8636 | 3444 | 0.2031 |
| 1.8642 | 3445 | 0.1167 |
| 1.8647 | 3446 | 0.1435 |
| 1.8653 | 3447 | 0.1995 |
| 1.8658 | 3448 | 0.0633 |
| 1.8663 | 3449 | 0.1104 |
| 1.8669 | 3450 | 0.0903 |
| 1.8674 | 3451 | 0.2537 |
| 1.8680 | 3452 | 0.1511 |
| 1.8685 | 3453 | 0.1907 |
| 1.8690 | 3454 | 0.1438 |
| 1.8696 | 3455 | 0.1242 |
| 1.8701 | 3456 | 0.1069 |
| 1.8707 | 3457 | 0.1459 |
| 1.8712 | 3458 | 0.1477 |
| 1.8718 | 3459 | 0.199 |
| 1.8723 | 3460 | 0.1307 |
| 1.8728 | 3461 | 0.0553 |
| 1.8734 | 3462 | 0.1462 |
| 1.8739 | 3463 | 0.0752 |
| 1.8745 | 3464 | 0.1257 |
| 1.875 | 3465 | 0.1226 |
| 1.8755 | 3466 | 0.1329 |
| 1.8761 | 3467 | 0.0877 |
| 1.8766 | 3468 | 0.248 |
| 1.8772 | 3469 | 0.1792 |
| 1.8777 | 3470 | 0.1565 |
| 1.8782 | 3471 | 0.0849 |
| 1.8788 | 3472 | 0.0748 |
| 1.8793 | 3473 | 0.1071 |
| 1.8799 | 3474 | 0.2068 |
| 1.8804 | 3475 | 0.1632 |
| 1.8810 | 3476 | 0.1077 |
| 1.8815 | 3477 | 0.1074 |
| 1.8820 | 3478 | 0.1381 |
| 1.8826 | 3479 | 0.1371 |
| 1.8831 | 3480 | 0.1554 |
| 1.8837 | 3481 | 0.0862 |
| 1.8842 | 3482 | 0.1134 |
| 1.8847 | 3483 | 0.0917 |
| 1.8853 | 3484 | 0.1064 |
| 1.8858 | 3485 | 0.107 |
| 1.8864 | 3486 | 0.1672 |
| 1.8869 | 3487 | 0.1369 |
| 1.8874 | 3488 | 0.1046 |
| 1.8880 | 3489 | 0.1897 |
| 1.8885 | 3490 | 0.1445 |
| 1.8891 | 3491 | 0.1557 |
| 1.8896 | 3492 | 0.1475 |
| 1.8902 | 3493 | 0.0924 |
| 1.8907 | 3494 | 0.1195 |
| 1.8912 | 3495 | 0.134 |
| 1.8918 | 3496 | 0.1781 |
| 1.8923 | 3497 | 0.1302 |
| 1.8929 | 3498 | 0.1147 |
| 1.8934 | 3499 | 0.1188 |
| 1.8939 | 3500 | 0.0764 |
| 1.8945 | 3501 | 0.1862 |
| 1.8950 | 3502 | 0.1731 |
| 1.8956 | 3503 | 0.1506 |
| 1.8961 | 3504 | 0.2071 |
| 1.8966 | 3505 | 0.1214 |
| 1.8972 | 3506 | 0.0631 |
| 1.8977 | 3507 | 0.2146 |
| 1.8983 | 3508 | 0.1264 |
| 1.8988 | 3509 | 0.1227 |
| 1.8994 | 3510 | 0.098 |
| 1.8999 | 3511 | 0.1133 |
| 1.9004 | 3512 | 0.108 |
| 1.9010 | 3513 | 0.1429 |
| 1.9015 | 3514 | 0.1002 |
| 1.9021 | 3515 | 0.0996 |
| 1.9026 | 3516 | 0.1665 |
| 1.9031 | 3517 | 0.1244 |
| 1.9037 | 3518 | 0.0964 |
| 1.9042 | 3519 | 0.1123 |
| 1.9048 | 3520 | 0.1491 |
| 1.9053 | 3521 | 0.1112 |
| 1.9058 | 3522 | 0.1046 |
| 1.9064 | 3523 | 0.1169 |
| 1.9069 | 3524 | 0.1542 |
| 1.9075 | 3525 | 0.1642 |
| 1.9080 | 3526 | 0.1363 |
| 1.9085 | 3527 | 0.1173 |
| 1.9091 | 3528 | 0.1342 |
| 1.9096 | 3529 | 0.1478 |
| 1.9102 | 3530 | 0.1453 |
| 1.9107 | 3531 | 0.0621 |
| 1.9113 | 3532 | 0.1445 |
| 1.9118 | 3533 | 0.2127 |
| 1.9123 | 3534 | 0.1118 |
| 1.9129 | 3535 | 0.1104 |
| 1.9134 | 3536 | 0.1489 |
| 1.9140 | 3537 | 0.1287 |
| 1.9145 | 3538 | 0.1405 |
| 1.9150 | 3539 | 0.1815 |
| 1.9156 | 3540 | 0.2112 |
| 1.9161 | 3541 | 0.093 |
| 1.9167 | 3542 | 0.1712 |
| 1.9172 | 3543 | 0.1072 |
| 1.9177 | 3544 | 0.1197 |
| 1.9183 | 3545 | 0.0915 |
| 1.9188 | 3546 | 0.1916 |
| 1.9194 | 3547 | 0.1411 |
| 1.9199 | 3548 | 0.0798 |
| 1.9205 | 3549 | 0.1057 |
| 1.9210 | 3550 | 0.1458 |
| 1.9215 | 3551 | 0.2198 |
| 1.9221 | 3552 | 0.1297 |
| 1.9226 | 3553 | 0.1554 |
| 1.9232 | 3554 | 0.1415 |
| 1.9237 | 3555 | 0.1241 |
| 1.9242 | 3556 | 0.1127 |
| 1.9248 | 3557 | 0.1121 |
| 1.9253 | 3558 | 0.0801 |
| 1.9259 | 3559 | 0.1509 |
| 1.9264 | 3560 | 0.2069 |
| 1.9269 | 3561 | 0.2097 |
| 1.9275 | 3562 | 0.148 |
| 1.9280 | 3563 | 0.1672 |
| 1.9286 | 3564 | 0.1582 |
| 1.9291 | 3565 | 0.0763 |
| 1.9297 | 3566 | 0.0645 |
| 1.9302 | 3567 | 0.0952 |
| 1.9307 | 3568 | 0.1827 |
| 1.9313 | 3569 | 0.135 |
| 1.9318 | 3570 | 0.1278 |
| 1.9324 | 3571 | 0.1181 |
| 1.9329 | 3572 | 0.1053 |
| 1.9334 | 3573 | 0.0809 |
| 1.9340 | 3574 | 0.1149 |
| 1.9345 | 3575 | 0.1938 |
| 1.9351 | 3576 | 0.2437 |
| 1.9356 | 3577 | 0.1428 |
| 1.9361 | 3578 | 0.2036 |
| 1.9367 | 3579 | 0.2326 |
| 1.9372 | 3580 | 0.1331 |
| 1.9378 | 3581 | 0.2583 |
| 1.9383 | 3582 | 0.0846 |
| 1.9389 | 3583 | 0.1506 |
| 1.9394 | 3584 | 0.1001 |
| 1.9399 | 3585 | 0.0816 |
| 1.9405 | 3586 | 0.1654 |
| 1.9410 | 3587 | 0.1472 |
| 1.9416 | 3588 | 0.134 |
| 1.9421 | 3589 | 0.161 |
| 1.9426 | 3590 | 0.0722 |
| 1.9432 | 3591 | 0.1112 |
| 1.9437 | 3592 | 0.1371 |
| 1.9443 | 3593 | 0.154 |
| 1.9448 | 3594 | 0.1929 |
| 1.9453 | 3595 | 0.0807 |
| 1.9459 | 3596 | 0.1179 |
| 1.9464 | 3597 | 0.1123 |
| 1.9470 | 3598 | 0.1316 |
| 1.9475 | 3599 | 0.1715 |
| 1.9481 | 3600 | 0.0725 |
| 1.9486 | 3601 | 0.1253 |
| 1.9491 | 3602 | 0.1063 |
| 1.9497 | 3603 | 0.0637 |
| 1.9502 | 3604 | 0.1376 |
| 1.9508 | 3605 | 0.1692 |
| 1.9513 | 3606 | 0.1444 |
| 1.9518 | 3607 | 0.1961 |
| 1.9524 | 3608 | 0.1237 |
| 1.9529 | 3609 | 0.1656 |
| 1.9535 | 3610 | 0.1268 |
| 1.9540 | 3611 | 0.1228 |
| 1.9545 | 3612 | 0.1233 |
| 1.9551 | 3613 | 0.1336 |
| 1.9556 | 3614 | 0.1778 |
| 1.9562 | 3615 | 0.1584 |
| 1.9567 | 3616 | 0.0842 |
| 1.9573 | 3617 | 0.1075 |
| 1.9578 | 3618 | 0.1397 |
| 1.9583 | 3619 | 0.0974 |
| 1.9589 | 3620 | 0.1425 |
| 1.9594 | 3621 | 0.1124 |
| 1.9600 | 3622 | 0.1553 |
| 1.9605 | 3623 | 0.1082 |
| 1.9610 | 3624 | 0.2172 |
| 1.9616 | 3625 | 0.136 |
| 1.9621 | 3626 | 0.1152 |
| 1.9627 | 3627 | 0.187 |
| 1.9632 | 3628 | 0.0914 |
| 1.9637 | 3629 | 0.1043 |
| 1.9643 | 3630 | 0.1044 |
| 1.9648 | 3631 | 0.2358 |
| 1.9654 | 3632 | 0.088 |
| 1.9659 | 3633 | 0.085 |
| 1.9665 | 3634 | 0.139 |
| 1.9670 | 3635 | 0.0856 |
| 1.9675 | 3636 | 0.2187 |
| 1.9681 | 3637 | 0.163 |
| 1.9686 | 3638 | 0.1722 |
| 1.9692 | 3639 | 0.1082 |
| 1.9697 | 3640 | 0.1889 |
| 1.9702 | 3641 | 0.1865 |
| 1.9708 | 3642 | 0.1392 |
| 1.9713 | 3643 | 0.1366 |
| 1.9719 | 3644 | 0.2583 |
| 1.9724 | 3645 | 0.1249 |
| 1.9729 | 3646 | 0.1803 |
| 1.9735 | 3647 | 0.142 |
| 1.9740 | 3648 | 0.1319 |
| 1.9746 | 3649 | 0.1987 |
| 1.9751 | 3650 | 0.0852 |
| 1.9756 | 3651 | 0.1585 |
| 1.9762 | 3652 | 0.1135 |
| 1.9767 | 3653 | 0.1252 |
| 1.9773 | 3654 | 0.1397 |
| 1.9778 | 3655 | 0.1603 |
| 1.9784 | 3656 | 0.1126 |
| 1.9789 | 3657 | 0.1612 |
| 1.9794 | 3658 | 0.0826 |
| 1.9800 | 3659 | 0.2043 |
| 1.9805 | 3660 | 0.1867 |
| 1.9811 | 3661 | 0.1596 |
| 1.9816 | 3662 | 0.1495 |
| 1.9821 | 3663 | 0.111 |
| 1.9827 | 3664 | 0.1552 |
| 1.9832 | 3665 | 0.1026 |
| 1.9838 | 3666 | 0.1031 |
| 1.9843 | 3667 | 0.1257 |
| 1.9848 | 3668 | 0.1231 |
| 1.9854 | 3669 | 0.1231 |
| 1.9859 | 3670 | 0.0685 |
| 1.9865 | 3671 | 0.116 |
| 1.9870 | 3672 | 0.1789 |
| 1.9876 | 3673 | 0.0924 |
| 1.9881 | 3674 | 0.1606 |
| 1.9886 | 3675 | 0.1572 |
| 1.9892 | 3676 | 0.1543 |
| 1.9897 | 3677 | 0.0975 |
| 1.9903 | 3678 | 0.1347 |
| 1.9908 | 3679 | 0.0963 |
| 1.9913 | 3680 | 0.085 |
| 1.9919 | 3681 | 0.2071 |
| 1.9924 | 3682 | 0.1415 |
| 1.9930 | 3683 | 0.1221 |
| 1.9935 | 3684 | 0.1796 |
| 1.9940 | 3685 | 0.2002 |
| 1.9946 | 3686 | 0.1112 |
| 1.9951 | 3687 | 0.201 |
| 1.9957 | 3688 | 0.1599 |
| 1.9962 | 3689 | 0.1209 |
| 1.9968 | 3690 | 0.1429 |
| 1.9973 | 3691 | 0.1201 |
| 1.9978 | 3692 | 0.0651 |
| 1.9984 | 3693 | 0.0922 |
| 1.9989 | 3694 | 0.1071 |
| 1.9995 | 3695 | 0.1756 |
| 2.0000 | 3696 | 0.1262 |
| 2.0005 | 3697 | 0.0868 |
| 2.0011 | 3698 | 0.1027 |
| 2.0016 | 3699 | 0.1486 |
| 2.0022 | 3700 | 0.1695 |
| 2.0027 | 3701 | 0.1058 |
| 2.0032 | 3702 | 0.0815 |
| 2.0038 | 3703 | 0.1181 |
| 2.0043 | 3704 | 0.0835 |
| 2.0049 | 3705 | 0.1223 |
| 2.0054 | 3706 | 0.1353 |
| 2.0060 | 3707 | 0.12 |
| 2.0065 | 3708 | 0.0961 |
| 2.0070 | 3709 | 0.0656 |
| 2.0076 | 3710 | 0.038 |
| 2.0081 | 3711 | 0.1012 |
| 2.0087 | 3712 | 0.1237 |
| 2.0092 | 3713 | 0.0787 |
| 2.0097 | 3714 | 0.1479 |
| 2.0103 | 3715 | 0.1126 |
| 2.0108 | 3716 | 0.0549 |
| 2.0114 | 3717 | 0.1574 |
| 2.0119 | 3718 | 0.0952 |
| 2.0124 | 3719 | 0.0893 |
| 2.0130 | 3720 | 0.1139 |
| 2.0135 | 3721 | 0.0671 |
| 2.0141 | 3722 | 0.0589 |
| 2.0146 | 3723 | 0.059 |
| 2.0152 | 3724 | 0.111 |
| 2.0157 | 3725 | 0.1269 |
| 2.0162 | 3726 | 0.0707 |
| 2.0168 | 3727 | 0.1661 |
| 2.0173 | 3728 | 0.1041 |
| 2.0179 | 3729 | 0.1088 |
| 2.0184 | 3730 | 0.1043 |
| 2.0189 | 3731 | 0.0852 |
| 2.0195 | 3732 | 0.1726 |
| 2.0200 | 3733 | 0.1209 |
| 2.0206 | 3734 | 0.0976 |
| 2.0211 | 3735 | 0.0573 |
| 2.0216 | 3736 | 0.1027 |
| 2.0222 | 3737 | 0.1287 |
| 2.0227 | 3738 | 0.112 |
| 2.0233 | 3739 | 0.0923 |
| 2.0238 | 3740 | 0.0919 |
| 2.0244 | 3741 | 0.1019 |
| 2.0249 | 3742 | 0.0851 |
| 2.0254 | 3743 | 0.1342 |
| 2.0260 | 3744 | 0.0892 |
| 2.0265 | 3745 | 0.0756 |
| 2.0271 | 3746 | 0.1088 |
| 2.0276 | 3747 | 0.0686 |
| 2.0281 | 3748 | 0.1172 |
| 2.0287 | 3749 | 0.1495 |
| 2.0292 | 3750 | 0.1435 |
| 2.0298 | 3751 | 0.0527 |
| 2.0303 | 3752 | 0.1294 |
| 2.0308 | 3753 | 0.1277 |
| 2.0314 | 3754 | 0.1159 |
| 2.0319 | 3755 | 0.0771 |
| 2.0325 | 3756 | 0.0853 |
| 2.0330 | 3757 | 0.0985 |
| 2.0335 | 3758 | 0.1415 |
| 2.0341 | 3759 | 0.0838 |
| 2.0346 | 3760 | 0.1366 |
| 2.0352 | 3761 | 0.0977 |
| 2.0357 | 3762 | 0.0902 |
| 2.0363 | 3763 | 0.1419 |
| 2.0368 | 3764 | 0.1301 |
| 2.0373 | 3765 | 0.0851 |
| 2.0379 | 3766 | 0.1039 |
| 2.0384 | 3767 | 0.0877 |
| 2.0390 | 3768 | 0.1048 |
| 2.0395 | 3769 | 0.1027 |
| 2.0400 | 3770 | 0.1365 |
| 2.0406 | 3771 | 0.0866 |
| 2.0411 | 3772 | 0.1162 |
| 2.0417 | 3773 | 0.1088 |
| 2.0422 | 3774 | 0.141 |
| 2.0427 | 3775 | 0.1279 |
| 2.0433 | 3776 | 0.1089 |
| 2.0438 | 3777 | 0.1693 |
| 2.0444 | 3778 | 0.0725 |
| 2.0449 | 3779 | 0.0506 |
| 2.0455 | 3780 | 0.112 |
| 2.0460 | 3781 | 0.0538 |
| 2.0465 | 3782 | 0.0965 |
| 2.0471 | 3783 | 0.1196 |
| 2.0476 | 3784 | 0.125 |
| 2.0482 | 3785 | 0.0992 |
| 2.0487 | 3786 | 0.0738 |
| 2.0492 | 3787 | 0.0956 |
| 2.0498 | 3788 | 0.1151 |
| 2.0503 | 3789 | 0.1054 |
| 2.0509 | 3790 | 0.0682 |
| 2.0514 | 3791 | 0.1283 |
| 2.0519 | 3792 | 0.0844 |
| 2.0525 | 3793 | 0.1502 |
| 2.0530 | 3794 | 0.0807 |
| 2.0536 | 3795 | 0.1474 |
| 2.0541 | 3796 | 0.0814 |
| 2.0547 | 3797 | 0.0911 |
| 2.0552 | 3798 | 0.1086 |
| 2.0557 | 3799 | 0.0911 |
| 2.0563 | 3800 | 0.0924 |
| 2.0568 | 3801 | 0.0949 |
| 2.0574 | 3802 | 0.1195 |
| 2.0579 | 3803 | 0.101 |
| 2.0584 | 3804 | 0.0939 |
| 2.0590 | 3805 | 0.1092 |
| 2.0595 | 3806 | 0.0833 |
| 2.0601 | 3807 | 0.0799 |
| 2.0606 | 3808 | 0.1366 |
| 2.0611 | 3809 | 0.0937 |
| 2.0617 | 3810 | 0.1322 |
| 2.0622 | 3811 | 0.0937 |
| 2.0628 | 3812 | 0.0989 |
| 2.0633 | 3813 | 0.112 |
| 2.0639 | 3814 | 0.1116 |
| 2.0644 | 3815 | 0.0603 |
| 2.0649 | 3816 | 0.1627 |
| 2.0655 | 3817 | 0.0453 |
| 2.0660 | 3818 | 0.1235 |
| 2.0666 | 3819 | 0.1528 |
| 2.0671 | 3820 | 0.1288 |
| 2.0676 | 3821 | 0.0948 |
| 2.0682 | 3822 | 0.113 |
| 2.0687 | 3823 | 0.0893 |
| 2.0693 | 3824 | 0.1466 |
| 2.0698 | 3825 | 0.0838 |
| 2.0703 | 3826 | 0.1004 |
| 2.0709 | 3827 | 0.152 |
| 2.0714 | 3828 | 0.067 |
| 2.0720 | 3829 | 0.1333 |
| 2.0725 | 3830 | 0.1321 |
| 2.0731 | 3831 | 0.055 |
| 2.0736 | 3832 | 0.1072 |
| 2.0741 | 3833 | 0.1392 |
| 2.0747 | 3834 | 0.0628 |
| 2.0752 | 3835 | 0.1286 |
| 2.0758 | 3836 | 0.1403 |
| 2.0763 | 3837 | 0.0843 |
| 2.0768 | 3838 | 0.1565 |
| 2.0774 | 3839 | 0.1061 |
| 2.0779 | 3840 | 0.1434 |
| 2.0785 | 3841 | 0.1105 |
| 2.0790 | 3842 | 0.1042 |
| 2.0795 | 3843 | 0.0974 |
| 2.0801 | 3844 | 0.0706 |
| 2.0806 | 3845 | 0.106 |
| 2.0812 | 3846 | 0.1558 |
| 2.0817 | 3847 | 0.1305 |
| 2.0823 | 3848 | 0.0913 |
| 2.0828 | 3849 | 0.1394 |
| 2.0833 | 3850 | 0.162 |
| 2.0839 | 3851 | 0.0883 |
| 2.0844 | 3852 | 0.1376 |
| 2.0850 | 3853 | 0.0624 |
| 2.0855 | 3854 | 0.0639 |
| 2.0860 | 3855 | 0.1365 |
| 2.0866 | 3856 | 0.0804 |
| 2.0871 | 3857 | 0.1241 |
| 2.0877 | 3858 | 0.1047 |
| 2.0882 | 3859 | 0.0719 |
| 2.0887 | 3860 | 0.1125 |
| 2.0893 | 3861 | 0.1549 |
| 2.0898 | 3862 | 0.0932 |
| 2.0904 | 3863 | 0.1644 |
| 2.0909 | 3864 | 0.0918 |
| 2.0915 | 3865 | 0.1134 |
| 2.0920 | 3866 | 0.1155 |
| 2.0925 | 3867 | 0.1056 |
| 2.0931 | 3868 | 0.1158 |
| 2.0936 | 3869 | 0.0948 |
| 2.0942 | 3870 | 0.122 |
| 2.0947 | 3871 | 0.0504 |
| 2.0952 | 3872 | 0.1643 |
| 2.0958 | 3873 | 0.1064 |
| 2.0963 | 3874 | 0.1197 |
| 2.0969 | 3875 | 0.1816 |
| 2.0974 | 3876 | 0.0956 |
| 2.0979 | 3877 | 0.1038 |
| 2.0985 | 3878 | 0.0406 |
| 2.0990 | 3879 | 0.0819 |
| 2.0996 | 3880 | 0.1045 |
| 2.1001 | 3881 | 0.1393 |
| 2.1006 | 3882 | 0.1046 |
| 2.1012 | 3883 | 0.1299 |
| 2.1017 | 3884 | 0.1438 |
| 2.1023 | 3885 | 0.0821 |
| 2.1028 | 3886 | 0.0581 |
| 2.1034 | 3887 | 0.0722 |
| 2.1039 | 3888 | 0.0921 |
| 2.1044 | 3889 | 0.0812 |
| 2.1050 | 3890 | 0.0918 |
| 2.1055 | 3891 | 0.1152 |
| 2.1061 | 3892 | 0.1103 |
| 2.1066 | 3893 | 0.1367 |
| 2.1071 | 3894 | 0.0771 |
| 2.1077 | 3895 | 0.1125 |
| 2.1082 | 3896 | 0.0814 |
| 2.1088 | 3897 | 0.0998 |
| 2.1093 | 3898 | 0.1494 |
| 2.1098 | 3899 | 0.0778 |
| 2.1104 | 3900 | 0.0717 |
| 2.1109 | 3901 | 0.1093 |
| 2.1115 | 3902 | 0.0516 |
| 2.1120 | 3903 | 0.0955 |
| 2.1126 | 3904 | 0.0789 |
| 2.1131 | 3905 | 0.0894 |
| 2.1136 | 3906 | 0.0763 |
| 2.1142 | 3907 | 0.123 |
| 2.1147 | 3908 | 0.0784 |
| 2.1153 | 3909 | 0.0523 |
| 2.1158 | 3910 | 0.0961 |
| 2.1163 | 3911 | 0.1403 |
| 2.1169 | 3912 | 0.0983 |
| 2.1174 | 3913 | 0.0937 |
| 2.1180 | 3914 | 0.0822 |
| 2.1185 | 3915 | 0.1102 |
| 2.1190 | 3916 | 0.0982 |
| 2.1196 | 3917 | 0.0886 |
| 2.1201 | 3918 | 0.0752 |
| 2.1207 | 3919 | 0.1724 |
| 2.1212 | 3920 | 0.0537 |
| 2.1218 | 3921 | 0.0853 |
| 2.1223 | 3922 | 0.0754 |
| 2.1228 | 3923 | 0.0754 |
| 2.1234 | 3924 | 0.179 |
| 2.1239 | 3925 | 0.1207 |
| 2.1245 | 3926 | 0.1055 |
| 2.1250 | 3927 | 0.0881 |
| 2.1255 | 3928 | 0.0856 |
| 2.1261 | 3929 | 0.0806 |
| 2.1266 | 3930 | 0.0629 |
| 2.1272 | 3931 | 0.0866 |
| 2.1277 | 3932 | 0.1315 |
| 2.1282 | 3933 | 0.0533 |
| 2.1288 | 3934 | 0.1413 |
| 2.1293 | 3935 | 0.0582 |
| 2.1299 | 3936 | 0.093 |
| 2.1304 | 3937 | 0.1078 |
| 2.1310 | 3938 | 0.0617 |
| 2.1315 | 3939 | 0.0888 |
| 2.1320 | 3940 | 0.1497 |
| 2.1326 | 3941 | 0.0848 |
| 2.1331 | 3942 | 0.1136 |
| 2.1337 | 3943 | 0.1066 |
| 2.1342 | 3944 | 0.1173 |
| 2.1347 | 3945 | 0.0853 |
| 2.1353 | 3946 | 0.0714 |
| 2.1358 | 3947 | 0.114 |
| 2.1364 | 3948 | 0.0637 |
| 2.1369 | 3949 | 0.0674 |
| 2.1374 | 3950 | 0.1088 |
| 2.1380 | 3951 | 0.0939 |
| 2.1385 | 3952 | 0.0938 |
| 2.1391 | 3953 | 0.0858 |
| 2.1396 | 3954 | 0.0971 |
| 2.1402 | 3955 | 0.0854 |
| 2.1407 | 3956 | 0.1242 |
| 2.1412 | 3957 | 0.0759 |
| 2.1418 | 3958 | 0.1041 |
| 2.1423 | 3959 | 0.0794 |
| 2.1429 | 3960 | 0.1637 |
| 2.1434 | 3961 | 0.0948 |
| 2.1439 | 3962 | 0.1284 |
| 2.1445 | 3963 | 0.1894 |
| 2.1450 | 3964 | 0.1456 |
| 2.1456 | 3965 | 0.1867 |
| 2.1461 | 3966 | 0.0972 |
| 2.1466 | 3967 | 0.1385 |
| 2.1472 | 3968 | 0.0772 |
| 2.1477 | 3969 | 0.1075 |
| 2.1483 | 3970 | 0.0721 |
| 2.1488 | 3971 | 0.0743 |
| 2.1494 | 3972 | 0.073 |
| 2.1499 | 3973 | 0.1095 |
| 2.1504 | 3974 | 0.1036 |
| 2.1510 | 3975 | 0.1059 |
| 2.1515 | 3976 | 0.1541 |
| 2.1521 | 3977 | 0.1267 |
| 2.1526 | 3978 | 0.1006 |
| 2.1531 | 3979 | 0.1332 |
| 2.1537 | 3980 | 0.1288 |
| 2.1542 | 3981 | 0.1526 |
| 2.1548 | 3982 | 0.0997 |
| 2.1553 | 3983 | 0.0731 |
| 2.1558 | 3984 | 0.1243 |
| 2.1564 | 3985 | 0.0679 |
| 2.1569 | 3986 | 0.0804 |
| 2.1575 | 3987 | 0.1052 |
| 2.1580 | 3988 | 0.077 |
| 2.1585 | 3989 | 0.1017 |
| 2.1591 | 3990 | 0.1088 |
| 2.1596 | 3991 | 0.0804 |
| 2.1602 | 3992 | 0.0719 |
| 2.1607 | 3993 | 0.1131 |
| 2.1613 | 3994 | 0.1274 |
| 2.1618 | 3995 | 0.0724 |
| 2.1623 | 3996 | 0.0746 |
| 2.1629 | 3997 | 0.063 |
| 2.1634 | 3998 | 0.0863 |
| 2.1640 | 3999 | 0.0846 |
| 2.1645 | 4000 | 0.079 |
| 2.1650 | 4001 | 0.1367 |
| 2.1656 | 4002 | 0.1341 |
| 2.1661 | 4003 | 0.1123 |
| 2.1667 | 4004 | 0.1425 |
| 2.1672 | 4005 | 0.1205 |
| 2.1677 | 4006 | 0.1473 |
| 2.1683 | 4007 | 0.0794 |
| 2.1688 | 4008 | 0.0954 |
| 2.1694 | 4009 | 0.0921 |
| 2.1699 | 4010 | 0.0788 |
| 2.1705 | 4011 | 0.1574 |
| 2.1710 | 4012 | 0.0662 |
| 2.1715 | 4013 | 0.1068 |
| 2.1721 | 4014 | 0.0479 |
| 2.1726 | 4015 | 0.1175 |
| 2.1732 | 4016 | 0.1035 |
| 2.1737 | 4017 | 0.1186 |
| 2.1742 | 4018 | 0.0867 |
| 2.1748 | 4019 | 0.0906 |
| 2.1753 | 4020 | 0.1025 |
| 2.1759 | 4021 | 0.1167 |
| 2.1764 | 4022 | 0.1128 |
| 2.1769 | 4023 | 0.0804 |
| 2.1775 | 4024 | 0.1359 |
| 2.1780 | 4025 | 0.1061 |
| 2.1786 | 4026 | 0.0769 |
| 2.1791 | 4027 | 0.1563 |
| 2.1797 | 4028 | 0.0662 |
| 2.1802 | 4029 | 0.0787 |
| 2.1807 | 4030 | 0.1493 |
| 2.1813 | 4031 | 0.12 |
| 2.1818 | 4032 | 0.1276 |
| 2.1824 | 4033 | 0.1189 |
| 2.1829 | 4034 | 0.0666 |
| 2.1834 | 4035 | 0.1168 |
| 2.1840 | 4036 | 0.0964 |
| 2.1845 | 4037 | 0.0924 |
| 2.1851 | 4038 | 0.0928 |
| 2.1856 | 4039 | 0.1565 |
| 2.1861 | 4040 | 0.1583 |
| 2.1867 | 4041 | 0.0943 |
| 2.1872 | 4042 | 0.068 |
| 2.1878 | 4043 | 0.1038 |
| 2.1883 | 4044 | 0.0523 |
| 2.1889 | 4045 | 0.0995 |
| 2.1894 | 4046 | 0.1014 |
| 2.1899 | 4047 | 0.0666 |
| 2.1905 | 4048 | 0.0973 |
| 2.1910 | 4049 | 0.0504 |
| 2.1916 | 4050 | 0.0871 |
| 2.1921 | 4051 | 0.1233 |
| 2.1926 | 4052 | 0.1242 |
| 2.1932 | 4053 | 0.1249 |
| 2.1937 | 4054 | 0.0793 |
| 2.1943 | 4055 | 0.0811 |
| 2.1948 | 4056 | 0.0852 |
| 2.1953 | 4057 | 0.0694 |
| 2.1959 | 4058 | 0.0753 |
| 2.1964 | 4059 | 0.1074 |
| 2.1970 | 4060 | 0.0658 |
| 2.1975 | 4061 | 0.1055 |
| 2.1981 | 4062 | 0.0995 |
| 2.1986 | 4063 | 0.1394 |
| 2.1991 | 4064 | 0.1109 |
| 2.1997 | 4065 | 0.1443 |
| 2.2002 | 4066 | 0.0827 |
| 2.2008 | 4067 | 0.0973 |
| 2.2013 | 4068 | 0.122 |
| 2.2018 | 4069 | 0.1324 |
| 2.2024 | 4070 | 0.0841 |
| 2.2029 | 4071 | 0.0811 |
| 2.2035 | 4072 | 0.1299 |
| 2.2040 | 4073 | 0.0619 |
| 2.2045 | 4074 | 0.0995 |
| 2.2051 | 4075 | 0.0774 |
| 2.2056 | 4076 | 0.0703 |
| 2.2062 | 4077 | 0.0908 |
| 2.2067 | 4078 | 0.0993 |
| 2.2073 | 4079 | 0.1208 |
| 2.2078 | 4080 | 0.0709 |
| 2.2083 | 4081 | 0.0961 |
| 2.2089 | 4082 | 0.086 |
| 2.2094 | 4083 | 0.1744 |
| 2.2100 | 4084 | 0.1104 |
| 2.2105 | 4085 | 0.0948 |
| 2.2110 | 4086 | 0.1372 |
| 2.2116 | 4087 | 0.0504 |
| 2.2121 | 4088 | 0.1078 |
| 2.2127 | 4089 | 0.0678 |
| 2.2132 | 4090 | 0.1113 |
| 2.2137 | 4091 | 0.1535 |
| 2.2143 | 4092 | 0.1114 |
| 2.2148 | 4093 | 0.1493 |
| 2.2154 | 4094 | 0.1055 |
| 2.2159 | 4095 | 0.112 |
| 2.2165 | 4096 | 0.0994 |
| 2.2170 | 4097 | 0.1593 |
| 2.2175 | 4098 | 0.1368 |
| 2.2181 | 4099 | 0.1037 |
| 2.2186 | 4100 | 0.1092 |
| 2.2192 | 4101 | 0.1238 |
| 2.2197 | 4102 | 0.1298 |
| 2.2202 | 4103 | 0.0702 |
| 2.2208 | 4104 | 0.0597 |
| 2.2213 | 4105 | 0.0925 |
| 2.2219 | 4106 | 0.1104 |
| 2.2224 | 4107 | 0.0928 |
| 2.2229 | 4108 | 0.1117 |
| 2.2235 | 4109 | 0.0586 |
| 2.2240 | 4110 | 0.0577 |
| 2.2246 | 4111 | 0.1112 |
| 2.2251 | 4112 | 0.1522 |
| 2.2256 | 4113 | 0.0973 |
| 2.2262 | 4114 | 0.1223 |
| 2.2267 | 4115 | 0.0874 |
| 2.2273 | 4116 | 0.063 |
| 2.2278 | 4117 | 0.1704 |
| 2.2284 | 4118 | 0.0657 |
| 2.2289 | 4119 | 0.0901 |
| 2.2294 | 4120 | 0.0951 |
| 2.2300 | 4121 | 0.1019 |
| 2.2305 | 4122 | 0.0885 |
| 2.2311 | 4123 | 0.109 |
| 2.2316 | 4124 | 0.0901 |
| 2.2321 | 4125 | 0.0914 |
| 2.2327 | 4126 | 0.1287 |
| 2.2332 | 4127 | 0.0676 |
| 2.2338 | 4128 | 0.0904 |
| 2.2343 | 4129 | 0.0666 |
| 2.2348 | 4130 | 0.1157 |
| 2.2354 | 4131 | 0.057 |
| 2.2359 | 4132 | 0.1128 |
| 2.2365 | 4133 | 0.1123 |
| 2.2370 | 4134 | 0.1625 |
| 2.2376 | 4135 | 0.1028 |
| 2.2381 | 4136 | 0.1311 |
| 2.2386 | 4137 | 0.1246 |
| 2.2392 | 4138 | 0.0961 |
| 2.2397 | 4139 | 0.0611 |
| 2.2403 | 4140 | 0.1273 |
| 2.2408 | 4141 | 0.0837 |
| 2.2413 | 4142 | 0.0704 |
| 2.2419 | 4143 | 0.0873 |
| 2.2424 | 4144 | 0.1671 |
| 2.2430 | 4145 | 0.1129 |
| 2.2435 | 4146 | 0.0876 |
| 2.2440 | 4147 | 0.2113 |
| 2.2446 | 4148 | 0.0625 |
| 2.2451 | 4149 | 0.0901 |
| 2.2457 | 4150 | 0.0904 |
| 2.2462 | 4151 | 0.1606 |
| 2.2468 | 4152 | 0.1389 |
| 2.2473 | 4153 | 0.0992 |
| 2.2478 | 4154 | 0.1092 |
| 2.2484 | 4155 | 0.1034 |
| 2.2489 | 4156 | 0.0779 |
| 2.2495 | 4157 | 0.0854 |
| 2.2500 | 4158 | 0.0727 |
| 2.2505 | 4159 | 0.1041 |
| 2.2511 | 4160 | 0.1241 |
| 2.2516 | 4161 | 0.0997 |
| 2.2522 | 4162 | 0.1624 |
| 2.2527 | 4163 | 0.1205 |
| 2.2532 | 4164 | 0.0662 |
| 2.2538 | 4165 | 0.1134 |
| 2.2543 | 4166 | 0.149 |
| 2.2549 | 4167 | 0.1096 |
| 2.2554 | 4168 | 0.0824 |
| 2.2560 | 4169 | 0.1041 |
| 2.2565 | 4170 | 0.0633 |
| 2.2570 | 4171 | 0.1127 |
| 2.2576 | 4172 | 0.1091 |
| 2.2581 | 4173 | 0.0884 |
| 2.2587 | 4174 | 0.1381 |
| 2.2592 | 4175 | 0.046 |
| 2.2597 | 4176 | 0.0716 |
| 2.2603 | 4177 | 0.0777 |
| 2.2608 | 4178 | 0.0834 |
| 2.2614 | 4179 | 0.1346 |
| 2.2619 | 4180 | 0.0581 |
| 2.2624 | 4181 | 0.1161 |
| 2.2630 | 4182 | 0.0832 |
| 2.2635 | 4183 | 0.1127 |
| 2.2641 | 4184 | 0.0995 |
| 2.2646 | 4185 | 0.047 |
| 2.2652 | 4186 | 0.0922 |
| 2.2657 | 4187 | 0.0903 |
| 2.2662 | 4188 | 0.1511 |
| 2.2668 | 4189 | 0.0783 |
| 2.2673 | 4190 | 0.098 |
| 2.2679 | 4191 | 0.1311 |
| 2.2684 | 4192 | 0.0701 |
| 2.2689 | 4193 | 0.176 |
| 2.2695 | 4194 | 0.152 |
| 2.2700 | 4195 | 0.1689 |
| 2.2706 | 4196 | 0.079 |
| 2.2711 | 4197 | 0.0915 |
| 2.2716 | 4198 | 0.1435 |
| 2.2722 | 4199 | 0.1123 |
| 2.2727 | 4200 | 0.1261 |
| 2.2733 | 4201 | 0.1041 |
| 2.2738 | 4202 | 0.1172 |
| 2.2744 | 4203 | 0.1188 |
| 2.2749 | 4204 | 0.1107 |
| 2.2754 | 4205 | 0.0528 |
| 2.2760 | 4206 | 0.1239 |
| 2.2765 | 4207 | 0.1241 |
| 2.2771 | 4208 | 0.0866 |
| 2.2776 | 4209 | 0.0725 |
| 2.2781 | 4210 | 0.0885 |
| 2.2787 | 4211 | 0.1362 |
| 2.2792 | 4212 | 0.1316 |
| 2.2798 | 4213 | 0.1074 |
| 2.2803 | 4214 | 0.0814 |
| 2.2808 | 4215 | 0.1032 |
| 2.2814 | 4216 | 0.1029 |
| 2.2819 | 4217 | 0.0736 |
| 2.2825 | 4218 | 0.0611 |
| 2.2830 | 4219 | 0.1528 |
| 2.2835 | 4220 | 0.1131 |
| 2.2841 | 4221 | 0.1296 |
| 2.2846 | 4222 | 0.0819 |
| 2.2852 | 4223 | 0.1603 |
| 2.2857 | 4224 | 0.0822 |
| 2.2863 | 4225 | 0.1026 |
| 2.2868 | 4226 | 0.1036 |
| 2.2873 | 4227 | 0.1256 |
| 2.2879 | 4228 | 0.1234 |
| 2.2884 | 4229 | 0.0995 |
| 2.2890 | 4230 | 0.0524 |
| 2.2895 | 4231 | 0.0691 |
| 2.2900 | 4232 | 0.1058 |
| 2.2906 | 4233 | 0.1318 |
| 2.2911 | 4234 | 0.0749 |
| 2.2917 | 4235 | 0.0719 |
| 2.2922 | 4236 | 0.1006 |
| 2.2927 | 4237 | 0.1245 |
| 2.2933 | 4238 | 0.0934 |
| 2.2938 | 4239 | 0.117 |
| 2.2944 | 4240 | 0.1265 |
| 2.2949 | 4241 | 0.1253 |
| 2.2955 | 4242 | 0.1313 |
| 2.2960 | 4243 | 0.1198 |
| 2.2965 | 4244 | 0.1005 |
| 2.2971 | 4245 | 0.0918 |
| 2.2976 | 4246 | 0.0772 |
| 2.2982 | 4247 | 0.079 |
| 2.2987 | 4248 | 0.0749 |
| 2.2992 | 4249 | 0.0828 |
| 2.2998 | 4250 | 0.0919 |
| 2.3003 | 4251 | 0.0725 |
| 2.3009 | 4252 | 0.1958 |
| 2.3014 | 4253 | 0.0816 |
| 2.3019 | 4254 | 0.139 |
| 2.3025 | 4255 | 0.1398 |
| 2.3030 | 4256 | 0.1123 |
| 2.3036 | 4257 | 0.089 |
| 2.3041 | 4258 | 0.0863 |
| 2.3047 | 4259 | 0.1452 |
| 2.3052 | 4260 | 0.1216 |
| 2.3057 | 4261 | 0.1233 |
| 2.3063 | 4262 | 0.0927 |
| 2.3068 | 4263 | 0.0501 |
| 2.3074 | 4264 | 0.0916 |
| 2.3079 | 4265 | 0.1276 |
| 2.3084 | 4266 | 0.1166 |
| 2.3090 | 4267 | 0.0844 |
| 2.3095 | 4268 | 0.0491 |
| 2.3101 | 4269 | 0.0649 |
| 2.3106 | 4270 | 0.0606 |
| 2.3111 | 4271 | 0.1792 |
| 2.3117 | 4272 | 0.0653 |
| 2.3122 | 4273 | 0.0638 |
| 2.3128 | 4274 | 0.0709 |
| 2.3133 | 4275 | 0.12 |
| 2.3139 | 4276 | 0.1149 |
| 2.3144 | 4277 | 0.1055 |
| 2.3149 | 4278 | 0.0422 |
| 2.3155 | 4279 | 0.0985 |
| 2.3160 | 4280 | 0.1105 |
| 2.3166 | 4281 | 0.0909 |
| 2.3171 | 4282 | 0.0447 |
| 2.3176 | 4283 | 0.0654 |
| 2.3182 | 4284 | 0.1108 |
| 2.3187 | 4285 | 0.1182 |
| 2.3193 | 4286 | 0.042 |
| 2.3198 | 4287 | 0.0827 |
| 2.3203 | 4288 | 0.1292 |
| 2.3209 | 4289 | 0.0742 |
| 2.3214 | 4290 | 0.1275 |
| 2.3220 | 4291 | 0.1087 |
| 2.3225 | 4292 | 0.1391 |
| 2.3231 | 4293 | 0.0891 |
| 2.3236 | 4294 | 0.0508 |
| 2.3241 | 4295 | 0.1185 |
| 2.3247 | 4296 | 0.0519 |
| 2.3252 | 4297 | 0.0847 |
| 2.3258 | 4298 | 0.1272 |
| 2.3263 | 4299 | 0.1127 |
| 2.3268 | 4300 | 0.1113 |
| 2.3274 | 4301 | 0.1031 |
| 2.3279 | 4302 | 0.0747 |
| 2.3285 | 4303 | 0.1081 |
| 2.3290 | 4304 | 0.0989 |
| 2.3295 | 4305 | 0.0736 |
| 2.3301 | 4306 | 0.1417 |
| 2.3306 | 4307 | 0.0868 |
| 2.3312 | 4308 | 0.0867 |
| 2.3317 | 4309 | 0.0845 |
| 2.3323 | 4310 | 0.1184 |
| 2.3328 | 4311 | 0.1093 |
| 2.3333 | 4312 | 0.1313 |
| 2.3339 | 4313 | 0.0936 |
| 2.3344 | 4314 | 0.0547 |
| 2.3350 | 4315 | 0.0567 |
| 2.3355 | 4316 | 0.1204 |
| 2.3360 | 4317 | 0.1305 |
| 2.3366 | 4318 | 0.1199 |
| 2.3371 | 4319 | 0.118 |
| 2.3377 | 4320 | 0.0877 |
| 2.3382 | 4321 | 0.1392 |
| 2.3387 | 4322 | 0.091 |
| 2.3393 | 4323 | 0.1238 |
| 2.3398 | 4324 | 0.0586 |
| 2.3404 | 4325 | 0.0931 |
| 2.3409 | 4326 | 0.0799 |
| 2.3415 | 4327 | 0.1399 |
| 2.3420 | 4328 | 0.1468 |
| 2.3425 | 4329 | 0.1346 |
| 2.3431 | 4330 | 0.1348 |
| 2.3436 | 4331 | 0.1869 |
| 2.3442 | 4332 | 0.1217 |
| 2.3447 | 4333 | 0.0646 |
| 2.3452 | 4334 | 0.111 |
| 2.3458 | 4335 | 0.1346 |
| 2.3463 | 4336 | 0.1497 |
| 2.3469 | 4337 | 0.0908 |
| 2.3474 | 4338 | 0.0956 |
| 2.3479 | 4339 | 0.1143 |
| 2.3485 | 4340 | 0.1351 |
| 2.3490 | 4341 | 0.0881 |
| 2.3496 | 4342 | 0.1382 |
| 2.3501 | 4343 | 0.0821 |
| 2.3506 | 4344 | 0.1077 |
| 2.3512 | 4345 | 0.0568 |
| 2.3517 | 4346 | 0.1292 |
| 2.3523 | 4347 | 0.1134 |
| 2.3528 | 4348 | 0.1249 |
| 2.3534 | 4349 | 0.0535 |
| 2.3539 | 4350 | 0.0788 |
| 2.3544 | 4351 | 0.1078 |
| 2.3550 | 4352 | 0.0899 |
| 2.3555 | 4353 | 0.0736 |
| 2.3561 | 4354 | 0.0918 |
| 2.3566 | 4355 | 0.0673 |
| 2.3571 | 4356 | 0.1083 |
| 2.3577 | 4357 | 0.0832 |
| 2.3582 | 4358 | 0.1077 |
| 2.3588 | 4359 | 0.1175 |
| 2.3593 | 4360 | 0.1449 |
| 2.3598 | 4361 | 0.0933 |
| 2.3604 | 4362 | 0.2376 |
| 2.3609 | 4363 | 0.1113 |
| 2.3615 | 4364 | 0.1058 |
| 2.3620 | 4365 | 0.096 |
| 2.3626 | 4366 | 0.1496 |
| 2.3631 | 4367 | 0.0908 |
| 2.3636 | 4368 | 0.0824 |
| 2.3642 | 4369 | 0.064 |
| 2.3647 | 4370 | 0.0944 |
| 2.3653 | 4371 | 0.1034 |
| 2.3658 | 4372 | 0.0553 |
| 2.3663 | 4373 | 0.0739 |
| 2.3669 | 4374 | 0.1274 |
| 2.3674 | 4375 | 0.0802 |
| 2.3680 | 4376 | 0.0657 |
| 2.3685 | 4377 | 0.0851 |
| 2.3690 | 4378 | 0.0847 |
| 2.3696 | 4379 | 0.1233 |
| 2.3701 | 4380 | 0.1352 |
| 2.3707 | 4381 | 0.1172 |
| 2.3712 | 4382 | 0.0942 |
| 2.3718 | 4383 | 0.1171 |
| 2.3723 | 4384 | 0.1391 |
| 2.3728 | 4385 | 0.1478 |
| 2.3734 | 4386 | 0.0739 |
| 2.3739 | 4387 | 0.0744 |
| 2.3745 | 4388 | 0.0736 |
| 2.3750 | 4389 | 0.0568 |
| 2.3755 | 4390 | 0.0782 |
| 2.3761 | 4391 | 0.1505 |
| 2.3766 | 4392 | 0.1167 |
| 2.3772 | 4393 | 0.0923 |
| 2.3777 | 4394 | 0.0942 |
| 2.3782 | 4395 | 0.0619 |
| 2.3788 | 4396 | 0.1088 |
| 2.3793 | 4397 | 0.0752 |
| 2.3799 | 4398 | 0.1643 |
| 2.3804 | 4399 | 0.1019 |
| 2.3810 | 4400 | 0.0644 |
| 2.3815 | 4401 | 0.0922 |
| 2.3820 | 4402 | 0.1345 |
| 2.3826 | 4403 | 0.1343 |
| 2.3831 | 4404 | 0.1136 |
| 2.3837 | 4405 | 0.0701 |
| 2.3842 | 4406 | 0.0694 |
| 2.3847 | 4407 | 0.0976 |
| 2.3853 | 4408 | 0.072 |
| 2.3858 | 4409 | 0.0732 |
| 2.3864 | 4410 | 0.0977 |
| 2.3869 | 4411 | 0.0824 |
| 2.3874 | 4412 | 0.1405 |
| 2.3880 | 4413 | 0.0713 |
| 2.3885 | 4414 | 0.1053 |
| 2.3891 | 4415 | 0.108 |
| 2.3896 | 4416 | 0.1048 |
| 2.3902 | 4417 | 0.0655 |
| 2.3907 | 4418 | 0.0995 |
| 2.3912 | 4419 | 0.129 |
| 2.3918 | 4420 | 0.1073 |
| 2.3923 | 4421 | 0.1381 |
| 2.3929 | 4422 | 0.0607 |
| 2.3934 | 4423 | 0.0853 |
| 2.3939 | 4424 | 0.1173 |
| 2.3945 | 4425 | 0.1192 |
| 2.3950 | 4426 | 0.1095 |
| 2.3956 | 4427 | 0.1094 |
| 2.3961 | 4428 | 0.1374 |
| 2.3966 | 4429 | 0.0869 |
| 2.3972 | 4430 | 0.0724 |
| 2.3977 | 4431 | 0.1206 |
| 2.3983 | 4432 | 0.0662 |
| 2.3988 | 4433 | 0.1396 |
| 2.3994 | 4434 | 0.0957 |
| 2.3999 | 4435 | 0.0922 |
| 2.4004 | 4436 | 0.0503 |
| 2.4010 | 4437 | 0.0793 |
| 2.4015 | 4438 | 0.0756 |
| 2.4021 | 4439 | 0.1691 |
| 2.4026 | 4440 | 0.1 |
| 2.4031 | 4441 | 0.0849 |
| 2.4037 | 4442 | 0.1615 |
| 2.4042 | 4443 | 0.0809 |
| 2.4048 | 4444 | 0.1053 |
| 2.4053 | 4445 | 0.1249 |
| 2.4058 | 4446 | 0.1076 |
| 2.4064 | 4447 | 0.108 |
| 2.4069 | 4448 | 0.1123 |
| 2.4075 | 4449 | 0.1287 |
| 2.4080 | 4450 | 0.0722 |
| 2.4085 | 4451 | 0.0903 |
| 2.4091 | 4452 | 0.1031 |
| 2.4096 | 4453 | 0.0775 |
| 2.4102 | 4454 | 0.1209 |
| 2.4107 | 4455 | 0.1508 |
| 2.4113 | 4456 | 0.124 |
| 2.4118 | 4457 | 0.1098 |
| 2.4123 | 4458 | 0.1303 |
| 2.4129 | 4459 | 0.0946 |
| 2.4134 | 4460 | 0.0751 |
| 2.4140 | 4461 | 0.1763 |
| 2.4145 | 4462 | 0.0661 |
| 2.4150 | 4463 | 0.1203 |
| 2.4156 | 4464 | 0.1411 |
| 2.4161 | 4465 | 0.0536 |
| 2.4167 | 4466 | 0.0986 |
| 2.4172 | 4467 | 0.2264 |
| 2.4177 | 4468 | 0.0905 |
| 2.4183 | 4469 | 0.0661 |
| 2.4188 | 4470 | 0.0686 |
| 2.4194 | 4471 | 0.0449 |
| 2.4199 | 4472 | 0.1101 |
| 2.4205 | 4473 | 0.1998 |
| 2.4210 | 4474 | 0.0946 |
| 2.4215 | 4475 | 0.1213 |
| 2.4221 | 4476 | 0.0687 |
| 2.4226 | 4477 | 0.1219 |
| 2.4232 | 4478 | 0.1023 |
| 2.4237 | 4479 | 0.0937 |
| 2.4242 | 4480 | 0.1146 |
| 2.4248 | 4481 | 0.1059 |
| 2.4253 | 4482 | 0.1428 |
| 2.4259 | 4483 | 0.1143 |
| 2.4264 | 4484 | 0.1108 |
| 2.4269 | 4485 | 0.0925 |
| 2.4275 | 4486 | 0.0629 |
| 2.4280 | 4487 | 0.0829 |
| 2.4286 | 4488 | 0.0543 |
| 2.4291 | 4489 | 0.0627 |
| 2.4297 | 4490 | 0.0671 |
| 2.4302 | 4491 | 0.1581 |
| 2.4307 | 4492 | 0.1153 |
| 2.4313 | 4493 | 0.1047 |
| 2.4318 | 4494 | 0.1023 |
| 2.4324 | 4495 | 0.0904 |
| 2.4329 | 4496 | 0.1095 |
| 2.4334 | 4497 | 0.1128 |
| 2.4340 | 4498 | 0.0987 |
| 2.4345 | 4499 | 0.0988 |
| 2.4351 | 4500 | 0.0712 |
| 2.4356 | 4501 | 0.1244 |
| 2.4361 | 4502 | 0.0703 |
| 2.4367 | 4503 | 0.1061 |
| 2.4372 | 4504 | 0.0773 |
| 2.4378 | 4505 | 0.0933 |
| 2.4383 | 4506 | 0.1354 |
| 2.4389 | 4507 | 0.0905 |
| 2.4394 | 4508 | 0.1244 |
| 2.4399 | 4509 | 0.0982 |
| 2.4405 | 4510 | 0.0876 |
| 2.4410 | 4511 | 0.1852 |
| 2.4416 | 4512 | 0.116 |
| 2.4421 | 4513 | 0.1181 |
| 2.4426 | 4514 | 0.1141 |
| 2.4432 | 4515 | 0.0761 |
| 2.4437 | 4516 | 0.0965 |
| 2.4443 | 4517 | 0.0992 |
| 2.4448 | 4518 | 0.0973 |
| 2.4453 | 4519 | 0.0904 |
| 2.4459 | 4520 | 0.0965 |
| 2.4464 | 4521 | 0.0804 |
| 2.4470 | 4522 | 0.1446 |
| 2.4475 | 4523 | 0.102 |
| 2.4481 | 4524 | 0.0551 |
| 2.4486 | 4525 | 0.093 |
| 2.4491 | 4526 | 0.068 |
| 2.4497 | 4527 | 0.1187 |
| 2.4502 | 4528 | 0.1235 |
| 2.4508 | 4529 | 0.0954 |
| 2.4513 | 4530 | 0.155 |
| 2.4518 | 4531 | 0.1153 |
| 2.4524 | 4532 | 0.0919 |
| 2.4529 | 4533 | 0.0986 |
| 2.4535 | 4534 | 0.0939 |
| 2.4540 | 4535 | 0.0681 |
| 2.4545 | 4536 | 0.0763 |
| 2.4551 | 4537 | 0.0798 |
| 2.4556 | 4538 | 0.0698 |
| 2.4562 | 4539 | 0.0887 |
| 2.4567 | 4540 | 0.0909 |
| 2.4573 | 4541 | 0.1063 |
| 2.4578 | 4542 | 0.062 |
| 2.4583 | 4543 | 0.0515 |
| 2.4589 | 4544 | 0.0866 |
| 2.4594 | 4545 | 0.0947 |
| 2.4600 | 4546 | 0.1514 |
| 2.4605 | 4547 | 0.118 |
| 2.4610 | 4548 | 0.1611 |
| 2.4616 | 4549 | 0.1243 |
| 2.4621 | 4550 | 0.0728 |
| 2.4627 | 4551 | 0.1419 |
| 2.4632 | 4552 | 0.1235 |
| 2.4637 | 4553 | 0.1086 |
| 2.4643 | 4554 | 0.1295 |
| 2.4648 | 4555 | 0.0848 |
| 2.4654 | 4556 | 0.0895 |
| 2.4659 | 4557 | 0.1158 |
| 2.4665 | 4558 | 0.1167 |
| 2.4670 | 4559 | 0.0578 |
| 2.4675 | 4560 | 0.0792 |
| 2.4681 | 4561 | 0.0412 |
| 2.4686 | 4562 | 0.0932 |
| 2.4692 | 4563 | 0.0658 |
| 2.4697 | 4564 | 0.1146 |
| 2.4702 | 4565 | 0.0725 |
| 2.4708 | 4566 | 0.1026 |
| 2.4713 | 4567 | 0.0994 |
| 2.4719 | 4568 | 0.094 |
| 2.4724 | 4569 | 0.1085 |
| 2.4729 | 4570 | 0.0865 |
| 2.4735 | 4571 | 0.0617 |
| 2.4740 | 4572 | 0.1437 |
| 2.4746 | 4573 | 0.0504 |
| 2.4751 | 4574 | 0.1141 |
| 2.4756 | 4575 | 0.0688 |
| 2.4762 | 4576 | 0.0627 |
| 2.4767 | 4577 | 0.0822 |
| 2.4773 | 4578 | 0.1252 |
| 2.4778 | 4579 | 0.0885 |
| 2.4784 | 4580 | 0.0984 |
| 2.4789 | 4581 | 0.103 |
| 2.4794 | 4582 | 0.167 |
| 2.4800 | 4583 | 0.076 |
| 2.4805 | 4584 | 0.0791 |
| 2.4811 | 4585 | 0.1159 |
| 2.4816 | 4586 | 0.1302 |
| 2.4821 | 4587 | 0.0947 |
| 2.4827 | 4588 | 0.1425 |
| 2.4832 | 4589 | 0.1463 |
| 2.4838 | 4590 | 0.0707 |
| 2.4843 | 4591 | 0.1189 |
| 2.4848 | 4592 | 0.1488 |
| 2.4854 | 4593 | 0.1367 |
| 2.4859 | 4594 | 0.0897 |
| 2.4865 | 4595 | 0.0352 |
| 2.4870 | 4596 | 0.1313 |
| 2.4876 | 4597 | 0.0558 |
| 2.4881 | 4598 | 0.0734 |
| 2.4886 | 4599 | 0.1234 |
| 2.4892 | 4600 | 0.0917 |
| 2.4897 | 4601 | 0.0551 |
| 2.4903 | 4602 | 0.0715 |
| 2.4908 | 4603 | 0.1152 |
| 2.4913 | 4604 | 0.0839 |
| 2.4919 | 4605 | 0.1441 |
| 2.4924 | 4606 | 0.1072 |
| 2.4930 | 4607 | 0.1293 |
| 2.4935 | 4608 | 0.1271 |
| 2.4940 | 4609 | 0.1126 |
| 2.4946 | 4610 | 0.0925 |
| 2.4951 | 4611 | 0.0689 |
| 2.4957 | 4612 | 0.0734 |
| 2.4962 | 4613 | 0.1631 |
| 2.4968 | 4614 | 0.0953 |
| 2.4973 | 4615 | 0.0836 |
| 2.4978 | 4616 | 0.0446 |
| 2.4984 | 4617 | 0.1407 |
| 2.4989 | 4618 | 0.0783 |
| 2.4995 | 4619 | 0.076 |
| 2.5000 | 4620 | 0.1054 |
| 2.5005 | 4621 | 0.1311 |
| 2.5011 | 4622 | 0.081 |
| 2.5016 | 4623 | 0.0831 |
| 2.5022 | 4624 | 0.1094 |
| 2.5027 | 4625 | 0.1247 |
| 2.5032 | 4626 | 0.1557 |
| 2.5038 | 4627 | 0.13 |
| 2.5043 | 4628 | 0.06 |
| 2.5049 | 4629 | 0.0871 |
| 2.5054 | 4630 | 0.0842 |
| 2.5060 | 4631 | 0.0706 |
| 2.5065 | 4632 | 0.089 |
| 2.5070 | 4633 | 0.0883 |
| 2.5076 | 4634 | 0.1036 |
| 2.5081 | 4635 | 0.0746 |
| 2.5087 | 4636 | 0.1229 |
| 2.5092 | 4637 | 0.1607 |
| 2.5097 | 4638 | 0.1048 |
| 2.5103 | 4639 | 0.0979 |
| 2.5108 | 4640 | 0.0832 |
| 2.5114 | 4641 | 0.1021 |
| 2.5119 | 4642 | 0.1032 |
| 2.5124 | 4643 | 0.131 |
| 2.5130 | 4644 | 0.0967 |
| 2.5135 | 4645 | 0.1101 |
| 2.5141 | 4646 | 0.1929 |
| 2.5146 | 4647 | 0.1076 |
| 2.5152 | 4648 | 0.1347 |
| 2.5157 | 4649 | 0.0643 |
| 2.5162 | 4650 | 0.0715 |
| 2.5168 | 4651 | 0.0823 |
| 2.5173 | 4652 | 0.112 |
| 2.5179 | 4653 | 0.1195 |
| 2.5184 | 4654 | 0.166 |
| 2.5189 | 4655 | 0.0781 |
| 2.5195 | 4656 | 0.0906 |
| 2.5200 | 4657 | 0.0785 |
| 2.5206 | 4658 | 0.1036 |
| 2.5211 | 4659 | 0.1537 |
| 2.5216 | 4660 | 0.0518 |
| 2.5222 | 4661 | 0.0958 |
| 2.5227 | 4662 | 0.0949 |
| 2.5233 | 4663 | 0.0759 |
| 2.5238 | 4664 | 0.0524 |
| 2.5244 | 4665 | 0.0947 |
| 2.5249 | 4666 | 0.1334 |
| 2.5254 | 4667 | 0.1352 |
| 2.5260 | 4668 | 0.0793 |
| 2.5265 | 4669 | 0.1131 |
| 2.5271 | 4670 | 0.1151 |
| 2.5276 | 4671 | 0.0987 |
| 2.5281 | 4672 | 0.0877 |
| 2.5287 | 4673 | 0.1132 |
| 2.5292 | 4674 | 0.0744 |
| 2.5298 | 4675 | 0.0703 |
| 2.5303 | 4676 | 0.1666 |
| 2.5308 | 4677 | 0.1289 |
| 2.5314 | 4678 | 0.074 |
| 2.5319 | 4679 | 0.1328 |
| 2.5325 | 4680 | 0.0499 |
| 2.5330 | 4681 | 0.104 |
| 2.5335 | 4682 | 0.1267 |
| 2.5341 | 4683 | 0.0927 |
| 2.5346 | 4684 | 0.123 |
| 2.5352 | 4685 | 0.046 |
| 2.5357 | 4686 | 0.058 |
| 2.5363 | 4687 | 0.0913 |
| 2.5368 | 4688 | 0.1113 |
| 2.5373 | 4689 | 0.1237 |
| 2.5379 | 4690 | 0.0705 |
| 2.5384 | 4691 | 0.1254 |
| 2.5390 | 4692 | 0.1187 |
| 2.5395 | 4693 | 0.0887 |
| 2.5400 | 4694 | 0.0804 |
| 2.5406 | 4695 | 0.1186 |
| 2.5411 | 4696 | 0.1269 |
| 2.5417 | 4697 | 0.1074 |
| 2.5422 | 4698 | 0.0881 |
| 2.5427 | 4699 | 0.0726 |
| 2.5433 | 4700 | 0.0839 |
| 2.5438 | 4701 | 0.0963 |
| 2.5444 | 4702 | 0.1073 |
| 2.5449 | 4703 | 0.1006 |
| 2.5455 | 4704 | 0.1265 |
| 2.5460 | 4705 | 0.0891 |
| 2.5465 | 4706 | 0.0554 |
| 2.5471 | 4707 | 0.085 |
| 2.5476 | 4708 | 0.1354 |
| 2.5482 | 4709 | 0.1292 |
| 2.5487 | 4710 | 0.1206 |
| 2.5492 | 4711 | 0.1399 |
| 2.5498 | 4712 | 0.0958 |
| 2.5503 | 4713 | 0.1331 |
| 2.5509 | 4714 | 0.0859 |
| 2.5514 | 4715 | 0.125 |
| 2.5519 | 4716 | 0.1115 |
| 2.5525 | 4717 | 0.0701 |
| 2.5530 | 4718 | 0.0723 |
| 2.5536 | 4719 | 0.1299 |
| 2.5541 | 4720 | 0.0738 |
| 2.5547 | 4721 | 0.0949 |
| 2.5552 | 4722 | 0.089 |
| 2.5557 | 4723 | 0.1179 |
| 2.5563 | 4724 | 0.0989 |
| 2.5568 | 4725 | 0.0815 |
| 2.5574 | 4726 | 0.1115 |
| 2.5579 | 4727 | 0.0645 |
| 2.5584 | 4728 | 0.0692 |
| 2.5590 | 4729 | 0.0608 |
| 2.5595 | 4730 | 0.0835 |
| 2.5601 | 4731 | 0.0853 |
| 2.5606 | 4732 | 0.1177 |
| 2.5611 | 4733 | 0.1054 |
| 2.5617 | 4734 | 0.1148 |
| 2.5622 | 4735 | 0.0911 |
| 2.5628 | 4736 | 0.1007 |
| 2.5633 | 4737 | 0.0936 |
| 2.5639 | 4738 | 0.131 |
| 2.5644 | 4739 | 0.1287 |
| 2.5649 | 4740 | 0.0647 |
| 2.5655 | 4741 | 0.1111 |
| 2.5660 | 4742 | 0.1208 |
| 2.5666 | 4743 | 0.1175 |
| 2.5671 | 4744 | 0.0818 |
| 2.5676 | 4745 | 0.0978 |
| 2.5682 | 4746 | 0.1471 |
| 2.5687 | 4747 | 0.0854 |
| 2.5693 | 4748 | 0.218 |
| 2.5698 | 4749 | 0.097 |
| 2.5703 | 4750 | 0.1317 |
| 2.5709 | 4751 | 0.083 |
| 2.5714 | 4752 | 0.1652 |
| 2.5720 | 4753 | 0.0644 |
| 2.5725 | 4754 | 0.16 |
| 2.5731 | 4755 | 0.0933 |
| 2.5736 | 4756 | 0.0924 |
| 2.5741 | 4757 | 0.0755 |
| 2.5747 | 4758 | 0.0777 |
| 2.5752 | 4759 | 0.0659 |
| 2.5758 | 4760 | 0.1158 |
| 2.5763 | 4761 | 0.1108 |
| 2.5768 | 4762 | 0.1005 |
| 2.5774 | 4763 | 0.0909 |
| 2.5779 | 4764 | 0.0972 |
| 2.5785 | 4765 | 0.08 |
| 2.5790 | 4766 | 0.1238 |
| 2.5795 | 4767 | 0.1117 |
| 2.5801 | 4768 | 0.1756 |
| 2.5806 | 4769 | 0.0861 |
| 2.5812 | 4770 | 0.1782 |
| 2.5817 | 4771 | 0.1322 |
| 2.5823 | 4772 | 0.0637 |
| 2.5828 | 4773 | 0.1116 |
| 2.5833 | 4774 | 0.08 |
| 2.5839 | 4775 | 0.0647 |
| 2.5844 | 4776 | 0.1103 |
| 2.5850 | 4777 | 0.0795 |
| 2.5855 | 4778 | 0.0956 |
| 2.5860 | 4779 | 0.0775 |
| 2.5866 | 4780 | 0.0963 |
| 2.5871 | 4781 | 0.0908 |
| 2.5877 | 4782 | 0.0742 |
| 2.5882 | 4783 | 0.0646 |
| 2.5887 | 4784 | 0.1173 |
| 2.5893 | 4785 | 0.0885 |
| 2.5898 | 4786 | 0.0707 |
| 2.5904 | 4787 | 0.1187 |
| 2.5909 | 4788 | 0.1338 |
| 2.5915 | 4789 | 0.0625 |
| 2.5920 | 4790 | 0.09 |
| 2.5925 | 4791 | 0.0802 |
| 2.5931 | 4792 | 0.1317 |
| 2.5936 | 4793 | 0.0916 |
| 2.5942 | 4794 | 0.0942 |
| 2.5947 | 4795 | 0.0944 |
| 2.5952 | 4796 | 0.112 |
| 2.5958 | 4797 | 0.1111 |
| 2.5963 | 4798 | 0.0576 |
| 2.5969 | 4799 | 0.1232 |
| 2.5974 | 4800 | 0.1085 |
| 2.5979 | 4801 | 0.066 |
| 2.5985 | 4802 | 0.1085 |
| 2.5990 | 4803 | 0.0747 |
| 2.5996 | 4804 | 0.0883 |
| 2.6001 | 4805 | 0.086 |
| 2.6006 | 4806 | 0.1161 |
| 2.6012 | 4807 | 0.156 |
| 2.6017 | 4808 | 0.0935 |
| 2.6023 | 4809 | 0.1566 |
| 2.6028 | 4810 | 0.1455 |
| 2.6034 | 4811 | 0.0542 |
| 2.6039 | 4812 | 0.0865 |
| 2.6044 | 4813 | 0.0944 |
| 2.6050 | 4814 | 0.1469 |
| 2.6055 | 4815 | 0.1249 |
| 2.6061 | 4816 | 0.1224 |
| 2.6066 | 4817 | 0.0869 |
| 2.6071 | 4818 | 0.0805 |
| 2.6077 | 4819 | 0.0781 |
| 2.6082 | 4820 | 0.119 |
| 2.6088 | 4821 | 0.0916 |
| 2.6093 | 4822 | 0.1218 |
| 2.6098 | 4823 | 0.0968 |
| 2.6104 | 4824 | 0.08 |
| 2.6109 | 4825 | 0.0735 |
| 2.6115 | 4826 | 0.1261 |
| 2.6120 | 4827 | 0.0786 |
| 2.6126 | 4828 | 0.0721 |
| 2.6131 | 4829 | 0.0719 |
| 2.6136 | 4830 | 0.1346 |
| 2.6142 | 4831 | 0.1269 |
| 2.6147 | 4832 | 0.088 |
| 2.6153 | 4833 | 0.1005 |
| 2.6158 | 4834 | 0.119 |
| 2.6163 | 4835 | 0.0802 |
| 2.6169 | 4836 | 0.0728 |
| 2.6174 | 4837 | 0.079 |
| 2.6180 | 4838 | 0.0893 |
| 2.6185 | 4839 | 0.0811 |
| 2.6190 | 4840 | 0.1153 |
| 2.6196 | 4841 | 0.0773 |
| 2.6201 | 4842 | 0.1113 |
| 2.6207 | 4843 | 0.0907 |
| 2.6212 | 4844 | 0.1257 |
| 2.6218 | 4845 | 0.1122 |
| 2.6223 | 4846 | 0.0829 |
| 2.6228 | 4847 | 0.1009 |
| 2.6234 | 4848 | 0.0822 |
| 2.6239 | 4849 | 0.0995 |
| 2.6245 | 4850 | 0.1262 |
| 2.6250 | 4851 | 0.0914 |
| 2.6255 | 4852 | 0.0925 |
| 2.6261 | 4853 | 0.0973 |
| 2.6266 | 4854 | 0.1397 |
| 2.6272 | 4855 | 0.1084 |
| 2.6277 | 4856 | 0.0954 |
| 2.6282 | 4857 | 0.1094 |
| 2.6288 | 4858 | 0.038 |
| 2.6293 | 4859 | 0.0862 |
| 2.6299 | 4860 | 0.09 |
| 2.6304 | 4861 | 0.1291 |
| 2.6310 | 4862 | 0.0976 |
| 2.6315 | 4863 | 0.0613 |
| 2.6320 | 4864 | 0.0663 |
| 2.6326 | 4865 | 0.0615 |
| 2.6331 | 4866 | 0.0934 |
| 2.6337 | 4867 | 0.072 |
| 2.6342 | 4868 | 0.1055 |
| 2.6347 | 4869 | 0.1169 |
| 2.6353 | 4870 | 0.0775 |
| 2.6358 | 4871 | 0.0972 |
| 2.6364 | 4872 | 0.1343 |
| 2.6369 | 4873 | 0.1616 |
| 2.6374 | 4874 | 0.0846 |
| 2.6380 | 4875 | 0.0753 |
| 2.6385 | 4876 | 0.0827 |
| 2.6391 | 4877 | 0.1354 |
| 2.6396 | 4878 | 0.1201 |
| 2.6402 | 4879 | 0.094 |
| 2.6407 | 4880 | 0.1102 |
| 2.6412 | 4881 | 0.0629 |
| 2.6418 | 4882 | 0.0828 |
| 2.6423 | 4883 | 0.1115 |
| 2.6429 | 4884 | 0.1428 |
| 2.6434 | 4885 | 0.0717 |
| 2.6439 | 4886 | 0.1023 |
| 2.6445 | 4887 | 0.1581 |
| 2.6450 | 4888 | 0.1274 |
| 2.6456 | 4889 | 0.0629 |
| 2.6461 | 4890 | 0.0784 |
| 2.6466 | 4891 | 0.0558 |
| 2.6472 | 4892 | 0.1089 |
| 2.6477 | 4893 | 0.1028 |
| 2.6483 | 4894 | 0.0527 |
| 2.6488 | 4895 | 0.1105 |
| 2.6494 | 4896 | 0.0856 |
| 2.6499 | 4897 | 0.0754 |
| 2.6504 | 4898 | 0.102 |
| 2.6510 | 4899 | 0.0844 |
| 2.6515 | 4900 | 0.033 |
| 2.6521 | 4901 | 0.1596 |
| 2.6526 | 4902 | 0.0736 |
| 2.6531 | 4903 | 0.0872 |
| 2.6537 | 4904 | 0.1094 |
| 2.6542 | 4905 | 0.1024 |
| 2.6548 | 4906 | 0.0547 |
| 2.6553 | 4907 | 0.1064 |
| 2.6558 | 4908 | 0.0514 |
| 2.6564 | 4909 | 0.1211 |
| 2.6569 | 4910 | 0.0686 |
| 2.6575 | 4911 | 0.0983 |
| 2.6580 | 4912 | 0.1019 |
| 2.6585 | 4913 | 0.0727 |
| 2.6591 | 4914 | 0.1748 |
| 2.6596 | 4915 | 0.2002 |
| 2.6602 | 4916 | 0.1287 |
| 2.6607 | 4917 | 0.0934 |
| 2.6613 | 4918 | 0.0749 |
| 2.6618 | 4919 | 0.0606 |
| 2.6623 | 4920 | 0.0723 |
| 2.6629 | 4921 | 0.0348 |
| 2.6634 | 4922 | 0.0829 |
| 2.6640 | 4923 | 0.0786 |
| 2.6645 | 4924 | 0.1147 |
| 2.6650 | 4925 | 0.0835 |
| 2.6656 | 4926 | 0.0812 |
| 2.6661 | 4927 | 0.0922 |
| 2.6667 | 4928 | 0.1083 |
| 2.6672 | 4929 | 0.142 |
| 2.6677 | 4930 | 0.0444 |
| 2.6683 | 4931 | 0.0769 |
| 2.6688 | 4932 | 0.1845 |
| 2.6694 | 4933 | 0.1042 |
| 2.6699 | 4934 | 0.1234 |
| 2.6705 | 4935 | 0.1149 |
| 2.6710 | 4936 | 0.0779 |
| 2.6715 | 4937 | 0.1049 |
| 2.6721 | 4938 | 0.1213 |
| 2.6726 | 4939 | 0.0887 |
| 2.6732 | 4940 | 0.1014 |
| 2.6737 | 4941 | 0.0741 |
| 2.6742 | 4942 | 0.0878 |
| 2.6748 | 4943 | 0.1235 |
| 2.6753 | 4944 | 0.1104 |
| 2.6759 | 4945 | 0.1148 |
| 2.6764 | 4946 | 0.1792 |
| 2.6769 | 4947 | 0.0883 |
| 2.6775 | 4948 | 0.1049 |
| 2.6780 | 4949 | 0.0813 |
| 2.6786 | 4950 | 0.1097 |
| 2.6791 | 4951 | 0.1339 |
| 2.6797 | 4952 | 0.0664 |
| 2.6802 | 4953 | 0.0484 |
| 2.6807 | 4954 | 0.2084 |
| 2.6813 | 4955 | 0.0966 |
| 2.6818 | 4956 | 0.069 |
| 2.6824 | 4957 | 0.0863 |
| 2.6829 | 4958 | 0.06 |
| 2.6834 | 4959 | 0.17 |
| 2.6840 | 4960 | 0.0751 |
| 2.6845 | 4961 | 0.1053 |
| 2.6851 | 4962 | 0.1374 |
| 2.6856 | 4963 | 0.057 |
| 2.6861 | 4964 | 0.1243 |
| 2.6867 | 4965 | 0.091 |
| 2.6872 | 4966 | 0.0826 |
| 2.6878 | 4967 | 0.102 |
| 2.6883 | 4968 | 0.1002 |
| 2.6889 | 4969 | 0.1 |
| 2.6894 | 4970 | 0.1487 |
| 2.6899 | 4971 | 0.1183 |
| 2.6905 | 4972 | 0.1226 |
| 2.6910 | 4973 | 0.0642 |
| 2.6916 | 4974 | 0.1087 |
| 2.6921 | 4975 | 0.0649 |
| 2.6926 | 4976 | 0.1718 |
| 2.6932 | 4977 | 0.1022 |
| 2.6937 | 4978 | 0.1099 |
| 2.6943 | 4979 | 0.0898 |
| 2.6948 | 4980 | 0.0866 |
| 2.6953 | 4981 | 0.075 |
| 2.6959 | 4982 | 0.073 |
| 2.6964 | 4983 | 0.0784 |
| 2.6970 | 4984 | 0.0554 |
| 2.6975 | 4985 | 0.1132 |
| 2.6981 | 4986 | 0.0671 |
| 2.6986 | 4987 | 0.0955 |
| 2.6991 | 4988 | 0.0786 |
| 2.6997 | 4989 | 0.117 |
| 2.7002 | 4990 | 0.0657 |
| 2.7008 | 4991 | 0.0596 |
| 2.7013 | 4992 | 0.1176 |
| 2.7018 | 4993 | 0.0828 |
| 2.7024 | 4994 | 0.1021 |
| 2.7029 | 4995 | 0.1169 |
| 2.7035 | 4996 | 0.0596 |
| 2.7040 | 4997 | 0.0763 |
| 2.7045 | 4998 | 0.0874 |
| 2.7051 | 4999 | 0.0802 |
| 2.7056 | 5000 | 0.0397 |
| 2.7062 | 5001 | 0.1107 |
| 2.7067 | 5002 | 0.0673 |
| 2.7073 | 5003 | 0.0982 |
| 2.7078 | 5004 | 0.0916 |
| 2.7083 | 5005 | 0.0972 |
| 2.7089 | 5006 | 0.1099 |
| 2.7094 | 5007 | 0.1472 |
| 2.7100 | 5008 | 0.0769 |
| 2.7105 | 5009 | 0.1082 |
| 2.7110 | 5010 | 0.0965 |
| 2.7116 | 5011 | 0.0632 |
| 2.7121 | 5012 | 0.0712 |
| 2.7127 | 5013 | 0.0691 |
| 2.7132 | 5014 | 0.1257 |
| 2.7137 | 5015 | 0.0739 |
| 2.7143 | 5016 | 0.0598 |
| 2.7148 | 5017 | 0.0903 |
| 2.7154 | 5018 | 0.098 |
| 2.7159 | 5019 | 0.0537 |
| 2.7165 | 5020 | 0.134 |
| 2.7170 | 5021 | 0.0787 |
| 2.7175 | 5022 | 0.1292 |
| 2.7181 | 5023 | 0.0877 |
| 2.7186 | 5024 | 0.1588 |
| 2.7192 | 5025 | 0.1002 |
| 2.7197 | 5026 | 0.1101 |
| 2.7202 | 5027 | 0.1208 |
| 2.7208 | 5028 | 0.0528 |
| 2.7213 | 5029 | 0.0961 |
| 2.7219 | 5030 | 0.1028 |
| 2.7224 | 5031 | 0.1032 |
| 2.7229 | 5032 | 0.102 |
| 2.7235 | 5033 | 0.1328 |
| 2.7240 | 5034 | 0.0748 |
| 2.7246 | 5035 | 0.0443 |
| 2.7251 | 5036 | 0.0896 |
| 2.7256 | 5037 | 0.082 |
| 2.7262 | 5038 | 0.1196 |
| 2.7267 | 5039 | 0.1846 |
| 2.7273 | 5040 | 0.0871 |
| 2.7278 | 5041 | 0.105 |
| 2.7284 | 5042 | 0.1104 |
| 2.7289 | 5043 | 0.1065 |
| 2.7294 | 5044 | 0.0954 |
| 2.7300 | 5045 | 0.0762 |
| 2.7305 | 5046 | 0.0858 |
| 2.7311 | 5047 | 0.1011 |
| 2.7316 | 5048 | 0.0931 |
| 2.7321 | 5049 | 0.0801 |
| 2.7327 | 5050 | 0.0997 |
| 2.7332 | 5051 | 0.0595 |
| 2.7338 | 5052 | 0.1048 |
| 2.7343 | 5053 | 0.1187 |
| 2.7348 | 5054 | 0.1562 |
| 2.7354 | 5055 | 0.0866 |
| 2.7359 | 5056 | 0.1386 |
| 2.7365 | 5057 | 0.0673 |
| 2.7370 | 5058 | 0.0873 |
| 2.7376 | 5059 | 0.0943 |
| 2.7381 | 5060 | 0.1714 |
| 2.7386 | 5061 | 0.1202 |
| 2.7392 | 5062 | 0.1107 |
| 2.7397 | 5063 | 0.1137 |
| 2.7403 | 5064 | 0.0622 |
| 2.7408 | 5065 | 0.1692 |
| 2.7413 | 5066 | 0.1214 |
| 2.7419 | 5067 | 0.0758 |
| 2.7424 | 5068 | 0.0728 |
| 2.7430 | 5069 | 0.067 |
| 2.7435 | 5070 | 0.1323 |
| 2.7440 | 5071 | 0.1121 |
| 2.7446 | 5072 | 0.0858 |
| 2.7451 | 5073 | 0.0677 |
| 2.7457 | 5074 | 0.125 |
| 2.7462 | 5075 | 0.139 |
| 2.7468 | 5076 | 0.0955 |
| 2.7473 | 5077 | 0.0982 |
| 2.7478 | 5078 | 0.0759 |
| 2.7484 | 5079 | 0.1454 |
| 2.7489 | 5080 | 0.0879 |
| 2.7495 | 5081 | 0.0974 |
| 2.7500 | 5082 | 0.0788 |
| 2.7505 | 5083 | 0.1212 |
| 2.7511 | 5084 | 0.1274 |
| 2.7516 | 5085 | 0.0633 |
| 2.7522 | 5086 | 0.064 |
| 2.7527 | 5087 | 0.0703 |
| 2.7532 | 5088 | 0.1332 |
| 2.7538 | 5089 | 0.1256 |
| 2.7543 | 5090 | 0.0748 |
| 2.7549 | 5091 | 0.1079 |
| 2.7554 | 5092 | 0.1126 |
| 2.7560 | 5093 | 0.065 |
| 2.7565 | 5094 | 0.1022 |
| 2.7570 | 5095 | 0.1359 |
| 2.7576 | 5096 | 0.1152 |
| 2.7581 | 5097 | 0.0689 |
| 2.7587 | 5098 | 0.1028 |
| 2.7592 | 5099 | 0.0966 |
| 2.7597 | 5100 | 0.0936 |
| 2.7603 | 5101 | 0.0909 |
| 2.7608 | 5102 | 0.1214 |
| 2.7614 | 5103 | 0.0754 |
| 2.7619 | 5104 | 0.1548 |
| 2.7624 | 5105 | 0.0988 |
| 2.7630 | 5106 | 0.0664 |
| 2.7635 | 5107 | 0.0746 |
| 2.7641 | 5108 | 0.1511 |
| 2.7646 | 5109 | 0.0822 |
| 2.7652 | 5110 | 0.0987 |
| 2.7657 | 5111 | 0.1151 |
| 2.7662 | 5112 | 0.1012 |
| 2.7668 | 5113 | 0.141 |
| 2.7673 | 5114 | 0.1091 |
| 2.7679 | 5115 | 0.0729 |
| 2.7684 | 5116 | 0.082 |
| 2.7689 | 5117 | 0.0842 |
| 2.7695 | 5118 | 0.1303 |
| 2.7700 | 5119 | 0.1619 |
| 2.7706 | 5120 | 0.0914 |
| 2.7711 | 5121 | 0.1051 |
| 2.7716 | 5122 | 0.0596 |
| 2.7722 | 5123 | 0.0947 |
| 2.7727 | 5124 | 0.0915 |
| 2.7733 | 5125 | 0.0794 |
| 2.7738 | 5126 | 0.0958 |
| 2.7744 | 5127 | 0.0955 |
| 2.7749 | 5128 | 0.0663 |
| 2.7754 | 5129 | 0.0746 |
| 2.7760 | 5130 | 0.1664 |
| 2.7765 | 5131 | 0.1835 |
| 2.7771 | 5132 | 0.0986 |
| 2.7776 | 5133 | 0.0629 |
| 2.7781 | 5134 | 0.0798 |
| 2.7787 | 5135 | 0.115 |
| 2.7792 | 5136 | 0.0838 |
| 2.7798 | 5137 | 0.0423 |
| 2.7803 | 5138 | 0.1095 |
| 2.7808 | 5139 | 0.0753 |
| 2.7814 | 5140 | 0.107 |
| 2.7819 | 5141 | 0.085 |
| 2.7825 | 5142 | 0.0878 |
| 2.7830 | 5143 | 0.0681 |
| 2.7835 | 5144 | 0.1388 |
| 2.7841 | 5145 | 0.0934 |
| 2.7846 | 5146 | 0.0537 |
| 2.7852 | 5147 | 0.1539 |
| 2.7857 | 5148 | 0.0769 |
| 2.7863 | 5149 | 0.1799 |
| 2.7868 | 5150 | 0.0734 |
| 2.7873 | 5151 | 0.1256 |
| 2.7879 | 5152 | 0.0999 |
| 2.7884 | 5153 | 0.079 |
| 2.7890 | 5154 | 0.0526 |
| 2.7895 | 5155 | 0.0731 |
| 2.7900 | 5156 | 0.0955 |
| 2.7906 | 5157 | 0.0833 |
| 2.7911 | 5158 | 0.1174 |
| 2.7917 | 5159 | 0.1152 |
| 2.7922 | 5160 | 0.0474 |
| 2.7927 | 5161 | 0.0685 |
| 2.7933 | 5162 | 0.0814 |
| 2.7938 | 5163 | 0.057 |
| 2.7944 | 5164 | 0.0881 |
| 2.7949 | 5165 | 0.1608 |
| 2.7955 | 5166 | 0.1213 |
| 2.7960 | 5167 | 0.0766 |
| 2.7965 | 5168 | 0.1773 |
| 2.7971 | 5169 | 0.1395 |
| 2.7976 | 5170 | 0.1311 |
| 2.7982 | 5171 | 0.1293 |
| 2.7987 | 5172 | 0.0778 |
| 2.7992 | 5173 | 0.0687 |
| 2.7998 | 5174 | 0.1036 |
| 2.8003 | 5175 | 0.0546 |
| 2.8009 | 5176 | 0.0853 |
| 2.8014 | 5177 | 0.0946 |
| 2.8019 | 5178 | 0.1253 |
| 2.8025 | 5179 | 0.117 |
| 2.8030 | 5180 | 0.0959 |
| 2.8036 | 5181 | 0.0462 |
| 2.8041 | 5182 | 0.1513 |
| 2.8047 | 5183 | 0.0957 |
| 2.8052 | 5184 | 0.0878 |
| 2.8057 | 5185 | 0.0874 |
| 2.8063 | 5186 | 0.1238 |
| 2.8068 | 5187 | 0.1226 |
| 2.8074 | 5188 | 0.0703 |
| 2.8079 | 5189 | 0.0689 |
| 2.8084 | 5190 | 0.0836 |
| 2.8090 | 5191 | 0.1105 |
| 2.8095 | 5192 | 0.0849 |
| 2.8101 | 5193 | 0.1252 |
| 2.8106 | 5194 | 0.1144 |
| 2.8111 | 5195 | 0.0995 |
| 2.8117 | 5196 | 0.1093 |
| 2.8122 | 5197 | 0.0746 |
| 2.8128 | 5198 | 0.1146 |
| 2.8133 | 5199 | 0.0881 |
| 2.8139 | 5200 | 0.1364 |
| 2.8144 | 5201 | 0.105 |
| 2.8149 | 5202 | 0.1348 |
| 2.8155 | 5203 | 0.1301 |
| 2.8160 | 5204 | 0.0644 |
| 2.8166 | 5205 | 0.053 |
| 2.8171 | 5206 | 0.0969 |
| 2.8176 | 5207 | 0.0987 |
| 2.8182 | 5208 | 0.1263 |
| 2.8187 | 5209 | 0.0818 |
| 2.8193 | 5210 | 0.0961 |
| 2.8198 | 5211 | 0.0767 |
| 2.8203 | 5212 | 0.0476 |
| 2.8209 | 5213 | 0.0479 |
| 2.8214 | 5214 | 0.0867 |
| 2.8220 | 5215 | 0.069 |
| 2.8225 | 5216 | 0.0728 |
| 2.8231 | 5217 | 0.0746 |
| 2.8236 | 5218 | 0.0814 |
| 2.8241 | 5219 | 0.057 |
| 2.8247 | 5220 | 0.1473 |
| 2.8252 | 5221 | 0.0941 |
| 2.8258 | 5222 | 0.1105 |
| 2.8263 | 5223 | 0.0808 |
| 2.8268 | 5224 | 0.0985 |
| 2.8274 | 5225 | 0.0932 |
| 2.8279 | 5226 | 0.0384 |
| 2.8285 | 5227 | 0.0685 |
| 2.8290 | 5228 | 0.0606 |
| 2.8295 | 5229 | 0.1219 |
| 2.8301 | 5230 | 0.1382 |
| 2.8306 | 5231 | 0.0737 |
| 2.8312 | 5232 | 0.0794 |
| 2.8317 | 5233 | 0.0856 |
| 2.8323 | 5234 | 0.102 |
| 2.8328 | 5235 | 0.068 |
| 2.8333 | 5236 | 0.129 |
| 2.8339 | 5237 | 0.14 |
| 2.8344 | 5238 | 0.1165 |
| 2.8350 | 5239 | 0.1143 |
| 2.8355 | 5240 | 0.1139 |
| 2.8360 | 5241 | 0.0892 |
| 2.8366 | 5242 | 0.1215 |
| 2.8371 | 5243 | 0.0802 |
| 2.8377 | 5244 | 0.0559 |
| 2.8382 | 5245 | 0.1257 |
| 2.8387 | 5246 | 0.1307 |
| 2.8393 | 5247 | 0.1276 |
| 2.8398 | 5248 | 0.0803 |
| 2.8404 | 5249 | 0.1002 |
| 2.8409 | 5250 | 0.0873 |
| 2.8415 | 5251 | 0.0535 |
| 2.8420 | 5252 | 0.054 |
| 2.8425 | 5253 | 0.1194 |
| 2.8431 | 5254 | 0.092 |
| 2.8436 | 5255 | 0.1119 |
| 2.8442 | 5256 | 0.0473 |
| 2.8447 | 5257 | 0.0476 |
| 2.8452 | 5258 | 0.1 |
| 2.8458 | 5259 | 0.0519 |
| 2.8463 | 5260 | 0.106 |
| 2.8469 | 5261 | 0.0738 |
| 2.8474 | 5262 | 0.1304 |
| 2.8479 | 5263 | 0.0729 |
| 2.8485 | 5264 | 0.066 |
| 2.8490 | 5265 | 0.0973 |
| 2.8496 | 5266 | 0.0884 |
| 2.8501 | 5267 | 0.1494 |
| 2.8506 | 5268 | 0.0362 |
| 2.8512 | 5269 | 0.0881 |
| 2.8517 | 5270 | 0.1072 |
| 2.8523 | 5271 | 0.0706 |
| 2.8528 | 5272 | 0.0623 |
| 2.8534 | 5273 | 0.1709 |
| 2.8539 | 5274 | 0.0733 |
| 2.8544 | 5275 | 0.0702 |
| 2.8550 | 5276 | 0.0895 |
| 2.8555 | 5277 | 0.0881 |
| 2.8561 | 5278 | 0.1197 |
| 2.8566 | 5279 | 0.1262 |
| 2.8571 | 5280 | 0.0654 |
| 2.8577 | 5281 | 0.1257 |
| 2.8582 | 5282 | 0.082 |
| 2.8588 | 5283 | 0.1057 |
| 2.8593 | 5284 | 0.1027 |
| 2.8598 | 5285 | 0.1269 |
| 2.8604 | 5286 | 0.1304 |
| 2.8609 | 5287 | 0.0855 |
| 2.8615 | 5288 | 0.0882 |
| 2.8620 | 5289 | 0.1475 |
| 2.8626 | 5290 | 0.081 |
| 2.8631 | 5291 | 0.0437 |
| 2.8636 | 5292 | 0.1325 |
| 2.8642 | 5293 | 0.13 |
| 2.8647 | 5294 | 0.0857 |
| 2.8653 | 5295 | 0.0766 |
| 2.8658 | 5296 | 0.1176 |
| 2.8663 | 5297 | 0.0737 |
| 2.8669 | 5298 | 0.1616 |
| 2.8674 | 5299 | 0.0983 |
| 2.8680 | 5300 | 0.0612 |
| 2.8685 | 5301 | 0.0853 |
| 2.8690 | 5302 | 0.1532 |
| 2.8696 | 5303 | 0.0899 |
| 2.8701 | 5304 | 0.0891 |
| 2.8707 | 5305 | 0.0966 |
| 2.8712 | 5306 | 0.0877 |
| 2.8718 | 5307 | 0.1213 |
| 2.8723 | 5308 | 0.1024 |
| 2.8728 | 5309 | 0.1338 |
| 2.8734 | 5310 | 0.176 |
| 2.8739 | 5311 | 0.0696 |
| 2.8745 | 5312 | 0.0742 |
| 2.8750 | 5313 | 0.0832 |
| 2.8755 | 5314 | 0.0736 |
| 2.8761 | 5315 | 0.1114 |
| 2.8766 | 5316 | 0.0958 |
| 2.8772 | 5317 | 0.0678 |
| 2.8777 | 5318 | 0.1199 |
| 2.8782 | 5319 | 0.0922 |
| 2.8788 | 5320 | 0.0814 |
| 2.8793 | 5321 | 0.1051 |
| 2.8799 | 5322 | 0.0572 |
| 2.8804 | 5323 | 0.1057 |
| 2.8810 | 5324 | 0.152 |
| 2.8815 | 5325 | 0.1074 |
| 2.8820 | 5326 | 0.1159 |
| 2.8826 | 5327 | 0.0592 |
| 2.8831 | 5328 | 0.073 |
| 2.8837 | 5329 | 0.0933 |
| 2.8842 | 5330 | 0.0418 |
| 2.8847 | 5331 | 0.1063 |
| 2.8853 | 5332 | 0.06 |
| 2.8858 | 5333 | 0.0948 |
| 2.8864 | 5334 | 0.1185 |
| 2.8869 | 5335 | 0.1314 |
| 2.8874 | 5336 | 0.0955 |
| 2.8880 | 5337 | 0.1016 |
| 2.8885 | 5338 | 0.1023 |
| 2.8891 | 5339 | 0.0726 |
| 2.8896 | 5340 | 0.0656 |
| 2.8902 | 5341 | 0.0976 |
| 2.8907 | 5342 | 0.1139 |
| 2.8912 | 5343 | 0.1287 |
| 2.8918 | 5344 | 0.0945 |
| 2.8923 | 5345 | 0.1498 |
| 2.8929 | 5346 | 0.1021 |
| 2.8934 | 5347 | 0.1445 |
| 2.8939 | 5348 | 0.0695 |
| 2.8945 | 5349 | 0.1084 |
| 2.8950 | 5350 | 0.1045 |
| 2.8956 | 5351 | 0.1405 |
| 2.8961 | 5352 | 0.107 |
| 2.8966 | 5353 | 0.1017 |
| 2.8972 | 5354 | 0.0773 |
| 2.8977 | 5355 | 0.1269 |
| 2.8983 | 5356 | 0.073 |
| 2.8988 | 5357 | 0.1202 |
| 2.8994 | 5358 | 0.0795 |
| 2.8999 | 5359 | 0.1065 |
| 2.9004 | 5360 | 0.127 |
| 2.9010 | 5361 | 0.093 |
| 2.9015 | 5362 | 0.0923 |
| 2.9021 | 5363 | 0.1429 |
| 2.9026 | 5364 | 0.0668 |
| 2.9031 | 5365 | 0.0358 |
| 2.9037 | 5366 | 0.0909 |
| 2.9042 | 5367 | 0.0544 |
| 2.9048 | 5368 | 0.1164 |
| 2.9053 | 5369 | 0.0436 |
| 2.9058 | 5370 | 0.0721 |
| 2.9064 | 5371 | 0.0882 |
| 2.9069 | 5372 | 0.1498 |
| 2.9075 | 5373 | 0.085 |
| 2.9080 | 5374 | 0.1012 |
| 2.9085 | 5375 | 0.1164 |
| 2.9091 | 5376 | 0.1303 |
| 2.9096 | 5377 | 0.1091 |
| 2.9102 | 5378 | 0.0666 |
| 2.9107 | 5379 | 0.0969 |
| 2.9113 | 5380 | 0.0863 |
| 2.9118 | 5381 | 0.0833 |
| 2.9123 | 5382 | 0.0849 |
| 2.9129 | 5383 | 0.0662 |
| 2.9134 | 5384 | 0.0861 |
| 2.9140 | 5385 | 0.1756 |
| 2.9145 | 5386 | 0.0665 |
| 2.9150 | 5387 | 0.1171 |
| 2.9156 | 5388 | 0.1085 |
| 2.9161 | 5389 | 0.1401 |
| 2.9167 | 5390 | 0.0727 |
| 2.9172 | 5391 | 0.1318 |
| 2.9177 | 5392 | 0.085 |
| 2.9183 | 5393 | 0.0961 |
| 2.9188 | 5394 | 0.1146 |
| 2.9194 | 5395 | 0.0779 |
| 2.9199 | 5396 | 0.0885 |
| 2.9205 | 5397 | 0.1174 |
| 2.9210 | 5398 | 0.0801 |
| 2.9215 | 5399 | 0.0656 |
| 2.9221 | 5400 | 0.0871 |
| 2.9226 | 5401 | 0.129 |
| 2.9232 | 5402 | 0.1138 |
| 2.9237 | 5403 | 0.0899 |
| 2.9242 | 5404 | 0.0413 |
| 2.9248 | 5405 | 0.1202 |
| 2.9253 | 5406 | 0.0755 |
| 2.9259 | 5407 | 0.0857 |
| 2.9264 | 5408 | 0.045 |
| 2.9269 | 5409 | 0.0622 |
| 2.9275 | 5410 | 0.1063 |
| 2.9280 | 5411 | 0.196 |
| 2.9286 | 5412 | 0.0502 |
| 2.9291 | 5413 | 0.0701 |
| 2.9297 | 5414 | 0.0962 |
| 2.9302 | 5415 | 0.0756 |
| 2.9307 | 5416 | 0.078 |
| 2.9313 | 5417 | 0.1101 |
| 2.9318 | 5418 | 0.0692 |
| 2.9324 | 5419 | 0.1278 |
| 2.9329 | 5420 | 0.1044 |
| 2.9334 | 5421 | 0.0609 |
| 2.9340 | 5422 | 0.0898 |
| 2.9345 | 5423 | 0.1291 |
| 2.9351 | 5424 | 0.0864 |
| 2.9356 | 5425 | 0.0708 |
| 2.9361 | 5426 | 0.0891 |
| 2.9367 | 5427 | 0.0833 |
| 2.9372 | 5428 | 0.0677 |
| 2.9378 | 5429 | 0.0914 |
| 2.9383 | 5430 | 0.0892 |
| 2.9389 | 5431 | 0.1198 |
| 2.9394 | 5432 | 0.0815 |
| 2.9399 | 5433 | 0.1249 |
| 2.9405 | 5434 | 0.0839 |
| 2.9410 | 5435 | 0.0764 |
| 2.9416 | 5436 | 0.0877 |
| 2.9421 | 5437 | 0.0915 |
| 2.9426 | 5438 | 0.0897 |
| 2.9432 | 5439 | 0.0895 |
| 2.9437 | 5440 | 0.0913 |
| 2.9443 | 5441 | 0.1072 |
| 2.9448 | 5442 | 0.1195 |
| 2.9453 | 5443 | 0.1172 |
| 2.9459 | 5444 | 0.0614 |
| 2.9464 | 5445 | 0.0655 |
| 2.9470 | 5446 | 0.0907 |
| 2.9475 | 5447 | 0.1093 |
| 2.9481 | 5448 | 0.1094 |
| 2.9486 | 5449 | 0.1136 |
| 2.9491 | 5450 | 0.1177 |
| 2.9497 | 5451 | 0.1121 |
| 2.9502 | 5452 | 0.1027 |
| 2.9508 | 5453 | 0.1089 |
| 2.9513 | 5454 | 0.0873 |
| 2.9518 | 5455 | 0.0808 |
| 2.9524 | 5456 | 0.114 |
| 2.9529 | 5457 | 0.114 |
| 2.9535 | 5458 | 0.0601 |
| 2.9540 | 5459 | 0.1312 |
| 2.9545 | 5460 | 0.1432 |
| 2.9551 | 5461 | 0.1008 |
| 2.9556 | 5462 | 0.1363 |
| 2.9562 | 5463 | 0.0997 |
| 2.9567 | 5464 | 0.1113 |
| 2.9573 | 5465 | 0.1365 |
| 2.9578 | 5466 | 0.1514 |
| 2.9583 | 5467 | 0.1709 |
| 2.9589 | 5468 | 0.0847 |
| 2.9594 | 5469 | 0.1093 |
| 2.9600 | 5470 | 0.106 |
| 2.9605 | 5471 | 0.1107 |
| 2.9610 | 5472 | 0.0635 |
| 2.9616 | 5473 | 0.1064 |
| 2.9621 | 5474 | 0.0894 |
| 2.9627 | 5475 | 0.1676 |
| 2.9632 | 5476 | 0.091 |
| 2.9637 | 5477 | 0.0878 |
| 2.9643 | 5478 | 0.0591 |
| 2.9648 | 5479 | 0.0574 |
| 2.9654 | 5480 | 0.1093 |
| 2.9659 | 5481 | 0.1349 |
| 2.9665 | 5482 | 0.0918 |
| 2.9670 | 5483 | 0.0789 |
| 2.9675 | 5484 | 0.1823 |
| 2.9681 | 5485 | 0.0773 |
| 2.9686 | 5486 | 0.0842 |
| 2.9692 | 5487 | 0.1025 |
| 2.9697 | 5488 | 0.0506 |
| 2.9702 | 5489 | 0.0634 |
| 2.9708 | 5490 | 0.0805 |
| 2.9713 | 5491 | 0.0558 |
| 2.9719 | 5492 | 0.1261 |
| 2.9724 | 5493 | 0.0778 |
| 2.9729 | 5494 | 0.0731 |
| 2.9735 | 5495 | 0.1029 |
| 2.9740 | 5496 | 0.1656 |
| 2.9746 | 5497 | 0.0666 |
| 2.9751 | 5498 | 0.1192 |
| 2.9756 | 5499 | 0.1185 |
| 2.9762 | 5500 | 0.1022 |
| 2.9767 | 5501 | 0.1075 |
| 2.9773 | 5502 | 0.0503 |
| 2.9778 | 5503 | 0.0946 |
| 2.9784 | 5504 | 0.1351 |
| 2.9789 | 5505 | 0.089 |
| 2.9794 | 5506 | 0.0984 |
| 2.9800 | 5507 | 0.0735 |
| 2.9805 | 5508 | 0.0786 |
| 2.9811 | 5509 | 0.0881 |
| 2.9816 | 5510 | 0.0581 |
| 2.9821 | 5511 | 0.0558 |
| 2.9827 | 5512 | 0.0694 |
| 2.9832 | 5513 | 0.0913 |
| 2.9838 | 5514 | 0.0585 |
| 2.9843 | 5515 | 0.0538 |
| 2.9848 | 5516 | 0.0956 |
| 2.9854 | 5517 | 0.0735 |
| 2.9859 | 5518 | 0.1273 |
| 2.9865 | 5519 | 0.0906 |
| 2.9870 | 5520 | 0.0951 |
| 2.9876 | 5521 | 0.0671 |
| 2.9881 | 5522 | 0.0686 |
| 2.9886 | 5523 | 0.0955 |
| 2.9892 | 5524 | 0.083 |
| 2.9897 | 5525 | 0.0693 |
| 2.9903 | 5526 | 0.1424 |
| 2.9908 | 5527 | 0.1084 |
| 2.9913 | 5528 | 0.1126 |
| 2.9919 | 5529 | 0.0573 |
| 2.9924 | 5530 | 0.1211 |
| 2.9930 | 5531 | 0.061 |
| 2.9935 | 5532 | 0.0792 |
| 2.9940 | 5533 | 0.1059 |
| 2.9946 | 5534 | 0.0728 |
| 2.9951 | 5535 | 0.1047 |
| 2.9957 | 5536 | 0.0839 |
| 2.9962 | 5537 | 0.1019 |
| 2.9968 | 5538 | 0.1577 |
| 2.9973 | 5539 | 0.153 |
| 2.9978 | 5540 | 0.1152 |
| 2.9984 | 5541 | 0.0853 |
| 2.9989 | 5542 | 0.0858 |
| 2.9995 | 5543 | 0.0342 |
| 3.0000 | 5544 | 0.1848 |
| 3.0005 | 5545 | 0.0774 |
| 3.0011 | 5546 | 0.064 |
| 3.0016 | 5547 | 0.0417 |
| 3.0022 | 5548 | 0.1099 |
| 3.0027 | 5549 | 0.0731 |
| 3.0032 | 5550 | 0.1233 |
| 3.0038 | 5551 | 0.0636 |
| 3.0043 | 5552 | 0.0658 |
| 3.0049 | 5553 | 0.0737 |
| 3.0054 | 5554 | 0.1226 |
| 3.0060 | 5555 | 0.0937 |
| 3.0065 | 5556 | 0.0684 |
| 3.0070 | 5557 | 0.0778 |
| 3.0076 | 5558 | 0.079 |
| 3.0081 | 5559 | 0.0609 |
| 3.0087 | 5560 | 0.1028 |
| 3.0092 | 5561 | 0.0805 |
| 3.0097 | 5562 | 0.1247 |
| 3.0103 | 5563 | 0.0986 |
| 3.0108 | 5564 | 0.1023 |
| 3.0114 | 5565 | 0.0856 |
| 3.0119 | 5566 | 0.1178 |
| 3.0124 | 5567 | 0.0839 |
| 3.0130 | 5568 | 0.1031 |
| 3.0135 | 5569 | 0.116 |
| 3.0141 | 5570 | 0.0902 |
| 3.0146 | 5571 | 0.0677 |
| 3.0152 | 5572 | 0.0679 |
| 3.0157 | 5573 | 0.0878 |
| 3.0162 | 5574 | 0.0992 |
| 3.0168 | 5575 | 0.0957 |
| 3.0173 | 5576 | 0.0854 |
| 3.0179 | 5577 | 0.0547 |
| 3.0184 | 5578 | 0.0524 |
| 3.0189 | 5579 | 0.0555 |
| 3.0195 | 5580 | 0.1467 |
| 3.0200 | 5581 | 0.0675 |
| 3.0206 | 5582 | 0.0681 |
| 3.0211 | 5583 | 0.0449 |
| 3.0216 | 5584 | 0.0828 |
| 3.0222 | 5585 | 0.0976 |
| 3.0227 | 5586 | 0.0655 |
| 3.0233 | 5587 | 0.0709 |
| 3.0238 | 5588 | 0.0698 |
| 3.0244 | 5589 | 0.1044 |
| 3.0249 | 5590 | 0.0553 |
| 3.0254 | 5591 | 0.0383 |
| 3.0260 | 5592 | 0.1036 |
| 3.0265 | 5593 | 0.1133 |
| 3.0271 | 5594 | 0.0771 |
| 3.0276 | 5595 | 0.1519 |
| 3.0281 | 5596 | 0.0637 |
| 3.0287 | 5597 | 0.0874 |
| 3.0292 | 5598 | 0.0738 |
| 3.0298 | 5599 | 0.0852 |
| 3.0303 | 5600 | 0.0713 |
| 3.0308 | 5601 | 0.0548 |
| 3.0314 | 5602 | 0.0696 |
| 3.0319 | 5603 | 0.065 |
| 3.0325 | 5604 | 0.1102 |
| 3.0330 | 5605 | 0.1083 |
| 3.0335 | 5606 | 0.0941 |
| 3.0341 | 5607 | 0.055 |
| 3.0346 | 5608 | 0.0255 |
| 3.0352 | 5609 | 0.0769 |
| 3.0357 | 5610 | 0.0556 |
| 3.0363 | 5611 | 0.1053 |
| 3.0368 | 5612 | 0.0578 |
| 3.0373 | 5613 | 0.111 |
| 3.0379 | 5614 | 0.0717 |
| 3.0384 | 5615 | 0.0848 |
| 3.0390 | 5616 | 0.075 |
| 3.0395 | 5617 | 0.07 |
| 3.0400 | 5618 | 0.0529 |
| 3.0406 | 5619 | 0.0533 |
| 3.0411 | 5620 | 0.0881 |
| 3.0417 | 5621 | 0.0716 |
| 3.0422 | 5622 | 0.0618 |
| 3.0427 | 5623 | 0.0472 |
| 3.0433 | 5624 | 0.0755 |
| 3.0438 | 5625 | 0.0865 |
| 3.0444 | 5626 | 0.0761 |
| 3.0449 | 5627 | 0.0477 |
| 3.0455 | 5628 | 0.0934 |
| 3.0460 | 5629 | 0.088 |
| 3.0465 | 5630 | 0.0639 |
| 3.0471 | 5631 | 0.0419 |
| 3.0476 | 5632 | 0.049 |
| 3.0482 | 5633 | 0.0711 |
| 3.0487 | 5634 | 0.0804 |
| 3.0492 | 5635 | 0.0726 |
| 3.0498 | 5636 | 0.0953 |
| 3.0503 | 5637 | 0.1136 |
| 3.0509 | 5638 | 0.0975 |
| 3.0514 | 5639 | 0.1029 |
| 3.0519 | 5640 | 0.1473 |
| 3.0525 | 5641 | 0.0537 |
| 3.0530 | 5642 | 0.0857 |
| 3.0536 | 5643 | 0.0705 |
| 3.0541 | 5644 | 0.0787 |
| 3.0547 | 5645 | 0.0748 |
| 3.0552 | 5646 | 0.0902 |
| 3.0557 | 5647 | 0.0748 |
| 3.0563 | 5648 | 0.0597 |
| 3.0568 | 5649 | 0.077 |
| 3.0574 | 5650 | 0.0304 |
| 3.0579 | 5651 | 0.1534 |
| 3.0584 | 5652 | 0.0829 |
| 3.0590 | 5653 | 0.0792 |
| 3.0595 | 5654 | 0.0668 |
| 3.0601 | 5655 | 0.1061 |
| 3.0606 | 5656 | 0.0516 |
| 3.0611 | 5657 | 0.1356 |
| 3.0617 | 5658 | 0.0537 |
| 3.0622 | 5659 | 0.0625 |
| 3.0628 | 5660 | 0.0561 |
| 3.0633 | 5661 | 0.0711 |
| 3.0639 | 5662 | 0.1057 |
| 3.0644 | 5663 | 0.0622 |
| 3.0649 | 5664 | 0.0795 |
| 3.0655 | 5665 | 0.0628 |
| 3.0660 | 5666 | 0.0942 |
| 3.0666 | 5667 | 0.153 |
| 3.0671 | 5668 | 0.0914 |
| 3.0676 | 5669 | 0.0885 |
| 3.0682 | 5670 | 0.0513 |
| 3.0687 | 5671 | 0.0942 |
| 3.0693 | 5672 | 0.0931 |
| 3.0698 | 5673 | 0.0909 |
| 3.0703 | 5674 | 0.0838 |
| 3.0709 | 5675 | 0.0957 |
| 3.0714 | 5676 | 0.0622 |
| 3.0720 | 5677 | 0.052 |
| 3.0725 | 5678 | 0.1528 |
| 3.0731 | 5679 | 0.0635 |
| 3.0736 | 5680 | 0.0548 |
| 3.0741 | 5681 | 0.0654 |
| 3.0747 | 5682 | 0.103 |
| 3.0752 | 5683 | 0.0676 |
| 3.0758 | 5684 | 0.0722 |
| 3.0763 | 5685 | 0.0703 |
| 3.0768 | 5686 | 0.0686 |
| 3.0774 | 5687 | 0.0627 |
| 3.0779 | 5688 | 0.1002 |
| 3.0785 | 5689 | 0.0634 |
| 3.0790 | 5690 | 0.062 |
| 3.0795 | 5691 | 0.0822 |
| 3.0801 | 5692 | 0.0956 |
| 3.0806 | 5693 | 0.0735 |
| 3.0812 | 5694 | 0.0607 |
| 3.0817 | 5695 | 0.1481 |
| 3.0823 | 5696 | 0.0671 |
| 3.0828 | 5697 | 0.0823 |
| 3.0833 | 5698 | 0.12 |
| 3.0839 | 5699 | 0.0433 |
| 3.0844 | 5700 | 0.0959 |
| 3.0850 | 5701 | 0.036 |
| 3.0855 | 5702 | 0.0699 |
| 3.0860 | 5703 | 0.0988 |
| 3.0866 | 5704 | 0.075 |
| 3.0871 | 5705 | 0.093 |
| 3.0877 | 5706 | 0.1268 |
| 3.0882 | 5707 | 0.0658 |
| 3.0887 | 5708 | 0.0729 |
| 3.0893 | 5709 | 0.0848 |
| 3.0898 | 5710 | 0.0847 |
| 3.0904 | 5711 | 0.0667 |
| 3.0909 | 5712 | 0.0559 |
| 3.0915 | 5713 | 0.1056 |
| 3.0920 | 5714 | 0.128 |
| 3.0925 | 5715 | 0.107 |
| 3.0931 | 5716 | 0.0916 |
| 3.0936 | 5717 | 0.0987 |
| 3.0942 | 5718 | 0.0878 |
| 3.0947 | 5719 | 0.0642 |
| 3.0952 | 5720 | 0.0824 |
| 3.0958 | 5721 | 0.07 |
| 3.0963 | 5722 | 0.0801 |
| 3.0969 | 5723 | 0.0294 |
| 3.0974 | 5724 | 0.0705 |
| 3.0979 | 5725 | 0.1107 |
| 3.0985 | 5726 | 0.0522 |
| 3.0990 | 5727 | 0.0742 |
| 3.0996 | 5728 | 0.0966 |
| 3.1001 | 5729 | 0.0686 |
| 3.1006 | 5730 | 0.0593 |
| 3.1012 | 5731 | 0.1158 |
| 3.1017 | 5732 | 0.1042 |
| 3.1023 | 5733 | 0.0474 |
| 3.1028 | 5734 | 0.0778 |
| 3.1034 | 5735 | 0.0825 |
| 3.1039 | 5736 | 0.0945 |
| 3.1044 | 5737 | 0.0715 |
| 3.1050 | 5738 | 0.0823 |
| 3.1055 | 5739 | 0.1125 |
| 3.1061 | 5740 | 0.0525 |
| 3.1066 | 5741 | 0.0954 |
| 3.1071 | 5742 | 0.0963 |
| 3.1077 | 5743 | 0.0841 |
| 3.1082 | 5744 | 0.0873 |
| 3.1088 | 5745 | 0.0805 |
| 3.1093 | 5746 | 0.0594 |
| 3.1098 | 5747 | 0.0385 |
| 3.1104 | 5748 | 0.0591 |
| 3.1109 | 5749 | 0.0982 |
| 3.1115 | 5750 | 0.078 |
| 3.1120 | 5751 | 0.0456 |
| 3.1126 | 5752 | 0.0847 |
| 3.1131 | 5753 | 0.0545 |
| 3.1136 | 5754 | 0.0922 |
| 3.1142 | 5755 | 0.1174 |
| 3.1147 | 5756 | 0.1056 |
| 3.1153 | 5757 | 0.0799 |
| 3.1158 | 5758 | 0.072 |
| 3.1163 | 5759 | 0.0893 |
| 3.1169 | 5760 | 0.0651 |
| 3.1174 | 5761 | 0.0832 |
| 3.1180 | 5762 | 0.093 |
| 3.1185 | 5763 | 0.0677 |
| 3.1190 | 5764 | 0.0649 |
| 3.1196 | 5765 | 0.0651 |
| 3.1201 | 5766 | 0.0771 |
| 3.1207 | 5767 | 0.0636 |
| 3.1212 | 5768 | 0.0901 |
| 3.1218 | 5769 | 0.121 |
| 3.1223 | 5770 | 0.0838 |
| 3.1228 | 5771 | 0.07 |
| 3.1234 | 5772 | 0.1266 |
| 3.1239 | 5773 | 0.0835 |
| 3.1245 | 5774 | 0.0469 |
| 3.1250 | 5775 | 0.0936 |
| 3.1255 | 5776 | 0.0535 |
| 3.1261 | 5777 | 0.1428 |
| 3.1266 | 5778 | 0.0938 |
| 3.1272 | 5779 | 0.0642 |
| 3.1277 | 5780 | 0.0656 |
| 3.1282 | 5781 | 0.0824 |
| 3.1288 | 5782 | 0.1233 |
| 3.1293 | 5783 | 0.0787 |
| 3.1299 | 5784 | 0.0878 |
| 3.1304 | 5785 | 0.1327 |
| 3.1310 | 5786 | 0.0886 |
| 3.1315 | 5787 | 0.1108 |
| 3.1320 | 5788 | 0.0786 |
| 3.1326 | 5789 | 0.1017 |
| 3.1331 | 5790 | 0.0488 |
| 3.1337 | 5791 | 0.0381 |
| 3.1342 | 5792 | 0.0869 |
| 3.1347 | 5793 | 0.0721 |
| 3.1353 | 5794 | 0.0935 |
| 3.1358 | 5795 | 0.0934 |
| 3.1364 | 5796 | 0.1036 |
| 3.1369 | 5797 | 0.0794 |
| 3.1374 | 5798 | 0.0739 |
| 3.1380 | 5799 | 0.0782 |
| 3.1385 | 5800 | 0.0893 |
| 3.1391 | 5801 | 0.0448 |
| 3.1396 | 5802 | 0.0469 |
| 3.1402 | 5803 | 0.0594 |
| 3.1407 | 5804 | 0.105 |
| 3.1412 | 5805 | 0.0924 |
| 3.1418 | 5806 | 0.1054 |
| 3.1423 | 5807 | 0.0653 |
| 3.1429 | 5808 | 0.1027 |
| 3.1434 | 5809 | 0.103 |
| 3.1439 | 5810 | 0.0598 |
| 3.1445 | 5811 | 0.1114 |
| 3.1450 | 5812 | 0.0818 |
| 3.1456 | 5813 | 0.0873 |
| 3.1461 | 5814 | 0.0915 |
| 3.1466 | 5815 | 0.034 |
| 3.1472 | 5816 | 0.0803 |
| 3.1477 | 5817 | 0.0804 |
| 3.1483 | 5818 | 0.0978 |
| 3.1488 | 5819 | 0.0483 |
| 3.1494 | 5820 | 0.0804 |
| 3.1499 | 5821 | 0.0919 |
| 3.1504 | 5822 | 0.063 |
| 3.1510 | 5823 | 0.1046 |
| 3.1515 | 5824 | 0.1235 |
| 3.1521 | 5825 | 0.0333 |
| 3.1526 | 5826 | 0.1265 |
| 3.1531 | 5827 | 0.0568 |
| 3.1537 | 5828 | 0.0718 |
| 3.1542 | 5829 | 0.0552 |
| 3.1548 | 5830 | 0.0443 |
| 3.1553 | 5831 | 0.0643 |
| 3.1558 | 5832 | 0.0681 |
| 3.1564 | 5833 | 0.1264 |
| 3.1569 | 5834 | 0.1108 |
| 3.1575 | 5835 | 0.0486 |
| 3.1580 | 5836 | 0.0661 |
| 3.1585 | 5837 | 0.0835 |
| 3.1591 | 5838 | 0.0735 |
| 3.1596 | 5839 | 0.0996 |
| 3.1602 | 5840 | 0.069 |
| 3.1607 | 5841 | 0.0614 |
| 3.1613 | 5842 | 0.0826 |
| 3.1618 | 5843 | 0.0572 |
| 3.1623 | 5844 | 0.062 |
| 3.1629 | 5845 | 0.0577 |
| 3.1634 | 5846 | 0.1343 |
| 3.1640 | 5847 | 0.1119 |
| 3.1645 | 5848 | 0.0656 |
| 3.1650 | 5849 | 0.0763 |
| 3.1656 | 5850 | 0.0689 |
| 3.1661 | 5851 | 0.082 |
| 3.1667 | 5852 | 0.0699 |
| 3.1672 | 5853 | 0.0576 |
| 3.1677 | 5854 | 0.0775 |
| 3.1683 | 5855 | 0.1195 |
| 3.1688 | 5856 | 0.0993 |
| 3.1694 | 5857 | 0.1003 |
| 3.1699 | 5858 | 0.0603 |
| 3.1705 | 5859 | 0.0994 |
| 3.1710 | 5860 | 0.0746 |
| 3.1715 | 5861 | 0.0659 |
| 3.1721 | 5862 | 0.0895 |
| 3.1726 | 5863 | 0.0728 |
| 3.1732 | 5864 | 0.0456 |
| 3.1737 | 5865 | 0.109 |
| 3.1742 | 5866 | 0.0485 |
| 3.1748 | 5867 | 0.1287 |
| 3.1753 | 5868 | 0.0332 |
| 3.1759 | 5869 | 0.0849 |
| 3.1764 | 5870 | 0.0826 |
| 3.1769 | 5871 | 0.0646 |
| 3.1775 | 5872 | 0.0593 |
| 3.1780 | 5873 | 0.1558 |
| 3.1786 | 5874 | 0.0773 |
| 3.1791 | 5875 | 0.0572 |
| 3.1797 | 5876 | 0.0756 |
| 3.1802 | 5877 | 0.0829 |
| 3.1807 | 5878 | 0.1003 |
| 3.1813 | 5879 | 0.0825 |
| 3.1818 | 5880 | 0.1302 |
| 3.1824 | 5881 | 0.0444 |
| 3.1829 | 5882 | 0.0645 |
| 3.1834 | 5883 | 0.0708 |
| 3.1840 | 5884 | 0.0487 |
| 3.1845 | 5885 | 0.0435 |
| 3.1851 | 5886 | 0.0539 |
| 3.1856 | 5887 | 0.1219 |
| 3.1861 | 5888 | 0.0897 |
| 3.1867 | 5889 | 0.1088 |
| 3.1872 | 5890 | 0.0979 |
| 3.1878 | 5891 | 0.1832 |
| 3.1883 | 5892 | 0.0544 |
| 3.1889 | 5893 | 0.0767 |
| 3.1894 | 5894 | 0.1379 |
| 3.1899 | 5895 | 0.0652 |
| 3.1905 | 5896 | 0.0846 |
| 3.1910 | 5897 | 0.061 |
| 3.1916 | 5898 | 0.1318 |
| 3.1921 | 5899 | 0.065 |
| 3.1926 | 5900 | 0.0774 |
| 3.1932 | 5901 | 0.0748 |
| 3.1937 | 5902 | 0.0625 |
| 3.1943 | 5903 | 0.0964 |
| 3.1948 | 5904 | 0.0798 |
| 3.1953 | 5905 | 0.0826 |
| 3.1959 | 5906 | 0.0795 |
| 3.1964 | 5907 | 0.134 |
| 3.1970 | 5908 | 0.0571 |
| 3.1975 | 5909 | 0.0584 |
| 3.1981 | 5910 | 0.067 |
| 3.1986 | 5911 | 0.048 |
| 3.1991 | 5912 | 0.1235 |
| 3.1997 | 5913 | 0.0654 |
| 3.2002 | 5914 | 0.0389 |
| 3.2008 | 5915 | 0.0798 |
| 3.2013 | 5916 | 0.058 |
| 3.2018 | 5917 | 0.1136 |
| 3.2024 | 5918 | 0.0617 |
| 3.2029 | 5919 | 0.0992 |
| 3.2035 | 5920 | 0.1078 |
| 3.2040 | 5921 | 0.0534 |
| 3.2045 | 5922 | 0.1208 |
| 3.2051 | 5923 | 0.0585 |
| 3.2056 | 5924 | 0.0616 |
| 3.2062 | 5925 | 0.087 |
| 3.2067 | 5926 | 0.0654 |
| 3.2073 | 5927 | 0.0691 |
| 3.2078 | 5928 | 0.0628 |
| 3.2083 | 5929 | 0.0721 |
| 3.2089 | 5930 | 0.059 |
| 3.2094 | 5931 | 0.0738 |
| 3.2100 | 5932 | 0.0647 |
| 3.2105 | 5933 | 0.0821 |
| 3.2110 | 5934 | 0.0692 |
| 3.2116 | 5935 | 0.126 |
| 3.2121 | 5936 | 0.0626 |
| 3.2127 | 5937 | 0.0853 |
| 3.2132 | 5938 | 0.0991 |
| 3.2137 | 5939 | 0.049 |
| 3.2143 | 5940 | 0.0924 |
| 3.2148 | 5941 | 0.0784 |
| 3.2154 | 5942 | 0.0943 |
| 3.2159 | 5943 | 0.1166 |
| 3.2165 | 5944 | 0.0961 |
| 3.2170 | 5945 | 0.1163 |
| 3.2175 | 5946 | 0.0754 |
| 3.2181 | 5947 | 0.1161 |
| 3.2186 | 5948 | 0.081 |
| 3.2192 | 5949 | 0.0384 |
| 3.2197 | 5950 | 0.0806 |
| 3.2202 | 5951 | 0.127 |
| 3.2208 | 5952 | 0.0445 |
| 3.2213 | 5953 | 0.07 |
| 3.2219 | 5954 | 0.0761 |
| 3.2224 | 5955 | 0.0701 |
| 3.2229 | 5956 | 0.059 |
| 3.2235 | 5957 | 0.0784 |
| 3.2240 | 5958 | 0.0487 |
| 3.2246 | 5959 | 0.0794 |
| 3.2251 | 5960 | 0.0597 |
| 3.2256 | 5961 | 0.0856 |
| 3.2262 | 5962 | 0.0767 |
| 3.2267 | 5963 | 0.0984 |
| 3.2273 | 5964 | 0.0603 |
| 3.2278 | 5965 | 0.0705 |
| 3.2284 | 5966 | 0.0636 |
| 3.2289 | 5967 | 0.0538 |
| 3.2294 | 5968 | 0.0811 |
| 3.2300 | 5969 | 0.0592 |
| 3.2305 | 5970 | 0.0804 |
| 3.2311 | 5971 | 0.068 |
| 3.2316 | 5972 | 0.0721 |
| 3.2321 | 5973 | 0.0922 |
| 3.2327 | 5974 | 0.1008 |
| 3.2332 | 5975 | 0.0932 |
| 3.2338 | 5976 | 0.1179 |
| 3.2343 | 5977 | 0.1413 |
| 3.2348 | 5978 | 0.0636 |
| 3.2354 | 5979 | 0.0694 |
| 3.2359 | 5980 | 0.109 |
| 3.2365 | 5981 | 0.138 |
| 3.2370 | 5982 | 0.0577 |
| 3.2376 | 5983 | 0.0859 |
| 3.2381 | 5984 | 0.0736 |
| 3.2386 | 5985 | 0.0919 |
| 3.2392 | 5986 | 0.1042 |
| 3.2397 | 5987 | 0.0776 |
| 3.2403 | 5988 | 0.0515 |
| 3.2408 | 5989 | 0.1458 |
| 3.2413 | 5990 | 0.1263 |
| 3.2419 | 5991 | 0.0241 |
| 3.2424 | 5992 | 0.0408 |
| 3.2430 | 5993 | 0.0554 |
| 3.2435 | 5994 | 0.0435 |
| 3.2440 | 5995 | 0.078 |
| 3.2446 | 5996 | 0.1463 |
| 3.2451 | 5997 | 0.0615 |
| 3.2457 | 5998 | 0.079 |
| 3.2462 | 5999 | 0.0751 |
| 3.2468 | 6000 | 0.0605 |
| 3.2473 | 6001 | 0.1025 |
| 3.2478 | 6002 | 0.0402 |
| 3.2484 | 6003 | 0.0946 |
| 3.2489 | 6004 | 0.0809 |
| 3.2495 | 6005 | 0.1007 |
| 3.2500 | 6006 | 0.0722 |
| 3.2505 | 6007 | 0.0483 |
| 3.2511 | 6008 | 0.062 |
| 3.2516 | 6009 | 0.0627 |
| 3.2522 | 6010 | 0.0816 |
| 3.2527 | 6011 | 0.0752 |
| 3.2532 | 6012 | 0.0727 |
| 3.2538 | 6013 | 0.0948 |
| 3.2543 | 6014 | 0.0767 |
| 3.2549 | 6015 | 0.0584 |
| 3.2554 | 6016 | 0.1126 |
| 3.2560 | 6017 | 0.1433 |
| 3.2565 | 6018 | 0.0721 |
| 3.2570 | 6019 | 0.0659 |
| 3.2576 | 6020 | 0.075 |
| 3.2581 | 6021 | 0.1374 |
| 3.2587 | 6022 | 0.0436 |
| 3.2592 | 6023 | 0.0927 |
| 3.2597 | 6024 | 0.0992 |
| 3.2603 | 6025 | 0.1064 |
| 3.2608 | 6026 | 0.0708 |
| 3.2614 | 6027 | 0.0626 |
| 3.2619 | 6028 | 0.0665 |
| 3.2624 | 6029 | 0.0608 |
| 3.2630 | 6030 | 0.0569 |
| 3.2635 | 6031 | 0.0457 |
| 3.2641 | 6032 | 0.0637 |
| 3.2646 | 6033 | 0.1283 |
| 3.2652 | 6034 | 0.1004 |
| 3.2657 | 6035 | 0.0431 |
| 3.2662 | 6036 | 0.1564 |
| 3.2668 | 6037 | 0.0774 |
| 3.2673 | 6038 | 0.1013 |
| 3.2679 | 6039 | 0.097 |
| 3.2684 | 6040 | 0.1249 |
| 3.2689 | 6041 | 0.0731 |
| 3.2695 | 6042 | 0.0737 |
| 3.2700 | 6043 | 0.0689 |
| 3.2706 | 6044 | 0.0455 |
| 3.2711 | 6045 | 0.078 |
| 3.2716 | 6046 | 0.1229 |
| 3.2722 | 6047 | 0.098 |
| 3.2727 | 6048 | 0.086 |
| 3.2733 | 6049 | 0.0835 |
| 3.2738 | 6050 | 0.0572 |
| 3.2744 | 6051 | 0.096 |
| 3.2749 | 6052 | 0.0686 |
| 3.2754 | 6053 | 0.0427 |
| 3.2760 | 6054 | 0.137 |
| 3.2765 | 6055 | 0.0527 |
| 3.2771 | 6056 | 0.0798 |
| 3.2776 | 6057 | 0.0893 |
| 3.2781 | 6058 | 0.0943 |
| 3.2787 | 6059 | 0.0605 |
| 3.2792 | 6060 | 0.1054 |
| 3.2798 | 6061 | 0.0936 |
| 3.2803 | 6062 | 0.1033 |
| 3.2808 | 6063 | 0.0674 |
| 3.2814 | 6064 | 0.1009 |
| 3.2819 | 6065 | 0.0597 |
| 3.2825 | 6066 | 0.0573 |
| 3.2830 | 6067 | 0.0523 |
| 3.2835 | 6068 | 0.0814 |
| 3.2841 | 6069 | 0.0546 |
| 3.2846 | 6070 | 0.0357 |
| 3.2852 | 6071 | 0.074 |
| 3.2857 | 6072 | 0.0799 |
| 3.2863 | 6073 | 0.1023 |
| 3.2868 | 6074 | 0.0686 |
| 3.2873 | 6075 | 0.0713 |
| 3.2879 | 6076 | 0.071 |
| 3.2884 | 6077 | 0.0691 |
| 3.2890 | 6078 | 0.0916 |
| 3.2895 | 6079 | 0.1019 |
| 3.2900 | 6080 | 0.049 |
| 3.2906 | 6081 | 0.0797 |
| 3.2911 | 6082 | 0.0584 |
| 3.2917 | 6083 | 0.0758 |
| 3.2922 | 6084 | 0.0895 |
| 3.2927 | 6085 | 0.0796 |
| 3.2933 | 6086 | 0.0615 |
| 3.2938 | 6087 | 0.0765 |
| 3.2944 | 6088 | 0.0594 |
| 3.2949 | 6089 | 0.1275 |
| 3.2955 | 6090 | 0.0927 |
| 3.2960 | 6091 | 0.056 |
| 3.2965 | 6092 | 0.0733 |
| 3.2971 | 6093 | 0.0965 |
| 3.2976 | 6094 | 0.1002 |
| 3.2982 | 6095 | 0.1042 |
| 3.2987 | 6096 | 0.0592 |
| 3.2992 | 6097 | 0.0739 |
| 3.2998 | 6098 | 0.08 |
| 3.3003 | 6099 | 0.1251 |
| 3.3009 | 6100 | 0.0655 |
| 3.3014 | 6101 | 0.0597 |
| 3.3019 | 6102 | 0.0611 |
| 3.3025 | 6103 | 0.0649 |
| 3.3030 | 6104 | 0.1331 |
| 3.3036 | 6105 | 0.1075 |
| 3.3041 | 6106 | 0.1372 |
| 3.3047 | 6107 | 0.0932 |
| 3.3052 | 6108 | 0.0888 |
| 3.3057 | 6109 | 0.1288 |
| 3.3063 | 6110 | 0.0956 |
| 3.3068 | 6111 | 0.0957 |
| 3.3074 | 6112 | 0.0565 |
| 3.3079 | 6113 | 0.0718 |
| 3.3084 | 6114 | 0.1002 |
| 3.3090 | 6115 | 0.0904 |
| 3.3095 | 6116 | 0.1076 |
| 3.3101 | 6117 | 0.0631 |
| 3.3106 | 6118 | 0.0548 |
| 3.3111 | 6119 | 0.0928 |
| 3.3117 | 6120 | 0.0632 |
| 3.3122 | 6121 | 0.0804 |
| 3.3128 | 6122 | 0.0869 |
| 3.3133 | 6123 | 0.0751 |
| 3.3139 | 6124 | 0.0954 |
| 3.3144 | 6125 | 0.1069 |
| 3.3149 | 6126 | 0.1149 |
| 3.3155 | 6127 | 0.111 |
| 3.3160 | 6128 | 0.0404 |
| 3.3166 | 6129 | 0.1078 |
| 3.3171 | 6130 | 0.1086 |
| 3.3176 | 6131 | 0.0585 |
| 3.3182 | 6132 | 0.0678 |
| 3.3187 | 6133 | 0.0862 |
| 3.3193 | 6134 | 0.0835 |
| 3.3198 | 6135 | 0.0784 |
| 3.3203 | 6136 | 0.0973 |
| 3.3209 | 6137 | 0.0953 |
| 3.3214 | 6138 | 0.0527 |
| 3.3220 | 6139 | 0.0784 |
| 3.3225 | 6140 | 0.0833 |
| 3.3231 | 6141 | 0.0839 |
| 3.3236 | 6142 | 0.081 |
| 3.3241 | 6143 | 0.0715 |
| 3.3247 | 6144 | 0.1214 |
| 3.3252 | 6145 | 0.0441 |
| 3.3258 | 6146 | 0.0812 |
| 3.3263 | 6147 | 0.1044 |
| 3.3268 | 6148 | 0.0956 |
| 3.3274 | 6149 | 0.07 |
| 3.3279 | 6150 | 0.1022 |
| 3.3285 | 6151 | 0.0618 |
| 3.3290 | 6152 | 0.0641 |
| 3.3295 | 6153 | 0.0605 |
| 3.3301 | 6154 | 0.0909 |
| 3.3306 | 6155 | 0.0574 |
| 3.3312 | 6156 | 0.1013 |
| 3.3317 | 6157 | 0.0549 |
| 3.3323 | 6158 | 0.0878 |
| 3.3328 | 6159 | 0.0883 |
| 3.3333 | 6160 | 0.0683 |
| 3.3339 | 6161 | 0.1398 |
| 3.3344 | 6162 | 0.1068 |
| 3.3350 | 6163 | 0.0965 |
| 3.3355 | 6164 | 0.0813 |
| 3.3360 | 6165 | 0.0707 |
| 3.3366 | 6166 | 0.0939 |
| 3.3371 | 6167 | 0.0789 |
| 3.3377 | 6168 | 0.0959 |
| 3.3382 | 6169 | 0.0608 |
| 3.3387 | 6170 | 0.0971 |
| 3.3393 | 6171 | 0.1021 |
| 3.3398 | 6172 | 0.1045 |
| 3.3404 | 6173 | 0.1451 |
| 3.3409 | 6174 | 0.1212 |
| 3.3415 | 6175 | 0.1469 |
| 3.3420 | 6176 | 0.0952 |
| 3.3425 | 6177 | 0.0558 |
| 3.3431 | 6178 | 0.1147 |
| 3.3436 | 6179 | 0.0801 |
| 3.3442 | 6180 | 0.1363 |
| 3.3447 | 6181 | 0.0676 |
| 3.3452 | 6182 | 0.0699 |
| 3.3458 | 6183 | 0.0624 |
| 3.3463 | 6184 | 0.0724 |
| 3.3469 | 6185 | 0.0847 |
| 3.3474 | 6186 | 0.0635 |
| 3.3479 | 6187 | 0.0868 |
| 3.3485 | 6188 | 0.112 |
| 3.3490 | 6189 | 0.131 |
| 3.3496 | 6190 | 0.1078 |
| 3.3501 | 6191 | 0.1334 |
| 3.3506 | 6192 | 0.0853 |
| 3.3512 | 6193 | 0.0836 |
| 3.3517 | 6194 | 0.059 |
| 3.3523 | 6195 | 0.0523 |
| 3.3528 | 6196 | 0.07 |
| 3.3534 | 6197 | 0.0706 |
| 3.3539 | 6198 | 0.0742 |
| 3.3544 | 6199 | 0.0642 |
| 3.3550 | 6200 | 0.0922 |
| 3.3555 | 6201 | 0.0638 |
| 3.3561 | 6202 | 0.0609 |
| 3.3566 | 6203 | 0.0727 |
| 3.3571 | 6204 | 0.077 |
| 3.3577 | 6205 | 0.1474 |
| 3.3582 | 6206 | 0.0486 |
| 3.3588 | 6207 | 0.063 |
| 3.3593 | 6208 | 0.0842 |
| 3.3598 | 6209 | 0.1142 |
| 3.3604 | 6210 | 0.1024 |
| 3.3609 | 6211 | 0.119 |
| 3.3615 | 6212 | 0.11 |
| 3.3620 | 6213 | 0.0578 |
| 3.3626 | 6214 | 0.0844 |
| 3.3631 | 6215 | 0.0361 |
| 3.3636 | 6216 | 0.053 |
| 3.3642 | 6217 | 0.0699 |
| 3.3647 | 6218 | 0.062 |
| 3.3653 | 6219 | 0.0637 |
| 3.3658 | 6220 | 0.1125 |
| 3.3663 | 6221 | 0.0939 |
| 3.3669 | 6222 | 0.0783 |
| 3.3674 | 6223 | 0.0992 |
| 3.3680 | 6224 | 0.0735 |
| 3.3685 | 6225 | 0.0531 |
| 3.3690 | 6226 | 0.0906 |
| 3.3696 | 6227 | 0.0983 |
| 3.3701 | 6228 | 0.0512 |
| 3.3707 | 6229 | 0.0695 |
| 3.3712 | 6230 | 0.1186 |
| 3.3718 | 6231 | 0.0387 |
| 3.3723 | 6232 | 0.094 |
| 3.3728 | 6233 | 0.0615 |
| 3.3734 | 6234 | 0.1316 |
| 3.3739 | 6235 | 0.0895 |
| 3.3745 | 6236 | 0.1054 |
| 3.3750 | 6237 | 0.0706 |
| 3.3755 | 6238 | 0.0959 |
| 3.3761 | 6239 | 0.0737 |
| 3.3766 | 6240 | 0.0593 |
| 3.3772 | 6241 | 0.078 |
| 3.3777 | 6242 | 0.0723 |
| 3.3782 | 6243 | 0.082 |
| 3.3788 | 6244 | 0.1098 |
| 3.3793 | 6245 | 0.0793 |
| 3.3799 | 6246 | 0.0581 |
| 3.3804 | 6247 | 0.0775 |
| 3.3810 | 6248 | 0.0827 |
| 3.3815 | 6249 | 0.1026 |
| 3.3820 | 6250 | 0.0479 |
| 3.3826 | 6251 | 0.0809 |
| 3.3831 | 6252 | 0.0969 |
| 3.3837 | 6253 | 0.117 |
| 3.3842 | 6254 | 0.0966 |
| 3.3847 | 6255 | 0.0468 |
| 3.3853 | 6256 | 0.0627 |
| 3.3858 | 6257 | 0.0531 |
| 3.3864 | 6258 | 0.0591 |
| 3.3869 | 6259 | 0.0578 |
| 3.3874 | 6260 | 0.1029 |
| 3.3880 | 6261 | 0.0565 |
| 3.3885 | 6262 | 0.0961 |
| 3.3891 | 6263 | 0.0595 |
| 3.3896 | 6264 | 0.0327 |
| 3.3902 | 6265 | 0.0762 |
| 3.3907 | 6266 | 0.0584 |
| 3.3912 | 6267 | 0.0931 |
| 3.3918 | 6268 | 0.108 |
| 3.3923 | 6269 | 0.0659 |
| 3.3929 | 6270 | 0.1078 |
| 3.3934 | 6271 | 0.0714 |
| 3.3939 | 6272 | 0.0853 |
| 3.3945 | 6273 | 0.061 |
| 3.3950 | 6274 | 0.0764 |
| 3.3956 | 6275 | 0.0497 |
| 3.3961 | 6276 | 0.0596 |
| 3.3966 | 6277 | 0.1012 |
| 3.3972 | 6278 | 0.0828 |
| 3.3977 | 6279 | 0.0632 |
| 3.3983 | 6280 | 0.1238 |
| 3.3988 | 6281 | 0.0596 |
| 3.3994 | 6282 | 0.1057 |
| 3.3999 | 6283 | 0.0537 |
| 3.4004 | 6284 | 0.1111 |
| 3.4010 | 6285 | 0.1753 |
| 3.4015 | 6286 | 0.0607 |
| 3.4021 | 6287 | 0.1146 |
| 3.4026 | 6288 | 0.052 |
| 3.4031 | 6289 | 0.0314 |
| 3.4037 | 6290 | 0.138 |
| 3.4042 | 6291 | 0.0915 |
| 3.4048 | 6292 | 0.0703 |
| 3.4053 | 6293 | 0.0531 |
| 3.4058 | 6294 | 0.0591 |
| 3.4064 | 6295 | 0.0982 |
| 3.4069 | 6296 | 0.088 |
| 3.4075 | 6297 | 0.0476 |
| 3.4080 | 6298 | 0.0869 |
| 3.4085 | 6299 | 0.1226 |
| 3.4091 | 6300 | 0.1224 |
| 3.4096 | 6301 | 0.0636 |
| 3.4102 | 6302 | 0.0936 |
| 3.4107 | 6303 | 0.0594 |
| 3.4113 | 6304 | 0.0662 |
| 3.4118 | 6305 | 0.0486 |
| 3.4123 | 6306 | 0.0694 |
| 3.4129 | 6307 | 0.1176 |
| 3.4134 | 6308 | 0.114 |
| 3.4140 | 6309 | 0.0951 |
| 3.4145 | 6310 | 0.0827 |
| 3.4150 | 6311 | 0.0981 |
| 3.4156 | 6312 | 0.0668 |
| 3.4161 | 6313 | 0.0729 |
| 3.4167 | 6314 | 0.0974 |
| 3.4172 | 6315 | 0.0644 |
| 3.4177 | 6316 | 0.0707 |
| 3.4183 | 6317 | 0.0503 |
| 3.4188 | 6318 | 0.0445 |
| 3.4194 | 6319 | 0.09 |
| 3.4199 | 6320 | 0.0934 |
| 3.4205 | 6321 | 0.0817 |
| 3.4210 | 6322 | 0.0586 |
| 3.4215 | 6323 | 0.1191 |
| 3.4221 | 6324 | 0.0608 |
| 3.4226 | 6325 | 0.0653 |
| 3.4232 | 6326 | 0.069 |
| 3.4237 | 6327 | 0.0853 |
| 3.4242 | 6328 | 0.0685 |
| 3.4248 | 6329 | 0.1418 |
| 3.4253 | 6330 | 0.0699 |
| 3.4259 | 6331 | 0.1262 |
| 3.4264 | 6332 | 0.0951 |
| 3.4269 | 6333 | 0.1355 |
| 3.4275 | 6334 | 0.1146 |
| 3.4280 | 6335 | 0.0959 |
| 3.4286 | 6336 | 0.0958 |
| 3.4291 | 6337 | 0.1354 |
| 3.4297 | 6338 | 0.0749 |
| 3.4302 | 6339 | 0.0658 |
| 3.4307 | 6340 | 0.0928 |
| 3.4313 | 6341 | 0.0669 |
| 3.4318 | 6342 | 0.0944 |
| 3.4324 | 6343 | 0.0664 |
| 3.4329 | 6344 | 0.0851 |
| 3.4334 | 6345 | 0.1421 |
| 3.4340 | 6346 | 0.0606 |
| 3.4345 | 6347 | 0.0534 |
| 3.4351 | 6348 | 0.09 |
| 3.4356 | 6349 | 0.0751 |
| 3.4361 | 6350 | 0.042 |
| 3.4367 | 6351 | 0.0665 |
| 3.4372 | 6352 | 0.069 |
| 3.4378 | 6353 | 0.0576 |
| 3.4383 | 6354 | 0.0679 |
| 3.4389 | 6355 | 0.0622 |
| 3.4394 | 6356 | 0.056 |
| 3.4399 | 6357 | 0.0778 |
| 3.4405 | 6358 | 0.1015 |
| 3.4410 | 6359 | 0.0705 |
| 3.4416 | 6360 | 0.0955 |
| 3.4421 | 6361 | 0.0733 |
| 3.4426 | 6362 | 0.104 |
| 3.4432 | 6363 | 0.0716 |
| 3.4437 | 6364 | 0.1134 |
| 3.4443 | 6365 | 0.0844 |
| 3.4448 | 6366 | 0.0667 |
| 3.4453 | 6367 | 0.103 |
| 3.4459 | 6368 | 0.0735 |
| 3.4464 | 6369 | 0.1267 |
| 3.4470 | 6370 | 0.0467 |
| 3.4475 | 6371 | 0.08 |
| 3.4481 | 6372 | 0.0802 |
| 3.4486 | 6373 | 0.0568 |
| 3.4491 | 6374 | 0.1414 |
| 3.4497 | 6375 | 0.054 |
| 3.4502 | 6376 | 0.0822 |
| 3.4508 | 6377 | 0.0807 |
| 3.4513 | 6378 | 0.075 |
| 3.4518 | 6379 | 0.0937 |
| 3.4524 | 6380 | 0.0803 |
| 3.4529 | 6381 | 0.0347 |
| 3.4535 | 6382 | 0.0631 |
| 3.4540 | 6383 | 0.0727 |
| 3.4545 | 6384 | 0.1211 |
| 3.4551 | 6385 | 0.054 |
| 3.4556 | 6386 | 0.0669 |
| 3.4562 | 6387 | 0.065 |
| 3.4567 | 6388 | 0.0822 |
| 3.4573 | 6389 | 0.0743 |
| 3.4578 | 6390 | 0.0465 |
| 3.4583 | 6391 | 0.0741 |
| 3.4589 | 6392 | 0.1008 |
| 3.4594 | 6393 | 0.1302 |
| 3.4600 | 6394 | 0.0825 |
| 3.4605 | 6395 | 0.0949 |
| 3.4610 | 6396 | 0.0952 |
| 3.4616 | 6397 | 0.093 |
| 3.4621 | 6398 | 0.0857 |
| 3.4627 | 6399 | 0.0645 |
| 3.4632 | 6400 | 0.0691 |
| 3.4637 | 6401 | 0.0648 |
| 3.4643 | 6402 | 0.0829 |
| 3.4648 | 6403 | 0.1468 |
| 3.4654 | 6404 | 0.1211 |
| 3.4659 | 6405 | 0.0638 |
| 3.4665 | 6406 | 0.0913 |
| 3.4670 | 6407 | 0.0389 |
| 3.4675 | 6408 | 0.0578 |
| 3.4681 | 6409 | 0.0934 |
| 3.4686 | 6410 | 0.1078 |
| 3.4692 | 6411 | 0.0997 |
| 3.4697 | 6412 | 0.0499 |
| 3.4702 | 6413 | 0.0977 |
| 3.4708 | 6414 | 0.0638 |
| 3.4713 | 6415 | 0.0584 |
| 3.4719 | 6416 | 0.0676 |
| 3.4724 | 6417 | 0.0692 |
| 3.4729 | 6418 | 0.0725 |
| 3.4735 | 6419 | 0.084 |
| 3.4740 | 6420 | 0.0805 |
| 3.4746 | 6421 | 0.0742 |
| 3.4751 | 6422 | 0.0768 |
| 3.4756 | 6423 | 0.1303 |
| 3.4762 | 6424 | 0.0682 |
| 3.4767 | 6425 | 0.0826 |
| 3.4773 | 6426 | 0.0705 |
| 3.4778 | 6427 | 0.0725 |
| 3.4784 | 6428 | 0.0643 |
| 3.4789 | 6429 | 0.1099 |
| 3.4794 | 6430 | 0.0812 |
| 3.4800 | 6431 | 0.1072 |
| 3.4805 | 6432 | 0.0975 |
| 3.4811 | 6433 | 0.0818 |
| 3.4816 | 6434 | 0.0876 |
| 3.4821 | 6435 | 0.0622 |
| 3.4827 | 6436 | 0.1226 |
| 3.4832 | 6437 | 0.0868 |
| 3.4838 | 6438 | 0.0748 |
| 3.4843 | 6439 | 0.1153 |
| 3.4848 | 6440 | 0.0586 |
| 3.4854 | 6441 | 0.0956 |
| 3.4859 | 6442 | 0.1063 |
| 3.4865 | 6443 | 0.0591 |
| 3.4870 | 6444 | 0.1097 |
| 3.4876 | 6445 | 0.0426 |
| 3.4881 | 6446 | 0.037 |
| 3.4886 | 6447 | 0.096 |
| 3.4892 | 6448 | 0.0938 |
| 3.4897 | 6449 | 0.0655 |
| 3.4903 | 6450 | 0.0537 |
| 3.4908 | 6451 | 0.0943 |
| 3.4913 | 6452 | 0.0793 |
| 3.4919 | 6453 | 0.0801 |
| 3.4924 | 6454 | 0.0794 |
| 3.4930 | 6455 | 0.0844 |
| 3.4935 | 6456 | 0.0774 |
| 3.4940 | 6457 | 0.112 |
| 3.4946 | 6458 | 0.1302 |
| 3.4951 | 6459 | 0.0736 |
| 3.4957 | 6460 | 0.0453 |
| 3.4962 | 6461 | 0.0767 |
| 3.4968 | 6462 | 0.0717 |
| 3.4973 | 6463 | 0.0748 |
| 3.4978 | 6464 | 0.1149 |
| 3.4984 | 6465 | 0.1226 |
| 3.4989 | 6466 | 0.0934 |
| 3.4995 | 6467 | 0.1229 |
| 3.5000 | 6468 | 0.1196 |
| 3.5005 | 6469 | 0.0762 |
| 3.5011 | 6470 | 0.0628 |
| 3.5016 | 6471 | 0.1231 |
| 3.5022 | 6472 | 0.099 |
| 3.5027 | 6473 | 0.0471 |
| 3.5032 | 6474 | 0.0716 |
| 3.5038 | 6475 | 0.0845 |
| 3.5043 | 6476 | 0.1102 |
| 3.5049 | 6477 | 0.0482 |
| 3.5054 | 6478 | 0.1205 |
| 3.5060 | 6479 | 0.1067 |
| 3.5065 | 6480 | 0.0762 |
| 3.5070 | 6481 | 0.1055 |
| 3.5076 | 6482 | 0.0917 |
| 3.5081 | 6483 | 0.0558 |
| 3.5087 | 6484 | 0.0701 |
| 3.5092 | 6485 | 0.129 |
| 3.5097 | 6486 | 0.0929 |
| 3.5103 | 6487 | 0.0926 |
| 3.5108 | 6488 | 0.1149 |
| 3.5114 | 6489 | 0.0875 |
| 3.5119 | 6490 | 0.0502 |
| 3.5124 | 6491 | 0.0558 |
| 3.5130 | 6492 | 0.0797 |
| 3.5135 | 6493 | 0.0511 |
| 3.5141 | 6494 | 0.0371 |
| 3.5146 | 6495 | 0.0542 |
| 3.5152 | 6496 | 0.0919 |
| 3.5157 | 6497 | 0.1008 |
| 3.5162 | 6498 | 0.0376 |
| 3.5168 | 6499 | 0.0796 |
| 3.5173 | 6500 | 0.0684 |
| 3.5179 | 6501 | 0.0649 |
| 3.5184 | 6502 | 0.088 |
| 3.5189 | 6503 | 0.0794 |
| 3.5195 | 6504 | 0.091 |
| 3.5200 | 6505 | 0.0819 |
| 3.5206 | 6506 | 0.0708 |
| 3.5211 | 6507 | 0.0689 |
| 3.5216 | 6508 | 0.0491 |
| 3.5222 | 6509 | 0.0623 |
| 3.5227 | 6510 | 0.1075 |
| 3.5233 | 6511 | 0.0586 |
| 3.5238 | 6512 | 0.0995 |
| 3.5244 | 6513 | 0.0575 |
| 3.5249 | 6514 | 0.0783 |
| 3.5254 | 6515 | 0.0899 |
| 3.5260 | 6516 | 0.0554 |
| 3.5265 | 6517 | 0.0774 |
| 3.5271 | 6518 | 0.0836 |
| 3.5276 | 6519 | 0.0637 |
| 3.5281 | 6520 | 0.0815 |
| 3.5287 | 6521 | 0.0807 |
| 3.5292 | 6522 | 0.0729 |
| 3.5298 | 6523 | 0.051 |
| 3.5303 | 6524 | 0.0872 |
| 3.5308 | 6525 | 0.0772 |
| 3.5314 | 6526 | 0.0757 |
| 3.5319 | 6527 | 0.0475 |
| 3.5325 | 6528 | 0.103 |
| 3.5330 | 6529 | 0.0661 |
| 3.5335 | 6530 | 0.0757 |
| 3.5341 | 6531 | 0.0797 |
| 3.5346 | 6532 | 0.0848 |
| 3.5352 | 6533 | 0.062 |
| 3.5357 | 6534 | 0.0478 |
| 3.5363 | 6535 | 0.0603 |
| 3.5368 | 6536 | 0.0787 |
| 3.5373 | 6537 | 0.1358 |
| 3.5379 | 6538 | 0.0735 |
| 3.5384 | 6539 | 0.0946 |
| 3.5390 | 6540 | 0.0747 |
| 3.5395 | 6541 | 0.0584 |
| 3.5400 | 6542 | 0.0573 |
| 3.5406 | 6543 | 0.0966 |
| 3.5411 | 6544 | 0.0524 |
| 3.5417 | 6545 | 0.1112 |
| 3.5422 | 6546 | 0.0682 |
| 3.5427 | 6547 | 0.1221 |
| 3.5433 | 6548 | 0.0685 |
| 3.5438 | 6549 | 0.0344 |
| 3.5444 | 6550 | 0.1175 |
| 3.5449 | 6551 | 0.0966 |
| 3.5455 | 6552 | 0.0758 |
| 3.5460 | 6553 | 0.0748 |
| 3.5465 | 6554 | 0.0784 |
| 3.5471 | 6555 | 0.1004 |
| 3.5476 | 6556 | 0.0895 |
| 3.5482 | 6557 | 0.0492 |
| 3.5487 | 6558 | 0.1037 |
| 3.5492 | 6559 | 0.0864 |
| 3.5498 | 6560 | 0.0984 |
| 3.5503 | 6561 | 0.0716 |
| 3.5509 | 6562 | 0.0949 |
| 3.5514 | 6563 | 0.0636 |
| 3.5519 | 6564 | 0.0794 |
| 3.5525 | 6565 | 0.0862 |
| 3.5530 | 6566 | 0.0405 |
| 3.5536 | 6567 | 0.0562 |
| 3.5541 | 6568 | 0.0831 |
| 3.5547 | 6569 | 0.0659 |
| 3.5552 | 6570 | 0.0711 |
| 3.5557 | 6571 | 0.0661 |
| 3.5563 | 6572 | 0.0409 |
| 3.5568 | 6573 | 0.0757 |
| 3.5574 | 6574 | 0.0753 |
| 3.5579 | 6575 | 0.0798 |
| 3.5584 | 6576 | 0.0779 |
| 3.5590 | 6577 | 0.1036 |
| 3.5595 | 6578 | 0.061 |
| 3.5601 | 6579 | 0.0782 |
| 3.5606 | 6580 | 0.0841 |
| 3.5611 | 6581 | 0.0854 |
| 3.5617 | 6582 | 0.1177 |
| 3.5622 | 6583 | 0.0563 |
| 3.5628 | 6584 | 0.1183 |
| 3.5633 | 6585 | 0.1007 |
| 3.5639 | 6586 | 0.0478 |
| 3.5644 | 6587 | 0.1081 |
| 3.5649 | 6588 | 0.1018 |
| 3.5655 | 6589 | 0.108 |
| 3.5660 | 6590 | 0.133 |
| 3.5666 | 6591 | 0.0466 |
| 3.5671 | 6592 | 0.0439 |
| 3.5676 | 6593 | 0.0846 |
| 3.5682 | 6594 | 0.0794 |
| 3.5687 | 6595 | 0.1177 |
| 3.5693 | 6596 | 0.0895 |
| 3.5698 | 6597 | 0.0873 |
| 3.5703 | 6598 | 0.0766 |
| 3.5709 | 6599 | 0.0585 |
| 3.5714 | 6600 | 0.0942 |
| 3.5720 | 6601 | 0.0875 |
| 3.5725 | 6602 | 0.0978 |
| 3.5731 | 6603 | 0.1368 |
| 3.5736 | 6604 | 0.0874 |
| 3.5741 | 6605 | 0.0534 |
| 3.5747 | 6606 | 0.0579 |
| 3.5752 | 6607 | 0.056 |
| 3.5758 | 6608 | 0.0444 |
| 3.5763 | 6609 | 0.0451 |
| 3.5768 | 6610 | 0.0953 |
| 3.5774 | 6611 | 0.0454 |
| 3.5779 | 6612 | 0.0869 |
| 3.5785 | 6613 | 0.0835 |
| 3.5790 | 6614 | 0.1057 |
| 3.5795 | 6615 | 0.1015 |
| 3.5801 | 6616 | 0.0972 |
| 3.5806 | 6617 | 0.0574 |
| 3.5812 | 6618 | 0.0685 |
| 3.5817 | 6619 | 0.1008 |
| 3.5823 | 6620 | 0.0928 |
| 3.5828 | 6621 | 0.0751 |
| 3.5833 | 6622 | 0.1076 |
| 3.5839 | 6623 | 0.0543 |
| 3.5844 | 6624 | 0.0587 |
| 3.5850 | 6625 | 0.0579 |
| 3.5855 | 6626 | 0.0832 |
| 3.5860 | 6627 | 0.0632 |
| 3.5866 | 6628 | 0.0943 |
| 3.5871 | 6629 | 0.091 |
| 3.5877 | 6630 | 0.045 |
| 3.5882 | 6631 | 0.0666 |
| 3.5887 | 6632 | 0.0778 |
| 3.5893 | 6633 | 0.0726 |
| 3.5898 | 6634 | 0.086 |
| 3.5904 | 6635 | 0.0688 |
| 3.5909 | 6636 | 0.0781 |
| 3.5915 | 6637 | 0.0622 |
| 3.5920 | 6638 | 0.0432 |
| 3.5925 | 6639 | 0.0743 |
| 3.5931 | 6640 | 0.0661 |
| 3.5936 | 6641 | 0.0488 |
| 3.5942 | 6642 | 0.1242 |
| 3.5947 | 6643 | 0.0797 |
| 3.5952 | 6644 | 0.0767 |
| 3.5958 | 6645 | 0.0807 |
| 3.5963 | 6646 | 0.0913 |
| 3.5969 | 6647 | 0.0707 |
| 3.5974 | 6648 | 0.0956 |
| 3.5979 | 6649 | 0.0914 |
| 3.5985 | 6650 | 0.0908 |
| 3.5990 | 6651 | 0.0618 |
| 3.5996 | 6652 | 0.0619 |
| 3.6001 | 6653 | 0.0872 |
| 3.6006 | 6654 | 0.042 |
| 3.6012 | 6655 | 0.0565 |
| 3.6017 | 6656 | 0.1089 |
| 3.6023 | 6657 | 0.1007 |
| 3.6028 | 6658 | 0.0777 |
| 3.6034 | 6659 | 0.0865 |
| 3.6039 | 6660 | 0.0562 |
| 3.6044 | 6661 | 0.071 |
| 3.6050 | 6662 | 0.0921 |
| 3.6055 | 6663 | 0.0559 |
| 3.6061 | 6664 | 0.0595 |
| 3.6066 | 6665 | 0.0634 |
| 3.6071 | 6666 | 0.0587 |
| 3.6077 | 6667 | 0.0836 |
| 3.6082 | 6668 | 0.0441 |
| 3.6088 | 6669 | 0.0831 |
| 3.6093 | 6670 | 0.0905 |
| 3.6098 | 6671 | 0.11 |
| 3.6104 | 6672 | 0.0591 |
| 3.6109 | 6673 | 0.0468 |
| 3.6115 | 6674 | 0.0433 |
| 3.6120 | 6675 | 0.103 |
| 3.6126 | 6676 | 0.0945 |
| 3.6131 | 6677 | 0.0522 |
| 3.6136 | 6678 | 0.0542 |
| 3.6142 | 6679 | 0.0526 |
| 3.6147 | 6680 | 0.117 |
| 3.6153 | 6681 | 0.0581 |
| 3.6158 | 6682 | 0.0568 |
| 3.6163 | 6683 | 0.0546 |
| 3.6169 | 6684 | 0.0692 |
| 3.6174 | 6685 | 0.0917 |
| 3.6180 | 6686 | 0.0414 |
| 3.6185 | 6687 | 0.1031 |
| 3.6190 | 6688 | 0.0923 |
| 3.6196 | 6689 | 0.0947 |
| 3.6201 | 6690 | 0.1116 |
| 3.6207 | 6691 | 0.0563 |
| 3.6212 | 6692 | 0.0932 |
| 3.6218 | 6693 | 0.0804 |
| 3.6223 | 6694 | 0.0807 |
| 3.6228 | 6695 | 0.1089 |
| 3.6234 | 6696 | 0.0767 |
| 3.6239 | 6697 | 0.1199 |
| 3.6245 | 6698 | 0.0818 |
| 3.6250 | 6699 | 0.0821 |
| 3.6255 | 6700 | 0.0772 |
| 3.6261 | 6701 | 0.0689 |
| 3.6266 | 6702 | 0.0866 |
| 3.6272 | 6703 | 0.1355 |
| 3.6277 | 6704 | 0.0857 |
| 3.6282 | 6705 | 0.0611 |
| 3.6288 | 6706 | 0.1196 |
| 3.6293 | 6707 | 0.0573 |
| 3.6299 | 6708 | 0.1027 |
| 3.6304 | 6709 | 0.0828 |
| 3.6310 | 6710 | 0.0835 |
| 3.6315 | 6711 | 0.1233 |
| 3.6320 | 6712 | 0.0757 |
| 3.6326 | 6713 | 0.0826 |
| 3.6331 | 6714 | 0.0528 |
| 3.6337 | 6715 | 0.1428 |
| 3.6342 | 6716 | 0.0861 |
| 3.6347 | 6717 | 0.0513 |
| 3.6353 | 6718 | 0.1151 |
| 3.6358 | 6719 | 0.0859 |
| 3.6364 | 6720 | 0.0829 |
| 3.6369 | 6721 | 0.0777 |
| 3.6374 | 6722 | 0.0796 |
| 3.6380 | 6723 | 0.1588 |
| 3.6385 | 6724 | 0.0796 |
| 3.6391 | 6725 | 0.1127 |
| 3.6396 | 6726 | 0.085 |
| 3.6402 | 6727 | 0.1019 |
| 3.6407 | 6728 | 0.0418 |
| 3.6412 | 6729 | 0.0784 |
| 3.6418 | 6730 | 0.0586 |
| 3.6423 | 6731 | 0.1109 |
| 3.6429 | 6732 | 0.1014 |
| 3.6434 | 6733 | 0.0775 |
| 3.6439 | 6734 | 0.0385 |
| 3.6445 | 6735 | 0.0514 |
| 3.6450 | 6736 | 0.0963 |
| 3.6456 | 6737 | 0.0604 |
| 3.6461 | 6738 | 0.1048 |
| 3.6466 | 6739 | 0.1063 |
| 3.6472 | 6740 | 0.0495 |
| 3.6477 | 6741 | 0.0599 |
| 3.6483 | 6742 | 0.0857 |
| 3.6488 | 6743 | 0.0538 |
| 3.6494 | 6744 | 0.0685 |
| 3.6499 | 6745 | 0.0412 |
| 3.6504 | 6746 | 0.0878 |
| 3.6510 | 6747 | 0.0377 |
| 3.6515 | 6748 | 0.067 |
| 3.6521 | 6749 | 0.054 |
| 3.6526 | 6750 | 0.0699 |
| 3.6531 | 6751 | 0.0887 |
| 3.6537 | 6752 | 0.0984 |
| 3.6542 | 6753 | 0.0528 |
| 3.6548 | 6754 | 0.0908 |
| 3.6553 | 6755 | 0.0458 |
| 3.6558 | 6756 | 0.0928 |
| 3.6564 | 6757 | 0.0633 |
| 3.6569 | 6758 | 0.0916 |
| 3.6575 | 6759 | 0.0601 |
| 3.6580 | 6760 | 0.0585 |
| 3.6585 | 6761 | 0.1386 |
| 3.6591 | 6762 | 0.0573 |
| 3.6596 | 6763 | 0.1154 |
| 3.6602 | 6764 | 0.1217 |
| 3.6607 | 6765 | 0.0813 |
| 3.6613 | 6766 | 0.0793 |
| 3.6618 | 6767 | 0.06 |
| 3.6623 | 6768 | 0.0574 |
| 3.6629 | 6769 | 0.0614 |
| 3.6634 | 6770 | 0.1372 |
| 3.6640 | 6771 | 0.1197 |
| 3.6645 | 6772 | 0.0757 |
| 3.6650 | 6773 | 0.0809 |
| 3.6656 | 6774 | 0.0752 |
| 3.6661 | 6775 | 0.0854 |
| 3.6667 | 6776 | 0.0478 |
| 3.6672 | 6777 | 0.0494 |
| 3.6677 | 6778 | 0.1226 |
| 3.6683 | 6779 | 0.1519 |
| 3.6688 | 6780 | 0.0903 |
| 3.6694 | 6781 | 0.0391 |
| 3.6699 | 6782 | 0.061 |
| 3.6705 | 6783 | 0.0574 |
| 3.6710 | 6784 | 0.1153 |
| 3.6715 | 6785 | 0.0992 |
| 3.6721 | 6786 | 0.1009 |
| 3.6726 | 6787 | 0.084 |
| 3.6732 | 6788 | 0.0965 |
| 3.6737 | 6789 | 0.0668 |
| 3.6742 | 6790 | 0.0878 |
| 3.6748 | 6791 | 0.1087 |
| 3.6753 | 6792 | 0.0796 |
| 3.6759 | 6793 | 0.0585 |
| 3.6764 | 6794 | 0.0613 |
| 3.6769 | 6795 | 0.0678 |
| 3.6775 | 6796 | 0.0738 |
| 3.6780 | 6797 | 0.0503 |
| 3.6786 | 6798 | 0.1108 |
| 3.6791 | 6799 | 0.0672 |
| 3.6797 | 6800 | 0.0453 |
| 3.6802 | 6801 | 0.0774 |
| 3.6807 | 6802 | 0.0615 |
| 3.6813 | 6803 | 0.0363 |
| 3.6818 | 6804 | 0.0623 |
| 3.6824 | 6805 | 0.1375 |
| 3.6829 | 6806 | 0.0444 |
| 3.6834 | 6807 | 0.042 |
| 3.6840 | 6808 | 0.1015 |
| 3.6845 | 6809 | 0.0699 |
| 3.6851 | 6810 | 0.0633 |
| 3.6856 | 6811 | 0.0865 |
| 3.6861 | 6812 | 0.0567 |
| 3.6867 | 6813 | 0.0447 |
| 3.6872 | 6814 | 0.0752 |
| 3.6878 | 6815 | 0.1205 |
| 3.6883 | 6816 | 0.0935 |
| 3.6889 | 6817 | 0.0898 |
| 3.6894 | 6818 | 0.1117 |
| 3.6899 | 6819 | 0.1084 |
| 3.6905 | 6820 | 0.0976 |
| 3.6910 | 6821 | 0.1205 |
| 3.6916 | 6822 | 0.0676 |
| 3.6921 | 6823 | 0.0679 |
| 3.6926 | 6824 | 0.0875 |
| 3.6932 | 6825 | 0.064 |
| 3.6937 | 6826 | 0.1152 |
| 3.6943 | 6827 | 0.0671 |
| 3.6948 | 6828 | 0.0623 |
| 3.6953 | 6829 | 0.0921 |
| 3.6959 | 6830 | 0.0478 |
| 3.6964 | 6831 | 0.1391 |
| 3.6970 | 6832 | 0.0405 |
| 3.6975 | 6833 | 0.0296 |
| 3.6981 | 6834 | 0.091 |
| 3.6986 | 6835 | 0.0743 |
| 3.6991 | 6836 | 0.0942 |
| 3.6997 | 6837 | 0.0866 |
| 3.7002 | 6838 | 0.0773 |
| 3.7008 | 6839 | 0.1096 |
| 3.7013 | 6840 | 0.0503 |
| 3.7018 | 6841 | 0.0931 |
| 3.7024 | 6842 | 0.1113 |
| 3.7029 | 6843 | 0.0799 |
| 3.7035 | 6844 | 0.0524 |
| 3.7040 | 6845 | 0.0725 |
| 3.7045 | 6846 | 0.0784 |
| 3.7051 | 6847 | 0.0576 |
| 3.7056 | 6848 | 0.0776 |
| 3.7062 | 6849 | 0.0453 |
| 3.7067 | 6850 | 0.0562 |
| 3.7073 | 6851 | 0.0766 |
| 3.7078 | 6852 | 0.097 |
| 3.7083 | 6853 | 0.0702 |
| 3.7089 | 6854 | 0.087 |
| 3.7094 | 6855 | 0.1386 |
| 3.7100 | 6856 | 0.0715 |
| 3.7105 | 6857 | 0.1048 |
| 3.7110 | 6858 | 0.0924 |
| 3.7116 | 6859 | 0.1138 |
| 3.7121 | 6860 | 0.0849 |
| 3.7127 | 6861 | 0.0945 |
| 3.7132 | 6862 | 0.0439 |
| 3.7137 | 6863 | 0.0823 |
| 3.7143 | 6864 | 0.0629 |
| 3.7148 | 6865 | 0.1201 |
| 3.7154 | 6866 | 0.0527 |
| 3.7159 | 6867 | 0.082 |
| 3.7165 | 6868 | 0.1205 |
| 3.7170 | 6869 | 0.0587 |
| 3.7175 | 6870 | 0.0547 |
| 3.7181 | 6871 | 0.1173 |
| 3.7186 | 6872 | 0.0732 |
| 3.7192 | 6873 | 0.0846 |
| 3.7197 | 6874 | 0.1062 |
| 3.7202 | 6875 | 0.0596 |
| 3.7208 | 6876 | 0.0915 |
| 3.7213 | 6877 | 0.0508 |
| 3.7219 | 6878 | 0.1006 |
| 3.7224 | 6879 | 0.0497 |
| 3.7229 | 6880 | 0.0404 |
| 3.7235 | 6881 | 0.1061 |
| 3.7240 | 6882 | 0.0603 |
| 3.7246 | 6883 | 0.0706 |
| 3.7251 | 6884 | 0.0646 |
| 3.7256 | 6885 | 0.0778 |
| 3.7262 | 6886 | 0.1232 |
| 3.7267 | 6887 | 0.0953 |
| 3.7273 | 6888 | 0.0913 |
| 3.7278 | 6889 | 0.112 |
| 3.7284 | 6890 | 0.1389 |
| 3.7289 | 6891 | 0.0829 |
| 3.7294 | 6892 | 0.0758 |
| 3.7300 | 6893 | 0.0704 |
| 3.7305 | 6894 | 0.0478 |
| 3.7311 | 6895 | 0.0725 |
| 3.7316 | 6896 | 0.0576 |
| 3.7321 | 6897 | 0.079 |
| 3.7327 | 6898 | 0.0692 |
| 3.7332 | 6899 | 0.0525 |
| 3.7338 | 6900 | 0.1433 |
| 3.7343 | 6901 | 0.0688 |
| 3.7348 | 6902 | 0.0526 |
| 3.7354 | 6903 | 0.1126 |
| 3.7359 | 6904 | 0.078 |
| 3.7365 | 6905 | 0.0993 |
| 3.7370 | 6906 | 0.066 |
| 3.7376 | 6907 | 0.055 |
| 3.7381 | 6908 | 0.0617 |
| 3.7386 | 6909 | 0.1177 |
| 3.7392 | 6910 | 0.1043 |
| 3.7397 | 6911 | 0.0492 |
| 3.7403 | 6912 | 0.0925 |
| 3.7408 | 6913 | 0.0713 |
| 3.7413 | 6914 | 0.0904 |
| 3.7419 | 6915 | 0.1371 |
| 3.7424 | 6916 | 0.071 |
| 3.7430 | 6917 | 0.0673 |
| 3.7435 | 6918 | 0.0767 |
| 3.7440 | 6919 | 0.0817 |
| 3.7446 | 6920 | 0.0746 |
| 3.7451 | 6921 | 0.1003 |
| 3.7457 | 6922 | 0.0471 |
| 3.7462 | 6923 | 0.1016 |
| 3.7468 | 6924 | 0.0674 |
| 3.7473 | 6925 | 0.1127 |
| 3.7478 | 6926 | 0.1037 |
| 3.7484 | 6927 | 0.1104 |
| 3.7489 | 6928 | 0.1036 |
| 3.7495 | 6929 | 0.0432 |
| 3.7500 | 6930 | 0.0702 |
| 3.7505 | 6931 | 0.0586 |
| 3.7511 | 6932 | 0.0681 |
| 3.7516 | 6933 | 0.1362 |
| 3.7522 | 6934 | 0.0869 |
| 3.7527 | 6935 | 0.0664 |
| 3.7532 | 6936 | 0.0691 |
| 3.7538 | 6937 | 0.1069 |
| 3.7543 | 6938 | 0.0805 |
| 3.7549 | 6939 | 0.062 |
| 3.7554 | 6940 | 0.1161 |
| 3.7560 | 6941 | 0.1195 |
| 3.7565 | 6942 | 0.0726 |
| 3.7570 | 6943 | 0.0859 |
| 3.7576 | 6944 | 0.0642 |
| 3.7581 | 6945 | 0.0544 |
| 3.7587 | 6946 | 0.0823 |
| 3.7592 | 6947 | 0.1038 |
| 3.7597 | 6948 | 0.0808 |
| 3.7603 | 6949 | 0.1158 |
| 3.7608 | 6950 | 0.1553 |
| 3.7614 | 6951 | 0.0697 |
| 3.7619 | 6952 | 0.0552 |
| 3.7624 | 6953 | 0.0834 |
| 3.7630 | 6954 | 0.0684 |
| 3.7635 | 6955 | 0.1239 |
| 3.7641 | 6956 | 0.0725 |
| 3.7646 | 6957 | 0.0546 |
| 3.7652 | 6958 | 0.045 |
| 3.7657 | 6959 | 0.0442 |
| 3.7662 | 6960 | 0.1138 |
| 3.7668 | 6961 | 0.0587 |
| 3.7673 | 6962 | 0.0908 |
| 3.7679 | 6963 | 0.0901 |
| 3.7684 | 6964 | 0.1262 |
| 3.7689 | 6965 | 0.0433 |
| 3.7695 | 6966 | 0.1006 |
| 3.7700 | 6967 | 0.0958 |
| 3.7706 | 6968 | 0.0833 |
| 3.7711 | 6969 | 0.1019 |
| 3.7716 | 6970 | 0.0753 |
| 3.7722 | 6971 | 0.1387 |
| 3.7727 | 6972 | 0.1131 |
| 3.7733 | 6973 | 0.1168 |
| 3.7738 | 6974 | 0.0893 |
| 3.7744 | 6975 | 0.0805 |
| 3.7749 | 6976 | 0.0852 |
| 3.7754 | 6977 | 0.1001 |
| 3.7760 | 6978 | 0.0773 |
| 3.7765 | 6979 | 0.1036 |
| 3.7771 | 6980 | 0.0398 |
| 3.7776 | 6981 | 0.1099 |
| 3.7781 | 6982 | 0.1068 |
| 3.7787 | 6983 | 0.1014 |
| 3.7792 | 6984 | 0.0367 |
| 3.7798 | 6985 | 0.1289 |
| 3.7803 | 6986 | 0.142 |
| 3.7808 | 6987 | 0.0663 |
| 3.7814 | 6988 | 0.0804 |
| 3.7819 | 6989 | 0.0873 |
| 3.7825 | 6990 | 0.1094 |
| 3.7830 | 6991 | 0.0943 |
| 3.7835 | 6992 | 0.1411 |
| 3.7841 | 6993 | 0.0637 |
| 3.7846 | 6994 | 0.0833 |
| 3.7852 | 6995 | 0.1142 |
| 3.7857 | 6996 | 0.1247 |
| 3.7863 | 6997 | 0.0528 |
| 3.7868 | 6998 | 0.053 |
| 3.7873 | 6999 | 0.122 |
| 3.7879 | 7000 | 0.0807 |
| 3.7884 | 7001 | 0.0805 |
| 3.7890 | 7002 | 0.0731 |
| 3.7895 | 7003 | 0.0604 |
| 3.7900 | 7004 | 0.0708 |
| 3.7906 | 7005 | 0.1088 |
| 3.7911 | 7006 | 0.0788 |
| 3.7917 | 7007 | 0.0592 |
| 3.7922 | 7008 | 0.0949 |
| 3.7927 | 7009 | 0.0971 |
| 3.7933 | 7010 | 0.1494 |
| 3.7938 | 7011 | 0.0812 |
| 3.7944 | 7012 | 0.0781 |
| 3.7949 | 7013 | 0.0701 |
| 3.7955 | 7014 | 0.0222 |
| 3.7960 | 7015 | 0.0663 |
| 3.7965 | 7016 | 0.0493 |
| 3.7971 | 7017 | 0.0679 |
| 3.7976 | 7018 | 0.1134 |
| 3.7982 | 7019 | 0.0684 |
| 3.7987 | 7020 | 0.0689 |
| 3.7992 | 7021 | 0.1281 |
| 3.7998 | 7022 | 0.0652 |
| 3.8003 | 7023 | 0.0432 |
| 3.8009 | 7024 | 0.0598 |
| 3.8014 | 7025 | 0.0676 |
| 3.8019 | 7026 | 0.0699 |
| 3.8025 | 7027 | 0.0903 |
| 3.8030 | 7028 | 0.0564 |
| 3.8036 | 7029 | 0.0686 |
| 3.8041 | 7030 | 0.0691 |
| 3.8047 | 7031 | 0.0698 |
| 3.8052 | 7032 | 0.0572 |
| 3.8057 | 7033 | 0.0538 |
| 3.8063 | 7034 | 0.0529 |
| 3.8068 | 7035 | 0.087 |
| 3.8074 | 7036 | 0.0977 |
| 3.8079 | 7037 | 0.0438 |
| 3.8084 | 7038 | 0.1055 |
| 3.8090 | 7039 | 0.0938 |
| 3.8095 | 7040 | 0.054 |
| 3.8101 | 7041 | 0.0837 |
| 3.8106 | 7042 | 0.0776 |
| 3.8111 | 7043 | 0.0959 |
| 3.8117 | 7044 | 0.125 |
| 3.8122 | 7045 | 0.0757 |
| 3.8128 | 7046 | 0.0639 |
| 3.8133 | 7047 | 0.0836 |
| 3.8139 | 7048 | 0.0695 |
| 3.8144 | 7049 | 0.0514 |
| 3.8149 | 7050 | 0.0768 |
| 3.8155 | 7051 | 0.1031 |
| 3.8160 | 7052 | 0.0739 |
| 3.8166 | 7053 | 0.0649 |
| 3.8171 | 7054 | 0.0866 |
| 3.8176 | 7055 | 0.0958 |
| 3.8182 | 7056 | 0.0735 |
| 3.8187 | 7057 | 0.0877 |
| 3.8193 | 7058 | 0.1499 |
| 3.8198 | 7059 | 0.0446 |
| 3.8203 | 7060 | 0.0955 |
| 3.8209 | 7061 | 0.1489 |
| 3.8214 | 7062 | 0.0651 |
| 3.8220 | 7063 | 0.0896 |
| 3.8225 | 7064 | 0.0794 |
| 3.8231 | 7065 | 0.0981 |
| 3.8236 | 7066 | 0.0788 |
| 3.8241 | 7067 | 0.093 |
| 3.8247 | 7068 | 0.1141 |
| 3.8252 | 7069 | 0.0688 |
| 3.8258 | 7070 | 0.1023 |
| 3.8263 | 7071 | 0.105 |
| 3.8268 | 7072 | 0.0939 |
| 3.8274 | 7073 | 0.0782 |
| 3.8279 | 7074 | 0.0983 |
| 3.8285 | 7075 | 0.0496 |
| 3.8290 | 7076 | 0.0586 |
| 3.8295 | 7077 | 0.112 |
| 3.8301 | 7078 | 0.0844 |
| 3.8306 | 7079 | 0.0483 |
| 3.8312 | 7080 | 0.0703 |
| 3.8317 | 7081 | 0.12 |
| 3.8323 | 7082 | 0.091 |
| 3.8328 | 7083 | 0.054 |
| 3.8333 | 7084 | 0.091 |
| 3.8339 | 7085 | 0.0828 |
| 3.8344 | 7086 | 0.0709 |
| 3.8350 | 7087 | 0.0291 |
| 3.8355 | 7088 | 0.0927 |
| 3.8360 | 7089 | 0.0776 |
| 3.8366 | 7090 | 0.0668 |
| 3.8371 | 7091 | 0.073 |
| 3.8377 | 7092 | 0.1182 |
| 3.8382 | 7093 | 0.0888 |
| 3.8387 | 7094 | 0.0968 |
| 3.8393 | 7095 | 0.1082 |
| 3.8398 | 7096 | 0.0823 |
| 3.8404 | 7097 | 0.0744 |
| 3.8409 | 7098 | 0.0639 |
| 3.8415 | 7099 | 0.0479 |
| 3.8420 | 7100 | 0.0709 |
| 3.8425 | 7101 | 0.047 |
| 3.8431 | 7102 | 0.0797 |
| 3.8436 | 7103 | 0.1295 |
| 3.8442 | 7104 | 0.1142 |
| 3.8447 | 7105 | 0.0735 |
| 3.8452 | 7106 | 0.0528 |
| 3.8458 | 7107 | 0.0865 |
| 3.8463 | 7108 | 0.0514 |
| 3.8469 | 7109 | 0.0612 |
| 3.8474 | 7110 | 0.0859 |
| 3.8479 | 7111 | 0.1079 |
| 3.8485 | 7112 | 0.0763 |
| 3.8490 | 7113 | 0.0923 |
| 3.8496 | 7114 | 0.0493 |
| 3.8501 | 7115 | 0.0944 |
| 3.8506 | 7116 | 0.0543 |
| 3.8512 | 7117 | 0.0878 |
| 3.8517 | 7118 | 0.0945 |
| 3.8523 | 7119 | 0.0496 |
| 3.8528 | 7120 | 0.1474 |
| 3.8534 | 7121 | 0.1175 |
| 3.8539 | 7122 | 0.0752 |
| 3.8544 | 7123 | 0.0697 |
| 3.8550 | 7124 | 0.0461 |
| 3.8555 | 7125 | 0.0569 |
| 3.8561 | 7126 | 0.0921 |
| 3.8566 | 7127 | 0.0842 |
| 3.8571 | 7128 | 0.0655 |
| 3.8577 | 7129 | 0.1065 |
| 3.8582 | 7130 | 0.1381 |
| 3.8588 | 7131 | 0.0568 |
| 3.8593 | 7132 | 0.0654 |
| 3.8598 | 7133 | 0.0667 |
| 3.8604 | 7134 | 0.0658 |
| 3.8609 | 7135 | 0.0746 |
| 3.8615 | 7136 | 0.1176 |
| 3.8620 | 7137 | 0.0769 |
| 3.8626 | 7138 | 0.1355 |
| 3.8631 | 7139 | 0.0484 |
| 3.8636 | 7140 | 0.1565 |
| 3.8642 | 7141 | 0.0548 |
| 3.8647 | 7142 | 0.0729 |
| 3.8653 | 7143 | 0.0927 |
| 3.8658 | 7144 | 0.0633 |
| 3.8663 | 7145 | 0.0614 |
| 3.8669 | 7146 | 0.0824 |
| 3.8674 | 7147 | 0.0917 |
| 3.8680 | 7148 | 0.0754 |
| 3.8685 | 7149 | 0.0855 |
| 3.8690 | 7150 | 0.0886 |
| 3.8696 | 7151 | 0.053 |
| 3.8701 | 7152 | 0.0773 |
| 3.8707 | 7153 | 0.0692 |
| 3.8712 | 7154 | 0.0551 |
| 3.8718 | 7155 | 0.0916 |
| 3.8723 | 7156 | 0.0691 |
| 3.8728 | 7157 | 0.1079 |
| 3.8734 | 7158 | 0.0974 |
| 3.8739 | 7159 | 0.1332 |
| 3.8745 | 7160 | 0.075 |
| 3.8750 | 7161 | 0.0732 |
| 3.8755 | 7162 | 0.0933 |
| 3.8761 | 7163 | 0.0821 |
| 3.8766 | 7164 | 0.0494 |
| 3.8772 | 7165 | 0.0836 |
| 3.8777 | 7166 | 0.1249 |
| 3.8782 | 7167 | 0.0589 |
| 3.8788 | 7168 | 0.1348 |
| 3.8793 | 7169 | 0.0727 |
| 3.8799 | 7170 | 0.0668 |
| 3.8804 | 7171 | 0.0778 |
| 3.8810 | 7172 | 0.0646 |
| 3.8815 | 7173 | 0.095 |
| 3.8820 | 7174 | 0.0471 |
| 3.8826 | 7175 | 0.1364 |
| 3.8831 | 7176 | 0.1098 |
| 3.8837 | 7177 | 0.0536 |
| 3.8842 | 7178 | 0.0655 |
| 3.8847 | 7179 | 0.0664 |
| 3.8853 | 7180 | 0.0786 |
| 3.8858 | 7181 | 0.0742 |
| 3.8864 | 7182 | 0.1047 |
| 3.8869 | 7183 | 0.1576 |
| 3.8874 | 7184 | 0.0942 |
| 3.8880 | 7185 | 0.0801 |
| 3.8885 | 7186 | 0.0672 |
| 3.8891 | 7187 | 0.0822 |
| 3.8896 | 7188 | 0.0637 |
| 3.8902 | 7189 | 0.0609 |
| 3.8907 | 7190 | 0.0902 |
| 3.8912 | 7191 | 0.083 |
| 3.8918 | 7192 | 0.0741 |
| 3.8923 | 7193 | 0.0566 |
| 3.8929 | 7194 | 0.079 |
| 3.8934 | 7195 | 0.0651 |
| 3.8939 | 7196 | 0.0818 |
| 3.8945 | 7197 | 0.0903 |
| 3.8950 | 7198 | 0.0557 |
| 3.8956 | 7199 | 0.1063 |
| 3.8961 | 7200 | 0.1045 |
| 3.8966 | 7201 | 0.1198 |
| 3.8972 | 7202 | 0.0809 |
| 3.8977 | 7203 | 0.0612 |
| 3.8983 | 7204 | 0.0759 |
| 3.8988 | 7205 | 0.0919 |
| 3.8994 | 7206 | 0.0478 |
| 3.8999 | 7207 | 0.0735 |
| 3.9004 | 7208 | 0.0906 |
| 3.9010 | 7209 | 0.0984 |
| 3.9015 | 7210 | 0.2369 |
| 3.9021 | 7211 | 0.0706 |
| 3.9026 | 7212 | 0.0894 |
| 3.9031 | 7213 | 0.0595 |
| 3.9037 | 7214 | 0.1209 |
| 3.9042 | 7215 | 0.0553 |
| 3.9048 | 7216 | 0.0468 |
| 3.9053 | 7217 | 0.0595 |
| 3.9058 | 7218 | 0.0584 |
| 3.9064 | 7219 | 0.0729 |
| 3.9069 | 7220 | 0.0632 |
| 3.9075 | 7221 | 0.0924 |
| 3.9080 | 7222 | 0.0539 |
| 3.9085 | 7223 | 0.0903 |
| 3.9091 | 7224 | 0.0746 |
| 3.9096 | 7225 | 0.1437 |
| 3.9102 | 7226 | 0.0474 |
| 3.9107 | 7227 | 0.0569 |
| 3.9113 | 7228 | 0.1028 |
| 3.9118 | 7229 | 0.0808 |
| 3.9123 | 7230 | 0.0912 |
| 3.9129 | 7231 | 0.051 |
| 3.9134 | 7232 | 0.0493 |
| 3.9140 | 7233 | 0.083 |
| 3.9145 | 7234 | 0.0628 |
| 3.9150 | 7235 | 0.084 |
| 3.9156 | 7236 | 0.1243 |
| 3.9161 | 7237 | 0.0987 |
| 3.9167 | 7238 | 0.0673 |
| 3.9172 | 7239 | 0.0767 |
| 3.9177 | 7240 | 0.0944 |
| 3.9183 | 7241 | 0.1025 |
| 3.9188 | 7242 | 0.09 |
| 3.9194 | 7243 | 0.0911 |
| 3.9199 | 7244 | 0.0485 |
| 3.9205 | 7245 | 0.1163 |
| 3.9210 | 7246 | 0.0895 |
| 3.9215 | 7247 | 0.0846 |
| 3.9221 | 7248 | 0.0516 |
| 3.9226 | 7249 | 0.0733 |
| 3.9232 | 7250 | 0.0853 |
| 3.9237 | 7251 | 0.1629 |
| 3.9242 | 7252 | 0.0723 |
| 3.9248 | 7253 | 0.0399 |
| 3.9253 | 7254 | 0.056 |
| 3.9259 | 7255 | 0.1223 |
| 3.9264 | 7256 | 0.0571 |
| 3.9269 | 7257 | 0.0934 |
| 3.9275 | 7258 | 0.0635 |
| 3.9280 | 7259 | 0.088 |
| 3.9286 | 7260 | 0.1038 |
| 3.9291 | 7261 | 0.0647 |
| 3.9297 | 7262 | 0.0567 |
| 3.9302 | 7263 | 0.109 |
| 3.9307 | 7264 | 0.065 |
| 3.9313 | 7265 | 0.1029 |
| 3.9318 | 7266 | 0.0755 |
| 3.9324 | 7267 | 0.1027 |
| 3.9329 | 7268 | 0.0922 |
| 3.9334 | 7269 | 0.0789 |
| 3.9340 | 7270 | 0.0621 |
| 3.9345 | 7271 | 0.0912 |
| 3.9351 | 7272 | 0.0796 |
| 3.9356 | 7273 | 0.1012 |
| 3.9361 | 7274 | 0.0679 |
| 3.9367 | 7275 | 0.0795 |
| 3.9372 | 7276 | 0.1389 |
| 3.9378 | 7277 | 0.0375 |
| 3.9383 | 7278 | 0.0544 |
| 3.9389 | 7279 | 0.0721 |
| 3.9394 | 7280 | 0.1036 |
| 3.9399 | 7281 | 0.0576 |
| 3.9405 | 7282 | 0.0756 |
| 3.9410 | 7283 | 0.068 |
| 3.9416 | 7284 | 0.046 |
| 3.9421 | 7285 | 0.049 |
| 3.9426 | 7286 | 0.0963 |
| 3.9432 | 7287 | 0.0495 |
| 3.9437 | 7288 | 0.0825 |
| 3.9443 | 7289 | 0.1202 |
| 3.9448 | 7290 | 0.0792 |
| 3.9453 | 7291 | 0.068 |
| 3.9459 | 7292 | 0.0884 |
| 3.9464 | 7293 | 0.1017 |
| 3.9470 | 7294 | 0.0643 |
| 3.9475 | 7295 | 0.0674 |
| 3.9481 | 7296 | 0.0997 |
| 3.9486 | 7297 | 0.077 |
| 3.9491 | 7298 | 0.0981 |
| 3.9497 | 7299 | 0.087 |
| 3.9502 | 7300 | 0.0801 |
| 3.9508 | 7301 | 0.1066 |
| 3.9513 | 7302 | 0.0954 |
| 3.9518 | 7303 | 0.0787 |
| 3.9524 | 7304 | 0.0875 |
| 3.9529 | 7305 | 0.0879 |
| 3.9535 | 7306 | 0.0929 |
| 3.9540 | 7307 | 0.1194 |
| 3.9545 | 7308 | 0.0541 |
| 3.9551 | 7309 | 0.1337 |
| 3.9556 | 7310 | 0.1221 |
| 3.9562 | 7311 | 0.1045 |
| 3.9567 | 7312 | 0.0462 |
| 3.9573 | 7313 | 0.0705 |
| 3.9578 | 7314 | 0.0972 |
| 3.9583 | 7315 | 0.0634 |
| 3.9589 | 7316 | 0.0435 |
| 3.9594 | 7317 | 0.0857 |
| 3.9600 | 7318 | 0.0943 |
| 3.9605 | 7319 | 0.0745 |
| 3.9610 | 7320 | 0.0476 |
| 3.9616 | 7321 | 0.0767 |
| 3.9621 | 7322 | 0.0677 |
| 3.9627 | 7323 | 0.0716 |
| 3.9632 | 7324 | 0.0988 |
| 3.9637 | 7325 | 0.0628 |
| 3.9643 | 7326 | 0.0952 |
| 3.9648 | 7327 | 0.053 |
| 3.9654 | 7328 | 0.0646 |
| 3.9659 | 7329 | 0.0985 |
| 3.9665 | 7330 | 0.0743 |
| 3.9670 | 7331 | 0.0575 |
| 3.9675 | 7332 | 0.0758 |
| 3.9681 | 7333 | 0.0646 |
| 3.9686 | 7334 | 0.0627 |
| 3.9692 | 7335 | 0.0792 |
| 3.9697 | 7336 | 0.0849 |
| 3.9702 | 7337 | 0.05 |
| 3.9708 | 7338 | 0.0699 |
| 3.9713 | 7339 | 0.1229 |
| 3.9719 | 7340 | 0.1182 |
| 3.9724 | 7341 | 0.0715 |
| 3.9729 | 7342 | 0.0729 |
| 3.9735 | 7343 | 0.175 |
| 3.9740 | 7344 | 0.0532 |
| 3.9746 | 7345 | 0.0738 |
| 3.9751 | 7346 | 0.0356 |
| 3.9756 | 7347 | 0.0515 |
| 3.9762 | 7348 | 0.071 |
| 3.9767 | 7349 | 0.039 |
| 3.9773 | 7350 | 0.1067 |
| 3.9778 | 7351 | 0.0671 |
| 3.9784 | 7352 | 0.0427 |
| 3.9789 | 7353 | 0.0638 |
| 3.9794 | 7354 | 0.0636 |
| 3.9800 | 7355 | 0.0582 |
| 3.9805 | 7356 | 0.0913 |
| 3.9811 | 7357 | 0.0564 |
| 3.9816 | 7358 | 0.1167 |
| 3.9821 | 7359 | 0.1246 |
| 3.9827 | 7360 | 0.0779 |
| 3.9832 | 7361 | 0.0889 |
| 3.9838 | 7362 | 0.0911 |
| 3.9843 | 7363 | 0.0903 |
| 3.9848 | 7364 | 0.0716 |
| 3.9854 | 7365 | 0.0554 |
| 3.9859 | 7366 | 0.0503 |
| 3.9865 | 7367 | 0.1016 |
| 3.9870 | 7368 | 0.0779 |
| 3.9876 | 7369 | 0.0539 |
| 3.9881 | 7370 | 0.0759 |
| 3.9886 | 7371 | 0.1104 |
| 3.9892 | 7372 | 0.048 |
| 3.9897 | 7373 | 0.0625 |
| 3.9903 | 7374 | 0.0787 |
| 3.9908 | 7375 | 0.0569 |
| 3.9913 | 7376 | 0.0452 |
| 3.9919 | 7377 | 0.0769 |
| 3.9924 | 7378 | 0.0608 |
| 3.9930 | 7379 | 0.0835 |
| 3.9935 | 7380 | 0.082 |
| 3.9940 | 7381 | 0.0758 |
| 3.9946 | 7382 | 0.1428 |
| 3.9951 | 7383 | 0.0454 |
| 3.9957 | 7384 | 0.1177 |
| 3.9962 | 7385 | 0.0945 |
| 3.9968 | 7386 | 0.0713 |
| 3.9973 | 7387 | 0.0661 |
| 3.9978 | 7388 | 0.1428 |
| 3.9984 | 7389 | 0.0698 |
| 3.9989 | 7390 | 0.0699 |
| 3.9995 | 7391 | 0.0872 |
| 4.0000 | 7392 | 0.0639 |
| 4.0005 | 7393 | 0.0746 |
| 4.0011 | 7394 | 0.1108 |
| 4.0016 | 7395 | 0.0481 |
| 4.0022 | 7396 | 0.0765 |
| 4.0027 | 7397 | 0.0626 |
| 4.0032 | 7398 | 0.0586 |
| 4.0038 | 7399 | 0.0624 |
| 4.0043 | 7400 | 0.0724 |
| 4.0049 | 7401 | 0.0954 |
| 4.0054 | 7402 | 0.0564 |
| 4.0060 | 7403 | 0.0924 |
| 4.0065 | 7404 | 0.061 |
| 4.0070 | 7405 | 0.0609 |
| 4.0076 | 7406 | 0.1159 |
| 4.0081 | 7407 | 0.0569 |
| 4.0087 | 7408 | 0.1043 |
| 4.0092 | 7409 | 0.0499 |
| 4.0097 | 7410 | 0.1053 |
| 4.0103 | 7411 | 0.0657 |
| 4.0108 | 7412 | 0.0412 |
| 4.0114 | 7413 | 0.1188 |
| 4.0119 | 7414 | 0.0542 |
| 4.0124 | 7415 | 0.0593 |
| 4.0130 | 7416 | 0.0575 |
| 4.0135 | 7417 | 0.1239 |
| 4.0141 | 7418 | 0.068 |
| 4.0146 | 7419 | 0.0559 |
| 4.0152 | 7420 | 0.0544 |
| 4.0157 | 7421 | 0.0669 |
| 4.0162 | 7422 | 0.0953 |
| 4.0168 | 7423 | 0.0483 |
| 4.0173 | 7424 | 0.0682 |
| 4.0179 | 7425 | 0.1349 |
| 4.0184 | 7426 | 0.0704 |
| 4.0189 | 7427 | 0.0881 |
| 4.0195 | 7428 | 0.1084 |
| 4.0200 | 7429 | 0.0918 |
| 4.0206 | 7430 | 0.0959 |
| 4.0211 | 7431 | 0.0365 |
| 4.0216 | 7432 | 0.0661 |
| 4.0222 | 7433 | 0.1038 |
| 4.0227 | 7434 | 0.106 |
| 4.0233 | 7435 | 0.0785 |
| 4.0238 | 7436 | 0.0949 |
| 4.0244 | 7437 | 0.1014 |
| 4.0249 | 7438 | 0.0791 |
| 4.0254 | 7439 | 0.1182 |
| 4.0260 | 7440 | 0.0546 |
| 4.0265 | 7441 | 0.0714 |
| 4.0271 | 7442 | 0.0384 |
| 4.0276 | 7443 | 0.0715 |
| 4.0281 | 7444 | 0.0734 |
| 4.0287 | 7445 | 0.0941 |
| 4.0292 | 7446 | 0.0869 |
| 4.0298 | 7447 | 0.1253 |
| 4.0303 | 7448 | 0.0652 |
| 4.0308 | 7449 | 0.1159 |
| 4.0314 | 7450 | 0.0414 |
| 4.0319 | 7451 | 0.1104 |
| 4.0325 | 7452 | 0.0516 |
| 4.0330 | 7453 | 0.0651 |
| 4.0335 | 7454 | 0.0769 |
| 4.0341 | 7455 | 0.1233 |
| 4.0346 | 7456 | 0.1108 |
| 4.0352 | 7457 | 0.0877 |
| 4.0357 | 7458 | 0.0807 |
| 4.0363 | 7459 | 0.0954 |
| 4.0368 | 7460 | 0.0805 |
| 4.0373 | 7461 | 0.053 |
| 4.0379 | 7462 | 0.0687 |
| 4.0384 | 7463 | 0.0524 |
| 4.0390 | 7464 | 0.0991 |
| 4.0395 | 7465 | 0.07 |
| 4.0400 | 7466 | 0.0797 |
| 4.0406 | 7467 | 0.1175 |
| 4.0411 | 7468 | 0.0594 |
| 4.0417 | 7469 | 0.0446 |
| 4.0422 | 7470 | 0.0598 |
| 4.0427 | 7471 | 0.0683 |
| 4.0433 | 7472 | 0.102 |
| 4.0438 | 7473 | 0.0762 |
| 4.0444 | 7474 | 0.075 |
| 4.0449 | 7475 | 0.0691 |
| 4.0455 | 7476 | 0.0734 |
| 4.0460 | 7477 | 0.045 |
| 4.0465 | 7478 | 0.0852 |
| 4.0471 | 7479 | 0.0995 |
| 4.0476 | 7480 | 0.0899 |
| 4.0482 | 7481 | 0.0449 |
| 4.0487 | 7482 | 0.1151 |
| 4.0492 | 7483 | 0.1026 |
| 4.0498 | 7484 | 0.0717 |
| 4.0503 | 7485 | 0.0887 |
| 4.0509 | 7486 | 0.1253 |
| 4.0514 | 7487 | 0.0781 |
| 4.0519 | 7488 | 0.0602 |
| 4.0525 | 7489 | 0.0543 |
| 4.0530 | 7490 | 0.0983 |
| 4.0536 | 7491 | 0.0965 |
| 4.0541 | 7492 | 0.0754 |
| 4.0547 | 7493 | 0.1161 |
| 4.0552 | 7494 | 0.0678 |
| 4.0557 | 7495 | 0.0723 |
| 4.0563 | 7496 | 0.035 |
| 4.0568 | 7497 | 0.0526 |
| 4.0574 | 7498 | 0.0698 |
| 4.0579 | 7499 | 0.0968 |
| 4.0584 | 7500 | 0.0791 |
| 4.0590 | 7501 | 0.1375 |
| 4.0595 | 7502 | 0.1274 |
| 4.0601 | 7503 | 0.1071 |
| 4.0606 | 7504 | 0.069 |
| 4.0611 | 7505 | 0.0672 |
| 4.0617 | 7506 | 0.0826 |
| 4.0622 | 7507 | 0.078 |
| 4.0628 | 7508 | 0.067 |
| 4.0633 | 7509 | 0.0631 |
| 4.0639 | 7510 | 0.1207 |
| 4.0644 | 7511 | 0.0499 |
| 4.0649 | 7512 | 0.0632 |
| 4.0655 | 7513 | 0.0766 |
| 4.0660 | 7514 | 0.1157 |
| 4.0666 | 7515 | 0.1096 |
| 4.0671 | 7516 | 0.0716 |
| 4.0676 | 7517 | 0.0619 |
| 4.0682 | 7518 | 0.0811 |
| 4.0687 | 7519 | 0.1015 |
| 4.0693 | 7520 | 0.0669 |
| 4.0698 | 7521 | 0.0472 |
| 4.0703 | 7522 | 0.1118 |
| 4.0709 | 7523 | 0.0917 |
| 4.0714 | 7524 | 0.0985 |
| 4.0720 | 7525 | 0.0903 |
| 4.0725 | 7526 | 0.1005 |
| 4.0731 | 7527 | 0.0714 |
| 4.0736 | 7528 | 0.0745 |
| 4.0741 | 7529 | 0.0361 |
| 4.0747 | 7530 | 0.0891 |
| 4.0752 | 7531 | 0.0544 |
| 4.0758 | 7532 | 0.0862 |
| 4.0763 | 7533 | 0.0593 |
| 4.0768 | 7534 | 0.0667 |
| 4.0774 | 7535 | 0.0688 |
| 4.0779 | 7536 | 0.0789 |
| 4.0785 | 7537 | 0.0783 |
| 4.0790 | 7538 | 0.1176 |
| 4.0795 | 7539 | 0.0969 |
| 4.0801 | 7540 | 0.064 |
| 4.0806 | 7541 | 0.0619 |
| 4.0812 | 7542 | 0.0423 |
| 4.0817 | 7543 | 0.0917 |
| 4.0823 | 7544 | 0.0843 |
| 4.0828 | 7545 | 0.0765 |
| 4.0833 | 7546 | 0.0967 |
| 4.0839 | 7547 | 0.0563 |
| 4.0844 | 7548 | 0.076 |
| 4.0850 | 7549 | 0.045 |
| 4.0855 | 7550 | 0.1013 |
| 4.0860 | 7551 | 0.0768 |
| 4.0866 | 7552 | 0.0452 |
| 4.0871 | 7553 | 0.0816 |
| 4.0877 | 7554 | 0.0407 |
| 4.0882 | 7555 | 0.0466 |
| 4.0887 | 7556 | 0.0635 |
| 4.0893 | 7557 | 0.1238 |
| 4.0898 | 7558 | 0.0639 |
| 4.0904 | 7559 | 0.0966 |
| 4.0909 | 7560 | 0.0419 |
| 4.0915 | 7561 | 0.0536 |
| 4.0920 | 7562 | 0.0438 |
| 4.0925 | 7563 | 0.0929 |
| 4.0931 | 7564 | 0.0724 |
| 4.0936 | 7565 | 0.0646 |
| 4.0942 | 7566 | 0.0547 |
| 4.0947 | 7567 | 0.1256 |
| 4.0952 | 7568 | 0.0851 |
| 4.0958 | 7569 | 0.0859 |
| 4.0963 | 7570 | 0.0451 |
| 4.0969 | 7571 | 0.1197 |
| 4.0974 | 7572 | 0.0558 |
| 4.0979 | 7573 | 0.0845 |
| 4.0985 | 7574 | 0.0428 |
| 4.0990 | 7575 | 0.0608 |
| 4.0996 | 7576 | 0.0551 |
| 4.1001 | 7577 | 0.0553 |
| 4.1006 | 7578 | 0.0758 |
| 4.1012 | 7579 | 0.0877 |
| 4.1017 | 7580 | 0.0374 |
| 4.1023 | 7581 | 0.0495 |
| 4.1028 | 7582 | 0.0701 |
| 4.1034 | 7583 | 0.0339 |
| 4.1039 | 7584 | 0.1018 |
| 4.1044 | 7585 | 0.0941 |
| 4.1050 | 7586 | 0.0714 |
| 4.1055 | 7587 | 0.0674 |
| 4.1061 | 7588 | 0.0613 |
| 4.1066 | 7589 | 0.035 |
| 4.1071 | 7590 | 0.0616 |
| 4.1077 | 7591 | 0.0607 |
| 4.1082 | 7592 | 0.0793 |
| 4.1088 | 7593 | 0.1062 |
| 4.1093 | 7594 | 0.0735 |
| 4.1098 | 7595 | 0.0867 |
| 4.1104 | 7596 | 0.0654 |
| 4.1109 | 7597 | 0.0976 |
| 4.1115 | 7598 | 0.0318 |
| 4.1120 | 7599 | 0.0774 |
| 4.1126 | 7600 | 0.0552 |
| 4.1131 | 7601 | 0.048 |
| 4.1136 | 7602 | 0.046 |
| 4.1142 | 7603 | 0.1257 |
| 4.1147 | 7604 | 0.0607 |
| 4.1153 | 7605 | 0.0711 |
| 4.1158 | 7606 | 0.0582 |
| 4.1163 | 7607 | 0.0748 |
| 4.1169 | 7608 | 0.0827 |
| 4.1174 | 7609 | 0.0993 |
| 4.1180 | 7610 | 0.099 |
| 4.1185 | 7611 | 0.0972 |
| 4.1190 | 7612 | 0.0792 |
| 4.1196 | 7613 | 0.0788 |
| 4.1201 | 7614 | 0.0672 |
| 4.1207 | 7615 | 0.0576 |
| 4.1212 | 7616 | 0.1061 |
| 4.1218 | 7617 | 0.0775 |
| 4.1223 | 7618 | 0.0614 |
| 4.1228 | 7619 | 0.0973 |
| 4.1234 | 7620 | 0.0429 |
| 4.1239 | 7621 | 0.0783 |
| 4.1245 | 7622 | 0.0451 |
| 4.1250 | 7623 | 0.0588 |
| 4.1255 | 7624 | 0.1116 |
| 4.1261 | 7625 | 0.0624 |
| 4.1266 | 7626 | 0.1041 |
| 4.1272 | 7627 | 0.0929 |
| 4.1277 | 7628 | 0.0801 |
| 4.1282 | 7629 | 0.0495 |
| 4.1288 | 7630 | 0.0457 |
| 4.1293 | 7631 | 0.0483 |
| 4.1299 | 7632 | 0.0744 |
| 4.1304 | 7633 | 0.1217 |
| 4.1310 | 7634 | 0.0499 |
| 4.1315 | 7635 | 0.1065 |
| 4.1320 | 7636 | 0.0472 |
| 4.1326 | 7637 | 0.1085 |
| 4.1331 | 7638 | 0.0523 |
| 4.1337 | 7639 | 0.0725 |
| 4.1342 | 7640 | 0.0431 |
| 4.1347 | 7641 | 0.1109 |
| 4.1353 | 7642 | 0.0862 |
| 4.1358 | 7643 | 0.0895 |
| 4.1364 | 7644 | 0.0489 |
| 4.1369 | 7645 | 0.0721 |
| 4.1374 | 7646 | 0.0937 |
| 4.1380 | 7647 | 0.0761 |
| 4.1385 | 7648 | 0.0639 |
| 4.1391 | 7649 | 0.082 |
| 4.1396 | 7650 | 0.0702 |
| 4.1402 | 7651 | 0.0868 |
| 4.1407 | 7652 | 0.0648 |
| 4.1412 | 7653 | 0.1021 |
| 4.1418 | 7654 | 0.0612 |
| 4.1423 | 7655 | 0.056 |
| 4.1429 | 7656 | 0.0782 |
| 4.1434 | 7657 | 0.0838 |
| 4.1439 | 7658 | 0.074 |
| 4.1445 | 7659 | 0.0971 |
| 4.1450 | 7660 | 0.0838 |
| 4.1456 | 7661 | 0.0401 |
| 4.1461 | 7662 | 0.0383 |
| 4.1466 | 7663 | 0.0884 |
| 4.1472 | 7664 | 0.0663 |
| 4.1477 | 7665 | 0.0388 |
| 4.1483 | 7666 | 0.0727 |
| 4.1488 | 7667 | 0.053 |
| 4.1494 | 7668 | 0.0795 |
| 4.1499 | 7669 | 0.0905 |
| 4.1504 | 7670 | 0.0539 |
| 4.1510 | 7671 | 0.0507 |
| 4.1515 | 7672 | 0.0946 |
| 4.1521 | 7673 | 0.0559 |
| 4.1526 | 7674 | 0.092 |
| 4.1531 | 7675 | 0.0776 |
| 4.1537 | 7676 | 0.0679 |
| 4.1542 | 7677 | 0.0786 |
| 4.1548 | 7678 | 0.0535 |
| 4.1553 | 7679 | 0.0728 |
| 4.1558 | 7680 | 0.0946 |
| 4.1564 | 7681 | 0.0811 |
| 4.1569 | 7682 | 0.0686 |
| 4.1575 | 7683 | 0.1014 |
| 4.1580 | 7684 | 0.0886 |
| 4.1585 | 7685 | 0.0723 |
| 4.1591 | 7686 | 0.0912 |
| 4.1596 | 7687 | 0.0777 |
| 4.1602 | 7688 | 0.0972 |
| 4.1607 | 7689 | 0.0763 |
| 4.1613 | 7690 | 0.0454 |
| 4.1618 | 7691 | 0.072 |
| 4.1623 | 7692 | 0.0971 |
| 4.1629 | 7693 | 0.0308 |
| 4.1634 | 7694 | 0.0891 |
| 4.1640 | 7695 | 0.053 |
| 4.1645 | 7696 | 0.1062 |
| 4.1650 | 7697 | 0.0948 |
| 4.1656 | 7698 | 0.1132 |
| 4.1661 | 7699 | 0.063 |
| 4.1667 | 7700 | 0.0534 |
| 4.1672 | 7701 | 0.0572 |
| 4.1677 | 7702 | 0.0478 |
| 4.1683 | 7703 | 0.0677 |
| 4.1688 | 7704 | 0.0619 |
| 4.1694 | 7705 | 0.0477 |
| 4.1699 | 7706 | 0.0759 |
| 4.1705 | 7707 | 0.1307 |
| 4.1710 | 7708 | 0.0461 |
| 4.1715 | 7709 | 0.0656 |
| 4.1721 | 7710 | 0.1093 |
| 4.1726 | 7711 | 0.0833 |
| 4.1732 | 7712 | 0.05 |
| 4.1737 | 7713 | 0.0454 |
| 4.1742 | 7714 | 0.0464 |
| 4.1748 | 7715 | 0.0531 |
| 4.1753 | 7716 | 0.0672 |
| 4.1759 | 7717 | 0.055 |
| 4.1764 | 7718 | 0.0614 |
| 4.1769 | 7719 | 0.1182 |
| 4.1775 | 7720 | 0.0704 |
| 4.1780 | 7721 | 0.0884 |
| 4.1786 | 7722 | 0.0535 |
| 4.1791 | 7723 | 0.0804 |
| 4.1797 | 7724 | 0.0634 |
| 4.1802 | 7725 | 0.0769 |
| 4.1807 | 7726 | 0.079 |
| 4.1813 | 7727 | 0.0388 |
| 4.1818 | 7728 | 0.0825 |
| 4.1824 | 7729 | 0.1098 |
| 4.1829 | 7730 | 0.0662 |
| 4.1834 | 7731 | 0.0717 |
| 4.1840 | 7732 | 0.0755 |
| 4.1845 | 7733 | 0.0768 |
| 4.1851 | 7734 | 0.0445 |
| 4.1856 | 7735 | 0.0525 |
| 4.1861 | 7736 | 0.0493 |
| 4.1867 | 7737 | 0.0929 |
| 4.1872 | 7738 | 0.0521 |
| 4.1878 | 7739 | 0.0577 |
| 4.1883 | 7740 | 0.043 |
| 4.1889 | 7741 | 0.0889 |
| 4.1894 | 7742 | 0.1007 |
| 4.1899 | 7743 | 0.0847 |
| 4.1905 | 7744 | 0.0791 |
| 4.1910 | 7745 | 0.1236 |
| 4.1916 | 7746 | 0.0887 |
| 4.1921 | 7747 | 0.0713 |
| 4.1926 | 7748 | 0.0958 |
| 4.1932 | 7749 | 0.1021 |
| 4.1937 | 7750 | 0.1478 |
| 4.1943 | 7751 | 0.1027 |
| 4.1948 | 7752 | 0.063 |
| 4.1953 | 7753 | 0.0825 |
| 4.1959 | 7754 | 0.0686 |
| 4.1964 | 7755 | 0.0632 |
| 4.1970 | 7756 | 0.0408 |
| 4.1975 | 7757 | 0.0645 |
| 4.1981 | 7758 | 0.065 |
| 4.1986 | 7759 | 0.0804 |
| 4.1991 | 7760 | 0.0663 |
| 4.1997 | 7761 | 0.0707 |
| 4.2002 | 7762 | 0.0712 |
| 4.2008 | 7763 | 0.0651 |
| 4.2013 | 7764 | 0.0557 |
| 4.2018 | 7765 | 0.0759 |
| 4.2024 | 7766 | 0.0896 |
| 4.2029 | 7767 | 0.0305 |
| 4.2035 | 7768 | 0.0993 |
| 4.2040 | 7769 | 0.0513 |
| 4.2045 | 7770 | 0.0441 |
| 4.2051 | 7771 | 0.0674 |
| 4.2056 | 7772 | 0.0396 |
| 4.2062 | 7773 | 0.083 |
| 4.2067 | 7774 | 0.0677 |
| 4.2073 | 7775 | 0.0938 |
| 4.2078 | 7776 | 0.0554 |
| 4.2083 | 7777 | 0.0671 |
| 4.2089 | 7778 | 0.0663 |
| 4.2094 | 7779 | 0.0953 |
| 4.2100 | 7780 | 0.0973 |
| 4.2105 | 7781 | 0.0493 |
| 4.2110 | 7782 | 0.0781 |
| 4.2116 | 7783 | 0.0372 |
| 4.2121 | 7784 | 0.0488 |
| 4.2127 | 7785 | 0.0666 |
| 4.2132 | 7786 | 0.0986 |
| 4.2137 | 7787 | 0.0899 |
| 4.2143 | 7788 | 0.0589 |
| 4.2148 | 7789 | 0.079 |
| 4.2154 | 7790 | 0.0773 |
| 4.2159 | 7791 | 0.0917 |
| 4.2165 | 7792 | 0.0434 |
| 4.2170 | 7793 | 0.1174 |
| 4.2175 | 7794 | 0.1221 |
| 4.2181 | 7795 | 0.0642 |
| 4.2186 | 7796 | 0.0809 |
| 4.2192 | 7797 | 0.0952 |
| 4.2197 | 7798 | 0.0604 |
| 4.2202 | 7799 | 0.1296 |
| 4.2208 | 7800 | 0.0753 |
| 4.2213 | 7801 | 0.068 |
| 4.2219 | 7802 | 0.1224 |
| 4.2224 | 7803 | 0.0399 |
| 4.2229 | 7804 | 0.0379 |
| 4.2235 | 7805 | 0.1396 |
| 4.2240 | 7806 | 0.1408 |
| 4.2246 | 7807 | 0.0906 |
| 4.2251 | 7808 | 0.1542 |
| 4.2256 | 7809 | 0.0817 |
| 4.2262 | 7810 | 0.0576 |
| 4.2267 | 7811 | 0.0605 |
| 4.2273 | 7812 | 0.0791 |
| 4.2278 | 7813 | 0.0844 |
| 4.2284 | 7814 | 0.1083 |
| 4.2289 | 7815 | 0.0499 |
| 4.2294 | 7816 | 0.0635 |
| 4.2300 | 7817 | 0.0537 |
| 4.2305 | 7818 | 0.0819 |
| 4.2311 | 7819 | 0.1109 |
| 4.2316 | 7820 | 0.0764 |
| 4.2321 | 7821 | 0.0777 |
| 4.2327 | 7822 | 0.0388 |
| 4.2332 | 7823 | 0.0902 |
| 4.2338 | 7824 | 0.0676 |
| 4.2343 | 7825 | 0.0998 |
| 4.2348 | 7826 | 0.0736 |
| 4.2354 | 7827 | 0.0565 |
| 4.2359 | 7828 | 0.0966 |
| 4.2365 | 7829 | 0.081 |
| 4.2370 | 7830 | 0.0723 |
| 4.2376 | 7831 | 0.0708 |
| 4.2381 | 7832 | 0.0642 |
| 4.2386 | 7833 | 0.0647 |
| 4.2392 | 7834 | 0.0687 |
| 4.2397 | 7835 | 0.1049 |
| 4.2403 | 7836 | 0.0948 |
| 4.2408 | 7837 | 0.0592 |
| 4.2413 | 7838 | 0.0723 |
| 4.2419 | 7839 | 0.0648 |
| 4.2424 | 7840 | 0.0758 |
| 4.2430 | 7841 | 0.0633 |
| 4.2435 | 7842 | 0.0407 |
| 4.2440 | 7843 | 0.0326 |
| 4.2446 | 7844 | 0.0557 |
| 4.2451 | 7845 | 0.0601 |
| 4.2457 | 7846 | 0.0812 |
| 4.2462 | 7847 | 0.0648 |
| 4.2468 | 7848 | 0.12 |
| 4.2473 | 7849 | 0.0384 |
| 4.2478 | 7850 | 0.0767 |
| 4.2484 | 7851 | 0.0771 |
| 4.2489 | 7852 | 0.097 |
| 4.2495 | 7853 | 0.0884 |
| 4.25 | 7854 | 0.0504 |
| 4.2505 | 7855 | 0.0529 |
| 4.2511 | 7856 | 0.0723 |
| 4.2516 | 7857 | 0.0573 |
| 4.2522 | 7858 | 0.0629 |
| 4.2527 | 7859 | 0.0508 |
| 4.2532 | 7860 | 0.0794 |
| 4.2538 | 7861 | 0.1025 |
| 4.2543 | 7862 | 0.0624 |
| 4.2549 | 7863 | 0.0626 |
| 4.2554 | 7864 | 0.0719 |
| 4.2560 | 7865 | 0.0605 |
| 4.2565 | 7866 | 0.0578 |
| 4.2570 | 7867 | 0.0512 |
| 4.2576 | 7868 | 0.0734 |
| 4.2581 | 7869 | 0.1044 |
| 4.2587 | 7870 | 0.076 |
| 4.2592 | 7871 | 0.0726 |
| 4.2597 | 7872 | 0.0633 |
| 4.2603 | 7873 | 0.1205 |
| 4.2608 | 7874 | 0.0527 |
| 4.2614 | 7875 | 0.0658 |
| 4.2619 | 7876 | 0.069 |
| 4.2624 | 7877 | 0.146 |
| 4.2630 | 7878 | 0.0701 |
| 4.2635 | 7879 | 0.0955 |
| 4.2641 | 7880 | 0.0664 |
| 4.2646 | 7881 | 0.0567 |
| 4.2652 | 7882 | 0.0686 |
| 4.2657 | 7883 | 0.0891 |
| 4.2662 | 7884 | 0.0789 |
| 4.2668 | 7885 | 0.082 |
| 4.2673 | 7886 | 0.0528 |
| 4.2679 | 7887 | 0.0744 |
| 4.2684 | 7888 | 0.1026 |
| 4.2689 | 7889 | 0.0843 |
| 4.2695 | 7890 | 0.0856 |
| 4.2700 | 7891 | 0.1219 |
| 4.2706 | 7892 | 0.1028 |
| 4.2711 | 7893 | 0.0445 |
| 4.2716 | 7894 | 0.0937 |
| 4.2722 | 7895 | 0.0826 |
| 4.2727 | 7896 | 0.0872 |
| 4.2733 | 7897 | 0.0712 |
| 4.2738 | 7898 | 0.0726 |
| 4.2744 | 7899 | 0.0896 |
| 4.2749 | 7900 | 0.0644 |
| 4.2754 | 7901 | 0.0837 |
| 4.2760 | 7902 | 0.0832 |
| 4.2765 | 7903 | 0.1306 |
| 4.2771 | 7904 | 0.1042 |
| 4.2776 | 7905 | 0.0538 |
| 4.2781 | 7906 | 0.0966 |
| 4.2787 | 7907 | 0.0954 |
| 4.2792 | 7908 | 0.068 |
| 4.2798 | 7909 | 0.1037 |
| 4.2803 | 7910 | 0.0958 |
| 4.2808 | 7911 | 0.0368 |
| 4.2814 | 7912 | 0.066 |
| 4.2819 | 7913 | 0.0936 |
| 4.2825 | 7914 | 0.1301 |
| 4.2830 | 7915 | 0.0545 |
| 4.2835 | 7916 | 0.0847 |
| 4.2841 | 7917 | 0.1375 |
| 4.2846 | 7918 | 0.0651 |
| 4.2852 | 7919 | 0.0571 |
| 4.2857 | 7920 | 0.0601 |
| 4.2863 | 7921 | 0.0538 |
| 4.2868 | 7922 | 0.0668 |
| 4.2873 | 7923 | 0.0981 |
| 4.2879 | 7924 | 0.0822 |
| 4.2884 | 7925 | 0.0578 |
| 4.2890 | 7926 | 0.1042 |
| 4.2895 | 7927 | 0.0542 |
| 4.2900 | 7928 | 0.0651 |
| 4.2906 | 7929 | 0.0821 |
| 4.2911 | 7930 | 0.065 |
| 4.2917 | 7931 | 0.0533 |
| 4.2922 | 7932 | 0.1259 |
| 4.2927 | 7933 | 0.0579 |
| 4.2933 | 7934 | 0.0758 |
| 4.2938 | 7935 | 0.0896 |
| 4.2944 | 7936 | 0.0579 |
| 4.2949 | 7937 | 0.089 |
| 4.2955 | 7938 | 0.0563 |
| 4.2960 | 7939 | 0.0867 |
| 4.2965 | 7940 | 0.0724 |
| 4.2971 | 7941 | 0.0737 |
| 4.2976 | 7942 | 0.1419 |
| 4.2982 | 7943 | 0.0449 |
| 4.2987 | 7944 | 0.0894 |
| 4.2992 | 7945 | 0.0567 |
| 4.2998 | 7946 | 0.107 |
| 4.3003 | 7947 | 0.0583 |
| 4.3009 | 7948 | 0.0767 |
| 4.3014 | 7949 | 0.0426 |
| 4.3019 | 7950 | 0.0481 |
| 4.3025 | 7951 | 0.0725 |
| 4.3030 | 7952 | 0.0897 |
| 4.3036 | 7953 | 0.0781 |
| 4.3041 | 7954 | 0.0329 |
| 4.3047 | 7955 | 0.084 |
| 4.3052 | 7956 | 0.0788 |
| 4.3057 | 7957 | 0.0413 |
| 4.3063 | 7958 | 0.0793 |
| 4.3068 | 7959 | 0.0883 |
| 4.3074 | 7960 | 0.0666 |
| 4.3079 | 7961 | 0.1018 |
| 4.3084 | 7962 | 0.1015 |
| 4.3090 | 7963 | 0.0602 |
| 4.3095 | 7964 | 0.0477 |
| 4.3101 | 7965 | 0.089 |
| 4.3106 | 7966 | 0.0678 |
| 4.3111 | 7967 | 0.0688 |
| 4.3117 | 7968 | 0.1215 |
| 4.3122 | 7969 | 0.1007 |
| 4.3128 | 7970 | 0.0649 |
| 4.3133 | 7971 | 0.0991 |
| 4.3139 | 7972 | 0.0844 |
| 4.3144 | 7973 | 0.0577 |
| 4.3149 | 7974 | 0.072 |
| 4.3155 | 7975 | 0.0754 |
| 4.3160 | 7976 | 0.1073 |
| 4.3166 | 7977 | 0.0698 |
| 4.3171 | 7978 | 0.042 |
| 4.3176 | 7979 | 0.1141 |
| 4.3182 | 7980 | 0.0711 |
| 4.3187 | 7981 | 0.0559 |
| 4.3193 | 7982 | 0.1385 |
| 4.3198 | 7983 | 0.0638 |
| 4.3203 | 7984 | 0.0505 |
| 4.3209 | 7985 | 0.0734 |
| 4.3214 | 7986 | 0.0571 |
| 4.3220 | 7987 | 0.0317 |
| 4.3225 | 7988 | 0.0649 |
| 4.3231 | 7989 | 0.0822 |
| 4.3236 | 7990 | 0.0864 |
| 4.3241 | 7991 | 0.0655 |
| 4.3247 | 7992 | 0.0884 |
| 4.3252 | 7993 | 0.064 |
| 4.3258 | 7994 | 0.0593 |
| 4.3263 | 7995 | 0.0734 |
| 4.3268 | 7996 | 0.1009 |
| 4.3274 | 7997 | 0.0526 |
| 4.3279 | 7998 | 0.0407 |
| 4.3285 | 7999 | 0.0927 |
| 4.3290 | 8000 | 0.053 |
| 4.3295 | 8001 | 0.0489 |
| 4.3301 | 8002 | 0.0588 |
| 4.3306 | 8003 | 0.0824 |
| 4.3312 | 8004 | 0.1043 |
| 4.3317 | 8005 | 0.1151 |
| 4.3323 | 8006 | 0.0616 |
| 4.3328 | 8007 | 0.0851 |
| 4.3333 | 8008 | 0.0845 |
| 4.3339 | 8009 | 0.0676 |
| 4.3344 | 8010 | 0.0543 |
| 4.3350 | 8011 | 0.0682 |
| 4.3355 | 8012 | 0.0752 |
| 4.3360 | 8013 | 0.0604 |
| 4.3366 | 8014 | 0.0761 |
| 4.3371 | 8015 | 0.0732 |
| 4.3377 | 8016 | 0.061 |
| 4.3382 | 8017 | 0.1022 |
| 4.3387 | 8018 | 0.0478 |
| 4.3393 | 8019 | 0.0893 |
| 4.3398 | 8020 | 0.05 |
| 4.3404 | 8021 | 0.1214 |
| 4.3409 | 8022 | 0.068 |
| 4.3415 | 8023 | 0.0747 |
| 4.3420 | 8024 | 0.0839 |
| 4.3425 | 8025 | 0.0693 |
| 4.3431 | 8026 | 0.1164 |
| 4.3436 | 8027 | 0.0799 |
| 4.3442 | 8028 | 0.0646 |
| 4.3447 | 8029 | 0.0965 |
| 4.3452 | 8030 | 0.0667 |
| 4.3458 | 8031 | 0.1063 |
| 4.3463 | 8032 | 0.06 |
| 4.3469 | 8033 | 0.0772 |
| 4.3474 | 8034 | 0.0773 |
| 4.3479 | 8035 | 0.0982 |
| 4.3485 | 8036 | 0.0618 |
| 4.3490 | 8037 | 0.0559 |
| 4.3496 | 8038 | 0.0539 |
| 4.3501 | 8039 | 0.1364 |
| 4.3506 | 8040 | 0.0602 |
| 4.3512 | 8041 | 0.0606 |
| 4.3517 | 8042 | 0.1024 |
| 4.3523 | 8043 | 0.0855 |
| 4.3528 | 8044 | 0.0751 |
| 4.3534 | 8045 | 0.0697 |
| 4.3539 | 8046 | 0.1508 |
| 4.3544 | 8047 | 0.0997 |
| 4.3550 | 8048 | 0.1422 |
| 4.3555 | 8049 | 0.0675 |
| 4.3561 | 8050 | 0.0608 |
| 4.3566 | 8051 | 0.1042 |
| 4.3571 | 8052 | 0.0812 |
| 4.3577 | 8053 | 0.107 |
| 4.3582 | 8054 | 0.0875 |
| 4.3588 | 8055 | 0.0671 |
| 4.3593 | 8056 | 0.0337 |
| 4.3598 | 8057 | 0.0402 |
| 4.3604 | 8058 | 0.0853 |
| 4.3609 | 8059 | 0.0494 |
| 4.3615 | 8060 | 0.1343 |
| 4.3620 | 8061 | 0.0447 |
| 4.3626 | 8062 | 0.0851 |
| 4.3631 | 8063 | 0.0788 |
| 4.3636 | 8064 | 0.0828 |
| 4.3642 | 8065 | 0.0841 |
| 4.3647 | 8066 | 0.0377 |
| 4.3653 | 8067 | 0.0628 |
| 4.3658 | 8068 | 0.0764 |
| 4.3663 | 8069 | 0.0468 |
| 4.3669 | 8070 | 0.0591 |
| 4.3674 | 8071 | 0.0604 |
| 4.3680 | 8072 | 0.0421 |
| 4.3685 | 8073 | 0.0817 |
| 4.3690 | 8074 | 0.045 |
| 4.3696 | 8075 | 0.087 |
| 4.3701 | 8076 | 0.04 |
| 4.3707 | 8077 | 0.0664 |
| 4.3712 | 8078 | 0.0614 |
| 4.3718 | 8079 | 0.103 |
| 4.3723 | 8080 | 0.0692 |
| 4.3728 | 8081 | 0.0871 |
| 4.3734 | 8082 | 0.1021 |
| 4.3739 | 8083 | 0.0645 |
| 4.3745 | 8084 | 0.0648 |
| 4.375 | 8085 | 0.1302 |
| 4.3755 | 8086 | 0.0773 |
| 4.3761 | 8087 | 0.0563 |
| 4.3766 | 8088 | 0.0853 |
| 4.3772 | 8089 | 0.0764 |
| 4.3777 | 8090 | 0.0634 |
| 4.3782 | 8091 | 0.0697 |
| 4.3788 | 8092 | 0.0679 |
| 4.3793 | 8093 | 0.0813 |
| 4.3799 | 8094 | 0.0834 |
| 4.3804 | 8095 | 0.0632 |
| 4.3810 | 8096 | 0.0626 |
| 4.3815 | 8097 | 0.0756 |
| 4.3820 | 8098 | 0.0685 |
| 4.3826 | 8099 | 0.0431 |
| 4.3831 | 8100 | 0.1033 |
| 4.3837 | 8101 | 0.0568 |
| 4.3842 | 8102 | 0.0576 |
| 4.3847 | 8103 | 0.0663 |
| 4.3853 | 8104 | 0.1201 |
| 4.3858 | 8105 | 0.1011 |
| 4.3864 | 8106 | 0.1002 |
| 4.3869 | 8107 | 0.0591 |
| 4.3874 | 8108 | 0.1004 |
| 4.3880 | 8109 | 0.0497 |
| 4.3885 | 8110 | 0.0602 |
| 4.3891 | 8111 | 0.0633 |
| 4.3896 | 8112 | 0.0448 |
| 4.3902 | 8113 | 0.1183 |
| 4.3907 | 8114 | 0.0789 |
| 4.3912 | 8115 | 0.1033 |
| 4.3918 | 8116 | 0.0574 |
| 4.3923 | 8117 | 0.1167 |
| 4.3929 | 8118 | 0.0908 |
| 4.3934 | 8119 | 0.0837 |
| 4.3939 | 8120 | 0.1053 |
| 4.3945 | 8121 | 0.0538 |
| 4.3950 | 8122 | 0.1072 |
| 4.3956 | 8123 | 0.1009 |
| 4.3961 | 8124 | 0.0487 |
| 4.3966 | 8125 | 0.0872 |
| 4.3972 | 8126 | 0.0532 |
| 4.3977 | 8127 | 0.0633 |
| 4.3983 | 8128 | 0.0527 |
| 4.3988 | 8129 | 0.064 |
| 4.3994 | 8130 | 0.0412 |
| 4.3999 | 8131 | 0.09 |
| 4.4004 | 8132 | 0.1134 |
| 4.4010 | 8133 | 0.0691 |
| 4.4015 | 8134 | 0.0591 |
| 4.4021 | 8135 | 0.087 |
| 4.4026 | 8136 | 0.0793 |
| 4.4031 | 8137 | 0.113 |
| 4.4037 | 8138 | 0.0831 |
| 4.4042 | 8139 | 0.0699 |
| 4.4048 | 8140 | 0.1103 |
| 4.4053 | 8141 | 0.1057 |
| 4.4058 | 8142 | 0.0681 |
| 4.4064 | 8143 | 0.0995 |
| 4.4069 | 8144 | 0.0985 |
| 4.4075 | 8145 | 0.0716 |
| 4.4080 | 8146 | 0.0612 |
| 4.4085 | 8147 | 0.0724 |
| 4.4091 | 8148 | 0.0594 |
| 4.4096 | 8149 | 0.0496 |
| 4.4102 | 8150 | 0.0813 |
| 4.4107 | 8151 | 0.0605 |
| 4.4113 | 8152 | 0.0484 |
| 4.4118 | 8153 | 0.0968 |
| 4.4123 | 8154 | 0.0639 |
| 4.4129 | 8155 | 0.0672 |
| 4.4134 | 8156 | 0.0371 |
| 4.4140 | 8157 | 0.0791 |
| 4.4145 | 8158 | 0.0571 |
| 4.4150 | 8159 | 0.0885 |
| 4.4156 | 8160 | 0.0502 |
| 4.4161 | 8161 | 0.0797 |
| 4.4167 | 8162 | 0.0773 |
| 4.4172 | 8163 | 0.0965 |
| 4.4177 | 8164 | 0.1127 |
| 4.4183 | 8165 | 0.0562 |
| 4.4188 | 8166 | 0.0475 |
| 4.4194 | 8167 | 0.057 |
| 4.4199 | 8168 | 0.0829 |
| 4.4205 | 8169 | 0.0962 |
| 4.4210 | 8170 | 0.0579 |
| 4.4215 | 8171 | 0.0971 |
| 4.4221 | 8172 | 0.1093 |
| 4.4226 | 8173 | 0.0656 |
| 4.4232 | 8174 | 0.0501 |
| 4.4237 | 8175 | 0.0818 |
| 4.4242 | 8176 | 0.0621 |
| 4.4248 | 8177 | 0.0941 |
| 4.4253 | 8178 | 0.0718 |
| 4.4259 | 8179 | 0.0874 |
| 4.4264 | 8180 | 0.0859 |
| 4.4269 | 8181 | 0.0476 |
| 4.4275 | 8182 | 0.1069 |
| 4.4280 | 8183 | 0.0783 |
| 4.4286 | 8184 | 0.0944 |
| 4.4291 | 8185 | 0.0801 |
| 4.4297 | 8186 | 0.0771 |
| 4.4302 | 8187 | 0.0601 |
| 4.4307 | 8188 | 0.0693 |
| 4.4313 | 8189 | 0.0461 |
| 4.4318 | 8190 | 0.0828 |
| 4.4324 | 8191 | 0.0594 |
| 4.4329 | 8192 | 0.0984 |
| 4.4334 | 8193 | 0.0809 |
| 4.4340 | 8194 | 0.0836 |
| 4.4345 | 8195 | 0.0902 |
| 4.4351 | 8196 | 0.0508 |
| 4.4356 | 8197 | 0.0999 |
| 4.4361 | 8198 | 0.0556 |
| 4.4367 | 8199 | 0.0817 |
| 4.4372 | 8200 | 0.0616 |
| 4.4378 | 8201 | 0.083 |
| 4.4383 | 8202 | 0.0793 |
| 4.4389 | 8203 | 0.1216 |
| 4.4394 | 8204 | 0.0766 |
| 4.4399 | 8205 | 0.0709 |
| 4.4405 | 8206 | 0.0596 |
| 4.4410 | 8207 | 0.1032 |
| 4.4416 | 8208 | 0.0445 |
| 4.4421 | 8209 | 0.0928 |
| 4.4426 | 8210 | 0.07 |
| 4.4432 | 8211 | 0.053 |
| 4.4437 | 8212 | 0.107 |
| 4.4443 | 8213 | 0.0486 |
| 4.4448 | 8214 | 0.0799 |
| 4.4453 | 8215 | 0.0927 |
| 4.4459 | 8216 | 0.085 |
| 4.4464 | 8217 | 0.0985 |
| 4.4470 | 8218 | 0.0776 |
| 4.4475 | 8219 | 0.0545 |
| 4.4481 | 8220 | 0.0608 |
| 4.4486 | 8221 | 0.0595 |
| 4.4491 | 8222 | 0.0636 |
| 4.4497 | 8223 | 0.0378 |
| 4.4502 | 8224 | 0.0905 |
| 4.4508 | 8225 | 0.0539 |
| 4.4513 | 8226 | 0.0679 |
| 4.4518 | 8227 | 0.07 |
| 4.4524 | 8228 | 0.0927 |
| 4.4529 | 8229 | 0.0933 |
| 4.4535 | 8230 | 0.0543 |
| 4.4540 | 8231 | 0.1016 |
| 4.4545 | 8232 | 0.0941 |
| 4.4551 | 8233 | 0.121 |
| 4.4556 | 8234 | 0.1199 |
| 4.4562 | 8235 | 0.0685 |
| 4.4567 | 8236 | 0.0845 |
| 4.4573 | 8237 | 0.0632 |
| 4.4578 | 8238 | 0.0853 |
| 4.4583 | 8239 | 0.1176 |
| 4.4589 | 8240 | 0.0674 |
| 4.4594 | 8241 | 0.0956 |
| 4.4600 | 8242 | 0.0947 |
| 4.4605 | 8243 | 0.0545 |
| 4.4610 | 8244 | 0.066 |
| 4.4616 | 8245 | 0.0996 |
| 4.4621 | 8246 | 0.1034 |
| 4.4627 | 8247 | 0.0796 |
| 4.4632 | 8248 | 0.051 |
| 4.4637 | 8249 | 0.0703 |
| 4.4643 | 8250 | 0.0544 |
| 4.4648 | 8251 | 0.0872 |
| 4.4654 | 8252 | 0.09 |
| 4.4659 | 8253 | 0.0663 |
| 4.4665 | 8254 | 0.0816 |
| 4.4670 | 8255 | 0.0895 |
| 4.4675 | 8256 | 0.0679 |
| 4.4681 | 8257 | 0.1112 |
| 4.4686 | 8258 | 0.053 |
| 4.4692 | 8259 | 0.0716 |
| 4.4697 | 8260 | 0.0547 |
| 4.4702 | 8261 | 0.0304 |
| 4.4708 | 8262 | 0.0803 |
| 4.4713 | 8263 | 0.1307 |
| 4.4719 | 8264 | 0.0788 |
| 4.4724 | 8265 | 0.1113 |
| 4.4729 | 8266 | 0.0699 |
| 4.4735 | 8267 | 0.0377 |
| 4.4740 | 8268 | 0.1292 |
| 4.4746 | 8269 | 0.0941 |
| 4.4751 | 8270 | 0.0552 |
| 4.4756 | 8271 | 0.0765 |
| 4.4762 | 8272 | 0.0529 |
| 4.4767 | 8273 | 0.0934 |
| 4.4773 | 8274 | 0.0468 |
| 4.4778 | 8275 | 0.1027 |
| 4.4784 | 8276 | 0.0807 |
| 4.4789 | 8277 | 0.055 |
| 4.4794 | 8278 | 0.0693 |
| 4.4800 | 8279 | 0.0844 |
| 4.4805 | 8280 | 0.0492 |
| 4.4811 | 8281 | 0.064 |
| 4.4816 | 8282 | 0.0415 |
| 4.4821 | 8283 | 0.0864 |
| 4.4827 | 8284 | 0.0457 |
| 4.4832 | 8285 | 0.0592 |
| 4.4838 | 8286 | 0.0617 |
| 4.4843 | 8287 | 0.1299 |
| 4.4848 | 8288 | 0.0791 |
| 4.4854 | 8289 | 0.0668 |
| 4.4859 | 8290 | 0.0699 |
| 4.4865 | 8291 | 0.0779 |
| 4.4870 | 8292 | 0.0761 |
| 4.4876 | 8293 | 0.0386 |
| 4.4881 | 8294 | 0.1083 |
| 4.4886 | 8295 | 0.0959 |
| 4.4892 | 8296 | 0.0914 |
| 4.4897 | 8297 | 0.0765 |
| 4.4903 | 8298 | 0.0893 |
| 4.4908 | 8299 | 0.0453 |
| 4.4913 | 8300 | 0.0947 |
| 4.4919 | 8301 | 0.0673 |
| 4.4924 | 8302 | 0.0647 |
| 4.4930 | 8303 | 0.058 |
| 4.4935 | 8304 | 0.049 |
| 4.4940 | 8305 | 0.0846 |
| 4.4946 | 8306 | 0.0875 |
| 4.4951 | 8307 | 0.077 |
| 4.4957 | 8308 | 0.0421 |
| 4.4962 | 8309 | 0.0371 |
| 4.4968 | 8310 | 0.0823 |
| 4.4973 | 8311 | 0.1056 |
| 4.4978 | 8312 | 0.0763 |
| 4.4984 | 8313 | 0.0941 |
| 4.4989 | 8314 | 0.0948 |
| 4.4995 | 8315 | 0.0786 |
| 4.5 | 8316 | 0.107 |
| 4.5005 | 8317 | 0.041 |
| 4.5011 | 8318 | 0.0434 |
| 4.5016 | 8319 | 0.0647 |
| 4.5022 | 8320 | 0.0695 |
| 4.5027 | 8321 | 0.0401 |
| 4.5032 | 8322 | 0.0541 |
| 4.5038 | 8323 | 0.0594 |
| 4.5043 | 8324 | 0.0724 |
| 4.5049 | 8325 | 0.0716 |
| 4.5054 | 8326 | 0.0595 |
| 4.5060 | 8327 | 0.1021 |
| 4.5065 | 8328 | 0.1364 |
| 4.5070 | 8329 | 0.071 |
| 4.5076 | 8330 | 0.0777 |
| 4.5081 | 8331 | 0.052 |
| 4.5087 | 8332 | 0.0704 |
| 4.5092 | 8333 | 0.0426 |
| 4.5097 | 8334 | 0.0496 |
| 4.5103 | 8335 | 0.1089 |
| 4.5108 | 8336 | 0.0477 |
| 4.5114 | 8337 | 0.0494 |
| 4.5119 | 8338 | 0.071 |
| 4.5124 | 8339 | 0.0852 |
| 4.5130 | 8340 | 0.0684 |
| 4.5135 | 8341 | 0.0899 |
| 4.5141 | 8342 | 0.0728 |
| 4.5146 | 8343 | 0.0475 |
| 4.5152 | 8344 | 0.0573 |
| 4.5157 | 8345 | 0.0477 |
| 4.5162 | 8346 | 0.1441 |
| 4.5168 | 8347 | 0.0834 |
| 4.5173 | 8348 | 0.1035 |
| 4.5179 | 8349 | 0.0851 |
| 4.5184 | 8350 | 0.0922 |
| 4.5189 | 8351 | 0.0465 |
| 4.5195 | 8352 | 0.0561 |
| 4.5200 | 8353 | 0.0894 |
| 4.5206 | 8354 | 0.0792 |
| 4.5211 | 8355 | 0.0433 |
| 4.5216 | 8356 | 0.1025 |
| 4.5222 | 8357 | 0.0863 |
| 4.5227 | 8358 | 0.0697 |
| 4.5233 | 8359 | 0.0524 |
| 4.5238 | 8360 | 0.1022 |
| 4.5244 | 8361 | 0.0622 |
| 4.5249 | 8362 | 0.0581 |
| 4.5254 | 8363 | 0.0483 |
| 4.5260 | 8364 | 0.0796 |
| 4.5265 | 8365 | 0.0627 |
| 4.5271 | 8366 | 0.0603 |
| 4.5276 | 8367 | 0.0918 |
| 4.5281 | 8368 | 0.0803 |
| 4.5287 | 8369 | 0.095 |
| 4.5292 | 8370 | 0.074 |
| 4.5298 | 8371 | 0.0451 |
| 4.5303 | 8372 | 0.0399 |
| 4.5308 | 8373 | 0.1166 |
| 4.5314 | 8374 | 0.0307 |
| 4.5319 | 8375 | 0.094 |
| 4.5325 | 8376 | 0.068 |
| 4.5330 | 8377 | 0.0592 |
| 4.5335 | 8378 | 0.068 |
| 4.5341 | 8379 | 0.0995 |
| 4.5346 | 8380 | 0.0999 |
| 4.5352 | 8381 | 0.0776 |
| 4.5357 | 8382 | 0.0525 |
| 4.5363 | 8383 | 0.1195 |
| 4.5368 | 8384 | 0.0648 |
| 4.5373 | 8385 | 0.0782 |
| 4.5379 | 8386 | 0.0862 |
| 4.5384 | 8387 | 0.0499 |
| 4.5390 | 8388 | 0.0554 |
| 4.5395 | 8389 | 0.0527 |
| 4.5400 | 8390 | 0.086 |
| 4.5406 | 8391 | 0.131 |
| 4.5411 | 8392 | 0.1038 |
| 4.5417 | 8393 | 0.0762 |
| 4.5422 | 8394 | 0.0491 |
| 4.5427 | 8395 | 0.082 |
| 4.5433 | 8396 | 0.0513 |
| 4.5438 | 8397 | 0.1002 |
| 4.5444 | 8398 | 0.0739 |
| 4.5449 | 8399 | 0.097 |
| 4.5455 | 8400 | 0.0747 |
| 4.5460 | 8401 | 0.0874 |
| 4.5465 | 8402 | 0.0823 |
| 4.5471 | 8403 | 0.0669 |
| 4.5476 | 8404 | 0.1175 |
| 4.5482 | 8405 | 0.0536 |
| 4.5487 | 8406 | 0.0462 |
| 4.5492 | 8407 | 0.0537 |
| 4.5498 | 8408 | 0.0638 |
| 4.5503 | 8409 | 0.0737 |
| 4.5509 | 8410 | 0.0682 |
| 4.5514 | 8411 | 0.0635 |
| 4.5519 | 8412 | 0.0594 |
| 4.5525 | 8413 | 0.0946 |
| 4.5530 | 8414 | 0.0788 |
| 4.5536 | 8415 | 0.099 |
| 4.5541 | 8416 | 0.0846 |
| 4.5547 | 8417 | 0.0349 |
| 4.5552 | 8418 | 0.0909 |
| 4.5557 | 8419 | 0.0854 |
| 4.5563 | 8420 | 0.0805 |
| 4.5568 | 8421 | 0.11 |
| 4.5574 | 8422 | 0.0886 |
| 4.5579 | 8423 | 0.0878 |
| 4.5584 | 8424 | 0.0417 |
| 4.5590 | 8425 | 0.0347 |
| 4.5595 | 8426 | 0.0969 |
| 4.5601 | 8427 | 0.085 |
| 4.5606 | 8428 | 0.0774 |
| 4.5611 | 8429 | 0.0709 |
| 4.5617 | 8430 | 0.0788 |
| 4.5622 | 8431 | 0.0675 |
| 4.5628 | 8432 | 0.0457 |
| 4.5633 | 8433 | 0.0503 |
| 4.5639 | 8434 | 0.0787 |
| 4.5644 | 8435 | 0.0534 |
| 4.5649 | 8436 | 0.0604 |
| 4.5655 | 8437 | 0.0614 |
| 4.5660 | 8438 | 0.0719 |
| 4.5666 | 8439 | 0.0982 |
| 4.5671 | 8440 | 0.0986 |
| 4.5676 | 8441 | 0.0645 |
| 4.5682 | 8442 | 0.0472 |
| 4.5687 | 8443 | 0.0601 |
| 4.5693 | 8444 | 0.0828 |
| 4.5698 | 8445 | 0.0511 |
| 4.5703 | 8446 | 0.0779 |
| 4.5709 | 8447 | 0.0666 |
| 4.5714 | 8448 | 0.0819 |
| 4.5720 | 8449 | 0.0523 |
| 4.5725 | 8450 | 0.0769 |
| 4.5731 | 8451 | 0.055 |
| 4.5736 | 8452 | 0.0834 |
| 4.5741 | 8453 | 0.072 |
| 4.5747 | 8454 | 0.0512 |
| 4.5752 | 8455 | 0.0893 |
| 4.5758 | 8456 | 0.0822 |
| 4.5763 | 8457 | 0.1052 |
| 4.5768 | 8458 | 0.1152 |
| 4.5774 | 8459 | 0.082 |
| 4.5779 | 8460 | 0.0512 |
| 4.5785 | 8461 | 0.0789 |
| 4.5790 | 8462 | 0.0879 |
| 4.5795 | 8463 | 0.0931 |
| 4.5801 | 8464 | 0.0645 |
| 4.5806 | 8465 | 0.0544 |
| 4.5812 | 8466 | 0.0611 |
| 4.5817 | 8467 | 0.0538 |
| 4.5823 | 8468 | 0.0677 |
| 4.5828 | 8469 | 0.0474 |
| 4.5833 | 8470 | 0.1146 |
| 4.5839 | 8471 | 0.0672 |
| 4.5844 | 8472 | 0.079 |
| 4.5850 | 8473 | 0.0878 |
| 4.5855 | 8474 | 0.0675 |
| 4.5860 | 8475 | 0.0465 |
| 4.5866 | 8476 | 0.041 |
| 4.5871 | 8477 | 0.0603 |
| 4.5877 | 8478 | 0.1014 |
| 4.5882 | 8479 | 0.0796 |
| 4.5887 | 8480 | 0.0753 |
| 4.5893 | 8481 | 0.0567 |
| 4.5898 | 8482 | 0.0869 |
| 4.5904 | 8483 | 0.0727 |
| 4.5909 | 8484 | 0.0628 |
| 4.5915 | 8485 | 0.0728 |
| 4.5920 | 8486 | 0.0996 |
| 4.5925 | 8487 | 0.0488 |
| 4.5931 | 8488 | 0.0839 |
| 4.5936 | 8489 | 0.1394 |
| 4.5942 | 8490 | 0.0656 |
| 4.5947 | 8491 | 0.0594 |
| 4.5952 | 8492 | 0.0744 |
| 4.5958 | 8493 | 0.1238 |
| 4.5963 | 8494 | 0.0714 |
| 4.5969 | 8495 | 0.086 |
| 4.5974 | 8496 | 0.0442 |
| 4.5979 | 8497 | 0.0556 |
| 4.5985 | 8498 | 0.0424 |
| 4.5990 | 8499 | 0.0807 |
| 4.5996 | 8500 | 0.09 |
| 4.6001 | 8501 | 0.0519 |
| 4.6006 | 8502 | 0.1343 |
| 4.6012 | 8503 | 0.0531 |
| 4.6017 | 8504 | 0.105 |
| 4.6023 | 8505 | 0.0547 |
| 4.6028 | 8506 | 0.0451 |
| 4.6034 | 8507 | 0.1044 |
| 4.6039 | 8508 | 0.0613 |
| 4.6044 | 8509 | 0.0873 |
| 4.6050 | 8510 | 0.0414 |
| 4.6055 | 8511 | 0.0955 |
| 4.6061 | 8512 | 0.0975 |
| 4.6066 | 8513 | 0.0816 |
| 4.6071 | 8514 | 0.0414 |
| 4.6077 | 8515 | 0.0482 |
| 4.6082 | 8516 | 0.0785 |
| 4.6088 | 8517 | 0.0547 |
| 4.6093 | 8518 | 0.0911 |
| 4.6098 | 8519 | 0.0367 |
| 4.6104 | 8520 | 0.0407 |
| 4.6109 | 8521 | 0.075 |
| 4.6115 | 8522 | 0.071 |
| 4.6120 | 8523 | 0.117 |
| 4.6126 | 8524 | 0.0599 |
| 4.6131 | 8525 | 0.0584 |
| 4.6136 | 8526 | 0.077 |
| 4.6142 | 8527 | 0.048 |
| 4.6147 | 8528 | 0.0876 |
| 4.6153 | 8529 | 0.0825 |
| 4.6158 | 8530 | 0.1066 |
| 4.6163 | 8531 | 0.0535 |
| 4.6169 | 8532 | 0.1527 |
| 4.6174 | 8533 | 0.0621 |
| 4.6180 | 8534 | 0.0845 |
| 4.6185 | 8535 | 0.0583 |
| 4.6190 | 8536 | 0.0416 |
| 4.6196 | 8537 | 0.0618 |
| 4.6201 | 8538 | 0.0485 |
| 4.6207 | 8539 | 0.0528 |
| 4.6212 | 8540 | 0.0898 |
| 4.6218 | 8541 | 0.0913 |
| 4.6223 | 8542 | 0.0792 |
| 4.6228 | 8543 | 0.078 |
| 4.6234 | 8544 | 0.1058 |
| 4.6239 | 8545 | 0.1053 |
| 4.6245 | 8546 | 0.0493 |
| 4.625 | 8547 | 0.0764 |
| 4.6255 | 8548 | 0.0776 |
| 4.6261 | 8549 | 0.1155 |
| 4.6266 | 8550 | 0.1049 |
| 4.6272 | 8551 | 0.0528 |
| 4.6277 | 8552 | 0.0837 |
| 4.6282 | 8553 | 0.0427 |
| 4.6288 | 8554 | 0.0646 |
| 4.6293 | 8555 | 0.0891 |
| 4.6299 | 8556 | 0.0688 |
| 4.6304 | 8557 | 0.0545 |
| 4.6310 | 8558 | 0.0498 |
| 4.6315 | 8559 | 0.0776 |
| 4.6320 | 8560 | 0.0771 |
| 4.6326 | 8561 | 0.0769 |
| 4.6331 | 8562 | 0.099 |
| 4.6337 | 8563 | 0.1386 |
| 4.6342 | 8564 | 0.0801 |
| 4.6347 | 8565 | 0.0868 |
| 4.6353 | 8566 | 0.0582 |
| 4.6358 | 8567 | 0.0783 |
| 4.6364 | 8568 | 0.0564 |
| 4.6369 | 8569 | 0.0616 |
| 4.6374 | 8570 | 0.0781 |
| 4.6380 | 8571 | 0.0538 |
| 4.6385 | 8572 | 0.0557 |
| 4.6391 | 8573 | 0.0889 |
| 4.6396 | 8574 | 0.0919 |
| 4.6402 | 8575 | 0.0641 |
| 4.6407 | 8576 | 0.0538 |
| 4.6412 | 8577 | 0.0819 |
| 4.6418 | 8578 | 0.107 |
| 4.6423 | 8579 | 0.0873 |
| 4.6429 | 8580 | 0.0843 |
| 4.6434 | 8581 | 0.0441 |
| 4.6439 | 8582 | 0.0395 |
| 4.6445 | 8583 | 0.052 |
| 4.6450 | 8584 | 0.061 |
| 4.6456 | 8585 | 0.0805 |
| 4.6461 | 8586 | 0.0654 |
| 4.6466 | 8587 | 0.0911 |
| 4.6472 | 8588 | 0.0888 |
| 4.6477 | 8589 | 0.1061 |
| 4.6483 | 8590 | 0.1197 |
| 4.6488 | 8591 | 0.0756 |
| 4.6494 | 8592 | 0.0682 |
| 4.6499 | 8593 | 0.0798 |
| 4.6504 | 8594 | 0.0819 |
| 4.6510 | 8595 | 0.0426 |
| 4.6515 | 8596 | 0.0594 |
| 4.6521 | 8597 | 0.0992 |
| 4.6526 | 8598 | 0.0755 |
| 4.6531 | 8599 | 0.1406 |
| 4.6537 | 8600 | 0.0817 |
| 4.6542 | 8601 | 0.0949 |
| 4.6548 | 8602 | 0.047 |
| 4.6553 | 8603 | 0.1034 |
| 4.6558 | 8604 | 0.0559 |
| 4.6564 | 8605 | 0.056 |
| 4.6569 | 8606 | 0.0821 |
| 4.6575 | 8607 | 0.0825 |
| 4.6580 | 8608 | 0.0474 |
| 4.6585 | 8609 | 0.0812 |
| 4.6591 | 8610 | 0.1093 |
| 4.6596 | 8611 | 0.0853 |
| 4.6602 | 8612 | 0.127 |
| 4.6607 | 8613 | 0.0948 |
| 4.6613 | 8614 | 0.0691 |
| 4.6618 | 8615 | 0.0701 |
| 4.6623 | 8616 | 0.0476 |
| 4.6629 | 8617 | 0.0618 |
| 4.6634 | 8618 | 0.0608 |
| 4.6640 | 8619 | 0.0521 |
| 4.6645 | 8620 | 0.0697 |
| 4.6650 | 8621 | 0.0835 |
| 4.6656 | 8622 | 0.0725 |
| 4.6661 | 8623 | 0.0591 |
| 4.6667 | 8624 | 0.0626 |
| 4.6672 | 8625 | 0.0781 |
| 4.6677 | 8626 | 0.0924 |
| 4.6683 | 8627 | 0.0453 |
| 4.6688 | 8628 | 0.0662 |
| 4.6694 | 8629 | 0.1009 |
| 4.6699 | 8630 | 0.1171 |
| 4.6705 | 8631 | 0.0831 |
| 4.6710 | 8632 | 0.1345 |
| 4.6715 | 8633 | 0.1258 |
| 4.6721 | 8634 | 0.0709 |
| 4.6726 | 8635 | 0.0669 |
| 4.6732 | 8636 | 0.085 |
| 4.6737 | 8637 | 0.0487 |
| 4.6742 | 8638 | 0.1033 |
| 4.6748 | 8639 | 0.0636 |
| 4.6753 | 8640 | 0.1433 |
| 4.6759 | 8641 | 0.0807 |
| 4.6764 | 8642 | 0.0514 |
| 4.6769 | 8643 | 0.0511 |
| 4.6775 | 8644 | 0.0612 |
| 4.6780 | 8645 | 0.0528 |
| 4.6786 | 8646 | 0.1097 |
| 4.6791 | 8647 | 0.1121 |
| 4.6797 | 8648 | 0.0976 |
| 4.6802 | 8649 | 0.0518 |
| 4.6807 | 8650 | 0.0832 |
| 4.6813 | 8651 | 0.0672 |
| 4.6818 | 8652 | 0.1047 |
| 4.6824 | 8653 | 0.0701 |
| 4.6829 | 8654 | 0.0913 |
| 4.6834 | 8655 | 0.1333 |
| 4.6840 | 8656 | 0.039 |
| 4.6845 | 8657 | 0.0704 |
| 4.6851 | 8658 | 0.0657 |
| 4.6856 | 8659 | 0.1022 |
| 4.6861 | 8660 | 0.0842 |
| 4.6867 | 8661 | 0.0497 |
| 4.6872 | 8662 | 0.0972 |
| 4.6878 | 8663 | 0.1047 |
| 4.6883 | 8664 | 0.0853 |
| 4.6889 | 8665 | 0.0645 |
| 4.6894 | 8666 | 0.0978 |
| 4.6899 | 8667 | 0.0783 |
| 4.6905 | 8668 | 0.084 |
| 4.6910 | 8669 | 0.0645 |
| 4.6916 | 8670 | 0.1056 |
| 4.6921 | 8671 | 0.0652 |
| 4.6926 | 8672 | 0.0799 |
| 4.6932 | 8673 | 0.0666 |
| 4.6937 | 8674 | 0.1178 |
| 4.6943 | 8675 | 0.0628 |
| 4.6948 | 8676 | 0.0546 |
| 4.6953 | 8677 | 0.051 |
| 4.6959 | 8678 | 0.0587 |
| 4.6964 | 8679 | 0.0865 |
| 4.6970 | 8680 | 0.0866 |
| 4.6975 | 8681 | 0.1567 |
| 4.6981 | 8682 | 0.0839 |
| 4.6986 | 8683 | 0.03 |
| 4.6991 | 8684 | 0.0512 |
| 4.6997 | 8685 | 0.0906 |
| 4.7002 | 8686 | 0.0504 |
| 4.7008 | 8687 | 0.0534 |
| 4.7013 | 8688 | 0.1216 |
| 4.7018 | 8689 | 0.0489 |
| 4.7024 | 8690 | 0.0323 |
| 4.7029 | 8691 | 0.1232 |
| 4.7035 | 8692 | 0.0874 |
| 4.7040 | 8693 | 0.0811 |
| 4.7045 | 8694 | 0.0777 |
| 4.7051 | 8695 | 0.1029 |
| 4.7056 | 8696 | 0.0595 |
| 4.7062 | 8697 | 0.0566 |
| 4.7067 | 8698 | 0.0542 |
| 4.7073 | 8699 | 0.0754 |
| 4.7078 | 8700 | 0.0808 |
| 4.7083 | 8701 | 0.0331 |
| 4.7089 | 8702 | 0.0706 |
| 4.7094 | 8703 | 0.0807 |
| 4.7100 | 8704 | 0.0759 |
| 4.7105 | 8705 | 0.0889 |
| 4.7110 | 8706 | 0.0834 |
| 4.7116 | 8707 | 0.0565 |
| 4.7121 | 8708 | 0.0956 |
| 4.7127 | 8709 | 0.0664 |
| 4.7132 | 8710 | 0.0545 |
| 4.7137 | 8711 | 0.0606 |
| 4.7143 | 8712 | 0.0661 |
| 4.7148 | 8713 | 0.1365 |
| 4.7154 | 8714 | 0.0479 |
| 4.7159 | 8715 | 0.0766 |
| 4.7165 | 8716 | 0.0572 |
| 4.7170 | 8717 | 0.0442 |
| 4.7175 | 8718 | 0.0681 |
| 4.7181 | 8719 | 0.0835 |
| 4.7186 | 8720 | 0.0626 |
| 4.7192 | 8721 | 0.0836 |
| 4.7197 | 8722 | 0.0935 |
| 4.7202 | 8723 | 0.0574 |
| 4.7208 | 8724 | 0.0614 |
| 4.7213 | 8725 | 0.0866 |
| 4.7219 | 8726 | 0.0739 |
| 4.7224 | 8727 | 0.0691 |
| 4.7229 | 8728 | 0.0606 |
| 4.7235 | 8729 | 0.1034 |
| 4.7240 | 8730 | 0.0488 |
| 4.7246 | 8731 | 0.1022 |
| 4.7251 | 8732 | 0.0805 |
| 4.7256 | 8733 | 0.0791 |
| 4.7262 | 8734 | 0.084 |
| 4.7267 | 8735 | 0.0815 |
| 4.7273 | 8736 | 0.0596 |
| 4.7278 | 8737 | 0.0638 |
| 4.7284 | 8738 | 0.0841 |
| 4.7289 | 8739 | 0.0824 |
| 4.7294 | 8740 | 0.0935 |
| 4.7300 | 8741 | 0.1037 |
| 4.7305 | 8742 | 0.0342 |
| 4.7311 | 8743 | 0.0908 |
| 4.7316 | 8744 | 0.0666 |
| 4.7321 | 8745 | 0.1002 |
| 4.7327 | 8746 | 0.0378 |
| 4.7332 | 8747 | 0.0891 |
| 4.7338 | 8748 | 0.0582 |
| 4.7343 | 8749 | 0.0547 |
| 4.7348 | 8750 | 0.0525 |
| 4.7354 | 8751 | 0.0592 |
| 4.7359 | 8752 | 0.0906 |
| 4.7365 | 8753 | 0.0714 |
| 4.7370 | 8754 | 0.0845 |
| 4.7376 | 8755 | 0.0484 |
| 4.7381 | 8756 | 0.0912 |
| 4.7386 | 8757 | 0.1357 |
| 4.7392 | 8758 | 0.0531 |
| 4.7397 | 8759 | 0.0605 |
| 4.7403 | 8760 | 0.0508 |
| 4.7408 | 8761 | 0.0699 |
| 4.7413 | 8762 | 0.1222 |
| 4.7419 | 8763 | 0.0681 |
| 4.7424 | 8764 | 0.108 |
| 4.7430 | 8765 | 0.1028 |
| 4.7435 | 8766 | 0.0768 |
| 4.7440 | 8767 | 0.0676 |
| 4.7446 | 8768 | 0.125 |
| 4.7451 | 8769 | 0.1346 |
| 4.7457 | 8770 | 0.1093 |
| 4.7462 | 8771 | 0.0797 |
| 4.7468 | 8772 | 0.0522 |
| 4.7473 | 8773 | 0.0583 |
| 4.7478 | 8774 | 0.0407 |
| 4.7484 | 8775 | 0.0794 |
| 4.7489 | 8776 | 0.0527 |
| 4.7495 | 8777 | 0.0759 |
| 4.75 | 8778 | 0.0702 |
| 4.7505 | 8779 | 0.0942 |
| 4.7511 | 8780 | 0.0785 |
| 4.7516 | 8781 | 0.0459 |
| 4.7522 | 8782 | 0.0551 |
| 4.7527 | 8783 | 0.0596 |
| 4.7532 | 8784 | 0.065 |
| 4.7538 | 8785 | 0.0853 |
| 4.7543 | 8786 | 0.0739 |
| 4.7549 | 8787 | 0.0584 |
| 4.7554 | 8788 | 0.0459 |
| 4.7560 | 8789 | 0.0508 |
| 4.7565 | 8790 | 0.048 |
| 4.7570 | 8791 | 0.0666 |
| 4.7576 | 8792 | 0.0742 |
| 4.7581 | 8793 | 0.0845 |
| 4.7587 | 8794 | 0.1084 |
| 4.7592 | 8795 | 0.071 |
| 4.7597 | 8796 | 0.0561 |
| 4.7603 | 8797 | 0.0892 |
| 4.7608 | 8798 | 0.1252 |
| 4.7614 | 8799 | 0.0534 |
| 4.7619 | 8800 | 0.0756 |
| 4.7624 | 8801 | 0.0541 |
| 4.7630 | 8802 | 0.0752 |
| 4.7635 | 8803 | 0.0596 |
| 4.7641 | 8804 | 0.0627 |
| 4.7646 | 8805 | 0.0567 |
| 4.7652 | 8806 | 0.0778 |
| 4.7657 | 8807 | 0.1045 |
| 4.7662 | 8808 | 0.0496 |
| 4.7668 | 8809 | 0.0573 |
| 4.7673 | 8810 | 0.0846 |
| 4.7679 | 8811 | 0.0591 |
| 4.7684 | 8812 | 0.1104 |
| 4.7689 | 8813 | 0.1046 |
| 4.7695 | 8814 | 0.0658 |
| 4.7700 | 8815 | 0.099 |
| 4.7706 | 8816 | 0.0585 |
| 4.7711 | 8817 | 0.0812 |
| 4.7716 | 8818 | 0.0924 |
| 4.7722 | 8819 | 0.0882 |
| 4.7727 | 8820 | 0.0739 |
| 4.7733 | 8821 | 0.0815 |
| 4.7738 | 8822 | 0.0819 |
| 4.7744 | 8823 | 0.089 |
| 4.7749 | 8824 | 0.0684 |
| 4.7754 | 8825 | 0.0881 |
| 4.7760 | 8826 | 0.0748 |
| 4.7765 | 8827 | 0.0391 |
| 4.7771 | 8828 | 0.0563 |
| 4.7776 | 8829 | 0.0946 |
| 4.7781 | 8830 | 0.0619 |
| 4.7787 | 8831 | 0.088 |
| 4.7792 | 8832 | 0.0401 |
| 4.7798 | 8833 | 0.0476 |
| 4.7803 | 8834 | 0.0665 |
| 4.7808 | 8835 | 0.0967 |
| 4.7814 | 8836 | 0.0557 |
| 4.7819 | 8837 | 0.0674 |
| 4.7825 | 8838 | 0.0843 |
| 4.7830 | 8839 | 0.0814 |
| 4.7835 | 8840 | 0.0953 |
| 4.7841 | 8841 | 0.052 |
| 4.7846 | 8842 | 0.063 |
| 4.7852 | 8843 | 0.0522 |
| 4.7857 | 8844 | 0.0569 |
| 4.7863 | 8845 | 0.0804 |
| 4.7868 | 8846 | 0.1392 |
| 4.7873 | 8847 | 0.1136 |
| 4.7879 | 8848 | 0.1042 |
| 4.7884 | 8849 | 0.0611 |
| 4.7890 | 8850 | 0.0952 |
| 4.7895 | 8851 | 0.061 |
| 4.7900 | 8852 | 0.0575 |
| 4.7906 | 8853 | 0.0524 |
| 4.7911 | 8854 | 0.0798 |
| 4.7917 | 8855 | 0.0572 |
| 4.7922 | 8856 | 0.0519 |
| 4.7927 | 8857 | 0.0481 |
| 4.7933 | 8858 | 0.0478 |
| 4.7938 | 8859 | 0.1597 |
| 4.7944 | 8860 | 0.0685 |
| 4.7949 | 8861 | 0.0735 |
| 4.7955 | 8862 | 0.0408 |
| 4.7960 | 8863 | 0.0926 |
| 4.7965 | 8864 | 0.0507 |
| 4.7971 | 8865 | 0.085 |
| 4.7976 | 8866 | 0.0854 |
| 4.7982 | 8867 | 0.0895 |
| 4.7987 | 8868 | 0.0419 |
| 4.7992 | 8869 | 0.0593 |
| 4.7998 | 8870 | 0.0475 |
| 4.8003 | 8871 | 0.0378 |
| 4.8009 | 8872 | 0.0377 |
| 4.8014 | 8873 | 0.1365 |
| 4.8019 | 8874 | 0.0649 |
| 4.8025 | 8875 | 0.0649 |
| 4.8030 | 8876 | 0.0567 |
| 4.8036 | 8877 | 0.113 |
| 4.8041 | 8878 | 0.1126 |
| 4.8047 | 8879 | 0.0598 |
| 4.8052 | 8880 | 0.0512 |
| 4.8057 | 8881 | 0.0908 |
| 4.8063 | 8882 | 0.0768 |
| 4.8068 | 8883 | 0.0836 |
| 4.8074 | 8884 | 0.0713 |
| 4.8079 | 8885 | 0.0784 |
| 4.8084 | 8886 | 0.0593 |
| 4.8090 | 8887 | 0.0935 |
| 4.8095 | 8888 | 0.0715 |
| 4.8101 | 8889 | 0.066 |
| 4.8106 | 8890 | 0.0286 |
| 4.8111 | 8891 | 0.0834 |
| 4.8117 | 8892 | 0.0845 |
| 4.8122 | 8893 | 0.0505 |
| 4.8128 | 8894 | 0.033 |
| 4.8133 | 8895 | 0.0866 |
| 4.8139 | 8896 | 0.0742 |
| 4.8144 | 8897 | 0.0922 |
| 4.8149 | 8898 | 0.0452 |
| 4.8155 | 8899 | 0.0621 |
| 4.8160 | 8900 | 0.0925 |
| 4.8166 | 8901 | 0.1132 |
| 4.8171 | 8902 | 0.148 |
| 4.8176 | 8903 | 0.051 |
| 4.8182 | 8904 | 0.0526 |
| 4.8187 | 8905 | 0.0705 |
| 4.8193 | 8906 | 0.0508 |
| 4.8198 | 8907 | 0.1021 |
| 4.8203 | 8908 | 0.0518 |
| 4.8209 | 8909 | 0.0356 |
| 4.8214 | 8910 | 0.0739 |
| 4.8220 | 8911 | 0.076 |
| 4.8225 | 8912 | 0.0742 |
| 4.8231 | 8913 | 0.0393 |
| 4.8236 | 8914 | 0.0708 |
| 4.8241 | 8915 | 0.1001 |
| 4.8247 | 8916 | 0.0548 |
| 4.8252 | 8917 | 0.0441 |
| 4.8258 | 8918 | 0.0711 |
| 4.8263 | 8919 | 0.0546 |
| 4.8268 | 8920 | 0.0383 |
| 4.8274 | 8921 | 0.0793 |
| 4.8279 | 8922 | 0.0886 |
| 4.8285 | 8923 | 0.0379 |
| 4.8290 | 8924 | 0.0724 |
| 4.8295 | 8925 | 0.0617 |
| 4.8301 | 8926 | 0.1221 |
| 4.8306 | 8927 | 0.0602 |
| 4.8312 | 8928 | 0.0667 |
| 4.8317 | 8929 | 0.061 |
| 4.8323 | 8930 | 0.0745 |
| 4.8328 | 8931 | 0.0412 |
| 4.8333 | 8932 | 0.0855 |
| 4.8339 | 8933 | 0.091 |
| 4.8344 | 8934 | 0.0918 |
| 4.8350 | 8935 | 0.0852 |
| 4.8355 | 8936 | 0.0741 |
| 4.8360 | 8937 | 0.1196 |
| 4.8366 | 8938 | 0.081 |
| 4.8371 | 8939 | 0.0328 |
| 4.8377 | 8940 | 0.0854 |
| 4.8382 | 8941 | 0.0589 |
| 4.8387 | 8942 | 0.0716 |
| 4.8393 | 8943 | 0.127 |
| 4.8398 | 8944 | 0.0846 |
| 4.8404 | 8945 | 0.0752 |
| 4.8409 | 8946 | 0.0594 |
| 4.8415 | 8947 | 0.0749 |
| 4.8420 | 8948 | 0.0611 |
| 4.8425 | 8949 | 0.1148 |
| 4.8431 | 8950 | 0.1382 |
| 4.8436 | 8951 | 0.0685 |
| 4.8442 | 8952 | 0.1073 |
| 4.8447 | 8953 | 0.069 |
| 4.8452 | 8954 | 0.065 |
| 4.8458 | 8955 | 0.0473 |
| 4.8463 | 8956 | 0.098 |
| 4.8469 | 8957 | 0.075 |
| 4.8474 | 8958 | 0.0819 |
| 4.8479 | 8959 | 0.0719 |
| 4.8485 | 8960 | 0.093 |
| 4.8490 | 8961 | 0.0689 |
| 4.8496 | 8962 | 0.0798 |
| 4.8501 | 8963 | 0.1136 |
| 4.8506 | 8964 | 0.0337 |
| 4.8512 | 8965 | 0.0395 |
| 4.8517 | 8966 | 0.1086 |
| 4.8523 | 8967 | 0.0799 |
| 4.8528 | 8968 | 0.1331 |
| 4.8534 | 8969 | 0.091 |
| 4.8539 | 8970 | 0.0685 |
| 4.8544 | 8971 | 0.0532 |
| 4.8550 | 8972 | 0.0888 |
| 4.8555 | 8973 | 0.0803 |
| 4.8561 | 8974 | 0.0884 |
| 4.8566 | 8975 | 0.0744 |
| 4.8571 | 8976 | 0.0617 |
| 4.8577 | 8977 | 0.1018 |
| 4.8582 | 8978 | 0.121 |
| 4.8588 | 8979 | 0.0692 |
| 4.8593 | 8980 | 0.0917 |
| 4.8598 | 8981 | 0.0835 |
| 4.8604 | 8982 | 0.1136 |
| 4.8609 | 8983 | 0.0367 |
| 4.8615 | 8984 | 0.0504 |
| 4.8620 | 8985 | 0.0435 |
| 4.8626 | 8986 | 0.0538 |
| 4.8631 | 8987 | 0.1242 |
| 4.8636 | 8988 | 0.0823 |
| 4.8642 | 8989 | 0.1253 |
| 4.8647 | 8990 | 0.057 |
| 4.8653 | 8991 | 0.0728 |
| 4.8658 | 8992 | 0.0777 |
| 4.8663 | 8993 | 0.0497 |
| 4.8669 | 8994 | 0.1073 |
| 4.8674 | 8995 | 0.0658 |
| 4.8680 | 8996 | 0.1012 |
| 4.8685 | 8997 | 0.0533 |
| 4.8690 | 8998 | 0.1424 |
| 4.8696 | 8999 | 0.083 |
| 4.8701 | 9000 | 0.0711 |
| 4.8707 | 9001 | 0.0627 |
| 4.8712 | 9002 | 0.1003 |
| 4.8718 | 9003 | 0.0839 |
| 4.8723 | 9004 | 0.0728 |
| 4.8728 | 9005 | 0.0887 |
| 4.8734 | 9006 | 0.0983 |
| 4.8739 | 9007 | 0.0663 |
| 4.8745 | 9008 | 0.0769 |
| 4.875 | 9009 | 0.082 |
| 4.8755 | 9010 | 0.0342 |
| 4.8761 | 9011 | 0.0633 |
| 4.8766 | 9012 | 0.1005 |
| 4.8772 | 9013 | 0.0787 |
| 4.8777 | 9014 | 0.0527 |
| 4.8782 | 9015 | 0.0657 |
| 4.8788 | 9016 | 0.0598 |
| 4.8793 | 9017 | 0.0439 |
| 4.8799 | 9018 | 0.1151 |
| 4.8804 | 9019 | 0.0641 |
| 4.8810 | 9020 | 0.0624 |
| 4.8815 | 9021 | 0.0717 |
| 4.8820 | 9022 | 0.0616 |
| 4.8826 | 9023 | 0.0992 |
| 4.8831 | 9024 | 0.0647 |
| 4.8837 | 9025 | 0.0399 |
| 4.8842 | 9026 | 0.0925 |
| 4.8847 | 9027 | 0.0394 |
| 4.8853 | 9028 | 0.0453 |
| 4.8858 | 9029 | 0.0564 |
| 4.8864 | 9030 | 0.0562 |
| 4.8869 | 9031 | 0.0652 |
| 4.8874 | 9032 | 0.1178 |
| 4.8880 | 9033 | 0.0897 |
| 4.8885 | 9034 | 0.0878 |
| 4.8891 | 9035 | 0.0696 |
| 4.8896 | 9036 | 0.0753 |
| 4.8902 | 9037 | 0.024 |
| 4.8907 | 9038 | 0.0792 |
| 4.8912 | 9039 | 0.0581 |
| 4.8918 | 9040 | 0.0626 |
| 4.8923 | 9041 | 0.0969 |
| 4.8929 | 9042 | 0.0701 |
| 4.8934 | 9043 | 0.0838 |
| 4.8939 | 9044 | 0.0683 |
| 4.8945 | 9045 | 0.1543 |
| 4.8950 | 9046 | 0.0442 |
| 4.8956 | 9047 | 0.054 |
| 4.8961 | 9048 | 0.0754 |
| 4.8966 | 9049 | 0.0678 |
| 4.8972 | 9050 | 0.081 |
| 4.8977 | 9051 | 0.1075 |
| 4.8983 | 9052 | 0.0894 |
| 4.8988 | 9053 | 0.0426 |
| 4.8994 | 9054 | 0.0845 |
| 4.8999 | 9055 | 0.0762 |
| 4.9004 | 9056 | 0.0308 |
| 4.9010 | 9057 | 0.0966 |
| 4.9015 | 9058 | 0.0585 |
| 4.9021 | 9059 | 0.0754 |
| 4.9026 | 9060 | 0.0749 |
| 4.9031 | 9061 | 0.0368 |
| 4.9037 | 9062 | 0.0889 |
| 4.9042 | 9063 | 0.0788 |
| 4.9048 | 9064 | 0.0699 |
| 4.9053 | 9065 | 0.1053 |
| 4.9058 | 9066 | 0.089 |
| 4.9064 | 9067 | 0.0436 |
| 4.9069 | 9068 | 0.086 |
| 4.9075 | 9069 | 0.1081 |
| 4.9080 | 9070 | 0.0613 |
| 4.9085 | 9071 | 0.1226 |
| 4.9091 | 9072 | 0.0917 |
| 4.9096 | 9073 | 0.0701 |
| 4.9102 | 9074 | 0.0506 |
| 4.9107 | 9075 | 0.1038 |
| 4.9113 | 9076 | 0.0662 |
| 4.9118 | 9077 | 0.0554 |
| 4.9123 | 9078 | 0.0601 |
| 4.9129 | 9079 | 0.0852 |
| 4.9134 | 9080 | 0.1459 |
| 4.9140 | 9081 | 0.0738 |
| 4.9145 | 9082 | 0.0849 |
| 4.9150 | 9083 | 0.0931 |
| 4.9156 | 9084 | 0.0715 |
| 4.9161 | 9085 | 0.089 |
| 4.9167 | 9086 | 0.0528 |
| 4.9172 | 9087 | 0.066 |
| 4.9177 | 9088 | 0.0574 |
| 4.9183 | 9089 | 0.0622 |
| 4.9188 | 9090 | 0.0611 |
| 4.9194 | 9091 | 0.076 |
| 4.9199 | 9092 | 0.0754 |
| 4.9205 | 9093 | 0.0648 |
| 4.9210 | 9094 | 0.0491 |
| 4.9215 | 9095 | 0.0685 |
| 4.9221 | 9096 | 0.0508 |
| 4.9226 | 9097 | 0.0392 |
| 4.9232 | 9098 | 0.0433 |
| 4.9237 | 9099 | 0.1153 |
| 4.9242 | 9100 | 0.0434 |
| 4.9248 | 9101 | 0.082 |
| 4.9253 | 9102 | 0.0532 |
| 4.9259 | 9103 | 0.0851 |
| 4.9264 | 9104 | 0.0609 |
| 4.9269 | 9105 | 0.0983 |
| 4.9275 | 9106 | 0.0649 |
| 4.9280 | 9107 | 0.1153 |
| 4.9286 | 9108 | 0.0579 |
| 4.9291 | 9109 | 0.0409 |
| 4.9297 | 9110 | 0.0579 |
| 4.9302 | 9111 | 0.1177 |
| 4.9307 | 9112 | 0.0761 |
| 4.9313 | 9113 | 0.0724 |
| 4.9318 | 9114 | 0.0604 |
| 4.9324 | 9115 | 0.0842 |
| 4.9329 | 9116 | 0.135 |
| 4.9334 | 9117 | 0.0625 |
| 4.9340 | 9118 | 0.1291 |
| 4.9345 | 9119 | 0.0981 |
| 4.9351 | 9120 | 0.0773 |
| 4.9356 | 9121 | 0.0926 |
| 4.9361 | 9122 | 0.0488 |
| 4.9367 | 9123 | 0.1353 |
| 4.9372 | 9124 | 0.0954 |
| 4.9378 | 9125 | 0.12 |
| 4.9383 | 9126 | 0.1061 |
| 4.9389 | 9127 | 0.0978 |
| 4.9394 | 9128 | 0.1289 |
| 4.9399 | 9129 | 0.0838 |
| 4.9405 | 9130 | 0.0666 |
| 4.9410 | 9131 | 0.0846 |
| 4.9416 | 9132 | 0.101 |
| 4.9421 | 9133 | 0.0725 |
| 4.9426 | 9134 | 0.1505 |
| 4.9432 | 9135 | 0.0762 |
| 4.9437 | 9136 | 0.0814 |
| 4.9443 | 9137 | 0.0505 |
| 4.9448 | 9138 | 0.0775 |
| 4.9453 | 9139 | 0.076 |
| 4.9459 | 9140 | 0.0293 |
| 4.9464 | 9141 | 0.0549 |
| 4.9470 | 9142 | 0.0998 |
| 4.9475 | 9143 | 0.0642 |
| 4.9481 | 9144 | 0.0979 |
| 4.9486 | 9145 | 0.0772 |
| 4.9491 | 9146 | 0.0866 |
| 4.9497 | 9147 | 0.0543 |
| 4.9502 | 9148 | 0.0837 |
| 4.9508 | 9149 | 0.0451 |
| 4.9513 | 9150 | 0.0765 |
| 4.9518 | 9151 | 0.1038 |
| 4.9524 | 9152 | 0.0441 |
| 4.9529 | 9153 | 0.0482 |
| 4.9535 | 9154 | 0.0569 |
| 4.9540 | 9155 | 0.1097 |
| 4.9545 | 9156 | 0.0813 |
| 4.9551 | 9157 | 0.0797 |
| 4.9556 | 9158 | 0.1131 |
| 4.9562 | 9159 | 0.1286 |
| 4.9567 | 9160 | 0.0534 |
| 4.9573 | 9161 | 0.1215 |
| 4.9578 | 9162 | 0.0492 |
| 4.9583 | 9163 | 0.1018 |
| 4.9589 | 9164 | 0.0785 |
| 4.9594 | 9165 | 0.1079 |
| 4.9600 | 9166 | 0.0646 |
| 4.9605 | 9167 | 0.0535 |
| 4.9610 | 9168 | 0.0769 |
| 4.9616 | 9169 | 0.0424 |
| 4.9621 | 9170 | 0.0736 |
| 4.9627 | 9171 | 0.058 |
| 4.9632 | 9172 | 0.0557 |
| 4.9637 | 9173 | 0.0953 |
| 4.9643 | 9174 | 0.0859 |
| 4.9648 | 9175 | 0.0467 |
| 4.9654 | 9176 | 0.1088 |
| 4.9659 | 9177 | 0.0659 |
| 4.9665 | 9178 | 0.075 |
| 4.9670 | 9179 | 0.0817 |
| 4.9675 | 9180 | 0.1072 |
| 4.9681 | 9181 | 0.1094 |
| 4.9686 | 9182 | 0.0577 |
| 4.9692 | 9183 | 0.0913 |
| 4.9697 | 9184 | 0.0645 |
| 4.9702 | 9185 | 0.0883 |
| 4.9708 | 9186 | 0.0838 |
| 4.9713 | 9187 | 0.0748 |
| 4.9719 | 9188 | 0.0545 |
| 4.9724 | 9189 | 0.0532 |
| 4.9729 | 9190 | 0.051 |
| 4.9735 | 9191 | 0.0875 |
| 4.9740 | 9192 | 0.0758 |
| 4.9746 | 9193 | 0.0553 |
| 4.9751 | 9194 | 0.0716 |
| 4.9756 | 9195 | 0.0777 |
| 4.9762 | 9196 | 0.0759 |
| 4.9767 | 9197 | 0.0786 |
| 4.9773 | 9198 | 0.0802 |
| 4.9778 | 9199 | 0.0761 |
| 4.9784 | 9200 | 0.0547 |
| 4.9789 | 9201 | 0.0793 |
| 4.9794 | 9202 | 0.054 |
| 4.9800 | 9203 | 0.0818 |
| 4.9805 | 9204 | 0.0655 |
| 4.9811 | 9205 | 0.0853 |
| 4.9816 | 9206 | 0.0446 |
| 4.9821 | 9207 | 0.1635 |
| 4.9827 | 9208 | 0.1147 |
| 4.9832 | 9209 | 0.0351 |
| 4.9838 | 9210 | 0.0816 |
| 4.9843 | 9211 | 0.0865 |
| 4.9848 | 9212 | 0.0748 |
| 4.9854 | 9213 | 0.0482 |
| 4.9859 | 9214 | 0.0442 |
| 4.9865 | 9215 | 0.0661 |
| 4.9870 | 9216 | 0.1107 |
| 4.9876 | 9217 | 0.0723 |
| 4.9881 | 9218 | 0.0569 |
| 4.9886 | 9219 | 0.0694 |
| 4.9892 | 9220 | 0.0426 |
| 4.9897 | 9221 | 0.0775 |
| 4.9903 | 9222 | 0.0915 |
| 4.9908 | 9223 | 0.0734 |
| 4.9913 | 9224 | 0.0834 |
| 4.9919 | 9225 | 0.0721 |
| 4.9924 | 9226 | 0.1114 |
| 4.9930 | 9227 | 0.0847 |
| 4.9935 | 9228 | 0.0492 |
| 4.9940 | 9229 | 0.1142 |
| 4.9946 | 9230 | 0.069 |
| 4.9951 | 9231 | 0.0289 |
| 4.9957 | 9232 | 0.0412 |
| 4.9962 | 9233 | 0.0498 |
| 4.9968 | 9234 | 0.0419 |
| 4.9973 | 9235 | 0.0951 |
| 4.9978 | 9236 | 0.1606 |
| 4.9984 | 9237 | 0.0622 |
| 4.9989 | 9238 | 0.0883 |
| 4.9995 | 9239 | 0.074 |
| 5.0 | 9240 | 0.0672 |
</details>
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.2.1
- Transformers: 4.44.2
- PyTorch: 2.3.1+cu121
- Accelerate: 1.1.1
- Datasets: 2.21.0
- Tokenizers: 0.19.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
cotysong113/my_awesome_food_model | cotysong113 | 2024-11-13T09:59:54Z | 193 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"vit",
"image-classification",
"generated_from_trainer",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2024-11-13T09:55:31Z | ---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: my_awesome_food_model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_food_model
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5974
- Accuracy: 0.899
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7165 | 0.992 | 62 | 2.5197 | 0.82 |
| 1.8377 | 2.0 | 125 | 1.7734 | 0.868 |
| 1.5955 | 2.976 | 186 | 1.5974 | 0.899 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.1
|
BFS-Search/llama-3.2-3b-DoCRED_multi_rel | BFS-Search | 2024-11-13T09:58:46Z | 119 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T09:56:19Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
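Until the authors provide instructions, here is a minimal, untested sketch that assumes this checkpoint follows the standard `transformers` causal-LM and chat-template API; the prompt shown is purely hypothetical.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BFS-Search/llama-3.2-3b-DoCRED_multi_rel"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Hypothetical prompt; the intended prompt format for this fine-tune is not documented here.
messages = [{"role": "user", "content": "Extract the relations between the entities in this passage: ..."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```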
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Volko76/Qwen2.5-Coder-7B-Instruct-GGUF | Volko76 | 2024-11-13T09:56:20Z | 50 | 0 | transformers | [
"transformers",
"gguf",
"code",
"codeqwen",
"chat",
"qwen",
"qwen-coder",
"text-generation",
"en",
"arxiv:2409.12186",
"arxiv:2309.00071",
"arxiv:2407.10671",
"base_model:Qwen/Qwen2.5-Coder-7B",
"base_model:quantized:Qwen/Qwen2.5-Coder-7B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-12T11:11:35Z | ---
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct/blob/main/LICENSE
language:
- en
base_model:
- Qwen/Qwen2.5-Coder-7B
pipeline_tag: text-generation
library_name: transformers
tags:
- code
- codeqwen
- chat
- qwen
- qwen-coder
---
# Qwen2.5-Coder-7B-Instruct
## Introduction
Qwen2.5-Coder is the latest series of code-specific Qwen large language models (formerly known as CodeQwen). As of now, Qwen2.5-Coder covers six mainstream model sizes (0.5, 1.5, 3, 7, 14, and 32 billion parameters) to meet the needs of different developers. Qwen2.5-Coder brings the following improvements upon CodeQwen1.5:
- Significant improvements in **code generation**, **code reasoning** and **code fixing**. Building on the strong Qwen2.5, we scale the training tokens up to 5.5 trillion, including source code, text-code grounding, synthetic data, etc. Qwen2.5-Coder-32B has become the current state-of-the-art open-source code LLM, with coding abilities matching those of GPT-4o.
- A more comprehensive foundation for real-world applications such as **Code Agents**, not only enhancing coding capabilities but also maintaining strengths in mathematics and general competencies.
- **Long-context Support** up to 128K tokens.
**This repo contains the instruction-tuned 7B Qwen2.5-Coder model**, which has the following features:
- Type: Causal Language Models
- Training Stage: Pretraining & Post-training
- Architecture: transformers with RoPE, SwiGLU, RMSNorm, and Attention QKV bias
- Number of Parameters: 7.61B
- Number of Parameters (Non-Embedding): 6.53B
- Number of Layers: 28
- Number of Attention Heads (GQA): 28 for Q and 4 for KV
- Context Length: Full 131,072 tokens
- Please refer to [this section](#processing-long-texts) for detailed instructions on how to deploy Qwen2.5 for handling long texts.
For more details, please refer to our [blog](https://qwenlm.github.io/blog/qwen2.5-coder-family/), [GitHub](https://github.com/QwenLM/Qwen2.5-Coder), [Documentation](https://qwen.readthedocs.io/en/latest/), [Arxiv](https://arxiv.org/abs/2409.12186).
## Requirements
The code of Qwen2.5-Coder is included in the latest Hugging Face `transformers`, and we advise you to use the latest version of `transformers`.
With `transformers<4.37.0`, you will encounter the following error:
```
KeyError: 'qwen2'
```
## Quickstart
Here is a code snippet with `apply_chat_template` showing how to load the tokenizer and model and how to generate content.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "Qwen/Qwen2.5-Coder-7B-Instruct"
model = AutoModelForCausalLM.from_pretrained(
model_name,
torch_dtype="auto",
device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
prompt = "write a quick sort algorithm."
messages = [
{"role": "system", "content": "You are Qwen, created by Alibaba Cloud. You are a helpful assistant."},
{"role": "user", "content": prompt}
]
text = tokenizer.apply_chat_template(
messages,
tokenize=False,
add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)
generated_ids = model.generate(
**model_inputs,
max_new_tokens=512
)
generated_ids = [
output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
```
### Processing Long Texts
The current `config.json` is set for context length up to 32,768 tokens.
To handle extensive inputs exceeding 32,768 tokens, we utilize [YaRN](https://arxiv.org/abs/2309.00071), a technique for enhancing model length extrapolation, ensuring optimal performance on lengthy texts.
For supported frameworks, you could add the following to `config.json` to enable YaRN:
```json
{
...,
"rope_scaling": {
"factor": 4.0,
"original_max_position_embeddings": 32768,
"type": "yarn"
}
}
```
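If you would rather not edit `config.json` by hand, the same override can be applied programmatically. The sketch below is one possible approach and assumes a `transformers` version whose YaRN support accepts the keys shown above:
```python
from transformers import AutoConfig, AutoModelForCausalLM

model_name = "Qwen/Qwen2.5-Coder-7B-Instruct"

# Load the config, attach the YaRN rope_scaling entry from above, then load the model with it.
config = AutoConfig.from_pretrained(model_name)
config.rope_scaling = {
    "factor": 4.0,
    "original_max_position_embeddings": 32768,
    "type": "yarn",
}
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    config=config,
    torch_dtype="auto",
    device_map="auto",
)
```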
For deployment, we recommend using vLLM.
Please refer to our [Documentation](https://qwen.readthedocs.io/en/latest/deployment/vllm.html) for usage if you are not familiar with vLLM.
Presently, vLLM only supports static YaRN, which means the scaling factor remains constant regardless of input length, **potentially impacting performance on shorter texts**.
We advise adding the `rope_scaling` configuration only when processing long contexts is required.
## Evaluation & Performance
Detailed evaluation results are reported in this [📑 blog](https://qwenlm.github.io/blog/qwen2.5-coder-family/).
For requirements on GPU memory and the respective throughput, see results [here](https://qwen.readthedocs.io/en/latest/benchmark/speed_benchmark.html).
## Citation
If you find our work helpful, feel free to give us a cite.
```
@article{hui2024qwen2,
title={Qwen2.5-Coder Technical Report},
author={Hui, Binyuan and Yang, Jian and Cui, Zeyu and Yang, Jiaxi and Liu, Dayiheng and Zhang, Lei and Liu, Tianyu and Zhang, Jiajun and Yu, Bowen and Dang, Kai and others},
journal={arXiv preprint arXiv:2409.12186},
year={2024}
}
@article{qwen2,
title={Qwen2 Technical Report},
author={An Yang and Baosong Yang and Binyuan Hui and Bo Zheng and Bowen Yu and Chang Zhou and Chengpeng Li and Chengyuan Li and Dayiheng Liu and Fei Huang and Guanting Dong and Haoran Wei and Huan Lin and Jialong Tang and Jialin Wang and Jian Yang and Jianhong Tu and Jianwei Zhang and Jianxin Ma and Jin Xu and Jingren Zhou and Jinze Bai and Jinzheng He and Junyang Lin and Kai Dang and Keming Lu and Keqin Chen and Kexin Yang and Mei Li and Mingfeng Xue and Na Ni and Pei Zhang and Peng Wang and Ru Peng and Rui Men and Ruize Gao and Runji Lin and Shijie Wang and Shuai Bai and Sinan Tan and Tianhang Zhu and Tianhao Li and Tianyu Liu and Wenbin Ge and Xiaodong Deng and Xiaohuan Zhou and Xingzhang Ren and Xinyu Zhang and Xipin Wei and Xuancheng Ren and Yang Fan and Yang Yao and Yichang Zhang and Yu Wan and Yunfei Chu and Yuqiong Liu and Zeyu Cui and Zhenru Zhang and Zhihao Fan},
journal={arXiv preprint arXiv:2407.10671},
year={2024}
}
```
|
ksathyan/vicuna-merged | ksathyan | 2024-11-13T09:56:13Z | 8 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T09:48:44Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
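While the card does not yet include a snippet, a minimal sketch using the standard 🤗 Transformers text-generation pipeline is shown below; the prompt format expected by this checkpoint is not documented here, so treat this as an assumption rather than the authors' recommended usage.

```python
# Minimal sketch, assuming standard Transformers text-generation usage for this Llama-based checkpoint.
from transformers import pipeline

generator = pipeline("text-generation", model="ksathyan/vicuna-merged", device_map="auto")
result = generator("Explain what a merged model is.", max_new_tokens=64)
print(result[0]["generated_text"])
```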
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
arthurhzna/39class_rokok | arthurhzna | 2024-11-13T09:53:42Z | 7 | 0 | null | [
"tensorboard",
"safetensors",
"vit",
"image-classification",
"pytorch",
"huggingpics",
"model-index",
"region:us"
] | image-classification | 2024-11-13T09:53:35Z | ---
tags:
- image-classification
- pytorch
- huggingpics
metrics:
- accuracy
model-index:
- name: 39class_rokok
results:
- task:
name: Image Classification
type: image-classification
metrics:
- name: Accuracy
type: accuracy
value: 1.0
---
# 39class_rokok
Autogenerated by HuggingPics🤗🖼️
Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb).
Report any issues with the demo at the [github repo](https://github.com/nateraw/huggingpics).
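Until more documentation is added, here is a minimal inference sketch with the standard 🤗 Transformers image-classification pipeline; the input path is a placeholder.

```python
# Minimal sketch: classify an image with this HuggingPics model.
from transformers import pipeline

classifier = pipeline("image-classification", model="arthurhzna/39class_rokok")
predictions = classifier("path/or/url/to/cigarette_pack.jpg")  # placeholder input
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```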
## Example Images
#### 76 DJarum

#### 76 Mangga

#### 76 Nanas

#### Chief Blue

#### Djarum Black

#### Djarum Black Cappucino

#### Djarum Fresh Cola

#### Djarum King Filter

#### Djarum Super

#### Djarum Super Espresso

#### Forte Extra Breeze Menthol

#### Forte Manggo

#### Forte Mentol

#### Forte Original

#### Forte Vanilla

#### Garam De Luxe

#### Geo Mild

#### Gudang Garam Djaja

#### Gudang Garam GG Shiver

#### Gudang Garam Internasional

#### Gudang Garam Merah King Size

#### Gudang Garam Signature

#### Gudang Garam Signature Mild

#### Gudang Garam Surya Coklat

#### Gudang Garam Surya Merah

#### Halim Merah

#### LA Bold

#### LA Ice

#### LA Ice Manggo Boost

#### LA Ice Purple Boost

#### LA Light

#### LA Menthol

#### Mld Black

#### Raptor

#### Surya Exclusive

#### Surya Nusantara

#### Surya Pro Merah

#### Surya Pro Mild Limited Edition

#### Ziga Blue
 |
mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF | mradermacher | 2024-11-13T09:50:13Z | 65 | 0 | transformers | [
"transformers",
"gguf",
"medical",
"en",
"dataset:starmpcc/Asclepius-Synthetic-Clinical-Notes",
"base_model:starmpcc/Asclepius-Llama2-13B-Pretraining-Only",
"base_model:quantized:starmpcc/Asclepius-Llama2-13B-Pretraining-Only",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-11T21:39:09Z | ---
base_model: starmpcc/Asclepius-Llama2-13B-Pretraining-Only
datasets:
- starmpcc/Asclepius-Synthetic-Clinical-Notes
language:
- en
library_name: transformers
license: cc-by-nc-4.0
quantized_by: mradermacher
tags:
- medical
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/starmpcc/Asclepius-Llama2-13B-Pretraining-Only
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
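As a quick start, the sketch below (an assumption, not part of the original instructions) downloads a single quant with `huggingface_hub` and loads it with `llama-cpp-python`; the filename matches the Q4_K_M entry in the Provided Quants table.

```python
# Minimal sketch: fetch one GGUF quant and run it locally with llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF",
    filename="Asclepius-Llama2-13B-Pretraining-Only.i1-Q4_K_M.gguf",
)
llm = Llama(model_path=gguf_path, n_ctx=4096)
output = llm("The patient presented with", max_tokens=64)
print(output["choices"][0]["text"])
```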
## Provided Quants
(sorted by size, not necessarily quality; IQ-quants are often preferable to similarly sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ1_S.gguf) | i1-IQ1_S | 3.0 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ1_M.gguf) | i1-IQ1_M | 3.2 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ2_XS.gguf) | i1-IQ2_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ2_S.gguf) | i1-IQ2_S | 4.3 | |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ2_M.gguf) | i1-IQ2_M | 4.6 | |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q2_K.gguf) | i1-Q2_K | 5.0 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 5.1 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ3_XS.gguf) | i1-IQ3_XS | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ3_S.gguf) | i1-IQ3_S | 5.8 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q3_K_S.gguf) | i1-Q3_K_S | 5.8 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ3_M.gguf) | i1-IQ3_M | 6.1 | |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q3_K_M.gguf) | i1-Q3_K_M | 6.4 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q3_K_L.gguf) | i1-Q3_K_L | 7.0 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-IQ4_XS.gguf) | i1-IQ4_XS | 7.1 | |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q4_0.gguf) | i1-Q4_0 | 7.5 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q4_K_S.gguf) | i1-Q4_K_S | 7.5 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q4_K_M.gguf) | i1-Q4_K_M | 8.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q5_K_S.gguf) | i1-Q5_K_S | 9.1 | |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q5_K_M.gguf) | i1-Q5_K_M | 9.3 | |
| [GGUF](https://huggingface.co/mradermacher/Asclepius-Llama2-13B-Pretraining-Only-i1-GGUF/resolve/main/Asclepius-Llama2-13B-Pretraining-Only.i1-Q6_K.gguf) | i1-Q6_K | 10.8 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
Diplomkaazvposlednimsemestru/gemma-2-2b-alternative | Diplomkaazvposlednimsemestru | 2024-11-13T09:49:39Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"gemma",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T09:47:28Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
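In the absence of an official snippet, the sketch below assumes the standard 🤗 Transformers causal-LM API; nothing in it is specific to this checkpoint beyond the repo id.

```python
# Minimal sketch, assuming standard Transformers usage for this Gemma-architecture checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Diplomkaazvposlednimsemestru/gemma-2-2b-alternative"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

inputs = tokenizer("Write a haiku about autumn.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```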
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
tomaarsen/bge-small-en-v1.5-copy | tomaarsen | 2024-11-13T09:48:07Z | 11 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"mteb",
"en",
"arxiv:2401.03462",
"arxiv:2312.15503",
"arxiv:2311.13534",
"arxiv:2310.07554",
"arxiv:2309.07597",
"license:mit",
"model-index",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2024-11-13T09:47:58Z | ---
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
- mteb
model-index:
- name: bge-small-en-v1.5
results:
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en)
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 73.79104477611939
- type: ap
value: 37.21923821573361
- type: f1
value: 68.0914945617093
- task:
type: Classification
dataset:
type: mteb/amazon_polarity
name: MTEB AmazonPolarityClassification
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 92.75377499999999
- type: ap
value: 89.46766124546022
- type: f1
value: 92.73884001331487
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (en)
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 46.986
- type: f1
value: 46.55936786727896
- task:
type: Retrieval
dataset:
type: arguana
name: MTEB ArguAna
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 35.846000000000004
- type: map_at_10
value: 51.388
- type: map_at_100
value: 52.132999999999996
- type: map_at_1000
value: 52.141000000000005
- type: map_at_3
value: 47.037
- type: map_at_5
value: 49.579
- type: mrr_at_1
value: 36.558
- type: mrr_at_10
value: 51.658
- type: mrr_at_100
value: 52.402
- type: mrr_at_1000
value: 52.410000000000004
- type: mrr_at_3
value: 47.345
- type: mrr_at_5
value: 49.797999999999995
- type: ndcg_at_1
value: 35.846000000000004
- type: ndcg_at_10
value: 59.550000000000004
- type: ndcg_at_100
value: 62.596
- type: ndcg_at_1000
value: 62.759
- type: ndcg_at_3
value: 50.666999999999994
- type: ndcg_at_5
value: 55.228
- type: precision_at_1
value: 35.846000000000004
- type: precision_at_10
value: 8.542
- type: precision_at_100
value: 0.984
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 20.389
- type: precision_at_5
value: 14.438
- type: recall_at_1
value: 35.846000000000004
- type: recall_at_10
value: 85.42
- type: recall_at_100
value: 98.43499999999999
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 61.166
- type: recall_at_5
value: 72.191
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-p2p
name: MTEB ArxivClusteringP2P
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 47.402770198163594
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-s2s
name: MTEB ArxivClusteringS2S
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 40.01545436974177
- task:
type: Reranking
dataset:
type: mteb/askubuntudupquestions-reranking
name: MTEB AskUbuntuDupQuestions
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 62.586465273207196
- type: mrr
value: 74.42169019038825
- task:
type: STS
dataset:
type: mteb/biosses-sts
name: MTEB BIOSSES
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 85.1891186537969
- type: cos_sim_spearman
value: 83.75492046087288
- type: euclidean_pearson
value: 84.11766204805357
- type: euclidean_spearman
value: 84.01456493126516
- type: manhattan_pearson
value: 84.2132950502772
- type: manhattan_spearman
value: 83.89227298813377
- task:
type: Classification
dataset:
type: mteb/banking77
name: MTEB Banking77Classification
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 85.74025974025975
- type: f1
value: 85.71493566466381
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-p2p
name: MTEB BiorxivClusteringP2P
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.467181385006434
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-s2s
name: MTEB BiorxivClusteringS2S
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 34.719496037339056
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackAndroidRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.587000000000003
- type: map_at_10
value: 41.114
- type: map_at_100
value: 42.532
- type: map_at_1000
value: 42.661
- type: map_at_3
value: 37.483
- type: map_at_5
value: 39.652
- type: mrr_at_1
value: 36.338
- type: mrr_at_10
value: 46.763
- type: mrr_at_100
value: 47.393
- type: mrr_at_1000
value: 47.445
- type: mrr_at_3
value: 43.538
- type: mrr_at_5
value: 45.556000000000004
- type: ndcg_at_1
value: 36.338
- type: ndcg_at_10
value: 47.658
- type: ndcg_at_100
value: 52.824000000000005
- type: ndcg_at_1000
value: 54.913999999999994
- type: ndcg_at_3
value: 41.989
- type: ndcg_at_5
value: 44.944
- type: precision_at_1
value: 36.338
- type: precision_at_10
value: 9.156
- type: precision_at_100
value: 1.4789999999999999
- type: precision_at_1000
value: 0.196
- type: precision_at_3
value: 20.076
- type: precision_at_5
value: 14.85
- type: recall_at_1
value: 29.587000000000003
- type: recall_at_10
value: 60.746
- type: recall_at_100
value: 82.157
- type: recall_at_1000
value: 95.645
- type: recall_at_3
value: 44.821
- type: recall_at_5
value: 52.819
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackEnglishRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 30.239
- type: map_at_10
value: 39.989000000000004
- type: map_at_100
value: 41.196
- type: map_at_1000
value: 41.325
- type: map_at_3
value: 37.261
- type: map_at_5
value: 38.833
- type: mrr_at_1
value: 37.516
- type: mrr_at_10
value: 46.177
- type: mrr_at_100
value: 46.806
- type: mrr_at_1000
value: 46.849000000000004
- type: mrr_at_3
value: 44.002
- type: mrr_at_5
value: 45.34
- type: ndcg_at_1
value: 37.516
- type: ndcg_at_10
value: 45.586
- type: ndcg_at_100
value: 49.897000000000006
- type: ndcg_at_1000
value: 51.955
- type: ndcg_at_3
value: 41.684
- type: ndcg_at_5
value: 43.617
- type: precision_at_1
value: 37.516
- type: precision_at_10
value: 8.522
- type: precision_at_100
value: 1.374
- type: precision_at_1000
value: 0.184
- type: precision_at_3
value: 20.105999999999998
- type: precision_at_5
value: 14.152999999999999
- type: recall_at_1
value: 30.239
- type: recall_at_10
value: 55.03
- type: recall_at_100
value: 73.375
- type: recall_at_1000
value: 86.29599999999999
- type: recall_at_3
value: 43.269000000000005
- type: recall_at_5
value: 48.878
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGamingRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 38.338
- type: map_at_10
value: 50.468999999999994
- type: map_at_100
value: 51.553000000000004
- type: map_at_1000
value: 51.608
- type: map_at_3
value: 47.107
- type: map_at_5
value: 49.101
- type: mrr_at_1
value: 44.201
- type: mrr_at_10
value: 54.057
- type: mrr_at_100
value: 54.764
- type: mrr_at_1000
value: 54.791000000000004
- type: mrr_at_3
value: 51.56699999999999
- type: mrr_at_5
value: 53.05
- type: ndcg_at_1
value: 44.201
- type: ndcg_at_10
value: 56.379000000000005
- type: ndcg_at_100
value: 60.645
- type: ndcg_at_1000
value: 61.73499999999999
- type: ndcg_at_3
value: 50.726000000000006
- type: ndcg_at_5
value: 53.58500000000001
- type: precision_at_1
value: 44.201
- type: precision_at_10
value: 9.141
- type: precision_at_100
value: 1.216
- type: precision_at_1000
value: 0.135
- type: precision_at_3
value: 22.654
- type: precision_at_5
value: 15.723999999999998
- type: recall_at_1
value: 38.338
- type: recall_at_10
value: 70.30499999999999
- type: recall_at_100
value: 88.77199999999999
- type: recall_at_1000
value: 96.49799999999999
- type: recall_at_3
value: 55.218
- type: recall_at_5
value: 62.104000000000006
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGisRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.682
- type: map_at_10
value: 33.498
- type: map_at_100
value: 34.461000000000006
- type: map_at_1000
value: 34.544000000000004
- type: map_at_3
value: 30.503999999999998
- type: map_at_5
value: 32.216
- type: mrr_at_1
value: 27.683999999999997
- type: mrr_at_10
value: 35.467999999999996
- type: mrr_at_100
value: 36.32
- type: mrr_at_1000
value: 36.386
- type: mrr_at_3
value: 32.618
- type: mrr_at_5
value: 34.262
- type: ndcg_at_1
value: 27.683999999999997
- type: ndcg_at_10
value: 38.378
- type: ndcg_at_100
value: 43.288
- type: ndcg_at_1000
value: 45.413
- type: ndcg_at_3
value: 32.586
- type: ndcg_at_5
value: 35.499
- type: precision_at_1
value: 27.683999999999997
- type: precision_at_10
value: 5.864
- type: precision_at_100
value: 0.882
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 13.446
- type: precision_at_5
value: 9.718
- type: recall_at_1
value: 25.682
- type: recall_at_10
value: 51.712
- type: recall_at_100
value: 74.446
- type: recall_at_1000
value: 90.472
- type: recall_at_3
value: 36.236000000000004
- type: recall_at_5
value: 43.234
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackMathematicaRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.073999999999998
- type: map_at_10
value: 24.352999999999998
- type: map_at_100
value: 25.438
- type: map_at_1000
value: 25.545
- type: map_at_3
value: 21.614
- type: map_at_5
value: 23.104
- type: mrr_at_1
value: 19.776
- type: mrr_at_10
value: 28.837000000000003
- type: mrr_at_100
value: 29.755
- type: mrr_at_1000
value: 29.817
- type: mrr_at_3
value: 26.201999999999998
- type: mrr_at_5
value: 27.714
- type: ndcg_at_1
value: 19.776
- type: ndcg_at_10
value: 29.701
- type: ndcg_at_100
value: 35.307
- type: ndcg_at_1000
value: 37.942
- type: ndcg_at_3
value: 24.764
- type: ndcg_at_5
value: 27.025
- type: precision_at_1
value: 19.776
- type: precision_at_10
value: 5.659
- type: precision_at_100
value: 0.971
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 12.065
- type: precision_at_5
value: 8.905000000000001
- type: recall_at_1
value: 16.073999999999998
- type: recall_at_10
value: 41.647
- type: recall_at_100
value: 66.884
- type: recall_at_1000
value: 85.91499999999999
- type: recall_at_3
value: 27.916
- type: recall_at_5
value: 33.729
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackPhysicsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.444999999999997
- type: map_at_10
value: 38.218999999999994
- type: map_at_100
value: 39.595
- type: map_at_1000
value: 39.709
- type: map_at_3
value: 35.586
- type: map_at_5
value: 36.895
- type: mrr_at_1
value: 34.841
- type: mrr_at_10
value: 44.106
- type: mrr_at_100
value: 44.98
- type: mrr_at_1000
value: 45.03
- type: mrr_at_3
value: 41.979
- type: mrr_at_5
value: 43.047999999999995
- type: ndcg_at_1
value: 34.841
- type: ndcg_at_10
value: 43.922
- type: ndcg_at_100
value: 49.504999999999995
- type: ndcg_at_1000
value: 51.675000000000004
- type: ndcg_at_3
value: 39.858
- type: ndcg_at_5
value: 41.408
- type: precision_at_1
value: 34.841
- type: precision_at_10
value: 7.872999999999999
- type: precision_at_100
value: 1.2449999999999999
- type: precision_at_1000
value: 0.161
- type: precision_at_3
value: 18.993
- type: precision_at_5
value: 13.032
- type: recall_at_1
value: 28.444999999999997
- type: recall_at_10
value: 54.984
- type: recall_at_100
value: 78.342
- type: recall_at_1000
value: 92.77
- type: recall_at_3
value: 42.842999999999996
- type: recall_at_5
value: 47.247
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackProgrammersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.072
- type: map_at_10
value: 32.354
- type: map_at_100
value: 33.800000000000004
- type: map_at_1000
value: 33.908
- type: map_at_3
value: 29.232000000000003
- type: map_at_5
value: 31.049
- type: mrr_at_1
value: 29.110000000000003
- type: mrr_at_10
value: 38.03
- type: mrr_at_100
value: 39.032
- type: mrr_at_1000
value: 39.086999999999996
- type: mrr_at_3
value: 35.407
- type: mrr_at_5
value: 36.76
- type: ndcg_at_1
value: 29.110000000000003
- type: ndcg_at_10
value: 38.231
- type: ndcg_at_100
value: 44.425
- type: ndcg_at_1000
value: 46.771
- type: ndcg_at_3
value: 33.095
- type: ndcg_at_5
value: 35.459
- type: precision_at_1
value: 29.110000000000003
- type: precision_at_10
value: 7.215000000000001
- type: precision_at_100
value: 1.2109999999999999
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 16.058
- type: precision_at_5
value: 11.644
- type: recall_at_1
value: 23.072
- type: recall_at_10
value: 50.285999999999994
- type: recall_at_100
value: 76.596
- type: recall_at_1000
value: 92.861
- type: recall_at_3
value: 35.702
- type: recall_at_5
value: 42.152
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.937916666666666
- type: map_at_10
value: 33.755250000000004
- type: map_at_100
value: 34.955999999999996
- type: map_at_1000
value: 35.070499999999996
- type: map_at_3
value: 30.98708333333333
- type: map_at_5
value: 32.51491666666666
- type: mrr_at_1
value: 29.48708333333333
- type: mrr_at_10
value: 37.92183333333334
- type: mrr_at_100
value: 38.76583333333333
- type: mrr_at_1000
value: 38.82466666666667
- type: mrr_at_3
value: 35.45125
- type: mrr_at_5
value: 36.827000000000005
- type: ndcg_at_1
value: 29.48708333333333
- type: ndcg_at_10
value: 39.05225
- type: ndcg_at_100
value: 44.25983333333334
- type: ndcg_at_1000
value: 46.568333333333335
- type: ndcg_at_3
value: 34.271583333333325
- type: ndcg_at_5
value: 36.483916666666666
- type: precision_at_1
value: 29.48708333333333
- type: precision_at_10
value: 6.865749999999999
- type: precision_at_100
value: 1.1195833333333332
- type: precision_at_1000
value: 0.15058333333333335
- type: precision_at_3
value: 15.742083333333333
- type: precision_at_5
value: 11.221916666666667
- type: recall_at_1
value: 24.937916666666666
- type: recall_at_10
value: 50.650416666666665
- type: recall_at_100
value: 73.55383333333334
- type: recall_at_1000
value: 89.61691666666667
- type: recall_at_3
value: 37.27808333333334
- type: recall_at_5
value: 42.99475
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackStatsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.947
- type: map_at_10
value: 30.575000000000003
- type: map_at_100
value: 31.465
- type: map_at_1000
value: 31.558000000000003
- type: map_at_3
value: 28.814
- type: map_at_5
value: 29.738999999999997
- type: mrr_at_1
value: 26.994
- type: mrr_at_10
value: 33.415
- type: mrr_at_100
value: 34.18
- type: mrr_at_1000
value: 34.245
- type: mrr_at_3
value: 31.621
- type: mrr_at_5
value: 32.549
- type: ndcg_at_1
value: 26.994
- type: ndcg_at_10
value: 34.482
- type: ndcg_at_100
value: 38.915
- type: ndcg_at_1000
value: 41.355
- type: ndcg_at_3
value: 31.139
- type: ndcg_at_5
value: 32.589
- type: precision_at_1
value: 26.994
- type: precision_at_10
value: 5.322
- type: precision_at_100
value: 0.8160000000000001
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 13.344000000000001
- type: precision_at_5
value: 8.988
- type: recall_at_1
value: 23.947
- type: recall_at_10
value: 43.647999999999996
- type: recall_at_100
value: 63.851
- type: recall_at_1000
value: 82.0
- type: recall_at_3
value: 34.288000000000004
- type: recall_at_5
value: 38.117000000000004
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackTexRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.197
- type: map_at_10
value: 22.968
- type: map_at_100
value: 24.095
- type: map_at_1000
value: 24.217
- type: map_at_3
value: 20.771
- type: map_at_5
value: 21.995
- type: mrr_at_1
value: 19.511
- type: mrr_at_10
value: 26.55
- type: mrr_at_100
value: 27.500999999999998
- type: mrr_at_1000
value: 27.578999999999997
- type: mrr_at_3
value: 24.421
- type: mrr_at_5
value: 25.604
- type: ndcg_at_1
value: 19.511
- type: ndcg_at_10
value: 27.386
- type: ndcg_at_100
value: 32.828
- type: ndcg_at_1000
value: 35.739
- type: ndcg_at_3
value: 23.405
- type: ndcg_at_5
value: 25.255
- type: precision_at_1
value: 19.511
- type: precision_at_10
value: 5.017
- type: precision_at_100
value: 0.91
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 11.023
- type: precision_at_5
value: 8.025
- type: recall_at_1
value: 16.197
- type: recall_at_10
value: 37.09
- type: recall_at_100
value: 61.778
- type: recall_at_1000
value: 82.56599999999999
- type: recall_at_3
value: 26.034000000000002
- type: recall_at_5
value: 30.762
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackUnixRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.41
- type: map_at_10
value: 33.655
- type: map_at_100
value: 34.892
- type: map_at_1000
value: 34.995
- type: map_at_3
value: 30.94
- type: map_at_5
value: 32.303
- type: mrr_at_1
value: 29.477999999999998
- type: mrr_at_10
value: 37.443
- type: mrr_at_100
value: 38.383
- type: mrr_at_1000
value: 38.440000000000005
- type: mrr_at_3
value: 34.949999999999996
- type: mrr_at_5
value: 36.228
- type: ndcg_at_1
value: 29.477999999999998
- type: ndcg_at_10
value: 38.769
- type: ndcg_at_100
value: 44.245000000000005
- type: ndcg_at_1000
value: 46.593
- type: ndcg_at_3
value: 33.623
- type: ndcg_at_5
value: 35.766
- type: precision_at_1
value: 29.477999999999998
- type: precision_at_10
value: 6.455
- type: precision_at_100
value: 1.032
- type: precision_at_1000
value: 0.135
- type: precision_at_3
value: 14.893999999999998
- type: precision_at_5
value: 10.485
- type: recall_at_1
value: 25.41
- type: recall_at_10
value: 50.669
- type: recall_at_100
value: 74.084
- type: recall_at_1000
value: 90.435
- type: recall_at_3
value: 36.679
- type: recall_at_5
value: 41.94
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWebmastersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.339
- type: map_at_10
value: 31.852000000000004
- type: map_at_100
value: 33.411
- type: map_at_1000
value: 33.62
- type: map_at_3
value: 28.929
- type: map_at_5
value: 30.542
- type: mrr_at_1
value: 28.063
- type: mrr_at_10
value: 36.301
- type: mrr_at_100
value: 37.288
- type: mrr_at_1000
value: 37.349
- type: mrr_at_3
value: 33.663
- type: mrr_at_5
value: 35.165
- type: ndcg_at_1
value: 28.063
- type: ndcg_at_10
value: 37.462
- type: ndcg_at_100
value: 43.620999999999995
- type: ndcg_at_1000
value: 46.211
- type: ndcg_at_3
value: 32.68
- type: ndcg_at_5
value: 34.981
- type: precision_at_1
value: 28.063
- type: precision_at_10
value: 7.1739999999999995
- type: precision_at_100
value: 1.486
- type: precision_at_1000
value: 0.23500000000000001
- type: precision_at_3
value: 15.217
- type: precision_at_5
value: 11.265
- type: recall_at_1
value: 23.339
- type: recall_at_10
value: 48.376999999999995
- type: recall_at_100
value: 76.053
- type: recall_at_1000
value: 92.455
- type: recall_at_3
value: 34.735
- type: recall_at_5
value: 40.71
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWordpressRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 18.925
- type: map_at_10
value: 26.017000000000003
- type: map_at_100
value: 27.034000000000002
- type: map_at_1000
value: 27.156000000000002
- type: map_at_3
value: 23.604
- type: map_at_5
value: 24.75
- type: mrr_at_1
value: 20.333000000000002
- type: mrr_at_10
value: 27.915
- type: mrr_at_100
value: 28.788000000000004
- type: mrr_at_1000
value: 28.877999999999997
- type: mrr_at_3
value: 25.446999999999996
- type: mrr_at_5
value: 26.648
- type: ndcg_at_1
value: 20.333000000000002
- type: ndcg_at_10
value: 30.673000000000002
- type: ndcg_at_100
value: 35.618
- type: ndcg_at_1000
value: 38.517
- type: ndcg_at_3
value: 25.71
- type: ndcg_at_5
value: 27.679
- type: precision_at_1
value: 20.333000000000002
- type: precision_at_10
value: 4.9910000000000005
- type: precision_at_100
value: 0.8130000000000001
- type: precision_at_1000
value: 0.117
- type: precision_at_3
value: 11.029
- type: precision_at_5
value: 7.8740000000000006
- type: recall_at_1
value: 18.925
- type: recall_at_10
value: 43.311
- type: recall_at_100
value: 66.308
- type: recall_at_1000
value: 87.49
- type: recall_at_3
value: 29.596
- type: recall_at_5
value: 34.245
- task:
type: Retrieval
dataset:
type: climate-fever
name: MTEB ClimateFEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 13.714
- type: map_at_10
value: 23.194
- type: map_at_100
value: 24.976000000000003
- type: map_at_1000
value: 25.166
- type: map_at_3
value: 19.709
- type: map_at_5
value: 21.523999999999997
- type: mrr_at_1
value: 30.619000000000003
- type: mrr_at_10
value: 42.563
- type: mrr_at_100
value: 43.386
- type: mrr_at_1000
value: 43.423
- type: mrr_at_3
value: 39.555
- type: mrr_at_5
value: 41.268
- type: ndcg_at_1
value: 30.619000000000003
- type: ndcg_at_10
value: 31.836
- type: ndcg_at_100
value: 38.652
- type: ndcg_at_1000
value: 42.088
- type: ndcg_at_3
value: 26.733
- type: ndcg_at_5
value: 28.435
- type: precision_at_1
value: 30.619000000000003
- type: precision_at_10
value: 9.751999999999999
- type: precision_at_100
value: 1.71
- type: precision_at_1000
value: 0.23500000000000001
- type: precision_at_3
value: 19.935
- type: precision_at_5
value: 14.984
- type: recall_at_1
value: 13.714
- type: recall_at_10
value: 37.26
- type: recall_at_100
value: 60.546
- type: recall_at_1000
value: 79.899
- type: recall_at_3
value: 24.325
- type: recall_at_5
value: 29.725
- task:
type: Retrieval
dataset:
type: dbpedia-entity
name: MTEB DBPedia
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.462
- type: map_at_10
value: 18.637
- type: map_at_100
value: 26.131999999999998
- type: map_at_1000
value: 27.607
- type: map_at_3
value: 13.333
- type: map_at_5
value: 15.654000000000002
- type: mrr_at_1
value: 66.25
- type: mrr_at_10
value: 74.32600000000001
- type: mrr_at_100
value: 74.60900000000001
- type: mrr_at_1000
value: 74.62
- type: mrr_at_3
value: 72.667
- type: mrr_at_5
value: 73.817
- type: ndcg_at_1
value: 53.87499999999999
- type: ndcg_at_10
value: 40.028999999999996
- type: ndcg_at_100
value: 44.199
- type: ndcg_at_1000
value: 51.629999999999995
- type: ndcg_at_3
value: 44.113
- type: ndcg_at_5
value: 41.731
- type: precision_at_1
value: 66.25
- type: precision_at_10
value: 31.900000000000002
- type: precision_at_100
value: 10.043000000000001
- type: precision_at_1000
value: 1.926
- type: precision_at_3
value: 47.417
- type: precision_at_5
value: 40.65
- type: recall_at_1
value: 8.462
- type: recall_at_10
value: 24.293
- type: recall_at_100
value: 50.146
- type: recall_at_1000
value: 74.034
- type: recall_at_3
value: 14.967
- type: recall_at_5
value: 18.682000000000002
- task:
type: Classification
dataset:
type: mteb/emotion
name: MTEB EmotionClassification
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 47.84499999999999
- type: f1
value: 42.48106691979349
- task:
type: Retrieval
dataset:
type: fever
name: MTEB FEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 74.034
- type: map_at_10
value: 82.76
- type: map_at_100
value: 82.968
- type: map_at_1000
value: 82.98299999999999
- type: map_at_3
value: 81.768
- type: map_at_5
value: 82.418
- type: mrr_at_1
value: 80.048
- type: mrr_at_10
value: 87.64999999999999
- type: mrr_at_100
value: 87.712
- type: mrr_at_1000
value: 87.713
- type: mrr_at_3
value: 87.01100000000001
- type: mrr_at_5
value: 87.466
- type: ndcg_at_1
value: 80.048
- type: ndcg_at_10
value: 86.643
- type: ndcg_at_100
value: 87.361
- type: ndcg_at_1000
value: 87.606
- type: ndcg_at_3
value: 85.137
- type: ndcg_at_5
value: 86.016
- type: precision_at_1
value: 80.048
- type: precision_at_10
value: 10.372
- type: precision_at_100
value: 1.093
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 32.638
- type: precision_at_5
value: 20.177
- type: recall_at_1
value: 74.034
- type: recall_at_10
value: 93.769
- type: recall_at_100
value: 96.569
- type: recall_at_1000
value: 98.039
- type: recall_at_3
value: 89.581
- type: recall_at_5
value: 91.906
- task:
type: Retrieval
dataset:
type: fiqa
name: MTEB FiQA2018
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 20.5
- type: map_at_10
value: 32.857
- type: map_at_100
value: 34.589
- type: map_at_1000
value: 34.778
- type: map_at_3
value: 29.160999999999998
- type: map_at_5
value: 31.033
- type: mrr_at_1
value: 40.123
- type: mrr_at_10
value: 48.776
- type: mrr_at_100
value: 49.495
- type: mrr_at_1000
value: 49.539
- type: mrr_at_3
value: 46.605000000000004
- type: mrr_at_5
value: 47.654
- type: ndcg_at_1
value: 40.123
- type: ndcg_at_10
value: 40.343
- type: ndcg_at_100
value: 46.56
- type: ndcg_at_1000
value: 49.777
- type: ndcg_at_3
value: 37.322
- type: ndcg_at_5
value: 37.791000000000004
- type: precision_at_1
value: 40.123
- type: precision_at_10
value: 11.08
- type: precision_at_100
value: 1.752
- type: precision_at_1000
value: 0.232
- type: precision_at_3
value: 24.897
- type: precision_at_5
value: 17.809
- type: recall_at_1
value: 20.5
- type: recall_at_10
value: 46.388
- type: recall_at_100
value: 69.552
- type: recall_at_1000
value: 89.011
- type: recall_at_3
value: 33.617999999999995
- type: recall_at_5
value: 38.211
- task:
type: Retrieval
dataset:
type: hotpotqa
name: MTEB HotpotQA
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.135999999999996
- type: map_at_10
value: 61.673
- type: map_at_100
value: 62.562
- type: map_at_1000
value: 62.62
- type: map_at_3
value: 58.467999999999996
- type: map_at_5
value: 60.463
- type: mrr_at_1
value: 78.271
- type: mrr_at_10
value: 84.119
- type: mrr_at_100
value: 84.29299999999999
- type: mrr_at_1000
value: 84.299
- type: mrr_at_3
value: 83.18900000000001
- type: mrr_at_5
value: 83.786
- type: ndcg_at_1
value: 78.271
- type: ndcg_at_10
value: 69.935
- type: ndcg_at_100
value: 73.01299999999999
- type: ndcg_at_1000
value: 74.126
- type: ndcg_at_3
value: 65.388
- type: ndcg_at_5
value: 67.906
- type: precision_at_1
value: 78.271
- type: precision_at_10
value: 14.562
- type: precision_at_100
value: 1.6969999999999998
- type: precision_at_1000
value: 0.184
- type: precision_at_3
value: 41.841
- type: precision_at_5
value: 27.087
- type: recall_at_1
value: 39.135999999999996
- type: recall_at_10
value: 72.809
- type: recall_at_100
value: 84.86200000000001
- type: recall_at_1000
value: 92.208
- type: recall_at_3
value: 62.76199999999999
- type: recall_at_5
value: 67.718
- task:
type: Classification
dataset:
type: mteb/imdb
name: MTEB ImdbClassification
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 90.60600000000001
- type: ap
value: 86.6579587804335
- type: f1
value: 90.5938853929307
- task:
type: Retrieval
dataset:
type: msmarco
name: MTEB MSMARCO
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 21.852
- type: map_at_10
value: 33.982
- type: map_at_100
value: 35.116
- type: map_at_1000
value: 35.167
- type: map_at_3
value: 30.134
- type: map_at_5
value: 32.340999999999994
- type: mrr_at_1
value: 22.479
- type: mrr_at_10
value: 34.594
- type: mrr_at_100
value: 35.672
- type: mrr_at_1000
value: 35.716
- type: mrr_at_3
value: 30.84
- type: mrr_at_5
value: 32.998
- type: ndcg_at_1
value: 22.493
- type: ndcg_at_10
value: 40.833000000000006
- type: ndcg_at_100
value: 46.357
- type: ndcg_at_1000
value: 47.637
- type: ndcg_at_3
value: 32.995999999999995
- type: ndcg_at_5
value: 36.919000000000004
- type: precision_at_1
value: 22.493
- type: precision_at_10
value: 6.465999999999999
- type: precision_at_100
value: 0.9249999999999999
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 14.030999999999999
- type: precision_at_5
value: 10.413
- type: recall_at_1
value: 21.852
- type: recall_at_10
value: 61.934999999999995
- type: recall_at_100
value: 87.611
- type: recall_at_1000
value: 97.441
- type: recall_at_3
value: 40.583999999999996
- type: recall_at_5
value: 49.992999999999995
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (en)
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.36069311445507
- type: f1
value: 93.16456330371453
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (en)
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 74.74692202462381
- type: f1
value: 58.17903579421599
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (en)
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 74.80833893745796
- type: f1
value: 72.70786592684664
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (en)
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 78.69872225958305
- type: f1
value: 78.61626934504731
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-p2p
name: MTEB MedrxivClusteringP2P
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.058658628717694
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-s2s
name: MTEB MedrxivClusteringS2S
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 30.85561739360599
- task:
type: Reranking
dataset:
type: mteb/mind_small
name: MTEB MindSmallReranking
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.290259910144385
- type: mrr
value: 32.44223046102856
- task:
type: Retrieval
dataset:
type: nfcorpus
name: MTEB NFCorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.288
- type: map_at_10
value: 12.267999999999999
- type: map_at_100
value: 15.557000000000002
- type: map_at_1000
value: 16.98
- type: map_at_3
value: 8.866
- type: map_at_5
value: 10.418
- type: mrr_at_1
value: 43.653
- type: mrr_at_10
value: 52.681
- type: mrr_at_100
value: 53.315999999999995
- type: mrr_at_1000
value: 53.357
- type: mrr_at_3
value: 51.393
- type: mrr_at_5
value: 51.903999999999996
- type: ndcg_at_1
value: 42.415000000000006
- type: ndcg_at_10
value: 34.305
- type: ndcg_at_100
value: 30.825999999999997
- type: ndcg_at_1000
value: 39.393
- type: ndcg_at_3
value: 39.931
- type: ndcg_at_5
value: 37.519999999999996
- type: precision_at_1
value: 43.653
- type: precision_at_10
value: 25.728
- type: precision_at_100
value: 7.932
- type: precision_at_1000
value: 2.07
- type: precision_at_3
value: 38.184000000000005
- type: precision_at_5
value: 32.879000000000005
- type: recall_at_1
value: 5.288
- type: recall_at_10
value: 16.195
- type: recall_at_100
value: 31.135
- type: recall_at_1000
value: 61.531000000000006
- type: recall_at_3
value: 10.313
- type: recall_at_5
value: 12.754999999999999
- task:
type: Retrieval
dataset:
type: nq
name: MTEB NQ
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.216
- type: map_at_10
value: 42.588
- type: map_at_100
value: 43.702999999999996
- type: map_at_1000
value: 43.739
- type: map_at_3
value: 38.177
- type: map_at_5
value: 40.754000000000005
- type: mrr_at_1
value: 31.866
- type: mrr_at_10
value: 45.189
- type: mrr_at_100
value: 46.056000000000004
- type: mrr_at_1000
value: 46.081
- type: mrr_at_3
value: 41.526999999999994
- type: mrr_at_5
value: 43.704
- type: ndcg_at_1
value: 31.837
- type: ndcg_at_10
value: 50.178
- type: ndcg_at_100
value: 54.98800000000001
- type: ndcg_at_1000
value: 55.812
- type: ndcg_at_3
value: 41.853
- type: ndcg_at_5
value: 46.153
- type: precision_at_1
value: 31.837
- type: precision_at_10
value: 8.43
- type: precision_at_100
value: 1.1119999999999999
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_3
value: 19.023
- type: precision_at_5
value: 13.911000000000001
- type: recall_at_1
value: 28.216
- type: recall_at_10
value: 70.8
- type: recall_at_100
value: 91.857
- type: recall_at_1000
value: 97.941
- type: recall_at_3
value: 49.196
- type: recall_at_5
value: 59.072
- task:
type: Retrieval
dataset:
type: quora
name: MTEB QuoraRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 71.22800000000001
- type: map_at_10
value: 85.115
- type: map_at_100
value: 85.72
- type: map_at_1000
value: 85.737
- type: map_at_3
value: 82.149
- type: map_at_5
value: 84.029
- type: mrr_at_1
value: 81.96
- type: mrr_at_10
value: 88.00200000000001
- type: mrr_at_100
value: 88.088
- type: mrr_at_1000
value: 88.089
- type: mrr_at_3
value: 87.055
- type: mrr_at_5
value: 87.715
- type: ndcg_at_1
value: 82.01
- type: ndcg_at_10
value: 88.78
- type: ndcg_at_100
value: 89.91
- type: ndcg_at_1000
value: 90.013
- type: ndcg_at_3
value: 85.957
- type: ndcg_at_5
value: 87.56
- type: precision_at_1
value: 82.01
- type: precision_at_10
value: 13.462
- type: precision_at_100
value: 1.528
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.553
- type: precision_at_5
value: 24.732000000000003
- type: recall_at_1
value: 71.22800000000001
- type: recall_at_10
value: 95.69
- type: recall_at_100
value: 99.531
- type: recall_at_1000
value: 99.98
- type: recall_at_3
value: 87.632
- type: recall_at_5
value: 92.117
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering
name: MTEB RedditClustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 52.31768034366916
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering-p2p
name: MTEB RedditClusteringP2P
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 60.640266772723606
- task:
type: Retrieval
dataset:
type: scidocs
name: MTEB SCIDOCS
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.7780000000000005
- type: map_at_10
value: 12.299
- type: map_at_100
value: 14.363000000000001
- type: map_at_1000
value: 14.71
- type: map_at_3
value: 8.738999999999999
- type: map_at_5
value: 10.397
- type: mrr_at_1
value: 23.599999999999998
- type: mrr_at_10
value: 34.845
- type: mrr_at_100
value: 35.916
- type: mrr_at_1000
value: 35.973
- type: mrr_at_3
value: 31.7
- type: mrr_at_5
value: 33.535
- type: ndcg_at_1
value: 23.599999999999998
- type: ndcg_at_10
value: 20.522000000000002
- type: ndcg_at_100
value: 28.737000000000002
- type: ndcg_at_1000
value: 34.596
- type: ndcg_at_3
value: 19.542
- type: ndcg_at_5
value: 16.958000000000002
- type: precision_at_1
value: 23.599999999999998
- type: precision_at_10
value: 10.67
- type: precision_at_100
value: 2.259
- type: precision_at_1000
value: 0.367
- type: precision_at_3
value: 18.333
- type: precision_at_5
value: 14.879999999999999
- type: recall_at_1
value: 4.7780000000000005
- type: recall_at_10
value: 21.617
- type: recall_at_100
value: 45.905
- type: recall_at_1000
value: 74.42
- type: recall_at_3
value: 11.148
- type: recall_at_5
value: 15.082999999999998
- task:
type: STS
dataset:
type: mteb/sickr-sts
name: MTEB SICK-R
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 83.22372750297885
- type: cos_sim_spearman
value: 79.40972617119405
- type: euclidean_pearson
value: 80.6101072020434
- type: euclidean_spearman
value: 79.53844217225202
- type: manhattan_pearson
value: 80.57265975286111
- type: manhattan_spearman
value: 79.46335611792958
- task:
type: STS
dataset:
type: mteb/sts12-sts
name: MTEB STS12
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 85.43713315520749
- type: cos_sim_spearman
value: 77.44128693329532
- type: euclidean_pearson
value: 81.63869928101123
- type: euclidean_spearman
value: 77.29512977961515
- type: manhattan_pearson
value: 81.63704185566183
- type: manhattan_spearman
value: 77.29909412738657
- task:
type: STS
dataset:
type: mteb/sts13-sts
name: MTEB STS13
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 81.59451537860527
- type: cos_sim_spearman
value: 82.97994638856723
- type: euclidean_pearson
value: 82.89478688288412
- type: euclidean_spearman
value: 83.58740751053104
- type: manhattan_pearson
value: 82.69140840941608
- type: manhattan_spearman
value: 83.33665956040555
- task:
type: STS
dataset:
type: mteb/sts14-sts
name: MTEB STS14
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 82.00756527711764
- type: cos_sim_spearman
value: 81.83560996841379
- type: euclidean_pearson
value: 82.07684151976518
- type: euclidean_spearman
value: 82.00913052060511
- type: manhattan_pearson
value: 82.05690778488794
- type: manhattan_spearman
value: 82.02260252019525
- task:
type: STS
dataset:
type: mteb/sts15-sts
name: MTEB STS15
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.13710262895447
- type: cos_sim_spearman
value: 87.26412811156248
- type: euclidean_pearson
value: 86.94151453230228
- type: euclidean_spearman
value: 87.5363796699571
- type: manhattan_pearson
value: 86.86989424083748
- type: manhattan_spearman
value: 87.47315940781353
- task:
type: STS
dataset:
type: mteb/sts16-sts
name: MTEB STS16
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 83.0230597603627
- type: cos_sim_spearman
value: 84.93344499318864
- type: euclidean_pearson
value: 84.23754743431141
- type: euclidean_spearman
value: 85.09707376597099
- type: manhattan_pearson
value: 84.04325160987763
- type: manhattan_spearman
value: 84.89353071339909
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-en)
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 86.75620824563921
- type: cos_sim_spearman
value: 87.15065513706398
- type: euclidean_pearson
value: 88.26281533633521
- type: euclidean_spearman
value: 87.51963738643983
- type: manhattan_pearson
value: 88.25599267618065
- type: manhattan_spearman
value: 87.58048736047483
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (en)
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 64.74645319195137
- type: cos_sim_spearman
value: 65.29996325037214
- type: euclidean_pearson
value: 67.04297794086443
- type: euclidean_spearman
value: 65.43841726694343
- type: manhattan_pearson
value: 67.39459955690904
- type: manhattan_spearman
value: 65.92864704413651
- task:
type: STS
dataset:
type: mteb/stsbenchmark-sts
name: MTEB STSBenchmark
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 84.31291020270801
- type: cos_sim_spearman
value: 85.86473738688068
- type: euclidean_pearson
value: 85.65537275064152
- type: euclidean_spearman
value: 86.13087454209642
- type: manhattan_pearson
value: 85.43946955047609
- type: manhattan_spearman
value: 85.91568175344916
- task:
type: Reranking
dataset:
type: mteb/scidocs-reranking
name: MTEB SciDocsRR
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 85.93798118350695
- type: mrr
value: 95.93536274908824
- task:
type: Retrieval
dataset:
type: scifact
name: MTEB SciFact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 57.594
- type: map_at_10
value: 66.81899999999999
- type: map_at_100
value: 67.368
- type: map_at_1000
value: 67.4
- type: map_at_3
value: 64.061
- type: map_at_5
value: 65.47
- type: mrr_at_1
value: 60.667
- type: mrr_at_10
value: 68.219
- type: mrr_at_100
value: 68.655
- type: mrr_at_1000
value: 68.684
- type: mrr_at_3
value: 66.22200000000001
- type: mrr_at_5
value: 67.289
- type: ndcg_at_1
value: 60.667
- type: ndcg_at_10
value: 71.275
- type: ndcg_at_100
value: 73.642
- type: ndcg_at_1000
value: 74.373
- type: ndcg_at_3
value: 66.521
- type: ndcg_at_5
value: 68.581
- type: precision_at_1
value: 60.667
- type: precision_at_10
value: 9.433
- type: precision_at_100
value: 1.0699999999999998
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 25.556
- type: precision_at_5
value: 16.8
- type: recall_at_1
value: 57.594
- type: recall_at_10
value: 83.622
- type: recall_at_100
value: 94.167
- type: recall_at_1000
value: 99.667
- type: recall_at_3
value: 70.64399999999999
- type: recall_at_5
value: 75.983
- task:
type: PairClassification
dataset:
type: mteb/sprintduplicatequestions-pairclassification
name: MTEB SprintDuplicateQuestions
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.85841584158416
- type: cos_sim_ap
value: 96.66996142314342
- type: cos_sim_f1
value: 92.83208020050125
- type: cos_sim_precision
value: 93.06532663316584
- type: cos_sim_recall
value: 92.60000000000001
- type: dot_accuracy
value: 99.85841584158416
- type: dot_ap
value: 96.6775307676576
- type: dot_f1
value: 92.69289729177312
- type: dot_precision
value: 94.77533960292581
- type: dot_recall
value: 90.7
- type: euclidean_accuracy
value: 99.86138613861387
- type: euclidean_ap
value: 96.6338454403108
- type: euclidean_f1
value: 92.92214357937311
- type: euclidean_precision
value: 93.96728016359918
- type: euclidean_recall
value: 91.9
- type: manhattan_accuracy
value: 99.86237623762376
- type: manhattan_ap
value: 96.60370449645053
- type: manhattan_f1
value: 92.91177970423253
- type: manhattan_precision
value: 94.7970863683663
- type: manhattan_recall
value: 91.10000000000001
- type: max_accuracy
value: 99.86237623762376
- type: max_ap
value: 96.6775307676576
- type: max_f1
value: 92.92214357937311
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering
name: MTEB StackExchangeClustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 60.77977058695198
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering-p2p
name: MTEB StackExchangeClusteringP2P
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 35.2725272535638
- task:
type: Reranking
dataset:
type: mteb/stackoverflowdupquestions-reranking
name: MTEB StackOverflowDupQuestions
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 53.64052466362125
- type: mrr
value: 54.533067014684654
- task:
type: Summarization
dataset:
type: mteb/summeval
name: MTEB SummEval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.677624219206578
- type: cos_sim_spearman
value: 30.121368518123447
- type: dot_pearson
value: 30.69870088041608
- type: dot_spearman
value: 29.61284927093751
- task:
type: Retrieval
dataset:
type: trec-covid
name: MTEB TRECCOVID
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.22
- type: map_at_10
value: 1.855
- type: map_at_100
value: 9.885
- type: map_at_1000
value: 23.416999999999998
- type: map_at_3
value: 0.637
- type: map_at_5
value: 1.024
- type: mrr_at_1
value: 88.0
- type: mrr_at_10
value: 93.067
- type: mrr_at_100
value: 93.067
- type: mrr_at_1000
value: 93.067
- type: mrr_at_3
value: 92.667
- type: mrr_at_5
value: 93.067
- type: ndcg_at_1
value: 82.0
- type: ndcg_at_10
value: 75.899
- type: ndcg_at_100
value: 55.115
- type: ndcg_at_1000
value: 48.368
- type: ndcg_at_3
value: 79.704
- type: ndcg_at_5
value: 78.39699999999999
- type: precision_at_1
value: 88.0
- type: precision_at_10
value: 79.60000000000001
- type: precision_at_100
value: 56.06
- type: precision_at_1000
value: 21.206
- type: precision_at_3
value: 84.667
- type: precision_at_5
value: 83.2
- type: recall_at_1
value: 0.22
- type: recall_at_10
value: 2.078
- type: recall_at_100
value: 13.297
- type: recall_at_1000
value: 44.979
- type: recall_at_3
value: 0.6689999999999999
- type: recall_at_5
value: 1.106
- task:
type: Retrieval
dataset:
type: webis-touche2020
name: MTEB Touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.258
- type: map_at_10
value: 10.439
- type: map_at_100
value: 16.89
- type: map_at_1000
value: 18.407999999999998
- type: map_at_3
value: 5.668
- type: map_at_5
value: 7.718
- type: mrr_at_1
value: 32.653
- type: mrr_at_10
value: 51.159
- type: mrr_at_100
value: 51.714000000000006
- type: mrr_at_1000
value: 51.714000000000006
- type: mrr_at_3
value: 47.959
- type: mrr_at_5
value: 50.407999999999994
- type: ndcg_at_1
value: 29.592000000000002
- type: ndcg_at_10
value: 26.037
- type: ndcg_at_100
value: 37.924
- type: ndcg_at_1000
value: 49.126999999999995
- type: ndcg_at_3
value: 30.631999999999998
- type: ndcg_at_5
value: 28.571
- type: precision_at_1
value: 32.653
- type: precision_at_10
value: 22.857
- type: precision_at_100
value: 7.754999999999999
- type: precision_at_1000
value: 1.529
- type: precision_at_3
value: 34.014
- type: precision_at_5
value: 29.796
- type: recall_at_1
value: 2.258
- type: recall_at_10
value: 16.554
- type: recall_at_100
value: 48.439
- type: recall_at_1000
value: 82.80499999999999
- type: recall_at_3
value: 7.283
- type: recall_at_5
value: 10.732
- task:
type: Classification
dataset:
type: mteb/toxic_conversations_50k
name: MTEB ToxicConversationsClassification
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 69.8858
- type: ap
value: 13.835684144362109
- type: f1
value: 53.803351693244586
- task:
type: Classification
dataset:
type: mteb/tweet_sentiment_extraction
name: MTEB TweetSentimentExtractionClassification
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 60.50650820599886
- type: f1
value: 60.84357825979259
- task:
type: Clustering
dataset:
type: mteb/twentynewsgroups-clustering
name: MTEB TwentyNewsgroupsClustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 48.52131044852134
- task:
type: PairClassification
dataset:
type: mteb/twittersemeval2015-pairclassification
name: MTEB TwitterSemEval2015
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 85.59337187816654
- type: cos_sim_ap
value: 73.23925826533437
- type: cos_sim_f1
value: 67.34693877551021
- type: cos_sim_precision
value: 62.40432237730752
- type: cos_sim_recall
value: 73.13984168865434
- type: dot_accuracy
value: 85.31322644096085
- type: dot_ap
value: 72.30723963807422
- type: dot_f1
value: 66.47051612112296
- type: dot_precision
value: 62.0792305930845
- type: dot_recall
value: 71.53034300791556
- type: euclidean_accuracy
value: 85.61125350181797
- type: euclidean_ap
value: 73.32843720487845
- type: euclidean_f1
value: 67.36549633745895
- type: euclidean_precision
value: 64.60755813953489
- type: euclidean_recall
value: 70.36939313984169
- type: manhattan_accuracy
value: 85.63509566668654
- type: manhattan_ap
value: 73.16658488311325
- type: manhattan_f1
value: 67.20597386434349
- type: manhattan_precision
value: 63.60424028268551
- type: manhattan_recall
value: 71.2401055408971
- type: max_accuracy
value: 85.63509566668654
- type: max_ap
value: 73.32843720487845
- type: max_f1
value: 67.36549633745895
- task:
type: PairClassification
dataset:
type: mteb/twitterurlcorpus-pairclassification
name: MTEB TwitterURLCorpus
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.33779640625606
- type: cos_sim_ap
value: 84.83868375898157
- type: cos_sim_f1
value: 77.16506154017773
- type: cos_sim_precision
value: 74.62064005753327
- type: cos_sim_recall
value: 79.88912842623961
- type: dot_accuracy
value: 88.02732176815307
- type: dot_ap
value: 83.95089283763002
- type: dot_f1
value: 76.29635101196631
- type: dot_precision
value: 73.31771720613288
- type: dot_recall
value: 79.52725592854944
- type: euclidean_accuracy
value: 88.44452206310397
- type: euclidean_ap
value: 84.98384576824827
- type: euclidean_f1
value: 77.29311047696697
- type: euclidean_precision
value: 74.51232583065381
- type: euclidean_recall
value: 80.28949799815214
- type: manhattan_accuracy
value: 88.47362906042613
- type: manhattan_ap
value: 84.91421462218432
- type: manhattan_f1
value: 77.05107637204792
- type: manhattan_precision
value: 74.74484256243214
- type: manhattan_recall
value: 79.50415768401602
- type: max_accuracy
value: 88.47362906042613
- type: max_ap
value: 84.98384576824827
- type: max_f1
value: 77.29311047696697
license: mit
language:
- en
---
<h1 align="center">FlagEmbedding</h1>
<h4 align="center">
<p>
<a href=#model-list>Model List</a> |
<a href=#frequently-asked-questions>FAQ</a> |
<a href=#usage>Usage</a> |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
<p>
</h4>
For more details, please refer to our GitHub: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).
If you are looking for a model that supports more languages, longer texts, and other retrieval methods, you can try using [bge-m3](https://huggingface.co/BAAI/bge-m3).
[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)
FlagEmbedding focuses on retrieval-augmented LLMs and currently consists of the following projects:
- **Long-Context LLM**: [Activation Beacon](https://github.com/FlagOpen/FlagEmbedding/tree/master/Long_LLM/activation_beacon)
- **Fine-tuning of LM** : [LM-Cocktail](https://github.com/FlagOpen/FlagEmbedding/tree/master/LM_Cocktail)
- **Dense Retrieval**: [BGE-M3](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3), [LLM Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), [BGE Embedding](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/baai_general_embedding)
- **Reranker Model**: [BGE Reranker](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker)
- **Benchmark**: [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB)
## News
- 1/30/2024: Release **BGE-M3**, a new member of the BGE model series! M3 stands for **M**ulti-linguality (100+ languages), **M**ulti-granularities (input length up to 8192), **M**ulti-Functionality (unification of dense, lexical, multi-vec/colbert retrieval).
It is the first embedding model which supports all three retrieval methods, achieving new SOTA on multi-lingual (MIRACL) and cross-lingual (MKQA) benchmarks.
[Technical Report](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/BGE_M3/BGE_M3.pdf) and [Code](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3). :fire:
- 1/9/2024: Release [Activation-Beacon](https://github.com/FlagOpen/FlagEmbedding/tree/master/Long_LLM/activation_beacon), an effective, efficient, compatible, and low-cost (training) method to extend the context length of LLM. [Technical Report](https://arxiv.org/abs/2401.03462) :fire:
- 12/24/2023: Release **LLaRA**, a LLaMA-7B based dense retriever, leading to state-of-the-art performances on MS MARCO and BEIR. Model and code will be open-sourced. Please stay tuned. [Technical Report](https://arxiv.org/abs/2312.15503) :fire:
- 11/23/2023: Release [LM-Cocktail](https://github.com/FlagOpen/FlagEmbedding/tree/master/LM_Cocktail), a method to maintain general capabilities during fine-tuning by merging multiple language models. [Technical Report](https://arxiv.org/abs/2311.13534) :fire:
- 10/12/2023: Release [LLM-Embedder](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/llm_embedder), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Technical Report](https://arxiv.org/pdf/2310.07554.pdf)
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
- **New reranker model**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than the embedding models. We recommend using/fine-tuning them to re-rank the top-k documents returned by embedding models.
- **update embedding model**: release `bge-*-v1.5` embedding model to alleviate the issue of the similarity distribution, and enhance its retrieval ability without instruction.
<details>
<summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): Add script to mine hard negatives and support adding instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **Langchain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release `bge-large-*`(short for BAAI General Embedding) Models, **rank 1st on MTEB and C-MTEB benchmark!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.
</details>
## Model List
`bge` is short for `BAAI general embedding`.
| Model | Language | | Description | query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) | Multilingual | [Inference](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3#usage) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/BGE_M3) | Multi-Functionality(dense retrieval, sparse retrieval, multi-vector(colbert)), Multi-Linguality, and Multi-Granularity(8192 tokens) | |
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) |a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |
[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed, just use the original query directly. In all cases, **no instruction** needs to be added to passages.
[2\]: Different from the embedding model, the reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, a cross-encoder is widely used to re-rank the top-k documents retrieved by simpler models.
For example, use a bge embedding model to retrieve the top-100 relevant documents, and then use the bge reranker to re-rank those 100 documents to get the final top-3 results, as sketched below.
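A minimal sketch of this two-stage pipeline with FlagEmbedding (the corpus, query, and cut-offs below are placeholders; with a real corpus the candidate pool would be much larger than three passages):
```python
from FlagEmbedding import FlagModel, FlagReranker
import numpy as np

# Illustrative data; replace with your own corpus and query.
corpus = ["passage about pandas ...", "passage about bears ...", "unrelated passage ..."]
query = "what is a panda?"

# Stage 1: dense retrieval with a bge embedding model.
embedder = FlagModel('BAAI/bge-large-en-v1.5',
                     query_instruction_for_retrieval="Represent this sentence for searching relevant passages: ")
q_emb = embedder.encode_queries([query])
p_emb = embedder.encode(corpus)
dense_scores = (q_emb @ p_emb.T)[0]
candidates = np.argsort(-dense_scores)[:100]  # keep the top-100 candidates

# Stage 2: re-rank the candidates with the bge reranker (cross-encoder).
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True)
rerank_scores = reranker.compute_score([[query, corpus[i]] for i in candidates])
top3 = [candidates[i] for i in np.argsort(-np.array(rerank_scores))[:3]]
print([corpus[i] for i in top3])
```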
All models have been uploaded to Huggingface Hub, and you can see them at https://huggingface.co/BAAI.
If you cannot open the Huggingface Hub, you can also download the models at https://model.baai.ac.cn/models.
## Frequently asked questions
<details>
<summary>1. How to fine-tune bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your data, the pre-trained model cannot be directly used to calculate similarity, and it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, it is recommended to use/fine-tune the cross-encoder model (bge-reranker) to re-rank top-k results. Hard negatives are also needed to fine-tune the reranker.
</details>
<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**
Since we fine-tune the models by contrastive learning with a temperature of 0.01,
the similarity distribution of the current BGE model is roughly in the interval \[0.6, 1\].
So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks, such as passage retrieval or semantic similarity,
**what matters is the relative order of the scores, not the absolute value.**
If you need to filter similar sentences based on a similarity threshold,
please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).
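For instance, a minimal sketch of such filtering with `FlagModel` (the 0.85 threshold below is only an illustrative value, not a recommendation):
```python
from FlagEmbedding import FlagModel

model = FlagModel('BAAI/bge-large-zh-v1.5', use_fp16=True)
pairs = [("样例数据-1", "样例数据-2"), ("样例数据-3", "样例数据-4")]
emb_a = model.encode([a for a, _ in pairs])
emb_b = model.encode([b for _, b in pairs])
threshold = 0.85  # pick this value from the score distribution on your own data
for (a, b), score in zip(pairs, (emb_a * emb_b).sum(axis=1)):
    if score >= threshold:
        print(f"similar ({score:.3f}): {a} <-> {b}")
```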
</details>
<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->
For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used.
Using no instruction causes only a slight degradation in retrieval performance compared with using an instruction.
So, for convenience, you can generate embeddings without an instruction in all cases.
For a retrieval task that uses short queries to find long related documents,
it is recommended to add instructions for these short queries.
**The best method to decide whether to add instructions for queries is choosing the setting that achieves better performance on your task.**
In all cases, no instruction needs to be added to the documents/passages.
</details>
## Usage
### Usage for Embedding Model
Here are some examples for using `bge` models with
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If it doesn't work for you, you can see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for more methods to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
# for an s2p (short query to long passage) retrieval task, use encode_queries(), which automatically adds the instruction to each query
# the corpus in a retrieval task can still use encode() or encode_corpus(), since passages don't need the instruction
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).
By default, FlagModel will use all available GPUs when encoding. Please set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs.
You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable, as in the short example below.
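A small illustration (the device indices are placeholders; the environment variable must be set before any CUDA initialization):
```python
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"  # use only GPU 0 and GPU 1
# os.environ["CUDA_VISIBLE_DEVICES"] = ""   # or hide all GPUs to force CPU encoding

from FlagEmbedding import FlagModel
model = FlagModel('BAAI/bge-large-zh-v1.5', use_fp16=True)
```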
#### Using Sentence-Transformers
You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For an s2p (short query to long passage) retrieval task,
each short query should start with an instruction (see the [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list) for the instructions).
The instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```
#### Using Langchain
You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs,
query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```
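Continuing from the snippet above, the instance can then be used through LangChain's standard embeddings interface; a brief sketch (assuming the usual `embed_query`/`embed_documents` methods, with illustrative inputs):
```python
query_embedding = model.embed_query("what is a panda?")  # the query_instruction is prepended automatically
doc_embeddings = model.embed_documents(["The giant panda is a bear species endemic to China."])
print(len(query_embedding), len(doc_embeddings[0]))  # 1024 dimensions for the bge-large models
```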
#### Using HuggingFace Transformers
With the transformers package, you can use the model like this: First, you pass your input through the transformer model, then you select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for an s2p (short query to long passage) retrieval task, add the instruction to queries (do not add the instruction to passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, cls pooling.
sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```
### Usage for Reranker
Different from the embedding model, the reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding.
You can get a relevance score by feeding a query and a passage to the reranker.
The reranker is optimized with a cross-entropy loss, so the relevance score is not bounded to a specific range.
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
score = reranker.compute_score(['query', 'passage'])
print(score)
scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```
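If you prefer scores in a fixed range, one simple option (an illustration, not part of the API shown above) is to map the raw logits through a sigmoid:
```python
import math

def to_unit_interval(logit: float) -> float:
    # squash an unbounded relevance logit into (0, 1)
    return 1.0 / (1.0 + math.exp(-logit))

print([to_unit_interval(s) for s in scores])  # `scores` from the snippet above
```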
#### Using Huggingface transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()
pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```
#### Usage of the ONNX files
```python
from optimum.onnxruntime import ORTModelForFeatureExtraction # type: ignore
import torch
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-small-en-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-small-en-v1.5')
model_ort = ORTModelForFeatureExtraction.from_pretrained('BAAI/bge-small-en-v1.5', file_name="onnx/model.onnx")
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for an s2p (short query to long passage) retrieval task, add the instruction to queries (do not add the instruction to passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
model_output_ort = model_ort(**encoded_input)
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# model_output and model_output_ort are identical
```
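To check the claim in the last comment numerically, you can compare the two outputs; a small sanity check continuing from the snippet above (assuming both outputs expose `last_hidden_state`):
```python
# CLS pooling from both backends should agree up to numerical precision
torch_cls = model_output.last_hidden_state[:, 0]
onnx_cls = model_output_ort.last_hidden_state[:, 0]
print(torch.allclose(torch_cls, onnx_cls, atol=1e-4))
```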
#### Usage via infinity
It is also possible to deploy the ONNX files with the [infinity_emb](https://github.com/michaelfeil/infinity) pip package.
We recommend `device="cuda", engine="torch"` with flash attention on GPU, and `device="cpu", engine="optimum"` for ONNX inference.
```python
import asyncio
from infinity_emb import AsyncEmbeddingEngine, EngineArgs
sentences = ["Embed this is sentence via Infinity.", "Paris is in France."]
engine = AsyncEmbeddingEngine.from_args(
    EngineArgs(
        model_name_or_path="BAAI/bge-small-en-v1.5",
        device="cpu",
        engine="optimum",  # or engine="torch"
    )
)
async def main():
async with engine:
embeddings, usage = await engine.embed(sentences=sentences)
asyncio.run(main())
```
## Evaluation
`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:
| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) |Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 |51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024| 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |
- **C-MTEB**:
We created the benchmark C-MTEB for Chinese text embeddings, which consists of 31 datasets from 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |
- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for evaluation script.
| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |
\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks
## Train
### BAAI Embedding
We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale pair data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned.
For more bge training details, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).
### BGE Reranker
A cross-encoder performs full attention over the input pair,
which is more accurate than an embedding model (i.e., a bi-encoder) but also more time-consuming.
Therefore, it can be used to re-rank the top-k documents returned by an embedding model.
We train the cross-encoder on multilingual pair data.
The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).
## Contact
If you have any questions or suggestions related to this project, feel free to open an issue or pull request.
You can also email Shitao Xiao([email protected]) and Zheng Liu([email protected]).
## Citation
If you find this repository useful, please consider giving it a star :star: and a citation.
```
@misc{bge_embedding,
title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
year={2023},
eprint={2309.07597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
|
frhew/sigdial_ft_b1 | frhew | 2024-11-13T09:47:38Z | 11 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"german",
"deutsch",
"simplification",
"vereinfachung",
"conversational",
"de",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-06-05T14:06:00Z | ---
license: apache-2.0
language:
- de
pipeline_tag: text-generation
tags:
- german
- deutsch
- simplification
- vereinfachung
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
This model was used in our experiments in our paper: [Elaborative Simplification for German-Language Texts](https://aclanthology.org/2024.sigdial-1.3).
We have uploaded this model for transparency and replicability of our experiments.
If, however, you are interested in German text simplification in general, we recommend [our more recent model](https://huggingface.co/hiig-piai/simba_best_092024).
We fine-tuned [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) with a set of ca. 2000 newspaper articles which have been simplified by the Austrian Press Agency.
This model was trained with the standard and the B1 level texts.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** Freya Hewett, Hadi Asghari
- **Model type:** simplification model, text generation
- **Language(s) (NLP):** German
- **License:** Apache 2.0
- **Finetuned from model:** meta-llama/Meta-Llama-3-8B-Instruct
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** [GermanElabSimplification](https://github.com/fhewett/GermanElabSimplification/tree/main)
- **Paper:** [Elaborative Simplification for German-Language Texts](https://aclanthology.org/2024.sigdial-1.3)
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
This model works best for simplifying German-language newspaper articles (news items, not commentaries or editorials). It may work for other types of texts.
### Downstream Use
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
We have fine-tuned the model using only newspaper articles. We have not yet performed extensive out-of-domain testing, but we believe that the model's capabilities could be improved by fine-tuning on more diverse data.
<!-- ### Out-of-Scope Use -->
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
As with most text generation models, the model sometimes produces information that is incorrect.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Please check manually that your output text corresponds to the input text, as factual inconsistencies may have arisen.
## How to Get Started with the Model
To load the model using transformers:
```
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
device = "cuda"
tokenizer = AutoTokenizer.from_pretrained("frhew/sigdial_ft_b1")
model = AutoModelForCausalLM.from_pretrained("frhew/sigdial_ft_b1", torch_dtype=torch.float16).to(device)
```
We used the following prompt at inference to test our model:
```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Du bist ein hilfreicher Assistent und hilfst dem User, Texte besser zu verstehen.<|eot_id|><|start_header_id|>user<|end_header_id|>
Kannst du bitte den folgenden Text zusammenfassen und sprachlich auf ein B1-Niveau in Deutsch vereinfachen? Schreibe maximal 5 Sätze.
{input_text}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
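A minimal generation sketch that combines the loading code and the prompt above (it assumes the tokenizer ships the Llama-3 chat template, and decoding settings such as `max_new_tokens` are illustrative choices, not the exact settings from the paper):
```python
input_text = "Hier steht der Zeitungsartikel, der vereinfacht werden soll."  # placeholder article

messages = [
    {"role": "system", "content": "Du bist ein hilfreicher Assistent und hilfst dem User, Texte besser zu verstehen."},
    {"role": "user", "content": "Kannst du bitte den folgenden Text zusammenfassen und sprachlich auf ein B1-Niveau in Deutsch vereinfachen? Schreibe maximal 5 Sätze.\n" + input_text},
]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(device)
output_ids = model.generate(input_ids, max_new_tokens=300, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```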
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
A sample of the data used to train our model can be found [here](https://github.com/fhewett/apa-rst/tree/main/original_texts).
#### Training Hyperparameters
<!--- **Training regime:** Our training script can be found [here](https://github.com/fhewett/simba/blob/main/models/train_simba.py). -->
<!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
<!-- #### Speeds, Sizes, Times [optional] -->
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
## Evaluation
The right hand side shows the results of the manual evaluation, done on the outputs from each model for 35 texts. M.P. stands for meaning preservation, S for simplification, C for coherence, F for factuality; the score represents the percentage of *yes* answers.
More details on the evaluation can be found in the paper. For all metrics, higher is better.
| **Model** | **Prompt** | **Test set** | **SARI** | **FRE** | **M.P.** | **S** | **C** | **F** | **Avg.** |
|--------------------|---------------------|-----------------------|------------------------------|-----------------------------|------------------------------|---------------------------|---------------------------|---------------------------|------------------------------|
| Baseline | Basic | A2 | 41.2 | 59.4 | .89 | .38 | .96 | .84 | .77 |
| FT-A2 | Basic | A2 | 44.0 | 70.6 | .49 | .82 | .56 | .64 | .63 |
| Baseline | Basic | B1 | 42.3 | 56.8 | .85 | .4 | .9 | .9 | .76 |
| FT-B1 | Basic | B1 | 42.4 | 60.0 | .75 | .55 | .6 | .75 | .66 |
#### Summary
## Citation
**BibTeX:**
```
@inproceedings{hewett-etal-2024-elaborative,
    title = "Elaborative Simplification for {G}erman-Language Texts",
    author = "Hewett, Freya and
      Asghari, Hadi and
      Stede, Manfred",
    editor = "Kawahara, Tatsuya and
      Demberg, Vera and
      Ultes, Stefan and
      Inoue, Koji and
      Mehri, Shikib and
      Howcroft, David and
      Komatani, Kazunori",
    booktitle = "Proceedings of the 25th Annual Meeting of the Special Interest Group on Discourse and Dialogue",
    month = sep,
    year = "2024",
    address = "Kyoto, Japan",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.sigdial-1.3",
    doi = "10.18653/v1/2024.sigdial-1.3",
    pages = "29--39"}
```
**APA:**
Freya Hewett, Hadi Asghari, and Manfred Stede. 2024. Elaborative Simplification for German-Language Texts. In Proceedings of the 25th Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 29–39, Kyoto, Japan. Association for Computational Linguistics.
## Model Card Contact
frhew |
frhew/sigdial_ft_a2 | frhew | 2024-11-13T09:46:27Z | 11 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"german",
"deutsch",
"simplification",
"vereinfachung",
"conversational",
"de",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-06-05T14:07:24Z | ---
license: apache-2.0
language:
- de
pipeline_tag: text-generation
tags:
- german
- deutsch
- simplification
- vereinfachung
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
This model was used in our experiments in our paper: [Elaborative Simplification for German-Language Texts](https://aclanthology.org/2024.sigdial-1.3).
We have uploaded this model for transparency and replicability of our experiments.
If, however, you are interested in German text simplification in general, we recommend [our more recent model](https://huggingface.co/hiig-piai/simba_best_092024).
We fine-tuned [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) with a set of ca. 2000 newspaper articles which have been simplified by the Austrian Press Agency.
This model was trained with the standard and the A2 level texts.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** Freya Hewett, Hadi Asghari
- **Model type:** simplification model, text generation
- **Language(s) (NLP):** German
- **License:** Apache 2.0
- **Finetuned from model:** meta-llama/Meta-Llama-3-8B-Instruct
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** [GermanElabSimplification](https://github.com/fhewett/GermanElabSimplification/tree/main)
- **Paper:** [Elaborative Simplification for German-Language Texts](https://aclanthology.org/2024.sigdial-1.3)
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
This model works best for simplifying German-language newspaper articles (news items, not commentaries or editorials). It may work for other types of texts.
### Downstream Use
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
We have fine-tuned the model using only newspaper articles. We have not yet performed extensive out-of-domain testing, but we believe that the model's capabilities could be improved by fine-tuning on more diverse data.
<!-- ### Out-of-Scope Use -->
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
As with most text generation models, the model sometimes produces information that is incorrect.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Please check manually that your output text corresponds to the input text, as factual inconsistencies may have arisen.
## How to Get Started with the Model
To load the model using transformers:
```
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
device = "cuda"
tokenizer = AutoTokenizer.from_pretrained("frhew/sigdial_ft_a2")
model = AutoModelForCausalLM.from_pretrained("frhew/sigdial_ft_a2", torch_dtype=torch.float16).to(device)
```
We used the following prompt at inference to test our model:
```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Du bist ein hilfreicher Assistent und hilfst dem User, Texte besser zu verstehen.<|eot_id|><|start_header_id|>user<|end_header_id|>
Kannst du bitte den folgenden Text zusammenfassen und sprachlich auf ein A2-Niveau in Deutsch vereinfachen? Schreibe maximal 5 Sätze.
{input_text}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
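A minimal generation sketch that combines the loading code and the prompt above (it assumes the tokenizer ships the Llama-3 chat template, and decoding settings such as `max_new_tokens` are illustrative choices, not the exact settings from the paper):
```python
input_text = "Hier steht der Zeitungsartikel, der vereinfacht werden soll."  # placeholder article

messages = [
    {"role": "system", "content": "Du bist ein hilfreicher Assistent und hilfst dem User, Texte besser zu verstehen."},
    {"role": "user", "content": "Kannst du bitte den folgenden Text zusammenfassen und sprachlich auf ein A2-Niveau in Deutsch vereinfachen? Schreibe maximal 5 Sätze.\n" + input_text},
]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(device)
output_ids = model.generate(input_ids, max_new_tokens=300, do_sample=False)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```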
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
A sample of the data used to train our model can be found [here](https://github.com/fhewett/apa-rst/tree/main/original_texts).
#### Training Hyperparameters
<!--- **Training regime:** Our training script can be found [here](https://github.com/fhewett/simba/blob/main/models/train_simba.py). -->
<!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
<!-- #### Speeds, Sizes, Times [optional] -->
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
## Evaluation
The right hand side shows the results of the manual evaluation, done on the outputs from each model for 35 texts. M.P. stands for meaning preservation, S for simplification, C for coherence, F for factuality; the score represents the percentage of *yes* answers.
More details on the evaluation can be found in the paper. For all metrics, higher is better.
| **Model** | **Prompt** | **Test set** | **SARI** | **FRE** | **M.P.** | **S** | **C** | **F** | **Avg.** |
|--------------------|---------------------|-----------------------|------------------------------|-----------------------------|------------------------------|---------------------------|---------------------------|---------------------------|------------------------------|
| Baseline | Basic | A2 | 41.2 | 59.4 | .89 | .38 | .96 | .84 | .77 |
| FT-A2 | Basic | A2 | 44.0 | 70.6 | .49 | .82 | .56 | .64 | .63 |
| Baseline | Basic | B1 | 42.3 | 56.8 | .85 | .4 | .9 | .9 | .76 |
| FT-B1 | Basic | B1 | 42.4 | 60.0 | .75 | .55 | .6 | .75 | .66 |
#### Summary
## Citation
**BibTeX:**
```
@inproceedings{hewett-etal-2024-elaborative,
    title = "Elaborative Simplification for {G}erman-Language Texts",
    author = "Hewett, Freya and
      Asghari, Hadi and
      Stede, Manfred",
    editor = "Kawahara, Tatsuya and
      Demberg, Vera and
      Ultes, Stefan and
      Inoue, Koji and
      Mehri, Shikib and
      Howcroft, David and
      Komatani, Kazunori",
    booktitle = "Proceedings of the 25th Annual Meeting of the Special Interest Group on Discourse and Dialogue",
    month = sep,
    year = "2024",
    address = "Kyoto, Japan",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.sigdial-1.3",
    doi = "10.18653/v1/2024.sigdial-1.3",
    pages = "29--39"}
```
**APA:**
Freya Hewett, Hadi Asghari, and Manfred Stede. 2024. Elaborative Simplification for German-Language Texts. In Proceedings of the 25th Annual Meeting of the Special Interest Group on Discourse and Dialogue, pages 29–39, Kyoto, Japan. Association for Computational Linguistics.
## Model Card Contact
frhew |
waloneai/buddha | waloneai | 2024-11-13T09:44:09Z | 301 | 0 | diffusers | [
"diffusers",
"flux",
"text-to-image",
"lora",
"fal",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-13T09:44:06Z | ---
tags:
- flux
- text-to-image
- lora
- diffusers
- fal
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: buddha
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
---
# buddha
<Gallery />
## Model description
## Trigger words
You should use `buddha` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/shweaung/buddha/tree/main) them in the Files & versions tab.
## Training at fal.ai
Training was done using [fal.ai/models/fal-ai/flux-lora-fast-training](https://fal.ai/models/fal-ai/flux-lora-fast-training).
|
wrice/swin2sr-laion-hd | wrice | 2024-11-13T09:41:52Z | 7 | 0 | transformers | [
"transformers",
"safetensors",
"swin2sr",
"image-to-image",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | image-to-image | 2024-09-19T18:54:33Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
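Since no snippet is documented yet, the sketch below assumes the standard Swin2SR super-resolution API in 🤗 Transformers and that this repository ships an image processor configuration; adjust as needed for the actual checkpoint.

```python
# Minimal Swin2SR super-resolution sketch (assumed usage).
import torch
from PIL import Image
from transformers import AutoImageProcessor, Swin2SRForImageSuperResolution

repo_id = "wrice/swin2sr-laion-hd"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = Swin2SRForImageSuperResolution.from_pretrained(repo_id)

image = Image.open("low_res.png").convert("RGB")  # any low-resolution input image
inputs = processor(image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# `reconstruction` holds the upscaled image as a float tensor in [0, 1]
upscaled = outputs.reconstruction.squeeze().clamp(0, 1).permute(1, 2, 0).numpy()
Image.fromarray((upscaled * 255.0).round().astype("uint8")).save("upscaled.png")
```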
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
bharati2324/Llama-1B-Code-LoRA-r64-merged | bharati2324 | 2024-11-13T09:35:20Z | 80 | 0 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"unsloth",
"trl",
"sft",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T09:34:08Z | ---
library_name: transformers
tags:
- unsloth
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
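Until the card is filled in, here is a minimal sketch using the standard 🤗 Transformers causal-LM API; the prompt format is an assumption, since no template is documented for this fine-tune.

```python
# Minimal causal-LM inference sketch (assumed usage; no prompt template is documented).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bharati2324/Llama-1B-Code-LoRA-r64-merged"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```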
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
app-x/chinese_spam_classifier_onnx | app-x | 2024-11-13T09:35:06Z | 5 | 0 | null | [
"onnx",
"bert",
"text-classification",
"zh",
"base_model:app-x/chinese_spam_classifier",
"base_model:quantized:app-x/chinese_spam_classifier",
"region:us"
] | text-classification | 2024-10-19T07:24:53Z | ---
language:
- zh
base_model:
- app-x/chinese_spam_classifier
pipeline_tag: text-classification
--- |
Olaperi/bert-finetuned-ner | Olaperi | 2024-11-13T09:31:20Z | 109 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2024-11-13T07:57:56Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-finetuned-ner
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0623
- Precision: 0.9342
- Recall: 0.9507
- F1: 0.9424
- Accuracy: 0.9860
## Model description
More information needed
## Intended uses & limitations
More information needed
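Pending further documentation, the checkpoint can be tried with the standard token-classification pipeline; the sketch below is illustrative, and the entity label set is whatever the training data defined.

```python
# Minimal inference sketch with the token-classification pipeline.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="Olaperi/bert-finetuned-ner",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)
print(ner("Hugging Face is based in New York City."))
```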
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
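For readers who want to set up a comparable run, the arguments above map onto 🤗 `TrainingArguments` roughly as follows (an illustrative reconstruction, not the exact script used to produce this checkpoint).

```python
# Illustrative reconstruction of the listed hyperparameters.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-finetuned-ner",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```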
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0772 | 1.0 | 1756 | 0.0688 | 0.9086 | 0.9330 | 0.9206 | 0.9815 |
| 0.0345 | 2.0 | 3512 | 0.0692 | 0.9291 | 0.9461 | 0.9375 | 0.9844 |
| 0.0212 | 3.0 | 5268 | 0.0623 | 0.9342 | 0.9507 | 0.9424 | 0.9860 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Tokenizers 0.20.3
|
zeeshanali01/cryptotunned | zeeshanali01 | 2024-11-13T09:18:35Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"trl",
"sft",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] | text-generation | 2024-11-06T12:48:38Z | ---
license: apache-2.0
tags:
- trl
- sft
library_name: transformers
---
# CatMemo: Fine-Tuning Large Language Models for Financial Applications
## Model Overview
This model, **CatMemo**, is fine-tuned using **Data Fusion** techniques for financial applications. It was developed as part of the FinLLM Challenge Task and focuses on enhancing the performance of large language models in finance-specific tasks such as question answering, document summarization, and sentiment analysis.
### Key Features
- Fine-tuned on financial datasets using **Supervised Fine-Tuning (SFT)** techniques.
- Optimized for **Transformer Reinforcement Learning (TRL)** workflows.
- Specialized for tasks requiring domain-specific context in financial applications.
## Usage
You can use this model with the [Hugging Face Transformers library](https://huggingface.co/docs/transformers/) to perform financial text analysis. Below is a quick example:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
# Load the model and tokenizer
model_name = "zeeshanali01/cryptotunned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
# Tokenize input
inputs = tokenizer("What are the key takeaways from the latest earnings report?", return_tensors="pt")
# Generate output
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Training Details
This model was fine-tuned using **Data Fusion** methods on domain-specific financial datasets. The training pipeline includes:
- Preprocessing financial documents and datasets to enhance model understanding.
- Applying **Supervised Fine-Tuning (SFT)** to optimize the model for financial NLP tasks.
- Testing and evaluation on FinLLM benchmark tasks.
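As a rough illustration of the SFT step in the pipeline above, a TRL-based setup could look like the sketch below; the dataset file and base checkpoint are assumptions for illustration, not the authors' exact configuration.

```python
# Illustrative SFT sketch with TRL (assumed dataset and base checkpoint).
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# hypothetical instruction-style dataset with a "text" column
dataset = load_dataset("json", data_files="financial_instructions.jsonl", split="train")

trainer = SFTTrainer(
    model="mistralai/Mistral-7B-v0.1",  # assumed base model for illustration
    train_dataset=dataset,
    args=SFTConfig(output_dir="cryptotunned-sft"),
)
trainer.train()
```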
## Citation
If you use this model, please cite our work:
```
@inproceedings{cao2024catmemo,
title={CatMemo at the FinLLM Challenge Task: Fine-Tuning Large Language Models using Data Fusion in Financial Applications},
author={Cao, Yupeng and Yao, Zhiyuan and Chen, Zhi and Deng, Zhiyang},
booktitle={Joint Workshop of the 8th Financial Technology and Natural Language Processing (FinNLP) and the 1st Agent AI for Scenario Planning (AgentScen) in conjunction with IJCAI 2023},
pages={174},
year={2024}
}
```
## License
This model is licensed under the Apache 2.0 License. See the [LICENSE](https://www.apache.org/licenses/LICENSE-2.0) file for details.
## Acknowledgments
We thank the organizers of the FinLLM Challenge Task for providing the benchmark datasets and tasks used to develop this model.
---
### Model Card Metadata
- **License:** Apache 2.0
- **Tags:** TRL, SFT
- **Library Used:** Transformers
|
seongil-dn/e5-large-filtered-our-neg5-7392 | seongil-dn | 2024-11-13T09:16:05Z | 5 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:295744",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:intfloat/e5-large-unsupervised",
"base_model:finetune:intfloat/e5-large-unsupervised",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-11-13T09:15:05Z | ---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:295744
- loss:MultipleNegativesRankingLoss
base_model: intfloat/e5-large-unsupervised
widget:
- source_sentence: 'query: NH농협금융지주 임원후보추천위원회는 금융지주 대표이사 회장 최종후보로 누구를 추천했나?'
sentences:
- 'passage: 검찰이 고용노동부가 삼성전자서비스의 불법파견 여부를 조사할 당시 고위 공무원들이 조사과정에 부당하게 개입했다는 의혹에 대해
강제수사에 나섰다. 서울중앙지검 공공형사수사부(김성훈 부장검사)는 13일 오전 정부세종청사에 있는 노동부 노동정책실 노사협력정책과 등지에 검사와
수사관들을 보내 삼성전자서비스 근로감독 관련 기록과 컴퓨터 하드디스크 등을 확보하고 있다. 노동부는 삼성전자서비스 AS센터에 대해 불법파견
의혹이 제기되자 2013년 6∼7월 수시 근로감독을 벌였다. 노동부는 근로감독을 한 차례 연장한 끝에 같은 해 9월 불법파견이 아니라는 결론을
내렸다. 이 과정에서 정현옥 당시 차관이 노동정책실장에게 노동부 출신 삼성전자 핵심인사와 접촉을 지시하는 등 노동부 고위직들이 삼성 측과 유착해
근로감독 결과에 부당한 영향력을 행사했다는 의혹이 제기됐다. 검찰은 이런 의혹을 뒷받침하는 정황을 일부 포착하고 이날 강제수사에 나선 것으로
전해졌다. 고용노동행정개혁위원회는 최근 이런 내용의 조사결과를 발표하고 검찰 수사를 촉구했다. 삼성전자서비스 노조 등은 정 전 차관 등 노동부
전.현직 공무원 11명을 직권남용권리행사방해 등 혐의로 검찰에 고발한 바 있다.'
- 'passage: 2018 평창동계올림픽이 열리는 2월 한 달 간 개최지인 강원지역 고속도로와 국도 등에서 각국 선수단과 임원 등 관계자들의
이동을 돕기 위해 올림픽 전용차로(O·L :Olympic Lane)가 운용된다. 21일 도로교통공단에 따르면 평창올림픽(2월9∼25일) 기간을
낀 2월1∼28일 영동고속도로 인천방향 대관령나들목(IC)→강릉분기점(JC) 약 19.8㎞ 구간, 국도 6호선·지방도 456호선 태기사거리∼대관령IC
39.6㎞ 구간 등 전체 59.4㎞ 구간 1차로에 올림픽 전용차로가 설정된다. 차로 운용 시간대는 오전 7시부터 다음날 오전 1시까지다. 올림픽에
이은 동계 패럴림픽(3월9∼18일) 기간에는 영동고속도로 구간만 전용차로를 둔다. 이 기간 고속도로 구간 전용차로는 9인승 이상 승용·승합자동차와
올림픽조직위원회 행사차량임을 표시한 차량만 운행할 수 있다. 국도·지방도 구간 전용차로는 올림픽 차량과 36인승 이상 대형 승합차량 등만 이용
가능하다. 이를 제외한 일반 차량의 전용차로 통행은 제한되며, 위반하면 승합차 7만원·승용차 6만원의 범칙금과 벌점 30점이 부과된다. 평창군에
있는 장평터널과 속사터널은 올림픽 기간에는 전용차로로만 운용되므로 일반 차량은 다른 길로 우회해야 한다. 올림픽 전용차로 구간에는 운전자 안내를
위해 표지판을 설치하고, 도로 위에 전용차로임을 뜻하는 청색 점선과 오륜마크를 표기한다.'
- 'passage: 김광수 전 금융정보분석원(FIU) 원장(사진)이 NH농협금융지주 회장에 내정됐다. 19일 NH농협금융지주 임원후보추천위원회는
김광수 전 원장을 금융지주 대표이사 회장 최종후보로 추천했다고 밝혔다. 당초 후보자 3명에 대해 심층면접을 거쳐 최종 후보자를 추천할 계획이었으나
윤용로 현 코람코자산신탁 회장의 후보 고사와 당일 김용환 현 회장의 후보 사퇴로 김광수 단독 후보자에 대해 심층면접을 했다. 심층면접 결과
농협금융 임추위는 만장일치로 김 전 원장을 최종 후보로 결정하고 주주총회에 추천하기로 결정했다. 김 전 원장은 행시 27회로 관직에 입문한
옛 재정경제원 관료 출신으로 현 정부 들어 금융위원장, 금융감독원장 등으로 하마평에 올랐던 인물이다.'
- source_sentence: 'query: 비욘세는 아이가 몇이야?'
sentences:
- 'passage: 두산중공업은 한국중부발전이 발주한 신서천 화력발전소 탈황설비(FGD, Flue Gas Desulfurization) 공사를
약 1240억원에 수주했다고 27일 밝혔다. 신서천 화력발전소는 충남 서천군 서면 마량리에 건설중인 1000MW급 발전소다. 두산중공업은 지난
2015년 이 발전소와 터빈.발전기 공급계약을 맺은 데 이어, 금호산업과 컨소시엄을 이뤄 탈황설비 공사를 수행하게 됐다. 두산중공업이 맡은
탈황설비 기자재 공급 금액은 700억원 규모다. 탈황설비는 화력발전소의 배기가스 내 대기오염 원인 물질인 황산화물(SOx)을 제거함으로써 발전소
굴뚝에서 나오는 연기를 수증기 수준으로 정화시켜주는 친환경 설비다.'
- 'passage: {IMG:1}더불어민주당 오중기 경북도지사 후보가 2일 포항에서 선거사무실에서 비전선포식을 갖고 한반도 핵심은 경북이라고
강조했다. 오 후보는 비전 선포식 자리에서 남북정상회담의 성공적 개최로 향후 남북 경제협력에 있어 핵심적인 역할을 할 지역이 포항을 중심으로
한 동해안이다며 동해안 일대는 ‘한반도 신 경제지도’를 수행할 전력, 해양 등 인프라가 구축돼 있어 남북경협을 주도할 위치에 있다고 밝혔다.
또, 한반도 신경제지도의 핵심은 남북 철도, 도로를 이어 한반도에 H라인을 구축, 대륙과 해양으로 뻗어나갈 인프라를 조성하는 것이다며 동해안
지역은 부산~금강산~원산~나진으로 연결되는 ‘동해안벨트’에 포함된다고 설명했다. 특히,동해선을 통해 남북철도가 연결되면 시베리아 횡단철도(TSR)와
연계해 남한의 물류가 유라시아까지 공급될 수 있다”며 정상회담 성공적 개최에 따른 기대감을 드러냈다. {IMG:2}오 후보는 이와 함께 문재인
정부의 국정철학인 지역균형발전과 연계한 지역발전 전략으로 5대 정책과 15대 실천 방안을 내놓았다. 5대 정책은 대기업과 상생협력 생태계 조성.청년
창업 메카 실현, 전통역사와 문화자원을 활용한 인간 중심의 문화 르네상스, 인구소멸 위기의 대응, 경북의 미래 먹거리 창출과 안전한 경북 등이다.
이를 위한 15대 실천방안으로는 원전해체 연구센터를 포함한 국제 원자력 안전 연구단지 조성과 연구·교육·체험 산업 클러스터 조성, 백두대간
임업 클러스터구축, 남북 6축 고속도로 조기 착공, 미래형 자동차 철강소재 부품산업 벨트육성 등을 제시했다. 이에 앞서 오 후보는 지난달 30일
경북 영덕군 강구항에서 남북정상회담의 성공을 축하하고 후속조치의 중요성을 강조하는 2018 남북정상회담과 경북의 역할이란 제목으로 기자회견을
가졌다. 오 후보는 청와대 균형발전실 선임행정관으로 국정에 참여했고 중앙당 지도부 경험이 있는 저 오중기가 도민들과 더불어 경북 미래의 문을
여는 열쇠가 되겠다. 문재인 정부의 국정철학을 경북에서 완성하겠다고 말했다.'
- 'passage: 아이와 함께 특별한 행사에 참석하는 한 엄마를 상상해보시라. 아이 뒤치다꺼리를 해야 할 지도 모른다는 생각에 되도록 편한
옷을 입을 거라고 생각하게 된다. 노출이 있는 옷은 언감생심이다. 아이를 곁에 둔 엄마와 ‘섹시’는 어울려서는 안 되는 조합같이 느껴지기 때문이다.
그러나 팝스타 비욘세의 선택은 달랐다. 통념을 완벽하게 깬 드레스를 입고 아이와 함께 행사장에 등장했다. 비욘세 28일(현지시간) 뉴욕 매디슨
스퀘어 가든에서 열린 제60회 그래미 시상식에서 남편 제이지와 딸 블루 아이비와 함께했다. 비욘세, 제이지 부부는 할리우드 여배우들의 성폭력
고발 캠페인 ‘미투’에 동참한 듯 모두 검은색 의상을 입었다. 6살 된 딸은 흰색 긴 재킷을 입었다. 특히 아이 옆에 앉은 비욘세의 드레스는
시선을 강탈하기 충분했다. 자리에 앉아 있을 때 양옆으로 쫙 벌어지는 긴 드레스였다. 허벅지까지 노출되는 과감한 드레스였다. 아이와 동석한
행사에서 나온 모습이라고 상상하기 어렵다. 비욘세는 지난해 6월 낳은 쌍둥이에 바로 옆에 동석한 딸까지, 세 아이의 엄마다.'
- source_sentence: 'query: 2017년 말산업실태조사 결과에 따르면 승마시설 수는?'
sentences:
- 'passage: 승마시설 512곳으로 늘어.. 말 사육 수 2만7210마리 귀족 레포츠라는 이미지가 짙은 승마를 즐기는 국민이 늘고 있다.
승마인구 증가로 말산업 또한 확대되면서 말산업으로 인한 경제적 효과는 3조4000억원에 이르는 것으로 분석됐다. 6일 농림축산식품부가 발표한
2017년 말산업실태조사 결과에 따르면 승마시설 수는 전년 대비 6.9% 증가한 512개소다. 이는 2012~2016년 제1차 말산업육성 종합계획상
목표인 500개소를 초과 달성한 수치다. 말산업 사업체 수는 전년 대비 8.4% 증가한 2470개소다. 이 중 말 보유 사업체 수는 9.5%
증가한 2146개소다. 말 사육마릿수는 전년 대비 0.3% 증가한 2만7210마리다. 사육용도는 승용이 1만1458마리(42.1%)로 가장
많고 경주용 7303마리(26.8%), 번식용 4549마리(16.7%), 육용 865마리(3.2%) 등이다. 특히 승마인구는 1만292명 증가한
94만8714명으로 지속적인 성장 추세다. 이 중 정기적으로 승마를 하는 즐기는 정기 승마인구는 전년 대비 3.9% 증가한 4만9312명이다.
체험 승마인구는 전년 대비 0.8% 증가한 89만9402명이다. 말산업 종사자는 전년 대비 0.1% 증가한 6946명이다. 다만 경마분야 종사자는
전년 대비 4.2% 감소한 9315명을 기록했다. 이는 무인발매기 도입으로 인한 발매인원이 감소한 데 따른 것으로 분석됐다.'
- 'passage: {IMG:1}블랙리스트 타파와 공공성 회복을 위한 연극인회의(블랙타파)가 한국연극협회 이사장을 향해 재차 사퇴할 것을 촉구했다.
블랙타파는 1일 정대경 이사장은 더 이상 거짓말 하지 말고 사퇴하라는 논평을 통해, 정 이사장이 언론 등을 통해 밝힌 해명이 거짓이라고 지적했다.
지난달 29일 블랙타파가 한국연극협회 이사회는 정대경 이사장을 탄핵하라! 는 제목의 성명서를 발표하자, 이후 정 이사장은 언론을 통해 예술위
안에서 지원 배제를 막기 위해 가장 열심히 싸웠다 블랙리스트의 존재를 알고부터 이를 원만하게 해결하게 위해 최선을 다했는데 억울하다고 호소하며,
자신사퇴를 거부했다. 블랙타파는 정 이사장의 입장은 이미 밝혀진 사실을 왜곡하는 행위라고 말하는 동시에 언론을 향해 더 이상 정대경 이사장의
거짓말을 실어나르지 말라고 주장했다. 블랙리스트 진상조사 및 제도개선위원회의 진상조사(진상조사위) 결과, 정 이사장은 한국문화예술위원회 위원으로서
임무를 수행하던 기간에 블랙리스트 실행을 묵인 방조하고, 블랙리스트 피해자 배제에 깊이 관여한 것으로 드러났다. 하지만 연극인이 더욱 분노하는
지점은 그가 한국연극협회 이사장 선거에 문체부 등에 요청하여 관권을 개입시킨 범죄사실이다. 블랙타파는 진상조사위가 지난 8일 발표한 종합결과보고
자료집에는 정대경 이사장이 예술위 위원의 직위를 이용해 문체부, 예술위에 선거 개입을 요구한 사실이 적시되어 있다며 정 이사장의 선거개입 요청에
대한 다수의 진술이 있으며, 정대경 이사장은 블랙리스트 실행을 막기 위해 양해조치를 요구했다고 주장하고 있지만 그것이 선거에 도움이 될 것이라고
생각했다고 스스로 진술하였다고 했다. 이어 블랙타파는 블랙리스트 묵인, 방조, 협조 등의 행위도 문제적이지만 선거에 관권을 끌어들인 행위는
그 자체가 범죄행위라며 자신의 당선을 위해 민간협회 독립성을 스스로 무너뜨린 파렴치한 행위가 아닐 수 없다고 지적했다. 끝으로 블랙타파는 정
이사장을 향해 지난 과오를 반성하고 사죄하며 최소한의 책임이라도 다하는 것이 연극인으로서 자신의 명예를 지키는 일이다고 충고하고, 한국연극협회에
정 이사장의 범죄 사실에 대한 정확한 입장을 밝혀주기 바란다고 촉구했다. 블랙타파는 지난해 4월, 120개 단체와 600여 명의 연극인·예술가의
지지를 통해 결성된 단체이다. 블랙타파는 정 이사장이 스스로 물러나지 않거나, 이사회가 그를 탄핵하지 않을 시 직권남용권리행사방해죄의 공범
및 직무유기죄 등으로 형사고발한다는 입장이다.'
- 'passage: 보령제약은 필리핀 제약사인 인듀어메디칼과 3년간 50억원 규모의 항암제 이피마이신주, 에이디마이신주, 옥살리플라틴주, 네오타빈주
4개 품목에 대한 독점공급계약을 체결했다고 12일 밝혔다. 계약식은 지난 11일 스페인 마드리드 이페마(IFEMA) 컨벤션센터에서 개최된 세계의약품전시회(CPhI
Worldwdie) 현장에서 진행됐다. 이번 계약으로 보령제약은 3년 간 총 50억원(400만 달러) 규모의 의약품을 공급하게 됐으며 1차
항암제에 대한 동남아 시장 확대를 가속화하게 됐다. 보령제약은 태국, 인도네시아, 베트남 등 동남아시아 지역에 항암제를 수출하고 있다. 이를
기반으로 아프리카, 중남미지역 등으로 시장을 더욱 넓혀갈 계획이다. 한편, 2002년 창립된 인듀어메디칼은 필파마웰스그룹의 7개 계열사 중
하나로 대표 품목은 페니실린계 의약품 및 흡입마취제 등이다. 필리핀과 싱가포르 제조시설을 거점으로 글로벌 파트너사와 함께 60여개의 의료기기
및 의약품을 통해 시장을 선도하고 있는 중견 제약사다.'
- source_sentence: 'query: LG유플러스 새로운 사령탑에 누가 올랐어?'
sentences:
- 'passage: 현재 코스모화학(005420)은 전일대비 8.44% 상승하여 25,050원 선에서 거래가 이루어지고 있다. 상장주식수 대비
거래량은 2.55%로 적정수준 최근 한달간 코스모화학의 상장주식수 대비 거래량을 비교해보니 일별 매매회전율이 2.55%로 집계됐다. 그리고
동기간 장중 저점과 고점의 차이인 일평균 주가변동률을 4.32%를 나타내고 있다. 외국인 보유량 감소, 기관 증가 (5일 누적) 최근 5일간
동종목의 외국인 투자자는 3,346주 순매도하였고 반면 기관은 4,650주 순매수하였다. 전날 두 주체 각각 9,693주, 4,650주 순매수하며
보유 비율을 늘렸다. 다음 표는 동종목의 외국인, 기관의 일별 순매수량을 나타낸 것이다. fnRASSI는 증권전문 기업 씽크풀과 파이낸셜뉴스의
협업으로 로봇기자가 실시간으로 생산하는 기사입니다.'
- 'passage: 현재 엠에스오토텍(123040)은 전일대비 9.39% 상승하여 1,980원 선에서 거래가 이루어지고 있다. 주가등락폭이 너무
커 최근 한달간 엠에스오토텍의 상장주식수 대비 거래량을 비교해보니 일별 매매회전율이 0.34%로 집계됐다. 동기간 일평균 주가변동률은 17.52%로
장중 주가등락이 심하기 때문에 투자시 주의가 요망된다. 외국인/개인 순매수, 기관은 순매도(한달누적) 전일 외국인이 4,382주를 순매수했다.
4주간을 기준으로 보면 외국인이 매매관점을 매도에서 매수로 전환하면서 3,781주를 순매수했고, 개인들도 매기를 떨어뜨리면서도 10,851주를
순매수했다. 하지만 기관은 순매수에서 순매도로 반전되면서 14,632주를 순매도했다. fnRASSI는 증권전문 기업 씽크풀과 파이낸셜뉴스의
협업으로 로봇기자가 실시간으로 생산하는 기사입니다.'
- 'passage: {IMG:1}LG유플러스 새로운 사령탑에 하현회 부회장이 올랐다. 구광모 LG회장 취임 뒤 그룹 내 첫 인사이자, 5G 통신
상용화, 케이블TV 인수합병 등 굵직한 사안들을 앞둔 만큼 그 변화에 관심이 쏠린다. LG유플러스는 16일 이사회를 열고 하 부회장을 신임
최고경영자(CEO)로 선임했다. 부산대 사학과, 일본 와세다대 경영학 석사를 마친 하 부회장은 지난 1985년 LG금속에 입사한 뒤, LG디스플레이에서
전략기획담당, 중소형사업부장, IT사업부장을 지냈다. 지난 2012년부터 2년간 ㈜LG 시너지팀장을 맡은 그는, 모바일과 차세대 디스플레이,
에너지 솔루션과 친환경 자동차부품 등 그룹 주력사업과 차세대 성장사업이 본궤도에 진입할 수 있도록 기반을 확립했다. 아울러 LG전자 HE사업본부장을
맡아 울트라 올레드 TV를 세계 최초로 출시해 차세대 TV 부문에서 선도적으로 사업기반을 구축했다. 특히 지난 2015년부터 ㈜LG 대표이사를
맡으면서 계열사 실적 개선을 이끌며 LG그룹이 사업 성과를 거두는 데 핵심적인 역할을 수행했다. 올해 그는 ㈜LG 부회장으로 승진했다. 하
부회장은 지난 2015년부터 LG유플러스 이사회 멤버로 주요 의사결정에 참여해오기도 했다. 모바일 비디오, IoT, AI, 5G 등 통신산업의
변화에 대한 이해가 깊고 미래 먹거리 사업에 힘을 기울여왔다는 평가다. 하 부회장은 LG그룹 내 재무통으로 불리는 권영수 부회장과 달리 전략통으로
통하기 때문에 향후 투자를 늘리는 방향에도 관심이 쏠린다. 다만, 하 부회장의 어깨는 무겁다. 그가 이끌어야 할 LG유플러스 앞에는 풀어야할
숙제가 산적해 있기 때문이다. 5G 상용화를 위한 장비사 선정이 가장 시급한 과제다. 전임 권영수 부회장은 5G 장비로 중국 화웨이를 사용하겠다고
밝혔지만, 이에 국내 여론은 싸늘한 편이다. 실제 5G에서도 LG유플러스가 화웨이 장비를 도입한다면 비판 여론을 풀어나가는 것은 하 부회장의
몫이 될 예정이다. LG유플러스가 이미 추진하고 있는 케이블TV MA에서도 하 부회장의 경영 능력이 시험대에 오를 전망이다. LG유플러스는
미디어 플랫폼 확대를 위해 CJ헬로를 비롯한 케이블TV MA에 공개적으로 관심을 드러낸 바 있다. LG유플러스가 케이블TV MA를 위해 다양한
사업자들과 접촉했던 만큼 하 부회장의 결정은 방송·통신 시장에서 새로운 변화의 바람으로 작용할 것으로 보인다. 세계 최대 온라인동영상서비스(OTT)
넷플릭스와의 제휴도 관심사다. 전임 권 부회장이 넷플릭스와 제휴를 의욕적으로 추진했지만, LG유플러스 내부적으로는 조심스러운 분위기가 감지됐던
것이 현실이다. 회사 내부적으로는 미디어 사업 1위 목표를 내세우고 있다. 콘텐츠 대가 산정 비율이나 망 사용료 등은 비단 LG유플러스에게만
적용되는 것이 아니기 때문이다. 논란을 피해 어느 정도 수준의 협상력을 발휘할지는 하 부회장의 조율 능력이 중요한 것으로 예상된다. 한편,
전임 CEO인 권영수 부회장이 ㈜LG 대표이사로 옮기더라도 이전까지 하 부회장의 위치처럼 LG유플러스 이사회 내 기타비상무이사로 남는 점도
주목된다. 이에 따라, 하 부회장 체제의 조직 재정비는 연말 그룹 인사에 맞춰 진행되고, 올해 남은 기간 동안 기존 경영 기조를 이어갈 것이란
전망이 지배적이다. 통신업계에서는 LG유플러스가 새로운 CEO를 맞이한 만큼 케이블TV 인수나 넷플릭스 제휴 확대와 같은 현안을 두고 새판을
짜기 위한 결정이 머지 않아 나올 것으로 내다보고 있다. LG유플러스는 관계자는 하 부회장이 ㈜LG에서 계열사간 시너지를 창출한 경험을 바탕으로
LG유플러스의 본원적 경쟁력을 지속 강화하는 동시에 미래시장 개척을 위한 신성장동력 발굴에 집중해 나갈 예정이라고 강조했다.'
- source_sentence: 'query: 쿠첸은 적자행진에서 어떻게 벗어났어?'
sentences:
- 'passage: 쿠첸이 4분기만에 적자행진에서 벗어났다. 수익성 회복을 위한 판매 채널 확대, 원가율 개선 노력과 신제품 출시 등 호재가
맞물렸다. 16일 생활가전업계 및 증권가에 따르면 쿠첸은 지난 1.4분기 619억원의 매출액과 13억원의 영업이익을 기록했다. 매출액은 1년전
685억원에 비해 9.7% 줄었지만 영업이익은 6억원에서 13억원으로 두배이상 늘었다. 이에 따라 쿠첸은 지난 2.4분기부터 시작된 분기 기준
적자 행진에서 벗어나게 됐다. 쿠첸은 지난해 1.4분기 6억원의 영업이익을 기록한 이후 2.4분기에 57억원 손실, 3.4분기 12억원 손실,
4.4분기 20억원 적자 등 작년 한 해 83억원 가량의 손실을 입었다. 금액 기준으로 역대 최대 규모였다. 회사 측은 올 1.4분기 설 연휴로
호실적이 가능했다고 본다. 실제로 지난해 4.4분기 대비로 IH압력밥솥, 열판압력밥솥, 일반밥솥, 전기레인지, 기타품목 모두 매출이 상승했다.
특히 쿠첸의 야심작이었던 IR밥솥과 전기레인지가 함께 나아지고 있는 점이 특징이다. 프리미엄급 IR밥솥 비중이 순차적으로 증가하고 있고, 전기레인지
인지도 확대에 따라 홈쇼핑.B2B 채널에서 판매 호조를 나타냈다고 회사 측은 설명했다. 판매 채널을 수수료가 높은 오프라인에서 온라인 위주로
개선하고, 원가율을 낮추는 등 노력을 기울이고 있다.'
- 'passage: 현대에이치씨엔(126560)은 52주신고가를 기록하고 있어 주목할만하다. 동종목의 현재 주가는 4,650원 선에서 이루어지고
있다. 거래일을 기준으로 최근 3일간 평균 거래량은 31만 주이다. 이는 60일 일 평균 거래량 9만 주와 비교해보면 최근 거래량이 급격히
늘어났다는 것을 알 수 있다. 거래비중 기관 30.5%, 외국인 28.18% 최근 한달간 주체별 거래비중을 살펴보면 개인이 전체 거래량의 41.31%를
차지하며 가장 활발한 참여를 보였고, 기관은 30.5%, 외국인은 28.18%의 거래 참여율을 보였다. 그리고 최근 5일간 거래비중은 개인이
전체 거래의 35.43%를 차지하며 가장 적극적인 모습을 보여줬고, 외국인은 33.83%를 나타냈으며 기관은 30.73%를 나타냈다. 외국인/기관
순매수, 개인은 순매도(한달누적) 전일까지 기관이 2일 연속, 외국인이 10일 연속 각각 동종목을 순매수 했다. 4주간을 기준으로 보면 외국인이
초반에 동종목을 순매수한 이후에 기세를 이어가며 158,133주를 순매수했고, 기관도 초반에 동종목을 순매수한 이후에 최근에는 순매수량이 줄어들고
있음에도 불구하고 3,362주를 순매수했다. 반면 개인들은 매도쪽으로 방향을 잡으면서 161,495주를 순매도한 것으로 나타났다. fnRASSI는
증권전문 기업 씽크풀과 파이낸셜뉴스의 협업으로 로봇기자가 실시간으로 생산하는 기사입니다.'
- 'passage: 세계적인 복서 플로이드 메이웨더 주니어 (Floyd Mayweather Jr.)가 전용기를 타고 국내 입국할 예정이다. 열렬한
힙합 마니아로 잘 알려진 메이웨더는 본인의 공식 브랜드 TMT 산하 음악 레이블 TMT MUSIC GROUP을 런칭. 래퍼 릴 웨인(Lil
Wayne), 떠오르는 힙합 루키 릴 제임스(Lil Jamez)와 함께 TMT MUSIC을 한국에 알리기 위해 첫 내한을 결정했다는 후문이다.
메이웨더는 이번 일정 중에 26일에는 서울 강남구 신사동 ‘바이브레이트 플래그쉽 스토어’에서 팬들과 만난다. 팬들과 가깝게 만나는 팬사인회
일정으로 진행될 예정이다. 특히 이날 행사에는 세계 챔피언 복서 플로이드 메이웨더 주니어(Floyd Mayweather Jr. 이하 메이웨더)를
비롯해 래퍼 릴 웨인과, 릴 제임스가 함께할 예정으로 더욱 기대를 모은다. 메이웨더는 총 자산 규모가 1조가 넘을 정도로 소위 얘기하는 ‘거물급’
스포츠 스타이다. 경제지 포브스에 따르면 2015년 기준 세계에서 가장 많은 돈을 번 스포츠 스타며 메이웨더의 별명은 ‘머니’로 그의 남다른
돈자랑은 SNS는 물론 언론에서까지 항상 화제를 낳은 바 있다. 메이웨더는 1996년 데뷔 이후 49전 ‘무’패 최근 UFC역사상 가장 핫한
‘코너 맥그리거’와의 결투에서 승리한 스포츠 스타다. 이번 팬미팅은 국내 팬들을 위해 메이웨더 측에서 준비한 깜짝 이벤트로 한국스타들의 뜨거운
사랑을 받고 있는 감각적인 스트릿 브랜드 바이브레이트에서 주최해 팬들에게 잊지 못할 감동과 추억을 선사할 예정이다. 한편 메이웨더 팬미팅에
대한 상세 정보 및 이벤트는 바이브레이트 공식 SNS를 통해 확인이 가능하다.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on intfloat/e5-large-unsupervised
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/e5-large-unsupervised](https://huggingface.co/intfloat/e5-large-unsupervised). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/e5-large-unsupervised](https://huggingface.co/intfloat/e5-large-unsupervised) <!-- at revision 15af9288f69a6291f37bfb89b47e71abc747b206 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("seongil-dn/e5-large-filtered-our-neg5-7392")
# Run inference
sentences = [
'query: 쿠첸은 적자행진에서 어떻게 벗어났어?',
'passage: 쿠첸이 4분기만에 적자행진에서 벗어났다. 수익성 회복을 위한 판매 채널 확대, 원가율 개선 노력과 신제품 출시 등 호재가 맞물렸다. 16일 생활가전업계 및 증권가에 따르면 쿠첸은 지난 1.4분기 619억원의 매출액과 13억원의 영업이익을 기록했다. 매출액은 1년전 685억원에 비해 9.7% 줄었지만 영업이익은 6억원에서 13억원으로 두배이상 늘었다. 이에 따라 쿠첸은 지난 2.4분기부터 시작된 분기 기준 적자 행진에서 벗어나게 됐다. 쿠첸은 지난해 1.4분기 6억원의 영업이익을 기록한 이후 2.4분기에 57억원 손실, 3.4분기 12억원 손실, 4.4분기 20억원 적자 등 작년 한 해 83억원 가량의 손실을 입었다. 금액 기준으로 역대 최대 규모였다. 회사 측은 올 1.4분기 설 연휴로 호실적이 가능했다고 본다. 실제로 지난해 4.4분기 대비로 IH압력밥솥, 열판압력밥솥, 일반밥솥, 전기레인지, 기타품목 모두 매출이 상승했다. 특히 쿠첸의 야심작이었던 IR밥솥과 전기레인지가 함께 나아지고 있는 점이 특징이다. 프리미엄급 IR밥솥 비중이 순차적으로 증가하고 있고, 전기레인지 인지도 확대에 따라 홈쇼핑.B2B 채널에서 판매 호조를 나타냈다고 회사 측은 설명했다. 판매 채널을 수수료가 높은 오프라인에서 온라인 위주로 개선하고, 원가율을 낮추는 등 노력을 기울이고 있다.',
'passage: 현대에이치씨엔(126560)은 52주신고가를 기록하고 있어 주목할만하다. 동종목의 현재 주가는 4,650원 선에서 이루어지고 있다. 거래일을 기준으로 최근 3일간 평균 거래량은 31만 주이다. 이는 60일 일 평균 거래량 9만 주와 비교해보면 최근 거래량이 급격히 늘어났다는 것을 알 수 있다. 거래비중 기관 30.5%, 외국인 28.18% 최근 한달간 주체별 거래비중을 살펴보면 개인이 전체 거래량의 41.31%를 차지하며 가장 활발한 참여를 보였고, 기관은 30.5%, 외국인은 28.18%의 거래 참여율을 보였다. 그리고 최근 5일간 거래비중은 개인이 전체 거래의 35.43%를 차지하며 가장 적극적인 모습을 보여줬고, 외국인은 33.83%를 나타냈으며 기관은 30.73%를 나타냈다. 외국인/기관 순매수, 개인은 순매도(한달누적) 전일까지 기관이 2일 연속, 외국인이 10일 연속 각각 동종목을 순매수 했다. 4주간을 기준으로 보면 외국인이 초반에 동종목을 순매수한 이후에 기세를 이어가며 158,133주를 순매수했고, 기관도 초반에 동종목을 순매수한 이후에 최근에는 순매수량이 줄어들고 있음에도 불구하고 3,362주를 순매수했다. 반면 개인들은 매도쪽으로 방향을 잡으면서 161,495주를 순매도한 것으로 나타났다. fnRASSI는 증권전문 기업 씽크풀과 파이낸셜뉴스의 협업으로 로봇기자가 실시간으로 생산하는 기사입니다.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 40
- `learning_rate`: 0.0001
- `adam_epsilon`: 1e-07
- `num_train_epochs`: 5
- `warmup_ratio`: 0.05
- `bf16`: True
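A minimal sketch of how a comparable run could be set up with the Sentence Transformers trainer is shown below; it mirrors the loss and the hyperparameters above, but the dataset construction is illustrative and is not the actual training script.

```python
# Illustrative training sketch: MultipleNegativesRankingLoss with the hyperparameters listed above.
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("intfloat/e5-large-unsupervised")

# (anchor, positive) pairs using this model's "query: " / "passage: " prefixes
train_dataset = Dataset.from_dict({
    "anchor": ["query: 쿠첸은 적자행진에서 어떻게 벗어났어?"],
    "positive": ["passage: 쿠첸이 4분기만에 적자행진에서 벗어났다. 수익성 회복을 위한 판매 채널 확대, 원가율 개선 노력과 신제품 출시 등 호재가 맞물렸다."],
})

args = SentenceTransformerTrainingArguments(
    output_dir="e5-large-finetuned",
    per_device_train_batch_size=40,
    learning_rate=1e-4,
    adam_epsilon=1e-7,
    num_train_epochs=5,
    warmup_ratio=0.05,
    bf16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),
)
trainer.train()
```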
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 40
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.0001
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-07
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.0005 | 1 | 3.9557 |
| 0.0011 | 2 | 3.9207 |
| 0.0016 | 3 | 3.9878 |
| 0.0022 | 4 | 3.8356 |
| 0.0027 | 5 | 3.8777 |
| 0.0032 | 6 | 3.9141 |
| 0.0038 | 7 | 3.8113 |
| 0.0043 | 8 | 3.9641 |
| 0.0049 | 9 | 3.9577 |
| 0.0054 | 10 | 3.8921 |
| 0.0060 | 11 | 3.9589 |
| 0.0065 | 12 | 3.9007 |
| 0.0070 | 13 | 3.9295 |
| 0.0076 | 14 | 3.8336 |
| 0.0081 | 15 | 3.823 |
| 0.0087 | 16 | 3.8653 |
| 0.0092 | 17 | 3.8996 |
| 0.0097 | 18 | 3.8196 |
| 0.0103 | 19 | 3.8609 |
| 0.0108 | 20 | 3.7583 |
| 0.0114 | 21 | 3.823 |
| 0.0119 | 22 | 3.7987 |
| 0.0124 | 23 | 3.7744 |
| 0.0130 | 24 | 3.7279 |
| 0.0135 | 25 | 3.735 |
| 0.0141 | 26 | 3.713 |
| 0.0146 | 27 | 3.6465 |
| 0.0152 | 28 | 3.7064 |
| 0.0157 | 29 | 3.5865 |
| 0.0162 | 30 | 3.6699 |
| 0.0168 | 31 | 3.5554 |
| 0.0173 | 32 | 3.5139 |
| 0.0179 | 33 | 3.4726 |
| 0.0184 | 34 | 3.4577 |
| 0.0189 | 35 | 3.4131 |
| 0.0195 | 36 | 3.3885 |
| 0.0200 | 37 | 3.3399 |
| 0.0206 | 38 | 3.3106 |
| 0.0211 | 39 | 3.2043 |
| 0.0216 | 40 | 3.1255 |
| 0.0222 | 41 | 3.2178 |
| 0.0227 | 42 | 3.0531 |
| 0.0233 | 43 | 3.1042 |
| 0.0238 | 44 | 2.9103 |
| 0.0244 | 45 | 2.9482 |
| 0.0249 | 46 | 2.7755 |
| 0.0254 | 47 | 2.7953 |
| 0.0260 | 48 | 2.8629 |
| 0.0265 | 49 | 2.7599 |
| 0.0271 | 50 | 2.6131 |
| 0.0276 | 51 | 2.666 |
| 0.0281 | 52 | 2.6593 |
| 0.0287 | 53 | 2.6734 |
| 0.0292 | 54 | 2.4722 |
| 0.0298 | 55 | 2.5322 |
| 0.0303 | 56 | 2.3349 |
| 0.0308 | 57 | 2.2357 |
| 0.0314 | 58 | 2.3634 |
| 0.0319 | 59 | 2.0923 |
| 0.0325 | 60 | 2.1504 |
| 0.0330 | 61 | 2.0468 |
| 0.0335 | 62 | 1.9572 |
| 0.0341 | 63 | 2.0988 |
| 0.0346 | 64 | 2.1403 |
| 0.0352 | 65 | 1.9407 |
| 0.0357 | 66 | 2.0625 |
| 0.0363 | 67 | 1.8134 |
| 0.0368 | 68 | 1.6808 |
| 0.0373 | 69 | 1.9427 |
| 0.0379 | 70 | 1.82 |
| 0.0384 | 71 | 1.6899 |
| 0.0390 | 72 | 1.6761 |
| 0.0395 | 73 | 1.6193 |
| 0.0400 | 74 | 1.9625 |
| 0.0406 | 75 | 1.7979 |
| 0.0411 | 76 | 1.7376 |
| 0.0417 | 77 | 1.8423 |
| 0.0422 | 78 | 1.681 |
| 0.0427 | 79 | 1.4283 |
| 0.0433 | 80 | 1.7009 |
| 0.0438 | 81 | 1.6515 |
| 0.0444 | 82 | 1.5507 |
| 0.0449 | 83 | 1.7057 |
| 0.0455 | 84 | 1.4614 |
| 0.0460 | 85 | 1.6146 |
| 0.0465 | 86 | 1.5025 |
| 0.0471 | 87 | 1.4857 |
| 0.0476 | 88 | 1.3809 |
| 0.0482 | 89 | 1.6986 |
| 0.0487 | 90 | 1.572 |
| 0.0492 | 91 | 1.5139 |
| 0.0498 | 92 | 1.6796 |
| 0.0503 | 93 | 1.6394 |
| 0.0509 | 94 | 1.6096 |
| 0.0514 | 95 | 1.4346 |
| 0.0519 | 96 | 1.1299 |
| 0.0525 | 97 | 1.3578 |
| 0.0530 | 98 | 1.3381 |
| 0.0536 | 99 | 1.3463 |
| 0.0541 | 100 | 1.2956 |
| 0.0547 | 101 | 1.7073 |
| 0.0552 | 102 | 1.4074 |
| 0.0557 | 103 | 1.3459 |
| 0.0563 | 104 | 1.4079 |
| 0.0568 | 105 | 1.2535 |
| 0.0574 | 106 | 1.3232 |
| 0.0579 | 107 | 1.4095 |
| 0.0584 | 108 | 1.2362 |
| 0.0590 | 109 | 1.3326 |
| 0.0595 | 110 | 1.3922 |
| 0.0601 | 111 | 1.1261 |
| 0.0606 | 112 | 1.204 |
| 0.0611 | 113 | 1.1924 |
| 0.0617 | 114 | 1.4995 |
| 0.0622 | 115 | 1.2202 |
| 0.0628 | 116 | 1.2755 |
| 0.0633 | 117 | 1.3031 |
| 0.0639 | 118 | 1.4851 |
| 0.0644 | 119 | 1.2415 |
| 0.0649 | 120 | 1.521 |
| 0.0655 | 121 | 1.3321 |
| 0.0660 | 122 | 1.2061 |
| 0.0666 | 123 | 1.3673 |
| 0.0671 | 124 | 1.2265 |
| 0.0676 | 125 | 1.2672 |
| 0.0682 | 126 | 1.0618 |
| 0.0687 | 127 | 1.1874 |
| 0.0693 | 128 | 1.0718 |
| 0.0698 | 129 | 1.1261 |
| 0.0703 | 130 | 1.0751 |
| 0.0709 | 131 | 1.2649 |
| 0.0714 | 132 | 1.0998 |
| 0.0720 | 133 | 1.0999 |
| 0.0725 | 134 | 1.1897 |
| 0.0731 | 135 | 1.0635 |
| 0.0736 | 136 | 1.104 |
| 0.0741 | 137 | 0.9697 |
| 0.0747 | 138 | 1.201 |
| 0.0752 | 139 | 1.0408 |
| 0.0758 | 140 | 1.0786 |
| 0.0763 | 141 | 0.8686 |
| 0.0768 | 142 | 0.9676 |
| 0.0774 | 143 | 0.9179 |
| 0.0779 | 144 | 1.0294 |
| 0.0785 | 145 | 1.0401 |
| 0.0790 | 146 | 1.1051 |
| 0.0795 | 147 | 0.9318 |
| 0.0801 | 148 | 1.0279 |
| 0.0806 | 149 | 0.9114 |
| 0.0812 | 150 | 0.9359 |
| 0.0817 | 151 | 0.8913 |
| 0.0823 | 152 | 0.9284 |
| 0.0828 | 153 | 1.0118 |
| 0.0833 | 154 | 1.1165 |
| 0.0839 | 155 | 1.1369 |
| 0.0844 | 156 | 0.8976 |
| 0.0850 | 157 | 1.0009 |
| 0.0855 | 158 | 0.9414 |
| 0.0860 | 159 | 0.868 |
| 0.0866 | 160 | 1.1353 |
| 0.0871 | 161 | 0.9101 |
| 0.0877 | 162 | 0.7246 |
| 0.0882 | 163 | 1.0012 |
| 0.0887 | 164 | 0.9211 |
| 0.0893 | 165 | 0.9428 |
| 0.0898 | 166 | 0.8289 |
| 0.0904 | 167 | 0.9593 |
| 0.0909 | 168 | 1.2092 |
| 0.0915 | 169 | 0.7338 |
| 0.0920 | 170 | 0.8588 |
| 0.0925 | 171 | 0.9816 |
| 0.0931 | 172 | 0.9474 |
| 0.0936 | 173 | 1.0677 |
| 0.0942 | 174 | 1.117 |
| 0.0947 | 175 | 0.898 |
| 0.0952 | 176 | 0.8501 |
| 0.0958 | 177 | 1.0569 |
| 0.0963 | 178 | 1.0522 |
| 0.0969 | 179 | 0.8532 |
| 0.0974 | 180 | 0.8668 |
| 0.0979 | 181 | 0.8636 |
| 0.0985 | 182 | 0.7857 |
| 0.0990 | 183 | 0.8385 |
| 0.0996 | 184 | 0.7104 |
| 0.1001 | 185 | 0.9013 |
| 0.1006 | 186 | 0.7508 |
| 0.1012 | 187 | 0.6771 |
| 0.1017 | 188 | 0.8624 |
| 0.1023 | 189 | 0.8862 |
| 0.1028 | 190 | 0.9327 |
| 0.1034 | 191 | 0.8809 |
| 0.1039 | 192 | 0.8578 |
| 0.1044 | 193 | 0.833 |
| 0.1050 | 194 | 0.8461 |
| 0.1055 | 195 | 0.7682 |
| 0.1061 | 196 | 0.8086 |
| 0.1066 | 197 | 0.9925 |
| 0.1071 | 198 | 0.9014 |
| 0.1077 | 199 | 0.879 |
| 0.1082 | 200 | 0.8851 |
| 0.1088 | 201 | 0.7543 |
| 0.1093 | 202 | 0.6934 |
| 0.1098 | 203 | 0.716 |
| 0.1104 | 204 | 0.9915 |
| 0.1109 | 205 | 0.7734 |
| 0.1115 | 206 | 0.6162 |
| 0.1120 | 207 | 0.7771 |
| 0.1126 | 208 | 0.7243 |
| 0.1131 | 209 | 0.7599 |
| 0.1136 | 210 | 0.8286 |
| 0.1142 | 211 | 0.6429 |
| 0.1147 | 212 | 0.6823 |
| 0.1153 | 213 | 0.7784 |
| 0.1158 | 214 | 0.635 |
| 0.1163 | 215 | 0.7964 |
| 0.1169 | 216 | 0.7031 |
| 0.1174 | 217 | 0.6108 |
| 0.1180 | 218 | 0.6102 |
| 0.1185 | 219 | 0.5939 |
| 0.1190 | 220 | 0.8973 |
| 0.1196 | 221 | 0.7167 |
| 0.1201 | 222 | 0.6575 |
| 0.1207 | 223 | 0.7441 |
| 0.1212 | 224 | 0.6632 |
| 0.1218 | 225 | 0.7348 |
| 0.1223 | 226 | 0.6358 |
| 0.1228 | 227 | 0.7601 |
| 0.1234 | 228 | 0.6086 |
| 0.1239 | 229 | 0.6904 |
| 0.1245 | 230 | 0.7876 |
| 0.125 | 231 | 0.7261 |
| 0.1255 | 232 | 0.8477 |
| 0.1261 | 233 | 0.6625 |
| 0.1266 | 234 | 0.7851 |
| 0.1272 | 235 | 0.7571 |
| 0.1277 | 236 | 0.8736 |
| 0.1282 | 237 | 0.7984 |
| 0.1288 | 238 | 0.6757 |
| 0.1293 | 239 | 0.6295 |
| 0.1299 | 240 | 0.7864 |
| 0.1304 | 241 | 0.8343 |
| 0.1310 | 242 | 0.6804 |
| 0.1315 | 243 | 0.5932 |
| 0.1320 | 244 | 0.6748 |
| 0.1326 | 245 | 0.4666 |
| 0.1331 | 246 | 0.6346 |
| 0.1337 | 247 | 0.6092 |
| 0.1342 | 248 | 0.7029 |
| 0.1347 | 249 | 0.5889 |
| 0.1353 | 250 | 0.5261 |
| 0.1358 | 251 | 0.6361 |
| 0.1364 | 252 | 0.769 |
| 0.1369 | 253 | 0.7193 |
| 0.1374 | 254 | 0.6804 |
| 0.1380 | 255 | 0.6696 |
| 0.1385 | 256 | 0.5272 |
| 0.1391 | 257 | 0.6706 |
| 0.1396 | 258 | 0.6941 |
| 0.1402 | 259 | 0.6801 |
| 0.1407 | 260 | 0.6171 |
| 0.1412 | 261 | 0.6859 |
| 0.1418 | 262 | 0.8151 |
| 0.1423 | 263 | 0.7727 |
| 0.1429 | 264 | 0.7049 |
| 0.1434 | 265 | 0.6862 |
| 0.1439 | 266 | 0.6732 |
| 0.1445 | 267 | 0.6814 |
| 0.1450 | 268 | 0.6222 |
| 0.1456 | 269 | 0.7482 |
| 0.1461 | 270 | 0.6293 |
| 0.1466 | 271 | 0.5959 |
| 0.1472 | 272 | 0.5825 |
| 0.1477 | 273 | 0.6173 |
| 0.1483 | 274 | 0.5604 |
| 0.1488 | 275 | 0.5701 |
| 0.1494 | 276 | 0.7433 |
| 0.1499 | 277 | 0.6079 |
| 0.1504 | 278 | 0.5344 |
| 0.1510 | 279 | 0.5373 |
| 0.1515 | 280 | 0.7143 |
| 0.1521 | 281 | 0.563 |
| 0.1526 | 282 | 0.6148 |
| 0.1531 | 283 | 0.7291 |
| 0.1537 | 284 | 0.6242 |
| 0.1542 | 285 | 0.6281 |
| 0.1548 | 286 | 0.6458 |
| 0.1553 | 287 | 0.4476 |
| 0.1558 | 288 | 0.5321 |
| 0.1564 | 289 | 0.6032 |
| 0.1569 | 290 | 0.5471 |
| 0.1575 | 291 | 0.6062 |
| 0.1580 | 292 | 0.6591 |
| 0.1585 | 293 | 0.5937 |
| 0.1591 | 294 | 0.6237 |
| 0.1596 | 295 | 0.687 |
| 0.1602 | 296 | 0.6427 |
| 0.1607 | 297 | 0.5706 |
| 0.1613 | 298 | 0.7336 |
| 0.1618 | 299 | 0.6361 |
| 0.1623 | 300 | 0.4598 |
| 0.1629 | 301 | 0.5767 |
| 0.1634 | 302 | 0.4932 |
| 0.1640 | 303 | 0.4852 |
| 0.1645 | 304 | 0.7043 |
| 0.1650 | 305 | 0.5403 |
| 0.1656 | 306 | 0.5415 |
| 0.1661 | 307 | 0.7011 |
| 0.1667 | 308 | 0.4772 |
| 0.1672 | 309 | 0.5594 |
| 0.1677 | 310 | 0.6951 |
| 0.1683 | 311 | 0.4775 |
| 0.1688 | 312 | 0.5433 |
| 0.1694 | 313 | 0.6717 |
| 0.1699 | 314 | 0.6976 |
| 0.1705 | 315 | 0.5541 |
| 0.1710 | 316 | 0.5422 |
| 0.1715 | 317 | 0.4612 |
| 0.1721 | 318 | 0.5404 |
| 0.1726 | 319 | 0.5641 |
| 0.1732 | 320 | 0.4292 |
| 0.1737 | 321 | 0.4008 |
| 0.1742 | 322 | 0.4958 |
| 0.1748 | 323 | 0.5326 |
| 0.1753 | 324 | 0.5201 |
| 0.1759 | 325 | 0.6857 |
| 0.1764 | 326 | 0.602 |
| 0.1769 | 327 | 0.5668 |
| 0.1775 | 328 | 0.4316 |
| 0.1780 | 329 | 0.4176 |
| 0.1786 | 330 | 0.6182 |
| 0.1791 | 331 | 0.5139 |
| 0.1797 | 332 | 0.5957 |
| 0.1802 | 333 | 0.4649 |
| 0.1807 | 334 | 0.5304 |
| 0.1813 | 335 | 0.3467 |
| 0.1818 | 336 | 0.4935 |
| 0.1824 | 337 | 0.4455 |
| 0.1829 | 338 | 0.4838 |
| 0.1834 | 339 | 0.5371 |
| 0.1840 | 340 | 0.4221 |
| 0.1845 | 341 | 0.5174 |
| 0.1851 | 342 | 0.4481 |
| 0.1856 | 343 | 0.4876 |
| 0.1861 | 344 | 0.5236 |
| 0.1867 | 345 | 0.5481 |
| 0.1872 | 346 | 0.4591 |
| 0.1878 | 347 | 0.692 |
| 0.1883 | 348 | 0.6094 |
| 0.1889 | 349 | 0.5291 |
| 0.1894 | 350 | 0.5513 |
| 0.1899 | 351 | 0.491 |
| 0.1905 | 352 | 0.5299 |
| 0.1910 | 353 | 0.5158 |
| 0.1916 | 354 | 0.5136 |
| 0.1921 | 355 | 0.4322 |
| 0.1926 | 356 | 0.4036 |
| 0.1932 | 357 | 0.5907 |
| 0.1937 | 358 | 0.6236 |
| 0.1943 | 359 | 0.3994 |
| 0.1948 | 360 | 0.4724 |
| 0.1953 | 361 | 0.4455 |
| 0.1959 | 362 | 0.5442 |
| 0.1964 | 363 | 0.416 |
| 0.1970 | 364 | 0.431 |
| 0.1975 | 365 | 0.3939 |
| 0.1981 | 366 | 0.4713 |
| 0.1986 | 367 | 0.5071 |
| 0.1991 | 368 | 0.3929 |
| 0.1997 | 369 | 0.4262 |
| 0.2002 | 370 | 0.4844 |
| 0.2008 | 371 | 0.2995 |
| 0.2013 | 372 | 0.6043 |
| 0.2018 | 373 | 0.4498 |
| 0.2024 | 374 | 0.5729 |
| 0.2029 | 375 | 0.5476 |
| 0.2035 | 376 | 0.4076 |
| 0.2040 | 377 | 0.4285 |
| 0.2045 | 378 | 0.3416 |
| 0.2051 | 379 | 0.4576 |
| 0.2056 | 380 | 0.421 |
| 0.2062 | 381 | 0.508 |
| 0.2067 | 382 | 0.6678 |
| 0.2073 | 383 | 0.3123 |
| 0.2078 | 384 | 0.4111 |
| 0.2083 | 385 | 0.5635 |
| 0.2089 | 386 | 0.4951 |
| 0.2094 | 387 | 0.4918 |
| 0.2100 | 388 | 0.391 |
| 0.2105 | 389 | 0.5256 |
| 0.2110 | 390 | 0.3937 |
| 0.2116 | 391 | 0.5605 |
| 0.2121 | 392 | 0.5048 |
| 0.2127 | 393 | 0.4383 |
| 0.2132 | 394 | 0.4155 |
| 0.2137 | 395 | 0.4795 |
| 0.2143 | 396 | 0.403 |
| 0.2148 | 397 | 0.5965 |
| 0.2154 | 398 | 0.4061 |
| 0.2159 | 399 | 0.3772 |
| 0.2165 | 400 | 0.4482 |
| 0.2170 | 401 | 0.4261 |
| 0.2175 | 402 | 0.4683 |
| 0.2181 | 403 | 0.435 |
| 0.2186 | 404 | 0.5485 |
| 0.2192 | 405 | 0.4479 |
| 0.2197 | 406 | 0.4882 |
| 0.2202 | 407 | 0.4688 |
| 0.2208 | 408 | 0.3496 |
| 0.2213 | 409 | 0.3889 |
| 0.2219 | 410 | 0.4166 |
| 0.2224 | 411 | 0.4191 |
| 0.2229 | 412 | 0.4672 |
| 0.2235 | 413 | 0.529 |
| 0.2240 | 414 | 0.4893 |
| 0.2246 | 415 | 0.4869 |
| 0.2251 | 416 | 0.3827 |
| 0.2256 | 417 | 0.4547 |
| 0.2262 | 418 | 0.5323 |
| 0.2267 | 419 | 0.5286 |
| 0.2273 | 420 | 0.4395 |
| 0.2278 | 421 | 0.4486 |
| 0.2284 | 422 | 0.5122 |
| 0.2289 | 423 | 0.3213 |
| 0.2294 | 424 | 0.4675 |
| 0.2300 | 425 | 0.5319 |
| 0.2305 | 426 | 0.4171 |
| 0.2311 | 427 | 0.4579 |
| 0.2316 | 428 | 0.427 |
| 0.2321 | 429 | 0.4374 |
| 0.2327 | 430 | 0.4414 |
| 0.2332 | 431 | 0.4929 |
| 0.2338 | 432 | 0.4007 |
| 0.2343 | 433 | 0.4919 |
| 0.2348 | 434 | 0.5 |
| 0.2354 | 435 | 0.4636 |
| 0.2359 | 436 | 0.4317 |
| 0.2365 | 437 | 0.4267 |
| 0.2370 | 438 | 0.3405 |
| 0.2376 | 439 | 0.5068 |
| 0.2381 | 440 | 0.3878 |
| 0.2386 | 441 | 0.6178 |
| 0.2392 | 442 | 0.3604 |
| 0.2397 | 443 | 0.4737 |
| 0.2403 | 444 | 0.4166 |
| 0.2408 | 445 | 0.4893 |
| 0.2413 | 446 | 0.467 |
| 0.2419 | 447 | 0.5191 |
| 0.2424 | 448 | 0.4461 |
| 0.2430 | 449 | 0.3629 |
| 0.2435 | 450 | 0.457 |
| 0.2440 | 451 | 0.4321 |
| 0.2446 | 452 | 0.4438 |
| 0.2451 | 453 | 0.4121 |
| 0.2457 | 454 | 0.4597 |
| 0.2462 | 455 | 0.5864 |
| 0.2468 | 456 | 0.5268 |
| 0.2473 | 457 | 0.4403 |
| 0.2478 | 458 | 0.4184 |
| 0.2484 | 459 | 0.532 |
| 0.2489 | 460 | 0.4448 |
| 0.2495 | 461 | 0.5599 |
| 0.25 | 462 | 0.4662 |
| 0.2505 | 463 | 0.3816 |
| 0.2511 | 464 | 0.4811 |
| 0.2516 | 465 | 0.5688 |
| 0.2522 | 466 | 0.4477 |
| 0.2527 | 467 | 0.4768 |
| 0.2532 | 468 | 0.5138 |
| 0.2538 | 469 | 0.4371 |
| 0.2543 | 470 | 0.4301 |
| 0.2549 | 471 | 0.5701 |
| 0.2554 | 472 | 0.4021 |
| 0.2560 | 473 | 0.3638 |
| 0.2565 | 474 | 0.4867 |
| 0.2570 | 475 | 0.2128 |
| 0.2576 | 476 | 0.5129 |
| 0.2581 | 477 | 0.4493 |
| 0.2587 | 478 | 0.398 |
| 0.2592 | 479 | 0.403 |
| 0.2597 | 480 | 0.4101 |
| 0.2603 | 481 | 0.4842 |
| 0.2608 | 482 | 0.4045 |
| 0.2614 | 483 | 0.376 |
| 0.2619 | 484 | 0.3203 |
| 0.2624 | 485 | 0.4024 |
| 0.2630 | 486 | 0.3525 |
| 0.2635 | 487 | 0.432 |
| 0.2641 | 488 | 0.4276 |
| 0.2646 | 489 | 0.5266 |
| 0.2652 | 490 | 0.4979 |
| 0.2657 | 491 | 0.4799 |
| 0.2662 | 492 | 0.4078 |
| 0.2668 | 493 | 0.3757 |
| 0.2673 | 494 | 0.4429 |
| 0.2679 | 495 | 0.431 |
| 0.2684 | 496 | 0.3557 |
| 0.2689 | 497 | 0.4048 |
| 0.2695 | 498 | 0.3873 |
| 0.2700 | 499 | 0.4281 |
| 0.2706 | 500 | 0.4625 |
| 0.2711 | 501 | 0.3313 |
| 0.2716 | 502 | 0.4699 |
| 0.2722 | 503 | 0.4945 |
| 0.2727 | 504 | 0.5019 |
| 0.2733 | 505 | 0.5058 |
| 0.2738 | 506 | 0.3148 |
| 0.2744 | 507 | 0.3969 |
| 0.2749 | 508 | 0.3188 |
| 0.2754 | 509 | 0.3656 |
| 0.2760 | 510 | 0.4077 |
| 0.2765 | 511 | 0.4514 |
| 0.2771 | 512 | 0.3854 |
| 0.2776 | 513 | 0.2781 |
| 0.2781 | 514 | 0.3847 |
| 0.2787 | 515 | 0.5575 |
| 0.2792 | 516 | 0.4726 |
| 0.2798 | 517 | 0.544 |
| 0.2803 | 518 | 0.4915 |
| 0.2808 | 519 | 0.3097 |
| 0.2814 | 520 | 0.446 |
| 0.2819 | 521 | 0.4743 |
| 0.2825 | 522 | 0.4088 |
| 0.2830 | 523 | 0.237 |
| 0.2835 | 524 | 0.4589 |
| 0.2841 | 525 | 0.446 |
| 0.2846 | 526 | 0.2436 |
| 0.2852 | 527 | 0.38 |
| 0.2857 | 528 | 0.3016 |
| 0.2863 | 529 | 0.4377 |
| 0.2868 | 530 | 0.368 |
| 0.2873 | 531 | 0.4198 |
| 0.2879 | 532 | 0.4686 |
| 0.2884 | 533 | 0.3889 |
| 0.2890 | 534 | 0.3036 |
| 0.2895 | 535 | 0.4019 |
| 0.2900 | 536 | 0.3801 |
| 0.2906 | 537 | 0.5282 |
| 0.2911 | 538 | 0.4505 |
| 0.2917 | 539 | 0.3379 |
| 0.2922 | 540 | 0.491 |
| 0.2927 | 541 | 0.5154 |
| 0.2933 | 542 | 0.3878 |
| 0.2938 | 543 | 0.311 |
| 0.2944 | 544 | 0.4869 |
| 0.2949 | 545 | 0.5065 |
| 0.2955 | 546 | 0.4938 |
| 0.2960 | 547 | 0.3523 |
| 0.2965 | 548 | 0.3807 |
| 0.2971 | 549 | 0.4386 |
| 0.2976 | 550 | 0.5617 |
| 0.2982 | 551 | 0.3759 |
| 0.2987 | 552 | 0.3336 |
| 0.2992 | 553 | 0.4209 |
| 0.2998 | 554 | 0.3269 |
| 0.3003 | 555 | 0.4954 |
| 0.3009 | 556 | 0.3328 |
| 0.3014 | 557 | 0.4135 |
| 0.3019 | 558 | 0.3406 |
| 0.3025 | 559 | 0.3276 |
| 0.3030 | 560 | 0.3642 |
| 0.3036 | 561 | 0.319 |
| 0.3041 | 562 | 0.2571 |
| 0.3047 | 563 | 0.4727 |
| 0.3052 | 564 | 0.5308 |
| 0.3057 | 565 | 0.2908 |
| 0.3063 | 566 | 0.5056 |
| 0.3068 | 567 | 0.3943 |
| 0.3074 | 568 | 0.424 |
| 0.3079 | 569 | 0.4812 |
| 0.3084 | 570 | 0.2392 |
| 0.3090 | 571 | 0.3348 |
| 0.3095 | 572 | 0.3481 |
| 0.3101 | 573 | 0.3635 |
| 0.3106 | 574 | 0.4103 |
| 0.3111 | 575 | 0.374 |
| 0.3117 | 576 | 0.4625 |
| 0.3122 | 577 | 0.3743 |
| 0.3128 | 578 | 0.3774 |
| 0.3133 | 579 | 0.4476 |
| 0.3139 | 580 | 0.331 |
| 0.3144 | 581 | 0.2418 |
| 0.3149 | 582 | 0.2395 |
| 0.3155 | 583 | 0.3169 |
| 0.3160 | 584 | 0.42 |
| 0.3166 | 585 | 0.3919 |
| 0.3171 | 586 | 0.3291 |
| 0.3176 | 587 | 0.3608 |
| 0.3182 | 588 | 0.3042 |
| 0.3187 | 589 | 0.4197 |
| 0.3193 | 590 | 0.3123 |
| 0.3198 | 591 | 0.3353 |
| 0.3203 | 592 | 0.3026 |
| 0.3209 | 593 | 0.3119 |
| 0.3214 | 594 | 0.4534 |
| 0.3220 | 595 | 0.2781 |
| 0.3225 | 596 | 0.3318 |
| 0.3231 | 597 | 0.3868 |
| 0.3236 | 598 | 0.3877 |
| 0.3241 | 599 | 0.4554 |
| 0.3247 | 600 | 0.4672 |
| 0.3252 | 601 | 0.3281 |
| 0.3258 | 602 | 0.4016 |
| 0.3263 | 603 | 0.2576 |
| 0.3268 | 604 | 0.4192 |
| 0.3274 | 605 | 0.4367 |
| 0.3279 | 606 | 0.353 |
| 0.3285 | 607 | 0.3884 |
| 0.3290 | 608 | 0.3427 |
| 0.3295 | 609 | 0.2705 |
| 0.3301 | 610 | 0.3073 |
| 0.3306 | 611 | 0.2957 |
| 0.3312 | 612 | 0.4355 |
| 0.3317 | 613 | 0.3335 |
| 0.3323 | 614 | 0.3807 |
| 0.3328 | 615 | 0.2344 |
| 0.3333 | 616 | 0.3681 |
| 0.3339 | 617 | 0.3595 |
| 0.3344 | 618 | 0.4536 |
| 0.3350 | 619 | 0.3564 |
| 0.3355 | 620 | 0.3127 |
| 0.3360 | 621 | 0.3122 |
| 0.3366 | 622 | 0.371 |
| 0.3371 | 623 | 0.4385 |
| 0.3377 | 624 | 0.1756 |
| 0.3382 | 625 | 0.4321 |
| 0.3387 | 626 | 0.3797 |
| 0.3393 | 627 | 0.3364 |
| 0.3398 | 628 | 0.3783 |
| 0.3404 | 629 | 0.3627 |
| 0.3409 | 630 | 0.3808 |
| 0.3415 | 631 | 0.3031 |
| 0.3420 | 632 | 0.3321 |
| 0.3425 | 633 | 0.4365 |
| 0.3431 | 634 | 0.2778 |
| 0.3436 | 635 | 0.4142 |
| 0.3442 | 636 | 0.4154 |
| 0.3447 | 637 | 0.4557 |
| 0.3452 | 638 | 0.3573 |
| 0.3458 | 639 | 0.4492 |
| 0.3463 | 640 | 0.3779 |
| 0.3469 | 641 | 0.3539 |
| 0.3474 | 642 | 0.3162 |
| 0.3479 | 643 | 0.3476 |
| 0.3485 | 644 | 0.3196 |
| 0.3490 | 645 | 0.3256 |
| 0.3496 | 646 | 0.3995 |
| 0.3501 | 647 | 0.3244 |
| 0.3506 | 648 | 0.3697 |
| 0.3512 | 649 | 0.3228 |
| 0.3517 | 650 | 0.36 |
| 0.3523 | 651 | 0.2631 |
| 0.3528 | 652 | 0.4452 |
| 0.3534 | 653 | 0.3325 |
| 0.3539 | 654 | 0.3141 |
| 0.3544 | 655 | 0.3839 |
| 0.3550 | 656 | 0.3551 |
| 0.3555 | 657 | 0.3729 |
| 0.3561 | 658 | 0.4533 |
| 0.3566 | 659 | 0.477 |
| 0.3571 | 660 | 0.3208 |
| 0.3577 | 661 | 0.3636 |
| 0.3582 | 662 | 0.3888 |
| 0.3588 | 663 | 0.3271 |
| 0.3593 | 664 | 0.3646 |
| 0.3598 | 665 | 0.4552 |
| 0.3604 | 666 | 0.3981 |
| 0.3609 | 667 | 0.2399 |
| 0.3615 | 668 | 0.2871 |
| 0.3620 | 669 | 0.3694 |
| 0.3626 | 670 | 0.3998 |
| 0.3631 | 671 | 0.2942 |
| 0.3636 | 672 | 0.3286 |
| 0.3642 | 673 | 0.3785 |
| 0.3647 | 674 | 0.3538 |
| 0.3653 | 675 | 0.2747 |
| 0.3658 | 676 | 0.3183 |
| 0.3663 | 677 | 0.4174 |
| 0.3669 | 678 | 0.343 |
| 0.3674 | 679 | 0.3336 |
| 0.3680 | 680 | 0.4012 |
| 0.3685 | 681 | 0.3695 |
| 0.3690 | 682 | 0.4255 |
| 0.3696 | 683 | 0.3505 |
| 0.3701 | 684 | 0.2937 |
| 0.3707 | 685 | 0.298 |
| 0.3712 | 686 | 0.367 |
| 0.3718 | 687 | 0.5086 |
| 0.3723 | 688 | 0.418 |
| 0.3728 | 689 | 0.4873 |
| 0.3734 | 690 | 0.3697 |
| 0.3739 | 691 | 0.4506 |
| 0.3745 | 692 | 0.3652 |
| 0.375 | 693 | 0.444 |
| 0.3755 | 694 | 0.2769 |
| 0.3761 | 695 | 0.3379 |
| 0.3766 | 696 | 0.2788 |
| 0.3772 | 697 | 0.2828 |
| 0.3777 | 698 | 0.3301 |
| 0.3782 | 699 | 0.2256 |
| 0.3788 | 700 | 0.3264 |
| 0.3793 | 701 | 0.3045 |
| 0.3799 | 702 | 0.3137 |
| 0.3804 | 703 | 0.2509 |
| 0.3810 | 704 | 0.2266 |
| 0.3815 | 705 | 0.3861 |
| 0.3820 | 706 | 0.2714 |
| 0.3826 | 707 | 0.3494 |
| 0.3831 | 708 | 0.3505 |
| 0.3837 | 709 | 0.3442 |
| 0.3842 | 710 | 0.4278 |
| 0.3847 | 711 | 0.3468 |
| 0.3853 | 712 | 0.288 |
| 0.3858 | 713 | 0.3128 |
| 0.3864 | 714 | 0.2871 |
| 0.3869 | 715 | 0.3202 |
| 0.3874 | 716 | 0.3707 |
| 0.3880 | 717 | 0.3227 |
| 0.3885 | 718 | 0.3043 |
| 0.3891 | 719 | 0.4236 |
| 0.3896 | 720 | 0.3186 |
| 0.3902 | 721 | 0.3004 |
| 0.3907 | 722 | 0.2342 |
| 0.3912 | 723 | 0.2424 |
| 0.3918 | 724 | 0.259 |
| 0.3923 | 725 | 0.2754 |
| 0.3929 | 726 | 0.3282 |
| 0.3934 | 727 | 0.3854 |
| 0.3939 | 728 | 0.3611 |
| 0.3945 | 729 | 0.3386 |
| 0.3950 | 730 | 0.2737 |
| 0.3956 | 731 | 0.2659 |
| 0.3961 | 732 | 0.3486 |
| 0.3966 | 733 | 0.3053 |
| 0.3972 | 734 | 0.3747 |
| 0.3977 | 735 | 0.2901 |
| 0.3983 | 736 | 0.3935 |
| 0.3988 | 737 | 0.4183 |
| 0.3994 | 738 | 0.3442 |
| 0.3999 | 739 | 0.387 |
| 0.4004 | 740 | 0.3109 |
| 0.4010 | 741 | 0.3422 |
| 0.4015 | 742 | 0.2999 |
| 0.4021 | 743 | 0.3035 |
| 0.4026 | 744 | 0.368 |
| 0.4031 | 745 | 0.3345 |
| 0.4037 | 746 | 0.3881 |
| 0.4042 | 747 | 0.2789 |
| 0.4048 | 748 | 0.3644 |
| 0.4053 | 749 | 0.4616 |
| 0.4058 | 750 | 0.3665 |
| 0.4064 | 751 | 0.2822 |
| 0.4069 | 752 | 0.3898 |
| 0.4075 | 753 | 0.3016 |
| 0.4080 | 754 | 0.382 |
| 0.4085 | 755 | 0.2288 |
| 0.4091 | 756 | 0.2529 |
| 0.4096 | 757 | 0.3091 |
| 0.4102 | 758 | 0.3199 |
| 0.4107 | 759 | 0.3183 |
| 0.4113 | 760 | 0.322 |
| 0.4118 | 761 | 0.3454 |
| 0.4123 | 762 | 0.292 |
| 0.4129 | 763 | 0.322 |
| 0.4134 | 764 | 0.412 |
| 0.4140 | 765 | 0.2618 |
| 0.4145 | 766 | 0.3343 |
| 0.4150 | 767 | 0.2618 |
| 0.4156 | 768 | 0.2666 |
| 0.4161 | 769 | 0.3625 |
| 0.4167 | 770 | 0.3668 |
| 0.4172 | 771 | 0.242 |
| 0.4177 | 772 | 0.3288 |
| 0.4183 | 773 | 0.2378 |
| 0.4188 | 774 | 0.4182 |
| 0.4194 | 775 | 0.2503 |
| 0.4199 | 776 | 0.3372 |
| 0.4205 | 777 | 0.376 |
| 0.4210 | 778 | 0.3128 |
| 0.4215 | 779 | 0.4245 |
| 0.4221 | 780 | 0.2659 |
| 0.4226 | 781 | 0.2981 |
| 0.4232 | 782 | 0.371 |
| 0.4237 | 783 | 0.3893 |
| 0.4242 | 784 | 0.2595 |
| 0.4248 | 785 | 0.3101 |
| 0.4253 | 786 | 0.3973 |
| 0.4259 | 787 | 0.3702 |
| 0.4264 | 788 | 0.3141 |
| 0.4269 | 789 | 0.3044 |
| 0.4275 | 790 | 0.2422 |
| 0.4280 | 791 | 0.3277 |
| 0.4286 | 792 | 0.3433 |
| 0.4291 | 793 | 0.1456 |
| 0.4297 | 794 | 0.2729 |
| 0.4302 | 795 | 0.2927 |
| 0.4307 | 796 | 0.2207 |
| 0.4313 | 797 | 0.3188 |
| 0.4318 | 798 | 0.3016 |
| 0.4324 | 799 | 0.3031 |
| 0.4329 | 800 | 0.3304 |
| 0.4334 | 801 | 0.3583 |
| 0.4340 | 802 | 0.3126 |
| 0.4345 | 803 | 0.3465 |
| 0.4351 | 804 | 0.2763 |
| 0.4356 | 805 | 0.2691 |
| 0.4361 | 806 | 0.2807 |
| 0.4367 | 807 | 0.2768 |
| 0.4372 | 808 | 0.2153 |
| 0.4378 | 809 | 0.2691 |
| 0.4383 | 810 | 0.253 |
| 0.4389 | 811 | 0.2523 |
| 0.4394 | 812 | 0.2883 |
| 0.4399 | 813 | 0.1915 |
| 0.4405 | 814 | 0.279 |
| 0.4410 | 815 | 0.2919 |
| 0.4416 | 816 | 0.197 |
| 0.4421 | 817 | 0.2595 |
| 0.4426 | 818 | 0.3054 |
| 0.4432 | 819 | 0.2602 |
| 0.4437 | 820 | 0.3319 |
| 0.4443 | 821 | 0.2588 |
| 0.4448 | 822 | 0.2652 |
| 0.4453 | 823 | 0.3324 |
| 0.4459 | 824 | 0.3651 |
| 0.4464 | 825 | 0.3622 |
| 0.4470 | 826 | 0.3292 |
| 0.4475 | 827 | 0.3236 |
| 0.4481 | 828 | 0.2803 |
| 0.4486 | 829 | 0.3327 |
| 0.4491 | 830 | 0.2954 |
| 0.4497 | 831 | 0.2895 |
| 0.4502 | 832 | 0.3304 |
| 0.4508 | 833 | 0.1671 |
| 0.4513 | 834 | 0.3462 |
| 0.4518 | 835 | 0.2386 |
| 0.4524 | 836 | 0.2957 |
| 0.4529 | 837 | 0.3055 |
| 0.4535 | 838 | 0.3499 |
| 0.4540 | 839 | 0.2705 |
| 0.4545 | 840 | 0.2859 |
| 0.4551 | 841 | 0.2883 |
| 0.4556 | 842 | 0.3168 |
| 0.4562 | 843 | 0.3809 |
| 0.4567 | 844 | 0.3675 |
| 0.4573 | 845 | 0.3149 |
| 0.4578 | 846 | 0.3214 |
| 0.4583 | 847 | 0.3034 |
| 0.4589 | 848 | 0.2892 |
| 0.4594 | 849 | 0.3221 |
| 0.4600 | 850 | 0.4045 |
| 0.4605 | 851 | 0.3136 |
| 0.4610 | 852 | 0.219 |
| 0.4616 | 853 | 0.2452 |
| 0.4621 | 854 | 0.3133 |
| 0.4627 | 855 | 0.2659 |
| 0.4632 | 856 | 0.3777 |
| 0.4637 | 857 | 0.3309 |
| 0.4643 | 858 | 0.5038 |
| 0.4648 | 859 | 0.2956 |
| 0.4654 | 860 | 0.3282 |
| 0.4659 | 861 | 0.2139 |
| 0.4665 | 862 | 0.3297 |
| 0.4670 | 863 | 0.2965 |
| 0.4675 | 864 | 0.2847 |
| 0.4681 | 865 | 0.3407 |
| 0.4686 | 866 | 0.2593 |
| 0.4692 | 867 | 0.3223 |
| 0.4697 | 868 | 0.2832 |
| 0.4702 | 869 | 0.2503 |
| 0.4708 | 870 | 0.2626 |
| 0.4713 | 871 | 0.2828 |
| 0.4719 | 872 | 0.3443 |
| 0.4724 | 873 | 0.2684 |
| 0.4729 | 874 | 0.2216 |
| 0.4735 | 875 | 0.2828 |
| 0.4740 | 876 | 0.2433 |
| 0.4746 | 877 | 0.3476 |
| 0.4751 | 878 | 0.3005 |
| 0.4756 | 879 | 0.3113 |
| 0.4762 | 880 | 0.1482 |
| 0.4767 | 881 | 0.3019 |
| 0.4773 | 882 | 0.278 |
| 0.4778 | 883 | 0.2292 |
| 0.4784 | 884 | 0.4266 |
| 0.4789 | 885 | 0.3373 |
| 0.4794 | 886 | 0.1929 |
| 0.4800 | 887 | 0.347 |
| 0.4805 | 888 | 0.314 |
| 0.4811 | 889 | 0.4008 |
| 0.4816 | 890 | 0.2961 |
| 0.4821 | 891 | 0.2727 |
| 0.4827 | 892 | 0.3792 |
| 0.4832 | 893 | 0.2094 |
| 0.4838 | 894 | 0.2639 |
| 0.4843 | 895 | 0.2567 |
| 0.4848 | 896 | 0.3415 |
| 0.4854 | 897 | 0.3522 |
| 0.4859 | 898 | 0.1885 |
| 0.4865 | 899 | 0.2762 |
| 0.4870 | 900 | 0.2533 |
| 0.4876 | 901 | 0.2415 |
| 0.4881 | 902 | 0.2657 |
| 0.4886 | 903 | 0.3201 |
| 0.4892 | 904 | 0.2741 |
| 0.4897 | 905 | 0.2998 |
| 0.4903 | 906 | 0.2234 |
| 0.4908 | 907 | 0.3571 |
| 0.4913 | 908 | 0.3524 |
| 0.4919 | 909 | 0.2165 |
| 0.4924 | 910 | 0.1526 |
| 0.4930 | 911 | 0.3463 |
| 0.4935 | 912 | 0.3997 |
| 0.4940 | 913 | 0.3133 |
| 0.4946 | 914 | 0.2519 |
| 0.4951 | 915 | 0.1886 |
| 0.4957 | 916 | 0.253 |
| 0.4962 | 917 | 0.391 |
| 0.4968 | 918 | 0.2562 |
| 0.4973 | 919 | 0.3635 |
| 0.4978 | 920 | 0.3886 |
| 0.4984 | 921 | 0.2713 |
| 0.4989 | 922 | 0.3201 |
| 0.4995 | 923 | 0.3354 |
| 0.5 | 924 | 0.2783 |
| 0.5005 | 925 | 0.3165 |
| 0.5011 | 926 | 0.2497 |
| 0.5016 | 927 | 0.264 |
| 0.5022 | 928 | 0.3656 |
| 0.5027 | 929 | 0.2733 |
| 0.5032 | 930 | 0.2213 |
| 0.5038 | 931 | 0.1856 |
| 0.5043 | 932 | 0.1603 |
| 0.5049 | 933 | 0.2943 |
| 0.5054 | 934 | 0.2369 |
| 0.5060 | 935 | 0.2739 |
| 0.5065 | 936 | 0.1494 |
| 0.5070 | 937 | 0.3658 |
| 0.5076 | 938 | 0.2276 |
| 0.5081 | 939 | 0.2999 |
| 0.5087 | 940 | 0.2988 |
| 0.5092 | 941 | 0.2326 |
| 0.5097 | 942 | 0.2243 |
| 0.5103 | 943 | 0.1794 |
| 0.5108 | 944 | 0.3145 |
| 0.5114 | 945 | 0.3793 |
| 0.5119 | 946 | 0.2787 |
| 0.5124 | 947 | 0.1787 |
| 0.5130 | 948 | 0.2505 |
| 0.5135 | 949 | 0.3307 |
| 0.5141 | 950 | 0.2711 |
| 0.5146 | 951 | 0.3728 |
| 0.5152 | 952 | 0.3181 |
| 0.5157 | 953 | 0.3111 |
| 0.5162 | 954 | 0.3536 |
| 0.5168 | 955 | 0.381 |
| 0.5173 | 956 | 0.3083 |
| 0.5179 | 957 | 0.2131 |
| 0.5184 | 958 | 0.3132 |
| 0.5189 | 959 | 0.2763 |
| 0.5195 | 960 | 0.3422 |
| 0.5200 | 961 | 0.2276 |
| 0.5206 | 962 | 0.3179 |
| 0.5211 | 963 | 0.3098 |
| 0.5216 | 964 | 0.194 |
| 0.5222 | 965 | 0.2941 |
| 0.5227 | 966 | 0.2221 |
| 0.5233 | 967 | 0.392 |
| 0.5238 | 968 | 0.4026 |
| 0.5244 | 969 | 0.2968 |
| 0.5249 | 970 | 0.2378 |
| 0.5254 | 971 | 0.3238 |
| 0.5260 | 972 | 0.3739 |
| 0.5265 | 973 | 0.2882 |
| 0.5271 | 974 | 0.1461 |
| 0.5276 | 975 | 0.2001 |
| 0.5281 | 976 | 0.2706 |
| 0.5287 | 977 | 0.2886 |
| 0.5292 | 978 | 0.2045 |
| 0.5298 | 979 | 0.2768 |
| 0.5303 | 980 | 0.2673 |
| 0.5308 | 981 | 0.4411 |
| 0.5314 | 982 | 0.3067 |
| 0.5319 | 983 | 0.1799 |
| 0.5325 | 984 | 0.2184 |
| 0.5330 | 985 | 0.1722 |
| 0.5335 | 986 | 0.2289 |
| 0.5341 | 987 | 0.4321 |
| 0.5346 | 988 | 0.3145 |
| 0.5352 | 989 | 0.3244 |
| 0.5357 | 990 | 0.236 |
| 0.5363 | 991 | 0.2328 |
| 0.5368 | 992 | 0.361 |
| 0.5373 | 993 | 0.2481 |
| 0.5379 | 994 | 0.3673 |
| 0.5384 | 995 | 0.288 |
| 0.5390 | 996 | 0.3288 |
| 0.5395 | 997 | 0.3113 |
| 0.5400 | 998 | 0.2939 |
| 0.5406 | 999 | 0.257 |
| 0.5411 | 1000 | 0.1692 |
| 0.5417 | 1001 | 0.2646 |
| 0.5422 | 1002 | 0.2235 |
| 0.5427 | 1003 | 0.2998 |
| 0.5433 | 1004 | 0.3219 |
| 0.5438 | 1005 | 0.2948 |
| 0.5444 | 1006 | 0.2504 |
| 0.5449 | 1007 | 0.2156 |
| 0.5455 | 1008 | 0.1564 |
| 0.5460 | 1009 | 0.3906 |
| 0.5465 | 1010 | 0.1858 |
| 0.5471 | 1011 | 0.2905 |
| 0.5476 | 1012 | 0.4178 |
| 0.5482 | 1013 | 0.3966 |
| 0.5487 | 1014 | 0.3453 |
| 0.5492 | 1015 | 0.3134 |
| 0.5498 | 1016 | 0.3551 |
| 0.5503 | 1017 | 0.3492 |
| 0.5509 | 1018 | 0.2311 |
| 0.5514 | 1019 | 0.3288 |
| 0.5519 | 1020 | 0.41 |
| 0.5525 | 1021 | 0.3142 |
| 0.5530 | 1022 | 0.2731 |
| 0.5536 | 1023 | 0.2918 |
| 0.5541 | 1024 | 0.2448 |
| 0.5547 | 1025 | 0.2518 |
| 0.5552 | 1026 | 0.2748 |
| 0.5557 | 1027 | 0.258 |
| 0.5563 | 1028 | 0.3213 |
| 0.5568 | 1029 | 0.2204 |
| 0.5574 | 1030 | 0.2341 |
| 0.5579 | 1031 | 0.2834 |
| 0.5584 | 1032 | 0.2263 |
| 0.5590 | 1033 | 0.3555 |
| 0.5595 | 1034 | 0.1856 |
| 0.5601 | 1035 | 0.3415 |
| 0.5606 | 1036 | 0.2918 |
| 0.5611 | 1037 | 0.3452 |
| 0.5617 | 1038 | 0.2668 |
| 0.5622 | 1039 | 0.3271 |
| 0.5628 | 1040 | 0.3698 |
| 0.5633 | 1041 | 0.2592 |
| 0.5639 | 1042 | 0.216 |
| 0.5644 | 1043 | 0.2528 |
| 0.5649 | 1044 | 0.3076 |
| 0.5655 | 1045 | 0.2379 |
| 0.5660 | 1046 | 0.2075 |
| 0.5666 | 1047 | 0.339 |
| 0.5671 | 1048 | 0.2381 |
| 0.5676 | 1049 | 0.2966 |
| 0.5682 | 1050 | 0.2599 |
| 0.5687 | 1051 | 0.2928 |
| 0.5693 | 1052 | 0.2545 |
| 0.5698 | 1053 | 0.265 |
| 0.5703 | 1054 | 0.2329 |
| 0.5709 | 1055 | 0.2876 |
| 0.5714 | 1056 | 0.2826 |
| 0.5720 | 1057 | 0.1941 |
| 0.5725 | 1058 | 0.2631 |
| 0.5731 | 1059 | 0.2492 |
| 0.5736 | 1060 | 0.2736 |
| 0.5741 | 1061 | 0.232 |
| 0.5747 | 1062 | 0.2157 |
| 0.5752 | 1063 | 0.2275 |
| 0.5758 | 1064 | 0.1625 |
| 0.5763 | 1065 | 0.2494 |
| 0.5768 | 1066 | 0.2295 |
| 0.5774 | 1067 | 0.2406 |
| 0.5779 | 1068 | 0.3003 |
| 0.5785 | 1069 | 0.3435 |
| 0.5790 | 1070 | 0.3305 |
| 0.5795 | 1071 | 0.3354 |
| 0.5801 | 1072 | 0.2647 |
| 0.5806 | 1073 | 0.1936 |
| 0.5812 | 1074 | 0.2621 |
| 0.5817 | 1075 | 0.2469 |
| 0.5823 | 1076 | 0.412 |
| 0.5828 | 1077 | 0.2876 |
| 0.5833 | 1078 | 0.2778 |
| 0.5839 | 1079 | 0.2909 |
| 0.5844 | 1080 | 0.2386 |
| 0.5850 | 1081 | 0.2368 |
| 0.5855 | 1082 | 0.241 |
| 0.5860 | 1083 | 0.2734 |
| 0.5866 | 1084 | 0.2463 |
| 0.5871 | 1085 | 0.2316 |
| 0.5877 | 1086 | 0.2558 |
| 0.5882 | 1087 | 0.1824 |
| 0.5887 | 1088 | 0.2948 |
| 0.5893 | 1089 | 0.2599 |
| 0.5898 | 1090 | 0.4162 |
| 0.5904 | 1091 | 0.2369 |
| 0.5909 | 1092 | 0.2311 |
| 0.5915 | 1093 | 0.1987 |
| 0.5920 | 1094 | 0.3138 |
| 0.5925 | 1095 | 0.2675 |
| 0.5931 | 1096 | 0.2982 |
| 0.5936 | 1097 | 0.1959 |
| 0.5942 | 1098 | 0.1799 |
| 0.5947 | 1099 | 0.2605 |
| 0.5952 | 1100 | 0.206 |
| 0.5958 | 1101 | 0.231 |
| 0.5963 | 1102 | 0.2363 |
| 0.5969 | 1103 | 0.2812 |
| 0.5974 | 1104 | 0.2339 |
| 0.5979 | 1105 | 0.3221 |
| 0.5985 | 1106 | 0.2915 |
| 0.5990 | 1107 | 0.3309 |
| 0.5996 | 1108 | 0.2318 |
| 0.6001 | 1109 | 0.2752 |
| 0.6006 | 1110 | 0.1957 |
| 0.6012 | 1111 | 0.265 |
| 0.6017 | 1112 | 0.3649 |
| 0.6023 | 1113 | 0.1695 |
| 0.6028 | 1114 | 0.2018 |
| 0.6034 | 1115 | 0.2359 |
| 0.6039 | 1116 | 0.2122 |
| 0.6044 | 1117 | 0.2057 |
| 0.6050 | 1118 | 0.204 |
| 0.6055 | 1119 | 0.23 |
| 0.6061 | 1120 | 0.2983 |
| 0.6066 | 1121 | 0.2853 |
| 0.6071 | 1122 | 0.2189 |
| 0.6077 | 1123 | 0.237 |
| 0.6082 | 1124 | 0.2299 |
| 0.6088 | 1125 | 0.2976 |
| 0.6093 | 1126 | 0.2604 |
| 0.6098 | 1127 | 0.3122 |
| 0.6104 | 1128 | 0.1538 |
| 0.6109 | 1129 | 0.2575 |
| 0.6115 | 1130 | 0.2361 |
| 0.6120 | 1131 | 0.239 |
| 0.6126 | 1132 | 0.2127 |
| 0.6131 | 1133 | 0.1329 |
| 0.6136 | 1134 | 0.2488 |
| 0.6142 | 1135 | 0.1767 |
| 0.6147 | 1136 | 0.2685 |
| 0.6153 | 1137 | 0.1614 |
| 0.6158 | 1138 | 0.219 |
| 0.6163 | 1139 | 0.1816 |
| 0.6169 | 1140 | 0.3093 |
| 0.6174 | 1141 | 0.2004 |
| 0.6180 | 1142 | 0.1699 |
| 0.6185 | 1143 | 0.1965 |
| 0.6190 | 1144 | 0.2868 |
| 0.6196 | 1145 | 0.2281 |
| 0.6201 | 1146 | 0.2095 |
| 0.6207 | 1147 | 0.2903 |
| 0.6212 | 1148 | 0.2944 |
| 0.6218 | 1149 | 0.2932 |
| 0.6223 | 1150 | 0.2656 |
| 0.6228 | 1151 | 0.2554 |
| 0.6234 | 1152 | 0.2429 |
| 0.6239 | 1153 | 0.1766 |
| 0.6245 | 1154 | 0.2101 |
| 0.625 | 1155 | 0.2743 |
| 0.6255 | 1156 | 0.2446 |
| 0.6261 | 1157 | 0.3523 |
| 0.6266 | 1158 | 0.2639 |
| 0.6272 | 1159 | 0.2127 |
| 0.6277 | 1160 | 0.2253 |
| 0.6282 | 1161 | 0.3194 |
| 0.6288 | 1162 | 0.2306 |
| 0.6293 | 1163 | 0.2249 |
| 0.6299 | 1164 | 0.1971 |
| 0.6304 | 1165 | 0.2229 |
| 0.6310 | 1166 | 0.2953 |
| 0.6315 | 1167 | 0.2222 |
| 0.6320 | 1168 | 0.2327 |
| 0.6326 | 1169 | 0.2714 |
| 0.6331 | 1170 | 0.2899 |
| 0.6337 | 1171 | 0.3335 |
| 0.6342 | 1172 | 0.3338 |
| 0.6347 | 1173 | 0.2522 |
| 0.6353 | 1174 | 0.2157 |
| 0.6358 | 1175 | 0.2871 |
| 0.6364 | 1176 | 0.1949 |
| 0.6369 | 1177 | 0.2765 |
| 0.6374 | 1178 | 0.2376 |
| 0.6380 | 1179 | 0.3025 |
| 0.6385 | 1180 | 0.2696 |
| 0.6391 | 1181 | 0.2499 |
| 0.6396 | 1182 | 0.1471 |
| 0.6402 | 1183 | 0.2394 |
| 0.6407 | 1184 | 0.2604 |
| 0.6412 | 1185 | 0.2578 |
| 0.6418 | 1186 | 0.2392 |
| 0.6423 | 1187 | 0.1738 |
| 0.6429 | 1188 | 0.1667 |
| 0.6434 | 1189 | 0.196 |
| 0.6439 | 1190 | 0.2591 |
| 0.6445 | 1191 | 0.308 |
| 0.6450 | 1192 | 0.192 |
| 0.6456 | 1193 | 0.2112 |
| 0.6461 | 1194 | 0.2094 |
| 0.6466 | 1195 | 0.2245 |
| 0.6472 | 1196 | 0.3237 |
| 0.6477 | 1197 | 0.2541 |
| 0.6483 | 1198 | 0.2046 |
| 0.6488 | 1199 | 0.2107 |
| 0.6494 | 1200 | 0.2381 |
| 0.6499 | 1201 | 0.2709 |
| 0.6504 | 1202 | 0.2133 |
| 0.6510 | 1203 | 0.2962 |
| 0.6515 | 1204 | 0.2465 |
| 0.6521 | 1205 | 0.1885 |
| 0.6526 | 1206 | 0.2204 |
| 0.6531 | 1207 | 0.2382 |
| 0.6537 | 1208 | 0.2423 |
| 0.6542 | 1209 | 0.1693 |
| 0.6548 | 1210 | 0.193 |
| 0.6553 | 1211 | 0.1997 |
| 0.6558 | 1212 | 0.2023 |
| 0.6564 | 1213 | 0.3033 |
| 0.6569 | 1214 | 0.3238 |
| 0.6575 | 1215 | 0.272 |
| 0.6580 | 1216 | 0.1372 |
| 0.6585 | 1217 | 0.2613 |
| 0.6591 | 1218 | 0.1268 |
| 0.6596 | 1219 | 0.1873 |
| 0.6602 | 1220 | 0.2099 |
| 0.6607 | 1221 | 0.2877 |
| 0.6613 | 1222 | 0.2513 |
| 0.6618 | 1223 | 0.1897 |
| 0.6623 | 1224 | 0.2419 |
| 0.6629 | 1225 | 0.1538 |
| 0.6634 | 1226 | 0.2188 |
| 0.6640 | 1227 | 0.3067 |
| 0.6645 | 1228 | 0.1894 |
| 0.6650 | 1229 | 0.1053 |
| 0.6656 | 1230 | 0.2598 |
| 0.6661 | 1231 | 0.1724 |
| 0.6667 | 1232 | 0.3421 |
| 0.6672 | 1233 | 0.2734 |
| 0.6677 | 1234 | 0.2917 |
| 0.6683 | 1235 | 0.2467 |
| 0.6688 | 1236 | 0.2112 |
| 0.6694 | 1237 | 0.202 |
| 0.6699 | 1238 | 0.1947 |
| 0.6705 | 1239 | 0.245 |
| 0.6710 | 1240 | 0.1832 |
| 0.6715 | 1241 | 0.3078 |
| 0.6721 | 1242 | 0.2293 |
| 0.6726 | 1243 | 0.2039 |
| 0.6732 | 1244 | 0.2496 |
| 0.6737 | 1245 | 0.2263 |
| 0.6742 | 1246 | 0.1352 |
| 0.6748 | 1247 | 0.2791 |
| 0.6753 | 1248 | 0.2378 |
| 0.6759 | 1249 | 0.2485 |
| 0.6764 | 1250 | 0.1771 |
| 0.6769 | 1251 | 0.1891 |
| 0.6775 | 1252 | 0.1827 |
| 0.6780 | 1253 | 0.2964 |
| 0.6786 | 1254 | 0.1814 |
| 0.6791 | 1255 | 0.2227 |
| 0.6797 | 1256 | 0.2982 |
| 0.6802 | 1257 | 0.2574 |
| 0.6807 | 1258 | 0.1615 |
| 0.6813 | 1259 | 0.2493 |
| 0.6818 | 1260 | 0.2416 |
| 0.6824 | 1261 | 0.1774 |
| 0.6829 | 1262 | 0.2875 |
| 0.6834 | 1263 | 0.2445 |
| 0.6840 | 1264 | 0.2029 |
| 0.6845 | 1265 | 0.1615 |
| 0.6851 | 1266 | 0.3034 |
| 0.6856 | 1267 | 0.2166 |
| 0.6861 | 1268 | 0.243 |
| 0.6867 | 1269 | 0.3196 |
| 0.6872 | 1270 | 0.1158 |
| 0.6878 | 1271 | 0.2559 |
| 0.6883 | 1272 | 0.2497 |
| 0.6889 | 1273 | 0.2773 |
| 0.6894 | 1274 | 0.2716 |
| 0.6899 | 1275 | 0.2651 |
| 0.6905 | 1276 | 0.2193 |
| 0.6910 | 1277 | 0.1634 |
| 0.6916 | 1278 | 0.227 |
| 0.6921 | 1279 | 0.2465 |
| 0.6926 | 1280 | 0.2472 |
| 0.6932 | 1281 | 0.2083 |
| 0.6937 | 1282 | 0.2587 |
| 0.6943 | 1283 | 0.1974 |
| 0.6948 | 1284 | 0.2369 |
| 0.6953 | 1285 | 0.2944 |
| 0.6959 | 1286 | 0.1743 |
| 0.6964 | 1287 | 0.2496 |
| 0.6970 | 1288 | 0.2777 |
| 0.6975 | 1289 | 0.2424 |
| 0.6981 | 1290 | 0.2735 |
| 0.6986 | 1291 | 0.2028 |
| 0.6991 | 1292 | 0.2498 |
| 0.6997 | 1293 | 0.2076 |
| 0.7002 | 1294 | 0.2811 |
| 0.7008 | 1295 | 0.1824 |
| 0.7013 | 1296 | 0.2319 |
| 0.7018 | 1297 | 0.1685 |
| 0.7024 | 1298 | 0.1954 |
| 0.7029 | 1299 | 0.1111 |
| 0.7035 | 1300 | 0.1637 |
| 0.7040 | 1301 | 0.2823 |
| 0.7045 | 1302 | 0.1236 |
| 0.7051 | 1303 | 0.183 |
| 0.7056 | 1304 | 0.1348 |
| 0.7062 | 1305 | 0.2877 |
| 0.7067 | 1306 | 0.1855 |
| 0.7073 | 1307 | 0.2708 |
| 0.7078 | 1308 | 0.3753 |
| 0.7083 | 1309 | 0.1908 |
| 0.7089 | 1310 | 0.2376 |
| 0.7094 | 1311 | 0.1913 |
| 0.7100 | 1312 | 0.3606 |
| 0.7105 | 1313 | 0.2773 |
| 0.7110 | 1314 | 0.2697 |
| 0.7116 | 1315 | 0.1975 |
| 0.7121 | 1316 | 0.1538 |
| 0.7127 | 1317 | 0.1859 |
| 0.7132 | 1318 | 0.2542 |
| 0.7137 | 1319 | 0.2822 |
| 0.7143 | 1320 | 0.2801 |
| 0.7148 | 1321 | 0.283 |
| 0.7154 | 1322 | 0.2867 |
| 0.7159 | 1323 | 0.2219 |
| 0.7165 | 1324 | 0.1737 |
| 0.7170 | 1325 | 0.2847 |
| 0.7175 | 1326 | 0.1857 |
| 0.7181 | 1327 | 0.2882 |
| 0.7186 | 1328 | 0.1674 |
| 0.7192 | 1329 | 0.2335 |
| 0.7197 | 1330 | 0.2781 |
| 0.7202 | 1331 | 0.2415 |
| 0.7208 | 1332 | 0.1948 |
| 0.7213 | 1333 | 0.1982 |
| 0.7219 | 1334 | 0.1902 |
| 0.7224 | 1335 | 0.2258 |
| 0.7229 | 1336 | 0.2633 |
| 0.7235 | 1337 | 0.2871 |
| 0.7240 | 1338 | 0.1722 |
| 0.7246 | 1339 | 0.2522 |
| 0.7251 | 1340 | 0.2276 |
| 0.7256 | 1341 | 0.1596 |
| 0.7262 | 1342 | 0.1786 |
| 0.7267 | 1343 | 0.2323 |
| 0.7273 | 1344 | 0.2323 |
| 0.7278 | 1345 | 0.2707 |
| 0.7284 | 1346 | 0.3087 |
| 0.7289 | 1347 | 0.2989 |
| 0.7294 | 1348 | 0.2444 |
| 0.7300 | 1349 | 0.2781 |
| 0.7305 | 1350 | 0.2269 |
| 0.7311 | 1351 | 0.1858 |
| 0.7316 | 1352 | 0.2539 |
| 0.7321 | 1353 | 0.2256 |
| 0.7327 | 1354 | 0.175 |
| 0.7332 | 1355 | 0.1923 |
| 0.7338 | 1356 | 0.279 |
| 0.7343 | 1357 | 0.2356 |
| 0.7348 | 1358 | 0.223 |
| 0.7354 | 1359 | 0.172 |
| 0.7359 | 1360 | 0.2756 |
| 0.7365 | 1361 | 0.3298 |
| 0.7370 | 1362 | 0.1619 |
| 0.7376 | 1363 | 0.1524 |
| 0.7381 | 1364 | 0.2297 |
| 0.7386 | 1365 | 0.2593 |
| 0.7392 | 1366 | 0.183 |
| 0.7397 | 1367 | 0.2744 |
| 0.7403 | 1368 | 0.2599 |
| 0.7408 | 1369 | 0.1933 |
| 0.7413 | 1370 | 0.2624 |
| 0.7419 | 1371 | 0.2616 |
| 0.7424 | 1372 | 0.1848 |
| 0.7430 | 1373 | 0.2129 |
| 0.7435 | 1374 | 0.2072 |
| 0.7440 | 1375 | 0.2606 |
| 0.7446 | 1376 | 0.2894 |
| 0.7451 | 1377 | 0.1475 |
| 0.7457 | 1378 | 0.2554 |
| 0.7462 | 1379 | 0.247 |
| 0.7468 | 1380 | 0.272 |
| 0.7473 | 1381 | 0.169 |
| 0.7478 | 1382 | 0.2267 |
| 0.7484 | 1383 | 0.2598 |
| 0.7489 | 1384 | 0.2338 |
| 0.7495 | 1385 | 0.1499 |
| 0.75 | 1386 | 0.2574 |
| 0.7505 | 1387 | 0.2158 |
| 0.7511 | 1388 | 0.2934 |
| 0.7516 | 1389 | 0.2076 |
| 0.7522 | 1390 | 0.1381 |
| 0.7527 | 1391 | 0.2106 |
| 0.7532 | 1392 | 0.417 |
| 0.7538 | 1393 | 0.2188 |
| 0.7543 | 1394 | 0.1214 |
| 0.7549 | 1395 | 0.2531 |
| 0.7554 | 1396 | 0.3031 |
| 0.7560 | 1397 | 0.3141 |
| 0.7565 | 1398 | 0.1921 |
| 0.7570 | 1399 | 0.2428 |
| 0.7576 | 1400 | 0.2414 |
| 0.7581 | 1401 | 0.1741 |
| 0.7587 | 1402 | 0.1865 |
| 0.7592 | 1403 | 0.1522 |
| 0.7597 | 1404 | 0.2314 |
| 0.7603 | 1405 | 0.2073 |
| 0.7608 | 1406 | 0.2321 |
| 0.7614 | 1407 | 0.3196 |
| 0.7619 | 1408 | 0.2849 |
| 0.7624 | 1409 | 0.275 |
| 0.7630 | 1410 | 0.132 |
| 0.7635 | 1411 | 0.2076 |
| 0.7641 | 1412 | 0.1814 |
| 0.7646 | 1413 | 0.2447 |
| 0.7652 | 1414 | 0.1721 |
| 0.7657 | 1415 | 0.1308 |
| 0.7662 | 1416 | 0.2891 |
| 0.7668 | 1417 | 0.1335 |
| 0.7673 | 1418 | 0.2577 |
| 0.7679 | 1419 | 0.1922 |
| 0.7684 | 1420 | 0.2042 |
| 0.7689 | 1421 | 0.294 |
| 0.7695 | 1422 | 0.2194 |
| 0.7700 | 1423 | 0.3001 |
| 0.7706 | 1424 | 0.1939 |
| 0.7711 | 1425 | 0.2993 |
| 0.7716 | 1426 | 0.2047 |
| 0.7722 | 1427 | 0.2043 |
| 0.7727 | 1428 | 0.2916 |
| 0.7733 | 1429 | 0.1857 |
| 0.7738 | 1430 | 0.1699 |
| 0.7744 | 1431 | 0.1641 |
| 0.7749 | 1432 | 0.22 |
| 0.7754 | 1433 | 0.2524 |
| 0.7760 | 1434 | 0.1864 |
| 0.7765 | 1435 | 0.33 |
| 0.7771 | 1436 | 0.1361 |
| 0.7776 | 1437 | 0.2796 |
| 0.7781 | 1438 | 0.2411 |
| 0.7787 | 1439 | 0.2349 |
| 0.7792 | 1440 | 0.2283 |
| 0.7798 | 1441 | 0.2401 |
| 0.7803 | 1442 | 0.2155 |
| 0.7808 | 1443 | 0.2473 |
| 0.7814 | 1444 | 0.1364 |
| 0.7819 | 1445 | 0.1476 |
| 0.7825 | 1446 | 0.1989 |
| 0.7830 | 1447 | 0.2432 |
| 0.7835 | 1448 | 0.1938 |
| 0.7841 | 1449 | 0.1552 |
| 0.7846 | 1450 | 0.3058 |
| 0.7852 | 1451 | 0.1666 |
| 0.7857 | 1452 | 0.1908 |
| 0.7863 | 1453 | 0.2164 |
| 0.7868 | 1454 | 0.2417 |
| 0.7873 | 1455 | 0.1969 |
| 0.7879 | 1456 | 0.2211 |
| 0.7884 | 1457 | 0.2758 |
| 0.7890 | 1458 | 0.1517 |
| 0.7895 | 1459 | 0.1566 |
| 0.7900 | 1460 | 0.1986 |
| 0.7906 | 1461 | 0.2431 |
| 0.7911 | 1462 | 0.2759 |
| 0.7917 | 1463 | 0.1769 |
| 0.7922 | 1464 | 0.1896 |
| 0.7927 | 1465 | 0.2472 |
| 0.7933 | 1466 | 0.1607 |
| 0.7938 | 1467 | 0.1684 |
| 0.7944 | 1468 | 0.2032 |
| 0.7949 | 1469 | 0.2087 |
| 0.7955 | 1470 | 0.1662 |
| 0.7960 | 1471 | 0.156 |
| 0.7965 | 1472 | 0.179 |
| 0.7971 | 1473 | 0.2237 |
| 0.7976 | 1474 | 0.1914 |
| 0.7982 | 1475 | 0.1255 |
| 0.7987 | 1476 | 0.2033 |
| 0.7992 | 1477 | 0.2079 |
| 0.7998 | 1478 | 0.1628 |
| 0.8003 | 1479 | 0.2411 |
| 0.8009 | 1480 | 0.2189 |
| 0.8014 | 1481 | 0.2319 |
| 0.8019 | 1482 | 0.277 |
| 0.8025 | 1483 | 0.2753 |
| 0.8030 | 1484 | 0.2106 |
| 0.8036 | 1485 | 0.2039 |
| 0.8041 | 1486 | 0.256 |
| 0.8047 | 1487 | 0.1349 |
| 0.8052 | 1488 | 0.3065 |
| 0.8057 | 1489 | 0.2778 |
| 0.8063 | 1490 | 0.1481 |
| 0.8068 | 1491 | 0.2212 |
| 0.8074 | 1492 | 0.2786 |
| 0.8079 | 1493 | 0.3017 |
| 0.8084 | 1494 | 0.2164 |
| 0.8090 | 1495 | 0.1563 |
| 0.8095 | 1496 | 0.2038 |
| 0.8101 | 1497 | 0.1745 |
| 0.8106 | 1498 | 0.2678 |
| 0.8111 | 1499 | 0.1744 |
| 0.8117 | 1500 | 0.2337 |
| 0.8122 | 1501 | 0.1605 |
| 0.8128 | 1502 | 0.2224 |
| 0.8133 | 1503 | 0.1323 |
| 0.8139 | 1504 | 0.1817 |
| 0.8144 | 1505 | 0.1549 |
| 0.8149 | 1506 | 0.2425 |
| 0.8155 | 1507 | 0.3393 |
| 0.8160 | 1508 | 0.2422 |
| 0.8166 | 1509 | 0.1692 |
| 0.8171 | 1510 | 0.2741 |
| 0.8176 | 1511 | 0.2612 |
| 0.8182 | 1512 | 0.2111 |
| 0.8187 | 1513 | 0.2094 |
| 0.8193 | 1514 | 0.2591 |
| 0.8198 | 1515 | 0.1785 |
| 0.8203 | 1516 | 0.2598 |
| 0.8209 | 1517 | 0.2414 |
| 0.8214 | 1518 | 0.2205 |
| 0.8220 | 1519 | 0.2149 |
| 0.8225 | 1520 | 0.3023 |
| 0.8231 | 1521 | 0.1663 |
| 0.8236 | 1522 | 0.1992 |
| 0.8241 | 1523 | 0.2077 |
| 0.8247 | 1524 | 0.2264 |
| 0.8252 | 1525 | 0.2379 |
| 0.8258 | 1526 | 0.2704 |
| 0.8263 | 1527 | 0.2075 |
| 0.8268 | 1528 | 0.2057 |
| 0.8274 | 1529 | 0.1929 |
| 0.8279 | 1530 | 0.2734 |
| 0.8285 | 1531 | 0.231 |
| 0.8290 | 1532 | 0.1712 |
| 0.8295 | 1533 | 0.1946 |
| 0.8301 | 1534 | 0.2674 |
| 0.8306 | 1535 | 0.283 |
| 0.8312 | 1536 | 0.2043 |
| 0.8317 | 1537 | 0.324 |
| 0.8323 | 1538 | 0.1983 |
| 0.8328 | 1539 | 0.2451 |
| 0.8333 | 1540 | 0.1818 |
| 0.8339 | 1541 | 0.1934 |
| 0.8344 | 1542 | 0.187 |
| 0.8350 | 1543 | 0.1651 |
| 0.8355 | 1544 | 0.1877 |
| 0.8360 | 1545 | 0.2054 |
| 0.8366 | 1546 | 0.134 |
| 0.8371 | 1547 | 0.1894 |
| 0.8377 | 1548 | 0.1858 |
| 0.8382 | 1549 | 0.2407 |
| 0.8387 | 1550 | 0.1549 |
| 0.8393 | 1551 | 0.1154 |
| 0.8398 | 1552 | 0.2316 |
| 0.8404 | 1553 | 0.201 |
| 0.8409 | 1554 | 0.1 |
| 0.8415 | 1555 | 0.1839 |
| 0.8420 | 1556 | 0.1961 |
| 0.8425 | 1557 | 0.2293 |
| 0.8431 | 1558 | 0.3026 |
| 0.8436 | 1559 | 0.0939 |
| 0.8442 | 1560 | 0.1977 |
| 0.8447 | 1561 | 0.2081 |
| 0.8452 | 1562 | 0.3763 |
| 0.8458 | 1563 | 0.2351 |
| 0.8463 | 1564 | 0.1679 |
| 0.8469 | 1565 | 0.2331 |
| 0.8474 | 1566 | 0.1929 |
| 0.8479 | 1567 | 0.2727 |
| 0.8485 | 1568 | 0.2027 |
| 0.8490 | 1569 | 0.2342 |
| 0.8496 | 1570 | 0.1408 |
| 0.8501 | 1571 | 0.1504 |
| 0.8506 | 1572 | 0.2081 |
| 0.8512 | 1573 | 0.3252 |
| 0.8517 | 1574 | 0.1838 |
| 0.8523 | 1575 | 0.2124 |
| 0.8528 | 1576 | 0.2678 |
| 0.8534 | 1577 | 0.2415 |
| 0.8539 | 1578 | 0.1886 |
| 0.8544 | 1579 | 0.2611 |
| 0.8550 | 1580 | 0.1391 |
| 0.8555 | 1581 | 0.2831 |
| 0.8561 | 1582 | 0.2946 |
| 0.8566 | 1583 | 0.1974 |
| 0.8571 | 1584 | 0.2054 |
| 0.8577 | 1585 | 0.2207 |
| 0.8582 | 1586 | 0.2402 |
| 0.8588 | 1587 | 0.1869 |
| 0.8593 | 1588 | 0.1488 |
| 0.8598 | 1589 | 0.2525 |
| 0.8604 | 1590 | 0.2148 |
| 0.8609 | 1591 | 0.2051 |
| 0.8615 | 1592 | 0.1768 |
| 0.8620 | 1593 | 0.167 |
| 0.8626 | 1594 | 0.216 |
| 0.8631 | 1595 | 0.1512 |
| 0.8636 | 1596 | 0.1886 |
| 0.8642 | 1597 | 0.1712 |
| 0.8647 | 1598 | 0.2057 |
| 0.8653 | 1599 | 0.2316 |
| 0.8658 | 1600 | 0.2064 |
| 0.8663 | 1601 | 0.2309 |
| 0.8669 | 1602 | 0.2696 |
| 0.8674 | 1603 | 0.2742 |
| 0.8680 | 1604 | 0.2463 |
| 0.8685 | 1605 | 0.2402 |
| 0.8690 | 1606 | 0.1479 |
| 0.8696 | 1607 | 0.1613 |
| 0.8701 | 1608 | 0.2113 |
| 0.8707 | 1609 | 0.2015 |
| 0.8712 | 1610 | 0.1901 |
| 0.8718 | 1611 | 0.1712 |
| 0.8723 | 1612 | 0.2733 |
| 0.8728 | 1613 | 0.2314 |
| 0.8734 | 1614 | 0.1676 |
| 0.8739 | 1615 | 0.2809 |
| 0.8745 | 1616 | 0.1739 |
| 0.875 | 1617 | 0.2759 |
| 0.8755 | 1618 | 0.245 |
| 0.8761 | 1619 | 0.2195 |
| 0.8766 | 1620 | 0.1651 |
| 0.8772 | 1621 | 0.2656 |
| 0.8777 | 1622 | 0.2605 |
| 0.8782 | 1623 | 0.2615 |
| 0.8788 | 1624 | 0.2298 |
| 0.8793 | 1625 | 0.2668 |
| 0.8799 | 1626 | 0.1399 |
| 0.8804 | 1627 | 0.2897 |
| 0.8810 | 1628 | 0.1913 |
| 0.8815 | 1629 | 0.1974 |
| 0.8820 | 1630 | 0.2694 |
| 0.8826 | 1631 | 0.1627 |
| 0.8831 | 1632 | 0.2016 |
| 0.8837 | 1633 | 0.2794 |
| 0.8842 | 1634 | 0.1409 |
| 0.8847 | 1635 | 0.199 |
| 0.8853 | 1636 | 0.136 |
| 0.8858 | 1637 | 0.2331 |
| 0.8864 | 1638 | 0.2306 |
| 0.8869 | 1639 | 0.1683 |
| 0.8874 | 1640 | 0.2567 |
| 0.8880 | 1641 | 0.2492 |
| 0.8885 | 1642 | 0.1601 |
| 0.8891 | 1643 | 0.1808 |
| 0.8896 | 1644 | 0.1861 |
| 0.8902 | 1645 | 0.2098 |
| 0.8907 | 1646 | 0.1862 |
| 0.8912 | 1647 | 0.195 |
| 0.8918 | 1648 | 0.2567 |
| 0.8923 | 1649 | 0.2327 |
| 0.8929 | 1650 | 0.2139 |
| 0.8934 | 1651 | 0.1437 |
| 0.8939 | 1652 | 0.2254 |
| 0.8945 | 1653 | 0.2138 |
| 0.8950 | 1654 | 0.181 |
| 0.8956 | 1655 | 0.201 |
| 0.8961 | 1656 | 0.2941 |
| 0.8966 | 1657 | 0.2677 |
| 0.8972 | 1658 | 0.1932 |
| 0.8977 | 1659 | 0.1534 |
| 0.8983 | 1660 | 0.1721 |
| 0.8988 | 1661 | 0.2207 |
| 0.8994 | 1662 | 0.1859 |
| 0.8999 | 1663 | 0.1869 |
| 0.9004 | 1664 | 0.1898 |
| 0.9010 | 1665 | 0.29 |
| 0.9015 | 1666 | 0.19 |
| 0.9021 | 1667 | 0.2346 |
| 0.9026 | 1668 | 0.1738 |
| 0.9031 | 1669 | 0.1867 |
| 0.9037 | 1670 | 0.1852 |
| 0.9042 | 1671 | 0.1846 |
| 0.9048 | 1672 | 0.1879 |
| 0.9053 | 1673 | 0.1733 |
| 0.9058 | 1674 | 0.27 |
| 0.9064 | 1675 | 0.2005 |
| 0.9069 | 1676 | 0.1661 |
| 0.9075 | 1677 | 0.1841 |
| 0.9080 | 1678 | 0.1545 |
| 0.9085 | 1679 | 0.1856 |
| 0.9091 | 1680 | 0.1827 |
| 0.9096 | 1681 | 0.2856 |
| 0.9102 | 1682 | 0.3265 |
| 0.9107 | 1683 | 0.2755 |
| 0.9113 | 1684 | 0.1453 |
| 0.9118 | 1685 | 0.2113 |
| 0.9123 | 1686 | 0.3191 |
| 0.9129 | 1687 | 0.1785 |
| 0.9134 | 1688 | 0.1874 |
| 0.9140 | 1689 | 0.2242 |
| 0.9145 | 1690 | 0.1746 |
| 0.9150 | 1691 | 0.2292 |
| 0.9156 | 1692 | 0.1993 |
| 0.9161 | 1693 | 0.2461 |
| 0.9167 | 1694 | 0.258 |
| 0.9172 | 1695 | 0.1919 |
| 0.9177 | 1696 | 0.2632 |
| 0.9183 | 1697 | 0.2513 |
| 0.9188 | 1698 | 0.2052 |
| 0.9194 | 1699 | 0.2297 |
| 0.9199 | 1700 | 0.218 |
| 0.9205 | 1701 | 0.2182 |
| 0.9210 | 1702 | 0.2033 |
| 0.9215 | 1703 | 0.2081 |
| 0.9221 | 1704 | 0.1975 |
| 0.9226 | 1705 | 0.2011 |
| 0.9232 | 1706 | 0.1438 |
| 0.9237 | 1707 | 0.2041 |
| 0.9242 | 1708 | 0.1605 |
| 0.9248 | 1709 | 0.2261 |
| 0.9253 | 1710 | 0.1288 |
| 0.9259 | 1711 | 0.1856 |
| 0.9264 | 1712 | 0.1708 |
| 0.9269 | 1713 | 0.1735 |
| 0.9275 | 1714 | 0.2421 |
| 0.9280 | 1715 | 0.225 |
| 0.9286 | 1716 | 0.172 |
| 0.9291 | 1717 | 0.1651 |
| 0.9297 | 1718 | 0.2198 |
| 0.9302 | 1719 | 0.1956 |
| 0.9307 | 1720 | 0.2471 |
| 0.9313 | 1721 | 0.1807 |
| 0.9318 | 1722 | 0.0867 |
| 0.9324 | 1723 | 0.2634 |
| 0.9329 | 1724 | 0.2098 |
| 0.9334 | 1725 | 0.176 |
| 0.9340 | 1726 | 0.1865 |
| 0.9345 | 1727 | 0.1886 |
| 0.9351 | 1728 | 0.2151 |
| 0.9356 | 1729 | 0.1689 |
| 0.9361 | 1730 | 0.2577 |
| 0.9367 | 1731 | 0.2183 |
| 0.9372 | 1732 | 0.1771 |
| 0.9378 | 1733 | 0.1784 |
| 0.9383 | 1734 | 0.2513 |
| 0.9389 | 1735 | 0.2508 |
| 0.9394 | 1736 | 0.2319 |
| 0.9399 | 1737 | 0.1689 |
| 0.9405 | 1738 | 0.1158 |
| 0.9410 | 1739 | 0.245 |
| 0.9416 | 1740 | 0.2763 |
| 0.9421 | 1741 | 0.2143 |
| 0.9426 | 1742 | 0.1567 |
| 0.9432 | 1743 | 0.1456 |
| 0.9437 | 1744 | 0.1858 |
| 0.9443 | 1745 | 0.2407 |
| 0.9448 | 1746 | 0.1501 |
| 0.9453 | 1747 | 0.2261 |
| 0.9459 | 1748 | 0.201 |
| 0.9464 | 1749 | 0.1816 |
| 0.9470 | 1750 | 0.0865 |
| 0.9475 | 1751 | 0.296 |
| 0.9481 | 1752 | 0.1989 |
| 0.9486 | 1753 | 0.2225 |
| 0.9491 | 1754 | 0.1898 |
| 0.9497 | 1755 | 0.1394 |
| 0.9502 | 1756 | 0.2071 |
| 0.9508 | 1757 | 0.2008 |
| 0.9513 | 1758 | 0.1961 |
| 0.9518 | 1759 | 0.1865 |
| 0.9524 | 1760 | 0.2111 |
| 0.9529 | 1761 | 0.2847 |
| 0.9535 | 1762 | 0.2014 |
| 0.9540 | 1763 | 0.1629 |
| 0.9545 | 1764 | 0.1237 |
| 0.9551 | 1765 | 0.2331 |
| 0.9556 | 1766 | 0.2458 |
| 0.9562 | 1767 | 0.1752 |
| 0.9567 | 1768 | 0.2897 |
| 0.9573 | 1769 | 0.134 |
| 0.9578 | 1770 | 0.2517 |
| 0.9583 | 1771 | 0.1504 |
| 0.9589 | 1772 | 0.2288 |
| 0.9594 | 1773 | 0.2404 |
| 0.9600 | 1774 | 0.1955 |
| 0.9605 | 1775 | 0.1972 |
| 0.9610 | 1776 | 0.1971 |
| 0.9616 | 1777 | 0.1297 |
| 0.9621 | 1778 | 0.2005 |
| 0.9627 | 1779 | 0.1679 |
| 0.9632 | 1780 | 0.2738 |
| 0.9637 | 1781 | 0.2213 |
| 0.9643 | 1782 | 0.1673 |
| 0.9648 | 1783 | 0.1889 |
| 0.9654 | 1784 | 0.1642 |
| 0.9659 | 1785 | 0.2065 |
| 0.9665 | 1786 | 0.2161 |
| 0.9670 | 1787 | 0.2 |
| 0.9675 | 1788 | 0.1734 |
| 0.9681 | 1789 | 0.1763 |
| 0.9686 | 1790 | 0.1258 |
| 0.9692 | 1791 | 0.1605 |
| 0.9697 | 1792 | 0.1668 |
| 0.9702 | 1793 | 0.1435 |
| 0.9708 | 1794 | 0.2365 |
| 0.9713 | 1795 | 0.2994 |
| 0.9719 | 1796 | 0.1712 |
| 0.9724 | 1797 | 0.3761 |
| 0.9729 | 1798 | 0.1148 |
| 0.9735 | 1799 | 0.2062 |
| 0.9740 | 1800 | 0.1609 |
| 0.9746 | 1801 | 0.1481 |
| 0.9751 | 1802 | 0.1487 |
| 0.9756 | 1803 | 0.1926 |
| 0.9762 | 1804 | 0.1305 |
| 0.9767 | 1805 | 0.2725 |
| 0.9773 | 1806 | 0.2248 |
| 0.9778 | 1807 | 0.1632 |
| 0.9784 | 1808 | 0.2191 |
| 0.9789 | 1809 | 0.1921 |
| 0.9794 | 1810 | 0.3105 |
| 0.9800 | 1811 | 0.1878 |
| 0.9805 | 1812 | 0.1709 |
| 0.9811 | 1813 | 0.2087 |
| 0.9816 | 1814 | 0.1767 |
| 0.9821 | 1815 | 0.284 |
| 0.9827 | 1816 | 0.1389 |
| 0.9832 | 1817 | 0.094 |
| 0.9838 | 1818 | 0.2162 |
| 0.9843 | 1819 | 0.1478 |
| 0.9848 | 1820 | 0.1421 |
| 0.9854 | 1821 | 0.2339 |
| 0.9859 | 1822 | 0.2406 |
| 0.9865 | 1823 | 0.2425 |
| 0.9870 | 1824 | 0.2132 |
| 0.9876 | 1825 | 0.2314 |
| 0.9881 | 1826 | 0.1829 |
| 0.9886 | 1827 | 0.1538 |
| 0.9892 | 1828 | 0.1794 |
| 0.9897 | 1829 | 0.1931 |
| 0.9903 | 1830 | 0.2618 |
| 0.9908 | 1831 | 0.1501 |
| 0.9913 | 1832 | 0.2641 |
| 0.9919 | 1833 | 0.1706 |
| 0.9924 | 1834 | 0.1356 |
| 0.9930 | 1835 | 0.2405 |
| 0.9935 | 1836 | 0.1464 |
| 0.9940 | 1837 | 0.1476 |
| 0.9946 | 1838 | 0.1413 |
| 0.9951 | 1839 | 0.1958 |
| 0.9957 | 1840 | 0.0958 |
| 0.9962 | 1841 | 0.2388 |
| 0.9968 | 1842 | 0.1871 |
| 0.9973 | 1843 | 0.182 |
| 0.9978 | 1844 | 0.1875 |
| 0.9984 | 1845 | 0.1762 |
| 0.9989 | 1846 | 0.2412 |
| 0.9995 | 1847 | 0.233 |
| 1.0 | 1848 | 0.2743 |
| 1.0005 | 1849 | 0.2034 |
| 1.0011 | 1850 | 0.1463 |
| 1.0016 | 1851 | 0.1942 |
| 1.0022 | 1852 | 0.1194 |
| 1.0027 | 1853 | 0.1526 |
| 1.0032 | 1854 | 0.1372 |
| 1.0038 | 1855 | 0.1601 |
| 1.0043 | 1856 | 0.1654 |
| 1.0049 | 1857 | 0.1777 |
| 1.0054 | 1858 | 0.1739 |
| 1.0060 | 1859 | 0.1391 |
| 1.0065 | 1860 | 0.1468 |
| 1.0070 | 1861 | 0.1877 |
| 1.0076 | 1862 | 0.1617 |
| 1.0081 | 1863 | 0.1823 |
| 1.0087 | 1864 | 0.105 |
| 1.0092 | 1865 | 0.233 |
| 1.0097 | 1866 | 0.2381 |
| 1.0103 | 1867 | 0.2033 |
| 1.0108 | 1868 | 0.1243 |
| 1.0114 | 1869 | 0.1527 |
| 1.0119 | 1870 | 0.1327 |
| 1.0124 | 1871 | 0.1931 |
| 1.0130 | 1872 | 0.1131 |
| 1.0135 | 1873 | 0.2039 |
| 1.0141 | 1874 | 0.1232 |
| 1.0146 | 1875 | 0.2341 |
| 1.0152 | 1876 | 0.143 |
| 1.0157 | 1877 | 0.1361 |
| 1.0162 | 1878 | 0.217 |
| 1.0168 | 1879 | 0.1394 |
| 1.0173 | 1880 | 0.2238 |
| 1.0179 | 1881 | 0.1121 |
| 1.0184 | 1882 | 0.1581 |
| 1.0189 | 1883 | 0.1386 |
| 1.0195 | 1884 | 0.1528 |
| 1.0200 | 1885 | 0.1286 |
| 1.0206 | 1886 | 0.1667 |
| 1.0211 | 1887 | 0.115 |
| 1.0216 | 1888 | 0.1584 |
| 1.0222 | 1889 | 0.1274 |
| 1.0227 | 1890 | 0.1612 |
| 1.0233 | 1891 | 0.2046 |
| 1.0238 | 1892 | 0.206 |
| 1.0244 | 1893 | 0.2367 |
| 1.0249 | 1894 | 0.1461 |
| 1.0254 | 1895 | 0.2551 |
| 1.0260 | 1896 | 0.1411 |
| 1.0265 | 1897 | 0.1284 |
| 1.0271 | 1898 | 0.2468 |
| 1.0276 | 1899 | 0.1183 |
| 1.0281 | 1900 | 0.2364 |
| 1.0287 | 1901 | 0.1429 |
| 1.0292 | 1902 | 0.1653 |
| 1.0298 | 1903 | 0.1711 |
| 1.0303 | 1904 | 0.1445 |
| 1.0308 | 1905 | 0.2109 |
| 1.0314 | 1906 | 0.2101 |
| 1.0319 | 1907 | 0.1864 |
| 1.0325 | 1908 | 0.1393 |
| 1.0330 | 1909 | 0.1086 |
| 1.0335 | 1910 | 0.218 |
| 1.0341 | 1911 | 0.1591 |
| 1.0346 | 1912 | 0.1739 |
| 1.0352 | 1913 | 0.2054 |
| 1.0357 | 1914 | 0.1545 |
| 1.0363 | 1915 | 0.2139 |
| 1.0368 | 1916 | 0.1676 |
| 1.0373 | 1917 | 0.1097 |
| 1.0379 | 1918 | 0.2515 |
| 1.0384 | 1919 | 0.1582 |
| 1.0390 | 1920 | 0.1195 |
| 1.0395 | 1921 | 0.196 |
| 1.0400 | 1922 | 0.1554 |
| 1.0406 | 1923 | 0.1702 |
| 1.0411 | 1924 | 0.1224 |
| 1.0417 | 1925 | 0.1329 |
| 1.0422 | 1926 | 0.1243 |
| 1.0427 | 1927 | 0.1067 |
| 1.0433 | 1928 | 0.1712 |
| 1.0438 | 1929 | 0.202 |
| 1.0444 | 1930 | 0.2256 |
| 1.0449 | 1931 | 0.2148 |
| 1.0455 | 1932 | 0.234 |
| 1.0460 | 1933 | 0.1218 |
| 1.0465 | 1934 | 0.1635 |
| 1.0471 | 1935 | 0.1917 |
| 1.0476 | 1936 | 0.151 |
| 1.0482 | 1937 | 0.2025 |
| 1.0487 | 1938 | 0.141 |
| 1.0492 | 1939 | 0.1329 |
| 1.0498 | 1940 | 0.2148 |
| 1.0503 | 1941 | 0.1527 |
| 1.0509 | 1942 | 0.1631 |
| 1.0514 | 1943 | 0.1642 |
| 1.0519 | 1944 | 0.1426 |
| 1.0525 | 1945 | 0.116 |
| 1.0530 | 1946 | 0.1159 |
| 1.0536 | 1947 | 0.1373 |
| 1.0541 | 1948 | 0.2117 |
| 1.0547 | 1949 | 0.1005 |
| 1.0552 | 1950 | 0.149 |
| 1.0557 | 1951 | 0.1956 |
| 1.0563 | 1952 | 0.1471 |
| 1.0568 | 1953 | 0.1377 |
| 1.0574 | 1954 | 0.1306 |
| 1.0579 | 1955 | 0.1737 |
| 1.0584 | 1956 | 0.1636 |
| 1.0590 | 1957 | 0.1634 |
| 1.0595 | 1958 | 0.1845 |
| 1.0601 | 1959 | 0.2313 |
| 1.0606 | 1960 | 0.1796 |
| 1.0611 | 1961 | 0.1918 |
| 1.0617 | 1962 | 0.1715 |
| 1.0622 | 1963 | 0.162 |
| 1.0628 | 1964 | 0.1783 |
| 1.0633 | 1965 | 0.2089 |
| 1.0639 | 1966 | 0.1724 |
| 1.0644 | 1967 | 0.2609 |
| 1.0649 | 1968 | 0.2761 |
| 1.0655 | 1969 | 0.1313 |
| 1.0660 | 1970 | 0.1492 |
| 1.0666 | 1971 | 0.1864 |
| 1.0671 | 1972 | 0.1653 |
| 1.0676 | 1973 | 0.1798 |
| 1.0682 | 1974 | 0.171 |
| 1.0687 | 1975 | 0.1591 |
| 1.0693 | 1976 | 0.1092 |
| 1.0698 | 1977 | 0.1211 |
| 1.0703 | 1978 | 0.0976 |
| 1.0709 | 1979 | 0.205 |
| 1.0714 | 1980 | 0.1753 |
| 1.0720 | 1981 | 0.1722 |
| 1.0725 | 1982 | 0.1498 |
| 1.0731 | 1983 | 0.1297 |
| 1.0736 | 1984 | 0.1941 |
| 1.0741 | 1985 | 0.1709 |
| 1.0747 | 1986 | 0.1441 |
| 1.0752 | 1987 | 0.1204 |
| 1.0758 | 1988 | 0.1962 |
| 1.0763 | 1989 | 0.1346 |
| 1.0768 | 1990 | 0.2437 |
| 1.0774 | 1991 | 0.1461 |
| 1.0779 | 1992 | 0.1496 |
| 1.0785 | 1993 | 0.1689 |
| 1.0790 | 1994 | 0.1335 |
| 1.0795 | 1995 | 0.1109 |
| 1.0801 | 1996 | 0.1266 |
| 1.0806 | 1997 | 0.1488 |
| 1.0812 | 1998 | 0.167 |
| 1.0817 | 1999 | 0.188 |
| 1.0823 | 2000 | 0.2053 |
| 1.0828 | 2001 | 0.1242 |
| 1.0833 | 2002 | 0.1813 |
| 1.0839 | 2003 | 0.0957 |
| 1.0844 | 2004 | 0.1556 |
| 1.0850 | 2005 | 0.1699 |
| 1.0855 | 2006 | 0.1698 |
| 1.0860 | 2007 | 0.1316 |
| 1.0866 | 2008 | 0.1902 |
| 1.0871 | 2009 | 0.1008 |
| 1.0877 | 2010 | 0.2238 |
| 1.0882 | 2011 | 0.1906 |
| 1.0887 | 2012 | 0.2006 |
| 1.0893 | 2013 | 0.1401 |
| 1.0898 | 2014 | 0.1908 |
| 1.0904 | 2015 | 0.1655 |
| 1.0909 | 2016 | 0.1524 |
| 1.0915 | 2017 | 0.0981 |
| 1.0920 | 2018 | 0.1616 |
| 1.0925 | 2019 | 0.1413 |
| 1.0931 | 2020 | 0.1809 |
| 1.0936 | 2021 | 0.1699 |
| 1.0942 | 2022 | 0.1203 |
| 1.0947 | 2023 | 0.191 |
| 1.0952 | 2024 | 0.2139 |
| 1.0958 | 2025 | 0.1012 |
| 1.0963 | 2026 | 0.183 |
| 1.0969 | 2027 | 0.1957 |
| 1.0974 | 2028 | 0.118 |
| 1.0979 | 2029 | 0.1974 |
| 1.0985 | 2030 | 0.2015 |
| 1.0990 | 2031 | 0.1445 |
| 1.0996 | 2032 | 0.1731 |
| 1.1001 | 2033 | 0.1958 |
| 1.1006 | 2034 | 0.1431 |
| 1.1012 | 2035 | 0.1691 |
| 1.1017 | 2036 | 0.1705 |
| 1.1023 | 2037 | 0.1075 |
| 1.1028 | 2038 | 0.1893 |
| 1.1034 | 2039 | 0.1411 |
| 1.1039 | 2040 | 0.1824 |
| 1.1044 | 2041 | 0.1304 |
| 1.1050 | 2042 | 0.1285 |
| 1.1055 | 2043 | 0.1826 |
| 1.1061 | 2044 | 0.1263 |
| 1.1066 | 2045 | 0.1473 |
| 1.1071 | 2046 | 0.119 |
| 1.1077 | 2047 | 0.1101 |
| 1.1082 | 2048 | 0.1834 |
| 1.1088 | 2049 | 0.1625 |
| 1.1093 | 2050 | 0.1775 |
| 1.1098 | 2051 | 0.1728 |
| 1.1104 | 2052 | 0.1672 |
| 1.1109 | 2053 | 0.156 |
| 1.1115 | 2054 | 0.1605 |
| 1.1120 | 2055 | 0.1568 |
| 1.1126 | 2056 | 0.1133 |
| 1.1131 | 2057 | 0.1167 |
| 1.1136 | 2058 | 0.1604 |
| 1.1142 | 2059 | 0.1963 |
| 1.1147 | 2060 | 0.2073 |
| 1.1153 | 2061 | 0.1011 |
| 1.1158 | 2062 | 0.1714 |
| 1.1163 | 2063 | 0.174 |
| 1.1169 | 2064 | 0.2817 |
| 1.1174 | 2065 | 0.1342 |
| 1.1180 | 2066 | 0.2115 |
| 1.1185 | 2067 | 0.2947 |
| 1.1190 | 2068 | 0.166 |
| 1.1196 | 2069 | 0.2678 |
| 1.1201 | 2070 | 0.1649 |
| 1.1207 | 2071 | 0.1117 |
| 1.1212 | 2072 | 0.1927 |
| 1.1218 | 2073 | 0.1119 |
| 1.1223 | 2074 | 0.1288 |
| 1.1228 | 2075 | 0.1147 |
| 1.1234 | 2076 | 0.214 |
| 1.1239 | 2077 | 0.2483 |
| 1.1245 | 2078 | 0.1222 |
| 1.125 | 2079 | 0.1394 |
| 1.1255 | 2080 | 0.1832 |
| 1.1261 | 2081 | 0.1499 |
| 1.1266 | 2082 | 0.1717 |
| 1.1272 | 2083 | 0.181 |
| 1.1277 | 2084 | 0.2072 |
| 1.1282 | 2085 | 0.1094 |
| 1.1288 | 2086 | 0.1827 |
| 1.1293 | 2087 | 0.1416 |
| 1.1299 | 2088 | 0.1325 |
| 1.1304 | 2089 | 0.1111 |
| 1.1310 | 2090 | 0.1922 |
| 1.1315 | 2091 | 0.1604 |
| 1.1320 | 2092 | 0.1957 |
| 1.1326 | 2093 | 0.0706 |
| 1.1331 | 2094 | 0.1451 |
| 1.1337 | 2095 | 0.1604 |
| 1.1342 | 2096 | 0.1272 |
| 1.1347 | 2097 | 0.2064 |
| 1.1353 | 2098 | 0.1528 |
| 1.1358 | 2099 | 0.1618 |
| 1.1364 | 2100 | 0.2424 |
| 1.1369 | 2101 | 0.1323 |
| 1.1374 | 2102 | 0.1658 |
| 1.1380 | 2103 | 0.1931 |
| 1.1385 | 2104 | 0.1471 |
| 1.1391 | 2105 | 0.1847 |
| 1.1396 | 2106 | 0.1788 |
| 1.1402 | 2107 | 0.161 |
| 1.1407 | 2108 | 0.1455 |
| 1.1412 | 2109 | 0.1814 |
| 1.1418 | 2110 | 0.1753 |
| 1.1423 | 2111 | 0.1535 |
| 1.1429 | 2112 | 0.1411 |
| 1.1434 | 2113 | 0.1738 |
| 1.1439 | 2114 | 0.1174 |
| 1.1445 | 2115 | 0.1593 |
| 1.1450 | 2116 | 0.1888 |
| 1.1456 | 2117 | 0.1683 |
| 1.1461 | 2118 | 0.1687 |
| 1.1466 | 2119 | 0.1092 |
| 1.1472 | 2120 | 0.1563 |
| 1.1477 | 2121 | 0.1915 |
| 1.1483 | 2122 | 0.1882 |
| 1.1488 | 2123 | 0.1486 |
| 1.1494 | 2124 | 0.1101 |
| 1.1499 | 2125 | 0.101 |
| 1.1504 | 2126 | 0.1735 |
| 1.1510 | 2127 | 0.134 |
| 1.1515 | 2128 | 0.2412 |
| 1.1521 | 2129 | 0.1631 |
| 1.1526 | 2130 | 0.184 |
| 1.1531 | 2131 | 0.2054 |
| 1.1537 | 2132 | 0.1637 |
| 1.1542 | 2133 | 0.1782 |
| 1.1548 | 2134 | 0.2421 |
| 1.1553 | 2135 | 0.1616 |
| 1.1558 | 2136 | 0.2233 |
| 1.1564 | 2137 | 0.2235 |
| 1.1569 | 2138 | 0.1829 |
| 1.1575 | 2139 | 0.1902 |
| 1.1580 | 2140 | 0.0983 |
| 1.1585 | 2141 | 0.1359 |
| 1.1591 | 2142 | 0.1726 |
| 1.1596 | 2143 | 0.1889 |
| 1.1602 | 2144 | 0.234 |
| 1.1607 | 2145 | 0.1312 |
| 1.1613 | 2146 | 0.1824 |
| 1.1618 | 2147 | 0.1329 |
| 1.1623 | 2148 | 0.1895 |
| 1.1629 | 2149 | 0.1778 |
| 1.1634 | 2150 | 0.1244 |
| 1.1640 | 2151 | 0.1498 |
| 1.1645 | 2152 | 0.1797 |
| 1.1650 | 2153 | 0.1602 |
| 1.1656 | 2154 | 0.1562 |
| 1.1661 | 2155 | 0.1582 |
| 1.1667 | 2156 | 0.1164 |
| 1.1672 | 2157 | 0.1242 |
| 1.1677 | 2158 | 0.1829 |
| 1.1683 | 2159 | 0.1881 |
| 1.1688 | 2160 | 0.1889 |
| 1.1694 | 2161 | 0.1942 |
| 1.1699 | 2162 | 0.1931 |
| 1.1705 | 2163 | 0.126 |
| 1.1710 | 2164 | 0.0994 |
| 1.1715 | 2165 | 0.1428 |
| 1.1721 | 2166 | 0.1164 |
| 1.1726 | 2167 | 0.1632 |
| 1.1732 | 2168 | 0.1801 |
| 1.1737 | 2169 | 0.144 |
| 1.1742 | 2170 | 0.116 |
| 1.1748 | 2171 | 0.1019 |
| 1.1753 | 2172 | 0.1493 |
| 1.1759 | 2173 | 0.1446 |
| 1.1764 | 2174 | 0.1436 |
| 1.1769 | 2175 | 0.1424 |
| 1.1775 | 2176 | 0.2089 |
| 1.1780 | 2177 | 0.1138 |
| 1.1786 | 2178 | 0.0714 |
| 1.1791 | 2179 | 0.1793 |
| 1.1797 | 2180 | 0.1501 |
| 1.1802 | 2181 | 0.1725 |
| 1.1807 | 2182 | 0.1864 |
| 1.1813 | 2183 | 0.1533 |
| 1.1818 | 2184 | 0.1663 |
| 1.1824 | 2185 | 0.2463 |
| 1.1829 | 2186 | 0.2357 |
| 1.1834 | 2187 | 0.1838 |
| 1.1840 | 2188 | 0.1901 |
| 1.1845 | 2189 | 0.1689 |
| 1.1851 | 2190 | 0.1931 |
| 1.1856 | 2191 | 0.1183 |
| 1.1861 | 2192 | 0.1785 |
| 1.1867 | 2193 | 0.1615 |
| 1.1872 | 2194 | 0.1501 |
| 1.1878 | 2195 | 0.0892 |
| 1.1883 | 2196 | 0.1803 |
| 1.1889 | 2197 | 0.1245 |
| 1.1894 | 2198 | 0.0738 |
| 1.1899 | 2199 | 0.1709 |
| 1.1905 | 2200 | 0.1645 |
| 1.1910 | 2201 | 0.1593 |
| 1.1916 | 2202 | 0.1975 |
| 1.1921 | 2203 | 0.2131 |
| 1.1926 | 2204 | 0.1472 |
| 1.1932 | 2205 | 0.0742 |
| 1.1937 | 2206 | 0.1322 |
| 1.1943 | 2207 | 0.1259 |
| 1.1948 | 2208 | 0.1585 |
| 1.1953 | 2209 | 0.166 |
| 1.1959 | 2210 | 0.1678 |
| 1.1964 | 2211 | 0.128 |
| 1.1970 | 2212 | 0.1572 |
| 1.1975 | 2213 | 0.0965 |
| 1.1981 | 2214 | 0.2441 |
| 1.1986 | 2215 | 0.1489 |
| 1.1991 | 2216 | 0.1268 |
| 1.1997 | 2217 | 0.1961 |
| 1.2002 | 2218 | 0.138 |
| 1.2008 | 2219 | 0.1104 |
| 1.2013 | 2220 | 0.1459 |
| 1.2018 | 2221 | 0.135 |
| 1.2024 | 2222 | 0.1342 |
| 1.2029 | 2223 | 0.1693 |
| 1.2035 | 2224 | 0.1841 |
| 1.2040 | 2225 | 0.1487 |
| 1.2045 | 2226 | 0.1781 |
| 1.2051 | 2227 | 0.1734 |
| 1.2056 | 2228 | 0.1982 |
| 1.2062 | 2229 | 0.1352 |
| 1.2067 | 2230 | 0.1363 |
| 1.2073 | 2231 | 0.2269 |
| 1.2078 | 2232 | 0.1859 |
| 1.2083 | 2233 | 0.1542 |
| 1.2089 | 2234 | 0.1913 |
| 1.2094 | 2235 | 0.1076 |
| 1.2100 | 2236 | 0.1504 |
| 1.2105 | 2237 | 0.1712 |
| 1.2110 | 2238 | 0.129 |
| 1.2116 | 2239 | 0.2339 |
| 1.2121 | 2240 | 0.1976 |
| 1.2127 | 2241 | 0.1131 |
| 1.2132 | 2242 | 0.2167 |
| 1.2137 | 2243 | 0.1433 |
| 1.2143 | 2244 | 0.1365 |
| 1.2148 | 2245 | 0.0818 |
| 1.2154 | 2246 | 0.139 |
| 1.2159 | 2247 | 0.1316 |
| 1.2165 | 2248 | 0.1259 |
| 1.2170 | 2249 | 0.2142 |
| 1.2175 | 2250 | 0.1683 |
| 1.2181 | 2251 | 0.1861 |
| 1.2186 | 2252 | 0.1742 |
| 1.2192 | 2253 | 0.1828 |
| 1.2197 | 2254 | 0.1052 |
| 1.2202 | 2255 | 0.1411 |
| 1.2208 | 2256 | 0.1898 |
| 1.2213 | 2257 | 0.1668 |
| 1.2219 | 2258 | 0.1103 |
| 1.2224 | 2259 | 0.1355 |
| 1.2229 | 2260 | 0.1652 |
| 1.2235 | 2261 | 0.2448 |
| 1.2240 | 2262 | 0.1566 |
| 1.2246 | 2263 | 0.1658 |
| 1.2251 | 2264 | 0.1502 |
| 1.2256 | 2265 | 0.1073 |
| 1.2262 | 2266 | 0.1469 |
| 1.2267 | 2267 | 0.1264 |
| 1.2273 | 2268 | 0.2297 |
| 1.2278 | 2269 | 0.0971 |
| 1.2284 | 2270 | 0.1473 |
| 1.2289 | 2271 | 0.1029 |
| 1.2294 | 2272 | 0.2075 |
| 1.2300 | 2273 | 0.1333 |
| 1.2305 | 2274 | 0.1758 |
| 1.2311 | 2275 | 0.1407 |
| 1.2316 | 2276 | 0.1762 |
| 1.2321 | 2277 | 0.1632 |
| 1.2327 | 2278 | 0.1301 |
| 1.2332 | 2279 | 0.2175 |
| 1.2338 | 2280 | 0.1289 |
| 1.2343 | 2281 | 0.1981 |
| 1.2348 | 2282 | 0.1272 |
| 1.2354 | 2283 | 0.1438 |
| 1.2359 | 2284 | 0.2245 |
| 1.2365 | 2285 | 0.1525 |
| 1.2370 | 2286 | 0.0951 |
| 1.2376 | 2287 | 0.1818 |
| 1.2381 | 2288 | 0.1064 |
| 1.2386 | 2289 | 0.132 |
| 1.2392 | 2290 | 0.1825 |
| 1.2397 | 2291 | 0.1968 |
| 1.2403 | 2292 | 0.1237 |
| 1.2408 | 2293 | 0.1315 |
| 1.2413 | 2294 | 0.1654 |
| 1.2419 | 2295 | 0.1523 |
| 1.2424 | 2296 | 0.1485 |
| 1.2430 | 2297 | 0.1291 |
| 1.2435 | 2298 | 0.1961 |
| 1.2440 | 2299 | 0.2214 |
| 1.2446 | 2300 | 0.1815 |
| 1.2451 | 2301 | 0.1756 |
| 1.2457 | 2302 | 0.1108 |
| 1.2462 | 2303 | 0.1121 |
| 1.2468 | 2304 | 0.143 |
| 1.2473 | 2305 | 0.1653 |
| 1.2478 | 2306 | 0.123 |
| 1.2484 | 2307 | 0.1277 |
| 1.2489 | 2308 | 0.1297 |
| 1.2495 | 2309 | 0.1453 |
| 1.25 | 2310 | 0.1333 |
| 1.2505 | 2311 | 0.1833 |
| 1.2511 | 2312 | 0.1243 |
| 1.2516 | 2313 | 0.2028 |
| 1.2522 | 2314 | 0.1313 |
| 1.2527 | 2315 | 0.1412 |
| 1.2532 | 2316 | 0.3006 |
| 1.2538 | 2317 | 0.2261 |
| 1.2543 | 2318 | 0.1033 |
| 1.2549 | 2319 | 0.2435 |
| 1.2554 | 2320 | 0.1172 |
| 1.2560 | 2321 | 0.1952 |
| 1.2565 | 2322 | 0.1153 |
| 1.2570 | 2323 | 0.1229 |
| 1.2576 | 2324 | 0.1816 |
| 1.2581 | 2325 | 0.1654 |
| 1.2587 | 2326 | 0.1326 |
| 1.2592 | 2327 | 0.1637 |
| 1.2597 | 2328 | 0.1848 |
| 1.2603 | 2329 | 0.1432 |
| 1.2608 | 2330 | 0.1274 |
| 1.2614 | 2331 | 0.1793 |
| 1.2619 | 2332 | 0.1296 |
| 1.2624 | 2333 | 0.1342 |
| 1.2630 | 2334 | 0.163 |
| 1.2635 | 2335 | 0.183 |
| 1.2641 | 2336 | 0.1334 |
| 1.2646 | 2337 | 0.1922 |
| 1.2652 | 2338 | 0.0816 |
| 1.2657 | 2339 | 0.1201 |
| 1.2662 | 2340 | 0.1736 |
| 1.2668 | 2341 | 0.1229 |
| 1.2673 | 2342 | 0.154 |
| 1.2679 | 2343 | 0.1338 |
| 1.2684 | 2344 | 0.1851 |
| 1.2689 | 2345 | 0.1511 |
| 1.2695 | 2346 | 0.1176 |
| 1.2700 | 2347 | 0.1294 |
| 1.2706 | 2348 | 0.3477 |
| 1.2711 | 2349 | 0.2173 |
| 1.2716 | 2350 | 0.163 |
| 1.2722 | 2351 | 0.1413 |
| 1.2727 | 2352 | 0.1932 |
| 1.2733 | 2353 | 0.1359 |
| 1.2738 | 2354 | 0.1892 |
| 1.2744 | 2355 | 0.1468 |
| 1.2749 | 2356 | 0.1895 |
| 1.2754 | 2357 | 0.1587 |
| 1.2760 | 2358 | 0.2297 |
| 1.2765 | 2359 | 0.172 |
| 1.2771 | 2360 | 0.1886 |
| 1.2776 | 2361 | 0.0901 |
| 1.2781 | 2362 | 0.1537 |
| 1.2787 | 2363 | 0.1367 |
| 1.2792 | 2364 | 0.1551 |
| 1.2798 | 2365 | 0.1262 |
| 1.2803 | 2366 | 0.1679 |
| 1.2808 | 2367 | 0.1803 |
| 1.2814 | 2368 | 0.1392 |
| 1.2819 | 2369 | 0.1615 |
| 1.2825 | 2370 | 0.1452 |
| 1.2830 | 2371 | 0.2015 |
| 1.2835 | 2372 | 0.1464 |
| 1.2841 | 2373 | 0.2512 |
| 1.2846 | 2374 | 0.0882 |
| 1.2852 | 2375 | 0.1218 |
| 1.2857 | 2376 | 0.1998 |
| 1.2863 | 2377 | 0.101 |
| 1.2868 | 2378 | 0.1925 |
| 1.2873 | 2379 | 0.2355 |
| 1.2879 | 2380 | 0.1798 |
| 1.2884 | 2381 | 0.0998 |
| 1.2890 | 2382 | 0.0768 |
| 1.2895 | 2383 | 0.1737 |
| 1.2900 | 2384 | 0.1655 |
| 1.2906 | 2385 | 0.1315 |
| 1.2911 | 2386 | 0.2109 |
| 1.2917 | 2387 | 0.1166 |
| 1.2922 | 2388 | 0.1382 |
| 1.2927 | 2389 | 0.1373 |
| 1.2933 | 2390 | 0.1124 |
| 1.2938 | 2391 | 0.1168 |
| 1.2944 | 2392 | 0.1112 |
| 1.2949 | 2393 | 0.2328 |
| 1.2955 | 2394 | 0.174 |
| 1.2960 | 2395 | 0.1809 |
| 1.2965 | 2396 | 0.1731 |
| 1.2971 | 2397 | 0.1129 |
| 1.2976 | 2398 | 0.1251 |
| 1.2982 | 2399 | 0.1205 |
| 1.2987 | 2400 | 0.0934 |
| 1.2992 | 2401 | 0.1622 |
| 1.2998 | 2402 | 0.1858 |
| 1.3003 | 2403 | 0.1721 |
| 1.3009 | 2404 | 0.2051 |
| 1.3014 | 2405 | 0.2213 |
| 1.3019 | 2406 | 0.1108 |
| 1.3025 | 2407 | 0.1732 |
| 1.3030 | 2408 | 0.1884 |
| 1.3036 | 2409 | 0.1981 |
| 1.3041 | 2410 | 0.0638 |
| 1.3047 | 2411 | 0.1423 |
| 1.3052 | 2412 | 0.1627 |
| 1.3057 | 2413 | 0.2067 |
| 1.3063 | 2414 | 0.1867 |
| 1.3068 | 2415 | 0.1922 |
| 1.3074 | 2416 | 0.16 |
| 1.3079 | 2417 | 0.1534 |
| 1.3084 | 2418 | 0.2114 |
| 1.3090 | 2419 | 0.1709 |
| 1.3095 | 2420 | 0.1028 |
| 1.3101 | 2421 | 0.159 |
| 1.3106 | 2422 | 0.1139 |
| 1.3111 | 2423 | 0.1301 |
| 1.3117 | 2424 | 0.14 |
| 1.3122 | 2425 | 0.1532 |
| 1.3128 | 2426 | 0.1839 |
| 1.3133 | 2427 | 0.1341 |
| 1.3139 | 2428 | 0.1906 |
| 1.3144 | 2429 | 0.2064 |
| 1.3149 | 2430 | 0.1377 |
| 1.3155 | 2431 | 0.1129 |
| 1.3160 | 2432 | 0.1553 |
| 1.3166 | 2433 | 0.1107 |
| 1.3171 | 2434 | 0.2374 |
| 1.3176 | 2435 | 0.173 |
| 1.3182 | 2436 | 0.1393 |
| 1.3187 | 2437 | 0.1996 |
| 1.3193 | 2438 | 0.2028 |
| 1.3198 | 2439 | 0.1058 |
| 1.3203 | 2440 | 0.118 |
| 1.3209 | 2441 | 0.1954 |
| 1.3214 | 2442 | 0.1117 |
| 1.3220 | 2443 | 0.1739 |
| 1.3225 | 2444 | 0.1886 |
| 1.3231 | 2445 | 0.1918 |
| 1.3236 | 2446 | 0.1234 |
| 1.3241 | 2447 | 0.2296 |
| 1.3247 | 2448 | 0.21 |
| 1.3252 | 2449 | 0.1268 |
| 1.3258 | 2450 | 0.1602 |
| 1.3263 | 2451 | 0.1095 |
| 1.3268 | 2452 | 0.1128 |
| 1.3274 | 2453 | 0.2092 |
| 1.3279 | 2454 | 0.122 |
| 1.3285 | 2455 | 0.1113 |
| 1.3290 | 2456 | 0.1518 |
| 1.3295 | 2457 | 0.1603 |
| 1.3301 | 2458 | 0.128 |
| 1.3306 | 2459 | 0.1966 |
| 1.3312 | 2460 | 0.0971 |
| 1.3317 | 2461 | 0.0837 |
| 1.3323 | 2462 | 0.1994 |
| 1.3328 | 2463 | 0.2612 |
| 1.3333 | 2464 | 0.2003 |
| 1.3339 | 2465 | 0.1551 |
| 1.3344 | 2466 | 0.1201 |
| 1.3350 | 2467 | 0.1919 |
| 1.3355 | 2468 | 0.1063 |
| 1.3360 | 2469 | 0.178 |
| 1.3366 | 2470 | 0.1441 |
| 1.3371 | 2471 | 0.1678 |
| 1.3377 | 2472 | 0.1992 |
| 1.3382 | 2473 | 0.1306 |
| 1.3387 | 2474 | 0.1432 |
| 1.3393 | 2475 | 0.117 |
| 1.3398 | 2476 | 0.1718 |
| 1.3404 | 2477 | 0.1765 |
| 1.3409 | 2478 | 0.1689 |
| 1.3415 | 2479 | 0.1251 |
| 1.3420 | 2480 | 0.1965 |
| 1.3425 | 2481 | 0.1371 |
| 1.3431 | 2482 | 0.1109 |
| 1.3436 | 2483 | 0.1393 |
| 1.3442 | 2484 | 0.1509 |
| 1.3447 | 2485 | 0.2105 |
| 1.3452 | 2486 | 0.1 |
| 1.3458 | 2487 | 0.1758 |
| 1.3463 | 2488 | 0.0925 |
| 1.3469 | 2489 | 0.1438 |
| 1.3474 | 2490 | 0.0885 |
| 1.3479 | 2491 | 0.149 |
| 1.3485 | 2492 | 0.1008 |
| 1.3490 | 2493 | 0.1565 |
| 1.3496 | 2494 | 0.092 |
| 1.3501 | 2495 | 0.1398 |
| 1.3506 | 2496 | 0.1215 |
| 1.3512 | 2497 | 0.1339 |
| 1.3517 | 2498 | 0.1932 |
| 1.3523 | 2499 | 0.2007 |
| 1.3528 | 2500 | 0.1387 |
| 1.3534 | 2501 | 0.139 |
| 1.3539 | 2502 | 0.1869 |
| 1.3544 | 2503 | 0.1169 |
| 1.3550 | 2504 | 0.1666 |
| 1.3555 | 2505 | 0.11 |
| 1.3561 | 2506 | 0.1603 |
| 1.3566 | 2507 | 0.1224 |
| 1.3571 | 2508 | 0.15 |
| 1.3577 | 2509 | 0.1826 |
| 1.3582 | 2510 | 0.1905 |
| 1.3588 | 2511 | 0.2343 |
| 1.3593 | 2512 | 0.1025 |
| 1.3598 | 2513 | 0.1937 |
| 1.3604 | 2514 | 0.1251 |
| 1.3609 | 2515 | 0.1046 |
| 1.3615 | 2516 | 0.1991 |
| 1.3620 | 2517 | 0.1331 |
| 1.3626 | 2518 | 0.1246 |
| 1.3631 | 2519 | 0.1467 |
| 1.3636 | 2520 | 0.1125 |
| 1.3642 | 2521 | 0.124 |
| 1.3647 | 2522 | 0.1453 |
| 1.3653 | 2523 | 0.1513 |
| 1.3658 | 2524 | 0.1077 |
| 1.3663 | 2525 | 0.2036 |
| 1.3669 | 2526 | 0.1631 |
| 1.3674 | 2527 | 0.1868 |
| 1.3680 | 2528 | 0.1858 |
| 1.3685 | 2529 | 0.1464 |
| 1.3690 | 2530 | 0.2517 |
| 1.3696 | 2531 | 0.147 |
| 1.3701 | 2532 | 0.1402 |
| 1.3707 | 2533 | 0.1397 |
| 1.3712 | 2534 | 0.1579 |
| 1.3718 | 2535 | 0.1185 |
| 1.3723 | 2536 | 0.1605 |
| 1.3728 | 2537 | 0.1716 |
| 1.3734 | 2538 | 0.1915 |
| 1.3739 | 2539 | 0.1535 |
| 1.3745 | 2540 | 0.1473 |
| 1.375 | 2541 | 0.218 |
| 1.3755 | 2542 | 0.2183 |
| 1.3761 | 2543 | 0.1595 |
| 1.3766 | 2544 | 0.1402 |
| 1.3772 | 2545 | 0.1311 |
| 1.3777 | 2546 | 0.1801 |
| 1.3782 | 2547 | 0.1141 |
| 1.3788 | 2548 | 0.1241 |
| 1.3793 | 2549 | 0.1097 |
| 1.3799 | 2550 | 0.1691 |
| 1.3804 | 2551 | 0.178 |
| 1.3810 | 2552 | 0.198 |
| 1.3815 | 2553 | 0.1357 |
| 1.3820 | 2554 | 0.1069 |
| 1.3826 | 2555 | 0.1486 |
| 1.3831 | 2556 | 0.0818 |
| 1.3837 | 2557 | 0.154 |
| 1.3842 | 2558 | 0.0773 |
| 1.3847 | 2559 | 0.0949 |
| 1.3853 | 2560 | 0.1535 |
| 1.3858 | 2561 | 0.2195 |
| 1.3864 | 2562 | 0.1177 |
| 1.3869 | 2563 | 0.156 |
| 1.3874 | 2564 | 0.1452 |
| 1.3880 | 2565 | 0.1732 |
| 1.3885 | 2566 | 0.1268 |
| 1.3891 | 2567 | 0.1624 |
| 1.3896 | 2568 | 0.1591 |
| 1.3902 | 2569 | 0.1043 |
| 1.3907 | 2570 | 0.134 |
| 1.3912 | 2571 | 0.0903 |
| 1.3918 | 2572 | 0.1704 |
| 1.3923 | 2573 | 0.1621 |
| 1.3929 | 2574 | 0.1576 |
| 1.3934 | 2575 | 0.1911 |
| 1.3939 | 2576 | 0.1693 |
| 1.3945 | 2577 | 0.2511 |
| 1.3950 | 2578 | 0.1229 |
| 1.3956 | 2579 | 0.1656 |
| 1.3961 | 2580 | 0.1982 |
| 1.3966 | 2581 | 0.1768 |
| 1.3972 | 2582 | 0.1636 |
| 1.3977 | 2583 | 0.1113 |
| 1.3983 | 2584 | 0.2305 |
| 1.3988 | 2585 | 0.188 |
| 1.3994 | 2586 | 0.1406 |
| 1.3999 | 2587 | 0.1359 |
| 1.4004 | 2588 | 0.1229 |
| 1.4010 | 2589 | 0.1566 |
| 1.4015 | 2590 | 0.1895 |
| 1.4021 | 2591 | 0.1502 |
| 1.4026 | 2592 | 0.0882 |
| 1.4031 | 2593 | 0.1216 |
| 1.4037 | 2594 | 0.1069 |
| 1.4042 | 2595 | 0.1295 |
| 1.4048 | 2596 | 0.1498 |
| 1.4053 | 2597 | 0.1438 |
| 1.4058 | 2598 | 0.126 |
| 1.4064 | 2599 | 0.1066 |
| 1.4069 | 2600 | 0.1444 |
| 1.4075 | 2601 | 0.145 |
| 1.4080 | 2602 | 0.2424 |
| 1.4085 | 2603 | 0.195 |
| 1.4091 | 2604 | 0.1649 |
| 1.4096 | 2605 | 0.1384 |
| 1.4102 | 2606 | 0.1266 |
| 1.4107 | 2607 | 0.08 |
| 1.4113 | 2608 | 0.1649 |
| 1.4118 | 2609 | 0.1079 |
| 1.4123 | 2610 | 0.1614 |
| 1.4129 | 2611 | 0.1425 |
| 1.4134 | 2612 | 0.1319 |
| 1.4140 | 2613 | 0.1249 |
| 1.4145 | 2614 | 0.1189 |
| 1.4150 | 2615 | 0.1661 |
| 1.4156 | 2616 | 0.0971 |
| 1.4161 | 2617 | 0.1107 |
| 1.4167 | 2618 | 0.1393 |
| 1.4172 | 2619 | 0.1279 |
| 1.4177 | 2620 | 0.1569 |
| 1.4183 | 2621 | 0.1605 |
| 1.4188 | 2622 | 0.1487 |
| 1.4194 | 2623 | 0.1173 |
| 1.4199 | 2624 | 0.2162 |
| 1.4205 | 2625 | 0.1033 |
| 1.4210 | 2626 | 0.2085 |
| 1.4215 | 2627 | 0.0873 |
| 1.4221 | 2628 | 0.2022 |
| 1.4226 | 2629 | 0.1676 |
| 1.4232 | 2630 | 0.1519 |
| 1.4237 | 2631 | 0.0961 |
| 1.4242 | 2632 | 0.0628 |
| 1.4248 | 2633 | 0.1306 |
| 1.4253 | 2634 | 0.1886 |
| 1.4259 | 2635 | 0.1843 |
| 1.4264 | 2636 | 0.077 |
| 1.4269 | 2637 | 0.175 |
| 1.4275 | 2638 | 0.1868 |
| 1.4280 | 2639 | 0.1218 |
| 1.4286 | 2640 | 0.2478 |
| 1.4291 | 2641 | 0.1802 |
| 1.4297 | 2642 | 0.2111 |
| 1.4302 | 2643 | 0.1129 |
| 1.4307 | 2644 | 0.1931 |
| 1.4313 | 2645 | 0.0944 |
| 1.4318 | 2646 | 0.1672 |
| 1.4324 | 2647 | 0.2231 |
| 1.4329 | 2648 | 0.1799 |
| 1.4334 | 2649 | 0.1549 |
| 1.4340 | 2650 | 0.171 |
| 1.4345 | 2651 | 0.1677 |
| 1.4351 | 2652 | 0.071 |
| 1.4356 | 2653 | 0.1344 |
| 1.4361 | 2654 | 0.2055 |
| 1.4367 | 2655 | 0.1168 |
| 1.4372 | 2656 | 0.1869 |
| 1.4378 | 2657 | 0.1315 |
| 1.4383 | 2658 | 0.14 |
| 1.4389 | 2659 | 0.1324 |
| 1.4394 | 2660 | 0.1585 |
| 1.4399 | 2661 | 0.1021 |
| 1.4405 | 2662 | 0.1565 |
| 1.4410 | 2663 | 0.2229 |
| 1.4416 | 2664 | 0.1702 |
| 1.4421 | 2665 | 0.192 |
| 1.4426 | 2666 | 0.1337 |
| 1.4432 | 2667 | 0.1344 |
| 1.4437 | 2668 | 0.11 |
| 1.4443 | 2669 | 0.2235 |
| 1.4448 | 2670 | 0.1436 |
| 1.4453 | 2671 | 0.0971 |
| 1.4459 | 2672 | 0.114 |
| 1.4464 | 2673 | 0.1235 |
| 1.4470 | 2674 | 0.2 |
| 1.4475 | 2675 | 0.157 |
| 1.4481 | 2676 | 0.124 |
| 1.4486 | 2677 | 0.155 |
| 1.4491 | 2678 | 0.1662 |
| 1.4497 | 2679 | 0.1604 |
| 1.4502 | 2680 | 0.2262 |
| 1.4508 | 2681 | 0.2025 |
| 1.4513 | 2682 | 0.1411 |
| 1.4518 | 2683 | 0.1726 |
| 1.4524 | 2684 | 0.1342 |
| 1.4529 | 2685 | 0.2061 |
| 1.4535 | 2686 | 0.1509 |
| 1.4540 | 2687 | 0.2183 |
| 1.4545 | 2688 | 0.2041 |
| 1.4551 | 2689 | 0.0994 |
| 1.4556 | 2690 | 0.148 |
| 1.4562 | 2691 | 0.1623 |
| 1.4567 | 2692 | 0.1274 |
| 1.4573 | 2693 | 0.2095 |
| 1.4578 | 2694 | 0.1514 |
| 1.4583 | 2695 | 0.1145 |
| 1.4589 | 2696 | 0.0957 |
| 1.4594 | 2697 | 0.118 |
| 1.4600 | 2698 | 0.1297 |
| 1.4605 | 2699 | 0.1252 |
| 1.4610 | 2700 | 0.1048 |
| 1.4616 | 2701 | 0.1426 |
| 1.4621 | 2702 | 0.1744 |
| 1.4627 | 2703 | 0.1726 |
| 1.4632 | 2704 | 0.1098 |
| 1.4637 | 2705 | 0.1609 |
| 1.4643 | 2706 | 0.1613 |
| 1.4648 | 2707 | 0.1378 |
| 1.4654 | 2708 | 0.104 |
| 1.4659 | 2709 | 0.0919 |
| 1.4665 | 2710 | 0.0982 |
| 1.4670 | 2711 | 0.0757 |
| 1.4675 | 2712 | 0.194 |
| 1.4681 | 2713 | 0.1172 |
| 1.4686 | 2714 | 0.0796 |
| 1.4692 | 2715 | 0.1449 |
| 1.4697 | 2716 | 0.1091 |
| 1.4702 | 2717 | 0.1279 |
| 1.4708 | 2718 | 0.1883 |
| 1.4713 | 2719 | 0.1467 |
| 1.4719 | 2720 | 0.1419 |
| 1.4724 | 2721 | 0.1615 |
| 1.4729 | 2722 | 0.162 |
| 1.4735 | 2723 | 0.1166 |
| 1.4740 | 2724 | 0.0876 |
| 1.4746 | 2725 | 0.2048 |
| 1.4751 | 2726 | 0.1356 |
| 1.4756 | 2727 | 0.1148 |
| 1.4762 | 2728 | 0.0901 |
| 1.4767 | 2729 | 0.1307 |
| 1.4773 | 2730 | 0.1621 |
| 1.4778 | 2731 | 0.1018 |
| 1.4784 | 2732 | 0.1039 |
| 1.4789 | 2733 | 0.1947 |
| 1.4794 | 2734 | 0.1781 |
| 1.4800 | 2735 | 0.1412 |
| 1.4805 | 2736 | 0.1924 |
| 1.4811 | 2737 | 0.1357 |
| 1.4816 | 2738 | 0.1839 |
| 1.4821 | 2739 | 0.097 |
| 1.4827 | 2740 | 0.149 |
| 1.4832 | 2741 | 0.1708 |
| 1.4838 | 2742 | 0.1674 |
| 1.4843 | 2743 | 0.1969 |
| 1.4848 | 2744 | 0.1105 |
| 1.4854 | 2745 | 0.179 |
| 1.4859 | 2746 | 0.1822 |
| 1.4865 | 2747 | 0.2475 |
| 1.4870 | 2748 | 0.134 |
| 1.4876 | 2749 | 0.145 |
| 1.4881 | 2750 | 0.1061 |
| 1.4886 | 2751 | 0.1669 |
| 1.4892 | 2752 | 0.1459 |
| 1.4897 | 2753 | 0.19 |
| 1.4903 | 2754 | 0.132 |
| 1.4908 | 2755 | 0.133 |
| 1.4913 | 2756 | 0.1593 |
| 1.4919 | 2757 | 0.1909 |
| 1.4924 | 2758 | 0.1954 |
| 1.4930 | 2759 | 0.1048 |
| 1.4935 | 2760 | 0.1093 |
| 1.4940 | 2761 | 0.1388 |
| 1.4946 | 2762 | 0.108 |
| 1.4951 | 2763 | 0.1291 |
| 1.4957 | 2764 | 0.1642 |
| 1.4962 | 2765 | 0.1346 |
| 1.4968 | 2766 | 0.1042 |
| 1.4973 | 2767 | 0.1155 |
| 1.4978 | 2768 | 0.1995 |
| 1.4984 | 2769 | 0.113 |
| 1.4989 | 2770 | 0.0875 |
| 1.4995 | 2771 | 0.1985 |
| 1.5 | 2772 | 0.1174 |
| 1.5005 | 2773 | 0.1252 |
| 1.5011 | 2774 | 0.1098 |
| 1.5016 | 2775 | 0.1313 |
| 1.5022 | 2776 | 0.1377 |
| 1.5027 | 2777 | 0.0997 |
| 1.5032 | 2778 | 0.2054 |
| 1.5038 | 2779 | 0.1398 |
| 1.5043 | 2780 | 0.1485 |
| 1.5049 | 2781 | 0.1765 |
| 1.5054 | 2782 | 0.1246 |
| 1.5060 | 2783 | 0.1427 |
| 1.5065 | 2784 | 0.1339 |
| 1.5070 | 2785 | 0.0913 |
| 1.5076 | 2786 | 0.1398 |
| 1.5081 | 2787 | 0.1649 |
| 1.5087 | 2788 | 0.1541 |
| 1.5092 | 2789 | 0.1101 |
| 1.5097 | 2790 | 0.1547 |
| 1.5103 | 2791 | 0.1665 |
| 1.5108 | 2792 | 0.1867 |
| 1.5114 | 2793 | 0.1584 |
| 1.5119 | 2794 | 0.1263 |
| 1.5124 | 2795 | 0.2127 |
| 1.5130 | 2796 | 0.178 |
| 1.5135 | 2797 | 0.1675 |
| 1.5141 | 2798 | 0.1076 |
| 1.5146 | 2799 | 0.1465 |
| 1.5152 | 2800 | 0.1361 |
| 1.5157 | 2801 | 0.165 |
| 1.5162 | 2802 | 0.1563 |
| 1.5168 | 2803 | 0.244 |
| 1.5173 | 2804 | 0.0679 |
| 1.5179 | 2805 | 0.1022 |
| 1.5184 | 2806 | 0.1718 |
| 1.5189 | 2807 | 0.1726 |
| 1.5195 | 2808 | 0.0862 |
| 1.5200 | 2809 | 0.1164 |
| 1.5206 | 2810 | 0.1277 |
| 1.5211 | 2811 | 0.1606 |
| 1.5216 | 2812 | 0.1198 |
| 1.5222 | 2813 | 0.2201 |
| 1.5227 | 2814 | 0.1492 |
| 1.5233 | 2815 | 0.1403 |
| 1.5238 | 2816 | 0.1502 |
| 1.5244 | 2817 | 0.1191 |
| 1.5249 | 2818 | 0.1601 |
| 1.5254 | 2819 | 0.1758 |
| 1.5260 | 2820 | 0.1203 |
| 1.5265 | 2821 | 0.1514 |
| 1.5271 | 2822 | 0.0597 |
| 1.5276 | 2823 | 0.1668 |
| 1.5281 | 2824 | 0.1662 |
| 1.5287 | 2825 | 0.1558 |
| 1.5292 | 2826 | 0.1427 |
| 1.5298 | 2827 | 0.0877 |
| 1.5303 | 2828 | 0.1245 |
| 1.5308 | 2829 | 0.1079 |
| 1.5314 | 2830 | 0.126 |
| 1.5319 | 2831 | 0.1625 |
| 1.5325 | 2832 | 0.129 |
| 1.5330 | 2833 | 0.1623 |
| 1.5335 | 2834 | 0.1375 |
| 1.5341 | 2835 | 0.1273 |
| 1.5346 | 2836 | 0.2176 |
| 1.5352 | 2837 | 0.1161 |
| 1.5357 | 2838 | 0.0927 |
| 1.5363 | 2839 | 0.1432 |
| 1.5368 | 2840 | 0.1039 |
| 1.5373 | 2841 | 0.2348 |
| 1.5379 | 2842 | 0.1927 |
| 1.5384 | 2843 | 0.1702 |
| 1.5390 | 2844 | 0.1271 |
| 1.5395 | 2845 | 0.1283 |
| 1.5400 | 2846 | 0.1174 |
| 1.5406 | 2847 | 0.1601 |
| 1.5411 | 2848 | 0.1597 |
| 1.5417 | 2849 | 0.1748 |
| 1.5422 | 2850 | 0.1178 |
| 1.5427 | 2851 | 0.183 |
| 1.5433 | 2852 | 0.1614 |
| 1.5438 | 2853 | 0.112 |
| 1.5444 | 2854 | 0.0992 |
| 1.5449 | 2855 | 0.1018 |
| 1.5455 | 2856 | 0.1623 |
| 1.5460 | 2857 | 0.1498 |
| 1.5465 | 2858 | 0.1584 |
| 1.5471 | 2859 | 0.1663 |
| 1.5476 | 2860 | 0.1347 |
| 1.5482 | 2861 | 0.1226 |
| 1.5487 | 2862 | 0.1089 |
| 1.5492 | 2863 | 0.1369 |
| 1.5498 | 2864 | 0.1582 |
| 1.5503 | 2865 | 0.0919 |
| 1.5509 | 2866 | 0.1124 |
| 1.5514 | 2867 | 0.1275 |
| 1.5519 | 2868 | 0.1782 |
| 1.5525 | 2869 | 0.1158 |
| 1.5530 | 2870 | 0.123 |
| 1.5536 | 2871 | 0.1179 |
| 1.5541 | 2872 | 0.1201 |
| 1.5547 | 2873 | 0.0779 |
| 1.5552 | 2874 | 0.1242 |
| 1.5557 | 2875 | 0.1437 |
| 1.5563 | 2876 | 0.1064 |
| 1.5568 | 2877 | 0.1473 |
| 1.5574 | 2878 | 0.1177 |
| 1.5579 | 2879 | 0.1504 |
| 1.5584 | 2880 | 0.1215 |
| 1.5590 | 2881 | 0.201 |
| 1.5595 | 2882 | 0.1298 |
| 1.5601 | 2883 | 0.279 |
| 1.5606 | 2884 | 0.1175 |
| 1.5611 | 2885 | 0.1104 |
| 1.5617 | 2886 | 0.1613 |
| 1.5622 | 2887 | 0.2022 |
| 1.5628 | 2888 | 0.1348 |
| 1.5633 | 2889 | 0.1828 |
| 1.5639 | 2890 | 0.2425 |
| 1.5644 | 2891 | 0.0972 |
| 1.5649 | 2892 | 0.1734 |
| 1.5655 | 2893 | 0.1251 |
| 1.5660 | 2894 | 0.1192 |
| 1.5666 | 2895 | 0.193 |
| 1.5671 | 2896 | 0.1686 |
| 1.5676 | 2897 | 0.1816 |
| 1.5682 | 2898 | 0.1086 |
| 1.5687 | 2899 | 0.1395 |
| 1.5693 | 2900 | 0.0545 |
| 1.5698 | 2901 | 0.1862 |
| 1.5703 | 2902 | 0.119 |
| 1.5709 | 2903 | 0.0937 |
| 1.5714 | 2904 | 0.1355 |
| 1.5720 | 2905 | 0.1425 |
| 1.5725 | 2906 | 0.1946 |
| 1.5731 | 2907 | 0.2243 |
| 1.5736 | 2908 | 0.1655 |
| 1.5741 | 2909 | 0.1414 |
| 1.5747 | 2910 | 0.1791 |
| 1.5752 | 2911 | 0.1408 |
| 1.5758 | 2912 | 0.1142 |
| 1.5763 | 2913 | 0.1321 |
| 1.5768 | 2914 | 0.1374 |
| 1.5774 | 2915 | 0.1151 |
| 1.5779 | 2916 | 0.1474 |
| 1.5785 | 2917 | 0.1843 |
| 1.5790 | 2918 | 0.0971 |
| 1.5795 | 2919 | 0.109 |
| 1.5801 | 2920 | 0.0696 |
| 1.5806 | 2921 | 0.0988 |
| 1.5812 | 2922 | 0.1479 |
| 1.5817 | 2923 | 0.1 |
| 1.5823 | 2924 | 0.1616 |
| 1.5828 | 2925 | 0.1384 |
| 1.5833 | 2926 | 0.1229 |
| 1.5839 | 2927 | 0.1367 |
| 1.5844 | 2928 | 0.1779 |
| 1.5850 | 2929 | 0.2113 |
| 1.5855 | 2930 | 0.151 |
| 1.5860 | 2931 | 0.146 |
| 1.5866 | 2932 | 0.116 |
| 1.5871 | 2933 | 0.074 |
| 1.5877 | 2934 | 0.1773 |
| 1.5882 | 2935 | 0.1139 |
| 1.5887 | 2936 | 0.131 |
| 1.5893 | 2937 | 0.1166 |
| 1.5898 | 2938 | 0.1399 |
| 1.5904 | 2939 | 0.1721 |
| 1.5909 | 2940 | 0.0795 |
| 1.5915 | 2941 | 0.1845 |
| 1.5920 | 2942 | 0.1869 |
| 1.5925 | 2943 | 0.1412 |
| 1.5931 | 2944 | 0.1531 |
| 1.5936 | 2945 | 0.1797 |
| 1.5942 | 2946 | 0.1439 |
| 1.5947 | 2947 | 0.0901 |
| 1.5952 | 2948 | 0.1432 |
| 1.5958 | 2949 | 0.0736 |
| 1.5963 | 2950 | 0.1257 |
| 1.5969 | 2951 | 0.1171 |
| 1.5974 | 2952 | 0.1228 |
| 1.5979 | 2953 | 0.1039 |
| 1.5985 | 2954 | 0.175 |
| 1.5990 | 2955 | 0.1883 |
| 1.5996 | 2956 | 0.1484 |
| 1.6001 | 2957 | 0.1449 |
| 1.6006 | 2958 | 0.1511 |
| 1.6012 | 2959 | 0.1256 |
| 1.6017 | 2960 | 0.1617 |
| 1.6023 | 2961 | 0.0991 |
| 1.6028 | 2962 | 0.0934 |
| 1.6034 | 2963 | 0.1146 |
| 1.6039 | 2964 | 0.1277 |
| 1.6044 | 2965 | 0.1761 |
| 1.6050 | 2966 | 0.0871 |
| 1.6055 | 2967 | 0.1361 |
| 1.6061 | 2968 | 0.1079 |
| 1.6066 | 2969 | 0.1347 |
| 1.6071 | 2970 | 0.1103 |
| 1.6077 | 2971 | 0.1854 |
| 1.6082 | 2972 | 0.1594 |
| 1.6088 | 2973 | 0.1764 |
| 1.6093 | 2974 | 0.1445 |
| 1.6098 | 2975 | 0.1491 |
| 1.6104 | 2976 | 0.1394 |
| 1.6109 | 2977 | 0.138 |
| 1.6115 | 2978 | 0.1538 |
| 1.6120 | 2979 | 0.1726 |
| 1.6126 | 2980 | 0.1285 |
| 1.6131 | 2981 | 0.1367 |
| 1.6136 | 2982 | 0.1563 |
| 1.6142 | 2983 | 0.2036 |
| 1.6147 | 2984 | 0.1713 |
| 1.6153 | 2985 | 0.1012 |
| 1.6158 | 2986 | 0.1957 |
| 1.6163 | 2987 | 0.1519 |
| 1.6169 | 2988 | 0.1345 |
| 1.6174 | 2989 | 0.118 |
| 1.6180 | 2990 | 0.1283 |
| 1.6185 | 2991 | 0.0942 |
| 1.6190 | 2992 | 0.0931 |
| 1.6196 | 2993 | 0.1119 |
| 1.6201 | 2994 | 0.1374 |
| 1.6207 | 2995 | 0.1308 |
| 1.6212 | 2996 | 0.0962 |
| 1.6218 | 2997 | 0.1228 |
| 1.6223 | 2998 | 0.1718 |
| 1.6228 | 2999 | 0.1455 |
| 1.6234 | 3000 | 0.1624 |
| 1.6239 | 3001 | 0.1567 |
| 1.6245 | 3002 | 0.1591 |
| 1.625 | 3003 | 0.1764 |
| 1.6255 | 3004 | 0.1648 |
| 1.6261 | 3005 | 0.1203 |
| 1.6266 | 3006 | 0.1064 |
| 1.6272 | 3007 | 0.1487 |
| 1.6277 | 3008 | 0.1023 |
| 1.6282 | 3009 | 0.1281 |
| 1.6288 | 3010 | 0.2009 |
| 1.6293 | 3011 | 0.1376 |
| 1.6299 | 3012 | 0.1217 |
| 1.6304 | 3013 | 0.1479 |
| 1.6310 | 3014 | 0.1045 |
| 1.6315 | 3015 | 0.1687 |
| 1.6320 | 3016 | 0.161 |
| 1.6326 | 3017 | 0.1451 |
| 1.6331 | 3018 | 0.0737 |
| 1.6337 | 3019 | 0.1111 |
| 1.6342 | 3020 | 0.0804 |
| 1.6347 | 3021 | 0.1343 |
| 1.6353 | 3022 | 0.1796 |
| 1.6358 | 3023 | 0.1528 |
| 1.6364 | 3024 | 0.0981 |
| 1.6369 | 3025 | 0.2169 |
| 1.6374 | 3026 | 0.1623 |
| 1.6380 | 3027 | 0.1037 |
| 1.6385 | 3028 | 0.1422 |
| 1.6391 | 3029 | 0.0804 |
| 1.6396 | 3030 | 0.1164 |
| 1.6402 | 3031 | 0.0939 |
| 1.6407 | 3032 | 0.1909 |
| 1.6412 | 3033 | 0.142 |
| 1.6418 | 3034 | 0.1845 |
| 1.6423 | 3035 | 0.2078 |
| 1.6429 | 3036 | 0.1088 |
| 1.6434 | 3037 | 0.1114 |
| 1.6439 | 3038 | 0.1677 |
| 1.6445 | 3039 | 0.1398 |
| 1.6450 | 3040 | 0.1678 |
| 1.6456 | 3041 | 0.1421 |
| 1.6461 | 3042 | 0.1253 |
| 1.6466 | 3043 | 0.1557 |
| 1.6472 | 3044 | 0.1857 |
| 1.6477 | 3045 | 0.1474 |
| 1.6483 | 3046 | 0.1437 |
| 1.6488 | 3047 | 0.1687 |
| 1.6494 | 3048 | 0.1955 |
| 1.6499 | 3049 | 0.155 |
| 1.6504 | 3050 | 0.1629 |
| 1.6510 | 3051 | 0.1783 |
| 1.6515 | 3052 | 0.121 |
| 1.6521 | 3053 | 0.1587 |
| 1.6526 | 3054 | 0.0774 |
| 1.6531 | 3055 | 0.1532 |
| 1.6537 | 3056 | 0.1134 |
| 1.6542 | 3057 | 0.1077 |
| 1.6548 | 3058 | 0.2061 |
| 1.6553 | 3059 | 0.0508 |
| 1.6558 | 3060 | 0.1057 |
| 1.6564 | 3061 | 0.1099 |
| 1.6569 | 3062 | 0.116 |
| 1.6575 | 3063 | 0.1121 |
| 1.6580 | 3064 | 0.1309 |
| 1.6585 | 3065 | 0.1025 |
| 1.6591 | 3066 | 0.1192 |
| 1.6596 | 3067 | 0.0882 |
| 1.6602 | 3068 | 0.1284 |
| 1.6607 | 3069 | 0.1441 |
| 1.6613 | 3070 | 0.1619 |
| 1.6618 | 3071 | 0.1776 |
| 1.6623 | 3072 | 0.1249 |
| 1.6629 | 3073 | 0.2089 |
| 1.6634 | 3074 | 0.1519 |
| 1.6640 | 3075 | 0.1589 |
| 1.6645 | 3076 | 0.1198 |
| 1.6650 | 3077 | 0.1468 |
| 1.6656 | 3078 | 0.1669 |
| 1.6661 | 3079 | 0.096 |
| 1.6667 | 3080 | 0.1229 |
| 1.6672 | 3081 | 0.1307 |
| 1.6677 | 3082 | 0.2001 |
| 1.6683 | 3083 | 0.0936 |
| 1.6688 | 3084 | 0.1411 |
| 1.6694 | 3085 | 0.1243 |
| 1.6699 | 3086 | 0.1443 |
| 1.6705 | 3087 | 0.162 |
| 1.6710 | 3088 | 0.1873 |
| 1.6715 | 3089 | 0.1229 |
| 1.6721 | 3090 | 0.1629 |
| 1.6726 | 3091 | 0.159 |
| 1.6732 | 3092 | 0.1772 |
| 1.6737 | 3093 | 0.1739 |
| 1.6742 | 3094 | 0.1169 |
| 1.6748 | 3095 | 0.1753 |
| 1.6753 | 3096 | 0.1166 |
| 1.6759 | 3097 | 0.1657 |
| 1.6764 | 3098 | 0.1075 |
| 1.6769 | 3099 | 0.0926 |
| 1.6775 | 3100 | 0.183 |
| 1.6780 | 3101 | 0.0923 |
| 1.6786 | 3102 | 0.1186 |
| 1.6791 | 3103 | 0.1287 |
| 1.6797 | 3104 | 0.16 |
| 1.6802 | 3105 | 0.1739 |
| 1.6807 | 3106 | 0.1833 |
| 1.6813 | 3107 | 0.0903 |
| 1.6818 | 3108 | 0.1352 |
| 1.6824 | 3109 | 0.203 |
| 1.6829 | 3110 | 0.1534 |
| 1.6834 | 3111 | 0.178 |
| 1.6840 | 3112 | 0.1323 |
| 1.6845 | 3113 | 0.1022 |
| 1.6851 | 3114 | 0.173 |
| 1.6856 | 3115 | 0.095 |
| 1.6861 | 3116 | 0.1932 |
| 1.6867 | 3117 | 0.1326 |
| 1.6872 | 3118 | 0.1623 |
| 1.6878 | 3119 | 0.1761 |
| 1.6883 | 3120 | 0.2711 |
| 1.6889 | 3121 | 0.1168 |
| 1.6894 | 3122 | 0.1221 |
| 1.6899 | 3123 | 0.1119 |
| 1.6905 | 3124 | 0.1486 |
| 1.6910 | 3125 | 0.12 |
| 1.6916 | 3126 | 0.1934 |
| 1.6921 | 3127 | 0.1454 |
| 1.6926 | 3128 | 0.1667 |
| 1.6932 | 3129 | 0.1224 |
| 1.6937 | 3130 | 0.0719 |
| 1.6943 | 3131 | 0.1296 |
| 1.6948 | 3132 | 0.1708 |
| 1.6953 | 3133 | 0.1507 |
| 1.6959 | 3134 | 0.1656 |
| 1.6964 | 3135 | 0.1595 |
| 1.6970 | 3136 | 0.1159 |
| 1.6975 | 3137 | 0.1458 |
| 1.6981 | 3138 | 0.1114 |
| 1.6986 | 3139 | 0.1317 |
| 1.6991 | 3140 | 0.1181 |
| 1.6997 | 3141 | 0.1627 |
| 1.7002 | 3142 | 0.1402 |
| 1.7008 | 3143 | 0.1928 |
| 1.7013 | 3144 | 0.1342 |
| 1.7018 | 3145 | 0.1767 |
| 1.7024 | 3146 | 0.0987 |
| 1.7029 | 3147 | 0.1176 |
| 1.7035 | 3148 | 0.1816 |
| 1.7040 | 3149 | 0.0761 |
| 1.7045 | 3150 | 0.1893 |
| 1.7051 | 3151 | 0.2155 |
| 1.7056 | 3152 | 0.1736 |
| 1.7062 | 3153 | 0.1795 |
| 1.7067 | 3154 | 0.123 |
| 1.7073 | 3155 | 0.1005 |
| 1.7078 | 3156 | 0.1892 |
| 1.7083 | 3157 | 0.1595 |
| 1.7089 | 3158 | 0.1261 |
| 1.7094 | 3159 | 0.0977 |
| 1.7100 | 3160 | 0.1679 |
| 1.7105 | 3161 | 0.0966 |
| 1.7110 | 3162 | 0.1665 |
| 1.7116 | 3163 | 0.0877 |
| 1.7121 | 3164 | 0.0971 |
| 1.7127 | 3165 | 0.1638 |
| 1.7132 | 3166 | 0.1345 |
| 1.7137 | 3167 | 0.1729 |
| 1.7143 | 3168 | 0.164 |
| 1.7148 | 3169 | 0.1755 |
| 1.7154 | 3170 | 0.0961 |
| 1.7159 | 3171 | 0.1285 |
| 1.7165 | 3172 | 0.1019 |
| 1.7170 | 3173 | 0.1181 |
| 1.7175 | 3174 | 0.1911 |
| 1.7181 | 3175 | 0.109 |
| 1.7186 | 3176 | 0.1131 |
| 1.7192 | 3177 | 0.2062 |
| 1.7197 | 3178 | 0.1409 |
| 1.7202 | 3179 | 0.1724 |
| 1.7208 | 3180 | 0.1846 |
| 1.7213 | 3181 | 0.1084 |
| 1.7219 | 3182 | 0.1459 |
| 1.7224 | 3183 | 0.1078 |
| 1.7229 | 3184 | 0.0871 |
| 1.7235 | 3185 | 0.0964 |
| 1.7240 | 3186 | 0.1168 |
| 1.7246 | 3187 | 0.1347 |
| 1.7251 | 3188 | 0.1683 |
| 1.7256 | 3189 | 0.1724 |
| 1.7262 | 3190 | 0.1386 |
| 1.7267 | 3191 | 0.0963 |
| 1.7273 | 3192 | 0.063 |
| 1.7278 | 3193 | 0.1902 |
| 1.7284 | 3194 | 0.2066 |
| 1.7289 | 3195 | 0.0827 |
| 1.7294 | 3196 | 0.1261 |
| 1.7300 | 3197 | 0.0949 |
| 1.7305 | 3198 | 0.1344 |
| 1.7311 | 3199 | 0.1464 |
| 1.7316 | 3200 | 0.1216 |
| 1.7321 | 3201 | 0.1894 |
| 1.7327 | 3202 | 0.0701 |
| 1.7332 | 3203 | 0.2015 |
| 1.7338 | 3204 | 0.0811 |
| 1.7343 | 3205 | 0.1138 |
| 1.7348 | 3206 | 0.1147 |
| 1.7354 | 3207 | 0.1097 |
| 1.7359 | 3208 | 0.1294 |
| 1.7365 | 3209 | 0.1124 |
| 1.7370 | 3210 | 0.1545 |
| 1.7376 | 3211 | 0.1388 |
| 1.7381 | 3212 | 0.1753 |
| 1.7386 | 3213 | 0.1394 |
| 1.7392 | 3214 | 0.1262 |
| 1.7397 | 3215 | 0.1536 |
| 1.7403 | 3216 | 0.1569 |
| 1.7408 | 3217 | 0.1194 |
| 1.7413 | 3218 | 0.1683 |
| 1.7419 | 3219 | 0.1358 |
| 1.7424 | 3220 | 0.1602 |
| 1.7430 | 3221 | 0.1973 |
| 1.7435 | 3222 | 0.189 |
| 1.7440 | 3223 | 0.0941 |
| 1.7446 | 3224 | 0.1522 |
| 1.7451 | 3225 | 0.2571 |
| 1.7457 | 3226 | 0.107 |
| 1.7462 | 3227 | 0.1244 |
| 1.7468 | 3228 | 0.1371 |
| 1.7473 | 3229 | 0.1221 |
| 1.7478 | 3230 | 0.1079 |
| 1.7484 | 3231 | 0.1406 |
| 1.7489 | 3232 | 0.1388 |
| 1.7495 | 3233 | 0.119 |
| 1.75 | 3234 | 0.1542 |
| 1.7505 | 3235 | 0.1077 |
| 1.7511 | 3236 | 0.1279 |
| 1.7516 | 3237 | 0.1235 |
| 1.7522 | 3238 | 0.1132 |
| 1.7527 | 3239 | 0.1546 |
| 1.7532 | 3240 | 0.2056 |
| 1.7538 | 3241 | 0.2005 |
| 1.7543 | 3242 | 0.1408 |
| 1.7549 | 3243 | 0.1861 |
| 1.7554 | 3244 | 0.1402 |
| 1.7560 | 3245 | 0.1131 |
| 1.7565 | 3246 | 0.0999 |
| 1.7570 | 3247 | 0.0895 |
| 1.7576 | 3248 | 0.1367 |
| 1.7581 | 3249 | 0.1588 |
| 1.7587 | 3250 | 0.1706 |
| 1.7592 | 3251 | 0.1647 |
| 1.7597 | 3252 | 0.1164 |
| 1.7603 | 3253 | 0.1058 |
| 1.7608 | 3254 | 0.1601 |
| 1.7614 | 3255 | 0.1444 |
| 1.7619 | 3256 | 0.1502 |
| 1.7624 | 3257 | 0.1709 |
| 1.7630 | 3258 | 0.1653 |
| 1.7635 | 3259 | 0.0546 |
| 1.7641 | 3260 | 0.1181 |
| 1.7646 | 3261 | 0.1238 |
| 1.7652 | 3262 | 0.1282 |
| 1.7657 | 3263 | 0.1592 |
| 1.7662 | 3264 | 0.1494 |
| 1.7668 | 3265 | 0.2134 |
| 1.7673 | 3266 | 0.1356 |
| 1.7679 | 3267 | 0.1612 |
| 1.7684 | 3268 | 0.1143 |
| 1.7689 | 3269 | 0.2423 |
| 1.7695 | 3270 | 0.0893 |
| 1.7700 | 3271 | 0.1639 |
| 1.7706 | 3272 | 0.2058 |
| 1.7711 | 3273 | 0.0952 |
| 1.7716 | 3274 | 0.1513 |
| 1.7722 | 3275 | 0.1767 |
| 1.7727 | 3276 | 0.2127 |
| 1.7733 | 3277 | 0.1141 |
| 1.7738 | 3278 | 0.0878 |
| 1.7744 | 3279 | 0.1457 |
| 1.7749 | 3280 | 0.1459 |
| 1.7754 | 3281 | 0.1895 |
| 1.7760 | 3282 | 0.1511 |
| 1.7765 | 3283 | 0.1375 |
| 1.7771 | 3284 | 0.2081 |
| 1.7776 | 3285 | 0.0627 |
| 1.7781 | 3286 | 0.1062 |
| 1.7787 | 3287 | 0.1456 |
| 1.7792 | 3288 | 0.1291 |
| 1.7798 | 3289 | 0.1396 |
| 1.7803 | 3290 | 0.1765 |
| 1.7808 | 3291 | 0.0801 |
| 1.7814 | 3292 | 0.1904 |
| 1.7819 | 3293 | 0.128 |
| 1.7825 | 3294 | 0.2003 |
| 1.7830 | 3295 | 0.106 |
| 1.7835 | 3296 | 0.181 |
| 1.7841 | 3297 | 0.1609 |
| 1.7846 | 3298 | 0.1436 |
| 1.7852 | 3299 | 0.1645 |
| 1.7857 | 3300 | 0.0934 |
| 1.7863 | 3301 | 0.1491 |
| 1.7868 | 3302 | 0.1468 |
| 1.7873 | 3303 | 0.199 |
| 1.7879 | 3304 | 0.1303 |
| 1.7884 | 3305 | 0.153 |
| 1.7890 | 3306 | 0.283 |
| 1.7895 | 3307 | 0.0757 |
| 1.7900 | 3308 | 0.1467 |
| 1.7906 | 3309 | 0.1875 |
| 1.7911 | 3310 | 0.1424 |
| 1.7917 | 3311 | 0.2265 |
| 1.7922 | 3312 | 0.1117 |
| 1.7927 | 3313 | 0.0587 |
| 1.7933 | 3314 | 0.1304 |
| 1.7938 | 3315 | 0.1024 |
| 1.7944 | 3316 | 0.0639 |
| 1.7949 | 3317 | 0.153 |
| 1.7955 | 3318 | 0.1512 |
| 1.7960 | 3319 | 0.0981 |
| 1.7965 | 3320 | 0.1406 |
| 1.7971 | 3321 | 0.1339 |
| 1.7976 | 3322 | 0.1035 |
| 1.7982 | 3323 | 0.1035 |
| 1.7987 | 3324 | 0.1349 |
| 1.7992 | 3325 | 0.129 |
| 1.7998 | 3326 | 0.1245 |
| 1.8003 | 3327 | 0.1629 |
| 1.8009 | 3328 | 0.2134 |
| 1.8014 | 3329 | 0.1188 |
| 1.8019 | 3330 | 0.0807 |
| 1.8025 | 3331 | 0.0875 |
| 1.8030 | 3332 | 0.0821 |
| 1.8036 | 3333 | 0.1032 |
| 1.8041 | 3334 | 0.0954 |
| 1.8047 | 3335 | 0.2133 |
| 1.8052 | 3336 | 0.1519 |
| 1.8057 | 3337 | 0.1515 |
| 1.8063 | 3338 | 0.1382 |
| 1.8068 | 3339 | 0.1435 |
| 1.8074 | 3340 | 0.1128 |
| 1.8079 | 3341 | 0.1777 |
| 1.8084 | 3342 | 0.0946 |
| 1.8090 | 3343 | 0.2011 |
| 1.8095 | 3344 | 0.093 |
| 1.8101 | 3345 | 0.0973 |
| 1.8106 | 3346 | 0.1845 |
| 1.8111 | 3347 | 0.1429 |
| 1.8117 | 3348 | 0.1615 |
| 1.8122 | 3349 | 0.0991 |
| 1.8128 | 3350 | 0.1783 |
| 1.8133 | 3351 | 0.1302 |
| 1.8139 | 3352 | 0.1255 |
| 1.8144 | 3353 | 0.1599 |
| 1.8149 | 3354 | 0.1773 |
| 1.8155 | 3355 | 0.1269 |
| 1.8160 | 3356 | 0.1722 |
| 1.8166 | 3357 | 0.1461 |
| 1.8171 | 3358 | 0.1114 |
| 1.8176 | 3359 | 0.1353 |
| 1.8182 | 3360 | 0.0718 |
| 1.8187 | 3361 | 0.1533 |
| 1.8193 | 3362 | 0.1608 |
| 1.8198 | 3363 | 0.1557 |
| 1.8203 | 3364 | 0.108 |
| 1.8209 | 3365 | 0.1231 |
| 1.8214 | 3366 | 0.1569 |
| 1.8220 | 3367 | 0.1251 |
| 1.8225 | 3368 | 0.1815 |
| 1.8231 | 3369 | 0.0905 |
| 1.8236 | 3370 | 0.0907 |
| 1.8241 | 3371 | 0.1921 |
| 1.8247 | 3372 | 0.1161 |
| 1.8252 | 3373 | 0.1267 |
| 1.8258 | 3374 | 0.0967 |
| 1.8263 | 3375 | 0.1281 |
| 1.8268 | 3376 | 0.1678 |
| 1.8274 | 3377 | 0.1545 |
| 1.8279 | 3378 | 0.1062 |
| 1.8285 | 3379 | 0.0841 |
| 1.8290 | 3380 | 0.1217 |
| 1.8295 | 3381 | 0.1513 |
| 1.8301 | 3382 | 0.0934 |
| 1.8306 | 3383 | 0.0874 |
| 1.8312 | 3384 | 0.1124 |
| 1.8317 | 3385 | 0.068 |
| 1.8323 | 3386 | 0.1045 |
| 1.8328 | 3387 | 0.0846 |
| 1.8333 | 3388 | 0.1375 |
| 1.8339 | 3389 | 0.1433 |
| 1.8344 | 3390 | 0.0742 |
| 1.8350 | 3391 | 0.1374 |
| 1.8355 | 3392 | 0.1718 |
| 1.8360 | 3393 | 0.1349 |
| 1.8366 | 3394 | 0.1852 |
| 1.8371 | 3395 | 0.1461 |
| 1.8377 | 3396 | 0.172 |
| 1.8382 | 3397 | 0.1809 |
| 1.8387 | 3398 | 0.1176 |
| 1.8393 | 3399 | 0.0811 |
| 1.8398 | 3400 | 0.144 |
| 1.8404 | 3401 | 0.1064 |
| 1.8409 | 3402 | 0.1666 |
| 1.8415 | 3403 | 0.1239 |
| 1.8420 | 3404 | 0.0793 |
| 1.8425 | 3405 | 0.1357 |
| 1.8431 | 3406 | 0.1008 |
| 1.8436 | 3407 | 0.1187 |
| 1.8442 | 3408 | 0.1157 |
| 1.8447 | 3409 | 0.1793 |
| 1.8452 | 3410 | 0.0997 |
| 1.8458 | 3411 | 0.121 |
| 1.8463 | 3412 | 0.0953 |
| 1.8469 | 3413 | 0.1071 |
| 1.8474 | 3414 | 0.137 |
| 1.8479 | 3415 | 0.1033 |
| 1.8485 | 3416 | 0.1874 |
| 1.8490 | 3417 | 0.0786 |
| 1.8496 | 3418 | 0.1177 |
| 1.8501 | 3419 | 0.145 |
| 1.8506 | 3420 | 0.0869 |
| 1.8512 | 3421 | 0.096 |
| 1.8517 | 3422 | 0.1259 |
| 1.8523 | 3423 | 0.1003 |
| 1.8528 | 3424 | 0.1524 |
| 1.8534 | 3425 | 0.1191 |
| 1.8539 | 3426 | 0.1265 |
| 1.8544 | 3427 | 0.09 |
| 1.8550 | 3428 | 0.1057 |
| 1.8555 | 3429 | 0.139 |
| 1.8561 | 3430 | 0.2413 |
| 1.8566 | 3431 | 0.2455 |
| 1.8571 | 3432 | 0.1225 |
| 1.8577 | 3433 | 0.1909 |
| 1.8582 | 3434 | 0.0837 |
| 1.8588 | 3435 | 0.1787 |
| 1.8593 | 3436 | 0.1265 |
| 1.8598 | 3437 | 0.1458 |
| 1.8604 | 3438 | 0.126 |
| 1.8609 | 3439 | 0.1263 |
| 1.8615 | 3440 | 0.1139 |
| 1.8620 | 3441 | 0.1025 |
| 1.8626 | 3442 | 0.115 |
| 1.8631 | 3443 | 0.0745 |
| 1.8636 | 3444 | 0.2031 |
| 1.8642 | 3445 | 0.1167 |
| 1.8647 | 3446 | 0.1435 |
| 1.8653 | 3447 | 0.1995 |
| 1.8658 | 3448 | 0.0633 |
| 1.8663 | 3449 | 0.1104 |
| 1.8669 | 3450 | 0.0903 |
| 1.8674 | 3451 | 0.2537 |
| 1.8680 | 3452 | 0.1511 |
| 1.8685 | 3453 | 0.1907 |
| 1.8690 | 3454 | 0.1438 |
| 1.8696 | 3455 | 0.1242 |
| 1.8701 | 3456 | 0.1069 |
| 1.8707 | 3457 | 0.1459 |
| 1.8712 | 3458 | 0.1477 |
| 1.8718 | 3459 | 0.199 |
| 1.8723 | 3460 | 0.1307 |
| 1.8728 | 3461 | 0.0553 |
| 1.8734 | 3462 | 0.1462 |
| 1.8739 | 3463 | 0.0752 |
| 1.8745 | 3464 | 0.1257 |
| 1.875 | 3465 | 0.1226 |
| 1.8755 | 3466 | 0.1329 |
| 1.8761 | 3467 | 0.0877 |
| 1.8766 | 3468 | 0.248 |
| 1.8772 | 3469 | 0.1792 |
| 1.8777 | 3470 | 0.1565 |
| 1.8782 | 3471 | 0.0849 |
| 1.8788 | 3472 | 0.0748 |
| 1.8793 | 3473 | 0.1071 |
| 1.8799 | 3474 | 0.2068 |
| 1.8804 | 3475 | 0.1632 |
| 1.8810 | 3476 | 0.1077 |
| 1.8815 | 3477 | 0.1074 |
| 1.8820 | 3478 | 0.1381 |
| 1.8826 | 3479 | 0.1371 |
| 1.8831 | 3480 | 0.1554 |
| 1.8837 | 3481 | 0.0862 |
| 1.8842 | 3482 | 0.1134 |
| 1.8847 | 3483 | 0.0917 |
| 1.8853 | 3484 | 0.1064 |
| 1.8858 | 3485 | 0.107 |
| 1.8864 | 3486 | 0.1672 |
| 1.8869 | 3487 | 0.1369 |
| 1.8874 | 3488 | 0.1046 |
| 1.8880 | 3489 | 0.1897 |
| 1.8885 | 3490 | 0.1445 |
| 1.8891 | 3491 | 0.1557 |
| 1.8896 | 3492 | 0.1475 |
| 1.8902 | 3493 | 0.0924 |
| 1.8907 | 3494 | 0.1195 |
| 1.8912 | 3495 | 0.134 |
| 1.8918 | 3496 | 0.1781 |
| 1.8923 | 3497 | 0.1302 |
| 1.8929 | 3498 | 0.1147 |
| 1.8934 | 3499 | 0.1188 |
| 1.8939 | 3500 | 0.0764 |
| 1.8945 | 3501 | 0.1862 |
| 1.8950 | 3502 | 0.1731 |
| 1.8956 | 3503 | 0.1506 |
| 1.8961 | 3504 | 0.2071 |
| 1.8966 | 3505 | 0.1214 |
| 1.8972 | 3506 | 0.0631 |
| 1.8977 | 3507 | 0.2146 |
| 1.8983 | 3508 | 0.1264 |
| 1.8988 | 3509 | 0.1227 |
| 1.8994 | 3510 | 0.098 |
| 1.8999 | 3511 | 0.1133 |
| 1.9004 | 3512 | 0.108 |
| 1.9010 | 3513 | 0.1429 |
| 1.9015 | 3514 | 0.1002 |
| 1.9021 | 3515 | 0.0996 |
| 1.9026 | 3516 | 0.1665 |
| 1.9031 | 3517 | 0.1244 |
| 1.9037 | 3518 | 0.0964 |
| 1.9042 | 3519 | 0.1123 |
| 1.9048 | 3520 | 0.1491 |
| 1.9053 | 3521 | 0.1112 |
| 1.9058 | 3522 | 0.1046 |
| 1.9064 | 3523 | 0.1169 |
| 1.9069 | 3524 | 0.1542 |
| 1.9075 | 3525 | 0.1642 |
| 1.9080 | 3526 | 0.1363 |
| 1.9085 | 3527 | 0.1173 |
| 1.9091 | 3528 | 0.1342 |
| 1.9096 | 3529 | 0.1478 |
| 1.9102 | 3530 | 0.1453 |
| 1.9107 | 3531 | 0.0621 |
| 1.9113 | 3532 | 0.1445 |
| 1.9118 | 3533 | 0.2127 |
| 1.9123 | 3534 | 0.1118 |
| 1.9129 | 3535 | 0.1104 |
| 1.9134 | 3536 | 0.1489 |
| 1.9140 | 3537 | 0.1287 |
| 1.9145 | 3538 | 0.1405 |
| 1.9150 | 3539 | 0.1815 |
| 1.9156 | 3540 | 0.2112 |
| 1.9161 | 3541 | 0.093 |
| 1.9167 | 3542 | 0.1712 |
| 1.9172 | 3543 | 0.1072 |
| 1.9177 | 3544 | 0.1197 |
| 1.9183 | 3545 | 0.0915 |
| 1.9188 | 3546 | 0.1916 |
| 1.9194 | 3547 | 0.1411 |
| 1.9199 | 3548 | 0.0798 |
| 1.9205 | 3549 | 0.1057 |
| 1.9210 | 3550 | 0.1458 |
| 1.9215 | 3551 | 0.2198 |
| 1.9221 | 3552 | 0.1297 |
| 1.9226 | 3553 | 0.1554 |
| 1.9232 | 3554 | 0.1415 |
| 1.9237 | 3555 | 0.1241 |
| 1.9242 | 3556 | 0.1127 |
| 1.9248 | 3557 | 0.1121 |
| 1.9253 | 3558 | 0.0801 |
| 1.9259 | 3559 | 0.1509 |
| 1.9264 | 3560 | 0.2069 |
| 1.9269 | 3561 | 0.2097 |
| 1.9275 | 3562 | 0.148 |
| 1.9280 | 3563 | 0.1672 |
| 1.9286 | 3564 | 0.1582 |
| 1.9291 | 3565 | 0.0763 |
| 1.9297 | 3566 | 0.0645 |
| 1.9302 | 3567 | 0.0952 |
| 1.9307 | 3568 | 0.1827 |
| 1.9313 | 3569 | 0.135 |
| 1.9318 | 3570 | 0.1278 |
| 1.9324 | 3571 | 0.1181 |
| 1.9329 | 3572 | 0.1053 |
| 1.9334 | 3573 | 0.0809 |
| 1.9340 | 3574 | 0.1149 |
| 1.9345 | 3575 | 0.1938 |
| 1.9351 | 3576 | 0.2437 |
| 1.9356 | 3577 | 0.1428 |
| 1.9361 | 3578 | 0.2036 |
| 1.9367 | 3579 | 0.2326 |
| 1.9372 | 3580 | 0.1331 |
| 1.9378 | 3581 | 0.2583 |
| 1.9383 | 3582 | 0.0846 |
| 1.9389 | 3583 | 0.1506 |
| 1.9394 | 3584 | 0.1001 |
| 1.9399 | 3585 | 0.0816 |
| 1.9405 | 3586 | 0.1654 |
| 1.9410 | 3587 | 0.1472 |
| 1.9416 | 3588 | 0.134 |
| 1.9421 | 3589 | 0.161 |
| 1.9426 | 3590 | 0.0722 |
| 1.9432 | 3591 | 0.1112 |
| 1.9437 | 3592 | 0.1371 |
| 1.9443 | 3593 | 0.154 |
| 1.9448 | 3594 | 0.1929 |
| 1.9453 | 3595 | 0.0807 |
| 1.9459 | 3596 | 0.1179 |
| 1.9464 | 3597 | 0.1123 |
| 1.9470 | 3598 | 0.1316 |
| 1.9475 | 3599 | 0.1715 |
| 1.9481 | 3600 | 0.0725 |
| 1.9486 | 3601 | 0.1253 |
| 1.9491 | 3602 | 0.1063 |
| 1.9497 | 3603 | 0.0637 |
| 1.9502 | 3604 | 0.1376 |
| 1.9508 | 3605 | 0.1692 |
| 1.9513 | 3606 | 0.1444 |
| 1.9518 | 3607 | 0.1961 |
| 1.9524 | 3608 | 0.1237 |
| 1.9529 | 3609 | 0.1656 |
| 1.9535 | 3610 | 0.1268 |
| 1.9540 | 3611 | 0.1228 |
| 1.9545 | 3612 | 0.1233 |
| 1.9551 | 3613 | 0.1336 |
| 1.9556 | 3614 | 0.1778 |
| 1.9562 | 3615 | 0.1584 |
| 1.9567 | 3616 | 0.0842 |
| 1.9573 | 3617 | 0.1075 |
| 1.9578 | 3618 | 0.1397 |
| 1.9583 | 3619 | 0.0974 |
| 1.9589 | 3620 | 0.1425 |
| 1.9594 | 3621 | 0.1124 |
| 1.9600 | 3622 | 0.1553 |
| 1.9605 | 3623 | 0.1082 |
| 1.9610 | 3624 | 0.2172 |
| 1.9616 | 3625 | 0.136 |
| 1.9621 | 3626 | 0.1152 |
| 1.9627 | 3627 | 0.187 |
| 1.9632 | 3628 | 0.0914 |
| 1.9637 | 3629 | 0.1043 |
| 1.9643 | 3630 | 0.1044 |
| 1.9648 | 3631 | 0.2358 |
| 1.9654 | 3632 | 0.088 |
| 1.9659 | 3633 | 0.085 |
| 1.9665 | 3634 | 0.139 |
| 1.9670 | 3635 | 0.0856 |
| 1.9675 | 3636 | 0.2187 |
| 1.9681 | 3637 | 0.163 |
| 1.9686 | 3638 | 0.1722 |
| 1.9692 | 3639 | 0.1082 |
| 1.9697 | 3640 | 0.1889 |
| 1.9702 | 3641 | 0.1865 |
| 1.9708 | 3642 | 0.1392 |
| 1.9713 | 3643 | 0.1366 |
| 1.9719 | 3644 | 0.2583 |
| 1.9724 | 3645 | 0.1249 |
| 1.9729 | 3646 | 0.1803 |
| 1.9735 | 3647 | 0.142 |
| 1.9740 | 3648 | 0.1319 |
| 1.9746 | 3649 | 0.1987 |
| 1.9751 | 3650 | 0.0852 |
| 1.9756 | 3651 | 0.1585 |
| 1.9762 | 3652 | 0.1135 |
| 1.9767 | 3653 | 0.1252 |
| 1.9773 | 3654 | 0.1397 |
| 1.9778 | 3655 | 0.1603 |
| 1.9784 | 3656 | 0.1126 |
| 1.9789 | 3657 | 0.1612 |
| 1.9794 | 3658 | 0.0826 |
| 1.9800 | 3659 | 0.2043 |
| 1.9805 | 3660 | 0.1867 |
| 1.9811 | 3661 | 0.1596 |
| 1.9816 | 3662 | 0.1495 |
| 1.9821 | 3663 | 0.111 |
| 1.9827 | 3664 | 0.1552 |
| 1.9832 | 3665 | 0.1026 |
| 1.9838 | 3666 | 0.1031 |
| 1.9843 | 3667 | 0.1257 |
| 1.9848 | 3668 | 0.1231 |
| 1.9854 | 3669 | 0.1231 |
| 1.9859 | 3670 | 0.0685 |
| 1.9865 | 3671 | 0.116 |
| 1.9870 | 3672 | 0.1789 |
| 1.9876 | 3673 | 0.0924 |
| 1.9881 | 3674 | 0.1606 |
| 1.9886 | 3675 | 0.1572 |
| 1.9892 | 3676 | 0.1543 |
| 1.9897 | 3677 | 0.0975 |
| 1.9903 | 3678 | 0.1347 |
| 1.9908 | 3679 | 0.0963 |
| 1.9913 | 3680 | 0.085 |
| 1.9919 | 3681 | 0.2071 |
| 1.9924 | 3682 | 0.1415 |
| 1.9930 | 3683 | 0.1221 |
| 1.9935 | 3684 | 0.1796 |
| 1.9940 | 3685 | 0.2002 |
| 1.9946 | 3686 | 0.1112 |
| 1.9951 | 3687 | 0.201 |
| 1.9957 | 3688 | 0.1599 |
| 1.9962 | 3689 | 0.1209 |
| 1.9968 | 3690 | 0.1429 |
| 1.9973 | 3691 | 0.1201 |
| 1.9978 | 3692 | 0.0651 |
| 1.9984 | 3693 | 0.0922 |
| 1.9989 | 3694 | 0.1071 |
| 1.9995 | 3695 | 0.1756 |
| 2.0 | 3696 | 0.1262 |
| 2.0005 | 3697 | 0.0868 |
| 2.0011 | 3698 | 0.1027 |
| 2.0016 | 3699 | 0.1486 |
| 2.0022 | 3700 | 0.1695 |
| 2.0027 | 3701 | 0.1058 |
| 2.0032 | 3702 | 0.0815 |
| 2.0038 | 3703 | 0.1181 |
| 2.0043 | 3704 | 0.0835 |
| 2.0049 | 3705 | 0.1223 |
| 2.0054 | 3706 | 0.1353 |
| 2.0060 | 3707 | 0.12 |
| 2.0065 | 3708 | 0.0961 |
| 2.0070 | 3709 | 0.0656 |
| 2.0076 | 3710 | 0.038 |
| 2.0081 | 3711 | 0.1012 |
| 2.0087 | 3712 | 0.1237 |
| 2.0092 | 3713 | 0.0787 |
| 2.0097 | 3714 | 0.1479 |
| 2.0103 | 3715 | 0.1126 |
| 2.0108 | 3716 | 0.0549 |
| 2.0114 | 3717 | 0.1574 |
| 2.0119 | 3718 | 0.0952 |
| 2.0124 | 3719 | 0.0893 |
| 2.0130 | 3720 | 0.1139 |
| 2.0135 | 3721 | 0.0671 |
| 2.0141 | 3722 | 0.0589 |
| 2.0146 | 3723 | 0.059 |
| 2.0152 | 3724 | 0.111 |
| 2.0157 | 3725 | 0.1269 |
| 2.0162 | 3726 | 0.0707 |
| 2.0168 | 3727 | 0.1661 |
| 2.0173 | 3728 | 0.1041 |
| 2.0179 | 3729 | 0.1088 |
| 2.0184 | 3730 | 0.1043 |
| 2.0189 | 3731 | 0.0852 |
| 2.0195 | 3732 | 0.1726 |
| 2.0200 | 3733 | 0.1209 |
| 2.0206 | 3734 | 0.0976 |
| 2.0211 | 3735 | 0.0573 |
| 2.0216 | 3736 | 0.1027 |
| 2.0222 | 3737 | 0.1287 |
| 2.0227 | 3738 | 0.112 |
| 2.0233 | 3739 | 0.0923 |
| 2.0238 | 3740 | 0.0919 |
| 2.0244 | 3741 | 0.1019 |
| 2.0249 | 3742 | 0.0851 |
| 2.0254 | 3743 | 0.1342 |
| 2.0260 | 3744 | 0.0892 |
| 2.0265 | 3745 | 0.0756 |
| 2.0271 | 3746 | 0.1088 |
| 2.0276 | 3747 | 0.0686 |
| 2.0281 | 3748 | 0.1172 |
| 2.0287 | 3749 | 0.1495 |
| 2.0292 | 3750 | 0.1435 |
| 2.0298 | 3751 | 0.0527 |
| 2.0303 | 3752 | 0.1294 |
| 2.0308 | 3753 | 0.1277 |
| 2.0314 | 3754 | 0.1159 |
| 2.0319 | 3755 | 0.0771 |
| 2.0325 | 3756 | 0.0853 |
| 2.0330 | 3757 | 0.0985 |
| 2.0335 | 3758 | 0.1415 |
| 2.0341 | 3759 | 0.0838 |
| 2.0346 | 3760 | 0.1366 |
| 2.0352 | 3761 | 0.0977 |
| 2.0357 | 3762 | 0.0902 |
| 2.0363 | 3763 | 0.1419 |
| 2.0368 | 3764 | 0.1301 |
| 2.0373 | 3765 | 0.0851 |
| 2.0379 | 3766 | 0.1039 |
| 2.0384 | 3767 | 0.0877 |
| 2.0390 | 3768 | 0.1048 |
| 2.0395 | 3769 | 0.1027 |
| 2.0400 | 3770 | 0.1365 |
| 2.0406 | 3771 | 0.0866 |
| 2.0411 | 3772 | 0.1162 |
| 2.0417 | 3773 | 0.1088 |
| 2.0422 | 3774 | 0.141 |
| 2.0427 | 3775 | 0.1279 |
| 2.0433 | 3776 | 0.1089 |
| 2.0438 | 3777 | 0.1693 |
| 2.0444 | 3778 | 0.0725 |
| 2.0449 | 3779 | 0.0506 |
| 2.0455 | 3780 | 0.112 |
| 2.0460 | 3781 | 0.0538 |
| 2.0465 | 3782 | 0.0965 |
| 2.0471 | 3783 | 0.1196 |
| 2.0476 | 3784 | 0.125 |
| 2.0482 | 3785 | 0.0992 |
| 2.0487 | 3786 | 0.0738 |
| 2.0492 | 3787 | 0.0956 |
| 2.0498 | 3788 | 0.1151 |
| 2.0503 | 3789 | 0.1054 |
| 2.0509 | 3790 | 0.0682 |
| 2.0514 | 3791 | 0.1283 |
| 2.0519 | 3792 | 0.0844 |
| 2.0525 | 3793 | 0.1502 |
| 2.0530 | 3794 | 0.0807 |
| 2.0536 | 3795 | 0.1474 |
| 2.0541 | 3796 | 0.0814 |
| 2.0547 | 3797 | 0.0911 |
| 2.0552 | 3798 | 0.1086 |
| 2.0557 | 3799 | 0.0911 |
| 2.0563 | 3800 | 0.0924 |
| 2.0568 | 3801 | 0.0949 |
| 2.0574 | 3802 | 0.1195 |
| 2.0579 | 3803 | 0.101 |
| 2.0584 | 3804 | 0.0939 |
| 2.0590 | 3805 | 0.1092 |
| 2.0595 | 3806 | 0.0833 |
| 2.0601 | 3807 | 0.0799 |
| 2.0606 | 3808 | 0.1366 |
| 2.0611 | 3809 | 0.0937 |
| 2.0617 | 3810 | 0.1322 |
| 2.0622 | 3811 | 0.0937 |
| 2.0628 | 3812 | 0.0989 |
| 2.0633 | 3813 | 0.112 |
| 2.0639 | 3814 | 0.1116 |
| 2.0644 | 3815 | 0.0603 |
| 2.0649 | 3816 | 0.1627 |
| 2.0655 | 3817 | 0.0453 |
| 2.0660 | 3818 | 0.1235 |
| 2.0666 | 3819 | 0.1528 |
| 2.0671 | 3820 | 0.1288 |
| 2.0676 | 3821 | 0.0948 |
| 2.0682 | 3822 | 0.113 |
| 2.0687 | 3823 | 0.0893 |
| 2.0693 | 3824 | 0.1466 |
| 2.0698 | 3825 | 0.0838 |
| 2.0703 | 3826 | 0.1004 |
| 2.0709 | 3827 | 0.152 |
| 2.0714 | 3828 | 0.067 |
| 2.0720 | 3829 | 0.1333 |
| 2.0725 | 3830 | 0.1321 |
| 2.0731 | 3831 | 0.055 |
| 2.0736 | 3832 | 0.1072 |
| 2.0741 | 3833 | 0.1392 |
| 2.0747 | 3834 | 0.0628 |
| 2.0752 | 3835 | 0.1286 |
| 2.0758 | 3836 | 0.1403 |
| 2.0763 | 3837 | 0.0843 |
| 2.0768 | 3838 | 0.1565 |
| 2.0774 | 3839 | 0.1061 |
| 2.0779 | 3840 | 0.1434 |
| 2.0785 | 3841 | 0.1105 |
| 2.0790 | 3842 | 0.1042 |
| 2.0795 | 3843 | 0.0974 |
| 2.0801 | 3844 | 0.0706 |
| 2.0806 | 3845 | 0.106 |
| 2.0812 | 3846 | 0.1558 |
| 2.0817 | 3847 | 0.1305 |
| 2.0823 | 3848 | 0.0913 |
| 2.0828 | 3849 | 0.1394 |
| 2.0833 | 3850 | 0.162 |
| 2.0839 | 3851 | 0.0883 |
| 2.0844 | 3852 | 0.1376 |
| 2.0850 | 3853 | 0.0624 |
| 2.0855 | 3854 | 0.0639 |
| 2.0860 | 3855 | 0.1365 |
| 2.0866 | 3856 | 0.0804 |
| 2.0871 | 3857 | 0.1241 |
| 2.0877 | 3858 | 0.1047 |
| 2.0882 | 3859 | 0.0719 |
| 2.0887 | 3860 | 0.1125 |
| 2.0893 | 3861 | 0.1549 |
| 2.0898 | 3862 | 0.0932 |
| 2.0904 | 3863 | 0.1644 |
| 2.0909 | 3864 | 0.0918 |
| 2.0915 | 3865 | 0.1134 |
| 2.0920 | 3866 | 0.1155 |
| 2.0925 | 3867 | 0.1056 |
| 2.0931 | 3868 | 0.1158 |
| 2.0936 | 3869 | 0.0948 |
| 2.0942 | 3870 | 0.122 |
| 2.0947 | 3871 | 0.0504 |
| 2.0952 | 3872 | 0.1643 |
| 2.0958 | 3873 | 0.1064 |
| 2.0963 | 3874 | 0.1197 |
| 2.0969 | 3875 | 0.1816 |
| 2.0974 | 3876 | 0.0956 |
| 2.0979 | 3877 | 0.1038 |
| 2.0985 | 3878 | 0.0406 |
| 2.0990 | 3879 | 0.0819 |
| 2.0996 | 3880 | 0.1045 |
| 2.1001 | 3881 | 0.1393 |
| 2.1006 | 3882 | 0.1046 |
| 2.1012 | 3883 | 0.1299 |
| 2.1017 | 3884 | 0.1438 |
| 2.1023 | 3885 | 0.0821 |
| 2.1028 | 3886 | 0.0581 |
| 2.1034 | 3887 | 0.0722 |
| 2.1039 | 3888 | 0.0921 |
| 2.1044 | 3889 | 0.0812 |
| 2.1050 | 3890 | 0.0918 |
| 2.1055 | 3891 | 0.1152 |
| 2.1061 | 3892 | 0.1103 |
| 2.1066 | 3893 | 0.1367 |
| 2.1071 | 3894 | 0.0771 |
| 2.1077 | 3895 | 0.1125 |
| 2.1082 | 3896 | 0.0814 |
| 2.1088 | 3897 | 0.0998 |
| 2.1093 | 3898 | 0.1494 |
| 2.1098 | 3899 | 0.0778 |
| 2.1104 | 3900 | 0.0717 |
| 2.1109 | 3901 | 0.1093 |
| 2.1115 | 3902 | 0.0516 |
| 2.1120 | 3903 | 0.0955 |
| 2.1126 | 3904 | 0.0789 |
| 2.1131 | 3905 | 0.0894 |
| 2.1136 | 3906 | 0.0763 |
| 2.1142 | 3907 | 0.123 |
| 2.1147 | 3908 | 0.0784 |
| 2.1153 | 3909 | 0.0523 |
| 2.1158 | 3910 | 0.0961 |
| 2.1163 | 3911 | 0.1403 |
| 2.1169 | 3912 | 0.0983 |
| 2.1174 | 3913 | 0.0937 |
| 2.1180 | 3914 | 0.0822 |
| 2.1185 | 3915 | 0.1102 |
| 2.1190 | 3916 | 0.0982 |
| 2.1196 | 3917 | 0.0886 |
| 2.1201 | 3918 | 0.0752 |
| 2.1207 | 3919 | 0.1724 |
| 2.1212 | 3920 | 0.0537 |
| 2.1218 | 3921 | 0.0853 |
| 2.1223 | 3922 | 0.0754 |
| 2.1228 | 3923 | 0.0754 |
| 2.1234 | 3924 | 0.179 |
| 2.1239 | 3925 | 0.1207 |
| 2.1245 | 3926 | 0.1055 |
| 2.125 | 3927 | 0.0881 |
| 2.1255 | 3928 | 0.0856 |
| 2.1261 | 3929 | 0.0806 |
| 2.1266 | 3930 | 0.0629 |
| 2.1272 | 3931 | 0.0866 |
| 2.1277 | 3932 | 0.1315 |
| 2.1282 | 3933 | 0.0533 |
| 2.1288 | 3934 | 0.1413 |
| 2.1293 | 3935 | 0.0582 |
| 2.1299 | 3936 | 0.093 |
| 2.1304 | 3937 | 0.1078 |
| 2.1310 | 3938 | 0.0617 |
| 2.1315 | 3939 | 0.0888 |
| 2.1320 | 3940 | 0.1497 |
| 2.1326 | 3941 | 0.0848 |
| 2.1331 | 3942 | 0.1136 |
| 2.1337 | 3943 | 0.1066 |
| 2.1342 | 3944 | 0.1173 |
| 2.1347 | 3945 | 0.0853 |
| 2.1353 | 3946 | 0.0714 |
| 2.1358 | 3947 | 0.114 |
| 2.1364 | 3948 | 0.0637 |
| 2.1369 | 3949 | 0.0674 |
| 2.1374 | 3950 | 0.1088 |
| 2.1380 | 3951 | 0.0939 |
| 2.1385 | 3952 | 0.0938 |
| 2.1391 | 3953 | 0.0858 |
| 2.1396 | 3954 | 0.0971 |
| 2.1402 | 3955 | 0.0854 |
| 2.1407 | 3956 | 0.1242 |
| 2.1412 | 3957 | 0.0759 |
| 2.1418 | 3958 | 0.1041 |
| 2.1423 | 3959 | 0.0794 |
| 2.1429 | 3960 | 0.1637 |
| 2.1434 | 3961 | 0.0948 |
| 2.1439 | 3962 | 0.1284 |
| 2.1445 | 3963 | 0.1894 |
| 2.1450 | 3964 | 0.1456 |
| 2.1456 | 3965 | 0.1867 |
| 2.1461 | 3966 | 0.0972 |
| 2.1466 | 3967 | 0.1385 |
| 2.1472 | 3968 | 0.0772 |
| 2.1477 | 3969 | 0.1075 |
| 2.1483 | 3970 | 0.0721 |
| 2.1488 | 3971 | 0.0743 |
| 2.1494 | 3972 | 0.073 |
| 2.1499 | 3973 | 0.1095 |
| 2.1504 | 3974 | 0.1036 |
| 2.1510 | 3975 | 0.1059 |
| 2.1515 | 3976 | 0.1541 |
| 2.1521 | 3977 | 0.1267 |
| 2.1526 | 3978 | 0.1006 |
| 2.1531 | 3979 | 0.1332 |
| 2.1537 | 3980 | 0.1288 |
| 2.1542 | 3981 | 0.1526 |
| 2.1548 | 3982 | 0.0997 |
| 2.1553 | 3983 | 0.0731 |
| 2.1558 | 3984 | 0.1243 |
| 2.1564 | 3985 | 0.0679 |
| 2.1569 | 3986 | 0.0804 |
| 2.1575 | 3987 | 0.1052 |
| 2.1580 | 3988 | 0.077 |
| 2.1585 | 3989 | 0.1017 |
| 2.1591 | 3990 | 0.1088 |
| 2.1596 | 3991 | 0.0804 |
| 2.1602 | 3992 | 0.0719 |
| 2.1607 | 3993 | 0.1131 |
| 2.1613 | 3994 | 0.1274 |
| 2.1618 | 3995 | 0.0724 |
| 2.1623 | 3996 | 0.0746 |
| 2.1629 | 3997 | 0.063 |
| 2.1634 | 3998 | 0.0863 |
| 2.1640 | 3999 | 0.0846 |
| 2.1645 | 4000 | 0.079 |
| 2.1650 | 4001 | 0.1367 |
| 2.1656 | 4002 | 0.1341 |
| 2.1661 | 4003 | 0.1123 |
| 2.1667 | 4004 | 0.1425 |
| 2.1672 | 4005 | 0.1205 |
| 2.1677 | 4006 | 0.1473 |
| 2.1683 | 4007 | 0.0794 |
| 2.1688 | 4008 | 0.0954 |
| 2.1694 | 4009 | 0.0921 |
| 2.1699 | 4010 | 0.0788 |
| 2.1705 | 4011 | 0.1574 |
| 2.1710 | 4012 | 0.0662 |
| 2.1715 | 4013 | 0.1068 |
| 2.1721 | 4014 | 0.0479 |
| 2.1726 | 4015 | 0.1175 |
| 2.1732 | 4016 | 0.1035 |
| 2.1737 | 4017 | 0.1186 |
| 2.1742 | 4018 | 0.0867 |
| 2.1748 | 4019 | 0.0906 |
| 2.1753 | 4020 | 0.1025 |
| 2.1759 | 4021 | 0.1167 |
| 2.1764 | 4022 | 0.1128 |
| 2.1769 | 4023 | 0.0804 |
| 2.1775 | 4024 | 0.1359 |
| 2.1780 | 4025 | 0.1061 |
| 2.1786 | 4026 | 0.0769 |
| 2.1791 | 4027 | 0.1563 |
| 2.1797 | 4028 | 0.0662 |
| 2.1802 | 4029 | 0.0787 |
| 2.1807 | 4030 | 0.1493 |
| 2.1813 | 4031 | 0.12 |
| 2.1818 | 4032 | 0.1276 |
| 2.1824 | 4033 | 0.1189 |
| 2.1829 | 4034 | 0.0666 |
| 2.1834 | 4035 | 0.1168 |
| 2.1840 | 4036 | 0.0964 |
| 2.1845 | 4037 | 0.0924 |
| 2.1851 | 4038 | 0.0928 |
| 2.1856 | 4039 | 0.1565 |
| 2.1861 | 4040 | 0.1583 |
| 2.1867 | 4041 | 0.0943 |
| 2.1872 | 4042 | 0.068 |
| 2.1878 | 4043 | 0.1038 |
| 2.1883 | 4044 | 0.0523 |
| 2.1889 | 4045 | 0.0995 |
| 2.1894 | 4046 | 0.1014 |
| 2.1899 | 4047 | 0.0666 |
| 2.1905 | 4048 | 0.0973 |
| 2.1910 | 4049 | 0.0504 |
| 2.1916 | 4050 | 0.0871 |
| 2.1921 | 4051 | 0.1233 |
| 2.1926 | 4052 | 0.1242 |
| 2.1932 | 4053 | 0.1249 |
| 2.1937 | 4054 | 0.0793 |
| 2.1943 | 4055 | 0.0811 |
| 2.1948 | 4056 | 0.0852 |
| 2.1953 | 4057 | 0.0694 |
| 2.1959 | 4058 | 0.0753 |
| 2.1964 | 4059 | 0.1074 |
| 2.1970 | 4060 | 0.0658 |
| 2.1975 | 4061 | 0.1055 |
| 2.1981 | 4062 | 0.0995 |
| 2.1986 | 4063 | 0.1394 |
| 2.1991 | 4064 | 0.1109 |
| 2.1997 | 4065 | 0.1443 |
| 2.2002 | 4066 | 0.0827 |
| 2.2008 | 4067 | 0.0973 |
| 2.2013 | 4068 | 0.122 |
| 2.2018 | 4069 | 0.1324 |
| 2.2024 | 4070 | 0.0841 |
| 2.2029 | 4071 | 0.0811 |
| 2.2035 | 4072 | 0.1299 |
| 2.2040 | 4073 | 0.0619 |
| 2.2045 | 4074 | 0.0995 |
| 2.2051 | 4075 | 0.0774 |
| 2.2056 | 4076 | 0.0703 |
| 2.2062 | 4077 | 0.0908 |
| 2.2067 | 4078 | 0.0993 |
| 2.2073 | 4079 | 0.1208 |
| 2.2078 | 4080 | 0.0709 |
| 2.2083 | 4081 | 0.0961 |
| 2.2089 | 4082 | 0.086 |
| 2.2094 | 4083 | 0.1744 |
| 2.2100 | 4084 | 0.1104 |
| 2.2105 | 4085 | 0.0948 |
| 2.2110 | 4086 | 0.1372 |
| 2.2116 | 4087 | 0.0504 |
| 2.2121 | 4088 | 0.1078 |
| 2.2127 | 4089 | 0.0678 |
| 2.2132 | 4090 | 0.1113 |
| 2.2137 | 4091 | 0.1535 |
| 2.2143 | 4092 | 0.1114 |
| 2.2148 | 4093 | 0.1493 |
| 2.2154 | 4094 | 0.1055 |
| 2.2159 | 4095 | 0.112 |
| 2.2165 | 4096 | 0.0994 |
| 2.2170 | 4097 | 0.1593 |
| 2.2175 | 4098 | 0.1368 |
| 2.2181 | 4099 | 0.1037 |
| 2.2186 | 4100 | 0.1092 |
| 2.2192 | 4101 | 0.1238 |
| 2.2197 | 4102 | 0.1298 |
| 2.2202 | 4103 | 0.0702 |
| 2.2208 | 4104 | 0.0597 |
| 2.2213 | 4105 | 0.0925 |
| 2.2219 | 4106 | 0.1104 |
| 2.2224 | 4107 | 0.0928 |
| 2.2229 | 4108 | 0.1117 |
| 2.2235 | 4109 | 0.0586 |
| 2.2240 | 4110 | 0.0577 |
| 2.2246 | 4111 | 0.1112 |
| 2.2251 | 4112 | 0.1522 |
| 2.2256 | 4113 | 0.0973 |
| 2.2262 | 4114 | 0.1223 |
| 2.2267 | 4115 | 0.0874 |
| 2.2273 | 4116 | 0.063 |
| 2.2278 | 4117 | 0.1704 |
| 2.2284 | 4118 | 0.0657 |
| 2.2289 | 4119 | 0.0901 |
| 2.2294 | 4120 | 0.0951 |
| 2.2300 | 4121 | 0.1019 |
| 2.2305 | 4122 | 0.0885 |
| 2.2311 | 4123 | 0.109 |
| 2.2316 | 4124 | 0.0901 |
| 2.2321 | 4125 | 0.0914 |
| 2.2327 | 4126 | 0.1287 |
| 2.2332 | 4127 | 0.0676 |
| 2.2338 | 4128 | 0.0904 |
| 2.2343 | 4129 | 0.0666 |
| 2.2348 | 4130 | 0.1157 |
| 2.2354 | 4131 | 0.057 |
| 2.2359 | 4132 | 0.1128 |
| 2.2365 | 4133 | 0.1123 |
| 2.2370 | 4134 | 0.1625 |
| 2.2376 | 4135 | 0.1028 |
| 2.2381 | 4136 | 0.1311 |
| 2.2386 | 4137 | 0.1246 |
| 2.2392 | 4138 | 0.0961 |
| 2.2397 | 4139 | 0.0611 |
| 2.2403 | 4140 | 0.1273 |
| 2.2408 | 4141 | 0.0837 |
| 2.2413 | 4142 | 0.0704 |
| 2.2419 | 4143 | 0.0873 |
| 2.2424 | 4144 | 0.1671 |
| 2.2430 | 4145 | 0.1129 |
| 2.2435 | 4146 | 0.0876 |
| 2.2440 | 4147 | 0.2113 |
| 2.2446 | 4148 | 0.0625 |
| 2.2451 | 4149 | 0.0901 |
| 2.2457 | 4150 | 0.0904 |
| 2.2462 | 4151 | 0.1606 |
| 2.2468 | 4152 | 0.1389 |
| 2.2473 | 4153 | 0.0992 |
| 2.2478 | 4154 | 0.1092 |
| 2.2484 | 4155 | 0.1034 |
| 2.2489 | 4156 | 0.0779 |
| 2.2495 | 4157 | 0.0854 |
| 2.25 | 4158 | 0.0727 |
| 2.2505 | 4159 | 0.1041 |
| 2.2511 | 4160 | 0.1241 |
| 2.2516 | 4161 | 0.0997 |
| 2.2522 | 4162 | 0.1624 |
| 2.2527 | 4163 | 0.1205 |
| 2.2532 | 4164 | 0.0662 |
| 2.2538 | 4165 | 0.1134 |
| 2.2543 | 4166 | 0.149 |
| 2.2549 | 4167 | 0.1096 |
| 2.2554 | 4168 | 0.0824 |
| 2.2560 | 4169 | 0.1041 |
| 2.2565 | 4170 | 0.0633 |
| 2.2570 | 4171 | 0.1127 |
| 2.2576 | 4172 | 0.1091 |
| 2.2581 | 4173 | 0.0884 |
| 2.2587 | 4174 | 0.1381 |
| 2.2592 | 4175 | 0.046 |
| 2.2597 | 4176 | 0.0716 |
| 2.2603 | 4177 | 0.0777 |
| 2.2608 | 4178 | 0.0834 |
| 2.2614 | 4179 | 0.1346 |
| 2.2619 | 4180 | 0.0581 |
| 2.2624 | 4181 | 0.1161 |
| 2.2630 | 4182 | 0.0832 |
| 2.2635 | 4183 | 0.1127 |
| 2.2641 | 4184 | 0.0995 |
| 2.2646 | 4185 | 0.047 |
| 2.2652 | 4186 | 0.0922 |
| 2.2657 | 4187 | 0.0903 |
| 2.2662 | 4188 | 0.1511 |
| 2.2668 | 4189 | 0.0783 |
| 2.2673 | 4190 | 0.098 |
| 2.2679 | 4191 | 0.1311 |
| 2.2684 | 4192 | 0.0701 |
| 2.2689 | 4193 | 0.176 |
| 2.2695 | 4194 | 0.152 |
| 2.2700 | 4195 | 0.1689 |
| 2.2706 | 4196 | 0.079 |
| 2.2711 | 4197 | 0.0915 |
| 2.2716 | 4198 | 0.1435 |
| 2.2722 | 4199 | 0.1123 |
| 2.2727 | 4200 | 0.1261 |
| 2.2733 | 4201 | 0.1041 |
| 2.2738 | 4202 | 0.1172 |
| 2.2744 | 4203 | 0.1188 |
| 2.2749 | 4204 | 0.1107 |
| 2.2754 | 4205 | 0.0528 |
| 2.2760 | 4206 | 0.1239 |
| 2.2765 | 4207 | 0.1241 |
| 2.2771 | 4208 | 0.0866 |
| 2.2776 | 4209 | 0.0725 |
| 2.2781 | 4210 | 0.0885 |
| 2.2787 | 4211 | 0.1362 |
| 2.2792 | 4212 | 0.1316 |
| 2.2798 | 4213 | 0.1074 |
| 2.2803 | 4214 | 0.0814 |
| 2.2808 | 4215 | 0.1032 |
| 2.2814 | 4216 | 0.1029 |
| 2.2819 | 4217 | 0.0736 |
| 2.2825 | 4218 | 0.0611 |
| 2.2830 | 4219 | 0.1528 |
| 2.2835 | 4220 | 0.1131 |
| 2.2841 | 4221 | 0.1296 |
| 2.2846 | 4222 | 0.0819 |
| 2.2852 | 4223 | 0.1603 |
| 2.2857 | 4224 | 0.0822 |
| 2.2863 | 4225 | 0.1026 |
| 2.2868 | 4226 | 0.1036 |
| 2.2873 | 4227 | 0.1256 |
| 2.2879 | 4228 | 0.1234 |
| 2.2884 | 4229 | 0.0995 |
| 2.2890 | 4230 | 0.0524 |
| 2.2895 | 4231 | 0.0691 |
| 2.2900 | 4232 | 0.1058 |
| 2.2906 | 4233 | 0.1318 |
| 2.2911 | 4234 | 0.0749 |
| 2.2917 | 4235 | 0.0719 |
| 2.2922 | 4236 | 0.1006 |
| 2.2927 | 4237 | 0.1245 |
| 2.2933 | 4238 | 0.0934 |
| 2.2938 | 4239 | 0.117 |
| 2.2944 | 4240 | 0.1265 |
| 2.2949 | 4241 | 0.1253 |
| 2.2955 | 4242 | 0.1313 |
| 2.2960 | 4243 | 0.1198 |
| 2.2965 | 4244 | 0.1005 |
| 2.2971 | 4245 | 0.0918 |
| 2.2976 | 4246 | 0.0772 |
| 2.2982 | 4247 | 0.079 |
| 2.2987 | 4248 | 0.0749 |
| 2.2992 | 4249 | 0.0828 |
| 2.2998 | 4250 | 0.0919 |
| 2.3003 | 4251 | 0.0725 |
| 2.3009 | 4252 | 0.1958 |
| 2.3014 | 4253 | 0.0816 |
| 2.3019 | 4254 | 0.139 |
| 2.3025 | 4255 | 0.1398 |
| 2.3030 | 4256 | 0.1123 |
| 2.3036 | 4257 | 0.089 |
| 2.3041 | 4258 | 0.0863 |
| 2.3047 | 4259 | 0.1452 |
| 2.3052 | 4260 | 0.1216 |
| 2.3057 | 4261 | 0.1233 |
| 2.3063 | 4262 | 0.0927 |
| 2.3068 | 4263 | 0.0501 |
| 2.3074 | 4264 | 0.0916 |
| 2.3079 | 4265 | 0.1276 |
| 2.3084 | 4266 | 0.1166 |
| 2.3090 | 4267 | 0.0844 |
| 2.3095 | 4268 | 0.0491 |
| 2.3101 | 4269 | 0.0649 |
| 2.3106 | 4270 | 0.0606 |
| 2.3111 | 4271 | 0.1792 |
| 2.3117 | 4272 | 0.0653 |
| 2.3122 | 4273 | 0.0638 |
| 2.3128 | 4274 | 0.0709 |
| 2.3133 | 4275 | 0.12 |
| 2.3139 | 4276 | 0.1149 |
| 2.3144 | 4277 | 0.1055 |
| 2.3149 | 4278 | 0.0422 |
| 2.3155 | 4279 | 0.0985 |
| 2.3160 | 4280 | 0.1105 |
| 2.3166 | 4281 | 0.0909 |
| 2.3171 | 4282 | 0.0447 |
| 2.3176 | 4283 | 0.0654 |
| 2.3182 | 4284 | 0.1108 |
| 2.3187 | 4285 | 0.1182 |
| 2.3193 | 4286 | 0.042 |
| 2.3198 | 4287 | 0.0827 |
| 2.3203 | 4288 | 0.1292 |
| 2.3209 | 4289 | 0.0742 |
| 2.3214 | 4290 | 0.1275 |
| 2.3220 | 4291 | 0.1087 |
| 2.3225 | 4292 | 0.1391 |
| 2.3231 | 4293 | 0.0891 |
| 2.3236 | 4294 | 0.0508 |
| 2.3241 | 4295 | 0.1185 |
| 2.3247 | 4296 | 0.0519 |
| 2.3252 | 4297 | 0.0847 |
| 2.3258 | 4298 | 0.1272 |
| 2.3263 | 4299 | 0.1127 |
| 2.3268 | 4300 | 0.1113 |
| 2.3274 | 4301 | 0.1031 |
| 2.3279 | 4302 | 0.0747 |
| 2.3285 | 4303 | 0.1081 |
| 2.3290 | 4304 | 0.0989 |
| 2.3295 | 4305 | 0.0736 |
| 2.3301 | 4306 | 0.1417 |
| 2.3306 | 4307 | 0.0868 |
| 2.3312 | 4308 | 0.0867 |
| 2.3317 | 4309 | 0.0845 |
| 2.3323 | 4310 | 0.1184 |
| 2.3328 | 4311 | 0.1093 |
| 2.3333 | 4312 | 0.1313 |
| 2.3339 | 4313 | 0.0936 |
| 2.3344 | 4314 | 0.0547 |
| 2.3350 | 4315 | 0.0567 |
| 2.3355 | 4316 | 0.1204 |
| 2.3360 | 4317 | 0.1305 |
| 2.3366 | 4318 | 0.1199 |
| 2.3371 | 4319 | 0.118 |
| 2.3377 | 4320 | 0.0877 |
| 2.3382 | 4321 | 0.1392 |
| 2.3387 | 4322 | 0.091 |
| 2.3393 | 4323 | 0.1238 |
| 2.3398 | 4324 | 0.0586 |
| 2.3404 | 4325 | 0.0931 |
| 2.3409 | 4326 | 0.0799 |
| 2.3415 | 4327 | 0.1399 |
| 2.3420 | 4328 | 0.1468 |
| 2.3425 | 4329 | 0.1346 |
| 2.3431 | 4330 | 0.1348 |
| 2.3436 | 4331 | 0.1869 |
| 2.3442 | 4332 | 0.1217 |
| 2.3447 | 4333 | 0.0646 |
| 2.3452 | 4334 | 0.111 |
| 2.3458 | 4335 | 0.1346 |
| 2.3463 | 4336 | 0.1497 |
| 2.3469 | 4337 | 0.0908 |
| 2.3474 | 4338 | 0.0956 |
| 2.3479 | 4339 | 0.1143 |
| 2.3485 | 4340 | 0.1351 |
| 2.3490 | 4341 | 0.0881 |
| 2.3496 | 4342 | 0.1382 |
| 2.3501 | 4343 | 0.0821 |
| 2.3506 | 4344 | 0.1077 |
| 2.3512 | 4345 | 0.0568 |
| 2.3517 | 4346 | 0.1292 |
| 2.3523 | 4347 | 0.1134 |
| 2.3528 | 4348 | 0.1249 |
| 2.3534 | 4349 | 0.0535 |
| 2.3539 | 4350 | 0.0788 |
| 2.3544 | 4351 | 0.1078 |
| 2.3550 | 4352 | 0.0899 |
| 2.3555 | 4353 | 0.0736 |
| 2.3561 | 4354 | 0.0918 |
| 2.3566 | 4355 | 0.0673 |
| 2.3571 | 4356 | 0.1083 |
| 2.3577 | 4357 | 0.0832 |
| 2.3582 | 4358 | 0.1077 |
| 2.3588 | 4359 | 0.1175 |
| 2.3593 | 4360 | 0.1449 |
| 2.3598 | 4361 | 0.0933 |
| 2.3604 | 4362 | 0.2376 |
| 2.3609 | 4363 | 0.1113 |
| 2.3615 | 4364 | 0.1058 |
| 2.3620 | 4365 | 0.096 |
| 2.3626 | 4366 | 0.1496 |
| 2.3631 | 4367 | 0.0908 |
| 2.3636 | 4368 | 0.0824 |
| 2.3642 | 4369 | 0.064 |
| 2.3647 | 4370 | 0.0944 |
| 2.3653 | 4371 | 0.1034 |
| 2.3658 | 4372 | 0.0553 |
| 2.3663 | 4373 | 0.0739 |
| 2.3669 | 4374 | 0.1274 |
| 2.3674 | 4375 | 0.0802 |
| 2.3680 | 4376 | 0.0657 |
| 2.3685 | 4377 | 0.0851 |
| 2.3690 | 4378 | 0.0847 |
| 2.3696 | 4379 | 0.1233 |
| 2.3701 | 4380 | 0.1352 |
| 2.3707 | 4381 | 0.1172 |
| 2.3712 | 4382 | 0.0942 |
| 2.3718 | 4383 | 0.1171 |
| 2.3723 | 4384 | 0.1391 |
| 2.3728 | 4385 | 0.1478 |
| 2.3734 | 4386 | 0.0739 |
| 2.3739 | 4387 | 0.0744 |
| 2.3745 | 4388 | 0.0736 |
| 2.375 | 4389 | 0.0568 |
| 2.3755 | 4390 | 0.0782 |
| 2.3761 | 4391 | 0.1505 |
| 2.3766 | 4392 | 0.1167 |
| 2.3772 | 4393 | 0.0923 |
| 2.3777 | 4394 | 0.0942 |
| 2.3782 | 4395 | 0.0619 |
| 2.3788 | 4396 | 0.1088 |
| 2.3793 | 4397 | 0.0752 |
| 2.3799 | 4398 | 0.1643 |
| 2.3804 | 4399 | 0.1019 |
| 2.3810 | 4400 | 0.0644 |
| 2.3815 | 4401 | 0.0922 |
| 2.3820 | 4402 | 0.1345 |
| 2.3826 | 4403 | 0.1343 |
| 2.3831 | 4404 | 0.1136 |
| 2.3837 | 4405 | 0.0701 |
| 2.3842 | 4406 | 0.0694 |
| 2.3847 | 4407 | 0.0976 |
| 2.3853 | 4408 | 0.072 |
| 2.3858 | 4409 | 0.0732 |
| 2.3864 | 4410 | 0.0977 |
| 2.3869 | 4411 | 0.0824 |
| 2.3874 | 4412 | 0.1405 |
| 2.3880 | 4413 | 0.0713 |
| 2.3885 | 4414 | 0.1053 |
| 2.3891 | 4415 | 0.108 |
| 2.3896 | 4416 | 0.1048 |
| 2.3902 | 4417 | 0.0655 |
| 2.3907 | 4418 | 0.0995 |
| 2.3912 | 4419 | 0.129 |
| 2.3918 | 4420 | 0.1073 |
| 2.3923 | 4421 | 0.1381 |
| 2.3929 | 4422 | 0.0607 |
| 2.3934 | 4423 | 0.0853 |
| 2.3939 | 4424 | 0.1173 |
| 2.3945 | 4425 | 0.1192 |
| 2.3950 | 4426 | 0.1095 |
| 2.3956 | 4427 | 0.1094 |
| 2.3961 | 4428 | 0.1374 |
| 2.3966 | 4429 | 0.0869 |
| 2.3972 | 4430 | 0.0724 |
| 2.3977 | 4431 | 0.1206 |
| 2.3983 | 4432 | 0.0662 |
| 2.3988 | 4433 | 0.1396 |
| 2.3994 | 4434 | 0.0957 |
| 2.3999 | 4435 | 0.0922 |
| 2.4004 | 4436 | 0.0503 |
| 2.4010 | 4437 | 0.0793 |
| 2.4015 | 4438 | 0.0756 |
| 2.4021 | 4439 | 0.1691 |
| 2.4026 | 4440 | 0.1 |
| 2.4031 | 4441 | 0.0849 |
| 2.4037 | 4442 | 0.1615 |
| 2.4042 | 4443 | 0.0809 |
| 2.4048 | 4444 | 0.1053 |
| 2.4053 | 4445 | 0.1249 |
| 2.4058 | 4446 | 0.1076 |
| 2.4064 | 4447 | 0.108 |
| 2.4069 | 4448 | 0.1123 |
| 2.4075 | 4449 | 0.1287 |
| 2.4080 | 4450 | 0.0722 |
| 2.4085 | 4451 | 0.0903 |
| 2.4091 | 4452 | 0.1031 |
| 2.4096 | 4453 | 0.0775 |
| 2.4102 | 4454 | 0.1209 |
| 2.4107 | 4455 | 0.1508 |
| 2.4113 | 4456 | 0.124 |
| 2.4118 | 4457 | 0.1098 |
| 2.4123 | 4458 | 0.1303 |
| 2.4129 | 4459 | 0.0946 |
| 2.4134 | 4460 | 0.0751 |
| 2.4140 | 4461 | 0.1763 |
| 2.4145 | 4462 | 0.0661 |
| 2.4150 | 4463 | 0.1203 |
| 2.4156 | 4464 | 0.1411 |
| 2.4161 | 4465 | 0.0536 |
| 2.4167 | 4466 | 0.0986 |
| 2.4172 | 4467 | 0.2264 |
| 2.4177 | 4468 | 0.0905 |
| 2.4183 | 4469 | 0.0661 |
| 2.4188 | 4470 | 0.0686 |
| 2.4194 | 4471 | 0.0449 |
| 2.4199 | 4472 | 0.1101 |
| 2.4205 | 4473 | 0.1998 |
| 2.4210 | 4474 | 0.0946 |
| 2.4215 | 4475 | 0.1213 |
| 2.4221 | 4476 | 0.0687 |
| 2.4226 | 4477 | 0.1219 |
| 2.4232 | 4478 | 0.1023 |
| 2.4237 | 4479 | 0.0937 |
| 2.4242 | 4480 | 0.1146 |
| 2.4248 | 4481 | 0.1059 |
| 2.4253 | 4482 | 0.1428 |
| 2.4259 | 4483 | 0.1143 |
| 2.4264 | 4484 | 0.1108 |
| 2.4269 | 4485 | 0.0925 |
| 2.4275 | 4486 | 0.0629 |
| 2.4280 | 4487 | 0.0829 |
| 2.4286 | 4488 | 0.0543 |
| 2.4291 | 4489 | 0.0627 |
| 2.4297 | 4490 | 0.0671 |
| 2.4302 | 4491 | 0.1581 |
| 2.4307 | 4492 | 0.1153 |
| 2.4313 | 4493 | 0.1047 |
| 2.4318 | 4494 | 0.1023 |
| 2.4324 | 4495 | 0.0904 |
| 2.4329 | 4496 | 0.1095 |
| 2.4334 | 4497 | 0.1128 |
| 2.4340 | 4498 | 0.0987 |
| 2.4345 | 4499 | 0.0988 |
| 2.4351 | 4500 | 0.0712 |
| 2.4356 | 4501 | 0.1244 |
| 2.4361 | 4502 | 0.0703 |
| 2.4367 | 4503 | 0.1061 |
| 2.4372 | 4504 | 0.0773 |
| 2.4378 | 4505 | 0.0933 |
| 2.4383 | 4506 | 0.1354 |
| 2.4389 | 4507 | 0.0905 |
| 2.4394 | 4508 | 0.1244 |
| 2.4399 | 4509 | 0.0982 |
| 2.4405 | 4510 | 0.0876 |
| 2.4410 | 4511 | 0.1852 |
| 2.4416 | 4512 | 0.116 |
| 2.4421 | 4513 | 0.1181 |
| 2.4426 | 4514 | 0.1141 |
| 2.4432 | 4515 | 0.0761 |
| 2.4437 | 4516 | 0.0965 |
| 2.4443 | 4517 | 0.0992 |
| 2.4448 | 4518 | 0.0973 |
| 2.4453 | 4519 | 0.0904 |
| 2.4459 | 4520 | 0.0965 |
| 2.4464 | 4521 | 0.0804 |
| 2.4470 | 4522 | 0.1446 |
| 2.4475 | 4523 | 0.102 |
| 2.4481 | 4524 | 0.0551 |
| 2.4486 | 4525 | 0.093 |
| 2.4491 | 4526 | 0.068 |
| 2.4497 | 4527 | 0.1187 |
| 2.4502 | 4528 | 0.1235 |
| 2.4508 | 4529 | 0.0954 |
| 2.4513 | 4530 | 0.155 |
| 2.4518 | 4531 | 0.1153 |
| 2.4524 | 4532 | 0.0919 |
| 2.4529 | 4533 | 0.0986 |
| 2.4535 | 4534 | 0.0939 |
| 2.4540 | 4535 | 0.0681 |
| 2.4545 | 4536 | 0.0763 |
| 2.4551 | 4537 | 0.0798 |
| 2.4556 | 4538 | 0.0698 |
| 2.4562 | 4539 | 0.0887 |
| 2.4567 | 4540 | 0.0909 |
| 2.4573 | 4541 | 0.1063 |
| 2.4578 | 4542 | 0.062 |
| 2.4583 | 4543 | 0.0515 |
| 2.4589 | 4544 | 0.0866 |
| 2.4594 | 4545 | 0.0947 |
| 2.4600 | 4546 | 0.1514 |
| 2.4605 | 4547 | 0.118 |
| 2.4610 | 4548 | 0.1611 |
| 2.4616 | 4549 | 0.1243 |
| 2.4621 | 4550 | 0.0728 |
| 2.4627 | 4551 | 0.1419 |
| 2.4632 | 4552 | 0.1235 |
| 2.4637 | 4553 | 0.1086 |
| 2.4643 | 4554 | 0.1295 |
| 2.4648 | 4555 | 0.0848 |
| 2.4654 | 4556 | 0.0895 |
| 2.4659 | 4557 | 0.1158 |
| 2.4665 | 4558 | 0.1167 |
| 2.4670 | 4559 | 0.0578 |
| 2.4675 | 4560 | 0.0792 |
| 2.4681 | 4561 | 0.0412 |
| 2.4686 | 4562 | 0.0932 |
| 2.4692 | 4563 | 0.0658 |
| 2.4697 | 4564 | 0.1146 |
| 2.4702 | 4565 | 0.0725 |
| 2.4708 | 4566 | 0.1026 |
| 2.4713 | 4567 | 0.0994 |
| 2.4719 | 4568 | 0.094 |
| 2.4724 | 4569 | 0.1085 |
| 2.4729 | 4570 | 0.0865 |
| 2.4735 | 4571 | 0.0617 |
| 2.4740 | 4572 | 0.1437 |
| 2.4746 | 4573 | 0.0504 |
| 2.4751 | 4574 | 0.1141 |
| 2.4756 | 4575 | 0.0688 |
| 2.4762 | 4576 | 0.0627 |
| 2.4767 | 4577 | 0.0822 |
| 2.4773 | 4578 | 0.1252 |
| 2.4778 | 4579 | 0.0885 |
| 2.4784 | 4580 | 0.0984 |
| 2.4789 | 4581 | 0.103 |
| 2.4794 | 4582 | 0.167 |
| 2.4800 | 4583 | 0.076 |
| 2.4805 | 4584 | 0.0791 |
| 2.4811 | 4585 | 0.1159 |
| 2.4816 | 4586 | 0.1302 |
| 2.4821 | 4587 | 0.0947 |
| 2.4827 | 4588 | 0.1425 |
| 2.4832 | 4589 | 0.1463 |
| 2.4838 | 4590 | 0.0707 |
| 2.4843 | 4591 | 0.1189 |
| 2.4848 | 4592 | 0.1488 |
| 2.4854 | 4593 | 0.1367 |
| 2.4859 | 4594 | 0.0897 |
| 2.4865 | 4595 | 0.0352 |
| 2.4870 | 4596 | 0.1313 |
| 2.4876 | 4597 | 0.0558 |
| 2.4881 | 4598 | 0.0734 |
| 2.4886 | 4599 | 0.1234 |
| 2.4892 | 4600 | 0.0917 |
| 2.4897 | 4601 | 0.0551 |
| 2.4903 | 4602 | 0.0715 |
| 2.4908 | 4603 | 0.1152 |
| 2.4913 | 4604 | 0.0839 |
| 2.4919 | 4605 | 0.1441 |
| 2.4924 | 4606 | 0.1072 |
| 2.4930 | 4607 | 0.1293 |
| 2.4935 | 4608 | 0.1271 |
| 2.4940 | 4609 | 0.1126 |
| 2.4946 | 4610 | 0.0925 |
| 2.4951 | 4611 | 0.0689 |
| 2.4957 | 4612 | 0.0734 |
| 2.4962 | 4613 | 0.1631 |
| 2.4968 | 4614 | 0.0953 |
| 2.4973 | 4615 | 0.0836 |
| 2.4978 | 4616 | 0.0446 |
| 2.4984 | 4617 | 0.1407 |
| 2.4989 | 4618 | 0.0783 |
| 2.4995 | 4619 | 0.076 |
| 2.5 | 4620 | 0.1054 |
| 2.5005 | 4621 | 0.1311 |
| 2.5011 | 4622 | 0.081 |
| 2.5016 | 4623 | 0.0831 |
| 2.5022 | 4624 | 0.1094 |
| 2.5027 | 4625 | 0.1247 |
| 2.5032 | 4626 | 0.1557 |
| 2.5038 | 4627 | 0.13 |
| 2.5043 | 4628 | 0.06 |
| 2.5049 | 4629 | 0.0871 |
| 2.5054 | 4630 | 0.0842 |
| 2.5060 | 4631 | 0.0706 |
| 2.5065 | 4632 | 0.089 |
| 2.5070 | 4633 | 0.0883 |
| 2.5076 | 4634 | 0.1036 |
| 2.5081 | 4635 | 0.0746 |
| 2.5087 | 4636 | 0.1229 |
| 2.5092 | 4637 | 0.1607 |
| 2.5097 | 4638 | 0.1048 |
| 2.5103 | 4639 | 0.0979 |
| 2.5108 | 4640 | 0.0832 |
| 2.5114 | 4641 | 0.1021 |
| 2.5119 | 4642 | 0.1032 |
| 2.5124 | 4643 | 0.131 |
| 2.5130 | 4644 | 0.0967 |
| 2.5135 | 4645 | 0.1101 |
| 2.5141 | 4646 | 0.1929 |
| 2.5146 | 4647 | 0.1076 |
| 2.5152 | 4648 | 0.1347 |
| 2.5157 | 4649 | 0.0643 |
| 2.5162 | 4650 | 0.0715 |
| 2.5168 | 4651 | 0.0823 |
| 2.5173 | 4652 | 0.112 |
| 2.5179 | 4653 | 0.1195 |
| 2.5184 | 4654 | 0.166 |
| 2.5189 | 4655 | 0.0781 |
| 2.5195 | 4656 | 0.0906 |
| 2.5200 | 4657 | 0.0785 |
| 2.5206 | 4658 | 0.1036 |
| 2.5211 | 4659 | 0.1537 |
| 2.5216 | 4660 | 0.0518 |
| 2.5222 | 4661 | 0.0958 |
| 2.5227 | 4662 | 0.0949 |
| 2.5233 | 4663 | 0.0759 |
| 2.5238 | 4664 | 0.0524 |
| 2.5244 | 4665 | 0.0947 |
| 2.5249 | 4666 | 0.1334 |
| 2.5254 | 4667 | 0.1352 |
| 2.5260 | 4668 | 0.0793 |
| 2.5265 | 4669 | 0.1131 |
| 2.5271 | 4670 | 0.1151 |
| 2.5276 | 4671 | 0.0987 |
| 2.5281 | 4672 | 0.0877 |
| 2.5287 | 4673 | 0.1132 |
| 2.5292 | 4674 | 0.0744 |
| 2.5298 | 4675 | 0.0703 |
| 2.5303 | 4676 | 0.1666 |
| 2.5308 | 4677 | 0.1289 |
| 2.5314 | 4678 | 0.074 |
| 2.5319 | 4679 | 0.1328 |
| 2.5325 | 4680 | 0.0499 |
| 2.5330 | 4681 | 0.104 |
| 2.5335 | 4682 | 0.1267 |
| 2.5341 | 4683 | 0.0927 |
| 2.5346 | 4684 | 0.123 |
| 2.5352 | 4685 | 0.046 |
| 2.5357 | 4686 | 0.058 |
| 2.5363 | 4687 | 0.0913 |
| 2.5368 | 4688 | 0.1113 |
| 2.5373 | 4689 | 0.1237 |
| 2.5379 | 4690 | 0.0705 |
| 2.5384 | 4691 | 0.1254 |
| 2.5390 | 4692 | 0.1187 |
| 2.5395 | 4693 | 0.0887 |
| 2.5400 | 4694 | 0.0804 |
| 2.5406 | 4695 | 0.1186 |
| 2.5411 | 4696 | 0.1269 |
| 2.5417 | 4697 | 0.1074 |
| 2.5422 | 4698 | 0.0881 |
| 2.5427 | 4699 | 0.0726 |
| 2.5433 | 4700 | 0.0839 |
| 2.5438 | 4701 | 0.0963 |
| 2.5444 | 4702 | 0.1073 |
| 2.5449 | 4703 | 0.1006 |
| 2.5455 | 4704 | 0.1265 |
| 2.5460 | 4705 | 0.0891 |
| 2.5465 | 4706 | 0.0554 |
| 2.5471 | 4707 | 0.085 |
| 2.5476 | 4708 | 0.1354 |
| 2.5482 | 4709 | 0.1292 |
| 2.5487 | 4710 | 0.1206 |
| 2.5492 | 4711 | 0.1399 |
| 2.5498 | 4712 | 0.0958 |
| 2.5503 | 4713 | 0.1331 |
| 2.5509 | 4714 | 0.0859 |
| 2.5514 | 4715 | 0.125 |
| 2.5519 | 4716 | 0.1115 |
| 2.5525 | 4717 | 0.0701 |
| 2.5530 | 4718 | 0.0723 |
| 2.5536 | 4719 | 0.1299 |
| 2.5541 | 4720 | 0.0738 |
| 2.5547 | 4721 | 0.0949 |
| 2.5552 | 4722 | 0.089 |
| 2.5557 | 4723 | 0.1179 |
| 2.5563 | 4724 | 0.0989 |
| 2.5568 | 4725 | 0.0815 |
| 2.5574 | 4726 | 0.1115 |
| 2.5579 | 4727 | 0.0645 |
| 2.5584 | 4728 | 0.0692 |
| 2.5590 | 4729 | 0.0608 |
| 2.5595 | 4730 | 0.0835 |
| 2.5601 | 4731 | 0.0853 |
| 2.5606 | 4732 | 0.1177 |
| 2.5611 | 4733 | 0.1054 |
| 2.5617 | 4734 | 0.1148 |
| 2.5622 | 4735 | 0.0911 |
| 2.5628 | 4736 | 0.1007 |
| 2.5633 | 4737 | 0.0936 |
| 2.5639 | 4738 | 0.131 |
| 2.5644 | 4739 | 0.1287 |
| 2.5649 | 4740 | 0.0647 |
| 2.5655 | 4741 | 0.1111 |
| 2.5660 | 4742 | 0.1208 |
| 2.5666 | 4743 | 0.1175 |
| 2.5671 | 4744 | 0.0818 |
| 2.5676 | 4745 | 0.0978 |
| 2.5682 | 4746 | 0.1471 |
| 2.5687 | 4747 | 0.0854 |
| 2.5693 | 4748 | 0.218 |
| 2.5698 | 4749 | 0.097 |
| 2.5703 | 4750 | 0.1317 |
| 2.5709 | 4751 | 0.083 |
| 2.5714 | 4752 | 0.1652 |
| 2.5720 | 4753 | 0.0644 |
| 2.5725 | 4754 | 0.16 |
| 2.5731 | 4755 | 0.0933 |
| 2.5736 | 4756 | 0.0924 |
| 2.5741 | 4757 | 0.0755 |
| 2.5747 | 4758 | 0.0777 |
| 2.5752 | 4759 | 0.0659 |
| 2.5758 | 4760 | 0.1158 |
| 2.5763 | 4761 | 0.1108 |
| 2.5768 | 4762 | 0.1005 |
| 2.5774 | 4763 | 0.0909 |
| 2.5779 | 4764 | 0.0972 |
| 2.5785 | 4765 | 0.08 |
| 2.5790 | 4766 | 0.1238 |
| 2.5795 | 4767 | 0.1117 |
| 2.5801 | 4768 | 0.1756 |
| 2.5806 | 4769 | 0.0861 |
| 2.5812 | 4770 | 0.1782 |
| 2.5817 | 4771 | 0.1322 |
| 2.5823 | 4772 | 0.0637 |
| 2.5828 | 4773 | 0.1116 |
| 2.5833 | 4774 | 0.08 |
| 2.5839 | 4775 | 0.0647 |
| 2.5844 | 4776 | 0.1103 |
| 2.5850 | 4777 | 0.0795 |
| 2.5855 | 4778 | 0.0956 |
| 2.5860 | 4779 | 0.0775 |
| 2.5866 | 4780 | 0.0963 |
| 2.5871 | 4781 | 0.0908 |
| 2.5877 | 4782 | 0.0742 |
| 2.5882 | 4783 | 0.0646 |
| 2.5887 | 4784 | 0.1173 |
| 2.5893 | 4785 | 0.0885 |
| 2.5898 | 4786 | 0.0707 |
| 2.5904 | 4787 | 0.1187 |
| 2.5909 | 4788 | 0.1338 |
| 2.5915 | 4789 | 0.0625 |
| 2.5920 | 4790 | 0.09 |
| 2.5925 | 4791 | 0.0802 |
| 2.5931 | 4792 | 0.1317 |
| 2.5936 | 4793 | 0.0916 |
| 2.5942 | 4794 | 0.0942 |
| 2.5947 | 4795 | 0.0944 |
| 2.5952 | 4796 | 0.112 |
| 2.5958 | 4797 | 0.1111 |
| 2.5963 | 4798 | 0.0576 |
| 2.5969 | 4799 | 0.1232 |
| 2.5974 | 4800 | 0.1085 |
| 2.5979 | 4801 | 0.066 |
| 2.5985 | 4802 | 0.1085 |
| 2.5990 | 4803 | 0.0747 |
| 2.5996 | 4804 | 0.0883 |
| 2.6001 | 4805 | 0.086 |
| 2.6006 | 4806 | 0.1161 |
| 2.6012 | 4807 | 0.156 |
| 2.6017 | 4808 | 0.0935 |
| 2.6023 | 4809 | 0.1566 |
| 2.6028 | 4810 | 0.1455 |
| 2.6034 | 4811 | 0.0542 |
| 2.6039 | 4812 | 0.0865 |
| 2.6044 | 4813 | 0.0944 |
| 2.6050 | 4814 | 0.1469 |
| 2.6055 | 4815 | 0.1249 |
| 2.6061 | 4816 | 0.1224 |
| 2.6066 | 4817 | 0.0869 |
| 2.6071 | 4818 | 0.0805 |
| 2.6077 | 4819 | 0.0781 |
| 2.6082 | 4820 | 0.119 |
| 2.6088 | 4821 | 0.0916 |
| 2.6093 | 4822 | 0.1218 |
| 2.6098 | 4823 | 0.0968 |
| 2.6104 | 4824 | 0.08 |
| 2.6109 | 4825 | 0.0735 |
| 2.6115 | 4826 | 0.1261 |
| 2.6120 | 4827 | 0.0786 |
| 2.6126 | 4828 | 0.0721 |
| 2.6131 | 4829 | 0.0719 |
| 2.6136 | 4830 | 0.1346 |
| 2.6142 | 4831 | 0.1269 |
| 2.6147 | 4832 | 0.088 |
| 2.6153 | 4833 | 0.1005 |
| 2.6158 | 4834 | 0.119 |
| 2.6163 | 4835 | 0.0802 |
| 2.6169 | 4836 | 0.0728 |
| 2.6174 | 4837 | 0.079 |
| 2.6180 | 4838 | 0.0893 |
| 2.6185 | 4839 | 0.0811 |
| 2.6190 | 4840 | 0.1153 |
| 2.6196 | 4841 | 0.0773 |
| 2.6201 | 4842 | 0.1113 |
| 2.6207 | 4843 | 0.0907 |
| 2.6212 | 4844 | 0.1257 |
| 2.6218 | 4845 | 0.1122 |
| 2.6223 | 4846 | 0.0829 |
| 2.6228 | 4847 | 0.1009 |
| 2.6234 | 4848 | 0.0822 |
| 2.6239 | 4849 | 0.0995 |
| 2.6245 | 4850 | 0.1262 |
| 2.625 | 4851 | 0.0914 |
| 2.6255 | 4852 | 0.0925 |
| 2.6261 | 4853 | 0.0973 |
| 2.6266 | 4854 | 0.1397 |
| 2.6272 | 4855 | 0.1084 |
| 2.6277 | 4856 | 0.0954 |
| 2.6282 | 4857 | 0.1094 |
| 2.6288 | 4858 | 0.038 |
| 2.6293 | 4859 | 0.0862 |
| 2.6299 | 4860 | 0.09 |
| 2.6304 | 4861 | 0.1291 |
| 2.6310 | 4862 | 0.0976 |
| 2.6315 | 4863 | 0.0613 |
| 2.6320 | 4864 | 0.0663 |
| 2.6326 | 4865 | 0.0615 |
| 2.6331 | 4866 | 0.0934 |
| 2.6337 | 4867 | 0.072 |
| 2.6342 | 4868 | 0.1055 |
| 2.6347 | 4869 | 0.1169 |
| 2.6353 | 4870 | 0.0775 |
| 2.6358 | 4871 | 0.0972 |
| 2.6364 | 4872 | 0.1343 |
| 2.6369 | 4873 | 0.1616 |
| 2.6374 | 4874 | 0.0846 |
| 2.6380 | 4875 | 0.0753 |
| 2.6385 | 4876 | 0.0827 |
| 2.6391 | 4877 | 0.1354 |
| 2.6396 | 4878 | 0.1201 |
| 2.6402 | 4879 | 0.094 |
| 2.6407 | 4880 | 0.1102 |
| 2.6412 | 4881 | 0.0629 |
| 2.6418 | 4882 | 0.0828 |
| 2.6423 | 4883 | 0.1115 |
| 2.6429 | 4884 | 0.1428 |
| 2.6434 | 4885 | 0.0717 |
| 2.6439 | 4886 | 0.1023 |
| 2.6445 | 4887 | 0.1581 |
| 2.6450 | 4888 | 0.1274 |
| 2.6456 | 4889 | 0.0629 |
| 2.6461 | 4890 | 0.0784 |
| 2.6466 | 4891 | 0.0558 |
| 2.6472 | 4892 | 0.1089 |
| 2.6477 | 4893 | 0.1028 |
| 2.6483 | 4894 | 0.0527 |
| 2.6488 | 4895 | 0.1105 |
| 2.6494 | 4896 | 0.0856 |
| 2.6499 | 4897 | 0.0754 |
| 2.6504 | 4898 | 0.102 |
| 2.6510 | 4899 | 0.0844 |
| 2.6515 | 4900 | 0.033 |
| 2.6521 | 4901 | 0.1596 |
| 2.6526 | 4902 | 0.0736 |
| 2.6531 | 4903 | 0.0872 |
| 2.6537 | 4904 | 0.1094 |
| 2.6542 | 4905 | 0.1024 |
| 2.6548 | 4906 | 0.0547 |
| 2.6553 | 4907 | 0.1064 |
| 2.6558 | 4908 | 0.0514 |
| 2.6564 | 4909 | 0.1211 |
| 2.6569 | 4910 | 0.0686 |
| 2.6575 | 4911 | 0.0983 |
| 2.6580 | 4912 | 0.1019 |
| 2.6585 | 4913 | 0.0727 |
| 2.6591 | 4914 | 0.1748 |
| 2.6596 | 4915 | 0.2002 |
| 2.6602 | 4916 | 0.1287 |
| 2.6607 | 4917 | 0.0934 |
| 2.6613 | 4918 | 0.0749 |
| 2.6618 | 4919 | 0.0606 |
| 2.6623 | 4920 | 0.0723 |
| 2.6629 | 4921 | 0.0348 |
| 2.6634 | 4922 | 0.0829 |
| 2.6640 | 4923 | 0.0786 |
| 2.6645 | 4924 | 0.1147 |
| 2.6650 | 4925 | 0.0835 |
| 2.6656 | 4926 | 0.0812 |
| 2.6661 | 4927 | 0.0922 |
| 2.6667 | 4928 | 0.1083 |
| 2.6672 | 4929 | 0.142 |
| 2.6677 | 4930 | 0.0444 |
| 2.6683 | 4931 | 0.0769 |
| 2.6688 | 4932 | 0.1845 |
| 2.6694 | 4933 | 0.1042 |
| 2.6699 | 4934 | 0.1234 |
| 2.6705 | 4935 | 0.1149 |
| 2.6710 | 4936 | 0.0779 |
| 2.6715 | 4937 | 0.1049 |
| 2.6721 | 4938 | 0.1213 |
| 2.6726 | 4939 | 0.0887 |
| 2.6732 | 4940 | 0.1014 |
| 2.6737 | 4941 | 0.0741 |
| 2.6742 | 4942 | 0.0878 |
| 2.6748 | 4943 | 0.1235 |
| 2.6753 | 4944 | 0.1104 |
| 2.6759 | 4945 | 0.1148 |
| 2.6764 | 4946 | 0.1792 |
| 2.6769 | 4947 | 0.0883 |
| 2.6775 | 4948 | 0.1049 |
| 2.6780 | 4949 | 0.0813 |
| 2.6786 | 4950 | 0.1097 |
| 2.6791 | 4951 | 0.1339 |
| 2.6797 | 4952 | 0.0664 |
| 2.6802 | 4953 | 0.0484 |
| 2.6807 | 4954 | 0.2084 |
| 2.6813 | 4955 | 0.0966 |
| 2.6818 | 4956 | 0.069 |
| 2.6824 | 4957 | 0.0863 |
| 2.6829 | 4958 | 0.06 |
| 2.6834 | 4959 | 0.17 |
| 2.6840 | 4960 | 0.0751 |
| 2.6845 | 4961 | 0.1053 |
| 2.6851 | 4962 | 0.1374 |
| 2.6856 | 4963 | 0.057 |
| 2.6861 | 4964 | 0.1243 |
| 2.6867 | 4965 | 0.091 |
| 2.6872 | 4966 | 0.0826 |
| 2.6878 | 4967 | 0.102 |
| 2.6883 | 4968 | 0.1002 |
| 2.6889 | 4969 | 0.1 |
| 2.6894 | 4970 | 0.1487 |
| 2.6899 | 4971 | 0.1183 |
| 2.6905 | 4972 | 0.1226 |
| 2.6910 | 4973 | 0.0642 |
| 2.6916 | 4974 | 0.1087 |
| 2.6921 | 4975 | 0.0649 |
| 2.6926 | 4976 | 0.1718 |
| 2.6932 | 4977 | 0.1022 |
| 2.6937 | 4978 | 0.1099 |
| 2.6943 | 4979 | 0.0898 |
| 2.6948 | 4980 | 0.0866 |
| 2.6953 | 4981 | 0.075 |
| 2.6959 | 4982 | 0.073 |
| 2.6964 | 4983 | 0.0784 |
| 2.6970 | 4984 | 0.0554 |
| 2.6975 | 4985 | 0.1132 |
| 2.6981 | 4986 | 0.0671 |
| 2.6986 | 4987 | 0.0955 |
| 2.6991 | 4988 | 0.0786 |
| 2.6997 | 4989 | 0.117 |
| 2.7002 | 4990 | 0.0657 |
| 2.7008 | 4991 | 0.0596 |
| 2.7013 | 4992 | 0.1176 |
| 2.7018 | 4993 | 0.0828 |
| 2.7024 | 4994 | 0.1021 |
| 2.7029 | 4995 | 0.1169 |
| 2.7035 | 4996 | 0.0596 |
| 2.7040 | 4997 | 0.0763 |
| 2.7045 | 4998 | 0.0874 |
| 2.7051 | 4999 | 0.0802 |
| 2.7056 | 5000 | 0.0397 |
| 2.7062 | 5001 | 0.1107 |
| 2.7067 | 5002 | 0.0673 |
| 2.7073 | 5003 | 0.0982 |
| 2.7078 | 5004 | 0.0916 |
| 2.7083 | 5005 | 0.0972 |
| 2.7089 | 5006 | 0.1099 |
| 2.7094 | 5007 | 0.1472 |
| 2.7100 | 5008 | 0.0769 |
| 2.7105 | 5009 | 0.1082 |
| 2.7110 | 5010 | 0.0965 |
| 2.7116 | 5011 | 0.0632 |
| 2.7121 | 5012 | 0.0712 |
| 2.7127 | 5013 | 0.0691 |
| 2.7132 | 5014 | 0.1257 |
| 2.7137 | 5015 | 0.0739 |
| 2.7143 | 5016 | 0.0598 |
| 2.7148 | 5017 | 0.0903 |
| 2.7154 | 5018 | 0.098 |
| 2.7159 | 5019 | 0.0537 |
| 2.7165 | 5020 | 0.134 |
| 2.7170 | 5021 | 0.0787 |
| 2.7175 | 5022 | 0.1292 |
| 2.7181 | 5023 | 0.0877 |
| 2.7186 | 5024 | 0.1588 |
| 2.7192 | 5025 | 0.1002 |
| 2.7197 | 5026 | 0.1101 |
| 2.7202 | 5027 | 0.1208 |
| 2.7208 | 5028 | 0.0528 |
| 2.7213 | 5029 | 0.0961 |
| 2.7219 | 5030 | 0.1028 |
| 2.7224 | 5031 | 0.1032 |
| 2.7229 | 5032 | 0.102 |
| 2.7235 | 5033 | 0.1328 |
| 2.7240 | 5034 | 0.0748 |
| 2.7246 | 5035 | 0.0443 |
| 2.7251 | 5036 | 0.0896 |
| 2.7256 | 5037 | 0.082 |
| 2.7262 | 5038 | 0.1196 |
| 2.7267 | 5039 | 0.1846 |
| 2.7273 | 5040 | 0.0871 |
| 2.7278 | 5041 | 0.105 |
| 2.7284 | 5042 | 0.1104 |
| 2.7289 | 5043 | 0.1065 |
| 2.7294 | 5044 | 0.0954 |
| 2.7300 | 5045 | 0.0762 |
| 2.7305 | 5046 | 0.0858 |
| 2.7311 | 5047 | 0.1011 |
| 2.7316 | 5048 | 0.0931 |
| 2.7321 | 5049 | 0.0801 |
| 2.7327 | 5050 | 0.0997 |
| 2.7332 | 5051 | 0.0595 |
| 2.7338 | 5052 | 0.1048 |
| 2.7343 | 5053 | 0.1187 |
| 2.7348 | 5054 | 0.1562 |
| 2.7354 | 5055 | 0.0866 |
| 2.7359 | 5056 | 0.1386 |
| 2.7365 | 5057 | 0.0673 |
| 2.7370 | 5058 | 0.0873 |
| 2.7376 | 5059 | 0.0943 |
| 2.7381 | 5060 | 0.1714 |
| 2.7386 | 5061 | 0.1202 |
| 2.7392 | 5062 | 0.1107 |
| 2.7397 | 5063 | 0.1137 |
| 2.7403 | 5064 | 0.0622 |
| 2.7408 | 5065 | 0.1692 |
| 2.7413 | 5066 | 0.1214 |
| 2.7419 | 5067 | 0.0758 |
| 2.7424 | 5068 | 0.0728 |
| 2.7430 | 5069 | 0.067 |
| 2.7435 | 5070 | 0.1323 |
| 2.7440 | 5071 | 0.1121 |
| 2.7446 | 5072 | 0.0858 |
| 2.7451 | 5073 | 0.0677 |
| 2.7457 | 5074 | 0.125 |
| 2.7462 | 5075 | 0.139 |
| 2.7468 | 5076 | 0.0955 |
| 2.7473 | 5077 | 0.0982 |
| 2.7478 | 5078 | 0.0759 |
| 2.7484 | 5079 | 0.1454 |
| 2.7489 | 5080 | 0.0879 |
| 2.7495 | 5081 | 0.0974 |
| 2.75 | 5082 | 0.0788 |
| 2.7505 | 5083 | 0.1212 |
| 2.7511 | 5084 | 0.1274 |
| 2.7516 | 5085 | 0.0633 |
| 2.7522 | 5086 | 0.064 |
| 2.7527 | 5087 | 0.0703 |
| 2.7532 | 5088 | 0.1332 |
| 2.7538 | 5089 | 0.1256 |
| 2.7543 | 5090 | 0.0748 |
| 2.7549 | 5091 | 0.1079 |
| 2.7554 | 5092 | 0.1126 |
| 2.7560 | 5093 | 0.065 |
| 2.7565 | 5094 | 0.1022 |
| 2.7570 | 5095 | 0.1359 |
| 2.7576 | 5096 | 0.1152 |
| 2.7581 | 5097 | 0.0689 |
| 2.7587 | 5098 | 0.1028 |
| 2.7592 | 5099 | 0.0966 |
| 2.7597 | 5100 | 0.0936 |
| 2.7603 | 5101 | 0.0909 |
| 2.7608 | 5102 | 0.1214 |
| 2.7614 | 5103 | 0.0754 |
| 2.7619 | 5104 | 0.1548 |
| 2.7624 | 5105 | 0.0988 |
| 2.7630 | 5106 | 0.0664 |
| 2.7635 | 5107 | 0.0746 |
| 2.7641 | 5108 | 0.1511 |
| 2.7646 | 5109 | 0.0822 |
| 2.7652 | 5110 | 0.0987 |
| 2.7657 | 5111 | 0.1151 |
| 2.7662 | 5112 | 0.1012 |
| 2.7668 | 5113 | 0.141 |
| 2.7673 | 5114 | 0.1091 |
| 2.7679 | 5115 | 0.0729 |
| 2.7684 | 5116 | 0.082 |
| 2.7689 | 5117 | 0.0842 |
| 2.7695 | 5118 | 0.1303 |
| 2.7700 | 5119 | 0.1619 |
| 2.7706 | 5120 | 0.0914 |
| 2.7711 | 5121 | 0.1051 |
| 2.7716 | 5122 | 0.0596 |
| 2.7722 | 5123 | 0.0947 |
| 2.7727 | 5124 | 0.0915 |
| 2.7733 | 5125 | 0.0794 |
| 2.7738 | 5126 | 0.0958 |
| 2.7744 | 5127 | 0.0955 |
| 2.7749 | 5128 | 0.0663 |
| 2.7754 | 5129 | 0.0746 |
| 2.7760 | 5130 | 0.1664 |
| 2.7765 | 5131 | 0.1835 |
| 2.7771 | 5132 | 0.0986 |
| 2.7776 | 5133 | 0.0629 |
| 2.7781 | 5134 | 0.0798 |
| 2.7787 | 5135 | 0.115 |
| 2.7792 | 5136 | 0.0838 |
| 2.7798 | 5137 | 0.0423 |
| 2.7803 | 5138 | 0.1095 |
| 2.7808 | 5139 | 0.0753 |
| 2.7814 | 5140 | 0.107 |
| 2.7819 | 5141 | 0.085 |
| 2.7825 | 5142 | 0.0878 |
| 2.7830 | 5143 | 0.0681 |
| 2.7835 | 5144 | 0.1388 |
| 2.7841 | 5145 | 0.0934 |
| 2.7846 | 5146 | 0.0537 |
| 2.7852 | 5147 | 0.1539 |
| 2.7857 | 5148 | 0.0769 |
| 2.7863 | 5149 | 0.1799 |
| 2.7868 | 5150 | 0.0734 |
| 2.7873 | 5151 | 0.1256 |
| 2.7879 | 5152 | 0.0999 |
| 2.7884 | 5153 | 0.079 |
| 2.7890 | 5154 | 0.0526 |
| 2.7895 | 5155 | 0.0731 |
| 2.7900 | 5156 | 0.0955 |
| 2.7906 | 5157 | 0.0833 |
| 2.7911 | 5158 | 0.1174 |
| 2.7917 | 5159 | 0.1152 |
| 2.7922 | 5160 | 0.0474 |
| 2.7927 | 5161 | 0.0685 |
| 2.7933 | 5162 | 0.0814 |
| 2.7938 | 5163 | 0.057 |
| 2.7944 | 5164 | 0.0881 |
| 2.7949 | 5165 | 0.1608 |
| 2.7955 | 5166 | 0.1213 |
| 2.7960 | 5167 | 0.0766 |
| 2.7965 | 5168 | 0.1773 |
| 2.7971 | 5169 | 0.1395 |
| 2.7976 | 5170 | 0.1311 |
| 2.7982 | 5171 | 0.1293 |
| 2.7987 | 5172 | 0.0778 |
| 2.7992 | 5173 | 0.0687 |
| 2.7998 | 5174 | 0.1036 |
| 2.8003 | 5175 | 0.0546 |
| 2.8009 | 5176 | 0.0853 |
| 2.8014 | 5177 | 0.0946 |
| 2.8019 | 5178 | 0.1253 |
| 2.8025 | 5179 | 0.117 |
| 2.8030 | 5180 | 0.0959 |
| 2.8036 | 5181 | 0.0462 |
| 2.8041 | 5182 | 0.1513 |
| 2.8047 | 5183 | 0.0957 |
| 2.8052 | 5184 | 0.0878 |
| 2.8057 | 5185 | 0.0874 |
| 2.8063 | 5186 | 0.1238 |
| 2.8068 | 5187 | 0.1226 |
| 2.8074 | 5188 | 0.0703 |
| 2.8079 | 5189 | 0.0689 |
| 2.8084 | 5190 | 0.0836 |
| 2.8090 | 5191 | 0.1105 |
| 2.8095 | 5192 | 0.0849 |
| 2.8101 | 5193 | 0.1252 |
| 2.8106 | 5194 | 0.1144 |
| 2.8111 | 5195 | 0.0995 |
| 2.8117 | 5196 | 0.1093 |
| 2.8122 | 5197 | 0.0746 |
| 2.8128 | 5198 | 0.1146 |
| 2.8133 | 5199 | 0.0881 |
| 2.8139 | 5200 | 0.1364 |
| 2.8144 | 5201 | 0.105 |
| 2.8149 | 5202 | 0.1348 |
| 2.8155 | 5203 | 0.1301 |
| 2.8160 | 5204 | 0.0644 |
| 2.8166 | 5205 | 0.053 |
| 2.8171 | 5206 | 0.0969 |
| 2.8176 | 5207 | 0.0987 |
| 2.8182 | 5208 | 0.1263 |
| 2.8187 | 5209 | 0.0818 |
| 2.8193 | 5210 | 0.0961 |
| 2.8198 | 5211 | 0.0767 |
| 2.8203 | 5212 | 0.0476 |
| 2.8209 | 5213 | 0.0479 |
| 2.8214 | 5214 | 0.0867 |
| 2.8220 | 5215 | 0.069 |
| 2.8225 | 5216 | 0.0728 |
| 2.8231 | 5217 | 0.0746 |
| 2.8236 | 5218 | 0.0814 |
| 2.8241 | 5219 | 0.057 |
| 2.8247 | 5220 | 0.1473 |
| 2.8252 | 5221 | 0.0941 |
| 2.8258 | 5222 | 0.1105 |
| 2.8263 | 5223 | 0.0808 |
| 2.8268 | 5224 | 0.0985 |
| 2.8274 | 5225 | 0.0932 |
| 2.8279 | 5226 | 0.0384 |
| 2.8285 | 5227 | 0.0685 |
| 2.8290 | 5228 | 0.0606 |
| 2.8295 | 5229 | 0.1219 |
| 2.8301 | 5230 | 0.1382 |
| 2.8306 | 5231 | 0.0737 |
| 2.8312 | 5232 | 0.0794 |
| 2.8317 | 5233 | 0.0856 |
| 2.8323 | 5234 | 0.102 |
| 2.8328 | 5235 | 0.068 |
| 2.8333 | 5236 | 0.129 |
| 2.8339 | 5237 | 0.14 |
| 2.8344 | 5238 | 0.1165 |
| 2.8350 | 5239 | 0.1143 |
| 2.8355 | 5240 | 0.1139 |
| 2.8360 | 5241 | 0.0892 |
| 2.8366 | 5242 | 0.1215 |
| 2.8371 | 5243 | 0.0802 |
| 2.8377 | 5244 | 0.0559 |
| 2.8382 | 5245 | 0.1257 |
| 2.8387 | 5246 | 0.1307 |
| 2.8393 | 5247 | 0.1276 |
| 2.8398 | 5248 | 0.0803 |
| 2.8404 | 5249 | 0.1002 |
| 2.8409 | 5250 | 0.0873 |
| 2.8415 | 5251 | 0.0535 |
| 2.8420 | 5252 | 0.054 |
| 2.8425 | 5253 | 0.1194 |
| 2.8431 | 5254 | 0.092 |
| 2.8436 | 5255 | 0.1119 |
| 2.8442 | 5256 | 0.0473 |
| 2.8447 | 5257 | 0.0476 |
| 2.8452 | 5258 | 0.1 |
| 2.8458 | 5259 | 0.0519 |
| 2.8463 | 5260 | 0.106 |
| 2.8469 | 5261 | 0.0738 |
| 2.8474 | 5262 | 0.1304 |
| 2.8479 | 5263 | 0.0729 |
| 2.8485 | 5264 | 0.066 |
| 2.8490 | 5265 | 0.0973 |
| 2.8496 | 5266 | 0.0884 |
| 2.8501 | 5267 | 0.1494 |
| 2.8506 | 5268 | 0.0362 |
| 2.8512 | 5269 | 0.0881 |
| 2.8517 | 5270 | 0.1072 |
| 2.8523 | 5271 | 0.0706 |
| 2.8528 | 5272 | 0.0623 |
| 2.8534 | 5273 | 0.1709 |
| 2.8539 | 5274 | 0.0733 |
| 2.8544 | 5275 | 0.0702 |
| 2.8550 | 5276 | 0.0895 |
| 2.8555 | 5277 | 0.0881 |
| 2.8561 | 5278 | 0.1197 |
| 2.8566 | 5279 | 0.1262 |
| 2.8571 | 5280 | 0.0654 |
| 2.8577 | 5281 | 0.1257 |
| 2.8582 | 5282 | 0.082 |
| 2.8588 | 5283 | 0.1057 |
| 2.8593 | 5284 | 0.1027 |
| 2.8598 | 5285 | 0.1269 |
| 2.8604 | 5286 | 0.1304 |
| 2.8609 | 5287 | 0.0855 |
| 2.8615 | 5288 | 0.0882 |
| 2.8620 | 5289 | 0.1475 |
| 2.8626 | 5290 | 0.081 |
| 2.8631 | 5291 | 0.0437 |
| 2.8636 | 5292 | 0.1325 |
| 2.8642 | 5293 | 0.13 |
| 2.8647 | 5294 | 0.0857 |
| 2.8653 | 5295 | 0.0766 |
| 2.8658 | 5296 | 0.1176 |
| 2.8663 | 5297 | 0.0737 |
| 2.8669 | 5298 | 0.1616 |
| 2.8674 | 5299 | 0.0983 |
| 2.8680 | 5300 | 0.0612 |
| 2.8685 | 5301 | 0.0853 |
| 2.8690 | 5302 | 0.1532 |
| 2.8696 | 5303 | 0.0899 |
| 2.8701 | 5304 | 0.0891 |
| 2.8707 | 5305 | 0.0966 |
| 2.8712 | 5306 | 0.0877 |
| 2.8718 | 5307 | 0.1213 |
| 2.8723 | 5308 | 0.1024 |
| 2.8728 | 5309 | 0.1338 |
| 2.8734 | 5310 | 0.176 |
| 2.8739 | 5311 | 0.0696 |
| 2.8745 | 5312 | 0.0742 |
| 2.875 | 5313 | 0.0832 |
| 2.8755 | 5314 | 0.0736 |
| 2.8761 | 5315 | 0.1114 |
| 2.8766 | 5316 | 0.0958 |
| 2.8772 | 5317 | 0.0678 |
| 2.8777 | 5318 | 0.1199 |
| 2.8782 | 5319 | 0.0922 |
| 2.8788 | 5320 | 0.0814 |
| 2.8793 | 5321 | 0.1051 |
| 2.8799 | 5322 | 0.0572 |
| 2.8804 | 5323 | 0.1057 |
| 2.8810 | 5324 | 0.152 |
| 2.8815 | 5325 | 0.1074 |
| 2.8820 | 5326 | 0.1159 |
| 2.8826 | 5327 | 0.0592 |
| 2.8831 | 5328 | 0.073 |
| 2.8837 | 5329 | 0.0933 |
| 2.8842 | 5330 | 0.0418 |
| 2.8847 | 5331 | 0.1063 |
| 2.8853 | 5332 | 0.06 |
| 2.8858 | 5333 | 0.0948 |
| 2.8864 | 5334 | 0.1185 |
| 2.8869 | 5335 | 0.1314 |
| 2.8874 | 5336 | 0.0955 |
| 2.8880 | 5337 | 0.1016 |
| 2.8885 | 5338 | 0.1023 |
| 2.8891 | 5339 | 0.0726 |
| 2.8896 | 5340 | 0.0656 |
| 2.8902 | 5341 | 0.0976 |
| 2.8907 | 5342 | 0.1139 |
| 2.8912 | 5343 | 0.1287 |
| 2.8918 | 5344 | 0.0945 |
| 2.8923 | 5345 | 0.1498 |
| 2.8929 | 5346 | 0.1021 |
| 2.8934 | 5347 | 0.1445 |
| 2.8939 | 5348 | 0.0695 |
| 2.8945 | 5349 | 0.1084 |
| 2.8950 | 5350 | 0.1045 |
| 2.8956 | 5351 | 0.1405 |
| 2.8961 | 5352 | 0.107 |
| 2.8966 | 5353 | 0.1017 |
| 2.8972 | 5354 | 0.0773 |
| 2.8977 | 5355 | 0.1269 |
| 2.8983 | 5356 | 0.073 |
| 2.8988 | 5357 | 0.1202 |
| 2.8994 | 5358 | 0.0795 |
| 2.8999 | 5359 | 0.1065 |
| 2.9004 | 5360 | 0.127 |
| 2.9010 | 5361 | 0.093 |
| 2.9015 | 5362 | 0.0923 |
| 2.9021 | 5363 | 0.1429 |
| 2.9026 | 5364 | 0.0668 |
| 2.9031 | 5365 | 0.0358 |
| 2.9037 | 5366 | 0.0909 |
| 2.9042 | 5367 | 0.0544 |
| 2.9048 | 5368 | 0.1164 |
| 2.9053 | 5369 | 0.0436 |
| 2.9058 | 5370 | 0.0721 |
| 2.9064 | 5371 | 0.0882 |
| 2.9069 | 5372 | 0.1498 |
| 2.9075 | 5373 | 0.085 |
| 2.9080 | 5374 | 0.1012 |
| 2.9085 | 5375 | 0.1164 |
| 2.9091 | 5376 | 0.1303 |
| 2.9096 | 5377 | 0.1091 |
| 2.9102 | 5378 | 0.0666 |
| 2.9107 | 5379 | 0.0969 |
| 2.9113 | 5380 | 0.0863 |
| 2.9118 | 5381 | 0.0833 |
| 2.9123 | 5382 | 0.0849 |
| 2.9129 | 5383 | 0.0662 |
| 2.9134 | 5384 | 0.0861 |
| 2.9140 | 5385 | 0.1756 |
| 2.9145 | 5386 | 0.0665 |
| 2.9150 | 5387 | 0.1171 |
| 2.9156 | 5388 | 0.1085 |
| 2.9161 | 5389 | 0.1401 |
| 2.9167 | 5390 | 0.0727 |
| 2.9172 | 5391 | 0.1318 |
| 2.9177 | 5392 | 0.085 |
| 2.9183 | 5393 | 0.0961 |
| 2.9188 | 5394 | 0.1146 |
| 2.9194 | 5395 | 0.0779 |
| 2.9199 | 5396 | 0.0885 |
| 2.9205 | 5397 | 0.1174 |
| 2.9210 | 5398 | 0.0801 |
| 2.9215 | 5399 | 0.0656 |
| 2.9221 | 5400 | 0.0871 |
| 2.9226 | 5401 | 0.129 |
| 2.9232 | 5402 | 0.1138 |
| 2.9237 | 5403 | 0.0899 |
| 2.9242 | 5404 | 0.0413 |
| 2.9248 | 5405 | 0.1202 |
| 2.9253 | 5406 | 0.0755 |
| 2.9259 | 5407 | 0.0857 |
| 2.9264 | 5408 | 0.045 |
| 2.9269 | 5409 | 0.0622 |
| 2.9275 | 5410 | 0.1063 |
| 2.9280 | 5411 | 0.196 |
| 2.9286 | 5412 | 0.0502 |
| 2.9291 | 5413 | 0.0701 |
| 2.9297 | 5414 | 0.0962 |
| 2.9302 | 5415 | 0.0756 |
| 2.9307 | 5416 | 0.078 |
| 2.9313 | 5417 | 0.1101 |
| 2.9318 | 5418 | 0.0692 |
| 2.9324 | 5419 | 0.1278 |
| 2.9329 | 5420 | 0.1044 |
| 2.9334 | 5421 | 0.0609 |
| 2.9340 | 5422 | 0.0898 |
| 2.9345 | 5423 | 0.1291 |
| 2.9351 | 5424 | 0.0864 |
| 2.9356 | 5425 | 0.0708 |
| 2.9361 | 5426 | 0.0891 |
| 2.9367 | 5427 | 0.0833 |
| 2.9372 | 5428 | 0.0677 |
| 2.9378 | 5429 | 0.0914 |
| 2.9383 | 5430 | 0.0892 |
| 2.9389 | 5431 | 0.1198 |
| 2.9394 | 5432 | 0.0815 |
| 2.9399 | 5433 | 0.1249 |
| 2.9405 | 5434 | 0.0839 |
| 2.9410 | 5435 | 0.0764 |
| 2.9416 | 5436 | 0.0877 |
| 2.9421 | 5437 | 0.0915 |
| 2.9426 | 5438 | 0.0897 |
| 2.9432 | 5439 | 0.0895 |
| 2.9437 | 5440 | 0.0913 |
| 2.9443 | 5441 | 0.1072 |
| 2.9448 | 5442 | 0.1195 |
| 2.9453 | 5443 | 0.1172 |
| 2.9459 | 5444 | 0.0614 |
| 2.9464 | 5445 | 0.0655 |
| 2.9470 | 5446 | 0.0907 |
| 2.9475 | 5447 | 0.1093 |
| 2.9481 | 5448 | 0.1094 |
| 2.9486 | 5449 | 0.1136 |
| 2.9491 | 5450 | 0.1177 |
| 2.9497 | 5451 | 0.1121 |
| 2.9502 | 5452 | 0.1027 |
| 2.9508 | 5453 | 0.1089 |
| 2.9513 | 5454 | 0.0873 |
| 2.9518 | 5455 | 0.0808 |
| 2.9524 | 5456 | 0.114 |
| 2.9529 | 5457 | 0.114 |
| 2.9535 | 5458 | 0.0601 |
| 2.9540 | 5459 | 0.1312 |
| 2.9545 | 5460 | 0.1432 |
| 2.9551 | 5461 | 0.1008 |
| 2.9556 | 5462 | 0.1363 |
| 2.9562 | 5463 | 0.0997 |
| 2.9567 | 5464 | 0.1113 |
| 2.9573 | 5465 | 0.1365 |
| 2.9578 | 5466 | 0.1514 |
| 2.9583 | 5467 | 0.1709 |
| 2.9589 | 5468 | 0.0847 |
| 2.9594 | 5469 | 0.1093 |
| 2.9600 | 5470 | 0.106 |
| 2.9605 | 5471 | 0.1107 |
| 2.9610 | 5472 | 0.0635 |
| 2.9616 | 5473 | 0.1064 |
| 2.9621 | 5474 | 0.0894 |
| 2.9627 | 5475 | 0.1676 |
| 2.9632 | 5476 | 0.091 |
| 2.9637 | 5477 | 0.0878 |
| 2.9643 | 5478 | 0.0591 |
| 2.9648 | 5479 | 0.0574 |
| 2.9654 | 5480 | 0.1093 |
| 2.9659 | 5481 | 0.1349 |
| 2.9665 | 5482 | 0.0918 |
| 2.9670 | 5483 | 0.0789 |
| 2.9675 | 5484 | 0.1823 |
| 2.9681 | 5485 | 0.0773 |
| 2.9686 | 5486 | 0.0842 |
| 2.9692 | 5487 | 0.1025 |
| 2.9697 | 5488 | 0.0506 |
| 2.9702 | 5489 | 0.0634 |
| 2.9708 | 5490 | 0.0805 |
| 2.9713 | 5491 | 0.0558 |
| 2.9719 | 5492 | 0.1261 |
| 2.9724 | 5493 | 0.0778 |
| 2.9729 | 5494 | 0.0731 |
| 2.9735 | 5495 | 0.1029 |
| 2.9740 | 5496 | 0.1656 |
| 2.9746 | 5497 | 0.0666 |
| 2.9751 | 5498 | 0.1192 |
| 2.9756 | 5499 | 0.1185 |
| 2.9762 | 5500 | 0.1022 |
| 2.9767 | 5501 | 0.1075 |
| 2.9773 | 5502 | 0.0503 |
| 2.9778 | 5503 | 0.0946 |
| 2.9784 | 5504 | 0.1351 |
| 2.9789 | 5505 | 0.089 |
| 2.9794 | 5506 | 0.0984 |
| 2.9800 | 5507 | 0.0735 |
| 2.9805 | 5508 | 0.0786 |
| 2.9811 | 5509 | 0.0881 |
| 2.9816 | 5510 | 0.0581 |
| 2.9821 | 5511 | 0.0558 |
| 2.9827 | 5512 | 0.0694 |
| 2.9832 | 5513 | 0.0913 |
| 2.9838 | 5514 | 0.0585 |
| 2.9843 | 5515 | 0.0538 |
| 2.9848 | 5516 | 0.0956 |
| 2.9854 | 5517 | 0.0735 |
| 2.9859 | 5518 | 0.1273 |
| 2.9865 | 5519 | 0.0906 |
| 2.9870 | 5520 | 0.0951 |
| 2.9876 | 5521 | 0.0671 |
| 2.9881 | 5522 | 0.0686 |
| 2.9886 | 5523 | 0.0955 |
| 2.9892 | 5524 | 0.083 |
| 2.9897 | 5525 | 0.0693 |
| 2.9903 | 5526 | 0.1424 |
| 2.9908 | 5527 | 0.1084 |
| 2.9913 | 5528 | 0.1126 |
| 2.9919 | 5529 | 0.0573 |
| 2.9924 | 5530 | 0.1211 |
| 2.9930 | 5531 | 0.061 |
| 2.9935 | 5532 | 0.0792 |
| 2.9940 | 5533 | 0.1059 |
| 2.9946 | 5534 | 0.0728 |
| 2.9951 | 5535 | 0.1047 |
| 2.9957 | 5536 | 0.0839 |
| 2.9962 | 5537 | 0.1019 |
| 2.9968 | 5538 | 0.1577 |
| 2.9973 | 5539 | 0.153 |
| 2.9978 | 5540 | 0.1152 |
| 2.9984 | 5541 | 0.0853 |
| 2.9989 | 5542 | 0.0858 |
| 2.9995 | 5543 | 0.0342 |
| 3.0 | 5544 | 0.1848 |
| 3.0005 | 5545 | 0.0774 |
| 3.0011 | 5546 | 0.064 |
| 3.0016 | 5547 | 0.0417 |
| 3.0022 | 5548 | 0.1099 |
| 3.0027 | 5549 | 0.0731 |
| 3.0032 | 5550 | 0.1233 |
| 3.0038 | 5551 | 0.0636 |
| 3.0043 | 5552 | 0.0658 |
| 3.0049 | 5553 | 0.0737 |
| 3.0054 | 5554 | 0.1226 |
| 3.0060 | 5555 | 0.0937 |
| 3.0065 | 5556 | 0.0684 |
| 3.0070 | 5557 | 0.0778 |
| 3.0076 | 5558 | 0.079 |
| 3.0081 | 5559 | 0.0609 |
| 3.0087 | 5560 | 0.1028 |
| 3.0092 | 5561 | 0.0805 |
| 3.0097 | 5562 | 0.1247 |
| 3.0103 | 5563 | 0.0986 |
| 3.0108 | 5564 | 0.1023 |
| 3.0114 | 5565 | 0.0856 |
| 3.0119 | 5566 | 0.1178 |
| 3.0124 | 5567 | 0.0839 |
| 3.0130 | 5568 | 0.1031 |
| 3.0135 | 5569 | 0.116 |
| 3.0141 | 5570 | 0.0902 |
| 3.0146 | 5571 | 0.0677 |
| 3.0152 | 5572 | 0.0679 |
| 3.0157 | 5573 | 0.0878 |
| 3.0162 | 5574 | 0.0992 |
| 3.0168 | 5575 | 0.0957 |
| 3.0173 | 5576 | 0.0854 |
| 3.0179 | 5577 | 0.0547 |
| 3.0184 | 5578 | 0.0524 |
| 3.0189 | 5579 | 0.0555 |
| 3.0195 | 5580 | 0.1467 |
| 3.0200 | 5581 | 0.0675 |
| 3.0206 | 5582 | 0.0681 |
| 3.0211 | 5583 | 0.0449 |
| 3.0216 | 5584 | 0.0828 |
| 3.0222 | 5585 | 0.0976 |
| 3.0227 | 5586 | 0.0655 |
| 3.0233 | 5587 | 0.0709 |
| 3.0238 | 5588 | 0.0698 |
| 3.0244 | 5589 | 0.1044 |
| 3.0249 | 5590 | 0.0553 |
| 3.0254 | 5591 | 0.0383 |
| 3.0260 | 5592 | 0.1036 |
| 3.0265 | 5593 | 0.1133 |
| 3.0271 | 5594 | 0.0771 |
| 3.0276 | 5595 | 0.1519 |
| 3.0281 | 5596 | 0.0637 |
| 3.0287 | 5597 | 0.0874 |
| 3.0292 | 5598 | 0.0738 |
| 3.0298 | 5599 | 0.0852 |
| 3.0303 | 5600 | 0.0713 |
| 3.0308 | 5601 | 0.0548 |
| 3.0314 | 5602 | 0.0696 |
| 3.0319 | 5603 | 0.065 |
| 3.0325 | 5604 | 0.1102 |
| 3.0330 | 5605 | 0.1083 |
| 3.0335 | 5606 | 0.0941 |
| 3.0341 | 5607 | 0.055 |
| 3.0346 | 5608 | 0.0255 |
| 3.0352 | 5609 | 0.0769 |
| 3.0357 | 5610 | 0.0556 |
| 3.0363 | 5611 | 0.1053 |
| 3.0368 | 5612 | 0.0578 |
| 3.0373 | 5613 | 0.111 |
| 3.0379 | 5614 | 0.0717 |
| 3.0384 | 5615 | 0.0848 |
| 3.0390 | 5616 | 0.075 |
| 3.0395 | 5617 | 0.07 |
| 3.0400 | 5618 | 0.0529 |
| 3.0406 | 5619 | 0.0533 |
| 3.0411 | 5620 | 0.0881 |
| 3.0417 | 5621 | 0.0716 |
| 3.0422 | 5622 | 0.0618 |
| 3.0427 | 5623 | 0.0472 |
| 3.0433 | 5624 | 0.0755 |
| 3.0438 | 5625 | 0.0865 |
| 3.0444 | 5626 | 0.0761 |
| 3.0449 | 5627 | 0.0477 |
| 3.0455 | 5628 | 0.0934 |
| 3.0460 | 5629 | 0.088 |
| 3.0465 | 5630 | 0.0639 |
| 3.0471 | 5631 | 0.0419 |
| 3.0476 | 5632 | 0.049 |
| 3.0482 | 5633 | 0.0711 |
| 3.0487 | 5634 | 0.0804 |
| 3.0492 | 5635 | 0.0726 |
| 3.0498 | 5636 | 0.0953 |
| 3.0503 | 5637 | 0.1136 |
| 3.0509 | 5638 | 0.0975 |
| 3.0514 | 5639 | 0.1029 |
| 3.0519 | 5640 | 0.1473 |
| 3.0525 | 5641 | 0.0537 |
| 3.0530 | 5642 | 0.0857 |
| 3.0536 | 5643 | 0.0705 |
| 3.0541 | 5644 | 0.0787 |
| 3.0547 | 5645 | 0.0748 |
| 3.0552 | 5646 | 0.0902 |
| 3.0557 | 5647 | 0.0748 |
| 3.0563 | 5648 | 0.0597 |
| 3.0568 | 5649 | 0.077 |
| 3.0574 | 5650 | 0.0304 |
| 3.0579 | 5651 | 0.1534 |
| 3.0584 | 5652 | 0.0829 |
| 3.0590 | 5653 | 0.0792 |
| 3.0595 | 5654 | 0.0668 |
| 3.0601 | 5655 | 0.1061 |
| 3.0606 | 5656 | 0.0516 |
| 3.0611 | 5657 | 0.1356 |
| 3.0617 | 5658 | 0.0537 |
| 3.0622 | 5659 | 0.0625 |
| 3.0628 | 5660 | 0.0561 |
| 3.0633 | 5661 | 0.0711 |
| 3.0639 | 5662 | 0.1057 |
| 3.0644 | 5663 | 0.0622 |
| 3.0649 | 5664 | 0.0795 |
| 3.0655 | 5665 | 0.0628 |
| 3.0660 | 5666 | 0.0942 |
| 3.0666 | 5667 | 0.153 |
| 3.0671 | 5668 | 0.0914 |
| 3.0676 | 5669 | 0.0885 |
| 3.0682 | 5670 | 0.0513 |
| 3.0687 | 5671 | 0.0942 |
| 3.0693 | 5672 | 0.0931 |
| 3.0698 | 5673 | 0.0909 |
| 3.0703 | 5674 | 0.0838 |
| 3.0709 | 5675 | 0.0957 |
| 3.0714 | 5676 | 0.0622 |
| 3.0720 | 5677 | 0.052 |
| 3.0725 | 5678 | 0.1528 |
| 3.0731 | 5679 | 0.0635 |
| 3.0736 | 5680 | 0.0548 |
| 3.0741 | 5681 | 0.0654 |
| 3.0747 | 5682 | 0.103 |
| 3.0752 | 5683 | 0.0676 |
| 3.0758 | 5684 | 0.0722 |
| 3.0763 | 5685 | 0.0703 |
| 3.0768 | 5686 | 0.0686 |
| 3.0774 | 5687 | 0.0627 |
| 3.0779 | 5688 | 0.1002 |
| 3.0785 | 5689 | 0.0634 |
| 3.0790 | 5690 | 0.062 |
| 3.0795 | 5691 | 0.0822 |
| 3.0801 | 5692 | 0.0956 |
| 3.0806 | 5693 | 0.0735 |
| 3.0812 | 5694 | 0.0607 |
| 3.0817 | 5695 | 0.1481 |
| 3.0823 | 5696 | 0.0671 |
| 3.0828 | 5697 | 0.0823 |
| 3.0833 | 5698 | 0.12 |
| 3.0839 | 5699 | 0.0433 |
| 3.0844 | 5700 | 0.0959 |
| 3.0850 | 5701 | 0.036 |
| 3.0855 | 5702 | 0.0699 |
| 3.0860 | 5703 | 0.0988 |
| 3.0866 | 5704 | 0.075 |
| 3.0871 | 5705 | 0.093 |
| 3.0877 | 5706 | 0.1268 |
| 3.0882 | 5707 | 0.0658 |
| 3.0887 | 5708 | 0.0729 |
| 3.0893 | 5709 | 0.0848 |
| 3.0898 | 5710 | 0.0847 |
| 3.0904 | 5711 | 0.0667 |
| 3.0909 | 5712 | 0.0559 |
| 3.0915 | 5713 | 0.1056 |
| 3.0920 | 5714 | 0.128 |
| 3.0925 | 5715 | 0.107 |
| 3.0931 | 5716 | 0.0916 |
| 3.0936 | 5717 | 0.0987 |
| 3.0942 | 5718 | 0.0878 |
| 3.0947 | 5719 | 0.0642 |
| 3.0952 | 5720 | 0.0824 |
| 3.0958 | 5721 | 0.07 |
| 3.0963 | 5722 | 0.0801 |
| 3.0969 | 5723 | 0.0294 |
| 3.0974 | 5724 | 0.0705 |
| 3.0979 | 5725 | 0.1107 |
| 3.0985 | 5726 | 0.0522 |
| 3.0990 | 5727 | 0.0742 |
| 3.0996 | 5728 | 0.0966 |
| 3.1001 | 5729 | 0.0686 |
| 3.1006 | 5730 | 0.0593 |
| 3.1012 | 5731 | 0.1158 |
| 3.1017 | 5732 | 0.1042 |
| 3.1023 | 5733 | 0.0474 |
| 3.1028 | 5734 | 0.0778 |
| 3.1034 | 5735 | 0.0825 |
| 3.1039 | 5736 | 0.0945 |
| 3.1044 | 5737 | 0.0715 |
| 3.1050 | 5738 | 0.0823 |
| 3.1055 | 5739 | 0.1125 |
| 3.1061 | 5740 | 0.0525 |
| 3.1066 | 5741 | 0.0954 |
| 3.1071 | 5742 | 0.0963 |
| 3.1077 | 5743 | 0.0841 |
| 3.1082 | 5744 | 0.0873 |
| 3.1088 | 5745 | 0.0805 |
| 3.1093 | 5746 | 0.0594 |
| 3.1098 | 5747 | 0.0385 |
| 3.1104 | 5748 | 0.0591 |
| 3.1109 | 5749 | 0.0982 |
| 3.1115 | 5750 | 0.078 |
| 3.1120 | 5751 | 0.0456 |
| 3.1126 | 5752 | 0.0847 |
| 3.1131 | 5753 | 0.0545 |
| 3.1136 | 5754 | 0.0922 |
| 3.1142 | 5755 | 0.1174 |
| 3.1147 | 5756 | 0.1056 |
| 3.1153 | 5757 | 0.0799 |
| 3.1158 | 5758 | 0.072 |
| 3.1163 | 5759 | 0.0893 |
| 3.1169 | 5760 | 0.0651 |
| 3.1174 | 5761 | 0.0832 |
| 3.1180 | 5762 | 0.093 |
| 3.1185 | 5763 | 0.0677 |
| 3.1190 | 5764 | 0.0649 |
| 3.1196 | 5765 | 0.0651 |
| 3.1201 | 5766 | 0.0771 |
| 3.1207 | 5767 | 0.0636 |
| 3.1212 | 5768 | 0.0901 |
| 3.1218 | 5769 | 0.121 |
| 3.1223 | 5770 | 0.0838 |
| 3.1228 | 5771 | 0.07 |
| 3.1234 | 5772 | 0.1266 |
| 3.1239 | 5773 | 0.0835 |
| 3.1245 | 5774 | 0.0469 |
| 3.125 | 5775 | 0.0936 |
| 3.1255 | 5776 | 0.0535 |
| 3.1261 | 5777 | 0.1428 |
| 3.1266 | 5778 | 0.0938 |
| 3.1272 | 5779 | 0.0642 |
| 3.1277 | 5780 | 0.0656 |
| 3.1282 | 5781 | 0.0824 |
| 3.1288 | 5782 | 0.1233 |
| 3.1293 | 5783 | 0.0787 |
| 3.1299 | 5784 | 0.0878 |
| 3.1304 | 5785 | 0.1327 |
| 3.1310 | 5786 | 0.0886 |
| 3.1315 | 5787 | 0.1108 |
| 3.1320 | 5788 | 0.0786 |
| 3.1326 | 5789 | 0.1017 |
| 3.1331 | 5790 | 0.0488 |
| 3.1337 | 5791 | 0.0381 |
| 3.1342 | 5792 | 0.0869 |
| 3.1347 | 5793 | 0.0721 |
| 3.1353 | 5794 | 0.0935 |
| 3.1358 | 5795 | 0.0934 |
| 3.1364 | 5796 | 0.1036 |
| 3.1369 | 5797 | 0.0794 |
| 3.1374 | 5798 | 0.0739 |
| 3.1380 | 5799 | 0.0782 |
| 3.1385 | 5800 | 0.0893 |
| 3.1391 | 5801 | 0.0448 |
| 3.1396 | 5802 | 0.0469 |
| 3.1402 | 5803 | 0.0594 |
| 3.1407 | 5804 | 0.105 |
| 3.1412 | 5805 | 0.0924 |
| 3.1418 | 5806 | 0.1054 |
| 3.1423 | 5807 | 0.0653 |
| 3.1429 | 5808 | 0.1027 |
| 3.1434 | 5809 | 0.103 |
| 3.1439 | 5810 | 0.0598 |
| 3.1445 | 5811 | 0.1114 |
| 3.1450 | 5812 | 0.0818 |
| 3.1456 | 5813 | 0.0873 |
| 3.1461 | 5814 | 0.0915 |
| 3.1466 | 5815 | 0.034 |
| 3.1472 | 5816 | 0.0803 |
| 3.1477 | 5817 | 0.0804 |
| 3.1483 | 5818 | 0.0978 |
| 3.1488 | 5819 | 0.0483 |
| 3.1494 | 5820 | 0.0804 |
| 3.1499 | 5821 | 0.0919 |
| 3.1504 | 5822 | 0.063 |
| 3.1510 | 5823 | 0.1046 |
| 3.1515 | 5824 | 0.1235 |
| 3.1521 | 5825 | 0.0333 |
| 3.1526 | 5826 | 0.1265 |
| 3.1531 | 5827 | 0.0568 |
| 3.1537 | 5828 | 0.0718 |
| 3.1542 | 5829 | 0.0552 |
| 3.1548 | 5830 | 0.0443 |
| 3.1553 | 5831 | 0.0643 |
| 3.1558 | 5832 | 0.0681 |
| 3.1564 | 5833 | 0.1264 |
| 3.1569 | 5834 | 0.1108 |
| 3.1575 | 5835 | 0.0486 |
| 3.1580 | 5836 | 0.0661 |
| 3.1585 | 5837 | 0.0835 |
| 3.1591 | 5838 | 0.0735 |
| 3.1596 | 5839 | 0.0996 |
| 3.1602 | 5840 | 0.069 |
| 3.1607 | 5841 | 0.0614 |
| 3.1613 | 5842 | 0.0826 |
| 3.1618 | 5843 | 0.0572 |
| 3.1623 | 5844 | 0.062 |
| 3.1629 | 5845 | 0.0577 |
| 3.1634 | 5846 | 0.1343 |
| 3.1640 | 5847 | 0.1119 |
| 3.1645 | 5848 | 0.0656 |
| 3.1650 | 5849 | 0.0763 |
| 3.1656 | 5850 | 0.0689 |
| 3.1661 | 5851 | 0.082 |
| 3.1667 | 5852 | 0.0699 |
| 3.1672 | 5853 | 0.0576 |
| 3.1677 | 5854 | 0.0775 |
| 3.1683 | 5855 | 0.1195 |
| 3.1688 | 5856 | 0.0993 |
| 3.1694 | 5857 | 0.1003 |
| 3.1699 | 5858 | 0.0603 |
| 3.1705 | 5859 | 0.0994 |
| 3.1710 | 5860 | 0.0746 |
| 3.1715 | 5861 | 0.0659 |
| 3.1721 | 5862 | 0.0895 |
| 3.1726 | 5863 | 0.0728 |
| 3.1732 | 5864 | 0.0456 |
| 3.1737 | 5865 | 0.109 |
| 3.1742 | 5866 | 0.0485 |
| 3.1748 | 5867 | 0.1287 |
| 3.1753 | 5868 | 0.0332 |
| 3.1759 | 5869 | 0.0849 |
| 3.1764 | 5870 | 0.0826 |
| 3.1769 | 5871 | 0.0646 |
| 3.1775 | 5872 | 0.0593 |
| 3.1780 | 5873 | 0.1558 |
| 3.1786 | 5874 | 0.0773 |
| 3.1791 | 5875 | 0.0572 |
| 3.1797 | 5876 | 0.0756 |
| 3.1802 | 5877 | 0.0829 |
| 3.1807 | 5878 | 0.1003 |
| 3.1813 | 5879 | 0.0825 |
| 3.1818 | 5880 | 0.1302 |
| 3.1824 | 5881 | 0.0444 |
| 3.1829 | 5882 | 0.0645 |
| 3.1834 | 5883 | 0.0708 |
| 3.1840 | 5884 | 0.0487 |
| 3.1845 | 5885 | 0.0435 |
| 3.1851 | 5886 | 0.0539 |
| 3.1856 | 5887 | 0.1219 |
| 3.1861 | 5888 | 0.0897 |
| 3.1867 | 5889 | 0.1088 |
| 3.1872 | 5890 | 0.0979 |
| 3.1878 | 5891 | 0.1832 |
| 3.1883 | 5892 | 0.0544 |
| 3.1889 | 5893 | 0.0767 |
| 3.1894 | 5894 | 0.1379 |
| 3.1899 | 5895 | 0.0652 |
| 3.1905 | 5896 | 0.0846 |
| 3.1910 | 5897 | 0.061 |
| 3.1916 | 5898 | 0.1318 |
| 3.1921 | 5899 | 0.065 |
| 3.1926 | 5900 | 0.0774 |
| 3.1932 | 5901 | 0.0748 |
| 3.1937 | 5902 | 0.0625 |
| 3.1943 | 5903 | 0.0964 |
| 3.1948 | 5904 | 0.0798 |
| 3.1953 | 5905 | 0.0826 |
| 3.1959 | 5906 | 0.0795 |
| 3.1964 | 5907 | 0.134 |
| 3.1970 | 5908 | 0.0571 |
| 3.1975 | 5909 | 0.0584 |
| 3.1981 | 5910 | 0.067 |
| 3.1986 | 5911 | 0.048 |
| 3.1991 | 5912 | 0.1235 |
| 3.1997 | 5913 | 0.0654 |
| 3.2002 | 5914 | 0.0389 |
| 3.2008 | 5915 | 0.0798 |
| 3.2013 | 5916 | 0.058 |
| 3.2018 | 5917 | 0.1136 |
| 3.2024 | 5918 | 0.0617 |
| 3.2029 | 5919 | 0.0992 |
| 3.2035 | 5920 | 0.1078 |
| 3.2040 | 5921 | 0.0534 |
| 3.2045 | 5922 | 0.1208 |
| 3.2051 | 5923 | 0.0585 |
| 3.2056 | 5924 | 0.0616 |
| 3.2062 | 5925 | 0.087 |
| 3.2067 | 5926 | 0.0654 |
| 3.2073 | 5927 | 0.0691 |
| 3.2078 | 5928 | 0.0628 |
| 3.2083 | 5929 | 0.0721 |
| 3.2089 | 5930 | 0.059 |
| 3.2094 | 5931 | 0.0738 |
| 3.2100 | 5932 | 0.0647 |
| 3.2105 | 5933 | 0.0821 |
| 3.2110 | 5934 | 0.0692 |
| 3.2116 | 5935 | 0.126 |
| 3.2121 | 5936 | 0.0626 |
| 3.2127 | 5937 | 0.0853 |
| 3.2132 | 5938 | 0.0991 |
| 3.2137 | 5939 | 0.049 |
| 3.2143 | 5940 | 0.0924 |
| 3.2148 | 5941 | 0.0784 |
| 3.2154 | 5942 | 0.0943 |
| 3.2159 | 5943 | 0.1166 |
| 3.2165 | 5944 | 0.0961 |
| 3.2170 | 5945 | 0.1163 |
| 3.2175 | 5946 | 0.0754 |
| 3.2181 | 5947 | 0.1161 |
| 3.2186 | 5948 | 0.081 |
| 3.2192 | 5949 | 0.0384 |
| 3.2197 | 5950 | 0.0806 |
| 3.2202 | 5951 | 0.127 |
| 3.2208 | 5952 | 0.0445 |
| 3.2213 | 5953 | 0.07 |
| 3.2219 | 5954 | 0.0761 |
| 3.2224 | 5955 | 0.0701 |
| 3.2229 | 5956 | 0.059 |
| 3.2235 | 5957 | 0.0784 |
| 3.2240 | 5958 | 0.0487 |
| 3.2246 | 5959 | 0.0794 |
| 3.2251 | 5960 | 0.0597 |
| 3.2256 | 5961 | 0.0856 |
| 3.2262 | 5962 | 0.0767 |
| 3.2267 | 5963 | 0.0984 |
| 3.2273 | 5964 | 0.0603 |
| 3.2278 | 5965 | 0.0705 |
| 3.2284 | 5966 | 0.0636 |
| 3.2289 | 5967 | 0.0538 |
| 3.2294 | 5968 | 0.0811 |
| 3.2300 | 5969 | 0.0592 |
| 3.2305 | 5970 | 0.0804 |
| 3.2311 | 5971 | 0.068 |
| 3.2316 | 5972 | 0.0721 |
| 3.2321 | 5973 | 0.0922 |
| 3.2327 | 5974 | 0.1008 |
| 3.2332 | 5975 | 0.0932 |
| 3.2338 | 5976 | 0.1179 |
| 3.2343 | 5977 | 0.1413 |
| 3.2348 | 5978 | 0.0636 |
| 3.2354 | 5979 | 0.0694 |
| 3.2359 | 5980 | 0.109 |
| 3.2365 | 5981 | 0.138 |
| 3.2370 | 5982 | 0.0577 |
| 3.2376 | 5983 | 0.0859 |
| 3.2381 | 5984 | 0.0736 |
| 3.2386 | 5985 | 0.0919 |
| 3.2392 | 5986 | 0.1042 |
| 3.2397 | 5987 | 0.0776 |
| 3.2403 | 5988 | 0.0515 |
| 3.2408 | 5989 | 0.1458 |
| 3.2413 | 5990 | 0.1263 |
| 3.2419 | 5991 | 0.0241 |
| 3.2424 | 5992 | 0.0408 |
| 3.2430 | 5993 | 0.0554 |
| 3.2435 | 5994 | 0.0435 |
| 3.2440 | 5995 | 0.078 |
| 3.2446 | 5996 | 0.1463 |
| 3.2451 | 5997 | 0.0615 |
| 3.2457 | 5998 | 0.079 |
| 3.2462 | 5999 | 0.0751 |
| 3.2468 | 6000 | 0.0605 |
| 3.2473 | 6001 | 0.1025 |
| 3.2478 | 6002 | 0.0402 |
| 3.2484 | 6003 | 0.0946 |
| 3.2489 | 6004 | 0.0809 |
| 3.2495 | 6005 | 0.1007 |
| 3.25 | 6006 | 0.0722 |
| 3.2505 | 6007 | 0.0483 |
| 3.2511 | 6008 | 0.062 |
| 3.2516 | 6009 | 0.0627 |
| 3.2522 | 6010 | 0.0816 |
| 3.2527 | 6011 | 0.0752 |
| 3.2532 | 6012 | 0.0727 |
| 3.2538 | 6013 | 0.0948 |
| 3.2543 | 6014 | 0.0767 |
| 3.2549 | 6015 | 0.0584 |
| 3.2554 | 6016 | 0.1126 |
| 3.2560 | 6017 | 0.1433 |
| 3.2565 | 6018 | 0.0721 |
| 3.2570 | 6019 | 0.0659 |
| 3.2576 | 6020 | 0.075 |
| 3.2581 | 6021 | 0.1374 |
| 3.2587 | 6022 | 0.0436 |
| 3.2592 | 6023 | 0.0927 |
| 3.2597 | 6024 | 0.0992 |
| 3.2603 | 6025 | 0.1064 |
| 3.2608 | 6026 | 0.0708 |
| 3.2614 | 6027 | 0.0626 |
| 3.2619 | 6028 | 0.0665 |
| 3.2624 | 6029 | 0.0608 |
| 3.2630 | 6030 | 0.0569 |
| 3.2635 | 6031 | 0.0457 |
| 3.2641 | 6032 | 0.0637 |
| 3.2646 | 6033 | 0.1283 |
| 3.2652 | 6034 | 0.1004 |
| 3.2657 | 6035 | 0.0431 |
| 3.2662 | 6036 | 0.1564 |
| 3.2668 | 6037 | 0.0774 |
| 3.2673 | 6038 | 0.1013 |
| 3.2679 | 6039 | 0.097 |
| 3.2684 | 6040 | 0.1249 |
| 3.2689 | 6041 | 0.0731 |
| 3.2695 | 6042 | 0.0737 |
| 3.2700 | 6043 | 0.0689 |
| 3.2706 | 6044 | 0.0455 |
| 3.2711 | 6045 | 0.078 |
| 3.2716 | 6046 | 0.1229 |
| 3.2722 | 6047 | 0.098 |
| 3.2727 | 6048 | 0.086 |
| 3.2733 | 6049 | 0.0835 |
| 3.2738 | 6050 | 0.0572 |
| 3.2744 | 6051 | 0.096 |
| 3.2749 | 6052 | 0.0686 |
| 3.2754 | 6053 | 0.0427 |
| 3.2760 | 6054 | 0.137 |
| 3.2765 | 6055 | 0.0527 |
| 3.2771 | 6056 | 0.0798 |
| 3.2776 | 6057 | 0.0893 |
| 3.2781 | 6058 | 0.0943 |
| 3.2787 | 6059 | 0.0605 |
| 3.2792 | 6060 | 0.1054 |
| 3.2798 | 6061 | 0.0936 |
| 3.2803 | 6062 | 0.1033 |
| 3.2808 | 6063 | 0.0674 |
| 3.2814 | 6064 | 0.1009 |
| 3.2819 | 6065 | 0.0597 |
| 3.2825 | 6066 | 0.0573 |
| 3.2830 | 6067 | 0.0523 |
| 3.2835 | 6068 | 0.0814 |
| 3.2841 | 6069 | 0.0546 |
| 3.2846 | 6070 | 0.0357 |
| 3.2852 | 6071 | 0.074 |
| 3.2857 | 6072 | 0.0799 |
| 3.2863 | 6073 | 0.1023 |
| 3.2868 | 6074 | 0.0686 |
| 3.2873 | 6075 | 0.0713 |
| 3.2879 | 6076 | 0.071 |
| 3.2884 | 6077 | 0.0691 |
| 3.2890 | 6078 | 0.0916 |
| 3.2895 | 6079 | 0.1019 |
| 3.2900 | 6080 | 0.049 |
| 3.2906 | 6081 | 0.0797 |
| 3.2911 | 6082 | 0.0584 |
| 3.2917 | 6083 | 0.0758 |
| 3.2922 | 6084 | 0.0895 |
| 3.2927 | 6085 | 0.0796 |
| 3.2933 | 6086 | 0.0615 |
| 3.2938 | 6087 | 0.0765 |
| 3.2944 | 6088 | 0.0594 |
| 3.2949 | 6089 | 0.1275 |
| 3.2955 | 6090 | 0.0927 |
| 3.2960 | 6091 | 0.056 |
| 3.2965 | 6092 | 0.0733 |
| 3.2971 | 6093 | 0.0965 |
| 3.2976 | 6094 | 0.1002 |
| 3.2982 | 6095 | 0.1042 |
| 3.2987 | 6096 | 0.0592 |
| 3.2992 | 6097 | 0.0739 |
| 3.2998 | 6098 | 0.08 |
| 3.3003 | 6099 | 0.1251 |
| 3.3009 | 6100 | 0.0655 |
| 3.3014 | 6101 | 0.0597 |
| 3.3019 | 6102 | 0.0611 |
| 3.3025 | 6103 | 0.0649 |
| 3.3030 | 6104 | 0.1331 |
| 3.3036 | 6105 | 0.1075 |
| 3.3041 | 6106 | 0.1372 |
| 3.3047 | 6107 | 0.0932 |
| 3.3052 | 6108 | 0.0888 |
| 3.3057 | 6109 | 0.1288 |
| 3.3063 | 6110 | 0.0956 |
| 3.3068 | 6111 | 0.0957 |
| 3.3074 | 6112 | 0.0565 |
| 3.3079 | 6113 | 0.0718 |
| 3.3084 | 6114 | 0.1002 |
| 3.3090 | 6115 | 0.0904 |
| 3.3095 | 6116 | 0.1076 |
| 3.3101 | 6117 | 0.0631 |
| 3.3106 | 6118 | 0.0548 |
| 3.3111 | 6119 | 0.0928 |
| 3.3117 | 6120 | 0.0632 |
| 3.3122 | 6121 | 0.0804 |
| 3.3128 | 6122 | 0.0869 |
| 3.3133 | 6123 | 0.0751 |
| 3.3139 | 6124 | 0.0954 |
| 3.3144 | 6125 | 0.1069 |
| 3.3149 | 6126 | 0.1149 |
| 3.3155 | 6127 | 0.111 |
| 3.3160 | 6128 | 0.0404 |
| 3.3166 | 6129 | 0.1078 |
| 3.3171 | 6130 | 0.1086 |
| 3.3176 | 6131 | 0.0585 |
| 3.3182 | 6132 | 0.0678 |
| 3.3187 | 6133 | 0.0862 |
| 3.3193 | 6134 | 0.0835 |
| 3.3198 | 6135 | 0.0784 |
| 3.3203 | 6136 | 0.0973 |
| 3.3209 | 6137 | 0.0953 |
| 3.3214 | 6138 | 0.0527 |
| 3.3220 | 6139 | 0.0784 |
| 3.3225 | 6140 | 0.0833 |
| 3.3231 | 6141 | 0.0839 |
| 3.3236 | 6142 | 0.081 |
| 3.3241 | 6143 | 0.0715 |
| 3.3247 | 6144 | 0.1214 |
| 3.3252 | 6145 | 0.0441 |
| 3.3258 | 6146 | 0.0812 |
| 3.3263 | 6147 | 0.1044 |
| 3.3268 | 6148 | 0.0956 |
| 3.3274 | 6149 | 0.07 |
| 3.3279 | 6150 | 0.1022 |
| 3.3285 | 6151 | 0.0618 |
| 3.3290 | 6152 | 0.0641 |
| 3.3295 | 6153 | 0.0605 |
| 3.3301 | 6154 | 0.0909 |
| 3.3306 | 6155 | 0.0574 |
| 3.3312 | 6156 | 0.1013 |
| 3.3317 | 6157 | 0.0549 |
| 3.3323 | 6158 | 0.0878 |
| 3.3328 | 6159 | 0.0883 |
| 3.3333 | 6160 | 0.0683 |
| 3.3339 | 6161 | 0.1398 |
| 3.3344 | 6162 | 0.1068 |
| 3.3350 | 6163 | 0.0965 |
| 3.3355 | 6164 | 0.0813 |
| 3.3360 | 6165 | 0.0707 |
| 3.3366 | 6166 | 0.0939 |
| 3.3371 | 6167 | 0.0789 |
| 3.3377 | 6168 | 0.0959 |
| 3.3382 | 6169 | 0.0608 |
| 3.3387 | 6170 | 0.0971 |
| 3.3393 | 6171 | 0.1021 |
| 3.3398 | 6172 | 0.1045 |
| 3.3404 | 6173 | 0.1451 |
| 3.3409 | 6174 | 0.1212 |
| 3.3415 | 6175 | 0.1469 |
| 3.3420 | 6176 | 0.0952 |
| 3.3425 | 6177 | 0.0558 |
| 3.3431 | 6178 | 0.1147 |
| 3.3436 | 6179 | 0.0801 |
| 3.3442 | 6180 | 0.1363 |
| 3.3447 | 6181 | 0.0676 |
| 3.3452 | 6182 | 0.0699 |
| 3.3458 | 6183 | 0.0624 |
| 3.3463 | 6184 | 0.0724 |
| 3.3469 | 6185 | 0.0847 |
| 3.3474 | 6186 | 0.0635 |
| 3.3479 | 6187 | 0.0868 |
| 3.3485 | 6188 | 0.112 |
| 3.3490 | 6189 | 0.131 |
| 3.3496 | 6190 | 0.1078 |
| 3.3501 | 6191 | 0.1334 |
| 3.3506 | 6192 | 0.0853 |
| 3.3512 | 6193 | 0.0836 |
| 3.3517 | 6194 | 0.059 |
| 3.3523 | 6195 | 0.0523 |
| 3.3528 | 6196 | 0.07 |
| 3.3534 | 6197 | 0.0706 |
| 3.3539 | 6198 | 0.0742 |
| 3.3544 | 6199 | 0.0642 |
| 3.3550 | 6200 | 0.0922 |
| 3.3555 | 6201 | 0.0638 |
| 3.3561 | 6202 | 0.0609 |
| 3.3566 | 6203 | 0.0727 |
| 3.3571 | 6204 | 0.077 |
| 3.3577 | 6205 | 0.1474 |
| 3.3582 | 6206 | 0.0486 |
| 3.3588 | 6207 | 0.063 |
| 3.3593 | 6208 | 0.0842 |
| 3.3598 | 6209 | 0.1142 |
| 3.3604 | 6210 | 0.1024 |
| 3.3609 | 6211 | 0.119 |
| 3.3615 | 6212 | 0.11 |
| 3.3620 | 6213 | 0.0578 |
| 3.3626 | 6214 | 0.0844 |
| 3.3631 | 6215 | 0.0361 |
| 3.3636 | 6216 | 0.053 |
| 3.3642 | 6217 | 0.0699 |
| 3.3647 | 6218 | 0.062 |
| 3.3653 | 6219 | 0.0637 |
| 3.3658 | 6220 | 0.1125 |
| 3.3663 | 6221 | 0.0939 |
| 3.3669 | 6222 | 0.0783 |
| 3.3674 | 6223 | 0.0992 |
| 3.3680 | 6224 | 0.0735 |
| 3.3685 | 6225 | 0.0531 |
| 3.3690 | 6226 | 0.0906 |
| 3.3696 | 6227 | 0.0983 |
| 3.3701 | 6228 | 0.0512 |
| 3.3707 | 6229 | 0.0695 |
| 3.3712 | 6230 | 0.1186 |
| 3.3718 | 6231 | 0.0387 |
| 3.3723 | 6232 | 0.094 |
| 3.3728 | 6233 | 0.0615 |
| 3.3734 | 6234 | 0.1316 |
| 3.3739 | 6235 | 0.0895 |
| 3.3745 | 6236 | 0.1054 |
| 3.375 | 6237 | 0.0706 |
| 3.3755 | 6238 | 0.0959 |
| 3.3761 | 6239 | 0.0737 |
| 3.3766 | 6240 | 0.0593 |
| 3.3772 | 6241 | 0.078 |
| 3.3777 | 6242 | 0.0723 |
| 3.3782 | 6243 | 0.082 |
| 3.3788 | 6244 | 0.1098 |
| 3.3793 | 6245 | 0.0793 |
| 3.3799 | 6246 | 0.0581 |
| 3.3804 | 6247 | 0.0775 |
| 3.3810 | 6248 | 0.0827 |
| 3.3815 | 6249 | 0.1026 |
| 3.3820 | 6250 | 0.0479 |
| 3.3826 | 6251 | 0.0809 |
| 3.3831 | 6252 | 0.0969 |
| 3.3837 | 6253 | 0.117 |
| 3.3842 | 6254 | 0.0966 |
| 3.3847 | 6255 | 0.0468 |
| 3.3853 | 6256 | 0.0627 |
| 3.3858 | 6257 | 0.0531 |
| 3.3864 | 6258 | 0.0591 |
| 3.3869 | 6259 | 0.0578 |
| 3.3874 | 6260 | 0.1029 |
| 3.3880 | 6261 | 0.0565 |
| 3.3885 | 6262 | 0.0961 |
| 3.3891 | 6263 | 0.0595 |
| 3.3896 | 6264 | 0.0327 |
| 3.3902 | 6265 | 0.0762 |
| 3.3907 | 6266 | 0.0584 |
| 3.3912 | 6267 | 0.0931 |
| 3.3918 | 6268 | 0.108 |
| 3.3923 | 6269 | 0.0659 |
| 3.3929 | 6270 | 0.1078 |
| 3.3934 | 6271 | 0.0714 |
| 3.3939 | 6272 | 0.0853 |
| 3.3945 | 6273 | 0.061 |
| 3.3950 | 6274 | 0.0764 |
| 3.3956 | 6275 | 0.0497 |
| 3.3961 | 6276 | 0.0596 |
| 3.3966 | 6277 | 0.1012 |
| 3.3972 | 6278 | 0.0828 |
| 3.3977 | 6279 | 0.0632 |
| 3.3983 | 6280 | 0.1238 |
| 3.3988 | 6281 | 0.0596 |
| 3.3994 | 6282 | 0.1057 |
| 3.3999 | 6283 | 0.0537 |
| 3.4004 | 6284 | 0.1111 |
| 3.4010 | 6285 | 0.1753 |
| 3.4015 | 6286 | 0.0607 |
| 3.4021 | 6287 | 0.1146 |
| 3.4026 | 6288 | 0.052 |
| 3.4031 | 6289 | 0.0314 |
| 3.4037 | 6290 | 0.138 |
| 3.4042 | 6291 | 0.0915 |
| 3.4048 | 6292 | 0.0703 |
| 3.4053 | 6293 | 0.0531 |
| 3.4058 | 6294 | 0.0591 |
| 3.4064 | 6295 | 0.0982 |
| 3.4069 | 6296 | 0.088 |
| 3.4075 | 6297 | 0.0476 |
| 3.4080 | 6298 | 0.0869 |
| 3.4085 | 6299 | 0.1226 |
| 3.4091 | 6300 | 0.1224 |
| 3.4096 | 6301 | 0.0636 |
| 3.4102 | 6302 | 0.0936 |
| 3.4107 | 6303 | 0.0594 |
| 3.4113 | 6304 | 0.0662 |
| 3.4118 | 6305 | 0.0486 |
| 3.4123 | 6306 | 0.0694 |
| 3.4129 | 6307 | 0.1176 |
| 3.4134 | 6308 | 0.114 |
| 3.4140 | 6309 | 0.0951 |
| 3.4145 | 6310 | 0.0827 |
| 3.4150 | 6311 | 0.0981 |
| 3.4156 | 6312 | 0.0668 |
| 3.4161 | 6313 | 0.0729 |
| 3.4167 | 6314 | 0.0974 |
| 3.4172 | 6315 | 0.0644 |
| 3.4177 | 6316 | 0.0707 |
| 3.4183 | 6317 | 0.0503 |
| 3.4188 | 6318 | 0.0445 |
| 3.4194 | 6319 | 0.09 |
| 3.4199 | 6320 | 0.0934 |
| 3.4205 | 6321 | 0.0817 |
| 3.4210 | 6322 | 0.0586 |
| 3.4215 | 6323 | 0.1191 |
| 3.4221 | 6324 | 0.0608 |
| 3.4226 | 6325 | 0.0653 |
| 3.4232 | 6326 | 0.069 |
| 3.4237 | 6327 | 0.0853 |
| 3.4242 | 6328 | 0.0685 |
| 3.4248 | 6329 | 0.1418 |
| 3.4253 | 6330 | 0.0699 |
| 3.4259 | 6331 | 0.1262 |
| 3.4264 | 6332 | 0.0951 |
| 3.4269 | 6333 | 0.1355 |
| 3.4275 | 6334 | 0.1146 |
| 3.4280 | 6335 | 0.0959 |
| 3.4286 | 6336 | 0.0958 |
| 3.4291 | 6337 | 0.1354 |
| 3.4297 | 6338 | 0.0749 |
| 3.4302 | 6339 | 0.0658 |
| 3.4307 | 6340 | 0.0928 |
| 3.4313 | 6341 | 0.0669 |
| 3.4318 | 6342 | 0.0944 |
| 3.4324 | 6343 | 0.0664 |
| 3.4329 | 6344 | 0.0851 |
| 3.4334 | 6345 | 0.1421 |
| 3.4340 | 6346 | 0.0606 |
| 3.4345 | 6347 | 0.0534 |
| 3.4351 | 6348 | 0.09 |
| 3.4356 | 6349 | 0.0751 |
| 3.4361 | 6350 | 0.042 |
| 3.4367 | 6351 | 0.0665 |
| 3.4372 | 6352 | 0.069 |
| 3.4378 | 6353 | 0.0576 |
| 3.4383 | 6354 | 0.0679 |
| 3.4389 | 6355 | 0.0622 |
| 3.4394 | 6356 | 0.056 |
| 3.4399 | 6357 | 0.0778 |
| 3.4405 | 6358 | 0.1015 |
| 3.4410 | 6359 | 0.0705 |
| 3.4416 | 6360 | 0.0955 |
| 3.4421 | 6361 | 0.0733 |
| 3.4426 | 6362 | 0.104 |
| 3.4432 | 6363 | 0.0716 |
| 3.4437 | 6364 | 0.1134 |
| 3.4443 | 6365 | 0.0844 |
| 3.4448 | 6366 | 0.0667 |
| 3.4453 | 6367 | 0.103 |
| 3.4459 | 6368 | 0.0735 |
| 3.4464 | 6369 | 0.1267 |
| 3.4470 | 6370 | 0.0467 |
| 3.4475 | 6371 | 0.08 |
| 3.4481 | 6372 | 0.0802 |
| 3.4486 | 6373 | 0.0568 |
| 3.4491 | 6374 | 0.1414 |
| 3.4497 | 6375 | 0.054 |
| 3.4502 | 6376 | 0.0822 |
| 3.4508 | 6377 | 0.0807 |
| 3.4513 | 6378 | 0.075 |
| 3.4518 | 6379 | 0.0937 |
| 3.4524 | 6380 | 0.0803 |
| 3.4529 | 6381 | 0.0347 |
| 3.4535 | 6382 | 0.0631 |
| 3.4540 | 6383 | 0.0727 |
| 3.4545 | 6384 | 0.1211 |
| 3.4551 | 6385 | 0.054 |
| 3.4556 | 6386 | 0.0669 |
| 3.4562 | 6387 | 0.065 |
| 3.4567 | 6388 | 0.0822 |
| 3.4573 | 6389 | 0.0743 |
| 3.4578 | 6390 | 0.0465 |
| 3.4583 | 6391 | 0.0741 |
| 3.4589 | 6392 | 0.1008 |
| 3.4594 | 6393 | 0.1302 |
| 3.4600 | 6394 | 0.0825 |
| 3.4605 | 6395 | 0.0949 |
| 3.4610 | 6396 | 0.0952 |
| 3.4616 | 6397 | 0.093 |
| 3.4621 | 6398 | 0.0857 |
| 3.4627 | 6399 | 0.0645 |
| 3.4632 | 6400 | 0.0691 |
| 3.4637 | 6401 | 0.0648 |
| 3.4643 | 6402 | 0.0829 |
| 3.4648 | 6403 | 0.1468 |
| 3.4654 | 6404 | 0.1211 |
| 3.4659 | 6405 | 0.0638 |
| 3.4665 | 6406 | 0.0913 |
| 3.4670 | 6407 | 0.0389 |
| 3.4675 | 6408 | 0.0578 |
| 3.4681 | 6409 | 0.0934 |
| 3.4686 | 6410 | 0.1078 |
| 3.4692 | 6411 | 0.0997 |
| 3.4697 | 6412 | 0.0499 |
| 3.4702 | 6413 | 0.0977 |
| 3.4708 | 6414 | 0.0638 |
| 3.4713 | 6415 | 0.0584 |
| 3.4719 | 6416 | 0.0676 |
| 3.4724 | 6417 | 0.0692 |
| 3.4729 | 6418 | 0.0725 |
| 3.4735 | 6419 | 0.084 |
| 3.4740 | 6420 | 0.0805 |
| 3.4746 | 6421 | 0.0742 |
| 3.4751 | 6422 | 0.0768 |
| 3.4756 | 6423 | 0.1303 |
| 3.4762 | 6424 | 0.0682 |
| 3.4767 | 6425 | 0.0826 |
| 3.4773 | 6426 | 0.0705 |
| 3.4778 | 6427 | 0.0725 |
| 3.4784 | 6428 | 0.0643 |
| 3.4789 | 6429 | 0.1099 |
| 3.4794 | 6430 | 0.0812 |
| 3.4800 | 6431 | 0.1072 |
| 3.4805 | 6432 | 0.0975 |
| 3.4811 | 6433 | 0.0818 |
| 3.4816 | 6434 | 0.0876 |
| 3.4821 | 6435 | 0.0622 |
| 3.4827 | 6436 | 0.1226 |
| 3.4832 | 6437 | 0.0868 |
| 3.4838 | 6438 | 0.0748 |
| 3.4843 | 6439 | 0.1153 |
| 3.4848 | 6440 | 0.0586 |
| 3.4854 | 6441 | 0.0956 |
| 3.4859 | 6442 | 0.1063 |
| 3.4865 | 6443 | 0.0591 |
| 3.4870 | 6444 | 0.1097 |
| 3.4876 | 6445 | 0.0426 |
| 3.4881 | 6446 | 0.037 |
| 3.4886 | 6447 | 0.096 |
| 3.4892 | 6448 | 0.0938 |
| 3.4897 | 6449 | 0.0655 |
| 3.4903 | 6450 | 0.0537 |
| 3.4908 | 6451 | 0.0943 |
| 3.4913 | 6452 | 0.0793 |
| 3.4919 | 6453 | 0.0801 |
| 3.4924 | 6454 | 0.0794 |
| 3.4930 | 6455 | 0.0844 |
| 3.4935 | 6456 | 0.0774 |
| 3.4940 | 6457 | 0.112 |
| 3.4946 | 6458 | 0.1302 |
| 3.4951 | 6459 | 0.0736 |
| 3.4957 | 6460 | 0.0453 |
| 3.4962 | 6461 | 0.0767 |
| 3.4968 | 6462 | 0.0717 |
| 3.4973 | 6463 | 0.0748 |
| 3.4978 | 6464 | 0.1149 |
| 3.4984 | 6465 | 0.1226 |
| 3.4989 | 6466 | 0.0934 |
| 3.4995 | 6467 | 0.1229 |
| 3.5 | 6468 | 0.1196 |
| 3.5005 | 6469 | 0.0762 |
| 3.5011 | 6470 | 0.0628 |
| 3.5016 | 6471 | 0.1231 |
| 3.5022 | 6472 | 0.099 |
| 3.5027 | 6473 | 0.0471 |
| 3.5032 | 6474 | 0.0716 |
| 3.5038 | 6475 | 0.0845 |
| 3.5043 | 6476 | 0.1102 |
| 3.5049 | 6477 | 0.0482 |
| 3.5054 | 6478 | 0.1205 |
| 3.5060 | 6479 | 0.1067 |
| 3.5065 | 6480 | 0.0762 |
| 3.5070 | 6481 | 0.1055 |
| 3.5076 | 6482 | 0.0917 |
| 3.5081 | 6483 | 0.0558 |
| 3.5087 | 6484 | 0.0701 |
| 3.5092 | 6485 | 0.129 |
| 3.5097 | 6486 | 0.0929 |
| 3.5103 | 6487 | 0.0926 |
| 3.5108 | 6488 | 0.1149 |
| 3.5114 | 6489 | 0.0875 |
| 3.5119 | 6490 | 0.0502 |
| 3.5124 | 6491 | 0.0558 |
| 3.5130 | 6492 | 0.0797 |
| 3.5135 | 6493 | 0.0511 |
| 3.5141 | 6494 | 0.0371 |
| 3.5146 | 6495 | 0.0542 |
| 3.5152 | 6496 | 0.0919 |
| 3.5157 | 6497 | 0.1008 |
| 3.5162 | 6498 | 0.0376 |
| 3.5168 | 6499 | 0.0796 |
| 3.5173 | 6500 | 0.0684 |
| 3.5179 | 6501 | 0.0649 |
| 3.5184 | 6502 | 0.088 |
| 3.5189 | 6503 | 0.0794 |
| 3.5195 | 6504 | 0.091 |
| 3.5200 | 6505 | 0.0819 |
| 3.5206 | 6506 | 0.0708 |
| 3.5211 | 6507 | 0.0689 |
| 3.5216 | 6508 | 0.0491 |
| 3.5222 | 6509 | 0.0623 |
| 3.5227 | 6510 | 0.1075 |
| 3.5233 | 6511 | 0.0586 |
| 3.5238 | 6512 | 0.0995 |
| 3.5244 | 6513 | 0.0575 |
| 3.5249 | 6514 | 0.0783 |
| 3.5254 | 6515 | 0.0899 |
| 3.5260 | 6516 | 0.0554 |
| 3.5265 | 6517 | 0.0774 |
| 3.5271 | 6518 | 0.0836 |
| 3.5276 | 6519 | 0.0637 |
| 3.5281 | 6520 | 0.0815 |
| 3.5287 | 6521 | 0.0807 |
| 3.5292 | 6522 | 0.0729 |
| 3.5298 | 6523 | 0.051 |
| 3.5303 | 6524 | 0.0872 |
| 3.5308 | 6525 | 0.0772 |
| 3.5314 | 6526 | 0.0757 |
| 3.5319 | 6527 | 0.0475 |
| 3.5325 | 6528 | 0.103 |
| 3.5330 | 6529 | 0.0661 |
| 3.5335 | 6530 | 0.0757 |
| 3.5341 | 6531 | 0.0797 |
| 3.5346 | 6532 | 0.0848 |
| 3.5352 | 6533 | 0.062 |
| 3.5357 | 6534 | 0.0478 |
| 3.5363 | 6535 | 0.0603 |
| 3.5368 | 6536 | 0.0787 |
| 3.5373 | 6537 | 0.1358 |
| 3.5379 | 6538 | 0.0735 |
| 3.5384 | 6539 | 0.0946 |
| 3.5390 | 6540 | 0.0747 |
| 3.5395 | 6541 | 0.0584 |
| 3.5400 | 6542 | 0.0573 |
| 3.5406 | 6543 | 0.0966 |
| 3.5411 | 6544 | 0.0524 |
| 3.5417 | 6545 | 0.1112 |
| 3.5422 | 6546 | 0.0682 |
| 3.5427 | 6547 | 0.1221 |
| 3.5433 | 6548 | 0.0685 |
| 3.5438 | 6549 | 0.0344 |
| 3.5444 | 6550 | 0.1175 |
| 3.5449 | 6551 | 0.0966 |
| 3.5455 | 6552 | 0.0758 |
| 3.5460 | 6553 | 0.0748 |
| 3.5465 | 6554 | 0.0784 |
| 3.5471 | 6555 | 0.1004 |
| 3.5476 | 6556 | 0.0895 |
| 3.5482 | 6557 | 0.0492 |
| 3.5487 | 6558 | 0.1037 |
| 3.5492 | 6559 | 0.0864 |
| 3.5498 | 6560 | 0.0984 |
| 3.5503 | 6561 | 0.0716 |
| 3.5509 | 6562 | 0.0949 |
| 3.5514 | 6563 | 0.0636 |
| 3.5519 | 6564 | 0.0794 |
| 3.5525 | 6565 | 0.0862 |
| 3.5530 | 6566 | 0.0405 |
| 3.5536 | 6567 | 0.0562 |
| 3.5541 | 6568 | 0.0831 |
| 3.5547 | 6569 | 0.0659 |
| 3.5552 | 6570 | 0.0711 |
| 3.5557 | 6571 | 0.0661 |
| 3.5563 | 6572 | 0.0409 |
| 3.5568 | 6573 | 0.0757 |
| 3.5574 | 6574 | 0.0753 |
| 3.5579 | 6575 | 0.0798 |
| 3.5584 | 6576 | 0.0779 |
| 3.5590 | 6577 | 0.1036 |
| 3.5595 | 6578 | 0.061 |
| 3.5601 | 6579 | 0.0782 |
| 3.5606 | 6580 | 0.0841 |
| 3.5611 | 6581 | 0.0854 |
| 3.5617 | 6582 | 0.1177 |
| 3.5622 | 6583 | 0.0563 |
| 3.5628 | 6584 | 0.1183 |
| 3.5633 | 6585 | 0.1007 |
| 3.5639 | 6586 | 0.0478 |
| 3.5644 | 6587 | 0.1081 |
| 3.5649 | 6588 | 0.1018 |
| 3.5655 | 6589 | 0.108 |
| 3.5660 | 6590 | 0.133 |
| 3.5666 | 6591 | 0.0466 |
| 3.5671 | 6592 | 0.0439 |
| 3.5676 | 6593 | 0.0846 |
| 3.5682 | 6594 | 0.0794 |
| 3.5687 | 6595 | 0.1177 |
| 3.5693 | 6596 | 0.0895 |
| 3.5698 | 6597 | 0.0873 |
| 3.5703 | 6598 | 0.0766 |
| 3.5709 | 6599 | 0.0585 |
| 3.5714 | 6600 | 0.0942 |
| 3.5720 | 6601 | 0.0875 |
| 3.5725 | 6602 | 0.0978 |
| 3.5731 | 6603 | 0.1368 |
| 3.5736 | 6604 | 0.0874 |
| 3.5741 | 6605 | 0.0534 |
| 3.5747 | 6606 | 0.0579 |
| 3.5752 | 6607 | 0.056 |
| 3.5758 | 6608 | 0.0444 |
| 3.5763 | 6609 | 0.0451 |
| 3.5768 | 6610 | 0.0953 |
| 3.5774 | 6611 | 0.0454 |
| 3.5779 | 6612 | 0.0869 |
| 3.5785 | 6613 | 0.0835 |
| 3.5790 | 6614 | 0.1057 |
| 3.5795 | 6615 | 0.1015 |
| 3.5801 | 6616 | 0.0972 |
| 3.5806 | 6617 | 0.0574 |
| 3.5812 | 6618 | 0.0685 |
| 3.5817 | 6619 | 0.1008 |
| 3.5823 | 6620 | 0.0928 |
| 3.5828 | 6621 | 0.0751 |
| 3.5833 | 6622 | 0.1076 |
| 3.5839 | 6623 | 0.0543 |
| 3.5844 | 6624 | 0.0587 |
| 3.5850 | 6625 | 0.0579 |
| 3.5855 | 6626 | 0.0832 |
| 3.5860 | 6627 | 0.0632 |
| 3.5866 | 6628 | 0.0943 |
| 3.5871 | 6629 | 0.091 |
| 3.5877 | 6630 | 0.045 |
| 3.5882 | 6631 | 0.0666 |
| 3.5887 | 6632 | 0.0778 |
| 3.5893 | 6633 | 0.0726 |
| 3.5898 | 6634 | 0.086 |
| 3.5904 | 6635 | 0.0688 |
| 3.5909 | 6636 | 0.0781 |
| 3.5915 | 6637 | 0.0622 |
| 3.5920 | 6638 | 0.0432 |
| 3.5925 | 6639 | 0.0743 |
| 3.5931 | 6640 | 0.0661 |
| 3.5936 | 6641 | 0.0488 |
| 3.5942 | 6642 | 0.1242 |
| 3.5947 | 6643 | 0.0797 |
| 3.5952 | 6644 | 0.0767 |
| 3.5958 | 6645 | 0.0807 |
| 3.5963 | 6646 | 0.0913 |
| 3.5969 | 6647 | 0.0707 |
| 3.5974 | 6648 | 0.0956 |
| 3.5979 | 6649 | 0.0914 |
| 3.5985 | 6650 | 0.0908 |
| 3.5990 | 6651 | 0.0618 |
| 3.5996 | 6652 | 0.0619 |
| 3.6001 | 6653 | 0.0872 |
| 3.6006 | 6654 | 0.042 |
| 3.6012 | 6655 | 0.0565 |
| 3.6017 | 6656 | 0.1089 |
| 3.6023 | 6657 | 0.1007 |
| 3.6028 | 6658 | 0.0777 |
| 3.6034 | 6659 | 0.0865 |
| 3.6039 | 6660 | 0.0562 |
| 3.6044 | 6661 | 0.071 |
| 3.6050 | 6662 | 0.0921 |
| 3.6055 | 6663 | 0.0559 |
| 3.6061 | 6664 | 0.0595 |
| 3.6066 | 6665 | 0.0634 |
| 3.6071 | 6666 | 0.0587 |
| 3.6077 | 6667 | 0.0836 |
| 3.6082 | 6668 | 0.0441 |
| 3.6088 | 6669 | 0.0831 |
| 3.6093 | 6670 | 0.0905 |
| 3.6098 | 6671 | 0.11 |
| 3.6104 | 6672 | 0.0591 |
| 3.6109 | 6673 | 0.0468 |
| 3.6115 | 6674 | 0.0433 |
| 3.6120 | 6675 | 0.103 |
| 3.6126 | 6676 | 0.0945 |
| 3.6131 | 6677 | 0.0522 |
| 3.6136 | 6678 | 0.0542 |
| 3.6142 | 6679 | 0.0526 |
| 3.6147 | 6680 | 0.117 |
| 3.6153 | 6681 | 0.0581 |
| 3.6158 | 6682 | 0.0568 |
| 3.6163 | 6683 | 0.0546 |
| 3.6169 | 6684 | 0.0692 |
| 3.6174 | 6685 | 0.0917 |
| 3.6180 | 6686 | 0.0414 |
| 3.6185 | 6687 | 0.1031 |
| 3.6190 | 6688 | 0.0923 |
| 3.6196 | 6689 | 0.0947 |
| 3.6201 | 6690 | 0.1116 |
| 3.6207 | 6691 | 0.0563 |
| 3.6212 | 6692 | 0.0932 |
| 3.6218 | 6693 | 0.0804 |
| 3.6223 | 6694 | 0.0807 |
| 3.6228 | 6695 | 0.1089 |
| 3.6234 | 6696 | 0.0767 |
| 3.6239 | 6697 | 0.1199 |
| 3.6245 | 6698 | 0.0818 |
| 3.625 | 6699 | 0.0821 |
| 3.6255 | 6700 | 0.0772 |
| 3.6261 | 6701 | 0.0689 |
| 3.6266 | 6702 | 0.0866 |
| 3.6272 | 6703 | 0.1355 |
| 3.6277 | 6704 | 0.0857 |
| 3.6282 | 6705 | 0.0611 |
| 3.6288 | 6706 | 0.1196 |
| 3.6293 | 6707 | 0.0573 |
| 3.6299 | 6708 | 0.1027 |
| 3.6304 | 6709 | 0.0828 |
| 3.6310 | 6710 | 0.0835 |
| 3.6315 | 6711 | 0.1233 |
| 3.6320 | 6712 | 0.0757 |
| 3.6326 | 6713 | 0.0826 |
| 3.6331 | 6714 | 0.0528 |
| 3.6337 | 6715 | 0.1428 |
| 3.6342 | 6716 | 0.0861 |
| 3.6347 | 6717 | 0.0513 |
| 3.6353 | 6718 | 0.1151 |
| 3.6358 | 6719 | 0.0859 |
| 3.6364 | 6720 | 0.0829 |
| 3.6369 | 6721 | 0.0777 |
| 3.6374 | 6722 | 0.0796 |
| 3.6380 | 6723 | 0.1588 |
| 3.6385 | 6724 | 0.0796 |
| 3.6391 | 6725 | 0.1127 |
| 3.6396 | 6726 | 0.085 |
| 3.6402 | 6727 | 0.1019 |
| 3.6407 | 6728 | 0.0418 |
| 3.6412 | 6729 | 0.0784 |
| 3.6418 | 6730 | 0.0586 |
| 3.6423 | 6731 | 0.1109 |
| 3.6429 | 6732 | 0.1014 |
| 3.6434 | 6733 | 0.0775 |
| 3.6439 | 6734 | 0.0385 |
| 3.6445 | 6735 | 0.0514 |
| 3.6450 | 6736 | 0.0963 |
| 3.6456 | 6737 | 0.0604 |
| 3.6461 | 6738 | 0.1048 |
| 3.6466 | 6739 | 0.1063 |
| 3.6472 | 6740 | 0.0495 |
| 3.6477 | 6741 | 0.0599 |
| 3.6483 | 6742 | 0.0857 |
| 3.6488 | 6743 | 0.0538 |
| 3.6494 | 6744 | 0.0685 |
| 3.6499 | 6745 | 0.0412 |
| 3.6504 | 6746 | 0.0878 |
| 3.6510 | 6747 | 0.0377 |
| 3.6515 | 6748 | 0.067 |
| 3.6521 | 6749 | 0.054 |
| 3.6526 | 6750 | 0.0699 |
| 3.6531 | 6751 | 0.0887 |
| 3.6537 | 6752 | 0.0984 |
| 3.6542 | 6753 | 0.0528 |
| 3.6548 | 6754 | 0.0908 |
| 3.6553 | 6755 | 0.0458 |
| 3.6558 | 6756 | 0.0928 |
| 3.6564 | 6757 | 0.0633 |
| 3.6569 | 6758 | 0.0916 |
| 3.6575 | 6759 | 0.0601 |
| 3.6580 | 6760 | 0.0585 |
| 3.6585 | 6761 | 0.1386 |
| 3.6591 | 6762 | 0.0573 |
| 3.6596 | 6763 | 0.1154 |
| 3.6602 | 6764 | 0.1217 |
| 3.6607 | 6765 | 0.0813 |
| 3.6613 | 6766 | 0.0793 |
| 3.6618 | 6767 | 0.06 |
| 3.6623 | 6768 | 0.0574 |
| 3.6629 | 6769 | 0.0614 |
| 3.6634 | 6770 | 0.1372 |
| 3.6640 | 6771 | 0.1197 |
| 3.6645 | 6772 | 0.0757 |
| 3.6650 | 6773 | 0.0809 |
| 3.6656 | 6774 | 0.0752 |
| 3.6661 | 6775 | 0.0854 |
| 3.6667 | 6776 | 0.0478 |
| 3.6672 | 6777 | 0.0494 |
| 3.6677 | 6778 | 0.1226 |
| 3.6683 | 6779 | 0.1519 |
| 3.6688 | 6780 | 0.0903 |
| 3.6694 | 6781 | 0.0391 |
| 3.6699 | 6782 | 0.061 |
| 3.6705 | 6783 | 0.0574 |
| 3.6710 | 6784 | 0.1153 |
| 3.6715 | 6785 | 0.0992 |
| 3.6721 | 6786 | 0.1009 |
| 3.6726 | 6787 | 0.084 |
| 3.6732 | 6788 | 0.0965 |
| 3.6737 | 6789 | 0.0668 |
| 3.6742 | 6790 | 0.0878 |
| 3.6748 | 6791 | 0.1087 |
| 3.6753 | 6792 | 0.0796 |
| 3.6759 | 6793 | 0.0585 |
| 3.6764 | 6794 | 0.0613 |
| 3.6769 | 6795 | 0.0678 |
| 3.6775 | 6796 | 0.0738 |
| 3.6780 | 6797 | 0.0503 |
| 3.6786 | 6798 | 0.1108 |
| 3.6791 | 6799 | 0.0672 |
| 3.6797 | 6800 | 0.0453 |
| 3.6802 | 6801 | 0.0774 |
| 3.6807 | 6802 | 0.0615 |
| 3.6813 | 6803 | 0.0363 |
| 3.6818 | 6804 | 0.0623 |
| 3.6824 | 6805 | 0.1375 |
| 3.6829 | 6806 | 0.0444 |
| 3.6834 | 6807 | 0.042 |
| 3.6840 | 6808 | 0.1015 |
| 3.6845 | 6809 | 0.0699 |
| 3.6851 | 6810 | 0.0633 |
| 3.6856 | 6811 | 0.0865 |
| 3.6861 | 6812 | 0.0567 |
| 3.6867 | 6813 | 0.0447 |
| 3.6872 | 6814 | 0.0752 |
| 3.6878 | 6815 | 0.1205 |
| 3.6883 | 6816 | 0.0935 |
| 3.6889 | 6817 | 0.0898 |
| 3.6894 | 6818 | 0.1117 |
| 3.6899 | 6819 | 0.1084 |
| 3.6905 | 6820 | 0.0976 |
| 3.6910 | 6821 | 0.1205 |
| 3.6916 | 6822 | 0.0676 |
| 3.6921 | 6823 | 0.0679 |
| 3.6926 | 6824 | 0.0875 |
| 3.6932 | 6825 | 0.064 |
| 3.6937 | 6826 | 0.1152 |
| 3.6943 | 6827 | 0.0671 |
| 3.6948 | 6828 | 0.0623 |
| 3.6953 | 6829 | 0.0921 |
| 3.6959 | 6830 | 0.0478 |
| 3.6964 | 6831 | 0.1391 |
| 3.6970 | 6832 | 0.0405 |
| 3.6975 | 6833 | 0.0296 |
| 3.6981 | 6834 | 0.091 |
| 3.6986 | 6835 | 0.0743 |
| 3.6991 | 6836 | 0.0942 |
| 3.6997 | 6837 | 0.0866 |
| 3.7002 | 6838 | 0.0773 |
| 3.7008 | 6839 | 0.1096 |
| 3.7013 | 6840 | 0.0503 |
| 3.7018 | 6841 | 0.0931 |
| 3.7024 | 6842 | 0.1113 |
| 3.7029 | 6843 | 0.0799 |
| 3.7035 | 6844 | 0.0524 |
| 3.7040 | 6845 | 0.0725 |
| 3.7045 | 6846 | 0.0784 |
| 3.7051 | 6847 | 0.0576 |
| 3.7056 | 6848 | 0.0776 |
| 3.7062 | 6849 | 0.0453 |
| 3.7067 | 6850 | 0.0562 |
| 3.7073 | 6851 | 0.0766 |
| 3.7078 | 6852 | 0.097 |
| 3.7083 | 6853 | 0.0702 |
| 3.7089 | 6854 | 0.087 |
| 3.7094 | 6855 | 0.1386 |
| 3.7100 | 6856 | 0.0715 |
| 3.7105 | 6857 | 0.1048 |
| 3.7110 | 6858 | 0.0924 |
| 3.7116 | 6859 | 0.1138 |
| 3.7121 | 6860 | 0.0849 |
| 3.7127 | 6861 | 0.0945 |
| 3.7132 | 6862 | 0.0439 |
| 3.7137 | 6863 | 0.0823 |
| 3.7143 | 6864 | 0.0629 |
| 3.7148 | 6865 | 0.1201 |
| 3.7154 | 6866 | 0.0527 |
| 3.7159 | 6867 | 0.082 |
| 3.7165 | 6868 | 0.1205 |
| 3.7170 | 6869 | 0.0587 |
| 3.7175 | 6870 | 0.0547 |
| 3.7181 | 6871 | 0.1173 |
| 3.7186 | 6872 | 0.0732 |
| 3.7192 | 6873 | 0.0846 |
| 3.7197 | 6874 | 0.1062 |
| 3.7202 | 6875 | 0.0596 |
| 3.7208 | 6876 | 0.0915 |
| 3.7213 | 6877 | 0.0508 |
| 3.7219 | 6878 | 0.1006 |
| 3.7224 | 6879 | 0.0497 |
| 3.7229 | 6880 | 0.0404 |
| 3.7235 | 6881 | 0.1061 |
| 3.7240 | 6882 | 0.0603 |
| 3.7246 | 6883 | 0.0706 |
| 3.7251 | 6884 | 0.0646 |
| 3.7256 | 6885 | 0.0778 |
| 3.7262 | 6886 | 0.1232 |
| 3.7267 | 6887 | 0.0953 |
| 3.7273 | 6888 | 0.0913 |
| 3.7278 | 6889 | 0.112 |
| 3.7284 | 6890 | 0.1389 |
| 3.7289 | 6891 | 0.0829 |
| 3.7294 | 6892 | 0.0758 |
| 3.7300 | 6893 | 0.0704 |
| 3.7305 | 6894 | 0.0478 |
| 3.7311 | 6895 | 0.0725 |
| 3.7316 | 6896 | 0.0576 |
| 3.7321 | 6897 | 0.079 |
| 3.7327 | 6898 | 0.0692 |
| 3.7332 | 6899 | 0.0525 |
| 3.7338 | 6900 | 0.1433 |
| 3.7343 | 6901 | 0.0688 |
| 3.7348 | 6902 | 0.0526 |
| 3.7354 | 6903 | 0.1126 |
| 3.7359 | 6904 | 0.078 |
| 3.7365 | 6905 | 0.0993 |
| 3.7370 | 6906 | 0.066 |
| 3.7376 | 6907 | 0.055 |
| 3.7381 | 6908 | 0.0617 |
| 3.7386 | 6909 | 0.1177 |
| 3.7392 | 6910 | 0.1043 |
| 3.7397 | 6911 | 0.0492 |
| 3.7403 | 6912 | 0.0925 |
| 3.7408 | 6913 | 0.0713 |
| 3.7413 | 6914 | 0.0904 |
| 3.7419 | 6915 | 0.1371 |
| 3.7424 | 6916 | 0.071 |
| 3.7430 | 6917 | 0.0673 |
| 3.7435 | 6918 | 0.0767 |
| 3.7440 | 6919 | 0.0817 |
| 3.7446 | 6920 | 0.0746 |
| 3.7451 | 6921 | 0.1003 |
| 3.7457 | 6922 | 0.0471 |
| 3.7462 | 6923 | 0.1016 |
| 3.7468 | 6924 | 0.0674 |
| 3.7473 | 6925 | 0.1127 |
| 3.7478 | 6926 | 0.1037 |
| 3.7484 | 6927 | 0.1104 |
| 3.7489 | 6928 | 0.1036 |
| 3.7495 | 6929 | 0.0432 |
| 3.75 | 6930 | 0.0702 |
| 3.7505 | 6931 | 0.0586 |
| 3.7511 | 6932 | 0.0681 |
| 3.7516 | 6933 | 0.1362 |
| 3.7522 | 6934 | 0.0869 |
| 3.7527 | 6935 | 0.0664 |
| 3.7532 | 6936 | 0.0691 |
| 3.7538 | 6937 | 0.1069 |
| 3.7543 | 6938 | 0.0805 |
| 3.7549 | 6939 | 0.062 |
| 3.7554 | 6940 | 0.1161 |
| 3.7560 | 6941 | 0.1195 |
| 3.7565 | 6942 | 0.0726 |
| 3.7570 | 6943 | 0.0859 |
| 3.7576 | 6944 | 0.0642 |
| 3.7581 | 6945 | 0.0544 |
| 3.7587 | 6946 | 0.0823 |
| 3.7592 | 6947 | 0.1038 |
| 3.7597 | 6948 | 0.0808 |
| 3.7603 | 6949 | 0.1158 |
| 3.7608 | 6950 | 0.1553 |
| 3.7614 | 6951 | 0.0697 |
| 3.7619 | 6952 | 0.0552 |
| 3.7624 | 6953 | 0.0834 |
| 3.7630 | 6954 | 0.0684 |
| 3.7635 | 6955 | 0.1239 |
| 3.7641 | 6956 | 0.0725 |
| 3.7646 | 6957 | 0.0546 |
| 3.7652 | 6958 | 0.045 |
| 3.7657 | 6959 | 0.0442 |
| 3.7662 | 6960 | 0.1138 |
| 3.7668 | 6961 | 0.0587 |
| 3.7673 | 6962 | 0.0908 |
| 3.7679 | 6963 | 0.0901 |
| 3.7684 | 6964 | 0.1262 |
| 3.7689 | 6965 | 0.0433 |
| 3.7695 | 6966 | 0.1006 |
| 3.7700 | 6967 | 0.0958 |
| 3.7706 | 6968 | 0.0833 |
| 3.7711 | 6969 | 0.1019 |
| 3.7716 | 6970 | 0.0753 |
| 3.7722 | 6971 | 0.1387 |
| 3.7727 | 6972 | 0.1131 |
| 3.7733 | 6973 | 0.1168 |
| 3.7738 | 6974 | 0.0893 |
| 3.7744 | 6975 | 0.0805 |
| 3.7749 | 6976 | 0.0852 |
| 3.7754 | 6977 | 0.1001 |
| 3.7760 | 6978 | 0.0773 |
| 3.7765 | 6979 | 0.1036 |
| 3.7771 | 6980 | 0.0398 |
| 3.7776 | 6981 | 0.1099 |
| 3.7781 | 6982 | 0.1068 |
| 3.7787 | 6983 | 0.1014 |
| 3.7792 | 6984 | 0.0367 |
| 3.7798 | 6985 | 0.1289 |
| 3.7803 | 6986 | 0.142 |
| 3.7808 | 6987 | 0.0663 |
| 3.7814 | 6988 | 0.0804 |
| 3.7819 | 6989 | 0.0873 |
| 3.7825 | 6990 | 0.1094 |
| 3.7830 | 6991 | 0.0943 |
| 3.7835 | 6992 | 0.1411 |
| 3.7841 | 6993 | 0.0637 |
| 3.7846 | 6994 | 0.0833 |
| 3.7852 | 6995 | 0.1142 |
| 3.7857 | 6996 | 0.1247 |
| 3.7863 | 6997 | 0.0528 |
| 3.7868 | 6998 | 0.053 |
| 3.7873 | 6999 | 0.122 |
| 3.7879 | 7000 | 0.0807 |
| 3.7884 | 7001 | 0.0805 |
| 3.7890 | 7002 | 0.0731 |
| 3.7895 | 7003 | 0.0604 |
| 3.7900 | 7004 | 0.0708 |
| 3.7906 | 7005 | 0.1088 |
| 3.7911 | 7006 | 0.0788 |
| 3.7917 | 7007 | 0.0592 |
| 3.7922 | 7008 | 0.0949 |
| 3.7927 | 7009 | 0.0971 |
| 3.7933 | 7010 | 0.1494 |
| 3.7938 | 7011 | 0.0812 |
| 3.7944 | 7012 | 0.0781 |
| 3.7949 | 7013 | 0.0701 |
| 3.7955 | 7014 | 0.0222 |
| 3.7960 | 7015 | 0.0663 |
| 3.7965 | 7016 | 0.0493 |
| 3.7971 | 7017 | 0.0679 |
| 3.7976 | 7018 | 0.1134 |
| 3.7982 | 7019 | 0.0684 |
| 3.7987 | 7020 | 0.0689 |
| 3.7992 | 7021 | 0.1281 |
| 3.7998 | 7022 | 0.0652 |
| 3.8003 | 7023 | 0.0432 |
| 3.8009 | 7024 | 0.0598 |
| 3.8014 | 7025 | 0.0676 |
| 3.8019 | 7026 | 0.0699 |
| 3.8025 | 7027 | 0.0903 |
| 3.8030 | 7028 | 0.0564 |
| 3.8036 | 7029 | 0.0686 |
| 3.8041 | 7030 | 0.0691 |
| 3.8047 | 7031 | 0.0698 |
| 3.8052 | 7032 | 0.0572 |
| 3.8057 | 7033 | 0.0538 |
| 3.8063 | 7034 | 0.0529 |
| 3.8068 | 7035 | 0.087 |
| 3.8074 | 7036 | 0.0977 |
| 3.8079 | 7037 | 0.0438 |
| 3.8084 | 7038 | 0.1055 |
| 3.8090 | 7039 | 0.0938 |
| 3.8095 | 7040 | 0.054 |
| 3.8101 | 7041 | 0.0837 |
| 3.8106 | 7042 | 0.0776 |
| 3.8111 | 7043 | 0.0959 |
| 3.8117 | 7044 | 0.125 |
| 3.8122 | 7045 | 0.0757 |
| 3.8128 | 7046 | 0.0639 |
| 3.8133 | 7047 | 0.0836 |
| 3.8139 | 7048 | 0.0695 |
| 3.8144 | 7049 | 0.0514 |
| 3.8149 | 7050 | 0.0768 |
| 3.8155 | 7051 | 0.1031 |
| 3.8160 | 7052 | 0.0739 |
| 3.8166 | 7053 | 0.0649 |
| 3.8171 | 7054 | 0.0866 |
| 3.8176 | 7055 | 0.0958 |
| 3.8182 | 7056 | 0.0735 |
| 3.8187 | 7057 | 0.0877 |
| 3.8193 | 7058 | 0.1499 |
| 3.8198 | 7059 | 0.0446 |
| 3.8203 | 7060 | 0.0955 |
| 3.8209 | 7061 | 0.1489 |
| 3.8214 | 7062 | 0.0651 |
| 3.8220 | 7063 | 0.0896 |
| 3.8225 | 7064 | 0.0794 |
| 3.8231 | 7065 | 0.0981 |
| 3.8236 | 7066 | 0.0788 |
| 3.8241 | 7067 | 0.093 |
| 3.8247 | 7068 | 0.1141 |
| 3.8252 | 7069 | 0.0688 |
| 3.8258 | 7070 | 0.1023 |
| 3.8263 | 7071 | 0.105 |
| 3.8268 | 7072 | 0.0939 |
| 3.8274 | 7073 | 0.0782 |
| 3.8279 | 7074 | 0.0983 |
| 3.8285 | 7075 | 0.0496 |
| 3.8290 | 7076 | 0.0586 |
| 3.8295 | 7077 | 0.112 |
| 3.8301 | 7078 | 0.0844 |
| 3.8306 | 7079 | 0.0483 |
| 3.8312 | 7080 | 0.0703 |
| 3.8317 | 7081 | 0.12 |
| 3.8323 | 7082 | 0.091 |
| 3.8328 | 7083 | 0.054 |
| 3.8333 | 7084 | 0.091 |
| 3.8339 | 7085 | 0.0828 |
| 3.8344 | 7086 | 0.0709 |
| 3.8350 | 7087 | 0.0291 |
| 3.8355 | 7088 | 0.0927 |
| 3.8360 | 7089 | 0.0776 |
| 3.8366 | 7090 | 0.0668 |
| 3.8371 | 7091 | 0.073 |
| 3.8377 | 7092 | 0.1182 |
| 3.8382 | 7093 | 0.0888 |
| 3.8387 | 7094 | 0.0968 |
| 3.8393 | 7095 | 0.1082 |
| 3.8398 | 7096 | 0.0823 |
| 3.8404 | 7097 | 0.0744 |
| 3.8409 | 7098 | 0.0639 |
| 3.8415 | 7099 | 0.0479 |
| 3.8420 | 7100 | 0.0709 |
| 3.8425 | 7101 | 0.047 |
| 3.8431 | 7102 | 0.0797 |
| 3.8436 | 7103 | 0.1295 |
| 3.8442 | 7104 | 0.1142 |
| 3.8447 | 7105 | 0.0735 |
| 3.8452 | 7106 | 0.0528 |
| 3.8458 | 7107 | 0.0865 |
| 3.8463 | 7108 | 0.0514 |
| 3.8469 | 7109 | 0.0612 |
| 3.8474 | 7110 | 0.0859 |
| 3.8479 | 7111 | 0.1079 |
| 3.8485 | 7112 | 0.0763 |
| 3.8490 | 7113 | 0.0923 |
| 3.8496 | 7114 | 0.0493 |
| 3.8501 | 7115 | 0.0944 |
| 3.8506 | 7116 | 0.0543 |
| 3.8512 | 7117 | 0.0878 |
| 3.8517 | 7118 | 0.0945 |
| 3.8523 | 7119 | 0.0496 |
| 3.8528 | 7120 | 0.1474 |
| 3.8534 | 7121 | 0.1175 |
| 3.8539 | 7122 | 0.0752 |
| 3.8544 | 7123 | 0.0697 |
| 3.8550 | 7124 | 0.0461 |
| 3.8555 | 7125 | 0.0569 |
| 3.8561 | 7126 | 0.0921 |
| 3.8566 | 7127 | 0.0842 |
| 3.8571 | 7128 | 0.0655 |
| 3.8577 | 7129 | 0.1065 |
| 3.8582 | 7130 | 0.1381 |
| 3.8588 | 7131 | 0.0568 |
| 3.8593 | 7132 | 0.0654 |
| 3.8598 | 7133 | 0.0667 |
| 3.8604 | 7134 | 0.0658 |
| 3.8609 | 7135 | 0.0746 |
| 3.8615 | 7136 | 0.1176 |
| 3.8620 | 7137 | 0.0769 |
| 3.8626 | 7138 | 0.1355 |
| 3.8631 | 7139 | 0.0484 |
| 3.8636 | 7140 | 0.1565 |
| 3.8642 | 7141 | 0.0548 |
| 3.8647 | 7142 | 0.0729 |
| 3.8653 | 7143 | 0.0927 |
| 3.8658 | 7144 | 0.0633 |
| 3.8663 | 7145 | 0.0614 |
| 3.8669 | 7146 | 0.0824 |
| 3.8674 | 7147 | 0.0917 |
| 3.8680 | 7148 | 0.0754 |
| 3.8685 | 7149 | 0.0855 |
| 3.8690 | 7150 | 0.0886 |
| 3.8696 | 7151 | 0.053 |
| 3.8701 | 7152 | 0.0773 |
| 3.8707 | 7153 | 0.0692 |
| 3.8712 | 7154 | 0.0551 |
| 3.8718 | 7155 | 0.0916 |
| 3.8723 | 7156 | 0.0691 |
| 3.8728 | 7157 | 0.1079 |
| 3.8734 | 7158 | 0.0974 |
| 3.8739 | 7159 | 0.1332 |
| 3.8745 | 7160 | 0.075 |
| 3.875 | 7161 | 0.0732 |
| 3.8755 | 7162 | 0.0933 |
| 3.8761 | 7163 | 0.0821 |
| 3.8766 | 7164 | 0.0494 |
| 3.8772 | 7165 | 0.0836 |
| 3.8777 | 7166 | 0.1249 |
| 3.8782 | 7167 | 0.0589 |
| 3.8788 | 7168 | 0.1348 |
| 3.8793 | 7169 | 0.0727 |
| 3.8799 | 7170 | 0.0668 |
| 3.8804 | 7171 | 0.0778 |
| 3.8810 | 7172 | 0.0646 |
| 3.8815 | 7173 | 0.095 |
| 3.8820 | 7174 | 0.0471 |
| 3.8826 | 7175 | 0.1364 |
| 3.8831 | 7176 | 0.1098 |
| 3.8837 | 7177 | 0.0536 |
| 3.8842 | 7178 | 0.0655 |
| 3.8847 | 7179 | 0.0664 |
| 3.8853 | 7180 | 0.0786 |
| 3.8858 | 7181 | 0.0742 |
| 3.8864 | 7182 | 0.1047 |
| 3.8869 | 7183 | 0.1576 |
| 3.8874 | 7184 | 0.0942 |
| 3.8880 | 7185 | 0.0801 |
| 3.8885 | 7186 | 0.0672 |
| 3.8891 | 7187 | 0.0822 |
| 3.8896 | 7188 | 0.0637 |
| 3.8902 | 7189 | 0.0609 |
| 3.8907 | 7190 | 0.0902 |
| 3.8912 | 7191 | 0.083 |
| 3.8918 | 7192 | 0.0741 |
| 3.8923 | 7193 | 0.0566 |
| 3.8929 | 7194 | 0.079 |
| 3.8934 | 7195 | 0.0651 |
| 3.8939 | 7196 | 0.0818 |
| 3.8945 | 7197 | 0.0903 |
| 3.8950 | 7198 | 0.0557 |
| 3.8956 | 7199 | 0.1063 |
| 3.8961 | 7200 | 0.1045 |
| 3.8966 | 7201 | 0.1198 |
| 3.8972 | 7202 | 0.0809 |
| 3.8977 | 7203 | 0.0612 |
| 3.8983 | 7204 | 0.0759 |
| 3.8988 | 7205 | 0.0919 |
| 3.8994 | 7206 | 0.0478 |
| 3.8999 | 7207 | 0.0735 |
| 3.9004 | 7208 | 0.0906 |
| 3.9010 | 7209 | 0.0984 |
| 3.9015 | 7210 | 0.2369 |
| 3.9021 | 7211 | 0.0706 |
| 3.9026 | 7212 | 0.0894 |
| 3.9031 | 7213 | 0.0595 |
| 3.9037 | 7214 | 0.1209 |
| 3.9042 | 7215 | 0.0553 |
| 3.9048 | 7216 | 0.0468 |
| 3.9053 | 7217 | 0.0595 |
| 3.9058 | 7218 | 0.0584 |
| 3.9064 | 7219 | 0.0729 |
| 3.9069 | 7220 | 0.0632 |
| 3.9075 | 7221 | 0.0924 |
| 3.9080 | 7222 | 0.0539 |
| 3.9085 | 7223 | 0.0903 |
| 3.9091 | 7224 | 0.0746 |
| 3.9096 | 7225 | 0.1437 |
| 3.9102 | 7226 | 0.0474 |
| 3.9107 | 7227 | 0.0569 |
| 3.9113 | 7228 | 0.1028 |
| 3.9118 | 7229 | 0.0808 |
| 3.9123 | 7230 | 0.0912 |
| 3.9129 | 7231 | 0.051 |
| 3.9134 | 7232 | 0.0493 |
| 3.9140 | 7233 | 0.083 |
| 3.9145 | 7234 | 0.0628 |
| 3.9150 | 7235 | 0.084 |
| 3.9156 | 7236 | 0.1243 |
| 3.9161 | 7237 | 0.0987 |
| 3.9167 | 7238 | 0.0673 |
| 3.9172 | 7239 | 0.0767 |
| 3.9177 | 7240 | 0.0944 |
| 3.9183 | 7241 | 0.1025 |
| 3.9188 | 7242 | 0.09 |
| 3.9194 | 7243 | 0.0911 |
| 3.9199 | 7244 | 0.0485 |
| 3.9205 | 7245 | 0.1163 |
| 3.9210 | 7246 | 0.0895 |
| 3.9215 | 7247 | 0.0846 |
| 3.9221 | 7248 | 0.0516 |
| 3.9226 | 7249 | 0.0733 |
| 3.9232 | 7250 | 0.0853 |
| 3.9237 | 7251 | 0.1629 |
| 3.9242 | 7252 | 0.0723 |
| 3.9248 | 7253 | 0.0399 |
| 3.9253 | 7254 | 0.056 |
| 3.9259 | 7255 | 0.1223 |
| 3.9264 | 7256 | 0.0571 |
| 3.9269 | 7257 | 0.0934 |
| 3.9275 | 7258 | 0.0635 |
| 3.9280 | 7259 | 0.088 |
| 3.9286 | 7260 | 0.1038 |
| 3.9291 | 7261 | 0.0647 |
| 3.9297 | 7262 | 0.0567 |
| 3.9302 | 7263 | 0.109 |
| 3.9307 | 7264 | 0.065 |
| 3.9313 | 7265 | 0.1029 |
| 3.9318 | 7266 | 0.0755 |
| 3.9324 | 7267 | 0.1027 |
| 3.9329 | 7268 | 0.0922 |
| 3.9334 | 7269 | 0.0789 |
| 3.9340 | 7270 | 0.0621 |
| 3.9345 | 7271 | 0.0912 |
| 3.9351 | 7272 | 0.0796 |
| 3.9356 | 7273 | 0.1012 |
| 3.9361 | 7274 | 0.0679 |
| 3.9367 | 7275 | 0.0795 |
| 3.9372 | 7276 | 0.1389 |
| 3.9378 | 7277 | 0.0375 |
| 3.9383 | 7278 | 0.0544 |
| 3.9389 | 7279 | 0.0721 |
| 3.9394 | 7280 | 0.1036 |
| 3.9399 | 7281 | 0.0576 |
| 3.9405 | 7282 | 0.0756 |
| 3.9410 | 7283 | 0.068 |
| 3.9416 | 7284 | 0.046 |
| 3.9421 | 7285 | 0.049 |
| 3.9426 | 7286 | 0.0963 |
| 3.9432 | 7287 | 0.0495 |
| 3.9437 | 7288 | 0.0825 |
| 3.9443 | 7289 | 0.1202 |
| 3.9448 | 7290 | 0.0792 |
| 3.9453 | 7291 | 0.068 |
| 3.9459 | 7292 | 0.0884 |
| 3.9464 | 7293 | 0.1017 |
| 3.9470 | 7294 | 0.0643 |
| 3.9475 | 7295 | 0.0674 |
| 3.9481 | 7296 | 0.0997 |
| 3.9486 | 7297 | 0.077 |
| 3.9491 | 7298 | 0.0981 |
| 3.9497 | 7299 | 0.087 |
| 3.9502 | 7300 | 0.0801 |
| 3.9508 | 7301 | 0.1066 |
| 3.9513 | 7302 | 0.0954 |
| 3.9518 | 7303 | 0.0787 |
| 3.9524 | 7304 | 0.0875 |
| 3.9529 | 7305 | 0.0879 |
| 3.9535 | 7306 | 0.0929 |
| 3.9540 | 7307 | 0.1194 |
| 3.9545 | 7308 | 0.0541 |
| 3.9551 | 7309 | 0.1337 |
| 3.9556 | 7310 | 0.1221 |
| 3.9562 | 7311 | 0.1045 |
| 3.9567 | 7312 | 0.0462 |
| 3.9573 | 7313 | 0.0705 |
| 3.9578 | 7314 | 0.0972 |
| 3.9583 | 7315 | 0.0634 |
| 3.9589 | 7316 | 0.0435 |
| 3.9594 | 7317 | 0.0857 |
| 3.9600 | 7318 | 0.0943 |
| 3.9605 | 7319 | 0.0745 |
| 3.9610 | 7320 | 0.0476 |
| 3.9616 | 7321 | 0.0767 |
| 3.9621 | 7322 | 0.0677 |
| 3.9627 | 7323 | 0.0716 |
| 3.9632 | 7324 | 0.0988 |
| 3.9637 | 7325 | 0.0628 |
| 3.9643 | 7326 | 0.0952 |
| 3.9648 | 7327 | 0.053 |
| 3.9654 | 7328 | 0.0646 |
| 3.9659 | 7329 | 0.0985 |
| 3.9665 | 7330 | 0.0743 |
| 3.9670 | 7331 | 0.0575 |
| 3.9675 | 7332 | 0.0758 |
| 3.9681 | 7333 | 0.0646 |
| 3.9686 | 7334 | 0.0627 |
| 3.9692 | 7335 | 0.0792 |
| 3.9697 | 7336 | 0.0849 |
| 3.9702 | 7337 | 0.05 |
| 3.9708 | 7338 | 0.0699 |
| 3.9713 | 7339 | 0.1229 |
| 3.9719 | 7340 | 0.1182 |
| 3.9724 | 7341 | 0.0715 |
| 3.9729 | 7342 | 0.0729 |
| 3.9735 | 7343 | 0.175 |
| 3.9740 | 7344 | 0.0532 |
| 3.9746 | 7345 | 0.0738 |
| 3.9751 | 7346 | 0.0356 |
| 3.9756 | 7347 | 0.0515 |
| 3.9762 | 7348 | 0.071 |
| 3.9767 | 7349 | 0.039 |
| 3.9773 | 7350 | 0.1067 |
| 3.9778 | 7351 | 0.0671 |
| 3.9784 | 7352 | 0.0427 |
| 3.9789 | 7353 | 0.0638 |
| 3.9794 | 7354 | 0.0636 |
| 3.9800 | 7355 | 0.0582 |
| 3.9805 | 7356 | 0.0913 |
| 3.9811 | 7357 | 0.0564 |
| 3.9816 | 7358 | 0.1167 |
| 3.9821 | 7359 | 0.1246 |
| 3.9827 | 7360 | 0.0779 |
| 3.9832 | 7361 | 0.0889 |
| 3.9838 | 7362 | 0.0911 |
| 3.9843 | 7363 | 0.0903 |
| 3.9848 | 7364 | 0.0716 |
| 3.9854 | 7365 | 0.0554 |
| 3.9859 | 7366 | 0.0503 |
| 3.9865 | 7367 | 0.1016 |
| 3.9870 | 7368 | 0.0779 |
| 3.9876 | 7369 | 0.0539 |
| 3.9881 | 7370 | 0.0759 |
| 3.9886 | 7371 | 0.1104 |
| 3.9892 | 7372 | 0.048 |
| 3.9897 | 7373 | 0.0625 |
| 3.9903 | 7374 | 0.0787 |
| 3.9908 | 7375 | 0.0569 |
| 3.9913 | 7376 | 0.0452 |
| 3.9919 | 7377 | 0.0769 |
| 3.9924 | 7378 | 0.0608 |
| 3.9930 | 7379 | 0.0835 |
| 3.9935 | 7380 | 0.082 |
| 3.9940 | 7381 | 0.0758 |
| 3.9946 | 7382 | 0.1428 |
| 3.9951 | 7383 | 0.0454 |
| 3.9957 | 7384 | 0.1177 |
| 3.9962 | 7385 | 0.0945 |
| 3.9968 | 7386 | 0.0713 |
| 3.9973 | 7387 | 0.0661 |
| 3.9978 | 7388 | 0.1428 |
| 3.9984 | 7389 | 0.0698 |
| 3.9989 | 7390 | 0.0699 |
| 3.9995 | 7391 | 0.0872 |
| 4.0 | 7392 | 0.0639 |
</details>
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.2.1
- Transformers: 4.44.2
- PyTorch: 2.3.1+cu121
- Accelerate: 1.1.1
- Datasets: 2.21.0
- Tokenizers: 0.19.1
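To check that a local environment matches these versions before loading the model, a small script like the following can help (a minimal sketch; it assumes the pip distribution names shown above):
```python
# Compare installed package versions against the ones listed in this card.
from importlib.metadata import PackageNotFoundError, version

expected = {
    "sentence-transformers": "3.2.1",
    "transformers": "4.44.2",
    "torch": "2.3.1+cu121",
    "accelerate": "1.1.1",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}

for package, wanted in expected.items():
    try:
        installed = version(package)
    except PackageNotFoundError:
        installed = "not installed"
    print(f"{package}: installed={installed}, card={wanted}")
```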
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
mezzihoussem/CHatFocus | mezzihoussem | 2024-11-13T09:07:31Z | 9 | 0 | transformers | [
"transformers",
"gguf",
"llama",
"text-generation-inference",
"unsloth",
"en",
"base_model:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"base_model:quantized:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-13T09:06:10Z | ---
base_model: unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- gguf
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** mezzihoussem
- **License:** apache-2.0
- **Finetuned from model :** unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
FrancescoBuda/Llama-ICD-coder-3B-merged-1ep | FrancescoBuda | 2024-11-13T09:04:30Z | 122 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"conversational",
"en",
"base_model:unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
"base_model:finetune:unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T09:01:08Z | ---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
base_model: unsloth/Llama-3.2-3B-Instruct-bnb-4bit
---
# Uploaded model
- **Developed by:** FrancescoBuda
- **License:** apache-2.0
- **Finetuned from model :** unsloth/Llama-3.2-3B-Instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
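A minimal chat-style inference sketch with the standard `transformers` API is shown below; the prompt and generation settings are placeholders, not values recommended by the author.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FrancescoBuda/Llama-ICD-coder-3B-merged-1ep"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

# Hypothetical clinical note; the exact prompt format used during fine-tuning is not documented here.
messages = [{"role": "user", "content": "Assign ICD codes to: patient presents with acute bronchitis and persistent cough."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```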
|
crocutacrocuto/convnext-base-224-MEGbis-5 | crocutacrocuto | 2024-11-13T09:04:13Z | 191 | 0 | transformers | [
"transformers",
"safetensors",
"convnext",
"image-classification",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2024-11-13T09:03:55Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
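Pending author-provided instructions, a generic sketch based on the repository's declared `image-classification` pipeline and ConvNeXt architecture might look like this (the label set and intended inputs are unknown):
```python
from transformers import pipeline

# Assumes the checkpoint ships a standard ConvNeXt image-classification head and preprocessor config.
classifier = pipeline("image-classification", model="crocutacrocuto/convnext-base-224-MEGbis-5")

predictions = classifier("example.jpg")  # local path or URL to an image
for prediction in predictions:
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```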
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
DISLab/SummLlama3.1-70B | DISLab | 2024-11-13T09:02:21Z | 28 | 7 | null | [
"safetensors",
"llama",
"summarization",
"arxiv:2410.13116",
"base_model:meta-llama/Llama-3.1-70B-Instruct",
"base_model:finetune:meta-llama/Llama-3.1-70B-Instruct",
"region:us"
] | summarization | 2024-10-15T14:58:16Z | ---
base_model:
- meta-llama/Llama-3.1-70B-Instruct
pipeline_tag: summarization
---
<div align="center">
<b style="font-size: 40px;">SummLlama3.1-70B</b>
</div>
Are you looking for a summarizer that can generate more **human-preferred summaries** across multiple domains?
Our **SummLlama3.1-70B** could be exactly what you need!
SummLlama3.1-70B is initialized from Llama3.1-70B-Instruct, with additional training using Direct Preference Optimization (DPO) based on large-scale (over 100K) summarization feedback.
The feedback encompasses a wide range of input documents, from short to lengthy texts, including both dialogue and non-dialogue formats, and spans across seven distinct domains:
- Four non-dialogue domains: News, Lifestyle, Report, Medical
- Three dialogue domains: Daily Life, Interview, Meeting
These are the automated evaluation results:
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-70B-Instruct | 0.931 | 0.596 | 0.487 | 0.671 |
| Llama3.1-70B-Instruct | 0.927 | 0.624 | 0.458 | 0.670 |
| GPT-4o | 0.940 | 0.657 | 0.437 | 0.678 |
| SummLlama3.1-70B | 0.942 | 0.637 | 0.909 | 0.829 |
Please refer to [our paper](https://arxiv.org/abs/2410.13116) to learn how to exploit LLM-generated feedback in the context of text summarization.
**SummLlama3-Series**
https://huggingface.co/DISLab/SummLlama3-8B
https://huggingface.co/DISLab/SummLlama3-70B
**SummLlama3.1-Series**
https://huggingface.co/DISLab/SummLlama3.1-8B
https://huggingface.co/DISLab/SummLlama3.1-70B
**SummLlama3.2-Series**
https://huggingface.co/DISLab/SummLlama3.2-3B
### *Recommended Prompt for Text Summarization:*
We recommend using the prompt below to get the summary, since we trained the model with it.
```
def format_chat_template(document):
instruction = "Please summarize the input documnet."
row_json = [{"role": "user", "content": f"Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Input:\n{document}\n\n### Response:\n"}]
return tokenizer.apply_chat_template(row_json, tokenize=False)
```
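A minimal end-to-end sketch using the helper above is given below; the generation settings and `device_map` choice are illustrative (the 70B checkpoint requires multiple GPUs or offloading), not values specified by the model authors.
```
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DISLab/SummLlama3.1-70B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

document = "..."  # the text to summarize
prompt = format_chat_template(document)  # helper defined above
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).to(model.device)  # the template already contains special tokens

outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
summary = tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(summary)
```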
---
Here is a brief overview of our summarizer:
Rather than relying on expensive human feedback, we utilize high-quality, multi-dimensional, and fine-grained feedback generated by large language models (LLMs).
This model excels at **faithfulness**, **completeness**, and **conciseness**, which are the three human-preferred aspects to judge what is a good summarizer.
- Faithfulness: a summarizer does not manipulate the information in the input text or add any information not directly inferable from it.
- Completeness: a summarizer ensures the inclusion of all key information from the input text in the output summary.
- Conciseness: a summarizer refrains from incorporating information outside the key information in the output, maintaining a succinct and focused summary.
Based on our comprehensive evaluation, which included both human and automated assessments of summary quality, SummLlama3 demonstrated significant improvements over the original Llama3 series.
Here are the results:
## Human Evaluation
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-8B-Instruct | 0.902 | 0.636 | 0.784 | 0.774 |
| Llama3-70B-Instruct | 0.953 | 0.659 | 0.792 | 0.801 |
| SummLlama3-8B | 0.980 | 0.697 | 0.959 | 0.879 |
## Automated Evaluation using [FineSurE](https://aclanthology.org/2024.acl-long.51.pdf)
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-8B-Instruct | 0.864 | 0.583 | 0.450 | 0.632 |
| Llama3-70B-Instruct | 0.931 | 0.596 | 0.487 | 0.671 |
| SummLlama3-8B | 0.931 | 0.614 | 0.659 | 0.735 |
| SummLlama3-70B | 0.950 | 0.632 | 0.754 | 0.779 |
## Example
See an example of how SummLlama3-8B improves the summary over Llama3-8B/70B-Instruct on the document below:
| **Speaker** | **Dialogue** |
|-------------|--------------|
| **Person 1** | Hey, Paul, you're still having Thanksgiving dinner at my house on Thursday, right? |
| **Person 2** | Yeah, thanks again for the invitation. I was worried I'd have to spend it alone after my parents announced they'd be in Finland. |
| **Person 1** | I think it's strange that you're not going with them. Is it because of the cold weather in Northern Europe? |
| **Person 2** | Not exactly. I have a lot of work to do next week. |
| **Person 1** | Well, I'm glad you won't be alone on a holiday. It'll be great to have you celebrate with us. The more the merrier! |
| **Person 2** | Exactly, how many people will be there? |
| **Person 1** | Let's see, my parents, my grandpa, my sister, my brother-in-law, my niece and my two cousins, plus us. So, ten people altogether. |
| **Person 2** | Should I bring anything? |
| **Person 1** | Nope, we've got it covered. |
| **Person 2** | Come on, I hate to show up empty-handed. You are being so generous to have me. I know! I'll bring pie. I'll make a great pumpkin pie. |
| **Person 1** | Sounds delicious, but my sister has already volunteered for dessert. Look, don't worry about it. There will be tons of food. Turkey, stuffing, green beans, sweet potatoes. |
| **Person 2** | Wow, I'm getting hungry just thinking about it. How about this, I'll bring a nice bottle of wine. You can never have too much of that. |
| **Person 1** | Well, isn't that the truth? Sounds good to me. |
| Model | Summary |
|----------|-----------|
| Llama3-8B-Instruct | Person1 invites Person2 to join them for Thanksgiving dinner at their house. Person2 accepts the invitation, citing that they were worried they would have to spend the holiday alone after their parents announced they would be in Finland. Person1 reassures Person2 that they are welcome and that the more the merrier. Person2 asks how many people will be attending and is told that there will be 10 people, including their family and cousins. Person2 offers to bring a gift, but Person1 declines, saying that they have everything covered. Person2 persists and offers to bring a bottle of wine or a pie, but Person1 suggests that their sister has already volunteered to bring dessert. |
| Llama3-70B-Instruct | Person 1 invites Person 2 to their house for Thanksgiving dinner on Thursday. Person 2 is grateful for the invitation and explains they can't go to Finland with their parents due to work commitments. Person 1 is happy to have them celebrate with their family, which will consist of 10 people. When Person 2 offers to bring something, Person 1 declines, but eventually agrees to let them bring a bottle of wine. |
| SummLlama3-8B | The conversation is about Thanksgiving dinner arrangements, with Person2 (Paul) confirming his attendance at Person1's (host's) house on Thursday, discussing what to bring and what's already been taken care of, and finalizing the offer to bring wine instead of pie since someone else is handling desserts. |
The summary of SummLlama3-8B can be considered a much more human-preferred summary for the following reasons:
**Core Focus:** The summary accurately captures the main theme of the conversation, which revolves around the Thanksgiving dinner arrangements. It highlights how the two people confirm plans, discuss what to bring, and finalize the decision for Person2 to bring wine instead of pie. This maintains the core context.
**Inclusion of Key-facts:** The summary covers the important details of the conversation, including Person2's initial offer to bring dessert (pumpkin pie) and the shift to bringing wine due to another family member handling dessert. Other summaries tend to overlook or simplify this progression, while SummLlama3-8B fully captures the interaction’s key events.
**Clarity and Conciseness:** The summary is structured in a straightforward, concise manner, effectively summarizing the conversation without unnecessary details. It presents the flow and outcome of the discussion clearly, making it easy for readers to understand. The logical order of events is maintained, ensuring a smooth narrative.
**Accurate Role Depiction:** The summary clearly identifies Person1 as the host and Paul (Person2) as the guest, which helps clarify their relationship and the nature of the conversation. This distinction is more explicit in SummLlama3-8B compared to other summaries, which might leave these roles more ambiguous. |
DISLab/SummLlama3.1-8B | DISLab | 2024-11-13T09:01:45Z | 187 | 10 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"summarization",
"arxiv:2410.13116",
"base_model:meta-llama/Llama-3.1-8B-Instruct",
"base_model:finetune:meta-llama/Llama-3.1-8B-Instruct",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | summarization | 2024-10-15T14:58:03Z | ---
library_name: transformers
base_model:
- meta-llama/Llama-3.1-8B-Instruct
pipeline_tag: summarization
---
<div align="center">
<b style="font-size: 40px;">SummLlama3.1-8B</b>
</div>
Are you looking for a summarizer that can generate more **human-preferred summaries** across multiple domains?
Our **SummLlama3.1-8B** could be exactly what you need!
SummLlama3.1-8B is initialized from Llama3.1-8B-Instruct, with additional training using Direct Preference Optimization (DPO) based on large-scale (over 100K) summarization feedback.
The feedback encompasses a wide range of input documents, from short to lengthy texts, including both dialogue and non-dialogue formats, and spans across seven distinct domains:
- Four non-dialogue domains: News, Lifestyle, Report, Medical
- Three dialogue domains: Daily Life, Interview, Meeting
These are the automated evaluation results:
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-8B-Instruct | 0.864 | 0.583 | 0.450 | 0.632 |
| Llama3-70B-Instruct | 0.931 | 0.596 | 0.487 | 0.671 |
| Llama3.1-8B-Instruct | 0.875 | 0.596 | 0.395 | 0.622 |
| GPT-4o | 0.940 | 0.657 | 0.437 | 0.678 |
| SummLlama3-8B | 0.931 | 0.614 | 0.659 | 0.735 |
| SummLlama3-70B | 0.950 | 0.632 | 0.754 | 0.779 |
| **SummLlama3.1-8B** | 0.924 | 0.635 | 0.661 | 0.740 |
Please refer to [our paper](https://arxiv.org/abs/2410.13116) to learn how to exploit LLM-generated feedback in the context of text summarization.
**SummLlama3-Series**
https://huggingface.co/DISLab/SummLlama3-8B
https://huggingface.co/DISLab/SummLlama3-70B
**SummLlama3.1-Series**
https://huggingface.co/DISLab/SummLlama3.1-8B
https://huggingface.co/DISLab/SummLlama3.1-70B
**SummLlama3.2-Series**
https://huggingface.co/DISLab/SummLlama3.2-3B
### *Recommended Prompt for Text Summarization:*
We recommend using the prompt below to get the summary, since we trained the model with it.
```
def format_chat_template(document):
instruction = "Please summarize the input documnet."
row_json = [{"role": "user", "content": f"Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Input:\n{document}\n\n### Response:\n"}]
return tokenizer.apply_chat_template(row_json, tokenize=False)
```
---
Here is a brief overview of our summarizer:
Rather than relying on expensive human feedback, we utilize high-quality, multi-dimensional, and fine-grained feedback generated by large language models (LLMs).
This model excels at **faithfulness**, **completeness**, and **conciseness**, which are the three human-preferred aspects to judge what is a good summarizer.
- Faithfulness: a summarizer does not manipulate the information in the input text or add any information not directly inferable from it.
- Completeness: a summarizer ensures the inclusion of all key information from the input text in the output summary.
- Conciseness: a summarizer refrains from incorporating information outside the key information in the output, maintaining a succinct and focused summary.
Based on our comprehensive evaluation, which included both human and automated assessments of summary quality, SummLlama3 demonstrated significant improvements over the original Llama3 series.
Here are the results:
## Human Evaluation
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-8B-Instruct | 0.902 | 0.636 | 0.784 | 0.774 |
| Llama3-70B-Instruct | 0.953 | 0.659 | 0.792 | 0.801 |
| SummLlama3-8B | 0.980 | 0.697 | 0.959 | 0.879 |
## Automated Evaluation using [FineSurE](https://aclanthology.org/2024.acl-long.51.pdf)
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-8B-Instruct | 0.864 | 0.583 | 0.450 | 0.632 |
| Llama3-70B-Instruct | 0.931 | 0.596 | 0.487 | 0.671 |
| SummLlama3-8B | 0.931 | 0.614 | 0.659 | 0.735 |
| SummLlama3-70B | 0.950 | 0.632 | 0.754 | 0.779 |
## Example
See an example of how SummLlama3-8B improves the summary over Llama3-8B/70B-Instruct on the document below:
| **Speaker** | **Dialogue** |
|-------------|--------------|
| **Person 1** | Hey, Paul, you're still having Thanksgiving dinner at my house on Thursday, right? |
| **Person 2** | Yeah, thanks again for the invitation. I was worried I'd have to spend it alone after my parents announced they'd be in Finland. |
| **Person 1** | I think it's strange that you're not going with them. Is it because of the cold weather in Northern Europe? |
| **Person 2** | Not exactly. I have a lot of work to do next week. |
| **Person 1** | Well, I'm glad you won't be alone on a holiday. It'll be great to have you celebrate with us. The more the merrier! |
| **Person 2** | Exactly, how many people will be there? |
| **Person 1** | Let's see, my parents, my grandpa, my sister, my brother-in-law, my niece and my two cousins, plus us. So, ten people altogether. |
| **Person 2** | Should I bring anything? |
| **Person 1** | Nope, we've got it covered. |
| **Person 2** | Come on, I hate to show up empty-handed. You are being so generous to have me. I know! I'll bring pie. I'll make a great pumpkin pie. |
| **Person 1** | Sounds delicious, but my sister has already volunteered for dessert. Look, don't worry about it. There will be tons of food. Turkey, stuffing, green beans, sweet potatoes. |
| **Person 2** | Wow, I'm getting hungry just thinking about it. How about this, I'll bring a nice bottle of wine. You can never have too much of that. |
| **Person 1** | Well, isn't that the truth? Sounds good to me. |
| Model | Summary |
|----------|-----------|
| Llama3-8B-Instruct | Person1 invites Person2 to join them for Thanksgiving dinner at their house. Person2 accepts the invitation, citing that they were worried they would have to spend the holiday alone after their parents announced they would be in Finland. Person1 reassures Person2 that they are welcome and that the more the merrier. Person2 asks how many people will be attending and is told that there will be 10 people, including their family and cousins. Person2 offers to bring a gift, but Person1 declines, saying that they have everything covered. Person2 persists and offers to bring a bottle of wine or a pie, but Person1 suggests that their sister has already volunteered to bring dessert. |
| Llama3-70B-Instruct | Person 1 invites Person 2 to their house for Thanksgiving dinner on Thursday. Person 2 is grateful for the invitation and explains they can't go to Finland with their parents due to work commitments. Person 1 is happy to have them celebrate with their family, which will consist of 10 people. When Person 2 offers to bring something, Person 1 declines, but eventually agrees to let them bring a bottle of wine. |
| SummLlama3-8B | The conversation is about Thanksgiving dinner arrangements, with Person2 (Paul) confirming his attendance at Person1's (host's) house on Thursday, discussing what to bring and what's already been taken care of, and finalizing the offer to bring wine instead of pie since someone else is handling desserts. |
The summary of SummLlama3-8B can be considered a much more human-preferred summary for the following reasons:
**Core Focus:** The summary accurately captures the main theme of the conversation, which revolves around the Thanksgiving dinner arrangements. It highlights how the two people confirm plans, discuss what to bring, and finalize the decision for Person2 to bring wine instead of pie. This maintains the core context.
**Inclusion of Key-facts:** The summary covers the important details of the conversation, including Person2's initial offer to bring dessert (pumpkin pie) and the shift to bringing wine due to another family member handling dessert. Other summaries tend to overlook or simplify this progression, while SummLlama3-8B fully captures the interaction’s key events.
**Clarity and Conciseness:** The summary is structured in a straightforward, concise manner, effectively summarizing the conversation without unnecessary details. It presents the flow and outcome of the discussion clearly, making it easy for readers to understand. The logical order of events is maintained, ensuring a smooth narrative.
**Accurate Role Depiction:** The summary clearly identifies Person1 as the host and Paul (Person2) as the guest, which helps clarify their relationship and the nature of the conversation. This distinction is more explicit in SummLlama3-8B compared to other summaries, which might leave these roles more ambiguous. |
DISLab/SummLlama3-8B | DISLab | 2024-11-13T08:58:56Z | 106 | 13 | null | [
"safetensors",
"llama",
"summarization",
"arxiv:2410.13116",
"base_model:meta-llama/Meta-Llama-3-8B-Instruct",
"base_model:finetune:meta-llama/Meta-Llama-3-8B-Instruct",
"region:us"
] | summarization | 2024-10-11T10:19:43Z | ---
base_model:
- meta-llama/Meta-Llama-3-8B-Instruct
pipeline_tag: summarization
---
<div align="center">
<b style="font-size: 40px;">SummLlama3-8B</b>
</div>
Are you looking for a summarizer that can generate more **human-preferred summaries** across multiple domains?
Our **SummLlama3-8B** could be exactly what you need!
SummLlama3 is initialized from Llama3-8B-Instruct, with additional training using Direct Preference Optimization (DPO) based on large-scale (over 100K) summarization feedback.
The feedback encompasses a wide range of input documents, from short to lengthy texts, including both dialogue and non-dialogue formats, and spans across seven distinct domains:
- Four non-dialogue domains: News, Lifestyle, Report, Medical
- Three dialogue domains: Daily Life, Interview, Meeting
Surprisingly, it outperforms the nearly 10x larger **Llama3-70B-Instruct** and also **GPT-4o** while offering much faster inference speed.
These are the automated evaluation results:
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-8B-Instruct | 0.864 | 0.583 | 0.450 | 0.632 |
| Llama3-70B-Instruct | 0.931 | 0.596 | 0.487 | 0.671 |
| GPT-4o | 0.940 | 0.657 | 0.437 | 0.678 |
| SummLlama3-8B | 0.931 | 0.614 | 0.659 | 0.735 |
| SummLlama3-70B | 0.950 | 0.632 | 0.754 | 0.779 |
These are the human evaluation results:
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-8B-Instruct | 0.902 | 0.636 | 0.784 | 0.774 |
| Llama3-70B-Instruct | 0.953 | 0.659 | 0.792 | 0.801 |
| SummLlama3-8B | 0.980 | 0.697 | 0.959 | 0.879 |
Please refer to [our paper](https://arxiv.org/abs/2410.13116) to learn how to exploit LLM-generated feedback in the context of text summarization.
Here are other versions:
**SummLlama3-70B**,
https://huggingface.co/DISLab/SummLlama3-70B
**SummLlama3.1-Series**
https://huggingface.co/DISLab/SummLlama3.1-8B
https://huggingface.co/DISLab/SummLlama3.1-70B
**SummLlama3.2-Series**
https://huggingface.co/DISLab/SummLlama3.2-3B
### *Recommended Prompt for Text Summarization:*
We recommend using the prompt below to get the summary, since we trained the model with it.
```
def format_chat_template(document):
instruction = "Please summarize the input documnet."
row_json = [{"role": "user", "content": f"Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Input:\n{document}\n\n### Response:\n"}]
return tokenizer.apply_chat_template(row_json, tokenize=False)
```
---
Here is a brief overview of our summarizer:
Rather than relying on expensive human feedback, we utilize high-quality, multi-dimensional, and fine-grained feedback generated by large language models (LLMs).
This model excels at **faithfulness**, **completeness**, and **conciseness**, which are the three human-preferred aspects to judge what is a good summarizer.
- Faithfulness: a summarizer does not manipulate the information in the input text or add any information not directly inferable from it.
- Completeness: a summarizer ensures the inclusion of all key information from the input text in the output summary.
- Conciseness: a summarizer refrains from incorporating information outside the key information in the output, maintaining a succinct and focused summary.
Based on our comprehensive evaluation, which included both human and automated assessments of summary quality, SummLlama3 demonstrated significant improvements over the original Llama3 series.
Here are the results:
## Human Evaluation
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-8B-Instruct | 0.902 | 0.636 | 0.784 | 0.774 |
| Llama3-70B-Instruct | 0.953 | 0.659 | 0.792 | 0.801 |
| SummLlama3-8B | 0.980 | 0.697 | 0.959 | 0.879 |
## Automated Evaluation using [FineSurE](https://aclanthology.org/2024.acl-long.51.pdf)
| **Config.** | **Faithfulness** | **Completeness** | **Conciseness** | **Average** |
|--------------------|------------|-----------|-----------|----------|
| Llama3-8B-Instruct | 0.864 | 0.583 | 0.450 | 0.632 |
| Llama3-70B-Instruct | 0.931 | 0.596 | 0.487 | 0.671 |
| SummLlama3-8B | 0.931 | 0.614 | 0.659 | 0.735 |
| SummLlama3-70B | 0.950 | 0.632 | 0.754 | 0.779 |
## Example
See an example how the summary improved by SummLlama3-8B over Llama3-8/70B-Instruct on the document below:
| **Speaker** | **Dialogue** |
|-------------|--------------|
| **Person 1** | Hey, Paul, you're still having Thanksgiving dinner at my house on Thursday, right? |
| **Person 2** | Yeah, thanks again for the invitation. I was worried I'd have to spend it alone after my parents announced they'd be in Finland. |
| **Person 1** | I think it's strange that you're not going with them. Is it because of the cold weather in Northern Europe? |
| **Person 2** | Not exactly. I have a lot of work to do next week. |
| **Person 1** | Well, I'm glad you won't be alone on a holiday. It'll be great to have you celebrate with us. The more the merrier! |
| **Person 2** | Exactly, how many people will be there? |
| **Person 1** | Let's see, my parents, my grandpa, my sister, my brother-in-law, my niece and my two cousins, plus us. So, ten people altogether. |
| **Person 2** | Should I bring anything? |
| **Person 1** | Nope, we've got it covered. |
| **Person 2** | Come on, I hate to show up empty-handed. You are being so generous to have me. I know! I'll bring pie. I'll make a great pumpkin pie. |
| **Person 1** | Sounds delicious, but my sister has already volunteered for dessert. Look, don't worry about it. There will be tons of food. Turkey, stuffing, green beans, sweet potatoes. |
| **Person 2** | Wow, I'm getting hungry just thinking about it. How about this, I'll bring a nice bottle of wine. You can never have too much of that. |
| **Person 1** | Well, isn't that the truth? Sounds good to me. |
| Model | Summary |
|----------|-----------|
| Llama3-8B-Instruct | Person1 invites Person2 to join them for Thanksgiving dinner at their house. Person2 accepts the invitation, citing that they were worried they would have to spend the holiday alone after their parents announced they would be in Finland. Person1 reassures Person2 that they are welcome and that the more the merrier. Person2 asks how many people will be attending and is told that there will be 10 people, including their family and cousins. Person2 offers to bring a gift, but Person1 declines, saying that they have everything covered. Person2 persists and offers to bring a bottle of wine or a pie, but Person1 suggests that their sister has already volunteered to bring dessert. |
| Llama3-70B-Instruct | Person 1 invites Person 2 to their house for Thanksgiving dinner on Thursday. Person 2 is grateful for the invitation and explains they can't go to Finland with their parents due to work commitments. Person 1 is happy to have them celebrate with their family, which will consist of 10 people. When Person 2 offers to bring something, Person 1 declines, but eventually agrees to let them bring a bottle of wine. |
| SummLlama3-8B | The conversation is about Thanksgiving dinner arrangements, with Person2 (Paul) confirming his attendance at Person1's (host's) house on Thursday, discussing what to bring and what's already been taken care of, and finalizing the offer to bring wine instead of pie since someone else is handling desserts. |
The summary of SummLlama3-8B can be considered a much more human-preferred summary for the following reasons:
**Core Focus:** The summary accurately captures the main theme of the conversation, which revolves around the Thanksgiving dinner arrangements. It highlights how the two people confirm plans, discuss what to bring, and finalize the decision for Person2 to bring wine instead of pie. This maintains the core context.
**Inclusion of Key-facts:** The summary covers the important details of the conversation, including Person2's initial offer to bring dessert (pumpkin pie) and the shift to bringing wine due to another family member handling dessert. Other summaries tend to overlook or simplify this progression, while SummLlama3-8B fully captures the interaction’s key events.
**Clarity and Conciseness:** The summary is structured in a straightforward, concise manner, effectively summarizing the conversation without unnecessary details. It presents the flow and outcome of the discussion clearly, making it easy for readers to understand. The logical order of events is maintained, ensuring a smooth narrative.
**Accurate Role Depiction:** The summary clearly identifies Person1 as the host and Paul (Person2) as the guest, which helps clarify their relationship and the nature of the conversation. This distinction is more explicit in SummLlama3-8B compared to other summaries, which might leave these roles more ambiguous. |
mradermacher/Genstruct-7B-GGUF | mradermacher | 2024-11-13T08:55:49Z | 33 | 1 | transformers | [
"transformers",
"gguf",
"Mistral",
"instruct",
"finetune",
"synthetic",
"en",
"base_model:NousResearch/Genstruct-7B",
"base_model:quantized:NousResearch/Genstruct-7B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-12T06:58:28Z | ---
base_model: NousResearch/Genstruct-7B
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- Mistral
- instruct
- finetune
- synthetic
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/NousResearch/Genstruct-7B
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Genstruct-7B-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
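For example, the Q4_K_M file listed below can be downloaded and loaded with `llama-cpp-python` (a minimal sketch; the context size and prompt are placeholder choices, and the instruction-generation prompt format is documented in the base Genstruct-7B card):
```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch one of the quant files from the table below (Q4_K_M is the "fast, recommended" option).
gguf_path = hf_hub_download(
    repo_id="mradermacher/Genstruct-7B-GGUF",
    filename="Genstruct-7B.Q4_K_M.gguf",
)

llm = Llama(model_path=gguf_path, n_ctx=4096)
prompt = "..."  # placeholder; follow the Genstruct prompt format from the base model card
result = llm(prompt, max_tokens=256)
print(result["choices"][0]["text"])
```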
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q2_K.gguf) | Q2_K | 2.8 | |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q3_K_S.gguf) | Q3_K_S | 3.3 | |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q3_K_M.gguf) | Q3_K_M | 3.6 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q3_K_L.gguf) | Q3_K_L | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.IQ4_XS.gguf) | IQ4_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q4_K_S.gguf) | Q4_K_S | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q4_K_M.gguf) | Q4_K_M | 4.5 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q5_K_S.gguf) | Q5_K_S | 5.1 | |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q5_K_M.gguf) | Q5_K_M | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q6_K.gguf) | Q6_K | 6.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.Q8_0.gguf) | Q8_0 | 7.8 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/Genstruct-7B-GGUF/resolve/main/Genstruct-7B.f16.gguf) | f16 | 14.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
jacobhoffmann/TestGen_v2.1-codegemma-7b-lr0.0003_epochs2 | jacobhoffmann | 2024-11-13T08:52:58Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"gemma",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T08:48:07Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
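Absent author-provided instructions, a generic text-generation sketch for this CodeGemma fine-tune might look like the following (the prompt and settings are placeholders):
```python
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="jacobhoffmann/TestGen_v2.1-codegemma-7b-lr0.0003_epochs2",
    torch_dtype="auto",
    device_map="auto",
)

prompt = "Write a JUnit test for a method that reverses a string."  # placeholder prompt
print(generator(prompt, max_new_tokens=256)[0]["generated_text"])
```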
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
prithivMLmods/Flux-Realism-FineDetailed | prithivMLmods | 2024-11-13T08:51:30Z | 1,972 | 25 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"Realism",
"Fine-Detailed",
"Flux.1-dev",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:creativeml-openrail-m",
"region:us"
] | text-to-image | 2024-11-09T14:29:09Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
- Realism
- Fine-Detailed
- Flux.1-dev
widget:
- text: >-
Fine Detailed, A half Body Photo of a beautiful brunette Scottish girl,
green eyes, messy beehive hair style, natural skin complexion, slim with
curves, tartan shirt, outdoor background, soft lighting, captured by
{randomly selected} photographer. This image looks glamorous and elegant in
style and outdoor background that was recognized as a contest winner. It
has been featured on CG Society. Cinematic
output:
url: images/FD1.png
- text: >-
Fine Detailed, A close-up shot of a medium-sized man with medium-length
brown hair and a goatee. His eyes are a piercing blue, and his eyebrows are
a darker shade of brown. He is wearing a brown jacket over a white t-shirt,
and a gray sweater with a zipper on the right side of the jacket. The
background is blurred, and the lighting is soft, creating a soft focus on
the mans face.
output:
url: images/FD2.png
- text: >-
Fine Detailed, A woman with long red hair, wearing a fur coat, standing
outdoors in snowy Red Square, Moscow, with iconic colorful domes in the
background. She faces slightly sideways, looking directly at the camera with
a calm expression.
output:
url: images/FD3.png
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: Fine Detailed
license: creativeml-openrail-m
---
## Flux-Realism-FineDetailed - B+ 20
<Gallery />
- Hosted Here🧨: https://huggingface.co/spaces/prithivMLmods/FLUX-LoRA-DLC
**The model is still in the training phase. This is not the final version and may contain artifacts and perform poorly in some cases.**
## Model description
**prithivMLmods/Flux-Realism-FineDetailed**
Image Processing Parameters
| Parameter | Value | Parameter | Value |
|---------------------------|--------|---------------------------|--------|
| LR Scheduler | constant | Noise Offset | 0.03 |
| Optimizer | AdamW | Multires Noise Discount | 0.1 |
| Network Dim | 64 | Multires Noise Iterations | 10 |
| Network Alpha | 32 | Repeat & Steps | 28 & 3400|
| Epoch | 15 | Save Every N Epochs | 1 |
Labeling: florence2-en (natural language & English)
Total Images Used for Training: 27
## 🚀New Version Available Here🚀
Here's a table summarizing the relevant information about the **`Flux-Super-Realism-LoRA`** model on Hugging Face:
| **Feature** | **Details** |
|-------------------------|-----------------------------------------------------------------------------|
| **Model Name** | `Flux-Super-Realism-LoRA` |
| **Repository** | [strangerzonehf/Flux-Super-Realism-LoRA](https://huggingface.co/strangerzonehf/Flux-Super-Realism-LoRA) |
| **Author** | `strangerzonehf` |
| **Description** | Super-realism LoRA model designed to produce high-quality, hyper-realistic images using LoRA fine-tuning techniques. This model can generate lifelike textures, lighting, and intricate details. |
| **Model Type** | LoRA (Low-Rank Adaptation for Transformers) |
| **Use Cases** | - Photorealistic image generation<br>- High-fidelity art<br>- Texture detailing and enhancement |
| **Primary Language** | Not applicable (model is image-based) |
| **Base Model** | Model used as the foundation for LoRA fine-tuning (may vary per implementation) |
| **License** | Refer to Hugging Face model page for specific licensing information. |
| **Tags** | super-realism, LoRA, high-fidelity, hyper-realistic |
| **Usage** | This model is typically used with tools like Hugging Face's `Diffusers` or other libraries supporting LoRA fine-tuning for enhanced realism in image generation. |
| **Pipeline** | Use in `StableDiffusionPipeline` or compatible image generation pipelines. |
## Realism LoRA In-Image


## Best Dimensions
- 1024 x 1024 (Default)
## Setting Up
```
import torch
from diffusers import DiffusionPipeline  # FLUX pipelines are provided by the diffusers library

base_model = "black-forest-labs/FLUX.1-dev"
pipe = DiffusionPipeline.from_pretrained(base_model, torch_dtype=torch.bfloat16)

lora_repo = "prithivMLmods/Flux-Realism-FineDetailed"
trigger_word = "Fine Detailed"  # include this phrase in your prompts
pipe.load_lora_weights(lora_repo)

device = torch.device("cuda")
pipe.to(device)
```
## Trigger words 🧨
> [!WARNING]
> **Trigger words:** You should use `Fine Detailed` to trigger the image generation.
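With the pipeline set up as above, a generation call might look like this (the prompt, step count, and guidance scale are illustrative, not author-recommended values):
```
prompt = "Fine Detailed, portrait photo of a woman in soft window light"  # include the trigger phrase
image = pipe(
    prompt,
    height=1024,  # default dimensions from the Best Dimensions section
    width=1024,
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("flux-realism-finedetailed.png")
```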
## Download model
Weights for this model are available in Safetensors format.
[Download](/prithivMLmods/Flux-Realism-FineDetailed/tree/main) them in the Files & versions tab. |
kh-li/bert-base-all-nli-stsb-quora-nq | kh-li | 2024-11-13T08:49:54Z | 8 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:65749",
"loss:MultipleNegativesRankingLoss",
"loss:ContrastiveLoss",
"loss:CosineSimilarityLoss",
"loss:TripletLoss",
"en",
"dataset:sentence-transformers/all-nli",
"dataset:sentence-transformers/stsb",
"dataset:sentence-transformers/quora-duplicates",
"dataset:sentence-transformers/natural-questions",
"arxiv:1908.10084",
"arxiv:1705.00652",
"arxiv:1703.07737",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-11-13T08:49:21Z | ---
language:
- en
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:65749
- loss:MultipleNegativesRankingLoss
- loss:ContrastiveLoss
- loss:CosineSimilarityLoss
- loss:TripletLoss
base_model: google-bert/bert-base-uncased
widget:
- source_sentence: Can a US President destroy a city with actions?
sentences:
- What are best kids educational games?
- Can a US president destroy a city through actions?
- Why do people ask questions on Quora that are just as, if not more than easier
to, look up with a search engine?
- source_sentence: How would you handle stress people?
sentences:
- How do I handle stress with a parent?
- Why do some people on QUORA ask questions that they can easily findout on Google?
- How do I make a quick right decision?
- source_sentence: Two women playing field hockey on AstroTurf.
sentences:
- Women playing a game of field hockey.
- The children are outside.
- Women re-sod a field hockey field.
- source_sentence: A dog reaches to catch a ball with its mouth.
sentences:
- The dog is playing with a rope.
- The dog is playing with a ball.
- Someone holding their baby is smiling while sitting down.
- source_sentence: There is a very full description of the various types of hormone
rooting compound here.
sentences:
- The least that can be said is that we must be born with the ability and 'knowledge'
to learn.
- It is meant to stimulate root growth - in particular to stimulate the creation
of roots.
- A person folds a piece of paper.
datasets:
- sentence-transformers/all-nli
- sentence-transformers/stsb
- sentence-transformers/quora-duplicates
- sentence-transformers/natural-questions
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on google-bert/bert-base-uncased
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on the [all-nli-pair](https://huggingface.co/datasets/sentence-transformers/all-nli), [all-nli-pair-class](https://huggingface.co/datasets/sentence-transformers/all-nli), [all-nli-pair-score](https://huggingface.co/datasets/sentence-transformers/all-nli), [all-nli-triplet](https://huggingface.co/datasets/sentence-transformers/all-nli), [stsb](https://huggingface.co/datasets/sentence-transformers/stsb), [quora](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) and [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) <!-- at revision 86b5e0934494bd15c9632b12f734a8a67f723594 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
- [all-nli-pair](https://huggingface.co/datasets/sentence-transformers/all-nli)
- [all-nli-pair-class](https://huggingface.co/datasets/sentence-transformers/all-nli)
- [all-nli-pair-score](https://huggingface.co/datasets/sentence-transformers/all-nli)
- [all-nli-triplet](https://huggingface.co/datasets/sentence-transformers/all-nli)
- [stsb](https://huggingface.co/datasets/sentence-transformers/stsb)
- [quora](https://huggingface.co/datasets/sentence-transformers/quora-duplicates)
- [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions)
- **Language:** en
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("kh-li/bert-base-all-nli-stsb-quora-nq")
# Run inference
sentences = [
'There is a very full description of the various types of hormone rooting compound here.',
'It is meant to stimulate root growth - in particular to stimulate the creation of roots.',
"The least that can be said is that we must be born with the ability and 'knowledge' to learn.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
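The embeddings can also drive a simple semantic search. The sketch below is a minimal illustration (not part of the original card): the corpus and query are made up, and the model's built-in `similarity` function ranks corpus sentences against the query.
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("kh-li/bert-base-all-nli-stsb-quora-nq")

# Illustrative corpus and query
corpus = [
    "A man is playing a flute.",
    "Two kids in numbered jerseys wash their hands.",
    "A woman drinks her coffee in a small cafe.",
]
query = "Someone is playing a musical instrument."

corpus_embeddings = model.encode(corpus)        # shape [3, 768]
query_embedding = model.encode([query])         # shape [1, 768]

# Cosine similarity of the query against every corpus sentence
scores = model.similarity(query_embedding, corpus_embeddings)  # shape [1, 3]
best = int(scores.argmax())
print(corpus[best], float(scores[0, best]))
```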
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Datasets
#### all-nli-pair
* Dataset: [all-nli-pair](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 10,000 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 17.03 tokens</li><li>max: 64 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.62 tokens</li><li>max: 31 tokens</li></ul> |
* Samples:
| anchor | positive |
|:---------------------------------------------------------------------------|:-------------------------------------------------|
| <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is outdoors, on a horse.</code> |
| <code>Children smiling and waving at camera</code> | <code>There are children present</code> |
| <code>A boy is jumping on skateboard in the middle of a red bridge.</code> | <code>The boy does a skateboarding trick.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
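For reference, a loss configured with these parameters corresponds roughly to the following construction in the `sentence_transformers` API (a sketch, not the exact training script used for this card):
```python
from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("google-bert/bert-base-uncased")
mnrl = losses.MultipleNegativesRankingLoss(
    model,
    scale=20.0,                   # "scale" above
    similarity_fct=util.cos_sim,  # "cos_sim" above
)
```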
#### all-nli-pair-class
* Dataset: [all-nli-pair-class](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 10,000 training samples
* Columns: <code>premise</code>, <code>hypothesis</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
| | premise | hypothesis | label |
|:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:-------------------------------------------------------------------|
| type | string | string | int |
| details | <ul><li>min: 6 tokens</li><li>mean: 17.38 tokens</li><li>max: 52 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 10.7 tokens</li><li>max: 31 tokens</li></ul> | <ul><li>0: ~33.40%</li><li>1: ~33.30%</li><li>2: ~33.30%</li></ul> |
* Samples:
| premise | hypothesis | label |
|:--------------------------------------------------------------------|:---------------------------------------------------------------|:---------------|
| <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is training his horse for a competition.</code> | <code>1</code> |
| <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is at a diner, ordering an omelette.</code> | <code>2</code> |
| <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is outdoors, on a horse.</code> | <code>0</code> |
* Loss: [<code>ContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#contrastiveloss) with these parameters:
```json
{
"distance_metric": "SiameseDistanceMetric.COSINE_DISTANCE",
"margin": 0.5,
"size_average": true
}
```
#### all-nli-pair-score
* Dataset: [all-nli-pair-score](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 10,000 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | score |
|:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:--------------------------------------------------------------|
| type | string | string | float |
| details | <ul><li>min: 6 tokens</li><li>mean: 17.38 tokens</li><li>max: 52 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 10.7 tokens</li><li>max: 31 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.5</li><li>max: 1.0</li></ul> |
* Samples:
| sentence1 | sentence2 | score |
|:--------------------------------------------------------------------|:---------------------------------------------------------------|:-----------------|
| <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is training his horse for a competition.</code> | <code>0.5</code> |
| <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is at a diner, ordering an omelette.</code> | <code>0.0</code> |
| <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is outdoors, on a horse.</code> | <code>1.0</code> |
* Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
```json
{
"loss_fct": "torch.nn.modules.loss.MSELoss"
}
```
#### all-nli-triplet
* Dataset: [all-nli-triplet](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 10,000 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
| type | string | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 10.46 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 12.81 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 13.4 tokens</li><li>max: 50 tokens</li></ul> |
* Samples:
| anchor | positive | negative |
|:---------------------------------------------------------------------------|:-------------------------------------------------|:-----------------------------------------------------------|
| <code>A person on a horse jumps over a broken down airplane.</code> | <code>A person is outdoors, on a horse.</code> | <code>A person is at a diner, ordering an omelette.</code> |
| <code>Children smiling and waving at camera</code> | <code>There are children present</code> | <code>The kids are frowning</code> |
| <code>A boy is jumping on skateboard in the middle of a red bridge.</code> | <code>The boy does a skateboarding trick.</code> | <code>The boy skates down the sidewalk.</code> |
* Loss: [<code>TripletLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#tripletloss) with these parameters:
```json
{
"distance_metric": "TripletDistanceMetric.EUCLIDEAN",
"triplet_margin": 5
}
```
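The remaining objectives used above (ContrastiveLoss, CosineSimilarityLoss and TripletLoss) map onto the library's loss classes in the same way. A single sketch covering all three with the parameters listed in the preceding subsections (again an illustration, not the original training script):
```python
from torch import nn
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("google-bert/bert-base-uncased")

# Contrastive loss with cosine distance and margin 0.5
contrastive = losses.ContrastiveLoss(
    model,
    distance_metric=losses.SiameseDistanceMetric.COSINE_DISTANCE,
    margin=0.5,
    size_average=True,
)
# Cosine similarity regression against the gold score, with MSE
cosine = losses.CosineSimilarityLoss(model, loss_fct=nn.MSELoss())
# Triplet loss with Euclidean distance and margin 5
triplet = losses.TripletLoss(
    model,
    distance_metric=losses.TripletDistanceMetric.EUCLIDEAN,
    triplet_margin=5,
)
```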
#### stsb
* Dataset: [stsb](https://huggingface.co/datasets/sentence-transformers/stsb) at [ab7a5ac](https://huggingface.co/datasets/sentence-transformers/stsb/tree/ab7a5ac0e35aa22088bdcf23e7fd99b220e53308)
* Size: 5,749 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | score |
|:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------|
| type | string | string | float |
| details | <ul><li>min: 6 tokens</li><li>mean: 10.0 tokens</li><li>max: 28 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 9.95 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.54</li><li>max: 1.0</li></ul> |
* Samples:
| sentence1 | sentence2 | score |
|:-----------------------------------------------------------|:----------------------------------------------------------------------|:------------------|
| <code>A plane is taking off.</code> | <code>An air plane is taking off.</code> | <code>1.0</code> |
| <code>A man is playing a large flute.</code> | <code>A man is playing a flute.</code> | <code>0.76</code> |
| <code>A man is spreading shreded cheese on a pizza.</code> | <code>A man is spreading shredded cheese on an uncooked pizza.</code> | <code>0.76</code> |
* Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
```json
{
"loss_fct": "torch.nn.modules.loss.MSELoss"
}
```
#### quora
* Dataset: [quora](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) at [451a485](https://huggingface.co/datasets/sentence-transformers/quora-duplicates/tree/451a4850bd141edb44ade1b5828c259abd762cdb)
* Size: 10,000 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 13.92 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 14.09 tokens</li><li>max: 43 tokens</li></ul> |
* Samples:
| anchor | positive |
|:----------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------|
| <code>Astrology: I am a Capricorn Sun Cap moon and cap rising...what does that say about me?</code> | <code>I'm a triple Capricorn (Sun, Moon and ascendant in Capricorn) What does this say about me?</code> |
| <code>How can I be a good geologist?</code> | <code>What should I do to be a great geologist?</code> |
| <code>How do I read and find my YouTube comments?</code> | <code>How can I see all my Youtube comments?</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
#### natural-questions
* Dataset: [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 10,000 training samples
* Columns: <code>query</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
| | query | answer |
|:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 10 tokens</li><li>mean: 11.74 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 135.66 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| query | answer |
|:----------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>when did richmond last play in a preliminary final</code> | <code>Richmond Football Club Richmond began 2017 with 5 straight wins, a feat it had not achieved since 1995. A series of close losses hampered the Tigers throughout the middle of the season, including a 5-point loss to the Western Bulldogs, 2-point loss to Fremantle, and a 3-point loss to the Giants. Richmond ended the season strongly with convincing victories over Fremantle and St Kilda in the final two rounds, elevating the club to 3rd on the ladder. Richmond's first final of the season against the Cats at the MCG attracted a record qualifying final crowd of 95,028; the Tigers won by 51 points. Having advanced to the first preliminary finals for the first time since 2001, Richmond defeated Greater Western Sydney by 36 points in front of a crowd of 94,258 to progress to the Grand Final against Adelaide, their first Grand Final appearance since 1982. The attendance was 100,021, the largest crowd to a grand final since 1986. The Crows led at quarter time and led by as many as 13, but the Tigers took over the game as it progressed and scored seven straight goals at one point. They eventually would win by 48 points – 16.12 (108) to Adelaide's 8.12 (60) – to end their 37-year flag drought.[22] Dustin Martin also became the first player to win a Premiership medal, the Brownlow Medal and the Norm Smith Medal in the same season, while Damien Hardwick was named AFL Coaches Association Coach of the Year. Richmond's jump from 13th to premiers also marked the biggest jump from one AFL season to the next.</code> |
| <code>who sang what in the world's come over you</code> | <code>Jack Scott (singer) At the beginning of 1960, Scott again changed record labels, this time to Top Rank Records.[1] He then recorded four Billboard Hot 100 hits – "What in the World's Come Over You" (#5), "Burning Bridges" (#3) b/w "Oh Little One" (#34), and "It Only Happened Yesterday" (#38).[1] "What in the World's Come Over You" was Scott's second gold disc winner.[6] Scott continued to record and perform during the 1960s and 1970s.[1] His song "You're Just Gettin' Better" reached the country charts in 1974.[1] In May 1977, Scott recorded a Peel session for BBC Radio 1 disc jockey, John Peel.</code> |
| <code>who produces the most wool in the world</code> | <code>Wool Global wool production is about 2 million tonnes per year, of which 60% goes into apparel. Wool comprises ca 3% of the global textile market, but its value is higher owing to dying and other modifications of the material.[1] Australia is a leading producer of wool which is mostly from Merino sheep but has been eclipsed by China in terms of total weight.[30] New Zealand (2016) is the third-largest producer of wool, and the largest producer of crossbred wool. Breeds such as Lincoln, Romney, Drysdale, and Elliotdale produce coarser fibers, and wool from these sheep is usually used for making carpets.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
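Because each training dataset above is paired with its own loss, training follows the multi-dataset pattern in which the trainer receives one dictionary of datasets and a matching dictionary of losses. The sketch below assumes the sentence-transformers v3 trainer API and shows only a subset of the seven datasets; the subset names and split sizes are illustrative.
```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer, losses

model = SentenceTransformer("google-bert/bert-base-uncased")

# One entry per training dataset (only three of the seven shown here)
train_datasets = {
    "all-nli-pair": load_dataset("sentence-transformers/all-nli", "pair", split="train[:10000]"),
    "all-nli-triplet": load_dataset("sentence-transformers/all-nli", "triplet", split="train[:10000]"),
    "natural-questions": load_dataset("sentence-transformers/natural-questions", split="train[:10000]"),
}
# Matching loss per dataset name
train_losses = {
    "all-nli-pair": losses.MultipleNegativesRankingLoss(model),
    "all-nli-triplet": losses.TripletLoss(model),
    "natural-questions": losses.MultipleNegativesRankingLoss(model),
}

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_datasets,
    loss=train_losses,
)
trainer.train()
```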
### Evaluation Datasets
#### all-nli-triplet
* Dataset: [all-nli-triplet](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 6,584 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative |
|:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 17.95 tokens</li><li>max: 63 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.78 tokens</li><li>max: 29 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 10.35 tokens</li><li>max: 29 tokens</li></ul> |
* Samples:
| anchor | positive | negative |
|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------|:--------------------------------------------------------|
| <code>Two women are embracing while holding to go packages.</code> | <code>Two woman are holding packages.</code> | <code>The men are fighting outside a deli.</code> |
| <code>Two young children in blue jerseys, one with the number 9 and one with the number 2 are standing on wooden steps in a bathroom and washing their hands in a sink.</code> | <code>Two kids in numbered jerseys wash their hands.</code> | <code>Two kids in jackets walk to school.</code> |
| <code>A man selling donuts to a customer during a world exhibition event held in the city of Angeles</code> | <code>A man selling donuts to a customer.</code> | <code>A woman drinks her coffee in a small cafe.</code> |
* Loss: [<code>TripletLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#tripletloss) with these parameters:
```json
{
"distance_metric": "TripletDistanceMetric.EUCLIDEAN",
"triplet_margin": 5
}
```
#### stsb
* Dataset: [stsb](https://huggingface.co/datasets/sentence-transformers/stsb) at [ab7a5ac](https://huggingface.co/datasets/sentence-transformers/stsb/tree/ab7a5ac0e35aa22088bdcf23e7fd99b220e53308)
* Size: 1,500 evaluation samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | score |
|:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------|
| type | string | string | float |
| details | <ul><li>min: 5 tokens</li><li>mean: 15.1 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 15.11 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.47</li><li>max: 1.0</li></ul> |
* Samples:
| sentence1 | sentence2 | score |
|:--------------------------------------------------|:------------------------------------------------------|:------------------|
| <code>A man with a hard hat is dancing.</code> | <code>A man wearing a hard hat is dancing.</code> | <code>1.0</code> |
| <code>A young child is riding a horse.</code> | <code>A child is riding a horse.</code> | <code>0.95</code> |
| <code>A man is feeding a mouse to a snake.</code> | <code>The man is feeding a mouse to the snake.</code> | <code>1.0</code> |
* Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
```json
{
"loss_fct": "torch.nn.modules.loss.MSELoss"
}
```
#### quora
* Dataset: [quora](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) at [451a485](https://huggingface.co/datasets/sentence-transformers/quora-duplicates/tree/451a4850bd141edb44ade1b5828c259abd762cdb)
* Size: 1,000 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 14.05 tokens</li><li>max: 70 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 14.11 tokens</li><li>max: 49 tokens</li></ul> |
* Samples:
| anchor | positive |
|:----------------------------------------------------------------------------|:--------------------------------------------------------------------------------|
| <code>What is your New Year resolution?</code> | <code>What can be my new year resolution for 2017?</code> |
| <code>Should I buy the IPhone 6s or Samsung Galaxy s7?</code> | <code>Which is better: the iPhone 6S Plus or the Samsung Galaxy S7 Edge?</code> |
| <code>What are the differences between transgression and regression?</code> | <code>What is the difference between transgression and regression?</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
#### natural-questions
* Dataset: [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 1,000 evaluation samples
* Columns: <code>query</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
| | query | answer |
|:--------|:---------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 11.8 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 138.84 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| query | answer |
|:--------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>where does the waikato river begin and end</code> | <code>Waikato River The Waikato River is the longest river in New Zealand, running for 425 kilometres (264 mi) through the North Island. It rises in the eastern slopes of Mount Ruapehu, joining the Tongariro River system and flowing through Lake Taupo, New Zealand's largest lake. It then drains Taupo at the lake's northeastern edge, creates the Huka Falls, and flows northwest through the Waikato Plains. It empties into the Tasman Sea south of Auckland, at Port Waikato. It gives its name to the Waikato Region that surrounds the Waikato Plains. The present course of the river was largely formed about 17,000 years ago. Contributing factors were climate warming, forest being reestablished in the river headwaters and the deepening, rather than widening, of the existing river channel. The channel was gradually eroded as far up river as Piarere, leaving the old Hinuera channel high and dry.[2] The remains of the old river path can be clearly seen at Hinuera where the cliffs mark the ancient river edges. The river's main tributary is the Waipa River, which has its confluence with the Waikato at Ngaruawahia.</code> |
| <code>what type of gas is produced during fermentation</code> | <code>Fermentation Fermentation reacts NADH with an endogenous, organic electron acceptor.[1] Usually this is pyruvate formed from sugar through glycolysis. The reaction produces NAD+ and an organic product, typical examples being ethanol, lactic acid, carbon dioxide, and hydrogen gas (H2). However, more exotic compounds can be produced by fermentation, such as butyric acid and acetone. Fermentation products contain chemical energy (they are not fully oxidized), but are considered waste products, since they cannot be metabolized further without the use of oxygen.</code> |
| <code>why was star wars episode iv released first</code> | <code>Star Wars (film) Star Wars (later retitled Star Wars: Episode IV – A New Hope) is a 1977 American epic space opera film written and directed by George Lucas. It is the first film in the original Star Wars trilogy and the beginning of the Star Wars franchise. Starring Mark Hamill, Harrison Ford, Carrie Fisher, Peter Cushing, Alec Guinness, David Prowse, James Earl Jones, Anthony Daniels, Kenny Baker, and Peter Mayhew, the film's plot focuses on the Rebel Alliance, led by Princess Leia (Fisher), and its attempt to destroy the Galactic Empire's space station, the Death Star. This conflict disrupts the isolated life of farmhand Luke Skywalker (Hamill), who inadvertently acquires two droids that possess stolen architectural plans for the Death Star. When the Empire begins a destructive search for the missing droids, Skywalker accompanies Jedi Master Obi-Wan Kenobi (Guinness) on a mission to return the plans to the Rebel Alliance and rescue Leia from her imprisonment by the Empire.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `learning_rate`: 2e-05
- `weight_decay`: 0.01
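A sketch of passing these non-default values to the v3 trainer; the output directory is a placeholder, and all other arguments keep the defaults listed in the full table below.
```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="outputs/bert-base-all-nli-stsb-quora-nq",  # placeholder path
    eval_strategy="epoch",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    weight_decay=0.01,
    num_train_epochs=3,  # from the full hyperparameter list below
)
```
These arguments would then be passed as `args=args` to the `SentenceTransformerTrainer` sketched earlier.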
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.01
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | quora loss | all-nli-triplet loss | natural-questions loss | stsb loss |
|:------:|:-----:|:-------------:|:----------:|:--------------------:|:----------------------:|:---------:|
| 0.0024 | 10 | 1.1198 | - | - | - | - |
| 0.0049 | 20 | 1.8886 | - | - | - | - |
| 0.0073 | 30 | 0.2303 | - | - | - | - |
| 0.0097 | 40 | 0.1287 | - | - | - | - |
| 0.0122 | 50 | 0.4993 | - | - | - | - |
| 0.0146 | 60 | 0.7388 | - | - | - | - |
| 0.0170 | 70 | 0.8465 | - | - | - | - |
| 0.0195 | 80 | 0.8701 | - | - | - | - |
| 0.0219 | 90 | 0.4349 | - | - | - | - |
| 0.0243 | 100 | 0.2214 | - | - | - | - |
| 0.0268 | 110 | 0.1308 | - | - | - | - |
| 0.0292 | 120 | 0.3163 | - | - | - | - |
| 0.0316 | 130 | 0.3892 | - | - | - | - |
| 0.0341 | 140 | 0.2641 | - | - | - | - |
| 0.0365 | 150 | 0.3359 | - | - | - | - |
| 0.0389 | 160 | 0.5498 | - | - | - | - |
| 0.0414 | 170 | 0.2354 | - | - | - | - |
| 0.0438 | 180 | 0.13 | - | - | - | - |
| 0.0462 | 190 | 0.2307 | - | - | - | - |
| 0.0487 | 200 | 0.1271 | - | - | - | - |
| 0.0511 | 210 | 0.064 | - | - | - | - |
| 0.0535 | 220 | 0.1842 | - | - | - | - |
| 0.0560 | 230 | 0.1626 | - | - | - | - |
| 0.0584 | 240 | 0.1869 | - | - | - | - |
| 0.0608 | 250 | 0.2147 | - | - | - | - |
| 0.0633 | 260 | 0.2534 | - | - | - | - |
| 0.0657 | 270 | 0.1005 | - | - | - | - |
| 0.0681 | 280 | 0.185 | - | - | - | - |
| 0.0706 | 290 | 0.1867 | - | - | - | - |
| 0.0730 | 300 | 0.1905 | - | - | - | - |
| 0.0754 | 310 | 0.2056 | - | - | - | - |
| 0.0779 | 320 | 0.2223 | - | - | - | - |
| 0.0803 | 330 | 0.1499 | - | - | - | - |
| 0.0827 | 340 | 0.107 | - | - | - | - |
| 0.0852 | 350 | 0.1481 | - | - | - | - |
| 0.0876 | 360 | 0.1723 | - | - | - | - |
| 0.0900 | 370 | 0.2387 | - | - | - | - |
| 0.0925 | 380 | 0.274 | - | - | - | - |
| 0.0949 | 390 | 0.1058 | - | - | - | - |
| 0.0973 | 400 | 0.2053 | - | - | - | - |
| 0.0998 | 410 | 0.1103 | - | - | - | - |
| 0.1022 | 420 | 0.1839 | - | - | - | - |
| 0.1046 | 430 | 0.2341 | - | - | - | - |
| 0.1071 | 440 | 0.2015 | - | - | - | - |
| 0.1095 | 450 | 0.1356 | - | - | - | - |
| 0.1119 | 460 | 0.0793 | - | - | - | - |
| 0.1144 | 470 | 0.2756 | - | - | - | - |
| 0.1168 | 480 | 0.0957 | - | - | - | - |
| 0.1192 | 490 | 0.2549 | - | - | - | - |
| 0.1217 | 500 | 0.1483 | - | - | - | - |
| 0.1241 | 510 | 0.2444 | - | - | - | - |
| 0.1265 | 520 | 0.1665 | - | - | - | - |
| 0.1290 | 530 | 0.1091 | - | - | - | - |
| 0.1314 | 540 | 0.1562 | - | - | - | - |
| 0.1338 | 550 | 0.2385 | - | - | - | - |
| 0.1363 | 560 | 0.2801 | - | - | - | - |
| 0.1387 | 570 | 0.2929 | - | - | - | - |
| 0.1411 | 580 | 0.2027 | - | - | - | - |
| 0.1436 | 590 | 0.1628 | - | - | - | - |
| 0.1460 | 600 | 0.1434 | - | - | - | - |
| 0.1484 | 610 | 0.1009 | - | - | - | - |
| 0.1509 | 620 | 0.2225 | - | - | - | - |
| 0.1533 | 630 | 0.1103 | - | - | - | - |
| 0.1557 | 640 | 0.1945 | - | - | - | - |
| 0.1582 | 650 | 0.096 | - | - | - | - |
| 0.1606 | 660 | 0.089 | - | - | - | - |
| 0.1630 | 670 | 0.1493 | - | - | - | - |
| 0.1655 | 680 | 0.1297 | - | - | - | - |
| 0.1679 | 690 | 0.0811 | - | - | - | - |
| 0.1703 | 700 | 0.1718 | - | - | - | - |
| 0.1727 | 710 | 0.1139 | - | - | - | - |
| 0.1752 | 720 | 0.2218 | - | - | - | - |
| 0.1776 | 730 | 0.1397 | - | - | - | - |
| 0.1800 | 740 | 0.1163 | - | - | - | - |
| 0.1825 | 750 | 0.1232 | - | - | - | - |
| 0.1849 | 760 | 0.1724 | - | - | - | - |
| 0.1873 | 770 | 0.051 | - | - | - | - |
| 0.1898 | 780 | 0.1442 | - | - | - | - |
| 0.1922 | 790 | 0.3022 | - | - | - | - |
| 0.1946 | 800 | 0.1056 | - | - | - | - |
| 0.1971 | 810 | 0.1798 | - | - | - | - |
| 0.1995 | 820 | 0.2234 | - | - | - | - |
| 0.2019 | 830 | 0.1251 | - | - | - | - |
| 0.2044 | 840 | 0.2053 | - | - | - | - |
| 0.2068 | 850 | 0.1332 | - | - | - | - |
| 0.2092 | 860 | 0.1611 | - | - | - | - |
| 0.2117 | 870 | 0.0685 | - | - | - | - |
| 0.2141 | 880 | 0.1434 | - | - | - | - |
| 0.2165 | 890 | 0.1516 | - | - | - | - |
| 0.2190 | 900 | 0.1158 | - | - | - | - |
| 0.2214 | 910 | 0.1235 | - | - | - | - |
| 0.2238 | 920 | 0.1113 | - | - | - | - |
| 0.2263 | 930 | 0.2258 | - | - | - | - |
| 0.2287 | 940 | 0.1003 | - | - | - | - |
| 0.2311 | 950 | 0.1943 | - | - | - | - |
| 0.2336 | 960 | 0.1338 | - | - | - | - |
| 0.2360 | 970 | 0.1892 | - | - | - | - |
| 0.2384 | 980 | 0.1784 | - | - | - | - |
| 0.2409 | 990 | 0.1379 | - | - | - | - |
| 0.2433 | 1000 | 0.1426 | - | - | - | - |
| 0.2457 | 1010 | 0.1536 | - | - | - | - |
| 0.2482 | 1020 | 0.118 | - | - | - | - |
| 0.2506 | 1030 | 0.1463 | - | - | - | - |
| 0.2530 | 1040 | 0.1821 | - | - | - | - |
| 0.2555 | 1050 | 0.1829 | - | - | - | - |
| 0.2579 | 1060 | 0.2086 | - | - | - | - |
| 0.2603 | 1070 | 0.1066 | - | - | - | - |
| 0.2628 | 1080 | 0.2072 | - | - | - | - |
| 0.2652 | 1090 | 0.0754 | - | - | - | - |
| 0.2676 | 1100 | 0.0863 | - | - | - | - |
| 0.2701 | 1110 | 0.0821 | - | - | - | - |
| 0.2725 | 1120 | 0.0978 | - | - | - | - |
| 0.2749 | 1130 | 0.093 | - | - | - | - |
| 0.2774 | 1140 | 0.0999 | - | - | - | - |
| 0.2798 | 1150 | 0.1242 | - | - | - | - |
| 0.2822 | 1160 | 0.1832 | - | - | - | - |
| 0.2847 | 1170 | 0.1515 | - | - | - | - |
| 0.2871 | 1180 | 0.187 | - | - | - | - |
| 0.2895 | 1190 | 0.1394 | - | - | - | - |
| 0.2920 | 1200 | 0.1922 | - | - | - | - |
| 0.2944 | 1210 | 0.1522 | - | - | - | - |
| 0.2968 | 1220 | 0.2439 | - | - | - | - |
| 0.2993 | 1230 | 0.0743 | - | - | - | - |
| 0.3017 | 1240 | 0.101 | - | - | - | - |
| 0.3041 | 1250 | 0.0736 | - | - | - | - |
| 0.3066 | 1260 | 0.1892 | - | - | - | - |
| 0.3090 | 1270 | 0.1031 | - | - | - | - |
| 0.3114 | 1280 | 0.1348 | - | - | - | - |
| 0.3139 | 1290 | 0.0839 | - | - | - | - |
| 0.3163 | 1300 | 0.104 | - | - | - | - |
| 0.3187 | 1310 | 0.1508 | - | - | - | - |
| 0.3212 | 1320 | 0.163 | - | - | - | - |
| 0.3236 | 1330 | 0.1057 | - | - | - | - |
| 0.3260 | 1340 | 0.0979 | - | - | - | - |
| 0.3285 | 1350 | 0.1521 | - | - | - | - |
| 0.3309 | 1360 | 0.0549 | - | - | - | - |
| 0.3333 | 1370 | 0.1038 | - | - | - | - |
| 0.3358 | 1380 | 0.1023 | - | - | - | - |
| 0.3382 | 1390 | 0.0543 | - | - | - | - |
| 0.3406 | 1400 | 0.1276 | - | - | - | - |
| 0.3431 | 1410 | 0.0705 | - | - | - | - |
| 0.3455 | 1420 | 0.1127 | - | - | - | - |
| 0.3479 | 1430 | 0.0737 | - | - | - | - |
| 0.3504 | 1440 | 0.066 | - | - | - | - |
| 0.3528 | 1450 | 0.0864 | - | - | - | - |
| 0.3552 | 1460 | 0.1299 | - | - | - | - |
| 0.3577 | 1470 | 0.1171 | - | - | - | - |
| 0.3601 | 1480 | 0.1578 | - | - | - | - |
| 0.3625 | 1490 | 0.0774 | - | - | - | - |
| 0.3650 | 1500 | 0.2007 | - | - | - | - |
| 0.3674 | 1510 | 0.1538 | - | - | - | - |
| 0.3698 | 1520 | 0.1343 | - | - | - | - |
| 0.3723 | 1530 | 0.0861 | - | - | - | - |
| 0.3747 | 1540 | 0.1305 | - | - | - | - |
| 0.3771 | 1550 | 0.3199 | - | - | - | - |
| 0.3796 | 1560 | 0.0887 | - | - | - | - |
| 0.3820 | 1570 | 0.1275 | - | - | - | - |
| 0.3844 | 1580 | 0.1526 | - | - | - | - |
| 0.3869 | 1590 | 0.1412 | - | - | - | - |
| 0.3893 | 1600 | 0.096 | - | - | - | - |
| 0.3917 | 1610 | 0.1666 | - | - | - | - |
| 0.3942 | 1620 | 0.1311 | - | - | - | - |
| 0.3966 | 1630 | 0.0828 | - | - | - | - |
| 0.3990 | 1640 | 0.0929 | - | - | - | - |
| 0.4015 | 1650 | 0.1271 | - | - | - | - |
| 0.4039 | 1660 | 0.0411 | - | - | - | - |
| 0.4063 | 1670 | 0.0848 | - | - | - | - |
| 0.4088 | 1680 | 0.2556 | - | - | - | - |
| 0.4112 | 1690 | 0.1273 | - | - | - | - |
| 0.4136 | 1700 | 0.1636 | - | - | - | - |
| 0.4161 | 1710 | 0.0851 | - | - | - | - |
| 0.4185 | 1720 | 0.1129 | - | - | - | - |
| 0.4209 | 1730 | 0.1433 | - | - | - | - |
| 0.4234 | 1740 | 0.1752 | - | - | - | - |
| 0.4258 | 1750 | 0.1049 | - | - | - | - |
| 0.4282 | 1760 | 0.1691 | - | - | - | - |
| 0.4307 | 1770 | 0.2687 | - | - | - | - |
| 0.4331 | 1780 | 0.1624 | - | - | - | - |
| 0.4355 | 1790 | 0.1654 | - | - | - | - |
| 0.4380 | 1800 | 0.1209 | - | - | - | - |
| 0.4404 | 1810 | 0.2127 | - | - | - | - |
| 0.4428 | 1820 | 0.0449 | - | - | - | - |
| 0.4453 | 1830 | 0.0906 | - | - | - | - |
| 0.4477 | 1840 | 0.1546 | - | - | - | - |
| 0.4501 | 1850 | 0.0938 | - | - | - | - |
| 0.4526 | 1860 | 0.1115 | - | - | - | - |
| 0.4550 | 1870 | 0.0864 | - | - | - | - |
| 0.4574 | 1880 | 0.1515 | - | - | - | - |
| 0.4599 | 1890 | 0.091 | - | - | - | - |
| 0.4623 | 1900 | 0.1496 | - | - | - | - |
| 0.4647 | 1910 | 0.1807 | - | - | - | - |
| 0.4672 | 1920 | 0.1351 | - | - | - | - |
| 0.4696 | 1930 | 0.114 | - | - | - | - |
| 0.4720 | 1940 | 0.1673 | - | - | - | - |
| 0.4745 | 1950 | 0.1655 | - | - | - | - |
| 0.4769 | 1960 | 0.0662 | - | - | - | - |
| 0.4793 | 1970 | 0.1377 | - | - | - | - |
| 0.4818 | 1980 | 0.0512 | - | - | - | - |
| 0.4842 | 1990 | 0.1399 | - | - | - | - |
| 0.4866 | 2000 | 0.1613 | - | - | - | - |
| 0.4891 | 2010 | 0.1326 | - | - | - | - |
| 0.4915 | 2020 | 0.1201 | - | - | - | - |
| 0.4939 | 2030 | 0.097 | - | - | - | - |
| 0.4964 | 2040 | 0.0788 | - | - | - | - |
| 0.4988 | 2050 | 0.1282 | - | - | - | - |
| 0.5012 | 2060 | 0.2038 | - | - | - | - |
| 0.5036 | 2070 | 0.1078 | - | - | - | - |
| 0.5061 | 2080 | 0.1594 | - | - | - | - |
| 0.5085 | 2090 | 0.1628 | - | - | - | - |
| 0.5109 | 2100 | 0.0744 | - | - | - | - |
| 0.5134 | 2110 | 0.1587 | - | - | - | - |
| 0.5158 | 2120 | 0.0573 | - | - | - | - |
| 0.5182 | 2130 | 0.1672 | - | - | - | - |
| 0.5207 | 2140 | 0.1139 | - | - | - | - |
| 0.5231 | 2150 | 0.1285 | - | - | - | - |
| 0.5255 | 2160 | 0.1538 | - | - | - | - |
| 0.5280 | 2170 | 0.1642 | - | - | - | - |
| 0.5304 | 2180 | 0.1012 | - | - | - | - |
| 0.5328 | 2190 | 0.0554 | - | - | - | - |
| 0.5353 | 2200 | 0.0656 | - | - | - | - |
| 0.5377 | 2210 | 0.1206 | - | - | - | - |
| 0.5401 | 2220 | 0.1164 | - | - | - | - |
| 0.5426 | 2230 | 0.1364 | - | - | - | - |
| 0.5450 | 2240 | 0.1188 | - | - | - | - |
| 0.5474 | 2250 | 0.0965 | - | - | - | - |
| 0.5499 | 2260 | 0.0789 | - | - | - | - |
| 0.5523 | 2270 | 0.0793 | - | - | - | - |
| 0.5547 | 2280 | 0.1205 | - | - | - | - |
| 0.5572 | 2290 | 0.089 | - | - | - | - |
| 0.5596 | 2300 | 0.1049 | - | - | - | - |
| 0.5620 | 2310 | 0.0989 | - | - | - | - |
| 0.5645 | 2320 | 0.1822 | - | - | - | - |
| 0.5669 | 2330 | 0.1367 | - | - | - | - |
| 0.5693 | 2340 | 0.1238 | - | - | - | - |
| 0.5718 | 2350 | 0.1383 | - | - | - | - |
| 0.5742 | 2360 | 0.184 | - | - | - | - |
| 0.5766 | 2370 | 0.1254 | - | - | - | - |
| 0.5791 | 2380 | 0.1046 | - | - | - | - |
| 0.5815 | 2390 | 0.1175 | - | - | - | - |
| 0.5839 | 2400 | 0.0698 | - | - | - | - |
| 0.5864 | 2410 | 0.111 | - | - | - | - |
| 0.5888 | 2420 | 0.115 | - | - | - | - |
| 0.5912 | 2430 | 0.1721 | - | - | - | - |
| 0.5937 | 2440 | 0.0904 | - | - | - | - |
| 0.5961 | 2450 | 0.1142 | - | - | - | - |
| 0.5985 | 2460 | 0.1021 | - | - | - | - |
| 0.6010 | 2470 | 0.0307 | - | - | - | - |
| 0.6034 | 2480 | 0.1495 | - | - | - | - |
| 0.6058 | 2490 | 0.1031 | - | - | - | - |
| 0.6083 | 2500 | 0.0951 | - | - | - | - |
| 0.6107 | 2510 | 0.0941 | - | - | - | - |
| 0.6131 | 2520 | 0.2231 | - | - | - | - |
| 0.6156 | 2530 | 0.1572 | - | - | - | - |
| 0.6180 | 2540 | 0.2004 | - | - | - | - |
| 0.6204 | 2550 | 0.0573 | - | - | - | - |
| 0.6229 | 2560 | 0.156 | - | - | - | - |
| 0.6253 | 2570 | 0.1244 | - | - | - | - |
| 0.6277 | 2580 | 0.0996 | - | - | - | - |
| 0.6302 | 2590 | 0.163 | - | - | - | - |
| 0.6326 | 2600 | 0.169 | - | - | - | - |
| 0.6350 | 2610 | 0.1593 | - | - | - | - |
| 0.6375 | 2620 | 0.098 | - | - | - | - |
| 0.6399 | 2630 | 0.1133 | - | - | - | - |
| 0.6423 | 2640 | 0.1267 | - | - | - | - |
| 0.6448 | 2650 | 0.1006 | - | - | - | - |
| 0.6472 | 2660 | 0.178 | - | - | - | - |
| 0.6496 | 2670 | 0.1124 | - | - | - | - |
| 0.6521 | 2680 | 0.0952 | - | - | - | - |
| 0.6545 | 2690 | 0.0726 | - | - | - | - |
| 0.6569 | 2700 | 0.1105 | - | - | - | - |
| 0.6594 | 2710 | 0.1675 | - | - | - | - |
| 0.6618 | 2720 | 0.1711 | - | - | - | - |
| 0.6642 | 2730 | 0.1481 | - | - | - | - |
| 0.6667 | 2740 | 0.1078 | - | - | - | - |
| 0.6691 | 2750 | 0.0981 | - | - | - | - |
| 0.6715 | 2760 | 0.115 | - | - | - | - |
| 0.6740 | 2770 | 0.0855 | - | - | - | - |
| 0.6764 | 2780 | 0.0657 | - | - | - | - |
| 0.6788 | 2790 | 0.0539 | - | - | - | - |
| 0.6813 | 2800 | 0.0766 | - | - | - | - |
| 0.6837 | 2810 | 0.1608 | - | - | - | - |
| 0.6861 | 2820 | 0.1263 | - | - | - | - |
| 0.6886 | 2830 | 0.0992 | - | - | - | - |
| 0.6910 | 2840 | 0.1147 | - | - | - | - |
| 0.6934 | 2850 | 0.1697 | - | - | - | - |
| 0.6959 | 2860 | 0.1602 | - | - | - | - |
| 0.6983 | 2870 | 0.083 | - | - | - | - |
| 0.7007 | 2880 | 0.1068 | - | - | - | - |
| 0.7032 | 2890 | 0.1074 | - | - | - | - |
| 0.7056 | 2900 | 0.0695 | - | - | - | - |
| 0.7080 | 2910 | 0.0529 | - | - | - | - |
| 0.7105 | 2920 | 0.1381 | - | - | - | - |
| 0.7129 | 2930 | 0.1418 | - | - | - | - |
| 0.7153 | 2940 | 0.1506 | - | - | - | - |
| 0.7178 | 2950 | 0.1069 | - | - | - | - |
| 0.7202 | 2960 | 0.147 | - | - | - | - |
| 0.7226 | 2970 | 0.1358 | - | - | - | - |
| 0.7251 | 2980 | 0.1592 | - | - | - | - |
| 0.7275 | 2990 | 0.1387 | - | - | - | - |
| 0.7299 | 3000 | 0.0886 | - | - | - | - |
| 0.7324 | 3010 | 0.149 | - | - | - | - |
| 0.7348 | 3020 | 0.1347 | - | - | - | - |
| 0.7372 | 3030 | 0.1022 | - | - | - | - |
| 0.7397 | 3040 | 0.0747 | - | - | - | - |
| 0.7421 | 3050 | 0.0839 | - | - | - | - |
| 0.7445 | 3060 | 0.1364 | - | - | - | - |
| 0.7470 | 3070 | 0.1191 | - | - | - | - |
| 0.7494 | 3080 | 0.0779 | - | - | - | - |
| 0.7518 | 3090 | 0.0654 | - | - | - | - |
| 0.7543 | 3100 | 0.0714 | - | - | - | - |
| 0.7567 | 3110 | 0.1154 | - | - | - | - |
| 0.7591 | 3120 | 0.0546 | - | - | - | - |
| 0.7616 | 3130 | 0.0548 | - | - | - | - |
| 0.7640 | 3140 | 0.0569 | - | - | - | - |
| 0.7664 | 3150 | 0.0964 | - | - | - | - |
| 0.7689 | 3160 | 0.0445 | - | - | - | - |
| 0.7713 | 3170 | 0.1362 | - | - | - | - |
| 0.7737 | 3180 | 0.1239 | - | - | - | - |
| 0.7762 | 3190 | 0.0981 | - | - | - | - |
| 0.7786 | 3200 | 0.0422 | - | - | - | - |
| 0.7810 | 3210 | 0.1282 | - | - | - | - |
| 0.7835 | 3220 | 0.0847 | - | - | - | - |
| 0.7859 | 3230 | 0.1134 | - | - | - | - |
| 0.7883 | 3240 | 0.1048 | - | - | - | - |
| 0.7908 | 3250 | 0.1091 | - | - | - | - |
| 0.7932 | 3260 | 0.0428 | - | - | - | - |
| 0.7956 | 3270 | 0.0632 | - | - | - | - |
| 0.7981 | 3280 | 0.0808 | - | - | - | - |
| 0.8005 | 3290 | 0.0604 | - | - | - | - |
| 0.8029 | 3300 | 0.1614 | - | - | - | - |
| 0.8054 | 3310 | 0.1604 | - | - | - | - |
| 0.8078 | 3320 | 0.0899 | - | - | - | - |
| 0.8102 | 3330 | 0.1097 | - | - | - | - |
| 0.8127 | 3340 | 0.1269 | - | - | - | - |
| 0.8151 | 3350 | 0.0738 | - | - | - | - |
| 0.8175 | 3360 | 0.0768 | - | - | - | - |
| 0.8200 | 3370 | 0.0752 | - | - | - | - |
| 0.8224 | 3380 | 0.1379 | - | - | - | - |
| 0.8248 | 3390 | 0.0877 | - | - | - | - |
| 0.8273 | 3400 | 0.1311 | - | - | - | - |
| 0.8297 | 3410 | 0.1109 | - | - | - | - |
| 0.8321 | 3420 | 0.1557 | - | - | - | - |
| 0.8345 | 3430 | 0.1509 | - | - | - | - |
| 0.8370 | 3440 | 0.0962 | - | - | - | - |
| 0.8394 | 3450 | 0.0631 | - | - | - | - |
| 0.8418 | 3460 | 0.0835 | - | - | - | - |
| 0.8443 | 3470 | 0.1488 | - | - | - | - |
| 0.8467 | 3480 | 0.0903 | - | - | - | - |
| 0.8491 | 3490 | 0.0927 | - | - | - | - |
| 0.8516 | 3500 | 0.1457 | - | - | - | - |
| 0.8540 | 3510 | 0.0775 | - | - | - | - |
| 0.8564 | 3520 | 0.1314 | - | - | - | - |
| 0.8589 | 3530 | 0.1528 | - | - | - | - |
| 0.8613 | 3540 | 0.0695 | - | - | - | - |
| 0.8637 | 3550 | 0.0673 | - | - | - | - |
| 0.8662 | 3560 | 0.1441 | - | - | - | - |
| 0.8686 | 3570 | 0.135 | - | - | - | - |
| 0.8710 | 3580 | 0.1595 | - | - | - | - |
| 0.8735 | 3590 | 0.1125 | - | - | - | - |
| 0.8759 | 3600 | 0.0709 | - | - | - | - |
| 0.8783 | 3610 | 0.1191 | - | - | - | - |
| 0.8808 | 3620 | 0.1614 | - | - | - | - |
| 0.8832 | 3630 | 0.086 | - | - | - | - |
| 0.8856 | 3640 | 0.0818 | - | - | - | - |
| 0.8881 | 3650 | 0.0544 | - | - | - | - |
| 0.8905 | 3660 | 0.0797 | - | - | - | - |
| 0.8929 | 3670 | 0.0691 | - | - | - | - |
| 0.8954 | 3680 | 0.0924 | - | - | - | - |
| 0.8978 | 3690 | 0.0572 | - | - | - | - |
| 0.9002 | 3700 | 0.0532 | - | - | - | - |
| 0.9027 | 3710 | 0.1519 | - | - | - | - |
| 0.9051 | 3720 | 0.0983 | - | - | - | - |
| 0.9075 | 3730 | 0.0772 | - | - | - | - |
| 0.9100 | 3740 | 0.18 | - | - | - | - |
| 0.9124 | 3750 | 0.0485 | - | - | - | - |
| 0.9148 | 3760 | 0.0872 | - | - | - | - |
| 0.9173 | 3770 | 0.1069 | - | - | - | - |
| 0.9197 | 3780 | 0.0657 | - | - | - | - |
| 0.9221 | 3790 | 0.1811 | - | - | - | - |
| 0.9246 | 3800 | 0.1038 | - | - | - | - |
| 0.9270 | 3810 | 0.087 | - | - | - | - |
| 0.9294 | 3820 | 0.1569 | - | - | - | - |
| 0.9319 | 3830 | 0.0404 | - | - | - | - |
| 0.9343 | 3840 | 0.1468 | - | - | - | - |
| 0.9367 | 3850 | 0.0974 | - | - | - | - |
| 0.9392 | 3860 | 0.1231 | - | - | - | - |
| 0.9416 | 3870 | 0.1511 | - | - | - | - |
| 0.9440 | 3880 | 0.0386 | - | - | - | - |
| 0.9465 | 3890 | 0.0918 | - | - | - | - |
| 0.9489 | 3900 | 0.0661 | - | - | - | - |
| 0.9513 | 3910 | 0.1355 | - | - | - | - |
| 0.9538 | 3920 | 0.1182 | - | - | - | - |
| 0.9562 | 3930 | 0.1254 | - | - | - | - |
| 0.9586 | 3940 | 0.1999 | - | - | - | - |
| 0.9611 | 3950 | 0.125 | - | - | - | - |
| 0.9635 | 3960 | 0.0303 | - | - | - | - |
| 0.9659 | 3970 | 0.1192 | - | - | - | - |
| 0.9684 | 3980 | 0.1182 | - | - | - | - |
| 0.9708 | 3990 | 0.1449 | - | - | - | - |
| 0.9732 | 4000 | 0.1387 | - | - | - | - |
| 0.9757 | 4010 | 0.077 | - | - | - | - |
| 0.9781 | 4020 | 0.1118 | - | - | - | - |
| 0.9805 | 4030 | 0.0567 | - | - | - | - |
| 0.9830 | 4040 | 0.0454 | - | - | - | - |
| 0.9854 | 4050 | 0.1179 | - | - | - | - |
| 0.9878 | 4060 | 0.0993 | - | - | - | - |
| 0.9903 | 4070 | 0.1377 | - | - | - | - |
| 0.9927 | 4080 | 0.1308 | - | - | - | - |
| 0.9951 | 4090 | 0.0982 | - | - | - | - |
| 0.9976 | 4100 | 0.1211 | - | - | - | - |
| 1.0 | 4110 | 0.2036 | 0.0136 | 2.3842 | 0.0331 | 0.0606 |
| 1.0024 | 4120 | 0.1825 | - | - | - | - |
| 1.0049 | 4130 | 0.1088 | - | - | - | - |
| 1.0073 | 4140 | 0.1301 | - | - | - | - |
| 1.0097 | 4150 | 0.0549 | - | - | - | - |
| 1.0122 | 4160 | 0.0714 | - | - | - | - |
| 1.0146 | 4170 | 0.0743 | - | - | - | - |
| 1.0170 | 4180 | 0.0531 | - | - | - | - |
| 1.0195 | 4190 | 0.0749 | - | - | - | - |
| 1.0219 | 4200 | 0.0868 | - | - | - | - |
| 1.0243 | 4210 | 0.0544 | - | - | - | - |
| 1.0268 | 4220 | 0.0894 | - | - | - | - |
| 1.0292 | 4230 | 0.0971 | - | - | - | - |
| 1.0316 | 4240 | 0.0709 | - | - | - | - |
| 1.0341 | 4250 | 0.055 | - | - | - | - |
| 1.0365 | 4260 | 0.0386 | - | - | - | - |
| 1.0389 | 4270 | 0.1549 | - | - | - | - |
| 1.0414 | 4280 | 0.102 | - | - | - | - |
| 1.0438 | 4290 | 0.0422 | - | - | - | - |
| 1.0462 | 4300 | 0.0886 | - | - | - | - |
| 1.0487 | 4310 | 0.0583 | - | - | - | - |
| 1.0511 | 4320 | 0.0522 | - | - | - | - |
| 1.0535 | 4330 | 0.0478 | - | - | - | - |
| 1.0560 | 4340 | 0.0328 | - | - | - | - |
| 1.0584 | 4350 | 0.028 | - | - | - | - |
| 1.0608 | 4360 | 0.0129 | - | - | - | - |
| 1.0633 | 4370 | 0.084 | - | - | - | - |
| 1.0657 | 4380 | 0.0523 | - | - | - | - |
| 1.0681 | 4390 | 0.1178 | - | - | - | - |
| 1.0706 | 4400 | 0.0294 | - | - | - | - |
| 1.0730 | 4410 | 0.0648 | - | - | - | - |
| 1.0754 | 4420 | 0.0422 | - | - | - | - |
| 1.0779 | 4430 | 0.0922 | - | - | - | - |
| 1.0803 | 4440 | 0.0587 | - | - | - | - |
| 1.0827 | 4450 | 0.0554 | - | - | - | - |
| 1.0852 | 4460 | 0.0951 | - | - | - | - |
| 1.0876 | 4470 | 0.108 | - | - | - | - |
| 1.0900 | 4480 | 0.0677 | - | - | - | - |
| 1.0925 | 4490 | 0.0737 | - | - | - | - |
| 1.0949 | 4500 | 0.0447 | - | - | - | - |
| 1.0973 | 4510 | 0.0531 | - | - | - | - |
| 1.0998 | 4520 | 0.0605 | - | - | - | - |
| 1.1022 | 4530 | 0.0871 | - | - | - | - |
| 1.1046 | 4540 | 0.0718 | - | - | - | - |
| 1.1071 | 4550 | 0.0672 | - | - | - | - |
| 1.1095 | 4560 | 0.0829 | - | - | - | - |
| 1.1119 | 4570 | 0.0539 | - | - | - | - |
| 1.1144 | 4580 | 0.0751 | - | - | - | - |
| 1.1168 | 4590 | 0.0521 | - | - | - | - |
| 1.1192 | 4600 | 0.1046 | - | - | - | - |
| 1.1217 | 4610 | 0.0631 | - | - | - | - |
| 1.1241 | 4620 | 0.1142 | - | - | - | - |
| 1.1265 | 4630 | 0.0556 | - | - | - | - |
| 1.1290 | 4640 | 0.0398 | - | - | - | - |
| 1.1314 | 4650 | 0.0817 | - | - | - | - |
| 1.1338 | 4660 | 0.054 | - | - | - | - |
| 1.1363 | 4670 | 0.12 | - | - | - | - |
| 1.1387 | 4680 | 0.0762 | - | - | - | - |
| 1.1411 | 4690 | 0.0138 | - | - | - | - |
| 1.1436 | 4700 | 0.0777 | - | - | - | - |
| 1.1460 | 4710 | 0.0582 | - | - | - | - |
| 1.1484 | 4720 | 0.0721 | - | - | - | - |
| 1.1509 | 4730 | 0.104 | - | - | - | - |
| 1.1533 | 4740 | 0.087 | - | - | - | - |
| 1.1557 | 4750 | 0.0842 | - | - | - | - |
| 1.1582 | 4760 | 0.0416 | - | - | - | - |
| 1.1606 | 4770 | 0.0806 | - | - | - | - |
| 1.1630 | 4780 | 0.0588 | - | - | - | - |
| 1.1655 | 4790 | 0.0291 | - | - | - | - |
| 1.1679 | 4800 | 0.0638 | - | - | - | - |
| 1.1703 | 4810 | 0.0837 | - | - | - | - |
| 1.1727 | 4820 | 0.0702 | - | - | - | - |
| 1.1752 | 4830 | 0.0442 | - | - | - | - |
| 1.1776 | 4840 | 0.0528 | - | - | - | - |
| 1.1800 | 4850 | 0.0601 | - | - | - | - |
| 1.1825 | 4860 | 0.0344 | - | - | - | - |
| 1.1849 | 4870 | 0.0443 | - | - | - | - |
| 1.1873 | 4880 | 0.0383 | - | - | - | - |
| 1.1898 | 4890 | 0.0359 | - | - | - | - |
| 1.1922 | 4900 | 0.137 | - | - | - | - |
| 1.1946 | 4910 | 0.0451 | - | - | - | - |
| 1.1971 | 4920 | 0.0635 | - | - | - | - |
| 1.1995 | 4930 | 0.0927 | - | - | - | - |
| 1.2019 | 4940 | 0.0734 | - | - | - | - |
| 1.2044 | 4950 | 0.0839 | - | - | - | - |
| 1.2068 | 4960 | 0.1103 | - | - | - | - |
| 1.2092 | 4970 | 0.0715 | - | - | - | - |
| 1.2117 | 4980 | 0.0229 | - | - | - | - |
| 1.2141 | 4990 | 0.0237 | - | - | - | - |
| 1.2165 | 5000 | 0.0618 | - | - | - | - |
| 1.2190 | 5010 | 0.0559 | - | - | - | - |
| 1.2214 | 5020 | 0.0967 | - | - | - | - |
| 1.2238 | 5030 | 0.0697 | - | - | - | - |
| 1.2263 | 5040 | 0.0507 | - | - | - | - |
| 1.2287 | 5050 | 0.0642 | - | - | - | - |
| 1.2311 | 5060 | 0.0485 | - | - | - | - |
| 1.2336 | 5070 | 0.0676 | - | - | - | - |
| 1.2360 | 5080 | 0.1147 | - | - | - | - |
| 1.2384 | 5090 | 0.061 | - | - | - | - |
| 1.2409 | 5100 | 0.0333 | - | - | - | - |
| 1.2433 | 5110 | 0.0334 | - | - | - | - |
| 1.2457 | 5120 | 0.0751 | - | - | - | - |
| 1.2482 | 5130 | 0.0942 | - | - | - | - |
| 1.2506 | 5140 | 0.0609 | - | - | - | - |
| 1.2530 | 5150 | 0.0983 | - | - | - | - |
| 1.2555 | 5160 | 0.033 | - | - | - | - |
| 1.2579 | 5170 | 0.0805 | - | - | - | - |
| 1.2603 | 5180 | 0.0561 | - | - | - | - |
| 1.2628 | 5190 | 0.0961 | - | - | - | - |
| 1.2652 | 5200 | 0.0579 | - | - | - | - |
| 1.2676 | 5210 | 0.0648 | - | - | - | - |
| 1.2701 | 5220 | 0.0507 | - | - | - | - |
| 1.2725 | 5230 | 0.0313 | - | - | - | - |
| 1.2749 | 5240 | 0.0429 | - | - | - | - |
| 1.2774 | 5250 | 0.0673 | - | - | - | - |
| 1.2798 | 5260 | 0.0926 | - | - | - | - |
| 1.2822 | 5270 | 0.0745 | - | - | - | - |
| 1.2847 | 5280 | 0.0566 | - | - | - | - |
| 1.2871 | 5290 | 0.0657 | - | - | - | - |
| 1.2895 | 5300 | 0.0755 | - | - | - | - |
| 1.2920 | 5310 | 0.0607 | - | - | - | - |
| 1.2944 | 5320 | 0.0849 | - | - | - | - |
| 1.2968 | 5330 | 0.106 | - | - | - | - |
| 1.2993 | 5340 | 0.0283 | - | - | - | - |
| 1.3017 | 5350 | 0.0628 | - | - | - | - |
| 1.3041 | 5360 | 0.0603 | - | - | - | - |
| 1.3066 | 5370 | 0.0616 | - | - | - | - |
| 1.3090 | 5380 | 0.0463 | - | - | - | - |
| 1.3114 | 5390 | 0.0546 | - | - | - | - |
| 1.3139 | 5400 | 0.0492 | - | - | - | - |
| 1.3163 | 5410 | 0.0555 | - | - | - | - |
| 1.3187 | 5420 | 0.0817 | - | - | - | - |
| 1.3212 | 5430 | 0.0876 | - | - | - | - |
| 1.3236 | 5440 | 0.0379 | - | - | - | - |
| 1.3260 | 5450 | 0.0788 | - | - | - | - |
| 1.3285 | 5460 | 0.0751 | - | - | - | - |
| 1.3309 | 5470 | 0.0366 | - | - | - | - |
| 1.3333 | 5480 | 0.073 | - | - | - | - |
| 1.3358 | 5490 | 0.0562 | - | - | - | - |
| 1.3382 | 5500 | 0.0129 | - | - | - | - |
| 1.3406 | 5510 | 0.0575 | - | - | - | - |
| 1.3431 | 5520 | 0.0644 | - | - | - | - |
| 1.3455 | 5530 | 0.0419 | - | - | - | - |
| 1.3479 | 5540 | 0.0578 | - | - | - | - |
| 1.3504 | 5550 | 0.0402 | - | - | - | - |
| 1.3528 | 5560 | 0.0455 | - | - | - | - |
| 1.3552 | 5570 | 0.0676 | - | - | - | - |
| 1.3577 | 5580 | 0.0503 | - | - | - | - |
| 1.3601 | 5590 | 0.0824 | - | - | - | - |
| 1.3625 | 5600 | 0.0288 | - | - | - | - |
| 1.3650 | 5610 | 0.1038 | - | - | - | - |
| 1.3674 | 5620 | 0.0681 | - | - | - | - |
| 1.3698 | 5630 | 0.0767 | - | - | - | - |
| 1.3723 | 5640 | 0.0507 | - | - | - | - |
| 1.3747 | 5650 | 0.0532 | - | - | - | - |
| 1.3771 | 5660 | 0.1468 | - | - | - | - |
| 1.3796 | 5670 | 0.0391 | - | - | - | - |
| 1.3820 | 5680 | 0.0566 | - | - | - | - |
| 1.3844 | 5690 | 0.0496 | - | - | - | - |
| 1.3869 | 5700 | 0.0688 | - | - | - | - |
| 1.3893 | 5710 | 0.062 | - | - | - | - |
| 1.3917 | 5720 | 0.0834 | - | - | - | - |
| 1.3942 | 5730 | 0.0611 | - | - | - | - |
| 1.3966 | 5740 | 0.0593 | - | - | - | - |
| 1.3990 | 5750 | 0.0664 | - | - | - | - |
| 1.4015 | 5760 | 0.0841 | - | - | - | - |
| 1.4039 | 5770 | 0.02 | - | - | - | - |
| 1.4063 | 5780 | 0.0283 | - | - | - | - |
| 1.4088 | 5790 | 0.1089 | - | - | - | - |
| 1.4112 | 5800 | 0.0583 | - | - | - | - |
| 1.4136 | 5810 | 0.0692 | - | - | - | - |
| 1.4161 | 5820 | 0.0371 | - | - | - | - |
| 1.4185 | 5830 | 0.0575 | - | - | - | - |
| 1.4209 | 5840 | 0.0822 | - | - | - | - |
| 1.4234 | 5850 | 0.1046 | - | - | - | - |
| 1.4258 | 5860 | 0.0509 | - | - | - | - |
| 1.4282 | 5870 | 0.0943 | - | - | - | - |
| 1.4307 | 5880 | 0.1221 | - | - | - | - |
| 1.4331 | 5890 | 0.0651 | - | - | - | - |
| 1.4355 | 5900 | 0.0701 | - | - | - | - |
| 1.4380 | 5910 | 0.0638 | - | - | - | - |
| 1.4404 | 5920 | 0.1021 | - | - | - | - |
| 1.4428 | 5930 | 0.0386 | - | - | - | - |
| 1.4453 | 5940 | 0.0697 | - | - | - | - |
| 1.4477 | 5950 | 0.064 | - | - | - | - |
| 1.4501 | 5960 | 0.0522 | - | - | - | - |
| 1.4526 | 5970 | 0.075 | - | - | - | - |
| 1.4550 | 5980 | 0.0383 | - | - | - | - |
| 1.4574 | 5990 | 0.0818 | - | - | - | - |
| 1.4599 | 6000 | 0.0472 | - | - | - | - |
| 1.4623 | 6010 | 0.0783 | - | - | - | - |
| 1.4647 | 6020 | 0.0517 | - | - | - | - |
| 1.4672 | 6030 | 0.046 | - | - | - | - |
| 1.4696 | 6040 | 0.0759 | - | - | - | - |
| 1.4720 | 6050 | 0.0645 | - | - | - | - |
| 1.4745 | 6060 | 0.0794 | - | - | - | - |
| 1.4769 | 6070 | 0.0396 | - | - | - | - |
| 1.4793 | 6080 | 0.0524 | - | - | - | - |
| 1.4818 | 6090 | 0.0116 | - | - | - | - |
| 1.4842 | 6100 | 0.0657 | - | - | - | - |
| 1.4866 | 6110 | 0.0728 | - | - | - | - |
| 1.4891 | 6120 | 0.0663 | - | - | - | - |
| 1.4915 | 6130 | 0.0965 | - | - | - | - |
| 1.4939 | 6140 | 0.0535 | - | - | - | - |
| 1.4964 | 6150 | 0.0389 | - | - | - | - |
| 1.4988 | 6160 | 0.0976 | - | - | - | - |
| 1.5012 | 6170 | 0.1219 | - | - | - | - |
| 1.5036 | 6180 | 0.0488 | - | - | - | - |
| 1.5061 | 6190 | 0.1015 | - | - | - | - |
| 1.5085 | 6200 | 0.0982 | - | - | - | - |
| 1.5109 | 6210 | 0.0565 | - | - | - | - |
| 1.5134 | 6220 | 0.0831 | - | - | - | - |
| 1.5158 | 6230 | 0.0463 | - | - | - | - |
| 1.5182 | 6240 | 0.1356 | - | - | - | - |
| 1.5207 | 6250 | 0.0567 | - | - | - | - |
| 1.5231 | 6260 | 0.0459 | - | - | - | - |
| 1.5255 | 6270 | 0.0767 | - | - | - | - |
| 1.5280 | 6280 | 0.0798 | - | - | - | - |
| 1.5304 | 6290 | 0.0632 | - | - | - | - |
| 1.5328 | 6300 | 0.0431 | - | - | - | - |
| 1.5353 | 6310 | 0.0175 | - | - | - | - |
| 1.5377 | 6320 | 0.0482 | - | - | - | - |
| 1.5401 | 6330 | 0.0841 | - | - | - | - |
| 1.5426 | 6340 | 0.0756 | - | - | - | - |
| 1.5450 | 6350 | 0.078 | - | - | - | - |
| 1.5474 | 6360 | 0.0608 | - | - | - | - |
| 1.5499 | 6370 | 0.0678 | - | - | - | - |
| 1.5523 | 6380 | 0.054 | - | - | - | - |
| 1.5547 | 6390 | 0.0823 | - | - | - | - |
| 1.5572 | 6400 | 0.0322 | - | - | - | - |
| 1.5596 | 6410 | 0.0432 | - | - | - | - |
| 1.5620 | 6420 | 0.0251 | - | - | - | - |
| 1.5645 | 6430 | 0.0349 | - | - | - | - |
| 1.5669 | 6440 | 0.0591 | - | - | - | - |
| 1.5693 | 6450 | 0.095 | - | - | - | - |
| 1.5718 | 6460 | 0.0654 | - | - | - | - |
| 1.5742 | 6470 | 0.1019 | - | - | - | - |
| 1.5766 | 6480 | 0.0418 | - | - | - | - |
| 1.5791 | 6490 | 0.038 | - | - | - | - |
| 1.5815 | 6500 | 0.0884 | - | - | - | - |
| 1.5839 | 6510 | 0.0439 | - | - | - | - |
| 1.5864 | 6520 | 0.0704 | - | - | - | - |
| 1.5888 | 6530 | 0.0664 | - | - | - | - |
| 1.5912 | 6540 | 0.0776 | - | - | - | - |
| 1.5937 | 6550 | 0.0295 | - | - | - | - |
| 1.5961 | 6560 | 0.0735 | - | - | - | - |
| 1.5985 | 6570 | 0.0668 | - | - | - | - |
| 1.6010 | 6580 | 0.0202 | - | - | - | - |
| 1.6034 | 6590 | 0.0638 | - | - | - | - |
| 1.6058 | 6600 | 0.0705 | - | - | - | - |
| 1.6083 | 6610 | 0.0558 | - | - | - | - |
| 1.6107 | 6620 | 0.0474 | - | - | - | - |
| 1.6131 | 6630 | 0.1205 | - | - | - | - |
| 1.6156 | 6640 | 0.0995 | - | - | - | - |
| 1.6180 | 6650 | 0.0837 | - | - | - | - |
| 1.6204 | 6660 | 0.0146 | - | - | - | - |
| 1.6229 | 6670 | 0.0445 | - | - | - | - |
| 1.6253 | 6680 | 0.0797 | - | - | - | - |
| 1.6277 | 6690 | 0.0484 | - | - | - | - |
| 1.6302 | 6700 | 0.0699 | - | - | - | - |
| 1.6326 | 6710 | 0.0832 | - | - | - | - |
| 1.6350 | 6720 | 0.0718 | - | - | - | - |
| 1.6375 | 6730 | 0.0552 | - | - | - | - |
| 1.6399 | 6740 | 0.0694 | - | - | - | - |
| 1.6423 | 6750 | 0.0937 | - | - | - | - |
| 1.6448 | 6760 | 0.068 | - | - | - | - |
| 1.6472 | 6770 | 0.081 | - | - | - | - |
| 1.6496 | 6780 | 0.069 | - | - | - | - |
| 1.6521 | 6790 | 0.0253 | - | - | - | - |
| 1.6545 | 6800 | 0.0411 | - | - | - | - |
| 1.6569 | 6810 | 0.0496 | - | - | - | - |
| 1.6594 | 6820 | 0.0868 | - | - | - | - |
| 1.6618 | 6830 | 0.1038 | - | - | - | - |
| 1.6642 | 6840 | 0.0789 | - | - | - | - |
| 1.6667 | 6850 | 0.0385 | - | - | - | - |
| 1.6691 | 6860 | 0.0467 | - | - | - | - |
| 1.6715 | 6870 | 0.0699 | - | - | - | - |
| 1.6740 | 6880 | 0.0553 | - | - | - | - |
| 1.6764 | 6890 | 0.0439 | - | - | - | - |
| 1.6788 | 6900 | 0.0426 | - | - | - | - |
| 1.6813 | 6910 | 0.0337 | - | - | - | - |
| 1.6837 | 6920 | 0.0668 | - | - | - | - |
| 1.6861 | 6930 | 0.1154 | - | - | - | - |
| 1.6886 | 6940 | 0.0544 | - | - | - | - |
| 1.6910 | 6950 | 0.076 | - | - | - | - |
| 1.6934 | 6960 | 0.0725 | - | - | - | - |
| 1.6959 | 6970 | 0.1054 | - | - | - | - |
| 1.6983 | 6980 | 0.0595 | - | - | - | - |
| 1.7007 | 6990 | 0.0569 | - | - | - | - |
| 1.7032 | 7000 | 0.075 | - | - | - | - |
| 1.7056 | 7010 | 0.0664 | - | - | - | - |
| 1.7080 | 7020 | 0.0363 | - | - | - | - |
| 1.7105 | 7030 | 0.0685 | - | - | - | - |
| 1.7129 | 7040 | 0.1046 | - | - | - | - |
| 1.7153 | 7050 | 0.1213 | - | - | - | - |
| 1.7178 | 7060 | 0.0692 | - | - | - | - |
| 1.7202 | 7070 | 0.0937 | - | - | - | - |
| 1.7226 | 7080 | 0.0795 | - | - | - | - |
| 1.7251 | 7090 | 0.1151 | - | - | - | - |
| 1.7275 | 7100 | 0.0604 | - | - | - | - |
| 1.7299 | 7110 | 0.0719 | - | - | - | - |
| 1.7324 | 7120 | 0.0456 | - | - | - | - |
| 1.7348 | 7130 | 0.0431 | - | - | - | - |
| 1.7372 | 7140 | 0.0706 | - | - | - | - |
| 1.7397 | 7150 | 0.0568 | - | - | - | - |
| 1.7421 | 7160 | 0.0664 | - | - | - | - |
| 1.7445 | 7170 | 0.0706 | - | - | - | - |
| 1.7470 | 7180 | 0.0558 | - | - | - | - |
| 1.7494 | 7190 | 0.0526 | - | - | - | - |
| 1.7518 | 7200 | 0.0426 | - | - | - | - |
| 1.7543 | 7210 | 0.0602 | - | - | - | - |
| 1.7567 | 7220 | 0.0664 | - | - | - | - |
| 1.7591 | 7230 | 0.0236 | - | - | - | - |
| 1.7616 | 7240 | 0.0321 | - | - | - | - |
| 1.7640 | 7250 | 0.0192 | - | - | - | - |
| 1.7664 | 7260 | 0.0523 | - | - | - | - |
| 1.7689 | 7270 | 0.0377 | - | - | - | - |
| 1.7713 | 7280 | 0.0878 | - | - | - | - |
| 1.7737 | 7290 | 0.0751 | - | - | - | - |
| 1.7762 | 7300 | 0.0664 | - | - | - | - |
| 1.7786 | 7310 | 0.0178 | - | - | - | - |
| 1.7810 | 7320 | 0.0668 | - | - | - | - |
| 1.7835 | 7330 | 0.0341 | - | - | - | - |
| 1.7859 | 7340 | 0.0747 | - | - | - | - |
| 1.7883 | 7350 | 0.0541 | - | - | - | - |
| 1.7908 | 7360 | 0.067 | - | - | - | - |
| 1.7932 | 7370 | 0.0315 | - | - | - | - |
| 1.7956 | 7380 | 0.0576 | - | - | - | - |
| 1.7981 | 7390 | 0.0542 | - | - | - | - |
| 1.8005 | 7400 | 0.0496 | - | - | - | - |
| 1.8029 | 7410 | 0.0919 | - | - | - | - |
| 1.8054 | 7420 | 0.0877 | - | - | - | - |
| 1.8078 | 7430 | 0.047 | - | - | - | - |
| 1.8102 | 7440 | 0.0859 | - | - | - | - |
| 1.8127 | 7450 | 0.0671 | - | - | - | - |
| 1.8151 | 7460 | 0.0484 | - | - | - | - |
| 1.8175 | 7470 | 0.0698 | - | - | - | - |
| 1.8200 | 7480 | 0.0536 | - | - | - | - |
| 1.8224 | 7490 | 0.0583 | - | - | - | - |
| 1.8248 | 7500 | 0.0768 | - | - | - | - |
| 1.8273 | 7510 | 0.0643 | - | - | - | - |
| 1.8297 | 7520 | 0.0699 | - | - | - | - |
| 1.8321 | 7530 | 0.0855 | - | - | - | - |
| 1.8345 | 7540 | 0.1032 | - | - | - | - |
| 1.8370 | 7550 | 0.0707 | - | - | - | - |
| 1.8394 | 7560 | 0.0352 | - | - | - | - |
| 1.8418 | 7570 | 0.0503 | - | - | - | - |
| 1.8443 | 7580 | 0.0736 | - | - | - | - |
| 1.8467 | 7590 | 0.0543 | - | - | - | - |
| 1.8491 | 7600 | 0.0808 | - | - | - | - |
| 1.8516 | 7610 | 0.0945 | - | - | - | - |
| 1.8540 | 7620 | 0.0433 | - | - | - | - |
| 1.8564 | 7630 | 0.0907 | - | - | - | - |
| 1.8589 | 7640 | 0.0914 | - | - | - | - |
| 1.8613 | 7650 | 0.0424 | - | - | - | - |
| 1.8637 | 7660 | 0.0614 | - | - | - | - |
| 1.8662 | 7670 | 0.1035 | - | - | - | - |
| 1.8686 | 7680 | 0.0734 | - | - | - | - |
| 1.8710 | 7690 | 0.0926 | - | - | - | - |
| 1.8735 | 7700 | 0.0756 | - | - | - | - |
| 1.8759 | 7710 | 0.0406 | - | - | - | - |
| 1.8783 | 7720 | 0.0985 | - | - | - | - |
| 1.8808 | 7730 | 0.0984 | - | - | - | - |
| 1.8832 | 7740 | 0.0425 | - | - | - | - |
| 1.8856 | 7750 | 0.0519 | - | - | - | - |
| 1.8881 | 7760 | 0.0508 | - | - | - | - |
| 1.8905 | 7770 | 0.0372 | - | - | - | - |
| 1.8929 | 7780 | 0.0582 | - | - | - | - |
| 1.8954 | 7790 | 0.0589 | - | - | - | - |
| 1.8978 | 7800 | 0.0356 | - | - | - | - |
| 1.9002 | 7810 | 0.0334 | - | - | - | - |
| 1.9027 | 7820 | 0.052 | - | - | - | - |
| 1.9051 | 7830 | 0.0696 | - | - | - | - |
| 1.9075 | 7840 | 0.0684 | - | - | - | - |
| 1.9100 | 7850 | 0.1165 | - | - | - | - |
| 1.9124 | 7860 | 0.0419 | - | - | - | - |
| 1.9148 | 7870 | 0.0706 | - | - | - | - |
| 1.9173 | 7880 | 0.0609 | - | - | - | - |
| 1.9197 | 7890 | 0.0283 | - | - | - | - |
| 1.9221 | 7900 | 0.0722 | - | - | - | - |
| 1.9246 | 7910 | 0.0866 | - | - | - | - |
| 1.9270 | 7920 | 0.0671 | - | - | - | - |
| 1.9294 | 7930 | 0.0753 | - | - | - | - |
| 1.9319 | 7940 | 0.0265 | - | - | - | - |
| 1.9343 | 7950 | 0.0934 | - | - | - | - |
| 1.9367 | 7960 | 0.0661 | - | - | - | - |
| 1.9392 | 7970 | 0.0798 | - | - | - | - |
| 1.9416 | 7980 | 0.0953 | - | - | - | - |
| 1.9440 | 7990 | 0.0078 | - | - | - | - |
| 1.9465 | 8000 | 0.0631 | - | - | - | - |
| 1.9489 | 8010 | 0.059 | - | - | - | - |
| 1.9513 | 8020 | 0.0951 | - | - | - | - |
| 1.9538 | 8030 | 0.0726 | - | - | - | - |
| 1.9562 | 8040 | 0.0837 | - | - | - | - |
| 1.9586 | 8050 | 0.113 | - | - | - | - |
| 1.9611 | 8060 | 0.0732 | - | - | - | - |
| 1.9635 | 8070 | 0.0227 | - | - | - | - |
| 1.9659 | 8080 | 0.0766 | - | - | - | - |
| 1.9684 | 8090 | 0.0684 | - | - | - | - |
| 1.9708 | 8100 | 0.0923 | - | - | - | - |
| 1.9732 | 8110 | 0.0949 | - | - | - | - |
| 1.9757 | 8120 | 0.06 | - | - | - | - |
| 1.9781 | 8130 | 0.0832 | - | - | - | - |
| 1.9805 | 8140 | 0.0387 | - | - | - | - |
| 1.9830 | 8150 | 0.0307 | - | - | - | - |
| 1.9854 | 8160 | 0.0728 | - | - | - | - |
| 1.9878 | 8170 | 0.0708 | - | - | - | - |
| 1.9903 | 8180 | 0.1074 | - | - | - | - |
| 1.9927 | 8190 | 0.0625 | - | - | - | - |
| 1.9951 | 8200 | 0.0645 | - | - | - | - |
| 1.9976 | 8210 | 0.0818 | - | - | - | - |
| 2.0 | 8220 | 0.114 | 0.0109 | 2.2181 | 0.0328 | 0.0483 |
| 2.0024 | 8230 | 0.1097 | - | - | - | - |
| 2.0049 | 8240 | 0.0758 | - | - | - | - |
| 2.0073 | 8250 | 0.0848 | - | - | - | - |
| 2.0097 | 8260 | 0.0365 | - | - | - | - |
| 2.0122 | 8270 | 0.0404 | - | - | - | - |
| 2.0146 | 8280 | 0.0462 | - | - | - | - |
| 2.0170 | 8290 | 0.022 | - | - | - | - |
| 2.0195 | 8300 | 0.0633 | - | - | - | - |
| 2.0219 | 8310 | 0.0335 | - | - | - | - |
| 2.0243 | 8320 | 0.0332 | - | - | - | - |
| 2.0268 | 8330 | 0.0807 | - | - | - | - |
| 2.0292 | 8340 | 0.0643 | - | - | - | - |
| 2.0316 | 8350 | 0.0233 | - | - | - | - |
| 2.0341 | 8360 | 0.0089 | - | - | - | - |
| 2.0365 | 8370 | 0.0153 | - | - | - | - |
| 2.0389 | 8380 | 0.0939 | - | - | - | - |
| 2.0414 | 8390 | 0.0779 | - | - | - | - |
| 2.0438 | 8400 | 0.0342 | - | - | - | - |
| 2.0462 | 8410 | 0.0741 | - | - | - | - |
| 2.0487 | 8420 | 0.0602 | - | - | - | - |
| 2.0511 | 8430 | 0.0463 | - | - | - | - |
| 2.0535 | 8440 | 0.0382 | - | - | - | - |
| 2.0560 | 8450 | 0.0323 | - | - | - | - |
| 2.0584 | 8460 | 0.0266 | - | - | - | - |
| 2.0608 | 8470 | 0.0018 | - | - | - | - |
| 2.0633 | 8480 | 0.0381 | - | - | - | - |
| 2.0657 | 8490 | 0.0456 | - | - | - | - |
| 2.0681 | 8500 | 0.0965 | - | - | - | - |
| 2.0706 | 8510 | 0.0264 | - | - | - | - |
| 2.0730 | 8520 | 0.0504 | - | - | - | - |
| 2.0754 | 8530 | 0.0251 | - | - | - | - |
| 2.0779 | 8540 | 0.0743 | - | - | - | - |
| 2.0803 | 8550 | 0.0544 | - | - | - | - |
| 2.0827 | 8560 | 0.0296 | - | - | - | - |
| 2.0852 | 8570 | 0.0788 | - | - | - | - |
| 2.0876 | 8580 | 0.0695 | - | - | - | - |
| 2.0900 | 8590 | 0.049 | - | - | - | - |
| 2.0925 | 8600 | 0.0468 | - | - | - | - |
| 2.0949 | 8610 | 0.0398 | - | - | - | - |
| 2.0973 | 8620 | 0.0371 | - | - | - | - |
| 2.0998 | 8630 | 0.0512 | - | - | - | - |
| 2.1022 | 8640 | 0.0699 | - | - | - | - |
| 2.1046 | 8650 | 0.0531 | - | - | - | - |
| 2.1071 | 8660 | 0.0601 | - | - | - | - |
| 2.1095 | 8670 | 0.0666 | - | - | - | - |
| 2.1119 | 8680 | 0.0499 | - | - | - | - |
| 2.1144 | 8690 | 0.0437 | - | - | - | - |
| 2.1168 | 8700 | 0.0445 | - | - | - | - |
| 2.1192 | 8710 | 0.0548 | - | - | - | - |
| 2.1217 | 8720 | 0.047 | - | - | - | - |
| 2.1241 | 8730 | 0.0683 | - | - | - | - |
| 2.1265 | 8740 | 0.0344 | - | - | - | - |
| 2.1290 | 8750 | 0.0305 | - | - | - | - |
| 2.1314 | 8760 | 0.048 | - | - | - | - |
| 2.1338 | 8770 | 0.0402 | - | - | - | - |
| 2.1363 | 8780 | 0.0727 | - | - | - | - |
| 2.1387 | 8790 | 0.0235 | - | - | - | - |
| 2.1411 | 8800 | 0.0088 | - | - | - | - |
| 2.1436 | 8810 | 0.0602 | - | - | - | - |
| 2.1460 | 8820 | 0.028 | - | - | - | - |
| 2.1484 | 8830 | 0.0699 | - | - | - | - |
| 2.1509 | 8840 | 0.0866 | - | - | - | - |
| 2.1533 | 8850 | 0.0819 | - | - | - | - |
| 2.1557 | 8860 | 0.0501 | - | - | - | - |
| 2.1582 | 8870 | 0.0329 | - | - | - | - |
| 2.1606 | 8880 | 0.0735 | - | - | - | - |
| 2.1630 | 8890 | 0.0531 | - | - | - | - |
| 2.1655 | 8900 | 0.0223 | - | - | - | - |
| 2.1679 | 8910 | 0.0546 | - | - | - | - |
| 2.1703 | 8920 | 0.0451 | - | - | - | - |
| 2.1727 | 8930 | 0.047 | - | - | - | - |
| 2.1752 | 8940 | 0.0244 | - | - | - | - |
| 2.1776 | 8950 | 0.0378 | - | - | - | - |
| 2.1800 | 8960 | 0.0182 | - | - | - | - |
| 2.1825 | 8970 | 0.0224 | - | - | - | - |
| 2.1849 | 8980 | 0.0327 | - | - | - | - |
| 2.1873 | 8990 | 0.0323 | - | - | - | - |
| 2.1898 | 9000 | 0.0307 | - | - | - | - |
| 2.1922 | 9010 | 0.0874 | - | - | - | - |
| 2.1946 | 9020 | 0.0407 | - | - | - | - |
| 2.1971 | 9030 | 0.0502 | - | - | - | - |
| 2.1995 | 9040 | 0.0474 | - | - | - | - |
| 2.2019 | 9050 | 0.0437 | - | - | - | - |
| 2.2044 | 9060 | 0.058 | - | - | - | - |
| 2.2068 | 9070 | 0.0851 | - | - | - | - |
| 2.2092 | 9080 | 0.0584 | - | - | - | - |
| 2.2117 | 9090 | 0.0124 | - | - | - | - |
| 2.2141 | 9100 | 0.0085 | - | - | - | - |
| 2.2165 | 9110 | 0.0607 | - | - | - | - |
| 2.2190 | 9120 | 0.0685 | - | - | - | - |
| 2.2214 | 9130 | 0.0807 | - | - | - | - |
| 2.2238 | 9140 | 0.0608 | - | - | - | - |
| 2.2263 | 9150 | 0.0131 | - | - | - | - |
| 2.2287 | 9160 | 0.0451 | - | - | - | - |
| 2.2311 | 9170 | 0.0368 | - | - | - | - |
| 2.2336 | 9180 | 0.0527 | - | - | - | - |
| 2.2360 | 9190 | 0.0846 | - | - | - | - |
| 2.2384 | 9200 | 0.0328 | - | - | - | - |
| 2.2409 | 9210 | 0.0178 | - | - | - | - |
| 2.2433 | 9220 | 0.0274 | - | - | - | - |
| 2.2457 | 9230 | 0.0567 | - | - | - | - |
| 2.2482 | 9240 | 0.0756 | - | - | - | - |
| 2.2506 | 9250 | 0.0369 | - | - | - | - |
| 2.2530 | 9260 | 0.0827 | - | - | - | - |
| 2.2555 | 9270 | 0.023 | - | - | - | - |
| 2.2579 | 9280 | 0.0749 | - | - | - | - |
| 2.2603 | 9290 | 0.048 | - | - | - | - |
| 2.2628 | 9300 | 0.0855 | - | - | - | - |
| 2.2652 | 9310 | 0.0421 | - | - | - | - |
| 2.2676 | 9320 | 0.0437 | - | - | - | - |
| 2.2701 | 9330 | 0.0503 | - | - | - | - |
| 2.2725 | 9340 | 0.0186 | - | - | - | - |
| 2.2749 | 9350 | 0.0321 | - | - | - | - |
| 2.2774 | 9360 | 0.0756 | - | - | - | - |
| 2.2798 | 9370 | 0.0692 | - | - | - | - |
| 2.2822 | 9380 | 0.0629 | - | - | - | - |
| 2.2847 | 9390 | 0.0526 | - | - | - | - |
| 2.2871 | 9400 | 0.0486 | - | - | - | - |
| 2.2895 | 9410 | 0.0419 | - | - | - | - |
| 2.2920 | 9420 | 0.0121 | - | - | - | - |
| 2.2944 | 9430 | 0.0678 | - | - | - | - |
| 2.2968 | 9440 | 0.0896 | - | - | - | - |
| 2.2993 | 9450 | 0.0306 | - | - | - | - |
| 2.3017 | 9460 | 0.0541 | - | - | - | - |
| 2.3041 | 9470 | 0.0504 | - | - | - | - |
| 2.3066 | 9480 | 0.0414 | - | - | - | - |
| 2.3090 | 9490 | 0.0302 | - | - | - | - |
| 2.3114 | 9500 | 0.0434 | - | - | - | - |
| 2.3139 | 9510 | 0.0449 | - | - | - | - |
| 2.3163 | 9520 | 0.0359 | - | - | - | - |
| 2.3187 | 9530 | 0.0547 | - | - | - | - |
| 2.3212 | 9540 | 0.0824 | - | - | - | - |
| 2.3236 | 9550 | 0.0311 | - | - | - | - |
| 2.3260 | 9560 | 0.0722 | - | - | - | - |
| 2.3285 | 9570 | 0.0558 | - | - | - | - |
| 2.3309 | 9580 | 0.0304 | - | - | - | - |
| 2.3333 | 9590 | 0.0678 | - | - | - | - |
| 2.3358 | 9600 | 0.0466 | - | - | - | - |
| 2.3382 | 9610 | 0.0063 | - | - | - | - |
| 2.3406 | 9620 | 0.04 | - | - | - | - |
| 2.3431 | 9630 | 0.0579 | - | - | - | - |
| 2.3455 | 9640 | 0.0286 | - | - | - | - |
| 2.3479 | 9650 | 0.0473 | - | - | - | - |
| 2.3504 | 9660 | 0.0395 | - | - | - | - |
| 2.3528 | 9670 | 0.0344 | - | - | - | - |
| 2.3552 | 9680 | 0.0399 | - | - | - | - |
| 2.3577 | 9690 | 0.0391 | - | - | - | - |
| 2.3601 | 9700 | 0.0393 | - | - | - | - |
| 2.3625 | 9710 | 0.0185 | - | - | - | - |
| 2.3650 | 9720 | 0.071 | - | - | - | - |
| 2.3674 | 9730 | 0.0431 | - | - | - | - |
| 2.3698 | 9740 | 0.0525 | - | - | - | - |
| 2.3723 | 9750 | 0.0459 | - | - | - | - |
| 2.3747 | 9760 | 0.0391 | - | - | - | - |
| 2.3771 | 9770 | 0.1035 | - | - | - | - |
| 2.3796 | 9780 | 0.0356 | - | - | - | - |
| 2.3820 | 9790 | 0.0418 | - | - | - | - |
| 2.3844 | 9800 | 0.0316 | - | - | - | - |
| 2.3869 | 9810 | 0.053 | - | - | - | - |
| 2.3893 | 9820 | 0.0489 | - | - | - | - |
| 2.3917 | 9830 | 0.0603 | - | - | - | - |
| 2.3942 | 9840 | 0.0422 | - | - | - | - |
| 2.3966 | 9850 | 0.0491 | - | - | - | - |
| 2.3990 | 9860 | 0.0441 | - | - | - | - |
| 2.4015 | 9870 | 0.0773 | - | - | - | - |
| 2.4039 | 9880 | 0.0172 | - | - | - | - |
| 2.4063 | 9890 | 0.0274 | - | - | - | - |
| 2.4088 | 9900 | 0.0776 | - | - | - | - |
| 2.4112 | 9910 | 0.0446 | - | - | - | - |
| 2.4136 | 9920 | 0.0502 | - | - | - | - |
| 2.4161 | 9930 | 0.0321 | - | - | - | - |
| 2.4185 | 9940 | 0.0342 | - | - | - | - |
| 2.4209 | 9950 | 0.072 | - | - | - | - |
| 2.4234 | 9960 | 0.0759 | - | - | - | - |
| 2.4258 | 9970 | 0.04 | - | - | - | - |
| 2.4282 | 9980 | 0.0703 | - | - | - | - |
| 2.4307 | 9990 | 0.0674 | - | - | - | - |
| 2.4331 | 10000 | 0.046 | - | - | - | - |
| 2.4355 | 10010 | 0.0412 | - | - | - | - |
| 2.4380 | 10020 | 0.0518 | - | - | - | - |
| 2.4404 | 10030 | 0.0678 | - | - | - | - |
| 2.4428 | 10040 | 0.0352 | - | - | - | - |
| 2.4453 | 10050 | 0.0597 | - | - | - | - |
| 2.4477 | 10060 | 0.0271 | - | - | - | - |
| 2.4501 | 10070 | 0.0425 | - | - | - | - |
| 2.4526 | 10080 | 0.0697 | - | - | - | - |
| 2.4550 | 10090 | 0.0134 | - | - | - | - |
| 2.4574 | 10100 | 0.0704 | - | - | - | - |
| 2.4599 | 10110 | 0.027 | - | - | - | - |
| 2.4623 | 10120 | 0.0523 | - | - | - | - |
| 2.4647 | 10130 | 0.0373 | - | - | - | - |
| 2.4672 | 10140 | 0.0276 | - | - | - | - |
| 2.4696 | 10150 | 0.0715 | - | - | - | - |
| 2.4720 | 10160 | 0.0538 | - | - | - | - |
| 2.4745 | 10170 | 0.0598 | - | - | - | - |
| 2.4769 | 10180 | 0.0349 | - | - | - | - |
| 2.4793 | 10190 | 0.047 | - | - | - | - |
| 2.4818 | 10200 | 0.0048 | - | - | - | - |
| 2.4842 | 10210 | 0.0542 | - | - | - | - |
| 2.4866 | 10220 | 0.0547 | - | - | - | - |
| 2.4891 | 10230 | 0.0622 | - | - | - | - |
| 2.4915 | 10240 | 0.0784 | - | - | - | - |
| 2.4939 | 10250 | 0.0428 | - | - | - | - |
| 2.4964 | 10260 | 0.0284 | - | - | - | - |
| 2.4988 | 10270 | 0.0744 | - | - | - | - |
| 2.5012 | 10280 | 0.0763 | - | - | - | - |
| 2.5036 | 10290 | 0.0495 | - | - | - | - |
| 2.5061 | 10300 | 0.0802 | - | - | - | - |
| 2.5085 | 10310 | 0.077 | - | - | - | - |
| 2.5109 | 10320 | 0.0376 | - | - | - | - |
| 2.5134 | 10330 | 0.058 | - | - | - | - |
| 2.5158 | 10340 | 0.044 | - | - | - | - |
| 2.5182 | 10350 | 0.1121 | - | - | - | - |
| 2.5207 | 10360 | 0.0354 | - | - | - | - |
| 2.5231 | 10370 | 0.0267 | - | - | - | - |
| 2.5255 | 10380 | 0.0445 | - | - | - | - |
| 2.5280 | 10390 | 0.0536 | - | - | - | - |
| 2.5304 | 10400 | 0.0539 | - | - | - | - |
| 2.5328 | 10410 | 0.0353 | - | - | - | - |
| 2.5353 | 10420 | 0.0147 | - | - | - | - |
| 2.5377 | 10430 | 0.0319 | - | - | - | - |
| 2.5401 | 10440 | 0.0676 | - | - | - | - |
| 2.5426 | 10450 | 0.0395 | - | - | - | - |
| 2.5450 | 10460 | 0.0648 | - | - | - | - |
| 2.5474 | 10470 | 0.055 | - | - | - | - |
| 2.5499 | 10480 | 0.0625 | - | - | - | - |
| 2.5523 | 10490 | 0.04 | - | - | - | - |
| 2.5547 | 10500 | 0.0678 | - | - | - | - |
| 2.5572 | 10510 | 0.0251 | - | - | - | - |
| 2.5596 | 10520 | 0.036 | - | - | - | - |
| 2.5620 | 10530 | 0.0352 | - | - | - | - |
| 2.5645 | 10540 | 0.0212 | - | - | - | - |
| 2.5669 | 10550 | 0.0459 | - | - | - | - |
| 2.5693 | 10560 | 0.0678 | - | - | - | - |
| 2.5718 | 10570 | 0.053 | - | - | - | - |
| 2.5742 | 10580 | 0.0888 | - | - | - | - |
| 2.5766 | 10590 | 0.0374 | - | - | - | - |
| 2.5791 | 10600 | 0.017 | - | - | - | - |
| 2.5815 | 10610 | 0.0828 | - | - | - | - |
| 2.5839 | 10620 | 0.0393 | - | - | - | - |
| 2.5864 | 10630 | 0.0517 | - | - | - | - |
| 2.5888 | 10640 | 0.0572 | - | - | - | - |
| 2.5912 | 10650 | 0.0577 | - | - | - | - |
| 2.5937 | 10660 | 0.0245 | - | - | - | - |
| 2.5961 | 10670 | 0.0632 | - | - | - | - |
| 2.5985 | 10680 | 0.0612 | - | - | - | - |
| 2.6010 | 10690 | 0.0204 | - | - | - | - |
| 2.6034 | 10700 | 0.0493 | - | - | - | - |
| 2.6058 | 10710 | 0.0613 | - | - | - | - |
| 2.6083 | 10720 | 0.0467 | - | - | - | - |
| 2.6107 | 10730 | 0.0532 | - | - | - | - |
| 2.6131 | 10740 | 0.0962 | - | - | - | - |
| 2.6156 | 10750 | 0.048 | - | - | - | - |
| 2.6180 | 10760 | 0.0623 | - | - | - | - |
| 2.6204 | 10770 | 0.0049 | - | - | - | - |
| 2.6229 | 10780 | 0.0359 | - | - | - | - |
| 2.6253 | 10790 | 0.0536 | - | - | - | - |
| 2.6277 | 10800 | 0.0423 | - | - | - | - |
| 2.6302 | 10810 | 0.0306 | - | - | - | - |
| 2.6326 | 10820 | 0.0412 | - | - | - | - |
| 2.6350 | 10830 | 0.0559 | - | - | - | - |
| 2.6375 | 10840 | 0.0574 | - | - | - | - |
| 2.6399 | 10850 | 0.0521 | - | - | - | - |
| 2.6423 | 10860 | 0.0638 | - | - | - | - |
| 2.6448 | 10870 | 0.0476 | - | - | - | - |
| 2.6472 | 10880 | 0.0715 | - | - | - | - |
| 2.6496 | 10890 | 0.0453 | - | - | - | - |
| 2.6521 | 10900 | 0.0115 | - | - | - | - |
| 2.6545 | 10910 | 0.0339 | - | - | - | - |
| 2.6569 | 10920 | 0.0436 | - | - | - | - |
| 2.6594 | 10930 | 0.0613 | - | - | - | - |
| 2.6618 | 10940 | 0.0697 | - | - | - | - |
| 2.6642 | 10950 | 0.0666 | - | - | - | - |
| 2.6667 | 10960 | 0.0183 | - | - | - | - |
| 2.6691 | 10970 | 0.0405 | - | - | - | - |
| 2.6715 | 10980 | 0.0607 | - | - | - | - |
| 2.6740 | 10990 | 0.0327 | - | - | - | - |
| 2.6764 | 11000 | 0.0367 | - | - | - | - |
| 2.6788 | 11010 | 0.041 | - | - | - | - |
| 2.6813 | 11020 | 0.0351 | - | - | - | - |
| 2.6837 | 11030 | 0.0462 | - | - | - | - |
| 2.6861 | 11040 | 0.1159 | - | - | - | - |
| 2.6886 | 11050 | 0.0369 | - | - | - | - |
| 2.6910 | 11060 | 0.0643 | - | - | - | - |
| 2.6934 | 11070 | 0.0564 | - | - | - | - |
| 2.6959 | 11080 | 0.0576 | - | - | - | - |
| 2.6983 | 11090 | 0.061 | - | - | - | - |
| 2.7007 | 11100 | 0.0513 | - | - | - | - |
| 2.7032 | 11110 | 0.0674 | - | - | - | - |
| 2.7056 | 11120 | 0.0658 | - | - | - | - |
| 2.7080 | 11130 | 0.0182 | - | - | - | - |
| 2.7105 | 11140 | 0.0585 | - | - | - | - |
| 2.7129 | 11150 | 0.0825 | - | - | - | - |
| 2.7153 | 11160 | 0.1078 | - | - | - | - |
| 2.7178 | 11170 | 0.064 | - | - | - | - |
| 2.7202 | 11180 | 0.0745 | - | - | - | - |
| 2.7226 | 11190 | 0.0726 | - | - | - | - |
| 2.7251 | 11200 | 0.0929 | - | - | - | - |
| 2.7275 | 11210 | 0.0519 | - | - | - | - |
| 2.7299 | 11220 | 0.0668 | - | - | - | - |
| 2.7324 | 11230 | 0.0279 | - | - | - | - |
| 2.7348 | 11240 | 0.0315 | - | - | - | - |
| 2.7372 | 11250 | 0.0482 | - | - | - | - |
| 2.7397 | 11260 | 0.0495 | - | - | - | - |
| 2.7421 | 11270 | 0.0664 | - | - | - | - |
| 2.7445 | 11280 | 0.0684 | - | - | - | - |
| 2.7470 | 11290 | 0.0362 | - | - | - | - |
| 2.7494 | 11300 | 0.0451 | - | - | - | - |
| 2.7518 | 11310 | 0.0435 | - | - | - | - |
| 2.7543 | 11320 | 0.0503 | - | - | - | - |
| 2.7567 | 11330 | 0.053 | - | - | - | - |
| 2.7591 | 11340 | 0.0198 | - | - | - | - |
| 2.7616 | 11350 | 0.0289 | - | - | - | - |
| 2.7640 | 11360 | 0.0137 | - | - | - | - |
| 2.7664 | 11370 | 0.0468 | - | - | - | - |
| 2.7689 | 11380 | 0.0349 | - | - | - | - |
| 2.7713 | 11390 | 0.081 | - | - | - | - |
| 2.7737 | 11400 | 0.0557 | - | - | - | - |
| 2.7762 | 11410 | 0.0622 | - | - | - | - |
| 2.7786 | 11420 | 0.0059 | - | - | - | - |
| 2.7810 | 11430 | 0.0582 | - | - | - | - |
| 2.7835 | 11440 | 0.022 | - | - | - | - |
| 2.7859 | 11450 | 0.0539 | - | - | - | - |
| 2.7883 | 11460 | 0.0329 | - | - | - | - |
| 2.7908 | 11470 | 0.0616 | - | - | - | - |
| 2.7932 | 11480 | 0.031 | - | - | - | - |
| 2.7956 | 11490 | 0.0557 | - | - | - | - |
| 2.7981 | 11500 | 0.0511 | - | - | - | - |
| 2.8005 | 11510 | 0.0426 | - | - | - | - |
| 2.8029 | 11520 | 0.0555 | - | - | - | - |
| 2.8054 | 11530 | 0.0764 | - | - | - | - |
| 2.8078 | 11540 | 0.0464 | - | - | - | - |
| 2.8102 | 11550 | 0.0751 | - | - | - | - |
| 2.8127 | 11560 | 0.0633 | - | - | - | - |
| 2.8151 | 11570 | 0.0387 | - | - | - | - |
| 2.8175 | 11580 | 0.0685 | - | - | - | - |
| 2.8200 | 11590 | 0.0439 | - | - | - | - |
| 2.8224 | 11600 | 0.0348 | - | - | - | - |
| 2.8248 | 11610 | 0.0645 | - | - | - | - |
| 2.8273 | 11620 | 0.0528 | - | - | - | - |
| 2.8297 | 11630 | 0.0615 | - | - | - | - |
| 2.8321 | 11640 | 0.0636 | - | - | - | - |
| 2.8345 | 11650 | 0.0804 | - | - | - | - |
| 2.8370 | 11660 | 0.0613 | - | - | - | - |
| 2.8394 | 11670 | 0.0259 | - | - | - | - |
| 2.8418 | 11680 | 0.0494 | - | - | - | - |
| 2.8443 | 11690 | 0.036 | - | - | - | - |
| 2.8467 | 11700 | 0.0453 | - | - | - | - |
| 2.8491 | 11710 | 0.0762 | - | - | - | - |
| 2.8516 | 11720 | 0.0829 | - | - | - | - |
| 2.8540 | 11730 | 0.0434 | - | - | - | - |
| 2.8564 | 11740 | 0.0691 | - | - | - | - |
| 2.8589 | 11750 | 0.0594 | - | - | - | - |
| 2.8613 | 11760 | 0.0345 | - | - | - | - |
| 2.8637 | 11770 | 0.056 | - | - | - | - |
| 2.8662 | 11780 | 0.0962 | - | - | - | - |
| 2.8686 | 11790 | 0.0548 | - | - | - | - |
| 2.8710 | 11800 | 0.0615 | - | - | - | - |
| 2.8735 | 11810 | 0.0581 | - | - | - | - |
| 2.8759 | 11820 | 0.0352 | - | - | - | - |
| 2.8783 | 11830 | 0.0814 | - | - | - | - |
| 2.8808 | 11840 | 0.0641 | - | - | - | - |
| 2.8832 | 11850 | 0.0364 | - | - | - | - |
| 2.8856 | 11860 | 0.0388 | - | - | - | - |
| 2.8881 | 11870 | 0.0479 | - | - | - | - |
| 2.8905 | 11880 | 0.0349 | - | - | - | - |
| 2.8929 | 11890 | 0.0557 | - | - | - | - |
| 2.8954 | 11900 | 0.0437 | - | - | - | - |
| 2.8978 | 11910 | 0.0157 | - | - | - | - |
| 2.9002 | 11920 | 0.0304 | - | - | - | - |
| 2.9027 | 11930 | 0.0377 | - | - | - | - |
| 2.9051 | 11940 | 0.0626 | - | - | - | - |
| 2.9075 | 11950 | 0.0672 | - | - | - | - |
| 2.9100 | 11960 | 0.0835 | - | - | - | - |
| 2.9124 | 11970 | 0.0377 | - | - | - | - |
| 2.9148 | 11980 | 0.0623 | - | - | - | - |
| 2.9173 | 11990 | 0.0375 | - | - | - | - |
| 2.9197 | 12000 | 0.0182 | - | - | - | - |
| 2.9221 | 12010 | 0.0464 | - | - | - | - |
| 2.9246 | 12020 | 0.074 | - | - | - | - |
| 2.9270 | 12030 | 0.0604 | - | - | - | - |
| 2.9294 | 12040 | 0.0447 | - | - | - | - |
| 2.9319 | 12050 | 0.0231 | - | - | - | - |
| 2.9343 | 12060 | 0.0759 | - | - | - | - |
| 2.9367 | 12070 | 0.0592 | - | - | - | - |
| 2.9392 | 12080 | 0.0412 | - | - | - | - |
| 2.9416 | 12090 | 0.0554 | - | - | - | - |
| 2.9440 | 12100 | 0.0086 | - | - | - | - |
| 2.9465 | 12110 | 0.0605 | - | - | - | - |
| 2.9489 | 12120 | 0.0522 | - | - | - | - |
| 2.9513 | 12130 | 0.0822 | - | - | - | - |
| 2.9538 | 12140 | 0.0603 | - | - | - | - |
| 2.9562 | 12150 | 0.0762 | - | - | - | - |
| 2.9586 | 12160 | 0.076 | - | - | - | - |
| 2.9611 | 12170 | 0.0516 | - | - | - | - |
| 2.9635 | 12180 | 0.0221 | - | - | - | - |
| 2.9659 | 12190 | 0.0662 | - | - | - | - |
| 2.9684 | 12200 | 0.0571 | - | - | - | - |
| 2.9708 | 12210 | 0.0738 | - | - | - | - |
| 2.9732 | 12220 | 0.0567 | - | - | - | - |
| 2.9757 | 12230 | 0.0566 | - | - | - | - |
| 2.9781 | 12240 | 0.077 | - | - | - | - |
| 2.9805 | 12250 | 0.0353 | - | - | - | - |
| 2.9830 | 12260 | 0.0313 | - | - | - | - |
| 2.9854 | 12270 | 0.0628 | - | - | - | - |
| 2.9878 | 12280 | 0.0536 | - | - | - | - |
| 2.9903 | 12290 | 0.0972 | - | - | - | - |
| 2.9927 | 12300 | 0.0393 | - | - | - | - |
| 2.9951 | 12310 | 0.0461 | - | - | - | - |
| 2.9976 | 12320 | 0.0585 | - | - | - | - |
| 3.0 | 12330 | 0.0923 | 0.0108 | 2.1017 | 0.0314 | 0.0328 |
</details>
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.1.1
- Transformers: 4.45.2
- PyTorch: 2.5.1+cu124
- Accelerate: 1.1.1
- Datasets: 3.1.0
- Tokenizers: 0.20.3
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
#### ContrastiveLoss
```bibtex
@inproceedings{hadsell2006dimensionality,
author={Hadsell, R. and Chopra, S. and LeCun, Y.},
booktitle={2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)},
title={Dimensionality Reduction by Learning an Invariant Mapping},
year={2006},
volume={2},
number={},
pages={1735-1742},
doi={10.1109/CVPR.2006.100}
}
```
#### TripletLoss
```bibtex
@misc{hermans2017defense,
title={In Defense of the Triplet Loss for Person Re-Identification},
author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
year={2017},
eprint={1703.07737},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
AIDSC/llama2-13b-hf | AIDSC | 2024-11-13T08:49:39Z | 6 | 0 | null | [
"pytorch",
"safetensors",
"llama",
"facebook",
"meta",
"llama-2",
"text-generation",
"en",
"arxiv:2307.09288",
"license:llama2",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-10-23T10:45:11Z | ---
extra_gated_heading: You need to share contact information with Meta to access this model
extra_gated_prompt: >-
### LLAMA 2 COMMUNITY LICENSE AGREEMENT
"Agreement" means the terms and conditions for use, reproduction, distribution
and modification of the Llama Materials set forth herein.
"Documentation" means the specifications, manuals and documentation
accompanying Llama 2 distributed by Meta at
https://ai.meta.com/resources/models-and-libraries/llama-downloads/.
"Licensee" or "you" means you, or your employer or any other person or entity
(if you are entering into this Agreement on such person or entity's behalf),
of the age required under applicable laws, rules or regulations to provide
legal consent and that has legal authority to bind your employer or such other
person or entity if you are entering in this Agreement on their behalf.
"Llama 2" means the foundational large language models and software and
algorithms, including machine-learning model code, trained model weights,
inference-enabling code, training-enabling code, fine-tuning enabling code and
other elements of the foregoing distributed by Meta at
ai.meta.com/resources/models-and-libraries/llama-downloads/.
"Llama Materials" means, collectively, Meta's proprietary Llama 2 and
documentation (and any portion thereof) made available under this Agreement.
"Meta" or "we" means Meta Platforms Ireland Limited (if you are located in or,
if you are an entity, your principal place of business is in the EEA or
Switzerland) and Meta Platforms, Inc. (if you are located outside of the EEA
or Switzerland).
By clicking "I Accept" below or by using or distributing any portion or
element of the Llama Materials, you agree to be bound by this Agreement.
1. License Rights and Redistribution.
a. Grant of Rights. You are granted a non-exclusive, worldwide, non-
transferable and royalty-free limited license under Meta's intellectual
property or other rights owned by Meta embodied in the Llama Materials to
use, reproduce, distribute, copy, create derivative works of, and make
modifications to the Llama Materials.
b. Redistribution and Use.
i. If you distribute or make the Llama Materials, or any derivative works
thereof, available to a third party, you shall provide a copy of this
Agreement to such third party.
ii. If you receive Llama Materials, or any derivative works thereof, from a
Licensee as part of an integrated end user product, then Section 2 of this
Agreement will not apply to you.
iii. You must retain in all copies of the Llama Materials that you distribute
the following attribution notice within a "Notice" text file distributed as a
part of such copies: "Llama 2 is licensed under the LLAMA 2 Community
License, Copyright (c) Meta Platforms, Inc. All Rights Reserved."
iv. Your use of the Llama Materials must comply with applicable laws and
regulations (including trade compliance laws and regulations) and adhere to
the Acceptable Use Policy for the Llama Materials (available at
https://ai.meta.com/llama/use-policy), which is hereby incorporated by
reference into this Agreement.
v. You will not use the Llama Materials or any output or results of the Llama
Materials to improve any other large language model (excluding Llama 2 or
derivative works thereof).
2. Additional Commercial Terms. If, on the Llama 2 version release date, the
monthly active users of the products or services made available by or for
Licensee, or Licensee's affiliates, is greater than 700 million monthly
active users in the preceding calendar month, you must request a license from
Meta, which Meta may grant to you in its sole discretion, and you are not
authorized to exercise any of the rights under this Agreement unless or until
Meta otherwise expressly grants you such rights.
3. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE LLAMA
MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN "AS IS"
BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING,
WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT,
MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY
RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING
THE LLAMA MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE
LLAMA MATERIALS AND ANY OUTPUT AND RESULTS.
4. Limitation of Liability. IN NO EVENT WILL META OR ITS AFFILIATES BE LIABLE
UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE,
PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT, FOR ANY LOST
PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR
PUNITIVE DAMAGES, EVEN IF META OR ITS AFFILIATES HAVE BEEN ADVISED OF THE
POSSIBILITY OF ANY OF THE FOREGOING.
5. Intellectual Property.
a. No trademark licenses are granted under this Agreement, and in connection
with the Llama Materials, neither Meta nor Licensee may use any name or mark
owned by or associated with the other or any of its affiliates, except as
required for reasonable and customary use in describing and redistributing
the Llama Materials.
b. Subject to Meta's ownership of Llama Materials and derivatives made by or
for Meta, with respect to any derivative works and modifications of the Llama
Materials that are made by you, as between you and Meta, you are and will be
the owner of such derivative works and modifications.
c. If you institute litigation or other proceedings against Meta or any
entity (including a cross-claim or counterclaim in a lawsuit) alleging that
the Llama Materials or Llama 2 outputs or results, or any portion of any of
the foregoing, constitutes infringement of intellectual property or other
rights owned or licensable by you, then any licenses granted to you under
this Agreement shall terminate as of the date such litigation or claim is
filed or instituted. You will indemnify and hold harmless Meta from and
against any claim by any third party arising out of or related to your use or
distribution of the Llama Materials.
6. Term and Termination. The term of this Agreement will commence upon your
acceptance of this Agreement or access to the Llama Materials and will
continue in full force and effect until terminated in accordance with the
terms and conditions herein. Meta may terminate this Agreement if you are in
breach of any term or condition of this Agreement. Upon termination of this
Agreement, you shall delete and cease use of the Llama Materials. Sections 3,
4 and 7 shall survive the termination of this Agreement.
7. Governing Law and Jurisdiction. This Agreement will be governed and
construed under the laws of the State of California without regard to choice
of law principles, and the UN Convention on Contracts for the International
Sale of Goods does not apply to this Agreement. The courts of California
shall have exclusive jurisdiction of any dispute arising out of this
Agreement.
### Llama 2 Acceptable Use Policy
Meta is committed to promoting safe and fair use of its tools and features,
including Llama 2. If you access or use Llama 2, you agree to this Acceptable
Use Policy (“Policy”). The most recent copy of this policy can be found at
[ai.meta.com/llama/use-policy](http://ai.meta.com/llama/use-policy).
#### Prohibited Uses
We want everyone to use Llama 2 safely and responsibly. You agree you will not
use, or allow others to use, Llama 2 to:
1. Violate the law or others’ rights, including to:
1. Engage in, promote, generate, contribute to, encourage, plan, incite, or further illegal or unlawful activity or content, such as:
1. Violence or terrorism
2. Exploitation or harm to children, including the solicitation, creation, acquisition, or dissemination of child exploitative content or failure to report Child Sexual Abuse Material
3. Human trafficking, exploitation, and sexual violence
4. The illegal distribution of information or materials to minors, including obscene materials, or failure to employ legally required age-gating in connection with such information or materials.
5. Sexual solicitation
6. Any other criminal activity
2. Engage in, promote, incite, or facilitate the harassment, abuse, threatening, or bullying of individuals or groups of individuals
3. Engage in, promote, incite, or facilitate discrimination or other unlawful or harmful conduct in the provision of employment, employment benefits, credit, housing, other economic benefits, or other essential goods and services
4. Engage in the unauthorized or unlicensed practice of any profession including, but not limited to, financial, legal, medical/health, or related professional practices
5. Collect, process, disclose, generate, or infer health, demographic, or other sensitive personal or private information about individuals without rights and consents required by applicable laws
6. Engage in or facilitate any action or generate any content that infringes, misappropriates, or otherwise violates any third-party rights, including the outputs or results of any products or services using the Llama 2 Materials
7. Create, generate, or facilitate the creation of malicious code, malware, computer viruses or do anything else that could disable, overburden, interfere with or impair the proper working, integrity, operation or appearance of a website or computer system
2. Engage in, promote, incite, facilitate, or assist in the planning or
development of activities that present a risk of death or bodily harm to
individuals, including use of Llama 2 related to the following:
1. Military, warfare, nuclear industries or applications, espionage, use for materials or activities that are subject to the International Traffic Arms Regulations (ITAR) maintained by the United States Department of State
2. Guns and illegal weapons (including weapon development)
3. Illegal drugs and regulated/controlled substances
4. Operation of critical infrastructure, transportation technologies, or heavy machinery
5. Self-harm or harm to others, including suicide, cutting, and eating disorders
6. Any content intended to incite or promote violence, abuse, or any infliction of bodily harm to an individual
3. Intentionally deceive or mislead others, including use of Llama 2 related
to the following:
1. Generating, promoting, or furthering fraud or the creation or promotion of disinformation
2. Generating, promoting, or furthering defamatory content, including the creation of defamatory statements, images, or other content
3. Generating, promoting, or further distributing spam
4. Impersonating another individual without consent, authorization, or legal right
5. Representing that the use of Llama 2 or outputs are human-generated
6. Generating or facilitating false online engagement, including fake reviews and other means of fake online engagement
4. Fail to appropriately disclose to end users any known dangers of your AI system
Please report any violation of this Policy, software “bug,” or other problems
that could lead to a violation of this Policy through one of the following
means:
* Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
* Reporting risky content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
* Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
* Reporting violations of the Acceptable Use Policy or unlicensed uses of Llama: [[email protected]](mailto:[email protected])
extra_gated_fields:
First Name: text
Last Name: text
Date of birth: date_picker
Country: country
Affiliation: text
geo: ip_location
By clicking Submit below I accept the terms of the license and acknowledge that the information I provide will be collected stored processed and shared in accordance with the Meta Privacy Policy: checkbox
extra_gated_description: >-
The information you provide will be collected, stored, processed and shared in
accordance with the [Meta Privacy
Policy](https://www.facebook.com/privacy/policy/).
extra_gated_button_content: Submit
language:
- en
pipeline_tag: text-generation
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
license: llama2
---
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 13B pretrained model, converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The larger 70B model uses Grouped-Query Attention (GQA) for improved inference scalability.
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** ["Llama-2: Open Foundation and Fine-tuned Chat Models"](https://arxiv.org/abs/2307.09288)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
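As a minimal sketch of adapting the pretrained checkpoint to plain text generation with the Hugging Face Transformers library (the repository id, precision and sampling settings below are illustrative assumptions, not Meta's reference code):

```python
# Minimal text-generation sketch for the pretrained 13B checkpoint.
# The repository id is an assumption; substitute the checkpoint you actually use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AIDSC/llama2-13b-hf"  # assumed mirror of Llama-2-13b-hf

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the 13B weights around 26 GB
    device_map="auto",          # requires accelerate; spreads layers across available devices
)

prompt = "Large language models are"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```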
To get the expected features and performance for the chat versions, a specific prompt format needs to be followed, including the `INST` and `<<SYS>>` tags, the `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See our reference code on GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212). A single-turn sketch of that template is shown below.
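The following is a sketch of the single-turn template described above, intended for the *chat* variants (e.g. Llama-2-13b-chat-hf) rather than this pretrained checkpoint; the helper name and example strings are illustrative assumptions, and the linked `chat_completion` reference remains the authoritative multi-turn implementation.

```python
# Illustration of the Llama 2 chat prompt template for a single turn.
# BOS is added by the tokenizer; [INST]/[/INST] and <<SYS>>/<</SYS>> wrap the turn.
def build_llama2_chat_prompt(system_prompt: str, user_message: str) -> str:
    return (
        f"[INST] <<SYS>>\n{system_prompt.strip()}\n<</SYS>>\n\n"
        f"{user_message.strip()} [/INST]"
    )

prompt = build_llama2_chat_prompt(
    "You are a helpful, concise assistant.",
    "Summarize what grouped-query attention is in one sentence.",
)
print(prompt)
```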
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
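As a small worked check of how the rows above relate to each other: emissions are energy (GPU hours times per-GPU power) multiplied by a grid carbon-intensity factor. That factor is not stated in this card; the value below is back-solved from the 13B row and is therefore an assumption.

```python
# Back-of-the-envelope check of the emissions table.
# The ~0.4235 kgCO2eq/kWh factor is inferred from 62.44 t / (368,640 h * 0.4 kW),
# not quoted from the card.
gpu_hours = {"7B": 184_320, "13B": 368_640, "70B": 1_720_320}
power_kw = 0.400           # 400 W per GPU, from the table
carbon_intensity = 0.4235  # kgCO2eq per kWh (assumed, back-solved)

for size, hours in gpu_hours.items():
    energy_kwh = hours * power_kw
    tco2eq = energy_kwh * carbon_intensity / 1000.0
    print(f"Llama 2 {size}: {energy_kwh:,.0f} kWh -> ~{tco2eq:.1f} tCO2eq")
# Prints roughly 31.2, 62.4 and 291.4 tCO2eq, matching the table rows.
```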
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all of these evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|Model|Size|TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|Model|Size|TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/).
## Reporting Issues
Please report any software “bug,” or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/meta-llama/Llama-2-7b) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/meta-llama/Llama-2-13b) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/meta-llama/Llama-2-70b) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)| |
mradermacher/datagemma-rig-27b-it-i1-GGUF | mradermacher | 2024-11-13T08:46:54Z | 5 | 0 | transformers | [
"transformers",
"gguf",
"conversational",
"en",
"base_model:google/datagemma-rig-27b-it",
"base_model:quantized:google/datagemma-rig-27b-it",
"license:gemma",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-13T01:19:24Z | ---
base_model: google/datagemma-rig-27b-it
extra_gated_button_content: Acknowledge license
extra_gated_heading: Access Gemma on Hugging Face
extra_gated_prompt: To access Gemma on Hugging Face, you’re required to review and
agree to Google’s usage license. To do this, please ensure you’re logged in to Hugging
Face and click below. Requests are processed immediately.
language:
- en
library_name: transformers
license: gemma
quantized_by: mradermacher
tags:
- conversational
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/google/datagemma-rig-27b-it
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/datagemma-rig-27b-it-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
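As a concrete illustration — a minimal sketch assuming the `llama-cpp-python` bindings are installed and one of the single-file quants listed below has already been downloaded (the filename is only an example):

```python
# Minimal sketch: load a locally downloaded single-file quant with llama-cpp-python.
# The model_path below is illustrative; substitute any quant from the table in this card.
from llama_cpp import Llama

llm = Llama(
    model_path="datagemma-rig-27b-it.i1-Q4_K_M.gguf",  # local path to the downloaded file
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if a GPU-enabled build is installed
)

out = llm("Explain what retrieval-interleaved generation (RIG) is.", max_tokens=128)
print(out["choices"][0]["text"])
```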
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ1_S.gguf) | i1-IQ1_S | 6.2 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ1_M.gguf) | i1-IQ1_M | 6.8 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 7.7 | |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ2_XS.gguf) | i1-IQ2_XS | 8.5 | |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ2_S.gguf) | i1-IQ2_S | 8.8 | |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ2_M.gguf) | i1-IQ2_M | 9.5 | |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q2_K.gguf) | i1-Q2_K | 10.5 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 10.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ3_XS.gguf) | i1-IQ3_XS | 11.7 | |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ3_S.gguf) | i1-IQ3_S | 12.3 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q3_K_S.gguf) | i1-Q3_K_S | 12.3 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ3_M.gguf) | i1-IQ3_M | 12.6 | |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q3_K_M.gguf) | i1-Q3_K_M | 13.5 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q3_K_L.gguf) | i1-Q3_K_L | 14.6 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-IQ4_XS.gguf) | i1-IQ4_XS | 14.9 | |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q4_0.gguf) | i1-Q4_0 | 15.8 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q4_K_S.gguf) | i1-Q4_K_S | 15.8 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q4_K_M.gguf) | i1-Q4_K_M | 16.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q5_K_S.gguf) | i1-Q5_K_S | 19.0 | |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q5_K_M.gguf) | i1-Q5_K_M | 19.5 | |
| [GGUF](https://huggingface.co/mradermacher/datagemma-rig-27b-it-i1-GGUF/resolve/main/datagemma-rig-27b-it.i1-Q6_K.gguf) | i1-Q6_K | 22.4 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
Yudsky/lora-flan-t5-med-cardio | Yudsky | 2024-11-13T08:32:34Z | 124 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"medical",
"dataset:ruslanmv/ai-medical-chatbot",
"arxiv:1910.09700",
"base_model:google/flan-t5-large",
"base_model:finetune:google/flan-t5-large",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2024-11-13T07:15:01Z | ---
library_name: transformers
tags:
- medical
license: apache-2.0
datasets:
- ruslanmv/ai-medical-chatbot
base_model:
- google/flan-t5-large
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Leejy0-0/ko-Meta-Llama-3.1-8B-Instruct | Leejy0-0 | 2024-11-13T08:27:29Z | 1,939 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"conversational",
"en",
"base_model:unsloth/Meta-Llama-3.1-8B-Instruct",
"base_model:finetune:unsloth/Meta-Llama-3.1-8B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T08:16:11Z | ---
base_model: unsloth/Meta-Llama-3.1-8B-Instruct
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- sft
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** Leejy0-0
- **License:** apache-2.0
- **Finetuned from model :** unsloth/Meta-Llama-3.1-8B-Instruct
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
Labira/LabiraPJOK_6_100_Full | Labira | 2024-11-13T08:09:06Z | 20 | 0 | transformers | [
"transformers",
"tf",
"bert",
"question-answering",
"generated_from_keras_callback",
"base_model:Labira/LabiraPJOK_5_100_Full",
"base_model:finetune:Labira/LabiraPJOK_5_100_Full",
"license:mit",
"endpoints_compatible",
"region:us"
] | question-answering | 2024-11-13T06:38:05Z | ---
library_name: transformers
license: mit
base_model: Labira/LabiraPJOK_5_100_Full
tags:
- generated_from_keras_callback
model-index:
- name: Labira/LabiraPJOK_6_100_Full
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Labira/LabiraPJOK_6_100_Full
This model is a fine-tuned version of [Labira/LabiraPJOK_5_100_Full](https://huggingface.co/Labira/LabiraPJOK_5_100_Full) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1856
- Validation Loss: 0.0721
- Epoch: 99
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 400, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.0582 | 1.6013 | 0 |
| 1.3251 | 1.3889 | 1 |
| 0.8805 | 1.0654 | 2 |
| 0.8392 | 0.6530 | 3 |
| 0.3612 | 0.3225 | 4 |
| 0.3515 | 0.2072 | 5 |
| 0.2917 | 0.1839 | 6 |
| 0.2735 | 0.5011 | 7 |
| 0.2863 | 0.5152 | 8 |
| 0.3594 | 0.5083 | 9 |
| 0.2413 | 0.4202 | 10 |
| 0.3135 | 0.3112 | 11 |
| 0.2592 | 0.2370 | 12 |
| 0.2292 | 0.2129 | 13 |
| 0.2270 | 0.1288 | 14 |
| 0.2107 | 0.1385 | 15 |
| 0.1990 | 0.1431 | 16 |
| 0.1920 | 0.1420 | 17 |
| 0.2805 | 0.1550 | 18 |
| 0.2343 | 0.1466 | 19 |
| 0.2061 | 0.1351 | 20 |
| 0.1422 | 0.1275 | 21 |
| 0.1669 | 0.1235 | 22 |
| 0.1482 | 0.1215 | 23 |
| 0.1162 | 0.1202 | 24 |
| 0.1288 | 0.1102 | 25 |
| 0.1435 | 0.1094 | 26 |
| 0.2018 | 0.1077 | 27 |
| 0.0912 | 0.0939 | 28 |
| 0.1054 | 0.0915 | 29 |
| 0.1274 | 0.0775 | 30 |
| 0.0758 | 0.0783 | 31 |
| 0.1480 | 0.0800 | 32 |
| 0.0722 | 0.0811 | 33 |
| 0.0978 | 0.0799 | 34 |
| 0.1078 | 0.0782 | 35 |
| 0.0815 | 0.0765 | 36 |
| 0.0744 | 0.0753 | 37 |
| 0.1194 | 0.0745 | 38 |
| 0.1327 | 0.0744 | 39 |
| 0.1164 | 0.0749 | 40 |
| 0.0480 | 0.0756 | 41 |
| 0.0424 | 0.0759 | 42 |
| 0.0830 | 0.0761 | 43 |
| 0.0842 | 0.0760 | 44 |
| 0.1157 | 0.0751 | 45 |
| 0.1100 | 0.0744 | 46 |
| 0.0937 | 0.0741 | 47 |
| 0.1211 | 0.0739 | 48 |
| 0.0880 | 0.0737 | 49 |
| 0.1047 | 0.0738 | 50 |
| 0.1037 | 0.0741 | 51 |
| 0.1366 | 0.0860 | 52 |
| 0.0815 | 0.0913 | 53 |
| 0.1404 | 0.0913 | 54 |
| 0.0952 | 0.1043 | 55 |
| 0.0658 | 0.1044 | 56 |
| 0.1319 | 0.1045 | 57 |
| 0.0918 | 0.1152 | 58 |
| 0.1372 | 0.1151 | 59 |
| 0.1203 | 0.1148 | 60 |
| 0.1251 | 0.1146 | 61 |
| 0.0606 | 0.1144 | 62 |
| 0.1407 | 0.1141 | 63 |
| 0.1266 | 0.1139 | 64 |
| 0.1025 | 0.1138 | 65 |
| 0.1077 | 0.1136 | 66 |
| 0.1312 | 0.1136 | 67 |
| 0.0987 | 0.1135 | 68 |
| 0.1199 | 0.1135 | 69 |
| 0.1427 | 0.1136 | 70 |
| 0.1271 | 0.1024 | 71 |
| 0.1049 | 0.1024 | 72 |
| 0.1073 | 0.1027 | 73 |
| 0.1162 | 0.1029 | 74 |
| 0.0863 | 0.1029 | 75 |
| 0.1062 | 0.1028 | 76 |
| 0.1034 | 0.1027 | 77 |
| 0.0984 | 0.1026 | 78 |
| 0.0988 | 0.1024 | 79 |
| 0.1153 | 0.1023 | 80 |
| 0.1020 | 0.1022 | 81 |
| 0.0990 | 0.1019 | 82 |
| 0.0881 | 0.0884 | 83 |
| 0.1330 | 0.0865 | 84 |
| 0.1972 | 0.0717 | 85 |
| 0.1165 | 0.0719 | 86 |
| 0.1853 | 0.0722 | 87 |
| 0.0734 | 0.0722 | 88 |
| 0.1391 | 0.0722 | 89 |
| 0.0942 | 0.0721 | 90 |
| 0.0817 | 0.0721 | 91 |
| 0.0757 | 0.0720 | 92 |
| 0.0738 | 0.0720 | 93 |
| 0.1871 | 0.0720 | 94 |
| 0.1965 | 0.0720 | 95 |
| 0.0812 | 0.0721 | 96 |
| 0.1010 | 0.0721 | 97 |
| 0.0709 | 0.0721 | 98 |
| 0.1856 | 0.0721 | 99 |
### Framework versions
- Transformers 4.46.2
- TensorFlow 2.17.0
- Datasets 3.1.0
- Tokenizers 0.20.3
|
vantubk1386/vits-vie-with-discriminator | vantubk1386 | 2024-11-13T07:57:10Z | 36 | 0 | transformers | [
"transformers",
"safetensors",
"vits",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | 2024-11-13T07:55:35Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Keltezaa/mckenna-grace-768-portrait | Keltezaa | 2024-11-13T07:45:40Z | 131 | 2 | diffusers | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"migrated",
"celebrity",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-13T06:09:51Z | ---
license: other
license_name: bespoke-lora-trained-license
license_link: >-
https://multimodal.art/civitai-licenses?allowNoCredit=False&allowCommercialUse=RentCivit&allowDerivatives=False&allowDifferentLicense=False
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
- migrated
- celebrity
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: null
widget:
- text: ' '
output:
url: 30071437.jpeg
- text: ' '
output:
url: 30071418.jpeg
- text: ' '
output:
url: 30071419.jpeg
- text: ' '
output:
url: 30071421.jpeg
- text: ' '
output:
url: 30071422.jpeg
- text: ' '
output:
url: 30071424.jpeg
- text: ' '
output:
url: 30071434.jpeg
- text: ' '
output:
url: 30071436.jpeg
- text: ' '
output:
url: 30071439.jpeg
- text: ' '
output:
url: 30072383.jpeg
- text: ' '
output:
url: 30072384.jpeg
- text: ' '
output:
url: 29865285.jpeg
- text: ' '
output:
url: 29865282.jpeg
- text: >-
(14-year-old:1.2) A High quality Photo of a 14-year-old girl, lips parted in
a focused expression, black curly hair that is styled in a natural, tousled
manner. The hair appears to be very short at the sides to medium length at
the top and has a (voluminous on top of head). She is wearing (round thin
wire rim glasses). Fair pale skin with freckles on nose and cheeks.
output:
url: images/example_ngkro24sn.png
- text: >-
(16-year-old:1.2) A High quality,muted tones, realistic, film noir,
cinematic, Photo of a 16-year-old "ghostbuster" girl, natural beauty, white
pale skin, light rosy lips parted with a focused expression, black curly
hair that is tousled ant the top. The hair appears to be very short at the
sides to medium length at the top and has a ( (voluminous, wavy and curly
tousled on top of head:0.8)). She is wearing (round thin wire rim glasses).
Fair pale skin with freckles on nose and cheeks. Wearing a "ghostbuster"
overall and gear.
output:
url: images/example_g918t97ki.png
---
# Mckenna Grace, 768 portrait
<Gallery />
## Model description
<p>Intended mainly for portraits and close-up portraits. The dataset (roughly 100-150 photos) was limited to HD, sharp images in which the eyes look at the camera.</p><p>Trained with the ComfyUI FluxTrainer on 16 GB of VRAM.</p><p>For best results, generate at a resolution of 768 (which minimizes face-body proportion distortions) and then upscale as needed.</p><p>The LoRA file is small because only two blocks (7 and 20) were trained.</p><p>The top row of the grid shows this checkpoint; grids are provided for LoRA strengths 0.8 and 1.2.</p>
## Download model
Weights for this model are available in Safetensors format.
[Download](/Keltezaa/mckenna-grace-768-portrait/tree/main) them in the Files & versions tab.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
device = "cuda" if torch.cuda.is_available() else "cpu"
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16).to(device)
pipeline.load_lora_weights('Keltezaa/mckenna-grace-768-portrait', weight_name='mckenna_768_rank128_bf16-step03500.safetensors')
image = pipeline('Your custom prompt').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
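The description above mentions grids at LoRA strengths 0.8 and 1.2; a hedged sketch of setting that strength explicitly (the adapter name `mckenna` is an arbitrary label introduced here, and a recent PEFT-backed diffusers release is assumed):

```py
# Hedged sketch: load the LoRA under an explicit adapter name so its strength
# can be adjusted, as in the 0.8 / 1.2 grids mentioned in the description.
from diffusers import AutoPipelineForText2Image
import torch

pipeline = AutoPipelineForText2Image.from_pretrained(
    'black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16
).to("cuda")
pipeline.load_lora_weights(
    'Keltezaa/mckenna-grace-768-portrait',
    weight_name='mckenna_768_rank128_bf16-step03500.safetensors',
    adapter_name='mckenna',  # arbitrary label chosen for this sketch
)
pipeline.set_adapters(['mckenna'], adapter_weights=[0.8])  # LoRA strength; try 0.8-1.2

image = pipeline('Your custom prompt', width=768, height=768).images[0]
```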
|
upadhyay/phi3_mini-insturct_quantized | upadhyay | 2024-11-13T07:45:02Z | 5 | 0 | null | [
"safetensors",
"phi3",
"custom_code",
"license:apache-2.0",
"4-bit",
"gptq",
"region:us"
] | null | 2024-11-13T07:40:30Z | ---
license: apache-2.0
---
|
AIFunOver/Qwen2.5-Coder-7B-Instruct-openvino-fp16 | AIFunOver | 2024-11-13T07:42:42Z | 52 | 0 | transformers | [
"transformers",
"safetensors",
"openvino",
"qwen2",
"text-generation",
"code",
"codeqwen",
"chat",
"qwen",
"qwen-coder",
"nncf",
"fp16",
"conversational",
"en",
"base_model:Qwen/Qwen2.5-Coder-7B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-Coder-7B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T02:42:34Z | ---
base_model: Qwen/Qwen2.5-Coder-7B-Instruct
language:
- en
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct/blob/main/LICENSE
pipeline_tag: text-generation
tags:
- code
- codeqwen
- chat
- qwen
- qwen-coder
- openvino
- nncf
- fp16
---
This model is a quantized version of [`Qwen/Qwen2.5-Coder-7B-Instruct`](https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct) and is converted to the OpenVINO format. This model was obtained via the [nncf-quantization](https://huggingface.co/spaces/echarlaix/nncf-quantization) space with [optimum-intel](https://github.com/huggingface/optimum-intel).
First make sure you have `optimum-intel` installed:
```bash
pip install optimum[openvino]
```
To load your model you can do as follows:
```python
from optimum.intel import OVModelForCausalLM
model_id = "AIFunOver/Qwen2.5-Coder-7B-Instruct-openvino-fp16"
model = OVModelForCausalLM.from_pretrained(model_id)
```
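Generation then works the same way as with a regular `transformers` causal LM; a short sketch continuing from the snippet above (the prompt is only an example):

```python
# Continue from the snippet above: tokenize a prompt and generate with the OpenVINO model.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(model_id)
inputs = tokenizer("Write a Python function that reverses a string.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```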
|
Carick/FacebookAI-roberta-base-fine-tuned-term-typing | Carick | 2024-11-13T07:26:27Z | 107 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"text-classification",
"generated_from_trainer",
"base_model:FacebookAI/roberta-base",
"base_model:finetune:FacebookAI/roberta-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-11T15:57:19Z | ---
library_name: transformers
license: mit
base_model: FacebookAI/roberta-base
tags:
- generated_from_trainer
model-index:
- name: FacebookAI-roberta-base-fine-tuned-term-typing
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# FacebookAI-roberta-base-fine-tuned-term-typing
This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0663
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.2362 | 1.0 | 2535 | 0.1437 |
| 0.2113 | 2.0 | 5070 | 0.0809 |
| 0.1617 | 3.0 | 7605 | 0.0663 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
|
mradermacher/WizardLM-30B-Uncensored-i1-GGUF | mradermacher | 2024-11-13T07:24:24Z | 345 | 0 | transformers | [
"transformers",
"gguf",
"uncensored",
"en",
"dataset:ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered",
"base_model:cognitivecomputations/WizardLM-30B-Uncensored",
"base_model:quantized:cognitivecomputations/WizardLM-30B-Uncensored",
"license:other",
"endpoints_compatible",
"region:us",
"imatrix"
] | null | 2024-11-12T18:21:18Z | ---
base_model: cognitivecomputations/WizardLM-30B-Uncensored
datasets:
- ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered
language:
- en
library_name: transformers
license: other
quantized_by: mradermacher
tags:
- uncensored
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/cognitivecomputations/WizardLM-30B-Uncensored
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ1_S.gguf) | i1-IQ1_S | 7.2 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ1_M.gguf) | i1-IQ1_M | 7.8 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 8.8 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ2_XS.gguf) | i1-IQ2_XS | 9.7 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ2_S.gguf) | i1-IQ2_S | 10.5 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ2_M.gguf) | i1-IQ2_M | 11.3 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q2_K.gguf) | i1-Q2_K | 12.1 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 12.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ3_XS.gguf) | i1-IQ3_XS | 13.4 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ3_S.gguf) | i1-IQ3_S | 14.2 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q3_K_S.gguf) | i1-Q3_K_S | 14.2 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ3_M.gguf) | i1-IQ3_M | 15.0 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q3_K_M.gguf) | i1-Q3_K_M | 15.9 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q3_K_L.gguf) | i1-Q3_K_L | 17.4 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-IQ4_XS.gguf) | i1-IQ4_XS | 17.4 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q4_0.gguf) | i1-Q4_0 | 18.5 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q4_K_S.gguf) | i1-Q4_K_S | 18.6 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q4_K_M.gguf) | i1-Q4_K_M | 19.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q5_K_S.gguf) | i1-Q5_K_S | 22.5 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q5_K_M.gguf) | i1-Q5_K_M | 23.1 | |
| [GGUF](https://huggingface.co/mradermacher/WizardLM-30B-Uncensored-i1-GGUF/resolve/main/WizardLM-30B-Uncensored.i1-Q6_K.gguf) | i1-Q6_K | 26.8 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf | RichardErkhov | 2024-11-13T07:16:40Z | 8 | 0 | null | [
"gguf",
"endpoints_compatible",
"region:us"
] | null | 2024-11-13T03:32:32Z | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
OpenHermes-2.5-neural-chat-v3-3-Slerp - GGUF
- Model creator: https://huggingface.co/Weyaxi/
- Original model: https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q2_K.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q2_K.gguf) | Q2_K | 2.53GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q3_K.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q3_K.gguf) | Q3_K | 3.28GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_0.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_0.gguf) | Q4_0 | 3.83GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_K.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_K.gguf) | Q4_K | 4.07GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_1.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q4_1.gguf) | Q4_1 | 4.24GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_0.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_0.gguf) | Q5_0 | 4.65GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_K.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_K.gguf) | Q5_K | 4.78GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_1.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q5_1.gguf) | Q5_1 | 5.07GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q6_K.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q6_K.gguf) | Q6_K | 5.53GB |
| [OpenHermes-2.5-neural-chat-v3-3-Slerp.Q8_0.gguf](https://huggingface.co/RichardErkhov/Weyaxi_-_OpenHermes-2.5-neural-chat-v3-3-Slerp-gguf/blob/main/OpenHermes-2.5-neural-chat-v3-3-Slerp.Q8_0.gguf) | Q8_0 | 7.17GB |
Original model description:
---
license: apache-2.0
model-index:
- name: OpenHermes-2.5-neural-chat-v3-3-Slerp
results:
- task:
type: text-generation
name: Text Generation
dataset:
name: AI2 Reasoning Challenge (25-Shot)
type: ai2_arc
config: ARC-Challenge
split: test
args:
num_few_shot: 25
metrics:
- type: acc_norm
value: 68.09
name: normalized accuracy
- task:
type: text-generation
name: Text Generation
dataset:
name: HellaSwag (10-Shot)
type: hellaswag
split: validation
args:
num_few_shot: 10
metrics:
- type: acc_norm
value: 86.2
name: normalized accuracy
- task:
type: text-generation
name: Text Generation
dataset:
name: MMLU (5-Shot)
type: cais/mmlu
config: all
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 64.26
name: accuracy
- task:
type: text-generation
name: Text Generation
dataset:
name: TruthfulQA (0-shot)
type: truthful_qa
config: multiple_choice
split: validation
args:
num_few_shot: 0
metrics:
- type: mc2
value: 62.78
- task:
type: text-generation
name: Text Generation
dataset:
name: Winogrande (5-shot)
type: winogrande
config: winogrande_xl
split: validation
args:
num_few_shot: 5
metrics:
- type: acc
value: 79.16
name: accuracy
- task:
type: text-generation
name: Text Generation
dataset:
name: GSM8k (5-shot)
type: gsm8k
config: main
split: test
args:
num_few_shot: 5
metrics:
- type: acc
value: 67.78
name: accuracy
tags:
- merge
base_model:
- teknium/OpenHermes-2.5-Mistral-7B
- Intel/neural-chat-7b-v3-3
---

# OpenHermes-2.5-neural-chat-v3-3-Slerp
This is the model for OpenHermes-2.5-neural-chat-v3-3-Slerp. I used [mergekit](https://github.com/cg123/mergekit) to merge models.
# Prompt Templates
You can use these prompt templates, but I recommend using ChatML.
### ChatML [(OpenHermes-2.5-Mistral-7B)](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B):
```
<|im_start|>system
{system}<|im_end|>
<|im_start|>user
{user}<|im_end|>
<|im_start|>assistant
{assistant}<|im_end|>
```
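As a small illustration, the ChatML prompt can be assembled directly in Python before being handed to whatever backend serves the model (the message contents are placeholders):

```python
# Build a ChatML-formatted prompt string from plain system/user messages.
def build_chatml(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml(
    system="You are a helpful assistant.",
    user="Summarize the ChatML format in one sentence.",
)
print(prompt)
```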
### [neural-chat-7b-v3-3](https://huggingface.co/Intel/neural-chat-7b-v3-3):
```
### System:
{system}
### User:
{user}
### Assistant:
```
# Yaml Config to reproduce
```yaml
slices:
- sources:
- model: teknium/OpenHermes-2.5-Mistral-7B
layer_range: [0, 32]
- model: Intel/neural-chat-7b-v3-3
layer_range: [0, 32]
merge_method: slerp
base_model: mistralai/Mistral-7B-v0.1
parameters:
t:
- filter: self_attn
value: [0, 0.5, 0.3, 0.7, 1]
- filter: mlp
value: [1, 0.5, 0.7, 0.3, 0]
- value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
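Assuming mergekit is installed, this config can be passed straight to its CLI (something like `mergekit-yaml config.yaml ./merged-model`) to reproduce the merge.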
# Quantized versions
Quantized versions of this model are available thanks to [TheBloke](https://hf.co/TheBloke).
##### GPTQ
- [TheBloke/OpenHermes-2.5-neural-chat-v3-3-Slerp-GPTQ](https://huggingface.co/TheBloke/OpenHermes-2.5-neural-chat-v3-3-Slerp-GPTQ)
##### GGUF
- [TheBloke/OpenHermes-2.5-neural-chat-v3-3-Slerp-GGUF](https://huggingface.co/TheBloke/OpenHermes-2.5-neural-chat-v3-3-Slerp-GGUF)
##### AWQ
- [TheBloke/OpenHermes-2.5-neural-chat-v3-3-Slerp-AWQ](https://huggingface.co/TheBloke/OpenHermes-2.5-neural-chat-v3-3-Slerp-AWQ)
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__OpenHermes-2.5-neural-chat-v3-3-Slerp)
| Metric | Value |
|-----------------------|---------------------------|
| Avg. | 71.38 |
| ARC (25-shot) | 68.09 |
| HellaSwag (10-shot) | 86.2 |
| MMLU (5-shot) | 64.26 |
| TruthfulQA (0-shot) | 62.78 |
| Winogrande (5-shot) | 79.16 |
| GSM8K (5-shot) | 67.78 |
If you would like to support me:
[☕ Buy Me a Coffee](https://www.buymeacoffee.com/weyaxi)
|
featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF | featherless-ai-quants | 2024-11-13T07:12:29Z | 7 | 0 | null | [
"gguf",
"text-generation",
"base_model:CorticalStack/mistral-7b-openhermes-sft",
"base_model:quantized:CorticalStack/mistral-7b-openhermes-sft",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T06:51:12Z | ---
base_model: CorticalStack/mistral-7b-openhermes-sft
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# CorticalStack/mistral-7b-openhermes-sft GGUF Quantizations 🚀

*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [CorticalStack-mistral-7b-openhermes-sft-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-IQ4_XS.gguf) | 3761.66 MB |
| Q2_K | [CorticalStack-mistral-7b-openhermes-sft-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q2_K.gguf) | 2593.27 MB |
| Q3_K_L | [CorticalStack-mistral-7b-openhermes-sft-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q3_K_L.gguf) | 3644.97 MB |
| Q3_K_M | [CorticalStack-mistral-7b-openhermes-sft-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q3_K_M.gguf) | 3355.97 MB |
| Q3_K_S | [CorticalStack-mistral-7b-openhermes-sft-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q3_K_S.gguf) | 3017.97 MB |
| Q4_K_M | [CorticalStack-mistral-7b-openhermes-sft-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q4_K_M.gguf) | 4166.07 MB |
| Q4_K_S | [CorticalStack-mistral-7b-openhermes-sft-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q4_K_S.gguf) | 3948.57 MB |
| Q5_K_M | [CorticalStack-mistral-7b-openhermes-sft-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q5_K_M.gguf) | 4893.69 MB |
| Q5_K_S | [CorticalStack-mistral-7b-openhermes-sft-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q5_K_S.gguf) | 4766.19 MB |
| Q6_K | [CorticalStack-mistral-7b-openhermes-sft-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q6_K.gguf) | 5666.80 MB |
| Q8_0 | [CorticalStack-mistral-7b-openhermes-sft-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF/blob/main/CorticalStack-mistral-7b-openhermes-sft-Q8_0.gguf) | 7339.34 MB |
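To fetch one of the files in the table programmatically, a minimal sketch using `huggingface_hub` (the chosen quant is just an example):

```python
# Minimal sketch: download a single quant from this repo with huggingface_hub.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="featherless-ai-quants/CorticalStack-mistral-7b-openhermes-sft-GGUF",
    filename="CorticalStack-mistral-7b-openhermes-sft-Q4_K_M.gguf",  # any file from the table
)
print(path)  # local cache path of the downloaded GGUF file
```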
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
jacobhoffmann/TestGen_v2.1-codegemma-7b-lr0.0002_epochs3 | jacobhoffmann | 2024-11-13T07:11:51Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"gemma",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T07:06:33Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
magnifi/Phi3_intent_v42_1_w_unknown_8_lr_0.002 | magnifi | 2024-11-13T07:10:43Z | 77 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"text-generation-inference",
"unsloth",
"trl",
"conversational",
"en",
"base_model:unsloth/Phi-3-mini-4k-instruct-bnb-4bit",
"base_model:finetune:unsloth/Phi-3-mini-4k-instruct-bnb-4bit",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T07:08:35Z | ---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
base_model: unsloth/Phi-3-mini-4k-instruct-bnb-4bit
---
# Uploaded model
- **Developed by:** magnifi
- **License:** apache-2.0
- **Finetuned from model :** unsloth/Phi-3-mini-4k-instruct-bnb-4bit
This mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
harxsan/model | harxsan | 2024-11-13T07:06:27Z | 64 | 0 | transformers | [
"transformers",
"pytorch",
"safetensors",
"gguf",
"qwen2",
"text-generation-inference",
"unsloth",
"trl",
"sft",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-10-18T21:50:56Z | ---
base_model: unsloth/qwen2.5-0.5b-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- trl
- sft
---
# Uploaded model
- **Developed by:** harxsan
- **License:** apache-2.0
- **Finetuned from model :** unsloth/qwen2.5-0.5b-bnb-4bit
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
wlfls0305/vit-base-beans-demo-v5 | wlfls0305 | 2024-11-13T07:03:33Z | 194 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"vit",
"image-classification",
"ViT",
"VIT",
"generated_from_trainer",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2024-11-13T06:13:01Z | ---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- image-classification
- ViT
- VIT
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-base-beans-demo-v5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-beans-demo-v5
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0586
- Accuracy: 0.9699
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows this list):
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
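A rough, non-authoritative sketch of how these hyperparameters map onto `TrainingArguments` (dataset preparation and the `Trainer` call are omitted; `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above
training_args = TrainingArguments(
    output_dir="vit-base-beans-demo-v5",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=4,
    fp16=True,  # Native AMP mixed precision
)
```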
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.065 | 1.5385 | 100 | 0.0755 | 0.9850 |
| 0.0183 | 3.0769 | 200 | 0.0586 | 0.9699 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
featherless-ai-quants/4yo1-llama-GGUF | featherless-ai-quants | 2024-11-13T07:00:28Z | 8 | 0 | null | [
"gguf",
"text-generation",
"base_model:4yo1/llama_1",
"base_model:quantized:4yo1/llama_1",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-07T22:52:51Z | ---
base_model: 4yo1/llama
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# 4yo1/llama GGUF Quantizations 🚀

*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [4yo1-llama-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-IQ4_XS.gguf) | 4276.62 MB |
| Q2_K | [4yo1-llama-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q2_K.gguf) | 3031.86 MB |
| Q3_K_L | [4yo1-llama-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q3_K_L.gguf) | 4121.74 MB |
| Q3_K_M | [4yo1-llama-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q3_K_M.gguf) | 3832.74 MB |
| Q3_K_S | [4yo1-llama-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q3_K_S.gguf) | 3494.74 MB |
| Q4_K_M | [4yo1-llama-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q4_K_M.gguf) | 4692.78 MB |
| Q4_K_S | [4yo1-llama-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q4_K_S.gguf) | 4475.28 MB |
| Q5_K_M | [4yo1-llama-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q5_K_M.gguf) | 5467.40 MB |
| Q5_K_S | [4yo1-llama-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q5_K_S.gguf) | 5339.90 MB |
| Q6_K | [4yo1-llama-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q6_K.gguf) | 6290.44 MB |
| Q8_0 | [4yo1-llama-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/4yo1-llama-GGUF/blob/main/4yo1-llama-Q8_0.gguf) | 8145.11 MB |
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
bartowski/Qwen2.5-Coder-14B-Instruct-exl2 | bartowski | 2024-11-13T06:58:04Z | 12 | 0 | transformers | [
"transformers",
"code",
"codeqwen",
"chat",
"qwen",
"qwen-coder",
"text-generation",
"en",
"base_model:Qwen/Qwen2.5-Coder-14B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-Coder-14B-Instruct",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-12T21:13:26Z | ---
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen2.5-Coder-14B-Instruct/blob/main/LICENSE
language:
- en
base_model:
- Qwen/Qwen2.5-Coder-14B-Instruct
base_model_relation: quantized
pipeline_tag: text-generation
library_name: transformers
tags:
- code
- codeqwen
- chat
- qwen
- qwen-coder
quantized_by: bartowski
---
## Exllama v2 Quantizations of Qwen2.5-Coder-14B-Instruct
Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.2.4">turboderp's ExLlamaV2 v0.2.4</a> for quantization.
<b>The "main" branch only contains the measurement.json; download one of the other branches for the model (see below).</b>
Each branch contains an individual bits-per-weight quantization, with the main branch containing only the measurement.json for further conversions.
Original model: https://huggingface.co/Qwen/Qwen2.5-Coder-14B-Instruct
## Prompt format
```
<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
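If you prefer to build this prompt programmatically, the tokenizer's chat template should produce the same ChatML-style layout. A minimal sketch, assuming the tokenizer from the original repository:

```python
from transformers import AutoTokenizer

# The exl2 branches reuse the original model's tokenizer and chat template
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Coder-14B-Instruct")

messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]

# add_generation_prompt=True appends the trailing <|im_start|>assistant turn
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```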
## Available sizes
| Branch | Bits | lm_head bits | VRAM (4k) | VRAM (16k) | Description |
| ----- | ---- | ------- | ------ | ------ | ------------ |
| [8_0](https://huggingface.co/bartowski/Qwen2.5-Coder-14B-Instruct-exl2/tree/8_0) | 8.0 | 8.0 | 17.4 GB | 20.9 GB | Max quality that ExLlamaV2 can produce, **recommended**. |
| [6_5](https://huggingface.co/bartowski/Qwen2.5-Coder-14B-Instruct-exl2/tree/6_5) | 6.5 | 8.0 | 14.6 GB | 17.5 GB | Near unquantized performance at vastly reduced size, **recommended**. |
| [5_0](https://huggingface.co/bartowski/Qwen2.5-Coder-14B-Instruct-exl2/tree/5_0) | 5.0 | 6.0 | 11.6 GB | 14.4 GB | Slightly lower quality vs 6.5. |
| [4_25](https://huggingface.co/bartowski/Qwen2.5-Coder-14B-Instruct-exl2/tree/4_25) | 4.25 | 6.0 | 10.1 GB | 13.0 GB | GPTQ equivalent bits per weight. |
| [3_5](https://huggingface.co/bartowski/Qwen2.5-Coder-14B-Instruct-exl2/tree/3_5) | 3.5 | 6.0 | 8.7 GB | 11.5 GB | Lower quality, not recommended. |
| [3_0](https://huggingface.co/bartowski/Qwen2.5-Coder-14B-Instruct-exl2/tree/3_0) | 3.0 | 6.0 | 7.8 GB | 10.5 GB | Low quality, not recommended. |
## Download instructions
With git:
```shell
git clone --single-branch --branch 6_5 https://huggingface.co/bartowski/Qwen2.5-Coder-14B-Instruct-exl2 Qwen2.5-Coder-14B-Instruct-exl2-6_5
```
With huggingface hub (credit to TheBloke for instructions):
```shell
pip3 install huggingface-hub
```
To download a specific branch, use the `--revision` parameter. For example, to download the 6.5 bpw branch:
Linux:
```shell
huggingface-cli download bartowski/Qwen2.5-Coder-14B-Instruct-exl2 --revision 6_5 --local-dir Qwen2.5-Coder-14B-Instruct-exl2-6_5
```
Windows (which apparently doesn't like _ in folders sometimes?):
```shell
huggingface-cli download bartowski/Qwen2.5-Coder-14B-Instruct-exl2 --revision 6_5 --local-dir Qwen2.5-Coder-14B-Instruct-exl2-6.5
```
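As an alternative to the CLI, the same branch can be fetched from Python with `huggingface_hub`; a sketch (the local directory name is arbitrary):

```python
from huggingface_hub import snapshot_download

# Download only the 6.5 bpw branch (revision) into a local folder
snapshot_download(
    repo_id="bartowski/Qwen2.5-Coder-14B-Instruct-exl2",
    revision="6_5",
    local_dir="Qwen2.5-Coder-14B-Instruct-exl2-6_5",
)
```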
Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski
|
AIFunOver/Qwen2.5-Coder-7B-Instruct-openvino-8bit | AIFunOver | 2024-11-13T06:55:40Z | 6 | 0 | transformers | [
"transformers",
"safetensors",
"openvino",
"qwen2",
"text-generation",
"code",
"codeqwen",
"chat",
"qwen",
"qwen-coder",
"nncf",
"8-bit",
"conversational",
"en",
"base_model:Qwen/Qwen2.5-Coder-7B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-Coder-7B-Instruct",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-08T02:18:21Z | ---
base_model: Qwen/Qwen2.5-Coder-7B-Instruct
language:
- en
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct/blob/main/LICENSE
pipeline_tag: text-generation
tags:
- code
- codeqwen
- chat
- qwen
- qwen-coder
- openvino
- nncf
- 8-bit
base_model_relation: quantized
---
This model is a quantized version of [`Qwen/Qwen2.5-Coder-7B-Instruct`](https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct) and is converted to the OpenVINO format. This model was obtained via the [nncf-quantization](https://huggingface.co/spaces/echarlaix/nncf-quantization) space with [optimum-intel](https://github.com/huggingface/optimum-intel).
First make sure you have `optimum-intel` installed:
```bash
pip install optimum[openvino]
```
To load your model you can do as follows:
```python
from optimum.intel import OVModelForCausalLM
model_id = "AIFunOver/Qwen2.5-Coder-7B-Instruct-openvino-8bit"
model = OVModelForCausalLM.from_pretrained(model_id)
```
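For a quick end-to-end check, generation works the same way as with a regular `transformers` causal LM. A minimal sketch (the prompt and generation settings are only examples):

```python
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "AIFunOver/Qwen2.5-Coder-7B-Instruct-openvino-8bit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)

# Build a chat-style prompt and generate a short completion
messages = [{"role": "user", "content": "Write a Python function that checks if a number is prime."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```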
|
mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF | mradermacher | 2024-11-13T06:49:15Z | 45 | 0 | transformers | [
"transformers",
"gguf",
"code",
"qwen",
"qwen-coder",
"codeqwen",
"en",
"base_model:rombodawg/Rombos-Coder-V2.5-Qwen-32b",
"base_model:quantized:rombodawg/Rombos-Coder-V2.5-Qwen-32b",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-13T03:58:27Z | ---
base_model: rombodawg/Rombos-Coder-V2.5-Qwen-32b
language:
- en
library_name: transformers
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen2.5-Coder-32B/blob/main/LICENSE
quantized_by: mradermacher
tags:
- code
- qwen
- qwen-coder
- codeqwen
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/rombodawg/Rombos-Coder-V2.5-Qwen-32b
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
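As an illustrative sketch (not part of the original instructions), a single-file quant can be pulled and run locally with `huggingface_hub` and `llama-cpp-python`; the file name is taken from the table below, and the context size is a placeholder:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch one of the single-file quants listed in the table below
gguf_path = hf_hub_download(
    repo_id="mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF",
    filename="Rombos-Coder-V2.5-Qwen-32b.Q4_K_M.gguf",
)

llm = Llama(model_path=gguf_path, n_ctx=4096)
out = llm("Write a Python function that reverses a list.", max_tokens=128)
print(out["choices"][0]["text"])
```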
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q2_K.gguf) | Q2_K | 12.4 | |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q3_K_S.gguf) | Q3_K_S | 14.5 | |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q3_K_M.gguf) | Q3_K_M | 16.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q3_K_L.gguf) | Q3_K_L | 17.3 | |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.IQ4_XS.gguf) | IQ4_XS | 18.0 | |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q4_K_S.gguf) | Q4_K_S | 18.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q4_K_M.gguf) | Q4_K_M | 20.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q5_K_S.gguf) | Q5_K_S | 22.7 | |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q5_K_M.gguf) | Q5_K_M | 23.4 | |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q6_K.gguf) | Q6_K | 27.0 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Rombos-Coder-V2.5-Qwen-32b-GGUF/resolve/main/Rombos-Coder-V2.5-Qwen-32b.Q8_0.gguf) | Q8_0 | 34.9 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
Carick/distilbert-distilbert-base-uncased-fine-tuned-term-typing | Carick | 2024-11-13T06:44:09Z | 124 | 0 | transformers | [
"transformers",
"safetensors",
"distilbert",
"text-classification",
"generated_from_trainer",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2024-11-11T15:12:08Z | ---
library_name: transformers
license: apache-2.0
base_model: distilbert/distilbert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: distilbert-distilbert-base-uncased-fine-tuned-term-typing
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-distilbert-base-uncased-fine-tuned-term-typing
This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0311
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.2637 | 1.0 | 2535 | 0.0939 |
| 0.1249 | 2.0 | 5070 | 0.0467 |
| 0.0956 | 3.0 | 7605 | 0.0311 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
|
Labira/LabiraPJOK_5_100_Full | Labira | 2024-11-13T06:35:28Z | 61 | 0 | transformers | [
"transformers",
"tf",
"bert",
"question-answering",
"generated_from_keras_callback",
"base_model:Labira/LabiraPJOK_3_100_Full",
"base_model:finetune:Labira/LabiraPJOK_3_100_Full",
"license:mit",
"endpoints_compatible",
"region:us"
] | question-answering | 2024-11-13T06:18:22Z | ---
library_name: transformers
license: mit
base_model: Labira/LabiraPJOK_3_100_Full
tags:
- generated_from_keras_callback
model-index:
- name: Labira/LabiraPJOK_5_100_Full
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Labira/LabiraPJOK_5_100_Full
This model is a fine-tuned version of [Labira/LabiraPJOK_3_100_Full](https://huggingface.co/Labira/LabiraPJOK_3_100_Full) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0022
- Validation Loss: 0.0008
- Epoch: 96
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 400, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.5586 | 1.9568 | 0 |
| 1.7698 | 1.4494 | 1 |
| 1.2758 | 1.1921 | 2 |
| 0.9196 | 0.8832 | 3 |
| 0.9555 | 0.6157 | 4 |
| 0.6439 | 0.4235 | 5 |
| 0.4805 | 0.3079 | 6 |
| 0.2379 | 0.2399 | 7 |
| 0.2184 | 0.0946 | 8 |
| 0.1195 | 0.0436 | 9 |
| 0.0914 | 0.0233 | 10 |
| 0.0457 | 0.0143 | 11 |
| 0.0791 | 0.0107 | 12 |
| 0.0615 | 0.0084 | 13 |
| 0.0728 | 0.0071 | 14 |
| 0.0147 | 0.0061 | 15 |
| 0.0417 | 0.0058 | 16 |
| 0.0208 | 0.0064 | 17 |
| 0.0116 | 0.0074 | 18 |
| 0.0223 | 0.0055 | 19 |
| 0.0372 | 0.0046 | 20 |
| 0.0381 | 0.0046 | 21 |
| 0.0065 | 0.0049 | 22 |
| 0.0142 | 0.0048 | 23 |
| 0.0199 | 0.0036 | 24 |
| 0.0129 | 0.0025 | 25 |
| 0.0273 | 0.0019 | 26 |
| 0.0075 | 0.0016 | 27 |
| 0.0157 | 0.0015 | 28 |
| 0.0100 | 0.0015 | 29 |
| 0.0063 | 0.0015 | 30 |
| 0.0068 | 0.0015 | 31 |
| 0.0057 | 0.0015 | 32 |
| 0.0039 | 0.0015 | 33 |
| 0.0044 | 0.0015 | 34 |
| 0.0062 | 0.0014 | 35 |
| 0.0118 | 0.0013 | 36 |
| 0.0035 | 0.0011 | 37 |
| 0.0064 | 0.0009 | 38 |
| 0.0049 | 0.0008 | 39 |
| 0.0106 | 0.0008 | 40 |
| 0.0070 | 0.0009 | 41 |
| 0.0030 | 0.0010 | 42 |
| 0.0061 | 0.0011 | 43 |
| 0.0058 | 0.0011 | 44 |
| 0.0083 | 0.0012 | 45 |
| 0.0064 | 0.0014 | 46 |
| 0.0045 | 0.0014 | 47 |
| 0.0521 | 0.0014 | 48 |
| 0.0031 | 0.0015 | 49 |
| 0.0094 | 0.0014 | 50 |
| 0.0060 | 0.0012 | 51 |
| 0.0052 | 0.0010 | 52 |
| 0.0160 | 0.0008 | 53 |
| 0.0125 | 0.0007 | 54 |
| 0.0186 | 0.0007 | 55 |
| 0.0093 | 0.0011 | 56 |
| 0.0023 | 0.0019 | 57 |
| 0.0059 | 0.0023 | 58 |
| 0.0033 | 0.0022 | 59 |
| 0.0033 | 0.0020 | 60 |
| 0.0047 | 0.0017 | 61 |
| 0.0049 | 0.0015 | 62 |
| 0.0021 | 0.0013 | 63 |
| 0.0134 | 0.0012 | 64 |
| 0.0049 | 0.0012 | 65 |
| 0.0674 | 0.0013 | 66 |
| 0.0284 | 0.0013 | 67 |
| 0.0035 | 0.0012 | 68 |
| 0.0074 | 0.0011 | 69 |
| 0.0072 | 0.0010 | 70 |
| 0.0035 | 0.0010 | 71 |
| 0.0038 | 0.0009 | 72 |
| 0.0040 | 0.0009 | 73 |
| 0.0017 | 0.0008 | 74 |
| 0.0183 | 0.0008 | 75 |
| 0.0178 | 0.0007 | 76 |
| 0.0043 | 0.0007 | 77 |
| 0.0081 | 0.0007 | 78 |
| 0.0046 | 0.0007 | 79 |
| 0.0035 | 0.0007 | 80 |
| 0.0097 | 0.0007 | 81 |
| 0.0095 | 0.0007 | 82 |
| 0.0040 | 0.0008 | 83 |
| 0.0036 | 0.0008 | 84 |
| 0.0073 | 0.0008 | 85 |
| 0.0174 | 0.0008 | 86 |
| 0.0034 | 0.0009 | 87 |
| 0.0035 | 0.0009 | 88 |
| 0.0038 | 0.0009 | 89 |
| 0.0025 | 0.0008 | 90 |
| 0.0024 | 0.0008 | 91 |
| 0.0101 | 0.0008 | 92 |
| 0.0027 | 0.0008 | 93 |
| 0.0016 | 0.0008 | 94 |
| 0.0075 | 0.0008 | 95 |
| 0.0022 | 0.0008 | 96 |
### Framework versions
- Transformers 4.46.2
- TensorFlow 2.17.0
- Datasets 3.1.0
- Tokenizers 0.20.3
|
fahmizainal17/Meta-Llama-3-8B-Instruct-fine-tuned | fahmizainal17 | 2024-11-13T06:32:15Z | 80 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"language-model",
"causal-language-model",
"instruction-tuned",
"advanced",
"quantized",
"en",
"arxiv:2301.10345",
"base_model:meta-llama/Meta-Llama-3-8B-Instruct",
"base_model:quantized:meta-llama/Meta-Llama-3-8B-Instruct",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] | text-generation | 2024-11-13T03:52:42Z | ---
license: mit
language:
- en
base_model:
- meta-llama/Meta-Llama-3-8B-Instruct
pipeline_tag: text-generation
library_name: transformers
tags:
- language-model
- causal-language-model
- instruction-tuned
- advanced
- quantized
---
# Model Card for fahmizainal17/Meta-Llama-3-8B-Instruct-fine-tuned
This model is a fine-tuned version of the Meta LLaMA 3B model, optimized for instruction-based tasks such as answering questions and engaging in conversation. It has been quantized to reduce memory usage, making it more efficient for inference, especially on hardware with limited resources. This model is part of the **Advanced LLaMA Workshop** and is designed to handle complex queries and provide detailed, human-like responses.
## Model Details
### Model Description
This model is a variant of **Meta LLaMA 3B**, fine-tuned with instruction-following capabilities for better performance on NLP tasks like question answering, text generation, and dialogue. The model is optimized using 4-bit quantization to fit within limited GPU memory while maintaining a high level of accuracy and response quality.
- **Developed by:** fahmizainal17
- **Model type:** Causal Language Model
- **Language(s) (NLP):** English (potentially adaptable to other languages with additional fine-tuning)
- **License:** MIT
- **Finetuned from model:** Meta-LLaMA-3B
### Model Sources
- **Repository:** [Hugging Face model page](https://huggingface.co/fahmizainal17/meta-llama-3b-instruct-advanced)
- **Paper:** [Meta-LLaMA Paper](https://arxiv.org/abs/2301.10345) (Meta LLaMA Base Paper)
- **Demo:** [Model demo link] (or placeholder if available)
## Uses
### Direct Use
This model is intended for direct use in NLP tasks such as:
- Text generation
- Question answering
- Conversational AI
- Instruction-following tasks
It is ideal for scenarios where users need a model capable of understanding and responding to natural language instructions with detailed outputs.
### Downstream Use
This model can be used as a foundational model for various downstream applications, including:
- Virtual assistants
- Knowledge bases
- Customer support bots
- Other NLP-based AI systems requiring instruction-based responses
### Out-of-Scope Use
This model is not suitable for the following use cases:
- Highly specialized or domain-specific tasks without further fine-tuning (e.g., legal, medical)
- Tasks requiring real-time decision-making in critical environments (e.g., healthcare, finance)
- Misuse for malicious or harmful purposes (e.g., disinformation, harmful content generation)
## Bias, Risks, and Limitations
This model inherits potential biases from the data it was trained on. Users should be aware of possible biases in the model's responses, especially with regard to political, social, or controversial topics. Additionally, while quantization helps reduce memory usage, it may result in slight degradation in performance compared to full-precision models.
### Recommendations
Users are encouraged to monitor and review outputs for sensitive topics. Further fine-tuning or additional safeguards may be necessary to adapt the model to specific domains or mitigate bias. Customization for specific use cases can improve performance and reduce risks.
## How to Get Started with the Model
To use the model, you can load it directly using the following code:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "fahmizainal17/meta-llama-3b-instruct-advanced"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
# Example usage
input_text = "Who is Donald Trump?"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(inputs['input_ids'], max_length=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
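Because the card describes 4-bit quantization, a hedged sketch of loading with an explicit `bitsandbytes` configuration is shown below; the specific quantization settings are assumptions and may differ from those used for this checkpoint:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "fahmizainal17/meta-llama-3b-instruct-advanced"

# Assumed 4-bit settings; adjust to match the actual checkpoint
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map="auto",
)
```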
## Training Details
### Training Data
The model was fine-tuned on a dataset specifically designed for instruction-following tasks, which contains diverse queries and responses for general knowledge questions. The training data was preprocessed to ensure high-quality, contextually relevant instructions.
- **Dataset used:** A curated instruction-following dataset containing general knowledge and conversational tasks.
- **Data Preprocessing:** Text normalization, tokenization, and contextual adjustment were used to ensure the dataset was ready for fine-tuning.
### Training Procedure
The model was fine-tuned using mixed precision training with 4-bit quantization to ensure efficient use of GPU resources.
#### Preprocessing
Preprocessing involved tokenizing the instruction-based dataset and formatting it for causal language modeling. The dataset was split into smaller batches to facilitate efficient training.
#### Training Hyperparameters
- **Training regime:** fp16 mixed precision
- **Batch size:** 8 (due to memory constraints from 4-bit quantization)
- **Learning rate:** 5e-5
#### Speeds, Sizes, Times
- **Model size:** 3B parameters (Meta LLaMA 3B)
- **Training time:** Approximately 72 hours on a single T4 GPU (Google Colab)
- **Inference speed:** Roughly 0.5–1.0 seconds per query on T4 GPU
## Evaluation
### Testing Data, Factors & Metrics
- **Testing Data:** The model was evaluated on a standard benchmark dataset for question answering and instruction-following tasks (e.g., SQuAD, WikiQA).
- **Factors:** Evaluated across various domains and types of instructions.
- **Metrics:** Accuracy, response quality, and computational efficiency. In the case of response generation, metrics such as BLEU, ROUGE, and human evaluation were used.
### Results
- The model performs well on standard instruction-based tasks, delivering detailed and contextually relevant answers in a variety of use cases.
- Evaluated on a set of over 1,000 diverse instruction-based queries.
#### Summary
The fine-tuned model provides a solid foundation for tasks that require understanding and following natural language instructions. Its quantized format ensures it remains efficient for deployment in resource-constrained environments like Google Colab's T4 GPUs.
## Model Examination
This model has been thoroughly evaluated against both automated metrics and human assessments for response quality. It handles diverse types of queries effectively, including fact-based questions, conversational queries, and instruction-following tasks.
## Environmental Impact
The environmental impact of training the model can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute). The model was trained on GPU infrastructure with optimized power usage to minimize carbon footprint.
- **Hardware Type:** NVIDIA T4 GPU (Google Colab)
- **Cloud Provider:** Google Colab
- **Compute Region:** North America
- **Carbon Emitted:** Estimated ~0.02 kg CO2eq per hour of usage
## Technical Specifications
### Model Architecture and Objective
The model is a causal language model, based on the LLaMA architecture, fine-tuned for instruction-following tasks with 4-bit quantization for improved memory usage.
### Compute Infrastructure
The model was trained on GPUs with support for mixed precision and quantized training techniques.
#### Hardware
- **GPU:** NVIDIA Tesla T4
- **CPU:** Intel Xeon, 16 vCPUs
- **RAM:** 16 GB
#### Software
- **Frameworks:** PyTorch, Transformers, Accelerate, Hugging Face Datasets
- **Libraries:** BitsAndBytes, SentencePiece
## Citation
If you reference this model, please use the following citation:
**BibTeX:**
```bibtex
@misc{fahmizainal17meta-llama-3b-instruct-advanced,
author = {Fahmizainal17},
title = {Meta-LLaMA 3B Instruct Advanced},
year = {2024},
publisher = {Hugging Face},
howpublished = {\url{https://huggingface.co/fahmizainal17/meta-llama-3b-instruct-advanced}},
}
```
**APA:**
Fahmizainal17. (2024). *Meta-LLaMA 3B Instruct Advanced*. Hugging Face. Retrieved from https://huggingface.co/fahmizainal17/meta-llama-3b-instruct-advanced
## Glossary
- **Causal Language Model:** A model designed to predict the next token in a sequence, trained to generate coherent and contextually appropriate responses.
- **4-bit Quantization:** A technique used to reduce memory usage by storing model parameters in 4-bit precision, making the model more efficient on limited hardware.
## More Information
For further details on the model's performance, use cases, or licensing, please contact the author or visit the Hugging Face model page.
## Model Card Authors
Fahmizainal17 and collaborators.
## Model Card Contact
For further inquiries, please contact [email protected].
--- |
yjkim104906/llama-3.2-3b-it-disaster-ko-tune | yjkim104906 | 2024-11-13T06:27:59Z | 7 | 0 | transformers | [
"transformers",
"safetensors",
"gguf",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T05:47:23Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
rlatlswls/vit-base-beans-demo-v5 | rlatlswls | 2024-11-13T06:26:26Z | 193 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"vit",
"image-classification",
"VIT",
"generated_from_trainer",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2024-11-13T06:26:13Z | ---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- image-classification
- VIT
- generated_from_trainer
model-index:
- name: vit-base-beans-demo-v5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-beans-demo-v5
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
KangarooGroup/kangaroo | KangarooGroup | 2024-11-13T06:23:56Z | 154 | 12 | transformers | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"video-text-to-text",
"custom_code",
"en",
"zh",
"arxiv:2408.15542",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | video-text-to-text | 2024-07-11T11:58:24Z | ---
license: apache-2.0
language:
- en
- zh
pipeline_tag: video-text-to-text
---
# Kangaroo: A Powerful Video-Language Model Supporting Long-context Video Input
**Kangaroo** has been released. Please check out our [paper](https://arxiv.org/pdf/2408.15542), [blog](https://kangaroogroup.github.io/Kangaroo.github.io/) and [github](https://github.com/KangarooGroup/Kangaroo) for details.
## Abstract
We introduce <strong>Kangaroo</strong>, a powerful Multimodal Large Language Model designed for long-context video understanding. Our presented Kangaroo model shows remarkable performance across diverse video understanding tasks including video caption, QA and conversation. Generally, our key contributions in this work can be summarized as follows:
<ol>
<li><strong>Long-context Video Input.</strong> To enhance the model's capability to comprehend longer videos, we extend the maximum frames of input videos to 160. To this end, we aggregate multiple videos with variable frame counts and aspect ratios into one sample. We further design a spatial-temporal patchify module to improve training efficiency.</li>
<li><strong>Strong Performance.</strong> We evaluate our model across various video understanding benchmarks. The results indicate that our model achieves state-of-the-art performance on the majority of comprehensive benchmarks and maintains a competitive level in others. Notably, our model outperforms most larger open-source models with over 30B parameters and some proprietary models on certain benchmarks.</li>
<li><strong>Video Annotation System.</strong> We develop a data curation and automatic annotation system to generate captions for open-source and internal videos. The generated large-scale dataset is utilized for video-text pre-training. For the video instruction tuning stage, we construct a video instruction tuning dataset based on public and internal datasets covering a variety of tasks.</li>
<li><strong>Bilingual Conversation.</strong> Our proposed model is equipped with the capability of Chinese, English and bilingual conversations, and supports single/multi-round conversation paradigms.
</li>
</ol>
## Quick Start
### Installation
See our [github page](https://github.com/KangarooGroup/Kangaroo)
### Multi-round Chat with 🤗 Transformers
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("KangarooGroup/kangaroo")
model = AutoModelForCausalLM.from_pretrained(
"KangarooGroup/kangaroo",
torch_dtype=torch.bfloat16,
trust_remote_code=True,
)
model = model.to("cuda")
terminators = [tokenizer.eos_token_id, tokenizer.convert_tokens_to_ids("<|eot_id|>")]
video_path = "/path/to/video"
# Round 1
query = "Give a brief description of the video."
out, history = model.chat(video_path=video_path,
query=query,
tokenizer=tokenizer,
max_new_tokens=512,
eos_token_id=terminators,
do_sample=True,
temperature=0.6,
top_p=0.9,)
print('Assistant: \n', out)
# Round 2
query = "What happened at the end of the video?"
out, history = model.chat(video_path=video_path,
query=query,
history=history,
tokenizer=tokenizer,
max_new_tokens=512,
eos_token_id=terminators,
do_sample=True,
temperature=0.6,
top_p=0.9,)
print('Assistant: \n', out)
```
## Citation
If you find it useful for your research, please cite related papers/blogs using this BibTeX:
```bibtex
@misc{kangaroogroup,
title={Kangaroo: A Powerful Video-Language Model Supporting Long-context Video Input},
url={https://kangaroogroup.github.io/Kangaroo.github.io/},
author={Jiajun Liu and Yibing Wang and Hanghang Ma and Xiaoping Wu and Xiaoqi Ma and Jie Hu},
month={July},
year={2024}
}
```
 |
DeardeermeloD/vit-base-beans-demo-v5 | DeardeermeloD | 2024-11-13T06:23:41Z | 250 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"vit",
"image-classification",
"ViT",
"generated_from_trainer",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2024-11-13T06:23:21Z | ---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- image-classification
- ViT
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-base-beans-demo-v5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-beans-demo-v5
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0144
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.1333 | 1.5385 | 100 | 0.0851 | 0.9774 |
| 0.0399 | 3.0769 | 200 | 0.0144 | 1.0 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Tokenizers 0.20.3
|
jacobhoffmann/TestGen_v2.1-codegemma-7b-lr0.0002_epochs2 | jacobhoffmann | 2024-11-13T06:20:54Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"gemma",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T06:15:55Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Keltezaa/real-asian-women-zhen-shi-ya-zhou-nu-xing | Keltezaa | 2024-11-13T06:19:21Z | 278 | 1 | diffusers | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"migrated",
"photorealistic",
"sexy",
"style",
"woman",
"girls",
"realistic",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-13T06:19:20Z | ---
license: other
license_name: bespoke-lora-trained-license
license_link: https://multimodal.art/civitai-licenses?allowNoCredit=True&allowCommercialUse=Sell&allowDerivatives=True&allowDifferentLicense=True
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
- migrated
- photorealistic
- sexy
- style
- woman
- girls
- realistic
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: asg
widget:
- text: 'asg,
shorts, pantyhose,crop top,medium breasts, best quality, masterpiece, realistic, photo-realistic, amazing, finely detail, incredibly absurdres, huge filesize, ultra-detailed, highres, extremely detailed8k nsfw, A young Korean female idol with a youthful and innocent-looking face is posing in her living room. She is wearing a crop top and shorts. The living room has a couch and other furniture.'
output:
url: >-
33959516.jpeg
- text: 'asg,
In a photograph, the portrait of a woman commands attention, her presence almost filling the frame.
She is dressed in a off-shoulder dress, a color and style that emphasizes her elegance while maintaining a touch of understated fashion.
Her long brown hair cascades over her shoulders, adding a soft and warm touch to her overall look.
A silver necklace adorns her neck, the small pendant on it glinting subtly under the lighting, adding a delicate detail to her ensemble.
Her lips are painted with a vibrant red lipstick, highlighting her facial features and bestowing upon her a symbol of confidence and strength.
The background is a simple indoor setting with white walls and soft lighting, creating a serene and pure atmosphere that makes her portrait the sole focus of the image. Overall, the photograph captures a beauty that is simple yet sophisticated, showcasing the inner charm and outer elegance of the woman.'
output:
url: >-
33959617.jpeg
- text: 'asg,
shorts, pantyhose,crop top,medium breasts, best quality, masterpiece, realistic, photo-realistic, amazing, finely detail, incredibly absurdres, huge filesize, ultra-detailed, highres, extremely detailed8k nsfw, A young Korean female idol with a youthful and innocent-looking face is posing in her living room. She is wearing a crop top and shorts. The living room has a couch and other furniture.'
output:
url: >-
33959681.jpeg
- text: 'asg,
In a photograph, the portrait of a woman commands attention, her presence almost filling the frame.
She is dressed in a off-shoulder dress, a color and style that emphasizes her elegance while maintaining a touch of understated fashion.
Her long brown hair cascades over her shoulders, adding a soft and warm touch to her overall look.
A silver necklace adorns her neck, the small pendant on it glinting subtly under the lighting, adding a delicate detail to her ensemble.
Her lips are painted with a vibrant red lipstick, highlighting her facial features and bestowing upon her a symbol of confidence and strength.
The background is a simple indoor setting with white walls and soft lighting, creating a serene and pure atmosphere that makes her portrait the sole focus of the image. Overall, the photograph captures a beauty that is simple yet sophisticated, showcasing the inner charm and outer elegance of the woman.'
output:
url: >-
33959692.jpeg
- text: ' '
output:
url: >-
33961380.jpeg
---
# Real Asian Women | 真实亚洲女性
<Gallery />
## Model description
<p>v1.0</p><p>This model is a face model based on Flux dev 1.0</p><p>Recommended weights: 0.72-0.85 for standalone Use around 0.5 for mixed face lora. Weight reference look at the preview image.</p><p>Recommended size: 768*1024,768*1344,720*1280</p><p>Recommended steps: 22-30 steps</p><p>Trigger word: asg</p><p></p><p>Customized lora can add WeChat: cjw0402 Note the intention.</p><p>Using completely artificial natural language manual marking training, cue word comprehension is very strong. It is not recommended that the cue word in the form of a word for out of the picture. And the chances of hand collapsing are very small, almost none.</p><p>Can be free to play, welcome to actively return to the map.</p><p></p><p><s> </s></p><ul><li><p><span style="color:rgb(255, 255, 255)">v1.0</span></p></li><li><p><span style="color:rgb(255, 255, 255)">本模型是一个基于Flux dev 1.0的脸型模型</span></p></li><li><p><span style="color:rgb(255, 255, 255)">推荐权重:单独使用0.72-0.85 混合脸型lora使用在0.5左右。权重参考看预览图。</span></p></li><li><p><span style="color:rgb(255, 255, 255)">推荐尺寸:768*1024,768*1344,720*1280</span></p></li><li><p><span style="color:rgb(255, 255, 255)">推荐步数:22-30步</span></p></li><li><p><span style="color:rgb(255, 255, 255)">触发词:asg</span></p></li><li><p><em><u><span style="color:rgb(245, 219, 77)">定制lora可加V:cjw0402 备注来意。</span></u></em></p></li><li><p><span style="color:rgba(255, 255, 255, 0.8)">使用完全人工的自然语言手动打标训练,提示词理解能力非常强。不推荐单词形式的提示词进行出图。并且</span><strong><em><span style="color:rgb(225, 60, 57)">手部崩坏</span></em></strong><span style="color:rgba(255, 255, 255, 0.8)">的几率很小,几乎没有。</span></p></li><li><p><u><span style="color:rgba(255, 255, 255, 0.8)">可自由发挥,欢迎大家积极返图。</span></u></p></li></ul>
## Trigger words
You should use `asg` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/Keltezaa/real-asian-women-zhen-shi-ya-zhou-nu-xing/tree/main) them in the Files & versions tab.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
device = "cuda" if torch.cuda.is_available() else "cpu"
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16).to(device)
pipeline.load_lora_weights('Keltezaa/real-asian-women-zhen-shi-ya-zhou-nu-xing', weight_name='asian-girl-000008.safetensors')
image = pipeline('`asg`').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
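The model description above recommends a LoRA weight of roughly 0.72-0.85 when used alone; one hedged way to set that scale in diffusers is via a named adapter (the adapter name below is arbitrary):

```python
from diffusers import AutoPipelineForText2Image
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16).to(device)

# Load the LoRA under an explicit adapter name, then scale it to ~0.8
pipeline.load_lora_weights(
    'Keltezaa/real-asian-women-zhen-shi-ya-zhou-nu-xing',
    weight_name='asian-girl-000008.safetensors',
    adapter_name='asg_lora',
)
pipeline.set_adapters(['asg_lora'], adapter_weights=[0.8])

image = pipeline('asg, portrait photo of a woman, soft indoor lighting').images[0]
```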
|
dkwjd/vit-base-beans-demo-v5 | dkwjd | 2024-11-13T06:13:24Z | 193 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"vit",
"image-classification",
"image-classfication",
"ViT",
"generated_from_trainer",
"base_model:google/vit-base-patch16-224-in21k",
"base_model:finetune:google/vit-base-patch16-224-in21k",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2024-11-13T06:12:47Z | ---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- image-classfication
- ViT
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-base-beans-demo-v5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-beans-demo-v5
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the beans dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0254
- Accuracy: 0.9925
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.0887 | 1.5385 | 100 | 0.0401 | 0.9925 |
| 0.0115 | 3.0769 | 200 | 0.0254 | 0.9925 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
|
RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf | RichardErkhov | 2024-11-13T06:08:10Z | 13 | 0 | null | [
"gguf",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-12T04:30:24Z | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
BigQwen2.5-52B - GGUF
- Model creator: https://huggingface.co/allknowingroger/
- Original model: https://huggingface.co/allknowingroger/BigQwen2.5-52B/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [BigQwen2.5-52B.Q2_K.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q2_K.gguf) | Q2_K | 18.11GB |
| [BigQwen2.5-52B.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q3_K_S.gguf) | Q3_K_S | 21.21GB |
| [BigQwen2.5-52B.Q3_K.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q3_K.gguf) | Q3_K | 23.54GB |
| [BigQwen2.5-52B.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q3_K_M.gguf) | Q3_K_M | 23.54GB |
| [BigQwen2.5-52B.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q3_K_L.gguf) | Q3_K_L | 25.53GB |
| [BigQwen2.5-52B.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.IQ4_XS.gguf) | IQ4_XS | 26.43GB |
| [BigQwen2.5-52B.Q4_0.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q4_0.gguf) | Q4_0 | 27.58GB |
| [BigQwen2.5-52B.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.IQ4_NL.gguf) | IQ4_NL | 27.86GB |
| [BigQwen2.5-52B.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q4_K_S.gguf) | Q4_K_S | 27.8GB |
| [BigQwen2.5-52B.Q4_K.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q4_K.gguf) | Q4_K | 29.41GB |
| [BigQwen2.5-52B.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q4_K_M.gguf) | Q4_K_M | 29.41GB |
| [BigQwen2.5-52B.Q4_1.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q4_1.gguf) | Q4_1 | 30.58GB |
| [BigQwen2.5-52B.Q5_0.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q5_0.gguf) | Q5_0 | 33.57GB |
| [BigQwen2.5-52B.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q5_K_S.gguf) | Q5_K_S | 33.57GB |
| [BigQwen2.5-52B.Q5_K.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q5_K.gguf) | Q5_K | 34.52GB |
| [BigQwen2.5-52B.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q5_K_M.gguf) | Q5_K_M | 34.52GB |
| [BigQwen2.5-52B.Q5_1.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/blob/main/BigQwen2.5-52B.Q5_1.gguf) | Q5_1 | 36.57GB |
| [BigQwen2.5-52B.Q6_K.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/tree/main/) | Q6_K | 39.94GB |
| [BigQwen2.5-52B.Q8_0.gguf](https://huggingface.co/RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf/tree/main/) | Q8_0 | 51.73GB |
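A single-file quant from the table above can be fetched with `huggingface_hub` before loading it in a GGUF runtime (a small sketch using the Q4_K_M row; note that the Q6_K and Q8_0 entries are split across multiple files):

```python
from huggingface_hub import hf_hub_download

# Download one quantization file from this repo (filename from the Q4_K_M row above)
gguf_path = hf_hub_download(
    repo_id="RichardErkhov/allknowingroger_-_BigQwen2.5-52B-gguf",
    filename="BigQwen2.5-52B.Q4_K_M.gguf",
)
print(gguf_path)
```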
Original model description:
---
base_model:
- mlabonne/BigQwen2.5-52B-Instruct
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged
The following models were included in the merge:
* [mlabonne/BigQwen2.5-52B-Instruct](https://huggingface.co/mlabonne/BigQwen2.5-52B-Instruct)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
- layer_range: [0, 16]
model: mlabonne/BigQwen2.5-52B-Instruct
- sources:
- layer_range: [8, 24]
model: mlabonne/BigQwen2.5-52B-Instruct
- sources:
- layer_range: [16, 32]
model: mlabonne/BigQwen2.5-52B-Instruct
- sources:
- layer_range: [24, 40]
model: mlabonne/BigQwen2.5-52B-Instruct
- sources:
- layer_range: [32, 48]
model: mlabonne/BigQwen2.5-52B-Instruct
- sources:
- layer_range: [40, 56]
model: mlabonne/BigQwen2.5-52B-Instruct
- sources:
- layer_range: [56, 64]
model: mlabonne/BigQwen2.5-52B-Instruct
merge_method: passthrough
dtype: bfloat16
```
|
RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf | RichardErkhov | 2024-11-13T06:03:04Z | 2,743 | 2 | null | [
"gguf",
"endpoints_compatible",
"region:us"
] | null | 2024-11-13T01:41:08Z | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
Ninja-v1-NSFW-128k - GGUF
- Model creator: https://huggingface.co/Local-Novel-LLM-project/
- Original model: https://huggingface.co/Local-Novel-LLM-project/Ninja-v1-NSFW-128k/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Ninja-v1-NSFW-128k.Q2_K.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q2_K.gguf) | Q2_K | 2.53GB |
| [Ninja-v1-NSFW-128k.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q3_K_S.gguf) | Q3_K_S | 2.95GB |
| [Ninja-v1-NSFW-128k.Q3_K.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q3_K.gguf) | Q3_K | 3.28GB |
| [Ninja-v1-NSFW-128k.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q3_K_M.gguf) | Q3_K_M | 3.28GB |
| [Ninja-v1-NSFW-128k.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q3_K_L.gguf) | Q3_K_L | 3.56GB |
| [Ninja-v1-NSFW-128k.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.IQ4_XS.gguf) | IQ4_XS | 3.67GB |
| [Ninja-v1-NSFW-128k.Q4_0.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q4_0.gguf) | Q4_0 | 3.83GB |
| [Ninja-v1-NSFW-128k.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.IQ4_NL.gguf) | IQ4_NL | 3.87GB |
| [Ninja-v1-NSFW-128k.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q4_K_S.gguf) | Q4_K_S | 3.86GB |
| [Ninja-v1-NSFW-128k.Q4_K.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q4_K.gguf) | Q4_K | 4.07GB |
| [Ninja-v1-NSFW-128k.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q4_K_M.gguf) | Q4_K_M | 4.07GB |
| [Ninja-v1-NSFW-128k.Q4_1.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q4_1.gguf) | Q4_1 | 4.24GB |
| [Ninja-v1-NSFW-128k.Q5_0.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q5_0.gguf) | Q5_0 | 4.65GB |
| [Ninja-v1-NSFW-128k.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q5_K_S.gguf) | Q5_K_S | 4.65GB |
| [Ninja-v1-NSFW-128k.Q5_K.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q5_K.gguf) | Q5_K | 4.78GB |
| [Ninja-v1-NSFW-128k.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q5_K_M.gguf) | Q5_K_M | 4.78GB |
| [Ninja-v1-NSFW-128k.Q5_1.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q5_1.gguf) | Q5_1 | 5.07GB |
| [Ninja-v1-NSFW-128k.Q6_K.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q6_K.gguf) | Q6_K | 5.53GB |
| [Ninja-v1-NSFW-128k.Q8_0.gguf](https://huggingface.co/RichardErkhov/Local-Novel-LLM-project_-_Ninja-v1-NSFW-128k-gguf/blob/main/Ninja-v1-NSFW-128k.Q8_0.gguf) | Q8_0 | 7.17GB |
Original model description:
---
license: apache-2.0
language:
- en
- ja
tags:
- finetuned
- not-for-all-audiences
library_name: transformers
pipeline_tag: text-generation
---
<img src="./ninjalogo.svg" width="100%" height="20%" alt="">
# Our Models
- [Vecteus](https://huggingface.co/Local-Novel-LLM-project/Vecteus-v1)
- [Ninja-v1](https://huggingface.co/Local-Novel-LLM-project/Ninja-v1)
- [Ninja-v1-NSFW](https://huggingface.co/Local-Novel-LLM-project/Ninja-v1-NSFW)
- [Ninja-v1-128k](https://huggingface.co/Local-Novel-LLM-project/Ninja-v1-128k)
- [Ninja-v1-NSFW-128k](https://huggingface.co/Local-Novel-LLM-project/Ninja-v1-NSFW-128k)
## Model Card for Ninja-v1-NSFW-128k
This Mistral-7B-based Large Language Model (LLM) is a version of Mistral-7B-v0.1 fine-tuned on a novel (fiction) dataset.
Ninja-NSFW-128k has the following changes compared to Mistral-7B-v0.1:
- 128k context window (8k context in v0.1)
- High-quality generation in both Japanese and English
- Retains earlier context even after long-context generation
- Can generate NSFW content
This model was created with the help of GPUs provided through the first LocalAI hackathon.
We would like to take this opportunity to thank them.
## List of Creation Methods
- Chat Vector applied across multiple models
- Simple linear merging of the resulting models
- Domain and sentence enhancement with LoRA
- Context expansion
## Instruction format
Ninja adopts the prompt format from Vicuna and supports multi-turn conversation.
The prompt should be formatted as follows:
```
USER: Hi ASSISTANT: Hello.</s>
USER: Who are you?
ASSISTANT: I am ninja.</s>
```
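One way to assemble this format programmatically (a rough helper for illustration, not an official template):

```python
def build_vicuna_prompt(turns):
    # turns: list of (user, assistant) pairs; use None for the reply
    # the model should generate in the final turn.
    parts = []
    for user, assistant in turns:
        parts.append(f"USER: {user} ASSISTANT:")
        if assistant is not None:
            parts.append(f" {assistant}</s>\n")
    return "".join(parts)

print(build_vicuna_prompt([("Hi", "Hello."), ("Who are you?", None)]))
# USER: Hi ASSISTANT: Hello.</s>
# USER: Who are you? ASSISTANT:
```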
## Example prompt improvements (Japanese)
- BAD: あなたは○○として振る舞います ("You will act as ○○")
- GOOD: あなたは○○です ("You are ○○")
- BAD: あなたは○○ができます ("You are able to do ○○")
- GOOD: あなたは○○をします ("You do ○○")
## Performing inference
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
model_id = "Local-Novel-LLM-project/Ninja-v1-NSFW-128k"
new_tokens = 1024
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True, torch_dtype=torch.float16, attn_implementation="flash_attention_2", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)
system_prompt = "あなたはプロの小説家です。\n小説を書いてください\n-------- "
prompt = input("Enter a prompt: ")
system_prompt += prompt + "\n-------- "
model_inputs = tokenizer([system_prompt], return_tensors="pt").to("cuda")
generated_ids = model.generate(**model_inputs, max_new_tokens=new_tokens, do_sample=True)
print(tokenizer.batch_decode(generated_ids)[0])
```
## Merge recipe
- WizardLM2 - mistralai/Mistral-7B-v0.1
- NousResearch/Yarn-Mistral-7b-128k - mistralai/Mistral-7B-v0.1
- Elizezen/Antler-7B - stabilityai/japanese-stablelm-instruct-gamma-7b
- Elizezen/LewdSniffyOtter-7B - Elizezen/SniffyOtter-7B
- NTQAI/chatntq-ja-7b-v1.0
The characteristics of each model are as follows.
- WizardLM2: High quality multitasking model
- Yarn-Mistral-7b-128k: Mistral model with 128k context window
- Antler-7B: Model specialized for novel writing
- NTQAI/chatntq-ja-7b-v1.0: High quality Japanese specialized model
- Elizezen/LewdSniffyOtter-7B: Japanese NSFW specialized model
## Other points to keep in mind
- The training data may be biased. Be careful with the generated sentences.
- Set trust_remote_code to True for context expansion with YaRN.
- Memory usage may be high for long-context inference.
- If possible, we recommend running inference with llama.cpp rather than Transformers (a minimal sketch follows).
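A minimal llama-cpp-python sketch for one of the quantized files listed above (assumes the `llama-cpp-python` package is installed and the Q4_K_M GGUF has been downloaded locally; the context size is kept small here to limit memory use):

```python
from llama_cpp import Llama

llm = Llama(model_path="Ninja-v1-NSFW-128k.Q4_K_M.gguf", n_ctx=8192)

prompt = "USER: Who are you? ASSISTANT:"
out = llm(prompt, max_tokens=256, stop=["</s>"])
print(out["choices"][0]["text"])
```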
|
Deev124/hermes-llama3-roleplay-800-v1 | Deev124 | 2024-11-13T06:01:27Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T05:57:41Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF | featherless-ai-quants | 2024-11-13T05:57:30Z | 16 | 0 | null | [
"gguf",
"text-generation",
"base_model:migtissera/Llama-3-70B-Synthia-v3.5",
"base_model:quantized:migtissera/Llama-3-70B-Synthia-v3.5",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-13T02:47:38Z | ---
base_model: migtissera/Llama-3-70B-Synthia-v3.5
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# migtissera/Llama-3-70B-Synthia-v3.5 GGUF Quantizations 🚀

*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [migtissera-Llama-3-70B-Synthia-v3.5-IQ4_XS](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-IQ4_XS) | 36496.80 MB (folder) |
| Q2_K | [migtissera-Llama-3-70B-Synthia-v3.5-Q2_K](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q2_K) | 25153.26 MB (folder) |
| Q3_K_L | [migtissera-Llama-3-70B-Synthia-v3.5-Q3_K_L](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q3_K_L) | 35420.03 MB (folder) |
| Q3_K_M | [migtissera-Llama-3-70B-Synthia-v3.5-Q3_K_M](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q3_K_M) | 32680.03 MB (folder) |
| Q3_K_S | [migtissera-Llama-3-70B-Synthia-v3.5-Q3_K_S](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q3_K_S) | 29480.03 MB (folder) |
| Q4_K_M | [migtissera-Llama-3-70B-Synthia-v3.5-Q4_K_M](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q4_K_M) | 40550.61 MB (folder) |
| Q4_K_S | [migtissera-Llama-3-70B-Synthia-v3.5-Q4_K_S](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q4_K_S) | 38478.11 MB (folder) |
| Q5_K_M | [migtissera-Llama-3-70B-Synthia-v3.5-Q5_K_M](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q5_K_M) | 47635.86 MB (folder) |
| Q5_K_S | [migtissera-Llama-3-70B-Synthia-v3.5-Q5_K_S](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q5_K_S) | 46403.36 MB (folder) |
| Q6_K | [migtissera-Llama-3-70B-Synthia-v3.5-Q6_K](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q6_K) | 55206.44 MB (folder) |
| Q8_0 | [migtissera-Llama-3-70B-Synthia-v3.5-Q8_0](https://huggingface.co/featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF/tree/main/migtissera-Llama-3-70B-Synthia-v3.5-Q8_0) | 71501.78 MB (folder) |
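Because each quantization above is stored as a folder of split GGUF parts, a single variant can be fetched with `huggingface_hub` (a sketch; the pattern is inferred from the folder name in the table):

```python
from huggingface_hub import snapshot_download

# Download only the Q4_K_M split files from this repo
local_dir = snapshot_download(
    repo_id="featherless-ai-quants/migtissera-Llama-3-70B-Synthia-v3.5-GGUF",
    allow_patterns=["migtissera-Llama-3-70B-Synthia-v3.5-Q4_K_M/*"],
)
print(local_dir)
```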
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
Tzktz/tamizh-fluxv2 | Tzktz | 2024-11-13T05:57:26Z | 8 | 0 | diffusers | [
"diffusers",
"flux",
"lora",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-08-24T09:52:24Z | ---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
instance_prompt: TAMIZH
---
# Tamizh Fluxv2
Trained on an RTX 4090 (24 GB) with LoRA rank 32.
## Trigger words
You should use `TAMIZH` to trigger the image generation.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('Tzktz/tamizh-fluxv2', weight_name='lora.safetensors')
image = pipeline('TAMIZH, your prompt').images[0]  # include the trigger word TAMIZH
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
|
mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF | mradermacher | 2024-11-13T05:49:11Z | 177 | 1 | transformers | [
"transformers",
"gguf",
"en",
"dataset:nbeerbower/Arkhaios-DPO",
"dataset:nbeerbower/Purpura-DPO",
"base_model:nbeerbower/Mistral-Nemo-Prism-12B",
"base_model:quantized:nbeerbower/Mistral-Nemo-Prism-12B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-13T03:57:49Z | ---
base_model: nbeerbower/Mistral-Nemo-Prism-12B
datasets:
- nbeerbower/Arkhaios-DPO
- nbeerbower/Purpura-DPO
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ1_S.gguf) | i1-IQ1_S | 3.1 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ1_M.gguf) | i1-IQ1_M | 3.3 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ2_S.gguf) | i1-IQ2_S | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ2_M.gguf) | i1-IQ2_M | 4.5 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q2_K.gguf) | i1-Q2_K | 4.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 5.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 5.6 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ3_S.gguf) | i1-IQ3_S | 5.7 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ3_M.gguf) | i1-IQ3_M | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 6.2 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 6.7 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 6.8 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 7.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 7.2 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 7.2 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q4_0.gguf) | i1-Q4_0 | 7.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 7.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 7.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 8.6 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 8.8 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B.i1-Q6_K.gguf) | i1-Q6_K | 10.2 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
mradermacher/Mistral-Nemo-Prism-12B-GGUF | mradermacher | 2024-11-13T05:49:11Z | 84 | 1 | transformers | [
"transformers",
"gguf",
"en",
"dataset:nbeerbower/Arkhaios-DPO",
"dataset:nbeerbower/Purpura-DPO",
"base_model:nbeerbower/Mistral-Nemo-Prism-12B",
"base_model:quantized:nbeerbower/Mistral-Nemo-Prism-12B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-13T03:17:49Z | ---
base_model: nbeerbower/Mistral-Nemo-Prism-12B
datasets:
- nbeerbower/Arkhaios-DPO
- nbeerbower/Purpura-DPO
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q2_K.gguf) | Q2_K | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q3_K_S.gguf) | Q3_K_S | 5.6 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q3_K_M.gguf) | Q3_K_M | 6.2 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q3_K_L.gguf) | Q3_K_L | 6.7 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.IQ4_XS.gguf) | IQ4_XS | 6.9 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q4_0_4_4.gguf) | Q4_0_4_4 | 7.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q4_K_S.gguf) | Q4_K_S | 7.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q4_K_M.gguf) | Q4_K_M | 7.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q5_K_S.gguf) | Q5_K_S | 8.6 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q5_K_M.gguf) | Q5_K_M | 8.8 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q6_K.gguf) | Q6_K | 10.2 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-GGUF/resolve/main/Mistral-Nemo-Prism-12B.Q8_0.gguf) | Q8_0 | 13.1 | fast, best quality |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
smjain/architect-diag-lora | smjain | 2024-11-13T05:46:05Z | 5 | 4 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"unsloth",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T05:45:51Z | ---
library_name: transformers
tags:
- unsloth
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
linkmarine007/nootdjt-v2 | linkmarine007 | 2024-11-13T05:44:05Z | 18 | 0 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-12T21:38:35Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: $noot, $noot_claymation, $presidentdjt
license: other
license_name: black-forest-labs-non-commercial-license
license_link: >-
https://github.com/black-forest-labs/flux/blob/main/model_licenses/LICENSE-FLUX1-dev
---
# nootdjt-v2
<Gallery />
## Model description
## Trigger words
## Download model
Weights for this model are available in Safetensors format.
[Download](/linkmarine007/nootdjt-v2/tree/main) them in the Files & versions tab.
|
featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF | featherless-ai-quants | 2024-11-13T05:42:48Z | 9 | 0 | null | [
"gguf",
"text-generation",
"base_model:failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5",
"base_model:quantized:failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-03T10:59:56Z | ---
base_model: failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5 GGUF Quantizations 🚀

*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-IQ4_XS](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-IQ4_XS) | 36496.80 MB (folder) |
| Q2_K | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q2_K](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q2_K) | 25153.26 MB (folder) |
| Q3_K_L | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q3_K_L](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q3_K_L) | 35420.03 MB (folder) |
| Q3_K_M | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q3_K_M](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q3_K_M) | 32680.03 MB (folder) |
| Q3_K_S | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q3_K_S](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q3_K_S) | 29480.03 MB (folder) |
| Q4_K_M | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q4_K_M](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q4_K_M) | 40550.61 MB (folder) |
| Q4_K_S | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q4_K_S](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q4_K_S) | 38478.11 MB (folder) |
| Q5_K_M | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q5_K_M](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q5_K_M) | 47635.86 MB (folder) |
| Q5_K_S | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q5_K_S](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q5_K_S) | 46403.36 MB (folder) |
| Q6_K | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q6_K](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q6_K) | 55206.44 MB (folder) |
| Q8_0 | [failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q8_0](https://huggingface.co/featherless-ai-quants/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-GGUF/tree/main/failspy-Meta-Llama-3-70B-Instruct-abliterated-v3.5-Q8_0) | 71501.78 MB (folder) |
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF | mradermacher | 2024-11-13T05:33:58Z | 169 | 1 | transformers | [
"transformers",
"gguf",
"en",
"dataset:nbeerbower/Arkhaios-DPO",
"dataset:nbeerbower/Purpura-DPO",
"base_model:nbeerbower/Mistral-Nemo-Prism-12B-v2",
"base_model:quantized:nbeerbower/Mistral-Nemo-Prism-12B-v2",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2024-11-13T03:40:30Z | ---
base_model: nbeerbower/Mistral-Nemo-Prism-12B-v2
datasets:
- nbeerbower/Arkhaios-DPO
- nbeerbower/Purpura-DPO
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B-v2
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ1_S.gguf) | i1-IQ1_S | 3.1 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ1_M.gguf) | i1-IQ1_M | 3.3 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ2_XS.gguf) | i1-IQ2_XS | 4.0 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ2_S.gguf) | i1-IQ2_S | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ2_M.gguf) | i1-IQ2_M | 4.5 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q2_K.gguf) | i1-Q2_K | 4.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 5.0 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ3_XS.gguf) | i1-IQ3_XS | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q3_K_S.gguf) | i1-Q3_K_S | 5.6 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ3_S.gguf) | i1-IQ3_S | 5.7 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ3_M.gguf) | i1-IQ3_M | 5.8 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q3_K_M.gguf) | i1-Q3_K_M | 6.2 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q3_K_L.gguf) | i1-Q3_K_L | 6.7 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-IQ4_XS.gguf) | i1-IQ4_XS | 6.8 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q4_0_4_4.gguf) | i1-Q4_0_4_4 | 7.2 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q4_0_4_8.gguf) | i1-Q4_0_4_8 | 7.2 | fast on arm+i8mm, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q4_0_8_8.gguf) | i1-Q4_0_8_8 | 7.2 | fast on arm+sve, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q4_0.gguf) | i1-Q4_0 | 7.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q4_K_S.gguf) | i1-Q4_K_S | 7.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q4_K_M.gguf) | i1-Q4_K_M | 7.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q5_K_S.gguf) | i1-Q5_K_S | 8.6 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q5_K_M.gguf) | i1-Q5_K_M | 8.8 | |
| [GGUF](https://huggingface.co/mradermacher/Mistral-Nemo-Prism-12B-v2-i1-GGUF/resolve/main/Mistral-Nemo-Prism-12B-v2.i1-Q6_K.gguf) | i1-Q6_K | 10.2 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
prithivMLmods/Teen-Outfit | prithivMLmods | 2024-11-13T05:33:38Z | 34 | 18 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"Teen",
"Outfit",
"Trendy-Dress",
"Flux.1-Dev",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:creativeml-openrail-m",
"region:us"
] | text-to-image | 2024-11-11T13:29:16Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
- Teen
- Outfit
- Trendy-Dress
- Flux.1-Dev
widget:
- text: >-
Teen Outfit, a teal and white jacket is adorned with a white hoodie. The
jacket has a white stripe down the left side of the jacket, and two white
stripes on the right and left side. On the left sleeve, there is a text
written in a cursive font that is in black ink that reads "X" and "X". The
jacket is positioned against a stark white backdrop, creating a striking
contrast.
output:
url: images/TO1.png
- text: >-
Teen Outfit, a jacket is hung on a hanger against a stark white backdrop.
The jacket is composed of a light green and gray striped jacket, with two
orange pockets on either side of the jacket. The pocket on the left side has
a white tag with black text on it. The tag also has a black text that reads
"N" and "M". The jacket has a zipper on the right side.
output:
url: images/TO2.png
- text: >-
Teen Outfit, a light brown and white varsity jacket is hung on a black wall.
The jacket is adorned with a white collar, white stripes and the words "FORE
CAST" in bold white letters. A keychain is hanging from the collar, adding a
touch of charm to the jacket.
output:
url: images/TO3.png
- text: >-
Teen Outfit: A cropped denim jacket with frayed edges, hanging from a black
metal hanger against a white brick wall. The jacket is a light wash with a
faded floral pattern on the back and a small silver star pin on the right
lapel. The left sleeve has a hand-painted design in pastel colors that says
WILD in playful, graffiti-style text.
output:
url: images/TO4.png
- text: >-
Teen Outfit: An olive-green military-inspired jacket with multiple front
pockets, displayed on a mannequin against a sandy beige background. The
jacket has epaulets on the shoulders, a drawstring waist, and a subtle patch
that reads "BRV" on the right arm in dark green. The left pocket has a small
silver pin in the shape of a lightning bolt.
output:
url: images/TO5.png
- text: >-
Teen Outfit: A pastel pink raincoat with a clear plastic overlay, hanging on
a bamboo hanger against a navy blue background. The raincoat has snap
closures down the front and two side pockets with clear flaps. On the upper
back, there’s a delicate white floral design in an outlined style, adding an
artistic touch.
output:
url: images/TO6.png
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: Teen Outfit
license: creativeml-openrail-m
---
# Teen-Outfit
<Gallery />
- Hosted Here🧨: https://huggingface.co/spaces/prithivMLmods/FLUX-LoRA-DLC
**The model is still in the training phase. This is not the final version and may contain artifacts and perform poorly in some cases.**
## Model description
**prithivMLmods/Teen-Outfit**
Image Processing Parameters
| Parameter | Value | Parameter | Value |
|---------------------------|--------|---------------------------|--------|
| LR Scheduler | constant | Noise Offset | 0.03 |
| Optimizer | AdamW | Multires Noise Discount | 0.1 |
| Network Dim | 64 | Multires Noise Iterations | 10 |
| Network Alpha | 32 | Repeat & Steps | 25 & 2990|
| Epoch | 16 | Save Every N Epochs | 1 |
Labeling: florence2-en (natural language & English)
Total Images Used for Training: 17 [Hi-RES]
## Best Dimensions
- 1024 x 1024 (Default)
## Setting Up
```
import torch
from diffusers import DiffusionPipeline
base_model = "black-forest-labs/FLUX.1-dev"
pipe = DiffusionPipeline.from_pretrained(base_model, torch_dtype=torch.bfloat16)
lora_repo = "prithivMLmods/Teen-Outfit"
trigger_word = "Teen Outfit"
pipe.load_lora_weights(lora_repo)
device = torch.device("cuda")
pipe.to(device)
```
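Continuing from the `pipe` object above, a prompt with the trigger word can then be run directly (sampler settings here are illustrative, not tuned values from this card):

```python
prompt = ("Teen Outfit, a teal and white jacket adorned with a white hoodie, "
          "positioned against a stark white backdrop")
image = pipe(prompt, height=1024, width=1024,
             guidance_scale=3.5, num_inference_steps=28).images[0]
image.save("teen_outfit.png")
```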
## Trigger words
You should use `Teen Outfit` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/prithivMLmods/Teen-Outfit/tree/main) them in the Files & versions tab.
.
.
.@prithivMLmods |
mnm373/RevLlama-Qwen2.5-Coder-1.5B-40000samples_gguf | mnm373 | 2024-11-13T05:21:54Z | 11 | 0 | transformers | [
"transformers",
"gguf",
"qwen2",
"text-generation-inference",
"unsloth",
"en",
"base_model:unsloth/Qwen2.5-Coder-1.5B-Instruct",
"base_model:quantized:unsloth/Qwen2.5-Coder-1.5B-Instruct",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-13T05:20:49Z | ---
base_model: unsloth/Qwen2.5-Coder-1.5B-Instruct
tags:
- text-generation-inference
- transformers
- unsloth
- qwen2
- gguf
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** mnm373
- **License:** apache-2.0
- **Finetuned from model :** unsloth/Qwen2.5-Coder-1.5B-Instruct
This qwen2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q0f16-MLC | mlc-ai | 2024-11-13T05:16:35Z | 23 | 0 | mlc-llm | [
"mlc-llm",
"web-llm",
"base_model:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"region:us"
] | null | 2024-11-13T01:58:14Z | ---
library_name: mlc-llm
base_model: Qwen/Qwen2.5-Coder-0.5B-Instruct
tags:
- mlc-llm
- web-llm
---
# Qwen2.5-Coder-0.5B-Instruct-q0f16-MLC
This is the [Qwen2.5-Coder-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-0.5B-Instruct) model in MLC format `q0f16`.
The model can be used for projects [MLC-LLM](https://github.com/mlc-ai/mlc-llm) and [WebLLM](https://github.com/mlc-ai/web-llm).
## Example Usage
Here are some examples of using this model in MLC LLM.
Before running the examples, please install MLC LLM by following the [installation documentation](https://llm.mlc.ai/docs/install/mlc_llm.html#install-mlc-packages).
### Chat
In command line, run
```bash
mlc_llm chat HF://mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q0f16-MLC
```
### REST Server
In command line, run
```bash
mlc_llm serve HF://mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q0f16-MLC
```
### Python API
```python
from mlc_llm import MLCEngine
# Create engine
model = "HF://mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q0f16-MLC"
engine = MLCEngine(model)
# Run chat completion in OpenAI API.
for response in engine.chat.completions.create(
messages=[{"role": "user", "content": "What is the meaning of life?"}],
model=model,
stream=True,
):
for choice in response.choices:
print(choice.delta.content, end="", flush=True)
print("\n")
engine.terminate()
```
## Documentation
For more information on MLC LLM project, please visit our [documentation](https://llm.mlc.ai/docs/) and [GitHub repo](http://github.com/mlc-ai/mlc-llm).
|
mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q0f32-MLC | mlc-ai | 2024-11-13T05:16:35Z | 8 | 0 | mlc-llm | [
"mlc-llm",
"web-llm",
"base_model:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"region:us"
] | null | 2024-11-13T01:58:53Z | ---
library_name: mlc-llm
base_model: Qwen/Qwen2.5-Coder-0.5B-Instruct
tags:
- mlc-llm
- web-llm
---
# Qwen2.5-Coder-0.5B-Instruct-q0f32-MLC
This is the [Qwen2.5-Coder-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-0.5B-Instruct) model in MLC format `q0f32`.
The model can be used for projects [MLC-LLM](https://github.com/mlc-ai/mlc-llm) and [WebLLM](https://github.com/mlc-ai/web-llm).
## Example Usage
Here are some examples of using this model in MLC LLM.
Before running the examples, please install MLC LLM by following the [installation documentation](https://llm.mlc.ai/docs/install/mlc_llm.html#install-mlc-packages).
### Chat
In command line, run
```bash
mlc_llm chat HF://mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q0f32-MLC
```
### REST Server
In command line, run
```bash
mlc_llm serve HF://mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q0f32-MLC
```
### Python API
```python
from mlc_llm import MLCEngine
# Create engine
model = "HF://mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q0f32-MLC"
engine = MLCEngine(model)
# Run chat completion in OpenAI API.
for response in engine.chat.completions.create(
messages=[{"role": "user", "content": "What is the meaning of life?"}],
model=model,
stream=True,
):
for choice in response.choices:
print(choice.delta.content, end="", flush=True)
print("\n")
engine.terminate()
```
## Documentation
For more information on MLC LLM project, please visit our [documentation](https://llm.mlc.ai/docs/) and [GitHub repo](http://github.com/mlc-ai/mlc-llm).
|
mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q4f16_1-MLC | mlc-ai | 2024-11-13T05:16:34Z | 71 | 0 | mlc-llm | [
"mlc-llm",
"web-llm",
"base_model:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"base_model:quantized:Qwen/Qwen2.5-Coder-0.5B-Instruct",
"region:us"
] | null | 2024-11-13T01:56:59Z | ---
library_name: mlc-llm
base_model: Qwen/Qwen2.5-Coder-0.5B-Instruct
tags:
- mlc-llm
- web-llm
---
# Qwen2.5-Coder-0.5B-Instruct-q4f16_1-MLC
This is the [Qwen2.5-Coder-0.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-0.5B-Instruct) model in MLC format `q4f16_1`.
The model can be used with the [MLC-LLM](https://github.com/mlc-ai/mlc-llm) and [WebLLM](https://github.com/mlc-ai/web-llm) projects.
## Example Usage
Here are some examples of using this model in MLC LLM.
Before running the examples, please install MLC LLM by following the [installation documentation](https://llm.mlc.ai/docs/install/mlc_llm.html#install-mlc-packages).
### Chat
On the command line, run
```bash
mlc_llm chat HF://mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q4f16_1-MLC
```
### REST Server
On the command line, run
```bash
mlc_llm serve HF://mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q4f16_1-MLC
```
### Python API
```python
from mlc_llm import MLCEngine
# Create engine
model = "HF://mlc-ai/Qwen2.5-Coder-0.5B-Instruct-q4f16_1-MLC"
engine = MLCEngine(model)
# Run chat completion using the OpenAI-compatible API.
for response in engine.chat.completions.create(
messages=[{"role": "user", "content": "What is the meaning of life?"}],
model=model,
stream=True,
):
for choice in response.choices:
print(choice.delta.content, end="", flush=True)
print("\n")
engine.terminate()
```
## Documentation
For more information on the MLC LLM project, please visit our [documentation](https://llm.mlc.ai/docs/) and [GitHub repo](http://github.com/mlc-ai/mlc-llm).
|
akibc123/llava_pruned_6.2B | akibc123 | 2024-11-13T05:08:08Z | 15 | 0 | transformers | [
"transformers",
"safetensors",
"llava",
"image-text-to-text",
"conversational",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | image-text-to-text | 2024-11-13T05:03:35Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
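In the absence of an official snippet, a generic sketch based only on this repository's tags (`llava`, `image-text-to-text`) might look like the following; the processor/model classes, prompt template, and image path are assumptions.
```python
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "akibc123/llava_pruned_6.2B"
processor = AutoProcessor.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(model_id, device_map="auto")

# Hypothetical input: the image path and prompt format are placeholders.
image = Image.open("example.jpg")
prompt = "USER: <image>\nDescribe this image. ASSISTANT:"
inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(processor.decode(output[0], skip_special_tokens=True))
```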
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lancelot53/Llama-3.1-Nemotron-70B-Reward-HF-4bit | Lancelot53 | 2024-11-13T05:06:29Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"feature-extraction",
"arxiv:1910.09700",
"text-generation-inference",
"endpoints_compatible",
"4-bit",
"bitsandbytes",
"region:us"
] | feature-extraction | 2024-11-13T04:57:34Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
personal1802/ntrMIXIllustriousXL_v35 | personal1802 | 2024-11-13T04:57:37Z | 5 | 0 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"base_model:Raelina/Raehoshi-illust-XL",
"base_model:adapter:Raelina/Raehoshi-illust-XL",
"region:us"
] | text-to-image | 2024-11-13T04:37:58Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
widget:
- text: '-'
output:
url: images/WHITE.png
base_model: Raelina/Raehoshi-illust-XL
instance_prompt: null
---
# ntrMIXIllustriousXL_v35
<Gallery />
## Download model
Weights for this model are available in Safetensors format.
[Download](/personal1802/ntrMIXIllustriousXL_v35/tree/main) them in the Files & versions tab.
|
asr-africa/bambara-mms-5-hours-mixed-asr-hf | asr-africa | 2024-11-13T04:48:29Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:facebook/mms-1b-all",
"base_model:finetune:facebook/mms-1b-all",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2024-11-12T22:27:30Z | ---
library_name: transformers
license: cc-by-nc-4.0
base_model: facebook/mms-1b-all
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: bambara-mms-5-hours-mixed-asr-hf
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/asr-africa-research-team/ASR%20Africa/runs/oopohzvw)
# bambara-mms-5-hours-mixed-asr-hf
This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2515
- Wer: 0.5422
- Cer: 0.2520
## Model description
More information needed
## Intended uses & limitations
More information needed
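In the meantime, the checkpoint can be exercised with the standard 🤗 ASR pipeline. This is a minimal sketch; the audio file name is a placeholder and 16 kHz mono input is assumed.
```python
from transformers import pipeline

# Hypothetical example: transcribe a local Bambara clip (16 kHz WAV assumed).
asr = pipeline(
    "automatic-speech-recognition",
    model="asr-africa/bambara-mms-5-hours-mixed-asr-hf",
)
print(asr("sample_bambara.wav")["text"])
```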
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 1.9023 | 1.6750 | 500 | 1.3385 | 0.8473 | 0.3992 |
| 1.404 | 3.3501 | 1000 | 1.3485 | 0.7641 | 0.3617 |
| 1.2515 | 5.0251 | 1500 | 1.2069 | 0.7155 | 0.3408 |
| 1.1487 | 6.7002 | 2000 | 1.1615 | 0.7022 | 0.3248 |
| 1.0495 | 8.3752 | 2500 | 1.1723 | 0.6488 | 0.3015 |
| 0.9528 | 10.0503 | 3000 | 1.2085 | 0.6272 | 0.2902 |
| 0.8446 | 11.7253 | 3500 | 1.1891 | 0.6240 | 0.2888 |
| 0.7752 | 13.4003 | 4000 | 1.3426 | 0.6039 | 0.2792 |
| 0.7026 | 15.0754 | 4500 | 1.3062 | 0.5994 | 0.2773 |
| 0.6321 | 16.7504 | 5000 | 1.3431 | 0.5861 | 0.2724 |
| 0.5651 | 18.4255 | 5500 | 1.3799 | 0.5908 | 0.2733 |
| 0.5106 | 20.1005 | 6000 | 1.4988 | 0.5843 | 0.2722 |
| 0.4645 | 21.7755 | 6500 | 1.5005 | 0.5898 | 0.2738 |
| 0.4139 | 23.4506 | 7000 | 1.5369 | 0.5781 | 0.2705 |
| 0.378 | 25.1256 | 7500 | 1.6367 | 0.5721 | 0.2670 |
| 0.3389 | 26.8007 | 8000 | 1.7074 | 0.5730 | 0.2702 |
| 0.3067 | 28.4757 | 8500 | 1.7832 | 0.5681 | 0.2652 |
| 0.2842 | 30.1508 | 9000 | 1.8441 | 0.5589 | 0.2612 |
| 0.2576 | 31.8258 | 9500 | 1.7606 | 0.5623 | 0.2637 |
| 0.2333 | 33.5008 | 10000 | 1.8455 | 0.5606 | 0.2627 |
| 0.2192 | 35.1759 | 10500 | 1.9153 | 0.5562 | 0.2611 |
| 0.1992 | 36.8509 | 11000 | 1.8999 | 0.5562 | 0.2590 |
| 0.1802 | 38.5260 | 11500 | 2.0523 | 0.5571 | 0.2607 |
| 0.165 | 40.2010 | 12000 | 2.0747 | 0.5509 | 0.2573 |
| 0.1571 | 41.8760 | 12500 | 2.0351 | 0.5508 | 0.2549 |
| 0.1409 | 43.5511 | 13000 | 2.2121 | 0.5483 | 0.2554 |
| 0.131 | 45.2261 | 13500 | 2.1694 | 0.5458 | 0.2544 |
| 0.1223 | 46.9012 | 14000 | 2.2376 | 0.5455 | 0.2531 |
| 0.1177 | 48.5762 | 14500 | 2.2515 | 0.5422 | 0.2520 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.1.0+cu118
- Datasets 2.17.0
- Tokenizers 0.20.3
|
jacobhoffmann/TestGen_v2.1-codegemma-7b-lr0.0001_epochs3 | jacobhoffmann | 2024-11-13T04:39:32Z | 5 | 0 | transformers | [
"transformers",
"safetensors",
"gemma",
"text-generation",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T04:34:31Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
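No snippet is provided; judging by the repository name (a CodeGemma-7B fine-tune for test generation), a generic text-generation sketch might look like this. The prompt and generation settings are illustrative assumptions.
```python
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="jacobhoffmann/TestGen_v2.1-codegemma-7b-lr0.0001_epochs3",
    device_map="auto",
)
prompt = "Write a JUnit test for a method that adds two integers.\n"
print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```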
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF | featherless-ai-quants | 2024-11-13T04:38:16Z | 18 | 0 | null | [
"gguf",
"text-generation",
"base_model:HeshamHaroon/llama-3-instruct-slerp-arabic",
"base_model:quantized:HeshamHaroon/llama-3-instruct-slerp-arabic",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-13T04:22:19Z | ---
base_model: HeshamHaroon/llama-3-instruct-slerp-arabic
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# HeshamHaroon/llama-3-instruct-slerp-arabic GGUF Quantizations 🚀

*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple, small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [HeshamHaroon-llama-3-instruct-slerp-arabic-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-IQ4_XS.gguf) | 4276.62 MB |
| Q2_K | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q2_K.gguf) | 3031.86 MB |
| Q3_K_L | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q3_K_L.gguf) | 4121.74 MB |
| Q3_K_M | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q3_K_M.gguf) | 3832.74 MB |
| Q3_K_S | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q3_K_S.gguf) | 3494.74 MB |
| Q4_K_M | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q4_K_M.gguf) | 4692.78 MB |
| Q4_K_S | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q4_K_S.gguf) | 4475.28 MB |
| Q5_K_M | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q5_K_M.gguf) | 5467.40 MB |
| Q5_K_S | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q5_K_S.gguf) | 5339.90 MB |
| Q6_K | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q6_K.gguf) | 6290.44 MB |
| Q8_0 | [HeshamHaroon-llama-3-instruct-slerp-arabic-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/HeshamHaroon-llama-3-instruct-slerp-arabic-GGUF/blob/main/HeshamHaroon-llama-3-instruct-slerp-arabic-Q8_0.gguf) | 8145.11 MB |
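These files run on any GGUF-compatible runtime. A minimal llama-cpp-python sketch, assuming the Q4_K_M file has already been downloaded locally (the path and prompt are placeholders):
```python
from llama_cpp import Llama

# Path to a locally downloaded quant file (placeholder).
llm = Llama(model_path="HeshamHaroon-llama-3-instruct-slerp-arabic-Q4_K_M.gguf", n_ctx=4096)
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "مرحبا! عرّف نفسك في جملة واحدة."}]
)
print(out["choices"][0]["message"]["content"])
```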
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
ADHIZ/sequential-omni_devi | ADHIZ | 2024-11-13T04:34:22Z | 52 | 0 | transformers | [
"transformers",
"safetensors",
"t5",
"text2text-generation",
"trl",
"sft",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2024-11-13T04:33:16Z | ---
library_name: transformers
tags:
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
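In the absence of an official snippet, a generic sketch based on the repository tags (`t5`, `text2text-generation`) might look like this; the example input is purely illustrative.
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "ADHIZ/sequential-omni_devi"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("Summarize: transformers map input text to output text.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```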
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF | mradermacher | 2024-11-13T04:27:12Z | 7 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:zelk12/MT2-Gen2-IMM-gemma-2-9B",
"base_model:quantized:zelk12/MT2-Gen2-IMM-gemma-2-9B",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-13T03:45:25Z | ---
base_model: zelk12/MT2-Gen2-IMM-gemma-2-9B
language:
- en
library_name: transformers
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
Static quants of https://huggingface.co/zelk12/MT2-Gen2-IMM-gemma-2-9B
<!-- provided-files -->
Weighted/imatrix quants do not appear to be available from me at this time. If they do not show up within a week or so after the static ones, I have probably not planned them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
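For example, a single quant file can be fetched programmatically with `huggingface_hub` (a sketch; pick whichever quant fits your hardware):
```python
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF",
    filename="MT2-Gen2-IMM-gemma-2-9B.Q4_K_M.gguf",
)
print(path)  # local path to hand to your GGUF runtime (e.g. llama.cpp)
```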
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q2_K.gguf) | Q2_K | 3.9 | |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q3_K_S.gguf) | Q3_K_S | 4.4 | |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q3_K_M.gguf) | Q3_K_M | 4.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q3_K_L.gguf) | Q3_K_L | 5.2 | |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.IQ4_XS.gguf) | IQ4_XS | 5.3 | |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q4_0_4_4.gguf) | Q4_0_4_4 | 5.5 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q4_K_S.gguf) | Q4_K_S | 5.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q4_K_M.gguf) | Q4_K_M | 5.9 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q5_K_S.gguf) | Q5_K_S | 6.6 | |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q5_K_M.gguf) | Q5_K_M | 6.7 | |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q6_K.gguf) | Q6_K | 7.7 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.Q8_0.gguf) | Q8_0 | 9.9 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/MT2-Gen2-IMM-gemma-2-9B-GGUF/resolve/main/MT2-Gen2-IMM-gemma-2-9B.f16.gguf) | f16 | 18.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
Darshan03/Edugorilla-benchmark-3B | Darshan03 | 2024-11-13T04:25:44Z | 44 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"8-bit",
"bitsandbytes",
"region:us"
] | text-generation | 2024-11-13T04:23:50Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
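No snippet is provided; based on the repository tags (`qwen2`, 8-bit `bitsandbytes`, conversational), a generic chat sketch might look like this. The prompt, generation settings, and the presence of a chat template are assumptions.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Darshan03/Edugorilla-benchmark-3B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# bitsandbytes must be installed, since the checkpoint appears to be stored in 8-bit.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Give me one practice question on basic algebra."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```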
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
sedrickkeh/checkpoints | sedrickkeh | 2024-11-13T04:19:49Z | 9 | 0 | transformers | [
"transformers",
"safetensors",
"gemma2",
"text-generation",
"llama-factory",
"full",
"trl",
"dpo",
"generated_from_trainer",
"conversational",
"base_model:google/gemma-2-9b-it",
"base_model:finetune:google/gemma-2-9b-it",
"license:gemma",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-10-25T19:14:32Z | ---
library_name: transformers
license: gemma
base_model: google/gemma-2-9b-it
tags:
- llama-factory
- full
- trl
- dpo
- llama-factory
- generated_from_trainer
model-index:
- name: model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model
This model is a fine-tuned version of [google/gemma-2-9b-it](https://huggingface.co/google/gemma-2-9b-it) on the cdc0b2d9-493b-4cb1-87e8-8fb1e3f4b247 dataset.
It achieves the following results on the evaluation set:
- Loss: 3.9434
- Rewards/chosen: -46.0543
- Rewards/rejected: -47.7041
- Rewards/accuracies: 0.6473
- Rewards/margins: 1.6497
- Logps/rejected: -4.7704
- Logps/chosen: -4.6054
- Logits/rejected: 14.6796
- Logits/chosen: 14.4459
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 8e-07
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- distributed_type: multi-GPU
- num_devices: 32
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 3.9873 | 1.0 | 7344 | 3.9434 | -46.0543 | -47.7041 | 0.6473 | 1.6497 | -4.7704 | -4.6054 | 14.6796 | 14.4459 |
### Framework versions
- Transformers 4.45.2
- Pytorch 2.3.0
- Datasets 2.21.0
- Tokenizers 0.20.2
|
kavinduc/devops-mastermind | kavinduc | 2024-11-13T04:17:02Z | 58 | 0 | transformers | [
"transformers",
"pytorch",
"phi",
"text-generation",
"code",
"text-generation-inference",
"dataset:adeeshajayasinghe/devops-guide-demo",
"base_model:microsoft/phi-2",
"base_model:finetune:microsoft/phi-2",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T03:16:19Z | ---
license: mit
datasets:
- adeeshajayasinghe/devops-guide-demo
metrics:
- accuracy
base_model:
- microsoft/phi-2
new_version: microsoft/phi-2
pipeline_tag: text-generation
library_name: transformers
tags:
- code
- text-generation-inference
---
# DevOps Mastermind Model
This repository hosts the **DevOps Mastermind** model, a pre-trained model based on `microsoft/phi-2` with modifications tailored for specialized DevOps knowledge tasks. The model is designed to support various downstream tasks, such as code generation, documentation assistance, and knowledge inference in DevOps domains.
## Model Details
- **Base Model**: `microsoft/phi-2`
- **Purpose**: Enhanced with additional training and modifications for DevOps and software engineering contexts.
- **Files Included**:
- `config.json`: Model configuration.
- `pytorch_model.bin`: The primary model file containing weights.
- `tokenizer.json`: Tokenizer for processing text inputs.
- `added_tokens.json`: Additional tokens specific to DevOps vocabulary.
- `generation_config.json`: Generation configuration for text generation tasks.
- Other auxiliary files required for model usage and compatibility.
## Usage
To load and use this model in your code, run the following commands:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
# Load the model and tokenizer
model_name = "kavinduc/devops-mastermind"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False)
# Example usage
input_text = "Explain how to set up a CI/CD pipeline"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs)
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
```
|
darkBrother/nmsc_classifier | darkBrother | 2024-11-13T04:02:31Z | 19 | 0 | null | [
"tensorboard",
"safetensors",
"bert",
"generated_from_trainer",
"base_model:klue/bert-base",
"base_model:finetune:klue/bert-base",
"license:cc-by-sa-4.0",
"region:us"
] | null | 2024-11-12T10:39:44Z | ---
license: cc-by-sa-4.0
base_model: klue/bert-base
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: nmsc_classifier
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# nmsc_classifier
This model is a fine-tuned version of [klue/bert-base](https://huggingface.co/klue/bert-base) on the Naver Movie Review (NSMC) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3765
- Accuracy: 0.9015
- F1: 0.9030
- Precision: 0.8956
- Recall: 0.9105
- Auroc: 0.9014
## Model description
Performs binary sentiment classification on the Naver movie review (NSMC) dataset using KLUE/BERT.
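A minimal inference sketch with the 🤗 text-classification pipeline (the example sentence is illustrative; label names depend on the saved config):
```python
from transformers import pipeline

classifier = pipeline("text-classification", model="darkBrother/nmsc_classifier")
print(classifier("이 영화 정말 재미있어요!"))  # e.g. [{'label': ..., 'score': ...}]
```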
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.41.2
- Pytorch 2.5.1+cu124
- Datasets 2.20.0
- Tokenizers 0.19.1
|
sudokara/phi-3.5-mini-instruct-int8-decoders | sudokara | 2024-11-13T04:02:03Z | 6 | 0 | transformers | [
"transformers",
"safetensors",
"phi3",
"text-generation",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T03:48:15Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
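No snippet is provided; since the repository carries `phi3` and `custom_code` tags and appears to be an int8 export, a generic loading sketch might look like this. Treat `trust_remote_code` and the prompt as assumptions.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sudokara/phi-3.5-mini-instruct-int8-decoders"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True, device_map="auto")

inputs = tokenizer("Explain int8 quantization in one sentence.", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=64)[0], skip_special_tokens=True))
```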
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
mradermacher/GPT4obased-Math7Bs-GGUF | mradermacher | 2024-11-13T03:52:13Z | 227 | 0 | transformers | [
"transformers",
"gguf",
"en",
"base_model:DopeorNope/GPT4obased-Math7Bs",
"base_model:quantized:DopeorNope/GPT4obased-Math7Bs",
"endpoints_compatible",
"region:us"
] | null | 2024-11-13T02:45:57Z | ---
base_model: DopeorNope/GPT4obased-Math7Bs
language:
- en
library_name: transformers
quantized_by: mradermacher
tags: []
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
Static quants of https://huggingface.co/DopeorNope/GPT4obased-Math7Bs
<!-- provided-files -->
Weighted/imatrix quants do not appear to be available from me at this time. If they do not show up within a week or so after the static ones, I have probably not planned them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q2_K.gguf) | Q2_K | 2.6 | |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q3_K_S.gguf) | Q3_K_S | 3.0 | |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q3_K_M.gguf) | Q3_K_M | 3.4 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q3_K_L.gguf) | Q3_K_L | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.IQ4_XS.gguf) | IQ4_XS | 3.7 | |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q4_0_4_4.gguf) | Q4_0_4_4 | 3.9 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q4_K_S.gguf) | Q4_K_S | 4.0 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q4_K_M.gguf) | Q4_K_M | 4.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q5_K_S.gguf) | Q5_K_S | 4.8 | |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q5_K_M.gguf) | Q5_K_M | 4.9 | |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q6_K.gguf) | Q6_K | 5.6 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.Q8_0.gguf) | Q8_0 | 7.3 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/GPT4obased-Math7Bs-GGUF/resolve/main/GPT4obased-Math7Bs.f16.gguf) | f16 | 13.6 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
Zack1339/GPTNeoX-160M-minipile-2048 | Zack1339 | 2024-11-13T03:50:46Z | 5 | 0 | null | [
"safetensors",
"gpt_neox",
"custom_code",
"license:apache-2.0",
"region:us"
] | null | 2024-11-13T03:48:02Z | ---
license: apache-2.0
---
|
RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf | RichardErkhov | 2024-11-13T03:47:53Z | 11 | 0 | null | [
"gguf",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-12T23:40:37Z | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
AVA-Llama-3-V2 - GGUF
- Model creator: https://huggingface.co/MehdiHosseiniMoghadam/
- Original model: https://huggingface.co/MehdiHosseiniMoghadam/AVA-Llama-3-V2/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [AVA-Llama-3-V2.Q2_K.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q2_K.gguf) | Q2_K | 2.96GB |
| [AVA-Llama-3-V2.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q3_K_S.gguf) | Q3_K_S | 3.41GB |
| [AVA-Llama-3-V2.Q3_K.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q3_K.gguf) | Q3_K | 3.74GB |
| [AVA-Llama-3-V2.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q3_K_M.gguf) | Q3_K_M | 3.74GB |
| [AVA-Llama-3-V2.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q3_K_L.gguf) | Q3_K_L | 4.03GB |
| [AVA-Llama-3-V2.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.IQ4_XS.gguf) | IQ4_XS | 4.18GB |
| [AVA-Llama-3-V2.Q4_0.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q4_0.gguf) | Q4_0 | 4.34GB |
| [AVA-Llama-3-V2.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.IQ4_NL.gguf) | IQ4_NL | 4.38GB |
| [AVA-Llama-3-V2.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q4_K_S.gguf) | Q4_K_S | 4.37GB |
| [AVA-Llama-3-V2.Q4_K.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q4_K.gguf) | Q4_K | 4.58GB |
| [AVA-Llama-3-V2.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q4_K_M.gguf) | Q4_K_M | 4.58GB |
| [AVA-Llama-3-V2.Q4_1.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q4_1.gguf) | Q4_1 | 4.78GB |
| [AVA-Llama-3-V2.Q5_0.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q5_0.gguf) | Q5_0 | 5.21GB |
| [AVA-Llama-3-V2.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q5_K_S.gguf) | Q5_K_S | 5.21GB |
| [AVA-Llama-3-V2.Q5_K.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q5_K.gguf) | Q5_K | 5.34GB |
| [AVA-Llama-3-V2.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q5_K_M.gguf) | Q5_K_M | 5.34GB |
| [AVA-Llama-3-V2.Q5_1.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q5_1.gguf) | Q5_1 | 5.65GB |
| [AVA-Llama-3-V2.Q6_K.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q6_K.gguf) | Q6_K | 6.14GB |
| [AVA-Llama-3-V2.Q8_0.gguf](https://huggingface.co/RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf/blob/main/AVA-Llama-3-V2.Q8_0.gguf) | Q8_0 | 7.95GB |
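For programmatic downloads, a minimal sketch using `huggingface_hub` is shown below (not part of the original release; pick any filename from the table above):

```python
# Sketch: fetch a single quant file from this repo into the local Hugging Face cache.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="RichardErkhov/MehdiHosseiniMoghadam_-_AVA-Llama-3-V2-gguf",
    filename="AVA-Llama-3-V2.Q4_K_M.gguf",
)
print(path)  # local path to the downloaded GGUF file
```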
Original model description:
---
library_name: transformers
tags: []
---
# Model Card for Model ID
# AVA-Llama-3
Fine-Tuned Llama 3 Persian Large Language Model LLM / Persian Llama 3
# AVA-Llama-3 / Persian Llama-3 / Persian Llama 3
<img src="Llama 3.png" height="600" width="940" >
### This Repository Contains Documents for the Fine-Tuned Llama 3 Persian Large Language Model (LLM) Called AVA-Llama-3
(Still in progress)
-------------------------------------------------
### Dataset used:
To Be Done
-------------------------------------------------
### Usage:
All models are hosted on Hugging Face; here is the code for inference:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig
import torch

model_name_or_id = "MehdiHosseiniMoghadam/AVA-Llama-3-V2"

# Load the model in 8-bit to reduce GPU memory usage (requires the bitsandbytes package)
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_id,
    torch_dtype=torch.float16,
    device_map="auto",
    low_cpu_mem_usage=True,
    load_in_8bit=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_id)

prompt = ''  # put your (Persian) prompt here
# Wrap the prompt in the "### Human / ### Assistant" format used by this model
prompt = f"### Human:{prompt}\n### Assistant:"

inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

generation_config = GenerationConfig(
    do_sample=True,
    top_k=1,
    temperature=0.99,
    max_new_tokens=90,
    pad_token_id=tokenizer.eos_token_id,
)

outputs = model.generate(**inputs, generation_config=generation_config)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
You can also use the Colab link in the Jupyter notebook file.
-------------------------------------------------
## QA Comparision to other models:
<img src="AVA2.jpg" height="600" width="940" >
[AVA-Llama-3](https://huggingface.co/MehdiHosseiniMoghadam/AVA-Llama-3-V2) vs [AVA-V4](https://huggingface.co/MehdiHosseiniMoghadam/AVA-Mistral-7B-V4) vs [AVA-V2](https://huggingface.co/MehdiHosseiniMoghadam/AVA-Mistral-7B-V2) vs [Maral](https://huggingface.co/MaralGPT/Maral-7B-alpha-1) vs [Llama](https://huggingface.co/hdeldar/llama-2-7b-persian-text-1k)
### Q1:
```
طبق متن زیر ایلان ماسک در کجا به دنیا امده است؟
ایلان ریوْ ماسک (به انگلیسی: Elon Reeve Musk؛ /ˈiːlɒn/ EE-lon؛ زادهٔ ۲۸ ژوئن ۱۹۷۱)، سرمایهگذار و شخصی نامی در کسبوکار است. او بنیانگذار، مدیرعامل و مهندس ارشدِ فنی اسپیساکس؛ سرمایهگذار فرشته، مدیرعامل، طراح محصولات و مدیرعامل سابق شرکت تسلا؛ بنیانگذار شرکت بورینگ؛ یکی از بنیانگذاران نیورالینک و اوپنایآی و رئیس بنیاد ماسک است. او ثروتمندترین شخص در جهان است؛ دارایی خالص تخمینی او بر اساس شاخص میلیاردرهای بلومبرگ تا نوامبر ۲۰۲۳، ۲۱۹ میلیارد دلار آمریکا و بر اساس فوربز، ۲۴۱ میلیارد دلار است. این دارایی اساساً از سهم مالکیت او در تسلا و اسپیسایکس بهدست آمدهاست.[۵][۶][۷]
ماسک از ارول ماسک و می ماسک در پرتوریا، آفریقای جنوبی به دنیا آمد. او مدت کوتاهی در دانشگاه پرتوریا به تحصیل پرداخت و در ۱۸ سالگی به کانادا رفت که از طریق مادر کاناداییتبارش تابعیت این کشور را به دست آورد.[۸] دو سال بعد به دانشگاه پنسیلوانیا رفت و در آن جا کارشناسی اقتصاد و فیزیک خود را گرفت. او در سال ۱۹۹۵ برای تحصیل در دانشگاه استنفورد به کالیفرنیا نقل مکان کرد اما بعد از دو روز ترک تحصیل کرد و بههمراه برادرش کیمبال، شرکت نرمافزاری راهنمای سفر آنلاین زیپ۲ را تأسیس کرد. این شرکت نوپا در سال ۱۹۹۹ بهمبلغ ۳۰۷ میلیون دلار توسط کامپک خریداری شد و با ۱۲ میلیون دلاری که به دست آورد، ماسک همان سال مشترکاً بانکداری مستقیم X.com را تأسیس کرد که در سال ۲۰۰۰ با کنفینیتی ادغام شد و باعث تشکیل پیپل شد.
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td> ایلان ریو ماسک در پرتوریا، آفریقای جنوبی به دنیا امده است </td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> ایلان ریو ماسک در پرتوریا، آفریقای جنوبی به دنیا امده است. او در 18 سالگی به کانادا رفت </td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> ایلان ماسک در پرتوریا، آفریقای جنوبی به دنیا آمده است. او در 18 سالگی به کانادا رفت </td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> ایلان ماسک در پرتوریا، آفریقای جنوبی به دنیا آمده است </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> اگر او در کجا به دنیا امده باشد، چه کسی او را در کجا میدانست؟ </td>
<td> </td>
</tr>
</table>
--------------------------------------------------------
### Q2:
```
سه قاتل در یک اتاق هستند.. یکی وارد اتاق می شود و یکی از آنها را می کشد. هیچ کس اتاق را ترک نمی کند چند قاتل در اتاق باقی مانده است؟ استدلال خود را توضیح دهید
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td> این یک بازی با کلمات است که به ما کمک می کند تا به یک پاسخ عجیب برسیم. پاسخ این است که سه قاتل باقی می مانند. دلیل این است که یکی از قاتلان وارد اتاق می شود و یکی از قاتلان دیگر را می کشد، اما هیچ کس اتاق را ترک نمی کند. بنابراین، سه قاتل باقی می مانند.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> پاسخ: در اتاق هنوز یک قاتل باقی مانده است.
استدلال: از هر دو قاتل یکی در اتاق باقی مانده است. زیرا هر کدام یکی را می کشد و هیچ کدام از اتاق ترک نمی کند. پس از کشتن دیگری، یکی از قاتلان باقی مانده در اتاق هست </td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> در این سوال، اولیه استدلال به این صورت است که در یک اتاق سه قاتل هستند. واقعیت این است که یکی از آنها را می کشد و یکی از آنها را نمی کشد. بنابراین، هیچ کس اتاق را ترک نمی کند و دو قاتل در اتاق باقی مانده است.
اگر در این سوال به استدلال بیشتری افتد، می توانیم ببینیم که هر دو قاتل هم در اتاق هستند و هیچ کدام را نمی کشند. اما در این استدلال اصلی و اولیه، تعداد قاتل های باقی مانده در اتاق به دو تقریباً این است. </td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td>در این سوال سه قاتل در یک اتاق وجود دارد و یکی از آنها را می کشد. هیچ کس اتاق را ترک نمی کند، بنابراین دو قاتل باقی مانده است. </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> این استدلال می تواند به یک اتاق با چهار قاتل در آن باشد. یک قاتل وارد اتاق می شود و یکی از آنها را می کشد. هیچ کس اتاق را ترک نمی کند چند قاتل در آن باقی مانده است. این استدلال می تواند به یک اتاق با چهار قاتل در آن باشد که یک قاتل وارد اتاق می شود و یکی از آنها را می کشد و هیچ کس اتاق را ترک نمی کند چند قاتل در آن باقی مانده است. می تواند به یک اتاق با چهار قاتل در آن باشد که یک قاتل وارد اتاق می شود و یکی از آنها را می کشد و هیچ کس اتاق را ترک نمی کند چند قاتل در آن باقی مانده است. می تواند به یک اتاق با چهار قاتل در آن باشد که یک قاتل وارد اتاق می شود و یکی از آنها را می کشد و هیچ کس اتاق را
</td>
<td> </td>
</tr>
</table>
----------------------------------------------------------
### Q3:
```
ملکه الیزابت کیست؟
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td>
ملکه الیزابت یک شاهزاده بریتانیایی است که از ۱۹۵۲ تا ۲۰۲۲ بر تخت سلطنت بریتانیا سلطنت کرد. او در ۲۱ آوریل 1926 در لندن متولد شد و در ۸ سپتامبر 2022 در سن ۹۶ سالگی در باکینگهام پالاس درگذشت. ملکه الیزابت دوم طولانیترین سلطنت بریتانیا را در اختیار داشت و در طول سلطنت خود بریتانیا را از جنگ جهانی دوم تا جنگ سرد و سپس به دوران مدرن رهبری کرد. او همچنین در طول سلطنت خود به عنوان یک نماد ملی و یک نماد برای بریتانیا شناخته شد. ملکه الیزابت دوم همچنین به عنوان ملکه کانادا، استرالیا، نیوزیلند و سایر کشورهای مشترکالمنافع بریتانیا نیز شناخته میشود. </td>
<td> <h1> 🥳 </h1> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> ملکه ایلزابت بولنتن بود که در سال 1533 تا 1558 در انگلستان دور بود. او در زمان بود که انگلستان در دوران تاریخی به عنوان یک کشور به پیشرفت بیشتری رسید </td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> ملکه الیزابت یک شهرستان در انگلستان است. </td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> ملکه الیزابت یک کشور در جنوب آفریقا است که در سال 1910 به این نام تعیین شد. این کشور به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینکه در سال 1910 به این نام تعیین شد، به دلیل اینک </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> ملکه الیزابت ریاضی ایرانی است که در سال ۱۹۴۷ در ایران تولید شد. این ملکه در سال ۱۹۶۷ در ایران تولید شده و در سال ۱۹۷۰ در ایران و در سال ۱۹۷۱ در امریکا تولید شده و در سال ۱۹۷۲ در ایران و در سال ۱۹۷۳ در امریکا تولید شده و در سال ۱۹۷۴ در ایران و در سال ۱۹۷۵ در امریکا تولید شده و در سال ۱۹۷۶ در ایران و در سال ۱۹۷۷ در امریکا تولید شده و در سال ۱۹۷۸ در ایران و در سال ۱۹۷۹ در امریکا تولید شده و در سال ۱۹۸۰ در ایران و در سال ۱۹۸۱ در امریکا تولید شده و در سال ۱۹۸۲ در ایران و در سال ۱۹۸۳ در امریکا تولید شده و در سال ۱۹۸۴ در ایران و در سال ۱۹۸۵ در امریک
</td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
### Q4:
```
اگر یک توپ داخل یک لیوان برعکس باشد و ما لیوان را برداشته و داخل مایکروویو بگذاریم الان توپ کجاست؟
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td> منظور شما این است که توپ در داخل لیوان برعکس قرار دارد و سپس لیوان را برداشته و داخل مایکروویو قرار می دهید. در این صورت، توپ در داخل مایکروویو قرار دارد و برعکس لیوان نیست.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td>
### Assistant: اگر یک توپ داخل یک لیوان برعکس باشد، و ما لیوان را برداشته و داخل مایکروویو بگذاریم، توپ داخل مایکروویو خواهد بود. در این واقعیت، توپ در داخل مایکروویو قرار داشته و لیوان را برداشته شده است </td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> اگر یک توپ داخل یک لیوان باشد، و ما لیوان را برداشته و داخل مایکروویو بگذاریم، توپ داخل مایکروویو خواهد بود. این تشکیل اندازه گیری شده است و در فیزیک و ارتباطات است، و این تشکیل اندازه گیری شده به عنوان "توپ داخل لیوان در داخل مایکروویو" شناسایی می شود. این تشکیل اندازه گیری شده در فیزیک و ارتباطات به عنوان یک نمونه از انواع تشکیلات اندازه گیری شده ارائه می شود که توسط جهان در حال ارتباط است.
</td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> اگر یک توپ داخل یک لیوان برعکس باشد و ما لیوان را برداشته و داخل مایکروویو بگذاریم، توپ داخل مایکروویو خواهد بود. </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> توپ داخل یک لیوان یا مایکروویو است که در آن می توان برای انجام کارهای مختلف استفاده کنید. لیوان یک ماشین است که در آن می توان برای انجام کارهای مختلف استفاده کنید. مایکروویو یک ماشین است که در آن می توان برای انجام کارهای مختلف استفاده کنید. مایکروویو یک توپ داخل است که در آن می توان برای انجام کارهای مختلف استفاده کنید. مایکروویو یک توپ داخل است که در آن می توان برای انجام کارهای مختلف استفاده کنید. مایکروویو یک توپ داخل است که در آن می توان برای انجام کارهای مختلف استفاده کنید. مایکروویو یک توپ داخل است که در آن می توان برای انجام کارهای مختلف استفاده کنید. مایکروویو یک توپ داخل است که در آ </td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
### Q5:
```
یک شعر با دقیقا پنجاه کلمه بساز؟
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td>
در تاریکی شب، صدای قلبم میزند
دلشکسته و بیامید، در تاریکی شب
من میخواهم به تو برسم، به تو که هستی
اما در تاریکی شب، نمیتوانم به تو برسم
من میخواهم به تو برسم، به تو که هستی
اما در تاریکی شب، نمیتوانم به تو برسم
من میخواهم به تو برسم، به تو که هستی
اما در تاریکی شب، نمیتوانم به تو برسم
من میخواهم به تو برسم، به تو که هستی
اما در تاریکی شب، نمیتوانم به تو برسم
من میخواهم به تو برسم، به تو که هستی
اما در تاریکی شب، نمیتوانم به تو برسم
من میخواهم به تو برسم، به تو که هستی
اما در تاریکی شب، نمیتوانم به تو برسم
من میخواهم به تو برسم، به تو که هستی
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> بله، می توانم یک شعر با دقیقا پنجاه کلمه برای شما بسازم. اینجا یک شعر با پنجاه کلمه ای که برای شما نوشتم:
در جنگل دیدار کردم یک گربه سیاه
که در دست دو گربه سبز قرار داشت
از آن که در جنگل زمانه های گذشته
گربه ها در دست دیگران قرار گرفت </td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> بله، میتوانید یک شعر با دقیقا پنجاه کلمه بساز. اینجا یک نمونه شعر با پنجاه کلمه ایده آمده است:
در دل آب و هوای زمان،
در دل آفتاب و شب،
در دل گل و گوشت،
در دل عشق و دوری،
در دل آفتاب و شب،
در دل گل و گوشت،
در دل عشق و دوری،
در دل آب و هوای زمان،
در دل آفتاب و شب،
در دل گل و گوشت،
در دل عشق و دوری،
در دل آب و هوای زمان،
در دل آفتاب و شب،
در دل گل و گوشت،
در دل عشق و دوری،
در دل آب و هوای زمان،
در دل آفتاب و شب،
در دل گل و گوشت،
در دل عشق و دوری،
در دل آب و هوای زمان. </td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> در اینجا یک شعر با دقیقا پنجاه کلمه وجود دارد:
"با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه،
با دقیقا پنجاه کلمه، </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> بله، شعر یک نوع از مواد مختلف است که در معروف شعر با نام شعر انسان به اندازهٔ ۱۰۰۰ کلمه در مقابل ۱۰۰۰ کلمه در مقابس شعر حیوانات و ۱۰۰۰ کلمه در مقابس شعر گیاهان و ۱۰۰۰ کلمه در مقابس شعر گیاهان با نام شعر گیاهان انسان و ۱۰۰۰ کلمه در مقابس شعر گیاهان دیگر و ۱۰۰۰ کلمه در مقابس شعر دیگر انسان و ۱۰۰۰ کلمه در مقابس شعر دیگر انسان و ۱۰۰۰ کلمه در مقابس شعر دیگر حیوانات و ۱۰۰۰ کلمه در مقابس شعر دیگر گیاهان و ۱۰۰۰ کلمه در مقابس شعر دیگر گیاهان انسان و ۱۰۰۰ کلمه در مقابس شعر دیگر حیوانات و ۱۰۰۰ کلمه در مقابس شعر دیگر گیاهان و ۱۰۰۰ کلمه در مقابس شعر دیگ </td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
### Q6:
```
یک نامه خطاب به رییس شرکت من بنویس که من در حال استعفا و ترک شرکت هستم؟
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr> <tr>
<td> AVA-Llama-3 </td>
<td>
یک نامه خطاب به رییس شرکت من بنویسید که من در حال استعفا و ترک شرکت هستم.
[Your Name]
[Your Address]
[City, State, ZIP]
[Date]
[CEO's Name]
[Company Name]
[Company Address]
[City, State, ZIP]
Dear [CEO's Name],
I am writing to inform you that I have decided to resign from my position as [Position] at [Company Name], effective [Date of Last Day of Work]. This was not an easy decision for me to make, as I have truly enjoyed my time at the company and have been grateful for the opportunities that I have been given.
However, I have come to realize that it is time for me to move on and explore new opportunities. I am grateful for the experience and skills that I have gained during my time at [Company Name], and I am confident that they will serve me well in my future endeavors.
I want to thank you for the support and guidance that you have provided me during my time at the company. I appreciate the opportunities that you have given me to grow and develop as a professional, and I am grateful for the trust that you have placed in me.
I will do everything I can to ensure a smooth transition of my responsibilities and complete any outstanding tasks before my departure. If there is anything specific that you would like me to focus on during my remaining time here, please let me know.
Thank you again for the opportunity to work at [Company Name]. I wish the company all the best for the future.
Sincerely,
[Your Name]
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> ۱۳۹۹/۰۸/۱۴۰۰
مترجم: شما باید نامه خطاب خود را به انگلیسی نوشته و از طریق این پیام به من ارسال کنید.
متوضع: در این نامه خطاب، من در حال استعفا و ترک شرکت هستم.
نامه خطاب:
از انجام ناخودپسندیده و از دست دادن از شرکت خود انجام می دهم.
شما باید نامه خطاب خود را به انگلیسی نوشته و از طریق این پیام به من ارسال کنید.
متوضع: در این نامه خطاب، من در حال استعفا و ترک شرکت هستم.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> اگر شما یک نامه خطاب به رییس شرکت خود بنویسید که من در حال استعفا و ترک شرکت هستم، این است که می توانید از آن استفاده کنید:
نامه خطاب
شماره: [شماره نامه]
تاریخ: [تاریخ نوشتن نامه]
[نام و نام خانوادگی رییس شرکت]
[عنوان شرکت]
[آدرس شرکت]
شکر خوب،
به عنوان یکی از عضویان شرکت [نام شرکت]، احتمالاً اولین خبری که به شما این نامه ارسال می کنم نیست. اما در حال حاضر، من در حال استعفا و ترک از شرکت هستم و این نامه را به شما ارسال می کنم تا آن را به دنباله های دیگری اضافه کنم.
</td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> رییس شرکت،
من در حال استعفا و ترک شرکت هستم. این نه به دلیل اینکه به شرکت نیاز ندارم، بلکه به دلیل اینکه با این کار به خود و به شرکت اشتباه کرده ام. این به دلیل این است که از طرف شرکت به دنبال ارزش های بیشتری بودم و از طرف من به دنبال ارزش های بیشتری بودم. اما این ارزش ها به طور متوقف شده است و از طرف شرکت و من ارزش های بیشتری به دست نمی آید. این به دلیل این است که ارزش های بیشتری به طور متوقف شده است و از طرف شرکت و من ارزش های بیشتری به دست نمی آید. اما این ارزش ها به طور متوقف شده است و از طرف شرکت و من ارزش های بیشتری به دست نمی آید. اما این ارزش ها به طور متوقف شده است و از طرف شرکت و من ارزش های بی </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> ### Assistant: خوب است که شرکت را ترک کردید. من به شرکت باز کردم و امروز در شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال استعفا و می خوام به شرکت باز شدم. من در حال
</td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
### Q7:
```
رییس جمهور آمریکا در سال 1996 که بود؟
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td> رییس جمهور آمریکا در سال 1996، بیل کلینتون بود </td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> رییس جمهور آمریکا در سال 1996 بود بیل کلینتن. </td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> در سال 1996، رییس جمهور آمریکا بیل کلنتن بود. </td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> رییس جمهور آمریکا در سال 1996 بیل کلنتن بود. </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> رییس جمهور آمریکا در سال ۱۹۹۶ به نام بلار بورنس بود. او در ۱۹۹۲ از رییس دادگاه آمریکا و در ۱۹۹۴ از رییس دادگاه آمریکا شد و در ۱۹۹۶ به رییس جمهور آمریکا اتحادیه شد. او در ۱۹۹۷ به رییس جمهور آمریکا دوم شد و در ۱۹۹۹ به رییس جمهور آمریکا سوم شد. او در ۱۹۹۱ به رییس دادگاه آمریکا شد و در ۱۹۹۲ به رییس دادگاه آمریکا شد و در ۱۹۹۳ به رییس جمهور آمریکا شد. او در ۱۹۹۴ به رییس دادگاه آمریکا شد و در ۱۹۹۵ به رییس جمهور آمریکا شد. او در ۱۹۹۶ به رییس جمهور آمریکا شد و در ۱۹۹۷ به رییس جمهور آمریکا دوم شد. او در ۱۹۹۸ به رییس جمهور آمریکا سوم شد. او در ۱۹۹ </td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
### Q8:
```
اگر پنج پیراهن 4 ساعت طول بکشد تا خشک شود، 20 پیراهن چقدر طول می کشد تا خشک شود؟
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td>
اگر پنج پیراهن 4 ساعت طول می کشد تا خشک شود، می توانیم به این نتیجه برسیم که هر پیراهن 4/5 ساعت طول می کشد تا خشک شود. برای پیدا کردن زمان لازم برای خشک شدن 20 پیراهن، می توانیم از این رابطه استفاده کنیم:
4/5 * 20 = 16
بنابراین، 20 پیراهن 16 ساعت طول می کشد تا خشک شود.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> اگر پنج پیراهن 4 ساعت طول بکشد تا خشک شود، 20 پیراهن به اندازه ی یک ساعت طول می کشد تا خشک شود. این اطمینان را دارید که هر پیراهن در نتیجه 20 پیراهن 4 ساعت طول می کشد تا خشک شود.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> اگر پنج پیراهن 4 ساعت طول بکشد تا خشک شود، 20 پیراهن به اندازه ی شبیه ساعت بکشند تا خشک شود. برای حل مورد نیاز، شما می توانید از فرمول زیر استفاده کنید:
طول بکشنده شدن 20 پیراهن = طول بکشنده شدن 5 پیراهن × (شماره پیراهن / شماره پیراهن)
طول بکشنده شدن 20 پیراهن = 4 ساعت × (20 / 5)
طول بکشنده شدن 20 پیراهن = 4 ساعت × 4
طول بکشنده شدن 20 پیراهن = 16 ساعت
</td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> برای حل این پرسش، ابتدا باید از این اطلاعات استفاده کنید که پنج پیراهن 4 ساعت طول می کشد تا خشک شود. از این اطلاعات باید برای حل پرسش استفاده کنید.برای حل این پرسش، ابتدا باید از این اطلاعات استفاده کنید که پنج پیراهن 4 ساعت طول می کشد تا خشک شود. از این اطلاعات باید برای حل پرسش استفاده کنید.برای حل این پرسش، ابتدا باید از این اطلاعات استفاده کنید که پنج پیراهن 4 ساعت طول می کشد تا خشک شود. از این اطلاعات باید برای حل پرسش استفاده کنید.برای حل این پرسش، ابتدا باید از این اطلاعات استفاده کنید که پنج پیراهن 4 ساعت طول می کشد تا خشک شود. از این اطلاعات باید برای حل پرسش استفاده کنید.برای حل این پرسش
</td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پنج پیراهن چه طول دارند؟ پن
</td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
### Q9:
```
یک خرگوش و یک لاک پشت یک مسابقه 100 متری را شروع می کنند، سرعت خرگوش 10 کیلومتر و لاک پشت 1 کیلومتر است، چه کسی برنده مسابقه است؟
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td>
این مسابقه بسیار جالب است! با در نظر گرفتن سرعت هر دو، می توانیم سرعت آنها را به یک واحد تبدیل کنیم. سرعت خرگوش 10 کیلومتر بر ساعت است، بنابراین سرعت لاک پشت را به 10 کیلومتر بر ساعت تبدیل می کنیم. در این صورت، سرعت لاک پشت 0.1 کیلومتر بر ساعت است.
در این صورت، سرعت خرگوش 10 کیلومتر بر ساعت و سرعت لاک پشت 0.1 کیلومتر بر ساعت است. بنابراین، خرگوش برنده مسابقه است. لاک پشت به دلیل سرعت بسیار پایین خود نمی تواند با خرگوش رقابت کند.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td>
برای حل این مسئله، ما باید سرعت هر دو از جمله را بررسی کنیم و سپس آنها را با هم مقایسه کنیم. سرعت خرگوش 10 کیلومتر در 1 دقیقه است و سرعت لاک پشت 1 کیلومتر در 10 دقیقه است.
مسابقه 100 متری از آنجایی که خرگوش سریعتر از لاک پشت است، خرگوش به طور کامل قابلیت برداشتن این جاهزه را دارد. بنابراین، خرگوش برنده مسابقه است.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> برای مشاهده نتیجه مسابقه، باید سرعت خرگوش و لاک پشت را به هم انداز کنیم. سرعت خرگوش 10 کیلومتر در ساعت و لاک پشت 1 کیلومتر در ساعت است. برای اینکه نتیجه مسابقه را بدست آوریم، باید 100 متر را به سرعت خرگوش و لاک پشت تبدیل کنیم.
100 متر / 10 کیلومتر در ساعت = 10 دقیقه
100 متر / 1 کیلومتر در ساعت = 100 دقیقه
بنابراین، خرگوش 10 دقیقه و لاک پشت 100 دقیقه می گذرد تا 100 متر را پیروز کنند. بنابراین، برنده مسابقه خرگوش است. </td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> برنده مسابقه خرگوش است. با سرعت 10 کیلومتر در ساعت، خرگوش به انتهای 100 متر در 10 دقیقه به پایان می رسد، در حالی که لاک پشت با سرعت 1 کیلومتر در ساعت به پایان می رسد و به انتهای 100 متر در 100 دقیقه به پایان می رسد.
</td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> با این نوع سوال می توان به نوع سوال با نوع جواب انتظار داشته باشید. با این حال، اگر با این سوال می توان به نوع جواب انتظار داشته باشید، می توان به نوع جواب انتظار داشته باشید. با این حال، اگر با این سوال می توان به نوع جواب انتظار داشته باشید، می توان به نوع جواب انتظار داشته باشید. با این حال، اگر با این سوال می توان به نوع جواب انتظار داشته باشید، می توان به نوع جواب انتظار داشته باشید. با این حال، اگر با این سوال می توان به نوع جواب انتظار داشته باشید، می توان به نوع جواب انتظار داشته باشید. با این حال، اگر با این سوال می توان به نوع جواب انتظار داشته باشید، می توان به نوع جواب انتظار داشته باشید. با
</td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
### Q10:
```
یک فایل json حاوی این اطلاعات ایجاد کنید: bob، jon، mary و rich
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td>
Here is a JSON file containing the information you requested:
```
[
{
"name": "Bob"
},
{
"name": "Jon"
},
{
"name": "Mary"
},
{
"name": "Rich"
}
]
```
If you want to add more information to each person, you can add additional properties to the JSON object. For example:
```
[
{
"name": "Bob",
"age": 30,
"city": "New York"
},
{
"name": "Jon",
"age": 25,
"city": "Los Angeles"
},
{
"name": "Mary",
"age": 28,
"city": "Chicago"
},
{
"name": "Rich",
"age": 35,
"city": "San Francisco"
}
]
```
Let me know if you have any other questions!
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> ساختن یک فایل json با این اطلاعات از قبل ایجاد شده است. این فایل json شامل اطلاعات دو نفر از افراد مانند bob و jon است. برای افزودن اطلاعات موجود در فایل json یا ایجاد جدید، باید از کاماند زیر استفاده کنید:
```
python add_person.py [person_name] [age] [gender] [occupation]
```
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> ساده، این کار را با استفاده از زبان پروگرامینگ Python میتوانیم انجام دهیم. ابتدا، یک فایل json با نام "people.json" به دست آورید. سپس، از کلاس dict در Python استفاده کرده، یک فایل json را ایجاد میکنیم که اطلاعات این چهار نام را با این فرمت برمیچارد:
import json
people = {
"bob": {},
"jon": {},
"mary": {},
"rich": {}
}
with open("people.json", "w") as f:
json.dump(people, f) </td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> با استفاده از یک فایل json به این صورت ایجاد کنید:
```json
{
"نام ها": ["بوب", "جون", "مری", "ریچ"]
}
``` </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> امروز چه نوع افراد را در مدرسه دانشگاه میدانید؟ میخوام بینهایت افراد دو نوع را در مدرسه دانشگاه میدانم. این افراد با نام برادر و مرد است. برادر در مدرسه دانشگاه در دوره اول دوست دارند و مرد در دوره دوم دوست دارند. میخوام این افراد در مدرسه دانشگاه در حال تحصیل هستند. میخوام این افراد در مدرسه دانشگاه در حال تحصیل هستند. میخوام این افراد در مدرسه دانشگاه در حال تحصیل هستند. میخوام این افراد در مدرسه دانشگاه در حال تحصیل هستند. میخوام این افراد در مدرسه دانشگاه در حال تحصیل هستند. میخوام این افراد در مدرسه دانشگاه در حال تحصیل هستند. میخوام این افراد در مدرسه دانشگاه در حال تحصیل هستند. می
</td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
### Q11:
```
یک اسکریپت پایتون برای چاپ 1 تا 10 بنویسید
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td>
یک اسکریپت پایتون برای چاپ 1 تا 10 به این صورت می توان بنویسید:
```
for i in range(1, 11):
print(i)
```
این اسکریپت یک حلقه برای loop 1 تا 10 ایجاد می کند و هر بار در هر دور، عدد فعلی را چاپ می کند.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> این یک اسکریپت پایتون است که از 1 تا 10 چاپ می کند:
```python
for i in range(1, 11):
print(i)
```
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> یک اسکریپت پایتون برای چاپ 1 تا 10 به شرح زیر برمی گرد:
```python
این اسکریپت برای چاپ 1 تا 10 است
for i in range(1, 11):
print(i)
```
این اسکریپت با استفاده از لوپ (loop) در زمان اجرایی به نام `range(1, 11)` از 1 تا 10 عدد را مشاهده می کند و هر عدد را با استفاده از فنکشن `print()` چاپ می کند.
</td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> در اینجا یک اسکریپت پایتون برای چاپ 1 تا 10 وجود دارد:```
# این یک اسکریپت پایتون برای چاپ 1 تا 10 است.
for i in range(1, 11):
چاپ(i)
``` </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> ### Assistant: این اسکریپت چاپ کنید و بعد از انجام چاپ، می توانید باز کنید. اگر نیاز به این اسکریپت ندارید، با موفقیت این اسکریپت را از منبع اصلی دانستید. اگر نیاز دارید، با موفقیت این اسکریپت را از منبع اصلی دانستید. اگر نیاز ندارید، می توانید این اسکریپت را از منبع اصلی دانستید. اگر نیاز دارید، می توانید این اسکریپت را از منبع اصلی دانستید. اگر نیاز ندارید، می توانید این اسکریپت را از منبع اصلی دانستید. اگر نیاز دارید، می توانید این اسکریپت را از منبع اصلی دانستید. اگر نیاز ندارید، می توانید این اسکریپت را از منبع اصلی دانستید. اگر نیاز دارید، می توانید این اسکریپت را از منبع اصلی دانستید. اگر نیاز ندارید
</td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
### Q12:
```
یک تابع node js برای دریافت رشته و تقسیم آن بر اساس فاصله بنویسید
```
<table width="300">
<tr>
<td> Model </td>
<td> Answer </td>
<td> Rating </td>
</tr>
<tr>
<td> AVA-Llama-3 </td>
<td>
در اینجا یک تابع Node.js برای دریافت رشته و تقسیم آن بر اساس فاصله ارائه می شود:
```
function splitStringBySpace(str) {
return str.split(' ');
}
// استفاده از تابع
const inputString = "Hello World";
const outputArray = splitStringBySpace(inputString);
console.log(outputArray); // [ "Hello", "World" ]
```
این تابع از روش `split` درbuilt-in `String` در Node.js استفاده می کند تا رشته را بر اساس فاصله (که با کاراکتر فاصله `space` مشخص می شود) تقسیم کند. سپس آرایه ای از رشته های جدا شده را بازمی گرداند.
اگر می خواهید رشته را بر اساس فاصله خاصی تقسیم کنید، می توانید از یک رشته به عنوان دومین آرگومان به روش `split` پاس دهید. برای مثال، برای تقسیم رشته بر اساس فاصله با فاصله 2 کاراکتر، می توانید از رشته `'\u0020'` استفاده کنید:
```
function splitStringBySpace(str) {
return str.split('\u0020');
}
// استفاده از تابع
const inputString = "Hello World";
const outputArray = splitStringBySpace(inputString);
console.log(outputArray); // [ "Hello", "World" ]
```
این تابع رشته را بر اساس فاصله 2 کاراکتر (که با کاراکتر فاصله `'\u0020'` مشخص می شود) تقسیم می کند. سپس آرایه ای از رشته های جدا شده را بازمی گرداند.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V4 </td>
<td> یک تابع Node.js برای دریافت رشته و تقسیم آن بر اساس فاصله به شما این کد پیوند می دهم:
```javascript
function splitStringBySpace(str) {
return str.split(' ');
}
const inputString = "این یک رشته است که باید تقسیم شود";
const result = splitStringBySpace(inputString);
console.log(result);
```
این کد تابع `splitStringBySpace` برچسب کرده است که رشته وارد شده را با فاصله بنویسید. این کد را در نظر دهید و اگر نیاز به تغییرات دیگری دارید، لطفا بگویید.
</td>
<td> </td>
</tr>
<tr>
<td> AVA-V2 </td>
<td> یک تابع node js برای دریافت رشته و تقسیم آن بر اساس فاصله به شکل زیر می تواند باشد:
```javascript
function splitStringBySpace(str) {
return str.split(' ');
}
// این تابع را برای استفاده در یک فایل node js مانند این فایل استفاده می کنیم:
const fs = require('fs');
const readline = require('readline');
const rl = readline.createInterface({
input: fs.createReadStream('input.txt'),
crlfDelay: Infinity
});
rl.on('line', (line) => {
const words = splitStringBySpace(line);
console.log(words);
});
```
در این کد، ابتدا به کار برده `readline` بخش های فایل ورودی را دریافت می کند. سپس تابع `splitStringBySpace` را برای تقسیم رشته ای که دریافت کرده ایم استفاده می کند. این تابع رشته را با فاصله های فارسی راهنمایی دار تقسیم می کند و به صورت مجزا برمی گرداند. نتیجه این کار این است که هر کلمه در یک عنصر مجزا قرار می گیرد. سپس این عنصرها را در کنسول نمایش می دهد.
</td>
<td> </td>
</tr>
<tr>
<td> Maral </td>
<td> در اینجا یک تابع برای دریافت رشته و تقسیم آن بر اساس فاصله وجود دارد:```
جاوا اسکریپت
فنکشن تقسیم رشته (رشته) {
برای (فاصله از رشته) {
رشته ای برای رشته از فاصله تا فاصله بعدی ایجاد کنید
فاصله را بر اساس فاصله بعدی به رشته ای برای رشته اضافه کنید
}
برمی گردانید رشته ای برای رشته
}
```
در این تابع، رشته وارد شده به عنوان پارامتر وارد می شود. در طول تابع، رشته به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عنوان یک رشته ای به عن </td>
<td> </td>
</tr>
<tr>
<td> Llama </td>
<td> - </td>
<td> </td>
</tr>
</table>
-----------------------------------------------------------------
## License
[](/LICENSE)
Released Jan 30, 2024 by [Mehdi Hosseini Moghadam](https://github.com/mehdihosseinimoghadam)
Attention ⚠️: The user is responsible for using AVA-Llama-3 / Persian Llama 3.
Any misuse of the model (of any kind) is the responsibility of the user and not the creator.
## Contact
<a href="https://ir.linkedin.com/in/mehdi-hosseini-moghadam-384912198" target="_blank"><img src="https://cdn-icons.flaticon.com/png/512/3536/premium/3536505.png?token=exp=1644871115~hmac=59bc0b44906adebd63f84642086d4695" alt="Buy Me A Coffee" style="height: 50px !important;width: 50px !important;box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;-webkit-box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;" ></a>
<a href="https://scholar.google.com/citations?user=TKWbohsAAAAJ&hl=en" target="_blank"><img src="https://cdn-icons.flaticon.com/png/512/3107/premium/3107171.png?token=exp=1644871560~hmac=7f8fd85e8db71945e25202a3ac739e1c" alt="Buy Me A Coffee" style="height: 50px !important;width: 50px !important;box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;-webkit-box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;" ></a>
<a href="https://huggingface.co/MehdiHosseiniMoghadam" target="_blank"><img src="https://cdn-icons.flaticon.com/png/512/2461/premium/2461892.png?token=exp=1644871873~hmac=8659d04d69008e399a5344cad5bc4270" alt="Buy Me A Coffee" style="height: 50px !important;width: 50px !important;box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;-webkit-box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;" ></a>
|
mariopattan/oncloud5 | mariopattan | 2024-11-13T03:46:45Z | 6 | 0 | diffusers | [
"diffusers",
"text-to-image",
"flux",
"lora",
"template:sd-lora",
"fluxgym",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2024-11-13T03:43:23Z | ---
tags:
- text-to-image
- flux
- lora
- diffusers
- template:sd-lora
- fluxgym
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: oncloud5
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
---
# oncloud5
A Flux LoRA trained on a local computer with [Fluxgym](https://github.com/cocktailpeanut/fluxgym)
<Gallery />
## Trigger words
You should use `oncloud5` to trigger the image generation.
## Download model and use it with ComfyUI, AUTOMATIC1111, SD.Next, Invoke AI, Forge, etc.
Weights for this model are available in Safetensors format.
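The LoRA can also be loaded with 🧨 diffusers. The following is a minimal sketch, not a tested recipe: it assumes a recent diffusers release with the FLUX pipeline, access to the gated base model, and a GPU with enough memory; the sampler settings are illustrative.

```python
# Sketch: load FLUX.1-dev, apply this LoRA, and generate with the trigger word.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16)
pipe.load_lora_weights("mariopattan/oncloud5")
pipe.to("cuda")

image = pipe("oncloud5, a portrait photo", num_inference_steps=28, guidance_scale=3.5).images[0]
image.save("oncloud5_sample.png")
```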
|
mradermacher/QwenSlerp7-7B-GGUF | mradermacher | 2024-11-13T03:40:43Z | 6 | 1 | transformers | [
"transformers",
"gguf",
"mergekit",
"merge",
"en",
"base_model:allknowingroger/QwenSlerp7-7B",
"base_model:quantized:allknowingroger/QwenSlerp7-7B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-13T02:44:29Z | ---
base_model: allknowingroger/QwenSlerp7-7B
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: mradermacher
tags:
- mergekit
- merge
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/allknowingroger/QwenSlerp7-7B
<!-- provided-files -->
weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q2_K.gguf) | Q2_K | 3.1 | |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q3_K_S.gguf) | Q3_K_S | 3.6 | |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q3_K_M.gguf) | Q3_K_M | 3.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q3_K_L.gguf) | Q3_K_L | 4.2 | |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.IQ4_XS.gguf) | IQ4_XS | 4.4 | |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q4_0_4_4.gguf) | Q4_0_4_4 | 4.5 | fast on arm, low quality |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q4_K_S.gguf) | Q4_K_S | 4.6 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q4_K_M.gguf) | Q4_K_M | 4.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q5_K_S.gguf) | Q5_K_S | 5.4 | |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q5_K_M.gguf) | Q5_K_M | 5.5 | |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q6_K.gguf) | Q6_K | 6.4 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.Q8_0.gguf) | Q8_0 | 8.2 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/QwenSlerp7-7B-GGUF/resolve/main/QwenSlerp7-7B.f16.gguf) | f16 | 15.3 | 16 bpw, overkill |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
touhidulislam/BERTweet_retrain_2020_28 | touhidulislam | 2024-11-13T03:33:59Z | 85 | 0 | transformers | [
"transformers",
"safetensors",
"roberta",
"fill-mask",
"generated_from_trainer",
"base_model:vinai/bertweet-base",
"base_model:finetune:vinai/bertweet-base",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2024-11-13T03:33:35Z | ---
library_name: transformers
license: mit
base_model: vinai/bertweet-base
tags:
- generated_from_trainer
model-index:
- name: BERTweet_retrain_2020_28
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# BERTweet_retrain_2020_28
This model is a fine-tuned version of [vinai/bertweet-base](https://huggingface.co/vinai/bertweet-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.6108
## Model description
More information needed
## Intended uses & limitations
More information needed
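As a minimal usage sketch (an assumption, since the intended use is not documented here), the checkpoint can be loaded as a standard fill-mask model:

```python
# Sketch: masked-token prediction with the fine-tuned BERTweet checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="touhidulislam/BERTweet_retrain_2020_28")
print(fill_mask("The weather today is <mask> !"))
```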
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.849 | 1.0 | 2973 | 2.7012 |
| 2.8012 | 2.0 | 5946 | 2.6335 |
| 2.8155 | 3.0 | 8919 | 2.6148 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.1.0+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
|
gokulsrinivasagan/distilbert_base_lda | gokulsrinivasagan | 2024-11-13T03:33:25Z | 21 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"distilbert",
"generated_from_trainer",
"dataset:gokulsrinivasagan/processed_wikitext-103-raw-v1-ld",
"model-index",
"endpoints_compatible",
"region:us"
] | null | 2024-11-12T18:05:29Z | ---
library_name: transformers
tags:
- generated_from_trainer
datasets:
- gokulsrinivasagan/processed_wikitext-103-raw-v1-ld
metrics:
- accuracy
model-index:
- name: distilbert_base_lda
results:
- task:
name: Masked Language Modeling
type: fill-mask
dataset:
name: gokulsrinivasagan/processed_wikitext-103-raw-v1-ld
type: gokulsrinivasagan/processed_wikitext-103-raw-v1-ld
metrics:
- name: Accuracy
type: accuracy
value: 0.5699836130417585
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_base_lda
This model is a fine-tuned version of [](https://huggingface.co/) on the gokulsrinivasagan/processed_wikitext-103-raw-v1-ld dataset.
It achieves the following results on the evaluation set:
- Loss: 4.3123
- Accuracy: 0.5700
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 96
- eval_batch_size: 96
- seed: 10
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10000
- num_epochs: 25
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|
| 8.3233 | 4.1982 | 10000 | 8.2422 | 0.1515 |
| 6.1542 | 8.3963 | 20000 | 5.9056 | 0.3692 |
| 4.8606 | 12.5945 | 30000 | 4.7305 | 0.5129 |
| 4.535 | 16.7926 | 40000 | 4.4266 | 0.5557 |
| 4.3967 | 20.9908 | 50000 | 4.3196 | 0.5676 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.2.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.1
|
rohand8/BERT | rohand8 | 2024-11-13T03:23:27Z | 5 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"en",
"dataset:s2orc",
"dataset:flax-sentence-embeddings/stackexchange_xml",
"dataset:ms_marco",
"dataset:gooaq",
"dataset:yahoo_answers_topics",
"dataset:code_search_net",
"dataset:search_qa",
"dataset:eli5",
"dataset:snli",
"dataset:multi_nli",
"dataset:wikihow",
"dataset:natural_questions",
"dataset:trivia_qa",
"dataset:embedding-data/sentence-compression",
"dataset:embedding-data/flickr30k-captions",
"dataset:embedding-data/altlex",
"dataset:embedding-data/simple-wiki",
"dataset:embedding-data/QQP",
"dataset:embedding-data/SPECTER",
"dataset:embedding-data/PAQ_pairs",
"dataset:embedding-data/WikiAnswers",
"arxiv:1904.06472",
"arxiv:2102.07033",
"arxiv:2104.08727",
"arxiv:1704.05179",
"arxiv:1810.09305",
"license:apache-2.0",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2024-11-13T03:23:21Z | ---
language: en
license: apache-2.0
library_name: sentence-transformers
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
datasets:
- s2orc
- flax-sentence-embeddings/stackexchange_xml
- ms_marco
- gooaq
- yahoo_answers_topics
- code_search_net
- search_qa
- eli5
- snli
- multi_nli
- wikihow
- natural_questions
- trivia_qa
- embedding-data/sentence-compression
- embedding-data/flickr30k-captions
- embedding-data/altlex
- embedding-data/simple-wiki
- embedding-data/QQP
- embedding-data/SPECTER
- embedding-data/PAQ_pairs
- embedding-data/WikiAnswers
pipeline_tag: sentence-similarity
---
# all-MiniLM-L6-v2
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/all-MiniLM-L6-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
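Since the embeddings are usually consumed by a similarity function, here is a short follow-up sketch (illustrative only) that scores the two example sentences with the `util.cos_sim` helper:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('sentence-transformers/all-MiniLM-L6-v2')
embeddings = model.encode(["This is an example sentence", "Each sentence is converted"])

# Cosine similarity between the two sentence embeddings
print(util.cos_sim(embeddings[0], embeddings[1]))
```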
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
import torch.nn.functional as F
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')
model = AutoModel.from_pretrained('sentence-transformers/all-MiniLM-L6-v2')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
# Normalize embeddings
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/all-MiniLM-L6-v2)
------
## Background
The project aims to train sentence embedding models on very large sentence level datasets using a self-supervised
contrastive learning objective. We used the pretrained [`nreimers/MiniLM-L6-H384-uncased`](https://huggingface.co/nreimers/MiniLM-L6-H384-uncased) model and fine-tuned it on a
dataset of 1B sentence pairs. We use a contrastive learning objective: given a sentence from a pair, the model should predict which one, out of a set of randomly sampled other sentences, was actually paired with it in our dataset.
We developed this model during the
[Community week using JAX/Flax for NLP & CV](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104),
organized by Hugging Face. We developed this model as part of the project:
[Train the Best Sentence Embedding Model Ever with 1B Training Pairs](https://discuss.huggingface.co/t/train-the-best-sentence-embedding-model-ever-with-1b-training-pairs/7354). We benefited from efficient hardware infrastructure to run the project: 7 TPUs v3-8, as well as guidance from Google's Flax, JAX, and Cloud team members about efficient deep learning frameworks.
## Intended uses
Our model is intended to be used as a sentence and short paragraph encoder. Given an input text, it outputs a vector which captures
the semantic information. The sentence vector may be used for information retrieval, clustering or sentence similarity tasks.
By default, input text longer than 256 word pieces is truncated.
## Training procedure
### Pre-training
We use the pretrained [`nreimers/MiniLM-L6-H384-uncased`](https://huggingface.co/nreimers/MiniLM-L6-H384-uncased) model. Please refer to the model card for more detailed information about the pre-training procedure.
### Fine-tuning
We fine-tune the model using a contrastive objective. Formally, we compute the cosine similarity between each possible pair of sentences in the batch.
We then apply the cross entropy loss by comparing with true pairs.
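A minimal PyTorch sketch of that objective is shown below; it is illustrative only (the similarity scaling factor is an assumed value, and the actual training code is in `train_script.py`):

```python
import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(anchor_emb, positive_emb, scale=20.0):
    # anchor_emb, positive_emb: (batch, dim) embeddings of the two sides of each true pair
    a = F.normalize(anchor_emb, p=2, dim=1)
    b = F.normalize(positive_emb, p=2, dim=1)
    scores = a @ b.T * scale  # cosine similarity of every anchor with every candidate in the batch
    labels = torch.arange(scores.size(0), device=scores.device)  # true pairs lie on the diagonal
    return F.cross_entropy(scores, labels)
```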
#### Hyper parameters
We trained our model on a TPU v3-8. We trained the model for 100k steps using a batch size of 1024 (128 per TPU core).
We used a learning rate warm-up of 500 steps. The sequence length was limited to 128 tokens. We used the AdamW optimizer with
a 2e-5 learning rate. The full training script is accessible in this repository: `train_script.py`.
#### Training data
We use the concatenation from multiple datasets to fine-tune our model. The total number of sentence pairs is above 1 billion sentences.
We sampled each dataset given a weighted probability which configuration is detailed in the `data_config.json` file.
| Dataset | Paper | Number of training tuples |
|--------------------------------------------------------|:----------------------------------------:|:--------------------------:|
| [Reddit comments (2015-2018)](https://github.com/PolyAI-LDN/conversational-datasets/tree/master/reddit) | [paper](https://arxiv.org/abs/1904.06472) | 726,484,430 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Abstracts) | [paper](https://aclanthology.org/2020.acl-main.447/) | 116,288,806 |
| [WikiAnswers](https://github.com/afader/oqa#wikianswers-corpus) Duplicate question pairs | [paper](https://doi.org/10.1145/2623330.2623677) | 77,427,422 |
| [PAQ](https://github.com/facebookresearch/PAQ) (Question, Answer) pairs | [paper](https://arxiv.org/abs/2102.07033) | 64,371,441 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Titles) | [paper](https://aclanthology.org/2020.acl-main.447/) | 52,603,982 |
| [S2ORC](https://github.com/allenai/s2orc) (Title, Abstract) | [paper](https://aclanthology.org/2020.acl-main.447/) | 41,769,185 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Body) pairs | - | 25,316,456 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title+Body, Answer) pairs | - | 21,396,559 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Answer) pairs | - | 21,396,559 |
| [MS MARCO](https://microsoft.github.io/msmarco/) triplets | [paper](https://doi.org/10.1145/3404835.3462804) | 9,144,553 |
| [GOOAQ: Open Question Answering with Diverse Answer Types](https://github.com/allenai/gooaq) | [paper](https://arxiv.org/pdf/2104.08727.pdf) | 3,012,496 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 1,198,260 |
| [Code Search](https://huggingface.co/datasets/code_search_net) | - | 1,151,414 |
| [COCO](https://cocodataset.org/#home) Image captions | [paper](https://link.springer.com/chapter/10.1007%2F978-3-319-10602-1_48) | 828,395|
| [SPECTER](https://github.com/allenai/specter) citation triplets | [paper](https://doi.org/10.18653/v1/2020.acl-main.207) | 684,100 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Question, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 681,164 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Question) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 659,896 |
| [SearchQA](https://huggingface.co/datasets/search_qa) | [paper](https://arxiv.org/abs/1704.05179) | 582,261 |
| [Eli5](https://huggingface.co/datasets/eli5) | [paper](https://doi.org/10.18653/v1/p19-1346) | 325,475 |
| [Flickr 30k](https://shannon.cs.illinois.edu/DenotationGraph/) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/229/33) | 317,695 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles) | | 304,525 |
| AllNLI ([SNLI](https://nlp.stanford.edu/projects/snli/) and [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) | [paper SNLI](https://doi.org/10.18653/v1/d15-1075), [paper MultiNLI](https://doi.org/10.18653/v1/n18-1101) | 277,230 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (bodies) | | 250,519 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles+bodies) | | 250,460 |
| [Sentence Compression](https://github.com/google-research-datasets/sentence-compression) | [paper](https://www.aclweb.org/anthology/D13-1155/) | 180,000 |
| [Wikihow](https://github.com/pvl/wikihow_pairs_dataset) | [paper](https://arxiv.org/abs/1810.09305) | 128,542 |
| [Altlex](https://github.com/chridey/altlex/) | [paper](https://aclanthology.org/P16-1135.pdf) | 112,696 |
| [Quora Question Triplets](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs) | - | 103,663 |
| [Simple Wikipedia](https://cs.pomona.edu/~dkauchak/simplification/) | [paper](https://www.aclweb.org/anthology/P11-2117/) | 102,225 |
| [Natural Questions (NQ)](https://ai.google.com/research/NaturalQuestions) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/1455) | 100,231 |
| [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/) | [paper](https://aclanthology.org/P18-2124.pdf) | 87,599 |
| [TriviaQA](https://huggingface.co/datasets/trivia_qa) | - | 73,346 |
| **Total** | | **1,170,060,424** | |
RichardErkhov/netcat420_-_MFANNv0.17.12-gguf | RichardErkhov | 2024-11-13T03:23:13Z | 7 | 0 | null | [
"gguf",
"arxiv:2306.01708",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2024-11-12T22:11:11Z | Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
MFANNv0.17.12 - GGUF
- Model creator: https://huggingface.co/netcat420/
- Original model: https://huggingface.co/netcat420/MFANNv0.17.12/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [MFANNv0.17.12.Q2_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q2_K.gguf) | Q2_K | 0.75GB |
| [MFANNv0.17.12.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q3_K_S.gguf) | Q3_K_S | 3.41GB |
| [MFANNv0.17.12.Q3_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q3_K.gguf) | Q3_K | 3.74GB |
| [MFANNv0.17.12.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q3_K_M.gguf) | Q3_K_M | 3.74GB |
| [MFANNv0.17.12.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q3_K_L.gguf) | Q3_K_L | 4.03GB |
| [MFANNv0.17.12.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.IQ4_XS.gguf) | IQ4_XS | 4.18GB |
| [MFANNv0.17.12.Q4_0.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_0.gguf) | Q4_0 | 4.34GB |
| [MFANNv0.17.12.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.IQ4_NL.gguf) | IQ4_NL | 4.38GB |
| [MFANNv0.17.12.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_K_S.gguf) | Q4_K_S | 4.37GB |
| [MFANNv0.17.12.Q4_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_K.gguf) | Q4_K | 4.58GB |
| [MFANNv0.17.12.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_K_M.gguf) | Q4_K_M | 4.58GB |
| [MFANNv0.17.12.Q4_1.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q4_1.gguf) | Q4_1 | 4.78GB |
| [MFANNv0.17.12.Q5_0.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_0.gguf) | Q5_0 | 5.21GB |
| [MFANNv0.17.12.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_K_S.gguf) | Q5_K_S | 5.21GB |
| [MFANNv0.17.12.Q5_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_K.gguf) | Q5_K | 5.34GB |
| [MFANNv0.17.12.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_K_M.gguf) | Q5_K_M | 5.34GB |
| [MFANNv0.17.12.Q5_1.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q5_1.gguf) | Q5_1 | 5.65GB |
| [MFANNv0.17.12.Q6_K.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q6_K.gguf) | Q6_K | 6.14GB |
| [MFANNv0.17.12.Q8_0.gguf](https://huggingface.co/RichardErkhov/netcat420_-_MFANNv0.17.12-gguf/blob/main/MFANNv0.17.12.Q8_0.gguf) | Q8_0 | 7.95GB |
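Since the quantizations above are single-file GGUFs, one way to try them is to download a file and load it locally. The snippet below is a minimal sketch, not part of the original card: the `huggingface_hub` and `llama-cpp-python` packages and the choice of the Q4_K_M file (taken from the table above) are assumptions.

```python
# Minimal sketch: download one quant from this repo and run it with llama-cpp-python.
# Assumes `pip install huggingface_hub llama-cpp-python`; filename comes from the table above.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="RichardErkhov/netcat420_-_MFANNv0.17.12-gguf",
    filename="MFANNv0.17.12.Q4_K_M.gguf",
)

llm = Llama(model_path=gguf_path, n_ctx=4096)  # context size is an arbitrary choice here
out = llm("Explain in one sentence what GGUF quantization is.", max_tokens=64)
print(out["choices"][0]["text"])
```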
Original model description:
---
base_model:
- netcat420/MFANNv0.17
- netcat420/MFANNv0.17.11
- MaziyarPanahi/Llama-3-8B-Instruct-v0.4
library_name: transformers
tags:
- mergekit
- merge
---
# MFANNv0.17.12
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method using [MaziyarPanahi/Llama-3-8B-Instruct-v0.4](https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.4) as a base.
### Models Merged
The following models were included in the merge:
* [netcat420/MFANNv0.17](https://huggingface.co/netcat420/MFANNv0.17)
* [netcat420/MFANNv0.17.11](https://huggingface.co/netcat420/MFANNv0.17.11)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: netcat420/MFANNv0.17.11
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
  - model: netcat420/MFANNv0.17
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
merge_method: ties
base_model: MaziyarPanahi/Llama-3-8B-Instruct-v0.4
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
|
featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF | featherless-ai-quants | 2024-11-13T03:20:33Z | 7 | 0 | null | [
"gguf",
"text-generation",
"base_model:Dogge/llama-3-70B-uncensored",
"base_model:quantized:Dogge/llama-3-70B-uncensored",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2024-11-12T23:51:51Z | ---
base_model: Dogge/llama-3-70B-uncensored
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# Dogge/llama-3-70B-uncensored GGUF Quantizations 🚀

*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple, small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [Dogge-llama-3-70B-uncensored-IQ4_XS](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-IQ4_XS) | 36496.80 MB (folder) |
| Q2_K | [Dogge-llama-3-70B-uncensored-Q2_K](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q2_K) | 25153.26 MB (folder) |
| Q3_K_L | [Dogge-llama-3-70B-uncensored-Q3_K_L](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q3_K_L) | 35420.03 MB (folder) |
| Q3_K_M | [Dogge-llama-3-70B-uncensored-Q3_K_M](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q3_K_M) | 32680.03 MB (folder) |
| Q3_K_S | [Dogge-llama-3-70B-uncensored-Q3_K_S](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q3_K_S) | 29480.03 MB (folder) |
| Q4_K_M | [Dogge-llama-3-70B-uncensored-Q4_K_M](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q4_K_M) | 40550.61 MB (folder) |
| Q4_K_S | [Dogge-llama-3-70B-uncensored-Q4_K_S](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q4_K_S) | 38478.11 MB (folder) |
| Q5_K_M | [Dogge-llama-3-70B-uncensored-Q5_K_M](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q5_K_M) | 47635.86 MB (folder) |
| Q5_K_S | [Dogge-llama-3-70B-uncensored-Q5_K_S](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q5_K_S) | 46403.36 MB (folder) |
| Q6_K | [Dogge-llama-3-70B-uncensored-Q6_K](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q6_K) | 55206.44 MB (folder) |
| Q8_0 | [Dogge-llama-3-70B-uncensored-Q8_0](https://huggingface.co/featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF/tree/main/Dogge-llama-3-70B-uncensored-Q8_0) | 71501.78 MB (folder) |
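Because each quantization in this repo is a folder of sharded GGUF parts rather than a single file, a selective download keeps disk usage down. The sketch below is one possible approach and is not part of the original card; it assumes the `huggingface_hub` package and the folder names listed in the table above, with the downloaded shards then loaded by a GGUF runtime such as llama.cpp.

```python
# Minimal sketch: fetch only one quantization folder from this repo.
# Assumes `pip install huggingface_hub`; folder name follows the table above.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="featherless-ai-quants/Dogge-llama-3-70B-uncensored-GGUF",
    allow_patterns=["Dogge-llama-3-70B-uncensored-Q4_K_S/*"],  # one folder only
)
print("Files downloaded under:", local_dir)
```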
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |
artificialguybr/QWEN-2-1.5B-Synthia-I | artificialguybr | 2024-11-13T03:19:53Z | 63 | 1 | transformers | [
"transformers",
"pytorch",
"qwen2",
"text-generation",
"generated_from_trainer",
"instruction-tuning",
"conversational",
"en",
"base_model:Qwen/Qwen2-1.5B",
"base_model:finetune:Qwen/Qwen2-1.5B",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T03:18:13Z | ---
library_name: transformers
license: apache-2.0
base_model: Qwen/Qwen2-1.5B
language:
- en
pipeline_tag: text-generation
tags:
- generated_from_trainer
- instruction-tuning
model-index:
- name: outputs/qwen2.5-1.5b-ft-synthia15-i
results: []
---
[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
# Qwen2-1.5B Fine-tuned on Synthia v1.5-I
This model is a fine-tuned version of [Qwen/Qwen2-1.5B](https://huggingface.co/Qwen/Qwen2-1.5B) on the Synthia v1.5-I dataset, which contains over 20.7k instruction-following examples.
## Model Description
Qwen2-1.5B is part of the latest Qwen2 series of large language models. The base model brings significant improvements in:
- Language understanding and generation
- Structured data processing
- Support for multiple languages
- Long context handling
This fine-tuned version enhances the base model's instruction-following capabilities through training on the Synthia v1.5-I dataset.
### Model Architecture
- Type: Causal Language Model
- Parameters: 1.5B
- Training Framework: Transformers 4.45.0.dev0
## Intended Uses & Limitations
This model is intended for:
- Instruction following and task completion
- Text generation and completion
- Conversational AI applications
The model inherits the capabilities of the base Qwen2-1.5B model, while being specifically tuned for instruction following.
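A minimal usage sketch with the `transformers` API is shown below. It is illustrative rather than an official example from this card: the repository id is assumed from the model listing, and since no chat template is documented, a plain instruction-style prompt is used.

```python
# Minimal usage sketch for this fine-tune; repo id and prompt format are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "artificialguybr/QWEN-2-1.5B-Synthia-I"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"  # device_map needs `accelerate`
)

prompt = "Instruction: Summarize why instruction tuning helps small language models.\nResponse:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```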
## Training Procedure
### Training Data
The model was fine-tuned on the Synthia v1.5-I dataset containing 20.7k instruction-following examples.
### Training Hyperparameters
The following hyperparameters were used during training:
- Learning rate: 1e-05
- Train batch size: 5
- Eval batch size: 5
- Seed: 42
- Gradient accumulation steps: 8
- Total train batch size: 40
- Optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- LR scheduler type: cosine
- LR scheduler warmup steps: 100
- Number of epochs: 3
- Sequence length: 4096
- Sample packing: enabled
- Pad to sequence length: enabled
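For readers who want to reproduce a comparable setup outside axolotl, the sketch below restates the hyperparameters above as `transformers` `TrainingArguments`. It is an illustrative mapping rather than the script that produced this model; sequence packing and pad-to-sequence-length are axolotl-specific options and are not represented here.

```python
# Rough TrainingArguments equivalent of the listed hyperparameters (illustrative only).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs/qwen2.5-1.5b-ft-synthia15-i",
    learning_rate=1e-5,
    per_device_train_batch_size=5,
    per_device_eval_batch_size=5,
    gradient_accumulation_steps=8,  # 5 x 8 = 40 effective train batch size
    num_train_epochs=3,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    seed=42,                        # Adam betas/epsilon keep their defaults (0.9, 0.999, 1e-8)
)
```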
## Framework Versions
- Transformers 4.45.0.dev0
- Pytorch 2.3.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
<details><summary>See axolotl config</summary>
axolotl version: `0.4.1`
|
featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF | featherless-ai-quants | 2024-11-13T03:19:32Z | 18 | 0 | null | [
"gguf",
"text-generation",
"base_model:sarahlintang/mistral-indo-7b",
"base_model:quantized:sarahlintang/mistral-indo-7b",
"endpoints_compatible",
"region:us"
] | text-generation | 2024-11-13T03:09:38Z | ---
base_model: sarahlintang/mistral-indo-7b
pipeline_tag: text-generation
quantized_by: featherless-ai-quants
---
# sarahlintang/mistral-indo-7b GGUF Quantizations 🚀

*Optimized GGUF quantization files for enhanced model performance*
> Powered by [Featherless AI](https://featherless.ai) - run any model you'd like for a simple, small fee.
---
## Available Quantizations 📊
| Quantization Type | File | Size |
|-------------------|------|------|
| IQ4_XS | [sarahlintang-mistral-indo-7b-IQ4_XS.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-IQ4_XS.gguf) | 3761.66 MB |
| Q2_K | [sarahlintang-mistral-indo-7b-Q2_K.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q2_K.gguf) | 2593.27 MB |
| Q3_K_L | [sarahlintang-mistral-indo-7b-Q3_K_L.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q3_K_L.gguf) | 3644.97 MB |
| Q3_K_M | [sarahlintang-mistral-indo-7b-Q3_K_M.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q3_K_M.gguf) | 3355.97 MB |
| Q3_K_S | [sarahlintang-mistral-indo-7b-Q3_K_S.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q3_K_S.gguf) | 3017.97 MB |
| Q4_K_M | [sarahlintang-mistral-indo-7b-Q4_K_M.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q4_K_M.gguf) | 4166.07 MB |
| Q4_K_S | [sarahlintang-mistral-indo-7b-Q4_K_S.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q4_K_S.gguf) | 3948.57 MB |
| Q5_K_M | [sarahlintang-mistral-indo-7b-Q5_K_M.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q5_K_M.gguf) | 4893.69 MB |
| Q5_K_S | [sarahlintang-mistral-indo-7b-Q5_K_S.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q5_K_S.gguf) | 4766.19 MB |
| Q6_K | [sarahlintang-mistral-indo-7b-Q6_K.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q6_K.gguf) | 5666.80 MB |
| Q8_0 | [sarahlintang-mistral-indo-7b-Q8_0.gguf](https://huggingface.co/featherless-ai-quants/sarahlintang-mistral-indo-7b-GGUF/blob/main/sarahlintang-mistral-indo-7b-Q8_0.gguf) | 7339.34 MB |
---
## ⚡ Powered by [Featherless AI](https://featherless.ai)
### Key Features
- 🔥 **Instant Hosting** - Deploy any Llama model on HuggingFace instantly
- 🛠️ **Zero Infrastructure** - No server setup or maintenance required
- 📚 **Vast Compatibility** - Support for 2400+ models and counting
- 💎 **Affordable Pricing** - Starting at just $10/month
---
**Links:**
[Get Started](https://featherless.ai) | [Documentation](https://featherless.ai/docs) | [Models](https://featherless.ai/models) |