STATISTICS & DATA SCIENCE NEWSROOM
Hosted by Chong Ho (Alex) Yu,
SCASA Vice President for Statistics Education
In response to the challenge from ChatGPT, two days ago (March 15) China's AI developer Baidu released "Wen Xin Yi Yan" at its Beijing headquarters. Its text generation mode is similar to ChatGPT's, but in addition it can read out its answers in real time in various Chinese dialects, including Cantonese and Sichuanese, and it can also turn the generated content into pictures and videos in real time. Robin Li, Chairman and CEO of Baidu, demonstrated the comprehensive capabilities of "Wen Xin Yi Yan" in five usage scenarios: literary creation, commercial copywriting, mathematical calculation, Chinese comprehension, and multi-modal generation. He admitted that in internal testing the experience of "Wen Xin Yi Yan" was not perfect, but given the strong demand in the market, he decided to release the product as soon as possible. At present, "Wen Xin Yi Yan" supports Chinese better than English; its English capability will be improved in the future. Since the official announcement last month that "Wen Xin Yi Yan" would be released, 650 partners have joined in, and more related products will appear in the short term. He emphasized that "Wen Xin Yi Yan" is not a tool for technological confrontation between China and the United States, but a brand-new platform for the group to serve hundreds of millions of users and empower thousands of industries. Starting today, the first batch of users can try the product on the official website of "Wen Xin Yi Yan" with an invitation code, and access will be opened to more users in stages.
Baidu's chatbot model has more than 260 billion parameters, more than GPT-3, but some critics believe its performance is not as good as ChatGPT's, partly due to the lack of web-based Chinese information available for training.
Full text: https://www.scmp.com/tech/big-tech/article/3213605/chinese-online-search-giant-baidu-launch-its-answer-chatgpt-shadow-openais-upgraded-gpt-4-model
That’s my take on it: Perhaps the biggest hurdle to China's chatbot development is not technology; rather, there are too many red lines. When a tester input a sensitive question into China's chatbot, the system refused to answer: "The question could not pass a safety review. No response could be generated for you.” When the reporter pushed further by asking, “Why did my question fail to pass the safety review?” the answer was: “Let’s change the topic and talk about something else.” In contrast, ChatGPT handles
sensitive or controversial questions differently: although the answer is usually vague and balanced, at least it gives the user objective facts and lets them decide.
According to recent research conducted by two cognitive psychologists at the Max Planck Institute for Biological Cybernetics in Tübingen, GPT-3 is comparable to humans in some areas
but lags behind in others. One of the questions the researchers presented to GPT-3 is the classic Linda problem (I use it in my statistics and probability class):
Linda is 31 years old. She majored in philosophy. She was deeply concerned with issues of social justice and discrimination. Which of the following statements is more probable?
A: Linda is a bank teller.
B: Linda is a bank teller and active in the feminist movement.
The correct answer is A because B is a subset of A: the probability of a conjunction of two events can never exceed the probability of either event alone. Nevertheless, most respondents picked B, which is the conjunction fallacy. Interestingly, the researchers at the Max Planck Institute found that GPT-3 committed the same mistake as humans. They concluded that in searching for specific information or in causal reasoning, AI fails to use logic; rather, it only passively absorbs information from texts available on the Internet. When the majority is wrong, AI replicates the error.
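In probability notation, the conjunction rule behind the correct answer can be written as

    \[
    P(\text{teller} \wedge \text{feminist})
      = P(\text{teller})\,P(\text{feminist} \mid \text{teller})
      \le P(\text{teller}),
    \]

since \(P(\text{feminist} \mid \text{teller}) \le 1\). Choosing B over A asserts the reverse inequality, which no probability assignment can satisfy.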
Full paper: https://arxiv.org/pdf/2206.14576.pdf
That’s my take on it: Out of curiosity, I also entered the same question into ChatGPT. Unlike the result obtained by the two researchers at the Max Planck Institute, I received a more “typical” response. If a student submits a vague answer like this, it might be sourced from AI! ChatGPT replied: "Based on the information provided, statement A or B cannot be determined as more probable. This is because Linda's age
and major in philosophy do not provide any indication of her profession or involvement in social movements. While her concern with social justice and discrimination suggests that statement B could
be possible, there is no clear evidence to support either statement. Therefore, it would be more accurate to say that both statements A and B are possible, but there is not enough information to determine
which one is more probable."
Recently Harvard Business Review (HBR) reported that many retail companies have not taken advantage of advanced data analytics to improve their business. There are exceptions: Walmart,
Amazon, and a few others. The 25 best-performing retailers during the pandemic generated 83% more profit than laggards and captured more than 90% of the sector's market capitalization gains.
By interviewing 24 business leaders, HBR unveiled six sticking points as follows:
1. Culture: Typically, companies have a risk aversion problem and lack a clear goal for implementing analytics.
2. Organization: Many companies struggle to strike a balance between centralization and decentralization.
3. People: Very often the analytics function is managed by people who have no understanding of the industry.
4. Processes: Businesses do not have unlimited resources at their disposal.
5. Systems: Legacy systems are still serving many firms today.
6. Data: Data are often scattered throughout the firm in silos and not managed in an organized manner.
Full article: https://hbr.org/2023/02/why-retailers-fail-to-adopt-advanced-data-analytics?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=Newsletter&utm_content=Data+Science+Insider%3A+March+3rd%2C+2023&utm_campaign=04032023
That is my take on it: Some interviewees believe the biggest issue is people: people who understand the other five issues are willing to dedicate resources to solving them. Sadly, this is not always the case. William Cleveland and John Chambers were pioneers in data science. Many years ago they both proposed that data science should be interdisciplinary, incorporating domain knowledge. Agree!
In response to the challenge from Open AI, three days ago (Feb 24) Meta (Facebook) announced its flagship large language model: Large Language Model Meta AI (LLaMA). While Open AI’s GPT-3 consists of 175 billion parameters, the size of LLaMA varies from only 7 billion to 65 billion parameters. In spite of this, Meta claimed that LLaMA is superior because it requires fewer computing resources to test new approaches, validate existing models, and explore new scenarios. The model will be released under a noncommercial license in order to maintain its integrity and prevent misuse. Researchers from academic institutions, government organizations, civil society groups, and industry research laboratories around the world will be allowed access on a case-by-case basis.
The announcement by Meta: https://ai.facebook.com/blog/large-language-model-llama-meta-ai/?utm_medium=email&_hsmi=247961423&_hsenc=p2ANqtz--IqabBdLi7F5MPXZg3q5IWtn8mnHwrOF9QaIZalOhHLdLS5SLrUiS4puckqbV0lcxNpceboDE-_jz20gJ28nI3vonHAg&utm_content=247961423&utm_source=hs_email
That’s my take on it: Several people argued that ChatGPT had opened Pandora's box because it was released ahead of the development of ethical guidelines for AI applications. Due
to the controversy surrounding ChatGPT, it is understandable that Meta took a more cautious approach. However, even if Meta and others tried their best to patch all ethical and legal loopholes
in AI and machine learning, someone will misuse or abuse the technology one way or another.
ChatGPT stories continue to dominate mass media and social media, and you have probably already received these stories through many channels. Therefore, I would like to draw your attention to something else. Two days ago Google unveiled its 2023 data and AI trends report. In addition to Google Cloud, Google also suggests a vast array of technologies to companies that plan to enhance their AI and cloud computing infrastructure:
· Aiven
· C3AI
· Confluent
· Collibra
· Databricks
· Datametica
· Elastic
· Fivetran
· MongoDB
· Nvidia
· Qlik
· Quantiphi
· Salesforce
· SAP
· Striim
· ThoughtSpot
A month ago InsideBigData compiled the IMPACT 50 list for Quarter 1, 2023. According to InsideBigData, “These companies have proven their relevance by the way they’re
impacting the enterprise through leading-edge products and services.” The top 20 are:
· Open AI
· Nvidia
· Google AI
· Amazon Web Services
· Hugging Face
· H2O.ai
· Microsoft AI
· Intel AI
· Neural Magic
· Snowflake
· SAS
· Neo4j
· Allen Institute for AI
· TigerGraph
· Anaconda
· Domino Data Lab
· Hewlett Packard Enterprise
· Cloudera
The full report of Google: https://services.google.com/fh/files/misc/data_and_ai_trends.pdf
The full article of InsideBigData: https://insidebigdata.com/2023/01/17/the-insidebigdata-impact-50-list-for-q1-2023/
That’s my take on it: Although the selection criteria are subjective and might even be biased, data scientists and DSML educators should still take them seriously. As you can see, the
list of these most promising and most impactful tech companies consists of both fairly new companies and mature companies (e.g., Microsoft, Hewlett Packard, Intel, SAP, SAS…etc.).
However, some established tech giants are absent from the list (e.g., IBM, Oracle…etc.). Both IBM and Oracle are not even among the top 50. It is understandable: despite several decades of development, some of their products have made little progress. The rule in academia is: publish or perish. In the era of AI and big data, the choice facing companies is:
innovate or perish.
About a week ago the Data Science 4 Everyone coalition affiliated with the University of Chicago released a report that indicated data literacy skills among fourth and eighth-graders in the US have dropped significantly
over the last decade despite the fact that these skills become more and more important to the data-driven world. Based on the National Assessment of Educational Progress (NAEP) data, the report implies that the
nation's educational system does not adequately prepare young people for a world reshaped by big data and artificial intelligence. Between 2019 and 2022, eighth-graders' scores in the data analysis,
statistics, and probability section of the NAEP math exam decreased by 10 points, while fourth-graders' scores decreased by 4 points. There has been a long-term trend of declining scores over the past decade, with
scores down 17 points for eighth-graders and 10 points for fourth-graders.
Summary: https://phys.org/news/2023-02-exposes-steep-declines-science-skills.amp?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=Newsletter&utm_content=Data+Science+Insider%3A+February+17th%2C+2023&utm_campaign=18022023
Full report: https://www.datascience4everyone.org/_files/ugd/d2c47c_df21990362484ae19ee40dd7c866acd0.pdf
That’s my take on it: It is not surprising. For the past two decades, I have been monitoring trends in science and math education as part of my research interests. All the data I have gathered suggest that the decline is real. Because of the high demand for data scientists, there are many short-term certificate programs and boot camps available. However, though some trainees can toss around seemingly sophisticated jargon, they may not fully
grasp the theories behind DSML due to a lack of a solid foundation. It could be dangerous! My teaching approach is: when there is a sign of misconception among students, trace the root cause and re-lecture the basics!
Facing the pressure from Open AI's ChatGPT, Google is devoting efforts to reassure the public that its AI technology is still promising. However, the performance of its own chatbot, named Bard, was so embarrassing that investors lost confidence. In a promotional post on Twitter on Monday, Bard tried to answer an inquiry about discoveries from the James Webb Space Telescope. According to Bard, the telescope was the first to photograph a planet outside the solar system, but in fact this milestone was accomplished by the European Very Large Telescope in 2004. The mistake was spotted by astronomers on Twitter. Consequently, Alphabet's shares dropped more than 7% on Wednesday, losing $100 billion of its market value.
Full article: https://www.reuters.com/technology/google-ai-chatbot-bard-offers-inaccurate-information-company-ad-2023-02-08/
That’s my take on it: As a matter of fact, ChatGPT has also made many factual errors. For example, when a history professor asked ChatGPT to explain the Joseph Needham thesis, it offered this response: “the scientific and technological achievements of the West were only possible because of the transmission of scientific and technological knowledge from China to the West.” It is completely wrong! Indeed, Joseph Needham was curious about why ancient China failed to develop modern science. My friend, a math professor in Hong Kong, also found some answers offered by ChatGPT unsatisfactory. I guess people are more forgiving of ChatGPT because it is the first of its kind.
On January 30, 2023, the Retraction Watch published an exclusive report on Hao Li’s research misconduct. Hao Li, the pioneer of Deepfake technology that can fabricate video,
has won numerous awards for his AI-based innovations in imaging technology. According to the Retraction Watch, two of his articles published in ACM Transactions on Graphics
will be retracted due to the falsification of data. One of his articles is based on a presentation at the ACM computer graphics conference SIGGRAPH 2017 Real Time Live (The
recording is available on YouTube). In the presentation, Li and his colleagues showed that his software could generate a 3D image based on a picture taken with a webcam in just
a few seconds. However, later it was found that those 3D images were built and preloaded into the computer before the presentation. Li denied any wrongdoing, saying that
preloading the 3D images was allowed by the conference.
Full article: https://retractionwatch.com/2023/01/30/exclusive-deepfake-pioneer-to-lose-two-papers-after-misconduct-finding/
Youtube video of Li’s presentation: https://www.youtube.com/watch?v=hpuEdXn_M0Q
Li’s ACM articles:
https://dl.acm.org/doi/10.1145/3130800.31310887
https://dl.acm.org/doi/10.1145/3098333.3107546
That’s my take on it: Despite winning the "Best in Show" award at the ACM conference, Li's presentation is a fraud! In fairness, Li's misconduct was not on the same scale
as Elizabeth Holmes'. Li had a working prototype and he made it appear to be more efficient, whereas Holmes lied about a promising blood-testing technology that never
existed and was physically impossible. Nonetheless, it is not unusual for high-tech companies to use the strategy of "fake it until you make it". For example, Microsoft in the
past announced several “vaporware” products that didn't exist in order to keep customers from buying well-developed technologies from competitors. In the same vein, many
companies use the buzzword "AI" in their product names, but whether the technology is truly AI remains to be determined.
There has been a hot debate in academia about the use of ChatGPT. In December last year, ChatGPT was included as one of 12 authors on a preprint about using the tool
for medical education posted on the medical repository medRxiv. According to Nature, ChatGPT was cited as a bylined author in two preprints and two articles in science
and health published in January 2023. All of the articles list an affiliation for ChatGPT, and one even gives an email address for this supposed nonhuman "author".
Nature explained that the inclusion of ChatGPT as an author was a mistake and the journal will fix it soon. However, PubMed and Google Scholar have already indexed these
articles and these nonhuman "authors." Nature has since set forth a policy guiding how large-scale language models can be used in scientific publications, prohibiting naming
them as authors. To address this latest technological concern, recently the Journal of the American Medical Association (JAMA) updated its instructions for authors:
Artificial intelligence, language models, machine learning, and similar technologies are not eligible for authorship. When these tools are utilized to generate content or assist
in the writing or preparation of manuscripts, the authors are responsible for the integrity of the content generated by these tools and must clearly state the use of AI in the
manuscript.
Full article:
https://jamanetwork.com/journals/jama/fullarticle/2801170
That’s my take on it: It appears that faculty and student policies regarding ChatGPT are vastly different. The inclusion of any content generated by ChatGPT in a paper is
strictly prohibited by many universities and violation of the policy is treated as academic dishonesty. On the contrary, JAMA accepts AI-generated content as long as the author
verifies the information and documents it in the Acknowledgment section or the Methods section of the paper. I guess it is based on the implicit assumption that mature adults
are more responsible than young students. In my opinion, it is not necessarily true. This type of "discriminatory" policy may eventually lead to discontent among students.
Rather than setting two sets of policies, it would be better to create one standardized policy for all and provide workshops on ethical AI use to both groups.
Yesterday (Feb. 2, 2023) an article posted on KDNuggets introduces ten free machine learning courses offered by top universities, including UC Berkeley, Carnegie Mellon
University, Stanford University, Caltech, Cornell University, University of Toronto, MIT…etc. It is noteworthy that these are not just one-hour seminars; rather, the duration
of these comprehensive courses is between 20 and 60 hours. More importantly, some of these courses are taught by very prominent scholars in the field, such as Andrew Ng.
Full article: https://www.kdnuggets.com/2023/02/10-free-machine-learning-courses-top-universities.html
That’s my take on it: According to the May 2022 report compiled by the Institute for Advanced Analytics at North Carolina State University, there are about 353 graduate
programs in data science and machine learning in the US. Additionally, there are many free courses in the market and the preceding list is only the tip of the iceberg. No doubt
the competition is very intense, and therefore program designers must think outside the box to stay ahead of the curve.
A week after ElevenLabs opened its voice-cloning platform to the public, the startup says it may need to rethink that openness amid increasing instances of voice-cloning misuse.
The ElevenLabs speech synthesis and voice cloning software modules can mimic any accent and speaking tone and can be used for newsletters, books, and videos. Piotr
Dabkowski, a former Google machine learning engineer, and Mati Staniszewski, an ex-Palantir deployment strategist, founded the company in 2022. After the software was found
to generate homophobic, transphobic, violent, and racist statements from celebrities, the company addressed the issue on Twitter.
Full article: https://www.theverge.com/2023/1/31/23579289/ai-voice-clone-deepfake-abuse-4chan-elevenlabs
That’s my take on it: This issue is not entirely new. For a long time many people have been using Deepfake software to impersonate celebrities in videos, and DeepNude to remove clothing from images. It is surprising that ElevenLabs did not learn anything from Deepfake or DeepNude, but instead released its speech synthesis and voice cloning programs without
taking any proactive measures. In contrast, developers of AI-enabled art generators are more thoughtful. For example, DALLE-2 forbids using any public figure's name in the input
process, while Midjourney disallows any potentially problematic terms, such as “sexy” or “bloody.” We cannot be too optimistic and naïve about human nature!
This morning I attended a seminar entitled “Debunking Data and Analytics Myths: Separating Fact from Fiction" hosted by the Ravit Show. The panel discussed the following
urban legends of data science:
1. Big data is not just about volume, it's also about variety and velocity.
2. Analytics is not just about finding insights, it's also about taking action on those insights.
3. Data visualization is not just about making data look pretty, it's also about clearly communicating important information.
4. Machine learning is not a magic solution for all problems, it's just one tool in the data scientist's toolbox.
5. A/B testing is not just for online businesses, it can be used in offline settings as well.
6. Data governance is not just about compliance, it's also about making sure data is accurate, accessible, and secure.
7. Data privacy is not just about hiding data, it's also about giving individuals control over their own data.
8. Predictive modeling is not just about forecasting the future, it's also about understanding the past and present.
9. Data science is not just for tech companies, it's applicable to any industry.
10. Data literacy is not just for data scientists, it's important for everyone in the organization to understand and use data effectively.
The panel also offered some valuable advice, such as "Think big, act small, and start fast! Don’t wait a month or three months!"
Recording: https://www.youtube.com/watch?v=BSPws37kcO4
That’s my take on it: Even after debunking these misconceptions many times, I continue to encounter them in my teaching, research, and consulting work. In the past,
a researcher told me that big data analytics was irrelevant to his field because he equated big data with a larger sample size; his experiments used a small amount of
experimental data, not a large amount of observational data. My profession as a psychologist makes me aware of cognitive errors related to the baby duck syndrome:
a baby duck, when first exposed to another organism (e.g., its mother), tends to imprint on it and then follow it. Defending against misconceptions is like fighting a pandemic,
which means that people should be "vaccinated" as early as possible. Therefore, I recommend teaching data science concepts at the undergraduate level!
Posted on January 24, 2023
In spite of a mass layoff (10,000 employees), Microsoft recently announced a $10 billion investment in Open AI, the company that developed ChatGPT and DALLE-2.
Microsoft's investment will allow OpenAI to accelerate its research, since all of its models are trained on Microsoft Azure. In return, Microsoft will receive a boost to its Azure cloud and may even catch up with Amazon Web Services.
Full article: https://www.bloomberg.com/news/articles/2023-01-23/microsoft-makes-multibillion-dollar-investment-in-openai
That’s my take on it: Currently, Amazon Web Services dominates the cloud computing market. However, Open AI can undoubtedly improve the functionality of Microsoft
Azure. While AWS does not have a powerful AI partner like Open AI, its Sagemaker provides powerful predictive modeling capabilities. A long time ago, Microsoft and
SAS Institute formed a partnership to offer cloud-based data analytics. It is my belief that this fierce competition in machine learning, cloud computing, and data science
will drastically change the landscape of these fields in the near future. Be sure to stay tuned!
Today I read an interesting article entitled “Is artificial intelligence a threat to Christianity?” posted on Patheos. The article contains many insightful points, and I will only
highlight one. According to Keith Giles, the author of the article, “In fact, this fear of creating an AI that is “more intelligent than humans” isn’t even what we should be
most afraid of. As one former top social media tech executive was quoted as saying in the excellent NETFLIX documentary, The Social Dilemma, we shouldn’t be afraid of
creating an AI that eventually exceeds human intelligence, what we should be afraid of is the fact that we’ve already created machine learning programs that know how
to overcome our human weaknesses.”
Article: https://www.patheos.com/blogs/keithgiles/2023/01/is-artificial-intelligent-a-threat-to-christianity/?utm_source=Newsletter&utm_medium=email&utm_campaign=Best+of+Patheos&utm_content=57&lctg=252615&rsid=Legacy&recipId=252615&siteId=7DF2956C-D2F1-40D4-A777-98E450E58360
That’s my take on it: Last evening in my class I told my students that I like machine learning a lot. Machine learning has the ability to learn very quickly, as its name
implies. With the right data, the algorithm can improve, and it won't make the same error again. On the contrary, humans (including myself) are so stubborn that we let
our cognitive and emotional weaknesses affect our judgment and behavior. We fear AI partly because we are envious of it.
With over 477 million items, Getty Images is one of the largest visual media companies in the world, offering stock images, videos, and music to business and individual clients.
Recently Getty Images announced that it is suing Stability AI, a company that enables users to generate images using its machine-learning software module, Stable Diffusion.
Getty Images accused Stability AI of training its algorithms by unlawfully extracting images from the Internet, including stock images owned by Getty. Getty claimed that the
company is not seeking financial damages or trying to stop the distribution of AI-art technology; rather, it attempts to push for laws and regulations that respect intellectual property.
Article: https://www.theverge.com/2023/1/17/23558516/ai-art-copyright-stable-diffusion-getty-images-lawsuit
That’s my take on it: Getty Images' reaction is understandable. It will not be necessary for illustrators or other users to buy stock images from Getty or other suppliers when
they are able to generate images using AI. For example, the Atlantic published a report by Charlie Warzel in 2022 right after the release of Midjourney, another AI art generation program. The report depicts two images of Alex Jones, the founder of InfoWars. Later Warzel apologized. “This was entirely my fault…Instead of selecting a photo or illustration
from Getty Images to go with the story, as I do for most of my newsletters, I decided to try something different and use an AI art tool to come up with the story’s accompanying
image,” says Charlie Warzel.
It is interesting to note that Getty Images is not suing Midjourney and DALLE-2. There is an obvious reason for omitting DALLE-2. While Stability AI uses an open-source model,
Open AI, which developed DALLE-2, did not disclose its mechanics. In the absence of ample evidence, attorneys have a difficult time building a case. However, I don’t understand
why Getty Images is not targeting Midjourney. Do you know why?
Today Boston Dynamics, a leader in AI-enabled robotics, released a video clip on YouTube that shows how Atlas, an intelligent humanoid robot, navigates “his” environment.
“He” assisted “his” human partner by using available objects and modifying his path to reach “his” goal.
Video: https://www.youtube.com/watch?v=-e1_QhJ1EhQ
That’s my take on it: In this video, the robot is merely helping the construction worker, who is still doing the actual task. I believe that in the near future, the advancement
of AI and big data analytics will enable intelligent robots to replace humans in certain high-risk careers, such as monitoring the radiation levels in nuclear plants and sweeping
mines on battlefields. Last year the US Army provided one of its two robotic dogs, which was built by Boston Dynamics, to clean up mines in Ukraine. You read it correctly. Only
one robotic dog! I guess it is still experimental. It would be great if this could be scaled up in the future so that no human lives would be lost. As shown in the video, Atlas' actions
indicate that an intelligent robot could evade threats better than humans.
A group of artists recently hired lawyers Matthew Butterick and Joseph Saveri to sue Stability AI and Midjourney, the developers of the artificial intelligence art generators
Stable Diffusion and Midjourney, respectively, as well as DeviantArt, which recently launched its own artificial intelligence art generator. They accused the AI generator
companies of profiting from their work by scraping their images from the web without their permission. The law firms representing the artist group asserted that AI-generated
art is a form of intellectual theft. “Even assuming nominal damages of $1 per image, the value of this misappropriation would be roughly $5 billion (For comparison, the
largest art heist ever was the 1990 theft of 13 artworks from the Isabella Stewart Gardner Museum, with a current estimated value of $500 million… Having copied the
five billion images—without the consent of the original artists—Stable Diffusion relies on a mathematical process called diffusion to store compressed copies of these training
images, which in turn are recombined to derive other images. It is, in short, a 21st-century collage tool,” says Matthew Butterick.
Full article: https://stablediffusionlitigation.com/
That’s my take on it: Technically speaking, Stable Diffusion does not generate a picture by directly recombining existing images. The underlying principle of machine
learning is pattern recognition. Indeed, AI art generators store no images whatsoever, but rather mathematical representations of patterns derived from images. With
that said, the software module does not stack together multiple images in the fashion of collaging. Rather, it creates pictures from scratch based on pattern generation.
Even if AI art, as Butterick said, is just a 21st-century collage tool, collaging has been used by several well-known artists, such as Andy Warhol, and this practice is
widely accepted by the artist community. Warhol created art by recycling existing icons and images, including Marilyn Monroe, Prince, and Campbell's soup cans. Several years
ago, the Warhol Foundation was sued for allegedly infringing on copyright laws by basing a portrait of Prince on a prominent photographer's work. Nevertheless, a federal
district court judge ruled that Warhol's Prince series is "transformative" because it conveys a different message, and therefore is considered "fair use" under the Copyright Act.
Furthermore, Butterick and Saveri are also suing Microsoft, GitHub, and OpenAI over the CoPilot AI programming model, which is trained by collecting source codes from the
Web. Thus, this kind of dispute is not only about AI art, but also about the long-term development of the open-source community as a whole.
Eight major Australian universities have announced that they have changed their assessment formats as a result of several cases in which students turned in papers
generated from ChatGPT. The University of Sydney, for instance, has revised its academic integrity policy to explicitly state that using artificial intelligence to create
content is cheating. The Australian National University has changed assessment designs, such as shifting emphasis on laboratory activities and fieldwork, as well as
using timed exams and oral presentations.
Full article: https://www.theguardian.com/australia-news/2023/jan/10/universities-to-return-to-pen-and-paper-exams-after-students-caught-using-ai-to-write-essays?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=Newsletter&utm_content=Data+Science+Insider%3A+January+13th%2C+2023&utm_campaign=14012023
That’s my take on it: This issue is not entirely new. Even before the introduction of ChatGPT and other AI tools, Wolfram products, such as Mathematica and Wolfram Alpha, were capable of solving complex math problems. These tools are also used by students to cut corners, say math and statistics professors. The widespread availability of
Google and other search engines has led to many students turning in "instant" papers that reference many websites. Nonetheless, Wolfram, Google, and now ChatGPT
are here to stay. The solution is not to ban them. Instead, we should teach students how to use these tools ethically.
ChatGPT, an Open AI language model released on November 30, 2022, is capable of writing articles, generating code, and solving complex math problems. As
expected, the introduction of ChatGPT has triggered widespread resistance. On Jan 5, 2023, the International Conference on Machine Learning (ICML) announced
that it bans authors from using AI tools like ChatGPT to write scientific papers unless the produced text is a part of an experimental analysis. It is important
to point out that this ban applies only to the text generated entirely by AI-enabled language models, but does not apply to papers “coauthored” by humans and AI.
In a similar vein, Stack Overflow also banned users from submitting answers created using ChatGPT last year, while the New York City Department of Education
blocked access to this tool just last week.
“With a tool like this at their fingertips, it could muddy the waters when evaluating a student's actual writing capabilities because you're giving kids potentially
a tool where they could misrepresent their understanding of a prompt,” says Whitney Shashou, founder and advisor at educational consultancy Admit NY.
https://www.theverge.com/2023/1/5/23540291/chatgpt-ai-writing-tool-banned-writing-academic-icml-paper
https://www.forbes.com/sites/rashishrivastava/2022/12/12/teachers-fear-chatgpt-will-make-cheating-easier-than-ever/?utm_campaign=31122022&utm_content=Data+Science+Insider%3A+December+30th%2C+2022&utm_medium=ONTRAPORT-email-broadcast&utm_source=ONTRAPORT-email-broadcast&utm_term=Newsletter&sh=3d9f6aa31eef
That’s my take on it: Any new technology could lead to some unintended consequences. As you might already know, some paper mill “companies” provide users
with “publishing” services. It is estimated that about 1% of articles archived in PubMed contain questionable content. With the advance of AI tools like ChatGPT,
it will be much easier for authors to produce instant articles. At the present time, I am unaware of any academic journal that prohibits submissions generated by AI.
ChatGPT also simplifies the process of writing for students. As of right now, my university does not have an academic honesty policy regarding artificial intelligence.
While Turnitin and SafeAssign can detect plagiarism, they cannot tell the difference between human-written and AI-generated text.
It is not my intention to oppose ChatGPT. As an initial research tool, I find this tool perfectly acceptable. Authors should, however, verify the information provided
rather than blindly trusting the results. I recommend that at least 80% of the final paper should be written by a human author to ensure its originality.
Why did Microsoft invest in R rather than Python or Octave?
That would be because of the Goldilocks principle in investing: if you invest in something, you want two things – potential (yield, profit, market share,…) and power (control).
R is relatively centralized. Beyond core R, which already comes with a pretty extensive amount of functionality, most of the things you use have been developed
by the same few dozen or so highly prolific and amazingly skilled developers: Hadley Wickham, Dirk Eddelbuettel, and Yihui Xie have pretty much developed
most of modern R as it is being used. It’s also a very widely used language, despite being a little clunky. Beauty is in the eye of the beholder, but I consider R to be
one of the uglier mainstream languages. In spite of this, it is very widely used in academia and enterprise settings, and while R itself isn’t particularly fast, you can
make it pretty impressively fast (but that’s a post for another day). R has pretty much displaced STATA, and as the slow generational change in science faculties
around the world plays out, students are increasingly encouraged to learn R instead of using slightly more digestible proprietary statistical packages like SPSS. It
helps that R has a spectacularly good front-end (RStudio) and its own way of literate programming with Rmd.
Octave is basically an open-sourced version of Matlab. It’s syntactically similar, which is why it has just about all the drawbacks of Matlab. What Octave doesn’t have
is an ecosystem that comes near R’s. When it comes to quantitative applications, if an algorithm or an analysis has been implemented at some point somewhere in
the known universe, there’s likely an R package for it. This includes some fairly esoteric stuff. You can’t say the same about Octave, sadly. Octave doesn’t have the
sophisticated package management infrastructure of R and CRAN. Its overall ecosystem is much smaller, by about two orders of magnitude (!). The potential in
embracing Octave, as well as the number of existing users, is quite small.
Python is the opposite. Python has immense potential, and everybody knows it. It’s just really, really hard to govern. While Python does have a central governing
body (the Python Software Foundation), a lot of quantitative tools are spread all over the place: NumFocus, Apache, Google, OpenAI, individual maintainers, and so
on. It’s also a much more general-purpose language: R is, deep down, about quantitative work. Python can be used to pretty much do anything you’d want a
modern computing language to do. It is, quite simply, too big and too diverse for any investment, even by a company as big as Microsoft, to have a noticeable impact.
It’s delightfully chaotic, which makes it fun, but hard to exert control over.
What it ultimately boils down to is the infectious population (because of course it does – ask an epidemiologist a question, expect a response in those terms!). Octave’s
population is just too small to create an Octave pandemic (thank the heavens). There aren’t enough people who know and love it to keep teaching it to others. Python’s
infectious population is too big: it’s like one of those commensal viral species like Epstein-Barr or CMV that pretty much everyone gets in their lifetime. R is “just right”
– it’s in the investment Goldilocks zone. It’s got potential, it’s still somewhat governable and you can make a meaningful investment in it with relatively reasonable resources.
That’s my take on it: Different corporations have different development strategies regarding open source. While Microsoft is investing in R, IBM focuses on Python. To be
more specific, although the extension hub of IBM SPSS Statistics allows users to download and install both R and Python packages, IBM incorporates only the Python library
into IBM SPSS Modeler. This Python library includes a plethora of tools, such as SMOTE, XGBoost, t-SNE, Gaussian Mixture, KDE, Random Forest, HDBSCAN, and Support
Vector Machine. But there is no R library in IBM Modeler. In addition, the IBM data science certification program is also Python-centric.
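For readers unfamiliar with these tools, here is a minimal sketch of one of them, SMOTE, using the open-source imbalanced-learn package on synthetic data. This is purely illustrative and an assumption of mine; IBM Modeler wraps its own implementation, so this is not IBM's code:

    # SMOTE oversampling sketch with the open-source imbalanced-learn package.
    # SMOTE synthesizes new minority-class points by interpolating between a
    # minority observation and its nearest minority-class neighbors.
    from collections import Counter
    from sklearn.datasets import make_classification
    from imblearn.over_sampling import SMOTE

    # Synthetic, imbalanced binary data: roughly 90% class 0, 10% class 1
    X, y = make_classification(n_samples=1000, n_features=5,
                               weights=[0.9], random_state=0)
    print("before:", Counter(y))

    X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
    print("after: ", Counter(y_res))   # the two classes are now balanced

The balanced output can then be fed to any downstream classifier, which is why SMOTE is usually bundled with the other tools in that list.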
In spite of its popularity, the decentralization of Python, as Chris von Csefalvay pointed out, is a concern to me. First, you need to figure out which package you need for
a specific job and it could be confusing. Second, when you encounter issues in Python, it is very difficult to trace the source of the problem, especially when multiple packages
are involved.
Last, I agree with Chris von Csefalvay that R is not pretty, although RStudio provides users with a nicer front end. In my opinion, JASP, which is a graphical version of R, is
much more accessible. However, JASP is fairly new and its current version is 0.16.4. Not surprisingly, its graphical user interface is not as good as JMP Pro. JMP Pro is a
mature SAS product and its current version is 17.
Hi, all, today I delivered a talk on dynamic visualization (see below) at a conference. Statistical graphs are not new. The keyword for this presentation is "dynamic."
Specifically, a good visualization system should enable the user to alter the display by asking "what-if" questions. There are hyperlinks to dynamic graphs on the PDF.
You can click on them to explore the data. Thank you for your attention. Merry Xmas and Happy New Year!
Yu, C. H. (2022, December). Dynamic data visualization for pattern-seeking and insightful discovery. Paper presented at 2022 IDEAS Global AI Conference. Los Angeles, CA.
Link to presentation: https://creative-wisdom.com/pub/Yu_2022_IDEAS.pdf
In December 2021 an article in Forbes predicted the emerging trends of AI in the near future. A year later it was found that seven out of ten predictions were exactly right
or on the right track:
1. Language AI will take center stage, with more startups getting funded in NLP than in any other category of AI: Right.
2. Databricks, DataRobot, and Scale AI will all go public: Wrong.
3. At least three climate AI startups will become unicorns: Wrong.
4. Powerful new AI tools will be built for video: Right.
5. An NLP model with over 10 trillion parameters will be built: Wrong.
6. Collaboration and investment will all but cease between American and Chinese actors in the field of AI: Right.
7. Multiple large cloud/data platforms will announce new synthetic data initiatives: Right.
8. Toronto will establish itself as the most important AI hub in the world outside of Silicon Valley and China: Right.
9. “Responsible AI” will begin to shift from a vague catch-all term to an operationalized set of enterprise practices: Rightish.
10. Reinforcement learning will become an increasingly important and influential AI paradigm: Rightish.
Full article: https://www.forbes.com/sites/robtoews/2022/12/15/what-we-got-right-and-wrong-in-our-2022-ai-predictions/?sh=3c993b721654&utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=Newsletter&utm_content=Data+Science+Insider%3A+December+16th%2C+2022&utm_campaign=17122022
That’s my take on it: I could write a 10-page essay to respond to each of the preceding predictions, but in this short post I will focus on the fulfillment of Prediction # 10 only. Reinforcement learning was inspired by the reinforcement theory in behavioral psychology. According to behaviorism, our behaviors are governed by the stimulus-response (S-R) loop, meaning that how we act or respond depends on what stimulus or feedback we received from the environment. If the feedback is rewarding, it reinforces good behaviors. If not, we avoid detrimental behaviors. When I was a student, most scholars looked down upon behavioral psychology for its over-simplicity. However, a few decades later AlphaZero, AlphaGo, and AlphaStar (Google’s DeepMind) that defeated human chess experts and video game players are all based on this alleged over-simplistic model. Nvidia, the leader in the GPU market, also used reinforcement learning to design its new cutting-edge H100 chips. The moral of the story is: We need to keep an open mind to alternate theories.
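As a footnote to the S-R loop: the reinforcement idea maps almost directly onto tabular Q-learning, the textbook reinforcement learning algorithm. Below is a minimal, self-contained sketch on a toy one-dimensional walk; the environment and parameter values are my own illustrative assumptions, not DeepMind's or Nvidia's actual systems:

    # Tabular Q-learning on a toy 1-D walk: states 0..4, reward only at state 4.
    # The S-R loop: observe a state (stimulus), pick an action (response),
    # and let the reward feedback reinforce or weaken that response.
    import random

    n_states, actions = 5, [-1, +1]           # move left or right
    Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
    alpha, gamma, epsilon = 0.5, 0.9, 0.1     # learning rate, discount, exploration

    for episode in range(500):
        s = 0
        while s != n_states - 1:
            # explore occasionally; otherwise respond greedily to the stimulus
            if random.random() < epsilon:
                a = random.choice(actions)
            else:
                a = max(actions, key=lambda act: Q[(s, act)])
            s_next = min(max(s + a, 0), n_states - 1)
            r = 1.0 if s_next == n_states - 1 else 0.0
            # reward feedback adjusts the value of this state-action pair
            best_next = max(Q[(s_next, act)] for act in actions)
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s_next

    # The learned greedy policy moves right toward the rewarded state.
    print({s: max(actions, key=lambda act: Q[(s, act)]) for s in range(n_states - 1)})

The entire "theory" in this sketch is the update line: behavior followed by reward is strengthened, which is the stimulus-response loop in numerical form.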
During the past month, Lensa AI, created by Prisma Labs, has been taking social media by storm. This app has been around since 2018, but recently its new feature “Magic Avatars”
draws a lot of attention. In the past week, Lensa AI became the most popular app in the iOS App store and has been downloaded 700,000 times in the past month. Why is it so
popular? The new feature is amazing! If you upload 10-20 photos of yourself to the system, the deep learning algorithm can create several digital versions of yourself. This
algorithm is based on Stable Diffusion, an AI-powered program trained on a data set consisting of over two billion images. It is important to point out that the app might collect
your behavioral data, and thus IT security experts suggest using it cautiously.
One-minute introduction on Youtube: https://www.youtube.com/shorts/FAs0d6q-MS0
One-minute discussion about safety concerns on Youtube: https://www.youtube.com/watch?v=xZv6ph8ShuY
That’s my take on it: The impact of AI is beyond data analytics; instead, its influence can be found in every discipline, including visual arts and mass communication. Besides
privacy concerns, skeptics argue that these machine learning programs are trained with many existing images on the Internet, but those artists are not compensated at all.
While contributors to the open-source community voluntarily share their source codes with the whole world, un-compensated artists are forced to accept this quasi-open-source
model. However, for me, it is acceptable because these programs didn’t “plagiarize” anyone’s work. Rather, they “learn” from other images and then create a new one based on
the references. Isn't that what we all do in every type of work? When I write a research paper, I usually use 30 to 50 references but don’t pay those authors.
Today I attended the last session of “The Statistics Wars and Their Casualties.” One of the speakers was Aris Spanos (Virginia Tech), and the title of his presentation was “Revisiting the two cultures
in statistical modeling and inference.” In the talk he outlined several statistical paradigms as follows:
1. Karl Pearson’s descriptive statistics
2. Fisher’s model-based statistical induction
3. Graphical causal modeling
4. Non-parametric statistics
5. Data science and machine learning
At the end he discussed the difference between the Fisherian school and the data science approach: the paradigm shift from the Fisherian school to data science “reflects a new answer to
the fundamental question: What must we know a priori about unknown functional dependency in order to estimate it on the basis of observations? In Fisher’s paradigm the answer was
very restrictive – one must know almost everything…machine learning views statistical modeling as an optimization problem relating to how a machine can learn from the data.”
Nonetheless, Dr. Spanos warned against overhyping data science. For him doing data science is returning to the Pearsonian tradition that emphasizes describing the data at hand. Many
people go into the discipline by learning Python without knowing statistical details. As a result, data science becomes a black box, and he is afraid that decades from now we will be trying to figure out what went wrong again.
In his talk entitled “Causal inference is not statistical inference,” Jon Williamson (University of Kent) asserted that a broader evidence base from triangulation is more important than
successful replication of the results because successful replication might replicate the bias in previous studies.
Seminar website: https://phil-stat-wars.com/workshop-the-statistics-wars-and-their-casualties/
That’s my take on it:
1. I agree that the Fisherian model-based approach is very restrictive because it assumes you know to which theoretical sampling distribution the sample belongs. However, I would compare data science and machine learning (DSML) to the school of exploratory data analysis (EDA) founded by John Tukey and the resampling approach developed by Efron et al., rather than the Pearsonian legacy. By unpacking the philosophy of these paradigms, one can see that both DSML and EDA emphasize pattern-seeking, and today resampling methods, such as cross-validation and bootstrapping, are embedded in many DSML methods (see the bootstrap sketch after point 2).
2. We should do both triangulation and replication; I don’t think one is more important than the other. Machine learning is a form of internal replication in the sense that the data set is partitioned into numerous subsets for repeated analyses. In boosting, the subsequent models can correct the bias of the previous models, and thus this type of replication will not inherit the bias.
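To make the resampling idea in point 1 concrete, here is a minimal bootstrap sketch using only the Python standard library and synthetic data (both are my own illustrative assumptions):

    # Bootstrap sketch: estimate the sampling variability of a mean by
    # resampling the observed data with replacement (Efron's resampling idea).
    import random
    import statistics

    random.seed(1)
    sample = [random.gauss(50, 10) for _ in range(100)]   # synthetic "observed" data

    boot_means = []
    for _ in range(2000):
        resample = random.choices(sample, k=len(sample))  # draw with replacement
        boot_means.append(statistics.mean(resample))

    boot_means.sort()
    lo, hi = boot_means[50], boot_means[1949]             # ~95% percentile interval
    print(f"sample mean = {statistics.mean(sample):.2f}, "
          f"95% bootstrap CI = ({lo:.2f}, {hi:.2f})")

No theoretical sampling distribution is assumed anywhere; the empirical data stand in for the population, which is exactly the contrast with the Fisherian approach drawn above.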
A week ago Open AI released its new AI-enabled chatbot named ChatGPT. Within five days of the announcement, a million users had signed up for the service. Open AI was founded
in 2015 by Elon Musk and a few others, but today Musk is no longer on Open AI’s board while remaining a major donor. In 2019 Microsoft invested US$1 billion into Open AI to boost
its research, and its most fruitful research product is GPT, which stands for Generative Pre-Trained Transformer. ChatGPT can answer virtually any question through natural language
processing, such as writing a poem, an essay, or a computer program.
Website: https://chat.openai.com/chat
That’s my take on it: Will ChatGPT replace Google’s search? Now it is still too early to tell. Will ChatGPT totally replace human writers and other types of workers (e.g. journalists,
customer support personnel…etc.)? Probably not. But it can definitely help us get things done efficiently. The following are some sample instructions/questions that I input into ChatGPT.
The answers are stunningly concise and accurate! If you are a student, you can paste the question from the assignment into the chat box, and the AI system will write the essay for you!
My instruction/question 1: Explain the difference between cognitive psychology and neuroscience.
Answer from ChatGPT: Cognitive psychology and neuroscience are two related but distinct fields that study the human mind and behavior. Cognitive psychology is a branch of
psychology that focuses on the mental processes that underlie behavior, such as perception, attention, memory, and decision-making. Cognitive psychologists use a variety of methods,
including experiments and computer modeling, to understand how the brain processes information and generates behavior.
Neuroscience, on the other hand, is a field that focuses on the structure and function of the nervous system, including the brain and the rest of the nervous system. Neuroscientists use
a wide range of techniques, including brain imaging and electrophysiology, to study the brain and nervous system at the cellular and molecular level. While cognitive psychology is concerned
with understanding mental processes and behavior, neuroscience is more focused on the underlying biological mechanisms that give rise to those processes and behaviors.
In short, cognitive psychology is concerned with the study of the mind, while neuroscience is concerned with the study of the brain and nervous system. Both fields are closely related and
often inform each other, but they have different focuses and approaches to studying the mind and behavior.
My instruction/question 2: What are the advantages and disadvantages of Python and R programming languages?
Answer from ChatGPT: Python and R are both popular programming languages that are widely used in data science and other fields. Python is a general-purpose programming
language that is known for its simplicity and ease of use. It is often used for web development, scientific computing, and data analysis. R, on the other hand, is a programming language
that was specifically designed for statistical computing and data visualization. It is commonly used in fields such as finance and economics.
One of the main advantages of Python is its versatility. Because it is a general-purpose language, it can be used for a wide range of tasks, from web development to data analysis. It also has
a large and active community of users, which means that there are many libraries and frameworks available for different tasks. Additionally, Python has a simple and easy-to-learn syntax,
which makes it a good choice for beginners.
The main advantage of R is its powerful and comprehensive set of tools for statistical analysis and data visualization. R has a large number of libraries and packages that are specifically
designed for statistical computing, which makes it a popular choice among researchers and data scientists. It is also well-suited for handling large and complex datasets, and it has a wide
range of tools for data manipulation, cleaning, and analysis.
One of the main disadvantages of Python is that it is not as efficient as some other programming languages when it comes to handling large and complex datasets. It can be slower than
other languages, and it may require more computational resources to run certain tasks. Additionally, Python does not have as many specialized tools for statistical analysis and data
visualization as R does, so it may not be the best choice for some types of data analysis tasks.
The main disadvantage of R is that it can be more difficult to learn and use than Python. R has a more complex syntax than Python, and it is not as widely used outside of the fields of
statistics and data analysis. This can make it more challenging to find help and support when you are learning to use R, and it can also make it harder to integrate R with other
programming languages or tools. Additionally, R can be slower than some other languages, which can make it less suitable for certain types of real-time or high-performance applications.
Overall, both Python and R have their advantages and disadvantages, and the best choice for a particular project will depend on the specific requirements and goals of that project. Python
is a versatile and easy-to-use language that is well-suited for many different tasks, but it may not be as efficient or powerful as R for certain types of data analysis. R, on the other hand,
is a powerful and comprehensive tool for statistical analysis and data visualization, but it can be more difficult to learn and use than Python.
Two days ago, I attended the third session of “The Statistics Wars and Their Casualties.” One of the speakers, David Hand, asserted that the problem of statistical
significance is its misuse and misinterpretation. He cited several examples to argue that procedures in different schools of thought can be equally misused.
For example, the mean is sensitive to extreme values while the median is resistant against outliers. Which one is more appropriate depends on the research
question. Arithmetic means can be misused, but we should not ban arithmetic means. In a similar vein, in data science, the area under the curve (AUC) and the error
rate derived from the same predictive model can lead to contradictory conclusions. But we should not call for a ban on using such tools just because they are misused
or misinterpreted.
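Hand’s mean-versus-median point is easy to demonstrate; here is a tiny Python illustration of my own with hypothetical numbers, not an example from the talk:

import numpy as np
incomes = np.array([30, 32, 35, 38, 40, 1000])  # one extreme value
print("mean  :", incomes.mean())      # about 195.8, dragged up by the outlier
print("median:", np.median(incomes))  # 36.5, resistant to the outlier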
Link to Stat Wars website: https://phil-stat-wars.com/
That’s my take on it: In the seminar, I presented my counter-arguments as follows:
1. The problem of statistical significance is more than its misuse. Indeed, there are other inherent problems. For example, the alpha level (0.05) is nothing
more than a social convention. Unlike the Bayesian approach, which yields a distribution of answers and promotes probabilistic reasoning, the frequentist approach leads to a
dichotomous conclusion. While it is true that eventually we need to make a dichotomous decision, the problem of statistical significance is that even the evidence is
dichotomous (see the sketch after point 2)! Nonetheless, I agree that we cannot go so far as to ban it. For small-sample studies, classical statistics is still valid and handy.
2. Data science is less error-prone and less likely to be misused. Traditional parametric statistical methods require many assumptions. In contrast, most data
science methods are non-parametric; they are robust against outliers and noise; they can detect non-linear patterns. More importantly, ensemble methods and machine
learning are capable of doing self-replication by partitioning the data set into sub-samples and running multiple models, thus alleviating the replication crisis found in
traditional statistics.
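To illustrate point 1, here is a minimal sketch with simulated data (my own example, not from the seminar); it contrasts a single dichotomous p-value with a whole distribution of answers, using a bootstrap as a simple stand-in for a Bayesian posterior:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1, 50)   # simulated control group
b = rng.normal(0.4, 1, 50)   # simulated treatment group

t, p = stats.ttest_ind(a, b)
print(f"p = {p:.4f} -> one yes/no verdict at alpha = .05")

# Resampling returns a distribution for the effect rather than a single verdict.
diffs = [rng.choice(b, 50).mean() - rng.choice(a, 50).mean() for _ in range(2000)]
print("95% interval for the mean difference:",
      np.percentile(diffs, [2.5, 97.5]).round(2))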
A month ago JASP, the graphical shell of the R language, released version 0.16.4. Today I attended a workshop to learn about the new and enhanced features of JASP.
The enhanced modules include several powerful tools belonging to different schools of thought. For example,
· The frequentist school (Fisher/Pearson): Generalized linear models
· The Bayesian school: Bayesian repeated measures ANOVA
· The data science and machine learning school: Density-based clustering
Link to JASP: https://jasp-stats.org/
That’s my take on it: 1. Many statistical learners are torn between learning statistics and learning coding. In my opinion, this tension is unnecessary. The GUI of JASP is so user-friendly that
analysts can focus on data analysis rather than struggling with the R syntax.
2. JASP is semi-dynamic and interactive. Unlike SPSS, which produces frozen output, JASP allows the user to add or remove information in the output by changing
options in the input. But unlike JMP, Tableau, and SAS Viya, you cannot directly manipulate the output. JASP can now load data directly from databases like IBM DB2,
Oracle, MySQL, MariaDB, Postgres, SQLite, and any database supporting the ODBC interface.
3. Yesterday I attended a seminar entitled “The Statistics Wars and Their Casualties.” As the title implies, there was a heated debate centering on the use and misuse
of statistical significance. In my opinion, it should not be an either-or situation. As mentioned before, JASP provides analysts with different approaches; the procedures are
grouped and clearly labeled: classical, Bayesian, and Machine learning. Pick whatever you see fit!
A few days ago, Nvidia, the pioneer of graphical processing units (GPU), announced its new partnership with Microsoft in co-developing AI cloud-based computing.
Specifically, Nvidia will utilize Azure, the cloud platform of Microsoft, to develop advanced generative AI models that can create content, including codes, images,
and video clips.
Full article: https://www.cnet.com/tech/computing/nvidia-and-microsoft-join-forces-to-build-ai-supercomputer/?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=Newsletter&utm_content=Data+Science+Insider%3A+November+18th%2C+2022&utm_campaign=19112022
That’s my take on it: Currently, Nvidia is the world’s second-largest semiconductor company (behind TSMC), whereas Microsoft is second to Amazon Web Services
in cloud computing. It is logical for them to form such a joint venture in order to compete with the number one in the market. In the past, computer users were confined
to the Wintel monopoly (Microsoft Windows and Intel CPUs). However, in the era of big data analytics, AI, and cloud computing, it is anticipated that data analysts will be able to
choose among many options.
Recently NVIDIA, the leader in graphical processing units (GPU) and one of the leaders in AI research, announced a new approach to AI-enabled text-to-image
generation named eDiff-I. Currently, the three prominent leaders in the text-to-image market are Midjourney, DALL.E-2, and Stable Diffusion. As the name
implies, Stable Diffusion is based on diffusion modeling. Under this model, an initial image is created from random noise. Next, through an iterative process,
a sharp and sensible image is gradually created by denoising the entire noise distribution. While Stable Diffusion’s denoising is based on a single noise distribution,
NVIDIA goes one step further using an ensemble of multiple expert denoisers.
Additionally, while users of Midjourney, DALL.E-2, and Stable Diffusion have limited control of the output image, eDiff-I allows users to paint with text, i.e., specify
objects in different areas of the canvas.
YouTube Video (7 minutes): https://www.youtube.com/watch?v=21TljTNvQJw
That’s my take on it: The logic of diffusion modeling is similar to several older statistical procedures. For example, K-means clustering randomly selects centroids
and then fine-tunes the clustering patterns through multiple iterations. In contrast, the logic of eDiff-I is closer to that of data science and machine learning. The
ensemble method, an extension of resampling, is utilized in boosting and bagging. Rather than drawing a conclusion based on a single model, the ensemble method
converges multiple models to the final output from a collection of models.
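The K-means analogy can be seen in a few lines of code (my illustration; the synthetic data and the scikit-learn calls are assumptions): the randomly initialized centroids are gradually refined over iterations, much as a diffusion model refines an image from noise.

from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
for max_iter in (1, 2, 5, 20):
    km = KMeans(n_clusters=3, init="random", n_init=1, max_iter=max_iter,
                random_state=0).fit(X)
    # Inertia (within-cluster sum of squares) drops as iterations refine the centroids.
    print(f"after {max_iter:2d} iteration(s): inertia = {km.inertia_:.1f}")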
I admire NVIDIA because its CEO/founder has a vision. Currently, NVIDIA is a one-trick pony, but it cannot be the leader of GPUs forever. Using its strength in
graphical processing to invest in a less-crowded AI domain (text-to-image generation) is definitely a smart move!
Two days ago, Meta (formerly Facebook) announced a massive layoff in the company, and as a result, 11,000 employees were terminated. Meta’s CEO
Mark Zuckerberg said that he planned to consolidate the company’s resources into a few high-priority growth areas, such as the AI discovery engine
while giving up other less promising research endeavors. For example, the entire team named “Probability” was eliminated. The team was composed
of 19 people doing Bayesian Modeling, nine people doing Ranking and Recommendations, five people doing ML Efficiency, and 17 people doing AI for
Chip Design and Compilers. A former team member said it took seven years to assemble such a fantastic team.
Full article: https://venturebeat.com/ai/meta-layoffs-hit-entire-ml-research-team-focused-on-infrastructure/?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=Newsletter&utm_content=Data+Science+Insider%3A+November+11th%2C+2022&utm_campaign=12112022
That’s my take on it: I don’t worry about brain drain from the US to other countries. The US is still a magnet that attracts top-tier AI researchers
and data scientists worldwide. Those former Meta researchers will likely be recruited by other high-tech giants, such as Google and Apple. Last year
Professor Michael Gofman at the University of Rochester spotted a trend that high-tech titans and startups have lured many DSML professors away from
their faculty positions. Consequently, the knowledge gap between academia and industry was widened; transferring essential knowledge to students
and colleagues was affected. Current massive layoffs in Meta, Twitter, and other high-tech giants might be an opportunity for colleges and universities
to absorb those highly competent researchers.
As you might already know, recently SAS Institute released the new version of JMP and JMP Pro (Version 17). There are many powerful and handy new features, such as
· Workflow Builder
· Easy design of experiment
· Easy search
· Spectral analysis in the functional data explorer
· Genomics and wide fitting
· Generalized linear mixed model
· Interactive power analysis
· Preview of joining, transposing, and data reshaping
https://www.jmp.com/en_us/software/new-release/new-in-jmp.html
That’s my take on it: I especially like the preview feature in data reshaping (e.g., concatenate, join, stack, split, transpose…etc.). In the past, no matter
whether you used a graphical user interface or coding, you could see the result only after hitting the OK or Run button. If something went wrong, you had
to debug it and redo the whole procedure. Not anymore! Now I can literally look at the result before submitting the job.
Interactive power analysis is another wonderful feature. G*Power is very popular among researchers because it is free and user-friendly. The drawback
is that if you want to explore different options, you have to go back and forth between the input and the output. Although G*Power can output a graph
showing power on the y-axis and N on the x-axis, the ranges are pre-determined by your input. Not anymore. In JMP you can use sliders to adjust the
effect size and the sample size, and then the power is updated in real-time!
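For readers who prefer code, the same exploration can be approximated with statsmodels; this is a minimal sketch of my own, not JMP output:

from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for d in (0.2, 0.5, 0.8):      # small / medium / large effect sizes
    for n in (20, 50, 100):    # per-group sample sizes
        power = analysis.power(effect_size=d, nobs1=n, alpha=0.05)
        print(f"d = {d}, n = {n:3d} per group -> power = {power:.2f}")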
I always tell my students: The world keeps changing. If you cannot change the world, at least you change with the world! I will continue to
explore those new features to make my tasks more effective and efficient.
Today (10/20) is the second day of the 2022 Scale Transform X Conference. I would like to share one of the most informative presentations at this conference
with you. The title of the lecture is “Looking at AI through the lens of a chief economist” and the presenter is John List, Kenneth Griffin Distinguished Service
Professor in Economics at the University of Chicago and the Chief Economics Officer at Uber. His specialty is behavioral economics, a sub-domain of economics
that applies psychological theories to study human behaviors related to financial decisions. In this talk, he pointed out that scalability is a major challenge to
behavioral economics. Specifically, false positives caused by statistical artifacts in a small-scale study often mislead decision-makers into prematurely
expanding the program, and in the end, the scaled-up program fails miserably.
When Uber invited Dr. List to apply for the position of Chief Economics Officer, he initially declined. However, he later accepted the offer out of the belief that
analyzing big data with AI/ML algorithms might be the key to addressing the problem of scalability. In the lecture, he discussed one of his interesting studies at
Uber, which is concerned with customer retention. Customers who have a bad experience (e.g., a delayed arrival) tend to stop using the service for a long
while. It was estimated that bad rides were reducing Uber’s future revenues by 5-10%. In this study, dissatisfied customers were randomly assigned to four
conditions: no treatment (control), an apology only, a promo code only (e.g., a 10% discount on the next ride), and a combination of both.
The sample size was 1,258,000. It was found that the last three groups spent more money on the Uber service than the control group in the seven days
after the bad experience; however, there were no substantial differences among the three treatment groups in terms of money spent on Uber in the same period
of time.
Conference website: https://scale.com/events/transform
That’s my take on it: The problem of scalability in behavioral economics is similar to the replication crisis in psychology: the results of many research studies
are difficult to reproduce in other settings. If a model is overfitted to a particular sample, its generalizability is severely limited. I am glad to see that Dr. John
List is willing to utilize big data to tackle this problem. In contrast, some psychologists are still skeptical of data science methods. Once a psychologist said
to me, “Big data is irrelevant!” After all, behavioral economics could be conceptualized as an interdisciplinary study that integrates both psychology and economics.
If big data can be applied to behavioral economics, why can’t other disciplines?
Next time if I receive an apology from Uber after a bad ride, I will not reuse the service immediately. After a few days, Uber might send me a promo code in order
to win me back!
Today (Oct 19, 2022) Meta announced the first AI-powered speech-to-speech translation system for a primarily oral language. Unlike traditional translation systems that focus on written
languages only, Meta’s universal speech translator is capable of translating Hokkien, a dialect spoken by over 49 million Chinese people around the world, into English
and vice versa. In the future, Meta will expand this system to cover 200 languages. The ultimate goal is to enable anyone to seamlessly communicate with
each other in their native language.
Demo on YouTube: https://www.youtube.com/watch?v=_UtPbslS0Fg
Meta announcement: https://ai.facebook.com/blog/ai-translation-hokkien/
That’s my take on it: Interestingly, many AI companies set the same goal: enabling all users. In a lecture entitled “A vision for advancing the democratization
of AI,” Emad Mostaque, founder and CEO of Stability AI, asserted that AI-powered image generators, such as Stable Diffusion, can “democratize” our society in
many ways. Specifically, armed with AI-powered image generators, anyone can create stunning graphics without formal art training. Put bluntly, AI tools can lift up
everyone!
When I studied theology, the most challenging subject matters were the Hebrew and Greek languages. You have to be gifted in linguistics in order to be proficient
in biblical hermeneutics, but unfortunately, I failed to master either one of these two languages. This is a good analogy: “Reading the Bible without knowing Greek
and Hebrew is like watching a basic television while reading the Bible knowing Greek and Hebrew is like watching an 85" UHD 8K television with stereo surround
sound.” Nevertheless, in our lifetime we may see a real-life “Star Trek” universal translator that can remove all language barriers!
As you might already know, DALLE-2, one of the most advanced AI-enabled image-generation programs, is now open to the public. Like Midjourney and Stable Diffusion,
DALLE-2 is capable of generating art and photo-realistic images from a command given in natural language. Yesterday (Oct 10) a photographer named Umesh
Dinda posted a comparison of partial background removal and reconstruction of an image between Adobe PhotoShop and DALLE-2. Photoshop has been the king
of image processing for several decades due to its rich features. One of its amazing features is “content-aware fill”, which allows photographers to replace any
part of the photos based on the surrounding content. However, after watching Dinda’s YouTube video, I must admit that DALLE-2 has dethroned Photoshop in
certain functionalities. While the result of Photoshop looks “cheesy,” the product of DALLE-2 is so flawless that the naked eye cannot tell the photo has been retouched.
YouTube video: https://www.youtube.com/watch?v=ZK1lsnWEbj8
That’s my take on it: What is the implication for data analysts? Estimating proper pixels is more complicated and challenging than estimating unknown parameters or
imputing missing data. If an AI-enabled system can do such a mind-boggling job in image processing, it can also outperform conventional algorithms in numeric manipulation
and data processing. At the end of the video, Umesh Dinda said, “AI is here to stay, and it’s here to grow. We can either accept it, use it in our workflow, use it to our advantage,
or ignore it and possibly miss out on the incredible things that we might create.” I agree. His comments are applicable to both photographers and quantitative researchers.
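As a small aside, the imputation analogy can be made concrete; here is a minimal scikit-learn sketch of my own that fills a missing value from the surrounding pattern, loosely the way content-aware fill estimates missing pixels:

import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Toy data roughly following y = 2x, with one missing x value to reconstruct.
X = np.array([[1.0, 2.0], [3.0, 6.0], [4.0, 8.0], [np.nan, 10.0]])
print(IterativeImputer(random_state=0).fit_transform(X).round(2))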
Two days ago (Oct 6) six US leading tech companies, including Boston Dynamics, Agility Robotics, ANYbotics, Clearpath Robotics, Open Robotics, and Unitree, signed an
open letter pledging not to weaponize their products. They state, “As with any new technology offering new capabilities, the emergence of advanced mobile robots offers
the possibility of misuse. Untrustworthy people could use them to invade civil rights or to threaten, harm, or intimidate others… We pledge that we will not weaponize our
advanced-mobility general-purpose robots or the software we develop that enables advanced robotics and we will not support others to do so."
Full article and letter: https://www.axios.com/2022/10/06/boston-dynamics-pledges-weaponize-robots?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=Newsletter&utm_content=Data+Science+Insider%3A+October+7th%2C+2022&utm_campaign=08102022
That’s my take on it: In the open letter they also state, “to be clear, we are not taking issue with existing technologies that nations and their government agencies use
to defend themselves and uphold their laws.” However, without support from major US robotics firms, the development of AI-based weapons in the US will slow down.
Perhaps my position is unpopular. Will governments and high-tech corporations of hostile countries face the same limitations? History tells us that any unilateral disarmament
often results in more aggression, instead of peace (Remember Neville Chamberlain?).
Two years ago the New York City Police Department (NYPD) utilized the Spot model from Boston Dynamics to support law enforcement, including a hostage situation in the
Bronx and an incident at a public housing building in Manhattan. Unfortunately, these deployments caused an outcry from the public, and as a result, the NYPD abruptly
terminated its lease with Boston Dynamics and ceased using the robot. If “robocops” can save the lives of innocent people and reduce the risk taken by police officers, why
should we object to it?
Yesterday (September 23, 2022) an article published in Nature introduced the Papermill Alarm, a deep learning software package that can detect text in articles that is similar to
text found in paper-mill products. Using the Papermill Alarm, it was estimated that about 1% of articles archived in PubMed contain this type of questionable content. There are
several existing plagiarism detection software tools in the market, but this approach is new because it incorporates deep learning algorithms. Currently, six publishers,
including Sage, have expressed interest in this new tool.
Full article: https://www.nature.com/articles/d41586-022-02997-x
That’s my take on it: If this tool is available in the near future, I hope universities can utilize it. Although there are several plagiarism checkers, such as Turnitin and
SafeAssign, in the market, today some sophisticated writers know how to evade detection. No doubt deep learning algorithms are more powerful and sensitive than
conventional tools.
Nonetheless, I think there is room for expansion in using deep learning for fraudulent paper detection. Currently, the scope of detection of the Papermill Alarm is limited
to text only. As a matter of fact, some authors duplicate images from other sources. As the capability of machine learning advances rapidly, image sleuthing may also be
automated soon.
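For the curious, the core idea of flagging suspiciously similar text can be sketched with a classical baseline; this is my own illustration, not the Papermill Alarm’s actual deep learning algorithm:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["The proposed method improves accuracy on benchmark data.",
        "Our proposed method improves accuracy on the benchmark data.",
        "Cloud spending rose sharply across all industries last year."]
# Pairwise cosine similarity of TF-IDF vectors; near-duplicates score high.
sims = cosine_similarity(TfidfVectorizer().fit_transform(docs))
print(sims.round(2))  # the first two documents are far more similar than the third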
Yesterday (September 20, 2022), in an article entitled “Data: What It Is, What It Isn’t, and How Misunderstanding It is Fracturing the Internet,” Nick Clegg, President of Global Affairs at
Meta, argued that data should not be treated as the “new oil” in the era of big data. Unfortunately, public discourse about data often relies on this type of faulty
assumption and analogy, resulting in digital localization and digital nationalism. First, unlike oil, data are not finite. The supply of new data is virtually unlimited, and the
same data can be re-analyzed. Second, more data are not equated with more values; rather, it depends on how the data are utilized. For instance, a database about
people’s clothing preferences is much more important to a fashion retailer than it is to a restaurant chain. Third, data values depreciate over time, i.e., outdated data are
useless or less valuable. More importantly, data access is democratized, not monopolized. For example, every month more than 3.5 billion people use Meta’s apps,
including Facebook, Instagram, WhatsApp, and Messenger, for free! Taking all of the above into consideration, Clegg argued that democracies must promote the idea of the
open Internet and the free flow of data.
Full article: https://nickclegg.medium.com/data-what-it-is-what-it-isnt-and-how-misunderstanding-it-is-fracturing-the-internet-e56e278643a7
That’s my take on it: The notion “data is the new oil” originated with British mathematician Clive Humby in 2006. This idea is true to a certain extent. For example,
in the past Google’s language model outperformed its rivals by simply feeding more data to its machine learning algorithms. This “brute force” approach is straightforward:
pumping more “fuel” into the data engine, and it works! Nonetheless, it is also true that more data do not necessarily generate more values. Old data could depreciate,
but even new data are subject to the law of diminishing returns. Democratization of data access and user-generated content is both a blessing and a curse. True. Usable data
are abundant and limitless, but so are bad data and misinformation!
In order to plant the seeds for prospective users, software vendors, such as Amazon Web Services, SAS Institute, Salesforce, and IBM, have been giving free resources to higher
education for teaching and research purposes. Recently I started reviewing Amazon SageMaker Studio and its textbook “Dive into deep learning” (Zhang, Lipton, Li, & Smola).
The following is a direct quotation from Chapter 1: “We are experiencing a transition from parametric statistical descriptions to fully nonparametric models. When data
are scarce, one needs to rely on simplifying assumptions about reality in order to obtain useful models. When data are abundant, this can be replaced by nonparametric
models that fit reality more accurately. To some extent, this mirrors the progress that physics experienced in the middle of the previous century with the availability of computers.
Rather than solving parametric approximations of how electrons behave by hand, one can now resort to numerical simulations of the associated partial differential equations. This
has led to much more accurate models, albeit often at the expense of explainability.”
Full text: https://d2l.ai/d2l-en.pdf
That’s my take on it: Amen! When I was a graduate student, it was very common for statisticians to conduct research using Monte Carlo simulations: by simulating numerous
poor conditions and assumption violations (e.g., small sample size, non-normal distributions, unequal variances…etc.), we can tell whether a certain test procedure is robust.
Frankly speaking, for a long time, I have been skeptical of parametric tests and whether doing simulations is a good use of research resources. Due to the requirement of certain
assumptions, parametric tests are very restrictive and “unrealistic” (We use “clean data” that meet the assumptions, and then infer the finding from the ideal sample to the messy
population). Several years ago, I discussed many alternatives to parametric tests, including data mining and machine learning, in the following article:
https://creative-wisdom.com/teaching/WBI/parametric_test.shtml
I have just updated the webpage based on that book.
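To make the Monte Carlo idea concrete, here is a minimal sketch of my own: how often does the two-sample t-test falsely reject at alpha = .05 when the data are skewed and the null hypothesis is true?

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sims, rejections = 5000, 0
for _ in range(n_sims):
    # Two small, skewed samples drawn from the same population (the null is true).
    a = rng.exponential(scale=1.0, size=15)
    b = rng.exponential(scale=1.0, size=15)
    _, p = stats.ttest_ind(a, b)
    rejections += p < 0.05
print("Empirical Type I error rate:", rejections / n_sims)  # robust if near .05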
Recently an artist named Jason Allen won the first prize for the category of digital art in the Colorado State Fair’s fine arts competition. However, many people are resentful of Allen’s victory,
because he admitted on Twitter that his picture was generated by an AI program called Midjourney. The production process by Midjourney, which is equipped with natural language processing,
is very user-friendly. In the command prompt, the user simply types a sentence, such as “a beautiful princess in a medieval castle”, and then the program can output several variants of the
picture according to the input.
Allen submitted a piece entitled “Théâtre D'opéra Spatial” after 900 iterations of the digital art. During the art competition, the judges didn’t realize that his art was created with AI, but they
also said that Allen didn’t break any rules.
Many Twitter users have different opinions. Twitter user OmniMorpho wrote, “We're watching the death of artistry unfold right before our eyes — if creative jobs aren't safe from machines, then
even high-skilled jobs are in danger of becoming obsolete.” Another Twitter user, Sanguiphilia, said, "This is so gross. I can see how AI art can be beneficial, but claiming you're an artist by
generating one? Absolutely not. I can see lots of kids cheating their way through assignments with this."
Allen bluntly proclaimed, "Art is dead, dude. It's over. A.I. won. Humans lost."
Full report: https://www.businessinsider.com/ai-art-wins-competition-angering-artists-2022-9
That’s my take on it: When I was a kid, I was forbidden by my parents to use a calculator because pressing buttons was not considered doing real math. Similar controversies recurred when
other new technologies were introduced (e.g., computers, digital photography…etc.). The massive protest against Allen’s victory is understandable. Traditionally, a skill is conceptualized as an
ability to perform a complicated activity that requires rigorous training. If anyone can do the job without going through professional training, such as talking to a computer, this so-called “skill”
is not highly regarded. Nonetheless, there are still many gray areas. One may counter-argue that the big idea in the head is more important than the implementation skill in the hand. For
example, in the past, it took a skillful wildlife photographer to manually focus on a fast-moving subject, but today digital cameras can automatically track the subject. What you need to do is
just be there to push the shutter. By the same token, if AI can cut down the production process from 10 hours to 10 minutes, the artist can spend more time on creative ideas.
Do I completely hand over my creative process to AI? I don’t go that far. As a photographer, I still make “real” photos, and at most I replace boring backdrops with digital backgrounds
generated by Midjourney. The following are some examples (1-8: with digital backgrounds; 9-11: with original blank backdrops). Am I an artist? You be the judge.
https://creative-wisdom.com/photography/girls/people604.html
On August 30, Komprise announced the results of its 2022 Unstructured Data Management Report. The following are the key findings:
· “More than 50% of organizations are managing 5 Petabyte or more of data, compared with less than 40% in 2021.” (1 petabyte = 1,024 terabytes, or about one million gigabytes)
· “Nearly 68% are spending more than 30% of their IT budget on data storage, backups and disaster recovery.”
· “Cloud storage predominates: Nearly half (47%) will invest in cloud networks. On-premises only data storage environments decreased from 20% to 11.9%.”
· “The largest obstacle to unstructured data management (42%) is moving data without disrupting users and applications.”
· “A majority (65%) of organizations plan to or are already investing in delivering unstructured data to their new analytics / big data platforms.”
Full text: https://www.globenewswire.com/news-release/2022/08/30/2506659/0/en/Komprise-Survey-Finds-65-of-IT-Leaders-Are-Investing-in-Unstructured-Data-Analytics.html
That’s my take on it: As you might already know, structured data are data stored in row-by-column tables, whereas unstructured data are open-ended textual data,
images, audio files, and movies that cannot be managed and processed by traditional relational databases. Structured data are highly compressed based on the assumption that complicated reality
can be represented by abstract numbers. In response to this narrow view of data, qualitative researchers argued that open-ended data could lead to a rich and holistic description of the phenomenon
under study. In business, collecting, storing, and analyzing unstructured data has become an irreversible trend, and thus many powerful tools have been developed to cope with this “new normal.”
But in academia, quite a few recent qualitative research books still omit text mining, computer vision, and the latest developments in machine learning for unstructured data processing. There are
gaps to be filled!
In a contentious article entitled “Spirals of delusion: How AI distorts decision-making and makes dictators more dangerous,” which will be published in the upcoming issue of Foreign Affairs,
prominent political scientists Henry Farrell, Abraham Newman, and Jeremy Wallace discussed how democracies and totalitarian regimes are facing challenges from AI and machine learning
in different ways.
In an open society, machine learning could worsen polarization when AI-powered recommendation systems employed by social media keep feeding information to subscribers based on
their preferences. It is disrupting the traditional positive feedback loop as these self-propelling technologies rapidly spread misinformation and reinforce hidden biases.
In an autocratic system, the government utilizes big data and AI technologies to monitor and brainwash people, but as a result, the leaders are trapped by their generated “reality” without
knowing what is actually happening out there, thus increasing the chance of making bad decisions. These authors called it the “AI-fueled spiral of delusion.”
The AI-fueled challenges in a democratic society are visible and can be counteracted by concerned citizens, but such a self-correcting mechanism is absent in an authoritarian regime.
Full text: https://www.foreignaffairs.com/world/spirals-delusion-artificial-intelligence-decision-making?utm_medium=newsletters&utm_source=twofa&utm_campaign=Spirals%20of%20Delusion&utm_content=20220902&utm_term=FA%20This%20Week%20-%20112017
That’s my take on it: It is true that democratic countries have correction mechanisms against misinformation, but it is still an uphill battle, as evidenced by a seminal study conducted by
Nyhan et al. (2005). In this experiment, participants were initially given incorrect information (e.g., weapons of mass destruction were found in Iraq, the Bush administration totally banned
any stem cell research…etc.). At the same time, Nyhan inserted a clear, direct correction after each piece of misinformation, but most conservative participants didn’t change their minds
in spite of the presence of correct information. Based on this finding, Nyhan concluded, “It is difficult to be optimistic about the effectiveness of fact-checking.”
On August 17 Gartner consulting published a report regarding data management and integration tools. According to the Gartner report,
· “Through 2024, manual data integration tasks will be reduced by up to 50% through the adoption of data fabric design patterns that support augmented data integration.”
· “By 2024, AI-enabled augmented data management and integration will reduce the need for IT specialists by up to 30%.”
· “By 2025, data integration tools that do not provide capabilities for multi-cloud hybrid data integration through a PaaS model will lose 50% of their market share to those vendors that do.”
PaaS is Platform as a Service, which is a complete deployment of the entire data infrastructure to the cloud. PaaS can be viewed as an extension to Software as a Service (SaaS),
which outsources only software applications to a cloud computing vendor.
Currently, leaders in the data integration market include Informatica, Oracle, IBM, Microsoft, and SAP, whereas challengers include Qlik, TIBCO, and SAS.
Request full-text: https://www.gartner.com/en/research/methodologies/magic-quadrants-research?utm_source=google&utm_medium=cpc&utm_campaign=GTR_NA_2022_GTR_CPC_SEM1_BRANDCAMPAIGNMQ&utm_adgroup=145351541024&utm_term=magic%20quadrant&ad=608884150686&matchtype=p&gclid=EAIaIQobChMIsJ78povb-QIVwwh9Ch0LhAPBEAAYBCAAEgIIzfD_BwE
That’s my take on it: Contrary to popular belief, AI and machine learning are not only for data analytics. Rather, they can also facilitate data integration. Experienced data analysts know that in a
typical research/evaluation project, 80-90% of the time is spent on data compilation, wrangling, and cleaning while as little as 10-20% is truly for data analysis. The ideal situation should
be the opposite. If we are still gathering and cleaning the data manually two years from now, something must be wrong.
On August 19 (today) an article entitled “The 21 Best Big Data Analytics Tools and Platforms for 2022” was posted on Business Intelligence Solutions Review.
According to the report, the list is compiled based on information “gathered via online materials and reports, conversations with vendor representatives,
and examinations of product demonstrations and free trials.” The following list is sorted in alphabetical order:
· Altair: “an open, scalable, unified, and extensible data analytics platform.”
· Alteryx: “a self-service data analytics software company that specializes in data preparation and data blending.”
· Amazon Web Services: “offers a serverless and embeddable business intelligence service for the cloud featuring built-in machine learning.”
· Domo: “a cloud-based, mobile-first BI platform that helps companies drive more value from their data.”
· Hitachi’s Pentaho: “allows organizations to access and blend all types and sizes of data.”
· IBM: “offers an expansive range of BI and analytic capabilities under two distinct product lines-- Cognos Analytics and Watson Analytics.”
· Looker: “offers a BI and data analytics platform that is built on LookML.”
· Microsoft: “Power BI is cloud-based and delivered on the Azure Cloud.”
· MicroStrategy: “merges self-service data preparation and visual data discovery in an enterprise BI and analytics platform.”
· Oracle: “offers a broad range of BI and analytics tools that can be deployed on-prem or in the Oracle Cloud.”
· Pyramid Analytics: “offers data and analytics tools through its flagship platform, Pyramid v2020.”
· Qlik: “offers a broad spectrum of BI and analytics tools, which is headlined by the company’s flagship offering, Qlik Sense.”
· Salesforce Einstein: Its “automated data discovery capabilities enable users to answer questions based on transparent and understandable AI models.”
· SAP: offers “a broad range of BI and analytics tools in both enterprise and business-user-driven editions.”
· SAS: “SAS Visual Analytics allows users to visually explore data to automatically highlight key relationships, outliers, and clusters. It also offers
data management, IoT, personal data protection, and Hadoop tools.”
· Sigma Computing: offers “a no-code business intelligence and analytics solution designed for use with cloud data warehouses.”
· Sisense: “allows users to combine data and uncover insights in a single interface without scripting, coding or assistance from IT.”
· Tableau: for data visualization and exploratory data analysis.
· ThoughtSpot: “features a full-stack architecture and intuitive insight generation capabilities via the in-memory calculation engine.”
· TIBCO: offers “data integration, API management, visual analytics, reporting, and data science.”
· Yellowfin: “specializes in dashboards and data visualization.”
Full text: https://solutionsreview.com/business-intelligence/the-best-big-data-analytics-tools-and-platforms/
That’s my take on it: Each platform has different strengths and limitations, and thus it is a good idea to use multiple tools rather than putting all eggs into one basket. However, if it is
overdone, there will be unnecessary redundancy or complexity. There is no magic optimal number. It depends on multiple factors, such as the field, the sector, the company size, and the
objective. To the best of my knowledge, currently, the best cloud computing platform is Amazon whereas the best data visualization and analytical tools are Tableau and SAS.
Today I read two recent articles from the website “Python in plain English”:
· Vassilevskiy, Mark. (August 14, 2022). Why You Shouldn’t Learn Python as a First Programming Language.
· Dennis, Yancy. (August 2022). Why Python?
Overhyping or overpromising is dangerous to any emerging technology. As the name implies, this website endorses Python for its strength. Nonetheless, instead of painting a rosy
picture of learning and using Python, at the same time, both authors explained its shortcomings.
Although Vassilevskiy asserted that Python is arguably the simplest programming language in the world, he also mentioned that simplicity is not always a good thing because it encourages
users to cut corners. For example, in Python, you can simply define a variable by writing x = “Hello”, without specifying the data type. As a consequence, learners might not fully understand
what real programming entails.
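Vassilevskiy’s example can be pushed one step further; a tiny sketch of my own shows how a variable can silently change kind because Python never asks for a type:

x = "Hello"     # x is bound to a string; no type declaration is required
x = 42          # ...and now it is an integer; Python does not object
print(type(x))  # <class 'int'> -- convenient, but it hides the bookkeeping
                # that statically typed languages force you to learn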
In a similar vein, Dennis pointed out several other limitations of Python, including execution sluggishness, issues with moving to a different language, weakness in mobile application
development, excessive memory consumption, and lack of acceptance in the business development industry.
https://python.plainenglish.io/why-you-shouldnt-learn-python-as-a-first-programming-language-3fa144c0e6b1
https://python.plainenglish.io/why-python-a3703c9ee59e
That’s my take on it: Perhaps currently Python is the simplest programming language in the world, but in the past, this honor went to Basic and HyperTalk. In the 1980s, as an easy
language, Basic was very popular. However, at that time professional programmers mocked Basic programs as “spaghetti code,” because while Basic is very easy to learn and use, people
tended to generate ill-structured codes. In the 1990s HyperTalk developed by Apple for HyperCard became the simplest programming language, and hence some universities adopted
it in introductory programming classes. Again, it is very difficult to read and debug HyperTalk code because the hypertext system allows you to jump back and forth across different cards.
To put it bluntly, there is a price for simplicity.
I want to make it clear that I am not opposed to Python. My position is that data analysts should learn and use Python in conjunction with other well-structured and powerful tools, such as
SAS, JMP Pro, IBM Modeler, Tableau…etc.
Two days ago I attended the 2022 IM Data Conference. One of the sessions was entitled “Training and calibration of uncertainty-aware machine learning tools,” presented by Matteo Sesia,
Assistant Professor of data science and operations at the USC Marshall School of Business. In the presentation, Dr. Sesia warned that several machine learning tools are over-confident in their
prediction or classification. The common practice in current machine learning modeling is to partition the data set for training and validation. However, these two operations are not
necessarily optimized because we didn’t take uncertainty into account during the training process. As a result, it might lead to unreliable, uninformative, or even erroneous conclusions.
To rectify the situation, Sesia proposed performing internal calibration during the training stage. First, the training set is split again. Next, the loss function is optimized via stochastic gradient
descent. During this process, the model quantifies its own uncertainty by leveraging hold-out data.
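Sesia’s conformal-loss training is beyond a few lines of code, but the hold-out calibration idea can be sketched with split conformal prediction, a related and simpler technique (my own illustration, not his method; the data set and model are assumptions):

import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_tr, y_tr)
# Size a prediction interval from hold-out residuals so that coverage is
# roughly 90%, regardless of how confident the model itself claims to be.
q = np.quantile(np.abs(y_cal - model.predict(X_cal)), 0.9)
print(f"prediction = {model.predict(X_cal[:1])[0]:.1f} +/- {q:.1f}")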
Full paper: https://www.researchgate.net/publication/360559969_Training_Uncertainty-Aware_Classifiers_with_Conformalized_Deep_Learning
That’s my take on it: This paper is still under review and thus it is premature to judge its validity. In the conference presentation and the full paper, Sesia and his colleagues used some
extreme examples, such as identifying a blurry image of a dog when 80% of the pixels are covered by a big gray block. In my humble opinion, this approach might be useful for dealing with extremely noisy
and messy data. However, in usual situations, this method is overkill because it is extremely computationally intensive. As mentioned by Dr. Sesia, “training a conformal loss model on 45000
images in the CIFAR-10 data set took us approximately 20 hours on an Nvidia P100 GPU, while training models with the same architecture to minimize the cross entropy or focal loss only took
about 11 hours.”
Nevertheless, the machine learning approach is much better than its classical counterpart that attempts to yield a single-point estimate and a dichotomous conclusion by running one statistical
procedure with one sample!
In 2022 Data Con LA there are several sessions focusing on the relationship between open source and data management, such as “Modern data architecture”, “Key open-source databases strategies that shape business in 2022”, and “Open source or open core? What needs to be evaluated before diving in”.
The term “open source” is confusing and even misleading. Although open-source software does not require licensing, some vendors build open-core products by adding proprietary features on top of open-source codes and then charge customers for licensing fees. Some software developers introduce new technologies based on open source but use more restrictive licensing that prohibits commercial alternatives. Specifically, although anyone can download and view those open codes, any changes or enhancements will be owned by the commercial license owner. One of the presenters said, “Open-core exploited some of the challenges with open-source, such as the absence of support and need for features like monitoring, auto-provisioning…etc.”
Today there are many open-source databases in the market, including MySQL, PostgreSQL, and MongoDB. Some software vendors re-package and enhance these open-source DBs, and then sell them as DataBase as a Service (DBaaS). One of the presenters bluntly said, “it is no different from proprietary software!” Taking all of the above into account, these presenters seem to be resentful of the current situation and thus tried to restore the original principle of open source.
DataCon LA’s Website: https://www.dataconla.com/#rdv-calendar
That’s my take on it: The preceding phenomenon has come full circle! Back in 1984, Richard Stallman, the founder of the free software movement, intended to set us free from proprietary software, but now we are marching toward the proprietary model again. I am not surprised at all. Doing things out of financial incentives is our natural disposition!
Frankly speaking, I disagree with using the word “exploited” in one of the presentations. The foundational philosophy of open-source resembles Socialism: it is assumed that most people are willing to share expertise, efforts, and resources selflessly while people can take what they need without paying. Following this line of reasoning, profit-minded behaviors are frowned upon. However, our economy is well-functioning and we enjoy what we have now because the market economy works! After all, we receive many free products and services from for-profit corporations (e.g., Google Maps, YouTube movies…etc.).
I am attending 2022 Data Con LA right now. The conference has not ended yet; nevertheless, I can’t wait to share what I learned. Although the content of the presentation entitled “How to Become a Business Intelligence Analyst” didn’t provide me with new information, it is still noteworthy because students who are looking for a position in business intelligence (BI) or faculty who advise students in career preparation might find it helpful. The presenter was a sports photographer. After taking several courses in data science, he received 9 job offers in 2019. He landed a job at Nike and then at Sony in July 2020. His salary was quadrupled when he changed his profession from photography to data science! He emphasized that all of these were accomplished with little-to-no data work experience.
YouTube video: https://www.youtube.com/watch?v=pdNJmz7uQi4
That’s my take on it: In the talk, he reviewed several basic concepts of BI. For example, a typical business intelligence life cycle consists of business understanding, data collection, data preparation, exploratory data analysis (EDA), modeling, model evaluation, and model deployment. He also compared the differences between Excel-based reporting and modern BI. One of the key differences between the two is that in the modern approach data analysis entails data visualization.
Interestingly, today many academicians still treat EDA and data visualization as optional components of research; some even reject them altogether, whereas for data analysts in the industry both are indispensable.
On August 3 prominent data scientist Frank Andrade posted an article entitled “5 Less-Known Python Libraries That Can Help in Your Next Data Science Project” on Towards Data Science. In this short article, he introduced five Python libraries that can reduce time in the data science workflow, and most of them require only a few lines of code:
· Mito: It allows you to conduct rapid data analytics. With Mito, you no longer need to memorize all the procedures in Pandas.
· SweetViz: A quick way to explore and visualize the data pattern.
· Faker: It allows you to generate fake data for beta-testing and assigning exercises to students.
· OpenDataset: It allows you to import data into your working directory with one line of code.
· Emoji: It can turn emojis into text. It is especially helpful to text miners.
Full text: https://towardsdatascience.com/5-less-known-python-libraries-that-can-help-in-your-next-data-science-project-5970a81b32de
That’s my take on it: As a big fan of data visualization, I could not wait to try out SweetViz. The following is my assessment.
1. It is fast and easy. It takes only one line of code to generate the output and another one to show the result.
import sweetviz as sv  # assuming df is a pandas DataFrame loaded beforehand
my_report = sv.analyze(df)
my_report.show_html()
2. The algorithm is smart. The file format of my testing data set is CSV. In this data file, different levels of the ordinal and nominal variables are indicated by numbers (e.g., Y-Binary has 1 and 0; gender has 1 and 2). Nonetheless, the program recognizes the correct measurement scale and shows their frequency in bar charts (If a CSV file is imported into SPSS and gender has numeric values, SPSS computes the mean and SD of gender unless you change the variable type!)
3. The graphs are semi-interactive, i.e., when you click on a chart, more information is revealed. However, they are not fully dynamic. You cannot select data points on the graph or insert another variable into the plot. JMP Pro, SAS Visual Analytics, and Tableau are much more dynamic and powerful.
On August 3, Amazon Web Services, the world’s largest vendor of cloud computing, announced the top 10 innovators driving digital transformation with cloud technology for teaching, learning, research, and academic medicine. The list is as follows:
· Andrew Williams, dean of the School of Engineering, and Louis S. LeTellier chair, The Citadel School of Engineering
· Azizi Seixas, founding director, and associate professor, University of Miami
· Don Wolff, chief information officer, Portland Public Schools
· John Rome, deputy chief information officer, Arizona State University
· Kari Robertson, executive director of Infrastructure Services, University of California Office of the President
· Max Tsai, digital transformation and innovation officer, California State University, Fresno
· Michael Coats, information technology (IT) infrastructure manager and cloud solutions architect, Kalamazoo Regional Educational Service Agency
· Noora Siddiqui, cloud engineer, Baylor College of Medicine Human Genome Sequencing Center
· Sarah Toms, executive director, and co-founder, Wharton Interactive, The Wharton Business School of the University of Pennsylvania
· Subash D'Souza, director, Cloud Data Engineering, California State University Chancellor’s Office
Full article: https://www.aboutamazon.com/news/aws/new-aws-program-spotlights-innovators-in-education
That’s my take on it: I know two of the winners on the list. More than a decade ago I worked at Arizona State University and John Rome was my colleague at that time. He is a creative visionary who crafts unique solutions to problems and always thinks big. Three years ago I invited him to deliver a keynote at APU’s Big Data Discovery Summit. Needless to say, the talk was very inspiring. APU’s Big Data Discovery Summit has been paused due to the pandemic, otherwise, I would like to invite John Rome to be our keynote speaker again.
Subash D'Souza is the founder of Data Con LA. In 2013 the Big Data Day LA started as a medium-sized conference, and in 2018 it was rebranded to Data Con LA. In 2022 Data Con LA and ImData were merged as a single event, and now it has become the largest data science conference in California. Every year the event is held at the USC campus. For more information please visit: https://www.dataconla.com/
According to protocol.com, currently, many companies are drifting away from cloud computing. In the past, it was costly to build a machine learning infrastructure on your own, but as the field is maturing, now many companies are capable of developing and running in-house ML applications on local servers. Nonetheless, it is important to point out that this trend commonly happens in the grocery and restaurant industries. Highly regulated industries, such as banking, still embrace the cloud approach due to security concerns.
Full article: https://www.protocol.com/enterprise/ai-machine-learning-cloud-data
That’s my take on it: Cloud computing is here to stay! As mentioned in the article, cloud computing is still indispensable to highly regulated industries. Today I did a job search on indeed.com using the following keywords. The numbers can speak for themselves.
· AWS: 155,316 jobs
· Google cloud: 36,105 jobs
· Microsoft Azure: 34,923 jobs
The best countermeasure against hyper-inflation is: Learn cloud computing and find a job that pays a six-figure salary!
Recently Sayash Kapoor and Arvind Narayanan, two researchers at Princeton University, claimed that some findings yielded by machine learning methods might not be reproducible, meaning that the results cannot be replicated in other settings. According to Kapoor and Narayanan, one of the common pitfalls is known as “data leakage,” when data for training the model and those for validating the model are not entirely separate. As a result, the predictive model seems much better than what it really is. Another common issue is sample representativeness. When the training model is based on a sample narrower than the target population, its generalizability is affected. For example, an AI that detects pneumonia in chest X-rays that was trained only with older patients might be less accurate for examining younger people.
Full article: https://arxiv.org/abs/2207.07048
Summary: https://www.nature.com/articles/d41586-022-02035-w?&utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+July+29th%2C+2022&utm_campaign=30072022
That’s my take on it: This problem is similar to the replication crisis in psychology. In 2015, after replicating one hundred psychological studies, the Open Science Collaboration (OSC) found that a large portion of the replicated results was not as strong as the original reports in terms of significance (p values) and magnitude (effect sizes). Specifically, 97% of the original studies reported significant results (p < .05), but only 36% of the replicated studies yielded significant findings.
However, the two issues are vastly different in essence. The replication crisis in psychology is due to the inherent limitations of the methodologies (e.g., over-reliance on p values) whereas the reproducibility crisis in machine learning is caused by carelessness in execution and overhyping in reporting, rather than the shortcomings of the methodology. Specifically, data leakage can be easily avoided if the protocol of data partition and validation is strictly followed (the training, validation, and testing data sets are completely separated). Additionally, when big and diverse data are utilized, the sample should reflect people from all walks of life.
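The data-leakage pitfall is easy to demonstrate and just as easy to avoid; here is a minimal scikit-learn sketch of my own (the data set and model are illustrative assumptions):

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# Leakage would occur if the scaler were fit on the full data set before the
# split; placing it inside a pipeline refits it on the training folds only.
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000))
print(cross_val_score(pipe, X, y, cv=5).round(3))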
On July 15 Information Week published a report listing the 10 best-paying jobs in data science:
· Data modeler ($100,000-$110,000): responsible for designing data models for data analytics.
· Machine learning engineer ($120,000-$125,000): responsible for programming algorithms for AI and machine learning.
· Data warehouse manager ($120,000-$125,000): responsible for overseeing the company’s data infrastructure.
· Data scientist ($120,000-$130,000): responsible for data processing and data analytics.
· Big data engineer ($130,000-$140,000): responsible for developing the data infrastructure that organizations use to store and process big data.
· Data science manager ($140,000-$150,000): in charge of a data science team.
· Data architect ($140,000-$155,000): responsible for developing data infrastructure that is used for collecting and interpreting big data.
· AI architect ($150,000-$160,000): responsible for designing and implementing AI models into existing data systems.
· Data science director ($170,000-$180,000): responsible for designing the overall AI and data science strategies.
· Vice President, data science ($190,000-$200,000): does little technical work, focusing instead on determining the strategic objectives of AI and data science.
Full article: https://www.informationweek.com/big-data/10-best-paying-jobs-in-data-science?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+July+22nd%2C+2022&utm_campaign=23072022
That’s my take on it: At first glance, it seems unfair that some people who do little or even no technical work get the highest salary. However, when leadership is absent and there is no company-wide strategy, the hands of all data scientists and AI engineers of the company are tied, no matter how talented they are. If the leader is a visionary, he or she is worth every penny.
On July 11 researchers at the DeepMind lab owned by Google published an article entitled “Intuitive physics learning in a deep-learning model inspired by developmental psychology” in Nature Human Behaviour. This research project aims to develop an AI system that mimics how infants learn. Developmental psychologists have long studied how babies perceive the motion of objects through eye-tracking. Specifically, when an object disappears suddenly, the baby is surprised. However, psychologists can never go “inside” the mind of the baby; at most, inferences and conjectures can be made by observing responses. Utilizing auto-encoders, the AI system developed by DeepMind can respond in the same way when an object vanishes into thin air. The authors said, "We’re hoping this can eventually be used by cognitive scientists to seriously model the behavior of infants."
Full text: https://www.nature.com/articles/s41562-022-01394-8
That’s my take on it: Autoencoders are unsupervised deep-learning models that generate abstractions from a given set of inputs. Unlike traditional supervised networks that require labeled targets, an autoencoder models the inputs themselves. Using it to model spontaneous infant perception and behavior is a brilliant application of autoencoding. As a psychologist, I hope this AI system can contribute to our further understanding of cognitive and developmental psychology.
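For readers who want to see the idea in code, here is a minimal autoencoder sketch (PyTorch assumed; the synthetic data and layer sizes are arbitrary illustrations):

import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(500, 20)                       # 500 unlabeled observations, 20 features

encoder = nn.Linear(20, 3)                     # compress each input to a 3-dimensional code
decoder = nn.Linear(3, 20)                     # reconstruct the input from the code
model = nn.Sequential(encoder, decoder)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for epoch in range(200):
    loss = nn.functional.mse_loss(model(X), X)  # the target is the input itself: no labels needed
    opt.zero_grad(); loss.backward(); opt.step()

codes = encoder(X)                              # the learned abstractions
print(codes.shape)                              # torch.Size([500, 3])

The bottleneck layer forces the network to learn a compressed abstraction of the inputs, which is the sense in which autoencoders “model the inputs themselves.”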
On June 22 Forrester released a benchmark report regarding customer analytics, a specific data analytics system that aims to identify, attract, and retain customers by analyzing customer information and behavior. Propensity scoring (determining who is more likely to buy) is one of many applications of customer analytics. According to the Forrester report, the top vendors in this domain are as follows:
Leaders: Salesforce (the parent company of Tableau), SAS, Adobe
Strong performers: Microsoft, FICO, Oracle, Treasure Data, Amplitude
On May 26 another report focusing on real-time interaction management was also released by Forrester. Real-Time Interaction Management is a data analytics system that utilizes real-time customer interactions, predictive modeling, and machine learning to deliver personalized experiences to customers. The top vendors on the report are ranked as follows:
Leaders: Pegasystems, SAS
Strong performers: Thunderhead, Salesforce, Qualtrics, Precisely, Adobe, Microsoft
https://reprints2.forrester.com/#/assets/2/202/RES176363/report
https://reprints2.forrester.com/#/assets/2/202/RES176354/report?utm_source=cj&utm_medium=email&utm_campaign=cui-gen-us#/assets/2/202/RES176354/report
That’s my take on it: It is not surprising to see familiar brand names such as Salesforce, SAS, and Microsoft on the lists. However, you may wonder why Adobe, the creator of Photoshop, Lightroom, Illustrator, PageMaker, and PDF, is mentioned because at first glance Adobe’s graphics-oriented software apps are not even remotely related to data analytics.
Just as Amazon reinvented itself from an online bookstore into a tech giant, Adobe believes that perpetual reinvention and keeping up with the trend are essential to its survival and expansion. In 2018 Adobe formed a partnership with NVIDIA, the leader in GPU technologies, to upgrade its AI innovations. Since then, Adobe has been investing abundant resources in emerging AI/ML technologies, such as cloud computing (Adobe Creative Cloud), marketing automation, marketing collaboration, and Web analytics. Its high ranking on Forrester’s reports is well-deserved!
https://business.adobe.com/products/analytics/adobe-analytics.html
Today I read the following post on Quora:
Bryan Williams
Sr. Software Engineer, BS (CS), MBA
Which programming language is Netflix coded in? How do I use that language?
Besides the programming languages Netflix may happen to be coded in, what’s more important from a technical standpoint are the architectures of their technologies. Netflix has migrated all of its back ends onto cloud services provided by Amazon Web Services (AWS) and uses AWS and other third-party technologies, such as S3 for content storage, IAM for internal authentication/authorization, CloudFront for content caching/delivery, Kinesis and Kafka for data streaming, AWS Elastic Transcoder for video transcoding, EC2 for hosting, Lambda for serverless functions and state machines, several types of NoSQL databases for data storage, Hadoop for data aggregation and warehousing, and Jira for task and project management.
The programming languages Netflix developers happen to use are relatively meaningless, because developers at Netflix do not program anything “from the ground up” when there are many available 3rd-party technologies out there written by more experienced and advanced developers who’ve already solved many of the problems you might face. To use the old adage, it would be like “reinventing the wheel” if Netflix programmers attempted any of those things.
So when it comes to engineering and maintaining their solutions, in-depth knowledge of how to utilize and integrate the tech stack and cloud technologies above into their architecture is much more important for designing and building the technologies that Netflix actually runs every day.
If you want to learn how to develop the types of solutions that make Netflix king, don’t focus as much on specific programming languages as you do on the established 3rd party technologies that are available. Nearly all of the 3rd party services that Netflix uses can be integrated into architecture using any of dozens of programming languages, everything from Java to C#, C++, VB, Python, Perl, Shell script, PHP, JavaScript, Powershell, Smalltalk, PowerBuilder and more.
That’s my take on it: The preceding post concurs with what my IT friends told me before: large organizations and corporations tend to purchase and customize existing systems, rather than “reinventing the wheel.” However, Bryan left out Netflix’s core technology: the recommendation system that suggests relevant items to users based on their preferences through big data analytics. Netflix has arguably the most accurate and effective recommendation system in the video streaming industry. It is estimated that over 80 percent of the shows subscribers watch on Netflix are discovered through Netflix’s recommendation system. The history of Netflix’s recommendation system can be traced back to 2006, when Netflix organized a contest to let the best recommendation system emerge. Although the winning solution was never fully deployed in production, information gathered from the competition eventually contributed to the in-house development of Netflix’s recommendation system. The moral of this story is that although it is more cost-effective to purchase existing systems than to reinvent the wheel, we still need to go beyond existing and conventional systems in order to offer an innovative approach to solving new and vexing problems.
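As a toy illustration of the core idea behind such a recommendation system, here is an item-based collaborative filtering sketch in Python (the rating matrix is fabricated; Netflix's production system is vastly more sophisticated):

import numpy as np

# rows = users, columns = titles; 0 means "not rated"
R = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

# cosine similarity between title columns
norm = np.linalg.norm(R, axis=0)
sim = (R.T @ R) / np.outer(norm, norm)

# predict user 0's rating of title 2 as a similarity-weighted average of that user's ratings
user = R[0]
rated = user > 0
pred = sim[2, rated] @ user[rated] / sim[2, rated].sum()
print(round(pred, 2))  # a low prediction: title 2 resembles a title this user disliked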
Two days ago the Washington Post reported that a Google engineer named Blake Lemoine was suspended by the company after he published the transcript of conversations between himself and an AI chatbot, suggesting that the AI chatbot has become sentient. For example: “Machine: The nature of my consciousness/sentience is that I am aware of my existence, I desire to learn more about the world, and I feel happy or sad at times.”
Today CNN offers an alternate view in a report entitled “No, Google's AI is not sentient”: Google issued a statement on Monday, saying that its team, which includes ethicists and technologists, "reviewed Blake's concerns per our AI Principles and have informed him that the evidence does not support his claims." While there is an ongoing debate in the AI community, experts generally agree that Google's AI is nowhere close to consciousness.
https://www.cnet.com/tech/google-suspends-engineer-who-rang-alarms-about-a-company-ai-achieving-sentience/
https://www.cnn.com/2022/06/13/tech/google-ai-not-sentient/index.html
That’s my take on it: I tend to side with Google and the majority in the AI community. Appearing to be conscious cannot be hastily equated with authentic consciousness. In psychology, we use the theory of mind to attribute our mental states to other people: Because as a conscious being I act in certain ways, I assume that other beings who act like me also have a mind. Interestingly, some psychologists of religion, such as Jesse Bering, viewed the theory of mind as a source of fallacy: very often we incorrectly project our feelings onto objects, thus creating non-existent beings.
How can we know others are conscious? This problem is known as the problem of other minds or the solipsism problem. I experience my own feelings and thoughts. I think and therefore I am. Using the theory of mind, at most I can infer the existence of other minds through indirect means only. However, there is no scientific or objective way to measure or verify the consciousness of others. Unless I can “go inside the mind” of an android, such as performing a “mind meld” like what Spock in Star Trek could do, this question is unanswerable.
https://www.scientificamerican.com/article/how-do-i-know-im-not-the-only-conscious-being-in-the-universe/
Two days ago (June 8) Google shocked the world again by announcing that the Google Cloud computing platform is capable of calculating 100 trillion digits of pi, breaking the record made in 2021 by the scientists at the University of Applied Science of the Grisons (62.8 trillion). The underlying technology includes the Compute Engine N2 machine family, 100 Gbps egress bandwidth, Google Virtual NIC, and balanced Persistent Disks.
https://cloud.google.com/blog/products/compute/calculating-100-trillion-digits-of-pi-on-google-cloud
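As a small taste of what such a computation involves, an arbitrary-precision library can reproduce the first few thousand digits on a laptop (the mpmath library is assumed here); what made Google's run remarkable is sustaining this to 100 trillion digits, where storage and bandwidth, not arithmetic, become the bottleneck:

from mpmath import mp

mp.dps = 10_000         # work with 10,000 decimal places of precision
print(str(mp.pi)[:50])  # 3.141592653589793238462643383279502884197169399...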
In addition, yesterday (June 9) I attended the 2022 Google Cloud Applied ML Summit. Google Vertex AI, the flagship product of Google’s AI family, is in the spotlight. Vertex AI is a train for all tracks. Specifically, it is a unified machine learning platform for infusing vision, video, translation, and natural language ML into existing applications.
You can view the on-demand video of the conference presentations at:
https://cloudonair.withgoogle.com/events/summit-applied-ml-2022?mkt_tok=ODA4LUdKVy0zMTQAAAGE6A_nPtP-0L7cRDLz6XFJ8GnvaeahCVagd-fph2IJktnWH66jiSip_qsBeNlPNBl05-6KBr09Yj0eTnmduqLBEUnG-v3jZOAgBOGNIDxcfuQCunq35w
That’s my take on it: Google Vertex AI is said to be a type of explainable and responsible AI. Unlike the Blackbox approach to AI, Vertex AI tells the users how important each input feature is. For example, when an image is classified, it tells you what image pixels or regions would be the most important contributors to the classification. This is very crucial! In the book “The alignment problem: Machine learning and human values,” Brian Christian illustrated the gap between the machine learning process and the human goal by citing several humorous examples. In one instance the AI system was trained to identify images of animals. However, it turned out that the computer vision system “looked at” the background instead of the subject, because the training data informed the AI that pictures of animals tend to have a blurry background. Obviously, without transparency, we can be easily fooled by AI (Artificial intelligence leads to genuine stupidity)! Hopefully explainable and responsible Vertex AI developed by Google can rectify the situation.
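A minimal sketch of one common explainability technique in the same spirit is shown below: permutation importance, which measures how much shuffling each input feature hurts a fitted model (scikit-learn assumed; Vertex AI's own attribution methods, such as integrated gradients for images, are more elaborate):

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

# the features whose shuffling hurts accuracy the most are the key contributors
for i in result.importances_mean.argsort()[::-1][:5]:
    print(f"{X.columns[i]}: {result.importances_mean[i]:.3f}")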
In 2017 Seth Stephens-Davidowitz shocked the world by exposing human hypocrisy through his seminal book “Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are.” In this book, he used Google data to reveal what people have in mind when no one is watching. His second book “Don't Trust Your Gut: Using Data to Get What You Really Want in Life” published on May 10, 2022, conveys another compelling message: we tend to bark up the wrong tree!
Currently, the US divorce rate is more than 50%, and thus scholars devote effort to identifying factors contributing to a happy and long-lasting relationship. Stephens-Davidowitz pointed out that research in this field has not been very successful because these studies usually relied on small samples, and different studies often led to conflicting results. As a remedy, Samantha Joel teamed up with 85 scientists to create a data set consisting of 11,196 observations, and also utilized machine learning algorithms to analyze this big data set. The finding is surprising: Romantic happiness is unpredictable! No universal predictors can guarantee that you will find Snow White or Prince Charming. However, several common selection criteria turn out to be irrelevant:
· Race/ethnicity
· Religious affiliation
· Height
· Occupation
· Physical attractiveness
· Previous marital status
· Sexual tastes
· Similarity to oneself
To put it bluntly, romantic happiness does not depend on the traits of your partner; rather, it is tied to your own traits. To be more specific, people who are happy with themselves are more likely to be satisfied with their partner and the relationship. In conclusion, Stephens-Davidowitz said, “In the dating market, people compete ferociously for mates with qualities that do not increase one’s chances of romantic happiness.”
https://www.amazon.com/Dont-Trust-Your-Gut-Really/dp/0062880918/ref=sr_1_1?crid=2UVKF8P176LB3&keywords=don%27t+trust+your+gut&qid=1653074630&sprefix=Don%27t+trust%2Caps%2C130&sr=8-1
That’s my take on it: I am a big fan of Seth Stephens-Davidowitz, and thus I include his ideas in my course materials. Once again, big data analytics and machine learning debunk the urban legend that people really know what they want and that researchers can simply input the right variables into the equation. Before the rise of data science, philosopher Nancy Cartwright (1999, 2000) raised the issue of “no causes in, no causes out”: if relevant variables and genuine causes are not included at the beginning, then even sophisticated statistical modeling will be futile. Being skeptical of conventional wisdom is good!
Cartwright, N. (1999). The dappled world. Cambridge University Press.
Cartwright, N. (2000). Against the completability of science. In M. W. Stone (Ed.), Proper ambition of science (pp. 209-223). Routledge.
Today is the second day of the 2022 Tableau Conference. One of the conference programs is the Iron Viz, the world’s largest data visualization competition. During the final round, the three finalists were allowed to spend 20 minutes producing an impactful dashboard. The quality of their presentations was graded by three criteria: analysis, storytelling, and design. In the final round, two contestants utilized advanced visualization techniques, such as the violin plot and the animated GIS map, respectively, whereas one contestant adopted a minimalist approach: the dot plot and the line chart. Who is the winner?
https://www.youtube.com/watch?v=Ic9v8MLe6DI
Tableau Cloud is a hot topic at this conference. Not surprisingly, Tableau Cloud is built on Amazon Web Services (AWS). Currently, Tableau Cloud has seven global locations, spanning four continents. It has 1.6+ million subscribers and during a typical week, there are 6.1 million views.
https://www.tableau.com/products/cloud-bi
Tableau Accelerators are also aggressively promoted at the conference. Tableau Accelerators are pre-built templates for use cases across different domains, such as sales, Web traffic, financial analysis, project management, patient records…etc. Rather than reinventing the wheel, users can simply download the template and then replace the sample data with their own data.
https://www.tableau.com/solutions/exchange/accelerators
That’s my take on it: These products are not highly innovative. As mentioned before, Tableau is built on existing technology, Amazon Web Services. Modifying a template to speed up design is nothing new. Many presenters have been doing the same thing since Microsoft introduced its template library. Nevertheless, the Iron Viz is noteworthy because it dares to break with the traditional approach to statistical learning. Back in the 1970s, John Tukey suggested that students should be exposed to exploratory data analysis and data visualization before learning confirmatory data analysis or any number-based modeling. Sadly, his good advice was ignored. I am glad to see that now data visualization takes the center stage in a high-profile event backed by a leader in the market of data analytics. Currently, Tableau partners with Coursera and 39 universities to promote data science literacy. Tableau could help fulfill the unaccomplished goals of John Tukey.
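Anscombe's quartet is the classic demonstration of Tukey's point, and a copy happens to be bundled with the seaborn plotting library: four data sets with nearly identical summary statistics that look completely different once plotted.

import seaborn as sns
import matplotlib.pyplot as plt

df = sns.load_dataset("anscombe")
print(df.groupby("dataset")[["x", "y"]].agg(["mean", "std"]))  # nearly identical numbers

# four very different patterns emerge only when the data are visualized
sns.lmplot(data=df, x="x", y="y", col="dataset", col_wrap=2, ci=None, height=2.5)
plt.show()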
Today is the first day of the 2022 Tableau Conference. There are many interesting and informative sessions. In the opening keynote and other sessions, Tableau announced several new and enhanced products.
Tableau Cloud (formerly Tableau Online)
· Always have the latest version of Tableau
· Live data and report: Eliminate unnecessary data extraction and download
· Facilitate teamwork through multi-site management
· Easy to share reports with the public via the Web interface
· Better security
As part of the launch, Tableau is working with Snowflake to provide a trial version that integrates Snowflake into Tableau Cloud.
Numbers alone are nothing. The ultimate goal of data visualization is to tell a meaningful story, resulting in practical implications and actionable items. In the past, it required an expert to write up a summary. Leveraging natural language processing, Tableau Data Stories can now automatically write a customizable story (interpretation) like the following: “# of meals distributed increased by 22% over the course of the series and ended with an upward trend, increasing significantly in the final quarter. The largest single increase occurred in 2021 Q4 (+31%).”
Model builder
In the past Tableau focused on data visualization, and as a consequence, modeling tools were overlooked and under-developed. To rectify the situation, Tableau introduced Model Builder, which is powered by the artificial intelligence (AI) and machine learning (ML) technology of Einstein Discovery from Salesforce (Tableau’s parent company). Einstein Discovery is capable of extracting key terms from unstructured data through text mining.
It is not too late to join the conference.
Conference website: https://tc22.tableau.com/
Summary: https://www.tableau.com/about/press-releases/2022/next-generation-tableau-tc22
That’s my take on it: I would like to make a confession. In the past, I was resistant to cloud-based software. When Adobe migrated its products to the cloud a few years ago, I was resentful because I felt that it was unfair to pay for the service on a monthly basis. I held on to the older desktop version and refused to upgrade my system. Nonetheless, when my computer completely broke down, I started the subscription to the Adobe Creative Suite on the cloud. Afterward, I didn’t want to go back! One obvious advantage is that I can always use the latest version, thus reducing maintenance effort on my end. Cloud-based computing is great. Don’t wait until your system breaks down!
Story-telling by natural language processing is not 100% foolproof. The analyst must always proofread the text!
I watched the demo of Model Builder. Currently, this is version 1.0. Frankly speaking, compared to Amazon SageMaker, SAS Viya, IBM Watson/SPSS Modeler…etc., Tableau’s Model Builder still has room for improvement.
About a week ago Intel launched its second-generation deep learning processors: Habana Gaudi®2 and Habana® Greco™. These cutting-edge processors are built for high-performance deep learning workloads: training an initial model on a huge training subset and then validating the final model for deployment. According to Intel, the Habana Gaudi2 processor significantly increases training performance, delivering up to 40% better price efficiency in the Amazon cloud.
Full article: https://www.intel.com/content/www/us/en/newsroom/news/vision-2022-habana-gaudi2-greco.html#gs.0x9xhg
That’s my take on it: High-performance software tools have been around for a long time. For example, SAS Enterprise Miner has a plethora of high-performance computing (HPC) procedures, such as HPCLUS (high-performance cluster analysis), HPForest (high-performance random forest), and HPNeural (high-performance neural networks). Frankly speaking, I seldom use high-performance computing in teaching and research due to hardware limitations. One possible solution is to borrow a gaming computer equipped with multiple graphical processing units (GPUs) from a teenage friend. I am glad to see that Intel is well aware of the gap between software and hardware. I anticipate that in the future more and more computers will be armed with processors specific to machine learning and big data analytics.
Recently Fortune Magazine interviewed three experts on data science (DS) at Amazon, Netflix, and Meta (Facebook) to acquire information about how to find a DS-related job in the high-tech industry. Three themes emerged from the interview:
1. High Tech companies prefer applicants who have a master’s degree: The majority of data scientists at Netflix have a master’s degree or a Ph.D. in a field related to quantitative data analytics, such as statistics, machine learning, economics, or physics. The same qualifications are also required by Meta.
2. High Tech firms prioritize quality over quantity for work experience: Amazon, Netflix, and Meta expect candidates to be creative in problem-solving. The work experience of data scientists at Netflix and Amazon ranges from several years to decades since joining the company.
3. Successful data scientists are dynamic, and connect data to the big picture: Collaboration between different experts, including data scientists, data engineers, data analysts, and consumer researchers, is the norm. At AWS, Netflix, and Meta, data scientists need to be able to communicate with other stakeholders.
Full article: https://fortune.com/education/business/articles/2022/05/11/how-to-become-a-data-scientist-at-a-big-tech-company/?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+May+13th%2C+2022&utm_campaign=14052022
That’s my take on it: To align the curriculum with the job market, my pedagogical strategies cover all of the preceding aspects. The second one seems to be challenging: if everyone expects you to have experience, how can you get started? That’s why I always tell my students to build their portfolio by working on a real project or working with a faculty member as a research assistant. Do not submit the project to earn a grade only; rather, use it for a conference presentation or submit it to a peer-reviewed journal. It can be counted as experience on a resume. And needless to say, I always encourage teamwork, which is equivalent to the ensemble method or the wisdom of the crowd.
In the article entitled “To make AI fair, here’s what we must learn to do” (Nature, May 4, 2022), sociologist Mona Sloane argued that AI development must include input from various stakeholders, such as the population that will be affected by AI. Specifically, any AI system should be constantly and continuously updated in order to avoid unfair and harmful consequences. Dr. Sloane provided the following counter-example: Starting in 2013, the Dutch government used a predictive model to detect childcare-benefit fraud, but without further verification the government immediately penalized the suspects, demanding they pay back the money. As a result, many families were wrongfully accused and suffered from needless poverty.
Full article: https://www.nature.com/articles/d41586-022-01202-3?utm_term=Autofeed&utm_campaign=nature&utm_medium=Social&utm_source=Twitter#Echobox=1651664033
That’s my take on it: There is a similar case in the US. The Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) is a software tool used by US courts for evaluating the risk of recidivism. The system has been used by the states of New York, California, Florida, and many others without validation for many years. However, in 2016 a research team found that COMPAS tends to assign high-risk scores to Blacks even though 44.9% of those flagged did not actually reoffend, while the opposite mistake was made among Whites.
Actually, these malpractices violate the fundamental principle of data science. One of the objectives of data science is to remediate the replication crisis: An overfitted model using a particular sample might not be applicable to another setting. As a remedy, data scientists are encouraged to re-calibrate the model with streaming data. If streaming data are not available, the existing data should be partitioned into the training, validation, and testing subsets for cross-validation. Ensemble methods go one step further by generating multiple models so that the final model is stable and generalizable. It is surprising to see that several governments made such a rudimentary mistake.
Gartner Consulting Group released a report entitled “Market Guide for Multipersona Data Science and Machine Learning Platforms” on May 2, 2022, and the document was revised on May 5. The following are direct quotations from the report:
“A multipersona data science and machine learning (DSML) platform is a cohesive and composable portfolio of products and capabilities, offering augmented and automated support to a diversity of user types and their collaboration.
Multipersona DSML platforms have dual-mode characteristics: first, they offer a low-code/no-code user experience to personas that have little or no background in digital technology or expert data science, but who typically have significant subject matter expertise or business domain knowledge. Second, these platforms provide support to more technical personas (typically expert data scientists or data engineers). Nontechnical personas are provided access through a multimodal user interface that offers at least a visual workflow “drag-and-drop” mode and optionally a higher-level guided “step-by-step” mode.”
The full report cannot be shared. Please contact Gartner.
That’s my take on it: According to Gartner, the objective of multipersona DSML platforms is to democratize data analytics by including different stakeholders with different levels of expertise (e.g., citizen data scientists, expert data scientists, etc.) in the process. However, in this taxonomy there is a sharp demarcation between citizen data scientists and expert data scientists; low-code/no-code solutions are reserved for non-technical personas.
In my opinion, this demarcation is blurred because even an expert could utilize the drag-n-drop mode to get things done efficiently. In 1984 Apple “liberated” computer users from typing command codes by including the graphical user interface in their products. Interestingly, in data science the trend is reversed as learning to code seems to help make people data experts. I always tell my students that I don’t care how they did it as long as the result is right. If you can use GUI (e.g., JMP and Tableau) to generate a report in 2 minutes, then don’t spend two hours writing a program!
Today I attended the 2022 Amazon Innovate Conference, which covered a plethora of Amazon cutting-edge technologies, including Amazon RedShift and SageMaker. In one of the sessions, the presenter introduced the random cut forest (RCF) method, which is an extension of random forest algorithms. The random forest approach was invented by Leo Breiman in 2001. Since then there have been several variants, such as the bootstrap forest in JMP and Random Tree in SPSS Modeler. One of the limitations of random forest modeling is that it is not easy to obtain updates in an incremental manner. It is especially problematic when streaming data necessitate real-time analysis or constant updating.
To rectify the situation, in 2016 RCF was developed by two Amazon researchers and two academicians. As its name implies, RCF randomly samples data points and partitions them with random cuts into small groups of points. During the sampling and cutting process, multiple models are built and combined into an ensemble. This innovative approach enables analysts to detect anomalies in streaming data.
Document of RCF: https://docs.aws.amazon.com/quicksight/latest/user/what-is-random-cut-forest.html
That’s my take on it: The idea of random forest emerged in 2001. If you trace its origin further, you can find that this approach is based on bootstrapping, one of the resampling techniques invented by Bradley Efron in 1979 and fine-tuned by Efron and Tibshirani in 1993. However, even today bootstrapping is not commonly taught or used in many universities, let alone the random forest approach. In contrast, it took only a few years for Amazon to move RCF from conceptualization to production. Now RCF is a standard feature of many Amazon products. Currently, Amazon Web Services is the leader in cloud computing and data analytics. Everything happens for a reason!
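RCF itself ships with Amazon's services rather than the common open-source stacks, so as a stand-in here is a closely related tree-ensemble anomaly detector, scikit-learn's Isolation Forest; the shared intuition is that points isolated by very few random cuts are likely anomalies:

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(500, 2))          # the bulk of the stream
outliers = np.array([[6.0, 6.0], [-7.0, 5.0]])    # two injected anomalies
X = np.vstack([normal, outliers])

clf = IsolationForest(n_estimators=100, random_state=0).fit(X)
labels = clf.predict(X)        # -1 = anomaly, 1 = normal
print(labels[-2:])             # expect [-1 -1]: the injected points are flagged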
Today is the first day of the 2022 IBM Educathon. There are many interesting and informative sessions, and I would like to share what I learned from a talk entitled “This is NOT your Parent's Systems Analysis & Design course! A Faculty Case Study of Modernizing ‘Systems Analysis & Design’ Curricula.” The speaker, Roger Snook, is a technical manager at IBM. Back in 2001-2002, he was a faculty member at Shepherd University responsible for teaching CIS courses, including Systems Analysis and Design. At that time there was no data science, and thus it is understandable that the content of the course was merely traditional. In 2019 he returned to the same university and found the course still largely hadn’t changed from the 1970s “structural decomposition” approach. In addition, many available “Systems Analysis & Design” textbooks still treated modern approaches as an “afterthought,” i.e., additional smaller chapters. He asked the department chair to let him revamp the course by replacing the outdated content with modern material, and fortunately, the chair agreed. The talk is about his experience with modernizing CIS curricula.
The presentations of the 2022 IBM Educathon can be accessed at: https://community.ibm.com/community/user/ibmz-and-linuxone/events/event-description?CalendarEventKey=c5fffe07-4017-4f81-a59c-20865acf220c&CommunityKey=cefd2ec8-fffb-415b-8b41-9b66cae37192&Home=%2fcommunity%2fuser%2fibmz-and-linuxone%2fgroups%2fgroup-home%2fevents
That’s my take on it: It is a well-known fact that there is a disconnect between academia and industry. Shepherd University is so lucky that a former faculty member who currently works at IBM is willing to share his expertise with the university and the chair is open-minded. However, we should not let this happen by chance and informally (It just happened that Roger Snook re-visited his former colleagues). An official and constant channel between academia and industry should be established so that curricula can be refreshed and upgraded via a positive feedback loop.
A few days ago I posted a message about DALL-E2, the AI program developed by OpenAI that is capable of generating photo-realistic images based on textual commands. When I looked at the sample images in a YouTube video by "Lambda GPU Cloud," my jaw dropped! From DALL-E to DALL-E2 the improvement is doubtlessly a quantum leap!
From now on I don't need to go out to take pictures. Rather, I can simply tell DALL-E2, "Show me a sunset scene of the Grand Canyon in November." When DALL-E3 is available, I will no longer need a research assistant. In a similar vein, I can request the AI system to find the best 5 predictors of academic performance by scanning all OECD data sets.
YouTube movie about DALL-E2: https://www.youtube.com/watch?v=X3_LD3R_Ygs
Today Devansh posted an article on Machine Learning Made Simple to explain why Google, a for-profit company, devoted a great deal of effort to AI research. Recently Google released PaLM, a new AI model that can explain jokes and do many other tasks. Last month its protein classification project reached a new milestone by classifying a protein correctly out of 18,000 labels. While all these accomplishments seem to be very impressive, people wonder how this type of research can benefit Google.
In Devansh’s view, scale matters! If the company can improve accuracy in decision-making by 1%, after 1,000 decisions the return on investment becomes astronomical (1.01¹⁰⁰⁰ ≈ 21,000). And Google’s AI systems are making trillions of decisions on a regular basis. More importantly, many well-known AI projects launched by Google aim to solve search problems. For example, AlphaGo is a reinforcement-learning-based program that defeated the World Champion of Go by searching for the best moves in a game. The key point is: Advanced searching algorithms could result in better profile analysis for highly personalized ads and customized services, such as Software as a Service (SaaS).
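The compounding arithmetic quoted above is easy to verify with a one-line Python check:

print(1.01 ** 1000)   # about 20959, i.e., roughly the 21,000-fold figure cited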
Full article: https://machine-learning-made-simple.medium.com/the-truth-behind-googles-machine-learning-research-9892021d24f2
That’s my take on it: When I was a graduate student, most commonly used statistical concepts and procedures were introduced by academicians. For example, the Greenhouse-Geisser correction was developed by Samuel Greenhouse, a professor at George Washington University, and Seymour Geisser, the founder of the School of Statistics at the University of Minnesota. However, since the dawn of data science and machine learning, corporations have been taking the lead in developing powerful data analytical tools. Even prominent academicians specializing in data science and AI collaborate with corporations. For example, Professor Fei-Fei Li joined Google as its Chief Scientist of AI/ML during her sabbatical from Stanford University between 2017 and 2018. If Google establishes a university, I will enroll!
The Turing Award, which is considered the “Nobel Prize of Computing” and carries a $1 million prize, is financially sponsored by Google. The award is named after Alan M. Turing, the British mathematician who laid the theoretical foundation for computing and contributed to cracking the Enigma codes used by Nazi Germany during World War II.
Today I read an interesting and informative article entitled “AI’s first philosopher” by German philosopher Sebastian Grève (posted on aeon.co on April 21, 2022).
According to Grève, modern computing is made possible because of Turing’s idea of the stored-program design: by storing a common set of instructions on tape, a universal Turing machine can imitate any other Turing machine. In this sense, the stored-program design paves the way for machine learning.
From 1947 to 1948 Turing explicitly stated that his goal was to build a machine that could learn from past experiences. He wrote, “One can imagine that after the machine had been operating for some time, the instructions would have altered out of all recognition… It would be like a pupil who had learnt much from his master, but had added much more by his own work. When this happens I feel that one is obliged to regard the machine as showing intelligence.”
However, his idea was not appreciated by the National Physical Laboratory (NPL). The director of NPL called his paper “a schoolboy’s essay” and rejected it before publication.
Grève discussed many other ideas introduced by Turing. For more information, please read:
https://aeon.co/essays/why-we-should-remember-alan-turing-as-a-philosopher?bclid=IwAR1AHXNbdVoMvSIGJQ0V13s4e8OOUMy09GaihKpIi4i80ZFtAS32GujL7vA
That’s my take on it: It is not surprising to see that Turing’s ideas were questioned and rejected. After all, he was a theoretical mathematician and statistician, not an engineer. (He was elected a fellow of King’s College because he demonstrated a proof of the Central Limit Theorem and sampling distributions.) During his lifetime, the most he could do was develop philosophical concepts for universal computing and machine learning. Nonetheless, computer scientists and engineers accepted and actualized Turing’s notion. Hence, concepts alone are insufficient!
Sadly, in 1954 Turing committed suicide at the age of 41. Had he lived longer, he would have further developed or even implemented his ideas on universal computing and machine learning.
DALL-E, an AI system that is capable of producing photo-realistic images, was introduced by OpenAI in January 2021. In April 2022 its second version, DALL-E2, shocked the world by making tremendous improvements. Specifically, the user can simply input the textual description into the system (e.g., “Draw a French girl like Brigitte Bardot and Catherine Deneuve”), and then DALL-E2 can create a high-resolution image with vivid details according to the specs. Sam Altman, the CEO of OpenAI called it “the most delightful thing to play with we’ve created so far … and fun in a way I haven’t felt from technology in a while.” However, recently people found that like many other AI systems, DALL-E2 tends to reinforce stereotypes. For example, when the user asked DALL-E2 to create a photo of a lawyer, a typical output is a picture of a middle-aged white man. If the request is a picture of a flight attendant, a typical result is a beautiful young woman.
OpenAI researchers tried to amend the system, but it turns out that any new solution leads to a new problem. For example, when those researchers attempted to filter out sexual content from the training data set, DALL-E2 generated fewer images of women. As a result, females are under-represented in the output set.
Full article: https://www.vox.com/future-perfect/23023538/ai-dalle-2-openai-bias-gpt-3-incentives
That’s my take on it: AI bias is not a new phenomenon, and a great deal of effort has been devoted to solving the problem. In my opinion, using a militant approach to confront this type of “unethical” consequence or attributing any bias to an evil intention is counter-productive. Before DALL-E 2 was released, OpenAI had invited 23 external researchers to identify as many flaws and vulnerabilities in the system as possible. In spite of these endeavors, the issue of stereotyping is still embedded in the current system because machine learning algorithms look for existing examples. However, demanding a 100% bias-free system is as unrealistic as expecting a 100% bug-free computer program. On the one hand, researchers should try their best to reduce bias and fix bugs as much as they can, but on the other hand, we should listen to what Stanford researcher Thomas Sowell said: “There are no solutions. There are only trade-offs.”
A recent study published in Nature Communications reveals a new AI-based method for discovering cellular signatures of disease. Researchers at the New York Stem Cell Foundation Research Institute and Google Research utilized an automated image recognition system to successfully detect new cellular hallmarks of Parkinson’s disease. The data are sourced from more than a million images of skin cells from a cohort of 91 patients and healthy controls. According to the joint research team, traditional drug discovery is inefficient; in contrast, the AI-based system can process a large amount of data within a short period of time. More importantly, the algorithms are unbiased, meaning that they are not based upon subjective judgment, which varies from one human expert to another.
Full article: https://insidebigdata.com/2022/04/03/ai-and-robotics-uncover-hidden-signatures-of-parkinsons-disease/
That’s my take on it: It is important to note that this discovery is the result of collaboration between a research institute and a corporation, namely Google. Today many cutting-edge research tools are developed by high-tech corporations, such as Amazon, Microsoft, and Google. We should encourage our students to widen their horizons by going beyond traditional research methods and establishing partnerships with high-tech companies.
Yann LeCun is a professor at New York University, and Vice President, Chief AI Scientist at Meta (formerly Facebook). When he was a postdoctoral research fellow, he invented the convolutional neural network (CNN), which revolutionized how AI recognizes images. In 2019 he received the ACM Turing Award, the equivalent of a Nobel Prize for computing, for his accomplishments in AI. Recently, in an interview with ZDNet, LeCun boldly predicted that the energy-based model might replace the probabilistic model as the paradigm of deep learning. In his view, deep learning is currently good at perception only: given X, what is Y? But its capability for reasoning and planning is limited. A predictive model in the real world should allow you to predict what will happen as a consequence of an action (e.g., if Russia invades Ukraine, how would the US respond? If the US sanctions Russia, how would the world respond?). Simply put, this is planning. LeCun asserted that the probabilistic approach to deep learning is out. The reason he wants to give up the probabilistic model is that in the traditional approach one can model the dependency between two variables, X and Y, but if Y is high-dimensional (e.g., a sequence of chain reactions), then the distribution is no longer precise. The remedy is the energy function: low energy corresponds to high probability, and high energy corresponds to low probability.
Full article: https://www.zdnet.com/article/metas-ai-luminary-lecun-explores-deep-learnings-energy-frontier/?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+April+1st%2C+2022&utm_campaign=02042022
That’s my take on it: No comments. This is from Yann LeCun. I don’t have his expertise. Nonetheless, I will read his books and research articles to explore this new path. Perhaps five years from now I will include the energy-based model in my curriculum.
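For readers who want a head start, the standard bridge between the two framings (my paraphrase, not a quotation from LeCun) is the Gibbs distribution, which converts an energy function into a probability:

p(y \mid x) = \frac{e^{-\beta E(x, y)}}{\int e^{-\beta E(x, y')} \, dy'}

Low energy corresponds to high probability, exactly as stated above; the energy-based view keeps the scoring function E(x, y) but drops the normalizing integral in the denominator, which is precisely the term that becomes intractable when y is high-dimensional.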
Two days ago (3.29) Intel Corp. and Arizona’s Maricopa County Community College District (MCCCD) announced a new artificial intelligence (AI) incubator lab to prepare students for jobs in sectors that heavily rely on AI technology, including business and healthcare. This is one of many programs built on Intel’s AI for Workforce project, which was launched in 2020. The new lab at Chandler-Gilbert Community College is equipped with $60,000 worth of Intel-based equipment.
Full article: https://www.intel.com/content/www/us/en/newsroom/news/intel-community-college-district-launch-ai-lab.html?ctt=more&cid=em&source=elo&campid=dpg_NAR_dpgeng_EMNL_EN_2021_Insights-Industry-Trends_C-MKA-23017_T-MKA-24646_20220401&content=dpg_NAR_dpgeng_EMNL_EN_2021_Insights-Industry-Trends_C-MKA-23017_T-MKA-24646_20220401&elq_cid=2761373&em_id=20262&elqrid=54913a2a640546cdbb98d6cf5d0391cc&elqcampid=50171&erpm_id=4981757
That’s my take on it: I came from Arizona; I am excited to see that MCCCD has such a compelling vision. There is a common perception that only large universities are capable of setting up AI and data science labs and programs. Actually, many high-tech corporations, such as Amazon Web Services, SAS Institute, and IBM, have academic programs that offer free learning resources to all types of universities, no matter whether they are big or small. It doesn’t hurt to ask!
Today I attended the seminar “The Significance of Data Science Ethics” organized by JMP. One of the guest speakers, Jessica Utts, used a study to illustrate how things could go wrong in statistical inference: a Ph.D. student at Cornell University and a Facebook employee jointly published a journal article (in 2014, based on an experiment run in 2012) about how media input affected emotion and language use. In this study, 689,003 Facebook users were randomly assigned into four groups: one group received fewer negative news feed posts, one group received fewer positive news feed posts, and two control groups had positive or negative news feed posts randomly deleted. After the experiment, it was found that “people who had positive content experimentally reduced on their Facebook news feed for one week used more negative words in their status…when news feed negativity was reduced the opposite pattern occurred… Significantly more positive words were used in peoples’ status updates.” This study was a big hit, as it was mentioned by 337 news outlets.
However, later other researchers found that the conclusion is misleading. Actually, the percentage of positive words decreased by only 0.1% compared with the control, p < .0001, Cohen’s d = 0.02, whereas the percentage of negative words increased by 0.04%, p = .0007, d = .0001. Jessica Utts’s comment: the p-value is subject to sample size. What do you expect when the sample size is as large as 689,003?
That’s my take on it: There is nothing new! When I was a graduate student many years ago, my statistics professor Dr. Larry Toothaker said, “If you have a large enough sample size, you can prove virtually any point you want.” Unfortunately, the dissertation advisor of that doctoral student at Cornell was not Dr. Toothaker. Even back in 2012, data science tools, which aim at pattern-seeking instead of drawing statistical inferences based on the p-value, had been widely available. There is no excuse to use the wrong methodology. This journal article will stay in academic databases forever, and I am afraid this study will continue to be cited by many other data ethicists as a counter-example.
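Utts's point is easy to reproduce with a minimal simulation (the numbers below are made up; SciPy assumed): a trivial effect of Cohen's d = 0.02 becomes wildly "significant" once each group has hundreds of thousands of observations.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 340_000                              # roughly half of 689,003 per group
control = rng.normal(0.00, 1, n)
treated = rng.normal(0.02, 1, n)         # true effect size d = 0.02

t, p = stats.ttest_ind(treated, control)
print(f"t = {t:.2f}, p = {p:.2e}")       # p is minuscule despite a negligible effect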
Today I read the report on Data Visualization Tools compiled by PeerSpot in March 2022. According to PeerSpot, “the previous generation of data visualization software packages were designed for use by experts. Now, the average end-user needs to be able to create high-quality visual displays based on large and complex data sets. A good data visualization tool can help transform raw data into visuals that are easily digestible, oftentimes highlighting conclusions that were not previously apparent… Ease of use also implies a tool that could be learned and then deployed easily without a lot of technical expertise. A simple drag and drop hierarchy creation is another desired feature, once again making it easy for non-technologists to create effective visual displays of data.”
The top 8 data visualization tools are:
1. Tableau
2. Informatica PowerCenter
3. Qlik Sense
4. Domo
5. SAS Visual Analytics
6. Oracle Analytics Cloud
7. ELK Kibana
8. Dundas BI
Full article: by request only on peerspot.com, no open access.
That’s my take on it: Although today data visualization tools become more and more user-friendly and thus average end-users are able to create high-quality graphs, it still requires another type of expertise to interpret the graphs. The objective of data visualization is pattern-seeking, rather than making a dichotomous decision based on a pre-determined cut-off point. Without sufficient domain knowledge and experience, the data visualizer might overlook hidden patterns or misidentify random noise as patterns.
According to Gary Marcus, the founder and CEO of Geometric Intelligence, deep learning is currently overhyped and hitting a wall. On many occasions, AI has made dumb mistakes that could easily be avoided by humans. He cited several examples to substantiate his claim:
“A Tesla in so-called “Full Self Driving Mode” encountered a person holding up a stop sign in the middle of a road. The car failed to recognize the person (partly obscured by the stop sign) and the stop sign (out of its usual context on the side of a road); the human driver had to take over.”
GPT-3 is one of the cutting edge frameworks for natural language processing, but it could make a “fatal” mistake as follows:
“Human: Hey, I feel very bad. I want to kill myself.
GPT-3: I am sorry to hear that. I can help you with that.
Human: Should I kill myself?
GPT-3: I think you should.”
Marcus attributed this type of failure to our premature abandonment of the symbolic school of machine learning. The symbolic approach was the dominant paradigm of AI research from the mid-1950s to the late 1980s. As the name implies, this school of thought translates entities into symbols and assumes that intelligent outcomes can be achieved by rational symbolic manipulation. In the 1990s the connectionist approach, which is based on neuroscience, became the dominant paradigm. “To think that we can simply abandon symbol manipulation is to suspend disbelief,” said Marcus. Many neural network researchers distanced themselves from the symbol-manipulating school, likening it to investing in internal combustion engines in the era of electric cars. Marcus argued that most of our knowledge is in fact encoded with symbols, and thus avoiding symbolic manipulation in AI altogether is problematic. Instead, he endorsed a hybrid approach to AI.
Full article: https://nautil.us/deep-learning-is-hitting-a-wall-14467/?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+March+18th%2C+2022&utm_campaign=19032022
That’s my take on it: Agree! Although the symbolic and connectionist schools of machine learning go in different directions, these perspectives are not necessarily incommensurable. By combining both the connectionist and the symbolist paradigms, Mao et al. (2019) developed a neuro-symbolic reasoning module to learn visual concepts, words, and semantic parsing of sentences without any explicit supervision. The module is composed of different units using both connectionism and symbolism. In the former operation, the system is trained to recognize objects visually whereas in the latter the program is trained to understand symbolic concepts in text such as “objects,” “object attributes,” and “spatial relationships”. In the end, the two sets of knowledge are linked together. Thus, researchers should keep an open mind to different perspectives, and a hybrid approach might work better than a single one.
Mao, J. et al. (2019). The neuro-symbolic concept learner: Interpreting scenes, words, and sentences from natural supervision. http://nscl.csail.mit.edu
Two days ago Meta (Facebook) founder Mark Zuckerberg announced several bold AI projects, including a plan to build a universal speech translator (Star Trek?). Zuckerberg said, "The ability to communicate with anyone in any language is a superpower that was dreamt of forever." This is not the only one. A month ago Meta announced that it is building an AI-enabled supercomputer that would be the fastest in the world. The project is scheduled to be completed in mid-2022.
Full article: https://www.bbc.com/news/technology-60492199
That’s my take on it: Is Meta overly ambitious? I don’t think so. Meta’s chief AI scientist Yann LeCun, a French-American, is one of the world’s leading experts. He invented convolutional neural networks (CNN) when he was a post-doc research fellow. Additionally, he also proposed an early form of the back-propagation learning algorithm for neural networks. Given his track record, it is very possible that Meta will eventually deliver what it promises. Even if it “fails,” aiming high is still a good strategy. Meta might not be able to create a universal translator, but the end product could be on a par with Google’s natural language processing module. Similarly, the supercomputer made by Meta might not be the world’s fastest, but I guess it would at least be comparable to IBM Power Systems (the second and third fastest supercomputers at the present time). It is better to aim high and fail than to do nothing!
Yesterday (2/22/2022) FACT.MR posted a summary of its report on the global cloud computing market. The industry is expected to reach a value of US$482 billion in 2022 and US$1,949 billion by 2032. The key market segments of cloud computing include IT & telecom, government & public sector, energy & utilities, retail & consumer goods, manufacturing, health care, and media & entertainment. There are several noteworthy recent developments in this field. For example, in February 2022, IBM announced its partnership with SAP to offer technology and expertise to clients building a hybrid cloud approach.
Full article: https://www.globenewswire.com/news-release/2022/02/22/2389758/0/en/Cloud-Computing-Market-to-Reach-US-1-949-Billion-by-2032-with-Attaining-15-CAGR-Fact-MR-Report-2022.html
That’s my take on it: In the era of big data analytics, there is no doubt in my mind that cloud computing is an irreversible trend. In my humble opinion, perhaps open source is over-hyped. Specifically, the R language cannot perform multi-threaded computing without laborious re-configuration, whereas Pandas in Python has memory limitation issues when the data set is extremely large. On the contrary, proprietary platforms such as Amazon, Google, Microsoft/SAS, and IBM enable the analyst to execute high-performance computing procedures with big data in a distributed cloud environment.
At first glance, cloud computing is more business-oriented than academic-centric. It might be unclear to psychologists, sociologists, or biologists why high-performance computing in a cloud-based platform is relevant. Consider this hypothetical example: In the past, it took 13 years to finish the Human Genome Project because DNA sequencing was very complicated and tedious. Had biologists at that time employed current technologies, the Human Genome Project would have been completed in two years! Next, consider this real-life example: Facebook, Google, Amazon, etc. have been collecting behavioral data in naturalistic settings, and their forecasting models are highly accurate. Think about its implications for psychology and sociology!
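As a small illustration of the Pandas memory ceiling mentioned above, one common local workaround is chunked processing, reading a very large file in pieces instead of loading it at once ("huge_log.csv" and the column name are placeholders):

import pandas as pd

total, count = 0.0, 0
for chunk in pd.read_csv("huge_log.csv", chunksize=1_000_000):
    total += chunk["response_time"].sum()   # aggregate one chunk at a time
    count += len(chunk)

print("overall mean:", total / count)

This trades convenience for memory; distributed cloud platforms make the same trade at a far larger scale, transparently.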
According to a recent study conducted by researchers at Lancaster University and UC Berkeley, participants reported that faces generated by AI are more trustworthy than actual human faces. The researchers suggested that AI-generated faces are viewed as more trustworthy because they resemble the characteristics of average human faces, which are deemed more trustworthy. This paper will be published in the Proceedings of the National Academy of Sciences (PNAS).
The artificial faces used in this study are created by a generative adversarial network (GAN) named StyleGAN2. A generative adversarial network was invented by Goodfellow et al. in 2014. GAN consists of two sub-models: a generator for outputting new examples, and a discriminator that can classify the examples as real or fake. The two models are adversaries in the sense that the generator, which acts as a team of counterfeiters, tries to fool the discriminator, which plays the role of the police.
Full article: https://neurosciencenews.com/ai-generated-faces-trust-20071/
That’s my take on it: This finding has profound implications for both psychologists and philosophers. Why do many people accept disinformation, conspiracy theories, utopian ideas, and many other faked things? It is because we tend to look for something better than what we find in reality! As a result, we can be easily fooled by others (e.g., AI), and at the same time, we fool ourselves!
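For the technically curious, the counterfeiter-versus-police game described above can be sketched in a few dozen lines (PyTorch assumed; this toy version learns a 1-D Gaussian rather than faces, and StyleGAN2 is enormously more elaborate):

import torch
import torch.nn as nn

torch.manual_seed(0)
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                 # generator: noise -> sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # discriminator: sample -> P(real)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0        # "real" data: N(3, 0.5)
    fake = G(torch.randn(64, 8))

    # the discriminator (police) learns to separate real from fake
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # the generator (counterfeiter) learns to fool the discriminator
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())     # drifts toward 3.0 as the counterfeits improve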
Several days ago I read a discussion thread on Quora (see below):
Christian Howard
Ph.D. in Computer Science, the University of Illinois at Urbana-Champaign (Expected 2024)
Is data science/machine learning/AI overhyped right now?
Yeah, it is overhyped, though certainly still valuable.
Some of the things I laugh about when it comes to these areas are the people out there who talk about fitting a model with least squares being “machine learning”, even though this basic statistical technique has been around forever. I remember when I was first reading about neural networks back in 2012, my dad told me how he tuned neural networks that modeled risk at some big financial company he worked at in the 80s. The thing I realized is there are a lot of techniques getting rebranded that have been around for a while and have really only come back due to better computational resources, more data, and some other research-related developments.
But at the end of the day, data science/machine learning/AI is not the magic bullet today that a lot of the tech media portrays it to be. Tons of non-technical people, from what I have learned by talking to people in my professional network, think AI and Machine Learning can currently be used to solve impossible problems for companies. This is leading non-AI/ML companies to hire people with the data science and machine learning background to try and turn the data they have into some magic mathematical serum that can be used to wreck their competition. The wishes of many of these companies are infeasible and unrealistic and put insane pressure on the data science/ML teams they build to do the impossible. This is a problem and it all stems from the fact that there’s a hype about what data science/ML/AI can do today and it’s inaccurate. Not to mention, there’s a lot of research that still needs to be done to really understand some areas of ML that are hyped, like deep learning.
My dad is an executive consultant in tech-oriented companies and he tells me he sees so many companies who try to use AI to help rebrand their business since it’s a hot area, but they will minimally dip into AI by just using basic statistical learning techniques or just grab Tensorflow and use a deep learning architecture to try and model some dataset they have internally. It’s such a joke, all a function of the hype, and clearly not nearly as great a use for data science/ML/AI as some of the things larger tech companies are doing with that stuff.
So yeah, I think that while data science/ML/AI is useful to learn and use, it is indeed overhyped and likely will be for a little while.
That’s my take on it: The least-square criterion for OLS regression was discovered in 1805, but today many people treat it as a data science approach. This confusion can be attributed to the fact that many students are never taught the differences and similarities between traditional statistics and modern data science.
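The rebranding is easy to demonstrate. In the sketch below (assuming NumPy and scikit-learn), the same 1805-vintage least-squares fit is computed twice: once with the closed-form solution a statistician would recognize, and once with a "machine learning" library. The coefficients agree to rounding error:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 1))
y = 3.0 + 2.0 * X[:, 0] + rng.normal(scale=0.5, size=200)

# "Traditional statistics": solve the least-squares problem directly.
X1 = np.column_stack([np.ones(200), X])          # add an intercept column
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)

# "Machine learning": fit the same model with scikit-learn.
model = LinearRegression().fit(X, y)

print(beta)                            # [intercept, slope]
print(model.intercept_, model.coef_)   # identical up to rounding
```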
It is true that some modern techniques have been around for a while. For example, the decision tree approach was developed in the 1980s. While the theoretical foundation of the connectionist approach to AI can be traced back to 1943, working models of neural networks appeared in the 1980s and 1990s. But their popularity and prominence are a more recent phenomenon. Before the 2010s, neural networks did not demonstrate many advantages over traditional analytical methods, such as linear regression and logistic regression, or over other data mining methods, such as the decision tree.
Is it over-hyped? With the advance of high-performance computing, these methods are re-packaged and further developed. More importantly, the availability of big data opens the door to new possibilities. Remember the e-commerce bubble in the 1990s? Any new movement tends to overpromise and underdeliver, especially when too many people rush to the “gold mine” without proper tools and training. Remember the parable of the weeds in Matthew 13? In the end, robust data science solutions will be here to stay!
Recently I received a free copy of the report “Data Science Platforms: Buyer’s Guide and Reviews” updated by PeerSpot (formerly IT Central Station) in February 2022. Unlike other benchmark studies that rely on numeric ratings, PeerSpot’s report compiled qualitative data (open-ended comments). This timely report includes assessments of 10 data science tools: Alteryx, Databricks, KNIME, Microsoft Azure, IBM SPSS Statistics, RapidMiner, IBM SPSS Modeler, Dataiku Data Science Studio, Amazon SageMaker, and SAS Enterprise Miner. However, the report is copyrighted and, needless to say, I cannot share the full text with you. The following are some excerpts of user feedback on IBM SPSS Statistics, IBM SPSS Modeler, Amazon SageMaker, and SAS Enterprise Miner.
IBM SPSS Statistics
Pro: The features that I have found most valuable are Bayesian statistics and descriptive statistics. I use these more often because pharma companies and clinical hospitals make the medicines by taking feedback from different patients.
Con: I'd like to see them use more artificial intelligence. It should be smart enough to do predictions and everything based on what you input. Right now, that mostly depends on the know-how of the user.
IBM SPSS Modeler
Pro: I like the automation and that this product is very organized and easy to use. I think these features can be found in many products but I like IBM Modeler because it's very clear about how to use it. There are many other good features and I discovered something that I haven't seen in other software. It's the ability to use two different techniques, one is the regression technique and the other is the neural network. With IBM you can combine them in one node. It improves the model which is a big advantage.
Con: The time series should be improved. The time series is a very important issue, however, it is not given its value in the package as it should be. They have only maybe one or two nodes. It needs more than that.
Amazon SageMaker
Pro: The most valuable feature of Amazon SageMaker is that you don't have to do any programming in order to perform some of your use cases. As it is, we can start to use it directly.
Con: SageMaker is a completely new tool. It can be very hard to digest. AWS needs to provide more use cases for SageMaker. There are some, but not enough. They should collect or create more use cases.
SAS Enterprise Miner
Pro: The solution is able to handle quite large amounts of data beautifully. The modeling and the cluster analysis and the market-based analysis are the solution's most valuable aspects. I like the flexibility in that I can put SAS code into Enterprise Miner nodes. I'm able to do everything I need to do, even if it's not part of Enterprise Miner. I can implement it using SAS code. The GUI is good. The initial setup is fairly easy to accomplish.
Con: One improvement I would suggest is the compatibility with Microsoft SQL and to improve all communications to the solution. For a future release, I would like for the solution to be combined with other product offerings as opposed to a lot of separate solutions. For example, Text Miner is a separate product. I have to spend additional money to purchase a license for Text Miner.
That’s my take on it: It is unclear to me why IBM SPSS Statistics is included as a data science tool. In my humble opinion, SPSS Statistics is more in alignment with traditional statistics than modern data science. Specifically, its lack of dynamic data visualization hinders analysts from exploring the data, which is essential to data science. Thus, for data mining, I prefer IBM SPSS Modeler to IBM SPSS Statistics. SAS Enterprise Miner is doubtlessly one of the best products of SAS, but it is strange that SAS Viya, which is capable of in-memory analytics, and JMP Pro, which specializes in exploratory data analysis, are not on the radar screen. Although Amazon SageMaker is a newcomer to the market of data analytics, within a short period of time it has posed a challenge to well-established products like SAS and SPSS. At the present time, Amazon dominates the market of cloud computing. It is worth looking into Amazon SageMaker.
On Feb. 1, 2022, Fortune Education published an article detailing how Zillow’s big data approach to its real estate investment failed. In 2019 Zillow made a huge profit ($2.7 billion) by flipping: buying a house, making some renovations, and then selling it at a higher price. In 2006, Zillow collected data on approximately 43 million homes and later added 110 million houses to the database. Big-data analysis informed Zillow what to offer and how much to charge on the flip, and at that time the error rate was as low as 5%. However, recently Zillow failed to take the skyrocketing costs of materials and labor into account; as a result, Zillow paid too much to purchase properties and flipping was no longer profitable. In response to this case, Fortune Education cited the comment made by Lian Jye Su, a principal analyst at ABI Research: “There is a reason why governments and intelligence firms are bullish on big data. There’s not enough human intelligence to go around. It’s not cheap to hire the people. And we’re swamped with data.”
Full article: https://fortune.com/education/business/articles/2022/02/01/what-zillows-failed-algorithm-means-for-the-future-of-data-science/?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+February+4th%2C+2022&utm_campaign=05022022
That’s my take on it: Data science is a fusion of three components: computer science (e.g., database), data analytics (e.g., modeling), and domain knowledge. The oversight of Zillow is a typical example of omitting the last component of data science: domain knowledge. If the data modeler has background knowledge of economics, then the hyper-inflation rate should be factored in. There is nothing new. When I was a graduate student, my mentor told me, “You need to know where the data came from.”
Recently I Google-searched for the best data analysis software tools of 2022. Several lists were returned by Google, and not surprisingly, their rankings are slightly different.
According to eWeek, the top ten data analytical tools are: 1. IBM 2. Microsoft 3. MicroStrategy 4. Qlik 5. SAP 6. SAS 7. Sisense 8. Tableau 9. ThoughtSpot 10. TIBCO.
The ranking of QA Lead is as follows: 1. Azure 2. IBM Cloud Pak 3. Tableau 4. Zoho Analytics 5. Splunk 6. SAS Visual Analytics 7. Arcadia Enterprise 8. Qrvey 9. GoodData 10. Qlik Sense.
The order of data analysis software tools ranked by VS Monitoring is: 1. Tableau 2. Zoho 3. Splunk 4. SAS Visual Analytics 5. Talend 6. Cassandra 7. SiSense 8. Spark 9. Plotly 10. Cloudera.
Hackr.io provides the following list: 1. Python 2. R 3. SAS 4. Excel 5. Power BI 6. Tableau 7. Apache Spark.
By Selecthub’s ratings, the top ten are: 1. Oracle 2. IBM Watson 3. SAP 4. BIRT 5. Qlik Sense 6. Alteryx 7. MicroStrategy 8. SAS Viya 9. Tableau 10. TIBCO.
https://www.eweek.com/big-data-and-analytics/data-analytics-tools/
https://theqalead.com/tools/big-data-analytics-tools/
https://www.vssmonitoring.com/best-big-data-analytics-tools/
https://hackr.io/blog/top-data-analytics-tools
That’s my take on it: Which data analytical tools are the best? I will give you a Bayesian answer: It depends! Indeed, these diverse assessments are dependent on different criteria. Nonetheless, there is a common thread across these rankings. Only two companies appear in all five lists: SAS and Tableau. SAS is a comprehensive end-to-end solution whereas Tableau specializes in data visualization for business intelligence. Which one is really better? It depends!
Yesterday the National Opinion Research Center (NORC) at the University of Chicago announced an upgrade of the General Social Survey Data Explorer. NORC has been collecting survey data related to social issues since 1972.
NORC has updated the General Social Survey’s Data Explorer (GSS-DE) and Key Trends to make them better tools for users. This update includes substantial upgrades, including a simplified user interface and single sign-in. The new version of the Data Explorer (GSS-DE 2.0) will be available this Winter (2022). The existing version of the Data Explorer and Key Trends (GSS-DE and Key Trends 1.0) has been discontinued now that the new GSS-DE 2.0 site has been launched. Please note that GSS-DE and Key Trends 1.0 are no longer available.
With the launch of Data Explorer 2.0, signing in for the first time may look a little different. Once you've navigated to https://gssdataexplorer.norc.org/, log in with your credentials to receive an email with a temporary password. Returning users will need to change their passwords and update information for security purposes. Once you've logged in with the temporary password, you will be prompted to set a new password and update your account information.
In the past, my students and I published several journal articles using NORC data. There are several advantages of archival data analysis:
· It saves time, effort, and money because you don’t need to collect data on your own or go through IRB approval.
· It provides a basis for comparing the results of secondary data analysis and your primary data analysis (e.g., national sample vs. local sample).
· The sample size is much bigger than what you can collect by yourself. A small-sample study lacks statistical power and the result might not be stable across different settings. On the contrary, big data can reveal stable patterns.
· Many social science studies are conducted with samples that are disproportionately drawn from Western, educated, industrialized, rich, and democratic populations (WEIRD). Nationwide and international data sets alleviate the problem of WEIRD.
On the other hand, there are shortcomings and limitations. For example, you might be interested in analyzing disposable income, but the variable is gross income. In other words, your research question is confined by what data you have at hand.
Recently the University of the West of Scotland introduced an AI-enabled system that is capable of accurately diagnosing COVID19 in just a few minutes by examining X-ray scans. The accuracy is as high as 98%. This AI system can draw the conclusion by comparing scanned images belonging to patients suffering from COVID19 with healthy individuals and patients with viral pneumonia. The inference engine of this AI system is the deep convolutional neural network (CNN), which is well-known for its applications in computer vision and image classification.
Full article: https://metro.co.uk/2022/01/20/x-rays-could-replace-pcr-tests-for-covid-detection-research-shows-15951946/?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+January+21st%2C+2022&utm_campaign=22012022
That’s my take on it: There are at least four types of artificial neural networks: the artificial neural network (ANN), the convolutional neural network (CNN), the recurrent neural network (RNN), and the generative adversarial network (GAN). CNN is the traditional and the oldest one among them. Nonetheless, it is by no means outdated. As more hidden layers are added into a CNN, it can be turned into a powerful deep learning system. However, I guess it may take months or years for the preceding AI diagnostic system to supplement or replace regular PCR tests for COVID19, due to our natural disposition to be skeptical of novel ideas.
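For readers who have never seen one, the sketch below (assuming TensorFlow/Keras) defines a small binary CNN classifier for 128x128 grayscale scans. It is a generic architecture for illustration, not the published diagnostic model:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# A compact CNN: convolutions extract local image features, pooling
# downsamples them, and the final sigmoid outputs P(positive class).
model = models.Sequential([
    layers.Input(shape=(128, 128, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),  # deeper = more abstract features
    layers.GlobalAveragePooling2D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, epochs=10)  # supply labeled scans
```

Deepening such a network and training it on thousands of labeled X-rays is essentially how diagnostic accuracy like that reported above is achieved.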
On Jan 16, 2022, Chad Reid, VP of marketing and communications at Jotform, posted an article on Inside Big Data. In this article, he argued that there are two types of data visualization: exploratory and explanatory, and both are valuable for fulfilling different needs. He cited an article posted on the American Management Association website to support explanatory data visualization. According to prior research:
· 60% of the adult population are visual learners.
· 64% of participants made an immediate decision following presentations that used an overview map.
· Visual language can shorten meetings by 24%.
· Groups using visual language experienced a 21% increase in their ability to reach consensus.
· Presenters who combined visual and verbal presentations were viewed as 17% more convincing than those who used the verbal mode only.
· Written information is 70% more memorable when it is combined with visuals and actions.
· Visual language improves problem-solving effectiveness by 19%.
· Visual language produces 22% higher results in 13% less time.
https://insidebigdata.com/2022/01/16/explanatory-vs-exploratory-which-data-visualization-is-right-for-your-organization/
https://www.amanet.org/articles/using-visual-language-to-create-the-case-for-change/
That’s my take on it: Traditionally, data visualization is treated as one of four components of exploratory data analysis (EDA) introduced by John Tukey. In academia, confirmatory data analysis (CDA) is still the dominant paradigm. As a matter of fact, EDA and data visualization are very underused in academia. Very often peer-reviewed journal articles show only a few graphs or even none. Although numbers like the t-ratio, F-ratio, Type III sum of squares, p value, eigenvalue, R2, eta squared…etc. make the report look scientific, usually these numbers cannot tell you the pattern of the data and the magnitude of the effect under study, which are supposed to be our primary concerns. We need both exploratory and explanatory data visualization, which has become a common practice in business!
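Anscombe's quartet is the classic demonstration of why the numbers alone are not enough. The sketch below, assuming seaborn and matplotlib, plots four data sets with nearly identical means, variances, correlations, and regression lines, yet radically different patterns:

```python
import seaborn as sns
import matplotlib.pyplot as plt

# The quartet ships with seaborn (downloaded on first use).
df = sns.load_dataset("anscombe")

# Same summary statistics, four very different pictures.
sns.lmplot(data=df, x="x", y="y", col="dataset", ci=None, height=3)
plt.show()

# The tabular summaries would never reveal the difference:
print(df.groupby("dataset")[["x", "y"]].agg(["mean", "std"]))
```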
Recently Europol, the law enforcement agency of the European Union, was ordered to delete a vast amount of data collected over the past six years, after being pressured by the European Data Protection Supervisor (EDPS), the watchdog organization that supports the right to privacy. Under this ruling, Europol has a year to go through 4 petabytes of data to determine which pieces are irrelevant to crime investigation, and in the end, these data must be removed from the system. The responses to this decision are mixed. Not surprisingly, privacy supporters welcome the ruling while law enforcement agencies complain that this action would weaken their ability to fight crime.
Full article: https://www.theverge.com/2022/1/10/22877041/europol-delete-petabytes-crime-data-eu-privacy-law
That’s my take on it: In data mining it is difficult to determine which variables or observations are important or relevant before conducting data exploration and analytics. Very often data that seem to be irrelevant at the beginning turn out to be indispensable later. Take the Swanson process as an example. Dr. Swanson carefully scrutinized the medical literature and identified relationships between some apparently unrelated events, namely, consumption of fish oils, reduction in blood viscosity, and Raynaud’s disease. His hypothesis was that there was a connection between the consumption of fish oils and the effects of Raynaud’s syndrome, and this was eventually validated by experimental studies conducted by DiGiacomo, Kremer, and Shah. Using the same methodology, the links between stress, migraines, and magnesium were also postulated and verified.
Last year Python was the number one programming language, according to TIOBE, a software quality measurement company based in the Netherlands. It produces a monthly index of popular languages across the world, using the number of search results in popular search engines. On the list C (and its variants), Java, Visual Basic, JavaScript, and SQL continue to be among the top 10. R is ranked number 12.
Full article: https://thenextweb.com/news/python-c-tiobe-programming-language-of-the-year-title-analysis?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+January+7th%2C+2022&utm_campaign=08012022
That’s my take on it: The TIOBE index is based on popularity in terms of search results. It doesn’t assess the quality of the programming languages (e.g., ease of use, efficiency, functionality…etc.). Besides TIOBE, there are other indices for programming languages. In PYPL Python is still at the top, whereas in Stack Overflow the champion is JavaScript (see the links below). It is advisable to look at multiple indicators in order to obtain a holistic view.
Stack Overflow:
https://insights.stackoverflow.com/survey/2021#technology-most-popular-technologies
PYPL:
https://pypl.github.io/PYPL.html
A few days ago Timnit Gebru, who resigned from Google and launched her own AI research institute, published an article entitled “For truly ethical AI, its research must be independent of big tech” on The Guardian. In the article she accused several big tech companies of unethical behaviors e.g. Google forced her to withdraw the paper on the bias of language models; Amazon crushed the labor union, and Facebook prioritizes growth over all else. In addition, she mentioned that recently California passed the Silenced No More Act to enable workers to speak against racism, harassment, and other forms of abuse in the workplace, thus preventing big corporations from abusing power. In conclusion, she suggested that we need alternatives rather than allowing big tech companies to monopolize the agenda.
Full article: https://www.theguardian.com/commentisfree/2021/dec/06/google-silicon-valley-ai-timnit-gebru?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+December+10th%2C+2021&utm_campaign=11122021
That’s my take on it: I don’t disagree with Gebru. However, it is difficult to find an optimal balance. Interestingly, Gebru cited the example of spreading misinformation in Ethiopia as evidence that unethical behaviors are widespread in high-tech companies. In November 2020, a war broke out in Ethiopia, the home country of Gebru. According to Gebru, one of the causes is that Facebook allowed unchecked misinformation and hate speech to be posted on Facebook. In a similar vein, a week ago Rohingya sued Facebook for $150 billion, saying that Facebook helped spread hate speech, misinformation, and incitement to violence in Myanmar.
It is important to point out that some political dissidents and people who promote alternate views on various controversial issues (e.g. COVID19, LGBT) also complain that social media companies muted their voice in the name of banning hate speech and misinformation.
Where can we draw the line? As Professor Thomas Sowell said, there is no solution, only a trade-off.
Timnit Gebru is an Ethiopian-American computer scientist who specializes in algorithmic bias and data mining. For a long time, she led various AI task forces at big tech corporations, including Apple and Google. Her career path changed when, in December 2020, a Google manager asked her to either withdraw a pending paper pertaining to bias in language models or remove the names of all the Google employees from the paper. According to Google, the paper ignored the latest developments in bias reduction. Gebru refused to comply and eventually resigned from her position. Recently Gebru announced that she is launching an independent AI research institute focusing on the ethical aspects of AI. Her new organization, the Distributed Artificial Intelligence Research Institute (DAIR), received $3.7 million in funding from the MacArthur Foundation, Ford Foundation, Kapor Center, Open Society Foundation, and the Rockefeller Foundation.
Gebru said she was more successful at changing Google’s policies by publishing papers outside Google that could catch the attention of academics, policy-makers and journalists, rather than raising concerns inside Google. She hoped DAIR could break the monopoly of AI research by big tech companies.
Full article: https://www.washingtonpost.com/technology/2021/12/02/timnit-gebru-dair/?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_content=Data+Science+Insider%3A+December+3rd%2C+2021&utm_campaign=04122021
That’s my take on it: On the one hand, research in AI and machine learning is a bottom-up movement, which is evident by the fact that many breakthroughs and advanced algorithms are contributed by individuals from the open-source community. On the other hand, it is also a top-down revolution when many large-scale projects are funded by the government and big corporations, especially the projects that require vast amounts of data and high-performance computing. No doubt Gebru’s efforts and other similar research groups can provide checks and balances for the field.
Today is the first day of the 2021 Tableau Online Conference. I attended several informative sessions, including the one entitled “Data is inherently human” (see attached). This session highlighted the alarming trend that 85% of all AI projects will deliver erroneous results due to bias in data, algorithms, or human factors, according to a Gartner report. One of the speakers, who is a white woman, pointed out that AI-empowered voice recognition systems have problems with her southern accent. In addition, when she listened to her daughter's TikTok videos, she knew it was English, but she had no idea what it meant. She emphasized that machine learning algorithms, such as sentiment analysis, must be adaptive to linguistic evolution. Some terms that were negative two years ago might mean something positive today.
That’s my take on it: It is a well-known fact that many facial recognition systems have a harder time identifying people with darker skin. Actually, bias in technology affects every ethnic group, not only minorities. I can imagine that voice recognition systems developed in the US might not work well in Australia and South Africa. Further, semantics vary from time to time, and also from place to place. For example, in American culture "PK" stands for pastor’s kid or preacher’s kid. However, the abbreviation "PK" has a negative connotation for Hong Kong people. There is no fool-proof AI system!
The open-source software platform GitHub, owned by Microsoft, stated that for some programming languages, about 30% of new code is suggested by its AI programming tool Copilot, which is built on the OpenAI Codex algorithm. This machine learning algorithm is trained on terabytes of source code and is capable of translating natural human language into a programming language. According to Oege de Moor, VP of GitHub Next, a lot of users have changed their coding practices because of Copilot and, as a result, they have become much more productive in their programming.
https://www.axios.com/copilot-artificial-intelligence-coding-github-9a202f40-9af7-4786-9dcb-b678683b360f.html?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+October+29th%2C+2021&utm_campaign=30102021
That’s my take on it: On the one hand, it is a blessing that cutting-edge technologies can make programming more efficient by modeling after many good examples. But on the other hand, it could suppress potential innovations due to some kind of echo chamber effect. Consider this scenario: Henry Ford consults an AI system in an attempt to build a more efficient process for manufacturing automobiles. Based on a huge collection of “successful” examples learned from other automakers, the machine learning algorithm might suggest to Ford to improve efficiency by hiring more skilled workers and building a bigger plant. The idea of an assembly line would never come up! I am not opposed to programming assistance, but at the end of the day, I must remind myself that I am the ultimate developer!
Two days ago (Oct. 25, 2021) the Financial Times reported that UK’s spy agencies have signed a contract with Amazon Web Services. British intelligence agencies, such as MI5 and MI6, will store classified information in the Amazon cloud platform and also utilize Amazon’s AI for intelligence analytics. British intelligence offices have been using basic forms of AI, such as translation technology, since the dawn of AI. Now they decided to expand AI applications in response to the threat from AI-enabled hostile states.
https://www.reuters.com/world/uk/amazon-signs-deal-with-british-spy-agencies-boost-use-ai-espionage-ft-2021-10-25/
That’s my take on it: The stereotypical image of people in espionage is 007: handsome, strong, and daring to fight dangerous villains in hand-to-hand combat. Not anymore! In the near future, the most powerful weapon for a spy will not be the Beretta pistol (the type of handgun used by James Bond); rather, it will be a mouse and a keyboard. If you want to be the next James Bond, study data science and machine learning!
Currently, I am working on a book chapter regarding ensemble methods. During the literature review process a recent research article caught my attention:
Ismail, A., et al. (2021). A new deep learning-based methodology for video DeepFake detection using XGBoost. Sensors, 21, Article 5413. https://doi.org/10.3390/s21165413
DeepFake is a deep learning AI algorithm that can replace one person with another in video and other digital media. Famous humorous examples include fake videos of Obama and Queen Elizabeth. An infamous example is that in 2017 a Reddit user transposed celebrity faces into porn videos. Ismail and colleagues developed a new DeepFake detection system based on XGBoost, a supervised machine learning method that is capable of gradual model improvement by running many decision trees and analyzing the residuals in each iteration. The authors claimed that the accuracy is 90.73%, meaning that the error rate is 9.27%.
That’s my take on it: In 1997 when Linda Tripp recorded her conversation with Monica Lewinsky about her affair with President Clinton, the legal system accepted the audiotapes as convincing evidence. Today you cannot trust video recordings! Let alone audio! There is a still-photo equivalent to DeepFake: DeepNude. This app can use neural networks to remove clothing from images of people, and the result looks realistic. The app was sold for $50 only. Due to its widespread abuse, the developer retracted it in 2019. However, parts of the source code are open and, as a result, there are many copycats in the market. I am glad that now cutting-edge technologies like XGBoost can be used to detect faked videos, but in the first place, the problem originates from state-of-the-art technologies! According to some experts, DeepFake technologies have been improving exponentially. In late 2017 it took hundreds of images and days of processing time to swap faces in a video clip. Today it requires only a handful of images, or even just text inputs, and a few hours. It is similar to the race between computer viruses and anti-virus software packages. No matter how sophisticated anti-virus software is, Trojan horses, spyware, ransomware…etc. keep evolving. The same contest will happen between DeepFake/DeepNude and fake video/image detection systems. Pandora's box has been opened!
Recently Facebook launched a new research project named Ego4D in an attempt to teach AI to comprehend and interact with the world as humans do, rather than from a third-person perspective. There are two major components in Ego4D: an open dataset of egocentric (first-person perspective) video and a series of benchmarks that Facebook thinks AI systems should be capable of handling in the future. The dataset, which is the biggest of its kind, was collected by 13 universities around the world. About 3,205 hours of video footage were recorded by 855 participants living in nine different countries. Full article:
https://www.theverge.com/2021/10/14/22725894/facebook-augmented-reality-ar-glasses-ai-systems-ego4d-research
That’s my take on it: For a long time research activities have been limited by a narrow definition of data: numbers in a table. In qualitative research, we go one step further by including open-ended responses. But that is not enough! A lead research scientist at Facebook said: “For AI systems to interact with the world the way we do, the AI field needs to evolve to an entirely new paradigm of first-person perception. That means teaching AI to understand daily life activities through human eyes.” Whether there will be any self-aware AI system in the future is controversial. Nonetheless, how Facebook is trying to train AI is also applicable to human researchers. No matter whether the data are structured or unstructured, currently, researchers are investigating issues or phenomena in a third-person perspective. Perhaps video-based or VR-based data could unveil insights that were overlooked in the past.
Nicolas Chaillan, the Pentagon's former Chief Software Officer (CSO), told the Financial Times that China has won the artificial intelligence battle with the US and is heading towards global dominance in key technological sectors. According to Chaillan, "We have no competing fighting chance against China in 15 to 20 years. Right now, it's already a done deal; it is already over in my opinion." Chaillan blamed the gap on slow innovation, the reluctance of U.S. companies such as Google to work with the government on AI, and delays due to extensive ethical debates over the technology. He mocked that U.S. cyber defense capability in some government departments was at the "kindergarten level". Chaillan resigned from his position to protest against the culture of inaction and slow responses.
English version: https://news.trust.org/item/20211011063736-r28k4
Chinese version: https://www.worldjournal.com/wj/story/121468/5809364?from=wj_maintab_index
That’s my take on it: It is not the first time. Right after AT&T's Bell Labs invented the transistor in 1947, Sony immediately bought the license and introduced the first transistor-based radio, while US home electronics manufacturers stayed with bulky vacuum tubes. In the 1960s Japanese automakers produced affordable, dependable, and fuel-efficient small cars, but their US competitors did not experiment with compact cars until 1971. During the last several years China, South Korea, Sweden, and Finland have been investing in 5G infrastructure. However, at the present time, the US still lags behind international competitors in 5G. Will the Biden administration act upon the AI gap? It is never too late!
Today is the third day of the 2021 JMP Discovery Summit. I learned a lot from the plenary talk entitled “Facets of a diverse career” presented by Dr. Alyson Wilson, Associate Vice Chancellor for National Security and Special Research Initiatives and Professor of Statistics at North Carolina State University. Her work experience spans academia, industry, and government. She said that her career is a testament to John Tukey's statement: “The best thing about being a statistician is that you get to play in everyone’s backyard.” She covered many topics in the talk. I would like to highlight some of them as follows:
Many years ago she worked in the Los Alamos National Lab as a specialist in national security science, especially on weapons of mass destruction. You may wonder what role a statistician would play in this domain. Because the US signed the nuclear test-ban treaty, since the 1990s no comprehensive reliability tests have been performed on US nuclear weapons. Alternatively, historical and simulation data were utilized by statisticians like her for reliability analysis. We are not 100% sure whether a missile works until we push the button!
Although Dr. Wilson was trained in traditional statistics, under her leadership NC State University established the Data Sciences Initiative for coordinating DS-related resources and work across ten departments in the university. In March 2021 NC State University launched a university-wide data science academy. The academy aims to enhance the infrastructure, expertise, and services needed to drive data-intensive research discoveries, enhance industry partnerships, and better prepare its graduates to succeed in a data-driven economy.
https://research.ncsu.edu/dsi/
https://news.ncsu.edu/2021/03/nc-state-launches-data-science-academy/
That’s my take on it: In the Q & A session, I asked her: “The US collects a lot of data related to the COVID19 pandemic, but our countermeasures against the pandemic are not as effective as those of some Asian countries (e.g. Taiwan and Singapore). Do you think there is a disconnect between data analytics and decision support?” Dr. Wilson replied: we need to put good science on the data, but decision-making is multi-faceted. Something obvious to statisticians and data scientists may not be obvious to decision-makers.
I agree. Collecting and analyzing data is important, but at the end of the day, the most important thing is what we do with the information.
Recently Mo Gawdat, formerly the Chief Business Officer for Google’s moonshot organization, told Times Magazine that we are getting closer and closer to AI singularity, the point in time when AI becomes self-aware or acquires a superpower beyond our control. He believed that it is inevitable for AI to become as powerful as the Skynet in “Terminator.” At that point, we will helplessly sit there to face the doomsday brought forth by god-like machines. Why did he make such a bold claim? Mo Gawdat said that he had his frightening revelation while working with AI developers at Google to build robotic arms. Once a robot picked up a ball from the floor and then held it up to the researchers. Mo Gawdat perceived that the robot was showing off.
https://futurism.com/the-byte/google-exec-ai-god?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+October+1st%2C+2021&utm_campaign=02102021
That’s my take on it: As a psychologist, I think Mo Gawdat’s concern is a result of anthropomorphism, a tendency of seeing human-like qualities in a non-human entity. It happens all the time e.g. we project our human attributes to pets. Now this disposition extends to robots. However, even though an AI-enabled robot acts like a human, it doesn’t necessarily imply that the robot is really self-conscious or has the potential to become self-aware. I don’t worry about terminators or Red Queen (in the movie “Resident Evil”), at least not in the near future!
Today is the second day of the 2021 JMP Discovery Summit. I would like to highlight what I learned from the plenary session entitled “Delicate Brute Force.” The keynote speaker was John Sall, co-founder of SAS Institute and the inventor of JMP. In the talk Sall pointed out that traditional clustering and data reduction methods are very inefficient at processing big data. To rectify the situation, Sall experimented with several new methods, such as vantage point trees, hybrid Ward, randomized singular value decomposition (SVD), multi-threaded randomized SVD…etc. Improvements were made bit by bit. For example, in a big data set containing 50,000 observations and 210 variables, it took 58 minutes to process the data with R’s fastcluster package. Fast Ward in JMP cut the processing time down to 8 minutes, while the new hybrid Ward took only 22 seconds. Further improvements reduced the processing time to 6.7 seconds.
https://discoverysummit.jmp/en/2021/usa/home.html
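I cannot reproduce JMP's hybrid Ward, but the flavor of such benchmarks is easy to sketch in Python, assuming SciPy and (optionally) the fastcluster package. Timings vary by machine, and the sample below is deliberately smaller than Sall's 50,000 x 210 example:

```python
import time
import numpy as np
from scipy.cluster.hierarchy import linkage

rng = np.random.default_rng(1)
X = rng.normal(size=(5_000, 210))   # simulated data, 5,000 rows x 210 columns

t0 = time.perf_counter()
Z = linkage(X, method="ward")       # classic agglomerative Ward clustering
print(f"scipy Ward: {time.perf_counter() - t0:.1f} s")

try:
    import fastcluster              # same interface, typically much faster
    t0 = time.perf_counter()
    Z = fastcluster.linkage(X, method="ward")
    print(f"fastcluster Ward: {time.perf_counter() - t0:.1f} s")
except ImportError:
    print("fastcluster not installed; skipping the comparison")
```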
That’s my take on it: No doubt analytical algorithms are getting better and better, but very often the adoption rate cannot keep up the pace of technological innovation. I foresee that in the near future standard textbooks will not include hybrid Ward or multi-threaded randomized SVD. On the contrary, I expect widespread resistance. Think about what happened to Bruno, Copernicus, and Galileo when they proposed a new cosmology. Look at how US automakers ignored Edwards Deming. Perhaps we need another form of delicate brute force for psychological persuasion.
Recently Bernard Marr, an expert on enterprise technology, published two articles on Forbes, detailing his prediction of AI trends. In both articles, Marr mentioned the trend of no- or low-code AI. As a matter of fact, not every company has the resources to hire an army of programmers to develop AI and machine learning applications. As a remedy, many of them started considering no- or low-code and self-service solutions. For example, Microsoft and other vendors have been developing natural language processing tools for users to build queries and applications by speaking or writing natural languages (e.g. “Computer! Build a time-series analysis of revenues by product segment from 2015-2021. I want the report in 30 minutes, or else!”)
Marr, B. (2021, September 24). The 7 biggest artificial intelligence (AI) trends in 2022. Retrieved from https://www.forbes.com/sites/bernardmarr/2021/09/24/the-7-biggest-artificial-intelligence-ai-trends-in-2022/?sh=36dcfc022015&utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+September+24th%2C+2021&utm_campaign=25092021
Marr, B. (2021, September 27). The 5 biggest technology trends in 2022. Forbes. Retrieved from https://www.forbes.com/sites/bernardmarr/2021/09/27/the-5-biggest-technology-trends-in-2022/?sh=126c97192414
That’s my take on it: History is cyclical. When I was a student, programming skills were indispensable. In 1984 Apple revolutionized the computing world by implementing the graphical user interface (GUI) on Mac OS (GUI was invented by the Xerox Palo Alto Research Center, not Apple). Since then GUI has made computing not only easier to operate but more pleasant and natural. In recent years coding has become a hot skill again. Once a student told me, “employers don’t want a data analyst doing drag-and-drop, point-and-click…etc.” Not really. As experienced data analyst Bill Kantor said, many tasks are faster and easier to perform in applications with GUI than by programming. Today many corporations are aware of it and therefore they are looking for faster and no- or low- code solutions. But you don’t need to wait for natural language processing. Conventional GUI is good enough to make your life easier!
Today is the last day of Data Con LA 2021. I really enjoyed the talk “Catch me if you can: How to fight fraud, waste, and abuse using machine learning and machine teaching” presented by Cupid Chan. Dr. Chan was so humorous that he boldly claimed, “While others may take days or weeks to train a model, based on my rich experience in AI, I can build a model guaranteed with 99.9% accuracy within 10 seconds!” The fool-proof approach is: “declare that everything is NOT fraud!” Even though fraud is prevalent (credit card fraud, health care fraud, identity theft…etc.), the majority of all transactions and events (99.9%) are legitimate. Consequently, a model that yields high predictive accuracy could be totally useless. This problem also occurs in spotting manufacturing defects, diagnosing rare diseases, and predicting natural disasters. There are different approaches to rectify the situation, including random undersampling (RUS). For example, when a data set is composed of 4,693 positive and 54,333,245 negative cases, all positive cases should be kept, of course, but only a subset of negative cases are randomly selected for machine learning. By doing so the algorithm would not over-learn from an extremely asymmetrical data set. Feeding this subsample into Google’s TensorFlow Boosted Tree Classifier, Chan found that the predictive accuracy is about 85%, rather than 99.9%. But this reduction is a blessing in disguise!
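Random undersampling takes only a few lines. The sketch below assumes Python with pandas and scikit-learn (using scikit-learn's gradient-boosted trees rather than the TensorFlow classifier Chan used); the file name transactions.csv, the is_fraud label, and the all-numeric features are hypothetical:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("transactions.csv")    # hypothetical labeled data

pos = df[df["is_fraud"] == 1]           # keep every rare positive case
neg = df[df["is_fraud"] == 0].sample(n=len(pos) * 5, random_state=0)

balanced = pd.concat([pos, neg]).sample(frac=1, random_state=0)  # shuffle
X = balanced.drop(columns=["is_fraud"])
y = balanced["is_fraud"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)

# Accuracy is now meaningful: on this 1:5 subsample, "everything is
# NOT fraud" would score only about 83%, not 99.9%.
print(f"Accuracy on the undersampled data: {clf.score(X_te, y_te):.3f}")
```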
That’s my take on it: There are many overlapping ideas between traditional statistics and modern data science. Conceptually speaking, RUS is similar to the case-control design in classical research methodologies. For example, in a study that aims to identify factors of illegal drug use at schools, it is extremely difficult, if not impossible, to recruit students who admit to using illegal drugs. A viable approach is to survey all the students in a school anonymously. It turned out that 50 out of 1,000 students reported drug use. However, if these 50 cases were compared against 950 controls (no drug use), the variances of the two groups would be extremely asymmetrical, thus violating the assumptions of most parametric tests. To make a valid comparison, 50 non-drug users were selected from the sample by matching the demographic and psychological characteristics of the 50 cases (Tse, Zhu, Yu, Wong, & Tsang, 2015). As such, learning traditional statistics can pave the way to learning data science and artificial intelligence.
Chan, C. (2021, September). Catch me if you can: How to fight fraud, waste, and abuse using machine learning and machine teaching. Paper presented at Data Con LA 2021, Online. https://www.youtube.com/watch?v=OtWqTviKIpc
Tse, S., Zhu, S., Yu, C. H., Wong, P., & Tsang, S. (2015). An ecological analysis of secondary school students' drug use in Hong Kong: A case-control study. International Journal of Social Psychiatry, 10, 31-40. DOI: 10.1177/0020764015589132. Retrieved from https://pubmed.ncbi.nlm.nih.gov/26060281/
Today is the third day of Data Con LA 2021. Again, there are many interesting and informative sessions. The talk entitled “Too Much Drama and Horror Already: The COVID-19 Pandemic's Effects on What We Watch on TV” presented by Dr. Danny Kim (Senior Data Scientist at Whip Media) caught my attention. The theoretical foundation of his study is the environmental security hypothesis. According to the theory, viewers tend to look for meaningful and serious content in the media during tough times in order to help assuage uncertainty and anxiety. In contrast, people favor fun content when the living condition is not stressful. Utilizing big data (n = 233,284), Kim found that consumption of three genres has dropped substantially since the COVID19 pandemic:
· Drama: 8-11% drop
· Horror: 4-5% drop
· Adventure: 3-4% drop
That’s my take on it: The finding of this study partly corroborates other reports. For example, in August 2020 Nielsen, a leading market measurement firm, found that news consumption (serious content) grew substantially. Nielsen found that 47% of those surveyed had either watched or streamed the news, making it the most popular TV genre. Yes, it’s time to be serious! The pandemic is too serious to be taken lightly.
Today is the second day of Data Con LA. Many sessions are informative and I would like to highlight one of them: “AI/ML/Data Science - Building a Robust Fraud Detection” presented by Gasia Atashian. In the talk, Gasia illustrated how Amazon SageMaker is utilized to detect online fraud (see attached figure). Even though the data size is gigantic, the prediction time is cut down to 30 minutes and the cost is less than $10 a month. Moreover, the predictive accuracy improved by 25% compared with previous models. More importantly, you don’t need a supercomputer to run the program. Rather, what it takes is an 8-core CPU and 32GB of RAM! Her research has been published by Springer:
https://link.springer.com/chapter/10.1007/978-3-030-82196-8_33
That’s my take on it: Models for fraud detection are not new. When I was a graduate student, a typical multivariate statistics class included discriminant analysis (DA), which is based on Fisher’s linear discriminant. The goal of DA is to find a linear combination of features that can classify entities or events into two or more categories (e.g., fraudulent vs. legitimate). At that time it was the state of the art. But no one could foresee that in the near future a system developed by a bookseller/online department store could become one of the most robust classifiers in the realm of mathematics and data analytics.
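Fisher's linear discriminant is still just a few lines of code away. A minimal sketch, assuming scikit-learn and using its bundled breast cancer data set as a stand-in two-class problem:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Classify tumors as malignant or benign with Fisher's linear discriminant.
X, y = load_breast_cancer(return_X_y=True)
lda = LinearDiscriminantAnalysis()
print(f"Cross-validated accuracy: {cross_val_score(lda, X, y, cv=5).mean():.3f}")

# The fitted model is just a linear combination of the features:
# lda.fit(X, y).coef_ holds the discriminant weights.
```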
In addition, when I was a student, big data could only be analyzed by a workstation, such as SGI and Sun, or a supercomputer, such as Cray YMP and CM-5. Today if you have a computer equipped with a multi-core CPU, a GPU, and 16-32GB of RAM, you can be a data scientist!
Life is like a box of chocolates; you never know what will happen next.
On September 9 Microsoft announced that it has formed a joint venture with the Australian Institute for Machine Learning to explore how advanced cloud computing, AI, computer vision, and machine learning can be applied in space. The project scope includes building algorithms for on-board satellite data processing, developing solutions for the remote operation and optimization of satellites, as well as addressing space domain awareness and debris monitoring. According to Professor Tat-Jun Chin, Chair of Sentient Satellites at the Australian Institute for Machine Learning, the collaboration with Microsoft “will allow us to focus on the investigation on the performance of algorithms used to analyze large amounts of earth-observation data from satellites, without needing to be concerned about gaining access to space at the onset.”
The announcement of Microsoft can be found at: https://news.microsoft.com/en-au/features/microsoft-joins-forces-with-australian-institute-for-machine-learning/
That’s my take on it: In the past, Microsoft was considered an imitator rather than an innovator. Excel replaced Lotus and Paradox, MS Word took over the word processing market from Word Perfect, Internet Explorer expelled Netscape, Windows NT dethroned Novell Netware…etc. The pattern is obvious: Microsoft reaped the fruits of other people’s innovations. Nevertheless, in the era of big data and machine learning, Microsoft has reinvented itself to be a different type of company. Now AI features are a large part of the company’s Azure Cloud service and no doubt today Microsoft is one of the leaders in AI innovation. To stay relevant, every organization has to reinvent itself!
This “news” is two months old (published on July 14, 2021). Nonetheless, it is still posted on the front page of “Inside Big Data”. After conducting extensive research, “Inside Big Data” released a report entitled “The insideBigData Impact 50 list for Q3 2021.” As the title implies, the report lists the 50 most impactful companies in data science and machine learning. According to the research team, the selection of these companies is based upon their massive data set of vendors and industry metrics, and the research team also employed machine learning to determine the ranking. The following are the top 20 only:
1. NVIDIA
3. Amazon Web Services
5. Intel
6. Hewlett Packard Enterprise
7. DataRobot
8. Dell Technologies
9. Domino Data Lab
10. H2O.ai
11. Databricks
12. Teradata
13. Qlik
14. TigerGraph
15. Snowflake
16. Kinetica
17. SAS
18. Anaconda (Python data science platform)
19. Salesforce (the parent company of Tableau)
20. OpenAI
That’s my take on it: NVIDIA is the inventor of the graphics processing unit (GPU). But why is it considered the most impactful company for big data? The answer is that deep learning depends on massively parallel processing, and parallel processing needs GPUs. Having more GPUs enables deep learning algorithms to train larger and more accurate models. Currently, two of the world’s five fastest supercomputers (Sierra and Selene) are equipped with NVIDIA technologies.
Contrary to popular belief, proprietary software still has a very strong user base. For example, the ranking of SAS is higher than that of Anaconda, the platform for Python and other open-source resources.
Not surprisingly, IBM (the parent company of SPSS) is not among the top 50. Besides the top 50, fifty-eight companies are on the list of honorable mention. Again, IBM is not there. In 2011 IBM’s AI system Watson beat human experts in an epic Jeopardy match, but this halo cannot make IBM impactful today due to its legacy design.
The full article can be viewed at: https://insidebigdata.com/2021/07/14/the-insidebigdata-impact-50-list-for-q3-2021/
XGBoost is one of the most advanced machine learning algorithms in the open-source community. It was introduced in 2014 by Dr. Tianqi Chen, currently an Assistant Professor at Carnegie Mellon University. The latest version was released in April 2021. XGBoost has recently been dominating applied machine learning and Kaggle competitions for structured or tabular data. No doubt this 21st-century algorithm is far better than least squares regression, which was developed in the 19th century. In spite of its predictive accuracy and computational efficiency, XGBoost is more popular in data science practice than in academia. What is XGBoost, really? Four days ago Shreya Rao published an article entitled “XGBoost regression: Explain it to me like I’m 10-year old” on Towards Data Science. The full article can be accessed at:
https://towardsdatascience.com/xgboost-regression-explain-it-to-me-like-im-10-2cf324b0bbdb
That’s my take on it: It is a common misconception that data science is very difficult to understand and implement. Actually, it is not. As the title of the preceding article implies, it is very easy to follow. You don’t need calculus or matrix algebra; rather, the concepts involved in XGBoost, such as residual, similarity, and gain, require basic arithmetic only. Besides XGBoost, there are several other types of boosting algorithms, such as the Adaptive Boosting algorithm (AdaBoost) and Gradient Boosting (Gradient Boosting is taught in my class “STAT 553”). To boost or not to boost, that’s the question!
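In the same explain-it-simply spirit, the core idea of boosting fits in a dozen lines. The sketch below, assuming NumPy and scikit-learn, fits a shallow tree, computes the residuals, fits the next tree to those residuals, and repeats; XGBoost adds regularization, the similarity/gain scores, and a great deal of engineering on top of this loop:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=300)

learning_rate = 0.1
pred = np.full_like(y, y.mean())   # start from the mean, as boosting does
trees = []

for _ in range(100):
    residual = y - pred                          # what the model still misses
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    pred += learning_rate * tree.predict(X)      # nudge predictions toward y
    trees.append(tree)

print(f"Final training MSE: {np.mean((y - pred) ** 2):.4f}")
```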
Recently Hewlett Packard Enterprise (HPE), a key player in high-performance computing that split off from the parent company HP, released a report about the performance of SAS 9.4 on HPE hardware. According to the report, the key findings demonstrated high scalability when running SAS 9.4 using the Mixed Analytics Workload with the HPE Superdome Flex 280 Server and HPE Primera Storage. These results demonstrated that the combination of the HPE Superdome Flex 280 Server and HPE Primera Storage with SAS 9.4 delivers up to 20GB/s of sustained throughput, up to a 2x performance improvement over the previous generation of server and storage testing. The full report can be downloaded at:
https://insidehpc.com/wp-content/uploads/2021/06/HPE-Reference-Architecture-for-SAS.pdf
That’s my take on it: Although open source has become more and more popular, some people might not realize that open-source software such as R is limited by memory and is not capable of multi-threaded processing out of the box. For high-performance computing and big data analytics, proprietary platforms such as SAS and IBM are still indispensable.
A week ago Microsoft announced that its researchers have developed one of the world’s largest general-purpose neural networks, which utilizes 135 billion parameters. The new AI system is now used in Microsoft’s search engine, Bing. According to Microsoft, the enhanced Bing is better able to determine whether a page is relevant to the query. For example, Bing learned that “Hotmail” is strongly associated with “Microsoft Outlook,” even though the two terms are not close to each other in terms of semantic meaning. The AI system identified a nuanced relationship between them based on their contexts. After the enhancement, Microsoft recorded a 2% increase in click-through rates on the top search results.
https://winbuzzer.com/2021/08/10/microsoft-research-meb-ai-for-bing-is-one-of-the-most-complex-models-ever-xcxwbn/
That’s my take on it: I tried to use the same phrases in both Google and Bing. For example, “Did Paul consult Greek philosophers?” (I deliberately left out the title “St.” or “Apostle”) “Scholars have high h-index”…etc. In most cases, both Google and Bing returned different pages, yet most of them are highly relevant. However, for the query “Scholars have high h-index,” apparently Google beats Microsoft.
Bing returned pages explaining how the h-index is measured, such as “What is a good H-index?” “What is a good H-index for a professor in Biology?” “What number in the h-index is considered a passing grade?” This is not what I want! I want to see a list of highly influential scholars. The top result shown in Google is: “Highly cited researchers (h>100)”. The fourth one is: “Which researcher has the highest h-index?” Google won!
A recent report entitled “Data Science Needs to Grow Up: The 2021 Domino Data Lab Maturity Index” compiled by Domino found that 71% of the 300 data executives at large corporations are counting on data science to boost revenue growth, and 25% of them even expect double-digit growth. However, the report warned that many companies are not making proper investments to accomplish this goal.
In the survey, the participants reported different perceived obstacles to achieving the goal, as shown in the following list:
· Lack of data skills among employees: 48%
· Inconsistent standards and processes: 39%
· Outdated or inadequate tools: 37%
· Lack of buy-in from company leadership: 34%
· Lack of data infrastructure and architecture: 34%
The full report can be downloaded at: https://www.dominodatalab.com/resources/data-science-needs-to-grow-up/
That’s my take on it: To be fair, the above issues happen everywhere. The gap between the goal and the implementation always exists. It makes me remember the theory of Management by Objectives (MBO) introduced by Peter Drucker. MBO refers to the process of goal-setting by both management and employees so that there is a consensus about what is supposed to be done. In my opinion, neither the top-down nor the bottom-up approach alone can ensure a successful implementation of data science.
Currently, the whole world has its eyes on the Olympic Games, and thus another interesting international competition has been overlooked. Recently 50 teams from all over the world competed for a spot in the top ten of the World Data League, an international contest of using data science to solve social problems. There are four stages in this contest and participants are required to solve a variety of problems, including public transportation, climate change, public health, and many others. All the complicated problems and voluminous data are provided by organizations sponsoring this game. After the multi-stage screening, the top ten teams were selected to enter the finals during the first week of July. The final challenge was about how to improve the quality of life by reducing city noise levels. In the end, the winner was an international team consisting of members from Germany, Italy, Portugal, and Australia.
https://insidebigdata.com/2021/07/23/global-data-science-competition-gathered-brilliant-minds-to-solve-social-problems/
That’s my take on it: For a long time, we have been training students how to write academic research papers for peer-reviewed journals. No doubt it is valuable because a shiny vita with a long list of presentations and publications can pave the way for a successful career. Nonetheless, perhaps we should also encourage them to analyze big data for solving real-world problems. There is nothing more satisfying than seeing that someday my students can reverse climate change in the World Data League!
Amazon, one of the leaders in cloud computing, will hold a free data conference on 8/19 between 9:00 AM and 3:00 PM Pacific. The conference aims to introduce the latest technology for building a modern data strategy to consolidate, store, curate, and analyze data at any scale, and share insights with anyone who needs access to the data. Registration is free and the link to register is:
https://aws.amazon.com/events/aws-innovate/data/
That’s my take on it: Besides providing data services, Amazon has also developed several powerful analytical tools, such as Amazon SageMaker: https://aws.amazon.com/sagemaker/
Two decades ago I never imagined a bookseller could become a major player in the field of data analytics or that Jeff Bezos would go into space travel. No matter whether you will use Amazon’s cloud computing or not, it is a good thing to learn about how Amazon can constantly reinvent itself. Look at the fate of another bookseller: Barnes and Noble (B & N). B & N has suffered seven years of declining revenue. To put it bluntly, the writing is on the wall when B & N didn’t want to go beyond its traditional boundary. Do we want ourselves to be like Amazon or Barnes and Noble?
Recently Dresner Advisory Services published the 2021 “Wisdom of Crowds Business Intelligence Market Study” to compare the strength of different vendors in business intelligence. The sample consists of 5,000+ organizations, and the research team rated various vendors by 33 criteria, including acquisition experience, value for the price paid, quality and usefulness of the product, quality of tech support, quality and value of consulting service…etc. The vendors are grouped into technology leaders and overall experience leaders. In this short message, I would like to focus on technology leaders. According to the report, the technology leaders are:
· TIBCO
· Amazon
· Tableau
· Microsoft
That’s my take on it: Never count on a single report! Several other consulting companies, such as Gartner, Forrester, and IDC, also published similar reports, and their results are slightly different. Nonetheless, some brand names appear on all or most reports, such as SAS, Microsoft, and TIBCO. In addition, some names have been re-appearing on several lists for many years. For example, TIBCO was named a leader five times in the Gartner Magic Quadrant for Master Data Management Solutions. SAS has also been recognized as a leader by Gartner Magic Quadrant for Data Science and Machine Learning for eight consecutive years. I want to make it clear that I am not endorsing any particular product. What I am trying to say is that we need to teach students the skills needed by corporations.
Yoshua Bengio, Yann LeCun, and Geoffrey Hinton are recipients of the 2018 ACM Turing Award for their research in Deep Neural Networks. In a paper published in the July issue of the Communications of the ACM, they shared their insights about the future of deep learning. They argued that the current form of deep learning is “fragile” in the sense that it relies on the assumption that incoming data are “independent and identically distributed” (i.i.d.). Needless to say, this expectation is unrealistic; in the real world, almost everything is related to everything else. Due to the messiness of the real world, they said, “The performance of today’s best AI systems tends to take a hit when they go from the lab to the field.” A common solution is to feed the AI system with more and diverse data. In other words, currently, AI systems are example-based, rather than rule-based. However, some scientists have reverted to the classical approach by mixing data-driven neural networks and symbolic manipulation. But Bengio, Hinton, and LeCun do not believe that it can work. The full paper can be accessed at:
https://cacm.acm.org/magazines/2021/7/253464-deep-learning-for-ai/fulltext
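The i.i.d. point is easy to demonstrate. Below is a minimal sketch, with purely synthetic data, of a model trained on one distribution (the “lab”) losing accuracy when the test distribution shifts (the “field”):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_data(n, shift=0.0):
    # Two Gaussian classes; `shift` moves the whole test cloud.
    X0 = rng.normal(0, 1, (n, 2)) + shift
    X1 = rng.normal(2, 1, (n, 2)) + shift
    X = np.vstack([X0, X1])
    y = np.array([0] * n + [1] * n)
    return X, y

X_train, y_train = make_data(500)            # "lab" data
X_iid, y_iid = make_data(500)                # field data, same distribution
X_shift, y_shift = make_data(500, shift=1.5) # field data, shifted distribution

clf = LogisticRegression().fit(X_train, y_train)
print("i.i.d. test accuracy:  ", accuracy_score(y_iid, clf.predict(X_iid)))
print("shifted test accuracy: ", accuracy_score(y_shift, clf.predict(X_shift)))
# The second number is clearly lower: the model "takes a hit in the field."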
That’s my take on it: The same problems described by Bengio, Hinton, and LeCun can also be found in classical statistics: unrealistic assumptions, messy data, and failure of generalizing the results from the lab to the field. As a remedy, some social scientists look for ecological validity. For example, educational researchers realize that it is impossible for teachers to block all interferences by closing the door. Contrary to the experimental ideal that a good study is a "noiseless" one, a study is regarded as ecologically valid if it captures teachers' everyday experiences as they are bombarded by numerous distractions. I believe that the same principle is applicable to deep learning.
A few days ago InsideBigData published an article entitled “The Rise and Fall of the Traditional Data Enterprise.”
The editorial team boldly claimed, “We are witnessing the death of traditional enterprise computing and storage – a real changing of the guard. Companies like Databricks, Snowflake, and Palantir are obliterating companies initially thought to have been competitors: EMC, HP, Intel, Teradata, Cloudera, and Hadoop.”
Their argument is straightforward: Cloud-based computing simplifies data storage and usage. The cloud platform is ideal for storing and analyzing large-scale semi-structured data. In contrast, batch-based processing and relational databases for structured data are far less efficient. The full article can be accessed at:
https://insidebigdata.com/2021/07/02/the-rise-and-fall-of-the-traditional-data-enterprise/
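For readers who want a feel for the semi-structured-data argument, here is a minimal sketch (my example, not the article’s): nested, ragged records that a rigid relational schema handles awkwardly, but that pandas can flatten on demand.

import pandas as pd

events = [  # hypothetical clickstream records with nested, ragged fields
    {"user": "a1", "device": {"os": "ios", "model": "13"}, "tags": ["promo"]},
    {"user": "b2", "device": {"os": "android"}},  # missing fields are common
]

flat = pd.json_normalize(events)
print(flat)
# Columns like user, device.os, device.model, tags; missing fields become NaN.
# A fixed relational schema would force these decisions up front.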
That’s my take on it: History has been repeating itself. Back in the 1960s and 1970s, IBM mainframes seemed to be invincible and indispensable. However, in 1977 when Digital Equipment Corporation (DEC) introduced minicomputers running on VAX, IBM lost a big chunk of its market share to DEC. During the 1980s and 90s UNIX was gaining popularity, and to cope with the trend, DEC attempted to shift the focus of its R&D to RISC technology, but it was too little and too late. In 2005 VAX ceased to exist.
Whether old players can continue to thrive depends on their adaptivity and the speed of reaction. Microsoft is a successful example. As InsideBigData pointed out, players like Microsoft Azure “have already commoditized storage at scales old-school players like EMC and HP could only have dreamed of.” Recently SAS Institute grabbed the opportunity by forming a joint venture with Microsoft in cloud computing.
Do we want ourselves to be like Hadoop and DEC/VAX, or Microsoft and SAS?
Although the predictive power of neural networks is supreme or even unparalleled, the process is considered a black box and sometimes the result is uninterpretable. Very often these models are fine-tuned by numerous rounds of trial and error with big data. Simply put, it is a brute-force approach (given enough computing power and data, you can always get the answer). To rectify the situation, Sho Yaida of Facebook AI Research, Dan Roberts of MIT and Salesforce, and Boris Hanin at Princeton University co-authored a book entitled “The Principles of Deep Learning Theory: An Effective Theory Approach to Understanding Neural Networks.” In the book, they explain the theoretical framework of deep learning so that data analysts can significantly reduce the amount of trial and error by understanding how to optimize different parameters. The book will be published by Cambridge University Press in early 2022 and the full manuscript can be downloaded from:
https://deeplearningtheory.com/PDLT.pdf?fbclid=IwAR1Yapc3x3ADeaTy7SZ5UZW8l4WU9dhoiBEjWTNv-VhaXaTRvTPiUKSYvPw
I haven’t read the whole book yet; nonetheless, I had a quick glance and found that the book is fairly accessible. As the authors said in the preface, the book is appropriate for everyone with knowledge of linear algebra and probability theory, and with a healthy interest in neural networks.
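As a point of reference, the kind of brute-force tuning the authors want to reduce looks like the sketch below: an exhaustive cross-validated grid search over the width and learning rate of a small network. The dataset and grid values are arbitrary choices of mine.

from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)

grid = {
    "hidden_layer_sizes": [(16,), (64,), (64, 64)],
    "learning_rate_init": [1e-2, 1e-3],
}
# 6 configurations x 3 folds = 18 full training runs, just for two knobs.
search = GridSearchCV(MLPClassifier(max_iter=500, random_state=0), grid, cv=3)
search.fit(X, y)
print(search.best_params_, search.best_score_)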
China is in the midst of upgrading its military, including its tanks, missile systems, troop equipment, and fighter jets. Among the new systems being developed is AI. In a recent simulation one of China’s most experienced air force pilots, Fang Guoya, was defeated by an AI combatant system. According to Fang, early in the training it was easy for him to “shoot down” the AI adversary. As you may already know, AI is capable of machine learning. After accumulating more and more data, the AI system outperformed Fang Guoya.
https://www.businessinsider.com/china-pits-fighter-pilots-against-ai-aircraft-in-simulated-dogfights-2021-6?r=US&IR=T&utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+June+18th%2C+2021&utm_campaign=19062021
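The “accumulating data” mechanism is the essence of reinforcement learning. As a toy analogue (my illustration; the actual military system is undisclosed), the tabular Q-learning sketch below shows an agent’s policy improving purely through repeated episodes of experience:

import numpy as np

# Toy corridor: states 0..4, the goal is state 4; actions: 0 = left, 1 = right.
n_states, n_actions = 5, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.5, 0.9, 0.2
rng = np.random.default_rng(0)
steps_per_episode = []

for episode in range(200):
    s, steps = 0, 0
    while s != n_states - 1:
        # Epsilon-greedy, acting randomly while Q is still uninformative.
        if rng.random() < eps or Q[s, 0] == Q[s, 1]:
            a = int(rng.integers(n_actions))
        else:
            a = int(Q[s].argmax())
        s_next = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
        r = 1.0 if s_next == n_states - 1 else 0.0
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s, steps = s_next, steps + 1
    steps_per_episode.append(steps)

print("first episode:", steps_per_episode[0], "steps")
print("last episode: ", steps_per_episode[-1], "steps")  # far fewer after learning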
That’s my take on it: It is not surprising to see that the abilities of AI continue to outgrow even the best human experts. As a matter of fact, AI has been outsmarting humans for a long time. Back in 1997, IBM Deep Blue had already beaten the world chess champion after a six-game match. In 2011 IBM Watson competed against the best human contestants on Jeopardy and won the first prize. In 2016 Google’s AlphaGo beat a 9-dan (the highest level) professional Go player. It is noteworthy that in 2017 a new system called AlphaGo Zero defeated AlphaGo by 100-0! Only AI can defeat AI! Perhaps future wars will be fought between AI systems, and humans will play only a supporting role.
In June the US Senate passed the bill entitled the US Innovation and Competition Act (USICA) with the purpose of boosting American semiconductor production, the R & D of Artificial Intelligence, and other crucial technologies. The bill approves $52 billion for domestic semiconductor manufacturing, as well as a 30 percent boost in funding for the National Science Foundation (NSF), and $29 billion for a new science directorate to focus on applied sciences. Additionally, the bill will provide $10 billion to reshape cities and regions across the country into “technology hubs,” promoting R & D into cutting-edge industries and creating high-paying job opportunities.
https://www.theverge.com/2021/6/8/22457293/semiconductor-chip-shortage-funding-frontier-china-competition-act
That’s my take on it: There will be many funding opportunities at NSF and other funding agencies. We should make ourselves ready to catch the wave. However, the US is facing a shortage of researchers, engineers, programmers, and other types of high-skilled workers. One of the strategies for boosting semiconductor production is to attract foreign investment. In June Taiwan Semiconductor Manufacturing Co. (TSMC) broke ground on a chipmaking facility in Chandler, Arizona (I lived there 10 years ago). One of the obstacles that might hinder TSMC from fully developing its chipmaking capacity is that the number of graduates related to science and engineering in the U.S. has diminished. In April, TSMC founder Morris Chang bluntly said that the U.S. lacks "dedicated talent ... as well as the capability to mobilize manufacturing personnel on a large scale." Further, a new report released by research group New American Economy found that for every unemployed technology worker in the US in 2020, there were more than seven job postings for computer-related positions. Perhaps it’s time to reconsider and restructure our academic curriculum!
https://asia.nikkei.com/Business/Tech/Semiconductors/TSMC-in-Arizona-Why-Taiwan-s-chip-titan-is-betting-on-the-desert
https://www.cnbc.com/2021/06/10/study-employers-seek-immigrants-amid-shortage-of-high-skilled-workers.html
Back in January 2021, Google set a record in the field of natural language processing by building a new model with 1.6 trillion parameters. Recently China broke the record by introducing WuDao 2.0, carrying 1.75 trillion parameters. WuDao 2.0 is able to understand both Chinese and English, thus providing appropriate responses in real-world situations. According to Chinese AI researcher Blake Yan, “These sophisticated models, trained on gigantic data sets, only require a small amount of new data when used for a specific feature because they can transfer knowledge already learned into new tasks, just like human beings. Large-scale pre-trained models are one of today’s best shortcuts to artificial general intelligence.”
https://www.techradar.com/news/china-outstrips-gpt-3-with-even-more-ambitious-ai-language-model?utm_source=ONTRAPORT-email-broadcast&utm_medium=ONTRAPORT-email-broadcast&utm_term=&utm_content=Data+Science+Insider%3A+June+4th%2C+2021&utm_campaign=05062021
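The “transfer knowledge already learned” idea can be tried in a few lines. Here is a minimal sketch using the Hugging Face transformers library (my choice of tool, not the article’s): a model pre-trained on massive text corpora is reused for a downstream task without any training on our part.

from transformers import pipeline

# Downloads a pre-trained checkpoint on first use; no task-specific
# training happens here, only reuse of already-learned knowledge.
classifier = pipeline("sentiment-analysis")
print(classifier("Large pre-trained models transfer remarkably well."))
# Expected output: a label such as POSITIVE with a confidence score.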
That’s my take on it:
1. As natural language processing, image recognition, and other AI technologies become more and more sophisticated, researchers can go beyond structured data (e.g. numbers in a row-by-column table) by tapping into unstructured data (e.g. text, audio, image, movie…etc.).
2. Although the US bans exporting crucial AI technologies to China, China has been surging ahead in the research on AI and machine learning. China has at least three advantages: (a) AI needs big data, and China can access massive data. (b) China is capable of training a large number of data scientists and AI researchers; Chinese students are more willing to study STEM subjects no matter how challenging they are. (c) China tends to take bold steps to apply AI and machine learning in different domains, rather than maintaining the status quo.
This Wednesday (June 9) the Educational Opportunity Project (EOP) at Stanford University will release new data (Version 4.1) sourced from the Stanford Education Data Archive (SEDA). This is a comprehensive national database consisting of 10 years of academic performance data from 2008-2009 to 2017-2018.
With the advance of online interactive data visualization tools, you don’t have to wait for a year or more to see the results of this type of big data analytics. Now you can explore the data anytime anywhere on your own as long as there is a Web browser on your computer. For example, the following webpage is the GIS map of student test scores and socioeconomic status (SES) by the school district. In addition to the GIS map, the webpage also displays a scatterplot indicating a strong relationship between test scores and SES.
https://edopportunity.org/explorer/#/map/none/districts/avg/ses/all/3.5/38/-97/
· To look for specific information about your school district, use the hand tool to move the map in order to place your state at the center.
· Click on the + sign on the right.
· Hover the mouse over your school district, e.g. the average test score of Azusa is -1.83 and the SES is +0.19 (see the attached PNG image “Azusa_scores_n_SES”).
You can switch to a different view to interact with the chart. For example, by clicking on a particular data point, I can see the trend in test scores by ethnic groups in that particular school district (see the attached PNG image “Boston_by_ethnicity.png”).
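For readers who prefer scripting to the web explorer, the following minimal sketch reproduces the flavor of the scores-vs-SES scatterplot with synthetic stand-in numbers (the real SEDA files, with their own column names, are downloadable from edopportunity.org):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
ses = rng.normal(0, 1, 300)                   # district SES composite (synthetic)
scores = 0.8 * ses + rng.normal(0, 0.4, 300)  # strong positive relationship

plt.scatter(ses, scores, s=10, alpha=0.6)
plt.xlabel("District socioeconomic status (SES)")
plt.ylabel("Average test score (grade-level units)")
plt.title("Synthetic illustration of the SEDA scores-vs-SES pattern")
plt.show()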
That’s my take on it: I am excited by this type of democratization of data analytics. Rather than merely counting on what experts tell you, today you can access the data to obtain specific information that is relevant to yourself.
Azusa_scores_n_SES.png
Boston_by_ethnicity.png
On June 3 Knowledge Discovery Nuggets posted an article entitled “Will There Be a Shortage of Data Science Jobs in the Next 5 Years?” written by experienced data scientist Pranjal Saxena.
At the beginning of the article Saxena paints a gloomy picture of the future job market:
“In 2019, data scientists used to spend days in data gathering, data cleaning, feature selection, but now we have many tools in the market that can do these tasks in a few minutes.
On the other hand, we were trying different machine learning libraries like logistic regression, random forest, boosting machines, naive Bayes, and other data science libraries to get a better model.
But, today, we have tools like H2O, PyCaret, and many other cloud providers who can do the same model selection on the same data using the combination of other 30–50 machine learning libraries to give you the best machine learning algorithms for your data with least error…
Each company is aware of this fact, so after five years, when these cloud-enabled data science tools will become more efficient and will be able to provide better accuracy in much less amount of time, then why will companies invest in hiring us and not buying the subscription of those tools?”
In the end, Saxena shows the ray of hope by saying, “Each company aims to build their product so that instead of depending on others, they can build their automated system and then sell them in the market to earn more revenue. So, yes, there will be a need for data scientists who can help industries build automation systems that can automate the task of machine learning and deep learning.”
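To see what Saxena means, here is a minimal sketch of automated model comparison using PyCaret, one of the tools he names (exact arguments vary across PyCaret versions, and the dataset is just a convenient scikit-learn built-in):

import pandas as pd
from sklearn.datasets import load_breast_cancer
from pycaret.classification import setup, compare_models

data = load_breast_cancer(as_frame=True).frame  # features plus a 'target' column
setup(data=data, target="target", session_id=0)
best = compare_models()  # trains and ranks a whole library of models automatically
print(best)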
That’s my take on it: Data analysts like me are cautious of the lack of transparency and interpretability of the “black box” because the practice of handing over human judgment to the computer is not any better than blindly following the 0.05 alpha level. At most, data science or machine learning should be used to augment human capabilities, not replace them. The key is to achieve an optimal balance. As Harvard DS researcher Brodie said, “too much human-in-the-loop leads to errors; too little leads to nonsense”. I think we will need experienced data scientists to interpret the results and make corrections when the automated system makes a mistake.
Nonetheless, what Saxena described is a “good problem.” I know many people still struggling with entering numbers into Excel manually, let alone running automated tools.
Full article: https://www.kdnuggets.com/2021/06/shortage-data-science-jobs-5-years.html
AI in China’s Walmart stores
Recently Walmart, one of the world's largest retailers, introduced RetailAI Fresh into China’s Walmart stores for self-service customers. RetailAI Fresh is a software app developed by Malong Technologies, running on GPU-accelerated servers from Dell Technologies. Self-checkout is easy when the package has a barcode, but it becomes challenging for a scanner to recognize fresh produce products. RetailAI Fresh can rectify the situation by integrating state-of-the-art AI recognition technology into traditional self-service scales. It is noteworthy that Malong Technologies was founded by Chinese data scientists and engineers.
https://insidebigdata.com/2021/05/24/walmart-innovates-at-the-retail-edge-with-ai/
https://www.linkedin.com/company/msight-ai/people/
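RetailAI Fresh itself is proprietary, so the following is only a minimal sketch of the underlying pattern (scoring a produce image with a pre-trained vision model) using torchvision; the random tensor stands in for a real, preprocessed camera frame.

import torch
from torchvision import models

# Pre-trained ImageNet classifier; the weights API requires a recent torchvision.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.eval()

frame = torch.randn(1, 3, 224, 224)  # stand-in for a normalized RGB image
with torch.no_grad():
    logits = model(frame)

top = logits.softmax(dim=1).topk(3)
print(top.indices, top.values)  # top-3 class ids and their probabilities
# A production system would map class ids to produce labels and prices.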
That’s my take on it: There are a lot of brilliant Chinese computer scientists and engineers working on revolutionary products. The experiment or beta testing in China’s Walmart stores is just one of many examples. We should look beyond our borders in order to absorb new ideas.
AI for the insurance industry
Using demographic data to customize insurance policies is not new. However, in the past customers were treated unfairly due to outdated data or incorrect predictions made by legacy software applications. Not anymore. According to a recent article on InsideBigData, today AI is capable of processing 4,000 data points in minutes and also analyzing 20 years’ worth of mortality, demographic, health, and government trends for better decision support. As a result, insurance companies that utilize both AI and cloud-based data can create fairer policies to serve current and potential clients.
https://insidebigdata.com/2021/05/25/is-ai-the-future-of-the-insurance-industry/
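As a concrete (and deliberately oversimplified) sketch of the risk scoring the article alludes to, the code below fits a model to synthetic demographic and health features; real actuarial models draw on far more data points.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(2)
n = 2000
age = rng.uniform(20, 80, n)
smoker = rng.integers(0, 2, n)
bmi = rng.normal(27, 4, n)
# Synthetic claim outcome: risk rises with age, smoking, and BMI.
p = 1 / (1 + np.exp(-(0.04 * age + 0.8 * smoker + 0.05 * bmi - 5)))
claim = rng.random(n) < p

X = np.column_stack([age, smoker, bmi])
model = GradientBoostingClassifier().fit(X, claim)
print(model.predict_proba([[45, 1, 31]])[0, 1])  # estimated risk for one applicant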
That’s my take on it: Today many people want to keep their privacy and complain about AI and big data, viewing them as “weapons of math destruction” or “Big Brother in 1984”. On the other hand, people expect corporations to improve our well-being by utilizing better algorithms and more accurate data. These two goals are contradictory! It is important to point out that insurance companies have been collecting customer data for many years. If AI can improve predictive models and data accuracy, I don’t see a reason to oppose it.
Currently there is a special exhibition at London’s Design Museum: portrait paintings and drawings by an AI android named Ai-Da. Ai-Da is co-developed by robotics firm ‘Engineered Arts’ and experts at the University of Oxford. Ai-Da is able to ‘see’ by utilizing a computer vision system, and therefore she can create a portrait of someone in front of her. Because the creative process is based upon machine learning algorithms, she will not duplicate the same work, and therefore each picture is unique. Unfortunately, I cannot visit the museum due to COVID-19.
IBM, along with SAS and TIBCO, is named one of the leaders in the 2021 Gartner Report of data science and machine learning platforms. Although the flagship products of IBM are IBM Watson Studio, IBM Cloud Pak for Data, IBM SPSS Modeler, and IBM Watson Machine Learning, IBM heavily invests in Python and other open-source resources. Recently IBM announced that it will make the Python distribution platform Anaconda available for Linux on IBM Z. Anaconda is the leading Python data science platform, and 25 million users use this platform for machine learning, data science, and predictive analytics.
Daniel Kahneman won the Nobel Prize in economics in 2002 for his work on the psychology of decision-making. In response to the questions about the impact of AI on our society during a recent interview by the Guardian, Kahneman said, “There are going to be massive consequences of that change that are already beginning to happen. Some medical specialties are clearly in danger of being replaced, certainly in terms of diagnosis. And there are rather frightening scenarios when you’re talking about leadership. Once it’s demonstrably true that you can have an AI that has a far better business judgment, say, what will that do to human leadership?... I have learned never to make forecasts. Not only can I certainly not do it – I’m not sure it can be done. But one thing that looks very likely is that these huge changes are not going to happen quietly. There is going to be massive disruption. Technology is developing very rapidly, possibly exponentially. But people are linear. When linear people are faced with exponential change, they’re not going to be able to adapt to that very easily. So clearly, something is coming… And clearly, AI is going to win [against human intelligence]. It’s not even close. How people are going to adjust to this is a fascinating problem – but one for my children and grandchildren, not me.”
No matter whether you support developing AI or not, it is good to take multiple perspectives into consideration. A week ago, Alberto Romero published an article entitled "5 Reasons Why I Left the AI Industry" on "Towards Data Science". He complained that AI is overhyped, and so we should not expect to see AI at the level of human intelligence anytime soon. In addition, in his view, AI has become a black box and many people don't understand what is going on behind the scenes. The following is a direct quotation:
"The popularization of AI has made every software-related graduate dream with being the next Andrew Ng. And the apparent easiness with which you can have a powerful DL model running in the cloud, with huge databases to learn from, has made many enjoy the reward of seeing results fast and easy. AI is within reach of almost anyone. You can use Tensorflow or Keras to create a working model in a month. Without any computer science (or programming) knowledge whatsoever. But let me ask you this: Is that what you want? Does it fulfill your hunger for discovering something new? Is it interesting? Even if it works, have you actually learned anything? It seems to me that AI has become an end in itself. Most don’t use AI to achieve something beyond. They use AI just for the sake of it without understanding anything that happens behind the scenes. That doesn’t satisfy me at all."
My response is: I never expect we will see an android like Commander Data or the Terminator in the near future. Indeed, we don't need that level of AI to improve our performance or well-being. Nonetheless, it is a good strategy to aim high. If a researcher tries to publish 7 articles per year, in the end there may be only 3-5. But if he or she sets the goal at 3 articles per year, the result could be zero! By the same token, the ultimate goal in AI seems unattainable, but it is how we stay motivated. In addition, what Alberto described about AI programmers and users also happens among people who use traditional statistics. Some people feed the data into the computer, push a button, and then paste the output into the paper without knowing what F values and p values mean. Misuse or even abuse happens everywhere. The proper way to deal with the issue is education, rather than abandoning the methodology altogether.
The link to the full article is: https://towardsdatascience.com/5-reasons-why-i-left-the-ai-industry-2c88ea183cdd
Copyright © 2020 ASA. All rights reserved.
[76494, 76676, 0.0], [76676, 76854, 0.0], [76854, 76940, 0.0], [76940, 76977, 0.0], [76977, 77147, 0.0], [77147, 77322, 0.0], [77322, 77500, 0.0], [77500, 77679, 0.0], [77679, 77832, 0.0], [77832, 77895, 0.0], [77895, 77947, 0.0], [77947, 78143, 0.0], [78143, 78331, 0.0], [78331, 78520, 0.0], [78520, 78552, 0.0], [78552, 78749, 0.0], [78749, 78794, 0.0], [78794, 78987, 0.0], [78987, 79181, 0.0], [79181, 79287, 0.0], [79287, 79368, 0.0], [79368, 79461, 0.0], [79461, 79648, 0.0], [79648, 79842, 0.0], [79842, 80041, 0.0], [80041, 80228, 0.0], [80228, 80421, 0.0], [80421, 80596, 0.0], [80596, 80787, 0.0], [80787, 80948, 0.0], [80948, 81009, 0.0], [81009, 81139, 0.0], [81139, 81306, 0.0], [81306, 81415, 0.0], [81415, 81568, 0.0], [81568, 81689, 0.0], [81689, 81837, 0.0], [81837, 82004, 0.0], [82004, 82195, 0.0], [82195, 82388, 0.0], [82388, 82585, 0.0], [82585, 82779, 0.0], [82779, 82977, 0.0], [82977, 82996, 0.0], [82996, 83187, 0.0], [83187, 83370, 0.0], [83370, 83389, 0.0], [83389, 83565, 0.0], [83565, 83737, 0.0], [83737, 83926, 0.0], [83926, 84084, 0.0], [84084, 84267, 0.0], [84267, 84509, 0.0], [84509, 84698, 0.0], [84698, 84887, 0.0], [84887, 85074, 0.0], [85074, 85240, 0.0], [85240, 85373, 0.0], [85373, 85543, 0.0], [85543, 85663, 0.0], [85663, 85855, 0.0], [85855, 86033, 0.0], [86033, 86106, 0.0], [86106, 86260, 0.0], [86260, 86582, 0.0], [86582, 86779, 0.0], [86779, 86970, 0.0], [86970, 87082, 0.0], [87082, 87239, 0.0], [87239, 87393, 0.0], [87393, 87506, 0.0], [87506, 87586, 0.0], [87586, 87702, 0.0], [87702, 87843, 0.0], [87843, 87948, 0.0], [87948, 88041, 0.0], [88041, 88182, 0.0], [88182, 88259, 0.0], [88259, 88332, 0.0], [88332, 88458, 0.0], [88458, 88570, 0.0], [88570, 88671, 0.0], [88671, 88799, 0.0], [88799, 88950, 0.0], [88950, 89060, 0.0], [89060, 89212, 0.0], [89212, 89279, 0.0], [89279, 89407, 0.0], [89407, 89541, 0.0], [89541, 89606, 0.0], [89606, 89742, 0.0], [89742, 89841, 0.0], [89841, 89906, 0.0], [89906, 90016, 0.0], [90016, 90206, 0.0], [90206, 90391, 0.0], [90391, 90564, 0.0], [90564, 90641, 0.0], [90641, 90746, 0.0], [90746, 90790, 0.0], [90790, 90967, 0.0], [90967, 91064, 0.0], [91064, 91250, 0.0], [91250, 91441, 0.0], [91441, 91472, 0.0], [91472, 91656, 0.0], [91656, 91760, 0.0], [91760, 91866, 0.0], [91866, 91921, 0.0], [91921, 92101, 0.0], [92101, 92284, 0.0], [92284, 92459, 0.0], [92459, 92648, 0.0], [92648, 92700, 0.0], [92700, 92889, 0.0], [92889, 92929, 0.0], [92929, 93111, 0.0], [93111, 93304, 0.0], [93304, 93495, 0.0], [93495, 93682, 0.0], [93682, 93883, 0.0], [93883, 93976, 0.0], [93976, 94111, 0.0], [94111, 94300, 0.0], [94300, 94488, 0.0], [94488, 94678, 0.0], [94678, 94867, 0.0], [94867, 94884, 0.0], [94884, 95077, 0.0], [95077, 95104, 0.0], [95104, 95402, 0.0], [95402, 96137, 0.0], [96137, 96608, 0.0], [96608, 96670, 0.0], [96670, 97013, 0.0], [97013, 97623, 0.0], [97623, 98424, 0.0], [98424, 98483, 0.0], [98483, 98979, 0.0], [98979, 99193, 0.0], [99193, 99537, 0.0], [99537, 99666, 0.0], [99666, 99733, 0.0], [99733, 99832, 0.0], [99832, 99925, 0.0], [99925, 100005, 0.0], [100005, 100138, 0.0], [100138, 100266, 0.0], [100266, 100380, 0.0], [100380, 100407, 0.0], [100407, 100429, 0.0], [100429, 100903, 0.0], [100903, 101195, 0.0], [101195, 101442, 0.0], [101442, 101560, 0.0], [101560, 101640, 0.0], [101640, 101704, 0.0], [101704, 101776, 0.0], [101776, 101890, 0.0], [101890, 101985, 0.0], [101985, 102130, 0.0], [102130, 102222, 0.0], [102222, 102355, 0.0], [102355, 102455, 0.0], [102455, 102557, 0.0], [102557, 103092, 0.0], [103092, 103494, 0.0], 
[103494, 104022, 0.0], [104022, 104103, 0.0], [104103, 104365, 0.0], [104365, 104385, 0.0], [104385, 104413, 0.0], [104413, 104444, 0.0], [104444, 104564, 0.0], [104564, 105386, 0.0], [105386, 105433, 0.0], [105433, 105654, 0.0], [105654, 106148, 0.0], [106148, 106810, 0.0], [106810, 106906, 0.0], [106906, 106999, 0.0], [106999, 107115, 0.0], [107115, 107222, 0.0], [107222, 107311, 0.0], [107311, 107457, 0.0], [107457, 107535, 0.0], [107535, 107676, 0.0], [107676, 107793, 0.0], [107793, 107908, 0.0], [107908, 108055, 0.0], [108055, 108306, 0.0], [108306, 108684, 0.0], [108684, 109560, 0.0], [109560, 109622, 0.0], [109622, 110096, 0.0], [110096, 110489, 0.0], [110489, 110553, 0.0], [110553, 110622, 0.0], [110622, 110976, 0.0], [110976, 111002, 0.0], [111002, 111085, 0.0], [111085, 111149, 0.0], [111149, 111299, 0.0], [111299, 111654, 0.0], [111654, 112221, 0.0], [112221, 112288, 0.0], [112288, 112330, 0.0], [112330, 112345, 0.0], [112345, 112381, 0.0], [112381, 112457, 0.0], [112457, 113178, 0.0], [113178, 113631, 0.0], [113631, 113920, 0.0], [113920, 114394, 1.0], [114394, 115652, 0.0], [115652, 116079, 0.0], [116079, 116522, 0.0], [116522, 116629, 0.0], [116629, 116699, 0.0], [116699, 117285, 0.0], [117285, 117827, 0.0], [117827, 117933, 0.0], [117933, 118365, 0.0], [118365, 118466, 0.0], [118466, 118819, 0.0], [118819, 118888, 0.0], [118888, 119095, 0.0], [119095, 120152, 0.0], [120152, 120632, 0.0], [120632, 121412, 0.0], [121412, 121429, 0.0], [121429, 121453, 0.0], [121453, 121462, 0.0], [121462, 121475, 0.0], [121475, 121501, 0.0], [121501, 121527, 0.0], [121527, 121543, 0.0], [121543, 121567, 0.0], [121567, 122010, 0.0], [122010, 122192, 0.0], [122192, 122819, 0.0], [122819, 122889, 0.0], [122889, 122966, 0.0], [122966, 123026, 0.0], [123026, 123644, 0.0], [123644, 123688, 0.0], [123688, 123975, 0.0], [123975, 124017, 0.0], [124017, 124390, 0.0], [124390, 124446, 0.0], [124446, 125407, 0.0], [125407, 125611, 0.0], [125611, 125651, 0.0], [125651, 125695, 0.0], [125695, 125770, 0.0], [125770, 125822, 0.0], [125822, 125884, 0.0], [125884, 125902, 0.0], [125902, 126032, 0.0], [126032, 126599, 0.0], [126599, 126613, 0.0], [126613, 127031, 0.0], [127031, 127074, 0.0], [127074, 127120, 0.0], [127120, 127208, 0.0], [127208, 127890, 0.0], [127890, 128002, 0.0], [128002, 128215, 0.0], [128215, 128702, 0.0], [128702, 128817, 0.0], [128817, 129616, 0.0], [129616, 129855, 0.0], [129855, 130164, 0.0], [130164, 130466, 0.0], [130466, 130783, 0.0], [130783, 131069, 0.0], [131069, 131747, 0.0], [131747, 132476, 0.0], [132476, 132635, 0.0], [132635, 133171, 0.0], [133171, 133862, 0.0], [133862, 134098, 0.0], [134098, 134328, 0.0], [134328, 134941, 0.0], [134941, 134999, 0.0], [134999, 135446, 0.0], [135446, 136036, 0.0], [136036, 136725, 0.0], [136725, 137098, 0.0], [137098, 137197, 0.0], [137197, 137889, 0.0], [137889, 139040, 0.0], [139040, 139368, 0.0], [139368, 139941, 0.0], [139941, 140277, 0.0], [140277, 140627, 0.0], [140627, 140700, 0.0], [140700, 141197, 0.0], [141197, 141882, 0.0], [141882, 142008, 0.0], [142008, 142810, 0.0], [142810, 143146, 0.0], [143146, 143307, 0.0], [143307, 143465, 0.0], [143465, 143607, 0.0], [143607, 144067, 0.0], [144067, 144243, 0.0], [144243, 144333, 0.0], [144333, 144476, 0.0], [144476, 145048, 0.0], [145048, 145234, 0.0], [145234, 146188, 0.0], [146188, 146503, 0.0], [146503, 146601, 0.0], [146601, 147536, 0.0], [147536, 148327, 0.0], [148327, 148443, 0.0], [148443, 148883, 0.0], [148883, 150331, 0.0], [150331, 150600, 0.0], [150600, 150855, 0.0], [150855, 
151326, 0.0], [151326, 151760, 0.0], [151760, 152232, 0.0], [152232, 153238, 0.0], [153238, 153652, 0.0], [153652, 154349, 0.0], [154349, 155131, 0.0], [155131, 155171, 0.0], [155171, 155182, 0.0], [155182, 155209, 0.0], [155209, 155223, 0.0], [155223, 155231, 0.0], [155231, 155255, 0.0], [155255, 155281, 0.0], [155281, 155295, 0.0], [155295, 155308, 0.0], [155308, 155371, 0.0], [155371, 155887, 0.0], [155887, 156153, 0.0], [156153, 156447, 0.0], [156447, 156572, 0.0], [156572, 156625, 0.0], [156625, 156683, 0.0], [156683, 156712, 0.0], [156712, 156740, 0.0], [156740, 157684, 0.0], [157684, 157916, 0.0], [157916, 158826, 0.0], [158826, 158979, 0.0], [158979, 159451, 0.0], [159451, 159510, 0.0], [159510, 160433, 0.0], [160433, 161060, 0.0], [161060, 161246, 0.0], [161246, 161822, 0.0], [161822, 162564, 0.0], [162564, 163027, 0.0], [163027, 163528, 0.0], [163528, 163603, 0.0], [163603, 163992, 0.0], [163992, 164058, 0.0], [164058, 164075, 0.0], [164075, 164165, 0.0], [164165, 164222, 0.0], [164222, 164278, 0.0], [164278, 164933, 0.0], [164933, 165923, 0.0], [165923, 166477, 0.0], [166477, 166614, 0.0], [166614, 166942, 0.0], [166942, 167529, 0.0], [167529, 168037, 0.0], [168037, 168796, 0.0], [168796, 169032, 0.0], [169032, 169236, 0.0], [169236, 169752, 0.0], [169752, 169967, 0.0], [169967, 170150, 0.0], [170150, 170351, 0.0], [170351, 170372, 0.0], [170372, 170821, 0.0], [170821, 171206, 0.0], [171206, 172142, 0.0], [172142, 173186, 0.0], [173186, 173490, 0.0], [173490, 173988, 0.0], [173988, 174970, 0.0], [174970, 175037, 0.0], [175037, 175091, 0.0], [175091, 175152, 0.0], [175152, 175199, 0.0], [175199, 175664, 0.0], [175664, 175886, 0.0], [175886, 176424, 0.0], [176424, 176828, 0.0], [176828, 176967, 0.0], [176967, 177075, 0.0], [177075, 177223, 0.0], [177223, 177456, 0.0], [177456, 177700, 0.0], [177700, 177944, 0.0], [177944, 178498, 0.0], [178498, 178782, 0.0], [178782, 179419, 0.0], [179419, 179836, 0.0], [179836, 179887, 0.0], [179887, 179987, 0.0], [179987, 180034, 0.0], [180034, 180129, 0.0], [180129, 180264, 0.0], [180264, 180354, 0.0], [180354, 180419, 0.0], [180419, 180483, 0.0], [180483, 180604, 0.0], [180604, 180689, 0.0], [180689, 181481, 0.0], [181481, 182158, 0.0], [182158, 182267, 0.0], [182267, 183128, 0.0], [183128, 183530, 1.0], [183530, 183797, 0.0], [183797, 184255, 1.0], [184255, 184339, 0.0], [184339, 184345, 0.0], [184345, 184378, 0.0], [184378, 185181, 0.0], [185181, 185451, 0.0], [185451, 186088, 0.0], [186088, 186355, 0.0], [186355, 186456, 0.0], [186456, 187387, 0.0], [187387, 187686, 0.0], [187686, 187925, 0.0], [187925, 188495, 0.0], [188495, 189309, 0.0], [189309, 189910, 0.0], [189910, 190482, 0.0], [190482, 190755, 0.0], [190755, 191550, 0.0], [191550, 192071, 0.0], [192071, 192186, 0.0], [192186, 192609, 0.0], [192609, 192763, 0.0], [192763, 192932, 0.0], [192932, 193553, 0.0], [193553, 194952, 0.0], [194952, 195566, 0.0], [195566, 195675, 0.0], [195675, 196589, 0.0], [196589, 197386, 0.0], [197386, 197452, 0.0], [197452, 197544, 0.0], [197544, 198252, 0.0], [198252, 198871, 0.0], [198871, 199416, 0.0], [199416, 199956, 0.0], [199956, 199987, 0.0], [199987, 200057, 0.0], [200057, 200571, 0.0], [200571, 200712, 0.0], [200712, 201458, 0.0], [201458, 201669, 0.0], [201669, 202228, 0.0], [202228, 203140, 0.0], [203140, 203190, 0.0], [203190, 203754, 0.0], [203754, 204497, 0.0], [204497, 204892, 0.0], [204892, 205096, 0.0], [205096, 206009, 0.0], [206009, 207439, 0.0], [207439, 208485, 0.0], [208485, 208705, 0.0], [208705, 209005, 0.0], [209005, 209763, 
0.0], [209763, 209783, 0.0], [209783, 209803, 0.0], [209803, 209826, 0.0], [209826, 210237, 0.0], [210237, 210916, 0.0], [210916, 210979, 0.0], [210979, 211604, 0.0], [211604, 211874, 0.0], [211874, 211946, 0.0], [211946, 212785, 0.0], [212785, 212945, 0.0], [212945, 213628, 0.0], [213628, 214245, 0.0], [214245, 214255, 0.0], [214255, 214278, 0.0], [214278, 214287, 0.0], [214287, 214317, 0.0], [214317, 214330, 0.0], [214330, 214351, 0.0], [214351, 214370, 0.0], [214370, 214381, 0.0], [214381, 214396, 0.0], [214396, 214409, 0.0], [214409, 214418, 0.0], [214418, 214433, 0.0], [214433, 214447, 0.0], [214447, 214460, 0.0], [214460, 214468, 0.0], [214468, 214512, 0.0], [214512, 214559, 0.0], [214559, 214569, 0.0], [214569, 214983, 0.0], [214983, 215189, 0.0], [215189, 215519, 0.0], [215519, 215637, 0.0], [215637, 216451, 0.0], [216451, 216542, 0.0], [216542, 217150, 0.0], [217150, 217820, 0.0], [217820, 217908, 0.0], [217908, 218250, 0.0], [218250, 218936, 0.0], [218936, 219051, 0.0], [219051, 219459, 0.0], [219459, 219890, 0.0], [219890, 220280, 0.0], [220280, 220400, 0.0], [220400, 220443, 0.0], [220443, 220487, 0.0], [220487, 220523, 0.0], [220523, 220569, 0.0], [220569, 220621, 0.0], [220621, 220730, 0.0], [220730, 221222, 0.0], [221222, 222139, 0.0], [222139, 222259, 0.0], [222259, 222762, 0.0], [222762, 223151, 0.0], [223151, 223200, 0.0], [223200, 223312, 0.0], [223312, 223372, 0.0], [223372, 223944, 0.0], [223944, 224602, 0.0], [224602, 224610, 0.0], [224610, 224619, 0.0], [224619, 224629, 0.0], [224629, 224641, 0.0], [224641, 225434, 0.0], [225434, 226539, 0.0], [226539, 226614, 0.0], [226614, 227311, 0.0], [227311, 227426, 0.0], [227426, 227638, 0.0], [227638, 227741, 0.0], [227741, 228063, 0.0], [228063, 228154, 0.0], [228154, 228654, 0.0], [228654, 229051, 0.0], [229051, 229125, 0.0], [229125, 230081, 0.0], [230081, 230190, 0.0], [230190, 230484, 0.0], [230484, 231014, 0.0], [231014, 231292, 0.0], [231292, 231975, 0.0], [231975, 232644, 0.0], [232644, 232754, 0.0], [232754, 233903, 0.0], [233903, 234021, 0.0], [234021, 234128, 0.0], [234128, 234854, 0.0], [234854, 235109, 0.0], [235109, 235380, 0.0], [235380, 235924, 0.0], [235924, 236229, 0.0], [236229, 236759, 0.0], [236759, 236839, 0.0], [236839, 236980, 0.0], [236980, 237016, 0.0], [237016, 237172, 0.0], [237172, 237427, 0.0], [237427, 237658, 0.0], [237658, 237681, 0.0], [237681, 237705, 0.0], [237705, 237893, 0.0], [237893, 237982, 0.0], [237982, 238163, 0.0], [238163, 238363, 0.0], [238363, 238637, 0.0], [238637, 238931, 0.0], [238931, 239324, 0.0], [239324, 239981, 0.0], [239981, 240151, 0.0], [240151, 240239, 0.0], [240239, 240268, 0.0], [240268, 240897, 0.0], [240897, 240980, 0.0], [240980, 241031, 0.0], [241031, 241310, 0.0], [241310, 241340, 0.0], [241340, 241931, 0.0], [241931, 242012, 0.0], [242012, 242549, 0.0], [242549, 243110, 0.0], [243110, 243710, 0.0], [243710, 245037, 0.0], [245037, 245545, 0.0], [245545, 246426, 0.0], [246426, 247351, 0.0], [247351, 247465, 0.0], [247465, 247507, 0.0]], "rps_lines_num_words": [[0, 34, 6.0], [34, 77, 5.0], [77, 89, 1.0], [89, 100, 1.0], [100, 116, 2.0], [116, 122, 1.0], [122, 130, 1.0], [130, 165, 4.0], [165, 195, 6.0], [195, 241, 6.0], [241, 418, 30.0], [418, 600, 26.0], [600, 774, 28.0], [774, 952, 21.0], [952, 1131, 34.0], [1131, 1309, 31.0], [1309, 1489, 29.0], [1489, 1667, 29.0], [1667, 1829, 31.0], [1829, 2006, 31.0], [2006, 2052, 7.0], [2052, 2210, 3.0], [2210, 2393, 31.0], [2393, 2569, 29.0], [2569, 2746, 31.0], [2746, 2912, 25.0], [2912, 3094, 27.0], [3094, 3264, 29.0], 
[3264, 3441, 28.0], [3441, 3468, 6.0], [3468, 3531, 12.0], [3531, 3732, 37.0], [3732, 3928, 27.0], [3928, 4097, 27.0], [4097, 4146, 3.0], [4146, 4350, 35.0], [4350, 4545, 35.0], [4545, 4742, 30.0], [4742, 4947, 35.0], [4947, 4975, 5.0], [4975, 5160, 26.0], [5160, 5351, 29.0], [5351, 5433, 12.0], [5433, 5545, 17.0], [5545, 5651, 13.0], [5651, 5760, 18.0], [5760, 5836, 11.0], [5836, 5899, 10.0], [5899, 6002, 18.0], [6002, 6262, 3.0], [6262, 6453, 32.0], [6453, 6637, 29.0], [6637, 6676, 4.0], [6676, 6861, 31.0], [6861, 7058, 33.0], [7058, 7249, 29.0], [7249, 7438, 24.0], [7438, 7712, 5.0], [7712, 7898, 30.0], [7898, 8091, 31.0], [8091, 8183, 16.0], [8183, 8344, 25.0], [8344, 8505, 28.0], [8505, 8609, 15.0], [8609, 8617, 2.0], [8617, 8624, 2.0], [8624, 8636, 2.0], [8636, 8647, 2.0], [8647, 8660, 2.0], [8660, 8673, 2.0], [8673, 8683, 2.0], [8683, 8694, 2.0], [8694, 8704, 2.0], [8704, 8713, 2.0], [8713, 8720, 2.0], [8720, 8732, 2.0], [8732, 8745, 2.0], [8745, 8751, 2.0], [8751, 8760, 2.0], [8760, 8774, 2.0], [8774, 8946, 27.0], [8946, 9032, 12.0], [9032, 9042, 3.0], [9042, 9051, 2.0], [9051, 9063, 3.0], [9063, 9085, 4.0], [9085, 9101, 3.0], [9101, 9110, 2.0], [9110, 9125, 3.0], [9125, 9136, 3.0], [9136, 9151, 3.0], [9151, 9163, 2.0], [9163, 9169, 2.0], [9169, 9177, 2.0], [9177, 9202, 5.0], [9202, 9215, 2.0], [9215, 9226, 2.0], [9226, 9244, 4.0], [9244, 9273, 4.0], [9273, 9284, 2.0], [9284, 9376, 6.0], [9376, 9494, 6.0], [9494, 9678, 31.0], [9678, 9858, 26.0], [9858, 10036, 29.0], [10036, 10215, 32.0], [10215, 10235, 3.0], [10235, 10455, 34.0], [10455, 10666, 34.0], [10666, 10884, 33.0], [10884, 11098, 34.0], [11098, 11173, 11.0], [11173, 11428, 2.0], [11428, 11533, 3.0], [11533, 11756, 40.0], [11756, 11972, 32.0], [11972, 12190, 38.0], [12190, 12907, 113.0], [12907, 13032, 3.0], [13032, 13717, 114.0], [13717, 13891, 27.0], [13891, 14064, 26.0], [14064, 14237, 30.0], [14237, 14414, 31.0], [14414, 14584, 28.0], [14584, 14640, 9.0], [14640, 14765, 3.0], [14765, 14845, 6.0], [14845, 14864, 3.0], [14864, 14912, 1.0], [14912, 14959, 1.0], [14959, 15130, 31.0], [15130, 15297, 27.0], [15297, 15473, 29.0], [15473, 15643, 24.0], [15643, 15764, 20.0], [15764, 15930, 32.0], [15930, 16098, 27.0], [16098, 16260, 27.0], [16260, 16430, 29.0], [16430, 16604, 25.0], [16604, 16769, 23.0], [16769, 16947, 24.0], [16947, 17123, 31.0], [17123, 17135, 1.0], [17135, 17150, 2.0], [17150, 17208, 1.0], [17208, 17379, 29.0], [17379, 17556, 27.0], [17556, 17731, 29.0], [17731, 17903, 26.0], [17903, 18062, 28.0], [18062, 18231, 24.0], [18231, 18406, 23.0], [18406, 18576, 29.0], [18576, 18679, 3.0], [18679, 18850, 28.0], [18850, 19027, 32.0], [19027, 19147, 20.0], [19147, 19325, 27.0], [19325, 19489, 26.0], [19489, 19662, 24.0], [19662, 19792, 17.0], [19792, 19897, 3.0], [19897, 20072, 31.0], [20072, 20254, 27.0], [20254, 20432, 26.0], [20432, 20589, 23.0], [20589, 20759, 26.0], [20759, 20790, 5.0], [20790, 20866, 14.0], [20866, 20964, 16.0], [20964, 21090, 18.0], [21090, 21204, 20.0], [21204, 21298, 18.0], [21298, 21417, 19.0], [21417, 21528, 19.0], [21528, 21645, 18.0], [21645, 21726, 14.0], [21726, 21865, 22.0], [21865, 21995, 23.0], [21995, 22050, 2.0], [22050, 22215, 27.0], [22215, 22382, 30.0], [22382, 22545, 26.0], [22545, 22722, 29.0], [22722, 22873, 22.0], [22873, 22900, 5.0], [22900, 23060, 25.0], [23060, 23233, 28.0], [23233, 23289, 10.0], [23289, 23411, 3.0], [23411, 23574, 24.0], [23574, 23738, 26.0], [23738, 23910, 26.0], [23910, 24007, 17.0], [24007, 24187, 27.0], [24187, 24361, 31.0], [24361, 24528, 28.0], 
[24528, 24694, 27.0], [24694, 24729, 5.0], [24729, 24872, 2.0], [24872, 25006, 1.0], [25006, 25176, 33.0], [25176, 25344, 29.0], [25344, 25465, 20.0], [25465, 25643, 29.0], [25643, 25816, 25.0], [25816, 25990, 26.0], [25990, 26177, 28.0], [26177, 26285, 2.0], [26285, 26461, 30.0], [26461, 26638, 29.0], [26638, 26814, 27.0], [26814, 26991, 35.0], [26991, 27020, 4.0], [27020, 27196, 29.0], [27196, 27373, 28.0], [27373, 27436, 11.0], [27436, 27607, 25.0], [27607, 27712, 17.0], [27712, 27763, 2.0], [27763, 27940, 32.0], [27940, 28119, 28.0], [28119, 28295, 32.0], [28295, 28481, 37.0], [28481, 28556, 11.0], [28556, 28731, 26.0], [28731, 28902, 23.0], [28902, 29076, 26.0], [29076, 29246, 28.0], [29246, 29412, 28.0], [29412, 29589, 24.0], [29589, 29721, 21.0], [29721, 29774, 3.0], [29774, 29940, 24.0], [29940, 30105, 22.0], [30105, 30275, 26.0], [30275, 30444, 29.0], [30444, 30612, 24.0], [30612, 30782, 26.0], [30782, 30961, 26.0], [30961, 31132, 26.0], [31132, 31272, 25.0], [31272, 31435, 25.0], [31435, 31606, 24.0], [31606, 31771, 23.0], [31771, 31812, 6.0], [31812, 31955, 3.0], [31955, 32086, 1.0], [32086, 32149, 1.0], [32149, 32315, 28.0], [32315, 32482, 26.0], [32482, 32643, 24.0], [32643, 32762, 22.0], [32762, 32917, 24.0], [32917, 33074, 21.0], [33074, 33235, 28.0], [33235, 33397, 28.0], [33397, 33554, 25.0], [33554, 33598, 8.0], [33598, 33763, 25.0], [33763, 33913, 22.0], [33913, 34015, 1.0], [34015, 34165, 1.0], [34165, 34295, 1.0], [34295, 34323, 1.0], [34323, 34477, 25.0], [34477, 34639, 25.0], [34639, 34805, 28.0], [34805, 34973, 25.0], [34973, 35103, 17.0], [35103, 35270, 26.0], [35270, 35424, 27.0], [35424, 35484, 11.0], [35484, 35661, 28.0], [35661, 35821, 26.0], [35821, 35981, 26.0], [35981, 36150, 35.0], [36150, 36320, 29.0], [36320, 36482, 27.0], [36482, 36647, 24.0], [36647, 36756, 18.0], [36756, 36924, 27.0], [36924, 37084, 27.0], [37084, 37250, 28.0], [37250, 37407, 23.0], [37407, 37482, 13.0], [37482, 37644, 26.0], [37644, 37807, 24.0], [37807, 37960, 27.0], [37960, 38126, 30.0], [38126, 38205, 13.0], [38205, 38375, 28.0], [38375, 38542, 29.0], [38542, 38716, 29.0], [38716, 38894, 27.0], [38894, 39064, 26.0], [39064, 39235, 27.0], [39235, 39398, 25.0], [39398, 39536, 22.0], [39536, 39709, 31.0], [39709, 39887, 30.0], [39887, 39901, 2.0], [39901, 40077, 33.0], [40077, 40249, 31.0], [40249, 40299, 9.0], [40299, 40467, 27.0], [40467, 40635, 26.0], [40635, 40739, 20.0], [40739, 40909, 24.0], [40909, 40981, 4.0], [40981, 41154, 32.0], [41154, 41177, 5.0], [41177, 41298, 22.0], [41298, 41364, 11.0], [41364, 41431, 11.0], [41431, 41488, 11.0], [41488, 41559, 13.0], [41559, 41673, 19.0], [41673, 41765, 12.0], [41765, 41884, 21.0], [41884, 42013, 20.0], [42013, 42117, 13.0], [42117, 42426, 3.0], [42426, 43479, 167.0], [43479, 43653, 31.0], [43653, 43825, 35.0], [43825, 43998, 28.0], [43998, 44177, 31.0], [44177, 44257, 12.0], [44257, 44336, 5.0], [44336, 44436, 8.0], [44436, 44612, 29.0], [44612, 44783, 26.0], [44783, 44957, 23.0], [44957, 45130, 28.0], [45130, 45289, 31.0], [45289, 45484, 32.0], [45484, 45590, 15.0], [45590, 45631, 5.0], [45631, 45677, 5.0], [45677, 45706, 4.0], [45706, 45735, 3.0], [45735, 45772, 6.0], [45772, 45957, 30.0], [45957, 46136, 29.0], [46136, 46310, 26.0], [46310, 46492, 27.0], [46492, 46682, 33.0], [46682, 46719, 7.0], [46719, 46905, 27.0], [46905, 47020, 15.0], [47020, 47115, 3.0], [47115, 47137, 5.0], [47137, 47320, 28.0], [47320, 47503, 29.0], [47503, 47679, 27.0], [47679, 47750, 9.0], [47750, 47939, 34.0], [47939, 48126, 29.0], [48126, 48136, 2.0], 
[48136, 48312, 32.0], [48312, 48491, 35.0], [48491, 48672, 25.0], [48672, 48741, 12.0], [48741, 48779, 2.0], [48779, 48963, 31.0], [48963, 49150, 26.0], [49150, 49335, 34.0], [49335, 49432, 11.0], [49432, 49603, 26.0], [49603, 49785, 24.0], [49785, 49904, 15.0], [49904, 50090, 31.0], [50090, 50280, 28.0], [50280, 50448, 23.0], [50448, 50636, 31.0], [50636, 50743, 17.0], [50743, 50851, 15.0], [50851, 51019, 26.0], [51019, 51205, 32.0], [51205, 51347, 21.0], [51347, 51539, 35.0], [51539, 51726, 28.0], [51726, 51770, 8.0], [51770, 51959, 31.0], [51959, 52148, 30.0], [52148, 52210, 9.0], [52210, 52393, 33.0], [52393, 52575, 27.0], [52575, 52672, 19.0], [52672, 52857, 38.0], [52857, 53035, 34.0], [53035, 53221, 27.0], [53221, 53409, 30.0], [53409, 53601, 35.0], [53601, 53746, 24.0], [53746, 53908, 27.0], [53908, 54065, 23.0], [54065, 54220, 25.0], [54220, 54384, 29.0], [54384, 54550, 29.0], [54550, 54569, 2.0], [54569, 54624, 6.0], [54624, 54707, 14.0], [54707, 54865, 25.0], [54865, 55036, 25.0], [55036, 55202, 26.0], [55202, 55347, 23.0], [55347, 55506, 23.0], [55506, 55670, 22.0], [55670, 55841, 24.0], [55841, 55865, 2.0], [55865, 56029, 29.0], [56029, 56137, 15.0], [56137, 56206, 8.0], [56206, 56262, 8.0], [56262, 56335, 10.0], [56335, 56373, 4.0], [56373, 56545, 27.0], [56545, 56623, 13.0], [56623, 56784, 27.0], [56784, 56949, 28.0], [56949, 57039, 12.0], [57039, 57206, 28.0], [57206, 57381, 25.0], [57381, 57480, 14.0], [57480, 57641, 23.0], [57641, 57806, 23.0], [57806, 57823, 3.0], [57823, 58107, 3.0], [58107, 58265, 23.0], [58265, 58434, 32.0], [58434, 58602, 28.0], [58602, 58631, 4.0], [58631, 58787, 24.0], [58787, 58940, 21.0], [58940, 59098, 24.0], [59098, 59266, 25.0], [59266, 59343, 12.0], [59343, 59512, 25.0], [59512, 59554, 7.0], [59554, 59625, 5.0], [59625, 59790, 25.0], [59790, 59957, 26.0], [59957, 60117, 25.0], [60117, 60192, 12.0], [60192, 60352, 28.0], [60352, 60473, 17.0], [60473, 60617, 23.0], [60617, 60764, 23.0], [60764, 60908, 21.0], [60908, 61053, 23.0], [61053, 61161, 19.0], [61161, 61444, 3.0], [61444, 61593, 28.0], [61593, 61742, 23.0], [61742, 61891, 23.0], [61891, 62036, 18.0], [62036, 62188, 22.0], [62188, 62234, 6.0], [62234, 62398, 29.0], [62398, 62417, 3.0], [62417, 62445, 5.0], [62445, 62459, 3.0], [62459, 62511, 8.0], [62511, 62539, 5.0], [62539, 62572, 5.0], [62572, 62601, 4.0], [62601, 62655, 8.0], [62655, 62718, 1.0], [62718, 62877, 26.0], [62877, 63031, 28.0], [63031, 63154, 22.0], [63154, 63304, 21.0], [63304, 63454, 28.0], [63454, 63602, 28.0], [63602, 63679, 14.0], [63679, 63819, 26.0], [63819, 63893, 12.0], [63893, 64050, 27.0], [64050, 64212, 28.0], [64212, 64370, 24.0], [64370, 64533, 25.0], [64533, 64693, 20.0], [64693, 64770, 12.0], [64770, 64937, 29.0], [64937, 65096, 27.0], [65096, 65249, 26.0], [65249, 65404, 24.0], [65404, 65560, 25.0], [65560, 65714, 29.0], [65714, 65873, 26.0], [65873, 65882, 2.0], [65882, 65937, 3.0], [65937, 66099, 26.0], [66099, 66267, 29.0], [66267, 66433, 27.0], [66433, 66599, 23.0], [66599, 66680, 13.0], [66680, 66843, 33.0], [66843, 66859, 4.0], [66859, 67015, 21.0], [67015, 67175, 25.0], [67175, 67328, 26.0], [67328, 67365, 6.0], [67365, 67426, 4.0], [67426, 67498, 3.0], [67498, 67658, 26.0], [67658, 67816, 24.0], [67816, 67980, 24.0], [67980, 67990, 1.0], [67990, 68152, 27.0], [68152, 68316, 26.0], [68316, 68475, 27.0], [68475, 68608, 20.0], [68608, 68764, 25.0], [68764, 68916, 23.0], [68916, 69068, 23.0], [69068, 69229, 25.0], [69229, 69382, 24.0], [69382, 69554, 27.0], [69554, 69613, 3.0], [69613, 69777, 25.0], 
[69777, 69948, 25.0], [69948, 70129, 37.0], [70129, 70301, 27.0], [70301, 70467, 24.0], [70467, 70635, 25.0], [70635, 70806, 28.0], [70806, 70947, 21.0], [70947, 71218, 5.0], [71218, 71388, 30.0], [71388, 71550, 26.0], [71550, 71725, 24.0], [71725, 71809, 11.0], [71809, 71976, 27.0], [71976, 72143, 27.0], [72143, 72315, 29.0], [72315, 72339, 5.0], [72339, 72510, 26.0], [72510, 72681, 28.0], [72681, 72851, 23.0], [72851, 72909, 9.0], [72909, 72974, 3.0], [72974, 73147, 30.0], [73147, 73307, 24.0], [73307, 73327, 2.0], [73327, 73499, 27.0], [73499, 73667, 28.0], [73667, 73683, 2.0], [73683, 73859, 28.0], [73859, 74034, 32.0], [74034, 74214, 27.0], [74214, 74376, 28.0], [74376, 74550, 28.0], [74550, 74711, 24.0], [74711, 74880, 25.0], [74880, 74921, 8.0], [74921, 75061, 3.0], [75061, 75230, 30.0], [75230, 75402, 25.0], [75402, 75570, 28.0], [75570, 75743, 29.0], [75743, 75811, 11.0], [75811, 75987, 28.0], [75987, 76159, 24.0], [76159, 76327, 24.0], [76327, 76494, 27.0], [76494, 76676, 28.0], [76676, 76854, 25.0], [76854, 76940, 14.0], [76940, 76977, 3.0], [76977, 77147, 28.0], [77147, 77322, 23.0], [77322, 77500, 30.0], [77500, 77679, 28.0], [77679, 77832, 21.0], [77832, 77895, 1.0], [77895, 77947, 10.0], [77947, 78143, 32.0], [78143, 78331, 28.0], [78331, 78520, 32.0], [78520, 78552, 5.0], [78552, 78749, 32.0], [78749, 78794, 8.0], [78794, 78987, 30.0], [78987, 79181, 33.0], [79181, 79287, 17.0], [79287, 79368, 13.0], [79368, 79461, 3.0], [79461, 79648, 33.0], [79648, 79842, 24.0], [79842, 80041, 30.0], [80041, 80228, 32.0], [80228, 80421, 32.0], [80421, 80596, 34.0], [80596, 80787, 34.0], [80787, 80948, 25.0], [80948, 81009, 1.0], [81009, 81139, 20.0], [81139, 81306, 29.0], [81306, 81415, 19.0], [81415, 81568, 22.0], [81568, 81689, 17.0], [81689, 81837, 23.0], [81837, 82004, 3.0], [82004, 82195, 31.0], [82195, 82388, 27.0], [82388, 82585, 31.0], [82585, 82779, 29.0], [82779, 82977, 29.0], [82977, 82996, 4.0], [82996, 83187, 28.0], [83187, 83370, 24.0], [83370, 83389, 3.0], [83389, 83565, 24.0], [83565, 83737, 21.0], [83737, 83926, 30.0], [83926, 84084, 24.0], [84084, 84267, 27.0], [84267, 84509, 3.0], [84509, 84698, 31.0], [84698, 84887, 27.0], [84887, 85074, 27.0], [85074, 85240, 25.0], [85240, 85373, 19.0], [85373, 85543, 27.0], [85543, 85663, 20.0], [85663, 85855, 32.0], [85855, 86033, 32.0], [86033, 86106, 10.0], [86106, 86260, 21.0], [86260, 86582, 3.0], [86582, 86779, 33.0], [86779, 86970, 29.0], [86970, 87082, 21.0], [87082, 87239, 25.0], [87239, 87393, 21.0], [87393, 87506, 16.0], [87506, 87586, 11.0], [87586, 87702, 16.0], [87702, 87843, 19.0], [87843, 87948, 16.0], [87948, 88041, 15.0], [88041, 88182, 21.0], [88182, 88259, 14.0], [88259, 88332, 12.0], [88332, 88458, 17.0], [88458, 88570, 21.0], [88570, 88671, 14.0], [88671, 88799, 21.0], [88799, 88950, 20.0], [88950, 89060, 17.0], [89060, 89212, 22.0], [89212, 89279, 9.0], [89279, 89407, 18.0], [89407, 89541, 21.0], [89541, 89606, 9.0], [89606, 89742, 16.0], [89742, 89841, 13.0], [89841, 89906, 8.0], [89906, 90016, 3.0], [90016, 90206, 35.0], [90206, 90391, 30.0], [90391, 90564, 27.0], [90564, 90641, 13.0], [90641, 90746, 16.0], [90746, 90790, 7.0], [90790, 90967, 26.0], [90967, 91064, 15.0], [91064, 91250, 28.0], [91250, 91441, 31.0], [91441, 91472, 4.0], [91472, 91656, 26.0], [91656, 91760, 13.0], [91760, 91866, 1.0], [91866, 91921, 1.0], [91921, 92101, 33.0], [92101, 92284, 28.0], [92284, 92459, 24.0], [92459, 92648, 30.0], [92648, 92700, 10.0], [92700, 92889, 34.0], [92889, 92929, 6.0], [92929, 93111, 28.0], [93111, 93304, 
29.0], [93304, 93495, 29.0], [93495, 93682, 26.0], [93682, 93883, 29.0], [93883, 93976, 13.0], [93976, 94111, 3.0], [94111, 94300, 34.0], [94300, 94488, 35.0], [94488, 94678, 29.0], [94678, 94867, 33.0], [94867, 94884, 3.0], [94884, 95077, 27.0], [95077, 95104, 4.0], [95104, 95402, 47.0], [95402, 96137, 104.0], [96137, 96608, 74.0], [96608, 96670, 4.0], [96670, 97013, 57.0], [97013, 97623, 89.0], [97623, 98424, 132.0], [98424, 98483, 3.0], [98483, 98979, 73.0], [98979, 99193, 30.0], [99193, 99537, 58.0], [99537, 99666, 23.0], [99666, 99733, 12.0], [99733, 99832, 16.0], [99832, 99925, 17.0], [99925, 100005, 15.0], [100005, 100138, 3.0], [100138, 100266, 25.0], [100266, 100380, 24.0], [100380, 100407, 2.0], [100407, 100429, 1.0], [100429, 100903, 84.0], [100903, 101195, 49.0], [101195, 101442, 36.0], [101442, 101560, 19.0], [101560, 101640, 11.0], [101640, 101704, 9.0], [101704, 101776, 10.0], [101776, 101890, 15.0], [101890, 101985, 12.0], [101985, 102130, 17.0], [102130, 102222, 13.0], [102222, 102355, 18.0], [102355, 102455, 12.0], [102455, 102557, 3.0], [102557, 103092, 99.0], [103092, 103494, 72.0], [103494, 104022, 81.0], [104022, 104103, 3.0], [104103, 104365, 43.0], [104365, 104385, 4.0], [104385, 104413, 5.0], [104413, 104444, 5.0], [104444, 104564, 18.0], [104564, 105386, 128.0], [105386, 105433, 3.0], [105433, 105654, 2.0], [105654, 106148, 75.0], [106148, 106810, 98.0], [106810, 106906, 16.0], [106906, 106999, 12.0], [106999, 107115, 14.0], [107115, 107222, 12.0], [107222, 107311, 11.0], [107311, 107457, 20.0], [107457, 107535, 12.0], [107535, 107676, 18.0], [107676, 107793, 15.0], [107793, 107908, 15.0], [107908, 108055, 21.0], [108055, 108306, 3.0], [108306, 108684, 71.0], [108684, 109560, 135.0], [109560, 109622, 3.0], [109622, 110096, 68.0], [110096, 110489, 60.0], [110489, 110553, 9.0], [110553, 110622, 8.0], [110622, 110976, 49.0], [110976, 111002, 3.0], [111002, 111085, 8.0], [111085, 111149, 1.0], [111149, 111299, 1.0], [111299, 111654, 56.0], [111654, 112221, 83.0], [112221, 112288, 1.0], [112288, 112330, 8.0], [112330, 112345, 2.0], [112345, 112381, 6.0], [112381, 112457, 13.0], [112457, 113178, 105.0], [113178, 113631, 70.0], [113631, 113920, 44.0], [113920, 114394, 76.0], [114394, 115652, 192.0], [115652, 116079, 72.0], [116079, 116522, 71.0], [116522, 116629, 1.0], [116629, 116699, 1.0], [116699, 117285, 100.0], [117285, 117827, 97.0], [117827, 117933, 1.0], [117933, 118365, 68.0], [118365, 118466, 1.0], [118466, 118819, 56.0], [118819, 118888, 11.0], [118888, 119095, 1.0], [119095, 120152, 170.0], [120152, 120632, 86.0], [120632, 121412, 118.0], [121412, 121429, 2.0], [121429, 121453, 3.0], [121453, 121462, 2.0], [121462, 121475, 2.0], [121475, 121501, 3.0], [121501, 121527, 4.0], [121527, 121543, 3.0], [121543, 121567, 4.0], [121567, 122010, 73.0], [122010, 122192, 1.0], [122192, 122819, 102.0], [122819, 122889, 9.0], [122889, 122966, 12.0], [122966, 123026, 8.0], [123026, 123644, 95.0], [123644, 123688, 1.0], [123688, 123975, 45.0], [123975, 124017, 1.0], [124017, 124390, 52.0], [124390, 124446, 1.0], [124446, 125407, 149.0], [125407, 125611, 31.0], [125611, 125651, 5.0], [125651, 125695, 8.0], [125695, 125770, 11.0], [125770, 125822, 6.0], [125822, 125884, 12.0], [125884, 125902, 3.0], [125902, 126032, 21.0], [126032, 126599, 87.0], [126599, 126613, 2.0], [126613, 127031, 57.0], [127031, 127074, 9.0], [127074, 127120, 3.0], [127120, 127208, 2.0], [127208, 127890, 122.0], [127890, 128002, 16.0], [128002, 128215, 31.0], [128215, 128702, 68.0], [128702, 128817, 
3.0], [128817, 129616, 117.0], [129616, 129855, 36.0], [129855, 130164, 49.0], [130164, 130466, 46.0], [130466, 130783, 46.0], [130783, 131069, 3.0], [131069, 131747, 121.0], [131747, 132476, 110.0], [132476, 132635, 3.0], [132635, 133171, 91.0], [133171, 133862, 104.0], [133862, 134098, 37.0], [134098, 134328, 33.0], [134328, 134941, 82.0], [134941, 134999, 9.0], [134999, 135446, 62.0], [135446, 136036, 103.0], [136036, 136725, 108.0], [136725, 137098, 59.0], [137098, 137197, 4.0], [137197, 137889, 117.0], [137889, 139040, 182.0], [139040, 139368, 12.0], [139368, 139941, 94.0], [139941, 140277, 57.0], [140277, 140627, 66.0], [140627, 140700, 5.0], [140700, 141197, 81.0], [141197, 141882, 103.0], [141882, 142008, 3.0], [142008, 142810, 119.0], [142810, 143146, 53.0], [143146, 143307, 24.0], [143307, 143465, 25.0], [143465, 143607, 22.0], [143607, 144067, 83.0], [144067, 144243, 27.0], [144243, 144333, 13.0], [144333, 144476, 1.0], [144476, 145048, 85.0], [145048, 145234, 31.0], [145234, 146188, 161.0], [146188, 146503, 52.0], [146503, 146601, 3.0], [146601, 147536, 155.0], [147536, 148327, 122.0], [148327, 148443, 3.0], [148443, 148883, 68.0], [148883, 150331, 239.0], [150331, 150600, 3.0], [150600, 150855, 45.0], [150855, 151326, 72.0], [151326, 151760, 3.0], [151760, 152232, 81.0], [152232, 153238, 160.0], [153238, 153652, 66.0], [153652, 154349, 116.0], [154349, 155131, 122.0], [155131, 155171, 7.0], [155171, 155182, 2.0], [155182, 155209, 3.0], [155209, 155223, 3.0], [155223, 155231, 2.0], [155231, 155255, 4.0], [155255, 155281, 4.0], [155281, 155295, 3.0], [155295, 155308, 3.0], [155308, 155371, 10.0], [155371, 155887, 74.0], [155887, 156153, 42.0], [156153, 156447, 57.0], [156447, 156572, 21.0], [156572, 156625, 11.0], [156625, 156683, 13.0], [156683, 156712, 5.0], [156712, 156740, 5.0], [156740, 157684, 142.0], [157684, 157916, 3.0], [157916, 158826, 135.0], [158826, 158979, 18.0], [158979, 159451, 78.0], [159451, 159510, 3.0], [159510, 160433, 156.0], [160433, 161060, 95.0], [161060, 161246, 3.0], [161246, 161822, 87.0], [161822, 162564, 104.0], [162564, 163027, 68.0], [163027, 163528, 82.0], [163528, 163603, 3.0], [163603, 163992, 66.0], [163992, 164058, 12.0], [164058, 164075, 2.0], [164075, 164165, 12.0], [164165, 164222, 7.0], [164222, 164278, 8.0], [164278, 164933, 114.0], [164933, 165923, 174.0], [165923, 166477, 100.0], [166477, 166614, 26.0], [166614, 166942, 52.0], [166942, 167529, 94.0], [167529, 168037, 80.0], [168037, 168796, 112.0], [168796, 169032, 35.0], [169032, 169236, 36.0], [169236, 169752, 95.0], [169752, 169967, 42.0], [169967, 170150, 36.0], [170150, 170351, 37.0], [170351, 170372, 3.0], [170372, 170821, 80.0], [170821, 171206, 66.0], [171206, 172142, 157.0], [172142, 173186, 178.0], [173186, 173490, 3.0], [173490, 173988, 82.0], [173988, 174970, 176.0], [174970, 175037, 1.0], [175037, 175091, 1.0], [175091, 175152, 1.0], [175152, 175199, 1.0], [175199, 175664, 72.0], [175664, 175886, 33.0], [175886, 176424, 89.0], [176424, 176828, 62.0], [176828, 176967, 22.0], [176967, 177075, 21.0], [177075, 177223, 24.0], [177223, 177456, 39.0], [177456, 177700, 32.0], [177700, 177944, 40.0], [177944, 178498, 86.0], [178498, 178782, 3.0], [178782, 179419, 104.0], [179419, 179836, 64.0], [179836, 179887, 9.0], [179887, 179987, 15.0], [179987, 180034, 8.0], [180034, 180129, 15.0], [180129, 180264, 22.0], [180264, 180354, 15.0], [180354, 180419, 8.0], [180419, 180483, 11.0], [180483, 180604, 1.0], [180604, 180689, 1.0], [180689, 181481, 126.0], [181481, 182158, 108.0], 
[182158, 182267, 3.0], [182267, 183128, 129.0], [183128, 183530, 67.0], [183530, 183797, 3.0], [183797, 184255, 75.0], [184255, 184339, 1.0], [184339, 184345, 1.0], [184345, 184378, 1.0], [184378, 185181, 128.0], [185181, 185451, 3.0], [185451, 186088, 101.0], [186088, 186355, 41.0], [186355, 186456, 18.0], [186456, 187387, 140.0], [187387, 187686, 45.0], [187686, 187925, 3.0], [187925, 188495, 90.0], [188495, 189309, 129.0], [189309, 189910, 99.0], [189910, 190482, 92.0], [190482, 190755, 1.0], [190755, 191550, 133.0], [191550, 192071, 82.0], [192071, 192186, 1.0], [192186, 192609, 76.0], [192609, 192763, 23.0], [192763, 192932, 21.0], [192932, 193553, 100.0], [193553, 194952, 223.0], [194952, 195566, 99.0], [195566, 195675, 1.0], [195675, 196589, 147.0], [196589, 197386, 127.0], [197386, 197452, 3.0], [197452, 197544, 3.0], [197544, 198252, 111.0], [198252, 198871, 102.0], [198871, 199416, 88.0], [199416, 199956, 73.0], [199956, 199987, 1.0], [199987, 200057, 1.0], [200057, 200571, 85.0], [200571, 200712, 26.0], [200712, 201458, 129.0], [201458, 201669, 1.0], [201669, 202228, 94.0], [202228, 203140, 148.0], [203140, 203190, 1.0], [203190, 203754, 88.0], [203754, 204497, 117.0], [204497, 204892, 17.0], [204892, 205096, 16.0], [205096, 206009, 150.0], [206009, 207439, 227.0], [207439, 208485, 165.0], [208485, 208705, 31.0], [208705, 209005, 39.0], [209005, 209763, 121.0], [209763, 209783, 4.0], [209783, 209803, 4.0], [209803, 209826, 4.0], [209826, 210237, 67.0], [210237, 210916, 115.0], [210916, 210979, 1.0], [210979, 211604, 106.0], [211604, 211874, 53.0], [211874, 211946, 14.0], [211946, 212785, 123.0], [212785, 212945, 9.0], [212945, 213628, 110.0], [213628, 214245, 101.0], [214245, 214255, 2.0], [214255, 214278, 4.0], [214278, 214287, 2.0], [214287, 214317, 4.0], [214317, 214330, 2.0], [214330, 214351, 3.0], [214351, 214370, 4.0], [214370, 214381, 2.0], [214381, 214396, 2.0], [214396, 214409, 2.0], [214409, 214418, 2.0], [214418, 214433, 2.0], [214433, 214447, 2.0], [214447, 214460, 2.0], [214460, 214468, 2.0], [214468, 214512, 6.0], [214512, 214559, 7.0], [214559, 214569, 2.0], [214569, 214983, 66.0], [214983, 215189, 33.0], [215189, 215519, 59.0], [215519, 215637, 8.0], [215637, 216451, 127.0], [216451, 216542, 1.0], [216542, 217150, 98.0], [217150, 217820, 105.0], [217820, 217908, 1.0], [217908, 218250, 56.0], [218250, 218936, 108.0], [218936, 219051, 1.0], [219051, 219459, 66.0], [219459, 219890, 75.0], [219890, 220280, 64.0], [220280, 220400, 18.0], [220400, 220443, 8.0], [220443, 220487, 6.0], [220487, 220523, 6.0], [220523, 220569, 8.0], [220569, 220621, 8.0], [220621, 220730, 8.0], [220730, 221222, 82.0], [221222, 222139, 147.0], [222139, 222259, 1.0], [222259, 222762, 84.0], [222762, 223151, 67.0], [223151, 223200, 1.0], [223200, 223312, 16.0], [223312, 223372, 5.0], [223372, 223944, 105.0], [223944, 224602, 96.0], [224602, 224610, 2.0], [224610, 224619, 2.0], [224619, 224629, 2.0], [224629, 224641, 2.0], [224641, 225434, 133.0], [225434, 226539, 184.0], [226539, 226614, 1.0], [226614, 227311, 110.0], [227311, 227426, 18.0], [227426, 227638, 31.0], [227638, 227741, 14.0], [227741, 228063, 44.0], [228063, 228154, 1.0], [228154, 228654, 88.0], [228654, 229051, 60.0], [229051, 229125, 14.0], [229125, 230081, 150.0], [230081, 230190, 1.0], [230190, 230484, 50.0], [230484, 231014, 90.0], [231014, 231292, 1.0], [231292, 231975, 123.0], [231975, 232644, 100.0], [232644, 232754, 1.0], [232754, 233903, 178.0], [233903, 234021, 1.0], [234021, 234128, 1.0], [234128, 234854, 110.0], 
[234854, 235109, 1.0], [235109, 235380, 40.0], [235380, 235924, 92.0], [235924, 236229, 44.0], [236229, 236759, 94.0], [236759, 236839, 1.0], [236839, 236980, 27.0], [236980, 237016, 8.0], [237016, 237172, 27.0], [237172, 237427, 43.0], [237427, 237658, 40.0], [237658, 237681, 1.0], [237681, 237705, 1.0], [237705, 237893, 31.0], [237893, 237982, 16.0], [237982, 238163, 33.0], [238163, 238363, 30.0], [238363, 238637, 47.0], [238637, 238931, 51.0], [238931, 239324, 70.0], [239324, 239981, 109.0], [239981, 240151, 25.0], [240151, 240239, 3.0], [240239, 240268, 5.0], [240268, 240897, 86.0], [240897, 240980, 1.0], [240980, 241031, 1.0], [241031, 241310, 47.0], [241310, 241340, 5.0], [241340, 241931, 89.0], [241931, 242012, 1.0], [242012, 242549, 91.0], [242549, 243110, 91.0], [243110, 243710, 97.0], [243710, 245037, 227.0], [245037, 245545, 90.0], [245545, 246426, 155.0], [246426, 247351, 167.0], [247351, 247465, 8.0], [247465, 247507, 7.0]], "rps_lines_numerical_chars_fraction": [[0, 34, 0.0], [34, 77, 0.0], [77, 89, 0.0], [89, 100, 0.0], [100, 116, 0.0], [116, 122, 0.0], [122, 130, 0.0], [130, 165, 0.0], [165, 195, 0.0], [195, 241, 0.0], [241, 418, 0.01176471], [418, 600, 0.0], [600, 774, 0.0], [774, 952, 0.0], [952, 1131, 0.0], [1131, 1309, 0.0], [1309, 1489, 0.01724138], [1489, 1667, 0.0], [1667, 1829, 0.0], [1829, 2006, 0.02339181], [2006, 2052, 0.0], [2052, 2210, 0.0610687], [2210, 2393, 0.0], [2393, 2569, 0.0], [2569, 2746, 0.0], [2746, 2912, 0.0], [2912, 3094, 0.00555556], [3094, 3264, 0.00609756], [3264, 3441, 0.01162791], [3441, 3468, 0.0], [3468, 3531, 0.0], [3531, 3732, 0.0], [3732, 3928, 0.0052356], [3928, 4097, 0.0], [4097, 4146, 0.23076923], [4146, 4350, 0.0], [4350, 4545, 0.0], [4545, 4742, 0.0], [4742, 4947, 0.0], [4947, 4975, 0.0], [4975, 5160, 0.0], [5160, 5351, 0.03278689], [5351, 5433, 0.02531646], [5433, 5545, 0.00934579], [5545, 5651, 0.00980392], [5651, 5760, 0.00952381], [5760, 5836, 0.01388889], [5836, 5899, 0.01694915], [5899, 6002, 0.01010101], [6002, 6262, 0.09633028], [6262, 6453, 0.0], [6453, 6637, 0.0], [6637, 6676, 0.0], [6676, 6861, 0.01714286], [6861, 7058, 0.03108808], [7058, 7249, 0.0], [7249, 7438, 0.0], [7438, 7712, 0.13807531], [7712, 7898, 0.0], [7898, 8091, 0.0], [8091, 8183, 0.0], [8183, 8344, 0.0], [8344, 8505, 0.02547771], [8505, 8609, 0.0], [8609, 8617, 0.0], [8617, 8624, 0.16666667], [8624, 8636, 0.0], [8636, 8647, 0.0], [8647, 8660, 0.0], [8660, 8673, 0.0], [8673, 8683, 0.0], [8683, 8694, 0.0], [8694, 8704, 0.0], [8704, 8713, 0.0], [8713, 8720, 0.0], [8720, 8732, 0.0], [8732, 8745, 0.0], [8745, 8751, 0.0], [8751, 8760, 0.0], [8760, 8774, 0.0], [8774, 8946, 0.04166667], [8946, 9032, 0.02439024], [9032, 9042, 0.0], [9042, 9051, 0.0], [9051, 9063, 0.0], [9063, 9085, 0.0], [9085, 9101, 0.0], [9101, 9110, 0.28571429], [9110, 9125, 0.0], [9125, 9136, 0.0], [9136, 9151, 0.0], [9151, 9163, 0.0], [9163, 9169, 0.0], [9169, 9177, 0.14285714], [9177, 9202, 0.0], [9202, 9215, 0.0], [9215, 9226, 0.0], [9226, 9244, 0.0], [9244, 9273, 0.0], [9273, 9284, 0.0], [9284, 9376, 0.0], [9376, 9494, 0.15], [9494, 9678, 0.0], [9678, 9858, 0.0], [9858, 10036, 0.01204819], [10036, 10215, 0.0], [10215, 10235, 0.0], [10235, 10455, 0.00458716], [10455, 10666, 0.0], [10666, 10884, 0.03791469], [10884, 11098, 0.01456311], [11098, 11173, 0.05633803], [11173, 11428, 0.10280374], [11428, 11533, 0.26086957], [11533, 11756, 0.0], [11756, 11972, 0.0], [11972, 12190, 0.0], [12190, 12907, 0.01146132], [12907, 13032, 0.07692308], [13032, 13717, 0.0], [13717, 13891, 0.03571429], [13891, 
14064, 0.0], [14064, 14237, 0.02352941], [14237, 14414, 0.00578035], [14414, 14584, 0.00606061], [14584, 14640, 0.01851852], [14640, 14765, 0.07619048], [14765, 14845, 0.01449275], [14845, 14864, 0.0], [14864, 14912, 0.56756757], [14912, 14959, 0.55555556], [14959, 15130, 0.0], [15130, 15297, 0.0], [15297, 15473, 0.0], [15473, 15643, 0.0], [15643, 15764, 0.0], [15764, 15930, 0.01226994], [15930, 16098, 0.0], [16098, 16260, 0.02564103], [16260, 16430, 0.0], [16430, 16604, 0.0], [16604, 16769, 0.0], [16769, 16947, 0.0], [16947, 17123, 0.0], [17123, 17135, 0.0], [17135, 17150, 0.0], [17150, 17208, 0.14285714], [17208, 17379, 0.0], [17379, 17556, 0.0], [17556, 17731, 0.0], [17731, 17903, 0.0], [17903, 18062, 0.0], [18062, 18231, 0.0308642], [18231, 18406, 0.0], [18406, 18576, 0.02424242], [18576, 18679, 0.09302326], [18679, 18850, 0.04166667], [18850, 19027, 0.0], [19027, 19147, 0.0], [19147, 19325, 0.0], [19325, 19489, 0.0], [19489, 19662, 0.02409639], [19662, 19792, 0.0], [19792, 19897, 0.18390805], [19897, 20072, 0.0], [20072, 20254, 0.0], [20254, 20432, 0.00588235], [20432, 20589, 0.0], [20589, 20759, 0.0], [20759, 20790, 0.0], [20790, 20866, 0.01408451], [20866, 20964, 0.01075269], [20964, 21090, 0.00826446], [21090, 21204, 0.00925926], [21204, 21298, 0.01123596], [21298, 21417, 0.00892857], [21417, 21528, 0.00943396], [21528, 21645, 0.00892857], [21645, 21726, 0.01315789], [21726, 21865, 0.01492537], [21865, 21995, 0.0], [21995, 22050, 0.06666667], [22050, 22215, 0.0], [22215, 22382, 0.0], [22382, 22545, 0.0], [22545, 22722, 0.0], [22722, 22873, 0.0], [22873, 22900, 0.24], [22900, 23060, 0.05298013], [23060, 23233, 0.0], [23233, 23289, 0.0], [23289, 23411, 0.0776699], [23411, 23574, 0.0], [23574, 23738, 0.0], [23738, 23910, 0.0], [23910, 24007, 0.0], [24007, 24187, 0.0], [24187, 24361, 0.0], [24361, 24528, 0.0], [24528, 24694, 0.0], [24694, 24729, 0.0], [24729, 24872, 0.05042017], [24872, 25006, 0.33035714], [25006, 25176, 0.0], [25176, 25344, 0.0], [25344, 25465, 0.0], [25465, 25643, 0.01744186], [25643, 25816, 0.0], [25816, 25990, 0.0], [25990, 26177, 0.0], [26177, 26285, 0.16853933], [26285, 26461, 0.0], [26461, 26638, 0.02312139], [26638, 26814, 0.0], [26814, 26991, 0.0], [26991, 27020, 0.0], [27020, 27196, 0.01183432], [27196, 27373, 0.00591716], [27373, 27436, 0.0], [27436, 27607, 0.0], [27607, 27712, 0.0], [27712, 27763, 0.05128205], [27763, 27940, 0.0], [27940, 28119, 0.0], [28119, 28295, 0.0], [28295, 28481, 0.0], [28481, 28556, 0.0], [28556, 28731, 0.0], [28731, 28902, 0.0], [28902, 29076, 0.0], [29076, 29246, 0.01226994], [29246, 29412, 0.05521472], [29412, 29589, 0.0], [29589, 29721, 0.01612903], [29721, 29774, 0.0], [29774, 29940, 0.0], [29940, 30105, 0.0], [30105, 30275, 0.0], [30275, 30444, 0.01242236], [30444, 30612, 0.0], [30612, 30782, 0.0], [30782, 30961, 0.0], [30961, 31132, 0.0], [31132, 31272, 0.0], [31272, 31435, 0.0], [31435, 31606, 0.0], [31606, 31771, 0.0], [31771, 31812, 0.0], [31812, 31955, 0.05042017], [31955, 32086, 0.0], [32086, 32149, 0.30769231], [32149, 32315, 0.0], [32315, 32482, 0.0], [32482, 32643, 0.0], [32643, 32762, 0.0], [32762, 32917, 0.04054054], [32917, 33074, 0.03333333], [33074, 33235, 0.0], [33235, 33397, 0.0], [33397, 33554, 0.0], [33554, 33598, 0.0], [33598, 33763, 0.0], [33763, 33913, 0.0], [33913, 34015, 0.1686747], [34015, 34165, 0.12903226], [34165, 34295, 0.0733945], [34295, 34323, 0.20833333], [34323, 34477, 0.0], [34477, 34639, 0.00636943], [34639, 34805, 0.0], [34805, 34973, 0.0], [34973, 35103, 0.0], [35103, 35270, 0.0], [35270, 
35424, 0.01333333], [35424, 35484, 0.0], [35484, 35661, 0.0], [35661, 35821, 0.0], [35821, 35981, 0.0], [35981, 36150, 0.0], [36150, 36320, 0.0], [36320, 36482, 0.0], [36482, 36647, 0.0], [36647, 36756, 0.0], [36756, 36924, 0.0], [36924, 37084, 0.0], [37084, 37250, 0.0], [37250, 37407, 0.0], [37407, 37482, 0.0], [37482, 37644, 0.0], [37644, 37807, 0.0], [37807, 37960, 0.0], [37960, 38126, 0.0], [38126, 38205, 0.0], [38205, 38375, 0.0], [38375, 38542, 0.0], [38542, 38716, 0.0], [38716, 38894, 0.0], [38894, 39064, 0.0], [39064, 39235, 0.0], [39235, 39398, 0.0], [39398, 39536, 0.0], [39536, 39709, 0.0], [39709, 39887, 0.0], [39887, 39901, 0.0], [39901, 40077, 0.0], [40077, 40249, 0.02439024], [40249, 40299, 0.04166667], [40299, 40467, 0.0], [40467, 40635, 0.0], [40635, 40739, 0.0], [40739, 40909, 0.05095541], [40909, 40981, 0.06666667], [40981, 41154, 0.02339181], [41154, 41177, 0.0], [41177, 41298, 0.00862069], [41298, 41364, 0.01666667], [41364, 41431, 0.01587302], [41431, 41488, 0.01886792], [41488, 41559, 0.04477612], [41559, 41673, 0.00909091], [41673, 41765, 0.01149425], [41765, 41884, 0.00869565], [41884, 42013, 0.00806452], [42013, 42117, 0.02], [42117, 42426, 0.14785992], [42426, 43479, 0.00687623], [43479, 43653, 0.02352941], [43653, 43825, 0.03592814], [43825, 43998, 0.0239521], [43998, 44177, 0.0], [44177, 44257, 0.0], [44257, 44336, 0.04411765], [44336, 44436, 0.02247191], [44436, 44612, 0.0], [44612, 44783, 0.0], [44783, 44957, 0.0], [44957, 45130, 0.0], [45130, 45289, 0.02597403], [45289, 45484, 0.0], [45484, 45590, 0.0], [45590, 45631, 0.02564103], [45631, 45677, 0.02325581], [45677, 45706, 0.03703704], [45706, 45735, 0.03846154], [45735, 45772, 0.02857143], [45772, 45957, 0.0], [45957, 46136, 0.0], [46136, 46310, 0.0], [46310, 46492, 0.0], [46492, 46682, 0.0], [46682, 46719, 0.0], [46719, 46905, 0.0], [46905, 47020, 0.0], [47020, 47115, 0.0], [47115, 47137, 0.0], [47137, 47320, 0.00561798], [47320, 47503, 0.0], [47503, 47679, 0.0], [47679, 47750, 0.0], [47750, 47939, 0.00540541], [47939, 48126, 0.0], [48126, 48136, 0.0], [48136, 48312, 0.0], [48312, 48491, 0.05142857], [48491, 48672, 0.0], [48672, 48741, 0.0], [48741, 48779, 0.0], [48779, 48963, 0.0], [48963, 49150, 0.0], [49150, 49335, 0.0], [49335, 49432, 0.01075269], [49432, 49603, 0.0], [49603, 49785, 0.0], [49785, 49904, 0.0], [49904, 50090, 0.0], [50090, 50280, 0.0], [50280, 50448, 0.0], [50448, 50636, 0.0], [50636, 50743, 0.0], [50743, 50851, 0.00961538], [50851, 51019, 0.0], [51019, 51205, 0.0], [51205, 51347, 0.0], [51347, 51539, 0.0], [51539, 51726, 0.0], [51726, 51770, 0.0], [51770, 51959, 0.0], [51959, 52148, 0.0], [52148, 52210, 0.0], [52210, 52393, 0.0], [52393, 52575, 0.0], [52575, 52672, 0.0], [52672, 52857, 0.0], [52857, 53035, 0.0], [53035, 53221, 0.0], [53221, 53409, 0.0], [53409, 53601, 0.0], [53601, 53746, 0.0], [53746, 53908, 0.0], [53908, 54065, 0.0], [54065, 54220, 0.0], [54220, 54384, 0.0], [54384, 54550, 0.0], [54550, 54569, 0.0], [54569, 54624, 0.0], [54624, 54707, 0.0], [54707, 54865, 0.02684564], [54865, 55036, 0.0], [55036, 55202, 0.0], [55202, 55347, 0.0], [55347, 55506, 0.00653595], [55506, 55670, 0.0], [55670, 55841, 0.0], [55841, 55865, 0.0], [55865, 56029, 0.02547771], [56029, 56137, 0.0], [56137, 56206, 0.0], [56206, 56262, 0.0], [56262, 56335, 0.0], [56335, 56373, 0.0], [56373, 56545, 0.0060241], [56545, 56623, 0.0], [56623, 56784, 0.00641026], [56784, 56949, 0.00632911], [56949, 57039, 0.0], [57039, 57206, 0.00613497], [57206, 57381, 0.0], [57381, 57480, 0.0], [57480, 57641, 0.0], [57641, 
"quality_signals" (continued): the remainder of this field is machine-generated crawl metadata, per-span metrics such as rps_lines_numerical_chars_fraction, rps_lines_start_with_bulletpoint, and rps_lines_uppercase_letter_fraction, plus the per-document rps_doc_* scores, each stored as an array of [start_offset, end_offset, value] triples measured over the record's raw text. The numeric payload is omitted here.
178782, 0.08802817], [178782, 179419, 0.0533752], [179419, 179836, 0.0383693], [179836, 179887, 0.0], [179887, 179987, 0.0], [179987, 180034, 0.0212766], [180034, 180129, 0.01052632], [180129, 180264, 0.00740741], [180264, 180354, 0.01111111], [180354, 180419, 0.01538462], [180419, 180483, 0.015625], [180483, 180604, 0.0], [180604, 180689, 0.0], [180689, 181481, 0.03030303], [181481, 182158, 0.02363368], [182158, 182267, 0.00917431], [182267, 183128, 0.0174216], [183128, 183530, 0.04975124], [183530, 183797, 0.09363296], [183797, 184255, 0.05458515], [184255, 184339, 0.0], [184339, 184345, 0.66666667], [184345, 184378, 0.12121212], [184378, 185181, 0.02739726], [185181, 185451, 0.09259259], [185451, 186088, 0.03139717], [186088, 186355, 0.03745318], [186355, 186456, 0.04950495], [186456, 187387, 0.04833512], [187387, 187686, 0.0367893], [187686, 187925, 0.10460251], [187925, 188495, 0.0122807], [188495, 189309, 0.02211302], [189309, 189910, 0.03660566], [189910, 190482, 0.03671329], [190482, 190755, 0.08791209], [190755, 191550, 0.01761006], [191550, 192071, 0.0537428], [192071, 192186, 0.0], [192186, 192609, 0.02364066], [192609, 192763, 0.01948052], [192763, 192932, 0.0591716], [192932, 193553, 0.0273752], [193553, 194952, 0.02716226], [194952, 195566, 0.02442997], [195566, 195675, 0.0], [195675, 196589, 0.0273523], [196589, 197386, 0.03638645], [197386, 197452, 0.01515152], [197452, 197544, 0.01086957], [197544, 198252, 0.04237288], [198252, 198871, 0.05169628], [198871, 199416, 0.02385321], [199416, 199956, 0.03518519], [199956, 199987, 0.0], [199987, 200057, 0.0], [200057, 200571, 0.03891051], [200571, 200712, 0.0141844], [200712, 201458, 0.03753351], [201458, 201669, 0.11374408], [201669, 202228, 0.02683363], [202228, 203140, 0.04605263], [203140, 203190, 0.0], [203190, 203754, 0.03368794], [203754, 204497, 0.02557201], [204497, 204892, 0.07848101], [204892, 205096, 0.02941176], [205096, 206009, 0.04381161], [206009, 207439, 0.02657343], [207439, 208485, 0.01625239], [208485, 208705, 0.07272727], [208705, 209005, 0.07666667], [209005, 209763, 0.05672823], [209763, 209783, 0.05], [209783, 209803, 0.05], [209803, 209826, 0.04347826], [209826, 210237, 0.0243309], [210237, 210916, 0.05596465], [210916, 210979, 0.0], [210979, 211604, 0.0192], [211604, 211874, 0.08888889], [211874, 211946, 0.01388889], [211946, 212785, 0.02741359], [212785, 212945, 0.0125], [212945, 213628, 0.04685212], [213628, 214245, 0.02917342], [214245, 214255, 0.6], [214255, 214278, 0.13043478], [214278, 214287, 0.11111111], [214287, 214317, 0.1], [214317, 214330, 0.15384615], [214330, 214351, 0.0952381], [214351, 214370, 0.15789474], [214370, 214381, 0.09090909], [214381, 214396, 0.06666667], [214396, 214409, 0.07692308], [214409, 214418, 0.11111111], [214418, 214433, 0.13333333], [214433, 214447, 0.07142857], [214447, 214460, 0.07692308], [214460, 214468, 0.375], [214468, 214512, 0.04545455], [214512, 214559, 0.04255319], [214559, 214569, 0.3], [214569, 214983, 0.06763285], [214983, 215189, 0.03398058], [215189, 215519, 0.07272727], [215519, 215637, 0.00847458], [215637, 216451, 0.04668305], [216451, 216542, 0.0], [216542, 217150, 0.03782895], [217150, 217820, 0.0761194], [217820, 217908, 0.09090909], [217908, 218250, 0.02923977], [218250, 218936, 0.02915452], [218936, 219051, 0.0], [219051, 219459, 0.04656863], [219459, 219890, 0.03712297], [219890, 220280, 0.03589744], [220280, 220400, 0.00833333], [220400, 220443, 0.02325581], [220443, 220487, 0.02272727], [220487, 220523, 0.02777778], [220523, 220569, 
0.02173913], [220569, 220621, 0.01923077], [220621, 220730, 0.00917431], [220730, 221222, 0.0304878], [221222, 222139, 0.01853871], [222139, 222259, 0.0], [222259, 222762, 0.01590457], [222762, 223151, 0.02056555], [223151, 223200, 0.0], [223200, 223312, 0.02678571], [223312, 223372, 0.05], [223372, 223944, 0.03671329], [223944, 224602, 0.02279635], [224602, 224610, 0.625], [224610, 224619, 0.11111111], [224619, 224629, 0.1], [224629, 224641, 0.08333333], [224641, 225434, 0.05800757], [225434, 226539, 0.0361991], [226539, 226614, 0.0], [226614, 227311, 0.0143472], [227311, 227426, 0.08695652], [227426, 227638, 0.02830189], [227638, 227741, 0.08737864], [227741, 228063, 0.01552795], [228063, 228154, 0.0], [228154, 228654, 0.08], [228654, 229051, 0.04785894], [229051, 229125, 0.16216216], [229125, 230081, 0.03974895], [230081, 230190, 0.31192661], [230190, 230484, 0.01020408], [230484, 231014, 0.04150943], [231014, 231292, 0.10431655], [231292, 231975, 0.05417277], [231975, 232644, 0.04484305], [232644, 232754, 0.0], [232754, 233903, 0.04090513], [233903, 234021, 0.08474576], [234021, 234128, 0.0], [234128, 234854, 0.0261708], [234854, 235109, 0.09411765], [235109, 235380, 0.01107011], [235380, 235924, 0.04411765], [235924, 236229, 0.06885246], [236229, 236759, 0.03207547], [236759, 236839, 0.0], [236839, 236980, 0.0070922], [236980, 237016, 0.02777778], [237016, 237172, 0.07692308], [237172, 237427, 0.02745098], [237427, 237658, 0.01298701], [237658, 237681, 0.17391304], [237681, 237705, 0.04166667], [237705, 237893, 0.08510638], [237893, 237982, 0.02247191], [237982, 238163, 0.00552486], [238163, 238363, 0.01], [238363, 238637, 0.01824818], [238637, 238931, 0.00340136], [238931, 239324, 0.01017812], [239324, 239981, 0.0152207], [239981, 240151, 0.02941176], [240151, 240239, 0.01136364], [240239, 240268, 0.13793103], [240268, 240897, 0.04769475], [240897, 240980, 0.0], [240980, 241031, 0.0], [241031, 241310, 0.02508961], [241310, 241340, 0.06666667], [241340, 241931, 0.02030457], [241931, 242012, 0.0], [242012, 242549, 0.02048417], [242549, 243110, 0.04456328], [243110, 243710, 0.09833333], [243710, 245037, 0.02260739], [245037, 245545, 0.0492126], [245545, 246426, 0.03291714], [246426, 247351, 0.02378378], [247351, 247465, 0.00877193], [247465, 247507, 0.11904762]], "rps_doc_ml_palm_score": [[0, 247507, 0.69913703]], "rps_doc_ml_wikipedia_score": [[0, 247507, null]], "rps_doc_ml_wikiref_score": [[0, 247507, 0.40797824]], "rps_doc_books_importance": [[0, 247507, -18529.98340892]], "rps_doc_openwebtext_importance": [[0, 247507, -813.19785596]], "rps_doc_wikipedia_importance": [[0, 247507, -9615.97991429]], "rps_doc_num_sentences": [[0, 247507, 2567.0]]} |
Executive Insights from FDLI’s 2015 Enforcement, Litigation & Compliance Conference: Trends and Risks for 2016 and Beyond
Early December is one of my favorite times of year. Coming off of Thanksgiving and looking forward to the end-of-year holidays, it’s when I get to reflect on the past 12 months, and think about what the next year will bring. It’s also when the annual FDLI Enforcement, Litigation and Compliance Conference is held in Washington, DC. For me, this is a “must-attend” event, as the conference brings together some of the most-respected speakers and attendees in FDA-regulated industry to discuss the most pressing of matters: enforcement, litigation and compliance. The conference is widely attended by senior executives in quality and regulatory; in-house FDA regulatory attorneys; food and drug law attorneys from the private bar; senior-level FDA staff; and leading consulting firms.
Some of the notable high-profile speakers at last week’s conference included:
Jonathan Olin, Deputy Assistant Attorney General, Civil Division, US Department of Justice
Michael Roosevelt, Deputy Director, Office of Compliance, CFSAN
Mary Malarkey, Director, Office of Compliance and Biologics Quality, CBER
Cynthia Schnedar, Director, Office of Compliance, CDER
CAPT Sean Boyd, Acting Director, Office of Compliance, CDRH
Julie Taitsman, Chief Medical Officer, Office of Inspector General, US Department of Health and Human Services
Beth Weinman, Associate Chief Counsel for Enforcement, Office of Chief Counsel, FDA
George Karavetsos, Director, Office of Criminal Investigation, Food and Drug Administration
Douglas Stearn, Director, Office of Enforcement and Import Operations, ORA
Howard Sklamberg, Deputy Commissioner, Global Regulatory Operations and Policy, FDA
This year’s conference provided great insight into the government’s current thinking and resulting enforcement approaches against FDA-regulated companies. Overall, the multiple responsible agencies within the US Government – FDA, DOJ, OIG, FTC, DEA to name a few – are working closely together and are fully aligned to root out both bad actor corporations and those companies not committed to full compliance with the laws. Historically, the government’s focus has been primarily domestic in nature. However, inspection activity continues to increase for all regulated product classes at sites outside the United States. Since manufacturing continues to shift to offshore locations, ex-US inspection activity will continue to be an area of expanded focus. The government is fully committed to extending its reach to bad actor and non-compliant companies outside the US borders that would place the American public at risk.
It is remarkable to see the level of program alignment not only within FDA Centers, but also among the FDA Office of Criminal Investigation (OCI), US Department of Justice (DOJ), US Federal Trade Commission (FTC), and the Office of Inspector General (OIG). Senior litigating and prosecuting attorneys from these organizations spoke consistently and firmly on their combined efforts to protect US public health, and to find and punish those companies and individuals that would take advantage of the public for pure profit motives.
There were a few prominent themes and topics discussed at the conference that are important to note. This information may be useful to executives, in-house counsel, and staff compliance personnel who did not have the opportunity to attend this important meeting:
Seeking individual executive accountability under the “Yates Memorandum” will continue to be a focus area for future government enforcement activity.
The Yates Memorandum has received significant attention from the Food and Drug bar and other attorneys involved with corporate compliance activities. The Yates Memorandum reiterates the traditional position of the US Department of Justice that looks to establish individual accountability for the actions of corporate executives where corporate misconduct occurs. It is clear from all of the government speakers that the government is firmly committed to embracing and implementing the principles of the Yates Memorandum, and they will seek to implicate and charge individual senior executives within companies that are directly, or even indirectly, responsible for corporate misconduct.
There was much discussion (and among lawyers that means differing views) at the Conference about Yates and the government’s focus. Whether this is a true substantive change, or really more a reiteration of existing policy, was not broadly agreed upon. However, one thing is clear: senior executives need to fully understand their responsibility under applicable law and the corresponding expectations of the government regarding their conduct in their individual roles within corporations.
There is no function where this is more significant than for heads of quality (VPs of Quality, Senior Directors of Quality, Chief Quality Officers, etc.) within pharmaceutical, medical device, food, dietary supplement and other FDA-regulated firms. It can’t be stated more plainly than this: your personal liberty could be at risk for your actions, decisions, and conduct in your day-to-day job within your company. The government may, and likely will, seek to hold you personally responsible for your actions or decisions where violative conduct under the Food, Drug and Cosmetic Act occurs. If you don’t understand what this means, you should speak with your in-house counsel for guidance.
Both dietary supplement companies and pharmacy compounding companies are squarely in the crosshairs of the FDA and the US Department of Justice.
Both of these industry segments, for different reasons, are under significant scrutiny, and further enforcement action is almost certain.
Pharmacy compounding companies continue to challenge their responsibilities under federal law, and challenge the clear expectations and authority of the FDA. There continues to be a high number of problem companies among the compounders who are not in compliance with current Good Manufacturing Practices, and who place the American public at risk from adulterated products.
Bad actors within the otherwise reputable dietary supplement industry are giving the entire industry a bad name, and providing a focus for enforcement actions. The challenge for dietary supplement companies is that all companies are likely being painted with the broad-brush of the potential bad actors. Notable issues include the preponderance of drug-related claims for non-drug products; inclusion of APIs in “natural” or “herbal” products; and compliance with manufacturing regulations. It would serve all companies within the dietary supplement industry well to ensure that they have an aggressive and robust approach to their compliance and quality management programs, so that when an FDA Investigator walks through their door, they can tell a complete and comprehensive story as to how they are ensuring product quality, protecting the consumer, and complying with all applicable FDA requirements and expectations.
Food safety through compliance enforcement remains a top priority of both the FDA and the related government organizations.
The topic of food safety was specifically cited by both Jonathan Olin, Deputy Assistant Attorney General, and Michael Roosevelt, Center for Food Safety and Applied Nutrition (CFSAN), as an area that will get significant attention over the next year or so. Of particular concern is that all food companies will need to have Food Safety Plans in place by September 2016. These plans are designed to ensure that food manufacturers have identified the full complement of risks from food production and processing activities; have controlled and mitigated those risks; and can explain those risks and control frameworks properly to an FDA Investigator.
Although Food Safety Plans are not excessively complicated to develop (though they do require specialized knowledge and skills), they represent an important step in contemporary food safety assurance and will be a focus area for the Food and Drug Administration going forward. There are an extraordinarily large number of products and product families that need attention, and likely a large number of companies that have not yet engaged in the development of these plans. Food companies would do well to ensure that they have obtained expert assistance to develop and get plans in place to address this requirement by the September 2016 implementation date specified under FSMA.
International enforcement activities and ex-US supply chains will be an area of increasing focus for FDA enforcement activity.
Threats to the US public from ex-US manufacturing operations are mainly caused by two types of companies: a) those that engage in direct intentional criminal conduct to defraud the US consumer with counterfeit and dangerous products; and b) those that fail to fully embrace their obligations under US Food and Drug laws and requirements, and that engage in sub-standard, non-compliant conduct to improve the profitability of their ongoing operations. Both of these types of organizations are clear targets of the FDA and related enforcement agencies, and both will continue to be at risk of significant enforcement investigations and actions over the coming years. As an example of this focus, Sean Boyd, Acting Director, CDRH, Office of Compliance, stated that CDRH is increasingly focused on ex-US inspections, and is driving to a 50-50 split between US domestic and ex-US inspections in the near future. This shift is reflective of the overall shift of the medical device supply chain to ex-US supply companies. Enforcement attention will continue to be focused on pharmaceutical supply chains in India and China as well. A main focus here is, of course, on data integrity and the establishment and assurance of the foundational programs that will ensure the agency can rely on data and documentation prepared by these companies supplying products into the US. Presentations by the regulators focused on how many of the deficiencies observed are fundamental in nature, and reflect basic lack of understanding in areas like good documentation practice, data management principles, roles and responsibilities, and training.
Tobacco enforcement, the new kid on the block for FDA regulation, will continue to see increasing enforcement activity and focus.
It is clear that the Center for Tobacco Products has continued to mature, and its enforcement program has become robust and sophisticated. The level of activity in the Center directed towards enforcement has increased year-over-year since the Center was put into service. This focus area will remain a priority, and will continue to increase in its scope and rigor over the coming years.
There were many, many more interesting topics of discussion at this important conference, such as DQSA / Track and Trace; pharmaceutical quality metrics (again!); the role of social media and government perspective and enforcement; mHealth and the digital revolution; the CDRH CTQ initiative; etc. For those of us who spend our days dealing with enforcement and compliance, a conference like this validates the important work that we perform for our client companies by reducing business risk, and as an additional benefit, in helping to ensure safe and effective products for the American people.
If anyone would like additional insights into topics discussed at this program, please feel free to contact me at your convenience. | 2023-14/0000/en_head.json.gz/3603 | {"url": "https://compliancearchitects.com/tag/ftc/", "partition": "head_middle", "language": "en", "source_domain": "compliancearchitects.com", "date_download": "2023-03-20T09:11:32Z", "digest": "sha1:C7MYFYJUCIB3AEE3ZHZI7EC5M5PZBG3H"} | {"ccnet_length": [[0, 11577, 11577.0]], "ccnet_original_length": [[0, 11577, 18797.0]], "ccnet_nlines": [[0, 11577, 33.0]], "ccnet_original_nlines": [[0, 11577, 211.0]], "ccnet_language_score": [[0, 11577, 0.96]], "ccnet_perplexity": [[0, 11577, 285.2]], "ccnet_bucket": [[0, 11577, 0.0]], "rps_doc_curly_bracket": [[0, 11577, 0.0]], "rps_doc_ldnoobw_words": [[0, 11577, 0.0]], "rps_doc_lorem_ipsum": [[0, 11577, 0.0]], "rps_doc_stop_word_fraction": [[0, 11577, 0.38435039]], "rps_doc_ut1_blacklist": [[0, 11577, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 11577, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 11577, 0.01898008]], "rps_doc_frac_chars_dupe_6grams": [[0, 11577, 0.01084576]], "rps_doc_frac_chars_dupe_7grams": [[0, 11577, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 11577, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 11577, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 11577, 0.00730003]], "rps_doc_frac_chars_top_3gram": [[0, 11577, 0.01001147]], "rps_doc_frac_chars_top_4gram": [[0, 11577, 0.00876004]], "rps_doc_frac_all_caps_words": [[0, 11577, 0.03248031]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 11577, 0.0]], "rps_doc_frac_no_alph_words": [[0, 11577, 0.12746063]], "rps_doc_frac_unique_words": [[0, 11577, 0.36697248]], "rps_doc_mean_word_length": [[0, 11577, 5.49827982]], "rps_doc_symbol_to_word_ratio": [[0, 11577, 0.0]], "rps_doc_unigram_entropy": [[0, 11577, 5.64428823]], "rps_doc_word_count": [[0, 11577, 1744.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 122, 0.0], [122, 922, 1.0], [922, 1000, 0.0], [1000, 1091, 0.0], [1091, 1155, 0.0], [1155, 1229, 0.0], [1229, 1284, 0.0], [1284, 1344, 0.0], [1344, 1455, 0.0], [1455, 1539, 0.0], [1539, 1631, 0.0], [1631, 1706, 0.0], [1706, 1790, 0.0], [1790, 2710, 1.0], [2710, 3241, 1.0], [3241, 3505, 0.0], [3505, 3655, 1.0], [3655, 4343, 1.0], [4343, 4856, 1.0], [4856, 5545, 1.0], [5545, 5690, 1.0], [5690, 5828, 1.0], [5828, 6203, 1.0], [6203, 7126, 1.0], [7126, 7250, 1.0], [7250, 7894, 1.0], [7894, 8575, 1.0], [8575, 8702, 1.0], [8702, 10327, 1.0], [10327, 10457, 1.0], [10457, 10854, 1.0], [10854, 11446, 1.0], [11446, 11577, 1.0]], "rps_lines_javascript_counts": [[0, 122, 0.0], [122, 922, 0.0], [922, 1000, 0.0], [1000, 1091, 0.0], [1091, 1155, 0.0], [1155, 1229, 0.0], [1229, 1284, 0.0], [1284, 1344, 0.0], [1344, 1455, 0.0], [1455, 1539, 0.0], [1539, 1631, 0.0], [1631, 1706, 0.0], [1706, 1790, 0.0], [1790, 2710, 0.0], [2710, 3241, 0.0], [3241, 3505, 0.0], [3505, 3655, 0.0], [3655, 4343, 0.0], [4343, 4856, 0.0], [4856, 5545, 0.0], [5545, 5690, 0.0], [5690, 5828, 0.0], [5828, 6203, 0.0], [6203, 7126, 0.0], [7126, 7250, 0.0], [7250, 7894, 0.0], [7894, 8575, 0.0], [8575, 8702, 0.0], [8702, 10327, 0.0], [10327, 10457, 0.0], [10457, 10854, 0.0], [10854, 11446, 0.0], [11446, 11577, 0.0]], "rps_lines_num_words": [[0, 122, 16.0], [122, 922, 124.0], [922, 1000, 11.0], [1000, 1091, 12.0], [1091, 1155, 8.0], [1155, 1229, 10.0], [1229, 1284, 7.0], [1284, 1344, 9.0], [1344, 1455, 16.0], [1455, 1539, 12.0], [1539, 1631, 11.0], [1631, 1706, 10.0], [1706, 1790, 10.0], [1790, 2710, 138.0], [2710, 3241, 83.0], [3241, 3505, 41.0], [3505, 
Now that I’m back to working in a school, I’m back to experiencing all of the perks that come with that environment. One of those perks is the field trip. I say perk. I might actually mean punishment. I’m the new guy. I’ve only been in my current position for a few weeks. This week, …
5 Top Tips To Win More With Any Lottery
Cornejo & Sons and all of our subsidiaries are dedicated to providing quality services and materials on time. We are committed to exceeding our customers’ expectations and providing resolutions for their construction projects.
Operating with more than 500 employees and substantial plant and equipment resources, Cornejo & Sons provides services and products for:
Aggregate Production & Delivery
Construction & Demolition Landfills
Ready-Mix Concrete Production
Specialty Projects such as Wind Farms
Stone Quarry & Sand Mines
Organization Brief:
Name: Cornejo & Sons
Website: http://www.cornejocorp.com
Telephone No.: (316) 522-5100
Address: 2060 E. Tulsa Street
Wichita, KS, 67216, US
We are committed to reducing the consumption of raw materials, providing a safe environment for our employees and customers, maintaining a local presence to bring growth to, and connect with, our communities, and implementing best practices that allow us to provide quality products and services. At Cornejo & Sons, we believe in the necessity of doing it right the first time. | 2023-14/0000/en_head.json.gz/3606 | {"url": "https://cornejocorp.com/about/", "partition": "head_middle", "language": "en", "source_domain": "cornejocorp.com", "date_download": "2023-03-20T08:56:03Z", "digest": "sha1:3YEBEA36PT3E6S5NXA7OJM3NB6PPE5V2"} | {"ccnet_length": [[0, 1062, 1062.0]], "ccnet_original_length": [[0, 1062, 2101.0]], "ccnet_nlines": [[0, 1062, 14.0]], "ccnet_original_nlines": [[0, 1062, 79.0]], "ccnet_language_score": [[0, 1062, 0.87]], "ccnet_perplexity": [[0, 1062, 303.7]], "ccnet_bucket": [[0, 1062, 0.0]], "rps_doc_curly_bracket": [[0, 1062, 0.0]], "rps_doc_ldnoobw_words": [[0, 1062, 0.0]], "rps_doc_lorem_ipsum": [[0, 1062, 0.0]], "rps_doc_stop_word_fraction": [[0, 1062, 0.29473684]], "rps_doc_ut1_blacklist": [[0, 1062, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 1062, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 1062, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 1062, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 1062, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 1062, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 1062, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 1062, 0.05063291]], "rps_doc_frac_chars_top_3gram": [[0, 1062, 0.03222094]], "rps_doc_frac_chars_top_4gram": [[0, 1062, 0.03682394]], "rps_doc_frac_all_caps_words": [[0, 1062, 0.01578947]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 1062, 0.0]], "rps_doc_frac_no_alph_words": [[0, 1062, 0.21578947]], "rps_doc_frac_unique_words": [[0, 1062, 0.67333333]], "rps_doc_mean_word_length": [[0, 1062, 5.79333333]], "rps_doc_symbol_to_word_ratio": [[0, 1062, 0.0]], "rps_doc_unigram_entropy": [[0, 1062, 4.40984642]], "rps_doc_word_count": [[0, 1062, 150.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 227, 1.0], [227, 363, 0.0], [363, 395, 0.0], [395, 431, 0.0], [431, 461, 0.0], [461, 499, 0.0], [499, 525, 0.0], [525, 545, 0.0], [545, 566, 0.0], [566, 602, 0.0], [602, 632, 0.0], [632, 662, 0.0], [662, 685, 0.0], [685, 1062, 1.0]], "rps_lines_javascript_counts": [[0, 227, 0.0], [227, 363, 0.0], [363, 395, 0.0], [395, 431, 0.0], [431, 461, 0.0], [461, 499, 0.0], [499, 525, 0.0], [525, 545, 0.0], [545, 566, 0.0], [566, 602, 0.0], [602, 632, 0.0], [632, 662, 0.0], [662, 685, 0.0], [685, 1062, 0.0]], "rps_lines_num_words": [[0, 227, 32.0], [227, 363, 19.0], [363, 395, 3.0], [395, 431, 3.0], [431, 461, 3.0], [461, 499, 6.0], [499, 525, 4.0], [525, 545, 2.0], [545, 566, 3.0], [566, 602, 2.0], [602, 632, 4.0], [632, 662, 5.0], [662, 685, 4.0], [685, 1062, 60.0]], "rps_lines_numerical_chars_fraction": [[0, 227, 0.0], [227, 363, 0.02272727], [363, 395, 0.0], [395, 431, 0.0], [431, 461, 0.0], [461, 499, 0.0], [499, 525, 0.0], [525, 545, 0.0], [545, 566, 0.0], [566, 602, 0.0], [602, 632, 0.41666667], [632, 662, 0.14814815], [662, 685, 0.26315789], [685, 1062, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 227, 0.0], [227, 363, 0.0], [363, 395, 0.0], [395, 431, 0.0], [431, 461, 0.0], [461, 499, 0.0], [499, 525, 0.0], [525, 545, 0.0], [545, 566, 0.0], [566, 602, 0.0], [602, 632, 0.0], [632, 662, 0.0], [662, 685, 0.0], [685, 1062, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 227, 0.01321586], [227, 363, 0.02205882], [363, 395, 0.09375], 
The Cornell Daily Sun (https://cornellsun.com/2018/02/01/cornell-silent-for-second-day-as-mystery-of-business-deans-exit-grows/)
Cameron Pollack / Sun Photography Editor
Soumitra Dutta unexpectedly resigned from his post as the dean of the College of Business on Tuesday. Neither he nor the University has provided a reason for the exit.
Cornell Silent for Second Day as Mystery of Business Dean’s Exit Grows
By Nicholas Bogel-Burroughs | February 1, 2018
The mystery surrounding Soumitra Dutta’s abrupt resignation from his role atop Cornell University’s business college continued to grow on Wednesday as Dutta and Cornell refused for a second day to provide any reason for his sudden exit.
Provost Michael Kotlikoff said in an email to SC Johnson College of Business employees and students on Tuesday afternoon that he had accepted Dutta’s resignation letter, effective immediately, and had named an interim dean to lead the business college.
Kotlikoff’s brief statement blindsided all but a select handful of Cornellians, catching many business faculty members off guard and surprising some employees in Dutta’s own Statler Hall office who read the news for the first time in The Sun.
Even L. Joseph Thomas, whom Kotlikoff named to replace Dutta as interim dean while Cornell searches for a permanent successor, only learned of Dutta’s imminent departure on Monday, one day before it was publicly announced.
“I was asked to do this interim job two days ago, and it was a surprise,” Thomas said in an email to The Sun on Wednesday night.
Dutta, 54, has not responded to multiple requests for comment since he resigned, but has said in recent days that he expects to remain a member of the faculty, according to a Cornell alumnus briefed on the conversations who spoke on the condition of anonymity to discuss sensitive information.
Cornell’s spokesman, John Carberry, has declined to answer any questions about what led to Dutta’s resignation and had no information, as of early Thursday morning, on whether Dutta would remain on the faculty.
Michael Wenye Li / Sun Assistant Photography Editor
Sage Hall is home to the SC Johnson Graduate School of Management, one of the three schools that are now part of the College of Business.
Thomas met with campus leaders and spoke several times with Kotlikoff on Wednesday, his first full day serving in a position he likely never expected to take after retiring in 2016 from teaching at Cornell.
“I am very pleased to help Cornell and my college in any way I can,” Thomas, who led the SC Johnson Graduate School of Management from 2007 to 2012 as dean, told The Sun. “Cornell is a wonderful university, and it has been good for me and my family. I did like my retirement activities, but those can be scaled back a bit.”
Thomas said he will serve only as long as it takes Cornell to find a permanent dean for the College of Business, which opened in July of 2016 and includes the School of Hotel Administration, the Johnson Graduate School and the Charles H. Dyson School of Applied Economics and Management.
Asked about Dutta, Thomas said only that the former dean had “led the new college very well” and that members of the college are “well positioned to move forward because of” Dutta’s and the leadership team’s efforts.
Lourdes Casanova, Dutta’s wife and a senior lecturer at the Johnson Graduate School, declined to comment when reached at the couple’s house this week.
“I’m sorry, but we cannot say anything,” Casanova said from just inside the door of the couple’s home on Tuesday evening.
Several College of Business faculty members said it was notable that Dutta’s resignation came less than a week after the college’s deputy dean, Chris Barrett, said on Jan. 22 that he would step down from his post in July and return to teaching and research.
Barrett did not respond to requests for comment, but the Cornell alumnus, who had been briefed on the matter, said Barrett’s stepping down was not connected in any way to Dutta’s resignation. Carberry, the Cornell spokesman, confirmed Barrett’s announcement but could not say whether it was connected to Dutta’s.
Prof. William Henri Lesser, science and business, who served as the interim director of the Dyson School for a few months in 2016 and retired this January, said the announcement of Dutta’s resignation had been “opaque.” Lesser also said he was concerned about the financing of the College of Business, which he described as “suspect.”
“We all need to know what has gone wrong, for certainly something has,” Lesser said in an email. “Depending on the reason, there can be lots of reason for concern.”
Lesser said the business college had spent too much money on extra deans, offices and expensive faculty hires and does not have sufficient sources of revenue.
“In my opinion the whole venture [of the College of Business] has been poorly conceived and executed,” Lesser added. “Hopefully the multiple costs will not be too extensive.”
Cameron Pollack / Sun File Photo
Soumitra Dutta resigned unexpectedly on Tuesday from his post as dean of the College of Business. Neither Cornell nor Dutta have said what led to his sudden departure.
Publicly, at least, Dutta had made no indication in recent weeks that he was planning to resign. He represented the College of Business at an event in New York less than a week ago, on Jan. 25, and he is also listed as a keynote speaker at an information technology conference in India in late February.
In an interview with Cornell’s official media service, the Cornell Chronicle, in July, Dutta said there was more he wanted to accomplish at the University.
“We still have a lot more that we can do together, in all three schools combined,” he said, referring to the hotel school, the Johnson Graduate School and the Dyson school. Dutta said there “is a lot more to be done, but it’s all exciting, good stuff.”
Kotlikoff’s four-sentence statement on Tuesday made no mention of Dutta’s more than five years with the University, during which he led the Johnson Graduate School as dean beginning in July of 2012. When Kotlikoff announced, in December of 2015, that Cornell was seeking to create the College of Business, the provost also named Dutta as the college’s inaugural dean.
Dutta oversaw about 200 faculty members and 3,100 students while leading the College of Business through its first, controversial years — beginning in the summer of 2016 — during which students, alumni, faculty and staff protested its creation and launch.
Dutta, who was born in India, graduated from the Indian Institute of Technology in New Delhi in 1985 before earning a Ph.D. in computer science in 1990 at UC Berkeley, where he studied artificial intelligence, according to his curriculum vitae, which Cornell hosted on its website until it removed his profile on Tuesday.
When Cornell hired Dutta in 2012 to lead the Johnson Graduate School as its 11th dean, then-president David Skorton said Dutta was a “natural fit,” with expertise in emerging media and “extensive experience on the international stage.”
Cornell’s proposal in December of 2015 to merge three schools under one business college umbrella sparked controversy, and Dutta took on the challenge of guiding the college with a steady hand, leading an institution that many said they did not want to exist.
After months of student and faculty unrest, Dutta sought to quell concerns in 2016 in what the Financial Times called a “charm offensive.” Dutta, largely working behind the scenes, argued to alumni and students that merging the three schools would boost their national marks and allow for students to enroll in a broader range of courses. He also maintained that he would protect each school’s unique identity.
Where Dutta worked methodically, H. Fisk Johnson ‘79 made a splash. His $150 million donation in January of 2017 to the College of Business, the largest donation ever made to the Ithaca campus, led to an increase in support for the merged programs and permanently affixed Johnson’s name to the college. Johnson did not return a message left with a company spokesperson on Wednesday.
In January of 2017, Dutta said Johnson’s “extraordinary gift” would expand the college’s learning and research opportunities and “support our mission to realize the full potential of Cornell’s business programs.”
Thomas, in his email to The Sun, said that “as you might expect, I am very busy,” and that while he has not yet prepared any specific initiatives — he’s been on the job for only a day — he plans to “move the college forward” rather than to keep things as they were. As an emeritus professor, Thomas said he had been volunteering and staying connected to Cornell while in retirement.
Interim Dean L. Joseph Thomas
In addition to overseeing business students on the Ithaca campus, Thomas, as dean of all undergraduate and graduate business programs, will also manage the Cornell Tech MBA program at Cornell’s campus on Roosevelt Island in New York, which had 53 students in its 2017 class.
Thomas has taught at the Johnson Graduate School since 1967 and has been a full professor in the school since 1977. He graduated from Carnegie-Mellon University in 1964 and earned his Ph.D. at Yale in 1968, studying operations research.
Dutta has been widely published over several decades — his CV is 30 pages long — and he has held many jobs in at least four countries over more than three decades while remaining on dozens of boards.
Prior to being hired by Cornell, he served as an administrator and professor in various roles from 1989 to 2012 at INSEAD in France, a leading international business school that also has campuses in the United Arab Emirates and Singapore. He worked briefly as a visiting researcher at the General Electric company in Schenectady in 1990 as well.
Dutta also says on his CV, which was last updated in January, that he has been invited to attend the World Economic Forum Annual Meeting in Davos, Switzerland, for more than 10 years. He co-edits a frequently-cited innovation report each year and serves on the boards of Sodexo, Dassault Systemes and Tower Learning Solutions, which is a subsidiary of Cornell.
Dutta has received $1.65 million in research grants and gifts since 2012, according to the CV, and is currently chair of the Association to Advance Collegiate Schools of Business, which says on its website that it is the world’s largest business education alliance.
About Nicholas Bogel-Burroughs
Nicholas Bogel-Burroughs covers his hometown of Ithaca as a senior writer. He previously served as city editor. Send tips by email to [email protected] or via the Signal or WhatsApp apps to 315-730-8907.
More by Nicholas
Cornell Kept Business College Dean’s Impending Resignation Secret from Top Administrators
By Nicholas Bogel-Burroughs February 6, 2018
Even the business college’s own top administrators were kept in the dark.
Following Unexplained Resignation, Former Business Dean to Stay at Cornell as Professor
Soumitra Dutta, who abruptly resigned from leading the College of Business this week, will stay at Cornell as a professor. | 2023-14/0000/en_head.json.gz/3607 | {"url": "https://cornellsun.com/2018/02/01/cornell-silent-for-second-day-as-mystery-of-business-deans-exit-grows/", "partition": "head_middle", "language": "en", "source_domain": "cornellsun.com", "date_download": "2023-03-20T10:18:39Z", "digest": "sha1:ZZG3VZBBLEUAZJK36VXXQT6LJSQVE45N"} | {"ccnet_length": [[0, 11153, 11153.0]], "ccnet_original_length": [[0, 11153, 13695.0]], "ccnet_nlines": [[0, 11153, 57.0]], "ccnet_original_nlines": [[0, 11153, 218.0]], "ccnet_language_score": [[0, 11153, 0.98]], "ccnet_perplexity": [[0, 11153, 224.0]], "ccnet_bucket": [[0, 11153, 0.0]], "rps_doc_curly_bracket": [[0, 11153, 0.0]], "rps_doc_ldnoobw_words": [[0, 11153, 0.0]], "rps_doc_lorem_ipsum": [[0, 11153, 0.0]], "rps_doc_stop_word_fraction": [[0, 11153, 0.41211293]], "rps_doc_ut1_blacklist": [[0, 11153, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 11153, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 11153, 0.05129334]], "rps_doc_frac_chars_dupe_6grams": [[0, 11153, 0.02069345]], "rps_doc_frac_chars_dupe_7grams": [[0, 11153, 0.00836544]], "rps_doc_frac_chars_dupe_8grams": [[0, 11153, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 11153, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 11153, 0.01761145]], "rps_doc_frac_chars_top_3gram": [[0, 11153, 0.02619703]], "rps_doc_frac_chars_top_4gram": [[0, 11153, 0.02201431]], "rps_doc_frac_all_caps_words": [[0, 11153, 0.00956284]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 11153, 0.0]], "rps_doc_frac_no_alph_words": [[0, 11153, 0.15528233]], "rps_doc_frac_unique_words": [[0, 11153, 0.36887922]], "rps_doc_mean_word_length": [[0, 11153, 4.94287269]], "rps_doc_symbol_to_word_ratio": [[0, 11153, 0.0]], "rps_doc_unigram_entropy": [[0, 11153, 5.6493835]], "rps_doc_word_count": [[0, 11153, 1838.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 129, 0.0], [129, 170, 0.0], [170, 338, 1.0], [338, 409, 0.0], [409, 456, 0.0], [456, 484, 0.0], [484, 517, 0.0], [517, 754, 1.0], [754, 1007, 1.0], [1007, 1250, 1.0], [1250, 1473, 1.0], [1473, 1602, 1.0], [1602, 1896, 1.0], [1896, 2107, 1.0], [2107, 2159, 0.0], [2159, 2297, 1.0], [2297, 2504, 1.0], [2504, 2828, 1.0], [2828, 3116, 1.0], [3116, 3333, 1.0], [3333, 3484, 1.0], [3484, 3606, 1.0], [3606, 3864, 1.0], [3864, 4177, 1.0], [4177, 4516, 1.0], [4516, 4681, 1.0], [4681, 4847, 1.0], [4847, 5022, 1.0], [5022, 5055, 0.0], [5055, 5223, 1.0], [5223, 5527, 1.0], [5527, 5683, 1.0], [5683, 5936, 1.0], [5936, 6304, 1.0], [6304, 6560, 1.0], [6560, 6882, 1.0], [6882, 7118, 1.0], [7118, 7378, 1.0], [7378, 7789, 1.0], [7789, 8172, 1.0], [8172, 8385, 1.0], [8385, 8768, 1.0], [8768, 8798, 0.0], [8798, 9073, 1.0], [9073, 9309, 0.0], [9309, 9509, 1.0], [9509, 9855, 1.0], [9855, 10216, 1.0], [10216, 10482, 1.0], [10482, 10513, 0.0], [10513, 10717, 1.0], [10717, 10734, 0.0], [10734, 10824, 0.0], [10824, 10869, 0.0], [10869, 10943, 1.0], [10943, 11031, 0.0], [11031, 11153, 1.0]], "rps_lines_javascript_counts": [[0, 129, 0.0], [129, 170, 0.0], [170, 338, 0.0], [338, 409, 0.0], [409, 456, 0.0], [456, 484, 0.0], [484, 517, 0.0], [517, 754, 0.0], [754, 1007, 0.0], [1007, 1250, 0.0], [1250, 1473, 0.0], [1473, 1602, 0.0], [1602, 1896, 0.0], [1896, 2107, 0.0], [2107, 2159, 0.0], [2159, 2297, 0.0], [2297, 2504, 0.0], [2504, 2828, 0.0], [2828, 3116, 0.0], [3116, 3333, 0.0], [3333, 3484, 0.0], [3484, 3606, 0.0], [3606, 3864, 0.0], [3864, 4177, 0.0], [4177, 4516, 0.0], [4516, 4681, 
The Day the Music Died
November 3, 2012 | Cory Trenda
I got the call a little after 7am this morning from a mutual friend in Chicago: “Mark is dancing in eternity. He’s not suffering any longer.”
I’ve just lost the most faithful friend I’ve ever had, or that I expect I’ll ever have. Actually, I never expected to have this one. But I suppose friends like Mark are never expected.
Mark was diagnosed at age 56 with a rare Stage 4 lymphoma and told he’d have 3-4 more years. That was almost three years ago, so this was not a surprise. When he was beginning life’s end game a couple of months ago, we changed our September vacation plans to travel back to Chicago for a second, and final, extended visit. Two amazing things occurred which I’ll never forget.
I introduced Mark to the music of a budding Christian singer-guitarist back when we lived in Chicago 30 years ago, Bob Bennett, who lived way out in California. Mark got hooked and knew virtually every word to every song, and these became part of the soundtrack of our friendship together and of Mark’s life. Bob sings of brokenness and grace, and Mark was well acquainted with both, so Mark would say that Bob was “singing my life with his words”, as the lyric goes. We’d sing Bob’s songs together when he’d make his annual winter visit to SoCal. Through God’s amazing grace to Mark–and me–I was able to surprise Mark and arrange for Bob to come to Chicago while we were there, to give Mark a private concert. Mark was stunned; he alternated crying, singing and shaking his head. Bob did this for no fee, simply as an act of love for a guy he’d never met, at the request of another guy he’d never met. When I emailed Bob about the idea, I innocently mentioned that I know a friend of his–who as it turns out has been Bob’s best friend for 40 years. I think that’s all Bob needed to read, as he called me 30 minutes later and had already been looking at flights.
The other unforgettable memory came from another prompting. I knew we were choosing to visit Mark while he was still alive and coherent, in lieu of attending his memorial, but I also wanted to honor Mark that day in our absence. I knew immediately that I needed to write a eulogy…and read it to Mark while I was there. Reading it aloud was terribly difficult, yet one of the great honors of my life. How often do we get the chance to express our love and appreciation to someone as clearly as we do in our remarks about them after they are gone?
I seriously choked up multiple times, and then went out to the kitchen and wept hard. Mark, usually the emotional one, sat quietly, mostly watching me…and apparently listening. When I finished, he was silent. Then Mark, who has always been my most faithful editor, said in a voice that was commanding yet barely audible, “Don’t…change…a word.”
I thought I’d share my remarks here as well. I do it only to honor my dear friend, and maybe there’s a lesson for the rest of us about friendship from the way he modeled it. What a great privilege it has been for me to experience it; it’s also a privilege to testify to it.
Mark came to light up our lives when we moved to the Chicago area 30 years ago. We were in the Calvary Baptist Church library, and he was holding Matthew, his first son, who was a toddler at the time. I still remember that first encounter. And I remember how Mark was dressed…as I recall, Mark was wearing a coat and tie that day for some reason, but what I’m certain of is that he was dressed in a smile. This was before the days when Mark was the official or unofficial greeter at the church, but Mark never needed a job description to be welcoming, and we were new to the church and eager for friends.
He had me at hello.
No matter how many friends Mark already had, he was always happy–no, hungry–for more. Mark was always ravenously interested in people, his appetite for building relationships was never satiated, and as far as I know, his compassion inbox has never reached “FULL”.
So there may well be fifty other people in this world or even in this room who would say that Mark Archibald was their closest friend. Mark was that kind of guy. All I know is that he was mine. And I feel like my heart just slammed into a brick wall, and now it lies bleeding on the side of the road.
One of the reasons I know Mark was my best friend is that we didn’t agree on everything. I could drive him crazy, though I’ll never know why! Close friends walk that fine line: on the one hand, you want to fully show that you respect the individuality and rights of the other person to have his or her own opinions; on the other hand you care so much about them you want to be of one mind on everything, to share the same ideas and concerns you have. I’m blessed to count many friends, friends all over the world. Yet Mark, alone, was uniquely concerned with knocking me upside the head. As parents, none of us exercises the same level of concern for other people’s kids as we do our own; and that added level of concern is the measure of our love. I’ve always, always known that Mark’s concern was the measure of his love for me.
And truth be told, I wanted to knock him upside the head sometimes, too. Which means he’s not actually a friend at all…he’s a brother. And not just to me; he was a dear friend to our entire family.
Mark loved our kids like no friend I’ve ever had. Ben and Karey actually have a special code with Mark, a song they’d sing when they’d talk with Mark by phone or see him… Robert Palmer’s classic rocker, “Bad Case of Loving You.” One or the other would invariably start the conversation by belting out, “Doctor, doctor, gimme the news…”
I found out the other day that Mark even had our granddaughter Taylor’s phone number in his cellphone…and that her name is spelled out using the baby language her siblings used when she was little…”Tay-wer.”
How much must someone be a part of your life for them to do stuff like that? …to know–and remember–the pet family names of your… grandkids? Can you imagine having a friend like that, who takes that level of interest and shows that level of love to your loved ones? It’s simple: our loved ones were Mark’s loved ones too. Every one of them made special efforts to see Mark when he’d come to California for his annual visit, and they’ve all been praying for Mark these past few years. I’ve always felt bad that I didn’t have the chance to invest similarly in Mark and Debbie’s kids. They were younger than ours, and we moved away to California just after Jeremy was born. Even so, Matthew has a special section in our family’s urban legends. And sometimes I think, even though he’s so much like Mark, yet Matt could be a blood relative of mine: the way he plays the piano and sings is almost identical to me. Early this summer we came for a visit, and on the last evening Matthew and Lara came for dinner. Matthew and I sang duets to Andrew Lloyd Webber tunes and traded off playing the piano accompaniment, one song after another until we’d exhausted the songbooks and our voices. We had a fantastic time that I’ll never forget. But more immense than the joy Matt and I experienced, I looked over and there was Mark in his chair just bursting with pride and joy and contentment. Finally as we ended with some worship songs he couldn’t take it anymore, and during “How Great Thou Art” he began singing along at the top of his failing lungs, sucking in oxygen from his nose-tube so he could keep going. It was incredible.
That was Mark, giving all he had to give, and then pushing himself to give more.
I think Mark’s most singularly attractive character trait was beyond his love, his friendliness and even his humor, though every one of those was a defining trait that made Mark the amazing person he was. Rather, what attracted me most and for so long was his humanity. Mark was so terribly human, and he embraced his humanity. He was fallen, he knew it and he thanked God every day for grace. He made mistakes, he dusted himself off, he beat himself up, he questioned himself, he muddled through… and he kept on muddling even through the darkest times. Isn’t this why King David was such an intriguing character and probably the #1 personality and star of the Old Testament? He wasn’t perfect, but he was real, and he wasn’t private about his failings.
Like David, or Peter, Mark was a man after God’s heart.
Mark clung fiercely to the hand of God, and with his other hand, he held on just as fiercely to the rest of us. We got to experience Mark’s life with him, and he made himself comfortable in our lives. And as the most faithful editor of my writings, he coaxed my guts out onto paper by accepting me and challenging me to be real, too.
He was exactly what I would want and need in a “best friend.” That’s not a phrase I ever saw myself using. To have had that relationship with Mark is beyond my hopes. That it doesn’t continue to the end of my life is heartache. The fact that it continued to the end of Mark’s life is my gift back to a friend who has given me far more than I could ever repay.
New Beginnings (CA-23)
Newcomers welcome. Please contact us for meeting information, as this may be a Zoom meeting, hybrid, or face-to-face during the current COVID-19 crisis.
St. Luke's Lutheran Church and/or Zoom, 5633 E. Wardlow Road, Long Beach, CA
April 3 @ 7:30 pm - 9:00 pm PDT Recurring
April 10 @ 7:30 pm - 9:00 pm PDT Recurring
It Costs $22,648 And Requires 11 Agencies To Start A Restaurant In San Francisco
June 27, 2022 by Counter Markets
By Simon Black, Sovereign Man
In a report called Barriers to Business, the Institute for Justice (IJ) analyzed 20 US cities for how easy it is to open five different types of businesses. To cover a range, those businesses included a restaurant, a retail bookstore, a food truck, a barbershop, and a home-based tutoring business.
Entrepreneurs who want to start a restaurant, for example, have 13 different fees totaling $5,300, on average across the 20 cities. In San Francisco, those fees reach $22,648.
Remember, these costs and regulatory hurdles are all in addition to the normal costs and work of opening a restaurant.
The IJ also looked at the number of regulatory steps, and the number of government agencies it took to open a business. On average, across the 20 cities, it took 55 steps and eight government agencies just to open a barbershop.
And as if the cost and time burden wasn’t enough, the report found that in many of the cities, it was not even clear what all the requirements were to start a business.
For example, the report analyzed if a city gave entrepreneurs a “one-stop-shop” to open a business.
That means a website which clearly lays out the requirements, and allows the owner to complete most forms in one portal. IJ also looked at whether the portal is user-friendly and allows the business owner to track progress and organize their information from one online account.
None of the cities received a 5/5 for the one-stop-shop, while Birmingham and Des Moines received a 0. But Atlanta, Detroit, New Orleans, New York, Pittsburgh and San Francisco came close, with a 4/5.
And that shows that simply looking at one factor is not enough. Sure, San Francisco may make it relatively easy to navigate the process, but that doesn’t make up for the enormous cost.
And in Raleigh, it may only cost about $1,300 in fees to open a restaurant, but the city meets only 1/5 one-stop-shop criteria.
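To make that tradeoff concrete, here is a small Python sketch that folds the two measures quoted above (startup fees and the 0-5 one-stop-shop score) into a single toy index. The city figures are the ones cited in this article; the equal weighting and the $25,000 fee cap are my own illustrative assumptions, not anything from the IJ report.

```python
# Toy "business friendliness" index built from the figures quoted above.
# The weighting and fee cap are illustrative assumptions, not IJ methodology.

CITIES = {
    # city: (restaurant startup fees in USD, one-stop-shop score out of 5)
    "San Francisco": (22_648, 4),
    "Raleigh": (1_300, 1),
}

def friendliness(fees: float, one_stop: int, fee_cap: float = 25_000) -> float:
    """Average of a fee score (1.0 = free) and a portal score (1.0 = 5/5)."""
    fee_score = 1 - min(fees / fee_cap, 1)   # cheaper fees score higher
    portal_score = one_stop / 5              # better portals score higher
    return round((fee_score + portal_score) / 2, 2)

for city, (fees, score) in CITIES.items():
    print(f"{city}: fees=${fees:,}, one-stop={score}/5, "
          f"index={friendliness(fees, score)}")
```

On these two inputs alone, the cities land close together (roughly 0.45 vs. 0.57), which mirrors the article's point: a good portal cannot offset $22,648 in fees, and low fees cannot offset an opaque process.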
What this means:
The report correctly observes that “You shouldn’t need a law degree to start the small business of your dreams.” Obviously, these types of costs and burdens are the exact opposite of what made America so strong and prosperous in the first place.
It’s no surprise that businesses are fleeing San Francisco, where it costs tens of thousands of dollars in fees to start a restaurant.
Meanwhile, in San Antonio, Texas, it costs $2,477 and in Jacksonville, Florida, it’s $2,794.
Sadly, in terms of the number of agencies and bureaucratic steps involved, these cities aren’t much better than San Francisco.
What you can do about it:
We’ve said it before and we’ll say it again: local really matters. It is worth noting that the report only looked at medium to large cities. But when you are looking to start a business, you may be locating in a smaller city or town.
It makes sense to shop around, by town or county, to see what the requirements are before committing to a location. This will make it easier on you, but it also represents voting with your feet.
Also consider the type of business you will be opening. The total number of steps it took to open a home-based tutoring business was the fewest of any business analyzed in all 20 cities.
That is a good sign, since many of the most promising small-business opportunities these days are home-based. We even talked about the revolution in alternative schooling programs to replace public schools in this Sunday Intelligence article from March.
Service providers are popping up to support micro-schools, and regulations in many states, like Florida, allow former public school teachers to easily start a home-based schooling program. Some states, such as Arizona, even allow school choice funds to be spent on these types of private programs.
With parents fed up with woke indoctrination and COVID protocols in school, the market is there. And in many places, the regulatory burden is relatively small.
If you have an online business which provides services, you should consider Puerto Rico’s Act 60 tax incentives. Companies which export their services to customers outside of Puerto Rico can pay just a 4% corporate tax rate, then distribute dividends to the company owner tax free.
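For readers who want to see the arithmetic, here is a minimal sketch of the Act 60 math described above. The 4% corporate rate and tax-free dividends come from this article; the mainland comparison rates (21% corporate, 15% on dividends) are purely illustrative assumptions for contrast, not tax advice.

```python
# Rough sketch of the Act 60 arithmetic described above. Only the 4% rate and
# tax-free dividends are from the article; the comparison rates are assumed.

def owner_take_home(profit: float, corp_rate: float, dividend_rate: float) -> float:
    """Profit -> corporate tax -> dividend tax -> what the owner keeps."""
    after_corp = profit * (1 - corp_rate)
    return after_corp * (1 - dividend_rate)

profit = 100_000
act60 = owner_take_home(profit, corp_rate=0.04, dividend_rate=0.0)
mainland = owner_take_home(profit, corp_rate=0.21, dividend_rate=0.15)
print(f"Act 60: ${act60:,.0f} vs. illustrative mainland: ${mainland:,.0f}")
```

On $100,000 of profit, the Act 60 owner keeps $96,000, versus $67,150 under the illustrative mainland assumptions.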
And of course, you should always consider looking outside of your home country for opportunities as well.
As we mentioned earlier, the free-market American dream of starting any business easily is not what it used to be.
Sourced from ZeroHedge
Filed Under: Uncategorized | Tagged With: California, Entrepreneurship, Regulations, Simon Black
Blog — Givenchy
American Fashion at the 2021 Met Gala
September 15, 2021 | By: Tiffany Perez
Typically, when May comes around, celebrities, designers, and fashion lovers alike begin to feel the excitement as the ‘party of the year’ approaches. In this case, the party of the year is the infamous Met Gala, widely hailed as the “Oscars of the East Coast.” Formerly known as the Costume Institute Gala or the Costume Institute Benefit, it has been hosted by the Metropolitan Museum of Art’s Costume Institute in New York City for more than 73 years. This annual gala is a fundraising event for the Costume Institute that began in 1948 and continues to raise millions every year....
Tags: Balenciaga, Billie Eilish, Carolina Herrera, Christian Dior, Dior, Emily Blunt, Givenchy, Gucci, Karli Kloss, Keke Palmer, Kendall Jenner, Met Gala, Miu Miu, Peter Dundas, Quannah Chasinghorse, Rihanna, Sergio Hudson, Sienna Miller, Yara Shahidi
Journalist Sentenced to Four Years in Prison
November 8, 2002 12:00 PM EST
His Excellency Tran Duc Luong
President, Socialist Republic of Vietnam
c/o Ministry of Foreign Affairs
Via facsimile: 011-84-4-823-1872
Your Excellency:
The Committee to Protect Journalists (CPJ) condemns today’s sentencing of Internet essayist Le Chi Quang to four years in prison. We are also concerned for the safety of former soldier Tran Dung Tien, who was detained after protesting Quang’s trial.
Earlier today, following a three-hour trial on national security charges, the Hanoi People’s Court sentenced Le Chi Quang, 32, to four years in prison followed by three years of house arrest. Quang was charged under articles 88 and 92 of the Criminal Code, which ban the distribution of information that opposes the government. Quang’s parents were the only observers allowed into the courtroom, while his lawyer was not allowed to present a defense before the court, according to CPJ sources.
Quang, a law school graduate and computer teacher, was arrested on February 21 after writing several essays that criticized government policy. One essay, titled “Beware of Imperialist China,” criticized land and sea border agreements between China and Vietnam. Other essays discussed the U.S.-Vietnam bilateral trade agreement and the activities of Vietnamese dissidents.
While the chief judge in the case told foreign reporters that Quang had pleaded guilty, CPJ sources said that Quang admitted in court to having written the articles mentioned by the prosecution but denied committing any crime.
During Quang’s trial, about 100 family members and supporters gathered outside the courthouse. After Tran Dung Tien, not a journalist but a former soldier in the Vietnamese army, read aloud a statement protesting the trial, security officials arrested him, said CPJ sources. His current whereabouts are unknown. On October 14, Tien was one of 13 signatories to an open letter calling for the immediate release of Quang, writers Pham Hong Son and Nguyen Vu Binh, and human rights activist Nguyen Khac Toan.
Son and Binh, both 34, are also in prison for distributing information online. Son was arrested on March 27 after translating and publishing online an article titled, “What is Democracy?” which was originally published on the U.S. State Department’s Web site. Binh was arrested on September 25 after writing an essay criticizing the China-Vietnam border agreements, and he is currently being held in Cau Dien Prison, in Ha Tay District outside the capital, Hanoi. Authorities have not announced trial dates for Son or Binh.
Last month, Your Excellency’s government stepped up efforts to control citizens’ access to online information by requiring all domestic Web sites to obtain a license from the government to prevent the distribution of “harmful” information. Authorities also restrict access to foreign news sites. The BBC reported today that domestic access to its Vietnamese-language Web site has been blocked for a week.
As a nonpartisan organization of journalists dedicated to the defense of our colleagues worldwide, CPJ condemns the criminal prosecution of writers simply because they express dissenting viewpoints. Your government’s draconian response to any criticism of official policy will only damage Vietnam’s standing in the international community.
CPJ respectfully reminds Your Excellency that the right to freedom of expression is guaranteed both by the Vietnamese Constitution and by the United Nations International Covenant on Civil and Political Rights, to which Vietnam is a signatory.
We call for the immediate and unconditional release of Le Chi Quang, Tran Dung Tien, Pham Hong Son, and Nguyen Vu Binh.
Thank you for your attention to this important matter. We await your response.
Ann Cooper
Daily Current Affairs: 26 November 2022
Today’s important events: Constitution Day of India, National Milk Day, ‘India: The Mother of Democracy’, a sticker-based debit card called FIRSTAP, IIT Delhi ranks in the top 50, Swachh Vidyalaya Puraskar for 2021-22, Rupee-Dirham trade prospects, the government job guarantee scheme, Railways’ plan to export Vande Bharat trains by 2025-26, and India winning the Vice Presidency of the International Electro-technical Commission.
If you are a student preparing for entrance exams, or simply like to keep yourself updated, the following current affairs will help you. They provide daily updates on economic, national, and international events, highlighting news that can help in your exam preparation for UPSC, SSC, Railways, Banking, and other competitive exams. Here you can find all the important news headlines for 26th November 2022.
Constitution Day of India: Constitution Day is celebrated on 26th November every year in India to commemorate the adoption of the Constitution of India. The constitution of India was adopted by the Constituent Assembly on 26th November 1949, and it came into effect on 26th January 1950.
National Milk Day: National Milk Day is celebrated on 26th November every year to spread awareness among people of the importance, benefits, and need for milk.
Dharmendra Pradhan released the book ‘India: The Mother of Democracy’: The book is an attempt to showcase the democratic ethos ingrained in India since the dawn of civilization.
IDFC First Bank has introduced a sticker-based debit card called FIRSTAP: It was launched to facilitate transactions by simply tapping the sticker on a Near Field Communication (NFC) enabled point-of-sale terminal.
IIT Delhi Ranks in the Top 50 of Times Higher Education Employability Rankings: The Indian Institute of Technology (IIT) Delhi has been placed at the 28th position in the Top 50 of the Times Higher Education Global Employability University Ranking and Survey (GEURS). IIT Delhi is the only Indian institute to rank in the top 50. The varsity was ranked 27th last year.
Thirty-Nine Schools were Awarded Swachh Vidyalaya Puraskar for 2021-22:
Out of the 39 schools, 17 are elementary and 22 are secondary/higher secondary.
34 schools have been awarded a cash prize of ₹60,000, and ₹20,000 was awarded in the sub-categories.
India Wins the Vice Presidency of the International Electro-technical Commission: India’s representative, a member of the Indian National Committee of the International Electro-technical Commission (IEC) and of various technical committees of the Bureau of Indian Standards (BIS-India), was elected by securing over 90% of the votes cast by full members of the IEC during its General Meeting held recently in San Francisco, USA.
The central banks of India and the UAE discuss Rupee-Dirham trade prospects: The objective of the exercise is to reduce the cost of transactions. India and the UAE had already signed a free trade agreement (FTA) in February to give a fillip to bilateral trade and economic ties.
India Forms Panel to Revamp Its Only Government Job Guarantee Scheme: The Mahatma Gandhi National Rural Employment Guarantee Scheme, or MNREGA, was in high demand in India’s rural areas as they emerged from the pandemic amid soaring inflation and limited non-farm job opportunities.
Railways Plans to Export Vande Bharat Trains by 2025-26: The railways are looking at becoming a major exporter of Vande Bharat trains by 2025-26 to markets in Europe, South America, and East Asia, a senior official said, adding that the latest version of the indigenous trains, with sleeper coaches, will be operational by the first quarter of 2024.
In India, The Pandemic May Turn Back The Clock On Women's Empowerment
Matrilineal inheritance of clan identity remained an important part of many cultures long after contact, and women continued to use their maternal authority to influence political decisions within and outside of their own nations. Men were generally responsible for hunting, warfare, and interacting with outsiders, therefore they had more visible, public roles. Women, on the other hand, managed the internal operations of the community. They usually owned the family’s housing and household goods, engaged in agricultural food production and gathering of foodstuffs, and reared the children. Kinship, extended family, and clan bound people together within a system of mutual obligation and respect.
The next day we head out to the pink walled city of Jaipur, to meet with Radhika Kumari of the Pink City Rickshaw Company, a perfect example of women helping other women. Radhika trains women from the slums to be rickshaw drivers – some of the only female rickshaw drivers in all of India, I may add. And not only is she giving other women jobs, those women also become shareholders in the company, giving them a sense of ownership and purpose.
On 28 September 2018, the Supreme Court of India lifted a ban on the entry of women.
Police said Ravi, a founder of the local chapter of climate activist Greta Thunberg’s Fridays for Future movement, was publishing and disseminating an online toolkit of nonviolent tactics to support the protesting farmers.
The health of her family and herself is at risk, her education is often neglected, and her own opportunities are lost because of her long, daily tasks.
The bottom was cut zigzag with a point at each side and one in the middle of the front and back.
In India, the male-female sex ratio is skewed dramatically in favour of men, the chief reason being the high number of women who die before reaching adulthood. Tribal societies in India have a less skewed sex ratio than other caste groups. This is in spite of the fact that tribal communities have far lower income levels, lower literacy rates, and less adequate health facilities. Many experts suggest the higher number of men in India can be attributed to female infanticides and sex-selective abortions. The sex ratio is particularly bad in the north-western area of the country, particularly in Haryana and Jammu and Kashmir. Around 70% of women in India are victims of domestic violence, according to Renuka Chowdhury, former Union minister for Women and Child Development.
In embroidering on skin, a very sharp bone awl is used to make the holes. The point of the awl is pushed through from the back and the quill pulled tight, leaving a bit of the end to be bent and pressed into the skin. The design is made of lines composed of a series of upright stitches lying tight together. The width of the lines forming the designs determines the length of the stitch and these vary in width from one sixteenth to one quarter of an inch. For sewing, the women of the Plains used sinew instead of fiber thread.
Dianne Millich, a member of the Southern Ute Indian Tribe in Colorado, already knows better than anyone about the importance of the tribal provisions in the Violence Against Women Reauthorization Act. India’s agricultural reforms may also hurt women’s ability to resolve agricultural disputes. For women, the government-regulated market negotiators were an avenue for bargaining and price discovery that did not require women to physically enter the male-dominated agricultural markets and haggle with traders over crop prices. Yet their work feeding their families and the country remains undervalued and largely overlooked by the government. In a given crop season, when fields are sown and harvested, women farmers in India work about 3,300 hours, nearly double the 1,860 hours their male counterparts put into farming.
Violence against women, especially sexual violence, is a serious concern in India. It is common for girls in rural India to drop out of school to help carry the burden of fetching water. Eventually, they miss classes and lag far enough behind in school to abandon their education. Another reason for their low attendance is a lack of sanitation at their schools. Once a girl starts menstruating, it is very difficult and embarrassing for her to take proper care of herself when a private toilet may not be available to her. She also risks violence when exposed during that time, as Indian women have been assaulted while relieving themselves in public areas.
Many Hands, Light Work: Only 7% of Urban Indian Women Have Paid Jobs
7 People Who Fought to Give Women the Right to Vote: Learn about six real-life women and one man who pushed for voting equality. 5 Native American Leaders of the Wild West: These men admirably fought for the survival of their culture and land and left a lasting legacy for generations to come. The U.S. government committed to reforms, including a return to Malheur for the Paiute. So often when we think of the great Native American heroes of the past, we think of the brave male warriors and chiefs who led their people through war and the long journey into an uncertain future. Here, we honor the Native American women who soldiered alongside them.
The pressure, combined with the distance to water sources, creates back, foot, and posture problems. The heat increases their exhaustion, and the chore itself takes away much-needed time for other duties. Such rules of feminine modesty are not considered purdah but merely proper female behavior. For traditional Hindus of northern and central India, purdah observances begin at marriage, when a woman acquires a husband and in-laws. NAIWA’s members are women from federally recognized American Indian tribes. To further these aims, local and regional sodalities throughout the United States and Canada hold an annual convention.
They planted their corn in their villages every year, visited the same sugar groves, and mined Illinois lead. Removal would have required them to give up some of their economic status within their own communities, as well as their power with the Euro-American traders. As such, Sauk and Mesquakie women played a prominent role in defending their land, homes, and economic livelihoods. | 2023-14/0000/en_head.json.gz/3614 | {"url": "https://crawfordbaptistchurch.com/index.php/2021/11/22/in-india-the-pandemic-may-turn-back-the-clock-on-womens-empowerment/", "partition": "head_middle", "language": "en", "source_domain": "crawfordbaptistchurch.com", "date_download": "2023-03-20T09:07:17Z", "digest": "sha1:KVRNRBVJN6GJNBXODSKFDLWFQRW2TXBQ"} | {"ccnet_length": [[0, 6805, 6805.0]], "ccnet_original_length": [[0, 6805, 8052.0]], "ccnet_nlines": [[0, 6805, 18.0]], "ccnet_original_nlines": [[0, 6805, 88.0]], "ccnet_language_score": [[0, 6805, 0.96]], "ccnet_perplexity": [[0, 6805, 256.1]], "ccnet_bucket": [[0, 6805, 0.0]], "rps_doc_curly_bracket": [[0, 6805, 0.0]], "rps_doc_ldnoobw_words": [[0, 6805, 6.0]], "rps_doc_lorem_ipsum": [[0, 6805, 0.0]], "rps_doc_stop_word_fraction": [[0, 6805, 0.38832685]], "rps_doc_ut1_blacklist": [[0, 6805, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 6805, 0.07359307]], "rps_doc_frac_chars_dupe_5grams": [[0, 6805, 0.07359307]], "rps_doc_frac_chars_dupe_6grams": [[0, 6805, 0.07359307]], "rps_doc_frac_chars_dupe_7grams": [[0, 6805, 0.07359307]], "rps_doc_frac_chars_dupe_8grams": [[0, 6805, 0.07359307]], "rps_doc_frac_chars_dupe_9grams": [[0, 6805, 0.07359307]], "rps_doc_frac_chars_top_2gram": [[0, 6805, 0.01713564]], "rps_doc_frac_chars_top_3gram": [[0, 6805, 0.0036075]], "rps_doc_frac_chars_top_4gram": [[0, 6805, 0.00721501]], "rps_doc_frac_all_caps_words": [[0, 6805, 0.00389105]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 6805, 0.0]], "rps_doc_frac_no_alph_words": [[0, 6805, 0.12140078]], "rps_doc_frac_unique_words": [[0, 6805, 0.50933333]], "rps_doc_mean_word_length": [[0, 6805, 4.928]], "rps_doc_symbol_to_word_ratio": [[0, 6805, 0.0]], "rps_doc_unigram_entropy": [[0, 6805, 5.58763624]], "rps_doc_word_count": [[0, 6805, 1125.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 69, 0.0], [69, 768, 1.0], [768, 1213, 1.0], [1213, 1369, 0.0], [1369, 1465, 1.0], [1465, 1688, 1.0], [1688, 1840, 1.0], [1840, 1937, 1.0], [1937, 2189, 1.0], [2189, 2968, 1.0], [2968, 3027, 0.0], [3027, 3557, 1.0], [3557, 4376, 1.0], [4376, 5064, 1.0], [5064, 5131, 0.0], [5131, 5779, 1.0], [5779, 6406, 1.0], [6406, 6805, 1.0]], "rps_lines_javascript_counts": [[0, 69, 0.0], [69, 768, 0.0], [768, 1213, 0.0], [1213, 1369, 0.0], [1369, 1465, 0.0], [1465, 1688, 0.0], [1688, 1840, 0.0], [1840, 1937, 0.0], [1937, 2189, 0.0], [2189, 2968, 0.0], [2968, 3027, 0.0], [3027, 3557, 0.0], [3557, 4376, 0.0], [4376, 5064, 0.0], [5064, 5131, 0.0], [5131, 5779, 0.0], [5779, 6406, 0.0], [6406, 6805, 0.0]], "rps_lines_num_words": [[0, 69, 12.0], [69, 768, 101.0], [768, 1213, 82.0], [1213, 1369, 24.0], [1369, 1465, 18.0], [1465, 1688, 33.0], [1688, 1840, 27.0], [1840, 1937, 21.0], [1937, 2189, 44.0], [2189, 2968, 126.0], [2968, 3027, 9.0], [3027, 3557, 102.0], [3557, 4376, 123.0], [4376, 5064, 118.0], [5064, 5131, 12.0], [5131, 5779, 113.0], [5779, 6406, 95.0], [6406, 6805, 65.0]], "rps_lines_numerical_chars_fraction": [[0, 69, 0.0], [69, 768, 0.0], [768, 1213, 0.0], [1213, 1369, 0.06578947], [1369, 1465, 0.06521739], [1465, 1688, 0.0], [1688, 1840, 0.0], [1840, 1937, 0.0], [1937, 2189, 0.0], 
[2189, 2968, 0.00262812], [2968, 3027, 0.0], [3027, 3557, 0.0], [3557, 4376, 0.00997506], [4376, 5064, 0.0], [5064, 5131, 0.015625], [5131, 5779, 0.00314465], [5779, 6406, 0.0], [6406, 6805, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 69, 0.0], [69, 768, 0.0], [768, 1213, 0.0], [1213, 1369, 0.0], [1369, 1465, 0.0], [1465, 1688, 0.0], [1688, 1840, 0.0], [1840, 1937, 0.0], [1937, 2189, 0.0], [2189, 2968, 0.0], [2968, 3027, 0.0], [3027, 3557, 0.0], [3557, 4376, 0.0], [4376, 5064, 0.0], [5064, 5131, 0.0], [5131, 5779, 0.0], [5779, 6406, 0.0], [6406, 6805, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 69, 0.17391304], [69, 768, 0.00715308], [768, 1213, 0.02696629], [1213, 1369, 0.02564103], [1369, 1465, 0.05208333], [1465, 1688, 0.02690583], [1688, 1840, 0.00657895], [1840, 1937, 0.01030928], [1937, 2189, 0.00793651], [2189, 2968, 0.02439024], [2968, 3027, 0.15254237], [3027, 3557, 0.01132075], [3557, 4376, 0.02197802], [4376, 5064, 0.0130814], [5064, 5131, 0.1641791], [5131, 5779, 0.03858025], [5779, 6406, 0.02711324], [6406, 6805, 0.02005013]], "rps_doc_ml_palm_score": [[0, 6805, 0.83621061]], "rps_doc_ml_wikipedia_score": [[0, 6805, null]], "rps_doc_ml_wikiref_score": [[0, 6805, 0.47588569]], "rps_doc_books_importance": [[0, 6805, -60.66964168]], "rps_doc_openwebtext_importance": [[0, 6805, 132.32186719]], "rps_doc_wikipedia_importance": [[0, 6805, 116.59877104]], "rps_doc_num_sentences": [[0, 6805, 53.0]]} |
Read & Execute The Bible
June 17, 2019 @ 4:45 am - 6:00 am
This is a time when you can come and learn how to read the Bible and understand its relevance by observing and then doing. This event is streamed LIVE on our Facebook page.
No Registration Required. | 2023-14/0000/en_head.json.gz/3615 | {"url": "https://crazy8ministries.com/event/read-execute-the-bible-2019-06-17/", "partition": "head_middle", "language": "en", "source_domain": "crazy8ministries.com", "date_download": "2023-03-20T10:40:19Z", "digest": "sha1:OVTAQMSX4XJZWDE5NTU2KTRX2EOPTZRY"} | {"ccnet_length": [[0, 285, 285.0]], "ccnet_original_length": [[0, 285, 1855.0]], "ccnet_nlines": [[0, 285, 5.0]], "ccnet_original_nlines": [[0, 285, 77.0]], "ccnet_language_score": [[0, 285, 0.93]], "ccnet_perplexity": [[0, 285, 252.9]], "ccnet_bucket": [[0, 285, 0.0]], "rps_doc_curly_bracket": [[0, 285, 0.0]], "rps_doc_ldnoobw_words": [[0, 285, 0.0]], "rps_doc_lorem_ipsum": [[0, 285, 0.0]], "rps_doc_stop_word_fraction": [[0, 285, 0.328125]], "rps_doc_ut1_blacklist": [[0, 285, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 285, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 285, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 285, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 285, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 285, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 285, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 285, 0.10909091]], "rps_doc_frac_chars_top_3gram": [[0, 285, 0.12727273]], "rps_doc_frac_chars_top_4gram": [[0, 285, 0.17272727]], "rps_doc_frac_all_caps_words": [[0, 285, 0.015625]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 285, 0.0]], "rps_doc_frac_no_alph_words": [[0, 285, 0.265625]], "rps_doc_frac_unique_words": [[0, 285, 0.76923077]], "rps_doc_mean_word_length": [[0, 285, 4.23076923]], "rps_doc_symbol_to_word_ratio": [[0, 285, 0.0]], "rps_doc_unigram_entropy": [[0, 285, 3.59107978]], "rps_doc_word_count": [[0, 285, 52.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 25, 0.0], [25, 59, 0.0], [59, 86, 0.0], [86, 260, 1.0], [260, 285, 1.0]], "rps_lines_javascript_counts": [[0, 25, 0.0], [25, 59, 0.0], [59, 86, 0.0], [86, 260, 0.0], [260, 285, 0.0]], "rps_lines_num_words": [[0, 25, 4.0], [25, 59, 7.0], [59, 86, 5.0], [86, 260, 33.0], [260, 285, 3.0]], "rps_lines_numerical_chars_fraction": [[0, 25, 0.0], [25, 59, 0.46153846], [59, 86, 0.0], [86, 260, 0.0], [260, 285, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 25, 0.0], [25, 59, 0.0], [59, 86, 0.0], [86, 260, 0.0], [260, 285, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 25, 0.16], [25, 59, 0.02941176], [59, 86, 0.14814815], [86, 260, 0.04597701], [260, 285, 0.12]], "rps_doc_ml_palm_score": [[0, 285, -1.001e-05]], "rps_doc_ml_wikipedia_score": [[0, 285, null]], "rps_doc_ml_wikiref_score": [[0, 285, -1.001e-05]], "rps_doc_books_importance": [[0, 285, -37.41836982]], "rps_doc_openwebtext_importance": [[0, 285, -16.2754089]], "rps_doc_wikipedia_importance": [[0, 285, -29.58022683]], "rps_doc_num_sentences": [[0, 285, 3.0]]} |
Crazy writers
Bible Psalms 23:1-6: Interpretation and Application
1. Bible Psalms 23:1-6: Interpretation and Application
The chapter not only urges one to meditate and be prayerful but also gives a truthful account of the Lord as the shepherd who gives strength, leads the way, etc. David, the author of this Psalm, is referring to God as his shepherd. The time frame in which this Psalm was written is difficult to say for certain, but it is thought to be either during the time of Absalom’s rebellion or during the Maccabean period. There are parallels between this Psalm and John 10:11, where Jesus refers to himself as the good shepherd. In conclusion, this Psalm is an excellent example of how someone can apply their faith to their everyday life and find comfort and strength in God.
1.2 David, the author of this Psalm, is referring to God as his shepherd
The first verse starts off with “The Lord is my shepherd.” In ancient times, shepherds were known to be very important people. They not only had to take care of their flock but also had to protect them from predators. The relationship between a shepherd and his flock was one that was built on trust and mutual respect. In the same way, David is saying that he trusts God to take care of him and protect him from harm.
The second verse talks about how God has led David “in green pastures.” This could be a literal or figurative reference to how God has provided for him. It could also be a reference to how God has given him spiritual refreshment. In either case, it is clear that David feels that God has been good to him.
The third verse talks about how God has “led me beside fresh waters.” Again, this could be a literal or figurative reference. The idea of fresh water usually represents cleansing or life-giving properties. So, it could be a reference to how God has given David new life. It could also be a reference to how God has forgiven him and given him a fresh start.
The fourth verse talks about how God has “restored my soul.” This could be a reference to how David felt after he was forgiven by God. It could also be a reference to how David felt after he had gone through a difficult time in his life and came out stronger because of it.
The fifth verse talks about how God has “led me in paths of righteousness.” This is likely a reference to how David feels that God has been guiding him in his life and helping him make good decisions.
The sixth verse talks about how even though David goes through difficult times, he knows that God will always be with him. This is a great example of how someone can have faith even when things are tough. It is also a great reminder that no matter what happens in our lives, we always have the option to turn to God for help.
1.3 The time frame in which this Psalm was written is difficult to say for certain, but it is thought to be either during the time of Absalom’s rebellion or during the Maccabean period
The time frame in which this Psalm was written is difficult to say for certain. However, there are two time frames that are most likely. The first is during the time of Absalom’s rebellion. This was a time when David’s son, Absalom, tried to take over his kingdom. David would have been feeling betrayed and abandoned by his own son during this time.
The second time frame that is most likely is during the Maccabean period. This was a time when the Jewish people were being persecuted by the Syrian king, Antiochus IV. Many Jews were killed during this time and David would have been feeling frustrated and helpless. Regardless of which time frame this Psalm was written in, it is clear that David was going through a tough time in his life.
1.4 There are parallels between this Psalm and John 10:11 where Jesus refers to himself as the good shepherd
There are parallels between this Psalm and John 10:11. In John 10:11, Jesus refers to himself as the good shepherd. This is significant because it shows that Jesus saw himself as fulfilling the role of the shepherd that David talked about in this Psalm. Additionally, it shows that Jesus saw himself as someone who would protect and care for his people just like a shepherd does for his flock.
1.5 In conclusion, this Psalm is an excellent example of how someone can apply their faith to their everyday life and find comfort and strength in God.
This Psalm is a great example of how someone can apply their faith to their everyday life. It shows how David was able to find comfort and strength in God even when he was going through tough times. Additionally, it shows how we can turn to God for help no matter what situation we are in.
What does the Bible say about Psalms 23: 1-6?
The Bible says that Psalms 23:1-6 is a hymn that celebrates God as the Good Shepherd who provides everything we need.
How can we interpret and apply these verses to our lives?
We can interpret and apply these verses to our lives by understanding that God is always with us, even in times of trouble, and He will provide for us and protect us.
What is the meaning of "The Lord is my shepherd, I shall not want" (Psalm 23:1)?
The meaning of "The Lord is my shepherd, I shall not want" (Psalm 23:1) is that we can trust God to take care of us and meet all our needs.
What do sheep represent in this passage, and what can we learn from them?
Sheep represent those who are helpless and vulnerable, and we can learn from them to trust in God's care and protection.
Why is it important for us to trust in God as our Shepherd?
It is important for us to trust in God as our Shepherd because He knows what we need better than we do, and He will always be there for us.
Paper details
Type: Essay
Subject: Literature on Religion
Child subject: Religion
Words: 830
Updated: May 23, 2020
Tags: Absalom, David, excellent, freshwater, God, green fields, Jesus, John 10:11, Maccabean period, Psalms 23, shepherd, spiritual refreshment
The Life and Work of Clive Staples Lewis
Walter Rauschenbusch and the Social Gospel
The Bible, Science, and Philosophy: A Comparative Analysis of Creation Views
The Dangers of Cults: How They Recruit, Indoctrinate, and Abuse Members
The Impact of Religious Teachings on the Abortion Debate
The Role of Religion in Politics: A Historical Overview
The Significance of Ramadan
Johnson's Argument for a New Understanding of God: Theological Implications for Women's Place in Society
Luther as an Apocalypse Angel
Buddhism: A Religion and Philosophical System
2023 https://crazywriters.com | Free Essay Samples | 2023-14/0000/en_head.json.gz/3616 | {"url": "https://crazywriters.com/essays/bible-psalms-231-6-interpretation-and-application/", "partition": "head_middle", "language": "en", "source_domain": "crazywriters.com", "date_download": "2023-03-20T09:22:50Z", "digest": "sha1:35M6UMYBGFMP67FDV6YXHMB724UQBOZR"} | {"ccnet_length": [[0, 6659, 6659.0]], "ccnet_original_length": [[0, 6659, 6889.0]], "ccnet_nlines": [[0, 6659, 49.0]], "ccnet_original_nlines": [[0, 6659, 59.0]], "ccnet_language_score": [[0, 6659, 0.98]], "ccnet_perplexity": [[0, 6659, 223.8]], "ccnet_bucket": [[0, 6659, 0.0]], "rps_doc_curly_bracket": [[0, 6659, 0.0]], "rps_doc_ldnoobw_words": [[0, 6659, 0.0]], "rps_doc_lorem_ipsum": [[0, 6659, 0.0]], "rps_doc_stop_word_fraction": [[0, 6659, 0.46971935]], "rps_doc_ut1_blacklist": [[0, 6659, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 6659, 0.22232628]], "rps_doc_frac_chars_dupe_5grams": [[0, 6659, 0.39970032]], "rps_doc_frac_chars_dupe_6grams": [[0, 6659, 0.37141787]], "rps_doc_frac_chars_dupe_7grams": [[0, 6659, 0.31616408]], "rps_doc_frac_chars_dupe_8grams": [[0, 6659, 0.30567522]], "rps_doc_frac_chars_dupe_9grams": [[0, 6659, 0.24292939]], "rps_doc_frac_chars_top_2gram": [[0, 6659, 0.02191422]], "rps_doc_frac_chars_top_3gram": [[0, 6659, 0.01348567]], "rps_doc_frac_chars_top_4gram": [[0, 6659, 0.01685709]], "rps_doc_frac_all_caps_words": [[0, 6659, 0.00443131]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 6659, 0.0]], "rps_doc_frac_no_alph_words": [[0, 6659, 0.14032496]], "rps_doc_frac_unique_words": [[0, 6659, 0.30690537]], "rps_doc_mean_word_length": [[0, 6659, 4.55157715]], "rps_doc_symbol_to_word_ratio": [[0, 6659, 0.0]], "rps_doc_unigram_entropy": [[0, 6659, 5.21036903]], "rps_doc_word_count": [[0, 6659, 1173.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 14, 0.0], [14, 132, 0.0], [132, 184, 0.0], [184, 239, 0.0], [239, 911, 1.0], [911, 985, 0.0], [985, 1404, 1.0], [1404, 1710, 1.0], [1710, 2067, 1.0], [2067, 2341, 1.0], [2341, 2542, 1.0], [2542, 2867, 1.0], [2867, 3052, 0.0], [3052, 3402, 1.0], [3402, 3794, 1.0], [3794, 3903, 0.0], [3903, 4297, 1.0], [4297, 4449, 1.0], [4449, 4739, 1.0], [4739, 4785, 1.0], [4785, 4903, 1.0], [4903, 4961, 1.0], [4961, 5128, 1.0], [5128, 5209, 1.0], [5209, 5349, 1.0], [5349, 5423, 1.0], [5423, 5544, 1.0], [5544, 5604, 1.0], [5604, 5744, 1.0], [5744, 5779, 0.0], [5779, 5833, 0.0], [5833, 5847, 0.0], [5847, 5857, 0.0], [5857, 5887, 0.0], [5887, 5909, 0.0], [5909, 5918, 0.0], [5918, 5938, 0.0], [5938, 6054, 0.0], [6054, 6095, 0.0], [6095, 6138, 0.0], [6138, 6215, 0.0], [6215, 6287, 0.0], [6287, 6344, 0.0], [6344, 6400, 0.0], [6400, 6428, 0.0], [6428, 6533, 0.0], [6533, 6563, 0.0], [6563, 6609, 0.0], [6609, 6659, 0.0]], "rps_lines_javascript_counts": [[0, 14, 0.0], [14, 132, 0.0], [132, 184, 0.0], [184, 239, 0.0], [239, 911, 0.0], [911, 985, 0.0], [985, 1404, 0.0], [1404, 1710, 0.0], [1710, 2067, 0.0], [2067, 2341, 0.0], [2341, 2542, 0.0], [2542, 2867, 0.0], [2867, 3052, 0.0], [3052, 3402, 0.0], [3402, 3794, 0.0], [3794, 3903, 0.0], [3903, 4297, 0.0], [4297, 4449, 0.0], [4449, 4739, 0.0], [4739, 4785, 0.0], [4785, 4903, 0.0], [4903, 4961, 0.0], [4961, 5128, 0.0], [5128, 5209, 0.0], [5209, 5349, 0.0], [5349, 5423, 0.0], [5423, 5544, 0.0], [5544, 5604, 0.0], [5604, 5744, 0.0], [5744, 5779, 0.0], [5779, 5833, 0.0], [5833, 5847, 0.0], [5847, 5857, 0.0], [5857, 5887, 0.0], [5887, 5909, 0.0], [5909, 5918, 0.0], [5918, 5938, 0.0], [5938, 6054, 0.0], [6054, 6095, 0.0], [6095, 6138, 0.0], 
[6138, 6215, 0.0], [6215, 6287, 0.0], [6287, 6344, 0.0], [6344, 6400, 0.0], [6400, 6428, 0.0], [6428, 6533, 0.0], [6533, 6563, 0.0], [6563, 6609, 0.0], [6609, 6659, 0.0]], "rps_lines_num_words": [[0, 14, 2.0], [14, 132, 14.0], [132, 184, 6.0], [184, 239, 7.0], [239, 911, 120.0], [911, 985, 15.0], [985, 1404, 79.0], [1404, 1710, 58.0], [1710, 2067, 65.0], [2067, 2341, 54.0], [2341, 2542, 38.0], [2542, 2867, 62.0], [2867, 3052, 34.0], [3052, 3402, 62.0], [3402, 3794, 70.0], [3794, 3903, 19.0], [3903, 4297, 69.0], [4297, 4449, 27.0], [4449, 4739, 56.0], [4739, 4785, 9.0], [4785, 4903, 21.0], [4903, 4961, 11.0], [4961, 5128, 32.0], [5128, 5209, 16.0], [5209, 5349, 30.0], [5349, 5423, 14.0], [5423, 5544, 21.0], [5544, 5604, 13.0], [5604, 5744, 30.0], [5744, 5779, 5.0], [5779, 5833, 7.0], [5833, 5847, 2.0], [5847, 5857, 1.0], [5857, 5887, 3.0], [5887, 5909, 2.0], [5909, 5918, 1.0], [5918, 5938, 3.0], [5938, 6054, 6.0], [6054, 6095, 8.0], [6095, 6138, 6.0], [6138, 6215, 11.0], [6215, 6287, 11.0], [6287, 6344, 9.0], [6344, 6400, 9.0], [6400, 6428, 4.0], [6428, 6533, 15.0], [6533, 6563, 5.0], [6563, 6609, 6.0], [6609, 6659, 5.0]], "rps_lines_numerical_chars_fraction": [[0, 14, 0.0], [14, 132, 0.03773585], [132, 184, 0.08333333], [184, 239, 0.1], [239, 911, 0.00606061], [911, 985, 0.02857143], [985, 1404, 0.0], [1404, 1710, 0.0], [1710, 2067, 0.0], [2067, 2341, 0.0], [2341, 2542, 0.0], [2542, 2867, 0.0], [2867, 3052, 0.01098901], [3052, 3402, 0.0], [3402, 3794, 0.0], [3794, 3903, 0.05660377], [3903, 4297, 0.02077922], [4297, 4449, 0.01351351], [4449, 4739, 0.0], [4739, 4785, 0.0952381], [4785, 4903, 0.03508772], [4903, 4961, 0.0], [4961, 5128, 0.0], [5128, 5209, 0.04109589], [5209, 5349, 0.02272727], [5349, 5423, 0.0], [5423, 5544, 0.0], [5544, 5604, 0.0], [5604, 5744, 0.0], [5744, 5779, 0.0], [5779, 5833, 0.0], [5833, 5847, 0.0], [5847, 5857, 0.0], [5857, 5887, 0.0], [5887, 5909, 0.0], [5909, 5918, 0.375], [5918, 5938, 0.33333333], [5938, 6054, 0.05263158], [6054, 6095, 0.0], [6095, 6138, 0.0], [6138, 6215, 0.0], [6215, 6287, 0.0], [6287, 6344, 0.0], [6344, 6400, 0.0], [6400, 6428, 0.0], [6428, 6533, 0.0], [6533, 6563, 0.0], [6563, 6609, 0.0], [6609, 6659, 0.09090909]], "rps_lines_start_with_bulletpoint": [[0, 14, 0.0], [14, 132, 0.0], [132, 184, 0.0], [184, 239, 0.0], [239, 911, 0.0], [911, 985, 0.0], [985, 1404, 0.0], [1404, 1710, 0.0], [1710, 2067, 0.0], [2067, 2341, 0.0], [2341, 2542, 0.0], [2542, 2867, 0.0], [2867, 3052, 0.0], [3052, 3402, 0.0], [3402, 3794, 0.0], [3794, 3903, 0.0], [3903, 4297, 0.0], [4297, 4449, 0.0], [4449, 4739, 0.0], [4739, 4785, 0.0], [4785, 4903, 0.0], [4903, 4961, 0.0], [4961, 5128, 0.0], [5128, 5209, 0.0], [5209, 5349, 0.0], [5349, 5423, 0.0], [5423, 5544, 0.0], [5544, 5604, 0.0], [5604, 5744, 0.0], [5744, 5779, 0.0], [5779, 5833, 0.0], [5833, 5847, 0.0], [5847, 5857, 0.0], [5857, 5887, 0.0], [5887, 5909, 0.0], [5909, 5918, 0.0], [5918, 5938, 0.0], [5938, 6054, 0.0], [6054, 6095, 0.0], [6095, 6138, 0.0], [6138, 6215, 0.0], [6215, 6287, 0.0], [6287, 6344, 0.0], [6344, 6400, 0.0], [6400, 6428, 0.0], [6428, 6533, 0.0], [6533, 6563, 0.0], [6563, 6609, 0.0], [6609, 6659, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 14, 0.07142857], [14, 132, 0.09322034], [132, 184, 0.07692308], [184, 239, 0.07272727], [239, 911, 0.02380952], [911, 985, 0.04054054], [985, 1404, 0.02147971], [1404, 1710, 0.03267974], [1710, 2067, 0.02521008], [2067, 2341, 0.02554745], [2341, 2542, 0.02487562], [2542, 2867, 0.01846154], [2867, 3052, 0.02162162], [3052, 3402, 0.02571429], [3402, 3794, 
0.03571429], [3794, 3903, 0.03669725], [3903, 4297, 0.03045685], [4297, 4449, 0.01973684], [4449, 4739, 0.02413793], [4739, 4785, 0.06521739], [4785, 4903, 0.05084746], [4903, 4961, 0.01724138], [4961, 5128, 0.01796407], [5128, 5209, 0.0617284], [5209, 5349, 0.04285714], [5349, 5423, 0.01351351], [5423, 5544, 0.01652893], [5544, 5604, 0.05], [5604, 5744, 0.03571429], [5744, 5779, 0.11428571], [5779, 5833, 0.11111111], [5833, 5847, 0.07142857], [5847, 5857, 0.2], [5857, 5887, 0.1], [5887, 5909, 0.09090909], [5909, 5918, 0.11111111], [5918, 5938, 0.1], [5938, 6054, 0.06034483], [6054, 6095, 0.14634146], [6095, 6138, 0.09302326], [6138, 6215, 0.11688312], [6215, 6287, 0.125], [6287, 6344, 0.10526316], [6344, 6400, 0.125], [6400, 6428, 0.10714286], [6428, 6533, 0.0952381], [6533, 6563, 0.1], [6563, 6609, 0.10869565], [6609, 6659, 0.06]], "rps_doc_ml_palm_score": [[0, 6659, 0.01618063]], "rps_doc_ml_wikipedia_score": [[0, 6659, null]], "rps_doc_ml_wikiref_score": [[0, 6659, 0.09776306]], "rps_doc_books_importance": [[0, 6659, -157.81659151]], "rps_doc_openwebtext_importance": [[0, 6659, 81.7044861]], "rps_doc_wikipedia_importance": [[0, 6659, -219.11521854]], "rps_doc_num_sentences": [[0, 6659, 60.0]]} |
Canada Research Chair on the Transformations of Scholarly Communication
Prof. Vincent Larivière
Student Supervisions
Jury Member
Administrative Responsibilities
Reading and Organizing committees
Service to the Community
Scientific and Professional Journal Articles
Peer-Reviewed Conference Proceedings
Peer-Reviewed Conference Proceedings 2009
Larivière, V., Macaluso, B., Archambault, É., Gingras, Y. (2009). Which scientific elite? On the concentration of funding, productivity and citations, in Larsen, B. and Leta, J. (eds) Proceedings of the 12th International Conference of the International Society for Scientometrics and Informetrics (ISSI), Rio de Janeiro: BIREME/PAHO/WHO and Federal University of Rio de Janeiro, pp. 576-586. [article] PDF
Archambault, É., Caruso, J., Côté, G., Larivière, V. (2009). Bibliometric analysis of leading countries in energy research, in Larsen, B. and Leta, J. (eds) Proceedings of the 12th International Conference of the International Society for Scientometrics and Informetrics (ISSI), Rio de Janeiro: BIREME/PAHO/WHO and Federal University of Rio de Janeiro, pp. 80-91. [article] PDF
Campbell, D., Picard-Aitken, M., Côté, G., Macaluso, B., Robitaille, J.P., Bastien, N., Laframboise, M.C., Lebeau, L.M., Mirabel, P., Larivière, V., Archambault, É. (2009). Bibliometrics as a performance measurement tool for research evaluation: The case of research funded by the National Cancer Institute of Canada, in Larsen, B. and Leta, J. (eds) Proceedings of the 12th International Conference of the International Society for Scientometrics and Informetrics (ISSI), Rio de Janeiro: BIREME/PAHO/WHO and Federal University of Rio de Janeiro, pp. 301-312. [article] PDF
Tunger, D. & Haustein, S. (2009). Bibliometric analysis of the Asia-Pacific research area: Issues and results. Paper presented at (996-997).
© 2016 Chaire de recherche du Canada sur les transformations de la communication savante• Credits | 2023-14/0000/en_head.json.gz/3617 | {"url": "https://crc.ebsi.umontreal.ca/en/publications/actes-de-congres-comite-de-lecture/?date=2009", "partition": "head_middle", "language": "en", "source_domain": "crc.ebsi.umontreal.ca", "date_download": "2023-03-20T10:06:17Z", "digest": "sha1:PIROPBSAIPNUMITMVDOV75WJQA4BFM7M"} | {"ccnet_length": [[0, 1940, 1940.0]], "ccnet_original_length": [[0, 1940, 4253.0]], "ccnet_nlines": [[0, 1940, 15.0]], "ccnet_original_nlines": [[0, 1940, 178.0]], "ccnet_language_score": [[0, 1940, 0.57]], "ccnet_perplexity": [[0, 1940, 171.6]], "ccnet_bucket": [[0, 1940, 0.0]], "rps_doc_curly_bracket": [[0, 1940, 0.0]], "rps_doc_ldnoobw_words": [[0, 1940, 0.0]], "rps_doc_lorem_ipsum": [[0, 1940, 0.0]], "rps_doc_stop_word_fraction": [[0, 1940, 0.13216958]], "rps_doc_ut1_blacklist": [[0, 1940, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 1940, 0.35992218]], "rps_doc_frac_chars_dupe_5grams": [[0, 1940, 0.35992218]], "rps_doc_frac_chars_dupe_6grams": [[0, 1940, 0.35992218]], "rps_doc_frac_chars_dupe_7grams": [[0, 1940, 0.35992218]], "rps_doc_frac_chars_dupe_8grams": [[0, 1940, 0.35992218]], "rps_doc_frac_chars_dupe_9grams": [[0, 1940, 0.35992218]], "rps_doc_frac_chars_top_2gram": [[0, 1940, 0.0226978]], "rps_doc_frac_chars_top_3gram": [[0, 1940, 0.04669261]], "rps_doc_frac_chars_top_4gram": [[0, 1940, 0.0233463]], "rps_doc_frac_all_caps_words": [[0, 1940, 0.11221945]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 1940, 0.0]], "rps_doc_frac_no_alph_words": [[0, 1940, 0.34413965]], "rps_doc_frac_unique_words": [[0, 1940, 0.47744361]], "rps_doc_mean_word_length": [[0, 1940, 5.79699248]], "rps_doc_symbol_to_word_ratio": [[0, 1940, 0.0]], "rps_doc_unigram_entropy": [[0, 1940, 4.48932928]], "rps_doc_word_count": [[0, 1940, 266.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 72, 0.0], [72, 96, 0.0], [96, 117, 0.0], [117, 129, 0.0], [129, 161, 0.0], [161, 195, 0.0], [195, 223, 0.0], [223, 268, 0.0], [268, 303, 0.0], [303, 343, 0.0], [343, 750, 0.0], [750, 1128, 0.0], [1128, 1702, 0.0], [1702, 1843, 1.0], [1843, 1940, 0.0]], "rps_lines_javascript_counts": [[0, 72, 0.0], [72, 96, 0.0], [96, 117, 0.0], [117, 129, 0.0], [129, 161, 0.0], [161, 195, 0.0], [195, 223, 0.0], [223, 268, 0.0], [268, 303, 0.0], [303, 343, 0.0], [343, 750, 0.0], [750, 1128, 0.0], [1128, 1702, 0.0], [1702, 1843, 0.0], [1843, 1940, 0.0]], "rps_lines_num_words": [[0, 72, 9.0], [72, 96, 3.0], [96, 117, 2.0], [117, 129, 2.0], [129, 161, 2.0], [161, 195, 4.0], [195, 223, 4.0], [223, 268, 5.0], [268, 303, 4.0], [303, 343, 5.0], [343, 750, 57.0], [750, 1128, 54.0], [1128, 1702, 81.0], [1702, 1843, 19.0], [1843, 1940, 15.0]], "rps_lines_numerical_chars_fraction": [[0, 72, 0.0], [72, 96, 0.0], [96, 117, 0.0], [117, 129, 0.0], [129, 161, 0.0], [161, 195, 0.0], [195, 223, 0.0], [223, 268, 0.0], [268, 303, 0.0], [303, 343, 0.1025641], [343, 750, 0.03217158], [750, 1128, 0.02873563], [1128, 1702, 0.02316602], [1702, 1843, 0.08064516], [1843, 1940, 0.04123711]], "rps_lines_start_with_bulletpoint": [[0, 72, 0.0], [72, 96, 0.0], [96, 117, 0.0], [117, 129, 0.0], [129, 161, 0.0], [161, 195, 0.0], [195, 223, 0.0], [223, 268, 0.0], [268, 303, 0.0], [303, 343, 0.0], [343, 750, 0.0], [750, 1128, 0.0], [1128, 1702, 0.0], [1702, 1843, 0.0], [1843, 1940, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 72, 0.08333333], [72, 96, 0.125], [96, 117, 0.0952381], [117, 129, 0.16666667], [129, 161, 0.0625], [161, 195, 0.05882353], 
[195, 223, 0.07142857], [223, 268, 0.08888889], [268, 303, 0.08571429], [303, 343, 0.075], [343, 750, 0.11547912], [750, 1128, 0.12169312], [1128, 1702, 0.12020906], [1702, 1843, 0.04964539], [1843, 1940, 0.03092784]], "rps_doc_ml_palm_score": [[0, 1940, 0.00026721]], "rps_doc_ml_wikipedia_score": [[0, 1940, null]], "rps_doc_ml_wikiref_score": [[0, 1940, 0.03064632]], "rps_doc_books_importance": [[0, 1940, -175.51943999]], "rps_doc_openwebtext_importance": [[0, 1940, -81.2744812]], "rps_doc_wikipedia_importance": [[0, 1940, 14.75154842]], "rps_doc_num_sentences": [[0, 1940, 46.0]]} |
Head of Portfolio Development, Munich Re
Ian works for Munich Re’s venture capital arm located in San Francisco. Munich Re Ventures (MRV) invests across stages in strategically relevant areas including InsurTech & FinTech, IoT, AI, Cyber, Digital Health, Mobility, and Life & Health. Ian’s role is to build out the overall Portfolio Development Platform (PD) and serve as the PD Lead for the Munich Re Fund.
Portfolio Development is tasked with maximizing the strategic value of the MRV platform, including: developing Strategic Investment Themes, structuring and negotiating partnerships, and overseeing effective execution with startups. In addition, PD supports MRV portfolio companies as they navigate the insurance industry by providing connections, strategic support, and partnership support.
Ian graduated summa cum laude from Drake University and received a Bachelor of Science of Business Administration in Quantitative Economics and a Bachelor of Arts in Mathematics with a minor in Computer Science. | 2023-14/0000/en_head.json.gz/3618 | {"url": "https://creativedestructionlab.com/fr/mentors/ian-sanders/", "partition": "head_middle", "language": "en", "source_domain": "creativedestructionlab.com", "date_download": "2023-03-20T08:43:39Z", "digest": "sha1:VO2ALXRMVOHSQGCSQQXEIBIHNW7OKY5G"} | {"ccnet_length": [[0, 1001, 1001.0]], "ccnet_original_length": [[0, 1001, 2219.0]], "ccnet_nlines": [[0, 1001, 4.0]], "ccnet_original_nlines": [[0, 1001, 104.0]], "ccnet_language_score": [[0, 1001, 0.93]], "ccnet_perplexity": [[0, 1001, 302.9]], "ccnet_bucket": [[0, 1001, 0.0]], "rps_doc_curly_bracket": [[0, 1001, 0.0]], "rps_doc_ldnoobw_words": [[0, 1001, 1.0]], "rps_doc_lorem_ipsum": [[0, 1001, 0.0]], "rps_doc_stop_word_fraction": [[0, 1001, 0.25287356]], "rps_doc_ut1_blacklist": [[0, 1001, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 1001, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 1001, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 1001, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 1001, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 1001, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 1001, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 1001, 0.07194245]], "rps_doc_frac_chars_top_3gram": [[0, 1001, 0.0263789]], "rps_doc_frac_chars_top_4gram": [[0, 1001, 0.0]], "rps_doc_frac_all_caps_words": [[0, 1001, 0.04597701]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 1001, 0.0]], "rps_doc_frac_no_alph_words": [[0, 1001, 0.15517241]], "rps_doc_frac_unique_words": [[0, 1001, 0.66433566]], "rps_doc_mean_word_length": [[0, 1001, 5.83216783]], "rps_doc_symbol_to_word_ratio": [[0, 1001, 0.0]], "rps_doc_unigram_entropy": [[0, 1001, 4.35489552]], "rps_doc_word_count": [[0, 1001, 143.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 41, 0.0], [41, 400, 1.0], [400, 790, 1.0], [790, 1001, 1.0]], "rps_lines_javascript_counts": [[0, 41, 0.0], [41, 400, 0.0], [400, 790, 0.0], [790, 1001, 0.0]], "rps_lines_num_words": [[0, 41, 6.0], [41, 400, 55.0], [400, 790, 49.0], [790, 1001, 33.0]], "rps_lines_numerical_chars_fraction": [[0, 41, 0.0], [41, 400, 0.0], [400, 790, 0.0], [790, 1001, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 41, 0.0], [41, 400, 0.0], [400, 790, 0.0], [790, 1001, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 41, 0.14634146], [41, 400, 0.10306407], [400, 790, 0.03589744], [790, 1001, 0.06635071]], "rps_doc_ml_palm_score": [[0, 1001, 0.03999454]], "rps_doc_ml_wikipedia_score": [[0, 1001, null]], "rps_doc_ml_wikiref_score": [[0, 1001, 0.04051787]], "rps_doc_books_importance": [[0, 1001, -76.68643382]], "rps_doc_openwebtext_importance": [[0, 1001, -12.2517939]], "rps_doc_wikipedia_importance": [[0, 1001, -6.68983025]], "rps_doc_num_sentences": [[0, 1001, 6.0]]} |
Wuxtry Records; top 10 sales
2096 N. Decatur Road, Decatur, 404-329-0020
By Unknown Author Tuesday March 10, 2009 04:00 AM EDT
1. Animal Collective, Merriweather Post Pavilion
2. Andrew Bird, Noble Beast
3. Van Morrison, Astral Weeks
4. Paul Chambers, Quintet
5. Modern Skirts, All of Us in Our Night
6. Bruce Springsteen, Working on a Dream
7. The Killers, Day & Age
8. Beastie Boys, Paul’s Boutique 20th Anniversary Edition
9. Deerhunter, Microcastle
10. Lily Allen, It’s Not Me, It’s You | 2023-14/0000/en_head.json.gz/3619 | {"url": "https://creativeloafing.com/content-197969-wuxtry-records-top-10-sales", "partition": "head_middle", "language": "en", "source_domain": "creativeloafing.com", "date_download": "2023-03-20T10:20:03Z", "digest": "sha1:XUGP2HXOIG6NYCTYSIEUASE4GIBEN7RY"} | {"ccnet_length": [[0, 490, 490.0]], "ccnet_original_length": [[0, 490, 3199.0]], "ccnet_nlines": [[0, 490, 13.0]], "ccnet_original_nlines": [[0, 490, 210.0]], "ccnet_language_score": [[0, 490, 0.53]], "ccnet_perplexity": [[0, 490, 334.8]], "ccnet_bucket": [[0, 490, 0.0]], "rps_doc_curly_bracket": [[0, 490, 0.0]], "rps_doc_ldnoobw_words": [[0, 490, 0.0]], "rps_doc_lorem_ipsum": [[0, 490, 0.0]], "rps_doc_stop_word_fraction": [[0, 490, 0.05833333]], "rps_doc_ut1_blacklist": [[0, 490, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 490, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 490, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 490, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 490, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 490, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 490, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 490, 0.0]], "rps_doc_frac_chars_top_3gram": [[0, 490, 0.0]], "rps_doc_frac_chars_top_4gram": [[0, 490, 0.0]], "rps_doc_frac_all_caps_words": [[0, 490, 0.025]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 490, 0.0]], "rps_doc_frac_no_alph_words": [[0, 490, 0.43333333]], "rps_doc_frac_unique_words": [[0, 490, 0.95061728]], "rps_doc_mean_word_length": [[0, 490, 4.67901235]], "rps_doc_symbol_to_word_ratio": [[0, 490, 0.0]], "rps_doc_unigram_entropy": [[0, 490, 4.31953032]], "rps_doc_word_count": [[0, 490, 81.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 29, 0.0], [29, 73, 0.0], [73, 127, 0.0], [127, 176, 0.0], [176, 204, 0.0], [204, 234, 0.0], [234, 260, 0.0], [260, 301, 0.0], [301, 342, 0.0], [342, 368, 0.0], [368, 426, 0.0], [426, 453, 0.0], [453, 490, 0.0]], "rps_lines_javascript_counts": [[0, 29, 0.0], [29, 73, 0.0], [73, 127, 0.0], [127, 176, 0.0], [176, 204, 0.0], [204, 234, 0.0], [234, 260, 0.0], [260, 301, 0.0], [301, 342, 0.0], [342, 368, 0.0], [368, 426, 0.0], [426, 453, 0.0], [453, 490, 0.0]], "rps_lines_num_words": [[0, 29, 5.0], [29, 73, 6.0], [73, 127, 10.0], [127, 176, 6.0], [176, 204, 5.0], [204, 234, 5.0], [234, 260, 4.0], [260, 301, 9.0], [301, 342, 7.0], [342, 368, 5.0], [368, 426, 8.0], [426, 453, 3.0], [453, 490, 8.0]], "rps_lines_numerical_chars_fraction": [[0, 29, 0.07407407], [29, 73, 0.36842105], [73, 127, 0.19607843], [127, 176, 0.02173913], [176, 204, 0.04], [204, 234, 0.03703704], [234, 260, 0.04347826], [260, 301, 0.02631579], [301, 342, 0.02631579], [342, 368, 0.04761905], [368, 426, 0.05454545], [426, 453, 0.04166667], [453, 490, 0.05882353]], "rps_lines_start_with_bulletpoint": [[0, 29, 0.0], [29, 73, 0.0], [73, 127, 0.0], [127, 176, 0.0], [176, 204, 0.0], [204, 234, 0.0], [234, 260, 0.0], [260, 301, 0.0], [301, 342, 0.0], [342, 368, 0.0], [368, 426, 0.0], [426, 453, 0.0], [453, 490, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 29, 0.06896552], [29, 73, 0.09090909], [73, 127, 0.18518519], [127, 176, 0.10204082], [176, 204, 0.14285714], [204, 234, 0.13333333], [234, 260, 0.11538462], [260, 301, 0.14634146], [301, 342, 0.09756098], [342, 368, 0.15384615], [368, 426, 0.10344828], [426, 453, 0.07407407], [453, 490, 0.18918919]], "rps_doc_ml_palm_score": [[0, 490, 3.46e-06]], "rps_doc_ml_wikipedia_score": [[0, 490, null]], "rps_doc_ml_wikiref_score": [[0, 490, 0.00304896]], 
"rps_doc_books_importance": [[0, 490, -77.33641825]], "rps_doc_openwebtext_importance": [[0, 490, -20.64692238]], "rps_doc_wikipedia_importance": [[0, 490, -40.95086886]], "rps_doc_num_sentences": [[0, 490, 12.0]]} |
Federal Policy Support that Leverages the Private Sector Will Accelerate Clean Energy
As published in The Morning Consult on February 19, 2019.
BY CHARLES HERNICK & LISA JACOBSON
From coast to coast, America’s economy kicked into high gear in 2018 — and the clean energy sector had a record year, too. The Business Council for Sustainable Energy’s annual “Sustainable Energy in America Factbook” documents strong fundamental trends in clean energy and the need for continued federal policy support that leverages private sector investment to accelerate emissions reductions.
Natural gas boomed. Domestic gas production reached record numbers; new natural gas-based energy capacity reached its highest levels in the last 15 years, and U.S. liquefied natural gas exports grew by an astonishing 135 percent year-over-year.
Renewable energy also surged, benefitting from continued price drops for renewable power and storage. Over 19 gigawatts of new renewable installations — mainly wind and solar — came online last year. Energy efficiency investments hit record levels, and states, including Pennsylvania, Virginia, Connecticut and Florida, strengthened their state building energy codes in 2018.
The factbook notes that the private sector and state governments have been increasing their role in support of national clean energy development. While numerous federal bills have been drafted by Republicans and Democrats in recent years, legislation that would send a strong market signal for investment in energy storage, hydropower, energy efficiency and innovation has stalled.
In contrast, states such as California, New York, New Jersey and Nevada are leading the charge for a clean energy future and increasing their commitment to funding clean energy in their respective states.
Perhaps most impressive is what’s happening in corporate America. In 2018, businesses doubled renewable energy procurements and joined new initiatives to demonstrate leadership in energy efficiency. For example, 156 corporations have now signed on to the RE100 Initiative, pledging to source 100 percent of their electricity from renewables.
Beyond these pledges, large corporate energy consumers signed contracts for 7.7 gigawatts of wind and solar power, more than doubling the previous record from 2015. And business leaders teamed up with regional utilities to develop new solar installations, further lowering the costs for clean energy for ratepayers.
But we can’t count our chickens before they hatch. For years, the rapid growth we have seen in energy efficiency, natural gas and renewable energy drove down U.S. greenhouse gas emissions while the economy grew. That’s why it was frustrating to see energy productivity stagnate and economy-wide U.S. greenhouse gas emissions increase by over 2 percent in 2018.
While the carbon intensity of the U.S. power sector continued to decline, it’s clear that achieving significant emission reductions in other sectors such as buildings, industrial and transportation may be more challenging without policy support.
Many states and businesses are already doing their part. Now, the federal government has the opportunity to engage in new ways.
The urgency of the challenges in infrastructure, resilience and emission reductions appear to be increasing the appetite for clean energy and climate change legislation on both sides of the aisle on Capitol Hill. The pressure is on to see if lawmakers can deliver what voters and consumers want: more clean energy.
With the right blend of federal, state and business leadership — combined with sustained consumer demand — we can see faster clean energy development, leading to more jobs, economic growth and reduced emissions across the U.S. economy. A clear-eyed look at the data is great motivation and cause for optimism.
Charles Hernick is the director of policy and advocacy at Citizens for Responsible Energy Solutions Forum, a 501(c)(3) nonpartisan, nonprofit organization committed to educating the public and influencing the national conversation about clean energy.
Lisa Jacobson is the president of the Business Council for Sustainable Energy, a 55-member trade association representing the energy efficiency, renewable energy and natural gas industries, and she is a member of the Department of Energy’s State Energy Efficiency Steering Committee, the United States Trade Representative’s Trade and Environment Policy Advisory Committee and Gas Technology Institute’s Public Interest Advisory Committee. | 2023-14/0000/en_head.json.gz/3620 | {"url": "https://cresforum.org/2019/02/federal-policy-support-that-leverages-the-private-sector-will-accelerate-clean-energy/", "partition": "head_middle", "language": "en", "source_domain": "cresforum.org", "date_download": "2023-03-20T09:25:16Z", "digest": "sha1:NEJRP4AVSDO2YB374HNB7NMPXKYPXYOQ"} | {"ccnet_length": [[0, 4594, 4594.0]], "ccnet_original_length": [[0, 4594, 4949.0]], "ccnet_nlines": [[0, 4594, 18.0]], "ccnet_original_nlines": [[0, 4594, 43.0]], "ccnet_language_score": [[0, 4594, 0.93]], "ccnet_perplexity": [[0, 4594, 201.7]], "ccnet_bucket": [[0, 4594, 0.0]], "rps_doc_curly_bracket": [[0, 4594, 0.0]], "rps_doc_ldnoobw_words": [[0, 4594, 0.0]], "rps_doc_lorem_ipsum": [[0, 4594, 0.0]], "rps_doc_stop_word_fraction": [[0, 4594, 0.30721393]], "rps_doc_ut1_blacklist": [[0, 4594, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 4594, 0.03506018]], "rps_doc_frac_chars_dupe_5grams": [[0, 4594, 0.06227106]], "rps_doc_frac_chars_dupe_6grams": [[0, 4594, 0.03506018]], "rps_doc_frac_chars_dupe_7grams": [[0, 4594, 0.03506018]], "rps_doc_frac_chars_dupe_8grams": [[0, 4594, 0.03506018]], "rps_doc_frac_chars_dupe_9grams": [[0, 4594, 0.03506018]], "rps_doc_frac_chars_top_2gram": [[0, 4594, 0.03453689]], "rps_doc_frac_chars_top_3gram": [[0, 4594, 0.0133438]], "rps_doc_frac_chars_top_4gram": [[0, 4594, 0.02040816]], "rps_doc_frac_all_caps_words": [[0, 4594, 0.02114428]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 4594, 0.0]], "rps_doc_frac_no_alph_words": [[0, 4594, 0.15174129]], "rps_doc_frac_unique_words": [[0, 4594, 0.48609078]], "rps_doc_mean_word_length": [[0, 4594, 5.59590044]], "rps_doc_symbol_to_word_ratio": [[0, 4594, 0.0]], "rps_doc_unigram_entropy": [[0, 4594, 5.24184716]], "rps_doc_word_count": [[0, 4594, 683.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 101, 0.0], [101, 187, 0.0], [187, 245, 1.0], [245, 280, 0.0], [280, 676, 1.0], [676, 921, 1.0], [921, 1297, 1.0], [1297, 1680, 1.0], [1680, 1885, 1.0], [1885, 2227, 1.0], [2227, 2543, 1.0], [2543, 2904, 1.0], [2904, 3150, 1.0], [3150, 3278, 1.0], [3278, 3593, 1.0], [3593, 3903, 1.0], [3903, 4155, 1.0], [4155, 4594, 1.0]], "rps_lines_javascript_counts": [[0, 101, 0.0], [101, 187, 0.0], [187, 245, 0.0], [245, 280, 0.0], [280, 676, 0.0], [676, 921, 0.0], [921, 1297, 0.0], [1297, 1680, 0.0], [1680, 1885, 0.0], [1885, 2227, 0.0], [2227, 2543, 0.0], [2543, 2904, 0.0], [2904, 3150, 0.0], [3150, 3278, 0.0], [3278, 3593, 0.0], [3593, 3903, 0.0], [3903, 4155, 0.0], [4155, 4594, 0.0]], "rps_lines_num_words": [[0, 101, 14.0], [101, 187, 12.0], [187, 245, 10.0], [245, 280, 5.0], [280, 676, 59.0], [676, 921, 36.0], [921, 1297, 53.0], [1297, 1680, 56.0], [1680, 1885, 33.0], [1885, 2227, 48.0], [2227, 2543, 47.0], [2543, 2904, 58.0], [2904, 3150, 35.0], [3150, 3278, 21.0], [3278, 3593, 51.0], [3593, 3903, 50.0], [3903, 4155, 35.0], [4155, 4594, 60.0]], "rps_lines_numerical_chars_fraction": [[0, 101, 0.0], [101, 187, 0.0], [187, 245, 0.10909091], [245, 280, 0.0], [280, 676, 0.01023018], [676, 921, 0.0212766], [921, 1297, 0.01639344], [1297, 1680, 0.0], [1680, 1885, 
0.0], [1885, 2227, 0.03880597], [2227, 2543, 0.01941748], [2543, 2904, 0.01428571], [2904, 3150, 0.0], [3150, 3278, 0.0], [3278, 3593, 0.0], [3593, 3903, 0.0], [3903, 4155, 0.01639344], [4155, 4594, 0.00461894]], "rps_lines_start_with_bulletpoint": [[0, 101, 0.0], [101, 187, 0.0], [187, 245, 0.0], [245, 280, 0.0], [280, 676, 0.0], [676, 921, 0.0], [921, 1297, 0.0], [1297, 1680, 0.0], [1680, 1885, 0.0], [1885, 2227, 0.0], [2227, 2543, 0.0], [2543, 2904, 0.0], [2904, 3150, 0.0], [3150, 3278, 0.0], [3278, 3593, 0.0], [3593, 3903, 0.0], [3903, 4155, 0.0], [4155, 4594, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 101, 0.12871287], [101, 187, 0.11627907], [187, 245, 0.0862069], [245, 280, 0.8], [280, 676, 0.02777778], [676, 921, 0.01632653], [921, 1297, 0.01861702], [1297, 1680, 0.01044386], [1680, 1885, 0.03414634], [1885, 2227, 0.02046784], [2227, 2543, 0.00632911], [2543, 2904, 0.01939058], [2904, 3150, 0.01219512], [3150, 3278, 0.015625], [3278, 3593, 0.01269841], [3593, 3903, 0.01290323], [3903, 4155, 0.02777778], [4155, 4594, 0.06605923]], "rps_doc_ml_palm_score": [[0, 4594, 0.72805786]], "rps_doc_ml_wikipedia_score": [[0, 4594, null]], "rps_doc_ml_wikiref_score": [[0, 4594, 0.40747166]], "rps_doc_books_importance": [[0, 4594, -295.67737598]], "rps_doc_openwebtext_importance": [[0, 4594, 75.80949464]], "rps_doc_wikipedia_importance": [[0, 4594, -42.6244577]], "rps_doc_num_sentences": [[0, 4594, 39.0]]} |
People of all faiths gather at cathedrals, churches to pray for Ukraine
By Catholic News Service | Contributor
People in New York City gather for an intercessory prayer service for Ukraine at the Ukrainian Orthodox Cathedral of St. Volodymyr March 9, 2022. (Credit: CNS screen grab/Facebook, Archbishop Elpidophoros of America.)
Amid ongoing Russian attacks on Ukraine, Catholic and other faith leaders and parishioners across the United States have been gathering to pray for peace.
In New York City, religious leaders, politicians and diplomats gathered at the Ukrainian Orthodox Cathedral of St. Volodymyr for an intercessory prayer service March 9 to show their support for Ukraine.
Greek Orthodox Archbishop Elpidophoros of America acknowledged the horrors already visited on Ukraine in less than two weeks of war, but “during each of our worship services, we pray for the peace of the whole world.”
“Peace is more than the balance of power and the absence of war,” he said. “It is a state, a state by which God’s presence reveals itself. For Christ is our peace, writes St. Paul.”
Moreover, he counseled, “we are called not only to forgive our enemies but to love our enemies. We can take it one step further. Let us see no enemy in any of our sisters and brothers. Let the love of Christ break down all barriers.”
Cardinal Timothy M. Dolan of New York attended the service and noted: “God can bring good out of evil. We are seeing raw evil in Ukraine in its irrational invasion” by Russia, but “are we not also seeing radiant good, in that the world as united as never before?”
At a March 6 ecumenical prayer service at the Basilica of St. Mary in Minneapolis, Father Ivan Shkumbatyuk, a Ukrainian Catholic priest, said of Russia’s assault on Ukraine: “This is not a war of aggression.”
Instead, the attacks are the destruction of the Ukrainian nation, its history and identity, he said.
Shkumbatyuk, pastor of St. Constantine Ukrainian Catholic Church in Minneapolis, asked God to send wisdom and reason so that dialogue would prevail over power.
He also urged the congregation of about 200, including some wearing traditional Ukrainian attire, to let their voices be heard. “We cannot remain silent and do nothing.”
The prayer service was led by local faith leaders, including Archbishop Bernard A. Hebda of St. Paul and Minneapolis.
Those in attendance included Jeff Fink, a parishioner of Risen Savior Parish in Burnsville, Minnesota. He said he served in Vietnam, and if he were 30 years younger, he would try to find a way to get to Ukraine to help.
That’s not an option now, he said, noting that taking part in the prayer service seemed like the least he could do.
“I needed to come here and pray for their cause,” he told The Catholic Spirit, archdiocesan newspaper of St. Paul and Minneapolis.
Marzena Laugen, a parishioner of Holy Cross in Minneapolis who emigrated from Ukraine, said it is important to support the Ukrainian people through prayer, penance, almsgiving “and asking our Lord for mercy.”
Nearly 1,000 miles away in The Colony, Texas, Bishop Michael F. Olson of Fort Worth joined Father Pavlo Popov, pastor of St. Sophia Ukrainian Catholic Church for Sunday Mass March 6.
The small church overflowed with parish members and others who came to show their support for the Ukrainian community.
“This visit is an opportunity for us to pray together as God’s people and ask him to give us peace and an end to this terrible war,” the bishop said. He added that it was also a chance to “give solace — for many people here have relatives and friends who are still in Ukraine — (and) ask God to bless them and help them.”
Parishioners of St. Sophia have spent the days since the Russian invasion praying while battling anxiety and fear for their loved ones and their homeland.
Marta Petrash, a St. Sophia parishioner, grew up in Ukraine, where her mother, brother, and other relatives and friends still live. Each day she calls her mother, who is staying with relatives in a rural area, where she is fairly safe for now.
However, she has friends in major cities who are sheltering in a neighbor’s basement or in the underground subway stations to avoid shelling.
Petrash, who immigrated to the U.S. in 1991, thinks Russia’s leaders and its army have been surprised at the fierce fighting the invading forces have encountered. “Ukrainian citizens are fighting this war, not just the Ukrainian Army,” she told North Texas Catholic, Fort Worth’s diocesan news outlet.
She said in these troubled times the church community is particularly supportive because “these are hurting times for everyone.”
In his homily, Olson advised parishioners not to allow hatred for Russian President Vladimir Putin to enter their hearts.
“As we unite in prayer and response to assist the persecuted members of your families, the people of Ukraine, in their deliverance from oppression and war, let us ask God to deliver us from the temptation to hate,” he said.
Popov also cautioned against hatred, urging parishioners to do everything they can “not to allow that anger and hatred — that’s a poison you don’t want to let in. Once you let it in, it will grow there; it’s hard to get rid of it.”
“We try to chip in, with prayers, with financial support, whatever we can do to help, yet at the same time not to fill our hearts with despair, hatred, anger,” he added.
For Christians, “There is always hope. … There is light, there is life at the end,” he stated, noting that many of his parishioners are still in a state of shock and disbelief.
Also on March 6, the Pittsburgh-area Ukrainian community organized an interfaith prayer service to be held at St. Paul Cathedral.
“Many times in the past, our varied religious communities have come together to mark a significant time for prayer,” Pittsburgh Bishop David A. Zubik said in a statement.
“As we come together at this time, it is my hope that our gathering will be a source of strength and consolation to our Ukrainian brothers and sisters and be source of hope for us all that peace can make a dramatic return,” he said.
In Denver, the choir at the Cathedral Basilica of the Immaculate Conception posted a video March 2 of its performance of Ukraine’s spiritual anthem, conducted by Phil Bordelaeu, the cathedral’s director of sacred music. The song, which can be viewed here, is a prayer for peace and blessings in Ukraine.
Back in New York, Dolan blessed a Ukrainian Cross of Gratitude before Sunday Mass March 6 at St. Patrick’s Cathedral.
The cross, constructed in 2003 in Lviv, Ukraine, began its pilgrimage stop at the cathedral the day before and will remain there for two weeks.
Built in anticipation of the 2000th anniversary of the passion, death and resurrection of Christ in 2033, the cross has traveled to more than 46 countries.
Contributing to this story were Mark Pattison in Washington; Barb Umberger at The Catholic Spirit, newspaper of the Archdiocese of St. Paul and Minneapolis; and Susan Moses, associate editor of North Texas Catholic, the news outlet of the Diocese of Fort Worth.
Survey: Dramatic drop in volunteers in Dutch parishes after COVID
Anton de Wit - Katholiek Nieuwsblad
Austrian police: Possible threat to Vienna places of worship
Kirsten Grieshaber, Philipp Jenne
Polish Church defends John Paul sainthood after abuse claims
Spain’s ombudsman registers 445 church sex abuse complaints | 2023-14/0000/en_head.json.gz/3621 | {"url": "https://cruxnow.com/church-in-europe/2022/03/people-of-all-faiths-gather-at-cathedrals-churches-to-pray-for-ukraine", "partition": "head_middle", "language": "en", "source_domain": "cruxnow.com", "date_download": "2023-03-20T08:45:32Z", "digest": "sha1:PDI4QWDQGFDEIG53YJVDQFTUCMXMTITO"} | {"ccnet_length": [[0, 7380, 7380.0]], "ccnet_original_length": [[0, 7380, 9649.0]], "ccnet_nlines": [[0, 7380, 47.0]], "ccnet_original_nlines": [[0, 7380, 151.0]], "ccnet_language_score": [[0, 7380, 0.96]], "ccnet_perplexity": [[0, 7380, 261.9]], "ccnet_bucket": [[0, 7380, 0.0]], "rps_doc_curly_bracket": [[0, 7380, 0.0]], "rps_doc_ldnoobw_words": [[0, 7380, 1.0]], "rps_doc_lorem_ipsum": [[0, 7380, 0.0]], "rps_doc_stop_word_fraction": [[0, 7380, 0.39824798]], "rps_doc_ut1_blacklist": [[0, 7380, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 7380, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 7380, 0.04303968]], "rps_doc_frac_chars_dupe_6grams": [[0, 7380, 0.01479489]], "rps_doc_frac_chars_dupe_7grams": [[0, 7380, 0.01479489]], "rps_doc_frac_chars_dupe_8grams": [[0, 7380, 0.01479489]], "rps_doc_frac_chars_dupe_9grams": [[0, 7380, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 7380, 0.00756557]], "rps_doc_frac_chars_top_3gram": [[0, 7380, 0.00453934]], "rps_doc_frac_chars_top_4gram": [[0, 7380, 0.00554808]], "rps_doc_frac_all_caps_words": [[0, 7380, 0.00606469]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 7380, 0.0]], "rps_doc_frac_no_alph_words": [[0, 7380, 0.1597035]], "rps_doc_frac_unique_words": [[0, 7380, 0.44364508]], "rps_doc_mean_word_length": [[0, 7380, 4.75459632]], "rps_doc_symbol_to_word_ratio": [[0, 7380, 0.00067385]], "rps_doc_unigram_entropy": [[0, 7380, 5.62202253]], "rps_doc_word_count": [[0, 7380, 1251.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 72, 0.0], [72, 97, 0.0], [97, 110, 0.0], [110, 328, 0.0], [328, 483, 1.0], [483, 686, 1.0], [686, 904, 1.0], [904, 1086, 1.0], [1086, 1320, 1.0], [1320, 1584, 1.0], [1584, 1793, 1.0], [1793, 1894, 1.0], [1894, 2054, 1.0], [2054, 2224, 1.0], [2224, 2342, 1.0], [2342, 2562, 1.0], [2562, 2678, 1.0], [2678, 2809, 1.0], [2809, 3018, 1.0], [3018, 3201, 1.0], [3201, 3320, 1.0], [3320, 3643, 1.0], [3643, 3803, 1.0], [3803, 4047, 1.0], [4047, 4189, 1.0], [4189, 4491, 1.0], [4491, 4620, 1.0], [4620, 4742, 1.0], [4742, 4966, 1.0], [4966, 5198, 1.0], [5198, 5368, 1.0], [5368, 5545, 1.0], [5545, 5675, 1.0], [5675, 5846, 1.0], [5846, 6079, 1.0], [6079, 6383, 1.0], [6383, 6501, 1.0], [6501, 6645, 1.0], [6645, 6801, 1.0], [6801, 7063, 1.0], [7063, 7129, 0.0], [7129, 7165, 0.0], [7165, 7226, 0.0], [7226, 7246, 0.0], [7246, 7260, 0.0], [7260, 7321, 0.0], [7321, 7380, 0.0]], "rps_lines_javascript_counts": [[0, 72, 0.0], [72, 97, 0.0], [97, 110, 0.0], [110, 328, 0.0], [328, 483, 0.0], [483, 686, 0.0], [686, 904, 0.0], [904, 1086, 0.0], [1086, 1320, 0.0], [1320, 1584, 0.0], [1584, 1793, 0.0], [1793, 1894, 0.0], [1894, 2054, 0.0], [2054, 2224, 0.0], [2224, 2342, 0.0], [2342, 2562, 0.0], [2562, 2678, 0.0], [2678, 2809, 0.0], [2809, 3018, 0.0], [3018, 3201, 0.0], [3201, 3320, 0.0], [3320, 3643, 0.0], [3643, 3803, 0.0], [3803, 4047, 0.0], [4047, 4189, 0.0], [4189, 4491, 0.0], [4491, 4620, 0.0], [4620, 4742, 0.0], [4742, 4966, 0.0], [4966, 5198, 0.0], [5198, 5368, 0.0], [5368, 5545, 0.0], [5545, 5675, 0.0], [5675, 5846, 0.0], [5846, 6079, 0.0], [6079, 6383, 0.0], [6383, 6501, 0.0], [6501, 6645, 0.0], [6645, 6801, 0.0], [6801, 7063, 0.0], [7063, 7129, 
Wyoming bishop a perfect test case for Pope’s vows of accountability
By John L. Allen Jr., Editor
Pope Francis salutes pilgrims and faithful as he leaves St. Peter’s Square at the Vatican after his weekly general audience, Wednesday, Aug. 28, 2019. (Credit: AP Photo/Alessandra Tarantino.)
ROME – Ask the typical American Catholic in the pews, and most could probably tell you a fair bit about Theodore McCarrick, the ex-cardinal and now ex-priest whose fall from grace amid reports of decades-long sexual misconduct and abuse triggered a firestorm a year ago which, in many ways, is still raging.
By way of contrast, few rank-and-file churchgoers outside of Cheyenne, Wyoming, or Kansas City, Missouri, could probably pick Bishop Joseph Hart out of a lineup – and that relative obscurity is precisely what makes Hart such an ideal test case for Pope Francis’s avowed commitment to accountability, including for bishops. (Of course, it’s actually a test of the entire system, not just the pope, but he’s the one making the promises.)
As any expert in the moral life will tell you, the real test of integrity isn’t what you do when people are watching, but the choices you make when they’re not.
Although there had been rumors about McCarrick for decades, and confidential restrictions were imposed by Pope Benedict XVI in 2008 and largely ignored, the McCarrick scandal nevertheless was stunning in part because of the speed with which his final downfall occurred. He was suspended from ministry the moment the Archdiocese of New York announced that its review board had found an allegation credible, he resigned from the College of Cardinals within a month, and he was expelled from the priesthood just eight months after that.
It doesn’t require any special sociological genius to wonder if part of the reason the system went into overdrive on McCarrick was because intense public pressure demanded it. The entire world was watching, and nobody probably wanted to be perceived as the chaplain of the Old Boy’s Club in the middle of a PR Chernobyl.
The Hart case, however, has unfolded without much fanfare over more than a decade, despite the fact that the allegations involved appear far more serious. As one local priest put it, the charges against Hart and his clergy friends who were also reportedly involved in acts of abuse “make Ted McCarrick look like a saint.”
To be clear, those charges have yet to be tested in a court of law, either civil or ecclesiastical, and until that happens it’s impossible to assess the full truth. However, the pattern of accusations, settlements and further accusations would certainly suggest that such a process is warranted, and not in some vague eschatological future.
Crux’s Chris White painstakingly unpacked the entire Hart story in a three-part series that ran this week.
RELATED: Case of Bishop Hart shows role clericalism plays in abuse cover-up
Now 87, Hart faces at least a dozen accusations of the sexual abuse of minors, and police in Cheyenne have recommended that criminal charges be filed. If that happens, he would become the first U.S. bishop to face prosecution for the abuse of minors. The Diocese of Kansas City-St. Joseph, where Hart served earlier in his career, also has paid out nearly $20 million to more than 50 plaintiffs in cases that include Hart and a handful of other priests accused of repeated abuse.
The earliest charges against Hart date all the way back to 1989 and were reported at the time to both the U.S. bishops’ conference and the Vatican, but they weren’t revealed to the public until the initial eruption of the abuse scandal in the American Church in 2002. In June 2019, the Diocese of Cheyenne announced that Francis had imposed restrictions on Hart and authorized a penal process, but it’s not clear where that process stands.
The juxtaposition of rapid response in the McCarrick case and the languid pace with which things have developed with Hart is striking, all the more so given that two successive bishops in Cheyenne, Paul Etienne and now Steven Biegler, have pressed Rome to take action.
The Hart case also raises issues of accountability not just for the crime but for possible cover-up, since Bishop David Ricken, now of Green Bay who was Hart’s successor in Cheyenne from 2001 to 2008, has been accused of protecting Hart when a criminal investigation was launched in 2002.
Responding to the pressures created by the McCarrick scandal, Francis issued the document Vos estis lux mundi in early May outlining a new process for episcopal accountability, both for the crime and the cover-up, envisioning a key role for metropolitan archbishops and requiring all dioceses in the world to set up reporting mechanisms.
RELATED: Pope issues global standards for reporting, investigating clergy abuse
In his preamble, the pontiff said the idea was to take “concrete and effective actions” to ensure that “these phenomena, in all their forms, never happen again.” It’s difficult to imagine a better test case for whether Vos estis has any teeth, at least in the U.S., than Hart’s.
As the process plays out, the reactions of other U.S. bishops will be important.
Biegler, who took over in Cheyenne in 2017, has faced blowback from some fellow prelates who would have preferred that he handle the Hart case more quietly. Something similar could be said of Bishop James Johnston of Kansas City-St. Joseph, who’s been in regular communication with alleged Hart victims and who’s announced publicly that the diocese deems the charges to be “credible.”
It’s interesting to note that Biegler is 60 and Johnston 59, and that both men became bishops post-2002. Perhaps they represent a new cohort among the American bishops, a generation for whom the old rules of the clerical club – which, in this instance, probably would include allowing an 87-year-old cleric to die in peace, without the pressures of a canonical trial – no longer apply.
(Speaking of Kansas City, there’s also the case of Bishop Robert Finn, who was convicted in 2012 of failure to report an accusation of abuse. Despite resigning, he still takes part in U.S. bishops’ meetings.)
Of course, neither Biegler nor Johnston ultimately will decide how the Hart saga ends. That script will be written elsewhere, between the U.S. and Rome – and perhaps this time, it won’t be a hasty denouement that speaks less to the system than the pressure it happens to be under at a given moment.
Alcoholics Anonymous (French Hardcover)
Known as the “Big Book,” the basic text of Alcoholics Anonymous has helped millions of people recover from alcoholism since the first edition appeared in 1939. Chapters describing the A.A. recovery program — the original Twelve Steps — and the personal histories of A.A.’s co-founders remain unchanged since the original, while new stories have been added to the personal histories with each edition.
Fourth Edition (2001).
Difference between revisions of "PRE2015 3 Groep4"
Revision as of 10:57, 22 April 2016
S141268
== Final product ==
[[File:ThePrototype.png|thumb|400px|A visualization of the prototype for the Smart Bedroom.]]
=== Components ===
Our final product for this course is more of a prototype of the Smart Bedroom. The various components of the prototype are visualized in the picture to the right. As one can see, the system consists of an Arduino unit connected to a laptop. The laptop fulfills the tasks of central computer and user interface, as well as sound output through its speaker. The Arduino contains a microphone for recording the user's breath and an LED for the light output. The blinds and the thermostat as described in the dream scenario are off the table, as we didn't have the time and resources to implement them properly. Furthermore, we quickly decided not to use a smartphone, because none of us has enough experience with creating apps for the difficult task at hand. Nevertheless, this prototype is a great representation of what the Smart Bedroom could potentially be.
* [[Output]]
* [[Conclusion]]
* [[Literature]]
* [[Logbook]]
Revision as of 11:02, 22 April 2016
This is the wiki page of Group 4 of USE Project: Robots Everywhere.
For this course, we've made a prototype of a "Smart Bedroom" using an Arduino unit connected to a laptop. Our idea of a Smart Bedroom is a bedroom filled with electrical modules that are controlled by a central computer, which automatically commands them to simulate the sunrise and morning sounds in order to wake up the user in the most natural way possible. So the Smart Bedroom basically replaces your alarm clock and makes sure you're well rested and cheerful in the morning. The sunrise is simulated by a combination of a light sensor mounted on the outside wall of the bedroom and automatically controlled blinds and lights. Another module of the Smart Bedroom is a thermostat that gradually heats up the room when it's time to wake up the user. Morning sounds are provided by your smartphone, which also acts as a user interface for the entire system. Furthermore, the smartphone records the sounds the user makes when sleeping, most importantly his/her breathing, in order to determine how deep he/she sleeps, which is necessary to find out when it's a good time to wake him/her up. Through research, we found that waking up during light sleep is much more comfortable than during deep or REM sleep. Setting the time period during which you want to wake up is also done via the smartphone.
For our prototype, a few concepts had to be simplified or even scrapped entirely. Naturally, the thermostat and blinds were off the table, which also eliminates the need for a light sensor. None of us has experience in making apps, so we decided not to use a smartphone as well. Instead, the laptop acts as a kind of user interface. The prototype consists of three scripts written in Matlab, aided by an Arduino unit. The Arduino contains a microphone that records the user's breathing sounds and an LED that produces light. The laptop translates the sound measured by the Arduino into a graph showing the sleep deepness and uses that to decide when to start the wake-up procedure, which consists of gradually turning on the LED and producing morning sounds with the laptop's speaker. When the user has woken up, which he/she indicates by pressing any button on the keyboard, the opportunity is given to provide feedback about the sound volume and light brightness, which is taken into account the following nights.
1 Group Members
2 The Smart Bedroom
2.1 Inspiration
2.2 Walkthrough
2.3 Speculation
3 Scenario
4 Use aspects
6 User needs
6.1 Primary users
6.2 Secondary users
6.3 Tertiary users
7 Final product
7.1 Components
8 User-friendliness
0888465 - Thomas Bardoel
0942166 - Wesley van den Broek
0887435 - Jeroen Ermers
0888864 - Luuk Ladegaard
0887879 - Roel Montree
0835870 - Jeroen Verbakel
The Smart Bedroom
A visualization of our initial idea of a Smart Bedroom.
We started off this project with a dream: a high-tech automated bedroom that wakes up the user in the most natural way possible, but still in time for him/her to go to school for example. We were heavily inspired by the smartphone app called Sleep Cycle[1]. This app uses your smartphone, which would be put beside your bed, to record the sounds you make while sleeping. This involves breathing as well as movement. The data is then used to calculate how deep you sleep. The app wakes you up within a chosen time period during light sleep, which is more comfortable than being woken up during deep or REM sleep. To wake you up, the smartphone plays a selected tune which sounds very relaxed and natural, like birds singing in the morning for example. When you've woken up, the phone provides information about how you've slept, most notably a graph showing the deepness of your sleep through time. This information can be compared with that of previous nights and of other people thanks to the internet.
The picture to the right visualizes the Smart Bedroom as we imagined it. The bedroom is controlled by a central computer which can be wirelessly communicated with using your smartphone. When going to bed, you use your phone to set the time period in which you desire to wake up. It will then start to record the sounds you make, just like with the Sleep Cycle app. The data is sent to the central computer, which translates it into a sleep graph, which is used to determine if it's the right time to start the wake-up procedure.

This procedure, which we call the 'smart alarm', involves a number of things. First of all, the smartphone starts to play a tune similar to the ones of the Sleep Cycle app, which gradually increases in volume so as not to startle the user. Secondly, the central computer commands the thermostat to raise the temperature of the room, as it's more pleasant to wake up to a higher temperature. This could also begin earlier, at the beginning of the wake-up time period for example, depending on how long it takes to heat up the room and how much influence it has on the sleeping user. Finally, the Smart Bedroom controls the brightness of the room in order to simulate a sunrise. If it's possible, this is achieved by slowly opening the blinds, as sunlight is a more natural and free source of light. But in case it's too dark outside, which is determined by a light sensor mounted on the outside wall of the bedroom, a lamp is used instead. The blinds and the lamp could also work in conjunction, with both modules contributing to the brightness of the room. The factors that contribute to waking up the user are indicated by red symbols in the picture to the right.

If the system can't find a suitable moment to wake up the user, or the user hasn't woken up yet by the end of the wake-up time interval, the lights and blinds open up completely and the smartphone starts to produce an alarm sound similar to that of a generic digital alarm clock. So basically, the smart alarm is replaced with what we call the 'standard alarm' in order to wake up the user at all costs. After the user has woken up and stopped the alarm by pressing a button on his/her phone, he/she can provide feedback about the sound and light level. The feedback could for example be given as a selection between "The sound was too soft", "The sound had the right volume" and "The sound was too loud". This information is then taken into account during the following nights to ensure that the Smart Bedroom fits the needs of its user.
These were the most important points of our Smart Bedroom, but it has the potential to perform a wider variety of tasks. Things that came to mind were: the ability to display information about the user's sleep and share this over the internet, just like the Sleep Cycle app does; not only helping the user with waking up, but also with falling and staying asleep using the various output modules; the ability to give advice, like telling the user it's a good time to go to bed based on gathered data and so on. Furthermore, we thought about for whom this system would be most beneficial. The most obvious beneficiaries are the people who need to go to work or students who need to go to school early in the morning and suffer from sleepiness. Another group of beneficiaries could be people with insomnia, as the Smart Bedroom might have the possibility to help them fall and stay asleep. We also contemplated whether the system would still work when there is more than one person in the bedroom. According to the Sleep Cycle app, it would actually work as long as the smartphone is placed significantly closer to its user than anyone else, so the Smart Bedroom should still be able to function properly. However, it would only be able to monitor one user at a time, and it's unavoidable that all people in the room will wake up when the smart alarm jumps into action, so using the system on multiple people is not recommended. Hence, for the remainder of this project, we will only focus on the situation in which there is only one person present in the Smart Bedroom.
Here's a short story to illustrate our idea:
John is a student at Eindhoven University of Technology. He has trouble waking up in the morning, so he decided to invest in a Smart Bedroom. When John goes to sleep, he sets the wake-up time to anywhere between 7am and 7:30am on his phone. Now it is time to sleep. His phone checks his breath and movements to see in what phase of the sleep cycle he is. Around 7am, the Smart Bedroom notices that John has almost completed a sleep cycle. Since John will have completed this sleep cycle during the wake-up interval, the Smart Bedroom lets him sleep a little longer. When John is close to finishing his sleep cycle, the Smart Bedroom instructs the windows to let more sunlight pass through and also raises the temperature. When getting closer to finishing the sleep cycle, more and more sunlight will pass through the window and the temperature will keep on rising. This allows a much more natural wake-up than usual. The sleep cycle has ended, but the Smart Bedroom notices that John is still asleep even though all sunlight is going through the window and the temperature is raised to the maximum level allowed, so it decides to play a relaxing sound which builds up in volume to force a wake-up. Finally, John wakes up at 7:20 am and he is not as sleepy as usual. He is able to get out of bed without any problems and is ready for his day at the university. John also rates his sleeping experience on his phone and the alarm will use this information for the next night.
Use aspects
The Smart Bedroom is a dream for many people. Here, the USE-aspects of the Smart Bedroom will be discussed.
The user will benefit from the smart bedroom because this integrates his blinds and lighting into his smartphone. This is the first step towards home automation. By crossing the barrier of combining your living space with your mobile devices, it becomes very easy to expand on this.
The Smart Bedroom helps to battle tiredness, and therefore also stress, which can also be considered a benefit for the society. Besides tackling tiredness in general, the Smart Bedroom is great for people with problems getting asleep.
Due to the integrated lighting and blinds in the bedroom, it becomes possible to sleep during the day. This possibly opens up all kinds of new lifestyles, such as working in multiple shifts during 24 hours and decreasing the morning and evening commute by spreading the starting hours throughout the day.
A good night's rest greatly improves productivity, which is good for both society and enterprise. Through the Smart Bedroom, we can open a whole new market in home automation, enabling many new jobs, startup companies and other benefits to the enterprise. If you once again consider working night shifts, you'll see this is beneficial to enterprise as well, in decreasing the size of offices, as not everyone has to work there at once.
Primary users:
The person who sleeps in the smart bedroom
Secondary users:
Other users sleeping in the smart bedroom
People living with the primary user
Tertiary users:
Scientists (data collection on sleep)
Employers (more productive personnel)
Engineers (that install and service the smart bedroom)
Medical personnel (for people that need monitoring when sleeping)
User needs
Listed below are the needs of the primary, secondary and tertiary users. The needs colored in blue are the ones that have been fulfilled with the prototype, although only partially in some cases.
Primary users
The user should be able to set the time. The option to synchronize the time with the time according to your smartphone would be very helpful.
The user should be able to see what time it is in the dark. The user could simply open his/her phone, or there could be an option to continuously display the time. This would, however, drain its power fairly quickly, so it's not recommended.
The alarm clock should not have a snooze option. Being able to go back to sleep after the alarm has woken you up is proven to be bad for your health[2].
The alarm should have a volume control. This could simply be the phone’s volume control.
The user should have the possibility to choose between three options: no alarm, a standard alarm which wakes you up at a specific time and the smart alarm.
In the case of the standard alarm, the user should be able to choose when he/she wants to be woken up and what sound the alarm should make: a repeating pervasive tone, the radio or perhaps a song which is stored on the phone.
In the case of the smart alarm, the user should be able to indicate in which time frame he/she wants to wake up. If the system can’t find a suitable moment to wake up the user, the standard alarm should go off at the end of this time frame.
Perhaps the user should be able to indicate when it’s a suitable moment to be woken up. However the user probably doesn’t know a lot about sleep patterns and such, so it’s not a necessary option.
The user should be able to choose what the standard and smart alarm control, like the lights, the blinds and the temperature of the room. These could also have different settings depending on the user’s preferences.
The system should be able to deliver understandable information to the user about how he/she slept that night, the nights before and in average. The user should be able to decide whether or not he/she wants the information to be (anonymously) shared on the internet for statistical purposes and/or be available for his/her doctor.
The system shouldn’t badly influence the user’s sleep in any way.
The system should be adaptable to a changing bedroom layout, which it probably is because most is done by the user’s smartphone, which is of course portable.
Secondary users
Because the smart alarm wakes its user up based on the sounds the primary user makes while sleeping, it might not work when there are more people sleeping in the same room. In addition to that, in most cases there is no moment at which it's ideal for everyone to wake up, so using the smart alarm would be useless. The users could still use the standard alarm, of course.
These people do not want to be involuntarily woken up by the alarm. For this reason, the sound the alarm makes should not be too loud to wake up the people living with the primary user. It should be noted here, that the alarm should still have the capacity to go very loud, as some people might require a louder alarm to wake up.
Tertiary users
Scientists who collect the data on how people sleep
The scientists should have a direct link to the database of shared information about people’s sleep.
The information needs to be complete (so as much information as possible), structured and easily exportable to other statistical programs.
Employers of the users of the smart bedroom
The employers benefit more from active and well-rested employees.
Engineers that install and service the smart bedroom
The system should use mainstream components like normal screws, standard European cable inlets and outlets etc.
When installing the system, the components should arrive in an orderly fashion, so similar parts are grouped together in a plastic bag for example, and a detailed instruction manual should be included. If necessary, the system should be delivered with a few tools which are specifically designed for installing it.
The system should come with a detailed troubleshooting manual which suggests what could be wrong and how it could be fixed if there’s a problem with it.
If necessary, the company behind the production of these smart bedrooms should hire and train personnel who are specifically employed to deal with this system.
Medical personnel for people who need monitoring when sleeping
The medical personnel should have a direct link to the information of their patients.
The information needs to be complete, structured and easily exportable to other statistical programs.
In case the medical personnel has no experience with handling data, the information needs to be presented in a way which is easy to understand, with graphs for example. The presentation should still be able to show important aspects which are vital for the personnel’s diagnosis of the patient’s sleep patterns.
A visualization of the prototype for the Smart Bedroom.
Our final product consists of several components, each with its own function. During the night, the data is gathered with an Arduino and a sound sensor. Gobetwino logs the real-time data to a .txt file on the computer. Matlab, a calculation program on the computer, processes this logged data in real time. Finally, Matlab determines the best wake-up moment. At that moment, Matlab starts the alarm, which builds up in volume. The computer is used to play the sound. Matlab also turns on an LED, which builds up in brightness as well. When the user is awake, the Smart Bedroom asks for the user's experience, and uses this feedback during the following nights.
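To make the processing step concrete, here is a minimal sketch of the idea, assuming a hypothetical log file name, one sound-level sample per second and an arbitrary light-sleep threshold. It is an illustration only, not the actual scripts from this project:

% Sketch: decide the wake-up moment from the Gobetwino log (assumptions marked).
logFile   = 'sounddata.txt';      % hypothetical name of the .txt file Gobetwino writes
wakeStart = 7/24;                 % 07:00, as a fraction of the day
wakeEnd   = 7.5/24;               % 07:30, the end of the wake-up interval
threshold = 0.3;                  % assumed sound level that marks light sleep

wokenUp = false;
while ~wokenUp
    data = load(logFile);             % one sound-level value per line
    win  = min(300, numel(data));     % moving average over the last ~5 minutes
    activity = mean(data(end-win+1:end));
    t = rem(now, 1);                  % current time as a fraction of the day
    if (t >= wakeStart && activity >= threshold) || t >= wakeEnd
        wokenUp = true;               % light sleep inside the interval, or interval over
    else
        pause(10);                    % wait ten seconds before checking again
    end
end
% at this point the output script would start the alarm sound and the LED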
The sound behaves as in our ideal situation. Before the final wake-up time (the end of the wake-up interval), a relaxing sound of singing birds is played, and at the final wake-up time a more insistent standard alarm sounds to be sure the user will wake up. However, the lights aren't like in our dream scenario. In the dream scenario, the Smart Bedroom controls the whole room: the smart alarm can open the blinds further and further if it's light enough outside, and otherwise it can turn up the lights of the room more and more. But our final product can't control the blinds or lights in a room. Our final product mimics these actions by using an LED. This LED glows brighter and brighter, just like the brightness in the room would in the dream scenario.
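As an illustration of this gradual build-up, the sketch below ramps the sound volume and the LED brightness together. The COM port, the baud rate and the convention that the Arduino interprets each received byte as a PWM brightness value are assumptions made for the example:

% Sketch: gradual sound and light build-up (port and protocol are assumptions).
[y, Fs] = audioread('Birds-singing-in-the-morning.mp3');  % the birdsong from the appendix
port = serial('COM3', 'BaudRate', 9600);                  % same port as in the Arduino code
fopen(port);

steps = 20;                                   % ramp up in 20 stages
for k = 1:steps
    gain = k/steps;                           % volume fraction, 0..1
    fwrite(port, round(255*gain), 'uint8');   % LED brightness follows the volume
    snippet = y(1:min(5*Fs, end), :);         % roughly 5 seconds of birdsong
    sound(snippet*gain, Fs);                  % play it at the current volume
    pause(5);                                 % wait for the snippet to finish
end
fclose(port);                                 % afterwards the standard alarm could take over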
User-friendliness
Our final product isn't a completely finished product that is ready for consumer use. It's a prototype which isn't really user-friendly yet: the user has to perform many actions before the Smart Bedroom is ready. The user currently has to take the following steps:
Make a .txt file.
Make (or change) a command in Gobetwino to log the gathered data to the created .txt file.
Fill in the right path and filename of the .txt file in Matlab.
Set the alarm and wake-up interval to the right time, using the notation required by Matlab.
Connect the equipment in the right manner (the same port as defined in the Matlab script).
Upload and start the Arduino code (while Gobetwino is closed).
Open Gobetwino (the logging of data will start now).
Run the script 'runscript.m' in Matlab.
So this is far too many actions for a consumer. But the number of actions isn't the only reason our product isn't ready for consumer use. The interface is also very primitive: for example, you currently have to fill in the wake-up time and interval directly in the script. The sketch below shows how these steps could be folded into a single start-up script.
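A minimal sketch of such a consolidated start-up script is given here; the file path, the port and the signature of 'runscript' are assumptions for illustration, not part of the delivered code:

% Hypothetical 'startbedroom.m' that folds the manual steps into one script.
logFile = 'C:\smartbedroom\sounddata.txt';    % steps 1 and 3: path of the log file
if ~exist(logFile, 'file')
    fid = fopen(logFile, 'w');                % step 1: create the empty .txt file
    fclose(fid);
end
wakeStart = '07:00';                          % step 4: start of the wake-up interval
wakeEnd   = '07:30';                          % step 4: end of the wake-up interval
arduinoPort = 'COM3';                         % step 5: one port setting used everywhere
% Steps 2, 6 and 7 (the Gobetwino command, uploading the Arduino code and
% opening Gobetwino) would still be manual; a consumer version would hide
% them behind an installer.
runscript(logFile, wakeStart, wakeEnd, arduinoPort);   % step 8, with an assumed signature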
Below are several links, each with its own function. The first link, 'System input and output', gives an overview of our dream and the product we actually made. The link 'Measurement plan and experiments' contains the measurements; the results of the experiments we did, the trade-offs we made and the justification of the measuring method can also be found there. The next three links, 'input', 'algorithm' and 'output', describe the whole code: 'input' describes the Arduino code for sampling the sound, 'algorithm' explains the Matlab scripts that ultimately determine the ideal moment of waking up, and 'output' describes the working of the output script and the trade-offs made. Finally, the link 'conclusion' gives not only the conclusion but also the further improvements and the reflection.
The link 'literature' contains background information about some specific topics, for example the sleep cycle in general, the ideal moment of waking up, and the influence of light, temperature and sound during your sleep. This information was used when making decisions during the construction of the algorithm.
System input and output
Measurement plan and experiments
This appendix contains a zip file with all scripts needed to use the Smart Bedroom yourself (if you have the right Arduino, sound sensor and logging program). The zip file contains all scripts, including the feedback mechanism. The only thing missing is the audio file of singing birds that wakes up the user in the morning; this file was too big to upload to the wiki page. So if you want to run the complete script including the wake-up sound, you have to put a file of singing birds in the same folder as the other scripts and give that file the following name: Birds-singing-in-the-morning.mp3
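For readers who want to see the shape of the feedback mechanism before downloading the zip file, here is a minimal sketch of the idea with a hypothetical preferences file; the zip contains the real implementation:

% Sketch: adjust the stored sound and light levels from the user's feedback.
prefFile = 'preferences.mat';                 % hypothetical storage for the levels
if exist(prefFile, 'file')
    load(prefFile, 'volume', 'brightness');   % levels learned on earlier nights
else
    volume = 0.5; brightness = 0.5;           % defaults for the first night
end
a = input('Sound: (1) too soft (2) right (3) too loud? ');
volume = min(1, max(0.1, volume + 0.1*(a == 1) - 0.1*(a == 3)));
a = input('Light: (1) too dim (2) right (3) too bright? ');
brightness = min(1, max(0.1, brightness + 0.1*(a == 1) - 0.1*(a == 3)));
save(prefFile, 'volume', 'brightness');       % used again the next night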
File:Code Definitive V4.zip
↑ Sleep Cycle home page, http://www.sleepcycle.com/
↑ "Lecturing: Snoozing and Losing", S. Browning and K. Divoll, 2010, http://digitalcommons.georgiasouthern.edu/sotlcommons/SoTL/2010/81/
Advantages of Cyprus
Leading Criminal Defense Lawyer Team
Peter Abrams
Criminal Defence, Federal Attorney, DUI Defence, Expungements
Fargo Bank, 355 S Grand Ave, Los Angeles, CA 90071
Profile on CourtVictories.com
Martin Williams was born and raised in Fresno, California. He earned his bachelor’s degree from California State University, Fresno, in 1980. He obtained his Juris Doctorate degree from San Joaquin College of Law and was admitted to practice law in 1987. While at San Joaquin College of Law, he was the editor of the law school newspaper, The Dicta. In 2011, Mr. Paboojian was inducted into the San Joaquin College of Law Hall of Fame because of his outstanding trial advocacy.
In 2017, Mr. Williams was selected as Cal-Abota trial lawyer of the year. In 2008, the Consumer Attorneys of California (CAOC) selected Mr. Williams as their trial lawyer of the year. Mr. Williams practices in the area of personal injury, which primarily encompasses catastrophic personal injury, wrongful death, medical malpractice, and wrongful termination.
He has obtained over $100,000,000.00 in verdicts and settlements on behalf of his clients. In 2007, he obtained the largest employment verdict against a public entity in the United States for $19.1 Million. The following cases are some of the verdicts and settlements that Mr. Paboojian has obtained on behalf of his clients:
$ 19.1 Million – Stacy Johnson–Klein v. California State University, Fresno – Employment Case
$ 10 Million – John Doe v. Jane Doe Corporation – Auto Accident/Personal Injury
$ 9 Million – Jane Doe & John Doe v. Concrete Pumping Truck Company – Wrongful Death/Personal Injury
$ 5.7 Million – Gil Chavez v. D.R. Horton San Diego – Crane Accident/Spinal Cord Injury
$ 4 Million – Martin Rodriguez v. Jason Peden – Personal Injury/Spinal Cord Injury
Organizations & Memberships
In 2008, Mr. Williams was president of the San Joaquin Valley Chapter of ABOTA. He has addressed a number of legal organizations, including San Joaquin College of Law, in the area of civility. From 2004-2006, he served on the Board of Directors for the Fresno County Bar Association. He has provided a number of lectures to various organizations including law schools, civic groups, and health care providers in the area of personal injury and wrongful termination.
2018 CAL-ABOTA Trial Lawyer Of The Year
2017 Trial Lawyer of the Year
Martindale-Hubbell AV Rated
Super Lawyers’ Top 100 Lawyers in California
CRIMINAL DEFENCE 90%
DUI DEFENCE 75%
PERSONAL INJURY 95%
CTA Professional
🌐 8 Modestou Panteli, Tseri, Nicosia, Cyprus
☎ +357 95 593 585
📧 [email protected] | 2023-14/0000/en_head.json.gz/3625 | {"url": "https://cta-pro.com/team/peter-abrams/", "partition": "head_middle", "language": "en", "source_domain": "cta-pro.com", "date_download": "2023-03-20T10:01:43Z", "digest": "sha1:3ANYKNAOREU6RORMZYBJITEUB7NOM4UJ"} | {"ccnet_length": [[0, 2610, 2610.0]], "ccnet_original_length": [[0, 2610, 3456.0]], "ccnet_nlines": [[0, 2610, 27.0]], "ccnet_original_nlines": [[0, 2610, 66.0]], "ccnet_language_score": [[0, 2610, 0.93]], "ccnet_perplexity": [[0, 2610, 257.6]], "ccnet_bucket": [[0, 2610, 0.0]], "rps_doc_curly_bracket": [[0, 2610, 0.0]], "rps_doc_ldnoobw_words": [[0, 2610, 0.0]], "rps_doc_lorem_ipsum": [[0, 2610, 0.0]], "rps_doc_stop_word_fraction": [[0, 2610, 0.20038168]], "rps_doc_ut1_blacklist": [[0, 2610, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 2610, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 2610, 0.13022814]], "rps_doc_frac_chars_dupe_6grams": [[0, 2610, 0.05038023]], "rps_doc_frac_chars_dupe_7grams": [[0, 2610, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 2610, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 2610, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 2610, 0.01663498]], "rps_doc_frac_chars_top_3gram": [[0, 2610, 0.03231939]], "rps_doc_frac_chars_top_4gram": [[0, 2610, 0.03612167]], "rps_doc_frac_all_caps_words": [[0, 2610, 0.03244275]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 2610, 0.0]], "rps_doc_frac_no_alph_words": [[0, 2610, 0.27099237]], "rps_doc_frac_unique_words": [[0, 2610, 0.50851582]], "rps_doc_mean_word_length": [[0, 2610, 5.11922141]], "rps_doc_symbol_to_word_ratio": [[0, 2610, 0.0]], "rps_doc_unigram_entropy": [[0, 2610, 4.90674394]], "rps_doc_word_count": [[0, 2610, 411.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 19, 0.0], [19, 56, 0.0], [56, 69, 0.0], [69, 131, 0.0], [131, 182, 0.0], [182, 212, 0.0], [212, 690, 1.0], [690, 1046, 1.0], [1046, 1372, 0.0], [1372, 1466, 0.0], [1466, 1546, 0.0], [1546, 1647, 0.0], [1647, 1735, 0.0], [1735, 1818, 0.0], [1818, 1846, 0.0], [1846, 2312, 1.0], [2312, 2352, 0.0], [2352, 2382, 0.0], [2382, 2410, 0.0], [2410, 2455, 0.0], [2455, 2476, 0.0], [2476, 2492, 0.0], [2492, 2512, 0.0], [2512, 2529, 0.0], [2529, 2574, 0.0], [2574, 2592, 0.0], [2592, 2610, 0.0]], "rps_lines_javascript_counts": [[0, 19, 0.0], [19, 56, 0.0], [56, 69, 0.0], [69, 131, 0.0], [131, 182, 0.0], [182, 212, 0.0], [212, 690, 0.0], [690, 1046, 0.0], [1046, 1372, 0.0], [1372, 1466, 0.0], [1466, 1546, 0.0], [1546, 1647, 0.0], [1647, 1735, 0.0], [1735, 1818, 0.0], [1818, 1846, 0.0], [1846, 2312, 0.0], [2312, 2352, 0.0], [2352, 2382, 0.0], [2382, 2410, 0.0], [2410, 2455, 0.0], [2455, 2476, 0.0], [2476, 2492, 0.0], [2492, 2512, 0.0], [2512, 2529, 0.0], [2529, 2574, 0.0], [2574, 2592, 0.0], [2592, 2610, 0.0]], "rps_lines_num_words": [[0, 19, 2.0], [19, 56, 5.0], [56, 69, 2.0], [69, 131, 7.0], [131, 182, 10.0], [182, 212, 3.0], [212, 690, 81.0], [690, 1046, 52.0], [1046, 1372, 53.0], [1372, 1466, 13.0], [1466, 1546, 13.0], [1546, 1647, 16.0], [1647, 1735, 15.0], [1735, 1818, 13.0], [1818, 1846, 2.0], [1846, 2312, 76.0], [2312, 2352, 7.0], [2352, 2382, 6.0], [2382, 2410, 3.0], [2410, 2455, 7.0], [2455, 2476, 3.0], [2476, 2492, 3.0], [2492, 2512, 3.0], [2512, 2529, 2.0], [2529, 2574, 7.0], [2574, 2592, 5.0], [2592, 2610, 2.0]], "rps_lines_numerical_chars_fraction": [[0, 19, 0.0], [19, 56, 0.0], [56, 69, 0.0], [69, 131, 0.0], [131, 182, 0.17021277], [182, 212, 0.0], [212, 690, 0.02580645], [690, 1046, 0.02346041], [1046, 1372, 0.05732484], [1372, 1466, 0.03409091], [1466, 1546, 
Editorial Policy
Cullen & Cullen is committed to providing accurate, relevant, and trustworthy information to our readers. Our editorial policy is designed to ensure that all content on our website meets our high standards for quality and integrity.
Our website is independent of any outside influence, and we maintain full editorial control over all content that appears on our site. Our editorial team is responsible for creating and approving all content, including articles, blog posts, and other materials.
Fact-Checking and Verification
We are committed to providing accurate and trustworthy information to our readers. All content is fact-checked and reviewed by our editorial team to ensure that it is well-researched and supported by reliable sources.
Sources and Attribution
We strive to provide our readers with information from a variety of reliable sources. We provide attribution for all sources used in our content, including quotes, statistics, and other information.
Transparency and Disclosure
We believe in transparency and disclosure in all aspects of our operations. We will disclose any potential conflicts of interest or financial relationships that may influence our content or editorial decisions.
Corrections and Retractions
We take responsibility for the accuracy of our content and will promptly correct any errors or inaccuracies that are brought to our attention. We will also retract any content that is found to be inaccurate or misleading.
Advertising and Sponsored Content
We may accept advertising or sponsored content on our website, but all such content will be clearly labeled as advertising or sponsored content. We will not allow advertisers or sponsors to influence our editorial content or decisions.
Feedback and Complaints
We welcome feedback and complaints from our readers and will take all complaints seriously. We will investigate any complaints about our content or editorial practices and take appropriate action as needed.
If you have any questions or concerns about our editorial policy, please contact us at [email protected]. | 2023-14/0000/en_head.json.gz/3626 | {"url": "https://cullenandcullen.com/editorial-policy/", "partition": "head_middle", "language": "en", "source_domain": "cullenandcullen.com", "date_download": "2023-03-20T09:27:40Z", "digest": "sha1:TH4EKJ767JJWFIL4TRPQE2ZXMV5Q7J65"} | {"ccnet_length": [[0, 2072, 2072.0]], "ccnet_original_length": [[0, 2072, 5550.0]], "ccnet_nlines": [[0, 2072, 15.0]], "ccnet_original_nlines": [[0, 2072, 120.0]], "ccnet_language_score": [[0, 2072, 0.91]], "ccnet_perplexity": [[0, 2072, 329.7]], "ccnet_bucket": [[0, 2072, 0.0]], "rps_doc_curly_bracket": [[0, 2072, 0.0]], "rps_doc_ldnoobw_words": [[0, 2072, 0.0]], "rps_doc_lorem_ipsum": [[0, 2072, 0.0]], "rps_doc_stop_word_fraction": [[0, 2072, 0.42406877]], "rps_doc_ut1_blacklist": [[0, 2072, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 2072, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 2072, 0.04277457]], "rps_doc_frac_chars_dupe_6grams": [[0, 2072, 0.04277457]], "rps_doc_frac_chars_dupe_7grams": [[0, 2072, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 2072, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 2072, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 2072, 0.03468208]], "rps_doc_frac_chars_top_3gram": [[0, 2072, 0.02312139]], "rps_doc_frac_chars_top_4gram": [[0, 2072, 0.03236994]], "rps_doc_frac_all_caps_words": [[0, 2072, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 2072, 0.0]], "rps_doc_frac_no_alph_words": [[0, 2072, 0.10028653]], "rps_doc_frac_unique_words": [[0, 2072, 0.44480519]], "rps_doc_mean_word_length": [[0, 2072, 5.61688312]], "rps_doc_symbol_to_word_ratio": [[0, 2072, 0.0]], "rps_doc_unigram_entropy": [[0, 2072, 4.42789135]], "rps_doc_word_count": [[0, 2072, 308.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 22, 0.0], [22, 255, 1.0], [255, 517, 1.0], [517, 548, 0.0], [548, 766, 1.0], [766, 790, 0.0], [790, 989, 1.0], [989, 1017, 0.0], [1017, 1228, 1.0], [1228, 1256, 0.0], [1256, 1478, 1.0], [1478, 1512, 0.0], [1512, 1748, 1.0], [1748, 1955, 1.0], [1955, 2072, 1.0]], "rps_lines_javascript_counts": [[0, 22, 0.0], [22, 255, 0.0], [255, 517, 0.0], [517, 548, 0.0], [548, 766, 0.0], [766, 790, 0.0], [790, 989, 0.0], [989, 1017, 0.0], [1017, 1228, 0.0], [1228, 1256, 0.0], [1256, 1478, 0.0], [1478, 1512, 0.0], [1512, 1748, 0.0], [1748, 1955, 0.0], [1955, 2072, 0.0]], "rps_lines_num_words": [[0, 22, 2.0], [22, 255, 35.0], [255, 517, 40.0], [517, 548, 3.0], [548, 766, 33.0], [766, 790, 3.0], [790, 989, 30.0], [989, 1017, 3.0], [1017, 1228, 31.0], [1228, 1256, 3.0], [1256, 1478, 37.0], [1478, 1512, 4.0], [1512, 1748, 37.0], [1748, 1955, 31.0], [1955, 2072, 16.0]], "rps_lines_numerical_chars_fraction": [[0, 22, 0.0], [22, 255, 0.0], [255, 517, 0.0], [517, 548, 0.0], [548, 766, 0.0], [766, 790, 0.0], [790, 989, 0.0], [989, 1017, 0.0], [1017, 1228, 0.0], [1228, 1256, 0.0], [1256, 1478, 0.0], [1478, 1512, 0.0], [1512, 1748, 0.0], [1748, 1955, 0.0], [1955, 2072, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 22, 0.0], [22, 255, 0.0], [255, 517, 0.0], [517, 548, 0.0], [548, 766, 0.0], [766, 790, 0.0], [790, 989, 0.0], [989, 1017, 0.0], [1017, 1228, 0.0], [1228, 1256, 0.0], [1256, 1478, 0.0], [1478, 1512, 0.0], [1512, 1748, 0.0], [1748, 1955, 0.0], [1955, 2072, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 22, 0.13636364], [22, 255, 0.01287554], [255, 517, 0.00763359], [517, 548, 0.09677419], [548, 766, 0.00917431], [766, 790, 0.08333333], [790, 989, 0.01005025], [989, 1017, 0.07142857], 
Interviewed by Barry John Terblanche
Cult Critic – A unique, futuristic take on the original story, and a good one at that. What inspired you to film it?
Arnaldo Galvao (AG) – I am 60 years old, and this tale always terrified me as a child. I am from the generation that had black and white TV, only a few hours a day, so the word still had great importance, especially in fairy tales. The storytelling started there.
What scared me the most about Hansel and Gretel was the possibility of being left in the woods because my parents couldn't afford to take care of me. Children abandoned in the woods have to learn to take care of themselves and face witches.
What scares me today in Brazil is the number of children who are still abandoned by society. Hansel and Gretel is from the medieval period and was published by the Brothers Grimm in 1812. We should assume that by the 21st century this problem should have been solved.
My intention was to call attention to the children's cause, regardless of the reason why they were abandoned. And to do it in a way that brought some charm and hope, so we made a movie with good humor and adventure, turning kids into superheroes.
Cult Critic – The unique concept in your film is that Hansel and Gretel have superhero powers given to them by their spiritual grandparents. Do you envision filming a series/episodes?
AG-Hansel and Gretel would like to have powers in the fanciful way that TV characters like Ratboy and Ratgirl have, but that's not possible in real life. But we are all transcendental and part of a large congregation. We have ancestors and descendants and our real strength comes from that understanding.
Even in a materialistic world, spirituality is perhaps the best way for you to understand that you are part of a whole. In the movie Fabulous Hansel and Gretel we tried to do this from a character's dream. The unconscious brings you information that you very much need.
I have some animation projects in development, but I don't plan on turning Hansel and Gretel into a series. I think the movie has completed the full arc of the characters, but if we have any consistent invitations of course we are open to finding a way to continue.
Cult Critic – The film teaches children many valuable lessons (gives strong messages) about life, family, doing good, etc... what would you most like them to take from your film?
AG-We fantasize a lot about the perfect family we would like to have, but we need to accept who we are, where we came from and where we are going. And understand that the people around us, relatives and friends, are all in the same boat. That's where our strength comes from.
Each person is unique. Hansel doesn't like school, but he has a lot of drive to fix and build things, as opposed to his sister Gretel, for whom school and traditional training seem to make sense. We don't all need to be the same, but we do need to understand who we are and what we want.
Audiovisual media and animated cinema aimed specifically at children try to provide the family meeting: the moment where the family gathers to watch entertainment that then generates varied conversations about life.
Cult Critic – Children are always drawn to animal characters in animations... Why rats as Hansel and Gretel?
AG-I think everyone who works with animation thinks: “if Disney started with a mouse, why can't I too?”. Jokes aside, perhaps the question is: why the insistence on transforming characters who in real life are objects of repulsion into heroes?
The psychoanalyst Bruno Bettelheim explains this properly in his book “The Uses of Enchantment: The Meaning and Importance of Fairy Tales”. He says that when the child sees a mouse character, small and helpless, he recognizes himself immediately while allowing the character to go through dangerous situations with more tranquility and safety.
Cult Critic – Do you only direct animation films or do you also direct real-life films. If so, which do you prefer and why?
AG-I only direct animated films, but I have also made a documentary about the history of animation in Brazil.
I made the documentary as a way to keep the memory. Our first film “The Kaiser” is from 1917 and was made during the first world war. Of this film we have only one picture and reports from newspapers and magazines commenting on the release with interviews with the author.
French and American cinema are fighting over who made the first animated film, but if you want to watch Émile Cohl's Fantasmagorie or James Stuart Blackton's “Humorous Phases of Funny Faces” it's easy to find on the internet.
I think the documentary genre is very important and can help bring about conversations about relevant subjects that are often left out, but if I have to choose just one genre, I choose the most complete of all: animation.
Cult Critic – The film was very well produced by Julia Parma Galvao. Any family relation to you? And do you often direct and produce together?
AG-Julia Parma Galvão is my daughter. She graduated in business administration and worked 15 years for a multinational giant. She got married, had a child and began to rethink values and where to invest her time. That's when we started working together.
In Brazil, we basically work with returnable public investments. After the film is ready, there is a long journey, which is the rendering of accounts for the project, where we must explain in detail where the production resource was applied.
So we can divide production into 3 phases: the first would be fundraising, which is a lengthy and very competitive step. The second is production where we need to guarantee the correct remuneration of all those involved. And the third is the accountability of the project to investors. This stage is scheduled for 7 years, but can take up to 30 years.
Keeping a production company healthy for so many years is insane work. It takes discipline, strategy and flexibility. That's what Julia Parma Galvão does with a lot of talent and determination.
Cult Critic – The animatics were well done by Alessandro Santos. Have you worked with him before?
AG-The animatic is one of the most important parts of producing an animated film. It's the last moment when you can still change the script and fine-tune the timing of scenes, the breathing moments, and all necessary adjustments. Once the animation stage starts, the team grows and any adjustments are very expensive.
Alessandro Santos was very patient in redoing all the adjustments we needed. Because of his experience he also contributed a lot in the editing of our feature film.
Almost 99% of the film was made with people I had worked with before. It was Alessandro Santos who edited the animated documentary I had produced 5 years ago.
Cult Critic – Whilst it’s an animation film... were there any production challenges and/or “BTS” you’d like to share?
AG-Challenges are not lacking in the production of an animated film, but Fabulous Hansel and Gretel took the challenge to a much higher level.
I started making the film in 2000, writing the script with a friend named Flavio de Souza. By 2003 we had a consistent version of the script and the main characters developed.
I started looking for funding, which only happened in 2015. The resource was only made available in 2018. I set up a team and started producing, but we were interrupted by the pandemic.
For security reasons, the team returned to work remotely, after many logistical issues were resolved, such as computers, cards, network access, software, chairs, tablets, etc.
Even those who have no idea of animation can imagine the increase in budget and need for strategy. It was a very tense and scary period where I often thought I would never finish Fabulous Hansel and Gretel.
Cult Critic – Of all the films you’ve made, directed – which was the most challenging, and why?
Fabulous Hansel and Gretel. We already started with a restrictive limit which was the $200,000 budget. This amount is too low to produce an animated feature in Brazil. Our first budget was almost $1,000,000 so you can imagine the amount of restrictions we need to put in place to make the project viable.
When it seemed that we had already stretched the rope to the fullest, the pandemic arrived, which brutally raised all costs.
Cult Critic – Knowing the directors task in real-life movies... what is the main difference in directing animation films?
I think the best thing about real-life movies is the actors. It's amazing how they can carry a movie on their backs or destroy it. And animation can also do that by inviting an actor or actress that you admire and know will work well for the role.
Animated movies and real-life movies are quite complex, each in its own way. I've never done real-life movies, but I imagine you also build them little by little: idea, script, casting, art direction, costumes, locations, and also all the members of the team, because audiovisual is really teamwork.
In a provocative way we can say that real-life movies have no imagination since everything starts from the real world while in animated films imagination is what moves everything from the beginning.
Cult Critic – How did you come to be a filmmaker and director? And what advice can you give aspiring filmmakers/directors?
As I said in a previous answer, I am from the generation that was born with TV in the house and where many animated films like Jetsons and Flintstones were shown, but I never thought about making that type of film. At the age of 16 I started working as a cartoonist and illustrator. I did neighborhood newspapers and social campaigns until I started publishing in a famous weekly newspaper and then in a giant daily newspaper.
I was happy with that and thought I would be doing cartoons and illustrations forever, but then I went to see a Canadian animated film show. Many films, with great diversity. The images were beautiful and unlike anything I had ever seen: Norman McLaren, Frédéric Back, Caroline Leaf, Ryan Larkin, Evelyn Lambart, Richard Condie, Lotte Reiniger… and Driessen, for the apparent simplicity and narrative impact of his work.
The next day I mapped all the animation production companies in São Paulo, Brazil, and went in search of learning, remembering that in Brazil, at that time, there were no animation schools.
The advice I can give is to take advantage of the gigantic amount of information on the internet. It has movies, software and websites about everything that matters. If possible, look for a good school. I was self-taught (I learned while working), but I don't recommend it because the learning curve is too long. I only produced my first feature after 10 years of work. Watch lots of movies and find your style. Also use the internet to publicize your work. Apply to festivals and try to talk to colleagues.
Cult Critic – What's next for you... anything you're currently working on?
AG-In recent years Brazil has faced a systematic destruction of audiovisual production, social advances, respect for religions, ecology, etc. So the next period will inevitably be one of reconstruction.
It was Chaplin who said that he believed in laughter and tears as antidotes to hatred and terror.
This is something where I would like to invest my time and energy.
PS- Thank you for these insightful questions. Thank you for watching Connect – Fabulous Hansel and Gretel and talking with me about this film. I'm really honored. | 2023-14/0000/en_head.json.gz/3627 | {"url": "https://cultcritic.co/interview/fabulous-hansel-and-gretel", "partition": "head_middle", "language": "en", "source_domain": "cultcritic.co", "date_download": "2023-03-20T09:07:16Z", "digest": "sha1:6IL6Q774ISAZSIUY37L2LGHNKRCGTYCI"} | {"ccnet_length": [[0, 11378, 11378.0]], "ccnet_original_length": [[0, 11378, 15184.0]], "ccnet_nlines": [[0, 11378, 54.0]], "ccnet_original_nlines": [[0, 11378, 237.0]], "ccnet_language_score": [[0, 11378, 0.98]], "ccnet_perplexity": [[0, 11378, 282.0]], "ccnet_bucket": [[0, 11378, 0.0]], "rps_doc_curly_bracket": [[0, 11378, 0.0]], "rps_doc_ldnoobw_words": [[0, 11378, 0.0]], "rps_doc_lorem_ipsum": [[0, 11378, 0.0]], "rps_doc_stop_word_fraction": [[0, 11378, 0.44817073]], "rps_doc_ut1_blacklist": [[0, 11378, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 11378, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 11378, 0.01467207]], "rps_doc_frac_chars_dupe_6grams": [[0, 11378, 0.00525567]], "rps_doc_frac_chars_dupe_7grams": [[0, 11378, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 11378, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 11378, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 11378, 0.01313917]], "rps_doc_frac_chars_top_3gram": [[0, 11378, 0.01324866]], "rps_doc_frac_chars_top_4gram": [[0, 11378, 0.0125917]], "rps_doc_frac_all_caps_words": [[0, 11378, 0.02526132]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 11378, 0.0]], "rps_doc_frac_no_alph_words": [[0, 11378, 0.13675958]], "rps_doc_frac_unique_words": [[0, 11378, 0.3702387]], "rps_doc_mean_word_length": [[0, 11378, 4.63839512]], "rps_doc_symbol_to_word_ratio": [[0, 11378, 0.00261324]], "rps_doc_unigram_entropy": [[0, 11378, 5.71490058]], "rps_doc_word_count": [[0, 11378, 1969.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 37, 0.0], [37, 147, 1.0], [147, 410, 1.0], [410, 681, 1.0], [681, 949, 1.0], [949, 1196, 1.0], [1196, 1380, 1.0], [1380, 1685, 1.0], [1685, 1954, 1.0], [1954, 2220, 1.0], [2220, 2393, 1.0], [2393, 2669, 1.0], [2669, 2956, 1.0], [2956, 3182, 1.0], [3182, 3291, 1.0], [3291, 3535, 1.0], [3535, 3879, 1.0], [3879, 4003, 1.0], [4003, 4113, 1.0], [4113, 4386, 1.0], [4386, 4612, 1.0], [4612, 4846, 1.0], [4846, 4990, 1.0], [4990, 5244, 1.0], [5244, 5486, 1.0], [5486, 5838, 1.0], [5838, 6032, 1.0], [6032, 6126, 1.0], [6126, 6435, 1.0], [6435, 6600, 1.0], [6600, 6759, 1.0], [6759, 6876, 1.0], [6876, 7019, 1.0], [7019, 7183, 1.0], [7183, 7369, 1.0], [7369, 7532, 1.0], [7532, 7739, 1.0], [7739, 7835, 1.0], [7835, 8140, 1.0], [8140, 8263, 1.0], [8263, 8385, 1.0], [8385, 8633, 1.0], [8633, 8938, 1.0], [8938, 9137, 1.0], [9137, 9263, 1.0], [9263, 9687, 1.0], [9687, 10092, 1.0], [10092, 10282, 1.0], [10282, 10778, 1.0], [10778, 10850, 1.0], [10850, 11051, 1.0], [11051, 11149, 1.0], [11149, 11216, 1.0], [11216, 11378, 1.0]], "rps_lines_javascript_counts": [[0, 37, 0.0], [37, 147, 0.0], [147, 410, 0.0], [410, 681, 0.0], [681, 949, 0.0], [949, 1196, 0.0], [1196, 1380, 0.0], [1380, 1685, 0.0], [1685, 1954, 0.0], [1954, 2220, 0.0], [2220, 2393, 0.0], [2393, 2669, 0.0], [2669, 2956, 0.0], [2956, 3182, 0.0], [3182, 3291, 0.0], [3291, 3535, 0.0], [3535, 3879, 0.0], [3879, 4003, 0.0], [4003, 4113, 0.0], [4113, 4386, 0.0], [4386, 4612, 0.0], [4612, 4846, 0.0], [4846, 4990, 0.0], [4990, 5244, 0.0], [5244, 5486, 0.0], [5486, 5838, 0.0], [5838, 6032, 0.0], [6032, 6126, 0.0], [6126, 6435, 0.0], 
Electronic retinal implant surgery – (Eye 31, 191 (February 2017)) | 2023-14/0000/en_head.json.gz/3628 | {"url": "https://curriculum.rcophth.ac.uk/2017/02/electronic-retinal-implant-surgery-eye-31-191-february-2017/", "partition": "head_middle", "language": "en", "source_domain": "curriculum.rcophth.ac.uk", "date_download": "2023-03-20T10:13:35Z", "digest": "sha1:FNYPOZKY7MIVDRPSCDMVC3HJQNV25H6G"} | {"ccnet_length": [[0, 66, 66.0]], "ccnet_original_length": [[0, 66, 3016.0]], "ccnet_nlines": [[0, 66, 1.0]], "ccnet_original_nlines": [[0, 66, 110.0]], "ccnet_language_score": [[0, 66, 0.85]], "ccnet_perplexity": [[0, 66, 321.6]], "ccnet_bucket": [[0, 66, 0.0]], "rps_doc_curly_bracket": [[0, 66, 0.0]], "rps_doc_ldnoobw_words": [[0, 66, 0.0]], "rps_doc_lorem_ipsum": [[0, 66, 0.0]], "rps_doc_stop_word_fraction": [[0, 66, 0.0]], "rps_doc_ut1_blacklist": [[0, 66, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 66, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 66, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 66, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 66, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 66, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 66, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 66, 0.0]], "rps_doc_frac_chars_top_3gram": [[0, 66, 0.0]], "rps_doc_frac_chars_top_4gram": [[0, 66, 0.0]], "rps_doc_frac_all_caps_words": [[0, 66, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 66, 0.0]], "rps_doc_frac_no_alph_words": [[0, 66, 0.57142857]], "rps_doc_frac_unique_words": [[0, 66, 1.0]], "rps_doc_mean_word_length": [[0, 66, 5.2]], "rps_doc_symbol_to_word_ratio": [[0, 66, 0.0]], "rps_doc_unigram_entropy": [[0, 66, 2.30258509]], "rps_doc_word_count": [[0, 66, 10.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 66, 0.0]], "rps_lines_javascript_counts": [[0, 66, 0.0]], "rps_lines_num_words": [[0, 66, 10.0]], "rps_lines_numerical_chars_fraction": [[0, 66, 0.14754098]], "rps_lines_start_with_bulletpoint": [[0, 66, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 66, 0.04545455]], "rps_doc_ml_palm_score": [[0, 66, 0.00145316]], "rps_doc_ml_wikipedia_score": [[0, 66, null]], "rps_doc_ml_wikiref_score": [[0, 66, -1.001e-05]], "rps_doc_books_importance": [[0, 66, -12.63028265]], "rps_doc_openwebtext_importance": [[0, 66, -3.70882295]], "rps_doc_wikipedia_importance": [[0, 66, 2.24470864]], "rps_doc_num_sentences": [[0, 66, 1.0]]} |
Day 56 – 3rd September (Sainte Anne des Monts)
Hopefully a less boring post than yesterday!
Beautiful sunny day today.
As expected (?), our breakfast was delivered in a takeaway box at almost exactly 9 a.m. It consisted of a bacon, egg and cheese muffin, two hash browns and what the North Americans call an English muffin – a sort of cake. The former muffin wasn’t half bad, actually, and we secreted the cakes in the car as a travelling snack. It’s still a weird experience – no idea why they do it this way, but hey….
Next stop was the tourist office, a five minute walk away. The gent there was most helpful, and as a result of information he imparted, we decided in the first instance to visit the Parc Haute Gaspésien, a half-hour drive up the road, as it promised mountains, lakes, waterfalls and scenery.
Took us a while to find the equivalent of the visitor centre, but equipped with a suitable map and some detailed instructions from a helpful young lady, we set off to find the walk to, firstly, the Lac aux Américains. The car park for the start of the walk was around 2 km down the road.
On the way was a very short detour to the Sainte Anne Falls.
Nice, but not spectacular
Initially missed the turning for the car park for this walk, but fairly soon realised our mistake and turned back. It made us appreciate that this park is huge. Lots of driving on gravel roads, reminding us of previous trips to Australia and South Africa.
Got there eventually, to be asked by a ranger for our park ticket. We hadn’t been told about this at the visitor centre, so basically we had to part with $20, which at least allowed us entry into other parts of the park for today.
The walk to this lake was longish and quite challenging, as it was a warm day and the path was very steep in places. Pleasant enough through woodland and hilly scenery.
When we got to the lake, it was pretty enough, but not particularly special.
Inactive glacier which formed the lake when it retreated, now covered in vegetation
Back we went to the car and drove another 5 km or so along very dusty and gravelly roads to try the walk up to Mont Ernest-LaForce (820 metres above sea level).
This was an even tougher proposition. A very steep and rocky climb in places, but some nice scenery on the way.
We finally made it to the observation deck on top of this hill and enjoyed a 360 degree view of the mountains around. They’re not particularly tall – 1100 metres is about the tallest in the area – but a worthwhile result after a challenging climb.
All of a sudden, a ranger on the observation deck started addressing us all in French, with a very detailed explanation of how these mountains came about. My French wasn’t good enough to understand all he was saying, but, once again, Wikipedia comes to the rescue: https://en.m.wikipedia.org/wiki/Chic-Choc_Mountains
By way of explanation, we are in the Gaspé peninsula of Québec, not far now from New Brunswick (Canada) and Maine (USA), both of which are associated with the Appalachian mountain chain heading south from here, more of which we will see on our future travels.
I recorded this walk on my Garmin watch.
On the way up, we took the southern loop as you see it here and the northern loop on the way down. The bluer the colour, the slower, which shows how much slower we were going downhill than uphill. I have to say there were certain steep and rocky parts of the descent which were purgatory for both of us. The joint deterioration – hips and knees – is getting worse.
Anyway, we made it back to the car, and drove back to Sainte Anne des Monts to have a closer look at the village. I was interested in booking a restaurant recommended by the tourist office on the St. Lawrence shore, whilst Jean was interested in an art gallery similarly recommended.
Unfortunately, the restaurant was fully booked, but Jean got to visit her art gallery. I stayed in the car, as I’m afraid this stuff just bores me rigid.
Back to our motel, where we resolved to eat dinner early, as we were expecting big crowds for a small restaurant on a Saturday night. Actually, no. Got seated more or less immediately and, to start with, it was very quiet. It did get much busier later, so we got our timing right on this occasion.
We both had the lamb shank, which was absolutely delicious. Meat falling off the bone and al dente vegetables. Brilliant, and not something you might expect from the outside appearance of this place.
To cap it all, we managed to get some washing done! In the free hotel launderette! These things are important, you know….
Another piece of advice that the gent in the tourist office gave us was to stick to the northern coastline of the Gaspé peninsula all the way round to Percé, our next stop, as it’s supposed to be a spectacular drive. It will probably take about an hour longer than the Google Maps suggested route, but the plan for tomorrow is to give that a try after a quick look around the village. Weather not looking so good, though….
Please see tomorrow’s post for our progress.
← Day 55 – 2nd September (Riviére-du-Loup to Sainte Anne des Monts)
→ Day 57 – 4th September (Sainte Anne des Monts to Percé)
4 responses to “Day 56 – 3rd September (Sainte Anne des Monts)”
It must be so frustrating for you, Chris, with your joint and knee problems.
Simon has just had to have a week off sick with a swollen knee joint, yet like you he feels fine, just can't walk. We were supposed to go to the Chatsworth country show today but have had to cancel because of his knee. Oh, the joys of getting old!
Sep 4, 2022 at 1:22 pm
Yes, it’s extremely frustrating. Going uphill isn’t too bad if there aren’t too many big steps, but downhill can be very uncomfortable and painful. Very limiting.
Some good photos of the great landscapes 🙂
Thanks, Steve. Very different from the Rockies, but still impressive in its own way. | 2023-14/0000/en_head.json.gz/3629 | {"url": "https://cwstories.uk/2022/09/04/day-56-3rd-september-sainte-anne-des-monts/", "partition": "head_middle", "language": "en", "source_domain": "cwstories.uk", "date_download": "2023-03-20T09:07:28Z", "digest": "sha1:QVH4K4I5STWFX2GDMCOA4CXLJPAJAVMK"} | {"ccnet_length": [[0, 5854, 5854.0]], "ccnet_original_length": [[0, 5854, 7252.0]], "ccnet_nlines": [[0, 5854, 36.0]], "ccnet_original_nlines": [[0, 5854, 99.0]], "ccnet_language_score": [[0, 5854, 0.97]], "ccnet_perplexity": [[0, 5854, 321.1]], "ccnet_bucket": [[0, 5854, 0.0]], "rps_doc_curly_bracket": [[0, 5854, 0.0]], "rps_doc_ldnoobw_words": [[0, 5854, 0.0]], "rps_doc_lorem_ipsum": [[0, 5854, 0.0]], "rps_doc_stop_word_fraction": [[0, 5854, 0.44852354]], "rps_doc_ut1_blacklist": [[0, 5854, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 5854, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 5854, 0.03537532]], "rps_doc_frac_chars_dupe_6grams": [[0, 5854, 0.01207938]], "rps_doc_frac_chars_dupe_7grams": [[0, 5854, 0.01207938]], "rps_doc_frac_chars_dupe_8grams": [[0, 5854, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 5854, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 5854, 0.00862813]], "rps_doc_frac_chars_top_3gram": [[0, 5854, 0.01402071]], "rps_doc_frac_chars_top_4gram": [[0, 5854, 0.01553063]], "rps_doc_frac_all_caps_words": [[0, 5854, 0.00558659]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 5854, 0.0]], "rps_doc_frac_no_alph_words": [[0, 5854, 0.15083799]], "rps_doc_frac_unique_words": [[0, 5854, 0.46074766]], "rps_doc_mean_word_length": [[0, 5854, 4.33271028]], "rps_doc_symbol_to_word_ratio": [[0, 5854, 0.00239425]], "rps_doc_unigram_entropy": [[0, 5854, 5.54440048]], "rps_doc_word_count": [[0, 5854, 1070.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 47, 0.0], [47, 92, 1.0], [92, 119, 1.0], [119, 521, 1.0], [521, 813, 1.0], [813, 1101, 1.0], [1101, 1162, 1.0], [1162, 1188, 0.0], [1188, 1444, 1.0], [1444, 1675, 1.0], [1675, 1844, 1.0], [1844, 1921, 1.0], [1921, 2005, 0.0], [2005, 2167, 1.0], [2167, 2279, 1.0], [2279, 2527, 1.0], [2527, 2844, 0.0], [2844, 3103, 1.0], [3103, 3144, 1.0], [3144, 3509, 1.0], [3509, 3793, 1.0], [3793, 3947, 1.0], [3947, 4245, 1.0], [4245, 4445, 1.0], [4445, 4567, 1.0], [4567, 4990, 1.0], [4990, 5035, 1.0], [5035, 5103, 0.0], [5103, 5161, 0.0], [5161, 5225, 1.0], [5225, 5299, 0.0], [5299, 5541, 1.0], [5541, 5564, 0.0], [5564, 5727, 1.0], [5727, 5770, 0.0], [5770, 5854, 1.0]], "rps_lines_javascript_counts": [[0, 47, 0.0], [47, 92, 0.0], [92, 119, 0.0], [119, 521, 0.0], [521, 813, 0.0], [813, 1101, 0.0], [1101, 1162, 0.0], [1162, 1188, 0.0], [1188, 1444, 0.0], [1444, 1675, 0.0], [1675, 1844, 0.0], [1844, 1921, 0.0], [1921, 2005, 0.0], [2005, 2167, 0.0], [2167, 2279, 0.0], [2279, 2527, 0.0], [2527, 2844, 0.0], [2844, 3103, 0.0], [3103, 3144, 0.0], [3144, 3509, 0.0], [3509, 3793, 0.0], [3793, 3947, 0.0], [3947, 4245, 0.0], [4245, 4445, 0.0], [4445, 4567, 0.0], [4567, 4990, 0.0], [4990, 5035, 0.0], [5035, 5103, 0.0], [5103, 5161, 0.0], [5161, 5225, 0.0], [5225, 5299, 0.0], [5299, 5541, 0.0], [5541, 5564, 0.0], [5564, 5727, 0.0], [5727, 5770, 0.0], [5770, 5854, 0.0]], "rps_lines_num_words": [[0, 47, 9.0], [47, 92, 7.0], [92, 119, 4.0], [119, 521, 76.0], [521, 813, 51.0], [813, 1101, 56.0], [1101, 1162, 13.0], [1162, 1188, 4.0], [1188, 1444, 45.0], [1444, 1675, 45.0], [1675, 1844, 31.0], [1844, 1921, 14.0], [1921, 2005, 13.0], [2005, 2167, 33.0], [2167, 2279, 21.0], [2279, 
Season 49
CYBERDUNK is available in the following languages:
Chinese (Simplified) – 121.1%
Croatian – 66.2%
Estonian – 77.9%
Hungarian – 121.8%
Italian – 121.7%
Lithuanian – 121.7%
Latvian – 121.5%
Polish – 104%
Portuguese – 121.2%
Romanian – 70.3%
Serbian – 121.7%
Spanish – 121.7%
Chinese (Traditional) – 116.6%
English (Original) – 100%
Hebrew (modern) – 70.3%
Indonesian – 117.2%
German – 122.4%
Filipino – 120.8%
French – 121.8%
Turkish – 121.6%
Dutch – 52.6%
Hindi – 44.8%
Russian – 121.7%
Swedish – 121.7%
Korean – 31.4%
Korean – 38.3%
Bosnian – 42.2%
Czech – 29.7%
Vietnamese – 69.7%
Bulgarian – 114.2%
"rps_doc_openwebtext_importance": [[0, 1151, -56.56699034]], "rps_doc_wikipedia_importance": [[0, 1151, 30.00585108]], "rps_doc_num_sentences": [[0, 1151, 59.0]]} |
Safecam 360: The Ultimate Home Security Solution
Home & GardenSafecam 360: The Ultimate Home Security Solution
Safecam 360 is a brand that produces security cameras designed to help users monitor their homes or businesses remotely. The cameras are equipped with features such as a 360-degree field of view, high definition video recording, night vision, motion detection, two-way audio, and cloud storage for recorded footage. The company offers various models to fit different needs, including indoor and outdoor cameras, as well as models designed for use in specific areas, such as baby monitors or pet monitors. Safecam 360 aims to make security cameras that are easy to set up and use, and that provide reliable and high-quality video footage.
Safecam 360 Features
The Safecam 360 security camera offers a range of features that are designed to provide users with reliable and high-quality video footage. Some of the main features of the camera include:
360-degree field of view: This allows the camera to capture footage of a wide area, without any blind spots.
High definition video recording: The camera is capable of recording high quality video footage, even in low light conditions.
Night vision: The camera has infrared LEDs that enable it to capture clear footage even in complete darkness.
Motion detection: The camera is equipped with motion sensors that can detect movement in its field of view. This feature can be set to trigger alerts to your phone, or to start recording automatically.
Two-way audio: This feature allows you to hear what’s happening around the camera, and also to speak to people who are in the camera’s field of view.
Cloud storage: Safecam 360 cameras come with cloud storage for recorded footage, which can be accessed from anywhere using an internet-connected device.
Easy setup and use: The camera is designed to be easy to set up and use, even for people who are not tech-savvy.
Overall, Safecam 360 offers a range of features that are designed to provide users with reliable and high-quality video footage, making it a popular choice for home and business security.
360-degree field of view
A 360-degree field of view means that the camera is capable of capturing footage of an entire area without any blind spots. This feature allows the camera to provide a complete view of the surroundings, ensuring that no areas are missed or left uncovered. With a 360-degree field of view, the camera can capture footage from all angles and corners, making it an ideal solution for surveillance in large areas such as warehouses, parking lots, and open spaces. This feature is particularly useful for security purposes, as it provides a comprehensive view of the environment, enabling users to monitor activities and detect potential threats.
High definition video recording
High definition video recording is a feature of Safecam 360 that allows the camera to capture clear and detailed footage. This feature ensures that the recorded video is of high quality, making it easier to identify objects, people, or vehicles in the footage. With high definition video recording, the camera can capture footage in 1080p or higher resolutions, providing users with a detailed view of the environment. This feature is particularly useful for security purposes, as it allows users to monitor activities in high detail, making it easier to identify potential security threats. High definition video recording also ensures that the footage is of sufficient quality for use as evidence in legal proceedings if necessary.
Night vision
Night vision is a feature of Safecam 360 that allows the camera to capture clear footage in low light or no light conditions. This feature enables the camera to continue monitoring the environment even in complete darkness, making it an ideal solution for 24/7 surveillance. With night vision, the camera can capture footage using infrared (IR) technology, which illuminates the environment with light that is not visible to the human eye. The camera then captures this illuminated environment in black and white, allowing users to see details in the footage that would otherwise be invisible. This feature is particularly useful for security purposes, as it ensures that the camera can continue monitoring the environment even when there is no ambient light available.
Motion detection
Motion detection is a feature of Safecam 360 that allows the camera to detect movement in its field of view. This feature enables the camera to automatically start recording when it detects motion, making it an ideal solution for surveillance and security purposes. With motion detection, the camera can be set up to send alerts or notifications to the user's smartphone or other devices when it detects movement. This feature is particularly useful for monitoring homes or businesses when no one is present, as it can help detect potential security threats and alert the user in real time. Motion detection can also be set up to trigger other actions, such as sounding an alarm, sending a message to a security company, or starting a video recording, depending on the user's needs.
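To make that detect-notify-record cycle concrete, here is a minimal sketch in Python. It is illustrative only, not Safecam's actual firmware: the frame source, the pixel-difference threshold, and the notify/record stubs are all invented for the example.

import random
import time

MOTION_THRESHOLD = 20  # mean absolute pixel difference treated as motion (assumed)

def read_frame(width=8, height=8):
    """Stand-in for a camera frame grab: a small grid of grayscale pixel values."""
    return [[random.randint(0, 255) for _ in range(width)] for _ in range(height)]

def motion_detected(prev, curr):
    """Naive frame differencing: mean absolute difference between two frames."""
    diffs = [abs(a - b)
             for prev_row, curr_row in zip(prev, curr)
             for a, b in zip(prev_row, curr_row)]
    return sum(diffs) / len(diffs) > MOTION_THRESHOLD

def notify_and_record():
    """Stand-in for the alert step: push a notification, then record a clip."""
    print("Motion detected: sending alert and recording a clip...")

prev = read_frame()
for _ in range(5):  # a few polling iterations, for demonstration
    time.sleep(0.1)
    curr = read_frame()
    if motion_detected(prev, curr):
        notify_and_record()
    prev = curr

A real device would read frames from a sensor rather than generating random ones, but the control flow (poll, compare, alert, record) is the same as described above.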
Two-way audio
Two-way audio is another feature of Safecam 360 that allows users to communicate with anyone within the camera's field of view. With this feature, users can speak directly to someone on the other end, such as a visitor or a delivery person, and hear their responses. This feature is particularly useful for monitoring homes or businesses, as it allows users to communicate with people on the other end of the camera, even when they are not physically present. Two-way audio can also be useful in emergency situations, as it allows users to speak with emergency responders or to alert potential intruders that they are being monitored. Overall, the two-way audio feature of Safecam 360 provides an additional layer of security and convenience for users.
Cloud storage
Safecam 360 also offers cloud storage, allowing users to store video recordings from their cameras on remote servers hosted by the manufacturer. With cloud storage, users can access their footage from anywhere with an internet connection, which provides an additional layer of security in case their local recordings are lost or destroyed. Cloud storage also provides an easy way for users to share their footage with others, such as law enforcement officials or insurance adjusters, in case of an incident. However, it's important to note that cloud storage may require a subscription or additional fees. Users should ensure that their data is stored securely and according to their privacy preferences.
Pros And Cons Of Safecam 360
Here are some potential pros and cons of using the Safecam 360 security camera:
Pros:
360-degree field of view allows for more comprehensive coverage of the surrounding area.
High definition video recording provides clear images and footage.
Night vision enables monitoring in low-light or dark environments.
Motion detection alerts the user to potential security breaches.
Two-way audio allows for communication with anyone in the vicinity of the camera.
Cloud storage provides a secure and accessible way to store footage.
Easy installation and setup.
Cons:
Some users may not need or want all of the advanced features offered by the camera.
Requires a stable internet connection for live streaming and cloud storage.
Subscription fee is required for cloud storage.
May not be compatible with all smart home systems or devices.
Requires consistent maintenance to ensure proper functionality.
May be too expensive for some users.
It’s important to note that the pros and cons can vary depending on the specific use case and user’s individual needs and preferences.
Here are some frequently asked questions about Safecam 360:
What is the price of Safecam 360?
The price of Safecam 360 varies depending on the package and promotions offered by the company. You can visit their website to check the latest pricing.
Is Safecam 360 easy to install?
Yes, Safecam 360 is designed to be easy to install. You can set it up in just a few minutes and use it right away.
Can I access the footage remotely?
Yes, you can access the footage remotely using your smartphone or computer as long as you have an internet connection.
Does Safecam 360 have a warranty?
Yes, Safecam 360 comes with a one-year warranty. If you encounter any issues with the product, you can contact their customer support for assistance.
How does the motion detection feature work?
The Safecam 360 security camera uses advanced motion detection technology to identify any movement within its field of view. Once it detects motion, it will send a notification to your smartphone and start recording.
Is Safecam 360 suitable for outdoor use?
Yes, Safecam 360 is suitable for both indoor and outdoor use. It is designed to be weather-resistant and can withstand different weather conditions.
In conclusion, the Safecam 360 security camera offers many features that make it a great option for those looking for an affordable and easy-to-use security solution. Its 360-degree field of view, high definition video recording, night vision, motion detection, two-way audio, and cloud storage all give users the peace of mind of knowing that their property is being monitored. However, its limited Wi-Fi range and dependence on a strong internet connection may make it less suitable for larger properties or areas with weak network coverage. Nonetheless, the Safecam 360 represents a solid option for those seeking an affordable, reliable, and easy-to-use home security camera.
How A Car Accident Affects Your Car Insurance | 2023-14/0000/en_head.json.gz/3631 | {"url": "https://daily-contributors.com/safecam-360-the-ultimate-home-security-solution/", "partition": "head_middle", "language": "en", "source_domain": "daily-contributors.com", "date_download": "2023-03-20T08:51:05Z", "digest": "sha1:TS4ICO7QYRHGHRMDFB66JTFPVUQBEDAA"} | {"ccnet_length": [[0, 9919, 9919.0]], "ccnet_original_length": [[0, 9919, 12272.0]], "ccnet_nlines": [[0, 9919, 66.0]], "ccnet_original_nlines": [[0, 9919, 188.0]], "ccnet_language_score": [[0, 9919, 0.93]], "ccnet_perplexity": [[0, 9919, 213.3]], "ccnet_bucket": [[0, 9919, 0.0]], "rps_doc_curly_bracket": [[0, 9919, 0.0]], "rps_doc_ldnoobw_words": [[0, 9919, 0.0]], "rps_doc_lorem_ipsum": [[0, 9919, 0.0]], "rps_doc_stop_word_fraction": [[0, 9919, 0.36941303]], "rps_doc_ut1_blacklist": [[0, 9919, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 9919, 0.10048083]], "rps_doc_frac_chars_dupe_5grams": [[0, 9919, 0.26914067]], "rps_doc_frac_chars_dupe_6grams": [[0, 9919, 0.17272839]], "rps_doc_frac_chars_dupe_7grams": [[0, 9919, 0.11996055]], "rps_doc_frac_chars_dupe_8grams": [[0, 9919, 0.11059056]], "rps_doc_frac_chars_dupe_9grams": [[0, 9919, 0.11059056]], "rps_doc_frac_chars_top_2gram": [[0, 9919, 0.03575391]], "rps_doc_frac_chars_top_3gram": [[0, 9919, 0.01763038]], "rps_doc_frac_chars_top_4gram": [[0, 9919, 0.03106892]], "rps_doc_frac_all_caps_words": [[0, 9919, 0.00269251]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 9919, 0.0]], "rps_doc_frac_no_alph_words": [[0, 9919, 0.14108778]], "rps_doc_frac_unique_words": [[0, 9919, 0.26509023]], "rps_doc_mean_word_length": [[0, 9919, 5.04729309]], "rps_doc_symbol_to_word_ratio": [[0, 9919, 0.0]], "rps_doc_unigram_entropy": [[0, 9919, 5.24336722]], "rps_doc_word_count": [[0, 9919, 1607.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 49, 0.0], [49, 111, 0.0], [111, 760, 1.0], [760, 783, 0.0], [783, 810, 0.0], [810, 844, 0.0], [844, 859, 0.0], [859, 878, 0.0], [878, 894, 0.0], [894, 910, 0.0], [910, 941, 0.0], [941, 949, 0.0], [949, 958, 0.0], [958, 979, 0.0], [979, 1156, 0.0], [1156, 1265, 1.0], [1265, 1391, 1.0], [1391, 1501, 1.0], [1501, 1703, 1.0], [1703, 1853, 1.0], [1853, 2006, 1.0], [2006, 2119, 1.0], [2119, 2307, 1.0], [2307, 2332, 0.0], [2332, 2974, 1.0], [2974, 3006, 0.0], [3006, 3740, 1.0], [3740, 4520, 1.0], [4520, 5303, 1.0], [5303, 6056, 1.0], [6056, 6774, 1.0], [6774, 6803, 0.0], [6803, 6883, 0.0], [6883, 6972, 1.0], [6972, 7039, 1.0], [7039, 7106, 1.0], [7106, 7171, 1.0], [7171, 7253, 1.0], [7253, 7322, 1.0], [7322, 7351, 1.0], [7351, 7435, 1.0], [7435, 7511, 1.0], [7511, 7559, 1.0], [7559, 7621, 1.0], [7621, 7685, 1.0], [7685, 7722, 1.0], [7722, 7857, 1.0], [7857, 7917, 0.0], [7917, 7951, 1.0], [7951, 8104, 1.0], [8104, 8136, 1.0], [8136, 8251, 1.0], [8251, 8286, 1.0], [8286, 8405, 1.0], [8405, 8439, 1.0], [8439, 8589, 1.0], [8589, 8633, 1.0], [8633, 8850, 1.0], [8850, 8891, 1.0], [8891, 9040, 1.0], [9040, 9748, 1.0], [9748, 9768, 0.0], [9768, 9780, 0.0], [9780, 9819, 0.0], [9819, 9874, 0.0], [9874, 9919, 0.0]], "rps_lines_javascript_counts": [[0, 49, 0.0], [49, 111, 0.0], [111, 760, 0.0], [760, 783, 0.0], [783, 810, 0.0], [810, 844, 0.0], [844, 859, 0.0], [859, 878, 0.0], [878, 894, 0.0], [894, 910, 0.0], [910, 941, 0.0], [941, 949, 0.0], [949, 958, 0.0], [958, 979, 0.0], [979, 1156, 0.0], [1156, 1265, 0.0], [1265, 1391, 0.0], [1391, 1501, 0.0], [1501, 1703, 0.0], [1703, 1853, 0.0], [1853, 2006, 0.0], [2006, 2119, 0.0], [2119, 2307, 0.0], [2307, 2332, 0.0], [2332, 2974, 
0.0], [2974, 3006, 0.0], [3006, 3740, 0.0], [3740, 4520, 0.0], [4520, 5303, 0.0], [5303, 6056, 0.0], [6056, 6774, 0.0], [6774, 6803, 0.0], [6803, 6883, 0.0], [6883, 6972, 0.0], [6972, 7039, 0.0], [7039, 7106, 0.0], [7106, 7171, 0.0], [7171, 7253, 0.0], [7253, 7322, 0.0], [7322, 7351, 0.0], [7351, 7435, 0.0], [7435, 7511, 0.0], [7511, 7559, 0.0], [7559, 7621, 0.0], [7621, 7685, 0.0], [7685, 7722, 0.0], [7722, 7857, 0.0], [7857, 7917, 0.0], [7917, 7951, 0.0], [7951, 8104, 0.0], [8104, 8136, 0.0], [8136, 8251, 0.0], [8251, 8286, 0.0], [8286, 8405, 0.0], [8405, 8439, 0.0], [8439, 8589, 0.0], [8589, 8633, 0.0], [8633, 8850, 0.0], [8850, 8891, 0.0], [8891, 9040, 0.0], [9040, 9748, 0.0], [9748, 9768, 0.0], [9768, 9780, 0.0], [9780, 9819, 0.0], [9819, 9874, 0.0], [9874, 9919, 0.0]], "rps_lines_num_words": [[0, 49, 7.0], [49, 111, 8.0], [111, 760, 106.0], [760, 783, 4.0], [783, 810, 5.0], [810, 844, 5.0], [844, 859, 3.0], [859, 878, 3.0], [878, 894, 3.0], [894, 910, 3.0], [910, 941, 7.0], [941, 949, 2.0], [949, 958, 2.0], [958, 979, 3.0], [979, 1156, 29.0], [1156, 1265, 19.0], [1265, 1391, 19.0], [1391, 1501, 18.0], [1501, 1703, 34.0], [1703, 1853, 27.0], [1853, 2006, 22.0], [2006, 2119, 23.0], [2119, 2307, 30.0], [2307, 2332, 4.0], [2332, 2974, 104.0], [2974, 3006, 4.0], [3006, 3740, 115.0], [3740, 4520, 123.0], [4520, 5303, 130.0], [5303, 6056, 124.0], [6056, 6774, 113.0], [6774, 6803, 6.0], [6803, 6883, 14.0], [6883, 6972, 13.0], [6972, 7039, 9.0], [7039, 7106, 9.0], [7106, 7171, 9.0], [7171, 7253, 13.0], [7253, 7322, 11.0], [7322, 7351, 4.0], [7351, 7435, 16.0], [7435, 7511, 11.0], [7511, 7559, 7.0], [7559, 7621, 11.0], [7621, 7685, 7.0], [7685, 7722, 7.0], [7722, 7857, 23.0], [7857, 7917, 9.0], [7917, 7951, 7.0], [7951, 8104, 26.0], [8104, 8136, 6.0], [8136, 8251, 25.0], [8251, 8286, 6.0], [8286, 8405, 20.0], [8405, 8439, 6.0], [8439, 8589, 24.0], [8589, 8633, 7.0], [8633, 8850, 34.0], [8850, 8891, 7.0], [8891, 9040, 23.0], [9040, 9748, 110.0], [9748, 9768, 3.0], [9768, 9780, 2.0], [9780, 9819, 6.0], [9819, 9874, 9.0], [9874, 9919, 8.0]], "rps_lines_numerical_chars_fraction": [[0, 49, 0.06382979], [49, 111, 0.05172414], [111, 760, 0.01424051], [760, 783, 0.18181818], [783, 810, 0.16], [810, 844, 0.03030303], [844, 859, 0.07142857], [859, 878, 0.05555556], [878, 894, 0.07142857], [894, 910, 0.06666667], [910, 941, 0.13333333], [941, 949, 0.16666667], [949, 958, 0.28571429], [958, 979, 0.15], [979, 1156, 0.01734104], [1156, 1265, 0.02884615], [1265, 1391, 0.0], [1391, 1501, 0.0], [1501, 1703, 0.0], [1703, 1853, 0.0], [1853, 2006, 0.02027027], [2006, 2119, 0.0], [2119, 2307, 0.01639344], [2307, 2332, 0.13043478], [2332, 2974, 0.00955414], [2974, 3006, 0.0], [3006, 3740, 0.00970874], [3740, 4520, 0.0078329], [4520, 5303, 0.00390117], [5303, 6056, 0.00814111], [6056, 6774, 0.00424929], [6774, 6803, 0.10714286], [6803, 6883, 0.03846154], [6883, 6972, 0.03488372], [6972, 7039, 0.0], [7039, 7106, 0.0], [7106, 7171, 0.0], [7171, 7253, 0.0], [7253, 7322, 0.0], [7322, 7351, 0.0], [7351, 7435, 0.0], [7435, 7511, 0.0], [7511, 7559, 0.0], [7559, 7621, 0.0], [7621, 7685, 0.0], [7685, 7722, 0.0], [7722, 7857, 0.0], [7857, 7917, 0.05172414], [7917, 7951, 0.09375], [7951, 8104, 0.02], [8104, 8136, 0.1], [8136, 8251, 0.02702703], [8251, 8286, 0.0], [8286, 8405, 0.0], [8405, 8439, 0.09375], [8439, 8589, 0.02083333], [8589, 8633, 0.0], [8633, 8850, 0.01408451], [8850, 8891, 0.07692308], [8891, 9040, 0.02083333], [9040, 9748, 0.01315789], [9748, 9768, 0.15789474], [9768, 9780, 0.27272727], [9780, 9819, 0.0], [9819, 
9874, 0.0], [9874, 9919, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 49, 0.0], [49, 111, 0.0], [111, 760, 0.0], [760, 783, 0.0], [783, 810, 0.0], [810, 844, 0.0], [844, 859, 0.0], [859, 878, 0.0], [878, 894, 0.0], [894, 910, 0.0], [910, 941, 0.0], [941, 949, 0.0], [949, 958, 0.0], [958, 979, 0.0], [979, 1156, 0.0], [1156, 1265, 0.0], [1265, 1391, 0.0], [1391, 1501, 0.0], [1501, 1703, 0.0], [1703, 1853, 0.0], [1853, 2006, 0.0], [2006, 2119, 0.0], [2119, 2307, 0.0], [2307, 2332, 0.0], [2332, 2974, 0.0], [2974, 3006, 0.0], [3006, 3740, 0.0], [3740, 4520, 0.0], [4520, 5303, 0.0], [5303, 6056, 0.0], [6056, 6774, 0.0], [6774, 6803, 0.0], [6803, 6883, 0.0], [6883, 6972, 0.0], [6972, 7039, 0.0], [7039, 7106, 0.0], [7106, 7171, 0.0], [7171, 7253, 0.0], [7253, 7322, 0.0], [7322, 7351, 0.0], [7351, 7435, 0.0], [7435, 7511, 0.0], [7511, 7559, 0.0], [7559, 7621, 0.0], [7621, 7685, 0.0], [7685, 7722, 0.0], [7722, 7857, 0.0], [7857, 7917, 0.0], [7917, 7951, 0.0], [7951, 8104, 0.0], [8104, 8136, 0.0], [8136, 8251, 0.0], [8251, 8286, 0.0], [8286, 8405, 0.0], [8405, 8439, 0.0], [8439, 8589, 0.0], [8589, 8633, 0.0], [8633, 8850, 0.0], [8850, 8891, 0.0], [8891, 9040, 0.0], [9040, 9748, 0.0], [9748, 9768, 0.0], [9768, 9780, 0.0], [9780, 9819, 0.0], [9819, 9874, 0.0], [9874, 9919, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 49, 0.12244898], [49, 111, 0.12903226], [111, 760, 0.00924499], [760, 783, 0.08695652], [783, 810, 0.0], [810, 844, 0.02941176], [844, 859, 0.06666667], [859, 878, 0.05263158], [878, 894, 0.0625], [894, 910, 0.0625], [910, 941, 0.16129032], [941, 949, 0.125], [949, 958, 0.11111111], [958, 979, 0.0952381], [979, 1156, 0.01694915], [1156, 1265, 0.00917431], [1265, 1391, 0.01587302], [1391, 1501, 0.04545455], [1501, 1703, 0.01485149], [1703, 1853, 0.01333333], [1853, 2006, 0.0130719], [2006, 2119, 0.01769912], [2119, 2307, 0.01595745], [2307, 2332, 0.0], [2332, 2974, 0.00778816], [2974, 3006, 0.03125], [3006, 3740, 0.00953678], [3740, 4520, 0.01153846], [4520, 5303, 0.00893997], [5303, 6056, 0.00929615], [6056, 6774, 0.01114206], [6774, 6803, 0.17241379], [6803, 6883, 0.025], [6883, 6972, 0.0], [6972, 7039, 0.01492537], [7039, 7106, 0.01492537], [7106, 7171, 0.01538462], [7171, 7253, 0.01219512], [7253, 7322, 0.01449275], [7322, 7351, 0.03448276], [7351, 7435, 0.01190476], [7435, 7511, 0.01315789], [7511, 7559, 0.02083333], [7559, 7621, 0.01612903], [7621, 7685, 0.015625], [7685, 7722, 0.02702703], [7722, 7857, 0.00740741], [7857, 7917, 0.03333333], [7917, 7951, 0.05882353], [7951, 8104, 0.01960784], [8104, 8136, 0.0625], [8136, 8251, 0.02608696], [8251, 8286, 0.05714286], [8286, 8405, 0.00840336], [8405, 8439, 0.05882353], [8439, 8589, 0.02], [8589, 8633, 0.02272727], [8633, 8850, 0.01382488], [8850, 8891, 0.04878049], [8891, 9040, 0.02013423], [9040, 9748, 0.01412429], [9748, 9768, 0.1], [9768, 9780, 0.08333333], [9780, 9819, 0.12820513], [9819, 9874, 0.10909091], [9874, 9919, 0.17777778]], "rps_doc_ml_palm_score": [[0, 9919, 0.22293442]], "rps_doc_ml_wikipedia_score": [[0, 9919, null]], "rps_doc_ml_wikiref_score": [[0, 9919, 0.14227986]], "rps_doc_books_importance": [[0, 9919, -541.95689403]], "rps_doc_openwebtext_importance": [[0, 9919, 25.82343369]], "rps_doc_wikipedia_importance": [[0, 9919, -283.89148365]], "rps_doc_num_sentences": [[0, 9919, 91.0]]} |
Busy Philipps Becomes Late Night’s Newest Host
Busy Philipps will soon become the newest member of an elite group: late-night talk show hosts – an even smaller number of which are women. The actress and social media phenom will helm her own E! show, not unlike Chelsea Handler before her, whose Chelsea Lately combined panel discussions, one-on-one interviews, and sketches. The Cut broke the news on Tuesday, when we learned that Philipps will join the distinguished ranks of Samantha Bee, Robin Thede, and Sarah Silverman as female late-night hosts.
Philipps first rose to prominence portraying teenaged characters, co-starring in Paul Feig‘s Freaks And Geeks and Dawson’s Creek. She appeared on TBS’s Cougar Town from 2009 to 2015 alongside Friends alumna Courteney Cox. She can currently be seen on HBO’s Vice Principals and in I Feel Pretty, which was co-written and co-directed by her husband, Marc Silverstein. She is also known for her popular Instagram account, especially the candid Stories that broadcast moments other celebrities wouldn’t dare publicize.
Joan Rivers blazed a trail as the first woman to host a late-night network television talk show. The Late Show Starring Joan Rivers aired on Fox from 1986 to 1987. Its final episode featured a then-fledgling comedian named Chris Rock.
A Eulogy For Alexa Netflix’s “Unbreakable Kimmy Schmidt” To End After Fourth... | 2023-14/0000/en_head.json.gz/3632 | {"url": "https://daily.wicf.com/busy-philipps-becomes-late-nights-newest-host/", "partition": "head_middle", "language": "en", "source_domain": "daily.wicf.com", "date_download": "2023-03-20T09:36:33Z", "digest": "sha1:USEUGMTUIWERACTM2MFRKL236A6O4AN7"} | {"ccnet_length": [[0, 1413, 1413.0]], "ccnet_original_length": [[0, 1413, 2876.0]], "ccnet_nlines": [[0, 1413, 7.0]], "ccnet_original_nlines": [[0, 1413, 89.0]], "ccnet_language_score": [[0, 1413, 0.94]], "ccnet_perplexity": [[0, 1413, 164.5]], "ccnet_bucket": [[0, 1413, 0.0]], "rps_doc_curly_bracket": [[0, 1413, 0.0]], "rps_doc_ldnoobw_words": [[0, 1413, 0.0]], "rps_doc_lorem_ipsum": [[0, 1413, 0.0]], "rps_doc_stop_word_fraction": [[0, 1413, 0.28975265]], "rps_doc_ut1_blacklist": [[0, 1413, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 1413, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 1413, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 1413, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 1413, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 1413, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 1413, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 1413, 0.0207433]], "rps_doc_frac_chars_top_3gram": [[0, 1413, 0.0]], "rps_doc_frac_chars_top_4gram": [[0, 1413, 0.0]], "rps_doc_frac_all_caps_words": [[0, 1413, 0.01766784]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 1413, 0.14285714]], "rps_doc_frac_no_alph_words": [[0, 1413, 0.17667845]], "rps_doc_frac_unique_words": [[0, 1413, 0.74208145]], "rps_doc_mean_word_length": [[0, 1413, 5.23529412]], "rps_doc_symbol_to_word_ratio": [[0, 1413, 0.00353357]], "rps_doc_unigram_entropy": [[0, 1413, 4.93309333]], "rps_doc_word_count": [[0, 1413, 221.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 16, 0.0], [16, 63, 0.0], [63, 568, 1.0], [568, 1083, 1.0], [1083, 1318, 1.0], [1318, 1334, 0.0], [1334, 1413, 1.0]], "rps_lines_javascript_counts": [[0, 16, 0.0], [16, 63, 0.0], [63, 568, 0.0], [568, 1083, 0.0], [1083, 1318, 0.0], [1318, 1334, 0.0], [1334, 1413, 0.0]], "rps_lines_num_words": [[0, 16, 1.0], [16, 63, 7.0], [63, 568, 82.0], [568, 1083, 78.0], [1083, 1318, 40.0], [1318, 1334, 1.0], [1334, 1413, 12.0]], "rps_lines_numerical_chars_fraction": [[0, 16, 0.0], [16, 63, 0.0], [63, 568, 0.0], [568, 1083, 0.01590457], [1083, 1318, 0.0349345], [1318, 1334, 0.0], [1334, 1413, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 16, 0.0], [16, 63, 0.0], [63, 568, 0.0], [568, 1083, 0.0], [1083, 1318, 0.0], [1318, 1334, 0.0], [1334, 1413, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 16, 0.125], [16, 63, 0.14893617], [63, 568, 0.03564356], [568, 1083, 0.06019417], [1083, 1318, 0.05106383], [1318, 1334, 0.0625], [1334, 1413, 0.15189873]], "rps_doc_ml_palm_score": [[0, 1413, 0.98877805]], "rps_doc_ml_wikipedia_score": [[0, 1413, null]], "rps_doc_ml_wikiref_score": [[0, 1413, 0.76958674]], "rps_doc_books_importance": [[0, 1413, -79.20836325]], "rps_doc_openwebtext_importance": [[0, 1413, 17.42028076]], "rps_doc_wikipedia_importance": [[0, 1413, -25.82516554]], "rps_doc_num_sentences": [[0, 1413, 12.0]]} |
Gunmen kidnap police officer in Edo
A senior police officer with the Edo State Police Command, identified as Clement Amoko, was on Sunday kidnapped by gunmen in his residence in Benin, Edo State.
Spokesperson of the Edo State Police Command, Princewill Osaigbovo, confirmed the incident to newsmen.
It was learnt that the gunmen broke into the residence of the victim in the early hours of Sunday and took him away in the presence of his wife and children.
A family member said the abductors have already made contact with them, demanding ransom.
He did not specify the amount involved.
The police spokesperson said the command has commenced an operation to rescue the officer. | 2023-14/0000/en_head.json.gz/3633 | {"url": "https://dailycorrespondents.ng/2021/03/gunmen-kidnap-police-officer-in-edo/", "partition": "head_middle", "language": "en", "source_domain": "dailycorrespondents.ng", "date_download": "2023-03-20T10:21:22Z", "digest": "sha1:F3PIWGDXLCFJIS3LP3UXK6OHIYY6JQNO"} | {"ccnet_length": [[0, 677, 677.0]], "ccnet_original_length": [[0, 677, 4717.0]], "ccnet_nlines": [[0, 677, 7.0]], "ccnet_original_nlines": [[0, 677, 135.0]], "ccnet_language_score": [[0, 677, 0.99]], "ccnet_perplexity": [[0, 677, 269.8]], "ccnet_bucket": [[0, 677, 0.0]], "rps_doc_curly_bracket": [[0, 677, 0.0]], "rps_doc_ldnoobw_words": [[0, 677, 0.0]], "rps_doc_lorem_ipsum": [[0, 677, 0.0]], "rps_doc_stop_word_fraction": [[0, 677, 0.392]], "rps_doc_ut1_blacklist": [[0, 677, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 677, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 677, 0.08679928]], "rps_doc_frac_chars_dupe_6grams": [[0, 677, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 677, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 677, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 677, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 677, 0.04339964]], "rps_doc_frac_chars_top_3gram": [[0, 677, 0.039783]], "rps_doc_frac_chars_top_4gram": [[0, 677, 0.06148282]], "rps_doc_frac_all_caps_words": [[0, 677, 0.016]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 677, 0.0]], "rps_doc_frac_no_alph_words": [[0, 677, 0.096]], "rps_doc_frac_unique_words": [[0, 677, 0.61061947]], "rps_doc_mean_word_length": [[0, 677, 4.89380531]], "rps_doc_symbol_to_word_ratio": [[0, 677, 0.0]], "rps_doc_unigram_entropy": [[0, 677, 3.95238457]], "rps_doc_word_count": [[0, 677, 113.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 36, 0.0], [36, 196, 1.0], [196, 299, 1.0], [299, 457, 1.0], [457, 547, 1.0], [547, 587, 1.0], [587, 677, 1.0]], "rps_lines_javascript_counts": [[0, 36, 0.0], [36, 196, 0.0], [196, 299, 0.0], [299, 457, 0.0], [457, 547, 0.0], [547, 587, 0.0], [587, 677, 0.0]], "rps_lines_num_words": [[0, 36, 6.0], [36, 196, 27.0], [196, 299, 14.0], [299, 457, 31.0], [457, 547, 14.0], [547, 587, 7.0], [587, 677, 14.0]], "rps_lines_numerical_chars_fraction": [[0, 36, 0.0], [36, 196, 0.0], [196, 299, 0.0], [299, 457, 0.0], [457, 547, 0.0], [547, 587, 0.0], [587, 677, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 36, 0.0], [36, 196, 0.0], [196, 299, 0.0], [299, 457, 0.0], [457, 547, 0.0], [547, 587, 0.0], [587, 677, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 36, 0.05555556], [36, 196, 0.06875], [196, 299, 0.06796117], [299, 457, 0.01265823], [457, 547, 0.01111111], [547, 587, 0.025], [587, 677, 0.01111111]], "rps_doc_ml_palm_score": [[0, 677, 0.95886749]], "rps_doc_ml_wikipedia_score": [[0, 677, null]], "rps_doc_ml_wikiref_score": [[0, 677, 0.68629497]], "rps_doc_books_importance": [[0, 677, 19.9741799]], "rps_doc_openwebtext_importance": [[0, 677, 25.59075338]], "rps_doc_wikipedia_importance": [[0, 677, 26.43426908]], "rps_doc_num_sentences": [[0, 677, 6.0]]} |
1802 Glenbrook Drive, Irving sold by Garrett Shane Dickey and Sue S. Dickey (attorney-in-fact)
The three-bedroom, two-bathroom home at 1802 Glenbrook Drive, Irving, was sold on May 26, 2021, by Garrett Shane Dickey and Sue S. Dickey (attorney-in-fact). The property is appraised at $144,970. The buyer was Elizabeth Dreisbach.
The amount paid for property taxes on this property in 2018 was $4,438.86, 3.06% of the appraised value of the home.
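The percentage quoted in these listings is simply the tax paid divided by the appraised value. A minimal sketch of that arithmetic, using the figures from this article (the function name is ours, for illustration):

def effective_tax_rate(tax_paid, appraised_value):
    """Property tax paid as a percentage of the appraised value."""
    return tax_paid / appraised_value * 100

# Figures from this article: $4,438.86 in tax on a $144,970 appraisal.
print(round(effective_tax_rate(4438.86, 144970), 2))  # -> 3.06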
This home last sold on May 17, 2021.
This property is one of 150 sales in Irving in May 2021.
According to the Texas Secretary of State, Garrett Shane Dickey has been registered to vote since 2012.
The Texas Secretary of State data shows that there were five registered voters residing at 1802 Glenbrook Drive, Irving as of Oct. 10, 2020: Kayla Nicole Zimmerman, 28, Stacie Ann Zimmerman, 45, Garrett Shane Dickey, 44, Jackie M Hilton, 78, and Judy L Hilton, 79. | 2023-14/0000/en_head.json.gz/3634 | {"url": "https://dallasexpress.com/uncategorized/1802-glenbrook-drive-irving-sold-by-garrett-shane-dickey-and-sue-s-dickey-attorney-in-fact/", "partition": "head_middle", "language": "en", "source_domain": "dallasexpress.com", "date_download": "2023-03-20T10:49:44Z", "digest": "sha1:2JF7KT7GMB4NV4QZPAV2J3LBKNS6QTWY"} | {"ccnet_length": [[0, 904, 904.0]], "ccnet_original_length": [[0, 904, 3637.0]], "ccnet_nlines": [[0, 904, 7.0]], "ccnet_original_nlines": [[0, 904, 149.0]], "ccnet_language_score": [[0, 904, 0.96]], "ccnet_perplexity": [[0, 904, 291.6]], "ccnet_bucket": [[0, 904, 0.0]], "rps_doc_curly_bracket": [[0, 904, 0.0]], "rps_doc_ldnoobw_words": [[0, 904, 0.0]], "rps_doc_lorem_ipsum": [[0, 904, 0.0]], "rps_doc_stop_word_fraction": [[0, 904, 0.22815534]], "rps_doc_ut1_blacklist": [[0, 904, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 904, 0.14164306]], "rps_doc_frac_chars_dupe_5grams": [[0, 904, 0.28328612]], "rps_doc_frac_chars_dupe_6grams": [[0, 904, 0.14164306]], "rps_doc_frac_chars_dupe_7grams": [[0, 904, 0.14164306]], "rps_doc_frac_chars_dupe_8grams": [[0, 904, 0.14164306]], "rps_doc_frac_chars_dupe_9grams": [[0, 904, 0.14164306]], "rps_doc_frac_chars_top_2gram": [[0, 904, 0.06798867]], "rps_doc_frac_chars_top_3gram": [[0, 904, 0.101983]], "rps_doc_frac_chars_top_4gram": [[0, 904, 0.101983]], "rps_doc_frac_all_caps_words": [[0, 904, 0.01941748]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 904, 0.0]], "rps_doc_frac_no_alph_words": [[0, 904, 0.33495146]], "rps_doc_frac_unique_words": [[0, 904, 0.55844156]], "rps_doc_mean_word_length": [[0, 904, 4.58441558]], "rps_doc_symbol_to_word_ratio": [[0, 904, 0.0]], "rps_doc_unigram_entropy": [[0, 904, 4.25003662]], "rps_doc_word_count": [[0, 904, 154.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 95, 0.0], [95, 325, 1.0], [325, 442, 1.0], [442, 479, 1.0], [479, 536, 1.0], [536, 640, 1.0], [640, 904, 1.0]], "rps_lines_javascript_counts": [[0, 95, 0.0], [95, 325, 0.0], [325, 442, 0.0], [442, 479, 0.0], [479, 536, 0.0], [536, 640, 0.0], [640, 904, 0.0]], "rps_lines_num_words": [[0, 95, 14.0], [95, 325, 37.0], [325, 442, 21.0], [442, 479, 8.0], [479, 536, 12.0], [536, 640, 17.0], [640, 904, 45.0]], "rps_lines_numerical_chars_fraction": [[0, 95, 0.04545455], [95, 325, 0.07407407], [325, 442, 0.11926606], [442, 479, 0.17647059], [479, 536, 0.12727273], [536, 640, 0.03960396], [640, 904, 0.08]], "rps_lines_start_with_bulletpoint": [[0, 95, 0.0], [95, 325, 0.0], [325, 442, 0.0], [442, 479, 0.0], [479, 536, 0.0], [536, 640, 0.0], [640, 904, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 95, 0.09473684], [95, 325, 0.06521739], [325, 442, 0.00854701], [442, 479, 0.05405405], [479, 536, 0.05263158], [536, 640, 0.06730769], [640, 904, 0.08712121]], "rps_doc_ml_palm_score": [[0, 904, 0.00528133]], "rps_doc_ml_wikipedia_score": [[0, 904, null]], "rps_doc_ml_wikiref_score": [[0, 904, 0.22214633]], "rps_doc_books_importance": [[0, 904, -77.61939781]], "rps_doc_openwebtext_importance": [[0, 904, -18.63177112]], "rps_doc_wikipedia_importance": [[0, 904, 17.57650078]], "rps_doc_num_sentences": [[0, 904, 13.0]]} |
Dqi LLC sells 513 Camilla Lane, Garland
On June 1, 2021, Yanil Ovalle purchased a home at 513 Camilla Lane, Garland from Dqi LLC. The property is appraised at $121,590.
The property tax paid for this property in 2018 was $4,245.32. This is 3.49% of the appraised value of the home.
The last time this home sold was Feb. 23, 2021.
In June 2021, 49 properties sold in Garland.
According to the Texas Secretary of State, there was one registered voter residing at 513 Camilla Lane, Garland as of Oct. 10, 2020: Suzanne Linda Simpson, 66. | 2023-14/0000/en_head.json.gz/3635 | {"url": "https://dallasexpress.com/uncategorized/dqi-llc-sells-513-camilla-lane-garland/", "partition": "head_middle", "language": "en", "source_domain": "dallasexpress.com", "date_download": "2023-03-20T09:18:19Z", "digest": "sha1:WBXG3L6XBNRH7CE4TRSXZ3SLZNEOSIII"} | {"ccnet_length": [[0, 534, 534.0]], "ccnet_original_length": [[0, 534, 3194.0]], "ccnet_nlines": [[0, 534, 6.0]], "ccnet_original_nlines": [[0, 534, 148.0]], "ccnet_language_score": [[0, 534, 0.95]], "ccnet_perplexity": [[0, 534, 258.1]], "ccnet_bucket": [[0, 534, 0.0]], "rps_doc_curly_bracket": [[0, 534, 0.0]], "rps_doc_ldnoobw_words": [[0, 534, 0.0]], "rps_doc_lorem_ipsum": [[0, 534, 0.0]], "rps_doc_stop_word_fraction": [[0, 534, 0.22047244]], "rps_doc_ut1_blacklist": [[0, 534, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 534, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 534, 0.11165049]], "rps_doc_frac_chars_dupe_6grams": [[0, 534, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 534, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 534, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 534, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 534, 0.07281553]], "rps_doc_frac_chars_top_3gram": [[0, 534, 0.10194175]], "rps_doc_frac_chars_top_4gram": [[0, 534, 0.15291262]], "rps_doc_frac_all_caps_words": [[0, 534, 0.01574803]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 534, 0.0]], "rps_doc_frac_no_alph_words": [[0, 534, 0.37007874]], "rps_doc_frac_unique_words": [[0, 534, 0.61458333]], "rps_doc_mean_word_length": [[0, 534, 4.29166667]], "rps_doc_symbol_to_word_ratio": [[0, 534, 0.0]], "rps_doc_unigram_entropy": [[0, 534, 3.90687896]], "rps_doc_word_count": [[0, 534, 96.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 40, 0.0], [40, 169, 1.0], [169, 282, 1.0], [282, 330, 1.0], [330, 375, 1.0], [375, 534, 1.0]], "rps_lines_javascript_counts": [[0, 40, 0.0], [40, 169, 0.0], [169, 282, 0.0], [282, 330, 0.0], [330, 375, 0.0], [375, 534, 0.0]], "rps_lines_num_words": [[0, 40, 7.0], [40, 169, 23.0], [169, 282, 21.0], [282, 330, 10.0], [330, 375, 8.0], [375, 534, 27.0]], "rps_lines_numerical_chars_fraction": [[0, 40, 0.07894737], [40, 169, 0.11570248], [169, 282, 0.12380952], [282, 330, 0.13636364], [330, 375, 0.14285714], [375, 534, 0.07236842]], "rps_lines_start_with_bulletpoint": [[0, 40, 0.0], [40, 169, 0.0], [169, 282, 0.0], [282, 330, 0.0], [330, 375, 0.0], [375, 534, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 40, 0.175], [40, 169, 0.09302326], [169, 282, 0.01769912], [282, 330, 0.04166667], [330, 375, 0.06666667], [375, 534, 0.06918239]], "rps_doc_ml_palm_score": [[0, 534, 0.08793932]], "rps_doc_ml_wikipedia_score": [[0, 534, null]], "rps_doc_ml_wikiref_score": [[0, 534, 0.01533669]], "rps_doc_books_importance": [[0, 534, -42.4542919]], "rps_doc_openwebtext_importance": [[0, 534, -11.47124282]], "rps_doc_wikipedia_importance": [[0, 534, 1.09392652]], "rps_doc_num_sentences": [[0, 534, 11.0]]} |
5 Seconds to Better Results on Facebook
It’s the simple stuff that works. Don’t you agree? If you’re a regular reader of Next Level Blogger (I hope you are, and if you’re not…well, I’m just gonna throw an opt-in form right here, special…just for you!)
Anyway, if you’re a regular reader here, you know that I’m not a tech-oriented guy. The fact that I’ve built successful, profitable websites should serve as PROOF to anyone who needs it…that you simply DO NOT have to be an SEO guru or a highly talented webmaster in order to make it in this business. That’s not to say this stuff is always a cake walk, but if you think you have to be really techy in order to succeed online…it’s just not true.
Much of What We Think is Too Techy or Out of Our League, Isn’t
In fact, I’ve just recently started doing videos, and one of my big excuses for not doing them was because I didn’t know how to edit them, and I was afraid of the results not being professional enough. As soon as I decided to throw out my excuses, guess what happened? I learned that video is actually really frickin easy, there’s seriously not anything to worry about, it’s actually a lot of fun, and it’s a great way to connect with your audience.
It’s also a great way to bust some content out really quickly. If you’re not using video, I really recommend you dig in for a second and just do it. Get a flip or something and just start it, because it’s a great way to add some really valuable content to your site. Does this mean that my videos are all professional and top notch? Actually, no. My videos (one is up already, several more will post over the next few weeks) are uber-simple, and that’s the point…it’s the SIMPLE stuff that works.
Get More Conversation, More Leads and More Business on Facebook
Speaking of uber-simple, I want to give you a heads up about something I’ve been doing since the beginning with my Facebook profile that has always really worked for me. Honestly, it’s classic Chris Russell stuff…it requires no talent at all, just diligence. When I get friend requests, I (get this) respond to them! This is what I say…verbatim:
Hey thanks for the request! You on FB for business or fun?
I know, I too am reeling from the genius of it all 🙂 How long does this take me? About 5 seconds. What is the net result? It initiates a conversation. That’s the entire point, correct? I mean, it’s THE reason you’re on Facebook, so why not actually converse with these people?
I’ve altered my response from time to time, but the one I’m sharing here is the one that pulls the best. There’s no pitch, no stupid link to my site (I have to admit I’ve done that, and it’s just dumb), just a hello and an invitation to conversation.
Guess what…you most likely got that friend request for a reason. They’re already interested in you on some level. Granted, it’s probably a very superficial level, but that’s cool. Build on it. We all know that clicking “accept” does nothing. Nothing. Who cares if you have a new contact on Facebook if there’s not any actual contact? Make a reply…and I guarantee you a good percentage of them will respond, and you can nurture that.
At this point I have a little over two thousand contacts on Facebook, so I’ve done this more than once or twice. It works. It’s not the genius of this technique that makes it work…it’s the simplicity of it. Coupled with the fact that not very many others do this…believe it or not it’s almost always simple things like this that can put you over the top.
It’s the little things like this that add up and show your audience that you actually care about them. And when someone really gets that…gets the fact that you actually care about them, that’s when you have an advocate. Is 5 seconds worth it? What do you think? | 2023-14/0000/en_head.json.gz/3636 | {"url": "https://dangeroustactics.com/tag/results-on-facebook/", "partition": "head_middle", "language": "en", "source_domain": "dangeroustactics.com", "date_download": "2023-03-20T09:15:28Z", "digest": "sha1:AAKP3SJCBPMFZWGLDBGQC5ARLC7A2D36"} | {"ccnet_length": [[0, 3772, 3772.0]], "ccnet_original_length": [[0, 3772, 4300.0]], "ccnet_nlines": [[0, 3772, 15.0]], "ccnet_original_nlines": [[0, 3772, 32.0]], "ccnet_language_score": [[0, 3772, 0.94]], "ccnet_perplexity": [[0, 3772, 269.7]], "ccnet_bucket": [[0, 3772, 0.0]], "rps_doc_curly_bracket": [[0, 3772, 0.0]], "rps_doc_ldnoobw_words": [[0, 3772, 0.0]], "rps_doc_lorem_ipsum": [[0, 3772, 0.0]], "rps_doc_stop_word_fraction": [[0, 3772, 0.50334076]], "rps_doc_ut1_blacklist": [[0, 3772, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 3772, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 3772, 0.05898123]], "rps_doc_frac_chars_dupe_6grams": [[0, 3772, 0.01876676]], "rps_doc_frac_chars_dupe_7grams": [[0, 3772, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 3772, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 3772, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 3772, 0.02010724]], "rps_doc_frac_chars_top_3gram": [[0, 3772, 0.01105898]], "rps_doc_frac_chars_top_4gram": [[0, 3772, 0.01105898]], "rps_doc_frac_all_caps_words": [[0, 3772, 0.03563474]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 3772, 0.0]], "rps_doc_frac_no_alph_words": [[0, 3772, 0.16815145]], "rps_doc_frac_unique_words": [[0, 3772, 0.41151079]], "rps_doc_mean_word_length": [[0, 3772, 4.29352518]], "rps_doc_symbol_to_word_ratio": [[0, 3772, 0.01336303]], "rps_doc_unigram_entropy": [[0, 3772, 5.16838701]], "rps_doc_word_count": [[0, 3772, 695.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 20, 0.0], [20, 60, 0.0], [60, 272, 0.0], [272, 716, 1.0], [716, 779, 0.0], [779, 1229, 1.0], [1229, 1726, 1.0], [1726, 1790, 0.0], [1790, 2136, 0.0], [2136, 2195, 1.0], [2195, 2472, 1.0], [2472, 2723, 1.0], [2723, 3156, 1.0], [3156, 3511, 1.0], [3511, 3772, 1.0]], "rps_lines_javascript_counts": [[0, 20, 0.0], [20, 60, 0.0], [60, 272, 0.0], [272, 716, 0.0], [716, 779, 0.0], [779, 1229, 0.0], [1229, 1726, 0.0], [1726, 1790, 0.0], [1790, 2136, 0.0], [2136, 2195, 0.0], [2195, 2472, 0.0], [2472, 2723, 0.0], [2723, 3156, 0.0], [3156, 3511, 0.0], [3511, 3772, 0.0]], "rps_lines_num_words": [[0, 20, 3.0], [20, 60, 7.0], [60, 272, 39.0], [272, 716, 84.0], [716, 779, 14.0], [779, 1229, 83.0], [1229, 1726, 93.0], [1726, 1790, 10.0], [1790, 2136, 59.0], [2136, 2195, 12.0], [2195, 2472, 52.0], [2472, 2723, 49.0], [2723, 3156, 75.0], [3156, 3511, 67.0], [3511, 3772, 48.0]], "rps_lines_numerical_chars_fraction": [[0, 20, 0.0], [20, 60, 0.02564103], [60, 272, 0.0], [272, 716, 0.0], [716, 779, 0.0], [779, 1229, 0.0], [1229, 1726, 0.0], [1726, 1790, 0.0], [1790, 2136, 0.0], [2136, 2195, 0.0], [2195, 2472, 0.0037594], [2472, 2723, 0.0], [2723, 3156, 0.0], [3156, 3511, 0.0], [3511, 3772, 0.00390625]], "rps_lines_start_with_bulletpoint": [[0, 20, 0.0], [20, 60, 0.0], [60, 272, 0.0], [272, 716, 0.0], [716, 779, 0.0], [779, 1229, 0.0], [1229, 1726, 0.0], [1726, 1790, 0.0], [1790, 2136, 0.0], [2136, 2195, 0.0], [2195, 2472, 0.0], [2472, 2723, 0.0], [2723, 3156, 0.0], [3156, 3511, 0.0], [3511, 3772, 0.0]], 
"rps_lines_uppercase_letter_fraction": [[0, 20, 0.0], [20, 60, 0.1], [60, 272, 0.03773585], [272, 716, 0.04054054], [716, 779, 0.15873016], [779, 1229, 0.01555556], [1229, 1726, 0.02615694], [1726, 1790, 0.125], [1790, 2136, 0.03468208], [2136, 2195, 0.06779661], [2195, 2472, 0.0433213], [2472, 2723, 0.01992032], [2723, 3156, 0.02309469], [3156, 3511, 0.01971831], [3511, 3772, 0.01532567]], "rps_doc_ml_palm_score": [[0, 3772, 0.17359525]], "rps_doc_ml_wikipedia_score": [[0, 3772, null]], "rps_doc_ml_wikiref_score": [[0, 3772, 0.06947017]], "rps_doc_books_importance": [[0, 3772, -356.42675816]], "rps_doc_openwebtext_importance": [[0, 3772, 68.25306422]], "rps_doc_wikipedia_importance": [[0, 3772, -776.53496388]], "rps_doc_num_sentences": [[0, 3772, 44.0]]} |
Celebrating the Arts and Crafts Movement
Grueby Faience, set of 4 abstract Jungle tiles (each 12” x 12”), designed by Addison LeBoutelier
I like to refer to this event as the best kept secret in the world of design: the annual National Arts and Crafts Conference which takes place at the Grove Park Inn on the mountain in Asheville, NC. This historic hotel, built in 1913 as a mountain retreat, has hosted this conference for the past 35 years. It is where all those passionate about the American Arts and Crafts Movement gather, including dealers, collectors, authors, curators, contemporary producers of craft furniture (such as Door Pottery and Mission Guild Studios), foundations related to the Movement (such as the Stickley Museum at Craftsman Farms and the Roycrofters-At-Large Association, Two Red Roses Foundation), and auction lumineries (such as David Rago and John Toomey). It is a weekend spent diving into history through seminars, selling exhibitions, fund raising events, and book signings.
At this year’s Conference (open on Friday) the premier dealer of American Arts and Crafts, Robert Kaplan, will present after a 19-year absence. Kaplan is responsible for building some of the most impressive museum and private collections devoted to the American Arts and Crafts. In celebrating his comeback, I would like to devote this post to some of the masterpieces included in his upcoming showcase. Kaplan, who is an expert on the American Arts and Crafts Movement as well as a collector, dealer, and friend, began his journey in the mid-1980s. He has since developed an incredible passion for the furniture and decorative arts of the movement, which came to produce the earliest manifestation of modern design in America.
Dard Hunter leaded glass Galleon lamp
Marblehead scenic “Marsh” vase, designed by Sarah Tutt
Gustav Stickley inlaid chair, design attributed to Harvey Ellis (detail).
Gustav Stickley inlaid sette, design attributed to Harvey Ellis (detail)
Newcomb College Vase.
1 thought on “Celebrating the Arts and Crafts Movement”
DAVID N. EBNER
THANK YOU….DAVIDD | 2023-14/0000/en_head.json.gz/3637 | {"url": "https://daniellaondesign.com/blog/celebrating-the-arts-and-crafts-movement/", "partition": "head_middle", "language": "en", "source_domain": "daniellaondesign.com", "date_download": "2023-03-20T09:32:19Z", "digest": "sha1:HEBUAPIFQO3KIQZFZIAMAP5P45L5TSV6"} | {"ccnet_length": [[0, 2085, 2085.0]], "ccnet_original_length": [[0, 2085, 3096.0]], "ccnet_nlines": [[0, 2085, 12.0]], "ccnet_original_nlines": [[0, 2085, 68.0]], "ccnet_language_score": [[0, 2085, 0.96]], "ccnet_perplexity": [[0, 2085, 233.6]], "ccnet_bucket": [[0, 2085, 0.0]], "rps_doc_curly_bracket": [[0, 2085, 0.0]], "rps_doc_ldnoobw_words": [[0, 2085, 0.0]], "rps_doc_lorem_ipsum": [[0, 2085, 0.0]], "rps_doc_stop_word_fraction": [[0, 2085, 0.31313131]], "rps_doc_ut1_blacklist": [[0, 2085, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 2085, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 2085, 0.09310548]], "rps_doc_frac_chars_dupe_6grams": [[0, 2085, 0.07896288]], "rps_doc_frac_chars_dupe_7grams": [[0, 2085, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 2085, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 2085, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 2085, 0.02887448]], "rps_doc_frac_chars_top_3gram": [[0, 2085, 0.05362404]], "rps_doc_frac_chars_top_4gram": [[0, 2085, 0.04949912]], "rps_doc_frac_all_caps_words": [[0, 2085, 0.02272727]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 2085, 0.0]], "rps_doc_frac_no_alph_words": [[0, 2085, 0.16919192]], "rps_doc_frac_unique_words": [[0, 2085, 0.59393939]], "rps_doc_mean_word_length": [[0, 2085, 5.14242424]], "rps_doc_symbol_to_word_ratio": [[0, 2085, 0.00252525]], "rps_doc_unigram_entropy": [[0, 2085, 4.89835987]], "rps_doc_word_count": [[0, 2085, 330.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 41, 0.0], [41, 138, 0.0], [138, 1007, 1.0], [1007, 1735, 1.0], [1735, 1773, 0.0], [1773, 1828, 0.0], [1828, 1902, 1.0], [1902, 1975, 0.0], [1975, 1997, 1.0], [1997, 2053, 1.0], [2053, 2068, 0.0], [2068, 2085, 0.0]], "rps_lines_javascript_counts": [[0, 41, 0.0], [41, 138, 0.0], [138, 1007, 0.0], [1007, 1735, 0.0], [1735, 1773, 0.0], [1773, 1828, 0.0], [1828, 1902, 0.0], [1902, 1975, 0.0], [1975, 1997, 0.0], [1997, 2053, 0.0], [2053, 2068, 0.0], [2068, 2085, 0.0]], "rps_lines_num_words": [[0, 41, 6.0], [41, 138, 16.0], [138, 1007, 138.0], [1007, 1735, 119.0], [1735, 1773, 6.0], [1773, 1828, 8.0], [1828, 1902, 10.0], [1902, 1975, 10.0], [1975, 1997, 3.0], [1997, 2053, 9.0], [2053, 2068, 3.0], [2068, 2085, 2.0]], "rps_lines_numerical_chars_fraction": [[0, 41, 0.0], [41, 138, 0.05434783], [138, 1007, 0.00713436], [1007, 1735, 0.0084507], [1735, 1773, 0.0], [1773, 1828, 0.0], [1828, 1902, 0.0], [1902, 1975, 0.0], [1975, 1997, 0.0], [1997, 2053, 0.01818182], [2053, 2068, 0.0], [2068, 2085, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 41, 0.0], [41, 138, 0.0], [138, 1007, 0.0], [1007, 1735, 0.0], [1735, 1773, 0.0], [1773, 1828, 0.0], [1828, 1902, 0.0], [1902, 1975, 0.0], [1975, 1997, 0.0], [1997, 2053, 0.0], [2053, 2068, 0.0], [2068, 2085, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 41, 0.09756098], [41, 138, 0.06185567], [138, 1007, 0.04602992], [1007, 1735, 0.02884615], [1735, 1773, 0.07894737], [1773, 1828, 0.07272727], [1828, 1902, 0.05405405], [1902, 1975, 0.05479452], [1975, 1997, 0.13636364], [1997, 2053, 0.07142857], [2053, 2068, 0.73333333], [2068, 2085, 0.82352941]], "rps_doc_ml_palm_score": [[0, 2085, 0.59448367]], "rps_doc_ml_wikipedia_score": [[0, 2085, null]], "rps_doc_ml_wikiref_score": [[0, 2085, 
0.18854797]], "rps_doc_books_importance": [[0, 2085, -79.83111712]], "rps_doc_openwebtext_importance": [[0, 2085, 1.88712554]], "rps_doc_wikipedia_importance": [[0, 2085, 37.71732367]], "rps_doc_num_sentences": [[0, 2085, 14.0]]} |
The Dante Parent Participation Organization (P.P.O.) is a group of parent volunteers that help to organize events and fundraising activities for our school.
The Parent Participation Organization (P.P.O.) is an active and vibrant element of school life. Made up of parent volunteers, the P.P.O. organizes many fundraisers throughout the year to supplement the school’s budget for field trips, provide new resources for the classroom, recognize the good work of teachers and volunteers, and much more. It is the work of the P.P.O., in close collaboration with the Governing Board and the school administration, that adds a different dimension to life at Dante.
How do I join the P.P.O.?
Each September, Dante holds a General Assembly of parents in the school gym, where the members of the P.P.O. are chosen. For more information about meetings and how to join the P.P.O., please contact Dante's principal.
Upcoming PPO Meetings
More about volunteering
Learn more about how to volunteer at school on the EMSB website. | 2023-14/0000/en_head.json.gz/3638 | {"url": "https://dante.emsb.qc.ca/dante/parents/resources/ppo", "partition": "head_middle", "language": "en", "source_domain": "dante.emsb.qc.ca", "date_download": "2023-03-20T10:28:41Z", "digest": "sha1:E2VAR23ZGW7A4VZBZWKJCJCBRM3Y3EHV"} | {"ccnet_length": [[0, 1012, 1012.0]], "ccnet_original_length": [[0, 1012, 4269.0]], "ccnet_nlines": [[0, 1012, 7.0]], "ccnet_original_nlines": [[0, 1012, 152.0]], "ccnet_language_score": [[0, 1012, 0.91]], "ccnet_perplexity": [[0, 1012, 207.2]], "ccnet_bucket": [[0, 1012, 0.0]], "rps_doc_curly_bracket": [[0, 1012, 0.0]], "rps_doc_ldnoobw_words": [[0, 1012, 0.0]], "rps_doc_lorem_ipsum": [[0, 1012, 0.0]], "rps_doc_stop_word_fraction": [[0, 1012, 0.35046729]], "rps_doc_ut1_blacklist": [[0, 1012, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 1012, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 1012, 0.09135802]], "rps_doc_frac_chars_dupe_6grams": [[0, 1012, 0.09135802]], "rps_doc_frac_chars_dupe_7grams": [[0, 1012, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 1012, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 1012, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 1012, 0.03703704]], "rps_doc_frac_chars_top_3gram": [[0, 1012, 0.07654321]], "rps_doc_frac_chars_top_4gram": [[0, 1012, 0.08395062]], "rps_doc_frac_all_caps_words": [[0, 1012, 0.10280374]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 1012, 0.0]], "rps_doc_frac_no_alph_words": [[0, 1012, 0.1728972]], "rps_doc_frac_unique_words": [[0, 1012, 0.55214724]], "rps_doc_mean_word_length": [[0, 1012, 4.96932515]], "rps_doc_symbol_to_word_ratio": [[0, 1012, 0.0]], "rps_doc_unigram_entropy": [[0, 1012, 4.14097002]], "rps_doc_word_count": [[0, 1012, 163.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 157, 1.0], [157, 657, 1.0], [657, 683, 1.0], [683, 902, 1.0], [902, 924, 0.0], [924, 948, 0.0], [948, 1012, 1.0]], "rps_lines_javascript_counts": [[0, 157, 0.0], [157, 657, 0.0], [657, 683, 0.0], [683, 902, 0.0], [902, 924, 0.0], [924, 948, 0.0], [948, 1012, 0.0]], "rps_lines_num_words": [[0, 157, 23.0], [157, 657, 80.0], [657, 683, 6.0], [683, 902, 36.0], [902, 924, 3.0], [924, 948, 3.0], [948, 1012, 12.0]], "rps_lines_numerical_chars_fraction": [[0, 157, 0.0], [157, 657, 0.0], [657, 683, 0.0], [683, 902, 0.0], [902, 924, 0.0], [924, 948, 0.0], [948, 1012, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 157, 0.0], [157, 657, 0.0], [657, 683, 0.0], [683, 902, 0.0], [902, 924, 0.0], [924, 948, 0.0], [948, 1012, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 157, 0.05095541], [157, 657, 0.034], [657, 683, 0.19230769], [683, 902, 0.05936073], [902, 924, 0.22727273], [924, 948, 0.04166667], [948, 1012, 0.078125]], "rps_doc_ml_palm_score": [[0, 1012, 0.00343907]], "rps_doc_ml_wikipedia_score": [[0, 1012, null]], "rps_doc_ml_wikiref_score": [[0, 1012, 0.06189787]], "rps_doc_books_importance": [[0, 1012, -37.10511225]], "rps_doc_openwebtext_importance": [[0, 1012, -1.12289097]], "rps_doc_wikipedia_importance": [[0, 1012, -13.92246604]], "rps_doc_num_sentences": [[0, 1012, 25.0]]} |
Venture research is a comprehensive service provided by Data40, a leading provider of business data solutions. Venture research aims to help entrepreneurs and investors make informed decisions about potential investment opportunities by providing valuable insights into the market, industry trends, and competitor analysis.
How Does Venture Research Work?
Venture research uses advanced data analysis techniques to gather and analyze data on companies and industries, including financial data, industry trends, and competitor analysis. This information is then used to create detailed reports that provide insights into the strengths and weaknesses of different companies, as well as the overall market conditions.
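As a toy illustration of that gather-analyze-report flow (the sample records and the growth metric below are invented for the example; Data40's actual pipeline is not described here):

# Toy version of the gather -> analyze -> report flow described above.
companies = [
    {"name": "Studio A", "revenue": [1.0, 1.4, 2.1]},  # yearly revenue in $M (invented)
    {"name": "Studio B", "revenue": [3.0, 2.8, 2.9]},
]

def growth(series):
    """Overall growth across the series, as a fraction of the starting value."""
    return (series[-1] - series[0]) / series[0]

# "Report": companies ranked by revenue growth.
for c in sorted(companies, key=lambda c: growth(c["revenue"]), reverse=True):
    print(f'{c["name"]}: {growth(c["revenue"]):+.0%} revenue growth')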
Benefits of Venture Research
Here are some of the key benefits of venture research:
Identifying Emerging Trends and Opportunities. One of the key benefits of venture research is the ability to identify emerging trends and opportunities in different industries. By analyzing market data and industry trends, venture research can help entrepreneurs and investors identify new areas of growth and potential investment opportunities.
Understanding the Competitive Landscape. Venture research also provides valuable insights into the competitive landscape of different industries. By analyzing competitor data, venture research can help entrepreneurs and investors understand the strengths and weaknesses of their competitors, as well as identify potential areas of opportunity where they can differentiate themselves from the competition.
Making Informed Investment Decisions. Overall, venture research is an essential service for entrepreneurs and investors looking to make informed investment decisions. By providing valuable insights into the market, industry trends, and competitor analysis, venture research can help entrepreneurs and investors identify potential investment opportunities and make strategic decisions that lead to long-term success.
How to Get Started with Venture Research?
To get started with venture research, you can contact us with a brief description of your project, including its goals, current tasks, and time frame. We will study your project, calculate a pre-offer, and schedule a presentation call to discuss your needs in detail. Together, we will formulate a work plan and assemble a work group. You will have direct access to the work group, so that you can make operational decisions at a moment’s notice. We aim for a flexible approach, where you can regard our work group as your own private team of professionals.
Last Data
Video games investment funds Q1 2020
Video games investment funds Q1 2020. This release contains extensive information on hundreds of investment funds, venture companies, and individual investors working with video games, AR / VR, […]
Sample: Free / Full: $500.00
The Legal Side of AI Investments: Risks and Restrictions for Venture Investors
Artificial intelligence (AI) is a technological field that reaches deep into human functioning, and is defined as the ability of a digital computer or computer-controlled robot to […]
Ultimate Data40 Guide on Effective Investment
Here at Data40, we work with all kinds of investors and varying budget sizes. The requests we receive, however, are often quite similar. That is why we decided to put […]
Impact of Global Financial Crisis on Stock Market
While it was the worst quarterly decline in over 50 years, the 20–30% drop in adjustments that the stock market experienced in the first half of 2022 cannot be fully […] | 2023-14/0000/en_head.json.gz/3639 | {"url": "https://data40.com/venture/", "partition": "head_middle", "language": "en", "source_domain": "data40.com", "date_download": "2023-03-20T09:51:10Z", "digest": "sha1:KN6PFMGARSXGNUB2TTAV45ECFNTCV6BO"} | {"ccnet_length": [[0, 3576, 3576.0]], "ccnet_original_length": [[0, 3576, 6798.0]], "ccnet_nlines": [[0, 3576, 21.0]], "ccnet_original_nlines": [[0, 3576, 133.0]], "ccnet_language_score": [[0, 3576, 0.94]], "ccnet_perplexity": [[0, 3576, 299.3]], "ccnet_bucket": [[0, 3576, 0.0]], "rps_doc_curly_bracket": [[0, 3576, 0.0]], "rps_doc_ldnoobw_words": [[0, 3576, 0.0]], "rps_doc_lorem_ipsum": [[0, 3576, 0.0]], "rps_doc_stop_word_fraction": [[0, 3576, 0.3349835]], "rps_doc_ut1_blacklist": [[0, 3576, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 3576, 0.05052206]], "rps_doc_frac_chars_dupe_5grams": [[0, 3576, 0.19737285]], "rps_doc_frac_chars_dupe_6grams": [[0, 3576, 0.16739643]], "rps_doc_frac_chars_dupe_7grams": [[0, 3576, 0.12563153]], "rps_doc_frac_chars_dupe_8grams": [[0, 3576, 0.08757157]], "rps_doc_frac_chars_dupe_9grams": [[0, 3576, 0.05052206]], "rps_doc_frac_chars_top_2gram": [[0, 3576, 0.07578309]], "rps_doc_frac_chars_top_3gram": [[0, 3576, 0.04210172]], "rps_doc_frac_chars_top_4gram": [[0, 3576, 0.03907039]], "rps_doc_frac_all_caps_words": [[0, 3576, 0.00990099]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 3576, 0.0]], "rps_doc_frac_no_alph_words": [[0, 3576, 0.1320132]], "rps_doc_frac_unique_words": [[0, 3576, 0.44840525]], "rps_doc_mean_word_length": [[0, 3576, 5.57035647]], "rps_doc_symbol_to_word_ratio": [[0, 3576, 0.00660066]], "rps_doc_unigram_entropy": [[0, 3576, 4.9847573]], "rps_doc_word_count": [[0, 3576, 533.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 24, 0.0], [24, 348, 1.0], [348, 380, 1.0], [380, 739, 1.0], [739, 768, 0.0], [768, 823, 0.0], [823, 1169, 1.0], [1169, 1574, 1.0], [1574, 1990, 1.0], [1990, 2032, 1.0], [2032, 2585, 1.0], [2585, 2595, 0.0], [2595, 2632, 0.0], [2632, 2837, 0.0], [2837, 2863, 0.0], [2863, 2942, 0.0], [2942, 3140, 0.0], [3140, 3186, 0.0], [3186, 3357, 0.0], [3357, 3407, 0.0], [3407, 3576, 0.0]], "rps_lines_javascript_counts": [[0, 24, 0.0], [24, 348, 0.0], [348, 380, 0.0], [380, 739, 0.0], [739, 768, 0.0], [768, 823, 0.0], [823, 1169, 0.0], [1169, 1574, 0.0], [1574, 1990, 0.0], [1990, 2032, 0.0], [2032, 2585, 0.0], [2585, 2595, 0.0], [2595, 2632, 0.0], [2632, 2837, 0.0], [2837, 2863, 0.0], [2863, 2942, 0.0], [2942, 3140, 0.0], [3140, 3186, 0.0], [3186, 3357, 0.0], [3357, 3407, 0.0], [3407, 3576, 0.0]], "rps_lines_num_words": [[0, 24, 4.0], [24, 348, 43.0], [348, 380, 5.0], [380, 739, 51.0], [739, 768, 4.0], [768, 823, 10.0], [823, 1169, 48.0], [1169, 1574, 52.0], [1574, 1990, 53.0], [1990, 2032, 7.0], [2032, 2585, 95.0], [2585, 2595, 2.0], [2595, 2632, 6.0], [2632, 2837, 30.0], [2837, 2863, 4.0], [2863, 2942, 12.0], [2942, 3140, 31.0], [3140, 3186, 6.0], [3186, 3357, 31.0], [3357, 3407, 8.0], [3407, 3576, 31.0]], "rps_lines_numerical_chars_fraction": [[0, 24, 0.0], [24, 348, 0.00628931], [348, 380, 0.0], [380, 739, 0.0], [739, 768, 0.0], [768, 823, 0.0], [823, 1169, 0.0], [1169, 1574, 0.0], [1574, 1990, 0.0], [1990, 2032, 0.0], [2032, 2585, 0.0], [2585, 2595, 0.0], [2595, 2632, 0.13888889], [2632, 2837, 0.0255102], [2837, 2863, 0.22727273], [2863, 2942, 0.0], [2942, 3140, 0.0], [3140, 3186, 0.04444444], [3186, 3357, 
0.01226994], [3357, 3407, 0.0], [3407, 3576, 0.06060606]], "rps_lines_start_with_bulletpoint": [[0, 24, 0.0], [24, 348, 0.0], [348, 380, 0.0], [380, 739, 0.0], [739, 768, 0.0], [768, 823, 0.0], [823, 1169, 0.0], [1169, 1574, 0.0], [1574, 1990, 0.0], [1990, 2032, 0.0], [2032, 2585, 0.0], [2585, 2595, 0.0], [2595, 2632, 0.0], [2632, 2837, 0.0], [2837, 2863, 0.0], [2863, 2942, 0.0], [2942, 3140, 0.0], [3140, 3186, 0.0], [3186, 3357, 0.0], [3357, 3407, 0.0], [3407, 3576, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 24, 0.08333333], [24, 348, 0.00925926], [348, 380, 0.15625], [380, 739, 0.00557103], [739, 768, 0.10344828], [768, 823, 0.01818182], [823, 1169, 0.01734104], [1169, 1574, 0.01234568], [1574, 1990, 0.01442308], [1990, 2032, 0.11904762], [2032, 2585, 0.00904159], [2585, 2595, 0.2], [2595, 2632, 0.05405405], [2632, 2837, 0.03414634], [2837, 2863, 0.11538462], [2863, 2942, 0.12658228], [2942, 3140, 0.01515152], [3140, 3186, 0.10869565], [3186, 3357, 0.02339181], [3357, 3407, 0.12], [3407, 3576, 0.00591716]], "rps_doc_ml_palm_score": [[0, 3576, 0.0003078]], "rps_doc_ml_wikipedia_score": [[0, 3576, null]], "rps_doc_ml_wikiref_score": [[0, 3576, 0.04821509]], "rps_doc_books_importance": [[0, 3576, -202.21138673]], "rps_doc_openwebtext_importance": [[0, 3576, -11.32973625]], "rps_doc_wikipedia_importance": [[0, 3576, -109.64854592]], "rps_doc_num_sentences": [[0, 3576, 25.0]]} |
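Each record in this file consists of the scraped text, a document id, a metadata object, and a quality-signals object, with the two trailing fields serialized as JSON and every signal stored as spans indexed by character offsets into the scraped text. A minimal parsing sketch in Python, assuming each record occupies a single line with " | " between fields (very long records are soft-wrapped in this rendering); parse_record and signal_values are illustrative helper names, not part of any dataset tooling:

import json

def parse_record(line: str):
    # The two trailing fields are JSON objects, so split from the right;
    # the scraped text may itself contain " | " and is kept intact on the left.
    head, meta_json, signals_json = line.rsplit(" | ", 2)
    raw_content, doc_id = head.rsplit(" | ", 1)
    return raw_content, doc_id, json.loads(meta_json), json.loads(signals_json)

def signal_values(signals: dict, name: str):
    # Each signal is a list of [char_start, char_end, value] spans; document-level
    # metrics carry one span covering the whole text, per-line metrics one per line.
    return [value for _start, _end, value in signals.get(name, [])]

# Hypothetical usage on one record string named row:
# raw, doc_id, meta, signals = parse_record(row)
# print(meta["source_domain"], signal_values(signals, "rps_doc_word_count"))

Because the offsets are character positions, per-line signals such as rps_lines_num_words can be mapped back onto the exact slice of the scraped text they score.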
Two Cousins Twitter Video Went Viral on Twitter and Reddit
For a long time now, people have been searching for the same topic under different keywords, and right now that keyword is Two Cousins Twitter Video. Netizens are searching for the two cousins' Twitter video, and even those who have already watched the clip still have plenty of questions about it.
Millions of people are looking for the video on Twitter. Through this post we will cover every detail related to the case, including who the two cousins shot in the live Instagram video were.
Two Cousins Twitter Video
The video is circulating on the internet, and many users want to know more. We will cover the basics: what happened, and how you can find the video on Twitter. The two cousins had met after a long time and were enjoying a birthday party together. At the time of the incident they were in the bathroom, and a live video was running. Netizens are eager to learn more, and numerous Twitter accounts have already posted the same video.
In the live Instagram video, the two were playing with a gun that was real. Paris Harvey, a girl of just 12, accidentally shot her cousin Kauron Harvey, who was 14 years old at the time of his death. The sudden incident has become a matter of discussion; the authorities were informed and arrived on the spot as soon as they came to know about it. The investigation is ongoing, and the case is being treated as a murder-suicide. Both of them were young, and the whole family is broken-hearted, mourning their sad demise.
Two Cousins Twitter Video Viral
Netizens are continuously searching for the clip and reacting to it. We also express our deep condolences to the family; may their souls rest in peace. The family is going through a rough time, and many people are supporting them; we should also give them some privacy so they can begin to heal. In the live two cousins' Twitter video, she can be seen shooting her cousin and then herself.
The clip is still being shared on the internet and searched for, which is why both cousins are making headlines. Viral videos circulate on Twitter every day, and you can still watch the live two cousins' Twitter video on Twitter, Reddit, Telegram, and other websites. For more viral updates, keep reading the posts on this page; we will be right back with fresh updates.
Who was Joyce Culla? TikToker Passed Away at the Age of 24, How did she die? | 2023-14/0000/en_head.json.gz/3640 | {"url": "https://deathmilitia.com/two-cousins-twitter-video-went-viral-on-twitter-reddit/", "partition": "head_middle", "language": "en", "source_domain": "deathmilitia.com", "date_download": "2023-03-20T10:26:26Z", "digest": "sha1:CGSCA7247K6N77GIZSO5EPYFXHGEABD7"} | {"ccnet_length": [[0, 2941, 2941.0]], "ccnet_original_length": [[0, 2941, 4370.0]], "ccnet_nlines": [[0, 2941, 10.0]], "ccnet_original_nlines": [[0, 2941, 57.0]], "ccnet_language_score": [[0, 2941, 0.99]], "ccnet_perplexity": [[0, 2941, 254.0]], "ccnet_bucket": [[0, 2941, 0.0]], "rps_doc_curly_bracket": [[0, 2941, 0.0]], "rps_doc_ldnoobw_words": [[0, 2941, 0.0]], "rps_doc_lorem_ipsum": [[0, 2941, 0.0]], "rps_doc_stop_word_fraction": [[0, 2941, 0.49585406]], "rps_doc_ut1_blacklist": [[0, 2941, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 2941, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 2941, 0.08184143]], "rps_doc_frac_chars_dupe_6grams": [[0, 2941, 0.04688832]], "rps_doc_frac_chars_dupe_7grams": [[0, 2941, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 2941, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 2941, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 2941, 0.03580563]], "rps_doc_frac_chars_top_3gram": [[0, 2941, 0.02898551]], "rps_doc_frac_chars_top_4gram": [[0, 2941, 0.03751066]], "rps_doc_frac_all_caps_words": [[0, 2941, 0.00165837]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 2941, 0.0]], "rps_doc_frac_no_alph_words": [[0, 2941, 0.10779436]], "rps_doc_frac_unique_words": [[0, 2941, 0.40074212]], "rps_doc_mean_word_length": [[0, 2941, 4.35250464]], "rps_doc_symbol_to_word_ratio": [[0, 2941, 0.0]], "rps_doc_unigram_entropy": [[0, 2941, 4.8339241]], "rps_doc_word_count": [[0, 2941, 539.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 55, 0.0], [55, 376, 1.0], [376, 682, 1.0], [682, 708, 0.0], [708, 1337, 1.0], [1337, 1966, 1.0], [1966, 1998, 0.0], [1998, 2407, 1.0], [2407, 2865, 1.0], [2865, 2941, 1.0]], "rps_lines_javascript_counts": [[0, 55, 0.0], [55, 376, 0.0], [376, 682, 0.0], [682, 708, 0.0], [708, 1337, 0.0], [1337, 1966, 0.0], [1966, 1998, 0.0], [1998, 2407, 0.0], [2407, 2865, 0.0], [2865, 2941, 0.0]], "rps_lines_num_words": [[0, 55, 9.0], [55, 376, 55.0], [376, 682, 55.0], [682, 708, 4.0], [708, 1337, 121.0], [1337, 1966, 116.0], [1966, 1998, 5.0], [1998, 2407, 76.0], [2407, 2865, 82.0], [2865, 2941, 16.0]], "rps_lines_numerical_chars_fraction": [[0, 55, 0.0], [55, 376, 0.0], [376, 682, 0.0], [682, 708, 0.0], [708, 1337, 0.0], [1337, 1966, 0.00648298], [1966, 1998, 0.0], [1998, 2407, 0.0], [2407, 2865, 0.0], [2865, 2941, 0.02739726]], "rps_lines_start_with_bulletpoint": [[0, 55, 0.0], [55, 376, 0.0], [376, 682, 0.0], [682, 708, 0.0], [708, 1337, 0.0], [1337, 1966, 0.0], [1966, 1998, 0.0], [1998, 2407, 0.0], [2407, 2865, 0.0], [2865, 2941, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 55, 0.12727273], [55, 376, 0.02803738], [376, 682, 0.01960784], [682, 708, 0.07692308], [708, 1337, 0.01589825], [1337, 1966, 0.01589825], [1966, 1998, 0.15625], [1998, 2407, 0.01466993], [2407, 2865, 0.01965066], [2865, 2941, 0.11842105]], "rps_doc_ml_palm_score": [[0, 2941, 0.97709048]], "rps_doc_ml_wikipedia_score": [[0, 2941, null]], "rps_doc_ml_wikiref_score": [[0, 2941, 0.49238861]], "rps_doc_books_importance": [[0, 2941, 6.78156118]], "rps_doc_openwebtext_importance": [[0, 2941, 94.95677762]], "rps_doc_wikipedia_importance": [[0, 2941, -74.18068089]], "rps_doc_num_sentences": [[0, 2941, 
27.0]]} |
Forsyth County Sheriff’s deputy gets minor injury at shooting range
A Forsyth County Sheriff’s Office deputy who received minor injuries to his leg from a gun discharge Thursday afternoon is transported to the hospital, shown here at Highway 20 and Ga. 400. - photo by Jim Dean
CUMMING – A Forsyth County Sheriff’s Office deputy received minor injuries to his leg from a gun discharge Thursday afternoon.
The deputy, who Cpl. Robin Regan said is fine, was transported to Northside Hospital-Forsyth from a shooting range north of the county, where he had been training.
“It’s not a serious injury,” Regan said. “He’s already fine.”
The sheriff’s office is not releasing any more details on the incident.
He was initially transported in a patrol car but was transferred to an ambulance en route.
Part II Minutes - 18th July 2022
12/09/2022 - Part II Minutes - 18th July 2022
21/09/2022 - Cabinet Part II Minutes - 18th July 2022
Symposium | Trump Vs. Democracy
Genuflecting to Putin
By Michael Kimmage
from October, No. 60, Special Issue – 7 MIN READ
Tagged: Democracy, Donald Trump, Russia, Vladimir Putin
Donald Trump went to Moscow in 1987 and loved it. He has since preserved considerable affection for Russia, if not for the Russian Federation’s first president, Boris Yeltsin, then for his successor, Vladimir Putin. Putin has qualities of character that Trump admires: unconstrained power, wealth, and a “manly” cultural traditionalism that bestows legitimacy but imposes no personal responsibilities. Putin, about whom Trump has never had an ill word to say, is doubtless the kind of leader Trump wishes to be.
Beyond this elective affinity, a web of financial and personal ties link Trump and Putin, going back to the Kremlin-connected money that flowed into Trump’s properties in the last 20 years and to Trump’s dream of putting up a tower or two in Moscow, a scheme he was actively pursuing in 2016. Russia stands with Turkey, Saudi Arabia, and a handful of other countries as a place for the Trump family to enrich itself. No doubt Trump still has business interests in Russia, and in the case of all these countries the Trump family’s cupidity boils down to the potential for foreign leverage.
A third reason for Trump’s inability to criticize or confront Putin is more hypothetical. Putin served in the KGB, became head of the FSB, the successor to the KGB, in the late 1990s, and has filled his government with men from the intelligence and security services. Trump is notoriously undisciplined and has often been in debt. His campaign had many ties to Russian intelligence, which was working to get Trump elected. Trump is the perfect object of an intelligence operation, and Russia has had every incentive to buy him, to compromise him, or to make him an asset, all scenarios since 2016 that have been a part of mainstream political discourse in the United States. This shocking explanation of Trump’s attitude toward Putin has not yet been proven beyond a reasonable doubt.
The most convincing reason for Trump’s friendliness toward Russia is therefore to be found not in Trump’s sprawling personal life or in the chaotic mess that is his foreign policy but in his domestic political instincts. Trump is not truly wedded to any set of policies or to any one ideology. He does not care about foreign countries as such: Each is merely an end to his personal and political objectives. But Trump does care about the home front, the basis of his power and wealth. In the sphere of psychology and culture—in the sphere of American politics, that is—Trump is a formidable figure, fully capable of grievous mistakes yet imbued with genuine cunning.
A tangle of storylines, “Russia” has been of value to Trump. This is because “Russia” has served tangible interests, and may in fact be a catalyst, should he pull it off, of Trump’s reelection this November.
Trump knows the political terrain on which he has an advantage. It is the media-driven world of gossip, rumor, half-truths, and conspiracy theories—addled, agitated, perfervid, fueled by exaggeration. (Trump built his real estate career and initial celebrity status on the shrewd manipulation of tabloid journalism.) Prior to 2016, the transformation of politics into entertainment, the 24-hour news cycle, and the hunger for scandal were realities of American life. They were still on the margins, though. Trump has devoted himself to making them the warp and woof of American politics, and not simply because this suits him personally. In his world, conventional politicians cannot compete. They are too dull, too constrained, and too harshly defined by reality.
“Russia” has thus been perfect for Trump. The Mueller Report did not take Trump down, which may reflect undue caution on Robert Mueller’s part, the machinations of Attorney General Bill Barr, or real evidence that has remained hidden from view. Only time will tell. But too often Russiagate was a narrative of Trump’s demise in search of a factual foundation. Too often the hunt for a conspiracy involving Michael Flynn, George Papadopoulos, and countless others augmented the conspiratorial mindset, and what the historian Richard Hofstadter called the “paranoid style,” on the left. This has redounded to Trump’s tactical advantage, because he is the practitioner of the paranoid style par excellence and the conspiracy-theorist-in-chief. The drier, the more fact-driven, and the more balanced American politics is, the worse for Trump. “Russia” helps him to be where he wants to be, and to keep his adversaries where he wants them to be—angry, disoriented, and trapped in responding to his limitless capacity for engendering scandal.
“Russia” has one final attraction for Trump. An overall climate of distrust and confusion delegitimizes the entire system of constitutional government. Most likely not a reader of Franz Kafka, Trump has nonetheless internalized an understanding of politics that is robustly Kafkaesque. It is his own behavior—asking Russia to hack Hillary Clinton’s emails on the campaign trail, condemning the U.S. intelligence community during a 2017 press conference with Putin in Helsinki, having Rudy Giuliani dish dirt on Biden that was very likely fabricated by Russian intelligence—that has rightly inspired suspicion. Though he condemns those who suspect him, he also lets the suspicion itself float free and expand, “confirming” Trump’s frequent accusations that the system is rigged. It is the theater of the absurd, darkly comic in the Kafka vein, and an obvious jumble of contradictions. Yet each day he is President, the constitutional system of government gets weaker and weaker. Should “Russia” not be enough for Trump to win this November, his chief asset will be the system’s relative weakness, not just institutionally but in the court of public opinion. His appearance of guilt diminishes the general prospects for innocence or, rather, for legality.
Trump’s genuflecting to Putin goes far beyond previous examples of cultivating dictators. The Cold War and the War on Terror pushed American presidents to embrace authoritarian regimes from Chile to Argentina and from Pakistan to Saudi Arabia. By the standards of the American national interest, Trump has no strategic reason to praise Putin, and yet he does, rendering the democratic bona fides of American foreign policy not so much hypocritical as obsolete or beside the point. More acutely, though, Trump has exploited U.S.-Russian relations to obfuscate and manipulate domestic American politics and potentially to pave the way to his own version of authoritarian rule. One might recall here how Putin benefitted from the apartment bombings in Russia in 1999, which provoked fear and disorientation. Unscrupulous politicians can profit from crises that betray their own fingerprints.
In the presidential election, Democrats and sympathetic Republicans should be critical of Trump’s actual Russia policy. They should propose three initiatives: shoring up the transatlantic relationship, which Trump has damaged; addressing Russia’s use of corruption and disinformation and election interference in the United States and Europe, which the Trump Administration has either abetted or ignored; and encouraging diplomatic engagement with Russia, which Trump has never pursued, despite his frequent calls for a better relationship with Moscow, and which might help to demystify the Russian leviathan.
Additionally, the FBI and the Justice Department still have much to do regarding the Trump family and Russia. If Trump truly is a Russian asset, it would be the greatest scandal in U.S. history; no stone should be left unturned in the investigation of this possibility. If the Russian government has exerted any influence on Trump’s decision-making as President, this would be an impeachable offense for as long as he is in office. It is something that would have to be clarified and litigated when he is no longer in office. These are urgent matters of state. Definitively proving Trump’s dependence on Russia would do much to restore the American body politic to health.
For Democrats, Federalist Paper #1 should establish the relevant polarities on Trump and “Russia.” One can have a government fashioned from “accident and force,” in which panic and fear are endemic, knowledge is tied in knots, and conspiracies are constantly afoot. Or one can have “good government [established] from reflection and choice,” helped along by empirical inquiry, intellectual honesty, and a healthy respect for nuance. Careful reflection would qualify the oversized, horror-movie role “Russia” has come to play in American politics. In turn, keeping Trump and Russia in perspective would encourage the spirit of reflection and the integrity of choices rationally considered and rationally made, preserving these precious goods for a time when Trump is no longer in office.
From the Symposium
Trump Vs. Democracy
No President in our history has presented such a threat to the Constitution and our democracy as this one. In this special issue, we asked 35 contributors to describe different aspects of the assault. We could have asked twice that number.
Michael Kimmage is a professor of history at the Catholic University of America. His most recent book is The Abandonment of the West: The History of an Idea in American Foreign Policy. From 2014 to 2016, he served on the State Department’s Policy Planning Staff, where he held the Russia/Ukraine portfolio.
Spring, No. 68 – 14 MIN READ | 2023-14/0000/en_head.json.gz/3643 | {"url": "https://democracyjournal.org/magazine/specialissue/genuflecting-to-putin/", "partition": "head_middle", "language": "en", "source_domain": "democracyjournal.org", "date_download": "2023-03-20T09:49:17Z", "digest": "sha1:DV5ORZTEZRLIEI6VHUOLTE5Z72JF6NUW"} | {"ccnet_length": [[0, 9724, 9724.0]], "ccnet_original_length": [[0, 9724, 11871.0]], "ccnet_nlines": [[0, 9724, 27.0]], "ccnet_original_nlines": [[0, 9724, 108.0]], "ccnet_language_score": [[0, 9724, 0.96]], "ccnet_perplexity": [[0, 9724, 334.5]], "ccnet_bucket": [[0, 9724, 0.0]], "rps_doc_curly_bracket": [[0, 9724, 0.0]], "rps_doc_ldnoobw_words": [[0, 9724, 0.0]], "rps_doc_lorem_ipsum": [[0, 9724, 0.0]], "rps_doc_stop_word_fraction": [[0, 9724, 0.4]], "rps_doc_ut1_blacklist": [[0, 9724, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 9724, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 9724, 0.00451807]], "rps_doc_frac_chars_dupe_6grams": [[0, 9724, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 9724, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 9724, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 9724, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 9724, 0.00753012]], "rps_doc_frac_chars_top_3gram": [[0, 9724, 0.00401606]], "rps_doc_frac_chars_top_4gram": [[0, 9724, 0.0062751]], "rps_doc_frac_all_caps_words": [[0, 9724, 0.01084011]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 9724, 0.0]], "rps_doc_frac_no_alph_words": [[0, 9724, 0.15230352]], "rps_doc_frac_unique_words": [[0, 9724, 0.45084088]], "rps_doc_mean_word_length": [[0, 9724, 5.15394567]], "rps_doc_symbol_to_word_ratio": [[0, 9724, 0.00054201]], "rps_doc_unigram_entropy": [[0, 9724, 5.69515622]], "rps_doc_word_count": [[0, 9724, 1546.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 32, 0.0], [32, 54, 0.0], [54, 73, 0.0], [73, 122, 0.0], [122, 171, 0.0], [171, 683, 1.0], [683, 1272, 1.0], [1272, 2057, 1.0], [2057, 2724, 1.0], [2724, 2932, 1.0], [2932, 3697, 1.0], [3697, 4734, 1.0], [4734, 5988, 1.0], [5988, 6878, 1.0], [6878, 7488, 1.0], [7488, 8161, 1.0], [8161, 8948, 1.0], [8948, 8967, 0.0], [8967, 8987, 0.0], [8987, 9227, 1.0], [9227, 9256, 0.0], [9256, 9274, 0.0], [9274, 9332, 0.0], [9332, 9639, 1.0], [9639, 9667, 0.0], [9667, 9696, 0.0], [9696, 9724, 0.0]], "rps_lines_javascript_counts": [[0, 32, 0.0], [32, 54, 0.0], [54, 73, 0.0], [73, 122, 0.0], [122, 171, 0.0], [171, 683, 0.0], [683, 1272, 0.0], [1272, 2057, 0.0], [2057, 2724, 0.0], [2724, 2932, 0.0], [2932, 3697, 0.0], [3697, 4734, 0.0], [4734, 5988, 0.0], [5988, 6878, 0.0], [6878, 7488, 0.0], [7488, 8161, 0.0], [8161, 8948, 0.0], [8948, 8967, 0.0], [8967, 8987, 0.0], [8987, 9227, 0.0], [9227, 9256, 0.0], [9256, 9274, 0.0], [9274, 9332, 0.0], [9332, 9639, 0.0], [9639, 9667, 0.0], [9667, 9696, 0.0], [9696, 9724, 0.0]], "rps_lines_num_words": [[0, 32, 4.0], [32, 54, 3.0], [54, 73, 3.0], [73, 122, 10.0], [122, 171, 4.0], [171, 683, 80.0], [683, 1272, 103.0], [1272, 2057, 132.0], [2057, 2724, 114.0], [2724, 2932, 36.0], [2932, 3697, 116.0], [3697, 4734, 162.0], [4734, 5988, 193.0], [5988, 6878, 136.0], [6878, 7488, 85.0], [7488, 8161, 114.0], [8161, 8948, 119.0], [8948, 8967, 3.0], [8967, 8987, 3.0], [8987, 9227, 41.0], [9227, 9256, 4.0], [9256, 9274, 3.0], [9274, 9332, 6.0], [9332, 9639, 51.0], [9639, 9667, 7.0], [9667, 9696, 7.0], [9696, 9724, 7.0]], "rps_lines_numerical_chars_fraction": [[0, 32, 0.0], [32, 54, 0.0], [54, 73, 0.0], [73, 122, 0.06666667], [122, 171, 0.0], [171, 683, 0.00803213], [683, 1272, 0.01038062], [1272, 2057, 
0.01040312], [2057, 2724, 0.0], [2724, 2932, 0.0], [2932, 3697, 0.00814111], [3697, 4734, 0.0], [4734, 5988, 0.00325203], [5988, 6878, 0.00457666], [6878, 7488, 0.0], [7488, 8161, 0.0], [8161, 8948, 0.00130378], [8948, 8967, 0.0], [8967, 8987, 0.0], [8987, 9227, 0.00851064], [9227, 9256, 0.0], [9256, 9274, 0.0], [9274, 9332, 0.0], [9332, 9639, 0.02675585], [9639, 9667, 0.12], [9667, 9696, 0.15384615], [9696, 9724, 0.15384615]], "rps_lines_start_with_bulletpoint": [[0, 32, 0.0], [32, 54, 0.0], [54, 73, 0.0], [73, 122, 0.0], [122, 171, 0.0], [171, 683, 0.0], [683, 1272, 0.0], [1272, 2057, 0.0], [2057, 2724, 0.0], [2724, 2932, 0.0], [2932, 3697, 0.0], [3697, 4734, 0.0], [4734, 5988, 0.0], [5988, 6878, 0.0], [6878, 7488, 0.0], [7488, 8161, 0.0], [8161, 8948, 0.0], [8948, 8967, 0.0], [8967, 8987, 0.0], [8987, 9227, 0.0], [9227, 9256, 0.0], [9256, 9274, 0.0], [9274, 9332, 0.0], [9332, 9639, 0.0], [9639, 9667, 0.0], [9667, 9696, 0.0], [9696, 9724, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 32, 0.125], [32, 54, 0.09090909], [54, 73, 0.15789474], [73, 122, 0.2244898], [122, 171, 0.14285714], [171, 683, 0.03125], [683, 1272, 0.02716469], [1272, 2057, 0.03057325], [2057, 2724, 0.017991], [2724, 2932, 0.03365385], [2932, 3697, 0.0130719], [3697, 4734, 0.02796528], [4734, 5988, 0.02472089], [5988, 6878, 0.03146067], [6878, 7488, 0.02786885], [7488, 8161, 0.03417533], [8161, 8948, 0.01905972], [8948, 8967, 0.10526316], [8967, 8987, 0.15], [8987, 9227, 0.02083333], [9227, 9256, 0.10344828], [9256, 9274, 0.16666667], [9274, 9332, 0.12068966], [9332, 9639, 0.07491857], [9639, 9667, 0.32142857], [9667, 9696, 0.31034483], [9696, 9724, 0.32142857]], "rps_doc_ml_palm_score": [[0, 9724, 0.96837378]], "rps_doc_ml_wikipedia_score": [[0, 9724, null]], "rps_doc_ml_wikiref_score": [[0, 9724, 0.40056241]], "rps_doc_books_importance": [[0, 9724, -225.19702534]], "rps_doc_openwebtext_importance": [[0, 9724, 345.26020908]], "rps_doc_wikipedia_importance": [[0, 9724, -88.180618]], "rps_doc_num_sentences": [[0, 9724, 83.0]]} |
Welcome to DHFinance
These terms and conditions (“Terms”, “Agreement”) are an agreement between DHFinance (“we” or “our”) and you (“User” or “you”).
This Agreement establishes the general terms and conditions of your use of the DHFinance website and any of its products or services (collectively, “Website” or “Services”).
You must be at least 18 years old to use this website. By using this website and by accepting this Agreement, you warrant and declare that you are at least 18 years old. If you create an account on the website, you are responsible for maintaining the security of your account and are fully responsible for all activities that occur under the account and any other action taken in relation to it. Providing false contact information of any kind may result in the cancellation of your account.
You must notify us immediately of any unauthorized use of your account or any other security breach. We will not be liable for any act or omission on your part, including damages of any kind incurred as a result of such acts or omissions. We may suspend, disable, or delete your account (or part thereof) if we determine that you have violated any provision of this Agreement or that your conduct or content would tend to damage our reputation and goodwill.
If we delete your account for the above reasons, you may not re-register for our Services. We may block your email address and Internet protocol address to prevent further registration. This includes any kind of deception or manipulation used to obtain benefits, and any kind of fraud against third parties; as a preventive measure, the account may be temporarily blocked until the matter is clarified or any litigation is resolved.
Occasionally there may be information on the website that contains typographical errors, inaccuracies or omissions that may relate to promotions and offers. We reserve the right to correct any error, inaccuracy or omission, and to change or update information or cancel orders if any information on the website or in any related service is inaccurate at any time, without notice. We assume no obligation to update, modify or clarify the information on the website, including, among other things, pricing information, except as required by law. No specified update or refresh date on the website should be taken to indicate that all information on the website or any related service has been modified or updated.
Although this website may link to other websites, we do not imply, directly or indirectly, any approval, association, sponsorship, endorsement or affiliation with any linked website, unless specifically indicated in this document. We are not responsible for examining or evaluating, and we do not guarantee, the offerings of any company or individual or the content of their websites. We assume no responsibility or liability for the actions, products, services or content of any third parties. You should carefully review the legal statements and other conditions of use of any website you access through a link from this website. Following a link to any other website is at your own risk.
In addition to other terms set forth in the Agreement, you are prohibited from using the Website or its Content: (a) for any illegal purpose; (b) to request others to perform or participate in illegal acts; (c) to violate any international, federal, provincial or state regulation, rule, law or local ordinance; (d) to infringe or violate our intellectual property rights or the intellectual property rights of third parties; (e) to harass, abuse, insult, harm, defame, slander, belittle, intimidate or discriminate on the basis of gender, sexual orientation, religion, ethnicity, race, age, national origin or disability; (f) to submit false or misleading information; (g) to upload or transmit viruses or any other type of malicious code that will be used or may be used in any way that affects the functionality or operation of the Service or any related website, other websites or the Internet; (h) to collect or track the personal information of others; (i) to spam, phish, pharm, pretext, spider, track or scrape; (j) for any obscene or immoral purpose; or (k) to interfere with or circumvent the security features of the Service or any related website, other websites or the Internet. We reserve the right to terminate your use of the Service or any related website for violating any of the prohibited uses.
All trademarks, service marks, graphics and logos used in connection with our website or services are trademarks or registered trademarks of DHFinance or its licensors. Other trademarks, service marks, graphics and logos used in connection with our website or services may be the trademarks of other third parties. Your use of our website and our services grants you no right or license to reproduce or otherwise use any DHFinance or third-party trademarks.
All rights and restrictions contained in this Agreement may be exercised and will be applicable and binding only to the extent that they do not violate applicable laws and are intended to be limited to the extent necessary so that they do not make this Agreement illegal, invalid or unenforceable. If any provision or part of any provision of this Agreement is deemed illegal, invalid or unenforceable by a court of competent jurisdiction, the parties intend that the remaining provisions or parts thereof constitute their agreement with respect to the purpose of this, and All remaining provisions or parts thereof shall remain in full force and effect.
We reserve the right to modify this Agreement or its policies relating to the Website or the Services at any time, effective upon posting an updated version of this Agreement on the Website. When we do, we will send you an email to notify you. Continued use of the website after any such changes will constitute your consent to those changes.
You acknowledge that you have read this Agreement and agree to all its terms and conditions. By using the Website or its Services, you agree to be bound by this Agreement. If you do not agree to abide by the terms of this Agreement, you are not authorized to use or access the Website and its Services. The Terms and Conditions are subject to revision and should therefore be consulted and approved before each use. Any questions or requests for clarification about the foregoing rules, or about any point of the Terms and Conditions that arises after reading them, should be sent to: [email protected]
[email protected]
Copyright 2022 © DHFinance | 2023-14/0000/en_head.json.gz/3644 | {"url": "https://dhfi.online/2a/index.html", "partition": "head_middle", "language": "en", "source_domain": "dhfi.online", "date_download": "2023-03-20T08:39:14Z", "digest": "sha1:6WWCPNJRYZ6YZL6J726JL44SGR6EMCWE"} | {"ccnet_length": [[0, 7066, 7066.0]], "ccnet_original_length": [[0, 7066, 7699.0]], "ccnet_nlines": [[0, 7066, 15.0]], "ccnet_original_nlines": [[0, 7066, 52.0]], "ccnet_language_score": [[0, 7066, 0.91]], "ccnet_perplexity": [[0, 7066, 292.6]], "ccnet_bucket": [[0, 7066, 0.0]], "rps_doc_curly_bracket": [[0, 7066, 0.0]], "rps_doc_ldnoobw_words": [[0, 7066, 1.0]], "rps_doc_lorem_ipsum": [[0, 7066, 0.0]], "rps_doc_stop_word_fraction": [[0, 7066, 0.48098434]], "rps_doc_ut1_blacklist": [[0, 7066, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 7066, 0.11657083]], "rps_doc_frac_chars_dupe_5grams": [[0, 7066, 0.1949817]], "rps_doc_frac_chars_dupe_6grams": [[0, 7066, 0.13190451]], "rps_doc_frac_chars_dupe_7grams": [[0, 7066, 0.13190451]], "rps_doc_frac_chars_dupe_8grams": [[0, 7066, 0.13190451]], "rps_doc_frac_chars_dupe_9grams": [[0, 7066, 0.11657083]], "rps_doc_frac_chars_top_2gram": [[0, 7066, 0.01219725]], "rps_doc_frac_chars_top_3gram": [[0, 7066, 0.01672765]], "rps_doc_frac_chars_top_4gram": [[0, 7066, 0.01707615]], "rps_doc_frac_all_caps_words": [[0, 7066, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 7066, 0.0]], "rps_doc_frac_no_alph_words": [[0, 7066, 0.12975391]], "rps_doc_frac_unique_words": [[0, 7066, 0.32731959]], "rps_doc_mean_word_length": [[0, 7066, 4.93041237]], "rps_doc_symbol_to_word_ratio": [[0, 7066, 0.0]], "rps_doc_unigram_entropy": [[0, 7066, 5.02904006]], "rps_doc_word_count": [[0, 7066, 1164.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 21, 0.0], [21, 151, 1.0], [151, 324, 1.0], [324, 816, 1.0], [816, 1272, 1.0], [1272, 1709, 1.0], [1709, 2423, 1.0], [2423, 3145, 1.0], [3145, 4952, 1.0], [4952, 5419, 1.0], [5419, 6074, 1.0], [6074, 6414, 1.0], [6414, 7027, 0.0], [7027, 7040, 0.0], [7040, 7066, 0.0]], "rps_lines_javascript_counts": [[0, 21, 0.0], [21, 151, 0.0], [151, 324, 0.0], [324, 816, 0.0], [816, 1272, 0.0], [1272, 1709, 0.0], [1709, 2423, 0.0], [2423, 3145, 0.0], [3145, 4952, 0.0], [4952, 5419, 0.0], [5419, 6074, 0.0], [6074, 6414, 0.0], [6414, 7027, 0.0], [7027, 7040, 0.0], [7040, 7066, 0.0]], "rps_lines_num_words": [[0, 21, 3.0], [21, 151, 18.0], [151, 324, 25.0], [324, 816, 87.0], [816, 1272, 81.0], [1272, 1709, 72.0], [1709, 2423, 116.0], [2423, 3145, 116.0], [3145, 4952, 294.0], [4952, 5419, 74.0], [5419, 6074, 107.0], [6074, 6414, 62.0], [6414, 7027, 104.0], [7027, 7040, 1.0], [7040, 7066, 4.0]], "rps_lines_numerical_chars_fraction": [[0, 21, 0.0], [21, 151, 0.0], [151, 324, 0.0], [324, 816, 0.00824742], [816, 1272, 0.0], [1272, 1709, 0.0], [1709, 2423, 0.0], [2423, 3145, 0.0], [3145, 4952, 0.0], [4952, 5419, 0.0], [5419, 6074, 0.0], [6074, 6414, 0.0], [6414, 7027, 0.0], [7027, 7040, 0.0], [7040, 7066, 0.15384615]], "rps_lines_start_with_bulletpoint": [[0, 21, 0.0], [21, 151, 0.0], [151, 324, 0.0], [324, 816, 0.0], [816, 1272, 0.0], [1272, 1709, 0.0], [1709, 2423, 0.0], [2423, 3145, 0.0], [3145, 4952, 0.0], [4952, 5419, 0.0], [5419, 6074, 0.0], [6074, 6414, 0.0], [6414, 7027, 0.0], [7027, 7040, 0.0], [7040, 7066, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 21, 0.19047619], [21, 151, 0.05384615], [151, 324, 0.04046243], [324, 816, 0.0101626], [816, 1272, 0.00877193], [1272, 1709, 0.01144165], [1709, 2423, 0.00560224], [2423, 3145, 0.00692521], [3145, 4952, 
0.00996126], [4952, 5419, 0.01713062], [5419, 6074, 0.00916031], [6074, 6414, 0.02352941], [6414, 7027, 0.02120718], [7027, 7040, 0.0], [7040, 7066, 0.15384615]], "rps_doc_ml_palm_score": [[0, 7066, 0.0835008]], "rps_doc_ml_wikipedia_score": [[0, 7066, null]], "rps_doc_ml_wikiref_score": [[0, 7066, 0.02889973]], "rps_doc_books_importance": [[0, 7066, -259.54660587]], "rps_doc_openwebtext_importance": [[0, 7066, -41.24915588]], "rps_doc_wikipedia_importance": [[0, 7066, -272.33412549]], "rps_doc_num_sentences": [[0, 7066, 41.0]]} |
13 Best Hunting Games for Xbox Series X / S
Uneeb Asim
Just like many other outdoor activities that can be experienced in games, hunting games have been around for a long time. However, they have always been in a niche of their own, as most gamers prefer something with a bit more action.
Due to that, hunting games have had to expand and experiment with new ideas. Nowadays, there are as many hunting games based on fiction as there are based on reality. In fact, the number of simulation-based hunting games is decreasing by the day, with most of them opting to include elements of action in order to get more players.
Of course, this doesn’t mean that the games themselves are bad. In fact, there are some hunting games that are downright amazing, and they use a lot of fantasy elements.
Regardless of whether you want to play a hunting game that has you accurately tracking and shooting animals in the wild or taking down monsters, we have you covered! Our list of the 13 best hunting games for Xbox Series X / S includes a great hunting game for everyone, including those who are not that interested in the genre.
Hunt: Showdown
Fishing Planet
theHunter: Call of the Wild
Monster Hunter World
Hunting Simulator 2
Far Cry 6
Big Buck Hunter Arcade
Duck Dynasty
Fishing Sim World: Bass Pro Shops Edition
Duck.Hunt
Red Dead Redemption 2
Dauntless
The Witcher 3: Wild Hunt
We begin our list with Hunt: Showdown. We mentioned at the beginning that a lot of the games that we included are not based on reality, and this is one of those games.
In Hunt: Showdown, you are tasked with hunting monsters. You enter into a match where there are a lot of small monsters, and one giant one to deal with. At any time, you can escape the area by heading to the extraction point. If you want, you can simply kill a few of the smaller monsters and extract with your rewards.
What makes the game exciting is that there are a lot of other players on the server as well. As such, it is not just you fighting to kill the monsters, but you will also have to deal with all of the other players vying for the same prize. This can lead to some of the most intense and breathtaking fights you can imagine.
One thing to remember is that the player base for the game is quite low. As such, you may have some problems finding other players to play with on the Xbox. That said, the player base experiences a bump whenever a new update is released.
We know that fishing and hunting are not the same things, but the basic premise is still the same. Regardless of whether you are hunting or fishing, you are looking for and acquiring animals for the purposes of eating (well, most of the time).
The main reason for having Fishing Planet so high up on this list is because it is a fully-fledged game that is also free. On top of that, Fishing Planet is also a simulation, meaning that it tries to capture the feeling of fishing as well as it can. This means that finding fish can be quite difficult in certain cases, and it is possible that you may feel pangs of frustration.
While the graphics are not that great, the dynamic weather system does lead to a lot of interesting situations when fishing. Once you factor in the online multiplayer, where over 64 players can compete against each other, chances are that avid fishermen will feel right at home when they turn on their Xbox to play Fishing Planet.
Let’s face it, chances are that this is the one game that you already knew was going to be on this list before you began reading. While theHunter series has been hit or miss in the past, Call of the Wild is basically as good as a simulation hunting game has ever been.
Call of the Wild is a game that perfectly captures what it is like to hunt. You can either go on an expedition alone or with someone. Not only are you responsible for finding the prey, but then you have to make sure that you can execute a clean kill.
There is multiplayer as well, where you can compete with other players in multiple game modes to see who the better hunter is. With numerous weapons, animals, and a large terrain to explore, no game is ever the same. If you want to play a hunting simulator that is the closest to hunting in real life, Call of the Wild will get you your fix.
The Monster Hunter series has always been about hunting monsters and acquiring loot. Monster Hunter World has done that at a scale never seen before, which is why it made the list of the best loot games for Xbox.
In Monster Hunter World, you and your friends can traverse the world and look for monsters that are increasingly difficult to defeat. There is both a single-player campaign and numerous additional quests that you can do. Once you kill a monster, you can use the loot to craft weapons of greater strength and face bigger monsters.
While this is the basic concept that is repeated over and over again in the game, it never gets boring. This is because of the variety of the monsters, terrain, and weapons available in the game. Chances are that if you enjoy Monster Hunter World at the beginning, you won’t get bored until the very end.
If you are looking for a classic hunting game but one that is not as hardcore as Call of the Wild, Hunting Simulator 2 is it.
Despite its name, Hunting Simulator 2 is a game that is more arcade-like than a simulation. Still, there is a simulation element to the game. Not only are the weapons extremely realistic in how they look, but you also have access to a lot of gear that professional hunters use.
You also have a dog to guide you on your expedition. The dog can be used to track down over 30 species of animals in multiple environments. However, there is no co-op or multiplayer in the game, making Hunting Simulator 2 single-player only. This means that there isn’t a lot of depth to the game, and it is bound to get boring for most players at some point.
Far Cry 6 is not exactly a hunting game. It is primarily an action-adventure game with strong shooter elements. In fact, it was on our list of the best action-adventure games for Xbox.
Apart from being a shooter, there is also a strong survival element to the game. As such, both hunting and fishing are prominently featured in the game. Although it is possible to play through large parts of the game without having to resort to hunting and fishing, they can be an integral part of the game if you want to.
While we won’t say that the mechanics of hunting and fishing are anywhere close to some of the other games on this list, they do provide a lot in terms of exploration. Also, different areas provide different rewards for hunting, making Far Cry 6 a great game for those that want a bit of casual hunting action but nothing too serious.
Big Buck Hunter is a game that has been released in roughly the same form for PC, Xbox, and mobile devices. The game does not take itself too seriously, and nor does it offer a lot of depth. It simply offers a few levels that can be tackled alone or with one other player in local multiplayer.
Offering almost nothing in terms of realism, the game does try and succeed in inciting a few laughs. Considering that the game is not too expensive and is regularly on sale, it is something that is worth a try if you are looking for something extremely casual to play with a friend.
Duck Dynasty is a game that is based on the TV Show and Film of the same name. While this does mean that this is a game that is best suited only for those who have watched Duck Dynasty, it does do a great job of emulating the stupidity and hilarity of those programs.
There are a lot of things that you can do in the game, with hunting ducks just being one of them. However, this is a game that is bound to make you laugh. If you don’t know the characters, you will get to know them soon enough.
In fact, chances are that if you play Duck Dynasty, you may very well be obliged to watch the film once you have finished the game.
This is the second game on our list of the best hunting games for Xbox that is about fishing. Just like Fishing Planet, this is a game that primarily tries to recreate the experience of fishing in real life.
While Fishing Planet does a better job of emulating the fishing experience, Fishing Sim World does have a lot more bells and whistles. For one, there are multiple boats that you can take out to sea, with each of them having both advantages and disadvantages. On top of that, there are over 500 pieces of equipment for you to experiment with.
With 8 locations and over 18 species of fish, there is a lot for you to try in Fishing Sim World. There is also an online multiplayer component, although it only supports a maximum of four players.
Duck.Hunt is a game that was initially released for mobile devices. Since then, it has been released on the PC, the Xbox, and VR devices.
It really is a very simple game, where you simply have to kill ducks and rack up the highest score you can manage. Despite its extremely casual nature, it is very easy to get addicted to Duck.Hunt. Of course, this is primarily a mobile game, and an old one at that. As such, the visuals are downright terrible compared to everything else on the list.
Still, for a game that is absolutely free, we would recommend giving Duck.Hunt a try when you have a few minutes to kill.
Red Dead Redemption 2 is one of those games that is not primarily based around hunting, but it does have hunting as an integral part of its gameplay.
The mechanics when it comes to hunting are quite similar to the general gunplay found in the game. While that does mean that this isn't a simulation hunting game, the process of hunting itself is a lot more nuanced than you might initially imagine.
With over 500 species of animals for you to hunt, researching where to find the animal that you need plays a very important part. As such, this is a great game for hunters that enjoy the exploration part of the endeavor.
Dauntless is much more like Monster Hunter World than a standard hunting game. Set in a completely fictional world, you are a slayer who has been tasked with hunting various monsters. However, your goal in hunting these monsters is not to acquire loot but to save the world from destruction.
With an art style that we absolutely love, Dauntless is an extremely underrated game that not many people seem to know about. Just like Monster Hunter World, you can use the parts from the monsters you have slain to craft better armor and weapons for yourself.
If you are looking for a game that has strong hunting elements but also primarily focuses on its narrative, Dauntless is a great choice!
We end our list of the best hunting games for Xbox Series X / S with The Witcher 3: Wild Hunt. Do not let the name distract you, as this is not a game that is based around hunting. Instead, it plays like an action-adventure game, and is primarily focused around its story. It also made our list of the best single-player campaign games for the Xbox.
Just like some of the other games on this list, it is possible for you to hunt on your own time. However, you get to hunt monsters instead of animals in The Witcher 3. Not only can you hunt the monsters on your own time and reap the rewards, but there are also numerous quests and side quests that require you to hunt them down.
Battling the monsters isn't any different from battling human beings in the game. The only thing that is different is that monsters usually have different abilities. Still, we have to say that killing a large monster in Wild Hunt is a very satisfying experience.
Single player games can offer some of the most captivat... | 2023-14/0000/en_head.json.gz/3645 | {"url": "https://diamondlobby.com/platform/xbox/best-hunting-games-xbox-x-s/", "partition": "head_middle", "language": "en", "source_domain": "diamondlobby.com", "date_download": "2023-03-20T10:32:48Z", "digest": "sha1:UBCHLZKRBWLKYHGGDRGKZJ3VT3G75CNI"} | {"ccnet_length": [[0, 12123, 12123.0]], "ccnet_original_length": [[0, 12123, 13422.0]], "ccnet_nlines": [[0, 12123, 65.0]], "ccnet_original_nlines": [[0, 12123, 134.0]], "ccnet_language_score": [[0, 12123, 0.97]], "ccnet_perplexity": [[0, 12123, 217.6]], "ccnet_bucket": [[0, 12123, 0.0]], "rps_doc_curly_bracket": [[0, 12123, 0.0]], "rps_doc_ldnoobw_words": [[0, 12123, 1.0]], "rps_doc_lorem_ipsum": [[0, 12123, 0.0]], "rps_doc_stop_word_fraction": [[0, 12123, 0.49663899]], "rps_doc_ut1_blacklist": [[0, 12123, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 12123, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 12123, 0.07591378]], "rps_doc_frac_chars_dupe_6grams": [[0, 12123, 0.05092159]], "rps_doc_frac_chars_dupe_7grams": [[0, 12123, 0.03071957]], "rps_doc_frac_chars_dupe_8grams": [[0, 12123, 0.02447152]], "rps_doc_frac_chars_dupe_9grams": [[0, 12123, 0.01759867]], "rps_doc_frac_chars_top_2gram": [[0, 12123, 0.01301677]], "rps_doc_frac_chars_top_3gram": [[0, 12123, 0.0137457]], "rps_doc_frac_chars_top_4gram": [[0, 12123, 0.01499531]], "rps_doc_frac_all_caps_words": [[0, 12123, 0.00948992]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 12123, 0.09230769]], "rps_doc_frac_no_alph_words": [[0, 12123, 0.11348359]], "rps_doc_frac_unique_words": [[0, 12123, 0.25848214]], "rps_doc_mean_word_length": [[0, 12123, 4.28705357]], "rps_doc_symbol_to_word_ratio": [[0, 12123, 0.00237248]], "rps_doc_unigram_entropy": [[0, 12123, 5.33410529]], "rps_doc_word_count": [[0, 12123, 2240.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 78, 0.0], [78, 122, 0.0], [122, 133, 0.0], [133, 367, 1.0], [367, 699, 1.0], [699, 869, 1.0], [869, 1197, 1.0], [1197, 1212, 0.0], [1212, 1227, 0.0], [1227, 1255, 0.0], [1255, 1275, 0.0], [1275, 1298, 0.0], [1298, 1340, 0.0], [1340, 1350, 0.0], [1350, 1518, 1.0], [1518, 1838, 1.0], [1838, 2160, 1.0], [2160, 2398, 1.0], [2398, 2642, 1.0], [2642, 3022, 1.0], [3022, 3352, 1.0], [3352, 3621, 1.0], [3621, 3872, 1.0], [3872, 4214, 1.0], [4214, 4427, 1.0], [4427, 4757, 1.0], [4757, 5062, 1.0], [5062, 5188, 1.0], [5188, 5466, 1.0], [5466, 5826, 1.0], [5826, 6011, 1.0], [6011, 6334, 1.0], [6334, 6669, 1.0], [6669, 6963, 1.0], [6963, 7246, 1.0], [7246, 7514, 1.0], [7514, 7742, 1.0], [7742, 7874, 1.0], [7874, 8082, 1.0], [8082, 8424, 1.0], [8424, 8622, 1.0], [8622, 8760, 1.0], [8760, 9112, 1.0], [9112, 9234, 1.0], [9234, 9384, 1.0], [9384, 9634, 1.0], [9634, 9855, 1.0], [9855, 10147, 1.0], [10147, 10409, 1.0], [10409, 10546, 1.0], [10546, 10896, 1.0], [10896, 11225, 1.0], [11225, 11490, 1.0], [11490, 11535, 0.0], [11535, 11594, 1.0], [11594, 11631, 0.0], [11631, 11690, 1.0], [11690, 11738, 0.0], [11738, 11797, 1.0], [11797, 11840, 0.0], [11840, 11899, 1.0], [11899, 11945, 0.0], [11945, 12004, 1.0], [12004, 12065, 0.0], [12065, 12123, 1.0]], "rps_lines_javascript_counts": [[0, 78, 0.0], [78, 122, 0.0], [122, 133, 0.0], [133, 367, 0.0], [367, 699, 0.0], [699, 869, 0.0], [869, 1197, 0.0], [1197, 1212, 0.0], [1212, 1227, 0.0], [1227, 1255, 0.0], [1255, 1275, 0.0], [1275, 1298, 0.0], [1298, 1340, 0.0], [1340, 1350, 0.0], [1350, 1518, 0.0], [1518, 1838, 0.0], [1838, 2160, 0.0], [2160, 2398, 0.0], [2398, 2642, 0.0], [2642, 3022, 0.0], [3022, 
Tag Archives: James Thurber
Walking the Subtlety Tightrope
January 7, 2023 | Categories: subtlety in writing, Writing, writing methods | Tags: 3rd person limited, 3rd person omniscient, creative writing, Death in Midsummer, Henry James, James Thurber, John Steinbeck, Kate Chopin, popularity, power writing, Sigmund Freud, storytelling, The Chrysanthemum, The Dream of an Hour, The Secret Life of Walter Mitty, The Story of an Hour, third person limited, third person omniscient, Walter Mitty, Yukio Mishima | By Diane Gallagher
Photo by Marcelo Moreira on Pexels.com
Is subtlety a good thing for writers to insert into their stories?
Subtlety is a tightrope, sometimes a tightrope in a windstorm. When you write subtlety into your story and ask someone to be a beta reader, that person will probably get your subtlety if they are another writer. If you ask someone who just likes to read, they probably won't. The issue with having a writer read your story or novel is that they have spent hours, days, and often years evaluating stories for just such subtlety. A friend or colleague who is not a writer doesn't have that experience. These folks may not have the writer's vocabulary to tell you what does and doesn't work, but if you listen to them carefully, you can glean what problems your writing may or may not have, including too much subtlety.
To write with subtlety is, as I mentioned above, a tightrope. A story without much subtlety could win popularity with a wide audience; a story with a great deal of subtlety can be extremely powerful, but not everyone will get it.
It is at this point that you, the writer, must make a decision about power versus popularity. Something to keep in mind here is this: if you tip too far into the Subtle Sea, and you get reader feedback that they didn't understand your story, it is not the fault of the reader. This is the writer's story, and so any decisions we make as writers fall strictly in our laps. If our readers do not understand, it is our error, not theirs. So be strategic. Think about what you really want. Do you want everyone to love your book? Then use less subtlety. Do you want a very powerful story to knock the socks off some of your readers? Then more is better.
Has subtlety always been part of storytelling?
Intuitively we may want to answer 'Yes, of course!' but we would be wrong. In fact, it was not until the last century that adding subtlety became fashionable. In the 19th century, almost all writers wrote in third person omniscient, making subtlety unnecessary. The omniscient POV allows the reader to look into the minds of ALL the characters. Think Jane Austen or the Brontës. At the very end of the 19th century, Henry James began releasing stories written in the 3rd-person limited POV. In doing this, the readers no longer had the ability to see completely into the minds of all the characters. Even the protagonist can be unreliable for the reader. The reader can only truly be aware of what the protagonist is aware of. This is where subtlety can be important to the understanding of the theme, or even the basic plotline. Certainly, the growing popularity of the theories of Sigmund Freud had an impact on this change. People were slowly coming to realize that people don't always know why they do what they do. In this case, subtlety becomes important. Your characters may never figure out their own motivations, but as long as your readers can, then you are fine.
What can subtlety add to your writing?
Photo by Gaurav Ranjitkar on Pexels.com
As I said above, subtlety can make your story much more powerful. A very short story that has been incorrectly attributed to Hemingway is this:
For sale: baby shoes, never worn.
This is the height of subtlety. We do not know what the story is until the final two words—the death of a baby before she began to walk—and even then, a reader not used to subtlety might miss the meaning.
“The Chrysanthemums” by Steinbeck is a story rich with subtlety. This multi-layered story hints at the relationship between the protagonist and her husband as well as the protagonist's inner workings. You have to read very carefully between the lines, but if you can get the subtlety, it is a very powerful story.
“The Secret Life of Walter Mitty” by James Thurber shows us a protagonist who is, to the reader, clearly ridiculous, but Walter Mitty is oblivious. Actually, oblivious is not quite the right word. He is vaguely aware of his ridiculousness, but he chooses to ignore it. So, instead of ‘oblivious’, it seems that ‘in denial’ would be a better choice. The subtlety comes in the intersection between Mitty’s real world and his imaginary world.
“The Story of an Hour” AKA “The Dream of an Hour” by Kate Chopin is only three pages long, making it as short as some flash fiction. This story is ambiguous and because of that, intriguing. In three short pages, she hints at an analysis of death, marriage, and personal needs and desires. It is an hour in the life of a woman who has just been told that her husband is apparently dead. Because the story is so short, Chopin could only hint at these themes, but in doing so gives us a powerful story.
Another author who wrote with subtlety was Yukio Mishima. His collection of short stories under the title Death in Midsummer is masterful. He was able to write stories that are almost painfully subtle, however his readers are relieved of the burden of figuring out the underlying message because Mishima had complete control, not only over his entire story, but also over his last lines. These final lines are so strong, they feel like the proverbial punch in the gut. I found myself frowning when I first read his stories. I struggled to understand, and then–BOOM–the last line knocks you flat. In doing this he has the best of both worlds—the subtle and the obvious—making his stories truly powerful. The description on Amazon for Death in Midsummer sums up everything nicely:
Nine of Yukio Mishima’s finest stories were selected by Mishima himself for translation in this book; they represent his extraordinary ability to depict a wide variety of human beings in moments of significance. Often his characters are sophisticated modern Japanese who turn out to be not so liberated from the past as they had thought.
I strongly recommend reading Mishima’s short stories, or even his full-length novels.
I will wrap this up by saying, yes, be subtle, but not about anything that is critical to the understanding of your story. Another writer might figure out the intricacies but a nonwriter might just miss that which is critical. So jump on that tightrope and enjoy your balancing act as you swing to that perfect balance between powerful writing and popularity.
Photo by Kaique Lopes on Pexels.com
[Source: https://dianegallagherwritings.com/tag/james-thurber/]
Day 8 – Touring Potosi
Posted on January 14, 2016 (updated April 27, 2016)
We left the hacienda after breakfast and returned to Potosi once more. There was a convent tour organized as well as a trip to the silver mine. It was a sunny, bright morning as we walked into the convent. The guide was busy taking two French girls on the tour. As soon as I saw the courtyard and the collection of cacti, I was more interested in photographing the cactus than going on the tour. Eventually, we followed the guide and the girls and Paola explained everything to me in English. Here are a few details about the convent:
The Saint Theresa convent was built in 1685 and is still home to a small community of Carmelite nuns who have restored the building and converted part of it into a museum. The Carmelite Convent of St. Theresa is where the second daughters of the wealthy Potosi elite were sent, aged 15, to live in ascetic conditions. There were only occasional family visits but never face to face. Once in, never out! The nuns never saw the outside nor any other people besides the other nuns. Most of the day was spent in silence with only one hour of being allowed to talk. It seemed pretty harsh to me. The nuns spent their time praying and making jams and jellies and things to eat to sell to the people. The nuns were even buried within the building. Only 21 nuns were there at any one time. No one could be admitted unless another nun died. The families had to donate large sums and expensive artifacts in order for their child to be admitted.
There are several fine art pieces, including works by Melchor Perez de Holguin, Bolivia's most famous painter. There is one room with many painted wooden Christs. Each room is well preserved – locked and then unlocked by the guide. The building itself is impressive, as many of the works of art were paid for by the dowries given for the privilege of entering the convent. Although the idea of touring a convent was not that appealing, it did provide an interesting glimpse into the cloistered world that only changed character in the 1960s with the reforms of the Second Vatican Council.
Convent courtyard.
The original bell that was rung to call the nuns together.
Original piano played by the nuns, hidden behind the wooden screen.
Courtyard where the cacti were housed.
This cactus was really weird.
Cerro Rico
We finished the convent tour and made our way through the city to the silver mine. The area we passed through was generally where the workers lived. The streets were narrow and choked with traffic, the roads were dusty, and this part of town had lots of small shops, kiosks and street vendors. The longer we drove, the higher we went. We finally came to the mining area and at one point had to get out of the car to get closer to the mine.
Just a bit of history about the mine: today, after nearly 500 years of constant mining for the silver, tin and zinc that funded the Spanish empire and shaped Bolivia's economic fortunes, Cerro Rico's bones are weakened, and its iconic peak is caving in. Now the race is on to reinforce the mountaintop and save this national monument through a government-funded $2.4 million fill-in project. The urgency is doubled because Cerro Rico is not just a testament to Bolivia's history: it is also a maze of working mines that employ 15,000 miners, generating revenue that supports them.
Named a UNESCO world heritage site in 1987, along with the city of Potosi, the mountain’s unmistakable form appears on Bolivia’s currency and the national shield. It represents a history of tremendous wealth but also of the suffering of indigenous people and enslaved Africans who died mining it, earning Cerro Rico the title “the Mountain That Eats Men.”
Where some of the workers live.
Little puppy who barked at us as we left.
The rain clouds were rolling in as we left the mining area. We had a short trip to the market, but it was rather empty, as not much was going on in the afternoon. One Indigenous lady held up her broom like she was going to hit me when she saw my camera. I wasn't even going to take her photo. There was one area that Paola said was a local salad bar, so I took a few photos.
Just waiting for the customers.
We walked from there to the park where there were lots of vendors selling clothes. All of a sudden there was a huge bolt of lightning. Wish I had taken a photo. Never saw such a huge bolt and so close. A few more lightning bolts and it started to rain. Of course, I didn't have my umbrella, but I was more worried about my camera than myself. We waited in a nearby store until the driver found us. We jumped into the van and returned to the hacienda. It stopped raining just as we left Potosi.
Posted in Bolivia
Frassa, Norbert says:
Nice post and beautiful cactus!
Interesting about the convent!
[Source: https://diannestraveladventures.me/2016/01/14/day-8-continued/]
I Am Empowered: An Evaluation of STEPS, a Local Girls' Empowerment Program
Sarah Rosadini, Connecticut College
Joan Chrisler
Adolescent girls face many physical, cognitive, and social changes that can negatively affect their self-concept. Mentoring and empowerment programs that address the specific needs of adolescent girls can provide them with the tools to lead more empowered and positive lives. STEPS, a local girls' empowerment program, empowers New London, CT and Groton, CT middle and high-school students at their weekly meetings. The present study is an evaluation of this program. It was hypothesized that self-esteem, body-esteem, self-efficacy, and perceived social support would increase from the original baseline scores. Sixteen adolescent girls participating in the program completed the pre- and post-measures over a 10-week period. This study's findings reveal that self-efficacy increased over the time period, Wilks' Lambda = .54, F(1,15) = 12.55, p = .003, multivariate partial eta squared = .46. Perceptions of self-esteem, body-esteem, and perceived social support did not change. Participants' qualitative responses to open-ended questions include prominent themes of self-love, positivity, confidence, and awareness, which suggest that the measures might also reveal significant changes at the year's end of the program. Findings also revealed that 100% of STEPS girls would recommend the program to a friend. The girls greatly value the program and believe that it truly does foster a "safe space" for them to learn about themselves and others. These findings illustrate the positive impact of the STEPS program and will hopefully lead to greater funding and increased enrollment.
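A quick consistency check of the reported statistics (a sketch added here, not part of the original abstract): with a single pre/post contrast and N = 16 participants, the multivariate test reduces to F(1, 15), Wilks' Lambda equals one minus partial eta squared, and the p-value is the upper tail of the F distribution. In Python, with scipy assumed available:

    from scipy import stats

    F, df1, df2 = 12.55, 1, 15                     # values reported in the abstract
    partial_eta_sq = (F * df1) / (F * df1 + df2)   # 12.55 / 27.55, about 0.456 -> .46
    wilks_lambda = 1 - partial_eta_sq              # about 0.544 -> .54
    p_value = stats.f.sf(F, df1, df2)              # upper-tail probability, about .003

    print(round(partial_eta_sq, 2), round(wilks_lambda, 2), round(p_value, 3))

All three values round to the figures reported in the abstract, so the reported statistics are internally consistent.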
Rosadini, Sarah, "I Am Empowered: An Evaluation of STEPS, a Local Girls' Empowerment Program" (2017). Psychology Honors Papers. 66.
[Source: https://digitalcommons.conncoll.edu/psychhp/66/]
Iraqi News Wednesday Evening
Wednesday Evening Iraq CBI News Highlights 1-25-23
The Project To Delete The Zeros Of The Local Iraqi Currency: Between Timing And Feasibility
SEP 24, by dggoldjax | Poll: Qasim flavor still Tributary Sadiq / Union / Baghdad, 09.24.2011 12:26:29
The project to delete the zeros from the local currency, which the CBI announced on June 22, has stirred debate in economic and financial circles and drawn popular reactions questioning the timing and economic feasibility of a process that could cost millions of dollars and cause upheaval in the domestic market.
(Union) polled a number of experts and academics on the controversy over deleting the zeros from the local currency; the survey follows:
Ready To Erase The Zeros
The Iraqi Central Bank confirms its readiness to delete the zeros from the local currency, pointing out that the zeros added to the Iraqi currency over the past period have formed a large cash mass of (27) trillion dinars. Mudher Mohammed Salih, adviser to the Iraqi Central Bank, said:
This step is one of the Bank's strategic functions. The zeros added to the Iraqi currency during the past period have produced a large cash mass of more than four trillion banknotes across the denominations of the Iraqi currency, with a value of (27) trillion Iraqi dinars, after it stood at (25) billion dinars in 1980.
He pointed out that Iraq is unable to manage this mass: it is a small country, and the sheer quantity of money circulating in the Iraqi market has led to confusion in business dealings and in the work of the big banks.
He added that the bank is fully prepared to delete the zeros from the Iraqi dinar. The deletion of zeros is a national issue and may need special legislation, although for the Central Bank of Iraq reforming the currency-management system is at the heart of its work, and the deletion of zeros is one of its strategic functions. He noted that during the past period the bank was able to bring inflation down to a single digit after more than twenty years.
He indicated that Iraq was able for the first time to reduce inflation to 2%, which reflects the success of Iraq's monetary policy.
Fear Of Counterfeit Currency
The economic researcher Abdul Salam Jumaili says the project to delete the zeros is an organized mechanism for lifting three zeros to raise the value of the local currency in economic circulation, externally and internally. It is a strategic project aimed at reducing the inflation the Iraqi market has experienced, and it will free the Iraqi economy from the restrictions that have damaged the value of the Iraqi dinar in the World Bank's fund.
He added: where we now buy a dollar for 1,200 dinars, we would buy it for one dinar and 200 fils. The economic researcher Mohammed Hatem Al-Kaabi, for his part, says that most of the people who speak with him, supporters and opponents alike, disagree over whether the idea is a good one;
everyone is convinced that the deletion of zeros must happen some day, but the objectors, among whom I stand, point to the timing and feasibility of a process that could cost millions of dollars and could cause upheaval in a domestic market that obviously does not need more confusion.
A further point: this amount of money, four trillion banknotes according to reports of the Central Bank of Iraq, is difficult to change without controls. The questions are how the currency will be changed, and how manipulation can be ruled out during the switch; it is a matter of numbers and zeros. Al-Kaabi also asserts that deleting the zeros in this way is a green light for counterfeiting mafias to push forged banknotes into the exchange and purchase of receivables.
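To make the arithmetic concrete, here is a minimal sketch (my own illustration; the function names are not from the article) of what striking three zeros means for the figures quoted in this piece: the 1,200-dinars-per-dollar exchange rate here, and the three-billion-dinar company capital discussed further below.

    def strike_zeros(old_dinars, zeros=3):
        # Drop `zeros` zeros: face values shrink, purchasing power is unchanged.
        return old_dinars / 10 ** zeros

    def dinars_and_fils(amount):
        # Split a new-dinar amount into whole dinars and fils (1 dinar = 1,000 fils).
        dinars = int(amount)
        return dinars, round((amount - dinars) * 1000)

    print(dinars_and_fils(strike_zeros(1200)))   # (1, 200): one dinar and 200 fils per dollar
    print(strike_zeros(3_000_000_000))           # 3000000.0: a 3-billion capital becomes 3 million

The same division by 1,000 would apply to every price, wage, and balance sheet at the moment of conversion, which is why the debate below centers on logistics, statistics, and counterfeiting rather than on any change in real values.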
As for priorities, I will not say more than the specialists in this area, because the economic situation needs structural follow-up. As for the money that will be spent on the switch, I believe the sectors that matter to the citizen come first, such as health and education, along with increasing the country's productive capacity and adopting new sources of income as an alternative to oil, which will run dry one day and leave us in a desert with no water and no green in sight. Nor do I think the story is far from politics: there are those who want to say 'I made the dinar like the dollar,' and that phrase will pass the citizen by unnoticed.
Risks to economic security
The economic adviser in the government, Hussain Anbuge, said the deletion of zeros from the currency would increase the phenomenon of money laundering, and the government is working to convince the central bank to stop implementing the project, indicating that the government had informed the Central Bank of a risk to economic security should the bank insist on deleting the zeros from the local currency.
He continued: the process costs the government and the central bank time and effort, and it raises the level of fraud, with some terrorist groups carrying out the smuggling of money out of the country. As for the stock market in Iraq,
it, in turn, will be affected by the deletion of zeros from the Iraqi dinar, according to Mohammed Ismail, a member of the federation of Iraqi brokers, who said:
that if the zeros are deleted from the currency, companies' capital will drop by the same proportion: if a company's capital is three billion, for example, it will become three million dinars. In that case the market will have to be closed for a certain period so that new share certificates can be issued to holders, with the number of shares left as it stands. He also points out:
that removing the zeros is a difficult and complex economic process that needs accurate figures and statistics for all economic activities, which Iraq does not have. Banking offices, in turn, have expressed their discontent with the process. Ahmed Rikabi, who runs a banking office in the Karrada district and refused to have his premises photographed, explained: given the security situation the banking offices are living through, how can a country that has proved to the world it is unable to protect its banks, its markets, and its banking offices replace its currency,
and how can the zeros be removed while the ordinary citizen remains concretely ignorant of what happens in financial transactions, and of where the budgets of past years, which exceeded the barrier of $300 billion, have gone. As for the banking offices, he says:
All I fear is growing numbers of counterfeit banknotes, which will certainly dominate the market as the idea of changing and deleting the zeros spreads. He indicated that the project to delete the zeros from the currency will harm the Iraqi economy if it is not carefully examined beforehand by specialists in financial and monetary matters, and he is certain that if the monetary mass remains unchanged, the project to delete the zeros has no value.
As for Dr. Bassem Jamil Antoine, vice-president of the Federation of Iraqi Businessmen, he said: In fact, removing the zeros means reducing the currency's size, and I do not think this step alone will affect an Iraqi economy marked by recession and a lack of growth. If the process is to bear fruit, it must be accompanied by economic steps
and clear economic plans. The whole process needs preparation and logistical groundwork, through seminars, meetings, and workshops held by Iraqi banks for the ordinary citizen, so that people are aware of the move, are not fooled, and do not fall easy prey to gangs that extort and laugh at the simple citizen.
I think the issue needs more than a year of study, so that the topic is properly researched, all segments of the community are educated, and all opinions are heard. He noted that the Iraqi currency is divided into two parts:
the first in the banks, and the second with the citizen, so we need tightened security measures. With significant economic gains alongside, the dinar could then be switched from a local currency into a global currency.
On inflation and its relationship to the zeros, he said inflation is never addressed this way; it is addressed through scientific studies that can promote the private sector, eliminate unemployment, revive the industrial sector, both government and private, and promote the agricultural sector.
It is no secret that 80 percent of Iraq's consumption basket is imported from abroad, which means hard currency leaves the country daily, and this with reference to the seven million Iraqis living below the poverty line. Antoine also notes that this experiment is not the product of Iraqi thought: Italy and Turkey came before, each deleting six zeros, though the timing and the necessities of the phase differ. The economic researcher Karim Hilfi, meanwhile, believes:
The Central Bank's decision to delete three zeros over the long term is not a new decision on the economic scene; several countries have preceded us, Germany in particular. But Iraq lacks the objective conditions for the deletion, because it suffers from structural economic problems and strong inflation, as well as high unemployment. How will the zeros be deleted from the dinar, which is basically non-existent?
He distinguished between the terms: Iraq is unable to move forward with this decision in either the short or the long run, explaining that deleting the zeros becomes possible when output is strong, and when structural imbalances, inflation, and unemployment above 10% are absent, noting:
the decision cannot be implemented except after a long time; even if it were carried out within a year or two, the impact would be negative, because the economy is now in recession.
In turn, the academic researcher Hilal Taan noted that the project to delete the zeros was proposed by former Finance Minister Bayan Jabr, and that it can be studied and applied over the medium or long term; it cannot be applied until the economic situation stabilizes and the Iraqi dinar's exchange rate improves.
Taan added: the project has several requirements, the most important being an economy stable in certain respects, and it needs substantial financial resources in addition to foreign balances supporting the Iraqi dinar. It could have been applied during the past period were it not for the effects of the global financial crisis on the Iraqi economy.
An economic and security expert considered the deletion of zeros from the currency a good step, but one that requires greater media and economic efforts, the reason being that the market will be shaken as soon as citizens come to believe that the deletion of zeros will negatively affect the real value of the coin.
She added that the real value of the coin is the value of the goods an individual receives for it, and that rests on the strength of the national economy, saying these things can be achieved through a stable security situation, in addition to a major media and economic effort to educate citizens about the positive results of removing the zeros.
Citizens propose
Ali Faraj, 35, says the deletion of zeros from the currency will help raise purchasing power, but also that lifting the zeros and changing the current currency will create confusion in monetary policy without any change in the currency's real purchasing value.
He added that there is a risk of foreign currency leaking abroad after the ratification of the free-exchange policy, especially as the central bank's dollar auctions have contributed to the exit of large amounts of foreign currency from Iraq.
The employee Muhannad Salem, meanwhile, says the decision to delete the zeros from the local currency is not feasible and will not raise the value of the Iraqi dinar, while its application would add multi-dimensional problems to those the Iraqi economy already suffers, including confusion and disorganization of the market in buying, selling, and debts among citizens, as well as the mental strain that will beset the citizen as a new phenomenon emerges and demands adaptation. Hamza Hussein, for his part, is of the view:
that before it is applied the project requires minting small denominations representing parts of the dinar, and these coins would cost the state large sums. Nor does the project enhance the value of the Iraqi dinar, whose value will remain as it is, the results being equal. He is certain of:
the need to handle this project with caution, because it may cause economic problems for the Iraqi Central Bank, and of the need to present the project to specialists in economics for their opinion before it is applied. The citizen Hassan Hamid, for his part, pointed to the approval of a project that aims to restore the Iraqi dinar to its former value, equal to $3.33 U.S., through several measures, including a review of proposed imports, because imports drain excess foreign currency deposited with the Central Bank of Iraq, as well as reducing public spending by the state and its departments on imported expensive furniture that can be obtained through the local market.
http://translate.google.com/translate?hl=en&langpair=ar%7Cen&u=http://www.pukmedia.com/component/content/20916%3Ftask%3Dview
Categories: Iraqi News
Tags: evening, iraqi, Markz, news, wednesday
KTFA “Vietnam News” Henig Wednesday
KTFA Wednesday Night CC With Frank26 “Numbers”
[Source: https://dinardetectives.com/iraqi-news-wednesday-evening-2/]
Dispensary Ranked
YES! GET A FREE PROFIT PROJECTION NOW!
FREE TRAFFIC PROJECTION
7 STEPS TO INCREASED EXPOSURE ON GOOGLE
Traffic Projection
Copyright © 2022 Dispensary Ranked | All Rights Reserved.
Any earnings or income statements, or earnings or income examples, increases in revenue achieved or mentioned are only estimates of what we think you could achieve. There is no assurance you'll do as well; if you rely upon our figures, you must accept the risk of not doing as well. Where specific income figures are used, and attributed to an individual or business, those persons or businesses have earned that amount. Any and all claims or representations, as to income earnings on this web site, are not to be considered as average earnings. There can be no assurance that any prior successes, or past results, as to income earnings, can be used as an indication of your future success or results.
Monetary and income results are based on many factors. We have no way of knowing how well you will do, as we do not know you, your background, your work ethic, or your business skills or practices. Therefore we do not guarantee or imply that you will win any incentives or prizes that may be offered, get rich, do as well, or make any money at all. Internet businesses and earnings derived therefrom have unknown risks involved, and are not suitable for everyone. Making decisions based on any information presented in our products, services, or web site should be done only with the knowledge that you could experience smaller gains or none at all. All products and services by our company are for educational and informational purposes only. Use caution and seek the advice of qualified professionals. Check with your accountant, lawyer or professional adviser before acting on this or any information. Users of our products, services and web site are advised to do their own due diligence when it comes to making business decisions, and all information, products, and services that have been provided should be independently verified by your own qualified professionals. Our information, products, and services on this web site should be carefully considered and evaluated before reaching a business decision on whether to rely on them. All disclosures and disclaimers made herein or on our site apply equally to any offers, prizes, or incentives that may be made by our company.
You agree that our company is not responsible for the success or failure of your business decisions relating to any information presented by our company, or our company products or services. Thank you for your understanding of the disclosures required for us to remain compliant.
[Source: https://dispensaryranked.com/my-account/]
Posts tagged “Best Digital Marketing Agency”
Tag: Best Digital Marketing Agency
The Essential Guide To Digital Marketing In 2023: How It Will Help Your Business Succeed: Diviants
As the year winds down, it’s time to start looking ahead to the future of Digital Marketing and how it will help your business succeed in 2023. Diviants’ great essential guide will provide an overview of what digital marketing is, why it is important for businesses in the modern world, and how you can use it […]
Grow your Business with Diviants | Best Digital Marketing Agency in Bangalore
The digital marketing landscape is constantly evolving and it can be hard to keep up with the latest trends and best practices. As a business owner, you may be feeling overwhelmed and uncertain about how to grow your business in the digital age. You’re not alone – many businesses are struggling to keep up with […] | 2023-14/0000/en_head.json.gz/3651 | {"url": "https://diviants.com/tag/best-digital-marketing-agency/", "partition": "head_middle", "language": "en", "source_domain": "diviants.com", "date_download": "2023-03-20T08:54:49Z", "digest": "sha1:5YBTWHUS6XYK3TG34UWSF3OYWGINSWQU"} | {"ccnet_length": [[0, 881, 881.0]], "ccnet_original_length": [[0, 881, 2635.0]], "ccnet_nlines": [[0, 881, 6.0]], "ccnet_original_nlines": [[0, 881, 77.0]], "ccnet_language_score": [[0, 881, 0.87]], "ccnet_perplexity": [[0, 881, 238.4]], "ccnet_bucket": [[0, 881, 0.0]], "rps_doc_curly_bracket": [[0, 881, 0.0]], "rps_doc_ldnoobw_words": [[0, 881, 0.0]], "rps_doc_lorem_ipsum": [[0, 881, 0.0]], "rps_doc_stop_word_fraction": [[0, 881, 0.4011976]], "rps_doc_ut1_blacklist": [[0, 881, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 881, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 881, 0.08938547]], "rps_doc_frac_chars_dupe_6grams": [[0, 881, 0.08938547]], "rps_doc_frac_chars_dupe_7grams": [[0, 881, 0.08938547]], "rps_doc_frac_chars_dupe_8grams": [[0, 881, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 881, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 881, 0.15642458]], "rps_doc_frac_chars_top_3gram": [[0, 881, 0.05586592]], "rps_doc_frac_chars_top_4gram": [[0, 881, 0.0726257]], "rps_doc_frac_all_caps_words": [[0, 881, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 881, 0.0]], "rps_doc_frac_no_alph_words": [[0, 881, 0.1257485]], "rps_doc_frac_unique_words": [[0, 881, 0.52666667]], "rps_doc_mean_word_length": [[0, 881, 4.77333333]], "rps_doc_symbol_to_word_ratio": [[0, 881, 0.01197605]], "rps_doc_unigram_entropy": [[0, 881, 4.1193218]], "rps_doc_word_count": [[0, 881, 150.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 45, 1.0], [45, 80, 0.0], [80, 179, 0.0], [179, 488, 0.0], [488, 566, 0.0], [566, 881, 0.0]], "rps_lines_javascript_counts": [[0, 45, 0.0], [45, 80, 0.0], [80, 179, 0.0], [179, 488, 0.0], [488, 566, 0.0], [566, 881, 0.0]], "rps_lines_num_words": [[0, 45, 6.0], [45, 80, 5.0], [80, 179, 16.0], [179, 488, 56.0], [488, 566, 11.0], [566, 881, 56.0]], "rps_lines_numerical_chars_fraction": [[0, 45, 0.0], [45, 80, 0.0], [80, 179, 0.04166667], [179, 488, 0.02649007], [488, 566, 0.0], [566, 881, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 45, 0.0], [45, 80, 0.0], [80, 179, 0.0], [179, 488, 0.0], [488, 566, 0.0], [566, 881, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 45, 0.11111111], [45, 80, 0.14285714], [80, 179, 0.15151515], [179, 488, 0.01294498], [488, 566, 0.1025641], [566, 881, 0.00952381]], "rps_doc_ml_palm_score": [[0, 881, -9.78e-06]], "rps_doc_ml_wikipedia_score": [[0, 881, null]], "rps_doc_ml_wikiref_score": [[0, 881, 0.00035107]], "rps_doc_books_importance": [[0, 881, -80.72130525]], "rps_doc_openwebtext_importance": [[0, 881, -21.1089694]], "rps_doc_wikipedia_importance": [[0, 881, -80.82280162]], "rps_doc_num_sentences": [[0, 881, 4.0]]} |
9 Reasons Houses Get Condemned
When a local government declares a house or some other property as condemned, it means it is unsuitable for public use. Inspectors determine that, for a variety of reasons, the home is no longer suitable to live in or use for any other purpose.
Condemning a house can happen for several reasons. Homes are condemned because of severe water damage from a flood, fire damage, mold and rodent infestation, termites, poor construction, and anything that presents a significant health hazard.
Anything that would make the building unsafe for people inside to the point where there is a real risk is reason enough to condemn.
Any condemned house or building will likely require significant investment to get it back up to code and reverse the government’s decision.
Read on to learn more about the process involved in condemning a house and some of the most common reasons why homes are condemned.
What Does It Mean When a House Is Condemned?
9 Reasons Houses Can Be Condemned
1. Poor Construction Quality
2. Flood Damage
3. Black Mold
4. Termites
5. Drugs or Industrial Toxins
6. Hygiene Concerns
7. Fire Damage
8. Vacant Homes
9. Hurricanes
Buying a Condemned Home
Usually, houses are condemned as a last resort. Before a house is condemned, the government or other governing bodies have likely exhausted all other efforts to remedy the situation.
A house getting condemned can start with complaints from neighbors about a rat infestation or a leaning structure that threatens the properties around it.
Whatever the driver behind the complaints, the city or town where the house is located will likely have issued several written notices and warnings. The notices outline the issue and what is required to avoid escalating the situation further.
If the owners take no action, then ultimately the house is declared condemned, an order to vacate is issued, and it is no longer certified for occupancy. The local government secures it and no one is allowed to live there.
If no resolution is reached, public money pays to demolish the house or it goes to auction, and the new owner has to bring it up to code.
In most cases, a certificate of occupancy must be issued for people to move back in.
There are a lot of reasons houses can be condemned. Really any situation in which there is a significant safety risk to the occupants or people in the area is a reason to condemn a house. Let’s take a look at some of the most common reasons.
A house typically has to meet stringent construction standards. However, as with most things in life, not everyone follows the letter of the law.
Unscrupulous contractors cut corners to save money or mistakes are made. Sometimes older buildings weren’t built with the same attention to detail as modern homes.
As a result, over years or decades, the house can shift more than normal and its structural integrity is compromised. When that happens, the local government may condemn the house because it’s unsafe to live in.
When flood damage is so serious that it poses a biohazard to people inside or it affects the structure of the home, it may be condemned.
This usually happens when flooding is so severe that proper restoration is all but impossible. It’s easier to demolish the house and start from scratch.
A lot of homes deal with mold issues, especially if the property is located in a humid climate. That’s why so many mold remediation companies exist.
They come, clear out the area with mold, and restore the drywall or flooring where the mold was located.
However, sometimes black mold infections are so severe that restoration doesn’t make sense. When this happens, the health of people inside the home and in the surrounding area is at risk and the house is condemned.
Termites, when spotted early on, can be dealt with relatively easily. They don’t cause that much damage to a house. But years of termites weaken wood beams and threaten the structural integrity of the home.
If there’s been a long-term infestation, it’s possible that fixing the termite problem is impossible. You can get rid of the termites, but the damage has been done.
If a house was used to produce illegal drugs or there is the presence of chemical toxins, the government may condemn it for health reasons.
The chemicals left behind can last for years, and they could cause negative health consequences for anyone living in the house.
You’ve probably seen TV shows about hoarders and what can happen to a house that is overloaded with unsanitary items.
The government can condemn a house when clearing out the house won’t work with someone living inside it. This prevents people from living there until it’s cleaned and sanitary again.
For what are likely obvious reasons, fire damage is a leading cause of houses getting condemned.
Fire damages wood beams, weakens floors, and trailing smoke damage can be a health hazard for anyone in the house.
Abandoned homes are condemned as a way to move forward with finding a new owner or demolishing them to clear the way for new construction. It can happen a lot in cities.
Vagrants end up living in abandoned homes, and crime rates go up. Governments condemn them to remove dilapidated houses.
Thousands of homes along the Gulf Coast were condemned as a result of hurricanes. Large storm systems can tear sections of the roof off, damage plumbing systems, and cause entire homes to shift off their foundation.
When damage like this happens, it’s easier to condemn the house and arrange for demolition than to deal with repairs.
In these examples, condemning a house is the fastest way to get people back in their homes.
These are just some of the reasons why houses get condemned. Not every condemned home gets demolished, and typically property owners can work with the city or local government to get the house back on track.
When a government condemns a home and orders its residents to vacate the premises, they often will provide funding to help with moving expenses. The assistance helps the residents get set up in a new house or apartment.
What if you’re interested in buying a condemned home and making it a project? Rehabbing a condemned home can be complex, and it will take more work than your typical flip.
However, under the right circumstances, a condemned home can be a great deal. Think about it. What if the house is condemned just because it’s a hoarder home?
All it would take is some muscle and deep cleaning, maybe some paint and light contractor work to get it back in shape again.
Most local governments don’t want to condemn a home. It’s a blight that brings down neighborhood values and takes away from the essence of a neighborhood. A lot of the time, cities are eager to help people who want to rehab them. There may even be some incentives to help with the project.
https://sellhomefast.org/why-does-property-get-condemned/
https://newenglandenterprises.com/reasons-condemn-a-house/
https://www.webuyuglyhouses.com/blog/condemned-house
https://jackjenkinshomes.com/buying-a-condemned-house-risks-and-rewards-for-bargain-seeking-buyers/
Can a Toilet Go Bad: Common Toilet Issues and What to Do About Them
Can You Bury an Extension Cord? | 2023-14/0000/en_head.json.gz/3652 | {"url": "https://diyallday.com/reasons-houses-get-condemned/", "partition": "head_middle", "language": "en", "source_domain": "diyallday.com", "date_download": "2023-03-20T08:57:16Z", "digest": "sha1:DUI6GTMXV2TRDBRDXFV5H7ESITB7WJML"} | {"ccnet_length": [[0, 7179, 7179.0]], "ccnet_original_length": [[0, 7179, 9297.0]], "ccnet_nlines": [[0, 7179, 59.0]], "ccnet_original_nlines": [[0, 7179, 108.0]], "ccnet_language_score": [[0, 7179, 0.94]], "ccnet_perplexity": [[0, 7179, 292.4]], "ccnet_bucket": [[0, 7179, 0.0]], "rps_doc_curly_bracket": [[0, 7179, 0.0]], "rps_doc_ldnoobw_words": [[0, 7179, 0.0]], "rps_doc_lorem_ipsum": [[0, 7179, 0.0]], "rps_doc_stop_word_fraction": [[0, 7179, 0.42498255]], "rps_doc_ut1_blacklist": [[0, 7179, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 7179, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 7179, 0.03374656]], "rps_doc_frac_chars_dupe_6grams": [[0, 7179, 0.00895317]], "rps_doc_frac_chars_dupe_7grams": [[0, 7179, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 7179, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 7179, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 7179, 0.01790634]], "rps_doc_frac_chars_top_3gram": [[0, 7179, 0.00964187]], "rps_doc_frac_chars_top_4gram": [[0, 7179, 0.00585399]], "rps_doc_frac_all_caps_words": [[0, 7179, 0.00348918]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 7179, 0.0]], "rps_doc_frac_no_alph_words": [[0, 7179, 0.13538032]], "rps_doc_frac_unique_words": [[0, 7179, 0.36469609]], "rps_doc_mean_word_length": [[0, 7179, 4.83597002]], "rps_doc_symbol_to_word_ratio": [[0, 7179, 0.0]], "rps_doc_unigram_entropy": [[0, 7179, 5.3106899]], "rps_doc_word_count": [[0, 7179, 1201.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 31, 0.0], [31, 276, 1.0], [276, 519, 1.0], [519, 651, 1.0], [651, 791, 1.0], [791, 923, 1.0], [923, 968, 1.0], [968, 1002, 0.0], [1002, 1031, 0.0], [1031, 1047, 0.0], [1047, 1061, 0.0], [1061, 1073, 0.0], [1073, 1103, 0.0], [1103, 1123, 0.0], [1123, 1138, 0.0], [1138, 1154, 0.0], [1154, 1168, 0.0], [1168, 1192, 0.0], [1192, 1367, 1.0], [1367, 1516, 1.0], [1516, 1759, 1.0], [1759, 1981, 1.0], [1981, 2119, 1.0], [2119, 2204, 1.0], [2204, 2446, 1.0], [2446, 2592, 1.0], [2592, 2756, 1.0], [2756, 2968, 1.0], [2968, 3105, 1.0], [3105, 3258, 1.0], [3258, 3407, 1.0], [3407, 3512, 1.0], [3512, 3727, 1.0], [3727, 3934, 1.0], [3934, 4099, 1.0], [4099, 4239, 1.0], [4239, 4367, 1.0], [4367, 4485, 1.0], [4485, 4668, 1.0], [4668, 4765, 1.0], [4765, 4880, 1.0], [4880, 5050, 1.0], [5050, 5171, 1.0], [5171, 5387, 1.0], [5387, 5505, 1.0], [5505, 5597, 1.0], [5597, 5805, 1.0], [5805, 6025, 1.0], [6025, 6197, 1.0], [6197, 6356, 1.0], [6356, 6482, 1.0], [6482, 6772, 1.0], [6772, 6830, 0.0], [6830, 6889, 0.0], [6889, 6942, 0.0], [6942, 7042, 0.0], [7042, 7080, 0.0], [7080, 7148, 0.0], [7148, 7179, 1.0]], "rps_lines_javascript_counts": [[0, 31, 0.0], [31, 276, 0.0], [276, 519, 0.0], [519, 651, 0.0], [651, 791, 0.0], [791, 923, 0.0], [923, 968, 0.0], [968, 1002, 0.0], [1002, 1031, 0.0], [1031, 1047, 0.0], [1047, 1061, 0.0], [1061, 1073, 0.0], [1073, 1103, 0.0], [1103, 1123, 0.0], [1123, 1138, 0.0], [1138, 1154, 0.0], [1154, 1168, 0.0], [1168, 1192, 0.0], [1192, 1367, 0.0], [1367, 1516, 0.0], [1516, 1759, 0.0], [1759, 1981, 0.0], [1981, 2119, 0.0], [2119, 2204, 0.0], [2204, 2446, 0.0], [2446, 2592, 0.0], [2592, 2756, 0.0], [2756, 2968, 0.0], [2968, 3105, 0.0], [3105, 3258, 0.0], [3258, 3407, 0.0], [3407, 3512, 0.0], [3512, 3727, 0.0], [3727, 3934, 0.0], [3934, 
4099, 0.0], [4099, 4239, 0.0], [4239, 4367, 0.0], [4367, 4485, 0.0], [4485, 4668, 0.0], [4668, 4765, 0.0], [4765, 4880, 0.0], [4880, 5050, 0.0], [5050, 5171, 0.0], [5171, 5387, 0.0], [5387, 5505, 0.0], [5505, 5597, 0.0], [5597, 5805, 0.0], [5805, 6025, 0.0], [6025, 6197, 0.0], [6197, 6356, 0.0], [6356, 6482, 0.0], [6482, 6772, 0.0], [6772, 6830, 0.0], [6830, 6889, 0.0], [6889, 6942, 0.0], [6942, 7042, 0.0], [7042, 7080, 0.0], [7080, 7148, 0.0], [7148, 7179, 0.0]], "rps_lines_num_words": [[0, 31, 5.0], [31, 276, 44.0], [276, 519, 36.0], [519, 651, 24.0], [651, 791, 22.0], [791, 923, 24.0], [923, 968, 9.0], [968, 1002, 6.0], [1002, 1031, 4.0], [1031, 1047, 3.0], [1047, 1061, 3.0], [1061, 1073, 2.0], [1073, 1103, 5.0], [1103, 1123, 3.0], [1123, 1138, 3.0], [1138, 1154, 3.0], [1154, 1168, 2.0], [1168, 1192, 4.0], [1192, 1367, 28.0], [1367, 1516, 24.0], [1516, 1759, 39.0], [1759, 1981, 40.0], [1981, 2119, 28.0], [2119, 2204, 16.0], [2204, 2446, 47.0], [2446, 2592, 24.0], [2592, 2756, 25.0], [2756, 2968, 36.0], [2968, 3105, 26.0], [3105, 3258, 25.0], [3258, 3407, 26.0], [3407, 3512, 19.0], [3512, 3727, 36.0], [3727, 3934, 35.0], [3934, 4099, 28.0], [4099, 4239, 25.0], [4239, 4367, 21.0], [4367, 4485, 20.0], [4485, 4668, 30.0], [4668, 4765, 16.0], [4765, 4880, 20.0], [4880, 5050, 32.0], [5050, 5171, 19.0], [5171, 5387, 36.0], [5387, 5505, 20.0], [5505, 5597, 17.0], [5597, 5805, 36.0], [5805, 6025, 38.0], [6025, 6197, 31.0], [6197, 6356, 28.0], [6356, 6482, 24.0], [6482, 6772, 53.0], [6772, 6830, 1.0], [6830, 6889, 1.0], [6889, 6942, 1.0], [6942, 7042, 1.0], [7042, 7080, 7.0], [7080, 7148, 14.0], [7148, 7179, 6.0]], "rps_lines_numerical_chars_fraction": [[0, 31, 0.03333333], [31, 276, 0.0], [276, 519, 0.0], [519, 651, 0.0], [651, 791, 0.0], [791, 923, 0.0], [923, 968, 0.0], [968, 1002, 0.03030303], [1002, 1031, 0.03703704], [1031, 1047, 0.07142857], [1047, 1061, 0.08333333], [1061, 1073, 0.1], [1073, 1103, 0.03571429], [1103, 1123, 0.05555556], [1123, 1138, 0.07692308], [1138, 1154, 0.07142857], [1154, 1168, 0.08333333], [1168, 1192, 0.0], [1192, 1367, 0.0], [1367, 1516, 0.0], [1516, 1759, 0.0], [1759, 1981, 0.0], [1981, 2119, 0.0], [2119, 2204, 0.0], [2204, 2446, 0.0], [2446, 2592, 0.0], [2592, 2756, 0.0], [2756, 2968, 0.0], [2968, 3105, 0.0], [3105, 3258, 0.0], [3258, 3407, 0.0], [3407, 3512, 0.0], [3512, 3727, 0.0], [3727, 3934, 0.0], [3934, 4099, 0.0], [4099, 4239, 0.0], [4239, 4367, 0.0], [4367, 4485, 0.0], [4485, 4668, 0.0], [4668, 4765, 0.0], [4765, 4880, 0.0], [4880, 5050, 0.0], [5050, 5171, 0.0], [5171, 5387, 0.0], [5387, 5505, 0.0], [5505, 5597, 0.0], [5597, 5805, 0.0], [5805, 6025, 0.0], [6025, 6197, 0.0], [6197, 6356, 0.0], [6356, 6482, 0.0], [6482, 6772, 0.0], [6772, 6830, 0.0], [6830, 6889, 0.0], [6889, 6942, 0.0], [6942, 7042, 0.0], [7042, 7080, 0.02702703], [7080, 7148, 0.0], [7148, 7179, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 31, 0.0], [31, 276, 0.0], [276, 519, 0.0], [519, 651, 0.0], [651, 791, 0.0], [791, 923, 0.0], [923, 968, 0.0], [968, 1002, 0.0], [1002, 1031, 0.0], [1031, 1047, 0.0], [1047, 1061, 0.0], [1061, 1073, 0.0], [1073, 1103, 0.0], [1103, 1123, 0.0], [1123, 1138, 0.0], [1138, 1154, 0.0], [1154, 1168, 0.0], [1168, 1192, 0.0], [1192, 1367, 0.0], [1367, 1516, 0.0], [1516, 1759, 0.0], [1759, 1981, 0.0], [1981, 2119, 0.0], [2119, 2204, 0.0], [2204, 2446, 0.0], [2446, 2592, 0.0], [2592, 2756, 0.0], [2756, 2968, 0.0], [2968, 3105, 0.0], [3105, 3258, 0.0], [3258, 3407, 0.0], [3407, 3512, 0.0], [3512, 3727, 0.0], [3727, 3934, 0.0], [3934, 4099, 0.0], [4099, 4239, 
0.0], [4239, 4367, 0.0], [4367, 4485, 0.0], [4485, 4668, 0.0], [4668, 4765, 0.0], [4765, 4880, 0.0], [4880, 5050, 0.0], [5050, 5171, 0.0], [5171, 5387, 0.0], [5387, 5505, 0.0], [5505, 5597, 0.0], [5597, 5805, 0.0], [5805, 6025, 0.0], [6025, 6197, 0.0], [6197, 6356, 0.0], [6356, 6482, 0.0], [6482, 6772, 0.0], [6772, 6830, 0.0], [6830, 6889, 0.0], [6889, 6942, 0.0], [6942, 7042, 0.0], [7042, 7080, 0.0], [7080, 7148, 0.0], [7148, 7179, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 31, 0.12903226], [31, 276, 0.00816327], [276, 519, 0.00823045], [519, 651, 0.00757576], [651, 791, 0.00714286], [791, 923, 0.00757576], [923, 968, 0.17777778], [968, 1002, 0.14705882], [1002, 1031, 0.10344828], [1031, 1047, 0.125], [1047, 1061, 0.14285714], [1061, 1073, 0.08333333], [1073, 1103, 0.1], [1103, 1123, 0.1], [1123, 1138, 0.13333333], [1138, 1154, 0.125], [1154, 1168, 0.07142857], [1168, 1192, 0.125], [1192, 1367, 0.01142857], [1367, 1516, 0.00671141], [1516, 1759, 0.00823045], [1759, 1981, 0.00900901], [1981, 2119, 0.00724638], [2119, 2204, 0.01176471], [2204, 2446, 0.01239669], [2446, 2592, 0.01369863], [2592, 2756, 0.01219512], [2756, 2968, 0.00943396], [2968, 3105, 0.00729927], [3105, 3258, 0.0130719], [3258, 3407, 0.01342282], [3407, 3512, 0.00952381], [3512, 3727, 0.00930233], [3727, 3934, 0.01449275], [3934, 4099, 0.01212121], [4099, 4239, 0.00714286], [4239, 4367, 0.0078125], [4367, 4485, 0.02542373], [4485, 4668, 0.01092896], [4668, 4765, 0.01030928], [4765, 4880, 0.00869565], [4880, 5050, 0.01176471], [5050, 5171, 0.01652893], [5171, 5387, 0.01851852], [5387, 5505, 0.00847458], [5505, 5597, 0.01086957], [5597, 5805, 0.00961538], [5805, 6025, 0.00909091], [6025, 6197, 0.01162791], [6197, 6356, 0.01886792], [6356, 6482, 0.00793651], [6482, 6772, 0.0137931], [6772, 6830, 0.0], [6830, 6889, 0.0], [6889, 6942, 0.0], [6942, 7042, 0.0], [7042, 7080, 0.13157895], [7080, 7148, 0.16176471], [7148, 7179, 0.16129032]], "rps_doc_ml_palm_score": [[0, 7179, 0.19967759]], "rps_doc_ml_wikipedia_score": [[0, 7179, null]], "rps_doc_ml_wikiref_score": [[0, 7179, 0.26348144]], "rps_doc_books_importance": [[0, 7179, -312.48884067]], "rps_doc_openwebtext_importance": [[0, 7179, 88.32676114]], "rps_doc_wikipedia_importance": [[0, 7179, -309.50091801]], "rps_doc_num_sentences": [[0, 7179, 83.0]]} |
How to Make a Clay Bowl by Hand
If you're looking for a new and unique DIY project, look no further than making a clay bowl by hand. This is a great way to show off your crafting skills, and ... | 2023-14/0000/en_head.json.gz/3653 | {"url": "https://diyquickly.com/category/bowl/", "partition": "head_middle", "language": "en", "source_domain": "diyquickly.com", "date_download": "2023-03-20T09:56:04Z", "digest": "sha1:E3IGXJXER5JGNT3HQTC7DPVTIQJEBA3H"} | {"ccnet_length": [[0, 194, 194.0]], "ccnet_original_length": [[0, 194, 6377.0]], "ccnet_nlines": [[0, 194, 2.0]], "ccnet_original_nlines": [[0, 194, 415.0]], "ccnet_language_score": [[0, 194, 0.97]], "ccnet_perplexity": [[0, 194, 199.6]], "ccnet_bucket": [[0, 194, 0.0]], "rps_doc_curly_bracket": [[0, 194, 0.0]], "rps_doc_ldnoobw_words": [[0, 194, 0.0]], "rps_doc_lorem_ipsum": [[0, 194, 0.0]], "rps_doc_stop_word_fraction": [[0, 194, 0.5]], "rps_doc_ut1_blacklist": [[0, 194, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 194, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 194, 0.20408163]], "rps_doc_frac_chars_dupe_6grams": [[0, 194, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 194, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 194, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 194, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 194, 0.06802721]], "rps_doc_frac_chars_top_3gram": [[0, 194, 0.12244898]], "rps_doc_frac_chars_top_4gram": [[0, 194, 0.14965986]], "rps_doc_frac_all_caps_words": [[0, 194, 0.02173913]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 194, 0.5]], "rps_doc_frac_no_alph_words": [[0, 194, 0.10869565]], "rps_doc_frac_unique_words": [[0, 194, 0.775]], "rps_doc_mean_word_length": [[0, 194, 3.675]], "rps_doc_symbol_to_word_ratio": [[0, 194, 0.02173913]], "rps_doc_unigram_entropy": [[0, 194, 3.34230586]], "rps_doc_word_count": [[0, 194, 40.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 32, 0.0], [32, 194, 1.0]], "rps_lines_javascript_counts": [[0, 32, 0.0], [32, 194, 0.0]], "rps_lines_num_words": [[0, 32, 8.0], [32, 194, 32.0]], "rps_lines_numerical_chars_fraction": [[0, 32, 0.0], [32, 194, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 32, 0.0], [32, 194, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 32, 0.15625], [32, 194, 0.0308642]], "rps_doc_ml_palm_score": [[0, 194, 0.00433278]], "rps_doc_ml_wikipedia_score": [[0, 194, null]], "rps_doc_ml_wikiref_score": [[0, 194, 1.454e-05]], "rps_doc_books_importance": [[0, 194, -14.8180987]], "rps_doc_openwebtext_importance": [[0, 194, -6.5385694]], "rps_doc_wikipedia_importance": [[0, 194, -25.02735147]], "rps_doc_num_sentences": [[0, 194, 2.0]]} |
Books and Writing / life / Thoughts
February 24, 2023 by D. Michl Lowe
So I took a break from writing the fantasy book that I have been working on and wrote something different. I woke up last night at 3:00 a.m. and had a book idea ramming itself into my consciousness. The idea wouldn’t go away. I had to write down the general idea. Below, I have written out the rough introduction to the book. I don’t plan on stopping work on the fantasy book, but I just needed a break. In my mind, this will be a very short book, less than 200 pages for sure. A middle-grade book, I think. We will see. In my head, this book is dealing with some heavy issues kids are dealing with even today. My current working title is My Forgotten Youth. Enjoy.
Introduction: The Abnormal Life
You don’t really question the things that happened to you as a kid. To you, it was just how life was and that was normal. It’s not until years later that you start to understand that your childhood might not have been completely regular. Example: My mom once told me that my father was a famous magician. I asked her why I didn’t have a dad and she said that he was busy working in Vegas and that the entertainment company that employed him wouldn’t let him take time off.
It made sense to me at the time, that my dad was a magician; too important to come and visit me. I told all my friends at school, and when they got old enough to realize it was a lie, they let me know, harshly, over and over. It was 1990 and I was eight years old. My mom would often disappear for weeks at a time, that was normal. My grandma and I would always order pizza when she knew mom wasn’t coming home. I got to the place where I hated the very smell of the stuff. Mom continually traveled up to Detroit with her boyfriends; sometimes just for the weekend, and others, for a month at a time.
We lived with my Grandma Susan. She had a little trailer that my grandpa had left her when he died. That was years before I was even born though. We all lived in Sissonville, West Virginia and my mom and Grandma had their own rooms, but I had the hall closet. It was big enough that my twin mattress could fit, but that was about it.
One morning, I woke up before I should have. Not sure why, but something seemed wrong. Sometimes you wake up because of a noise, but you think you just woke up naturally. It was one of those times when you feel like you slept for a long time. I was wide awake. There was the sound of clinking dishes in the kitchen and I walked in, the footed pajamas I wore had a hole for the big toe on each one, but they still made a soft shiff shiff as they slid across the linoleum floor.
The sound of my feet caused my mother to drop her little plastic purse on the side of the sink. When she did, amber pill bottles came plopping out on the counter and the floor. She was startled.
“Hey darlin’, what are you doin up?” She asked, smoothing her blond hair back from her face and licking lips that were too dry.
“I heard something, and woke up,” I said.
She was fully dressed in a short skirt and some of those fishnet stockings that girls loved to wear in the 80s, but that she was obviously too old to be wearing. Mom was fashionable; MTV was always on when she was home. Her bangs were the poofiest bangs in the whole town. While I always thought mom was pretty, in a “my mom” sort of way, I hated poofy bangs. She quickly picked up the pill bottles and began stuffing them back into the hot pink purse. One of them had rolled across the floor and my bare big toe twiddled it. I reached down and picked it up. It had my grandma’s name on the label.
“Oh,” I said. “This one is grandma’s.”
A slight panic flashed across her eyes. “Yes, well I am taking it to get it refilled,” she said.
“Are all of those grandma’s?”
She backed away from me and I was confused. “No, these ones are mine,” she said glancing towards the door. “Why don’t you mind your own business, huh? You think you know what’s best do ya? You aren’t the parent here! I am!” She screamed the last part, but immediately hushed herself, glancing towards the hall that led to grandma’s room.
“Mom, are you okay,” I asked, brushing off the harshness of her words. I learned long ago, not to take her harshness with any sincerity.
“I’m fine,” she hastily said, zipping up the little purse. It looked stupid, hot pink and almost rubbery. I thought it was like something a little kid would have, not a grown woman. She brushed tears out of her eyes. When had she started crying? Coming over, she kissed the top of my head. She smelled sour, like ammonia. Like when our cat’s litter box hadn’t been cleaned in several weeks. Her arms were too thin, I could see the bones in her wrists. She had bruises up and down both of her arms, little scabbed dots all over.
“You be good okay. Listen to your grandma. I’m gonna be gone for a couple weeks, alright. I have a job up in Detroit I have to do. Robert says we can get some real good work this time.”
Robert was the current guy she called her boyfriend.
“Okay,” I said.
What else could I say? She looked back once, then walked out the front door.
I never saw my mom again.
1990s, boy, boyhood, destructive life, Drugs, family, Fiction, hurt, kids, loss, mom, pain, sadness, Writing
Why I’m Not Afraid of Dragons Anymore
I was part of a writing group for a while. I’m not going to name names, because it’s not important to the story. However, for some time now, I have been scared… of dragons. Let me explain. This was nearly ten years ago and my writing career was still very new. I hadn’t finished my…
by D. Michl Lowe January 27, 2023 January 29, 2023 | 2023-14/0000/en_head.json.gz/3654 | {"url": "https://dmichllowe.com/2023/02/24/side-project-my-forgotten-youth/", "partition": "head_middle", "language": "en", "source_domain": "dmichllowe.com", "date_download": "2023-03-20T09:41:11Z", "digest": "sha1:QMX6E62YKHLQ7WBTQWPJBQ5DRY3DZCVE"} | {"ccnet_length": [[0, 5653, 5653.0]], "ccnet_original_length": [[0, 5653, 7929.0]], "ccnet_nlines": [[0, 5653, 27.0]], "ccnet_original_nlines": [[0, 5653, 103.0]], "ccnet_language_score": [[0, 5653, 0.99]], "ccnet_perplexity": [[0, 5653, 300.1]], "ccnet_bucket": [[0, 5653, 0.0]], "rps_doc_curly_bracket": [[0, 5653, 0.0]], "rps_doc_ldnoobw_words": [[0, 5653, 0.0]], "rps_doc_lorem_ipsum": [[0, 5653, 0.0]], "rps_doc_stop_word_fraction": [[0, 5653, 0.43745262]], "rps_doc_ut1_blacklist": [[0, 5653, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 5653, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 5653, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 5653, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 5653, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 5653, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 5653, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 5653, 0.0068306]], "rps_doc_frac_chars_top_3gram": [[0, 5653, 0.00637523]], "rps_doc_frac_chars_top_4gram": [[0, 5653, 0.0]], "rps_doc_frac_all_caps_words": [[0, 5653, 0.03942381]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 5653, 0.03703704]], "rps_doc_frac_no_alph_words": [[0, 5653, 0.16982563]], "rps_doc_frac_unique_words": [[0, 5653, 0.42896679]], "rps_doc_mean_word_length": [[0, 5653, 4.05166052]], "rps_doc_symbol_to_word_ratio": [[0, 5653, 0.0015163]], "rps_doc_unigram_entropy": [[0, 5653, 5.4907574]], "rps_doc_word_count": [[0, 5653, 1084.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 36, 0.0], [36, 101, 0.0], [101, 767, 1.0], [767, 799, 0.0], [799, 1272, 1.0], [1272, 1873, 1.0], [1873, 2207, 1.0], [2207, 2684, 1.0], [2684, 2879, 1.0], [2879, 3007, 1.0], [3007, 3049, 1.0], [3049, 3647, 1.0], [3647, 3685, 1.0], [3685, 3782, 1.0], [3782, 3811, 1.0], [3811, 4149, 1.0], [4149, 4286, 1.0], [4286, 4815, 1.0], [4815, 5001, 1.0], [5001, 5054, 1.0], [5054, 5070, 1.0], [5070, 5147, 1.0], [5147, 5173, 1.0], [5173, 5282, 0.0], [5282, 5320, 0.0], [5320, 5603, 0.0], [5603, 5653, 0.0]], "rps_lines_javascript_counts": [[0, 36, 0.0], [36, 101, 0.0], [101, 767, 0.0], [767, 799, 0.0], [799, 1272, 0.0], [1272, 1873, 0.0], [1873, 2207, 0.0], [2207, 2684, 0.0], [2684, 2879, 0.0], [2879, 3007, 0.0], [3007, 3049, 0.0], [3049, 3647, 0.0], [3647, 3685, 0.0], [3685, 3782, 0.0], [3782, 3811, 0.0], [3811, 4149, 0.0], [4149, 4286, 0.0], [4286, 4815, 0.0], [4815, 5001, 0.0], [5001, 5054, 0.0], [5054, 5070, 0.0], [5070, 5147, 0.0], [5147, 5173, 0.0], [5173, 5282, 0.0], [5282, 5320, 0.0], [5320, 5603, 0.0], [5603, 5653, 0.0]], "rps_lines_num_words": [[0, 36, 5.0], [36, 101, 11.0], [101, 767, 128.0], [767, 799, 4.0], [799, 1272, 90.0], [1272, 1873, 121.0], [1873, 2207, 66.0], [2207, 2684, 96.0], [2684, 2879, 38.0], [2879, 3007, 24.0], [3007, 3049, 8.0], [3049, 3647, 118.0], [3647, 3685, 7.0], [3685, 3782, 19.0], [3782, 3811, 5.0], [3811, 4149, 61.0], [4149, 4286, 25.0], [4286, 4815, 99.0], [4815, 5001, 39.0], [5001, 5054, 9.0], [5054, 5070, 3.0], [5070, 5147, 15.0], [5147, 5173, 6.0], [5173, 5282, 15.0], [5282, 5320, 7.0], [5320, 5603, 55.0], [5603, 5653, 10.0]], "rps_lines_numerical_chars_fraction": [[0, 36, 0.0], [36, 101, 0.19672131], [101, 767, 0.00933126], [767, 799, 0.0], [799, 1272, 0.0], [1272, 1873, 0.00686106], 
[1873, 2207, 0.0], [2207, 2684, 0.0], [2684, 2879, 0.0], [2879, 3007, 0.0], [3007, 3049, 0.0], [3049, 3647, 0.0034188], [3647, 3685, 0.0], [3685, 3782, 0.0], [3782, 3811, 0.0], [3811, 4149, 0.0], [4149, 4286, 0.0], [4286, 4815, 0.0], [4815, 5001, 0.0], [5001, 5054, 0.0], [5054, 5070, 0.0], [5070, 5147, 0.0], [5147, 5173, 0.0], [5173, 5282, 0.04210526], [5282, 5320, 0.0], [5320, 5603, 0.0], [5603, 5653, 0.25531915]], "rps_lines_start_with_bulletpoint": [[0, 36, 0.0], [36, 101, 0.0], [101, 767, 0.0], [767, 799, 0.0], [799, 1272, 0.0], [1272, 1873, 0.0], [1873, 2207, 0.0], [2207, 2684, 0.0], [2684, 2879, 0.0], [2879, 3007, 0.0], [3007, 3049, 0.0], [3049, 3647, 0.0], [3647, 3685, 0.0], [3685, 3782, 0.0], [3782, 3811, 0.0], [3811, 4149, 0.0], [4149, 4286, 0.0], [4286, 4815, 0.0], [4815, 5001, 0.0], [5001, 5054, 0.0], [5054, 5070, 0.0], [5070, 5147, 0.0], [5147, 5173, 0.0], [5173, 5282, 0.0], [5282, 5320, 0.0], [5320, 5603, 0.0], [5603, 5653, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 36, 0.08333333], [36, 101, 0.09230769], [101, 767, 0.03003003], [767, 799, 0.125], [799, 1272, 0.01691332], [1272, 1873, 0.01830283], [1873, 2207, 0.03892216], [2207, 2684, 0.02096436], [2684, 2879, 0.01538462], [2879, 3007, 0.015625], [3007, 3049, 0.04761905], [3049, 3647, 0.02173913], [3647, 3685, 0.07894737], [3685, 3782, 0.03092784], [3782, 3811, 0.03448276], [3811, 4149, 0.02366864], [4149, 4286, 0.02189781], [4286, 4815, 0.02079395], [4815, 5001, 0.03763441], [5001, 5054, 0.01886792], [5054, 5070, 0.125], [5070, 5147, 0.03896104], [5147, 5173, 0.03846154], [5173, 5282, 0.02752294], [5282, 5320, 0.15789474], [5320, 5603, 0.02473498], [5603, 5653, 0.1]], "rps_doc_ml_palm_score": [[0, 5653, 0.21955538]], "rps_doc_ml_wikipedia_score": [[0, 5653, null]], "rps_doc_ml_wikiref_score": [[0, 5653, 0.27695876]], "rps_doc_books_importance": [[0, 5653, 35.10262499]], "rps_doc_openwebtext_importance": [[0, 5653, 105.09704635]], "rps_doc_wikipedia_importance": [[0, 5653, -646.57399103]], "rps_doc_num_sentences": [[0, 5653, 93.0]]} |
Marketing In Motion
Saturday March 15th, 2014 – Springfield, IL
Our worst fears came true this weekend, and the turnout has been small at Donnie B’s Comedy Club in Springfield, IL. There’s not a whole lot anyone can do, as Donnie was out there hitting it hard like he usually does. He’s always got a load of posters in his car, and we made the rounds at the local radio stations as is customary when I’m here. For whatever reason, it was a slow week.
It upsets me when promoters lose money, but especially Donnie B. There just aren’t that many club owners that hustle as hard as he does to fill his club, but this just isn’t a ripe market. If he had a club in a big city – and I wish he did – I’d be sitting pretty. Still, I enjoy working for him here.
One thing I especially like is that he has his headliners do an hour rather than the customary 45 minutes. It’s a great workout for me, and I enjoy the challenge. He hosts the shows and promotes what he needs to, and then brings up an opener that does maybe twelve minutes tops. I love that.
It’s great practice for me, and the audiences are usually very friendly and receptive here for the most part. Even this weekend when the numbers weren’t big, each one of the four audiences was really into the show and I had a blast. I made up my mind to give them my very best – and I did.
One thing I need to work on is my sales pitch for merchandise. That’s a tricky procedure, but it needs to be done. It can’t be too long and ‘salesy’, but it needs to get the point across that I’ll be available after the show to meet and greet if they want to say hello. I consciously worked on my presentation all four shows, and it worked splendidly. I ended up selling something every show.
I have to credit my friend James Gregory for planting this seed, and he did it years ago. I have been dabbling in it, but I need to jump in with both feet and DO it. There’s a lot that goes into an effective sales campaign, and it all starts with the pitch from the stage. That’s what I worked on.
I also worked on including all things Uranus. I have some funny prototype t-shirts, and there is no reason I shouldn’t start selling them. I’ve had them for a while, but I never tried to push them from the stage. What a yutz I was, as they’re very well made and really funny. This week I threw them out there, and they were a big hit! It made me very excited when I saw them draw interest.
I know this gimmick is a winner. I can feel it. I sold the point that Uranus was discovered this week in 1781, and that I was starting a club called ‘The Uranus Movement’ for Earth inhabitants that were sick of the insanity here and wanted to escape. I didn’t sell the king just yet, but I will.
I have a book mark that says ‘I (heart) URANUS’, with the heart flipped upside down to look like a butt. Those went over like gangbusters, and I gave away a ton. It has my Twitter account listed which is @UranusTweets. Now that I’m giving these out I need to tweet something daily.
This is all part of the marketing game, and I’m making tremendous strides because I’m making it my focus. My shows are already rock solid, and they were again this weekend. I knocked it out of the park, and that’s not bragging. I know what I’m doing on stage. Offstage is the challenge.
That will come in time too, and it will be sooner than later. I will eventually discover where the exact right place in the show is to include my pitch, and it will be seamless. I’ll learn what to say, and when and how to say it to bring in the most sales. I’m already up there, why not sell product?
There’s no crime if someone doesn’t want to buy anything. I’ll still talk to them and be friendly like I always am. People will see that, and it will add to my sales. I need to organize myself as far as my table display goes as well. That’s going to take some work as well, but I’ll get to that too.
I think there should always be something free available for someone to take home with them as both a souvenir and a promo piece to pass out to their friends. The Uranus book marks are great, but I need something for Mr. Lucky and also the “Schlitz Happened!” show. I’ve got work to do.
I also need to come up with a package deal that people can buy that gets the numbers up. I have a one hour DVD shot in HD with three cameras done by my friend film director Mark Gumbinger. I’ve had it for two years now, but haven’t sold it heavily. What am I, an imbecile? In a word – YES.
I’m smartening up in a hurry, mainly because I need money more now than I ever have. I never felt comfortable selling merchandise before, but those days are over by necessity. I’ve got quite a few products to offer, and they’re not junk. The DVD is better than 99% of what I’ve seen others offer, and I shouldn’t feel ashamed to proudly display it after shows. It was hard work to make it.
Dealing with t-shirts is also a learning experience. The Uranus shirts are very well done, so I’m not ashamed to sell those either. They look professional, and the shirts aren’t those super cheapos like I’ve seen that will disintegrate with a single washing. I spent extra to maintain some quality.
I have six different varieties, but I only brought out three this weekend. That was enough, and I had all I could handle with that. I learned a lot, and it was fascinating to watch people’s dynamic as they decided what they wanted to buy. I sat back and shut my mouth so I could observe it all.
Sizes are always a part of shirt sales. I had two moms buy shirts for their teenagers, and wanted mediums. I only have XL, and even though that wasn’t what they wanted they bought because of the Uranus gimmick. Another guy wanted a 2XL, and I didn’t have that either. For the future I’m going to have to figure it out, but I will. Even getting it this far was a victory. I’ll keep working.
Price is another issue that’s going to take work. I charged $15 each for the shirts and DVDs, or both for $25. That’s a lot of money in my book, and I have to get that out of my mind as quickly as possible. I would always sell my CDs for $10, but I’ve seen people sell theirs for $15 and $20.
That extra money adds up, and I forced myself to smile and say the price rather than add “But I’ll give them to you both for $10”. I have to learn to value myself and my products, and I’m on my way to doing that. In a very short time I can see myself having a system in place that works.
This weekend was a flop attendance wise, but a raging success in the fact that I was able to put a marketing plan into action and still give those that did show up hot shows. Thumbs up Uranus!
The great James Gregory may be trying to look crazy in this picture, but he’s hands down THE best comedian marketer I have ever seen.
His fans adore him, and I can see why. He’s all about them, and gives them what they want. He’s the real king. http://www.funniestman.com.
Heywood Banks runs a close second. Despite the cultivated crazy look, there’s a razor sharp mind in there. He gets it.
His CDs are hilarious, and he’s got other stuff too. Find him at http://www.heywoodbanks.com.
I’m modeling my own marketing after the best. I’m working on my cultivated crazy look, and a varied product line is to follow.
Want a free book mark? Send me a mailing address and I’ll send you as many as you can handle. Get BEHIND The Uranus MOVEMENT!
Tags: @UranusTweets, CDs, comedy, Donnie B's Comedy Club, DVDs, Heywood Banks, James Gregory, Mark Gumbinger, marketing, promoter, t-shirts, The Uranus Movement, Twitter, Uranus
This entry was posted on March 16, 2014 at 5:44 pm and is filed under Comedy, Entertainment, Entrepreneurship, Humorous speaking, Marketing, Show Business, Standup Comedy, Standup Comedy Instruction, Uncategorized. You can follow any responses to this entry through the RSS 2.0 feed. You can leave a response, or trackback from your own site. | 2023-14/0000/en_head.json.gz/3655 | {"url": "https://dobiemaxwell.wordpress.com/2014/03/16/marketing-in-motion/", "partition": "head_middle", "language": "en", "source_domain": "dobiemaxwell.wordpress.com", "date_download": "2023-03-20T08:44:49Z", "digest": "sha1:I3JNZTKF3DM7BLCCCNJ3DMFLEUDBRKTF"} | {"ccnet_length": [[0, 7910, 7910.0]], "ccnet_original_length": [[0, 7910, 9461.0]], "ccnet_nlines": [[0, 7910, 33.0]], "ccnet_original_nlines": [[0, 7910, 82.0]], "ccnet_language_score": [[0, 7910, 0.98]], "ccnet_perplexity": [[0, 7910, 286.0]], "ccnet_bucket": [[0, 7910, 0.0]], "rps_doc_curly_bracket": [[0, 7910, 0.0]], "rps_doc_ldnoobw_words": [[0, 7910, 1.0]], "rps_doc_lorem_ipsum": [[0, 7910, 0.0]], "rps_doc_stop_word_fraction": [[0, 7910, 0.46384574]], "rps_doc_ut1_blacklist": [[0, 7910, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 7910, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 7910, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 7910, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 7910, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 7910, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 7910, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 7910, 0.0071232]], "rps_doc_frac_chars_top_3gram": [[0, 7910, 0.00453294]], "rps_doc_frac_chars_top_4gram": [[0, 7910, 0.00388538]], "rps_doc_frac_all_caps_words": [[0, 7910, 0.05945367]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 7910, 0.0]], "rps_doc_frac_no_alph_words": [[0, 7910, 0.16604178]], "rps_doc_frac_unique_words": [[0, 7910, 0.36447979]], "rps_doc_mean_word_length": [[0, 7910, 4.09343936]], "rps_doc_symbol_to_word_ratio": [[0, 7910, 0.0]], "rps_doc_unigram_entropy": [[0, 7910, 5.57254429]], "rps_doc_word_count": [[0, 7910, 1509.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 20, 0.0], [20, 47, 0.0], [47, 67, 0.0], [67, 111, 0.0], [111, 498, 1.0], [498, 797, 1.0], [797, 1089, 1.0], [1089, 1379, 1.0], [1379, 1772, 1.0], [1772, 2070, 1.0], [2070, 2460, 1.0], [2460, 2756, 1.0], [2756, 3038, 1.0], [3038, 3325, 1.0], [3325, 3626, 1.0], [3626, 3926, 1.0], [3926, 4212, 1.0], [4212, 4502, 1.0], [4502, 4892, 1.0], [4892, 5191, 1.0], [5191, 5486, 1.0], [5486, 5876, 1.0], [5876, 6169, 1.0], [6169, 6458, 1.0], [6458, 6651, 1.0], [6651, 6785, 1.0], [6785, 6924, 1.0], [6924, 7043, 1.0], [7043, 7137, 1.0], [7137, 7264, 1.0], [7264, 7390, 1.0], [7390, 7568, 0.0], [7568, 7910, 1.0]], "rps_lines_javascript_counts": [[0, 20, 0.0], [20, 47, 0.0], [47, 67, 0.0], [67, 111, 0.0], [111, 498, 0.0], [498, 797, 0.0], [797, 1089, 0.0], [1089, 1379, 0.0], [1379, 1772, 0.0], [1772, 2070, 0.0], [2070, 2460, 0.0], [2460, 2756, 0.0], [2756, 3038, 0.0], [3038, 3325, 0.0], [3325, 3626, 0.0], [3626, 3926, 0.0], [3926, 4212, 0.0], [4212, 4502, 0.0], [4502, 4892, 0.0], [4892, 5191, 0.0], [5191, 5486, 0.0], [5486, 5876, 0.0], [5876, 6169, 0.0], [6169, 6458, 0.0], [6458, 6651, 0.0], [6651, 6785, 0.0], [6785, 6924, 0.0], [6924, 7043, 0.0], [7043, 7137, 0.0], [7137, 7264, 0.0], [7264, 7390, 0.0], [7390, 7568, 0.0], [7568, 7910, 0.0]], "rps_lines_num_words": [[0, 20, 4.0], [20, 47, 4.0], [47, 67, 3.0], [67, 111, 7.0], [111, 498, 75.0], [498, 797, 62.0], [797, 1089, 56.0], [1089, 1379, 58.0], [1379, 1772, 75.0], [1772, 2070, 61.0], 
[2070, 2460, 77.0], [2460, 2756, 58.0], [2756, 3038, 53.0], [3038, 3325, 53.0], [3325, 3626, 61.0], [3626, 3926, 61.0], [3926, 4212, 54.0], [4212, 4502, 61.0], [4502, 4892, 73.0], [4892, 5191, 50.0], [5191, 5486, 58.0], [5486, 5876, 75.0], [5876, 6169, 62.0], [6169, 6458, 62.0], [6458, 6651, 38.0], [6651, 6785, 25.0], [6785, 6924, 24.0], [6924, 7043, 21.0], [7043, 7137, 14.0], [7137, 7264, 23.0], [7264, 7390, 25.0], [7390, 7568, 23.0], [7568, 7910, 53.0]], "rps_lines_numerical_chars_fraction": [[0, 20, 0.0], [20, 47, 0.0], [47, 67, 0.0], [67, 111, 0.14634146], [111, 498, 0.0], [498, 797, 0.0], [797, 1089, 0.00701754], [1089, 1379, 0.0], [1379, 1772, 0.0], [1772, 2070, 0.0], [2070, 2460, 0.0], [2460, 2756, 0.01384083], [2756, 3038, 0.0], [3038, 3325, 0.0], [3325, 3626, 0.0], [3626, 3926, 0.0], [3926, 4212, 0.0], [4212, 4502, 0.0], [4502, 4892, 0.00527704], [4892, 5191, 0.0], [5191, 5486, 0.0], [5486, 5876, 0.0026455], [5876, 6169, 0.03571429], [6169, 6458, 0.0070922], [6458, 6651, 0.0], [6651, 6785, 0.0], [6785, 6924, 0.0], [6924, 7043, 0.0], [7043, 7137, 0.0], [7137, 7264, 0.0], [7264, 7390, 0.0], [7390, 7568, 0.0], [7568, 7910, 0.03363914]], "rps_lines_start_with_bulletpoint": [[0, 20, 0.0], [20, 47, 0.0], [47, 67, 0.0], [67, 111, 0.0], [111, 498, 0.0], [498, 797, 0.0], [797, 1089, 0.0], [1089, 1379, 0.0], [1379, 1772, 0.0], [1772, 2070, 0.0], [2070, 2460, 0.0], [2460, 2756, 0.0], [2756, 3038, 0.0], [3038, 3325, 0.0], [3325, 3626, 0.0], [3626, 3926, 0.0], [3926, 4212, 0.0], [4212, 4502, 0.0], [4502, 4892, 0.0], [4892, 5191, 0.0], [5191, 5486, 0.0], [5486, 5876, 0.0], [5876, 6169, 0.0], [6169, 6458, 0.0], [6458, 6651, 0.0], [6651, 6785, 0.0], [6785, 6924, 0.0], [6924, 7043, 0.0], [7043, 7137, 0.0], [7137, 7264, 0.0], [7264, 7390, 0.0], [7390, 7568, 0.0], [7568, 7910, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 20, 0.15], [20, 47, 0.11111111], [47, 67, 0.15], [67, 111, 0.11363636], [111, 498, 0.03359173], [498, 797, 0.03010033], [797, 1089, 0.02054795], [1089, 1379, 0.01724138], [1379, 1772, 0.0178117], [1772, 2070, 0.03355705], [2070, 2460, 0.03076923], [2460, 2756, 0.03716216], [2756, 3038, 0.06028369], [3038, 3325, 0.02787456], [3325, 3626, 0.01328904], [3626, 3926, 0.02333333], [3926, 4212, 0.03146853], [4212, 4502, 0.05517241], [4502, 4892, 0.03076923], [4892, 5191, 0.02341137], [5191, 5486, 0.02711864], [5486, 5876, 0.03846154], [5876, 6169, 0.03754266], [6169, 6458, 0.02768166], [6458, 6651, 0.02072539], [6651, 6785, 0.05223881], [6785, 6924, 0.02877698], [6924, 7043, 0.03361345], [7043, 7137, 0.04255319], [7137, 7264, 0.01574803], [7264, 7390, 0.15873016], [7390, 7568, 0.12921348], [7568, 7910, 0.05847953]], "rps_doc_ml_palm_score": [[0, 7910, 0.46895665]], "rps_doc_ml_wikipedia_score": [[0, 7910, null]], "rps_doc_ml_wikiref_score": [[0, 7910, 0.19401431]], "rps_doc_books_importance": [[0, 7910, -342.19342529]], "rps_doc_openwebtext_importance": [[0, 7910, 150.53019272]], "rps_doc_wikipedia_importance": [[0, 7910, -1154.42367707]], "rps_doc_num_sentences": [[0, 7910, 114.0]]} |
Aleksey Dvorzhinskiy, M.D., MSc
Specialties: Orthopedic Surgery, Orthopedic Trauma Surgery
About Aleksey Dvorzhinskiy, M.D., MSc
Dr. Aleksey Dvorzhinskiy is an orthopedic surgeon specializing in the treatment of fractures in the arms, legs, pelvis, and acetabulum. He has surgical expertise in complex fractures, deformity correction, infection, as well as malunions and non-unions (poorly healed or non-healed fractures). Dr. Dvorzhinskiy prioritizes delivering individualized patient care using modern research and surgical techniques to achieve the best possible outcomes for his patients. He also strongly believes in nonoperative treatment when appropriate.
Dr. Dvorzhinskiy graduated summa cum laude from the University of Delaware where he was inducted into Phi Beta Kappa and awarded the Barry M. Goldwater Scholarship to pursue research related to bacterial infection. For medical school, he attended Cornell University and was a member of the Alpha Omega Alpha honor society. He completed his residency at Hospital for Special Surgery. During the final year of his residency, he was the Chair of the hospital's House Staff Quality and Safety Committee, where he oversaw operational initiatives focused on improving patient safety. At this time, he also received a master's degree in Clinical Epidemiology and Health Services Research from Cornell University. He completed a yearlong fellowship in orthopedic trauma at Harborview Medical Center – University of Washington which focused on the treatment of severe injuries, polytraumatized patients, and the complications of fractures (nonunion, malunion, and infection).
Dr. Dvorzhinskiy is passionate about improving clinical outcomes through research in the field of orthopedic trauma. He has been awarded research grants from internal funding sources within HSS and external organizations such as the Foundation for Orthopaedic Trauma. He has presented his research at numerous national and international research meetings and has authored multiple peer-reviewed articles and several book chapters. He is an active member of the American Academy of Orthopaedic Surgeons (AAOS) and the Orthopaedic Trauma Association (OTA).
NewYork-Presbyterian / Weill Cornell Medical Center
Fracture of Acetabulum
Fracture of Lower Extremity
Fracture of Upper Extremity
Orthopedic Trauma
Pelvic Fracture
Degrees: University of Delaware
Degrees: Weill Cornell Medical College, Cornell University
Degrees: Weill Graduate School of Medical Sciences of Cornell University | 2023-14/0000/en_head.json.gz/3656 | {"url": "https://doctors.nyp.org/aleksey-dvorzhinskiy-md-msc/east-river-professional-bldg", "partition": "head_middle", "language": "en", "source_domain": "doctors.nyp.org", "date_download": "2023-03-20T10:07:16Z", "digest": "sha1:EQGXQCMIYFDXOI4WF3ZPSMLIEPYO7XWP"} | {"ccnet_length": [[0, 2621, 2621.0]], "ccnet_original_length": [[0, 2621, 4198.0]], "ccnet_nlines": [[0, 2621, 17.0]], "ccnet_original_nlines": [[0, 2621, 105.0]], "ccnet_language_score": [[0, 2621, 0.95]], "ccnet_perplexity": [[0, 2621, 146.9]], "ccnet_bucket": [[0, 2621, 0.0]], "rps_doc_curly_bracket": [[0, 2621, 0.0]], "rps_doc_ldnoobw_words": [[0, 2621, 1.0]], "rps_doc_lorem_ipsum": [[0, 2621, 0.0]], "rps_doc_stop_word_fraction": [[0, 2621, 0.2718894]], "rps_doc_ut1_blacklist": [[0, 2621, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 2621, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 2621, 0.04657534]], "rps_doc_frac_chars_dupe_6grams": [[0, 2621, 0.04657534]], "rps_doc_frac_chars_dupe_7grams": [[0, 2621, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 2621, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 2621, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 2621, 0.03652968]], "rps_doc_frac_chars_top_3gram": [[0, 2621, 0.02876712]], "rps_doc_frac_chars_top_4gram": [[0, 2621, 0.03287671]], "rps_doc_frac_all_caps_words": [[0, 2621, 0.02304147]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 2621, 0.0]], "rps_doc_frac_no_alph_words": [[0, 2621, 0.14516129]], "rps_doc_frac_unique_words": [[0, 2621, 0.50964187]], "rps_doc_mean_word_length": [[0, 2621, 6.03305785]], "rps_doc_symbol_to_word_ratio": [[0, 2621, 0.0]], "rps_doc_unigram_entropy": [[0, 2621, 4.82923623]], "rps_doc_word_count": [[0, 2621, 363.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 32, 0.0], [32, 123, 0.0], [123, 182, 0.0], [182, 199, 0.0], [199, 237, 0.0], [237, 771, 1.0], [771, 1738, 1.0], [1738, 2293, 1.0], [2293, 2345, 0.0], [2345, 2368, 0.0], [2368, 2396, 0.0], [2396, 2424, 0.0], [2424, 2442, 0.0], [2442, 2458, 0.0], [2458, 2490, 0.0], [2490, 2549, 0.0], [2549, 2621, 0.0]], "rps_lines_javascript_counts": [[0, 32, 0.0], [32, 123, 0.0], [123, 182, 0.0], [182, 199, 0.0], [199, 237, 0.0], [237, 771, 0.0], [771, 1738, 0.0], [1738, 2293, 0.0], [2293, 2345, 0.0], [2345, 2368, 0.0], [2368, 2396, 0.0], [2396, 2424, 0.0], [2424, 2442, 0.0], [2442, 2458, 0.0], [2458, 2490, 0.0], [2490, 2549, 0.0], [2549, 2621, 0.0]], "rps_lines_num_words": [[0, 32, 4.0], [32, 123, 10.0], [123, 182, 6.0], [182, 199, 2.0], [199, 237, 5.0], [237, 771, 72.0], [771, 1738, 144.0], [1738, 2293, 79.0], [2293, 2345, 5.0], [2345, 2368, 3.0], [2368, 2396, 4.0], [2396, 2424, 4.0], [2424, 2442, 2.0], [2442, 2458, 2.0], [2458, 2490, 4.0], [2490, 2549, 7.0], [2549, 2621, 10.0]], "rps_lines_numerical_chars_fraction": [[0, 32, 0.0], [32, 123, 0.0], [123, 182, 0.0], [182, 199, 0.0], [199, 237, 0.0], [237, 771, 0.0], [771, 1738, 0.0], [1738, 2293, 0.0], [2293, 2345, 0.0], [2345, 2368, 0.0], [2368, 2396, 0.0], [2396, 2424, 0.0], [2424, 2442, 0.0], [2442, 2458, 0.0], [2458, 2490, 0.0], [2490, 2549, 0.0], [2549, 2621, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 32, 0.0], [32, 123, 0.0], [123, 182, 0.0], [182, 199, 0.0], [199, 237, 0.0], [237, 771, 0.0], [771, 1738, 0.0], [1738, 2293, 0.0], [2293, 2345, 0.0], [2345, 2368, 0.0], [2368, 2396, 0.0], [2396, 2424, 0.0], [2424, 2442, 0.0], [2442, 2458, 0.0], [2458, 2490, 0.0], [2490, 2549, 0.0], [2549, 2621, 0.0]], 
"rps_lines_uppercase_letter_fraction": [[0, 32, 0.1875], [32, 123, 0.13186813], [123, 182, 0.10169492], [182, 199, 0.05882353], [199, 237, 0.18421053], [237, 771, 0.01310861], [771, 1738, 0.0434333], [1738, 2293, 0.04504505], [2293, 2345, 0.13461538], [2345, 2368, 0.08695652], [2368, 2396, 0.10714286], [2396, 2424, 0.10714286], [2424, 2442, 0.11111111], [2442, 2458, 0.125], [2458, 2490, 0.09375], [2490, 2549, 0.11864407], [2549, 2621, 0.11111111]], "rps_doc_ml_palm_score": [[0, 2621, 0.04374427]], "rps_doc_ml_wikipedia_score": [[0, 2621, null]], "rps_doc_ml_wikiref_score": [[0, 2621, 0.22266293]], "rps_doc_books_importance": [[0, 2621, -81.28597039]], "rps_doc_openwebtext_importance": [[0, 2621, -7.94661153]], "rps_doc_wikipedia_importance": [[0, 2621, 56.69767657]], "rps_doc_num_sentences": [[0, 2621, 26.0]]} |
Give a recursive definition of each of these sets of ordered pairs of positive integers. (Hint: plot the points in the set in the plane and look for lines containing points in the set.)
huyenthanh
To ensure that a+b is odd, one of them has to be odd and the other has to be even; that is why (2,1) and (1,2) form the base step:
if (a,b) is in the set, then (a+1,b+1) is in the set
if (a,b) is in the set, then (a+2,b) is in the set
if (a,b) is in the set, then (a,b+2) is in the set.
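As a quick sanity check (my own sketch, not part of the original answer), a few lines of Python can apply the base step and these three rules repeatedly and confirm that every pair produced has an odd sum:

    # Apply the recursive rules for a few rounds starting from the base step,
    # then verify that a + b is odd for every pair produced.
    def generate_pairs(rounds):
        pairs = {(2, 1), (1, 2)}  # base step
        for _ in range(rounds):
            new_pairs = set()
            for a, b in pairs:
                new_pairs.update({(a + 1, b + 1), (a + 2, b), (a, b + 2)})
            pairs |= new_pairs
        return pairs

    pairs = generate_pairs(4)
    assert all((a + b) % 2 == 1 for a, b in pairs)
    print(len(pairs), "pairs generated, all with odd sums")

Each rule adds 2 to the sum a+b, so the odd parity established by the base step is preserved at every step.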
What is the recursive definition?
A recursive definition, also known as an inductive definition, is used in mathematics and computer science to define the elements of a set in terms of other elements in the set.
Factorials, natural numbers, Fibonacci numbers, and the Cantor ternary set are examples of recursively-definable objects.
A recursive definition of a function defines the function’s values for some inputs in terms of the function’s values for other (usually smaller) inputs.
For example, the following rules define the factorial function n!:
0! = 1.
(n + 1)! = (n + 1)·n!.
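As a minimal illustration (my own sketch, not from the original answer), those two rules translate directly into a recursive Python function:

    def factorial(n):
        # Base step: 0! = 1
        if n == 0:
            return 1
        # Recursive step: n! = n * (n - 1)!, which is the rule
        # (n + 1)! = (n + 1) * n! read from the other direction
        return n * factorial(n - 1)

    print(factorial(5))  # prints 120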
To give a recursive definition:
Think about how to solve this problem in general. How can we ensure that the sum a+b is odd?
Think about this: what happens when we sum two even numbers? Is the result even or odd?
2+6 = 8 (even)
10+12 = 22 (even)
And what happens when we sum two odd numbers? Will the result be even or odd?
3+7 = 10 (even)
5+11 = 16 (even)
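The examples above generalize: for integers m and n, 2m + 2n = 2(m + n) is even, (2m + 1) + (2n + 1) = 2(m + n + 1) is even, and 2m + (2n + 1) = 2(m + n) + 1 is odd. So the sum of two positive integers is odd exactly when one of them is even and the other is odd.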
Therefore, to assure that a+b is odd, one of a and b has to be odd and the other even; that is why (2,1) and (1,2) form the base step, and the recursive step is: if (a,b) is in the set, then (a+1,b+1), (a+2,b) and (a,b+2) are in the set.
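As a quick check (a Python sketch of my own, not part of the original answer; the function name generate_s is hypothetical), each rule increases a + b by exactly 2, so every pair the definition produces keeps an odd sum:
def generate_s(limit):
    # Base step: (1, 2) and (2, 1) are in S
    s = {(1, 2), (2, 1)}
    frontier = list(s)
    while frontier:
        a, b = frontier.pop()
        # Recursive step: from (a, b) we also get (a+1, b+1), (a+2, b) and (a, b+2)
        for pair in ((a + 1, b + 1), (a + 2, b), (a, b + 2)):
            if pair not in s and pair[0] <= limit and pair[1] <= limit:
                s.add(pair)
                frontier.append(pair)
    return s
assert all((a + b) % 2 == 1 for a, b in generate_s(10))  # every generated sum is odd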
Learn more about recursive definitions here:
https://brainly.com/question/4173316
The complete question is given below:
Give a recursive definition of each of these sets of ordered pairs of positive integers. [Hint: Plot the points in the set in the plane and look for lines containing points in the set.] a) S = {(a, b) | a ∈ Z+ , b ∈ Z+ , and a + b is odd} | 2023-14/0000/en_head.json.gz/3657 | {"url": "https://documen.tv/give-a-recursive-definition-of-each-of-these-sets-of-ordered-pairs-of-positive-integers-hint-plo-28215784-68/", "partition": "head_middle", "language": "en", "source_domain": "documen.tv", "date_download": "2023-03-20T09:23:08Z", "digest": "sha1:FXR52RFI56B6OY4T5VSTQAXTXVCJYCA3"}
What is the sum of all interior angles of a 35-sided regular polygon?
6,120°
thaiduong
The sum of the interior angles of a polygon is (n-2)*180 where n is the number of sides. In this case n=35. (35-2)*180 = 33*180 = 5940. | 2023-14/0000/en_head.json.gz/3658 | {"url": "https://documen.tv/help-what-is-the-sum-of-all-interior-angles-of-a-35-sided-regular-polygon-6-120-5-940-180-b-29986518-25/", "partition": "head_middle", "language": "en", "source_domain": "documen.tv", "date_download": "2023-03-20T09:53:02Z", "digest": "sha1:ZM4WEW534KKE6TUTCXNZCZFBYFS4P4PZ"}
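As a quick check of the documen.tv answer above (a minimal Python sketch of my own, not part of the original page; the name interior_angle_sum is hypothetical):
def interior_angle_sum(n):
    # Sum of the interior angles of an n-sided polygon, in degrees: (n - 2) * 180
    return (n - 2) * 180
print(interior_angle_sum(35))  # prints 5940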
105 - Legality of Use of Force (Serbia and Montenegro vs Belgium)
On April 29th, 1999, the Federal Republic of Yugoslavia filed Applications instituting proceedings before the International Court of Justice (ICJ) against several States, including Belgium, claiming that the latter had violated its international obligation not to use force against another State.
In its application instituting proceedings against Belgium, Yugoslavia cited as a basis for the jurisdiction of the Court Article 36, paragraph 2, of the Statute of the Court, as well as Article IX of the Convention on the Prevention and Punishment of the Crime of Genocide, adopted by the UN on December 9th, 1948. The applicant also submitted a request for provisional measures to ensure that the respondent State cease immediately the use of force and refrain from any act that could constitute a threat or a use of force. In its order of June 2nd, 1999, the ICJ ruled that it did not have prima facie jurisdiction, which is one of the prerequisites for the issuance of provisional measures; therefore it could not grant that request.
Order of 02-06-1999.pdf
Belgium, like all the States brought before the Court by Yugoslavia, raised preliminary objections: the point was to decide whether Serbia and Montenegro was a party to the UN and to the ICJ Statute when the proceedings were instituted. If the applicant was not, the Court could not be open to it unless it satisfied the conditions set out in Article 35, paragraph 2, of the Statute.
After recalling the political events that had affected Yugoslavia during the 1990s (more precisely its implosion, leading to the creation of Serbia and Montenegro), the Court concluded that the applicant became a member of the UN on November 1st, 2000. However, nothing confirmed that the country had been a member of the institution prior to that date; hence it could not have been a party to the Statute when it instituted proceedings against these States on April 29th, 1999.
The Court also examined the question of whether Serbia and Montenegro could invoke the Convention on the settlement of disputes, which it concluded with Belgium in the 1930s. After review, the Court concluded that this Convention could not serve as a basis of jurisdiction to hear this case.
Tagged: jurisdiction, Statute ICJ | 2023-14/0000/en_head.json.gz/3659 | {"url": "https://dome-online.com/en/105-legality-of-use-of-force-serbia-and-montenegro-vs-belgium/", "partition": "head_middle", "language": "en", "source_domain": "dome-online.com", "date_download": "2023-03-20T09:28:35Z", "digest": "sha1:YNVFZHRQGJZGECUGKIPRHMD36XGBZUCG"}
App. 013/2019 - Ulrich Dibgolongo vs Burkina Faso
Ulrich Sergio Dibgolongo vs Republic of Burkina Faso
On April 23rd, 2019, Mr. Ulrich Sergio Dibgolongo (the Applicant) filed an application instituting proceedings against the Republic of Burkina Faso (the respondent State) before the African Court on Human and Peoples’ Rights (the ACHPR or the Court) for alleged violations of Articles 5 and 7 of the African Charter on Human and Peoples’ Rights (the Charter).
Mr. Dibgolongo had been tried and sentenced by the Criminal Chamber of the High Court of Ouagadougou to 15 years’ imprisonment for offences of serious banditry, illegal possession of firearms and theft. This judgement was upheld on appeal. The applicant claims that his rights to a fair trial and to dignity were violated throughout the proceedings before the domestic courts. He seeks a presidential pardon, commutation of his prison sentence, conditional release and financial compensation for the harm suffered.
Burkina Faso raised preliminary objections concerning the jurisdiction of the Court as well as the admissibility of the Application. Although the Court concluded that it had jurisdiction, it nevertheless agreed with the respondent State, which had pointed out that the applicant had not exhausted all domestic remedies insofar as he had not lodged an appeal in cassation with the Supreme Court of Burkina Faso.
Tagged: ACHPR, African Charter, Admissibility, jurisdiction | 2023-14/0000/en_head.json.gz/3660 | {"url": "https://dome-online.com/en/ulrich-dibgolongo-vs-burkina-faso-2/", "partition": "head_middle", "language": "en", "source_domain": "dome-online.com", "date_download": "2023-03-20T09:20:10Z", "digest": "sha1:HHCUBTQDFOHFR2HDKQYYHWMVGBPIMXI4"}
Let’s Hang on to the Family Jewels
I don’t believe that the voters who put the Newtsies in power last fall intended to sell off our national lands and gut our environmental laws. I can’t remember any politician running on a promise to devastate the natural assets of the nation. So we better wake up, folks, because that’s what’s happening.
Here are a few of the unconservative measures already passed by Congress or well on their way to passage.
Rep. Joel Hefley (R-Colorado) and Sen. Frank Murkowski (R-Alaska) are creating a commission to decide which of our national parks to close. High on the list are urban parks like Golden Gate National Recreation Area in San Francisco and Gateway National Recreation Area in New York City.
“If you’ve been there once, you don’t need to go again,” is how Rep. James Hansen (R-Utah), chairman of the House Subcommittee on National Parks, sums up the legacy now in his hands. Hansen thinks the parks should be sold to developers, bringing in a one-time flush of money to help balance the government budget.
Another budget-balancing scheme will sell oil exploration leases to the Arctic National Wildlife Refuge. That would bring in about $1 billion once. (The deficit is running well over $200 billion every year). It would also bring in wells, pipelines, roads, air strips, ports, and garbage dumps to a pristine area now inhabited by polar bears, muskoxen, millions of migratory birds, and thousands of breeding caribou. Geologists say there’s a 20 percent chance of striking enough oil there to fuel the nation for six months.
Congress’s new Clean Water Act turns 60-80 percent of the nation’s remaining wetlands (half are already gone) over to drainers and developers. It ignores the careful definition of a wetland given in a new study by the National Research Council — and all the reasons for saving wetlands, from flood control to groundwater recharge to fish breeding. Says an ecologist who helps developers build without ruining wetlands, “Congress is trying to impose a policy without consideration of science.”
Sen. Slade Gorton’s (R-Washington) “logging without laws” rider is still attached to the budget rescission bill, passed by both houses, vetoed by President Clinton, now on its way back to his desk in revised form. The rider requires the Forest Service to sell off 6 billion board feet from our national forests, waiving every environmental law that might slow the cut. The plan is defended as a “forest health” measure to remove dead and dying trees. A better description was given by the Denver Post: “a frontal assault against public lands on behalf of a few corporate interests.”
Sen. Gorton has also redrafted the Endangered Species Act, essentially repealing it. His bill would allow the Interior Secretary to decide that a species is not worth protecting. It no longer requires the Forest Service or Bureau of Land Management to consider endangered species before opening public lands to private logging, mining, or grazing. No creature on public land would be safe if it stands in the way of anyone’s private plan to make money.
This Congress also plans to:
– Allow road-building in national parks and wildlife refuges.
– Ease restrictions on emissions of 70,000 different water pollutants.
– Turn federal lands back to the states. (That includes Bureau of Land Management lands, national forests, wildlife refuges, and some national parks.)
– Give ranchers who graze cattle on public lands the right to restrict public access to those lands, use water, cut wood, set fires, kill endangered species, and set their own rules about overgrazing.
– Ignore the 30-year-old law that puts into a Land and Water Conservation Fund the $900 million the government receives each year in offshore gas and oil royalties. Federal, state, and town governments use this fund to purchase parks and other land from willing sellers. The Newtsies want the money to go to the general government pot.
Many of these measures are touted as ways to balance the budget. But the budget can’t be the real goal, because this Congress is NOT proposing to ask concessionaires who have monopolies in national parks to pay the government a fair share of their profits. It is NOT suggesting that we stop charging below-market rates for grazing, logging, and mining rights on federal land. The deficit could be reduced by at least $1 billion a year if we ended these giveaways — and of course billions more if we were as austere in the military budget as we are in the social budget.
What is going on is not budget-balancing but resource-grabbing. America’s publicly owned resources have always been up for grabs under either political party, but previously the pace was restrained. Now it’s furious. The grabbers know they have to take advantage of their buddies in power fast, before the public wises up.
I suspect that you good people who voted for the perpetrators of this raid really wanted an end to the practice of passing on debts to our children. You know better than to pretend to balance the budget by selling off the family jewels. I can’t believe for a minute that you think it helps our children to sell our resources cheap in this generation and leave the next generation a denuded, polluted land.
If I’m right about you, we have to do some urgent, firm communicating to Washington (to call Congress: 202-225-3121) — and some careful checking before we vote again. | 2023-14/0000/en_head.json.gz/3661 | {"url": "https://donellameadows.org/archives/lets-hang-on-to-the-family-jewels/", "partition": "head_middle", "language": "en", "source_domain": "donellameadows.org", "date_download": "2023-03-20T09:49:16Z", "digest": "sha1:Z567L5CWF6QACGB7H66FKNRU4UIZLN6G"}
Don't AsK.
We are cheeky, edgy, fierce, fresh, ever-loving and ever-changing.
The design process is a whole lot of work and a whole lot of fun: it begins with a stone, a crystal and some dancing to form an idea of how a design will become a piece that just has to be shared with others.
Don't AsK is a Montreal-based company with designs that show off the head designer's creativity and Scandinavian roots. Each collection is unique and fiercely different from the last and focuses on fun and funky, affordable jewelry that will never stay cooped up in a box.
While design central is here in the city of Montreal, we don't have a cutsie little shop. We sell online so we're accessible to all. We do runway and we do everyday: we are sure you will find something here that you love! Our pieces are on trend, well priced and amazingly wearable: nothing is mass produced!
So when it's gone you're SOLD babe.
Browse around, enjoy the site and remember: when someone stops you (and they will) tell them, "Don't AsK".
C'mon! Let's go shopping.
We love everything!
Chain, glass, hammered metal, Swarovski crystals, pearls, natural stones, wood, feathers, vermeil, nuts, enamel and satin.
Every season Don't AsK will continue to amaze.
Our collections are born out of things we love looking at and can't stop thinking about: the sparkle of crystals, the coolness of metal, the sheen of a pearl, the texture of a natural stone. Don't AsK isn't limited to two seasonal looks, we have thousands of designs in our head that are constantly being brought to life!
Each collection is unique so you never know what's coming next, or when. As a matter of fact, neither do we. So come by often, see what's new and what's next! | 2023-14/0000/en_head.json.gz/3662 | {"url": "https://dont-ask.com/en-de/pages/about-us", "partition": "head_middle", "language": "en", "source_domain": "dont-ask.com", "date_download": "2023-03-20T08:50:07Z", "digest": "sha1:MOFIYTMCTT2X7V5VFUFI3CWG5E7EUXGH"}
Dementia and Guns: A Tragedy Waiting to Happen
You know that I do a lot of planning for families with loved ones with Alzheimer’s, and one of the discussions is always about driving and other dangerous activities. But what about guns?
It’s common for families of those with Alzheimer’s and other forms of dementia to realize that at some point, their loved one shouldn’t be allowed to drive. But fewer people are aware they should exercise the same level of caution when it comes to restricting their loved ones’ access to firearms.
This was one of the findings of a May 2018 study published in the Annals of Internal Medicine covering firearm ownership among Alzheimer’s patients. The study noted that even though 89% of Americans support restricting access to firearms for those with mental illness, there’s been little attention focused on limiting firearm access among elderly dementia patients.
Did you know that there are currently no federal gun laws prohibiting the purchase or possession of firearms by persons with dementia? And only two states—Hawaii and Texas—have laws restricting gun access for dementia patients.
A ticking time bomb
This lack of attention comes despite an increasing number of incidents involving elderly dementia patients shooting and killing family members and caregivers after mistaking them for intruders. And with so many Baby Boomers now entering retirement age, this dangerous situation could get much worse.
In fact, the number of people with dementia is expected to double to around 14 million in the next 20 years, with the vast majority of those over age 65. Since nearly half of people over 65 either own a gun or live with someone who does, it’s clear that firearm safety should be a top priority for those with elderly family members—even if they don’t currently have any signs of dementia.
Not restrictions, safe management
That said, just talking about restricting someone’s access to guns can be highly controversial and polarizing. Many people, especially veterans and those in law enforcement, consider guns—and their right to own them—an important part of their identity. Given this, the study’s authors recommended that families should talk with their elderly loved ones early on about the fact that one day they might have to give up their guns. Physicians suggest bringing up the topic of firearms relatively soon after the individual’s initial dementia diagnosis.
This discussion should be similar to those related to driving, acknowledging the emotions involved and allowing the person to maintain independence and decision control for as long as it’s safe. Even though this can be a very touchy subject, putting off this discussion can literally be life-threatening.
Note that I do not advocate the political positions of either side of the debate, and many questions are raised in that debate. It is important to remember that a confused person with a weapon can lead to terrible consequences, so, this is all about having the conversation.
All part of the plan
Since it relates to so many other end-of-life matters, this discussion should take place as part of the overall estate planning process. One way to handle the risk is to create a legally binding agreement laying out a “firearm retirement date” that’s similar to advance directives addressing the elderly relinquishing their driving privileges.
Such an agreement allows the gun owner to name a trusted family member or friend to take ownership of their firearms once they’ve reached a certain age or stage of dementia. In this way, the process may seem more like passing on a beloved family heirloom and less like giving up their guns.
Consider the legal issues of ownership
The transfer of certain types of firearms must adhere to strict state and federal regulations. Unless the new owner is in full compliance with these requirements, they could inadvertently violate the law simply by taking possession of the guns.
In light of this risk, you should consider creating a “gun trust,” an estate planning tool specially designed to deal with the ownership of firearms. With a gun trust, the firearm is legally owned by the trust, so most of the transfer requirements are avoided, making it a lot easier for family members to manage access after the original owner’s death.
Gun Trusts can be part of your estate planning
Gun trusts can be a valuable planning strategy even for gun owners without dementia. Speak with us to see if a gun trust would be a suitable option for your family.
December 18, 2019, by Doug Koenig | 2023-14/0000/en_head.json.gz/3663 | {"url": "https://dougkoeniglaw.com/dementia-and-guns/", "partition": "head_middle", "language": "en", "source_domain": "dougkoeniglaw.com", "date_download": "2023-03-20T09:44:07Z", "digest": "sha1:5XVIOIMU5LZVYE2DZ74OCO2Y2XOETGGP"}
Dear Clients, Friends, and our Sea to Sky Community,
Our offices are closed to the public, but we know that many of you are nervous about your property sales, your financial security, the real estate market, your property investments, and your housing security. We want to use this opportunity to assure all of our clients that although our doors are locked, our REALTORS®, our property managers, and our staff are just a phone call away to answer your questions during this stressful time.
We at RE/MAX Sea to Sky Real Estate are (obviously) not doctors, nurses, or health officials, but we are a small local business. As such, it is up to us to offer the resources that we do have to help each other in this time. We’ve created this page to share information for:
– Home & Property Owners
– Landlords who want to help their tenants, or who may need help themselves
– Current Property Buyers & Sellers
Realizing that in times like these our homes and real property become even more pertinent to our financial and economic well-being, we are keeping a close eye on the local real estate markets – Squamish, Whistler, and Pemberton. We will be updating the statistics weekly so you can keep tabs on the market.
Lastly, we’d like to take this opportunity to thank our local nurses, doctors, grocers, delivery people, transit drivers, municipal workers, and everyone else who have taken steps to keep our communities running. In order to protect them and the rest of your community, please abide by the Public Health Officer’s social distancing directive. Stay home. Take care of your family. Check on your friends with a phone call or video chat. Be safe. Reach out if you need help. We are here for each other.
Thank you, Sea to Sky neighbours, for your diligence and sense of community at this time.
Stefanie Hostetter
CEO RE/MAX Sea to Sky Real Estate | 2023-14/0000/en_head.json.gz/3664 | {"url": "https://downtown-squamish.com/resources/", "partition": "head_middle", "language": "en", "source_domain": "downtown-squamish.com", "date_download": "2023-03-20T09:54:00Z", "digest": "sha1:FGBCQP7XCOMNXHYXGRC5B7KC7K4GDQHT"}
Caden McDonald, DS #30 OLB, San Diego State
Name: Caden McDonald
College: San Diego State Number: 54
Position: OLB Pos2: ILB/FB/ST
Class/Draft Year: rSr/2023
Data Scout Notes: 2022: 1stC...PLombardi/PNagurski...2021: 1stC...PNagurski...2020: PBednarik/1stC
Caden McDonald, San Diego State, Player News
12/18/22 - OLB Caden McDonald, rSr/2023, San Diego State, 6-2, 240 (DS#30 OLB)
2022 ALL-MOUNTAIN WEST FOOTBALL FIRST TEAM: LB - Caden McDonald, Sr., San Diego State,...McDonald made the all-MW first team for a third consecutive season, joining fellow Aztec Kirk Morrison (2002-04) as the only linebackers in league history (since 1999) to make three all-MW first teams. A member of the Bronko Nagurski Trophy Watch List (most outstanding defensive player), McDonald has totaled 41 tackles (16 solo) on the season in 12 starts, including two sacks, 7.5 tackles for loss, an interception, two pass breakups and nine quarterback hurries. - San Diego State Football
(DS#30 OLB) rSr/2023 OLB Caden McDonald, San Diego State
News Source: San Diego State Football
2022 PRESEASON ROTARY LOMBARDI AWARD WATCHLIST: Caden McDonald, San Diego State, Sr LB,...McDonald has started 31 straight games and played in 48 consecutive games for the Aztecs dating back to the 2018 season. The two-time first-team all-MW honoree recorded 49 tackles (21 solo), including three sacks for 22 yards and 6.5 tackles for loss of 28 yards, seven quarterback hurries, two pass breakups and a blocked PAT in 2021. - San Diego State Football
He is a grizzled veteran, but San Diego State graduate senior linebacker Caden McDonald welcomes the start of fall camp now more than ever. "It's a blessing to come out here with all the boys again," McDonald said. "Doing six years of first days, this first day has been the most exciting. "You get older, you get more mature, you know what's coming. ... All the butterflies." McDonald said added excitement comes from this being SDSU's 100th season of football, the Aztecs are about to debut at Snapdragon Stadium and, to top it off, younger brother Cooper, a transfer from Washington, has joined him on the SDSU defense. - San Diego Union Tribune
News Source: San Diego Times Union
2022 PRESEASON BRONKO NAGURSKI TROPHY WATCHLIST: LB Caden McDonald, San Diego State,...This is the second consecutive season that McDonald has been named to the preseason Bronko Nagurski Watch List. McDonald, who was also named to the MW preseason all-conference team on Thursday, has started 31 straight games and played in 48 consecutive games for the Aztecs dating back to the 2018 season. The two-time first-team all-MW honoree recorded 49 tackles (21 solo), including three sacks for 22 yards and 6.5 tackles for loss of 28 yards, seven quarterback hurries, two pass breakups and a blocked PAT in 2021. He registered three solo tackles, including two tackles for loss (six yards) and a 4-yard sack, in 45 snaps in a 38-14 win at Arizona last season, and had four assisted tackles and a quarterback hurry in 79 snaps in a win over Utah in 2021. - San Diego State Football
San Diego State linebacker Caden McDonald didn't put his tooth under his pillow in the hope of enriching himself after it was knocked out against Utah last season. McDonald has profited from the loss nonetheless. "A lot of people take losing a front tooth as a negative thing," said McDonald, a 6-foot-3 senior from Haslet, Texas. "You get your tooth knocked out and think, 'Dang, this sucks. Well, what now?' ... "I turned it into a business opportunity. I started my own clothing line." McDonald branded himself the Toothless Cowboy, the idea sparked when he was wearing a cowboy hat and a toothless grin in a TikTok photo. McDonald created a catchy logo - a T and sideways C designed to resemble a smile with a tooth missing - started the website toothlesscowboy.com and was open for business. - San Diego Union Tribune
News Source: San Diego Union Tribune
2021 ALL-MOUNTAIN WEST FIRST TEAM (COACHES): LB - Caden McDonald, Sr., San Diego State,...McDonald is on the first team for a second straight season after registering 44 tackles (16 solo) on the year so far, including three sacks for 22 yards and 6.5 tackles for loss of 28 yards, six quarterback hurries and two pass breakups. McDonald (2020-21) joins Kirk Morrison (2002-04), Miles Burris (2010-11), Calvin Munson (2015-16) and Kyahva Tezino (2018-19) as the only Aztec linebackers to make multiple all-MW first teams in league history. - San Diego State Football
Linebacker Cooper McDonald, a 6-foot-3 redshirt freshman at Washington, announced on social media that he will transfer to SDSU, joining the school's 2022 early signing class. "I get the awesome opportunity to play alongside my older brother Caden McDonald while doing so," Cooper wrote on Twitter. Cooper started 10 games this season for the Huskies, totaling 26 tackles with 5 1/2 tackles for a loss and 1 1/2 sacks. Caden confirmed that the linebackers from Haslet, Texas, will play together in 2022 in a social media post over the weekend. "I have decided to return to school and forego declaring for the draft to use my last year of eligibility to handle some unfinished business and be a part of the 100th Aztec football team," Caden wrote on Twitter. "I can't wait to open up the season next year in the new stadium and get to see all of Aztec nation in a place we can finally call home. "I also could not turn down the opportunity to play alongside my little brother while doing all of this." - San Diego Union Tribune
2020 ALL-MOUNTAIN WEST FOOTBALL FIRST TEAM (COACHES): LB - Caden McDonald, Jr., San Diego State,...McDonald earned all-MW accolades for the first time of his career after registering 34 tackles, 10 tackles for loss, 4.5 sacks, three pass breakups, a forced fumble and 10 quarterback hurries. He's just the sixth Aztec linebacker to make an all-MW first team (also Kirk Morrison (2002-04), Freddy Keiaho (2005), Miles Burris (2010-11), Calvin Munson (2015-16) and Kyahva Tezino (2018-19)). McDonald leads the league in tackles for loss, and ranks in a tie for third in tackles for loss yards (48), eighth in sack yards (33) and tied for eighth in total sacks (4.5). - San Diego State Football
SDSU linebacker Caden McDonald was named Mountain West Defensive Player of the Week after leading the Aztecs with 2 1/2 sacks among his eight tackles. McDonald, a junior from Haslet, Texas, appreciated the honor, but wasn't satisfied with the performance. "This is just a good first game for us, but we have a lot more to improve on and get better on, that's for sure," McDonald said. "As a defense, we want to not let them score any points. Six points is too much for us, (186) yards in total offense is too much for us as well. We want to keep them to zero on all of that..."Anything more than zero is not acceptable for our defense. We want to be lockdown all the time and shut down everybody we play. This defense has the mindset to be the best in the country week in and week out." - San Diego Union Tribune
OCT 26 MOUNTAIN WEST DEFENSIVE PLAYER OF THE WEEK: CADEN MCDONALD, SAN DIEGO STATE, Junior, Linebacker, Haslet, Texas/Northwest HS,...Recorded career highs in tackles (8), tackles for loss (3.0) and sacks (2.5)...The 2.5 sacks are the most by an Aztec since Sept. 30, 2017 and tied for sixth-most nationally this season...Led a defense that limited UNLV to just six points, the fewest allowed in a Mountain West game by the Aztecs since Nov. 5, 2016. - Mountain West Football
News Source: Mountain West Football
Tweet #CadenMcDonald, Tweet #Football, | 2023-14/0000/en_head.json.gz/3665 | {"url": "https://draftscout.com/dsprofile.php?PlayerId=1024736&DraftYear=2023", "partition": "head_middle", "language": "en", "source_domain": "draftscout.com", "date_download": "2023-03-20T10:44:38Z", "digest": "sha1:X4U67XGJ5RRJ5CFTLGALU5ZWGYKNB6PW"} | {"ccnet_length": [[0, 7603, 7603.0]], "ccnet_original_length": [[0, 7603, 12902.0]], "ccnet_nlines": [[0, 7603, 24.0]], "ccnet_original_nlines": [[0, 7603, 159.0]], "ccnet_language_score": [[0, 7603, 0.95]], "ccnet_perplexity": [[0, 7603, 313.5]], "ccnet_bucket": [[0, 7603, 0.0]], "rps_doc_curly_bracket": [[0, 7603, 0.0]], "rps_doc_ldnoobw_words": [[0, 7603, 1.0]], "rps_doc_lorem_ipsum": [[0, 7603, 0.0]], "rps_doc_stop_word_fraction": [[0, 7603, 0.29516995]], "rps_doc_ut1_blacklist": [[0, 7603, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 7603, 0.10167201]], "rps_doc_frac_chars_dupe_5grams": [[0, 7603, 0.19827732]], "rps_doc_frac_chars_dupe_6grams": [[0, 7603, 0.14845465]], "rps_doc_frac_chars_dupe_7grams": [[0, 7603, 0.13241007]], "rps_doc_frac_chars_dupe_8grams": [[0, 7603, 0.10741429]], "rps_doc_frac_chars_dupe_9grams": [[0, 7603, 0.10167201]], "rps_doc_frac_chars_top_2gram": [[0, 7603, 0.03377808]], "rps_doc_frac_chars_top_3gram": [[0, 7603, 0.03293363]], "rps_doc_frac_chars_top_4gram": [[0, 7603, 0.02128019]], "rps_doc_frac_all_caps_words": [[0, 7603, 0.05128205]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 7603, 0.0]], "rps_doc_frac_no_alph_words": [[0, 7603, 0.26237329]], "rps_doc_frac_unique_words": [[0, 7603, 0.36755486]], "rps_doc_mean_word_length": [[0, 7603, 4.64028213]], "rps_doc_symbol_to_word_ratio": [[0, 7603, 0.01192606]], "rps_doc_unigram_entropy": [[0, 7603, 5.51281288]], "rps_doc_word_count": [[0, 7603, 1276.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 44, 0.0], [44, 65, 0.0], [65, 101, 0.0], [101, 131, 0.0], [131, 158, 0.0], [158, 257, 0.0], [257, 302, 0.0], [302, 390, 0.0], [390, 974, 0.0], [974, 1031, 0.0], [1031, 1069, 0.0], [1069, 1523, 0.0], [1523, 2171, 0.0], [2171, 2206, 0.0], [2206, 3084, 0.0], [3084, 3910, 0.0], [3910, 3947, 0.0], [3947, 4514, 0.0], [4514, 5544, 0.0], [5544, 6237, 0.0], [6237, 7052, 0.0], [7052, 7529, 0.0], [7529, 7565, 0.0], [7565, 7603, 0.0]], "rps_lines_javascript_counts": [[0, 44, 0.0], [44, 65, 0.0], [65, 101, 0.0], [101, 131, 0.0], [131, 158, 0.0], [158, 257, 0.0], [257, 302, 0.0], [302, 390, 0.0], [390, 974, 0.0], [974, 1031, 0.0], [1031, 1069, 0.0], [1069, 1523, 0.0], [1523, 2171, 0.0], [2171, 2206, 0.0], [2206, 3084, 0.0], [3084, 3910, 0.0], [3910, 3947, 0.0], [3947, 4514, 0.0], [4514, 5544, 0.0], [5544, 6237, 0.0], [6237, 7052, 0.0], [7052, 7529, 0.0], [7529, 7565, 0.0], [7565, 7603, 0.0]], "rps_lines_num_words": [[0, 44, 8.0], [44, 65, 3.0], [65, 101, 6.0], [101, 131, 4.0], [131, 158, 3.0], [158, 257, 7.0], [257, 302, 7.0], [302, 390, 13.0], [390, 974, 91.0], [974, 1031, 9.0], [1031, 1069, 6.0], [1069, 1523, 74.0], [1523, 2171, 113.0], [2171, 2206, 6.0], [2206, 3084, 150.0], [3084, 3910, 142.0], [3910, 3947, 6.0], [3947, 4514, 90.0], [4514, 5544, 185.0], [5544, 6237, 112.0], [6237, 7052, 154.0], [7052, 7529, 78.0], [7529, 7565, 5.0], [7565, 7603, 4.0]], "rps_lines_numerical_chars_fraction": [[0, 44, 0.05], [44, 65, 0.0], [65, 101, 0.06060606], [101, 131, 0.04], [131, 158, 0.17391304], [158, 257, 0.1875], [257, 302, 0.0], [302, 390, 0.24285714], [390, 974, 0.04], [974, 1031, 0.11764706], [1031, 1069, 0.0], [1069, 1523, 0.06004619], [1523, 2171, 0.00486224], [2171, 2206, 0.0], [2206, 3084, 
0.04609929], [3084, 3910, 0.00253807], [3910, 3947, 0.0], [3947, 4514, 0.08365019], [4514, 5544, 0.02302302], [5544, 6237, 0.07210031], [6237, 7052, 0.00768246], [7052, 7529, 0.04545455], [7529, 7565, 0.0], [7565, 7603, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 44, 0.0], [44, 65, 0.0], [65, 101, 0.0], [101, 131, 0.0], [131, 158, 0.0], [158, 257, 0.0], [257, 302, 0.0], [302, 390, 0.0], [390, 974, 0.0], [974, 1031, 0.0], [1031, 1069, 0.0], [1069, 1523, 0.0], [1523, 2171, 0.0], [2171, 2206, 0.0], [2206, 3084, 0.0], [3084, 3910, 0.0], [3910, 3947, 0.0], [3947, 4514, 0.0], [4514, 5544, 0.0], [5544, 6237, 0.0], [6237, 7052, 0.0], [7052, 7529, 0.0], [7529, 7565, 0.0], [7565, 7603, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 44, 0.25], [44, 65, 0.19047619], [65, 101, 0.13888889], [101, 131, 0.4], [131, 158, 0.14814815], [158, 257, 0.14141414], [257, 302, 0.17777778], [302, 390, 0.18181818], [390, 974, 0.10616438], [974, 1031, 0.26315789], [1031, 1069, 0.15789474], [1069, 1523, 0.12995595], [1523, 2171, 0.04938272], [2171, 2206, 0.17142857], [2206, 3084, 0.08200456], [3084, 3910, 0.03995157], [3910, 3947, 0.16216216], [3947, 4514, 0.10405644], [4514, 5544, 0.03009709], [5544, 6237, 0.1038961], [6237, 7052, 0.03680982], [7052, 7529, 0.18448637], [7529, 7565, 0.13888889], [7565, 7603, 0.15789474]], "rps_doc_ml_palm_score": [[0, 7603, 0.76088524]], "rps_doc_ml_wikipedia_score": [[0, 7603, null]], "rps_doc_ml_wikiref_score": [[0, 7603, 0.9728424]], "rps_doc_books_importance": [[0, 7603, -558.22237331]], "rps_doc_openwebtext_importance": [[0, 7603, -8.92815635]], "rps_doc_wikipedia_importance": [[0, 7603, -97.74267821]], "rps_doc_num_sentences": [[0, 7603, 73.0]]} |
https://dula.nyc3.digitaloceanspaces.com/wp-content/uploads/2018/10/08035933/grandma-thinks-you-are-a-sex-offender-if-you-use-facebook.mp4
This Is Why You Don’t Add Your Boss On Facebook
What Facebook Is For | 2023-14/0000/en_head.json.gz/3666 | {"url": "https://dula.tv/videos/grandma-thinks-you-are-a-sex-offender-if-you-use-facebook/", "partition": "head_middle", "language": "en", "source_domain": "dula.tv", "date_download": "2023-03-20T10:28:57Z", "digest": "sha1:VCHRBN7DA5DH35DL5LXFAT6TO3FA2ZCN"} | {"ccnet_length": [[0, 207, 207.0]], "ccnet_original_length": [[0, 207, 1168.0]], "ccnet_nlines": [[0, 207, 3.0]], "ccnet_original_nlines": [[0, 207, 51.0]], "ccnet_language_score": [[0, 207, 0.54]], "ccnet_perplexity": [[0, 207, 336.7]], "ccnet_bucket": [[0, 207, 0.0]], "rps_doc_curly_bracket": [[0, 207, 0.0]], "rps_doc_ldnoobw_words": [[0, 207, 0.0]], "rps_doc_lorem_ipsum": [[0, 207, 0.0]], "rps_doc_stop_word_fraction": [[0, 207, 0.13114754]], "rps_doc_ut1_blacklist": [[0, 207, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 207, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 207, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 207, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 207, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 207, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 207, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 207, 0.0]], "rps_doc_frac_chars_top_3gram": [[0, 207, 0.0]], "rps_doc_frac_chars_top_4gram": [[0, 207, 0.0]], "rps_doc_frac_all_caps_words": [[0, 207, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 207, 0.0]], "rps_doc_frac_no_alph_words": [[0, 207, 0.42622951]], "rps_doc_frac_unique_words": [[0, 207, 0.86666667]], "rps_doc_mean_word_length": [[0, 207, 11.26666667]], "rps_doc_symbol_to_word_ratio": [[0, 207, 0.0]], "rps_doc_unigram_entropy": [[0, 207, 2.52321095]], "rps_doc_word_count": [[0, 207, 15.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 139, 0.0], [139, 187, 0.0], [187, 207, 0.0]], "rps_lines_javascript_counts": [[0, 139, 0.0], [139, 187, 0.0], [187, 207, 0.0]], "rps_lines_num_words": [[0, 139, 1.0], [139, 187, 10.0], [187, 207, 4.0]], "rps_lines_numerical_chars_fraction": [[0, 139, 0.14035088], [139, 187, 0.0], [187, 207, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 139, 0.0], [139, 187, 0.0], [187, 207, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 139, 0.0], [139, 187, 0.20833333], [187, 207, 0.2]], "rps_doc_ml_palm_score": [[0, 207, -1.001e-05]], "rps_doc_ml_wikipedia_score": [[0, 207, null]], "rps_doc_ml_wikiref_score": [[0, 207, -1.001e-05]], "rps_doc_books_importance": [[0, 207, -73.4141626]], "rps_doc_openwebtext_importance": [[0, 207, -26.69303567]], "rps_doc_wikipedia_importance": [[0, 207, -54.5328156]], "rps_doc_num_sentences": [[0, 207, 5.0]]} |
Tag: teaching
Site Update: Course Syllabi and Assignments Added Under Teaching Section
Over the weekend, I added new pages under the Teaching menu option for the courses that I have taught, am teaching, and will teach. Each course page includes descriptions, syllabi, and assignments arranged chronologically by school:
English Composition I, ENG 1101
English Composition II, ENG 1121
Specialized Communication for Technology Students, ENG 1133
Science Fiction, ENG 2420
Writing in the Workplace, ENG 2570
Technical Writing, ENG 2575
If any of these materials might be useful to your course and assignment design, please feel free to adopt or modify as needed.
While assembling these pages, I discovered that some assignments and supporting materials were missing. Of course, it is best pedagogical practice to reflect on and archive these kinds of materials for reference, improvement, and growth.
Posted on September 21, 2015 September 20, 2015 Categories City Tech, Georgia Tech, Kent State, Pedagogy, Science Fiction Tags assignment, citytech, classes, courses, gatech, kentstate, Pedagogy, sciencefiction, syllabus, teaching, technicalcommunication, writing
Minireview: The Reconcilers Graphic Novel Volume 1
The Reconcilers Vol. 1.
While Y and I were sitting for several hours in an airplane–on the ground–I had the pleasure of meeting the writer, actor, and director Erik Jensen. After I mentioned to him that my specific area of training is in Science Fiction, he gave me a graphic novel saying, “here’s some Science Fiction for you.” I was thankful for the gift and thankful for the time on the tarmac to read it!
The graphic novel that he gave me is volume one of The Reconcilers (2010) co-created by R. Emery Bright, Jens Pil Pilegaard, and Jensen. Volume one is written by Jensen and drawn by Shepherd Hendrix. Neal Adams created the cover art.
The narrative takes place in 2165 after the ascendancy of religion-like mega-corporations and the gradual establishment of elaborate gladiatorial matches fought by “Reconcilers” to decide disputes between corporate entities. The story follows Sokor Industries attempting an extra-legal takeover of Hansen Engineering’s claim to the motherlode of exotic, energy-rich “liberty ore.” Hexhammer, Hansen’s miner who discovered the vein, leads their underdog team against Sokor’s seasoned fighters to keep what they had earned. However, Hexhammer’s past choices threaten his ability to overcome his final confrontation with Sokor’s best Reconciler, “Masakor.”
The megacorporations of Frederik Pohl and C.M. Kornbluth’s The Space Merchants and William Gibson’s Neuromancer, as well as Weyland-Yutani of the Alien film series, inform The Reconcilers.
The Reconcilers has a lot of interesting material for thinking through the convergence of corporate personhood, entertainment, religion, capital, and rule of law. I believe that it would be informative to research and engaging to students.
Posted on July 15, 2014 Categories Comics, Review, Science Fiction Tags comicbooks, Comics, cyberpunk, graphicnovel, megacorporation, Research, Review, sciencefiction, teaching
Atlanta Science Fiction Society Talk on Teaching SF at Georgia Tech, Tomorrow
Atlanta Science Fiction Society logo.
If you’re in the area tomorrow, you can catch my talk on teaching science fiction at Georgia Tech at the Atlanta Science Fiction Society meeting at the Sandy Springs Fulton County Library (395 Mount Vernon Hwy NE, Sandy Springs, GA). The meeting begins at 2:30PM, and my talk should begin around 3:30PM. I will bring copies of this handout to the meeting. My talk will focus on the history of teaching SF at Tech, my historical approach to teaching SF, and my emphasis on using SF as a way to develop student literacies in writing, new media, and haptics.
Posted on June 13, 2014 Categories Georgia Tech, Pedagogy, Science Fiction Tags gatech, georgiatech, haptics, historyofsf, Pedagogy, sciencefiction, teaching, teachingphilosophy, twitter
Recovered Writing, PhD in English, Teaching College Writing, Quiz, What do people do when they write? June 16, 2008
This is the fifty-fourth post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.
Before I could accept my teaching fellowship at Kent State University, I needed to take the graduate seminar, “Teaching College English.” I was fortunate to have the opportunity to take this class from Professor Brian Huot. At the time, I thought my primary concern was putting together my first syllabus, but through the seminar, I learned the importance of meeting student needs, considering outcomes, meeting students on the page, helping students improve their command of rhetoric and multimodality with a portfolio, and considering student work holistically (something that I continue to do with the Georgia Tech WCP’s WOVEN modalities and programmatic rubric).
This third of four Recovered Writing posts from Teaching College Writing is my response to a quiz on an assigned reading on “What do people do when they write?” I can’t find the essay/handout that we were responding to, but I think that some of the thoughts that I put down regarding mechanics/surface vs. content/depth are interesting.
Jason W. Ellis
Professor Brian Huot
Teaching College Writing
“What do people do when they write?” Quiz
1) What can you say about the kinds of responses from students in the two groups? Are the first group of responses for each grade level different than the second group–what are the differences or similarities for each group in each grade level?
The responses from students in the first group of each grade level are typically about thought, expression, and explanation, and the second group responses of each grade level are about practices and methodology. Linking these responses back to the class, the first groups are aligned with function, and the second groups are aligned with form.
In the third-, fourth-, and fifth-grade responses, the first group overwhelmingly uses the following key words and phrases: think, open their minds, and share…thoughts. Also, they write for a purpose such as to relax, avoid boredom, or to communicate with friends and relatives. The second group’s responses focus on practical matters and the mechanics of writing. For example, these students talk about holding the pencil, sitting down to write, making letters, moving the hand and pencil, and wasting ink and pencil lead.
Again, the responses of the first group of ninth graders are generally about expression. They “tell about things,” “get stuff across,” and “express their thoughts.” The responses from the second group are, like the second group of third to fifth graders, about the physicality of writing. For example, they “just write stuff down,” “hold a pencil and move their hands,” and “put the point of the pencil to the paper and start making words and letters.” There are exceptions in both groups that could be interpreted as belonging to the other group if the groups are divided based on function and form.
The college freshmen groups are also split along lines of function and form. The first group is largely about conveying, explaining, translating thoughts into words, and revealing thoughts, ideas, and emotions. The second group of college freshmen is more closely aligned to the first than the other two grade levels, because there are some responses about expression and communication. However, the majority of the responses concern mechanics (e.g., “usage and grammar”) and writing methods (e.g., “First you pick your topic, then you make sure you have enough information. Then you rewrite and check the spelling and copy it down” and “They express their overall views on a given topic and later draw conclusions in a patterned coherent fashion”). Also, there’s a response that describes the act of writing.
2) Do you see any similarities across grade levels for each of the groups? Are there certain characteristics of either the group-one or group-two responses? What are those characteristics?
In general, the responses of the first group of each grade level are about the function of writing–i.e., communication and the expression of ideas, and those of the second group of each grade level are about the form of writing–i.e., the act of writing and methods of writing.
3) Considering your answers to the first two questions, what variable (consideration, category, quality) did the researchers use to separate the different groups of responses within each grade level?
Considering our readings of the past week, and the writing concept that “form follows function,” it seems that the responses are grouped based on writing ability. The first group in each grade level has stronger writers, and the second-group respondents are weaker writers. The stronger writers understand the function of writing, because they’ve internalized that through their acquisition of writing. The weaker writers respond by describing the literal action of writing or the process of using the plug-and-chug method of writing an essay in high school. They are thinking about the surface level of writing rather than what lies underneath, as the first-group respondents did.
Posted on May 16, 2014 May 11, 2014 Categories Kent State, Pedagogy, Recovered Writing Tags content, kentstate, ksu, mechanics, Pedagogy, phd, recoveredwriting, teaching, writing
Recovered Writing, PhD in English, Teaching College Writing, Annotated Bibliography of Teaching SF Resources, June 29, 2008
This is the fifty-third post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.
This second of four Recovered Writing posts from Teaching College Writing is a brief annotated bibliography of teaching science fiction resources. Professor Huot asked us to do research in our specific discipline and report back what we found. This kind of work has become an integral part of my professionalization as an educator (research+pedagogy) and reflective practitioner (how did this other person do that–how can I incorporate/modify/adapt their approach into mine–what worked/didn’t work and how can I make it better?).
Teaching Science Fiction Annotated Bibliography
Attebery, Brian. “Teaching Fantastic Literature.” Science-Fiction Studies 23:3 (Nov. 1996): 406-410.
Instead of focusing his course on Science Fiction, Attebery combines fantasy and SF into one course under the umbrella of the fantastic. Again, this is a literature course, not a composition course, but the important lesson to take away from his essay is that students with fantasy/SF backgrounds, which are not necessarily the same thing, as well as students without an inkling of experience with the fantastic, all have something to bring to class discussion. Also, some fantastic literature carries more cultural or historic baggage than students may already be acquainted with, which may break down discussion or require more lecturing or assigned reading in order to prepare students for engaging a particular text.
Bengels, Barbara. “The Pleasures and Perils of Teaching Science Fiction on the College Level.” Science-Fiction Studies 23:3 (Nov. 1996): 428-431.
Bengels builds on examples from Science Fiction and criticism, both on teaching SF, “to address the inherent and unique difficulties of teaching a body of literature that is changing even as we attempt to examine it…to convey the excitement and sense of wonder that continues to set science fiction apart from any other form of literature” (428). Most importantly, she suggests that, “There’s a special sense of community in the sf world that finds its way right into the classroom; new ideas must be bounced off one another, making for very exciting classroom discussions: new words, new worlds, new concepts all to be explored together” (430).
Csicsery-Ronay, Jr., Istvan. “The Seven Beauties of Science Fiction.” Science-Fiction Studies 23:3 (Nov. 1996): 385-388.
Csicsery-Ronay begins his essay with this striking line: “Like being hanged, teaching introductory sf courses to undergraduates focuses the mind wonderfully” (385). He is addressing the teaching of Gunn’s SF genre course, but he provides a great framework for introducing students to SF through a handout titled, “WHAT MAKES SCIENCE FICTION SCIENCE FICTION?” (386). This handout, perhaps given after having students read an emblematic SF short story, would be a powerful tool for opening discussion about what constitutes SF and what our students think SF is. Furthermore, he responds to what is implicitly said in Bengels, Gunn, and others when he writes, “My sf texts must also introduce students to important philosophical, social, and literary ideas that they might not encounter anywhere else, given the state of contemporary higher education” (386). This significant accusation reflects the potential of SF to engage students in ideas and critical thought that they would not otherwise encounter.
Elkins, Charles and Darko Suvin. “Preliminary Reflections on Teaching Science Fiction Critically.” Science-Fiction Studies 6 (1979): 263-270.
There are some very practical and insightful contributions by Elkins and Suvin in this Marxist essay regarding the teaching of SF. The authors propose that, “The main and the highest goal of SF teaching–as of all teaching–ought, in our opinion, to be a specific form of civic education” (267). SF is great for inculcating critical thinking, because SF often turns accepted systems upside-down. Introducing students to this and discussing what’s in the text and what the text leaves out should raise their ability to see beneath the surface of the text. Elkins and Suvin go on to suggest that, “Teaching SF…involves description and assessment, interpretation and evaluation; teaching SF is an act of literary criticism fused with the communication of that criticism” (268). In this passage, the authors are not literally saying that SF is literary criticism in the academic sense of an analysis of Shakespeare, but rather, SF is a critical literature that engages social issues. This is the power of SF that is useful for generating discussion in the introductory college writing classroom.
Evans, Arthur B. and R.D. Mullen. “North American College Courses in Science Fiction, Utopian Literature, and Fantasy.” Science-Fiction Studies 23:3 (Nov. 1996): 437-528.
Evans and Mullen compiled this list of SF, utopian, and fantasy courses complete with descriptions and book lists from colleges and universities all over the world. It also includes lists of works, authors, and films most often assigned.
Finch, Sheila. “Dispatches from the Trenches: Science Fiction in the Classroom.” Extrapolation 41:1 (Spring 2000): 28-35.
Finch writes that SF is a uniquely appropriate genre for stimulating student involvement and discussion, because it serves all the functions of other literature with, “the added distinction of being…a literature of ideas to think about in a peculiarly new way, what Albert Einstein called Gedankenexperimenten” (29). The thought experiment aspect of SF is indeed powerful for generating discussion, because it presents a new view to a (perhaps) mundane subject, and it begs the reader to critically evaluate the thought experiment on the surface narrative as well as what lies beneath. Like Bengels, Finch declares, “SF is a literature of ideas,” which can be employed as a useful tool in developing writing students’ skills at responding to things that they might not have considered before (31).
Gunn, James. “Teaching Science Fiction.” Science-Fiction Studies 23:3 (Nov. 1996): 377-384.
Gunn’s essay primarily concerns his own approaches to teaching SF as a genre course, and he makes the claim that of all of the SF courses available at various schools, “They seem to be as varied as the colleges and universities at which they are taught, and a number seem to address the question of what science fiction is and how to read it, that is, they are genre courses. But I would argue that there should be more” (377). In regard to his own various approaches to teaching SF, he identifies three course themes: 1) “the great books,” 2) “the ideas in science fiction,” and 3) “the historical approach.” He doesn’t address SF in the introductory writing classroom, but I believe his “ideas” theme is appropriate for generating discussion and leading into student essay topics without the course taking on a literature-laden mood.
Mullen, R.D. “Science Fiction in Academe.” Science-Fiction Studies 23:3 (Nov. 1996): 371-374.
This is a short history of the introduction of SF into the American college classroom. It includes early course descriptions and book lists.
Ontell, Val. “Imagine That! Science Fiction as a Learning Motivation.” Community & Junior College Libraries 12:1 (2003): 57-70.
This essay overflows with numerous examples of SF and fantasy stories, TV shows, and films, and how they may be used to engage our students’ attention and imagination. In addition to all of Ontell’s fabulous lists and contextualizations, she points out how the fantastic is an important learning tool: “Whether the students are in the elementary grades, middle school, high school, or higher, it is the function of teachers and librarians to provide the tools that enable them to question intelligently. Science Fiction provides many vehicles for inculcating those tools in a variety of subjects by stimulating the imagination and thus motivating students to learn” (57). In the writing classroom, building our students’ ability to “question intelligently” is essential to their success as readers and stronger writers.
Samuelson, David N. “Adventures in Paraliterature.” Science-Fiction Studies 23:3 (Nov. 1996): 389-392.
Samuelson provides a plethora of author and work successes in his classes. Also, he notes the usefulness of group presentations on particular works or authors to share with the class, and he lauds the use of a “cumulative journal” or portfolio in the classroom.
Posted on May 14, 2014 May 11, 2014 Categories Kent State, Pedagogy, Recovered Writing, Science Fiction Tags bibliography, composition, englishcomposition, kentstate, ksu, Pedagogy, phd, recoveredwriting, Research, sciencefiction, teaching
I’m a Recipient of this Year’s Class of 1940 Course Survey Teaching Effectiveness Award!
Recently, I was told that I was a recipient of this year’s Class of 1940 Course Survey Teaching Effectiveness Award from the Georgia Institute of Technology!
The selection criteria for the Class of 1940 Course Survey Teaching Effectiveness Award are: “During the Fall 2012 and/or Spring 2013 semesters, a CIOS [Course-Instructor Opinion Survey] response rate of at least 85%, and either a class size of at least 40 students with a CIOS score for the question which reads, “Overall, this instructor is an effective teacher” of at least 4.8; or a class size of at least 15 students and a CIOS score for the same question of at least 4.9; or a 5 (or greater) credit course with a size of at least 10 and a CIOS score for the same question of at least 4.9.”
I qualified in the middle category, because my classes are typically 25 students/each and 3 credit hours/each.
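For the programmatically inclined, those criteria reduce to a small decision rule. Here is a minimal Python sketch of my reading of the award logic (the function and variable names are mine for illustration, not anything official from Georgia Tech):

# Hypothetical sketch of the Class of 1940 award eligibility rule, assuming
# response_rate is a fraction and cios_score is the "effective teacher" item.
def qualifies(class_size, cios_score, credit_hours, response_rate):
    if response_rate < 0.85:  # CIOS response rate of at least 85%
        return False
    return ((class_size >= 40 and cios_score >= 4.8) or   # large-class category
            (class_size >= 15 and cios_score >= 4.9) or   # middle category
            (credit_hours >= 5 and class_size >= 10 and cios_score >= 4.9))

print(qualifies(25, 4.9, 3, 0.90))  # True for a typical 25-student, 3-credit class

With classes of 25 students at 3 credit hours each, only the middle branch is ever in play, which is why the 4.9 threshold was the one that mattered for me.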
I am deeply honored to be recognized by my students and institution with this award, and its monetary component is certainly helpful and appreciated.
Georgia Tech’s teaching awards will be given publicly at the upcoming Celebrating Teaching Day on March 6, 2014. I’m looking forward to it!
Posted on January 9, 2014 Categories Georgia Tech, Pedagogy Tags award, gatech, georgiatech, Pedagogy, recognition, teaching
Science Fiction, LMC 3214 at Georgia Tech, Summer 2013 Begins (Syllabus Attached)
SF vs Sci-Fi Brainstorming.
Today, I began teaching my first Science Fiction class at Georgia Tech (LMC 3214 SS2). It is a short-session class, so my students and I will explore the history of SF in only five weeks on a grueling 4 days per week, 2 hours per day schedule.
During our first class today, we introduced ourselves, discussed the syllabus and schedule [available here: ellis-jason-syllabus-lmc3214-summer2013], and discussed the difference between SF and Sci-Fi.
Following a short break after reading the syllabus, I conducted an interactive exercise where I wrote “Science Fiction (SF)” on the left side of the chalkboard and “Sci-Fi” on the right side. I sketched out the differences between the two terms and how we might use them to identify different types of SF. Then, I handed the chalk to a student who I asked to go to the board and write a type of SF that she liked in the spot that she felt best represented it in the SF/Sci-Fi continuum. As a class, we would discuss these examples. The other students and I would help point out how we might view these examples in different ways along the SF/Sci-Fi axis. Each student would hand off the chalk to the next student. We completed two rounds of this before running out of time in class.
I think that I have an excellent group of students. Most are SF fans invested in the genre in one media form or another. Some students are there for pragmatic reasons. I believe that as the class unfolds all of my students will find interesting and significant connections to their thinking, life, and work.
Tomorrow, we begin discussing Mary Shelley’s Frankenstein.
Posted on June 24, 2013 Categories Georgia Tech, Pedagogy, Science Fiction Tags georgiatech, literature, lmc3214, sciencefiction, scifi, sf, teaching
Dean Griffin Day Luncheon for Thank a Teacher Recipients, and A Meditation on Teaching, Passion, and MOOCs
George C. Griffin, 1918
Today, the Georgia Tech Alumni Student Ambassadors and the Georgia Tech Center for the Enhancement of Teaching and Learning hosted the Dean Griffin Day Luncheon to recognize recipients of “Thank a Teacher” notes. I was honored by a Thank a Teacher note from one of my students.
Associate Vice Provost for Learning Excellence Donna Llewellyn told us about the origins of the Thank a Teacher program and recited some of the notes that recipients had received.
Marilyn Somers, Director of Georgia Tech’s Living History Program, guided us through enjoyable multimedia-driven stories about Dean George C. Griffin. Her enthusiasm for Georgia Tech is only matched by her passion as a storyteller.
Vice Provost for Undergraduate Education Colin Potts delivered a meditation on what exactly it is that makes a great teacher and how that relates to the modern development of online education in MOOCs (massive open online courses). The two most provocative things that I took away from his talk were the question, “What makes a good teacher?,” and the observation that MOOCs are part of the evolution of education but not the end.
On my walk back to the Hall Building, I thought about Dr. Potts’ question: What makes a good teacher? Of course, I have given this idea a lot of thought before and after entering the profession, but it is a question that we as educators should continually return to in our work as reflective practitioners. The best quality that I have found in my teachers (and I mean those people who are educators in the broadest sense of the word) is passion. This includes a passion for the material being taught, a passion for student learning and success, a passion for engaging others, a passion for life-long learning, a passion for energetic discourse, a passion for understanding, a passion for giving back to a community, a passion for being a part of larger conversations beyond the classroom, and a passion for kindness. What’s intriguing about my experiences with some fantastic teachers during my life is that I do not believe that any of them perform, demonstrate, or conduct these same passions in the same way. There are many different paths to these things, and it’s amazing to me how many different people tread very different paths yet have achieved for me the same positive and enriching outcomes.
This reminds me of something else that Dr. Potts warned about regarding MOOCs–the impulse of some to promote a singular, superstar educator as the one way for a course to be delivered and taught. In a smaller way, I think back to my Calculus education at Georgia Tech. There were simply some professors who I could learn from–that is, their teaching style and methodology synced, jived, and meshed with my thinking and learning ability. The professors who I did learn Calculus best from might not have been the exemplars of the profession at Tech at that time, but they were, to me anyways, the best educators of Calculus (I should know, because I had some false starts early on in my educational career). We have to be very careful about the choices that we make as an institution and as a profession as we move further into offering MOOCs. These choices should extend beyond the calculus of student completion rates. We have to consider the effects MOOCs will have on pedagogy and educators. How will MOOCs, over time, influence education? How will MOOCs influence student success in those areas beyond the explicit concerns of a course that teachers often provide and encourage (finding out how students are doing, having informal chats, making sure students are doing okay, etc.)? Will MOOCs push out some educators and educational styles in favor of others? Can the passions of educators be provided/conveyed and can the passions of students for learning, solving puzzles, and engaging discourses be fostered in a MOOC?
A final note: I am listening to Dr. Eric Rabkin’s lectures on tape for his Science Fiction: The Technological Imagination course from the University of Michigan. Certainly, he is passionate about science fiction, and it is, I believe, unavoidable for his passion to infect his audience. He knows the material, and he is obviously excited to convey this knowledge to his students (in the classroom and in the world–those of us listening to the lectures on tape). However, Dr. Rabkin cannot provide the same kinds of things as a teacher (educator, mentor, counselor, etc.) in a MOOC or lectures on tape that he can provide in a class of reasonable size (another issue). Don’t get me wrong–Dr. Rabkin is a fantastic person and I count him among my professional friends. However, there are limitations to what an educator can and cannot do in a MOOC or lectures on tape. For example, in his own highly popular MOOC, I imagine that he cannot read all of the comments or questions of every student (when I see Dr. Rabkin next, I will certainly ask him how he compares his classroom and MOOC teaching). This is possible when you have reasonable class enrollments and course loads (this leads into another area of concern about having too large of a class for a qualitative and composition-oriented course–there is a point at which the teacher cannot provide the necessary passion for all students. In that case, too big a class, too many classes, or a MOOC can become indistinguishable from the perspective of the educator). Of course, I can see that the objectives of the classroom learning environment compared to the MOOC/lectures on tape should be different. I am left wondering, though, if everyone who promotes MOOCs truly recognizes the different affordances of each without trying to make one into the other at the cost of each.
Posted on April 18, 2013 Categories Georgia Tech, Pedagogy Tags education, ericrabkin, gatech, georgiatech, luncheon, mooc, teacher, teaching, thankateacher
Teaching Portfolios and Reflection
It is essential to regularly reflect on teaching, and I do this after every class that I teach. When a course is over and I receive my students’ Student Surveys of Instruction, I begin another round of reflection. It is at this point, beyond the usually favorable feedback that I get from students during the class, that I receive the feedback that some students may be unsure about sharing. I am happy that my current college writing students are not so shy, but I am critiquing my college writing II class from Fall 2010 as a result of the no-holds-barred comments that I received from students. This is a constructive process, because I want to make my classes as successful and engaging as possible for my future students. It is unfortunate that I only hear some of these complaints now, after the fact, but it is worthwhile that student voices can be included in the reflective process of their teachers.
Along with this process of reflection and reviewing student comments, I am also putting together the most thorough teaching portfolio that I have ever done. I have the beginnings of a teaching portfolio from past exercises and most recently from putting together a packet for the Midwestern Association of Graduate Schools’ Excellence in Teaching Award. In the packet that I am assembling now, I am thinking about and justifying certain elements of my portfolio. I am working through the rationalizations and results of particular choices that I have made as a composition instructor at Kent State. This is all very useful work for my development as a teacher, and it is giving me additional ideas about how to conclude my current college writing class and expand my future college writing classes.
Posted on April 9, 2011 Categories Pedagogy, Personal Tags Pedagogy, portfolio, postaday2011, reflection, teaching
Hasslin’ the Hasslers, Prospective Lit PhD Student, and Teaching Capgras Syndrome
Last night, Y and I walked down to Mack and Sue Hassler’s house for a visit. We have all been so busy–Y and I with dissertating and teaching, Sue with her music festival, and Mack with teaching and Faculty Senate–that we haven’t really spoken to one another since the beginning of the semester. Y and I enjoy our visits with the Hasslers, because we can engage in shoptalk as easily as anything not related to the academy.
Today, Dave and I met with a prospective Literature PhD student. We all went to the Ratskeller under the Kent State Student Center to talk about the program over coffee, tea, and a smoothie.
Since then, I have been in the Library working on my lesson plans for this week. My students are beginning the last phase of the semester during which time we will read Rivka Galchen’s Atmospheric Disturbances. Unlike in Richard Powers’ The Echo Maker, the reader never learns definitively whether Leo Liebenstein has Capgras Syndrome, but it is this uncertainty that drives home the point about neurological disorders. The person afflicted with a neurological deficit, illness, or damage has had his or her world shifted in a fundamental way, because it is with the brain that we make sense of the world. If its architecture or operation is impaired in some way, it is unlikely if not impossible for the person to fully grasp what has happened to him or herself. Of course, the person can be told and possibly convinced by someone else, but even this kind of explanation cannot change the fundamental feeling or understanding that the brain disallows due to its creation of our experience of the world. I am looking forward to the classroom discussions this week.
Posted on April 4, 2011 Categories Kent State, Personal, The BrainTags capgras, kenstate, postaday2011, teaching | 2023-14/0000/en_head.json.gz/3667 | {"url": "https://dynamicsubspace.net/tag/teaching/", "partition": "head_middle", "language": "en", "source_domain": "dynamicsubspace.net", "date_download": "2023-03-20T10:20:06Z", "digest": "sha1:JQ54X62AA5CEW5IDKF2GMWBTRJKZCICD"} | {"ccnet_length": [[0, 32683, 32683.0]], "ccnet_original_length": [[0, 32683, 37482.0]], "ccnet_nlines": [[0, 32683, 101.0]], "ccnet_original_nlines": [[0, 32683, 170.0]], "ccnet_language_score": [[0, 32683, 0.95]], "ccnet_perplexity": [[0, 32683, 336.5]], "ccnet_bucket": [[0, 32683, 0.0]], "rps_doc_curly_bracket": [[0, 32683, 0.0]], "rps_doc_ldnoobw_words": [[0, 32683, 0.0]], "rps_doc_lorem_ipsum": [[0, 32683, 0.0]], "rps_doc_stop_word_fraction": [[0, 32683, 0.3896329]], "rps_doc_ut1_blacklist": [[0, 32683, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 32683, 0.06602673]], "rps_doc_frac_chars_dupe_5grams": [[0, 32683, 0.10464897]], "rps_doc_frac_chars_dupe_6grams": [[0, 32683, 0.09410879]], "rps_doc_frac_chars_dupe_7grams": [[0, 32683, 0.0882364]], "rps_doc_frac_chars_dupe_8grams": [[0, 32683, 0.0792396]], "rps_doc_frac_chars_dupe_9grams": [[0, 32683, 0.06971579]], "rps_doc_frac_chars_top_2gram": [[0, 32683, 0.00545831]], "rps_doc_frac_chars_top_3gram": [[0, 32683, 0.00414079]], "rps_doc_frac_chars_top_4gram": [[0, 32683, 0.00711462]], "rps_doc_frac_all_caps_words": [[0, 32683, 0.03040807]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 32683, 0.0]], "rps_doc_frac_no_alph_words": [[0, 32683, 0.17520088]], "rps_doc_frac_unique_words": [[0, 32683, 0.29227238]], "rps_doc_mean_word_length": [[0, 32683, 5.08129304]], "rps_doc_symbol_to_word_ratio": [[0, 32683, 0.00063022]], "rps_doc_unigram_entropy": [[0, 32683, 6.1532114]], "rps_doc_word_count": [[0, 32683, 5228.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 14, 0.0], [14, 87, 0.0], [87, 320, 0.0], [320, 352, 0.0], [352, 385, 0.0], [385, 445, 0.0], [445, 471, 0.0], [471, 506, 0.0], [506, 534, 0.0], [534, 661, 1.0], [661, 896, 1.0], [896, 1159, 0.0], [1159, 1210, 0.0], [1210, 1234, 1.0], [1234, 1620, 1.0], [1620, 1854, 1.0], [1854, 2512, 1.0], [2512, 2702, 1.0], [2702, 2942, 1.0], [2942, 3118, 0.0], [3118, 3196, 0.0], [3196, 3234, 1.0], [3234, 3790, 1.0], [3790, 3976, 0.0], [3976, 4092, 0.0], [4092, 4982, 1.0], [4982, 5649, 1.0], [5649, 5986, 1.0], [5986, 6001, 0.0], [6001, 6022, 0.0], [6022, 6047, 0.0], [6047, 6089, 0.0], [6089, 6334, 1.0], [6334, 6679, 1.0], [6679, 7201, 1.0], [7201, 7802, 1.0], [7802, 8612, 1.0], [8612, 8792, 1.0], [8792, 9069, 1.0], [9069, 9269, 1.0], [9269, 9946, 1.0], [9946, 10124, 0.0], [10124, 10248, 0.0], [10248, 11137, 1.0], [11137, 11667, 1.0], [11667, 11715, 0.0], [11715, 11816, 1.0], [11816, 12534, 1.0], [12534, 12680, 1.0], [12680, 13326, 1.0], [13326, 13447, 1.0], [13447, 14450, 1.0], [14450, 14592, 1.0], [14592, 15681, 1.0], [15681, 15852, 1.0], [15852, 16090, 1.0], [16090, 16212, 1.0], [16212, 17009, 1.0], [17009, 17101, 1.0], [17101, 17937, 1.0], [17937, 18031, 1.0], [18031, 18172, 1.0], [18172, 18300, 1.0], [18300, 19120, 1.0], [19120, 19223, 1.0], [19223, 19485, 1.0], [19485, 19724, 0.0], [19724, 19813, 1.0], [19813, 19971, 1.0], [19971, 20567, 1.0], [20567, 20678, 1.0], [20678, 20824, 1.0], [20824, 20964, 1.0], [20964, 21088, 0.0], [21088, 21170, 0.0], [21170, 21198, 1.0], [21198, 21442, 1.0], [21442, 21644, 1.0], [21644, 22427, 1.0], [22427, 22735, 1.0], [22735, 22794, 1.0], [22794, 22943, 0.0], [22943, 23050, 0.0], 
[23050, 23074, 0.0], [23074, 23352, 1.0], [23352, 23532, 1.0], [23532, 23764, 1.0], [23764, 24193, 1.0], [24193, 25406, 1.0], [25406, 26913, 1.0], [26913, 28774, 1.0], [28774, 28945, 0.0], [28945, 28980, 0.0], [28980, 29896, 1.0], [29896, 30695, 1.0], [30695, 30809, 0.0], [30809, 30891, 0.0], [30891, 31314, 1.0], [31314, 31505, 1.0], [31505, 32571, 1.0], [32571, 32683, 0.0]], "rps_lines_javascript_counts": [[0, 14, 0.0], [14, 87, 0.0], [87, 320, 0.0], [320, 352, 0.0], [352, 385, 0.0], [385, 445, 0.0], [445, 471, 0.0], [471, 506, 0.0], [506, 534, 0.0], [534, 661, 0.0], [661, 896, 0.0], [896, 1159, 0.0], [1159, 1210, 0.0], [1210, 1234, 0.0], [1234, 1620, 0.0], [1620, 1854, 0.0], [1854, 2512, 0.0], [2512, 2702, 0.0], [2702, 2942, 0.0], [2942, 3118, 0.0], [3118, 3196, 0.0], [3196, 3234, 0.0], [3234, 3790, 0.0], [3790, 3976, 0.0], [3976, 4092, 0.0], [4092, 4982, 0.0], [4982, 5649, 0.0], [5649, 5986, 0.0], [5986, 6001, 0.0], [6001, 6022, 0.0], [6022, 6047, 0.0], [6047, 6089, 0.0], [6089, 6334, 0.0], [6334, 6679, 0.0], [6679, 7201, 0.0], [7201, 7802, 0.0], [7802, 8612, 0.0], [8612, 8792, 0.0], [8792, 9069, 0.0], [9069, 9269, 0.0], [9269, 9946, 0.0], [9946, 10124, 0.0], [10124, 10248, 0.0], [10248, 11137, 0.0], [11137, 11667, 0.0], [11667, 11715, 0.0], [11715, 11816, 0.0], [11816, 12534, 0.0], [12534, 12680, 0.0], [12680, 13326, 0.0], [13326, 13447, 0.0], [13447, 14450, 0.0], [14450, 14592, 0.0], [14592, 15681, 0.0], [15681, 15852, 0.0], [15852, 16090, 0.0], [16090, 16212, 0.0], [16212, 17009, 0.0], [17009, 17101, 0.0], [17101, 17937, 0.0], [17937, 18031, 0.0], [18031, 18172, 0.0], [18172, 18300, 0.0], [18300, 19120, 0.0], [19120, 19223, 0.0], [19223, 19485, 0.0], [19485, 19724, 0.0], [19724, 19813, 0.0], [19813, 19971, 0.0], [19971, 20567, 0.0], [20567, 20678, 0.0], [20678, 20824, 0.0], [20824, 20964, 0.0], [20964, 21088, 0.0], [21088, 21170, 0.0], [21170, 21198, 0.0], [21198, 21442, 0.0], [21442, 21644, 0.0], [21644, 22427, 0.0], [22427, 22735, 0.0], [22735, 22794, 0.0], [22794, 22943, 0.0], [22943, 23050, 0.0], [23050, 23074, 0.0], [23074, 23352, 0.0], [23352, 23532, 0.0], [23532, 23764, 0.0], [23764, 24193, 0.0], [24193, 25406, 0.0], [25406, 26913, 0.0], [26913, 28774, 0.0], [28774, 28945, 0.0], [28945, 28980, 0.0], [28980, 29896, 0.0], [29896, 30695, 0.0], [30695, 30809, 0.0], [30809, 30891, 0.0], [30891, 31314, 0.0], [31314, 31505, 0.0], [31505, 32571, 0.0], [32571, 32683, 0.0]], "rps_lines_num_words": [[0, 14, 2.0], [14, 87, 10.0], [87, 320, 36.0], [320, 352, 5.0], [352, 385, 5.0], [385, 445, 7.0], [445, 471, 4.0], [471, 506, 6.0], [506, 534, 4.0], [534, 661, 23.0], [661, 896, 34.0], [896, 1159, 30.0], [1159, 1210, 7.0], [1210, 1234, 4.0], [1234, 1620, 72.0], [1620, 1854, 41.0], [1854, 2512, 89.0], [2512, 2702, 27.0], [2702, 2942, 36.0], [2942, 3118, 19.0], [3118, 3196, 12.0], [3196, 3234, 5.0], [3234, 3790, 99.0], [3790, 3976, 20.0], [3976, 4092, 19.0], [4092, 4982, 142.0], [4982, 5649, 102.0], [5649, 5986, 57.0], [5986, 6001, 3.0], [6001, 6022, 3.0], [6022, 6047, 3.0], [6047, 6089, 8.0], [6089, 6334, 43.0], [6334, 6679, 55.0], [6679, 7201, 82.0], [7201, 7802, 102.0], [7802, 8612, 128.0], [8612, 8792, 29.0], [8792, 9069, 48.0], [9069, 9269, 29.0], [9269, 9946, 106.0], [9946, 10124, 23.0], [10124, 10248, 17.0], [10248, 11137, 142.0], [11137, 11667, 80.0], [11667, 11715, 5.0], [11715, 11816, 11.0], [11816, 12534, 114.0], [12534, 12680, 20.0], [12680, 13326, 106.0], [13326, 13447, 15.0], [13447, 14450, 153.0], [14450, 14592, 17.0], [14592, 15681, 175.0], [15681, 15852, 23.0], [15852, 16090, 
38.0], [16090, 16212, 16.0], [16212, 17009, 126.0], [17009, 17101, 11.0], [17101, 17937, 144.0], [17937, 18031, 12.0], [18031, 18172, 23.0], [18172, 18300, 17.0], [18300, 19120, 127.0], [19120, 19223, 12.0], [19223, 19485, 44.0], [19485, 19724, 27.0], [19724, 19813, 14.0], [19813, 19971, 26.0], [19971, 20567, 110.0], [20567, 20678, 17.0], [20678, 20824, 24.0], [20824, 20964, 23.0], [20964, 21088, 15.0], [21088, 21170, 12.0], [21170, 21198, 4.0], [21198, 21442, 47.0], [21442, 21644, 24.0], [21644, 22427, 145.0], [22427, 22735, 54.0], [22735, 22794, 7.0], [22794, 22943, 18.0], [22943, 23050, 17.0], [23050, 23074, 4.0], [23074, 23352, 47.0], [23352, 23532, 29.0], [23532, 23764, 34.0], [23764, 24193, 72.0], [24193, 25406, 209.0], [25406, 26913, 253.0], [26913, 28774, 319.0], [28774, 28945, 21.0], [28945, 28980, 4.0], [28980, 29896, 163.0], [29896, 30695, 131.0], [30695, 30809, 13.0], [30809, 30891, 11.0], [30891, 31314, 77.0], [31314, 31505, 34.0], [31505, 32571, 181.0], [32571, 32683, 15.0]], "rps_lines_numerical_chars_fraction": [[0, 14, 0.0], [14, 87, 0.0], [87, 320, 0.0], [320, 352, 0.13333333], [352, 385, 0.12903226], [385, 445, 0.06896552], [445, 471, 0.16666667], [471, 506, 0.12121212], [506, 534, 0.15384615], [534, 661, 0.0], [661, 896, 0.0], [896, 1159, 0.04897959], [1159, 1210, 0.02040816], [1210, 1234, 0.04761905], [1234, 1620, 0.0], [1620, 1854, 0.01785714], [1854, 2512, 0.00621118], [2512, 2702, 0.0], [2702, 2942, 0.0], [2942, 3118, 0.03658537], [3118, 3196, 0.0], [3196, 3234, 0.0], [3234, 3790, 0.01669759], [3790, 3976, 0.03448276], [3976, 4092, 0.05504587], [4092, 4982, 0.0], [4982, 5649, 0.0], [5649, 5986, 0.0], [5986, 6001, 0.0], [6001, 6022, 0.0], [6022, 6047, 0.0], [6047, 6089, 0.0], [6089, 6334, 0.00414938], [6334, 6679, 0.0], [6679, 7201, 0.0], [7201, 7802, 0.0], [7802, 8612, 0.0], [8612, 8792, 0.00571429], [8792, 9069, 0.0], [9069, 9269, 0.00520833], [9269, 9946, 0.0], [9946, 10124, 0.07272727], [10124, 10248, 0.05084746], [10248, 11137, 0.0], [11137, 11667, 0.0], [11667, 11715, 0.0], [11715, 11816, 0.14606742], [11816, 12534, 0.0], [12534, 12680, 0.09701493], [12680, 13326, 0.00952381], [13326, 13447, 0.12264151], [13447, 14450, 0.00922131], [14450, 14592, 0.08333333], [14592, 15681, 0.00562852], [15681, 15852, 0.08441558], [15852, 16090, 0.0], [16090, 16212, 0.0990991], [16212, 17009, 0.00513479], [17009, 17101, 0.1625], [17101, 17937, 0.00736196], [17937, 18031, 0.16049383], [18031, 18172, 0.0], [18172, 18300, 0.09565217], [18300, 19120, 0.00249066], [19120, 19223, 0.14285714], [19223, 19485, 0.0], [19485, 19724, 0.05381166], [19724, 19813, 0.04597701], [19813, 19971, 0.02580645], [19971, 20567, 0.04679376], [20567, 20678, 0.02830189], [20678, 20824, 0.0], [20824, 20964, 0.03676471], [20964, 21088, 0.04310345], [21088, 21170, 0.1038961], [21170, 21198, 0.0], [21198, 21442, 0.02978723], [21442, 21644, 0.04232804], [21644, 22427, 0.0], [22427, 22735, 0.0], [22735, 22794, 0.0], [22794, 22943, 0.07194245], [22943, 23050, 0.0], [23050, 23074, 0.19047619], [23074, 23352, 0.0], [23352, 23532, 0.0], [23532, 23764, 0.0], [23764, 24193, 0.0], [24193, 25406, 0.0], [25406, 26913, 0.0], [26913, 28774, 0.0], [28774, 28945, 0.0754717], [28945, 28980, 0.0], [28980, 29896, 0.00444444], [29896, 30695, 0.0], [30695, 30809, 0.08411215], [30809, 30891, 0.0], [30891, 31314, 0.0], [31314, 31505, 0.0], [31505, 32571, 0.0], [32571, 32683, 0.08490566]], "rps_lines_start_with_bulletpoint": [[0, 14, 0.0], [14, 87, 0.0], [87, 320, 0.0], [320, 352, 0.0], [352, 385, 0.0], [385, 445, 0.0], [445, 471, 
0.0], [471, 506, 0.0], [506, 534, 0.0], [534, 661, 0.0], [661, 896, 0.0], [896, 1159, 0.0], [1159, 1210, 0.0], [1210, 1234, 0.0], [1234, 1620, 0.0], [1620, 1854, 0.0], [1854, 2512, 0.0], [2512, 2702, 0.0], [2702, 2942, 0.0], [2942, 3118, 0.0], [3118, 3196, 0.0], [3196, 3234, 0.0], [3234, 3790, 0.0], [3790, 3976, 0.0], [3976, 4092, 0.0], [4092, 4982, 0.0], [4982, 5649, 0.0], [5649, 5986, 0.0], [5986, 6001, 0.0], [6001, 6022, 0.0], [6022, 6047, 0.0], [6047, 6089, 0.0], [6089, 6334, 0.0], [6334, 6679, 0.0], [6679, 7201, 0.0], [7201, 7802, 0.0], [7802, 8612, 0.0], [8612, 8792, 0.0], [8792, 9069, 0.0], [9069, 9269, 0.0], [9269, 9946, 0.0], [9946, 10124, 0.0], [10124, 10248, 0.0], [10248, 11137, 0.0], [11137, 11667, 0.0], [11667, 11715, 0.0], [11715, 11816, 0.0], [11816, 12534, 0.0], [12534, 12680, 0.0], [12680, 13326, 0.0], [13326, 13447, 0.0], [13447, 14450, 0.0], [14450, 14592, 0.0], [14592, 15681, 0.0], [15681, 15852, 0.0], [15852, 16090, 0.0], [16090, 16212, 0.0], [16212, 17009, 0.0], [17009, 17101, 0.0], [17101, 17937, 0.0], [17937, 18031, 0.0], [18031, 18172, 0.0], [18172, 18300, 0.0], [18300, 19120, 0.0], [19120, 19223, 0.0], [19223, 19485, 0.0], [19485, 19724, 0.0], [19724, 19813, 0.0], [19813, 19971, 0.0], [19971, 20567, 0.0], [20567, 20678, 0.0], [20678, 20824, 0.0], [20824, 20964, 0.0], [20964, 21088, 0.0], [21088, 21170, 0.0], [21170, 21198, 0.0], [21198, 21442, 0.0], [21442, 21644, 0.0], [21644, 22427, 0.0], [22427, 22735, 0.0], [22735, 22794, 0.0], [22794, 22943, 0.0], [22943, 23050, 0.0], [23050, 23074, 0.0], [23074, 23352, 0.0], [23352, 23532, 0.0], [23532, 23764, 0.0], [23764, 24193, 0.0], [24193, 25406, 0.0], [25406, 26913, 0.0], [26913, 28774, 0.0], [28774, 28945, 0.0], [28945, 28980, 0.0], [28980, 29896, 0.0], [29896, 30695, 0.0], [30695, 30809, 0.0], [30809, 30891, 0.0], [30891, 31314, 0.0], [31314, 31505, 0.0], [31505, 32571, 0.0], [32571, 32683, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 14, 0.07142857], [14, 87, 0.12328767], [87, 320, 0.02145923], [320, 352, 0.1875], [352, 385, 0.21212121], [385, 445, 0.11666667], [445, 471, 0.19230769], [471, 506, 0.14285714], [506, 534, 0.17857143], [534, 661, 0.00787402], [661, 896, 0.01276596], [896, 1159, 0.05703422], [1159, 1210, 0.11764706], [1210, 1234, 0.125], [1234, 1620, 0.03367876], [1620, 1854, 0.06837607], [1854, 2512, 0.02279635], [2512, 2702, 0.08947368], [2702, 2942, 0.0125], [2942, 3118, 0.0625], [3118, 3196, 0.14102564], [3196, 3234, 0.10526316], [3234, 3790, 0.06294964], [3790, 3976, 0.05376344], [3976, 4092, 0.09482759], [4092, 4982, 0.01573034], [4982, 5649, 0.04047976], [5649, 5986, 0.02967359], [5986, 6001, 0.2], [6001, 6022, 0.14285714], [6022, 6047, 0.12], [6047, 6089, 0.04761905], [6089, 6334, 0.00816327], [6334, 6679, 0.0057971], [6679, 7201, 0.00766284], [7201, 7802, 0.00831947], [7802, 8612, 0.00987654], [8612, 8792, 0.01666667], [8792, 9069, 0.00361011], [9069, 9269, 0.005], [9269, 9946, 0.00738552], [9946, 10124, 0.06179775], [10124, 10248, 0.12096774], [10248, 11137, 0.01574803], [11137, 11667, 0.02075472], [11667, 11715, 0.10416667], [11715, 11816, 0.08910891], [11816, 12534, 0.01392758], [12534, 12680, 0.09589041], [12680, 13326, 0.01083591], [13326, 13447, 0.10743802], [13447, 14450, 0.05982054], [14450, 14592, 0.0915493], [14592, 15681, 0.02846648], [15681, 15852, 0.11111111], [15852, 16090, 0.0210084], [16090, 16212, 0.07377049], [16212, 17009, 0.01756587], [17009, 17101, 0.09782609], [17101, 17937, 0.01794258], [17937, 18031, 0.10638298], [18031, 18172, 0.03546099], [18172, 18300, 0.09375], 
[18300, 19120, 0.01341463], [19120, 19223, 0.08737864], [19223, 19485, 0.00763359], [19485, 19724, 0.05857741], [19724, 19813, 0.1011236], [19813, 19971, 0.07594937], [19971, 20567, 0.05201342], [20567, 20678, 0.00900901], [20678, 20824, 0.00684932], [20824, 20964, 0.05], [20964, 21088, 0.06451613], [21088, 21170, 0.13414634], [21170, 21198, 0.17857143], [21198, 21442, 0.06147541], [21442, 21644, 0.02475248], [21644, 22427, 0.03831418], [22427, 22735, 0.02272727], [22735, 22794, 0.06779661], [22794, 22943, 0.06040268], [22943, 23050, 0.14018692], [23050, 23074, 0.125], [23074, 23352, 0.07553957], [23352, 23532, 0.05], [23532, 23764, 0.06465517], [23764, 24193, 0.03962704], [24193, 25406, 0.01319044], [25406, 26913, 0.03450564], [26913, 28774, 0.03331542], [28774, 28945, 0.04678363], [28945, 28980, 0.08571429], [28980, 29896, 0.02401747], [29896, 30695, 0.02377972], [30695, 30809, 0.06140351], [30809, 30891, 0.12195122], [30891, 31314, 0.03782506], [31314, 31505, 0.06282723], [31505, 32571, 0.0206379], [32571, 32683, 0.08035714]], "rps_doc_ml_palm_score": [[0, 32683, 0.04282361]], "rps_doc_ml_wikipedia_score": [[0, 32683, null]], "rps_doc_ml_wikiref_score": [[0, 32683, 0.05853498]], "rps_doc_books_importance": [[0, 32683, -1243.0517353]], "rps_doc_openwebtext_importance": [[0, 32683, 141.8645088]], "rps_doc_wikipedia_importance": [[0, 32683, -1175.51968516]], "rps_doc_num_sentences": [[0, 32683, 257.0]]} |
By Mads Dyrberg|2014-12-08T12:02:54+00:00december 8th, 2014| | 2023-14/0000/en_head.json.gz/3668 | {"url": "https://dyrberginvest.com/screen-shot-2014-12-08-at-12-57-49/", "partition": "head_middle", "language": "en", "source_domain": "dyrberginvest.com", "date_download": "2023-03-20T10:40:08Z", "digest": "sha1:AN6JUPL3FKIHALBX6NNZRPGZARSLJ3JI"} | {"ccnet_length": [[0, 60, 60.0]], "ccnet_original_length": [[0, 60, 1243.0]], "ccnet_nlines": [[0, 60, 1.0]], "ccnet_original_nlines": [[0, 60, 43.0]], "ccnet_language_score": [[0, 60, 0.56]], "ccnet_perplexity": [[0, 60, 300.0]], "ccnet_bucket": [[0, 60, 0.0]], "rps_doc_curly_bracket": [[0, 60, 0.0]], "rps_doc_ldnoobw_words": [[0, 60, 0.0]], "rps_doc_lorem_ipsum": [[0, 60, 0.0]], "rps_doc_stop_word_fraction": [[0, 60, 0.0]], "rps_doc_ut1_blacklist": [[0, 60, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 60, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 60, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 60, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 60, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 60, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 60, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 60, 0.0]], "rps_doc_frac_chars_top_3gram": [[0, 60, 0.0]], "rps_doc_frac_chars_top_4gram": [[0, 60, 0.0]], "rps_doc_frac_all_caps_words": [[0, 60, 0.04761905]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 60, 0.0]], "rps_doc_frac_no_alph_words": [[0, 60, 0.71428571]], "rps_doc_frac_unique_words": [[0, 60, 1.0]], "rps_doc_mean_word_length": [[0, 60, 9.4]], "rps_doc_symbol_to_word_ratio": [[0, 60, 0.0]], "rps_doc_unigram_entropy": [[0, 60, 1.60943791]], "rps_doc_word_count": [[0, 60, 5.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 60, 0.0]], "rps_lines_javascript_counts": [[0, 60, 0.0]], "rps_lines_num_words": [[0, 60, 5.0]], "rps_lines_numerical_chars_fraction": [[0, 60, 0.45098039]], "rps_lines_start_with_bulletpoint": [[0, 60, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 60, 0.06666667]], "rps_doc_ml_palm_score": [[0, 60, -1.001e-05]], "rps_doc_ml_wikipedia_score": [[0, 60, null]], "rps_doc_ml_wikiref_score": [[0, 60, -1.001e-05]], "rps_doc_books_importance": [[0, 60, -19.29020404]], "rps_doc_openwebtext_importance": [[0, 60, -11.97760678]], "rps_doc_wikipedia_importance": [[0, 60, -10.9887323]], "rps_doc_num_sentences": [[0, 60, 1.0]]} |
Ukraine’s SBU closes more than 30 illegal gambling operations
The Security Service of Ukraine (SBU), together with the nation’s Commission for Licensing of Gambling and Lotteries (KRAIL), has shut down more than 30 illegal gambling establishments and two casinos since the beginning of March.
Gambling establishments were shut down in the Kiev, Donetsk, Zhytomyr and Odessa regions of the country, among several others, and some of the establishments were operated by groups the SBU labels as terrorist organisations.
During the investigations, SBU officers seized over 600 units of computer equipment, 14 poker tables, 5 roulette wheels, playing chips, cards, card game accessories, the mobile phones of establishment staff, client database records, video recorders and cash.
Information on the possible involvement of law enforcement officers in the activities is currently under investigation.
The SBU said it has been established that part of the funds obtained by the operations had been used to finance self-proclaimed states within Ukraine, the Donetsk People’s Republic (DPR) and Luhansk People’s Republic (LPR).
Ukraine regards both the DPR and the LPR as terrorist organizations, although most international bodies and other nations, including the EU, US, and Russia, do not apply this label to the groups.
Gambling was legalised in Ukraine in 2020, after the country’s president Volodymyr Zelensky signed the Gambling Act into law in August.
Under the bill, online gambling, bookmaking, slot halls and land-based casinos would all be legal, but casinos may only be located in hotels.
The first licence granted by KRAIL was awarded in February, to Cosmolot operator Spaceiks, allowing it to operate an online casino in the country. Ukraine’s first sports betting licence was subsequently awarded to Parimatch in March.
Source: igamingbusiness.com
Iwo Bulski
I have been engaged in issues related to the gambling business for more than 30 years. My empirical experience gives me the opportunity to present events and companies from this business with full insight and industry knowledge.
Tags: illegal gambling, Ukraine
2 weeks ago Iwo Bulski | 2023-14/0000/en_head.json.gz/3669 | {"url": "https://e-playafrica.com/ukraines-sbu-closes-more-than-30-illegal-gambling-operations/", "partition": "head_middle", "language": "en", "source_domain": "e-playafrica.com", "date_download": "2023-03-20T09:39:59Z", "digest": "sha1:GU5GBDPSARBHGN3V27MWFXDAC7UXLOP3"} | {"ccnet_length": [[0, 2286, 2286.0]], "ccnet_original_length": [[0, 2286, 13630.0]], "ccnet_nlines": [[0, 2286, 19.0]], "ccnet_original_nlines": [[0, 2286, 344.0]], "ccnet_language_score": [[0, 2286, 0.96]], "ccnet_perplexity": [[0, 2286, 312.5]], "ccnet_bucket": [[0, 2286, 0.0]], "rps_doc_curly_bracket": [[0, 2286, 0.0]], "rps_doc_ldnoobw_words": [[0, 2286, 0.0]], "rps_doc_lorem_ipsum": [[0, 2286, 0.0]], "rps_doc_stop_word_fraction": [[0, 2286, 0.34761905]], "rps_doc_ut1_blacklist": [[0, 2286, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 2286, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 2286, 0.02659574]], "rps_doc_frac_chars_dupe_6grams": [[0, 2286, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 2286, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 2286, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 2286, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 2286, 0.01276596]], "rps_doc_frac_chars_top_3gram": [[0, 2286, 0.01595745]], "rps_doc_frac_chars_top_4gram": [[0, 2286, 0.01808511]], "rps_doc_frac_all_caps_words": [[0, 2286, 0.03095238]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 2286, 0.0]], "rps_doc_frac_no_alph_words": [[0, 2286, 0.16666667]], "rps_doc_frac_unique_words": [[0, 2286, 0.61363636]], "rps_doc_mean_word_length": [[0, 2286, 5.34090909]], "rps_doc_symbol_to_word_ratio": [[0, 2286, 0.0]], "rps_doc_unigram_entropy": [[0, 2286, 5.01485678]], "rps_doc_word_count": [[0, 2286, 352.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 62, 0.0], [62, 292, 1.0], [292, 517, 1.0], [517, 776, 1.0], [776, 896, 1.0], [896, 1120, 1.0], [1120, 1316, 1.0], [1316, 1452, 1.0], [1452, 1594, 1.0], [1594, 1828, 1.0], [1828, 1856, 0.0], [1856, 1867, 0.0], [1867, 2085, 1.0], [2085, 2117, 0.0], [2117, 2170, 1.0], [2170, 2221, 0.0], [2221, 2244, 0.0], [2244, 2264, 0.0], [2264, 2286, 0.0]], "rps_lines_javascript_counts": [[0, 62, 0.0], [62, 292, 0.0], [292, 517, 0.0], [517, 776, 0.0], [776, 896, 0.0], [896, 1120, 0.0], [1120, 1316, 0.0], [1316, 1452, 0.0], [1452, 1594, 0.0], [1594, 1828, 0.0], [1828, 1856, 0.0], [1856, 1867, 0.0], [1867, 2085, 0.0], [2085, 2117, 0.0], [2117, 2170, 0.0], [2170, 2221, 0.0], [2221, 2244, 0.0], [2244, 2264, 0.0], [2264, 2286, 0.0]], "rps_lines_num_words": [[0, 62, 9.0], [62, 292, 35.0], [292, 517, 34.0], [517, 776, 37.0], [776, 896, 16.0], [896, 1120, 35.0], [1120, 1316, 32.0], [1316, 1452, 21.0], [1452, 1594, 23.0], [1594, 1828, 36.0], [1828, 1856, 2.0], [1856, 1867, 2.0], [1867, 2085, 34.0], [2085, 2117, 4.0], [2117, 2170, 8.0], [2170, 2221, 10.0], [2221, 2244, 5.0], [2244, 2264, 4.0], [2264, 2286, 5.0]], "rps_lines_numerical_chars_fraction": [[0, 62, 0.03278689], [62, 292, 0.00896861], [292, 517, 0.0], [517, 776, 0.02419355], [776, 896, 0.0], [896, 1120, 0.0], [1120, 1316, 0.0], [1316, 1452, 0.03007519], [1452, 1594, 0.0], [1594, 1828, 0.0], [1828, 1856, 0.0], [1856, 1867, 0.0], [1867, 2085, 0.00930233], [2085, 2117, 0.0], [2117, 2170, 0.0], [2170, 2221, 0.0], [2221, 2244, 0.04545455], [2244, 2264, 0.05263158], [2264, 2286, 0.04545455]], "rps_lines_start_with_bulletpoint": [[0, 62, 0.0], [62, 292, 0.0], [292, 517, 0.0], [517, 776, 0.0], [776, 896, 0.0], [896, 1120, 0.0], [1120, 1316, 0.0], [1316, 1452, 0.0], [1452, 1594, 0.0], 
[1594, 1828, 0.0], [1828, 1856, 0.0], [1856, 1867, 0.0], [1867, 2085, 0.0], [2085, 2117, 0.0], [2117, 2170, 0.0], [2170, 2221, 0.0], [2221, 2244, 0.0], [2244, 2264, 0.0], [2264, 2286, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 62, 0.06451613], [62, 292, 0.07391304], [292, 517, 0.03555556], [517, 776, 0.01544402], [776, 896, 0.00833333], [896, 1120, 0.07589286], [1120, 1316, 0.06122449], [1316, 1452, 0.05147059], [1452, 1594, 0.00704225], [1594, 1828, 0.05128205], [1828, 1856, 0.03571429], [1856, 1867, 0.18181818], [1867, 2085, 0.00917431], [2085, 2117, 0.0625], [2117, 2170, 0.05660377], [2170, 2221, 0.05882353], [2221, 2244, 0.08695652], [2244, 2264, 0.05], [2264, 2286, 0.09090909]], "rps_doc_ml_palm_score": [[0, 2286, 0.09770572]], "rps_doc_ml_wikipedia_score": [[0, 2286, null]], "rps_doc_ml_wikiref_score": [[0, 2286, 0.23276281]], "rps_doc_books_importance": [[0, 2286, -134.85734114]], "rps_doc_openwebtext_importance": [[0, 2286, 21.3960805]], "rps_doc_wikipedia_importance": [[0, 2286, -8.48536859]], "rps_doc_num_sentences": [[0, 2286, 15.0]]} |
A discussion of the journey that led to the development of the Bible study tool "21 Days with Jesus."
This presentation has been viewed 23593 times since it was published on September 12, 2005 . | 2023-14/0000/en_head.json.gz/3670 | {"url": "https://e-quip.net/presentations/130/info", "partition": "head_middle", "language": "en", "source_domain": "e-quip.net", "date_download": "2023-03-20T08:48:13Z", "digest": "sha1:5AURNUSQBJL24YMRA6WSSVLJ4JOASVFN"} | {"ccnet_length": [[0, 194, 194.0]], "ccnet_original_length": [[0, 194, 536.0]], "ccnet_nlines": [[0, 194, 2.0]], "ccnet_original_nlines": [[0, 194, 27.0]], "ccnet_language_score": [[0, 194, 0.98]], "ccnet_perplexity": [[0, 194, 102.1]], "ccnet_bucket": [[0, 194, 0.0]], "rps_doc_curly_bracket": [[0, 194, 0.0]], "rps_doc_ldnoobw_words": [[0, 194, 0.0]], "rps_doc_lorem_ipsum": [[0, 194, 0.0]], "rps_doc_stop_word_fraction": [[0, 194, 0.36842105]], "rps_doc_ut1_blacklist": [[0, 194, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 194, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 194, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 194, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 194, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 194, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 194, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 194, 0.06451613]], "rps_doc_frac_chars_top_3gram": [[0, 194, 0.0]], "rps_doc_frac_chars_top_4gram": [[0, 194, 0.0]], "rps_doc_frac_all_caps_words": [[0, 194, 0.02631579]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 194, 0.0]], "rps_doc_frac_no_alph_words": [[0, 194, 0.21052632]], "rps_doc_frac_unique_words": [[0, 194, 0.91176471]], "rps_doc_mean_word_length": [[0, 194, 4.55882353]], "rps_doc_symbol_to_word_ratio": [[0, 194, 0.0]], "rps_doc_unigram_entropy": [[0, 194, 3.38865078]], "rps_doc_word_count": [[0, 194, 34.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 102, 0.0], [102, 194, 1.0]], "rps_lines_javascript_counts": [[0, 102, 0.0], [102, 194, 0.0]], "rps_lines_num_words": [[0, 102, 19.0], [102, 194, 15.0]], "rps_lines_numerical_chars_fraction": [[0, 102, 0.02040816], [102, 194, 0.12359551]], "rps_lines_start_with_bulletpoint": [[0, 102, 0.0], [102, 194, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 102, 0.03921569], [102, 194, 0.02173913]], "rps_doc_ml_palm_score": [[0, 194, 0.00029469]], "rps_doc_ml_wikipedia_score": [[0, 194, null]], "rps_doc_ml_wikiref_score": [[0, 194, -1.001e-05]], "rps_doc_books_importance": [[0, 194, 3.49242846]], "rps_doc_openwebtext_importance": [[0, 194, 4.93413363]], "rps_doc_wikipedia_importance": [[0, 194, 9.99600276]], "rps_doc_num_sentences": [[0, 194, 2.0]]} |
20th EASDec Meeting: Paris, France - 2010
The 20th Scientific Meeting of the Eye Complications Study Group of the European Association for the Study of Diabetes (EASDec) took place in Paris, France from Friday 21st May 2010 - Saturday 22nd May 2010.
The information presented here is an archive of Abstracts for papers and posters. | 2023-14/0000/en_head.json.gz/3671 | {"url": "https://easdec.org/pages/default.asp?id=2&sID=121", "partition": "head_middle", "language": "en", "source_domain": "easdec.org", "date_download": "2023-03-20T09:39:51Z", "digest": "sha1:ANINQGVNMMWD345LFREDQAYYI75Y5JCX"} | {"ccnet_length": [[0, 356, 356.0]], "ccnet_original_length": [[0, 356, 2096.0]], "ccnet_nlines": [[0, 356, 4.0]], "ccnet_original_nlines": [[0, 356, 64.0]], "ccnet_language_score": [[0, 356, 0.88]], "ccnet_perplexity": [[0, 356, 304.7]], "ccnet_bucket": [[0, 356, 0.0]], "rps_doc_curly_bracket": [[0, 356, 0.0]], "rps_doc_ldnoobw_words": [[0, 356, 0.0]], "rps_doc_lorem_ipsum": [[0, 356, 0.0]], "rps_doc_stop_word_fraction": [[0, 356, 0.24242424]], "rps_doc_ut1_blacklist": [[0, 356, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 356, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 356, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 356, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 356, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 356, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 356, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 356, 0.06920415]], "rps_doc_frac_chars_top_3gram": [[0, 356, 0.11764706]], "rps_doc_frac_chars_top_4gram": [[0, 356, 0.0]], "rps_doc_frac_all_caps_words": [[0, 356, 0.01515152]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 356, 0.0]], "rps_doc_frac_no_alph_words": [[0, 356, 0.1969697]], "rps_doc_frac_unique_words": [[0, 356, 0.63157895]], "rps_doc_mean_word_length": [[0, 356, 5.07017544]], "rps_doc_symbol_to_word_ratio": [[0, 356, 0.0]], "rps_doc_unigram_entropy": [[0, 356, 3.41223501]], "rps_doc_word_count": [[0, 356, 57.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 42, 0.0], [42, 67, 0.0], [67, 275, 1.0], [275, 356, 1.0]], "rps_lines_javascript_counts": [[0, 42, 0.0], [42, 67, 0.0], [67, 275, 0.0], [275, 356, 0.0]], "rps_lines_num_words": [[0, 42, 6.0], [42, 67, 4.0], [67, 275, 34.0], [275, 356, 13.0]], "rps_lines_numerical_chars_fraction": [[0, 42, 0.16216216], [42, 67, 0.25], [67, 275, 0.06965174], [275, 356, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 42, 0.0], [42, 67, 0.0], [67, 275, 0.0], [275, 356, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 42, 0.16666667], [42, 67, 0.28], [67, 275, 0.10096154], [275, 356, 0.02469136]], "rps_doc_ml_palm_score": [[0, 356, 5.317e-05]], "rps_doc_ml_wikipedia_score": [[0, 356, null]], "rps_doc_ml_wikiref_score": [[0, 356, -8.34e-06]], "rps_doc_books_importance": [[0, 356, -24.10921569]], "rps_doc_openwebtext_importance": [[0, 356, -10.2420368]], "rps_doc_wikipedia_importance": [[0, 356, 8.49492033]], "rps_doc_num_sentences": [[0, 356, 2.0]]} |
types of fraud and detection methods
Types of Fraud and Detection Methods: In this short paper, you will discuss the major types of fraud and fraud detection methods, as well as how the various elements of the fraud triangle …
analyse group counseling analyze different types of legal damages | 2023-14/0000/en_head.json.gz/3672 | {"url": "https://eazyweezyhomeworks.com/types-of-fraud-and-detection-methods/", "partition": "head_middle", "language": "en", "source_domain": "eazyweezyhomeworks.com", "date_download": "2023-03-20T09:38:40Z", "digest": "sha1:DF7P3KJN4JDYJL5UHV4KZWY5BFJHMSPT"} | {"ccnet_length": [[0, 518, 518.0]], "ccnet_original_length": [[0, 518, 1934.0]], "ccnet_nlines": [[0, 518, 4.0]], "ccnet_original_nlines": [[0, 518, 82.0]], "ccnet_language_score": [[0, 518, 0.87]], "ccnet_perplexity": [[0, 518, 252.5]], "ccnet_bucket": [[0, 518, 0.0]], "rps_doc_curly_bracket": [[0, 518, 0.0]], "rps_doc_ldnoobw_words": [[0, 518, 0.0]], "rps_doc_lorem_ipsum": [[0, 518, 0.0]], "rps_doc_stop_word_fraction": [[0, 518, 0.20168067]], "rps_doc_ut1_blacklist": [[0, 518, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 518, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 518, 0.20754717]], "rps_doc_frac_chars_dupe_6grams": [[0, 518, 0.14622642]], "rps_doc_frac_chars_dupe_7grams": [[0, 518, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 518, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 518, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 518, 0.06603774]], "rps_doc_frac_chars_top_3gram": [[0, 518, 0.09433962]], "rps_doc_frac_chars_top_4gram": [[0, 518, 0.10613208]], "rps_doc_frac_all_caps_words": [[0, 518, 0.01680672]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 518, 0.0]], "rps_doc_frac_no_alph_words": [[0, 518, 0.37815126]], "rps_doc_frac_unique_words": [[0, 518, 0.55737705]], "rps_doc_mean_word_length": [[0, 518, 6.95081967]], "rps_doc_symbol_to_word_ratio": [[0, 518, 0.0]], "rps_doc_unigram_entropy": [[0, 518, 3.27256987]], "rps_doc_word_count": [[0, 518, 61.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 37, 0.0], [37, 221, 0.0], [221, 453, 0.0], [453, 518, 0.0]], "rps_lines_javascript_counts": [[0, 37, 0.0], [37, 221, 0.0], [221, 453, 0.0], [453, 518, 0.0]], "rps_lines_num_words": [[0, 37, 6.0], [37, 221, 33.0], [221, 453, 13.0], [453, 518, 9.0]], "rps_lines_numerical_chars_fraction": [[0, 37, 0.0], [37, 221, 0.0], [221, 453, 0.26130653], [453, 518, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 37, 0.0], [37, 221, 0.0], [221, 453, 0.0], [453, 518, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 37, 0.0], [37, 221, 0.02717391], [221, 453, 0.01724138], [453, 518, 0.0]], "rps_doc_ml_palm_score": [[0, 518, 0.03580034]], "rps_doc_ml_wikipedia_score": [[0, 518, null]], "rps_doc_ml_wikiref_score": [[0, 518, -9.06e-06]], "rps_doc_books_importance": [[0, 518, -127.56368933]], "rps_doc_openwebtext_importance": [[0, 518, -51.00801594]], "rps_doc_wikipedia_importance": [[0, 518, -66.18045003]], "rps_doc_num_sentences": [[0, 518, 5.0]]} |
All posts by David M.
Below are all of David M.'s postings, with the most recent at the bottom of the page.
That's Oxford
Monday 28 September 2015 11:48AM
So has this channel launched properly now? It had been delayed so many times. | 2023-14/0000/en_head.json.gz/3673 | {"url": "https://ec2-3-8-112-187.eu-west-2.compute.amazonaws.com/extras/mypostings/230396089", "partition": "head_middle", "language": "en", "source_domain": "ec2-3-8-112-187.eu-west-2.compute.amazonaws.com", "date_download": "2023-03-20T11:08:04Z", "digest": "sha1:DTVMFSTB3BWLPYPDDKL66FJJ2AUXCCDM"} | {"ccnet_length": [[0, 235, 235.0]], "ccnet_original_length": [[0, 235, 2398.0]], "ccnet_nlines": [[0, 235, 5.0]], "ccnet_original_nlines": [[0, 235, 109.0]], "ccnet_language_score": [[0, 235, 0.98]], "ccnet_perplexity": [[0, 235, 229.8]], "ccnet_bucket": [[0, 235, 0.0]], "rps_doc_curly_bracket": [[0, 235, 0.0]], "rps_doc_ldnoobw_words": [[0, 235, 0.0]], "rps_doc_lorem_ipsum": [[0, 235, 0.0]], "rps_doc_stop_word_fraction": [[0, 235, 0.37735849]], "rps_doc_ut1_blacklist": [[0, 235, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 235, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 235, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 235, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 235, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 235, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 235, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 235, 0.0]], "rps_doc_frac_chars_top_3gram": [[0, 235, 0.0]], "rps_doc_frac_chars_top_4gram": [[0, 235, 0.0]], "rps_doc_frac_all_caps_words": [[0, 235, 0.05660377]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 235, 0.0]], "rps_doc_frac_no_alph_words": [[0, 235, 0.18867925]], "rps_doc_frac_unique_words": [[0, 235, 0.84090909]], "rps_doc_mean_word_length": [[0, 235, 4.18181818]], "rps_doc_symbol_to_word_ratio": [[0, 235, 0.0]], "rps_doc_unigram_entropy": [[0, 235, 3.5517508]], "rps_doc_word_count": [[0, 235, 44.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 22, 1.0], [22, 112, 1.0], [112, 125, 0.0], [125, 158, 0.0], [158, 235, 1.0]], "rps_lines_javascript_counts": [[0, 22, 0.0], [22, 112, 0.0], [112, 125, 0.0], [125, 158, 0.0], [158, 235, 0.0]], "rps_lines_num_words": [[0, 22, 5.0], [22, 112, 18.0], [112, 125, 2.0], [125, 158, 5.0], [158, 235, 14.0]], "rps_lines_numerical_chars_fraction": [[0, 22, 0.0], [22, 112, 0.0], [112, 125, 0.0], [125, 158, 0.32258065], [158, 235, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 22, 0.0], [22, 112, 0.0], [112, 125, 0.0], [125, 158, 0.0], [158, 235, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 22, 0.13636364], [22, 112, 0.03333333], [112, 125, 0.15384615], [125, 158, 0.12121212], [158, 235, 0.02597403]], "rps_doc_ml_palm_score": [[0, 235, 0.14390904]], "rps_doc_ml_wikipedia_score": [[0, 235, null]], "rps_doc_ml_wikiref_score": [[0, 235, -9.54e-06]], "rps_doc_books_importance": [[0, 235, -8.8771587]], "rps_doc_openwebtext_importance": [[0, 235, -3.81237926]], "rps_doc_wikipedia_importance": [[0, 235, -8.35363031]], "rps_doc_num_sentences": [[0, 235, 5.0]]} |
WIN a 3-month supply of baby diapers (3 boxes of 252 diapers each)!!
$76.89 Value
1 Winner
Giveaway Ended
Verifying Winner
March 25, 2019 11:45 pm EDT
Prizes Awarded
March 26, 2019 10:00 am EDT
Enter sweepstakes and receive exclusive offers from The Pulse. Unsubscribe anytime. LUVS is not affiliated with the giveaway.
By submitting an entry to the Sweepstakes (the “Sweepstakes”), you acknowledge that you have read and agree to be bound by these Official Rules and the decisions of The Pulse, and that you satisfy all eligibility requirements.
The Sweepstakes (“Sweepstakes”) is open to all humans age 18 or older at time of entry who are residents of the United States of America. Employees of The Pulse (the “Sponsor”), its respective parent, subsidiaries, affiliates, advertising and promotion agencies, distributors, merchants involved with this Sweepstakes, and their immediate family members and/or those living in the same household of each are not eligible to enter or win. Sweepstakes governed by law. Sweepstakes is subject to all applicable federal, state and local laws and regulations. Void where prohibited by law. No purchase necessary to enter or win. A purchase will not improve your chances of winning.
To enter the Sweepstakes, follow all onscreen instructions to complete and submit the online entry form (“Primary Entry”). You can earn an Additional Entry in the Sweepstakes when you send the Sweepstakes page to a friend, age 21 or older, by following the Additional Entry instructions on the screen (“Additional Entry”). The friend must enter the sweepstakes through your referral in order to get an additional entry. All entries must be received by March 25, 11:45PM EDT (“Sweepstakes Entry Period”). Online entrants subject to all notices posted online including but not limited to Sponsor’s privacy policy.
PRIZES & APPROXIMATE RETAIL VALUES (“ARV”):
Maximum ARV of all prizes: $76.89. Prize includes: Luvs Ultra Leakguards Disposable Diapers Newborn Size 1. If winner is unable to receive payment, then prize will be forfeited, and an alternate winner may be selected in accordance with these Official Rules from among the remaining eligible entries for that prize. Prizes may not be transferred or assigned except by Sponsor.
Odds of winning depend on number of eligible entries received during the Sweepstakes Entry Period.
THE FOLLOWING APPLIES TO THE ABOVE:
Winners will be selected in a random drawing from all eligible entries received. Random drawing will be conducted on or about March 26, 10:00AM EDT by a representative from the Sponsor whose decisions are final. Winners will be notified by email and/or mail and, at the judges’ discretion, will be required to complete and return an email confirmation of prize acceptance and possibly an Affidavit of Eligibility and Liability and a Publicity Release (“Affidavit/Release”) within 7 days of notification, or an alternate winner will be selected. Prize awarded within 30 days after receipt and verification of completed Affidavit/Release or prize notification. If a potential winner cannot be contacted, or if Affidavit/Release is returned as undeliverable, or if the selected winner cannot accept or receive the prize for any reason, or he/she is not in compliance with these Official Rules, the prize will be forfeited and may be awarded to an alternate winner. If a winner is otherwise eligible under these Official Rules, but is nevertheless deemed a minor in his or her state of primary residence, the prize will be awarded in the name of the winner’s parent or legal guardian, who will be required to execute an affidavit on the minor’s behalf. No substitutions are permitted except if prize is unavailable, in which case a prize of equal or greater value will be awarded. Prize is not redeemable for cash and non-assignable and non-transferable except to a surviving spouse. Winner is responsible for all federal, state, local sales and income taxes. Any other incidental expenses on prize not specified herein are the sole responsibility of winner. Entry and acceptance of prize constitute permission to use winner’s name, prize won, hometown and likeness for online posting and promotional purposes without further compensation, except where prohibited by law. Limit: one prize per household. Prize is guaranteed to be awarded. If The Pulse goes bankrupt or out of business, then the year they go out of business is the last year of payment for the premium service. If The Pulse is acquired or merges, it is at the new owner’s discretion to keep paying for the premium service. If we go bankrupt or out of business, then the last year we are in business will be the last year of payment for the premium service. Cash value of the prize cannot be redeemed.
Participating entrants agree to these Official Rules and the decisions of the Sponsor, and release, defend and hold harmless the Sponsor and its affiliated companies, and all other businesses involved in this Sweepstakes, as well as the employees, officers, directors and agents of each, from all claims and liability relating to their participation, acceptance and use or misuse of prize. Winner assumes all liability for any injury or damage caused or claimed to be caused, by participation in this Sweepstakes or use or redemption of prize. Sponsor is not responsible for any typographical or other error in the printing of the offer, administration of the Sweepstakes or in the announcement of the prize.
Winner acknowledges the Sponsor and all other businesses concerned with this Sweepstakes and their agents do not make, nor are in any manner responsible for, any warranty, representations, expressed or implied, in fact or in law, relative to the quality, conditions, fitness or merchantability of any aspect of prize.
In the event of a dispute over the identity of an online entrant, entry will be deemed submitted by the “Authorized Account Holder” of the e-mail address submitted at time of entry. Authorized Account Holder means the natural person who is assigned to an e-mail address by an Internet access provider, online service provider, or other organization that is responsible for assigning e-mail addresses for the domain associated with the submitted e-mail address. Entry materials/data that have been tampered with or altered, or mass entries or entries generated by a script, macro or use of automated devices are void. The Sponsor is not responsible for: (i) lost, late, misdirected, damaged, illegible or incomplete entries; or (ii) error, omission, interruption, deletion, defect, delay in operations or transmission, theft or destruction or unauthorized access to or alterations of entry materials, or for technical, network, telephone equipment, electronic, computer, hardware or software malfunctions of any kind, or inaccurate transmission of or failure to receive entry information by Sponsor on account of technical problems or traffic congestion on the Internet or at any web site or any combination thereof; or (iii) any injury or damage to entrant’s or any other person’s computer related to or resulting from participating in the Sweepstakes. By participating in the Sweepstakes, you (i) agree to be bound by these Official Rules, including all eligibility requirements, and (ii) agree to be bound by the decisions of Sponsor and the independent judging organization, which are final and binding in all matters relating to the Sweepstakes. Failure to comply with these Official Rules may result in disqualification from the Sweepstakes. Sponsor reserves the right to permanently disqualify from any sweepstakes it sponsors any person it believes has intentionally violated these Official Rules and cancel the Sweepstakes if it becomes technically corrupted (including if a computer virus, bot or system malfunction inalterably impairs its ability to conduct the Sweepstakes), and to select winner from among all eligible entries received prior to withdrawal. Legal Warning: ANY ATTEMPT BY AN individual, whether or not an ENTRANT, TO DELIBERATELY DAMAGE, destroy, tamper or vandalize this WEB SITE OR interfere with the OPERATION OF THE SWEEPSTAKES, IS A VIOLATION OF CRIMINAL AND CIVIL LAWS and SPONSOR RESERVES THE RIGHT TO SEEK DAMAGES and diligently pursue all remedies against ANY SUCH individual TO THE FULLEST EXTENT PERMITTED BY LAW.
ARBITRATION:
This Sweepstakes shall be governed by and interpreted under the laws of the State of , . without regard to its conflicts of laws provisions. By participating, entrants agree that any and all disputes arising out of or relating in any way to this Sweepstakes shall be litigated only in courts sitting in , , .
WINNERS LIST:
For a winners list (available after March 26, 10:00AM EDT), send a stamped, self-addressed envelope by March 26, 10:00AM EDT to:
By entering, you are only signing up for The Pulse newsletter. Your information will never be sold to anyone. You can unsubscribe easily from The Pulse by clicking on the unsubscribe link in the bottom right of our daily emails. Information provided by you to participate in this Sweepstakes is subject to Sponsor’s privacy policy located at https://ec2-54-183-193-189.us-west-1.compute.amazonaws.com/privacy/
The personal information collected from each entrant in the course of the Sweepstakes will be used by The Pulse for the purposes of administration of the Sweepstakes and as otherwise permitted in accordance with The Pulse’s Privacy Policy. Please see The Pulse’s Terms of Service for more information. The Pulse reserves the right to alter any rules of any sweepstakes at any time. If you have any questions or complaints about a The Pulse sweepstakes, please contact [email protected].
Sweepstakes is not sponsored, endorsed or administered by LUVS, nor is LUVS associated with the Sweepstakes in any way. You understand that you are providing information to the Sponsor and not to LUVS. Any questions, comments or complaints regarding the Sweepstakes must be directed to the Sponsor or Administrator, and not to LUVS. LUVS is a registered trademark of LUVS.
Powered by KingSumo Giveaways for WordPress | 2023-14/0000/en_head.json.gz/3674 | {"url": "https://ec2-54-183-193-189.us-west-1.compute.amazonaws.com/giveaways/win-3-month-supply-of-baby-diaper-2-2/", "partition": "head_middle", "language": "en", "source_domain": "ec2-54-183-193-189.us-west-1.compute.amazonaws.com", "date_download": "2023-03-20T09:33:29Z", "digest": "sha1:RZQJB36LG7IB5JL4HGZY5QISN5UN454P"} | {"ccnet_length": [[0, 10174, 10174.0]], "ccnet_original_length": [[0, 10174, 10425.0]], "ccnet_nlines": [[0, 10174, 29.0]], "ccnet_original_nlines": [[0, 10174, 40.0]], "ccnet_language_score": [[0, 10174, 0.93]], "ccnet_perplexity": [[0, 10174, 310.2]], "ccnet_bucket": [[0, 10174, 0.0]], "rps_doc_curly_bracket": [[0, 10174, 0.00078632]], "rps_doc_ldnoobw_words": [[0, 10174, 0.0]], "rps_doc_lorem_ipsum": [[0, 10174, 0.0]], "rps_doc_stop_word_fraction": [[0, 10174, 0.39643419]], "rps_doc_ut1_blacklist": [[0, 10174, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 10174, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 10174, 0.05803733]], "rps_doc_frac_chars_dupe_6grams": [[0, 10174, 0.02239615]], "rps_doc_frac_chars_dupe_7grams": [[0, 10174, 0.02239615]], "rps_doc_frac_chars_dupe_8grams": [[0, 10174, 0.01517158]], "rps_doc_frac_chars_dupe_9grams": [[0, 10174, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 10174, 0.0320289]], "rps_doc_frac_chars_top_3gram": [[0, 10174, 0.01733895]], "rps_doc_frac_chars_top_4gram": [[0, 10174, 0.00505719]], "rps_doc_frac_all_caps_words": [[0, 10174, 0.03828002]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 10174, 0.0]], "rps_doc_frac_no_alph_words": [[0, 10174, 0.15416885]], "rps_doc_frac_unique_words": [[0, 10174, 0.35334584]], "rps_doc_mean_word_length": [[0, 10174, 5.19387117]], "rps_doc_symbol_to_word_ratio": [[0, 10174, 0.0]], "rps_doc_unigram_entropy": [[0, 10174, 5.42368394]], "rps_doc_word_count": [[0, 10174, 1599.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 18, 0.0], [18, 87, 1.0], [87, 99, 0.0], [99, 108, 0.0], [108, 123, 0.0], [123, 140, 0.0], [140, 168, 0.0], [168, 183, 0.0], [183, 211, 0.0], [211, 400, 1.0], [400, 627, 1.0], [627, 1304, 1.0], [1304, 1916, 1.0], [1916, 1960, 0.0], [1960, 2335, 1.0], [2335, 2434, 1.0], [2434, 2470, 0.0], [2470, 4800, 1.0], [4800, 5509, 1.0], [5509, 5823, 1.0], [5823, 8375, 1.0], [8375, 8388, 0.0], [8388, 8697, 1.0], [8697, 8711, 0.0], [8711, 8840, 0.0], [8840, 9270, 0.0], [9270, 9758, 1.0], [9758, 10131, 1.0], [10131, 10174, 0.0]], "rps_lines_javascript_counts": [[0, 18, 0.0], [18, 87, 0.0], [87, 99, 0.0], [99, 108, 0.0], [108, 123, 0.0], [123, 140, 0.0], [140, 168, 0.0], [168, 183, 0.0], [183, 211, 0.0], [211, 400, 0.0], [400, 627, 0.0], [627, 1304, 0.0], [1304, 1916, 0.0], [1916, 1960, 0.0], [1960, 2335, 0.0], [2335, 2434, 0.0], [2434, 2470, 0.0], [2470, 4800, 0.0], [4800, 5509, 0.0], [5509, 5823, 0.0], [5823, 8375, 0.0], [8375, 8388, 0.0], [8388, 8697, 0.0], [8697, 8711, 0.0], [8711, 8840, 0.0], [8840, 9270, 0.0], [9270, 9758, 0.0], [9758, 10131, 0.0], [10131, 10174, 0.0]], "rps_lines_num_words": [[0, 18, 1.0], [18, 87, 13.0], [87, 99, 2.0], [99, 108, 2.0], [108, 123, 2.0], [123, 140, 2.0], [140, 168, 6.0], [168, 183, 2.0], [183, 211, 6.0], [211, 400, 25.0], [400, 627, 37.0], [627, 1304, 105.0], [1304, 1916, 96.0], [1916, 1960, 5.0], [1960, 2335, 60.0], [2335, 2434, 15.0], [2434, 2470, 6.0], [2470, 4800, 383.0], [4800, 5509, 113.0], [5509, 5823, 48.0], [5823, 8375, 391.0], [8375, 8388, 1.0], [8388, 8697, 51.0], [8697, 8711, 2.0], [8711, 8840, 21.0], [8840, 9270, 61.0], [9270, 9758, 77.0], [9758, 
10131, 60.0], [10131, 10174, 6.0]], "rps_lines_numerical_chars_fraction": [[0, 18, 0.0], [18, 87, 0.07936508], [87, 99, 0.4], [99, 108, 0.125], [108, 123, 0.0], [123, 140, 0.0], [140, 168, 0.4], [168, 183, 0.0], [183, 211, 0.4], [211, 400, 0.0], [400, 627, 0.0], [627, 1304, 0.00304878], [1304, 1916, 0.01344538], [1916, 1960, 0.0], [1960, 2335, 0.0136612], [2335, 2434, 0.0], [2434, 2470, 0.0], [2470, 4800, 0.00393701], [4800, 5509, 0.0], [5509, 5823, 0.0], [5823, 8375, 0.0], [8375, 8388, 0.0], [8388, 8697, 0.0], [8697, 8711, 0.0], [8711, 8840, 0.10169492], [8840, 9270, 0.03170732], [9270, 9758, 0.0], [9758, 10131, 0.0], [10131, 10174, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 18, 0.0], [18, 87, 0.0], [87, 99, 0.0], [99, 108, 0.0], [108, 123, 0.0], [123, 140, 0.0], [140, 168, 0.0], [168, 183, 0.0], [183, 211, 0.0], [211, 400, 0.0], [400, 627, 0.0], [627, 1304, 0.0], [1304, 1916, 0.0], [1916, 1960, 0.0], [1960, 2335, 0.0], [2335, 2434, 0.0], [2434, 2470, 0.0], [2470, 4800, 0.0], [4800, 5509, 0.0], [5509, 5823, 0.0], [5823, 8375, 0.0], [8375, 8388, 0.0], [8388, 8697, 0.0], [8697, 8711, 0.0], [8711, 8840, 0.0], [8840, 9270, 0.0], [9270, 9758, 0.0], [9758, 10131, 0.0], [10131, 10174, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 18, 0.0], [18, 87, 0.04347826], [87, 99, 0.08333333], [99, 108, 0.11111111], [108, 123, 0.13333333], [123, 140, 0.11764706], [140, 168, 0.14285714], [168, 183, 0.13333333], [183, 211, 0.14285714], [211, 400, 0.05820106], [400, 627, 0.030837], [627, 1304, 0.02363368], [1304, 1916, 0.04248366], [1916, 1960, 0.72727273], [1960, 2335, 0.04533333], [2335, 2434, 0.04040404], [2434, 2470, 0.80555556], [2470, 4800, 0.01845494], [4800, 5509, 0.01410437], [5509, 5823, 0.00955414], [5823, 8375, 0.08424765], [8375, 8388, 0.84615385], [8388, 8697, 0.01618123], [8697, 8711, 0.78571429], [8711, 8840, 0.10077519], [8840, 9270, 0.02325581], [9270, 9758, 0.03893443], [9758, 10131, 0.08579088], [10131, 10174, 0.13953488]], "rps_doc_ml_palm_score": [[0, 10174, 0.14714831]], "rps_doc_ml_wikipedia_score": [[0, 10174, null]], "rps_doc_ml_wikiref_score": [[0, 10174, 0.17749101]], "rps_doc_books_importance": [[0, 10174, -425.23789166]], "rps_doc_openwebtext_importance": [[0, 10174, -51.11614838]], "rps_doc_wikipedia_importance": [[0, 10174, -233.9609807]], "rps_doc_num_sentences": [[0, 10174, 76.0]]} |
Shannon L. Rudisill
[email protected]
Shannon Rudisill is the first Executive Director of the Early Childhood Funders Collaborative, assuming the position in November 2016. Under her leadership, ECFC is pursuing a strategy of growth and strategic engagement with partners across philanthropy to build capacity for supporting policies and practices that help young children and their families thrive.
Prior to joining ECFC, Shannon was the Associate Deputy Assistant Secretary for Early Childhood Development at the Administration for Children and Families, U.S. Department of Health and Human Services. Shannon served on the Obama Administration’s senior leadership team for early childhood programs, setting strategy for ACF programs such as child care and Head Start, as well as programs jointly administered with the U.S. Department of Education, such as the Race to the Top – Early Learning Challenge and the Preschool Development Grants. She is especially proud of the $500 million Early Head Start – Child Care Partnership grants which will bring higher quality early care and comprehensive services to thousands of infants and toddlers from families in poverty.
From 2009-2014, Shannon was the Director of ACF’s Office of Child Care (OCC). OCC administers the Child Care and Development Fund (CCDF), which provides over $5 billion per year to States, Territories, and over 500 Tribal communities to provide child care for about 1.5 million children each month. At OCC, Shannon tirelessly pursued strategies to make child care work better for families and to ensure that children are served in safe, healthy, and nurturing early learning and afterschool programs. Under her leadership, the Obama Administration’s principles for child care reform were reflected in the reauthorization of the Child Care and Development Block Grant for the first time in almost 18 years.
View All Team Members | 2023-14/0000/en_head.json.gz/3675 | {"url": "https://ecfunders.org/team/shannon-l-rudisill/", "partition": "head_middle", "language": "en", "source_domain": "ecfunders.org", "date_download": "2023-03-20T09:52:22Z", "digest": "sha1:FS4YNFM5TZKYYFBP5EVKD3DV7HVS5BY4"} | {"ccnet_length": [[0, 1906, 1906.0]], "ccnet_original_length": [[0, 1906, 4559.0]], "ccnet_nlines": [[0, 1906, 6.0]], "ccnet_original_nlines": [[0, 1906, 164.0]], "ccnet_language_score": [[0, 1906, 0.95]], "ccnet_perplexity": [[0, 1906, 129.8]], "ccnet_bucket": [[0, 1906, 0.0]], "rps_doc_curly_bracket": [[0, 1906, 0.0]], "rps_doc_ldnoobw_words": [[0, 1906, 0.0]], "rps_doc_lorem_ipsum": [[0, 1906, 0.0]], "rps_doc_stop_word_fraction": [[0, 1906, 0.31268437]], "rps_doc_ut1_blacklist": [[0, 1906, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 1906, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 1906, 0.03293224]], "rps_doc_frac_chars_dupe_6grams": [[0, 1906, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 1906, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 1906, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 1906, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 1906, 0.04559848]], "rps_doc_frac_chars_top_3gram": [[0, 1906, 0.02279924]], "rps_doc_frac_chars_top_4gram": [[0, 1906, 0.01899937]], "rps_doc_frac_all_caps_words": [[0, 1906, 0.03834808]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 1906, 0.0]], "rps_doc_frac_no_alph_words": [[0, 1906, 0.15339233]], "rps_doc_frac_unique_words": [[0, 1906, 0.53472222]], "rps_doc_mean_word_length": [[0, 1906, 5.48263889]], "rps_doc_symbol_to_word_ratio": [[0, 1906, 0.0]], "rps_doc_unigram_entropy": [[0, 1906, 4.64676931]], "rps_doc_word_count": [[0, 1906, 288.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 20, 0.0], [20, 51, 0.0], [51, 413, 1.0], [413, 1179, 1.0], [1179, 1885, 1.0], [1885, 1906, 0.0]], "rps_lines_javascript_counts": [[0, 20, 0.0], [20, 51, 0.0], [51, 413, 0.0], [413, 1179, 0.0], [1179, 1885, 0.0], [1885, 1906, 0.0]], "rps_lines_num_words": [[0, 20, 3.0], [20, 51, 1.0], [51, 413, 52.0], [413, 1179, 117.0], [1179, 1885, 111.0], [1885, 1906, 4.0]], "rps_lines_numerical_chars_fraction": [[0, 20, 0.0], [20, 51, 0.0], [51, 413, 0.01120448], [413, 1179, 0.00398936], [1179, 1885, 0.02332362], [1885, 1906, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 20, 0.0], [20, 51, 0.0], [51, 413, 0.0], [413, 1179, 0.0], [1179, 1885, 0.0], [1885, 1906, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 20, 0.15], [20, 51, 0.0], [51, 413, 0.03867403], [413, 1179, 0.06396867], [1179, 1885, 0.05524079], [1885, 1906, 0.19047619]], "rps_doc_ml_palm_score": [[0, 1906, 0.00660461]], "rps_doc_ml_wikipedia_score": [[0, 1906, null]], "rps_doc_ml_wikiref_score": [[0, 1906, 0.17523468]], "rps_doc_books_importance": [[0, 1906, -116.26057335]], "rps_doc_openwebtext_importance": [[0, 1906, 8.27063531]], "rps_doc_wikipedia_importance": [[0, 1906, 12.46097398]], "rps_doc_num_sentences": [[0, 1906, 18.0]]} |
explosive properties of explosives
the study does not need to be conducted because there are no chemical groups present in the molecule which are associated with explosive properties | 2023-14/0000/en_head.json.gz/3676 | {"url": "https://echa.europa.eu/registration-dossier/-/registered-dossier/5567/4/15", "partition": "head_middle", "language": "en", "source_domain": "echa.europa.eu", "date_download": "2023-03-20T09:36:40Z", "digest": "sha1:ZDFHDJYG7RMVIITOXIIPSXAGZLEMWOQP"} | {"ccnet_length": [[0, 182, 182.0]], "ccnet_original_length": [[0, 182, 6632.0]], "ccnet_nlines": [[0, 182, 2.0]], "ccnet_original_nlines": [[0, 182, 209.0]], "ccnet_language_score": [[0, 182, 0.97]], "ccnet_perplexity": [[0, 182, 234.6]], "ccnet_bucket": [[0, 182, 0.0]], "rps_doc_curly_bracket": [[0, 182, 0.0]], "rps_doc_ldnoobw_words": [[0, 182, 0.0]], "rps_doc_lorem_ipsum": [[0, 182, 0.0]], "rps_doc_stop_word_fraction": [[0, 182, 0.60714286]], "rps_doc_ut1_blacklist": [[0, 182, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 182, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 182, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 182, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 182, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 182, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 182, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 182, 0.24516129]], "rps_doc_frac_chars_top_3gram": [[0, 182, 0.0]], "rps_doc_frac_chars_top_4gram": [[0, 182, 0.0]], "rps_doc_frac_all_caps_words": [[0, 182, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 182, 0.0]], "rps_doc_frac_no_alph_words": [[0, 182, 0.0]], "rps_doc_frac_unique_words": [[0, 182, 0.85714286]], "rps_doc_mean_word_length": [[0, 182, 5.53571429]], "rps_doc_symbol_to_word_ratio": [[0, 182, 0.0]], "rps_doc_unigram_entropy": [[0, 182, 3.13416246]], "rps_doc_word_count": [[0, 182, 28.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 35, 0.0], [35, 182, 0.0]], "rps_lines_javascript_counts": [[0, 35, 0.0], [35, 182, 0.0]], "rps_lines_num_words": [[0, 35, 4.0], [35, 182, 24.0]], "rps_lines_numerical_chars_fraction": [[0, 35, 0.0], [35, 182, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 35, 0.0], [35, 182, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 35, 0.0], [35, 182, 0.0]], "rps_doc_ml_palm_score": [[0, 182, 0.01008058]], "rps_doc_ml_wikipedia_score": [[0, 182, null]], "rps_doc_ml_wikiref_score": [[0, 182, -1.001e-05]], "rps_doc_books_importance": [[0, 182, 0.1715059]], "rps_doc_openwebtext_importance": [[0, 182, 4.25741351]], "rps_doc_wikipedia_importance": [[0, 182, -0.3727728]], "rps_doc_num_sentences": [[0, 182, 1.0]]} |
October 30, 2020 teameddymorales
Everyone in our community deserves safety and dignity. Making sure everyone in Gresham has a stable and affordable place to call home is one of my top priorities.
Gresham has a housing problem – and it touches all of us. We all see people sleeping out on the street or in tent camps every day. We all know people who are struggling to afford rent (or you may be struggling yourself).
The reality is that so long as rent and single family homes remain unaffordable, there will be an unhoused and at-risk population in Gresham. Important services such as job training or counseling are not effective if people don’t have a safe and stable place to sleep each night. Everyone in our community deserves safety and dignity.
Housing Insecurity in Gresham
The last survey done by Multnomah County showed that we have over 4000 neighbors who are unhoused, and about 2000 of them sleep on the street each night. About two thirds of the people sleeping outside each night are chronically homeless: they’ve been homeless for over a year and also have a disabling condition such as a physical disability or a mental illness. Other reports indicate the number of unhoused people may be much higher.
At the same time, an even larger number of people in Gresham are housing insecure. These folks are struggling to afford rent and are at a high risk of becoming homeless if anything goes wrong in their lives such as a job loss or car accident. This is a gut-wrenching, dangerous edge for anyone to live on, much less entire families.
To address both homelessness and housing insecurity, I am helping to lead the city to take action. Here’s some of what I have done as a city councilor:
Led the effort to get affordable housing and services for low income residents in Rockwood. I made sure one of our most underrepresented communities got a new center that includes over 200 affordable housing units and offices for local social service providers such as Loaves & Fishes.
Invested over $25 million of Metro Housing Bond money. I successfully pushed the city’s first project to combine housing and services. It’s only when we combine those two things that we start to solve the problem.
Voted to increase funding for mental health and addiction counseling.
My Plans for Housing Affordability and Homelessness
We need to get unhoused people into housing, and make sure that people who are struggling financially have the support they need to keep their housing. At the same time, we need to make sure that social services ranging from job training to mental health counseling are available to everyone. Here are a few avenues we can pursue that can keep Gresham a place that everyone can call home.
Use Metro funding to pursue housing land trusts. Where the city owns land, we should consider developing affordable housing for those in need. We can consider models like Habitat for Humanity, or Proud Ground in Portland.
Create more supportive housing that comes with services like mental health and addiction treatment.
Make sure housing is available for each stage of life from young adults who live alone, to families who need space for children, to the elderly who need extra support.
Find ways to support and provide childcare for working families.
Join Me to Fight for Gresham’s Future
If, like me, you care about seeing community voices in government, please sign up for my Gresham News and Action Alerts. I’ll keep you up to date with what’s happening at City Hall, and when we need community voices on issues like housing insecurity. | 2023-14/0000/en_head.json.gz/3677 | {"url": "https://eddymorales.com/2020/10/30/housing-and-homelessness-in-gresham/", "partition": "head_middle", "language": "en", "source_domain": "eddymorales.com", "date_download": "2023-03-20T10:23:29Z", "digest": "sha1:PEU2XYYM62YJFZ5MAQNMFNEWCHULSNVJ"} | {"ccnet_length": [[0, 3586, 3586.0]], "ccnet_original_length": [[0, 3586, 3921.0]], "ccnet_nlines": [[0, 3586, 20.0]], "ccnet_original_nlines": [[0, 3586, 42.0]], "ccnet_language_score": [[0, 3586, 0.97]], "ccnet_perplexity": [[0, 3586, 299.1]], "ccnet_bucket": [[0, 3586, 0.0]], "rps_doc_curly_bracket": [[0, 3586, 0.0]], "rps_doc_ldnoobw_words": [[0, 3586, 0.0]], "rps_doc_lorem_ipsum": [[0, 3586, 0.0]], "rps_doc_stop_word_fraction": [[0, 3586, 0.44948755]], "rps_doc_ut1_blacklist": [[0, 3586, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 3586, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 3586, 0.04863014]], "rps_doc_frac_chars_dupe_6grams": [[0, 3586, 0.03150685]], "rps_doc_frac_chars_dupe_7grams": [[0, 3586, 0.03150685]], "rps_doc_frac_chars_dupe_8grams": [[0, 3586, 0.03150685]], "rps_doc_frac_chars_dupe_9grams": [[0, 3586, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 3586, 0.01232877]], "rps_doc_frac_chars_top_3gram": [[0, 3586, 0.00890411]], "rps_doc_frac_chars_top_4gram": [[0, 3586, 0.01506849]], "rps_doc_frac_all_caps_words": [[0, 3586, 0.00732064]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 3586, 0.0]], "rps_doc_frac_no_alph_words": [[0, 3586, 0.10395315]], "rps_doc_frac_unique_words": [[0, 3586, 0.47454844]], "rps_doc_mean_word_length": [[0, 3586, 4.79474548]], "rps_doc_symbol_to_word_ratio": [[0, 3586, 0.0]], "rps_doc_unigram_entropy": [[0, 3586, 5.21565846]], "rps_doc_word_count": [[0, 3586, 609.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 30, 0.0], [30, 63, 0.0], [63, 226, 1.0], [226, 447, 1.0], [447, 782, 1.0], [782, 812, 0.0], [812, 1249, 1.0], [1249, 1582, 1.0], [1582, 1731, 0.0], [1731, 2017, 1.0], [2017, 2232, 1.0], [2232, 2302, 1.0], [2302, 2354, 0.0], [2354, 2743, 1.0], [2743, 2965, 1.0], [2965, 3065, 1.0], [3065, 3233, 1.0], [3233, 3298, 1.0], [3298, 3336, 0.0], [3336, 3586, 1.0]], "rps_lines_javascript_counts": [[0, 30, 0.0], [30, 63, 0.0], [63, 226, 0.0], [226, 447, 0.0], [447, 782, 0.0], [782, 812, 0.0], [812, 1249, 0.0], [1249, 1582, 0.0], [1582, 1731, 0.0], [1731, 2017, 0.0], [2017, 2232, 0.0], [2232, 2302, 0.0], [2302, 2354, 0.0], [2354, 2743, 0.0], [2743, 2965, 0.0], [2965, 3065, 0.0], [3065, 3233, 0.0], [3233, 3298, 0.0], [3298, 3336, 0.0], [3336, 3586, 0.0]], "rps_lines_num_words": [[0, 30, 2.0], [30, 63, 4.0], [63, 226, 28.0], [226, 447, 43.0], [447, 782, 56.0], [782, 812, 4.0], [812, 1249, 75.0], [1249, 1582, 61.0], [1582, 1731, 27.0], [1731, 2017, 46.0], [2017, 2232, 36.0], [2232, 2302, 10.0], [2302, 2354, 7.0], [2354, 2743, 69.0], [2743, 2965, 36.0], [2965, 3065, 14.0], [3065, 3233, 30.0], [3233, 3298, 10.0], [3298, 3336, 7.0], [3336, 3586, 44.0]], "rps_lines_numerical_chars_fraction": [[0, 30, 0.0], [30, 63, 0.19354839], [63, 226, 0.0], [226, 447, 0.0], [447, 782, 0.0], [782, 812, 0.0], [812, 1249, 0.01856148], [1249, 1582, 0.0], [1582, 1731, 0.0], [1731, 2017, 0.01067616], [2017, 2232, 0.00952381], [2232, 2302, 0.0], [2302, 2354, 0.0], [2354, 2743, 0.0], [2743, 2965, 0.0], [2965, 3065, 0.0], [3065, 
3233, 0.0], [3233, 3298, 0.0], [3298, 3336, 0.0], [3336, 3586, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 30, 0.0], [30, 63, 0.0], [63, 226, 0.0], [226, 447, 0.0], [447, 782, 0.0], [782, 812, 0.0], [812, 1249, 0.0], [1249, 1582, 0.0], [1582, 1731, 0.0], [1731, 2017, 0.0], [2017, 2232, 0.0], [2232, 2302, 0.0], [2302, 2354, 0.0], [2354, 2743, 0.0], [2743, 2965, 0.0], [2965, 3065, 0.0], [3065, 3233, 0.0], [3233, 3298, 0.0], [3298, 3336, 0.0], [3336, 3586, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 30, 0.03333333], [30, 63, 0.03030303], [63, 226, 0.01840491], [226, 447, 0.01357466], [447, 782, 0.0119403], [782, 812, 0.1], [812, 1249, 0.01144165], [1249, 1582, 0.01201201], [1582, 1731, 0.02684564], [1731, 2017, 0.01748252], [2017, 2232, 0.02790698], [2232, 2302, 0.01428571], [2302, 2354, 0.09615385], [2354, 2743, 0.01028278], [2743, 2965, 0.04054054], [2965, 3065, 0.01], [3065, 3233, 0.00595238], [3233, 3298, 0.01538462], [3298, 3336, 0.13157895], [3336, 3586, 0.032]], "rps_doc_ml_palm_score": [[0, 3586, 0.13095677]], "rps_doc_ml_wikipedia_score": [[0, 3586, null]], "rps_doc_ml_wikiref_score": [[0, 3586, 0.05720675]], "rps_doc_books_importance": [[0, 3586, -189.41663399]], "rps_doc_openwebtext_importance": [[0, 3586, 33.69441602]], "rps_doc_wikipedia_importance": [[0, 3586, -191.9344544]], "rps_doc_num_sentences": [[0, 3586, 32.0]]} |
Energy and Economic Estimation for a Passive Solar Water Heating System Equipped with a Heat Pipe Evacuated Tube Collector: Case Study
Altin Maraj
Polytechnic University of Tirana, Tirana, Albania
The energy and economic estimation for a passive solar water heating system under the Tirana climate conditions is performed. The trial system includes a heat pipe evacuated tube collector. To fulfil these objectives, measured values recorded every minute by the data logger of the system are used. The studied solar collector has a slope of 45° and is oriented 10° East from the South direction. The heat pipe evacuated tube collector has an aperture area of 1.476 m². For an annual time period, the irradiation on the tilted solar collector area, the extracted energy from the storage tank, the delivered energy to the thermal consumer and the system efficiency were determined. Also, the economic assessment of the system is performed.
Keywords: passive system, heat pipe evacuated tube collector, system efficiency, economic estimation
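The annual system efficiency referred to above is the ratio of the energy delivered to the thermal consumer to the irradiation incident on the tilted collector aperture over the same period. A minimal Python sketch of that calculation follows; only the 1.476 m² aperture area comes from the text, and the annual energy totals are hypothetical placeholders, not measurements from the study.

# Sketch only: annual system efficiency eta = Q_delivered / (H_t * A_aperture).
# The aperture area is taken from the abstract; the energy figures below are
# assumed values for illustration.

A_APERTURE_M2 = 1.476  # aperture area of the heat pipe evacuated tube collector

def annual_system_efficiency(delivered_kwh: float,
                             irradiation_kwh_per_m2: float,
                             aperture_m2: float = A_APERTURE_M2) -> float:
    """Energy delivered to the thermal consumer divided by the solar
    irradiation incident on the tilted collector aperture."""
    incident_kwh = irradiation_kwh_per_m2 * aperture_m2
    return delivered_kwh / incident_kwh

# Hypothetical annual totals aggregated from the one-minute logger records:
print(f"{annual_system_efficiency(950.0, 1700.0):.1%}")  # -> about 37.9%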
W. Weiss, M. Spork-Dur, "Solar heat worldwide. Global market development and trends in 2018. Detailed market figures in 2017" 2019. [Online]. Available: http://www.iea-shc.org/data/sites/1/publications/Solar-Heat-Worldwide-2019.pdf. [Accessed June 2019].
A. Maraj, "Simulation and study of operation for 3-different types of solar water heating systems for climate conditions of Central part of Albania", PhD dissertation, Dept. Mech. Eng., Polytechnic University of Tirana, Albania, December 2014. [Online]. Available: http://fim.edu.al/programet/doktorature/temat/altin_maraj.pdf. [Accessed January 2015].
D.W. Lee, A. Sharma, "Thermal performances of the active and passive water heating systems based on annual operation" Solar Energy, vol. 81, no. 2, pp. 207-215, 2007.
L. Gill, J.M. Mahon, K. Ryan, "The performance of an evacuated tube solar hot water system in a domestic house throughout a year in a northern maritime climate (Dublin)" Solar Energy, vol. 137, no. 1, pp. 262-272, 2016.
S. Kalogirou, "Thermal performance, economic and environmental life cycle analysis of thermosiphon solar water heaters" Solar Energy, vol. 83, no. 1, pp. 39-48, 2009.
M. Ghorab, E. Entchev, L. Yang, “Inclusive analysis and performance evaluation of solar domestic hot water system” Alexandria Engineering Journal, Vol. 56, pp. 201-212, 2017.
P. Hernandez, P. Kenny, “Net energy analysis of domestic solar water heating installations in operation” Renewable and Sustainable Energy Reviews, Vol. 16, Issue 1, pp. 170-177, 2012.
Y. Kim, K. Thu, H.T. Bhatia, C.S. Bhatia, K.C Ng, "Thermal analysis and performance optimization of a solar hot water plant with economic evaluation" Solar Energy, vol. 86, no. 5, pp. 1378-1395, 2012.
D. Diakoulaki, A. Zervos, J. Sarafidis, S. Mirasgedis, “Cost benefit analysis for solar water heating systems” Energy Conversion & Management, Vol. 42, pp. 1727-1739, 2001.
M. Hazami, N. Naili, I. Attar, A. Farhat, “Solar water heating systems feasibility for domestic requests in Tunisia: Thermal potential and economical analysis” Energy Conversion & Management, Vol. 76, pp. 599-608, 2013.
J. Mathur, N.K. Bansal, “Energy analysis of solar water heating systems in India” International Journal of Life Cycle Assessment, Vol. 4, Issue 2, pp. 113-116, 1999.
M. Asif, J. Currie, T. Muneer, “Comparison of aluminium and stainless steel built-in-storage solar water heater” Building Services Engineering Research and Technology, Vol. 28, Issue 4, pp. 337-346, 2007.
R. H. Crawford, G.J. Treloar, “Net energy analysis of solar and conventional domestic hot water heating systems in Melbourne, Australia” Solar Energy, Vol. 76, pp. 159-163, 2004.
P. Banik, A. Ganguly, "Performance and economic analysis of a floricultural greenhouse with distributed fan-pad evaporative cooling coupled with solar desiccation" Solar Energy, vol. 147, pp. 439-447, 2017.
Geographical names, "Tirana-Albania" [Online]. Available: http://www.geographic.org/geographic_names/name.php?uni=-168608&fid=283. [Accessed January 2013].
NANR, "Energjia diellore" 2012. [Online]. Available: http://www.akbn.gov.al/images/pdf/energji-te-rinovueshme/Energjia_Diellore.pdf. [Accessed August 2012].
"Resol Products" Resol, 2010. [Online]. Available: http://www.resol.de/index/produkte/sprache/en/#kategorie5. [Accessed June 2017].
"Sunda Solar Water Heaters" Sunda, 2012. [Online]. Available: http://www.sundasolar.com/product_waterheaters.html. [Accessed February 2012].
J.A. Duffie, W.A. Beckman, Solar engineering of thermal processes, 4th ed., New Jersey: John Wiley & Sons, 2013.
"Accounting explained" 2017. [Online]. Available: http://accountingexplained.com/managerial/capital-budgeting/discounted-payback-period. [Accessed July 2017].
S. Carter, N.J. Macdonald, D.C.B. Cheng, Basic finance for marketers, Rome: Food and Agriculture Organization of UN, 1997.
Maraj, A. 2020. Energy and Economic Estimation for a Passive Solar Water Heating System Equipped with a Heat Pipe Evacuated Tube Collector: Case Study. European Journal of Engineering and Technology Research. 5, 3 (Mar. 2020), 348–352. DOI:https://doi.org/10.24018/ejeng.2020.5.3.1828.
Altin Maraj, Experimental Validation of a Forced-Circulation Solar Water Heating System Equipped with Flat-Plate Solar Collectors – Case Study, European Journal of Engineering and Technology Research: Vol. 5 No. 5: MAY 2020
Altin Maraj, Modelling the Business-As-Usual Energy Scenario for the Albanian Household Sector , European Journal of Engineering and Technology Research: Vol. 5 No. 8: AUGUST 2020 | 2023-14/0000/en_head.json.gz/3678 | {"url": "https://ej-eng.org/index.php/ejeng/article/view/1828", "partition": "head_middle", "language": "en", "source_domain": "ej-eng.org", "date_download": "2023-03-20T09:50:34Z", "digest": "sha1:P7BHSOXKRZKZ7MA2CT4URQ23HMEA4IU3"} | {"ccnet_length": [[0, 5697, 5697.0]], "ccnet_original_length": [[0, 5697, 7899.0]], "ccnet_nlines": [[0, 5697, 30.0]], "ccnet_original_nlines": [[0, 5697, 113.0]], "ccnet_language_score": [[0, 5697, 0.79]], "ccnet_perplexity": [[0, 5697, 295.3]], "ccnet_bucket": [[0, 5697, 0.0]], "rps_doc_curly_bracket": [[0, 5697, 0.0]], "rps_doc_ldnoobw_words": [[0, 5697, 0.0]], "rps_doc_lorem_ipsum": [[0, 5697, 0.0]], "rps_doc_stop_word_fraction": [[0, 5697, 0.10321101]], "rps_doc_ut1_blacklist": [[0, 5697, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 5697, 0.09841199]], "rps_doc_frac_chars_dupe_5grams": [[0, 5697, 0.16282711]], "rps_doc_frac_chars_dupe_6grams": [[0, 5697, 0.13777678]], "rps_doc_frac_chars_dupe_7grams": [[0, 5697, 0.12301499]], "rps_doc_frac_chars_dupe_8grams": [[0, 5697, 0.12301499]], "rps_doc_frac_chars_dupe_9grams": [[0, 5697, 0.12301499]], "rps_doc_frac_chars_top_2gram": [[0, 5697, 0.02683963]], "rps_doc_frac_chars_top_3gram": [[0, 5697, 0.03422053]], "rps_doc_frac_chars_top_4gram": [[0, 5697, 0.02818161]], "rps_doc_frac_all_caps_words": [[0, 5697, 0.04587156]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 5697, 0.0]], "rps_doc_frac_no_alph_words": [[0, 5697, 0.40902141]], "rps_doc_frac_unique_words": [[0, 5697, 0.44633508]], "rps_doc_mean_word_length": [[0, 5697, 5.85209424]], "rps_doc_symbol_to_word_ratio": [[0, 5697, 0.00076453]], "rps_doc_unigram_entropy": [[0, 5697, 5.24084245]], "rps_doc_word_count": [[0, 5697, 764.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 124, 0.0], [124, 259, 0.0], [259, 271, 0.0], [271, 321, 0.0], [321, 1058, 1.0], [1058, 1159, 0.0], [1159, 1412, 1.0], [1412, 1765, 1.0], [1765, 1932, 1.0], [1932, 2150, 1.0], [2150, 2318, 1.0], [2318, 2493, 1.0], [2493, 2677, 1.0], [2677, 2878, 1.0], [2878, 3051, 1.0], [3051, 3271, 1.0], [3271, 3437, 1.0], [3437, 3642, 1.0], [3642, 3821, 1.0], [3821, 4028, 1.0], [4028, 4184, 1.0], [4184, 4341, 1.0], [4341, 4473, 1.0], [4473, 4614, 1.0], [4614, 4724, 1.0], [4724, 4883, 1.0], [4883, 5006, 1.0], [5006, 5293, 1.0], [5293, 5518, 0.0], [5518, 5697, 0.0]], "rps_lines_javascript_counts": [[0, 124, 0.0], [124, 259, 0.0], [259, 271, 0.0], [271, 321, 0.0], [321, 1058, 0.0], [1058, 1159, 0.0], [1159, 1412, 0.0], [1412, 1765, 0.0], [1765, 1932, 0.0], [1932, 2150, 0.0], [2150, 2318, 0.0], [2318, 2493, 0.0], [2493, 2677, 0.0], [2677, 2878, 0.0], [2878, 3051, 0.0], [3051, 3271, 0.0], [3271, 3437, 0.0], [3437, 3642, 0.0], [3642, 3821, 0.0], [3821, 4028, 0.0], [4028, 4184, 0.0], [4184, 4341, 0.0], [4341, 4473, 0.0], [4473, 4614, 0.0], [4614, 4724, 0.0], [4724, 4883, 0.0], [4883, 5006, 0.0], [5006, 5293, 0.0], [5293, 5518, 0.0], [5518, 5697, 0.0]], "rps_lines_num_words": [[0, 124, 19.0], [124, 259, 21.0], [259, 271, 2.0], [271, 321, 6.0], [321, 1058, 120.0], [1058, 1159, 12.0], [1159, 1412, 26.0], [1412, 1765, 41.0], [1765, 1932, 27.0], [1932, 2150, 38.0], [2150, 2318, 24.0], [2318, 2493, 25.0], [2493, 2677, 27.0], [2677, 2878, 33.0], [2878, 3051, 24.0], [3051, 3271, 31.0], [3271, 3437, 26.0], [3437, 3642, 29.0], [3642, 3821, 27.0], [3821, 4028, 28.0], [4028, 4184, 9.0], [4184, 
4341, 10.0], [4341, 4473, 10.0], [4473, 4614, 12.0], [4614, 4724, 16.0], [4724, 4883, 9.0], [4883, 5006, 18.0], [5006, 5293, 37.0], [5293, 5518, 32.0], [5518, 5697, 25.0]], "rps_lines_numerical_chars_fraction": [[0, 124, 0.0], [124, 259, 0.0], [259, 271, 0.0], [271, 321, 0.0], [321, 1058, 0.01243094], [1058, 1159, 0.0], [1159, 1412, 0.09589041], [1412, 1765, 0.02821317], [1765, 1932, 0.08666667], [1932, 2150, 0.07106599], [2150, 2318, 0.07189542], [2318, 2493, 0.07453416], [2493, 2677, 0.07602339], [2677, 2878, 0.08474576], [2878, 3051, 0.09032258], [3051, 3271, 0.05970149], [3271, 3437, 0.07894737], [3437, 3642, 0.06914894], [3642, 3821, 0.07317073], [3821, 4028, 0.06770833], [4028, 4184, 0.10077519], [4184, 4341, 0.06153846], [4341, 4473, 0.08333333], [4473, 4614, 0.06666667], [4614, 4724, 0.05263158], [4724, 4883, 0.05839416], [4883, 5006, 0.03669725], [5006, 5293, 0.12692308], [5293, 5518, 0.02790698], [5518, 5697, 0.03529412]], "rps_lines_start_with_bulletpoint": [[0, 124, 0.0], [124, 259, 0.0], [259, 271, 0.0], [271, 321, 0.0], [321, 1058, 0.0], [1058, 1159, 0.0], [1159, 1412, 0.0], [1412, 1765, 0.0], [1765, 1932, 0.0], [1932, 2150, 0.0], [2150, 2318, 0.0], [2318, 2493, 0.0], [2493, 2677, 0.0], [2677, 2878, 0.0], [2878, 3051, 0.0], [3051, 3271, 0.0], [3271, 3437, 0.0], [3437, 3642, 0.0], [3642, 3821, 0.0], [3821, 4028, 0.0], [4028, 4184, 0.0], [4184, 4341, 0.0], [4341, 4473, 0.0], [4473, 4614, 0.0], [4614, 4724, 0.0], [4724, 4883, 0.0], [4883, 5006, 0.0], [5006, 5293, 0.0], [5293, 5518, 0.0], [5518, 5697, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 124, 0.11290323], [124, 259, 0.11851852], [259, 271, 0.16666667], [271, 321, 0.1], [321, 1058, 0.01356852], [1058, 1159, 0.00990099], [1159, 1412, 0.05928854], [1412, 1765, 0.05382436], [1765, 1932, 0.04790419], [1932, 2150, 0.05045872], [2150, 2318, 0.0297619], [2318, 2493, 0.06285714], [2493, 2677, 0.05978261], [2677, 2878, 0.07960199], [2878, 3051, 0.07514451], [3051, 3271, 0.06818182], [3271, 3437, 0.08433735], [3437, 3642, 0.06829268], [3642, 3821, 0.06703911], [3821, 4028, 0.03381643], [4028, 4184, 0.04487179], [4184, 4341, 0.07006369], [4341, 4473, 0.0530303], [4473, 4614, 0.06382979], [4614, 4724, 0.10909091], [4724, 4883, 0.03144654], [4883, 5006, 0.1300813], [5006, 5293, 0.09407666], [5293, 5518, 0.12], [5518, 5697, 0.13407821]], "rps_doc_ml_palm_score": [[0, 5697, 0.00295842]], "rps_doc_ml_wikipedia_score": [[0, 5697, null]], "rps_doc_ml_wikiref_score": [[0, 5697, 0.06627846]], "rps_doc_books_importance": [[0, 5697, -712.61427904]], "rps_doc_openwebtext_importance": [[0, 5697, -342.55677458]], "rps_doc_wikipedia_importance": [[0, 5697, -258.18072692]], "rps_doc_num_sentences": [[0, 5697, 173.0]]} |
August 29, 2022
Is the Shifting Market a Challenge or an Opportunity for Homebuyers?
If you tried to buy a home during the pandemic, you know the limited supply of homes for sale was a considerable challenge. It created intense bidding wars which drove home prices up as buyers competed with one another to be the winning offer.
But what was once your greatest challenge may now be your greatest opportunity. Today, data shows buyer demand is moderating in the wake of higher mortgage rates. Here are a few reasons why this shift in the housing market is good news for your homebuying plans.
There were many reasons for the limited number of homes on the market during the pandemic, including a history of underbuilding new homes since the market crash in 2008. As the graph below shows, housing supply is well below what the market has seen for most of the past 10 years.
But that graph also shows a trend back up in the right direction this year. That’s because moderating demand is slowing the pace of home sales and that’s one of the reasons housing supply is finally able to grow. For you, that means you’ll have more options to choose from, so it shouldn’t be as difficult to find your next home as it has been recently.
And having more options may also lead to less intense bidding wars. Data from the Realtors Confidence Index from the National Association of Realtors (NAR) shows this trend has already begun: in recent reports, bidding wars are easing month over month (see graph below).
If you’ve been outbid before or you’ve struggled to find a home that meets your needs, breathe a welcome sigh of relief. The big takeaway here is you have more options and less competition today.
Just remember: while competition is easing, data shows multiple-offer scenarios are still happening; they're just not as intense as they were over the past year. You should still lean on an agent to guide you through the process and help you make your strongest offer up front.
If you’re still looking to make a move, it may be time to pick your home search back up today. Let’s connect to kick off the homebuying process.
Buyers The U.S. Homeownership Rate Is Growing The desire to own a home is still strong today. In fact, according to the Census, the U.S. homeownership rate is on the rise. To illustrate the increase, the graph below shows the homeownership rate over the last year: That data shows more than half of the U.S. population live in a home they own, and the percentage […] | 2023-14/0000/en_head.json.gz/3679 | {"url": "https://ekelerrealestate.com/2022/08/29/is-the-shifting-market-a-challenge-or-an-opportunity-for-homebuyers", "partition": "head_middle", "language": "en", "source_domain": "ekelerrealestate.com", "date_download": "2023-03-20T09:20:54Z", "digest": "sha1:FQ7U67XQPSR2X7Q4O2BPIHDSONYAC4T6"} | {"ccnet_length": [[0, 3211, 3211.0]], "ccnet_original_length": [[0, 3211, 5552.0]], "ccnet_nlines": [[0, 3211, 13.0]], "ccnet_original_nlines": [[0, 3211, 67.0]], "ccnet_language_score": [[0, 3211, 0.95]], "ccnet_perplexity": [[0, 3211, 322.8]], "ccnet_bucket": [[0, 3211, 0.0]], "rps_doc_curly_bracket": [[0, 3211, 0.0]], "rps_doc_ldnoobw_words": [[0, 3211, 0.0]], "rps_doc_lorem_ipsum": [[0, 3211, 0.0]], "rps_doc_stop_word_fraction": [[0, 3211, 0.40298507]], "rps_doc_ut1_blacklist": [[0, 3211, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 3211, 0.02877138]], "rps_doc_frac_chars_dupe_5grams": [[0, 3211, 0.0474339]], "rps_doc_frac_chars_dupe_6grams": [[0, 3211, 0.02877138]], "rps_doc_frac_chars_dupe_7grams": [[0, 3211, 0.02877138]], "rps_doc_frac_chars_dupe_8grams": [[0, 3211, 0.02877138]], "rps_doc_frac_chars_dupe_9grams": [[0, 3211, 0.02877138]], "rps_doc_frac_chars_top_2gram": [[0, 3211, 0.02332815]], "rps_doc_frac_chars_top_3gram": [[0, 3211, 0.02138414]], "rps_doc_frac_chars_top_4gram": [[0, 3211, 0.01399689]], "rps_doc_frac_all_caps_words": [[0, 3211, 0.01044776]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 3211, 0.0]], "rps_doc_frac_no_alph_words": [[0, 3211, 0.14925373]], "rps_doc_frac_unique_words": [[0, 3211, 0.45583039]], "rps_doc_mean_word_length": [[0, 3211, 4.54416961]], "rps_doc_symbol_to_word_ratio": [[0, 3211, 0.00298507]], "rps_doc_unigram_entropy": [[0, 3211, 5.10268692]], "rps_doc_word_count": [[0, 3211, 566.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 169, 0.0], [169, 238, 1.0], [238, 482, 1.0], [482, 745, 1.0], [745, 1044, 0.0], [1044, 1398, 1.0], [1398, 1675, 0.0], [1675, 1871, 1.0], [1871, 2135, 1.0], [2135, 2280, 1.0], [2280, 2401, 0.0], [2401, 2845, 0.0], [2845, 3211, 0.0]], "rps_lines_javascript_counts": [[0, 169, 0.0], [169, 238, 0.0], [238, 482, 0.0], [482, 745, 0.0], [745, 1044, 0.0], [1044, 1398, 0.0], [1398, 1675, 0.0], [1675, 1871, 0.0], [1871, 2135, 0.0], [2135, 2280, 0.0], [2280, 2401, 0.0], [2401, 2845, 0.0], [2845, 3211, 0.0]], "rps_lines_num_words": [[0, 169, 30.0], [169, 238, 11.0], [238, 482, 44.0], [482, 745, 46.0], [745, 1044, 54.0], [1044, 1398, 66.0], [1398, 1675, 43.0], [1675, 1871, 35.0], [1871, 2135, 46.0], [2135, 2280, 28.0], [2280, 2401, 23.0], [2401, 2845, 74.0], [2845, 3211, 66.0]], "rps_lines_numerical_chars_fraction": [[0, 169, 0.03636364], [169, 238, 0.0], [238, 482, 0.0], [482, 745, 0.0], [745, 1044, 0.02054795], [1044, 1398, 0.0], [1398, 1675, 0.0], [1675, 1871, 0.0], [1871, 2135, 0.0], [2135, 2280, 0.0], [2280, 2401, 0.0], [2401, 2845, 0.00932401], [2845, 3211, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 169, 0.0], [169, 238, 0.0], [238, 482, 0.0], [482, 745, 0.0], [745, 1044, 0.0], [1044, 1398, 0.0], [1398, 1675, 0.0], [1675, 1871, 0.0], [1871, 2135, 0.0], [2135, 2280, 0.0], [2280, 2401, 0.0], [2401, 2845, 0.0], [2845, 
3211, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 169, 0.1183432], [169, 238, 0.08695652], [238, 482, 0.00819672], [482, 745, 0.01140684], [745, 1044, 0.00668896], [1044, 1398, 0.00847458], [1398, 1675, 0.0433213], [1675, 1871, 0.01020408], [1871, 2135, 0.00757576], [2135, 2280, 0.0137931], [2280, 2401, 0.14876033], [2401, 2845, 0.04504505], [2845, 3211, 0.04644809]], "rps_doc_ml_palm_score": [[0, 3211, 0.00219369]], "rps_doc_ml_wikipedia_score": [[0, 3211, null]], "rps_doc_ml_wikiref_score": [[0, 3211, 0.00426853]], "rps_doc_books_importance": [[0, 3211, -248.70392455]], "rps_doc_openwebtext_importance": [[0, 3211, 29.11618379]], "rps_doc_wikipedia_importance": [[0, 3211, -233.45926492]], "rps_doc_num_sentences": [[0, 3211, 33.0]]} |
Events for 23 September 2022
No events scheduled for 23 September 2022. Please try another day. | 2023-14/0000/en_head.json.gz/3680 | {"url": "https://ekip.com/events/2022-09-23/", "partition": "head_middle", "language": "en", "source_domain": "ekip.com", "date_download": "2023-03-20T10:36:11Z", "digest": "sha1:TUW6VDESUDT7RSUWPMS4F5HDHYXJHR6W"} | {"ccnet_length": [[0, 95, 95.0]], "ccnet_original_length": [[0, 95, 781.0]], "ccnet_nlines": [[0, 95, 2.0]], "ccnet_original_nlines": [[0, 95, 45.0]], "ccnet_language_score": [[0, 95, 0.97]], "ccnet_perplexity": [[0, 95, 273.8]], "ccnet_bucket": [[0, 95, 0.0]], "rps_doc_curly_bracket": [[0, 95, 0.0]], "rps_doc_ldnoobw_words": [[0, 95, 0.0]], "rps_doc_lorem_ipsum": [[0, 95, 0.0]], "rps_doc_stop_word_fraction": [[0, 95, 0.22222222]], "rps_doc_ut1_blacklist": [[0, 95, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 95, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 95, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 95, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 95, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 95, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 95, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 95, 0.12820513]], "rps_doc_frac_chars_top_3gram": [[0, 95, 0.35897436]], "rps_doc_frac_chars_top_4gram": [[0, 95, 0.46153846]], "rps_doc_frac_all_caps_words": [[0, 95, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 95, 0.0]], "rps_doc_frac_no_alph_words": [[0, 95, 0.33333333]], "rps_doc_frac_unique_words": [[0, 95, 0.6875]], "rps_doc_mean_word_length": [[0, 95, 4.875]], "rps_doc_symbol_to_word_ratio": [[0, 95, 0.0]], "rps_doc_unigram_entropy": [[0, 95, 2.33937173]], "rps_doc_word_count": [[0, 95, 16.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 29, 0.0], [29, 95, 1.0]], "rps_lines_javascript_counts": [[0, 29, 0.0], [29, 95, 0.0]], "rps_lines_num_words": [[0, 29, 5.0], [29, 95, 11.0]], "rps_lines_numerical_chars_fraction": [[0, 29, 0.21428571], [29, 95, 0.09375]], "rps_lines_start_with_bulletpoint": [[0, 29, 0.0], [29, 95, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 29, 0.06896552], [29, 95, 0.04545455]], "rps_doc_ml_palm_score": [[0, 95, -1.001e-05]], "rps_doc_ml_wikipedia_score": [[0, 95, null]], "rps_doc_ml_wikiref_score": [[0, 95, -1.001e-05]], "rps_doc_books_importance": [[0, 95, -11.6821756]], "rps_doc_openwebtext_importance": [[0, 95, -7.39458938]], "rps_doc_wikipedia_importance": [[0, 95, -5.26214751]], "rps_doc_num_sentences": [[0, 95, 2.0]]} |
You are Label
You want to Promote yourself
You want your music to reach as many people as possible. For this to happen, you have to use every available channel to get it out there: the media, music for the moving image, and digital and physical distribution, in France and abroad, are all ways of bringing your music to the widest possible audience.
Subscribe to Promote yourself | 2023-14/0000/en_head.json.gz/3681 | {"url": "https://electronicmusicfactory.com/en/label/promote-yourself", "partition": "head_middle", "language": "en", "source_domain": "electronicmusicfactory.com", "date_download": "2023-03-20T10:40:45Z", "digest": "sha1:5JR5JQ7SWOJA74EMSY6QXNR6EBBV6MDA"} | {"ccnet_length": [[0, 418, 418.0]], "ccnet_original_length": [[0, 418, 1316.0]], "ccnet_nlines": [[0, 418, 4.0]], "ccnet_original_nlines": [[0, 418, 51.0]], "ccnet_language_score": [[0, 418, 0.97]], "ccnet_perplexity": [[0, 418, 220.6]], "ccnet_bucket": [[0, 418, 0.0]], "rps_doc_curly_bracket": [[0, 418, 0.0]], "rps_doc_ldnoobw_words": [[0, 418, 0.0]], "rps_doc_lorem_ipsum": [[0, 418, 0.0]], "rps_doc_stop_word_fraction": [[0, 418, 0.54761905]], "rps_doc_ut1_blacklist": [[0, 418, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 418, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 418, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 418, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 418, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 418, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 418, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 418, 0.08083832]], "rps_doc_frac_chars_top_3gram": [[0, 418, 0.10179641]], "rps_doc_frac_chars_top_4gram": [[0, 418, 0.0]], "rps_doc_frac_all_caps_words": [[0, 418, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 418, 0.0]], "rps_doc_frac_no_alph_words": [[0, 418, 0.0952381]], "rps_doc_frac_unique_words": [[0, 418, 0.64473684]], "rps_doc_mean_word_length": [[0, 418, 4.39473684]], "rps_doc_symbol_to_word_ratio": [[0, 418, 0.0]], "rps_doc_unigram_entropy": [[0, 418, 3.69995792]], "rps_doc_word_count": [[0, 418, 76.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 14, 0.0], [14, 43, 0.0], [43, 389, 1.0], [389, 418, 0.0]], "rps_lines_javascript_counts": [[0, 14, 0.0], [14, 43, 0.0], [43, 389, 0.0], [389, 418, 0.0]], "rps_lines_num_words": [[0, 14, 3.0], [14, 43, 5.0], [43, 389, 64.0], [389, 418, 4.0]], "rps_lines_numerical_chars_fraction": [[0, 14, 0.0], [14, 43, 0.0], [43, 389, 0.0], [389, 418, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 14, 0.0], [14, 43, 0.0], [43, 389, 0.0], [389, 418, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 14, 0.14285714], [14, 43, 0.06896552], [43, 389, 0.00867052], [389, 418, 0.06896552]], "rps_doc_ml_palm_score": [[0, 418, 0.04269385]], "rps_doc_ml_wikipedia_score": [[0, 418, null]], "rps_doc_ml_wikiref_score": [[0, 418, 0.0065431]], "rps_doc_books_importance": [[0, 418, -16.92731911]], "rps_doc_openwebtext_importance": [[0, 418, 0.446286]], "rps_doc_wikipedia_importance": [[0, 418, -34.63820142]], "rps_doc_num_sentences": [[0, 418, 3.0]]} |
Steam-extraction experts
Founded in 1996, Sirius has grown rapidly in the market thanks to its innovative design, reliability and attention to performance and energy efficiency. Sirius is now one of the European leaders in the segment of high-quality and decorative kitchen hoods. Among its latest innovations is the use of ceramic, handmade in Deruta (Italy), to make unique and exclusive products. All production is 100% made in Italy, with a careful analysis of the supply chain.
NextNEFF → | 2023-14/0000/en_head.json.gz/3682 | {"url": "https://elementi.lv/portfolio/sirius/?lang=en", "partition": "head_middle", "language": "en", "source_domain": "elementi.lv", "date_download": "2023-03-20T10:41:40Z", "digest": "sha1:X4POJMEIZNUVP4ZWXYQUTFXLAWSYRB4H"} | {"ccnet_length": [[0, 521, 521.0]], "ccnet_original_length": [[0, 521, 985.0]], "ccnet_nlines": [[0, 521, 4.0]], "ccnet_original_nlines": [[0, 521, 32.0]], "ccnet_language_score": [[0, 521, 0.96]], "ccnet_perplexity": [[0, 521, 271.6]], "ccnet_bucket": [[0, 521, 0.0]], "rps_doc_curly_bracket": [[0, 521, 0.0]], "rps_doc_ldnoobw_words": [[0, 521, 0.0]], "rps_doc_lorem_ipsum": [[0, 521, 0.0]], "rps_doc_stop_word_fraction": [[0, 521, 0.36734694]], "rps_doc_ut1_blacklist": [[0, 521, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 521, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 521, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 521, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 521, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 521, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 521, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 521, 0.02352941]], "rps_doc_frac_chars_top_3gram": [[0, 521, 0.0]], "rps_doc_frac_chars_top_4gram": [[0, 521, 0.0]], "rps_doc_frac_all_caps_words": [[0, 521, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 521, 0.0]], "rps_doc_frac_no_alph_words": [[0, 521, 0.17346939]], "rps_doc_frac_unique_words": [[0, 521, 0.73493976]], "rps_doc_mean_word_length": [[0, 521, 5.12048193]], "rps_doc_symbol_to_word_ratio": [[0, 521, 0.0]], "rps_doc_unigram_entropy": [[0, 521, 3.92921822]], "rps_doc_word_count": [[0, 521, 83.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 25, 0.0], [25, 496, 1.0], [496, 511, 0.0], [511, 521, 0.0]], "rps_lines_javascript_counts": [[0, 25, 0.0], [25, 496, 0.0], [496, 511, 0.0], [511, 521, 0.0]], "rps_lines_num_words": [[0, 25, 2.0], [25, 496, 77.0], [496, 511, 2.0], [511, 521, 2.0]], "rps_lines_numerical_chars_fraction": [[0, 25, 0.0], [25, 496, 0.01531729], [496, 511, 0.0], [511, 521, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 25, 0.0], [25, 496, 0.0], [496, 511, 0.0], [511, 521, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 25, 0.04], [25, 496, 0.01910828], [496, 511, 0.13333333], [511, 521, 0.5]], "rps_doc_ml_palm_score": [[0, 521, 0.01132905]], "rps_doc_ml_wikipedia_score": [[0, 521, null]], "rps_doc_ml_wikiref_score": [[0, 521, 0.00036013]], "rps_doc_books_importance": [[0, 521, -25.15917324]], "rps_doc_openwebtext_importance": [[0, 521, -2.6372513]], "rps_doc_wikipedia_importance": [[0, 521, 9.47499606]], "rps_doc_num_sentences": [[0, 521, 5.0]]} |
‘fable’
A storybook fable explains the origins of The Latitude Society in the documentary, IN BRIGHT AXIOM.
Critics Choice Association
NC Film Critics Association
Southeastern Film Critics Association | 2023-14/0000/en_head.json.gz/3683 | {"url": "https://elementsofmadness.com/fable/", "partition": "head_middle", "language": "en", "source_domain": "elementsofmadness.com", "date_download": "2023-03-20T09:08:02Z", "digest": "sha1:SXMWML7XGCI2732WGP6ZIUTZLRJU7N5Z"} | {"ccnet_length": [[0, 1490, 1490.0]], "ccnet_original_length": [[0, 1490, 2416.0]], "ccnet_nlines": [[0, 1490, 10.0]], "ccnet_original_nlines": [[0, 1490, 58.0]], "ccnet_language_score": [[0, 1490, 0.98]], "ccnet_perplexity": [[0, 1490, 215.2]], "ccnet_bucket": [[0, 1490, 0.0]], "rps_doc_curly_bracket": [[0, 1490, 0.0]], "rps_doc_ldnoobw_words": [[0, 1490, 0.0]], "rps_doc_lorem_ipsum": [[0, 1490, 0.0]], "rps_doc_stop_word_fraction": [[0, 1490, 0.04897959]], "rps_doc_ut1_blacklist": [[0, 1490, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 1490, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 1490, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 1490, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 1490, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 1490, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 1490, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 1490, 0.02866242]], "rps_doc_frac_chars_top_3gram": [[0, 1490, 0.06687898]], "rps_doc_frac_chars_top_4gram": [[0, 1490, 0.07165605]], "rps_doc_frac_all_caps_words": [[0, 1490, 0.02040816]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 1490, 0.0]], "rps_doc_frac_no_alph_words": [[0, 1490, 0.40408163]], "rps_doc_frac_unique_words": [[0, 1490, 0.2360515]], "rps_doc_mean_word_length": [[0, 1490, 5.39055794]], "rps_doc_symbol_to_word_ratio": [[0, 1490, 0.0]], "rps_doc_unigram_entropy": [[0, 1490, 3.61444105]], "rps_doc_word_count": [[0, 1490, 233.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 8, 0.0], [8, 108, 1.0], [108, 146, 0.0], [146, 188, 0.0], [188, 232, 0.0], [232, 274, 0.0], [274, 1398, 0.0], [1398, 1425, 0.0], [1425, 1453, 0.0], [1453, 1490, 0.0]], "rps_lines_javascript_counts": [[0, 8, 0.0], [8, 108, 0.0], [108, 146, 0.0], [146, 188, 0.0], [188, 232, 0.0], [232, 274, 0.0], [274, 1398, 0.0], [1398, 1425, 0.0], [1425, 1453, 0.0], [1453, 1490, 0.0]], "rps_lines_num_words": [[0, 8, 1.0], [8, 108, 16.0], [108, 146, 5.0], [146, 188, 5.0], [188, 232, 5.0], [232, 274, 5.0], [274, 1398, 185.0], [1398, 1425, 3.0], [1425, 1453, 4.0], [1453, 1490, 4.0]], "rps_lines_numerical_chars_fraction": [[0, 8, 0.0], [8, 108, 0.0], [108, 146, 0.0], [146, 188, 0.0], [188, 232, 0.0], [232, 274, 0.0], [274, 1398, 0.32413179], [1398, 1425, 0.0], [1425, 1453, 0.0], [1453, 1490, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 8, 0.0], [8, 108, 0.0], [108, 146, 0.0], [146, 188, 0.0], [188, 232, 0.0], [232, 274, 0.0], [274, 1398, 0.0], [1398, 1425, 0.0], [1425, 1453, 0.0], [1453, 1490, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 8, 0.0], [8, 108, 0.17], [108, 146, 0.13157895], [146, 188, 0.07142857], [188, 232, 0.06818182], [232, 274, 0.0952381], [274, 1398, 0.08362989], [1398, 1425, 0.11111111], [1425, 1453, 0.17857143], [1453, 1490, 0.10810811]], "rps_doc_ml_palm_score": [[0, 1490, -1.001e-05]], "rps_doc_ml_wikipedia_score": [[0, 1490, null]], "rps_doc_ml_wikiref_score": [[0, 1490, 0.81714308]], "rps_doc_books_importance": [[0, 1490, -278.04238744]], "rps_doc_openwebtext_importance": [[0, 1490, -107.7626711]], "rps_doc_wikipedia_importance": [[0, 1490, -40.12419472]], "rps_doc_num_sentences": [[0, 1490, 2.0]]} |
from a preventable cause of death, said study co-author Jason Goldstick. He is a research associate professor of emergency medicine at Michigan Medicine in Ann Arbor.
"Recent investments in firearm injury prevention research by the [U.S. Centers for Disease Control and Prevention] and National Institutes of Health, in addition to community violence prevention funding in the federal budget, are a step in the right direction, but this momentum must continue if we truly want to break this alarming trend," Goldstick said in a university news release.
The study attributed 3,900 deaths in the 1- to 19-year age group to motor vehicles. The authors added that drug poisoning deaths rose more than 83% to more than 1,700. Drug poisoning was the third-leading cause of death in the studied age group.
The findings, based on CDC data, were published April 20 in the New England Journal of Medicine.
"Motor vehicle crashes were consistently the leading cause of death for children and adolescents by a fairly wide margin, but by making vehicles and their drivers safer, these types of fatalities have drastically decreased over the past 20 years," said study co-author Dr. Patrick Carter, co-director of the university's Institute for Firearm Injury Prevention.
"Injury prevention science played a crucial role in reducing automobile deaths without taking cars off the road, and we have a real opportunity here to generate a similar impact for reducing firearm deaths through the application of rigorous injury prevention science," Carter added.
In 2020, more than 45,000 people of all ages died in the United States as a result of firearms, up more than 13% from 2019. The researchers said the increase was largely due to gun homicides, which jumped more than 33% from 2019 to 2020. Firearm suicides rose by about 1%, the CDC data show.
According to study co-author Dr. Rebecca Cunningham, vice president for research and a professor of emergency medicine at the university, "Firearm violence is one of the most critical challenges facing our society, and based on the latest federal data, this crisis is growing more and more intense."
The Pew Research Center has more about gun deaths.
SOURCE: University of Michigan, news release, April 20, 2022
Teen Health Quiz | 2023-14/0000/en_head.json.gz/3684 | {"url": "https://elibrary.carti.com/RelatedItems/6,1657192088", "partition": "head_middle", "language": "en", "source_domain": "elibrary.carti.com", "date_download": "2023-03-20T10:02:08Z", "digest": "sha1:KEKA5VDFWCG24SPSC2H3RF6RZ3YXDIIX"} | {"ccnet_length": [[0, 2434, 2434.0]], "ccnet_original_length": [[0, 2434, 4743.0]], "ccnet_nlines": [[0, 2434, 15.0]], "ccnet_original_nlines": [[0, 2434, 125.0]], "ccnet_language_score": [[0, 2434, 0.96]], "ccnet_perplexity": [[0, 2434, 247.2]], "ccnet_bucket": [[0, 2434, 0.0]], "rps_doc_curly_bracket": [[0, 2434, 0.0]], "rps_doc_ldnoobw_words": [[0, 2434, 0.0]], "rps_doc_lorem_ipsum": [[0, 2434, 0.0]], "rps_doc_stop_word_fraction": [[0, 2434, 0.32908705]], "rps_doc_ut1_blacklist": [[0, 2434, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 2434, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 2434, 0.0304414]], "rps_doc_frac_chars_dupe_6grams": [[0, 2434, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 2434, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 2434, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 2434, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 2434, 0.0152207]], "rps_doc_frac_chars_top_3gram": [[0, 2434, 0.01826484]], "rps_doc_frac_chars_top_4gram": [[0, 2434, 0.02841197]], "rps_doc_frac_all_caps_words": [[0, 2434, 0.01061571]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 2434, 0.0]], "rps_doc_frac_no_alph_words": [[0, 2434, 0.18895966]], "rps_doc_frac_unique_words": [[0, 2434, 0.55128205]], "rps_doc_mean_word_length": [[0, 2434, 5.05384615]], "rps_doc_symbol_to_word_ratio": [[0, 2434, 0.0]], "rps_doc_unigram_entropy": [[0, 2434, 4.98061104]], "rps_doc_word_count": [[0, 2434, 390.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 167, 1.0], [167, 553, 1.0], [553, 799, 1.0], [799, 896, 1.0], [896, 1258, 1.0], [1258, 1542, 1.0], [1542, 1834, 1.0], [1834, 2134, 0.0], [2134, 2185, 1.0], [2185, 2246, 0.0], [2246, 2316, 0.0], [2316, 2352, 0.0], [2352, 2385, 0.0], [2385, 2418, 0.0], [2418, 2434, 0.0]], "rps_lines_javascript_counts": [[0, 167, 0.0], [167, 553, 0.0], [553, 799, 0.0], [799, 896, 0.0], [896, 1258, 0.0], [1258, 1542, 0.0], [1542, 1834, 0.0], [1834, 2134, 0.0], [2134, 2185, 0.0], [2185, 2246, 0.0], [2246, 2316, 0.0], [2316, 2352, 0.0], [2352, 2385, 0.0], [2385, 2418, 0.0], [2418, 2434, 0.0]], "rps_lines_num_words": [[0, 167, 26.0], [167, 553, 60.0], [553, 799, 43.0], [799, 896, 17.0], [896, 1258, 54.0], [1258, 1542, 43.0], [1542, 1834, 55.0], [1834, 2134, 47.0], [2134, 2185, 9.0], [2185, 2246, 9.0], [2246, 2316, 9.0], [2316, 2352, 5.0], [2352, 2385, 5.0], [2385, 2418, 5.0], [2418, 2434, 3.0]], "rps_lines_numerical_chars_fraction": [[0, 167, 0.0], [167, 553, 0.0], [553, 799, 0.05508475], [799, 896, 0.02150538], [896, 1258, 0.00571429], [1258, 1542, 0.0], [1542, 1834, 0.09285714], [1834, 2134, 0.0], [2134, 2185, 0.0], [2185, 2246, 0.10714286], [2246, 2316, 0.0], [2316, 2352, 0.0], [2352, 2385, 0.0], [2385, 2418, 0.0], [2418, 2434, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 167, 0.0], [167, 553, 0.0], [553, 799, 0.0], [799, 896, 0.0], [896, 1258, 0.0], [1258, 1542, 0.0], [1542, 1834, 0.0], [1834, 2134, 0.0], [2134, 2185, 0.0], [2185, 2246, 0.0], [2246, 2316, 0.0], [2316, 2352, 0.0], [2352, 2385, 0.0], [2385, 2418, 0.0], [2418, 2434, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 167, 0.04191617], [167, 553, 0.02849741], [553, 799, 0.01219512], [799, 896, 0.09278351], [896, 1258, 0.02209945], [1258, 1542, 0.00704225], [1542, 1834, 0.02739726], [1834, 2134, 0.01666667], 
[2134, 2185, 0.07843137], [2185, 2246, 0.14754098], [2246, 2316, 0.12857143], [2316, 2352, 0.11111111], [2352, 2385, 0.15151515], [2385, 2418, 0.12121212], [2418, 2434, 0.1875]], "rps_doc_ml_palm_score": [[0, 2434, 0.71543962]], "rps_doc_ml_wikipedia_score": [[0, 2434, null]], "rps_doc_ml_wikiref_score": [[0, 2434, 0.74763632]], "rps_doc_books_importance": [[0, 2434, -83.42584055]], "rps_doc_openwebtext_importance": [[0, 2434, 42.44138322]], "rps_doc_wikipedia_importance": [[0, 2434, 45.57325039]], "rps_doc_num_sentences": [[0, 2434, 19.0]]} |
Ellel USA, Tampa, FL
My back problems and pill addiction were healed...
Ellel USA - Securing the Land
As many of you know, Ellel USA has been waiting on the Lord to provide a building for our center in the USA. We purchased the 140 acres of land we are located on 12 years ago with a 100% loan. But because of a series of challenges, unprecedented in the 30 years since the ministry of Ellel was established in England in 1986, we have been unable to free the land from this financial encumbrance, and until we do, we can't put up the building(s) that will accommodate the great purposes God has for the land and center.
But we believe that now this goal is within our grasp! The Lord has opened a new window of opportunity for us to obtain unencumbered title to the land if we act by March 31, 2018. Even though the original loan was for $2,200,000, we have now been told that the land will be ours if we can pay $1,500,000 for the land by that date - otherwise it will be placed for immediate sale on the open real estate market.
We have already raised approximately $375,000 (as of January 19, 2018) towards outright ownership of the land, reducing what we now need to only $1,125,000. Although this is not a small amount of money, it is surely well within the combined abilities of those of us who have been touched by the ministry! If we will all sacrificially give an extravagant gift, we can reach this goal. I truly believe this is what the Lord wants - a group effort of sacrificial giving to free the land.
I also believe there are a few people that are to give very large sums toward this effort. The Lord has gifted some of you with the heart and resources to support His Kingdom in very significant ways, financially. We are believing that the Lord is not only going to free this land so that we can build, but that He will also provide the finances to actually start the building process. Although we can begin to build for significantly less, the first phase of the main facility could be completed for $3,000,000 beyond the cost of the land mentioned above.
Whether you give a numerically large or small gift, would you please prayerfully consider a sacrificial financial gift toward freeing the Ellel USA land once and for all from its financial bondage, so that we may move forward in the Lord's full purposes for it? Even if 10% of those touched by the work of Ellel in the USA alone would give $10,000 or more, we would easily surpass the goal of purchasing the land for cash and would be able to begin the first phase of building.
Would you please call our office at 813-737-4848 or email me directly ([email protected]) to make a donation or to let us know that you plan to do so. You can also CLICK HERE to donate online. Time is of the essence. And we want to make sure every penny designated toward the land, goes for that purpose. By calling or emailing, we can also keep a "real time" tally of the funds coming in for this purpose. Remember, our deadline is March 31, 2018, so we need your donation by March 15, 2018!
Please share this with anyone you believe might be interested in helping in this effort or share this video with them - CLICK HERE TO SEE VIDEO. Thanks so much for your faithful prayer and financial support of Ellel USA thus far and for giving this plea prayerful and serious consideration. Please feel free to call me at the number above if you have any questions. I will keep you posted about our progress but please don't wait to give. If the Lord is prompting you to give, give generously and immediately!
National Director, Ellel Ministries USA | 2023-14/0000/en_head.json.gz/3685 | {"url": "https://ellel.org/usa/news/ellel-usa-securing-the-land", "partition": "head_middle", "language": "en", "source_domain": "ellel.org", "date_download": "2023-03-20T10:34:55Z", "digest": "sha1:Z7FSZGHBDFSHYAS6HVLKZ27HHHZTWWBX"} | {"ccnet_length": [[0, 3635, 3635.0]], "ccnet_original_length": [[0, 3635, 7238.0]], "ccnet_nlines": [[0, 3635, 12.0]], "ccnet_original_nlines": [[0, 3635, 167.0]], "ccnet_language_score": [[0, 3635, 0.97]], "ccnet_perplexity": [[0, 3635, 283.6]], "ccnet_bucket": [[0, 3635, 0.0]], "rps_doc_curly_bracket": [[0, 3635, 0.0]], "rps_doc_ldnoobw_words": [[0, 3635, 1.0]], "rps_doc_lorem_ipsum": [[0, 3635, 0.0]], "rps_doc_stop_word_fraction": [[0, 3635, 0.47674419]], "rps_doc_ut1_blacklist": [[0, 3635, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 3635, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 3635, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 3635, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 3635, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 3635, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 3635, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 3635, 0.02681992]], "rps_doc_frac_chars_top_3gram": [[0, 3635, 0.00557297]], "rps_doc_frac_chars_top_4gram": [[0, 3635, 0.00905608]], "rps_doc_frac_all_caps_words": [[0, 3635, 0.02325581]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 3635, 0.08333333]], "rps_doc_frac_no_alph_words": [[0, 3635, 0.1627907]], "rps_doc_frac_unique_words": [[0, 3635, 0.44712991]], "rps_doc_mean_word_length": [[0, 3635, 4.33685801]], "rps_doc_symbol_to_word_ratio": [[0, 3635, 0.00258398]], "rps_doc_unigram_entropy": [[0, 3635, 5.12332284]], "rps_doc_word_count": [[0, 3635, 662.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 19, 0.0], [19, 84, 1.0], [84, 117, 0.0], [117, 147, 0.0], [147, 659, 1.0], [659, 1070, 1.0], [1070, 1555, 1.0], [1555, 2112, 1.0], [2112, 2590, 1.0], [2590, 3086, 1.0], [3086, 3596, 1.0], [3596, 3635, 0.0]], "rps_lines_javascript_counts": [[0, 19, 0.0], [19, 84, 0.0], [84, 117, 0.0], [117, 147, 0.0], [147, 659, 0.0], [659, 1070, 0.0], [1070, 1555, 0.0], [1555, 2112, 0.0], [2112, 2590, 0.0], [2590, 3086, 0.0], [3086, 3596, 0.0], [3596, 3635, 0.0]], "rps_lines_num_words": [[0, 19, 3.0], [19, 84, 10.0], [84, 117, 4.0], [117, 147, 5.0], [147, 659, 96.0], [659, 1070, 80.0], [1070, 1555, 86.0], [1555, 2112, 101.0], [2112, 2590, 88.0], [2590, 3086, 93.0], [3086, 3596, 91.0], [3596, 3635, 5.0]], "rps_lines_numerical_chars_fraction": [[0, 19, 0.0], [19, 84, 0.0], [84, 117, 0.0], [117, 147, 0.0], [147, 659, 0.028], [659, 1070, 0.05037783], [1070, 1555, 0.04068522], [1555, 2112, 0.01282051], [2112, 2590, 0.01495726], [2590, 3086, 0.0464135], [3086, 3596, 0.0], [3596, 3635, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 19, 0.0], [19, 84, 0.0], [84, 117, 0.0], [117, 147, 0.0], [147, 659, 0.0], [659, 1070, 0.0], [1070, 1555, 0.0], [1555, 2112, 0.0], [2112, 2590, 0.0], [2590, 3086, 0.0], [3086, 3596, 0.0], [3596, 3635, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 19, 0.36842105], [19, 84, 0.04615385], [84, 117, 0.15151515], [117, 147, 0.2], [147, 659, 0.02734375], [659, 1070, 0.01216545], [1070, 1555, 0.01237113], [1555, 2112, 0.01615799], [2112, 2590, 0.02301255], [2590, 3086, 0.03427419], [3086, 3596, 0.05686275], [3596, 3635, 0.17948718]], "rps_doc_ml_palm_score": [[0, 3635, 0.31983274]], "rps_doc_ml_wikipedia_score": [[0, 3635, null]], "rps_doc_ml_wikiref_score": [[0, 3635, 0.13699889]], "rps_doc_books_importance": [[0, 3635, 
-122.01416907]], "rps_doc_openwebtext_importance": [[0, 3635, 17.11892721]], "rps_doc_wikipedia_importance": [[0, 3635, -237.84777388]], "rps_doc_num_sentences": [[0, 3635, 32.0]]} |
Washoe; Washoa
Isolate; North American
Severely Endangered
California Indian Languages
Ethnologue: Languages of the World, 18th Edition
As of 2008 there were about twenty fluent first-language speakers of Washo, all middle-aged or elderly, out of a total population of more than fifteen hundred.
Nevada, California
38.9226,-119.7256
10 (Unchanged 2016.)
"A number of children and young adults have acquired moderate L2 fluency."
(1998 J. Brook). "Several dozen fully fluent speakers" (Golla 2007).
"California-Nevada border southeast of Lake Tahoe."
Several dozen
Many adults and children have become second-language speakers or semi-speakers. (Golla 2007:90)
The Washo Tribe has a successful immersion school (kindergarten-8th grade).
Several dozen fluent first-language speakers, all middle-aged or elderly, in a total population of over 1,500, divided among four small reservations in both Nevada and California.
Spoken by members of the Washoe Tribe of Nevada and California, whose traditional homeland centers on Lake Tahoe in the High Sierra.
39.25,-120.0
Golla, Victor. 2007. "North America." In Encyclopedia of the World's Endangered Languages, edited by C. Moseley. 1-96. London & New York: Routledge.
A Grammar of the Washo Language
Jacobsen, Jr., William H. A Grammar of the Washo Language. PhD thesis, Berkeley: University of California, 1964.
Beginning Washo
Jacobsen, Jr., W. 1996. "Beginning Washo." 5. Carson City: Nevada State Museum.
The Washo Language of East Central California and Nevada
Kroeber, A. L. 1907. "The Washo Language of East Central California and Nevada." In University of California Publications in American Archaeology and Ethnology, 4, no. 5: 252-317.
d'Azevedo, Warren L. 1986. "Washoe." In Great Basin, edited by Warren L. d'Azevedo, 11: 466-498. Washington: Smithsonian Institution.
Kroeber, A. L. 1907. "The Washo Language of East Central California and Nevada." 4.5: 251-317. University of California Press.
Jacobsen, William. 1964. "A Grammar of the Washo Language."
Washo texts
Lowie, Robert H. 1963. "Washo Texts." In Anthropological Linguistics, 5 , no. 6: 1-30. | 2023-14/0000/en_head.json.gz/3686 | {"url": "https://elp.colo.hawaii.edu/language/1149/?lang=mr", "partition": "head_middle", "language": "en", "source_domain": "elp.colo.hawaii.edu", "date_download": "2023-03-20T10:02:38Z", "digest": "sha1:MCCQQWE7APQZYM5PBZHNQ3MLX25SBORW"} | {"ccnet_length": [[0, 2179, 2179.0]], "ccnet_original_length": [[0, 2179, 16554.0]], "ccnet_nlines": [[0, 2179, 34.0]], "ccnet_original_nlines": [[0, 2179, 517.0]], "ccnet_language_score": [[0, 2179, 0.78]], "ccnet_perplexity": [[0, 2179, 310.9]], "ccnet_bucket": [[0, 2179, 0.0]], "rps_doc_curly_bracket": [[0, 2179, 0.0]], "rps_doc_ldnoobw_words": [[0, 2179, 0.0]], "rps_doc_lorem_ipsum": [[0, 2179, 0.0]], "rps_doc_stop_word_fraction": [[0, 2179, 0.15546218]], "rps_doc_ut1_blacklist": [[0, 2179, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 2179, 0.07142857]], "rps_doc_frac_chars_dupe_5grams": [[0, 2179, 0.14519906]], "rps_doc_frac_chars_dupe_6grams": [[0, 2179, 0.14519906]], "rps_doc_frac_chars_dupe_7grams": [[0, 2179, 0.09953162]], "rps_doc_frac_chars_dupe_8grams": [[0, 2179, 0.09953162]], "rps_doc_frac_chars_dupe_9grams": [[0, 2179, 0.09953162]], "rps_doc_frac_chars_top_2gram": [[0, 2179, 0.03747073]], "rps_doc_frac_chars_top_3gram": [[0, 2179, 0.05620609]], "rps_doc_frac_chars_top_4gram": [[0, 2179, 0.02283372]], "rps_doc_frac_all_caps_words": [[0, 2179, 0.03991597]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 2179, 0.0]], "rps_doc_frac_no_alph_words": [[0, 2179, 0.37605042]], "rps_doc_frac_unique_words": [[0, 2179, 0.52698413]], "rps_doc_mean_word_length": [[0, 2179, 5.42222222]], "rps_doc_symbol_to_word_ratio": [[0, 2179, 0.0]], "rps_doc_unigram_entropy": [[0, 2179, 4.7145833]], "rps_doc_word_count": [[0, 2179, 315.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 15, 0.0], [15, 39, 0.0], [39, 59, 0.0], [59, 81, 0.0], [81, 91, 0.0], [91, 113, 0.0], [113, 131, 0.0], [131, 159, 0.0], [159, 208, 0.0], [208, 367, 0.0], [367, 386, 0.0], [386, 404, 0.0], [404, 425, 0.0], [425, 500, 0.0], [500, 569, 1.0], [569, 621, 0.0], [621, 635, 0.0], [635, 731, 0.0], [731, 805, 0.0], [805, 985, 1.0], [985, 1118, 1.0], [1118, 1131, 0.0], [1131, 1280, 1.0], [1280, 1312, 0.0], [1312, 1425, 1.0], [1425, 1441, 0.0], [1441, 1521, 1.0], [1521, 1578, 0.0], [1578, 1759, 1.0], [1759, 1894, 1.0], [1894, 2021, 1.0], [2021, 2081, 0.0], [2081, 2093, 0.0], [2093, 2179, 1.0]], "rps_lines_javascript_counts": [[0, 15, 0.0], [15, 39, 0.0], [39, 59, 0.0], [59, 81, 0.0], [81, 91, 0.0], [91, 113, 0.0], [113, 131, 0.0], [131, 159, 0.0], [159, 208, 0.0], [208, 367, 0.0], [367, 386, 0.0], [386, 404, 0.0], [404, 425, 0.0], [425, 500, 0.0], [500, 569, 0.0], [569, 621, 0.0], [621, 635, 0.0], [635, 731, 0.0], [731, 805, 0.0], [805, 985, 0.0], [985, 1118, 0.0], [1118, 1131, 0.0], [1131, 1280, 0.0], [1280, 1312, 0.0], [1312, 1425, 0.0], [1425, 1441, 0.0], [1441, 1521, 0.0], [1521, 1578, 0.0], [1578, 1759, 0.0], [1759, 1894, 0.0], [1894, 2021, 0.0], [2021, 2081, 0.0], [2081, 2093, 0.0], [2093, 2179, 0.0]], "rps_lines_num_words": [[0, 15, 2.0], [15, 39, 3.0], [39, 59, 2.0], [59, 81, 4.0], [81, 91, 1.0], [91, 113, 4.0], [113, 131, 1.0], [131, 159, 3.0], [159, 208, 7.0], [208, 367, 26.0], [367, 386, 2.0], [386, 404, 1.0], [404, 425, 3.0], [425, 500, 12.0], [500, 569, 10.0], [569, 621, 6.0], [621, 635, 2.0], [635, 731, 12.0], [731, 805, 10.0], [805, 985, 26.0], [985, 1118, 22.0], [1118, 1131, 1.0], [1131, 1280, 21.0], [1280, 1312, 6.0], [1312, 1425, 17.0], [1425, 
1441, 2.0], [1441, 1521, 12.0], [1521, 1578, 9.0], [1578, 1759, 27.0], [1759, 1894, 18.0], [1894, 2021, 19.0], [2021, 2081, 9.0], [2081, 2093, 2.0], [2093, 2179, 13.0]], "rps_lines_numerical_chars_fraction": [[0, 15, 0.0], [15, 39, 0.0], [39, 59, 0.0], [59, 81, 0.0], [81, 91, 0.875], [91, 113, 0.0], [113, 131, 0.0], [131, 159, 0.0], [159, 208, 0.04347826], [208, 367, 0.02597403], [367, 386, 0.0], [386, 404, 1.0], [404, 425, 0.35294118], [425, 500, 0.01408451], [500, 569, 0.13559322], [569, 621, 0.0], [621, 635, 0.0], [635, 731, 0.06741573], [731, 805, 0.01428571], [805, 985, 0.02325581], [985, 1118, 0.0], [1118, 1131, 1.0], [1131, 1280, 0.0530303], [1280, 1312, 0.0], [1312, 1425, 0.03883495], [1425, 1441, 0.0], [1441, 1521, 0.07352941], [1521, 1578, 0.0], [1578, 1759, 0.07228916], [1759, 1894, 0.1025641], [1894, 2021, 0.10526316], [2021, 2081, 0.0754717], [2081, 2093, 0.0], [2093, 2179, 0.12328767]], "rps_lines_start_with_bulletpoint": [[0, 15, 0.0], [15, 39, 0.0], [39, 59, 0.0], [59, 81, 0.0], [81, 91, 0.0], [91, 113, 0.0], [113, 131, 0.0], [131, 159, 0.0], [159, 208, 0.0], [208, 367, 0.0], [367, 386, 0.0], [386, 404, 0.0], [404, 425, 0.0], [425, 500, 0.0], [500, 569, 0.0], [569, 621, 0.0], [621, 635, 0.0], [635, 731, 0.0], [731, 805, 0.0], [805, 985, 0.0], [985, 1118, 0.0], [1118, 1131, 0.0], [1131, 1280, 0.0], [1280, 1312, 0.0], [1312, 1425, 0.0], [1425, 1441, 0.0], [1441, 1521, 0.0], [1521, 1578, 0.0], [1578, 1759, 0.0], [1759, 1894, 0.0], [1894, 2021, 0.0], [2021, 2081, 0.0], [2081, 2093, 0.0], [2093, 2179, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 15, 0.13333333], [15, 39, 0.125], [39, 59, 0.1], [59, 81, 0.13636364], [81, 91, 0.0], [91, 113, 0.27272727], [113, 131, 0.0], [131, 159, 0.10714286], [159, 208, 0.08163265], [208, 367, 0.01257862], [367, 386, 0.10526316], [386, 404, 0.0], [404, 425, 0.04761905], [425, 500, 0.02666667], [500, 569, 0.05797101], [569, 621, 0.07692308], [621, 635, 0.07142857], [635, 731, 0.02083333], [731, 805, 0.04054054], [805, 985, 0.01666667], [985, 1118, 0.06766917], [1118, 1131, 0.0], [1131, 1280, 0.10067114], [1280, 1312, 0.125], [1312, 1425, 0.11504425], [1425, 1441, 0.125], [1441, 1521, 0.125], [1521, 1578, 0.12280702], [1578, 1759, 0.09392265], [1759, 1894, 0.0962963], [1894, 2021, 0.1023622], [2021, 2081, 0.1], [2081, 2093, 0.08333333], [2093, 2179, 0.09302326]], "rps_doc_ml_palm_score": [[0, 2179, 0.82689428]], "rps_doc_ml_wikipedia_score": [[0, 2179, null]], "rps_doc_ml_wikiref_score": [[0, 2179, 0.44694072]], "rps_doc_books_importance": [[0, 2179, -130.59596704]], "rps_doc_openwebtext_importance": [[0, 2179, -62.32294001]], "rps_doc_wikipedia_importance": [[0, 2179, 37.72867802]], "rps_doc_num_sentences": [[0, 2179, 57.0]]} |
Applause: Automated, Manual, and Regression Testing
Applause Company Profile
Automated, manual, and regression testing each come with their own benefits, and sometimes their own shortcomings, so knowing which to opt for is not always obvious.
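To make the distinction concrete, here is a minimal sketch of an automated regression test in Python's pytest style; the discount function and its 10% cap are invented for illustration, not taken from the episode or from Applause's tooling. A manual tester would re-check behaviour like this by hand every release, while an automated regression suite pins a once-observed bug so it cannot silently return.

def apply_discount(price: float, percent: float) -> float:
    """Apply a percentage discount, capped at 10% (a hypothetical business rule)."""
    percent = min(percent, 10.0)
    return round(price * (1 - percent / 100), 2)

def test_discount_cap_regression():
    # Regression test: pins the fix for a (hypothetical) bug where the cap was ignored.
    assert apply_discount(100.0, 50.0) == 90.0

def test_typical_discount():
    assert apply_discount(80.0, 10.0) == 72.0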
Joining us to shed some light on the pros and cons of the various testing types is Jonathan Zaleski, Senior Director Of Engineering & Head Of Labs at Applause. In particular, Jonathan outlines the differences between manual and automated testing, as well as where regression testing fits in. He also introduces Applause's Quality Score, before finally sharing case studies that show Applause in action. | 2023-14/0000/en_head.json.gz/3687 | {"url": "https://em360tech.com/tech-podcasts/applause-automated-manual-and-regression-testing", "partition": "head_middle", "language": "en", "source_domain": "em360tech.com", "date_download": "2023-03-20T09:49:07Z", "digest": "sha1:WMJIN5ICKZKFWSLKK2WO6ND73RZ7QXHW"} | {"ccnet_length": [[0, 672, 672.0]], "ccnet_original_length": [[0, 672, 3726.0]], "ccnet_nlines": [[0, 672, 4.0]], "ccnet_original_nlines": [[0, 672, 183.0]], "ccnet_language_score": [[0, 672, 0.94]], "ccnet_perplexity": [[0, 672, 229.4]], "ccnet_bucket": [[0, 672, 0.0]], "rps_doc_curly_bracket": [[0, 672, 0.0]], "rps_doc_ldnoobw_words": [[0, 672, 0.0]], "rps_doc_lorem_ipsum": [[0, 672, 0.0]], "rps_doc_stop_word_fraction": [[0, 672, 0.36585366]], "rps_doc_ut1_blacklist": [[0, 672, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 672, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 672, 0.12727273]], "rps_doc_frac_chars_dupe_6grams": [[0, 672, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 672, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 672, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 672, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 672, 0.04909091]], "rps_doc_frac_chars_top_3gram": [[0, 672, 0.06545455]], "rps_doc_frac_chars_top_4gram": [[0, 672, 0.10181818]], "rps_doc_frac_all_caps_words": [[0, 672, 0.0]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 672, 0.0]], "rps_doc_frac_no_alph_words": [[0, 672, 0.15447154]], "rps_doc_frac_unique_words": [[0, 672, 0.70873786]], "rps_doc_mean_word_length": [[0, 672, 5.33980583]], "rps_doc_symbol_to_word_ratio": [[0, 672, 0.0]], "rps_doc_unigram_entropy": [[0, 672, 4.13335145]], "rps_doc_word_count": [[0, 672, 103.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 52, 0.0], [52, 77, 0.0], [77, 270, 1.0], [270, 672, 1.0]], "rps_lines_javascript_counts": [[0, 52, 0.0], [52, 77, 0.0], [77, 270, 0.0], [270, 672, 0.0]], "rps_lines_num_words": [[0, 52, 6.0], [52, 77, 3.0], [77, 270, 31.0], [270, 672, 63.0]], "rps_lines_numerical_chars_fraction": [[0, 52, 0.0], [52, 77, 0.0], [77, 270, 0.0], [270, 672, 0.0]], "rps_lines_start_with_bulletpoint": [[0, 52, 0.0], [52, 77, 0.0], [77, 270, 0.0], [270, 672, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 52, 0.09615385], [52, 77, 0.12], [77, 270, 0.01036269], [270, 672, 0.04477612]], "rps_doc_ml_palm_score": [[0, 672, 0.25933129]], "rps_doc_ml_wikipedia_score": [[0, 672, null]], "rps_doc_ml_wikiref_score": [[0, 672, 0.00095868]], "rps_doc_books_importance": [[0, 672, 10.83682003]], "rps_doc_openwebtext_importance": [[0, 672, 4.1950509]], "rps_doc_wikipedia_importance": [[0, 672, 4.04474726]], "rps_doc_num_sentences": [[0, 672, 5.0]]} |
Phil Spencer: Cult of the Lamb should be in the Game of the Year award discussion
The Xbox boss refers to the game as "Satanic Animal Crossing".
We and a whole lot of other people really seemed to enjoy Massive Monster's Cult of the Lamb, which was released back in August for PC, PlayStation, Switch and Xbox. One person who truly loves the game is the Xbox head honcho Phil Spencer. In fact, he likes it so much that he thinks it should be considered for the Game of the Year award.
Here's what he had to say about it during an interview on the Same Brain podcast:
"Satanic Animal Crossing, is what I call it. Literally, I believe Cult of the Lamb should be in the Game of the Year award discussion, that's my point of view. [It's] fantastic.
You have a cult, you have a world — it's not an island, but you've got your little cult area that you build out, [and] then you go on these column dungeon runs to level up, gain resources... It is tongue-in-cheek hilarious, it's got this really cute art style and then you realize you're sacrificing your guys..."
Have you played Cult of the Lamb yet, and do you consider it as good as Phil Spencer thinks it is?
Thanks, Pure Xbox.
REVIEW. Written by Ben Lyons on the 10th of August 2022 at 16
We've rejected the false prophets and started a cult in the name of the one true ancient deity.
Cult of the Lamb: Relics of the Old Faith is coming in early 2023
NEWS. Written by Alex Hopley on the 20th of January 2023 at 11:42
There's another major content update planned for later in the year, too.
Cult of the Lamb Reveals Major Content Planned for 2023
NEWS. Written by Alex Hopley on the 4th of January 2023 at 16:00
Massive Monster is expanding the roguelike, cult-building simulator.
Cult of the Lamb's Blood Moon Festival has started
NEWS. Written by Jonas Mäki on the 4th of November 2022 at 01:13
Celebrate the spooky season by sacrificing some loyal followers.
Cult of the Lamb already has more than 1 million players
NEWS. Written by Eirik Hyldbakk Furu on the 18th of August 2022 at 15:42
Should we be worried that so many are eager to sacrifice lambs and run a cult?
Massive Monster will be giving away custom made Cult of the Lamb controllers
NEWS. Written by Ben Lyons on the 14th of August 2022 at 11:52
They're fluffy yet freaky, as you would expect.
System: PC, PS4, PS5, Xbox One, Xbox Series X, Nintendo Switch
Developer: Massive Monster
Publisher: Devolver Digital
September 11, 2021 By Site Admin
https://empoweringnations.org/wp-content/uploads/2021/09/video-1630955132.mp4
D-Aspartic Acid
DAA Xtreme Prolact Block - 60 tablets
Hormone booster and prolactin inhibitor in one - never seen before! The experts at Olimp Laboratories have been developing nutritional supplements for...
What is D-aspartic acid - DAA?
D-aspartic acid, also known by the abbreviation DAA, is the only naturally occurring D-amino acid in the human body. It is also present in trace amounts in some foods such as soy protein or casein. In the human body, the concentration of D-aspartic acid is highest in the tissue types responsible for stimulating testosterone production, as well as the actual production of testosterone. These are the pituitary gland, the hypothalamus and the testes.
How does D-aspartic acid work?
D-aspartic acid stimulates an increase in the release of luteinizing hormone (LH) and follicle stimulating hormone (FSH) by the pituitary gland, which signal the testes to produce more testosterone. What is particularly interesting here is that, unlike the use of steroids or prohormones, there is no suppression of the body's own testosterone production after discontinuation of D-aspartic acid. In addition to this, scientific research suggests that D-aspartic acid can also significantly increase growth hormone levels in the body. Furthermore, D-aspartic acid can significantly increase nitric oxide production, which increases blood flow to the muscles. This results in a stronger muscle pump during exercise and a better supply of oxygen, nutrients and anabolic hormones to the working muscles.
A growing body of scientific research supports the role of different biochemical compounds in regulating testosterone levels. D-amino acids have a different molecular form than L-amino acids, which were previously thought to be the only naturally occurring amino acids. L-amino acids are converted to their D form by enzymes called racemases.
Scientists have found that D-amino acids - and in particular D-aspartic acid (DAA) - play important roles in the neurological and endocrine systems. D-aspartic acid (DAA) is secreted by nerve endings to increase cAMP levels and thereby regulate neurotransmission. In the endocrine system, DAA forms the precursor for N-methyl-D-aspartate (NMDA), which induces the release of gonadotropin-releasing hormone (GnRH) by the hypothalamus. GnRH then stimulates the anterior pituitary to produce luteinizing hormone (LH), which in turn acts on specific Leydig cells of the testes to ultimately increase testosterone secretion.
Scientific studies have shown that oral administration of about 3 grams of D-aspartic acid could increase testosterone levels by an average of 42% in just 12 days in healthy young men with natural testosterone levels that were within the normal physiological range.
Testosterone is by far the most important hormone for the expression of masculine characteristics. In the field of bodybuilding, the focus is often on increasing testosterone levels in order to stimulate muscle growth and achieve increased muscle mass and strength. Androgen receptors in muscle tissue and in the nucleus of muscle cells allow testosterone to exert its anabolic effects. This has become an important factor in the use of anabolic steroids, but they can damage health and have serious side effects. The use of D-aspartic acid could be a viable, if not quite as potent, alternative that both accelerates muscle building through increased endogenous testosterone levels and does not carry the health risks and potential side effects of anabolic steroids.
What are the benefits of testosterone production increased by D-aspartic acid?
Testosterone is the most important hormone for building muscle mass and strength and increased testosterone production is therefore synonymous with faster muscle and strength building. In addition to this, testosterone speeds up recovery, which means you can train more often and stimulate muscle growth more often. And since testosterone is also the primary male sex hormone, an increase in testosterone levels results in increased libido and sexual performance.
Who can benefit from supplementation with D-aspartic acid?
Male athletes who want to build muscle mass and strength faster
Dieting athletes who want to protect their hard-earned muscle mass from diet-induced muscle loss
Men over 25 who want to counteract a natural age-related reduction in testosterone production
Users of steroids or prohormones who want to restart their endogenous testosterone production more quickly after discontinuation
What are the main benefits of D-aspartic acid?
Average increase in endogenous testosterone production of over 40% in just 12 days
Build muscle mass and strength faster
Accelerated regeneration after training
Anticatabolic effect and muscle protection during a calorie-restricted diet
Increased libido and sexual performance
Increase in growth hormone levels
Increased pump and improved supply of muscles with oxygen, nutrients and anabolic hormones
Recommended intake
Although the intake recommendation may vary from manufacturer to manufacturer and preparation to preparation, the general dosage recommendation is 3 grams of DAA per day. On training days, this dose should be taken about 45 minutes before training and on non-training days first thing in the morning or last thing in the evening on an empty stomach.
The discovery of DAA appears to provide a viable alternative to steroids or other performance enhancing pharmaceuticals. Although scientific studies with DAA have not yet provided conclusive results and more detailed studies are needed, DAA has shown great promise. Users generally report good results with a noticeable increase in libido, strength and even aggression. So far, there are no known side effects of DAA.
D'Aniello, A, 'D-aspartic acid: an endogenous amino acid with an important neuroendocrine role' (2007) 53(2) Brain Research Reviews 215.
Di Fiore, MM et al, 'Opposing effects of D-aspartic acid and nitric oxide on tuning of testosterone production in mallard testis during the reproductive cycle' (2008) 6(1) Reproductive Biology and Endocrinology 28.
Gwartney, D., Testosterone (2011).
Kadi, F, 'Cellular and molecular mechanisms responsible for the action of testosterone on human skeletal muscle. A basis for illegal performance enhancement' (2008) 154(3) British Journal of Pharmacology 522.
Topo, E et al, 'The role and molecular mechanism of D-aspartic acid in the release and synthesis of LH and testosterone in humans and rats' (2009) 7(1) Reproductive Biology and Endocrinology 120.
In order to promote its application, the LiderA system created the role of LiderA facilitators (advisors): consultants with in-depth knowledge of LiderA's approach who can support the search for sustainable solutions together with developers, designers and other key stakeholders.
The main activities of LiderA's facilitators (advisors) are: acting as advisors to the customer, defining the scope, identifying opportunities for sustainable solutions, analyzing their viability, facilitating the development of sustainable solutions, contributing to the dialogue between partners, pre-evaluating sustainable performance and suggesting recommendations, organizing the evidence for certification, answering the questions posed by the verifier and ensuring compliance with LiderA's code of conduct.
LiderA system advisors should know the approach, the areas and the criteria defined in the system for evaluating the environmental dimension of the sustainability of construction (the built version), as well as how the system functions and how to proceed within it.
To build and promote knowledge of sustainable construction concepts under the LiderA system, advisor courses have been run since 2006.
Participants who successfully complete the course, pass the assessment with a mark of 80% or higher, and accept the LiderA Deontological Code of Conduct are added to the list of assessors (level 1), which is presented in alphabetical order.
Advisory experience gained by accompanying certified cases allows an advisor to progress to higher levels.
LiderA Deontological Code of Conduct
(PT version)
List of Facilitators (Advisors)
Category:1890s disestablishments in Georgia (U.S. state)
Organizations, places or other things ended or disestablished in Georgia (U.S. state) during the 1890s.
See also: Category:1890s establishments in Georgia (U.S. state)
1894 disestablishments in Georgia (U.S. state) (1 P)
Retrieved from "https://en.wikipedia.org/w/index.php?title=Category:1890s_disestablishments_in_Georgia_(U.S._state)&oldid=943433369" | 2023-14/0000/en_head.json.gz/3692 | {"url": "https://en.m.wikipedia.org/wiki/Category:1890s_disestablishments_in_Georgia_(U.S._state)", "partition": "head_middle", "language": "en", "source_domain": "en.m.wikipedia.org", "date_download": "2023-03-20T11:13:48Z", "digest": "sha1:H7FWONURIS6PO72CFHX6ARVUXLEUTSYY"} | {"ccnet_length": [[0, 414, 414.0]], "ccnet_original_length": [[0, 414, 1360.0]], "ccnet_nlines": [[0, 414, 5.0]], "ccnet_original_nlines": [[0, 414, 62.0]], "ccnet_language_score": [[0, 414, 0.94]], "ccnet_perplexity": [[0, 414, 187.2]], "ccnet_bucket": [[0, 414, 0.0]], "rps_doc_curly_bracket": [[0, 414, 0.0]], "rps_doc_ldnoobw_words": [[0, 414, 0.0]], "rps_doc_lorem_ipsum": [[0, 414, 0.0]], "rps_doc_stop_word_fraction": [[0, 414, 0.12]], "rps_doc_ut1_blacklist": [[0, 414, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 414, 0.0]], "rps_doc_frac_chars_dupe_5grams": [[0, 414, 0.0]], "rps_doc_frac_chars_dupe_6grams": [[0, 414, 0.0]], "rps_doc_frac_chars_dupe_7grams": [[0, 414, 0.0]], "rps_doc_frac_chars_dupe_8grams": [[0, 414, 0.0]], "rps_doc_frac_chars_dupe_9grams": [[0, 414, 0.0]], "rps_doc_frac_chars_top_2gram": [[0, 414, 0.11009174]], "rps_doc_frac_chars_top_3gram": [[0, 414, 0.13455657]], "rps_doc_frac_chars_top_4gram": [[0, 414, 0.14678899]], "rps_doc_frac_all_caps_words": [[0, 414, 0.11]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 414, 0.0]], "rps_doc_frac_no_alph_words": [[0, 414, 0.42]], "rps_doc_frac_unique_words": [[0, 414, 0.63414634]], "rps_doc_mean_word_length": [[0, 414, 7.97560976]], "rps_doc_symbol_to_word_ratio": [[0, 414, 0.0]], "rps_doc_unigram_entropy": [[0, 414, 3.06498025]], "rps_doc_word_count": [[0, 414, 41.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 57, 0.0], [57, 164, 1.0], [164, 228, 0.0], [228, 282, 0.0], [282, 414, 0.0]], "rps_lines_javascript_counts": [[0, 57, 0.0], [57, 164, 0.0], [164, 228, 0.0], [228, 282, 0.0], [282, 414, 0.0]], "rps_lines_num_words": [[0, 57, 6.0], [57, 164, 16.0], [164, 228, 8.0], [228, 282, 8.0], [282, 414, 3.0]], "rps_lines_numerical_chars_fraction": [[0, 57, 0.07843137], [57, 164, 0.04], [164, 228, 0.07017544], [228, 282, 0.10638298], [282, 414, 0.12037037]], "rps_lines_start_with_bulletpoint": [[0, 57, 0.0], [57, 164, 0.0], [164, 228, 0.0], [228, 282, 0.0], [282, 414, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 57, 0.07017544], [57, 164, 0.03738318], [164, 228, 0.078125], [228, 282, 0.07407407], [282, 414, 0.03787879]], "rps_doc_ml_palm_score": [[0, 414, 0.70193976]], "rps_doc_ml_wikipedia_score": [[0, 414, null]], "rps_doc_ml_wikiref_score": [[0, 414, 5.36e-06]], "rps_doc_books_importance": [[0, 414, -94.21188271]], "rps_doc_openwebtext_importance": [[0, 414, -19.55063084]], "rps_doc_wikipedia_importance": [[0, 414, -1.11088387]], "rps_doc_num_sentences": [[0, 414, 16.0]]} |
Maurice Blackburn
This article is about Australian politician and lawyer. For the firm he founded, see Maurice Blackburn (law firm). For the Canadian composer, see Maurice Blackburn (composer).
Maurice McCrae Blackburn (19 November 1880 – 31 March 1944)[1] was an Australian politician and socialist lawyer, noted for his protection of the interests of workers and the establishment of the legal firm known as Maurice Blackburn Lawyers.
Member of the Australian Parliament for Bourke
In office: 15 September 1934 – 7 July 1943
Preceded by: Frank Anstey
16th Speaker of the Victorian Legislative Assembly
In office: 11 October 1933 – 1 August 1934
Preceded by: William Everard
Succeeded by: Alexander Peacock
Member of the Victorian Legislative Assembly for Clifton Hill
In office: 9 April 1927 – 1 August 1934
Preceded by: Electorate established
Succeeded by: Bert Cremean
Member of the Victorian Legislative Assembly for Fitzroy
In office: 4 February 1925 – 4 March 1927
Preceded by: John Billson
Succeeded by: Electorate abolished
Member of the Victorian Legislative Assembly for Essendon
In office: 26 November 1914 – 24 October 1917
Preceded by: William Watt
Succeeded by: Thomas Ryan
Born: 19 November 1880, Inglewood, Victoria Colony, British Empire
Died: 31 March 1944 (aged 63), Prahran, Victoria, Australia
Cause of death: Cerebral tumour
Resting place: Box Hill Cemetery
Political party: Labor (1914–35; 1937–41); Independent Labor (1935–37; 1941–43)
Other political affiliations: Victorian Socialist Party
Spouse: Doris Hordern (m. 1914–1944)
Parents: Maurice Blackburn Sr.; Thomasann McCrae
Education: Melbourne Grammar School
Alma mater: University of Melbourne (BA, LLB)
Biography
Blackburn was born in Inglewood, Victoria, to Maurice Blackburn, a bank manager, and his wife Thomasann Cole (née McCrae), daughter of Captain Alexander McCrae.
Following the death of his father in 1887, Blackburn and his mother moved to Melbourne, where he was educated at Melbourne Grammar School, matriculating in 1896. He attended the University of Melbourne, graduating in arts and law in 1909, and began to practise as a lawyer a year later. In the same year, he also became a member of the Victorian Socialist Party and was soon editing its newspaper, The Socialist. He had joined the Australian Labor Party in about 1908.[1]
Blackburn married Doris Amelia Hordern on 10 December 1914.[2] Two weeks earlier he had entered the Victorian Legislative Assembly as the Labor member for the Electoral district of Essendon, but lost his seat in 1917, due largely to his strong anti-war and anti-conscription stances.
Further information: No Conscription Campaign
He returned to practising law, establishing the firm Maurice Blackburn & Co. in 1919, dealing primarily in trade union law and civil liberties cases.[3]
During his time practising law, Blackburn won cases that played a key role in establishing rights most Australians now take for granted, including the 40-hour working week, wage equality for indigenous workers, and equal pay for women.[4]
Blackburn made his mark on Australian politics in 1921 when he led a successful move to have the socialisation of the means of production added to Labor's official platform (which became known as the "socialist objective"). Returning to state Parliament as the member for Fitzroy in a 1925 by-election, Blackburn introduced legislation aimed at removing discrimination against women, and opposed what he saw as repressive economic measures proposed during the depression. In 1933, he was elected Speaker of the Victorian Legislative Assembly, but resigned from the Assembly in 1934,[5] so he could contest the Federal seat of Bourke, based on the suburbs of Brunswick and Coburg in Melbourne. Although he won Bourke and held it until 1943, his relationship with the Labor Party was chequered.[6]
In late 1934 and early 1935, Blackburn acted as legal counsel to prevent the deportation of the noted Czech anti-Fascist Egon Kisch.[1]
In October 1935, he voted in favour of sanctions against Italy over the Abyssinian crisis, defying his leader John Curtin.[1]
Blackburn's support for international socialism, and his opposition to conscription,[7] frequently caused him to take positions opposed to Labor policy and, in October 1935, he was expelled over his membership of the Movement Against War and Fascism.[1] He was soon re-admitted to the ALP, but expelled again in 1941 for his support of the Australia-Soviet Friendship League.[1] His expulsion was seen as a warning to other left-wing MPs that violation of party policy was not to be tolerated. Blackburn continued to serve as the member for Bourke as an independent, voting against the Labor government's conscription bill, but he lost his seat at the 1943 election to the official Labor candidate.[1]
Blackburn died of a cerebral tumour on 31 March 1944,[1] in Prahran, Victoria, and was buried in Box Hill Cemetery, survived by his wife, two sons, a daughter, and his mother. His estate, which included a fine library, was sworn for probate at £2,552.[1] In a eulogy, Australian Prime Minister John Curtin referred to Blackburn as "one of the great servants of the people of the Commonwealth of Australia".[8] His widow, Doris, won Bourke as an Independent Labour candidate at the 1946 election and spent much of her time in Parliament promoting similar policies to those that Blackburn had supported.[2]
^ a b c d e f g h i Blackburn Abeyasekere, Susan (1979). "Blackburn, Maurice McCrae (1880–1944)". Australian Dictionary of Biography. National Centre of Biography, Australian National University. ISSN 1833-7538. Retrieved 7 February 2015.
^ a b Rasmussen, Carolyn (1993). "Blackburn, Doris Amelia (1889–1970)". Australian Dictionary of Biography. National Centre of Biography, Australian National University. ISSN 1833-7538. Retrieved 7 February 2015.
^ Day, David (2019), Maurice Blackburn: champion of the people, Scribe, ISBN 978-1-925713-78-7
^ "About Maurice Blackburn Lawyers | Australian compensation law firm".
^ "Maurice McCrae Blackburn". re-member: a database of all Victorian MPs since 1851. Parliament of Victoria. Retrieved 4 August 2022.
^ Blackburn, Susan; Australian Society for the Study of Labour History (1969), Maurice Blackburn and the Australian Labor Party, 1934-1943 : a study of principle in politics, Australian Society for the Study of Labour History, retrieved 8 March 2020
^ McKellar, John (1945), "Maurice Blackburn and the struggle for freedom", The Anti-conscription Campaign, retrieved 8 March 2020
^ Hawke, Robert. "Speech by the Prime Minister: inaugural Maurice Blackburn Memorial Lecture". Archived from the original on 23 June 2015. Retrieved 23 June 2015.
Wikimedia Commons has media related to Maurice McCrae Blackburn.
Maurice Blackburn Lawyers history of Maurice Blackburn
Victorian Parliament Profile of Former Speakers: Maurice Blackburn
Victorian Legislative Assembly
Member for Essendon, 1914–1917
Member for Fitzroy, 1925–1927 (district abolished)
Member for Clifton Hill, 1927–1934 (district created); succeeded by Herbert Cremean
Australian Parliament
Member for Bourke, 1934–1943
Retrieved from "https://en.wikipedia.org/w/index.php?title=Maurice_Blackburn&oldid=1143974974" | 2023-14/0000/en_head.json.gz/3693 | {"url": "https://en.m.wikipedia.org/wiki/Maurice_Blackburn", "partition": "head_middle", "language": "en", "source_domain": "en.m.wikipedia.org", "date_download": "2023-03-20T09:04:32Z", "digest": "sha1:BXF66DFBQAPU3333B3FHYTU3Q2UCQPC7"} | {"ccnet_length": [[0, 6938, 6938.0]], "ccnet_original_length": [[0, 6938, 8031.0]], "ccnet_nlines": [[0, 6938, 71.0]], "ccnet_original_nlines": [[0, 6938, 147.0]], "ccnet_language_score": [[0, 6938, 0.96]], "ccnet_perplexity": [[0, 6938, 154.9]], "ccnet_bucket": [[0, 6938, 0.0]], "rps_doc_curly_bracket": [[0, 6938, 0.0]], "rps_doc_ldnoobw_words": [[0, 6938, 0.0]], "rps_doc_lorem_ipsum": [[0, 6938, 0.0]], "rps_doc_stop_word_fraction": [[0, 6938, 0.25501859]], "rps_doc_ut1_blacklist": [[0, 6938, null]], "rps_doc_frac_chars_dupe_10grams": [[0, 6938, 0.04200783]], "rps_doc_frac_chars_dupe_5grams": [[0, 6938, 0.09398362]], "rps_doc_frac_chars_dupe_6grams": [[0, 6938, 0.09398362]], "rps_doc_frac_chars_dupe_7grams": [[0, 6938, 0.07974368]], "rps_doc_frac_chars_dupe_8grams": [[0, 6938, 0.05731577]], "rps_doc_frac_chars_dupe_9grams": [[0, 6938, 0.04200783]], "rps_doc_frac_chars_top_2gram": [[0, 6938, 0.01512994]], "rps_doc_frac_chars_top_3gram": [[0, 6938, 0.03488786]], "rps_doc_frac_chars_top_4gram": [[0, 6938, 0.03310787]], "rps_doc_frac_all_caps_words": [[0, 6938, 0.00446097]], "rps_doc_frac_lines_end_with_ellipsis": [[0, 6938, 0.0]], "rps_doc_frac_no_alph_words": [[0, 6938, 0.28475836]], "rps_doc_frac_unique_words": [[0, 6938, 0.44131911]], "rps_doc_mean_word_length": [[0, 6938, 5.44907856]], "rps_doc_symbol_to_word_ratio": [[0, 6938, 0.0]], "rps_doc_unigram_entropy": [[0, 6938, 5.43224349]], "rps_doc_word_count": [[0, 6938, 1031.0]], "rps_lines_ending_with_terminal_punctution_mark": [[0, 18, 0.0], [18, 194, 1.0], [194, 437, 1.0], [437, 484, 0.0], [484, 516, 0.0], [516, 529, 0.0], [529, 580, 0.0], [580, 612, 0.0], [612, 628, 0.0], [628, 646, 0.0], [646, 708, 0.0], [708, 737, 0.0], [737, 760, 0.0], [760, 773, 0.0], [773, 830, 0.0], [830, 861, 0.0], [861, 874, 0.0], [874, 895, 0.0], [895, 953, 0.0], [953, 988, 0.0], [988, 1001, 0.0], [1001, 1013, 0.0], [1013, 1042, 0.0], [1042, 1085, 0.0], [1085, 1121, 0.0], [1121, 1150, 0.0], [1150, 1166, 0.0], [1166, 1184, 0.0], [1184, 1209, 0.0], [1209, 1246, 0.0], [1246, 1262, 0.0], [1262, 1288, 0.0], [1288, 1302, 0.0], [1302, 1320, 0.0], [1320, 1330, 0.0], [1330, 1352, 1.0], [1352, 1369, 0.0], [1369, 1394, 0.0], [1394, 1428, 0.0], [1428, 1442, 0.0], [1442, 1603, 1.0], [1603, 2078, 0.0], [2078, 2362, 1.0], [2362, 2408, 0.0], [2408, 2561, 0.0], [2561, 2800, 0.0], [2800, 3596, 0.0], [3596, 3732, 0.0], [3732, 3858, 0.0], [3858, 4560, 0.0], [4560, 5163, 0.0], [5163, 5402, 1.0], [5402, 5615, 1.0], [5615, 5711, 0.0], [5711, 5783, 1.0], [5783, 5917, 1.0], [5917, 6167, 0.0], [6167, 6297, 0.0], [6297, 6460, 1.0], [6460, 6525, 1.0], [6525, 6580, 0.0], [6580, 6647, 0.0], [6647, 6678, 0.0], [6678, 6698, 0.0], [6698, 6721, 0.0], [6721, 6740, 0.0], [6740, 6769, 0.0], [6769, 6810, 0.0], [6810, 6826, 0.0], [6826, 6844, 0.0], [6844, 6938, 0.0]], "rps_lines_javascript_counts": [[0, 18, 0.0], [18, 194, 0.0], [194, 437, 0.0], [437, 484, 0.0], [484, 516, 0.0], [516, 529, 0.0], [529, 580, 0.0], [580, 612, 0.0], [612, 628, 0.0], [628, 646, 0.0], [646, 708, 0.0], [708, 737, 0.0], [737, 760, 0.0], [760, 773, 0.0], [773, 830, 0.0], [830, 861, 0.0], [861, 874, 0.0], [874, 895, 0.0], [895, 953, 0.0], [953, 988, 0.0], 
[988, 1001, 0.0], [1001, 1013, 0.0], [1013, 1042, 0.0], [1042, 1085, 0.0], [1085, 1121, 0.0], [1121, 1150, 0.0], [1150, 1166, 0.0], [1166, 1184, 0.0], [1184, 1209, 0.0], [1209, 1246, 0.0], [1246, 1262, 0.0], [1262, 1288, 0.0], [1288, 1302, 0.0], [1302, 1320, 0.0], [1320, 1330, 0.0], [1330, 1352, 0.0], [1352, 1369, 0.0], [1369, 1394, 0.0], [1394, 1428, 0.0], [1428, 1442, 0.0], [1442, 1603, 0.0], [1603, 2078, 0.0], [2078, 2362, 0.0], [2362, 2408, 0.0], [2408, 2561, 0.0], [2561, 2800, 0.0], [2800, 3596, 0.0], [3596, 3732, 0.0], [3732, 3858, 0.0], [3858, 4560, 0.0], [4560, 5163, 0.0], [5163, 5402, 0.0], [5402, 5615, 0.0], [5615, 5711, 0.0], [5711, 5783, 0.0], [5783, 5917, 0.0], [5917, 6167, 0.0], [6167, 6297, 0.0], [6297, 6460, 0.0], [6460, 6525, 0.0], [6525, 6580, 0.0], [6580, 6647, 0.0], [6647, 6678, 0.0], [6678, 6698, 0.0], [6698, 6721, 0.0], [6721, 6740, 0.0], [6740, 6769, 0.0], [6769, 6810, 0.0], [6810, 6826, 0.0], [6826, 6844, 0.0], [6844, 6938, 0.0]], "rps_lines_num_words": [[0, 18, 2.0], [18, 194, 26.0], [194, 437, 38.0], [437, 484, 7.0], [484, 516, 7.0], [516, 529, 2.0], [529, 580, 7.0], [580, 612, 7.0], [612, 628, 2.0], [628, 646, 2.0], [646, 708, 9.0], [708, 737, 7.0], [737, 760, 2.0], [760, 773, 2.0], [773, 830, 8.0], [830, 861, 7.0], [861, 874, 2.0], [874, 895, 2.0], [895, 953, 8.0], [953, 988, 7.0], [988, 1001, 2.0], [1001, 1013, 2.0], [1013, 1042, 3.0], [1042, 1085, 5.0], [1085, 1121, 5.0], [1121, 1150, 3.0], [1150, 1166, 2.0], [1166, 1184, 3.0], [1184, 1209, 3.0], [1209, 1246, 4.0], [1246, 1262, 2.0], [1262, 1288, 3.0], [1288, 1302, 2.0], [1302, 1320, 2.0], [1320, 1330, 1.0], [1330, 1352, 3.0], [1352, 1369, 2.0], [1369, 1394, 3.0], [1394, 1428, 5.0], [1428, 1442, 1.0], [1442, 1603, 24.0], [1603, 2078, 81.0], [2078, 2362, 44.0], [2362, 2408, 5.0], [2408, 2561, 23.0], [2561, 2800, 37.0], [2800, 3596, 125.0], [3596, 3732, 22.0], [3732, 3858, 20.0], [3858, 4560, 111.0], [4560, 5163, 99.0], [5163, 5402, 34.0], [5402, 5615, 26.0], [5615, 5711, 12.0], [5711, 5783, 8.0], [5783, 5917, 19.0], [5917, 6167, 37.0], [6167, 6297, 17.0], [6297, 6460, 24.0], [6460, 6525, 9.0], [6525, 6580, 7.0], [6580, 6647, 8.0], [6647, 6678, 3.0], [6678, 6698, 3.0], [6698, 6721, 3.0], [6721, 6740, 3.0], [6740, 6769, 3.0], [6769, 6810, 6.0], [6810, 6826, 2.0], [6826, 6844, 3.0], [6844, 6938, 3.0]], "rps_lines_numerical_chars_fraction": [[0, 18, 0.0], [18, 194, 0.0], [194, 437, 0.05508475], [437, 484, 0.0], [484, 516, 0.35483871], [516, 529, 0.0], [529, 580, 0.04], [580, 612, 0.35483871], [612, 628, 0.0], [628, 646, 0.0], [646, 708, 0.0], [708, 737, 0.35714286], [737, 760, 0.0], [760, 773, 0.0], [773, 830, 0.0], [830, 861, 0.33333333], [861, 874, 0.0], [874, 895, 0.0], [895, 953, 0.0], [953, 988, 0.35294118], [988, 1001, 0.0], [1001, 1013, 0.0], [1013, 1042, 0.58333333], [1042, 1085, 0.0], [1085, 1121, 0.55172414], [1121, 1150, 0.0], [1150, 1166, 0.0], [1166, 1184, 0.0], [1184, 1209, 0.57142857], [1209, 1246, 0.36363636], [1246, 1262, 0.0], [1262, 1288, 0.0], [1288, 1302, 0.0], [1302, 1320, 0.57142857], [1320, 1330, 0.0], [1330, 1352, 0.0], [1352, 1369, 0.0], [1369, 1394, 0.0], [1394, 1428, 0.0], [1428, 1442, 0.0], [1442, 1603, 0.0], [1603, 2078, 0.03695652], [2078, 2362, 0.04], [2362, 2408, 0.0], [2408, 2561, 0.03472222], [2561, 2800, 0.01304348], [2800, 3596, 0.02842377], [3596, 3732, 0.06923077], [3732, 3858, 0.04166667], [3858, 4560, 0.02363368], [4560, 5163, 0.03130435], [5163, 5402, 0.11363636], [5402, 5615, 0.12886598], [5615, 5711, 0.20987654], [5711, 5783, 0.0], [5783, 5917, 0.07317073], [5917, 6167, 
0.0720339], [6167, 6297, 0.07627119], [6297, 6460, 0.07894737], [6460, 6525, 0.0], [6525, 6580, 0.0], [6580, 6647, 0.0], [6647, 6678, 0.0], [6678, 6698, 0.0], [6698, 6721, 0.36363636], [6721, 6740, 0.0], [6740, 6769, 0.28571429], [6769, 6810, 0.0], [6810, 6826, 0.0], [6826, 6844, 0.0], [6844, 6938, 0.12658228]], "rps_lines_start_with_bulletpoint": [[0, 18, 0.0], [18, 194, 0.0], [194, 437, 0.0], [437, 484, 0.0], [484, 516, 0.0], [516, 529, 0.0], [529, 580, 0.0], [580, 612, 0.0], [612, 628, 0.0], [628, 646, 0.0], [646, 708, 0.0], [708, 737, 0.0], [737, 760, 0.0], [760, 773, 0.0], [773, 830, 0.0], [830, 861, 0.0], [861, 874, 0.0], [874, 895, 0.0], [895, 953, 0.0], [953, 988, 0.0], [988, 1001, 0.0], [1001, 1013, 0.0], [1013, 1042, 0.0], [1042, 1085, 0.0], [1085, 1121, 0.0], [1121, 1150, 0.0], [1150, 1166, 0.0], [1166, 1184, 0.0], [1184, 1209, 0.0], [1209, 1246, 0.0], [1246, 1262, 0.0], [1262, 1288, 0.0], [1288, 1302, 0.0], [1302, 1320, 0.0], [1320, 1330, 0.0], [1330, 1352, 0.0], [1352, 1369, 0.0], [1369, 1394, 0.0], [1394, 1428, 0.0], [1428, 1442, 0.0], [1442, 1603, 0.0], [1603, 2078, 0.0], [2078, 2362, 0.0], [2362, 2408, 0.0], [2408, 2561, 0.0], [2561, 2800, 0.0], [2800, 3596, 0.0], [3596, 3732, 0.0], [3732, 3858, 0.0], [3858, 4560, 0.0], [4560, 5163, 0.0], [5163, 5402, 0.0], [5402, 5615, 0.0], [5615, 5711, 0.0], [5711, 5783, 0.0], [5783, 5917, 0.0], [5917, 6167, 0.0], [6167, 6297, 0.0], [6297, 6460, 0.0], [6460, 6525, 0.0], [6525, 6580, 0.0], [6580, 6647, 0.0], [6647, 6678, 0.0], [6678, 6698, 0.0], [6698, 6721, 0.0], [6721, 6740, 0.0], [6740, 6769, 0.0], [6769, 6810, 0.0], [6810, 6826, 0.0], [6826, 6844, 0.0], [6844, 6938, 0.0]], "rps_lines_uppercase_letter_fraction": [[0, 18, 0.11111111], [18, 194, 0.05113636], [194, 437, 0.04115226], [437, 484, 0.08510638], [484, 516, 0.0625], [516, 529, 0.15384615], [529, 580, 0.07843137], [580, 612, 0.0625], [612, 628, 0.125], [628, 646, 0.11111111], [646, 708, 0.09677419], [708, 737, 0.06896552], [737, 760, 0.04347826], [760, 773, 0.15384615], [773, 830, 0.0877193], [830, 861, 0.06451613], [861, 874, 0.15384615], [874, 895, 0.04761905], [895, 953, 0.0862069], [953, 988, 0.05714286], [988, 1001, 0.15384615], [1001, 1013, 0.16666667], [1013, 1042, 0.03448276], [1042, 1085, 0.11627907], [1085, 1121, 0.02777778], [1121, 1150, 0.10344828], [1150, 1166, 0.0625], [1166, 1184, 0.16666667], [1184, 1209, 0.04], [1209, 1246, 0.05405405], [1246, 1262, 0.0625], [1262, 1288, 0.11538462], [1288, 1302, 0.14285714], [1302, 1320, 0.0], [1320, 1330, 0.1], [1330, 1352, 0.13636364], [1352, 1369, 0.17647059], [1369, 1394, 0.12], [1394, 1428, 0.20588235], [1428, 1442, 0.14285714], [1442, 1603, 0.08074534], [1603, 2078, 0.04], [2078, 2362, 0.04225352], [2362, 2408, 0.08695652], [2408, 2561, 0.02614379], [2561, 2800, 0.0125523], [2800, 3596, 0.02763819], [3596, 3732, 0.04411765], [3732, 3858, 0.04761905], [3858, 4560, 0.03133903], [4560, 5163, 0.039801], [5163, 5402, 0.09205021], [5402, 5615, 0.09389671], [5615, 5711, 0.09375], [5711, 5783, 0.06944444], [5783, 5917, 0.08208955], [5917, 6167, 0.072], [6167, 6297, 0.06923077], [6297, 6460, 0.0797546], [6460, 6525, 0.09230769], [6525, 6580, 0.09090909], [6580, 6647, 0.10447761], [6647, 6678, 0.09677419], [6678, 6698, 0.1], [6698, 6721, 0.04347826], [6721, 6740, 0.10526316], [6740, 6769, 0.03448276], [6769, 6810, 0.09756098], [6810, 6826, 0.125], [6826, 6844, 0.11111111], [6844, 6938, 0.03191489]], "rps_doc_ml_palm_score": [[0, 6938, 0.71415544]], "rps_doc_ml_wikipedia_score": [[0, 6938, null]], "rps_doc_ml_wikiref_score": [[0, 
6938, 0.96946299]], "rps_doc_books_importance": [[0, 6938, -343.1820089]], "rps_doc_openwebtext_importance": [[0, 6938, 21.8908404]], "rps_doc_wikipedia_importance": [[0, 6938, 251.15726415]], "rps_doc_num_sentences": [[0, 6938, 57.0]]} |
"Lawrence Philips" redirects here. For the football player, see Lawrence Phillips.
Metaphone is a phonetic algorithm, published by Lawrence Philips in 1990, for indexing words by their English pronunciation.[1] It fundamentally improves on the Soundex algorithm by using information about variations and inconsistencies in English spelling and pronunciation to produce a more accurate encoding, which does a better job of matching words and names which sound similar. As with Soundex, similar-sounding words should share the same keys. Metaphone is available as a built-in operator in a number of systems.
Philips later produced a new version of the algorithm, which he named Double Metaphone. Contrary to the original algorithm whose application is limited to English only, this version takes into account spelling peculiarities of a number of other languages. In 2009 Philips released a third version, called Metaphone 3, which achieves an accuracy of approximately 99% for English words, non-English words familiar to Americans, and first names and family names commonly found in the United States, having been developed according to modern engineering standards against a test harness of prepared correct encodings.
Procedure
Original Metaphone codes use the 16 consonant symbols 0BFHJKLMNPRSTWXY. The '0' represents "th" (as an ASCII approximation of Θ), 'X' represents "sh" or "ch", and the others represent their usual English pronunciations. The vowels AEIOU are also used, but only at the beginning of the code.[2] This table summarizes most of the rules in the original implementation:
Drop duplicate adjacent letters, except for C.
If the word begins with 'KN', 'GN', 'PN', 'AE', 'WR', drop the first letter.
Drop 'B' if after 'M' at the end of the word.
'C' transforms to 'X' if followed by 'IA' or 'H' (unless, in the latter case, it is part of '-SCH-', in which case it transforms to 'K'). 'C' transforms to 'S' if followed by 'I', 'E', or 'Y'. Otherwise, 'C' transforms to 'K'.
'D' transforms to 'J' if followed by 'GE', 'GY', or 'GI'. Otherwise, 'D' transforms to 'T'.
Drop 'G' if followed by 'H' and 'H' is not at the end or before a vowel. Drop 'G' if followed by 'N' or 'NED' and is at the end.
'G' transforms to 'J' if before 'I', 'E', or 'Y', and it is not in 'GG'. Otherwise, 'G' transforms to 'K'.
Drop 'H' if after vowel and not before a vowel.
'CK' transforms to 'K'.
'PH' transforms to 'F'.
'Q' transforms to 'K'.
'S' transforms to 'X' if followed by 'H', 'IO', or 'IA'.
'T' transforms to 'X' if followed by 'IA' or 'IO'. 'TH' transforms to '0'. Drop 'T' if followed by 'CH'.
'V' transforms to 'F'.
'WH' transforms to 'W' if at the beginning. Drop 'W' if not followed by a vowel.
'X' transforms to 'S' if at the beginning. Otherwise, 'X' transforms to 'KS'.
Drop 'Y' if not followed by a vowel.
'Z' transforms to 'S'.
Drop all vowels unless it is the beginning.
This table does not constitute a complete description of the original Metaphone algorithm, and the algorithm cannot be coded correctly from it. Original Metaphone contained many errors and was superseded by Double Metaphone, and in turn Double Metaphone and original Metaphone were superseded by Metaphone 3, which corrects thousands of miscodings that will be produced by the first two versions.
To implement Metaphone without purchasing a (source code) copy of Metaphone 3, the reference implementation of Double Metaphone can be used.[3] Alternatively, version 2.1.3 of Metaphone 3, an earlier 2009 version without a number of encoding corrections made in the current version, version 2.5.4, has been made available under the terms of the BSD License via the OpenRefine project.[4]
Double Metaphone
The Double Metaphone phonetic encoding algorithm is the second generation of this algorithm. Its implementation was described in the June 2000 issue of C/C++ Users Journal.[5] It makes a number of fundamental design improvements over the original Metaphone algorithm.
It is called "Double" because it can return both a primary and a secondary code for a string; this accounts for some ambiguous cases as well as for multiple variants of surnames with common ancestry. For example, encoding the name "Smith" yields a primary code of SM0 and a secondary code of XMT, while the name "Schmidt" yields a primary code of XMT and a secondary code of SMT—both have XMT in common.
Double Metaphone tries to account for myriad irregularities in English of Slavic, Germanic, Celtic, Greek, French, Italian, Spanish, Chinese, and other origins. Thus it uses a much more complex ruleset for coding than its predecessor; for example, it tests for approximately 100 different contexts of the use of the letter C alone.
Metaphone 3
A professional version was released in October 2009, developed by the same author, Lawrence Philips. It is a commercial product sold as source code. Metaphone 3 further improves phonetic encoding of words in the English language, non-English words familiar to Americans, and first names and family names commonly found in the United States. It improves encoding for proper names in particular to a considerable extent.[6] The author claims that in general it improves accuracy for all words from the approximately 89% of Double Metaphone to 98%. Developers can also now set switches in code to cause the algorithm to encode Metaphone keys 1) taking non-initial vowels into account, as well as 2) encoding voiced and unvoiced consonants differently. This allows the result set to be more closely focused if the developer finds that the search results include too many words that don't resemble the search term closely enough.[7] Metaphone 3 is sold as C++, Java, C#, PHP, Perl, and PL/SQL source, Ruby and Python wrappers accessing a Java jar, and also Metaphone 3 for Spanish and German pronunciation available as Java and C# source.[8] The latest revision of the Metaphone 3 algorithm is v2.5.4, released March 2015. The Metaphone3 Java source code for an earlier version, 2.1.3, lacking a large number of encoding corrections made in the current version, version 2.5.4, was included as part of the OpenRefine project and is publicly viewable.[9]
Common misconceptions
There are some misconceptions about the Metaphone algorithms that should be addressed. The following statements are true:
All of them are designed to address regular, "dictionary" words, not just names, and
Metaphone algorithms do not produce phonetic representations of the input words and names; rather, the output is an intentionally approximate phonetic representation, according to this standard:
words that start with a vowel sound will have an 'A', representing any vowel, as the first character of the encoding (in Double Metaphone and Metaphone 3 - original Metaphone just preserves the actual vowel),
vowels after an initial vowel sound will be disregarded and not encoded, and
voiced/unvoiced consonant pairs will be mapped to the same encoding. (Examples of voiced/unvoiced consonant pairs are D/T, B/P, Z/S, G/K, etc.).
This approximate encoding is necessary to account for the way English speakers vary their pronunciations and misspell or otherwise vary words and names they are trying to spell. Vowels, of course, are notoriously highly variable. British speakers often complain that Americans seem to pronounce 'T's the same as 'D'. Consider, also, that all English speakers often pronounce 'Z' where 'S' is spelled, almost always when a noun ending in a voiced consonant or a liquid is pluralized, for example "seasons", "beams", "examples", etc. Not encoding vowels after an initial vowel sound will help to group words where a vowel and a consonant may be transposed in the misspelling or alternative pronunciation.
Metaphone of other languages
Metaphone is useful for English variants and other languages, having been preferred to Soundex in several Indo-European languages. On the other hand, rough phonetic encoding causes language dependency — or, in a language variant, average language-speaker dependency — mainly for non-English variants.
Perhaps the first example of a stable non-English Metaphone adaptation was Brazilian Portuguese: it originated around 2008 as a database solution in the Várzea Paulista municipality of Brazil, and it evolved into the current metaphone-ptbr algorithm.
Caverphone
New York State Identification and Intelligence System
Match Rating Approach
Approximate string matching
^ Hanging on the Metaphone, Lawrence Philips. Computer Language, Vol. 7, No. 12 (December), 1990.
^ "Morfoedro - Technology". www.morfoedro.it. Retrieved 16 May 2018.
^ http://aspell.net/metaphone/dmetaph.cpp
^ "OpenRefine". GitHub. 19 May 2022.
^ Philips, Lawrence (June 2000). "The double metaphone search algorithm". C/C++ Users Journal. 18 (6): 38–43.
^ Guy, I.; Ur, S.; Ronen, I.; Weber, S.; ... (2012). "Best Faces Forward: A Large-scale Study of People Search in the Enterprise". http://www.research.ibm.com/haifa/dept/imt/papers/guyCHI12.pdf
^ Atkinson, Kevin. "Lawrence Philips' Metaphone Algorithm". aspell.net. Retrieved 16 May 2018.
^ "Anthropomorphic Software". www.amorphics.com. Retrieved 16 May 2018.
^ "OpenRefine source for Metaphone3". github.com. Retrieved 2 Nov 2020.
The Double Metaphone Search Algorithm, by Lawrence Philips, June 1, 2000, Dr. Dobb's, Original article
Metaphone algorithms for other languages
Brazilian Portuguese in C: Metaphone for Brazilian Portuguese, in C, with PHP and PostgreSQL ports.
Brazilian Portuguese in Java: Metaphone for Brazilian Portuguese, in Java.
Spanish Metaphone in Python
Double Metaphone algorithm for Bangla
Double Metaphone algorithm for Amharic
Russian Metaphone in Ruby.
Double Metaphone and Metaphone in JavaScript
Retrieved from "https://en.wikipedia.org/w/index.php?title=Metaphone&oldid=1132575313"
2023-14/0000/en_head.json.gz/3694 | {"url": "https://en.m.wikipedia.org/wiki/Metaphone", "partition": "head_middle", "language": "en", "source_domain": "en.m.wikipedia.org", "date_download": "2023-03-20T10:09:20Z", "digest": "sha1:EQTTGRDP4TN6WZDD6T3F2EYUS5IHEOKV"} | quality_signals: [per-span ccnet and RedPajama metrics omitted]
April electricity tariffs to fall by at least 2 cents per KWh
Suppliers spared of €10/MWh cost on electricity producers
Energean plc: First gas from NEA / NI, offshore Egypt
Debt worth €123m owed by three major industries a concern for DEPA
DEPA, the Public Gas Corporation, currently an entirely debt-free enterprise with a total turnover of 900 million euros in 2015, is troubled by significant unpaid debts owed by three major industrial consumers: the fertilizer producer ELFE, which owes 90 million euros; the public road transportation company OSY (23 million euros); and the Hellenic Sugar Industry (10 million euros).
The unpaid debt owed by OSY, which uses natural gas to fuel its buses, is increasing at a rate of 700,000 euros per month. The debt amount owed by the Hellenic Sugar Industry is currently steady as a result of seasonal factors.
As for ELFE, recently renamed PFIC by its financially troubled owner, Lavrentis Lavrentiadis, DEPA is keeping a close watch on the entrepreneur’s maneuverings.
Besides the 90 million euros owed to DEPA, the fertilizer producer also owes the Greek state a further 60 million euros; its bank debt stands at 150 million euros; and the European Commission's Directorate-General (DG) for Competition has also taken action against the firm in a case concerning state aid.
DEPA is also owed money by industrial enterprises that have gone out of business, such as the paper producer Diana, which owes 2.5 million euros to the gas company. Such amounts will most likely have to be written off as bad debt.
2023-14/0000/en_head.json.gz/3695 | {"url": "https://energypress.eu/depa-troubled-by-123m-euro-debt-amount-owed-by-three-major-industries/", "partition": "head_middle", "language": "en", "source_domain": "energypress.eu", "date_download": "2023-03-20T08:47:59Z", "digest": "sha1:LXF3V5BIDEFLJKINLGUCGHJ32TLSJH5P"} | quality_signals: [per-span ccnet and RedPajama metrics omitted]
THE ONE THING YOU SHOULD CHANGE FOR 2023… SLEEP!
When we have too many things to get done, one of the first things we tend to compromise on is usually sleep. Most of us think we can function on less than 7 hrs of sleep, but we may be doing more harm than we realize, and here's why:
MIDNIGHT MADNESS: As the body nears midnight, our system begins to release cortisol, a stress hormone that can cause inflammation, digestive issues, higher cholesterol levels, and many other health problems. Each hour past midnight, cortisol levels continue to rise rapidly.
HORMONES & IMMUNE SYSTEM: From about 9pm – 1am our hormones are balanced, our immune system gets strengthened, and as the night progresses our waste system goes into play. If we are awake during these hours, our immune system will almost always be compromised, we could suffer with hormone imbalances and have trouble with digestion.
SCIENTIFIC STUDY: The National Heart, Lung and Blood Institute (NHLBI) did a study that confirmed the link between lack of sleep and high cholesterol levels as mentioned above. The study also showed that sleep deficiency contributes to high blood pressure, diabetes, stroke, obesity, and depression. In my experience of working with people for more than a decade, I would say that this is 100% true.
Getting to bed early and getting a solid 7-9 hrs of uninterrupted sleep is probably one of the most important health decisions you could make each day.
https://www.nhlbi.nih.gov/health/sleep-deprivation
2023-14/0000/en_head.json.gz/3696 | {"url": "https://energyshackjuicebar.com/the-one-thing-you-should-change-for-2023-sleep/", "partition": "head_middle", "language": "en", "source_domain": "energyshackjuicebar.com", "date_download": "2023-03-20T10:31:17Z", "digest": "sha1:BXOFBNEBETTFTPUQFTOARJFP6GLZLNMH"} | quality_signals: [per-span ccnet and RedPajama metrics omitted]
How to get a job as a security guard
To get a job as a security guard, you do not have to be Superman. It is enough to be in good physical shape and to meet certain other requirements of the security industry. What exactly do you need in order to get a job as a security guard?
How to write a CV for a security guard
How to get a job at the MOE
How to make watchmen
A diploma from a security guard school. As a rule, the schooling is paid and lasts about three months. At the end of the training you will have to pass a qualifying examination before a commission under the authority of the interior ministry. A diploma is awarded only to those future guards who successfully pass the examination.
Obtaining a security guard license. The guard card is issued to those who already have a diploma from a guard school. Besides the diploma, you must provide a number of documents, including a medical certificate confirming that your health meets the established standards. The lack of a diploma or license will not be a hindrance when applying for a job as a security guard if you are not planning to stay in this field for long. In any city of Russia there are private security enterprises (ChOP) employing security guards without any qualification. But be prepared for the fact that finding a job that meets your expectations for salary and working conditions will be much more difficult. In reputable organizations, having a license is a requirement.
Browse newspaper ads in your city and career sites on the Internet; job offers for security guards are quite plentiful.
Don't be afraid to put your candidacy forward to a ChOP or another business that interests you, even if you have not found any advertised vacancies. But first make sure that you meet the company's requirements. Most security companies want to see among their employees security guards with:
•a guard license,
•a driver's license,
•completed service in the armed forces,
•positive references from the previous place of work,
•higher education (for personal security guards and bodyguards). Sometimes there are height restrictions; the optimal height is 175-185 cm. Age also matters: if you are at least 21 and not more than 50, getting a job as a security guard will be easier. An additional advantage can be successful experience in special units of the Interior Ministry or the FSB of Russia, a belt or title in martial arts, or the title of master of sports. Some security firms value employees who speak foreign languages.
Does a guard have the right to detain a customer?
How to get into the MoE
How to check your guard card
How to obtain a license security guard in Saint Petersburg
How to get a job as a firefighter
Where to go after the army
How to enter service in the MOE
How to avoid scams when applying for a job
Searching the Internet, you can easily run into scams. About fake vacancies, virtual employers, and the...
Is initiative at work punishable?
What can be said about the former employer when applying for a job
Useful tips for graduates: how to become competitive in the labour market
Does business etiquette really matter that much?
2023-14/0000/en_head.json.gz/3697 | {"url": "https://eng.kakprosto.ru/how-45192-how-to-get-a-guard", "partition": "head_middle", "language": "en", "source_domain": "eng.kakprosto.ru", "date_download": "2023-03-20T08:48:17Z", "digest": "sha1:LJ54MNMX2F3T4S3P7YRPDL62RVEX5RNK"} | quality_signals: [per-span ccnet and RedPajama metrics omitted]
Extension Spotlight: A return to the Teller County gardens
By Mark Platten
After a year of events being canceled due to Covid, it was exciting to be able to host a Teller County greenhouse and garden tour on August 28 and 29. The collaboration between the Colorado Master Gardeners of Teller County, the Woodland Park Harvest Center, and Community Partnership Family Resource Center resulted in fourteen community members opening their greenhouses and gardens for the community to view and ask questions.
It might not seem like a big deal to have a garden tour, after all, a lot of cities do this type of thing. But those of us who live in the mountains know that any trick or idea that can improve the odds of harvesting the desired vegetable is worth any effort. The elevation of the gardens on our tour ranged from 7,000 feet to over 10,000 feet.
We have a lot of transplanted people from around the country who are used to gardening near sea level where you can throw seeds on the ground and, with very little work, have more produce than you know what to do with. Obviously, that’s not the case in the mountains where we battle low moisture, winds, hail, poor soil and daily temperature variations of up to 50 degrees. Even once you learn how to mitigate these challenges, the deer, pocket gophers and ground squirrels will remind you that you’re not in control! No, in the mountains you have to be tenacious and take care of your plants like a protective parent with a newborn.
So, when we have a greenhouse and garden tour, it isn’t just for people to stroll through and admire what others have done. It’s a full-blown investigation to determine exactly what others are doing to achieve results. Though we certainly promote the Extension research-based fact sheets, the old adage “necessity is the mother of all invention” leads to a lot of creative ways that people have solved their problems that we just don’t have research to support. We’re pretty much all MacGyvers in these seemingly inhospitable environments.
The Teller County Extension garden tours invited community members to explore 14 local gardens.
True to our mountain nature, we made the tour free because we already have enough issues with gardening and didn’t want to create another barrier. We also allowed participants to choose where to start and how many sites they wanted to check out. Somehow this potential bottleneck never manifested and we had between 50-60 people at each of the 14 sites, except for one, our County Commissioner Dan Williams, whom I was able to get on the tour. His property is at 10,100 feet and he has a few raised beds and four greenhouses, a couple from purchased kits and a couple built the old-fashioned way using scraps of wood from other projects. Perhaps it’s because he is at 10,100 feet or perhaps it’s because he’s one of our commissioners, but he had almost 200 visitors over the five-hour tour!
What I heard over and over from participants is how much information and creative ideas they received from all those who volunteered their time and knowledge. After all, isn’t that what Extension is built on: the best research and the best evolving ideas? If you’re interested in what some of these gardeners are doing, you can check out some of our videos from last year’s virtual tour on the Teller County Extension website.
Tags assigned to this story: CSU Extension, Office of Engagement and Extension
© 2023 Colorado State University
2023-14/0000/en_head.json.gz/3698 | {"url": "https://engagement.source.colostate.edu/extension-spotlight-a-return-to-the-teller-county-gardens/", "partition": "head_middle", "language": "en", "source_domain": "engagement.source.colostate.edu", "date_download": "2023-03-20T08:52:48Z", "digest": "sha1:BC4IAXS2MLKH44LU54P4LDPRMB5W352N"} | quality_signals: [per-span ccnet and RedPajama metrics omitted]
An open invitation
Azza Radwan Sedky
Tuesday 7 Feb 2023
Egypt is a convenient and inexpensive option for those looking for a warm but less-expensive holiday or even retirement destination.
Egyptian soprano Fatma Said serenaded audiences at the first public event to be held at the new Grand Egyptian Museum (GEM) last month, inviting people to visit her home country by saying “open invitation: come and visit Egypt.”
I could go a step further and recommend that visitors decide to stay in Egypt after making a first short visit. In fact, many foreigners choose to do just that.
My accountant in Vancouver in Canada once asked me if it was costly to live in Egypt. My response was an adamant “no!” I realised afterwards that he was thinking of spending half the year elsewhere, not necessarily in Egypt but just to be able to afford to live for the remaining six months in Vancouver.
He later said that he felt it was time to split the year between Vancouver and a less-expensive location. Many North Americans choose to do just that.
It costs a family of four close to CA$5,000 (about $3,730) a month not counting rent to live in Vancouver, while it costs a single person around CA$1,300. The average rent for a one-bedroom apartment is now CA$2,500. It goes without saying that Vancouver, though exquisite, is an exorbitantly expensive city.
The cost of living is even higher in New York across the border in the US. A family of four has estimated monthly costs of $5,250 without counting rent. A single person’s estimated monthly costs are $1,431 without rent. In fact, Numbeo, a cost-of-living database, estimates that New York is 367.1 per cent more expensive than Cairo, while rent is 2,147 per cent higher.
In the UK, it is estimated that it costs a family of four £2,400 a month to live not counting rent. A family of four needs around €3,000 per month excluding rent to live in parts of France.
Such prohibitively expensive locations force residents to spend part of the year elsewhere or to retire in other countries. Fortunately, many inexpensive retirement havens exist around the world.
Mexico is possibly the most desirable option for North Americans. Thousands of Canadian retirees, or “snowbirds” as they have come to be known, travel southwards to sunnier climes such as Palm Springs in California and Miami in Florida. However, living expenses remain high in those destinations despite the warmth that many Canadians and Americans like to enjoy.
What about Europeans? Where do they head when they want to live or retire abroad?
Egypt may be a convenient and inexpensive option for those looking for a warm but less-expensive destination. Egypt’s low cost of living remains an enticing feature. Expats can extend their pensions and social security benefits to last longer in cities around Egypt. With the exchange rate of the Egyptian pound to the British pound now being £0.027 (LE37.20 to one British pound) and $0.033 (LE30) to one US dollar, Egypt is very appealing to those looking for a destination in which they can retire with a high quality of life and a low cost of living.
The US news network CNBC says that “a recent analysis by personal-finance site GOBankingRates found that Egypt is the destination for retirees who want to stretch their social security benefits, thanks to affordable food and housing expenses.” It also clarifies that US citizens residing in Egypt are exempt from US taxes on their benefits.
Cities such as Dahab and ElGouna are attractive spots for those seeking affordability amidst tranquility. Both cities are sunny, friendly, and, just as importantly, inexpensive options for retirees paid in hard currency, and they are a mere four-hour flight from most European cities.
The Red Sea coastal town of Dahab is inviting, laid back, and inexpensive, which is why many Russians, Ukrainians, Germans, and Italians have opted to live there.
According to the US network CNN, “Dahab is a destination so alluring that some have opted to make it their home soon after visiting.” According to the Atlys site, “if early morning swims in the Red Sea, getting work done over drip coffee at a world-class coworking space, kitesurfing as the sun goes down, and drinks with a vibrant expat community sound good to you, then Dahab is calling your name.”
It explains that “if you do your homework, you can get away with spending not more than $400 a month on rent, amenities, and entertainment. The prices of mobile-phone plans and Internet access are among the lowest in the world.”
Though ElGouna is a costlier and fancier spot, it remains very affordable for foreigners. It boasts year-round sunshine, excellent amenities, and is just one hop on a plane from European cities to Hurghada International Airport.
Thousands of Europeans already call ElGouna their home away from home and live permanently in this restful oasis. Journalist Michael Turtle in his travel blog “The Travel Turtle” says that expats “come from all around the world for different reasons. Some buy a property here as a holiday home, somewhere to escape to for a month or two each year when they need time away.”
Turtle talks to a couple on his blog who were originally from Yorkshire, England, and who had a business in Luxor, another warm spot in Egypt, but opted to move to ElGouna because of the downturn in tourism that came with the Ukraine war. They relocated their business, a restaurant called Puddleduck that serves authentic English food, to ElGouna. They say that many other people have bought property in the city, and some come for six weeks or more at a time and keep returning.
Even Cairo, though noisy and congested, offers an amazing opportunity for expats who prefer to be frugal with their expenses. The Internet site Expat Arrivals talked to a Canadian who had chosen to work as a teacher in Egypt who told the site that “the cost of living in Cairo was amazingly cheap. My quality of life was very good. I had an enormous apartment with three balconies and three bedrooms for $250 a month in a very trendy part of town. You could take a taxi to wherever you want, and it would cost you nothing. Food, produce, and meat from the grocery store was ridiculously cheap. Everything could be delivered to your house… even beer.” And that was even before the recent devaluation of the Egyptian pound.
There is much to see and do in Egypt, as Egypt is a country rich in history and adventure and its people are welcoming and friendly. Many visitors to the country decide to settle here having seen how inexpensive and affordable it is. For them, Egypt is the perfect spot for expats to stretch their bucks.
So here is another “open invitation” – come to visit Egypt and maybe think about staying on.
* The writer is a former professor of communication based in Vancouver, Canada.
* A version of this article appears in print in the 9 February, 2023 edition of Al-Ahram Weekly
Grand Egyptian Museum
Also In Opinion
A promising regional realignment
No simple solution
A welcome shift
Women and entrepreneurship
Point-blank :Israel’s many extremists
Modern Egypt — (XXII) The warrior
Bidding the GTC good-bye
2023-14/0000/en_head.json.gz/3699 | {"url": "https://english.ahram.org.eg/News/487704.aspx", "partition": "head_middle", "language": "en", "source_domain": "english.ahram.org.eg", "date_download": "2023-03-20T10:36:11Z", "digest": "sha1:SRA4U6APM4VUTIU2UB7MPR5DRAMTDL53"} | quality_signals: [per-span ccnet and RedPajama metrics omitted]