A statistical analysis by Peking University Library shows that when its students aren't reading course-assigned books about economics and politics, they show a preference for Japan's top thriller and mystery writer, Keigo Higashino. According to a 21-page report, Higashino's thriller Mysterious Night is the library's third-most borrowed book for 2016. Higashino is also credited with writing the two most reserved books in Peking University Library, The Miracle in the Grocery Store and Journey Under the Midnight Sun. In fact, the report describes Higashino as Peking University Library's second-most popular author, second only to the venerable Lu Xun. Higashino is a prolific author whose well-received works have been compared to popular page-turners by Stieg Larsson and Dan Brown, so it's no wonder that he's well received by Beijing's finest minds. And yet, the report shows that library-goers at Peking University don't have a preference for Chinese authors and books. The report's top-ten lists are dominated by books written outside of China, making the mention of a Chinese author a rare sight. Liu Cixin's Hugo Award-winning 2008 sci-fi novel The Three-Body Problem is one of the few works of Chinese fiction to make it onto the report, appearing on the list of most reserved books at number 7, with Celeste Ng's Everything I Never Told You at number 10. Meanwhile, Tang Dynasty poet Bai Juyi's long-form poem Changhenge is the only Chinese non-textbook to crack the library's list of most borrowed books, at #10. For a country as proud of its culture as China, you'd think the most commonly assigned course textbook would be about Chinese history. Instead, that honor goes to Alexis de Tocqueville's Democracy in America, a seminal work that is also required reading for political and social science majors at American universities.
And although books about Marxism and economics make up the majority of reading for Peking University students, there are some exceptions. Sandwiched between Introductory Econometrics and Microeconomic Theory is the library's third-most borrowed foreign-language book, a German translation of the Bible titled Die Bibel Einheitsübersetzung. Other trends at the library are more obvious. Borrowing of paper books has sharply declined, while use of electronic versions has skyrocketed. And if you're having trouble meeting girls, you'll find them at Peking University Library, where female book borrowers outnumber males by 11 percent. Want to read like a Peking University student? Here's a rundown of the top 10 most borrowed books at Peking University last year:

1. Psychology and Life – A popular introductory textbook.
2. Soulstealers: The Chinese Sorcery Scare of 1768 – An account of mass hysteria in 18th-century China, and how ruling Emperor Hungli used a brutal campaign of torture to exert supremacy over his subjects.
3. Mysterious Night – Keigo Higashino's 2004 thriller about a couple who fall in love while on the lam in Tokyo.
4. Economic and Philosophic Manuscripts of 1844 – Early writings by Karl Marx about the alienation of the worker, published only after the rise of Communism.
5. The Origin of the Family, Private Property, and the State: in the Light of the Researches of Lewis H. Morgan – Friedrich Engels' writings on family economics, partly based upon Marx's notes after his death.
6. The Clash of Civilizations and the Remaking of World Order – Samuel Huntington argues that the best way to safeguard peace is to establish a new world order based upon cultural and religious identities, which he describes as the greatest source of post-Cold War friction. Curiously, Huntington labels Japan not as part of China's Confucian sphere, but as the world's only country with its own unique civilization.
7. Animal Farm – A parable by George Orwell. In it, Boxer the horse says "Jiayou!" numerous times.
8. Suicide: A Study in Sociology – A ground-breaking 1897 study by Emile Durkheim that has since been criticized for using broad statistics to draw specific conclusions about individuals.
9. Theory of Development for the People – The only Chinese textbook on this list uses Marxism to explain economic trends in China's fast-developing society.
10. Changhenge – A long-form narrative poem written by Tang Dynasty poet Bai Juyi almost 1,200 years ago.
/** Contains tests for the {@link LocalConverter} class. */
@ExtendWith(LocalOfficeManagerExtension.class)
public class LocalConverterITest {

  private static final File SOURCE_FILE = documentFile("/test.doc");

  @Test
  public void convert_FromFileToFile_ShouldSucceeded(
      final @TempDir File testFolder, final DocumentConverter converter) {

    final File outputFile = new File(testFolder, "out.pdf");

    assertThatCode(() -> converter.convert(SOURCE_FILE).to(outputFile).execute())
        .doesNotThrowAnyException();

    assertThat(outputFile).isFile();
    assertThat(outputFile.length()).isGreaterThan(0L);
  }

  @Test
  public void convert_FromStreamToFileWithNullInputFormat_ShouldThrowNullPointerException(
      final @TempDir File testFolder, final DocumentConverter converter) throws IOException {

    final File outputFile = new File(testFolder, "out.pdf");

    try (InputStream stream = Files.newInputStream(SOURCE_FILE.toPath())) {
      assertThatNullPointerException()
          .isThrownBy(() -> converter.convert(stream).as(null).to(outputFile).execute());
    }
  }

  @Test
  public void convert_FromStreamToFileWithoutInputFormat_ShouldSucceeded(
      final @TempDir File testFolder, final DocumentConverter converter) throws IOException {

    final File outputFile = new File(testFolder, "out.pdf");

    try (InputStream stream = Files.newInputStream(SOURCE_FILE.toPath())) {
      assertThatCode(() -> converter.convert(stream).to(outputFile).execute())
          .doesNotThrowAnyException();
    }
  }

  @Test
  public void convert_FromStreamToFileWithSupportedInputFormat_ShouldSucceeded(
      final @TempDir File testFolder, final DocumentConverter converter) throws IOException {

    final File outputFile = new File(testFolder, "out.pdf");

    try (InputStream stream = Files.newInputStream(SOURCE_FILE.toPath())) {
      assertThatCode(
              () ->
                  converter
                      .convert(stream)
                      .as(DefaultDocumentFormatRegistry.getFormatByExtension("doc"))
                      .to(outputFile)
                      .execute())
          .doesNotThrowAnyException();
    }

    assertThat(outputFile).isFile();
    assertThat(outputFile.length()).isGreaterThan(0L);
  }

  @Test
  public void convert_FromFileToStreamWithMissingOutputFormat_ShouldThrowNullPointerException(
      final @TempDir File testFolder, final DocumentConverter converter) throws IOException {

    final File outputFile = new File(testFolder, "out.pdf");

    try (OutputStream stream = Files.newOutputStream(outputFile.toPath())) {
      assertThatNullPointerException()
          .isThrownBy(() -> converter.convert(SOURCE_FILE).to(stream).as(null).execute());
    }
  }

  @Test
  public void convert_FromFileToStreamWithSupportedOutputFormat_ShouldSucceeded(
      final @TempDir File testFolder, final DocumentConverter converter) throws IOException {

    final File outputFile = new File(testFolder, "out.pdf");

    try (OutputStream stream = Files.newOutputStream(outputFile.toPath())) {
      assertThatCode(
              () ->
                  converter
                      .convert(SOURCE_FILE)
                      .to(stream)
                      .as(DefaultDocumentFormatRegistry.getFormatByExtension("pdf"))
                      .execute())
          .doesNotThrowAnyException();
    }

    assertThat(outputFile).isFile();
    assertThat(outputFile.length()).isGreaterThan(0L);
  }

  @Test
  public void convert_FromFileWithoutExtensionToFile_ShouldSucceeded(
      final @TempDir File testFolder, final DocumentConverter converter) throws IOException {

    final File sourceFile = documentFile("test");
    final File outputFile = new File(testFolder, "out.pdf");

    try (OutputStream stream = Files.newOutputStream(outputFile.toPath())) {
      assertThatCode(
              () ->
                  converter
                      .convert(sourceFile)
                      .to(stream)
                      .as(DefaultDocumentFormatRegistry.getFormatByExtension("txt"))
                      .execute())
          .doesNotThrowAnyException();
    }

    final String content = FileUtils.readFileToString(outputFile, StandardCharsets.UTF_8);
    assertThat(content).as("Check content: %s", content).contains("Test document");
  }
}
use log::info;

/// Camera struct
pub struct Camera {
    camera: rscam::Camera,
    width: u32,
    height: u32,
}

impl Camera {
    /// Web camera interface
    ///
    /// # Arguments
    /// - `device_`: the device name, e.g. "/dev/video0"
    /// - `width_`: the width of the camera device
    /// - `height_`: the height of the camera device
    /// - `fps_`: frames per second
    pub fn new(device_: &str, width_: u32, height_: u32, fps_: u32) -> Camera {
        let mut camera_ = rscam::new(device_).unwrap();
        camera_
            .start(&rscam::Config {
                interval: (1, fps_),
                resolution: (width_, height_),
                format: b"RGB3",
                ..Default::default()
            })
            .unwrap();
        info!("Camera {}: {} * {}, {} fps", device_, width_, height_, fps_);
        Camera {
            camera: camera_,
            width: width_,
            height: height_,
        }
    }

    /// Get a frame from the interface.
    /// Returns `None` if no image could be captured.
    pub fn get_frame(&self) -> Option<image::RgbImage> {
        // Propagate capture/conversion failures as None instead of panicking,
        // matching the documented contract.
        let frame: rscam::Frame = self.camera.capture().ok()?;
        let rgb_image = image::RgbImage::from_vec(self.width, self.height, (&frame[..]).to_vec())?;
        Some(rgb_image)
    }
}
import argparse
import random

import pandas as pd
from path import Path


def relpath_split(relpath):
    traj_name, shader, frame = relpath.split('/')
    frame = frame.replace('.png', '')
    return traj_name, shader, frame


def writelines(lines, path):
    length = len(lines)
    with open(path, 'w') as f:
        for i in range(length):
            if i == length - 1:
                f.write(str(lines[i]))
            else:
                f.write(str(lines[i]) + '\n')


def readlines(filename):
    """Read all the lines in a text file and return them as a list."""
    with open(filename, 'r') as f:
        lines = f.read().splitlines()
    return lines


def scene_filter(scenes):
    kept = ['0000', '0001', '0002', '0003', '0004', '0005',
            '0006', '0008', '0009', '0010', '0011']
    return [scene for scene in scenes if scene.stem in kept]


def shader_filter(shaders):
    kept = ['9k', 'sildurs-e', 'sildurs-h', '12k', '15k']
    return [shader for shader in shaders if shader.stem in kept]


def file_filter(dataset_path, files, ref_df=None):
    ret = []
    for file in files:
        if 4 <= int(file.stem) <= len(files) - 4:
            if ref_df is None:
                ret.append(file)
            else:
                scene, shader, frame = relpath_split(file.relpath(dataset_path))
                # Substitute here: apply the sildurs-e selection table to mbl files.
                shader = 'sildurs-e'
                if ref_df.loc[scene + '_' + shader][int(frame)] == 1:
                    ret.append(file)
    return ret


def table_filter(files):
    pass


def parse_args():
    parser = argparse.ArgumentParser(
        description='custom dataset split for training, validation and test')
    parser.add_argument('--dataset_path', type=str,
                        default='/home/roit/970evop6/datasets/fpv_feild',
                        help='path to a test image or folder of images')
    parser.add_argument('--num', default=1000)
    parser.add_argument('--reference', default=None,
                        help='selection table for filtering')
    parser.add_argument('--proportion', default=[0.8, 0.2, 0.0],
                        help='train, val, test')
    parser.add_argument('--rand_seed', default=12346)
    parser.add_argument('--out_dir', default='../splits/fpv_feild_lite1000')
    return parser.parse_args()


def main(args):
    """Outputs a directory containing train.txt, val.txt and test.txt."""
    train_, val_, test_ = args.proportion
    out_num = args.num
    if train_ + val_ + test_ - 1. > 0.01:
        print('error: proportions must sum to 1')
        return
    if args.reference:
        ref_df = pd.read_csv(args.reference, index_col='scences')
        print('load refs ok')
    else:
        ref_df = None

    out_dir = Path(args.out_dir)
    out_dir.mkdir_p()
    train_txt_p = out_dir / 'train.txt'
    val_txt_p = out_dir / 'val.txt'
    test_txt_p = out_dir / 'test.txt'

    dataset_path = Path(args.dataset_path)

    # Filtering and combination.
    item_list = []
    scenes = dataset_path.dirs()
    scenes.sort()
    scenes = scene_filter(scenes)
    for scene in scenes:
        files = scene.files()
        files.sort()
        files = file_filter(args.dataset_path, files, ref_df)
        item_list += files

    # List constructed; shuffle, truncate and relativize.
    random.seed(args.rand_seed)
    random.shuffle(item_list)
    if out_num and out_num < len(item_list):
        item_list = item_list[:out_num]
    for i in range(len(item_list)):
        item_list[i] = item_list[i].relpath(dataset_path)

    length = len(item_list)
    train_bound = int(length * args.proportion[0])
    val_bound = int(length * args.proportion[1]) + train_bound
    test_bound = int(length * args.proportion[2]) + val_bound
    print(' train items:{}\n val items:{}\n test items:{}'.format(
        len(item_list[:train_bound]),
        len(item_list[train_bound:val_bound]),
        len(item_list[val_bound:test_bound])))
    writelines(item_list[:train_bound], train_txt_p)
    writelines(item_list[train_bound:val_bound], val_txt_p)
    writelines(item_list[val_bound:test_bound], test_txt_p)


if __name__ == '__main__':
    options = parse_args()
    main(options)
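The shuffle-then-cut logic above can be condensed into a small standalone helper. A minimal sketch with a hypothetical name, free of the path/pandas dependencies:

```python
import random


def split_items(items, proportions, seed=12346):
    # Shuffle deterministically, then cut into train/val/test by proportion.
    items = list(items)
    rng = random.Random(seed)
    rng.shuffle(items)
    n = len(items)
    a = int(n * proportions[0])
    b = a + int(n * proportions[1])
    return items[:a], items[a:b], items[b:]
```

Seeding a private `random.Random` instead of the global module state keeps the split reproducible without side effects on other callers.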
# Read n (array length) and k, then the array itself.
str1 = input()
n, k = str1.strip().split()
n = int(n)
k = int(k)
str2 = input()
arr = [int(i) for i in str2.strip().split()]

# Scan for the longest contiguous segment in which adjacent elements differ.
max_seg = 1
i = 0
while i < n:
    back = i
    # Extend while neighbors alternate.
    while i < n - 1 and arr[i + 1] != arr[i]:
        i += 1
    front = i
    seg = front - back + 1
    if seg > max_seg:
        max_seg = seg
    # Skip over any run of equal elements.
    while i < n - 1 and arr[i + 1] == arr[i]:
        i += 1
    if i == n - 1:
        break

print(max_seg)
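The same neighbors-differ scan can be expressed as a single pass with a running counter. A compact re-sketch (hypothetical function name, same criterion as the loop above):

```python
def longest_alternating_run(arr):
    # Length of the longest contiguous segment where every adjacent pair differs.
    if not arr:
        return 0
    best = cur = 1
    for prev, nxt in zip(arr, arr[1:]):
        cur = cur + 1 if prev != nxt else 1
        best = max(best, cur)
    return best
```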
def _update_widget_view(self):
    for str_option in self._dict_but_by_option:
        but_wid = self._dict_but_by_option[str_option]
        if str_option in self.value:
            but_wid.value = True
            but_wid.button_style = "success"
        else:
            but_wid.value = False
            but_wid.button_style = ""
import React from 'react'

function Remaining() {
  return (
    <div className="alert alert-success">
      <span>Remaining: 1000$</span>
    </div>
  )
}

export default Remaining
import { px } from "style-value-types"
import { ResolvedValues } from "../../types"

const dashKeys = {
  offset: "stroke-dashoffset",
  array: "stroke-dasharray",
}

const camelKeys = {
  offset: "strokeDashoffset",
  array: "strokeDasharray",
}

/**
 * Build SVG path properties. Uses the path's measured length to convert
 * our custom pathLength, pathSpacing and pathOffset into stroke-dashoffset
 * and stroke-dasharray attributes.
 *
 * This function is mutative to reduce per-frame GC.
 */
export function buildSVGPath(
  attrs: ResolvedValues,
  length: number,
  spacing = 1,
  offset = 0,
  useDashCase: boolean = true
): void {
  // Normalise path length by setting SVG attribute pathLength to 1
  attrs.pathLength = 1

  // We use dash case when setting attributes directly to the DOM node and
  // camel case when defining props on a React component.
  const keys = useDashCase ? dashKeys : camelKeys

  // Build the dash offset
  attrs[keys.offset] = px.transform!(-offset)

  // Build the dash array
  const pathLength = px.transform!(length)
  const pathSpacing = px.transform!(spacing)
  attrs[keys.array] = `${pathLength} ${pathSpacing}`
}
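The attribute construction is simple enough to sanity-check outside TypeScript. A Python sketch with a hypothetical name that mirrors the px formatting (not the real style-value-types transformer):

```python
def build_svg_path_attrs(length, spacing=1.0, offset=0.0):
    # With pathLength normalised to 1, offset and length are fractions of the
    # path; the dash offset is negated so positive offsets move forward.
    return {
        "pathLength": 1,
        "stroke-dashoffset": f"{-offset}px",
        "stroke-dasharray": f"{length}px {spacing}px",
    }
```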
# coding=utf-8
# Copyright 2021 The OneFlow Authors. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import oneflow as flow
from oneflow import nn

from libai.layers import Linear, build_activation


class MLP(nn.Module):
    """MLP

    MLP will take the input with h hidden state, project it to intermediate
    hidden dimension, perform gelu transformation, and project the state back
    into h hidden dimension.

    Arguments:
        hidden_size: size of each input and output sample.
        ffn_hidden_size: size of each intermediate sample.
        output_dropout_prob: Output dropout probability. Defaults to 0.0.
        init_method: method to initialize the first linear weight.
            Defaults to :func:`nn.init.xavier_normal_`.
        output_layer_init_method: method to initialize the second linear weight.
            If set to None, it will use ``init_method`` instead. Defaults to None.
        bias_gelu_fusion: If set to ``True``, it will fuse bias adding and
            elementwise gelu activation. Defaults to ``False``.
        bias_dropout_fusion: If set to ``True``, it will fuse bias adding and
            dropout. Defaults to ``False``.
        layer_idx: A layer_idx sign which determines the placement. It will be
            used in pipeline parallelism. Defaults to 0.
    """

    def __init__(
        self,
        hidden_size,
        ffn_hidden_size,
        output_dropout_prob=0.0,
        init_method=nn.init.xavier_normal_,
        output_layer_init_method=None,
        bias_gelu_fusion=False,
        bias_dropout_fusion=False,
        *,
        layer_idx=0,
    ):
        super().__init__()
        self.output_dropout_prob = output_dropout_prob
        self.bias_gelu_fusion = bias_gelu_fusion
        self.bias_dropout_fusion = bias_dropout_fusion

        if output_layer_init_method is None:
            output_layer_init_method = init_method

        self.dense_h_to_4h = Linear(
            hidden_size,
            ffn_hidden_size,
            bias=True,
            parallel="col",
            skip_bias_add=bias_gelu_fusion,
            init_method=init_method,
            layer_idx=layer_idx,
        )

        if not bias_gelu_fusion:
            self.activation_func = build_activation("gelu")

        self.dense_4h_to_h = Linear(
            ffn_hidden_size,
            hidden_size,
            bias=True,
            parallel="row",
            skip_bias_add=bias_dropout_fusion,
            init_method=output_layer_init_method,
            layer_idx=layer_idx,
        )

        if not bias_dropout_fusion:
            self.dropout = nn.Dropout(self.output_dropout_prob)

    def forward(self, hidden_states):
        intermediate = self.dense_h_to_4h(hidden_states)
        if self.bias_gelu_fusion:
            intermediate, bias = intermediate
            intermediate = flow._C.fused_bias_add_gelu(
                intermediate, bias, axis=intermediate.ndim - 1
            )
        else:
            intermediate = self.activation_func(intermediate)

        output = self.dense_4h_to_h(intermediate)
        if self.bias_dropout_fusion:
            output, bias = output
            output = flow._C.fused_bias_add_dropout(
                output, bias, p=self.output_dropout_prob, axis=output.ndim - 1
            )
        else:
            output = self.dropout(output)
        return output

    def extra_repr(self) -> str:
        return "bias_gelu_fusion={}, bias_dropout_fusion={}, dropout={}".format(
            self.bias_gelu_fusion, self.bias_dropout_fusion, self.output_dropout_prob
        )
<filename>src/dtos/create-cancel-ticket.dtos.ts export class CancellationTicketDto { oldTicketId: string; lostPercentage: number; }
def viewbox_mouse_event(self, event): if not self._key_events_bound: self._key_events_bound = True event.canvas.events.key_press.connect(self.viewbox_key_event) event.canvas.events.key_release.connect(self.viewbox_key_event)
Optimal interpolation schemes for particle tracking in turbulence. An important aspect in numerical simulations of particle-laden turbulent flows is the interpolation of the flow field needed for the computation of the Lagrangian trajectories. The accuracy of the interpolation method has direct consequences for the acceleration spectrum of the fluid particles and is therefore also important for the correct evaluation of the hydrodynamic forces for almost neutrally buoyant particles, common in many environmental applications. In order to systematically choose the optimal tradeoff between interpolation accuracy and computational cost, we focus on comparing errors: the interpolation error is compared with the discretization error of the flow field. In this way one can prevent unnecessary computations and still retain the accuracy of the turbulent flow simulation. From the analysis a practical method is proposed that enables direct estimation of the interpolation and discretization error from the energy spectrum. The theory is validated by means of direct numerical simulations (DNS) of homogeneous, isotropic turbulence using a spectral code, where the trajectories of fluid tracers are computed using several interpolation methods. We show that B-spline interpolation has the best accuracy given the computational cost. Finally, the optimal interpolation order for the different methods is shown as a function of the resolution of the DNS.
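The tradeoff the abstract describes, interpolation accuracy versus stencil cost, can be illustrated with a toy experiment: sample a smooth field on a uniform grid and measure the worst-case interpolation error for stencils of different order. A pure-Python sketch with hypothetical names; local Lagrange stencils stand in for the B-splines studied in the paper:

```python
import math


def lagrange_interp(xs, ys, x):
    # Evaluate the Lagrange interpolating polynomial through (xs, ys) at x.
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        w = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                w *= (x - xj) / (xi - xj)
        total += yi * w
    return total


def max_interp_error(order, n_grid=32, n_probe=200):
    # Sample sin on a uniform grid, interpolate with a local stencil of
    # (order + 1) points, and record the worst error over probe points.
    h = 2 * math.pi / n_grid
    grid = [i * h for i in range(n_grid + 1)]
    vals = [math.sin(x) for x in grid]
    worst = 0.0
    for k in range(n_probe):
        x = 2 * math.pi * k / n_probe
        i0 = min(max(int(x / h) - order // 2, 0), n_grid - order)
        xs = grid[i0:i0 + order + 1]
        ys = vals[i0:i0 + order + 1]
        worst = max(worst, abs(lagrange_interp(xs, ys, x) - math.sin(x)))
    return worst
```

Raising the order from linear to cubic drops the worst-case error by orders of magnitude on the same grid, which is the kind of accuracy-versus-cost curve the paper compares against the discretization error.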
U.S. Secretary of State John Kerry said Thursday that Jews in the Ukrainian city of Donetsk were recently given notices instructing them to officially identify themselves as Jews. “In the year 2014, after all of the miles traveled and all of the journey of history, this is not just intolerable, it’s grotesque. It is beyond unacceptable. And any of the people who engage in these kinds of activities — from whatever party or whatever ideology or whatever place they crawl out of — there is no place for that,” he said, as reported by CNN. A report released by USA Today said the notices bore the name of Denis Pushilin, who identified himself as chairman of “Donetsk’s temporary government,” and were distributed near the Donetsk synagogue and other areas. The leaflet begins, “Dear Ukraine citizens of Jewish nationality,” and states that all Jews 16 or older must report to the Commissioner for Nationalities in the Donetsk Regional Administration to register. CNN could not immediately confirm the reports. Copyright © 2019 The Washington Times, LLC.
This President of the United States is the first president we've ever had who thinks he can choose which laws to enforce and which laws to ignore. He announces just about every day one change after another after another in Obamacare. It's utterly lawless. It is inconsistent with our Constitution, and it ought to trouble everyone, Republicans, Democrats, independents, libertarians. Let me tell you something: If you have a president picking and choosing which laws to follow and which laws to ignore, you no longer have a president.

Texas Sen. Ted 'LOL' Cruz on Thursday at the Conservative Political Action Conference (video above):

So we no longer have a president, eh? I guess that means Obama is a dictator, right? Of course, it's possible he's the world's lamest dictator, because what other explanation do you have for the fact that as dictator he would allow an enemy of the state like Ted Cruz to spread his revolutionary rhetoric? Oh wait, I'm sorry. I forgot about the IRS, which Obama is clearly using to suppress the political opposition, which explains why Mitt Romney had the election stolen from him, because as everyone who unskewed the polls knows, Romney was the people's choice, all 53 percent of them. But as for this thing about Obama being the first president ever to engage in a tug-of-war with Congress about the extent of his powers and his flexibility in enforcing the law, well, in the spirit of being completely fair and balanced, it's probably worth noting these words from none other than President Ronald Reagan himself:

While I am signing S. 1192, it contains a legislative veto provision which the Attorney General advises is unconstitutional. [...] Accordingly, this language of section 114(e) must be objected to on constitutional grounds. The Secretary of Transportation will not, consistent with this objection, regard himself as legally bound by any such resolution.
So not only is President Barack Obama not the first president to do the sorts of things Cruz accuses him of doing, but Ronald Reagan was among those who did. And of course the logical implication of Cruz's anti-Obama attack is this: Ronald Reagan was not president of the United States.
Lightning Strike Fire at South Carolina State Park

A fire that was probably caused by a lightning strike killed a variety of animals at a nature center in a South Carolina state park. Park workers and campers are shocked and saddened.

At a Glance: Fish, turtles, alligators and snakes were killed in a fire at the nature center. The fire started, probably because of lightning, early Wednesday morning at South Carolina's Huntington Beach State Park.

A fire that destroyed a nature center early Wednesday morning at South Carolina's Huntington Beach State Park may have been started by lightning, officials said. The blaze consumed the three-story nature center, destroying it and killing numerous turtles, alligators, snakes and fish that were trapped inside, according to WMBF-TV. The fire started at about 2:30 a.m. EDT Wednesday, the report added.

“Unfortunately, even before we arrived, there was no life sustainable inside the structure,” Midway Fire Rescue Chief Doug Eggiman told South Strand News. “Even if it was, [the fire] was too fully involved for us to make any kind of entry. All our operations had been exterior.”

As crews work to bring down the charred structure, attention turns to investigating the cause of the fire. Lightning was believed to be the culprit after a preliminary investigation, but officials told WMBF that they're not done looking into the events that led up to the fire. Thunderstorms were reported in the area around the time the fire started. The north side of the park was closed indefinitely because of the fire, South Strand News also reported.

Park employees – especially those who worked with the animals – were distraught when they learned about the animals that were lost in the fire. "There's history, unbelievable history, inside that center, and it's gone for good now," Judy Blanchfield, who witnessed the fire, told WBTW.com. "It’s just so sad."
Nobody was inside the nature center at the time of the fire, and no firefighters were injured during the hours-long battle, according to the State.
from collections import defaultdict


def _get_node_successors(edges, from_id_col, to_id_col):
    edge_cnt = len(edges)
    node_successors = defaultdict(list)
    from_ids = edges[from_id_col].to_list()
    to_ids = edges[to_id_col].to_list()
    for i in range(0, edge_cnt):
        node_successors[from_ids[i]].append(to_ids[i])
    return node_successors


def _strongly_connected_components(list_of_nodes, node_successors):
    """
    Generate nodes in strongly connected components of graph.

    Source:
    https://networkx.org/documentation/stable/reference/algorithms/component.html

    Uses Tarjan's algorithm [1] with Nuutila's modifications [2].
    Nonrecursive version of algorithm.

    References
    ----------
    [1] Depth-first search and linear graph algorithms, R. Tarjan
        SIAM Journal of Computing 1(2):146-160, (1972).
    [2] On finding the strongly connected components in a directed graph.
        E. Nuutila and E. Soisalon-Soinen
        Information Processing Letters 49(1): 9-14, (1994).
    """
    preorder = {}
    lowlink = {}
    scc_found = {}
    scc_queue = []
    i = 0  # Preorder counter
    for source in list_of_nodes:
        if source not in scc_found:
            queue = [source]
            while queue:
                v = queue[-1]
                if v not in preorder:
                    i = i + 1
                    preorder[v] = i
                done = 1
                v_nbrs = node_successors[v]
                for w in v_nbrs:
                    if w not in preorder:
                        queue.append(w)
                        done = 0
                        break
                if done == 1:
                    lowlink[v] = preorder[v]
                    for w in v_nbrs:
                        if w not in scc_found:
                            if preorder[w] > preorder[v]:
                                lowlink[v] = min([lowlink[v], lowlink[w]])
                            else:
                                lowlink[v] = min([lowlink[v], preorder[w]])
                    queue.pop()
                    if lowlink[v] == preorder[v]:
                        scc_found[v] = True
                        scc = {v}
                        while scc_queue and preorder[scc_queue[-1]] > preorder[v]:
                            k = scc_queue.pop()
                            scc_found[k] = True
                            scc.add(k)
                        yield scc
                    else:
                        scc_queue.append(v)


def get_connected_edges(nodes, edges, from_id_col="u", to_id_col="v", node_id_col="id"):
    """Filters the network data (directed) to include only connected edges and nodes."""
    node_successors = _get_node_successors(edges, from_id_col, to_id_col)
    node_ids = nodes[node_id_col].to_list()
    scc = max(_strongly_connected_components(node_ids, node_successors), key=len)

    # Filter nodes and edges accordingly.
    n = nodes[nodes[node_id_col].isin(scc)]
    e = edges[(edges[from_id_col].isin(scc)) & (edges[to_id_col].isin(scc))]
    return n, e.reset_index(drop=True)
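The nonrecursive Tarjan generator above can be cross-checked against a compact Kosaraju sketch: one DFS to compute a finish order, then a second DFS over the reversed graph. Illustrative only, with a hypothetical name, not part of the module:

```python
from collections import defaultdict


def strongly_connected_components(nodes, succ):
    # Kosaraju's algorithm: finish-order DFS on G, then DFS on reversed G.
    visited, order = set(), []
    for source in nodes:
        if source in visited:
            continue
        visited.add(source)
        stack = [(source, iter(succ[source]))]
        while stack:
            u, it = stack[-1]
            advanced = False
            for w in it:
                if w not in visited:
                    visited.add(w)
                    stack.append((w, iter(succ[w])))
                    advanced = True
                    break
            if not advanced:
                order.append(u)
                stack.pop()

    rev = defaultdict(list)
    for u in nodes:
        for w in succ[u]:
            rev[w].append(u)

    comps, assigned = [], set()
    for v in reversed(order):
        if v in assigned:
            continue
        comp, todo = {v}, [v]
        assigned.add(v)
        while todo:
            u = todo.pop()
            for w in rev[u]:
                if w not in assigned:
                    assigned.add(w)
                    comp.add(w)
                    todo.append(w)
        comps.append(comp)
    return comps
```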
#include <bits/stdc++.h>
using namespace std;

// Difference array: each query (l, r) marks an increment over v[l..r] in
// O(1); a final prefix sum materialises the counts.
int main() {
    int n, q;
    cin >> n >> q;
    vector<int> v(n, 0);
    while (q--) {
        int l, r;
        cin >> l >> r;
        v[l]++;
        if (r + 1 < n) {
            v[r + 1]--;
        }
    }
    for (int i = 1; i < n; i++) {
        v[i] = v[i] + v[i - 1];
    }
    for (auto el : v) {
        cout << el << " ";
    }
    return 0;
}
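The same difference-array technique is easy to verify in Python (hypothetical helper name):

```python
def range_add(n, updates):
    # Apply each (l, r) range increment in O(1) via a difference array,
    # then recover the final values with a single prefix sum.
    diff = [0] * (n + 1)
    for l, r in updates:
        diff[l] += 1
        diff[r + 1] -= 1
    out, running = [], 0
    for i in range(n):
        running += diff[i]
        out.append(running)
    return out
```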
// maven/java-ee-spring-boot-2.2.2-study/ee-spring-boot-2.2.2-ztree-3.5/src/main/java/com/litongjava/module/spring/boot/ztree/service/impl/TreeMenuServiceImpl.java
package com.litongjava.module.spring.boot.ztree.service.impl;

import com.litongjava.module.spring.boot.ztree.model.TreeMenu;
import com.litongjava.module.spring.boot.ztree.mapper.TreeMenuMapper;
import com.litongjava.module.spring.boot.ztree.service.TreeMenuService;
import com.baomidou.mybatisplus.extension.service.impl.ServiceImpl;
import org.springframework.stereotype.Service;

/**
 * <p>
 * Service implementation class
 * </p>
 *
 * @author litong
 * @since 2021-04-12
 */
@Service
public class TreeMenuServiceImpl extends ServiceImpl<TreeMenuMapper, TreeMenu> implements TreeMenuService {

}
#include "NNLayer.h"

#include <iostream>
#include <math.h>

float NNLayer::sigmoid(const float x) const {
    return 1.f / (1 + exp(-x));
}

NNLayer::NNLayer(int numNodes, int numTargetNodes) {
    values.resize(numNodes, 0);
    for (int i = 0; i < numNodes; i++) {
        for (int j = 0; j < numTargetNodes; j++) {
            edges.push_back(Edge(i, j));
        }
    }
}

void NNLayer::activation() {
    for (std::size_t i = 0; i < values.size(); i++) {
        values[i] = sigmoid(values[i]);
    }
}

void NNLayer::clear() {
    for (std::size_t i = 0; i < values.size(); i++) {
        values[i] = 0;
    }
}

void NNLayer::propagate(NNLayer& nextLayer) {
    for (const Edge& edge : edges) {
        nextLayer.add(edge.to, values[edge.from] * edge.weight);
    }
    nextLayer.activation();
}

void NNLayer::add(int index, float value) {
    values[index] += value;
}

std::vector<float> NNLayer::getValues() const {
    return values;
}

void NNLayer::setValues(const std::vector<float>& _values) {
    for (std::size_t i = 0; i < _values.size(); i++)
        values[i] = _values[i];
}

int NNLayer::getNumNodes() const {
    return values.size();
}

int NNLayer::getNumEdges() const {
    return edges.size();
}

Edge NNLayer::getEdge(int index) const {
    return edges[index];
}

void NNLayer::setWeight(int index, float weight) {
    edges[index].weight = weight;
}

std::vector<Edge>& NNLayer::getEdgesRef() {
    return edges;
}
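The propagate/activation pair above (weighted sums into the next layer, then sigmoid) can be sketched framework-free. A Python sketch with hypothetical names, where weights map (from, to) index pairs to floats:

```python
import math


def propagate(values, weights, next_size):
    # Accumulate weighted contributions into the next layer, then apply
    # the logistic sigmoid, mirroring NNLayer::propagate + activation.
    out = [0.0] * next_size
    for (i, j), w in weights.items():
        out[j] += values[i] * w
    return [1.0 / (1.0 + math.exp(-x)) for x in out]
```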
// pkg/ruler/mapper.go
package ruler

import (
	"crypto/md5"
	"net/url"
	"path/filepath"
	"sort"

	"github.com/go-kit/kit/log"
	"github.com/go-kit/kit/log/level"
	"github.com/prometheus/prometheus/pkg/rulefmt"
	"github.com/spf13/afero"
	"gopkg.in/yaml.v3"
)

// mapper is designed to ensure the provided rule sets are identical
// to the on-disk rules tracked by the prometheus manager
type mapper struct {
	Path string // Path specifies the directory in which rule files will be mapped.

	FS     afero.Fs
	logger log.Logger
}

func newMapper(path string, logger log.Logger) *mapper {
	m := &mapper{
		Path:   path,
		FS:     afero.NewOsFs(),
		logger: logger,
	}
	m.cleanup()

	return m
}

func (m *mapper) cleanupUser(userID string) {
	dirPath := filepath.Join(m.Path, userID)
	err := m.FS.RemoveAll(dirPath)
	if err != nil {
		level.Warn(m.logger).Log("msg", "unable to remove user directory", "path", dirPath, "err", err)
	}
}

// cleanup removes all of the user directories in the path of the mapper
func (m *mapper) cleanup() {
	level.Info(m.logger).Log("msg", "cleaning up mapped rules directory", "path", m.Path)

	users, err := m.users()
	if err != nil {
		level.Error(m.logger).Log("msg", "unable to read rules directory", "path", m.Path, "err", err)
		return
	}

	for _, u := range users {
		m.cleanupUser(u)
	}
}

func (m *mapper) users() ([]string, error) {
	var result []string

	dirs, err := afero.ReadDir(m.FS, m.Path)
	for _, u := range dirs {
		if u.IsDir() {
			result = append(result, u.Name())
		}
	}
	return result, err
}

func (m *mapper) MapRules(user string, ruleConfigs map[string][]rulefmt.RuleGroup) (bool, []string, error) {
	anyUpdated := false
	filenames := []string{}

	// user rule files will be stored as `/<path>/<userid>/<encoded filename>`
	path := filepath.Join(m.Path, user)
	err := m.FS.MkdirAll(path, 0777)
	if err != nil {
		return false, nil, err
	}

	// write all rule configs to disk
	for filename, groups := range ruleConfigs {
		// Store the encoded file name to better handle `/` characters
		encodedFileName := url.PathEscape(filename)
		fullFileName := filepath.Join(path, encodedFileName)

		fileUpdated, err := m.writeRuleGroupsIfNewer(groups, fullFileName)
		if err != nil {
			return false, nil, err
		}
		filenames = append(filenames, fullFileName)
		anyUpdated = anyUpdated || fileUpdated
	}

	// and clean any up that shouldn't exist
	existingFiles, err := afero.ReadDir(m.FS, path)
	if err != nil {
		return false, nil, err
	}

	for _, existingFile := range existingFiles {
		fullFileName := filepath.Join(path, existingFile.Name())

		// Ensure the namespace is decoded from a url path encoding to see if it is still required
		decodedNamespace, err := url.PathUnescape(existingFile.Name())
		if err != nil {
			level.Warn(m.logger).Log("msg", "unable to remove rule file on disk", "file", fullFileName, "err", err)
			continue
		}

		ruleGroups := ruleConfigs[string(decodedNamespace)]
		if ruleGroups == nil {
			err = m.FS.Remove(fullFileName)
			if err != nil {
				level.Warn(m.logger).Log("msg", "unable to remove rule file on disk", "file", fullFileName, "err", err)
			}
			anyUpdated = true
		}
	}

	return anyUpdated, filenames, nil
}

func (m *mapper) writeRuleGroupsIfNewer(groups []rulefmt.RuleGroup, filename string) (bool, error) {
	sort.Slice(groups, func(i, j int) bool { return groups[i].Name > groups[j].Name })

	rgs := rulefmt.RuleGroups{Groups: groups}

	d, err := yaml.Marshal(&rgs)
	if err != nil {
		return false, err
	}

	_, err = m.FS.Stat(filename)
	if err == nil {
		current, err := afero.ReadFile(m.FS, filename)
		if err != nil {
			return false, err
		}
		newHash := md5.New()
		currentHash := md5.New()

		// bailout if there is no update
		if string(currentHash.Sum(current)) == string(newHash.Sum(d)) {
			return false, nil
		}
	}

	level.Info(m.logger).Log("msg", "updating rule file", "file", filename)
	err = afero.WriteFile(m.FS, filename, d, 0777)
	if err != nil {
		return false, err
	}

	return true, nil
}
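The intent of writeRuleGroupsIfNewer, skipping the write when the serialized content is unchanged, can be sketched with an explicit digest comparison. A Python sketch with hypothetical names (the Go code compares MD5 sums over afero-backed files instead of an in-memory store):

```python
import hashlib


def write_if_changed(store, name, data):
    # Skip the write when the stored bytes hash to the same digest as the
    # new bytes; return whether a write actually happened.
    new_digest = hashlib.md5(data).hexdigest()
    old = store.get(name)
    if old is not None and hashlib.md5(old).hexdigest() == new_digest:
        return False
    store[name] = data
    return True
```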
#pragma once

#include "searchengine.hpp"

#include <annoy/annoylib.h>
#include <annoy/kissrandom.h>

class SEAnnoy : public SearchEngine
{
public:
    typedef std::shared_ptr<SEAnnoy> SEAnnoyPtr;

public:
    SEAnnoy(const TorchManager::TorchManagerPtr &torch_manager,
            const DatabaseManager::DatabaseManagerPtr &database_manager,
            int tree_factor = 2);
    ~SEAnnoy();

    void setup() override;
    bool requireRefresh() override;

    void search(const std::string &model_name,
                const torch::Tensor &features_tensor,
                int top_k,
                std::vector<int> *top_ids,
                std::vector<float> *distances) override;

private:
    typedef std::shared_ptr<AnnoyIndex<int, float, Angular, Kiss32Random>> AnnoyPtr;
    typedef std::unordered_map<int, int> idmapping_t;

    int mTreeFactor;
    std::unordered_map<std::string, AnnoyPtr> mAnnoyMap;
    std::unordered_map<std::string, idmapping_t> mIdMapping;
};
use core::ops::Index;

use necsim_core_bond::{NonNegativeF64, PositiveF64};

use super::{Habitat, LineageReference, OriginSampler};
use crate::{
    landscape::{IndexedLocation, Location},
    lineage::{GlobalLineageReference, Lineage},
};

#[allow(clippy::inline_always, clippy::inline_fn_without_body)]
#[contract_trait]
pub trait LineageStore<H: Habitat, R: LineageReference<H>>:
    crate::cogs::Backup + Sized + core::fmt::Debug
{
    type LineageReferenceIterator<'a>: Iterator<Item = R>;

    #[must_use]
    fn from_origin_sampler<'h, O: OriginSampler<'h, Habitat = H>>(origin_sampler: O) -> Self
    where
        H: 'h;

    #[must_use]
    fn get_number_total_lineages(&self) -> usize;

    #[must_use]
    fn iter_local_lineage_references(&self) -> Self::LineageReferenceIterator<'_>;

    #[must_use]
    fn get(&self, reference: R) -> Option<&Lineage>;
}

#[allow(clippy::inline_always, clippy::inline_fn_without_body)]
#[allow(clippy::module_name_repetitions)]
#[contract_trait]
pub trait LocallyCoherentLineageStore<H: Habitat, R: LineageReference<H>>:
    LineageStore<H, R> + Index<R, Output = Lineage>
{
    #[must_use]
    #[debug_requires(
        habitat.contains(indexed_location.location()),
        "indexed location is inside habitat"
    )]
    fn get_active_global_lineage_reference_at_indexed_location(
        &self,
        indexed_location: &IndexedLocation,
        habitat: &H,
    ) -> Option<&GlobalLineageReference>;

    #[debug_requires(
        habitat.contains(indexed_location.location()),
        "indexed location is inside habitat"
    )]
    #[debug_requires(self.get(reference.clone()).is_some(), "lineage reference is valid")]
    #[debug_requires(!self[reference.clone()].is_active(), "lineage is inactive")]
    #[debug_ensures(self[old(reference.clone())].is_active(), "lineage was activated")]
    #[debug_ensures(
        self[old(reference.clone())].indexed_location() == Some(&old(indexed_location.clone())),
        "lineage was added to indexed_location"
    )]
    #[debug_ensures(
        self.get_active_global_lineage_reference_at_indexed_location(
            &old(indexed_location.clone()), old(habitat)
        ) == Some(self[old(reference.clone())].global_reference()),
        "lineage is now indexed at indexed_location"
    )]
    fn insert_lineage_to_indexed_location_locally_coherent(
        &mut self,
        reference: R,
        indexed_location: IndexedLocation,
        habitat: &H,
    );

    #[must_use]
    #[debug_requires(self.get(reference.clone()).is_some(), "lineage reference is valid")]
    #[debug_requires(self[reference.clone()].is_active(), "lineage is active")]
    #[debug_ensures(old(habitat).contains(ret.0.location()), "prior location is inside habitat")]
    #[debug_ensures(!self[old(reference.clone())].is_active(), "lineage was deactivated")]
    #[debug_ensures(
        ret.0 == old(self[reference.clone()].indexed_location().unwrap().clone()),
        "returns the individual's prior IndexedLocation"
    )]
    #[debug_ensures(
        ret.1 == old(self[reference.clone()].last_event_time()),
        "returns the individual's prior last event time"
    )]
    #[debug_ensures(self.get_active_global_lineage_reference_at_indexed_location(
        &ret.0, old(habitat)
    ).is_none(), "lineage is no longer indexed at its prior IndexedLocation")]
    #[debug_ensures(
        self[old(reference.clone())].last_event_time() == old(event_time),
        "updates the time of the last event of the lineage reference"
    )]
    fn extract_lineage_from_its_location_locally_coherent(
        &mut self,
        reference: R,
        event_time: PositiveF64,
        habitat: &H,
    ) -> (IndexedLocation, NonNegativeF64);

    #[debug_requires(
        self.get(local_lineage_reference.clone()).is_some(),
        "lineage reference is valid"
    )]
    #[debug_requires(
        !self[local_lineage_reference.clone()].is_active(),
        "lineage is inactive"
    )]
    #[debug_ensures(
        self.get(old(local_lineage_reference.clone())).is_none(),
        "lineage was removed"
    )]
    #[debug_ensures(
        ret == old(self[local_lineage_reference.clone()].global_reference().clone()),
        "returns the individual's GlobalLineageReference"
    )]
    fn emigrate(&mut self, local_lineage_reference: R) -> GlobalLineageReference;

    #[must_use]
    #[debug_requires(
        habitat.contains(indexed_location.location()),
        "indexed location is inside habitat"
    )]
    #[debug_ensures(self[ret.clone()].is_active(), "lineage was activated")]
    #[debug_ensures(
        self[ret.clone()].indexed_location() == Some(&old(indexed_location.clone())),
        "lineage was added to indexed_location"
    )]
    #[debug_ensures(
        self.get_active_global_lineage_reference_at_indexed_location(
            &old(indexed_location.clone()), old(habitat)
        ) == Some(self[ret.clone()].global_reference()),
        "lineage is now indexed at indexed_location"
    )]
    fn immigrate_locally_coherent(
        &mut self,
        habitat: &H,
        global_reference: GlobalLineageReference,
        indexed_location: IndexedLocation,
        time_of_emigration: PositiveF64,
    ) -> R;
}

#[allow(clippy::inline_always, clippy::inline_fn_without_body)]
#[allow(clippy::module_name_repetitions)]
#[contract_trait]
pub trait GloballyCoherentLineageStore<H: Habitat, R: LineageReference<H>>:
    LocallyCoherentLineageStore<H, R>
{
    type LocationIterator<'a>: Iterator<Item = Location>;

    #[must_use]
    fn iter_active_locations(&self, habitat: &H) -> Self::LocationIterator<'_>;

    #[must_use]
    #[debug_requires(habitat.contains(location), "location is inside habitat")]
    fn get_active_local_lineage_references_at_location_unordered(
        &self,
        location: &Location,
        habitat: &H,
    ) -> &[R];

    #[debug_ensures(
        self.get_active_local_lineage_references_at_location_unordered(
            &old(indexed_location.location().clone()), old(habitat)
        ).last() == Some(&old(reference.clone())),
        "lineage is now indexed unordered at indexed_location.location()"
    )]
    #[debug_ensures(
        old(self.get_active_local_lineage_references_at_location_unordered(
            indexed_location.location(), old(habitat)
        ).len() + 1) == self.get_active_local_lineage_references_at_location_unordered(
            &old(indexed_location.location().clone()), old(habitat)
        ).len(),
        "unordered active lineage index at given location has grown by 1"
    )]
    fn insert_lineage_to_indexed_location_globally_coherent(
        &mut self,
        reference: R,
        indexed_location: IndexedLocation,
        habitat: &H,
    ) {
        self.insert_lineage_to_indexed_location_locally_coherent(
            reference, indexed_location, habitat,
        );
    }

    #[must_use]
    #[debug_ensures(
        self.get_active_local_lineage_references_at_location_unordered(
            ret.0.location(), old(habitat),
        ).len() + 1 == old(self.get_active_local_lineage_references_at_location_unordered(
            self[reference.clone()].indexed_location().unwrap().location(), old(habitat),
        ).len()),
        "unordered active lineage index at returned location has shrunk by 1"
    )]
    fn extract_lineage_from_its_location_globally_coherent(
        &mut self,
        reference: R,
        event_time: PositiveF64,
        habitat: &H,
    ) -> (IndexedLocation, NonNegativeF64) {
        self.extract_lineage_from_its_location_locally_coherent(reference, event_time, habitat)
    }

    #[must_use]
    #[debug_ensures(
        self.get_active_local_lineage_references_at_location_unordered(
            &old(indexed_location.location().clone()), old(habitat)
        ).last() == Some(&ret),
        "lineage is now indexed unordered at indexed_location.location()"
    )]
    #[debug_ensures(
        old(self.get_active_local_lineage_references_at_location_unordered(
            indexed_location.location(), old(habitat)
        ).len() + 1) == self.get_active_local_lineage_references_at_location_unordered(
            &old(indexed_location.location().clone()), old(habitat)
        ).len(),
        "unordered active lineage index at given location has grown by 1"
    )]
    fn immigrate_globally_coherent(
        &mut self,
        habitat: &H,
        global_reference: GlobalLineageReference,
        indexed_location: IndexedLocation,
        time_of_emigration: PositiveF64,
    ) -> R {
        self.immigrate_locally_coherent(
            habitat,
            global_reference,
            indexed_location,
            time_of_emigration,
        )
    }
}
def _access_checks(self, c_type: str) -> int:
    return self._checks.index(next(
        item for item in self._checks if item['c_type'] == c_type))
// NewRoom creates a room with name.
func NewRoom(name string) *Room {
    newRoom := new(Room)
    newRoom.name = name
    newRoom.clients = NewClientList()
    newRoom.messages = message.NewMessageList()
    return newRoom
}
def harmonic_amplitudes_to_signal(f0_t: Tensor,
                                  harmonic_amplitudes_t: Tensor,
                                  sampling_rate: int,
                                  min_f0: float) -> Tensor:
    _, n_harmonic, _ = harmonic_amplitudes_t.shape
    f0_map = freq_multiplier(n_harmonic, f0_t.device) * f0_t
    weight_map = (
        freq_antialias_mask(sampling_rate, f0_map) * harmonic_amplitudes_t
    )
    f0_map_cum = f0_t.cumsum(dim=-1) * freq_multiplier(
        n_harmonic, f0_t.device
    )
    w0_map_cum = f0_map_cum * 2.0 * pi / sampling_rate
    source = torch.sum(
        torch.sin(w0_map_cum) * weight_map, dim=-2, keepdim=True
    )
    # mute frames whose fundamental frequency falls below the voicing threshold
    source = (~(f0_t < min_f0)).float() * source
    return source * 0.01
// processSuccess handles the case after successful code processing by setting
// the corresponding status and output to the cache.
func processSuccess(ctx context.Context, output []byte, pipelineId uuid.UUID, cacheService cache.Cache, status pb.Status) {
    switch status {
    case pb.Status_STATUS_COMPILING:
        logger.Infof("%s: Validate() finish\n", pipelineId)
        setToCache(ctx, cacheService, pipelineId, cache.Status, pb.Status_STATUS_COMPILING)
    case pb.Status_STATUS_EXECUTING:
        logger.Infof("%s: Compile() finish\n", pipelineId)
        setToCache(ctx, cacheService, pipelineId, cache.CompileOutput, string(output))
        setToCache(ctx, cacheService, pipelineId, cache.Status, pb.Status_STATUS_EXECUTING)
    case pb.Status_STATUS_FINISHED:
        logger.Infof("%s: Run() finish\n", pipelineId)
        setToCache(ctx, cacheService, pipelineId, cache.RunOutput, string(output))
        setToCache(ctx, cacheService, pipelineId, cache.Status, pb.Status_STATUS_FINISHED)
    }
}
def remove(self, assets: dict):
    def _remove_assets(assets_df, exclude_asset, exclude_dates, granularity=self._granularity):
        dates = [str_to_ts(dt) for dt in exclude_dates]
        assert len(dates) % 2 == 0, \
            f'Unsupported datetime sequence for {exclude_asset}: odd amount of dates.'
        # dates come in (start, end) pairs; drop every row of the asset inside each range
        for i in range(len(dates) // 2):
            exclude_range = list(pd.date_range(start=dates[2 * i], end=dates[2 * i + 1],
                                               freq=granularity))
            assets_df = assets_df[
                ~((assets_df.index.isin(exclude_range)) & (assets_df['asset'] == exclude_asset))
            ]
        return assets_df

    common_asset_names = set(self.common_assets['asset'].unique())
    reserve_asset_names = set(self.reserve_assets['asset'].unique())
    for asset in assets:
        if asset in common_asset_names:
            self.common_assets = _remove_assets(
                assets_df=self.common_assets, exclude_asset=asset, exclude_dates=assets[asset])
        elif asset in reserve_asset_names:
            self.reserve_assets = _remove_assets(
                assets_df=self.reserve_assets, exclude_asset=asset, exclude_dates=assets[asset])
        else:
            logging.warning(f"can't find {asset} in assets.")
// source/ledger/ledger-model/src/main/java/com/jd/blockchain/ledger/ParticipantInfo.java
//package com.jd.blockchain.ledger;
//
//import com.jd.blockchain.base.data.TypeCodes;
//import com.jd.blockchain.binaryproto.DataContract;
//import com.jd.blockchain.binaryproto.DataField;
//import com.jd.blockchain.crypto.asymmetric.PubKey;
//
//import my.utils.ValueType;
//
///**
// * Participant information;
// *
// * @author huanghaiquan
// *
// */
//@DataContract(code = TypeCodes.METADATA_PARTICIPANT_INFO)
//public interface ParticipantInfo {
//
//    /**
//     * Name of the participant;
//     *
//     * @return
//     */
//    @DataField(order = 1, primitiveType = ValueType.TEXT)
//    String getName();
//
//    /**
//     * Public key;
//     *
//     * @return
//     */
//    @DataField(order = 2, primitiveType = ValueType.BYTES)
//    PubKey getPubKey();
//
//}
/**
 * Also render the initial position.
 * init() is also called for new entities/components.
 *
 * @param group
 */
@Override
public void init(EcsGroup group) {
    if (group != null) {
        GraphMovingComponent gmc = (GraphMovingComponent) group.cl.get(0);
        adjustVisual(gmc);
    }
}
// Add an exception frame for a PLT. This is called from target code.
void Layout::add_eh_frame_for_plt(Output_data* plt,
                                  const unsigned char* cie_data,
                                  size_t cie_length,
                                  const unsigned char* fde_data,
                                  size_t fde_length) {
  if (parameters->incremental()) {
    return;
  }

  Output_section* os = this->make_eh_frame_section(NULL);
  if (os == NULL)
    return;

  this->eh_frame_data_->add_ehframe_for_plt(plt, cie_data, cie_length,
                                            fde_data, fde_length);

  if (!this->added_eh_frame_data_) {
    os->add_output_section_data(this->eh_frame_data_);
    this->added_eh_frame_data_ = true;
  }
}
/**
 * @brief Internal helper function to validate model files.
 */
static int
__ml_validate_model_file (const char *const *model,
    const unsigned int num_models, gboolean * is_dir)
{
  guint i;

  if (!model || num_models < 1) {
    _ml_loge ("The required param, model is not provided (null).");
    return ML_ERROR_INVALID_PARAMETER;
  }

  if (g_file_test (model[0], G_FILE_TEST_IS_DIR)) {
    *is_dir = TRUE;
    return ML_ERROR_NONE;
  }

  for (i = 0; i < num_models; i++) {
    if (!model[i] || !g_file_test (model[i], G_FILE_TEST_IS_REGULAR)) {
      _ml_loge ("The given param, model path [%s] is invalid or not given.",
          GST_STR_NULL (model[i]));
      return ML_ERROR_INVALID_PARAMETER;
    }
  }

  return ML_ERROR_NONE;
}
def create_thriftpy_context(server_side=False, ciphers=None):
    if MODERN_SSL:
        if server_side:
            context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
        else:
            context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)

        if ciphers:
            context.set_ciphers(ciphers)
    else:
        context = SSLContext(ssl.PROTOCOL_SSLv23)
        context.options |= OP_NO_SSLv2
        context.options |= OP_NO_SSLv3
        context.options |= OP_NO_COMPRESSION

        if server_side:
            context.options |= OP_CIPHER_SERVER_PREFERENCE
            context.options |= OP_SINGLE_DH_USE
            context.options |= OP_SINGLE_ECDH_USE
        else:
            context.verify_mode = ssl.CERT_REQUIRED
            warnings.warn(
                "ssl check hostname support disabled, upgrade your python",
                InsecurePlatformWarning)

        if ciphers:
            context.set_ciphers(ciphers)

    return context
FROM THE OP - "I was inside my apartment when I heard the first explosion and witnessed smoke rising into the sky. I turned my camera on, placing it on the balcony of my apartment, and continued getting myself ready for work. Several minutes later, there was a much larger explosion. This explosion blew my camera backwards and broke all the windows in the apartment."

More info on the explosion -

At approximately noon local time on Saturday 15 March 2008, at an ex-military ammunition depot in the village of Gërdec in the Vorë Municipality, Albania (14 kilometers from Tirana, the nation's capital), U.S. and Albanian munitions experts were preparing to destroy stockpiles of obsolete ammunition. The methodical destruction of the old ammo was supposed to occur through a series of small, controlled explosions, but a chain of events led to the entire stockpile going up at once. The main explosion, involving more than 400 tons of propellant in containers, destroyed hundreds of houses within a few kilometers of the depot and broke windows in cars on the Tirana-Durrës highway. A large fire caused a series of smaller but powerful explosions that continued until 2 a.m. on Sunday. The explosions could be heard as far away as the Macedonian capital of Skopje, 170 km (110 mi) away.

Thousands of artillery shells, most of them unexploded, littered the area. The blast shattered all the windows of the terminal building at the country's only international airport, and all flights were suspended for some 40 minutes. Some 4,000 inhabitants of the zone were evacuated and offered shelter in state-owned resorts. The Government declared the zone a disaster area. According to subsequent investigations, a privately managed ammo dismantling process was ongoing in the area. The disaster killed 16 people and injured 243.
/**
 * A Filter that provides response caching, for HTTP {@code GET} requests.
 * <p>
 * Originally based on ideas and code found in the ONJava article
 * <a href="http://www.onjava.com/pub/a/onjava/2003/11/19/filters.html">Two
 * Servlet Filters Every Web Application Should Have</a>
 * by Jayson Falkner.
 * </p>
 *
 * @author Jayson Falkner
 * @author <a href="mailto:[email protected]">Harald Kuhr</a>
 * @author last modified by $Author: haku $
 * @version $Id: CacheFilter.java#4 $
 */
@Deprecated
public class CacheFilter extends GenericFilter {

    HTTPCache cache;

    /**
     * Initializes the filter
     *
     * @throws javax.servlet.ServletException
     */
    public void init() throws ServletException {
        FilterConfig config = getFilterConfig();

        // Default don't delete cache files on exit (persistent cache)
        boolean deleteCacheOnExit = "TRUE".equalsIgnoreCase(config.getInitParameter("deleteCacheOnExit"));

        // Default expiry time 10 minutes
        int expiryTime = 10 * 60 * 1000;

        String expiryTimeStr = config.getInitParameter("expiryTime");
        if (!StringUtil.isEmpty(expiryTimeStr)) {
            try {
                // TODO: This is insane.. :-P Let the expiry time be in minutes or seconds..
                expiryTime = Integer.parseInt(expiryTimeStr);
            }
            catch (NumberFormatException e) {
                throw new ServletConfigException("Could not parse expiryTime: " + e.toString(), e);
            }
        }

        // Default max mem cache size 10 MB
        int memCacheSize = 10;

        String memCacheSizeStr = config.getInitParameter("memCacheSize");
        if (!StringUtil.isEmpty(memCacheSizeStr)) {
            try {
                memCacheSize = Integer.parseInt(memCacheSizeStr);
            }
            catch (NumberFormatException e) {
                throw new ServletConfigException("Could not parse memCacheSize: " + e.toString(), e);
            }
        }

        int maxCachedEntites = 10000;

        try {
            cache = new HTTPCache(
                    getTempFolder(),
                    expiryTime,
                    memCacheSize * 1024 * 1024,
                    maxCachedEntites,
                    deleteCacheOnExit,
                    new ServletContextLoggerAdapter(getFilterName(), getServletContext())
            ) {
                @Override
                protected File getRealFile(CacheRequest pRequest) {
                    String contextRelativeURI = ServletUtil.getContextRelativeURI(((ServletCacheRequest) pRequest).getRequest());

                    String path = getServletContext().getRealPath(contextRelativeURI);

                    if (path != null) {
                        return new File(path);
                    }

                    return null;
                }
            };
            log("Created cache: " + cache);
        }
        catch (IllegalArgumentException e) {
            throw new ServletConfigException("Could not create cache: " + e.toString(), e);
        }
    }

    private File getTempFolder() {
        File tempRoot = (File) getServletContext().getAttribute("javax.servlet.context.tempdir");
        if (tempRoot == null) {
            throw new IllegalStateException("Missing context attribute \"javax.servlet.context.tempdir\"");
        }
        return new File(tempRoot, getFilterName());
    }

    public void destroy() {
        log("Destroying cache: " + cache);
        cache = null;
        super.destroy();
    }

    protected void doFilterImpl(ServletRequest pRequest, ServletResponse pResponse, FilterChain pChain)
            throws IOException, ServletException {
        // We can only cache HTTP GET/HEAD requests
        if (!(pRequest instanceof HttpServletRequest
                && pResponse instanceof HttpServletResponse
                && isCachable((HttpServletRequest) pRequest))) {
            pChain.doFilter(pRequest, pResponse); // Continue chain
        }
        else {
            ServletCacheRequest cacheRequest = new ServletCacheRequest((HttpServletRequest) pRequest);
            ServletCacheResponse cacheResponse = new ServletCacheResponse((HttpServletResponse) pResponse);
            ServletResponseResolver resolver = new ServletResponseResolver(cacheRequest, cacheResponse, pChain);

            // Render fast
            try {
                cache.doCached(cacheRequest, cacheResponse, resolver);
            }
            catch (CacheException e) {
                if (e.getCause() instanceof ServletException) {
                    throw (ServletException) e.getCause();
                }
                else {
                    throw new ServletException(e);
                }
            }
            finally {
                pResponse.flushBuffer();
            }
        }
    }

    private boolean isCachable(HttpServletRequest pRequest) {
        // TODO: Get Cache-Control: no-cache/max-age=0 and Pragma: no-cache from REQUEST too?
        return "GET".equals(pRequest.getMethod()) || "HEAD".equals(pRequest.getMethod());
    }

    // TODO: Extract, complete and document this class, might be useful in other cases
    // Maybe add it to the ServletUtil class
    static class ServletContextLoggerAdapter extends Logger {
        private final ServletContext context;

        public ServletContextLoggerAdapter(String pName, ServletContext pContext) {
            super(pName, null);
            context = pContext;
        }

        @Override
        public void log(Level pLevel, String pMessage) {
            context.log(pMessage);
        }

        @Override
        public void log(Level pLevel, String pMessage, Throwable pThrowable) {
            context.log(pMessage, pThrowable);
        }
    }
}
//! diameter filter test case for binned class with periodic boundary conditions
UP_TEST(NeighborListStencil_diameter_shift_periodic) {
    neighborlist_diameter_shift_periodic_tests<NeighborListStencil>(
        std::shared_ptr<ExecutionConfiguration>(
            new ExecutionConfiguration(ExecutionConfiguration::CPU)));
}
def flush(self, indexes=['_all'], refresh=None):
    path = self._make_path([','.join(indexes), '_flush'])
    args = {}
    if refresh is not None:
        args['refresh'] = refresh
    response = self._send_request('POST', path, querystring_args=args)
    return response
DESLOCAMENTO = (
    (-1, 0),   # up
    ( 1, 0),   # down
    ( 0, -1),  # left
    ( 0, 1),   # right
)


def possivel(i, j, n, m):
    return 0 <= i < n and 0 <= j < m


def site_vazio(maze, n, m):
    for i in range(n):
        for j in range(m):
            if maze[i][j] == ".":
                return i, j


def dfs(maze, n, m, limite):
    visitados = [[False] * m for _ in range(n)]
    i, j = site_vazio(maze, n, m)
    vizinhos = [(i, j)]
    count = 0
    while len(vizinhos) != 0:
        i, j = vizinhos.pop()
        visitados[i][j] = True
        count += 1
        # after `limite` reachable cells have been kept, wall off the rest
        if count > limite:
            maze[i][j] = "X"
        for di, dj in DESLOCAMENTO:
            novoi = i + di
            novoj = j + dj
            if possivel(novoi, novoj, n, m) and not visitados[novoi][novoj] and maze[novoi][novoj] == ".":
                vizinhos.append((novoi, novoj))
                visitados[novoi][novoj] = True


n, m, k = [int(x) for x in input().split()]
maze = [[s for s in input()] for _ in range(n)]
total_livre = sum(linha.count(".") for linha in maze)
dfs(maze, n, m, total_livre - k)
for linha in maze:
    print(*linha, sep="")
def define_input_fields(params):
    pixelsX = params['pixelsX']
    pixelsY = params['pixelsY']
    dx = params['Lx']
    dy = params['Ly']

    xa = np.linspace(0, pixelsX - 1, pixelsX) * dx
    xa = xa - np.mean(xa)
    ya = np.linspace(0, pixelsY - 1, pixelsY) * dy
    ya = ya - np.mean(ya)
    [y_mesh, x_mesh] = np.meshgrid(ya, xa)
    x_mesh = x_mesh[np.newaxis, :, :]
    y_mesh = y_mesh[np.newaxis, :, :]

    lam_phase_test = params['lam0'][:, 0, 0, 0, 0, 0]
    lam_phase_test = lam_phase_test[:, tf.newaxis, tf.newaxis]
    theta_phase_test = params['theta'][:, 0, 0, 0, 0, 0]
    theta_phase_test = theta_phase_test[:, tf.newaxis, tf.newaxis]
    phase_def = 2 * np.pi * np.sin(theta_phase_test) * x_mesh / lam_phase_test
    phase_def = tf.cast(phase_def, dtype=tf.complex64)

    return tf.exp(1j * phase_def)
def _generate_task_from_yield(tasks, func_name, task_dict, gen_doc):
    if not isinstance(task_dict, dict):
        raise InvalidTask("Task '%s' must yield dictionaries" % func_name)

    msg_dup = "Task generation '%s' has duplicated definition of '%s'"
    basename = task_dict.pop('basename', None)
    if 'name' in task_dict:
        basename = basename or func_name
        if task_dict['name'] is None:
            task_dict['name'] = basename
            task_dict['actions'] = None
            group_task = dict_to_task(task_dict)
            group_task.has_subtask = True
            tasks[basename] = group_task
            return

        full_name = "%s:%s" % (basename, task_dict['name'])
        if full_name in tasks:
            raise InvalidTask(msg_dup % (func_name, full_name))

        task_dict['name'] = full_name
        sub_task = dict_to_task(task_dict)
        sub_task.subtask_of = basename

        group_task = tasks.get(basename)
        if group_task:
            if not group_task.has_subtask:
                raise InvalidTask(msg_dup % (func_name, basename))
        else:
            group_task = Task(basename, None, doc=gen_doc, has_subtask=True)
            tasks[basename] = group_task
        group_task.task_dep.append(sub_task.name)
        tasks[sub_task.name] = sub_task
    else:
        if not basename:
            raise InvalidTask(
                "Task '%s' must contain field 'name' or 'basename'. %s" %
                (func_name, task_dict))
        if basename in tasks:
            raise InvalidTask(msg_dup % (func_name, basename))
        task_dict['name'] = basename
        if 'doc' not in task_dict:
            task_dict['doc'] = gen_doc
        tasks[basename] = dict_to_task(task_dict)
// open opens and initializes the view.
func (v *view) open() error {
    if strings.HasPrefix(v.name, viewBSIGroupPrefix) {
        v.cacheType = CacheTypeNone
    }

    if err := func() error {
        v.logger.Debugf("ensure view path exists: %s", v.path)
        if err := os.MkdirAll(v.path, 0777); err != nil {
            return errors.Wrap(err, "creating view directory")
        } else if err := os.MkdirAll(filepath.Join(v.path, "fragments"), 0777); err != nil {
            return errors.Wrap(err, "creating fragments directory")
        }

        v.logger.Debugf("open fragments for index/field/view: %s/%s/%s", v.index, v.field, v.name)
        if err := v.openFragments(); err != nil {
            return errors.Wrap(err, "opening fragments")
        }

        return nil
    }(); err != nil {
        v.close()
        return err
    }

    v.logger.Debugf("successfully opened index/field/view: %s/%s/%s", v.index, v.field, v.name)
    return nil
}
Anti-windup in mid-ranging control

The implementation of anti-windup methods in mid-ranging control needs further attention. It is demonstrated how the use of standard anti-windup schemes may give unnecessary performance degradation during saturation. The problem is illustrated for two separate systems: control of the oxygen concentration in a bio-reactor, and temperature control of a cooling system. In the paper, guidelines are derived for how to design the standard anti-windup scheme to recover performance. As an alternative, a modified anti-windup scheme for mid-ranging control is presented that minimizes the performance degradation during saturation.
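For readers unfamiliar with the general idea the abstract builds on, here is a minimal sketch of a standard anti-windup scheme (back-calculation/tracking) for a plain PI controller. This is not the paper's mid-ranging scheme; the gains, limits, and class name are purely illustrative:

```java
// Back-calculation anti-windup sketch for a PI controller (illustrative only).
// When the actuator saturates, the tracking term kt * (uSat - uUnsat) bleeds
// the integrator toward a value consistent with the saturated output, so the
// integral state does not "wind up" during saturation.
class PiAntiWindup {
    double kp = 1.0, ki = 0.5;       // PI gains (illustrative)
    double kt = 1.0;                 // tracking gain; kt = 0 disables anti-windup
    double uMin = -1.0, uMax = 1.0;  // actuator limits
    double integral = 0.0;           // integrator state

    double step(double error, double dt) {
        double uUnsat = kp * error + integral;                 // unsaturated control signal
        double uSat = Math.max(uMin, Math.min(uMax, uUnsat));  // clamp to actuator limits
        // integrate the error plus the back-calculation correction
        integral += dt * (ki * error + kt * (uSat - uUnsat));
        return uSat;
    }
}
```

With kt = 0 a sustained large error makes the integrator grow without bound; with kt > 0 it settles near a value consistent with the saturation limit. How aggressively to track (the choice of kt, and where to feed the tracking signal in a mid-ranging structure) is exactly the kind of design question the paper's guidelines address.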
/**
 * Default implementation of {@link ThreeDDocument}.
 *
 * @since 8.4
 */
public class ThreeDDocumentAdapter implements ThreeDDocument {

    final DocumentModel docModel;

    public ThreeDDocumentAdapter(DocumentModel threed) {
        docModel = threed;
    }

    @SuppressWarnings("unchecked")
    @Override
    public ThreeD getThreeD() {
        BlobHolder bh = docModel.getAdapter(BlobHolder.class);
        List<Blob> resources = ((List<Map<String, Object>>) docModel.getPropertyValue(
                "files:files")).stream().map(file -> (Blob) file.get("file")).collect(Collectors.toList());
        Map<String, Serializable> infoMap = (Map<String, Serializable>) docModel.getPropertyValue("threed:info");
        ThreeDInfo info = (infoMap != null) ? new ThreeDInfo(infoMap) : null;
        return new ThreeD(bh.getBlob(), resources, info);
    }

    @SuppressWarnings("unchecked")
    @Override
    public Collection<TransmissionThreeD> getTransmissionThreeDs() {
        List<Map<String, Serializable>> list = (List<Map<String, Serializable>>) docModel.getPropertyValue(
                TRANSMISSIONS_PROPERTY);
        return list.stream().map(TransmissionThreeD::new).collect(Collectors.toList());
    }

    @SuppressWarnings("unchecked")
    @Override
    public TransmissionThreeD getTransmissionThreeD(String name) {
        List<Map<String, Serializable>> list = (List<Map<String, Serializable>>) docModel.getPropertyValue(
                TRANSMISSIONS_PROPERTY);
        return list.stream()
                   .filter(item -> ((String) item.get(NAME)) != null && name.equals(item.get(NAME)))
                   .map(TransmissionThreeD::new)
                   .findFirst()
                   .orElse(null);
    }

    @SuppressWarnings("unchecked")
    @Override
    public Collection<ThreeDRenderView> getRenderViews() {
        List<Map<String, Serializable>> list = (List<Map<String, Serializable>>) docModel.getPropertyValue(
                RENDER_VIEWS_PROPERTY);
        return list.stream().map(ThreeDRenderView::new).collect(Collectors.toList());
    }

    @SuppressWarnings("unchecked")
    @Override
    public ThreeDRenderView getRenderView(String title) {
        List<Map<String, Serializable>> list = (List<Map<String, Serializable>>) docModel.getPropertyValue(
                RENDER_VIEWS_PROPERTY);
        return list.stream()
                   .filter(item -> title.equals(item.get(TITLE)))
                   .map(ThreeDRenderView::new)
                   .findFirst()
                   .orElse(null);
    }
}
This article is part 1 of an upcoming article series, Storm vs. Heron. Follow me on Twitter to make sure you don’t miss the next part!

When upgrading your existing Apache Storm topologies to be compatible with Twitter’s newest distributed stream processing engine, Heron, you can simply follow the instructions over at Heron’s page, as Heron aims to be fully compatible with existing Storm topologies. But is it really that simple? I tried exactly that, using a very real topology of mine, and was not entirely surprised by the result.

Setup

So, here’s my setup:

Hadoop 2.7.3 (HDFS + YARN)
Apache Storm 1.0.2
Twitter Heron 0.14.3
macOS Sierra 10.12

All my dependencies come from using $ brew install

Topology

First off, the topology I’m using is my Twitter Analysis topology: it reads in tweets (filtered by keywords, using twitter4j), runs a couple of analysis queries on them (based on a prototypical word list from pre-defined reference accounts, basically shoddy, supervised machine learning for non-Data-Scientists) and persists them into HDFS, using the storm-hdfs library. I’ll get to the details in a later article; for now, all we need to know is that we use the most recent, stable versions of Storm and an external library that does what Storm does best: stream analysis and simple storage.

The process

What Twitter recommends is simple: install Heron, remove the following dependency

<dependency>
  <groupId>org.apache.storm</groupId>
  <artifactId>storm-core</artifactId>
  <version>storm-VERSION</version>
  <scope>${scope.provided}</scope>
</dependency>

from your Maven pom.xml and replace it with Heron’s dependencies. Sounds too amazing to be true, considering that updating from Storm 0.x to 1.x required us to make actual code changes! Not to mention the much easier deployment, configuration and under-the-hood changes Heron comes with.
But unfortunately, not everything went as smoothly as planned…

Issues

storm-hdfs

As of right now, Heron does not seem to be compatible with storm-hdfs (despite having stated otherwise on the project’s GitHub in the past), judging by this error:

You have a couple of options for this, I guess. Heron seems unable to work with pre-compiled classes containing Storm objects, so your best shot would be to grab the library and integrate it into your project as regular classes and packages. The other option is re-writing the library. While that seems like an expensive endeavour, it may prove useful if you require more control or flexibility anyway. As I did this in the past with the storm-hbase library for those exact reasons, I’m pretty sure this is far from a viable option for everyone, but surely it will work for some. Considering the sheer number of external libraries for Storm (kafka, hdfs, hbase, mqtt, hive, jms, redis …), this could turn out to be a real problem, though. So, if you know of a smarter alternative, let me know!

Update: Thanks to the Heron team over at Twitter, the external systems for Heron are work in progress!

twitter4j

twitter4j did not exactly leave me happy either. While my simple spout worked, it was not able to emit the twitter4j Status interface, representing a tweet, as a tweet usually comes in as a twitter4j.StatusJSONImpl, a class not visible from outside the twitter4j package. As Heron uses kryo for this, I was not able to register it with the Heron configuration:

conf.registerSerialization(twitter4j.StatusJSONImpl.class);

The solution to this was fairly simple, after the Reflections library failed me as well: I used a custom wrapper object, emulating the Status interface and hence remaining compatible with the rest of my code. Not pretty, but it works – and it required me to touch code after all. Remember to register these classes as well!
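To make the wrapper workaround above concrete, here is a rough sketch of what such a wrapper can look like. The class name and field selection are my own invention for illustration, not the actual code from the topology; the point is simply that a public class with plain fields and a no-arg constructor can be registered for kryo serialization, while the package-private twitter4j.StatusJSONImpl cannot:

```java
// Hypothetical kryo-friendly wrapper around the data we need from a twitter4j
// Status. The real Status interface exposes many more properties; copy only
// what the downstream bolts actually consume.
class TweetWrapper implements java.io.Serializable {
    public long id;
    public String text;
    public String userScreenName;
    public long createdAtMillis;

    // kryo needs a no-arg constructor to instantiate the class on deserialization
    public TweetWrapper() {
    }

    public TweetWrapper(long id, String text, String userScreenName, long createdAtMillis) {
        this.id = id;
        this.text = text;
        this.userScreenName = userScreenName;
        this.createdAtMillis = createdAtMillis;
    }
}
```

The spout then builds a TweetWrapper from each incoming Status and emits the wrapper instead, and the wrapper class is what gets passed to conf.registerSerialization(...).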
The asm library

That one was tricky – after fixing the previous errors, I was left with this:

As it turns out, the method call you see there used to be an interface call in asm 3.x, but was switched to a class in 4.x+. Adding this to my pom.xml fixed that one as well:

<dependency>
  <groupId>org.ow2.asm</groupId>
  <artifactId>asm</artifactId>
  <version>4.0</version>
</dependency>

Conclusion

Heron surely is awesome when we compare it to Storm – at least in theory. While the idea of literally re-using your entire source with a new, better technology seems thrilling, reality looked a bit less exciting. Anyway, after a couple of hours of debugging, and using fixes that could perhaps be done a bit less brutally, it is now running beautifully on my local machine. Time to test it on a cluster!

Disclaimer

Heron is in development and things change quickly. These issues might not hold for your setup, and there might be a perfectly viable solution to all of this – but I wasn’t able to find it and hence decided to document my experiences here.
// Send sends a message to the node with the given node ID func (nb *NodeBag) Send(message NodeMessage) { if node, ok := nb.nodes[message.nodeID]; ok { node.outgoing <- message.Message } }
package mil.nga.giat.geowave.adapter.vector.render; import org.geoserver.wms.WMS; import org.geoserver.wms.WMSInfo; import org.geoserver.wms.WMSInfo.WMSInterpolation; import org.geoserver.wms.WMSInfoImpl; public class DistributedRenderWMSFacade extends WMS { private final DistributedRenderOptions options; public DistributedRenderWMSFacade( final DistributedRenderOptions options ) { super( null); this.options = options; } @Override public int getMaxBuffer() { return options.getBuffer(); } @Override public int getMaxRenderingTime() { return options.getMaxRenderTime(); } @Override public int getMaxRenderingErrors() { return options.getMaxErrors(); } @Override public WMSInterpolation getInterpolation() { return WMSInterpolation.values()[options.getWmsInterpolationOrdinal()]; } @Override public boolean isContinuousMapWrappingEnabled() { return options.isContinuousMapWrapping(); } @Override public boolean isAdvancedProjectionHandlingEnabled() { return options.isAdvancedProjectionHandlingEnabled(); } @Override public WMSInfo getServiceInfo() { return new WMSInfoImpl(); } @Override public int getMaxRequestMemory() { // bypass checking memory within distributed rendering return -1; } }
/** * OpenCV UI part, handling mouse actions */ static void mouse_callback(int event, int x, int y, int flags, void *userdata) { auto coordinate = (int*) userdata; int rows = coordinate[0]; int cols = coordinate[1]; bool clicked = false; bool mouse_moved = false; switch (event) { case EVENT_MOUSEMOVE : mouse_moved = true; break; case EVENT_LBUTTONDOWN : clicked = true; break; case EVENT_RBUTTONDOWN : break; case EVENT_MBUTTONDOWN : break; case EVENT_LBUTTONUP : break; case EVENT_RBUTTONUP : break; case EVENT_MBUTTONUP : break; case EVENT_LBUTTONDBLCLK: break; case EVENT_RBUTTONDBLCLK: break; case EVENT_MBUTTONDBLCLK: break; case EVENT_MOUSEWHEEL : break; case EVENT_MOUSEHWHEEL : break; default: break; } auto mouse_point = Point(y, x); Mat current_image = images_stack.top().clone(); #ifdef DEBUG_USER_INTERFACE auto start = std::chrono::system_clock::now(); cout << " x: " << x << " y: " << y << endl; #endif if (x > 1 && x < coordinate[1]-2 && y > 1 && y < coordinate[0]-2) { Point seed_flip = Point(mouse_point.y, mouse_point.x); circle(current_image, seed_flip, 2, point_to_point_color, 2); if (mouse_moved) { if ( !points_stack.empty() ) { Point* seed; vector<Pixel_Node *>* seed_graph; seed = &points_stack.top(); seed_graph = &graphs_stack.top(); assert(seed != nullptr); assert(seed_graph != nullptr); plot_path_tree_point_to_point(seed, &mouse_point, seed_graph, &current_image); } } if (clicked) { vector<Pixel_Node *> nodes_graph; init_node_vector(rows, cols, &nodes_graph, &image_gradient); minimum_cost_path_dijkstra(rows, cols, &mouse_point, &nodes_graph); if ( !points_stack.empty() ) { Point stack = points_stack.top(); cout << "clicked: point on stack: " << stack.x << " " << stack.y << " mouse point " << mouse_point.x << " " << mouse_point.y << endl; plot_path_tree_point_to_point(&points_stack.top(), &mouse_point, &graphs_stack.top(), &current_image); } points_stack.push(mouse_point); images_stack.push(current_image); graphs_stack.push(nodes_graph); #ifdef 
DEBUG_USER_INTERFACE auto end = std::chrono::system_clock::now(); cout << "points_stack size " << points_stack.size() << endl; std::chrono::duration<double> running_seconds = end - start; cout << "dijkstra result " << ++click_count << " with " << running_seconds.count() << "seconds. " << endl; #endif } } imshow(plot_window_name, current_image); }
<filename>app/src/main/java/com/moviebomber/adapter/MenuAdapter.java package com.moviebomber.adapter; import android.content.Context; import android.graphics.drawable.Drawable; import android.view.LayoutInflater; import android.view.View; import android.view.ViewGroup; import android.widget.ArrayAdapter; import android.widget.ImageView; import android.widget.TextView; import com.moviebomber.R; import com.moviebomber.model.utils.MenuSection; import java.util.List; import butterknife.ButterKnife; import butterknife.InjectView; /** * Created by engine on 15/3/29. */ public class MenuAdapter extends ArrayAdapter<MenuSection> { public MenuAdapter(Context context, int resource, List<MenuSection> objects) { super(context, resource, objects); } @Override public View getView(int position, View convertView, ViewGroup parent) { ViewHolder holder; if (convertView == null) { convertView = LayoutInflater.from(getContext()).inflate(R.layout.item_menu_section, parent, false); holder = new ViewHolder(convertView); convertView.setTag(holder); } else holder = (ViewHolder)convertView.getTag(); MenuSection menu = this.getItem(position); Drawable icon = this.getContext().getResources().getDrawable(menu.getIconRes()); icon.setBounds(0, 0, icon.getIntrinsicWidth(), icon.getIntrinsicHeight()); holder.mImageIcon.setBackground(icon); holder.mTextMenu.setText(menu.getTitle()); return convertView; } class ViewHolder { @InjectView(R.id.image_menu_icon) ImageView mImageIcon; @InjectView(R.id.text_menu) TextView mTextMenu; ViewHolder(View itemView) { ButterKnife.inject(this, itemView); } } }
/** * Send a POST request to update the error information about the device identifier, as an object * {@link DeviceState}. * * @param deviceId the device identifier. * @param state device state information containing the error. */ public void postError(final String deviceId, final DeviceState state) { try { final List<ClientHttpRequestInterceptor> interceptors = new ArrayList<>(); interceptors.add(new RestHeaderRequestInterceptor(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE)); final String apiServer = OpsPropertiesLoader.getProperty("rest.api.server"); final String path = OpsPropertiesLoader.getProperty("rest.api.error.path"); final UriComponentsBuilder builder = UriComponentsBuilder.fromUriString(apiServer).path(path); final RestTemplate restTemplate = new RestTemplate(sslContext()); restTemplate.setInterceptors(interceptors); restTemplate.postForObject(builder.buildAndExpand(deviceId).toString(), state, DeviceState.class); } catch (Exception e) { logger.error( "Error occurred while setting the devices state for " + deviceId + ". " + e.getMessage()); logger.debug(e.getMessage(), e); } }
#include<bits/stdc++.h> #include<unistd.h> using namespace std; #define FZ(n) memset((n),0,sizeof(n)) #define FMO(n) memset((n),-1,sizeof(n)) #define F first #define S second #define PB push_back #define ALL(x) begin(x),end(x) #define SZ(x) ((int)(x).size()) #define IOS ios_base::sync_with_stdio(0); cin.tie(0) #define REP(i,x) for (int i=0; i<(x); i++) #define REP1(i,a,b) for (int i=(a); i<=(b); i++) #ifdef ONLINE_JUDGE #define FILEIO(name) \ freopen(name".in", "r", stdin); \ freopen(name".out", "w", stdout); #else #define FILEIO(name) #endif template<typename A, typename B> ostream& operator <<(ostream &s, const pair<A,B> &p) { return s<<"("<<p.first<<","<<p.second<<")"; } template<typename T> ostream& operator <<(ostream &s, const vector<T> &c) { s<<"[ "; for (auto it : c) s << it << " "; s<<"]"; return s; } // Let's Fight! #define double long double using ld = double; struct pdd : pair<double, double> { using pair<double, double>::pair; pdd operator + (const pdd &he) const { return {F+he.F, S+he.S}; } pdd operator - (const pdd &he) const { return {F-he.F, S-he.S}; } double operator * (const pdd &he) const { return F*he.F+S*he.S; } pdd operator * (double f) const { return {F*f, S*f}; } }; pdd operator * (double f, const pdd &p) { return {p.F*f, p.S*f}; } inline double abs(pdd p) { return hypot(p.F, p.S); } const double EPS = 1e-12; vector<pdd> interCircle(pdd o1, double r1, pdd o2, double r2) { //if (abs(r1-r2) < EPS and abs(o1-o2) < EPS) return {}; ld d2 = (o1 - o2) * (o1 - o2); ld d = sqrtl(d2); if (d < abs(r1-r2)) return {}; if (d > r1+r2) return {}; pdd u = 0.5*(o1+o2) + ((r2*r2-r1*r1)/(2*d2))*(o1-o2); double A = sqrtl((r1+r2+d) * (r1-r2+d) * (r1+r2-d) * (-r1+r2+d)); pdd v = A / (2*d2) * pdd(o1.S-o2.S, -o1.F+o2.F); return {u+v, u-v}; } const double PI = acosl(-1); int N; vector<pdd> ip; using Circle = pair<pdd, double>; inline double arg(pdd p) { return atan2l(p.S, p.F); } double zz(double x, double y) { if (x > y) x -= 2*PI; double t = y - x; while (t > PI) 
t = 2*PI - t; return t; } double check2(pdd p) { for (auto q: ip) { auto d = p - q; if (abs(d) < 1 - EPS) return 0; } vector<double> ar(N); for (int i=0; i<N; i++) ar[i] = arg(ip[i] - p); sort(ALL(ar)); //cout << ar << endl; double ans = 1e9; for (int i=0; i<N; i++) { int j = (i+1)%N; // cout << ar[i] << ' ' << ar[j] << ' ' << zz(ar[i], ar[j]) << endl; ans = min(ans, zz(ar[i], ar[j])); } return ans; } bool check(pdd p, double m) { /* if (abs(p) < 0.5) { cout << "p = " << p << endl; }*/ for (auto q: ip) { auto d = p - q; if (abs(d) < 1 - EPS) return false; } vector<double> ar(N); for (int i=0; i<N; i++) ar[i] = arg(ip[i] - p); sort(ALL(ar)); //cout << ar << endl; for (int i=0; i<N; i++) { int j = (i+1)%N; //cout << zz(ar[i], ar[j]) << endl; if (zz(ar[i], ar[j]) < m - EPS) return false; } return true; } pdd ans; bool test(double m) { vector<Circle> cirs; double mm = PI - m; //cout << "mm = " << mm << endl; for (int i=0; i<N; i++) { for (int j=i+1; j<N; j++) { auto p = ip[i], q = ip[j]; pdd d = p - q; double l = abs(d); d = {-d.S, d.F}; d = d * (1.0 / abs(d)); pdd mp = (p+q)*0.5; double ll = l/2.0/tanl(mm); d = d*ll; double r = l/2.0/sinl(mm); cirs.PB({mp+d, r}); cirs.PB({mp-d, r}); } cirs.PB({ip[i], 1}); } //cout << cirs << endl; //return 0; int C = SZ(cirs); for (int i=0; i<C; i++) { for (int j=i+1; j<C; j++) { auto c1 = cirs[i], c2 = cirs[j]; auto intp = interCircle(c1.F, c1.S, c2.F, c2.S); // cout << c1 << ' ' << c2 << endl; //cout << intp << endl; for (auto p: intp) { if (std::isnan(p.F)) continue; if (check(p, m)) { ans = p; // cout << m << ' ' << ans << endl; return true; } } } } return false; } //vector<pii> _ip; int main() { FILEIO("astronomy"); IOS; cin >> N; // _ip.resize(N); // for (int i=0; i<N; i++) cin>>_ip[i].F>>_ip[i].S; cout << fixed << setprecision(10); ip.resize(N); for (int i=0; i<N; i++) cin>>ip[i].F>>ip[i].S; double l = 0, r = PI; for (int _=0; _<50; _++) { double md = (l+r)/2; if (test(md)) { l = md; } else { r = md; } } double md = (l+r)/2; 
test(md); cout << ans.F << ' ' << ans.S << endl; // cout << check2(ans) << endl; return 0; }
// "Pull method 'foo' up and make it abstract" "true" public class Test{ void main(){ new Int(){ @Override void foo(){ } }; } }
Your Brain on Facebook: Neuropsychological Associations with Social Versus Other Media We measured individuals’ mental associations between four types of media (books, television, social/Facebook, and general informational web pages) and relevant concepts (Addictive, Story, Interesting, Frivolous, Personal, and Useful) using three different measurements: a Likert scale questionnaire, a speeded Yes/No judgment task, and electrical brain activity. The three measures were designed to capture associations at different levels of mental processing, from very automatic (electrical brain activity) to conscious and reasoned (questionnaire). At more conscious levels of cognitive processing, Facebook was seen as interesting, addictive, and highly personal. Results for the electrical brain activity measure show that Facebook tells less of a story and, surprisingly, is less personal than other forms of media. We discuss differences in results across the three measures and how our findings can inform the design of future social media systems.
import random

'''
Homework: this is the card game Durak, for which I will write pytest & unittest tests
'''

class Card:
    '''A card from a standard playing deck'''

    def __init__(self, suit, rank):
        '''Initialize a card'''
        self.suit = suit  # instance attribute: suit
        self.rank = rank  # instance attribute: rank (value)

    suit_names = ['Clubs ♧', 'Diamonds ♢', 'Hearts ♡', 'Spades ♠']
    rank_names = [None, '6', '7', '8', '9', '10', 'Jack', 'Queen', 'King', 'Ace']

    def __str__(self):
        '''String representation of a card'''
        return '%s of %s' % (Card.rank_names[self.rank], Card.suit_names[self.suit])

    def __lt__(self, other):
        '''
        Compares this card with another: returns True if the other card has
        the same suit and a higher rank, i.e. the other card beats this one.
        (the __lt__(x < y) hook)
        '''
        return self.rank < other.rank and self.suit == other.suit

    '''
    tuple-comparison principle for __lt__:
    t1 = self.suit, self.rank
    t2 = other.suit, other.rank
    return t1 < t2
    '''

class Deck:
    '''A deck of cards'''

    def __init__(self):
        '''Initialize a 36-card deck'''
        self.cards = []
        for suit in range(4):
            for rank in range(1, 10):
                card = Card(suit, rank)
                self.cards.append(card)

    def __str__(self):
        '''String representation of the deck'''
        res = []
        for card in self.cards:
            res.append(str(card))
        return '\n'.join(res)

    def add_card(self, card):
        '''Add a card to the deck'''
        self.cards.append(card)

    def remove_card(self, card):
        '''Remove a card from the deck'''
        self.cards.remove(card)

    def pop_card(self, i=-1):
        '''Remove and return a card from the deck'''
        return self.cards.pop(i)

    def shuffle(self):
        '''Shuffle the cards in the deck'''
        random.shuffle(self.cards)

    def sort(self):
        '''Sort the cards in ascending order'''
        self.cards.sort()

    def move_cards(self, hand, num):
        '''Deal num cards to the player hand'''
        for i in range(num):
            hand.add_card(self.pop_card())

    def __len__(self):
        '''Return the number of cards'''
        return len(self.cards)

class Hand(Deck):
    '''A player's hand'''

    def __init__(self, label=''):
        self.cards = []
        self.label = label

    def __len__(self):
        '''Return the number of cards'''
        return len(self.cards)

deck = Deck()  # create the deck
deck.shuffle()  # shuffle it
#print('Printing the created deck:\n', deck)  # commented out: this info is noisy
#print("Variable type ", type(deck))
artem = Hand('Artem')  # create player Artem
pk = Hand('PK')  # create player PK
deck.move_cards(artem, 6)  # deal 6 cards to player artem
deck.move_cards(pk, 6)  # deal 6 cards to player pk
print(' Cards held by player artem: \n', artem)
print(' Cards held by player pk: \n', pk)
#print(' Control printout of deck: \n', deck)  # commented out: this info is noisy
card_1 = artem.pop_card()  # player artem attacks with this card
print(' Player artem attacks with: \n', card_1)
card_2 = ""
for card in pk.cards:
    if card_1 < card:
        card_2 = pk.pop_card(pk.cards.index(card))
        break  # defend with the first card that beats the attack
print(' Printing the card PK defended with: \n', card_2)
if card_2:
    print(f" -> player {artem.label} lost")
else:
    print(f" -> player {pk.label} lost")
<reponame>dyna-mis/Hilabeling<filename>src/customwidgets/qgsrasterbandcomboboxplugin.h /*************************************************************************** qgsrasterbandcomboboxplugin.h -------------------------------------- Date : 09.05.2017 Copyright : (C) 2017 <NAME> Email : <EMAIL> *************************************************************************** * * * This program is free software; you can redistribute it and/or modify * * it under the terms of the GNU General Public License as published by * * the Free Software Foundation; either version 2 of the License, or * * (at your option) any later version. * * * ***************************************************************************/ #ifndef QGSRASTERBANDCOMBOBOXPLUGIN_H #define QGSRASTERBANDCOMBOBOXPLUGIN_H #include <QtGlobal> #include <QtUiPlugin/QDesignerCustomWidgetInterface> #include <QtUiPlugin/QDesignerExportWidget> #include "qgis_customwidgets.h" class CUSTOMWIDGETS_EXPORT QgsRasterBandComboBoxPlugin : public QObject, public QDesignerCustomWidgetInterface { Q_OBJECT Q_INTERFACES( QDesignerCustomWidgetInterface ) public: explicit QgsRasterBandComboBoxPlugin( QObject *parent = nullptr ); private: bool mInitialized; // QDesignerCustomWidgetInterface interface public: QString name() const override; QString group() const override; QString includeFile() const override; QIcon icon() const override; bool isContainer() const override; QWidget *createWidget( QWidget *parent ) override; bool isInitialized() const override; void initialize( QDesignerFormEditorInterface *core ) override; QString toolTip() const override; QString whatsThis() const override; QString domXml() const override; }; #endif // QGSRASTERBANDCOMBOBOXPLUGIN_H
Cars on Metra tracks BUCKTOWN — Two cars driven by a husband and wife were stuck on an outbound Metra railroad line between North and Armitage Avenues in Bucktown near the Kennedy Expy., police and Metra workers said. A Union Pacific Northwest Metra train that was headed toward the North/Clybourn station was halted around 6:30 p.m. Tuesday because an SUV was on the train tracks. The train hit one of the cars, police said. Police sources told DNAinfo that the driver of the first car somehow drove onto the tracks after following directions from her GPS navigation system. The driver of the second car, an SUV, then followed the first car onto the tracks, police said. The driver of the SUV was the husband of the car's driver, police said. The cars were initially thought to have both been driving on the westbound North Avenue ramp to enter the Kennedy Expy., police said. There is also an additional ramp in that vicinity off Ashland Avenue used by Metra workers only, and police believe it's possible the couple could have driven up that ramp onto the tracks presuming it was the entrance to the highway. The car avoided getting hit by the train, but the SUV was struck after attempting a U-turn to get off the tracks, police said. The driver of the SUV refused medical treatment but was shaken up, police said. Around 8:20 p.m., a tow truck was on the scene to remove both cars from the tracks before the train, which still had passengers in it, could continue moving. The driver of the car was sitting in the driver's seat of her car and her husband was next to her in the passenger seat as his SUV was towed first. Michael Gillis, a Metra spokesman, said all evening trains were delayed by about an hour, and the passengers on the train involved "saw a closer to two hour delay" before it got moving again a little after 8:30 p.m. Tuesday.
Metra police trying to tow SUV off the tracks #bucktown A video posted by alisa (@alisahauser1) on Dec 20, 2016 at 6:07pm PST
#include <iostream> #include <string> #include <algorithm> #include <vector> #define rep(i, n) for (int i = 0; i < (int)(n); i++) #define P pair<int, int> #define ll long long #define x 100010 using namespace std; int main(){ cin.tie(0); int n,m;cin >> n >> m; int h[x],c[x]; for(int i=1;i<=n;i++){ cin >> h[i]; c[i]=0; } rep(i, m){ int a,b;cin >> a >> b; c[a]=max(c[a],h[b]); c[b]=max(c[b],h[a]); } int sum=0; for(int i=1;i<=n;i++){ if(h[i] > c[i]){ sum++; } } cout << sum << endl; }
package main // import ( // "errors" // "fmt" // "log" // "strconv" // "time" // // "github.com/cagnosolutions/adb" // "github.com/cagnosolutions/mg" // ) // // type ScheduledEmail struct { // Id string `json:"id"` // Time int64 `json:"time"` // Data map[string]Data `json:"data,omitempty"` // Vals map[string]interface{} `json:"vals,omitempty"` // Sent bool `json:"sent"` // Email mg.Email `json:"email,omitempty"` // Template string `json:"template,omitempty"` // Reschedule bool `json:"reschedule"` // IntervalMonth int `json:"intervalMonth,omitempty"` // IntervalYear int `json:"intervalYear,omitempty"` // GroupId string `json:"groupId"` // EmailLocation EmailLocation `json:"emailLocation,omitempty"` // } // // type Data struct { // Slice bool `json:"slice,omitempty"` // Key string `json:"key,omitempty"` // Ids []string `json:"ids,omitempty"` // } // // func (data *Data) Get(store string) (string, interface{}) { // if len(data.Ids) > 1 || data.Slice { // var vals []map[string]interface{} // for _, id := range data.Ids { // var val map[string]interface{} // db.Get(store, id, &val) // vals = append(vals, val) // } // return data.Key, vals // } // var val map[string]interface{} // db.Get(store, data.Ids[0], &val) // return data.Key, val // } // // type GroupedEmail struct { // Id string `json:"id"` // Time int64 `json:"time"` // DataStore string `json:"dataStore,omitempty"` // DataId string `json:"dataId,omitempty"` // DataKey string `json:"dataKey,omitempty"` // ValsKey string `json:"valsKey,omitempty"` // Vals map[string]interface{} `json:"vals,omitempty"` // GroupId string `json:"groupId"` // Sent bool `json:"sent"` // } // // type EmailLocation struct { // DataKey string `json:"dataKey,omitempty"` // EmailKey string `json:"emailKey,omitempty"` // } // // func (scheduledEmail *ScheduledEmail) Send() (string, error) { // // var groupedEmails []GroupedEmail // // if scheduledEmail.Vals == nil { // scheduledEmail.Vals = make(map[string]interface{}) // } // // check if 
scheduled email is the parent of grouped emails // if scheduledEmail.GroupId != "" { // // get all emails in this group for this month not yet sent // begM, endM := ThisMonth() // ok := db.TestQuery("grouped-email", &groupedEmails, adb.Eq("sent", "false"), adb.Gt("time", strconv.Itoa(int(begM))), adb.Lt("time", strconv.Itoa(int(endM)))) // if !ok { // fmt.Println("failed query") // } // if len(groupedEmails) < 1 { // // return if there are no grouped emails to send // return "", errors.New("No grouped emails found to send") // } // // // range over grouped emails, combining the data into the parent // for _, ge := range groupedEmails { // var vs []map[string]interface{} // if v, ok := scheduledEmail.Vals[ge.DataKey]; ok { // vs, ok = v.([]map[string]interface{}) // if !ok { // break // } // } // // var val map[string]interface{} // db.Get(ge.DataStore, ge.DataId, &val) // if val == nil { // continue // } // val["vals"] = ge.Vals // vs = append(vs, val) // scheduledEmail.Vals[ge.DataKey] = vs // } // } // // // range the data of the scheduled email // for store, data := range scheduledEmail.Data { // // get the data from the database and enter it into the vals // key, val := data.Get(store) // scheduledEmail.Vals[key] = val // } // // if scheduledEmail.EmailLocation != (EmailLocation{}) { // if data, ok := scheduledEmail.Vals[scheduledEmail.EmailLocation.DataKey].(map[string]interface{}); ok { // if email, ok := data[scheduledEmail.EmailLocation.EmailKey].(string); ok { // scheduledEmail.Email.To = append(scheduledEmail.Email.To, email) // } // } // } // // // combine the vals information in the scheduled email with the template // // set the result as the body of the email // body, err := mg.BodyFile(scheduledEmail.Template, scheduledEmail.Vals, nil) // if err != nil { // log.Printf("main.go >> scheduledEmail.Send() >> mg.Body() >> %v\n\n", err) // return "", err // } // scheduledEmail.Email.HTML = body // // // send the email // resp, err :=
mg.SendEmail(scheduledEmail.Email) // // if there is no sending error, update scheduled email and any grouped emails // if err == nil { // // mark scheduled email as sent and reset the html body // scheduledEmail.Sent = true // scheduledEmail.Email.HTML = "" // // // range grouped emails, set sent to true and save // for _, ge := range groupedEmails { // ge.Sent = true // db.Set("grouped-email", ge.Id, ge) // } // // // check if scheduledEmail is to be rescheduled // if scheduledEmail.Reschedule { // // reset sent to false // scheduledEmail.Sent = false // // rt := time.Unix(scheduledEmail.Time, 0) // // add reschedule interval to scheduledEmail time // // (AddDate returns a new Time, so the result must be assigned) // rt = rt.AddDate(scheduledEmail.IntervalYear, scheduledEmail.IntervalMonth, 0) // scheduledEmail.Time = rt.Unix() // } // // save scheduledEmail if it is not a grouped email parent // // or it is a grouped email parent that is only sent/scraped once // // (scheduled emails with a GroupId are grouped email parents) // // (grouped email parents with time set to 0 are to be sent/scraped every time) // if scheduledEmail.GroupId == "" || (scheduledEmail.GroupId != "" && scheduledEmail.Time != 0) { // db.Set("scheduled-email", scheduledEmail.Id, scheduledEmail) // } // // } // // return resp, err // } // // func Scrape() []ScheduledEmail { // var scheduledEmail []ScheduledEmail // beg, end := Today() // db.TestQuery("scheduled-email", &scheduledEmail, adb.Eq("sent", "false"), adb.Gt("time", strconv.Itoa(int(beg))), adb.Lt("time", strconv.Itoa(int(end))), adb.Eq("groupId", `""`)) // var groupedEmailParent []ScheduledEmail // db.TestQuery("scheduled-email", &groupedEmailParent, adb.Ne("groupId", `""`), adb.Eq("sent", "false")) // scheduledEmail = append(scheduledEmail, groupedEmailParent...)
// return scheduledEmail // } // // func SendToday(hours int) { // for _, scheduledEmail := range Scrape() { // if r, err := scheduledEmail.Send(); err != nil { // log.Println("\t", r) // log.Printf("\t%v\n\n", err) // } // time.Sleep(time.Millisecond * 200) // } // time.AfterFunc((time.Hour * time.Duration(hours)), func() { SendToday(hours) }) // }
def OnCopy(self, event):
    if self.currentCtrl == JetDefs.MAIN_SEGLIST:
        if self.currentSegmentName is None:
            return ""
        segment = self.jet_file.GetSegment(self.currentSegmentName)
        if segment is None:
            return ""
        self.clipBoard = JetCutCopy(self.currentCtrl, segment, self.currentSegmentName)
        return self.currentCtrl
    elif self.currentCtrl == JetDefs.MAIN_EVENTLIST:
        if self.currentSegmentName is None:
            return ""
        if self.currentEventName is None:
            return ""
        segment = self.jet_file.GetSegment(self.currentSegmentName)
        if segment is None:
            return ""
        curEvent = self.jet_file.GetEvent(self.currentSegmentName, self.currentEventName)
        if curEvent is None:
            return ""
        self.clipBoard = JetCutCopy(self.currentCtrl, curEvent, self.currentSegmentName)
        return self.currentCtrl
#include<functional> #include<algorithm> #include<iostream> #include<numeric> #include<cassert> #include<cstring> #include<vector> #include<queue> //#include<cmath> #include<set> #include<map> using namespace std; typedef unsigned long long LL; typedef unsigned long long ULL; typedef vector<int> VI; typedef vector<LL> VLL; typedef vector<VI> VVI; typedef pair<int,int> PII; typedef vector<PII> VPII; #define REP(i,n) for(int i=0;i<(n);++i) #define FOR(i,b,e) for(int i=(b);i<=(e);++i) #define FORD(i,b,e) for(int i=(b);i>=(e);--i) #define FOReach(it,V) for(__typeof((V).begin()) it=(V).begin();it!=(V).end();++it) #define PB push_back #define ALL(V) (V).begin(),(V).end() #define SIZE(V) ((int)(V).size()) #define MP make_pair #define ST first #define ND second #define DBG #ifdef DBG #define debug(...) fprintf(stderr, __VA_ARGS__) #else #define debug(...) #endif int stmp; #define scanf stmp=scanf const int MAX = 100000; const int INF = 1000000001; LL solve(LL x) { if(x < 10) return x+1; if(x < 100) { LL res = 10; for(int i=11;i<=x;i+=11) ++res; return res; } LL d = 100; LL res = 10; for(;x>=d;d*=10LL) res += d/100 * 9LL; LL dig; for(LL i=x;i;i/=10) dig = i%10; res += d/100 * (dig-1); res += (x-(d/10)*dig) / 10 + 1; if(x/10*10+dig > x) --res; return res; } int main(int argc, char *argv[]) { LL a, b; cin >> a >> b; cout << solve(b) - solve(a-1) << endl; return 0; }
#ifndef CPPUNIT_PORTABILITY_H #define CPPUNIT_PORTABILITY_H #if defined(_WIN32) && !defined(WIN32) # define WIN32 1 #endif /* include platform specific config */ #if defined(__BORLANDC__) # include <cppunit/config/config-bcb5.h> #elif defined (_MSC_VER) # if _MSC_VER == 1200 && defined(_WIN32_WCE) //evc4 # include <cppunit/config/config-evc4.h> # else # include <cppunit/config/config-msvc6.h> # endif #else # include <cppunit/config-auto.h> #endif // Version number of package #ifndef CPPUNIT_VERSION #define CPPUNIT_VERSION "1.12.0" #endif #include <cppunit/config/CppUnitApi.h> // define CPPUNIT_API & CPPUNIT_NEED_DLL_DECL #include <cppunit/config/SelectDllLoader.h> /* Options that the library user may switch on or off. * If the user has not done so, we choose default values. */ /* Define to 1 if you wish to have the old-style macros assert(), assertEqual(), assertDoublesEqual(), and assertLongsEqual() */ #if !defined(CPPUNIT_ENABLE_NAKED_ASSERT) # define CPPUNIT_ENABLE_NAKED_ASSERT 0 #endif /* Define to 1 if you wish to have the old-style CU_TEST family of macros. */ #if !defined(CPPUNIT_ENABLE_CU_TEST_MACROS) # define CPPUNIT_ENABLE_CU_TEST_MACROS 0 #endif /* Define to 1 if the preprocessor expands (#foo) to "foo" (quotes incl.) I don't think there is any C preprocessor that does NOT support this! */ #if !defined(CPPUNIT_HAVE_CPP_SOURCE_ANNOTATION) # define CPPUNIT_HAVE_CPP_SOURCE_ANNOTATION 1 #endif /* Assumes that STL and CppUnit are in global space if the compiler does not support namespace. */ #if !defined(CPPUNIT_HAVE_NAMESPACES) # if !defined(CPPUNIT_NO_NAMESPACE) # define CPPUNIT_NO_NAMESPACE 1 # endif // !defined(CPPUNIT_NO_NAMESPACE) # if !defined(CPPUNIT_NO_STD_NAMESPACE) # define CPPUNIT_NO_STD_NAMESPACE 1 # endif // !defined(CPPUNIT_NO_STD_NAMESPACE) #endif // !defined(CPPUNIT_HAVE_NAMESPACES) /* Define CPPUNIT_STD_NEED_ALLOCATOR to 1 if you need to specify * the allocator you used when instantiating STL container.
Typically * used for compilers that do not support template default parameters. * CPPUNIT_STD_ALLOCATOR will be used as the allocator. Default is * std::allocator. On some compilers, you may need to change this to * std::allocator<T>. */ #if CPPUNIT_STD_NEED_ALLOCATOR # if !defined(CPPUNIT_STD_ALLOCATOR) # define CPPUNIT_STD_ALLOCATOR std::allocator # endif // !defined(CPPUNIT_STD_ALLOCATOR) #endif // defined(CPPUNIT_STD_NEED_ALLOCATOR) // Compiler error location format for CompilerOutputter // If not defined, assumes that it's gcc // See class CompilerOutputter for format. #if !defined(CPPUNIT_COMPILER_LOCATION_FORMAT) #if defined(__GNUC__) && ( defined(__APPLE_CPP__) || defined(__APPLE_CC__) ) // gcc/Xcode integration on Mac OS X # define CPPUNIT_COMPILER_LOCATION_FORMAT "%p:%l: " #else # define CPPUNIT_COMPILER_LOCATION_FORMAT "%f:%l:" #endif #endif // If CPPUNIT_HAVE_CPP_CAST is defined, then C++ style casts will be used, // otherwise, C style casts are used. #if defined( CPPUNIT_HAVE_CPP_CAST ) # define CPPUNIT_CONST_CAST( TargetType, pointer ) \ const_cast<TargetType>( pointer ) # define CPPUNIT_STATIC_CAST( TargetType, pointer ) \ static_cast<TargetType>( pointer ) #else // defined( CPPUNIT_HAVE_CPP_CAST ) # define CPPUNIT_CONST_CAST( TargetType, pointer ) \ ((TargetType)( pointer )) # define CPPUNIT_STATIC_CAST( TargetType, pointer ) \ ((TargetType)( pointer )) #endif // defined( CPPUNIT_HAVE_CPP_CAST ) // If CPPUNIT_NO_STD_NAMESPACE is defined then the STL is in the global space. // => Define macro 'std' to nothing #if defined(CPPUNIT_NO_STD_NAMESPACE) # undef std # define std #endif // defined(CPPUNIT_NO_STD_NAMESPACE) // If CPPUNIT_NO_NAMESPACE is defined, then put CppUnit classes in the // global namespace: the compiler does not support namespace.
#if defined(CPPUNIT_NO_NAMESPACE) # define CPPUNIT_NS_BEGIN # define CPPUNIT_NS_END # define CPPUNIT_NS #else // defined(CPPUNIT_NO_NAMESPACE) # define CPPUNIT_NS_BEGIN namespace CppUnit { # define CPPUNIT_NS_END } # define CPPUNIT_NS CppUnit #endif // defined(CPPUNIT_NO_NAMESPACE) /*! Stringize a symbol. * * Use this macro to convert a preprocessor symbol to a string. * * Example of usage: * \code * #define CPPUNIT_PLUGIN_EXPORTED_NAME cppunitTestPlugIn * const char *name = CPPUNIT_STRINGIZE( CPPUNIT_PLUGIN_EXPORTED_NAME ); * \endcode */ #define CPPUNIT_STRINGIZE( symbol ) _CPPUNIT_DO_STRINGIZE( symbol ) /// \internal #define _CPPUNIT_DO_STRINGIZE( symbol ) #symbol /*! Joins two symbols after expanding them into strings. * * Use this macro to join two symbols. Example of usage: * * \code * #define MAKE_UNIQUE_NAME(prefix) CPPUNIT_JOIN( prefix, __LINE__ ) * \endcode * * The macro defined in the example concatenates a given prefix with the line number * to obtain a 'unique' identifier. * * \internal From boost documentation: * The following piece of macro magic joins the two * arguments together, even when one of the arguments is * itself a macro (see 16.3.1 in C++ standard). The key * is that macro expansion of macro arguments does not * occur in CPPUNIT_JOIN2 but does in CPPUNIT_JOIN. */ #define CPPUNIT_JOIN( symbol1, symbol2 ) _CPPUNIT_DO_JOIN( symbol1, symbol2 ) /// \internal #define _CPPUNIT_DO_JOIN( symbol1, symbol2 ) _CPPUNIT_DO_JOIN2( symbol1, symbol2 ) /// \internal #define _CPPUNIT_DO_JOIN2( symbol1, symbol2 ) symbol1##symbol2 /// \internal Unique suffix for variable name. Can be overridden in platform specific /// config-*.h. Default to line number. #ifndef CPPUNIT_UNIQUE_COUNTER # define CPPUNIT_UNIQUE_COUNTER __LINE__ #endif /*! Adds the line number to the specified string to create a unique identifier. * \param prefix Prefix added to the line number to create a unique identifier. * \see CPPUNIT_TEST_SUITE_REGISTRATION for an example of usage.
 */
#define CPPUNIT_MAKE_UNIQUE_NAME( prefix ) CPPUNIT_JOIN( prefix, CPPUNIT_UNIQUE_COUNTER )

/*! Defines wrap column for %CppUnit. Used by CompilerOutputter. */
#if !defined(CPPUNIT_WRAP_COLUMN)
# define CPPUNIT_WRAP_COLUMN 79
#endif

#endif // CPPUNIT_PORTABILITY_H
// hapi-fhir-jpaserver-subscription/src/test/java/ca/uhn/fhir/jpa/subscription/match/matcher/subscriber/SubscriptionActivatingSubscriberTest.java
package ca.uhn.fhir.jpa.subscription.match.matcher.subscriber;

import ca.uhn.fhir.context.FhirContext;
import ca.uhn.fhir.jpa.api.config.DaoConfig;
import ca.uhn.fhir.jpa.api.dao.DaoRegistry;
import ca.uhn.fhir.jpa.api.dao.IFhirResourceDao;
import ca.uhn.fhir.jpa.partition.SystemRequestDetails;
import ca.uhn.fhir.jpa.subscription.match.matcher.matching.SubscriptionStrategyEvaluator;
import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionCanonicalizer;
import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionConstants;
import ca.uhn.fhir.jpa.subscription.match.registry.SubscriptionRegistry;
import ca.uhn.fhir.jpa.subscription.model.CanonicalSubscriptionChannelType;
import ca.uhn.fhir.rest.server.exceptions.ResourceGoneException;
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.Appender;
import org.hl7.fhir.instance.model.api.IBaseResource;
import org.hl7.fhir.instance.model.api.IIdType;
import org.hl7.fhir.r4.model.Subscription;
import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.ArgumentCaptor;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.Spy;
import org.mockito.internal.util.collections.Sets;
import org.mockito.junit.jupiter.MockitoExtension;
import org.slf4j.LoggerFactory;

import java.util.List;

@ExtendWith(MockitoExtension.class)
public class SubscriptionActivatingSubscriberTest {

	private Logger ourLogger;

	@Mock
	private Appender<ILoggingEvent> myAppender;

	@Spy
	private FhirContext fhirContext = FhirContext.forR4Cached();

	@Mock
	private SubscriptionRegistry mySubscriptionRegistry;

	@Mock
	private DaoRegistry myDaoRegistry;

	@Mock
	private SubscriptionCanonicalizer mySubscriptionCanonicallizer;

	@Mock
	private DaoConfig myDaoConfig;

	@Mock
	private SubscriptionStrategyEvaluator mySubscriptionStrategyEvaluator;

	@InjectMocks
	private SubscriptionActivatingSubscriber mySubscriptionActivatingSubscriber;

	private Level myStoredLogLevel;

	@BeforeEach
	public void init() {
		ourLogger = (Logger) LoggerFactory.getLogger(SubscriptionActivatingSubscriber.class);
		myStoredLogLevel = ourLogger.getLevel();
		ourLogger.addAppender(myAppender);
	}

	@AfterEach
	public void end() {
		ourLogger.detachAppender(myAppender);
		ourLogger.setLevel(myStoredLogLevel);
	}

	@Test
	public void activateSubscriptionIfRequired_activationFails_setsStatusOfSubscriptionToError() {
		CanonicalSubscriptionChannelType type = CanonicalSubscriptionChannelType.RESTHOOK;
		Subscription subscription = new Subscription();
		subscription.setId("Subscription/123");
		String exceptionMsg = "Gone Exception";
		int totalInfoLogs = 1;
		ourLogger.setLevel(Level.ERROR);
		IFhirResourceDao dao = Mockito.mock(IFhirResourceDao.class);

		// when
		Mockito.when(mySubscriptionCanonicallizer.getChannelType(Mockito.any(IBaseResource.class)))
			.thenReturn(type);
		Mockito.when(myDaoConfig.getSupportedSubscriptionTypes())
			.thenReturn(Sets.newSet(type.toCanonical()));
		Mockito.when(mySubscriptionCanonicallizer.getSubscriptionStatus(Mockito.any(IBaseResource.class)))
			.thenReturn(SubscriptionConstants.REQUESTED_STATUS);
		Mockito.when(myDaoRegistry.getSubscriptionDao())
			.thenReturn(dao);
		Mockito.when(dao.read(Mockito.any(IIdType.class), Mockito.any(SystemRequestDetails.class)))
			.thenThrow(new ResourceGoneException(exceptionMsg));

		// test
		boolean isActivated = mySubscriptionActivatingSubscriber.activateSubscriptionIfRequired(subscription);

		// verify
		Assertions.assertFalse(isActivated);
		ArgumentCaptor<IBaseResource> captor = ArgumentCaptor.forClass(IBaseResource.class);
		Mockito.verify(dao).update(captor.capture(), Mockito.any(SystemRequestDetails.class));
		IBaseResource savedResource = captor.getValue();
		Assertions.assertTrue(savedResource instanceof Subscription);
		Assertions.assertEquals(Subscription.SubscriptionStatus.ERROR, ((Subscription) savedResource).getStatus());

		ArgumentCaptor<ILoggingEvent> appenderCaptor = ArgumentCaptor.forClass(ILoggingEvent.class);
		Mockito.verify(myAppender, Mockito.times(totalInfoLogs))
			.doAppend(appenderCaptor.capture());
		List<ILoggingEvent> events = appenderCaptor.getAllValues();
		Assertions.assertEquals(totalInfoLogs, events.size());
		Assertions.assertTrue(events.get(0).getMessage().contains(exceptionMsg));
	}
}
/**
 * Produce a list of results from the list of values associated to the given key. The algorithm tries
 * to match every value found to a mapper stored, and adds the result created if it succeeds to do so.
 * @param key the full path to the key corresponding to a list of string values
 * @param config the config loaded from the TOML file
 * @return a potentially empty collection containing all the results successfully computed
 */
public Collection<T> loadList(String key, ReadableRawMap config) {
    return config.<List<String>>get(key)
            .stream()
            .map(name -> loadFromValue(name, config))
            .filter(Optional::isPresent)
            .map(Optional::get)
            .collect(Collectors.toList());
}
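The pipeline above — map each name through a loader that may fail, filter out the empty results, collect the successes — is a general pattern. A hedged Python analogue is sketched below; `load_list`, the config dict, and the `to_int` mapper are invented for illustration and are not part of the project's API.

```python
def load_list(key, config, mappers):
    """For each string stored under `key`, try each mapper in order and keep
    the first non-None result; values no mapper accepts are silently skipped
    (this mirrors the Optional::isPresent filter in the Java version)."""
    results = []
    for name in config.get(key, []):
        for mapper in mappers:
            value = mapper(name)
            if value is not None:
                results.append(value)
                break
    return results

# Illustrative usage: only the parseable values survive.
config = {"plugins": ["10", "oops", "3"]}
to_int = lambda s: int(s) if s.isdigit() else None
print(load_list("plugins", config, [to_int]))  # → [10, 3]
```

The key design point, in both versions, is that a value that cannot be mapped is dropped rather than aborting the whole load.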
/// Processes one bit of input, returning a valid message, an error, or pending. May be called
/// repeatedly to continuously process incoming messages.
///
/// # Example
///
/// ```no_run
/// # use nexus_revo_io::{SymReaderFsm, SymReaderFsmPoll};
/// # use libftd2xx::{Ft232h, Ftdi};
/// # use libftd2xx_cc1101::CC1101;
/// # use std::convert::TryInto;
/// # use std::io::Read;
/// # let ft = Ftdi::new().expect("unable to Ftdi::new");
/// # let mut ftdi: Ft232h = ft.try_into().expect("not a Ft232h");
/// # let mut cc1101 = CC1101::new(&mut ftdi);
/// # let mut cc1101_reader = cc1101.reader::<32>();
/// # let mut sym_reader = SymReaderFsm::new(&mut cc1101_reader);
/// loop {
///     match sym_reader.poll() {
///         SymReaderFsmPoll::Msg(msg) => println!("{:x?} {:?}", msg.0, msg.1),
///         SymReaderFsmPoll::Err(e) => panic!("poll failure: {:?}", e),
///         SymReaderFsmPoll::Pending => {}
///     }
/// }
/// ```
pub fn poll(&mut self) -> SymReaderFsmPoll {
    let bit = match self.read_bit() {
        Ok(bit) => bit,
        Err(e) => return SymReaderFsmPoll::Err(e),
    };
    if bit != self.read_until {
        return SymReaderFsmPoll::Pending;
    }
    self.read_until = !self.read_until;
    if !self.read_until {
        match self.state {
            SymReaderFsmState::Sync => self.poll_sync(),
            SymReaderFsmState::Addr => self.poll_addr(),
            SymReaderFsmState::Cmd => return self.poll_cmd(),
        }
    }
    SymReaderFsmPoll::Pending
}
import Control.Monad
import Data.List
import qualified Data.ByteString.Char8 as B
import qualified Data.Vector.Unboxed as UV

-- Convert a two-element list into a pair.
r2 :: [Int] -> (Int, Int)
r2 [a, b] = (a, b)

main :: IO ()
main = do
  [n, w] <- map read . words <$> getLine :: IO [Int]
  wvs <- replicateM n $ r2 . unfoldr (B.readInt . B.dropWhile (< '!')) <$> B.getLine
  print $ f n w wvs

-- 0/1 knapsack: fold each (weight, value) item into a vector indexed by
-- capacity, taking the elementwise max of "skip the item" (the old vector)
-- and "take it" (the old vector shifted right by the weight, plus the value).
f :: Int -> Int -> [(Int, Int)] -> Int
f n w = (UV.! w) . foldl' p (UV.replicate (w + 1) 0)
  where
    p v0 (wi, vi) = UV.zipWith max v0 (UV.replicate wi 0 UV.++ UV.map (+ vi) v0)
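The Haskell program above is heavily golfed. For reference, here is the same 0/1 knapsack recurrence written out explicitly — a sketch in Python that covers only the DP, not the program's I/O handling:

```python
def knapsack(capacity, items):
    """0/1 knapsack. items is a list of (weight, value) pairs.
    dp[c] is the best value achievable with total weight at most c."""
    dp = [0] * (capacity + 1)
    for w, v in items:
        # Iterate capacities downward so each item is used at most once;
        # this is the in-place equivalent of the Haskell fold's zipWith max.
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

print(knapsack(5, [(2, 3), (3, 4), (4, 5)]))  # → 7 (take items of weight 2 and 3)
```

The Haskell version expresses the same update without mutation: each item produces a fresh vector by maxing the old one against a shifted, value-bumped copy of itself.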
On Tuesday, Kristian Dyer of Metro US dropped the news that Atlanta United was exploring starting a reserve team in USL as early as 2018. The report relies on information from an anonymous source within the league, who states that efforts for the Five Stripes to expand into USL for 2018 are moving “in a positive direction.” Dyer is generally one of the more trusted media voices around the league and is right about his reports more often than not, so there may well be fire behind this bit of smoke. But why should this excite anyone? “MLS-2” clubs, as they’ve become known, presently make up about 35% of USL and have been an increasing point of contention around the league in recent months. After all, these teams generally aren’t very good, attract very few fans, and contribute to a “feeder league” label that USL is trying to get away from. Plus, Atlanta United have an existing affiliate relationship with Charleston Battery, one of the most storied clubs in the league, and have sent several players there on loan over the last two seasons. Would “Atlanta United 2” really be a better solution for the club than extending that relationship? If we’re going to answer that question, we need to first look at how Charleston Battery is currently helping Atlanta United develop their young players, how it can be improved, and whether or not a reserve club would solve those issues.

Examining the Battery Relationship

One of the first things that the Atlanta United brain-trust did, way back in February of 2016, was make the Charleston Battery their USL affiliate. At the time, it was a very smart decision and provided places to play for some of the club’s first signings. In 2017, the Battery played a more robust role as an affiliate, with ten Atlanta United players spending time with the club over the course of the season. Some played big roles in the Battery’s 2017, most notably striker Romario Williams, who scored a team-high 15 goals, and goalkeeper Alex Tambakis.
However, the main function of a USL affiliate or reserve side is to give young players a chance to gain professional experience, and Williams and Tambakis aren’t exactly young at this point in their careers. Ideally, players 20 years old or younger would benefit the most from this kind of relationship, so we’ve drawn the line there. The Battery played seven players aged 20 or younger this season, six of them coming from Atlanta United (the other was Battery academy product Robbie Robinson, who only appeared for 2 minutes in an Open Cup match, strangely enough against Atlanta). Here’s how much playing time they saw this season. That’s…not great, especially when you consider that none of these players played a single minute in MLS in 2017 (some of them aren’t on the first team yet, to be fair). To get a better picture, we ranked every USL team by the percentage of their total minutes on the season that went to U-20 players. The top nine in this ranking were all MLS reserve teams, and the Battery…well, see for yourself. Atlanta can’t really blame Charleston for this, and that is the inherent problem with affiliate relationships. Charleston are their own club with their own aspirations. They’re here to win matches and compete for USL titles, not to develop another team’s youth for them. Atlanta can send them all the kids they want, but whether they ever see the field is completely up to the Battery.

What Reserve Teams Can Offer

So, if Atlanta United isn’t getting what they need from Charleston, how would a reserve team fix that? There are several positives that a reserve team would offer that an affiliate cannot. Some of them have nothing to do with playing time and more to do with logistics. Typically, MLS reserve teams train at the same facility as the first team, leading to easier player movement between the two squads. It also means that the first team coaching staff can keep a closer eye on who’s performing well and who’s potentially ready for a call-up.
The most obvious benefit to an MLS reserve team is creating a place to give young first-team players professional minutes early in their careers. For example, Patrick Okonkwo is a fine young striker, but he probably won’t be replacing Josef Martinez in the starting XI any time soon. The step up from academy soccer to the pros, even the lower divisions, is massive, and most players take time to adjust. A reserve team would provide a place for players like Okonkwo, Chris Goslin, George Bello, and others to get somewhat consistent minutes for a professional team, which would be hard to come by in Charleston. However, it goes further than that. Reserve teams often sign players directly out of a club’s academy as an intermediate step between the Development Academy and MLS. This is a way to get some of the best young players at a club playing professionally before they graduate high school, but not give them an MLS contract until they are ready. It’s worked brilliantly in the past for players like soon-to-be full US international Tyler Adams, who signed his first professional contract with New York Red Bulls II in 2015. Also, as MLS academies draw more attention from the prying eyes of Europe and Mexico, signing a player to your reserve team helps guarantee compensation should that player leave in the future. This is exactly what LA Galaxy II did in August when they signed 15-year-old Mexican-American super-talent Efrain Alvarez. Signings on these teams aren’t limited to academy players, either. Having an entire secondary roster at the club allows teams to take flyers on raw talents that may or may not develop into something. Oftentimes these players come from abroad, especially from countries in Africa and Asia. It’s exactly how Seattle Sounders found their new starting left back, 20-year-old Cameroonian Nouhou Tolo.
The Amateur Contract Effect

At this point, we’ve established many ways that USL reserve sides help facilitate youth development for MLS clubs, but we haven’t mentioned one very important method. USL has a rule which allows clubs to sign players to “academy contracts.” These are contracts that are specifically for players under the age of 21 that provide a place on the team without providing any actual payment. There is one obvious use for this rule, which is giving professional minutes to academy players without forgoing their NCAA eligibility. Why exactly would a player want to sign one of these contracts? For starters, USL players do not get paid very well, and for many players, passing up on a college scholarship for a barely-livable wage is a very difficult decision. Academy contracts provide the best of both worlds. For clubs, academy contracts can be used to essentially “try out” young players on a professional level. If the player impresses, they usually give them a full pro contract, just as Sporting Kansas City’s reserve team Swope Park Rangers did this season with midfielder Wan Kuzain Wan Kamal. USL rules only allow five academy contract players to be signed before they start counting against the 30-man roster, but that has not stopped many MLS clubs. Some, like the Sounders and Galaxy, have fewer than 18 players on their full-time reserve roster and regularly call up academy players into the USL team. For U-19 and U-17 academy players at clubs like these, playing in USL is a very attainable goal. Even if they do not sign full-time with the team, they can still go to college and, should they develop well, come back to the club in the future as a pro.

Is Atlanta Ready?

Having a USL reserve team brings a ton of benefits when it comes to developing young talent, but there’s still one question left to be answered.
Just under 18 months after Atlanta United’s academy was started, does the club have the depth of young talent available to them where having a reserve team makes sense? Not only is the answer to this question a resounding yes; they may already have more young talent than just about anyone else. The Five Stripes already have five homegrown players, each 19 years old or younger, but the academy behind them is absolutely loaded as well. TopDrawerSoccer, a site devoted to all things youth and collegiate soccer in the US, ranks the top 150 players in each high school class nationwide, ranging from current seniors down to freshmen. Chris Goslin and Andrew Carleton obviously come in near the top of their rankings for the Class of 2018, but five academy players (Justin Garces, Charlie Asensio, Zyen Jones, Rayshaun McGann, and James Brighton) are also ranked in the top 60. Four more players make the top 60 in the Class of 2019 (Jackson Conway, Kendall Edwards, Dylan Gaither and Chad Letts). These are just some of the talented youngsters set to graduate from the academy in the next two years, and there are plenty more behind them. Between first-teamers who need playing time, academy players who need to get their feet wet in professional soccer, and young talents from across the globe who need a chance to prove themselves, the necessary ingredients for adding a USL team to the Atlanta United system are already here in abundance. The time is right for Atlanta United to start their own USL team and complete their youth development pipeline; now we just have to wait and see whether or not it will happen.

Data sourced from Transfermarkt.
// repository: ericniebler/time_series
// Copyright <NAME> 2006. Distributed under the Boost
// Software License, Version 1.0. (See accompanying
// file LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)
#ifndef BOOST_SEQUENCE_BEGIN_DWA200655_HPP
# define BOOST_SEQUENCE_BEGIN_DWA200655_HPP

# include <boost/detail/function1.hpp>
# include <boost/detail/pod_singleton.hpp>
# include <boost/range/result_iterator.hpp>
# include <boost/range/begin.hpp>
# include <boost/sequence/tag.hpp>
# include <boost/mpl/placeholders.hpp>
# include <boost/iterator/counting_iterator.hpp>

namespace boost { namespace sequence {

/// INTERNAL ONLY
namespace impl
{
  template<typename S, typename T = typename tag<S>::type>
  struct begin
  {
      typedef counting_iterator<
          typename range_result_iterator<S>::type
      > result_type;

      result_type operator ()(S &s) const
      {
          return result_type(boost::begin(s));
      }
  };
}

namespace op
{
  using mpl::_;
  struct begin
    : boost::detail::function1<impl::begin<_, impl::tag<_> > >
  {};
}

namespace
{
  op::begin const &begin = boost::detail::pod_singleton<op::begin>::instance;
}

}} // namespace boost::sequence

#endif // BOOST_SEQUENCE_BEGIN_DWA200655_HPP
package chapter4.demo493;

public class Demo {
    public static void main(String[] args) {
        Boolean flag = true;
        if (flag) {
            System.out.println("Hello");
        }
    }
}
/**
 * Generates a String representation of the chess board
 * with the n queens placed in it.
 * @return the board drawing, one rank per line
 */
public String drawBoard() {
    int[][] board = new int[gridSize][gridSize];
    placedQueens.forEach(queen -> board[queen.getCol()][queen.getRow()] = 1);
    final StringBuilder sb = new StringBuilder();
    for (int i = 0; i < gridSize; i++) {
        for (int j = 0; j < gridSize; j++) {
            if (board[i][j] == 1) {
                sb.append("[Q] ");
            } else {
                sb.append("[ ] ");
            }
        }
        sb.append('\n');
    }
    return sb.toString();
}
def _parseIperf( iperfOutput ):
    """Return the last reported bandwidth string from iperf output,
       or '' if none was found."""
    r = r'([\d\.]+ \w+/sec)'
    m = re.findall( r, iperfOutput )
    if m:
        return m[-1]
    else:
        error( 'could not parse iperf output: ' + iperfOutput )
        return ''
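The regex collects every token of the form `<number> <unit>/sec` and the function keeps the last one, which in typical iperf output is the final bandwidth summary. A standalone sketch of the same logic is below; the sample output string is illustrative (not captured from a real run), and the error path is simplified to returning an empty string rather than calling a logging helper:

```python
import re

def parse_iperf(iperf_output):
    """Return the last 'X units/sec' token in the output, or '' if absent."""
    matches = re.findall(r'([\d\.]+ \w+/sec)', iperf_output)
    return matches[-1] if matches else ''

sample = ("[ ID] Interval       Transfer     Bandwidth\n"
          "[  3]  0.0-10.0 sec  1.15 GBytes  987 Mbits/sec\n")
print(parse_iperf(sample))  # → 987 Mbits/sec
```

Taking the last match matters because iperf can print several interval lines before the summary; the summary is printed last.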
// Flag definitions, in accordance with the help document.
func init() {
	rootCmd.AddCommand(wcCmd)
	wcCmd.Flags().BoolVarP(&char, "chars_count", "m", false, "Display the number of characters")
	wcCmd.Flags().BoolVarP(&line, "lines_count", "l", false, "Display the number of lines")
	wcCmd.Flags().BoolVarP(&max_line, "max_len", "L", false, "Display the length of the longest line")
	wcCmd.Flags().BoolVarP(&bytes, "byte_size", "c", false, "Display the size of the file in bytes")
}
package com.jiyun.qcloud.pop;

import android.app.Activity;
import android.util.Log;

import java.util.Stack;

/**
 * A global Activity stack, used to manage Activities.
 */
public class ActivityMgr {

    private static Stack<Activity> activityStack;
    private static ActivityMgr instance;

    /** Constructor */
    private ActivityMgr() {
        if (activityStack == null) {
            activityStack = new Stack<Activity>();
        }
    }

    /** Get the activity manager */
    public static ActivityMgr getActivityManager() {
        if (instance == null) {
            instance = new ActivityMgr();
        }
        return instance;
    }

    /** Get the size of the stack */
    public int size() {
        return activityStack.size();
    }

    // add
    /** Push an activity onto the stack */
    public void pushActivity(Activity activity) {
        // Log.e("ActivityMgr", "className pushed: " + activity.getClass().getName());
        activityStack.add(activity);
        // Log.e("ActivityMgr", "total count: " + activityStack.size());
    }

    // remove
    /** Remove a specific activity from the stack */
    public void clear(Activity activity) {
        if (activity != null) {
            activityStack.remove(activity);
        }
    }

    /** Empty the stack */
    public void clear() {
        activityStack.clear();
    }

    /** Finish and remove the last activity on the stack */
    public void popActivity() {
        Activity activity = activityStack.lastElement();
        if (activity != null) {
            activity.finish();
            activity = null;
        }
    }

    /** Finish and remove the specified activity */
    public void popActivity(Activity activity) {
        if (activity != null) {
            activity.finish();
            activityStack.remove(activity);
            activity = null;
        }
    }

    /** Get the current (topmost) activity */
    public Activity currentActivity() {
        if (activityStack.size() > 0) {
            Activity activity = activityStack.lastElement();
            return activity;
        } else {
            return null;
        }
    }

    /** Get the second-to-last activity (questionable: the code actually returns the last element) */
    public Activity getTopActivity() {
        if (activityStack.size() < 1) {
            return null;
        }
        Activity activity = activityStack.get(activityStack.size() - 1);
        return activity;
    }

    /** Finish all activities except the one of the specified class */
    @SuppressWarnings("rawtypes")
    public void popAllActivityExceptOne(Class cls) {
        while (true) {
            Activity activity = currentActivity();
            if (activity == null) {
                break;
            }
            if (activity.getClass().equals(cls)) {
                break;
            }
            popActivity(activity);
        }
    }

    /** Remove all activities */
    public void popAllActivity() {
        while (true) {
            Activity activity = currentActivity();
            if (activity == null) {
                break;
            }
            popActivity(activity);
        }
    }

    /**
     * Destroy all activities -- ybb
     */
    public void destroyAllActivity() {
        for (int i = 0; i < activityStack.size(); i++) {
            Activity activity = activityStack.get(i);
            activity.finish();
        }
        activityStack.clear();
    }

    /** Destroy all activities except the home screens, going straight back to the main activity */
    public void destroyAllActivityBySingleLogin() {
        for (int i = 0; i < activityStack.size(); i++) {
            Activity activity = activityStack.get(i);
            Log.e("ActivityMgr", "activity.name=" + activity.getClass().getName());
            if (!"com.lecarlink.lf.activity.MainActivity".equals(activity.getClass().getName())
                    && !"com.lecarlink.pltpsuper.integration.home.MainActivity".equals(activity.getClass().getName())) {
                activity.finish();
            }
        }
    }

    /**
     * Get an earlier activity, `index` positions from the top
     */
    public Activity getParentActivity(int index) {
        if (activityStack.size() < index) {
            return null;
        }
        Activity activity = activityStack.get(activityStack.size() - index);
        return activity;
    }

    /** Get the activity at the top of the stack */
    public Activity getTopStackActivity() {
        if (activityStack.size() <= 0) return null;
        Activity activity = activityStack.get(activityStack.size() - 1);
        return activity;
    }
}
/**
 * Created by emil.ivanov on 2/18/18.
 * <p>
 * Infinite scroll solution is based on this stack post
 * https://stackoverflow.com/questions/35673854/how-to-implement-infinite-scroll-in-gridlayout-recylcerview
 */
public class AdapterMovieCollection extends RecyclerView.Adapter<AdapterMovieCollection.ViewHolder> {

    private final List<MovieItem> mData;
    private final ICollectionInteraction mListenerMovieInteraction;
    private final String mImageBaseUrl;
    private final Context mContext;
    private final static int ITEMS_BEFORE_LOAD = 3;

    AdapterMovieCollection(Context context, List<MovieItem> movieItems,
                           ICollectionInteraction listenerMovieInteraction, String imageBaseUrl) {
        this.mData = movieItems;
        this.mListenerMovieInteraction = listenerMovieInteraction;
        this.mImageBaseUrl = imageBaseUrl;
        this.mContext = context;
    }

    @NonNull
    @Override
    public ViewHolder onCreateViewHolder(@Nonnull ViewGroup parent, int viewType) {
        View view = LayoutInflater.from(parent.getContext())
                .inflate(R.layout.movie_list_content, parent, false);
        return new ViewHolder(view);
    }

    @Override
    public void onBindViewHolder(@NonNull ViewHolder holder, int position, @NonNull List<Object> payloads) {
        if ((payloads.size() > 0) && (payloads.get(0) instanceof MovieItem)) {
            mData.get(position).setFavorite((((MovieItem) payloads.get(0))).isFavorite());
            holder.bindData(mData.get(position));
        } else {
            super.onBindViewHolder(holder, position, payloads);
        }
    }

    @Override
    public void onBindViewHolder(@Nonnull final ViewHolder holder, int position) {
        holder.itemView.setTag(position);
        holder.bindData(mData.get(position));
        if ((position >= getItemCount() - ITEMS_BEFORE_LOAD)) {
            mListenerMovieInteraction.onLoadMore();
        }
    }

    /**
     * Update the adapter items if the favorite status has changed.
     * This method is invoked only when the current selected filter is
     * {@link MovieCollectionActivity#MOVIE_CATEGORY_FAVORITES}
     *
     * @param movieItem - currently selected movie item
     */
    public void removeItemFromFavorite(MovieItem movieItem) {
        if (!movieItem.isFavorite()) {
            int position = mData.indexOf(movieItem);
            mData.remove(movieItem);
            notifyItemRemoved(position);
            if (mData.size() == 0) {
                mListenerMovieInteraction.showEmptyList();
            }
        }
    }

    @Override
    public int getItemCount() {
        return mData.size();
    }

    class ViewHolder extends RecyclerView.ViewHolder {

        private final View.OnClickListener mClickListener = new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                int position = getAdapterPosition();
                mListenerMovieInteraction.onMovieSelected(mData.get(position), mBinding.ivThumbMovie);
            }
        };

        private final MovieListContentBinding mBinding;

        ViewHolder(View view) {
            super(view);
            mBinding = DataBindingUtil.bind(view);
            itemView.setOnClickListener(mClickListener);
        }

        private void bindData(MovieItem movieItem) {
            Uri uri = Uri.parse(mImageBaseUrl + movieItem.getPosterPath());
            Picasso.with(mContext)
                    .load(uri)
                    .networkPolicy(NetworkPolicy.OFFLINE)
                    .placeholder(R.color.colorPrimary)
                    .error(R.drawable.empty_image)
                    .into(mBinding.ivThumbMovie, new Callback() {
                        @Override
                        public void onSuccess() {
                        }

                        @Override
                        public void onError() {
                            // Offline cache miss: retry over the network.
                            Picasso.with(mContext)
                                    .load(uri)
                                    .error(android.R.drawable.stat_notify_error)
                                    .into(mBinding.ivThumbMovie);
                        }
                    });
        }
    }

    void addItemsCollection(List<MovieItem> newList) {
        if (!mData.containsAll(newList)) {
            mData.addAll(newList);
            notifyDataSetChanged();
        }
    }

    void updateCollection(List<MovieItem> newList) {
        mData.clear();
        mData.addAll(newList);
        notifyDataSetChanged();
    }

    public interface ICollectionInteraction {
        void onMovieSelected(MovieItem movieItem, View imageView);
        void onLoadMore();
        void showEmptyList();
    }
}
package br.com.angrybits.angrybitsCore.entity;

import java.io.IOException;
import java.text.DateFormat;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

import org.codehaus.jackson.JsonParser;
import org.codehaus.jackson.JsonProcessingException;
import org.codehaus.jackson.map.DeserializationContext;
import org.codehaus.jackson.map.JsonDeserializer;

public class CustomJsonDateDeserializer extends JsonDeserializer<Date> {

    @Override
    public Date deserialize(JsonParser jsonparser, DeserializationContext deserializationcontext)
            throws IOException, JsonProcessingException {
        DateFormat formatter = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        String dateStr = jsonparser.getText();
        try {
            return (Date) formatter.parse(dateStr);
        } catch (ParseException e) {
            throw new RuntimeException(e);
        }
    }
}
/**
 * DistributionReport gives total consumption and production for the timeslot,
 * summed across all brokers.
 */
public void handleMessage(DistributionReport dr) {
    PrintService.getInstance().addDistributionReport(dr.getTimeslot(),
            dr.getTotalProduction(), dr.getTotalConsumption());
}
George O’Donnell died at his home in McKinleyville, California on May 12, 2016 at the age of 86. He left behind gallons of bourbon, vodka and gin that we have no idea what to do with as we are all sober. He was self-indulgent, kind and curious, fond of jokes and unexplainable phenomenon. He believed in UFOs and liked to contemplate the vastness of the universe. He was very proud of the dysfunctional family he left behind. George grew up as an only child, well loved and spoiled by James and Louise Becker O’Donnell in Rochester, New York. Born on July 9, 1929, he had a thoroughly Irish Catholic upbringing, attending St Augustine grammar school and St Thomas Aquinas High School. His mother envisioned the priesthood for him, but George left Maryknoll Seminary when he discerned his true calling of loving the ladies. He learned drafting at St. John Fisher College. He was proud of his service in the Korean conflict, serving in the rank of first lieutenant. George’s employment history was unusual. Shocking his conservative parents, he moved his young wife Geri and infant daughter Kathleen all the way from New York to California in 1955 with no job prospects. The car broke down at the Grand Canyon, but they ran into helpful strangers who towed them to Los Angeles. His eternal optimism bore fruit as he talked his way into work as an engineer, working for Litton, RCA and Hughes Aircraft. He didn’t much care for the work but loved the long boozy lunches with coworkers. Despite fairly steady employment through which he supported not only his family but enormous partying habits, he believed that a nine-to-five job was for the unimaginative and dreamt of becoming a professional gambler. He stayed married to Geri for a record thirteen years, long enough to have five girls to whom he imparted his “take the road less traveled” philosophy. 
His good looks, joie de vivre, dancing skills and love of beautiful women led him on many romantic adventures and two more tries at marriage until he realized he was better off single and friendly. Tiring of the limitations of regular work, he tried his hand at many different things including sales, gambling, used cars, gambling, and Amway. He strongly believed in the power of positive thinking and used it to successfully buy and sell real estate in San Diego for many years. He loved the horse races and was twice victorious in his lifelong quest to hit a Pick 6. He remained close to his Catholic roots and attended Mary Star of the Sea in La Jolla, and Saint Mary’s in Arcata. As proof of his continual luck, George was able to live for nearly 18 years at the fabulous League House in La Jolla, in a stunning studio apartment with a view of the ocean. There he found kindred spirits and a mutual love of shenanigans. He was known for his twice weekly showings of his favorite movies, repetition being his strong point. Moving to McKinleyville to live with his daughter, he continued a consistent routine of washing down his morning vitamins with screwdrivers, starting on martinis at noon and finishing each evening with Manhattans. He called his friend Mickey almost daily to bet on a game. He loved football, the Padres, chocolate, Amy’s toaster pops, Sidney Sheldon novels, tabloids, game shows, the Playboy channel, and Lifetime Movie Network. His fondness for John Wayne movies was surpassed only by his love of fried chicken and reruns of Johnny Carson. His favorite show, Deal or No Deal, combined his major life interests of beautiful women and gambling. George was enthusiastic about bridge and played for many years with the La Jolla Bridge Club and more recently with Sequoia Bridge Club in Eureka. Whenever possible he enjoyed playing blackjack in our local casinos. He passed on his skill as a keeper of treasures and horse race handicapper.
George is survived by his daughters Sheila Kircher (Rick) of McKinleyville, Maureen (Bill) Stapleton of New Lebanon, Ohio, Eileen (Greg) Gapko of Draper, Utah; his grandchildren Melinda Whitney, Sean Traverse, Amber Meyer, Aster Meyer, Shannon Lightner, Michael Swort, William Stapleton, Victoria Stapleton, Sarah Star, Amy Slimp, Stephanie Meese, Michon Lee; his great grandchildren Cody Heyen, Brandon Heyen, William Kirkland, Sean Kirkland, Elijah Swort, Hayden Faulkner, Jimmie Slimp, Seth Meese, Owen Meese, A’lias Morgan, and by many nieces and nephews and lifelong friends. He was preceded in death by his parents, brother Will, and daughters Kathleen and Sharon. A funeral mass is scheduled for May 26 at 10 am at St Mary’s in Arcata. Cremation is underway and George’s ashes will remain with family until we figure out what to do with them. Tremendous heartfelt thanks go to the staff of Eureka VA clinic, Humboldt Caregivers, and Hospice of Humboldt. The care providers at these organizations deserve medals for dealing with the cantankerous with skill and grace. It would be very fitting if people had a drink of their favorite beverage to celebrate George’s passing. ### The obituary above was submitted by George O’Donnell’s family. The Lost Coast Outpost runs obituaries of Humboldt County residents at no charge. See guidelines here. Email [email protected].
The narrow road that leads to Alakhpura, in Haryana's Bhiwani district, is mostly empty on this sultry August afternoon. The clouds above portend heavy showers. The farmers are returning from the fields, leading cows back into their shelters. And just as the village's day is drawing to a close, you can see some two dozen young girls kicking a ball around in a big field, right next to the village's only government school. They are warming up. This nondescript Haryana village, with a sex ratio worse than the state average (Haryana has the ignominy of the worst sex ratio in the country), is the ground zero of an incredible and heartwarming footballing success story scripted by its women. More than 300 women train regularly in this village, and among them are 11 who have represented India and several others who have represented the state across age groups. Footballing success here means a ticket to a better livelihood for their families. A move from a mud home to a concrete one. And as such stories have played out repeatedly, the status of young women in this village has been transformed--they are now seen as being just as capable of changing a family's fortunes as a male child.

Trickle Becomes A Flood

Even as it begins to rain heavily, and the field turns in parts into muddy slush, the girls don't stop playing. The roads from their houses to the field are now flooded. But the field keeps getting more crowded. "We practise every day. Storms and rains can't stop us from playing," says Poonam Sharma. The 18-year-old started playing about seven years ago. Last year, she went to Vietnam to represent India at the Asian Football Confederation Under-19 qualifiers. The girls from Haryana shot into the limelight when the players of the Alakhpura government school won the U-17 Subroto Cup--a national tournament--in 2014. The girls managed to reach the finals in 2015, and win it back in 2016. 
The Alakhpura girls have also been a part of the Haryana U-14, U-19 and the national teams for over a decade now. It all started in the year 2002. The school's physical education teacher Gordhan Dass, 49, used to train the boys to play kabaddi. One day, the girls went to him and said they wanted to play too. Gordhan, a qualified kabaddi coach, decided to give the girls a football that was lying around on the school premises. During recess, these girls simply kicked the football around. "There was a time when the girls would sit with a punctured football. We didn't have money to get a new one. Those girls are now playing for India," a beaming Dass told HuffPost India.

[Photo: HuffPost India — Alakhpura government school's PE teacher Gordhan Dass with team coach Sonika Bijarnia and her husband, a boxer.]

Within two years after the girls started playing football, the villagers got together to turn a road into a field. "They put in their money and got sand and mud to cover the place," he said. A decade later, there are around 300 girls from the village who train in the school's adjoining field every day. From this cohort, champions emerge routinely. Last year, the girls formed Alakhpura FC in their effort to participate in the first Indian Women's League (IWL), initiated by the AIFF (All India Football Federation). The team won the regional qualifiers in Haryana to be one of the 10 teams to qualify for the second round of the IWL. They lost to Manipur's Eastern Sporting Union in the semi-finals. But they had made a mark. Alakhpura FC's 6-2 victory against Aizawl FC in the group stage was one of the most talked-about matches in the league. "At least one girl plays football from every household in Alakhpura," said Sonika Bijarnia, the head coach of Alakhpura FC. 
Bijarnia, who has represented Haryana at the senior nationals, started training these girls in 2014 when she was deputed by the government. "There were so many difficulties. We didn't have proper training facilities. But these girls are extremely talented," she said. Home to about 2,000 people, the village mostly has farmers and daily wage labourers. The census data shows that the average sex ratio in the village is 849, lower than Haryana's state average of 879. But the women's football team is making sure that their village is known for something else. "Now people know the village because of our football. Game is our life," said 19-year-old Jyoti Yadav, who has played in the Subroto Cup and the IWL.

Bend It Like The Alakhpura Girls

Sanju Yadav, 19, was named the AIFF Emerging Player of the Year in 2016 for her performance in the national women's football team. What started as just another game in her school recess became her career. "I don't think I would be anything but a football player," she said. Sanju, daughter of a farm labourer, has just returned from Malaysia after India won the friendly championship against the host country. Sanju has been playing for the last 8 years. She first played for the U-14 Indian team in 2011 against Sri Lanka. But she shot to fame in a match against Bangladesh in February 2016. Playing for the senior team, she scored in the 74th minute as India won 5–1 and moved into the finals. India then went on to defeat Bangladesh 3-1 to win their fourth consecutive SAFF Women's Football Championship title. The eldest of three siblings, Sanju started playing football at the age of 10 when their school's physical education teacher just handed them a football. 
"We would run to the other village, some of us barefoot because we couldn't afford shoes," she said. This was a practice method to build stamina. Till last year, all they had was a set of bricks in place of goalposts. The pitch was barren. "Once the girls started winning tournaments, the government came forward to help," said Bijarnia, Alakhpura FC's coach. The pitch now has grass, and the girls have shoes. In the next six months, the girls will have another field to train on. "We have been allotted Rs 2 crore by the Haryana government to develop this field," she said. Sanju's family has only one acre of land and her father works as a labourer in the fields of others to make ends meet. They lived in a house made of mud bricks. Every day, Sanju would wake up at 5 am to train and then she would help her father in the fields. She would then go to school and return to training in the evening. Once Sanju started playing, she realized that the game could bring scholarships and prize money that would help her parents run the family. "For the last four-five years, she got around Rs 2 lakh in scholarships per year and with that (money) we have constructed a two-bedroom house," said Sanju's mother Nirmala Devi. It's been three months since the 19-year-old got a job with the Railways, thanks to football. "My parents have never discouraged me from playing. At my age, most girls usually get married, but they have never asked me to. They want me to win matches, and not wear a ghunghat (a traditional veil)," she said. This year, along with Sanju, 20-year-old Ritu Bagaria from the same village had gone to Malaysia for a national camp. "I learnt a lot there," she said. "I will now teach the girls in the village some of the tactics I learned there. I hope more and more women from Alakhpura play internationally," she said.

[Photo: Andrew Clarance/HuffPost India — Aneybai, 15, played in the Subroto Cup in 2016.]
15-year-old Aneybai has helped her family in a way her mother had never dreamt of. Aneybai lived in a mud house with her mother and two brothers. Her mother Maya is a safai karmachari (cleaner), and her brothers are in school. Her father passed away when she was four years old. "With the scholarship money that she got from playing football, we have built a brick house and a toilet," Aneybai's mother said. Inspired by the girls, the boys from the village, who hadn't taken much interest in the game till recently, have now started training. "The boys have just started training, they want to achieve what the girls have. They have a long way to go," Bijarnia said.

It Takes A Village...

Prakash Singh Jakhar, a member of the village's Panchayat Committee, recounted how the villagers got together to ensure that a lack of facilities didn't come between these girls and their dreams. "There was no ground at the village for girls to play and so we requested the government to help us out, but no one helped," he said. So, the villagers decided to dry out a nearby pond by filling it with sand, to create a place for girls to play and train. Last year, the girls didn't get any league sponsors, and that's when 2,000 villagers of Alakhpura came together to donate Rs 1.5 lakh so they could take part in the IWL. Dass, the PE teacher who had initiated this football revolution in the village, said that the girls' success in football has won around Rs 50-60 lakh in scholarships this year. "Whenever Haryana's women's team plays a game, there are mostly girls from this village," he said. "The girls' parents try and fund whatever little they can. Some give Rs 100, and some Rs 5,000. They don't want the girls to stop playing because of lack of funds," Bijarnia pointed out. All of this might sound like scenes from an idyllic village movie where everyone just gets along. But that wasn't always so. Like in most Haryana villages, the girls here used to be married off before they turned 16. 
Dass would often have to visit the girls' homes, to try and convince the parents to let their daughters come to the field. Often, the villagers would refuse to listen to Dass. So, he came up with an idea and decided to pose a question to the unwilling villagers. "My daughter used to play football. I would train her too. Every time there was difficulty from a girl's house, I would visit the parents with my daughter. I would tell them that I have a daughter the same age. If it isn't disrespectful for the school teacher's daughter to play, how is it disrespectful for anyone?"

[Photo: Andrew Clarance/HuffPost India — Anjali, 12, Sandhya, 10, Sneha, 9, Niyati, 6, Ritika, 5, and Kafin, 3, are sisters. They all play in the football team.]

That wasn't all. Before Bijarnia, the club's only female coach, came on board, Dass wasn't sure how to travel with the teenage girls for tournaments. "The villagers wouldn't be happy about this," he said. So he would leave his cows to his brother's care and take his wife along with him. "My wife took care of their personal needs, while I continued with the training," he said. Dass believes that the game has changed these girls' lives. "Now the girls' parents want them to play, not get married," he said. He said that if this change could come in a small village like Alakhpura, it can happen in every village of the country.

The Hurdles In Women's Football

Alakhpura might have a success story to tell, but the wider state of women's football in the country has been a sorry one in recent years. The Indian women's football team has seen a steep decline from a period of glory in the 1980s. At the 1980 Asian Women's Championships, held in Calicut (now Kozhikode), India had entered not one but two teams. One of these teams returned home with silver.
#ifndef SPACE_COMPONENTS_AI_FOLLOW_HPP
#define SPACE_COMPONENTS_AI_FOLLOW_HPP

namespace space::components {

// Empty tag component: marks an entity whose AI should follow a target.
struct AIFollow {};

} // namespace space::components

#endif // SPACE_COMPONENTS_AI_FOLLOW_HPP
Disc galaxy resolved in HI absorption against the radio lobe of 3C 433: Case study for future surveys

The neutral atomic gas content of galaxies is usually studied in the HI 21cm emission line of hydrogen. However, at higher redshifts, we need very deep integrations to detect HI emission. HI absorption does not suffer from this dependence on distance as long as there is a bright enough background radio source. However, resolved HI absorption studies of galaxies are rare. We report one such rare study of resolved HI absorption against the radio galaxy 3C 433 at $z = 0.101$ with the VLA. The absorber, located against the southern lobe of 3C 433, shows regular resolved kinematics, with an HI mass $\lesssim 3.4 \times 10^{8} M_{\odot}$ for T$_{spin} = 100$ K. Our deep optical continuum and H$\alpha$ observations from the Gran Telescopio CANARIAS (GTC) show that the absorber is a faint disc galaxy in the same environment as 3C 433, with a stellar mass $\sim 10^{10} M_{\odot}$ and a star-formation rate of 0.15 $M_{\odot}~yr^{-1}$ or less. For its HI mass, HI column density, stellar mass, and star-formation rate, this galaxy lies well below the main sequence of star-forming galaxies. Its HI mass is lower than that of the galaxies studied in HI emission at $z \sim 0.1$. Our GTC imaging reveals interesting alignments between H$\alpha$ and radio emission in the HI companion and in the host galaxy of the AGN, as well as in the circumgalactic medium in between. This suggests that the shock ionization of gas by the propagating radio source may happen across tens of kpc. Our work supports the potential of studying the HI content of galaxies via absorption in the case of a fortuitous alignment with an extended radio continuum. This allows us to trace galaxies with low HI masses which would otherwise be missed by deep HI emission surveys. 
In conjunction with the deep all-sky optical surveys, the blind HI surveys with the SKA pathfinders will be able to detect many such systems.

Introduction

Neutral atomic hydrogen (H i) plays an important role in the formation and evolution of galaxies. Hence, attaining a deeper understanding of this component of gas in galaxies (i.e. the content, distribution, kinematics, etc.) is of interest in the study of galaxy evolution. The H i content, traced by the 21cm line, in galaxies in the local Universe has been studied extensively in emission (e.g. Wright 1971; Rogstad et al. 1974; Sancisi 1976; van der Kruit & Allen 1978; van der Hulst 1979; van der Hulst et al. 1987; Oosterloo & Shostak 1993; Barnes et al. 2001; Koribalski et al. 2004; Giovanelli et al. 2005a,b; Catinella et al. 2010, 2012; Spekkens et al. 2014; Odekon et al. 2016; Jones et al. 2018). These studies have provided insights into the relation between H i content and stellar mass, galaxy structure, star-formation rate, etc. There have also been extensions of such studies beyond the local Universe (e.g. Catinella et al. 2008; Catinella & Cortese 2015; Verheijen et al. 2007; Hess et al. 2019; Bera et al. 2019; Blue Bird et al. 2020; Gogate et al. 2020). However, even with very deep integrations, direct detections of H i in emission have so far been from gas-rich, star-forming galaxies (≥ 2 × 10⁹ M⊙; e.g. Zwaan et al. 2001; Catinella et al. 2012; Fernández et al. 2016; Hess et al. 2019) and mostly for redshifts below z ∼ 0.1. H i absorption studies do not suffer from this limitation of redshift because the detection of H i in this case is possible within short integration times provided there is a strong radio continuum in the background. Hence, H i absorption studies have been employed extensively to study atomic gas in damped Lyman-α systems and MgII absorbers (e.g. Kanekar et al. 2009; Gupta et al. 2009; Kanekar et al. 2014) and the host galaxies of active galactic nuclei (AGN; e.g. Morganti et al. 
2005, 2013; Allison et al. 2015; Aditya et al. 2016, 2017; Chowdhury et al. 2020; Morganti & Oosterloo 2018, and references therein), as well as in the study of the distribution of H i in the halo of galaxies (e.g. Gupta et al. 2010; Borthakur 2016; Dutta et al. 2017). However, by design, these studies have mostly focused on compact radio sources. In cases where the absorption is detected against an extended radio continuum, it is possible to study the absorber in greater detail. The properties of H i at parsec scales have been probed against the extended radio continuum using very long baseline interferometry techniques (e.g. Borthakur et al. 2010; Srianand et al. 2013; Gupta et al. 2018; Schulz et al. 2018). When it is possible to resolve H i absorption at kpc scales, we can extract information that is similar to what we obtain from H i emission studies on the distribution and kinematics of cold gas at galactic scales. Yet, there have only been a few studies so far where a galaxy has been imaged in H i absorption at kpc scales; for example, the intervening spiral galaxies towards 3C 196 at z ∼ 0.4 and PKS 1229−021 at z ∼ 0.395 (Kanekar & Briggs 2004), and the spiral galaxy UGC 00439 at z ∼ 0.02 of a quasar-galaxy pair (Dutta et al. 2016). A great deal can also be inferred even when the absorption itself is unresolved, but knowledge of the background radio structure at a higher spatial resolution is available, which can be used to constrain the nature of the absorbing gas via modelling (e.g. Briggs 1999; Briggs et al. 2001; Murthy et al. 2019). 
With the commencement of large, deep, blind H i surveys planned with the Square Kilometre Array (SKA) and its pathfinder and precursor facilities, which are capable of simultaneously covering a large redshift range, there is the potential to expand such studies to large numbers and to gain particularly valuable insights at higher redshifts. Here, we present an example of such a study of resolved H i absorption towards the radio galaxy 3C 433. As we detail below, 3C 433 has a highly asymmetric radio morphology, with most of the flux density arising from the southern lobe. The H i absorption was detected towards 3C 433 by Mirabel (1989) using the Arecibo telescope. Due to the brightness asymmetry in the radio continuum, it was proposed that the absorption is likely to arise against the southern lobe. Since the large size of the lobe could provide an extended background, this system is a good candidate for carrying out a resolved H i study in absorption. The peculiar radio morphology of 3C 433, as seen in Fig. 1, has already been discussed in various studies (e.g. van Breugel et al. 1983; Parma et al. 1991; Black et al. 1992; Leahy et al. 1997). It has a projected angular size of 58″, corresponding to a linear size of 110 kpc. It has a weak radio core at the base of a highly collimated jet expanding to the north. The jet is initially slightly curved and has a gap in the middle. There is no similar jet-like feature to the south. Instead, the southern lobe has a large opening angle (80°) right at the beginning and contributes ∼ 85% of the radio emission from this source at 1.4 GHz. The southern lobe does not exhibit a relaxed morphology; instead, it protrudes at the end of the lobe as if it has been pinched. It also contains a hotspot and various complex and fine structures (see e.g. Fig. 19 in Leahy et al. 1997). 
Thus, in the Fanaroff & Riley (1974, FR) classification scheme, 3C 433 is a hybrid-morphology source exhibiting an FR I radio jet and an FR II radio lobe. In addition, the southern lobe has a faint wing on the western side, which appears to lie at the same angle from the core as the outer emission in the northern lobe. This emission gives 3C 433 an X-shaped appearance (e.g. Lal & Rao 2007; Gillone et al. 2016). The host galaxy of 3C 433 is part of an interacting pair, enclosed in a common envelope (see Fig. 1; Matthews et al. 1964; van Breugel et al. 1983; Baum et al. 1988; Smith & Heckman 1989; Black et al. 1992). It has been found to have a young stellar population with ages 0.03 < t_YSP < 0.1 Gyr (Tadhunter et al. 2011), which very likely formed as a result of the ongoing interaction. Miller & Brandt (2009) have carried out X-ray observations with Chandra and find diffuse X-ray emission in the soft band, which curves along the east side of the southern lobe in the 0.5–2 keV smoothed image. In order to localise and, if possible, resolve the H i absorber, we observed 3C 433 with NSF's Karl G. Jansky Very Large Array (VLA) in the B configuration. Furthermore, to better characterise the absorber, we also obtained deep narrow- and medium-band optical continuum and Hα images of the field with the Gran Telescopio CANARIAS (GTC). Interestingly, we find that the absorption is due to an intervening disc galaxy with a redshift close to that of 3C 433. This allows us to study the properties of the galaxy in detail and explore the possibility of an interaction between the radio lobe of 3C 433 and the galaxy, as well as its effect on the observed morphology of the radio lobe. We describe our radio and optical observations in Section 2, present our results in Section 3, and discuss the possible origin of the H i absorption and its implications in Section 4. Finally, we summarise our findings in Section 5. 
We have assumed a flat universe with H₀ = 67.3 km s⁻¹ Mpc⁻¹, Ω_Λ = 0.685, and Ω_M = 0.315 (Planck Collaboration et al. 2014) for all our calculations. 3C 433 is at z = 0.1016 ± 0.0001 (an optical systemic velocity of 30458.9 km s⁻¹; Schmidt 1965; Hewitt & Burbidge 1991), where 1″ corresponds to 1.943 kpc.

VLA observations

Our VLA observations were carried out in July 2002 with antennas in B configuration (project id: AM0730). We observed 3C 433 for 4.5 hours in total. Because the observations were performed before the upgrade to the VLA wideband WIDAR correlator, only a single Stokes parameter (RR) was used in order to cover a sufficiently wide bandwidth of 6.2 MHz, subdivided into 127 channels. The observations consisted of interleaved scans on the flux and bandpass calibrator (3C 48), the phase calibrator (B2 2113+29), and the target. The data reduction was done in 'classic' AIPS (Astronomical Image Processing Software). We first flagged the bad baselines and bad data. We then determined the antenna-dependent gain and bandpass solutions using the data on the calibrators, and iteratively improved the gain solutions via self-calibration. Initially, we carried out a few cycles of imaging and phase-only self-calibration, followed by a round of amplitude-and-phase self-calibration and imaging. We then subtracted the continuum model from the calibrated visibilities and flagged the residual UV data affected by radio frequency interference. We fit a second-order polynomial to the line-free channels of each visibility spectrum. Finally, we imaged this UV data to get the spectral cube. We made the continuum map using robust weighting of −1, averaging all the line-free channels together. It has a restoring beam of 5.39″ × 4.72″ with a position angle of −89.27° and an RMS noise of ∼ 2.6 mJy beam⁻¹. We made the spectral cube with the same weighting and the same restoring beam as the continuum map. 
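As a cross-check, the quoted scale of 1.943 kpc per arcsecond at z = 0.1016 follows directly from the adopted cosmology. A minimal pure-Python sketch (a simple trapezoidal integration of the Friedmann equation; this is our illustration, not code from the paper):

```python
import math

C_KMS = 299792.458                 # speed of light [km/s]
H0, OM, OL = 67.3, 0.315, 0.685    # adopted Planck cosmology

def efunc(z):
    """Dimensionless Hubble parameter E(z) for a flat LCDM universe."""
    return math.sqrt(OM * (1.0 + z) ** 3 + OL)

def comoving_distance_mpc(z, n=1000):
    """Line-of-sight comoving distance [Mpc], trapezoidal rule with n steps."""
    dz = z / n
    s = 0.5 * (1.0 + 1.0 / efunc(z))   # endpoints (1/E(0) = 1)
    s += sum(1.0 / efunc(i * dz) for i in range(1, n))
    return (C_KMS / H0) * s * dz

def kpc_per_arcsec(z):
    """Physical scale at redshift z [kpc per arcsecond]."""
    d_a = comoving_distance_mpc(z) / (1.0 + z)       # angular-size distance
    return d_a * 1e3 * math.pi / (180.0 * 3600.0)    # Mpc -> kpc, arcsec -> rad

print(round(kpc_per_arcsec(0.1016), 3))   # -> 1.943, as quoted in the text
```

Standard cosmology calculators (e.g. astropy's FlatLambdaCDM) return the same scale for these parameters.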
It has an RMS noise of ∼ 1.2 mJy beam⁻¹ channel⁻¹ for a channel width of 12.5 km s⁻¹, without any spectral smoothing. The H i moment maps (shown in Fig. 1 and Fig. 4) were produced from this spectral cube by adding the channels across which the absorption was found. The peak flux density of the target is 1.3 ± 0.1 Jy beam⁻¹. The integrated flux density is 14.2 ± 0.7 Jy. The southern lobe contains ∼ 85% of the total flux density (∼ 13 Jy). The uncertainty on the flux density scale at the observed frequency is assumed to be 5% (Perley & Butler 2017).

[Fig. 1: Optical image of the 3C 433 environment from GTC is shown in the background. Black contours show the radio continuum at 1.12 GHz as observed with the VLA. Contours start from 4σ (13 mJy beam⁻¹) and increase by a factor of 2. Our VLA radio image has a beam of 5.39″ × 4.72″, shown in the bottom right corner. Blue contours show the 3.6 cm map by Black et al. (1992); contour levels start from 30 µJy beam⁻¹ (3σ) and increase by a factor of two. The region in the white box is blown up in the two images on the right panel. The radio core of 3C 433 as identified by van Breugel et al. (1983) is marked with a black cross. Right panel (bottom): Blow-up of the image in the left panel with the H i column density (N_HI) contours shown in white. The N_HI contours start from 0.7 × 10¹⁸ (T_spin/f) cm⁻² and increase in steps of 0.3 × 10¹⁸ (T_spin/f) cm⁻². The image clearly shows that the H i absorber overlaps with an optical galaxy. Right panel (top): Hα contours (from GTC) overlaid on the optical image. The Hα contour levels are (2.7, 3.2, 3.7, 4.3, 4.9, 5.4, 6.0 and 6.4) × 10⁻¹⁸ erg s⁻¹ cm⁻². The 3.6 cm radio contours (same as in the left panel) are shown in blue.]

More recent VLA observations of 3C 433 from 2017 (project id: 17B-016) are available in the archive. A part of these observations was carried out in spectral line mode with a 16 MHz band, centred at the redshifted H i 21cm frequency, subdivided into 1024 channels. Thus, these data have a significantly better spectral resolution that would have enabled us to study the absorption profile in greater detail. Unfortunately, these observations are severely affected by RFI and the data are unusable.

Optical and Hα data

Optical narrow- and medium-band images of 3C 433 were taken with the GTC on 11 Sept 2017 (project GTC48-17B), using the Optical System for Imaging and low-Intermediate-Resolution Integrated Spectroscopy (OSIRIS). The Hα emission line was captured using the tunable narrow-band filter f723/45, centred at a wavelength of 7234 Å with a full width at half the maximum intensity (FWHM) of 20 Å. The continuum emission was observed with the medium-band order sorter filter f666/36, which has a central wavelength of 6668 Å and an FWHM of 355 Å. This corresponds to r-band wavelengths and is free of emission lines. The total on-source exposure time was 1 h in the narrow-band filter, divided into 15 dithered exposures of 250 sec. For the medium-band filter, this was 25 min, with 15 dithered exposures of 100 sec. The observations were performed in queue mode under seeing conditions of roughly 1″. We reduced the data using the Image Reduction and Analysis Facility (IRAF; Tody 1986). After a standard bias subtraction and flat-fielding, we co-added the images in each filter while removing cosmic rays. Because the wavelength tuning is not uniform for the OSIRIS narrow-band filters, the effective field of view is significantly smaller than the unvignetted 7.8′ coverage of the CCD, and background gradients are introduced in the narrow-band imaging. We removed these background gradients as best as possible by fitting a 2-D polynomial to the background emission and subtracting this from the narrow-band image. We used the same technique to produce the medium-band image of the continuum. 
We then subtracted the medium-band continuum image from the narrow-band image to obtain an image with only Hα emission. We did this by scaling down the raw counts of the continuum image by a factor of 5.75 (derived empirically) and then subtracting that from the narrow-band image. In the resulting Hα image, any remaining background gradient was removed by fitting the background separately across RA and Dec, while manually excluding emission from galaxies, stars and artefacts, and then subtracting a 2D average of these RA and Dec fits from the Hα image. Due to the large uncertainty in the absolute astrometry of the GTC data, we shifted the final GTC images by 2″ to align with the existing Hubble Space Telescope (HST) imaging. We would like to note here that the artefacts associated with point-like or highly nucleated sources seen in the Hα images are unavoidable: spatial shifts of a small fraction of a pixel or small differences in seeing size between the continuum and narrow-band images (or both) will always produce such artefacts. We do not have an absolute calibration for our GTC images since we did not observe a standard star, as that would have resulted in a large increase in the overhead time for these queue-mode observations. Fortunately, a part of the field covered by our observation has been observed by the Sloan Digital Sky Survey (SDSS) and, hence, the SDSS stars could be used for calibration. To this end, we first extracted the point sources in our image using SExtractor (Bertin & Arnouts 1996). The software generated a catalogue of sources from which we selected those with CLASS_STAR > 0.85 and FLAG < 4. This was done to ensure that we selected only point sources that are not saturated. We extracted the instrumental magnitudes for these stars from SExtractor via the MAG_AUTO parameter. We cross-matched the stars thus obtained with the stars in the SDSS Data Release 12 r-band catalogue (Alam et al. 2015). 
We then obtained the zero-point magnitude by taking the difference between the SDSS magnitude and the magnitude obtained from SExtractor for each star; the final zero-point magnitude is the mean of these differences. We obtained a final zero-point magnitude of 26.71 ± 0.49 for our continuum image. The GTC images, corrected for the astrometric offset using the HST imaging as mentioned above, align with the SDSS stars. As we go on to explain in the sections to follow, a faint galaxy located close to the edge of the southern radio lobe of 3C 433 is of particular interest for this work. Figure 2 shows the medium-band, narrow-band, and Hα GTC images of this galaxy. We used GALFIT (Peng et al. 2002, 2010) to characterise this galaxy, with a cutout of the field (123 × 134 pixels) including the galaxy as the input. We fit for the centre of the galaxy, the integrated r-band magnitude, the effective radius (r_e), the axial ratio, and the sky background. The initial guesses for the centre of the galaxy, the effective radius, and the axial ratio were determined visually. The initial guess for the integrated r-band magnitude was taken from the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS) r-band catalogue (Flewelling et al. 2016), in which the galaxy is detected. We fine-tuned the initial parameters over a few iterations and found that the final fit values agreed within the error bars. This galaxy has also been detected in the Pan-STARRS g-band image. We extracted the g-band integrated magnitude of the galaxy following the same procedure as for our continuum image, using SExtractor and GALFIT on the Pan-STARRS image. We have summarised the fit parameters in Table 1. We then performed aperture photometry on the continuum image using the Photutils package of Astropy in Python (Bradley et al. 2019), with circular apertures of radii varying from 0.1 r_e to r_e. 
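The zero-point determination described above reduces to taking the mean offset between catalogue and instrumental magnitudes of the matched stars, with the scatter giving the uncertainty (here 26.71 ± 0.49). A minimal sketch; the magnitudes below are made-up illustrative values, not numbers from the actual SExtractor/SDSS cross-match:

```python
from statistics import mean, stdev

def zero_point(inst_mags, ref_mags):
    """Zero point and its scatter: mean and standard deviation of the
    (catalogue - instrumental) offsets over matched, unsaturated stars."""
    offsets = [ref - inst for inst, ref in zip(inst_mags, ref_mags)]
    return mean(offsets), stdev(offsets)

def calibrate(inst_mag, zp):
    """Calibrated apparent magnitude of a source."""
    return inst_mag + zp

# Illustrative matched stars (hypothetical values):
inst = [-10.20, -9.10, -11.00, -8.70]   # instrumental (e.g. SExtractor MAG_AUTO)
sdss = [16.50, 17.70, 15.60, 18.10]     # reference (e.g. SDSS DR12 r band)
zp, zp_err = zero_point(inst, sdss)
```

Any source magnitude in the image then follows as `calibrate(inst_mag, zp)`.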
We did not extend beyond one effective radius for the presentation of the surface brightness profile due to the presence of a bright source to the south of the galaxy, whose contribution would become significant beyond that distance. The choice of circular apertures was motivated by the high axial ratio of the galaxy (see Sect. 3) as obtained from GALFIT. We discuss the parameters obtained in Sect. 3.2.

Results

The radio and the optical data both show interesting and new features. We describe them below, including the optical properties of both the host galaxy of 3C 433 and its environment. We then focus only on the H i absorber in the discussion in Sect. 4.

Radio continuum and the HI absorption

Our VLA continuum map of 3C 433 is shown in Fig. 1, left panel, in black contours. Due to the low spatial resolution, the fine structures and the hot-spot in the southern lobe mentioned in Sect. 1 are smoothed out in our continuum map. However, the X-shaped morphology and the brightness difference between the northern and southern lobes are evident. As expected, we detected the H i absorption and, interestingly, we find that the absorption is confined to the southern part of the southern lobe. The location of the H i absorption is indicated by the white contours in Fig. 1 (bottom-right panel). As mentioned in Sect. 1, Mirabel (1989) had suggested that the H i absorption may arise towards the southern radio lobe since it contains most of the flux. Our observations clearly show that the H i is confined (in projection) to a small part of the lobe. This is essential for the interpretation of the origin of the absorption, as we discuss in Sect. 4. The integrated H i absorption profile extracted from the region detected in H i absorption is shown in Fig. 3. The absorption is ∼ 50 km s⁻¹ blue-shifted from the systemic velocity of 3C 433, in agreement with that reported by Mirabel (1989). 
The integrated absorption line is ∼ 40 mJy deep and has a full width at zero intensity (FWZI) of ∼ 80 km s −1 . The H i column density distribution is shown in Fig. 1, bottom-right panel. The average H i column density is (1.0 ± 0.2) × 10 18 (T spin /f) cm −2 , where T spin is the gas spin temperature and f is the covering factor which, in this case, is unity on account of the absorber being resolved. The absorber is ∼ 60 kpc, in projection, from the radio core of 3C 433 identified by van Breugel et al. (1983). The velocity field is shown in Fig. 4, and Fig. 5 shows the position-velocity (PV) diagram along the major axis of the absorber. We find that the velocity gradient is smooth throughout the structure and is probably related to rotation. There is a steep gradient in the brightness of the southern lobe, and the flux density decreases away from the region overlapping with the detected H i. The average flux density in the region immediately northwards of the absorber is ∼ 700 mJy beam −1 . Using this, and assuming an absorption profile of the same FWHM as the detected integrated absorption (60 km s −1 ), we derive a 3σ H i column density sensitivity of 7.5 × 10 17 (T spin /f) cm −2 towards regions northwards of the absorber. Similarly, the average flux density to the south of the absorber is ∼ 50 mJy beam −1 , which gives a 3σ sensitivity of 9 × 10 18 (T spin /f) cm −2 in the regions south of the absorber. Thus, if there were more H i in the regions to the north covered by the radio continuum, we would have been able to detect it, but not towards the southern regions, where the continuum emission drops rapidly. Optical and Hα counterparts of the H i absorber As can be seen from Fig. 1, a faint galaxy is present, in projection, at the location of the H i absorption and, hence, it is of interest as a possible candidate giving rise to the absorption. Because of this, we characterised this galaxy further using the methods described in Sect. 2.
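The 3σ column-density sensitivities quoted earlier in this section follow from the standard optical-depth relation N(H i) = 1.823 × 10^18 (T spin /f) ∫τ dv. The sketch below reproduces the quoted numbers only approximately; the channel noise of ∼1.5 mJy beam−1 is our own assumption, chosen for illustration, and is not stated in the text.

def nhi_sensitivity(continuum_mjy, sigma_noise_mjy, fwhm_kms):
    # 3-sigma H I column density limit (in units of T_spin/f cm^-2) for a
    # Gaussian line of the given FWHM against a continuum of the given strength.
    tau_3sigma = 3.0 * sigma_noise_mjy / continuum_mjy   # peak optical depth limit (tau << 1)
    integral_tau_dv = 1.06 * tau_3sigma * fwhm_kms       # Gaussian: ~1.06 * peak * FWHM
    return 1.823e18 * integral_tau_dv

# Assumed channel noise of ~1.5 mJy/beam; continuum levels from the text:
north = nhi_sensitivity(700.0, 1.5, 60.0)   # ~7.5e17 (T_spin/f) cm^-2, as quoted
south = nhi_sensitivity(50.0, 1.5, 60.0)    # ~1e19 (T_spin/f) cm^-2, close to the quoted 9e18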
The galaxy is shown clearly in Fig. 2 and has an effective radius r e = 3.3″, that is, 6.3 kpc, and an axial ratio of 0.88. Fig. 8 shows the surface brightness profile of this galaxy up to one effective radius. We find that an exponential disc fits the surface brightness profile very well. We measured an r-band integrated apparent magnitude of m r = 16.61. As mentioned in Sect. 2, this galaxy has also been detected by Pan-STARRS in g-band, with an integrated apparent magnitude m g = 17.22. Thus, the K-corrected (Chilingarian et al. 2010; Chilingarian & Zolotukhin 2012) g−r colour of the galaxy is 0.46. Using the correlation between the mass-to-light ratio and the K-corrected g−r colour given by Bell et al. (2003), we obtain a stellar mass M * ∼ 6.7 × 10 10 M ⊙ for the galaxy. Our GTC observations reveal an interesting morphology for the Hα-emitting gas in this galaxy, as shown in Fig. 1, top-right panel. We detected Hα emission from the central stellar region of the galaxy and also along the eastern edge. It is worth noting that this outer Hα blob coincides with the outer boundary of the radio lobe (see Fig. 1). We come back to this aspect in Sect. 4. One possibility is an interaction between the radio lobe and the galaxy, with this component of Hα emission arising from shock excitation of the gas. The total Hα flux from the galaxy is 5 × 10 −16 erg cm −2 s −1 , while the emission from only the central region is ∼ 2 × 10 −16 erg cm −2 s −1 . With these Hα fluxes, we can estimate the star-formation rate (SFR) using the Kennicutt-Schmidt law (Kennicutt 1998). Assuming that all the Hα emission detected from the galaxy arises from star-formation activity, we obtain an SFR of 0.11 M ⊙ yr −1 . If only the Hα emission arising from the centre of the galaxy is due to star formation (while the eastern Hα component is due to shock excitation), we obtain a lower limit to the SFR of 0.035 M ⊙ yr −1 .
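The Hα-to-SFR conversions above (and the Kewley et al. 2002 infrared cross-check applied later in the text) can be sketched as follows. The luminosity distance of ∼470 Mpc assumed here for the 3C 433 environment is our own rough value, so the outputs only approximately reproduce the quoted SFRs.

import math

def sfr_halpha(flux_cgs, d_l_mpc):
    # Kennicutt (1998): SFR [M_sun/yr] = 7.9e-42 * L(Halpha) [erg/s]
    d_cm = d_l_mpc * 3.086e24
    return 7.9e-42 * 4.0 * math.pi * d_cm**2 * flux_cgs

def sfr_ir(sfr_ha):
    # Kewley et al. (2002): SFR(IR) ~ 2.7 * SFR(Halpha)**1.3
    return 2.7 * sfr_ha**1.3

D_L = 470.0                               # Mpc; assumed luminosity distance
sfr_total = sfr_halpha(5e-16, D_L)        # all Halpha star-forming: ~0.1 M_sun/yr
sfr_centre = sfr_halpha(2e-16, D_L)       # central component only: ~0.04 M_sun/yr
sfr_ir_hi, sfr_ir_lo = sfr_ir(0.11), sfr_ir(0.035)   # ~0.15 and ~0.034 M_sun/yr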
The SFR is low even if we infer it from other tracers such as the infrared (IR) luminosity: following Kewley et al. (2002), we use the relation SFR(IR) ∼ 2.7 × SFR(Hα)^1.3 and find that SFR(IR) ranges from 0.15 M ⊙ yr −1 to 0.034 M ⊙ yr −1 for the two cases mentioned above. We note here that the H i centre of the galaxy is offset from the optical centre (Fig. 1, bottom-right panel). The left panel of the same figure shows the location of the optical galaxy with respect to the radio lobe (at higher spatial resolution than our VLA map; blue contours). We find that the radio continuum does not intercept the galaxy in its entirety, since the latter is located at the boundary of the former. Therefore, it is likely that a part of the H i disc could be missing due to the absence of background radio continuum in that region. Thus, the mismatch between the H i and optical centres could be due to the non-detection of a part of the H i disc. Hα in the host galaxy of 3C 433 3C 433 is associated with one of the galaxies in an interacting pair surrounded by a common envelope (Matthews et al. 1964). Earlier optical broad-band, narrow-band Hα, and spectroscopic studies of the system have shown clear Hα filaments, a young stellar population, and signs of recent disturbance in the form of distorted dust lanes (e.g. Baum et al. 1988; de Koff et al. 1996, 2000; Holt et al. 2007; Tadhunter et al. 2011). Our optical image of 3C 433 and its environment is shown in Fig. 1. It has much better sensitivity and superior angular resolution compared to the Pan-STARRS r-band image. We clearly see the optical counterpart of 3C 433 along with its companion galaxy. Additionally, we also clearly detect shells around this interacting pair, very likely caused by their interaction. Figure 6 shows the Hα emission-line image of the radio galaxy 3C 433 and its circumgalactic environment.
The Hα emission at the location of the host galaxy of 3C 433 shows an elongated structure, previously seen by Baum et al. (1988). The total extent of the Hα structure is ∼10″ (∼19 kpc). The brightest patch of Hα emission corresponds to the peak in the continuum image; hence, we argue that this is most likely the core of 3C 433. A plume of Hα emission is seen to stretch roughly 4″ (∼8 kpc) north-east of the centre of the radio galaxy (see the image in Fig. 6, top-right panel). This plume extends almost perpendicular to the radio axis. It could represent gas that is either being driven out of or being accreted onto the centre of the galaxy. Further to the north, at a distance of 9″ (∼17 kpc) from the radio galaxy, another region of enhanced emission appears in the Hα image. This region aligns with what appears to be a companion galaxy in the continuum image. It is not clear whether this emission is from Hα-emitting gas in this companion galaxy or residual emission left over from the continuum subtraction. Fig. 7 shows the Hα emission in the centre of 3C 433 overlaid onto a Hubble Space Telescope narrow-band emission-line image. The Hα emission shows a distribution that is very similar to that of the line-emitting gas in the inner 12 kpc. In turn, that emission traces the inner radio jet. It is therefore likely that both this emission and a large fraction of the Hα emission in the inner 12 kpc trace gas that is under the influence of the propagating radio jets. It is plausible that the radio jets have shock-ionized the gas, or even triggered star formation. In addition to this co-alignment in the inner 12 kpc, the Hα stretches further out to the north-west, in what was classified as a curving filament by Baum et al. (1988). The overall distribution of the Hα emission resembles a gaseous disc with an extent of ∼19 kpc, but this would have to be verified by observing the kinematics. If confirmed, this structure would be misaligned with the main radio axis by ∼40°.
However, it is interesting to note that the outer faint wings of radio emission that give 3C 433 its X-shape lie at an angle of ∼90° to the optical structure and, thus, parallel to its minor axis, possibly corresponding to the rotation axis (see the light-blue dashed line in Fig. 6). This would suggest that the central black hole and its accretion disc underwent a significant precession recently, possibly during the current episode of radio-AGN activity, whereby the jet direction changed by ∼50° in the plane of the sky. It is quite possible that such a precession was triggered by the ongoing galaxy interactions and is the mechanism (of the many proposed to explain the X-shape in radio galaxies; Cotton et al. 2020; Hardcastle et al. 2019) responsible for the observed X-shape of 3C 433. The circumgalactic environment of 3C 433 The Hα image of Fig. 6 shows several other interesting features. At least three blobs of Hα emission are seen at distances of roughly 8, 15, and 23 arcsec (16, 29, and 45 kpc) from the centre of the radio galaxy. These blobs are marked with black arrows in Fig. 6. Regions R1 and R2 only appear in the tunable filter image (Hα+continuum) and not in the continuum image, while the southernmost region (R3) has a higher contrast in the tunable filter than in the continuum image. These three regions of enhanced Hα, therefore, most likely represent real emission-line regions. There may be other regions of Hα emission, for example, ∼7″ south-west of R3, but due to the stronger underlying continuum at those locations, we cannot confirm this. The northernmost region (R1) appears to be a weak emission-line feature stretching from the radio host galaxy. Interestingly, the three emission-line regions lie along an arc that follows the outer edge of the southern radio lobe. This could be tidal debris from material that is being redistributed across the circumgalactic environment of 3C 433.
We hypothesise that the alignment with the outer edge of the radio source may occur because this circumgalactic material is being shocked and ionised by the propagating radio source, similar to what has been seen in Coma A (Morganti et al. 2002) and the Beetle galaxy (Villar-Martín et al. 2017). The Hα regions are also aligned in the direction of the H i companion, although a direct connection between this circumgalactic gas and the H i companion is not apparent from our data. Discussion Our spatially resolved H i absorption observations of 3C 433 have shown that the absorption arises against the southern radio lobe, about 60 kpc away from the radio core. We find that the central velocity of the H i absorption is only ∼ 50 km s −1 blueshifted from the systemic velocity of 3C 433 (see Fig. 3), confirming that it belongs to the same environment. This combination of having the absorption spatially resolved and the absorber belonging to the same environment, yet being located so far away from the central region of the background AGN, makes this system a rarity. The H i absorber: a disc galaxy What we observe in 3C 433 is different from the results commonly obtained from H i absorption studies of gas associated with radio-AGN host galaxies. Those studies have mostly found that the H i absorption arises from the central (kpc) regions close to the core of the radio AGN (see Morganti & Oosterloo 2018, for an overview). Instead, in this case, our optical image shows the presence of a galaxy coincident with the location of the H i absorption. We propose that the H i detected in absorption belongs to this galaxy located in front of the southern radio lobe. The galaxy has a stellar mass of ∼ 6.7 × 10 10 M ⊙ and a surface brightness profile (see Fig. 8) characteristic of a disc galaxy.
Assuming a spin temperature of ∼ 100 K, a value typical of H i on galactic scales (e.g. Borthakur et al. 2010, 2014; Srianand et al. 2013; Reeves et al. 2015, 2016; Dutta et al. 2016, and references therein), we obtain an average H i column density of ∼ 1.0 × 10 20 cm −2 . For spiral galaxies, the average column density of the H i disc is about 6 × 10 20 cm −2 (Wang et al. 2016), considerably higher than that of our absorber galaxy. Instead, Serra et al. (2012) found that in gas-rich early-type galaxies the typical column densities are around 10 20 cm −2 , very similar to what we observe. This suggests that our galaxy is a gas-rich early-type galaxy, which is also consistent with the fact that it lies off the star-forming main sequence (see Sect. 4.2). Using this H i column density of the gas, we obtain a mass for the detected H i of ∼ 3.4 × 10 8 M ⊙ . As mentioned in Sect. 3.1, it is likely that a portion of the galaxy has not been intercepted by the radio continuum. Since it is only a small part of the galaxy, we argue that the actual H i mass could at most be ∼ 7 × 10 8 M ⊙ (i.e. twice the estimated mass, assuming that half the disc has not been detected). This gives us an atomic gas fraction (f HI ) of 0.008 for the galaxy. As discussed in Sect. 4.3, there is a possibility of an interaction between the galaxy and the radio lobe. If this is the case, the spin temperature of the H i could be higher. Assuming a T spin of 1000 K for the H i gives an H i mass of ∼ 3.4 × 10 9 M ⊙ and an f HI of 0.08. The observed absorption could, in principle, also arise from a tail of tidal debris formed, for example, in the ongoing merger event. We exclude this scenario since we do not see such prominent, large-scale structures in our optical image.
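The H i masses above follow from M(H i) = m_H × N(H i) × A, where A is the projected area of the absorber. The text does not state A explicitly, so the ∼420 kpc² used below is our own assumption (of order the galaxy's extent), tuned only to land in the right range; the linear scaling with T spin is exact.

def hi_mass(nhi_cm2, area_kpc2):
    # H I mass in solar masses from a mean column density and a projected area.
    m_h = 1.673e-24      # hydrogen atom mass, g
    m_sun = 1.989e33     # solar mass, g
    kpc_cm = 3.086e21    # cm per kpc
    return nhi_cm2 * (area_kpc2 * kpc_cm**2) * m_h / m_sun

# T_spin = 100 K -> N(HI) ~ 1e20 cm^-2; assumed projected area ~ 420 kpc^2
m_100 = hi_mass(1.0e20, 420.0)   # ~3e8 M_sun, of the order quoted in the text
m_1000 = 10.0 * m_100            # column density, hence mass, scales linearly with T_spin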
Comparison with galaxies observed in H i emission in the local Universe and beyond As mentioned earlier, studies of spatially resolved H i absorption have been carried out, beyond the local universe, for only two intervening absorbers, where the absorber galaxy and the background radio source are at entirely different redshifts. A few cases have been presented in the literature of H i absorption due to intervening galaxies that belong to the same environment (for example, B2 1321+31, J1337+3152, 3C 234, PKS 1649-062; Emonts 2006; Srianand et al. 2010; Pihlstrom 2001; Mahony priv. comm., respectively). However, the H i absorption in all these cases is unresolved, which has precluded a detailed study of the absorbers. Unlike these systems, for 3C 433 it has been possible to characterise the absorber and, hence, to carry out a comparison with the samples of galaxies studied in H i emission, both in the local universe and beyond. At redshifts comparable to, or higher than, that of 3C 433 (i.e. z ≥ 0.1), there are not many studies of H i emission in galaxies (e.g. Catinella & Cortese 2015; Hess et al. 2019; Verheijen et al. 2007; Gogate et al. 2020). The stellar masses of the galaxies included in such studies are comparable to that of our galaxy, while their H i masses range from 2 × 10 9 M ⊙ to a few times 10 10 M ⊙ , considerably greater than the H i mass of our galaxy. Similarly, their SFRs are between 0.3 M ⊙ yr −1 and 35 M ⊙ yr −1 , much higher than in our case. Thus, this galaxy, detected via H i absorption, belongs to a population of galaxies that would be missed by the deep H i emission surveys. We note that this is likely a Malmquist bias, since the emission studies are biased towards detecting the most H i-rich galaxies, which is not the case for absorption studies. Furthermore, we compare this galaxy with those studied in the local universe (e.g. the xCOLD GASS sample by Saintonge et al. 2017).
Figure 9 shows the properties of the H i absorber compared to the galaxies of this sample. We find that the galaxy lies well below the star-forming main sequence. The upper limit to the atomic gas fraction, obtained by considering a T spin of 1000 K, is comparable to that of the galaxies lying below the main sequence. However, the more reasonable estimate of f HI obtained by assuming a T spin of 100 K is low even compared to the galaxies lying below the main sequence. [Caption of Fig. 9: The xCOLD GASS sample of Saintonge et al. (2017), colour-coded on the atomic gas fraction, with our H i-absorbing galaxy marked by a blue rectangle. The black dots represent H i non-detections in their sample, while the grey contours represent the distribution of the SDSS galaxy population. The absorber galaxy is marked with a rectangle to represent the range of SFR derived, depending on whether the detected Hα emission arises entirely from star-formation activity or not; see Sect. 3.2 for details. The f HI , which corresponds to the ratio between M HI and (M * + M HI ), for our H i absorber galaxy ranges between −1.1 and −2.1 (in log scale) for a T spin of 1000 K and 100 K, respectively. For plotting, we consider a T spin of 1000 K. © AAS. Reproduced with permission.] Finally, given that the absorber galaxy belongs to a different population from those studied at similar redshifts, it is also interesting to test whether this galaxy conforms to the baryonic Tully-Fisher relation (bTFR; McGaugh et al. 2000). To check this, we need to determine the rotational velocity of the galaxy. We used 3D Barolo (Di Teodoro & Fraternali 2015) to this end and found that the inclination of the galaxy is low (≤ 20°), so it is not possible to reliably constrain the rotational velocity.
However, assuming an inclination angle of 20° results in a rotational velocity of ∼ 120 km s −1 , for which the galaxy is consistent with the bTFR (McGaugh et al. 2000). For lower inclination angles, however, the galaxy is off the relation. Is the radio lobe interacting with the galaxy? The detected H i absorption is only ∼ 50 km s −1 blueshifted with respect to the systemic velocity of 3C 433 and, hence, the absorbing galaxy belongs to the same group of galaxies as 3C 433. Because of this, it is of interest to investigate the possibility of an interaction between the galaxy and the radio AGN. Figure 1 shows the location of the galaxy with respect to the radio lobe. We find that the end of the radio lobe is not relaxed but, instead, protruded, as if pinched at the corners. Intriguingly, the location of the galaxy, and especially the majority of its Hα emission, coincides in projection with the region where the lobe appears pinched. It is quite possible that the radio lobe is interacting with the galaxy at this point and has acquired the observed morphology as a result. The Hα emission lends support to this possibility. Fig. 1 shows in detail the location of the Hα emission with respect to the radio lobe. We find that the emission has two components: one arising from the central part of the galaxy and another located along the eastern boundary of the galaxy. It is very likely that these two components arise from two H ii regions belonging to a spiral arm of the galaxy (see Fig. 2). Interestingly, the Hα blob along the eastern boundary of the galaxy also coincides, in projection, with the boundary of the radio lobe. Hence, we cannot rule out that an interaction with the radio lobe may be the cause of this blob. The radio lobe may have shock-ionised the gas in the galaxy, giving rise to the observed Hα morphology. As has been seen in other cases, for example in Coma A (Tadhunter et al. 2000; Morganti et al. 2002) and the Beetle galaxy (Villar-Martín et al.
2017), interactions between radio plasma and gas are very much capable of producing such structures. In fact, in 3C 433 we observe such an effect within the AGN host galaxy as well as on circumgalactic scales: in the radio host galaxy 3C 433, the Hα emission aligns with the inner radio jet (Fig. 7), while blobs of Hα emission are also found to align along the edge of the southern radio lobe in the circumgalactic environment between 3C 433 and the H i companion (Fig. 6). However, we note that the morphology of the galaxy does not seem disturbed and the H i kinematics of the galaxy appear regular. Hence, we suggest that if there is any interaction between the radio lobe and the galaxy, it is only present at a mild level. A further investigation of this scenario is possible by studying the spectral index of the southern radio lobe at the location of the galaxy. If the radio plasma interacts with the interstellar medium of the galaxy, the electrons at the sites of interaction would be re-accelerated (e.g. Harwood et al. 2013). Hence, we would expect the spectral index in those regions to be flatter compared to other regions of the lobe. A spectral index study of 3C 433 has been carried out by Lal & Rao (2007); however, the spatial resolution of their study was not high enough to resolve the southern lobe sufficiently. Thus, a high-resolution spectral index study (which may be carried out with, for example, the LOFAR Two-metre Sky Survey; Shimwell et al. 2017) would help confirm this scenario. Although it is very unlikely, it might also be possible that the observed Hα features are produced by the interaction of the radio lobe with gas in the circumgalactic environment, which does not belong to the galaxy but overlaps with it only in projection. However, we do not have enough evidence to conclude either way based solely on the available data.
Blind H i absorption surveys in the SKA era As we have seen in the previous sections of this paper, resolved H i absorption against an extended, bright-enough radio continuum can trace neutral hydrogen gas in galaxy populations beyond the local universe that would otherwise be missed by the current deep H i emission surveys. With the SKA pathfinder facilities and the SKA itself, it may be possible to expand the number of such studies, since they provide a large field of view and simultaneous coverage of a large range of redshift, thus increasing the chance of detecting H i absorption from intervening galaxies along the line of sight of an extended radio source in a single pointing. Deep optical images of a large part of the sky are already available (e.g. SDSS, Pan-STARRS), as are deep images of the radio sky at relatively high spatial resolution (e.g. VLASS, FIRST, LOFAR). A cross-match between the two would provide candidate systems where the extended radio emission overlaps, in projection, with a galaxy. Since the blind H i absorption surveys presently planned with the SKA pathfinders cover a large area of the sky, it is possible to conduct a similar study of all these systems in a quasi-blind manner, without a preceding optical spectroscopic survey to ensure that the galaxy is in the foreground. Apart from the chance alignment of extended, bright radio AGN and galaxies, high spatial resolution is another crucial requirement for such studies. Even at relatively low redshifts, for example at z ∼ 0.2, to barely resolve a 50 kpc radio galaxy (∼ 4 beam elements; a typical size for a high-redshift radio galaxy, e.g. Kanekar & Briggs 2004), the angular resolution needed is ∼ 3″. The current and forthcoming blind H i absorption surveys planned with the SKA-pathfinder facilities, namely APERTIF (Oosterloo et al. 2009), MeerKAT (Booth & Jonas 2012), and ASKAP (Johnston et al. 2008), will at best achieve an angular resolution of 6″ (by MeerKAT).
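The ∼3″ requirement quoted above can be checked with a quick angular-diameter-distance calculation. The flat ΛCDM parameters below (H0 = 70 km s−1 Mpc−1, Ωm = 0.3) are our assumption, not stated in the text.

import numpy as np

def kpc_per_arcsec(z, h0=70.0, om=0.3):
    # Physical scale at redshift z for an assumed flat LambdaCDM cosmology.
    c = 299792.458                                    # speed of light, km/s
    zs = np.linspace(0.0, z, 10001)
    inv_e = 1.0 / np.sqrt(om * (1 + zs)**3 + (1 - om))
    # Comoving distance via trapezoidal integration of c/H0 * dz/E(z):
    d_c = (c / h0) * np.sum(0.5 * (inv_e[1:] + inv_e[:-1]) * np.diff(zs))  # Mpc
    d_a = d_c / (1 + z)                               # angular diameter distance, Mpc
    return d_a * 1e3 * np.pi / (180.0 * 3600.0)       # kpc per arcsec

scale = kpc_per_arcsec(0.2)       # ~3.3 kpc/arcsec at z = 0.2
beam = (50.0 / 4.0) / scale       # 50 kpc source over ~4 beams -> roughly 3-4 arcsec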
Thus, further resolving an absorber galaxy that overlaps with only a small portion of the radio continuum may not be possible. However, even if unresolved, such studies will still be able to provide a census of the low-H i-mass galaxy population beyond the local universe, while more detailed H i mapping of galaxies in absorption may have to wait until SKA phase 1, whose sub-arcsecond angular resolution even at low frequencies (up to z ∼ 0.85) and unprecedented sensitivity have the potential to expand the study of such systems to statistical samples. Summary In this paper, we present an H i, Hα, and optical continuum study of the H i absorber against the radio galaxy 3C 433. The peculiar radio morphology of 3C 433, namely the bright southern lobe, provides a favourable extended background continuum against which to resolve the foreground absorber. The absorber is only ∼ 50 km s −1 blueshifted from 3C 433 and, hence, belongs to the same environment. Our resolved H i absorption data obtained with the VLA in B array show that the absorber does not exhibit any sign of disturbance and has regular kinematics. With the help of our continuum and Hα images from the GTC, we find that the absorber is a disc galaxy located at the boundary of the southern radio lobe. This galaxy has a stellar mass of ∼ 6.7 × 10 10 M ⊙ and a maximum star-formation rate of ∼ 0.15 M ⊙ yr −1 . We estimate the H i mass of this galaxy to be ∼ 3 × 10 8 M ⊙ and, furthermore, we find that the H i column density of the gas and the star-formation properties are representative of a gas-rich early-type galaxy off the star-forming main sequence. We compare this result with the properties of galaxies detected in H i in the local universe as well as at redshifts ∼ 0.1. We find that, for the given H i mass, stellar mass, and star-formation rate, it is consistent with the galaxies in the local universe.
However, the deep H i emission surveys at redshifts similar to that of our galaxy have not yet detected galaxies with H i masses as low as this. Since the absorber galaxy belongs to the same environment as 3C 433, we also investigate the possibility of an interaction between the southern radio lobe and the galaxy. Interestingly, we find that, in projection, the galaxy is located at the boundary where the radio lobe exhibits a protruded morphology. The Hα emission from the galaxy shows two H ii regions, one of which coincides with the boundary of the radio lobe. Together with the fact that Hα emission is also aligned with the radio continuum in the host galaxy of 3C 433 and at a few other locations along the southern radio lobe, this points to an interesting scenario in which the radio AGN is directly interacting with a neighbouring galaxy. Our study shows that, in the case of a favourable alignment of a galaxy in front of an extended, bright-enough radio source, we can trace galaxy populations in H i that would otherwise be missed by deep H i emission surveys. The deep blind H i absorption surveys with the SKA-pathfinder facilities, in conjunction with the deep optical images available from the all-sky surveys, may be able to detect more systems of this kind, although they will fall short of the spatial resolution needed to resolve the absorption. SKA phase 1, with sub-arcsecond resolution and high sensitivity out to high redshifts, has the potential to extend the reach of such studies.
n = int(input())
words = [input() for _ in range(n)]
for w in words:
    # Words longer than 10 characters are abbreviated as
    # first letter + count of middle letters + last letter.
    if len(w) > 10:
        print(w[0] + str(len(w) - 2) + w[-1])
    else:
        print(w)
package io.fno.grel;

public class ControlsFunctions {

    /**
     * Evaluates to {@code eTrue} if {@code b} is true, and to {@code eFalse} otherwise,
     * mirroring GREL's {@code if(o, eTrue, eFalse)} control: the condition is evaluated
     * to a value, and the result of the whole expression is the selected branch.
     *
     * @param b      the condition
     * @param eTrue  the value returned when {@code b} is true
     * @param eFalse the value returned when {@code b} is false
     * @return the selected value
     */
    public static Object ifThenElse(Boolean b, Object eTrue, Object eFalse) {
        if (b) {
            return eTrue;
        }
        return eFalse;
    }
}
def step(self, action):
    # Throttle so that consecutive environment steps are at least
    # self._desired_time_between_steps seconds apart.
    time_between_steps = time.time() - self._last_step_time
    if time_between_steps < self._desired_time_between_steps:
        time.sleep(self._desired_time_between_steps - time_between_steps)
    self._last_step_time = time.time()
    self._step_counter += 1
    return self._gym_env.step(action)
Disability Rights and Compulsory Psychiatric Treatment: The Case for a Balanced Approach under the Mental Health (Compulsory Assessment and Treatment) Act 1992 This article argues that the New Zealand Government's current approach to compulsory psychiatric treatment is unjustifiable in a human rights context. Under s 59 of the Mental Health (Compulsory Assessment and Treatment) Act 1992, clinicians are empowered to administer compulsory psychiatric treatment to individuals without, or contrary to, their consent. This article analyses s 59, and its underlying justifications, in light of the New Zealand Government's commitments under the United Nations Convention on the Rights of Persons with Disabilities (UNCRPD). Further, it analyses the approach to compulsory psychiatric treatment advocated by the UNCRPD in light of Aotearoa New Zealand's mental health context, to evaluate whether this approach would be more desirable than the current approach under s 59. The article then advocates for a more balanced approach to compulsory psychiatric treatment which puts the rights of disabled individuals at the forefront and also ensures there are limits to these rights which are justifiable within a human rights context.
/**
 * Provides logical grouping for actors, agents and dataflow tasks and operators.
 * Each group has an underlying thread pool, which will perform actions on behalf
 * of the users belonging to the group. Actors created through the DefaultPGroup.actor()
 * method will automatically belong to the group through which they were created,
 * just like agents created through the agent() or fairAgent() methods, or dataflow
 * tasks and operators created through the task() or operator() methods.
 * Uses a pool of non-daemon threads.
 * The DefaultPGroup class implements the Pool interface through @Delegate.
 *
 * @author Vaclav Pech
 * Date: Jun 17, 2009
 */
public final class NonDaemonPGroup extends PGroup {

    /**
     * Creates a group for actors, agents, tasks and operators. The actors will share
     * a common non-daemon thread pool.
     */
    public NonDaemonPGroup() {
        super(new DefaultPool(false));
    }

    /**
     * Creates a group for actors, agents, tasks and operators. The actors will share
     * a common non-daemon thread pool.
     *
     * @param poolSize The initial size of the underlying thread pool
     */
    public NonDaemonPGroup(final int poolSize) {
        super(new DefaultPool(false, poolSize));
    }
}
/*
 * Wine Message Compiler output generation
 *
 * Copyright 2000 <NAME> (BS)
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA
 */

#include "config.h"
#include "wine/port.h"

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <assert.h>
#include <ctype.h>

#include "wmc.h"
#include "utils.h"
#include "lang.h"
#include "write.h"

/*
 * The binary resource layout is as follows:
 *
 *          +===============+
 * Header   |    NBlocks    |
 *          +===============+
 * Block 0  |    Low ID     |
 *          +---------------+
 *          |    High ID    |
 *          +---------------+
 *          |    Offset     |---+
 *          +===============+   |
 * Block 1  |    Low ID     |   |
 *          +---------------+   |
 *          |    High ID    |   |
 *          +---------------+   |
 *          |    Offset     |------+
 *          +===============+   |  |
 *          |               |   |  |
 *         ...             ...  |  |
 *          |               |   |  |
 *          +===============+ <-+  |
 * B0 LoID  | Len  | Flags  |      |
 *          +---+---+---+---+      |
 *          | b | l | a | b |      |
 *          +---+---+---+---+      |
 *          | l | a | \0| \0|      |
 *          +===============+      |
 *          |               |      |
 *         ...             ...     |
 *          |               |      |
 *          +===============+      |
 * B0 HiID  | Len  | Flags  |      |
 *          +---+---+---+---+      |
 *          | M | o | r | e |      |
 *          +---+---+---+---+      |
 *          | b | l | a | \0|      |
 *          +===============+ <----+
 * B1 LoID  | Len  | Flags  |
 *          +---+---+---+---+
 *          | J | u | n | k |
 *          +---+---+---+---+
 *          | \0| \0| \0| \0|
 *          +===============+
 *          |               |
 *         ...             ...
 *          |               |
 *          +===============+
 *
 * All Fields are aligned on their natural boundaries.
 * The length field (Len) covers both the length of the string and the header
 * fields (Len and Flags). Strings are '\0' terminated. Flags is 0
 * for normal character strings and 1 for unicode strings.
 */

static const char str_header[] =
	"/* This file is generated with wmc version " PACKAGE_VERSION ". Do not edit! */\n"
	"/* Source : %s */\n"
	"/* Cmdline: %s */\n"
	"/* Date : %s */\n"
	"\n";

static char *dup_u2c(int cp, const WCHAR *uc)
{
	int len;
	char *cptr;
	const union cptable *cpdef = find_codepage(cp);

	if(cpdef)
		len = wine_cp_wcstombs(cpdef, 0, uc, unistrlen(uc)+1, NULL, 0, NULL, NULL);
	else
		len = wine_utf8_wcstombs(0, uc, unistrlen(uc)+1, NULL, 0);
	cptr = xmalloc(len);
	if(cpdef)
		len = wine_cp_wcstombs(cpdef, 0, uc, unistrlen(uc)+1, cptr, len, NULL, NULL);
	else
		len = wine_utf8_wcstombs(0, uc, unistrlen(uc)+1, cptr, len);
	if(len < 0)
		internal_error(__FILE__, __LINE__, "Buffer overflow? code %d\n", len);
	return cptr;
}

static void killnl(char *s, int ddd)
{
	char *tmp;

	tmp = strstr(s, "\r\n");
	if(tmp)
	{
		if(ddd && tmp - s > 3)
		{
			tmp[0] = tmp[1] = tmp[2] = '.';
			tmp[3] = '\0';
		}
		else
			*tmp = '\0';
	}
	tmp = strchr(s, '\n');
	if(tmp)
	{
		if(ddd && tmp - s > 3)
		{
			tmp[0] = tmp[1] = tmp[2] = '.';
			tmp[3] = '\0';
		}
		else
			*tmp = '\0';
	}
}

static int killcomment(char *s)
{
	char *tmp = s;
	int b = 0;

	while((tmp = strstr(tmp, "/*")))
	{
		tmp[1] = 'x';
		b++;
	}
	tmp = s;
	while((tmp = strstr(tmp, "*/")))
	{
		tmp[0] = 'x';
		b++;
	}
	return b;
}

void write_h_file(const char *fname)
{
	node_t *ndp;
	char *cptr;
	char *cast;
	FILE *fp;
	token_t *ttab;
	int ntab;
	int i;
	int once = 0;
	int idx_en = 0;

	fp = fopen(fname, "w");
	if(!fp)
	{
		perror(fname);
		exit(1);
	}
	cptr = ctime(&now);
	killnl(cptr, 0);
	fprintf(fp, str_header, input_name ? input_name : "<stdin>", cmdline, cptr);
	fprintf(fp, "#ifndef __WMCGENERATED_%08lx_H\n", (long)now);
	fprintf(fp, "#define __WMCGENERATED_%08lx_H\n", (long)now);
	fprintf(fp, "\n");

	/* Write severity and facility aliases */
	get_tokentable(&ttab, &ntab);
	fprintf(fp, "/* Severity codes */\n");
	for(i = 0; i < ntab; i++)
	{
		if(ttab[i].type == tok_severity && ttab[i].alias)
		{
			cptr = dup_u2c(WMC_DEFAULT_CODEPAGE, ttab[i].alias);
			fprintf(fp, "#define %s\t0x%x\n", cptr, ttab[i].token);
			free(cptr);
		}
	}
	fprintf(fp, "\n");
	fprintf(fp, "/* Facility codes */\n");
	for(i = 0; i < ntab; i++)
	{
		if(ttab[i].type == tok_facility && ttab[i].alias)
		{
			cptr = dup_u2c(WMC_DEFAULT_CODEPAGE, ttab[i].alias);
			fprintf(fp, "#define %s\t0x%x\n", cptr, ttab[i].token);
			free(cptr);
		}
	}
	fprintf(fp, "\n");

	/* Write the message codes */
	fprintf(fp, "/* Message definitions */\n");
	for(ndp = nodehead; ndp; ndp = ndp->next)
	{
		switch(ndp->type)
		{
		case nd_comment:
			cptr = dup_u2c(WMC_DEFAULT_CODEPAGE, ndp->u.comment+1);
			killnl(cptr, 0);
			killcomment(cptr);
			if(*cptr)
				fprintf(fp, "/* %s */\n", cptr);
			else
				fprintf(fp, "\n");
			free(cptr);
			break;
		case nd_msg:
			if(!once)
			{
				/*
				 * Search for an english text.
				 * If not found, then use the first in the list
				 */
				once++;
				for(i = 0; i < ndp->u.msg->nmsgs; i++)
				{
					if(ndp->u.msg->msgs[i]->lan == 0x409)
					{
						idx_en = i;
						break;
					}
				}
				fprintf(fp, "\n");
			}
			fprintf(fp, "/* MessageId : 0x%08x */\n", ndp->u.msg->realid);
			cptr = dup_u2c(ndp->u.msg->msgs[idx_en]->cp, ndp->u.msg->msgs[idx_en]->msg);
			killnl(cptr, 0);
			killcomment(cptr);
			fprintf(fp, "/* Approximate msg: %s */\n", cptr);
			free(cptr);
			cptr = dup_u2c(WMC_DEFAULT_CODEPAGE, ndp->u.msg->sym);
			if(ndp->u.msg->cast)
				cast = dup_u2c(WMC_DEFAULT_CODEPAGE, ndp->u.msg->cast);
			else
				cast = NULL;
			switch(ndp->u.msg->base)
			{
			case 8:
				if(cast)
					fprintf(fp, "#define %s\t((%s)0%oL)\n\n", cptr, cast, ndp->u.msg->realid);
				else
					fprintf(fp, "#define %s\t0%oL\n\n", cptr, ndp->u.msg->realid);
				break;
			case 10:
				if(cast)
					fprintf(fp, "#define %s\t((%s)%dL)\n\n", cptr, cast, ndp->u.msg->realid);
				else
					fprintf(fp, "#define %s\t%dL\n\n", cptr, ndp->u.msg->realid);
				break;
			case 16:
				if(cast)
					fprintf(fp, "#define %s\t((%s)0x%08xL)\n\n", cptr, cast, ndp->u.msg->realid);
				else
					fprintf(fp, "#define %s\t0x%08xL\n\n", cptr, ndp->u.msg->realid);
				break;
			default:
				internal_error(__FILE__, __LINE__, "Invalid base for number print\n");
			}
			free(cptr);
			free(cast);
			break;
		default:
			internal_error(__FILE__, __LINE__, "Invalid node type %d\n", ndp->type);
		}
	}
	fprintf(fp, "\n#endif\n");
	fclose(fp);
}

static void write_rcbin(FILE *fp)
{
	lan_blk_t *lbp;
	token_t *ttab;
	int ntab;
	int i;

	get_tokentable(&ttab, &ntab);

	for(lbp = lanblockhead; lbp; lbp = lbp->next)
	{
		char *cptr = NULL;
		fprintf(fp, "LANGUAGE 0x%x, 0x%x\n", lbp->lan & 0x3ff, lbp->lan >> 10);
		for(i = 0; i < ntab; i++)
		{
			if(ttab[i].type == tok_language && ttab[i].token == lbp->lan)
			{
				if(ttab[i].alias)
					cptr = dup_u2c(WMC_DEFAULT_CODEPAGE, ttab[i].alias);
				break;
			}
		}
		if(!cptr)
			internal_error(__FILE__, __LINE__, "Filename vanished for language 0x%0x\n", lbp->lan);
		fprintf(fp, "1 MESSAGETABLE \"%s.bin\"\n", cptr);
		free(cptr);
	}
}

static char *make_string(WCHAR *uc, int len, int codepage)
{
char *str = xmalloc(7*len + 1); char *cptr = str; int i; int b; if(!codepage) { *cptr++ = ' '; *cptr++ = 'L'; *cptr++ = '"'; for(i = b = 0; i < len; i++, uc++) { switch(*uc) { case '\a': *cptr++ = '\\'; *cptr++ = 'a'; b += 2; break; case '\b': *cptr++ = '\\'; *cptr++ = 'b'; b += 2; break; case '\f': *cptr++ = '\\'; *cptr++ = 'f'; b += 2; break; case '\n': *cptr++ = '\\'; *cptr++ = 'n'; b += 2; break; case '\r': *cptr++ = '\\'; *cptr++ = 'r'; b += 2; break; case '\t': *cptr++ = '\\'; *cptr++ = 't'; b += 2; break; case '\v': *cptr++ = '\\'; *cptr++ = 'v'; b += 2; break; case '\\': *cptr++ = '\\'; *cptr++ = '\\'; b += 2; break; case '"': *cptr++ = '\\'; *cptr++ = '"'; b += 2; break; default: if (*uc < 0x100 && isprint(*uc)) { *cptr++ = *uc; b++; } else { int n = sprintf(cptr, "\\x%04x", *uc & 0xffff); cptr += n; b += n; } break; } if(i < len-1 && b >= 72) { *cptr++ = '"'; *cptr++ = ','; *cptr++ = '\n'; *cptr++ = ' '; *cptr++ = 'L'; *cptr++ = '"'; b = 0; } } if (unicodeout) len = (len + 1) & ~1; else len = (len + 3) & ~3; for(; i < len; i++) { *cptr++ = '\\'; *cptr++ = 'x'; *cptr++ = '0'; *cptr++ = '0'; *cptr++ = '0'; *cptr++ = '0'; } *cptr++ = '"'; *cptr = '\0'; } else { char *tmp, *cc; int mlen; const union cptable *cpdef = find_codepage(codepage); if (cpdef) mlen = wine_cp_wcstombs(cpdef, 0, uc, unistrlen(uc)+1, NULL, 0, NULL, NULL); else mlen = wine_utf8_wcstombs(0, uc, unistrlen(uc)+1, NULL, 0); cc = tmp = xmalloc(mlen); if (cpdef) { if((i = wine_cp_wcstombs(cpdef, 0, uc, unistrlen(uc)+1, tmp, mlen, NULL, NULL)) < 0) internal_error(__FILE__, __LINE__, "Buffer overflow? code %d\n", i); } else { if((i = wine_utf8_wcstombs(0, uc, unistrlen(uc)+1, tmp, mlen)) < 0) internal_error(__FILE__, __LINE__, "Buffer overflow? 
code %d\n", i); } *cptr++ = ' '; *cptr++ = '"'; for(i = b = 0; i < len; i++, cc++) { switch(*cc) { case '\a': *cptr++ = '\\'; *cptr++ = 'a'; b += 2; break; case '\b': *cptr++ = '\\'; *cptr++ = 'b'; b += 2; break; case '\f': *cptr++ = '\\'; *cptr++ = 'f'; b += 2; break; case '\n': *cptr++ = '\\'; *cptr++ = 'n'; b += 2; break; case '\r': *cptr++ = '\\'; *cptr++ = 'r'; b += 2; break; case '\t': *cptr++ = '\\'; *cptr++ = 't'; b += 2; break; case '\v': *cptr++ = '\\'; *cptr++ = 'v'; b += 2; break; case '\\': *cptr++ = '\\'; *cptr++ = '\\'; b += 2; break; case '"': *cptr++ = '\\'; *cptr++ = '"'; b += 2; break; default: if(isprint(*cc)) { *cptr++ = *cc; b++; } else { int n = sprintf(cptr, "\\x%02x", *cc & 0xff); cptr += n; b += n; } break; } if(i < len-1 && b >= 72) { *cptr++ = '"'; *cptr++ = ','; *cptr++ = '\n'; *cptr++ = ' '; *cptr++ = '"'; b = 0; } } len = (len + 3) & ~3; for(; i < len; i++) { *cptr++ = '\\'; *cptr++ = 'x'; *cptr++ = '0'; *cptr++ = '0'; } *cptr++ = '"'; *cptr = '\0'; free(tmp); } return str; } static void write_rcinline(FILE *fp) { lan_blk_t *lbp; int i; int j; for(lbp = lanblockhead; lbp; lbp = lbp->next) { unsigned offs = 4 * (lbp->nblk * 3 + 1); fprintf(fp, "\n1 MESSAGETABLE\n"); fprintf(fp, "LANGUAGE 0x%x, 0x%x\n", lbp->lan & 0x3ff, lbp->lan >> 10); fprintf(fp, "{\n"); fprintf(fp, " /* NBlocks */ 0x%08xL,\n", lbp->nblk); for(i = 0; i < lbp->nblk; i++) { fprintf(fp, " /* Lo,Hi,Offs */ 0x%08xL, 0x%08xL, 0x%08xL,\n", lbp->blks[i].idlo, lbp->blks[i].idhi, offs); offs += lbp->blks[i].size; } for(i = 0; i < lbp->nblk; i++) { block_t *blk = &lbp->blks[i]; for(j = 0; j < blk->nmsg; j++) { char *cptr; int l = blk->msgs[j]->len; const char *comma = j == blk->nmsg-1 && i == lbp->nblk-1 ? "" : ","; cptr = make_string(blk->msgs[j]->msg, l, unicodeout ? 0 : blk->msgs[j]->cp); fprintf(fp, "\n /* Msg 0x%08x */ 0x%04x, 0x000%c,\n", blk->idlo + j, (unicodeout ? (l*2+3)&~3 : (l+3)&~3) + 4, unicodeout ? 
'1' : '0'); fprintf(fp, "%s%s\n", cptr, comma); free(cptr); } } fprintf(fp, "}\n"); } } void write_rc_file(const char *fname) { FILE *fp; char *cptr; fp = fopen(fname, "w"); if(!fp) { perror(fname); exit(1); } cptr = ctime(&now); killnl(cptr, 0); fprintf(fp, str_header, input_name ? input_name : "<stdin>", cmdline, cptr); if(rcinline) write_rcinline(fp); else write_rcbin(fp); fclose(fp); } void write_bin_files(void) { assert(rcinline == 0); }
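The `make_string` helper above escapes each message character into a C/RC string literal (named escapes for control characters, printable bytes verbatim, `\xNN` otherwise) and pads the logical length up to a 4-byte boundary. A minimal Python sketch of that escaping-and-padding idea — the function name and details here are illustrative, not part of wmc:

```python
ESCAPES = {'\a': '\\a', '\b': '\\b', '\f': '\\f', '\n': '\\n',
           '\r': '\\r', '\t': '\\t', '\v': '\\v', '\\': '\\\\', '"': '\\"'}

def escape_for_rc(text, pad_to=4):
    # Escape each character the way make_string does: named escapes,
    # printable single-byte characters verbatim, everything else as \xNN.
    out = []
    for ch in text:
        if ch in ESCAPES:
            out.append(ESCAPES[ch])
        elif ch.isprintable() and ord(ch) < 0x100:
            out.append(ch)
        else:
            out.append('\\x%02x' % (ord(ch) & 0xff))
    # Pad the logical length up to the next multiple of pad_to with \x00,
    # mirroring the (len + 3) & ~3 rounding in the C code.
    padded_len = (len(text) + pad_to - 1) & ~(pad_to - 1)
    out.extend(['\\x00'] * (padded_len - len(text)))
    return '"%s"' % ''.join(out)
```

This sketch omits the 72-column line splitting and the codepage conversion that the real routine also performs.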
//go:build behaviour
// +build behaviour

package fundi

import (
	"os"
	"testing"

	"github.com/cucumber/godog"

	"github.com/kasulani/go-fundi/internal/behaviour"
)

func TestBehaviour(t *testing.T) {
	specs := behaviour.NewTestSpecifications()
	suite := godog.TestSuite{
		Name:                 "fundi",
		TestSuiteInitializer: initializeSuite(specs),
		ScenarioInitializer:  initializeScenarios(specs),
		Options: &godog.Options{
			Randomize:     1,
			StopOnFailure: false,
			Format:        "pretty",
			Paths:         featuresFiles(t),
			Tags:          "~@notYetImplemented",
		},
	}

	if suite.Run() != 0 {
		t.Fatal("failed to run behaviour tests")
	}
}

func featuresFiles(t *testing.T) []string {
	// Check the error from Chdir instead of silently ignoring it.
	if err := os.Chdir("../../"); err != nil {
		t.Fatalf("failed to change directory: %q", err)
	}

	parentDir, err := os.Getwd()
	if err != nil {
		t.Fatalf("failed to get working directory: %q", err)
	}

	return []string{parentDir + "/features"}
}

func initializeSuite(specs *behaviour.TestSpecifications) func(ts *godog.TestSuiteContext) {
	return func(ts *godog.TestSuiteContext) {
		ts.AfterSuite(func() {
			specs.MustStop()
		})
	}
}

func initializeScenarios(specs *behaviour.TestSpecifications) func(sc *godog.ScenarioContext) {
	return func(sc *godog.ScenarioContext) {
		specs.Loader(sc)
		sc.BeforeScenario(func(s *godog.Scenario) {
			specs.MustClearState(s)
		})
	}
}

//
// Function:    OBOUserAddRefSpecialCase
//
// Purpose:     Handle a special case that arises when upgrading from NT 3.51
//              or NT 4 with MS's "File and Print" and GSNW. In this case we
//              need to AddRef OBOUser F&P, so removal of GSNW does not remove
//              F&P.
//
// Parameters:  pWizard [IN] - Context information
//
// Returns:     Nothing. (this is basically a do-it-if-we-can special case.)
//
VOID OBOUserAddRefSpecialCase(CWizard * pWizard)
{
    TraceFileFunc(ttidGuiModeSetup);

    CSetupInfFile csif;
    HRESULT hr = S_OK;

    Assert(pWizard->PNetCfg());
    Assert(IsUnattended(pWizard));

    TraceTag(ttidWizard, "OBOUserAddRefSpecialCase - Start");

    if (pWizard->PSetupData()->UnattendFile)
    {
        PRODUCT_FLAVOR pf;
        GetProductFlavor(NULL, &pf);

        if (PF_WORKSTATION != pf)
        {
            const GUID * rgguidClass[2] = {&GUID_DEVCLASS_NETSERVICE,
                                           &GUID_DEVCLASS_NETCLIENT};
            const PCWSTR rgpszComponentId[2] = {c_szInfId_MS_Server,
                                                c_szInfId_MS_NWClient};
            INetCfgComponent* rgpncc[2] = {NULL, NULL};

            hr = HrFindComponents(pWizard->PNetCfg(), 2, rgguidClass,
                                  rgpszComponentId, rgpncc);
            if (SUCCEEDED(hr))
            {
                if (rgpncc[0] && rgpncc[1])
                {
                    NETWORK_INSTALL_PARAMS nip = {0};
                    nip.dwSetupFlags = NSF_PRIMARYINSTALL;

                    TraceTag(ttidWizard, "    OBOUser Install of File and Print Services");
                    TraceTag(ttidWizard, "    On upgrade from NT 3.51 or NT 4");

                    (void)HrInstallComponentsOboUser(pWizard->PNetCfg(), &nip, 1,
                                                     &rgguidClass[0],
                                                     &rgpszComponentId[0]);
                }
                ReleaseObj(rgpncc[0]);
                ReleaseObj(rgpncc[1]);
            }
        }
    }

    TraceTag(ttidWizard, "OBOUserAddRefSpecialCase - End");
    TraceError("OBOUserAddRefSpecialCase", hr);
}
/**
 * A builder that creates proxies based on a provided target concrete class and
 * {@link MethodHandler} instance.
 *
 * @author Pavel Sorocun ([email protected])
 */
public class ProxyBuilder {

    private Class<?> aClass;
    private MethodHandler handler;

    public <T> ProxyBuilder aClass(final Class<T> aClass) {
        this.aClass = aClass;
        return this;
    }

    public ProxyBuilder handler(final MethodHandler handler) {
        this.handler = handler;
        return this;
    }

    @SuppressWarnings("unchecked")
    public <T> T build() throws Exception {
        final ProxyFactory factory = new ProxyFactory();
        factory.setSuperclass(aClass);

        final Class<?> subClass = factory.createClass();
        final Object proxy = subClass.newInstance();
        ((ProxyObject) proxy).setHandler(handler);

        return (T) proxy;
    }
}
def vector_space_search(self, words=None, **kwargs):
    # Use None instead of a mutable default argument ([] is shared
    # across calls and is a classic Python pitfall).
    if words is None:
        words = []
    top = kwargs.pop("top", 10)
    if not isinstance(words, (list, tuple)):
        words = [words]
    if not isinstance(words, Document):
        kwargs.setdefault("threshold", 0)
        words = Document(" ".join(words), **kwargs)
    if len([w for w in words if w in self.vector]) == 0:
        return []
    return self.related(words, top)
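`vector_space_search` above delegates the actual ranking to `self.related`, which in vector-space retrieval is typically a cosine-similarity comparison between term-weight vectors. A small illustrative sketch of that underlying measure — a hypothetical helper, not the library's actual implementation:

```python
import math

def cosine_similarity(a, b):
    # a, b: dicts mapping term -> weight (e.g. tf-idf scores).
    # Cosine similarity = dot(a, b) / (|a| * |b|); 1.0 means the
    # vectors point the same way, 0.0 means they share no terms.
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm_a = math.sqrt(sum(w * w for w in a.values()))
    norm_b = math.sqrt(sum(w * w for w in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)
```

A `related(query, top)` implementation would score the query vector against every document vector with a measure like this and return the `top` highest-scoring documents.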
import {AliasTagCallBackData, Common, InitOption} from './jiguang-push.common';

export declare class JiguangPush extends Common {
    // define your typings manually
    // or..
    // take the ios or android .d.ts files and copy/paste them here

    /**
     * get the sdk version
     */
    public static getVersion(): string;

    /**
     * Initialize the SDK
     */
    public static init(options: InitOption): void;

    /**
     * Get the alias
     */
    public static getAlias(): Promise<AliasTagCallBackData>;

    /**
     * Set the alias
     * @param alias the new alias
     */
    public static setAlias(alias: string): Promise<AliasTagCallBackData>;

    /**
     * Delete the alias
     */
    public static deleteAlias(): Promise<AliasTagCallBackData>;
}
/** * Resolves the endpoint of the partition for the given request * * @param request Request for which the partition endpoint resolution is to be performed * @param forceRefreshPartitionAddresses Force refresh the partition's endpoint * @return ResolutionResult */ private Mono<ResolutionResult> resolveAddressesAndIdentityAsync( RxDocumentServiceRequest request, boolean forceRefreshPartitionAddresses) { if (ReplicatedResourceClient.isReadingFromMaster(request.getResourceType(), request.getOperationType()) && request.getPartitionKeyRangeIdentity() == null) { return resolveMasterResourceAddress(request, forceRefreshPartitionAddresses); } Mono<RefreshState> refreshStateObs = this.getOrRefreshRoutingMap(request, forceRefreshPartitionAddresses); return refreshStateObs.flatMap( state -> { try { AddressResolver.ensureRoutingMapPresent(request, state.routingMap, state.collection); } catch (Exception e) { return Mono.error(e); } Mono<Utils.ValueHolder<ResolutionResult>> resultObs = this.tryResolveServerPartitionAsync( request, state.collection, state.routingMap, state.collectionCacheIsUptoDate, state.collectionRoutingMapCacheIsUptoDate, forceRefreshPartitionAddresses); Function<ResolutionResult, Mono<ResolutionResult>> addCollectionRidIfNameBased = funcResolutionResult -> { assert funcResolutionResult != null; if (request.getIsNameBased()) { request.getHeaders().put(WFConstants.BackendHeaders.COLLECTION_RID, state.collection.getResourceId()); } return Mono.just(funcResolutionResult); }; return resultObs.flatMap(resolutionResultValueHolder -> { if (resolutionResultValueHolder.v != null) { return addCollectionRidIfNameBased.apply(resolutionResultValueHolder.v); } assert resolutionResultValueHolder.v == null; Function<RefreshState, Mono<RefreshState>> ensureCollectionRoutingMapCacheIsUptoDateFunc = funcState -> { if (!funcState.collectionRoutingMapCacheIsUptoDate) { funcState.collectionRoutingMapCacheIsUptoDate = true; Mono<Utils.ValueHolder<CollectionRoutingMap>> 
newRoutingMapObs = this.collectionRoutingMapCache.tryLookupAsync( BridgeInternal.getMetaDataDiagnosticContext(request.requestContext.cosmosDiagnostics), funcState.collection.getResourceId(), funcState.routingMap, request.properties); return getStateWithNewRoutingMap(funcState, newRoutingMapObs); } else { return Mono.just(state); } }; Function<RefreshState, Mono<Utils.ValueHolder<ResolutionResult>>> resolveServerPartition = funcState -> { try { AddressResolver.ensureRoutingMapPresent(request, funcState.routingMap, funcState.collection); } catch (Exception e) { return Mono.error(e); } return this.tryResolveServerPartitionAsync( request, funcState.collection, funcState.routingMap, true, true, forceRefreshPartitionAddresses); }; Function<Utils.ValueHolder<ResolutionResult>, Mono<ResolutionResult>> onNullThrowNotFound = funcResolutionResult -> { if (funcResolutionResult.v == null) { logger.debug("Couldn't route partitionkeyrange-oblivious request after retry/cache refresh. Collection doesn't exist."); return Mono.error(BridgeInternal.setResourceAddress(new NotFoundException(), request.requestContext.resourcePhysicalAddress)); } return Mono.just(funcResolutionResult.v); }; if (!state.collectionCacheIsUptoDate) { request.forceNameCacheRefresh = true; state.collectionCacheIsUptoDate = true; Mono<Utils.ValueHolder<DocumentCollection>> newCollectionObs = this.collectionCache.resolveCollectionAsync(BridgeInternal.getMetaDataDiagnosticContext(request.requestContext.cosmosDiagnostics), request); Mono<RefreshState> newRefreshStateObs = newCollectionObs.flatMap(collectionValueHolder -> { state.collection = collectionValueHolder.v; if (!StringUtils.equals(collectionValueHolder.v.getResourceId(), state.routingMap.getCollectionUniqueId())) { state.collectionRoutingMapCacheIsUptoDate = false; Mono<Utils.ValueHolder<CollectionRoutingMap>> newRoutingMap = this.collectionRoutingMapCache.tryLookupAsync( 
BridgeInternal.getMetaDataDiagnosticContext(request.requestContext.cosmosDiagnostics), collectionValueHolder.v.getResourceId(), null, request.properties); return getStateWithNewRoutingMap(state, newRoutingMap); } return Mono.just(state); }); Mono<Utils.ValueHolder<ResolutionResult>> newResultObs = newRefreshStateObs.flatMap(ensureCollectionRoutingMapCacheIsUptoDateFunc) .flatMap(resolveServerPartition); return newResultObs.flatMap(onNullThrowNotFound).flatMap(addCollectionRidIfNameBased); } else { return ensureCollectionRoutingMapCacheIsUptoDateFunc.apply(state) .flatMap(resolveServerPartition) .flatMap(onNullThrowNotFound) .flatMap(addCollectionRidIfNameBased); } }); } ); }
package tw.com.sample.chyiiiiiiiiiiii.fusedlocationprovider; import android.annotation.SuppressLint; import android.location.Location; import android.os.Bundle; import android.util.Log; import android.widget.TextView; import com.google.android.gms.location.FusedLocationProviderClient; import com.google.android.gms.location.LocationCallback; import com.google.android.gms.location.LocationRequest; import com.google.android.gms.location.LocationResult; import com.google.android.gms.location.LocationServices; import com.google.android.gms.tasks.OnSuccessListener; import androidx.appcompat.app.AppCompatActivity; public class MainActivity extends AppCompatActivity { public static final String TAG = "MainActivity"; private TextView tvLon, tvLat, tvCount; private FusedLocationProviderClient fusedLocationProviderClient; private LocationRequest locationRequest; private LocationCallback locationCallback; private Location lastLocation; private int updateCount = 0; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_main); // tvLon = findViewById(R.id.tvLon); tvLat = findViewById(R.id.tvLat); tvCount = findViewById(R.id.tvCount); // initFusedLocationClient(); initLocationRequest(); setLocationCallback(); } @SuppressLint("MissingPermission") public void initFusedLocationClient() { fusedLocationProviderClient = LocationServices.getFusedLocationProviderClient(this); fusedLocationProviderClient.getLastLocation().addOnSuccessListener(new OnSuccessListener<Location>() { @Override public void onSuccess(Location location) { if (location != null) { lastLocation = location; } } }); } private void initLocationRequest() { locationRequest = LocationRequest.create(); locationRequest.setInterval(5 * 1000); locationRequest.setFastestInterval(2 * 1000); locationRequest.setPriority(LocationRequest.PRIORITY_HIGH_ACCURACY); } private void setLocationCallback() { locationCallback = new LocationCallback(){ @Override 
public void onLocationResult(LocationResult locationResult) { super.onLocationResult(locationResult); // updateCount++; for (Location location : locationResult.getLocations()) { Log.i(TAG, location.toString()); lastLocation = location; // tvLon.setText("經度 : " + location.getLongitude()); tvLat.setText("緯度 : " + location.getLatitude()); tvCount.setText("更新次數 : " + updateCount); } } }; } //-------------------------------------------------------------------------------------------------------------------------------------------------- public Location getLastLocation() { return lastLocation; } @SuppressLint("MissingPermission") public void startLocationUpdates() { fusedLocationProviderClient.requestLocationUpdates(locationRequest, locationCallback, null); } public void stopLocationUpdates() { fusedLocationProviderClient.removeLocationUpdates(locationCallback); } //-------------------------------------------------------------------------------------------------------------------------------------------------- @Override protected void onStart() { super.onStart(); // startLocationUpdates(); } @Override protected void onStop() { super.onStop(); // stopLocationUpdates(); } }
/* Copyright (C) 2012-2017 Free Software Foundation, Inc.
   Contributed by <NAME> <<EMAIL>>.

   This file is part of the GNU Atomic Library (libatomic).

   Libatomic is free software; you can redistribute it and/or modify it
   under the terms of the GNU General Public License as published by
   the Free Software Foundation; either version 3 of the License, or
   (at your option) any later version.

   Libatomic is distributed in the hope that it will be useful, but
   WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
   General Public License for more details.

   Under Section 7 of GPL version 3, you are granted additional
   permissions described in the GCC Runtime Library Exception, version
   3.1, as published by the Free Software Foundation.

   You should have received a copy of the GNU General Public License and
   a copy of the GCC Runtime Library Exception along with this program;
   see the files COPYING3 and COPYING.RUNTIME respectively.  If not, see
   <http://www.gnu.org/licenses/>.  */

#include "libatomic_i.h"


/* Accesses with a power-of-two size are not lock-free if we don't have
   an integer type of this size or if they are not naturally aligned.
   They are lock-free if such a naturally aligned access is always
   lock-free according to the compiler, which requires that both atomic
   loads and CAS are available.
   In all other cases, we fall through to LARGER (see below).  */
#define EXACT(N)						\
  do {								\
    if (!C2(HAVE_INT,N)) break;					\
    if ((uintptr_t)ptr & (N - 1)) break;			\
    if (__atomic_always_lock_free(N, 0)) return true;		\
    if (!C2(MAYBE_HAVE_ATOMIC_CAS_,N)) break;			\
    if (C2(FAST_ATOMIC_LDST_,N)) return true;			\
  } while (0)


/* We next check to see if an access of a larger size is lock-free.  We use
   a similar check as in EXACT, except that we also check that the alignment
   of the access is so that the data to be accessed is completely covered
   by the larger access.  */
#define LARGER(N)						\
  do {								\
    uintptr_t r = (uintptr_t)ptr & (N - 1);			\
    if (!C2(HAVE_INT,N)) break;					\
    if (!C2(FAST_ATOMIC_LDST_,N)) break;			\
    if (!C2(MAYBE_HAVE_ATOMIC_CAS_,N)) break;			\
    if (r + n <= N) return true;				\
  } while (0)


/* Note that this can return that a size/alignment is not lock-free even
   if all the operations that we use to implement the respective accesses
   provide lock-free forward progress as specified in C++14:  Users likely
   expect "lock-free" to also mean "fast", which is why we do not return
   true if, for example, we implement loads with this size/alignment using
   a CAS.  */
bool
libat_is_lock_free (size_t n, void *ptr)
{
  switch (n)
    {
    case 0:			return true;
    case 1:	EXACT(1);	goto L4;
    case 2:	EXACT(2);	goto L4;
    case 4:	EXACT(4);	goto L8;
    case 8:	EXACT(8);	goto L16;
    case 16:	EXACT(16);	break;

    case 3: L4:
      LARGER(4);
      /* FALLTHRU */
    case 5 ... 7: L8:
      LARGER(8);
      /* FALLTHRU */
    case 9 ... 15: L16:
      LARGER(16);
      break;
    }

  return false;
}

EXPORT_ALIAS (is_lock_free);
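The EXACT/LARGER cascade above boils down to a size-and-alignment test: an n-byte access is a candidate for lock-freedom when some supported power-of-two width N fully covers the accessed bytes within one naturally aligned N-byte unit. A rough Python model of that decision — it assumes every native width up to 16 bytes supports atomic load and CAS, which the real code checks per size via the HAVE_INT/FAST_ATOMIC_LDST/MAYBE_HAVE_ATOMIC_CAS flags:

```python
def is_lock_free(n, addr, max_native=16):
    # Model of libatomic's check: an access of size n at address addr is
    # lock-free if some power-of-two native size N >= n fully covers the
    # accessed bytes [addr, addr + n) within one naturally aligned
    # N-byte unit.  (Simplifying assumption: all native sizes up to
    # max_native support atomic load and CAS.)
    if n == 0:
        return True
    N = 1
    while N <= max_native:
        r = addr & (N - 1)          # offset within the N-byte unit
        if N >= n and r + n <= N:   # access fits entirely in one unit
            return True
        N *= 2
    return False
```

For example, a 4-byte access at a 4-byte-aligned address is covered exactly, while a 16-byte access at offset 8 straddles two 16-byte units and is not.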
/**
 * Resolves a path to its target node while also visiting all nodes along the way.
 *
 * @param document the document to resolve the path relative to
 * @param visitor  an optional visitor to invoke for each node in the path (can be null)
 */
public Node resolveWithVisitor(Document document, IVisitor visitor) {
    Node node = document;
    if (visitor != null) {
        node.accept(visitor);
    }

    Object oNode = node;
    for (NodePathSegment segment : this.segments) {
        oNode = segment.resolve(oNode);
        if (visitor != null && oNode != null && oNode instanceof IVisitable) {
            ((IVisitable) oNode).accept(visitor);
        }
    }
    return (Node) oNode;
}
N, M = map(int, input().split())
A = [list(map(int, input().split())) for i in range(N)]


def calc(hold):
    # Count, for each held sport, how many people attend it: each person
    # joins the first sport in their preference list that is still held.
    participants = {i: 0 for i in range(1, M + 1)}
    for person in A:
        for sport in person:
            if sport not in hold:
                continue
            participants[sport] += 1
            break
    return participants


X = set(range(1, M + 1))
ans = float('inf')
while X:
    ret_p = calc(X)
    max_participants = max(ret_p.values())
    ans = min(ans, max_participants)
    # Drop one sport with the maximum attendance and try again.
    for key, value in ret_p.items():
        if value == max_participants:
            X.remove(key)
            break
print(ans)
Twisted Fourier(-Stieltjes) spaces and amenability The Fourier(-Stieltjes) algebras on locally compact groups are important commutative Banach algebras in abstract harmonic analysis. In this paper we introduce a generalization of the above two algebras via twisting with respect to 2-cocycles on the group. We also define and investigate basic properties of the associated multiplier spaces with respect to a pair of 2-cocycles. We finally prove a twisted version of the result of Bozejko/Losert/Ruan characterizing amenability of the underlying locally compact group through the comparison of the twisted Fourier-Stieltjes space with the associated multiplier spaces.
/** * * @see <a href= * "../../doc-files/api-spec.html#_types.aggregations.BucketCorrelationFunctionCountCorrelationIndicator">API * specification</a> */ @JsonpDeserializable public class BucketCorrelationFunctionCountCorrelationIndicator implements JsonpSerializable { private final int docCount; private final List<Double> expectations; private final List<Double> fractions; // --------------------------------------------------------------------------------------------- private BucketCorrelationFunctionCountCorrelationIndicator(Builder builder) { this.docCount = ApiTypeHelper.requireNonNull(builder.docCount, this, "docCount"); this.expectations = ApiTypeHelper.unmodifiableRequired(builder.expectations, this, "expectations"); this.fractions = ApiTypeHelper.unmodifiable(builder.fractions); } public static BucketCorrelationFunctionCountCorrelationIndicator of( Function<Builder, ObjectBuilder<BucketCorrelationFunctionCountCorrelationIndicator>> fn) { return fn.apply(new Builder()).build(); } /** * Required - The total number of documents that initially created the * expectations. It’s required to be greater than or equal to the sum of all * values in the buckets_path as this is the originating superset of data to * which the term values are correlated. * <p> * API name: {@code doc_count} */ public final int docCount() { return this.docCount; } /** * Required - An array of numbers with which to correlate the configured * <code>bucket_path</code> values. The length of this value must always equal * the number of buckets returned by the <code>bucket_path</code>. * <p> * API name: {@code expectations} */ public final List<Double> expectations() { return this.expectations; } /** * An array of fractions to use when averaging and calculating variance. This * should be used if the pre-calculated data and the buckets_path have known * gaps. The length of fractions, if provided, must equal expectations. 
* <p> * API name: {@code fractions} */ public final List<Double> fractions() { return this.fractions; } /** * Serialize this object to JSON. */ public void serialize(JsonGenerator generator, JsonpMapper mapper) { generator.writeStartObject(); serializeInternal(generator, mapper); generator.writeEnd(); } protected void serializeInternal(JsonGenerator generator, JsonpMapper mapper) { generator.writeKey("doc_count"); generator.write(this.docCount); if (ApiTypeHelper.isDefined(this.expectations)) { generator.writeKey("expectations"); generator.writeStartArray(); for (Double item0 : this.expectations) { generator.write(item0); } generator.writeEnd(); } if (ApiTypeHelper.isDefined(this.fractions)) { generator.writeKey("fractions"); generator.writeStartArray(); for (Double item0 : this.fractions) { generator.write(item0); } generator.writeEnd(); } } // --------------------------------------------------------------------------------------------- /** * Builder for {@link BucketCorrelationFunctionCountCorrelationIndicator}. */ public static class Builder extends ObjectBuilderBase implements ObjectBuilder<BucketCorrelationFunctionCountCorrelationIndicator> { private Integer docCount; private List<Double> expectations; @Nullable private List<Double> fractions; /** * Required - The total number of documents that initially created the * expectations. It’s required to be greater than or equal to the sum of all * values in the buckets_path as this is the originating superset of data to * which the term values are correlated. * <p> * API name: {@code doc_count} */ public final Builder docCount(int value) { this.docCount = value; return this; } /** * Required - An array of numbers with which to correlate the configured * <code>bucket_path</code> values. The length of this value must always equal * the number of buckets returned by the <code>bucket_path</code>. * <p> * API name: {@code expectations} * <p> * Adds all elements of <code>list</code> to <code>expectations</code>. 
*/ public final Builder expectations(List<Double> list) { this.expectations = _listAddAll(this.expectations, list); return this; } /** * Required - An array of numbers with which to correlate the configured * <code>bucket_path</code> values. The length of this value must always equal * the number of buckets returned by the <code>bucket_path</code>. * <p> * API name: {@code expectations} * <p> * Adds one or more values to <code>expectations</code>. */ public final Builder expectations(Double value, Double... values) { this.expectations = _listAdd(this.expectations, value, values); return this; } /** * An array of fractions to use when averaging and calculating variance. This * should be used if the pre-calculated data and the buckets_path have known * gaps. The length of fractions, if provided, must equal expectations. * <p> * API name: {@code fractions} * <p> * Adds all elements of <code>list</code> to <code>fractions</code>. */ public final Builder fractions(List<Double> list) { this.fractions = _listAddAll(this.fractions, list); return this; } /** * An array of fractions to use when averaging and calculating variance. This * should be used if the pre-calculated data and the buckets_path have known * gaps. The length of fractions, if provided, must equal expectations. * <p> * API name: {@code fractions} * <p> * Adds one or more values to <code>fractions</code>. */ public final Builder fractions(Double value, Double... values) { this.fractions = _listAdd(this.fractions, value, values); return this; } /** * Builds a {@link BucketCorrelationFunctionCountCorrelationIndicator}. * * @throws NullPointerException * if some of the required fields are null. 
*/ public BucketCorrelationFunctionCountCorrelationIndicator build() { _checkSingleUse(); return new BucketCorrelationFunctionCountCorrelationIndicator(this); } } // --------------------------------------------------------------------------------------------- /** * Json deserializer for * {@link BucketCorrelationFunctionCountCorrelationIndicator} */ public static final JsonpDeserializer<BucketCorrelationFunctionCountCorrelationIndicator> _DESERIALIZER = ObjectBuilderDeserializer .lazy(Builder::new, BucketCorrelationFunctionCountCorrelationIndicator::setupBucketCorrelationFunctionCountCorrelationIndicatorDeserializer); protected static void setupBucketCorrelationFunctionCountCorrelationIndicatorDeserializer( ObjectDeserializer<BucketCorrelationFunctionCountCorrelationIndicator.Builder> op) { op.add(Builder::docCount, JsonpDeserializer.integerDeserializer(), "doc_count"); op.add(Builder::expectations, JsonpDeserializer.arrayDeserializer(JsonpDeserializer.doubleDeserializer()), "expectations"); op.add(Builder::fractions, JsonpDeserializer.arrayDeserializer(JsonpDeserializer.doubleDeserializer()), "fractions"); } }
import Data.Char

solve :: [Char] -> [Char]
solve [] = []  -- handle empty input to keep the pattern match exhaustive
solve (x:xs) =
    if length (filter (`elem` ['A'..'Z']) xs) == length xs
        then (if x >= 'a' then toUpper x else toLower x) : map toLower xs
        else x : xs

main :: IO ()
main = do
    s <- getLine
    putStr (solve s)
#include <stdio.h>

int main(void)
{
    int n;
    char U[101][101];

    scanf("%d", &n);

    /* Initialise the adjacency matrix with '0'. */
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            U[i][j] = '0';

    /* Read each vertex's adjacency list and mark its neighbours. */
    for (int i = 0; i < n; i++) {
        int gyo, ko, kaz;
        scanf("%d %d", &gyo, &ko);
        for (int j = 0; j < ko; j++) {
            scanf("%d", &kaz);
            U[i][kaz - 1] = '1';
        }
    }

    /* Print the matrix, space-separated. */
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            if (j == 0)
                printf("%c", U[i][j]);
            else
                printf(" %c", U[i][j]);
        }
        printf("\n");
    }
    return 0;
}
/**
 * A {@link org.terasology.engine.persistence.StorageManager} that performs reading only.
 */
public final class ReadOnlyStorageManager extends AbstractStorageManager {

    public ReadOnlyStorageManager(Path savePath, ModuleEnvironment environment, EngineEntityManager entityManager,
                                  BlockManager blockManager, ExtraBlockDataManager extraDataManager) {
        this(savePath, environment, entityManager, blockManager, extraDataManager, true);
    }

    public ReadOnlyStorageManager(Path savePath, ModuleEnvironment environment, EngineEntityManager entityManager,
                                  BlockManager blockManager, ExtraBlockDataManager extraDataManager,
                                  boolean storeChunksInZips) {
        super(savePath, environment, entityManager, blockManager, extraDataManager, storeChunksInZips);
    }

    @Override
    public void finishSavingAndShutdown() {
        // don't care
    }

    @Override
    public void requestSaving() {
        // don't care
    }

    @Override
    public void waitForCompletionOfPreviousSaveAndStartSaving() {
        // don't care
    }

    @Override
    public void deactivateChunk(Chunk chunk) {
        Collection<EntityRef> entitiesOfChunk = getEntitiesOfChunk(chunk);
        entitiesOfChunk.forEach(this::deactivateOrDestroyEntityRecursive);
    }

    @Override
    public void update() {
    }

    @Override
    public boolean isSaving() {
        return false;
    }

    @Override
    public void checkAndRepairSaveIfNecessary() throws IOException {
        // can't do that ..
    }

    @Override
    public void deleteWorld() {
        // can't do that ..
    }

    @Override
    public void deactivatePlayer(Client client) {
        EntityRef character = client.getEntity().getComponent(ClientComponent.class).character;
        deactivateOrDestroyEntityRecursive(character);
    }
}
package zaplog

import (
	"net"
	"net/http"

	"go.uber.org/zap"
)

var zapLoggerHttpServer string

func runZapLoggerHttpServer(config *zapLoggerConf, level zap.AtomicLevel) {
	mux := http.NewServeMux()
	mux.Handle(config.logApiPath, level)

	listener, err := net.Listen("tcp", config.listenAddr)
	if err != nil {
		Fatal("runZapLoggerHttpServer err", zap.String("ListenAddr", config.listenAddr), zap.Error(err))
	} else {
		zapLoggerHttpServer = "http://" + listener.Addr().String() + config.logApiPath
		Info("make zapLoggerHttpServer success", zap.String("ZapLoggerHttpServer", zapLoggerHttpServer))
	}

	go func() {
		if err = http.Serve(listener, mux); err != nil {
			Fatal("runZapLoggerHttpServer err", zap.String("ListenAddr", config.listenAddr), zap.Error(err))
		}
	}()
}
def _getSynVers(self):
    version = self.sharinfo.get('syn:version')
    return version