Predictive data analysis techniques applied to dropping out of university studies Student dropout is a major problem in universities around the world. To alleviate it, student attrition must be detected as early as possible, before the student actually deserts. A student may be considered a deserter when he or she has not completed the required academic credits or has abandoned the studies. In this paper we present a study carried out at a higher education institution, analyzing the records of 530 students from 52 different degree programs with application dates from 2015 to 2018, considering factors such as academic monitoring, financial situation, and personal and social information; these issues, alone or combined, can affect dropout rates. Analyzing student behavior with predictive analytics techniques can reduce the gaps between professional demands and applicants' competencies. We applied predictive analytical techniques to identify the relationships among the factors characterizing students who leave the university. As a result, we elaborated a conceptual model to predict the risk of desertion and applied machine learning techniques to generate preventive and corrective alerts as a student-retention strategy. This study shows that the information itself is important, but applying machine learning to a student's prior record, and relating it to a dynamic, pre-established profile of the deserting student, is essential for generating early strategies that reduce the gaps between professional demands and applicants' competencies. In addition, a data model has been created to generate the preventive and corrective alerts.
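The preventive/corrective alert idea the abstract describes can be sketched as a simple risk-scoring step. This is a minimal illustration only: the feature names, weights, and thresholds below are invented assumptions, not the model fitted to the 530-student dataset in the paper.

```python
import math

# Hypothetical feature weights -- illustrative only, not fitted to the
# dataset described in the paper.
WEIGHTS = {"credit_ratio": -3.0, "gpa": -0.8, "financial_hold": 1.5}
BIAS = 2.0

def dropout_risk(student):
    """Logistic risk score in [0, 1] from a dict of student features."""
    z = BIAS + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def alert_level(risk, preventive=0.4, corrective=0.7):
    """Map a risk score to the paper's preventive/corrective alert idea.

    The two thresholds are arbitrary assumptions for the sketch."""
    if risk >= corrective:
        return "corrective"
    if risk >= preventive:
        return "preventive"
    return "none"
```

In a real system the weights would come from fitting a classifier (logistic regression or similar) on historical records, and the thresholds would be tuned against the cost of missed dropouts versus false alarms.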
Wisconsin Gov. Scott Walker says British Prime Minister David Cameron confided in him that he was concerned about the direction of American leadership. But there’s a problem with the Republican’s tidy critique of President Barack Obama: Cameron doesn’t remember it that way. Walker, who has taken several trips overseas in recent months to study up on foreign policy in preparation for an all-but-certain presidential bid, told a roomful of Republican donors Friday that world leaders, including Cameron, are worried about the U.S. stepping back in the world. “I heard that from David Cameron back in February when we were over at 10 Downing,” Walker said. “I heard it from other leaders around the world. They’re looking around realizing this lead-from-behind mentality just doesn’t work. It’s just not working.” “The Prime Minister did not say that and does not think that,” a Downing Street spokesperson told TIME. His comments came at the E2 Summit hosted by former Republican presidential nominee Mitt Romney in Deer Valley, Utah, where Walker was auditioning for support from some of the Republican Party’s deep-pocketed donors. Walker and Cameron met Feb. 10 while Walker was traveling on a trade mission for his state. The trip was overshadowed in the U.S. by news coverage of Walker dodging a question on evolution. Walker’s political office deferred comment to his official office, which did not immediately respond to a request for comment. -With reporting by Conal Urquhart / London.
// questions/remove-duplicates-from-sorted-array-ii/Solution.java
/**
 * Follow up for "Remove Duplicates":
 * What if duplicates are allowed at most twice?
 *
 * For example,
 * Given sorted array nums = [1,1,1,2,2,3],
 * Your function should return length = 5, with the first five elements of
 * nums being 1, 1, 2, 2 and 3.
 * It doesn't matter what you leave beyond the new length.
 */
public class Solution {
    public int removeDuplicates(int[] nums) {
        if (nums == null || nums.length <= 0) {
            return 0;
        }
        int acc = 1;    // run length of the current value
        int runner = 1; // read pointer
        int curr = 1;   // write pointer
        while (runner < nums.length) {
            if (nums[runner] == nums[runner - 1]) {
                acc++;
                if (acc <= 2) {
                    nums[curr] = nums[runner];
                    curr++;
                }
            } else {
                nums[curr] = nums[runner];
                curr++;
                acc = 1;
            }
            runner++;
        }
        return curr;
    }
}
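A common compact variant of the same technique uses a single write pointer and compares each element against the one written k positions earlier, which generalizes "at most twice" to "at most k". This Python sketch is my own rephrasing of the idea, not part of the original solution; the function name and k parameter are invented.

```python
def remove_duplicates_k(nums, k=2):
    """In-place dedup of a sorted list, keeping each value at most k times.

    Returns the new logical length, mirroring the Java solution above."""
    write = 0
    for x in nums:
        # Keep x if fewer than k copies of it have been written so far:
        # either we haven't written k elements yet, or the element k slots
        # back differs from x (the list is sorted, so equal values are adjacent).
        if write < k or nums[write - k] != x:
            nums[write] = x
            write += 1
    return write
```

The comparison against `nums[write - k]` works only because the input is sorted, so all copies of a value are contiguous.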
/* Perform an acceptor step of the underlying mechanism exchange. */
static OM_uint32
mech_accept(OM_uint32 *minor, spnego_gss_ctx_id_t ctx, gss_cred_id_t cred,
            struct negoex_message *messages, size_t nmessages,
            gss_channel_bindings_t bindings, gss_buffer_t output_token,
            OM_uint32 *time_rec)
{
    OM_uint32 major, tmpmin;
    struct negoex_auth_mech *mech;
    struct exchange_message *msg;

    assert(!ctx->initiate && !K5_TAILQ_EMPTY(&ctx->negoex_mechs));

    msg = negoex_locate_exchange_message(messages, nmessages, AP_REQUEST);
    if (msg == NULL) {
        if (ctx->negoex_step == 1 ||
            K5_TAILQ_FIRST(&ctx->negoex_mechs)->complete)
            return GSS_S_COMPLETE;
        *minor = ERR_NEGOEX_MISSING_AP_REQUEST_MESSAGE;
        return GSS_S_DEFECTIVE_TOKEN;
    }

    if (ctx->negoex_step == 1) {
        mech = K5_TAILQ_FIRST(&ctx->negoex_mechs);
        if (!GUID_EQ(msg->scheme, mech->scheme))
            return GSS_S_COMPLETE;
    } else {
        mech = negoex_locate_auth_scheme(ctx, msg->scheme);
        if (mech == NULL) {
            *minor = ERR_NEGOEX_NO_AVAILABLE_MECHS;
            return GSS_S_FAILURE;
        }
        negoex_select_auth_mech(ctx, mech);
    }

    if (mech->complete)
        return GSS_S_COMPLETE;

    if (ctx->internal_name != GSS_C_NO_NAME)
        gss_release_name(&tmpmin, &ctx->internal_name);
    if (ctx->deleg_cred != GSS_C_NO_CREDENTIAL)
        gss_release_cred(&tmpmin, &ctx->deleg_cred);
    major = gss_accept_sec_context(minor, &mech->mech_context, cred,
                                   &msg->token, bindings, &ctx->internal_name,
                                   &ctx->actual_mech, output_token,
                                   &ctx->ctx_flags, time_rec,
                                   &ctx->deleg_cred);

    if (major == GSS_S_COMPLETE)
        mech->complete = 1;

    if (!GSS_ERROR(major)) {
        major = get_session_keys(minor, mech);
    } else if (ctx->negoex_step == 1) {
        major = GSS_S_COMPLETE;
        *minor = 0;
        gss_release_buffer(&tmpmin, output_token);
        gss_delete_sec_context(&tmpmin, &mech->mech_context, GSS_C_NO_BUFFER);
    }

    return major;
}
The country’s dominant bargain-store chain made headlines last month when it confirmed plans to introduce prices that will be much closer to $5 (with taxes) than to a buck. Dollarama has had a pretty good run in recent years by introducing new products at higher prices, on everything from food to seasonal products to housewares, that shoppers have eagerly embraced. So much so that more than three quarters of the items sold in store now are priced at more than $1 (see chart below). That’s a big difference from just a few years ago, when half the items sold were $1.00 or less. The problem now, though, is that Dollarama is struggling to find items at the new high end of its price range that are good enough to put on the shelf, according to experts. Dollarama is experiencing “difficulty in sourcing products at the new $3.50 and $4.00 price points,” experts at BMO Capital Markets said in a research brief on Monday. Exceptional value The people responsible for finding the assortment of products that have been critical in making the bargain chain so successful in recent years are hitting a bit of a wall, it seems. “[They’re] not able to source a significant number of items that they believe would deliver ‘exceptional value’ at these higher price points, following Dollarama’s recent buying trip to China,” the BMO experts said. Slow roll-out BMO’s experts said they believe Dollarama will eventually get it right and find higher-priced items shoppers will be as enthusiastic about. But they don’t foresee those products being a meaningful portion of sales this year. “Management suggested that the roll-out of these products would be ‘very slow,’” the BMO note said.
import '@/styles/index.scss'

import React, { ReactElement } from 'react'
import { History } from 'history'
import { Helmet } from 'react-helmet'
import { Provider } from 'react-redux'
import { ConnectedRouter } from 'connected-react-router'

import ErrorBoundary from '@/components/error-boundary'
import RoutesElem from './app.routes'
import configureStore from './store/configStore'
import Wrapper from './components/wrapper'

interface IProps {
  history: History
  store: ReturnType<typeof configureStore>
}

const App = ({ history, store }: IProps): ReactElement => {
  return (
    <ErrorBoundary>
      <Helmet defaultTitle='Aid Mate' titleTemplate='%s'>
        <link rel='canonical' href={process.env.SERVER_BASE_URL} />
      </Helmet>
      <Provider store={store}>
        <ConnectedRouter history={history}>
          <Wrapper>
            <RoutesElem />
          </Wrapper>
        </ConnectedRouter>
      </Provider>
    </ErrorBoundary>
  )
}

export default App
// Check that nodes pointed to with connect-thinblock actually support thinblocks
BOOST_FOREACH (string &strAddr, mapMultiArgs["-connect-thinblock"])
{
    CNodeRef node = FindNodeRef(strAddr);
    if (node && !node->ThinBlockCapable())
    {
        LogPrintf("ERROR: You are trying to use connect-thinblocks but to a node that does not support it "
                  "- Protocol Version: %d peer=%s\n",
                  node->nVersion, node->GetLogName());
    }
}
// components/apps/BoxedWine/useBoxedWine.ts
import { getConfig, libs } from "components/apps/BoxedWine/config";
import useTitle from "components/system/Window/useTitle";
import { useFileSystem } from "contexts/fileSystem";
import { basename, extname } from "path";
import { useCallback, useEffect, useRef } from "react";
import { loadFiles } from "utils/functions";

declare global {
  interface Window {
    BoxedWineConfig: {
      isRunning?: boolean;
      urlParams: string;
    };
    BoxedWineShell: (onLoad: () => void) => void;
  }
}

const getExeName = async (zipData: Buffer): Promise<string | undefined> => {
  const { unzip } = await import("utils/zipFunctions");
  const fileList = Object.entries(await unzip(zipData));
  const [[fileName] = []] = fileList
    .filter(([name]) => name.toLowerCase().endsWith(".exe"))
    .sort(([, aFile], [, bFile]) => bFile.length - aFile.length);

  return fileName;
};

const useBoxedWine = (
  id: string,
  url: string,
  _containerRef: React.MutableRefObject<HTMLDivElement | null>,
  setLoading: React.Dispatch<React.SetStateAction<boolean>>
): void => {
  const { appendFileToTitle } = useTitle(id);
  const { readFile } = useFileSystem();
  const loadedUrl = useRef<string>();
  const loadEmulator = useCallback(async (): Promise<void> => {
    let dynamicConfig = {};
    let appPayload = url ? await readFile(url) : Buffer.from("");
    const extension = extname(url).toLowerCase();
    const isExecutable = extension === ".exe";
    const { zipAsync } = await import("utils/zipFunctions");
    const appName =
      isExecutable || !url
        ? basename(url, extension)
        : await getExeName(appPayload);

    if (isExecutable) {
      appPayload = Buffer.from(await zipAsync({ [basename(url)]: appPayload }));
    }

    dynamicConfig = {
      ...(appPayload ? { "app-payload": appPayload.toString("base64") } : {}),
      ...(appName ? { p: appName } : {}),
    };
    window.BoxedWineConfig = {
      ...window.BoxedWineConfig,
      urlParams: getConfig(dynamicConfig),
    };
    loadFiles(libs).then(() => {
      if (url) appendFileToTitle(appName || basename(url));

      try {
        window.BoxedWineShell(() => setLoading(false));
      } catch {
        // Ignore BoxedWine errors
      }
    });
  }, [appendFileToTitle, readFile, setLoading, url]);

  useEffect(() => {
    if (
      (!loadedUrl.current && typeof url === "string") ||
      (loadedUrl.current && url)
    ) {
      loadedUrl.current = url;
      loadEmulator();
    }

    return () => {
      window.BoxedWineConfig = {
        ...window.BoxedWineConfig,
        isRunning: false,
      };
    };
  }, [loadEmulator, url]);
};

export default useBoxedWine;
/**
 * Starts the booting process and handles exceptions with the console frame.
 *
 * @param args Command line arguments
 */
public final void securedBoot(String[] args) {
    try {
        unsecuredBoot(args);
    } catch (Exception e) {
        errorOccured = true;
        handleException(e);
    }

    if (!getConsoleFrame().isErrorOccured() && !isErrorOccured())
        getConsoleFrame().setVisible(false);
    else
        getConsoleFrame().setVisible(true);
}
/// Return an iterator over all elements inside the memory reservations block
/// of this device tree.
///
/// Memory reservation regions should never be used as normal memory by the
/// kernel.
pub fn memory_reservations(&'tree self) -> MemoryReservations<'tree> {
    let start = self.mem_rsv_offset() as usize;
    let data = &self.buf[start..];
    MemoryReservations { data }
}
Students at the Infantry Officers’ Course fire a mortar round during a mountain attack in the Bullion Training Area at Twentynine Palms, Calif., March 21, 2012. WASHINGTON — The first two female lieutenants to volunteer for the Marine Corps’ Infantry Officer Course failed to complete the program, the Marine Corps said Tuesday. The first woman did not finish the combat endurance test at the beginning of the course in late September. Twenty-six of the 107 male Marines also did not finish the endurance test. The second woman could not complete two required training events “due to medical reasons,” said Capt. Eric Flanagan, a Marine spokesman. She is receiving treatment and is in “good condition,” Flanagan said, though the Marine Corps is not releasing specifics about her medical condition or any identifying information about either of the women. The women will now attend their primary military occupational specialty schools, Flanagan said. The Corps decided earlier this year to allow female lieutenants to attend its school for infantry officers as part of a larger effort to gather data on how to expand the role of women in combat. Male infantry officers must complete the 10-week course after graduating from The Basic School. In addition to allowing women to volunteer for IOC, the Marines will allow enlisted women to volunteer to train with the infantry training battalion for research purposes. The Corps will give men and women volunteers a strength test to see how they respond to heavy machine gun lift, casualty evacuation and “march under load” assessments, according to a service-wide message released in April. As part of the assessment process, the Marine Corps will assign some active-duty female officers and high-ranking enlisted female Marines to certain jobs in combat-related battalions for the first time. 
The units include artillery, tank, combat engineer, combat assault, low altitude air defense and assault amphibious battalions, all within the women’s existing specialties. Female Navy medical officers, chaplains and corpsmen now may also be assigned to those battalions. Marine Corps Commandant Gen. James Amos will use the information gathered in the test programs and initiatives to make recommendations about how to change the policies that currently bar women from combat. [email protected] Twitter: @jhlad
/* Adds the link l to the list of incoming future links
 * for l's target.
 */
static void
add_future_incoming(dcontext_t *dcontext, fragment_t *f, linkstub_t *l)
{
    future_fragment_t *targetf;
    app_pc target_tag = EXIT_TARGET_TAG(dcontext, f, l);

    ASSERT(LINKSTUB_DIRECT(l->flags));
    ASSERT(!SHARED_FRAGMENTS_ENABLED() || !dynamo_exited ||
           self_owns_recursive_lock(&change_linking_lock));
    ASSERT(linkstub_owned_by_fragment(dcontext, f, l));

    if (!TEST(FRAG_SHARED, f->flags))
        targetf = fragment_lookup_private_future(dcontext, target_tag);
    else
        targetf = fragment_lookup_future(dcontext, target_tag);
    if (targetf == NULL) {
        targetf = fragment_create_and_add_future(
            dcontext, target_tag, (f->flags & (FRAG_SHARED | FRAG_TEMP_PRIVATE)));
    }
    add_to_incoming_list(dcontext, f, l, (fragment_t *)targetf, false);
    if (!TEST(FRAG_SHARED, f->flags)) {
        add_private_check_shared(dcontext, f, l);
    }
}
Welcome to K I LLMYW I FE.COM We understand. You're desperately in need of help. You can't take it anymore. The bitch needs to die. You want to kill your wife, but you just don't know where to start. You've come to the right place. If you're too sick of her to even bitch about it and really want to master the dating game, click here to check out the Badass method. Updates (if you've been here before) HOT NEWS: WE HAVE A FORUM. Check it out here. Sign up. Bitch about your wife. Post naked pictures of her blubbery ass. Spam links. We don't give a shit!!! Um. Ok. We've been at this a while. We've been "researching". You know, we got the dot com urge. We were going to make MILLION$. We haven't made any money yet, but we've spent a ton on hosting fees and spent a bunch of time explaining ourselves to the police. Somehow, they don't seem to get our humor. So before we get arrested (again), we need to clear a few things up. (If you really love us you'll keep coming back, send us suggestions and buy our tee-shirts. More on that later.) 1) We will NOT kill your wife for you!! While other internet companies have been able to get away with certain activities of questionable legality for quite some time, the law does catch up with them eventually. The penalties for murder being what they are, we decided that we shouldn't try this. Not that there isn't a tremendous demand for the service, as evidenced by just a few of the emails we've received: I really wish you guys would get this site up....I am running out of ideas and nothing has worked yet...I NEED HELP!!! Any recommendations would be GREATLY APPRECIATED ASAP. Descrete, Life Ins., & Get aways readily available OOOOOOHHHHHHH YYYEEEAAAHHHH!!!!!!!!!!!! HOW DO I SIGN UP?????!!!!!!!! We met at the Dallas/Fort Worth airport and you mentioned you could help me in my endeavors. Well I am now calling on your expertise. I need to know a soft, quiet, inexpensive way to accomplish what we were discussing. 
(We don't know who this person is, and we wonder just who they did meet at the Dallas/Fort Worth airport) Her names Melanie... How much? 2) We will NOT kill your wife for you!!! We know - the bitch needs to die. We understand. Believe us, we've been there. Oh, god have we been there. And while we're on the subject we should point out that it applies even more to ex-wives. But somebody got that name before us. Hopefully they will do something clever with it (like offer it to us to take over). Anyway, we are here to offer relief. Not permanent relief, but relief nonetheless. And we will listen patiently to your gripes, complaints and general bitching, such as the following: I'm sitting at the car dealer - it's 7:30 and we were supposed to go together at 6:30. I tried calling her cellphone and workphone, but couldn't reach her. She called me once in the past hour to tell me she was waiting - I had meanwhile gone up on my own to save her time. Of course her work number didn't pick-up (it rolls-over after 6:30) and her cellphone didn't work (her battery is dead since she never charges it). So - I've wasted an hour of my life waiting for her; she's finally on her way after she thought to finally try me again. She says it's just as much my fault - I told her she had said there was no way she could be in the building after 6:30, so why would I work late tonight? Just the usual problems - poor communication, no back-up plan, and I get shit for it! I WANT TO KILL MY WIFE! Now, can you see how lucky this poor fellow's co-workers were? How many times have they had to listen to this stuff? This time, we listened instead. You see, relief!! That's what we're here for. And you can send your gripes, etc. to us at [email protected]. Your co-workers will thank you. Just remember that when you send it to us, it's ours. And we may choose to air your dirty laundry here so that others may find relief too. 3) We will NOT kill your wife for you!! We are not professional hit men. 
Shit, we're not even professional web designers. Well, yes we are. Sort of. (email us if you're interested) But we are absolutely not professional hit men. We're not even amateur hit men. We're not even men (all of us). We don't even encourage you to kill your wife. Hopefully, after a bit more research, we will determine if it is safe for us to encourage you to do anything, legally speaking. In the meantime, we encourage you to continue to send us your comments, complaints, and suggestions to [email protected]. Remembering of course that 1) what you send us becomes ours and we may use it, and 2) WE WILL NOT KILL YOUR WIFE FOR YOU. Ok, you can even send us naked pictures of her blubbery fat butt, too, if you want. We're here to help... Don't worry, more is coming. We know you need it. Oh, we almost forgot. We've had a bunch of inquiries and we ARE working on tee-shirts. We'll let you know when they're ready. Other funny stuff about killing wives (If you are a wife, you may not find this so funny!), The OJ Trial as Told by Dr. Seuss.
NEW DELHI: Rahul first heard of the virtual currency called bitcoin four years ago while playing multi-player online games. “There was an option to buy lives and resurrect your character so I traded my bitcoin stash,” says Rahul, who goes by just his first name. One life for 500 bits (one bit is one millionth of a bitcoin) was not a bad deal at all. But as he recounts that story now sitting in a café in Delhi, the 32-year-old says, “I regret it. I regret it so much. I should have saved those.” Had he held on, he could have been a millionaire today. The cryptocurrency’s value has skyrocketed – from $13 for one bitcoin (denoted with BTC) in January 2013 to $1,182 (Rs 76,000 approx) now. Rahul, a data analysis consultant in Delhi, has since invested in the cryptocurrency though his family back in Muzaffarpur, Bihar, is mystified by his move. “When I told my brother about it, he asked me, ‘Iska maalik kaun hai?’ (Who owns it?) I told him, no one does. Everyone does. He wasn’t convinced. The functioning is difficult to understand for a layperson,” says Rahul. Bitcoins can be bought or mined (discovering new ones) but the latter requires high-end computing power and lots of time. Depending on the kind of hardware one uses, one can currently mine up to 0.1 BTC per month. “That is neither profitable nor practical,” says Rahul, who, like many others, buys them off an exchange. The “investment” works much like it would with gold or other currencies — buy, wait for value to appreciate, sell. Sahil Bansal (name changed on request), a business analyst based in Noida, is also among the small but growing tribe of Indian “bitcoiners”. “I have never used it for transactions. I only keep it as an asset,” says Bansal, who bought Rs 50,000 worth last year after a friend told him about it. 
Today, he can make over Rs 75,000 if he sells but he wants to hold on till the government takes a decision on how it wants to treat the cryptocurrency. On Wednesday, the Union ministry of finance set up an inter-disciplinary committee that will look at global regulatory frameworks for Bitcoin, and suggest measures to deal with the same in India. The committee will submit a report in three months. The move is significant considering that Japan has legalised it as a payment method. In India, one can buy bitcoins through a Bitcoin exchange or directly from an individual. Saurabh Agarwal, 41, co-founder and CEO of Indian Bitcoin exchange Zebpay, says very few exchange them for goods and services in the physical world. “Like gold, Indians believe them to be an asset class and its actual use case is only 1% of total transaction value. We believe that once Indian merchants get clarity on regulation and taxation of bitcoins, we will see them using it as a payment gateway,” says Agarwal. There are some who make money buying bitcoin off one exchange and selling it on another. Benson Samuel, CTO and founder of exchange Coinsecure, says he has seen people indulge in this arbitrage. “One lady used to buy from our exchange, and within a few hours, sell it on another. That works because different exchanges have different prices,” he says. Given the high value, it is more common to deal in fractions of bitcoin. “You can trade for as low as 0.01 BTC on Coinsecure,” says Samuel, who started his exchange in 2014. Techies and traders are among the most common users of bitcoins, he says. What makes the currency attractive is that it can be used to move money across the globe quickly and anonymously, and that it is free of control from any central bank or government. Hence, its association with the illegal drug trade. Just last month in India, two college students from Mumbai were arrested for possession of LSD, a psychedelic drug. 
They bought it online, paying for it in the cryptocurrency. Despite it all, Rahul is idealistic about what the currency can do. “The government has to try it at least once. You can’t track cash like you can track bitcoin,” he says. The blockchain technology that underlies Bitcoin makes sure that every transaction is traceable right up to the origin point. It is often described as a “decentralised ledger” that records every payment. Since this ledger is public, no intermediaries are required to authenticate transactions. “This is the kind of transparency that is required. Just think of transactions for property — not even an inch of land can be bought illegally if this is used. I think the government should make regulations at some point. It is when bitcoins are converted into rupees that regulation will be required. Bitcoin to bitcoin transactions are transparent and trackable,” says Rahul.
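The "decentralised ledger" idea described above can be illustrated with a toy hash chain: each block embeds the hash of the previous block, so altering any earlier transaction invalidates everything after it. This is a deliberately simplified sketch of the traceability property, not a model of Bitcoin's actual block format, consensus, or mining.

```python
import hashlib
import json

def add_block(chain, transaction):
    """Append a transaction, linking it to the previous block's hash.

    Toy sketch of a hash-chained ledger -- not real Bitcoin."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"tx": transaction, "prev": prev_hash}
    # Hash the block body (transaction + previous hash) to seal it.
    block["hash"] = hashlib.sha256(
        json.dumps({"tx": block["tx"], "prev": block["prev"]},
                   sort_keys=True).encode()
    ).hexdigest()
    chain.append(block)
    return chain

def is_valid(chain):
    """Recompute every link; tampering with any earlier tx breaks the chain."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"tx": block["tx"], "prev": block["prev"]}
        h = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != expected_prev or block["hash"] != h:
            return False
    return True
```

Because every block commits to its predecessor's hash, a reader can walk the chain back to the origin of any payment, which is the traceability Rahul alludes to; real Bitcoin adds proof-of-work and peer-to-peer replication on top of this linking.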
import { AzureFunction } from '../types';
import { customFetch } from '../lib/utils';
import { createUri } from './server';

export async function azureFunction(context, req) {
  try {
    const uri: string = await createUri();
    const res = await customFetch(uri, req);
    if (res.status === 200) {
      context.res = {
        status: res.status,
        body: res.myBody,
        headers: res.myHeaders,
      };
    } else {
      // Not Found
      context.res = {
        status: res.status,
        body: res.statusText,
      };
    }
  } catch (err) {
    context.res = {
      status: 500,
      body: err,
    };
  }
  console.log('context.res:', JSON.stringify(context.res, null, 2));
}
/*
 * Issue a control request to the specified device.
 * This is O/S specific ...
 */
static inline int ctrl_msg(int device,
                           unsigned char requestType, unsigned char request,
                           unsigned short value, unsigned short index,
                           unsigned char *data, size_t length)
{
    struct usbdevfs_ctrltransfer ctrl;

    if (length > USHRT_MAX) {
        logerror("length too big\n");
        return -EINVAL;
    }

    ctrl.bRequestType = requestType;
    ctrl.bRequest = request;
    ctrl.wValue = value;
    ctrl.wLength = (unsigned short) length;
    ctrl.wIndex = index;
    ctrl.data = data;
    ctrl.timeout = 10000;

    return ioctl(device, USBDEVFS_CONTROL, &ctrl);
}
// Etaler/Core/String.hpp
#pragma once

#include <vector>
#include <string>
#include <sstream>

#include "Shape.hpp"

namespace et {

inline std::vector<std::string> split(const std::string& str, char delim = ',')
{
    std::size_t current, previous = 0;
    std::vector<std::string> cont;
    current = str.find(delim);
    while (current != std::string::npos) {
        cont.push_back(str.substr(previous, current - previous));
        previous = current + 1;
        current = str.find(delim, previous);
    }
    cont.push_back(str.substr(previous, current - previous));
    return cont;
}

inline std::string hash_string(const std::string& str)
{
    auto hash = std::hash<std::string>()(str);
    std::stringstream ss;
    ss << std::hex << hash;
    return ss.str();
}

inline void replaceAll(std::string& str, const std::string& from, const std::string& to)
{
    if (from.empty())
        return;
    size_t start_pos = 0;
    while ((start_pos = str.find(from, start_pos)) != std::string::npos) {
        str.replace(start_pos, from.length(), to);
        start_pos += to.length(); // In case 'to' contains 'from', like replacing 'x' with 'yx'
    }
}

class Tensor;
enum class DType;

// A lazy and not bullet-proof hashing function
template <typename ... Args>
inline std::string hashify(const Args& ... args)
{
    auto to_str = [](const auto& v) -> std::string {
        using ValueType = std::decay_t<decltype(v)>;
        // HACK: You should NEVER put a tensor in hashify
        static_assert(std::is_same_v<ValueType, Tensor> == false);
        using namespace std;
        if constexpr (std::is_same_v<ValueType, std::string>)
            return v;
        else if constexpr (std::is_same_v<ValueType, Shape> || std::is_same_v<ValueType, DType>)
            return to_string(v);
        else
            return std::to_string(v);
    };
    std::string concated = ((to_str(args) + " ") + ...);
    return hash_string(concated);
}

}
class ACIconfig:
    """Class which contains and manages the ACI configuration dict."""

    aciitemcfg = {}  # item configuration structure
    configfiledir = ""  # absolute path to configfiles dir

    def __init__(self, acicfgfile, aciapidict, excel=False):
        """
        :param acicfgfile: config file name, or a config struct when excel=True
        """
        self.aciapidict = aciapidict
        self.aciitemcfg["aci_items"] = {}
        if excel:
            # if Excel, this parameter already contains a struct variable, not a filename
            aci_cfg = acicfgfile
        else:
            aci_cfg = []
            self.acicfgfile = acicfgfile
            curdir = os.getcwd()
            tmpfilename = os.path.join(curdir, self.acicfgfile)
            # absolute path to configfile
            self.acicfgfile = os.path.abspath(os.path.realpath(tmpfilename))
            # directory where the config files are located
            self.configfiledir = os.path.dirname(self.acicfgfile)
            aci_cfg = self.read_configfile(self.acicfgfile)
        self.process_aci_config(aci_cfg, excel)

    def read_configfile(self, cfgfile):
        """Read a configuration file into cfgfilevar.

        :param cfgfile: configuration file name
        :return: raw file contents
        """
        # absolute path to filename
        cfgfile = os.path.join(self.configfiledir, cfgfile)
        cfgfile = os.path.normpath(cfgfile)
        if os.path.isfile(cfgfile):
            try:
                with open(cfgfile) as data_file:
                    cfgfilevar = data_file.read()
            except IOError:
                print("Unable to read the file", cfgfile)
                exit(1)
        else:
            print("Cannot find the file", cfgfile)
            exit(1)
        return cfgfilevar

    def yaml2struct(self, cfgfilevar):
        """Transform a configuration yaml file stored in cfgfilevar into
        structured data.

        :param cfgfilevar: raw yaml text
        :return: structured config (lists, dicts)
        """
        struct = yaml.safe_load(cfgfilevar)
        return struct

    def process_aci_config(self, acicfgfile, is_cfg_struct=False, isj2=False, vars={}):
        """Process a configuration file stored in acicfgfile (plain text).

        :param acicfgfile: configuration text or struct
        :param isj2: is it a jinja2 file?
        :type isj2: Boolean
        :param vars: template variables, defaults to {}
        :type vars: dict, optional
        """
        urlparams = {}
        # if the file is a j2 template
        if isj2:
            acicfgtemplate = Template(acicfgfile)
            # render the final yaml configuration file from the j2 template
            acicfgfile = acicfgtemplate.render(vars)
            # print(acicfgfile)
        if is_cfg_struct:
            # acicfgfile is in fact not a yaml file but a python data structure
            acicfg = acicfgfile
        else:
            # convert yaml file into structured data
            acicfg = self.yaml2struct(acicfgfile)
        # process standard ACI config structure
        if "imdata" in acicfg:
            for subtree in acicfg["imdata"]:
                self.process_tree(subtree, urlparams)
        # 'default' names of parameters which are used in urlparams (aciapidesc)
        # when the full config tree is not specified in aci_tree,
        # i.e. fvTenant is not specified
        if "url_names" in acicfg:
            urlparams = acicfg["url_names"]
        # extract variables from the configuration file;
        # if vars are configured in a j2 template, they will be used in child files
        if "vars" in acicfg:
            for varitem in acicfg["vars"].keys():
                vars[varitem] = acicfg["vars"][varitem]
        # transform final yaml config into structured data
        # acicfg = self.yaml2struct(renderedcfg)
        if "aci_trees" in acicfg and "aci_items" not in acicfg:
            # tree based configuration
            for subtree in acicfg["aci_trees"]:
                self.process_tree(subtree, urlparams)
        elif "aci_items" in acicfg:
            # self.aciitemcfg["aci_items"] = acicfg["aci_items"]
            # !!! THIS SHOULD BE DONE BETTER !!!
            self.process_items_cfg(acicfg["aci_items"])
        # config file contains a reference to another config file
        if "aci_cfgfiles" in acicfg:
            for filename in acicfg["aci_cfgfiles"]:
                # absolute path to the file
                filename = os.path.join(self.configfiledir, filename)
                filename = os.path.normpath(filename)
                aci_cfg = self.read_configfile(filename)
                # is the filename a j2 template?
                isj2 = filename.split(".")[-1] == "j2"
                self.process_aci_config(aci_cfg, False, isj2, vars)

    def process_items_cfg(self, itemscfg):
        """Process a list of item based configuration.

        Add them to the final item configuration structure self.aciitemcfg.
        """
        for item in itemscfg:
            key = list(item.keys())[0]
            if key not in self.aciitemcfg["aci_items"]:
                self.aciitemcfg["aci_items"][key] = []
            self.aciitemcfg["aci_items"][key].append(item[key])

    def process_tree(self, subtree, urlparams):
        """Process a tree based configuration and create item based
        configuration ['aci_items']. Recurrent function.

        :param subtree: subtree which is being processed
        :param urlparams: parameters used in the POST URL
        """
        newitem = {}
        mykey = list(subtree)[0]  # currently processed item of the cfg tree
        stopproc = False  # don't recurse down the tree, stop processing this branch
        if mykey not in self.aciapidict:
            print("cannot process key {}".format(mykey))
            return
        # stop processing this branch, save all children together with this item
        if "stopproc" in self.aciapidict[mykey]:
            if (
                self.aciapidict[mykey]["stopproc"] == "True"
                or self.aciapidict[mykey]["stopproc"] == "Yes"
            ):
                stopproc = True
        if mykey not in self.aciitemcfg["aci_items"]:
            self.aciitemcfg["aci_items"][mykey] = []
        for itemkey in subtree[mykey]:
            # save all children together with this item if stopproc == True;
            # if stopproc == False continue with processing of children
            if itemkey == "children" and not stopproc:
                continue
            # copy is important, it removes the link to the original subtree (dict)
            # location; this is a problem when the dict is dumped into yaml
            newitem[itemkey] = subtree[mykey][itemkey].copy()
        for itemkey in urlparams:
            # add names of parent items
            newitem[itemkey] = urlparams[itemkey].copy()
        self.aciitemcfg["aci_items"][mykey].append(newitem)
        if mykey not in urlparams:
            urlparams[mykey] = dict()
        if mykey in self.aciapidict and "key" in self.aciapidict[mykey]:
            # if a key attribute is specified, use it instead of the name attribute
            for keyattribute in self.aciapidict[mykey]["key"]:
                # keyattribute = self.aciapidict[mykey]['key']
                if keyattribute in subtree[mykey]["attributes"]:
                    urlparams[mykey][keyattribute] = subtree[mykey]["attributes"][
                        keyattribute
                    ]
        if "name" in subtree[mykey]["attributes"]:
            # if 'name' is an attribute of the current key (mykey), add it even if
            # it is not specified in the 'key' section of the API description file;
            # urlparams contains the names of all superior items in the current
            # tree branch, i.e. fvAEP will contain the names of the superior fvAp
            # and fvTenant; the template which creates the URL uses this urlparams dict
            urlparams[mykey]["name"] = subtree[mykey]["attributes"]["name"]
        if "children" in subtree[mykey] and not stopproc:
            # does the current item contain child(ren)?
            try:
                for key in subtree[mykey]["children"]:
                    # yes, process it
                    # copy is important, otherwise the same variable is referenced
                    self.process_tree(key, urlparams.copy())
            except TypeError:
                # no children are defined
                None

    def getconfig(self):
        return self.aciitemcfg

    def getitemconfig(self):
        itemcfg = {"aci_items": self.aciitemcfg["aci_items"]}
        return itemcfg
/**
 * <p><b>(This method works with JSON data)</b></p>
 *
 * <p>Uploads a JSON string to the device. Calls the <code>write</code> effect
 * command from the <a href = "https://forum.nanoleaf.me/docs#_u2t4jzmkp8nt">OpenAPI</a>.
 * Refer to it for more information about the commands.</p>
 *
 * <h1>Commands:</h1>
 * <ul>
 *   <li><i>add</i> - Installs an effect on the device, or updates the effect
 *       if it already exists.</li>
 *   <li><i>delete</i> - Permanently removes an effect from the device.</li>
 *   <li><i>request</i> - Requests a single effect by name.</li>
 *   <li><i>requestAll</i> - Requests all the installed effects from the device.</li>
 *   <li><i>display</i> - Sets a color mode on the device (used for previewing
 *       effects).</li>
 *   <li><i>displayTemp</i> - Temporarily sets a color mode on the device (typically
 *       used for notifications or visual indicators).</li>
 *   <li><i>rename</i> - Changes the name of an effect on the device.</li>
 * </ul>
 *
 * @param command the JSON command to perform the write operation with
 * @return the response body from the device
 * @throws NanoleafException If the access token is invalid, or the command parameter
 *                           is invalid JSON, or the command parameter contains an
 *                           invalid command or has invalid command options
 * @throws IOException If an HTTP exception occurs
 */
public String writeEffect(String command) throws NanoleafException, IOException {
    String body = String.format("{\"write\": %s}", command);
    return put(getURL("effects"), body);
}
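To make the payload concrete, here is a Python sketch of the wrapping `writeEffect` performs. Only the outer `{"write": ...}` wrapper comes from the method above; the inner field names (`command`, `animName`) are assumptions taken from the Nanoleaf OpenAPI examples and should be verified against that document.

```python
import json

# Hypothetical "request" command; "command" and "animName" are assumed field
# names from the Nanoleaf OpenAPI examples, not taken from the method above.
command = {"command": "request", "animName": "Snowfall"}

# writeEffect() wraps the caller's JSON exactly like this before the PUT:
body = "{\"write\": %s}" % json.dumps(command)
parsed = json.loads(body)  # the wrapper itself remains valid JSON
```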
/* Configure CGX LMAC in internal loopback mode */ int cgx_lmac_internal_loopback(void *cgxd, int lmac_id, bool enable) { struct cgx *cgx = cgxd; u8 lmac_type; u64 cfg; if (!cgx || lmac_id >= cgx->lmac_count) return -ENODEV; lmac_type = cgx_get_lmac_type(cgx, lmac_id); if (lmac_type == LMAC_MODE_SGMII || lmac_type == LMAC_MODE_QSGMII) { cfg = cgx_read(cgx, lmac_id, CGXX_GMP_PCS_MRX_CTL); if (enable) cfg |= CGXX_GMP_PCS_MRX_CTL_LBK; else cfg &= ~CGXX_GMP_PCS_MRX_CTL_LBK; cgx_write(cgx, lmac_id, CGXX_GMP_PCS_MRX_CTL, cfg); } else { cfg = cgx_read(cgx, lmac_id, CGXX_SPUX_CONTROL1); if (enable) cfg |= CGXX_SPUX_CONTROL1_LBK; else cfg &= ~CGXX_SPUX_CONTROL1_LBK; cgx_write(cgx, lmac_id, CGXX_SPUX_CONTROL1, cfg); } return 0; }
A 581/781 MSPS 3-BIT CMOS FLASH ADC USING TIQ COMPARATOR An analog-to-digital converter (ADC) is a mixed-signal device that converts analog signals (real-world signals) into digital signals (which are binary in nature) for information processing. Since signal processing in the digital domain has been widely studied, designing an analog-to-digital converter has become more challenging for researchers. In the digital domain, low-power and low-voltage requirements are becoming more important issues as the MOSFET channel length shrinks below 0.25 µm. In this paper, a novel flash analog-to-digital converter for low-power and high-speed applications is proposed by incorporating the threshold inverter quantization (TIQ) technique. The key idea of this technique is to generate 2^n − 1 differently sized TIQ comparators for an n-bit converter; the fast data conversion improves the operating speed, and the elimination of the ladder resistors leads to a significant reduction in power consumption. The use of two cascaded inverters as a voltage comparator is the reason for the technique's name (TIQ). To construct an n-bit TIQ-based flash ADC, one must find 2^n − 1 different inverters, each with a different switching-threshold voltage V_m, and arrange them in the order of their V_m values. The gain boosters make the comparator output thresholds sharper and provide a full digital output voltage swing. The comparator outputs, the thermometer code, are converted to a binary code in two steps through the '01' generator and the encoder. The proposed 3-bit flash ADC using TIQ is designed using a fat tree encoder and simulated with the Tanner EDA tool in 0.25 µm CMOS technology. This paper presents the 3-bit flash ADC for supply voltages of 2.5 V and 3.3 V. Transistor widths are also varied from 1.5 µm to 12.1 µm, for an input frequency of 1 MHz, and the speed and power for each variation are estimated. The optimum values of power and speed are then chosen from among the obtained values.
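The conversion chain described above (2^n − 1 TIQ comparators → thermometer code → binary) can be sketched behaviorally. The evenly spaced thresholds below are an illustrative stand-in for the inverter switching voltages V_m, which in silicon are fixed by transistor sizing rather than chosen directly:

```python
# Behavioral sketch of a 3-bit TIQ flash ADC (illustrative only).

def tiq_flash_adc_3bit(vin, vdd=2.5):
    n_bits = 3
    # 2**3 - 1 = 7 comparators; evenly spaced thresholds stand in for the
    # inverter switching voltages V_m set by transistor sizing.
    thresholds = [(i + 1) * vdd / 2**n_bits for i in range(2**n_bits - 1)]
    # Each TIQ comparator outputs 1 when vin exceeds its threshold,
    # producing a thermometer code.
    thermometer = [1 if vin > t else 0 for t in thresholds]
    # The '01' generator plus encoder reduce the thermometer code to binary;
    # behaviorally, that is just counting the ones.
    return sum(thermometer)

print(tiq_flash_adc_3bit(1.6))  # → 5 (mid-scale input at vdd = 2.5 V)
```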
///////////////////////////////////////////////////////////
// Set default or migrate SNI registry values.
//
// inputs:
//  dwIndex[in]:     the index to the value array that is to be set.
//  hSNIRootNew[in]: the current SNI registry root key.
//  hSNIRootOld[in]: the latest SNI registry root key, if any. It
//                   should not point to the same location as
//                   hSNIRootNew does.
//  fOverWrite[in]:  whether to overwrite or migrate.
//
// returns:
//  ERROR_SUCCESS if no error.
//  otherwise, a winerr code.
//
// notes:
//  See spec for detailed logic.
//
LONG NLregC::CSdefaultOrmigrateSNIFPV(
    const DWORD dwIndex,
    const HKEY hSNIRootNew,
    const HKEY hSNIRootOld,
    BOOL fOverWrite)
{
    LONG lResult = ERROR_SUCCESS;

    if (dwIndex >= sizeof(csDefaultSNIFPVs) / sizeof(CS_FPV_DEFAULT) ||
        !hSNIRootNew)
    {
        lResult = ERROR_INVALID_PARAMETER;
        goto Exit;
    }

    {
        CS_FPV csfpv(eFPVUnknown, NULL, -1, REG_NONE);

        lResult = csfpv.Copy(&csDefaultSNIFPVs[dwIndex]);
        CHKL_GOTO(lResult, Exit);

        if (!fOverWrite && !csDefaultSNIFPVs[dwIndex].fOverwriteAlways)
        {
            lResult = ERROR_SUCCESS;

            if (!hSNIRootOld)
            {
                lResult = CSgetFPV(hSNIRootNew, &csfpv);
            }
            else
            {
                if (csDefaultSNIFPVs[dwIndex].fmigrate)
                {
                    lResult = CSgetFPV(hSNIRootOld, &csfpv);
                }
            }

            // Reset the value to default if we fail to query the existing value.
            if (ERROR_SUCCESS != lResult)
            {
                lResult = csfpv.Copy(&csDefaultSNIFPVs[dwIndex]);
                CHKL_GOTO(lResult, Exit);
            }
        }

        lResult = CSsetFPV(hSNIRootNew, &csfpv, CS_CREATE);
    }

Exit:
    return lResult;
}
package org.icatproject.core.oldparser; import java.util.ArrayList; import java.util.List; import org.icatproject.core.IcatException; import org.icatproject.core.entity.EntityBaseBean; public class BooleanTerm { // BooleanTerm ::= BooleanFactor ( "AND" BooleanFactor ) * private List<BooleanFactor> factors = new ArrayList<BooleanFactor>(); public BooleanTerm(OldInput input) throws OldParserException { this.factors.add(new BooleanFactor(input)); OldToken t = null; while ((t = input.peek(0)) != null) { if (t.getType() == OldToken.Type.AND) { input.consume(); this.factors.add(new BooleanFactor(input)); } else { return; } } } public StringBuilder getWhere(Class<? extends EntityBaseBean> tb) throws IcatException { StringBuilder sb = new StringBuilder(); sb.append(this.factors.get(0).getWhere(tb)); for (int i = 1; i < this.factors.size(); i++) { sb.append(" AND "); sb.append(this.factors.get(i).getWhere(tb)); } return sb; } @Override public String toString() { StringBuilder sb = new StringBuilder(); sb.append(this.factors.get(0)); for (int i = 1; i < this.factors.size(); i++) { sb.append(" AND "); sb.append(this.factors.get(i)); } return sb.toString(); } }
/* -*- Mode: C; tab-width: 8; c-basic-offset: 2; indent-tabs-mode: nil; -*- */

#include "util.h"

static int sys_perf_event_open(struct perf_event_attr* attr, pid_t pid,
                               int cpu, int group_fd, unsigned long flags) {
  return syscall(SYS_perf_event_open, attr, pid, cpu, group_fd, flags);
}

int main(void) {
  struct perf_event_attr attr;
  void* p;
  size_t page_size = sysconf(_SC_PAGESIZE);
  int counter_fd;

  memset(&attr, 0, sizeof(attr));
  attr.size = sizeof(attr);
  attr.type = PERF_TYPE_SOFTWARE;
  attr.config = PERF_COUNT_SW_CPU_CLOCK;
  attr.sample_period = 1000000;
  attr.sample_type = PERF_SAMPLE_IP;

  counter_fd = sys_perf_event_open(&attr, 0 /*self*/, -1 /*any cpu*/, -1, 0);
  test_assert(0 <= counter_fd);

  p = mmap(NULL, 3 * page_size, PROT_READ | PROT_WRITE, MAP_SHARED,
           counter_fd, 0);
  test_assert(p == MAP_FAILED);
  test_assert(errno == ENODEV);

  atomic_puts("EXIT-SUCCESS");
  return 0;
}
/****************************************************************************
 *
 *   Copyright (c) 2018 PX4 Development Team. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 *
 * 1. Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in
 *    the documentation and/or other materials provided with the
 *    distribution.
 * 3. Neither the name PX4 nor the names of its contributors may be
 *    used to endorse or promote products derived from this software
 *    without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
 * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
 * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
 * FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
 * COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
 * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
 * BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
 * OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED
 * AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
 * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
 * ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
 * POSSIBILITY OF SUCH DAMAGE.
* ****************************************************************************/ #include <unit_test.h> #include <time.h> #include <stdlib.h> #include <unistd.h> #include <drivers/drv_hrt.h> #include <perf/perf_counter.h> #include <px4_config.h> #include <px4_micro_hal.h> namespace MicroBenchMath { #ifdef __PX4_NUTTX #include <nuttx/irq.h> static irqstate_t flags; #endif void lock() { #ifdef __PX4_NUTTX flags = px4_enter_critical_section(); #endif } void unlock() { #ifdef __PX4_NUTTX px4_leave_critical_section(flags); #endif } #define PERF(name, op, count) do { \ usleep(1000); \ reset(); \ perf_counter_t p = perf_alloc(PC_ELAPSED, name); \ for (int i = 0; i < count; i++) { \ lock(); \ perf_begin(p); \ op; \ perf_end(p); \ unlock(); \ reset(); \ } \ perf_print_counter(p); \ perf_free(p); \ } while (0) class MicroBenchMath : public UnitTest { public: virtual bool run_tests(); private: bool time_single_precision_float(); bool time_single_precision_float_trig(); bool time_double_precision_float(); bool time_double_precision_float_trig(); bool time_8bit_integers(); bool time_16bit_integers(); bool time_32bit_integers(); bool time_64bit_integers(); void reset(); float f32; float f32_out; double f64; double f64_out; uint8_t i_8; uint8_t i_8_out; uint16_t i_16; uint16_t i_16_out; uint32_t i_32; uint32_t i_32_out; int64_t i_64; int64_t i_64_out; uint64_t u_64; uint64_t u_64_out; }; bool MicroBenchMath::run_tests() { ut_run_test(time_single_precision_float); ut_run_test(time_single_precision_float_trig); ut_run_test(time_double_precision_float); ut_run_test(time_double_precision_float_trig); ut_run_test(time_8bit_integers); ut_run_test(time_16bit_integers); ut_run_test(time_32bit_integers); ut_run_test(time_64bit_integers); return (_tests_failed == 0); } template<typename T> T random(T min, T max) { const T scale = rand() / (T) RAND_MAX; /* [0, 1.0] */ return min + scale * (max - min); /* [min, max] */ } void MicroBenchMath::reset() { srand(time(nullptr)); // initialize with 
random data f32 = random(-2.0f * M_PI, 2.0f * M_PI); // somewhat representative range for angles in radians f32_out = random(-2.0f * M_PI, 2.0f * M_PI); f64 = random(-2.0 * M_PI, 2.0 * M_PI); f64_out = random(-2.0 * M_PI, 2.0 * M_PI); i_8 = rand(); i_8_out = rand(); i_16 = rand(); i_16_out = rand(); i_32 = rand(); i_32_out = rand(); i_64 = rand(); i_64_out = rand(); u_64 = rand(); u_64_out = rand(); } ut_declare_test_c(test_microbench_math, MicroBenchMath) bool MicroBenchMath::time_single_precision_float() { PERF("float add", f32_out += f32, 1000); PERF("float sub", f32_out -= f32, 1000); PERF("float mul", f32_out *= f32, 1000); PERF("float div", f32_out /= f32, 1000); PERF("float sqrt", f32_out = sqrtf(f32), 1000); return true; } bool MicroBenchMath::time_single_precision_float_trig() { PERF("sinf()", f32_out = sinf(f32), 1000); PERF("cosf()", f32_out = cosf(f32), 1000); PERF("tanf()", f32_out = tanf(f32), 1000); PERF("acosf()", f32_out = acosf(f32), 1000); PERF("asinf()", f32_out = asinf(f32), 1000); PERF("atan2f()", f32_out = atan2f(f32, 2.0f * f32), 1000); return true; } bool MicroBenchMath::time_double_precision_float() { PERF("double add", f64_out += f64, 1000); PERF("double sub", f64_out -= f64, 1000); PERF("double mul", f64_out *= f64, 1000); PERF("double div", f64_out /= f64, 1000); PERF("double sqrt", f64_out = sqrt(f64), 1000); return true; } bool MicroBenchMath::time_double_precision_float_trig() { PERF("sin()", f64_out = sin(f64), 1000); PERF("cos()", f64_out = cos(f64), 1000); PERF("tan()", f64_out = tan(f64), 1000); PERF("acos()", f64_out = acos(f64 * 0.5), 1000); PERF("asin()", f64_out = asin(f64 * 0.6), 1000); PERF("atan2()", f64_out = atan2(f64 * 0.7, f64 * 0.8), 1000); PERF("sqrt()", f64_out = sqrt(f64), 1000); return true; } bool MicroBenchMath::time_8bit_integers() { PERF("int8 add", i_8_out += i_8, 1000); PERF("int8 sub", i_8_out -= i_8, 1000); PERF("int8 mul", i_8_out *= i_8, 1000); PERF("int8 div", i_8_out /= i_8, 1000); return true; } bool 
MicroBenchMath::time_16bit_integers() { PERF("int16 add", i_16_out += i_16, 1000); PERF("int16 sub", i_16_out -= i_16, 1000); PERF("int16 mul", i_16_out *= i_16, 1000); PERF("int16 div", i_16_out /= i_16, 1000); return true; } bool MicroBenchMath::time_32bit_integers() { PERF("int32 add", i_32_out += i_32, 1000); PERF("int32 sub", i_32_out -= i_32, 1000); PERF("int32 mul", i_32_out *= i_32, 1000); PERF("int32 div", i_32_out /= i_32, 1000); return true; } bool MicroBenchMath::time_64bit_integers() { PERF("int64 add", i_64_out += i_64, 1000); PERF("int64 sub", i_64_out -= i_64, 1000); PERF("int64 mul", i_64_out *= i_64, 1000); PERF("int64 div", i_64_out /= i_64, 1000); return true; } } // namespace MicroBenchMath
def _compute_device_infos( bridge: HuaweiSolarBridge, connecting_inverter_device_id: tuple[str, str] | None, ) -> HuaweiInverterBridgeDeviceInfos: inverter_device_info = DeviceInfo( identifiers={(DOMAIN, bridge.serial_number)}, name=bridge.model_name, manufacturer="Huawei", model=bridge.model_name, via_device=connecting_inverter_device_id, ) power_meter_device_info = None if bridge.power_meter_type is not None: power_meter_device_info = DeviceInfo( identifiers={ (DOMAIN, f"{bridge.serial_number}/power_meter"), }, name="Power Meter", via_device=(DOMAIN, bridge.serial_number), ) battery_device_info = None if bridge.battery_1_type != rv.StorageProductModel.NONE: battery_device_info = DeviceInfo( identifiers={ (DOMAIN, f"{bridge.serial_number}/connected_energy_storage"), }, name=f"{inverter_device_info['name']} Connected Energy Storage", manufacturer=inverter_device_info["manufacturer"], model=f"{inverter_device_info['model']} Connected Energy Storage", via_device=(DOMAIN, bridge.serial_number), ) return HuaweiInverterBridgeDeviceInfos( inverter=inverter_device_info, power_meter=power_meter_device_info, connected_energy_storage=battery_device_info, )
import fs from "fs";

(async () => {
  const raw = fs.readFileSync("addresses.txt");
  const splits = raw.toString().split("\n");
  const addresses: string[] = [];
  for (const split of splits) {
    if (split.indexOf("0x") === 0) {
      const address = split.trim();
      if (address.length === 42 && addresses.includes(address) !== true) {
        addresses.push(address);
      }
    }
  }

  const founds: string[] = [];
  const find = () => {
    const found = Math.floor(Math.random() * addresses.length);
    if (founds.includes(addresses[found]) === true) {
      find();
    } else {
      founds.push(addresses[found]);
      console.log(addresses[found]);
    }
  };

  console.log("Winner list");
  for (let i = 0; i < 10; i += 1) {
    find();
  }
  console.log("Waitlist, in case any winner's information turns out to be invalid");
  for (let i = 0; i < 5; i += 1) {
    find();
  }
})();
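The recursive `find` retries on duplicate picks, which works but can in principle recurse deeply as the winner list fills up. Sampling without replacement sidesteps that; a Python sketch of the same draw, with stand-in addresses since `addresses.txt` is not part of this snippet:

```python
import random

def draw_winners(addresses, n_winners=10, n_waitlist=5):
    # random.sample picks distinct entries, so no retry loop is needed
    picked = random.sample(addresses, n_winners + n_waitlist)
    return picked[:n_winners], picked[n_winners:]

# Stand-in addresses: "0x" plus 40 hex chars, 42 chars total like the original
addresses = [f"0x{i:040x}" for i in range(100)]
winners, waitlist = draw_winners(addresses)
```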
/** * Created by Marco Capitani on 20/06/19. * * @author Marco Capitani <m.capitani AT nextworks.it> */ public class PrometheusTDetails extends ThresholdDetails { private int value; private RelationalOperation comparison; private int thresholdTime; private String pmJobId; public PrometheusTDetails(int value, RelationalOperation comparison, int thresholdTime, String pmJobId) { super(ThresholdFormat.PROMETHEUS); this.value = value; this.comparison = comparison; this.thresholdTime = thresholdTime; this.pmJobId = pmJobId; } public double getValue() { return value; } public RelationalOperation getRelationalOperation() { return comparison; } public int getThresholdTime() { return thresholdTime; } public String getPmJobId() { return pmJobId; } }
An Extension of an Invariance Property of the Swiss Premium Calculation Principle Some of the results obtained in an earlier paper entitled "An invariance property of the Swiss premium calculation principle" by F. De Vylder and M. Goovaerts (1979) are generalized. For that purpose, the notions of additivity and iterativity are extended. Some rather general characterization theorems for premium calculation principles are obtained.
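The abstract does not restate the principle itself, so as background, here is a numeric sketch under the standard definition (an assumption, to be checked against the paper): for a risk X, a weight z in [0, 1], and a convex increasing function w, the Swiss premium H solves E[w(X − zH)] = w((1 − z)H); z = 0 gives the mean-value principle and z = 1 the zero-utility principle.

```python
import math
import random

def swiss_premium(samples, w, z, hi=1000.0):
    """Solve E[w(X - z*H)] = w((1 - z)*H) empirically by bisection."""
    lo = 0.0

    def g(h):
        lhs = sum(w(x - z * h) for x in samples) / len(samples)
        return lhs - w((1.0 - z) * h)

    # For increasing w, g(h) is decreasing in h, so bisect for its root.
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

random.seed(0)
losses = [random.expovariate(1.0) for _ in range(10_000)]  # Exp(1) losses
w = lambda x: math.exp(0.25 * x)                           # exponential valuation
h = swiss_premium(losses, w, z=0.5)
```

A handy cross-check of the invariance flavor: with an exponential w the result is independent of z and reduces to the exponential principle (1/a)·ln E[e^{aX}], which by Jensen's inequality is at least the mean loss.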
def fqn(self): if self.path: module_path = self.path else: fp = self.func.__code__.co_filename matching = max([p for p in sys.path if p in fp], key=len) module_path = ( fp.replace(matching + "/", "").replace(".py", "").replace("/", ".") ) func_name = self.alias or self.func.__qualname__ return ".".join([module_path, func_name])
/** * {@link CharacterStateRow} test. * * @author Sam Donnelly * */ @Test(groups = TestGroupDefs.FAST) public class StandardRowTest { private List<Otu> otus; private StandardMatrix matrix; @BeforeMethod public void beforeMethod() { matrix = new StandardMatrix(); matrix.setParent(new OtuSet()); otus = newArrayList(); otus.add(new Otu("OTU-0")); matrix.getParent().clearAndAddOtus(newArrayList(otus.get(0))); matrix.putRow(otus.get(0), new StandardRow()); final StandardCharacter character0 = new StandardCharacter(); character0.setLabel("character-0"); matrix.clearAndAddCharacters(newArrayList(character0)); } @Test public void addCellToMatrixWOneCharacter() { final StandardCell cell = new StandardCell(); cell.setUnassigned(); matrix.getRows().get(matrix.getParent().getOtus().get(0)).clearAndAddCells( Arrays.asList(cell)); assertSame(matrix.getRows().get(matrix.getParent().getOtus().get(0)) .getCells().get(0), cell); } @Test void setCells() { matrix = new StandardMatrix(); matrix.setParent(new OtuSet()); otus = newArrayList(); final Otu otu0 = new Otu(); otu0.setLabel("OTU-0"); otus.add(otu0); matrix.getParent().clearAndAddOtus(newArrayList(otus.get(0))); matrix.putRow(otus.get(0), new StandardRow()); final StandardRow row = matrix.getRows().get(otu0); final ImmutableList<StandardCharacter> characters = ImmutableList.of( new StandardCharacter(), new StandardCharacter(), new StandardCharacter()); characters.get(0).setLabel("character-0"); characters.get(1).setLabel("character-1"); characters.get(2).setLabel("character-2"); matrix.clearAndAddCharacters(characters); final List<StandardCell> cells = ImmutableList.of(new StandardCell(), new StandardCell(), new StandardCell()); row.clearAndAddCells(cells); assertEquals(row.getCells(), cells); assertSame(cells.get(0).getParent(), row); } @Test(expectedExceptions = IllegalStateException.class) public void addCellToRowThatsNotInAMatrix() { // Just call setUnassigned so that the cell is in a legal state - it // shouldn't really 
matter. final StandardCell cell = new StandardCell(); cell.setUnassigned(); new StandardRow().clearAndAddCells(Arrays.asList(cell)); } @Test(expectedExceptions = IllegalStateException.class) public void addCellToMatrixThatHasNoCharacters() { @SuppressWarnings("unchecked") final List<StandardCharacter> emptyList = (List<StandardCharacter>) Collections.EMPTY_LIST; matrix.clearAndAddCharacters(emptyList); // Just call setUnassigned so that the cell is in a legal state - it // shouldn't really matter. final StandardCell cell = new StandardCell(); cell.setUnassigned(); matrix.getRows().get( matrix.getParent().getOtus().get(0)) .clearAndAddCells(Arrays.asList(cell)); } @Test(expectedExceptions = IllegalStateException.class) public void addCellToMatrixWTooFewCharacters() { final StandardCell cell0 = new StandardCell(); cell0.setUnassigned(); final StandardCell cell1 = new StandardCell(); cell1.setUnassigned(); final List<StandardCell> cells = newArrayList(cell0, cell1); matrix.getRows().get(matrix.getParent().getOtus().get(0)) .clearAndAddCells(cells); } }
/** * Implementation of a Constraint Store, oriented toward the definition of the constraint store by * Van Weert et al. "CHR for Imperative Host Languages". * Stores {@link Constraint}-objects and a rule application history (which rules where applied to which constraints) */ public class ConstraintStore { protected final List<Constraint<?>> store = new ArrayList<>(); protected final HashSet<Long> deadConstraints = new HashSet<>(); public ConstraintStore() { // Empty constructor for empty constraint stores. } public ConstraintStore(Collection<Constraint<?>> constraints) { if(constraints == null) return; List<Constraint<?>> withoutDuplicates = new ArrayList<>(new HashSet<>(constraints)); store.addAll(withoutDuplicates); } public ConstraintStore(Constraint<?>... constraints) { if (constraints.length <= 0) return; List<Constraint<?>> withoutDuplicates = new ArrayList<>(new HashSet<>(Arrays.asList(constraints))); store.addAll(withoutDuplicates); } @SafeVarargs public <T> ConstraintStore(T... values) { for (T value : values) this.add(new Constraint<>(value)); } /** * Add the given constraint to the store. */ public void add(Constraint<?> constraint){ store.add(constraint); setAlive(constraint.getID()); } /** * Create a new constraint and add it to the store. * @param object the new object to add. * @param <T> Type of the given object and the created constraint. */ public <T> void add(T object){ this.add(new Constraint<>(object)); } /** * Adds all the constraints of the list to the constraint store. * @param collection Constraints that get added to the store. */ public void addAll(Collection<Constraint<?>> collection){ HashSet<Constraint<?>> withoutDuplicates = new HashSet<>(collection); store.addAll(withoutDuplicates); withoutDuplicates.forEach(x -> setAlive(x.getID())); } /** * Adds all the constraints of the list to the constraint store. * @param constraints Constraints that get added to the store. */ public void addAll(Constraint<?>... 
constraints) {
        HashSet<Constraint<?>> withoutDuplicates = new HashSet<>(Arrays.asList(constraints));
        store.addAll(withoutDuplicates);
        withoutDuplicates.forEach(x -> setAlive(x.getID()));
    }

    /**
     * Remove the constraint with the given ID.
     */
    public void remove(long ID) {
        deadConstraints.add(ID);
        store.removeIf(x -> x.getID() == ID);
    }

    /**
     * Returns a complete Iterator for the ConstraintStore.
     * @return An iterator with all elements in the ConstraintStore.
     */
    public Iterator<Constraint<?>> lookup() {
        return store.iterator();
    }

    /**
     * Returns an Iterator which contains only the constraints with type {@code constraintType}.
     * @param constraintType The type you want the constraints to be.
     * @return An iterator with all elements of type {@code constraintType}.
     */
    public Iterator<Constraint<?>> lookup(Class<?> constraintType) {
        return store.stream().filter(x -> isAlive(x.getID()) && x.isOfType(constraintType)).iterator();
    }

    /**
     * Returns an iterator which contains only the constraints that contain an object equal to {@code value}.
     * Equivalence is determined by calling .equals() on the object in the constraint with {@code value}
     * as the parameter.
     * @param value The value that all constraints in the iterator should contain. (e.g. String "42")
     * @return An iterator with constraints that contain objects equal to {@code value}.
     */
    public Iterator<Constraint<?>> lookup(Object value) {
        return store.stream()
                .filter(x -> isAlive(x.getID()) && x.innerObjEquals(value))
                .iterator();
    }

    /**
     * @return The first constraint in the store that is alive.
     */
    public Constraint<?> getFirst() {
        return store.stream()
                .filter(x -> isAlive(x.getID()))
                .findFirst()
                .orElse(null);
    }

    /**
     * @return The last constraint in the store that is alive.
     */
    public Constraint<?> getLast() {
        return store.stream()
                .filter(x -> isAlive(x.getID()))
                .reduce((x, y) -> y)
                .orElse(null);
    }

    /**
     * Sets the constraint with the given ID to dead if the constraint exists in the internal store.
     * Dead constraints are disregarded by {@link RuleApplicator}s.
     *
     * @param ID the ID of the constraint that should be set to dead.
     */
    public void setDead(long ID) {
        deadConstraints.add(ID);
    }

    /**
     * Sets the constraint with the given ID to alive.
     * Removes the ID from the list of dead constraints if it is in there.
     * Dead constraints are disregarded by {@link RuleApplicator}s.
     *
     * @param ID The ID of the constraint that should be set to alive.
     */
    public void setAlive(long ID) {
        deadConstraints.remove(ID);
    }

    public boolean isAlive(long ID) {
        return !deadConstraints.contains(ID);
    }

    /**
     * Cleans the internal store of dead constraints.
     * Removes all stored IDs that refer to constraints that are no longer in the internal data structure.
     */
    public void cleanup() {
        deadConstraints.removeIf(x -> store.stream().noneMatch(y -> y.getID() == x));
    }

    /**
     * Set all constraints to dead and remove them from the internal data structure.
     */
    public void clear() {
        store.forEach(x -> setDead(x.getID()));
        store.clear();
    }

    public int size() {
        return store.size();
    }

    public List<Constraint<?>> toList() {
        return new ArrayList<>(store);
    }

    @Override
    public String toString() {
        StringBuilder str = new StringBuilder("ConstraintStore:");
        for (Constraint<?> c : store) {
            str.append("\n\t").append(c.toString());
        }
        return str.toString();
    }
}
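The liveness scheme used by this store (IDs collected in a dead-set, lookups filtering on `isAlive`, a `cleanup` pass reconciling the set against the store) is language-independent. A compact Python sketch of the same bookkeeping, with hypothetical names:

```python
import itertools

class MiniConstraintStore:
    """Toy version of the store: constraints live in a list, and 'dead'
    IDs are tracked in a set so removal is cheap and lookups skip them."""
    _ids = itertools.count()

    def __init__(self):
        self.store = []   # (constraint_id, value) pairs
        self.dead = set()

    def add(self, value):
        cid = next(self._ids)
        self.dead.discard(cid)   # mirror setAlive() on insert
        self.store.append((cid, value))
        return cid

    def set_dead(self, cid):
        self.dead.add(cid)

    def lookup(self, of_type=None):
        # like lookup(Class): yield live constraints, optionally by type
        for cid, value in self.store:
            if cid not in self.dead and (of_type is None or isinstance(value, of_type)):
                yield value

    def cleanup(self):
        # drop dead IDs whose constraints are no longer stored
        live_ids = {cid for cid, _ in self.store}
        self.dead &= live_ids
```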
s = input()
n = len(s)
ans = ""
for i in range(1, n):
    if i % 2 == 1:
        continue
    ss = s[:i]
    m = len(ss)
    p = 0
    for j in range(m // 2):
        if ss[j] != ss[j + m // 2]:
            p = 1
            break
    if p == 0:
        ans = ss
print(len(ans))
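As a quick cross-check of the loop above: it prints the length of the longest even-length prefix, strictly shorter than `s`, whose first half equals its second half. Restated directly with slicing:

```python
def longest_even_double_prefix(s):
    # longest prefix s[:i] with even i < len(s) whose halves match
    best = 0
    for i in range(2, len(s), 2):
        if s[:i // 2] == s[i // 2:i]:
            best = i
    return best

print(longest_even_double_prefix("abaababaab"))  # → 6 ("abaaba" = "aba" + "aba")
```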
import { useEffect, useRef, useState } from "react"; import { getFileUrl } from "./storageUtils"; import { StorageObject } from "./types"; export const usePrevious = <T>(value: T) => { const ref = useRef<T>(); useEffect(() => { ref.current = value; }, [value]); return ref.current; }; type KeyValuePair = { [key: string]: any }; const sortObjectKeys = (unorderedObject: KeyValuePair) => { return Object.keys(unorderedObject) ?.sort() .reduce((newObject: KeyValuePair, key: string) => { newObject[key] = unorderedObject[key]; return newObject; }, {}); }; const deepEqual = ( valueA?: StorageObject | null, valueB?: StorageObject | null ): boolean => { const stringValueA = JSON.stringify(sortObjectKeys(valueA || {})); const stringValueB = JSON.stringify(sortObjectKeys(valueB || {})); return stringValueA === stringValueB; }; export const useStorageObject = (storageObject?: StorageObject | null) => { const [url, setUrl] = useState<string>(); const [loading, setLoading] = useState(false); const [error, setError] = useState<string>(); const prevStorageObject = usePrevious(storageObject); useEffect(() => { if (deepEqual(prevStorageObject, storageObject)) { return; } if (storageObject) { setLoading(true); getFileUrl(storageObject) .then((result) => { setUrl(result); setError(undefined); }) .catch((e: Error) => { setUrl(undefined); setError(e.message); }) .finally(() => setLoading(false)); } else { setLoading(false); setError("No storage object provided"); setUrl(undefined); } }, [storageObject, prevStorageObject]); return { url, loading, error, }; };
/* External Imports */
import { L2ToL1Message } from '@eth-optimism/rollup-core'
import { getLogger, logError } from '@eth-optimism/core-utils'
import { Contract, Wallet } from 'ethers'

import { L2ToL1MessageSubmitter } from '../types'
import { TransactionReceipt } from 'ethers/providers/abstract-provider'

const log = getLogger('rollup-message-submitter')

export class NoOpL2ToL1MessageSubmitter implements L2ToL1MessageSubmitter {
  public async submitMessage(l2ToL1Message: L2ToL1Message): Promise<void> {
    log.debug(
      `L2ToL1Message received by NoOpL2ToL1MessageSubmitter: ${JSON.stringify(
        l2ToL1Message
      )}`
    )
    return
  }
}

/**
 * Default Message Submitter implementation. This will be deprecated when
 * message submission works properly.
 */
export class DefaultL2ToL1MessageSubmitter implements L2ToL1MessageSubmitter {
  private highestNonceSubmitted: number
  private highestNonceConfirmed: number

  public static async create(
    sequencerWallet: Wallet,
    messageReceiverContract: Contract
  ): Promise<DefaultL2ToL1MessageSubmitter> {
    return new DefaultL2ToL1MessageSubmitter(
      sequencerWallet,
      messageReceiverContract
    )
  }

  private constructor(
    private readonly sequencerWallet: Wallet,
    private readonly messageReceiverContract: Contract
  ) {
    this.highestNonceSubmitted = -1
    this.highestNonceConfirmed = -1
  }

  public async submitMessage(message: L2ToL1Message): Promise<void> {
    log.info(
      `Submitting message number ${message.nonce}: ${JSON.stringify(message)}.`
    )

    let receipt
    try {
      const callData = this.messageReceiverContract.interface.functions.enqueueL2ToL1Message.encode(
        [message]
      )
      receipt = await this.sequencerWallet.sendTransaction({
        to: this.messageReceiverContract.address,
        data: callData,
      })
      log.debug(
        `Receipt for message ${JSON.stringify(message)}: ${JSON.stringify(
          receipt
        )}`
      )
    } catch (e) {
      logError(
        log,
        `Error submitting rollup message: ${JSON.stringify(message)}`,
        e
      )
      throw e
    }

    this.highestNonceSubmitted = Math.max(
      this.highestNonceSubmitted,
      message.nonce
    )

    this.messageReceiverContract.provider
      .waitForTransaction(receipt.hash)
      .then((txReceipt: TransactionReceipt) => {
        log.debug(
          `L2ToL1Message with nonce ${message.nonce.toString(
            16
          )} was confirmed on L1!`
        )
        this.highestNonceConfirmed = Math.max(
          this.highestNonceConfirmed,
          message.nonce
        )
      })
      .catch((error) => {
        logError(log, 'Error submitting L2 -> L1 message transaction', error)
      })
  }

  public getHighestNonceSubmitted(): number {
    return this.highestNonceSubmitted
  }

  public getHighestNonceConfirmed(): number {
    return this.highestNonceConfirmed
  }
}
// Copyright 2015 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.

#include "components/offline_pages/offline_page_metadata_store_impl.h"

#include <stdint.h>

#include <memory>

#include "base/bind.h"
#include "base/files/file_path.h"
#include "base/files/scoped_temp_dir.h"
#include "base/strings/string_number_conversions.h"
#include "base/strings/utf_string_conversions.h"
#include "base/test/test_simple_task_runner.h"
#include "base/threading/thread_task_runner_handle.h"
#include "components/leveldb_proto/proto_database_impl.h"
#include "components/offline_pages/offline_page_item.h"
#include "components/offline_pages/offline_page_metadata_store_sql.h"
#include "components/offline_pages/offline_page_model.h"
#include "components/offline_pages/proto/offline_pages.pb.h"
#include "testing/gtest/include/gtest/gtest.h"

using leveldb_proto::ProtoDatabaseImpl;

namespace offline_pages {

namespace {

const char kTestClientNamespace[] = "CLIENT_NAMESPACE";
const char kTestURL[] = "https://example.com";
const ClientId kTestClientId1(kTestClientNamespace, "1234");
const ClientId kTestClientId2(kTestClientNamespace, "5678");
const base::FilePath::CharType kFilePath[] =
    FILE_PATH_LITERAL("/offline_pages/example_com.mhtml");
int64_t kFileSize = 234567;

class OfflinePageMetadataStoreFactory {
 public:
  virtual OfflinePageMetadataStore* BuildStore(const base::FilePath& file) = 0;
};

class OfflinePageMetadataStoreImplFactory
    : public OfflinePageMetadataStoreFactory {
 public:
  OfflinePageMetadataStore* BuildStore(const base::FilePath& file) override {
    return new OfflinePageMetadataStoreImpl(base::ThreadTaskRunnerHandle::Get(),
                                            file);
  }
};

class OfflinePageMetadataStoreSQLFactory
    : public OfflinePageMetadataStoreFactory {
 public:
  OfflinePageMetadataStore* BuildStore(const base::FilePath& file) override {
    OfflinePageMetadataStoreSQL* store = new
OfflinePageMetadataStoreSQL( base::ThreadTaskRunnerHandle::Get(), file); return store; } }; enum CalledCallback { NONE, LOAD, ADD, REMOVE, DESTROY }; enum Status { STATUS_NONE, STATUS_TRUE, STATUS_FALSE }; class OfflinePageMetadataStoreTestBase : public testing::Test { public: OfflinePageMetadataStoreTestBase(); ~OfflinePageMetadataStoreTestBase() override; void TearDown() override { // Wait for all the pieces of the store to delete itself properly. PumpLoop(); } std::unique_ptr<OfflinePageMetadataStoreImpl> BuildStore(); void PumpLoop(); void LoadCallback(OfflinePageMetadataStore::LoadStatus load_status, const std::vector<OfflinePageItem>& offline_pages); void UpdateCallback(CalledCallback called_callback, bool success); void ClearResults(); protected: CalledCallback last_called_callback_; Status last_status_; std::vector<OfflinePageItem> offline_pages_; base::ScopedTempDir temp_directory_; scoped_refptr<base::TestSimpleTaskRunner> task_runner_; base::ThreadTaskRunnerHandle task_runner_handle_; }; OfflinePageMetadataStoreTestBase::OfflinePageMetadataStoreTestBase() : last_called_callback_(NONE), last_status_(STATUS_NONE), task_runner_(new base::TestSimpleTaskRunner), task_runner_handle_(task_runner_) { EXPECT_TRUE(temp_directory_.CreateUniqueTempDir()); } OfflinePageMetadataStoreTestBase::~OfflinePageMetadataStoreTestBase() {} void OfflinePageMetadataStoreTestBase::PumpLoop() { task_runner_->RunUntilIdle(); } void OfflinePageMetadataStoreTestBase::LoadCallback( OfflinePageMetadataStore::LoadStatus load_status, const std::vector<OfflinePageItem>& offline_pages) { last_called_callback_ = LOAD; last_status_ = load_status == OfflinePageMetadataStore::LOAD_SUCCEEDED ? STATUS_TRUE : STATUS_FALSE; offline_pages_.swap(const_cast<std::vector<OfflinePageItem>&>(offline_pages)); } void OfflinePageMetadataStoreTestBase::UpdateCallback( CalledCallback called_callback, bool status) { last_called_callback_ = called_callback; last_status_ = status ? 
STATUS_TRUE : STATUS_FALSE; } void OfflinePageMetadataStoreTestBase::ClearResults() { last_called_callback_ = NONE; last_status_ = STATUS_NONE; offline_pages_.clear(); } template <typename T> class OfflinePageMetadataStoreTest : public OfflinePageMetadataStoreTestBase { public: std::unique_ptr<OfflinePageMetadataStore> BuildStore(); protected: T factory_; }; template <typename T> std::unique_ptr<OfflinePageMetadataStore> OfflinePageMetadataStoreTest<T>::BuildStore() { std::unique_ptr<OfflinePageMetadataStore> store( factory_.BuildStore(temp_directory_.path())); store->Load(base::Bind(&OfflinePageMetadataStoreTestBase::LoadCallback, base::Unretained(this))); PumpLoop(); return store; } typedef testing::Types<OfflinePageMetadataStoreImplFactory, OfflinePageMetadataStoreSQLFactory> MyTypes; TYPED_TEST_CASE(OfflinePageMetadataStoreTest, MyTypes); // Loads empty store and makes sure that there are no offline pages stored in // it. TYPED_TEST(OfflinePageMetadataStoreTest, LoadEmptyStore) { std::unique_ptr<OfflinePageMetadataStore> store(this->BuildStore()); EXPECT_EQ(LOAD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); EXPECT_EQ(0U, this->offline_pages_.size()); } // Adds metadata of an offline page into a store and then opens the store // again to make sure that stored metadata survives store restarts. TYPED_TEST(OfflinePageMetadataStoreTest, AddOfflinePage) { std::unique_ptr<OfflinePageMetadataStore> store(this->BuildStore()); OfflinePageItem offline_page(GURL(kTestURL), 1234LL, kTestClientId1, base::FilePath(kFilePath), kFileSize); store->AddOrUpdateOfflinePage( offline_page, base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback, base::Unretained(this), ADD)); this->PumpLoop(); EXPECT_EQ(ADD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); this->ClearResults(); // Close the store first to ensure file lock is removed. 
store.reset(); store = this->BuildStore(); this->PumpLoop(); EXPECT_EQ(LOAD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); EXPECT_EQ(1U, this->offline_pages_.size()); EXPECT_EQ(offline_page.url, this->offline_pages_[0].url); EXPECT_EQ(offline_page.offline_id, this->offline_pages_[0].offline_id); EXPECT_EQ(offline_page.version, this->offline_pages_[0].version); EXPECT_EQ(offline_page.file_path, this->offline_pages_[0].file_path); EXPECT_EQ(offline_page.file_size, this->offline_pages_[0].file_size); EXPECT_EQ(offline_page.creation_time, this->offline_pages_[0].creation_time); EXPECT_EQ(offline_page.last_access_time, this->offline_pages_[0].last_access_time); EXPECT_EQ(offline_page.access_count, this->offline_pages_[0].access_count); EXPECT_EQ(offline_page.client_id, this->offline_pages_[0].client_id); } // Tests removing offline page metadata from the store, for which it first adds // metadata of an offline page. TYPED_TEST(OfflinePageMetadataStoreTest, RemoveOfflinePage) { std::unique_ptr<OfflinePageMetadataStore> store(this->BuildStore()); // Add an offline page. OfflinePageItem offline_page(GURL(kTestURL), 1234LL, kTestClientId1, base::FilePath(kFilePath), kFileSize); store->AddOrUpdateOfflinePage( offline_page, base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback, base::Unretained(this), ADD)); this->PumpLoop(); EXPECT_EQ(ADD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); this->ClearResults(); // Load the store. store->Load(base::Bind(&OfflinePageMetadataStoreTestBase::LoadCallback, base::Unretained(this))); this->PumpLoop(); EXPECT_EQ(LOAD, this->last_called_callback_); EXPECT_EQ(1U, this->offline_pages_.size()); // Remove the offline page. 
  std::vector<int64_t> ids_to_remove;
  ids_to_remove.push_back(offline_page.offline_id);
  store->RemoveOfflinePages(
      ids_to_remove,
      base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback,
                 base::Unretained(this), REMOVE));
  this->PumpLoop();
  EXPECT_EQ(REMOVE, this->last_called_callback_);
  EXPECT_EQ(STATUS_TRUE, this->last_status_);
  this->ClearResults();

  // Load the store.
  store->Load(base::Bind(&OfflinePageMetadataStoreTestBase::LoadCallback,
                         base::Unretained(this)));
  this->PumpLoop();
  EXPECT_EQ(LOAD, this->last_called_callback_);
  EXPECT_EQ(0U, this->offline_pages_.size());
  this->ClearResults();

  // Close and reload the store.
  store.reset();
  store = this->BuildStore();
  EXPECT_EQ(LOAD, this->last_called_callback_);
  EXPECT_EQ(STATUS_TRUE, this->last_status_);
  EXPECT_EQ(0U, this->offline_pages_.size());
}

// Adds metadata of multiple offline pages into a store and removes some.
TYPED_TEST(OfflinePageMetadataStoreTest, AddRemoveMultipleOfflinePages) {
  std::unique_ptr<OfflinePageMetadataStore> store(this->BuildStore());

  // Add an offline page.
  OfflinePageItem offline_page_1(GURL(kTestURL), 12345LL, kTestClientId1,
                                 base::FilePath(kFilePath), kFileSize);
  base::FilePath file_path_2 =
      base::FilePath(FILE_PATH_LITERAL("//other.page.com.mhtml"));
  OfflinePageItem offline_page_2(GURL("https://other.page.com"), 5678LL,
                                 kTestClientId2, file_path_2, 12345,
                                 base::Time::Now());
  store->AddOrUpdateOfflinePage(
      offline_page_1,
      base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback,
                 base::Unretained(this), ADD));
  this->PumpLoop();
  EXPECT_EQ(ADD, this->last_called_callback_);
  EXPECT_EQ(STATUS_TRUE, this->last_status_);
  this->ClearResults();

  // Add another offline page.
  store->AddOrUpdateOfflinePage(
      offline_page_2,
      base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback,
                 base::Unretained(this), ADD));
  this->PumpLoop();
  EXPECT_EQ(ADD, this->last_called_callback_);
  EXPECT_EQ(STATUS_TRUE, this->last_status_);
  this->ClearResults();

  // Load the store.
store->Load(base::Bind(&OfflinePageMetadataStoreTestBase::LoadCallback, base::Unretained(this))); this->PumpLoop(); EXPECT_EQ(LOAD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); EXPECT_EQ(2U, this->offline_pages_.size()); // Remove the offline page. std::vector<int64_t> ids_to_remove; ids_to_remove.push_back(offline_page_1.offline_id); store->RemoveOfflinePages( ids_to_remove, base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback, base::Unretained(this), REMOVE)); this->PumpLoop(); EXPECT_EQ(REMOVE, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); this->ClearResults(); // Close and reload the store. store.reset(); store = this->BuildStore(); store->Load(base::Bind(&OfflinePageMetadataStoreTestBase::LoadCallback, base::Unretained(this))); this->PumpLoop(); EXPECT_EQ(LOAD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); EXPECT_EQ(1U, this->offline_pages_.size()); EXPECT_EQ(offline_page_2.url, this->offline_pages_[0].url); EXPECT_EQ(offline_page_2.offline_id, this->offline_pages_[0].offline_id); EXPECT_EQ(offline_page_2.version, this->offline_pages_[0].version); EXPECT_EQ(offline_page_2.file_path, this->offline_pages_[0].file_path); EXPECT_EQ(offline_page_2.file_size, this->offline_pages_[0].file_size); EXPECT_EQ(offline_page_2.creation_time, this->offline_pages_[0].creation_time); EXPECT_EQ(offline_page_2.last_access_time, this->offline_pages_[0].last_access_time); EXPECT_EQ(offline_page_2.access_count, this->offline_pages_[0].access_count); EXPECT_EQ(offline_page_2.client_id, this->offline_pages_[0].client_id); } // Tests updating offline page metadata from the store. TYPED_TEST(OfflinePageMetadataStoreTest, UpdateOfflinePage) { std::unique_ptr<OfflinePageMetadataStore> store(this->BuildStore()); // First, adds a fresh page. 
OfflinePageItem offline_page(GURL(kTestURL), 1234LL, kTestClientId1, base::FilePath(kFilePath), kFileSize); store->AddOrUpdateOfflinePage( offline_page, base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback, base::Unretained(this), ADD)); this->PumpLoop(); EXPECT_EQ(ADD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); this->ClearResults(); store->Load(base::Bind(&OfflinePageMetadataStoreTestBase::LoadCallback, base::Unretained(this))); this->PumpLoop(); EXPECT_EQ(LOAD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); EXPECT_EQ(1U, this->offline_pages_.size()); EXPECT_EQ(offline_page.url, this->offline_pages_[0].url); EXPECT_EQ(offline_page.offline_id, this->offline_pages_[0].offline_id); EXPECT_EQ(offline_page.version, this->offline_pages_[0].version); EXPECT_EQ(offline_page.file_path, this->offline_pages_[0].file_path); EXPECT_EQ(offline_page.file_size, this->offline_pages_[0].file_size); EXPECT_EQ(offline_page.creation_time, this->offline_pages_[0].creation_time); EXPECT_EQ(offline_page.last_access_time, this->offline_pages_[0].last_access_time); EXPECT_EQ(offline_page.access_count, this->offline_pages_[0].access_count); EXPECT_EQ(offline_page.client_id, this->offline_pages_[0].client_id); // Then update some data. 
offline_page.file_size = kFileSize + 1; offline_page.access_count++; store->AddOrUpdateOfflinePage( offline_page, base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback, base::Unretained(this), ADD)); this->PumpLoop(); EXPECT_EQ(ADD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); this->ClearResults(); store->Load(base::Bind(&OfflinePageMetadataStoreTestBase::LoadCallback, base::Unretained(this))); this->PumpLoop(); EXPECT_EQ(LOAD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); EXPECT_EQ(1U, this->offline_pages_.size()); EXPECT_EQ(offline_page.url, this->offline_pages_[0].url); EXPECT_EQ(offline_page.offline_id, this->offline_pages_[0].offline_id); EXPECT_EQ(offline_page.version, this->offline_pages_[0].version); EXPECT_EQ(offline_page.file_path, this->offline_pages_[0].file_path); EXPECT_EQ(offline_page.file_size, this->offline_pages_[0].file_size); EXPECT_EQ(offline_page.creation_time, this->offline_pages_[0].creation_time); EXPECT_EQ(offline_page.last_access_time, this->offline_pages_[0].last_access_time); EXPECT_EQ(offline_page.access_count, this->offline_pages_[0].access_count); EXPECT_EQ(offline_page.client_id, this->offline_pages_[0].client_id); } } // namespace class OfflinePageMetadataStoreImplTest : public OfflinePageMetadataStoreTest<OfflinePageMetadataStoreImplFactory> { public: void UpdateStoreEntries( OfflinePageMetadataStoreImpl* store, std::unique_ptr<leveldb_proto::ProtoDatabase< OfflinePageEntry>::KeyEntryVector> entries_to_save); }; void OfflinePageMetadataStoreImplTest::UpdateStoreEntries( OfflinePageMetadataStoreImpl* store, std::unique_ptr<leveldb_proto::ProtoDatabase< OfflinePageEntry>::KeyEntryVector> entries_to_save) { std::unique_ptr<std::vector<std::string>> keys_to_remove( new std::vector<std::string>()); store->UpdateEntries( std::move(entries_to_save), std::move(keys_to_remove), base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback, base::Unretained(this), ADD)); } // 
Test that loading a store with a bad value still loads. // Needs to be outside of the anonymous namespace in order for FRIEND_TEST // to work. TEST_F(OfflinePageMetadataStoreImplTest, LoadCorruptedStore) { std::unique_ptr<OfflinePageMetadataStore> store(this->BuildStore()); // Write one ok page. OfflinePageItem offline_page(GURL(kTestURL), 1234LL, kTestClientId1, base::FilePath(kFilePath), kFileSize); store->AddOrUpdateOfflinePage( offline_page, base::Bind(&OfflinePageMetadataStoreTestBase::UpdateCallback, base::Unretained(this), ADD)); this->PumpLoop(); EXPECT_EQ(ADD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); // Manually write one broken page (no id) std::unique_ptr< leveldb_proto::ProtoDatabase<OfflinePageEntry>::KeyEntryVector> entries_to_save( new leveldb_proto::ProtoDatabase<OfflinePageEntry>::KeyEntryVector()); OfflinePageEntry offline_page_proto; entries_to_save->push_back(std::make_pair("0", offline_page_proto)); UpdateStoreEntries((OfflinePageMetadataStoreImpl*)store.get(), std::move(entries_to_save)); this->PumpLoop(); EXPECT_EQ(ADD, this->last_called_callback_); EXPECT_EQ(STATUS_TRUE, this->last_status_); this->ClearResults(); // Close the store first to ensure file lock is removed. store.reset(); store = this->BuildStore(); this->PumpLoop(); // One of the pages was busted, so only expect one page. 
  EXPECT_EQ(LOAD, this->last_called_callback_);
  EXPECT_EQ(STATUS_TRUE, this->last_status_);
  EXPECT_EQ(1U, this->offline_pages_.size());
  EXPECT_EQ(offline_page.url, this->offline_pages_[0].url);
  EXPECT_EQ(offline_page.offline_id, this->offline_pages_[0].offline_id);
  EXPECT_EQ(offline_page.version, this->offline_pages_[0].version);
  EXPECT_EQ(offline_page.file_path, this->offline_pages_[0].file_path);
  EXPECT_EQ(offline_page.file_size, this->offline_pages_[0].file_size);
  EXPECT_EQ(offline_page.creation_time, this->offline_pages_[0].creation_time);
  EXPECT_EQ(offline_page.last_access_time,
            this->offline_pages_[0].last_access_time);
  EXPECT_EQ(offline_page.access_count, this->offline_pages_[0].access_count);
}

// Test that loading a store with nothing but bad values errors.
// Needs to be outside of the anonymous namespace in order for FRIEND_TEST
// to work.
TEST_F(OfflinePageMetadataStoreImplTest, LoadTotallyCorruptedStore) {
  std::unique_ptr<OfflinePageMetadataStore> store(this->BuildStore());

  // Manually write two broken pages (no id)
  std::unique_ptr<
      leveldb_proto::ProtoDatabase<OfflinePageEntry>::KeyEntryVector>
      entries_to_save(
          new leveldb_proto::ProtoDatabase<OfflinePageEntry>::KeyEntryVector());
  OfflinePageEntry offline_page_proto;
  entries_to_save->push_back(std::make_pair("0", offline_page_proto));
  entries_to_save->push_back(std::make_pair("1", offline_page_proto));

  UpdateStoreEntries((OfflinePageMetadataStoreImpl*)store.get(),
                     std::move(entries_to_save));
  this->PumpLoop();
  EXPECT_EQ(ADD, this->last_called_callback_);
  EXPECT_EQ(STATUS_TRUE, this->last_status_);
  this->ClearResults();

  // Close the store first to ensure file lock is removed.
  store.reset();
  store = this->BuildStore();
  this->PumpLoop();

  // Both pages were busted, so the load should fail.
EXPECT_EQ(LOAD, this->last_called_callback_); EXPECT_EQ(STATUS_FALSE, this->last_status_); } TEST_F(OfflinePageMetadataStoreImplTest, UpgradeStoreFromBookmarkIdToClientId) { std::unique_ptr<OfflinePageMetadataStore> store(this->BuildStore()); // Manually write a page referring to legacy bookmark id. std::unique_ptr< leveldb_proto::ProtoDatabase<OfflinePageEntry>::KeyEntryVector> entries_to_save( new leveldb_proto::ProtoDatabase<OfflinePageEntry>::KeyEntryVector()); OfflinePageEntry offline_page_proto; offline_page_proto.set_deprecated_bookmark_id(1LL); offline_page_proto.set_version(1); offline_page_proto.set_url(kTestURL); offline_page_proto.set_file_path("/foo/bar"); entries_to_save->push_back(std::make_pair("1", offline_page_proto)); UpdateStoreEntries((OfflinePageMetadataStoreImpl*)store.get(), std::move(entries_to_save)); PumpLoop(); EXPECT_EQ(ADD, last_called_callback_); EXPECT_EQ(STATUS_TRUE, last_status_); ClearResults(); // Close the store first to ensure file lock is removed. store.reset(); store = BuildStore(); PumpLoop(); // The page should be upgraded with new Client ID format. EXPECT_EQ(LOAD, last_called_callback_); EXPECT_EQ(STATUS_TRUE, last_status_); EXPECT_EQ(1U, offline_pages_.size()); EXPECT_TRUE(offline_pages_[0].offline_id != 0); EXPECT_EQ(offline_pages::kBookmarkNamespace, offline_pages_[0].client_id.name_space); EXPECT_EQ(base::Int64ToString(offline_page_proto.deprecated_bookmark_id()), offline_pages_[0].client_id.id); EXPECT_EQ(GURL(kTestURL), offline_pages_[0].url); EXPECT_EQ(offline_page_proto.version(), offline_pages_[0].version); EXPECT_EQ(offline_page_proto.file_path(), offline_pages_[0].file_path.MaybeAsASCII()); } } // namespace offline_pages
// Minimal test of boost::program_options with a custom option type: the
// stream extractor below lets po::value<ValueTypes>() parse "0"/"1" from
// the command line.
#include <iostream>
using namespace std;

#include <boost/program_options.hpp>
namespace po = boost::program_options;

enum ValueTypes { first, second };

std::istream &operator>>(std::istream &in, ValueTypes &value) {
    std::string token;
    in >> token;
    if (token == "0")
        value = first;
    else if (token == "1")
        value = second;
    else
        in.setstate(std::ios_base::failbit);  // reject anything but "0"/"1"
    return in;
}

class A {
public:
    A(ValueTypes _v) { value = _v; }
    ValueTypes value;
};

int main(int ac, char *av[]) {
    cout << "STARTED" << endl;
    A a(ValueTypes::second);

    po::options_description desc("test");
    desc.add_options()("debug",
                       po::value<ValueTypes>(&a.value)->default_value(ValueTypes::first),
                       "enable debugging mode");

    po::variables_map vm;
    po::store(po::parse_command_line(ac, av, desc), vm);
    po::notify(vm);

    cout << a.value << endl;  // prints the enum's underlying int (0 or 1)
    return 0;
}
/*
 * Copyright 2017 TWO SIGMA OPEN SOURCE, LLC
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.twosigma.beaker.widgets;

import com.twosigma.beaker.jvm.object.OutputContainer;
import com.twosigma.beaker.jvm.object.TabbedOutputContainerLayoutManager;
import com.twosigma.beaker.table.TableDisplay;
import com.twosigma.beaker.widgets.internal.CommWidget;
import com.twosigma.beaker.widgets.internal.InternalCommWidget;
import com.twosigma.beaker.widgets.selectioncontainer.Tab;
import com.twosigma.beaker.widgets.strings.Label;

import java.util.HashMap;
import java.util.List;
import java.util.stream.Collectors;

public class DisplayOutputContainer {

  public static void display(OutputContainer container) {
    if (container.getLayoutManager() instanceof TabbedOutputContainerLayoutManager) {
      List<CommFunctionality> items =
          container.getItems().stream().map(x -> toCommFunctionality(x)).collect(Collectors.toList());
      Tab tab = new Tab(items, container.getLabels());
      tab.display();
    } else {
      container.getItems().forEach(item -> toCommFunctionality(item).display());
    }
  }

  private static CommWidget toCommFunctionality(Object item) {
    CommWidget widget;
    if (item instanceof InternalCommWidget || item instanceof CommWidget) {
      widget = (CommWidget) item;
    } else if (item instanceof HashMap) {
      widget = TableDisplay.createTableDisplayForMap((HashMap) item);
    } else {
      Label label = new Label();
      label.setValue(item.toString());
      widget = label;
    }
    return widget;
  }
}
/** * Test XStream with an active SecurityManager. Note, that it is intentional, that this test is * not derived from AbstractAcceptanceTest to avoid loaded classes before the SecurityManager is * in action. Also run each fixture in its own to avoid side-effects. * * @author J&ouml;rg Schaible */ public class SecurityManagerTest extends TestCase { private XStream xstream; private DynamicSecurityManager securityManager; private CodeSource defaultCodeSource; private File mainClasses; private File testClasses; private File libs; private File libsJDK13; protected void setUp() throws Exception { super.setUp(); System.setSecurityManager(null); defaultCodeSource = new CodeSource(null, (Certificate[])null); mainClasses = new File(new File( new File(System.getProperty("user.dir"), "target"), "classes"), "-"); testClasses = new File(new File( new File(System.getProperty("user.dir"), "target"), "test-classes"), "-"); libs = new File(new File(System.getProperty("user.dir"), "lib"), "*"); if (!JVM.is14()) { libsJDK13 = new File(new File( new File(System.getProperty("user.dir"), "lib"), "jdk1.3"), "*"); } securityManager = new DynamicSecurityManager(); Policy policy = Policy.getPolicy(); securityManager.setPermissions(defaultCodeSource, policy .getPermissions(defaultCodeSource)); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "setSecurityManager")); } protected void tearDown() throws Exception { System.setSecurityManager(null); super.tearDown(); } protected void runTest() throws Throwable { try { super.runTest(); } catch(Throwable e) { for (final Iterator iter = securityManager.getFailedPermissions().iterator(); iter.hasNext();) { final Permission permission = (Permission)iter.next(); System.out.println("SecurityException: Permission " + permission.toString()); } throw e; } } public void testSerializeWithXpp3DriverAndSun14ReflectionProviderAndActiveSecurityManager() { if (JVM.is14()) { securityManager.addPermission(defaultCodeSource, new 
FilePermission(mainClasses .toString(), "read")); securityManager.addPermission(defaultCodeSource, new FilePermission(testClasses .toString(), "read")); securityManager.addPermission(defaultCodeSource, new FilePermission( libs.toString(), "read")); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "accessDeclaredMembers")); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "accessClassInPackage.sun.reflect")); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "accessClassInPackage.sun.misc")); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "createClassLoader")); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "reflectionFactoryAccess")); securityManager.addPermission(defaultCodeSource, new ReflectPermission( "suppressAccessChecks")); // permissions necessary for CGLIBMapper securityManager.addPermission(defaultCodeSource, new PropertyPermission( "cglib.debugLocation", "read")); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "getProtectionDomain")); securityManager.setReadOnly(); System.setSecurityManager(securityManager); // uses implicit Sun14ReflectionProvider in JDK >= 1.4, since it has the appropriate // rights xstream = new XStream(); assertBothWays(); } } public void testSerializeWithXpp3DriverAndPureJavaReflectionProviderAndActiveSecurityManager() { securityManager.addPermission(defaultCodeSource, new FilePermission(mainClasses .toString(), "read")); securityManager.addPermission(defaultCodeSource, new FilePermission(testClasses .toString(), "read")); securityManager.addPermission(defaultCodeSource, new FilePermission( libs.toString(), "read")); if (libsJDK13 != null) { securityManager.addPermission(defaultCodeSource, new FilePermission(libsJDK13 .toString(), "read")); } securityManager.addPermission(defaultCodeSource, new RuntimePermission( "accessDeclaredMembers")); securityManager.addPermission(defaultCodeSource, new 
RuntimePermission( "createClassLoader")); securityManager.addPermission(defaultCodeSource, new ReflectPermission( "suppressAccessChecks")); // permissions necessary for CGLIBMapper securityManager.addPermission(defaultCodeSource, new PropertyPermission( "cglib.debugLocation", "read")); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "getProtectionDomain")); securityManager.setReadOnly(); System.setSecurityManager(securityManager); xstream = new XStream(new PureJavaReflectionProvider()); assertBothWays(); } public void testSerializeWithDomDriverAndPureJavaReflectionProviderAndActiveSecurityManager() { securityManager.addPermission(defaultCodeSource, new FilePermission(mainClasses .toString(), "read")); securityManager.addPermission(defaultCodeSource, new FilePermission(testClasses .toString(), "read")); securityManager.addPermission(defaultCodeSource, new FilePermission( libs.toString(), "read")); if (libsJDK13 != null) { securityManager.addPermission(defaultCodeSource, new FilePermission(libsJDK13 .toString(), "read")); } securityManager.addPermission(defaultCodeSource, new RuntimePermission( "accessDeclaredMembers")); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "createClassLoader")); securityManager.addPermission(defaultCodeSource, new ReflectPermission( "suppressAccessChecks")); // permissions necessary for CGLIBMapper securityManager.addPermission(defaultCodeSource, new PropertyPermission( "cglib.debugLocation", "read")); securityManager.addPermission(defaultCodeSource, new RuntimePermission( "getProtectionDomain")); securityManager.setReadOnly(); System.setSecurityManager(securityManager); // uses implicit PureJavaReflectionProvider, since Sun14ReflectionProvider cannot be // loaded xstream = new XStream(new DomDriver()); assertBothWays(); } private void assertBothWays() { xstream.alias("software", Software.class); final Software sw = new Software("jw", "xstr"); final String xml = "<software>\n" + " 
<vendor>jw</vendor>\n" + " <name>xstr</name>\n" + "</software>"; String resultXml = xstream.toXML(sw); assertEquals(xml, resultXml); Object resultRoot = xstream.fromXML(resultXml); if (!sw.equals(resultRoot)) { assertEquals("Object deserialization failed", "DESERIALIZED OBJECT\n" + xstream.toXML(sw), "DESERIALIZED OBJECT\n" + xstream.toXML(resultRoot)); } } }
// stringMatcher solves the problem introduced above of possibly // matching with exact string matching or partial string matching. func stringMatcher(test, match string, matchExact bool) bool { if matchExact { return test == match } return strings.Contains(test, match) }
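For illustration only, the same exact-vs-substring dispatch can be sketched in Python (a hypothetical port, not part of this package):

```python
def string_matcher(test: str, match: str, match_exact: bool) -> bool:
    """Mirror of stringMatcher: full equality when exact, else containment."""
    if match_exact:
        return test == match
    return match in test

# exact mode: only full equality passes
assert string_matcher("foobar", "foobar", True)
assert not string_matcher("foobar", "foo", True)
# partial mode: any substring is enough
assert string_matcher("foobar", "oba", False)
```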
# Count the patties among the bottom X layers of a level-N burger, where
# burger(k) = bun, burger(k-1), patty, burger(k-1), bun and burger(0) = patty.
N, X = map(int, input().split())

# memo_length[k] / memo_patties[k]: total layers / patties of a level-k burger
memo_length = [0] * 51
memo_patties = [0] * 51
memo_length[0] = 1
memo_patties[0] = 1
for i in range(1, 51):
    memo_length[i] = memo_length[i - 1] * 2 + 3
    memo_patties[i] = memo_patties[i - 1] * 2 + 1

def doit(level, x):
    """Patties among the bottom x layers of a level-`level` burger."""
    if x == 0:
        return 0
    if x >= memo_length[level]:
        return memo_patties[level]
    tmp = memo_length[level - 1]
    ret = 0
    if x >= 1:
        ret += doit(level - 1, x - 1)  # past the bottom bun, into the lower half
    if x >= tmp + 2:
        ret += 1  # the middle patty
        ret += doit(level - 1, x - tmp - 2)  # continue into the upper half
    return ret

print(doit(N, X))
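The recurrence above can be sanity-checked against a brute force that literally builds the level-N burger string. The following is a standalone sketch (function names are mine, not from the original):

```python
def patties_eaten(level, x):
    # L[k] / P[k]: total layers / patty count of a level-k burger
    # L(0) = 1, P(0) = 1, L(k) = 2*L(k-1) + 3, P(k) = 2*P(k-1) + 1
    L, P = [1], [1]
    for _ in range(level):
        L.append(L[-1] * 2 + 3)
        P.append(P[-1] * 2 + 1)

    def go(lv, rem):
        if rem <= 0:
            return 0
        if rem >= L[lv]:
            return P[lv]
        # level-lv burger = bun, level-(lv-1), patty, level-(lv-1), bun
        eaten = go(lv - 1, rem - 1)               # past the bottom bun
        if rem >= L[lv - 1] + 2:
            eaten += 1                            # the middle patty
            eaten += go(lv - 1, rem - L[lv - 1] - 2)
        return eaten

    return go(level, x)

def brute(level, x):
    # build the burger literally: P = patty, B = bun
    s = "P"
    for _ in range(level):
        s = "B" + s + "P" + s + "B"
    return s[:x].count("P")
```

For example, the bottom 7 layers of a level-2 burger ("BBPPPBP...") contain 4 patties, and the two functions agree on every prefix of every small burger.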
#pragma once

#include <QWidget>

namespace facelib {
class FaceObject;
}

class GraphicsView;
class QGraphicsScene;
class QGroupBox;
class QLineEdit;
class QProgressBar;
class QPushButton;
class QTreeView;

class FaceGui : public QWidget
{
    Q_OBJECT

public:
    FaceGui();

protected:
    void resizeEvent(QResizeEvent *event) override;

private slots:
    void slotStartBtnClicked();
    void slotSelectBtnClicked();
    void slotCheckStartCondition();
    void slotProgress(int value);
    void slotFinished();
    void slotTreeViewClicked(const QModelIndex &index);

private:
    QPushButton *startBtn_;
    QProgressBar *progressBar_;
    QLineEdit *tokenEdit_;
    QPushButton *selectBtn_;
    QTreeView *treeView_;
    QGraphicsScene *graphicsScene_;
    GraphicsView *graphicsView_;
    facelib::FaceObject *faceObject_;
};
/** * Handler that takes care of redirecting * * @author igor */ public static class RedirectHandler implements IRequestHandler { private final String url; private final HttpsConfig config; /** * Constructor * * @param config * https config * @param url * redirect location */ public RedirectHandler(String url, HttpsConfig config) { this.url = Args.notNull(url, "url"); this.config = Args.notNull(config, "config"); } /** * @return redirect location */ public String getUrl() { return url; } @Override public void respond(IRequestCycle requestCycle) { String location = url; if (location.startsWith("/")) { // context-absolute url location = requestCycle.getUrlRenderer().renderContextRelativeUrl(location); } if (config.isPreferStateful()) { // we need to persist the session before a redirect to https so the session lasts // across both http and https calls. Session.get().bind(); } WebResponse response = (WebResponse)requestCycle.getResponse(); response.sendRedirect(location); } }
A Chesapeake, Virginia grand jury indicted 28-year-old Ryan Frederick on charges of capital murder yesterday. The more severe charge (he was originally charged with first-degree murder) means the state will likely seek the death penalty, though there has been no official announcement as of yet. Last January, Frederick shot and killed Det. Jarrod Shivers during a drug raid on Frederick's home. Police were looking for a major marijuana growing operation in Frederick's garage. They didn't find one. Frederick had no prior criminal record, and had a misdemeanor amount of pot (he says a few joints) in his home at the time of the raid. His home had also been broken into a few days prior to the raid. We now know that the police informant whose tip led to the raid was responsible for the break-in. We also know that informant had credit card fraud charges pending against him that were dropped just before the raid. What we still don't know is if his burglary of Frederick's home was done with the knowledge or consent of the police. Special Prosecutor Paul Ebert pushed the unlikely theory yesterday that Frederick looked out his window, saw several police officers about to break into his home, heard them announce themselves as police, decided to shoot and kill just one of them, then surrendered. This is a guy who friends, former employers, neighbors and family describe as harmless and unconfrontational to the point of being meek. The idea that he'd knowingly kill a cop over a few joints is absurd. Frederick had a job he enjoyed, a record of steady employment and strong recommendations from supervisors, and he'd just gotten engaged. Again, hardly the profile of a cop killer with a death wish. Ebert also got the law wrong in his statement to the press. He said: "Anytime someone kills a police officer, who is acting properly with a legal search warrant, that is a case of Capital Murder." Well, no. 
According to the Virginia criminal code, the act has to be willful, deliberate, and premeditated. If you don't know that the men breaking down your door are police when you shoot and kill one of them, you aren't guilty of capital murder. Virginia doesn't have a Castle Doctrine, so you may be guilty of something. But it isn't capital murder. This is why Ebert is arguing the "peered out the window" theory. The grand jury also indicted Frederick on a charge of manufacturing marijuana. Ebert hinted at this possibility a couple of weeks ago. I'm still trying to figure out what evidence they have for that charge. They found no plants in Frederick's home. They seized some grow lights and planting pots, but the guy is a gardener. His friends and neighbors—or one look at his backyard—confirm that. It's unlikely that police had information of Frederick manufacturing marijuana other than the informant's tip prior to the raid, or they'd have included it in the affidavit to obtain the search warrant. That leaves only the possibility that they've rounded up someone since the raid who might testify that he bought drugs from Frederick, or witnessed Frederick's alleged marijuana operation. At this point, it would be prudent to be wary of any informants with criminal records the police may bring forward to testify against Frederick. The police did no controlled buys to confirm the informant's tip. They say their "surveillance" consisted of a few drive-bys over a three-month period, during which they reported no unusual activity. They claim to have done an extensive background check on Frederick, and found only traffic tickets. Yet they felt compelled to break down his door after nightfall, based on a tip from a shady informant, and very little else. Bad as all of this looks, there are a couple of glimmers of hope, here. The first is that Paul Ebert has a long and illustrious history of incompetence. He seems to be living up to that reputation here with his overcharging of Frederick. 
The other reason for hope is that judging from the comments threads at the Virginian-Pilot website, public opinion in Chesapeake seems to have shifted decidedly to Frederick's favor. The public is usually reflexively pro-police, particularly when a cop is killed in the line of duty. That there's now considerable doubt about this case is testament to just how poorly this raid was executed, and how poorly it's been handled since. Prior coverage of the Frederick case here.
#include "QPhoneDao.h"
#include "QDaoHelpBase.h"
#include "QPhoneVo.h"

QPhoneDao::QPhoneDao()
{
}

bool QPhoneDao::initTable()
{
    if (!dbOperateValid()) {
        return false;
    }
    QSqlError sqlError = QDAOHELP->createTable<QPhoneVo>();
    return sqlError.type() == QSqlError::NoError;
}

bool QPhoneDao::insertInfo(QPhoneVo *pInfo)
{
    // Guard against an invalid database handle, as the other operations do.
    if (!dbOperateValid()) {
        return false;
    }
    QSqlError sqlError = QDAOHELP->insert(pInfo);
    return sqlError.type() == QSqlError::NoError;
}

bool QPhoneDao::getInfo(long id, QPhoneVo *pInfo)
{
    if (!dbOperateValid()) {
        return false;
    }
    QString strSql = QString(" WHERE PhoneID = '%1'").arg(id);
    QSqlError error = QDAOHELP->fetch_by_query(strSql, pInfo);
    return error.type() == QSqlError::NoError;
}
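For experimenting with the same create-table/insert/fetch-by-id shape outside of Qt, here is a minimal Python sketch using the standard sqlite3 module. The `phone` table schema (PhoneID, Number) is a hypothetical stand-in for whatever QPhoneVo actually maps to. Note that, unlike the `QString::arg()` string formatting above, the queries here are parameterized, which avoids SQL injection:

```python
import sqlite3

class PhoneDao:
    """Minimal sketch of the DAO above, assuming a hypothetical
    phone table (PhoneID, Number) in place of QPhoneVo's real schema."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)

    def init_table(self):
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS phone "
            "(PhoneID INTEGER PRIMARY KEY, Number TEXT)")
        return True

    def insert_info(self, phone_id, number):
        # Parameterized placeholders instead of string interpolation.
        self.conn.execute("INSERT INTO phone VALUES (?, ?)", (phone_id, number))
        return True

    def get_info(self, phone_id):
        row = self.conn.execute(
            "SELECT PhoneID, Number FROM phone WHERE PhoneID = ?",
            (phone_id,)).fetchone()
        return row  # None when the id is absent

dao = PhoneDao()
dao.init_table()
dao.insert_info(1, "555-0100")
print(dao.get_info(1))  # (1, '555-0100')
```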
def extract_metadata(shp_file_full_path):
    try:
        metadata_dict = {}
        parsed_md_dict = parse_shp(shp_file_full_path)

        if parsed_md_dict["wgs84_extent_dict"]["westlimit"] != UNKNOWN_STR:
            wgs84_dict = parsed_md_dict["wgs84_extent_dict"]
            # A degenerate extent (west == east and north == south) is a point.
            if wgs84_dict["westlimit"] == wgs84_dict["eastlimit"] \
                    and wgs84_dict["northlimit"] == wgs84_dict["southlimit"]:
                coverage_dict = {"Coverage":
                                 {"type": "point",
                                  "value": {"east": wgs84_dict["eastlimit"],
                                            "north": wgs84_dict["northlimit"],
                                            "units": wgs84_dict["units"],
                                            "projection": wgs84_dict["projection"]}
                                  }}
            else:
                coverage_dict = {"Coverage": {"type": "box",
                                              "value": parsed_md_dict["wgs84_extent_dict"]}}
            metadata_dict["coverage"] = coverage_dict

        origin_extent = parsed_md_dict["origin_extent_dict"]
        original_coverage_dict = {"originalcoverage":
                                  {"northlimit": origin_extent["northlimit"],
                                   "southlimit": origin_extent["southlimit"],
                                   "westlimit": origin_extent["westlimit"],
                                   "eastlimit": origin_extent["eastlimit"],
                                   "projection_string":
                                       parsed_md_dict["origin_projection_string"],
                                   "projection_name":
                                       parsed_md_dict["origin_projection_name"],
                                   "datum": parsed_md_dict["origin_datum"],
                                   "unit": parsed_md_dict["origin_unit"]
                                   }}
        metadata_dict["originalcoverage"] = original_coverage_dict

        field_info_array = []
        field_name_list = parsed_md_dict["field_meta_dict"]['field_list']
        for field_name in field_name_list:
            field_info_dict_item = {
                'fieldinformation':
                    parsed_md_dict["field_meta_dict"]["field_attr_dict"][field_name]
            }
            field_info_array.append(field_info_dict_item)
        metadata_dict['field_info_array'] = field_info_array

        geometryinformation = {"featureCount": parsed_md_dict["feature_count"],
                               "geometryType": parsed_md_dict["geometry_type"]}
        metadata_dict["geometryinformation"] = geometryinformation
        return metadata_dict
    except Exception as ex:
        # A bare "except:" would also swallow KeyboardInterrupt/SystemExit;
        # catch Exception and chain the original error for debugging.
        raise ValidationError("Parse Shapefiles Failed!") from ex
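The point-versus-box decision above can be isolated into a small helper for testing. This is only a sketch, with key names copied from the `wgs84_extent_dict` structure used in the function:

```python
def coverage_from_wgs84_extent(wgs84_dict):
    # A degenerate box (west == east and north == south) is reported
    # as a point; anything else keeps the full box extent.
    if wgs84_dict["westlimit"] == wgs84_dict["eastlimit"] \
            and wgs84_dict["northlimit"] == wgs84_dict["southlimit"]:
        return {"Coverage": {"type": "point",
                             "value": {"east": wgs84_dict["eastlimit"],
                                       "north": wgs84_dict["northlimit"],
                                       "units": wgs84_dict["units"],
                                       "projection": wgs84_dict["projection"]}}}
    return {"Coverage": {"type": "box", "value": wgs84_dict}}

point_extent = {"westlimit": -71.1, "eastlimit": -71.1,
                "northlimit": 42.3, "southlimit": 42.3,
                "units": "decimal degrees", "projection": "WGS 84"}
print(coverage_from_wgs84_extent(point_extent)["Coverage"]["type"])  # point
```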
/** * See the FeatureInferrer class for the function of FeatureInferrers. * * This inferrer uses dependency relations to generate syntactic N-grams. * Syntactic N-grams are tuples of words which are adjacent in terms of * their paths in a syntactic tree, as opposed to lexical n-grams which * are tuples of words which are adjacent according to their order of * occurrence in an utterance. * * WARNING, Assumptions: * 1. CMU tagger and tokeniser was used * 2. TweetTagConverter for token expansion was used * 3. TweetDependencyParser was used with stanford dependencies * * User: Andrew D. Robertson * Date: 07/01/2014 * Time: 11:33 */ public class FeatureInferrerDependencyNGrams extends FeatureInferrer { private static final long serialVersionUID = 0L; private static final String FEATURE_TYPE_NGRAM = "dependencyNgram"; private static final String prepositionPOS = "P"; private static final String prepositionDeprel = "prep"; private static final String prepositionComplementDeprel = "pcomp"; private static final String prepositionObjectDeprel = "pobj"; private static final String punctuationPOS = ","; private boolean ignorePunctuation = true; private boolean lowercase = true; private boolean useUnexpandedForm = true; private boolean collapsePrepositions = true; private boolean retainUncollapsedPrepositionalNgrams = true; private boolean includeBigrams = true; private boolean includeTrigrams = true; public FeatureInferrerDependencyNGrams(){ } public FeatureInferrerDependencyNGrams(boolean ignorePunctuation, boolean includeBigrams, boolean includeTrigrams, boolean collapsePrepositions, boolean retainUncollapsedPrepositionalNgrams, boolean useUnexpandedForm, boolean lowercase) { this.ignorePunctuation = ignorePunctuation; this.includeBigrams = includeBigrams; this.includeTrigrams = includeTrigrams; this.collapsePrepositions = collapsePrepositions; this.retainUncollapsedPrepositionalNgrams = retainUncollapsedPrepositionalNgrams; this.useUnexpandedForm = useUnexpandedForm; 
this.lowercase = lowercase; } @Override public List<Feature> addInferredFeatures(Document document, List<Feature> featuresSoFar) { List<TweetTagConverter.Token> tokens = (List<TweetTagConverter.Token>)document.getAttribute("ExpandedTokens"); if (tokens == null) throw new FeatureExtractionException("Instances must have been dependency parsed before they can have DependencyNGrams extracted."); if (!(includeBigrams || includeTrigrams)) return featuresSoFar; // User asked for no ngrams, so do nothing if (useUnexpandedForm) { featuresSoFar.addAll(getUnexpandedNgrams(tokens, document)); } else { //TODO: currently doesn't support collapsing of prepositions in this mode DependencyTree dt = new DependencyTree(tokens); featuresSoFar.addAll(getNGrams(dt, document)); } return featuresSoFar; } @Override public Set<String> getFeatureTypes() { return Sets.newHashSet(FEATURE_TYPE_NGRAM); } /** * Produce a mapping from token ID to the token's headIDs. In the parse of a sentence, tokens only have a single * head, but this parse occurs on expanded tokens. If we de-expand those tokens, it is possible that a * de-expanded token (which inherits the head of the expanded tokens) may inherit more than one head. So a set * of head IDs is produced. */ private Map<Integer, Set<Integer>> resolveHeads(List<TweetTagConverter.Token> tokens, Document document){ Map<Integer, Set<Integer>> tokenHeads = new HashMap<>(); // Initialise sets for each Token ID (IDs are indexed from 1) for (int i = 1; i <= document.size(); i++) tokenHeads.put(i, new HashSet<Integer>()); for (TweetTagConverter.Token token : tokens){ // The "oldID" of an expanded token refers to the ID of the token in *document* from which this token was expanded // The "head" of an expanded token refers to the ID of the head of the token from the expanded tokens list (i.e. 
// different ID space)
            // So we must get the head ID, find the expanded token that this refers to, then get the oldID of that token, to get the unexpanded head ID
            int resolvedHeadID = token.head == 0? 0 : tokens.get(token.head-1).oldID;

            // The oldID and resolvedHeadID will be the same if an expanded token lists a second expanded token as its head, but both tokens resolve to the same unexpanded token. We ignore these relations
            // The resolvedHeadID is 0 when the head is the root. We also ignore these relations
            if (token.oldID != resolvedHeadID && resolvedHeadID != 0) {
                // We may ignore punctuation
                if(!ignorePunctuation || !token.pos.equals(punctuationPOS)) {
                    tokenHeads.get(token.oldID).add(resolvedHeadID);
                }
            }
        }
        return tokenHeads;
    }

    /**
     * Get a set of the IDs of the tokens that are actually prepositions that can be collapsed.
     */
    private Set<Integer> getCollapsablePrepositions(List<TweetTagConverter.Token> tokens){
        Set<Integer> collapsablePrepositionIDs = new HashSet<>();

        // First, all tokens marked as prepositions (by their PoS) and that have the dependency relation indicating they initiate a prepositional phrase are considered candidates for being collapsable
        Set<Integer> candidates = new HashSet<>();
        for (TweetTagConverter.Token token : tokens){
            if (token.pos.equals(prepositionPOS) && (token.deprel.equals(prepositionDeprel) || token.deprel.equals(prepositionComplementDeprel))){
                candidates.add(token.oldID);
            }
        }

        // Then only those candidates which have a prepositional object or prepositional complement as a child token are considered collapsable
        for (TweetTagConverter.Token token : tokens) {
            if (token.deprel.equals(prepositionObjectDeprel) || token.deprel.equals(prepositionComplementDeprel)){
                int resolvedHeadID = token.head == 0?
0 : tokens.get(token.head-1).oldID; if (candidates.contains(resolvedHeadID)){ collapsablePrepositionIDs.add(resolvedHeadID); } } } return collapsablePrepositionIDs; } private List<Feature> getUnexpandedNgrams(List<TweetTagConverter.Token> tokens, Document document) { // Get a mapping between tokens and their heads Map<Integer, Set<Integer>> tokenHeads = resolveHeads(tokens, document); // Get a set of the ids of tokens that are collapsable prepositions Set<Integer> collapsablePrepositionIDs = getCollapsablePrepositions(tokens); // Keep track of all the unique ngrams we might want to extract (uniqueness determined by token ID not token form) Set<Ngram> allNgrams = new HashSet<>(); for (int i = 1; i <= document.size(); i++) { // If this is a token that we shouldn't be ignoring if (retainUncollapsedPrepositionalNgrams || !collapsablePrepositionIDs.contains(i) && (!ignorePunctuation || !document.get(i-1).get("pos").equals(punctuationPOS))){ // A bigram is two tokens linked by a single relation (though certain relations can be collapsed) Set<Ngram> bigrams = new HashSet<>(); // See "resolveHeads()" for why we may have multiple heads addNgrams(i, bigrams, document, tokenHeads, collapsablePrepositionIDs); if (includeBigrams) allNgrams.addAll(bigrams); // Trigrams extend bigrams by finding and attaching all the bigrams from the end point of the original bigrams if (includeTrigrams) { for (Ngram ngram : bigrams) { addNgrams(ngram.getHeadID(), ngram, allNgrams, document, tokenHeads, collapsablePrepositionIDs); } } } } return makeFeatures(allNgrams, document); } private void addNgrams(int tokenID, Set<Ngram> ngramsSoFar, Document document, Map<Integer, Set<Integer>> tokenHeads, Set<Integer> collapsablePrepositionIDs){ addNgrams(tokenID, new Ngram(tokenID), ngramsSoFar, document, tokenHeads, collapsablePrepositionIDs); } private void addNgrams(int tokenID, Ngram ngram, Set<Ngram> ngramsSoFar, Document document, Map<Integer, Set<Integer>> tokenHeads, Set<Integer> 
collapsablePrepositionIDs) { for (int headID : tokenHeads.get(tokenID)) { if (!ignorePunctuation || !document.get(headID-1).get("pos").equals(punctuationPOS)) { if (!ngram.isAlreadyVisited(headID)){ boolean canCollapse = collapsePrepositions && collapsablePrepositionIDs.contains(headID) && !onlyHeadIsRoot(headID, tokenHeads); if (!canCollapse || retainUncollapsedPrepositionalNgrams) { Ngram finalNgram = new Ngram(ngram); finalNgram.setDestinationID(headID); ngramsSoFar.add(finalNgram); } if(canCollapse) { Ngram grownNgram = new Ngram(ngram); grownNgram.addIntermediateID(headID); addNgrams(headID, grownNgram, ngramsSoFar, document, tokenHeads, collapsablePrepositionIDs); } } } } } private boolean onlyHeadIsRoot(int tokenID, Map<Integer, Set<Integer>> tokenHeads) { return tokenHeads.get(tokenID).size()==0; } private List<Feature> makeFeatures(Set<Ngram> ngrams, Document document){ List<Feature> features = new ArrayList<>(ngrams.size()); for (Ngram ngram : ngrams) { List<String> forms = new ArrayList<>(ngrams.size()); for (int i : ngram.asImmutableSortedList()) { forms.add(getTokenForm(i, document)); } features.add(makeFeature(FEATURE_TYPE_NGRAM, forms)); } return features; } private String getTokenForm(int tokenID, Document document) { try { return document.get(tokenID-1).get("form"); } catch (ArrayIndexOutOfBoundsException e) { throw new RuntimeException(e); } } /** * Given a node in a dependency tree, recursively find all syntactic bigrams and trigrams of * this node and all others below this node in the syntactic tree. * * WARNING: don't call this with the root node. 
 */
    private List<Feature> getNGrams(DependencyTree.Node node, Document document){
        List<Feature> ngrams = new ArrayList<>();
        if (ignorePunctuation && node.getData().deprel.equals("punct")) return ngrams;
        List<DependencyTree.Node> children = node.getChildren();
        ngrams.addAll(makeBigramsAndTrigrams(node));
        // Recurse on children
        for (DependencyTree.Node child : children) {
            ngrams.addAll(getNGrams(child, document));
        }
        return ngrams;
    }

    /**
     * Given a dependency tree, find all syntactic bigrams and trigrams of all tokens.
     */
    private List<Feature> getNGrams(DependencyTree dt, Document document) {
        List<Feature> ngrams = new ArrayList<>();
        for (DependencyTree.Node child : dt.getRoot().getChildren()) {
            ngrams.addAll(getNGrams(child, document));
        }
        return ngrams;
    }

    /**
     * Given a node in a dependency tree, make bigrams with its children, and
     * incorporate grandchildren for trigrams. The order of the elements in the
     * ngrams is consistent with the order in which they occur in the utterance.
     */
    private List<Feature> makeBigramsAndTrigrams(DependencyTree.Node head) {
        List<Feature> ngrams = new ArrayList<>();
        for (DependencyTree.Node child : ignorePunctuation? DependencyTree.extractNodesWithoutRelation(head.getChildren(), "punct"): head.getChildren()) {
            if (includeBigrams) {
                // Order the words in the bigrams by their position in the sentence
                if (child.compareTo(head) < 0) ngrams.add(makeFeature(FEATURE_TYPE_NGRAM, child.getData().form, head.getData().form));
                else ngrams.add(makeFeature(FEATURE_TYPE_NGRAM, head.getData().form, child.getData().form));
            }
            if (includeTrigrams) {
                for (DependencyTree.Node grandChild : ignorePunctuation?
DependencyTree.extractNodesWithoutRelation(child.getChildren(), "punct"): child.getChildren()){ ArrayList<DependencyTree.Node> nodes = Lists.newArrayList(grandChild, child, head); // Order the nodes by their position in the sentence Collections.sort(nodes); ngrams.add(makeFeature(FEATURE_TYPE_NGRAM, nodes.get(0).getData().form, nodes.get(1).getData().form, nodes.get(2).getData().form)); } } } return ngrams; } private Feature makeFeature(String name, String... forms){ StringBuilder sb = new StringBuilder(); for (int i=0; i < forms.length-1; i++) { sb.append(lowercase? forms[i].toLowerCase() : forms[i]); sb.append("-"); } sb.append(lowercase? forms[forms.length-1].toLowerCase() : forms[forms.length-1]); return new Feature(sb.toString(), name); } private Feature makeFeature(String name, List<String> forms){ StringBuilder sb = new StringBuilder(); for (int i=0; i < forms.size()-1; i++) { sb.append(lowercase ? forms.get(i).toLowerCase() : forms.get(i)); sb.append("-"); } sb.append(lowercase ? forms.get(forms.size() - 1).toLowerCase() : forms.get(forms.size() - 1)); return new Feature(sb.toString(), name); } private static class Ngram { private int startingID; private Set<Integer> pathIDs; private int destinationID; public Ngram(int startingID) { this(startingID, new HashSet<Integer>(), 0); } public Ngram(int startingID, Set<Integer> pathIDs) { this(startingID, pathIDs, 0); } public Ngram(int startingID, Set<Integer> pathIDs, int destinationID) { this.startingID = startingID; this.pathIDs = new HashSet<>(pathIDs); this.destinationID = destinationID; } public Ngram(int startingID, int destinationID){ this(startingID, new HashSet<Integer>(), destinationID); } public Ngram(Ngram ngram) { this(ngram.startingID); for (int i : ngram.pathIDs) { this.pathIDs.add(i); } if (ngram.destinationID != 0) this.pathIDs.add(ngram.destinationID); } public Ngram(int... 
ids) { if (ids.length < 2) throw new RuntimeException("No starting point AND/OR no destination point"); if (ids.length == 2) { startingID = ids[0]; pathIDs = new HashSet<>(); destinationID = ids[1]; } else { startingID = ids[0]; destinationID = ids[ids.length-1]; pathIDs = new HashSet<>(); for (int i = 1; i < ids.length-1; i++) pathIDs.add(ids[i]); } } public boolean isAlreadyVisited(int id){ return pathIDs.contains(id) || id == startingID || id == destinationID; } public int getHeadID(){ return destinationID; } public int getStartingID(){ return startingID; } public void addIntermediateID(int id) { if (isAlreadyVisited(id)) throw new RuntimeException("Attempting to represent a cycle"); pathIDs.add(id); } public void setDestinationID(int id) { if (isAlreadyVisited(id)) throw new RuntimeException("Attempting to represent a cycle"); destinationID = id; } public ImmutableList<Integer> asImmutableSortedList() { if (destinationID == 0) throw new RuntimeException("Destination is root / unassigned"); List<Integer> items = new ArrayList<>(pathIDs.size()+1); items.add(startingID); items.addAll(pathIDs); items.add(destinationID); Collections.sort(items); return ImmutableList.<Integer>builder().addAll(items).build(); } public boolean equals(Object o) { if (this == o) return true; if (o== null || getClass() != o.getClass()) return false; Ngram ngram = (Ngram)o; return asImmutableSortedList().equals(ngram.asImmutableSortedList()); } public int hashCode(){ return asImmutableSortedList().hashCode(); } public String toString() { return asImmutableSortedList().toString(); } } // TESTING public static void main(String[] args) throws IOException { FeatureExtractionPipeline pipeline = buildParsingPipeline(true, true); // String exampleSentence = "Economic news have little effect on financial markets."; // String exampleSentence = "I'm gonna go!"; String exampleSentence = "The badger'll never kill him before me"; // String exampleSentence = "He came out from under the bed."; Document 
document = pipeline.processDocumentWithoutCache(new Instance("", exampleSentence, "")); FeatureInferrerDependencyNGrams d = new FeatureInferrerDependencyNGrams(); // DependencyTree dt = new DependencyTree((List<TweetTagConverter.Token>)document.getAttribute("ExpandedTokens")); List<Feature> ngrams = d.getUnexpandedNgrams((List<TweetTagConverter.Token>) document.getAttribute("ExpandedTokens"), document); // d.resolveHeads((List<TweetTagConverter.Token>)document.getAttribute("ExpandedTokens"), document); // for (Feature s : d.getNGrams(dt, document)) { // System.out.println(s); // } for(Feature f : ngrams){ System.out.println(f.value()); } System.out.println("---- Done."); } @Override public boolean isThreadSafe() { return true; } }
/** * Waits until the consumer is done with the texture. * * <p>This method must be called within the application's GL context that will overwrite the * TextureFrame. */ public void waitUntilReleasedWithGpuSync() throws InterruptedException { synchronized (this) { while (inUse && releaseSyncToken == null) { wait(); } if (releaseSyncToken != null) { releaseSyncToken.waitOnGpu(); releaseSyncToken.release(); inUse = false; releaseSyncToken = null; } } }
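The locking structure of this method — block while the frame is in use, then consume a one-shot release token when it appears — is a standard condition-variable pattern. The Python sketch below illustrates only that pattern; a plain callback stands in for the GPU sync token, and the names are hypothetical:

```python
import threading

class Frame:
    """Sketch of the release-wait pattern above: the consumer thread
    posts a 'sync token' (here just a callable) and wakes the waiter."""

    def __init__(self):
        self._cond = threading.Condition()
        self._in_use = True
        self._release_token = None  # stands in for the GPU sync token

    def wait_until_released(self):
        with self._cond:
            # Block while the frame is in use and no token has arrived.
            while self._in_use and self._release_token is None:
                self._cond.wait()
            if self._release_token is not None:
                self._release_token()      # stands in for waitOnGpu() + release()
                self._in_use = False
                self._release_token = None

    def release(self, token):
        with self._cond:
            self._release_token = token
            self._cond.notify_all()

frame = Frame()
done = []
# A "consumer" releases the frame shortly after the waiter blocks.
consumer = threading.Timer(0.05, frame.release, args=[lambda: done.append("synced")])
consumer.start()
frame.wait_until_released()
print(done)  # ['synced']
```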
We've grown so accustomed to seeing Stevie Wonder perform like he did last night at the Grammys that sometimes we might take for granted not just his talent, but also the obstacles he overcame to learn those skills. Jahmir Wallace, who says Wonder is his inspiration, is not one of those people. The 10-year-old student at Green Street Elementary School in Phillipsburg, N.J., was born without arms. His older sister used to play piano, and that piqued Wallace's musical curiosity. Undeterred by his disability, the fifth-grader learned the trumpet, playing it with his feet. "I kind of felt excited," Wallace recalled to WFMZ-TV in Allentown, Pa., after playing his first note, about four months ago. "I kind of felt like, 'Oh, man, this is kind of comfortable,' and it kind of felt like this might be the one for me." A custom trumpet stand constructed by a local music store helped Wallace feel at ease with the instrument. "Everybody who knew him said, 'If Jahmir wants to play the trumpet, we want him to play the trumpet,'" music teacher Desiree Kratzer told the TV station. She is one of many faculty and administrators at the school supporting Wallace. "To see how he moves his toes like we move our fingers, it's amazing," said school principal Raffaele LaForgia. But while adults marvel at Wallace's resilience, the student actually credits Kratzer, saying that "if it wasn't for her, I would never know what the trumpet was." Wallace also believes that his story can inspire others. "Anybody out there that would like to try an instrument, go ahead and try it," he said. "You never know: If you like it, you like it; if you don't, you don't. Keep on trying." It's a great piece of advice from a young musician who might just end up with his own Grammy moment one day.
// Finds the lowest common ancestor (LCA) of two keys in a binary tree.
/*
 Recurse down the tree. If the current node matches either key, return it.
 If the two keys are found in different subtrees of a node, that node is the
 first point where their paths diverge, so it is the LCA. If both keys are in
 the same subtree (or only one was found), propagate whichever subtree result
 is non-null. Note: this assumes both keys are present in the tree; if one is
 missing, the other key's node is returned instead of NULL.
*/
Node* findLCA(Node* root, int a, int b) {
    if (root == NULL)
        return NULL;
    if (root->data == a || root->data == b)
        return root;
    Node* left = findLCA(root->left, a, b);
    Node* right = findLCA(root->right, a, b);
    if (left && right)
        return root;  // keys found in different subtrees
    return left ? left : right;
}
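The same recursion is easy to prototype in Python to check both cases described in the comment (keys on different sides, keys on the same side); the usual caveat applies that both keys are assumed to be present in the tree:

```python
class Node:
    def __init__(self, data, left=None, right=None):
        self.data, self.left, self.right = data, left, right

def find_lca(root, a, b):
    if root is None:
        return None
    if root.data in (a, b):          # found one target: bubble it up
        return root
    left = find_lca(root.left, a, b)
    right = find_lca(root.right, a, b)
    if left and right:               # targets in different subtrees: LCA found
        return root
    return left or right             # both on one side (or not found)

#         1
#        / \
#       2   3
#      / \
#     4   5
tree = Node(1, Node(2, Node(4), Node(5)), Node(3))
print(find_lca(tree, 4, 5).data)  # 2  (same side of root, split at 2)
print(find_lca(tree, 4, 3).data)  # 1  (different sides of root)
```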
import { Body, Controller, Post, UsePipes, ValidationPipe } from '@nestjs/common'; import { DesafiosService } from './desafios.service'; import { CriarDesafioDto } from './dtos/criar-desafio.dto'; @Controller('desafios') export class DesafiosController { constructor(private readonly desafiosService: DesafiosService) {} @Post() @UsePipes(ValidationPipe) async criarDesafio(@Body() criarDesafioDto: CriarDesafioDto) { return await this.desafiosService.criarDesafio(criarDesafioDto) } }
/** * Displays an error message. * * @param message the message to display */ @Override public void error(String message) { error(null, message); }
/**
 * A collection of {@link Document Documents}.
 *
 * @deprecated This class is deprecated because of potential issues with marshalling and unmarshalling this type when used in RESTful APIs.
 * When you want to represent a collection of {@link Document documents}, please use the {@link DocumentCollection} interface and the
 * {@link DocumentCollectionImpl} implementation class.
 */
@XmlAccessorType(XmlAccessType.FIELD)
@XmlRootElement(name = "documents-object")
@Deprecated
public class Documents implements DocumentCollection<Document> {

    private static final long serialVersionUID = 6962662228758156488L;

    private List<Document> documents = new ArrayList<>();

    public Documents() {
    }

    public Documents(List<Document> documents) {
        this.documents = new ArrayList<Document>(documents);
    }

    public void addDocument(Document document) {
        documents.add(document);
    }

    public List<Document> getDocuments() {
        return documents;
    }
}
Toronto Blue Jays QR Code Ticket Converter Bought tickets on StubHub and have no printer? Convert your PDF tickets to a mobile ticket right here! All you have to do is enter your ticket number and this will convert your ticket to a scannable QR code to enter Rogers Centre, all processing is done on the browser level, and no data is transferred! Use the code this page generates to enter the stadium, and a locator slip will be printed at the turnstiles. Update - May 22, 2018 - Still working as of today. Please note that only tickets that are 12 digits from season tickets or game pack holders will work. Single game tickets with 16 digits will not work on this tool. Ticket Number (no spaces): Section: Row: Seat:
def dict_to_str(d):
    # str.join on the separator (the Python 2 string.join function is gone in Python 3)
    return ' '.join("--%s '%s'" % (lib_to_orig_opt_rep(k), lib_to_orig_opt_rep(v))
                    for (k, v) in d.items() if v != '')
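A self-contained usage sketch. `lib_to_orig_opt_rep` is not defined in this snippet, so a hypothetical underscore-to-hyphen converter is assumed here purely for illustration:

```python
def lib_to_orig_opt_rep(s):
    # Hypothetical converter: assume it maps library-style names
    # (underscores) back to the original CLI option spelling (hyphens).
    return str(s).replace('_', '-')

def dict_to_str(d):
    # Render each non-empty entry as --key 'value', space-separated;
    # empty values drop the option entirely.
    return ' '.join("--%s '%s'" % (lib_to_orig_opt_rep(k), lib_to_orig_opt_rep(v))
                    for (k, v) in d.items() if v != '')

print(dict_to_str({'max_depth': 3, 'dry_run': ''}))  # --max-depth '3'
```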
// GetPart is a wrapper around the C function soup_multipart_get_part. func (recv *Multipart) GetPart(part int32) (bool, *MessageHeaders, *Buffer) { c_part := (C.int)(part) var c_headers *C.SoupMessageHeaders var c_body *C.SoupBuffer retC := C.soup_multipart_get_part((*C.SoupMultipart)(recv.native), c_part, &c_headers, &c_body) retGo := retC == C.TRUE headers := MessageHeadersNewFromC(unsafe.Pointer(c_headers)) body := BufferNewFromC(unsafe.Pointer(c_body)) return retGo, headers, body }
/** * just adds a prefix to the path * * @param filepath the path to which to add the prefix * @param offset the offset to add as prefix * @return the whole path */ public static String prependPathOffset(String filepath, String offset) { if ((offset != null) && (offset.length() > 0)) { /* * IMPORTANT: Do NOT use File.separator here, since this is also used for resource-paths! */ return offset + "/" + filepath; } else { return filepath; } }
def ordered_host(self): return self._get_list_field("hosts", lambda x: OrderedHost(x))
def subscribe(self, handler_func): if handler_func.__name__ == '<lambda>': raise ValueError('handler cannot be a lambda function') return self._add_handler(handler_func)
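A minimal sketch of how this guard behaves in practice. The surrounding class and `_add_handler` are not shown in the source, so a plain handler list is assumed here; the point is that every lambda reports `__name__ == '<lambda>'`, which defeats any name-based bookkeeping (such as unsubscribing by name):

```python
class EventStream:
    """Sketch of the subscribe() contract above, with a hypothetical
    list-backed _add_handler standing in for the real implementation."""

    def __init__(self):
        self._handlers = []

    def subscribe(self, handler_func):
        # Lambdas all share the name '<lambda>', so they cannot be
        # distinguished later; reject them up front.
        if handler_func.__name__ == '<lambda>':
            raise ValueError('handler cannot be a lambda function')
        return self._add_handler(handler_func)

    def _add_handler(self, handler_func):
        self._handlers.append(handler_func)
        return handler_func

def on_event(payload):
    return payload

stream = EventStream()
stream.subscribe(on_event)          # named function: accepted
try:
    stream.subscribe(lambda p: p)   # lambda: rejected
except ValueError as err:
    print(err)                      # prints "handler cannot be a lambda function"
```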
# setup.py
# -*- coding: utf-8 -*-
from setuptools import find_packages, setup

setup(
    name='FI',
    description='A library of common functions used in financial ' +
                'independence (FI, FIRE) calculations.',
    version='0.0.1',
    author='<NAME>',
    author_email='<EMAIL>',
    packages=find_packages(),
    py_modules=['fi'],
    entry_points={
        'console_scripts': [
            'annual_cost = fi_commands:run_annual_cost',
            'average_daily_spend = fi_commands:run_average_daily_spend',
            'buy_a_day_of_freedom = fi_commands:run_buy_a_day_of_freedom',
            'coast_fi = fi_commands:run_coast_fi',
            'cost_per_use = fi_commands:run_cost_per_use',
            'days_covered_by_fi = fi_commands:run_days_covered_by_fi',
            'fi_age = fi_commands:run_fi_age',
            'future_value = fi_commands:run_future_value',
            'redeem_chase_points = fi_commands:run_redeem_chase_points',
            'redeem_points = fi_commands:run_redeem_points',
            'rule_of_72 = fi_commands:run_rule_of_72',
            'take_home_pay = fi_commands:run_take_home_pay',
            'savings_rate = fi_commands:run_savings_rate',
            'spending_from_savings = fi_commands:run_spending_from_savings',
        ]
    },
    url='https://github.com/bbusenius/FI',
    license='MIT, see LICENSE.txt',
    include_package_data=True,
    install_requires=['numpy'],
    test_suite='tests',
    zip_safe=False,
)
LEXINGTON, Ky. -- Kentucky coach Mark Stoops has announced the hiring of Shannon Dawson as the Wildcats' offensive coordinator and quarterbacks coach. Dawson serves in the same capacities with West Virginia and will stay with the Mountaineers through their Dec. 29 Liberty Bowl game against Texas A&M. WVU's high-octane offense ranks 11th nationally at 502.1 yards per game, ninth in passing at nearly 315 yards and is averaging 33.2 points per contest. Stoops said in a release Friday that he was impressed with the Mountaineers' offensive balance under Dawson, adding, "I love the continuity he will bring to what we've been doing as we build on the progress we've made so far." Dawson will replace Neal Brown, who left after two seasons with the Wildcats to become Troy's head coach.
One of the oak trees at Toomer's Corner was set on fire following Auburn's win over LSU on Saturday and police arrested a 29-year-old Auburn resident on charges of desecration of a venerable object. Auburn police arrested Jochen Wiest, who was already in custody on charges of public intoxication and authorities say is "not affiliated with Auburn University," on Sunday afternoon. "Wiest was developed as a suspect and immediately taken into custody while still in the area on an unrelated charge of public intoxication," according to a press release from Auburn police. "Further investigation resulted in Wiest being identified as the individual responsible for setting the fire and a warrant was obtained for his arrest. He was arrested while incarcerated in the Lee County jail on the prior charge and his bond was set at $1000." Desecration of a venerable object is a Class A misdemeanor in the state of Alabama, which carries a penalty of up to one year in jail and up to a $6,000 fine. Surveillance footage appears to show a man Auburn police believe to be Wiest walk up to the oak tree on W. Magnolia Ave., which had to be replanted during the remodeling process that started in 2015, and set some of the toilet paper hanging from it on fire. The fire department extinguished the fire shortly after receiving calls at approximately 12:15 a.m., according to Auburn police. As of 1:45 a.m. Sunday, there were still embers smoldering in the tree enough to catch a small piece of toilet paper on fire. A bystander stomped the small fire out when the paper fell to the ground. Auburn University professor of horticulture Gary Keever inspected the tree Sunday morning.
Keever does not believe the fire has killed the tree but the extent of the damage may not be known for "several days to several weeks" if not longer. "At this time, we know that the upper canopy, the lower canopy on the northwest side, and the base of the trunk were burned," Keever told AL.com. "Based on the leaf curl and off-color of the foliage in parts of the canopy, these leaves will drop over the next several days. We plan to use a lift to inspect the shoots in the canopy early this week. "The full extent of the damage won't be known for several days to several weeks, and perhaps not until we see regrowth in spring. Based on the initial appearance of the tree, I don't think the fire has killed the tree; however, aesthetic death, when the tree declines to such an extent that it detracts from the landscape and there is little chance of it returning to its full grandeur, may warrant consideration of removal as a result of this act." An Auburn University spokesman told AL.com a lift would be used to assess the damage to the tree's canopy "as early as Monday." Fans left messages at Toomer's Corner on Sunday. Auburn athletic director Jay Jacobs posted a pair of tweets Sunday morning, saying it was "shocking and sad to wake up to see this fire on one of our Toomers Oaks." Auburn began allowing fans to roll the new oak trees, which were planted at Toomer's Corner in 2015, at the start of this season after suspending the tradition in 2013 due to the original oaks being poisoned in 2011. Fans rolled the new trees for the first time following Auburn's win over Arkansas State on Sept. 10. Amber Sutton contributed to this report.
Republican presidential candidates Ben Carson, Donald Trump and Rand Paul responded to a question about vaccines and autism at the GOP debate. (CNN) Donald Trump confidently strode on stage Wednesday night and explained to millions of Americans on live television his version of the heart-rending science of why so many children are being diagnosed with autism these days. "You take this little beautiful baby and you pump ... " he said, referring to mandatory childhood vaccines. "We had so many instances, people that work for me, just the other day, 2 years old, a beautiful child, went to have the vaccine and came back and a week later got a tremendous fever, got very, very sick, now is autistic." To be fair, as far as medical hypotheses go, Trump's idea is not completely crazy. Or at least it wouldn't be if this were still 1998. That year, a well-respected journal published a paper by researcher Andrew Wakefield and 12 of his colleagues linking a standard measles, mumps and rubella vaccine to autism. Despite its tiny sample size of 12 and its speculative conclusions, the study was publicized far and wide -- launching a global movement involving celebrities like Jenny McCarthy, Jim Carrey (and of course Trump) who warned parents to stop vaccinating their children. The result was what public health officials reported was a dangerous drop in MMR vaccinations. I am being proven right about massive vaccinations—the doctors lied. Save our children & their future. — Donald J.
Trump (@realDonaldTrump) September 3, 2014 Lots of autism and vaccine response. Stop these massive doses immediately. Go back to single, spread out shots! What do we have to lose. — Donald J. Trump (@realDonaldTrump) October 22, 2012 The problem: The study was an elaborate fraud. Editors of the Lancet, which published the original piece, discovered that Wakefield had been funded by attorneys for parents who were pursuing lawsuits against vaccine companies and that a number of elements of the paper were misreported. In February 2010, the journal retracted the piece, and an investigative piece in The BMJ in 2011 found even more shenanigans in the way the study was conducted. Some parents of children whom Wakefield reported as having autism said they did not, and others who were listed in the study as having no problems before the vaccine actually had had developmental issues. [Carly Fiorina: "I buried a child to drug addiction." How addiction is changing America.] Journalist Brian Deer wrote: "No case was free of misreporting or alteration. Taken together ... records cannot be reconciled with what was published, to such devastating effect, in the journal." Despite these revelations and reassurances from federal health officials and other experts that vaccines are safe, the public remained fearful. Much of the alarm came from the case of Hannah Poling -- whose condition after she received five vaccines at 19 months old seemed to confirm every parent's nightmare. Hannah's parents had described their child as interactive, playful and communicative before she got those shots but reported that after she got the vaccine, she developed problems with language, communication and behavior, features of autism spectrum disorder.
[The GOP's dangerous 'debate' on vaccines and autism] An article in the New England Journal of Medicine described the drama of what happened after her parents sued the Department of Health and Human Services (DHHS) for compensation under the Vaccine Injury Compensation Program (VICP) and won: On March 6, 2008, the Polings took their case to the public. Standing before a bank of microphones from several major news organizations, Jon Poling said that “the results in this case may well signify a landmark decision with children developing autism following vaccinations.” For years, federal health agencies and professional organizations had reassured the public that vaccines didn't cause autism. Now, with DHHS making this concession in a federal claims court, the government appeared to be saying exactly the opposite. Caught in the middle, clinicians were at a loss to explain the reasoning behind the VICP's decision. Hannah Poling with her father, Jon Poling, are seen before the start of a news conference on March 6, 2008, in Atlanta. (W.A. Harewood/AP) The issue became so controversial back then that dozens of studies were launched to address the question Wakefield posed. The research, published in top journals including JAMA, the New England Journal of Medicine, the Journal of Pediatric Infectious Diseases and the Journal of Autism and Developmental Disorders, is consistent and confident in its conclusions: There's no link between autism and vaccines. One of the largest was published in JAMA in April of this year and looked at 96,000 children in the United States and analyzed which ones got the shot and which ones were diagnosed with autism spectrum disorder. They found "no harmful association" between the two. Another large study, published in the New England Journal of Medicine in 2002, involved a half-million children in Denmark's health registry. Its takeaway: "This study provides strong evidence against the hypothesis that MMR vaccination causes autism." 
[All the ways the GOP candidates were wrong about vaccines] On Thursday, medical associations and patient advocacy groups decried Trump's remarks as false and potentially dangerous. The American Academy of Pediatrics said that "claims that vaccines are linked to autism, or are unsafe when administered according to the recommended schedule, have been disproven by a robust body of medical literature." Autism Speaks, a science and advocacy group, expressed similar sentiments, noting that "extensive research has asked whether there is any link between childhood vaccinations and autism." "The results of this research are clear: Vaccines do not cause autism," the organization said in a statement. The Centers for Disease Control and Prevention explicitly states that there is no link between vaccines and autism, that vaccine ingredients do not cause autism and that vaccines in general are very safe. It cites numerous studies, including a 2013 study that looked at the substances in vaccines that cause the body's immune system to produce disease-fighting antibodies and showed that the total amount received from vaccines was the same between children with autism and those without. The CDC said it has looked specifically into thimerosal, a mercury-based preservative used in multidose vials of vaccines that has been a source of concern among those who believe in an autism-vaccine link, and found no connection. A review in 2004 by the Institute of Medicine concluded that "the evidence favors rejection of a causal relationship between thimerosal-containing vaccines and autism." Today, most scientists believe there is no single cause of autism, but that genetics and abnormalities in brain structure or function may play a role. Vials of measles, mumps and rubella vaccine are displayed on a counter at a Walgreens Pharmacy on Jan. 26, 2015, in Mill Valley, Calif.
(Justin Sullivan/Getty Images)
Photo: Lauren Zaser for BuzzFeed

Cepeda worked as an electrician while she was serving, and she credits the Marines with helping to shape who she is as a woman today. While she said she values the skills she learned in the Marines, she's working toward a different goal when it comes to her civilian work. She's on track to graduate from Baruch College in December 2015 with a degree in communications, which she hopes to use for a career in public relations. "I'm not going to say that I want a job I love, because that's rare. But I want a job that I feel confident about. I want to wake up in the morning and say, 'Wow. I made it. I'm doing what I set out to do so many years ago,'" she said.
// NewExecuteSQL constructs a new ExecuteSQL struct instance, with // all (but only) the required parameters. Optional parameters // may be added using the builder-like methods below. // // https://chromedevtools.github.io/devtools-protocol/tot/Database/#method-executeSQL func NewExecuteSQL(databaseID string, query string) *ExecuteSQL { return &ExecuteSQL{ DatabaseID: databaseID, Query: query, } }
/**
 *  @brief Check WLAN Link Status & Reconnect if disconnected
 *
 *  @param wps_s        A pointer to WPS_DATA structure
 *  @param pwps_info    A pointer to WPS_INFO structure
 *  @param reconnected  A pointer to variable to indicate if STA re-connected
 *  @return             1-connected, 0-not connected
 */
int
wps_sta_check_link_active(WPS_DATA * wps_s, PWPS_INFO pwps_info, int *reconnected)
{
    int link_active = 0;
    u8 retry_count = AP_CONNNECT_RETRY_CNT;
    u8 bssid_get[ETH_ALEN];

    *reconnected = 0;
    memset(bssid_get, 0x00, ETH_ALEN);
    wps_wlan_get_wap(bssid_get);
    if (memcmp(bssid_get, wps_s->current_ssid.bssid, ETH_ALEN) == 0) {
        link_active = 1;
        return link_active;
    }

    *reconnected = 1;
    wps_wlan_session_control(WPS_SESSION_ON);
    do {
        wps_printf(DEBUG_WPS_STATE,
                   "\nConnection lost, try to re-connect to AP ..... \n");
        if (wps_wlan_set_wap((u8 *) wps_s->current_ssid.bssid)) {
            printf("Re-Connect to AP Failed\n");
        }
        retry_count--;
        tx_thread_sleep(10);
        memset(bssid_get, 0x00, ETH_ALEN);
        wps_wlan_get_wap(bssid_get);
        if (memcmp(bssid_get, wps_s->current_ssid.bssid, ETH_ALEN) == 0) {
            link_active = 1;
            break;
        }
    } while (retry_count != 0);
    wps_wlan_session_control(WPS_SESSION_OFF);

    return link_active;
}
use bevy::{prelude::Query, sprite::Sprite};

use crate::component::{Collider, Player};

pub fn update_player_size(mut player_query: Query<(&mut Sprite, &mut Collider, &Player)>) {
    for (mut sprite, mut collider, player) in player_query.iter_mut() {
        collider.size.x = player.area as f32;
        collider.size.y = player.area as f32;
        sprite.custom_size = Some(collider.size);
    }
}
#!/usr/bin/env python3
# -*- coding:utf-8 -*-
# author: bigfoolliu


"""
Given the head of a linked list, remove the n-th node from the end of the list and return its head.

Follow-up: can you do it in a single pass?

Example 1:
Input: head = [1,2,3,4,5], n = 2
Output: [1,2,3,5]

Example 2:
Input: head = [1], n = 1
Output: []

Example 3:
Input: head = [1,2], n = 1
Output: [1]

Constraints:
The number of nodes in the list is sz.
1 <= sz <= 30
0 <= Node.val <= 100
1 <= n <= sz

Source: LeetCode
Link: https://leetcode-cn.com/problems/remove-nth-node-from-end-of-list
Copyright belongs to LeetCode. For commercial reprints please contact the official site for authorization; for non-commercial reprints please cite the source.
"""

import doctest


# Definition for singly-linked list.
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next


class Solution:
    """
    >>> s = Solution()
    """

    def removeNthFromEnd(self, head: ListNode, n: int) -> ListNode:
        """
        Two pointers:
        move pointer A forward n steps first, then start moving pointer B.
        When A reaches null, B sits just before the n-th node from the end,
        so pointing B's next at its next-next node completes the deletion.
        """
        # Use a dummy node to avoid special-casing the head
        pre = ListNode(-1)
        pre.next = head

        fast = pre  # fast pointer
        slow = pre  # slow pointer

        # Advance the fast pointer; the two pointers must be n nodes apart,
        # so fast has to move n + 1 steps
        for _ in range(n + 1):
            fast = fast.next

        # With the pointers n nodes apart, move both until fast runs off the end
        while fast:
            fast = fast.next
            slow = slow.next
        slow.next = slow.next.next
        return pre.next

    def removeNthFromEnd2(self, head: ListNode, n: int) -> ListNode:
        """
        Straightforward approach:
        1. traverse to the tail to learn the list length
        2. compute how far to walk from the head
        3. walk there and skip the n-th node from the end
        """
        pre = ListNode(-1)  # dummy head node
        pre.next = head

        l, cur = 0, head  # length
        # Traverse to the tail of the list first
        while cur:
            l += 1
            cur = cur.next

        cur = pre  # go back to the head
        for _ in range(l - n):
            cur = cur.next
        # Skip the n-th node from the end
        cur.next = cur.next.next
        return pre.next


if __name__ == '__main__':
    doctest.testmod()
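As a quick end-to-end check of the two-pointer approach above, here is a self-contained sketch. The `from_list`/`to_list` helpers and the repeated `ListNode` definition are just for this example, not part of the original file:

```python
class ListNode:
    """Singly-linked list node, repeated here so the snippet runs on its own."""
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next


def from_list(values):
    """Build a linked list from a Python list (illustrative helper)."""
    dummy = ListNode(-1)
    cur = dummy
    for v in values:
        cur.next = ListNode(v)
        cur = cur.next
    return dummy.next


def to_list(head):
    """Flatten a linked list back into a Python list (illustrative helper)."""
    out = []
    while head:
        out.append(head.val)
        head = head.next
    return out


def remove_nth_from_end(head, n):
    """Same two-pointer deletion as Solution.removeNthFromEnd above."""
    pre = ListNode(-1)           # dummy node: avoids special-casing the head
    pre.next = head
    fast = slow = pre
    for _ in range(n + 1):       # put n nodes between fast and slow
        fast = fast.next
    while fast:                  # advance both until fast falls off the end
        fast = fast.next
        slow = slow.next
    slow.next = slow.next.next   # slow now precedes the node to delete
    return pre.next


print(to_list(remove_nth_from_end(from_list([1, 2, 3, 4, 5]), 2)))  # [1, 2, 3, 5]
print(to_list(remove_nth_from_end(from_list([1]), 1)))              # []
```

Note the dummy node: it makes the n == length case (deleting the head) fall out of the same code path instead of needing a separate branch.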
/**
 * Copyright (C) 2012 - present by OpenGamma Inc. and the OpenGamma group of companies
 * Copyright (C) 2015 - present by <NAME>oores Software Limited.
 *
 * Please see distribution for license.
 */
package com.opengamma.engine.function.blacklist;

import java.net.URI;
import java.util.concurrent.ExecutorService;

import javax.ws.rs.core.UriBuilder;

import org.fudgemsg.FudgeMsg;
import org.fudgemsg.mapping.FudgeDeserializer;

import com.opengamma.util.jms.JmsConnector;

/**
 * Provides remote access to a {@link ManageableFunctionBlacklistProvider}.
 */
public class RemoteManageableFunctionBlacklistProvider extends RemoteFunctionBlacklistProvider implements ManageableFunctionBlacklistProvider {

  public RemoteManageableFunctionBlacklistProvider(final URI baseUri, final ExecutorService backgroundTasks, final JmsConnector jmsConnector) {
    super(baseUri, backgroundTasks, jmsConnector);
  }

  @Override
  protected ManageableFunctionBlacklist createBlacklist(final FudgeDeserializer fdc, final FudgeMsg info) {
    return new RemoteManageableFunctionBlacklist(fdc, info, this);
  }

  @Override
  public ManageableFunctionBlacklist getBlacklist(final String identifier) {
    return (ManageableFunctionBlacklist) super.getBlacklist(identifier);
  }

  protected void add(final String blacklist, final FudgeMsg request) {
    accessRemote(UriBuilder.fromUri(getBaseUri()).path("name/{name}/add").build(blacklist)).post(request);
  }

  protected void remove(final String blacklist, final FudgeMsg request) {
    accessRemote(UriBuilder.fromUri(getBaseUri()).path("name/{name}/remove").build(blacklist)).post(request);
  }

}
/** * @param productName the product to check * @return true if the product exists */ public boolean assetNameExists(String productName) { for (AssetDefinition assetDefinition : this.assetDefinitions) if (assetDefinition.getName().equalsIgnoreCase(productName)) return true; return false; }
package nest

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io/ioutil"
	"log"
	"net/http"
	"time"

	influxdb2 "github.com/influxdata/influxdb-client-go/v2"
	"github.com/influxdata/influxdb-client-go/v2/api"

	"github.com/blakehartshorn/go-nest-temp-monitor/configuration"
)

// JSON types

// Authorization - unpack access_token
type Authorization struct {
	AccessToken string `json:"access_token"`
}

// Devices - root of the device list response
type Devices struct {
	Device []Device `json:"devices"`
}

// Device - Individual devices and their descriptions
type Device struct {
	Name            string `json:"name"`
	Type            string `json:"type"`
	Assignee        string `json:"assignee"`
	Traits          Traits `json:"traits"`
	ParentRelations []struct {
		DisplayName string `json:"displayName"`
		Parent      string `json:"parent"`
	} `json:"parentRelations"`
}

// Traits - traits per device json object
type Traits struct {
	Info struct {
		CustomName string `json:"customName"`
	} `json:"sdm.devices.traits.Info"`
	Humidity struct {
		Percent int `json:"ambientHumidityPercent"`
	} `json:"sdm.devices.traits.Humidity"`
	Connectivity struct {
		Status string `json:"status"`
	} `json:"sdm.devices.traits.Connectivity"`
	ThermostatMode struct {
		Mode string `json:"mode"`
	} `json:"sdm.devices.traits.ThermostatMode"`
	ThermostatEco struct {
		Mode string  `json:"mode"`
		Heat float64 `json:"heatCelsius"`
		Cool float64 `json:"coolCelsius"`
	} `json:"sdm.devices.traits.ThermostatEco"`
	ThermostatHvac struct {
		Status string `json:"status"`
	} `json:"sdm.devices.traits.ThermostatHvac"`
	ThermostatTemperatureSetpoint struct {
		Heat float64 `json:"heatCelsius"`
		Cool float64 `json:"coolCelsius"`
	} `json:"sdm.devices.traits.ThermostatTemperatureSetpoint"`
	Temperature struct {
		Ambient float64 `json:"ambientTemperatureCelsius"`
	} `json:"sdm.devices.traits.Temperature"`
}

// Global Variables
var (
	AccessToken  string
	ClientID     string
	ClientSecret string
	Interval     time.Duration
	ProjectID    string
	RedirectURI  string
	RefreshToken string
)

// Initialize - Set initial values from config
func Initialize(config configuration.NestConfig) { ProjectID = config.ProjectID ClientID = config.ClientID ClientSecret = config.ClientSecret Interval = time.Duration(config.Interval) RefreshToken = config.RefreshToken } // RefreshLogin - Routinely fetch a new authentication token func RefreshLogin() { httpClient := &http.Client{Timeout: time.Second * 10} postData, _ := json.Marshal(map[string]string{ "client_id": ClientID, "client_secret": ClientSecret, "grant_type": "refresh_token", "redirect_uri": RedirectURI, "refresh_token": RefreshToken, }) for { log.Println("Getting new Google access_token") res, err := httpClient.Post( "https://www.googleapis.com/oauth2/v4/token", "application/json", bytes.NewBuffer(postData), ) if err != nil { log.Printf("ERROR: Could not login to Google (%s)\n", err) time.Sleep(time.Minute * 5) continue } var authData Authorization body, _ := ioutil.ReadAll(res.Body) err = json.Unmarshal(body, &authData) res.Body.Close() if err != nil { log.Println("ERROR: Invalid response object from Google") log.Fatal(err) } AccessToken = fmt.Sprintf("Bearer %s", authData.AccessToken) time.Sleep(time.Minute * 45) } } // WriteNest - parse and write thermostat data to influx func WriteNest(influx api.WriteAPI) { url := fmt.Sprintf( "https://smartdevicemanagement.googleapis.com/v1/enterprises/%s/devices", ProjectID, ) for { req, _ := http.NewRequest("GET", url, nil) req.Header.Set("Content-Type", "application/json") req.Header.Set("Authorization", AccessToken) httpClient := &http.Client{Timeout: time.Second * 10} res, err := httpClient.Do(req) if err != nil { log.Printf("ERROR: Could not get device info from server (%s)\n", err) time.Sleep(time.Minute * Interval) continue } var NestDevices Devices body, _ := ioutil.ReadAll(res.Body) err = json.Unmarshal(body, &NestDevices) res.Body.Close() if err != nil { log.Print(string(body)) log.Println("ERROR: Invalid json.") log.Fatal(err) } wCount := 0 for _, device := range NestDevices.Device { var Tags = 
make(map[string]string) var Fields = make(map[string]interface{}) Tags["name"] = device.Name Tags["assignee"] = device.Assignee Tags["customName"] = device.Traits.Info.CustomName for _, parent := range device.ParentRelations { if device.Assignee == parent.Parent { Tags["displayName"] = parent.DisplayName break } } if device.Traits.Connectivity.Status == "ONLINE" && device.Type == "sdm.devices.types.THERMOSTAT" { Fields["humidity"] = device.Traits.Humidity.Percent Fields["temperature"] = device.Traits.Temperature.Ambient if device.Traits.ThermostatEco.Mode == "MANUAL_ECO" { Fields["heat"] = device.Traits.ThermostatEco.Heat Fields["cool"] = device.Traits.ThermostatEco.Cool Tags["mode"] = "MANUAL_ECO" } else if device.Traits.ThermostatMode.Mode == "HEATCOOL" { Fields["heat"] = device.Traits.ThermostatTemperatureSetpoint.Heat Fields["cool"] = device.Traits.ThermostatTemperatureSetpoint.Cool Tags["mode"] = "HEATCOOL" } else if device.Traits.ThermostatMode.Mode == "HEAT" { Fields["heat"] = device.Traits.ThermostatTemperatureSetpoint.Heat Tags["mode"] = "HEAT" } else if device.Traits.ThermostatMode.Mode == "COOL" { Fields["cool"] = device.Traits.ThermostatTemperatureSetpoint.Cool Tags["mode"] = "COOL" } if device.Traits.ThermostatHvac.Status == "OFF" { Fields["hvac"] = int8(0) } else { Fields["hvac"] = int8(1) } p := influxdb2.NewPoint("nest", Tags, Fields, time.Now()) influx.WritePoint(p) wCount++ } } log.Printf("Wrote %d thermostat metrics. Sleeping for %d minute(s).\n", wCount, Interval) time.Sleep(time.Minute * Interval) } }
//
//  M A R I A D B + +
//
//          Copyright <NAME> 2013,
//                    <NAME> 2015,
//          The ViaDuck Project 2016 - 2018.
// Distributed under the Boost Software License, Version 1.0.
// (See accompanying file LICENSE or copy at
//          http://www.boost.org/LICENSE_1_0.txt)

#ifndef _MARIADB_STATEMENT_HPP_
#define _MARIADB_STATEMENT_HPP_

#include <mariadb++/last_error.hpp>
#include <mariadb++/result_set.hpp>

#define MAKE_SETTER_SIG(nm, type, fq) void fq set_##nm(u32 index, type value)
#define MAKE_SETTER_INT(nm, type, fq) void fq _set_body_##nm(bind &bind, type value)

#define MAKE_SETTER_DECL(nm, type) \
    MAKE_SETTER_SIG(nm, type, );   \
    MAKE_SETTER_INT(nm, type, )

#define MAKE_SETTER(nm, type)                                    \
    MAKE_SETTER_SIG(nm, type, statement::) {                     \
        if (index >= m_data->m_bind_count)                       \
            throw std::out_of_range("Field index out of range"); \
                                                                 \
        bind &bind = *m_data->m_binds.at(index);                 \
        _set_body_##nm(bind, value);                             \
    }                                                            \
    MAKE_SETTER_INT(nm, type, statement::)

namespace mariadb {
class connection;
class worker;
class result_set;

typedef std::shared_ptr<connection> connection_ref;

/**
 * Class representing a prepared statement with binding functionality
 */
class statement : public last_error {
    friend class connection;
    friend class result_set;
    friend class worker;

   public:
    statement() = delete;

    /**
     * Execute the query and return the number of rows affected
     *
     * @return Number of rows affected or zero on error
     */
    u64 execute();

    /**
     * Execute the query and return the last insert id
     *
     * @return Last insert ID or zero on error
     */
    u64 insert();

    /**
     * Execute the query and return a result set
     *
     * @return Result set containing a result or an empty set on error
     */
    result_set_ref query();

    /**
     * Set connection ref, used by concurrency
     */
    void set_connection(connection_ref &connection);

    // declare all setters
    MAKE_SETTER_DECL(blob, stream_ref);
    MAKE_SETTER_DECL(date_time, const date_time &);
    MAKE_SETTER_DECL(date, const date_time &);
    MAKE_SETTER_DECL(time, const time &);
    MAKE_SETTER_DECL(data, const
data_ref &); MAKE_SETTER_DECL(decimal, const decimal &); MAKE_SETTER_DECL(string, const std::string &); MAKE_SETTER_DECL(boolean, bool); MAKE_SETTER_DECL(unsigned8, u8); MAKE_SETTER_DECL(signed8, s8); MAKE_SETTER_DECL(unsigned16, u16); MAKE_SETTER_DECL(signed16, s16); MAKE_SETTER_DECL(unsigned32, u32); MAKE_SETTER_DECL(signed32, s32); MAKE_SETTER_DECL(unsigned64, u64); MAKE_SETTER_DECL(signed64, s64); MAKE_SETTER_DECL(float, f32); MAKE_SETTER_DECL(double, f64); void set_null(u32 index); private: /** * Private constructor used by connection */ statement(connection *conn, const std::string &query); // reference to parent connection connection_ref m_connection; // non-owning pointer to parent connection connection *m_parent; // reference to internal data, shared with all results statement_data_ref m_data; }; typedef std::shared_ptr<statement> statement_ref; } // namespace mariadb #endif
//! Gets an attribute as binary data
//! \param index: Index value, must be between 0 and getAttributeCount()-1.
void CAttributes::getAttributeAsBinaryData(s32 index, void* outData, s32 maxSizeInBytes) const
{
    if ((u32)index < Attributes.size())
        Attributes[index]->getBinary(outData, maxSizeInBytes);
}
package com.hankcs.nlp.collection.trie.bintrie; public class _EmptyValueArray<V> extends _ValueArray<V> { public _EmptyValueArray() { } @Override public V nextValue() { return null; } }
/**
 * Attack another creature.
 */
public class ActionAttack implements Action {
    /**
     * The ID of the creature which we are attacking.
     */
    public final int id;

    /**
     * Constructor which records the ID of the creature to attack.
     */
    public ActionAttack(final int id) {
        this.id = id;
    }

    /**
     * {@inheritDoc}
     */
    public final double energyCost() {
        return 0.0;
    }

    /**
     * {@inheritDoc}
     */
    public final double massCost() {
        return 0.0;
    }

    /**
     * {@inheritDoc}
     */
    public String toString() {
        return "Attack<" + id + ">";
    }
}
def make_supervisions(
    sgml_path: Pathlike, recording: Recording
) -> Dict[str, List[SupervisionSegment]]:
    doc = try_parse(sgml_path)
    episode = doc.find("episode")
    section_supervisions = []
    text_supervisions = []
    text_idx = 0
    for sec_idx, section in enumerate(doc.find("episode").find_all("section")):
        sec_start = float(section.attrs["starttime"])
        section_supervisions.append(
            SupervisionSegment(
                id=f"{recording.id}_section{sec_idx:03d}",
                recording_id=recording.id,
                start=sec_start,
                duration=round(float(section.attrs["endtime"]) - sec_start, ndigits=3),
                channel=0,
                language=episode.attrs["language"],
                custom={
                    "section": section.attrs["type"],
                    "program": episode.attrs["program"],
                },
            )
        )
        for turn in section.find_all("turn"):
            # A typical <turn> block in the SGML source looks like:
            #   <turn speaker=Peter_Jennings spkrtype=male startTime=336.704 endTime=338.229>
            #   <overlap startTime=336.704 endTime=337.575>
            #   <time sec=336.704>
            #   time served up until
            #   </overlap>
            #   <time sec=337.575>
            #   this point?
            #   </turn>
            for child in turn.children:
                # Here, we switch to custom parsing code as explained at the top
                # of this script.
                lines = [
                    l
                    for l in str(child).split("\n")
                    if len(l) and not any(l.startswith(b) for b in EXCLUDE_BEGINNINGS)
                ]
                if not lines:
                    continue
                times = []
                texts = []
                for time_marker, text in group_lines_in_time_marker(lines):
                    match = re.search(r'sec="?(\d+\.?\d*)"?', time_marker)
                    times.append(float(match.group(1)))
                    texts.append(text)
                times.append(float(turn.attrs["endtime"]))
                # Having parsed the current section into start/end times and text
                # for individual speech segments, create a SupervisionSegment for
                # each one.
                for (start, end), text in zip(sliding_window(2, times), texts):
                    text_supervisions.append(
                        SupervisionSegment(
                            id=f"{recording.id}_segment{text_idx:04d}",
                            recording_id=recording.id,
                            start=start,
                            duration=round(end - start, ndigits=8),
                            channel=0,
                            language=episode.attrs["language"],
                            text=text.strip(),
                            speaker=turn.attrs["speaker"],
                            gender=turn.attrs["spkrtype"],
                        )
                    )
                    text_idx += 1
    return {"sections": section_supervisions, "segments": text_supervisions}
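For intuition on the pairing step: `sliding_window(2, times)` (imported in the original from a library such as `toolz`; the re-implementation below is just for illustration) turns the list of time marks into overlapping (start, end) pairs, which zip one-to-one with the segment texts. The sample values below are made up to mirror the SGML excerpt shown in the comments:

```python
from itertools import islice


def sliding_window(n, seq):
    """Yield overlapping n-tuples: [a, b, c] -> (a, b), (b, c) for n == 2.
    Minimal stand-in for toolz.sliding_window, for illustration only."""
    it = iter(seq)
    window = tuple(islice(it, n))
    if len(window) == n:
        yield window
    for x in it:
        window = window[1:] + (x,)
        yield window


# Hypothetical turn: two <time> marks plus the turn's endTime appended last,
# exactly as the parsing loop builds `times` and `texts`.
times = [336.704, 337.575, 338.229]
texts = ["time served up until", "this point?"]

segments = [
    (start, round(end - start, ndigits=3), text)
    for (start, end), text in zip(sliding_window(2, times), texts)
]
print(segments)
# [(336.704, 0.871, 'time served up until'), (337.575, 0.654, 'this point?')]
```

Each (start, end) window delimits one speech segment; the final window ends at the turn's end time, which is why the code appends `turn.attrs["endtime"]` before pairing.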
Age: 25
Height: 5ft 2in
Hometown: Blackpool
Lives: Leeds

CV:
Emmerdale - Jasmine Thomas (2005-2009)
Waterloo Road - Lindsay James (2009)
Room at the Top - Susan Brown (2010)
Captain America: The First Avenger - Connie (2011)
Dancing on the Edge - Rosie Williams (2012)
Imaginary Forces - Ellen (2012)
Titanic - Annie Desmond (2012)

Awards:
Nominated for best newcomer at the 2006 British Soap Awards
Nominated for most popular newcomer at the 2006 National Television Awards
Nominated for best dramatic performance at the 2009 British Soap Awards

Biography: New Doctor Who companion Jenna-Louise Coleman is best known for appearing in Emmerdale and Waterloo Road. Born in Blackpool, the 25-year-old has been acting since 2005, when, while applying for a place at drama school, she was spotted by Emmerdale's producers and given the role of Jasmine Thomas in the soap. Her performance in the drama, which saw her murdering her on-screen boyfriend and embarking on a lesbian affair, earned her best newcomer award nominations in 2006 and saw her tipped for best dramatic performance at the 2009 British Soap Awards. In the same year, she left Emmerdale and took a role in the BBC1 drama Waterloo Road, set in a school. Leaving after making appearances in nine episodes, Jenna-Louise then began working in films, appearing in a TV adaptation of John Braine's Room at the Top in 2010 and landing a role in the Hollywood blockbuster Captain America: The First Avenger in 2011. She'll soon be seen playing "cheeky little cockney" stewardess Annie Desmond in Julian Fellowes's Titanic, which will begin screening on ITV1 from Sunday 25 March, and has also filmed her part in Dancing on the Edge, an upcoming five-part Stephen Poliakoff series for BBC2.
/** * Deletes a node in the middle of a single linked list, given * only access to that node. * @param toDelete Node to be deleted */ public static boolean deleteAGivenNode(Node toDelete) { if(toDelete == null) { return false; } Node temp = toDelete.getNext(); if(temp != null) { toDelete.setData(temp.getData()); toDelete.setNext(temp.getNext()); return true; } else { System.out.println("Since it is the last node in the linked list, it cannot be deleted."); return false; } }
export { default as ImageSizeParser } from './ImageSizeParser';
export { default as ImageThumbParser } from './ImageThumbParser';
export { default as PdfParser } from './PdfParser';
export { default as TextParser } from './TextParser';
export { default as ZipParser } from './ZipParser';
package client import ( "k8s.io/apimachinery/pkg/util/intstr" ) const ( RollingUpdateDaemonSetType = "rollingUpdateDaemonSet" RollingUpdateDaemonSetFieldMaxUnavailable = "maxUnavailable" ) type RollingUpdateDaemonSet struct { MaxUnavailable intstr.IntOrString `json:"maxUnavailable,omitempty" yaml:"maxUnavailable,omitempty"` }
// logMethod logs a method. func (middleware loggingMiddlewareCore) logMethod( ctx context.Context, methodLogger zerolog.Logger, args interface{}, results interface{}, err error, ) { var event *zerolog.Event var eventLevel zerolog.Level if err != nil { event = methodLogger.Err(err).Stack() eventLevel = zerolog.ErrorLevel } else { event = methodLogger.WithLevel(middleware.SuccessLogLevel) eventLevel = middleware.SuccessLogLevel } if !event.Enabled() { return } methodInfo := amqpmiddleware.GetMethodInfo(ctx) event.Int("OP_ATTEMPT", methodInfo.OpAttempt) if middleware.LogArgsResultsLevel <= eventLevel { event.Interface("zARGS", args) if err == nil && results != nil { event.Interface("zRESULTS", results) } } event.Timestamp().Send() }
/**
 * Load the attributes of the object from a file.
 *
 * @return the tree of object attributes to be displayed on the interface,
 * as a CustomTreeNode.
 */
private CustomTreeNode loadObjectAttributes() {
    if (new File(Constants.OBJECT_ATTRIBUTES_PATH).exists()) {
        try (FileInputStream fin = new FileInputStream(Constants.OBJECT_ATTRIBUTES_PATH);
                ObjectInputStream ois = new ObjectInputStream(fin)) {
            objectAttributes = (CustomTreeNode) ois.readObject();
        } catch (IOException | ClassNotFoundException ex) {
            String msg = "Error reading the attributes from the file";
            log.error(msg);
            log.debug("{} {}", msg, ex);
        }
    }
    return objectAttributes;
}
/// Two points in a source file.
class location {
   public:
    typedef position::filename_type filename_type;
    typedef position::counter_type counter_type;

    location(const position& b, const position& e)
        : begin(b)
        , end(e) {}

    explicit location(const position& p = position())
        : begin(p)
        , end(p) {}

    explicit location(filename_type* f, counter_type l = 1, counter_type c = 1)
        : begin(f, l, c)
        , end(f, l, c) {}

    void initialize(filename_type* f = YY_NULLPTR, counter_type l = 1, counter_type c = 1) {
        begin.initialize(f, l, c);
        end = begin;
    }

   public:
    void step() { begin = end; }

    void columns(counter_type count = 1) { end += count; }

    void lines(counter_type count = 1) { end.lines(count); }

   public:
    position begin;
    position end;
};
// Package shifter provide simple routing by dividing path into its segment
package shifter

// Shifter hold state of shifting segment in the path
type Shifter struct {
	tag  map[string]int
	list []string
	next int
}

// New return new Shifter with the key
func New(list []string) *Shifter {
	return &Shifter{
		tag:  nil,
		list: list,
		next: 0,
	}
}

// Reset the shifter
func (s *Shifter) Reset() {
	s.tag = nil
	s.next = 0
}

// SetNext set the index for the next Shift.
func (s *Shifter) SetNext(next int) {
	if next < 0 {
		next = 0
	}
	if size := s.Size(); next > size {
		next = size
	}
	s.next = next
}

// Shift to next segment, also telling if already in last segment
func (s *Shifter) Shift() (string, bool) {
	if s.End() {
		return "", true
	}
	ret := s.list[s.next]
	s.next++
	return ret, s.End()
}

// Unshift do reverse of Shift
func (s *Shifter) Unshift() {
	if s.next == 0 {
		return
	}
	s.next--
}

// Get i-th segment
func (s *Shifter) Get(i int) string {
	if i < 0 || i >= s.Size() {
		return ""
	}
	return s.list[i]
}

// GetRelative is same with Get, but relative to current segment
func (s *Shifter) GetRelative(d int) string {
	return s.Get(s.CurrentIndex() + d)
}

// Size return the size of segment in path
func (s *Shifter) Size() int {
	return len(s.list)
}

// CurrentIndex of shifter state
func (s *Shifter) CurrentIndex() int {
	return s.next - 1
}

// End indicated end segment in the path
func (s *Shifter) End() bool {
	return s.next == s.Size()
}

// Split return processed segment and rest of them
func (s *Shifter) Split() (done []string, rest []string) {
	done = make([]string, s.next)
	rest = make([]string, s.Size()-s.next)
	copy(done, s.list[:s.next])
	copy(rest, s.list[s.next:])
	return done, rest
}

// Tag current segment
func (s *Shifter) Tag(tag string) {
	s.TagIndex(s.CurrentIndex(), tag)
}

func (s *Shifter) lazyInitTag() {
	if s.tag == nil {
		s.tag = make(map[string]int)
	}
}

// TagIndex will tag i-th segment
func (s *Shifter) TagIndex(i int, tag string) {
	if i < 0 || i >= s.Size() {
		return
	}
s.lazyInitTag() s.tag[tag] = i } // TagRelative is same with TagIndex, but relative to current segment func (s *Shifter) TagRelative(d int, tag string) { s.TagIndex(s.CurrentIndex()+d, tag) } // DeleteTag delete tag func (s *Shifter) DeleteTag(tag string) { s.lazyInitTag() delete(s.tag, tag) } // ClearTag clear all tags on index func (s *Shifter) ClearTag(index int) { s.lazyInitTag() var what []string for k, v := range s.tag { if v == index { what = append(what, k) } } for _, v := range what { delete(s.tag, v) } } // GetByTag return tagged segment func (s *Shifter) GetByTag(tag string) (string, bool) { s.lazyInitTag() i, ok := s.tag[tag] if !ok { return "", false } return s.list[i], true }
import type { BadgePreset, Sponsor, Sponsorship } from './types' import type { SponsorkitConfig } from '.' export function genSvgImage(x: number, y: number, size: number, url: string) { return `<image x="${x}" y="${y}" width="${size}" height="${size}" xlink:href="${url}"/>` } export function generateBadge( x: number, y: number, sponsor: Sponsor, { size, displayName: showName, textColor = '#333333', nameLength = 12, classes = 'sponsors-avatar' }: BadgePreset, ) { const { login, avatarUrl } = sponsor let name = (sponsor.name || sponsor.login).trim() const url = sponsor.linkUrl || `https://github.com/${sponsor.login}` if (name.length > nameLength) { if (name.includes(' ')) name = name.split(' ')[0] else name = `${name.slice(0, nameLength - 3)}...` } return ` <a xlink:href="${url}" class="${classes}" target="_blank" id="${login}"> ${showName ? `<text x="${x + size / 2}" y="${y + size + 18}" text-anchor="middle" class="name" fill="${textColor}">${name}</text>` : ''} ${genSvgImage(x, y, size, avatarUrl)} </a>`.trim() } export class SvgComposer { height = 0 body = '' constructor(public readonly config: Required<SponsorkitConfig>) {} addSpan(height = 0) { this.height += height return this } addTitle(text: string, classes = 'tier-title') { return this.addText(text, classes) } addText(text: string, classes = 'text') { this.body += `<text x="${this.config.width / 2}" y="${this.height}" text-anchor="middle" class="${classes}">${text}</text>` this.height += 20 return this } addRaw(svg: string) { this.body += svg return this } addLine(sponsors: Sponsorship[], config: BadgePreset) { const offsetX = (this.config.width - sponsors.length * config.boxWidth) / 2 + (config.boxWidth - config.size) / 2 this.body += sponsors .map((s, i) => { const x = offsetX + config.boxWidth * i const y = this.height return generateBadge(x, y, s.sponsor, config) }) .join('\n') this.height += config.boxHeight } addSponsorGrid(sponsors: Sponsorship[], preset: BadgePreset) { const perLine = 
Math.floor((this.config.width - preset.sidePadding * 2) / preset.boxWidth) new Array(Math.ceil(sponsors.length / perLine)) .fill(0) .forEach((_, i) => { this.addLine(sponsors.slice(i * perLine, (i + 1) * perLine), preset) }) return this } generateSvg() { return ` <svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="${this.config.width}" height="${this.height}"> <!-- Generated by https://github.com/antfu/sponsorskit --> <style>${this.config.svgInlineCSS}</style> ${this.body} </svg> ` } }
#include "TrackingTools/GsfTools/interface/MultiTrajectoryStateAssembler.h" #include "TrackingTools/GsfTools/interface/GetComponents.h" #include "TrackingTools/GsfTools/interface/BasicMultiTrajectoryState.h" #include "TrackingTools/GsfTools/src/TrajectoryStateLessWeight.h" #include "FWCore/MessageLogger/interface/MessageLogger.h" #include "FWCore/Utilities/interface/Exception.h" MultiTrajectoryStateAssembler::MultiTrajectoryStateAssembler() : combinationDone(false), thePzError(false), theValidWeightSum(0.), theInvalidWeightSum(0.) { // // parameters (could be configurable) // sortStates = false; minValidFraction = 0.01; minFractionalWeight = 1.e-6; // 4; } void MultiTrajectoryStateAssembler::addState(const TrajectoryStateOnSurface tsos) { // // refuse to add states after combination has been done // if (combinationDone) throw cms::Exception("LogicError") << "MultiTrajectoryStateAssembler: trying to add states after combination"; // // Verify validity of state to be added // if (!tsos.isValid()) throw cms::Exception("LogicError") << "MultiTrajectoryStateAssembler: trying to add invalid state"; // // Add components (i.e. state to be added can be single or multi state) // GetComponents comps(tsos); const MultiTSOS &components(comps()); addStateVector(components); } void MultiTrajectoryStateAssembler::addStateVector(const MultiTSOS &states) { // // refuse to add states after combination has been done // if (combinationDone) throw cms::Exception("LogicError") << "MultiTrajectoryStateAssembler: trying to add states after combination"; // // sum up weights (all components are supposed to be valid!!!) and // check for consistent pz // double sum(0.); double pzFirst = theStates.empty() ? 0. 
: theStates.front().localParameters().pzSign(); for (MultiTSOS::const_iterator i = states.begin(); i != states.end(); i++) { if (!(i->isValid())) throw cms::Exception("LogicError") << "MultiTrajectoryStateAssembler: trying to add invalid state"; // weights sum += i->weight(); // check on p_z if (!theStates.empty() && pzFirst * i->localParameters().pzSign() < 0.) thePzError = true; } theValidWeightSum += sum; // // add to vector of states // theStates.insert(theStates.end(), states.begin(), states.end()); } void MultiTrajectoryStateAssembler::addInvalidState(const double weight) { // // change status of combination (contains at least one invalid state) // theInvalidWeightSum += weight; } TrajectoryStateOnSurface MultiTrajectoryStateAssembler::combinedState() { // // Prepare resulting state vector // if (!prepareCombinedState()) return TSOS(); // // If invalid states in input: use reweighting // if (theInvalidWeightSum > 0.) return reweightedCombinedState(theValidWeightSum + theInvalidWeightSum); // // Return new multi state without reweighting // return TSOS((BasicTrajectoryState *)(new BasicMultiTrajectoryState(theStates))); } TrajectoryStateOnSurface MultiTrajectoryStateAssembler::combinedState(const float newWeight) { // // Prepare resulting state vector // if (!prepareCombinedState()) return TSOS(); // // return reweighted state // return reweightedCombinedState(newWeight); } bool MultiTrajectoryStateAssembler::prepareCombinedState() { // // Protect against empty combination (no valid input state) // if (invalidCombinedState()) return false; // // Check for states with wrong pz // if (thePzError) removeWrongPz(); // // Check for minimum fraction of valid states // double allWeights(theValidWeightSum + theInvalidWeightSum); if (theInvalidWeightSum > 0. 
&& theValidWeightSum < minValidFraction * allWeights) return false; // // remaining part to be done only once // if (combinationDone) return true; combinationDone = true; // // Remove states with negligible weights // removeSmallWeights(); if (invalidCombinedState()) return false; // // Sort output by weights? // if (sortStates) sort(theStates.begin(), theStates.end(), TrajectoryStateLessWeight()); return true; } TrajectoryStateOnSurface MultiTrajectoryStateAssembler::reweightedCombinedState(const double newWeight) const { // // check status // if (invalidCombinedState()) return TSOS(); // // scaling factor // double factor = theValidWeightSum > 0. ? newWeight / theValidWeightSum : 1; // // create new vector of states & combined state // MultiTSOS reweightedStates; reweightedStates.reserve(theStates.size()); for (auto const &is : theStates) { auto oldWeight = is.weight(); reweightedStates.emplace_back(factor * oldWeight, is.localParameters(), is.localError(), is.surface(), &(is.globalParameters().magneticField()), is.surfaceSide()); } return TSOS((BasicTrajectoryState *)(new BasicMultiTrajectoryState(reweightedStates))); } void MultiTrajectoryStateAssembler::removeSmallWeights() { // // check total weight // auto totalWeight(theInvalidWeightSum + theValidWeightSum); if (totalWeight == 0.) 
{ theStates.clear(); return; } theStates.erase( std::remove_if(theStates.begin(), theStates.end(), [&](MultiTSOS::value_type const &s) { return s.weight() < minFractionalWeight * totalWeight; }), theStates.end()); } void MultiTrajectoryStateAssembler::removeWrongPz() { LogDebug("GsfTrackFitters") << "MultiTrajectoryStateAssembler: found at least one state with inconsistent pz\n" << " #state / weights before cleaning = " << theStates.size() << " / " << theValidWeightSum << " / " << theInvalidWeightSum; // // Calculate average pz // double meanPz(0.); for (auto const &is : theStates) meanPz += is.weight() * is.localParameters().pzSign(); meanPz /= theValidWeightSum; // // Now keep only states compatible with the average pz // theValidWeightSum = 0.; MultiTSOS oldStates(theStates); theStates.clear(); for (auto const &is : oldStates) { if (meanPz * is.localParameters().pzSign() >= 0.) { theValidWeightSum += is.weight(); theStates.push_back(is); } else { theInvalidWeightSum += is.weight(); LogDebug("GsfTrackFitters") << "removing weight / pz / global position = " << is.weight() << " " << is.localParameters().pzSign() << " " << is.globalPosition(); } } LogDebug("GsfTrackFitters") << " #state / weights after cleaning = " << theStates.size() << " / " << theValidWeightSum << " / " << theInvalidWeightSum; }
import Joi = require('joi')
import { Request, Response, NextFunction } from 'express'

export const validateBodyMid = (schema: Joi.AnySchema) => {
  return async function (req: Request, res: Response, next: NextFunction) {
    const { body } = req
    _validate(schema, body, req, res, next)
  }
}

export const validateParamsMid = (schema: Joi.AnySchema) => {
  return async function (req: Request, res: Response, next: NextFunction) {
    const { params } = req
    _validate(schema, params, req, res, next)
  }
}

async function _validate(
  schema: Joi.AnySchema,
  data: any,
  req: Request,
  res: Response,
  next: NextFunction,
) {
  try {
    await schema.validateAsync(data)
  } catch (err) {
    // Validation failed: reject the request without invoking the route handler.
    res.status(400).end()
    return
  }
  next()
}
def _bootstrap_cost(target_array, forecast_prob_array, cost_function,
                    num_replicates):
    """Bootstraps cost estimate over many replicates.

    :param target_array: numpy array of target (true) values.
    :param forecast_prob_array: numpy array of forecast probabilities, with the
        same first (example) axis as `target_array`.
    :param cost_function: Function with signature
        `cost = cost_function(target_array, forecast_prob_array)`.
    :param num_replicates: Number of bootstrap replicates.
    :return: cost_estimates: numpy array of cost estimates, with length
        `num_replicates`.
    """

    cost_estimates = numpy.full(num_replicates, numpy.nan)

    if num_replicates == 1:
        cost_estimates[0] = cost_function(target_array, forecast_prob_array)
    else:
        num_examples = target_array.shape[0]
        example_indices = numpy.linspace(
            0, num_examples - 1, num=num_examples, dtype=int
        )

        for k in range(num_replicates):
            # Resample examples with replacement for this replicate.
            these_indices = numpy.random.choice(
                example_indices, size=num_examples, replace=True
            )
            cost_estimates[k] = cost_function(
                target_array[these_indices, ...],
                forecast_prob_array[these_indices, ...]
            )

    print('Average cost estimate over {0:d} replicates = {1:f}'.format(
        num_replicates, numpy.mean(cost_estimates)
    ))

    return cost_estimates
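For intuition, the same resample-with-replacement loop can be sketched in plain Python, independent of numpy. `bootstrap_mean` below is a hypothetical helper (not part of the module above) that bootstraps a trivial cost function, the sample mean; the seed is fixed so runs are reproducible:

```python
import random
import statistics


def bootstrap_mean(values, num_replicates, seed=0):
    """Illustration only: bootstrap the sample mean by resampling
    `values` with replacement, `num_replicates` times."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(num_replicates):
        # Draw a sample of the same size, with replacement.
        sample = [rng.choice(values) for _ in values]
        estimates.append(statistics.mean(sample))
    return estimates


estimates = bootstrap_mean([1.0, 2.0, 3.0, 4.0], num_replicates=100)
print(min(estimates), max(estimates))
```

The spread of the replicate estimates (e.g. their percentiles) gives a confidence interval for the cost, which is the usual reason for bootstrapping instead of computing the cost once.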
BOOK REVIEW: GUYER, PAUL. Kant on the Rationality of Morality (Cambridge University Press, 2019, 73p.) I discuss Paul Guyer’s contribution to the Cambridge Elements: The Philosophy of Immanuel Kant series. The author argues that Kant derives the fundamental principle and the object of morality from the fundamental principles of reason (the laws of noncontradiction and excluded middle and the principle of sufficient reason). I provide an overview of the book’s chapters and discuss some of its main interpretative claims.
Strong shaking in Los Angeles expected from southern San Andreas earthquake The southernmost San Andreas fault has a high probability of rupturing in a large (greater than magnitude 7.5) earthquake sometime during the next few decades. New simulations show that the chain of sedimentary basins between San Bernardino and downtown Los Angeles forms an effective waveguide that channels Love waves along the southern edge of the San Bernardino and San Gabriel Mountains. Earthquake scenarios with northward rupture, in which the guided wave is efficiently excited, produce unusually high long‐period ground motions over much of the greater Los Angeles region, including intense, localized amplitude modulations arising from variations in waveguide cross‐section.
San Luis Obispo County’s Board of Supervisors has denied a request to build an oil-receiving terminal in the county, a project that would have allowed crude oil to be transported by train through Berkeley. Oil company Phillips 66 had been working to build an oil train terminal at its Santa Maria refinery in San Luis Obispo County in order to import tar sands from Canada, said Virginia Reinhart, spokesperson for the Sierra Club’s San Francisco Bay Chapter, in an email. In the past, oil trains have been involved in fires, explosions and spills, such as in Quebec and Colorado. The oil train terminal would have permitted more than 7 million gallons of crude oil to be transported to the Phillips 66 refinery each week, according to a joint press release issued by the Sierra Club and other environmental groups. On their way to the Central Coast refinery, the trains would have passed through the rail line in Berkeley by 4th Street. Reinhart said in her email that with a 40-fold increase in the supply of oil shipped by rail since 2008, derailments and spills have increased significantly. “We do not feel that we should be exposed to all of these dangers,” said Vice Mayor of Berkeley Linda Maio. “We could not permit (the project) to happen without a fight and that’s what we did, we fought.” Maio had been raising funds and organizing community members and Berkeley City Council for more than three years in opposition to the project, according to a press release. The Sierra Club and other advocacy groups helped ensure the project’s environmental impacts were sufficiently analyzed, according to Reinhart. Each oil train emits the equivalent particulate matter of 4,500 diesel automobiles, meaning the project would pollute as much as 2 million cars per year, Reinhart said in the email. 
Justin Jacobs, spokesperson for Union Pacific, the railroad whose tracks would have been used to transport the oil, said in an email that his company provides safe and efficient transportation for products “that power the country.” Jacobs added that safety is the primary focus for every product Union Pacific transports. “Railroads provide the infrastructure, flexible networks, and efficiency needed to move crude oil from locations where oil is recovered to destinations where it is most highly valued,” Jacobs said in the email. Adam Hill, county supervisor for San Luis Obispo County’s 3rd District, said Berkeley was among many cities statewide that worried about the project, which he said “wasn’t worth the risk.” Hill added that the project was unique in that it was the first time in nine years that San Luis Obispo’s Board of Supervisors had heard from other cities and jurisdictions about a project in their county. Ryan Hostetter, supervising planner of San Luis Obispo County, said that although the board denied the application, there is an appeal period once the paperwork is sent. The Coastal Commission will oversee the appeal process. “It’s not necessarily over,” Hostetter said. Contact Ahna Straube at [email protected] and follow her on Twitter at @akstraube.
import React from 'react';
import styles from './index.module.scss';

export default ({ description }) => {
  return (
    <div className={styles.container}>
      <img
        src="https://img.alicdn.com/tfs/TB1WNNxjBHH8KJjy0FbXXcqlpXa-780-780.png"
        width="160"
        height="160"
        alt=""
      />
      <div className={styles.description}>{description}</div>
    </div>
  );
};
#pragma once
#include "CraterEngine.h"
#include "ScoreComponent.h"

class ScoreDisplayComponent final : public CraterEngine::Component
{
public:
	ScoreDisplayComponent(const CraterEngine::GameObject* parent, ScoreComponent* scoreComp);
	virtual ~ScoreDisplayComponent() override{};
#pragma region deleted
	ScoreDisplayComponent(const ScoreDisplayComponent& other) = delete;
	ScoreDisplayComponent(ScoreDisplayComponent&& other) noexcept = delete;
	ScoreDisplayComponent& operator=(const ScoreDisplayComponent& other) = delete;
	ScoreDisplayComponent& operator=(ScoreDisplayComponent&& other) noexcept = delete;
#pragma endregion

	virtual void Update(const float dt) override;
	//virtual void Render() const override;
	virtual bool Initialize() override { return true; };
private:
	ScoreComponent* m_pScoreComponent;
	int m_PreviousScore = -1;
};
How calcium indicators work. In the last two decades, imaging of fluorescent indicators specific for Ca(2+) has revealed its often spectacular spatial dynamics, such as rhythmic oscillations or standing gradients, within single groups or individual cells, in unprecedented detail. This short review describes how the more widely used indicators work. The currently used Ca(2+) indicators have a modular design consisting of a metal-binding site (or sensor) coupled in some way to a fluorescent dye. Combining different sensors with different dyes results in numerous indicators suited to a wide range of experiments and equipment.
Joe Biden Worries About 'Black Trucks' Selling Guns To 4th Graders From Red Alert Politics: The illegal transfer of guns is a widespread problem, according to Vice President Joe Biden, so widespread that "every neighborhood in every major city" has easy access to illegal guns, he claimed. [...] [T]he Vice President described a conversation between a teacher and her class, during which the teacher asked her students where they could buy a gun. One student responded that he would show her the black truck outside. "Every neighborhood in every major city has the equivalent of a black truck," he said. Uh, Joe, those black trucks are not selling guns. They are with the Secret Service and they are there to protect you: OMG! My Zombie Defense Kit seems so incomplete now!
import { Pipe, PipeTransform } from '@angular/core';
import { ViewListValue } from '../_models';

@Pipe({
  name: 'filterModel',
  pure: false,
})
export class FilterModelPipe implements PipeTransform {
  transform(items: ViewListValue, term: string): any {
    if (!term) {
      return items;
    }
    // Keep items where any non-null field contains the term as a substring.
    // The null check guards against `toString()` throwing on null/undefined values.
    return items.filter(item =>
      Object.values(item).some(
        val => val != null && val.toString().indexOf(term) !== -1,
      ),
    );
  }
}
import { Strategy as PassportStrategy } from 'passport-strategy';
import { Strategy, VerifyCallback } from 'passport-jwt';
import { JwtStrategyOptionsInterface } from './interfaces/jwt-strategy-options.interface';

export class JwtStrategy extends PassportStrategy {
  constructor(
    private options: JwtStrategyOptionsInterface,
    private verify: VerifyCallback,
  ) {
    super();
    this.options = options;
    this.verify = verify;
  }

  authenticate(...args: Parameters<Strategy['authenticate']>) {
    const [req] = args;
    const rawToken = this.options.jwtFromRequest(req);

    if (!rawToken) {
      return this.fail('Missing authorization token', 401);
    }

    try {
      return this.options.verifyToken(
        rawToken,
        this.verifyTokenCallback.bind(this),
      );
    } catch (e) {
      return this.error(e);
    }
  }

  private verifyTokenCallback(e: Error, decodedToken: Record<string, unknown>) {
    if (e) {
      return this.error(e);
    }

    try {
      return this.verify(decodedToken, this.isVerifiedCallback.bind(this));
    } catch (e) {
      return this.error(e);
    }
  }

  private isVerifiedCallback(error: Error, user: unknown, info: unknown) {
    if (error) {
      return this.error(error);
    } else if (!user) {
      return this.fail(info, 401);
    } else {
      return this.success(user, info);
    }
  }
}
// Package server implements the servers:
// - REST API server
// - gRPC server
// - fgprof server
package server
import Cannon from './Cannon'; import Enemy from './Enemy'; import Missile from './Missile'; import Collision from './Collision'; export {Cannon, Enemy, Missile, Collision};
def process(self): self.log.info("building db from '%s'" % self.src) for module, filepath, docid, md_raw in self.get_documents(): try: ns = module.namespace doc = Document(docid=docid, md_raw=md_raw, ns=ns, src_fn=filepath) html, meta = self.processor.convert(doc) doc.update(output=html, **meta) self.db.register(doc) except DuplicateDocumentError as dde: m = "{:s} in {:s}".format(dde.message, filepath) raise DuplicateDocumentError(m) self.db.resolve_forwardlinks() self.j2env.globals["doc_root_hash"] = self.processor.get_root_hash()