def crf_decoding_layer(input, size, label=None, param_attr=None, name=None):
    assert isinstance(input, LayerOutput)
    assert label is None or isinstance(label, LayerOutput)
    ipts = [Input(input.name, **param_attr.attr)]
    if label is not None:
        ipts.append(Input(label.name))
    Layer(
        name=name,
        type=LayerType.CRF_DECODING_LAYER,
        size=size,
        inputs=ipts,
    )
    parents = [input]
    if label is not None:
        parents.append(label)
    return LayerOutput(name, LayerType.CRF_DECODING_LAYER, parents, size=size)
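The layer above wraps the framework's CRF decoding primitive, which at inference time amounts to Viterbi decoding over emission and transition scores. As a standalone illustration of that computation — a sketch only, not the framework's internal implementation; the log-domain score matrices are assumed inputs — a minimal Viterbi decoder:

```python
def viterbi_decode(emissions, transitions):
    """Return the highest-scoring label path.

    emissions: [T][K] per-step label scores (log domain, assumed).
    transitions: [K][K] score of moving from label i to label j.
    """
    T, K = len(emissions), len(emissions[0])
    score = list(emissions[0])   # best score ending in each label at step 0
    back = []                    # backpointers for path recovery
    for t in range(1, T):
        new_score, ptr = [], []
        for j in range(K):
            # Best previous label i for ending in label j at step t.
            best_i = max(range(K), key=lambda i: score[i] + transitions[i][j])
            ptr.append(best_i)
            new_score.append(score[best_i] + transitions[best_i][j] + emissions[t][j])
        back.append(ptr)
        score = new_score
    # Trace back from the best final label.
    best = max(range(K), key=lambda j: score[j])
    path = [best]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))
```

With zero transition scores the decoder simply follows the per-step argmax of the emissions.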
LAS VEGAS—Government leaders in the United States and Mexico are close to signing a pact to add areas south of the border to Colorado River water sharing agreements involving seven Western U.S. states, officials said Friday. U.S. Bureau of Reclamation officials characterized the talks as delicate while final documents circulate among 15 water agencies and state officials in Arizona, California, Colorado, Nevada, New Mexico, Utah and Wyoming. “The concern is that these are sensitive negotiations,” said Kip White, a bureau spokesman in Washington, D.C. “It has taken a long time to get here. We’re looking forward to a culmination of this later this month.” “It’s not a completed agreement until the document is signed,” added Rose Davis, a bureau spokeswoman in Boulder City. The framework of the five-year agreement became public with agenda items for a meeting next Thursday in Las Vegas involving the Southern Nevada Water Authority and Colorado River Commission of Nevada. The Las Vegas Review-Journal first reported it on Friday. The pact is an addendum to a 1944 U.S.-Mexico water treaty. It developed from talks begun when the seven Colorado River states signed a landmark agreement in 2007 to share the pain of shortages and the wealth of surpluses from the Colorado River reservoirs of Lake Mead and Lake Powell. The water users called at the time for federal officials to get Mexico to participate. The agreement would also link Mexican and U.S. water allocations from the Colorado River during surplus and drought. The documents never refer to shortage, but instead cite “low reservoir conditions.” “Provisions include Mexico agreeing to adjust its delivery schedule during low reservoir conditions, Mexico having access to additional water during high reservoir conditions, and a commitment to work together on a pilot program that includes water for the environment,” according to a summary submitted to voting SNWA and Colorado River Commission members. 
The agreement would let Mexico continue an emergency program begun two years ago to store water in Lake Mead, the reservoir behind Hoover Dam near Las Vegas. After pipelines and canals were damaged by a magnitude 7.2 earthquake on Easter Sunday 2010, Mexico asked the U.S. to let it store water temporarily while repairs were made to irrigation systems in a broad agricultural area south of Mexicali. The area is irrigated by water from the Morelos Dam on the Colorado River west of Yuma, Ariz. The agreement also calls for a pilot program of water releases from the U.S. to replenish wetlands in the Colorado River delta at the Gulf of California, and it clears the way for the U.S. government and municipal water agencies to invest in infrastructure improvements in Mexico in return for a share of the water such projects would save. Las Vegas gets 90 percent of its drinking water from drought-stricken Lake Mead, and officials have talked about paying to build a seawater desalination plant in Mexico to trade for additional rights to Colorado River water. The Review-Journal reported that the two largest municipal water agencies in Arizona and California have signed off on the agreement. New Mexico’s Interstate Stream Commission is due to consider the agreements next Wednesday. Jeff Kightlinger, general manager of the Metropolitan Water District of Southern California, said his board approved the agreement on Monday. Officials with the Central Arizona Project didn’t immediately respond to messages Friday from The Associated Press. The agreement calls for the Southern California district to pay Mexico $5 million over three years in return for 47,500 acre-feet of water. The agencies in Arizona and Nevada would each pay $2.5 million over three years and receive a total of 23,750 acre-feet. An acre-foot of water is about enough to serve two Las Vegas-area households for a year, officials say. ——— Find Ken Ritter on Twitter: http://twitter.com/krttr
/* Copyright 2012-2018 <NAME>, <NAME>, Yiancar

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 2 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
*/
#include <stdint.h>
#include <stdbool.h>
#include "wait.h"
#include "util.h"
#include "matrix.h"
#include "debounce.h"
#include "quantum.h"

#ifdef DIRECT_PINS
static pin_t direct_pins[MATRIX_ROWS][MATRIX_COLS] = DIRECT_PINS;
#elif (DIODE_DIRECTION == ROW2COL) || (DIODE_DIRECTION == COL2ROW)
static const pin_t row_pins[MATRIX_ROWS] = MATRIX_ROW_PINS;
//static const pin_t col_pins[MATRIX_COLS] = MATRIX_COL_PINS;
#endif

// matrix code

#ifdef DIRECT_PINS

static void init_pins(void) {
    for (int row = 0; row < MATRIX_ROWS; row++) {
        for (int col = 0; col < MATRIX_COLS; col++) {
            pin_t pin = direct_pins[row][col];
            if (pin != NO_PIN) {
                setPinInputHigh(pin);
            }
        }
    }
}

static bool read_cols_on_row(matrix_row_t current_matrix[], uint8_t current_row) {
    matrix_row_t last_row_value = current_matrix[current_row];
    current_matrix[current_row] = 0;

    for (uint8_t col_index = 0; col_index < MATRIX_COLS; col_index++) {
        pin_t pin = direct_pins[current_row][col_index];
        if (pin != NO_PIN) {
            current_matrix[current_row] |= readPin(pin) ? 0 : (MATRIX_ROW_SHIFTER << col_index);
        }
    }

    return (last_row_value != current_matrix[current_row]);
}

#elif (DIODE_DIRECTION == COL2ROW)

static void select_row(uint8_t row) {
    setPinOutput(row_pins[row]);
    writePinLow(row_pins[row]);
}

static void unselect_row(uint8_t row) {
    setPinInputHigh(row_pins[row]);
}

static void unselect_rows(void) {
    for (uint8_t x = 0; x < MATRIX_ROWS; x++) {
        setPinInputHigh(row_pins[x]);
    }
}

static void init_pins(void) {
    unselect_rows();
    for (uint8_t x = 0; x < MATRIX_COLS; x++) {
        setPinInputHigh(col_pins[x]);
    }
}

static bool read_cols_on_row(matrix_row_t current_matrix[], uint8_t current_row) {
    // Store last value of row prior to reading
    matrix_row_t last_row_value = current_matrix[current_row];

    // Clear data in matrix row
    current_matrix[current_row] = 0;

    // Select row and wait for row selection to stabilize
    select_row(current_row);
    wait_us(30);

    // For each col...
    for (uint8_t col_index = 0; col_index < MATRIX_COLS; col_index++) {
        // Select the col pin to read (active low)
        uint8_t pin_state = readPin(col_pins[col_index]);

        // Populate the matrix row with the state of the col pin
        current_matrix[current_row] |= pin_state ? 0 : (MATRIX_ROW_SHIFTER << col_index);
    }

    // Unselect row
    unselect_row(current_row);

    return (last_row_value != current_matrix[current_row]);
}

#elif (DIODE_DIRECTION == ROW2COL)

/* Cols 0 - 15
 * col 0: C7
 * col 1: B6
 * col 2: C6
 * col 3: B4
 * col 4: B5
 * col 5: D7
 * These columns use a 74HC237D 3 to 8 bit demultiplexer.
 *             A0  A1  A2
 * col / pin: PD2 PD1 PD0
 *  6:         1   1   1
 *  7:         0   1   1
 *  8:         1   0   1
 *  9:         0   0   1
 * 10:         1   1   0
 * 11:         0   1   0
 * 12:         1   0   0
 * col 13: D3
 * col 14: B7
 * col 15: B3
 */
static void select_col(uint8_t col) {
    switch (col) {
        case 0: writePinLow(C7); break;
        case 1: writePinLow(B6); break;
        case 2: writePinLow(C6); break;
        case 3: writePinLow(B4); break;
        case 4: writePinLow(B5); break;
        case 5: writePinLow(D7); break;
        case 6:
            writePinHigh(D0);
            writePinHigh(D1);
            writePinHigh(D2);
            break;
        case 7:
            writePinHigh(D0);
            writePinHigh(D1);
            break;
        case 8:
            writePinHigh(D0);
            writePinHigh(D2);
            break;
        case 9: writePinHigh(D0); break;
        case 10:
            writePinHigh(D1);
            writePinHigh(D2);
            break;
        case 11: writePinHigh(D1); break;
        case 12: writePinHigh(D2); break;
        case 13: writePinLow(D3); break;
        case 14: writePinLow(B7); break;
        case 15: writePinLow(B3); break;
    }
}

static void unselect_col(uint8_t col) {
    switch (col) {
        case 0: writePinHigh(C7); break;
        case 1: writePinHigh(B6); break;
        case 2: writePinHigh(C6); break;
        case 3: writePinHigh(B4); break;
        case 4: writePinHigh(B5); break;
        case 5: writePinHigh(D7); break;
        case 6:
            writePinLow(D0);
            writePinLow(D1);
            writePinLow(D2);
            break;
        case 7:
            writePinLow(D0);
            writePinLow(D1);
            break;
        case 8:
            writePinLow(D0);
            writePinLow(D2);
            break;
        case 9: writePinLow(D0); break;
        case 10:
            writePinLow(D1);
            writePinLow(D2);
            break;
        case 11: writePinLow(D1); break;
        case 12: writePinLow(D2); break;
        case 13: writePinHigh(D3); break;
        case 14: writePinHigh(B7); break;
        case 15: writePinHigh(B3); break;
    }
}

static void unselect_cols(void) {
    //Native
    setPinOutput(D3);
    setPinOutput(D7);
    writePinHigh(D3);
    writePinHigh(D7);
    setPinOutput(C6);
    setPinOutput(C7);
    writePinHigh(C6);
    writePinHigh(C7);
    setPinOutput(B3);
    setPinOutput(B4);
    setPinOutput(B5);
    setPinOutput(B6);
    setPinOutput(B7);
    writePinHigh(B3);
    writePinHigh(B4);
    writePinHigh(B5);
    writePinHigh(B6);
    writePinHigh(B7);

    //Demultiplexer
    setPinOutput(D0);
    setPinOutput(D1);
    setPinOutput(D2);
    writePinLow(D0);
    writePinLow(D1);
    writePinLow(D2);
}

static void init_pins(void) {
    unselect_cols();
    for (uint8_t x = 0; x < MATRIX_ROWS; x++) {
        setPinInputHigh(row_pins[x]);
    }
}

static bool read_rows_on_col(matrix_row_t current_matrix[], uint8_t current_col) {
    bool matrix_changed = false;

    // Select col and wait for col selection to stabilize
    select_col(current_col);
    wait_us(30);

    // For each row...
    for (uint8_t row_index = 0; row_index < MATRIX_ROWS; row_index++) {
        // Store last value of row prior to reading
        matrix_row_t last_row_value = current_matrix[row_index];

        // Check row pin state
        if (readPin(row_pins[row_index]) == 0) {
            // Pin LO, set col bit
            current_matrix[row_index] |= (MATRIX_ROW_SHIFTER << current_col);
        } else {
            // Pin HI, clear col bit
            current_matrix[row_index] &= ~(MATRIX_ROW_SHIFTER << current_col);
        }

        // Determine if the matrix changed state
        if ((last_row_value != current_matrix[row_index]) && !(matrix_changed)) {
            matrix_changed = true;
        }
    }

    // Unselect col
    unselect_col(current_col);

    return matrix_changed;
}

#endif

void matrix_init_custom(void) {
    // initialize key pins
    init_pins();
}

bool matrix_scan_custom(matrix_row_t current_matrix[]) {
    bool changed = false;

#if defined(DIRECT_PINS) || (DIODE_DIRECTION == COL2ROW)
    // Set row, read cols
    for (uint8_t current_row = 0; current_row < MATRIX_ROWS; current_row++) {
        changed |= read_cols_on_row(current_matrix, current_row);
    }
#elif (DIODE_DIRECTION == ROW2COL)
    // Set col, read rows
    for (uint8_t current_col = 0; current_col < MATRIX_COLS; current_col++) {
        changed |= read_rows_on_col(current_matrix, current_col);
    }
#endif

    return changed;
}
#ifndef MAPI_H_INCLUDED
#define MAPI_H_INCLUDED

#include <stddef.h>
#include <stdint.h>

/* Branch hints: use __builtin_expect unless the compiler lacks it. */
#ifndef DONT_HAVE_BUILTIN_EXPECT
#define __unlikely(cond) __builtin_expect(!!(cond), 0)
#define __likely(cond)   __builtin_expect(!!(cond), 1)
#else
#define __unlikely(cond) (cond)
#define __likely(cond)   (cond)
#endif /* DONT_HAVE_BUILTIN_EXPECT */

#define MAPI_ROUNDUP(v, m) (((v) + (m) - 1) & ~((m) - 1))
#define MAPI_EMPTY_KEY ((uint32_t) -1)
#define MAPI_MIN_CAPACITY 16

typedef struct mapi_object mapi_object;
typedef struct mapi mapi;

struct mapi_object {
    uint32_t key;
};

struct mapi {
    char *objects;
    size_t object_size;
    size_t size;
    size_t capacity;
    uint32_t empty_key;
    void (*release)(void *);
    void (*clone)(void *, void *, size_t);
    uint32_t (*getKey)(void *);
    void (*setKey)(void *, uint32_t);
};

/* allocators */
mapi *mapi_new(size_t);
void mapi_init(mapi *, size_t);
void mapi_empty_key(mapi *, uint32_t);
void mapi_release(mapi *, void (*)(void *));
void mapi_clone(mapi *, void (*)(void *, void *, size_t));
void mapi_getKey(mapi *, uint32_t (*)(void *));
void mapi_setKey(mapi *, void (*)(void *, uint32_t));
void mapi_free(mapi *);

/* capacity */
size_t mapi_size(mapi *);

/* element access */
void *mapi_get(mapi *, size_t);
void *mapi_at(mapi *, uint32_t);
mapi_object *mapi_super(void *);
int mapi_empty(mapi *, void *);

/* element lookup */
void *mapi_find(mapi *, uint32_t);

/* modifiers */
void *mapi_insert(mapi *, void *);
void mapi_erase(mapi *, uint32_t);
void mapi_clear(mapi *);

/* buckets */
size_t mapi_bucket_count(mapi *);

/* hash policy */
int mapi_rehash(mapi *, size_t);
int mapi_reserve(mapi *, size_t);

/* iterators */
void *mapi_begin(mapi *);
void *mapi_next(mapi *, void *);
void *mapi_end(mapi *);
void *mapi_skip(mapi *, void *);
void *mapi_inc(mapi *, void *);

#endif /* MAPI_H_INCLUDED */
Kinetic studies of coke formation and removal on HP40 in cycled atmospheres at high temperatures. An austenitic FeNiCr alloy, HP40Nb, was preoxidized and subsequently exposed to an alternating carburizing/oxidizing/carburizing atmosphere. During oxidation at 1000°C a thick Cr2O3 layer formed, which partly spalled off during cooling to room temperature, leaving chromium-depleted areas at the surface. The carburizing and reducing condition was established with a C2H6/C2H4/H2 mixture at 850°C, while the oxidation for decoking was conducted in air at 800°C. The exposure times were relatively short: 90, 30, and 180 minutes, respectively. During the first exposure of the preoxidized alloy to the carburizing atmosphere, coke formation took place and the alloy beneath the coke layer was carburized, though only locally. After decoking in air at 800°C, much more catalytic coke formation was observed during the second exposure to the carburizing atmosphere than during the first. The coke formation was initiated by the reduction of (Fe,Ni,Cr)-spinels formed in the oxidizing atmosphere; this reduction produces (Fe,Ni) particles that show strong catalytic activity towards coke formation.
use super::*;
use std::fmt::{Display, Error, Formatter};

impl Display for BddVariable {
    fn fmt(&self, f: &mut Formatter) -> Result<(), Error> {
        f.write_fmt(format_args!("{}", self.0))
    }
}

impl BddVariable {
    /// Convert to little endian bytes
    pub(super) fn to_le_bytes(self) -> [u8; 2] {
        self.0.to_le_bytes()
    }

    /// Read from little endian byte representation
    pub(super) fn from_le_bytes(bytes: [u8; 2]) -> BddVariable {
        BddVariable(u16::from_le_bytes(bytes))
    }
}
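Since `BddVariable` is a newtype over `u16`, the byte helpers are just the integer's little-endian encoding. A standalone sketch of that round trip, using a bare `u16` because the wrapper type is crate-private:

```rust
fn main() {
    let v: u16 = 513; // 0x0201
    let bytes = v.to_le_bytes();
    // Little endian: low byte first.
    assert_eq!(bytes, [0x01, 0x02]);
    let back = u16::from_le_bytes(bytes);
    assert_eq!(back, v);
    println!("{} -> {:?} -> {}", v, bytes, back);
}
```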
/**
 * Loads clients from the database.
 *
 * @param con that will be used for the queries
 * @throws SQLException in case loading queries fail
 */
public void loadClients(Connection con) throws SQLException {
    clients.clear();
    String loadSQL = "select client_id, client_name, phone_number, creation_date, modification_date "
            + "from h_b2wkl0.client "
            + "order by client_name";
    Statement loadStatement = con.createStatement();
    ResultSet results = loadStatement.executeQuery(loadSQL);
    while (results.next()) {
        int id = results.getInt("client_id");
        String name = results.getString("client_name");
        String phoneNum = results.getString("phone_number");
        Date creationDate = results.getTimestamp("creation_date");
        Date modificationDate = results.getTimestamp("modification_date");
        Client client = new Client(id, name, phoneNum, creationDate, modificationDate);
        clients.add(client);
    }
    results.close();
    loadStatement.close();
}
// Delete will delete a User Favorite from the datastore
func (p *FavoritesDBStore) Delete(userGUID string, guid string) error {
	if _, err := p.db.Exec(deleteFavorite, userGUID, guid); err != nil {
		return fmt.Errorf("Unable to delete User Favorite record: %v", err)
	}
	return nil
}
// RegisterTransport registers a transport; transport module init functions call it.
func RegisterTransport(transport Transport) {
	if err := globalRegister.add(transport); err != nil {
		panic(err)
	}
}
// Tests basic initialization and makes sure that the proper metrics are updated with an insert.
TEST_F(CounterTest, InitializationSanity) {
  Counter::CounterTimeConfig test_time_config = {pair<uint64_t, int>(5000, 2),
                                                 pair<uint64_t, int>(10000, 2)};
  Counter::MetricSet test_metric_set{"mock_type_1", "mock_type_2"};
  unique_ptr<Counter> counter(new Counter(test_time_config, test_metric_set));
  const vector<Counter::BinSet>& binsets = counter->GetBinsets();

  metrics::test::MockMetricInterface* mock_metric_1_ =
      dynamic_cast<metrics::test::MockMetricInterface*>(
          binsets.at(0).bins.at(0).find("mock_type_1")->second.get());
  metrics::test::MockMetricInterface* mock_total_metric_1_ =
      dynamic_cast<metrics::test::MockMetricInterface*>(
          binsets.at(0).total_metrics.find("mock_type_1")->second.get());
  metrics::test::MockMetricInterface* mock_metric_2_ =
      dynamic_cast<metrics::test::MockMetricInterface*>(
          binsets.at(0).bins.at(0).find("mock_type_2")->second.get());
  metrics::test::MockMetricInterface* mock_total_metric_2_ =
      dynamic_cast<metrics::test::MockMetricInterface*>(
          binsets.at(0).total_metrics.find("mock_type_2")->second.get());

  ASSERT_NOTNULL(mock_metric_1_);
  ASSERT_NOTNULL(mock_total_metric_1_);
  ASSERT_NOTNULL(mock_metric_2_);
  ASSERT_NOTNULL(mock_total_metric_2_);

  metrics::tCountedEvent event1;
  EXPECT_CALL((*mock_metric_1_), Add(_)).Times(1);
  EXPECT_CALL((*mock_total_metric_1_), Add(_)).Times(1);
  EXPECT_CALL((*mock_metric_2_), Add(_)).Times(1);
  EXPECT_CALL((*mock_total_metric_2_), Add(_)).Times(1);
  counter->ProcessEvent(event1);
}
// SetInactive sets a finding as inactive
func (r *CommandCenter) SetInactive(ctx context.Context, name string) (*crm.Finding, error) {
	return r.client.SetFindingState(ctx, &crm.SetFindingStateRequest{
		Name:      name,
		State:     crm.Finding_INACTIVE,
		StartTime: timestamppb.Now(),
	})
}
// types/global.d.ts
/** Global definitions for development **/

// for style loader
declare module '*.css' {
  const styles: any;
  export = styles;
}

// Omit type https://github.com/Microsoft/TypeScript/issues/12215#issuecomment-377567046
type Omit<T, K extends keyof T> = Pick<T, Exclude<keyof T, K>>;
type PartialPick<T, K extends keyof T> = Partial<T> & Pick<T, K>;

declare namespace API {
  export interface IArea {
    zone: string;
    municipality: string;
    area: string;
  }

  export interface IStation {
    id: number;
    zone: string;
    municipality: string;
    area: string;
    station: string;
    eoi: string;
    latitude: number;
    longitude: number;
    owner: string;
    status: string;
    description: string;
    firstMeasurment: string;
    lastMeasurment: string;
    components: string;
  }

  export interface IComp {
    component: string;
    topic: string;
  }

  // Generated by https://quicktype.io
  export interface IAqiLookup {
    component: string;
    unit: string;
    aqis: Aqi[];
  }

  export interface Aqi {
    index: number;
    fromValue: number;
    toValue: number;
    color: string;
    text: string;
    shortDescription: string;
    description: string;
    advice: string;
  }
}
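A small self-contained illustration of the `Omit`/`PartialPick` helpers declared above. The aliases are redeclared locally (and `Omit` renamed to `OmitKeys` to avoid clashing with TypeScript's built-in `Omit`); the `Area` shape mirrors `API.IArea`, and the sample values are invented:

```typescript
type OmitKeys<T, K extends keyof T> = Pick<T, Exclude<keyof T, K>>;
type PartialPick<T, K extends keyof T> = Partial<T> & Pick<T, K>;

interface Area { zone: string; municipality: string; area: string; }

// Drop `zone` entirely; make everything optional except `area`.
type AreaNoZone = OmitKeys<Area, 'zone'>;
type AreaDraft = PartialPick<Area, 'area'>;

const noZone: AreaNoZone = { municipality: 'Oslo', area: 'Center' };
const draft: AreaDraft = { area: 'Center' }; // other fields optional

console.log(noZone.area === draft.area);
```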
An SVM-based Query Monitoring Method For Inference Control Many widely-used data sharing applications rely on allowing untrusted clients to query against secret data. Inference control for secure data is a key issue. In these scenarios, the owner of secret data opens the secret data for querying to provide general information while trying to ensure not too much information is leaked from these queries. In this work, we propose a method based on Support Vector Machines (SVM) to quickly determine whether a set of queries is leaking too much information. Through experiments over several sample functions, we demonstrate the performance and flexibility of this method.
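The pipeline the abstract describes — featurize a batch of queries, then classify whether the batch "leaks too much" — can be sketched without the actual SVM machinery. Below, a plain perceptron-style linear classifier stands in for the SVM; the feature names (query count, average overlap with the secret) and all sample values are invented for illustration:

```python
def train_linear(samples, labels, epochs=20, lr=0.1):
    """Perceptron stand-in for an SVM: learns (w, b) on labeled query batches."""
    dim = len(samples[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):  # y in {-1, +1}; +1 = leaks too much
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                for i in range(dim):
                    w[i] += lr * y * x[i]
                b += lr * y
    return w, b

def predict(w, b, x):
    """Flag a query batch: +1 = refuse (too much leakage), -1 = allow."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy features per batch, both scaled to [0, 1]:
# (query_count / max_count, avg_overlap_with_secret)
train_x = [(0.2, 0.1), (0.3, 0.2), (0.9, 0.9), (0.8, 0.8)]
train_y = [-1, -1, 1, 1]
w, b = train_linear(train_x, train_y)
```

A real deployment would use a soft-margin SVM (and kernels) in place of the perceptron, but the monitoring loop — extract features from the pending query set, score, refuse if positive — is the same.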
# read-co2-sensor.py
import os
import datetime
import sys
import getopt

def get_last_modified_file(directory):
    dated_files = [(os.path.getmtime(os.path.join(directory, fn)), os.path.join(directory, fn))
                   for fn in os.listdir(directory) if fn.lower().endswith('.csv')]
    dated_files.sort()
    dated_files.reverse()
    newest = dated_files[0][1]
    return newest

def get_last_line_in_file(path):
    statinfo = os.stat(path)
    # Open in binary mode: seeking relative to the end is not allowed in text mode.
    with open(path, 'rb') as f:
        if statinfo.st_size > 2000:
            f.seek(-1024, os.SEEK_END)
        return f.readlines()[-1].decode()

def parse_sensor_output(line):
    data = line.split(",")
    return {
        "time": datetime.datetime.strptime(data[0], "%Y-%m-%d %H:%M:%S"),
        "temperature": float(data[1]),
        "co2": float(data[2])
    }

def sensor_dead(sensor_output):
    diff = datetime.datetime.now() - sensor_output["time"]
    sensor_dead_threshold = datetime.timedelta(seconds=30)
    return diff > sensor_dead_threshold

latest_file = get_last_modified_file("/var/lib/co2monitor/data")
line = get_last_line_in_file(latest_file)
sensor_output = parse_sensor_output(line)

if sensor_dead(sensor_output):
    sys.exit("sensor dead")

options, args = getopt.getopt(sys.argv[1:], "", ["co2", "temperature"])
for o, a in options:
    if o == "--co2":
        print(sensor_output["co2"])
    elif o == "--temperature":
        print(sensor_output["temperature"])
def check_bot(self):
    if self._data.get('bot_id'):
        if self._data.get('bot_id') == self._kwargs.get('botid'):
            self.command = False
    elif self.user == self._kwargs.get('botid'):
        self.command = False
    return
KAKRAPAR: It was a red letter day in the history of India's nuclear power generation. Work on the first pair of indigenously designed 700 MW pressurized heavy water reactors (PHWRs) of Nuclear Power Corporation of India Limited (NPCIL) started at Kakrapar Atomic Power Project (KAPP) with the first pouring of concrete (FPC) on Monday. The Rs 8,000 crore 700 MW PHWR project, known as KAPP 3&4, comprises the latest state-of-the-art nuclear power reactors designed by NPCIL. Both reactors, which together would produce 1,400 MW of electricity, will begin commercial operation in 2015, while the other two reactors of the same capacity based at Rawatbhata in Rajasthan would start commercial operation in 2016. The KAPP has two nuclear power reactors of 220 MW each. With the commissioning of the 700 MW PHWRs in 2015, KAPP will have an installed capacity of 1,840 MW. "It is a historic day for us at NPCIL," chairman and managing director (CMD) of NPCIL Dr S K Jain said at the FPC ceremony. "Kakrapar is the first place from where we started our long journey of work on about 14 such nuclear reactors in some important locations in the country." According to Jain, NPCIL would be working on about eight 700 MW PHWRs in states like Haryana, Rajasthan, Madhya Pradesh and Gujarat by 2012. NPCIL has financial sanction from the central government for starting about ten 700 MW PHWRs under a five-year programme. Asked about the issue faced by NPCIL at Mithi Virdi in Bhavnagar, Jain said, "We have earmarked the commissioning of 1,000 MW PHWRs in Bhavnagar by 2018. There is a little dissatisfaction among the villagers over giving their lands and some misconception about the ill-effects of radiation.
We want to assure all co-operation to the farmers and villagers in Bhavnagar as we envisage 6,000 MW of power generation from this area." NPCIL's installed capacity will reach 9,580 MW by 2016 with the progressive completion of the nuclear power reactors under construction. By 2020, NPCIL has a vision to reach 20,000 MW or more, and 63,000 MW by 2032, by setting up nuclear reactors based on indigenously designed 700 MW PHWRs and Light Water Reactors of 1,000 MW or larger size. Chairman of the Atomic Energy Commission (AEC) Dr Srikumar Banerjee, who was also present, said, "Indigenously designed nuclear reactors are more economical than the imported ones. We see ourselves as the fifth major supplier of nuclear reactors of small size in the world." He said nuclear power generation is the best way to address the global warming issue. Most developed nations are moving toward a carbon-free economy. India has a per capita power consumption of 700 kWh--the world average is 2,500 kWh--and we have to take this to 2,000 kWh in the coming years.
def list_dhcp_agent_networks(qclient, agent_id):
    resp = qclient.list_networks_on_dhcp_agent(agent_id)
    LOG.debug("list_networks_on_dhcp_agent: %s", resp)
    return [s['id'] for s in resp['networks']]
A Magnetic Alpha-Omega Dynamo in Active Galactic Nuclei Disks: I. The Hydrodynamics of Star-Disk Collisions and Keplerian Flow A magnetic field dynamo in the inner regions of the accretion disk surrounding the supermassive black holes in AGNs may be the mechanism for the generation of magnetic fields in galaxies and in extragalactic space. We argue that the two coherent motions produced by 1) the Keplerian motion and 2) star-disk collisions, numerous in the inner region of AGN accretion disks, are both basic to the formation of a robust, coherent dynamo and consequently the generation of large scale magnetic fields. They are frequent enough to account for an integrated dynamo gain of e^{10^9} at 100 gravitational radii of a central black hole, many orders of magnitude greater than required to amplify any seed field no matter how small. The existence of extra-galactic, coherent, large scale magnetic fields whose energies greatly exceed all but massive black hole energies is recognized. In paper II (Pariev, Colgate, and Finn 2006) we argue that in order to produce a dynamo that can access the free energy of black hole formation and produce all the magnetic flux in a coherent fashion, the existence of these two coherent motions in a conducting fluid is required. The differential winding of Keplerian motion is obvious, but the disk structure depends upon the model of "alpha", the transport coefficient of angular momentum, chosen. The counter rotation of driven plumes in a rotating frame is less well known, but fortunately the magnetic effect is independent of the disk model. Both motions are discussed in this paper, paper I. The description of the two motions is preliminary to two theoretical derivations and one numerical simulation of the alpha-omega dynamo in paper II. (Abridged) Introduction The need for a magnetic dynamo to produce and amplify the immense magnetic fields observed external to galaxies and in clusters of galaxies has long been recognized.
The theory of kinematic magnetic dynamos has had a long history and is a well developed subject by now. There are numerous monographs and review articles devoted to magnetic dynamos in astrophysics, some of which are: Parker (1979); Moffatt (1978); Stix (1975); Cowling (1981); Roberts & Soward (1992); Childress et al. (1990); Zeldovich, Ruzmaikin, & Sokoloff (1983); Priest (1982); Busse (1991); Krause & Rädler (1980); Biskamp (1993); Mestel (1999). Hundreds of papers on magnetic dynamos are published each year. Three main astrophysical areas in which dynamos are involved are the generation of magnetic fields in the convective zones of planets and stars, in differentially rotating spiral galaxies, and in the accretion disks around compact objects. The possibility of production of magnetic fields in the central parts of the black hole accretion disks in AGN has been pointed out by Chakrabarti, Rosner, & Vainshtein (1994), and the need and possibility for a robust dynamo by Colgate & Li (1997). Dynamos have also been observed in the laboratory in the Riga experiment (Gailitis et al. 2000, 2001) and in the Karlsruhe experiment (Stieglitz & Müller 2001). Recently, counter rotating, opposed jets or open-flow geometries, such as the Von Kármán Sodium (VKS) Experiment and the Madison Dynamo Experiment, have been designed to explore less constrained flows (Bourgoin et al. 2004; Spence et al. 2006). So far, neither of these experiments has reported sustained magnetic field generation despite predictions of positive gain in laminar flow theory and calculations. The null result has been ascribed to the deleterious effects of enhanced turbulent diffusion of large-scale turbulence.
The Need for a Robust Astrophysical Dynamo Why, with all the thousands of research papers, very many successes, and even experimental verification of dynamo theory in constrained flows, is there a need for a new paradigm for the generation of intergalactic scale astrophysical magnetic fields? We claim that the plume-driven αω dynamo in the black hole accretion disk is a unique solution to the need for the largest dynamos of the universe, because the flow is naturally constrained by a gradient in angular momentum and by the transient dynamical behavior of plumes, in contrast to the large turbulence of unconstrained flows. (A discussion of the role of convective plume-driven αω dynamos in stars will be reserved for another paper, because the mechanism of the production of large scale plumes in the convective zone of stars is radically different from the production of plumes by high velocity stars plunging frequently through the accretion disk.) The minimum energy inferred from radio emission observations of structures or so-called radio lobes within clusters and external to clusters, by both synchrotron emission and Faraday rotation (Kronberg 1994; Kronberg et al. 2001), is so large, ∼10^59 ergs and up to ∼10^61 ergs respectively, compared to galactic energies in fields 10^-7 as large and gravitational binding energies 10^-3 as large, that only the energy of formation of the central massive black hole (hereafter, CMBH) of every galaxy in its AGN phase, ∼10^62 ergs, becomes the most feasible known astrophysical source of so much energy. This statement is based upon the recognition that ∼10^8 neutron stars have been created in the galaxy in a Hubble time, or only ∼10^6 in the lifetime of a radio lobe of ∼10^8 yr. Each supernova results in 10^51 ergs of kinetic energy, the rest being emitted in neutrinos, so that ∼10^57 ergs of kinetic energy becomes available for the production of magnetic energy within the necessary time.
If one considers the difficulty of summing the magnetic field from many, presumably incoherent sources, the likelihood of many stellar sources contributing to the coherent field of radio lobes seems remote. In order to access this energy of formation, the conversion of kinetic to magnetic energy is required. This in turn requires a mechanism to multiply or exponentiate an initial field up to the back reaction limit. This limit is where the Ampere force does a large amount of work to significantly alter the accretion motion, thus converting the kinetic energy to magnetic energy. Because the specific angular momentum of matter accreted onto the CMBH is ∼10^3 to 10^4 greater than possible for accretion at r_g, the result is the universal Keplerian motion of an accretion disk, and so the access of this free energy must be in the form of a back reaction torque that transforms kinetic to magnetic energy. A robust dynamo is one that can potentially convert a large fraction of the available mechanical energy or free energy of the accretion disk into magnetic energy. A further advantage of the αω dynamo in the CMBH accretion disk is that the exponential gain within 100 gravitational radii of the CMBH is so large, some fraction f per turn, or gain = e^{fN}, N ∼ 10^9 turns in the 10^8 years of formation, that the origin and strength of the initial (seed) field becomes moot. The Robust αω Dynamo Such a dynamo has conceptually become feasible because of the recognition of a relatively new, coherent, large scale, robust source of helicity. Helicity generation, in the sense of the αω dynamo, is the driven deformation of the conducting fluid that converts an amplified (by differential winding) toroidal field back into the initial (radial) poloidal field. In our case it is caused by the rotation of driven, diverging plumes in a rotating frame (Beckley et al. 2003; Mestel 1999).
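The gain quoted here follows from compounding a fixed fraction per orbit over the disk's formation time:

```latex
\[
\text{gain} = e^{fN}, \qquad
N \sim \frac{10^{8}\ \text{yr of formation}}{\text{orbital period at } 100\,r_g} \sim 10^{9}\ \text{turns},
\]
```

so for a per-turn fraction $f$ of order unity the integrated gain is $\sim e^{10^{9}}$; as the abstract notes, this exceeds by many orders of magnitude the amplification required for any seed field, however small, making the seed-field question moot.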
The advantage of driven plumes as a source of helicity, as compared to turbulent motions within the disk, is that the flow displaces fluid and entrapped flux well above the disk, several scale heights, and then rotates the flux on average a quarter turn before merging again into the disk. Such an ideal deformation is then a large coherent (single direction) source of helicity. These plumes are presumably driven by many stars in orbits repeatedly plunging through the disk, but comprising only a small mass fraction, ∼10^-3, of the matter accreted to form and grow the CMBH to ∼10^8 M_⊙. The twisting or relative rotation of the plumes occurs because of the partially conserved angular momentum of the plume itself as its moment of inertia increases due to its expansion or divergence while progressing in height. The repeatable fractional turn before merging with the disk occurs because the cooling plume matter falls back to the disk in half a turn of the disk. This translation and rotation twists the embedded toroidal magnetic field. Furthermore, the angle of twist is in the same direction for all plumes, opposite to the rotation of the disk, and is limited to ∆φ ≃ −π/2 radians of rotation for each occurrence. This nearly ideal repetitive driven deformation leads to a robust dynamo, one where both motions are not likely to be easily damped by back reaction except at the full Keplerian stress. Such a dynamo is not dependent upon a net helicity derived from random turbulent motions. The limitation of turbulently derived helicity due to early back reaction is discussed later, but first we discuss the preference for a finite angle, specifically a (2n + 1)π/4 angle of rotation in n periods of rotation, for an effective helicity. (Preferably n = 0.)
The Original αω Dynamo

The original proposal by Parker (1955, 1979) of the αω dynamo in rotationally sheared conducting flows seemed to be the logical answer to the problem of creating the large, highly organized fields of stars and galaxies as revealed by polarized synchrotron emission and Faraday rotation maps. Here the radial component of a poloidal field is wrapped up by differential rotation into a much stronger toroidal field. Then, as proposed by Parker, cyclonic motions of geostrophic flow twist and displace axially a fraction of the toroidal flux back into the poloidal direction. Subsequent merger of this small component of poloidal flux with the large scale original poloidal flux, by resistive diffusion or reconnection, completed the cycle. This latter process of merging the small scales to create the large scales is referred to as mean field dynamo theory. There were two apparently insurmountable problems with this theory. The first, as argued by Moffatt (1978) and as discussed in Roberts & Soward (1992), was that geostrophic cyclonic flows, with negative pressure on axis, make very many revolutions before dissipating, therefore reconnecting the flux in an arbitrary orientation. Hence, the orientation of any newly formed component of poloidal flux would be averaged to near zero. The star-disk driven plumes, on the other hand, avoid this difficulty by falling back to the disk in less than π radians of rotation, thereby terminating further rotation by fluid merging within the disk. The second difficulty was that the large dimensions of interstellar space and finite resistivity ensured a near infinite magnetic Reynolds number, Rm = Lv/η (L the dimension, v the velocity and η the resistivity), so that, in general, the resistive reconnection time would become large compared to the age of the astrophysical object. Consequently newly minted poloidal flux would never merge with the original poloidal flux.
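The differential winding (omega-effect) step can be sketched kinematically. This is a minimal sketch assuming a Keplerian rotation law and ignoring back reaction and diffusion; the function names and the 10-orbit example are illustrative, not from the paper:

```python
import math

# Kinematic "omega effect": differential Keplerian rotation shears a radial
# field B_r into a toroidal field that, before back reaction, grows linearly
# with time: B_phi(t) ~ B_r * |r dOmega/dr| * t.
def keplerian_omega(r, gm):
    return math.sqrt(gm / r**3)

def toroidal_after(B_r, r, gm, t):
    # |r dOmega/dr| = (3/2) Omega for a Keplerian disk
    shear = 1.5 * keplerian_omega(r, gm)
    return B_r * shear * t

G, c, M_sun = 6.674e-8, 2.998e10, 1.989e33   # cgs
gm = G * 1e8 * M_sun                         # hypothetical 1e8 M_sun CMBH
r = 100 * (2 * gm / c**2)                    # 100 gravitational radii
T = 2 * math.pi / keplerian_omega(r, gm)     # orbital period at that radius

# After 10 orbits the toroidal field exceeds the seed radial field
# by a factor of 30*pi, roughly 94.
amp = toroidal_after(1.0, r, gm, 10 * T)
print(amp)
```

The linear growth here is what the α step of the cycle must convert back into poloidal field for the dynamo to close and become exponential.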
Currently, although the details of reconnection are poorly understood, it is well recognized in astrophysical observations and theory, and in the many fusion confinement experiments, that reconnection occurs astonishingly fast, at up to the Alfvén speed. As a result, physicists concerned with the problem turned to turbulence as the solution, both to produce a small net helicity and to produce an enhanced resistivity in order to allow reconnection of the fluxes. Furthermore, mean field theory was developed to predict the emergence of large scale fields from the merger of small scale turbulent motions (Steenbeck, Krause & Rädler 1966; Steenbeck & Krause 1969a,b). Mean field turbulent dynamo theory has dominated the subject ever since, for the last 40 years.

The Turbulent Dynamo

There are two principal problems with turbulent dynamos: first, the difficulty of deriving a net and sufficient helicity from random turbulent motions, and second, the ease with which the turbulent motions themselves can be suppressed by the back reaction of the field stress, in this case of the multiplied toroidal field (Vainshtein, Parker, & Rosner 1993). Regardless of the source of such turbulence, i.e., the α viscosity (Shakura & Sunyaev 1973), the magneto-rotational instability (Balbus & Hawley 1998) or magnetic buoyancy (Chakrabarti, Rosner, & Vainshtein 1994), the turbulent stress will be small compared to the stress of Keplerian motion. The stress of the magnetic field produced will be limited by the back reaction on this turbulence. As discussed later, the back reaction would limit the stress of the dynamo fields to values very much less than the Keplerian stress. The problem of the origin of reconnection remains, but here turbulence in the disk can help: one need only assume that the flow of energy in turbulence is always dissipative, and that the fraction of magnetic energy dissipated by this turbulence, though possibly very small, satisfies the necessary reconnection.
Secondly, fast reconnection (at near Alfvén speed) in low beta, collisionless plasmas has been modeled (e.g., Drake et al. 2003). We note that we are not considering turbulence as a significant source of helicity in the αω dynamo, while at the same time invoking turbulence in order to enhance reconnection.

The Astrophysical Consequences

We are attempting to demonstrate that a robust dynamo in an accretion disk, dependent upon a small mass fraction of orbiting stars, becomes a dominant magnetic instability of CMBH formation. To the extent that this is indeed so, and since orbiting stars and Keplerian accretion are universal, it becomes difficult to avoid the conclusion that the free energy of formation of most CMBHs would be converted into magnetic energy. In our view the magnetic field, both energy and flux, generated by the black hole accretion disk dynamo presumably powers the jets and the giant magnetized radio lobes. For us, both of these phenomena are most likely the on-going dissipation, by reconnection and synchrotron emission, of force-free helices of wound up strong magnetic field produced by the accretion disk dynamo. (The large scale magnetic flux, as indicated by polarization observations in which the correlation length is of order the distance between bright knots in M87 (Owen, Hardee & Bignell 1980), is equally demanding of the coherence of the dynamo process.) The electromagnetic mechanism of extraction of angular momentum and energy from the accretion disk has been proposed by Blandford (1976) and Lovelace (1976). Recently, the process of formation of such a force-free helix by shearing of the foot-points of the magnetic field by the rotation of the accretion disk has been considered by Lynden-Bell (1996), Ustyugova et al. (2000), Li et al. (2001a), and Lovelace et al. (2002).
The magnetic dynamo in the disk is an essential part of the whole emerging picture of the formation and functioning of AGNs, closely related to the production of magnetic fields within galaxies, within clusters of galaxies, and to the still greater energies and fluxes in the IGM. Black hole formation, Rossby wave torquing of the accretion disk (Lovelace et al. 1999; Li et al. 2000, 2001b; Colgate et al. 2003), jet formation (Li et al. 2001a), magnetic field redistribution by reconnection and flux conversion, and finally particle acceleration in the radio lobes and jets are the key parts of this scenario (Colgate, Li, & Pariev 2001). Finally, we note that if almost every galaxy contains a CMBH, and if a major fraction of the free energy of its formation is converted into magnetic energy, then only a small fraction of this magnetic energy, as seen in the giant radio lobes (Kronberg et al. 2001), is sufficient to propose a possible feedback in structure formation and in galaxy formation.

The Back Reaction Limit and Star-Disk Collisions

The main stream of astrophysical dynamo theory is the mean field theory, where an exponential growth of the large scale field is sought while averaging over small scale motions of the conducting plasma, usually regarded as turbulence. The behavior of turbulent dynamos at the nonlinear stage, i.e., back reaction, when one can no longer ignore the Ampere force, is not fully understood and is the subject of active investigation (Vainshtein & Cattaneo 1992; Vainshtein, Parker, & Rosner 1993; Field, Blackman, & Chou 1999). However, as argued by Vainshtein & Cattaneo (1992), the growth of magnetic fields as a result of the action of the kinematic dynamo should lead to the development of strong field filaments with a diameter of order L/Rm^{1/2}, where L is the characteristic size of the system and Rm is the magnetic Reynolds number.
The field in the filaments reaches the equipartition value much sooner than the large scale field, causing the suppression of the α effect due to the strong Ampere force, or back reaction, acting in the filaments. As a result, turbulent αω dynamos may be able to account for the generation of large scale magnetic fields only at the level of Rm^{-1/2} of the equipartition value. Finding a mechanism for producing and maintaining large scale helical flows, resulting in a robust α effect, is thus very important for the generation of large scale magnetic fields of the order of the equipartition magnitude. One way of alleviating the difficulty with the early quenching of the turbulent α-dynamo may be a nonlinear dynamo, where the α-effect is maintained by the action of the large-scale magnetic field itself rather than by small-scale turbulent motions. Such a nonlinear dynamo, due to the buoyancy of the magnetic field in a rotating medium, was first proposed by Moffatt (1978). As magnetic flux tubes rise, they expand sideways to maintain pressure balance with the less dense surrounding gas. This sideways velocity is claimed to cause the magnetic tube to bend under the action of the Coriolis force. Calculations of the nonlinear dynamo applied to the Sun were performed by Schmitt (1987) and Brandenburg & Schmitt (1998). A somewhat different mechanism for the radial expansion of the buoyant magnetic loops (due to the cosmic ray pressure) was proposed in the context of the Galactic dynamo by Parker (1992), and detailed calculations of the resulting mean field theory were performed by Moss, Shukurov & Sokoloff (1999). In this case the matter, cosmic rays, would not fall back to the galaxy surface, but the inertial mass of the cosmic rays is smaller than that of the galactic matter by ∼ 10^{-10}, again greatly reducing the back reaction limit.
The buoyant dynamo can amplify the large-scale magnetic field only up to B_c ∼ Rm^{-1/2} B_equi, where B_equi is the magnetic field in equipartition with the turbulent energy. Furthermore, the buoyant α is a fraction (generally, a small fraction) of the velocity of the buoyant rise of the toroidal magnetic fields, which depends on the radius d of a flux tube, the half thickness H of the disk, the Alfvén speed v_A, and a constant C of order unity. For Rm ∼ 10^15 to 10^20 in the accretion disk, B_c ∼ 10^{-8} to 10^{-10} B_equi. The corresponding Alfvén speed will be about 10^{-8} to 10^{-10} of the sound speed. As we show below, star-disk collisions lead to a large mass ejected above the disk and therefore result in robust, large scale helical motions of hot gas with a rotation velocity exceeding the sound speed in the disk and, therefore, 10^8 to 10^10 times faster than the buoyant motions of the magnetic flux tubes. Thus, we can safely neglect the buoyant dynamo in our calculations of the linear stage of the star-disk collision driven dynamo.

Star-Disk Collisions

It has long been realized that collisions of the stars forming the central part of the star cluster in AGNs with the accretion disk lead to the exchange and stripping (or possibly growth) of the outer envelopes of the stars and also, inevitably, to a change in the momentum of the stars. This has an important impact on the dynamics of stellar orbits. Thus the evolution of the central star cluster may contribute to providing accretion mass for the formation of the CMBH and can account for part of the observed emission from AGNs (Syer, Clarke, & Rees 1991; Artymowicz, Lin, & Wampler 1993; Artymowicz 1994; Rauch 1995; Vokrouhlicky & Karas 1998; Landry & Pineault 1998). Zurek, Siemiginowska, & Colgate (1994) considered the physics of plasma tails produced after star-disk collisions (see also Zurek, Siemiginowska, & Colgate 1996). They suggest that emission from these tails may account for the broad lines in quasars.
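The quoted suppression level follows directly from the Rm^{-1/2} scaling; a one-line check over the range of magnetic Reynolds numbers given in the text:

```python
# Suppression of the buoyant dynamo: the large-scale field saturates near
# B_c ~ Rm^(-1/2) * B_equi.  Rm ~ 1e15..1e20 are the values quoted in the text.
def suppression(Rm):
    return Rm ** -0.5

lo, hi = suppression(1e15), suppression(1e20)
print(lo, hi)   # roughly 3e-8 down to 1e-10 of equipartition
```

This reproduces the stated B_c ∼ 10^{-8} to 10^{-10} B_equi, and hence the claim that the buoyant dynamo is negligible compared to plume-driven motions.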
Here we suggest another consequence of stars passing through the accretion disk: the generation of magnetic fields. For this to happen on a large scale and at the Keplerian back reaction limit requires multiple, repeatable, coherent rotation through a finite angle, and axial translation, of conducting matter well above the disk. We emphasize the importance of an experimental, laboratory demonstration of the rotation and translation of plumes, driven by jets in a rotating frame (Beckley et al. 2003). These laboratory plumes are the analogue of those produced by the star-disk collisions, which are the source of the helicity fundamental to this dynamo mechanism.

The Structure of the Accretion Disk

The near universally accepted view of accretion disks is that based upon the transport of angular momentum by turbulence within the disk. This is the α-disk model, which is also referred to as the Shakura-Sunyaev model and is, to many, the standard model. This model was developed by Shakura (1972), Shakura & Sunyaev (1973), and Novikov & Thorne (1973), and since then it has been widely used for geometrically thin and optically thick accretion disks in moderate to high luminosity AGNs. In this model the viscous transport coefficient is limited by the vertical size of an eddy that can "fit" within the height of the disk, 2H, and by an eddy velocity of less than the sound speed, c_s, within the disk. Thus the maximum possible viscous transport coefficient becomes ν_max < H c_s, regardless of what source of turbulence or instability one invokes.
The consequence of this limitation is that, using the Shakura-Sunyaev formalism, a constant mass flow, and the physics of radiation transport, pressure, and surface emission, one obtains a disk around a typical CMBH of 10^8 M_⊙ that has too great a mass thickness at too small a radius, ∼ 0.013 pc, to be consistent, within several orders of magnitude, with the generally accepted picture of galaxy formation and the angular momentum distribution of a "flat rotation curve" disk. This difficulty has been recognized for some time (Shlosman & Begelman 1989), motivating the consideration of various alternate transport mechanisms. However, a recent in-depth review of the problem by Goodman (2003) finds no simple solution. As an alternative solution we have found in recent years that large scale horizontal vortices can be excited within a Keplerian disk by appropriate pressure or angular momentum distributions, closely analogous to Rossby vortices within the disk (Li et al. 2001b). These vortices initially have a horizontal dimension of ∼ 2 to 4 H. One might then ask what the difference is between the truncation of eddy size at the disk height in a turbulent disk and in the Rossby vortex disk, because both are truncated initially at the same size. The difference is that the Rossby vortices act coherently, and so each vortex, regardless of size, acts to transport angular momentum in one direction only, namely radially outwards, as compared to turbulence, which is a random walk process. Furthermore, the Rossby vortices have the additional property of merging, leading to larger vortices until r_vortex ≃ R/3. The transport is then faster, with a transport coefficient that can be larger by the ratio ν_Rossby/ν_turbulence ≃ r/H ∼ 10^4, thus making feasible an accretion disk that matches the flat rotation curve mass and angular momentum distribution of typical galaxy formation.
In addition, we take note of the fact that we have recently suggested that the origin of CMBHs and their correlated power law velocity dispersion can, surprisingly, be explained by forming the CMBH accretion disk using the Rossby vortex instability mechanism rather than the Shakura-Sunyaev turbulent model. This prediction, its confirmation by observations, and the mass thickness problem are sufficiently provocative that an accretion disk dynamo model based solely upon the Shakura-Sunyaev model may be misleading. Fortunately, the Rossby vortex instability universally predicts a thinner disk, and all disk problems with the dynamo become less difficult. Still, as described in the companion paper II (Pariev, Colgate & Finn 2006), the star-disk collision driven dynamo operates at radii ∼ 200 r_g in the accretion disk, where the excessive mass thickness of the Shakura-Sunyaev disk is not yet a problem for self-gravity and for matching to the outside "flat rotation curve". The Shakura-Sunyaev model is also better developed than the Rossby vortex model at present. Hence, in order to minimize the number of speculative assumptions, we proceed with our dynamo model based upon the Shakura-Sunyaev disk model and note the alternate differences when necessary. This work is arranged as follows: in section 2 we discuss the distribution of stars, in section 3 the structure of the accretion disk, and in section 4 the kinematics of star-disk collisions. Finally, we end with a summary.

Star Clusters and their Distributions

To proceed with the dynamo problem we need to address the following issues:

1. What is the distribution of stars in coordinate and velocity space in the central star cluster of an AGN?

2. What are the velocity, density and conductivity of the plasma in the disk and in the corona of the disk?

3. What is the hydrodynamics of the flow resulting from the passage of a star through the disk?

Each of these problems is difficult to solve.
Moreover, there are no detailed solutions to these problems to date. Furthermore, they all interrelate. In the following three subsections we present a brief (far from complete) analysis of each of the problems, based on available research and some of our own conjectures. Because each of these problems interrelates to some degree with the others, the justification of some assumptions must be delayed. However, as noted above, we will predict a dynamo gain so large that the details of the disk and of the star-disk collisions and their frequency become of secondary importance compared to the existence of the disk, a few stars, and the CMBH.

Kinematics of the Central Star Cluster

By now there is strong observational evidence (e.g., Tremaine et al. 2002; Merritt & Ferrarese 2001; van der Marel 1999; Kormendy et al. 1998; van der Marel et al. 1997) that many galactic nuclei contain massive dark objects in the range of ≈ 10^6–10^9 M_⊙. Numerical simulations of the evolution of central dense stellar clusters indicate that they are unstable to the formation of black holes, which would subsequently grow to larger masses by absorbing more stars (Quinlan & Shapiro 1990). Recent observations and the interpretation of very broad, skewed profiles of the iron emission line (e.g., Tanaka et al. 1995; Bromley, Miller, & Pariev 1998; Fabian et al. 2000) in Seyfert nuclei provide direct evidence for strong gravitational effects in the vicinity of massive dark objects in AGNs. This leaves us with the conviction that the nuclei of AGNs indeed harbor black holes with accretion disks. Although observations of star velocities and velocity dispersion are used to obtain an estimate of the mass of the supermassive black hole, a measurement of the number density of stars is limited by resolution to about 1 pc for M32 and M31 and about 10 pc for the nearest ellipticals. From these observations we infer a star density of n(1 pc) ≈ 10^4–10^6 M_⊙ pc^{-3} at 1 pc (Lauer et al. 1995).
One needs to rely on the theory of the evolution of the central star cluster in order to obtain number densities of stars closer to the black hole. The subject of the evolution of a star cluster around a supermassive black hole has drawn significant interest in the past. The gravitational potential inside the central 1 pc will always be dominated by the black hole. Bahcall & Wolf (1976) showed that, if the evolution of a star cluster is dominated by relaxation, the effect of a central Newtonian point mass on an isotropic cluster would be to create a density profile n ∝ r^{-7/4}. However, for small radii (≈ 0.1–1 pc) the effects of physical collisions between stars become dominant over two-body relaxation. Also, the disk produces a drag on the stellar orbits, which accumulates over many star passages. The result of the star-disk interactions is to reduce the inclination, eccentricity, and semimajor axis of an orbit, finally causing the star to be trapped in the disk plane and so to move on a circular Keplerian orbit (Syer, Clarke, & Rees 1991; Artymowicz, Lin, & Wampler 1993; Artymowicz 1994; Rauch 1995; Vokrouhlicky & Karas 1998). Closer to the black hole (≤ 100 r_g, where r_g = 2GM/c^2 is the gravitational radius), general relativistic corrections to the orbital motions and tidal disruption of the stars by the black hole must be taken into account. Considering all these effects, and furthermore that the star-star collisions cannot be treated in a Fokker-Planck (or diffusion) approximation, an accurate theory becomes a difficult endeavor, which has not yet been completed to our knowledge. To obtain a plausible estimate of the number density and velocity distribution of stars in the central cluster we will follow the work of Rauch (1999), which addresses all of the above effects on the star distribution except the dragging by the disk.
Rauch (1999) showed that star-star collisions lead to the formation of a plateau in the density of stars at small r because of the large rates of destruction of stars by collisions. We adopt the results of model 4 from Rauch (1999) as our fiducial model. This model was calculated for all stars initially having one solar mass. The collisional evolution in model 4 is close to the stationary state, in which the combined losses of stars due to collisions, ejection, tidal disruptions and capture by the black hole are balanced by the replenishment of stars as a result of two-body relaxation in the outer region with its n ∝ r^{-7/4} density profile. Taking into account the order of magnitude uncertainties in the observed star density at 1 pc, the fact that model 4 has not quite reached a stationary state is acceptable for the purpose of order of magnitude estimates. This extrapolated lack of stars within the innermost regions of the disk presumably occurs because of star-star collisions and tidal disruption of stars, and is independent of disk structure. The zero n at r < 10 r_t is a crude approximation to the actual decrease in the number density of stars, because we recognize that distant gravitational scattering will lead to some diffusion of stars from distant regions and thus to a feeding of stars into the inner regions, limited by r_t. We shall comment further on the influence of the drag by the disk on the above density profile. Following the formula from Rauch (1999), one can evaluate the probability that a solar mass star on an elliptic orbit with eccentricity e and minimum distance r_min from the black hole will experience a collision with another star during one orbital period. At 100 r_g, or ∼ 10^{-3} pc, this probability is sufficiently small that the drag of the disk during star-disk collisions can be more important. In order to evaluate that drag we need to know the surface density of the disk.
Disk Structure and Star Collisions

We adopt the α-disk model, which we also refer to as the Shakura-Sunyaev (Shakura 1972; Shakura & Sunyaev 1973) model. We also consider the Rossby vortex model, for the reasons outlined in the introduction. As noted before, the Rossby vortex instability fortunately predicts universally a thinner disk, and all disk problems with the dynamo become less difficult. Hence we proceed with our dynamo model based upon the Shakura-Sunyaev disk model and note the alternate differences when necessary. For thirty years the Shakura-Sunyaev disk model has been the most widely used model of the accretion disk. The expressions for the parameters of the α-disk can be found in the original articles (Shakura 1972; Shakura & Sunyaev 1973) and in many later books (e.g., Shapiro & Teukolsky 1983; Krolik 1999; Bisnovatyi-Kogan 2002). Here, we give the complete set of these expressions, conveniently scaled for our problem (a supermassive black hole, radius about 200 r_g or 10^{-2} pc), in Appendix A. There have been a number of works perfecting and improving the simple analytical Shakura-Sunyaev model and determining the limits of applicability of this solution to real AGN accretion disks. Here we leave aside the complex physics of the innermost (≤ 10 r_g) parts of the accretion flow, because the innermost regions are devoid of stars and so star-disk collisions are almost non-existent there. More realistic bound-free opacities were included by Wandel & Petrosian (1988), non-LTE models were developed by Hubeny & Hubeny (1997) for disks with arbitrary optical depth, and optically thin and optically thick disks were considered in Artemova et al. (1996). If one looks at the interval of disk radii ∼ 100 to ∼ 1000 r_g, these improvements have some quantitative effects on the disk structure; the emitted spectrum, for example, may differ significantly among models.
More exact descriptions of the accretion disk come at the price of losing the analytic simplicity of the expressions for the radial profiles of the density, temperature, disk height, etc., while gaining a factor of only a few in accuracy. Because of the approximate nature of our model (mandated by the poor accuracy of its other ingredients), we prefer to use the simplest of the disk models, and therefore use the analytic results given in the original works of Shakura and Sunyaev. The surface density of the α-disk in the inner, radiation dominated part, where Compton opacity prevails, is given by expression (A4) in Appendix A. When expressed in units of M_⊙/R_⊙^2 it becomes expression (5), where α_ss is the "α"-parameter of the disk model, l_E is the ratio of the luminosity of the disk to the Eddington limit for a black hole of mass M, and ǫ is the fraction of the rest mass energy of the accreting matter that is radiated away. Thus, close to r_g, Σ = 404 g cm^{-2}. Expression (5) is valid for a radiation pressure supported disk, where r < r_ab, with r_ab given by expression (A2). For the typical values α_ss = 0.01, ǫ = 0.1, l_E = 0.1, M_8 = 1, we obtain r_ab = 2.3 · 10^{-3} pc ≈ 240 r_g and Σ_ab = Σ(r_ab) ≈ 4.2 · 10^6 g cm^{-2}. When the disk becomes self gravitating, it may become subject to a gravitational instability. In Appendix A we check this by calculating the Toomre parameter To = κ c_s/(π G Σ) (e.g., Binney & Tremaine 1994), where κ is the epicyclic frequency and c_s is the vertically averaged sound speed. The gravitational instability develops if To < 1. As follows from the analysis in Appendix A, the disk has a well defined radius of stability r_T, such that for r > r_T it becomes unstable. In the case when r_T < r_ab, the expression for r_T is given by formula (A33). For the values α_ss = 0.01, ǫ = 0.1, l_E = 0.1, M_8 = 1, the radius of stability r_T falls close to the radius of transition r_ab between the radiation dominated and gas pressure dominated parts of the disk.
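As an illustration, the Toomre criterion can be evaluated numerically. The input values below are rough assumptions chosen near r_ab (they are not the profiles of Appendix A), so the result should only show that To is of order unity there, consistent with r_T falling close to r_ab:

```python
import math

# Toomre stability parameter To = kappa * c_s / (pi * G * Sigma);
# the disk is gravitationally unstable where To < 1.
G = 6.674e-8   # cm^3 g^-1 s^-2 (cgs)

def toomre(kappa, c_s, sigma):
    return kappa * c_s / (math.pi * G * sigma)

# Hypothetical values near r_ab: kappa ~ Omega_K ~ 2e-7 1/s, c_s ~ 5e6 cm/s,
# Sigma ~ 4e6 g/cm^2 (close to the Sigma_ab quoted in the text).
To = toomre(2e-7, 5e6, 4e6)
print(To)   # of order unity: marginal stability
```

That To comes out near 1 for these rough inputs is what places the stability radius r_T close to r_ab.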
The development of the Jeans instability should lead to the formation of spiral patterns and fragmentation of the disk (Shlosman & Begelman 1989), which will happen on the radial inflow time scale at a radius ≈ r_T. Therefore, for estimating the drag produced by the disk on passing stars, we can limit ourselves to considering only the inner portion of the disk at r < r_ab and use equation (5) for the disk surface density. The gas beyond r_ab may also influence the motion of stars. It is difficult to evaluate the drag produced on stars passing through the gravitationally unstable outer parts of the disk at r > r_T. However, we note that the rate of star-disk collisions is maximized at r ∼ 10 r_t ∼ r_ab, so most of the star-disk collisions happen inside the radiation dominated zone (zone (a)) of the disk. The Rossby vortex model of the disk predicts a nearly constant mass thickness, 100 g cm^{-2} < Σ_RVI < 1000 g cm^{-2}. This is about the same as the Shakura-Sunyaev model near the BH, but becomes very much less at large radius. Consequently the self gravity condition occurs at a much larger radius, 3 to 10 pc, and matches smoothly onto the galactic flat rotation curve mass distribution. Hereafter, we will use the disk parameters of zone (a) listed in Appendix A for the estimates of star-disk collisions. The disk half-thickness (expression (A5)) can be expressed in units of r_g, in solar radii, or as a fraction of r_ab; in the latter form, H ≈ 3.7 · 10^{-3} r_ab for α_ss = 0.01. It is natural to expect that the dynamo growth rate will also be maximized at small radii, primarily within zone (a) where the disk is radiation dominated, but outside the region, r_t ≃ 21 r_g, of tidal destruction of stars.
However, we should also point out that although proof of principle of the dynamo is most likely where the growth rate is maximum, we also expect that, regardless of where the growth rate maximizes, the back reaction will limit the maximum fields, and that subsequent diffusion outwards (as for the angular momentum) and advection inwards (as for the mass) will ensure a redistribution of the magnetic flux, reaching a new equilibrium presumably less dependent upon where the maximum dynamo growth rate occurs.

Star-Disk Interaction

The orbital period of the star is determined by r_min, where, as before, r_min is the minimum impact radius of the star's orbit. The typical velocity of the star relative to the disk is close to the Keplerian velocity at r_min. Since the speed of sound in the disk is much smaller than the Keplerian velocity, by the ratio H/r ≃ 3.7 · 10^{-3}, stars pass through the disk with highly supersonic velocities. The drag force on the star consists of two components, collisional and gravitational. The collisional or direct drag is produced by the interception of disk material by the geometric cross section of the star. Assuming the star to have a solar mass and radius, this force is F_drag = π R_⊙^2 ρ v_*^2, where ρ is the mass density of the gas in the disk and v_* is the velocity of the star relative to the disk gas. Radiation drag is negligible compared to gas drag as long as the speed of sound is nonrelativistic, i.e., c_s ≪ c. The second component of the drag force is due to deflection of the gas by the gravitational field of the star. Rephaeli & Salpeter (1980) found that the latter component is nonzero only for supersonic motion and gave an expression for that force in the limit v_* ≫ c_s, involving the Coulomb logarithm Λ.
Taking the ratio of the two forces, using for v_* its Keplerian value v_* = (GM/r)^{1/2}, and using for the Coulomb logarithm its maximum possible value Λ = r/R_⊙, one obtains the ratio given in equation (12). One can see from equation (12) that the force due to the direct interception of gas by the star is much larger than the gravitational drag for all values of r of interest to us, r ≲ 10^5 r_g. Thus, we can consider the change of momentum caused by the disk on passing stars as purely due to the interception of gas by the geometrical cross section of the star, π R_⊙^2. Hence, the characteristic time needed to substantially change the star orbit as a result of star-disk interactions, t_disk, is approximately equal to the time needed for the star to intercept a disk mass equal to the mass of the star. A star will pass through the disk twice per orbital period. Assuming all stars have a solar mass and radius, the ratio of the orbital period to t_disk defines τ_disk. Using expression (5) for Σ in the region r < r_ab, one obtains τ_disk ≈ 6.2 · 10^{-9} for α_ss = 0.01. The corresponding star-disk interaction time scale is t_disk ≈ 1.58 · 10^4 yr for α_ss = 0.01, and is independent of the semi-major axis of the star orbit. As was shown by Rauch (1995), secular evolution of all orbital elements of a star happens on the same time scale t_disk from equation (14). From the ratio of τ_disk to τ_coll (equation (3)) one finds that for orbits with r_min ≤ 30 r_g one has τ_disk < τ_coll, and the effect of star-star collisions dominates over the effect of star-disk collisions (assuming typical parameters for the disk). For the radii 30 r_g ≤ r_min ≤ r_T the orbit evolution is more influenced by the drag from the disk than by star-star collisions. (We note that this radius, 30 r_g, is only slightly greater than the tidal disruption radius by the CMBH, r_t ≃ 20 r_g.)
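These two estimates can be sketched numerically. The radius, black hole mass, and surface density below are illustrative assumptions (we evaluate at r = 100 r_g and use ln(r/R_⊙) for the Coulomb logarithm rather than the paper's maximal choice Λ = r/R_⊙, so the numbers differ in detail from equation (12)):

```python
import math

# (1) Compare direct interception drag with gravitational (dynamical-friction)
#     drag for a solar-mass star crossing the disk at r = 100 r_g.
G, c = 6.674e-8, 2.998e10            # cgs
M_sun, R_sun = 1.989e33, 6.957e10
M_bh = 1e8 * M_sun                   # hypothetical CMBH mass

r = 100 * (2 * G * M_bh / c**2)
v = math.sqrt(G * M_bh / r)          # Keplerian speed of the star
lnL = math.log(r / R_sun)            # Coulomb logarithm (assumed form)

# F_direct / F_grav = R_sun^2 v^4 / (4 (G M_sun)^2 lnL): direct drag dominates.
F_ratio = (R_sun**2 * v**4) / (4 * (G * M_sun)**2 * lnL)

# (2) Fraction of the stellar mass intercepted per orbit (two disk crossings),
#     tau_disk = 2 pi R_sun^2 Sigma / M_sun, with the Sigma quoted near r_g.
Sigma = 404.0                        # g/cm^2
tau_disk = 2 * math.pi * R_sun**2 * Sigma / M_sun
print(F_ratio, tau_disk)             # tau_disk comes out near 6.2e-9
```

The per-orbit fraction reproduces the quoted τ_disk ≈ 6.2 · 10^{-9}, and the force ratio confirms that geometric interception dominates at these radii.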
Only a fraction of the stars from the outer region located beyond ≈ 1000 r_g will not be put into the disk plane by star-disk drag. The results of Rauch (1995) show that it takes considerably longer than t_disk to reorient the retrograde star orbits. During this reorientation process the semimajor axes of initially retrograde star orbits decrease by a factor of ≈ 10. Before the alignment process for such stars can be completed, they will move inward in radius to less than ≈ 30 r_g, into the star-star collision zone, where their orbital inclinations will be randomized. Another factor preventing all stars from being trapped into the disk plane is that there is always a fraction of stars which are injected by two-body relaxation into the neighborhood of the black hole from large (much larger than r_T) radii. These stars can be brought directly into the region r ≤ 30 r_g (or close to it) and contribute to the collisional core of the stellar cluster. To summarize, both star-disk and star-star collisions can be important for determining the distribution function in the central star cluster. However, it seems unlikely that the drag by the disk can trap all stars into the disk plane and denude the central ≈ 10^{-3} pc of all stars not in the disk plane. Trapping of stars by the disk will reduce the number of stars given by (1), but this requires more involved computations, which are beyond the scope of the present work. Both star-star collisions and the fact that the disk traps stars of lower eccentricity faster than stars of larger eccentricity lead to highly eccentric orbits of stars in the central ≈ 10^{-3} pc. Drag by the disk will also lead to prograde orbits prevailing over retrograde ones. However, for our purposes, we assume that the star density is given by equations (1), that all stars have e = 1, and that their orbits are randomly oriented in space. (This approximation is better in the model of the disk driven by Rossby vortices.)
The Rate of Star-Disk Collisions We shall use the number density of stars, n, given by equation (1) in order to evaluate the rate of star-disk collisions. The flux of stars through the disk coming from one side of it is nv/4, where we assume that all stars have the same velocity v = √2 (rΩ_K) (parabolic velocity) and are distributed isotropically. One then obtains, for M_8 = 1, (1/4)nv = 2.4·10^−39 cm^−2 s^−1 n_5 (r/10^−2 pc)^{−9/4} for r > 10^−2 pc, (1/4)nv = 2.4·10^−39 cm^−2 s^−1 n_5 (r/10^−2 pc)^{−1/2} for 10r_t < r < 10^−2 pc, and (1/4)nv = 0 for r < 10r_t. Integrating the flux of stars coming from both sides of the disk over the area πr² inside some given radius r, one can estimate the rate of star-disk collisions within the radius r. Let us define the time ∆T_c = ∆T_c(r) as the inverse of this rate, i.e. on average one star passes through the disk area inside the radius r during the time ∆T_c. The result is (see equation (1)) ∆T_c = (2π/Ω_K(r)) · 2.8·10^−5 · n_5^{−1} (r/10^−2 pc)^{−3/2} for r > 10^−2 pc, and ∆T_c = ∞ for r < 10r_t (no collisions), where 2π/Ω_K(r) = T_K(r) is the period of a Keplerian circular orbit at the radial distance r from the black hole. We see that the number of star-disk collisions happening per Keplerian period, T_K(r), is ∝ r³ inside the collisional core of the star cluster, i.e. within ≈ 10^−2 pc. For the outer region of the stellar cluster beyond ≈ 10^−2 pc this number continues to increase with r, but more slowly, as ∝ r^{3/2}. The number of collisions per Keplerian period at 0.01 pc is ∼ 30,000, leading to fluctuations of the order of 1% within an orbital time of several years. If these collisions should produce broad emission- and absorption-line regions (BLRs), then this result may not be inconsistent with observations.
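The piecewise flux above can be written as a small helper. A hedged sketch, where the normalization 2.4·10^−39 cm^−2 s^−1 is taken from the text and the inner cutoff 10r_t is replaced by a hypothetical constant:

```python
# Hedged numeric sketch of the piecewise star flux nv/4 quoted above
# (normalization at r = 1e-2 pc for M_8 = 1). The inner cutoff radius
# R_CUT_PC stands in for 10*r_t and is an assumption, not a quoted value.
R_REF_PC = 1e-2
R_CUT_PC = 1e-4

def star_flux(r_pc, n5=1.0):
    """Flux of stars through one side of the disk, in cm^-2 s^-1."""
    if r_pc < R_CUT_PC:
        return 0.0                      # inside 10*r_t: no stars, no collisions
    x = r_pc / R_REF_PC
    if r_pc > R_REF_PC:
        return 2.4e-39 * n5 * x**(-9/4)  # outer cluster branch
    return 2.4e-39 * n5 * x**(-1/2)      # collisional-core branch
```

Note that the two branches agree at r = 10^−2 pc, so the flux is continuous there.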
Estimates of the density of the matter producing the broad emission lines, from the interpretation of allowed and forbidden transitions, give ρ_BL ∼ 10^−11 to 10^−13 g/cm³ (Sulentic, Marziani & Dultzin-Hacyan 2000). The geometrical half-thickness H of the disk in the radiation-pressure-dominated inner zone is independent of the disk model and of the mechanism of angular momentum transport, and is given by equation (A5). In the gas-pressure-dominated part of the disk, H depends only weakly on Σ, as H ∝ Σ^{1/8}. Only in the case of the RVI disk does the low column density, Σ_RVI ∼ 10² to 10³ g/cm², lead to a sufficiently low density, ρ_RVI = Σ_RVI/H ≃ Σ_RVI · 3·10^−14 cm^−1, which is consistent with the above estimates for the density of the matter, driven off the disk by star-disk collisions, that emits the broad emission lines. On the other hand, the Shakura-Sunyaev disk would be expected to have a density ρ_SS given by expression (A6) in the radiation-dominated zone (a) and by expression (A25) in the gas-pressure-dominated zone (b). If one equates the observed width of broad emission lines (∼ 7·10³ km/s) to the Doppler shift at the Keplerian velocity, one obtains an estimate of the location of the broad-line region at r ∼ 10³ r_g. This radius falls not far from the boundary between zones (a) and (b) in the Shakura-Sunyaev disk model (see expression (A2)). The density of the Shakura-Sunyaev disk at this radius is ∼ 10^−6 to 10^−8 g/cm³, depending upon the parameters of the model. This is at least 5 orders of magnitude larger than the ρ_BL required by observations. The difference in ρ between the Shakura-Sunyaev and RVI disks is almost completely attributable to the much lower column density Σ_RVI compared with Σ for the Shakura-Sunyaev model. Regardless, the function of the plumes in producing the helicity for the dynamo should be independent of these differences between the disk models.
Star-disk collisions were first suggested as the source of the BLRs by Zurek, Siemiginowska & Colgate (1994, 1996), but a detailed calculation of the phenomenon has not yet been performed, because it requires 3-D hydrodynamics with radiation flow and opacities determined by multiple lines. An approximation to this problem was calculated by Armitage, Zurek & Davies (1996) for the purpose of determining the rate of accretion of giant stars through dynamical friction with the disk, but the radiation flow in thin disks was not considered. We recognize that many additional variables of hydrodynamics, radiation, and geometry must be taken into account in order to positively identify BLRs with star-disk collisions. With these caveats we proceed to analyze the star collisions with the disk and the resulting plume formation from the standpoint of the fluid dynamics that has consequences for the dynamo. Plumes Produced by Star Passages through the Disk The first result of a star-disk collision is that a local fraction of the mass of the disk rises above the surface of the disk because of the heat generated by the collision. Two plumes, expanding on either side of the accretion disk, will be formed. A second result is the expansion of this rising mass fraction about its vertical axis into the relative vacuum above the disk surface, again because of the internal heat generated by the collision. A third result is the anticyclonic rotation of this expanding matter relative to the Keplerian frame corotating with the disk, because of the Coriolis force acting on the expanding matter. Again we emphasize that this rotation through a finite angle has been measured in the laboratory and agrees with a simple theory of conservation of angular momentum and radial expansion of the plume (Beckley et al. 2003). All three effects are important to the dynamo gain.
However, we will find that the dynamo gain during the lifetime of the accretion disk, ∼ 10^8 years, is so large that the accuracy of the detailed description of these "plumes" matters less than the facts of: (1) their axial displacement well above the disk; (2) their finite, ∼ π/2 radian, coherent rotation at every star-disk collision; and (3) their subsidence back to the disk in ∼ π radians. In this spirit we will estimate the hydrodynamics of the star-disk collision, attempting to establish the universality of this phenomenon as the basis of the accretion disk dynamo. As far as we know, no hydrodynamic simulations of the behavior of the disk matter due to stars passing through the disk have yet been performed. (This is because of the difficulty of 3-dimensional hydrodynamics with radiation flow.) The star passes through the disk at a velocity close to the Keplerian velocity of the disk at whatever radius the collision happens. The sound speed in the accretion disk is much less than the Keplerian speed v_K: c_s ≃ v_K H/r ≃ 3·10^−3 v_K at r_ab, where H is the disk half-thickness given by expression (A5) in zone (a). Hence, the star-disk collisions are highly supersonic. The temperature of the gas in the disk, shocked by the star moving at the Keplerian velocity, is of the order of the virial temperature in the gravitational potential of the central black hole. The corresponding pressure must include the radiation contribution, which in general will be much larger than the particle pressure. Because of the high Mach number of the collision, the pressure of the shocked gas is very much greater than the ambient pressure in the disk. This overpressure will cause a strong, primarily radial shock (radial from the axis of the trajectory) in the wake of the star, because of the large length-to-diameter ratio of the hot channel, H/R_⊙ ≃ 4·10². After the star emerges above the disk surface (i.e.
higher than the half-thickness of the disk), the heated, shocked gas in the wake of the star continues to expand sideways and, furthermore, starts to expand vertically because of the rapidly decreasing ambient pressure away from the disk mid-plane, where the pressure of the disk drops as ∝ exp(−z²/H²). Thus this expansion can be treated as an adiabatic expansion into vacuum after the plume rises by a few heights H above and below the disk, provided the radiative loss is fractionally small. We would now like to estimate the size, or radius, r_p, of the matter that rises ∼ 2H above the disk, or to a height l ≃ 3H above the mid-plane. Although smaller mass fractions with greater internal energy, corresponding to smaller radii of the shock, will expand to greater heights above the disk, we are nevertheless concerned with only this modest height, because we expect the entrained magnetic flux to be positively correlated with plume mass, and we wish to maximize the entrained flux. On the other hand, by conservation of energy, a larger mass will rise or expand to a smaller height. We also desire the plume to rise sufficiently far above the disk that there is ample time for radial expansion, and hence torquing of the entrained magnetic field, during the rise and fall of the plume material. This will be our standard plume. The radial extent of the plume should be somewhat less than its vertical extent, because the density gradient in the disk is largest in the vertical direction. The action of the Coriolis force leads to an elliptical shape of the horizontal cross section of the plume. This is due to the fact that epicycles of particles in the gravitational field of a point mass are ellipses with an axis ratio of 2 and with an epicyclic frequency of Ω_K. We performed simple ballistic calculations of trajectories of particles launched from a point at the mid-plane of the disk with initial velocities in different directions in the horizontal plane.
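A sketch of such a ballistic calculation, using the analytic solution of the linearized (Hill) equations of motion in the frame corotating with the disk, with Ω = 1 in arbitrary units; the 2:1 axis ratio of the epicyclic ellipse quoted above falls out directly:

```python
import math

# Hedged sketch of the ballistic calculation described in the text: the
# analytic epicyclic solution for a particle launched from the origin in
# the corotating frame. kappa = Omega for a point-mass potential, which is
# what gives the 2:1 ellipse. Units are arbitrary (Omega = 1).
OMEGA = 1.0

def epicycle(u, v, t):
    """Position (x radial, y azimuthal) at time t for launch velocity (u, v)."""
    x = (u / OMEGA) * math.sin(OMEGA * t) \
        + (2 * v / OMEGA) * (1 - math.cos(OMEGA * t))
    y = (2 * u / OMEGA) * (math.cos(OMEGA * t) - 1) \
        + (4 * v / OMEGA) * math.sin(OMEGA * t) - 3 * v * t
    return x, y

# Radial launch (v = 0): radial amplitude u/Omega, azimuthal amplitude
# 2u/Omega about the guiding centre -- the axis ratio of 2 quoted above.
```

A quick scan of one orbit for a purely radial launch confirms the radial excursion u/Ω and the azimuthal excursion 4u/Ω (peak to peak twice the radial one).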
We find that at the time of maximum height of the plume, ≈ T_K/4, the position angle of the major axis of the ellipse is approximately −π/4 from the outward radial direction e_r. At the time of the fall back to the disk plane, at ≈ T_K/2, the major axis of the ellipse is close to the azimuthal direction. Such a distortion in the shape of an otherwise cylindrical plume will only slightly affect the rotation of the entrapped toroidal flux and hence will not alter the dynamo action. Before calculating the size or radius, r_p, we first verify the adiabatic approximation, i.e. that the diffusion of radiation is fractionally small compared to the hydrodynamic displacements. For a Shakura-Sunyaev disk this will allow us to treat the star-disk collisions as strong shocks within the disk matter. Subsequently we will consider the thinner, lower-density Rossby disks (Li et al. 2001b), where radiation transport will dominate over shock hydrodynamics. However, for the purposes of the dynamo, the production of helicity from either type of plume will be similar. Radiation Diffusion in the Collision Shock During star-disk collisions the total energy taken from the star is ≈ Σv_K² πR_⊙². This energy is distributed over a column of radial extent ∆R_rad by radiation transport. For an estimate of ∆R_rad one can take the distance from the star track at which the sideways diffusion of radiation becomes comparable with the advection of the radiation by the displacement of the disk matter at the star velocity v_K (since the velocity of the strong shock is of the order of v_K). In this estimate, for ρ we take the density ahead of the shock in the undisturbed disk matter, in order to compare the radiation flux with the transport of energy and momentum by the shock, and we assume the Thomson opacity κ = 0.4 cm² g^−1. Then, using ρ from expression (A6) at r_ab, ∆R_rad = 10^8 cm = 1.4·10^−3 R_⊙. Since ∆R_rad ≪ R_⊙, the radiation will remain local to the shocked fluid.
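As a hedged numerical check of the ∆R_rad estimate: balancing sideways photon diffusion (effective speed ∼ c/τ, with τ = κρ∆R) against advection at v_K gives ∆R_rad ∼ c/(κρv_K). The value of v_K near r_ab used below is an assumption, not a number quoted in the text:

```python
# Hedged check of the radiation-front estimate above. rho is the zone (a)
# density quoted in the appendix; V_K is an assumed Keplerian speed near
# r_ab (~1e3 r_g), not a value from the paper.
C = 3.0e10          # speed of light [cm/s]
KAPPA = 0.4         # Thomson opacity [cm^2/g]
RHO = 7.5e-7        # disk density in zone (a) [g/cm^3]
V_K = 6.7e8         # assumed Keplerian speed near r_ab [cm/s]

dr_rad = C / (KAPPA * RHO * V_K)
print(f"{dr_rad:.1e} cm")   # same order as the 1e8 cm quoted in the text
```

The result lands within a factor of order unity of the quoted 10^8 cm, as expected for an order-of-magnitude balance.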
The state conditions in this shocked matter will depend upon the rapid thermalization between the matter and radiation. The number of photon scatterings, n_hν, within the traversal time of the radiation front ∆R_rad is large; therefore the radiation will be fully absorbed and thermalized with the gas within ∆R_rad. Since the gas pressure is radiation dominated for r < r_ab and the shock has a high Mach number, v_K/c_s ≈ r/H ≃ 280 at r_ab, the shocked matter will have a still higher entropy and be even further radiation dominated. In a strong shock the energy behind the shock will be half kinetic and half internal energy, where in this case the radiation pressure dominates. Thus the subsequent evolution of the radiation-dominated gas will be governed by adiabatic hydrodynamics of the fluid with a polytropic index γ = 4/3. The Shock Produced by the Collision and Its Radial Expansion Since the initial radius of the shocked gas is that of the star, and since this radius is small compared to the path length through the disk, H, with H/R_⊙ ≈ 370 at r = r_ab, we make the assumption that the collision can be approximated as a line source of energy with an energy deposition per unit length of πR_⊙²ρv_K², and consider the shock wave as expanding radially from the trajectory axis. This can be well described by one of the sequence of Sedov solutions (Sedov 1959) for an expanding cylindrical shock in a uniform medium. However, for the accuracy required by our plume approximation it is sufficient to note that the energy density left behind the shock, ǫ_shk, is nearly inversely proportional to the swept-up mass, or ǫ_shk ≃ ǫ_shk,R⊙ (R_⊙/R_shk)², where ǫ_shk,R⊙ ≃ v_K²/2. This high energy density implies a pressure of the shocked gas P_shk well above the ambient pressure P_o: P_shk,R⊙(z) ≃ ρv_K² ≫ P_o(z) for all z.
The high pressure of the shocked gas near the axis of the channel will drive the shock to larger radii while the gas expands adiabatically behind the shock. Near the surface, R_shk ≃ z, the shocked gas can expand vertically as well as horizontally. However, because the shock is strong only while R_shk ≪ H, the radial shock will have decreased in strength before the star reaches the surface, and the overpressure becomes too small to lift more than a small fraction of the surface mass, ∆z ≃ R_shk ≪ H, vertically above the disk surface. However, a larger mass will expand above the disk due to buoyancy. In this case the vertical momentum is derived primarily from the difference of gravitational forces on the buoyant matter versus the ambient matter. The buoyant force is proportional to the entropy ratio. A strong shock leaves behind matter whose entropy is higher than that of the ambient medium. Since the entropy change due to a shock wave is third order in the shock strength (Courant & Friedrichs 1948; Zeldovich & Raizer 1967), only strong shocks result in significant changes in entropy. In this limit the entropy change ∆S from the ambient entropy S_o is ∆S/S_o ∝ ∆(P/ρ)/(P_o/ρ) ≃ (P_shk/P_o)((γ − 1)/(γ + 1)), where γ is the usual ratio of specific heats and ρ is the ambient density. The compression ratio across a strong shock is η_CR = ρ_shk/ρ = (γ + 1)/(γ − 1) = 7 for γ = 4/3. Thus, for example, for a plume to rise well above the disk requires an estimated ∆S/S_o ≥ 2 and thus P_shk/P_o ≃ 14. Once the hot shocked gas rises to the surface of the disk, and assuming that this flow is adiabatic thereafter and thus does not entrain a significant fraction of the surrounding matter, the subsequent expansion above the disk is determined by its initial internal energy. Let us consider the neighborhood of a point r = r_0 at the mid-plane of the disk where a star-disk collision occurs.
One can introduce a local Cartesian coordinate system x, y, z in the Keplerian rotating frame with the origin at the point r = r_0, such that the x-axis is directed radially outward, the y-axis is directed in the positive azimuthal direction, and the z-axis is perpendicular to the disk plane. The effective gravitational plus centrifugal potential in the Keplerian rotating frame in the neighborhood of the point r = r_0 is then given by equation (21). The thermal energy of the hot column of gas is a fraction of the kinetic energy lost by the star in the hydrodynamic collision with the disk. This energy loss during one passage is F_drag·2H = 2HπR_⊙²ρv_*² = πR_⊙²Σv_*². Without a hydrodynamic simulation in 3 dimensions an accurate description is missing. Nevertheless it is sufficient to approximate the solution as that fraction of the disk matter whose internal energy density ǫ_shk exceeds that of the ambient disk by the factor needed for it to rise to a height z determined by its potential energy, ∆Φ = GMz²/(2r³) (equation (21)). Since GM/r³ = Ω_K², in order for a plume to rise above the disk mid-plane to a height l, its specific internal energy must be at least Ω_K²l²/2. We are concerned with plumes that rise well above the disk so that they can expand horizontally by a factor of several times the plume's original radius. In this case the moment of inertia of the plume about its own axis will be increased several times before the plume falls back to the disk. This causes the plume to reduce its own rotation rate relative to the frame of the disk, that is, to untwist relative to that frame. For this expansion to take place, the plume must rise roughly ∼ 2H above the disk, or l ≃ 3H. At this height the pressure of the hydrostatic isothermal atmosphere, with density profile ∝ exp(−z²/H²), becomes negligible compared to that of the plume, and so the hot gas of the plume can expand both vertically and horizontally as a free expansion.
With this l, and using expression (8), a plume starting from a size R_shk < H will expand to a size ≃ 2H both vertically and horizontally, thus producing a near-spherical bubble with radius r_p = H above the disk. Post-shock expansion will increase the estimate of R_shk somewhat. For simplicity we will use R_shk = H/2 for estimates of the toroidal flux entrained in the plumes in paper II. This is our standard plume. Finally we note that the rise and fall time of this plume should be the half orbit time, corresponding to a ballistic trajectory above and back to the surface of the disk. Hence t_plume ≃ π/Ω, or a plume rotation angle of π radians. We next consider the twisting of the plume leading to its effective helicity. The Untwisting or Helicity Generation by the Plume The plume should expand to several times its original radius by the time it reaches a height of order 2H. The corresponding increase in the moment of inertia of the plume, together with the conservation of its angular momentum, causes the plume to rotate more slowly relative to the inertial frame (Beckley et al. 2003; Mestel 1999). From the viewpoint of an observer in the frame corotating with the Keplerian flow at the radius of the plume, this means that the plume rotates in the direction opposite to the Keplerian rotation, with an angular velocity equal to some fraction of the local Keplerian angular velocity depending upon the radial expansion ratio. Since the expansion of the plume will not be infinite during the rise and fall time of π radians of Keplerian rotation of the disk, we expect that the average plume rotation will be correspondingly less, ∆φ < π or ∼ π/2 radians. Any force or frictional drag that resists this rotation will be countered by the Coriolis force. Finally we note that the kinetic helicity is proportional to the correlation of the plume's vertical motion with this rotation. For the dynamo one requires one additional dynamic property of the plumes.
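The standard-plume size can be cross-checked by combining the Sedov scaling ǫ_shk ≃ (v_K²/2)(R_⊙/R_shk)² with the rise condition ǫ_shk ≥ (GM/2r³)l² at l = 3H; since GM/r³ = v_K²/r², the stellar radius scales out to R_shk ≃ R_⊙ r/(3H). A hedged sketch:

```python
# Hedged cross-check: shock radius (in units of R_sun) whose post-shock
# specific energy (v_K^2/2)(R_sun/R_shk)^2 just equals the potential barrier
# (v_K^2/2)(3H/r)^2 for a rise to l = 3H. The aspect ratio H/r is the only input.
def r_shk_over_rsun(h_over_r):
    return 1.0 / (3.0 * h_over_r)

# With H/r ~ 3e-3 (quoted earlier at r_ab) this gives R_shk ~ 110 R_sun,
# i.e. roughly H/3 when H/R_sun ~ 370 -- the same order as the adopted
# standard plume R_shk = H/2.
print(r_shk_over_rsun(3e-3))
```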
This is that the total rotation angle must be finite and preferably ≃ π/2 radians: for a larger angle, or after many turns, the vector of the entrained magnetic field would average to a small value, and the dynamo growth rate would be correspondingly small. This property of finite rotation, ∆φ ∼ π/2 radians, is a fundamental property of plumes produced above a Keplerian disk. Summary Thus we have derived the approximate properties of an accretion disk around a massive black hole, the high probability of star-disk collisions, and the three properties of the resulting plumes necessary for a robust dynamo. What is missing from this description is the necessary electrical properties of the medium. However, since the required conductivity is so closely related to the mechanism of the dynamo itself, we leave the discussion of this remaining property of the hydrodynamic accretion disk flows, necessary for a robust accretion disk dynamo, to the following paper II (Pariev, Colgate & Finn 2006). With this exception we feel confident that an accretion disk forming a CMBH, with its associated star-disk collisions, is nearly ideal for forming a robust feedback-limited dynamo and thus for potentially converting a major fraction of the gravitational free energy of massive black hole formation into magnetic energy. VP is pleased to thank Richard Lovelace and Eric Blackman for helpful discussions. Eric Blackman is thanked again for his support during the late stages of this work. SC particularly recognizes Hui Li of LANL for support through the Director-funded Research on the Magnetized Universe, and New Mexico Tech for support of the plume rotation experiments as well as the dynamo experiment. The facilities and interactions of the Aspen Center for Physics during two summer visits by VP and more by SAC are gratefully acknowledged. This work has been supported by the U.S. Department of Energy through the LDRD program at Los Alamos National Laboratory.
VP also acknowledges partial support by DOE grant DE-FG02-00ER54600 and by the Center for Magnetic Self-Organization in Laboratory and Astrophysical Plasmas at the University of Wisconsin-Madison. A. Parameters of the Shakura-Sunyaev Disk In the subsequent estimates of the disk physical parameters we keep as parameters the radius r of the disk where the star-disk collisions happen, the Shakura-Sunyaev viscosity parameter α_ss, the ratio l_E of the disk luminosity to the Eddington luminosity, and the fraction ǫ of the rest-mass accretion flux Ṁc² that is radiated away. We will assume them to be within an order of magnitude of their typical values of importance for the dynamo problem, which are the following: α_ss = 0.01, l_E = 0.1, ǫ = 0.1, r = 10^−2 pc. The flux of the stars through the disk, nv/4, peaks at radii inside r = 10^−2 pc (see section 3.2); therefore we need to know the physics of the accretion disk at r ∼ 10^−2 pc. Below, we define the gravitational radius as r_g = 2GM/c² = 3.0·10^13 M_8 cm = 9.5·10^−6 M_8 pc. All formulae for the structure of the Shakura-Sunyaev disk are written for an arbitrary value of the black hole mass M = 10^8 M_8 M_⊙. However, we will consider only M = 10^8 M_⊙ whenever we invoke the model for the star distribution in the central cluster, because the best available model of the central star cluster was calculated for M = 10^8 M_⊙ (section 2.1). Finally, the accuracy of the expressions for the disk parameters is only one significant figure in all cases; we keep two or even three figures only to avoid introducing additional round-off errors when using our expressions. Similarly, one should not be concerned about small jumps of values across the boundaries between regions with different physical approximations: a more elaborate treatment is needed to find exact matching solutions there, although the physical principles are unchanged.
We use formulae from the Shakura & Sunyaev (1973) article to obtain an estimate of the state of the accretion disk. We assume a Schwarzschild black hole with the inner edge of the disk at 3r_g. However, since we consider star-disk collisions happening at ∼ 10³ r_g, general relativistic corrections are at a level below a few per cent and do not matter for our approximate treatment of star-disk collision hydrodynamics. All expressions for disk quantities below were also verified in the later textbooks by Shapiro & Teukolsky (1983) and Krolik (1999). The inner part of the disk (part (a), as in Shakura & Sunyaev (1973)) is radiation dominated and the opacity is dominated by Thomson scattering. In the next zone (part (b)) the opacity is still Thomson, while the gas pressure exceeds the radiation pressure. In the outermost zone (part (c)) the opacity becomes dominated by free-free and bound-free transitions. The boundary between parts (a) and (b), r_ab, is given by expression (A2). The boundary between parts (b) and (c), r_bc, is given by r_bc = 3.4·10³ r_g (l_E/0.1). One can see that, generally, r_bc > 10^−2 pc. Therefore we may consider zones (a) and (b) only, for our purpose of addressing star-disk collisions. First, we list the parameters following from the solution for the vertically averaged radial distributions of physical quantities inside zone (a). The surface density Σ follows from this solution. The half-thickness H depends upon the radius only via general relativistic corrections, so the disk has asymptotically constant thickness for r ≫ r_g (Shakura & Sunyaev 1973; Krolik 1999). Moreover, H does not depend on α_ss in zone (a), and so H is also independent of the mechanism of angular momentum transport. The corresponding density is ρ = Σ/(2H) = 7.5·10^−7 g cm^−3 (0.01/α_ss)(l_E/0.1), i.e. almost a constant, depending only very weakly on all parameters of the disk and the black hole.
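As a hedged sanity check on the gravitational-radius normalization quoted above (standard CGS constants assumed):

```python
# Check r_g = 2GM/c^2 for M = 1e8 M_8 M_sun against the quoted
# 3.0e13 M_8 cm = 9.5e-6 M_8 pc. Constants are standard CGS values.
G = 6.674e-8       # cm^3 g^-1 s^-2
C = 2.998e10       # cm s^-1
M_SUN = 1.989e33   # g
PC = 3.086e18      # cm

def r_g_cm(m8):
    return 2.0 * G * (m8 * 1e8 * M_SUN) / C**2

print(r_g_cm(1.0), r_g_cm(1.0) / PC)   # close to 3.0e13 cm and 9.5e-6 pc
```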
Depending upon parameters, r_sg may be inside or outside r_bc; however, as we show next, the disk in part (b) is unstable to fragmentation caused by self-gravity, which makes the exact position of r_sg with respect to r_bc unimportant. The expressions for the radiation flux Q and the surface temperature T_s of the disk remain the same as in part (a), namely expressions (A9) and (A10). The temperature at the mid-plane of the disk can be obtained from formula (A11).
Fig. 1.— The α−Ω dynamo in a galactic black hole accretion disk. The radial component of the poloidal quadrupole field within the disk (A) is sheared by the differential rotation within the disk, developing a stronger toroidal component (B). As a star passes through the disk it heats, by shock and by radiation, a fraction of the matter of the disk, which expands vertically and lifts a fraction of the toroidal flux within an expanding plume (C). Due to the conservation of angular momentum, the expanding plume and embedded flux rotate ∼ π/2 radians before the matter in the plume and the embedded flux fall back to the disk (D). Reconnection allows the new poloidal flux to merge with and augment the original poloidal flux (D).
When Gran Turismo 5 was released, something unusual in the game’s Kyoto Photo Travel Location caught the attention of our community: a curiously detailed little cat. He blinked, turned his head, looked around, and generally drove people crazy. When GTPlanet member MadmuppGT created a thread in our forums about the cat, affectionately naming him “Jenkins”, his popularity skyrocketed. The topic gathered hundreds of posts and spawned an explosion of Jenkins memes, avatars, and photo galleries across the site. Although “Jenkins” was not found in GT6, his legend lives on, driven by the mystery of why this seemingly random cat was so carefully modeled and animated in a driving game. His presence even led to some oddly compelling conspiracy theories about the true meaning behind the GT logo (warning – you won’t be able to un-see this)… What’s the real story, though? With so many Polyphony Digital employees on hand at the recent GT Sport reveal in London, I knew the time was right to get to the bottom of all this. Sure enough, with the help of Kazunori Yamauchi and Translator-san, I was introduced to the cat’s owner, who was eager to share the story of his famous feline. The cat was actually a female Abyssinian, and her real name was “Primary”. Her owner is a Polyphony Digital employee who was married during the development of Gran Turismo 5. As a wedding gift, his co-workers surprised him with a highly-detailed 3D model of his beloved pet, immortalizing her in a game which would go on to sell nearly 12 million copies. Sadly, Primary is no longer with us – she has since passed away – but thanks to this special wedding gift, she has brought a smile to people the world over, and will no doubt live on in the lore of Gran Turismo. More Posts On...
def do_inconclusive(self, meeting: Meeting, context: Context, operation: str, operand: str, message: TrackedMessage) -> None:
    """Handle an inconclusive-vote command; only the meeting chair may issue it."""
    if meeting.is_chair(message.sender):
        # Close the vote and clear the active motion before recording the event.
        meeting.vote_in_progress = False
        meeting.motion_index = None
        meeting.track_event(EventType.INCONCLUSIVE, message, operand=operand)
def check_version(args):
    """Verify that every configured file/variable pair declares the same valid semver version."""
    version_str_set = set()
    file_vars_set = load_file_vars_set(args.pyproject_file, args.file_vars)
    for file_var_str in sorted(file_vars_set):
        print(f"Processing {file_var_str}")
        # Each entry has the form "path/to/file:VARIABLE_NAME".
        file, var_name = file_var_str.split(":", 1)
        file_path = Path(file).resolve()
        if file_path.suffix == ".py":
            version_str_set.update(get_variable_from_py_file(file_path, var_name))
        elif file_path.suffix == ".toml":
            version_str_set.add(get_variable_from_toml_file(file_path, var_name))
        else:
            raise RuntimeError(f"Unsupported file extension: {file_path.suffix}")
    if len(version_str_set) == 0:
        raise RuntimeError(f"No versions found in {', '.join(sorted(file_vars_set))}")
    if len(version_str_set) > 1:
        raise RuntimeError(
            f"Found more than one version: {', '.join(sorted(version_str_set))}\n"
            "Re-run make set-version"
        )
    # All locations agree; make sure the single version parses as semver.
    if not VersionInfo.isvalid((version := next(iter(version_str_set)))):
        raise RuntimeError(f"Unable to validate version: {version}")
    print(f"Found version {version} in all processed locations.")
We are all Melmottes now Hot on the heels of Iceland’s quasi-default, Roger Lowenstein in the NY Times urges underwater (negative equity) homeowners to “Walk Away From Your Mortgage!”. Lowenstein’s key point is that businesses (including those owned or controlled by the banks themselves) treat default as a straightforward business decision, to be adopted whenever it is profitable to do so. Lowenstein gives a number of examples where leading banks like (inevitably) Goldman Sachs have engaged in strategic default, and urges his readers to do likewise. The piece is in a section headed “The Way We Live Now”, and it’s striking that it has taken more than 100 years for the business ethics of Augustus Melmotte to percolate through to the American middle class. To be fair, it’s only in the last thirty years or so that such ethics have become dominant in the corporate sector, to the point where a board that rejected profitable opportunities to stiff its creditors would now be regarded as having violated its fiduciary obligations to shareholders (particularly if the creditors are workers). And despite all the talk about shareholder value, a CEO who passed up opportunities for personal enrichment at the expense of shareholders would be regarded by his or her fellows as a mug. Millions have defaulted already (one in eight mortgages is currently in arrears). Bankruptcy is once again as common as divorce. When defaulting on debt is this common, it is hard to sustain any sort of social stigma or internalised notion that it is anything other than a financial option, like refinancing an existing loan. And, as with divorce, we must soon be reaching the point where most people who take out loans will do so in the knowledge that default is an option. The question is: can the consumer credit system survive this? Probably it can, but the system will need some radical changes.
It has worked for several decades on the basis of creditworthiness criteria that rest on the assumption that (nearly) everyone will repay their debts if they can. Until recently, the checks could also rely on the assumption that people would be more-or-less honest in the information they provided in their applications. The financial system, by promoting ‘liar loans’, colluded in the destruction of the second assumption, and by leading the way in strategic default, helped to destroy the first. The problem for lenders now is that they will increasingly have to act on the assumption that their borrowers (including those who appear creditworthy by the old standards) are planning, at a minimum, to use default as an insurance option. The only good way to protect against this is to demand lots of secure collateral. That means less liberal credit (and, given higher default rates, higher interest rates) for everyone, and no credit at all for lots of us.
import math

def cal_ndcg_loo(self):
    """Leave-one-out NDCG: each user has exactly one held-out test item."""
    full, top_k = self._subjects, self._top_k
    # keep only the rows ranked within the top-k cutoff
    top_k = full[full['rank'] <= top_k]
    # rows where the held-out test item appears in the user's top-k list;
    # .copy() avoids pandas' SettingWithCopyWarning on the assignment below
    test_in_top_k = top_k[top_k['test_item'] == top_k['item']].copy()
    # DCG gain for a single relevant item at rank x: 1 / log2(1 + x)
    test_in_top_k['ndcg'] = test_in_top_k['rank'].apply(
        lambda x: math.log(2) / math.log(1 + x))
    # average over all users (users with no hit contribute zero)
    return test_in_top_k['ndcg'].sum() * 1.0 / full['user'].nunique()
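For illustration, the same leave-one-out NDCG computation can be exercised on a toy ranking table. The column layout (`user`, `item`, `test_item`, `rank`) is inferred from the method above, and the standalone `ndcg_loo` helper here is a hypothetical rewrite of it:

```python
import math
import pandas as pd

# Toy evaluation table: two users, each with one held-out test item,
# and candidate items ranked 1..3.
subjects = pd.DataFrame({
    'user':      [0, 0, 0, 1, 1, 1],
    'item':      [10, 11, 12, 20, 21, 22],
    'test_item': [11, 11, 11, 23, 23, 23],
    'rank':      [1, 2, 3, 1, 2, 3],
})

def ndcg_loo(subjects, k):
    top_k = subjects[subjects['rank'] <= k]
    hits = top_k[top_k['test_item'] == top_k['item']].copy()
    hits['ndcg'] = hits['rank'].apply(lambda x: math.log(2) / math.log(1 + x))
    return hits['ndcg'].sum() / subjects['user'].nunique()

# User 0's test item (11) is ranked 2 -> gain log(2)/log(3);
# user 1's test item never appears -> gain 0. Average over 2 users.
print(ndcg_loo(subjects, k=3))
```

With one relevant item per user, the per-user gain reduces to `1/log2(1 + rank)`, which is why the method uses `log(2)/log(1 + x)`.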
Dapsone: A Novel Corrosion Inhibitor for Mild Steel in Acid Media

Abstract: Corrosion inhibition of mild steel in 1 M HCl and 0.5 M H2SO4 by dapsone was studied by polarization resistance, Tafel polarization, electrochemical impedance spectroscopy (EIS) and weight loss measurements. The results revealed that inhibition occurs through adsorption of the drug on the metal surface without modifying the mechanism of the corrosion process. Potentiodynamic polarization suggested that dapsone acts as a mixed-type inhibitor, predominantly cathodic in HCl and predominantly anodic in H2SO4. Electrochemical impedance spectroscopy was used to investigate the mechanism of corrosion inhibition. Thermodynamic parameters such as Ea, ΔH°a, ΔS°a and ΔH°ads were calculated to investigate the mechanism of inhibition. The adsorption of dapsone followed the Langmuir adsorption isotherm.

Keywords: Mild steel, Acid solution, Weight loss, EIS, Drug.

1. INTRODUCTION

Inhibition of corrosion of mild steel is a matter of theoretical as well as practical importance. Acids are widely used in industrial operations such as pickling, cleaning and descaling. Because of their aggressiveness, inhibitors are used to reduce the rate of dissolution of metals. Compounds containing nitrogen, sulphur and oxygen have been reported as excellent inhibitors. The efficiency of an organic compound as an inhibitor mainly depends on its ability to become adsorbed on the metal surface, a process that consists of the replacement of water molecules at the corroding interface. The adsorption of these compounds is influenced by the electronic structure of the inhibiting molecules, steric factors, aromaticity, electron density at the donor site, the presence of functional groups such as –CHO, –N=N– and R–OH, and the molecular area and molecular weight of the inhibitor molecule. A large number of organic compounds are known to be applicable as corrosion inhibitors for mild steel.
However, only a few non-toxic and eco-friendly compounds have been investigated as corrosion inhibitors. Tryptamine, succinic acid, L-ascorbic acid, sulfamethoxazole and cefatrexyl were found to be effective inhibitors in acid environments. Dithiobiurets exhibited the best performance towards the corrosion of mild steel in HCl solutions and showed very low toxicity. The inhibitive effect of four antibacterial drugs, namely ampicillin, cloxacillin, flucloxacillin and amoxicillin, towards the corrosion of aluminium was investigated. The inhibition action of these drugs was attributed to blocking the surface
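Weight-loss studies of this kind typically compute the surface coverage θ from corrosion mass loss and then test Langmuir behaviour via the linearised isotherm C/θ = 1/K_ads + C, whose plot of C/θ against C should be a straight line of near-unit slope. A minimal sketch with made-up numbers (none of the values below are data from this study):

```python
# Illustrative weight-loss analysis: inhibition efficiency and a
# Langmuir-isotherm fit. All concentrations and weight losses are invented.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 4.0])      # inhibitor concentration (mM)
w_blank = 25.0                              # weight loss without inhibitor (mg)
w_inh = np.array([10.0, 7.0, 4.5, 2.8])     # weight loss with inhibitor (mg)

theta = (w_blank - w_inh) / w_blank         # surface coverage
efficiency = 100.0 * theta                  # inhibition efficiency (%)

# Langmuir isotherm, linearised: C/theta = 1/K_ads + C.
# A straight line with slope close to 1 indicates Langmuir adsorption.
slope, intercept = np.polyfit(conc, conc / theta, 1)
K_ads = 1.0 / intercept                     # adsorption equilibrium constant

print(efficiency)   # coverage (and efficiency) rises with concentration
print(slope)        # near 1 for Langmuir behaviour
```

The adsorption free energy then follows from ΔG°ads = −RT ln(55.5 · K_ads), the usual next step in such analyses.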
Interventions for treating simple bone cysts in the long bones of children. BACKGROUND Simple bone cysts, also known as unicameral bone cysts or solitary bone cysts, are the most common type of benign bone lesion in growing children. Cysts may lead to repeated pathological fracture (fracture that occurs in an area of bone weakened by a disease process). Occasionally, these fractures may result in symptomatic malunion. The main goals of treatment are to decrease the risk of pathological fracture, enhance cyst healing and resolve pain. Despite the numerous treatment methods that have been used for simple bone cysts in long bones of children, there is no consensus on the best procedure. OBJECTIVES To assess the effects (benefits and harms) of interventions for treating simple bone cysts in the long bones of children, including adolescents. We intended the following main comparisons: invasive (e.g. injections, curettage, surgical fixation) versus non-invasive interventions (e.g. observation, plaster cast, restricted activity); different categories of invasive interventions (i.e. injections, curettage, drilling holes and decompression, surgical fixation and continued decompression); different variations of each category of invasive intervention (e.g. different injection substances: autologous bone marrow versus steroid). SEARCH METHODS We searched the Cochrane Bone, Joint and Muscle Trauma Group Specialised Register (December 2013), the Cochrane Central Register of Controlled Trials (CENTRAL, 2013 Issue 12), MEDLINE (1946 to 12 December 2013), EMBASE (1974 to 12 December 2013) and the China National Knowledge Infrastructure Platform (31 December 2013). We also searched trial registers, conference proceedings and reference lists. SELECTION CRITERIA Randomised and quasi-randomised controlled trials evaluating methods for treating simple bone cysts in the long bones of children.
DATA COLLECTION AND ANALYSIS Two review authors independently screened search results and performed study selection. We resolved differences in opinion between review authors by discussion and by consulting a third review author. Two review authors independently assessed risk of bias and extracted data. We summarised data using risk ratios (RRs) or mean differences (MDs), as appropriate, and 95% confidence intervals (CIs). We used the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) system to assess the overall quality of the evidence. MAIN RESULTS The only included trial was a multicentre randomised controlled trial (RCT) conducted at 24 locations in North America and India that compared bone marrow injection with steroid (methylprednisolone acetate) injection for treating simple bone cysts. Up to three injections were planned for participants in each group. The trial involved 90 children (mean age 9.5 years) and presented results for 77 children at two-year follow-up. Although the trial had secure allocation concealment, it was at high risk of performance bias and of bias from major imbalances in baseline characteristics. Reflecting these study limitations, we downgraded the quality of evidence by two levels to 'low' for most outcomes, meaning that we are unsure about the estimates of effect. For outcomes where there was serious imprecision, we downgraded the quality of evidence by a further level to 'very low'. The trial provided very low quality evidence that fewer children in the bone marrow injection group had radiographically assessed healing of bone cysts at two years than in the steroid injection group (9/39 versus 16/38; RR 0.55 favouring steroid injection, 95% CI 0.28 to 1.09). However, the result was uncertain and may be compatible with no difference or a small benefit favouring bone marrow injection.
Based on an illustrative success rate of 421 children with healed bone cysts per 1000 children treated with steroid injections, this equates to 189 fewer (95% CI 303 fewer to 38 more) children with healed bone cysts per 1000 children treated with bone marrow injections. There was low quality evidence of a lack of difference between the two interventions at two years in functional outcome, based on the Activity Scale for Kids function score (0 to 100; higher scores equate to better outcome: MD -0.90; 95% CI -4.26 to 2.46), or in pain assessed using the Oucher pain score. There was very low quality evidence of a lack of differences between the two interventions for adverse events: subsequent pathological fracture (9/39 versus 11/38; RR 0.80, 95% CI 0.37 to 1.70) or superficial infection (two cases in the bone marrow group). Recurrence of bone cyst, unacceptable malunion, return to normal activities, and participant satisfaction were not reported. AUTHORS' CONCLUSIONS The available evidence is insufficient to determine the relative effects of bone marrow versus steroid injections, although bone marrow injections are more invasive. Notably, the rate of radiographically assessed healing of the bone cyst at two years was well under 50% for both interventions. Overall, there is a lack of evidence to determine the best method for treating simple bone cysts in the long bones of children. Further RCTs of sufficient size and quality are needed to guide clinical practice.
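The risk ratio and confidence interval quoted above can be reproduced with the textbook log-RR normal approximation; this is a sketch of the standard formula, not necessarily the exact method used by the review software:

```python
# Risk ratio with a 95% CI via the log-RR normal approximation, applied
# to the trial's healing outcome: 9/39 (bone marrow) vs 16/38 (steroid).
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    rr = (a / n1) / (b / n2)
    # standard error of log(RR)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

rr, lo, hi = risk_ratio_ci(9, 39, 16, 38)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # 0.55 0.28 1.09
```

This matches the reported RR 0.55 (95% CI 0.28 to 1.09); an interval crossing 1 is why the review calls the result uncertain.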
/**
 * @author shiyuanchen
 * @project LeetCode
 * @since 2020/06/01
 */
public class P245_ShortestWordDistanceIII {
    // Shortest distance between occurrences of word1 and word2 in words,
    // where word1 and word2 may be the same word.
    public int shortestWordDistance(String[] words, String word1, String word2) {
        long dist = Integer.MAX_VALUE, i1 = dist, i2 = -dist;
        for (int i = 0; i < words.length; i++) {
            if (words[i].equals(word1)) {
                i1 = i;
            }
            if (words[i].equals(word2)) {
                if (word1.equals(word2)) {
                    i1 = i2; // same word: the previous occurrence becomes the partner
                }
                i2 = i;
            }
            dist = Math.min(dist, Math.abs(i1 - i2));
        }
        return (int) dist;
    }
}
// CountTransactions returns the number of transactions matching the filter options.
func (s *DBService) CountTransactions(user *User, options TransactionFilterOptions) (uint64, error) {
	var count uint64
	emptyFilter := options.IsEmpty()
	err := s.view(func() error {
		handleFn := func(transactionUUID string) error {
			if emptyFilter {
				// Fast path: only check that the transaction key exists.
				transactionKey := user.createTransactionKeyFromUUID(transactionUUID)
				exists, err := s.db.Has(transactionKey)
				if err != nil {
					return err
				}
				if !exists {
					return nil
				}
			} else {
				// Slow path: load the transaction and apply the filter.
				transaction, err := s.getTransaction(user, transactionUUID)
				if err != nil {
					return err
				}
				if transaction == nil {
					return nil
				}
				if !options.Matches(transaction) {
					return nil
				}
			}
			count++
			return nil
		}
		// Never stop early: visit every transaction.
		doneFn := func() bool { return false }
		return s.iterateTransactions(user, handleFn, doneFn)
	})
	if err != nil {
		return 0, fmt.Errorf("failed to count transactions: %w", err)
	}
	return count, nil
}
/** * The main class for the playspace deploy wizard */ public class PVSystemGui extends JPanel implements ActionListener { public static final GridBagConstraints gbc = null; private static final String LOAD_CARD = "Load profile"; private static final String SOLAR_CARD = "Solar irradiance profile"; private static final String MODULE_CARD = "PV module's inputs"; private static final String OTHER_CARD = "Other components' inputs"; private static final String RESULT_CARD = "Results"; private static final String BATTERY_VOLTAGE_CARD = "Battery voltage"; private static final String BATTERY_CURRENT_CARD = "Battery current"; private static final String CELL_TEMP_CARD = "Cell temperature"; private static final String PV_CURRENT_CARD = "PV generated current"; private static final String DELIVERED_LOAD_CARD = "Delivered load profile"; private static final int LOAD_INDEX = 0; private static final int SOLAR_INDEX = 1; private static final int MODULE_INDEX = 2; private static final int OTHER_INDEX = 3; private static final int RESULTS_INDEX = 4; private static final int BATTERY_VOLTAGE_INDEX = 5; private static final int BATTERY_CURRENT_INDEX = 6; private static final int CELL_TEMP_INDEX = 7; private static final int PV_CURRENT_INDEX = 8; private static final int DELIVERED_LOAD_INDEX = 9; private static final int FIRST_CARD = 0; private static final int NUM_CARD = 10; private static final Dimension PREFERRED_SIZE = new Dimension(850,500); private int cardIndex = 0; private JButton backButton; private JButton nextButton; private JRadioButton[] buttons; private ButtonGroup buttonGroup; private JRadioButton loadInput; private JRadioButton solarInput; private JRadioButton moduleInput; private JRadioButton otherCompInput; private JRadioButton systemBehaviors; private JRadioButton batVoltBehaviors; private JRadioButton batCurrBehaviors; private JRadioButton cellTempBehaviors; private JRadioButton pvCurrBehaviors; private JRadioButton deliverLoadBehaviors; private JPanel cardPanel; 
public PVSystemGui(ModelInterfaceBase iface) { this(); setInterface(iface); } /** * Constructor for the main Deploy playspace wizard */ public PVSystemGui() { JComponent[] comps = {makeRadioPanel(), makeButtonPanel(), makeCardPanel()}; // gridx, gridy, gridwidth, gridheight, weightx, weighty, anchor, fill, insets(t,l,b,r), ipadx, ipady GridBagConstraints[] gbcs = { new GridBagConstraints(0, 0, 1, 2, 0.0, 1.0, gbc.WEST, gbc.VERTICAL, new Insets(0, 0, 0, 5), 0, 0), new GridBagConstraints(1, 1, 1, 1, 0.0, 0.0, gbc.EAST, gbc.NONE, new Insets(0, 0, 5, 5), 0, 0), new GridBagConstraints(1, 0, 1, 1, 1.0, 1.0, gbc.CENTER, gbc.BOTH, new Insets(5, 0, 5, 5), 0, 0) }; Templates.layoutGridBag(this, comps, gbcs); setPreferredSize(PREFERRED_SIZE); } private JPanel makeRadioPanel() { JPanel p = new JPanel(); loadInput = Templates.makeRadioButton("Load profile", false); loadInput.setBackground(Templates.DARKER_BACKGROUND_COLOR); loadInput.addActionListener(this); solarInput = Templates.makeRadioButton("Solar irradiance profile", false); solarInput.setBackground(Templates.DARKER_BACKGROUND_COLOR); solarInput.addActionListener(this); moduleInput = Templates.makeRadioButton("PV module, system, & computation", false); moduleInput.setBackground(Templates.DARKER_BACKGROUND_COLOR); moduleInput.addActionListener(this); otherCompInput = Templates.makeRadioButton("surrounding, battery, controller, & inverter", false); otherCompInput.setBackground(Templates.DARKER_BACKGROUND_COLOR); otherCompInput.addActionListener(this); systemBehaviors = Templates.makeRadioButton("System behaviors", false); systemBehaviors.setBackground(Templates.DARKER_BACKGROUND_COLOR); systemBehaviors.addActionListener(this); batVoltBehaviors = Templates.makeRadioButton("Battery voltage profile", false); batVoltBehaviors.setBackground(Templates.DARKER_BACKGROUND_COLOR); batVoltBehaviors.addActionListener(this); batCurrBehaviors = Templates.makeRadioButton("Battery current profile", false); 
batCurrBehaviors.setBackground(Templates.DARKER_BACKGROUND_COLOR); batCurrBehaviors.addActionListener(this); cellTempBehaviors = Templates.makeRadioButton("Cell temperature profile", false); cellTempBehaviors.setBackground(Templates.DARKER_BACKGROUND_COLOR); cellTempBehaviors.addActionListener(this); pvCurrBehaviors = Templates.makeRadioButton("PV generated current profile", false); pvCurrBehaviors.setBackground(Templates.DARKER_BACKGROUND_COLOR); pvCurrBehaviors.addActionListener(this); deliverLoadBehaviors = Templates.makeRadioButton("Actual delivered load profile", false); deliverLoadBehaviors.setBackground(Templates.DARKER_BACKGROUND_COLOR); deliverLoadBehaviors.addActionListener(this); JLabel inputLabel = Templates.makeLabel("inputs", Templates.FONT11B); inputLabel.setBackground(Templates.DARKER_BACKGROUND_COLOR); JLabel resultLabel = Templates.makeLabel("results", Templates.FONT11B); resultLabel.setBackground(Templates.DARKER_BACKGROUND_COLOR); JPanel fill = new JPanel(); fill.setBackground(Templates.DARKER_BACKGROUND_COLOR); JComponent[] comps = {inputLabel, loadInput, solarInput, moduleInput, otherCompInput, resultLabel, systemBehaviors, batVoltBehaviors, batCurrBehaviors, cellTempBehaviors, pvCurrBehaviors, deliverLoadBehaviors, fill}; // gridx, gridy, gridwidth, gridheight, weightx, weighty, anchor, fill, insets(t,l,b,r), ipadx, ipady GridBagConstraints[] gbcs = { new GridBagConstraints(0, 0, 1, 1, 0.0, 0.0, gbc.WEST, gbc.NONE, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 1, 1, 1, 0.0, 0.0, gbc.WEST, gbc.HORIZONTAL, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 2, 1, 1, 0.0, 0.0, gbc.WEST, gbc.HORIZONTAL, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 3, 1, 1, 0.0, 0.0, gbc.WEST, gbc.HORIZONTAL, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 4, 1, 1, 0.0, 0.0, gbc.WEST, gbc.NONE, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 5, 1, 1, 0.0, 0.0, gbc.WEST, gbc.NONE, new Insets(10, 5, 0, 5), 0, 0), new 
GridBagConstraints(0, 6, 1, 1, 0.0, 0.0, gbc.WEST, gbc.HORIZONTAL, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 7, 1, 1, 0.0, 0.0, gbc.WEST, gbc.HORIZONTAL, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 8, 1, 1, 0.0, 0.0, gbc.WEST, gbc.HORIZONTAL, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 9, 1, 1, 0.0, 0.0, gbc.WEST, gbc.HORIZONTAL, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 10, 1, 1, 0.0, 0.0, gbc.WEST, gbc.HORIZONTAL, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 11, 1, 1, 0.0, 0.0, gbc.WEST, gbc.HORIZONTAL, new Insets(5, 5, 0, 5), 0, 0), new GridBagConstraints(0, 12, 1, 1, 0.0, 1.0, gbc.WEST, gbc.VERTICAL, new Insets(0, 0, 0, 0), 0, 0), }; Templates.layoutGridBag(p, comps, gbcs); p.setBackground(Templates.DARKER_BACKGROUND_COLOR); buttonGroup = new ButtonGroup(); buttonGroup.add(loadInput); buttonGroup.add(solarInput); buttonGroup.add(moduleInput); buttonGroup.add(otherCompInput); buttonGroup.add(systemBehaviors); buttonGroup.add(batVoltBehaviors); buttonGroup.add(batCurrBehaviors); buttonGroup.add(cellTempBehaviors); buttonGroup.add(pvCurrBehaviors); buttonGroup.add(deliverLoadBehaviors); buttons = new JRadioButton[NUM_CARD]; buttons[LOAD_INDEX] = loadInput; buttons[SOLAR_INDEX] = solarInput; buttons[MODULE_INDEX] = moduleInput; buttons[OTHER_INDEX] = otherCompInput; buttons[RESULTS_INDEX] = systemBehaviors; buttons[BATTERY_VOLTAGE_INDEX] = batVoltBehaviors; buttons[BATTERY_CURRENT_INDEX] = batCurrBehaviors; buttons[CELL_TEMP_INDEX] = cellTempBehaviors; buttons[PV_CURRENT_INDEX] = pvCurrBehaviors; buttons[DELIVERED_LOAD_INDEX] = deliverLoadBehaviors; //set the first radio button to be selected! 
buttons[FIRST_CARD].setSelected(true); return p; } private JPanel makeButtonPanel() { JPanel p = new JPanel(); backButton = Templates.makeButton("back", this); nextButton = Templates.makeButton("next", this); JComponent[] comps = {backButton, nextButton}; // gridx, gridy, gridwidth, gridheight, weightx, weighty, anchor, fill, insets(t,l,b,r), ipadx, ipady GridBagConstraints[] gbcs = { new GridBagConstraints(0, 0, 1, 1, 0.0, 0.0, gbc.EAST, gbc.NONE, new Insets(0, 5, 0, 0), 0, 0), new GridBagConstraints(1, 0, 1, 1, 0.0, 0.0, gbc.EAST, gbc.NONE, new Insets(0, 5, 0, 0), 0, 0) }; Templates.layoutGridBag(p, comps, gbcs); //set the back button disabled since will be on the first card backButton.setEnabled(false); return p; } private JPanel makeCardPanel() { //cards must be added in the same order as the button group indices cardPanel = new JPanel(); cardPanel.setLayout(new CardLayout2()); ChartCard loadCard = new ChartCard("Load power profile", "Time", "Load power", true); loadCard.setParamNames("load time vector", "load vector", "chart maximum time"); loadCard.setYMax(300); cardPanel.add(LOAD_CARD, loadCard); ChartCard solarCard = new ChartCard("Solar irradiance profile", "Time", "Solar irradiance", true); solarCard.setParamNames("irradiance time vector", "irradiance vector", "chart maximum time"); solarCard.setYMax(1000); cardPanel.add(SOLAR_CARD, solarCard); cardPanel.add(MODULE_CARD, new ModuleInputCard()); cardPanel.add(OTHER_CARD, new OtherCompInputCard()); cardPanel.add(RESULT_CARD, new ResultsCard()); ChartCard vBatCard = new ChartCard("Battery voltage profile", "Time", "Battery voltage", false); vBatCard.setParamNames("Vbat time vector", "battery voltage data", "chart maximum time"); vBatCard.setYMin(23.5); vBatCard.setYMax(24.5); cardPanel.add(BATTERY_VOLTAGE_CARD, vBatCard); ChartCard iBatCard = new ChartCard("Battery current profile", "Time", "Battery current", false); iBatCard.setParamNames("Ibat time vector", "battery current data", "chart maximum time"); 
iBatCard.setYMin(0); iBatCard.setYMax(6); cardPanel.add(BATTERY_CURRENT_CARD, iBatCard); ChartCard cellTempCard = new ChartCard("Cell temperature profile", "Time", "Cell temperature", false); cellTempCard.setParamNames("Tp time vector", "cell temperature data", "chart maximum time"); cellTempCard.setYMin(300); cellTempCard.setYMax(320); cardPanel.add(CELL_TEMP_CARD, cellTempCard); ChartCard iPVCard = new ChartCard("PV generated current profile", "Time", "PV generated current", false); iPVCard.setParamNames("Ipv time vector", "PV current data", "chart maximum time"); iPVCard.setYMin(0); iPVCard.setYMax(5); cardPanel.add(PV_CURRENT_CARD, iPVCard); ChartCard delivLoadCard = new ChartCard("Actual delivered load profile", "Time", "Delivered load", false); delivLoadCard.setParamNames("SatLoad time vector", "delivered load data", "chart maximum time"); delivLoadCard.setYMin(0); delivLoadCard.setYMax(300); cardPanel.add(DELIVERED_LOAD_CARD, delivLoadCard); // set the first card in the panel to match the radio buttons ((CardLayout2) cardPanel.getLayout()).first(cardPanel); return cardPanel; } private void setNextCard() { cardIndex++; ((CardLayout2) cardPanel.getLayout()).next(cardPanel); buttons[cardIndex].setSelected(true); } private void setPrevCard() { cardIndex--; ((CardLayout2) cardPanel.getLayout()).previous(cardPanel); buttons[cardIndex].setSelected(true); } public void actionPerformed(ActionEvent event) { Object object = event.getSource(); if (object == backButton) { setPrevCard(); } else if (object == nextButton) { setNextCard(); } else if (object == loadInput) { cardIndex = LOAD_INDEX; ((CardLayout2) cardPanel.getLayout()).show(cardPanel, LOAD_CARD); } else if (object == solarInput) { cardIndex = SOLAR_INDEX; ((CardLayout2) cardPanel.getLayout()).show(cardPanel, SOLAR_CARD); } else if (object == moduleInput){ cardIndex = MODULE_INDEX; ((CardLayout2) cardPanel.getLayout()).show(cardPanel, MODULE_CARD); } else if (object == otherCompInput) { cardIndex = OTHER_INDEX; 
((CardLayout2) cardPanel.getLayout()).show(cardPanel, OTHER_CARD); } else if (object == systemBehaviors) { cardIndex = RESULTS_INDEX; ((CardLayout2) cardPanel.getLayout()).show(cardPanel, RESULT_CARD); } else if (object == batVoltBehaviors) { cardIndex = BATTERY_VOLTAGE_INDEX; ((CardLayout2) cardPanel.getLayout()).show(cardPanel, BATTERY_VOLTAGE_CARD); } else if (object == batCurrBehaviors) { cardIndex = BATTERY_CURRENT_INDEX; ((CardLayout2) cardPanel.getLayout()).show(cardPanel, BATTERY_CURRENT_CARD); } else if (object == cellTempBehaviors) { cardIndex = CELL_TEMP_INDEX; ((CardLayout2) cardPanel.getLayout()).show(cardPanel, CELL_TEMP_CARD); } else if (object == pvCurrBehaviors) { cardIndex = PV_CURRENT_INDEX; ((CardLayout2) cardPanel.getLayout()).show(cardPanel, PV_CURRENT_CARD); } else if (object == deliverLoadBehaviors) { cardIndex = DELIVERED_LOAD_INDEX; ((CardLayout2) cardPanel.getLayout()).show(cardPanel, DELIVERED_LOAD_CARD); } else System.out.println("unknown action case!"); if (cardIndex == FIRST_CARD) backButton.setEnabled(false); else backButton.setEnabled(true); if (cardIndex == NUM_CARD-1) nextButton.setEnabled(false); else nextButton.setEnabled(true); } public void dispose() { SwingUtilities.windowForComponent(PVSystemGui.this).dispose(); } public WindowAdapter getWindowAdapter() { return new WindowAdapter() { public void windowClosing(WindowEvent e) { dispose(); } }; } private void setInterface(ModelInterfaceBase iface) { ((ChartCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(LOAD_CARD)).setInterface(iface, true); ((ChartCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(SOLAR_CARD)).setInterface(iface, true); ((ModuleInputCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(MODULE_CARD)).setInterface(iface); ((OtherCompInputCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(OTHER_CARD)).setInterface(iface); ((ResultsCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(RESULT_CARD)).setInterface(iface); 
((ChartCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(BATTERY_VOLTAGE_CARD)).setInterface(iface, false); ((ChartCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(BATTERY_CURRENT_CARD)).setInterface(iface, false); ((ChartCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(CELL_TEMP_CARD)).setInterface(iface, false); ((ChartCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(PV_CURRENT_CARD)).setInterface(iface, false); ((ChartCard) ((CardLayout2) (cardPanel.getLayout())).getComponent(DELIVERED_LOAD_CARD)).setInterface(iface, false); } public static void main(String[] args) { JFrame f = new JFrame("PV system custom GUI"); f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE); f.getContentPane().add(new PVSystemGui()); f.pack(); f.show(); } }
/** * The builder builds a specific {@link GobblinTrackingEvent} whose metadata has {@value GobblinEventBuilder#EVENT_TYPE} * to be {@value LineageEventBuilder#LINEAGE_EVENT_TYPE} * * Note: A {@link LineageEventBuilder} instance is not reusable */ @Slf4j public final class LineageEventBuilder extends GobblinEventBuilder { static final String LIENAGE_EVENT_NAMESPACE = getKey(NAMESPACE, "lineage"); static final String SOURCE = "source"; static final String DESTINATION = "destination"; static final String LINEAGE_EVENT_TYPE = "LineageEvent"; private static final Gson GSON = new Gson(); @Getter @Setter private Descriptor source; @Getter @Setter private Descriptor destination; public LineageEventBuilder(String name) { super(name, LIENAGE_EVENT_NAMESPACE); addMetadata(EVENT_TYPE, LINEAGE_EVENT_TYPE); } @Override public GobblinTrackingEvent build() { Map<String, String> dataMap = Maps.newHashMap(metadata); dataMap.put(SOURCE, Descriptor.serialize(source)); dataMap.put(DESTINATION, Descriptor.serialize(destination)); return new GobblinTrackingEvent(0L, namespace, name, dataMap); } @Override public String toString() { return GSON.toJson(this); } @Override public boolean equals(Object o) { if (this == o) { return true; } if (o == null || getClass() != o.getClass()) { return false; } LineageEventBuilder event = (LineageEventBuilder) o; if (!namespace.equals(event.namespace) || !name.equals(event.name) || !metadata.equals(event.metadata)) { return false; } if (source != null ? !source.equals(event.source) : event.source != null) { return false; } return destination != null ? destination.equals(event.destination) : event.destination == null; } @Override public int hashCode() { int result = name.hashCode(); result = 31 * result + namespace.hashCode(); result = 31 * result + metadata.hashCode(); result = 31 * result + (source != null ? source.hashCode() : 0); result = 31 * result + (destination != null ? 
destination.hashCode() : 0); return result; } /** * Check if the given {@link GobblinTrackingEvent} is a lineage event */ public static boolean isLineageEvent(GobblinTrackingEvent event) { String eventType = event.getMetadata().get(EVENT_TYPE); return StringUtils.isNotEmpty(eventType) && eventType.equals(LINEAGE_EVENT_TYPE); } /** * Create a {@link LineageEventBuilder} from a {@link GobblinEventBuilder}. An inverse function * to {@link LineageEventBuilder#build()} */ public static LineageEventBuilder fromEvent(GobblinTrackingEvent event) { Map<String, String> metadata = event.getMetadata(); LineageEventBuilder lineageEvent = new LineageEventBuilder(event.getName()); metadata.forEach((key, value) -> { switch (key) { case SOURCE: lineageEvent.setSource(Descriptor.deserialize(value)); break; case DESTINATION: lineageEvent.setDestination(Descriptor.deserialize(value)); break; default: lineageEvent.addMetadata(key, value); break; } }); return lineageEvent; } static String getKey(Object ... parts) { return Joiner.on(".").join(parts); } }
// src/test/java/Conditions.java
import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.testng.Assert;
import org.testng.annotations.Test;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Conditions extends BaseUI {

    @Test
    public void test() {
        String fruit1 = "apple";
        String fruit2 = "orange";
        if (fruit1.contains("apple") && fruit2.contains("kiwi")) {
            System.out.println("we can find fruit 1");
        } else {
            Assert.fail("we cannot find it");
        }
    }

    @Test
    public void test2() {
        int number = 10;
        int sum;
        if (number == 15 + 5) {
            sum = 95 + 100;
        } else {
            sum = 100 - 95;
        }
        System.out.println(sum);
    }

    @Test
    public void test4() {
        boolean requirement = true;
        // the condition must not be negated, otherwise the test fails when the flag is true
        if (requirement) {
            System.out.println("Boolean is true");
        } else {
            Assert.fail("Boolean is false");
        }
    }

    @Test
    public void test5() {
        WebElement tabSearch = driver.findElement(Locators.FIND_JOBS_BUTTON);
        if (tabSearch.isDisplayed()) {
            tabSearch.click();
        } else {
            Assert.fail("We can't find this button");
        }
    }

    @Test
    public void test6() {
        mainPage.clickLogInButton();
        signUpPage.completeSignUp();
        WebElement checkbox = driver.findElement(Locators.RADIO_BUTTON_YES);
        if (!checkbox.isSelected()) {
            checkbox.click();
        }
    }

    @Test
    public void test7() {
        List<String> crunchifyList1 = new ArrayList<>(Arrays.asList("kiwi", "orange", "melon"));
        if (crunchifyList1.contains("orange")) {
            System.out.println(crunchifyList1);
        }
    }

    @Test
    public void test8() {
        List<Integer> crunchifyList2 = new ArrayList<>(Arrays.asList(5, 7, 4));
        int sum = crunchifyList2.get(1) + crunchifyList2.get(2);
        System.out.println(sum);
    }

    /*
    @Test
    public void test9() {
        List<WebElement> links = driver.findElements(By.xpath("//ul//li"));
        System.out.println(links.size());
        for (int i = 0; i < links.size(); i++) {
            String info = links.get(i).getText();
            System.out.println(info);
            links.get(i).click();
            driver.get(Data.MAIN_URL);
            links = driver.findElements(By.xpath("//ul/li"));
        }
    }
    */
}
The Chicago Bears’ 2016 NFL Draft class ranks extremely well just a year after the event. Chicago Bears general manager Ryan Pace has not been perfect. His 2015 draft class leaves plenty to be desired and he has been constantly criticized for his work in the 2017 NFL Draft. That being said, Pace and the Bears front office deserve a ton of praise for their work in the 2016 NFL Draft. The Bears’ draft picks last year were as follows:

Round 1 (9th overall): Leonard Floyd, OLB
Round 2 (56): Cody Whitehair, OL
Round 3 (72): Jonathan Bullard, DL
Round 4 (113): Nick Kwiatkoski, ILB
Round 4 (124): Deon Bush, S
Round 4 (127): Deiondre’ Hall, DB
Round 5 (150): Jordan Howard, RB
Round 6 (185): DeAndre Houston-Carson, S
Round 7 (230): Daniel Braverman, WR

Pace did an incredible job of moving up and down the board in the 2016 draft, finding great value along the way. In Floyd and Whitehair, the Bears found two players who seem sure to make significant impacts on both sides of the ball. On top of that, Pace and company found arguably the biggest steal in the draft when they landed Jordan Howard in the fifth round. Howard only started 13 games at running back for the Bears last season, but he was the second-leading rusher in the NFL with 1,313 yards. In Howard, Floyd and Whitehair, the Bears found a legitimate core of players to build around. They also did a nice job of finding high-upside talent in the other rounds. More than a year after the draft, Bleacher Report’s Brent Sobleski put together a complete ranking of all the 2016 NFL draft classes. As you would expect, the Bears do well in this ranking, coming in at number five. Sobleski was particularly high in his praise for Howard. “Everyone knows the Dallas Cowboys’ Ezekiel Elliott led the NFL in rushing during his first season. Another rookie finished second,” he wrote. “The Chicago Bears’ Jordan Howard was a workhorse for the Bears after being selected in the fifth round. The 225-pound back carried the ball 252 times for 1,313 yards.
His 5.2 yards per carry also ranked second behind the Buffalo Bills’ LeSean McCoy (5.4) among backs with 200 or more carries. As one of the league’s leading rushers, Howard was named to his first Pro Bowl.” Say what you want about the Bears’ performance in the 2017 NFL Draft, but there is no other way to look at it; Pace and company killed it in the 2016 draft. Guys like Howard, Floyd and Whitehair will undoubtedly be major building blocks going forward. If Pace has hit again with quarterback Mitch Trubisky this year, the Chicago Bears could quickly be a serious contender in the NFC.
Fixed points fix estimates: The accuracy of an effect size estimate is described by its sample’s conditional algorithmic information, as computed across rank permutations

Every statistical estimate is equal to the sum of a nonrandom component, due to parameter values and bias, and a random component, due to sampling error. Estimation theory suggests that the two components are hopelessly confounded in the estimate. We would like to estimate the sign and magnitude of a statistic’s random deviation from its parameter--its accuracy--in the same way we quantify a statistic’s random variability around its parameter--its precision--by estimating the standard error. However, because the random component is an attribute of the sample data, it cannot be described with parametric or Fisher information. In information theory, on the other hand, every information type--entropy, complexity--is understood as describing the extent of randomness in manifest data. This suggests that integrating the two conceptions of information could allow us to describe the two components of a statistical estimate, if only we could identify a common link between the two paradigms. The matching statistic, m, is such a link. For paired, ranked vectors X and Y of length n, m is the total number of paired observations in X and Y with matching ranks, m = Σ_i I[R(X_i) = R(Y_i)], where I[·] is the indicator function. That is, m is the number of fixed points between vectors. m has a long history in statistics, having served as the test statistic of a little-known null hypothesis statistical test (NHST) for the correlation coefficient, dating to around the turn of the twentieth century, called the matching method. Subtracting m from n yields a metric with a long history in information theory, the Hamming distance, a classic metric of the conditional complexity K(Y|X).
Thus, m simultaneously contains both the Fisher information in a bivariate sample about the latent correlation and the conditional complexity, or algorithmic information, about the manifest observations. This paper shows that the presence of these two conflicting information types in m gives the statistic a peculiar attribute: m has an asymptotic efficiency less than or equal to zero relative to conventional correlation estimators computed on the same data. This means its Fisher information content decreases with increasing sample size, so that m's random component is disproportionately large. Furthermore, when m and Pearson's r are computed on the same sample, the two share a random component, and the value of m is indicative of the accuracy of r with respect to that component. Having proven this utility of m both theoretically and empirically (via Monte Carlo simulations), additional matching statistics are constructed, including one composite statistic that is even more informative of the accuracy of r, and another that is indicative of the accuracy of Cohen's d. Potential applications for computing accuracy-adjusted r are described, and implications are discussed.
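As a concrete illustration of the quantities just defined (a sketch in Python; the function and variable names are mine, not the paper's):

```python
def ranks(v):
    """0-based rank of each element of v (ties broken by position)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def matching_statistic(x, y):
    """Return (m, n - m) for paired vectors x and y:
    m = number of fixed points, i.e. positions whose ranks agree;
    n - m = Hamming distance between the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    m = sum(1 for a, b in zip(rx, ry) if a == b)
    return m, len(x) - m

# A perfectly rank-concordant pair has every position fixed: m = n.
x = [1.2, 3.4, 2.2, 5.0, 4.1]
y = [1.0, 3.9, 2.5, 5.5, 4.0]   # same rank order as x
m, hamming = matching_statistic(x, y)  # m = 5, hamming = 0
```

For a reversed pair such as `[1, 2, 3]` against `[3, 2, 1]`, only the middle position is a fixed point, so m = 1 and the Hamming distance is 2.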
import sys

def text_match_word(text, words):
    """Return the first word in words that is a prefix of text, else None."""
    for word in words:
        if word == text[:len(word)]:
            return word
    return None

# Base words, each paired with the suffix that extends it to a longer word
# ("dream" -> "dreamer", "erase" -> "eraser").
words_next = (("dream", "er"), ("erase", "r"))
words_next_dict = dict(words_next)

s = input()
before_match_word = ""
while "" != s:
    # Always try the base words; if the previous match has a known suffix,
    # also allow that suffix (so "dreamer" parses as "dream" + "er").
    search_words = list(words_next_dict.keys())
    if before_match_word in words_next_dict:
        search_words.append(words_next_dict[before_match_word])
    match_word = text_match_word(s, search_words)
    if match_word is not None:
        s = s[len(match_word):]
    else:
        print("NO")
        sys.exit()
    before_match_word = match_word
print("YES")
/*
 * Summary: SAX2 parser interface used to build the DOM tree
 * Description: those are the default SAX2 interfaces used by
 *              the library when building DOM tree.
 *
 * Copy: See Copyright for the status of this software.
 *
 * Author: Daniel Veillard
 */

#ifndef __XML_SAX2_H__
#define __XML_SAX2_H__

#include <stdio.h>
#include <stdlib.h>
#include "xmlversion.h"
#include "parser.h"
#include "xlink.h"

#ifdef __cplusplus
extern "C" {
#endif

XMLPUBFUN const xmlChar * XMLCALL
	xmlSAX2GetPublicId (void *ctx);
XMLPUBFUN const xmlChar * XMLCALL
	xmlSAX2GetSystemId (void *ctx);
XMLPUBFUN void XMLCALL
	xmlSAX2SetDocumentLocator (void *ctx, xmlSAXLocatorPtr loc);
XMLPUBFUN int XMLCALL
	xmlSAX2GetLineNumber (void *ctx);
XMLPUBFUN int XMLCALL
	xmlSAX2GetColumnNumber (void *ctx);
XMLPUBFUN int XMLCALL
	xmlSAX2IsStandalone (void *ctx);
XMLPUBFUN int XMLCALL
	xmlSAX2HasInternalSubset (void *ctx);
XMLPUBFUN int XMLCALL
	xmlSAX2HasExternalSubset (void *ctx);
XMLPUBFUN void XMLCALL
	xmlSAX2InternalSubset (void *ctx, const xmlChar *name, const xmlChar *ExternalID, const xmlChar *SystemID);
XMLPUBFUN void XMLCALL
	xmlSAX2ExternalSubset (void *ctx, const xmlChar *name, const xmlChar *ExternalID, const xmlChar *SystemID);
XMLPUBFUN xmlEntityPtr XMLCALL
	xmlSAX2GetEntity (void *ctx, const xmlChar *name);
XMLPUBFUN xmlEntityPtr XMLCALL
	xmlSAX2GetParameterEntity (void *ctx, const xmlChar *name);
XMLPUBFUN xmlParserInputPtr XMLCALL
	xmlSAX2ResolveEntity (void *ctx, const xmlChar *publicId, const xmlChar *systemId);
XMLPUBFUN void XMLCALL
	xmlSAX2EntityDecl (void *ctx, const xmlChar *name, int type, const xmlChar *publicId, const xmlChar *systemId, xmlChar *content);
XMLPUBFUN void XMLCALL
	xmlSAX2AttributeDecl (void *ctx, const xmlChar *elem, const xmlChar *fullname, int type, int def, const xmlChar *defaultValue, xmlEnumerationPtr tree);
XMLPUBFUN void XMLCALL
	xmlSAX2ElementDecl (void *ctx, const xmlChar *name, int type, xmlElementContentPtr content);
XMLPUBFUN void XMLCALL
	xmlSAX2NotationDecl (void *ctx, const xmlChar *name, const xmlChar *publicId, const xmlChar *systemId);
XMLPUBFUN void XMLCALL
	xmlSAX2UnparsedEntityDecl (void *ctx, const xmlChar *name, const xmlChar *publicId, const xmlChar *systemId, const xmlChar *notationName);
XMLPUBFUN void XMLCALL
	xmlSAX2StartDocument (void *ctx);
XMLPUBFUN void XMLCALL
	xmlSAX2EndDocument (void *ctx);

#if defined(LIBXML_SAX1_ENABLED) || defined(LIBXML_HTML_ENABLED) || defined(LIBXML_WRITER_ENABLED) || defined(LIBXML_DOCB_ENABLED)
XMLPUBFUN void XMLCALL
	xmlSAX2StartElement (void *ctx, const xmlChar *fullname, const xmlChar **atts);
XMLPUBFUN void XMLCALL
	xmlSAX2EndElement (void *ctx, const xmlChar *name);
#endif /* LIBXML_SAX1_ENABLED or LIBXML_HTML_ENABLED */

XMLPUBFUN void XMLCALL
	xmlSAX2StartElementNs (void *ctx, const xmlChar *localname, const xmlChar *prefix, const xmlChar *URI, int nb_namespaces, const xmlChar **namespaces, int nb_attributes, int nb_defaulted, const xmlChar **attributes);
XMLPUBFUN void XMLCALL
	xmlSAX2EndElementNs (void *ctx, const xmlChar *localname, const xmlChar *prefix, const xmlChar *URI);
XMLPUBFUN void XMLCALL
	xmlSAX2Reference (void *ctx, const xmlChar *name);
XMLPUBFUN void XMLCALL
	xmlSAX2Characters (void *ctx, const xmlChar *ch, int len);
XMLPUBFUN void XMLCALL
	xmlSAX2IgnorableWhitespace (void *ctx, const xmlChar *ch, int len);
XMLPUBFUN void XMLCALL
	xmlSAX2ProcessingInstruction (void *ctx, const xmlChar *target, const xmlChar *data);
XMLPUBFUN void XMLCALL
	xmlSAX2Comment (void *ctx, const xmlChar *value);
XMLPUBFUN void XMLCALL
	xmlSAX2CDataBlock (void *ctx, const xmlChar *value, int len);

#ifdef LIBXML_SAX1_ENABLED
XMLPUBFUN int XMLCALL
	xmlSAXDefaultVersion (int version);
#endif /* LIBXML_SAX1_ENABLED */

XMLPUBFUN int XMLCALL
	xmlSAXVersion (xmlSAXHandler *hdlr, int version);
XMLPUBFUN void XMLCALL
	xmlSAX2InitDefaultSAXHandler (xmlSAXHandler *hdlr, int warning);

#ifdef LIBXML_HTML_ENABLED
XMLPUBFUN void XMLCALL
	xmlSAX2InitHtmlDefaultSAXHandler(xmlSAXHandler *hdlr);
XMLPUBFUN void XMLCALL
	htmlDefaultSAXHandlerInit (void);
#endif

#ifdef LIBXML_DOCB_ENABLED
XMLPUBFUN void XMLCALL
	xmlSAX2InitDocbDefaultSAXHandler(xmlSAXHandler *hdlr);
XMLPUBFUN void XMLCALL
	docbDefaultSAXHandlerInit (void);
#endif

XMLPUBFUN void XMLCALL
	xmlDefaultSAXHandlerInit (void);

#ifdef __cplusplus
}
#endif
#endif /* __XML_SAX2_H__ */
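The callback model these declarations implement (start-element, characters, and end-document events pushed into handler functions) can be sketched, purely as an illustration and not as part of libxml2, with Python's standard xml.sax module:

```python
import xml.sax
from io import StringIO

class TagCollector(xml.sax.ContentHandler):
    """Records element names as the parser emits SAX events,
    analogous to a startElement/startElementNs callback in C."""
    def __init__(self):
        super().__init__()
        self.seen = []

    def startElement(self, name, attrs):
        # Called once per opening tag, in document order.
        self.seen.append(name)

def parse_tags(xml_text):
    handler = TagCollector()
    xml.sax.parse(StringIO(xml_text), handler)
    return handler.seen
```

For example, `parse_tags("<a><b/><c/></a>")` returns `["a", "b", "c"]`: the handler never sees a tree, only a stream of events, which is exactly what makes SAX-style parsing memory-efficient.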
def cleanup_at_exit():
    global lockfilename
    if lockfilename:
        os.remove(lockfilename)
        lockfilename = None
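A minimal, self-contained usage sketch (the `acquire_lock` helper and the use of `tempfile` are my assumptions about the surrounding code, which is not shown): the cleanup function is intended to be registered with `atexit` so the lock file is removed at interpreter shutdown:

```python
import atexit
import os
import tempfile

lockfilename = None

def cleanup_at_exit():
    # Repeats the cleanup helper above so this sketch is self-contained.
    global lockfilename
    if lockfilename:
        os.remove(lockfilename)
        lockfilename = None

def acquire_lock():
    """Create a lock file and register its removal at interpreter exit.
    Hypothetical helper, shown only to illustrate the atexit pattern."""
    global lockfilename
    fd, lockfilename = tempfile.mkstemp(suffix=".lock")
    os.close(fd)
    atexit.register(cleanup_at_exit)
    return lockfilename
```

Because `cleanup_at_exit` resets the global to `None`, it is idempotent: calling it early (or having `atexit` call it again at shutdown) is harmless.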
import React from 'react';
import { Container } from '../Container';

export default {
  title: '@co-design/core/Container',
  component: Container,
  argTypes: {
    size: {
      defaultValue: 'medium',
      options: ['xsmall', 'small', 'medium', 'large', 'xlarge'],
      control: { type: 'inline-radio' },
    },
    padding: { defaultValue: 0, control: { type: 'number' } },
    fluid: { defaultValue: false, control: { type: 'boolean' } },
    break: { defaultValue: false, control: { type: 'boolean' } },
  },
};

export const Default = (props) => {
  return (
    <Container {...props} co={(theme) => ({ backgroundColor: theme.palettes.purple[3] })}>
      Container
    </Container>
  );
};
This is an artist's impression of the large stellar void stretching 8,000 light-years from the center of the Milky Way. The astronomers behind the new research found no young stars called Cepheids in this vast region.

A vast tract of space near the center of the Milky Way — in an area called the inner disk — is completely devoid of young stars, new research shows.

The Milky Way, which hosts Earth's solar system, is a spiral galaxy containing billions and billions of stars. By measuring the distribution of these stars, astronomers can better understand how the Milky Way formed and developed over time. Young stars called Cepheids are good growth markers because they regularly pulsate in brightness, and the pulsations are tied to their overall luminosity. This means astronomers can monitor the duration of bright periods and estimate the stars' distance from Earth based on how bright they appear.

But in the Milky Way's inner disk, which extends for 8,000 light-years from the galactic core, researchers haven't found any of those young stars, and that observation challenges current theories of Milky Way formation, officials said in a statement from the Royal Astronomical Society. [Our Milky Way Galaxy's Core Revealed (Photos)]

"The current results indicate that there has been no significant star formation in this large region over hundreds of millions of years," Giuseppe Bono, co-author of the new research and an astronomer at the Rome Observatory, said in the statement.

This lack of Cepheids had not been seen before because thick, light-blocking cosmic dust in the inner regions of the Milky Way blocks astronomers' view from Earth and makes it difficult to spot the pulsating stars. But by using near-infrared data from a Japanese-South African telescope, the researchers were able to get a clearer view. Previous studies found Cepheids in the heart of the Milky Way, said Noriyuki Matsunaga, lead author of the new work, from the University of Tokyo.
"Now, we find that outside this there is a huge Cepheid desert extending out to 8,000 light years from the center," he said in the statement. With the Milky Way itself measuring about 100,000 light-years across, the researchers noted that this stellar desert comprises a lot of empty space. "Our conclusions are contrary to other recent work but in line with the work of radio astronomers who see no new stars being born in this desert," Michael Feast, co-author of the new research and astronomer from the South African Astronomical Observatory, said in the statement. The new work was published June 27 in the Monthly Notices of the Royal Astronomical Society. Follow Samantha Mathewson @Sam_Ashley13. Follow us @Spacedotcom, Facebook and Google+. Original article on Space.com.
// api/selftest.go
package api

import (
	"fmt"

	log "github.com/Sirupsen/logrus"
)

func findAgentsInHistoryServiceSelfTest(pastTime string) error {
	finder := &findAgentsInHistoryService{
		pastTime: pastTime,
		next:     nil,
	}
	nodes, err := finder.find()
	if err != nil {
		return err
	}
	if len(nodes) == 0 {
		return fmt.Errorf("No nodes found in history service for past %s", pastTime)
	}
	return nil
}

func findAgentsInHistoryServicePastMinuteSelfTest() error {
	return findAgentsInHistoryServiceSelfTest("/minute/")
}

func findAgentsInHistoryServicePastHourSelfTest() error {
	return findAgentsInHistoryServiceSelfTest("/hour/")
}

func dummySelfTest() error {
	return nil
}

func getSelfTests() map[string]func() error {
	tests := make(map[string]func() error)
	tests["findAgentsInHistoryServicePastMinuteSelfTest"] = findAgentsInHistoryServicePastMinuteSelfTest
	tests["findAgentsInHistoryServicePastHourSelfTest"] = findAgentsInHistoryServicePastHourSelfTest
	tests["dummyTest"] = dummySelfTest
	return tests
}

type selfTestResponse struct {
	Success      bool
	ErrorMessage string
}

func runSelfTest() map[string]*selfTestResponse {
	result := make(map[string]*selfTestResponse)
	for selfTestName, fn := range getSelfTests() {
		result[selfTestName] = &selfTestResponse{}
		err := fn()
		if err == nil {
			result[selfTestName].Success = true
		} else {
			// Check for NodesNotFoundError. Do not fail if this happens; it just means
			// the history service has not dumped anything yet.
			if serr, ok := err.(NodesNotFoundError); ok {
				log.Debugf("Non-critical error received: %s", serr)
				result[selfTestName].Success = true
			} else {
				result[selfTestName].ErrorMessage = err.Error()
			}
		}
	}
	return result
}
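The pattern above (a registry mapping self-test names to functions, with per-test success or error capture) is language-agnostic. A minimal Python sketch, with function and test names of my own choosing rather than the Go file's:

```python
def get_self_tests():
    """Map test names to zero-argument callables, mirroring the
    registry style of getSelfTests() above."""
    def dummy_test():
        return None  # success is signalled by not raising

    def failing_test():
        raise RuntimeError("history service returned no nodes")

    return {"dummyTest": dummy_test, "failingTest": failing_test}

def run_self_tests(tests):
    """Run every registered check, recording success or the error message,
    like runSelfTest() building its selfTestResponse map."""
    results = {}
    for name, fn in tests.items():
        try:
            fn()
            results[name] = {"success": True, "error": ""}
        except Exception as exc:
            results[name] = {"success": False, "error": str(exc)}
    return results
```

The key design choice, in both versions, is that a failing check never aborts the run: every test executes, and the caller gets one uniform report keyed by test name.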
//-----------------------------------------------------------------------------
// Resource preloading for cubemaps.
//-----------------------------------------------------------------------------
class CResourcePreloadCubemap : public CResourcePreload
{
	virtual bool CreateResource( const char *pName )
	{
		ITexture *pTexture = g_MaterialSystem.FindTexture( pName, TEXTURE_GROUP_CUBE_MAP, true );
		ITextureInternal *pTexInternal = static_cast< ITextureInternal * >( pTexture );
		if ( pTexInternal )
		{
			pTexInternal->MarkAsPreloaded( true );
			pTexInternal->IncrementReferenceCount();
			if ( !IsErrorTexture( pTexInternal ) )
			{
				return true;
			}
		}
		return false;
	}

	virtual void OnEndMapLoading( bool bAbort )
	{
		int iIndex = -1;
		for ( ;; )
		{
			ITextureInternal *pTexInternal;
			iIndex = TextureManager()->FindNext( iIndex, &pTexInternal );
			if ( iIndex == -1 || !pTexInternal )
			{
				break;
			}
			if ( pTexInternal->IsPreloaded() )
			{
				pTexInternal->MarkAsPreloaded( false );
				pTexInternal->DecrementReferenceCount();
			}
		}
	}

#if defined( _PS3 )
	virtual bool RequiresRendererLock()
	{
		return true;
	}
#endif
};
package com.mark59.datahunter.api.rest.samples;

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

import java.util.List;

import com.mark59.datahunter.api.application.DataHunterConstants;
import com.mark59.datahunter.api.data.beans.Policies;
import com.mark59.datahunter.api.model.AsyncMessageaAnalyzerResult;
import com.mark59.datahunter.api.model.CountPoliciesBreakdown;
import com.mark59.datahunter.api.model.DataHunterRestApiResponsePojo;
import com.mark59.datahunter.api.rest.DataHunterRestApiClient;

/**
 * Detailed use cases and verifications for the DataHunter Rest API client and service.
 *
 * @author <NAME>
 * Written: Australian Spring 2022
 */
public class DataHunterRestApiClientSampleUsage {

	/**
	 * This is functionally equivalent to the DataHunterSeleniumFunctionalTest.asyncLifeCycleTestWithUseabilityUpdate() web application test
	 * in the dataHunterFunctionalTest project (held on the mark-5-9/mark59-xtras GitHub repo), but running the REST API instead of the web
	 * 'Asynchronous Message Analyzer' function.
	 *
	 * @param dhApiClient DataHunterRestApiClient
	 * @see <a href="https://github.com/mark-5-9/mark59-xtras/blob/master/mark59-datahunterFunctionalTest/src/main/java/com/mark59/datahunter/functionalTest/scripts/DataHunterSeleniumFunctionalTest.java#L47">Web App Equivalent Test</a>
	 */
	public void asyncLifeCycleTestWithUseabilityUpdate(DataHunterRestApiClient dhApiClient) {
		dhApiClient.deleteMultiplePolicies("TESTAPI_ASYNC_TOUSED", null, null);
		dhApiClient.addPolicy(new Policies("TESTAPI_ASYNC_TOUSED", "T99-testonly-01", "FIRSTONE", "UNPAIRED", "", 1460613152000L));
		dhApiClient.addPolicy(new Policies("TESTAPI_ASYNC_TOUSED", "T99-testonly-01", "between", "UNPAIRED", "", 1460613152009L));
		dhApiClient.addPolicy(new Policies("TESTAPI_ASYNC_TOUSED", "T99-testonly-01", "LASTONE", "UNPAIRED", "", 1460613153001L));
		dhApiClient.addPolicy(new Policies("TESTAPI_ASYNC_TOUSED", "T99-testonly-02", "FIRSTONE", "UNPAIRED", "", 1460613153000L));
		dhApiClient.addPolicy(new Policies("TESTAPI_ASYNC_TOUSED", "T99-testonly-02", "LASTONE", "UNPAIRED", "", 1460613155001L));

		DataHunterRestApiResponsePojo response = dhApiClient.asyncMessageAnalyzer(DataHunterConstants.EQUALS,
				"TESTAPI_ASYNC_TOUSED", null, null, "USED");

		int i = 0;
		List<AsyncMessageaAnalyzerResult> asyncResults = response.getAsyncMessageaAnalyzerResults();
		System.out.println(" asyncMessageAnalyzerPrintResults (" + asyncResults.size() + ") - asyncLifeCycleTestWithUseabilityUpdate");
		System.out.println(" -------------------------------- ");
		for (AsyncMessageaAnalyzerResult asyncResult : asyncResults) {
			System.out.println(" " + ++i + " " + asyncResult);
		}

		assertEquals(2, asyncResults.size());
		assertEquals("[application=TESTAPI_ASYNC_TOUSED, startsWith=null, identifier=T99-testonly-02, lifecycle=null, useability=USED, selectOrder=null], starttm= 1460613153000, endtm= 1460613155001, differencetm= 2001]", asyncResults.get(0).toString());
		assertEquals("[application=TESTAPI_ASYNC_TOUSED, startsWith=null, identifier=T99-testonly-01, lifecycle=null, useability=USED, selectOrder=null], starttm= 1460613152000, endtm= 1460613153001, differencetm= 1001]", asyncResults.get(1).toString());

		for (AsyncMessageaAnalyzerResult pairedAsyncTxn : asyncResults) {
			// example of a typical transaction name you could set (and its response time)
			System.out.println(" Txn Name : " + pairedAsyncTxn.getApplication() + "_" + pairedAsyncTxn.getIdentifier()
					+ " Response time (Assumed msecs) : " + pairedAsyncTxn.getDifferencetm());
		}
		System.out.println(" -------------------------------- ");

		// clean up
		assertEquals(new Integer(5), dhApiClient.deleteMultiplePolicies("TESTAPI_ASYNC_TOUSED", null, null).getRowsAffected());
	}

	/**
	 * @param dhApiClient DataHunterRestApiClient
	 */
	public void basicPolicyAddPrintDeleteChecks(DataHunterRestApiClient dhApiClient) {
		// System.out.println("DataHunterRestApiResponsePojo =" + response );
		DataHunterRestApiResponsePojo response = dhApiClient.deletePolicy("testapi", "id1", "");
		assertEquals(String.valueOf(true), response.getSuccess());
		response = dhApiClient.deletePolicy("testapi", "id2", null);
		assertEquals(String.valueOf(true), response.getSuccess());
		response = dhApiClient.deletePolicy("testapi", "id3", "setepochtime");
		assertEquals(String.valueOf(true), response.getSuccess());
		response = dhApiClient.deletePolicy("testapi", "id3", "setepochtime");
		assertEquals(String.valueOf(true), response.getSuccess());

		response = dhApiClient.addPolicy(new Policies("testapi", "id1", "", "USED", null, null));
		assertEquals(String.valueOf(true), response.getSuccess());
		assertEquals(new Integer(1), response.getRowsAffected());
		response = dhApiClient.printPolicy("testapi", "id1");
		assertEquals(String.valueOf(true), response.getSuccess());
		assertEquals(new Integer(1), response.getRowsAffected());
		assertsOnPolicy(new Policies("testapi", "id1", "", "USED", "", null), response.getPolicies().get(0));

		response = dhApiClient.addPolicy(new Policies("testapi", "id1", "duplicatedid", "USED", "", null));
response = dhApiClient.printPolicy("testapi", "id1"); assertEquals(String.valueOf(true), response.getSuccess() ); assertEquals(new Integer(1), response.getRowsAffected()); assertEquals(1, response.getPolicies().size()); assertsOnPolicy(new Policies("testapi","id1", "", "USED", "", null), response.getPolicies().get(0)); response = dhApiClient.printPolicy("testapi", "id1", ""); assertEquals(String.valueOf(true), response.getSuccess() ); assertEquals(new Integer(1), response.getRowsAffected()); assertsOnPolicy(new Policies("testapi","id1", "", "USED", "", null), response.getPolicies().get(0)); response = dhApiClient.printPolicy("testapi", "id1", null); assertEquals(String.valueOf(true), response.getSuccess() ); assertEquals(new Integer(1), response.getRowsAffected()); assertsOnPolicy(new Policies("testapi","id1", "", "USED", "", null), response.getPolicies().get(0)); response = dhApiClient.printPolicy("testapi", "id1", "duplicatedid"); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(1), response.getRowsAffected()); assertsOnPolicy(new Policies("testapi","id1", "duplicatedid", "USED", "", null), response.getPolicies().get(0)); response = dhApiClient.printPolicy("testapi", "doesnotexist", "duplicatedid"); assertEquals(String.valueOf(false), response.getSuccess()); assertEquals(new Integer(0), response.getRowsAffected()); assertEquals(0, response.getPolicies().size()); dhApiClient.addPolicy(new Policies("testapi","id2", "", "USED", "", null)); response = dhApiClient.deletePolicy("testapi", "id1", ""); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(1), response.getRowsAffected()); assertEquals(1, response.getPolicies().size()); assertEquals("", response.getFailMsg()); assertsOnPolicy(new Policies("testapi","id1", "", null, null, null), response.getPolicies().get(0)); response = dhApiClient.deletePolicy("testapi", "id1","duplicatedid"); assertEquals(String.valueOf(true), response.getSuccess()); 
assertEquals(new Integer(1), response.getRowsAffected()); assertEquals(1, response.getPolicies().size()); assertEquals("", response.getFailMsg()); assertsOnPolicy(new Policies("testapi","id1", "duplicatedid", null, null, null), response.getPolicies().get(0)); response = dhApiClient.deletePolicy("testapi", "id1","duplicatedid"); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(0), response.getRowsAffected()); assertEquals(1, response.getPolicies().size()); assertEquals("No rows matching the selection.", response.getFailMsg()); assertsOnPolicy(new Policies("testapi","id1", "duplicatedid", null, null, null), response.getPolicies().get(0)); response = dhApiClient.deletePolicy("testapi", "id2",""); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(1), response.getRowsAffected()); assertEquals("", response.getFailMsg()); assertsOnPolicy(new Policies("testapi","id2", "", null, null, null), response.getPolicies().get(0)); response = dhApiClient.addPolicy(new Policies("testapi","id3", "setepochtime", "UNUSED", "otherstuff", 1643673346936L)); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(1, response.getPolicies().size()); assertsOnPolicy(new Policies("testapi","id3", "setepochtime", "UNUSED", "otherstuff", 1643673346936L), response.getPolicies().get(0)); response = dhApiClient.printPolicy("testapi","id3", "setepochtime"); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(1, response.getPolicies().size()); assertsOnPolicy(new Policies("testapi","id3", "setepochtime", "UNUSED", "otherstuff", 1643673346936L), response.getPolicies().get(0)); response = dhApiClient.printPolicy("testapi","id3"); assertEquals(String.valueOf(false), response.getSuccess()); assertEquals(0, response.getPolicies().size()); assertEquals("No rows matching the selection.", response.getFailMsg()); response = dhApiClient.addPolicy(new Policies("testapi","id3", "setepochtime", "USED", 
"ALREADYEXISTS!!", 1643673346936L)); assertEquals(String.valueOf(false), response.getSuccess()); assertEquals(1, response.getPolicies().size()); assertsOnPolicy(new Policies("testapi","id3", "setepochtime", "UNUSED", "ALREADYEXISTS!!", 1643673346936L), response.getPolicies().get(0)); assertTrue("error should contain application (testapi)" , response.getFailMsg().contains("testapi") ); assertTrue("error should contain idenifier (id3)" , response.getFailMsg().contains("id3") ); assertTrue("error should contain lifecycle (setepochtime)" , response.getFailMsg().contains("setepochtime") ); response = dhApiClient.deletePolicy("testapi", "id3", "setepochtime"); assertEquals(String.valueOf(true), response.getSuccess() ); assertEquals(new Integer(1), response.getRowsAffected()); } /** * @param dhApiClient DataHunterRestApiClient */ public void workingWithMultiplePolicies(DataHunterRestApiClient dhApiClient){ create6testPolices(dhApiClient); DataHunterRestApiResponsePojo response = dhApiClient.deleteMultiplePolicies("nonexistingapp", null, null); assertEquals(String.valueOf(true), response.getSuccess() ); assertEquals(new Integer(0), response.getRowsAffected()); response = dhApiClient.deleteMultiplePolicies("testapi", null, "USED"); assertEquals(new Integer(3), response.getRowsAffected()); response = dhApiClient.deleteMultiplePolicies("testapi", null, "USED"); assertEquals(new Integer(0), response.getRowsAffected()); response = dhApiClient.deleteMultiplePolicies("testapi", null, ""); assertEquals(new Integer(2), response.getRowsAffected()); response = dhApiClient.deleteMultiplePolicies("otherapp", "", ""); assertEquals(new Integer(1), response.getRowsAffected()); create6testPolices(dhApiClient); response = dhApiClient.printSelectedPolicies("testapi", null, null); assertEquals(new Integer(5), response.getRowsAffected()); assertEquals(5, response.getPolicies().size()); assertTrue(response.getPolicies().get(0).toString().startsWith("[application=testapi, identifier=im4, 
lifecycle=nonblanklc, useability=UNUSED, otherdata=,")); assertTrue(response.getPolicies().get(1).toString().startsWith("[application=testapi, identifier=im3, lifecycle=duplicatedid, useability=REUSABLE, otherdata=duplicated id,")); assertTrue(response.getPolicies().get(2).toString().startsWith("[application=testapi, identifier=im3, lifecycle=nonblanklc, useability=USED, otherdata=otherdata3,")); assertTrue(response.getPolicies().get(3).toString().startsWith("[application=testapi, identifier=im2, lifecycle=, useability=USED, otherdata=,")); assertTrue(response.getPolicies().get(4).toString().startsWith("[application=testapi, identifier=im1, lifecycle=, useability=USED, otherdata=,")); response = dhApiClient.printSelectedPolicies("testapi", null, "USED"); assertEquals(3, response.getPolicies().size()); assertTrue(response.getPolicies().get(0).toString().startsWith("[application=testapi, identifier=im3, lifecycle=nonblanklc, useability=USED, otherdata=otherdata3,")); assertTrue(response.getPolicies().get(1).toString().startsWith("[application=testapi, identifier=im2, lifecycle=, useability=USED, otherdata=,")); assertTrue(response.getPolicies().get(2).toString().startsWith("[application=testapi, identifier=im1, lifecycle=, useability=USED, otherdata=,")); response = dhApiClient.printSelectedPolicies("testapi", "nonblanklc", ""); assertEquals(2, response.getPolicies().size()); assertTrue(response.getPolicies().get(0).toString().startsWith("[application=testapi, identifier=im4, lifecycle=nonblanklc, useability=UNUSED, otherdata=,")); assertTrue(response.getPolicies().get(1).toString().startsWith("[application=testapi, identifier=im3, lifecycle=nonblanklc, useability=USED, otherdata=otherdata3,")); response = dhApiClient.printSelectedPolicies("testapi", "nonblanklc", "USED"); assertEquals(1, response.getPolicies().size()); assertTrue(response.getPolicies().get(0).toString().startsWith("[application=testapi, identifier=im3, lifecycle=nonblanklc, useability=USED, 
otherdata=otherdata3,")); response = dhApiClient.printSelectedPolicies("doesntexist", "nonblanklc", "USED"); assertEquals(0, response.getPolicies().size()); response = dhApiClient.deleteMultiplePolicies("testapi", "nonblanklc", null); assertEquals(new Integer(2), response.getRowsAffected()); response = dhApiClient.printSelectedPolicies("testapi", "", ""); assertEquals(new Integer(3), response.getRowsAffected()); response = dhApiClient.deleteMultiplePolicies("testapi", "", "USED"); assertEquals(new Integer(2), response.getRowsAffected()); response = dhApiClient.printSelectedPolicies("testapi", null, null); assertEquals(new Integer(1), response.getRowsAffected()); assertTrue(response.getPolicies().get(0).toString().startsWith("[application=testapi, identifier=im3, lifecycle=duplicatedid, useability=REUSABLE, otherdata=duplicated id,")); } /** * @param dhApiClient DataHunterRestApiClient */ public void policyCountsAndBreakdowns(DataHunterRestApiClient dhApiClient){ create6testPolices(dhApiClient); DataHunterRestApiResponsePojo response = dhApiClient.countPolicies("testapi", "nonblanklc", "USED"); assertEquals(new Integer(1), response.getRowsAffected()); assertEquals("[[application=testapi, identifier=null, lifecycle=nonblanklc, useability=USED, otherdata=null, created=null, updated=null, epochtime=null]]", response.getPolicies().toString()); response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.EQUALS, "testapi", "nonblanklc", "USED"); assertEquals(new Integer(1), response.getRowsAffected()); assertEquals("[[application=testapi, identifier=null, lifecycle=nonblanklc, useability=USED, otherdata=EQUALS, created=null, updated=null, epochtime=null]]", response.getPolicies().toString()); assertEquals(1, response.getCountPoliciesBreakdown().size()); assertEquals("[[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]]", response.getCountPoliciesBreakdown().toString()); assertEquals(new 
Integer(5), dhApiClient.countPolicies("testapi", null, null).getRowsAffected()); response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.EQUALS, "testapi", null, null); assertEquals(4, response.getCountPoliciesBreakdown().size()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=, useability=USED, selectOrder=null], rowCount=2]", response.getCountPoliciesBreakdown().get(0).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=duplicatedid, useability=REUSABLE, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(1).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=UNUSED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(2).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(3).toString()); assertEquals(new Integer(5), dhApiClient.countPolicies("testapi", "", "").getRowsAffected() ); response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.EQUALS, "testapi", "", ""); assertEquals(4, response.getCountPoliciesBreakdown().size()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=, useability=USED, selectOrder=null], rowCount=2]", response.getCountPoliciesBreakdown().get(0).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=duplicatedid, useability=REUSABLE, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(1).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=UNUSED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(2).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, 
selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(3).toString()); assertEquals(new Integer(0), dhApiClient.countPolicies("nonexisting", "", "").getRowsAffected() ); response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.EQUALS, "nonexisting", "", ""); assertEquals(0, response.getCountPoliciesBreakdown().size()); assertEquals("sql execution OK, but no rows matched the selection criteria.", response.getFailMsg()); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(3), dhApiClient.countPolicies("testapi", "", "USED").getRowsAffected() ); response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.EQUALS, "testapi", "", "USED"); assertEquals(2, response.getCountPoliciesBreakdown().size()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=, useability=USED, selectOrder=null], rowCount=2]", response.getCountPoliciesBreakdown().get(0).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(1).toString()); assertEquals(new Integer(0), dhApiClient.countPolicies("testapi", "nonexistingc", "").getRowsAffected() ); assertEquals(0, dhApiClient.countPoliciesBreakdown(DataHunterConstants.EQUALS, "testapi", "nonexistingc", "").getCountPoliciesBreakdown().size()); assertEquals(new Integer(1), dhApiClient.countPolicies("testapi", null, "UNUSED").getRowsAffected() ); response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.EQUALS, "testapi", null, "UNUSED"); assertEquals(1, response.getCountPoliciesBreakdown().size()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=UNUSED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(0).toString()); assertEquals(new Integer(2), dhApiClient.countPolicies("testapi", "nonblanklc", "").getRowsAffected() ); response = 
dhApiClient.countPoliciesBreakdown(DataHunterConstants.EQUALS, "testapi", "nonblanklc", ""); assertEquals(2, response.getCountPoliciesBreakdown().size()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=UNUSED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(0).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(1).toString()); } /** * Note: this method clears the DataHunter database of all existing data * @param dhApiClient DataHunterRestApiClient */ public void policyCountBreakdownsUsingStartWith(DataHunterRestApiClient dhApiClient){ clearDatabase(dhApiClient); create6testPolices(dhApiClient); assertEquals(5, dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "", null, null).getCountPoliciesBreakdown().size()); dhApiClient.addPolicy(new Policies("test api","ex1", "", "UNUSED", "", null)); dhApiClient.addPolicy(new Policies("testaB_pi","ex2", "nonblanklc", "USED", "", null)); dhApiClient.addPolicy(new Policies("testaC%pi:&? @=+","ex3", "lc with$char-s", "USED", "", null)); dhApiClient.addPolicy(new Policies("testaC%pi:&? 
@=+","ex4", "lc with$char-s", "USED", "", null)); DataHunterRestApiResponsePojo response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "", null, null); assertEquals(8, response.getCountPoliciesBreakdown().size()); assertEquals("[application=otherapp, startsWith=null, identifier=null, lifecycle=, useability=UNUSED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(0).toString()); assertEquals("[application=test api, startsWith=null, identifier=null, lifecycle=, useability=UNUSED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(1).toString()); assertEquals("[application=testaB_pi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(2).toString()); assertEquals("[application=testaC%pi:&? @=+, startsWith=null, identifier=null, lifecycle=lc with$char-s, useability=USED, selectOrder=null], rowCount=2]",response.getCountPoliciesBreakdown().get(3).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=, useability=USED, selectOrder=null], rowCount=2]", response.getCountPoliciesBreakdown().get(4).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=duplicatedid, useability=REUSABLE, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(5).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=UNUSED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(6).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(7).toString()); assertEquals(7, dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "test", null, null).getCountPoliciesBreakdown().size()); assertEquals(4, 
dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "testapi", null, null).getCountPoliciesBreakdown().size()); assertEquals(1, dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "test ", null, null).getCountPoliciesBreakdown().size()); response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "test", "lc with$char-s", null); assertEquals(1, response.getCountPoliciesBreakdown().size()); assertEquals("[application=testaC%pi:&? @=+, startsWith=null, identifier=null, lifecycle=lc with$char-s, useability=USED, selectOrder=null], rowCount=2]",response.getCountPoliciesBreakdown().get(0).toString()); response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "", null, "USED"); assertEquals(4, response.getCountPoliciesBreakdown().size()); assertEquals("[application=testaB_pi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(0).toString()); assertEquals("[application=testaC%pi:&? 
@=+, startsWith=null, identifier=null, lifecycle=lc with$char-s, useability=USED, selectOrder=null], rowCount=2]",response.getCountPoliciesBreakdown().get(1).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=, useability=USED, selectOrder=null], rowCount=2]", response.getCountPoliciesBreakdown().get(2).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(3).toString()); response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "testa", "nonblanklc", "USED"); assertEquals(2, response.getCountPoliciesBreakdown().size()); assertEquals("[application=testaB_pi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(0).toString()); assertEquals("[application=testapi, startsWith=null, identifier=null, lifecycle=nonblanklc, useability=USED, selectOrder=null], rowCount=1]", response.getCountPoliciesBreakdown().get(1).toString()); //clean up assertEquals(new Integer(1), dhApiClient.deleteMultiplePolicies("otherapp", null, null).getRowsAffected()); assertEquals(new Integer(1), dhApiClient.deleteMultiplePolicies("test api", null, null).getRowsAffected()); assertEquals(new Integer(1), dhApiClient.deleteMultiplePolicies("testaB_pi", null, null).getRowsAffected()); assertEquals(new Integer(2), dhApiClient.deleteMultiplePolicies("testaC%pi:&? 
@=+", null, null).getRowsAffected()); assertEquals(new Integer(5), dhApiClient.deleteMultiplePolicies("testapi", null, null).getRowsAffected()); assertEquals(0, dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "", null, null).getCountPoliciesBreakdown().size()); } /** * @param dhApiClient DataHunterRestApiClient */ public void workingWithUseStateChanges(DataHunterRestApiClient dhApiClient){ create6testPolices(dhApiClient); DataHunterRestApiResponsePojo response = dhApiClient.updatePoliciesUseState("testapi", null, "USED", "UNUSED", null); assertEquals(new Integer(3), response.getRowsAffected()); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_MOST_RECENTLY_ADDED); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(1), response.getRowsAffected()); assertsOnPolicy(new Policies("testapi","im4", "nonblanklc", "UNUSED", "", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_MOST_RECENTLY_ADDED); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(1), response.getRowsAffected()); assertsOnPolicy(new Policies("testapi","im3", "nonblanklc", "UNUSED", "otherdata3", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_MOST_RECENTLY_ADDED); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(1), response.getRowsAffected()); assertsOnPolicy(new Policies("testapi","im2", "", "UNUSED", "", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_MOST_RECENTLY_ADDED); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(1), response.getRowsAffected()); assertsOnPolicy(new Policies("testapi","im1", "", "UNUSED", "", null), response.getPolicies().get(0)); response = 
dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_MOST_RECENTLY_ADDED); assertEquals(String.valueOf(false), response.getSuccess()); assertEquals(new Integer(0), response.getRowsAffected()); assertEquals("No rows matching the selection. Possibly we have ran out of data for application:[testapi]", response.getFailMsg()); response = dhApiClient.useNextPolicy("testapi", null, "REUSABLE", DataHunterConstants.SELECT_MOST_RECENTLY_ADDED); assertEquals(String.valueOf(true), response.getSuccess()); assertEquals(new Integer(0), response.getRowsAffected()); assertsOnPolicy(new Policies("testapi","im3", "duplicatedid", "REUSABLE", "duplicated id", null), response.getPolicies().get(0)); assertEquals("Policy im3 NOT updated as it is marked as REUSABLE", response.getFailMsg()); create6testPolices(dhApiClient); response = dhApiClient.updatePoliciesUseState("testapi", null, "", "UNUSED", null); assertEquals(new Integer(5), response.getRowsAffected()); response = dhApiClient.lookupNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im1", "", "UNUSED", "", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im1", "", "UNUSED", "", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im2", "", "UNUSED", "", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im3", "nonblanklc", "UNUSED", "otherdata3", null), response.getPolicies().get(0)); response = dhApiClient.lookupNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im3", 
"duplicatedid", "UNUSED", "duplicated id", null), response.getPolicies().get(0)); response = dhApiClient.lookupNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im3", "duplicatedid", "UNUSED", "duplicated id", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im3", "duplicatedid", "UNUSED", "duplicated id", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im4", "nonblanklc", "UNUSED", "", null), response.getPolicies().get(0)); response = dhApiClient.lookupNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertEquals("No rows matching the selection. Possibly we have ran out of data for application:[testapi]", response.getFailMsg()); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertEquals("No rows matching the selection. 
Possibly we have ran out of data for application:[testapi]", response.getFailMsg()); response = dhApiClient.updatePoliciesUseState("testapi", "im3", "USED", "UNUSED", null); assertEquals(new Integer(2), response.getRowsAffected()); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im3", "nonblanklc", "UNUSED", "otherdata3", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im3", "duplicatedid", "UNUSED", "duplicated id", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", null, "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertEquals("No rows matching the selection. Possibly we have ran out of data for application:[testapi]", response.getFailMsg()); assertEquals(new Integer(2), dhApiClient.updatePoliciesUseState("testapi", "im3", "USED", "UNUSED", null).getRowsAffected()); assertEquals(new Integer(1), dhApiClient.updatePoliciesUseState("testapi", "im4", "USED", "UNUSED", null).getRowsAffected()); response = dhApiClient.useNextPolicy("testapi", "nonblanklc", "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im3", "nonblanklc", "UNUSED", "otherdata3", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", "nonblanklc", "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertsOnPolicy(new Policies("testapi","im4", "nonblanklc", "UNUSED", "", null), response.getPolicies().get(0)); response = dhApiClient.useNextPolicy("testapi", "nonblanklc", "UNUSED", DataHunterConstants.SELECT_OLDEST_ENTRY); assertEquals("No rows matching the selection. 
Possibly we have ran out of data for application:[testapi]", response.getFailMsg()); } /** * Note: this method clears the DataHunter database of all existing data * @param dhApiClient DataHunterRestApiClient */ public void workingWithAsyncMessages(DataHunterRestApiClient dhApiClient) { clearDatabase(dhApiClient); insertPolicySets(dhApiClient, "testapi-async", "t01-", 5); DataHunterRestApiResponsePojo response = dhApiClient.printSelectedPolicies("testapi-async", null, "UNPAIRED"); assertEquals(new Integer(20), response.getRowsAffected()); assertEquals(20, response.getPolicies().size()); dhApiClient.deleteMultiplePolicies("norowsfound", null, null); //response = dhApiClient.asyncMessageAnalyzer(DataHunterConstants.STARTS_WITH,"TESTAPI_ASYNC_HIGH_VOL", null, "UNPAIRED", "USED"); response = dhApiClient.asyncMessageAnalyzer(DataHunterConstants.EQUALS,"norowsfound", null, null, null); assertEquals(0, response.getAsyncMessageaAnalyzerResults().size()); assertEquals(new Integer(0), response.getRowsAffected()); response = dhApiClient.asyncMessageAnalyzer(DataHunterConstants.EQUALS,"testapi-async", null, null, null); assertEquals(5, response.getAsyncMessageaAnalyzerResults().size()); assertEquals(new Integer(5), response.getRowsAffected()); insertPolicySets(dhApiClient, "testapi-like", "t02-", 1); assertEquals(new Integer(8), dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "testapi-", "", "UNPAIRED").getRowsAffected()); response = dhApiClient.asyncMessageAnalyzer(DataHunterConstants.STARTS_WITH ,"testapi-", null, null, null); assertEquals(6, response.getAsyncMessageaAnalyzerResults().size()); assertTrue(response.getAsyncMessageaAnalyzerResults().get(0).toString().startsWith("[application=testapi-like, startsWith=null, identifier=t02-testonly-1, lifecycle=null, useability=UNPAIRED")); assertTrue(response.getAsyncMessageaAnalyzerResults().get(1).toString().startsWith("[application=testapi-async, startsWith=null, identifier=t01-testonly-5, lifecycle=null, 
useability=UNPAIRED")); assertTrue(response.getAsyncMessageaAnalyzerResults().get(2).toString().startsWith("[application=testapi-async, startsWith=null, identifier=t01-testonly-4, lifecycle=null, useability=UNPAIRED")); assertTrue(response.getAsyncMessageaAnalyzerResults().get(3).toString().startsWith("[application=testapi-async, startsWith=null, identifier=t01-testonly-3, lifecycle=null, useability=UNPAIRED")); assertTrue(response.getAsyncMessageaAnalyzerResults().get(4).toString().startsWith("[application=testapi-async, startsWith=null, identifier=t01-testonly-2, lifecycle=null, useability=UNPAIRED")); assertTrue(response.getAsyncMessageaAnalyzerResults().get(5).toString().startsWith("[application=testapi-async, startsWith=null, identifier=t01-testonly-1, lifecycle=null, useability=UNPAIRED")); response = dhApiClient.asyncMessageAnalyzer(DataHunterConstants.EQUALS ,"testapi-async", "t01-testonly-4", null, "USED"); assertEquals(1, response.getAsyncMessageaAnalyzerResults().size()); assertTrue(response.getAsyncMessageaAnalyzerResults().get(0).toString().startsWith("[application=testapi-async, startsWith=null, identifier=t01-testonly-4, lifecycle=null, useability=USED")); assertEquals(3, dhApiClient.printSelectedPolicies("testapi-async", null, "USED").getPolicies().size()); response = dhApiClient.asyncMessageAnalyzer(DataHunterConstants.STARTS_WITH ,"testapi-async", "t01-testonly-3", null, "USED"); assertEquals(1, response.getAsyncMessageaAnalyzerResults().size()); assertTrue(response.getAsyncMessageaAnalyzerResults().get(0).toString().startsWith("[application=testapi-async, startsWith=null, identifier=t01-testonly-3, lifecycle=null, useability=USED")); assertEquals(6, dhApiClient.printSelectedPolicies("testapi-async", null, "USED").getPolicies().size()); response = dhApiClient.asyncMessageAnalyzer(DataHunterConstants.EQUALS ,"testapi-async", "t01-someother-5", null, "USED"); assertEquals(0, response.getAsyncMessageaAnalyzerResults().size()); } private void 
clearDatabase(DataHunterRestApiClient dhApiClient) { DataHunterRestApiResponsePojo response = dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "", null, null); List<CountPoliciesBreakdown> countPoliciesBreakdownList = response.getCountPoliciesBreakdown(); for (CountPoliciesBreakdown cpb : countPoliciesBreakdownList) { dhApiClient.deleteMultiplePolicies(cpb.getApplication(), cpb.getLifecycle(), cpb.getUseability()); } assertEquals(0, dhApiClient.countPoliciesBreakdown(DataHunterConstants.STARTS_WITH, "", null, null).getCountPoliciesBreakdown().size()); } private void insertPolicySets(DataHunterRestApiClient dhApiClient, String application, String idPrefex, int numPoliciesSetsToBeCreate) { dhApiClient.deleteMultiplePolicies(application, null, null); for (int i = 1; i <= numPoliciesSetsToBeCreate; i++) { dhApiClient.addPolicy( new Policies(application, idPrefex+"testonly-" + i, "FIRSTONE", "UNPAIRED", "", null)); dhApiClient.addPolicy( new Policies(application, idPrefex+"testonly-" + i, "between", "UNPAIRED", "", null)); dhApiClient.addPolicy( new Policies(application, idPrefex+"someother-"+ i, "other", "UNPAIRED", "", null)); } for (int i = 1; i <= numPoliciesSetsToBeCreate; i++) { try {Thread.sleep(2);} catch (Exception e){}; // ensure a time gap dhApiClient.addPolicy( new Policies(application, idPrefex+"testonly-" + i, "LASTONE", "UNPAIRED", "", null)); } } private void create6testPolices(DataHunterRestApiClient dhApiClient) { dhApiClient.deleteMultiplePolicies("testapi", null, null); dhApiClient.deleteMultiplePolicies("otherapp", null, null); dhApiClient.addPolicy(new Policies("testapi","im1", "", "USED", "", null)); try {Thread.sleep(2);} catch (Exception e){}; // guarantee no two rows have matching 'created' (for sorts) dhApiClient.addPolicy(new Policies("testapi","im2", "", "USED", null, null)); try {Thread.sleep(2);} catch (Exception e){}; dhApiClient.addPolicy(new Policies("testapi","im3", "nonblanklc", "USED", "otherdata3", null)); try 
{Thread.sleep(2);} catch (Exception e){}; dhApiClient.addPolicy(new Policies("testapi","im3", "duplicatedid", "REUSABLE", "duplicated id", null)); try {Thread.sleep(2);} catch (Exception e){}; dhApiClient.addPolicy(new Policies("testapi","im4", "nonblanklc", "UNUSED", "", null)); try {Thread.sleep(2);} catch (Exception e){}; dhApiClient.addPolicy(new Policies("otherapp","io1", "", "UNUSED", null, null)); } private void assertsOnPolicy(Policies expectedPolicy, Policies actualPolicy) { assertEquals(expectedPolicy.getApplication(), actualPolicy.getApplication()); assertEquals(expectedPolicy.getIdentifier(), actualPolicy.getIdentifier()); assertEquals(expectedPolicy.getLifecycle(), actualPolicy.getLifecycle()); assertEquals(expectedPolicy.getOtherdata(), actualPolicy.getOtherdata()); if (expectedPolicy.getEpochtime() != null) { assertEquals(expectedPolicy.getEpochtime(), actualPolicy.getEpochtime()); } } /** * runs each of the sample use cases against a local DataHunter instance * @param args none required */ public static void main(String[] args) { System.out.println("running DataHunterRestApiClientSampleUsage .."); DataHunterRestApiClient dhApiClient = new DataHunterRestApiClient("http://localhost:8081/mark59-datahunter" ); DataHunterRestApiClientSampleUsage sample = new DataHunterRestApiClientSampleUsage(); sample.basicPolicyAddPrintDeleteChecks(dhApiClient); sample.workingWithMultiplePolicies(dhApiClient); sample.policyCountsAndBreakdowns(dhApiClient); sample.policyCountBreakdownsUsingStartWith(dhApiClient); // this method clears the database !! sample.workingWithUseStateChanges(dhApiClient); sample.workingWithAsyncMessages(dhApiClient); sample.asyncLifeCycleTestWithUseabilityUpdate(dhApiClient); System.out.println("completed DataHunterRestApiClientSampleUsage run"); } }
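As a conceptual aside, the use-state semantics the assertions above rely on (an UNUSED item is handed out once and flipped to USED, whereas a REUSABLE item can be handed out repeatedly without being flipped, and an empty store yields the "ran out of data" case) can be sketched with a tiny in-memory model. This is a hypothetical illustration only, not the DataHunter implementation; the class and method names below are invented for the sketch.

```java
import java.util.*;

// Hypothetical in-memory model (NOT the DataHunter implementation) illustrating
// the use-state semantics exercised by the sample assertions: useNextPolicy hands
// out the oldest item in the requested state; UNUSED items are flipped to USED,
// while REUSABLE items are handed out but never flipped.
public class UseStateSketch {
    // identifier -> useability, kept in insertion order (oldest entry first)
    private final LinkedHashMap<String, String> policies = new LinkedHashMap<>();

    public void addPolicy(String identifier, String useability) {
        policies.put(identifier, useability);
    }

    // Emulates SELECT_OLDEST_ENTRY: return the oldest policy in the requested state.
    public String useNextPolicy(String useability) {
        for (Map.Entry<String, String> e : policies.entrySet()) {
            if (e.getValue().equals(useability)) {
                if (!"REUSABLE".equals(useability)) {
                    e.setValue("USED"); // consumed: not handed out again as UNUSED
                }
                return e.getKey();
            }
        }
        return null; // the "ran out of data" case
    }

    public static void main(String[] args) {
        UseStateSketch store = new UseStateSketch();
        store.addPolicy("im1", "UNUSED");
        store.addPolicy("im2", "UNUSED");
        store.addPolicy("im3", "REUSABLE");
        System.out.println(store.useNextPolicy("UNUSED"));   // im1
        System.out.println(store.useNextPolicy("UNUSED"));   // im2
        System.out.println(store.useNextPolicy("UNUSED"));   // null (ran out)
        System.out.println(store.useNextPolicy("REUSABLE")); // im3, stays REUSABLE
        System.out.println(store.useNextPolicy("REUSABLE")); // im3 again
    }
}
```

This mirrors why, in the samples above, repeated useNextPolicy calls walk through im1..im4 exactly once each, while the REUSABLE im3 entry can be looked up again and again.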
Just over a month after the launch of Java 8, it was announced yesterday that, after a massive two-year effort, Java ME 8 is now officially GA (click here for a full feature tour, courtesy of Steve Meloan and Oracle senior technologist and product manager Terrence Barr). Java ME 8 constitutes a major update to the existing embedded technology, bundling in a significant number of updated and new features, including Java language and API alignment with Java SE 8, a comprehensive application model, sophisticated security features, and standard APIs for power management and interaction with an extensive set of standard peripherals.

It wouldn’t be hyperbolic to call this release epoch-making for the Internet of Things (IoT). Within Java ME 8, developers will find the tools they need to turn the fragmented foundations of the IoT into a cohesive movement. The platform is designed to provide a bespoke, flexible, secure, and highly scalable development and deployment environment for the embedded space, and the nine-million-plus international Java community is now “poised to help facilitate what many predict will be a third IT revolution.”

Figure 1: An overview of the Java ME 8 platform.

At the forefront of Java ME 8’s design is a series of careful calibrations to ensure speedier application performance – a critical factor when you consider the billion-strong network of low-power devices that will make up the fabric of the IoT. This release also brings Java ME and Java SE together, which Oracle has described as “the most significant upgrade to the Java Programming Model ever.” The warders of the platform reckon this will help facilitate a smoother developer experience, as well as easier code replication across the platforms. Potential use-case scenarios have been widened, with the platform now customisable for devices with as little as 192 KB RAM and 1 MB of Flash/ROM. There’s also improved networking and connectivity, including wireless support (3GPP, CDMA, WiFi), and restyled access to peripheral devices through the Device Access API, as well as new APIs for RESTful programming (JSON, OAuth2, HTTP client). You can get an official “Introduction into Java Micro Edition (ME) 8” here.
Two-loop non-planar hexa-box integrals with one massive leg

Based on the Simplified Differential Equations approach, we present results for the two-loop non-planar hexa-box families of master integrals. We introduce a new approach to obtain the boundary terms and establish a one-dimensional integral representation of the master integrals in terms of Generalised Polylogarithms, when the alphabet contains non-factorisable square roots. The results are relevant to the study of NNLO QCD corrections for $W$, $Z$ and Higgs-boson production in association with two hadronic jets.

Introduction

The computation of higher-order corrections to Standard Model (SM) scattering processes and their comparison against data coming from collider experiments remains one of the best approaches for the study of Nature at its most fundamental level. The discovery of the Higgs boson at the LHC solidified the mathematical consistency of the SM of Particle Physics as our best fundamental description of Nature. In the absence of any clear signals for physics beyond the SM, a detailed study of the properties of the Higgs boson, along with a scrutiny of key SM processes, has spearheaded the endeavour to advance our understanding of Particle Physics. The upcoming High-Luminosity upgrade of the LHC will provide us with experimental data of unprecedented precision. Making sense of the data and exploiting the machine's full potential will require theoretical predictions of equally high precision. In recent years, the theoretical community has made a tremendous effort to meet the challenge of performing notoriously difficult perturbative calculations in Quantum Field Theory. The current precision frontier for the QCD-dominated processes studied at the LHC lies at Next-to-Next-to-Leading Order (NNLO) for massless $2 \to 3$ scattering with one off-shell external particle. A typical NNLO calculation involves, among other things, the computation of two-loop Feynman diagrams.
The established method for performing such calculations is by solving first-order differential equations (DE) satisfied by the relevant Feynman integrals (FI). Working within dimensional regularisation in $d = 4 - 2\epsilon$ dimensions allows the derivation of linear relations in the form of Integration-By-Parts (IBP) identities satisfied by these integrals, which allows one to obtain a minimal and finite set of FI for a specific scattering process, known as master integrals (MI). It has been conjectured that FI with constant leading singularities in $d$ dimensions satisfy a simpler class of DE, known as canonical DE. A basis of MI satisfying canonical DE is known as a pure basis. The study of the special functions which appear in the solutions of such DE has provided a deeper understanding of their mathematical properties. These special functions often admit a representation in the form of Chen iterated integrals. For a large class of FI, the result can be written in terms of a well-studied class of special functions, known as Multiple or Goncharov polylogarithms (GPLs). Several computational tools have been developed for their algebraic manipulation and numerical evaluation. For the case of two-loop five-point MI with one massive leg, pure bases of MI have recently been presented for the planar topologies, which we will call one-mass pentaboxes, and more recently for some of the non-planar topologies, which we will call one-mass hexaboxes. All one-mass pentaboxes have been computed both numerically, using generalised power-series expansions, and analytically in terms of GPLs, by employing the Simplified Differential Equations (SDE) approach. Recently, analytic results were also obtained in the form of Chen iterated integrals and have been implemented into the so-called one-mass pentagon functions, similar to the two-loop five-point massless results.
These results, along with fully analytic solutions for the relevant one-loop integral family, have led to the production of the first phenomenological studies in the leading-colour approximation for $2 \to 3$ scattering processes involving one massive particle at the LHC. For the one-mass hexabox topologies, numerical results were first presented, using a method which emulates the Feynman parameter technique, for one of the non-planar integral families. All three integral families were subsequently treated numerically using the same methods. In this paper, we employ the SDE approach and obtain semi-analytic results for all one-mass hexaboxes, using the pure bases presented in the literature. More specifically, we obtain fully analytic expressions in terms of GPLs of up to weight 4 for the first non-planar family, denoted as $N_1$ in figure 1. For families $N_2$ and $N_3$, we obtain analytic results for the unknown non-planar integrals up to weight 2, whereas for weights 3 and 4 we introduce a one-fold integral representation in terms of GPLs, allowing for a straightforward numerical evaluation of our expressions.

In the SDE approach the momenta are parametrised by introducing a dimensionless variable $x$, as follows:

$q_1 \to p_{123} - x\, p_{12}, \quad q_2 \to p_4, \quad q_3 \to -p_{1234}, \quad q_4 \to x\, p_1 \quad (2.1)$

Figure 1: The five non-planar families with one external massive leg. The first row corresponds to the so-called hexabox topologies, whereas the diagrams of the second row are known as double-pentagons. We label them as follows: $N_1$ (top left), $N_2$ (top middle), $N_3$ (top right), $N_4$ (bottom left), $N_5$ (bottom right). All diagrams have been drawn using Jaxodraw.

Here the new momenta $p_i$, $i = 1 \ldots 5$, satisfy $\sum_{i=1}^{5} p_i = 0$ and $p_i^2 = 0$, $i = 1 \ldots 5$, whereas $p_{i \ldots j} := p_i + \ldots + p_j$. The set of independent invariants is given by $\{S_{12}, S_{23}, S_{34}, S_{45}, S_{51}, x\}$, with $S_{ij} := (p_i + p_j)^2$.
The explicit mapping between the two sets of invariants follows from (2.1), and as usual the $x = 1$ limit corresponds to the on-shell kinematics. The corresponding Feynman integrals are defined in the standard way, with $q_{i \ldots j} := q_i + \ldots + q_j$. Using FIRE6 we found that the $N_1$ family consists of 86 MI, out of which 10 MI are genuinely new, the rest being known from the one-mass planar pentabox or the non-planar double-box families. For $N_2$ and $N_3$ the corresponding numbers are 86, 13 and 135, 21.

Pure bases and simplified canonical differential equations

We adopt the pure bases presented in the literature. As was the case for the pure bases of the planar families, a d log form of the relevant differential equations was achieved, whose alphabet involves several square roots of the kinematic invariants $\{q_1^2, s_{12}, s_{23}, s_{34}, s_{45}, s_{15}\}$. More specifically, six square roots, $r_1, \ldots, r_6$, appear in the alphabets of the one-mass hexabox integral families (2.10), with $\lambda(x, y, z) = x^2 - 2xy - 2xz + y^2 - 2yz + z^2$ the Källén function, $G(q_1, q_2, q_3, q_4) = \{2 q_i \cdot q_j\}$ the Gram matrix of the external momenta, and $\Sigma$ a further kinematic quantity entering (2.10). For topology $N_1$, the square roots $r_1$ and $r_4$ appear in its alphabet. Introducing the dimensionless variable $x$ rationalises these two roots through the mapping of (2.2). This allows us to derive an SDE in canonical form for $N_1$,

$\partial_x \mathbf{g} = \epsilon \sum_{i=1}^{l_{\max}} \frac{M_i}{x - l_i}\, \mathbf{g}, \quad (2.14)$

where $\mathbf{g}$ is the pure basis of $N_1$, the $M_i$ are the residue matrices corresponding to each letter $l_i$, and $l_{\max}$ is the length of the alphabet, which for $N_1$ is $l_{\max} = 21$. It is interesting to note here the significant reduction in the number of letters in comparison with the previously known alphabet of $N_1$, where the relevant length of the alphabet is 39. The form of (2.14) allows for a direct iterative solution order-by-order in $\epsilon$ in terms of GPLs, assuming that the relevant boundary terms are obtained.
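Since rationalisability of these roots drives the whole construction, it is worth recalling, as a general fact about the Källén function rather than a result of this paper, when a root of Källén type collapses to a rational expression:

```latex
% The Källén function admits the factorised form
\lambda(x,y,z) \;=\; x^2 + y^2 + z^2 - 2xy - 2yz - 2zx \;=\; (x - y - z)^2 - 4yz ,
% so whenever one argument vanishes the square root becomes rational,
\sqrt{\lambda(x,y,0)} \;=\; \sqrt{(x-y)^2} \;=\; |x - y| .
% For generic arguments, however, \sqrt{\lambda} is a genuine square root, and a
% change of variables (such as the x-parametrisation above) is needed to
% rationalise it.
```

This is the mechanism behind the statement that the $x$-mapping rationalises some, but not all, of the six roots: the change of variables must turn every root's argument into a perfect square simultaneously.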
For topologies $N_2$ and $N_3$, the square roots appearing in their respective alphabets are $\{r_1, r_2, r_4, r_5\}$ and $\{r_1, r_3, r_4, r_6\}$. In general, all of the square roots with the exception of $\{r_5, r_6\}$ can be rationalised using either the mapping given in (2.2) or a variant of it. Nevertheless, in order to write an equation in the form of (2.14), a simultaneous rationalisation of all square roots is necessary. In fact, the mapping (2.2) allows for the rationalisation of $r_1$ and $r_4$ in terms of $x$, but this is not the case for $\{r_2, r_3, r_5, r_6\}$. It is thus not possible to achieve a canonical SDE in the form of (2.14) for families $N_2$ and $N_3$ using the parametrisation (2.1). This does not mean that the basis elements cannot be cast in the form of GPLs, but only that such a representation is not straightforwardly obtained from the simple equation (2.14). The more general form of the SDE reads

$d\mathbf{g} = \epsilon \left( \sum_{a} M_a \, d\log L_a(x) \right) \mathbf{g}, \quad (2.15)$

where most of the $L_a$ are simple rational functions of $x$, as in (2.14), whereas the rest are algebraic functions of $x$ involving the non-rationalisable square roots. A detailed analysis of (2.15) reveals that these non-rationalisable square roots start appearing at weight two. In practice this means that we can use the mapping (2.2) and solve the respective canonical DE for $N_2$ and $N_3$ by integrating with respect to $x$ up to weight one in terms of ordinary logarithms. For weight two, analytic expressions in terms of GPLs can be achieved due to the fact that the non-rationalisable square roots $\{r_2, r_3, r_5, r_6\}$ appear decoupled in the DE. In fact, most of the basis elements are straightforwardly expressed in terms of GPLs by integrating the corresponding DE. For the rest, an educated ansatz can be constructed involving only specific weight-two GPLs, which are identified by inspecting the DE in each case where the square roots $\{r_2, r_3, r_5, r_6\}$ appear, modulo the boundary terms that one needs to compute.
Thus analytic expressions in terms of GPLs up to weight two are obtained for all elements belonging to these families. To further elaborate on this point, let us analyse a rather simple case of a 3-point integral sector with three off-shell legs, which appears in both the $N_2$ and $N_3$ families. This sector comprises two basis elements, and the DE satisfied by those elements also includes two-point MI that are known in closed form. For instance, in $N_2$, the 3-point integrals appear as basis elements number 10 and 11 (see the ancillary file). Element 10 at weight 2, $g_{10}^{(2)}$, can straightforwardly be obtained by integrating (2.15), and it is expressible in terms of GPLs of the form $G(a, b; x)$, where $a, b$ are independent of $x$. By contrast, element 11 at weight 2, $g_{11}^{(2)}$, is obtained through the construction of an ansatz. Let us mention that all elements in question, except those involving the square roots $\{r_5, r_6\}$, namely element 73 in $N_2$ and element 114 in $N_3$, are known in terms of GPLs up to weight 4, based though on different variants of the parametrisation (2.1). For instance, element 11 of $N_2$ is known in a parametrisation of the external momenta whose set of independent invariants is $\{\tilde{S}_{12}, \tilde{S}_{23}, \tilde{S}_{34}, \tilde{S}_{45}, \tilde{S}_{51}, y\}$, with $\tilde{S}_{ij} := (\tilde{p}_i + \tilde{p}_j)^2$. Expressing these invariants in terms of (2.2), we can write the DE for this element, $\frac{d}{dx} g_{11}^{(2)}$, in a simple and compact form. The form of the DE makes the determination of the ansatz rather straightforward, with the resulting expression (2.19). Concerning the other non-rationalisable square root in the family $N_2$, $r_5$, it also appears for the first time at weight 2, and in basis element 73 only (see the ancillary file), which is one of the new integrals to be calculated.
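To make the shape of such an ansatz concrete, a generic weight-2 template can be sketched as follows. The coefficients and the functions $f_{\pm}$ below are placeholders for illustration only, not the explicit expressions of (2.19):

```latex
% Illustrative weight-2 ansatz (placeholders, not this paper's explicit result):
g^{(2)}(x) \;=\; \sum_i c_i \, G\!\left(a_i, b_i; x\right)
\;+\; \sum_{j,k} c_{jk} \, \log f_j(x) \, \log f_k(x) \;+\; b^{(2)} ,
% where the a_i, b_i are x-independent letters, the f_{\pm}(x) are algebraic
% functions of x carrying the non-rationalisable square root, the c_i, c_{jk}
% are rational numbers fixed by matching d g^{(2)}/dx against the right-hand
% side of the differential equation, and the constant b^{(2)} is a boundary term.
```

The key point is that only a small, DE-dictated set of weight-two functions can appear, so matching derivatives term by term determines all coefficients.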
Following the same procedure as for element 11, namely writing the corresponding DE in a similar form, we find that the expression at weight 2 is similar to that of (2.19), shown in (2.20). Regarding family N 3 , there are two 3-point integral sectors with three off-shell legs that involve the square root r 4 , which is not rationalised in terms of x by (2.1); they consist of elements 12, 13 and 16, 17. Similarly to element 11 of family N 2 , elements 12 and 16 cannot be expressed in terms of GPLs through a straightforward integration of their respective DE. However, we can achieve a GPL representation for them at weight 2 similar to (2.19), where now the f − , f + functions involve the square root r 4 instead of r 2 . The square root r 6 appears for the first time at weight 2 in element 114, in the same way that r 5 appears in element 73 of the N 2 family, allowing us to obtain an expression at weight 2 as in (2.20), with the f − , f + functions involving r 6 instead of r 5 . Studying basis elements that are known in terms of GPLs up to weight 4 proved useful in constructing an educated ansatz for the unknown integrals at weight 2. It would be very interesting to pursue this direction further, with the aim of establishing a systematic way to construct representations in terms of GPLs for weights higher than 2. This would allow the SDE approach to be extended to cases where the letters L a in (2.15) assume a general algebraic form. Constructing analytic expressions in terms of GPLs beyond weight 2 by applying a more general procedure following the ideas of is also possible, but it requires a significant amount of resources and might well result in a proliferation of GPLs. A more practical and direct approach, introducing a one-dimensional integral representation, will be presented in detail in section 4.
Boundary terms

In this section we describe the analytic computation of all necessary boundary terms in terms of GPLs, with rational functions of the underlying kinematic invariants S ij , up to weight 4. We perform this task for all three non-planar families. Our main approach is the one introduced in and elaborated in detail in . In general we need to calculate the x → 0 limit of each pure basis element. First, we exploit the canonical SDE in the limit x → 0 and define through it the resummation matrix, where the matrices S, D are obtained through the Jordan decomposition (3.2) of the residue matrix M 1 for the letter l 1 = 0. Secondly, we relate the elements of the pure basis to a set of MI G through IBP reduction, g = TG. Using the expansion by regions method, as implemented in the asy code shipped along with FIESTA4, we obtain the x → 0 limit of the MI in terms of which we express the pure basis (3.3), where a j and b j are integers and G i are the individual members of the basis G of MI in (3.3). This analysis allows us to construct relation (3.5), where the right-hand side implies that, apart from the terms x a i coming from (3.4), we expand around x = 0, keeping only terms of order x 0 . Equation (3.5) allows us in principle to determine all boundary constants b = ∑_{i=0}^{6} ε^i b_0^(i) . More specifically, in the case where D in (3.2) is non-diagonal, we will get logarithmic terms in x on the left-hand side of (3.5), of the form x a j log(x). Since no such terms appear on the right-hand side of (3.5), a set of linear relations between elements of the array b is obtained by setting the coefficients of the x a j log(x) terms to zero. Furthermore, powers x a j that appear only on the left-hand side can also yield linear relations among elements of b, by setting their coefficients to zero. We shall call these two sets of relations pure, since they are linear relations among elements of b with rational numbers as coefficients.
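The role of the Jordan decomposition and of the x^{a_j} log(x) terms can be illustrated with a toy residue matrix in sympy; the 2×2 matrix below is an assumption chosen only so that the Jordan form is non-diagonal, and is not taken from the paper:

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# Toy residue matrix with a single non-trivial Jordan block (assumption):
M1 = sp.Matrix([[0, 1],
                [0, 0]])

S, J = M1.jordan_form()          # M1 = S * J * S^{-1}
a = J[0, 0]                      # eigenvalue of the (single) Jordan block
N = J - a * sp.eye(2)            # nilpotent part; here N**2 == 0
xJ = x**a * (sp.eye(2) + N * sp.log(x))   # x^J for a single 2x2 block
R = S * xJ * S.inv()             # resummation matrix, schematically S x^D S^{-1}

# The off-diagonal log(x) entry is exactly the kind of x^{a_j} log(x) term
# whose absence on the right-hand side of (3.5) forces linear ("pure")
# relations among the boundary constants.
```

For larger matrices with several Jordan blocks the exponential x^J is built block by block in the same way, truncating the nilpotent series at the block size.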
These pure relations account for the determination of a significant part of the components of the boundary array. Finally, for the undetermined elements of b, several region-integrals G i (b j +a j ) coming from (3.4) usually need to be calculated. Their calculation is straightforwardly achieved either by direct integration in Feynman-parameter space, followed by the use of HypExp to expand the resulting 2 F 1 hypergeometric functions, or, in a very few cases, by Mellin-Barnes techniques using the MB, MBSums and XSummer packages. This approach was efficient enough for the determination of all boundary terms for families N 1 and N 2 . Specifically, for family N 1 , where a canonical SDE (2.14) can be achieved, we can write a solution in terms of GPLs up to weight 4 in the compact form (3.6), where G ab... := G(l a , l b , . . . ; x) denotes the GPLs. These results are presented in such a way that each coefficient of ε^i has transcendental weight i. If we assign weight −1 to ε, then (3.6) has uniform weight zero. For family N 3 , eq. (3.5) resulted in a proliferation of region-integrals, more than 200, that one would have to calculate in order to obtain boundary terms for several higher-sector basis elements. More specifically, in order to obtain the boundary terms {b 101 , b 103 , b 104 , b 106 , b 113 , b 117 , b 118 , b 124 , b 125 , b 126 , b 130 , b 131 , b 132 } of (3.7), one would have to calculate 208 region-integrals, 17 of which have seven Feynman parameters to be integrated, making their direct integration highly non-trivial. For all basis elements apart from (3.7) we were able to obtain boundary terms through (3.5). To reduce the number of region-integrals needed for the computation of (3.7), we investigated a different approach. The idea is rather simple and straightforward.
The pure basis elements can be written in general in the form (3.8), where D i , i = 1 . . . 11, represent the inverse scalar propagators, S the set of indices corresponding to a given sector, S ij , x the kinematic invariants, P a polynomial, a i positive integers and C a factor depending on S ij and x. This form is usually decomposed in terms of FI, with c i being polynomials in S ij , x. The limit x → 0 is then obtained, after IBP reduction, through the Feynman-parameter representation of the individual MI, as described in the previous paragraphs. An alternative approach is to build up the Feynman-parameter representation for the whole basis element, by considering the integral in (3.8) as a tensor integral and making use of the formulae from the references to bring it into its Feynman-parameter representation. Then, using the expansion by regions approach, we determine the regions in the limit x → 0. Rescaling the Feynman parameters by appropriate powers of x and keeping the leading power in x, we then obtain the final result, which can be written as follows, where I runs over the set of contributing regions, U I and F I are the limits of the usual Symanzik polynomials, Π I is a polynomial in the Feynman parameters x i and the kinematic invariants S ij , and S I is the subset of surviving Feynman parameters in the limit. In this way a significant reduction of the number of regions to be calculated is achieved, namely from 208 to 9. Notice that, in contrast to the approach described in the previous paragraphs, only the regions x −2 and x −4 contribute to the final result, making the evaluation of the region-integrals simpler. Moreover, this approach bypasses the need for an IBP reduction of the basis elements in terms of MI.
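The "rescale and keep the leading power" step can be sketched in sympy as follows; the polynomial F is a toy stand-in for a Symanzik polynomial, and the chosen scaling x2 → x²·x2 is an illustrative assumption, not a region from the paper:

```python
import sympy as sp

x, x1, x2, S12 = sp.symbols('x x1 x2 S12', positive=True)

# Toy "F polynomial" (assumption: illustrative only):
F = S12*x1*x2 + x**2*x1 + x**4*x2

# Candidate region: the Feynman parameter x2 scales as x**2. Substitute,
# expand, and keep only the overall leading power of x; whatever multiplies
# it is the region polynomial F_I that survives in the limit.
F_scaled = sp.expand(F.subs(x2, x**2 * x2))
min_deg = min(deg for (deg,) in sp.Poly(F_scaled, x).monoms())
F_I = sp.expand(F_scaled / x**min_deg).subs(x, 0)

# Here min_deg == 2 and F_I == S12*x1*x2 + x1: the x**4*x2 term is
# power-suppressed in this region and drops out.
```

Repeating this over all candidate scalings and discarding scaleless ones is what produces the small set of contributing regions (here, 9 instead of 208).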
Integral representation

After obtaining all boundary terms in section 3 and constructing analytic expressions for families N 2 and N 3 up to O(ε 2 ) in terms of GPLs up to weight two, we now introduce a one-fold integral representation for O(ε 3 ) and O(ε 4 ). This representation allows us to obtain numerical results through direct numerical integration.

Weight 3: The differential equation (2.15) can be written in a form where a runs over the set of contributing letters, I, J run over the set of basis elements, c a IJ are rational-number coefficients read off from the matrices M a , and g (2) J are the basis elements at weight 2, known in terms of GPLs. Since the lower limit of integration corresponds to x = 0, we need to subtract the appropriate term so that the integral is explicitly finite. This is achieved as in (4.2), where g I,0 are obtained by expanding g (2) I around x = 0 and keeping terms up to order O(log 2 (x)), and the constants l a ∈ Q are defined accordingly. The DE (4.2) can now be integrated from x = 0 to x = x̄, and the result is given in (4.4).

Weight 4: At weight 4, the differential equation (2.15) can be written in an analogous form, which, after doubly subtracting in order to obtain integrals that are explicitly finite as in (4.2), is written with LL a obtained by expanding log(L a ) around x = 0 and keeping terms up to order O(log(x)), and with g (3) I the weight-3 elements. Now, by integrating by parts and using (4.2), we can write the final result as in (4.10), with a, b running over the set of contributing letters, I, J, K running over the set of basis elements, b I being the boundary terms at O(ε 4 ), and g (4) I,G the part where the subscript G indicates that the integral is represented in terms of GPLs (see ancillary file), following (4.6).

Implementation: As a proof of concept, we have implemented the final formulae (4.4) and (4.10) in Mathematica.
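The endpoint subtraction that renders the one-fold integrals explicitly finite can be demonstrated numerically; the function g below is a toy stand-in for a weight-two basis element (an assumption, not an actual basis element), and the check merely illustrates the mechanism behind (4.2):

```python
import numpy as np
from scipy.integrate import quad

# Toy weight-two function with g(0) = 1 (assumption, illustrative only):
def g(x):
    return 1.0 + 0.5 * np.log1p(x)**2

g0 = 1.0          # leading behaviour of g around x = 0
xbar = 0.7        # upper integration limit

# The naive integral of g(x)/x diverges logarithmically at x = 0; the
# subtracted integrand (g(x) - g0)/x is explicitly finite, and the
# subtracted piece g0/x is integrated analytically.
finite, _ = quad(lambda x: (g(x) - g0) / x, 0.0, xbar)
result = finite + g0 * np.log(xbar)

# Cross-check against a cutoff regularisation:
#   int_eps^xbar g/x dx + g0*log(eps)  ->  result  as eps -> 0.
eps = 1e-8
check, _ = quad(lambda x: g(x) / x, eps, xbar, limit=200)
assert abs(result - (check + g0 * np.log(eps))) < 1e-5
```

In the actual representation the analytically integrated subtraction terms combine with the boundary constants, so only manifestly finite one-dimensional integrals remain for NIntegrate.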
We use NIntegrate to perform the one-dimensional integrals appearing in (4.4) and (4.10), after expressing all weight-2 functions in terms of classical polylogarithms following references . For kinematic configurations without singularities in the domain of integration (0, x̄), we have checked the new basis elements obtained by us against numerical results provided by the authors of reference and found full agreement. For kinematic configurations with singularities in the domain of integration, we use the iε-prescription as explained in references , as well as the convention concerning the square roots appearing in the alphabet and the basis elements as detailed in section 6.2 of reference . We provide proof-of-concept codes in the ancillary files for both the Euclidean point mentioned above and the first physical phase-space point of Eq. (6.15) in reference . The reader can easily assess the performance of this straightforward implementation by running the provided codes and looking at the minimum number of digits in agreement with the high-precision results from reference , as well as at the number of integrand evaluations performed by NIntegrate. Notice that the integrand expressions involve logarithms and classical polylogarithms Li 2 that are evaluated using very little CPU time. The parts of the formulae (4.4) and (4.10) that can be represented in terms of GPLs up to weight four, as well as the results for the N 1 family, for which we have all basis elements in terms of GPLs up to weight four, are evaluated with GiNaC as implemented in PolyLogTools. In the current implementation we use the default parameters for GiNaC and for NIntegrate, with the exception of WorkingPrecision and PrecisionGoal, in order to obtain reasonable results within reasonable time, taking into account that the provided implementation serves merely as a demonstration of the correctness of our representations.
For the Euclidean point the precision is typically of the order of 32 digits, which is compatible with the GiNaC setup. For the physical point, the typical precision is of the order of 25 digits, which is compatible with the expected one, taking into account the numerical value of the infinitesimal imaginary part assigned to the kinematic invariants. We plan to address all the details regarding the numerical evaluation in a forthcoming publication, where an optimised implementation for all two-loop five-point basis elements based on our analytic results, in line with references , will be presented.

Conclusions

The frontier of precision calculations at NNLO currently concerns 2 → 3 scattering processes involving massless propagators and one massive external particle. At the level of FI, all planar two-loop MI have recently been computed through the solution of canonical DE, both numerically, via generalised power-series expansions, and analytically in terms of GPLs up to weight 4, using the SDE approach . More recently, results in terms of Chen iterated integrals were presented and implemented in the so-called pentagon functions . Concerning the two-loop non-planar topologies, these can be classified into the three so-called hexabox topologies and two so-called double-pentagons, see figure 1. One of the hexabox topologies, denoted as N 1 in figure 1, was calculated numerically a few years ago using an approach which introduces a Feynman parameter and uses analytic results for the sub-topologies that are involved . More recently, pure bases for the three hexabox topologies satisfying DE in d log form were presented in reference and solved numerically using the same methods as in . In this paper we addressed the calculation of the three two-loop hexabox topologies, N 1 , N 2 , N 3 in figure 1, using the SDE approach. For the N 1 family, results up to weight 4 in terms of GPLs are obtained.
For the N 2 and N 3 families we have established a one-dimensional integral representation involving GPLs up to weight 2. This extends the scope of the SDE approach to cases where non-rationalisable square roots appear in the alphabet. We have also introduced a new approach for computing the boundary terms directly from the basis elements, which significantly reduces the complexity of the problem. With these new developments, we hope to complete the full set of five-point one-mass two-loop MI families in the near future and to provide a solid implementation for their numerical evaluation.
/*
 * Copyright 2015-2016 USEF Foundation
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package energy.usef.agr.workflow.plan.connection.profile;

import static energy.usef.core.constant.USEFConstants.LOG_COORDINATOR_FINISHED_HANDLING_EVENT;
import static energy.usef.core.constant.USEFConstants.LOG_COORDINATOR_START_HANDLING_EVENT;

import energy.usef.agr.dto.ConnectionPortfolioDto;
import energy.usef.agr.dto.ElementDto;
import energy.usef.agr.model.Element;
import energy.usef.agr.service.business.AgrElementBusinessService;
import energy.usef.agr.service.business.AgrPortfolioBusinessService;
import energy.usef.agr.transformer.ElementTransformer;
import energy.usef.agr.workflow.AgrWorkflowStep;
import energy.usef.core.config.Config;
import energy.usef.core.config.ConfigParam;
import energy.usef.core.event.validation.EventValidationService;
import energy.usef.core.exception.BusinessValidationException;
import energy.usef.core.workflow.DefaultWorkflowContext;
import energy.usef.core.workflow.WorkflowContext;
import energy.usef.core.workflow.step.WorkflowStepExecuter;
import energy.usef.core.workflow.util.WorkflowUtil;

import java.util.List;

import javax.ejb.Lock;
import javax.ejb.LockType;
import javax.ejb.Singleton;
import javax.enterprise.event.Event;
import javax.enterprise.event.Observes;
import javax.enterprise.event.TransactionPhase;
import javax.inject.Inject;
import javax.transaction.Transactional;

import org.joda.time.LocalDate;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * Coordinator class in charge of the workflow updating the element data store.
 */
@Singleton
public class AgrUpdateElementDataStoreCoordinator {

    private static final Logger LOGGER = LoggerFactory.getLogger(AgrUpdateElementDataStoreCoordinator.class);

    @Inject
    private WorkflowStepExecuter workflowStepExecuter;

    @Inject
    private Config config;

    @Inject
    private AgrPortfolioBusinessService agrPortfolioBusinessService;

    @Inject
    private AgrElementBusinessService agrElementBusinessService;

    @Inject
    private Event<CreateConnectionProfileEvent> createConnectionProfileEventManager;

    @Inject
    private EventValidationService eventValidationService;

    /**
     * Update the element data store in order to supply up-to-date data to the subsequent Portfolio initialization process.
     *
     * @param event {@link AgrUpdateElementDataStoreEvent} event triggering the workflow.
     */
    @Lock(LockType.WRITE)
    @Transactional(Transactional.TxType.REQUIRES_NEW)
    public void updateElementDataStore(@Observes(during = TransactionPhase.AFTER_COMPLETION) AgrUpdateElementDataStoreEvent event)
            throws BusinessValidationException {
        LOGGER.info(LOG_COORDINATOR_START_HANDLING_EVENT, event);
        eventValidationService.validateEventPeriodInFuture(event);

        // retrieve the input for the PBC from the database
        List<ConnectionPortfolioDto> connectionPortfolioDtoList = agrPortfolioBusinessService
                .findConnectionPortfolioDto(event.getPeriod());

        List<ElementDto> elementDtoList = invokePBC(event.getPeriod(), connectionPortfolioDtoList);

        // persist the elements
        List<Element> elementList = ElementTransformer.transformToModelList(elementDtoList);
        agrElementBusinessService.createElements(elementList);

        createConnectionProfileEventManager.fire(new CreateConnectionProfileEvent(event.getPeriod()));
        LOGGER.info(LOG_COORDINATOR_FINISHED_HANDLING_EVENT, event);
    }

    @SuppressWarnings("unchecked")
    private List<ElementDto> invokePBC(LocalDate period, List<ConnectionPortfolioDto> connectionPortfolioDtoList) {
        // setup the input for the PBC
        WorkflowContext context = new DefaultWorkflowContext();
        context.setValue(AgrUpdateElementDataStoreParameter.IN.PERIOD.name(), period);
        context.setValue(AgrUpdateElementDataStoreParameter.IN.PTU_DURATION.name(),
                config.getIntegerProperty(ConfigParam.PTU_DURATION));
        context.setValue(AgrUpdateElementDataStoreParameter.IN.CONNECTION_PORTFOLIO_LIST.name(), connectionPortfolioDtoList);

        context = workflowStepExecuter.invoke(AgrWorkflowStep.AGR_UPDATE_ELEMENT_DATA_STORE.name(), context);

        // validate output of the PBC
        WorkflowUtil.validateContext(AgrWorkflowStep.AGR_UPDATE_ELEMENT_DATA_STORE.name(), context,
                AgrUpdateElementDataStoreParameter.OUT.values());

        return context.get(AgrUpdateElementDataStoreParameter.OUT.ELEMENT_LIST.name(), List.class);
    }
}
Syntactic-semantic relationships in the mental lexicon of aphasic patients

This paper examines the relative values of syntactic-semantic relationships in the mental lexicon of aphasic patients, tested within syntagmatic and paradigmatic networks of lexical relations. Semantic relations such as synonymy, antonymy, and hyperonymy, as well as collocational and coordinational syntactic-semantic relations, were examined simultaneously. Twenty-five subjects diagnosed with nominal aphasia were tested, along with a control group of 20 healthy subjects. The control group was matched with the aphasic group in terms of dominant hemisphere, age, sex, and occupation. A naming test based on semantic context was used in this research. The test was presented orally: after the examiner had read a sentence, subjects were asked to finish it with a target word (the word which was, through context, in a syntactic-semantic relationship with the rest of the sentence). Sentences were composed of highly frequent words. The categories used in the test were randomly ordered. The resulting data were analysed according to the semantic relations of the answers in the given context, and according to the type of the semantic-syntactic relation in 'wrong' answers. The results of this analysis are interpreted in terms of current psycholinguistic theories.
// UVa 10360 - Rat Attack
#include <stdio.h>
#include <string.h>
#include <algorithm>
using namespace std;

int g[1025][1025];

int main() {
    int testcase, n, d;
    scanf("%d", &testcase);
    while (testcase--) {
        scanf("%d %d", &d, &n);
        memset(g, 0, sizeof(g));
        int x, y, size;
        int i, j, xl, xr, yl, yr;
        while (n--) {
            scanf("%d %d %d", &x, &y, &size);
            // every cell within Chebyshev distance d of (x, y) kills this
            // nest, so add its population to all those cells
            xl = max(0, x - d), xr = min(x + d, 1024);
            yl = max(0, y - d), yr = min(y + d, 1024);
            for (i = xl; i <= xr; i++)
                for (j = yl; j <= yr; j++)
                    g[i][j] += size;
        }
        int mx = -1, rx = 0, ry = 0;
        for (i = 0; i < 1025; i++)
            for (j = 0; j < 1025; j++)
                if (g[i][j] > mx)
                    mx = g[i][j], rx = i, ry = j;
        printf("%d %d %d\n", rx, ry, mx);
    }
    return 0;
}
package com.sequenceiq.cloudbreak.cmtemplate.configproviders.profilermanager;

import static com.sequenceiq.cloudbreak.cmtemplate.configproviders.ConfigUtils.config;

import java.util.List;

import org.springframework.stereotype.Component;

import com.cloudera.api.swagger.model.ApiClusterTemplateConfig;
import com.google.common.annotations.VisibleForTesting;
import com.sequenceiq.cloudbreak.api.endpoint.v4.database.base.DatabaseType;
import com.sequenceiq.cloudbreak.cmtemplate.configproviders.AbstractRdsRoleConfigProvider;
import com.sequenceiq.cloudbreak.cmtemplate.configproviders.ConfigUtils;
import com.sequenceiq.cloudbreak.template.TemplatePreparationObject;
import com.sequenceiq.cloudbreak.template.views.RdsView;

@Component
public class ProfilerMetricsRoleConfigProvider extends AbstractRdsRoleConfigProvider {

    @VisibleForTesting
    static final String PROFILER_METRICS_DATABASE_HOST = "profiler_metrics_database_host";

    @VisibleForTesting
    static final String PROFILER_METRICS_DATABASE_NAME = "profiler_metrics_database_name";

    @VisibleForTesting
    static final String PROFILER_METRICS_DATABASE_TYPE = "profiler_metrics_database_type";

    @VisibleForTesting
    static final String PROFILER_METRICS_DATABASE_USER = "profiler_metrics_database_user";

    @VisibleForTesting
    static final String PROFILER_METRICS_DATABASE_PASSWORD = "<PASSWORD>"; // value redacted in the source

    @Override
    protected List<ApiClusterTemplateConfig> getRoleConfigs(String roleType, TemplatePreparationObject source) {
        if (ProfilerManagerRoles.PROFILER_METRICS_AGENT.equals(roleType)) {
            RdsView profilerManagerRdsView = getRdsView(source);
            return List.of(
                    config(PROFILER_METRICS_DATABASE_HOST, profilerManagerRdsView.getHost()),
                    config(PROFILER_METRICS_DATABASE_NAME, profilerManagerRdsView.getDatabaseName()),
                    config(PROFILER_METRICS_DATABASE_TYPE,
                            ConfigUtils.getDbTypePostgres(profilerManagerRdsView, ProfilerManagerRoles.PROFILER_MANAGER)),
                    config(PROFILER_METRICS_DATABASE_USER, profilerManagerRdsView.getConnectionUserName()),
                    config(PROFILER_METRICS_DATABASE_PASSWORD, profilerManagerRdsView.getConnectionPassword()));
        }
        return List.of();
    }

    @Override
    public String getServiceType() {
        return ProfilerManagerRoles.PROFILER_MANAGER;
    }

    @Override
    public List<String> getRoleTypes() {
        return List.of(ProfilerManagerRoles.PROFILER_METRICS_AGENT);
    }

    @Override
    protected DatabaseType dbType() {
        return DatabaseType.PROFILER_METRIC;
    }
}
def quiz1(self):
    _, ax = plt.subplots(len(_data_file_list), figsize=(12, 7))
    datas = {}
    for i, file in enumerate(_data_file_list):
        pair_name = file.split('_')[1]
        pair_data = pd.read_csv(
            _data_path / file, index_col='Date', parse_dates=True, skiprows=1)
        pair_data.set_index(
            pd.to_datetime(pair_data.index, format='%Y-%m-%d %I-%p'),
            inplace=True, verify_integrity=True)
        datas[pair_name] = pair_data.sort_index()
        print('pair: {}, head data: \n{}'.format(pair_name, datas[pair_name].head(10)))

        new_ylabel = pair_name + '($)'
        # seaborn's relplot is figure-level and ignores an `ax` argument, so
        # draw on the prepared subplot with the axes-level lineplot instead
        sns.lineplot(
            x='Date', y=new_ylabel,
            data=datas[pair_name].head(50).rename(columns={'Close': new_ylabel}).reset_index(),
            ax=ax[i])
        ax[i].set_ylabel(pair_name)
    plt.tight_layout()
    plt.show()

    for c1, c2 in itertools.combinations(datas.keys(), 2):
        print('Pearson correlation coefficient between {} and {}: {}'.format(
            c1, c2, scipy.stats.pearsonr(datas[c1].Close, datas[c2].Close)))

    corr_data = pd.DataFrame({c: v.Close for c, v in datas.items()})
    plt.figure(figsize=(12, 7))
    plt.title('Pearson correlation coefficients between BTC, ETH, XRP, LTC')
    sns.heatmap(corr_data.corr(), vmin=-1.0, vmax=1.0, square=True, annot=True)
    plt.show()
import { parseISO, format } from "date-fns";

import styles from "./date.module.scss";
import CalendarIcon from "./../../assets/Icons/Calendar.svg";
import ClockIcon from "./../../assets/Icons/Clock.svg";

export default function Date({ dateString }: { dateString: string }) {
  const date = parseISO(dateString);
  return (
    <div className={styles.container}>
      <div>
        <div className={styles.icon}>
          <CalendarIcon />
        </div>
        <time className={styles.value} dateTime={dateString}>
          {format(date, "LLLL d, yyyy")}
        </time>
      </div>
      <div>
        <div className={styles.icon}>
          <ClockIcon />
        </div>
        <time className={styles.value} dateTime={dateString}>
          {format(date, "HH:mm")}
        </time>
      </div>
    </div>
  );
}
/** Gnome libs imports */
import * as GObject from 'gobject';
import { registerGObjectClass } from 'src/utils/gjs';
import * as St from 'st';

/** Extension imports */
const Me = imports.misc.extensionUtils.getCurrentExtension();

@registerGObjectClass
export class MatDivider extends St.Widget {
    static metaInfo: GObject.MetaInfo = {
        GTypeName: 'MatDivider',
    };

    vertical: boolean;

    constructor(vertical = false) {
        super({
            y_expand: vertical,
            x_expand: !vertical,
        });
        this.vertical = vertical;
    }

    override vfunc_get_preferred_width(forHeight: number): [number, number] {
        return this.vertical
            ? [1, 1]
            : // Note: clutter typing is incorrect here
              (super.vfunc_get_preferred_width(forHeight) as [number, number]);
    }

    override vfunc_get_preferred_height(forWidth: number): [number, number] {
        return !this.vertical
            ? [1, 1]
            : // Note: clutter typing is incorrect here
              (super.vfunc_get_preferred_height(forWidth) as [number, number]);
    }
}
def element_count(atoms):
    """Return a mapping from chemical symbol to the number of atoms carrying it."""
    res = {}
    for atom in atoms:
        res[atom.symbol] = res.get(atom.symbol, 0) + 1
    return res
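For reference, the same tally can be written with the standard library's Counter; like the function above, this assumes each entry exposes a `.symbol` attribute:

```python
from collections import Counter

def element_count_counter(atoms):
    # Counter does the per-key tallying; dict() keeps the return type
    # identical to the hand-rolled version above.
    return dict(Counter(atom.symbol for atom in atoms))
```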
/**
 * Copyright 2015 @ to2.net.
 * name : odl_to_go
 * author : jarryliu
 * date : 2016-07-19 19:40
 * description :
 * history :
 */
package parser

import (
	"bytes"
	"sync"
)

var _ Parser = new(odlToGo)

type odlToGo struct {
	mux sync.Mutex
	buf *bytes.Buffer
}

func NewOdlToGo() *odlToGo {
	return &odlToGo{
		buf: bytes.NewBuffer([]byte("")),
	}
}

func (o *odlToGo) Parse(code string, options map[string]string) []byte {
	o.mux.Lock()
	defer o.mux.Unlock()
	o.buf.Write([]byte("it's ok!"))
	// copy the bytes out before resetting: Bytes() aliases the buffer's
	// internal storage, which later writes would overwrite
	d := append([]byte(nil), o.buf.Bytes()...)
	o.buf.Reset()
	return d
}
/// The total balance involved in this vote.
pub fn balance(self) -> Balance {
    match self {
        AccountVote::Standard { balance, .. } => balance,
        AccountVote::Split { aye, nay } => aye.saturating_add(nay),
    }
}
use built::Options;
use std::path::{Path, PathBuf};

fn main() {
    let mut options: Options = Options::default();
    options
        .set_compiler(true)
        .set_cfg(true)
        .set_ci(false)
        .set_dependencies(false)
        .set_git(true)
        .set_env(true)
        .set_features(true);

    let src: PathBuf = std::env::var("CARGO_MANIFEST_DIR").unwrap().into();
    let dst: PathBuf = Path::new(&std::env::var("OUT_DIR").unwrap()).join("built.rs");
    built::write_built_file_with_opts(&options, &src, &dst)
        .expect("Failed to acquire build-time information");
}
Who would think a cookie could create such controversy and eventual sweet victory? After years of legal maneuvers and grassroots organizing, Wisconsin bakers can sell homemade baked goods thanks to a judge’s ruling in 2017. New Jersey is now the only state still with a ban on the sale of homemade baked goods. Thanks to cottage food laws across our country, homesteaders can diversify and sell to their community certain “non-hazardous” food products made in their home kitchen, such as breads and cookies, and jams, jellies and pickles. It’s community commerce at its finest, crafting something in your own kitchen and selling to your neighbors. While these state-specific laws vary in what you can produce, how much you can earn and where you can sell, some states like Wisconsin and New Jersey unfortunately have been intensely fighting such entrepreneurial opportunities. I’ve learned it takes homestead bakers like myself to fight unconstitutional laws. Our success results in positive change that supports all of us small-scale food entrepreneurs. I’m one of a trio of baking activists that recently, successfully, sued the State of Wisconsin on behalf of home bakers. Alongside my farmer friends Kriss Marion and Dela Ends, we spent years working with the Wisconsin Farmers Union to expand our state’s cottage food laws to catch up with the rest of the country that fosters such entrepreneurial spirit and include baked goods via the Cookie Bill. While this bill had broad-based support, passing in the Senate multiple times, Wisconsin Assembly Speaker Robin Vos refused to put the Bill on the Assembly floor for a vote, resulting in Wisconsin being one of the most unfriendly states in the nation for home-based food entrepreneurs, especially those who want to launch a baked enterprise from their home kitchen.
When the Legislative branch bogged down as it became clear Speaker Vos would never put the Cookie Bill on the agenda for a vote, we took our case to the Judicial branch, suing the state in partnership with the Institute for Justice. Our point, and Judge Jorgenson in Lafayette County agreed, was that Wisconsin’s ban on the sale of home baked goods is unconstitutional and reflects the illegal influence of big industry groups. Apparently, these groups felt threatened by mom-and-pop competition. While the Judge ruled in our favor back in May 2017, the Wisconsin Department of Agriculture, Trade and Consumer Protection (DATCP) had argued that the ruling was limited to just myself and Dela and Kriss. Fortunately, Judge Jorgenson officially disagreed and clarified that his ruling applies to all home bakers in the state of Wisconsin. Not, as DATCP claimed, just us three plaintiff bakers.
/**
 * Contains ACH/ECP account details for vaulted shoppers
 */
export declare type EcpAccountType =
  | 'CONSUMER_CHECKING'
  | 'CONSUMER_SAVINGS'
  | 'CORPORATE_CHECKING'
  | 'CORPORATE_SAVINGS';

export interface EcpRequest {
  accountNumber: string;
  routingNumber: string;
  accountType: EcpAccountType;
}

export interface EcpResponse {
  accountNumber: string;
  routingNumber: string;
  accountType: EcpAccountType;
  publicAccountNumber: string;
  publicRoutingNumber: string;
}
AlphaGo Zero uses 4 TPUs, is built entirely out of neural nets with no handcrafted features, doesn’t pretrain against expert games or anything else human, reaches a superhuman level after 3 days of self-play, and is the strongest version of AlphaGo yet. The architecture has been simplified. Previous AlphaGo had a policy net that predicted good plays, and a value net that evaluated positions, both feeding into lookahead using MCTS (random probability-weighted plays out to the end of a game). AlphaGo Zero has one neural net that selects moves and this net is trained by Paul-Christiano-style capability amplification, playing out games against itself to learn new probabilities for winning moves. As others have also remarked, this seems to me to be an element of evidence that favors the Yudkowskian position over the Hansonian position in my and Robin Hanson’s AI-foom debate. As I recall and as I understood: Hanson doubted that what he calls “architecture” is much of a big deal, compared to (Hanson said) elements like cumulative domain knowledge, or special-purpose components built by specialized companies in what he expects to be an ecology of companies serving an AI economy. When I remarked upon how it sure looked to me like humans had an architectural improvement over chimpanzees that counted for a lot, Hanson replied that this seemed to him like a one-time gain from allowing the cultural accumulation of knowledge. I emphasize how all the mighty human edifice of Go knowledge, the joseki and tactics developed over centuries of play, the experts teaching children from an early age, was entirely discarded by AlphaGo Zero with a subsequent performance improvement. These mighty edifices of human knowledge, as I understand the Hansonian thesis, are supposed to be the bulwark against rapid gains in AI capability across multiple domains at once. I said, “Human intelligence is crap and our accumulated skills are crap,” and this appears to have been borne out. 
Similarly, single research labs like DeepMind are not supposed to pull far ahead of the general ecology, because adapting AI to any particular domain is supposed to require lots of components developed all over the place by a market ecology that makes those components available to other companies. AlphaGo Zero is much simpler than that. To the extent that nobody else can run out and build AlphaGo Zero, it’s either because Google has Tensor Processing Units that aren’t generally available, or because DeepMind has a silo of expertise for being able to actually make use of existing ideas like ResNets, or both. Sheer speed of capability gain should also be highlighted here. Most of my argument for FOOM in the Yudkowsky-Hanson debate was about self-improvement and what happens when an optimization loop is folded in on itself. Though it wasn’t necessary to my argument, the fact that Go play went from “nobody has come close to winning against a professional” to “so strongly superhuman they’re not really bothering any more” over two years just because that’s what happens when you improve and simplify the architecture, says you don’t even need self-improvement to get things that look like FOOM. Yes, Go is a closed system allowing for self-play. It still took humans centuries to learn how to play it. Perhaps the new Hansonian bulwark against rapid capability gain can be that the environment has lots of empirical bits that are supposed to be very hard to learn, even in the limit of AI thoughts fast enough to blow past centuries of human-style learning in 3 days; and that humans have learned these vital bits over centuries of cultural accumulation of knowledge, even though we know that humans take centuries to do 3 days of AI learning when humans have all the empirical bits they need; and that AIs cannot absorb this knowledge very quickly using “architecture”, even though humans learn it from each other using architecture. 
If so, then let’s write down this new world-wrecking assumption (that is, the world ends if the assumption is false) and be on the lookout for further evidence that this assumption might perhaps be wrong. AlphaGo clearly isn’t a general AI. There’s obviously stuff humans do that make us much more general than AlphaGo, and AlphaGo obviously doesn’t do that. However, if even with the human special sauce we’re to expect AGI capabilities to be slow, domain-specific, and requiring feed-in from a big market ecology, then the situation we see without human-equivalent generality special sauce should not look like this. To put it another way, I put a lot of emphasis in my debate on recursive self-improvement and the remarkable jump in generality across the change from primate intelligence to human intelligence. It doesn’t mean we can’t get info about speed of capability gains without self-improvement. It doesn’t mean we can’t get info about the importance and generality of algorithms without the general intelligence trick. The debate can start to settle for fast capability gains before we even get to what I saw as the good parts; I wouldn’t have predicted AlphaGo and lost money betting against the speed of its capability gains, because reality held a more extreme position than I did on the Yudkowsky-Hanson spectrum. (Reply from Robin Hanson.)
Dr. Greer is a retired medical doctor turned ufologist who claims to have briefed sitting Presidents of the United States, members of the Joint Chiefs of Staff, congressmen, and other figures in government. He is known for the Disclosure Project hearings that took place in 2001 and 2013, as well as a widely successful documentary film that has played a major role in bringing the UFO and extraterrestrial phenomenon to public attention. In November 2016, he gave two lectures in Las Vegas, Nevada, on the topics of ET contact and what he termed the Cosmic False Flag.

First, an overview of what he discussed will be presented, with supportive data added. Second, an analysis of his overarching premise of a cosmic false flag will be made. And finally, his statements in reference to Corey Goode and William Tompkins will be assessed, along with a summary of the methods used to discern a claim's truthfulness.

Before diving into the content, let it be said that the goal of this work is not to take sides or discredit Dr. Greer in favor of other figures in the field of ufology, such as those mentioned above. It is meant to be an objective, neutral assessment of the data and perspectives he offers as compared to other sources and the total store of information available, as much as it is possible to do so.

What should become apparent through the course of this writing is that Dr. Greer, Goode, and Tompkins all provide information that is largely in agreement. They each discuss the Secret Space Program or Military Industrial Complex, secret government projects, UFOs, and extraterrestrials, and they agree that full disclosure is the best course of action. The knowledge that has been hidden from humanity to maintain oppression, enslavement, and the energy dominance of corporate interests (along with the Earth's complete history) is something they are each passionate about releasing to the public for the good of all. Dr. Greer's intentions appear to be good and non-duplicitous, although he may not have a complete comprehension of the full body of knowledge available, as will be revealed.

In general, Dr. Greer's concerns relate to specific details within this body of information; he offers conflicting opinions about certain data points that will be itemized during the course of this writing. In his view, these data points could be used by nefarious forces to cause a global war by capitalizing on the ignorance of the masses to stage a false alien invasion. He worries that the public, who have been conditioned to fear extraterrestrials via the media and works of fiction, could be swept up into hatred and radicalism if they are led to believe negative extraterrestrials exist.

On this score, Dr. Greer's concerns are supported by historical evidence, as the masses are heavily programmed through years of propaganda and social engineering to be used in such a way. But Dr. Greer's assertion that Goode and Tompkins (and anyone else who doesn't share his belief that nefarious space-faring ETs don't exist) are sharing untrue information is an unproven theory. As will be discussed later, such a contention is largely ideological on Dr. Greer's part, and he provided no direct evidence to substantiate the notion that they are disinformation agents.

That being said, his warnings about exercising discernment are quite pertinent, something that Goode and others have emphatically said as well. Blind belief or adherence to dogma, in any capacity, is a serious problem that leads to radicalism and is arguably what allows tyranny and oppression to flourish. Thus, if humanity is to rise above the darkness of the past, each individual must find within themselves the courage to face reality and learn the skills of discernment, so the people won't be deceived.

Here is a synopsis of information within the existing body of research to support Dr.
Greer, Goode, and Tompkins' narratives.

Government projects since the early part of the 20th century, and likely before, can be categorized as either overt and open to public scrutiny, or covert and hidden in nature. It is this latter category that has been a field of intense interest for many years, as whistleblowers, documents obtained through Freedom of Information Act (FOIA) requests, and anecdotal testimony from the public suggest that a shadow government or clandestine organization has advanced an agenda of great interest to all of humanity. The scope of these activities is vast beyond the limits of what can be enumerated in this writing, but it involves UFOs, advanced technology that is hidden from the public, and an insidious agenda that would probably shock the average person.

The FBI investigation into the Hillary Clinton scandal in 2016 revealed a document that referred to "The Shadow Government" or "The Seventh Floor Group." It could be a reference to the same shadow government that some claim managed secret projects related to UFOs and extraterrestrials.

In a controversial presentation produced by the well-respected ufologist and researcher Dr. Steven Greer, allegations were levied against two persons, Corey Goode and William Tompkins, claiming they could be disinformation agents or fraudsters. Dr. Greer also mentioned Tom DeLonge, the former frontman of a popular musical band, as well as news media sites such as Collective Evolution. He said each could be sharing inaccurate or intentionally fabricated information about the nature of secret government projects, extraterrestrials, and hidden technology, which could be used by certain nefarious forces to foment an interstellar war.

The following article will detail the core aspects of Dr. Greer's presentation and philosophy, including video excerpts of the presentation, and assess and analyze the plausibility of such accusations against the collective store of data available in the public record.
It will also evaluate whether such a deception is possible if secret space programs exist and if a shadow government uses hidden technology for nefarious ends, drawing from resources for support whenever possible. Whether one is a newcomer or well versed in these topics, the data contained herein is extensive and can be used to educate oneself further.

Consider that President Eisenhower warned of the power and might of a military industrial complex that answers to no nation or public authority whatsoever and is capable of advancing an agenda hidden from the public eye—much like what President Roosevelt and Senator Daniel K. Inouye referred to. https://youtu.be/OyBNmecVtdU

This warning went out in 1961 after Eisenhower learned of what had been developed behind closed doors under the direction of what are called Special Access Programs and Unacknowledged Special Access Programs, also known as deep-black government projects. These are highly secretive projects with no government oversight, possessing the capacity to develop advanced technology that is completely hidden from the public eye. According to released FBI documents, as well as recovered pages from former assets within some of these programs, a group known as Majestic 12 or MJ12 was in charge of some of them—a multinational group of high-ranking business figures with ties to what has been called the secret government.

When Eisenhower learned of these projects—realizing that he was not only unaware of their activities but completely incapable of overseeing them for the safety of the American people—he was furious and made efforts to regain control of the situation. Dr. Michael Salla is one researcher who presented the following account, as provided by Richard Dolan, in relation to a former CIA operative turned whistleblower who was ordered to deliver a message to one of the secret facilities known as S4 or Area 51.
The whistleblower testified before a panel of six former US members of Congress during the Citizens Hearing on Disclosure in 2013. Dr. Salla introduces the account by the former CIA operative in the following excerpt from an article on his website, Exopolitics:

In response to questions from UFO historian Richard Dolan, the former CIA agent went on to explain how, in 1958, he and his boss—a CIA operative—were summoned by President Eisenhower to the Oval Office. The President, who was accompanied by Vice-President Nixon, told the agent and his boss that he was trying to get information about efforts to learn about extraterrestrial life and technology. The agent said that according to President Eisenhower: "MJ-12 was supposed to find out, but they never sent reports to him." The CIA agent said he and his boss were called into the Oval Office, where President Eisenhower said:

"We called the people in from MJ-12, from Area 51 and S-4, but they told us that the government had no jurisdiction over what they were doing…. I want you and your boss to fly out there. I want you to give them a personal message…. I want you to tell them, whoever is in charge, I want you to tell them that they have this coming week to get into Washington and to report to me. And if they don't, I'm going to get the First Army from Colorado. We are going to go over and take the base over. I don't care what kind of classified material you got. We are going to rip this thing apart."

If these accounts are to be believed, it would provide some backstory for Eisenhower's infamous warning regarding the out-of-control status of the military industrial complex. And given that these programs were well established during the mid-20th century, it also suggests that incredible advances have taken place which are—to this day—completely hidden from the public.

Program cover stories (Unacknowledged Program):
Cover stories may be established for unacknowledged programs in order to protect the integrity of the program from individuals who do not have a need to know. Cover stories must be believable and cannot reveal any information regarding the true nature of the contract. (Source)

Richard Dolan is a researcher who spoke at the Citizens Hearing on Disclosure at the National Press Club event organized by Dr. Steven Greer in 2001. He said that evidence assembled via declassified documents and whistleblower testimony suggests that a breakaway civilization has indeed existed alongside commonplace society for decades, if not far longer. It is this same covert group that was responsible for promoting and maintaining the UFO cover-up, and for silencing anyone who dared speak about advanced technology that could threaten the status quo of the energy industry, according to Dolan. https://youtu.be/iROkeC3lmVA

But what is a breakaway civilization? According to researchers, a breakaway civilization is a term referring to a secretive group within a nation or society that furthers a hidden agenda, often involving technological advancements that eventually lead to resource independence from the parent community. These resources would include, but are not limited to, financial, social, or material demands that are one day transcended as a result of an agenda's success. (Source)

Other relevant terms are "distract, decoy, and trash" campaigns, and Indications and Warnings:

Those intelligence activities intended to detect and report time-sensitive intelligence information on foreign developments that could involve a threat to the United States or allied and/or coalition military, political, or economic interests or to US citizens abroad.
It includes forewarning of enemy actions or intentions; the imminence of hostilities; insurgency; nuclear/nonnuclear attack on the United States, its overseas forces, or allied and/or coalition nations; hostile reactions to US reconnaissance activities; terrorists' attacks; and other similar events. Also called I&W. See also information; intelligence. (JP 2-01)

When the term deceptive is added to the term indications and warnings, it becomes clear that any operation with the purpose of promulgating false intelligence or information, sponsored by the government, military, or clandestine entities, falls into this category. This is a term Dr. Greer referred to several times in his presentation as something high-level operatives within the MIC are aware of. A notable example that Dr. Greer mentioned during his presentation was the Gulf of Tonkin incident.

The NSA admits that it lied about what really happened in the Gulf of Tonkin incident in 1964 … manipulating data to make it look like North Vietnamese boats fired on a U.S. ship so as to create a false justification for the Vietnam war. (Source)

https://youtu.be/IU01omNacMM

What is a Cosmic False Flag? Dr. Greer proposes that a Cosmic False Flag is an elaborate deception of global import and scope.
The military industrial complex (MIC) and shadow government would employ three different categories of technology to deceive the public into accepting that an alien threat has invaded the Earth: Alien Reproduction Vehicles (ARVs), EM mind control technology, and manipulated/fabricated witnesses and whistleblowers. The false flag would unite humanity in ways that have heretofore only been revealed in science fiction films like Independence Day. Here is an introduction to the Cosmic False Flag presentation. https://youtu.be/P47m2K8cU_0

Instead of bringing humanity together in an age of freedom and recognition of individual rights and sovereignty, it would be a despotic fascist coup that would give the hidden powers that be an excuse to claim total control over humanity. Most of the population of Earth would be exterminated in a "war against the aliens," which would, in reality, be waged by the shadow government and the psychotic elite behind it.

Ronald Reagan spoke of an extraterrestrial threat uniting humanity in a speech to the 42nd United Nations General Assembly on September 21, 1987:

"In our obsession with antagonisms of the moment, we often forget how much unites all the members of humanity. Perhaps we need some outside, universal threat to make us recognize this common bond. I occasionally think how quickly our differences worldwide would vanish if we were facing an alien threat from outside this world." (Source) https://youtu.be/Ag44dRO8LEA

What the former president suggested is the essence of the cosmic false flag scenario described by Dr. Greer—that people would be completely overwhelmed and galvanized by an alien threat, setting aside their differences to fight the invaders. According to Dr. Greer and several others, this is exactly what the darkest parts of the shadow government have been advancing.
Realizing that the plot would arguably be the grandest and most far-reaching deception in human history, the proponents of such a scheme began using anything and everything at their disposal for the endeavor. This includes the use of false extraterrestrial contacts beginning as early as the 1950s, spacecraft designed to appear non-terrestrial but that are in fact human in design, and finally media of all kinds that would begin to prepare humanity to accept an alien threat. Dr. Greer's presentation discussed each of the above-mentioned facets in detail, which will now be outlined.

Negative Extraterrestrials

A central facet of the cosmic false flag is what Dr. Greer calls alien propaganda, specifically the belief that scores of extraterrestrials are negatively oriented and have the intention of enslaving or taking over the human race. All three categories of assets discussed above are employed to promote the idea that aliens are here to conquer humanity, something he thinks is untrue. Dr. Greer had this to say in regard to the notion that negative ETs exist, calling it a form of alienism:

Alienism, as I'm going to define it tonight, is the proclivity to view anything that is non-human, but an intelligent life-form, as a potential threat, and the threat is directly proportional to how different they either appear or behave from us. With many of these people you'll hear that the ones that look Nordic, shall we say (I've spent three or four hours with Bill Tompkins), the ones that kind of look like us but are pretty, those are GROOD [great/good]; the ones that look some other way, those are the bad ones. Isn't that interstellar racism? … Take a step back. Aren't the good ones always the pretty blond ones with the big breasts… I find it appalling … Are we really going to stay on that cycle of taking racism and turning it into alienism?
(Source) https://youtu.be/hOj2FCh8Beo

The notion that only positive ETs exist is a hotly debated point within ufology and other circles. Dr. Greer seems to believe that any claim of nefariousness on the part of contactees and experiencers is part of a false agenda or hoax to deceive the public. He seems to think that if such negative groups existed, they would have taken over long ago, as he shares in the below excerpt in reference to a conversation he had with Monsignor Corrado Balducci. https://youtu.be/n81xO6_k6b0

Others state that the alien invasion already happened long ago: the human race was taken over by negative entities before modern history began, and the monetary system itself is part of a subtle program of mass control that was perfected by extraterrestrials on other worlds eons ago. Within this perspective, the negative beings, whether extraterrestrial or human, want society to believe it has not been taken over, a kind of free-range slavery, which is easier to maintain and produces more fruit than one of overt oppression. If you can get people to believe they are free, even though they are not, they won't try to identify or gain freedom from their oppressors.

While the average person might think the idea of money being used as a tool for self-policing enslavement is a wild conspiracy, the logic behind it is not unfounded, and scores of researchers have put forth information detailing how this works. The theory of financial enslavement seems to be so widely accepted by a growing body of researchers that it is openly revealed in pop culture, as in the TV series Rick and Morty, depicted in the below segment. https://youtu.be/o_CyMqQBO8w

And, of course, there is the infamous film They Live, which is one of the most revealing presentations of hidden control forces in popular works of fiction. Although Dr.
Greer's assertions about a cosmic false flag and the promulgation of hoaxed negative alien experiences might be valid, there is an equally compelling data set suggesting that this "take over" happened long ago.

Specifically, Corey Goode and countless other researchers contend that nefarious races of extraterrestrials have maintained underground facilities on the Earth, particularly in Antarctica, for thousands of years, well before modern human history. Some of these races allegedly use artificial life extension technology that requires the consumption of life force energy produced by fear and trauma, called looshe. While some dismiss this as yet another wild conspiracy theory, when one considers biophotons and how fear states alter brain and heart coherence, there is empirical data to support this assertion. https://youtu.be/rkkfqO7JMGQ

Emotion has been shown to alter the shape of DNA, which according to Fritz-Albert Popp stores biophotonic light; thus, if one is in a state of fear, excess life force energy is released that parasitic entities can feed on. To continue the discussion of the cosmic false flag, an examination of advanced secret spacecraft is in order.

Alien Reproduction Vehicles

Dr. Greer claims that secret programs were involved in developing antigravity technology and spacecraft that, by the mid-1950s, were apparently so advanced that many civilians could and did mistake them for extraterrestrial vessels. Here is an excerpt from his presentation discussing manmade UFOs and what he calls stagecraft, the methods used to deceive the public. https://youtu.be/BXKDSIdR5ao

And here is an extensive presentation about ARVs by Mark McCandlish. https://youtu.be/9QNvZN7X7v8

Dr.
Greer says the shadow government wanted the public to believe extraterrestrials were frequently appearing in the skies over the world, all while maintaining an official policy that they don't exist. And while Dr. Greer contends there were genuine ET sightings and contacts, he says there were also staged encounters intentionally designed to confuse those researching the topic—a point that Goode and Tompkins assert as well.

Alien Reproduction Vehicle is a term specifically referring to manmade craft that mimic non-terrestrial vehicles. Many research groups to this day contend that all such sightings are solely ET in origin, saying that "humans aren't allowed to make them, so all UFOs are ETs." But proponents of such viewpoints often ignore and even suppress any evidence that implies the contrary, acting more like religious zealots than earnest researchers seeking the whole truth. Here is an excerpt wherein Dr. Greer discusses the difference between genuine ET spacecraft and ARVs. https://youtu.be/mVkGU3MYdR4

In addition, behind closed doors, through the use of scalar-based technology, the MIC developed weaponry capable of targeting and downing genuine extraterrestrial spacecraft for reverse-engineering projects. The fruit of such a program was a state-of-the-art grade of technology that—according to Dr. Greer and others—was well established by the beginning of the Cold War. Here is an excerpt from the presentation discussing this technology. https://youtu.be/sTWYsakOdJo

Dr. Greer mentions in his presentation Dr. Fred Bell, who was allegedly the victim of EM mind control used to convince him that the military needed to target and take down any ET craft in the skies. Allegedly, Dr. Bell helped produce, test, and perfect these weapons as early as the late 1950s.
Sergeant Clifford Stone is a whistleblower who claims to have worked in UFO crash retrieval programs during the mid-20th century, further supporting the notion that the MIC was advancing technology by reverse engineering genuine ET craft while at the same time using its own advanced vehicles to confuse ufologists and contactees. https://youtu.be/Jv--ZPTtVEI

Holographic Projection Devices

Project Blue Beam is a popular term referring to technology that would be used in a fake alien invasion, which Dr. Greer did not mention by name but did discuss aspects of. It has all the same components as his cosmic false flag scenario, such as holographic projectors and the use of EM mind control devices—generally known as Voice of God or psychotronic weapons—to lead people into thinking they are having a religious experience with whichever deity they personally believe in. Dr. Greer discusses this technology, along with false or scripted memories, in the next excerpt from his presentation. https://youtu.be/TQER9PVlNuA

The Benenson Strategy Group allegedly produced a document entitled "Salvage Plan" that mentioned Project FIRESIGN, a false alien invasion scenario that almost perfectly matches the infamous Blue Beam plan:

Many elements in the Salvage Program document appear to be based on an accurate analysis of prospects of the Clinton campaign succeeding as distrust in mainstream media and polling reaches unprecedented heights. The FIRESIGN technology has been well known to many UFO/exopolitics researchers who over several decades have described it as Project Blue Beam. Nevertheless, there are a number of issues from the document that raise doubt about its authenticity according to the popular blog author, Kauilapele. These include the document using an old 2012 logo for the Benenson Strategy Group, and an incorrect street address being used for its Washington office. These discrepancies may be indicative of a hoax, or included in order to introduce plausible deniability in case of any unintended leak. (Source)

Whether an authentic document or not, the plan calls for the use of holographic projection technology that would be used in addition to ARVs to make people believe an alien invasion is taking place. In the presentation, Dr. Greer refers to holographic projectors that were revealed during a Snoop Dogg concert in 2012. https://youtu.be/TGbrFmPBV0Y

Dr.
Greer states that the above demonstration is only a partial disclosure of what is far superior technology and could easily convince people that the projections are real. Thus, there are several corroborating points to draw on to support Dr. Greer's claims, whether anecdotally or technologically regarding "stagecraft" and ARVs used for psychological warfare purposes. EM Mind Control and Voice of God Technology Dr. Michael Persinger is a cognitive neuroscientist who studied the effects of electromagnetic signals on the brain, supporting the idea it is possible to produce incredibly vivid hallucinations that mimic real experiences. He developed a device popularly known as the "God Helmet" for this purpose, which some researchers assert is proof of concept that psychotronic technology exists and can be used as a mind-control weapon against the public. Dr. Persinger's research was featured in a TV documentary on The Learning Channel. Dr. Greer proposes that ais an elaborate deception of global import and scope. The military industrial complex (MIC) and shadow government would employ the use of three different categories of technology to deceive the public to accept an alien threat has invaded the Earth: Alien Reproduction Vehicles (ARVs), EM mind control technology, and manipulated/fabricated witnesses and whistleblowers. The false flag would unite humanity in ways that have heretofore only been revealed in science fiction films likeHere is an introduction to the Cosmic False Flag presentation.Instead of bringing humanity together in an age of freedom and recognition of individual rights and sovereignty, it would be a despotic fascist coup that would give the hiddenan excuse to claim total control over humanity. 
Most of the population of Earth would be exterminated in a "war against the aliens," which would, in reality, be the shadow government and the psychotic elite behind it.Ronald Reagan spoke of an extraterrestrial threat uniting humanity in a speech to the United Nations General Assembly, the 42nd General Assembly, on September 21st, 1987:What the former president just suggested is the essence of the cosmic false flag scenario described by Dr. Greer—that people would be completely overwhelmed and galvanized by an alien threat, setting aside their differences to fight the invaders.According to Dr. Greer and several others, this is exactly what the darkest parts of the shadow government have been advancing. Realizing that the plot would arguably be the grandest and far-reaching deception in human history, the proponents of such a scheme began using anything and everything at their disposal for the endeavor. This includes the use of false extraterrestrial contacts beginning as early as the 1950s, spacecraft designed to appear non-terrestrial but are in fact human in design, and finally media of all kinds that would begin to prepare humanity for accepting an alien threat.Dr. Greer's presentation discussed each the above-mentioned facets in detail, which will now be outlined.A central facet of the cosmic false flag is what Dr. Greer says is, specifically the belief that scores of extraterrestrials are negatively oriented and have the intention of enslaving or taking over the human race. The use of all three categories of assets discussed above is employed to promote the idea that aliens are here to conquer humanity against humanity, something he thinks is untrue.Dr. Greer had this to say in regard to the notion that negative ETs exist, calling it a form ofThe notion that only positive ETs exist is a hotly debated point within ufology and other circles. Dr. 
Greer seems to believe thatof nefariousness on the part of contactees and experiencers is part of a false agenda or hoax to deceive the public. He seems to think that if such negative groups existed, they would have taken over long ago, as he shares in the below excerpt in reference to a conversation he had with Monsignor Corrado Balducci.Others state that thealready happened long ago, and that the human race was taken over by negative entities before modern history began and that the monetary system itself is part of a subtle program of mass control that was perfected by extraterrestrials on other worlds eons ago. Within this perspective, the negative beings, whether extraterrestrial or human, want society to believe it has not been taken over, a kind of free-range slavery, which is easier to maintain and produces more fruit than one of overt oppression.While the average person might think the idea of money being used as a tool for self-policing enslavement is a wild conspiracy, the logic behind it is not unfounded, and scores of researchers have put forth information detailing how this works.The theory of financial enslavement seems to be so widely accepted by a growing body of researchers it is openly revealed in pop-culture like the TV series, as depicted in the below segment.And, of course, there is the infamous film, which is one of the most revealing presentations of hidden control forces in popular works of fiction.Specifically, Corey Goode and countless other researchers contend that nefarious races of extraterrestrials have maintained underground facilities on the Earth, particularly in Antarctica, for thousands of years, well before modern human history. And that some of these races use artificial life extension technology that requires the consumption of life force energy produced by fear and trauma, called looshe. 
While some dismiss this as yet another wild conspiracy theory, when one considers biophotons and how fear states alter brain and heart coherence, there is arguably empirical data bearing on the assertion. Emotion has been shown to alter the shape of DNA, which according to Fritz-Albert Popp stores biophotonic light; thus, if one is in a state of fear, excess life force energy is released that parasitic entities could feed on.

To continue the discussion of the cosmic false flag, an examination of advanced secret spacecraft is in order. Dr. Greer claims that secret programs were involved in developing antigravity technology and spacecraft that, by the mid-1950s, were apparently so advanced that many civilians could and did confuse them for extraterrestrial vessels. Here is an excerpt from his presentation discussing manmade UFOs and what he calls stagecraft, the methods used to deceive the public. And here is an extensive presentation about ARVs by Mark McCandlish.

Dr. Greer says the shadow government wanted the public to believe extraterrestrials were frequently appearing in the skies over the world, all while maintaining an official policy that they don't exist. And while Dr. Greer contends there were genuine ET sightings and contacts, he says there were also staged encounters intentionally designed to confuse those researching the topic, a point that Goode and Tompkins assert as well.

Alien Reproduction Vehicle (ARV) is a term specifically referring to manmade craft that mimic non-terrestrial vehicles. Many research groups to this day contend that all such sightings are ET in origin, saying that "humans aren't advanced enough to make them, so all UFOs are ETs." But proponents of such viewpoints often ignore and even suppress evidence that implies the contrary, acting more like religious zealots than earnest researchers seeking the whole truth. Here is an excerpt wherein Dr. Greer discusses the difference between genuine ET spacecraft vs.
ARVs. In addition, behind closed doors, through the use of scalar-based technology, the MIC developed weaponry capable of targeting and downing genuine extraterrestrial spacecraft for reverse-engineering projects. The fruit of such a program was a state-of-the-art grade of technology that, according to Dr. Greer and others, was well established by the beginning of the Cold War. Here is an excerpt from the presentation discussing this technology.

Dr. Greer mentions in his presentation Dr. Fred Bell, who was allegedly the victim of EM mind control used to convince him that the military needed to target and take down any ET craft in the skies. Allegedly, Dr. Bell helped produce, test, and perfect these weapons as early as the late 1950s. Sergeant Clifford Stone is a whistleblower who claims to have worked in UFO crash retrieval programs during the mid-20th century, further supporting the notion that the MIC was advancing technology by reverse-engineering genuine ET craft while, at the same time, using its own advanced vehicles to confuse ufologists and contactees.

Project Blue Beam is a popular term referring to technology that would be used in a fake alien invasion, which Dr. Greer did not mention by name but did discuss aspects of. It has all the same components as his cosmic false flag scenario, such as holographic projectors and the use of EM mind-control devices, generally known as psychotronic weapons, to lead people into thinking they are having a religious experience with whichever deity they personally believe in. Dr. Greer discusses this technology, along with false or scripted memories, in the next excerpt from his presentation.

The Benenson Strategy Group allegedly produced a document entitled "Salvage Plan" that mentions Project FIRESIGN, a false alien invasion scenario that almost perfectly matches the infamous Blue Beam plan.
However, the document's authenticity is hotly contested, and it was dismissed by some as a hoax, citing discrepancies with the logo. Whether the document is authentic or not, the plan calls for holographic projection technology to be used alongside ARVs to make people believe an alien invasion is taking place.

In his presentation, Dr. Greer refers to holographic projectors that were demonstrated during a Snoop Dogg concert in 2012. He states that this demonstration is only a partial disclosure of far superior technology that could easily convince people the projections are real. Thus, there are several corroborating points, anecdotal and technological, to support Dr. Greer's claims regarding "stagecraft" and ARVs used for psychological warfare purposes.

Dr. Michael Persinger is a cognitive neuroscientist who studied the effects of electromagnetic signals on the brain, supporting the idea that it is possible to produce incredibly vivid hallucinations that mimic real experiences. He developed a device popularly known as the "God Helmet" for this purpose, which some researchers assert is evidence that psychotronic technology exists and can be used as a mind-control weapon against the public. https://youtu.be/9l6VPpDublg And here is a detailed presentation of his findings, discussing how much consciousness seems to be affected by magnetic fields and is capable of fantastic extrasensory perception.

Briefly, the human brain and biology produce an electromagnetic field that is capable of coupling with the Earth and other human beings via sympathetic resonance or entrainment. The EM fields produced by the Earth, known as the Schumann resonances, seem to act as a baseline frequency for upper-level harmonics used by the body and measured by scientists as brain waves. Using various forms of electromagnetic field generators, such as radar and scalar wave antennas (HAARP), but also localized devices that can target single individuals, one can allegedly induce all manner of experiences and phenomena in the body and mind. Disease or healing can be affected, as well as false encounters that, to the person experiencing them, are almost indistinguishable from the reality they perceive. They can even produce disembodied voices that mimic the discarnate entities (spirits, angels, or ETs) that mediums or channelers claim to have contacted for most of human history.
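As a point of reference, the Schumann resonances themselves are a real, routinely measured geophysical phenomenon, whatever one makes of the claims built on top of them. A minimal sketch, assuming the textbook ideal lossless-cavity formula f_n = c/(2πa)·√(n(n+1)) with a the mean Earth radius, shows where the mode frequencies come from:

```python
import math

C = 299_792_458.0        # speed of light, m/s
EARTH_RADIUS = 6.371e6   # mean Earth radius, m

def schumann_mode(n: int) -> float:
    """Resonant frequency (Hz) of mode n for an ideal, lossless
    Earth-ionosphere cavity: f_n = c / (2*pi*a) * sqrt(n*(n+1))."""
    return C / (2 * math.pi * EARTH_RADIUS) * math.sqrt(n * (n + 1))

for n in (1, 2, 3):
    # Ideal values come out near 10.6, 18.3, and 25.9 Hz.
    print(f"mode {n}: {schumann_mode(n):.1f} Hz")
```

The observed values (roughly 7.83, 14.3, and 20.8 Hz) sit below these idealized figures because the actual Earth-ionosphere cavity is lossy rather than a perfect conductor. This aside is only meant to ground the oft-quoted 7.83 Hz figure; it bears no relation to any of the mind-control claims above.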
In summary, according to Dr. Greer and others, over the past 70 years the MIC has perfected technology that can hijack what is commonly referred to as the extrasensory perception mechanisms of the human body and mind. This suggests that any experience, whether mundane or extraordinary, could be artificially induced via advanced mind-control technology. But Dr. Greer doesn't claim that all such experiences are false. Instead, he says that many are genuine, but that the MIC has the ability to track these bona fide encounters and quite often targets those individuals with a false experience to confuse the situation. Enter the Military Abduction Program, or MILABs.

MILABs

Dr. Greer states that the Military Abduction Program (MILAB) is another facet of the cosmic false flag agenda. But he is hardly the only one who discusses this topic. It is a complex and vast subject that often involves the following scenario. An individual having a genuine extraterrestrial contact experience, sometimes involving abduction, is identified by the military. The MIC, through the use of advanced surveillance technology, can track and target anyone who has such an encounter, and often performs a secondary abduction using EM mind-control technology, ARVs, and what Dr. Greer calls Programmed Lifeforms: entities that look like real ETs but are either humans in makeup or robotically controlled fakes. The purpose of such a program, Dr. Greer says, is to confuse the experiencer, often through horrific testing or probing procedures that lead the individual to conclude the beings encountered are negative and insidious. But this is just one perspective on the situation. Here is an excerpt from the presentation wherein Dr. Greer discusses false abductions and MILABs. https://youtu.be/by1kMdj3YC0

Volumes have been written on the topic of extraterrestrial abduction. But focusing on Dr.
Greer's perspective, one goal of the MILAB program is to implant false experiences that lead many to believe all extraterrestrials are negative. Arguably there are many other aims for this program, such as reconnaissance and research into real extraterrestrial encounters. The individuals abducted by the military and given false memories form a growing body of witnesses that can further cloud the field of ufology, so that they, according to Dr. Greer, bear false witness and state that negative extraterrestrials are trying to harvest human DNA or learn about humanity for an eventual invasion.

It should be noted that Goode has discussed the MILAB program in precisely the same fashion as Dr. Greer, saying these projects have the purpose of gathering information and confusing ET abductees, but he does not go as far as to say that all negative ET encounters are false, as Dr. Greer claims. He also says that cattle mutilations are fabricated, citing Dr. John H. Altshuler's research as a reference. https://youtu.be/8Z3xUZweBEo

Whistleblowers and MILABs

Dr. Greer states that another aspect of the cosmic false flag agenda is that the field of ufology, via the growing body of abductees with false experiences, must be seeded into the consciousness of humanity to support the plausibility of an extraterrestrial invasion scenario. Again, Dr. Greer's claim that all negative experiences are false remains unproven, as there are scores of people who say they had horrific abduction experiences. Whether all of these experiences are false is unclear, but if Dr. Greer and others are to be believed, some of them could be psychological warfare operations to confuse abductees and, by extension, ufologists.
Until recently, the topic of alien abduction was considered too taboo to discuss seriously. But since the 1950s, science fiction has exploded onto the scene, often providing a venue for revealing hidden activities and realities to the public. Subject matter of keen interest to the MIC and its USAPs often appears in TV shows, movies, and other media, and is arguably disclosed through those mediums. In other words, science fiction has, in almost every respect, become focused on promulgating a prejudice that all extraterrestrial life is evil or nefarious. This agrees with Dr. Greer's assertions about creating a public bias that will one day allow the population of Earth to be galvanized against an alien threat.

On the issue of whether whistleblowers are providing false testimony, it is not only theoretically possible but arguably essential to consider. As was mentioned above, USAPs have the authority, via codified policy, to distribute disinformation. And if one performs a comparative analysis of the totality of available whistleblower testimony, there are glaring inconsistencies. This suggests that the MIC can and does use false whistleblowers to confuse researchers and the public at large. Historically speaking, it seems any field of research that advances toward the truth is subject to coordinated disinformation attacks.

As an example, consider the topic of super soldiers, which is a hotbed of activity at present. Certain individuals claim to have been recruited into assorted secret projects and enhanced using various means, such as DNA augmentation, electromagnetic fields, and age-regression; some claim to be the product of alien-human hybrid programs. But often there is no evidence provided other than anecdotal testimony.
The lack of evidence does not necessarily mean anything related to the topic is untrue, but it does make it very easy for false whistleblowers to come on the scene and befuddle genuine research. Initially, there were a handful of alleged super soldiers emerging to discuss their experiences, but as of this writing, there are hundreds of people coming forward, and in many cases the testimony provided is divergent on the whole. Still, other insiders claim that some of these people professing to be super soldiers are either outright frauds, or were given false experiences that made them perfect disinformation agents. For the growing body of researchers attempting to make sense of the situation, it is increasingly confusing, and many have abandoned the whole topic due to a handful of obvious fraudsters.

For the MIC, the more confusion that is created, the greater the likelihood the actual truth will not be revealed or make it into the hands of researchers. And in effect, this is what Dr. Greer has also asserted. But he goes a step further to state that anyone whose testimony suggests negative extraterrestrials exist, or that nefarious activity is taking place beyond the Earth, such as Mars bases populated by slaves taken from Earth, is a disinformation agent, either through conscious intent or scripted memory manipulation.

MILABs are another aspect of manipulation, and one that is less contested. Through hypnotic regression therapy techniques, the scripted or screen memories given to abductees can be identified and carefully dismantled to reveal what lies beneath. In some cases, secondary abductions have revealed human beings standing nearby while so-called aliens perform the abduction, which Dr. Greer cites in his presentation. Other whistleblowers have also stated that the military abducts individuals who have genuine ET encounters in order to debrief them and cause confusion, as was mentioned earlier. So while Dr.
Greer's contention that all negative ET experiences are a product of disinformation is not confirmed, there does exist other testimony and evidence suggesting fabricated experiences are occurring.

Screen or Scripted Memories

The work of Dr. Persinger provides an empirical basis for understanding that it is possible to manipulate consciousness via electromagnetic means. And other researchers in the mainstream, dealing with criminal investigations and eyewitness testimony, have demonstrated that memory is much more fluid than one might think, as the below-linked articles reveal.

Related: Implanting False Memories is Real, says Psychologist | A ‘Memory Hacker’ Explains How to Plant False Memories in People’s Minds

Corey Goode is a whistleblower and SSP insider who claims to have participated in the MILAB program and to have worked to undo the effects of screen memory implantation. His testimony is too vast to detail here completely, but it essentially describes a fairly typical MILAB experience. As a young child, he was allegedly targeted by the MIC for having special intuitive or psychic abilities and would often go to "special classes" at school. Beginning in 1986, he claims to have served in the Solar Warden SSP for 20 years, after which he was age-regressed and deposited back in time to moments after he left. To many, these claims are so wildly fantastic that they are dismissed without research, but for avid researchers of ufology and secret government projects, there is much to lend credence to them.

Goode says that upon his return from the 20-year tour of duty, his memory was wiped and a screen memory was implanted that for many years prevented him from recalling what happened. But one day he began to remember aspects of his SSP service, and through a trusted regression therapist, finally experienced total recall. Since then, he has been abducted on two more occasions, taken by the MIC SSP, had his memory wiped, and been given more screen memories.
And while he says he was confused for a time, he is now able to determine when these technologies are used against him. If his accounts are true, it suggests that while the unaware public might not be able to tell the difference between implanted and real memories, it is possible to learn to differentiate scripted from genuine experiences. Dr. Greer has also stated that he was subjected to this technology years ago and that he too is able to discern the difference between screened and real memories. Although he did not indicate how this is done, it presumably requires discernment skills and the use of one's mental faculties: critical thinking, keen observation, and inquisitiveness.

Science Fiction and Media

Drawing from popular works of fiction, there are countless examples, such as The Bourne Trilogy, Dollhouse, and others, in which memory wiping and screen or scripted memories are a key facet of the storyline. There is a clear pattern of portraying seemingly real aspects of hidden programs in popular works of fiction, arguably to cause the uninitiated public to dismiss them as fiction if they are ever encountered in life. Here is another excerpt from the presentation.

As one possible example of disclosure through fiction, consider Jack Kirby, a comic book artist who produced many widely successful works during the mid-to-late 20th century. In 1958, Kirby wrote The Face on Mars, depicting a giant human visage on the surface of the red planet. But it wasn't until 1976, when NASA's Viking missions sent back photos of the Cydonia site, that the public learned of a landform that some believe is a clear-cut sign of an extraterrestrial civilization.

[Viking image of the Cydonia site on Mars: the "face on Mars."]

Although NASA and the media were quick to refute claims that this image was proof of the existence of non-terrestrials, it fueled a storm of interest in the notion that humanity is not alone in the universe.
Ufologists would spend years trying to solve this mystery, all while most of the public remained completely unaware of this staggering correlation. Did Kirby know that the face on Mars existed? If so, how did he come by this knowledge? Some researchers contend that Kirby was provided the data by figures within a secretive space program that had already made the voyage to Mars decades before NASA would set its sights there. If true, this suggests what breakaway civilization and secret space program researchers have asserted for years: that there is indeed a hidden agenda to explore space, one that has succeeded beyond the wildest dreams of those who subscribe to NASA's contemporary plans for Mars exploration.

One whistleblower claims that the Germans developed a hidden program to produce antigravity flying saucers as early as 1930, some 30 years before Kirby would pen his infamous comic book work. But again, for the average person, these claims are so controversial that many dismiss them out of hand, without a second thought that they might actually be true.

The CIA-sponsored Robertson Panel of 1953 released the Durant Report, recommending that the flying saucer phenomenon be debunked using the mass media, in what was the beginning of a psychological warfare program against the general public. (Source) This small data point, out of a large body to choose from, suggests that the media in general is playing a major role in manipulating the extraterrestrial and USAP narrative, whether by "debunking" genuine UFO and extraterrestrial encounters or by promoting propaganda that serves the agenda of the MIC.

Comparative Analysis of Dr. Greer's Presentation

Dr. Greer originally publicized his talk by specifically referring to Corey Goode and William Tompkins. And during the presentation, he alluded to the possibility that they could be disinformation agents but did not explicitly state that they were.
He did respond to a question from the audience, stating that people should be highly skeptical of their accounts. Here is a compilation of Dr. Greer's remarks on Corey Goode, William Tompkins, and Tom DeLonge. https://youtu.be/2TrWUsnhjuo

When comparing Dr. Greer's data to Goode's and Tompkins' accounts, there is almost total agreement insofar as the historical narrative goes, with some differences that will now be listed.

Early German SSP

Dr. Greer contends that the Nazis developed antigravity technology, commonly known as the Nazi Bell, but were not successful in perfecting it. It wasn't until the end of World War II, after German scientists came to the U.S. through Project PAPERCLIP, that they were able to finish their designs with the help of the American shadow government and the MIC.

Goode and Tompkins, working independently, contend that Dr. Greer's version of events is only partially true, and that it is itself a cover story and an aspect of a partial disclosure agenda, a kind of deceptive I & W in its own right. They say that the Germans had several different antigravity research projects. Another point provided by Goode and Tompkins is that the MIC learned of the German Antarctic bases after the war and sent a large naval contingent there to deal with them, under the code name Operation HIGHJUMP. According to Goode and Tompkins, the Americans were developing their own SSP and learned of the German advancements during the war, and Germans were brought over to the U.S. MIC during Project PAPERCLIP.

Goode and Tompkins state that the level of compartmentalization used by the MIC and these USAPs is such that it would be very easy for personnel working within them to believe they were being told the whole truth when, in fact, it was only part of the story. And it appears that the lower-level aspects of the SSP are what Dr. Greer draws from as the basis of his knowledge, especially regarding German advancements.
It is highly likely that, as part of the compartmentalization protocol, personnel would be led to believe they were the most aware and had access to the whole truth, and in doing so, they would be compelled to reject anything that doesn't fit within their worldview.

Reach of the MIC and SSP

While Dr. Greer did not spend much time on the scope of the programs developed by the MIC, he has stated in the past that positive extraterrestrials maintain a kind of quarantine that prevents aggressive human projects from leaving the Earth. He also asserted that claims of slave colonies operating on Mars or elsewhere are probably not true, presumably because of the belief that positive ETs would not allow them. But he did not offer any evidence to support this conjecture.

On the other hand, Goode and Tompkins state that the SSP did travel to the Moon and beyond. Goode in particular claims there are extensive programs and facilities operating throughout the solar system. If these claims by Goode are true, it means that Dr. Greer's insiders are mostly from the lower-level MIC space program and, as such, would not be read in or have a need to know about the outer space programs, and were provided cover stories that supported this narrative. It also means that most people working in the programs operating largely in outer space would not have contact with the Earth, and thus cross-contamination is greatly limited. In essence, the secrecy mechanism of the whole SSP and MIC is an extremely well-oiled machine of deception, in which only a few small cracks have allowed drips of the truth to come through.

But it should be noted that there is no direct evidence supplied by Goode to confirm his claims, only anecdotes and personal experiences. Tompkins, on the other hand, has provided a vast body of documented evidence suggesting that he did participate in the design of spacecraft that eventually became the vessels used by Solar Warden.
Yet Tompkins also has no direct evidence (other than his personal experiences) to support the notion that there are in fact programs operating outside of Earth orbit, or that reptilians and other races of a nefarious nature are interacting with humanity.

Conclusions Regarding the Cosmic False Flag

While the focus of this writing is partially to assess whether Goode and Tompkins are in fact disinformation agents, as Dr. Greer claims, the overall premise of a cosmic false flag should also be considered. There is substantial indirect evidence to support the idea. And there may well be direct evidence if the FIRESIGN documents attributed to the Benenson Strategy Group prove to be authentic.

Furthermore, the human population's ability to think critically and assess reality from a moral and ethical standpoint is, as Dr. Greer states, shockingly low; one need only look at the past one hundred years of history to see this. Therefore, the spiritual awakening discussed by Dr. Greer is a valid point, one also echoed by Goode. Humanity needs to set aside groupthink and dogma in favor of personal discernment and learning the skills of assessing reality firsthand. In addition, social programs of division based on racial, ideological, and religious differences have been used as a pretext for war for the past 6,000 years. Only by gaining the skills to think for oneself, and by refusing to let the deceptions and machinations of the shadow government work, can humanity finally set aside the plague of madness that has enabled untold harm and suffering to flourish. If humanity can do this, then even if a cosmic false flag did occur, the call to unite in a campaign of interstellar war would fall on deaf ears.

Finally, Goode has said that the cosmic false flag plan is real but that it was abandoned in favor of a partial disclosure agenda that has been actively moving forward for years. The Cabal's grand deception plan is no longer to bring about a mass fake alien invasion.
That was exposed heavily decades ago and is no longer a viable program to execute. This holographic technology is however still viable to use in smaller "theaters of war". The plan now is to introduce humanity to an "Angelic and Human-Like" ET group and have people worship these so-called "angelic ET's" who bring us a "New Cosmic Esoteric World Religion." (Source) And here is a detailed presentation of his findings that discuss how much consciousness seems to be affected by magnetic fields, and is capable of fantastic extrasensory perception.Briefly, the human brain and biology produce an electromagnetic field that is capable of coupling with the Earth and other human beings via sympathetic resonance or entrainment. The EM fields produced by the earth, known as the Schumann resonances, seem to act as a baseline frequency for upper-level harmonics used by the body and measured by scientists as brain waves.Using various forms of electromagnetic field generators, such as radar and scalar wave antenna ( HAARP ), but also localized devices that can target single individuals, one can induce all types of experiences and phenomenon on the body and mind. Disease or healing can be affected, as well as false encounters—that to the person experiencing them areindistinguishable from the reality they perceive. They can even produce disembodied voices that mimic discarnate entities (spirits, angels or ETs) that mediums or channelers claim to have made contact with for most of human history.In summary, according to Dr. Greer and others, over the past 70 years, the MIC has perfected technology that can hijack what is commonly referred to as extrasensory perception mechanisms within the human body and mind. This suggests that any experience, whether mundane or extraordinary, could be artificially induced via advanced mind-control technology. But Dr. Greer doesn't claim everyone experiencing such things arefalse experiences. 
Instead, he says that many are genuine but that the MIC has the ability to track these bonafide encounters and quite often targets these individuals for a false experience to confuse the situation.Enter the Military Abduction Program or MILABs.Dr. Greer states that the Military Abduction Program (MILAB) is another facet of this cosmic false flag agenda. But he is hardly the only one who discusses this topic. It is a complex and vast subject that often involves the following scenario.An individual having genuine extraterrestrial contact experience, sometimes involving abduction is identified by the military. The MIC through the use of advanced surveillance technology can track and target anyone who has such an encounter and often performs a secondary abduction that uses EM mind-control technology, ARVs, and what Dr. Greer calls Programmed Lifeforms—entities that look like real ETs but are either humans in makeup or robotically controlled fakes. The purpose of such a program, Dr. Greer says, is to confuse the experiencer, often involving horrific testing or probing procedures that lead the individual to conclude the beings encountered are negative and insidious. But this is just one perspective on the situation.Here is an excerpt from the presentation wherein Dr. Greer discusses false abduction and MILABs.Volumes have been written on the topic of extraterrestrial abduction. But focusing on Dr. Greer's perspective, one goal of the MILAB program is to implant a false experience that leads many to believe all extraterrestrials are negative. Arguably there are many other aims for this program, such as reconnaissance and research of real extraterrestrial encounters. The individuals abducted by the military and given false memories will form a growing body of witnesses that can further cloud the field of ufology so that they—according to Dr. 
Greer—bear false witness and state that negative extraterrestrials are trying to harvest human DNA or learn about humanity for an eventual invasion.It should be noted that Goode has discussed the MILAB program in precisely the same fashion as Dr. Greer, saying these projects have the purpose of gathering information and confusing ET abductees, but does not go as far to also say thatare false, as Dr. Greer claims.He also says that cattle mutilations are fabricated, citing a Dr. John H Altshuler's research as a reference.Dr. Greer states that another aspect of the cosmic false flag agenda is that the field of ufology—via the growing body of abductees withexperiences, must be seeded into the consciousness of humanity to support the plausibility of an extraterrestrial invasion scenario. Again, whether or not Dr. Greer's claim that all negative experiences are false is unproven as there are scores of people that say they had horrific abduction experiences. Whether all of these experiences are false is unclear, but if Dr. Greer and others are to be believed some of them could be psychological warfare operations to confuse abductees and by extension ufologists.Until recently, the topic of alien abduction was considered too taboo to discuss seriously, but since the 1950s, science fiction has exploded onto the scene, often providing a venue to reveal hidden activities and realities to the public. Subject matter that seems to be of keen interest to the MIC involving USAPs often appears on TV shows, movies, and other forms of media that are disclosed through such mediums. In other words, science fiction in almost every respect has become focused on promulgating a prejudice that all extraterrestrial life is evil or nefarious. This is in agreement with Dr. 
Greer's assertions about creating a public bias that will one day lead to the population of Earth being galvanized against an alien threat.On the issue of whether or not whistleblowers are providing false testimony, it is not only theoretically possible, but arguably essential to consider. As was mentioned above, USAPs have the authority via codified policy to distribute disinformation. And if one performs a comparative analysis on the complete totality of whistleblower testimony available, there are glaring inconsistencies. This suggests that the MIC can and does use false whistleblowers to confuse researchers and the public at large. Historically speaking it seems any field of research that advances towards the truth is subject to coordinated disinformation attacks.As an example, consider the topic of super soldiers, which is a hotbed of activity at present. Certain individuals claim to have been recruited into assorted secret projects, and enhanced using various means, such as DNA augmentation, electromagnetic fields, age-regression, and sometimes are the product of alien-human hybrid programs. But often there is no evidence provided other than anecdotal testimony. The lack of evidence does not necessarily mean anything related to the topic is untrue, but it does make it very easy for false whistleblowers to come on the scene and befuddle genuine research. Initially, there are a handful of alleged super soldiers emerging to discuss their experiences, but as of the time of this writing, there are hundreds of people coming forward and in many cases, the testimony provided is divergent on the whole.Still, other insiders claim that some of these people professing to be super soldiers are either outright frauds, or were given false experiences that made them perfect disinformation agents. 
For the growing body of researchers attempting to make sense of the situation, it is increasingly confusing, and many have abandoned the whole topic due to a handful of obvious fraudsters.

For the MIC, the more confusion that is created, the greater the likelihood the actual truth will not be revealed or make it into the hands of researchers. And in effect, this is what Dr. Greer has also asserted. But he goes a step further to state that anyone who has testimony suggesting negative extraterrestrials exist—or that nefarious activity is taking place outside the Earth, such as Mars bases populated by slaves taken from Earth—is a disinformation agent, either by conscious intent or scripted memory manipulation.

MILABs are another aspect of manipulation, which is less contested. Through hypnotic regression therapy techniques, the scripted or screen memories given to abductees can be identified and carefully dismantled to reveal what is hidden beneath. In some cases, secondary abductions have revealed that human beings are nearby while so-called aliens are performing the abduction, which Dr. Greer cites in his presentation. Other whistleblowers have also stated that the military does abduct individuals who have genuine ET encounters in order to debrief them and cause confusion, as was mentioned earlier. So while Dr. Greer's contention that all negative ET experiences are a product of disinformation is not confirmed, there does exist other testimony and evidence to suggest fabricated experiences are occurring.

The work of Dr. Persinger provides an empirical basis for understanding that it is possible to manipulate consciousness via electromagnetic means.
And other researchers in the mainstream dealing with criminal investigations and eyewitness testimony have demonstrated that memory is much more fluid than one might think, as the below-linked article reveals.

Corey Goode is a whistleblower and SSP insider who claims to have participated in the MILAB program and worked to undo the effects of screen memory implantation. His testimony is too vast to detail here completely, but it essentially describes what is a fairly typical experience within MILAB cases. As a young child, he was targeted by the MIC for having special intuitive or psychic abilities and would often go to "special classes" at school. Beginning in 1986, he claims to have served in the Solar Warden SSP for 20 years. He was then age-regressed and deposited back in time to moments after he left. To many, these claims are so wildly fantastic that they are dismissed without research, but for avid researchers of ufology and secret government projects, there is much to lend credence to them.

Goode says that upon his return after a 20-year tour of duty, his memory was wiped and a screen memory was implanted that for many years prevented him from recalling what happened. But one day he began to remember aspects of his SSP service, and through a trusted regression therapist, he finally experienced total recall. Since then, he has been abducted on two more occasions, taken by the MIC SSP, had his memory wiped, and been given more screen memories. And while he says he was confused for a time, he is now able to determine when these technologies are used against him. If his accounts are true, then it suggests that while the unaware public might not be able to tell the difference between implanted and real memories, it is possible to differentiate scripted from genuine experiences.

Dr. Greer also stated that he was subjected to this technology years ago, and that he too is able to discern the difference between screened and real memories.
Although he did not indicate how this was done, it presumably requires discernment skills and the use of one's mental faculties, like critical thinking, keen observation, and inquisitiveness.

Drawing from popular works of fiction, there are countless examples—all of which have memory wiping and screen or scripted memories as a key facet of the storyline. There is a clear pattern of portraying seemingly real aspects of hidden programs in popular works of fiction, arguably to cause the uninitiated public to dismiss them as fiction if they are ever encountered in life.

Here is another excerpt from the Asgardia article, citing how the media seems to be used as a way to hide or disclose information to the public. Dr. Salla also raised this point of a coordinated program of manipulation of the masses using fiction and other forms of media in a review of Dr. Greer's presentation. What this small data point, out of a large body to choose from, suggests is that the media in general seems to be playing a major role in manipulating the extraterrestrial and USAP narrative, whether it is to "debunk" genuine UFOs and extraterrestrial encounters or to promote propaganda that serves the agenda of the MIC.

Dr. Greer originally publicized his talk by specifically referring to Corey Goode and William Tompkins. And during the presentation, he alluded to the fact that they could be disinformation agents but did not explicitly state that they were. He did respond to a question from the audience, stating that people should be highly skeptical of their accounts. Here is a compilation of Dr. Greer's remarks on Corey Goode, William Tompkins, Tom DeLonge, Gaia TV, and Collective Evolution.

When comparing Dr. Greer's data to both Goode's and Tompkins' accounts, there is almost total agreement insofar as the historical narrative goes, with some differences that will now be listed.

Dr.
Greer contends that the Nazis developed antigravity craft, commonly known as the Nazi Bell, but were not successful in perfecting the technology. It wasn't until the end of World War II, after German scientists came to the U.S. through Project PAPERCLIP, that they were able to finish their designs, with the help of the American shadow government and the MIC.

Goode and Tompkins, working independently, contend that Dr. Greer's version of events is only partially true, and that it is itself a cover story and an aspect of a partial disclosure agenda—a kind of deceptive I & W in its own right.

They say that the Germans had several different antigravity research projects, one involving the Nazi Bell and another that was greatly advanced through extraterrestrial contacts made with a race of reptilians called the Draco. Still another civilian project, which the Germans only learned of later, was headed by the Vril Society and Maria Orsic. According to Goode, Orsic made contact with several different non-terrestrial races that helped her team develop a fully functional spacecraft capable of traveling to the stars. And both Tompkins and Goode claim that another program assisted by the reptilians went to the Moon, setting up a base there, as well as visiting Mars and other planets in the solar system. They also say that this more advanced faction of the German government was advised by their ET contacts to go to Antarctica, where they built under-ice facilities. All of these advances occurred before the Second World War came to a close.

Another point provided by Goode and Tompkins is that the MIC learned of the German Antarctic bases after the war and sent a large naval contingent there to deal with them, under the code name Operation HIGHJUMP.
Instead of destroying the base, the Americans were attacked by highly advanced saucer craft as well as large cigar-shaped vessels that Tompkins claims belonged to reptilian allies of the Germans.

According to Goode and Tompkins, the Americans were developing their own SSP and learned of the German advancements during the war. Germans were brought over to the U.S. MIC during Operation PAPERCLIP and provided their less advanced antigravity craft as a decoy to satisfy American curiosity. Eventually, the Germans in the U.S.—assisted by their Antarctic compatriots—strong-armed the MIC into signing treaties that effectively merged the German and American MICs. This latter aspect Dr. Greer did allude to when he said that the U.S. is effectively the Fourth Reich.

Goode and Tompkins state that the level of compartmentalization used by the MIC and these USAPs is such that it would be very easy for personnel working within them to believe they were being told the whole truth, when in fact it was only part of the story. And it appears that the lower-level aspects of the SSP are what Dr. Greer draws from as the basis of his knowledge, especially regarding German advancements. It is highly likely that as part of the compartmentalization protocol, personnel would be led to believe they were the most aware and had access to the whole truth, and in doing so, they would be compelled to reject anything that doesn't fit within their worldview.

While Dr. Greer did not spend much time focusing on the scope of the programs developed by the MIC, he has stated in the past that positive extraterrestrials maintain a kind of quarantine that prevents aggressive human projects from leaving the Earth. He also asserted that accounts of slave colonies operating on Mars or elsewhere were probably not true, presumably because of the belief that positive ETs would not allow them.
But he did not offer any evidence to support this conjecture.

On the other hand, Goode and Tompkins state that the SSP did travel to the Moon and beyond. Goode in particular claims there are five different factions or compartments of the SSP: Solar Warden, the Interplanetary Corporate Conglomerate (ICC), the Global Galactic League of Nations (GGLN), the Military or MIC faction, and the Dark Fleet. Each is heavily compartmentalized, and only higher-level operatives within the ICC have some working knowledge of the other programs. The MIC faction operates mostly in low-Earth orbit and has access to technology that is only about 30 or 40 years ahead of today. But the other factions have traveled all around the solar system and beyond. The ICC seems to be the commercial branch of the SSP, trading with over 900 different extraterrestrial groups and maintaining production facilities on Mars, in the Asteroid Belt, and elsewhere, using human slaves taken from Earth as a labor force. The GGLN and the Dark Fleet operate largely outside of the solar system.

If these claims by Goode are true, it means that Dr. Greer's insiders are mostly from the lower-level MIC space program, and as such would not be aware of or have knowledge about the outer space programs, having been provided cover stories that supported this narrative. It also means that most people working in these programs operating largely in outer space would not have contact with the Earth, and thus cross-contamination is greatly limited. In essence, the secrecy mechanism of the whole SSP and MIC is an extremely well-oiled machine of deception, through which only a few small cracks have allowed drips of the truth to come through.

But it should be noted that there is no direct evidence supplied by Goode to confirm his claims, only anecdotes and personal experiences. Tompkins, on the other hand, has provided a vast body of documented evidence suggesting he did participate in designing the spacecraft that eventually became the vessels used by Solar Warden.
Yet Tompkins also has no direct evidence (other than his personal experiences) to support the notion that there are in fact programs operating outside of Earth orbit, or that reptilians and other races of a nefarious nature are interacting with humanity.

While the focus of this writing is partially to assess and analyze whether Goode and Tompkins are in fact disinformation agents, as Dr. Greer claims, the overall premise of a cosmic false flag should also be considered. There is substantial indirect evidence to support this idea. And there may well be direct evidence if the FIRESIGN documents from the Benenson Strategy Group are authentic.

Furthermore, the human population's ability to think critically and assess reality from a moral and ethical standpoint is, as Dr. Greer states, shockingly low. One need only look at the past one hundred years of history to see that false flag agendas—which could easily have been seen for what they are had they not been blindly accepted as true—deceived millions if not billions of people and were used as a pretext to start wars and issue draconian policies in the U.S. and all over the globe.

Therefore, the spiritual awakening discussed by Dr. Greer is a valid point, which is also echoed by Goode. Humanity needs to set aside groupthink and dogma in favor of personal discernment and learning the skills of assessing reality firsthand. In addition, social programs of division based on racial, ideological, and religious beliefs have been used as a pretext for war for the past 6,000 years. Only by gaining the skills to think for oneself, and by not allowing the deceptions and machinations of the shadow government to work, can humanity finally set aside the plague of madness that has enabled untold harm and suffering to flourish.
If humanity can do this, then even if a cosmic false flag did occur, the call to unite in a campaign of interstellar war would fall on deaf ears.

Finally, Goode said that the cosmic false flag plan is real, but that it was abandoned in favor of a partial disclosure agenda that has been actively moving forward for years.

Unacknowledged Special Access Programs (USAPs) is a term referring to government and independently financed activities that are allegedly for the purpose of national security but are completely devoid of public oversight. These programs are authorized to lie to the public and government officials using cover stories that are often very elaborate in nature, as an excerpt from a supplement to the DoD manual related to special access programs states.

Evidence in the form of whistleblower testimony, as well as declassified documents, suggests that secret space programs have been in operation alongside the public space program for decades—if not far longer. Their activities include the control and exploitation of all life on Earth and beyond—or so it is claimed by many researchers.

The Secret Space Program (SSP) is an unofficial term referring to a conglomerate of government and corporate interest projects with overarching activities in space exploration, often dealing with extraterrestrials and/or spacecraft of some kind—whether in joint operations or reverse engineering projects. While there has been no "official" disclosure in this regard, an enormous body of evidence by way of whistleblower testimony, declassified documents, and eyewitness accounts suggests such programs exist.

Richard Dolan is a ufology researcher who uses the term Breakaway Civilization when discussing these activities, as described in the below excerpt from the above-mentioned Asgardia article.

Before the program of official secrecy and cover-up began in the 1950s, the study of UFOs and extraterrestrials was a valid field of research.
After the Trinity nuclear bomb test in the 1940s, the number of flying saucers reported by the public exploded, and an official military investigation called Project Blue Book took place between 1952 and 1970, preceded by projects Sign (1947) and Grudge (1949).

As Dr. Greer mentions in his presentation, and as echoed by countless other researchers, the infamous 1947 Roswell UFO crash apparently provided the MIC with an extraterrestrial spacecraft that was so advanced it took decades to understand fully. From this and other recovered ET vessels, the information age was born.

But prior to this development, and before World War II, the Germans and other nations were advancing antigravity technology and what are called scalar weapon systems based on Nikola Tesla's advances, which were not fully understood during the peak of the Serbian-American inventor's career at the turn of the 20th century. It would take teams of engineers decades to comprehend what Tesla seemed to divine intuitively, and from these hidden projects a host of incredibly deadly and disturbing technologies was produced. These systems include scalar-based electromagnetic induction apparatus that can affect weather systems and trigger earthquakes (known popularly as HAARP technology), and that can also interface with consciousness via electromagnetic entrainment. It is the latter application for altering human perception that was a central point in Dr. Greer's presentation.

MKULTRA is a mind-control program developed by the CIA beginning in the 1950s and officially ended in 1964. However, many contend that it was classified and went "black" (an unofficial term referring to USAPs). In 1977, FOIA requests revealed to the public certain aspects of the program, specifically related to trauma-based and chemical mind-control techniques, but the electromagnetic methods were never fully disclosed. Dr.
Greer and others contend that these scalar mind augmentation systems were used, and still are, by the shadow government against people all over the world. The alleged Russian Woodpecker experiments are another example of mass mind-control using similar technology.

What is of interest in relation to Dr. Greer's presentation is what is popularly known as a false flag, or as it is apparently known in the military, a Deceptive Indication and Warning operation, or Deceptive I & W. As the name implies, these are missions intentionally designed to cause confusion and deceive the public (or those within the government or military) for the purpose of compartmentalization, inciting war, drawing attention away from classified projects, or garnering support for a political venture of some kind. When partnered with disinformation or distract, decoy, and trash (discredit) campaigns, it is very easy to manipulate the masses, especially since critical thinking and discernment skills have been almost completely purged from the public arena.

All that need be done in this regard is to implant a false whistleblower or insider into a community doing valid research, like alien abduction. Then the insider is revealed to be a fraud, and the undiscerning masses assume anything that person said—whether true or not—is also untrue. They paint anyone mentioning alien abduction with the same brush, which is a tactic that is actively used by debunkers and the propaganda media to this day. This is why personal discernment and analysis are essential to avoid deceiving oneself.

Here is the definition of an I & W from the United States Department of Defense:
<filename>app/src/main/java/net/droidlabs/mvvmdemo/binder/SuperUserBinder.java package net.droidlabs.mvvmdemo.binder; import net.droidlabs.mvvm.recyclerview.adapter.binder.ConditionalDataBinder; import net.droidlabs.mvvmdemo.viewmodel.SuperUserViewModel; import net.droidlabs.mvvmdemo.viewmodel.UserViewModel; public class SuperUserBinder extends ConditionalDataBinder<UserViewModel> { public SuperUserBinder(int bindingVariable, int layoutId) { super(bindingVariable, layoutId); } @Override public boolean canHandle(UserViewModel model) { return model instanceof SuperUserViewModel; } }
#include<stdio.h> int main(){ int n,i,j,k1,k2; long prt1[100000]; long prt2[100000]; long tmp; int res[100000][2]={0}; scanf("%d",&n); for (i=0;i<n;i++){ scanf("%ld %ld",&prt1[i],&prt2[i]); } k1=0; k2=0; for (j=0;j<n/2;j++){ res[j][0]=1; res[j][1]=1; } while (k1+k2<n){ if (prt1[k1]<prt2[k2]) { res[k1][0]=1; k1++; } else { res[k2][1]=1; k2++; } } for (i=0;i<n;i++){ printf("%d",res[i][0]); } printf("\n"); for (i=0;i<n;i++){ printf("%d",res[i][1]); } printf("\n"); return 0; }
export class Schedule { id: string; title: string; creator: string; description: string; location: string; timeStart: Date; timeEnd: Date; constructor(obj: any) { this.id = obj.id; this.title = obj.title; this.creator = obj.creator; this.description = obj.description; this.location = obj.location; this.timeStart = obj.timeStart; this.timeEnd = obj.timeEnd; } }
<reponame>fossabot/pedrolamas.com<gh_stars>1-10 import React from 'react'; import SidebarSocialLink from './sidebarSocialLink'; import SiteContext from '../../../siteContext'; import { FontAwesome } from '../../../../utils'; type SidebarSocialProps = { children?: never; }; const SidebarSocial: React.FunctionComponent<SidebarSocialProps> = () => { const { siteMetadata } = React.useContext(SiteContext); if (!siteMetadata) return null; const { social } = siteMetadata; return ( <nav className="sidebar-social" role="navigation" aria-label="Social Links Menu"> {social?.links?.map((link, index) => { if (!link) { return null; } let linkTitle = ''; let symbolName: FontAwesome.SymbolNames = 'sidebar-default'; if (link.includes('twitter.com')) { linkTitle = 'Twitter'; symbolName = 'sidebar-twitter'; } else if (link.includes('facebook.com')) { linkTitle = 'Facebook'; symbolName = 'sidebar-facebook'; } else if (link.includes('linkedin.com')) { linkTitle = 'LinkedIn'; symbolName = 'sidebar-linkedin'; } else if (link.includes('github.com')) { linkTitle = 'GitHub'; symbolName = 'sidebar-github'; } return <SidebarSocialLink url={link} title={linkTitle} symbolName={symbolName} key={index} />; })} <SidebarSocialLink url="/feeds" title="Syndicated Feeds" symbolName="sidebar-rss" /> </nav> ); }; SidebarSocial.displayName = 'SidebarSocial'; export default SidebarSocial;
Reduction of noise transmission through an aperture using active feedforward noise control. A local active noise control technique has been applied to reduce noise emission through an aperture in the wall of the enclosure. Pressure cancellation was effected at the center of an aperture of 0.3×0.3 m2 for an enclosure of 2 m3 volume, and the reduction in sound was measured at various locations at the aperture and at some distance from the enclosure. The results showed that sound pressure cancellation at the window implies emission attenuation through itself, and a relationship between attenuation at the window and outside the enclosure is confirmed. The behavior of attenuation results at the window is related to frequency, to the modal density of the sound field in the enclosure, and to the distance between the error microphone and secondary source.
from __future__ import absolute_import, division, print_function from __future__ import unicode_literals import sys import os from random import randint import datetime import time from multiprocessing import Pool, TimeoutError from collections import defaultdict from scipy.stats import chisquare from mmgroup import MM0, MMV from mmgroup.mm_space import MMSpace from mmgroup.mm import INT_BITS ################################################################ # Class and character for the monster information taken from GAP ################################################################ #The following information has been obtained from the GAP package: GAP_INFO = """ gap> t := CharacterTable("M"); #! The character table of the Monster group CharacterTable( "M" ) gap> ClassNames(t, "ATLAS"); #! Classes of the Monster in ATLAS notatation [ "1A", "2A", "2B", "3A", "3B", "3C", "4A", "4B", "4C", "4D", "5A", "5B", "6A", "6B", "6C", "6D", "6E", "6F", "7A", "7B", "8A", "8B", "8C", "8D", "8E", "8F", "9A", "9B", "10A", "10B", "10C", "10D", "10E", "11A", "12A", "12B", "12C", "12D", "12E", "12F", "12G", "12H", "12I", "12J", "13A", "13B", "14A", "14B", "14C", "15A", "15B", "15C", "15D", "16A", "16B", "16C", "17A", "18A", "18B", "18C", "18D", "18E", "19A", "20A", "20B", "20C", "20D", "20E", "20F", "21A", "21B", "21C", "21D", "22A", "22B", "23A", "23B", "24A", "24B", "24C", "24D", "24E", "24F", "24G", "24H", "24I", "24J", "25A", "26A", "26B", "27A", "27B", "28A", "28B", "28C", "28D", "29A", "30A", "30B", "30C", "30D", "30E", "30F", "30G", "31A", "31B", "32A", "32B", "33A", "33B", "34A", "35A", "35B", "36A", "36B", "36C", "36D", "38A", "39A", "39B", "39C", "39D", "40A", "40B", "40C", "40D", "41A", "42A", "42B", "42C", "42D", "44A", "44B", "45A", "46A", "46B", "46C", "46D", "47A", "47B", "48A", "50A", "51A", "52A", "52B", "54A", "55A", "56A", "56B", "56C", "57A", "59A", "59B", "60A", "60B", "60C", "60D", "60E", "60F", "62A", "62B", "66A", "66B", "68A", "69A", "69B", "70A", "70B", 
"71A", "71B", "78A", "78B", "78C", "84A", "84B", "84C", "87A", "87B", "88A", "88B", "92A", "92B", "93A", "93B", "94A", "94B", "95A", "95B", "104A", "104B", "105A", "110A", "119A", "119B" ] gap> Irr(t)[2]; #! Character of degree 196883 Character( CharacterTable( "M" ), [ 196883, 4371, 275, 782, 53, -1, 275, 51, 19, -13, 133, 8, 78, 77, 14, -3, 5, -1, 50, 1, 35, 11, -1, -5, 3, -1, 26, -1, 21, 5, -4, 20, 0, 16, 14, 5, 6, -1, -2, 5, -3, 13, 1, -1, 11, -2, 10, 2, 9, 7, -2, 8, -1, 3, -1, 7, 6, -3, 6, 2, -1, 5, 5, 5, 1, 0, -3, 2, 4, 5, -2, -1, 4, 4, 0, 3, 3, 2, 2, -1, -2, -1, -1, -1, 1, 3, -1, 3, 3, 2, 2, 2, 2, 2, -2, 1, 2, 2, 3, -1, 2, -1, 2, 0, 2, 2, 1, 1, -2, 1, 2, 0, 1, 2, -1, 0, 1, 1, 2, -1, 1, 1, -1, 1, 0, 0, 1, 1, 0, -1, 0, 0, 0, 1, -1, -1, 1, 1, 0, 0, 0, 1, 0, -1, 0, 0, 1, 0, -1, -1, -1, 0, 0, 1, -1, 0, -2, 0, -1, 0, 0, 1, 0, 0, 0, 0, 0, -1, 0, 0, 0, -1, -1, -1, -2, -1, -1, -1, 0, 0, -1, -1, -1, -1, 0, 0, 0, 0, -1, -1, 0, -1, -1, -1 ] ) gap> SizesCentralizers(t); #! Sizes of the centralizers of the classes [ 808017424794512875886459904961710757005754368000000000, 8309562962452852382355161088000000, 139511839126336328171520000, 3765617127571985163878400, 1429615077540249600, 272237831663616000, 8317584273309696000, 26489012826931200, 48704929136640, 8244323942400, 1365154560000000, 94500000000, 774741019852800, 2690072985600, 481579499520, 130606940160, 1612431360, 278691840, 28212710400, 84707280, 792723456, 778567680, 143769600, 23592960, 12582912, 3096576, 56687040, 2834352, 887040000, 18432000, 12000000, 6048000, 480000, 1045440, 119439360, 22394880, 17418240, 1161216, 884736, 483840, 373248, 276480, 82944, 23040, 73008, 52728, 1128960, 150528, 35280, 2721600, 145800, 10800, 9000, 12288, 8192, 8192, 2856, 34992, 23328, 15552, 3888, 3888, 1140, 76800, 28800, 24000, 19200, 1200, 960, 52920, 6174, 3528, 504, 2640, 2112, 552, 552, 6912, 4608, 3456, 2304, 1152, 864, 864, 576, 384, 288, 250, 624, 312, 486, 243, 4704, 2688, 896, 168, 87, 10800, 7200, 2880, 1800, 360, 
240, 240, 186, 186, 128, 128, 594, 396, 136, 2100, 70, 1296, 648, 216, 72, 76, 702, 117, 78, 78, 400, 320, 80, 80, 41, 504, 504, 168, 126, 352, 352, 135, 184, 184, 92, 92, 94, 94, 96, 50, 51, 104, 52, 54, 110, 112, 56, 56, 57, 59, 59, 360, 240, 120, 120, 60, 60, 62, 62, 132, 66, 68, 69, 69, 140, 70, 71, 71, 78, 78, 78, 84, 84, 84, 87, 87, 88, 88, 92, 92, 93, 93, 94, 94, 95, 95, 104, 104, 105, 110, 119, 119 ] """ def find_table(name): """Return table in GAP_INFO after the comment starting with 'name'""" s = GAP_INFO[GAP_INFO.find("#! " + name):] copen, cclose = s.find("["), s.find("]") return eval(s[copen:cclose+1]) ClassNames = find_table("Classes") ClassOrders = [int(s[:-1]) for s in ClassNames] CharacterValues = find_table("Character") SizesCentralizers = find_table("Sizes of the centralizers") assert len(ClassNames) == len(CharacterValues) == len(SizesCentralizers) ################################################################ # Check that monster group elements have coorect orders ################################################################ p = 3 space = MMV(3) group = MM0 good_mm_orders = set(ClassOrders) max_mmm_order = max(good_mm_orders) def one_test_mm_order(v, m, verbose = 0): v = v.copy() v1, n = v.copy(), 0 while n <= max_mmm_order: v1, n = v1 * m, n+1 if v1 == v: return n return None def rand_v(): return space('R') def rand_m(n_entries = 4): return group('r', n_entries) def random_test_order(n_entries = 4, display = True): v, m = rand_v(), rand_m() order = one_test_mm_order(v, m, display) ok = order in good_mm_orders st = "ok" if ok else "error" if display: print("\rorder is", order, ",", st) s = "\nm = " + str(m) s += "\norder = " + str(order) + ", " + st + "\n" return ok, order, s def check_mm_orders(ntests, display = True): print("\nTesting orders of elements of the monster group") nerrors = 0 order_sum = 0 start_time = datetime.datetime.now() print(start_time) t_start = time.process_time() for i in range(ntests): t = time.process_time() if 
display: print("Test %d, CPU time = %.3f s" % (i+1, t) ) ok, order, _ = random_test_order(display = display) nerrors += not ok if ok: order_sum += order t = time.process_time() - t_start print("started: ", start_time) print("finished:", datetime.datetime.now()) print("CPU time = %.3f s, per test: %.3f ms" % (t, 1000*t/ntests)) print("CPU time per standard operation: %.5f ms" % (1000.0*t/order_sum)) print("%d tests, %d errors, " % (ntests, nerrors)) if nerrors: raise ValueError("Error in orders of monster group elements") ################################################################ # Chisquare test of orders of monster group elements ################################################################ MM_WORD_SIZE = 20 # No of elementary operations to construct # an element of the monster MIN_CHISQU = 560 # Min No of cases for chisquare test class ChisquareOrder: probabilities = defaultdict(float) orders = set(ClassOrders) good_orders = set() for order, csize in zip(ClassOrders, SizesCentralizers): probabilities[order] += 1.0/csize if probabilities[order] >= 1.0/111: good_orders.add(order) max_small = max(orders - good_orders) for x in orders: if x <= max_small: del probabilities[x] min_order = min(probabilities) probabilities[0] = 1.0 - sum(probabilities.values()) chisquare_ = chisquare def __init__(self, p = p): self.obtained = defaultdict(int) self.p = p self.total = 0 self.order_sum = 0 self.errors = 0 self.word_size = MM_WORD_SIZE def add(self, ok, order): ok = ok and order in self.orders if ok: key = order if order >= self.min_order else 0 self.obtained[key] += 1 self.total += 1 self.order_sum += order self.errors += not ok def chisquare(self): f_obt = [self.obtained[key] for key in self.probabilities] sum_obt = sum(f_obt) f_exp = [sum_obt * self.probabilities[key] for key in self.probabilities] chisq, p = chisquare(f_obt, f_exp = f_exp) return chisq, p def is_ok(self): if self.errors: return False if self.total < MIN_CHISQU: return True _, prob = 
        # chisquare() is assumed to return a (statistic, p-value) pair
        _, prob = self.chisquare()
        return prob > 1.0e-6

    def show_result(self):
        description = (
"""Chisquare test of distribution of orders >= %d in the monster M,
%d degrees of freedom, characteristic p = %d, %d-bit C
random element of MM built from %d factors,
%d tests, %d MM operations, %d errors.
"""
        )
        s = description % (
            self.min_order,
            len(self.probabilities) - 1,
            self.p,
            INT_BITS,
            self.word_size,
            self.total,
            self.order_sum,
            self.errors
        )
        if self.errors == 0 and self.total >= MIN_CHISQU:
            st = "\nChisquare test statistics = %.3f, p = %.4f\n"
            chisq, p = self.chisquare()
            s += st % (chisq, p)
        return s


def one_test_order(args):
    v, m = args
    order = one_test_mm_order(v, m)
    ok = order in good_mm_orders
    return ok, order


def get_test_values(ntests):
    for i in range(ntests):
        yield rand_v(), rand_m(MM_WORD_SIZE)


def statistics_chisqu_orders(results, start_time=None):
    if start_time is not None:
        end_time = datetime.datetime.now()
    chisq = ChisquareOrder()
    for i, (ok, order) in enumerate(results):
        chisq.add(ok, order)
    print("\n" + chisq.show_result())
    if start_time is not None:
        ntests, order_sum = chisq.total, chisq.order_sum
        diff_time = end_time - start_time
        t = diff_time.total_seconds()
        print("started: ", start_time)
        print("finished:", end_time)
        print("time = %.3f s, per test: %.3f ms" % (t, 1000 * t / ntests))
        print("time per standard operation: %.5f ms" % (1000.0 * t / order_sum))
    return chisq.is_ok()


def check_chisqu_orders(ntests, nprocesses=1, verbose=False):
    start_time = datetime.datetime.now()
    header = "\nChisquare test of distribution of orders in the monster M,"
    print(header)
    print("%d tests, %d processes" % (ntests, nprocesses))
    print("started: ", start_time)
    testvalues = get_test_values(ntests)
    if nprocesses > 1:
        with Pool(processes=nprocesses) as pool:
            results = pool.map(one_test_order, testvalues)
            pool.join()
    else:
        results_ = map(one_test_order, testvalues)
        results = []
        for i, x in enumerate(results_):
            ok, order = x
            if verbose:
                print("Test %d, order = %3d, %s" % (i + 1, order, ok))
            else:
                print("\r %d " % i, end="")
            results.append(x)
    return statistics_chisqu_orders(results, start_time)
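The `chisquare()` method invoked above is assumed to return a (statistic, p-value) pair for the observed order counts. As an illustration of the underlying goodness-of-fit statistic only — the bin counts below are made up and are not the real order distribution — a minimal sketch:

```python
def chi_square_statistic(observed, expected):
    """Sum of (O - E)^2 / E over all bins with a positive expected count."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

# Hypothetical bin counts, for illustration only:
observed = [48, 52, 95, 105]
expected = [50, 50, 100, 100]
stat = chi_square_statistic(observed, expected)  # 0.66
```

Converting this statistic to the p-value printed by `show_result` additionally requires the chi-square CDF for the given degrees of freedom (e.g. `scipy.stats.chisquare`).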
    /**
     * Test counts in a given table
     *
     * @param table Table name
     * @param aggregateAttribute Aggregate attribute
     * @param componentId Component ID
     * @param expectedCount Expected count
     * @throws AnalyticsException
     */
    private void testCounts(String table, String aggregateAttribute, String componentId, int expectedCount)
            throws AnalyticsException {
        for (int tenantId : TestConstants.TENANT_IDS) {
            List<AggregateField> fields = new ArrayList<AggregateField>();
            fields.add(new AggregateField(new String[] { aggregateAttribute }, "sum", TestConstants.REQUEST_COUNT));
            AggregateRequest aggregateRequest = new AggregateRequest();
            aggregateRequest.setFields(fields);
            aggregateRequest.setAggregateLevel(0);
            aggregateRequest.setParentPath(new ArrayList<String>());
            aggregateRequest.setGroupByField(TestConstants.COMPONENT_ID);
            aggregateRequest.setQuery(TestConstants.META_TENANT_ID + ":" + tenantId + " AND "
                    + TestConstants.COMPONENT_ID + ":\"" + componentId + "\"");
            aggregateRequest.setTableName(table);
            AnalyticsIterator<Record> resultItr = this.analyticsDataAPI.searchWithAggregates(-1234, aggregateRequest);
            int count = ((Double) resultItr.next().getValue(TestConstants.REQUEST_COUNT)).intValue();
            log.info("ComponentId: " + componentId + " | Expected: " + expectedCount + " | "
                    + "Actual: " + count + " | tenant: " + tenantId);
            Assert.assertEquals(count, expectedCount,
                    aggregateAttribute + " is incorrect in " + table + " table, for tenant: " + tenantId);
        }
    }
// ParseObjects parses a YAML string and returns an array of the
// objects/items from a Template or List kind.
func ParseObjects(source string) (Objects, error) {
	var template Object
	err := yaml.Unmarshal([]byte(source), &template)
	if err != nil {
		return nil, err
	}
	if GetKind(template) == ValKindTemplate || GetKind(template) == ValKindList {
		var ts []interface{}
		if GetKind(template) == ValKindTemplate {
			ts = template[FieldObjects].([]interface{})
		} else if GetKind(template) == ValKindList {
			ts = template[FieldItems].([]interface{})
		}
		var objs Objects
		for _, obj := range ts {
			parsedObj := obj.(Object)
			// Convert interface{} keys to string keys.
			stringKeys := make(Object, len(parsedObj))
			for key, value := range parsedObj {
				stringKeys[key.(string)] = value
			}
			objs = append(objs, stringKeys)
		}
		return objs, nil
	}
	return Objects{template}, nil
}
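Stripped of the Go type assertions, the flattening rule is simple: a Template yields its `objects`, a List yields its `items`, and anything else is returned as a single object. A minimal Python sketch of that rule (the field names `kind`, `objects`, and `items` mirror what the Go constants are assumed to hold):

```python
def parse_objects(template):
    """Flatten a Template/List document into its member objects."""
    kind = template.get("kind")
    if kind == "Template":
        return list(template["objects"])
    if kind == "List":
        return list(template["items"])
    return [template]  # any other kind is returned as a single object

doc = {"kind": "Template", "objects": [{"kind": "Pod"}, {"kind": "Service"}]}
objs = parse_objects(doc)
```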
package meta

import (
	"time"

	"github.com/gogo/protobuf/proto"

	"github.com/messagedb/messagedb/meta/internal"
)

// RetentionPolicyInfo represents metadata about a retention policy.
type RetentionPolicyInfo struct {
	Name               string
	ReplicaN           int
	Duration           time.Duration
	ShardGroupDuration time.Duration
	ShardGroups        []ShardGroupInfo
}

// NewRetentionPolicyInfo returns a new instance of RetentionPolicyInfo with defaults set.
func NewRetentionPolicyInfo(name string) *RetentionPolicyInfo {
	return &RetentionPolicyInfo{
		Name:     name,
		ReplicaN: DefaultRetentionPolicyReplicaN,
		Duration: DefaultRetentionPolicyDuration,
	}
}

// ShardGroupByTimestamp returns the shard group in the policy that contains the timestamp.
func (rpi *RetentionPolicyInfo) ShardGroupByTimestamp(timestamp time.Time) *ShardGroupInfo {
	for i := range rpi.ShardGroups {
		if rpi.ShardGroups[i].Contains(timestamp) && !rpi.ShardGroups[i].Deleted() {
			return &rpi.ShardGroups[i]
		}
	}
	return nil
}

// ExpiredShardGroups returns the Shard Groups which are considered expired, for the given time.
func (rpi *RetentionPolicyInfo) ExpiredShardGroups(t time.Time) []*ShardGroupInfo {
	var groups []*ShardGroupInfo
	for i := range rpi.ShardGroups {
		if rpi.ShardGroups[i].Deleted() {
			continue
		}
		if rpi.Duration != 0 && rpi.ShardGroups[i].EndTime.Add(rpi.Duration).Before(t) {
			groups = append(groups, &rpi.ShardGroups[i])
		}
	}
	return groups
}

// DeletedShardGroups returns the Shard Groups which are marked as deleted.
func (rpi *RetentionPolicyInfo) DeletedShardGroups() []*ShardGroupInfo {
	var groups []*ShardGroupInfo
	for i := range rpi.ShardGroups {
		if rpi.ShardGroups[i].Deleted() {
			groups = append(groups, &rpi.ShardGroups[i])
		}
	}
	return groups
}

// marshal serializes to a protobuf representation.
func (rpi *RetentionPolicyInfo) marshal() *internal.RetentionPolicyInfo {
	pb := &internal.RetentionPolicyInfo{
		Name:               proto.String(rpi.Name),
		ReplicaN:           proto.Uint32(uint32(rpi.ReplicaN)),
		Duration:           proto.Int64(int64(rpi.Duration)),
		ShardGroupDuration: proto.Int64(int64(rpi.ShardGroupDuration)),
	}

	pb.ShardGroups = make([]*internal.ShardGroupInfo, len(rpi.ShardGroups))
	for i, sgi := range rpi.ShardGroups {
		pb.ShardGroups[i] = sgi.marshal()
	}

	return pb
}

// unmarshal deserializes from a protobuf representation.
func (rpi *RetentionPolicyInfo) unmarshal(pb *internal.RetentionPolicyInfo) {
	rpi.Name = pb.GetName()
	rpi.ReplicaN = int(pb.GetReplicaN())
	rpi.Duration = time.Duration(pb.GetDuration())
	rpi.ShardGroupDuration = time.Duration(pb.GetShardGroupDuration())

	rpi.ShardGroups = make([]ShardGroupInfo, len(pb.GetShardGroups()))
	for i, x := range pb.GetShardGroups() {
		rpi.ShardGroups[i].unmarshal(x)
	}
}

// clone returns a deep copy of rpi.
func (rpi RetentionPolicyInfo) clone() RetentionPolicyInfo {
	other := rpi

	if rpi.ShardGroups != nil {
		other.ShardGroups = make([]ShardGroupInfo, len(rpi.ShardGroups))
		for i := range rpi.ShardGroups {
			other.ShardGroups[i] = rpi.ShardGroups[i].clone()
		}
	}

	return other
}
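The expiry rule in `ExpiredShardGroups` is worth spelling out: a group is expired only when the policy has a nonzero retention duration and the group's end time plus that duration lies before the reference time; deleted groups are skipped. A minimal Python sketch of the same predicate (the dict-based records are illustrative, not the Go types):

```python
from datetime import datetime, timedelta

def expired_groups(groups, duration, now):
    # Expired: not deleted, policy duration nonzero, end + duration before now.
    return [g for g in groups
            if not g["deleted"] and duration and g["end"] + duration < now]

now = datetime(2020, 1, 10)
groups = [
    {"end": datetime(2020, 1, 1), "deleted": False},  # past retention window
    {"end": datetime(2020, 1, 9), "deleted": False},  # still retained
    {"end": datetime(2020, 1, 1), "deleted": True},   # skipped: already deleted
]
old = expired_groups(groups, timedelta(days=7), now)
```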
// Reads and parses cell voltages from LTC6804 registers into 'cell_codes' variable.
uint8_t LTC6804_rdcv(uint8_t reg, uint8_t total_ic, uint16_t cell_codes[][12], uint8_t addr_first_ic)
{
    const uint8_t NUM_CELLVOLTAGES_IN_REG = 3;
    const uint8_t NUM_BYTES_IN_REG = 6;
    const uint8_t NUM_RX_BYTES = 8;

    uint8_t *cell_data;
    int8_t pec_error = 0;
    static uint8_t pec_error_location[4][4];
    uint16_t parsed_cell;
    uint16_t received_pec;
    uint16_t data_pec;
    uint8_t data_counter = 0;

    cell_data = (uint8_t *) malloc((NUM_RX_BYTES * total_ic) * sizeof(uint8_t));

    if (reg == 0)  // read all four cell voltage register groups
    {
        for (uint8_t cell_reg = 1; cell_reg < 5; cell_reg++)
        {
            data_counter = 0;
            LTC6804_rdcv_reg(cell_reg, total_ic, cell_data, addr_first_ic);
            for (uint8_t current_ic = 0; current_ic < total_ic; current_ic++)
            {
                for (uint8_t current_cell = 0; current_cell < NUM_CELLVOLTAGES_IN_REG; current_cell++)
                {
                    // Cell codes are transmitted least-significant byte first.
                    parsed_cell = cell_data[data_counter] + (cell_data[data_counter + 1] << 8);
                    cell_codes[current_ic][current_cell + ((cell_reg - 1) * NUM_CELLVOLTAGES_IN_REG)] = parsed_cell;
                    data_counter = data_counter + 2;
                }
                received_pec = (cell_data[data_counter] << 8) + cell_data[data_counter + 1];
                data_pec = pec15_calc(NUM_BYTES_IN_REG, &cell_data[current_ic * NUM_RX_BYTES]);
                if (received_pec != data_pec)
                {
                    pec_error = 1;
                    pec_error_location[cell_reg - 1][current_ic]++;
                    Serial.print("\nErrors:");
                    for (uint8_t ii = 0; ii < 4; ii++)
                    {
                        for (uint8_t jj = 0; jj < 4; jj++)
                        {
                            Serial.print(" " + String(pec_error_location[jj][ii]));
                        }
                    }
                }
                data_counter = data_counter + 2;  // skip past the PEC bytes
            }
        }
    }
    else  // read a single register group
    {
        LTC6804_rdcv_reg(reg, total_ic, cell_data, addr_first_ic);
        for (uint8_t current_ic = 0; current_ic < total_ic; current_ic++)
        {
            for (uint8_t current_cell = 0; current_cell < NUM_CELLVOLTAGES_IN_REG; current_cell++)
            {
                parsed_cell = cell_data[data_counter] + (cell_data[data_counter + 1] << 8);
                cell_codes[current_ic][current_cell + ((reg - 1) * NUM_CELLVOLTAGES_IN_REG)] = 0x0000FFFF & parsed_cell;
                data_counter = data_counter + 2;
            }
            received_pec = (cell_data[data_counter] << 8) + cell_data[data_counter + 1];
            data_pec = pec15_calc(NUM_BYTES_IN_REG, &cell_data[current_ic * NUM_RX_BYTES]);
            if (received_pec != data_pec)
            {
                pec_error = 1;
            }
            data_counter = data_counter + 2;  // skip past the PEC bytes
        }
    }
    free(cell_data);
    return (pec_error);
}
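Each received register group is eight bytes: six data bytes holding three little-endian 16-bit cell codes, followed by a two-byte PEC assembled high byte first. A minimal Python sketch of that framing (the sample frame and PEC value below are made up, not a real PEC15 result):

```python
def parse_reg_group(frame):
    """Split one 8-byte register group: 3 LE 16-bit cell codes + BE PEC."""
    cells = [frame[i] | (frame[i + 1] << 8) for i in range(0, 6, 2)]
    received_pec = (frame[6] << 8) | frame[7]
    return cells, received_pec

frame = bytes([0x10, 0x27, 0x00, 0x00, 0xFF, 0x00, 0xAB, 0xCD])
cells, pec = parse_reg_group(frame)  # cells[0] == 0x2710 == 10000 counts
```

At the LTC6804's 100 µV per LSB, a code of 10000 corresponds to a cell voltage of 1.0000 V.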
/* =============================================================================
 * preprocessor_convertURNHex
 * -- Translate % hex escape sequences
 * =============================================================================
 */
void
preprocessor_convertURNHex (char* str)
{
    char* src = str;
    char* dst = str;
    char c;

    while ((c = *src) != '\0') {
        if (c == '%') {
            char hex[3];
            hex[0] = (char)tolower((int)*(src + 1));
            assert(hex[0]);
            hex[1] = (char)tolower((int)*(src + 2));
            assert(hex[1]);
            hex[2] = '\0';
            int i;
            int n = sscanf(hex, "%x", &i);
            assert(n == 1);
            src += 2;
            *src = (char)i;
        }
        *dst = *src;
        src++;
        dst++;
    }

    *dst = '\0';
}
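The C routine above decodes percent escapes in place by overwriting the string as it scans. The same %XX translation, expressed out-of-place in Python for clarity:

```python
def convert_urn_hex(s):
    """Translate %XX escape sequences, as the C routine does in place."""
    out, i = [], 0
    while i < len(s):
        if s[i] == "%":
            out.append(chr(int(s[i + 1:i + 3], 16)))  # decode two hex digits
            i += 3
        else:
            out.append(s[i])
            i += 1
    return "".join(out)
```

The stdlib `urllib.parse.unquote` performs an equivalent decoding for URL-encoded strings.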
    /**
     * Appends the combined random art entries for the provided keys
     *
     * @param <A>       The {@link Appendable} output writer
     * @param session   The {@link SessionContext} for invoking this load command - may be {@code null}
     *                  if not invoked within a session context (e.g., offline tool or session unknown).
     * @param sb        The writer
     * @param separator The separator to use between the arts - if empty char ('\0') then no separation is done
     * @param provider  The {@link KeyIdentityProvider} - ignored if {@code null} or has no keys to provide
     * @return The updated writer instance
     * @throws Exception If failed to extract or write the entries
     * @see #generate(SessionContext, KeyIdentityProvider)
     * @see #combine(Appendable, char, Collection)
     */
    public static <A extends Appendable> A combine(
            SessionContext session, A sb, char separator, KeyIdentityProvider provider)
            throws Exception {
        return combine(sb, separator, generate(session, provider));
    }
/// When the user clicks on one of the exercises, this shows all assignments for that exercise.
/// # Arguments
/// path is ```/manage/exercise/{{exercise.id}}```
/// data is my state of the app
pub async fn all_assignments_for_exercise(
    path: web::Path<String>,
    data: web::Data<State>,
) -> HttpResult {
    let id = parse_path(&path.into_inner())?;
    let client = &data.db_pool.get().await?;
    let stmt = client
        .prepare(
            r#"
    SELECT a.assignment_name as name, script_type, e.description as exercise_name,
           a.description, a.uuid, a.active
    FROM assignment a
    INNER JOIN exercise e ON a.exercise_id = e.id
    WHERE e.id = $1
    ORDER BY a.active = FALSE, name"#,
        )
        .await?;
    let rows = client.query(&stmt, &[&id]).await?;
    let exercise_name = get_exercise_description_for_id(&data.db_pool, id).await?;
    let assignments: Vec<AssignmentExercise> = rows_into(rows);

    let mut context = tera::Context::new();
    context.insert("assignments", &assignments);
    context.insert("exercise_name", &exercise_name);
    render_template(&TEMPLATES, "assignments_list.html", &context)
}
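The `ORDER BY a.active = FALSE, name` clause sorts active assignments first (the boolean expression is false for them, and false sorts before true), then alphabetically by name within each group. The same ordering expressed in Python over illustrative records:

```python
assignments = [
    {"name": "b", "active": False},
    {"name": "c", "active": True},
    {"name": "a", "active": True},
]
# "not active" is False for active rows, so they sort first; name breaks ties.
ordered = sorted(assignments, key=lambda a: (not a["active"], a["name"]))
names = [a["name"] for a in ordered]  # ["a", "c", "b"]
```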
n, k, x = map(int, input().split())
arr = list(map(int, input().split()))

# def collapse(arr):
#     if len(arr) < 3:
#         return 0
#     start = 0
#     currentNumber = arr[0]
#     counter = 1
#     for i in range(1, len(arr), 1):
#         if i == len(arr) - 1 and arr[i] == currentNumber:
#             counter += 1
#         if i != len(arr) - 1 and arr[i] == currentNumber:
#             counter += 1
#         else:
#             if counter >= 3:
#                 if start != 0 and start != len(arr) - counter:
#                     return counter + collapse(arr[:start] + arr[start + counter:])
#                 else:
#                     return counter
#             else:
#                 counter = 1
#                 start = i
#                 currentNumber = arr[i]
#     return 0


def collapse(arr):
    if len(arr) < 3:
        return 0
    start = len(arr)
    end = 0
    while True:
        newStart = 0
        newEnd = 0
        currentNumber = 101
        for i in range(0, len(arr), 1):
            if i >= start and i <= end:
                continue
            if i == 0:
                currentNumber = arr[0]
                counter = 1
            else:
                if i == len(arr) - 1 and arr[i] == currentNumber:
                    counter += 1
                if i != len(arr) - 1 and arr[i] == currentNumber:
                    counter += 1
                else:
                    if counter >= 3:
                        if i == len(arr) - 1 and arr[i] == currentNumber:
                            newEnd = i
                        else:
                            newEnd = i - 1
                        break
                    else:
                        counter = 1
                        newStart = i
                        currentNumber = arr[i]
        if newStart >= start or newEnd <= end:
            break
        else:
            start = newStart
            end = newEnd
    return max(0, end - start + 1)


currentMax = 0
for i in range(0, n - 1):
    if arr[i] == arr[i + 1] and arr[i] == x:
        maxDestroy = 0
        # Destroying the head and tail doesn't create anything new
        if i != 0 and i != n - 2:
            maxDestroy = collapse(arr[:i] + arr[i + 2:]) + 2
        else:
            maxDestroy = 2
        currentMax = max(currentMax, maxDestroy)
print(currentMax)
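The `collapse` helper above tracks a single growing destroyed range with index bookkeeping. A conceptually simpler way to express cascade destruction — repeatedly delete any run of three or more equal adjacent values and count what was removed — is sketched below. Note this is an illustrative re-implementation with slightly different semantics (it counts every run destroyed anywhere on the board, not just the chain adjacent to one removal), so it is not a drop-in replacement for the contest code:

```python
def collapse_count(seq):
    """Total values destroyed by repeatedly removing runs of length >= 3."""
    seq = list(seq)
    destroyed = 0
    changed = True
    while changed:
        changed = False
        i = 0
        while i < len(seq):
            j = i
            while j < len(seq) and seq[j] == seq[i]:
                j += 1
            if j - i >= 3:
                destroyed += j - i
                del seq[i:j]     # runs on either side may now merge
                changed = True
            else:
                i = j
    return destroyed
```

For `[1, 2, 2, 2, 1, 1]` the run of 2s collapses first, the surrounding 1s then merge into a run of three and collapse too, so all six values are destroyed.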
The New York Times has a story out on how San Diego police use mobile facial recognition devices in the field, including potentially on non-consenting residents who aren’t suspected of a crime. One account from a retired firefighter is especially alarming: Stopped by the police after a dispute with a man he said was a prowler, he was ordered to sit on a curb, he said, while officers took his photo with an iPad and ran it through the same facial recognition software. The officers also used a cotton swab to collect a DNA sample from the inside of his cheek… “I was thinking, ‘Why are you taking pictures of me, doing this to me?’ ” said Mr. Hanson, 58, who has no criminal record. “I felt like my identity was being stolen. I’m a straight-up, no lie, cheat or steal guy, and I get treated like a criminal.” The story confirms concerns EFF raised two years ago, when we obtained a stack of records from the San Diego Association of Governments (SANDAG) about the regional facial recognition program it manages called “Tactical Identification System” or TACIDS, for short. Under a federally funded pilot program, law enforcement agencies around San Diego County were provided with smartphones that could run photos taken in the field against the sheriff’s mugshot database. Although the draft policy called for police to obtain consent before taking a photo, anecdotal testimony indicated that officers may be using it on certain people simply because their “spidy senses” [sic] were tingling. The latest version of the policy, which was finalized in February 2015 [PDF], does not even mention the issue of consent, saying that police should primarily use it when they believe someone who is lawfully detained is being deceptive or evasive about their identity. On Twitter, San Diego Police Department immediately challenged many elements reported by the New York Times, which in turn updated the piece with some corrections. 
However, there is one way to get the facts: SDPD can move quickly to respond to a public records request filed last week for a long list of documents associated with this program.

San Diego’s facial recognition system is one of many programs around the country that we are targeting through a crowd-sourced information-gathering endeavor. As part of EFF’s new Street Level Surveillance project, EFF has teamed up with MuckRock to file public records requests around the country regarding law enforcement use of mobile biometric technology, including face recognition, fingerprint analysis, iris scanning, Rapid DNA, and tattoo identification. We are in the process of submitting more than 200 requests around the country with agencies nominated by our supporters, including several in San Diego County.

We have already received records from two agencies elsewhere in the country: In 2008, the San Jose Police Department signed a $961,000 contract with 3i Infotech to develop a mobile identification technology system that would include fingerprint analysis and mugshot database searches. Two purchase orders show that SJPD paid another company, Mobizent, $195,000 for 22 mobile fingerprinting devices in 2010-2011.

Denver police provided us with a report [PDF] dated February 2015 that provides an overview of a mobile fingerprinting pilot project. According to the report, the technology worked with 99% accuracy, provided verification in under 30 seconds, and police only required an hour of training to become proficient with the devices. “Officers firmly stated they did not want us to take the readers away from them,” the report said, and it listed several case studies in which police were able to identify gang members, car thieves, and a sex offender. Denver’s policies, as of 2014, state that if a person has not been arrested or otherwise lawfully detained, police need to obtain consent before using the fingerprint reader.
The policies also forbid use in “random or general investigative or intelligence gathering,” or during the issuance of a civil marijuana citation. Police are also not allowed to use the technology on people they believe to be juveniles. We’re filing new requests every day and expect responsive documents (and request rejections) to steadily stream in over the next few weeks. You can still nominate an agency through our online form, file your own request, or follow requests already being processed through MuckRock’s page.
Lifestyle drugs -- chiefly Viagra -- are costing General Motors $17 million a year and the cost is passed along to car, truck and SUV consumers. The blue pill is covered under GM's labor agreement with United Auto Workers, as well as benefit plans for salaried employees.

GM executives estimate health care adds $1,500 to the price of each vehicle but they do not break out how much of the premium is caused by erectile dysfunction expenses. GM provides health care for 1.1 million employees, retirees and dependents and is the world's largest private purchaser of Viagra. GM recently raised the co-pay for erectile dysfunction drugs to $18 under a new agreement with the UAW and the company has also pared benefits for salaried workers.

The automaker spends almost $5.6 billion each year on health care. While lifestyle drugs are a small fraction of the total medical bill, every health care expense is added into the price of every new vehicle and is a drag on the struggling goliath's earnings. Given the large number of aging autoworkers in the U.S., the industry's Viagra tab and bill for other erectile dysfunction drugs are certain to continue rising.

Neither Ford nor Chrysler will disclose the amount spent on erectile dysfunction drugs. While many government and company health plans have eliminated impotence drugs from coverage plans, GM has more than two retirees for every active worker on its rolls and must negotiate eliminating the drugs from the union health plan with the UAW.
Let us now address the greatest American mystery at the moment: what motivates the supporters of Republican presidential candidate Donald Trump? I call it a “mystery” because the working-class white people who make up the bulk of Trump’s fan base show up in amazing numbers for the candidate, filling stadiums and airport hangars, but their views, by and large, do not appear in our prestige newspapers. On their opinion pages, these publications take care to represent demographic categories of nearly every kind, but “blue-collar” is one they persistently overlook. The views of working-class people are so foreign to that universe that when New York Times columnist Nick Kristof wanted to “engage” a Trump supporter last week, he made one up, along with this imaginary person’s responses to his questions. When members of the professional class wish to understand the working-class Other, they traditionally consult experts on the subject. And when these authorities are asked to explain the Trump movement, they always seem to zero in on one main accusation: bigotry. Only racism, they tell us, is capable of powering a movement like Trump’s, which is blowing through the inherited structure of the Republican party like a tornado through a cluster of McMansions. Trump himself provides rather excellent evidence for this finding. The man is an insult clown who has systematically gone down the list of American ethnic groups and offended them each in turn. He wants to deport millions upon millions of undocumented immigrants. He wants to bar Muslims from visiting the United States. He admires various foreign strongmen and dictators, and has even retweeted a quote from Mussolini. This gold-plated buffoon has in turn drawn the enthusiastic endorsement of leading racists from across the spectrum of intolerance, a gorgeous mosaic of haters, each of them quivering excitedly at the prospect of getting a real, honest-to-god bigot in the White House. 
All this stuff is so insane, so wildly outrageous, that the commentariat has deemed it to be the entirety of the Trump campaign. Trump appears to be a racist, so racism must be what motivates his armies of followers. And so, on Saturday, New York Times columnist Timothy Egan blamed none other than “the people” for Trump’s racism: “Donald Trump’s supporters know exactly what he stands for: hatred of immigrants, racial superiority, a sneering disregard of the basic civility that binds a society.” Stories marveling at the stupidity of Trump voters are published nearly every day. Articles that accuse Trump’s followers of being bigots have appeared by the hundreds, if not the thousands. Conservatives have written them; liberals have written them; impartial professionals have written them. The headline of a recent Huffington Post column announced, bluntly, that “Trump Won Super Tuesday Because America is Racist.” A New York Times reporter proved that Trump’s followers were bigots by coordinating a map of Trump support with a map of racist Google searches. Everyone knows it: Trump’s followers’ passions are nothing more than the ignorant blurtings of the white American id, driven to madness by the presence of a black man in the White House. The Trump movement is a one-note phenomenon, a vast surge of race-hate. Its partisans are not only incomprehensible, they are not really worth comprehending. * * * Or so we’re told. Last week, I decided to watch several hours of Trump speeches for myself. I saw the man ramble and boast and threaten and even seem to gloat when protesters were ejected from the arenas in which he spoke. I was disgusted by these things, as I have been disgusted by Trump for 20 years. But I also noticed something surprising. In each of the speeches I watched, Trump spent a good part of his time talking about an entirely legitimate issue, one that could even be called leftwing. Yes, Donald Trump talked about trade. 
In fact, to judge by how much time he spent talking about it, trade may be his single biggest concern – not white supremacy. Not even his plan to build a wall along the Mexican border, the issue that first won him political fame. He did it again during the debate on 3 March: asked about his political excommunication by Mitt Romney, he chose to pivot and talk about … trade. It seems to obsess him: the destructive free-trade deals our leaders have made, the many companies that have moved their production facilities to other lands, the phone calls he will make to those companies’ CEOs in order to threaten them with steep tariffs unless they move back to the US. Trump embellished this vision with another favorite leftwing idea: under his leadership, the government would “start competitive bidding in the drug industry”. (“We don’t competitively bid!” he marveled – another true fact, a legendary boondoggle brought to you by the George W Bush administration.) Trump extended the critique to the military-industrial complex, describing how the government is forced to buy lousy but expensive airplanes thanks to the power of industry lobbyists. Thus did he hint at his curious selling proposition: because he is personally so wealthy, a fact about which he loves to boast, Trump himself is unaffected by business lobbyists and donations. And because he is free from the corrupting power of modern campaign finance, famous deal-maker Trump can make deals on our behalf that are “good” instead of “bad”. The chance that he will actually do so, of course, is small. He appears to be a hypocrite on this issue as well as so many other things. But at least Trump is saying this stuff. All this surprised me because, for all the articles about Trump I had read in recent months, I didn’t recall trade coming up very often. Trump is supposed to be on a one-note crusade for whiteness.
Could it be that all this trade stuff is a key to understanding the Trump phenomenon? * * * Trade is an issue that polarizes Americans by socio-economic status. To the professional class, which encompasses the vast majority of our media figures, economists, Washington officials and Democratic powerbrokers, what they call “free trade” is something so obviously good and noble it doesn’t require explanation or inquiry or even thought. Republican and Democratic leaders alike agree on this, and no amount of facts can move them from their Econ 101 dream. To the remaining 80 or 90% of America, trade means something very different. There’s a video going around on the internet these days that shows a room full of workers at a Carrier air conditioning plant in Indiana being told by an officer of the company that the factory is being moved to Monterrey, Mexico, and that they’re all going to lose their jobs. As I watched it, I thought of all the arguments over trade that we’ve had in this country since the early 1990s, all the sweet words from our economists about the scientifically proven benevolence of free trade, all the ways in which our newspapers mock people who say that treaties like the North American Free Trade Agreement allow companies to move jobs to Mexico. Well, here is a video of a company moving its jobs to Mexico, courtesy of Nafta. This is what it looks like. The Carrier executive talks in that familiar and highly professional HR language about the need to “stay competitive” and “the extremely price-sensitive marketplace”. A worker shouts “Fuck you!” at the executive. The executive asks people to please be quiet so he can “share” his “information”. His information about all of them losing their jobs. * * * Now, I have no special reason to doubt the suspicion that Donald Trump is a racist. Either he is one, or (as the comedian John Oliver puts it) he is pretending to be one, which amounts to the same thing. 
But there is another way to interpret the Trump phenomenon. A map of his support may coordinate with racist Google searches, but it coordinates even better with deindustrialization and despair, with the zones of economic misery that 30 years of Washington’s free-market consensus have brought the rest of America.

It is worth noting that Trump is making a point of assailing that Indiana air conditioning company from the video in his speeches. What this suggests is that he’s telling a tale as much about economic outrage as it is a tale of racism on the march. Many of Trump’s followers are bigots, no doubt, but many more are probably excited by the prospect of a president who seems to mean it when he denounces our trade agreements and promises to bring the hammer down on the CEO that fired you and wrecked your town, unlike Barack Obama and Hillary Clinton.

Here is the most salient supporting fact: when people talk to white, working-class Trump supporters, instead of simply imagining what they might say, they find that what most concerns these people is the economy and their place in it. I am referring to a study just published by Working America, a political-action auxiliary of the AFL-CIO, which interviewed some 1,600 white working-class voters in the suburbs of Cleveland and Pittsburgh in December and January.

Support for Donald Trump, the group found, ran strong among these people, even among self-identified Democrats, but not because they are all pining for a racist in the White House. Their favorite aspect of Trump was his “attitude”, the blunt and forthright way he talks. As far as issues are concerned, “immigration” placed third among the matters such voters care about, far behind their number one concern: “good jobs / the economy”. “People are much more frightened than they are bigoted,” is how the findings were described to me by Karen Nussbaum, the executive director of Working America.
The survey “confirmed what we heard all the time: people are fed up, people are hurting, they are very distressed about the fact that their kids don’t have a future” and that “there still hasn’t been a recovery from the recession, that every family still suffers from it in one way or another.” Tom Lewandowski, the president of the Northeast Indiana Central Labor Council in Fort Wayne, puts it even more bluntly when I asked him about working-class Trump fans. “These people aren’t racist, not any more than anybody else is,” he says of Trump supporters he knows. “When Trump talks about trade, we think about the Clinton administration, first with Nafta and then with [Permanent Normal Trade Relations] China, and here in Northeast Indiana, we hemorrhaged jobs.” “They look at that, and here’s Trump talking about trade, in a ham-handed way, but at least he’s representing emotionally. We’ve had all the political establishment standing behind every trade deal, and we endorsed some of these people, and then we’ve had to fight them to get them to represent us.” Now, let us stop and smell the perversity. Left parties the world over were founded to advance the fortunes of working people. But our left party in America – one of our two monopoly parties – chose long ago to turn its back on these people’s concerns, making itself instead into the tribune of the enlightened professional class, a “creative class” that makes innovative things like derivative securities and smartphone apps. The working people that the party used to care about, Democrats figured, had nowhere else to go, in the famous Clinton-era expression. The party just didn’t need to listen to them any longer. What Lewandowski and Nussbaum are saying, then, should be obvious to anyone who’s dipped a toe outside the prosperous enclaves on the two coasts. 
Ill-considered trade deals and generous bank bailouts and guaranteed profits for insurance companies but no recovery for average people, ever – these policies have taken their toll. As Trump says, “we have rebuilt China and yet our country is falling apart. Our infrastructure is falling apart … Our airports are, like, Third World.” Trump’s words articulate the populist backlash against liberalism that has been building slowly for decades and may very well occupy the White House itself, whereupon the entire world will be required to take seriously its demented ideas. Yet still we cannot bring ourselves to look the thing in the eyes. We cannot admit that we liberals bear some of the blame for its emergence, for the frustration of the working-class millions, for their blighted cities and their downward spiraling lives. So much easier to scold them for their twisted racist souls, to close our eyes to the obvious reality of which Trumpism is just a crude and ugly expression: that neoliberalism has well and truly failed. Thomas Frank is the author of Listen, Liberal or Whatever Happened to the Party of the People, published 15 March by Metropolitan Books
This show has been commercially released as "Europe '72: The Complete Recordings - All The Music Edition"

Set 1
Cold Rain And Snow
Me And Bobby McGee
Chinatown Shuffle
China Cat Sunflower ->
I Know You Rider
Jack Straw
He's Gone
Next Time You See Me
Black Throated Wind

Set 2
Casey Jones
Playing In The Band
Sugaree
It Hurts Me Too
Ramble On Rose
El Paso
Big Railroad Blues
Truckin'

Set 3
Dark Star ->
Sugar Magnolia ->
Caution (Do Not Stop On Tracks)
Johnny B. Goode

Encore
One More Saturday Night

Notes: Digitally Remastered using the Bertha Digital Audio Workstation by [email protected]. Completed April 14, 2006. This was a terrible transfer with more than 100 digi-snits throughout the show. More than 16 hours was spent fixing most of them. There are a few small ones that remain. A couple of methods were used in fixing the digi-snits, including manually editing individual samples. Bobby says "See you later, bye bye" at the end of One More Saturday Night. Frequency and spectral analysis shows the signature to be the same as the rest of the show. Though it may appear out of order, it is believed this is due to the show being partially broadcast on French TV and this may have been the end of the TV part of the show. Between-song tuning and banter was removed by someone and is not part of the original release. There appears to be tape machine noise between each song on the original. These small bips of noise were removed. This is a fantastic show with absolutely wonderful sound. Enjoy! DVD audio extraction through flac compression by Gary Field 4/20/2006. Thanks to Jay.

Reviews

Reviewer: c-freedom - 5 stars - November 25, 2018 - Subject: Married me a wife
While I will always check out the in-between song banter for a show it is also nice to have a more streamlined copy. The band's sound seems to be richer and fuller with each show.
Pretty ironic that a 9 song 1st set seems condensed after previous shows for this tour. This is a solid recording. I am at the China>Rider and everything to this point has been wide open from the get go and of course this pairing gives the band its first opportunity to hit its full stride. Set 1 The Rain & Snow opener had a real nice flow. It was always heartwarming live. it seemed to get played when I least expected it probably because it wasn't in a very regular rotation. Me & Bobby-again this tune is straightforward and so perfect for the band. China town Shuffle- PIG is just on top of his tunes.And for me it is these stand alone songs that are the highlight of Pig's last full tour. The Rider almost sounded like the LP version Jack Straw-Sweet and UPbeat! "You keep us on the run" The very first STEAL YOUR FACE. Fast tempo with a nice jump to it The jam gets a little exploration Mainly Garcia but Phil lays down counterpoint. Could use some DJG. No call & response yet which keeps the maiden voyage under 7 minutes. Lied, Cheated, wrong Doing and Pig on the harp. what was that old number IF LOViN YOU IS WRONG I DON'T WANT TO BE RIGHT. Monitors & Messing with the HEAD. and the Merry go round and round "I use to lay awake at night thinking, am i going down , down" BTWind-How many times did this function as a set closer? A plodding version. ie DOWNbeat. Almost gets off the ground? "Going back home, That's what I am going to do" Jerry announces a break but i can't help but wonder if they still went out with Casey Jones Set 2 Playin'-The return of Donna Jean "just a little nervous from the fall" She comes in strong for The 'Standing on a Tower' verse The jams on these 10 minute Europe 72 PITB are absolutely stunning. Donna all in on the Reprise Jerry just all over that tune. Sugaree is so mellow it tends to get past me all of a sudden I find myself at Saturday Night which for sure they wanted to put up for the broadcast. Weir kills his voice on European T.V. 
So Weir says goodbye, which I assume is to the TV. A very bluesy Hurts Me Too. Dang, this is a thick tune. It is this contrast that gives the upbeat material such soaring HIGHs. "Little Girl, Oh I love you" Soulful Harp Set 3 I can't recapture the over-the-top review I did in 2016 of this show but suffice it to say that this is a peach. They finally nail the transition from: Dark Star > Sugar Magnolia "Takin' it easy!"

Reviewer: bluestones - 5 stars - August 22, 2018 - Subject: A Beautiful Dark Star
My personal favorite too. Don't know why it doesn't get the accolades it deserves. I agree with one of the previous reviewers, the final jam does sound a little bit sleepy but that just makes the grogginess of Caution sound kinda swampy and spooky.

Reviewer: Mind Wondrin - 4 stars - February 9, 2016 - Subject: Theme Park Deuce
2nd date at an amusement park, the day after squeezing in a cafeteria appearance at a nearby college. While not as great overall as the 1st show here, two days prior, that's just 'cause that one's an all-time great. It's still one of the top ten shows from E72, and that's tough competition. Aired on Danish television. First Set. Standard fare until a perfect example of the '72 China>Rider. This is right before Jer started singing on Jack Straw. The first He's Gone is way uptempo (why this night and not in London?) and the bridge hadn't been finished yet. It worked so well that they added it to 10 of the remaining 16 nights - even without the middle eight. Next Time You See Me is a perfect version. As the band jokes around with levels Keith adds a quick Merry-Go-Round Broke Down. Black-Throated Wind is a clunky, underwritten song, but Jer and Keith both find something to add to a tight version. Second Set. This is a definitive version of Casey Jones with crazy interplay from everybody. Mr.
Charlie was only skipped on tour at the following Beat Club taping, and many are good, but not all are this good. Playin' has a good tempo and goes off the deep end F-A-S-T; even at 10 minutes it's concise and powerful. Sugaree is tight, but '72 is not the year for this one. It's a raucous, perfect One More Saturday Night - an almost-nightly usual set closer. Bobby actually sings the end, rather than the concomitant shred voice™. The previous Tivoli has the better Hurts Me Too - on this tour mere above average becomes mediocre! Ramble On Rose is uptempo, with bent double-stops-galore from Bobby and ragtime from Keith! (It also has a Merry-Go-Round). Big Railroad may be known for the masks they donned but it's also a best-ever. Jer plays a unique rhythm figure. The count-in is clipped on the box set version of Truckin', which seems a standard run-through but erupts into a screamer jam. It's as good or better than 4/11 and better even than 5/26. Third Set. On a tour with several 40min Dark Stars, a 31-minute one isn't unusual. It's better than most of the longer ones, though not one of the top 72s. This is, however, the longest Caution by far - which doesn't mean more happens - in fact they sound tired in this set. This was the only Johnny B. Goode of E72. 1st Set : A 2nd Set : A+ 3rd Set : C+ Overall = 4 Stars Highlights: China Cat/Rider - one of the best of '72 He's Gone - only time they nailed a premier and the best pre-bridge version Next Time You See Me - perfect Pig Casey Jones - definitive Mr. Charlie - even played every night this stands out Playing in the Band - fast, nailed One More Saturday Night - mid-set rave up Ramble On Rose - punchy and check Bobby El Paso - very on Big Railroad Blues - classic Truckin' - one of the best few from the 17 on tour SOURCES: Aside from a great Bertha-Ashley, there's the official release (Big Railroad is also on the vinyl version of E72v2). 
The entire show was also produced for Kongeriget Danmark Television (the next show was also shot, for Deutsch TV). Euro TV aired the show repeatedly over the years in Benelux, France, Germany (BRD), & Denmark, though the live show had different songs than the repeat version.

Reviewer: kriddaz - 5 stars - January 13, 2016 - Subject: Amazing recordings from the day I was born!
Does anyone happen to know what TIME they recorded this show? I'd love to figure out what song (if any) was being played at the exact moment of my delivery. Come to think of it "He's Gone" would be absolutely perfect, an homage to my mother's womb! I've listened and watched the Copenhagen recordings many times. 5-Stars!

Reviewer: cfinnstl - 5 stars - December 23, 2015 - Subject: Wow, this recording is top-notch!
And the show is on fire! A++ all around. Many thanks for your efforts!

Reviewer: cb18201 - 4 stars - November 8, 2013 - Subject: .
12-17-08

Reviewer: Dhfalcon - 5 stars - April 15, 2011 - Subject: If I had a Time Machine...
I'd definitely check this one out live. (and then stick around for all of the Europe tour!) :-)

Reviewer: CosmicGeordie - 5 stars - September 29, 2010 - Subject: Echoes of Europe '72…but really live
The sound/music quality on this is so good I feel like I'm listening to the much overdubbed E-'72 album. Love this period of the band and agree that this show is a high water mark for the year.
Enjoy

Reviewer: Cliff Hucker - 5 stars - April 17, 2010 - Subject: A Scandinavian fairy tail
An outstanding performance, including some great playing by Weir, two sets of which are immortalized in an excellent video from a European television broadcast. It features superb sound quality and a gorgeous soaring Dark Star with a beautiful segue into Sugar Magnolia. Although the songs from the first two sets don't get stretched out much, perhaps due to the television broadcast, everything is particularly well played, particularly the China/Rider, an excellent Big Railroad Blues (the performance where the band wears the Bozo masks for this number), and a classic styled performance of Casey Jones. While this 30+ minute Dark Star is not quite as brilliant as some of the monumental efforts that follow, it is still a stunning effort, largely due to some great work by Keith. The pre-verse intro is simply beautiful. Right from the opening notes there is some beautiful interplay between Jerry and Keith, though it eventually breaks down. After several attempts, Jerry re-establishes the theme and pulls it back together. Immediately after the verse there is a lilting duet between Jerry and Billy, though this too is unsustained. Following some rumbling from Phil, this one briefly gets quiet, and then the magic starts. At about 24:00, Weir steers the band into a gorgeous and jazzy jam similar to WRS. It's amazingly sublime, the entire ensemble locking in on the theme for several minutes before a thrilling segue into Sugar Magnolia. An extraordinary performance of Caution follows, the beginning of which is particularly jazzy! This is a gem of a show! (97 pts)

Reviewer: BIG_R - 5 stars - February 16, 2010 - Subject: 4-17-72
Excellent both in performance and sound quality. Another beautiful remaster by Mr. Ashley.
Reviewer: sugareesalibi - 4 stars - June 4, 2009 - Subject: Thanks Jay!
First off, a huge thanks goes out to Jay who helped to make this show sound all shiny and new. I've edited sound before and it's a pain in the ass. The music contained within is pretty phenomenal. Jerry's guitar sounds great! A very unique "He's Gone", one of my favorites. The transition from Dark Star into Sugar Mag is absolutely beautiful. Good show in a foreign land. Enjoy the Dead!
ON THE UNIQUENESS OF SOLUTIONS OF STOCHASTIC VOLTERRA EQUATIONS We prove strong existence and uniqueness, as well as Hölder regularity, for a large class of stochastic Volterra equations with singular kernels and non-Lipschitz diffusion coefficient. Extending Yamada-Watanabe's theorem , our proof relies on an approximation of the process by a sequence of semimartingales with regularised kernels. We apply these results to the rough Heston model, with square-root diffusion coefficient, recently proposed in mathematical finance to model the volatility of asset prices. Introduction This paper deals with one-dimensional Stochastic Volterra Equations (SVE) of the following type: where x ∈ R, T > 0, b and σ are Borel-measurable functions and W is a Brownian motion on the canonical setup (Ω, F, {F_t}_{t∈T}, P). We prove strong existence and uniqueness of (1.1) for a large class of (singular) kernels, where σ is only locally 1/2-Hölder continuous. Although Itô stochastic integration covers this type of integrand as long as it belongs to L^2(Ω × T), the potential solution of (1.1) is in general not a semimartingale, because the quadratic variation can be infinite, for example in the case of the fractional Brownian motion with H < 1/2 . This prevents the use of Itô calculus. On the more practical side, the solution is non-Markovian, which, combined with the singularity of the kernel, restricts the use of classical numerical schemes. This property, though, is particularly relevant to modelling in fields where past dependence is observed. Volterra's seminal work was concerned with population growth models with memory. The deterministic integral equations now bearing his name subsequently blossomed in various fields including heat conduction, the spread of epidemics, engineering, viscoelasticity and hydrology; see and and the references therein. Models of chemical reactions also gave birth to SVEs as limits of branching processes.
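The display for equation (1.1) did not survive extraction. Based on the multidimensional analogue (4.2) stated later in the paper, the one-dimensional equation presumably reads:

```latex
X_t = x + \int_0^t K_1(t,s)\, b(s, X_s)\,\mathrm{d}s
        + \int_0^t K_2(t,s)\, \sigma(s, X_s)\,\mathrm{d}W_s ,
\qquad t \in [0,T],
```

where K_1, K_2 : [0,T]² → R are the (possibly singular) kernels discussed throughout.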
In such models, a catalytic super-Brownian motion arises from the study of the interaction between a reactant and a catalyst . Its density satisfies a stochastic partial differential equation which, under certain assumptions, reduces to a SVE of the type (1.1). This is explored in more detail, and serves as a motivation, in . Recently, empirical evidence has justified the use of SVEs and has required refined tools to make them tractable. This is particularly true in mathematical finance, where the volatility of asset prices, already known to be non-Markovian , has now been observed to feature memory properties well captured by SVEs . These papers also show that SVEs fit the behaviour of implied volatility remarkably well. Additionally, the study of high-frequency data carried out in revealed the roughness, in the sense of low Hölder regularity, of the observed time series of the instantaneous volatility. This combination showed that the log-volatility is modelled more accurately by a fractional Brownian motion (fBm) with small Hurst parameter H ≈ 0.1 than by a classical one (where H = 0.5). This precisely corresponds to a driftless version of (1.1) with constant diffusion σ. Since this seminal observation, more advanced results have enlarged this new class of rough volatility models, in particular showing that drift and diffusion should be state dependent. One important example is the rough Heston model introduced in , and studied further in , where the squared volatility satisfies a SVE with singular kernel and square-root diffusion. Existence and uniqueness in the weak sense was proved in using the deterministic theory of resolvents associated to the convolution kernel. However, strong existence and uniqueness was so far out of reach. Weak existence and uniqueness are sufficient in most mathematical finance applications, especially for pricing purposes. Asymptotic methods have been used extensively in order to obtain easy-to-use approximations of models, with a strong emphasis on large deviations methods.
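The rough Heston display is missing here. In the cited literature the squared volatility solves a SVE with the fractional (Riemann-Liouville) kernel and square-root diffusion, which, in the notation of Section 4 of this paper, would read:

```latex
X_t = x_0 + \frac{1}{\Gamma(H+\tfrac12)} \int_0^t (t-s)^{H-\frac12}\,
      \lambda\bigl(\theta(s) - X_s\bigr)\,\mathrm{d}s
    + \frac{\xi}{\Gamma(H+\tfrac12)} \int_0^t (t-s)^{H-\frac12}\,
      \xi^{-1}\sigma(s,X_s)\,\mathrm{d}W_s ,
\qquad \sigma(s,x) = \xi\sqrt{x}.
```

This is a reconstruction from the parameters (x₀, λ, θ, ξ, H) listed in Section 4, not a verbatim copy of the paper's equation.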
A successful approach for the latter was set out by Dupuis and Ellis , and requires strong existence and uniqueness. Our results here are therefore the first stone in building such a theory for this class of SVEs. Pathwise uniqueness is also a key ingredient in validating numerical schemes for such equations, and practical applications thereof cannot be fully justified without it . The classical existence and uniqueness results for SVEs with bounded kernels are due to Berger and Mizel , Protter , and Pardoux and Protter . In the latter, the coefficients are allowed to be anticipating, hence the integrals are interpreted in the Skorohod sense. This approach was also adopted by , while defines them through Malliavin calculus. The first existence and uniqueness result for singular kernels was derived in in the case of a linear diffusion coefficient. The general SVE case was studied by Coutin and Decreusefond in , who proved strong existence and uniqueness in a concise but elegant manner for singular kernels and Lipschitz coefficients. The authors circumvented the use of the Burkholder-Davis-Gundy (BDG) inequality, the standard tool in such schemes, which is not available in this context since the stochastic integral is not a local martingale. Instead, they relied on fractional calculus and exploited the embedding of Besov spaces into spaces of Hölder continuous functions. A slight extension, and a first proof for non-Lipschitz coefficients, can be found in , where the author uses the Bihari-LaSalle inequality. It is, however, much more delicate to consider coefficients that are only Hölder continuous. This is the case even for diffusions, as this feature prevents the direct use of a Grönwall-type inequality. In that regard, Yamada and Watanabe's pathwise uniqueness theorem is one of a kind, and its extension to SVEs is particularly challenging because the authors relied heavily on Itô calculus.
Mytnik and Salisbury's result seems to be the only one so far to achieve pathwise uniqueness for SVEs with Hölder continuous diffusion coefficient. Yet the full generality of (1.1) is not attained, as their drift is a deterministic bounded function and they only considered the Riemann-Liouville kernel. More importantly, σ is only allowed to be γ-Hölder continuous with γ ∈ (1/(2α), 1), which cannot reach the square-root function and becomes constraining in the rough case where α ≈ 0.6. It therefore remains at a respectable distance from the rough Heston model, for which strong uniqueness is still an open problem, as emphasised in . Finally, the assumptions used in to prove weak existence do not fully overlap with ours, and we additionally provide strong existence and uniqueness. Indeed, we extend Yamada-Watanabe's theorem under mild regularity assumptions on the kernels by approximating the solution of (1.1) by a sequence of semimartingales. The latter are designed with a regularised kernel K(t + ε, s) which avoids the singularity on the diagonal. The convergence takes place in L^p(Ω) for some p > 2 and is proved using the Hölder regularity of the solution, derived by Decreusefond through fractional and Malliavin calculus; this regularity result represents the cornerstone of our approach. Tanaka's formula can then be applied to the semimartingales and 'transferred' to the solution of the SVE by passing to the limit. In particular, we clarify the link between the regularity of the kernel and the Hölder continuity of the solution. The remainder of the paper is as follows: Section 2 gathers the definition of the model and sets the notation, recalling the essential tools needed here. The proofs of strong uniqueness and existence are contained in Section 3. Finally, Section 4 shows how our setup covers the rough Heston model, and presents an extension to the multidimensional case. Notations: The letter C will denote a constant that might change from line to line.
When needed, we indicate the parameters on which it depends. We shall further consider a time frame T of the form [0, T] for some T > 0. For any p ≥ 1 we write L^p = L^p(T) and L^p = L^p(Ω). Stochastic Volterra integrals 2.1. Regularity. We introduce the results obtained by Decreusefond , which still represent the state of the art for stochastic Volterra integrals with singular kernels. For a fixed time horizon T > 0, we call a kernel a map K : T² → R for which both ∫_0^t K(t, s)² ds and K(t, s) are finite for all t ∈ T and s ≠ t. The associated space A_K is defined accordingly. Hence, for all u ∈ A_K the stochastic integral is well defined for all t ∈ T in the Itô sense. We also need the following tools: • the Riemann-Liouville integral of f ∈ L¹, and • the Riemann-Liouville derivative, set to infinity otherwise. If α > 1/p, then I^{α,p} ⊂ C^{α−1/p}, the space of (α − 1/p)-Hölder continuous functions null at time 0. Given the space inclusions above, the assumption also implies precise Hölder regularity for the integral (2.1). Example 2.2. The operators associated to the following kernels satisfy Assumption 2.1: • the Riemann-Liouville kernel, with H ∈ (0, 1), which satisfies this assumption with α = H and any η < 2 ; • the fractional Brownian motion kernel, where F is the Gauss hypergeometric function, which satisfies this assumption with the same parameters as in 1. Decreusefond's main result yields the Hölder regularity of the stochastic Volterra integral : Theorem 2.3 (Decreusefond). If Assumption 2.1 holds, r = θ(η) and u ∈ A_K ∩ L^r(Ω × T), then M_K(u) has a measurable version M̃_K(u), which belongs to ⋂_{γ<α} I^{γ,r}. Thus M̃_K(u) is γ-Hölder continuous for all γ < α − 1/r, and, for any γ < α, the corresponding norm estimate holds. From now on, we only consider the measurable version of the stochastic integral. We also prove the following inequalities, replacing the BDG inequality: Lemma 2.4 (BDG-type inequality). Let p > 2, let K : T² → R be a kernel and let u be progressively measurable with M_K(u) ∈ L¹. Then the stated bound holds. Proof.
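The displays defining the two fractional operators were lost in extraction. The standard Riemann-Liouville definitions, which match the use made of them here, are, for α ∈ (0, 1):

```latex
(I^{\alpha} f)(t) = \frac{1}{\Gamma(\alpha)} \int_0^t (t-s)^{\alpha-1} f(s)\,\mathrm{d}s,
\qquad
(D^{\alpha} f)(t) = \frac{1}{\Gamma(1-\alpha)} \frac{\mathrm{d}}{\mathrm{d}t}
                    \int_0^t (t-s)^{-\alpha} f(s)\,\mathrm{d}s,
```

with the derivative set to infinity when f is not regular enough for the right-hand side to exist, as the text indicates.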
We first recall a useful property : if p and q are conjugate and g is integrable, the supremum representation of the L^p norm holds. Recall that, by assumption, ∫_0^t K(t, s)u(s) dW_s is integrable with respect to P. Now, for any simple φ ∈ L², the required calculations follow from the Cauchy-Schwarz inequality, the Itô isometry, and the Hölder inequality. Since L² is dense in L^{p/(p−1)}, the claim follows. 2.2. Semimartingale approximation. We now show how to approximate a stochastic Volterra integral by a semimartingale, following , for all s ∈ T. K is also triangular, i.e. K(t, s) = 0 for all s > t. All the kernels in Example 2.2 clearly satisfy this assumption. The first lemma exhibits the semimartingale property of the processes M_{K,ε}(u). Proof. The lemma follows from a straightforward application of the stochastic Fubini theorem . The next proposition proves the convergence of the integrals under integrability assumptions on the process u. Furthermore, it yields an explicit rate of convergence. Proposition 2.7. Let Assumptions 2.1 and 2.5 hold and let r = θ(η). Proof. (i) Notice first that, using the kernel's triangularity and the Itô isometry, the difference of the two integrals can be bounded. Now, using Lemma 2.4, we derive the required estimates. We recall from Theorem 2.3 that the Hölder bound holds for all γ ∈ (1/r, α). Hence, combining the previous inequalities, we obtain the rate, and the first claim follows since γ > 1/r. (ii) We know that Ku is also γ-Hölder continuous for u ∈ L^η. If u ∈ L^η(Ω × T), then the paths u(ω) are in L^η, hence Ku(ω) is also γ-Hölder continuous. Thus, by similar calculations, N_{K,ε}(u)_t converges to N_K(u)_t in L^r as ε tends to zero. Main results: strong solution This section displays the proof of our main theorem, so we naturally start by defining a strong solution in our context. If the SVE has a unique pathwise solution, we say that it is exact. This definition is standard . If (3.1) is not satisfied, then one can consider the solution up to the time of explosion.
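The regularised stochastic integral of Section 2.2 and its semimartingale decomposition were not displayed. A reconstruction consistent with the text (the ∂₁ notation for the derivative in the first variable is ours) is:

```latex
M^{K,\varepsilon}_t(u) := \int_0^t K(t+\varepsilon, s)\, u_s\,\mathrm{d}W_s ,
```

and, writing K(t+ε, s) = K(s+ε, s) + ∫_s^t ∂₁K(r+ε, s) dr and applying the stochastic Fubini theorem,

```latex
M^{K,\varepsilon}_t(u)
 = \int_0^t \Bigl( \int_0^r \partial_1 K(r+\varepsilon, s)\, u_s\,\mathrm{d}W_s \Bigr) \mathrm{d}r
 + \int_0^t K(s+\varepsilon, s)\, u_s\,\mathrm{d}W_s ,
```

which exhibits M^{K,ε}(u) as a finite-variation part plus a local martingale, i.e. a semimartingale, since the shifted kernel avoids the diagonal singularity.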
Inspired by the Yamada-Watanabe conditions, and after space localisation, we consider the following assumptions on the coefficients. Furthermore, there exists C_G > 0 such that, for all x ∈ R and all s ∈ T, the coefficients satisfy the linear growth condition |b(s, x)| + |σ(s, x)| ≤ C_G(1 + |x|). As mentioned in , the condition ∫_{0+} ρ(u)^{−2} du = ∞ cannot be weakened. In particular, the SDE with K₁ ≡ K₂ ≡ 1 and σ(s, x) = x^β with β < 1/2 has an infinite number of solutions. Our last assumption concerns the kernels and is satisfied by all the kernels presented in Example 2.2. The main theorem of this paper is as follows: Theorem 3.4. If the two kernels K₁ and K₂ satisfy Assumptions 2.1, 2.5 and 3.3 with the same parameters α and η, and the coefficients satisfy Assumption 3.2, then the stochastic Volterra equation (1.1) is exact. The proof of this theorem is split into three parts. We start by proving pathwise uniqueness of the solution under stronger assumptions in Proposition 3.6, where the core of the proof resides. Then, under the same assumptions, we show the existence of a strong solution in Proposition 3.8. Finally, we relax the assumptions by applying the localisation argument presented in . Hence let us also consider the following global assumptions: Assumption 3.5 (Global). There exists a continuous increasing function ρ : (0, ∞) → (0, ∞), with ρ(0) := 0 by continuity, such that ∫_{0+} ρ(u)^{−2} du = ∞, and a positive constant C_L such that the corresponding modulus conditions hold for all x, y ∈ R and all s ∈ T. Finally, there exists C_G > 0 such that the coefficients satisfy the linear growth condition for all x ∈ R and s ∈ T. 3.1. Uniqueness. The following proposition corresponds to Theorem 3.4, but with stronger assumptions, and is key to proving the theorem. Proposition 3.6. If the two kernels K₁ and K₂ satisfy Assumptions 2.1, 2.5 and 3.3 with the same parameters α and η, and the coefficients satisfy Assumption 3.5, then pathwise uniqueness holds for the stochastic Volterra equation (1.1).
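A concrete modulus satisfying the divergence condition of Assumption 3.5, and the one relevant for the square-root model of Section 4, is ρ(u) = √u:

```latex
\rho(u) = \sqrt{u}, \qquad
\int_{0^+} \rho(u)^{-2}\,\mathrm{d}u = \int_{0^+} \frac{\mathrm{d}u}{u} = \infty ,
```

whereas ρ(u) = u^β with β < 1/2 gives ∫_{0+} u^{−2β} du < ∞ near the origin, which is consistent with the non-uniqueness counterexample mentioned above.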
The crucial point here is to check the assumptions of Theorem 2.3, on which the semimartingale approximation of Proposition 2.7 depends. The integrability condition proved in the following lemma therefore serves two purposes: to derive the Hölder regularity of the solution and to allow the use of our convergence results. Lemma 3.7. Let X be a solution to (1.1), let Assumptions 2.1 and 3.3 hold for both kernels, and let the linear growth condition (3.4) hold for the coefficients. Denote r = θ(η); then X ∈ L^r(Ω × T). Moreover, X has γ-Hölder continuous paths for all γ < α − 1/r. This shows that the regularity of the solution is tied to the regularity of the kernels, in the sense of Assumption 2.1. We are now in a position to prove pathwise uniqueness. Proof of Proposition 3.6. 1) Let X and Y be two solutions of (1.1) on the same probability space. For any ε > 0, define the semimartingales {X^ε_t, t ∈ T} and {Y^ε_t, t ∈ T} with the regularised kernels. For all ε > 0, let Z^ε := X^ε − Y^ε, which is a continuous semimartingale by Lemma 2.6, and Z := X − Y. Hence Proposition 2.7 implies that, for all t ∈ T, Z^ε_t converges to Z_t. 2) Tanaka's formula for continuous semimartingales then applies to Z^ε. We now claim that, letting m tend to infinity, the monotone convergence theorem implies the divergence of the integral against ρ^{−2}. The local time function a → L^a_t(Z^ε) is right-continuous, and therefore the right-hand side diverges unless L^0_t(Z^ε) = 0, because of the behaviour of the function ρ around the origin in Assumption 3.5. We now prove that (3.8) is indeed finite, at least for all ω in the set Ω^ε, which we define now. 3) For any t ∈ T, introduce the measurable sets N^ε_t := {ω ∈ Ω : Z^ε_t(ω) = 0 and Z_t(ω) ≠ 0}, as well as N^ε := ⋃_{t∈T} N^ε_t and Ω^ε := Ω \ N^ε. Since Z^ε_t converges to Z_t in L² and both processes are continuous, there exists a subsequence (which we denote again by ε) such that Z^ε_t converges to Z_t for all t ∈ T almost surely.
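For reference, Tanaka's formula for the continuous semimartingale Z^ε (the display is missing above) reads:

```latex
(Z^\varepsilon_t)^+ = (Z^\varepsilon_0)^+
  + \int_0^t \mathbf{1}_{\{Z^\varepsilon_s > 0\}}\,\mathrm{d}Z^\varepsilon_s
  + \tfrac12\, L^0_t(Z^\varepsilon),
```

where L^0(Z^ε) denotes the local time of Z^ε at zero; the proof strategy is to show that this local-time term vanishes and then pass to the limit ε → 0.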
Hence, notice that by Fatou's lemma, and because lim sup_n 1_{A_n} = 1_{lim sup_n A_n}, the set N^ε is negligible. Going back to the estimate (3.8): on Ω^ε, either ρ(Z_u)² = 0 or ρ(Z^ε_u)^{−2} is finite for all u ∈ T, and therefore no blow-up can occur, so that the integral is finite and L^0_t(Z^ε)(ω) = 0 for all ε > 0 and all ω ∈ Ω^ε. 4) Finally, from (3.7), the Cauchy-Schwarz inequality and Assumption 3.5 yield an estimate whose first term tends to zero because Z^ε_t is integrable. Moreover, by (twice applying) dominated convergence, we finally conclude that E[Z^+_t] = 0 by Grönwall's inequality, i.e. X_t ≤ Y_t almost surely. Interchanging the roles of X and Y reverses the inequality and the claim follows. 3.2. Existence. As mentioned in the introduction, the kernels present in our SVE are too general for known weak existence results. Hence we undertake to prove the existence of a strong solution using the traditional Picard iteration and calculations similar to those of the uniqueness proof. Proposition 3.8. Under the same assumptions as Proposition 3.6, the SVE (1.1) has a strong solution. Some preliminaries are needed before getting to the proof of this result. We consider the Banach space L^r_T of all progressively measurable processes X such that ‖X‖_{L^r(Ω×T)} is finite, and the map X → I(X) from L^r_T to itself. Lemma 3.9. Under the same assumptions as Proposition 3.6, the map I is well defined from L^r_T to itself and E[sup_{t∈T} |I(X)_t|^r] < ∞ for any X ∈ L^r_T. Proof. By Theorem 2.3, there exists γ > 0 such that I(X) has γ-Hölder continuous paths. Thus there exists A ∈ L^r such that the Hölder bound holds for all t, t′ ∈ T and all ω ∈ Ω. In particular, sup_{t∈T} |I(X)(ω)_t|^r ≤ A(ω)T^{rγ}. This yields the existence of C_{T,r} > 0 such that E[sup_{t∈T} |I(X)_t|^r] ≤ C_{T,r}, and therefore I(X) ∈ L^r_T for all X ∈ L^r_T. We are now all set to prove the main result of this subsection.
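The defining display for the Picard map I did not survive extraction. Since fixed points of I are required to be solutions of (1.1), it is presumably:

```latex
I(X)_t := x + \int_0^t K_1(t,s)\, b(s, X_s)\,\mathrm{d}s
            + \int_0^t K_2(t,s)\, \sigma(s, X_s)\,\mathrm{d}W_s ,
\qquad t \in \mathrm{T},
```

so that X solves (1.1) exactly when X = I(X) almost surely, which is the fixed-point property established at the end of the existence proof.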
1) Thanks to Lemma 3.9 we can define by iteration the sequence, in L^r_T, X^(0) = x and X^(n+1) := I(X^(n)) for all n ≥ 0, such that X^(n) ∈ L^r_T for each n ≥ 0. We now prove that {X^(n)} is a Cauchy sequence in a space invariant under the mapping I, and for which Tanaka's formula holds. To this end, introduce the Banach space L_T ⊂ L¹_T endowed with a suitable norm ‖·‖_{L_T}; we prove convergence in L_T even though the sequence {X^(n)}_{n∈N} belongs to L^r_T. 2) For each n, define the sequence of semimartingales {X^(n,ε)}_{ε>0} in L^r_T, for t ∈ T, with the regularised kernels. From Proposition 2.7, lim_{ε→0} sup_{t∈T} E[|X^(n,ε)_t − X^(n)_t|^r] = 0, and we can use Tanaka's formula and calculations similar to those of the uniqueness proof for the local time. For Z^(n,m) := X^(n) − X^(m), the analogue of (3.9) yields a bound on the quantity ∆, which is finite. Hence we can use Fatou's lemma to deduce ∆_t ≤ C_{T,L} ∫_0^t ∆_s ds, and Grönwall's inequality yields ∆_t = 0 for all t ∈ T. Therefore {X^(n)}_{n∈N} is a Cauchy sequence in L_T and there exists a limit X ∈ L_T. 3) To show that X = I(X) almost surely, we recall from the previous calculations that X is a fixed point of I, and hence a solution of (1.1). Note that even though the convergence took place in L_T, we know from Lemmas 3.7 and 3.9 that X ∈ L^r_T. 3.3. Localisation. The first two parts of the proof of Theorem 3.4 have now been established. We can therefore use the localisation argument to relax our assumptions, in the sense of Assumption 3.2, and finish the main proof. Recall from Lemma 3.7 that there exists γ > 0 such that X has γ-Hölder continuous paths. Therefore there exists A ∈ L^r such that the Hölder bound holds for all t, t′ ∈ T and all ω ∈ Ω. In particular, sup_{t∈T} |X_t(ω)|^r ≤ A(ω)T^{rγ}. This yields the existence of a constant C_{T,r} such that E[sup_{t∈T} |X_t|^r] ≤ C_{T,r}, and hence, for all N ≥ 0, the probability that the stopping time τ_N is finite converges to zero as N tends to infinity. Therefore P(sup_N τ_N = ∞) = 1, and X is a solution on T.
Finally, to prove uniqueness, consider X and X′ both satisfying (1.1) with coefficients obeying Assumption 3.2. As we just showed, they must be equal to X^N almost surely on the localised interval, hence X = X′ almost surely on T. The rough Heston model. Recently, El Euch and Rosenbaum proposed a rough version of the classical Heston model, widely used in the financial industry, in order to capture the specificities of Equity options markets, and weak existence and uniqueness was derived in . A generalised version with time-dependent drift was also introduced in and takes the form (4.1), where y₀ ∈ R, x₀, ξ, λ > 0, ρ ∈ (−1, 1), ρ̄ := √(1 − ρ²), H ∈ (0, 1), W and B are independent Brownian motions, and θ : T → R₊ is some deterministic function. Here H corresponds to the Hurst exponent of the fractional Brownian motion and governs the Hölder regularity of the solution. The combination of the square-root diffusion coefficient and the singular kernel seriously complicates the study of certain aspects of this model, as we have seen. The process Y represents the log-price of an asset, while X corresponds to its squared instantaneous volatility. Since the log stock price is an integrated version of the volatility, it suffices to prove existence and uniqueness of the latter. Proof. The diffusion coefficient for X in (4.1) is only defined on R₊, so that we cannot apply Theorem 3.4 directly, as it would require building a strong non-negative solution. The kernel satisfies Assumption 2.1 by , and it is easy to check that it also satisfies Assumptions 2.5 and 3.3. Clearly, b(s, x) = λ(θ(s) − x) is Lipschitz continuous and σ̃(s, x) := ξ√(x⁺) is 1/2-Hölder continuous; therefore they satisfy Assumption 3.5. This implies that pathwise uniqueness holds for the SVE for X in (4.1) with coefficients b and σ̃, by Proposition 3.6. Any solution of the original SVE must be non-negative, in which case σ̃ = σ, hence pathwise uniqueness holds for this equation too.
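The paper notes that the kernel singularity and non-Markovianity restrict classical numerical schemes; still, a naive Euler-type discretisation of the rough Heston variance equation illustrates the structure of the SVE. This is a sketch only: the parameter values are hypothetical, the kernel is evaluated at interval midpoints to sidestep the diagonal singularity, the square root is truncated at zero to stay well defined, and no convergence claim is made.

```python
import numpy as np
from math import gamma

def simulate_rough_heston_variance(x0=0.04, lam=0.3, theta=0.04, xi=0.3,
                                   H=0.1, T=1.0, n=500, seed=0):
    """Naive Euler scheme for the variance SVE
        X_t = x0 + c * int_0^t (t-s)^(H-1/2) lam*(theta - X_s) ds
                 + c * xi * int_0^t (t-s)^(H-1/2) sqrt(X_s) dW_s,
    with c = 1/Gamma(H + 1/2).  Hypothetical parameters; illustration only."""
    rng = np.random.default_rng(seed)
    dt = T / n
    c = 1.0 / gamma(H + 0.5)
    dW = rng.normal(0.0, np.sqrt(dt), n)      # Brownian increments
    grid = np.arange(n) * dt                  # left endpoints s_j
    X = np.empty(n + 1)
    X[0] = x0
    for i in range(1, n + 1):
        ti = i * dt
        # kernel at midpoints, so (t - s) never hits zero
        K = c * (ti - (grid[:i] + 0.5 * dt)) ** (H - 0.5)
        Xs = np.maximum(X[:i], 0.0)           # truncation keeps sqrt defined
        drift = np.sum(K * lam * (theta - Xs)) * dt
        diffusion = xi * np.sum(K * np.sqrt(Xs) * dW[:i])
        X[i] = x0 + drift + diffusion
    return X
```

Because the kernel depends on both t and s, the whole history is re-weighted at every step; this is precisely the non-Markovian feature discussed in the introduction, and it makes the scheme O(n²) in the number of time steps.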
Moreover, a non-negative weak solution of the second SDE in (4.1) was constructed in , and therefore this SVE is exact. Plugging this solution into the first component of (4.1) yields the claim.

Remark 4.2. Looking at Assumption 2.1, we have $H = \alpha$ and one can choose any $\eta < 2$; therefore $r = \theta(\eta)$ can be taken as large as one wants. This means, by Proposition 3.7, that $V$ is almost surely $\gamma$-Hölder continuous for any $\gamma < H$. Hence we also retrieve the Hölder continuity proved in . Furthermore, this reasoning applies to any strong solution of an SVE with the same kernel, regardless of the form of the coefficients. A direct consequence is the pathwise uniqueness of the forward variance curve $\{\mathbb{E}[V_t \mid \mathcal{F}_s]\}_{t \ge s}$ for any $s \in \mathbb{T}$, and of the option price process $\{C_t := \mathbb{E}[g \mid \mathcal{F}_t]\}_{t \in \mathbb{T}}$ for some measurable function $g : \Omega \to \mathbb{R}$, which consolidates the theoretical setup of the hedging strategy derived in .

4.2. Multi-dimensional version. Tanaka's formula and Yamada-Watanabe's theorem only hold in one dimension. A proper multidimensional pathwise uniqueness version for SDEs with non-Lipschitz coefficients represents a complex challenge , and only a few limited extensions exist . Counterexamples to weak uniqueness were displayed in , while the Yamada-Watanabe approach fails in several dimensions because of the mutual dependence between the components. However, our one-dimensional theorem (Theorem 3.4) can be extended to the multidimensional case as long as the coefficients' components do not depend mutually on each other. Consider

(4.2)   $$X_t = x + \int_0^t K_1(t,s) \cdot b(s, X_s) \, \mathrm{d}s + \int_0^t K_2(t,s) \cdot \boldsymbol{\sigma}(s, X_s) \, \mathrm{d}W_s, \qquad t \in \mathbb{T},$$

where $\cdot$ represents component-wise multiplication, $x \in \mathbb{R}^d$, $K_1, K_2 : \mathbb{T}^2 \to \mathbb{R}^d$ are multidimensional kernels, $b : \mathbb{T} \times \mathbb{R}^d \to \mathbb{R}^d$ and $\boldsymbol{\sigma} : \mathbb{T} \times \mathbb{R}^d \to \mathbb{R}^{d \times m}$ are Borel-measurable functions, and $W$ is an $m$-dimensional Brownian motion. Both functions are assumed progressive in the following sense:

$$\boldsymbol{\sigma}(s, x) = \begin{pmatrix} \sigma_{11}(s, x_1) & \cdots & \sigma_{1m}(s, x_1) \\ \sigma_{21}(s, x_1, x_2) & \cdots & \sigma_{2m}(s, x_1, x_2) \\ \vdots & & \vdots \\ \sigma_{d1}(s, x_1, \dots, x_d) & \cdots & \sigma_{dm}(s, x_1, \dots, x_d) \end{pmatrix},$$

such that the dependence only goes in one direction. Thus we can prove uniqueness for the first row and then extend it to the others by induction.

Corollary 4.3. Assume that Assumptions 2.1, 2.5 and 3.3 hold for all kernels, and that each element $b_i$, $\sigma_{ij}$, $1 \le i \le d$, $1 \le j \le m$, satisfies Assumption 3.2, where the Lipschitz constant and the diffusion modulus may differ from row to row (they may vary for different $i$ but are the same for different $j$). Then the SVE (4.2) is exact.

Proof. The first element $X^1$ of $X$ is one-dimensional, so that the proof in the one-dimensional case is not altered by the additional diffusion terms:

$$X^1_t = x_1 + \int_0^t K^1_1(t,s) \, b_1(s, X^1_s) \, \mathrm{d}s + \sum_{j=1}^{m} \int_0^t K^1_2(t,s) \, \sigma_{1j}(s, X^1_s) \, \mathrm{d}W^j_s.$$

Therefore strong existence and uniqueness hold for $X^1$. If strong existence and uniqueness stand for all $X^i$, $i < k$, then the same holds for $X^k$ by plugging the previous elements into the coefficients of the $k$-th row. More precisely,

• the integrability conditions are still ensured by the linear growth condition;
• for the existence we mimic the proof of Proposition 3.8: define the Cauchy sequence $X^{(0)} = (X^1, \cdots, X^{k-1}, x)^\top$ and $X^{(n+1)} = \big(X^1, \cdots, X^{k-1}, I(X^{(n)})_k\big)^\top$ for all $n \ge 1$ on $L^{\mathbb{T}}$. Hence the difference between $X^{(n)}$ and $X^{(m)}$ does not depend on the previous elements (since they are fixed). Therefore the existence proof becomes one-dimensional and this has already been dealt with;
• the uniqueness follows the same pattern where, for any solutions $X$ and $Z$, b k X 1
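To make the induction concrete, here is the $d = 2$ instance of the triangular system (4.2), written out as an illustrative special case:

```latex
X^1_t = x_1 + \int_0^t K_1^1(t,s)\, b_1(s, X^1_s)\,\mathrm{d}s
            + \sum_{j=1}^{m} \int_0^t K_2^1(t,s)\, \sigma_{1j}(s, X^1_s)\,\mathrm{d}W^j_s, \\
X^2_t = x_2 + \int_0^t K_1^2(t,s)\, b_2(s, X^1_s, X^2_s)\,\mathrm{d}s
            + \sum_{j=1}^{m} \int_0^t K_2^2(t,s)\, \sigma_{2j}(s, X^1_s, X^2_s)\,\mathrm{d}W^j_s.
```

The first equation involves $X^1$ alone, so the one-dimensional theorem applies to it directly; once the unique $X^1$ is substituted into the second equation, that equation is again one-dimensional in $X^2$.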
/**
 * Copyright 2007-2008 University Of Southern California
 *
 * <p>Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file
 * except in compliance with the License. You may obtain a copy of the License at
 *
 * <p>http://www.apache.org/licenses/LICENSE-2.0
 *
 * <p>Unless required by applicable law or agreed to in writing, software distributed under the
 * License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either
 * express or implied. See the License for the specific language governing permissions and
 * limitations under the License.
 */
package edu.isi.pegasus.planner.catalog;

import edu.isi.pegasus.planner.catalog.work.WorkCatalogException;

/**
 * The catalog interface to the Work Catalog, the erstwhile Work DB, that is populated by tailstatd
 * and associates.
 *
 * @author <NAME>
 * @version $Revision$
 */
public interface WorkCatalog extends Catalog {

    /** Prefix for the property subset to use with this catalog. */
    public static final String c_prefix = "pegasus.catalog.work";

    /** The DB Driver properties prefix. */
    public static final String DB_PREFIX = "pegasus.catalog.work.db";

    /** The version of the API */
    public static final String VERSION = "1.0";

    /**
     * Inserts a new mapping into the work catalog.
     *
     * @param basedir the base directory
     * @param vogroup the vo to which the user belongs.
     * @param label the label in the DAX
     * @param run the run number.
     * @param creator the user who is running.
     * @param cTime the creation time of the DAX
     * @param mTime the modification time.
     * @param state the state of the workflow
     * @return number of insertions, should always be 1. On failure, throw an exception, don't use
     *     zero.
     * @throws WorkCatalogException in case of being unable to insert the entry.
     */
    public int insert(
        String basedir,
        String vogroup,
        String label,
        String run,
        String creator,
        java.util.Date cTime,
        java.util.Date mTime,
        int state)
        throws WorkCatalogException;

    /**
     * Deletes a mapping from the work catalog.
     *
     * @param basedir the base directory
     * @param vogroup the vo to which the user belongs.
     * @param label the label in the DAX
     * @param run the run number.
     * @return number of deletions, should always be 1. On failure, throw an exception, don't use
     *     zero.
     * @throws WorkCatalogException in case of being unable to delete the entry.
     */
    public int delete(String basedir, String vogroup, String label, String run)
        throws WorkCatalogException;
}
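The composite key semantics of this interface (one row per `(basedir, vogroup, label, run)`, insert returns 1, failures raise rather than return zero) can be sketched with a toy in-memory stand-in. This is a hypothetical illustration, not the Pegasus implementation:

```python
import datetime

class InMemoryWorkCatalog:
    """Toy stand-in for the WorkCatalog contract (illustration only)."""

    def __init__(self):
        # (basedir, vogroup, label, run) -> row attributes
        self._rows = {}

    def insert(self, basedir, vogroup, label, run, creator, ctime, mtime, state):
        key = (basedir, vogroup, label, run)
        self._rows[key] = {"creator": creator, "ctime": ctime,
                           "mtime": mtime, "state": state}
        return 1  # number of insertions; failures raise, never return 0

    def delete(self, basedir, vogroup, label, run):
        key = (basedir, vogroup, label, run)
        if key not in self._rows:
            # mirrors WorkCatalogException: signal failure by raising
            raise KeyError("no work entry for {0}".format(key))
        del self._rows[key]
        return 1
```

The dict key mirrors the four parameters shared by `insert` and `delete`, which is what makes them identify the same mapping.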
def _move_ssh_key(profile, logger, is_backup): context = env.get_profile_context(profile) key_filepath = context.ssh_key if key_filepath: backup_path = os.path.join( EXPORTED_SSH_KEYS_DIR, os.path.basename(key_filepath)) + \ '.{0}.profile'.format(profile) if is_backup: if not os.path.isdir(EXPORTED_SSH_KEYS_DIR): os.makedirs(EXPORTED_SSH_KEYS_DIR, mode=0o700) logger.info('Copying ssh key %s to %s...', key_filepath, backup_path) shutil.copy2(key_filepath, backup_path) else: if os.path.isfile(backup_path): logger.info('Restoring ssh key for profile %s to %s...', profile, key_filepath) shutil.move(backup_path, key_filepath)
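The backup filename convention above can be isolated into a small helper. A minimal sketch, assuming a fixed `EXPORTED_SSH_KEYS_DIR` (a stand-in value here; the real one comes from the surrounding module):

```python
import os

# Stand-in for the module-level constant used by _move_ssh_key.
EXPORTED_SSH_KEYS_DIR = '/tmp/exported-ssh-keys'

def backup_path_for(key_filepath, profile):
    # <export dir>/<key basename>.<profile>.profile
    return os.path.join(
        EXPORTED_SSH_KEYS_DIR,
        os.path.basename(key_filepath)) + '.{0}.profile'.format(profile)
```

Suffixing the profile name onto the key's basename is what lets keys from several profiles share one export directory without colliding.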
// SPDX-License-Identifier: Apache-2.0
// Copyright Authors of Cilium

package fragmap

import (
	"fmt"
	"unsafe"

	"github.com/cilium/cilium/pkg/bpf"
	"github.com/cilium/cilium/pkg/types"
)

const (
	// MapName is the name of the map used to retrieve L4 ports associated
	// to the datagram to which an IPv4 fragment belongs.
	MapName = "cilium_ipv4_frag_datagrams"
)

// FragmentKey must match 'struct ipv4_frag_id' in "bpf/lib/ipv4.h".
// +k8s:deepcopy-gen=true
// +k8s:deepcopy-gen:interfaces=github.com/cilium/cilium/pkg/bpf.MapKey
type FragmentKey struct {
	destAddr   types.IPv4 `align:"daddr"`
	sourceAddr types.IPv4 `align:"saddr"`
	id         uint16     `align:"id"`
	proto      uint8      `align:"proto"`
	pad        uint8      `align:"pad"`
}

// FragmentValue must match 'struct ipv4_frag_l4ports' in "bpf/lib/ipv4.h".
// +k8s:deepcopy-gen=true
// +k8s:deepcopy-gen:interfaces=github.com/cilium/cilium/pkg/bpf.MapValue
type FragmentValue struct {
	sourcePort uint16 `align:"sport"`
	destPort   uint16 `align:"dport"`
}

// GetKeyPtr returns the unsafe pointer to the BPF key.
func (k *FragmentKey) GetKeyPtr() unsafe.Pointer { return unsafe.Pointer(k) }

// GetValuePtr returns the unsafe pointer to the BPF value.
func (v *FragmentValue) GetValuePtr() unsafe.Pointer { return unsafe.Pointer(v) }

// String converts the key into a human readable string format.
func (k *FragmentKey) String() string {
	return fmt.Sprintf("%s --> %s, %d, %d", k.sourceAddr, k.destAddr, k.proto, k.id)
}

// String converts the value into a human readable string format.
func (v *FragmentValue) String() string {
	return fmt.Sprintf("%d, %d", v.destPort, v.sourcePort)
}

// NewValue returns a new empty instance of the structure representing the BPF
// map value.
func (k FragmentKey) NewValue() bpf.MapValue { return &FragmentValue{} }

// InitMap creates the fragments map in the kernel.
func InitMap(mapEntries int) error { fragMap := bpf.NewMap(MapName, bpf.MapTypeLRUHash, &FragmentKey{}, int(unsafe.Sizeof(FragmentKey{})), &FragmentValue{}, int(unsafe.Sizeof(FragmentValue{})), mapEntries, 0, 0, bpf.ConvertKeyValue, ) _, err := fragMap.Create() return err }
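As a rough cross-check of the key layout, the five fields of `FragmentKey` pack into 12 bytes. A hypothetical Python sketch follows; the field order is read off the struct tags above and the byte order is an assumption, since the authoritative definition is `struct ipv4_frag_id` in `bpf/lib/ipv4.h`:

```python
import struct

# daddr (4 bytes), saddr (4 bytes), id (uint16), proto (uint8), pad (uint8).
# '<' disables the struct module's own alignment padding, so only the
# explicit pad byte is present: 4 + 4 + 2 + 1 + 1 = 12 bytes.
FRAG_KEY_FMT = '<4s4sHBB'

def pack_frag_key(daddr, saddr, frag_id, proto):
    # The pad byte is always zero.
    return struct.pack(FRAG_KEY_FMT, daddr, saddr, frag_id, proto, 0)
```

The explicit `pad` field keeps the key at an even 12 bytes, mirroring the `align` tags on the Go struct.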
/* RetroArch - A frontend for libretro.
 * Copyright (C) 2011-2017 - <NAME>
 * Copyright (C) 2014-2017 - <NAME>
 * Copyright (C) 2016-2019 - <NAME>
 * Copyright (C) 2019-2020 - <NAME>
 *
 * RetroArch is free software: you can redistribute it and/or modify it under the terms
 * of the GNU General Public License as published by the Free Software Found-
 * ation, either version 3 of the License, or (at your option) any later version.
 *
 * RetroArch is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY;
 * without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR
 * PURPOSE. See the GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License along with RetroArch.
 * If not, see <http://www.gnu.org/licenses/>.
 */

#ifndef __PLAY_FEATURE_DELIVERY_H
#define __PLAY_FEATURE_DELIVERY_H

#include <retro_common_api.h>
#include <libretro.h>

#include <lists/string_list.h>
#include <boolean.h>

RETRO_BEGIN_DECLS

/* Defines possible status values of
 * a play feature delivery install
 * transaction */
enum play_feature_delivery_install_status
{
   PLAY_FEATURE_DELIVERY_IDLE = 0,
   PLAY_FEATURE_DELIVERY_PENDING,
   PLAY_FEATURE_DELIVERY_STARTING,
   PLAY_FEATURE_DELIVERY_DOWNLOADING,
   PLAY_FEATURE_DELIVERY_INSTALLING,
   PLAY_FEATURE_DELIVERY_INSTALLED,
   PLAY_FEATURE_DELIVERY_FAILED
};

/******************/
/* Initialisation */
/******************/

/* Must be called upon program initialisation */
void play_feature_delivery_init(void);

/* Must be called upon program termination */
void play_feature_delivery_deinit(void);

/**********/
/* Status */
/**********/

/* Returns true if current build utilises
 * play feature delivery */
bool play_feature_delivery_enabled(void);

/* Returns a list of cores currently available
 * via play feature delivery.
* Returns a new string_list on success, or * NULL on failure */ struct string_list *play_feature_delivery_available_cores(void); /* Returns true if specified core is currently * installed via play feature delivery */ bool play_feature_delivery_core_installed(const char *core_file); /* Fetches last recorded status of the most * recently initiated play feature delivery * install transaction. * 'progress' is an integer from 0-100. * Returns true if a transaction is currently * in progress. */ bool play_feature_delivery_download_status( enum play_feature_delivery_install_status *status, unsigned *progress); /***********/ /* Control */ /***********/ /* Initialises download of the specified core. * Returns false in the event of an error. * Download status should be monitored via * play_feature_delivery_download_status() */ bool play_feature_delivery_download(const char *core_file); /* Deletes specified core. * Returns false in the event of an error. */ bool play_feature_delivery_delete(const char *core_file); RETRO_END_DECLS #endif
#include <bts/blockchain/note_record.hpp> #include <bts/blockchain/chain_interface.hpp> namespace bts { namespace blockchain { public_key_type note_record::signer_key()const { try { FC_ASSERT( signer.valid() ); fc::sha256 digest; if( !message->data.empty() ) digest = fc::sha256::hash( string(message->data.begin(), message->data.end()).c_str(), message->data.size() ); return fc::ecc::public_key( *signer, digest ); } FC_CAPTURE_AND_RETHROW() } void note_record::sanity_check( const chain_interface& db )const { try { FC_ASSERT( index.account_id == 0 || db.lookup<account_record>( abs( index.account_id ) ).valid() ); FC_ASSERT( amount.amount > 0 ); FC_ASSERT( amount.asset_id == 0 || db.lookup<asset_record>( amount.asset_id ).valid() ); } FC_CAPTURE_AND_RETHROW( (*this) ) } onote_record note_record::lookup( const chain_interface& db, const note_index& index ) { try { return db.note_lookup_by_index( index ); } FC_CAPTURE_AND_RETHROW( (index) ) } void note_record::store( chain_interface& db, const note_index& index, const note_record& record ) { try { db.note_insert_into_index_map( index, record ); } FC_CAPTURE_AND_RETHROW( (index)(record) ) } void note_record::remove( chain_interface& db, const note_index& index ) { try { const onote_record prev_record = db.lookup<note_record>( index ); if( prev_record.valid() ) db.note_erase_from_index_map( index ); } FC_CAPTURE_AND_RETHROW( (index) ) } } } // bts::blockchain
/****************************************************************************
**
** Copyright (C) 2015 The Qt Company Ltd.
** Contact: http://www.qt.io/licensing/
**
** This file is part of the test suite of the Qt Toolkit.
**
** $QT_BEGIN_LICENSE:LGPL21$
** Commercial License Usage
** Licensees holding valid commercial Qt licenses may use this file in
** accordance with the commercial license agreement provided with the
** Software or, alternatively, in accordance with the terms contained in
** a written agreement between you and The Qt Company. For licensing terms
** and conditions see http://www.qt.io/terms-conditions. For further
** information use the contact form at http://www.qt.io/contact-us.
**
** GNU Lesser General Public License Usage
** Alternatively, this file may be used under the terms of the GNU Lesser
** General Public License version 2.1 or version 3 as published by the Free
** Software Foundation and appearing in the file LICENSE.LGPLv21 and
** LICENSE.LGPLv3 included in the packaging of this file. Please review the
** following information to ensure the GNU Lesser General Public License
** requirements will be met: https://www.gnu.org/licenses/lgpl.html and
** http://www.gnu.org/licenses/old-licenses/lgpl-2.1.html.
**
** As a special exception, The Qt Company gives you certain additional
** rights. These rights are described in The Qt Company LGPL Exception
** version 1.1, included in the file LGPL_EXCEPTION.txt in this package.
** ** $QT_END_LICENSE$ ** ****************************************************************************/ #include <QtTest/QtTest> #include <QtCore/QDate> #include <QtCore/QDebug> #include <QtCore/QObject> #include <QtGui> #ifdef Q_OS_WINCE_WM #include <windows.h> #include "ddhelper.h" #endif class tst_WindowsMobile : public QObject { Q_OBJECT public: tst_WindowsMobile() { qApp->setCursorFlashTime (24 * 3600 * 1000); // once a day // qApp->setCursorFlashTime (INT_MAX); #ifdef Q_OS_WINCE_WM q_initDD(); #endif } #if defined(Q_OS_WINCE_WM) && defined(_WIN32_WCE) && _WIN32_WCE <= 0x501 private slots: void testMainWindowAndMenuBar(); void testSimpleWidget(); #endif }; #if defined(Q_OS_WINCE_WM) && defined(_WIN32_WCE) && _WIN32_WCE <= 0x501 bool qt_wince_is_platform(const QString &platformString) { wchar_t tszPlatform[64]; if (SystemParametersInfo(SPI_GETPLATFORMTYPE, sizeof(tszPlatform)/sizeof(*tszPlatform),tszPlatform,0)) if (0 == _tcsicmp(reinterpret_cast<const wchar_t *> (platformString.utf16()), tszPlatform)) return true; return false; } bool qt_wince_is_smartphone() { return qt_wince_is_platform(QString::fromLatin1("Smartphone")); } void openMenu() { ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,450,630,0,0); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,450,630,0,0); QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,65535,65535,0,0); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,65535,65535,0,0); QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,55535,55535,0,0); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,55535,55535,0,0); QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,55535,58535,0,0); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,55535,58535,0,0); QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,40535,55535,0,0); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,40535,55535,0,0); 
QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,32535,55535,0,0); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,32535,55535,0,0); QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,65535,65535,0,0); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,65535,65535,0,0); QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,55535,50535,0,0); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,55535,50535,0,0); QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,55535,40535,0,0); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,55535,40535,0,0); QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTDOWN | MOUSEEVENTF_ABSOLUTE,48535,45535,0,0); QTest::qWait(2000); ::mouse_event(MOUSEEVENTF_LEFTUP | MOUSEEVENTF_ABSOLUTE,48535,45535,0,0); } void compareScreenshots(const QString &image1, const QString &image2) { QImage screenShot(image1); QImage original(image2); // cut away the title bar before comparing QDesktopWidget desktop; QRect desktopFrameRect = desktop.frameGeometry(); QRect desktopClientRect = desktop.availableGeometry(); QPainter p1(&screenShot); QPainter p2(&original); //screenShot.save("scr1.png", "PNG"); p1.fillRect(0, 0, desktopFrameRect.width(), desktopClientRect.y(), Qt::black); p2.fillRect(0, 0, desktopFrameRect.width(), desktopClientRect.y(), Qt::black); //screenShot.save("scr2.png", "PNG"); //original.save("orig1.png", "PNG"); QVERIFY(original == screenShot); } void takeScreenShot(const QString filename) { q_lock(); QImage image = QImage(( uchar *) q_frameBuffer(), q_screenWidth(), q_screenHeight(), q_screenWidth() * q_screenDepth() / 8, QImage::Format_RGB16); image.save(filename, "PNG"); q_unlock(); } void tst_WindowsMobile::testMainWindowAndMenuBar() { if (qt_wince_is_smartphone()) QSKIP("This test is only for Windows Mobile"); QProcess process; process.start("testQMenuBar.exe"); QCOMPARE(process.state(), QProcess::Running); 
QTest::qWait(6000); openMenu(); QTest::qWait(1000); takeScreenShot("testQMenuBar_current.png"); process.close(); compareScreenshots("testQMenuBar_current.png", ":/testQMenuBar_current.png"); } void tst_WindowsMobile::testSimpleWidget() { if (qt_wince_is_smartphone()) QSKIP("This test is only for Windows Mobile"); QMenuBar menubar; menubar.show(); QWidget maximized; QPalette pal = maximized.palette(); pal.setColor(QPalette::Background, Qt::red); maximized.setPalette(pal); maximized.showMaximized(); QWidget widget; widget.setGeometry(100, 100, 200, 200); widget.setWindowTitle("Widget"); widget.show(); qApp->processEvents(); QTest::qWait(1000); QWidget widget2; widget2.setGeometry(100, 380, 300, 200); widget2.setWindowTitle("Widget 2"); widget2.setWindowFlags(Qt::Popup); widget2.show(); qApp->processEvents(); QTest::qWait(1000); takeScreenShot("testSimpleWidget_current.png"); compareScreenshots("testSimpleWidget_current.png", ":/testSimpleWidget_current.png"); } #endif //Q_OS_WINCE_WM QTEST_MAIN(tst_WindowsMobile) #include "tst_windowsmobile.moc"
def _create_dim_scales(self): dim_order = self._dim_order.maps[0] for dim in sorted(dim_order, key=lambda d: dim_order[d]): if dim not in self._h5group: size = self._current_dim_sizes[dim] kwargs = {} if self._dim_sizes[dim] is None: kwargs["maxshape"] = (None,) self._h5group.create_dataset( name=dim, shape=(size,), dtype='S1', **kwargs) h5ds = self._h5group[dim] h5ds.attrs['_Netcdf4Dimid'] = dim_order[dim] if len(h5ds.shape) > 1: dims = self._variables[dim].dimensions coord_ids = np.array([dim_order[d] for d in dims], 'int32') h5ds.attrs['_Netcdf4Coordinates'] = coord_ids scale_name = dim if dim in self.variables else NOT_A_VARIABLE h5ds.dims.create_scale(h5ds, scale_name) for subgroup in self.groups.values(): subgroup._create_dim_scales()
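The ordering step at the top of the method relies on `_dim_order` being a `ChainMap` whose first map (`.maps[0]`) assigns each dimension its creation index. A minimal sketch of that step, with illustrative names and values:

```python
from collections import ChainMap

# _dim_order is assumed to be a ChainMap of name -> creation index,
# as the .maps[0] access in the method suggests.
dim_order = ChainMap({'time': 0, 'lat': 2, 'lon': 1}).maps[0]

# Dimensions sorted by creation index: the order scales are created in.
ordered = sorted(dim_order, key=lambda d: dim_order[d])
```

Sorting by the stored index rather than by name is what preserves the order in which dimensions were defined, which is the order netCDF files record them in.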
# move_zeros([1, 0, 1, 2, 0, 1, 3])
# returns [1, 1, 2, 1, 3, 0, 0]

'''My solution'''
arr = [1, 0, 1, 2, 0, 1, 3]
result = []
for i in arr:
    if isinstance(i, bool) or i != 0:  # keep every non-zero element, in order
        result.append(i)
for k in arr:
    if not isinstance(k, bool) and k == 0:  # then append the zeros
        result.append(k)
print(result)  # [1, 1, 2, 1, 3, 0, 0]

'''A shorter variant'''
l = [i for i in arr if isinstance(i, bool) or i != 0]
print(l + [0] * (len(arr) - len(l)))  # [1, 1, 2, 1, 3, 0, 0]

'''
l = [i for i in arr if isinstance(i, bool) or i != 0] -> builds the list without
zeros (the isinstance check keeps False, which would otherwise compare equal to 0)
l + [0] * (len(arr) - len(l)) -> appends to the end of l as many zeros ( [0] ) as
the difference between the lengths of the old and new lists
'''
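An alternative not shown above is a single stable sort: the key maps non-zeros to False and zeros to True, so zeros sink to the end while everything else keeps its relative order (Python's sort is stable), with the same isinstance guard so booleans count as non-zero:

```python
def move_zeros(arr):
    # key is True only for genuine zeros; False sorts before True,
    # and the stable sort preserves the order of the non-zero elements
    return sorted(arr, key=lambda x: not isinstance(x, bool) and x == 0)

print(move_zeros([1, 0, 1, 2, 0, 1, 3]))  # [1, 1, 2, 1, 3, 0, 0]
```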
def prop_nouns_with_adj(self, **kwargs): return self._extract_syntactic_features('prop_nouns_with_adj', **kwargs)[ 'prop_nouns_with_adj' ]
/******************************************************************************
 * Top contributors (to current version):
 *   Andrew Reynolds
 *
 * This file is part of the cvc5 project.
 *
 * Copyright (c) 2009-2023 by the authors listed in the file AUTHORS
 * in the top-level source directory and their institutional affiliations.
 * All rights reserved.  See the file COPYING in the top-level source
 * directory for licensing information.
 * ****************************************************************************
 *
 * Oracle caller
 */

#include "expr/oracle_caller.h"

#include "theory/quantifiers/quantifiers_attributes.h"

namespace cvc5::internal {

OracleCaller::OracleCaller(const Node& n)
    : d_oracleNode(getOracleFor(n)),
      d_oracle(NodeManager::currentNM()->getOracleFor(d_oracleNode))
{
  Assert(!d_oracleNode.isNull());
}

bool OracleCaller::callOracle(const Node& fapp, std::vector<Node>& res)
{
  std::map<Node, std::vector<Node>>::iterator it = d_cachedResults.find(fapp);
  if (it != d_cachedResults.end())
  {
    Trace("oracle-calls") << "Using cached oracle result for " << fapp
                          << std::endl;
    res = it->second;
    // don't bother setting runResult
    return false;
  }
  Assert(fapp.getKind() == kind::APPLY_UF);
  Assert(getOracleFor(fapp.getOperator()) == d_oracleNode);

  Trace("oracle-calls") << "Call oracle " << fapp << std::endl;
  // get the input arguments from the application
  std::vector<Node> args(fapp.begin(), fapp.end());
  // run the oracle method
  std::vector<Node> response = d_oracle.run(args);
  Trace("oracle-calls") << "response node " << response << std::endl;
  // cache the response
  d_cachedResults[fapp] = response;
  res = response;
  return true;
}

bool OracleCaller::isOracleFunction(Node f)
{
  return f.hasAttribute(theory::OracleInterfaceAttribute());
}

bool OracleCaller::isOracleFunctionApp(Node n)
{
  if (n.getKind() == kind::APPLY_UF)
  {
    return isOracleFunction(n.getOperator());
  }
  // possibly 0-ary
  return isOracleFunction(n);
}

Node OracleCaller::getOracleFor(const Node& n)
{
  // oracle functions have no children
  if (n.isVar())
  {
    Assert(isOracleFunction(n));
    Node o = n.getAttribute(theory::OracleInterfaceAttribute());
    Assert(o.getKind() == kind::ORACLE);
    return o;
  }
  else if (n.getKind() == kind::FORALL)
  {
    // oracle interfaces have children, and the attribute is stored in the 2nd child
    for (const Node& v : n[2][0])
    {
      if (v.getKind() == kind::ORACLE)
      {
        return v;
      }
    }
  }
  Assert(false) << "Unexpected node for oracle " << n;
  return Node::null();
}

const std::map<Node, std::vector<Node>>& OracleCaller::getCachedResults() const
{
  return d_cachedResults;
}

}  // namespace cvc5::internal
/** * Updates the goods wanted by this settlement. * * It is only meaningful to call this method from the * server, since the settlement's {@link GoodsContainer} * is hidden from the clients. */ public void updateWantedGoods() { final Specification spec = getSpecification(); final java.util.Map<GoodsType, Integer> prices = new HashMap<>(); for (GoodsType gt : spec.getGoodsTypeList()) { if (gt.isMilitaryGoods() || !gt.isStorable()) continue; prices.put(gt, getNormalGoodsPriceToBuy(gt, GoodsContainer.CARGO_SIZE)); } int wantedIndex = 0; for (Entry<GoodsType, Integer> e : mapEntriesByValue(prices, descendingIntegerComparator)) { GoodsType goodsType = e.getKey(); if (e.getValue() <= GoodsContainer.CARGO_SIZE * TRADE_MINIMUM_PRICE || wantedIndex >= wantedGoods.length) break; wantedGoods[wantedIndex] = goodsType; wantedIndex++; } for (; wantedIndex < wantedGoods.length; wantedIndex++) { wantedGoods[wantedIndex] = null; } }
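The selection step of `updateWantedGoods` can be sketched in a few lines: price every goods type, keep the highest-priced ones strictly above the trade minimum, and null-pad the remaining wanted slots. A hypothetical Python rendering; the constants are stand-ins, not FreeCol's actual values:

```python
# Stand-ins for GoodsContainer.CARGO_SIZE and TRADE_MINIMUM_PRICE.
CARGO_SIZE = 100
TRADE_MINIMUM_PRICE = 2

def wanted_goods(prices, slots=3):
    # keep goods priced strictly above the minimum, best-paying first,
    # then pad the remaining slots with None (as the final loop above does)
    wanted = [g for g, p in sorted(prices.items(), key=lambda e: -e[1])
              if p > CARGO_SIZE * TRADE_MINIMUM_PRICE][:slots]
    return wanted + [None] * (slots - len(wanted))
```

The descending sort plus the strict `>` threshold mirrors the Java loop, which breaks as soon as a price drops to the minimum or the wanted slots run out.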
/* * Copyright (c) Contributors to the Open 3D Engine Project. For complete copyright and license terms please see the LICENSE at the root of this distribution. * * SPDX-License-Identifier: Apache-2.0 OR MIT * */ #include <TestImpactFramework/TestImpactClientTestRun.h> namespace TestImpact { namespace Client { TestRun::TestRun(const AZStd::string& name, TestRunResult result, AZStd::chrono::milliseconds duration) : m_targetName(name) , m_result(result) , m_duration(duration) { } const AZStd::string& TestRun::GetTargetName() const { return m_targetName; } AZStd::chrono::milliseconds TestRun::GetDuration() const { return m_duration; } TestRunResult TestRun::GetResult() const { return m_result; } } // namespace Client } // namespace TestImpact
// Code generated by protoc-gen-go. DO NOT EDIT. // versions: // protoc-gen-go v1.26.0 // protoc v3.14.0 // source: service/sys/internal/conf/conf.proto package conf import ( protoreflect "google.golang.org/protobuf/reflect/protoreflect" protoimpl "google.golang.org/protobuf/runtime/protoimpl" durationpb "google.golang.org/protobuf/types/known/durationpb" reflect "reflect" sync "sync" ) const ( // Verify that this generated code is sufficiently up-to-date. _ = protoimpl.EnforceVersion(20 - protoimpl.MinVersion) // Verify that runtime/protoimpl is sufficiently up-to-date. _ = protoimpl.EnforceVersion(protoimpl.MaxVersion - 20) ) type Bootstrap struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Service *Service `protobuf:"bytes,1,opt,name=service,proto3" json:"service,omitempty"` Server *Server `protobuf:"bytes,2,opt,name=server,proto3" json:"server,omitempty"` Client *Client `protobuf:"bytes,3,opt,name=client,proto3" json:"client,omitempty"` Data *Data `protobuf:"bytes,4,opt,name=data,proto3" json:"data,omitempty"` Reg *Reg `protobuf:"bytes,5,opt,name=reg,proto3" json:"reg,omitempty"` Logger *Logger `protobuf:"bytes,6,opt,name=logger,proto3" json:"logger,omitempty"` App *App `protobuf:"bytes,7,opt,name=app,proto3" json:"app,omitempty"` } func (x *Bootstrap) Reset() { *x = Bootstrap{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[0] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Bootstrap) String() string { return protoimpl.X.MessageStringOf(x) } func (*Bootstrap) ProtoMessage() {} func (x *Bootstrap) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[0] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use 
Bootstrap.ProtoReflect.Descriptor instead. func (*Bootstrap) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{0} } func (x *Bootstrap) GetService() *Service { if x != nil { return x.Service } return nil } func (x *Bootstrap) GetServer() *Server { if x != nil { return x.Server } return nil } func (x *Bootstrap) GetClient() *Client { if x != nil { return x.Client } return nil } func (x *Bootstrap) GetData() *Data { if x != nil { return x.Data } return nil } func (x *Bootstrap) GetReg() *Reg { if x != nil { return x.Reg } return nil } func (x *Bootstrap) GetLogger() *Logger { if x != nil { return x.Logger } return nil } func (x *Bootstrap) GetApp() *App { if x != nil { return x.App } return nil } type Service struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Name string `protobuf:"bytes,1,opt,name=name,proto3" json:"name,omitempty"` Version string `protobuf:"bytes,2,opt,name=version,proto3" json:"version,omitempty"` } func (x *Service) Reset() { *x = Service{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[1] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Service) String() string { return protoimpl.X.MessageStringOf(x) } func (*Service) ProtoMessage() {} func (x *Service) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[1] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Service.ProtoReflect.Descriptor instead. 
func (*Service) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{1} } func (x *Service) GetName() string { if x != nil { return x.Name } return "" } func (x *Service) GetVersion() string { if x != nil { return x.Version } return "" } type Server struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Http *Server_HTTP `protobuf:"bytes,1,opt,name=http,proto3" json:"http,omitempty"` Grpc *Server_GRPC `protobuf:"bytes,2,opt,name=grpc,proto3" json:"grpc,omitempty"` Middleware *Server_Middleware `protobuf:"bytes,3,opt,name=middleware,proto3" json:"middleware,omitempty"` } func (x *Server) Reset() { *x = Server{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[2] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Server) String() string { return protoimpl.X.MessageStringOf(x) } func (*Server) ProtoMessage() {} func (x *Server) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[2] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Server.ProtoReflect.Descriptor instead. 
func (*Server) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{2} } func (x *Server) GetHttp() *Server_HTTP { if x != nil { return x.Http } return nil } func (x *Server) GetGrpc() *Server_GRPC { if x != nil { return x.Grpc } return nil } func (x *Server) GetMiddleware() *Server_Middleware { if x != nil { return x.Middleware } return nil } type Client struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Sso *Client_Sso `protobuf:"bytes,1,opt,name=sso,proto3" json:"sso,omitempty"` } func (x *Client) Reset() { *x = Client{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[3] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Client) String() string { return protoimpl.X.MessageStringOf(x) } func (*Client) ProtoMessage() {} func (x *Client) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[3] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Client.ProtoReflect.Descriptor instead. 
func (*Client) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{3} } func (x *Client) GetSso() *Client_Sso { if x != nil { return x.Sso } return nil } type Data struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Admin *Data_Database `protobuf:"bytes,1,opt,name=admin,proto3" json:"admin,omitempty"` Sso *Data_Database `protobuf:"bytes,2,opt,name=sso,proto3" json:"sso,omitempty"` Tiku *Data_Database `protobuf:"bytes,3,opt,name=tiku,proto3" json:"tiku,omitempty"` Redis *Data_Redis `protobuf:"bytes,4,opt,name=redis,proto3" json:"redis,omitempty"` Migrate *Data_Migrate `protobuf:"bytes,5,opt,name=migrate,proto3" json:"migrate,omitempty"` } func (x *Data) Reset() { *x = Data{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[4] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Data) String() string { return protoimpl.X.MessageStringOf(x) } func (*Data) ProtoMessage() {} func (x *Data) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[4] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Data.ProtoReflect.Descriptor instead. 
func (*Data) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{4} } func (x *Data) GetAdmin() *Data_Database { if x != nil { return x.Admin } return nil } func (x *Data) GetSso() *Data_Database { if x != nil { return x.Sso } return nil } func (x *Data) GetTiku() *Data_Database { if x != nil { return x.Tiku } return nil } func (x *Data) GetRedis() *Data_Redis { if x != nil { return x.Redis } return nil } func (x *Data) GetMigrate() *Data_Migrate { if x != nil { return x.Migrate } return nil } type Reg struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Etcd *Reg_Etcd `protobuf:"bytes,1,opt,name=etcd,proto3" json:"etcd,omitempty"` Instance *Reg_Instance `protobuf:"bytes,2,opt,name=instance,proto3" json:"instance,omitempty"` } func (x *Reg) Reset() { *x = Reg{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[5] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Reg) String() string { return protoimpl.X.MessageStringOf(x) } func (*Reg) ProtoMessage() {} func (x *Reg) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[5] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Reg.ProtoReflect.Descriptor instead. 
func (*Reg) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{5} } func (x *Reg) GetEtcd() *Reg_Etcd { if x != nil { return x.Etcd } return nil } func (x *Reg) GetInstance() *Reg_Instance { if x != nil { return x.Instance } return nil } type Logger struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Path string `protobuf:"bytes,1,opt,name=path,proto3" json:"path,omitempty"` Stdout bool `protobuf:"varint,2,opt,name=stdout,proto3" json:"stdout,omitempty"` Level int32 `protobuf:"varint,3,opt,name=level,proto3" json:"level,omitempty"` } func (x *Logger) Reset() { *x = Logger{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[6] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Logger) String() string { return protoimpl.X.MessageStringOf(x) } func (*Logger) ProtoMessage() {} func (x *Logger) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[6] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Logger.ProtoReflect.Descriptor instead. 
func (*Logger) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{6} } func (x *Logger) GetPath() string { if x != nil { return x.Path } return "" } func (x *Logger) GetStdout() bool { if x != nil { return x.Stdout } return false } func (x *Logger) GetLevel() int32 { if x != nil { return x.Level } return 0 } type App struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Jwt *App_Jwt `protobuf:"bytes,1,opt,name=jwt,proto3" json:"jwt,omitempty"` Gen *App_Gen `protobuf:"bytes,2,opt,name=gen,proto3" json:"gen,omitempty"` Tools *App_Database `protobuf:"bytes,3,opt,name=tools,proto3" json:"tools,omitempty"` Tiku *App_Tiku `protobuf:"bytes,4,opt,name=tiku,proto3" json:"tiku,omitempty"` } func (x *App) Reset() { *x = App{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[7] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *App) String() string { return protoimpl.X.MessageStringOf(x) } func (*App) ProtoMessage() {} func (x *App) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[7] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use App.ProtoReflect.Descriptor instead. 
func (*App) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{7} } func (x *App) GetJwt() *App_Jwt { if x != nil { return x.Jwt } return nil } func (x *App) GetGen() *App_Gen { if x != nil { return x.Gen } return nil } func (x *App) GetTools() *App_Database { if x != nil { return x.Tools } return nil } func (x *App) GetTiku() *App_Tiku { if x != nil { return x.Tiku } return nil } type Server_HTTP struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Network string `protobuf:"bytes,1,opt,name=network,proto3" json:"network,omitempty"` Address string `protobuf:"bytes,2,opt,name=address,proto3" json:"address,omitempty"` Timeout *durationpb.Duration `protobuf:"bytes,3,opt,name=timeout,proto3" json:"timeout,omitempty"` } func (x *Server_HTTP) Reset() { *x = Server_HTTP{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[8] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Server_HTTP) String() string { return protoimpl.X.MessageStringOf(x) } func (*Server_HTTP) ProtoMessage() {} func (x *Server_HTTP) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[8] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Server_HTTP.ProtoReflect.Descriptor instead. 
func (*Server_HTTP) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{2, 0} } func (x *Server_HTTP) GetNetwork() string { if x != nil { return x.Network } return "" } func (x *Server_HTTP) GetAddress() string { if x != nil { return x.Address } return "" } func (x *Server_HTTP) GetTimeout() *durationpb.Duration { if x != nil { return x.Timeout } return nil } type Server_GRPC struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Network string `protobuf:"bytes,1,opt,name=network,proto3" json:"network,omitempty"` Address string `protobuf:"bytes,2,opt,name=address,proto3" json:"address,omitempty"` Timeout *durationpb.Duration `protobuf:"bytes,3,opt,name=timeout,proto3" json:"timeout,omitempty"` } func (x *Server_GRPC) Reset() { *x = Server_GRPC{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[9] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Server_GRPC) String() string { return protoimpl.X.MessageStringOf(x) } func (*Server_GRPC) ProtoMessage() {} func (x *Server_GRPC) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[9] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Server_GRPC.ProtoReflect.Descriptor instead. 
func (*Server_GRPC) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{2, 1} } func (x *Server_GRPC) GetNetwork() string { if x != nil { return x.Network } return "" } func (x *Server_GRPC) GetAddress() string { if x != nil { return x.Address } return "" } func (x *Server_GRPC) GetTimeout() *durationpb.Duration { if x != nil { return x.Timeout } return nil } type Server_JWT struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Secret string `protobuf:"bytes,1,opt,name=secret,proto3" json:"secret,omitempty"` Timeout *durationpb.Duration `protobuf:"bytes,2,opt,name=timeout,proto3" json:"timeout,omitempty"` } func (x *Server_JWT) Reset() { *x = Server_JWT{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[10] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Server_JWT) String() string { return protoimpl.X.MessageStringOf(x) } func (*Server_JWT) ProtoMessage() {} func (x *Server_JWT) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[10] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Server_JWT.ProtoReflect.Descriptor instead. 
func (*Server_JWT) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{2, 2} } func (x *Server_JWT) GetSecret() string { if x != nil { return x.Secret } return "" } func (x *Server_JWT) GetTimeout() *durationpb.Duration { if x != nil { return x.Timeout } return nil } type Server_Middleware struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Jwt *Server_JWT `protobuf:"bytes,1,opt,name=jwt,proto3" json:"jwt,omitempty"` } func (x *Server_Middleware) Reset() { *x = Server_Middleware{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[11] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Server_Middleware) String() string { return protoimpl.X.MessageStringOf(x) } func (*Server_Middleware) ProtoMessage() {} func (x *Server_Middleware) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[11] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Server_Middleware.ProtoReflect.Descriptor instead. 
func (*Server_Middleware) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{2, 3} } func (x *Server_Middleware) GetJwt() *Server_JWT { if x != nil { return x.Jwt } return nil } type Client_Breaker struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Window string `protobuf:"bytes,1,opt,name=window,proto3" json:"window,omitempty"` Bucket int32 `protobuf:"varint,2,opt,name=bucket,proto3" json:"bucket,omitempty"` K float32 `protobuf:"fixed32,3,opt,name=k,proto3" json:"k,omitempty"` Request int32 `protobuf:"varint,4,opt,name=request,proto3" json:"request,omitempty"` } func (x *Client_Breaker) Reset() { *x = Client_Breaker{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[12] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Client_Breaker) String() string { return protoimpl.X.MessageStringOf(x) } func (*Client_Breaker) ProtoMessage() {} func (x *Client_Breaker) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[12] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Client_Breaker.ProtoReflect.Descriptor instead. 
func (*Client_Breaker) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{3, 0} } func (x *Client_Breaker) GetWindow() string { if x != nil { return x.Window } return "" } func (x *Client_Breaker) GetBucket() int32 { if x != nil { return x.Bucket } return 0 } func (x *Client_Breaker) GetK() float32 { if x != nil { return x.K } return 0 } func (x *Client_Breaker) GetRequest() int32 { if x != nil { return x.Request } return 0 } type Client_Sso struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Dial string `protobuf:"bytes,1,opt,name=dial,proto3" json:"dial,omitempty"` Timeout string `protobuf:"bytes,2,opt,name=timeout,proto3" json:"timeout,omitempty"` Zone string `protobuf:"bytes,3,opt,name=zone,proto3" json:"zone,omitempty"` Breaker *Client_Breaker `protobuf:"bytes,4,opt,name=breaker,proto3" json:"breaker,omitempty"` } func (x *Client_Sso) Reset() { *x = Client_Sso{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[13] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Client_Sso) String() string { return protoimpl.X.MessageStringOf(x) } func (*Client_Sso) ProtoMessage() {} func (x *Client_Sso) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[13] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Client_Sso.ProtoReflect.Descriptor instead. 
func (*Client_Sso) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{3, 1} } func (x *Client_Sso) GetDial() string { if x != nil { return x.Dial } return "" } func (x *Client_Sso) GetTimeout() string { if x != nil { return x.Timeout } return "" } func (x *Client_Sso) GetZone() string { if x != nil { return x.Zone } return "" } func (x *Client_Sso) GetBreaker() *Client_Breaker { if x != nil { return x.Breaker } return nil } type Data_Database struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Driver string `protobuf:"bytes,1,opt,name=driver,proto3" json:"driver,omitempty"` Source string `protobuf:"bytes,2,opt,name=source,proto3" json:"source,omitempty"` } func (x *Data_Database) Reset() { *x = Data_Database{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[14] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Data_Database) String() string { return protoimpl.X.MessageStringOf(x) } func (*Data_Database) ProtoMessage() {} func (x *Data_Database) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[14] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Data_Database.ProtoReflect.Descriptor instead. 
func (*Data_Database) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{4, 0} } func (x *Data_Database) GetDriver() string { if x != nil { return x.Driver } return "" } func (x *Data_Database) GetSource() string { if x != nil { return x.Source } return "" } type Data_Redis struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Network string `protobuf:"bytes,1,opt,name=network,proto3" json:"network,omitempty"` Addr string `protobuf:"bytes,2,opt,name=addr,proto3" json:"addr,omitempty"` ReadTimeout *durationpb.Duration `protobuf:"bytes,3,opt,name=read_timeout,json=readTimeout,proto3" json:"read_timeout,omitempty"` WriteTimeout *durationpb.Duration `protobuf:"bytes,4,opt,name=write_timeout,json=writeTimeout,proto3" json:"write_timeout,omitempty"` DialTimeout *durationpb.Duration `protobuf:"bytes,5,opt,name=dial_timeout,json=dialTimeout,proto3" json:"dial_timeout,omitempty"` } func (x *Data_Redis) Reset() { *x = Data_Redis{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[15] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Data_Redis) String() string { return protoimpl.X.MessageStringOf(x) } func (*Data_Redis) ProtoMessage() {} func (x *Data_Redis) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[15] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Data_Redis.ProtoReflect.Descriptor instead. 
func (*Data_Redis) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{4, 1} } func (x *Data_Redis) GetNetwork() string { if x != nil { return x.Network } return "" } func (x *Data_Redis) GetAddr() string { if x != nil { return x.Addr } return "" } func (x *Data_Redis) GetReadTimeout() *durationpb.Duration { if x != nil { return x.ReadTimeout } return nil } func (x *Data_Redis) GetWriteTimeout() *durationpb.Duration { if x != nil { return x.WriteTimeout } return nil } func (x *Data_Redis) GetDialTimeout() *durationpb.Duration { if x != nil { return x.DialTimeout } return nil } type Data_Migrate struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Path string `protobuf:"bytes,1,opt,name=path,proto3" json:"path,omitempty"` } func (x *Data_Migrate) Reset() { *x = Data_Migrate{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[16] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Data_Migrate) String() string { return protoimpl.X.MessageStringOf(x) } func (*Data_Migrate) ProtoMessage() {} func (x *Data_Migrate) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[16] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Data_Migrate.ProtoReflect.Descriptor instead. 
func (*Data_Migrate) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{4, 2} } func (x *Data_Migrate) GetPath() string { if x != nil { return x.Path } return "" } type Reg_Etcd struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Endpoints []string `protobuf:"bytes,1,rep,name=endpoints,proto3" json:"endpoints,omitempty"` DialTimeout *durationpb.Duration `protobuf:"bytes,2,opt,name=dialTimeout,proto3" json:"dialTimeout,omitempty"` } func (x *Reg_Etcd) Reset() { *x = Reg_Etcd{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[17] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Reg_Etcd) String() string { return protoimpl.X.MessageStringOf(x) } func (*Reg_Etcd) ProtoMessage() {} func (x *Reg_Etcd) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[17] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Reg_Etcd.ProtoReflect.Descriptor instead. 
func (*Reg_Etcd) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{5, 0} } func (x *Reg_Etcd) GetEndpoints() []string { if x != nil { return x.Endpoints } return nil } func (x *Reg_Etcd) GetDialTimeout() *durationpb.Duration { if x != nil { return x.DialTimeout } return nil } type Reg_Instance struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Zone string `protobuf:"bytes,1,opt,name=zone,proto3" json:"zone,omitempty"` Env string `protobuf:"bytes,2,opt,name=env,proto3" json:"env,omitempty"` Hostname string `protobuf:"bytes,3,opt,name=hostname,proto3" json:"hostname,omitempty"` Appid string `protobuf:"bytes,4,opt,name=appid,proto3" json:"appid,omitempty"` Addrs []string `protobuf:"bytes,5,rep,name=addrs,proto3" json:"addrs,omitempty"` } func (x *Reg_Instance) Reset() { *x = Reg_Instance{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[18] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *Reg_Instance) String() string { return protoimpl.X.MessageStringOf(x) } func (*Reg_Instance) ProtoMessage() {} func (x *Reg_Instance) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[18] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use Reg_Instance.ProtoReflect.Descriptor instead. 
func (*Reg_Instance) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{5, 1} } func (x *Reg_Instance) GetZone() string { if x != nil { return x.Zone } return "" } func (x *Reg_Instance) GetEnv() string { if x != nil { return x.Env } return "" } func (x *Reg_Instance) GetHostname() string { if x != nil { return x.Hostname } return "" } func (x *Reg_Instance) GetAppid() string { if x != nil { return x.Appid } return "" } func (x *Reg_Instance) GetAddrs() []string { if x != nil { return x.Addrs } return nil } type App_Jwt struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Secret string `protobuf:"bytes,1,opt,name=secret,proto3" json:"secret,omitempty"` Timeout *durationpb.Duration `protobuf:"bytes,2,opt,name=timeout,proto3" json:"timeout,omitempty"` } func (x *App_Jwt) Reset() { *x = App_Jwt{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[19] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *App_Jwt) String() string { return protoimpl.X.MessageStringOf(x) } func (*App_Jwt) ProtoMessage() {} func (x *App_Jwt) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[19] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use App_Jwt.ProtoReflect.Descriptor instead. 
func (*App_Jwt) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{7, 0} } func (x *App_Jwt) GetSecret() string { if x != nil { return x.Secret } return "" } func (x *App_Jwt) GetTimeout() *durationpb.Duration { if x != nil { return x.Timeout } return nil } type App_Template struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Model string `protobuf:"bytes,1,opt,name=model,proto3" json:"model,omitempty"` Dao string `protobuf:"bytes,2,opt,name=dao,proto3" json:"dao,omitempty"` Js string `protobuf:"bytes,3,opt,name=js,proto3" json:"js,omitempty"` Vue string `protobuf:"bytes,4,opt,name=vue,proto3" json:"vue,omitempty"` } func (x *App_Template) Reset() { *x = App_Template{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[20] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *App_Template) String() string { return protoimpl.X.MessageStringOf(x) } func (*App_Template) ProtoMessage() {} func (x *App_Template) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[20] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use App_Template.ProtoReflect.Descriptor instead. 
func (*App_Template) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{7, 1} } func (x *App_Template) GetModel() string { if x != nil { return x.Model } return "" } func (x *App_Template) GetDao() string { if x != nil { return x.Dao } return "" } func (x *App_Template) GetJs() string { if x != nil { return x.Js } return "" } func (x *App_Template) GetVue() string { if x != nil { return x.Vue } return "" } type App_Gen struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Name string `protobuf:"bytes,1,opt,name=name,proto3" json:"name,omitempty"` Dbname string `protobuf:"bytes,2,opt,name=dbname,proto3" json:"dbname,omitempty"` Frontpath string `protobuf:"bytes,3,opt,name=frontpath,proto3" json:"frontpath,omitempty"` Backpath string `protobuf:"bytes,4,opt,name=backpath,proto3" json:"backpath,omitempty"` Template *App_Template `protobuf:"bytes,5,opt,name=template,proto3" json:"template,omitempty"` } func (x *App_Gen) Reset() { *x = App_Gen{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[21] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *App_Gen) String() string { return protoimpl.X.MessageStringOf(x) } func (*App_Gen) ProtoMessage() {} func (x *App_Gen) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[21] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use App_Gen.ProtoReflect.Descriptor instead. 
func (*App_Gen) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{7, 2} } func (x *App_Gen) GetName() string { if x != nil { return x.Name } return "" } func (x *App_Gen) GetDbname() string { if x != nil { return x.Dbname } return "" } func (x *App_Gen) GetFrontpath() string { if x != nil { return x.Frontpath } return "" } func (x *App_Gen) GetBackpath() string { if x != nil { return x.Backpath } return "" } func (x *App_Gen) GetTemplate() *App_Template { if x != nil { return x.Template } return nil } type App_Database struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Driver string `protobuf:"bytes,1,opt,name=driver,proto3" json:"driver,omitempty"` Source string `protobuf:"bytes,2,opt,name=source,proto3" json:"source,omitempty"` } func (x *App_Database) Reset() { *x = App_Database{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[22] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *App_Database) String() string { return protoimpl.X.MessageStringOf(x) } func (*App_Database) ProtoMessage() {} func (x *App_Database) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[22] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use App_Database.ProtoReflect.Descriptor instead. 
func (*App_Database) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{7, 3} } func (x *App_Database) GetDriver() string { if x != nil { return x.Driver } return "" } func (x *App_Database) GetSource() string { if x != nil { return x.Source } return "" } type App_Tiku struct { state protoimpl.MessageState sizeCache protoimpl.SizeCache unknownFields protoimpl.UnknownFields Path string `protobuf:"bytes,1,opt,name=path,proto3" json:"path,omitempty"` } func (x *App_Tiku) Reset() { *x = App_Tiku{} if protoimpl.UnsafeEnabled { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[23] ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) ms.StoreMessageInfo(mi) } } func (x *App_Tiku) String() string { return protoimpl.X.MessageStringOf(x) } func (*App_Tiku) ProtoMessage() {} func (x *App_Tiku) ProtoReflect() protoreflect.Message { mi := &file_service_sys_internal_conf_conf_proto_msgTypes[23] if protoimpl.UnsafeEnabled && x != nil { ms := protoimpl.X.MessageStateOf(protoimpl.Pointer(x)) if ms.LoadMessageInfo() == nil { ms.StoreMessageInfo(mi) } return ms } return mi.MessageOf(x) } // Deprecated: Use App_Tiku.ProtoReflect.Descriptor instead. 
func (*App_Tiku) Descriptor() ([]byte, []int) { return file_service_sys_internal_conf_conf_proto_rawDescGZIP(), []int{7, 4} } func (x *App_Tiku) GetPath() string { if x != nil { return x.Path } return "" } var File_service_sys_internal_conf_conf_proto protoreflect.FileDescriptor var file_service_sys_internal_conf_conf_proto_rawDesc = []byte{ 0x0a, 0x24, 0x73, 0x65, 0x72, 0x76, 0x69, 0x63, 0x65, 0x2f, 0x73, 0x79, 0x73, 0x2f, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2f, 0x63, 0x6f, 0x6e, 0x66, 0x2f, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x12, 0x11, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x1a, 0x1e, 0x67, 0x6f, 0x6f, 0x67, 0x6c, 0x65, 0x2f, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2f, 0x64, 0x75, 0x72, 0x61, 0x74, 0x69, 0x6f, 0x6e, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x22, 0xdb, 0x02, 0x0a, 0x09, 0x42, 0x6f, 0x6f, 0x74, 0x73, 0x74, 0x72, 0x61, 0x70, 0x12, 0x34, 0x0a, 0x07, 0x73, 0x65, 0x72, 0x76, 0x69, 0x63, 0x65, 0x18, 0x01, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1a, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x53, 0x65, 0x72, 0x76, 0x69, 0x63, 0x65, 0x52, 0x07, 0x73, 0x65, 0x72, 0x76, 0x69, 0x63, 0x65, 0x12, 0x31, 0x0a, 0x06, 0x73, 0x65, 0x72, 0x76, 0x65, 0x72, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x53, 0x65, 0x72, 0x76, 0x65, 0x72, 0x52, 0x06, 0x73, 0x65, 0x72, 0x76, 0x65, 0x72, 0x12, 0x31, 0x0a, 0x06, 0x63, 0x6c, 0x69, 0x65, 0x6e, 0x74, 0x18, 0x03, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x43, 0x6c, 0x69, 0x65, 0x6e, 0x74, 0x52, 0x06, 0x63, 0x6c, 0x69, 0x65, 0x6e, 0x74, 0x12, 0x2b, 0x0a, 0x04, 0x64, 0x61, 0x74, 0x61, 0x18, 0x04, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x17, 0x2e, 
0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x52, 0x04, 0x64, 0x61, 0x74, 0x61, 0x12, 0x28, 0x0a, 0x03, 0x72, 0x65, 0x67, 0x18, 0x05, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x16, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x52, 0x65, 0x67, 0x52, 0x03, 0x72, 0x65, 0x67, 0x12, 0x31, 0x0a, 0x06, 0x6c, 0x6f, 0x67, 0x67, 0x65, 0x72, 0x18, 0x06, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x4c, 0x6f, 0x67, 0x67, 0x65, 0x72, 0x52, 0x06, 0x6c, 0x6f, 0x67, 0x67, 0x65, 0x72, 0x12, 0x28, 0x0a, 0x03, 0x61, 0x70, 0x70, 0x18, 0x07, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x16, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x41, 0x70, 0x70, 0x52, 0x03, 0x61, 0x70, 0x70, 0x22, 0x37, 0x0a, 0x07, 0x53, 0x65, 0x72, 0x76, 0x69, 0x63, 0x65, 0x12, 0x12, 0x0a, 0x04, 0x6e, 0x61, 0x6d, 0x65, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x04, 0x6e, 0x61, 0x6d, 0x65, 0x12, 0x18, 0x0a, 0x07, 0x76, 0x65, 0x72, 0x73, 0x69, 0x6f, 0x6e, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x07, 0x76, 0x65, 0x72, 0x73, 0x69, 0x6f, 0x6e, 0x22, 0xab, 0x04, 0x0a, 0x06, 0x53, 0x65, 0x72, 0x76, 0x65, 0x72, 0x12, 0x32, 0x0a, 0x04, 0x68, 0x74, 0x74, 0x70, 0x18, 0x01, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1e, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x53, 0x65, 0x72, 0x76, 0x65, 0x72, 0x2e, 0x48, 0x54, 0x54, 0x50, 0x52, 0x04, 0x68, 0x74, 0x74, 0x70, 0x12, 0x32, 0x0a, 0x04, 0x67, 0x72, 0x70, 0x63, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1e, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x53, 0x65, 0x72, 0x76, 0x65, 0x72, 0x2e, 0x47, 0x52, 0x50, 0x43, 0x52, 0x04, 0x67, 0x72, 
0x70, 0x63, 0x12, 0x44, 0x0a, 0x0a, 0x6d, 0x69, 0x64, 0x64, 0x6c, 0x65, 0x77, 0x61, 0x72, 0x65, 0x18, 0x03, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x24, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x53, 0x65, 0x72, 0x76, 0x65, 0x72, 0x2e, 0x4d, 0x69, 0x64, 0x64, 0x6c, 0x65, 0x77, 0x61, 0x72, 0x65, 0x52, 0x0a, 0x6d, 0x69, 0x64, 0x64, 0x6c, 0x65, 0x77, 0x61, 0x72, 0x65, 0x1a, 0x6f, 0x0a, 0x04, 0x48, 0x54, 0x54, 0x50, 0x12, 0x18, 0x0a, 0x07, 0x6e, 0x65, 0x74, 0x77, 0x6f, 0x72, 0x6b, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x07, 0x6e, 0x65, 0x74, 0x77, 0x6f, 0x72, 0x6b, 0x12, 0x18, 0x0a, 0x07, 0x61, 0x64, 0x64, 0x72, 0x65, 0x73, 0x73, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x07, 0x61, 0x64, 0x64, 0x72, 0x65, 0x73, 0x73, 0x12, 0x33, 0x0a, 0x07, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x18, 0x03, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x67, 0x6f, 0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x44, 0x75, 0x72, 0x61, 0x74, 0x69, 0x6f, 0x6e, 0x52, 0x07, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x1a, 0x6f, 0x0a, 0x04, 0x47, 0x52, 0x50, 0x43, 0x12, 0x18, 0x0a, 0x07, 0x6e, 0x65, 0x74, 0x77, 0x6f, 0x72, 0x6b, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x07, 0x6e, 0x65, 0x74, 0x77, 0x6f, 0x72, 0x6b, 0x12, 0x18, 0x0a, 0x07, 0x61, 0x64, 0x64, 0x72, 0x65, 0x73, 0x73, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x07, 0x61, 0x64, 0x64, 0x72, 0x65, 0x73, 0x73, 0x12, 0x33, 0x0a, 0x07, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x18, 0x03, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x67, 0x6f, 0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x44, 0x75, 0x72, 0x61, 0x74, 0x69, 0x6f, 0x6e, 0x52, 0x07, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x1a, 0x52, 0x0a, 0x03, 0x4a, 0x57, 0x54, 0x12, 0x16, 0x0a, 0x06, 0x73, 0x65, 0x63, 0x72, 0x65, 0x74, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x06, 0x73, 0x65, 0x63, 0x72, 0x65, 0x74, 0x12, 0x33, 0x0a, 0x07, 
0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x67, 0x6f, 0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x44, 0x75, 0x72, 0x61, 0x74, 0x69, 0x6f, 0x6e, 0x52, 0x07, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x1a, 0x3d, 0x0a, 0x0a, 0x4d, 0x69, 0x64, 0x64, 0x6c, 0x65, 0x77, 0x61, 0x72, 0x65, 0x12, 0x2f, 0x0a, 0x03, 0x6a, 0x77, 0x74, 0x18, 0x01, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1d, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x53, 0x65, 0x72, 0x76, 0x65, 0x72, 0x2e, 0x4a, 0x57, 0x54, 0x52, 0x03, 0x6a, 0x77, 0x74, 0x22, 0xa3, 0x02, 0x0a, 0x06, 0x43, 0x6c, 0x69, 0x65, 0x6e, 0x74, 0x12, 0x2f, 0x0a, 0x03, 0x73, 0x73, 0x6f, 0x18, 0x01, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1d, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x43, 0x6c, 0x69, 0x65, 0x6e, 0x74, 0x2e, 0x53, 0x73, 0x6f, 0x52, 0x03, 0x73, 0x73, 0x6f, 0x1a, 0x61, 0x0a, 0x07, 0x42, 0x72, 0x65, 0x61, 0x6b, 0x65, 0x72, 0x12, 0x16, 0x0a, 0x06, 0x77, 0x69, 0x6e, 0x64, 0x6f, 0x77, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x06, 0x77, 0x69, 0x6e, 0x64, 0x6f, 0x77, 0x12, 0x16, 0x0a, 0x06, 0x62, 0x75, 0x63, 0x6b, 0x65, 0x74, 0x18, 0x02, 0x20, 0x01, 0x28, 0x05, 0x52, 0x06, 0x62, 0x75, 0x63, 0x6b, 0x65, 0x74, 0x12, 0x0c, 0x0a, 0x01, 0x6b, 0x18, 0x03, 0x20, 0x01, 0x28, 0x02, 0x52, 0x01, 0x6b, 0x12, 0x18, 0x0a, 0x07, 0x72, 0x65, 0x71, 0x75, 0x65, 0x73, 0x74, 0x18, 0x04, 0x20, 0x01, 0x28, 0x05, 0x52, 0x07, 0x72, 0x65, 0x71, 0x75, 0x65, 0x73, 0x74, 0x1a, 0x84, 0x01, 0x0a, 0x03, 0x53, 0x73, 0x6f, 0x12, 0x12, 0x0a, 0x04, 0x64, 0x69, 0x61, 0x6c, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x04, 0x64, 0x69, 0x61, 0x6c, 0x12, 0x18, 0x0a, 0x07, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x07, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x12, 0x12, 0x0a, 0x04, 0x7a, 0x6f, 0x6e, 0x65, 
0x18, 0x03, 0x20, 0x01, 0x28, 0x09, 0x52, 0x04, 0x7a, 0x6f, 0x6e, 0x65, 0x12, 0x3b, 0x0a, 0x07, 0x62, 0x72, 0x65, 0x61, 0x6b, 0x65, 0x72, 0x18, 0x04, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x21, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x43, 0x6c, 0x69, 0x65, 0x6e, 0x74, 0x2e, 0x42, 0x72, 0x65, 0x61, 0x6b, 0x65, 0x72, 0x52, 0x07, 0x62, 0x72, 0x65, 0x61, 0x6b, 0x65, 0x72, 0x22, 0xe7, 0x04, 0x0a, 0x04, 0x44, 0x61, 0x74, 0x61, 0x12, 0x36, 0x0a, 0x05, 0x61, 0x64, 0x6d, 0x69, 0x6e, 0x18, 0x01, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x20, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x62, 0x61, 0x73, 0x65, 0x52, 0x05, 0x61, 0x64, 0x6d, 0x69, 0x6e, 0x12, 0x32, 0x0a, 0x03, 0x73, 0x73, 0x6f, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x20, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x62, 0x61, 0x73, 0x65, 0x52, 0x03, 0x73, 0x73, 0x6f, 0x12, 0x34, 0x0a, 0x04, 0x74, 0x69, 0x6b, 0x75, 0x18, 0x03, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x20, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x62, 0x61, 0x73, 0x65, 0x52, 0x04, 0x74, 0x69, 0x6b, 0x75, 0x12, 0x33, 0x0a, 0x05, 0x72, 0x65, 0x64, 0x69, 0x73, 0x18, 0x04, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1d, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x2e, 0x52, 0x65, 0x64, 0x69, 0x73, 0x52, 0x05, 0x72, 0x65, 0x64, 0x69, 0x73, 0x12, 0x39, 0x0a, 0x07, 0x6d, 0x69, 0x67, 0x72, 0x61, 0x74, 0x65, 0x18, 0x05, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1f, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 
0x66, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x2e, 0x4d, 0x69, 0x67, 0x72, 0x61, 0x74, 0x65, 0x52, 0x07, 0x6d, 0x69, 0x67, 0x72, 0x61, 0x74, 0x65, 0x1a, 0x3a, 0x0a, 0x08, 0x44, 0x61, 0x74, 0x61, 0x62, 0x61, 0x73, 0x65, 0x12, 0x16, 0x0a, 0x06, 0x64, 0x72, 0x69, 0x76, 0x65, 0x72, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x06, 0x64, 0x72, 0x69, 0x76, 0x65, 0x72, 0x12, 0x16, 0x0a, 0x06, 0x73, 0x6f, 0x75, 0x72, 0x63, 0x65, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x06, 0x73, 0x6f, 0x75, 0x72, 0x63, 0x65, 0x1a, 0xf1, 0x01, 0x0a, 0x05, 0x52, 0x65, 0x64, 0x69, 0x73, 0x12, 0x18, 0x0a, 0x07, 0x6e, 0x65, 0x74, 0x77, 0x6f, 0x72, 0x6b, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x07, 0x6e, 0x65, 0x74, 0x77, 0x6f, 0x72, 0x6b, 0x12, 0x12, 0x0a, 0x04, 0x61, 0x64, 0x64, 0x72, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x04, 0x61, 0x64, 0x64, 0x72, 0x12, 0x3c, 0x0a, 0x0c, 0x72, 0x65, 0x61, 0x64, 0x5f, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x18, 0x03, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x67, 0x6f, 0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x44, 0x75, 0x72, 0x61, 0x74, 0x69, 0x6f, 0x6e, 0x52, 0x0b, 0x72, 0x65, 0x61, 0x64, 0x54, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x12, 0x3e, 0x0a, 0x0d, 0x77, 0x72, 0x69, 0x74, 0x65, 0x5f, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x18, 0x04, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x67, 0x6f, 0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x44, 0x75, 0x72, 0x61, 0x74, 0x69, 0x6f, 0x6e, 0x52, 0x0c, 0x77, 0x72, 0x69, 0x74, 0x65, 0x54, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x12, 0x3c, 0x0a, 0x0c, 0x64, 0x69, 0x61, 0x6c, 0x5f, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x18, 0x05, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x67, 0x6f, 0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x44, 0x75, 0x72, 0x61, 0x74, 0x69, 0x6f, 0x6e, 0x52, 0x0b, 0x64, 0x69, 0x61, 0x6c, 0x54, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x1a, 0x1d, 0x0a, 0x07, 0x4d, 0x69, 
0x67, 0x72, 0x61, 0x74, 0x65, 0x12, 0x12, 0x0a, 0x04, 0x70, 0x61, 0x74, 0x68, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x04, 0x70, 0x61, 0x74, 0x68, 0x22, 0xd0, 0x02, 0x0a, 0x03, 0x52, 0x65, 0x67, 0x12, 0x2f, 0x0a, 0x04, 0x65, 0x74, 0x63, 0x64, 0x18, 0x01, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1b, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x52, 0x65, 0x67, 0x2e, 0x45, 0x74, 0x63, 0x64, 0x52, 0x04, 0x65, 0x74, 0x63, 0x64, 0x12, 0x3b, 0x0a, 0x08, 0x69, 0x6e, 0x73, 0x74, 0x61, 0x6e, 0x63, 0x65, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1f, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x52, 0x65, 0x67, 0x2e, 0x49, 0x6e, 0x73, 0x74, 0x61, 0x6e, 0x63, 0x65, 0x52, 0x08, 0x69, 0x6e, 0x73, 0x74, 0x61, 0x6e, 0x63, 0x65, 0x1a, 0x61, 0x0a, 0x04, 0x45, 0x74, 0x63, 0x64, 0x12, 0x1c, 0x0a, 0x09, 0x65, 0x6e, 0x64, 0x70, 0x6f, 0x69, 0x6e, 0x74, 0x73, 0x18, 0x01, 0x20, 0x03, 0x28, 0x09, 0x52, 0x09, 0x65, 0x6e, 0x64, 0x70, 0x6f, 0x69, 0x6e, 0x74, 0x73, 0x12, 0x3b, 0x0a, 0x0b, 0x64, 0x69, 0x61, 0x6c, 0x54, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x67, 0x6f, 0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x44, 0x75, 0x72, 0x61, 0x74, 0x69, 0x6f, 0x6e, 0x52, 0x0b, 0x64, 0x69, 0x61, 0x6c, 0x54, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x1a, 0x78, 0x0a, 0x08, 0x49, 0x6e, 0x73, 0x74, 0x61, 0x6e, 0x63, 0x65, 0x12, 0x12, 0x0a, 0x04, 0x7a, 0x6f, 0x6e, 0x65, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x04, 0x7a, 0x6f, 0x6e, 0x65, 0x12, 0x10, 0x0a, 0x03, 0x65, 0x6e, 0x76, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x03, 0x65, 0x6e, 0x76, 0x12, 0x1a, 0x0a, 0x08, 0x68, 0x6f, 0x73, 0x74, 0x6e, 0x61, 0x6d, 0x65, 0x18, 0x03, 0x20, 0x01, 0x28, 0x09, 0x52, 0x08, 0x68, 0x6f, 0x73, 0x74, 0x6e, 0x61, 0x6d, 0x65, 0x12, 0x14, 0x0a, 0x05, 0x61, 0x70, 0x70, 0x69, 0x64, 0x18, 0x04, 0x20, 0x01, 
0x28, 0x09, 0x52, 0x05, 0x61, 0x70, 0x70, 0x69, 0x64, 0x12, 0x14, 0x0a, 0x05, 0x61, 0x64, 0x64, 0x72, 0x73, 0x18, 0x05, 0x20, 0x03, 0x28, 0x09, 0x52, 0x05, 0x61, 0x64, 0x64, 0x72, 0x73, 0x22, 0x4a, 0x0a, 0x06, 0x4c, 0x6f, 0x67, 0x67, 0x65, 0x72, 0x12, 0x12, 0x0a, 0x04, 0x70, 0x61, 0x74, 0x68, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x04, 0x70, 0x61, 0x74, 0x68, 0x12, 0x16, 0x0a, 0x06, 0x73, 0x74, 0x64, 0x6f, 0x75, 0x74, 0x18, 0x02, 0x20, 0x01, 0x28, 0x08, 0x52, 0x06, 0x73, 0x74, 0x64, 0x6f, 0x75, 0x74, 0x12, 0x14, 0x0a, 0x05, 0x6c, 0x65, 0x76, 0x65, 0x6c, 0x18, 0x03, 0x20, 0x01, 0x28, 0x05, 0x52, 0x05, 0x6c, 0x65, 0x76, 0x65, 0x6c, 0x22, 0xf6, 0x04, 0x0a, 0x03, 0x41, 0x70, 0x70, 0x12, 0x2c, 0x0a, 0x03, 0x6a, 0x77, 0x74, 0x18, 0x01, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1a, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x41, 0x70, 0x70, 0x2e, 0x4a, 0x77, 0x74, 0x52, 0x03, 0x6a, 0x77, 0x74, 0x12, 0x2c, 0x0a, 0x03, 0x67, 0x65, 0x6e, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1a, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x41, 0x70, 0x70, 0x2e, 0x47, 0x65, 0x6e, 0x52, 0x03, 0x67, 0x65, 0x6e, 0x12, 0x35, 0x0a, 0x05, 0x74, 0x6f, 0x6f, 0x6c, 0x73, 0x18, 0x03, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1f, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x41, 0x70, 0x70, 0x2e, 0x44, 0x61, 0x74, 0x61, 0x62, 0x61, 0x73, 0x65, 0x52, 0x05, 0x74, 0x6f, 0x6f, 0x6c, 0x73, 0x12, 0x2f, 0x0a, 0x04, 0x74, 0x69, 0x6b, 0x75, 0x18, 0x04, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1b, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x41, 0x70, 0x70, 0x2e, 0x54, 0x69, 0x6b, 0x75, 0x52, 0x04, 0x74, 0x69, 0x6b, 0x75, 0x1a, 0x52, 0x0a, 0x03, 0x4a, 0x77, 0x74, 0x12, 0x16, 0x0a, 0x06, 0x73, 0x65, 0x63, 0x72, 0x65, 0x74, 0x18, 0x01, 0x20, 0x01, 0x28, 
0x09, 0x52, 0x06, 0x73, 0x65, 0x63, 0x72, 0x65, 0x74, 0x12, 0x33, 0x0a, 0x07, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x18, 0x02, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x19, 0x2e, 0x67, 0x6f, 0x6f, 0x67, 0x6c, 0x65, 0x2e, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x62, 0x75, 0x66, 0x2e, 0x44, 0x75, 0x72, 0x61, 0x74, 0x69, 0x6f, 0x6e, 0x52, 0x07, 0x74, 0x69, 0x6d, 0x65, 0x6f, 0x75, 0x74, 0x1a, 0x54, 0x0a, 0x08, 0x54, 0x65, 0x6d, 0x70, 0x6c, 0x61, 0x74, 0x65, 0x12, 0x14, 0x0a, 0x05, 0x6d, 0x6f, 0x64, 0x65, 0x6c, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x05, 0x6d, 0x6f, 0x64, 0x65, 0x6c, 0x12, 0x10, 0x0a, 0x03, 0x64, 0x61, 0x6f, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x03, 0x64, 0x61, 0x6f, 0x12, 0x0e, 0x0a, 0x02, 0x6a, 0x73, 0x18, 0x03, 0x20, 0x01, 0x28, 0x09, 0x52, 0x02, 0x6a, 0x73, 0x12, 0x10, 0x0a, 0x03, 0x76, 0x75, 0x65, 0x18, 0x04, 0x20, 0x01, 0x28, 0x09, 0x52, 0x03, 0x76, 0x75, 0x65, 0x1a, 0xa8, 0x01, 0x0a, 0x03, 0x47, 0x65, 0x6e, 0x12, 0x12, 0x0a, 0x04, 0x6e, 0x61, 0x6d, 0x65, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x04, 0x6e, 0x61, 0x6d, 0x65, 0x12, 0x16, 0x0a, 0x06, 0x64, 0x62, 0x6e, 0x61, 0x6d, 0x65, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x06, 0x64, 0x62, 0x6e, 0x61, 0x6d, 0x65, 0x12, 0x1c, 0x0a, 0x09, 0x66, 0x72, 0x6f, 0x6e, 0x74, 0x70, 0x61, 0x74, 0x68, 0x18, 0x03, 0x20, 0x01, 0x28, 0x09, 0x52, 0x09, 0x66, 0x72, 0x6f, 0x6e, 0x74, 0x70, 0x61, 0x74, 0x68, 0x12, 0x1a, 0x0a, 0x08, 0x62, 0x61, 0x63, 0x6b, 0x70, 0x61, 0x74, 0x68, 0x18, 0x04, 0x20, 0x01, 0x28, 0x09, 0x52, 0x08, 0x62, 0x61, 0x63, 0x6b, 0x70, 0x61, 0x74, 0x68, 0x12, 0x3b, 0x0a, 0x08, 0x74, 0x65, 0x6d, 0x70, 0x6c, 0x61, 0x74, 0x65, 0x18, 0x05, 0x20, 0x01, 0x28, 0x0b, 0x32, 0x1f, 0x2e, 0x73, 0x79, 0x73, 0x2e, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2e, 0x63, 0x6f, 0x6e, 0x66, 0x2e, 0x41, 0x70, 0x70, 0x2e, 0x54, 0x65, 0x6d, 0x70, 0x6c, 0x61, 0x74, 0x65, 0x52, 0x08, 0x74, 0x65, 0x6d, 0x70, 0x6c, 0x61, 0x74, 0x65, 0x1a, 0x3a, 0x0a, 0x08, 0x44, 0x61, 0x74, 0x61, 0x62, 0x61, 0x73, 0x65, 0x12, 0x16, 
0x0a, 0x06, 0x64, 0x72, 0x69, 0x76, 0x65, 0x72, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x06, 0x64, 0x72, 0x69, 0x76, 0x65, 0x72, 0x12, 0x16, 0x0a, 0x06, 0x73, 0x6f, 0x75, 0x72, 0x63, 0x65, 0x18, 0x02, 0x20, 0x01, 0x28, 0x09, 0x52, 0x06, 0x73, 0x6f, 0x75, 0x72, 0x63, 0x65, 0x1a, 0x1a, 0x0a, 0x04, 0x54, 0x69, 0x6b, 0x75, 0x12, 0x12, 0x0a, 0x04, 0x70, 0x61, 0x74, 0x68, 0x18, 0x01, 0x20, 0x01, 0x28, 0x09, 0x52, 0x04, 0x70, 0x61, 0x74, 0x68, 0x42, 0x1c, 0x5a, 0x1a, 0x65, 0x64, 0x75, 0x2f, 0x73, 0x79, 0x73, 0x2f, 0x69, 0x6e, 0x74, 0x65, 0x72, 0x6e, 0x61, 0x6c, 0x2f, 0x63, 0x6f, 0x6e, 0x66, 0x3b, 0x63, 0x6f, 0x6e, 0x66, 0x62, 0x06, 0x70, 0x72, 0x6f, 0x74, 0x6f, 0x33, } var ( file_service_sys_internal_conf_conf_proto_rawDescOnce sync.Once file_service_sys_internal_conf_conf_proto_rawDescData = file_service_sys_internal_conf_conf_proto_rawDesc ) func file_service_sys_internal_conf_conf_proto_rawDescGZIP() []byte { file_service_sys_internal_conf_conf_proto_rawDescOnce.Do(func() { file_service_sys_internal_conf_conf_proto_rawDescData = protoimpl.X.CompressGZIP(file_service_sys_internal_conf_conf_proto_rawDescData) }) return file_service_sys_internal_conf_conf_proto_rawDescData } var file_service_sys_internal_conf_conf_proto_msgTypes = make([]protoimpl.MessageInfo, 24) var file_service_sys_internal_conf_conf_proto_goTypes = []interface{}{ (*Bootstrap)(nil), // 0: sys.internal.conf.Bootstrap (*Service)(nil), // 1: sys.internal.conf.Service (*Server)(nil), // 2: sys.internal.conf.Server (*Client)(nil), // 3: sys.internal.conf.Client (*Data)(nil), // 4: sys.internal.conf.Data (*Reg)(nil), // 5: sys.internal.conf.Reg (*Logger)(nil), // 6: sys.internal.conf.Logger (*App)(nil), // 7: sys.internal.conf.App (*Server_HTTP)(nil), // 8: sys.internal.conf.Server.HTTP (*Server_GRPC)(nil), // 9: sys.internal.conf.Server.GRPC (*Server_JWT)(nil), // 10: sys.internal.conf.Server.JWT (*Server_Middleware)(nil), // 11: sys.internal.conf.Server.Middleware (*Client_Breaker)(nil), // 12: 
sys.internal.conf.Client.Breaker (*Client_Sso)(nil), // 13: sys.internal.conf.Client.Sso (*Data_Database)(nil), // 14: sys.internal.conf.Data.Database (*Data_Redis)(nil), // 15: sys.internal.conf.Data.Redis (*Data_Migrate)(nil), // 16: sys.internal.conf.Data.Migrate (*Reg_Etcd)(nil), // 17: sys.internal.conf.Reg.Etcd (*Reg_Instance)(nil), // 18: sys.internal.conf.Reg.Instance (*App_Jwt)(nil), // 19: sys.internal.conf.App.Jwt (*App_Template)(nil), // 20: sys.internal.conf.App.Template (*App_Gen)(nil), // 21: sys.internal.conf.App.Gen (*App_Database)(nil), // 22: sys.internal.conf.App.Database (*App_Tiku)(nil), // 23: sys.internal.conf.App.Tiku (*durationpb.Duration)(nil), // 24: google.protobuf.Duration } var file_service_sys_internal_conf_conf_proto_depIdxs = []int32{ 1, // 0: sys.internal.conf.Bootstrap.service:type_name -> sys.internal.conf.Service 2, // 1: sys.internal.conf.Bootstrap.server:type_name -> sys.internal.conf.Server 3, // 2: sys.internal.conf.Bootstrap.client:type_name -> sys.internal.conf.Client 4, // 3: sys.internal.conf.Bootstrap.data:type_name -> sys.internal.conf.Data 5, // 4: sys.internal.conf.Bootstrap.reg:type_name -> sys.internal.conf.Reg 6, // 5: sys.internal.conf.Bootstrap.logger:type_name -> sys.internal.conf.Logger 7, // 6: sys.internal.conf.Bootstrap.app:type_name -> sys.internal.conf.App 8, // 7: sys.internal.conf.Server.http:type_name -> sys.internal.conf.Server.HTTP 9, // 8: sys.internal.conf.Server.grpc:type_name -> sys.internal.conf.Server.GRPC 11, // 9: sys.internal.conf.Server.middleware:type_name -> sys.internal.conf.Server.Middleware 13, // 10: sys.internal.conf.Client.sso:type_name -> sys.internal.conf.Client.Sso 14, // 11: sys.internal.conf.Data.admin:type_name -> sys.internal.conf.Data.Database 14, // 12: sys.internal.conf.Data.sso:type_name -> sys.internal.conf.Data.Database 14, // 13: sys.internal.conf.Data.tiku:type_name -> sys.internal.conf.Data.Database 15, // 14: sys.internal.conf.Data.redis:type_name -> 
sys.internal.conf.Data.Redis 16, // 15: sys.internal.conf.Data.migrate:type_name -> sys.internal.conf.Data.Migrate 17, // 16: sys.internal.conf.Reg.etcd:type_name -> sys.internal.conf.Reg.Etcd 18, // 17: sys.internal.conf.Reg.instance:type_name -> sys.internal.conf.Reg.Instance 19, // 18: sys.internal.conf.App.jwt:type_name -> sys.internal.conf.App.Jwt 21, // 19: sys.internal.conf.App.gen:type_name -> sys.internal.conf.App.Gen 22, // 20: sys.internal.conf.App.tools:type_name -> sys.internal.conf.App.Database 23, // 21: sys.internal.conf.App.tiku:type_name -> sys.internal.conf.App.Tiku 24, // 22: sys.internal.conf.Server.HTTP.timeout:type_name -> google.protobuf.Duration 24, // 23: sys.internal.conf.Server.GRPC.timeout:type_name -> google.protobuf.Duration 24, // 24: sys.internal.conf.Server.JWT.timeout:type_name -> google.protobuf.Duration 10, // 25: sys.internal.conf.Server.Middleware.jwt:type_name -> sys.internal.conf.Server.JWT 12, // 26: sys.internal.conf.Client.Sso.breaker:type_name -> sys.internal.conf.Client.Breaker 24, // 27: sys.internal.conf.Data.Redis.read_timeout:type_name -> google.protobuf.Duration 24, // 28: sys.internal.conf.Data.Redis.write_timeout:type_name -> google.protobuf.Duration 24, // 29: sys.internal.conf.Data.Redis.dial_timeout:type_name -> google.protobuf.Duration 24, // 30: sys.internal.conf.Reg.Etcd.dialTimeout:type_name -> google.protobuf.Duration 24, // 31: sys.internal.conf.App.Jwt.timeout:type_name -> google.protobuf.Duration 20, // 32: sys.internal.conf.App.Gen.template:type_name -> sys.internal.conf.App.Template 33, // [33:33] is the sub-list for method output_type 33, // [33:33] is the sub-list for method input_type 33, // [33:33] is the sub-list for extension type_name 33, // [33:33] is the sub-list for extension extendee 0, // [0:33] is the sub-list for field type_name } func init() { file_service_sys_internal_conf_conf_proto_init() } func file_service_sys_internal_conf_conf_proto_init() { if 
File_service_sys_internal_conf_conf_proto != nil { return } if !protoimpl.UnsafeEnabled { file_service_sys_internal_conf_conf_proto_msgTypes[0].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Bootstrap); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[1].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Service); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[2].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Server); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[3].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Client); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[4].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Data); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[5].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Reg); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[6].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Logger); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[7].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*App); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return 
&v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[8].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Server_HTTP); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[9].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Server_GRPC); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[10].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Server_JWT); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[11].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Server_Middleware); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[12].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Client_Breaker); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[13].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Client_Sso); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[14].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Data_Database); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[15].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Data_Redis); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return 
&v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[16].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Data_Migrate); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[17].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Reg_Etcd); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[18].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*Reg_Instance); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[19].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*App_Jwt); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[20].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*App_Template); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[21].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*App_Gen); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[22].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*App_Database); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields default: return nil } } file_service_sys_internal_conf_conf_proto_msgTypes[23].Exporter = func(v interface{}, i int) interface{} { switch v := v.(*App_Tiku); i { case 0: return &v.state case 1: return &v.sizeCache case 2: return &v.unknownFields 
default: return nil } } } type x struct{} out := protoimpl.TypeBuilder{ File: protoimpl.DescBuilder{ GoPackagePath: reflect.TypeOf(x{}).PkgPath(), RawDescriptor: file_service_sys_internal_conf_conf_proto_rawDesc, NumEnums: 0, NumMessages: 24, NumExtensions: 0, NumServices: 0, }, GoTypes: file_service_sys_internal_conf_conf_proto_goTypes, DependencyIndexes: file_service_sys_internal_conf_conf_proto_depIdxs, MessageInfos: file_service_sys_internal_conf_conf_proto_msgTypes, }.Build() File_service_sys_internal_conf_conf_proto = out.File file_service_sys_internal_conf_conf_proto_rawDesc = nil file_service_sys_internal_conf_conf_proto_goTypes = nil file_service_sys_internal_conf_conf_proto_depIdxs = nil }
import os
import param
import logging
import cartopy.crs as ccrs
from holoviews import Path
from geoviews import Path as GeoPath
from genesis.model import Model
from genesis.util import Projection
from .mesh import AdhMesh
import holoviews.plotting.bokeh
import geoviews.plotting.bokeh

log = logging.getLogger('AdhModel.adh_model')


class AdhModel(Model):
    """
    Class object to hold all data related to an AdH simulation.

    The AdhModel object primarily stores, modifies, and outputs data. It contains
    pass-through methods for reading and writing the mesh, hotstart, boundary
    condition, and result files. Included are methods to read individual AdH files
    or read a suite of model files. There is also a validate method to ensure that
    the data within the model is valid and consistent across internal objects.

    An AdhModel object contains a suite of model parameters and one AdhMesh object.
    The AdhMesh object contains the mesh itself and an AdhSimulation object. The
    AdhSimulation object stores the boundary condition information, the hotstart,
    and the results - all the data for an individual model run.
""" model_name = param.String( default="adh_model", doc="File name prefix for ADH input files.", ) model_path = param.Foldername( default=os.getcwd(), doc="Path on disk where ADH files are located.", ) version = param.Number( default=5.0, bounds=(4.5, None), softbounds=(4.5, 5.0), doc="Version of AdH executable" ) project_name = param.String( default='default', doc='Global project name' ) units = param.ObjectSelector( default='meters', objects=['meters', 'feet'] ) path_type = param.ClassSelector(default=GeoPath, class_=Path, is_instance=False, doc=""" The element type to draw into.""") mesh = param.ClassSelector(class_=AdhMesh) def __init__(self, **params): super(AdhModel, self).__init__(**params) proj = Projection() if 'crs' in params: proj.set_crs(params['crs']) else: proj.set_crs(ccrs.GOOGLE_MERCATOR) self.mesh = AdhMesh(projection=proj) @property def projection(self): return self.mesh.projection @property def simulation(self): return self.mesh.current_sim def read(self, path, project_name='*', crs=None, fmt='nc'): """Read in AdH model files as an xarray.Dataset object Args: path(str, required): path to the AdH project files project_name(str, optional, default='*'): the root name of the AdH project. If not specified then it will be derived from the first mesh file found in `path`. crs(cartopy.CRS, optional, default=None): The projection of the mesh file. fmt(stt, optional, default='nc'): The format of the file being passed in. Valid options are ['nc', '2dm', '3dm'] Returns: xarray.Dataset variables for the nodes, mesh elements, output datasets and hot-start file datasets """ #TODO look at filename for default format? 
        fmts = {
            'nc': self.from_netcdf,
            'ascii': self.from_ascii,
            '2dm': self.from_ascii,
            '3dm': self.from_ascii
        }
        return fmts[fmt](path=path, project_name=project_name, crs=crs)

    def write(self, path, fmt='nc'):
        if fmt != 'nc':
            raise IOError('The only option currently available is nc (netcdf)')
        else:
            # write mesh
            self.write_mesh(file_name=path, fmt=fmt)
            # write hotstart
            self.write_hotstart(file_name=path)
            # write boundary conditions
            self.write_bc(file_name=path, validate=True, fmt='bc')
            # write results
            self.write_results(file_name=path, fmt='nc')

    def from_netcdf(self, *args, **kwargs):
        """Read suite of model files from netcdf file and store data in this model object

        NOTE: Boundary conditions are not stored in netcdf, so they must be read from *.bc file

        Args:
            *args: variable length argument list.
            **kwargs: arbitrary keyword arguments.

        Returns:
            None
        """
        nc_file = os.path.join(f'{kwargs["path"]}', f'{kwargs["project_name"]}.nc')
        # read mesh
        self.read_mesh(*args, **kwargs)
        # read hotstart
        self.read_hotstart(path=nc_file, fmt='nc')
        # read boundary conditions (must be read as ascii)
        # todo add warning?
        bc_file = os.path.join(f'{kwargs["path"]}', f'{kwargs["project_name"]}.bc')
        self.read_bc(bc_file, fmt='bc')
        # read results
        self.read_results(path=nc_file, fmt='nc')

    def from_ascii(self, *args, **kwargs):
        """Read suite of model files from ASCII file and store data in this model object

        Args:
            *args: variable length argument list.
            **kwargs: arbitrary keyword arguments.
        Returns:
            None
        """
        # set the mesh file name
        mesh_file = os.path.join(f'{kwargs["path"]}', f'{kwargs["project_name"]}.3dm')
        # read mesh
        self.read_mesh(mesh_file, project_name=kwargs['project_name'], crs=kwargs['crs'], fmt='3dm')
        # set hotstart file name
        hot_file = os.path.join(f'{kwargs["path"]}', f'{kwargs["project_name"]}.hot')
        # read hotstart
        self.read_hotstart(path=hot_file, fmt='ascii')
        # read boundary conditions
        bc_file = os.path.join(f'{kwargs["path"]}', f'{kwargs["project_name"]}.bc')
        self.read_bc(bc_file, fmt='bc')
        # read results
        self.read_results(kwargs['path'], project_name=kwargs['project_name'], fmt='ascii')

    def read_mesh(self, *args, **kwargs):
        return self.mesh.read(*args, **kwargs)

    def write_mesh(self, *args, **kwargs):
        return self.mesh.write(*args, **kwargs)

    def read_bc(self, *args, **kwargs):
        return self.simulation.read_bc(*args, **kwargs)

    def write_bc(self, *args, **kwargs):
        return self.simulation.write_bc(*args, **kwargs)

    def read_hotstart(self, *args, **kwargs):
        return self.simulation.read_hotstart(*args, **kwargs)

    def write_hotstart(self, *args, **kwargs):
        return self.simulation.write_hotstart(*args, **kwargs)

    def read_results(self, *args, **kwargs):
        return self.simulation.read_results(*args, **kwargs)

    def write_results(self, *args, **kwargs):
        return self.simulation.write_results(*args, **kwargs)

    def read_result(self, *args, **kwargs):
        return self.simulation.read_result(*args, **kwargs)

    def write_result(self, *args, **kwargs):
        return self.simulation.write_result(*args, **kwargs)

    def validate(self):
        # ensure mesh units and model units match
        if self.units != self.mesh.units:
            log.warning('Model units do not match mesh units')


class AdhModelCoupled(AdhModel):
    """
    Class object for holding AdhModel information to be used for coupling models.
""" def __init__(self, **params): super(AdhModelCoupled, self).__init__(**params) self.configurationMode = None # Str: Read previous solution, initialize from previous, read start, new # Calculation configuration self.numberOfProcessors = None # Int: Number of processors available for the simulation run # Coupling to STWave controls self.coupleMode = None # Str: Couple mode to STWave self.coupleValue = None # Str: Simulation time or number of couples, depending on mode # # Output time parameters self.timeOutputSeriesMethod = None # Str: Method for creating the output time series self.timeOutputSeriesIncrement = None # Str: Time increment if manually constructing the timestep series self.timeOutputSeriesFilename = None # Str: Input filename for a custom created series read from file self.timeOutputSeries = [] # Constructed output time series self.simulationStartTime = None # Int: Start time of the current simulation self.simulationStopTime = None # Int: Stop time of the current simulation self.timeInputSeriesIncrement = None # Int: Time increment if manually constructing the timestep series
// (C) Copyright Beman Dawes 1999-2003. Distributed under the Boost
// Software License, Version 1.0. (See accompanying file
// LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)

// Contributed by Dave Abrahams
// See http://www.boost.org/libs/utility for documentation.

#ifndef DLIB_BOOST_NONCOPYABLE_HPP_INCLUDED
#define DLIB_BOOST_NONCOPYABLE_HPP_INCLUDED

namespace dlib
{
    class noncopyable
    {
        /*!
            This class makes it easier to declare a class as non-copyable.
            If you want to make an object that can't be copied just inherit
            from this object.
        !*/
    protected:
        noncopyable() = default;
        ~noncopyable() = default;
    private:
        // deleted to emphasize that this class is non-copyable
        noncopyable(const noncopyable&) = delete;
        noncopyable& operator=(const noncopyable&) = delete;
    };
}

#endif // DLIB_BOOST_NONCOPYABLE_HPP_INCLUDED
// AcceptBlock handles insertion of new blocks into the block chain and merkle tree.
// It will verify the block before actually inserting it; if the block is invalid, an error will be returned.
// For now, it's the caller's responsibility to make sure the given block is under consensus.
// If VerifyBlock returns an error but the block's shard and version are right, it will cache the block and try to catch up to the latest block.
// It is used to facilitate the reconstruction of consensus.
func (abc *appBlockChain) AcceptBlock(block *wire.MsgBlock) error {
	if block == nil {
		abc.log.Errorf("can't receive nil")
		return fmt.Errorf("block is nil")
	}
	if abc.shardHeight+1 > block.Header.Height {
		return fmt.Errorf("the block may have been accepted")
	}
	abc.log.Debugf("Before accept block, height: %d, prevHeader: %v, prevHeader.Height: %v, reshardSeed: %v",
		block.Header.Height, abc.prevHeader.BlockHeaderHash(), abc.prevHeader.Height, abc.prevReshardHeader.ReshardSeed)

	if err := abc.blockChain.ReceiveHeader(&block.Header); err != nil {
		abc.log.Debugf("Failed to call blockChain.ReceiveHeader on block %v", block)
	}

	if err := abc.VerifyBlock(block); err != nil {
		abc.log.Debugf("Can't verify block: %v, abci will fetch new ledger at height %d", err, block.Header.Height)
		abc.fetchNewLedger()
		if !bytes.Equal(block.Header.ReshardSeed, abc.prevReshardHeader.ReshardSeed) {
			abc.log.Debugf("Update reshard seed at height %d", block.Header.Height)
			abc.prevReshardHeader = block.Header
		}
		return err
	}

	update, err := abc.state.GetUpdateWithFullBlock(block)
	if err != nil {
		panic(err)
	}
	err = abc.dPool.Update(update, abc.shardIndex, wire.BlockHeight(block.Header.Height))
	if err != nil {
		abc.log.Errorf("Update deposit pool error: %v", err)
		panic(err)
	}
	abc.txPool.RefreshPool(update)
	abc.smartContractDataStore.RefreshDataStore(update, block.Body.UpdateActions)
	err = abc.state.ApplyUpdate(update)
	if err != nil {
		panic("failed to apply update on abci")
	}
	preBlock := abc.prevBlock
	if preBlock != nil {
		err := abc.updateLastBlockDeposit(update, preBlock)
		if err != nil {
			panic(err)
		}
	}
	abc.blockAccepted(block)
	return nil
}
import { Component } from "@angular/core";
import { GameStateView, GameStateViewBase } from "../view-base.component";

@Component({
    selector: 'ultimate-view',
    templateUrl: 'ultimate-view.component.html'
})
export class UltimateViewComponent extends GameStateViewBase<any> {
    ngOnInit() { }
    ngOnDestroy() { }
}

export const ViewUltimate: GameStateView = {
    id: 'VIEW_ULTIMATE',
    label: 'Ultimate status',
    icon: 'flare',
    description: 'Shows your ultimate level + cooldown.',
    columnSize: 1,
    rowSize: 1,
    viewComponent: UltimateViewComponent,
    viewSettings: []
};
import cv2
import numpy as np

from face_common import get_landmarks, get_landmarks_points


def floating_face(models, image, background):
    face, landmarks = get_landmarks(models, image)
    lps = get_landmarks_points(landmarks)
    np_points = np.array(lps, np.int32)
    hull = cv2.convexHull(np_points)

    # mask = np.zeros((image.shape[1], image.shape[0]), dtype='uint8')
    mask = np.zeros_like(image)
    cv2.drawContours(mask, [hull], 0, color=(255, 255, 255), thickness=-1)
    mask = cv2.GaussianBlur(mask, ksize=(19, 19), sigmaX=16)
    mask = mask.astype(float) / 255
    mask_inv = 1 - mask
    ret = (image * mask + background * mask_inv).astype(dtype='uint8')
    return ret
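The heart of floating_face is feathered alpha compositing: a soft mask in [0, 1] weights the face pixels over the background. Here is a minimal NumPy-only sketch of that blend, where a hand-made gradient mask stands in for the blurred convex-hull mask and the `blend` helper name is my own:

```python
import numpy as np

def blend(image, background, mask):
    # Composite `image` over `background` with a float mask in [0, 1];
    # mask == 1 keeps the image pixel, mask == 0 keeps the background.
    mask = mask.astype(float)
    return (image * mask + background * (1 - mask)).astype('uint8')

# Toy 1x4 grayscale "images" and a soft mask ramping from 0 to 1,
# mimicking the feathered edge produced by GaussianBlur in floating_face.
image = np.full((1, 4), 200, dtype='uint8')
background = np.full((1, 4), 40, dtype='uint8')
mask = np.array([[0.0, 0.25, 0.75, 1.0]])

out = blend(image, background, mask)  # pixel-wise weighted average
```

The Gaussian blur in the original serves exactly this purpose: it turns the hard convex-hull silhouette into a gradual ramp so the composite has no visible seam.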
{-# LANGUAGE NoImplicitPrelude #-}
{-# LANGUAGE QuasiQuotes #-}
{-# LANGUAGE TemplateHaskell #-}
module Handler.Blog
    ( getBlogHomeR
    , getBlogPostR
    , getBlogFeedR
    ) where

import Data.WebsiteContent
import Import
import Yesod.AtomFeed (atomLink)
import RIO.Time (getCurrentTime)

getAddPreview :: Handler (Route App -> (Route App, [(Text, Text)]))
getAddPreview = do
    mpreview <- lookupGetParam "preview"
    case mpreview of
        Just "true" -> return $ \route -> (route, [("preview", "true")])
        _ -> return $ \route -> (route, [])

getBlogHomeR :: Handler ()
getBlogHomeR = do
    cacheSeconds 3600
    posts <- getPosts
    case headMay posts of
        Nothing -> notFound
        Just post -> do
            addPreview <- getAddPreview
            redirect $ addPreview $ BlogPostR (postYear post) (postMonth post) (postSlug post)

getBlogPostR :: Year -> Month -> Text -> Handler Html
getBlogPostR year month slug = do
    cacheSeconds 3600
    posts <- getPosts
    post <- maybe notFound return $ find matches posts
    now <- getCurrentTime
    addPreview <- getAddPreview
    defaultLayout $ do
        setTitle $ toHtml $ postTitle post
        atomLink BlogFeedR "Stackage Curator blog"
        $(widgetFile "blog-post")
        toWidgetHead [shamlet|<meta name=og:description value=#{postDescription post}>|]
  where
    matches p = postYear p == year && postMonth p == month && postSlug p == slug

getBlogFeedR :: Handler TypedContent
getBlogFeedR = do
    cacheSeconds 3600
    posts <- fmap (take 10) getPosts
    latest <- maybe notFound return $ headMay posts
    newsFeed Feed
        { feedTitle = "Stackage Curator blog"
        , feedLinkSelf = BlogFeedR
        , feedLinkHome = HomeR
        , feedAuthor = "The Stackage Curator team"
        , feedDescription = "Messages from the Stackage Curators about the Stackage project"
        , feedLanguage = "en"
        , feedUpdated = postTime latest
        , feedLogo = Nothing
        , feedEntries = map toEntry $ toList posts
        }
  where
    toEntry post = FeedEntry
        { feedEntryLink = BlogPostR (postYear post) (postMonth post) (postSlug post)
        , feedEntryUpdated = postTime post
        , feedEntryTitle = postTitle post
        , feedEntryContent = postBody post
        , feedEntryEnclosure = Nothing
        , feedEntryCategories = []
        }
-- test/Test.hs
{-# LANGUAGE OverloadedStrings #-}
import YGG
import ScrapHTML
import Text.XML.HXT.Core
import Paths_hYGG
import Config
import Utils
import Data.Maybe

c = Config { port = 8080
           , hostName = "https://www3.yggtorrent.nz"
           , yggid = ""
           , yggpass = ""
           , yggcookie = "" }

exSearchurl = "https://www3.yggtorrent.nz/engine/search?name=luca&do=search&order=asc&sort=name&page=50"
exTIurl = "https://www3.yggtorrent.nz/torrent/audio/musique/496268-andy+y+lucas+discography+albums+2003+2018+mp3+320+freek911"

testConfig = setDefaultCookie c >>= print

testTI = runX (getdoc c exTIurl /> hasName "html" /> hasName "head"
               /> filterA (getName >>> isA (/= "script"))) >>= print

testWr = runLA $ root [] [mkelem "aah" [] []] >>> writeDocumentToString [withOutputHTML]

testUD = do
    connectCookie c
    r <- runX $ getdoc c (hostName c <> "/user/account") >>> selectUserData >>> xunpickleVal xpTree
    print r

main = testUD
import numpy as np
import tensorflow as tf


def make_tfrecord(atom_data, mask_data, nlist, peak_data, residue, atom_names,
                  weights=None, indices=np.zeros((3, 1), dtype=np.int64)):
    features = {}
    N = atom_data.shape[0]
    NN = nlist.shape[1]
    assert mask_data.shape[0] == N
    assert nlist.shape[0] == N and nlist.shape[2] == 3
    assert peak_data.shape[0] == N
    assert atom_names.shape[0] == N
    assert len(indices) == 3
    if np.any(np.isnan(peak_data)):
        raise ValueError('Found nan in your data!')
    # Defensive clean-up (a no-op whenever the check above did not raise).
    peak_data[np.isnan(peak_data)] = 0
    mask_data[np.isnan(peak_data)] = 0
    if np.any(np.abs(peak_data) > 10000):
        raise ValueError('Found very large peaks, |v| > 10000')
    features['atom-number'] = tf.train.Feature(
        int64_list=tf.train.Int64List(value=[N]))
    features['neighbor-number'] = tf.train.Feature(
        int64_list=tf.train.Int64List(value=[NN]))
    features['bond-data'] = tf.train.Feature(
        float_list=tf.train.FloatList(value=nlist.flatten()))
    features['atom-data'] = tf.train.Feature(
        int64_list=tf.train.Int64List(value=atom_data.flatten()))
    features['peak-data'] = tf.train.Feature(
        float_list=tf.train.FloatList(value=peak_data.flatten()))
    features['mask-data'] = tf.train.Feature(
        float_list=tf.train.FloatList(value=mask_data.flatten()))
    features['name-data'] = tf.train.Feature(
        int64_list=tf.train.Int64List(value=atom_names.flatten()))
    features['residue'] = tf.train.Feature(
        int64_list=tf.train.Int64List(value=[residue]))
    features['indices'] = tf.train.Feature(
        int64_list=tf.train.Int64List(value=indices.flatten()))
    example = tf.train.Example(features=tf.train.Features(feature=features))
    return example
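The NaN handling in make_tfrecord is worth isolating: since `np.isnan(peak_data)` is re-evaluated on the mask line after the peaks were already zeroed, the mask entries are never actually cleared (and the preceding raise means neither line runs on NaN data anyway). Below is a NumPy-only sketch of the clean-up as presumably intended, computing the NaN locations once up front; the `sanitize` helper name is my own, and this assumes the hard raise on NaN is skipped:

```python
import numpy as np

def sanitize(peak_data, mask_data):
    # Locate NaNs once, before any mutation, so both arrays are
    # corrected at the same positions.
    nan_mask = np.isnan(peak_data)
    peak_data = np.where(nan_mask, 0.0, peak_data)
    mask_data = np.where(nan_mask, 0.0, mask_data)
    return peak_data, mask_data

peaks = np.array([1.5, np.nan, -2.0])
mask = np.array([1.0, 1.0, 1.0])
peaks, mask = sanitize(peaks, mask)
# The NaN peak is zeroed and its mask entry cleared in lockstep.
```

Capturing the boolean mask before mutating either array is the key design choice: it keeps the peak values and their validity mask consistent with each other.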
// Fuzz the H1/H2 codec implementations.
DEFINE_PROTO_FUZZER(const test::common::http::CodecImplFuzzTestCase& input) {
  try {
    TestUtility::validate(input);
    codecFuzz(input, HttpVersion::Http1);
    codecFuzz(input, HttpVersion::Http2);
  } catch (const EnvoyException& e) {
    ENVOY_LOG_MISC(debug, "EnvoyException: {}", e.what());
  }
}
// Copyright 2017 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.

#include "ash/system/palette/palette_tray.h"

#include "ash/ash_switches.h"
#include "ash/public/cpp/ash_pref_names.h"
#include "ash/public/cpp/config.h"
#include "ash/shell.h"
#include "ash/shell_test_api.h"
#include "ash/system/palette/test_palette_delegate.h"
#include "ash/system/status_area_widget.h"
#include "ash/system/status_area_widget_test_helper.h"
#include "ash/test/ash_test_base.h"
#include "ash/test/ash_test_helper.h"
#include "ash/test_shell_delegate.h"
#include "base/command_line.h"
#include "base/memory/ptr_util.h"
#include "components/prefs/testing_pref_service.h"
#include "ui/events/event.h"
#include "ui/events/test/event_generator.h"

namespace ash {

class PaletteTrayTest : public AshTestBase {
 public:
  PaletteTrayTest() {}
  ~PaletteTrayTest() override {}

  void SetUp() override {
    base::CommandLine::ForCurrentProcess()->AppendSwitch(
        switches::kAshForceEnableStylusTools);
    base::CommandLine::ForCurrentProcess()->AppendSwitch(
        switches::kAshEnablePaletteOnAllDisplays);

    AshTestBase::SetUp();

    Shell::RegisterLocalStatePrefs(pref_service_.registry());
    ash_test_helper()->test_shell_delegate()->set_local_state_pref_service(
        &pref_service_);

    palette_tray_ =
        StatusAreaWidgetTestHelper::GetStatusAreaWidget()->palette_tray();
    test_api_ = base::MakeUnique<PaletteTray::TestApi>(palette_tray_);

    // Set the test palette delegate here, since this requires an instance of
    // shell to be available.
    ShellTestApi().SetPaletteDelegate(base::MakeUnique<TestPaletteDelegate>());

    // Initialize the palette tray again since this test requires information
    // from the palette delegate. (It was initialized without the delegate in
    // AshTestBase::SetUp()).
    palette_tray_->Initialize();
  }

  // Adds the command line flag which states this device has an internal
  // stylus.
  void InitForInternalStylus() {
    base::CommandLine::ForCurrentProcess()->AppendSwitch(
        switches::kHasInternalStylus);
    // Initialize the palette tray again so the changes from adding this switch
    // are applied.
    palette_tray_->Initialize();
  }

  // Performs a tap on the palette tray button.
  void PerformTap() {
    ui::GestureEvent tap(0, 0, 0, base::TimeTicks(),
                         ui::GestureEventDetails(ui::ET_GESTURE_TAP));
    palette_tray_->PerformAction(tap);
  }

 protected:
  PaletteTray* palette_tray_ = nullptr;  // not owned
  TestingPrefServiceSimple pref_service_;
  std::unique_ptr<PaletteTray::TestApi> test_api_;

 private:
  DISALLOW_COPY_AND_ASSIGN(PaletteTrayTest);
};

// Verify the palette tray button exists but is not visible initially.
TEST_F(PaletteTrayTest, PaletteTrayIsInvisible) {
  ASSERT_TRUE(palette_tray_);
  EXPECT_FALSE(palette_tray_->visible());
}

// Verify that if the has seen stylus pref is not set initially, the palette
// tray's touch event watcher should be active.
TEST_F(PaletteTrayTest, PaletteTrayStylusWatcherAlive) {
  // TODO(crbug.com/751191): Remove the check for Mash.
  if (Shell::GetAshConfig() == Config::MASH)
    return;

  ASSERT_FALSE(palette_tray_->visible());
  EXPECT_TRUE(test_api_->IsStylusWatcherActive());
}

// Verify if the has seen stylus pref is not set initially, the palette tray
// should become visible after seeing a stylus event.
TEST_F(PaletteTrayTest, PaletteTrayVisibleAfterStylusSeen) {
  // TODO(crbug.com/751191): Remove the check for Mash.
  if (Shell::GetAshConfig() == Config::MASH)
    return;

  ASSERT_FALSE(palette_tray_->visible());
  ASSERT_FALSE(pref_service_.GetBoolean(prefs::kHasSeenStylus));
  ASSERT_TRUE(test_api_->IsStylusWatcherActive());

  // Send a stylus event.
  GetEventGenerator().EnterPenPointerMode();
  GetEventGenerator().PressTouch();
  GetEventGenerator().ReleaseTouch();
  GetEventGenerator().ExitPenPointerMode();

  // Verify that the palette tray is now visible, the stylus event watcher is
  // inactive and that the has seen stylus pref is now set to true.
  EXPECT_TRUE(palette_tray_->visible());
  EXPECT_FALSE(test_api_->IsStylusWatcherActive());
}

// Verify if the has seen stylus pref is initially set, the palette tray is
// visible.
TEST_F(PaletteTrayTest, StylusSeenPrefInitiallySet) {
  // TODO(crbug.com/751191): Remove the check for Mash.
  if (Shell::GetAshConfig() == Config::MASH)
    return;

  ASSERT_FALSE(palette_tray_->visible());
  pref_service_.SetBoolean(prefs::kHasSeenStylus, true);
  EXPECT_TRUE(palette_tray_->visible());
  EXPECT_FALSE(test_api_->IsStylusWatcherActive());
}

// Verify the palette tray button exists and is visible if the device has an
// internal stylus.
TEST_F(PaletteTrayTest, PaletteTrayIsVisibleForInternalStylus) {
  // TODO(crbug.com/751191): Remove the check for Mash.
  if (Shell::GetAshConfig() == Config::MASH)
    return;

  InitForInternalStylus();
  ASSERT_TRUE(palette_tray_);
  EXPECT_TRUE(palette_tray_->visible());
}

// Verify taps on the palette tray button result in expected behaviour.
TEST_F(PaletteTrayTest, PaletteTrayWorkflow) {
  // Verify the palette tray button is not active, and the palette tray bubble
  // is not shown initially.
  EXPECT_FALSE(palette_tray_->is_active());
  EXPECT_FALSE(test_api_->GetTrayBubbleWrapper());

  // Verify that by tapping the palette tray button, the button will become
  // active and the palette tray bubble will be open.
  PerformTap();
  EXPECT_TRUE(palette_tray_->is_active());
  EXPECT_TRUE(test_api_->GetTrayBubbleWrapper());

  // Verify that activating a mode tool will close the palette tray bubble, but
  // leave the palette tray button active.
  test_api_->GetPaletteToolManager()->ActivateTool(
      PaletteToolId::LASER_POINTER);
  EXPECT_TRUE(test_api_->GetPaletteToolManager()->IsToolActive(
      PaletteToolId::LASER_POINTER));
  EXPECT_TRUE(palette_tray_->is_active());
  EXPECT_FALSE(test_api_->GetTrayBubbleWrapper());

  // Verify that tapping the palette tray while a tool is active will
  // deactivate the tool, and the palette tray button will not be active.
  PerformTap();
  EXPECT_FALSE(palette_tray_->is_active());
  EXPECT_FALSE(test_api_->GetPaletteToolManager()->IsToolActive(
      PaletteToolId::LASER_POINTER));

  // Verify that activating an action tool will close the palette tray bubble
  // and the palette tray button will not be active.
  PerformTap();
  ASSERT_TRUE(test_api_->GetTrayBubbleWrapper());
  test_api_->GetPaletteToolManager()->ActivateTool(
      PaletteToolId::CAPTURE_SCREEN);
  EXPECT_FALSE(test_api_->GetPaletteToolManager()->IsToolActive(
      PaletteToolId::CAPTURE_SCREEN));
  // Wait for the tray bubble widget to close.
  RunAllPendingInMessageLoop();
  EXPECT_FALSE(test_api_->GetTrayBubbleWrapper());
  EXPECT_FALSE(palette_tray_->is_active());
}

}  // namespace ash
What is the deal? If you're over 35, are you as good as dead? Do you really, absolutely have to pick a genre? And is the spec market really dead?

There are some rumors that seem to have become truths out there, while other realities have been written about again and again. Yes, you can make it if you're over 35. And for the launch of your career, you are indeed better off picking a genre and sticking to it. As for the spec market, look no further than The Scoggins Report to confirm that it is alive and well.

While the noise rages on about these topics, there is a slew of others that are rarely addressed. Here are just a few of the truths every aspiring screenwriter should know, even if nobody tells them:

Your first script is not going to get you there

It doesn't matter how good you think it is, how much talent you have or how much time and effort you've invested in the work; your first screenplay will rarely be good enough to get you the attention you deserve. Most industry types look at screenplays 1 to 3 from new writers as practice. It's scripts 4, 5 and 6 that will put your honed skills on display.

If you're taking a new script out into the industry and are excited to share it, don't tell the industry executives you come in contact with that it's your first script right off the bat; unless there's a million-dollar idea buried deep inside of it, the knowledge that it's the first script from a new scribe might stop them from giving it serious consideration.

Agencies are always looking for worthwhile writers

While agents from agencies big and small are known to spout such declarations as "You can't get an agent!", the truth is that they are looking. Every last one of them. Agents get to keep their jobs if they have material they can sell. Finding new, fresh, exciting material to sell, and buzzed-about writers for whom to procure work, is bread and butter in that world.
What they do mean with that big, broad, disappointing statement is this: you can't just walk up to an agent without any previous successes and get them to sign you. That is just not going to happen. The agency world works a bit like the mafia: they have to have someone vouch for you before they take you seriously. The good news? "Someone" is a rather broad term: a contest win, a colleague, a pitch event. They just want to know that by looking at your work, their time is not being wasted.

No one thing will make it happen

Everyone thinks: Oh, if only I won that one scriptwriting contest! If I could just get my script to that one manager! If only I could get into that event, or get a high score on The Black List's listing service…

The reality is that it takes a combination of things to make it happen, so you have to fire on all fronts, and keep doing it. Nobody knows what the road leading to screenwriting success will look like for them, especially since screenwriting success differs for everyone. Trust me, I've talked to hundreds of writers who have "made it," and their road did not look like anyone else's.

Because of this, it's important that you stay active on all fronts. Say you did win that one contest… What if you don't have a stellar follow-up? Or you did get some attention through The Black List's listing service, but didn't have a short pitch ready to help you feel confident when you talk about your projects? In most cases, screenwriting success happens when a number of elements come together, when a series of wins unfolds in your favor. And it's up to you to aggressively pursue them, and to be wholly prepared when success, or the potential for success, does show up.

We love the script but…

If you get a rejection letter or email telling you that the production or representation company you approached loved the work but just felt it wasn't right for them, DON'T take it at face value.
They don't know you from Adam, so they don't know that you won't become the next Chris Terrio someday, and therefore they don't want to offend you, just in case. If they loved the script like they said they did, they would have wanted to meet you, or passed the work on to someone else. Good work rises to the top, and everyone wants to be associated with good material. By being nice to you and staying on your good side, the company responsible for the letter is keeping you from recognizing that the screenplay doesn't work, and from either engaging in some investigative work to uncover where and why it failed, or moving on to the next one.

Getting representation is where the hard work begins

You know the saying, it's harder to keep an A than it is to get an A? Nowhere in the industry is this truer than in the representation game. Many a screenwriter thinks that once they've secured representation they can rest on their laurels and wait to see what comes next. Instead, it's just the opposite: once you've gotten an agent's or a manager's attention, you'll have to work hard every day to keep them interested.

Representatives want to see that you are a stellar, consistent source of new work, and that the work you turn in is ready for the marketplace. So accelerate your writing schedule, and make sure that any material you turn in to them has been seen and signed off on by somebody else, be they a writer from your writing group or an industry analyst. As a repped writer, you are competing not only with your rep's other clients for attention, but also with aspiring writers who are eager to take your place.

No matter the genre you write in, the city you live in, or the length of time you've been trying to make a go of it, make sure that your efforts are thought out and driven by deliberate strategy.

Lee is available for questions and remarks in the Comments section below. To register for Lee's webinar, The Do's and Don'ts of Landing a Manager or Agent, please click here.