The merging queue is a useful construct for slow consumers. It allows a bounded queue to keep receiving updates, with the space requirement limited to the number of distinct keys. It also allows the consumer to skip old data. This is particularly of interest for systems dealing with fast moving data, where old data is not relevant and will just slow you down. I've seen this requirement in many pricing systems in the past few years, but there are other variations. Here's the interface:
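The interface itself was published as an embedded snippet that has not survived in this copy of the post. What follows is a minimal sketch consistent with the surrounding text; the MergingQueue name and the exact method shapes are my assumptions, not the original code:

/**
 * Sketch reconstructed from the post's description; names are assumed.
 * A queue that merges updates by key: offering a value for a key that is
 * already queued replaces the pending value in place, keeping the key's
 * original position in FIFO order. Space is bounded by the number of keys.
 */
public interface MergingQueue<K, V> {
    /** Queue the value, or merge it into the pending entry for this key. */
    void offer(K key, V value);

    /** Remove and return the oldest pending value, or null if empty. */
    V poll();

    boolean isEmpty();
}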
Now it is true that LinkedHashMap offers similar functionality, and you could use it to implement a merging queue as follows:
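The original snippet is likewise missing from this copy. Here's a sketch of the LinkedHashMap-based attempt, reconstructed around the expression quoted in the critique below; the class and field names are my guesses:

import java.util.LinkedHashMap;
import java.util.Map;

// Sketch reconstructed from the post's description; names are assumed.
public class LinkedHashMapMergingQueue<K, V> implements MergingQueue<K, V> {
    private final Map<K, V> lastValMap = new LinkedHashMap<K, V>();

    @Override
    public void offer(K key, V value) {
        // put() replaces the pending value; LinkedHashMap keeps the key's
        // original insertion position, so FIFO order is preserved.
        lastValMap.put(key, value);
    }

    @Override
    public V poll() {
        if (lastValMap.isEmpty()) {
            return null;
        }
        // The clumsy bit: popping the oldest entry means creating a key-set
        // view, an iterator, and doing a second hash lookup for the removal.
        return lastValMap.remove(lastValMap.keySet().iterator().next());
    }

    @Override
    public boolean isEmpty() {
        return lastValMap.isEmpty();
    }
}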
This works, but the way we have to implement poll() is clumsy. What I mean by that is that it looks like we are asking for a lot more than we want in order to work around some missing functionality. If you dig into the machinery behind the expression "lastValMap.remove(lastValMap.keySet().iterator().next())" there's an awful lot of intermediate structure we need to jump through before we get where we are going. LinkedHashMap is simply not geared toward being a queue; we are abusing it to get what we want.

ArrayDeque is one of the unsung heroes of the Java collections. If you ever need a non-concurrent queue or stack, look no further than this class. In its guts you'll find the familiar ring buffer. It doesn't allocate or copy anything when you pop elements out or put them in (unless you exceed the capacity). It's cache friendly (unlike a linked list). It's LOVELY! Here's a merging queue implementation using a HashMap and an ArrayDeque combined (see the sketch at the end of this post).

You can replace the HashMap with an open-address implementation to get more cache-friendly behaviour for key collisions if you like, but in the name of KISS we won't go down that particular rabbit hole. As the comment in the sketch notes, setting entries to null rather than removing them is an optimization with a trade-off. If your key set is not of a finite, manageable range then this is perhaps not the way to go; as it stands it saves you some GC overhead. This optimization is not really open to you with LinkedHashMap, where the values and their order are managed as one.

ArrayDeque is a better performer than any other queue for all the reasons discussed in this StackOverflow discussion, which boil down to:
- It is backed by a ring buffer (yes, like the Disruptor! you clever monkeys) with a power-of-2 sized backing array, which allows it to replace modulo (%) with a bitwise AND (&). This works because x % some-power-of-2 is the same as x & (some-power-of-2 - 1).
- Adding and removing elements is all about moving the head/tail counters: no copies, no garbage (until you hit capacity).
- Iterating through an array involves no pointer chasing, unlike a linked list.
I like the way you walk, I like the way you talk, Susie Q!

Here are the benchmarks. I'm using a micro-benchmarking framework which is both awesome and secret, so sadly the benchmark code is not entirely peer-reviewable. I will put the benchmark on GitHub when the framework makers give me the go-ahead, which should be soon enough. The results (averaged over multiple runs) are as follows:

Experiment                               Throughput (ops/msec)   Cost (ns/op)
array.measureOffer                       100881.077                 10
array.measureOffer1Poll1                  41679.299                 24
array.measureOffer2Poll1                  30217.424                 33
array.measureOffer2Poll2                  21365.283                 47
array.measureOffer1000PollUntilEmpty        102.232               9804
linked.measureOffer                      103403.692                 10
linked.measureOffer1Poll1                 24970.200                 40
linked.measureOffer2Poll1                 16228.638                 62
linked.measureOffer2Poll2                 12874.235                 78
linked.measureOffer1000PollUntilEmpty        92.328              10830

Interpretation: the offer cost for both implementations is quite similar at 10ns, with the linked implementation perhaps marginally faster. The poll cost is roughly 14ns for the array deque based implementation and 30ns for the linked implementation. Further profiling has also shown that while the deque implementation generates no garbage, the linked implementation has some garbage overhead. For my idea of a real-world load (the offer-1000-then-drain case) the array deque is 10% faster.

Depending on the ratio between offers and polls, the above implementation can be quite attractive. Consider, for instance, that queue/buffer buildup tends to mean the queue is either empty or quite full when a burst of traffic comes in. When you are dealing with relatively little traffic the cost of polling is more significant; when you are merging a large buildup of updates into your queue, the offer cost is more important. Luckily this is not a difficult choice, as the array deque implementation is only marginally slower for offering and much faster for polling.

Finally, a small real-world gem I hit while writing this post. When benchmarking the 1k-offer/queue-drain case for the linked implementation I hit this JVM bug - "command line length affects performance". The way it manifested was bad performance (~50 ops/ms) when running with one set of parameters, and much better performance when using some extra parameters to profile GC, which I'd have expected to slow things down if anything. It had me banging my head against the wall for a bit; I wrote a second benchmark to validate what I considered the representative performance, and so on. Eventually I talked to Mr. Shipilev, who pointed me at this ticket. I was not suffering the same issue with the other benchmarks, or with the same benchmark for the other implementation, which goes to show what a slippery sucker this is. The life lesson from this is to only trust what you measure. I can discard the benchmark result if I like, but if you change your command line arguments in a production environment and hit a kink like that, you will have a real problem.

Many thanks to Doug Lawrie, with whom I had a discussion about his implementation of a merging event queue (a merging queue stuck on the end of a Disruptor) which drove me to write this post.

Update 08/03/2013: Just realized I forgot to include a link to the code. Here it is.
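The combined HashMap + ArrayDeque implementation promised above was also an embedded snippet that is missing here. Below is a hedged reconstruction from the text's description: keys are queued in an ArrayDeque, values live in a HashMap, and polled entries are nulled out rather than removed. It assumes values are never null, and the names are mine, not the author's:

import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;

// Sketch reconstructed from the post's description; names are assumed.
// Values must be non-null: a null map value marks "not currently queued".
public class ArrayDequeMergingQueue<K, V> implements MergingQueue<K, V> {
    private final Map<K, V> lastValMap = new HashMap<K, V>();
    private final ArrayDeque<K> keys = new ArrayDeque<K>();

    @Override
    public void offer(K key, V value) {
        // A null previous value means the key is not currently queued,
        // either because it's new or because it was polled and nulled out.
        if (lastValMap.put(key, value) == null) {
            keys.offer(key);
        }
    }

    @Override
    public V poll() {
        K key = keys.poll();
        if (key == null) {
            return null; // queue is empty
        }
        // Null the entry out instead of removing it: the map entry survives
        // to be reused by the next offer for this key, saving allocation and
        // GC. The trade-off: one entry per key ever seen stays in the map,
        // so this only suits a finite, manageable key range.
        return lastValMap.put(key, null);
    }

    @Override
    public boolean isEmpty() {
        return keys.isEmpty();
    }
}

Usage is plain queue style: offer(key, update) on the producer side and poll() on the consumer side; repeated offers for an already-queued key overwrite the pending update in place without growing the queue.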
Graphs associated with triangulations of lattice polygons Abstract Two graphs, the edge crossing graph E and the triangle graph T, are associated with a simple lattice polygon. The maximal independent sets of vertices of E and T are derived, as is a formula for the number of fundamental triangles. Further properties of E and T are derived, including a formula for the size of the maximal independent sets in E and T. It is shown that T is a factor graph of edge-disjoint 4-cycles, which gives corresponding geometric information, and is a partition graph as recently defined by the authors and F. Harary.
export interface BarMapItem { offset: string; scroll: string; scrollSize: string; size: string; key: string; axis: string; client: string; direction: string; } export interface BarMap { vertical: BarMapItem; horizontal: BarMapItem; } export interface ScrollbarType { wrap: ElRef; }
def need_immediate_os_return(self):
    if self.are_listing:
        return os_ret_val_good
    elif not self.failfast:
        return os_ret_val_good
    else:
        return self.os_return_code()
def fix_coordinates(user_input: dict) -> dict:
    # Pad coordinates that have fewer than 7 decimal places (guarding against
    # values with no fractional part at all, which would otherwise crash here).
    for coordinate in (CONF_LAT_NE, CONF_LAT_SW, CONF_LON_NE, CONF_LON_SW):
        parts = str(user_input[coordinate]).split(".")
        if len(parts) < 2 or len(parts[1]) < 7:
            user_input[coordinate] = user_input[coordinate] + 0.0000001
    # Ensure the north-east corner is north and east of the south-west corner.
    if user_input[CONF_LAT_NE] < user_input[CONF_LAT_SW]:
        user_input[CONF_LAT_NE], user_input[CONF_LAT_SW] = (
            user_input[CONF_LAT_SW],
            user_input[CONF_LAT_NE],
        )
    if user_input[CONF_LON_NE] < user_input[CONF_LON_SW]:
        user_input[CONF_LON_NE], user_input[CONF_LON_SW] = (
            user_input[CONF_LON_SW],
            user_input[CONF_LON_NE],
        )
    return user_input
/** * Test Managers to add and remove local message listener. */ public class GridManagerLocalMessageListenerSelfTest extends GridCommonAbstractTest { /** */ private static final short DIRECT_TYPE = 210; static { IgniteMessageFactoryImpl.registerCustom(DIRECT_TYPE, GridIoUserMessage::new); } /** {@inheritDoc} */ @Override protected IgniteConfiguration getConfiguration(String igniteInstanceName) throws Exception { IgniteConfiguration c = super.getConfiguration(igniteInstanceName); TcpCommunicationSpi commSpi = new TcpCommunicationSpi(); c.setCommunicationSpi(commSpi); return c; } /** {@inheritDoc} */ @Override protected void afterTest() throws Exception { stopAllGrids(); } /** * @throws Exception If failed. */ @Test public void testSendMessage() throws Exception { startGridsMultiThreaded(2); IgniteSpiContext ctx0 = ((IgniteSpiAdapter)grid(0).context().io().getSpi()).getSpiContext(); IgniteSpiContext ctx1 = ((IgniteSpiAdapter)grid(1).context().io().getSpi()).getSpiContext(); String topic = "test-topic"; final CountDownLatch latch = new CountDownLatch(1); ctx1.addLocalMessageListener(topic, new IgniteBiPredicate<UUID, Object>() { @Override public boolean apply(UUID nodeId, Object msg) { assertEquals("Message", msg); latch.countDown(); return true; } }); long time = System.nanoTime(); ctx0.send(grid(1).localNode(), "Message", topic); assert latch.await(3, SECONDS); time = System.nanoTime() - time; info(">>>"); info(">>> send() time (ms): " + MILLISECONDS.convert(time, NANOSECONDS)); info(">>>"); } /** * @throws Exception If failed. */ @Test public void testAddLocalMessageListener() throws Exception { startGrid(); Manager mgr = new Manager(grid().context(), new Spi()); mgr.start(); mgr.onKernalStart(true); assertTrue(mgr.enabled()); } /** * @throws Exception If failed. */ @Test public void testRemoveLocalMessageListener() throws Exception { startGrid(); Manager mgr = new Manager(grid().context(), new Spi()); assertTrue(mgr.enabled()); mgr.onKernalStart(true); mgr.onKernalStop(false); mgr.stop(false); assertTrue(mgr.enabled()); } /** */ private static class Manager extends GridManagerAdapter<IgniteSpi> { /** * @param ctx Kernal context. * @param spis Specific SPI instance. */ protected Manager(GridKernalContext ctx, IgniteSpi... spis) { super(ctx, spis); } /** {@inheritDoc} */ @Override public void start() throws IgniteCheckedException { // No-op. } /** {@inheritDoc} */ @Override public void stop(boolean cancel) throws IgniteCheckedException { // No-op. } } /** * Test Spi. */ private static interface TestSpi extends IgniteSpi { // No-op. } /** * Spi */ private static class Spi extends IgniteSpiAdapter implements TestSpi { /** Ignite Spi Context. **/ private IgniteSpiContext spiCtx; /** Test message topic. **/ private static final String TEST_TOPIC = "test_topic"; /** {@inheritDoc} */ @Override public void spiStart(@Nullable String igniteInstanceName) throws IgniteSpiException { // No-op. } /** {@inheritDoc} */ @Override public void spiStop() throws IgniteSpiException { // No-op. 
} /** {@inheritDoc} */ @Override public void onContextInitialized0(IgniteSpiContext spiCtx) throws IgniteSpiException { this.spiCtx = spiCtx; spiCtx.addLocalMessageListener(TEST_TOPIC, new IgniteBiPredicate<UUID, Object>() { @Override public boolean apply(UUID uuid, Object o) { return true; } }); } /** {@inheritDoc} */ @Override public void onContextDestroyed0() { spiCtx.removeLocalMessageListener(TEST_TOPIC, new IgniteBiPredicate<UUID, Object>() { @Override public boolean apply(UUID uuid, Object o) { return true; } }); } } }
package com.intellij.spring.facet; import com.intellij.openapi.editor.event.DocumentAdapter; import com.intellij.openapi.editor.event.DocumentEvent; import com.intellij.openapi.fileChooser.FileChooser; import com.intellij.openapi.fileChooser.FileChooserDescriptor; import com.intellij.openapi.module.Module; import com.intellij.openapi.project.Project; import com.intellij.openapi.ui.DialogWrapper; import com.intellij.openapi.vfs.VirtualFile; import com.intellij.openapi.vfs.pointers.VirtualFilePointer; import com.intellij.psi.PsiFile; import com.intellij.psi.PsiManager; import com.intellij.spring.SpringBundle; import com.intellij.ui.CheckedTreeNode; import com.intellij.ui.EditorTextField; import com.intellij.util.containers.MultiMap; import com.intellij.util.ui.tree.TreeModelAdapter; import com.intellij.util.ui.tree.TreeUtil; import org.jetbrains.annotations.NonNls; import javax.annotation.Nonnull; import javax.annotation.Nullable; import javax.swing.*; import javax.swing.event.TreeModelEvent; import javax.swing.tree.DefaultTreeModel; import java.awt.*; import java.awt.event.ActionEvent; import java.awt.event.ItemEvent; import java.awt.event.ItemListener; import java.util.*; import java.util.List; /** * @author <NAME> */ public class FileSetEditor extends DialogWrapper { public static final DefaultListCellRenderer FILESET_RENDERER = new DefaultListCellRenderer() { public Component getListCellRendererComponent(final JList list, Object value, final int index, final boolean isSelected, final boolean cellHasFocus) { if (value == null) { value = SpringBundle.message("fileset.none"); } else if (((SpringFileSet) value).isNew()) { value = SpringBundle.message("fileset.new"); } return super.getListCellRendererComponent(list, value, index, isSelected, cellHasFocus); } }; private JPanel myMainPanel; private EditorTextField mySetName; private SpringFilesTree myFilesTree; private JComboBox myParentBox; private final SpringFileSet myFileSet; private final CheckedTreeNode myRoot = new CheckedTreeNode(null); private final SpringFileSet myOriginalSet; public FileSetEditor(final @Nonnull Module module, final SpringFileSet fileSet, final Set<SpringFileSet> allSets) { super(module.getProject(), true); myOriginalSet = fileSet; myFileSet = new SpringFileSet(fileSet); init(fileSet, allSets, new SpringConfigsSearcher(module), module.getProject()); } protected FileSetEditor(final Component parent, final SpringFileSet fileSet, final Collection<SpringFileSet> allSets, final SpringConfigsSearcher searcher, final Project project) { super(parent, true); myOriginalSet = fileSet; myFileSet = new SpringFileSet(fileSet); init(fileSet, allSets, searcher, project); } private void init(final SpringFileSet fileSet, final Collection<SpringFileSet> allSets, final SpringConfigsSearcher searcher, @Nullable final Project project) { setTitle(SpringBundle.message("config.fileset.editor.title")); myFilesTree.setModel(new DefaultTreeModel(myRoot)); searcher.search(); final MultiMap<Module, PsiFile> files = searcher.getFilesByModules(); final MultiMap<VirtualFile, PsiFile> jars = searcher.getJars(); final Set<PsiFile> psiFiles = myFilesTree.buildModuleNodes(files, jars, fileSet); final List<VirtualFile> virtualFiles = searcher.getVirtualFiles(); for (VirtualFile virtualFile : virtualFiles) { myFilesTree.addFile(virtualFile); } if (project != null) { final PsiManager psiManager = PsiManager.getInstance(project); final Collection<VirtualFilePointer> list = fileSet.getFiles(); for (VirtualFilePointer pointer : list) { final
VirtualFile file = pointer.getFile(); if (file != null) { final PsiFile psiFile = psiManager.findFile(file); if (psiFile != null && psiFiles.contains(psiFile)) { continue; } myFilesTree.addFile(file); } } } TreeUtil.expandAll(myFilesTree); myFilesTree.getModel().addTreeModelListener(new TreeModelAdapter() { public void treeNodesChanged(final TreeModelEvent e) { updateFileSet(); } }); mySetName.setText(fileSet.getName()); mySetName.addDocumentListener(new DocumentAdapter() { public void documentChanged(final DocumentEvent e) { updateFileSet(); } }); for (SpringFileSet set : allSets) { if (set.getId().equals(fileSet.getId())) { continue; } myParentBox.addItem(set); } myParentBox.addItem(null); myParentBox.setSelectedItem(myFileSet.getDependencies().size() > 0 ? myFileSet.getDependencies().get(0) : null); myParentBox.addItemListener(new ItemListener() { public void itemStateChanged(final ItemEvent e) { updateFileSet(); } }); myParentBox.setRenderer(FILESET_RENDERER); init(); getOKAction().setEnabled(fileSet.isNew()); } @Nullable protected JComponent createCenterPanel() { return myMainPanel; } @NonNls protected String getDimensionServiceKey() { return "spring file set editor"; } public boolean isOKActionEnabled() { if (myOriginalSet.isNew()) { return true; } if (myFileSet.getFiles().size() != myOriginalSet.getFiles().size()) { return true; } final List<VirtualFilePointer> pointers = new ArrayList<>(myFileSet.getFiles()); final List<VirtualFilePointer> originalPointers = new ArrayList<>(myOriginalSet.getFiles()); for (int i = 0; i < pointers.size(); i++) { if (!pointers.get(i).getUrl().equals(originalPointers.get(i).getUrl())) { return true; } } final boolean b = myFileSet.getDependencies().equals(myOriginalSet.getDependencies()); return !myFileSet.getName().equals(myOriginalSet.getName()) || !b; } protected void doOKAction() { updateFileSet(); super.doOKAction(); } private void updateFileSet() { myFileSet.setName(mySetName.getText()); myFilesTree.updateFileSet(myFileSet); SpringFileSet parent = (SpringFileSet) myParentBox.getSelectedItem(); myFileSet.setDependencies(parent == null ? Collections.<String>emptyList() : Arrays.asList(parent.getId())); getOKAction().setEnabled(isOKActionEnabled()); } protected Action[] createActions() { return new Action[]{getOKAction(), getCancelAction()}; } protected Action[] createLeftSideActions() { final AbstractAction locateAction = new AbstractAction(SpringBundle.message("config.locate.button")) { public void actionPerformed(final ActionEvent e) { final VirtualFile[] files = FileChooser.chooseFiles(new FileChooserDescriptor(true, false, true, false, true, true), myMainPanel, null, null); if (files.length > 0) { for (VirtualFile file : files) { myFilesTree.addFile(file); } updateFileSet(); TreeUtil.expandAll(myFilesTree); } } }; return new Action[]{locateAction}; } public SpringFileSet getEditedFileSet() { return myFileSet; } }
def rowsInserted(self, index, start, end): Qt.QTableView.rowsInserted(self, index, start, end) for i in xrange(start, end + 1): self.resizeRowToContents(i) if start == 0: self.resizeColumnsToContents() if not self.scrollLock: self.scrollToBottom()
Study tracks illicit drug use through Europe’s sewage system The largest multi-city study using sewage to monitor drug usage across Europe has been published today in the scientific journal Addiction. Scientists from the University of Bath are part of the Europe-wide SCORE network (Sewage analysis CORE group) that analysed waste water from over 40 European cities during a one-week period over consecutive years (2011-13) to explore how the drug-taking habits of these populations have changed. Its conclusions are taken up in the European Drug Report 2014, launched by the European Monitoring Centre for Drugs & Drug Addiction (EMCDDA) this week, as well as in an online interactive analysis by the agency dedicated to the issue (Perspectives on drugs). From London to Nicosia and Stockholm to Lisbon, the study analysed daily waste water samples from waste water treatment plants over a one-week period in April 2012 and in March 2013. In 2012, the study involved 23 cities in 11 countries, while in 2013 it was broadened to 42 cities in 21 countries. Data from a 2011 study (19 cities, 11 countries) were used for comparison. The scientists used highly sensitive mass spectrometry techniques to look for tiny traces of biomarkers for cocaine, amphetamine, methamphetamine, ecstasy and cannabis in waste water from approximately 8 million people. The results provide a valuable snapshot of the drug flow through the cities involved, revealing marked regional variations in drug use patterns. Traces of cocaine, for example, were higher in western and some southern cities but lower in northern and eastern cities. Use of amphetamine, while relatively evenly distributed, showed the highest levels in the north and northwest of Europe. When weekly patterns of drug use were examined, cocaine and ecstasy levels rose sharply at weekends in most cities, while methamphetamine and cannabis use appeared to be more evenly distributed throughout the week. Methamphetamine use, generally low and traditionally concentrated in the Czech Republic and Slovakia, now appears to be present in the east of Germany and northern Europe. Lead investigator from the University of Bath, Dr Barbara Kasprzyk-Hordern, said: “Analysing sewage for estimating drug use has huge potential for monitoring the health of populations. Traditional epidemiological methods rely on surveys, which are time consuming, expensive and can be inaccurate due to self-reporting bias. “However waste water profiling is non-intrusive and can show changes in local populations in real time, with a large sample size, and can be used alongside existing epidemiology methods to give important information on drug use and markets across Europe. “This tool is exciting because it could also be used to identify the use of new dangerous ‘legal highs’ that have not yet been banned. “It also has potential to be used to monitor biomarkers for diseases such as cancer or trace the spread of flu epidemics in real time and could be a really powerful tool for improving public health.” Ort C, van Nuijs ALN, Berset J-D, et al. ‘Spatial differences and temporal changes in illicit drug use in Europe quantified by wastewater analysis’, is published in Addiction 109: doi: 10.1111/add.12570 www.addictionjournal.org
import nmag import numpy as np from nmag import SI, at Ms = 0.86e6 K1 = 520e3 a = (1, 0, 0) x1 = y1 = z1 = 20 # same as in bar.geo file def m_gen(r): x = np.maximum(np.minimum(r[0] / x1, 1.0), 0.0) # x, y and z as a fraction y = np.maximum(np.minimum(r[1] / y1, 1.0), 0.0) # between 0 and 1 in the z = np.maximum(np.minimum(r[2] / z1, 1.0), 0.0) mx = (2 - y) * (2 * x - 1) / 4 mz = (2 - y) * (2 * z - 1) / 4 my = np.sqrt(1 - mx ** 2 - mz ** 2) return np.array([mx, my, mz]) def generate_anisotropy_data(anis, name='anis'): # Create the material mat_Py = nmag.MagMaterial(name="Py", Ms=SI(Ms, "A/m"), anisotropy=anis) # Create the simulation object sim = nmag.Simulation(name, do_demag=False) # Load the mesh sim.load_mesh("bar.nmesh.h5", [("Py", mat_Py)], unit_length=SI(1e-9, "m")) # Set the initial magnetisation sim.set_m(lambda r: m_gen(np.array(r) * 1e9)) #sim.advance_time(SI(1e-12, 's') ) # Save the exchange field and the magnetisation once at the beginning # of the simulation for comparison with finmag np.savetxt("H_%s_nmag.txt" % name, sim.get_subfield("H_anis_Py")) np.savetxt("m0_nmag.txt", sim.get_subfield("m_Py")) if __name__ == "__main__": # define uniaxial_anisotropy anis = nmag.uniaxial_anisotropy( axis=[1, 0, 0], K1=SI(520e3, "J/m^3"), K2=SI(230e3, "J/m^3")) generate_anisotropy_data(anis) cubic = nmag.cubic_anisotropy(axis1=[1, 0, 0], axis2=[0, 1, 0], K1=SI(520e3, "J/m^3"), K2=SI(230e3, "J/m^3"), K3=SI(123e3, "J/m^3")) generate_anisotropy_data(cubic, name='cubic_anis')
import Display from skimage.color import rgb2gray import numpy as np from scipy import misc import matplotlib.pyplot as plt import string import random import pickle import os, sys, shutil from sklearn import decomposition from skimage.feature import greycomatrix, greycoprops from skimage import img_as_ubyte, io def generate_random_id(length=8): '''Returns a random string of specified length''' identifier = "" for i in range(length): identifier = identifier + random.choice(string.ascii_letters + string.digits) return identifier def block_proc(A, blockSize, blockFunc): ''' Function to somewhat mimic behavior of MATLAB's blocproc function (See http://uk.mathworks.com/help/images/ref/blockproc.html). Creates a block (kernel), slides block across image and applies the specified function to each block. Args: A: 2D image array. blockSize: Tuple (width,height) specifying dimensions for each block. blockFunc: A lambda or function with the signature blockFunc(block). This should be implemented to specify operations to be performed on each block. ''' xStart = 0; xStop = A.shape[1] if(xStop % blockSize[0] != 0): xStop = int(xStop/blockSize[0]) * blockSize[0] yStart = 0; yStop = A.shape[0] if(yStop % blockSize[1] != 0): yStop = int(yStop/blockSize[1]) * blockSize[1] for x in range(xStart, xStop, blockSize[0]): for y in range(yStart, yStop, blockSize[1]): block = A[y:y+blockSize[1], x:x+blockSize[0]] blockFunc(block) def serialize(filename, obj): '''Save object to file''' f = open(filename, 'wb+') pickle.dump(obj, f) f.close() def unserialize(filename): '''Return object from file''' try: f = open(filename, 'rb') obj = pickle.load(f, encoding='latin1') f.close() return obj except: return None def get_textural_features(img, distance=1, isMultidirectional=False): '''Extract GLCM feature vector from image Args: img: input image. distance: Distance between pixels for co-occurence. isMultidirectional: Controls whether co-occurence should be calculated in other directions (ie 45 degrees, 90 degrees and 135 degrees). Returns: features: if isMultidirectional=False, this is a 4 element vector of [dissimilarity, correlation,homogeneity, energy]. If not it is a 16 element vector containing each of the above properties in each direction.
''' if(isMultidirectional): img = img_as_ubyte(rgb2gray(img)) glcm = greycomatrix(img, [distance], [0, 0.79, 1.57, 2.36], 256, symmetric=True, normed=True) dissimilarity_1 = greycoprops(glcm, 'dissimilarity')[0][0] dissimilarity_2 = greycoprops(glcm, 'dissimilarity')[0][1] dissimilarity_3 = greycoprops(glcm, 'dissimilarity')[0][2] dissimilarity_4 = greycoprops(glcm, 'dissimilarity')[0][3] correlation_1 = greycoprops(glcm, 'correlation')[0][0] correlation_2 = greycoprops(glcm, 'correlation')[0][1] correlation_3 = greycoprops(glcm, 'correlation')[0][2] correlation_4 = greycoprops(glcm, 'correlation')[0][3] homogeneity_1 = greycoprops(glcm, 'homogeneity')[0][0] homogeneity_2 = greycoprops(glcm, 'homogeneity')[0][1] homogeneity_3 = greycoprops(glcm, 'homogeneity')[0][2] homogeneity_4 = greycoprops(glcm, 'homogeneity')[0][3] energy_1 = greycoprops(glcm, 'energy')[0][0] energy_2 = greycoprops(glcm, 'energy')[0][1] energy_3 = greycoprops(glcm, 'energy')[0][2] energy_4 = greycoprops(glcm, 'energy')[0][3] feature = np.array([dissimilarity_1, dissimilarity_2, dissimilarity_3,\ dissimilarity_4, correlation_1, correlation_2, correlation_3, correlation_4,\ homogeneity_1, homogeneity_2, homogeneity_3, homogeneity_4, energy_1,\ energy_2, energy_3, energy_4]) return feature else: img = img_as_ubyte(rgb2gray(img)) glcm = greycomatrix(img, [distance], [0], 256, symmetric=True, normed=True) dissimilarity = greycoprops(glcm, 'dissimilarity')[0][0] correlation = greycoprops(glcm, 'correlation')[0][0] homogeneity = greycoprops(glcm, 'homogeneity')[0][0] energy = greycoprops(glcm, 'energy')[0][0] feature = np.array([dissimilarity, correlation, homogeneity, energy]) return feature def save_feature_dataset(ser_filename, featureRepresentation='image', glcm_distance=1, glcm_isMultidirectional=False): ''' Convenience method to extract features from images and serialize data. Args: ser_filename: name to store serialized dataset. featureRepresentation: Type of features to be used in classification. Can ake of one of the values 'image', 'pca' or 'glcm'. glcm_distance: Distance between pixels for co-occurence. Only used if featureRepresentation=glcm. isMultidirectional: Controls whether co-occurence should be calculated in other directions (ie 45 degrees, 90 degrees and 135 degrees). Only used if featureRepresentation=glcm. 
Return: dataset: Tuple containing (train_data, train_targets, test_data, test_targets) ''' # Load train data train_filenames = [] for filename in os.listdir("../train/positive"): if(filename != ".DS_Store"): train_filenames.append("../train/positive/" + filename) train_targets = [1]*(len(os.listdir("../train/positive"))-1) for filename in os.listdir("../train/negative"): if(filename != ".DS_Store"): train_filenames.append("../train/negative/" + filename) train_targets = train_targets + [0]*(len(os.listdir("../train/negative"))-1) n_train_samples = len(train_filenames) if(featureRepresentation == 'glcm'): if(glcm_isMultidirectional): sample_size = 16 else: sample_size = 4 else: sample_size = 20*20 train_data = np.zeros((n_train_samples, sample_size)) i = 0 for filename in train_filenames: img = io.imread(filename) if(featureRepresentation == 'image'): train_data[i] = img.flatten() elif(featureRepresentation == 'pca'): train_data[i] = decomposition.PCA(n_components=8).fit_transform(img.flatten()) elif(featureRepresentation == 'glcm'): train_data[i] = get_textural_features(img, glcm_distance, glcm_isMultidirectional) i = i + 1; # Load test data test_filenames = [] expected = [] for filename in os.listdir("../test"): if(filename != ".DS_Store"): test_filenames.append("../test/" + filename) expected.append(int(filename.split('_')[1].split('.')[0])) n_test_samples = len(test_filenames) test_data = np.zeros((n_test_samples, sample_size)) i = 0 for filename in test_filenames: img = io.imread(filename) if(featureRepresentation == 'image'): test_data[i] = img.flatten() elif(featureRepresentation == 'pca'): test_data[i] = decomposition.PCA(n_components=8).fit_transform(img.flatten()) elif(featureRepresentation == 'glcm'): test_data[i] = get_textural_features(img, glcm_distance, glcm_isMultidirectional) i = i + 1; dataset = (train_data, train_targets, test_data, expected) serialize(ser_filename, dataset) return dataset def extract_features_from_old_data(featureRepresentation='image', glcm_distance=1, glcm_isMultidirectional=False): ''' Convenience method to extract features from images in "train" and "test" folders and serialize data. Args: ser_filename: name to store serialized dataset. featureRepresentation: Type of features to be used in classification. Can take one of the values 'image', 'pca' or 'glcm'. glcm_distance: Distance between pixels for co-occurence. Only used if featureRepresentation=glcm. isMultidirectional: Controls whether co-occurence should be calculated in other directions (ie 45 degrees, 90 degrees and 135 degrees). Only used if featureRepresentation=glcm.
Return: dataset: Tuple containing (train_data, train_targets, test_data, test_targets) ''' # Load train data train_filenames = [] for filename in os.listdir("../train/positive"): if(filename != ".DS_Store"): train_filenames.append("../train/positive/" + filename) train_targets = [1]*(len(os.listdir("../train/positive"))-1) for filename in os.listdir("../train/negative"): if(filename != ".DS_Store"): train_filenames.append("../train/negative/" + filename) train_targets = train_targets + [0]*(len(os.listdir("../train/negative"))-1) n_train_samples = len(train_filenames) if(featureRepresentation == 'glcm'): if(glcm_isMultidirectional): sample_size = 16 else: sample_size = 4 else: sample_size = 20*20 train_data = np.zeros((n_train_samples, sample_size)) i = 0 for filename in train_filenames: img = io.imread(filename) if(featureRepresentation == 'image'): train_data[i] = img.flatten() elif(featureRepresentation == 'pca'): train_data[i] = decomposition.PCA(n_components=8).fit_transform(img.flatten()) elif(featureRepresentation == 'glcm'): train_data[i] = get_textural_features(img, glcm_distance, glcm_isMultidirectional) i = i + 1; # Load test data test_filenames = [] expected = [] for filename in os.listdir("../test"): if(filename != ".DS_Store"): test_filenames.append("../test/" + filename) expected.append(int(filename.split('_')[1].split('.')[0])) n_test_samples = len(test_filenames) test_data = np.zeros((n_test_samples, sample_size)) i = 0 for filename in test_filenames: img = io.imread(filename) if(featureRepresentation == 'image'): test_data[i] = img.flatten() elif(featureRepresentation == 'pca'): test_data[i] = decomposition.PCA(n_components=8).fit_transform(img.flatten()) elif(featureRepresentation == 'glcm'): test_data[i] = get_textural_features(img, glcm_distance, glcm_isMultidirectional) i = i + 1; dataset = (train_data, train_targets, test_data, expected) return dataset def extract_features_from_new_data(featureRepresentation='image', glcm_distance=1, glcm_isMultidirectional=False, train_size=0.75): ''' Convenience method to extract features from images in "grain_images" folder and serialize data. Args: ser_filename: name to store serialized dataset. featureRepresentation: Type of features to be used in classification. Can take one of the values 'image', 'pca' or 'glcm'. glcm_distance: Distance between pixels for co-occurence. Only used if featureRepresentation=glcm. isMultidirectional: Controls whether co-occurence should be calculated in other directions (ie 45 degrees, 90 degrees and 135 degrees). Only used if featureRepresentation=glcm. train_size: Fraction of dataset to be used for training. The remainder is used for testing.
Return: dataset: Tuple containing (train_data, train_targets, test_data, test_targets) ''' image_filenames = [] expected = [] targets = [] for filename in os.listdir("../grain_images"): if(filename != ".DS_Store"): image_filenames.append("../grain_images/" + filename) targets.append(int(filename.split('_')[1].split('.')[0])) if(featureRepresentation == 'glcm'): if(glcm_isMultidirectional): sample_size = 16 else: sample_size = 4 else: sample_size = 20*20 train_filenames = image_filenames[:int(train_size*len(image_filenames))] test_filenames = image_filenames[int(train_size*len(image_filenames)):] train_data = np.zeros((len(train_filenames), sample_size)) train_targets = targets[:int(train_size*len(targets))] test_data = np.zeros((len(test_filenames), sample_size)) test_targets = targets[int(train_size*len(targets)):] for i in range(0, len(train_filenames)): img = io.imread(train_filenames[i]) if(featureRepresentation == 'image'): train_data[i] = img.flatten() elif(featureRepresentation == 'pca'): train_data[i] = decomposition.PCA(n_components=8).fit_transform(img.flatten()) elif(featureRepresentation == 'glcm'): train_data[i] = get_textural_features(img, glcm_distance, glcm_isMultidirectional) for i in range(0, len(test_filenames)): img = io.imread(test_filenames[i]) if(featureRepresentation == 'image'): test_data[i] = img.flatten() elif(featureRepresentation == 'pca'): test_data[i] = decomposition.PCA(n_components=8).fit_transform(img.flatten()) elif(featureRepresentation == 'glcm'): test_data[i] = get_textural_features(img, glcm_distance, glcm_isMultidirectional) print("EXTRACTED FEATURES FROM IMAGES AND STORED AS DATASET\n1: {} | 0: {}".format(train_targets.count(1), train_targets.count(0))) return (train_data, train_targets, test_data, test_targets)
use crate::arg_parser::ParsedArgs; use crate::colors; use crate::command::Command; use crate::command_error::CommandError; use crate::command_info::*; use crate::commands::*; use crate::errors; use constants::{BINARY_NAME, VERSION}; pub struct HelpCommand {} impl Command for HelpCommand { fn info() -> CommandInfo { CommandInfo::new("help", "Prints help text") .args(vec![Arg::new("command", "Command to print help for")]) .with_help() } fn execute(args: &ParsedArgs) -> Result<(), CommandError> { match args.get_positional_arg(0) { Some(val) => match &val[..] { "check" => CheckCommand::print_help(), "help" => HelpCommand::print_help(), "version" => VersionCommand::print_help(), unknown => { errors::print_usage_error(format!( "Cannot retrieve help for unrecognized command '{}'.", unknown )); std::process::exit(1); } }, _ => { println!( "{binary_name_bold} - version {version} Compiler & tools for the Pluma language {usage_header} {binary_name} <command> [options] {commands_header}", binary_name_bold = format!("{}", colors::bold(BINARY_NAME)), binary_name = BINARY_NAME, version = VERSION, usage_header = colors::bold("Usage:"), commands_header = colors::bold("Commands:"), ); let cmd_info: Vec<CommandInfo> = vec![ CheckCommand::info(), VersionCommand::info(), HelpCommand::info(), ]; let mut max_cmd_length = 0; for info in &cmd_info { max_cmd_length = std::cmp::max(max_cmd_length, info.name.len()); } for info in cmd_info { println!( " {:width$} {}", info.name, info.description, width = max_cmd_length ); } println!( "\nFor help with an individual command, try: {binary_name} <command> -h", binary_name = BINARY_NAME, ) } } Ok(()) } }
/** * Handler: Message from main thread received in worker thread */ static int thread_inproc_rcv(zloop_t *loop, zsock_t *reader, void *thread_ctx_void) { struct worker_thread_ctx *thread_ctx = thread_ctx_void; assert(thread_ctx); int retval; osd_result rv; zmsg_t *msg = zmsg_recv(reader); if (!msg) { return -1; } zframe_t *type_frame = zmsg_first(msg); assert(type_frame); char *type_str = zframe_strdup(type_frame); assert(type_str); if (!strcmp(type_str, "I-SHUTDOWN")) { retval = -1; zmsg_destroy(&msg); goto free_return; } else { if (!thread_ctx->cmd_handler_fn) { err(thread_ctx->log_ctx, "No handler for inproc message set."); zmsg_destroy(&msg); } rv = thread_ctx->cmd_handler_fn(thread_ctx, type_str, msg); if (OSD_FAILED(rv)) { err(thread_ctx->log_ctx, "Handler for inproc message failed."); } } retval = 0; free_return: free(type_str); return retval; }
Hereditary breast cancer: review and current approach Hereditary breast cancer is a complex and important condition, representing about 10% of all breast cancer cases. Identifying high-risk patients and possible carriers of pathogenic genetic variants with an indication for genetic testing is an essential step in caring for these patients and their families. The existence of a mutation can influence both surgical and adjuvant treatment, offering the possibility of better outcomes and preventive measures. In Brazil, access to oncogeneticists and genetic counseling is limited. Mastologists and their teams must be trained to identify and manage these patients, with the objective of offering adequate preventive care and early diagnosis. In the present study, a literature review of the aspects, diagnosis, and implications of hereditary breast cancer, in patients with and without breast cancer, was performed, aiming to assist mastologists in providing proper management, considering both general and Brazilian characteristics.
Exceptions test the rule. Ron Paul is an exception. We might have to revise some rules. One rule in politics is: Don’t obsess about arcana unfamiliar to voters; stick to issues they care about. Ron Paul has long flouted that rule. That’s one reason why mainstream journalists often dismiss him as a “nut.” He keeps talking about issues outside the scope of what journalists say we should care about. This makes his causes appear — to insiders and reporters and the like — ridiculous. Specifically, he keeps talking about monetary theory in general and the Federal Reserve in particular. Now, there are several good reasons why journalists don’t talk about monetary theory. Milton Friedman memorably declared (it’s memorable because I remember it) that monetary theory was the most difficult area of economics. And if Friedman saw it as difficult, you can be sure your standard journalism major is going to find it harder than getting a quotation out of J.D. Salinger’s dog Phoney. So when Ron Paul moves from sexier issues like war, drugs, and taxes and on to the Federal Reserve and the gold standard, it’s hard not to shake one’s head. I speak for myself, here. When I helped Ron Paul in his first campaign for the presidency, way back in the late 1980s, as the Libertarian Party nominee, I often shook my head. Could Dr. Paul skip the Fed talk, just once? Hey, I knew something about how to get ahead in politics. And advancing unpopular, obscure ideas at variance with the folks at Harvard and Yale and a long retinue of economic advisors wasn’t the way to win national elections. Now, it was not that I disagreed with Dr. Paul. I was familiar with the basic notions. I saw why gold and silver so often served as money from ancient times to the near present. I knew that the Federal Reserve was something close to a conspiracy of insiders working to advance . . . well, I wasn’t at all certain it was “the general welfare.” The founding of the Fed was a typical Progressive Era reform: allegedly a triumph of expertise, it was obviously a concoction by bankers, run for bankers. At first I merely suspected Ron Paul’s even darker view had at least something going for it. Later I came to agree more and more with him. But still, why bring it up all the time? It was political death. It seemed like an effort in self-marginalization. I was wrong. Ron Paul was right. Journalists and pundits and political experts were also wrong — big-time wrong. Just look at the crowds of Ron Paul’s supporters. They don’t start yawning when his speeches earnestly wander away from the Approved Topics and into monetary theory. They maintain enthusiasm. The good doctor gets cheers, perhaps even more cheers. And when Ron Paul triumphantly proclaimed, after his third-place showing in Iowa, that sometime soon the experts would proclaim “we’re all Austrians now,” his hordes of supporters got the reference to a Nixon-era aphorism about Keynesianism and were not in the least confused by a presidential candidate in America referencing, positively, a German-speaking foreign country. Instead of backfiring and sounding lunatic, the moment almost reached Kennedy’s “Ich bin ein Berliner” heights. How wrong the experts were! Just ask the young folks. Ron Paul’s supporters of all shapes and colors and creeds will emphasize the danger posed by the Federal Reserve. And the need to get rid of it. 
I have heard dozens — scores, maybe even hundreds — of just plain folks begin to discourse on the hazards associated with giving insiders special privileges in the money creation biz. I have heard explanations of Gresham’s Law; the usefulness of “hard money”; the dangers of credit money and the sheer perversity of fiat money; and the advisability of abolishing legal tender laws . . . as well as knowledgeable mentions of the Austrian School. These folks may not always understand that Austrian economics is not a univocal set of policy proposals, but a rich tradition of positive explanatory theory, instead. Still, the mere familiarity with a few of its doctrines is something of a surprise, especially from regular voters. Well, maybe “regular voters” is not quite right. These folks are not old hands in either major party. They are often independents. But they are special. They have been schooled by Dr. Ron Paul. For Ron Paul has been in this for the long haul. He has been pushing monetary reform from the beginning of his political career, with Sisyphean persistence. Yes, the experts — including me — shook their heads, clucking disapproval. But he switched myths on us. He has become Prometheus. He has brought us fire. Ron Paul was right to say that money is key. Monetary theory best explains the cycles of boom and bust, why they occur, and why the medicine employed since the beginning of the Great Depression doesn’t work, instead prolonging unemployment. And never was a time more ripe for this truth than now. Ron Paul’s persistence is paying off, paying off in the enthusiasm of crowds and the formation of a new voting bloc. Monetary reform, as an issue, is key for another reason: It helps demonstrate that Ron Paul is no standard-brand politician, looking only for the right grab-bag of issues that will “sell” to voters. It proves his honesty. It proves his prescience. It proves that the rules of politics-as-usual only apply when situations are usual. In times of crisis, the old rule of play-it-safe/stick-to-the-ordinary doesn’t cut it. In times of crisis, a true educator can do what he set out to do, educate. And, once we learn something, major change can come. I don’t know how far this new force in politics will go to implement the ideas that have arisen through the agency of Ron Paul. But I can tell you this: it’s great to see the pundits proved wrong. And I’ve never been so happy being proven wrong myself.
// Compresses quaternion to ozz::animation::RotationKey format. // The 3 smallest components of the quaternion are quantized to 16 bits // integers, while the largest is recomputed thanks to quaternion normalization // property (x^2+y^2+z^2+w^2 = 1). Because the 3 components are the 3 smallest, // their value cannot be greater than sqrt(2)/2. Thus quantization quality is // improved by pre-multiplying each component by sqrt(2). void CompressQuat(const ozz::math::Quaternion& _src, ozz::animation::QuaternionKey* _dest) { const float quat[4] = {_src.x, _src.y, _src.z, _src.w}; const size_t largest = std::max_element(quat, quat + 4, LessAbs) - quat; assert(largest <= 3); _dest->largest = largest & 0x3; _dest->sign = quat[largest] < 0.f; const float kFloat2Int = 32767.f * math::kSqrt2; const int kMapping[4][3] = {{1, 2, 3}, {0, 2, 3}, {0, 1, 3}, {0, 1, 2}}; const int* map = kMapping[largest]; const int a = static_cast<int>(floor(quat[map[0]] * kFloat2Int + .5f)); const int b = static_cast<int>(floor(quat[map[1]] * kFloat2Int + .5f)); const int c = static_cast<int>(floor(quat[map[2]] * kFloat2Int + .5f)); _dest->value[0] = math::Clamp(-32767, a, 32767) & 0xffff; _dest->value[1] = math::Clamp(-32767, b, 32767) & 0xffff; _dest->value[2] = math::Clamp(-32767, c, 32767) & 0xffff; }
#include <bits/stdc++.h> using namespace std; int main(int argc, char const *argv[]) { int d,g; cin>>d>>g; int p[d],c[d]; for (int i = 0; i < d; ++i) { cin>>p[i]>>c[i]; } int memo[d+1]={},ans=10000; while (memo[d]==0) { int now=0,nans=0,cab=-1; for (int i = 0; i < d; ++i) { if (memo[i]==1) { now+=(i+1)*100*p[i]+c[i]; nans+=p[i]; } else { cab=i; } } if (now<g) { if (cab!=-1&&(g-now)<=(cab+1)*100*p[cab]) { nans+=(g-now+(cab+1)*100-1)/((cab+1)*100); now+=g; } } if (now>=g) { ans=min(ans,nans); } int tmp=0; while (memo[tmp]==1) { memo[tmp]=0; ++tmp; } ++memo[tmp]; } cout<<ans<<endl; return 0; }
def simulate(self, clks=None): if clks is None: clks = 200 self.dut = Cosimulation("vvp -m ./myhdl.vpi fftexec", clk=self.clk, rst_n=self.rst_n, din=self.in_data, din_nd=self.in_nd, dout=self.out_data, dout_nd=self.out_nd, overflow = self.overflow, ) sim = Simulation(self.dut, self.clk_driver(), self.control()) sim.run(self.half_period*2*clks)
/** * If the message tooltip is out of the vertical viewport then we update its position */ protected void showStatus() { if (position == null) { if (!new ScrollHelper().isInViewPort(textObject.getElement())) { position = Position.TOP; } else { position = Position.LEFT; } } updatePosition(position); textObject.getElement().getStyle().setVisibility(Style.Visibility.VISIBLE); }
import multer from 'multer'; import { Request, Response } from 'express'; import fs from 'fs'; import formidable from 'formidable'; export const multipartUpload = multer({ storage: multer.diskStorage({ destination: function (req, file, callback) { callback(null, './uploads'); }, filename : (req, file, cb) => { const dateTimestamp = Date.now(); cb(null, file.fieldname + '-' + dateTimestamp + '.' + file.originalname.split('.')[file.originalname.split('.').length - 1]); } }) }).single('file'); export let uploadCropped = (req: Request, res: Response) => { const form = new formidable.IncomingForm(); form.parse(req, (err: any, fields: any, files: any) => { const file = fields.file; if (typeof file === 'string') { const name = Date.now() + '.jpg'; const dirPath = './uploads/' + name; const base64Data = file.replace(/^data:image\/png;base64,/, ''); fs.writeFile(dirPath, base64Data, 'base64', function (err) { if (err) { return console.log(err); } res.json({'filename': name}); }); } else { res.json({'filename': null}); } }); };
def calculateTiles(self, pCity): for i in range(gc.getNUM_CITY_PLOTS()): pPlot = pCity.getCityIndexPlot(i) if pPlot and not pPlot.isNone() and pPlot.hasYield(): if pCity.isWorkingPlot(pPlot): self._addTile(WORKED_TILES, pPlot) elif pCity.canWork(pPlot): self._addTile(CITY_TILES, pPlot) elif pPlot.getOwner() == pCity.getOwner(): self._addTile(OWNED_TILES, pPlot) else: self._addTile(ALL_TILES, pPlot)
/** * Utility class for Parameter parsing. */ public final class ParameterTypeUtil { private static final PrimitiveTypeNameResolver PRIMITIVE_TYPE_NAME_RESOLVER = new PrimitiveTypeNameResolver(); private ParameterTypeUtil() { } public static void setParameter(Object object, Field field, Object bodyParameter) throws ODataUnmarshallingException { Object fieldValue = null; if (bodyParameter != null) { EdmParameter annotation = field.getAnnotation(EdmParameter.class); if (annotation == null) { throw new ODataUnmarshallingException("Field should have EdmParameter annotation"); } PrimitiveType type = resolveType(annotation.type(), field); try { if (field.getType().isAssignableFrom(bodyParameter.getClass())) { fieldValue = bodyParameter; } else if (type != null) { fieldValue = ParserUtil.parsePrimitiveValue(bodyParameter.toString(), type); } } catch (ODataException e) { throw new ODataUnmarshallingException("Can't parse primitive value"); } } field.setAccessible(true); try { field.set(object, fieldValue); } catch (IllegalAccessException e) { throw new ODataUnmarshallingException("Error during setting a parameter to action object field"); } } private static PrimitiveType resolveType(String type, Field field) { if (isNullOrEmpty(type)) { return PRIMITIVE_TYPE_NAME_RESOLVER.resolveTypeName(field.getType()); } return PrimitiveType.forName(type); } /** * Primitive Type resolver. */ private static class PrimitiveTypeNameResolver { private Map<Class<?>, PrimitiveType> javaToPrimitiveType; PrimitiveTypeNameResolver() { Map<Class<?>, PrimitiveType> javaToPrimitiveTypeBuilder = new HashMap<>(); for (PrimitiveType primitiveType : PrimitiveType.values()) { Class<?> javaType = primitiveType.getJavaType(); if (javaType != null) { javaToPrimitiveTypeBuilder.put(javaType, primitiveType); } } javaToPrimitiveType = Collections.unmodifiableMap(javaToPrimitiveTypeBuilder); } public PrimitiveType resolveTypeName(Class<?> javaType) { return javaToPrimitiveType.get(PrimitiveUtil.unwrap(javaType)); } } }
/* Copyright 2019 The OpenEBS Authors. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. */ package controller import ( "github.com/openebs/node-disk-manager/pkg/udev" "strings" ) func (c *Controller) NewDeviceInfoFromDiskInfo(diskDetails *DiskInfo) *DeviceInfo { deviceDetails := NewDeviceInfo() deviceDetails.NodeAttributes = diskDetails.NodeAttributes deviceDetails.UUID = c.DiskToDeviceUUID(diskDetails.ProbeIdentifiers.Uuid) deviceDetails.Capacity = diskDetails.Capacity deviceDetails.Model = diskDetails.Model deviceDetails.Serial = diskDetails.Serial deviceDetails.Vendor = diskDetails.Vendor deviceDetails.Path = diskDetails.Path deviceDetails.ByIdDevLinks = diskDetails.ByIdDevLinks deviceDetails.ByPathDevLinks = diskDetails.ByPathDevLinks deviceDetails.LogicalSectorSize = diskDetails.LogicalSectorSize deviceDetails.PhysicalSectorSize = diskDetails.PhysicalSectorSize deviceDetails.Compliance = diskDetails.Compliance deviceDetails.DeviceType = diskDetails.DriveType deviceDetails.FileSystemInfo.FileSystem = diskDetails.FileSystemInformation.FileSystem deviceDetails.FileSystemInfo.MountPoint = diskDetails.FileSystemInformation.MountPoint return deviceDetails } // DiskToDeviceUUID converts a disk UUID (disk-xxx) to a block- // device UUID (blockdevice-xxx) func (c *Controller) DiskToDeviceUUID(diskUUID string) string { uuid := strings.TrimPrefix(diskUUID, udev.NDMDiskPrefix) return udev.NDMBlockDevicePrefix + uuid }
/* * This test uses FileHandle to load 3 binary documents, writes to database using bulk write set. * Verified by reading individual documents */ @Test public void testWriteMultipleBinaryDoc() throws Exception { String docId[] = {"Pandakarlino.jpg","mlfavicon.png"}; BinaryDocumentManager docMgr = client.newBinaryDocumentManager(); DocumentWriteSet writeset =docMgr.newWriteSet(); File file1= null,file2=null; file1 = new File("src/test/java/com/marklogic/javaclient/data/" + docId[0]); FileHandle handle1 = new FileHandle(file1); writeset.add("/1/"+docId[0],handle1.withFormat(Format.BINARY)); writeset.add("/2/"+docId[0],handle1.withFormat(Format.BINARY)); file2 = new File("src/test/java/com/marklogic/javaclient/data/" + docId[1]); FileHandle handle2 = new FileHandle(file2); writeset.add("/1/"+docId[1],handle2.withFormat(Format.BINARY)); writeset.add("/2/"+docId[1],handle2.withFormat(Format.BINARY)); docMgr.write(writeset); long fsize1 = file1.length(),fsize2 = file2.length(); FileHandle readHandle1 = new FileHandle(); docMgr.read("/1/"+docId[0],readHandle1); FileHandle readHandle2 = new FileHandle(); docMgr.read("/1/"+docId[1],readHandle2); System.out.println(file1.getName()+":"+fsize1+" "+readHandle1.get().getName()+":"+readHandle1.get().length()); System.out.println(file2.getName()+":"+fsize2+" "+readHandle2.get().getName()+":"+readHandle2.get().length()); assertEquals("Size of the File 1"+docId[0],fsize1,readHandle1.get().length()); assertEquals("Size of the File 1"+docId[1],fsize2,readHandle2.get().length()); }
/** * This function is called in the benchmarking phase which is executed with the -t argument. * * gets the comments for a resource and all the comment's details. * @param requesterID The unique identifier of the user who wants to view the comments posted on the profileOwnerID's resource. * @param profileOwnerID The profile owner's unique identifier (owner of the resource). * @param resourceID The resource's unique identifier. * @param result A vector of all the comment entities for a specific resource, each comment and its details are * specified as a hashmap. * @return Zero on success, a non-zero error code on error. See this class's description for a discussion of error codes. * * The code written for this function, gets the resourceid of a resource and returns all the comments posted on that resource * and their details. This information should be put into the results Vector. */ public int viewCommentOnResource(int requesterID, int profileOwnerID, int resourceID, Vector<HashMap<String,ByteIterator>> result) { if (requesterID < 0 || profileOwnerID < 0 || resourceID < 0) return -1; try { View view = manip_client.getView("manipulation", "by_rsc"); Query query = new Query(); query.setKey("\""+resourceID+"\""); query.setDescending(true); query.setIncludeDocs(true); ViewResponse res = manip_client.query(view, query); for (ViewRow row: res) { String doc = (String) row.getDocument(); HashMap<String, String> doc_map = getMapFromJson(doc); HashMap<String, ByteIterator> hmap = new HashMap<String, ByteIterator>(); for (String field: doc_map.keySet()) { hmap.put(field, new ObjectByteIterator(doc_map.get(field).getBytes())); } result.add(hmap); } } catch (Exception e) { e.printStackTrace(); return -1; } System.out.println("NOTICE viewCommentOnResource ok with resourceID " + resourceID); return 0; }
/* * partition_delete() - Delete a partition. * part_num: partition number * * Returns 0 on success, or Error Code */ static int partition_delete(int part_num) { int x; if (part_num < 0 || part_num >= spt.partitions) { librsu_log(LOW, __func__, "error: Invalid partition number"); return -1; } for (x = part_num; x < spt.partitions; x++) spt.partition[x] = spt.partition[x + 1]; spt.partitions--; if (writeback_spt()) return -1; if (load_spt()) return -1; return 0; }
Isolated annular genital lichen planus in a female A 30-year-old female presented to the STD clinic with a complaint of vulvar pruritus of 3 months' duration. There was no history of any such illness in the past. Genital examination revealed a single well-defined annular plaque, approximately 1 cm in diameter, with raised borders, present on the inner surface of the left labium majus. The center of the lesion was slightly atrophic. There was no cutaneous, oral mucosal, or nail involvement. A differential diagnosis of mucosal LP, porokeratosis, seborrheic keratosis, and granuloma annulare was kept. The histopathological examination from the edge of the plaque showed hyperkeratosis, acanthosis, follicular plugging, spongiosis, lymphocytic exocytosis, vacuolar alteration of basal keratinocytes, and apoptotic keratinocytes. The dermis showed edema with pigment incontinence and a band-like chronic lymphocytic infiltrate. No atypical keratinocytes were seen. Based on the clinicopathological correlation, a diagnosis of isolated annular LP of the vulva was made.
"""Test URL configuration for djiffy"""

from django.contrib import admin
from django.urls import include, path
from django.views.generic.base import RedirectView

from djiffy import urls as djiffy_urls

urlpatterns = [
    path('', RedirectView.as_view(pattern_name='admin:index'), name='site-index'),
    path('iiif/', include(djiffy_urls, namespace='djiffy')),
    path('admin/', admin.site.urls),
]
package ros

import (
	"fmt"
	"os"
	"testing"
	"time"
)

var (
	myTestTemplate = `
{
  "ROSTemplateFormatVersion": "2015-09-01",
  "Resources": {
    "string3": {
      "Type": "ALIYUN::RandomString",
      "DependsOn": ["string2"],
      "Properties": {"sequence": "octdigits", "length": 100}
    },
    "string1": {
      "Test1": "Hello",
      "Type": "ALIYUN::RandomString"
    },
    "string2": {
      "Type": "ALIYUN::RandomString",
      "DependsOn": ["string1"],
      "Properties": {"sequence": "octdigits", "length": 100}
    }
  }
}
`
	myTestTemplateUpdate = `
{
  "ROSTemplateFormatVersion": "2015-09-01",
  "Resources": {
    "string3": {
      "Type": "ALIYUN::RandomString",
      "DependsOn": ["string2"],
      "Properties": {"sequence": "octdigits", "length": 125}
    },
    "string1": {
      "Test1": "Hello222",
      "Type": "ALIYUN::RandomString"
    },
    "string2": {
      "Type": "ALIYUN::RandomString",
      "DependsOn": ["string1"],
      "Properties": {"sequence": "octdigits", "length": 100}
    }
  }
}
`
)

func TestClient_CreateStack(t *testing.T) {
	args := &CreateStackRequest{
		Name:            fmt.Sprintf("my-k8s-test-stack-%d", time.Now().Unix()),
		Template:        myTestTemplate,
		Parameters:      map[string]interface{}{},
		DisableRollback: false,
		TimeoutMins:     30,
	}
	response, err := debugClientForTestCase.CreateStack(TestRegionID, args)
	if err != nil {
		t.Fatalf("Failed to CreateStack %++v", err)
	} else {
		t.Logf("Success %++v", response)
	}
}

func TestClient_DeleteStack(t *testing.T) {
	stackName := os.Getenv("StackName")
	stackId := os.Getenv("StackId")
	response, err := debugClientForTestCase.DeleteStack(TestRegionID, stackId, stackName)
	if err != nil {
		t.Fatalf("Failed to DeleteStack %++v", err)
	} else {
		t.Logf("Success %++v", response)
	}
}

func TestClient_AbandonStack(t *testing.T) {
	stackName := os.Getenv("StackName")
	stackId := os.Getenv("StackId")
	response, err := debugClientForTestCase.AbandonStack(TestRegionID, stackId, stackName)
	if err != nil {
		t.Fatalf("Failed to AbandonStack %++v", err)
	} else {
		t.Logf("Success %++v", response)
	}
}

func TestClient_DescribeStacks(t *testing.T) {
	args := &DescribeStacksRequest{
		RegionId: TestRegionID,
	}
	stacks, err := debugClientForTestCase.DescribeStacks(args)
	if err != nil {
		t.Fatalf("Failed to DescribeStacks %++v", err)
	} else {
		t.Logf("Response is %++v", stacks)
	}
}

func TestClient_DescribeStack(t *testing.T) {
	stackName := os.Getenv("StackName")
	stackId := os.Getenv("StackId")
	response, err := debugClientForTestCase.DescribeStack(TestRegionID, stackId, stackName)
	if err != nil {
		t.Fatalf("Failed to DescribeStack %++v", err)
	} else {
		t.Logf("Success %++v", response)
	}
}

func TestClient_UpdateStack(t *testing.T) {
	stackName := os.Getenv("StackName")
	stackId := os.Getenv("StackId")
	//ps, _ := json.Marshal(p)
	args := &UpdateStackRequest{
		Template:        myTestTemplateUpdate,
		Parameters:      map[string]interface{}{},
		DisableRollback: false,
		TimeoutMins:     45,
	}
	response, err := debugClientForTestCase.UpdateStack(TestRegionID, stackId, stackName, args)
	if err != nil {
		t.Fatalf("Failed to UpdateStack %++v", err)
	} else {
		t.Logf("Success %++v", response)
	}
}
def _parse_input(self, line):
    data = line.lower()
    # Strip punctuation unless this is a 'say' command, whose payload must
    # be passed through verbatim.
    if 'say' not in data:
        data = data.replace("?", "")
        data = data.replace("!", "")
        data = data.replace(",", "")
        data = data.replace(".", "")
        data = self.regex_dot.sub("", data)
    plugins = self.plugins
    last_plugin = None
    plugin_prefix = ''
    # Walk down the plugin tree, consuming one sub-command per iteration,
    # until no deeper match is found; return the deepest matching plugin.
    while True:
        sub_command, data = self._find_action(data, plugins, plugin_prefix)
        if sub_command is None:
            return last_plugin
        last_plugin = sub_command
        plugin_prefix += last_plugin.get_name() + ' '
        plugins = sub_command.get_plugins().values()
module VariationsOnEither where

import Test.QuickCheck.Checkers
import Test.QuickCheck.Classes
import Test.QuickCheck (Arbitrary (arbitrary))
import Test.QuickCheck.Gen (oneof)

data Validation e a
  = Failure e
  | Success a
  deriving (Eq, Show)

instance Functor (Validation e) where
  fmap _ (Failure e) = Failure e
  fmap f (Success a) = Success (f a)

instance Monoid e => Applicative (Validation e) where
  pure = Success
  (Failure e) <*> (Failure e') = Failure (e <> e')
  (Failure e) <*> _            = Failure e
  _           <*> (Failure e)  = Failure e
  (Success f) <*> (Success x)  = Success (f x)

-- Generate either a Failure or a Success with equal probability.
instance (Arbitrary e, Arbitrary a) => Arbitrary (Validation e a) where
  arbitrary = do
    e <- arbitrary
    a <- arbitrary
    oneof [return (Failure e), return (Success a)]

instance (Eq e, Eq a) => EqProp (Validation e a) where
  (=-=) = eq

tests = do
  -- quickBatch $ functor (undefined :: Validation [String] (Int, String, Integer))
  quickBatch $ applicative (undefined :: Validation [String] (Int, String, Integer))

-- An equivalent, more explicit definition of (<*>):
-- af <*> x = case af of
--   (Success f) -> case x of
--     (Success x) -> Success (f x)
--     (Failure e) -> Failure e
--   (Failure e) -> case x of
--     (Success x)  -> Failure e
--     (Failure e') -> Failure (e <> e')
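The payoff of this Applicative over Either's is error accumulation: two Failures combine through the Monoid instead of short-circuiting on the first. A quick GHCi-style check against the module above (the type annotations are only there to pin the types):

-- ghci> (Failure ["bad x"] <*> Failure ["bad y"]) :: Validation [String] Int
-- Failure ["bad x","bad y"]
-- ghci> (Success (+1) <*> Success 41) :: Validation [String] Int
-- Success 42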
Trials, Skills, and Future Standpoints of AI Based Research in Bioinformatics

In recent times, computing has entered all types of business and industry. Recent advancements in information technology have opened up many possibilities for multidisciplinary research. Machine learning, deep learning, convolutional neural networks, etc. are recent developments in computer science that have changed the way algorithms are built: such algorithms can learn while executing, improving their performance over time and continuing to learn. Bioinformatics is a recent example of a science that strives to use these technologies for the betterment of its own field. This article reviews Artificial Intelligence subsets such as machine learning and deep learning in the genomics and proteomics domains. It provides profound insights into the various AI techniques that can be incorporated into the field of bioinformatics, and it highlights the future research potential of this field. Computational biology, genomics, proteomics, drug design, and gene-expression analysis are the major research areas in bioinformatics; these areas are also discussed in the paper.
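To make the genomics use case concrete, here is a minimal sketch of the kind of supervised-learning pipeline such reviews discuss: classifying samples from gene-expression profiles. Everything in it (the synthetic data, the feature count, the model choice) is illustrative and not taken from the article.

# Minimal sketch: classify tissue samples from gene-expression vectors.
# The data is synthetic; a real study would load normalized expression matrices.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))          # 200 samples x 500 gene-expression features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy phenotype label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))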
miR-29a, b and c Regulate SLC5A8 Expression in Intestinal Epithelial Cells.

Short-chain fatty acids (SCFAs) produced by bacterial fermentation of dietary fiber exert a myriad of beneficial effects, including the amelioration of inflammation. Because SCFAs exist as anions at luminal pH, their entry into cells depends on the expression and function of monocarboxylate transporters. In this regard, sodium-coupled monocarboxylate transporter-1 (SMCT-1) is one of the major proteins involved in the absorption of SCFAs in the mammalian colon. However, very little is known about the mechanisms that regulate SMCT-1 expression in health and disease. MicroRNAs (miRs) are known to play a key role in modulating gene expression. In silico analysis showed miR-29a, b and c with the highest context score, and their binding region is conserved among mammals. The 3'-untranslated region (UTR) of the human SMCT-1 gene was cloned into the pmirGLO vector upstream of a luciferase reporter and transiently transfected with miR-29a, b and c mimics into Caco-2 and/or T-84 cells. The presence of the UTR of this gene significantly decreased luciferase activity compared to the empty vector. Co-transfection with miR-29a, b or c resulted in a further decrease in the 3'UTR activity of the SMCT-1 luciferase constructs. Mimic transfection significantly decreased SMCT-1 protein expression without altering mRNA expression. Further, the expression of miR-29a and c was significantly lower in mouse colon compared to small intestine, consistent with the higher levels of SMCT-1 protein in the colon. Our studies demonstrate a novel finding in which miR-29a, b and c down-regulate SMCT-1 expression in colonic epithelial cells, which may partly explain the differential expression of these transporters along the length of the GI tract.
// We use the transportation technique to forward the TCP packet directly to the
// destination location.
private void installTCPProcessingRules(FloodlightContext cntx) {
    Ethernet eth = IFloodlightProviderService.bcStore.get(cntx,
            IFloodlightProviderService.CONTEXT_PI_PAYLOAD);
    IPv4 ipv4 = (IPv4) eth.getPayload();
    int dstIP = ipv4.getDestinationAddress().getInt();
    IDevice dstDevice = getDeviceFromIP(dstIP);
    if (dstDevice == null) {
        log.error("[FRESCO] cannot send out TCP packets due to failure to locate destination");
        return;
    }
    if (dstDevice.getAttachmentPoints().length < 1) {
        log.error("[FRESCO] can not install TCP processing"
                + " flow rules due to missing host location info");
        return;
    }
    SwitchPort dstLocation = getLocationFromDevice(dstDevice);
    IOFSwitch sw = switchService.getSwitch(dstLocation.getSwitchDPID());
    if (sw == null) {
        log.error("[FRESCO] can not install TCP processing"
                + " because the destination switch is offline");
        return; // without this return we would dereference a null switch below
    }
    OFPacketOut.Builder pob = sw.getOFFactory().buildPacketOut();
    List<OFAction> actions = new ArrayList<OFAction>();
    actions.add(sw.getOFFactory().actions().buildOutput()
            .setPort(dstLocation.getPort()).setMaxLen(Integer.MAX_VALUE).build());
    pob.setActions(actions);
    byte[] packetData = eth.serialize();
    pob.setData(packetData);
    sw.write(pob.build());
}
#![cfg(test)]
use super::SExp::{self, Null};

fn do_parse_and_assert(test_val: &str, expected_val: SExp) {
    let test_parsed = test_val.parse::<SExp>().unwrap();
    assert_eq!(test_parsed, expected_val);
}

#[test]
fn empty_list() {
    do_parse_and_assert("()", Null);
}

#[test]
fn list_of_lists() {
    do_parse_and_assert("(() () ())", Null.cons(Null).cons(Null).cons(Null));
}

#[test]
fn atom() {
    do_parse_and_assert("hello", SExp::sym("hello"));
}

#[test]
fn list_of_atoms() {
    do_parse_and_assert(
        "(a bc de fgh ijk l mnop)",
        Null.cons(SExp::sym("mnop"))
            .cons(SExp::sym("l"))
            .cons(SExp::sym("ijk"))
            .cons(SExp::sym("fgh"))
            .cons(SExp::sym("de"))
            .cons(SExp::sym("bc"))
            .cons(SExp::sym("a")),
    );
}

#[test]
fn comments() {
    do_parse_and_assert(
        r#"
; leading comment
(1 ;; double semicolon
(2 null) ; in between
(x) ;; not included: 5)
)
"#,
        Null.cons(Null.cons(SExp::sym("x")))
            .cons(Null.cons(SExp::sym("null")).cons(2.into()))
            .cons(1.into()),
    );
}

#[test]
fn primitive_types() {
    do_parse_and_assert("#f", SExp::from(false));
    do_parse_and_assert("#t", SExp::from(true));
    do_parse_and_assert("0", SExp::from(0));
    do_parse_and_assert("2.0", SExp::from(2));
    do_parse_and_assert("inf", SExp::from(std::f64::INFINITY));
    do_parse_and_assert("-inf", SExp::from(std::f64::NEG_INFINITY));
    do_parse_and_assert("#\\c", SExp::from('c'));
    do_parse_and_assert("#\\'", SExp::from('\''));
    do_parse_and_assert(
        r#""test string with spaces""#,
        SExp::from("test string with spaces"),
    );
}

#[test]
fn mixed_type_list() {
    do_parse_and_assert(
        "(0 #f () 33.5 \"xyz\" #\\? #t \"\" \" \")",
        Null.cons(SExp::from(" "))
            .cons(SExp::from(""))
            .cons(SExp::from(true))
            .cons(SExp::from('?'))
            .cons(SExp::from("xyz"))
            .cons(SExp::from(33.5))
            .cons(Null)
            .cons(SExp::from(false))
            .cons(SExp::from(0)),
    );
}

#[test]
fn quote_syntax() {
    do_parse_and_assert(
        "'(a b c d)",
        Null.cons(
            Null.cons(SExp::sym("d"))
                .cons(SExp::sym("c"))
                .cons(SExp::sym("b"))
                .cons(SExp::sym("a")),
        )
        .cons(SExp::sym("quote")),
    );

    do_parse_and_assert(
        "'potato",
        Null.cons(SExp::sym("potato")).cons(SExp::sym("quote")),
    );
}

#[test]
fn quasiquote_syntax() {
    do_parse_and_assert("`1", Null.cons(1.into()).cons(SExp::sym("quasiquote")));

    do_parse_and_assert(
        "`(a b c d)",
        Null.cons(
            Null.cons(SExp::sym("d"))
                .cons(SExp::sym("c"))
                .cons(SExp::sym("b"))
                .cons(SExp::sym("a")),
        )
        .cons(SExp::sym("quasiquote")),
    );

    do_parse_and_assert(
        "`(a b ,() d)",
        Null.cons(
            Null.cons(SExp::sym("d"))
                .cons(Null.cons(Null).cons(SExp::sym("unquote")))
                .cons(SExp::sym("b"))
                .cons(SExp::sym("a")),
        )
        .cons(SExp::sym("quasiquote")),
    );
}

mod parens {
    use super::{do_parse_and_assert, Null, SExp};

    macro_rules! try_parse {
        ( $e:expr ) => {
            $e.parse::<SExp>().unwrap()
        };
    }

    #[test]
    #[should_panic]
    fn unmatched_null() {
        try_parse!("'(]");
    }

    #[test]
    #[should_panic]
    fn unmatched_list() {
        try_parse!("'[a b }");
    }

    #[test]
    #[should_panic]
    fn unbalanced() {
        try_parse!("(a b (c d) e [ f (g h ) )])");
    }

    #[test]
    fn types() {
        do_parse_and_assert(
            "(a b c d)",
            Null.cons(SExp::sym("d"))
                .cons(SExp::sym("c"))
                .cons(SExp::sym("b"))
                .cons(SExp::sym("a")),
        );
        do_parse_and_assert(
            "[a b c d]",
            Null.cons(SExp::sym("d"))
                .cons(SExp::sym("c"))
                .cons(SExp::sym("b"))
                .cons(SExp::sym("a")),
        );
        do_parse_and_assert(
            "{ a b c d }",
            Null.cons(SExp::sym("d"))
                .cons(SExp::sym("c"))
                .cons(SExp::sym("b"))
                .cons(SExp::sym("a")),
        );
    }
}
/**
 * @param factors an array of two long integers.
 *        factors[0] is the numerator
 *        factors[1] is the denominator
 *        The fraction is reduced in place, so the array will contain the reduced fraction.
 *        The denominator will always be positive when true is returned.
 * @return false if either factor equals Long.MIN_VALUE after being reduced, otherwise true.
 *         The factors will be unchanged if false is returned.
 * @throws IllegalArgumentException if denominator is zero
 */
private static boolean reduce(long[] factors) {
    long numerator = factors[NUM];
    long denominator = factors[DEN];
    if (denominator == 0) {
        throw new IllegalArgumentException("Denominator is zero");
    }
    if (denominator == 1) {
        return true;
    }
    if (numerator == 0) {
        factors[DEN] = 1;
        return true;
    }
    if (numerator == Long.MIN_VALUE || denominator == Long.MIN_VALUE) {
        if (numerator % 2 == denominator % 2) {
            numerator = numerator / 2;
            denominator = denominator / 2;
        } else {
            return false;
        }
    }
    int sign = 1;
    if (numerator < 0 ^ denominator < 0) {
        sign = -1;
        numerator = Math.abs(numerator);
        denominator = Math.abs(denominator);
    }
    long g = Gcd.gcd(numerator, denominator);
    numerator = numerator / g;
    denominator = denominator / g;
    factors[NUM] = Math.abs(numerator) * sign;
    factors[DEN] = Math.abs(denominator);
    return true;
}
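A quick illustration of the in-place contract (a hypothetical call site inside the same class, since the method is private):

long[] f = {6, -4};      // represents 6 / -4
boolean ok = reduce(f);  // reduces in place
// ok == true, f[NUM] == -3, f[DEN] == 2  (denominator kept positive)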
import sys
import math

def Ii(): return int(sys.stdin.buffer.readline())
def Mi(): return map(int, sys.stdin.buffer.readline().split())
def Li(): return list(map(int, sys.stdin.buffer.readline().split()))

x = Ii()
ans = 0
k = 100  # initial balance
# 1% interest per year, rounded down; count the years until the balance
# reaches at least x.
while x > k:
    tax = k // 100
    k += tax
    ans += 1
print(ans)
/**
 * Discard any input.
 * Reads and throws away any input until no more is present. Then resets the
 * read state.
 */
synchronized void discardInput() {
    do {
        inCnt = 0;
        inOffset = 0;
        Delay.msDelay(1);
        fillBuffer(false);
    } while (inCnt > 0);
    setHeader(header);
}
Mads Gilbert, an outspoken Norwegian doctor and activist who treated patients at Gaza's al-Shifa hospital during Israel's assault on the Palestinian territory this summer, has been denied access to Gaza "indefinitely" by Israeli authorities. Gilbert told Al Jazeera on Friday that he was turned away from the Erez border crossing when attempting to return to Gaza in October, despite having all the legitimate paperwork. "To my surprise I was denied access by the Israeli military," he said. "When I asked the reason they informed me that it was a security issue." Gilbert said that when he asked for a fuller explanation, he was told to "leave the premises or the police would be called". The 67-year-old, who has been involved in solidarity work with Palestinians for decades and volunteered at al-Shifa on and off for 17 years, has been a vocal critic of Israel's military campaigns and its occupation of Palestinian territory. During the seven-week conflict between Israel and the Hamas movement that left more than 2,000 Palestinians dead, Gilbert frequently spoke to international media, including Al Jazeera, about the situation at al-Shifa hospital, which was overwhelmed with civilian casualties. However, a spokesperson for the Coordination of the Government Activities in the Territories, the Israeli authority that coordinates all traffic between Gaza and Israel, told the Norwegian newspaper Verdens Gang that the refusal of entry was related to security reasons and had "nothing to do with Gilbert's anti-Israeli and anti-Semitic remarks". Norwegian pressure Gilbert told Al Jazeera he was informed that the ban was "infinite without any time limit". He said he had been invited by the Gaza Health Ministry, which had requested his assistance to research the impact of the Israeli bombardment on healthcare and to follow up on work done during that time. The Norwegian embassy in Tel Aviv has made numerous inquiries to the Israeli government about the ban. Bard Glad Pedersen, state secretary at the Norwegian Foreign Ministry, told Verdens Gang, "We have raised Gilbert's exclusion from Gaza and asked Israel to change their decision. The humanitarian situation in Gaza is still difficult and there is a need for all health workers." Medical Aid for Palestinians, a UK-registered charity that has been working in the occupied West Bank and Gaza for over 20 years and supports al-Shifa hospital, called the ban on Gilbert "deeply concerning" and reiterated that, "following the recent conflict, thousands of Palestinians in Gaza require specialised surgical treatment and it is imperative that the right to health is unimpeded". 'Will not give up' Denouncing his entry ban as a limitation of freedom of expression, Gilbert said it appeared the Israeli government "doesn't want the effects of their continuous attacks on the civilian population in Gaza to be known to the world". "Telling the world about the burdens of the Palestinians in Gaza is considered a security risk," he said, adding that in a larger perspective, the ban was not about him but about Gazans' right to international assistance. "The Israeli authorities are, in my opinion, in no position to deny the Palestinian people support from the international community," he told Al Jazeera. He vowed to continue to challenge Israel and called for political pressure to be exerted to lift the "long overdue" siege of Gaza.
"There is no way we’re going to accept that medical and humanitarian assistance to the people in Gaza shall be denied just because the Israeli government has decided so. I will not give up travelling to Gaza as long as they have medical needs," he said. Israel launched "Operation Protective Edge" following firing of rockets by Palestinian armed groups from Gaza. According to UN figures the Palestinian death toll was 2,131, of whom 1,473 were identified as civilians, including 501 children. On the Israeli side, 77 people, mostly soldiers, were killed.
import java.util.Scanner;

public class Main {
    /**
     * Task: given a string of up to 100 digits, delete digits so that the
     * remaining subsequence is divisible by 8 (2^3). Since 1000 is divisible
     * by 8, a suitable subsequence of at most 3 digits always suffices.
     *
     * Case analysis by the last (even) digit of the subsequence:
     *   ...6: tens digit 1, 5 or 9 works directly; tens digit 3 or 7 needs
     *         an odd hundreds digit in front.
     *   ...4: tens digit 2 or 6 works directly; tens digit 4 needs an odd
     *         hundreds digit in front.
     *   ...2: tens digit 3 or 7 works directly; tens digit 1, 5 or 9 needs
     *         an odd hundreds digit in front.
     *   0 and 8 are divisible by 8 on their own.
     */
    static String nums = "08642";
    // Odd digits, used as the candidate hundreds digit in the 3-digit cases.
    static String odds = "13579";

    static boolean contain(String s, String c) {
        for (int i = 0; i < c.length(); i++) {
            if (s.contains("" + c.charAt(i))) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Scanner scanner = new Scanner(System.in);
        String s = scanner.nextLine();
        if (contain(s, "0")) {
            System.out.println("YES\n0");
        } else if (contain(s, "8")) {
            System.out.println("YES\n8");
        } else if (contain(s, "642")) {
            String ans = "";
            if ("".equals(ans) && contain(s, "6")) {
                String sub = s.substring(0, s.lastIndexOf("6"));
                String cc = "159";
                for (int i = 0; i < cc.length(); i++) {
                    if (contain(sub, "" + cc.charAt(i))) {
                        ans = cc.charAt(i) + "6";
                        break;
                    }
                }
                String ll = "37";
                for (int i = 0; i < ll.length() && "".equals(ans); i++) {
                    if (contain(sub, "" + ll.charAt(i))) {
                        String sub1 = sub.substring(0, sub.lastIndexOf(ll.charAt(i)));
                        for (int j = 0; j < odds.length(); j++) {
                            if (contain(sub1, "" + odds.charAt(j))) {
                                ans = odds.charAt(j) + (ll.charAt(i) + "6");
                            }
                        }
                    }
                }
            }
            if ("".equals(ans) && contain(s, "2")) {
                String sub = s.substring(0, s.lastIndexOf("2"));
                String cc = "37";
                for (int i = 0; i < cc.length(); i++) {
                    if (contain(sub, "" + cc.charAt(i))) {
                        ans = cc.charAt(i) + "2";
                        break;
                    }
                }
                String ll = "159";
                for (int i = 0; i < ll.length() && "".equals(ans); i++) {
                    if (contain(sub, "" + ll.charAt(i))) {
                        String sub1 = sub.substring(0, sub.lastIndexOf(ll.charAt(i)));
                        for (int j = 0; j < odds.length(); j++) {
                            if (contain(sub1, "" + odds.charAt(j))) {
                                ans = odds.charAt(j) + (ll.charAt(i) + "2");
                            }
                        }
                    }
                }
            }
            if ("".equals(ans) && contain(s, "4")) {
                String sub = s.substring(0, s.lastIndexOf("4"));
                String cc = "26";
                for (int i = 0; i < cc.length(); i++) {
                    if (contain(sub, "" + cc.charAt(i))) {
                        ans = cc.charAt(i) + "4";
                        break;
                    }
                }
                String ll = "4";
                for (int i = 0; i < ll.length() && "".equals(ans); i++) {
                    if (contain(sub, "" + ll.charAt(i))) {
                        String sub1 = sub.substring(0, sub.lastIndexOf(ll.charAt(i)));
                        for (int j = 0; j < odds.length(); j++) {
                            if (contain(sub1, "" + odds.charAt(j))) {
                                ans = odds.charAt(j) + (ll.charAt(i) + "4");
                                break;
                            }
                        }
                    }
                }
            }
            if ("".equals(ans)) {
                System.out.println("NO");
            } else {
                System.out.println("YES\n" + ans);
            }
        } else {
            System.out.println("NO");
        }
    }
}
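The case analysis above leans on the standard fact that an integer is divisible by 8 exactly when its last three digits are (because 1000 is a multiple of 8). A tiny check of that rule, separate from the submitted solution:

// Verify: n % 8 == (last three digits of n) % 8, for a few samples.
for (long n : new long[]{123456L, 9000008L, 777L}) {
    long last3 = n % 1000;
    System.out.println(n + ": " + (n % 8 == last3 % 8)); // prints true for all
}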
use num::FromPrimitive;

use header::*;
use constants::*;
use instructions::Instruction as Instr;

#[derive(Debug, Default)]
pub struct Interpreter {
    // File data
    header: RaptorHeader,
    const_table: ConstTable,
    // Runtime stuff
    pub op_stack: Vec<i32>,
    memory: Vec<i32>,
    pub prog_bytecode: Vec<u8>,
}

// All of the fields being pub is not very good
// Maybe move this into runtime.rs?
#[derive(Debug, Default, Clone)]
pub struct StackFrame {
    pub id: u32,
    pub locals: Vec<i32>,
    // The index of the first op in the op_stack that should be kept
    pub return_addr: usize,
    pub bytecode: Vec<u8>,
    pub bc_counter: usize,
}

impl Interpreter {
    pub fn new(mut data: Vec<u8>, debug: bool) -> Interpreter {
        if debug {
            debug!("Bytecode length: {} bytes", data.len());
        }

        let header = read_header(&data);
        data.drain(..HEADER_SIZE);

        let const_table: ConstTable = read_const_table(data.as_slice());
        if debug {
            debug!("Constant table length: {} bytes", const_table.bc_counter);
            debug!("Bytecode length: {} bytes", data.len());
        }
        data.drain(..const_table.bc_counter);

        let mut i = Interpreter {
            header: header,
            const_table: const_table,
            op_stack: Vec::new(),
            memory: Vec::new(),
            prog_bytecode: data,
        };
        i.memory.resize(i.header.var_count as usize, 0);
        i
    }
}

impl StackFrame {
    fn get_next_4_bytes(&mut self) -> u32 {
        let val = (self.bytecode[self.bc_counter] as u32) << 24 |
                  (self.bytecode[self.bc_counter + 1] as u32) << 16 |
                  (self.bytecode[self.bc_counter + 2] as u32) << 8 |
                  (self.bytecode[self.bc_counter + 3] as u32);
        self.bc_counter += 4;
        debug!("get_next_4_bytes: 0x{:04X}", val);
        val
    }

    pub fn dispatch(&mut self, inpr: &mut Interpreter, debug: bool) -> Option<StackFrame> {
        use std::ops::*;
        use std::cmp::*;

        // Main loop
        while self.bc_counter != self.bytecode.len() {
            // info!("PC: {}", bc_counter);
            // Use FromPrimitive trait to convert a value to its enum
            let instr = Instr::from_u8(self.bytecode[self.bc_counter]);
            self.bc_counter += 1;

            if instr.is_none() {
                // bc_counter has already advanced, so report the opcode byte itself
                warn!("Unimplemented instruction: {:04X}", self.bytecode[self.bc_counter - 1]);
                continue;
            }
            // We're sure it's Some here, so unpack it
            let instr = instr.unwrap();
            if debug {
                debug!("{:?}", instr);
            }

            macro_rules! push {
                ( $x:expr ) => { inpr.op_stack.push($x); };
            }
            // Expression macro: no trailing semicolon, so `pop!().unwrap()` works.
            macro_rules! pop {
                () => { inpr.op_stack.pop() };
            }
            macro_rules! operation {
                ($op:ident) => ({
                    let l = pop!().unwrap();
                    let r = pop!().unwrap();
                    let val = l.$op(r);
                    push!(val);
                    debug!("Operation: {:?}. Operands: [{}, {}]. Result: {}.", instr, l, r, val);
                })
            }
            macro_rules! reljump {
                ($op:ident) => ({
                    let top = pop!().unwrap();
                    if top.$op(&0) {
                        reljump!();
                    } else {
                        self.get_next_4_bytes();
                        if debug { debug!("Jump not taken"); }
                    }
                });
                () => ({
                    let offset = (self.get_next_4_bytes() - 1) as i32;
                    // Need this if because you can't have negative usizes
                    if offset > 0 {
                        if debug { debug!("RELJUMP: {}", offset); }
                        self.bc_counter += offset as usize;
                        if offset == 1 {
                            warn!("RELJUMP 1 is redundant. This is a compiler bug")
                        }
                    } else if offset < 0 {
                        if debug { debug!("RELJUMP: {}", offset); }
                        self.bc_counter -= (-offset) as usize;
                    } else {
                        warn!("Invalid reljump offset: 0");
                    }
                });
            }
            macro_rules! push_frame {
                ($id:expr) => ({
                    let func_const = &inpr.const_table.funcs[$id as usize];
                    let mut sf = StackFrame {
                        id: $id,
                        locals: Vec::new(),
                        bytecode: func_const.body.clone(),
                        ..Default::default()
                    };
                    for _ in 0..func_const.arg_count {
                        sf.locals.push(pop!().unwrap());
                    }
                    sf.return_addr = inpr.op_stack.len();
                    sf.locals.resize(
                        (func_const.arg_count + func_const.local_count) as usize, 0);
                    if debug {
                        debug!("Pushed new frame: {:?}", sf);
                        debug!("Op stack: {:?}", inpr.op_stack);
                    }
                    return Some(sf);
                });
            }

            match instr {
                Instr::NOP => {},
                Instr::HALT => {
                    println!("HALT issued, stopped execution.");
                    if debug {
                        debug!("Stack: {:?}", inpr.op_stack);
                        debug!("Memory: {:?}", inpr.memory);
                    }
                },
                Instr::ICONST => {
                    let b = self.get_next_4_bytes() as i32;
                    push!(b);
                },
                Instr::POP => { pop!(); },
                Instr::ADD => { operation!(add); },
                Instr::SUB => { operation!(sub); },
                Instr::MULTIPLY => { operation!(mul); },
                Instr::DIVIDE => { operation!(div); },
                Instr::MODULUS => { operation!(rem); },
                Instr::RSHIFT => { operation!(shl); },
                Instr::LSHIFT => { operation!(shr); },
                Instr::AND => { operation!(bitand); },
                Instr::OR => { operation!(bitor); },
                Instr::NOT => {
                    let val = pop!().unwrap();
                    push!(val.not());
                },
                Instr::COMP => {
                    let a = pop!();
                    let b = pop!();
                    push!(if a > b {1} else if a < b {-1} else {0});
                },
                Instr::COMP_LT => {
                    let a = pop!();
                    let b = pop!();
                    push!(if a < b {1} else {0});
                },
                Instr::COMP_EQ => {
                    let a = pop!();
                    let b = pop!();
                    push!(if a == b {1} else {0});
                },
                Instr::COMP_GT => {
                    let a = pop!();
                    let b = pop!();
                    push!(if a > b {1} else {0});
                },
                Instr::RELJUMP => { reljump!(); },
                Instr::RELJUMP_GT => { reljump!(gt); },
                Instr::RELJUMP_LT => { reljump!(lt); },
                Instr::RELJUMP_EQ => { reljump!(eq); },
                Instr::STORE => {
                    let index = self.get_next_4_bytes() as usize;
                    let val = pop!().unwrap();
                    self.locals[index] = val;
                    debug!("Stored {} into local {}", val, index)
                },
                Instr::LOAD => {
                    let index = self.get_next_4_bytes() as usize;
                    let val = self.locals[index];
                    push!(val);
                    debug!("Loaded {} from local {}", val, index);
                    debug!("Op stack: {:?}", inpr.op_stack)
                },
                Instr::STOREFIELD => { unimplemented!() },
                Instr::LOADFIELD => { unimplemented!() },
                Instr::VECTORSTORE => { unimplemented!() },
                Instr::VECTORLOAD => { unimplemented!() },
                Instr::CALL => {
                    let id: u32 = self.get_next_4_bytes();
                    debug!("Calling func {}", self.id);
                    return push_frame!(id);
                },
                Instr::RETURN => {
                    let val = pop!().unwrap();
                    inpr.op_stack.resize(self.return_addr, 0);
                    debug!("Returning {} from func {}", val, self.id);
                    push!(val);
                    return None;
                }
                Instr::PRINT => {
                    println!("PRINT: {}", pop!().unwrap());
                },
                Instr::DUMP_STACK => {
                    println!("{:?}", inpr.op_stack);
                },
                Instr::DUMP_GLOBALS => { println!("{:?}", inpr.memory); },
            }
        }
        return None; // Pop the current frame
    }
}

// TODO: Add tests
Nanocomposites of TiO2/cyanoethylated cellulose with ultra-high dielectric constants

A novel dielectric nanocomposite containing a high-permittivity polymer, cyanoethylated cellulose (CRS), and TiO2 nanoparticles was successfully prepared with different weight percentages (10%, 20% and 30%) of TiO2. The intermolecular interactions and morphology within the polymer nanocomposites were analysed. TiO2/CRS nanofilms on SiO2/Si wafers were used to form metal–insulator–metal type capacitors. Capacitances and loss factors in the frequency range of 1 kHz–1 MHz were measured. At 1 kHz, the CRS-TiO2 nanocomposites exhibited ultra-high dielectric constants of 118, 176 and 207 for nanocomposites with 10%, 20% and 30% weight of TiO2 respectively, significantly higher than reported values for pure CRS (21), TiO2 (41) and other dielectric polymer-TiO2 nanocomposite films. Furthermore, all three CRS-TiO2 nanocomposites show a loss factor <0.3 at 1 kHz and low leakage current densities (10−6–10−7 A cm−2). Leakage was studied using conductive atomic force microscopy, and it was observed that the leakage is associated with TiO2 nanoparticles embedded in the CRS polymer matrix. A new class of ultra-high dielectric constant hybrids using nanoscale inorganic dielectrics dispersed in a high-permittivity polymer, suitable for energy management applications, is reported.
import java.util.*;

class For {
    void f(List<String> list) {
        for (Iterator<String> <warning descr="Variable 'it' can have 'final' modifier">it</warning> = list.iterator(); it.hasNext();) {
        }
        for (int i = 0; i < 10; i++) {}
        for (int i = 0, length = 10; i < length; i++) {}
    }
}
def cardinality(self):
    return gen_dataset_ops.dataset_cardinality(self._variant_tensor)
Photos by Rhea Butcher

After a lot of hard work coming up as a stand-up comic in Los Angeles, I got a call late in the evening on Labor Day telling me that I’d be appearing on The Late Late Show with Craig Ferguson the following night. This would be my network TV debut. I actually appreciate that I didn’t know sooner than the night before. I didn't have that one last time to run my jokes and eat shit because I was too hopped up on making them perfect to properly tell them. As it was, the Friday before Labor Day I went up to San Francisco to do two shows at Lost Weekend Video, an actual video store! With videos! Lost Weekend built an awesome twenty-seat room into their basement for movie screenings and comedy and such. The last sets I got in before taping Late Late were there—loose and low pressure and literally in a basement. Walking into CBS Studios, a producer on the show introduced me to another gal taping a segment for a later episode of Late Late. She was wearing some sheath dress that looked completely perfect and was gliding around in heels—the heel of which basically had the same diameter as a single human hair. I was wearing an old shirt I had picked out to sweat through during the drive over and then arrive in. I love to make a sweaty entrance. While she was running through talking points I found out that Elettra—that lovely heeled lady—is not only a model with a successful cooking show, she also happens to be Isabella Rossellini’s daughter, which means she is also Ingrid Bergman's grandkid. That’s some serious royal Hollywood blood. I am, however, Brenda Esposito’s daughter, and my mom's a rocking lady from Ohio. I was standing next to Elettra, trying to comprehend her ankle bones, when the other guest on my episode walked in. He was beautifully clad in denim, like a patriot, and also, he was Jay Leno. It’s pretty rare that late night hosts have an opportunity to guest on one another’s shows. They all work similar hours, often for different networks, and usually, they don’t overlap. Jay Leno being the other guest on my particular episode of Craig’s show was like a Jetsons/Flintstones crossover movie with less time travel. I introduced myself to Jay and then began pacing around near the craft service table's sandwich platter, staring at my joke notes as if they might become sentient and just do the set for me. I was giving those notes some eye loving partially because I hadn't run my set in a few days and partially because I always imagine I'll go blank on everything I've ever written and then have to say, "Well, I forgot to know anything. Goodnight!" before walking directly out of the room and quitting standup forever. And I was nervous as hell. Jay went into his dressing room to change from his denim to his suit. Why he did this I will never understand. Can't get a sharper look than denim. When he returned to the green room, I was still pacing around. He walked over and said to me, “I could tell you were the comic when I walked in. You don’t need to look at that. Put it away. It’s your act—you know it. You'll be great.” Jay Leno said this to me. Not only was that stellar, stellar advice given at the perfect moment, but also, he didn't have to be talking to me. Who am I? (I'm Cameron Esposito.) He could have sat in his dressing room, or he could have ignored me. Instead, I got the stand-up equivalent of a coach's butt slap to his star player from late night TV's most-watched host. A few minutes later, as I waited backstage to go on, Craig came over to meet me.
He was charming and friendly and told me, “Jay’s gonna stay and watch you.” In fact, they both did. As I walked out of the curtain to hit my mark, I was facing the audience. But really, I was focused on Jay and Craig sitting off to my right, about 20 feet away, watching me. Comics always play to the other comics in the room—that's who we want to laugh. So for some reason, either because I love crowd work or because I wanted to play to the comics in the room, I decided to refer to Jay directly in my act. It was improvised—I mentioned that like me and all lesbians everywhere, Jay loves denim. Craig called over to me, asking if I was calling Jay a lesbian. I pointed out Jay’s pompadour. We went back and forth. It was loose and fun and wildly unexpected. It’s an honor for a comic to be invited to the couch to sit with the host after a late night set. Not all comics are invited, and certainly not all comics are invited to sit after their debut. But, after our back and forth, Craig invited me over. To be invited to sit between two late night hosts? I honestly don't know if that has happened before. And what did they say to me when I sat there? They said: "White men are on their way out. You're the future." Final words spoken on the show? Jay yelled into the air, "Lesbians rule!" As my friend and fellow comic John Roy wrote to me after seeing the set, “Who would even think to ask a genie lamp for a first TV appearance where you do panel with two hosts at once, make fun of Leno and then he literally says ‘you’re the future?’” Not anyone. For me, there was one more element to that moment. Like most comics, I draw material from my life. I am a lesbian, so when I talk about my relationship, I talk about being with a woman, and when I talk about politics I talk about it from the perspective of someone who is still fighting for equal rights. It's not an act—it's my life, which I've turned into my act. I am very comfortable with myself and my act, but I have read some comments [DON'T READ THE COMMENTS] below videos of mine on the internet. My Late Late set centered around my recent engagement and I did wonder what CBS's middle of the road audience would think of my really gay, really normal marriage to another human who is a woman. Turns out, they were on board with it, just as they absolutely should have been. People can be shitty in faceless internet comments. In person, most audiences I've dealt with are open, interested, and pretty chill. Jay and Craig were more than that—they were an audience of comics, ready to step in and joke along with me and ready to improve the show together. Listen, I know those things happened with The Tonight Show and Conan. Those of us who are not Conan O'Brien or Jay Leno won't ever really know that full story. At the time, alternative comics like myself tended to be pretty invested in Conan's side of things. A few years later, Conan's show is going strong and Jay's about to retire again. Television is one of those industries without a ton of job security. It's a lot of up and downs and a lot of public failures and quiet successes. So I'd like to weigh in now, before he leaves late night, as saying, "Hey thanks, Mr. Leno." You were kind when you didn't have to be, quick on your feet, and genuinely funny. That stuff matters. Now, take this butt slap, get out there and finish strong. And then imagine the sound of a butt slap. Cameron Esposito is an LA-based comic and the host of Put Your Hands Together. Follow her on Twitter at @cameronesposito. 
More on Jay Leno (yes, we actually have more on Jay Leno): Jay Leno: New Hero of the Republican Party How Jay Leno Has Bettered Our Society
def find(A):
    # Pair each value with its original index, then sort by value.
    X = [(A[i], i) for i in range(len(A))]
    X = sorted(X)
    ans = [0] * len(X)
    p1 = (len(A) - 1) // 2  # lower median position
    p2 = len(A) // 2        # upper median position
    # For each element: if it sits in the lower half of the sorted order,
    # the median of the remaining elements is the upper median; otherwise
    # it is the lower median.
    for i in range(len(X)):
        a, b = X[i]
        if i < p2:
            ans[b] = X[p2][0]
        else:
            ans[b] = X[p1][0]
    return ans

input()
A = [str(x) for x in find(list(map(int, input().strip().split(" "))))]
print("\n".join(A))
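A hypothetical run to make the behaviour concrete: for the array [1, 2, 3, 4], the function returns the median of the other elements at each position.

# Sanity check (not part of the submission):
# find([1, 2, 3, 4]) == [3, 3, 2, 2]
# e.g. removing the 1 leaves [2, 3, 4], whose median is 3.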
/**
 * Actual experimental histograms are not able to be specified for inputs, so we just get a
 * default histogram for the input RPU geometric mean values.
 */
public static void makeHistogramsforInputRPUs(GateLibrary gate_library, String file_name_default) {
    HistogramBins hbins = new HistogramBins();
    hbins.init();

    for (int i = 0; i < gate_library.get_INPUT_NAMES().length; ++i) {
        Double log_OFF = Math.log10(gate_library.get_INPUTS_OFF().get(gate_library.get_INPUT_NAMES()[i]));
        Double log_ON  = Math.log10(gate_library.get_INPUTS_ON().get(gate_library.get_INPUT_NAMES()[i]));

        ArrayList<Double> hist_OFF = HistogramUtil.getDefaultHistgramAtSpecifiedMean(log_OFF, file_name_default);
        ArrayList<Double> hist_ON  = HistogramUtil.getDefaultHistgramAtSpecifiedMean(log_ON, file_name_default);

        double[] load_OFF = HistogramUtil.normalize(HistogramUtil.placeDataIntoBins(hist_OFF, hbins));
        double[] load_ON  = HistogramUtil.normalize(HistogramUtil.placeDataIntoBins(hist_ON, hbins));

        double[] shifted_OFF = HistogramUtil.normalizeHistogramToNewMedian(load_OFF,
                gate_library.get_INPUTS_OFF().get(gate_library.get_INPUT_NAMES()[i]), hbins);
        double[] shifted_ON = HistogramUtil.normalizeHistogramToNewMedian(load_ON,
                gate_library.get_INPUTS_ON().get(gate_library.get_INPUT_NAMES()[i]), hbins);

        gate_library.get_INPUTS_HIST_OFF().put(gate_library.get_INPUT_NAMES()[i], shifted_OFF);
        gate_library.get_INPUTS_HIST_ON().put(gate_library.get_INPUT_NAMES()[i], shifted_ON);
    }
}
def convert_for_dictarray(data, header=None, row_order=None):
    # Accept an existing DictArray, a dict-like mapping, or a series, and
    # normalise all three cases to (array, row_order, header).
    if isinstance(data, DictArray):
        header = data.template.names[0]
        row_order = data.template.names[1]
        data = data.array.copy()
    elif hasattr(data, "keys"):
        data, row_order, header = convert_dict(data, header, row_order)
    else:
        data, row_order, header = convert_series(data, header, row_order)
    return data, row_order, header
#include "StdAfx.h" #include "WatchWnd.h" CWatchWnd::CWatchWnd(CPaintManagerUI* pPaintManager) { m_pMainWndManager = pPaintManager; } CWatchWnd::CWatchWnd() { m_pMainWndManager = NULL; } CWatchWnd::~CWatchWnd(void) { } LRESULT CWatchWnd::HandleMessage(UINT uMsg, WPARAM wParam, LPARAM lParam) { LRESULT lRes = 0; BOOL bHandled = FALSE; switch( uMsg ) { case WM_TIMER: { OnTimer(uMsg, wParam, lParam, bHandled); } break; case WM_CREATE: { OnCreate(uMsg, wParam, lParam, bHandled); } break; default: bHandled = FALSE; } if( bHandled ) return lRes; return CWindowUI::HandleMessage(uMsg, wParam, lParam); } LRESULT CWatchWnd::OnCreate(UINT uMsg, WPARAM wParam, LPARAM lParam, BOOL& bHandled) { //__super::OnCreate(uMsg, wParam, lParam, bHandled); //m_PaintManager->SetMaxInfo(100, 50); SetWindowPos(GetHWND(), HWND_TOPMOST, 0,0, 400, 30, SWP_NOMOVE|SWP_NOACTIVATE); SetTimer(m_hWnd, 100, 300, NULL); return 0; } LRESULT CWatchWnd::OnTimer(UINT uMsg, WPARAM wParam, LPARAM lParam, BOOL& bHandled) { if (wParam == 100) { if (NULL == m_pMainWndManager) { return 0; } POINT pt; GetCursorPos(&pt); ScreenToClient(m_pMainWndManager->GetPaintWindow(), &pt); CControlUI* pControl = m_pMainWndManager->FindControl(pt); if (pControl) { // CDuiString strInfo; strInfo += pControl->GetClass(); strInfo += _T(" -- "); strInfo += pControl->GetName(); strInfo += _T(" #### "); strInfo += pControl->GetText(); SetWindowText(GetHWND(), strInfo); } } return 0; }
import { Component, OnInit } from '@angular/core';
import { schemaToJSON } from 'src/utils/api';
import { SingleApiService, FormattedEndpoints } from '../single-api.service';
import { splitEndpointTitle } from 'src/utils/api';

@Component({
  selector: 'app-api-page-overview',
  templateUrl: './api-page-overview.component.html',
})
export class ApiPageOverviewComponent implements OnInit {
  endpoints: FormattedEndpoints;
  example = undefined;
  exampleCode = '';

  constructor(public singleApiService: SingleApiService) {}

  ngOnInit(): void {
    this.setExampleCode();
  }

  setExampleCode(): void {
    this.endpoints = this.singleApiService.returnFormattedEndpoints();
    const [firstKey] = Object.keys(this.endpoints);

    this.exampleCode = `import m3o from '@m3o/m3o-node';

const client = new m3o.Client({ token: 'INSERT_YOUR_M3O_API_KEY_HERE' });

client.call('${this.singleApiService.service.api.name}', '${firstKey}', ${schemaToJSON(
      this.endpoints[firstKey].request
    )})
  .then(response => {
    console.log(response);
  });`;

    // Pick the first endpoint that ships with an example payload.
    for (let key in this.endpoints) {
      let value = this.endpoints[key];
      if (value['examples'] != undefined && value['examples'].length > 0) {
        this.example = value['examples'][0];
        console.log(this.example);
        return;
      }
    }
  }

  formatJSON(val: any): string {
    return JSON.stringify(val, null, 4);
  }

  formatTitle(name: string): string {
    return splitEndpointTitle(name);
  }

  getDescription(): string {
    const {
      service: {
        api: { description },
      },
    } = this.singleApiService;
    const [, desc] = description.split('Service');
    return desc.replace('\n', ' ');
  }

  getFirstEndpointName(): string {
    const [firstKey] = Object.keys(this.endpoints);
    return firstKey;
  }

  getEndpointPrice(endpoint: any) {
    let price: string = 'Free';
    const { pricing = {} } = this.singleApiService.service.api;

    Object.keys(pricing).forEach((key) => {
      if (key.includes(endpoint.key)) {
        price = `$${parseInt(pricing[key]) / 1000000}`;
      }
    });

    return price;
  }
}
package com.damon.appwheel.model.util;

import android.content.Context;
import android.os.Environment;

import java.io.File;

public class FileUtils {

    /**
     * Get the save location on the SD card.
     * @param path relative path to append
     */
    public static String getDir(Context context, String path) {
        File dir = context.getExternalFilesDir(Environment.DIRECTORY_PICTURES);
        // File dir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
        path = dir.getAbsolutePath() + "/" + path;
        return path;
    }

    /**
     * Rename any local file.
     * @param context
     * @param oldName
     * @param newName
     */
    public static void renameFileName(Context context, String oldName, String newName) {
        String dir = getDir(context, oldName);
        File oldFile = new File(dir);
        dir = getDir(context, newName);
        File newFile = new File(dir);
        oldFile.renameTo(newFile);
    }

    public static File createFile(String path) {
        File file = new File(path);
        if (!file.getParentFile().exists()) { // create the parent directory if it does not exist
            boolean isSuccess = file.getParentFile().mkdirs();
            if (!isSuccess) { // if creating the parent directory failed, bail out
                return null;
            }
        }
        return file;
    }
}
#include<bits/stdc++.h>
#define REP(x,y,z) for(int x=y;x<=z;x++)
#define MSET(x,y) memset(x,y,sizeof(x))
#define M 1005
using namespace std;

typedef tuple<int,int,int> T;

int n, m, dx[4] = {-1,1,0,0}, dy[4] = {0,0,1,-1};
char in[M][M];
bool vis[M][M];
queue<T> q;

int main()
{
    while (~scanf("%d %d", &n, &m)) {
        MSET(vis, false); // reset between test cases
        REP(i,1,n) scanf("%s", in[i]+1);

        // Multi-source BFS: every '#' cell is a source at distance 0.
        REP(i,1,n) REP(j,1,m)
            if (in[i][j] == '#') {
                q.push(make_tuple(i, j, 0));
                vis[i][j] = true;
            }

        int ans = 0;
        int x, y, z;
        while (!q.empty()) {
            tie(x, y, z) = q.front();
            q.pop();
            ans = max(ans, z);
            for (int k = 0; k < 4; k++) {
                int nx = x + dx[k];
                int ny = y + dy[k];
                if (nx >= 1 && nx <= n && ny >= 1 && ny <= m && !vis[nx][ny]) {
                    q.push(make_tuple(nx, ny, z+1));
                    vis[nx][ny] = true;
                }
            }
        }
        printf("%d\n", ans);
    }
    return 0;
}
What’s the greatest horror game of all time? I’m willing to bet that your choice was a game with a first-person, or at least tightly-cropped, perspective. Horror relies on immersion to be effective, and a P.O.V. that puts you in the action will often enhance the experience. A new game from Polish development studio Acid Wizard called Darkwood is one of a recent batch of indie horror games that shows even unusual perspectives and simplified graphics can deliver big scares. Darkwood, which I backed on Indiegogo back in 2013, puts you above the fray with a top-down perspective. While you might think that would create a large divide between the player and the protagonist, it’s surprisingly as riveting an experience as you’ll find in any first-person horror. The woods are dark and full of horrors, and they’re slowly swallowing the town and everything in it–including you, if you can’t escape. Days are spent scavenging for supplies and searching for a way out. When night falls, you hunker down in your hideout and hope the barricades and traps you’ve built are enough to keep the monsters at bay. It might be standard crafting-survival fare were it not for the fact that Darkwood is dripping with dread. The woods not only feel alive, they feel malevolent. Thick tangles of trees are oppressive and sunlight never reaches the ground. Your vision is limited to the cone of light cast by your flashlight, and as you slowly trudge on, you can’t help but anticipate something reaching out from the darkness to grab you. The footsteps around your hideout at night may signal someone coming for you, or they might only be your imagination. Which thought is more disturbing? Undoubtedly, what constitutes “disturbing” or “scary” is wildly subjective. One person’s sleepless night is another’s “Pfft, you baby, I laughed through that whole mess.” If there’s one thing we can all agree on, though, it’s that in horror, atmosphere is paramount. It’s the feeling of the thing that grabs you and stays with you. It’s the clang of falling metal in the distance as you walk down an empty hallway on the USG Ishimura in Dead Space, the dark corners and faint whispers in Fatal Frame II’s abandoned Minakami Village. There are other important factors that make up a successful horror game to be sure–a feeling of vulnerability, for example–but a healthy dose of dread goes a long way. With the right tone, games can transcend limitations of mechanics and aesthetics to deliver a true survival horror experience. One of the finest examples of this is Jasper Byrne’s Lone Survivor, a 2D, pixel art side-scroller released in 2012 that is by turns unsettling, disturbing, and intense. “You” are the protagonist, a young man in a surgical mask who just may be the only survivor of a mysterious plague that has turned the rest of the world into fleshy, zombie-like creatures. Unwilling to accept this fate, you leave your apartment in hopes of finding others who may yet live… but, as you might expect, it ain’t easy out there. During your explorations, resources such as ammo and batteries for your small flashlight are in short supply. Even worse, reality, dreams, and hallucinations intertwine and overlap, so you’re never quite sure what’s really happening. The gameplay is simple and the graphics are rudimentary, but Lone Survivor is far creepier than many of its hi-fidelity horror counterparts.
It feels like a Silent Hill game pared down to its barest essence: an unreliable narrator battling an ever-encroaching darkness, shifting architecture, flesh growing in places it really shouldn’t grow, and a plot sometimes too ambiguous for its own good. Sound design is sometimes overlooked in horror games, but ultimately it’s key to creating atmosphere. In Lone Survivor it works so well I found myself hesitating to find out what exactly was shuffling at the other end of a long, dark, pixelated hallway. If Lone Survivor wears its Silent Hill influence on its sleeve, then the 2014 game Claire wears it on the front of its T-shirt. While visiting her comatose mother in the hospital, an exhausted Claire has trouble discerning dreams and nightmares from reality… and so does the player. The hospital becomes a dark, bloody labyrinth filled with shadow monsters and survivors to be saved. All the while, Claire carries only a flashlight, relying on her dog Anubis to warn her of impending danger. She may make it out alive, but she may not make it out sane. Again, the old-school pixel graphics belie a sophisticated horror story that is intensified by the sound effects and score. Navigating a 2D environment full of doors can be frustrating (you’ll spend a lot of time checking your map to get your bearings), but it’s well worth the effort, if only so you can puzzle over who exactly lights all those candles in the dark world. Although the puzzles are often as simplistic as the art, these games can still provide a challenge. Resources are scarce, combat is slow and awkward, and as a player you never feel completely safe. Coupled with a relentlessly creepy, tense atmosphere, games like Darkwood and Lone Survivor deserve their place alongside the classics of survival horror. They’re just scaring you from a different perspective.
package com.eyr.callkeep;

import static com.eyr.callkeep.EyrCallBannerDisplayService.CHANNEL_ID_INCOMING_CALL;

import android.content.Context;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.core.app.NotificationCompat;

import java.util.HashMap;
import java.util.List;

public class EyrNotificationCompatBuilderArgSerializer {

    private final HashMap<String, Object> mArgs;

    public EyrNotificationCompatBuilderArgSerializer(HashMap<String, Object> args) {
        mArgs = args;
    }

    private void maybeAddAutoCancel(NotificationCompat.Builder builder) {
        @Nullable Boolean autoCancel = (Boolean) mArgs.get("autoCancel");
        if (autoCancel != null) {
            builder.setAutoCancel(autoCancel);
        }
    }

    private void maybeAddOngoing(NotificationCompat.Builder builder) {
        @Nullable Boolean ongoing = (Boolean) mArgs.get("ongoing");
        if (ongoing != null) {
            builder.setOngoing(ongoing); // was setAutoCancel, an apparent copy-paste slip
        }
    }

    private void maybeAddVisibility(NotificationCompat.Builder builder) {
        Double visibility = (Double) mArgs.get("visibility");
        if (visibility != null) {
            builder.setPriority(visibility.intValue());
        }
    }

    private void maybeAddCategory(NotificationCompat.Builder builder) {
        String category = (String) mArgs.get("category");
        if (category != null) {
            builder.setCategory(category);
        }
    }

    private void maybeAddTitle(NotificationCompat.Builder builder) {
        String title = (String) mArgs.get("title");
        if (title != null) {
            builder.setContentTitle(title);
        }
    }

    private void maybeAddSubtitle(NotificationCompat.Builder builder) {
        String subtitle = (String) mArgs.get("subtitle");
        if (subtitle != null) {
            builder.setContentText(subtitle);
        }
    }

    private void maybeSetSound(Context context, NotificationCompat.Builder builder) {
        String sound = (String) mArgs.get("sound");
        if (sound != null) {
            // sound handling not implemented
        }
    }

    private void maybeSetVibration(NotificationCompat.Builder builder) {
        List<?> vibrateJsonArray = (List<?>) mArgs.get("vibration");
        if (vibrateJsonArray != null) {
            builder.setVibrate(getVibrationPattern(vibrateJsonArray));
        }
    }

    private long[] getVibrationPattern(List<?> vibrateJsonArray) {
        try {
            if (vibrateJsonArray != null) {
                long[] pattern = new long[vibrateJsonArray.size()];
                for (int i = 0; i < vibrateJsonArray.size(); i++) {
                    if (vibrateJsonArray.get(i) instanceof Number) {
                        pattern[i] = ((Number) vibrateJsonArray.get(i)).longValue();
                    } else {
                        throw new Exception("Invalid vibration array");
                    }
                }
                return pattern;
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return null;
    }

    @Nullable
    public static String parseAcceptBtnTitle(HashMap<String, Object> args) {
        return (String) args.get("acceptTitle");
    }

    @Nullable
    public static String parseEndCallBtnTitle(HashMap<String, Object> args) {
        return (String) args.get("endCallTitle");
    }

    @Nullable
    public static String parseDeclineBtnTitle(HashMap<String, Object> args) {
        return (String) args.get("declineTitle");
    }

    public NotificationCompat.Builder createNotificationFromContext(Context context, @Nullable String channelId) {
        @NonNull String notificationChannelId = (String) mArgs.get("channelId");
        if (notificationChannelId == null && channelId != null) {
            notificationChannelId = channelId;
        }
        if (notificationChannelId == null) {
            notificationChannelId = CHANNEL_ID_INCOMING_CALL;
        }
        NotificationCompat.Builder builder = new NotificationCompat.Builder(context, notificationChannelId);
        builder.setSmallIcon(R.drawable.ic_notification);
        maybeAddAutoCancel(builder);
        maybeAddOngoing(builder);
        maybeAddCategory(builder);
        maybeAddVisibility(builder);
        maybeAddTitle(builder);
        maybeAddSubtitle(builder);
        maybeSetSound(context, builder);
        maybeSetVibration(builder);
        return builder;
    }
}
import argparse

parser = argparse.ArgumentParser(description='Test description')
parser.add_argument('pos1', help='positional arg #1')
parser.add_argument('pos2', help='positional arg #2')
parser.add_argument('pos3', help='positional arg #3')

args = parser.parse_args()
print(args)
//<NAME>
//CS4348 Project 2
//29 October 2015

#ifndef __FRONT_DESK__
#define __FRONT_DESK__

#include <stdio.h>
#include <stdlib.h>
#include <semaphore.h>
#include <pthread.h>
#include <string>
#include <vector>

using std::vector;

class FrontDeskEmployee;
class Bellhop;
class Guest;

#include "FrontDeskEmployee.h"
#include "Bellhop.h"
#include "Guest.h"

#define NUM_FRONTDESKEMPLOYEES 1
#define NUM_BELLHOPS 1

class FrontDesk {
public:
    FrontDesk();
    ~FrontDesk();

    static FrontDesk* get();
    static void* start(void* emptyVar);
    void start();

    void guestEnteredHotel(Guest* guestThatEntered);

    sem_t* getSem();
    sem_t* getFrontDeskEmployeeSem();
    sem_t* getBellhopSem();

    int getAvailableRoom();

    void addGuestToFrontDeskEmployeeQueue(Guest* guestToBeAdded);
    Guest* removeGuestFromFrontDeskEmployeeQueue();
    void addGuestToBellhopQueue(Guest* guestToBeAdded);
    Guest* removeGuestFromBellhopQueue();

    void quit();

private:
    sem_t frontDeskQueueBeingAccessed;
    vector<Guest*> guestsWaitingOnFrontDeskEmployee;

    sem_t bellhopQueueBeingAccessed;
    vector<Guest*> guestsWaitingOnBellhop;

    vector<FrontDeskEmployee*> frontDeskEmployees;
    pthread_t frontDeskEmployeeThreads[NUM_FRONTDESKEMPLOYEES];

    vector<Bellhop*> bellhops;
    pthread_t bellhopThreads[NUM_BELLHOPS];

    sem_t roomAvailableSem;
    int nextRoomAvailable;

    sem_t* frontDeskEmployeeNeededSem;
    sem_t* bellhopNeededSem;

    sem_t runFrontDesk;

    bool exitFlag;

    static FrontDesk* frontDeskSingleton;
};

#endif
// New initializes a new AsyncLoader from a loadAttempter function.
func New(loadAttempter LoadAttempter) *LoadAttemptQueue {
	return &LoadAttemptQueue{
		loadAttempter: loadAttempter,
	}
}
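A hedged usage sketch: myLoadAttempter below is a hypothetical value satisfying the package's LoadAttempter type (defined elsewhere in the package), shown only to illustrate the constructor's wiring.

// Inside some setup function:
// queue := New(myLoadAttempter) // queue is a *LoadAttemptQueue, ready to process load attempts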
/**
 * Split the TableModel into 2 TableModels according to the attribute splitAt.
 *
 * @param tableModel
 * @return
 */
private TableModel[] splitTableModel(TableModel tableModel) {
    TableModel[] tableModels_splited = new TableModel[2];
    TableModel tableModel_left  = (TableModel) (tableModel.clone());
    TableModel tableModel_right = (TableModel) (tableModel.clone());

    List headerCellList_left  = sliceList(tableModel_left.getHeaderCellList(), 0, this.splitAt);
    List headerCellList_right = sliceList(tableModel_right.getHeaderCellList(), splitAt,
            tableModel_right.getHeaderCellList().size());

    tableModel_left.setHeaderCellList(headerCellList_left);
    tableModel_right.setHeaderCellList(headerCellList_right);

    tableModels_splited[0] = tableModel_left;
    tableModels_splited[1] = tableModel_right;
    return tableModels_splited;
}
# For each test case: sort the values and check that they come in equal
# pairs, and that each pair multiplied with its opposite pair (smallest
# with largest) gives the same product u.
for k in range(int(input())):
    n = int(input())
    t = list(map(int, input().split()))
    t.sort()
    u = t[1] * t[-1]
    d = 0
    for j in range(1, n * 2, 2):
        if t[j] != t[j - 1] or t[-j] != t[-(j + 1)] or t[j] * t[-j] != u:
            d = 1
    if d == 0:
        print('YES')
    else:
        print('NO')
// Chi-Square Statistic for Poisson distribution
public static double chiSquare(double[] observed, double[] expected, double[] variance) {
    int nObs = observed.length;
    int nExp = expected.length;
    int nVar = variance.length;
    if (nObs != nExp) throw new IllegalArgumentException("observed array length does not equal the expected array length");
    if (nObs != nVar) throw new IllegalArgumentException("observed array length does not equal the variance array length");
    double chi = 0.0D;
    for (int i = 0; i < nObs; i++) {
        chi += Fmath.square(observed[i] - expected[i]) / variance[i];
    }
    return chi;
}
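The statistic computed above is chi^2 = sum_i (O_i - E_i)^2 / var_i, the form used for Poisson data where each variance stands in for the expected count. A small hypothetical call, assuming this class and Fmath are on the classpath:

double[] observed = {12.0,  9.0, 11.0};
double[] expected = {10.0, 10.0, 10.0};
double[] variance = {10.0, 10.0, 10.0}; // for Poisson data the variance equals the mean
double chi = chiSquare(observed, expected, variance);
// chi == (4.0 + 1.0 + 1.0) / 10.0 == 0.6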
<gh_stars>0 package uhppoted import ( "errors" "fmt" "time" "github.com/uhppoted/uhppote-core/types" ) const ROLLOVER = uint32(100000) type GetEventRangeRequest struct { DeviceID DeviceID Start *types.DateTime End *types.DateTime } type GetEventRangeResponse struct { DeviceID DeviceID `json:"device-id,omitempty"` Dates *DateRange `json:"dates,omitempty"` Events *EventRange `json:"events,omitempty"` } type GetEventRequest struct { DeviceID DeviceID EventID uint32 } type GetEventResponse struct { DeviceID DeviceID `json:"device-id"` Event Event `json:"event"` } // Request definition for record-special-events API type RecordSpecialEventsRequest struct { DeviceID DeviceID Enable bool } // Response definition for record-special-events API type RecordSpecialEventsResponse struct { DeviceID DeviceID Enable bool Updated bool } type Event struct { Index uint32 `json:"event-id"` Type uint8 `json:"event-type"` Granted bool `json:"access-granted"` Door uint8 `json:"door-id"` Direction uint8 `json:"direction"` CardNumber uint32 `json:"card-number"` Timestamp types.DateTime `json:"timestamp"` Reason uint8 `json:"event-reason"` } func (u *UHPPOTED) GetEventRange(request GetEventRangeRequest) (*GetEventRangeResponse, error) { u.debug("get-events", fmt.Sprintf("request %+v", request)) devices := u.UHPPOTE.DeviceList() device := uint32(request.DeviceID) start := request.Start end := request.End rollover := ROLLOVER if d, ok := devices[device]; ok { if d.RolloverAt() != 0 { rollover = d.RolloverAt() } } f, err := u.UHPPOTE.GetEvent(device, 0) if err != nil { return nil, fmt.Errorf("%w: %v", InternalServerError, fmt.Errorf("Error getting first event index from %v (%w)", device, err)) } l, err := u.UHPPOTE.GetEvent(device, 0xffffffff) if err != nil { return nil, fmt.Errorf("%w: %v", InternalServerError, fmt.Errorf("Error getting last event index from %v (%w)", device, err)) } if f == nil && l != nil { return nil, fmt.Errorf("%w: %v", InternalServerError, fmt.Errorf("Error getting first event index from %v (%w)", device, errors.New("Record not found"))) } else if f != nil && l == nil { return nil, fmt.Errorf("%w: %v", InternalServerError, fmt.Errorf("Error getting last event index from %v (%w)", device, errors.New("Record not found"))) } // The indexing logic below 'decrements' the index from l(ast) to f(irst) assuming that the on-device event store has // a circular event buffer of size ROLLOVER. The logic assumes the events are ordered by datetime, which is reasonable // but not necessarily true e.g. if the start/end interval includes a significant device time change. 
    var first *types.Event
    var last *types.Event
    var dates *DateRange
    var events *EventRange

    if f == nil || l == nil {
        if start != nil || end != nil {
            dates = &DateRange{
                Start: start,
                End:   end,
            }
        }
        events = &EventRange{}
    } else {
        if start != nil || end != nil {
            index := EventIndex(l.Index)
            for {
                record, err := u.UHPPOTE.GetEvent(device, uint32(index))
                if err != nil {
                    return nil, fmt.Errorf("%w: %v", InternalServerError, fmt.Errorf("Error getting event for index %v from %v (%w)", index, device, err))
                }

                if in(record, start, end) {
                    if last == nil {
                        last = record
                    }
                    first = record
                } else if first != nil || last != nil {
                    break
                }

                if uint32(index) == f.Index {
                    break
                }

                index = index.decrement(rollover)
            }

            dates = &DateRange{
                Start: start,
                End:   end,
            }

            if first != nil && last != nil {
                events = &EventRange{
                    First: &first.Index,
                    Last:  &last.Index,
                }
            }
        } else {
            events = &EventRange{
                First: &f.Index,
                Last:  &l.Index,
            }
        }
    }

    response := GetEventRangeResponse{
        DeviceID: DeviceID(device),
        Dates:    dates,
        Events:   events,
    }

    u.debug("get-events", fmt.Sprintf("response %+v", response))

    return &response, nil
}

func in(record *types.Event, start, end *types.DateTime) bool {
    if start != nil && time.Time(record.Timestamp).Before(time.Time(*start)) {
        return false
    }

    if end != nil && time.Time(record.Timestamp).After(time.Time(*end)) {
        return false
    }

    return true
}

func (u *UHPPOTED) GetEvent(request GetEventRequest) (*GetEventResponse, error) {
    u.debug("get-events", fmt.Sprintf("request %+v", request))

    device := uint32(request.DeviceID)
    eventID := request.EventID

    record, err := u.UHPPOTE.GetEvent(device, eventID)
    if err != nil {
        return nil, fmt.Errorf("%w: %v", InternalServerError, fmt.Errorf("Error getting event for ID %v from %v (%w)", eventID, device, err))
    }

    if record == nil {
        return nil, fmt.Errorf("%w: %v", NotFound, fmt.Errorf("No event record for ID %v for %v", eventID, device))
    }

    if record.Index != eventID {
        return nil, fmt.Errorf("%w: %v", NotFound, fmt.Errorf("No event record for ID %v for %v", eventID, device))
    }

    response := GetEventResponse{
        DeviceID: DeviceID(record.SerialNumber),
        Event: Event{
            Index:      record.Index,
            Type:       record.Type,
            Granted:    record.Granted,
            Door:       record.Door,
            Direction:  record.Direction,
            CardNumber: record.CardNumber,
            Timestamp:  record.Timestamp,
            Reason:     record.Reason,
        },
    }

    u.debug("get-event", fmt.Sprintf("response %+v", response))

    return &response, nil
}

// Unwraps the request and dispatches the corresponding controller command to enable or disable
// door open, door close and door button press events for the controller.
func (u *UHPPOTED) RecordSpecialEvents(request RecordSpecialEventsRequest) (*RecordSpecialEventsResponse, error) {
    u.debug("record-special-events", fmt.Sprintf("request %+v", request))

    device := uint32(request.DeviceID)
    enable := request.Enable

    updated, err := u.UHPPOTE.RecordSpecialEvents(device, enable)
    if err != nil {
        return nil, fmt.Errorf("%w: %v", InternalServerError, fmt.Errorf("Error updating 'record special events' flag for %v (%w)", device, err))
    }

    response := RecordSpecialEventsResponse{
        DeviceID: DeviceID(device),
        Enable:   enable,
        Updated:  updated,
    }

    u.debug("record-special-events", fmt.Sprintf("response %+v", response))

    return &response, nil
}
// RetryProcessingTask processes tasks in Processing status when restarting the server.
// This step is necessary because when participants abnormally exit the computation process,
// a task may otherwise remain in the Processing stage forever.
func (t *TaskMonitor) RetryProcessingTask(ctx context.Context) {
    logger.Info("processing tasks retry execution start")

    t.doneRetryReqC = make(chan struct{})
    defer close(t.doneRetryReqC)
    defer logger.Info("processing tasks retry end")

    taskList, err := t.Blockchain.ListTask(&blockchain.ListFLTaskOptions{
        PubKey:    t.PublicKey[:],
        Status:    blockchain.TaskProcessing,
        TimeStart: 0,
        TimeEnd:   time.Now().UnixNano(),
    })
    if err != nil {
        logger.WithError(err).Error("failed to find TaskProcessing task list")
        return
    }
    if len(taskList) == 0 {
        logger.WithField("amount", len(taskList)).Debug("no processing task found")
        return
    }

    for _, task := range taskList {
        // bail out early if the surrounding context has been cancelled
        select {
        case <-ctx.Done():
            return
        default:
        }

        startRequest, err := t.MpcHandler.TaskStartPrepare(task)
        if err != nil {
            logger.WithError(err).Errorf("error occurred when retry prepare task, and taskId: %s", task.ID)
            continue
        }
        logger.Infof("retry start Processing task, taskId: %s", task.ID)
        if err := t.MpcHandler.StartLocalMpcTask(startRequest, true); err != nil {
            logger.WithError(err).Errorf("error occurred when retry execute task, and taskId: %s", task.ID)
            continue
        }
    }

    logger.WithFields(logrus.Fields{
        "task_len": len(taskList),
        "end_time": time.Now().Format("2006-01-02 15:04:05"),
    }).Info("tasks retry execution finished")
}
/**
 * @author Rhett Sutphin
 */
public class DelegatedCredentialAcquirer {
    private final Logger log = LoggerFactory.getLogger(getClass());

    private String xml;
    private String hostCertificateFilename;
    private String hostKeyFilename;

    public DelegatedCredentialAcquirer(String xml, String hostCertificateFilename, String hostKeyFilename) {
        this.xml = xml;
        this.hostCertificateFilename = hostCertificateFilename;
        this.hostKeyFilename = hostKeyFilename;
    }

    protected GlobusCredential acquire() throws Exception {
        log.debug("Attempting to load delegated credential");
        log.trace("- Building host credential out of cert={} and key={}", hostCertificateFilename, hostKeyFilename);
        GlobusCredential hostCredential = new GlobusCredential(hostCertificateFilename, hostKeyFilename);
        log.trace("* hostCredential={}", hostCredential);
        log.trace("- Deserializing reference \n{}", xml);
        DelegatedCredentialReference delegatedCredentialReference =
            (DelegatedCredentialReference) Utils.deserializeObject(
                new StringReader(xml), DelegatedCredentialReference.class,
                CredentialDelegationServiceClient.class.getResourceAsStream("client-config.wsdd"));
        log.trace("* reference={}", delegatedCredentialReference);
        log.trace("* reference.endpointReference={}", delegatedCredentialReference.getEndpointReference());
        log.trace("* reference.endpointReference.address={}", delegatedCredentialReference.getEndpointReference().getAddress());
        log.trace("* reference.endpointReference.address.host={}", delegatedCredentialReference.getEndpointReference().getAddress().getHost());
        log.trace("* reference.endpointReference.address.path={}", delegatedCredentialReference.getEndpointReference().getAddress().getPath());
        log.trace("- Getting delegated credential from reference");
        DelegatedCredentialUserClient delegatedCredentialUserClient =
            new DelegatedCredentialUserClient(delegatedCredentialReference, hostCredential);
        GlobusCredential userCredential = delegatedCredentialUserClient.getDelegatedCredential();
        log.trace("* userCredential={}", userCredential);
        log.trace("* uc.identity={}", userCredential.getIdentity());
        log.trace("* uc.issuer={}", userCredential.getIssuer());
        log.trace("* uc.subject={}", userCredential.getSubject());
        return userCredential;
    }
}
/**
 * This is a bogus key class that returns random hash values from {@link #hashCode()} and always
 * returns {@code false} for {@link #equals(Object)}. The results of the test are correct if the
 * runner correctly hashes and sorts on the encoded bytes.
 */
static class BadEqualityKey {
    long key;

    public BadEqualityKey() {}

    public BadEqualityKey(long key) {
        this.key = key;
    }

    @Override
    public boolean equals(Object o) {
        return false;
    }

    @Override
    public int hashCode() {
        return ThreadLocalRandom.current().nextInt();
    }
}
#include <stdio.h>

int main() {
    long long n, m, equip = 0;
    /* portable format specifiers and operators: the original used the
       MSVC-only "%I64d" and the `and` keyword, and printed a long long
       with "%d" */
    scanf("%lld%lld", &n, &m);
    while (m > n && n > 0) {
        n--;
        m -= 2;
        equip++;
    }
    while (n > m && m > 0) {
        m--;
        n -= 2;
        equip++;
    }
    if (n == m)
        equip += (n + m) / 3;
    printf("%lld", equip);
    return 0;
}
/**
 * Returns the plan tree for an inline view ref.
 */
private PlanNode createInlineViewPlan(Analyzer analyzer, InlineViewRef inlineViewRef)
        throws NotImplementedException, InternalException {
    // Gather unassigned conjuncts that can be evaluated against the view's
    // materialized tuples, unless the view has a LIMIT clause (pushing
    // predicates below a LIMIT would change the result).
    List<Expr> conjuncts = Lists.newArrayList();
    if (!inlineViewRef.getViewStmt().hasLimitClause()) {
        for (Expr e : analyzer.getUnassignedConjuncts(inlineViewRef.getMaterializedTupleIds())) {
            if (canEvalPredicate(inlineViewRef.getMaterializedTupleIds(), e, analyzer)) {
                conjuncts.add(e);
            }
        }
        inlineViewRef.getAnalyzer().registerConjuncts(conjuncts);
        analyzer.markConjunctsAssigned(conjuncts);
    }

    // A constant select (no table refs) is planned as a MergeNode over the
    // statement's constant result exprs.
    QueryStmt viewStmt = inlineViewRef.getViewStmt();
    if (viewStmt instanceof SelectStmt) {
        SelectStmt selectStmt = (SelectStmt) viewStmt;
        if (selectStmt.getTableRefs().isEmpty()) {
            Preconditions.checkState(inlineViewRef.getMaterializedTupleIds().size() == 1);
            MergeNode mergeNode = new MergeNode(new PlanNodeId(nodeIdGenerator),
                    inlineViewRef.getMaterializedTupleIds().get(0));
            mergeNode.getConstExprLists().add(selectStmt.getResultExprs());
            mergeNode.getConjuncts().addAll(conjuncts);
            return mergeNode;
        }
    }

    return createQueryPlan(inlineViewRef.getViewStmt(), inlineViewRef.getAnalyzer(), -1);
}
module Main where

import Lib
import System.IO
import Control.Monad

files :: [String]
-- files = ["lc-src/lc1.lc0", "lc-src/test.lc1"]
files = ["lc-src/test.lc0"]

main :: IO ()
main = do
    contents <- mapM get_content files
    printResult True $ execChain contents
  where
    get_content s = openFile s ReadMode >>= hGetContents

execChain :: [String] -> Result Lc0Expr
execChain srcs = do
    translated <- foldM translate l_id srcs
    interpret translated
  where
    l_id = LArg "x" (Var "x")
    translate translator src =
        Trace "<<<<< FILE >>>>>" $
            interpret (translation translator src) >>= parseProg "intermediate" . decode
/**
 * Brings up a dialog where the number of choices is dependent on the
 * value of the optiontype_ parameter.
 *
 * @param parent_ Determines the frame in which the dialog is displayed.
 * @param message_ the String to display.
 * @param title_ the title of the dialog.
 * @param optiontype_ must be YES_NO_OPTION or YES_NO_CANCEL_OPTION.
 */
public static int showConfirmDialog(Component parent_, Object message_, String title_, int optiontype_) {
    JOptionPane pane = new JOptionPane(message_, PLAIN_MESSAGE, optiontype_);
    Popup dialog = (Popup) pane.createDialog(parent_, title_);
    dialog.show();
    return ((Integer) pane.getValue()).intValue();
}
def _show_status(self, view):
    view.set_status(
        'anaconda_doc', 'Anaconda: {}'.format(self.signature)
    )
from sys import stdin

t = int(stdin.readline())
for _ in xrange(t):
    n, m, x, y = map(int, stdin.readline().split())
    # the farthest cell from (x, y) on an n-by-m grid is always one of the four corners
    corners = [[1, 1], [1, m], [n, 1], [n, m]]
    ans = 0
    for u, v in corners:
        ans = max(ans, abs(u - x) + abs(v - y))
    print ans
def analyze(self, page_url, html):
    soup = BeautifulSoup(html)
    triples = []
    for link_type, element_name, attrs, attribute_name in self.link_types:
        triples.extend(
            [
                (page_url, link_type, self.extract_link(page_url, element, attribute_name))
                for element in soup.find_all(element_name, attrs=attrs)
            ]
        )
    return list(filter(lambda t: t[2] is not None, triples))
// datatypes/Array.ts
import { Encoder } from "../Encoder.ts";
import { Decoder } from "../Decoder.ts";
import {
  UINT_16_MAX_VALUE,
  UINT_32_MAX_VALUE,
  UINT_8_MAX_VALUE,
} from "../_util.ts";
import { DataType } from "../DataType.ts";

export const fixedArrayDataType = (length: number) =>
  new (class FixedArrayDataType extends DataType {
    test(data: unknown) {
      return data instanceof Array && data.length === length;
    }
    encode(encoder: Encoder, data: Array<unknown>) {
      return data.reduce(
        (buffer: ArrayBuffer, item: unknown) =>
          encoder.combineBuffers(buffer, encoder.encode(item)),
        new ArrayBuffer(0),
      );
    }
    decode(decoder: Decoder) {
      const array: unknown[] = [];
      let len = length;
      while (len--) {
        const item = decoder.next();
        array.push(item);
      }
      return array;
    }
  })();

export class Array8DataType extends DataType {
  test(data: unknown) {
    return data instanceof Array && data.length <= UINT_8_MAX_VALUE;
  }
  encode(encoder: Encoder, data: Array<unknown>) {
    const length = data.length;
    return data.reduce(
      (buffer: ArrayBuffer, item: unknown) =>
        encoder.combineBuffers(buffer, encoder.encode(item)),
      encoder.uInt8ToBuffer(length),
    );
  }
  decode(decoder: Decoder) {
    let length = decoder.stepUint8();
    const array: unknown[] = [];
    while (length--) {
      const item = decoder.next();
      array.push(item);
    }
    return array;
  }
}

export class Array16DataType extends DataType {
  test(data: unknown) {
    return data instanceof Array && data.length <= UINT_16_MAX_VALUE;
  }
  encode(encoder: Encoder, data: Array<unknown>) {
    const length = data.length;
    return data.reduce(
      (buffer: ArrayBuffer, item: unknown) =>
        encoder.combineBuffers(buffer, encoder.encode(item)),
      encoder.uInt16ToBuffer(length),
    );
  }
  decode(decoder: Decoder) {
    let length = decoder.stepUint16();
    const array: unknown[] = [];
    while (length--) {
      const item = decoder.next();
      array.push(item);
    }
    return array;
  }
}

export class Array32DataType extends DataType {
  test(data: unknown) {
    return data instanceof Array && data.length <= UINT_32_MAX_VALUE;
  }
  encode(encoder: Encoder, data: Array<unknown>) {
    const length = data.length;
    return data.reduce(
      (buffer: ArrayBuffer, item: unknown) =>
        encoder.combineBuffers(buffer, encoder.encode(item)),
      encoder.uInt32ToBuffer(length),
    );
  }
  decode(decoder: Decoder) {
    let length = decoder.stepUint32();
    const array: unknown[] = [];
    while (length--) {
      const item = decoder.next();
      array.push(item);
    }
    return array;
  }
}
Variety had a guest shot in Sunday's episode of HBO's "Boardwalk Empire" in a storyline involving our famous 1931 interview with Al Capone.

The June 30, 1931, edition of Variety featured the banner story "Capone Kids Gang Films," written by staff scribe Lou Greenspan. The story detailed Capone's amused reaction to the gangster films that were then flooding theaters, including now-classics "Scarface" and "Public Enemy."

Played by British thesp Stephen Graham (pictured far right), Al Capone has been a regular character in the Prohibition-era drama that revolves around bootleggers and mobsters in Atlantic City. In the second episode of the show's final season, "The Good Listener," written by series creator Terence Winter and directed by Allen Coulter, the Variety interview is depicted as taking place in Capone's Chicago headquarters while the notoriously fashion-conscious Little Caesar is surrounded by henchmen and getting fitted for a suit.

The enterprising reporter behind the story was Greenspan, who worked out of Variety's Hollywood bureau (this was two years before Daily Variety launched in Hollywood) but had deep roots in Chicago. Winter noted that several lines of dialogue for the episode were taken directly from the article.

Greenspan revealed that Capone scoffed at the slew of mob-focused movies and books that were popular in the day, thanks in no small part to Capone's growing celebrity. He noted that Capone's many bodyguards and assistants called their boss "Snorky," and that Capone had pictures of George Washington and Abraham Lincoln hanging on the wall behind his desk.

Capone bragged to Greenspan that he'd been approached "many times" to star in a movie but had no interest. "I wouldn't go into a picture for all the money in the world," Capone said. He also used Variety to send a message to one particular book writer who claimed to have gotten the inside dope on the Capone operation from the boss himself. "If you ever meet that guy give him a punch in the nose with my compliments," Capone instructed Greenspan.

A few months after the interview, Capone was in the clink following his conviction on tax evasion charges. And Lou Greenspan? He left Variety a few years later to go into marketing and publicity.

Here's the original story in its entirety:
def check(func, kwargs=None):
    d = dict() if kwargs is None else dict(kwargs)
    kwargs = {
        'body_formats': ['text', 'haiku'],
        'count': 1,
        'eid': '123',
        'media': 'album',
        'page': 1,
        'since': datetime.datetime(2010, 1, 1, 0, 0, 0),
        'url_name': 'me',
        'sort': 'hot',
        'without_related_keywords': True,
        'word': 'BOT',
        'word1': 'BOT1',
        'word2': 'BOT2',
    }
    kwargs.update(d)
    required, optional = find_args(func, kwargs)
    for keys in all_combinations(optional):
        func(**{kw: kwargs[kw] for kw in required | set(keys)})
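The snippet leans on two helpers that are not shown. A minimal sketch of what they might look like follows; both the names' behaviors here are assumptions inferred from how check() calls them, not the original implementations. all_combinations is taken to yield every subset of the optional parameter names, and find_args to split a function's parameters into required and optional sets via introspection:

import inspect
from itertools import combinations

def all_combinations(items):
    # assumed helper: yield every subset (of every size) of items
    items = list(items)
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            yield combo

def find_args(func, kwargs):
    # assumed helper: split func's parameters into required / optional sets,
    # keeping only names that can actually be supplied from kwargs
    sig = inspect.signature(func)
    required, optional = set(), set()
    for name, param in sig.parameters.items():
        if name not in kwargs:
            continue
        if param.default is inspect.Parameter.empty:
            required.add(name)
        else:
            optional.add(name)
    return required, optional

With helpers like these, check() exercises func once for every combination of its optional arguments, which is a cheap way to smoke-test an API wrapper's keyword handling.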
/*
 * Make array value
 *
 * This function returns a new array value representing
 * an empty array.
 */
value *value_make_array(void)
{
    value *copy;

    copy = value_alloc(VALUE_TYPE_ARRAY);

    return copy;
}
Purification of bovine and human retinal S-antigen using immunoabsorbent polymer particles.

Bovine retinal S-antigen was prepared using gel filtration chromatography followed by DEAE A-50 or QAE A-50 anion-exchange chromatography. The final purification was performed using immunoadsorbents made from polymerized polyvalent antiserum (rabbit) to bovine serum components. The purity of the antigen was confirmed by polyacrylamide gel electrophoresis, double diffusion according to Ouchterlony, immunoblotting and by producing monospecific antiserum to the retinal S-antigen. Both S-antigen preparations (DEAE and QAE) proved to be highly uveitogenic, causing experimental allergic uveitis in guinea pigs within 14 days of immunization. DEAE separated the antigen into three protein peaks but QAE only into one distinct protein peak. All these protein peaks were S-antigen-active and the yield was about the same using both separation systems. After optimizing the purification for bovine retinas, human retinal S-antigen was also prepared.
The Brooklyn Nets lost Jeremy Lin to an ankle injury. The loss of Lin did not prevent the team from playing as hard as they could, once cutting the deficit from eighteen points to only one late in the fourth quarter. The Nets were sparked by another good Brook Lopez performance and yet another impressive game from Spencer Dinwiddie off the bench. If it wasn't for J.J. Barea and Dirk Nowitzki coming through in the fourth quarter, Brooklyn could've possibly pulled this one out without Lin's steady hand.

Jeremy Lin Is Hurt, Again, Sort Of

Every time Jeremy Lin hits the floor, I hold my breath. It is always the worst case scenario if he gets hurt, so when he landed on someone's foot and was later not seen on the bench, the writing was on the wall. Brooklyn soon announced he would not return to the game due to a minor ankle injury. Over 24 hours later, we have since learned that Lin is basically day-to-day and won't play tomorrow versus the Detroit Pistons. In his place, Spencer Dinwiddie will start.

As far as worst case scenarios go, this isn't too bad. While seeing Lin miss any games due to injury after a season lost to a bum hamstring isn't ideal, having him listed as day-to-day is a relief. Sprained ankles happen all the time in basketball; it's the equivalent of falling and scraping your knee as a child. Lin isn't in bad spirits about this, the team looks to have this under control, and at the end of the day these games are pretty pointless outside of player development. Having Lin on the court has done wonders for LeVert's game and he always sets up Rondae Hollis-Jefferson with room service dimes. Having him off the court will do no favors for player development, but it is better than having him out for the rest of the season with yet another hamstring injury.

Brook's Big Night Was Almost Enough To Spark A Comeback

Brook Lopez is easily one of the most underappreciated players in the league. Fans have been calling for his trade for years, and all he has done in response is become the second-leading scorer in franchise history. Last night he scored 27 points while shooting eight of eighteen from the floor and three of six from three, while pulling down seven rebounds. Brook's best moment came on a five-point sequence late in the game. He finished an earth-shattering alley-oop dunk on one end and then hit a corner three the following possession to further swing momentum in favor of the Nets. Not to be lost in the shuffle, newly signed guard Archie Goodwin put the clamps down on Harrison Barnes. His defense led to the Brook corner three after the amazing dunk.

As Lin was sidelined with the ankle injury, Brook was instrumental in the Nets' comeback attempt. Much like the Goonies, this team never says die. Lin could very well be out more than just the Detroit game, so Brook will need to come through like this more and more. He has proven he can get these numbers routinely; hopefully it'll be enough to keep Brooklyn afloat without Lin.

Spencer Dinwiddie Has Been On Fire Lately

While Brooklyn is without Lin, they will rely on the streaking Spencer Dinwiddie. The former D-Leaguer has gone from starting games for Windy City to starting games for Brooklyn. As of late, Dinwiddie has been looking incredible. After starting the year as an unreliable three-point shooter, he has since become a threat from deep, upping his season average to almost 40 percent. Against Dallas he had yet another good game, finishing with 18 points on five of nine shooting.
He also dished out seven assists, pulled in two rebounds, and had one steal in 30 minutes of action. This was also a good look at what a unit featuring Dinwiddie at the point would look like for an extended period of time. With Lin's injury, Dinwiddie will be thrust into the starting lineup for the foreseeable future. His length and newfound shooting stroke are exactly what Sean Marks and Kenny Atkinson are looking for in their point guards. It was exactly why they chose him over Yogi Ferrell. If he can avoid turnovers and keep playing with the same enthusiasm that has helped him get to where he is now, he should do just fine in Lin's absence.
// technicalheist/react-js-pwa-example
import React, { Component } from 'react'

export default class Dashboard extends Component<any, any> {
    constructor(props: any) {
        super(props);
        this.state = {
            name: localStorage.getItem('displayName'),
            providerId: localStorage.getItem('providerId'),
            user_id: localStorage.getItem('user_id'),
            email: localStorage.getItem('email'),
            photoURL: localStorage.getItem('photoURL'),
        }
    }

    render() {
        return (
            <React.Fragment>
                <div className="section no-pad-bot" id="index-banner">
                    <div className="container">
                        <br />
                        <div className="row center">
                            <img alt="profile" src={this.state.photoURL} width="250px" height="auto" />
                        </div>
                        <h1 className="header center orange-text">Welcome dear {this.state.name}</h1>
                        <div className="row center">
                            <h5 className="header col s12 light">
                                Your email is <b>{this.state.email}</b> and user_id is <b>{this.state.user_id}</b>
                            </h5>
                        </div>
                        <br /><br />
                    </div>
                </div>
            </React.Fragment>
        )
    }
}
/**
 * Takes in a string holding the building code and the room number,
 * where the two are separated by a space, and splits them into their
 * own Strings and places them in a String array, building followed by
 * room. If the passed in String is malformed, this returns null.
 *
 * @param location String holding the building id and room number.
 * @return 2 element String array with the building code, followed by
 *         the room number. If the String passed in is malformed, returns
 *         null.
 */
public static String[] splitBuildingRoom(String location) {
    String[] theLocation = location.split(" ");
    if (theLocation.length == 2) {
        return theLocation;
    }
    return null;
}
//! Random utility functions go here. //! //! Every project ends up with a few.
package de.tinf13aibi.cardboardbro.Engine;

import de.tinf13aibi.cardboardbro.Entities.Lined.PolyLineEntity;
import de.tinf13aibi.cardboardbro.Geometry.Simple.Vec3d;
import de.tinf13aibi.cardboardbro.Shader.Programs;
import de.tinf13aibi.cardboardbro.Shader.ShaderCollection;

/**
 * Created by dthom on 24.01.2016.
 */
public class StateWaitForBeginPolyLinePoint extends StateBase implements IState {
    public StateWaitForBeginPolyLinePoint(DrawingContext drawingContext) {
        super(drawingContext);
    }

    @Override
    public void processOnDrawEye(float[] view, float[] perspective, float[] lightPosInEyeSpace) {
        super.processOnDrawEye(view, perspective, lightPosInEyeSpace);
    }

    @Override
    public void processOnNewFrame(float[] headView, Vec3d armForwardVec) {
        super.processOnNewFrame(headView, armForwardVec);
        mUser.calcArmPointingAt(mDrawing.getEntityListWithFloorAndCanvas());
    }

    @Override
    public void processInputAction(InputAction inputAction) {
        switch (inputAction) {
            case DoEndSelect:
                drawPolyLineBegin(mUser.getArmCrosshair().getPosition());
                break;
            case DoStateBack:
                drawPolyLineLeave();
                break;
        }
    }

    // Draw PolyLine
    private void drawPolyLineBegin(Vec3d point) {
        mDrawingContext.setEditingEntity(new PolyLineEntity(ShaderCollection.getProgram(Programs.LineProgram)));
        ((PolyLineEntity) mDrawingContext.getEditingEntity()).addVert(point);
        ((PolyLineEntity) mDrawingContext.getEditingEntity()).addVert(point);
        mDrawing.getEntityList().add(0, mDrawingContext.getEditingEntity()); // draw lines first, otherwise they are invisible
        changeState(new StateWaitForNextPolyLinePoint(mDrawingContext), "Begin PolyLine");
    }

    private void drawPolyLineLeave() {
        mDrawingContext.setEditingEntity(null);
        changeState(new StateSelectEntityToCreate(mDrawingContext), "Leave PolyLine Mode");
    }
}
/* Generated by RuntimeBrowser
   Image: /System/Library/PrivateFrameworks/BookDataStore.framework/BookDataStore
 */

@interface BCCloudKitDatabaseController : NSObject <BCCloudDataPrivacyDelegate> {
    NSObject<OS_dispatch_queue> *_accessQueue;
    NSURL *_archiveURL;
    bool _attachedToContainer;
    double _backOffInterval;
    NSMutableSet *_changedRecordZoneIDs;
    BDSCoalescingCallBlock *_coalescedArchive;
    BDSCoalescingCallBlock *_coalescedZoneFetch;
    CKContainer *_container;
    NSString *_containerIdentifier;
    CKDatabase *_database;
    NSArray *_desiredRecordZoneIDs;
    bool _fetchRecordZoneChangesSuccess;
    bool _hasSubscription;
    NSMutableDictionary *_observers;
    NSData *_recordIDSalt;
    NSMutableDictionary *_recordZones;
    CKServerChangeToken *_serverChangeToken;
    bool _serverFetchPostponed;
    NSString *_subscriptionID;
    NSMutableDictionary *_tokenStores;
    CKRecordID *_userRecordID;
}

@property (nonatomic, retain) NSObject<OS_dispatch_queue> *accessQueue;
@property (nonatomic, copy) NSURL *archiveURL;
@property (nonatomic) bool attachedToContainer;
@property (nonatomic) double backOffInterval;
@property (nonatomic, retain) NSMutableSet *changedRecordZoneIDs;
@property (nonatomic, retain) BDSCoalescingCallBlock *coalescedArchive;
@property (nonatomic, retain) BDSCoalescingCallBlock *coalescedZoneFetch;
@property (nonatomic, retain) CKContainer *container;
@property (nonatomic, copy) NSString *containerIdentifier;
@property (nonatomic, retain) CKDatabase *database;
@property (nonatomic, copy) NSArray *desiredRecordZoneIDs;
@property (nonatomic) bool fetchRecordZoneChangesSuccess;
@property (nonatomic) bool hasSubscription;
@property (nonatomic, retain) NSMutableDictionary *observers;
@property (nonatomic, retain) NSData *recordIDSalt;
@property (nonatomic, retain) NSMutableDictionary *recordZones;
@property (nonatomic, retain) CKServerChangeToken *serverChangeToken;
@property (nonatomic) bool serverFetchPostponed;
@property (nonatomic, copy) NSString *subscriptionID;
@property (nonatomic, retain) NSMutableDictionary *tokenStores;
@property (nonatomic, retain) CKRecordID *userRecordID;

+ (id)decodeRecordFromSystemFields:(id)arg1;
+ (id)encodeRecordSystemFields:(id)arg1;

- (void).cxx_destruct;
- (void)_deleteRecordZonesWithIDs:(id)arg1 qualityOfService:(long long)arg2 completion:(id /* block */)arg3;
- (id)accessQueue;
- (void)addObserver:(id)arg1 recordType:(id)arg2;
- (id)archiveURL;
- (void)attachToZones:(id)arg1 completion:(id /* block */)arg2;
- (bool)attachedToContainer;
- (double)backOffInterval;
- (id)changedRecordZoneIDs;
- (id)coalescedArchive;
- (id)coalescedZoneFetch;
- (void)connectUserTo:(id)arg1 container:(id)arg2 updateSubscription:(bool)arg3 completion:(id /* block */)arg4 subscriptionCompletion:(id /* block */)arg5;
- (id)container;
- (id)containerIdentifier;
- (id)database;
- (id)desiredRecordZoneIDs;
- (void)detach;
- (void)detachWithCompletion:(id /* block */)arg1;
- (bool)establishedSalt;
- (void)fetchChangesWithCompletion:(id /* block */)arg1;
- (void)fetchRecordForRecordID:(id)arg1 completion:(id /* block */)arg2;
- (bool)fetchRecordZoneChangesSuccess;
- (void)getAttached:(id /* block */)arg1;
- (bool)hasSubscription;
- (id)initWithSubscriptionID:(id)arg1 archiveURL:(id)arg2;
- (id)observers;
- (id)p_archiveToData;
- (void)p_createRecordIDSaltWithCompletion:(id /* block */)arg1;
- (void)p_createRecordZones:(id)arg1 completionHandler:(id /* block */)arg2;
- (void)p_fetchDatabaseChanges:(id /* block */)arg1;
- (void)p_fetchRecordZoneChanges:(id)arg1 optionsByRecordZoneID:(id)arg2 completionHandler:(id /* block */)arg3;
- (void)p_fetchRecordZoneChangesForRecordZoneIDs:(id)arg1 completionHandler:(id /* block */)arg2;
- (void)p_fetchZoneChanges:(id /* block */)arg1;
- (void)p_informObserversOfAttachmentChange;
- (void)p_informObserversOfCompletedFetchOfZone:(id)arg1;
- (void)p_informObserversOfRecordsChanged:(id)arg1;
- (void)p_informObserversOfRecordsChanged:(id)arg1 forRecordType:(id)arg2;
- (void)p_scheduleArchiveWithCompletion:(id /* block */)arg1;
- (void)p_subscribeWithCompletion:(id /* block */)arg1;
- (void)p_unarchive;
- (id)p_unarchiveFromData:(id)arg1;
- (void)p_unsubscribeToContainer:(id)arg1;
- (void)p_updateRetryParametersFromFetchZoneChangesOperationError:(id)arg1;
- (id)recordIDSalt;
- (id)recordNameFromRecordType:(id)arg1 identifier:(id)arg2;
- (void)recordZoneWithName:(id)arg1 completionHandler:(id /* block */)arg2;
- (id)recordZones;
- (void)registerServerChangeTokenStore:(id)arg1 forZoneID:(id)arg2;
- (void)removeObserver:(id)arg1;
- (void)removeObserver:(id)arg1 recordType:(id)arg2;
- (id)saltedAndHashedIDFromLocalID:(id)arg1;
- (id)serverChangeToken;
- (bool)serverFetchPostponed;
- (void)setAccessQueue:(id)arg1;
- (void)setArchiveURL:(id)arg1;
- (void)setAttachedToContainer:(bool)arg1;
- (void)setBackOffInterval:(double)arg1;
- (void)setChangedRecordZoneIDs:(id)arg1;
- (void)setCoalescedArchive:(id)arg1;
- (void)setCoalescedZoneFetch:(id)arg1;
- (void)setContainer:(id)arg1;
- (void)setContainerIdentifier:(id)arg1;
- (void)setDatabase:(id)arg1;
- (void)setDesiredRecordZoneIDs:(id)arg1;
- (void)setFetchRecordZoneChangesSuccess:(bool)arg1;
- (void)setHasSubscription:(bool)arg1;
- (void)setObservers:(id)arg1;
- (void)setRecordIDSalt:(id)arg1;
- (void)setRecordZones:(id)arg1;
- (void)setServerChangeToken:(id)arg1;
- (void)setServerFetchPostponed:(bool)arg1;
- (void)setSubscriptionID:(id)arg1;
- (void)setTokenStores:(id)arg1;
- (void)setUserRecordID:(id)arg1;
- (id)subscriptionID;
- (id)tokenStores;
- (void)unregisterServerChangeTokenStore:(id)arg1;
- (id)userRecordID;
- (void)willAttachToContainer:(id)arg1 serviceMode:(bool)arg2 completion:(id /* block */)arg3;
- (void)zonesTemporarilyUnreadableDueToMissingD2DEncryptionIdentity:(id)arg1 completion:(id /* block */)arg2;
- (void)zonesUnreadableDueToMissingD2DEncryptionIdentity:(id)arg1 completion:(id /* block */)arg2;

@end
def human_move_inval_checker(self, move_from, move_to, move_data, max_movement):
    if (move_data[0] + move_data[1]) > max_movement:
        if (move_data[0] + move_data[1]) == 2 and move_data[4] == 'diagonal' and self._game_board.is_palace(move_from):
            if self._game_board.invalid_palace_movement(move_from, move_to):
                return True
            if not self._game_board.is_palace(move_to):
                return True
        else:
            return True
    return False
def current_combiner(self):
    return [comb for comb in self.combiner if self.name in comb]
Alan Cooper, Paul Godfread Call Prenda Law's Bluff On Defamation Lawsuit

from the this-won't-go-well-for-prenda dept

Dear Mr. Godfread:

My firm has been retained by Livewire Holdings LLC to pursue claims in the U.S. District Court for the District of Minnesota against you and your coconspirators arising from defamation, civil conspiracy and related acts. The alleged acts occurred in e-mail communications and blog posts describing my client as a criminal enterprise. As you know, such statements constitute defamation per se and are, quite frankly, wildly inappropriate. Less-egregious claims have resulted in multi-million dollar judgments, as I trust this one will. The facts of the underlying case are essentially a law school exam hypothetical of every possible variation of libel. Perhaps you can forward my client's complaint to your former professors at William Mitchell.

My client is well-aware that you are a major contributor to these blog sites. The purpose of this e-mail is to inform you of impending litigation so that you preserve all relevant evidence in your possession including, but not limited to, communications between yourself and David Camaratto, Morgan Pietz, Nicholas Ranallo and any other individuals associated directly or indirectly with the sites fightcopyrighttrolls and dietrolldie. Further, any and all other evidence that might be relevant to this matter must, of course, be preserved.

I suspect that you aligned yourself with these defamatory efforts as a marketing strategy. I don't know if these efforts paid off, but I can assure you that making baseless accusations of criminal conduct is not a wise move for a licensed attorney. All of that being said, my client knows that you didn't work alone in these wrongful efforts. If you think we are missing out on more serious actors in your enterprise my client would be willing to consider decreasing your liability in exchange for information about these individuals. Of course, that interest will disappear if someone else comes forward first. Think it over and let me know. If you're willing to take the fall for whole group then you are decidedly a "true believer." Welcome to the big leagues.

Paul

...please take note of the dog that did not bark in the night. That is, note what the letter does not say. Consider the context. Godfread, on behalf of Cooper, is telling courts that Prenda Law has stolen Cooper's identity, and has filed a lawsuit on that basis. What would you expect in response, if Prenda Law had an answer for that? If I were representing Prenda Law, and had an answer, there is no doubt in my mind I would articulate it. I would say, "As you and Mr. Cooper know, and witnesses will attest, Mr. Cooper was a willing participant in AF Holdings LLC and fully consented to being an officer." Or I might say "You have recklessly and without adequate basis suggested that your client is the Alan Cooper who is an officer of AF Holdings, when even the briefest inquiry would show that AF Holdings is led by the distinguished Alan Cooper of Nevis and St. Kitts." I would say something articulating why Cooper's and Godfread's assertions are false. As I so often say, vagueness in legal threats is the hallmark of thuggery. But Hansmeier says nothing of the sort. He has only adolescent puffed-up threats and insults. What do you think that signifies?

We were somewhat surprised by Prenda Law, John Steele and Paul Duffy choosing to sue various critics for defamation, and specifically charging Alan Cooper and Paul Godfread with defamation.
Cooper, of course, was the home caretaker for some of John Steele's properties who discovered that his name was somehow involved in Prenda Law's shell games with (at least) AF Holdings and Ingenuity 13. He eventually sued Prenda claiming that his identity was used without his permission. Following this, as we heard at the big Prenda hearing, Steele started leaving a bunch of voicemails for Cooper, potentially violating ethics rules about directly contacting parties on the other side in a lawsuit. Also, from the voicemails, it seemed clear that the intention was to intimidate Cooper.

As we noted at the time, it would seem that filing these lawsuits would open them all up for significant discovery, which they probably would not like very much. The Prenda and Duffy lawsuits were filed in Illinois, and as we noted, Illinois has a relatively broad anti-SLAPP law. The Steele lawsuit was filed in Florida, though it was quickly dismissed. The two Illinois cases are ongoing, and the two named people sued -- Alan Cooper and his lawyer Paul Godfread -- have now filed their answers to the lawsuit. As is required in such cases, they go through each statement in the original suit, and confirm or deny (mostly deny) the various allegations made. Specifically, they deny making the vast majority of the random comments made on various blog comment systems that the lawsuits accuse them of being a part of.

Following this, they present their defenses, which again all appear to be fairly standard. They don't believe they've done anything illegal, any statements made were true and thus not defamatory, information about their own lawsuit against Prenda is protected by legal privilege, and they argue that it is a SLAPP suit.

They also bring up a number of counterclaims, and as part of that reveal that the "intimidation" campaign wasn't just limited to Steele calling Cooper, but included Peter Hansmeier's emails with Godfread as well, with the email above revealed to the court, which really highlights Hansmeier's pure hubris. That sign off line is quite a piece of work, and I'm sure it will go over well in federal court, where it's likely that the judge will have a chance to learn about the case in front of Judge Otis Wright in California. Furthermore, as Ken White points out, that email is most telling for what's not in there.

There is one seeming oddity in the response. As we noted, Illinois has a decent, though not wonderful, anti-SLAPP law. But rather than rely on that, Cooper and Godfread rely on Minnesota's anti-SLAPP law instead. They're both based in Minnesota, but it's still a little odd. Minnesota's anti-SLAPP law is definitely stronger than Illinois' and, as White notes, provides "immunity" from such lawsuits.

More importantly, by filing a bunch of counterclaims, Duffy and Prenda cannot easily walk away from this lawsuit, which is probably not the situation that Duffy, Hansmeier, Steele and others really want to be in right now. They've been playing a bullying bluster game all along, and suddenly their bluff is getting called, repeatedly, and they seem to think that if they just keep bullying and bluffing maybe it'll work out in the end. Of course, by the time Judge Wright is done with these guys, these cases in Illinois might not even matter very much...

Filed Under: alan cooper, anti-slapp, defamation, illinois, john steele, minnesota, paul duffy, paul godfread, paul hansmeier
Companies: prenda, prenda law
// System/Library/PrivateFrameworks/MaterialKit.framework/MTMaterialShadowView.h
/*
 * This header is generated by classdump-dyld 1.5
 * on Friday, April 30, 2021 at 11:37:16 AM Mountain Standard Time
 * Operating System: Version 13.5.1 (Build 17F80)
 * Image Source: /System/Library/PrivateFrameworks/MaterialKit.framework/MaterialKit
 * classdump-dyld is licensed under GPLv3, Copyright © 2013-2016 by <NAME>. Updated by <NAME>.
 */

#import <MaterialKit/MaterialKit-Structs.h>
#import <UIKitCore/UIView.h>
#import <libobjc.A.dylib/MTMaterialViewObserving.h>

@class UIView, MTMaterialView, UIColor, NSString;

@interface MTMaterialShadowView : UIView <MTMaterialViewObserving> {
    UIView *_shadowView;
    MTMaterialView *_captureOnlyMaterialView;
    BOOL _captureOnlyMaterialViewSuppliedByClient;
    MTMaterialView *_materialView;
}

@property (nonatomic, readonly) MTMaterialView *materialView; // @synthesize materialView=_materialView - In the implementation block
@property (assign, getter=isCaptureOnlyMaterialViewSuppliedByClient, nonatomic) BOOL captureOnlyMaterialViewSuppliedByClient; // @synthesize captureOnlyMaterialViewSuppliedByClient=_captureOnlyMaterialViewSuppliedByClient - In the implementation block
@property (nonatomic, copy) UIColor *shadowColor;
@property (assign, nonatomic) double shadowOpacity;
@property (assign, nonatomic) CGSize shadowOffset;
@property (assign, nonatomic) double shadowRadius;
@property (assign, nonatomic) BOOL shadowPathIsBounds;
@property (readonly) unsigned long long hash;
@property (readonly) Class superclass;
@property (copy, readonly) NSString *description;
@property (copy, readonly) NSString *debugDescription;

+ (id)materialShadowViewWithRecipe:(long long)arg1 configuration:(long long)arg2 initialWeighting:(double)arg3;
+ (id)materialShadowViewWithRecipe:(long long)arg1 configuration:(long long)arg2 initialWeighting:(double)arg3 scaleAdjustment:(/*^block*/id)arg4;
+ (id)materialShadowViewWithRecipe:(long long)arg1 configuration:(long long)arg2;
+ (id)materialShadowViewWithRecipeNamed:(id)arg1 inBundle:(id)arg2 configuration:(long long)arg3 initialWeighting:(double)arg4 scaleAdjustment:(/*^block*/id)arg5;
+ (id)materialShadowViewWithRecipeNamesByTraitCollection:(id)arg1 inBundle:(id)arg2 configuration:(long long)arg3 initialWeighting:(double)arg4 scaleAdjustment:(/*^block*/id)arg5;

- (double)shadowRadius;
- (void)setShadowRadius:(double)arg1;
- (CGSize)shadowOffset;
- (void)setShadowOffset:(CGSize)arg1;
- (UIColor *)shadowColor;
- (void)setShadowColor:(UIColor *)arg1;
- (void)setShadowOpacity:(double)arg1;
- (void)layoutSubviews;
- (void)_setContinuousCornerRadius:(double)arg1;
- (double)shadowOpacity;
- (void)setShadowPathIsBounds:(BOOL)arg1;
- (BOOL)shadowPathIsBounds;
- (void)groupNameDidChangeForMaterialView:(id)arg1;
- (void)weightingDidChangeForMaterialView:(id)arg1;
- (void)configurationDidChangeForMaterialView:(id)arg1;
- (void)recipeNameDidChangeForMaterialView:(id)arg1;
- (id)initWithMaterialView:(id)arg1;
- (void)setCaptureOnlyMaterialViewSuppliedByClient:(BOOL)arg1;
- (void)_configureShadowViewIfNecessary;
- (void)_configureCaptureOnlyMaterialViewIfNecessary;
- (MTMaterialView *)materialView;
- (BOOL)isCaptureOnlyMaterialViewSuppliedByClient;

@end
export default {
  id: 2,
  components: [
    {
      type: 'subHeading',
      value: 'A problem statement',
    },
    {
      type: 'paragraph',
      value: `We have to build an app which displays metadata of HD images like height, width, pixels, date, etc. The user clicks on a thumbnail/name of an image and we show the image details on the screen.`,
    },
    {
      type: 'paragraph',
      value: `We need to understand that accessing HD pictures over the network takes time and a lot of network bandwidth. We also don't want to redownload the same picture again if a user wishes to know the metadata of one picture many times.`,
    },
  ],
};
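The don't-fetch-twice requirement described in that second paragraph is classic memoization. A minimal sketch of the idea, written in Python for brevity rather than in the app's own stack, and assuming a hypothetical fetch_metadata(url) network call:

import functools

@functools.lru_cache(maxsize=None)
def get_metadata(url):
    # fetch_metadata is a stand-in for the real (expensive) network call;
    # repeated lookups for the same url are served from the in-memory cache
    return fetch_metadata(url)

Asking for the same image's metadata many times then costs one network round trip plus cheap cache lookups.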
/**
 * Abstract implementation of KeyManagerEventHandler.
 */
public abstract class AbstractKeyManagerEventHandler implements KeyManagerEventHandler {

    private RevocationRequestPublisher revocationRequestPublisher;

    public AbstractKeyManagerEventHandler() {
        revocationRequestPublisher = RevocationRequestPublisher.getInstance();
    }

    public boolean handleTokenRevocationEvent(TokenRevocationEvent tokenRevocationEvent) throws APIManagementException {
        Properties properties = new Properties();
        properties.setProperty(APIConstants.NotificationEvent.EVENT_ID, tokenRevocationEvent.getEventId());
        properties.put(APIConstants.NotificationEvent.CONSUMER_KEY, tokenRevocationEvent.getConsumerKey());
        properties.put(APIConstants.NotificationEvent.TOKEN_TYPE, tokenRevocationEvent.getTokenType());
        properties.put(APIConstants.NotificationEvent.TENANT_ID, tokenRevocationEvent.getTenantId());
        properties.put(APIConstants.NotificationEvent.TENANT_DOMAIN, tokenRevocationEvent.getTenantDomain());
        ApiMgtDAO.getInstance().addRevokedJWTSignature(tokenRevocationEvent.getEventId(),
                tokenRevocationEvent.getAccessToken(), tokenRevocationEvent.getTokenType(),
                tokenRevocationEvent.getExpiryTime(), tokenRevocationEvent.getTenantId());
        revocationRequestPublisher.publishRevocationEvents(tokenRevocationEvent.getAccessToken(),
                tokenRevocationEvent.getExpiryTime(), properties);
        return true;
    }
}
/**
 * Determines if a volume is part of a MetroPoint configuration.
 *
 * @param dbClient DbClient reference
 * @param volume the volume.
 * @return true if this is a MetroPoint volume, false otherwise.
 */
public static boolean isMetroPointVolume(DbClient dbClient, Volume volume) {
    if (volume != null) {
        VirtualPool vpool = dbClient.queryObject(VirtualPool.class, volume.getVirtualPool());
        if (vpool != null && VirtualPool.vPoolSpecifiesMetroPoint(vpool)) {
            _log.info(String.format("Volume's vpool [%s](%s) specifies Metropoint", vpool.getLabel(), vpool.getId()));
            return true;
        }
    }
    return false;
}
/*
LEARNING DP: Knapsack 1
  + CHOOSE some of the N items, each with a WEIGHT and a VALUE
  + the CAPACITY of the knapsack is W
  + MAXIMIZE the total value

CONSTRAINTS:
  + 1 <= N <= 100
  + 1 <= WEIGHT <= 100,000
  + 1 <= VALUE <= 10^9 * N

QUESTION: which weight, and which items?
IDEA: if we know the best VALUE achievable at weight x, then for each of the
N items we can extend it: best at (x + Wi) = max(best at (x + Wi), best at x + Vi).
NOTICE: we have to iterate from the highest WEIGHT down to the lowest so each
item is used at most once -- the classic 0/1-knapsack trick.

PRE DP: all weights start at -1e11 (unreachable); weight 0 starts at 0.
TIME COMPLEXITY: O(N*W)
*/
#include <bits/stdc++.h>
using namespace std;

#define DB(_x) cout << #_x << " is " << (_x) << "\n";
#define FOR(_a,_b) for (_a = 0; _a < _b; ++_a)
#define fs first
#define sc second
//#define IOFILE

using LL = long long;
using LD = long double;
using VI = vector<int>;
using PI = pair<int,int>;

void Excalibur(){
    int n, w, i, j;
    LL limit = -1e11;
    while (cin >> n >> w){
        vector<PI> a(n);
        for (auto &x: a) cin >> x.fs >> x.sc;
        vector<LL> dp(w+1, limit);
        dp[0] = 0;
        FOR (i, n){
            for (j = w; j >= 0; --j){
                if (j - a[i].fs >= 0 && dp[j - a[i].fs] != limit){
                    dp[j] = max(dp[j], dp[j - a[i].fs] + a[i].sc);
                }
            }
        }
        LL ans = 0;
        for (auto x: dp) ans = max(x, ans);
        cout << ans << "\n";
    }
}

int main(){
    ios::sync_with_stdio(false);
    cin.tie(nullptr); cout.tie(nullptr);
#ifdef IOFILE
    freopen("INPUT.in", "r", stdin);
    freopen("OUTPUT.out", "w", stdout);
#endif
    Excalibur();
    return 0;
}
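A quick worked example of the input format this solution reads (first line N and W, then one weight/value pair per line). With the sample below, the best choice is the weight-3 and weight-5 items (total weight 8, total value 90), so the program prints 90:

3 8
3 30
4 50
5 60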
def write_fm_programme_file(self, fm_items_df, fm_programme_file_path):
    try:
        cov_level = fm_items_df['level_id'].min()
        fm_programme_df = pd.DataFrame(
            pd.concat([fm_items_df[fm_items_df['level_id'] == cov_level], fm_items_df])[['level_id', 'agg_id']],
            dtype=int
        ).reset_index(drop=True)

        num_cov_items = len(fm_items_df[fm_items_df['level_id'] == cov_level])

        for i in range(num_cov_items):
            fm_programme_df.at[i, 'level_id'] = 0

        def from_agg_id_to_agg_id(from_level_id, to_level_id):
            iterator = (
                (from_level_it, to_level_it)
                for (_, from_level_it), (_, to_level_it) in zip(
                    fm_programme_df[fm_programme_df['level_id'] == from_level_id].iterrows(),
                    fm_programme_df[fm_programme_df['level_id'] == to_level_id].iterrows()
                )
            )
            for from_level_it, to_level_it in iterator:
                yield from_level_it['agg_id'], to_level_id, to_level_it['agg_id']

        levels = list(set(fm_programme_df['level_id']))

        data = [
            (from_agg_id, level_id, to_agg_id)
            for from_level_id, to_level_id in zip(levels, levels[1:])
            for from_agg_id, level_id, to_agg_id in from_agg_id_to_agg_id(from_level_id, to_level_id)
        ]

        fm_programme_df = pd.DataFrame(
            columns=['from_agg_id', 'level_id', 'to_agg_id'], data=data, dtype=int
        ).drop_duplicates()

        fm_programme_df.to_csv(
            path_or_buf=fm_programme_file_path,
            encoding='utf-8',
            chunksize=1000,
            index=False
        )
    except (IOError, OSError) as e:
        raise OasisException(e)

    return fm_programme_file_path
In the wake of the Edward Snowden revelations, the rush was on to rein in – or appear to rein in – the Surveillance State. The only really authentic effort was led by Rep. Justin Amash, the libertarian Braveheart, whose bill to completely defund the unconstitutional activities of the National Security Agency was narrowly defeated by a coalition of Clintonian Democrats and neoconservative Republicans.

But that wasn't the end of it. Two camps coalesced around two completely different concepts of "reform": Sen. Diane Feinstein introduced the "FISA Improvement Act," essentially an extension of the NSA's powers. Rep. Jim Sensenbrenner (R-Wisconsin), author of the infamous "Patriot" Act, called it "a joke," and the bill was universally mocked. On the other side of the barricades, the proponents of the original USA Freedom Act basically dismantled the promiscuous collection of Americans' phone records and other communications by the NSA. While Feinstein's bill languished with few co-sponsors and little support, the Freedom Act made it to the Judiciary Committee – where it was gutted of most of its content. As I wrote at the time:

"The 'compromise' bill deploys the time-honored bureaucratic weapon of linguistic obfuscation to redefine language and use it in ways no ordinary person would recognize. In translating the intent of legislators into lingo describing the technical architecture of our emerging police state, terms like 'selector' can be interpreted broadly enough to put not even a dent in the NSA's armor.

"The final legislative product will be an amalgamation of the language contained in both the original Sensenbrenner bill and the Feinstein extension of the NSA's powers, leading to the creation of a new hybrid system in which the power of the State to track, surveil, and investigate Americans on suspicion of 'terrorism' will be extended in more ways than it is (theoretically) restricted."

This is precisely what has occurred with the final bill. Rather than fundamentally changing the way the NSA scoops up data, the bill merely outsources collection to immunized telecoms, compelling them to do the NSA's dirty work. The "Freedom Act" is quite free with its Orwellian redefinition of common words to mean the exact opposite of what they have traditionally meant: for example, the bill defines a "selector" in such a way as to permit NSA to report a dragnet order collecting everyone's VISA bill as a single order targeting specific alleged terrorist outfits – when, in the real world, it would legalize surveillance of over 300 million US citizens. No wonder Deputy NSA Director Richard Ledgett says that under the terms of the bill "the actual universe of potential calls that could be queried against is [potentially] dramatically larger."

Under the present arrangement, government spies must operate within certain parameters that theoretically minimize "accidental" collections of Americans' data. The bill actually weakens these existing minimization procedures: instead of encoding them in law it hands the job of devising "privacy procedures" to the Attorney General, rather than the FISA court. What this means is that, under the proposed legislation, if the court found the NSA or other government agency spying on an individual (and his or her network of friends and acquaintances) because they engaged in constitutionally protected speech, the court would no longer have the authority to demand the destruction of those records. This is a giant step backward.
The so-called "transparency" provisions in the bill contain numerous loopholes, including exemptions for back door searches by the FBI, and the possibility that the DNI may issue a certification claiming there are operational reasons why he or she can’t report the number of Americans whose information has been collected. Rather than reveal anything meaningful, the provisions in the bill covering statistics to be submitted by the government will actually hide how many individuals are having their non-communications records – purchases, financial records, etc. – collected and stored. Under the procedures set up by this bill, we’ll never know how many Americans the FBI is spying on by collecting and storing their emails, call records, Internet searches, etc., because the reporting procedures are designed to conceal. The misnamed "USA Freedom Act" holds out the promise of "reform," but its main purpose is to mislead. It doesn’t minimize the intrusive surveillance techniques currently used by the NSA and other government agencies: instead it codifies them, in some instances, and in other instances masks ongoing abuses. Some civil liberties groups, like the ACLU and the Electronic Frontier Foundation, argue that the present bill is "a first step," and is better than nothing. This is nonsense: this bill is worse than nothing. With the passage of the USA Freedom Act the momentum for real reform will be blunted and allowed to dissipate. Further efforts to roll back the awful power of the NSA will be met with cries of "Didn’t we already do this?" If this bill passes, the Washington insiders will win out, and the Surveillance State will remain intact – arguably even more powerful than before. Some may say: But aren’t you taking an all-or-nothing attitude? The answer is: not at all. A real reform means a partial reining in of the NSA, with no new extensions of its reach. This bill includes a full-scale codification of abuses coupled with ambiguous and easily reinterpreted "reforms" that don’t mean what they appear to mean. Sen. Rand Paul (R-Kentucky), one of the original supporters of the bill before it was gutted, has said through an aide that Sen. Leahy’s reforms "don’t go far enough. There are significant problems with the bill, the most notable being an extension of the Patriot Act through December 2017." This is the kind of radical-but-reasonable stance that can lead to real reform – not the phony variety being pushed by Sen. Leahy, the ACLU, and the rest of the limp-wristed beaten-down liberals who have grown so accustomed to the Warfare State that they can’t imagine anything else. Their timidity won’t restore our old Republic – only the principled stance taken by Sen. Paul and those calling for the outright repeal of the odious "Patriot" Act have a shot at doing that. A special note: It’s 4:38 Pacific Standard Time, and I’m sitting at my desk, trying to gather my thoughts: how do I impress upon Antiwar.com’s supporters the depth and breadth of the crisis we face? After all, we’ve lived through thirteen years of constant warfare: an entire generation has grown up knowing nothing but war. This is the New Normal – right? It’s what we’ve come to expect – and accept. Except I’m not accepting it – and, I hope and trust, neither are you. Because the moment we do accept it, all is lost. The minute we acclimatize ourselves to the atmosphere of hysterical fear drummed up by the "mainstream" media, we become part of the problem. 
The media is a very big part of the problem: they are bought and paid for – there's just no other way to put it. The War Party owns them just as surely as the bank owns the mortgage on your house: they jump when the War Party says "Jump!" The only question they have is "How high?" This morning, the latest beheading of an American aid worker by ISIS is being trumpeted as yet another reason why we must re-invade Iraq. Emotion – chiefly fear – is the War Party's biggest weapon and they are using it to the max. Will someone stand up and say our most important ally in the region, Saudi Arabia, has beheaded dozens in the past few months? Will anybody rise to point out that a bunch of savages in the distant desert of the Levant pose no real threat to Americans – unless they decide to go there?

We will, and we have – but we can't continue to speak truth to hysteria without your help. Unlike the War Party, we don't have fat-assed oligarchs like Sheldon Adelson and Haim Saban passing us bundles of cash. We don't have the support of the Washington insiders and Georgetown cocktail partygoers who urge us to "be realistic" and accept the status quo – that is, the Empire, which, they say, is a permanent fixture of American life. Our answer to them is "Never!" – because the Empire is a criminal enterprise. And we won't rest until it's exposed, dismantled, and buried for good. We won't stop until our old Republic is restored. And we won't remain silent in the face of the liars in the media – the camarilla of fear – who are accessories to mass murder.

That's why we need your help – because we can't do it alone. Since 1998, we've been exposing the War Party's schemes, debunking their lies, and taking plenty of heat for it. We don't mind the heat, but we do need some refreshment now and then. Our tiny band of antiwar fighters has been on the barricades for years now, without even a brief respite. Every so often we start running out of fuel, and it's gotten to the point that we need a jump-start – and we need it now.

Your tax-deductible donation to Antiwar.com goes a very long way in any event: that's because we run a very tight ship. With our matching funds waiting in the wings (cash and Bitcoin), your generous contribution will go twice as far. But we won't get those funds without your donation – however small. Because every little bit helps. So please – if you've been holding off for any reason, now is the time to make that donation. Help ensure that the voice of peace – and reason – is never silenced. Donate today – because tomorrow may be too late.

NOTES IN THE MARGIN

You can check out my Twitter feed by going here. But please note that my tweets are sometimes deliberately provocative, often made in jest, and largely consist of me thinking out loud.

I've written a couple of books, which you might want to peruse. Here is the link for buying the second edition of my 1993 book, Reclaiming the American Right: The Lost Legacy of the Conservative Movement, with an Introduction by Prof. George W. Carey, a Foreword by Patrick J. Buchanan, and critical essays by Scott Richert and David Gordon (ISI Books, 2008).

You can buy An Enemy of the State: The Life of Murray N. Rothbard (Prometheus Books, 2000), my biography of the great libertarian thinker, here.
The work of one of the greatest writers of German Romanticism, E.T.A. Hoffmann, incorporates a great deal of the medical knowledge of his day, which Hoffmann used in a skillful and detailed manner in the portrayal of his characters and their motives. Immersed as he was in contemporary medical practice, an interest fuelled by his own deep-seated hypochondria, he was particularly taken with the works of Philippe Pinel, Johann Christoph Reil, Carl A.F. Kluge, and Gotthelf Heinrich Schubert, who were all interested in the workings of the mind. This article demonstrates how attention to Hoffmann's medical reading list offers insights useful for a critical understanding of his work, using as an example the analysis of the mad goldsmith Cardillac in one of Hoffmann's most famous stories, 'Das Fräulein von Scuderi'.
def entropy_from_mnemonic(mnemonic: str, lang: str) -> Entropy:
    indexes = mnemonic_dict.indexes_from_mnemonic(mnemonic, lang)
    entropy = mnemonic_dict.entropy_from_indexes(indexes, lang)
    return entropy
Flower Farms Environmental Performance Evaluation in Ethiopia
Article history Received: 20 April 2021 Accepted: 28 May 2021 Published Online: 10 June 2021
Cultivation of cut flowers is a young agricultural sector in Ethiopia that currently generates substantial income for the country's development. Despite this significant economic contribution, communities and environmentalists have raised many concerns about its environmental performance. Against this background, the study assesses cut flower production in the Wolmera district from cradle to gate. The main objective of the study was the environmental performance evaluation of flower farms in Wolmera district, Oromia regional state, Ethiopia, with respect to the operational activities across the entire life cycle of cut flower production. Primary and secondary data were collected following the ISO 14031 standard structured with the LCA tool methodology, using an on-site inventory data collection system. From the collected data, GHG emissions (CO2, N2O, CH4 & NH3) to the atmosphere were evaluated using the Intergovernmental Panel on Climate Change guidelines (IPCC 2006), while eutrophication and acidification were estimated from data tested at the laboratory level. The study also assessed the banned chemicals used on the farms through the inventory data: about 156 chemicals applied on the farms were recorded in order to screen for banned substances, and two extremely hazardous chemicals banned by the WHO (Impulse & Meltatix) were identified. The general assessment of all the flower farms showed that none of them had an EIA document established before construction in the district, and production started with little attention to the EHPEA code of conduct, exposing the environment to high impacts from the emissions of the flower farms across the district as a whole.
Introduction
Ethiopia is the second-most populous country in Sub-Saharan Africa and, with a current population growth rate of 2.6%, it has become one of the most populous countries in the world. As population growth continues, the pressure on existing natural resources and ecosystems is an increasing problem in the country. Potable water and aquatic resources are continuously depleted for agriculture with little recognition of the environmental issues. Agricultural production in Ethiopia has mostly been based on subsistence food crops and on coffee harvested predominantly for export, but recently the agricultural sector has been moving from subsistence farming to commercial production, including flower farming for export, especially in the central parts of the country: Wolmera, Sululta, Ziway, Sebeta, and others. Among these, Wolmera district, in the central part of the country, is densely occupied by flower farms. The district is almost entirely highland (>1100 m a.s.l.), terrain that is highly suitable for cultivating cut flowers and roses. This makes the area unusually attractive to investors, particularly flower farm investors. At present, however, only about twenty-one flower farms are operating; the rest have already phased out. Floriculture can be defined as "a discipline of horticulture concerned with the cultivation of flowering and ornamental plants for gardens and floristry, comprising the floral industry". It can also be defined as the segment of horticulture concerned with the commercial production, marketing, and sale of bedding plants, cut flowers, potted flowering plants, foliage plants, flower arrangements, and noncommercial home gardening. The Ethiopian floriculture industry started around 1980, when state farms began to export cut flowers to Europe, and within a short period Ethiopia became recognized as an international cut flower player, second only to Kenya in Africa, because of its geographical advantages for floriculture: cut flowers grow well at high altitude, above 1100 m. Ethiopia's agroecology offers opportunities to produce different varieties of flowers in different ecological zones, which has allowed the flower industry to expand over time. Cut flowers include all commercially cultivated roses and ornamental plants grown in greenhouses or in the field, especially in controlled environments, although various cut flowers are also grown outside greenhouses in many climatic conditions. The rapid growth of flower farms in Ethiopia is due to favorable climatic conditions and natural resources, strong governmental support, a good transportation system, and the availability of abundant, cheap labor. Floriculture serves luxury demand with high social value and is rarely used for food, and demand for such luxury goods in the international market has been increasing in recent years.
Flower farms and related industries are now part of the agricultural sector in Africa, contributing to economic development just as on other continents. The objective of this study is the environmental performance evaluation of the operating systems of flower farms in Wolmera district, Oromia regional state, Ethiopia. The study is dedicated to water consumption and discharge, solid waste generation and disposal, and energy consumption and emissions during flower cultivation.
Study Methodology
The study methodology is based on selecting the LCA tool for the assessment; the purpose of this choice is to demonstrate the value of environmental management tools in a realistic case and to analyze the resulting aspects of the firms. Flower production in Wolmera district is blamed for heavy use of chemical fertilizer, pesticides, and other resources, which create great problems for the environment through emissions, discharges, and disposal in the district. To identify these environmental impacts or burdens within the sector, it is vital to collect the necessary data at the source. Accordingly, a globally acceptable route (tool) was chosen to collect, organize, analyze, and decide on the issues, following the new standards ISO 14031 & ISO 14044. Using these international standards, the fundamental data were aggregated following the LCA method, which passes through at least four fundamental steps across the product life cycle: goal and scope definition, inventory data collection, impact assessment, and interpretation. The steps carried out to implement the environmental performance evaluation of the cut-flower farms are shown schematically in Figure 1.
System boundary of the study: the system boundary of any process describes the process's activities and its input-output components, which are taken into account within a life cycle assessment. For this study, the system boundary runs from land preparation to the transportation of cut flower products. The processes included in the system were water consumption, energy consumption, chemical consumption, products, waste generation, and emissions to the environment. Some processes were excluded from the system boundary of the current study, namely office activity, chemical container storage, and the nursery site, as shown in Figure 2.
Core Indicators Selection for the Study
Fundamental operational indicators are significant for an organization seeking to establish sustainable practice and decrease its environmental burden (Jasch 2009). Sub-indicators are used in combination with the core set of indicators to measure and follow environmental performance with further accuracy. Indicators can easily trace the types and amounts of resource inputs and outputs, quantifying the materials used in an organization. The core indicators identified in this study were total energy used, total water used, total material input, total products, wastewater output, and solid waste, with the best-known GHGs as sub-indicators; all were evaluated quantitatively from primary and secondary data following the ISO 14031 standard. All identified indicators were grouped under EPIs alongside the resulting environmental condition indicators.
However, most GHG emissions, including CH4, CO2, N2O, and NH3, and the drivers of eutrophication and acidification cannot be measured directly in the farm process like the other parameters. For this reason, the GHG, eutrophication, and acidification emissions from the farms were evaluated using equations for emissions from agricultural processes as per IPCC (2006), EPA (2003), and FAO, applied to the material inputs and outputs.
Inventory Data Collection
Inventory data were collected from the 21 flower farms in the district, which lies about 35 km from Finfinnee/Addis Ababa. The study covered more than one year, from February 2019 to April 2020. Data collection focused on the four stages of cut flower harvesting activity. The first was land preparation, covering the amounts of energy and water used as inputs. The second stage was cut flower plant handling, covering the amounts of water, chemicals, energy, and materials used and the products obtained in the cut flower production farms. The third stage was post-harvest activity, covering the water, chemicals, and cardboard paper used and the materials wasted throughout the activities. The fourth stage was transportation of the product, covering the power consumed for transportation, i.e., fuel used. Data collected at each stage of the flower harvesting activities focused on the selected indicators, based on the input-output entire life cycle of production. All necessary data were collected from both primary and secondary sources: distributing questionnaires, reviewing related documents from governmental offices, private institutions, individuals, and non-governmental organizations (NGOs), interviewing floriculture workers and farm managers, direct site observation, and assessment of the existing situation of the study areas. Even so, it was impossible to obtain direct data on the GHG emissions resulting from the materials used in the four stages of cut flower production. The emissions of the firms were therefore quantified from the amounts of materials used (fertilizer, chemicals, fuel), wastes burnt, and wastes discharged or disposed of, together with their emission factors, drawing on related studies and the IPCC 2006 guidelines for every GHG emitted from the input-output indicators in the data analysis.
Evaluate Potential Environmental Impacts
The inventory data provided the necessary information, but quantitative data could not be obtained directly for the GHG emissions from the fertilizer and pesticides used on the farms, from the residual biomass burnt on the farms, or from the energy (diesel fuel, petroleum fuel, and electricity) used for transporting products and for irrigation. In the same way, the eutrophication-supporting materials discharged with the wastewater per hectare (NO3, PO4, NH3, SO4) require estimation.
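To make the estimation procedure concrete, here is a minimal sketch of the Tier 1 approach the paper describes: each emission is the product of an activity amount (mass of fertilizer applied, biomass burnt, or fuel used) and a per-gas emission factor. The function names and the emission factors below are illustrative assumptions, not values reported in the paper; real factors depend on the IPCC 2006 source category.

# Minimal Tier 1 sketch: emission = activity data x emission factor.
# Names and factors are illustrative assumptions, not the paper's data;
# look up real factors in the IPCC 2006 guidelines.

def tier1_emission(activity_kg: float, ef_kg_per_kg: float) -> float:
    """Return kg of gas emitted for a given activity amount."""
    return activity_kg * ef_kg_per_kg

# Per-hectare inventory (kg), using figures quoted later in the paper.
inventory = {
    "urea_applied": 450.0,    # kg/ha urea (Fertilizer section)
    "residue_burnt": 86000.0, # kg/ha stems and leaves (Solid Waste Analysis)
}

# Hypothetical emission factors (kg gas per kg activity) for illustration.
emission_factors = {
    ("urea_applied", "CO2"): 0.733,
    ("residue_burnt", "CH4"): 0.0027,
}

for (source, gas), ef in emission_factors.items():
    kg = tier1_emission(inventory[source], ef)
    print(f"{source} -> {gas}: {kg:.1f} kg/ha")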
Therefore, it was necessary to estimate the emissions from the material inputs and outputs of the flower farms. The principal greenhouse gases emitted from the farms to the atmosphere (CO2, N2O, and CH4) were identified for estimation and were evaluated using a set of equations: for the GHG emitted from wastewater, from synthetic nitrogen fertilizer (DAP, urea), from the solid-waste biomass burnt on the farms, and from the combustion of fuel by vehicles at the end of the cut flower production life cycle (transportation of the main products), in each case using the emission factors of the materials used or disposed of; the amounts of eutrophication- and acidification-supporting materials were calculated from laboratory results and from the wastewater discharged per hectare of cut flower production. The identified parameters, whether core indicators or sub-indicators, were used to point out the environmental problems caused by the flower farms in the district; they were analyzed in Excel, and the evaluation was based on the average material flows on the farms per hectare for each activity.
Results and Discussion
Based on the methodology used in the study, all necessary data were collected on site through the inventory assessment. The inventory covered the fundamental material inputs and outputs of the flower farms' cradle-to-gate processing system based on LCA as per ISO 14044, organized in Table 1 (based on the selected functional unit).
Planting media in the flower farms: in flower farms, the medium is the area prepared for planting cut flowers, in furrow alignment, in the greenhouse or open field. This study assessed primary and secondary data from the 21 flower farms in the district. The collected data show that the district's flower farms use both soil-bed and hydroponic media. Almost all flower farms in Wolmera district use soil beds as the planting medium because of cost-effectiveness, but soil beds are environmentally less favorable than hydroponic beds, because hydroponic systems allow wastewater recycling, according to data obtained from the flower farm managers and the Environmental Protection Authority office of the district.
Cut Flower Products
Cut flower products are the annually produced cut flowers, measured in stems/tons/kg/bunches, supplied for marketing (export or local markets). To obtain the annual production of cut flowers for this study, more than twenty questionnaires were distributed to flower farms in the area, and solid data were collected at the source. Production in Wolmera district flower farms was about 85520 kg/ha of yield harvested over the one-year production cycle for marketing purposes. This plays a substantial role in the country's economic development, as export earnings further diversify Ethiopian exports and have become an important contributor to Ethiopia's economy. Despite this, an average of nearly 5220 kg of cut flowers per year was rejected as waste during the packaging process, and through the cut flower growth process a large amount of stems and leaves, roughly similar in quantity to the product, was wasted to the environment, according to data obtained from the flower farm managers' offices and the district EPA.
Likewise, no attempt is made to convert these solid wastes into any beneficial asset on the flower farms. The rejected cut flower waste, stems, and leaves are disposed of and burnt on the farms as agricultural residual biomass, and any agricultural residual biomass burnt on the farms emits GHGs (CO2, N2O, and CH4) (IPCC 2006). In this study, the emissions released to the atmosphere were calculated using the IPCC (2006) guideline emission factor standards for agricultural residue biomass burning. The resulting GHG emissions from burnt floricultural residue and biomass are aggregated in Table 2.
Water Consumption and Analysis
The water consumed in flower farming comes from groundwater, surface water, and harvested rainwater; normally the greater share of demand is met from groundwater, and likewise Ethiopian flower farms draw most of their water from groundwater. Flower farms are assumed to use large amounts of water, as in other common horticultural production, but water use in floriculture depends on the farm area, climate, soil type, and the water-use mechanism, and daily consumption varies from farm to farm. In the current study, Wolmera district flower farms used on average 28800 m3/ha of water as input to the cut flower production activities and discharged 7200 m3/ha of wastewater to the environment per year, according to data compiled from the 21 farm managers and the district EPA office. Even though horticulture is known for intensive use of resources such as land, water, and chemicals, water consumption in Wolmera district flower farms is high compared with previous articles. This indicates that the farms use too much water, which results in GHG emissions to the atmosphere, and that they drain wastewater directly to fields and rivers, supplying nutrients such as PO4, NO3, and NH3 that support the process of eutrophication and acidification. This situation restricts the value of the water for other uses in the communities. Unbounded water use by the flower farms can lead the area to groundwater scarcity and can cause large amounts of wastewater to drain to the environment. In general, the wastewater discharged from the flower farms to fields and rivers can facilitate eutrophication and acidification in the area by supplying N and P compounds respectively; these emissions to the atmosphere, land, and water bodies are quantitatively estimated per hectare in the emission evaluation section below.
Solid Waste Analysis
The solid wastes most commonly observed on the flower farms were plastics, paper, cardboard, flower stems, leaves, and cut flower residues. According to data collected from the district offices and the 21 farm managers, the total disposed stem and leaf waste averaged 86000 kg/ha, with 5220 kg/ha of cut flower waste from packaging; 20.26 kg/ha of paper waste was generated from 1500 kg/ha of input paper, 30.12 kg/ha of cardboard waste from 4100 kg/ha of inputs, and 20.9 kg/ha of plastic waste from 3200 kg/ha of plastic material inputs, all disposed to the environment, as shown in Figure 3.
There are several mechanisms for disposing of waste on flower farms, including landfill, incineration, anaerobic digestion, and recycling. In Wolmera district, however, almost all flower farms open-burn the farm's residual biomass in the field for fear of the cost of building a modern, acceptable disposal facility; only a small amount of waste is burnt in incinerators on some of the farms. Open burning of agricultural residue biomass generates GHG emissions (IPCC 2006). Across the district's flower farms, recycling or conversion to beneficial assets is essentially zero, apart from a small percentage of plastic waste, according to data gathered from the Wolmera district Environmental Protection & Climate Change Authority office and physical observation of the farm sites. The GHG emissions from residual biomass burning on the farms were estimated using the equation listed in chapter three, based on the IPCC 2006 agricultural residue biomass burning guideline, and the resulting emissions are evaluated and discussed in the emission estimation section.
Chemicals Used in the Farms
Ethiopia's floriculture industry uses more than 300 types of chemicals in rose production farms (Kassa 2017). To establish the chemical types used in Wolmera district flower farms, sufficient questionnaires were distributed to collect the necessary data: about twenty-one questionnaires were distributed and returned with the chemical types and quantities used at each farm. The collected data indicate that about 156 chemical types are applied in the district's flower farms, mostly at nursery sites, during cut flower plant handling, and in packaging rooms for prevention and preservation purposes. Most chemicals used in flower farms are fertilizers and pesticides, which are treated separately below.
Fertilizer
Flower farms in Ethiopia use more than 30 types of fertilizer to supply sufficient nutrients to the crop. Wolmera flower farms likewise use different types of fertilizer, including ammonium sulfate, potassium sulfate, potassium nitrate, potassium phosphate, ammonium phosphate, and urea, but the current study focused on the two main fertilizers used in the highest percentages: DAP and urea. According to data collected directly from the farm offices through the on-site data collection system, Wolmera district flower farms use on average 650 kg of DAP and 450 kg of urea per hectare of cut flower production per year. The farms use so much fertilizer that the process can lead to environmental pollution through GHG emissions and through nutrient discharge to rivers, which supports eutrophication (algal development) in river bodies and increases the acidity of the rivers in the area, as evaluated from the laboratory analysis. The study focused on estimating both emission types (GHG and nutrients discharged to rivers) released to the environment from the farms as a whole.
The GHG emissions were evaluated with different equations based on the amounts of material used or disposed of and the corresponding emission factors, estimating the NH3, N2O, and CO2 emitted to air with a correction factor for each gas as per the IPCC 2006 standards for synthetic nitrogen fertilizer, while the nutrients discharged to the rivers were estimated from laboratory results on the wastewater discharged per hectare of cut flower production. The main GHGs evaluated from the discharged wastewater were N2O, CO2, and CH4; NH3 emission was also estimated as 8% of the nitrogen fertilizer applied on the farms.
Pesticides
Ethiopian flower farms use more than 200 types of pesticides to control the macro- and micro-organisms that affect the development of cut flowers. Accordingly, in the current study more than 156 chemical types were recorded to assess the banned chemicals used and to estimate emissions to air from the farms. Wolmera flower farms use on average nearly 45 kg of pesticides per hectare of cut flower production per year. Pesticides used in flower farms can emit pollutants into the atmosphere that contribute to climatic change and pollution. This pesticide emission to the atmosphere was estimated on the basis that 30-50% of sprayed pesticide is emitted to the air through volatilization and air drift; the results are organized in Table 2.
Energy Consumption Analysis
The main energy sources in Wolmera district flower production farms are electricity, diesel, and petrol, used to run the firms' activities; the farms depend mainly on non-renewable rather than renewable energy sources. Energy is used in the cooling room, in the office, for lighting the compounds, for transportation, and for irrigation, but the current study focused on the energy used for transportation and irrigation water pumping, which falls within the system boundary. Total energy consumption per hectare of cut flower production was 3.55 kWh of electric power, 50 kg of diesel oil, and 35 kg of petrol. The energy used in flower farm production emits GHGs to the environment, which contribute substantially to environmental pollution; the main GHGs from these energy sources are CO2, N2O, and CH4. These were estimated using the heavy/medium-duty vehicle emission factors adopted from the IPCC 2006 guideline, and the results are aggregated in the emission estimation section and presented in Table 2.
Evaluation of Emission
Emission is the release of materials (gaseous, liquid, and solid substances) to the atmosphere, land, and water bodies, causing serious environmental problems that result from the large amounts of natural resources consumed by industry. In this study, the emission of gaseous substances from the input and output materials was evaluated from the data collected at the source. As listed in Table 2, the best-known greenhouse gases (CO2, N2O, and CH4) were evaluated using the IPCC (2006) standard and the emission factor (EF) of each GHG. The study focused mainly on the GHG emissions from fertilizer use, energy use, agricultural residue biomass burnt on the farms, and wastewater discharge. Agricultural residues are the main source of GHG emissions on the flower farms, arising from the leaves, stems, cut flowers, and decomposable input materials incinerated or burnt on site.
In the current study, the farms' residual biomass is burnt in open fields, releasing GHGs to the environment, mostly CO2, N2O, and CH4, as expressed in Table 2. These emissions were evaluated from the residual biomass burnt on the farms as per IPCC (2006). The evaluation showed that a larger amount of CO2 is released to the atmosphere from this source than of N2O or CH4, as shown in Figure 4, though N2O has about 265 times the greenhouse effect of CO2 over a hundred-year life span. Fertilizer is another source of GHG emissions in agricultural activities, and flower farms, as one part of the agricultural system, use large amounts of chemical fertilizer. In this study only the two main nitrogen fertilizers, DAP and urea, used in the highest percentages on the farms, were selected. When these fertilizer types are applied, greenhouse gases are emitted to the atmosphere and can cause global warming by driving climatic change (FAO). The basic GHGs emitted from the two nitrogen fertilizers were CO2, N2O, CH4, and NH3. These were evaluated per the IPCC (2006) standards: CO2, N2O, and NH3 from DAP, with NH3 estimated as 8% of the total nitrogen fertilizer applied on the farms. Likewise, the greenhouse gases from urea were evaluated by multiplying the total amount of urea used by the emission factor of each gas, with CO2, N2O, and CH4 assessed against the global warming potential standards for each GHG; the NH3 from the urea applied per hectare was evaluated, after decomposition, as 8% of the fertilizer released to the atmosphere in the form of ammonia, with 2% released in the form of NOx. As given in Table 2 and shown in Figure 4, a high percentage of CO2 from urea and a high percentage of N2O from DAP are released into the atmosphere compared with the other GHGs emitted from the two fertilizer types; this can cause atmospheric pollution and climatic change in the environment. The other GHG emission sources in the current study were the energy sources used on the farms for transportation and water pumping. Various energy sources are used in flower farms, but the study selected the major ones: diesel fuel, petroleum fuel, and electric power. For all of these, the major GHGs (CO2, N2O, and CH4) were evaluated from the amounts of material used and the emission factor of each gas, based on medium- to heavy-duty vehicle emission factors. A high percentage of CO2 is released to the atmosphere from diesel fuel and a lower amount from petroleum when the three sources are compared, which can bring climatic change to the area. Finally, wastewater discharged from the flower farms is another material output that can cause environmental pollution and emit greenhouse gases to the air, and it was the last GHG source considered in the current study.
The main greenhouse gases emitted from wastewater, evaluated in this study, were CO2, N2O, and CH4. The CO2-equivalent emission was estimated using hundred-year time horizon global warming potentials (GWP = 310 for N2O and GWP = 21 for CH4), taken from the IPCC 2006 standards and applied to the estimated N2O and CH4 values. Based on the results, the GHG emission from wastewater (effluent) to the atmosphere was greater than from the other GHG-emitting sources; that is, the wastewater discharged from the flower farms contributes strongly to global warming by supplying a larger amount of GHGs than the other emitting materials. The pesticide emission in this study was evaluated from the total pesticide used per year on the farms using an emission factor, on the basis that 30-50% of sprayed pesticide is emitted to air and soil through volatilization and air drift. Pesticide emission estimates generally depend on air conditions, timing, application methods and systems, applicator skill, and pesticide type. On this basis, the GHG emitted from the pesticides applied in the flower farms, which strongly supports climatic change in the environment, was estimated at about 0.00002 Gg/year.
Emission of Nutrients to the River with Wastewater
The amounts of nutrients discharged to the river and nearby land were evaluated from laboratory analysis of effluent samples and from the amount of wastewater discharged to the environment. As shown in Figure 5, PO4, NO3, NH3, and SO4 were the main nutrients discharged to the environment, supporting eutrophication and acidification: N, P, and their compounds are the major causes of eutrophication and acidification respectively. Acidification is caused by NH3, NOx, and SOx releasing H+, which has the potential to acidify soil and water bodies. In this study the main supporters of acidification are SOx, NH3, and NOx, and the main eutrophication-supporting nutrients are PO4, NH3, and NO3.
Figure 5. Nutrients supporting eutrophication and acidification of water bodies
As mentioned in Table 2, Wolmera district flower farms release a high dosage of chemicals supporting eutrophication and acidification into the environment, as understood from the estimated results. In general, the assessment covered the most influential emissions, focusing on GHG emissions and wastewater emissions to the environment, both estimated using international standards and laboratory analysis of the discharged effluent. The fundamental GHG emissions estimated from all input-output materials were CO2, N2O, CH4, and NH3, which have high potential to increase global warming; the wastewater emission was also used to estimate the chemical nutrients (PO4, NO3, NH3, and SO4) added to the rivers, which support the development of eutrophication and increase acidification in the ecosystem after chemical fertilizer and pesticides react with water. In addition, CO2 contributes markedly to the acidification of the environment, especially water bodies such as rivers, lakes, and oceans: CO2 released into the air reacts with water and acidifies the water body, which can harm aquatic organisms and the users of the water resources.
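The CO2-equivalent conversion described above is simple arithmetic: each gas's mass is multiplied by its 100-year GWP. A minimal sketch, using the GWP values quoted in the text (310 for N2O, 21 for CH4) with hypothetical per-hectare masses in place of the paper's Table 2 values:

# CO2-equivalent aggregation using the GWPs quoted in the text
# (100-year horizon: N2O = 310, CH4 = 21, CO2 = 1 by definition).
# The emission masses below are illustrative placeholders.

GWP_100 = {"CO2": 1.0, "N2O": 310.0, "CH4": 21.0}

def co2_equivalent(emissions_kg: dict) -> float:
    """Sum gas masses (kg) weighted by their 100-year GWPs."""
    return sum(mass * GWP_100[gas] for gas, mass in emissions_kg.items())

wastewater = {"CO2": 120.0, "N2O": 0.9, "CH4": 14.0}  # kg/ha, illustrative
print(f"{co2_equivalent(wastewater):.0f} kg CO2e/ha")
# 120*1 + 0.9*310 + 14*21 = 120 + 279 + 294 = 693 kg CO2e/ha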
Acidification of a water body can occur when atmospheric CO2 reacts with water, increasing the H+ concentration in water bodies (oceans, lakes, rivers) through the following reaction process. In general, compounds like CO2 and SO4 produce acidification when they react with water droplets or precipitation in the atmosphere:

CO2 + H2O ⇌ H2CO3
H2CO3 ⇌ HCO3- + H+

Conclusions
Wolmera flower farms consume large amounts of resources and dispose of or discharge huge amounts of waste to the environment, directly or indirectly affecting the environment and its components. The input-output materials of the firms were assessed and identified using inventory and sampling data collection methods supported by the ISO 14031 standard integrated with LCA. The essential data were collected at the source (on site), and the GHG emissions (CO2, N2O, CH4 & NH3) to the environment were evaluated for fertilizer (DAP & urea), floriculture biomass residue burnt on the farms, energy consumed (diesel and petroleum fuel), pesticide applied, and wastewater discharged, using IPCC 2006 methods on the aggregated inventory data. All of these GHGs emitted to the environment can increase global warming. Similarly, the basic materials causing eutrophication and acidification (NO3, PO4, NH3, and SO4 from wastewater, and N and P from soil) were evaluated from laboratory results. In general, the farms have low operational performance and are environmentally deficient. To address such challenges, the flower farms must adopt combined and linked internal and external environmental performance evaluation. This systematic environmental management tool can lead the flower farms to evaluate their ability to manage environmental impacts, in place of the EIA documents missed during construction.
The innovation industry faces an uncertain future, as long as the United States R&D Tax Credit remains a Congressional roller coaster ride. Innovation should be rewarded and the U.S. government should use federal funds to foster a culture of discovery. Virtually everyone agrees with this broad premise. But, as with many things on Capitol Hill, the devil is in the details. On that note, let’s talk about the Research and Experimentation Tax Credit. Every couple of years, Congress votes to extend the so-called R&D Credit—a significant tax break for corporations that invest in research and development. Beneficiaries of the credit include many of the usual suspects (think Boeing and Dow Chemical) but also small businesses and even tech start-ups. Right now the R&D Credit is a temporary measure, an adaptation of President Reagan’s 1981 tax break that was launched to ensure that the U.S. economy kept Japan on its toes. But since then, Congress has renewed it without fail (more or less) 15 times. This year, it almost looked like the R&D Credit might be extended permanently—a move that industry experts say would shield science and technology companies from the losses that they take every time it expires. Spoiler: it didn’t happen. Last week, Congress begrudgingly renewed the R&D Credit retroactively for another year and, at least for now, all hopes of it becoming a permanent incentive for cutting-edge research are dashed. “This place is dysfunctional,” Representative Jim McDermott said during last week’s session. “Businesses and individuals need to know what the tax is going to be in the beginning of the year so that they can plan and take advantage of incentives rather than waiting until the last two weeks of the year when the Congress may or may not act.” A full transcript of the session is available here. Every time the R&D Credit comes up for renewal, a similar debate emerges. Proponents remind us that the tax break spurs innovation and supports high-paying jobs in science and technology. They call for a permanent R&D Credit, so that research-minded businesses can budget accordingly. Detractors point out that the last two-year extension cost $14 billion over ten years, and that making the credit permanent could cost $150 billion over ten years. They argue that nobody has figured out how to pay for a $150 billion loss in tax revenue. But this time around, things started looking different. In 2010, President Obama announced that he would seek to permanently extend the R&D Credit. The presidential push for a permanent tax break reached its peak in January 2014 during his State of the Union Address: “Listen, China and Europe aren't standing on the sidelines; and neither—neither should we. We know that the nation that goes all-in on innovation today will own the global economy tomorrow. This is an edge America cannot surrender. Federally-funded research helped lead to the ideas and inventions behind Google and smartphones. And that's why Congress should undo the damage done by last year's cuts to basic research so we can unleash the next great American discovery.” Then in May 2014, it finally happened. The House of Representatives passed a bill to permanently extend the R&D Credit. Rumor has it that, for a short while, the bill enjoyed a degree of bipartisan support. Research firms were thrilled. A permanent R&D Credit would have meant that businesses invested in science and technology could lay out their budgets without having to wonder when (or if) they would get their tax breaks. 
And errors in earnings reports that led to stock fluctuations—partially due to confusion over the R&D Credit’s odd cycle of expirations and renewals—may have diminished, or vanished entirely. And then everything came crumbling down. The White House threatened a veto if Congress didn’t find a way to pay for the $150 billion bill, House Republicans took a shot at unemployment benefits and Senate members (who, perhaps, saw the writing on the wall) began to draft a bill to extend the R&D Credit temporarily. Now that the dust has settled, it looks like we’re only getting a one-year extension out of the whole kerfuffle. This is disappointing news for science and scientific research. It’s also a little bit frustrating. Here we have a case where everyone—Republicans, Democrats, Congressmen and the White House—claimed to want the same thing. In the end, nobody got what they wanted. For more on the implications of the R&D Credit extension, check out David Malakoff’s in-depth coverage in Science Magazine.
Dubspot contributor Josh Spoon explains how to sidechain frequencies in Ableton Live by using the Max for Live Envelope Follower device. Many people love the rhythmic pulse of sidechaining a pad or vocal to a kick drum. Sometimes, though, you may not want to sidechain or duck the whole sound; instead you want to sidechain only the frequencies that get in the way. In this tutorial I will show you how to sidechain specific frequencies and create rhythmic equalization. You will need Ableton Live 9 Suite and Max for Live Essentials to complete this tutorial. I have provided a starter pack for you. You can download it here. When you open the Dubspot-Sidechain-Freq-Tutorial.als file you will see two MIDI tracks with a kick and a bass. Let’s play both clips. If you select the Kick track and look at the Spectrum you will see that a majority of the kick’s energy is between 50 – 200 Hz (hertz). Now let’s look at the Bass track. Looking at the Spectrum on the rebuilt Live 9 EQ, you see the Bass takes up a lot of the frequency spectrum, including a lot of the low end. Traditionally, to keep the Bass from getting in the way of the kick, we would sidechain the bass so that when the kick hits, the volume of the Bass goes down as a whole. Using the Max for Live Envelope Follower to sidechain, you can preserve the volume of the frequencies that do not compete with the kick drum. First we need to navigate to the Max for Live category in the browser and type "Envelope Follower". Then drag "Envelope Follower.amxd" onto your Kick track, and you will see that the Envelope Follower is registering the amplitude of the kick. Now let’s go back to our Bass to set up the EQ. Select Filter 1 on the EQ and set the "Freq" to 120 Hz and the Q (Resonance) to 0.71. This will help the EQ of the bass duck where most of the kick’s power is located. Listen to the clips again. Move the "Gain" of Filter 1 between 0 dB and -15 dB to simulate a ducking effect, and listen to how the sound of the kick and bass changes. It’s subtle, but subtle is the difference between a good low end and a great low end. Next let’s make an Audio Effects Rack; it will give us greater control over the frequency ducking we are creating. See my other article, 4 Reasons To Build Your Own Ableton Live Racks! Dynamic Controller Integration and More, to learn more about the power of racks. Right-click on the Bass EQ and select "Group". We will then right-click Filter 1's gain and select "Map to Macro 1". To get greater control over the ducking of the Bass, set the Min to 0 dB and the Max to -15 dB for the newly auto-named "1 Gain A" macro. Click on the Kick track and press "Map" on the Envelope Follower. We will then select the Bass track and click "1 Gain A" on the rack to map the Envelope Follower from the Kick to the EQ gain of the Filter. You will know it worked if 1 Gain A is grayed out. Let’s listen to how the bass is being affected. If you want the effect to be more present you can increase the gain of the Envelope Follower. Listen for some of the power of the bass dipping while the rest of the frequencies stay in place as the kick hits. This will be heard more clearly with headphones or on a sound system with good bass response. Here is the track with the Envelope Follower at +10 dB. Here is the frequency ducking at 1k. You can feel the pumping of the Bass and the thump of the kick a little more. This technique is great for mixing and as an effect because you get more control than with a compressor.
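If you want a feel for what the Envelope Follower is doing under the hood, here is a minimal sketch in Python, entirely outside of Live: a one-pole envelope follower tracks the kick's amplitude, and that envelope scales the gain of only the low band of the bass. The synthetic signals, the crude one-pole filter, and all coefficients are illustrative assumptions, not the internals of the actual device.

# Sketch of envelope-follower-driven frequency ducking (illustrative only).
import numpy as np

SR = 44100
t = np.arange(SR) / SR
kick = np.sin(2 * np.pi * 60 * t) * np.exp(-t * 40)  # decaying 60 Hz thump
bass = 0.5 * np.sin(2 * np.pi * 55 * t) + 0.2 * np.sin(2 * np.pi * 440 * t)

def envelope_follower(x, rise=0.001, fall=0.1):
    """One-pole peak follower, analogous to the device's Rise/Fall controls."""
    env = np.zeros_like(x)
    a_rise = np.exp(-1 / (rise * SR))
    a_fall = np.exp(-1 / (fall * SR))
    for i in range(1, len(x)):
        a = a_rise if abs(x[i]) > env[i - 1] else a_fall
        env[i] = a * env[i - 1] + (1 - a) * abs(x[i])
    return env

def onepole_lowpass(x, fc):
    """Crude low-pass used to isolate the band that competes with the kick."""
    a = np.exp(-2 * np.pi * fc / SR)
    y = np.zeros_like(x)
    for i in range(1, len(x)):
        y[i] = a * y[i - 1] + (1 - a) * x[i]
    return y

env = envelope_follower(kick)
low = bass_low = onepole_lowpass(bass, 120.0)  # the ducked band
high = bass - low                              # everything else is untouched
gain = 10 ** ((-15.0 * env / env.max()) / 20)  # 0 dB .. -15 dB, like the macro
ducked = low * gain + high

The Min/Max mapping in the rack corresponds to the last two lines: the envelope is scaled into a 0 dB to -15 dB gain range and applied only to the low band, which is exactly why the rest of the bass keeps its level as the kick hits.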
Experiment with the other parameters like Rise, Fall and Delay on the Envelope Follower, as well as the frequency and Q of 1 Gain A, to hear the rhythmic effects of ducking different frequencies. In the Live Pack I’m including an extra kick track with two Envelope Followers and a rack with two frequencies to sidechain at two frequency points. Download the Live Pack here. You don’t have to sidechain sounds only to kicks; you can sidechain any sound to another sound. The options for creating unique tones with this technique are limitless. I hope this gives you a new tool to create new sonic textures to aid you in your productions. Dubspot blogger Josh Spoon is an Ableton Live veteran, blogger, drummer, music producer and live performer. Josh has a residency with the eclectic Los Angeles electronic music collective Space Circus, performing every first Friday of the month, and just released his first concept EP of grooving low-end originals entitled Man on Mars.
Effect of Vaspin on Myocardial Ischemia-Reperfusion Injury Rats and Expression of NLR Family Pyrin Domain Containing 3 (NLRP3) Myocardial ischemia-reperfusion injury (MIRI) can cause myocardial damage, and vaspin can protect against such damage; however, the effect of vaspin on MIRI rats and on the expression of NLRP3 remains unclear. Sprague-Dawley rats were separated into a sham group, a MIRI group, and a vaspin group, in which 100 ng/ml vaspin was administered before model preparation, followed by analysis of cardiac function by M-mode ultrasound; levels of NLRP3, type I collagen, IL-6, and TNF-α by ELISA; SOD activity and ROS by spectrophotometry; and Bcl-2 and PI3K/AKT signaling protein expression by Western blot. In the MIRI group, left ventricular end-systolic diameter (LVESD), left ventricular mass index (LVMI), left ventricular end-diastolic diameter (LVEDD), NLRP3 expression, and the contents of type I collagen, IL-6, TNF-α, and ROS were significantly increased, while SOD activity was significantly decreased, with decreased Bcl-2 expression and upregulated pAKT and pPI3K (P < 0.05). In the vaspin group, LVESD, LVMI, LVEDD, NLRP3 expression, type I collagen, IL-6, TNF-α, and ROS were decreased, SOD activity and Bcl-2 expression were significantly increased, and pAKT and pPI3K were downregulated (P < 0.05). In conclusion, vaspin can regulate the PI3K/AKT signaling pathway, inhibit NLRP3 expression, modulate oxidative stress, suppress inflammation, reduce apoptosis, and improve cardiac function in rats with myocardial ischemia-reperfusion injury.
-- bf2c.hs
module Main where

import Data.Word (Word8)
import System.Environment (getArgs)

data Instruction = PIncr Int
                 | PDecr Int
                 | CIncr Word8
                 | CDecr Word8
                 | Print
                 | Read
                 | Loop [Instruction]
                 deriving (Eq, Show)

-- parse and compile Brainfuck code to instructions,
-- merging runs of >, <, + and - into single instructions.
compile :: String -> [Instruction]
compile str = fst $ compile' ([], str)
  where
    compile' :: ([Instruction], String) -> ([Instruction], String)
    compile' (acc, "")             = (reverse acc, "")
    compile' (PIncr n:acc, '>':xs) = compile' (PIncr (n+1):acc, xs)
    compile' (PDecr n:acc, '<':xs) = compile' (PDecr (n+1):acc, xs)
    compile' (CIncr n:acc, '+':xs) = compile' (CIncr (n+1):acc, xs)
    compile' (CDecr n:acc, '-':xs) = compile' (CDecr (n+1):acc, xs)
    compile' (acc, '>':xs)         = compile' (PIncr 1:acc, xs)
    compile' (acc, '<':xs)         = compile' (PDecr 1:acc, xs)
    compile' (acc, '+':xs)         = compile' (CIncr 1:acc, xs)
    compile' (acc, '-':xs)         = compile' (CDecr 1:acc, xs)
    compile' (acc, '.':xs)         = compile' (Print:acc, xs)
    compile' (acc, ',':xs)         = compile' (Read:acc, xs)
    compile' (acc, '[':xs)         = compile' (Loop subinstructs:acc, xss)
      where (subinstructs, xss) = compile' ([], xs)
    compile' (acc, ']':xs)         = (reverse acc, xs)
    compile' (acc, _:xs)           = compile' (acc, xs) -- ignore any other character

-- translate instructions into C code.
translate :: [Instruction] -> String
translate []                  = ""
translate (Loop [CDecr 1]:xs) = "*p=0;" ++ translate xs -- [-] zeroes the cell
translate (PIncr 1:xs)        = "p++;" ++ translate xs
translate (PDecr 1:xs)        = "p--;" ++ translate xs
translate (CIncr 1:xs)        = "(*p)++;" ++ translate xs
translate (CDecr 1:xs)        = "(*p)--;" ++ translate xs
translate (PIncr n:xs)        = "p=p+" ++ show n ++ ";" ++ translate xs
translate (PDecr n:xs)        = "p=p-" ++ show n ++ ";" ++ translate xs
translate (CIncr n:xs)        = "*p=*p+" ++ show n ++ ";" ++ translate xs
translate (CDecr n:xs)        = "*p=*p-" ++ show n ++ ";" ++ translate xs
translate (Print:xs)          = "putchar(*p);" ++ translate xs
translate (Read:xs)           = "*p=getchar();" ++ translate xs -- ',' reads into the cell
translate (Loop sub:xs)       = "while(*p){" ++ translate sub ++ "}" ++ translate xs

translate' :: String -> String
translate' str = header ++ (translate . compile $ str) ++ footer
  where
    header = "#include <stdio.h>\n\
             \int main(){\
             \unsigned char m[30000]={0};\
             \unsigned char *p=m;"
    footer = "}\n"

-- read the first argument as the name of a source file.
main :: IO ()
main = do
  filename <- fmap (!! 0) getArgs
  readFile filename >>= putStr . translate'
import { getMIRAnnualRewards } from "./calc"

test("MIR Annual rewards", () => {
  const Y1999 = new Date("1999-01-01").getTime()
  const Y2021 = new Date("2021-01-01").getTime()
  const Y2022 = new Date("2022-01-01").getTime()
  const Y2023 = new Date("2023-01-01").getTime()
  const Y2024 = new Date("2024-01-01").getTime()
  const Y2025 = new Date("2025-01-01").getTime()

  expect(getMIRAnnualRewards(Y1999)).toBe(3431250)
  expect(getMIRAnnualRewards(Y2021)).toBe(3431250)
  expect(getMIRAnnualRewards(Y2022)).toBe(1715625)
  expect(getMIRAnnualRewards(Y2023)).toBe(857813)
  expect(getMIRAnnualRewards(Y2024)).toBe(428906)
  expect(getMIRAnnualRewards(Y2025)).toBe(undefined)

  expect(getMIRAnnualRewards(Y1999, true)).toBe(10293750)
  expect(getMIRAnnualRewards(Y2021, true)).toBe(10293750)
  expect(getMIRAnnualRewards(Y2022, true)).toBe(5146875)
  expect(getMIRAnnualRewards(Y2023, true)).toBe(2573439)
  expect(getMIRAnnualRewards(Y2024, true)).toBe(1286718)
  expect(getMIRAnnualRewards(Y2025, true)).toBe(undefined)
})