Between 2008 and 2009, 35 employees at France Telecom took their own lives. On Wednesday the operator’s former CEO Didier Lombard (pictured) was placed under formal investigation for psychological harassment.
The former head of France Telecom, the world’s second-largest telecommunications firm, was formally placed under investigation Wednesday for his alleged role in a wave of staff suicides which prosecutors say amounted to psychological harassment.
Didier Lombard (pictured) was CEO of the company, which provides mobile and internet services through the Orange brand, during a sweeping restructuring period between 2008 and 2009 in which some 35 employees took their lives.
Two other senior executives, 70-year-old Lombard’s deputy Louis-Pierre Wenes and the former head of human resources Olivier Barberot, were also due to appear before magistrates on Thursday.
Lombard, who was released on bail of 100,000 euros, resigned as CEO of the former state monopoly in 2010.
In his five years at the helm, the number of France-based employees fell from 130,000 to around 100,000. The company has some 180,000 employees worldwide.
‘Copycat suicide culture’
On Wednesday he denied that his strategy as head of the international operator was responsible for the suicides, whose number he maintained was no higher than the national average.
In an editorial in left-leaning Le Monde on Wednesday, Lombard said: “The [restructuring] plans put in place by France Telecom were never intended to hurt the employees. On the contrary, the plans were destined to save the company and its workforce.
“I am conscious of the fact that the company’s upheaval may have caused problems [for some employees],” he wrote. “But I absolutely reject that these plans, which were vital to France Telecom’s survival, were the direct cause of these human tragedies.”
In 2009 Lombard shocked France by claiming that there was a “mode de suicide” – a “copycat suicide culture” – at France Telecom.
Unions representing the company’s staff maintain that specific policies, including forced moves and impossible performance targets, were put in place specifically to crush morale and force employees to quit.
‘People were put under unbearable pressure’
“This was the biggest redundancy plan at a French company in decades,” Sebastien Crozier, head of the CFE-CGC union at France Telecom, told FRANCE 24.
“People were put under unbearable pressure. Thousands were forced to move geographically or to take on new job functions. They couldn’t take it.”
“The whole strategy was to reduce the number of employees,” he added. “It was an organised and planned method to make employees’ lives difficult so that they would resign.”
Many of the workers who took their own lives directly blamed the pressure of the restructuring, and in 2010 a government report on the suicides concluded that the company had ignored advice from doctors about the effect these policies had on staff morale and on employees’ mental health.
Wednesday’s court hearing was the first time that a CEO of a multinational company had been brought before French magistrates for psychological harassment, a move the unions welcomed.
Crozier said the development was “important for all the staff and the families.”
“Since Lombard has left the company the number of suicides has dropped by two thirds,” he said. “We need to ask why the rate was three times higher when he was CEO.”
Lombard, if convicted, faces up to one year in prison and a 15,000-euro fine.
package org.familysearch.platform.ct;
import org.gedcomx.common.ResourceReference;
import org.gedcomx.common.URI;
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull;
public class ChangeInfoTest {
@Test
public void testGetReason() throws Exception {
ChangeInfo changeInfo = new ChangeInfo();
assertNull(changeInfo.getReason());
assertNull(changeInfo.getKnownOperation());
assertNull(changeInfo.getObjectType());
assertNull(changeInfo.getParent());
assertNull(changeInfo.getResulting());
assertNull(changeInfo.getOriginal());
assertNull(changeInfo.getRemoved());
changeInfo.setReason("junkReason");
changeInfo.setKnownOperation(ChangeOperation.Delete);
changeInfo.setObjectType(URI.create("urn:hi"));
changeInfo.setParent(new ResourceReference(URI.create("urn:junkParent")));
changeInfo.setResulting(new ResourceReference(URI.create("#CHNG-001.PRSN-001.resulting")));
changeInfo.setOriginal(new ResourceReference(URI.create("#CHNG-001.PRSN-001.original")));
changeInfo.setRemoved(new ResourceReference(URI.create("#CHNG-001.PRSN-001.removed")));
assertEquals("junkReason", changeInfo.getReason());
assertEquals(ChangeOperation.Delete, changeInfo.getKnownOperation());
assertEquals("urn:hi", changeInfo.getObjectType().toString());
assertEquals("urn:junkParent", changeInfo.getParent().getResource().toString());
assertEquals("#CHNG-001.PRSN-001.resulting", changeInfo.getResulting().getResource().toString());
assertEquals("#CHNG-001.PRSN-001.original", changeInfo.getOriginal().getResource().toString());
assertEquals("#CHNG-001.PRSN-001.removed", changeInfo.getRemoved().getResource().toString());
}
}
package awesome.socks.common.util;
import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.Properties;
import lombok.extern.slf4j.Slf4j;
/**
*
* @author awesome
*/
@Slf4j
public final class Config {
private static final Config CONFIG = new Config();
private final Properties properties = new Properties();
private Config() {
log.info("Config.init start");
try (FileInputStream fis = new FileInputStream(ResourcesUtils.getResource("app.properties").getFile());
InputStreamReader isr = new InputStreamReader(fis, StandardCharsets.UTF_8);) {
properties.load(isr);
log.info("Config.init end");
} catch (Exception e) {
log.error("Config.init error", e);
}
}
public static String get(String key) {
String property = CONFIG.properties.getProperty(key);
if (property == null) {
for (Map.Entry<Object, Object> entry : CONFIG.properties.entrySet()) {
if (key.equalsIgnoreCase((String) entry.getKey())) {
if (property == null) {
property = (String) entry.getValue();
} else {
return null;
}
}
}
}
return property;
}
public static int getInt(String key) {
return Integer.parseInt(get(key));
}
}
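The fallback logic in `get` is easy to misread: an exact lookup runs first, then a case-insensitive scan that deliberately returns null when two distinct keys match ambiguously. A Python sketch of the same semantics (illustrative only, not part of the project):

```python
def config_get(properties, key):
    # Exact match first, then a case-insensitive scan; an ambiguous
    # scan (two or more distinct keys matching) yields None.
    value = properties.get(key)
    if value is None:
        for k, v in properties.items():
            if k.upper() == key.upper():
                if value is None:
                    value = v
                else:
                    return None  # ambiguous: a second case-insensitive match
    return value
```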
// +build !prod

package s3
import (
"encoding/json"
"fmt"
)
type FakeClient struct {
configuredUploads map[string]struct{}
configuredDownloads map[string][]byte
}
var _ S3 = (*FakeClient)(nil)
func NewFakeClient() *FakeClient {
return &FakeClient{
configuredUploads: make(map[string]struct{}),
configuredDownloads: make(map[string][]byte),
}
}
func (c *FakeClient) SetupUpload(key string) {
c.configuredUploads[key] = struct{}{}
}
func (c *FakeClient) SetupDownload(key string, v interface{}) error {
bs, err := json.Marshal(v)
if err != nil {
return err
}
c.configuredDownloads[key] = bs
return nil
}
func (c *FakeClient) UploadJSON(key string, v interface{}) error {
if _, ok := c.configuredUploads[key]; ok {
return nil
}
return fmt.Errorf(`upload for key %s was not configured`, key)
}
func (c *FakeClient) DownloadJSON(key string, v interface{}) error {
if _, ok := c.configuredDownloads[key]; !ok {
return fmt.Errorf(`download for key %s was not configured`, key)
}
j := c.configuredDownloads[key]
err := json.Unmarshal(j, v)
if err != nil {
return err
}
return nil
}
/**
* The lemmas of the sentence.
* @param props The properties to use for the {@link edu.stanford.nlp.pipeline.MorphaAnnotator}.
* @return A list of lemmatized words, one for each token in the sentence.
*/
public List<String> lemmas(Properties props) {
document.runLemma(props);
synchronized (impl) {
return lazyList(tokensBuilders, CoreNLPProtos.Token.Builder::getLemma);
}
}
/**
* Fully recalculates the given path from the user's position up to its waypoint
* NOTE: This process tries to retrieve an online route first
* @param path The path to be rerouted
* @param userPosition The user's position detected on this GPS cycle
*/
private void calculatePathFromScratch(@NonNull NavPath path, @NonNull GeoPoint userPosition, @NonNull Context c){
RoutePoint pathWaypoint = path.getWaypoint();
path.discardPointsUpTo(pathWaypoint);
Router.CallbackRouter triggerPathFromScratch = new Router.CallbackRouter() {
@Override
public void onReceivingRoute(List<RoutePoint> routePoints) {
if (routePoints != null && !routePoints.isEmpty()) {
Router.PostProcessCallback afterPostProcessing = totalPathDistance -> {
calculatingPath = false;
navGuide.onUserReadjustedPath(c);
triggerNavigationEvent.onRouteUpdateReceived(path.getPathLayers(), path.getPathMarkers());
};
routePoints.remove(routePoints.size()-1);
List<RoutePoint> finalRoutePoints = addPointBehindUser(routePoints);
path.addPoints(finalRoutePoints);
router.fillMissingPathInfo(path.getPathPoints(), path.getInstructionPoints(), afterPostProcessing);
} else {
calculatingPath = false;
cancelCurrentRoute();
navGuide.onFailingRouteUpdate(c);
triggerNavigationEvent.onRouteUpdateFailure();
}
}
@Override
public void onFailingRoute() {
calculatingPath = false;
cancelCurrentRoute();
navGuide.onFailingRouteUpdate(c);
triggerNavigationEvent.onRouteUpdateFailure();
}
};
router.calculateRoute(userPosition, pathWaypoint.getGeoPoint(), getUserBearing(), Router.ROUTE_FROM_SCRATCH, triggerPathFromScratch);
calculatingPath = true;
resetLastOriginalRoutePoint();
}
//Vector2uToVector : Converts a Vector2u to a Vector
func Vector2uToVector(vec sf.Vector2u) Vector {
return Vector{
X: float32(vec.X) / scale,
Y: float32(vec.Y) / scale,
}
}
//kitchen.c - Stefano's cottage kitchen. Coded by Circe 9/20/03
#include <std.h>
#include "stefano.h"
#include <daemons.h>
inherit VAULT;
int uses;
void create() {
::create();
set_property("indoors",1);
set_property("light",2);
set_terrain(STONE_BUILDING);
set_travel(PAVED_ROAD);
set_name("A large kitchen");
set_short("A large kitchen");
set_long("%^BOLD%^%^RED%^The kitchen is swelteringly hot, with the "+
"fires constantly stoked beneath the large %^BOLD%^%^BLACK%^iron "+
"range%^BOLD%^%^RED%^ at the far side of the room. "+
"%^BOLD%^%^YELLOW%^Shiny copper pots and pans%^BOLD%^%^RED%^ hang along "+
"the walls, and there are glass fronted cupboards containing an array of "+
"fine china cups, saucers, plates, dishes, jugs and bowls. A small, cool "+
"%^RESET%^pantry %^BOLD%^%^RED%^off to one side stores fresh game and poultry, "+
"cheeses, vegetables, and eggs. Herbs and spices sit in containers in a neatly "+
"ordered wall rack. The %^RESET%^cool flagstone floor%^BOLD%^%^RED%^ contrasts "+
"with the heat of the room, except nearer the range, where even the flagstones "+
"become warmer. There is a small table and a couple of chairs at one end of the "+
"room, where a meal could be taken if one did not want to move through to a more "+
"formal setting."+
"\n");
set_smell("default","The smells of great food in preparation are enough to make your mouth water.");
set_listen("default","The clattering of pots and pans and the crackling of the fire fill the room.");
set_items(([
({"pots","pans"}) : "The pots and pans hang in neat rows. They are polished "+
"to a high shine and gleam in the flickering stovelight.",
({"rack", "herbs", "spices"}) : "The rack is well stocked with fresh and dried "+
"herbs, from mint and jasmine to rosemary and sage. The spices include turmeric, "+
"saffron, cumin and nutmeg.",
({"cupboards", "cups", "saucers", "plates", "dishes", "jugs", "bowls"}) : "All the "+
"crockery is nicely displayed within the glass fronted cupboards. It is made from "+
"fine %^BOLD%^%^WHITE%^w%^BLUE%^h%^BOLD%^%^WHITE%^ite and b%^BOLD%^%^BLUE%^l"+
"%^BOLD%^%^WHITE%^ue chi%^BOLD%^%^BLUE%^n%^BOLD%^%^WHITE%^a%^RESET%^. The cupboards "+
"themselves have been kept clean and dust free.",
"pantry" : "The pantry is open-fronted and recessed into the side wall. It is dark and "+
"cool, markedly in contrast to the rest of the kitchen. It is not obvious quite how "+
"this has been achieved. In any case, the foods stocked here look fresh and of the "+
"highest quality.",
"table" : "The table is small and plain, with a simple white linen cloth.",
"range" : "%^BOLD%^%^BLACK%^A large black iron stove and boiler, heated by a wood fire at "+
"the base. There are compartments for roasting or baking, as well as an iron top "+
"plate for heating saucepans and kettles. The boiler is able to provide a near "+
"constant supply of hot water. The %^BOLD%^%^RED%^heat%^BOLD%^%^BLACK%^ from the whole "+
"affair is immense.%^RESET%^",
]));
set_exits(([
"out" : SHOUSE"cottagemain"
]));
set_door("kitchen door",SHOUSE"cottagemain","out",0);
set_door_description("kitchen door","A simple wooden door.");
uses = random(5) + 4;
set_search("default","With such a well-stocked kitchen, it wouldn't take much to prepare a meal.");
}
void init() {
::init();
add_action("feed_em","prepare");
}
int feed_em(string str) {
string mname=TP->query_name();
if(!str) {
tell_object(TP,"You might want to try preparing a meal.");
return 1;
}
if(str == "meal"){
// if((wizardp(TP)) || (mname == "stefano")){
if(avatarp(TP) || member_array(mname,PGUESTS) != -1 || member_array(mname,POWNERS) != -1){
if(uses == 0){
tell_object(TP,"The kitchen seems to have run out. You really should discipline them.");
tell_room(ETP,""+TPQCN+" frowns deeply as "+TP->query_subjective()+" opens "+
"the cupboards and finds them lacking.",TP);
return 1;
}
tell_object(TP,"You bustle about in the kitchen, chopping and slicing, "+
"washing and marinating, preparing meat and vegetables, rolling pastry, "+
"adding spices and doing all those things good cooks do to make their "+
"meals taste just that little bit special.");
tell_room(ETP,""+TPQCN+" bustles about in the kitchen, chopping and slicing, "+
"washing and marinating, preparing meat and vegetables, rolling pastry, "+
"adding spices and doing all those things good cooks do to make their meals "+
"taste just that little bit special.",TP);
new(OBJ"stefano_food")->move(TO);
TP->force_me("get food");
uses = uses - 1;
return 1;
}
tell_object(TP,"The kitchen staff looks appalled that an uninvited guest would try to prepare food.");
tell_room(ETP,""+TPQCN+" seems about to prepare food before seeing the horrified looks "+
"on the faces of the kitchen staff.",TP);
return 1;
}
return 0;
}
import Day from './Day';
export default interface DayOccurrence {
day: Day
weight: number
}
export const sortDayOccurrencesChronologically = (a: DayOccurrence, b: DayOccurrence): number => (
a.day.date.getTime() !== b.day.date.getTime()
? a.day.date.getTime() - b.day.date.getTime()
: b.weight - a.weight
);
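The comparator sorts occurrences chronologically and, for occurrences on the same day, puts the higher weight first. A minimal Python sketch of the same ordering rule (plain dicts stand in for `DayOccurrence`):

```python
def sort_day_occurrences(occurrences):
    # Chronological by timestamp; ties broken by weight, highest first.
    return sorted(occurrences, key=lambda o: (o["day"], -o["weight"]))
```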
/**
* Example: when key is NAME and value is GALA, the resulting JSONArray will
* contain only the elements that match the above.
*
* @param key use key to reference
* @param value value of all the element that you retain
* @return list of elements where key is equal to value
*/
public String serialiseFromKey(String key, String value) {
JSONArray jsonArray = new JSONArray();
for (Element element : getList()) {
if(element.get(key).equals(value)) {
jsonArray.put(element.getRaw());
}
}
return jsonArray.toString();
}
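The method is a filter-then-serialise pass over the element list. A Python analogue of the same behaviour (names illustrative; a list of dicts stands in for `getList()`):

```python
import json

def serialise_from_key(elements, key, value):
    # Keep only the elements whose `key` field equals `value`,
    # serialising the survivors as a JSON array string.
    return json.dumps([e for e in elements if e.get(key) == value])
```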
def top_words(model, labels, vectorizer, n=10):
    # Assumes an sklearn-style topic model exposing `components_` (one weight row per label).
    vocab = {idx: word for word, idx in vectorizer.vocabulary_.items()}
    weights = {}
    for label, comp in zip(labels, model.components_):
        weights[label] = [(vocab[i], comp[i]) for i in comp.argsort()[::-1][:n]]
    return weights
package com.betmatrix.theonex.netty.util;
import java.io.IOException;
import java.util.Properties;
/**
* Created by junior on 10:39 2018/6/22.
*/
public class PropertiesUtil {
public static Properties loadPropertiesFromFile(String fileName){
Properties props = new Properties();
// Resolve relative to this class; prefix fileName with '/' for an absolute classpath path.
try (java.io.InputStream in = PropertiesUtil.class.getResourceAsStream(fileName)) {
if (in != null) {
props.load(in);
}
} catch (IOException e) {
e.printStackTrace();
}
return props;
}
}
/**
* Kick off the widget picker dialog
* @param act The Activity to use as the context for these actions
* @param widgetType The type of widget (e.g. 'default', 'media', 'notification' etc.)
*/
public void register_widget(Activity act, String widgetType) {
Log.d("HMAWM.register_widget","Register widget called with type: " + widgetType);
if (mAppWidgetManager == null) mAppWidgetManager = AppWidgetManager.getInstance(act);
if (mAppWidgetHost == null) mAppWidgetHost = new AppWidgetHost(act, R.id.APPWIDGET_HOST_ID);
int appWidgetId = mAppWidgetHost.allocateAppWidgetId();
Log.d("HMAWM.register_widget","appWidgetId allocated: " + appWidgetId);
Intent pickIntent = new Intent(AppWidgetManager.ACTION_APPWIDGET_PICK);
pickIntent.putExtra(AppWidgetManager.EXTRA_APPWIDGET_ID, appWidgetId);
ArrayList<AppWidgetProviderInfo> customInfo = new ArrayList<AppWidgetProviderInfo>();
pickIntent.putParcelableArrayListExtra(AppWidgetManager.EXTRA_CUSTOM_INFO, customInfo);
ArrayList<Bundle> customExtras = new ArrayList<Bundle>();
pickIntent.putParcelableArrayListExtra(AppWidgetManager.EXTRA_CUSTOM_EXTRAS, customExtras);
currentWidgetType = widgetType;
Functions.widget_settings_ongoing = true;
act.startActivityForResult(pickIntent, Functions.REQUEST_PICK_APPWIDGET);
} |
/***************************************************************************/
/**
* Initializes track properties and returns false if track bad or out of range.
******************************************************************************/
Bool_t AliMultDepSpecAnalysisTask::InitTrack(AliVTrack* track)
{
if(!track) {AliError("Track not available\n"); return kFALSE;}
fPt = track->Pt();
fEta = track->Eta();
AliESDtrack* esdTrack = dynamic_cast<AliESDtrack*>(track);
if(!esdTrack) {AliError("Track is not an AliESDtrack\n"); return kFALSE;}
fSigmaPt = 1./TMath::Abs(esdTrack->GetSigned1Pt())*TMath::Sqrt(esdTrack->GetSigma1Pt2());
if(fPt <= fMinPt + PRECISION) return kFALSE;
if(fPt >= fMaxPt - PRECISION) return kFALSE;
if(fEta <= fMinEta + PRECISION) return kFALSE;
if(fEta >= fMaxEta - PRECISION) return kFALSE;
if(!AcceptTrackQuality(track)) return kFALSE;
return kTRUE;
}
/*
-------------------------------------------------
Author : zlyuan
date: 2019/9/7
Description :
-------------------------------------------------
*/
package zobserver
type ActionFunc func(notifyName string, msg IMessage)
type IObserver interface {
// OnNotify is called when the notifier fires an event.
OnNotify(notifyName string, msg IMessage)
}
type observer struct {
action ActionFunc
msgtypes map[string]struct{}
is_msgtype bool
}
func (m *observer) OnNotify(notifyName string, msg IMessage) {
if !m.is_msgtype {
m.action(notifyName, msg)
return
}
if _, ok := m.msgtypes[msg.Type()]; ok {
m.action(notifyName, msg)
}
}
// NewObserver creates an observer.
func NewObserver(fn ActionFunc) IObserver {
return &observer{
action: fn,
}
}
// NewObserverWithType creates an observer that listens only for the given message types.
func NewObserverWithType(fn ActionFunc, msgtypes ...string) IObserver {
mm := make(map[string]struct{}, len(msgtypes))
for _, t := range msgtypes {
mm[t] = struct{}{}
}
return &observer{
action: fn,
msgtypes: mm,
is_msgtype: true,
}
}
// NewObserverAndReg creates an observer and registers it with the named notifier.
func NewObserverAndReg(notifyName string, fn ActionFunc) (INotifier, IObserver) {
ob := &observer{
action: fn,
}
notifier := CreateOrGerNotifier(notifyName)
notifier.Register(ob)
return notifier, ob
}
// NewObserverAndRegWithType creates an observer for the given message types and registers it with the named notifier.
func NewObserverAndRegWithType(notifyName string, fn ActionFunc, msgtypes ...string) (INotifier, IObserver) {
mm := make(map[string]struct{}, len(msgtypes))
for _, t := range msgtypes {
mm[t] = struct{}{}
}
ob := &observer{
action: fn,
msgtypes: mm,
is_msgtype: true,
}
notifier := CreateOrGerNotifier(notifyName)
notifier.Register(ob)
return notifier, ob
}
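The Go observer above either fires unconditionally or only for registered message types. A compact Python sketch of the same filtering idea (names simplified; this is not the package's API):

```python
class Observer:
    # Mirrors the Go struct: an action plus an optional message-type filter.
    def __init__(self, action, msg_types=None):
        self.action = action
        self.msg_types = set(msg_types) if msg_types is not None else None

    def on_notify(self, notify_name, msg_type, msg):
        # Unfiltered observers fire on everything; filtered ones only
        # on the message types they registered for.
        if self.msg_types is None or msg_type in self.msg_types:
            self.action(notify_name, msg)
```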
def load_web_conf(self):
self.config = self._load_config()
self.cb = self._load_convert(self.db_name)
Control strategy of electric charging station with V2G function based on DC micro-grid
The number and variety of electric vehicles (EVs) connected to the grid continue to grow rapidly, placing a heavy burden on the grid. To address this problem, the concept of V2G (Vehicle to Grid), in which electric vehicles act as a new power source for the grid, was proposed. With distributed renewable energy developing fast in recent years, charging EVs is a good way to use it. The DC micro-grid program being conducted at Xiamen University (China) includes a 150 kW peak photovoltaic system. A charging station designed on the basis of this DC micro-grid has been proposed, featuring three-way energy flow among the power grid, PV modules and electric vehicles. This paper proposes a charging station based on a DC micro-grid that can also provide V2G service, and investigates the control strategy of its converters. Finally, a MATLAB simulation is built to validate the availability and effectiveness of the proposed system and control strategy.
/* Send an HTTP POST request to the server. This assumes the http url is set.*/
int gsm_http_post(GsmState *s, char* data) {
int ld = strlen(data);
if (ld > GSM_MAX_HTTP_LEN)
return -1;
sprintf(post_content, http_header, ld);
strcat(post_content, data);
char msg[100];
int latency = GSM_TIMEOUT/2;
int lt = strlen(post_content);
sprintf(msg, "AT+QHTTPPOST=%i,%i,1\r", lt, latency/1000);
if (gsm_send_command(s, GSM_CONNECT, msg, GSM_TIMEOUT))
return -1;
delay_ms(250);
return gsm_send_command(s, GSM_OK, post_content, GSM_TIMEOUT);
}
// GetTopAlbums fetches a list of top albums listened to by the user
// from LastFM for the specified period.
//
func (u *User) GetTopAlbums(period string, page int) (ta *topAlbums, err error) {
params := map[string]string{
"user": u.Username,
"limit": u.api.GetLimit(),
"page": strconv.Itoa(page),
"period": period,
}
p := &lastfm.Provider{
Method: "user.gettopalbums",
Params: params,
Response: &ta,
Type: "GET",
}
err = u.api.Request(p)
return
} |
package com.github.exabrial.formbinding.jaxrs.testmodel;
public class TestObject {
private String key;
private int anInt;
public String getKey() {
return key;
}
public void setKey(String key) {
this.key = key;
}
public int getAnInt() {
return anInt;
}
public void setAnInt(int anInt) {
this.anInt = anInt;
}
}
// Copyright 2015-2019 Parity Technologies (UK) Ltd.
// This file is part of Parity Ethereum.
// Parity Ethereum is free software: you can redistribute it and/or modify
// it under the terms of the GNU General Public License as published by
// the Free Software Foundation, either version 3 of the License, or
// (at your option) any later version.
// Parity Ethereum is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
// GNU General Public License for more details.
// You should have received a copy of the GNU General Public License
// along with Parity Ethereum. If not, see <http://www.gnu.org/licenses/>.
//! Cache for data fetched from the network.
//!
//! Stores ancient block headers, bodies, receipts, and total difficulties.
//! Furthermore, stores a "gas price corpus" of relative recency, which is a sorted
//! vector of all gas prices from a recent range of blocks.
use std::time::{Instant, Duration};
use parity_util_mem::{MallocSizeOf, MallocSizeOfOps, MallocSizeOfExt};
use common_types::encoded;
use common_types::BlockNumber;
use common_types::receipt::Receipt;
use ethereum_types::{H256, U256};
use memory_cache::MemoryLruCache;
use stats::Corpus;
/// Configuration for how much data to cache.
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
pub struct CacheSizes {
/// Maximum size, in bytes, of cached headers.
pub headers: usize,
/// Maximum size, in bytes, of cached canonical hashes.
pub canon_hashes: usize,
/// Maximum size, in bytes, of cached block bodies.
pub bodies: usize,
/// Maximum size, in bytes, of cached block receipts.
pub receipts: usize,
/// Maximum size, in bytes, of cached chain score for the block.
pub chain_score: usize,
}
impl Default for CacheSizes {
fn default() -> Self {
const MB: usize = 1024 * 1024;
CacheSizes {
headers: 10 * MB,
canon_hashes: 3 * MB,
bodies: 20 * MB,
receipts: 10 * MB,
chain_score: 7 * MB,
}
}
}
/// The light client data cache.
///
/// Note that almost all getter methods take `&mut self` due to the necessity to update
/// the underlying LRU-caches on read.
/// [LRU-cache](https://en.wikipedia.org/wiki/Cache_replacement_policies#Least_Recently_Used_.28LRU.29)
pub struct Cache {
headers: MemoryLruCache<H256, encoded::Header>,
canon_hashes: MemoryLruCache<BlockNumber, H256>,
bodies: MemoryLruCache<H256, encoded::Body>,
receipts: MemoryLruCache<H256, Vec<Receipt>>,
chain_score: MemoryLruCache<H256, U256>,
corpus: Option<(Corpus<U256>, Instant)>,
corpus_expiration: Duration,
}
impl Cache {
/// Create a new data cache with the given sizes and gas price corpus expiration time.
pub fn new(sizes: CacheSizes, corpus_expiration: Duration) -> Self {
Cache {
headers: MemoryLruCache::new(sizes.headers),
canon_hashes: MemoryLruCache::new(sizes.canon_hashes),
bodies: MemoryLruCache::new(sizes.bodies),
receipts: MemoryLruCache::new(sizes.receipts),
chain_score: MemoryLruCache::new(sizes.chain_score),
corpus: None,
corpus_expiration,
}
}
/// Query header by hash.
pub fn block_header(&mut self, hash: &H256) -> Option<encoded::Header> {
self.headers.get_mut(hash).cloned()
}
/// Query hash by number.
pub fn block_hash(&mut self, num: BlockNumber) -> Option<H256> {
self.canon_hashes.get_mut(&num).map(|h| *h)
}
/// Query block body by block hash.
pub fn block_body(&mut self, hash: &H256) -> Option<encoded::Body> {
self.bodies.get_mut(hash).cloned()
}
/// Query block receipts by block hash.
pub fn block_receipts(&mut self, hash: &H256) -> Option<Vec<Receipt>> {
self.receipts.get_mut(hash).cloned()
}
/// Query chain score by block hash.
pub fn chain_score(&mut self, hash: &H256) -> Option<U256> {
self.chain_score.get_mut(hash).map(|h| *h)
}
/// Cache the given header.
pub fn insert_block_header(&mut self, hash: H256, hdr: encoded::Header) {
self.headers.insert(hash, hdr);
}
/// Cache the given canonical block hash.
pub fn insert_block_hash(&mut self, num: BlockNumber, hash: H256) {
self.canon_hashes.insert(num, hash);
}
/// Cache the given block body.
pub fn insert_block_body(&mut self, hash: H256, body: encoded::Body) {
self.bodies.insert(hash, body);
}
/// Cache the given block receipts.
pub fn insert_block_receipts(&mut self, hash: H256, receipts: Vec<Receipt>) {
self.receipts.insert(hash, receipts);
}
/// Cache the given chain scoring.
pub fn insert_chain_score(&mut self, hash: H256, score: U256) {
self.chain_score.insert(hash, score);
}
/// Get gas price corpus, if recent enough.
pub fn gas_price_corpus(&self) -> Option<Corpus<U256>> {
let now = Instant::now();
self.corpus.as_ref().and_then(|&(ref corpus, ref tm)| {
if *tm + self.corpus_expiration >= now {
Some(corpus.clone())
} else {
None
}
})
}
/// Set the cached gas price corpus.
pub fn set_gas_price_corpus(&mut self, corpus: Corpus<U256>) {
self.corpus = Some((corpus, Instant::now()))
}
/// Get the memory used.
pub fn mem_used(&self) -> usize {
self.malloc_size_of()
}
}
// This is a fast method; a more exhaustive implementation is possible.
impl MallocSizeOf for Cache {
fn size_of(&self, _ops: &mut MallocSizeOfOps) -> usize {
self.headers.current_size()
+ self.canon_hashes.current_size()
+ self.bodies.current_size()
+ self.receipts.current_size()
+ self.chain_score.current_size()
// `self.corpus` is skipped
}
}
#[cfg(test)]
mod tests {
use super::Cache;
use std::time::Duration;
#[test]
fn corpus_inaccessible() {
let duration = Duration::from_secs(20);
let mut cache = Cache::new(Default::default(), duration.clone());
cache.set_gas_price_corpus(vec![].into());
assert_eq!(cache.gas_price_corpus(), Some(vec![].into()));
{
let corpus_time = &mut cache.corpus.as_mut().unwrap().1;
*corpus_time = *corpus_time - duration;
}
assert!(cache.gas_price_corpus().is_none());
}
}
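The corpus getter serves the cached value only while `stored_at + ttl >= now`. A Python sketch of that expiry rule (illustrative; an injectable clock makes the check testable without sleeping):

```python
import time

class ExpiringValue:
    # Mirrors the corpus check: the value is served only while
    # stored_at + ttl >= now; after that the getter reports nothing.
    def __init__(self, value, ttl_seconds, now=None):
        self.value = value
        self.ttl = ttl_seconds
        self.stored_at = now if now is not None else time.monotonic()

    def get(self, now=None):
        now = now if now is not None else time.monotonic()
        return self.value if self.stored_at + self.ttl >= now else None
```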
Demand Side Management : Augmenting Tool in Energy Security And Climate Change
Demand-side management (DSM) proved to be a conventional concept emanating from the need to reduce peak electricity demand, in which a power utility, such as a vertically integrated utility or a State Electricity Board (SEB), manages the demand for power among its customers to meet its current and future needs, so that utilities can defer building further capacity. The two-pronged approach of DSM, first by reducing the overall load and secondly through its various beneficial outputs, gives DSM significant scope for increasing the efficiency of system investment. The benefits of DSM programmes include alleviating electrical system emergencies, reducing blackouts, increasing system reliability and, most importantly, reducing dependency on expensive fuel imports in view of limited reserves and high energy prices. DSM encompasses a variety of utility activities designed to change the level or timing of customers' electricity demand. DSM has a major role to play in deferring lumpy investment in generation, transmission and distribution networks, with the common aim of reducing GHG emissions on a large scale. DSM is implemented either directly through utility-sponsored programmes or through market intermediaries such as Energy Service Companies (ESCOs). In India, DSM can be achieved through energy efficiency, which is the reduction of kilowatt-hours (kWh) of energy consumption, or through demand load management, which is the reduction of kilowatts (kW) of power demand or the displacement of demand to off-peak times. In the former category are programmes such as awareness generation programmes and customer or vendor rebates for efficient equipment, while the latter includes time-of-use tariffs, interruptible tariffs, direct load control, etc. The specific type of programme depends on the utility objective: peak clipping, load shifting, strategic conservation or strategic load growth.
This paper will concisely discuss the major benefits and challenges faced while implementing electricity DSM, why the Indian context is different from the rest of the world, and why conclusions derived elsewhere cannot be extrapolated to India. It simultaneously examines the DSM measures which can be undertaken to reduce energy demand and restructure more efficient and sustainable energy use in the context of the Indian electricity system.
/**
* Checks if {@code host} is contained by any of the provided {@code bases}
* <p/>
* For example "www.youtube.com" contains "youtube.com" but not "notyoutube.com" or
* "youtube.co.uk"
*
* @param host A hostname from e.g. {@link URI#getHost()}
* @param bases Any number of hostnames to compare against {@code host}
* @return If {@code host} contains any of {@code bases}
*/
public static boolean hostContains(String host, String... bases) {
if (host == null || host.isEmpty()) return false;
for (String base : bases) {
if (base == null || base.isEmpty()) continue;
final int index = host.lastIndexOf(base);
if (index < 0 || index + base.length() != host.length()) continue;
if (base.length() == host.length() || host.charAt(index - 1) == '.') return true;
}
return false;
}
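The suffix check only matches on label boundaries: the base must be the whole host or be preceded by a dot. A Python rendering of the same logic (illustrative):

```python
def host_contains(host, *bases):
    # A base matches when it is the whole host or a dot-separated suffix:
    # "www.youtube.com" matches "youtube.com" but not "tube.com".
    if not host:
        return False
    for base in bases:
        if not base or not host.endswith(base):
            continue
        if len(host) == len(base) or host[-len(base) - 1] == '.':
            return True
    return False
```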
P53 PROTEIN AND ITS FUNDAMENTAL ROLE IN THE CELL CYCLE, APOPTOSIS AND CANCER
P53 is activated in response to DNA damage, hypoxia and oncogene expression to promote cell cycle checkpoints, DNA repair, cell senescence and apoptosis. These activities are important for the suppression of tumor formation and mediate cellular responses related to cell cycle control, being both the key element and the main obstacle to the suppression of tumors. A better understanding of the apoptotic mechanism of p53 may promote the development of in vitro and in vivo assays, contributing to improved cancer diagnosis and prognosis and helping with the deployment of rational strategies that advance treatment therapies. This review is intended to present the effects of the p53 protein on cells, show how it works on the activation of specific genes to promote cell control and regulation, and clarify the mysteries involving cell regulation mediated by the p53 protein.
#include<bits/stdc++.h>
using namespace std;
typedef long long ll;
const int maxn=111;
int n;
int a[maxn];
// true if some common target value is reachable from every element via -d, 0, or +d
bool check(int d){
map<int,int> t;
for(int i=0;i<n;i++){
t[a[i]+1000]++;
t[a[i]-d+1000]++;
t[a[i]+d+1000]++;
}
for(auto p : t) if(p.second == n )return 1;
return 0;
}
int main()
{
#ifndef ONLINE_JUDGE
//freopen("B.in","r",stdin);
//freopen("B.out","w",stdout);
#endif
cin >> n;
for(int i=0;i<n;i++) cin >> a[i];
bool flag=1;
for(int i=0;i<n;i++)
    if(a[i]!=a[0]) flag = 0; // compare against a[0]; a[1] is out of range when n == 1
if(flag || n == 1) return 0*puts("0");
for(int D=1;D<=201;D++){
if(check(D)) return 0*printf("%d\n",D);
}
return 0*puts("-1");
return 0;
}
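The same brute-force idea reads naturally in Python; this sketch (illustrative, not the submitted solution) searches for the smallest d and bounds the search by the value range instead of a fixed 201:

```python
from collections import Counter

def min_equalizing_d(a):
    """Smallest d >= 0 such that adding -d, 0, or +d to each element can
    make all elements equal; -1 if impossible."""
    n = len(a)
    if len(set(a)) == 1:
        return 0
    # Any feasible d is at most the value range max(a) - min(a).
    for d in range(1, max(a) - min(a) + 1):
        counts = Counter()
        for x in a:
            # Each element can land on x - d, x, or x + d (distinct for d > 0).
            counts[x - d] += 1
            counts[x] += 1
            counts[x + d] += 1
        if any(c == n for c in counts.values()):
            return d
    return -1
```

As in the C++ `check`, a target value works exactly when every element contributes to its counter.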
|
/**
* Unlike many of the other EventReceivers, THIS one needs to be able to be torn down.
*/
ManuvrOIC::~ManuvrOIC() {
platform.kernel()->removeSchedule(&_discovery_ping);
platform.kernel()->removeSchedule(&_discovery_timeout);
  #if defined(__BUILD_HAS_THREADS)
  if (_thread_id > 0) {
    // Cancel the worker thread before clearing the stored id; clearing the id
    // first would make pthread_cancel() target thread 0.
    pthread_cancel(_thread_id);
    _thread_id = 0;
  }
  #endif
} |
/**
* Draws the outline of this Triangle. The Triangle is drawn using the
* GraphicsContext's current color.
*
* @param gc - GraphicsContext object to use for drawing.
*/
@Override
public void draw(GraphicsContext gc) {
if (this.angle == 0.0) {
gc.strokeLine(p1.getX(), p1.getY(), p2.getX(), p2.getY());
gc.strokeLine(p2.getX(), p2.getY(), p3.getX(), p3.getY());
gc.strokeLine(p3.getX(), p3.getY(), p1.getX(), p1.getY());
} else {
            Point d1 = new Point(p1.getX() - center.getX(), p1.getY() - center.getY());
Point d2 = new Point(p2.getX() - center.getX(), p2.getY() - center.getY());
Point d3 = new Point(p3.getX() - center.getX(), p3.getY() - center.getY());
d1.rotate(angle);
d2.rotate(angle);
d3.rotate(angle);
gc.strokeLine(d1.getX() + center.getX(), d1.getY() + center.getY(),
d2.getX() + center.getX(), d2.getY() + center.getY());
gc.strokeLine(d2.getX() + center.getX(), d2.getY() + center.getY(),
d3.getX() + center.getX(), d3.getY() + center.getY());
gc.strokeLine(d3.getX() + center.getX(), d3.getY() + center.getY(),
d1.getX() + center.getX(), d1.getY() + center.getY());
}
} |
Revision of Reliability Concepts for Quasibrittle Structures and Size Effect on Probability Distribution of Structural Strength
The paper demonstrates the need for a fundamental revision of reliability concepts and design codes for quasibrittle heterogeneous structures, such as concrete structures failing due to concrete fracture or crushing (rather than reinforcement yielding), or large load-bearing fiber-composite structures for ships or aircraft, sea ice plates, etc. While ductile failure occurs simultaneously along the failure surface and is characterized by an absence of size effect and a Gaussian distribution of structural strength, quasibrittle failure propagates, exhibits a strong size effect, and follows at large sizes the extreme-value statistics of the weakest-link model, which leads to a Weibull distribution of structural strength (provided that failure occurs at macro-crack initiation). Based on small- and large-size asymptotic properties recently deduced from the cohesive crack model and nonlocal Weibull theory, the transition of the cumulative probability distribution function (cdf) of structural strength from small to large sizes is modeled by a chain of fiber bundles, in which each fiber with a Weibull-type tail of strength probability corresponds to one dominant micro-bond within a representative volume element (RVE) in a brittle lower-scale microstructure. The cdf of each fiber's (or micro-bond's) strength can be deduced from the Maxwell-Boltzmann distribution of atomic thermal energies, which brings about the rescaling of the cdf according to temperature, load duration and moisture content. A fascinating by-product of the analysis, with physical implications, is that the Weibull modulus is equal to the number of dominant (simultaneously failing) micro-bonds in an RVE. The structural strength distribution is based on a chain-of-bundles model, for which a composite cdf with a Weibull tail grafted on a Gaussian core is proposed. In the small-size limit the core is totally Gaussian, and in the large-size limit totally Weibull.
In between, the grafting point moves right as the Gaussian core shrinks with increasing size. As a result, the distance from the mean to a point of tolerable failure probability (such as 10⁻⁶) nearly doubles as the size of a quasibrittle structure increases. Consequently, the understrength factor in design codes must be made size dependent, as must the Cornell and Hasofer-Lind reliability indices; their reformulation (implying replacement of FORM with 'EVRM') is proposed. Inseparable from these effects are further problems due to 'covert' understrength factors implied in the brittle-failure provisions of concrete design codes, as well as an irrational hidden size effect implied by the excessive load factor for self-weight acting alone. To improve design safety and efficiency, experts in statistical reliability and fracture mechanics will need to collaborate to tackle these problems comprehensively.
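The weakest-link argument invoked above can be summarized in one standard formula (a textbook statement of the model, not an equation quoted from this paper): with $P_1(\sigma)$ the strength cdf of one RVE and $N$ the number of RVEs in the structure,

```latex
% Survival of a chain of N RVEs requires survival of every link:
P_f(\sigma) = 1 - \left[ 1 - P_1(\sigma) \right]^{N}
% With a power-law (Weibull) tail P_1(\sigma) \approx (\sigma/s_0)^m,
% the large-N limit is the Weibull distribution:
P_f(\sigma) \;\to\; 1 - \exp\!\left[ -N \left( \sigma / s_0 \right)^{m} \right]
```

For small $N$ the Gaussian core of $P_1$ controls $P_f$, which is exactly the small-size/large-size transition that the grafted Weibull-Gaussian cdf interpolates.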
/**
*
*/
package edu.berkeley.nlp.classify;
import edu.berkeley.nlp.math.DifferentiableFunction;
import edu.berkeley.nlp.util.Pair;
/**
* @author petrov
*
*/
public interface ObjectiveFunction extends DifferentiableFunction {
<F,L> double[] getLogProbabilities(EncodedDatum datum, double[] weights, Encoding<F, L> encoding, IndexLinearizer indexLinearizer);
//Pair<Double, double[]> calculate();
public void shutdown();
}
|
/**
* Sets all locks as specified in the original UserScript. Note that the
* code may have changed since then, so an error may be thrown.
*/
public void resetLocks() {
clearLocks();
IntegerSet set = new IntegerSet();
for (IntegerSet.Interval i : _Script.locks) {
set.add(i.start, i.end);
}
        for (IntegerSet.Interval i : set) {
            try {
                _Highlighter.addHighlight(i.start, i.end + 1, _Painter);
            } catch (BadLocationException e) {
                e.printStackTrace();
            }
        }
updateButton();
} |
On Sunday, one of the many women who have accused former President Bill Clinton of sexual misconduct, Juanita Broaddrick, crushed former comedian Chelsea Handler for seemingly attempting to score partisan political points off of sexual misconduct allegations against Senate candidate Roy Moore (R-AL) while ignoring alleged victims of Democrats, such as herself.
"I'm sure you don't want to go there," slammed Broaddrick, who accused Clinton of rape in 1978.
“Imagine being molested by an older man. Then that man denies ever doing it and then goes on and gets elected to United States senate. What kind of message does that send to young girls everywhere? And men to all the men who abuse women?” wrote Handler via Twitter, targeting alleged sexual assaulter Moore.
Imagine being molested by an older man. Then that man denies ever doing it and then goes on and gets elected to United States senate. What kind of message does that send to young girls everywhere? And men to all the men who abuse women? — Chelsea Handler (@chelseahandler) November 12, 2017
“I can imagine," Broaddrick shot back. "I was raped by the Arkansas AG who then becomes Governor & President and NBC held my interview explaining the rape until after his impeachment hearing. But I'm sure you don't want to go there."
Yeah, @chelseahandler I can imagine. I was raped by the Arkansas AG who then becomes Governor & President and NBC held my interview explaining the rape until after his impeachment hearing. But I'm sure you don't want to go there. https://t.co/s9W8NZsaZ3 — Juanita Broaddrick (@atensnut) November 13, 2017
Handler, unsurprisingly, has yet to address the tweet, which has garnered over 18,000 retweets.
As noted by Fox News, "Broaddrick has said that Clinton raped her in April 1978 and that Hillary Clinton intimidated her in an effort to keep her silent about the situation. She finally gave an interview to NBC in 1999 but the network didn’t air it until Clinton’s impeachment process ended with an acquittal."
During the 2016 presidential campaign, Handler was a big-time supporter of alleged rape-enabler Mrs. Clinton.
“I felt like she knew [I was raped], it was just the look in her eyes and the anger on her face, because I was so afraid at that time of anyone knowing what had happened to me,” Broaddrick told The Daily Wire last summer.
Broaddrick recounting the moment Hillary allegedly threatened her at a political rally: “She threatened me at that fundraiser,” she said, “that’s foremost in my mind; I’ll never forget that; I’ll never forget that encounter with her that people keep minimizing, saying that, ‘Well, she didn’t mean that.’”
“Well I knew,” claimed Broaddrick. “I was there, I knew from the tone in her voice and how she changed when she said that. I knew what she meant; I just have absolutely no doubt.”
Largely silenced by the Left and the media for so many years, Broaddrick, along with other Clinton accusers, attended a pre-debate presser with then-candidate Donald Trump last year. Mrs. Broaddrick has since been vocal about what she claims happened to her at the hands of Bill and Hillary, making her pinned tweet: |
{-# LANGUAGE PatternSynonyms #-}
-- |
-- Module      : Bytecompile
-- Description : Compiles to bytecode. Executes bytecode.
-- Copyright   : (c) <NAME>, <NAME>, 2020.
-- License     : GPL-3
-- Maintainer  : <EMAIL>
-- Stability   : experimental
--
-- This module compiles modules to the BVM. It also provides an implementation
-- of the BVM to execute bytecode.
module Bytecompile (Bytecode, runBC, bcWrite, bcRead, bytecompileModule) where
import Data.Binary (Binary (get, put), Word32, decode, encode)
import Data.Binary.Get (getWord32le, isEmpty)
import Data.Binary.Put (putWord32le)
import qualified Data.ByteString.Lazy as BS
import Data.Char
import Lang
import MonadFD4
import Subst
type Opcode = Int
type Bytecode = [Opcode]
newtype Bytecode32 = BC {un32 :: [Word32]}
data Val
= I Int
| Fun Env Bytecode
| RA Env Bytecode
type Env = [Val]
type Stack = [Val]
{- This instance explains how to encode and decode 32-bit Bytecode -}
instance Binary Bytecode32 where
put (BC bs) = mapM_ putWord32le bs
get = go
where
go = do
empty <- isEmpty
if empty
then return $ BC []
else do
x <- getWord32le
BC xs <- go
return $ BC (x : xs)
{- These pattern synonyms let us write and pattern-match on the name of an
   operation instead of on the raw opcode, for example:
      f (CALL : cs) = ...
   Note that if we had written something like
      call = 5
   we could not pattern-match on `call`.
   Where possible, use these exact codes so that the same compiled bytecode
   can run on different implementations of the machine.
-}
pattern NULL = 0
pattern RETURN = 1
pattern CONST = 2
pattern ACCESS = 3
pattern FUNCTION = 4
pattern CALL = 5
pattern ADD = 6
pattern SUB = 7
pattern IFZ = 8
pattern FIX = 9
pattern STOP = 10
pattern SHIFT = 11
pattern DROP = 12
pattern PRINT = 13
pattern PRINTN = 14
pattern SKIP = 15
pattern TAILCALL = 16
bc :: MonadFD4 m => Term -> m Bytecode
bc (V _ (Bound i)) = return [ACCESS, i]
bc (Const _ (CNat n)) = return [CONST, n]
bc (Lam _ _ _ tm) = do
ts <- bt tm
let len = length ts
return $ [FUNCTION, len] ++ ts
bc (App _ tm1 tm2) = do
ts1 <- bc tm1
ts2 <- bc tm2
return $ ts1 ++ ts2 ++ [CALL]
bc (Print _ str tm) = do
ts <- bc tm
let itr = stringToUnicode str
return $ ts ++ [PRINT] ++ itr ++ [NULL, PRINTN]
bc (BinaryOp _ op tm1 tm2) = do
ts1 <- bc tm1
ts2 <- bc tm2
let x = parseOp op
return $ ts1 ++ ts2 ++ [x]
where
parseOp Add = ADD
parseOp Sub = SUB
bc (Fix _ _ _ _ _ tm) = do
ts <- bt tm
let len = length ts
return $ [FUNCTION, len] ++ ts ++ [FIX]
bc (Let _ _ _ tm1 tm2) = do
ts1 <- bc tm1
ts2 <- bc tm2
return $ ts1 ++ [SHIFT] ++ ts2 ++ [DROP]
bc (IfZ _ tmb tmt tmf) = do
tsb <- bc tmb
tst <- bc tmt
tsf <- bc tmf
  let tLen = length tst + 2 -- must skip over the SKIP opcode plus the false-branch length
let fLen = length tsf
return $ tsb ++ [IFZ, tLen] ++ tst ++ [SKIP, fLen] ++ tsf
bc (V _ (Free _)) = undefined
bc (V _ (Global _)) = undefined
bt :: MonadFD4 m => Term -> m Bytecode
bt (App _ tm1 tm2) = do
ts1 <- bc tm1
ts2 <- bc tm2
return $ ts1 ++ ts2 ++ [TAILCALL]
bt (IfZ _ tmb tmt tmf) = do
tsb <- bc tmb
tst <- bt tmt
tsf <- bt tmf
let tLen = length tst
return $ tsb ++ [IFZ, tLen] ++ tst ++ tsf
bt (Let _ _ _ tm1 tm2) = do
ts1 <- bc tm1
ts2 <- bt tm2
return $ ts1 ++ [SHIFT] ++ ts2
bt t = do
tt <- bc t
return $ tt ++ [RETURN]
stringToUnicode :: String -> [Int]
stringToUnicode xs = map ord xs
type Module = [Decl Term]
bytecompileModule :: MonadFD4 m => Module -> m Bytecode
bytecompileModule xs = do
tm <- declToLet xs
ys <- bc tm
return $ ys ++ [PRINTN, STOP]
-- Could this be written as a fold?
declToLet :: MonadFD4 m => Module -> m Term
declToLet [DeclFun _ _ _ body] = return $ global2Free body
declToLet (DeclFun pos name ty body : xs) = do
tm <- declToLet xs
let bodyf = global2Free body
return $ Let pos name ty bodyf (close name tm)
declToLet _ = undefined -- to appease the linter
-- | Takes bytecode, encodes it and writes it to a file
bcWrite :: Bytecode -> FilePath -> IO ()
bcWrite bs filename = BS.writeFile filename (encode $ BC $ fromIntegral <$> bs)
---------------------------
-- * Bytecode execution
---------------------------
-- | Reads a file and decodes it to bytecode
bcRead :: FilePath -> IO Bytecode
bcRead filename = map fromIntegral <$> un32 <$> decode <$> BS.readFile filename
runBC :: MonadFD4 m => Bytecode -> m ()
runBC xs = runBC' xs [] []
runBC' :: MonadFD4 m => Bytecode -> Env -> Stack -> m ()
runBC' (CONST : n : c) e s = runBC' c e (I n : s)
runBC' (ADD : c) e (I n : I m : s) = runBC' c e (I (m + n) : s)
runBC' (ADD : _) _ _ = failFD4 "Error executing ADD"
runBC' (SUB : c) e (I n : I m : s) =
let k = max 0 (m - n)
in runBC' c e (I k : s)
runBC' (SUB : _) _ _ = failFD4 "Error executing SUB"
runBC' (ACCESS : i : c) e s = runBC' c e (e !! i : s)
runBC' (CALL : c) e (v : Fun ef cf : s) = runBC' cf (v : ef) (RA e c : s)
runBC' (CALL : _) _ _ = failFD4 "Error executing CALL"
runBC' (FUNCTION : l : c) e s = runBC' (drop l c) e (Fun e (take l c) : s)
runBC' (RETURN : _) _ (v : (RA e c) : s) = runBC' c e (v : s)
runBC' (RETURN : _) _ _ = failFD4 "Error executing RETURN"
runBC' (SHIFT : c) e (v : s) = runBC' c (v : e) s
runBC' (DROP : c) (_ : e) s = runBC' c e s
runBC' (PRINTN : c) e st@(I n : _) = do
printFD4 (show n)
runBC' c e st
runBC' (PRINTN : _) _ _ = failFD4 "Error executing PRINTN"
runBC' (PRINT : c) e s = printStr c e s
runBC' (FIX : c) e (Fun _ cf : s) =
let efix = Fun efix cf : e
in runBC' c e (Fun efix cf : s)
runBC' (FIX : _) _ _ = failFD4 "Error executing FIX"
runBC' (STOP : _) _ _ = return ()
runBC' (IFZ : tLen : c) e (I n : s) =
if n == 0
then runBC' c e s
else runBC' (drop tLen c) e s
runBC' (IFZ : _) _ _ = failFD4 "Error executing IFZ"
runBC' (SKIP : len : c) e s = runBC' (drop len c) e s
runBC' (TAILCALL : _) _ (v : Fun ef cf : s) = runBC' cf (v : ef) s
runBC' (TAILCALL : _) _ _ = failFD4 "Error executing TAILCALL"
runBC' _ _ _ = failFD4 "Unexpected instruction or stack state"
printStr :: MonadFD4 m => Bytecode -> Env -> Stack -> m ()
printStr (NULL : c) e s = runBC' c e s
printStr (char : c) e s = do
printFD4Char (chr char)
printStr c e s
printStr _ _ _ = failFD4 "Error while unpacking the string"
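To see the dispatch loop of `runBC'` in miniature, here is a toy Python model of just the arithmetic subset (CONST/ADD/SUB/PRINTN/STOP), using the opcode values from the pattern synonyms above; it is an illustration, not the FD4 implementation:

```python
# Opcode values taken from the pattern synonyms above.
CONST, ADD, SUB, STOP, PRINTN = 2, 6, 7, 10, 14

def run(code):
    """Execute the arithmetic subset; return the list of printed numbers."""
    out, stack, pc = [], [], 0
    while True:
        op = code[pc]
        if op == CONST:                     # push immediate operand
            stack.append(code[pc + 1]); pc += 2
        elif op == ADD:
            n, m = stack.pop(), stack.pop()
            stack.append(m + n); pc += 1
        elif op == SUB:                     # natural subtraction, clamped at 0
            n, m = stack.pop(), stack.pop() # as in runBC': max 0 (m - n)
            stack.append(max(0, m - n)); pc += 1
        elif op == PRINTN:                  # print top of stack, keep it there
            out.append(stack[-1]); pc += 1
        elif op == STOP:
            return out
```

For example, `run([CONST, 2, CONST, 3, ADD, PRINTN, STOP])` mirrors how the BVM would evaluate and print `2 + 3`.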
import { createGlobalStyle } from 'styled-components';
import WorkSans from '../assets/fonts/WorkSans-VariableFont_wght.ttf';
import Sora from '../assets/fonts/Sora-VariableFont_wght.ttf';
import { theme } from './theme';
export const GlobalStyle = createGlobalStyle`
body {
-webkit-font-smoothing: antialiased;
box-sizing: border-box;
margin: 0;
color: ${theme.palette.text.primary};
background-color: ${theme.palette.background.main};
font-family: 'Work Sans', sans-serif;
}
*, *::before, *::after {
box-sizing: inherit;
}
*:focus {
outline: none;
}
h1 {
font-weight: 300;
margin: 0;
margin-bottom: .5rem;
}
h2 {
font-weight: 600;
margin: 0;
margin-bottom: .5rem;
}
@font-face {
font-family: 'Work Sans';
src: url(${WorkSans}) format('truetype');
font-weight: 400;
font-style: normal;
}
@font-face {
font-family: 'Sora';
src: url(${Sora}) format('truetype');
font-weight: 600;
font-style: normal;
}
`;
|
def _expression_list_to_conjunction(expression_list):
    """Convert a non-empty list of expressions into a right-nested u'&&' conjunction."""
if not isinstance(expression_list, list):
raise AssertionError(u'Expected list. Received {}: '
u'{}'.format(type(expression_list).__name__, expression_list))
if len(expression_list) == 0:
raise AssertionError(u'Received empty expression_list '
u'(function should never be called with empty list): '
u'{}'.format(expression_list))
elif len(expression_list) == 1:
return expression_list[0]
else:
remaining_conjunction = _expression_list_to_conjunction(expression_list[1:])
return BinaryComposition(u'&&', expression_list[0], remaining_conjunction) |
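To see the right-nesting concretely, here is a sketch with a minimal stand-in for `BinaryComposition` (the real class lives in the surrounding compiler package; this namedtuple is only illustrative):

```python
from collections import namedtuple

# Stand-in for the real BinaryComposition expression node (illustrative only).
BinaryComposition = namedtuple('BinaryComposition', ['operator', 'left', 'right'])

def expression_list_to_conjunction(expression_list):
    """Right-nest a non-empty list of expressions under '&&'."""
    if len(expression_list) == 1:
        return expression_list[0]
    return BinaryComposition(u'&&', expression_list[0],
                             expression_list_to_conjunction(expression_list[1:]))

tree = expression_list_to_conjunction(['a', 'b', 'c'])
# tree is BinaryComposition('&&', 'a', BinaryComposition('&&', 'b', 'c'))
```

The first element always becomes the left operand, so `[a, b, c]` yields `a && (b && c)`.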
def recipe(self, backbone=True):
buf = self.vol/10
enzyme = self.vol/100
plasmid = 4/25*self.vol
m2 = """
After gel purification, add 1µL CIP to backbone directly, place at 37 for 30min
"""
if backbone is True:
message1 = 'plasmid {0}'.format(self.plasmid)
message2 = m2
else:
message1 = 'insert {0}'.format(self.insert)
message2 = 'Run a gel and gel/pcr purify.'
    if self.e2 != '':
water = self.vol - buf - 2*enzyme - plasmid
s = """
Digestion for {0}:
10x Buffer = {1}µL
Enzyme {2} = {3}µL
Enzyme {4} = {3}µL
Plasmid = {5}µL
H2O = {6}µL
{7}
""".format(message1, buf, self.e1, enzyme, self.e2, plasmid, water, message2)
else:
water = self.vol - buf - enzyme - plasmid
s = """
Digestion for {0}:
10x Buffer = {1}µL
Enzyme {2} = {3}µL
Plasmid = {4}µL
H2O = {5}µL
Place at 37 degrees for 30 minutes.
{6}
""".format(message1, buf, self.e1, enzyme, plasmid, water, message2)
print(s)
return s |
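As a sanity check on the proportions encoded above (10% buffer, 1% per enzyme, 16% plasmid, remainder water), the arithmetic for a hypothetical 50 µL single-enzyme digest works out as:

```python
vol = 50.0                 # total reaction volume in µL (example value)
buf = vol / 10             # 10x buffer: one tenth of the volume
enzyme = vol / 100         # enzyme: one hundredth of the volume
plasmid = 4 * vol / 25     # plasmid: 16% of the volume
water = vol - buf - enzyme - plasmid   # water fills the remainder
print(buf, enzyme, plasmid, water)     # 5.0 0.5 8.0 36.5
```

The volumes always sum back to `vol`, so the recipe scales linearly with the chosen reaction size.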
Browsing Models for Hypermedia Databases
Hypertext can be simply defined as the creation and representation of links between discrete pieces of data. When this data can be graphics, or sound, as well as text or numbers, the resulting structure is referred to as hypermedia. The strengths of hypermedia arise from its flexibility in storing and retrieving knowledge. Any piece of information, whether it be text, graphics, sound, numerical data, etc., can be linked to any other piece of information. In many ways, the problems of hypermedia stem from the very flexibility that is its chief advantage and justification. It is difficult to maintain a sense of where things are in a relatively unstructured network of information. While the associative nature of hypermedia increases the availability of large amounts of diverse information, this very diversity makes it easy for information and users to get lost. Hypermedia exacerbates the problem of getting "lost in information space" by providing a complex associative structure that can be traversed, but not fully visualized. Information gets lost because it becomes difficult to organize and tag effectively, while users get lost as they lose sense of where they are in the hypermedia. Getting lost or disoriented occurs when one doesn't know where one is. Solutions to the problem of disorientation in hypermedia appear to fall into two general classes. First, one can create maps or browsers that allow users to determine where they are in terms of the overall network, or regions thereof. Second, one can create tags, markers or milestones which represent familiar locations, much as a lighthouse signals location in the middle of a foggy night. This paper reports basic research on the identification of landmarks in a hypermedia application.
MOSCOW (Reuters) - Russia and the world’s top energy user China may jointly develop six floating nuclear power plants (NPPs), Russia’s nuclear export body said on Tuesday, a further joint energy project since the signing of a $400 billion gas supply deal.
Rusatom Overseas, the export branch of state nuclear reactor monopoly Rosatom, said it signed a memorandum of understanding with China on the development of floating NPPs from 2019.
“Floating NPPs can provide a reliable power supply not only to remote settlements but also to large industrial facilities such as oil platforms,” Rusatom Overseas Chief Executive Dzhomart Aliev said in a statement.
Hit by European and U.S. sanctions in response to the crisis in Ukraine, Russia is eager to diversify its economy away from the West. Following this new strategy, Russian state monopoly Gazprom signed a $400 billion deal with China in May after 10 years of negotiation. [ID:nL6N0O92T1]
Rosatom plans to launch the world’s first floating NPP in 2018. This mobile, small-capacity nuclear thermal power plant, best suited to remote regions, will be based in Chukotka in Russia’s far east.
/*
* Invalidate the D-caches, but no write back please
*/
static void sh2a__flush_invalidate_region(void *start, int size)
{
unsigned long v;
unsigned long begin, end;
unsigned long flags;
begin = (unsigned long)start & ~(L1_CACHE_BYTES-1);
end = ((unsigned long)start + size + L1_CACHE_BYTES-1)
& ~(L1_CACHE_BYTES-1);
local_irq_save(flags);
jump_to_uncached();
if (((end - begin) >> PAGE_SHIFT) >= MAX_OCACHE_PAGES) {
__raw_writel(__raw_readl(CCR) | CCR_OCACHE_INVALIDATE, CCR);
} else {
for (v = begin; v < end; v += L1_CACHE_BYTES)
sh2a_invalidate_line(CACHE_OC_ADDRESS_ARRAY, v);
}
back_to_cached();
local_irq_restore(flags);
} |
/*
* Set frequency of internal CW generator common to both DAC channels
*
* clk_8m_div: 0b000 - 0b111
* frequency: range 0x0001 - 0xFFFF
*
*/
void DAC_Module::dac_frequency_set(int clk_8m_div, int frequency)
{
REG_SET_FIELD(RTC_CNTL_CLK_CONF_REG, RTC_CNTL_CK8M_DIV_SEL, clk_8m_div);
SET_PERI_REG_BITS(SENS_SAR_DAC_CTRL1_REG, SENS_SW_FSTEP, frequency, SENS_SW_FSTEP_S);
} |
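The mapping from these two register fields to an output frequency is not given above; a commonly cited relation for the ESP32 is f ≈ (8 MHz / (1 + clk_8m_div)) · frequency / 2¹⁶. Treat both the formula and the nominal 8 MHz clock as assumptions to verify against the chip's technical reference manual. A quick helper to estimate it:

```python
RTC8M_HZ = 8_000_000  # nominal RTC fast clock (assumption; actual value varies)

def dac_cw_frequency_hz(clk_8m_div, fstep):
    """Estimate the CW generator output frequency for the register values above."""
    dig_clk = RTC8M_HZ / (1 + clk_8m_div)   # divided 8 MHz source clock
    return dig_clk * fstep / 65536          # fstep is a 16-bit phase increment

print(dac_cw_frequency_hz(0, 8))  # ~976.6 Hz with the nominal clock
```

Doubling `clk_8m_div`'s divisor halves the output frequency, so coarse tuning comes from the divider and fine tuning from `fstep`.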
// //////////////////////////////////////////////////////////////////////////////
//
// RMG - Reaction Mechanism Generator
//
// Copyright (c) 2002-2011 Prof. <NAME> (<EMAIL>) and the
// RMG Team (<EMAIL>)
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the "Software"),
// to deal in the Software without restriction, including without limitation
// the rights to use, copy, modify, merge, publish, distribute, sublicense,
// and/or sell copies of the Software, and to permit persons to whom the
// Software is furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
// FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
// DEALINGS IN THE SOFTWARE.
//
// //////////////////////////////////////////////////////////////////////////////
package jing.rxn;
import jing.chem.*;
import java.util.*;
import jing.chemUtil.*;
import jing.chemUtil.Arc;
import jing.chemUtil.Graph;
import jing.param.Global;
import jing.rxnSys.Logger;
// ## package jing::rxn
// ----------------------------------------------------------------------------
// jing\rxn\ReactionAdjList.java
// ----------------------------------------------------------------------------
// ## class ReactionAdjList
public class ReactionAdjList {
protected LinkedList actions = new LinkedList(); // ## attribute actions
protected int productNumber; // ## attribute productNumber
protected int reactantNumber; // ## attribute reactantNumber
// Constructors
// ## operation ReactionAdjList(int,int)
public ReactionAdjList(int p_reactantNumber, int p_productNumber) {
// #[ operation ReactionAdjList(int,int)
reactantNumber = p_reactantNumber;
productNumber = p_productNumber;
// #]
}
public ReactionAdjList() {
}
// ## operation addAction(Action)
public void addAction(Action p_action) {
// #[ operation addAction(Action)
if (actions == null)
actions = new LinkedList();
actions.add(p_action);
// #]
}
// ## operation convertBenzene(Arc)
public void convertBenzene(Arc p_arc) {
// #[ operation convertBenzene(Arc)
// #]
}
// ## operation generateReverse()
public ReactionAdjList generateReverse() {
// #[ operation generateReverse()
ReactionAdjList reverse = new ReactionAdjList(productNumber,
reactantNumber);
Iterator iter = actions.iterator();
while (iter.hasNext()) {
Action action = (Action) iter.next();
Action newAction = action.generateReverse();
reverse.addAction(newAction);
}
return reverse;
// #]
}
// ## operation getActions()
public Iterator getActions() {
// #[ operation getActions()
Iterator iter = actions.iterator();
return iter;
// #]
}
// ## operation mutate(Graph)
public void mutate(Graph p_graph) throws InvalidActionException {
// #[ operation mutate(Graph)
try {
Iterator act_iter = getActions();
LinkedHashSet changedAtom = new LinkedHashSet();
while (act_iter.hasNext()) {
Action act = (Action) act_iter.next();
switch (act.type) {
case Action.CHANGE_BOND: {
// locate the atoms linked by changed bond
Iterator iter = act.getSite();
Integer a1 = (Integer) iter.next();
Node n1 = p_graph.getCentralNodeAt(a1.intValue());
Integer a2 = (Integer) iter.next();
Node n2 = p_graph.getCentralNodeAt(a2.intValue());
// locate the changed bond
Integer changedOrder = (Integer) act.getElement();
Arc arc = p_graph.getArcBetween(n1, n2);
Object oldBond = arc.getElement();
if (oldBond instanceof Collection) {
LinkedHashSet b = new LinkedHashSet();
Iterator bond_iter = ((Collection) oldBond).iterator();
while (bond_iter.hasNext()) {
Bond thisBond = (Bond) bond_iter.next();
b.add(thisBond.changeBond(changedOrder.intValue()));
}
arc.setElement(b);
} else if (oldBond instanceof Bond) {
Bond b = ((Bond) oldBond).changeBond(changedOrder
.intValue());
arc.setElement(b);
} else {
throw new InvalidBondException("change bond");
}
Node leftNode = null;
if (n1.includeFgElementInChemNodeElement(FGElement
.make("Cdd"))) {
Iterator a_iter = n1.getNeighbor();
while (a_iter.hasNext()) {
Arc a = (Arc) a_iter.next();
if (a != arc)
leftNode = n1.getOtherNode(a);
}
}
n1.changeChemNodeElement(changedOrder.intValue(), leftNode);
if (n2.includeFgElementInChemNodeElement(FGElement
.make("Cdd"))) {
Iterator a_iter = n2.getNeighbor();
while (a_iter.hasNext()) {
Arc a = (Arc) a_iter.next();
if (a != arc)
leftNode = n2.getOtherNode(a);
}
}
n2.changeChemNodeElement(changedOrder.intValue(), leftNode);
changedAtom.add(n1);
changedAtom.add(n2);
break;
}
case Action.BREAK_BOND: {
// locate the atoms linked by changed bond
Iterator iter = act.getSite();
Integer a1 = (Integer) iter.next();
Node n1 = p_graph.getCentralNodeAt(a1.intValue());
Integer a2 = (Integer) iter.next();
Node n2 = p_graph.getCentralNodeAt(a2.intValue());
// locate the broken bond
Bond b = (Bond) act.getElement();
Arc arc = p_graph.getArcBetween(n1, n2);
try {
if ((Bond) (arc.getElement()) != b) {
throw new InvalidActionException("break bond");
}
} catch (ClassCastException e) {
throw new InvalidBondException("break bond");
}
// break the bond element
p_graph.removeArc(arc);
changedAtom.add(n1);
changedAtom.add(n2);
break;
}
case Action.FORM_BOND: {
// locate the atoms linked by changed bond
Iterator iter = act.getSite();
Integer a1 = (Integer) iter.next();
Node n1 = p_graph.getCentralNodeAt(a1.intValue());
Integer a2 = (Integer) iter.next();
Node n2 = p_graph.getCentralNodeAt(a2.intValue());
// form the new bond and add it to graph
Bond b = (Bond) act.getElement();
p_graph.addArcBetween(n1, b, n2);
changedAtom.add(n1);
changedAtom.add(n2);
break;
}
case Action.GAIN_RADICAL: {
// locate the atom
Iterator iter = act.getSite();
Integer a = (Integer) iter.next();
Node n = p_graph.getCentralNodeAt(a.intValue());
// get the number of radical gained
Integer r = (Integer) act.getElement();
String spin = null;
if (iter.hasNext())
spin = (String) iter.next();
// new a radical or set the new radical order
Object oldAtom = n.getElement();
if (oldAtom instanceof Collection) {
LinkedHashSet atom = new LinkedHashSet();
Iterator atom_iter = ((Collection) oldAtom).iterator();
while (atom_iter.hasNext()) {
ChemNodeElement thisAtom = (ChemNodeElement) atom_iter
.next();
atom.add(thisAtom.changeRadical(r.intValue(), spin));
}
n.setElement(atom);
} else if (oldAtom instanceof ChemNodeElement) {
ChemNodeElement atom = ((ChemNodeElement) oldAtom)
.changeRadical(r.intValue(), spin);
n.setElement(atom);
} else {
throw new InvalidActionException();
}
changedAtom.add(n);
break;
}
case Action.LOSE_RADICAL: {
// locate the atom
Iterator iter = act.getSite();
Integer a = (Integer) iter.next();
Node n = p_graph.getCentralNodeAt(a.intValue());
// get the number of radical lost
Integer r = (Integer) act.getElement();
String spin = null;
if (iter.hasNext())
spin = (String) iter.next();
// new a radical or set the new radical order
Object oldAtom = n.getElement();
if (oldAtom instanceof Collection) {
LinkedHashSet atom = new LinkedHashSet();
Iterator atom_iter = ((Collection) oldAtom).iterator();
while (atom_iter.hasNext()) {
ChemNodeElement thisAtom = (ChemNodeElement) atom_iter
.next();
atom.add(thisAtom.changeRadical(-r.intValue(), spin));
}
n.setElement(atom);
} else if (oldAtom instanceof ChemNodeElement) {
ChemNodeElement atom = ((ChemNodeElement) oldAtom)
.changeRadical(-r.intValue(), spin);
n.setElement(atom);
} else {
throw new InvalidActionException();
}
changedAtom.add(n);
break;
}
default:
throw new InvalidActionException("unknown action");
}
}
Iterator iter = changedAtom.iterator();
while (iter.hasNext()) {
Node node = (Node) iter.next();
node.updateFeElement();
node.updateFgElement();
}
return;
} catch (UnknownSymbolException e) {
Logger.logStackTrace(e);
Logger.critical("Unknown symbols: " + e.getMessage());
System.exit(0);
}
// #]
}
/**
* Requires: p_reactants is an ordered linked list, where at index1 there is reactant1's graph, and at indext2,
* there is reactant2's graph. the reactedSite ArrayList is the combined ordered reaction sites. Effects: according
* to the reaction adjlist, react at reaction site to form product's graph. Modifies:
*/
// ## operation react(LinkedList)
public LinkedList react(LinkedList p_reactants)
throws InvalidReactantException, InvalidProductException,
InvalidReactantNumberException, InvalidProductNumberException {
// #[ operation react(LinkedList)
if (p_reactants.size() != reactantNumber) {
throw new InvalidReactantNumberException();
}
Graph graph = null;
if (reactantNumber == 1) {
Graph g1 = (Graph) p_reactants.get(0);
if (g1 == null) {
throw new InvalidReactantException();
}
graph = Graph.copy(g1);
} else if (reactantNumber == 2) {
Graph g1 = (Graph) p_reactants.get(0);
Graph g2 = (Graph) p_reactants.get(1);
if (g1 == null || g2 == null) {
throw new InvalidReactantException();
}
graph = Graph.combine(g1, g2);
} else {
throw new InvalidReactantNumberException();
}
mutate(graph);
if (graph == null) {
throw new InvalidProductException();
}
LinkedList productGraph = graph.partition();
if (productGraph.size() != productNumber) {
throw new InvalidProductNumberException();
}
return productGraph;
// #]
}
// ## operation reactChemGraph(LinkedList)
public LinkedList reactChemGraph(LinkedList p_reactants)
throws InvalidChemGraphException, ForbiddenStructureException {
double pT = System.currentTimeMillis();
// #[ operation reactChemGraph(LinkedList)
LinkedList reactants = new LinkedList();
LinkedList products = new LinkedList();
for (Iterator iter = p_reactants.iterator(); iter.hasNext();) {
Object o = iter.next();
Graph g = null;
if (o instanceof ChemGraph)
g = ((ChemGraph) o).getGraph();
else
g = ((Species) o).getChemGraph().getGraph();
reactants.add(g);
}
LinkedList productGraph = null;
try {
productGraph = react(reactants);
} catch (InvalidProductNumberException e) {
throw new InvalidProductNumberException(e.getMessage());
}
for (Iterator iter = productGraph.iterator(); iter.hasNext();) {
Graph pg = (Graph) iter.next();
/*
* if (ChemGraph.isForbiddenStructure(pg)) throw new ForbiddenStructureException(pg.toString());
*/
String name = null;
ChemGraph pcg = ChemGraph.make(pg, true);
// Species ps = Species.make(name, pcg);
products.add(pcg);
}
Global.RT_reactChemGraph += (System.currentTimeMillis() - pT) / 1000 / 60;
return products;
// #]
}
// ## operation reactFunctionalGroup(LinkedList)
public LinkedList reactFunctionalGroup(LinkedList p_reactants) {
// #[ operation reactFunctionalGroup(LinkedList)
if (p_reactants.size() != reactantNumber) {
throw new InvalidReactantNumberException();
}
LinkedList productGraph = null;
LinkedList allProduct = new LinkedList();
if (reactantNumber == 1) {
Matchable g1 = (Matchable) p_reactants.get(0);
if (g1 == null) {
throw new InvalidReactantException();
}
LinkedList reactant = new LinkedList();
if (g1 instanceof FunctionalGroupCollection) {
Iterator iter = ((FunctionalGroupCollection) g1)
.getFunctionalGroups();
while (iter.hasNext()) {
Graph thisGraph = ((FunctionalGroup) iter.next())
.getGraph();
reactant.add(thisGraph);
productGraph = react(reactant);
allProduct.addAll(productGraph);
reactant.clear();
}
} else if (g1 instanceof FunctionalGroup) {
Graph thisGraph = ((FunctionalGroup) g1).getGraph();
reactant.add(thisGraph);
productGraph = react(reactant);
allProduct.addAll(productGraph);
} else {
throw new InvalidReactantException();
}
} else if (reactantNumber == 2) {
Matchable g1 = (Matchable) p_reactants.get(0);
Matchable g2 = (Matchable) p_reactants.get(1);
if (g1 == null || g2 == null) {
throw new InvalidReactantException();
}
boolean r1fg = (g1 instanceof FunctionalGroup);
boolean r2fg = (g2 instanceof FunctionalGroup);
boolean r1fgc = (g1 instanceof FunctionalGroupCollection);
boolean r2fgc = (g2 instanceof FunctionalGroupCollection);
LinkedList reactant = new LinkedList();
if (r1fg && r2fg) {
Graph thisGraph1 = ((FunctionalGroup) g1).getGraph();
Graph thisGraph2 = ((FunctionalGroup) g2).getGraph();
reactant.add(thisGraph1);
reactant.add(thisGraph2);
allProduct = react(reactant);
} else if (r1fg && r2fgc) {
Graph thisGraph1 = ((FunctionalGroup) g1).getGraph();
Iterator iter = ((FunctionalGroupCollection) g2)
.getFunctionalGroups();
reactant.add(thisGraph1);
while (iter.hasNext()) {
Graph thisGraph2 = ((FunctionalGroup) iter.next())
.getGraph();
reactant.add(thisGraph2);
productGraph = react(reactant);
allProduct.addAll(productGraph);
reactant.remove(thisGraph2);
}
} else if (r1fgc && r2fg) {
Graph thisGraph2 = ((FunctionalGroup) g2).getGraph();
Iterator iter = ((FunctionalGroupCollection) g1)
.getFunctionalGroups();
reactant.add(thisGraph2);
while (iter.hasNext()) {
Graph thisGraph1 = ((FunctionalGroup) iter.next())
.getGraph();
reactant.add(thisGraph1);
productGraph = react(reactant);
allProduct.addAll(productGraph);
reactant.remove(thisGraph1);
}
} else if (r1fgc && r2fgc) {
Iterator iter1 = ((FunctionalGroupCollection) g1)
.getFunctionalGroups();
while (iter1.hasNext()) {
Graph thisGraph1 = ((FunctionalGroup) iter1.next())
.getGraph();
reactant.add(thisGraph1);
Iterator iter2 = ((FunctionalGroupCollection) g2)
.getFunctionalGroups();
while (iter2.hasNext()) {
Graph thisGraph2 = ((FunctionalGroup) iter2.next())
.getGraph();
reactant.add(thisGraph2);
productGraph = react(reactant);
allProduct.addAll(productGraph);
reactant.remove(thisGraph2);
}
reactant.remove(thisGraph1);
}
}
} else {
throw new InvalidReactantNumberException();
}
LinkedList p_collection = new LinkedList();
if (productNumber == 1) {
FunctionalGroupCollection p1 = new FunctionalGroupCollection();
Iterator p_iter = allProduct.iterator();
while (p_iter.hasNext()) {
Graph pg = (Graph) p_iter.next();
String name = "";
FunctionalGroup fg = FunctionalGroup.make(name, pg);
p1.addFunctionalGroups(fg);
}
p_collection.add(p1);
} else if (productNumber == 2) {
FunctionalGroupCollection p1 = new FunctionalGroupCollection();
FunctionalGroupCollection p2 = new FunctionalGroupCollection();
/*
* int highestCentralID1 = -1; int highestCentralID2 = -1;
*/
int lowestCentralID1 = 10000;
int lowestCentralID2 = 10000;
Iterator p_iter = allProduct.iterator();
while (p_iter.hasNext()) {
Graph pg = (Graph) p_iter.next();
String name = "";
FunctionalGroup fg = FunctionalGroup.make(name, pg);
/*
* int present_HCID = pg.getHighestCentralID(); if (present_HCID <= 0) throw new
* InvalidCentralIDException(); if (present_HCID == highestCentralID1) { p1.addFunctionalGroups(fg); }
* else if (present_HCID == highestCentralID2) { p2.addFunctionalGroups(fg); } else { if
* (highestCentralID1 == -1) { p1.addFunctionalGroups(fg); highestCentralID1 = present_HCID; } else if
* (highestCentralID2 == -1) { p2.addFunctionalGroups(fg); highestCentralID2 = present_HCID; } else {
* throw new InvalidCentralIDException(); } }
*/
int present_LCID = pg.getLowestCentralID();
if (present_LCID <= 0)
throw new InvalidCentralIDException();
if (present_LCID == lowestCentralID1) {
p1.addFunctionalGroups(fg);
} else if (present_LCID == lowestCentralID2) {
p2.addFunctionalGroups(fg);
} else {
if (lowestCentralID1 == 10000) {
p1.addFunctionalGroups(fg);
lowestCentralID1 = present_LCID;
} else if (lowestCentralID2 == 10000) {
p2.addFunctionalGroups(fg);
lowestCentralID2 = present_LCID;
} else {
throw new InvalidCentralIDException();
}
}
}
p_collection.add(p1);
p_collection.add(p2);
}
return p_collection;
// #]
}
public void setActions(LinkedList p_actions) {
actions = p_actions;
}
public int getProductNumber() {
return productNumber;
}
public void setProductNumber(int p_productNumber) {
productNumber = p_productNumber;
}
public int getReactantNumber() {
return reactantNumber;
}
public void setReactantNumber(int p_reactantNumber) {
reactantNumber = p_reactantNumber;
}
}
/*********************************************************************
* File Path : RMG\RMG\jing\rxn\ReactionAdjList.java
*********************************************************************/
package rest
import (
"github.com/eyebluecn/tank/code/core"
)
//@Service
type BridgeService struct {
BaseBean
bridgeDao *BridgeDao
userDao *UserDao
}
func (this *BridgeService) Init() {
this.BaseBean.Init()
b := core.CONTEXT.GetBean(this.bridgeDao)
if b, ok := b.(*BridgeDao); ok {
this.bridgeDao = b
}
b = core.CONTEXT.GetBean(this.userDao)
if b, ok := b.(*UserDao); ok {
this.userDao = b
}
}
func (this *BridgeService) Detail(uuid string) *Bridge {
bridge := this.bridgeDao.CheckByUuid(uuid)
return bridge
}
/**
* Derived-class handler when the fragment is canceled
*/
@Override
protected final void doCancelFragment(boolean userCancel)
{
if (itsNewTask != null) {
NewTask task = itsNewTask;
itsNewTask = null;
task.cancel(false);
}
GuiUtils.setKeyboardVisible(itsPasswordInput, getActivity(), false);
if (userCancel && itsListener != null) {
itsListener.handleFileNewCanceled();
}
}
package toolbelt
import (
"net/http"
"net/http/httptest"
"testing"
)
func TestDefaultHeaderTransport(t *testing.T) {
srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
ua := r.UserAgent()
if ua != "test-agent" {
t.Fatalf("expected user-agent to be test-agent but was %q", ua)
}
clientVersion := r.Header.Get("x-client-version")
if clientVersion != "test-version" {
t.Fatalf("expected x-client-version to be test-version but was %q", clientVersion)
}
}))
defer srv.Close()
cl := NewHTTPClient("test-agent", "test-version")
resp, err := cl.Get(srv.URL)
if err != nil {
t.Fatal(err)
}
resp.Body.Close()
}
func TestClientTimeout(t *testing.T) {
cl := NewHTTPClient("test-agent", "test-version")
if cl.Timeout != DefaultTimeout {
t.Fatalf("expected client timeout to be %q but was %q", DefaultTimeout, cl.Timeout)
}
}
import java.util.NavigableMap;
import java.util.Random;
import java.util.TreeMap;

/**
 * Provides a weighted collection and a method to select a random
 * element with probability proportional to its weight.
 * @param <T> type of the weighted elements
 */
public class WeightedCollection<T> {
private final NavigableMap<Integer, T> map = new TreeMap<>();
private final Random random;
private int total = 0;
public WeightedCollection() {
this(new Random());
}
public WeightedCollection(Random random) {
this.random = random;
}
public void add(int weight, T object) {
if (weight <= 0) return;
total += weight;
map.put(total, object);
}
public T getRandom() {
int value = random.nextInt(total) + 1; // Can also use floating-point weights
return map.ceilingEntry(value).getValue();
}
}
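The Java class above keeps running cumulative weights in a TreeMap and resolves a uniform draw with `ceilingEntry`. The same cumulative-weight technique can be sketched in Python with the `bisect` module (this is an illustrative re-implementation, not part of the original code):

```python
import bisect
import random

class WeightedCollection:
    """Select items with probability proportional to their weight."""

    def __init__(self, rng=None):
        self._totals = []  # running cumulative weights, sorted by construction
        self._items = []
        self._total = 0
        self._rng = rng or random.Random()

    def add(self, weight, item):
        if weight <= 0:
            return  # non-positive weights are ignored, as in the Java version
        self._total += weight
        self._totals.append(self._total)
        self._items.append(item)

    def get_random(self):
        # Draw a value in [1, total] and find the first cumulative
        # weight >= value (the ceilingEntry lookup in the Java version).
        value = self._rng.randint(1, self._total)
        idx = bisect.bisect_left(self._totals, value)
        return self._items[idx]
```

An item added with weight 3 is drawn roughly three times as often as one added with weight 1.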
// +build linux,!gccgo
package main
/*
#cgo CFLAGS: -Wall
extern void joinmnt();
void __attribute__((constructor)) initmnt(void) {
joinmnt();
}
*/
import "C"
The nature versus nurture battle is scientifically over.
Due to the cornucopia of research pouring out, it has become patently obvious that traits of all sorts are heritable (551 traits were analyzed in this study). The study does not just examine the heritability of intelligence, but a myriad of behaviors that constitute human nature. The recent attempt to push epigenetics was a rearguard action that effectively admitted the primacy of nature. The best approach the progressives have left is denying biology by suppressing research, a real possibility because progressive elites need nurture.
Human neurological uniformity (HNU), and the corresponding belief that education fixes everything and levels the playing field of lifetime economic outcomes, is an important piece of the progressive system. It is why heretics who publicly deviate from HNU are shunned immediately, no matter their achievements, even champions like Dr. James Watson.
Higher status is correlated with education, which feeds the status-game system in which academics (the priests) grant status. Consider the nonstop messaging of 2016 that Trump supporters were uneducated. No right-thinking person wants to be considered dumb, after all.
The belief in nurture means that schools at lower levels matter, as well. This funnels money to the elementary and secondary school systems, paying off the loyal progressive foot soldiers in local education and the teachers’ unions. Countless studies can show that school spending means better test scores, while ignoring the genetic elephant in the room. The ripple effect is shoveling state and federal money to disadvantaged school districts that always seem to test lower no matter the system they employ.
Nurture, and the belief that schools can coach people up, becomes a focal point for people wanting to invest in their children, which is itself a trait that varies across groups. "Good schools" becomes code for more whites. It has to be, since freedom of association and neighborhood covenants were destroyed by the federal government and the courts. Valuing education becomes the polite, safe way to keep like-minded people around.
But unfortunately, this messaging eventually pushes minorities into white school districts hoping for the “good schools” nurture effect to come into play.
This is a key overlooked bit in the flow of people around our big cities. An example of the perverse effects can be seen in Indianapolis. Indianapolis public schools tanked decades ago, and the creep to the outer IPS schools was complete in the '90s. Blacks were fearful of "bad whites" on the city's south side and drifted north. This is clearly evident in The New York Times' census population graphics. "Good whites" were welcoming to the north. On the north side, Lawrence Township was a slightly cheaper area to live in, and so city kids crept in.
In 2002, a family could move to Lawrence Township for the schools and expect a good system. As the welcoming area took in more city kids, more families gamed the attendance rules, and eventually Lawrence Township instituted proof-of-residence requirements. This did not work. Lawrence Township steadily bled whites to the brand-new communities just north of it in Fishers, Noblesville, and McCordsville. While protected by a higher economic moat, Washington Township just to Lawrence's west is undergoing the same process. Washington Township's high school, North Central, was the basis for Glee. By 2030, North Central could be the setting for a reboot of Boyz n the Hood. South side townships still have strong white populations due to their worse reputation a generation ago.
Nurture marketing, when combined with the inability to restrict neighborhoods, becomes the economic moat for communities. Sure, income and intelligence are correlated, but if you want to live with the in-crowd, you pay a bit more. Combined with some zoning laws, this becomes a way to limit available housing and price out the riff-raff who would love to be able to go to that school if only they could afford a home there. This becomes a payoff to the earliest landholders in these areas. Their home equity spikes become fictitious wealth against which they can take loans and scarf up assets, concentrating even more wealth. These towns and cities still need the demand.
The demand then also creates the odd situation where brand new subdivisions that sprout out of forests and cornfields magically all have price points that mimic the price points of suburbs that already have quality schools. The good schools crowd becomes a consumer bloc that maximizes developers’ profits. Ten acres of cornfield that become forty homes around 2200 square feet provide one amazing return on investment for the developer and home builder that got the acreage rezoned by the little municipality.
Those are very expensive white refugee developments, all for the "good schools" that a few years ago were small, rural school districts.
If nurture were dethroned entirely, demand would subside. New England is an overwhelmingly white region of America, yet you will still hear white parents discuss the merits of one 95% white school over another 95% white school. If parents were to understand that a significant portion of the high school graduates going to Ivies from specific schools were legacy admissions, they could have a more honest appraisal of the situation. They might understand that Jimmy at Natick High will receive just as good an education as Jimmy at Wellesley High, because what mattered was Jimmy, not the money spent per student.
There is a darker reason for pushing nurture that keeps the progressives safe. The progressive coalition is a very diverse coalition, with members spanning all averages in test scores. Because no teaching program can close the achievement gap, there is always a financial sop to throw to loyal political allies. There is always cover for the progressive elite to tell their underclass clients that they are doing all they can to help: "See, look, we now spend more per student in your district than ours!" It is far harder for a diverse coalition to explain wide disparities than it is for a more homogeneous group with tighter variability.
The consequences of admitting that nature rules will keep progressive elites, academics, and pundits clinging to nurture and defending it at all costs. They will only let go when they can say they have always believed nature over nurture, during the media campaign to celebrate the FDA approval of gene-editing procedures.
import java.util.Stack;

/**
 * Demonstration of the java.util.Stack class.
 * Assumes a Month class providing a static String[] field named month.
 */
public class Stacks {
    public static void main(String[] args) {
        Stack<String> stack = new Stack<>();
        for (int i = 0; i < Month.month.length; i++) {
            stack.push(Month.month[i]);
        }
        System.out.println("stack = " + stack);
        // Treating a stack as a Vector
        stack.addElement("The last line");
        System.out.println("element 5 = " + stack.elementAt(5));
        System.out.println("popping elements: ");
        while (!stack.isEmpty()) {
            System.out.println(stack.pop());
        }
    }
}
[bitcoin-dev] Weak block thoughts...
I've been thinking about 'weak blocks' and SPV mining, and it seems to me weak blocks will make things better, not worse, if we improve the mining code a little bit. First: the idea of 'weak blocks' (hat tip to Rusty for the term) is for miners to pre-announce blocks that they're working on, before they've solved the proof-of-work puzzle. To prevent DoS attacks, assume that some amount of proof-of-work is done (hence the term 'weak block') to rate-limit how many 'weak block' messages are relayed across the network. Today, miners are incentivized to start mining an empty block as soon as they see a block with valid proof-of-work, because they want to spend as little time as possible mining a not-best chain. Imagine miners always pre-announce the blocks they're working on to their peers, and peers validate those 'weak blocks' as quickly as they are able. Because weak blocks are pre-validated, when a full-difficulty block based on a previously announced weak block is found, block propagation should be insanely fast -- basically, as fast as a single packet can be relayed across the network, the whole network could be mining on the new block. I don't see any barrier to making acceptance of the full-difficulty block and CreateNewBlock() insanely fast, and if those operations take just a microsecond or three, miners will have an incentive to create blocks with fee-paying transactions that weren't in the last block, rather than mining empty blocks. ................. A miner could try to avoid validation work by just taking a weak block announced by somebody else, replacing the coinbase and re-computing the merkle root, and then mining. They will be at a slight disadvantage to fully validating miners, though, because they WOULD have to mine empty blocks between the time a full block is found and a fully validating miner announces their next weak block. .................
Weak block announcements are great for the network; they give transaction creators a pretty good idea of whether or not their transactions are likely to be confirmed in the next block. And if we're smart about implementing them, they shouldn't increase bandwidth or CPU usage significantly, because all the weak blocks at a given point in time are likely to contain the same transactions.

--
Gavin Andresen
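The speedup described above comes from doing transaction validation when the weak block arrives, so that accepting the eventual full-difficulty block is nearly free. A toy sketch of that caching idea follows (all names and the hash-based matching are illustrative; this is not Bitcoin Core's actual relay code):

```python
import hashlib

def tx_set_id(txs):
    """Stand-in for a merkle root: hash of the ordered tx list."""
    h = hashlib.sha256()
    for tx in txs:
        h.update(tx.encode())
    return h.hexdigest()

class WeakBlockCache:
    """Remember transaction sets already validated via weak blocks."""

    def __init__(self):
        self._validated = {}

    def on_weak_block(self, txs):
        # Expensive transaction validation would happen here, ahead of time.
        self._validated[tx_set_id(txs)] = list(txs)

    def on_full_block(self, txs):
        # If the tx set was pre-announced and pre-validated,
        # acceptance is just a dictionary lookup.
        if tx_set_id(txs) in self._validated:
            return "fast-accept"
        return "full-validate"
```

The point of the sketch is only the asymmetry: the slow work runs when the weak block is relayed, not when the proof-of-work finally arrives.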
import pandas as pd
import cudf  # GPU dataframe library (RAPIDS); used when df is a cudf.DataFrame


# DataFrameType is assumed to be defined elsewhere in the package.
def _read_parquet_dispatch(df: DataFrameType):
    """Return the parquet reader matching the frame's backend."""
    if isinstance(df, pd.DataFrame):
        return pd.read_parquet
    else:
        return cudf.io.read_parquet
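The helper above routes on the concrete DataFrame type with an `isinstance` chain. With more backends, Python's `functools.singledispatch` expresses the same routing declaratively; the sketch below uses `dict` and `list` as stand-ins for the pandas and cudf frame types, since cudf needs a GPU:

```python
from functools import singledispatch


@singledispatch
def reader_for(df):
    # Fallback when no backend is registered for this type.
    raise TypeError(f"no parquet reader registered for {type(df).__name__}")


@reader_for.register
def _(df: dict):  # stand-in for pandas.DataFrame
    return "pandas.read_parquet"


@reader_for.register
def _(df: list):  # stand-in for cudf.DataFrame
    return "cudf.io.read_parquet"
```

New backends are added by registering another handler, without touching the dispatch function itself.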
// Config returns `*rest.Config` instance of Kubernetes client-go package. This
// config can be used to create a new client that will work against k3s cluster
// running in the provided container.
func Config(c *gnomock.Container) (*rest.Config, error) {
configBytes, err := ConfigBytes(c)
if err != nil {
return nil, fmt.Errorf("can't get kubeconfig bytes: %w", err)
}
kubeconfig, err := clientcmd.RESTConfigFromKubeConfig(configBytes)
if err != nil {
return nil, fmt.Errorf("can't create kubeconfig from bytes: %w", err)
}
return kubeconfig, nil
}
Electroless, Electrolytic and Galvanic Copper Deposition with the Scanning Electrochemical Microscope (SECM)
Summary The Scanning Electrochemical Microscope (SECM) can be used with different techniques of microstructured copper deposition. A first approach involves electrolytic copper deposition on noble metals, whereby copper ions are released from a complex by a suitable tip reaction and then reduced on the polarised conducting surface to form a copper microstructure. The second approach is very similar to the first, but does not involve polarising the substrate. It generates a tip-induced microgalvanic cell, the positive electromotive force of which is constituted by two electrochemical reactions at different areas of the substrate. Finally, electroless copper deposition is performed on nonconducting surfaces like glass or semiconducting surfaces like silicon. This involves locally reducing a suitable precursor film previously immobilised on the surface.
from .transformer import GeoJsonTransformer
a = [int(i) for i in input().split()]
n = a[0]
m = a[1]
x = min(n, m)
print(x + 1)
if n == x:
    for i in range(x + 1):
        print(i, x - i)
else:
    for i in range(x + 1):
        print(x - i, i)
def isContinuationLine(self, line, priorStatus):
    if(line.strip() == ""):
        return priorStatus
    if(re.search(PythonExplicitContinuationRegex, line) != None):
        return languageSwitcher.CONTINUATION_EXPLICIT
    else:
        BracketStack = []
        for char in line:
            if(char in "{[("):
                BracketStack.append(char)
            elif(char == "}"):
                if(BracketStack != [] and (BracketStack[-1] == "(" or BracketStack[-1] == "[")):
                    raise InvalidCodeException("Braces: {} did not match in the code.")
                elif(BracketStack != [] and BracketStack[-1] == "{"):
                    BracketStack.pop()
                else:
                    BracketStack.append(char)
            elif(char == "]"):
                if(BracketStack != [] and (BracketStack[-1] == "(" or BracketStack[-1] == "{")):
                    raise InvalidCodeException("Brackets: [] did not match in the code.")
                elif(BracketStack != [] and BracketStack[-1] == "["):
                    BracketStack.pop()
                else:
                    BracketStack.append(char)
            elif(char == ")"):
                if(BracketStack != [] and (BracketStack[-1] == "{" or BracketStack[-1] == "[")):
                    raise InvalidCodeException("Parentheses: () did not match in the code.")
                elif(BracketStack != [] and BracketStack[-1] == "("):
                    BracketStack.pop()
                else:
                    BracketStack.append(char)
        if(BracketStack == []):
            if(priorStatus in [languageSwitcher.CONTINUATION, languageSwitcher.CONTINUATION_START]):
                return languageSwitcher.CONTINUATION
            else:
                return languageSwitcher.NOT_CONTINUATION
        else:
            if(BracketStack[-1] in "}])"):
                return languageSwitcher.CONTINUATION_END
            else:
                return languageSwitcher.CONTINUATION_START
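The bracket-stack logic above is easier to see in isolation. Here is a simplified standalone version (plain status strings instead of the external `languageSwitcher` constants, and no prior-status or explicit-continuation handling):

```python
def continuation_status(line):
    """Classify a line by its unmatched brackets.

    Returns "start" if the line leaves brackets open, "end" if it
    closes more than it opens, and "none" if everything matched.
    """
    pairs = {")": "(", "]": "[", "}": "{"}
    stack = []
    for ch in line:
        if ch in "([{":
            stack.append(ch)
        elif ch in ")]}":
            if stack and stack[-1] == pairs[ch]:
                stack.pop()
            else:
                # Keep the unmatched closer so the tail check
                # can report a continuation ending on this line.
                stack.append(ch)
    if not stack:
        return "none"
    return "end" if stack[-1] in ")]}" else "start"
```

A call that opens a parenthesis on one line and closes it on the next yields "start" then "end", which is exactly the state the real method tracks via `priorStatus`.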
import { Directive, Input, forwardRef, OnInit, OnChanges, SimpleChanges } from '@angular/core';
import { NG_VALIDATORS, Validator, ValidatorFn, AbstractControl } from '@angular/forms';
import { notMatching } from './not-matching.validator';
const NOT_MATCHING_VALIDATOR: any = {
provide: NG_VALIDATORS,
useExisting: forwardRef(() => NotMatchingValidator),
multi: true
};
@Directive({
selector: '[ngvNotMatching][formControlName],[ngvNotMatching][formControl],[ngvNotMatching][ngModel]',
providers: [NOT_MATCHING_VALIDATOR]
})
export class NotMatchingValidator implements Validator, OnInit, OnChanges {
@Input() notMatching: string | RegExp;
private validator: ValidatorFn;
private onChange: () => void;
ngOnInit() {
this.validator = notMatching(this.notMatching);
}
ngOnChanges(changes: SimpleChanges) {
for (const key in changes) {
if (key === 'notMatching') {
this.validator = notMatching(changes[key].currentValue);
if (this.onChange) {
this.onChange();
}
}
}
}
validate(c: AbstractControl): { [key: string]: any } {
return this.validator(c);
}
registerOnValidatorChange(fn: () => void): void {
this.onChange = fn;
}
}
//
// Created by Admin on 21.09.2019.
//
#ifndef PD_CHAIN_H
#define PD_CHAIN_H
#include <AMGEngine.h>
#include <GUI/Canvas.h>
#include <GUI/styles.h>
#include <GUI/IECanvas.h>
class AMGChain : public AMGObject {
private:
std::vector<GUI::IECanvas *> IEObjects;
const float spacing = 0.01;
std::vector<GUI::IECanvas *>::iterator moving_from, moving_to;
GUI::AMGCanvas * moving_overlay;
int moving_focus = 0;
vecmath::Vec2 last_touch;
public:
AMGChain() : AMGObject(GUI::BOX) {
// moving_overlay = new GCanvas();
// moving_overlay->GAttachShaders("Shaders/VS_ShaderPlain.vsh", "Shaders/ShaderPlainColor.fsh");
// moving_overlay->GSetColor(1, 1, 1, 0.2);
// moving_overlay->GSetVisible(false);
// GAttach(moving_overlay);
auto dummy = new GUI::IECanvas("dummy");
IEObjects.push_back(dummy);
GSetDragBeginCallback([this](const vecmath::Vec2& v) -> GUI::GObject * {
std::list<GObject *> trace;
auto go = AMGObject::GFindFocusObject(v, &trace);
moving_from = std::find(IEObjects.begin(), IEObjects.end(), go);
moving_to = moving_from;
// moving_overlay->GSetVisible(true);
return this;
});
GSetDragHandlerCallback([this](const vecmath::Vec2& v) -> GUI::GObject * {
auto old_moving_to = moving_to;
while ((moving_to != IEObjects.begin()) && (v.x < (*moving_to)->global.c.x)){
moving_to --;
}
while ((moving_to != IEObjects.end() - 2) && (v.x > (*moving_to)->global.c.x + (*moving_to)->global.s.x)){
moving_to ++;
}
if (!(*moving_to)->GContains(v)) moving_overlay->GSetVisible(false);
if (old_moving_to != moving_to){
// auto position = (*moving_to)->globalPosition.toRelative(this->globalPosition);
// moving_overlay->place(position.x, position.y, (*moving_to)->z);
// moving_overlay->setHeight(position.height);
// moving_overlay->setWidth(position.width);
// moving_overlay->GSetVisible(true);
}
last_touch = v;
return this;
});
GSetDragEndCallback([](const vecmath::Vec2& v) -> GUI::GObject * {
// if ((*moving_to)->globalPosition.GContains(last_touch)){
// auto go = (*moving_from);
// AMGChainDel(moving_from - AMGObjects.begin());
// AMGChainInsert(go, moving_to - AMGObjects.begin());
// }
// moving_overlay->GSetVisible(false);
return nullptr;
});
}
inline bool ARender(double beat, float * lsample, float * rsample) override {
for (auto const &ieo : IEObjects) {
ieo->ARender(beat, lsample, rsample);
}
return true;
}
inline void MIn(MData cmd) override {
IEObjects.front()->MIn(cmd);
}
inline void MConnect(MObject * mo) override {
IEObjects.back()->MConnect(mo);
}
inline void MDisconnect(MObject * mo) override {
IEObjects.back()->MDisconnect(mo);
}
void GDraw(NVGcontext *nvg) override {
if (changed) {
float cur_ratio = 0;
int i = 0;
for (auto const &obj : IEObjects) {
obj->GPlace({cur_ratio / local.ratio + spacing * i, 0});
obj->GSetHeight(1);
cur_ratio += obj->local.ratio;
i++;
}
}
// nvgBeginPath(nvg);
// nvgRect(nvg,
// global.c.x,
// global.c.y,
// global.s.x, global.s.y);
// nvgFillColor(nvg, BLUE);
// nvgFill(nvg);
}
inline void AMGChainPushBack(GUI::IECanvas * ieo) {
auto size = IEObjects.size();
if (size > 1) {
IEObjects[size - 2]->MDisconnect(IEObjects.back());
IEObjects[size - 2]->MConnect(ieo);
}
ieo->MConnect(IEObjects.back());
GSetRatio(local.ratio + ieo->local.ratio + spacing);
IEObjects.insert(IEObjects.end() - 1, ieo);
GAttach(ieo);
changed = true;
}
inline void AMGChainPushFront(GUI::IECanvas * ieo) {
ieo->MConnect(IEObjects[0]);
IEObjects.insert(IEObjects.begin(), ieo);
GSetRatio(local.ratio + ieo->local.ratio + spacing);
GAttach(ieo);
changed = true;
}
inline void AMGChainInsert(GUI::IECanvas * ieo, int pos) {
auto size = IEObjects.size();
if (pos > size) pos = size;
if (pos < 0) pos = 0;
if (pos == size - 1) AMGChainPushBack(ieo);
else if (pos == 0) AMGChainPushFront(ieo);
else {
IEObjects[pos - 1]->MDisconnect(IEObjects[pos]);
IEObjects[pos - 1]->MConnect(ieo);
ieo->MConnect(IEObjects[pos]);
GSetRatio(local.ratio + ieo->local.ratio + spacing);
IEObjects.insert(IEObjects.begin() + pos, ieo);
GAttach(ieo);
changed = true;
}
}
inline void AMGChainDel(int pos) {
auto size = IEObjects.size();
GSetRatio(local.ratio - IEObjects[pos]->local.ratio);
if (pos > 0 && pos < size - 1) {
IEObjects[pos]->MDisconnect(IEObjects[pos + 1]);
IEObjects[pos - 1]->MDisconnect(IEObjects[pos]);
IEObjects[pos - 1]->MConnect(IEObjects[pos + 1]);
GDetach(IEObjects[pos]);
IEObjects.erase(IEObjects.begin() + pos);
} else if (pos == 0 && size > 1) {
IEObjects[0]->MDisconnect(IEObjects[1]);
GDetach(IEObjects[pos]);
IEObjects.erase(IEObjects.begin() + pos);
}
changed = true;
}
inline void MRender(double beat) override {
for (auto const& mo : IEObjects) mo->MRender(beat);
}
// GObject * GFindFocusObject(const vecmath::Vec2& point) override
// {
// if (visible && globalPosition.GContains(point)){
// for (auto const &gr : Graphics) {
// auto fo = gr->GFindFocusObject(point);
// if (fo){
// if (fo == this) return parent;
// if (fo == gr) return this;
// return fo;
// }
// }
// }
//
// return nullptr;
// }
virtual GObject *GFindFocusObject(const vecmath::Vec2 &point, std::list<GObject *> * trace) override {
for (auto const &gr : Graphics) {
auto fo = gr->GFindFocusObject(point, trace);
if (fo) {
trace->push_front(this);
if (fo == this) return parent;
if (fo == gr) return this;
return fo;
}
}
if (visible && GContains(point)) {
trace->push_front(this);
return this;
}
return nullptr;
}
void GSetVisible(bool visible_) override {
AMGObject::GSetVisible(visible_);
// moving_overlay->GSetVisible(false);
}
};
#endif //PD_CHAIN_H
Not to be confused with Hepatology
Herpetology (from Greek ἑρπετόν herpetón, meaning "reptile" or "creeping animal") is the branch of zoology concerned with the study of amphibians (including frogs, toads, salamanders, newts, and caecilians (gymnophiona)) and reptiles (including snakes, lizards, amphisbaenids, turtles, terrapins, tortoises, crocodilians, and the tuataras). Birds, which are cladistically included within Reptilia, are traditionally excluded here; the scientific study of birds is the subject of ornithology.
Thus, the definition of herpetology can be more precisely stated as the study of ectothermic (cold-blooded) tetrapods. Under this definition "herps" (or sometimes "herptiles" or "herpetofauna") exclude fish, but it is not uncommon for herpetological and ichthyological scientific societies to "team up", publishing joint journals and holding conferences in order to foster the exchange of ideas between the fields, as the American Society of Ichthyologists and Herpetologists does. Many herpetological societies have been formed to promote interest in reptiles and amphibians, both captive and wild.
Herpetology offers benefits to humanity in the study of the role of amphibians and reptiles in global ecology, especially because amphibians are often very sensitive to environmental changes, offering a visible warning to humans that significant changes are taking place. Some toxins and venoms produced by reptiles and amphibians are useful in human medicine. For example, some snake venoms have been used to create anticoagulants that treat strokes and heart attacks.
package dcr
import (
"context"
"io/ioutil"
"sync"
"time"
"github.com/Shopify/sarama"
"github.com/decred/dcrd/rpc/jsonrpc/types"
"github.com/decred/dcrd/rpcclient/v4"
"github.com/golang/glog"
)
func rpcGetWork(wg *sync.WaitGroup, client *rpcclient.Client, workCh chan *types.GetWorkResult, errCh chan error) {
glog.Info("Calling getwork")
defer wg.Done()
work, err := client.GetWork()
if err != nil {
glog.Errorf("Failed to get work: %v", err)
errCh <- err
} else {
glog.Infof("Get work result: %s", *work)
workCh <- work
}
}
func rpcLoop(ctx context.Context, wg *sync.WaitGroup, producer sarama.AsyncProducer) error {
certs, err := ioutil.ReadFile(rpcCertificate)
if err != nil {
glog.Fatalf("Failed to load RPC certificate: %v", err)
return err
}
workCh := make(chan *types.GetWorkResult, 16)
notifyCh := make(chan bool, 16)
errCh := make(chan error)
ntfnHandlers := rpcclient.NotificationHandlers{
OnBlockConnected: func(blockHeader []byte, transactions [][]byte) {
glog.Infof("Block connected")
notifyCh <- true
},
}
connCfg := &rpcclient.ConnConfig{
Host: rpcAddress,
Endpoint: "ws",
User: rpcUsername,
Pass: <PASSWORD>,
Certificates: certs,
DisableConnectOnNew: true,
}
client, err := rpcclient.New(connCfg, &ntfnHandlers)
if err != nil {
glog.Errorf("Failed to create RPC client: %v", err)
return err
}
defer client.WaitForShutdown()
err = client.Connect(ctx, true)
if err != nil {
glog.Errorf("Failed to connect to RPC server: %v", err)
return err
}
network, err := client.GetCurrentNet()
if err != nil {
glog.Errorf("Failed to get network type: %v", err)
return err
}
glog.Infof("DCR network type: %s", network)
err = client.NotifyBlocks()
if err != nil {
glog.Errorf("Failed to register block notifications: %v", err)
return err
}
intervalTimer := time.NewTimer(rpcInterval)
notifyCh <- true
for {
select {
case work := <-workCh:
wg.Add(1)
go processWork(wg, network, work, producer)
intervalTimer.Stop()
intervalTimer.Reset(rpcInterval)
case <-intervalTimer.C:
wg.Add(1)
go rpcGetWork(wg, client, workCh, errCh)
case <-notifyCh:
wg.Add(1)
go rpcGetWork(wg, client, workCh, errCh)
case <-errCh:
intervalTimer.Stop()
intervalTimer.Reset(rpcInterval)
case <-ctx.Done():
client.Shutdown()
return nil
}
}
}
Imprints of security challenges on vernacular architecture of northern Nigeria: a study on Borno State
Security challenges are known to have diverse negative impacts on all facets of human endeavour across the world. However, in a country like Nigeria, which faces myriad security challenges ranging from armed bandits to insurgencies by terrorist groups, the impacts of security challenges on architecture have not been adequately explored. This research examined how security challenges have influenced the architectural forms and spatial morphology of vernacular architecture in Northern Nigeria, using three Local Government Areas of Borno State (Maiduguri, Nganzai and Monguno) as a case study. The primary data were gathered through observations, photographic materials and oral interviews with randomly selected residents in the study area. These were complemented with a review of published literature. The results of content analysis reveal a gradual change in Northern Nigerian vernacular architecture to accommodate the myriad security challenges confronting the region. Specifically, a shift from traditional circular houses with thatch roofs to rectangular houses, the use of modern building materials, and the emergence of gated communities were observed in the study area. The study also found a decline in traditional decorations and paintings on domestic buildings due to the relocation of practitioners from the study area. The study concludes that in spite of the daunting security challenges, Nigerians should not abandon their rich heritage of vernacular architecture; rather, indigenous and modern architectural principles should be integrated in evolving secure human settlements in the country.
Introduction
Throughout history, humans have made conscious efforts to gain increasing control over the physical environment and freedom from its limitations. One of the strategies employed to achieve this is architecture, which among other things provides shelter, one of the three basic necessities of life alongside food and clothing. Basically, shelter gives protection from both natural and man-made elements such as wild animals, natural disasters, harsh weather conditions, terror attacks and other unwholesome activities that threaten human wellbeing and existence. These factors influence building form, choice of materials and methods of construction, leading to a unique architectural theory peculiar to each locality.
Architectural theory can be described as the concepts used to explain the components that influence the design style and forms of buildings and structures in a particular geographical setting. Nigeria possesses several traditional architectural forms in line with her different sub-geographic regions, socio-economic backgrounds, cultural settings and weather conditions. Globally and locally, traditional architecture reflects the cultural values of a people and represents their common heritage; the need for it to respond to the material, spiritual and social attributes of society therefore cannot be over-emphasized. The Islamic religion of northern Nigeria has a strong influence on the predominant architectural theory of the region, which is known for its courtyard system, circular plan shapes, monolithic walls, domed roofs and appropriate construction technology. Today, however, the architectural form of the region is characterized by the post-modern buildings of the 1990s, with sprawling new design concepts incorporating foreign building materials and borrowed methods of construction. As a result, contemporary architecture in Northern Nigeria is devoid of the design features that portray indigenous Nigerian architecture, regardless of its established socio-cultural, economic and environmental benefits.
Notably, very little effort has been devoted to exploring the inherent characteristics of the emerging architectural form in this part of Nigeria, while government policies and architects have continued to jettison traditional architecture for modern architecture in recent times. Northern vernacular architecture has from time immemorial been a source of pride to Nigeria, with tremendous benefits for tourism. The current situation compelled the authors to inquire into the extent to which current security issues play a role in changing the architectural form and spatial morphology of Hausa settlements in northern Nigeria.
This research sought to examine how security affects the architectural forms of contemporary Northern Nigeria, using Hausa house forms and architecture as a case study. The following specific objectives were pursued in the study:
• To highlight the architectural form of northern Nigerian traditional architecture;
• To examine the spatial morphology of the traditional Hausa settlement pattern; and
• To explore how security issues have affected the architectural form and spatial morphology of the traditional settlement pattern in the study area.
This study was informed by the observed effect of security issues on all facets of human activity, locally and globally. On this basis, the study contributes to knowledge by identifying the specific areas in which current security challenges are impacting the vernacular architecture of Hausa land in north-east Nigeria. This information is vital in the search for strategies for achieving sustainable human settlements in the face of the social and environmental challenges confronting Nigeria and other developing countries.
Nigeria in the Context of the Global Terrorism Index
Nigeria has been greatly affected by terror attacks from different religious sects. She remains the third most terrorism-affected country in the world, a position she has held since 2015, as shown in the 2019 Global Terrorism Index (GTI) report (see Figure 2). Nigeria appears in the same group as countries like Afghanistan and Iraq, both of which have experienced war over the last decade. With jihadist groups such as the Islamic State in West Africa Province (ISWAP) and Boko Haram gaining power and dominance, Nigeria has undergone ten years of regionalized armed conflict, while acts of terrorism, violent extremism and kidnapping have increased unabated. The results include the loss of human lives and of property worth billions of Naira. Among the most affected aspects are the housing and livelihoods of the people. Consequently, the region accounts for the largest number of internally displaced persons (IDPs) on the African continent.
Figure 2: Ten countries most impacted by terrorism, ranked by number of deaths.
Source: START GTD, IEP calculations.
Figure 3 summarises data extracted from the United Nations database on terrorism in Nigeria. The Global Terrorism Index statistics indicate that Nigeria is among the ten countries most impacted by terrorism; these ten countries alone accounted for around 87% of deaths from terrorism in 2018, with Nigeria contributing about 13% of these deaths and a total of 562 incidents. The security situation in Nigeria is currently deteriorating significantly, and the levels of human rights violations and mass killings in the North-East region of the country, where the vast majority of victims are Muslims (Hausa), are alarming. This poses a serious threat to economic development and international security. According to Okeke et al., the increasing rate of terrorism, in Africa in general and Nigeria in particular, requires adequate attention from government and counter-terrorism professionals, who must devise measures to deal with terror-related issues and prevent further escalation in the country.
2. Spatial Settlement Pattern in Hausa land
The religion of Islam, together with trade across the Sahara, is considered to have had the greatest impact on Hausa settlement patterns and on the local building and construction practices associated with this ethnic group in northern Nigeria. It should be noted that only a thin line separates urban from rural settlements in Hausa land. Scholars see the urban settlement as part and parcel of the surrounding rural areas and rural lifestyle, called "Anguwani", much like the rural community. Popoola also reported that the rural landscape of Hausa settlement was, and still is, dominated by nucleated villages, with the possibility of some expansion of compounds. The architectural language demonstrates a simple hierarchical arrangement of dwellings: starting from the Gida (compound), basically made up of an extended family and usually sub-divided into units each containing family houses; to the Kauaye (village), essentially a collection of matrilineal family groups in a nucleated homestead planned for agriculture; followed by the farmland (Gona), which is most often the adjoining space separating one village from the other "Kwauyika" (villages) by "Daji" (forest). At the apex of the Hausa settlement is the Gari (town). There is thus a clear hierarchy in the spatial planning of a typical Hausa settlement, resulting in a distinct spatial morphology and architectural form.
The historic towns of Hausa land reflect a compact nucleated settlement surrounded by defensive walls. The towns are organically and conceptually categorized into three fragments: 'Cikin Gari' (inner core), 'Tsakiyar Gari' (central core) and 'Wajen Gari' (outer core), surrounded by a thick mud wall called 'Ganuwa' (city wall) and accessed through gates known as 'Kofa'.
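Purely as our own illustration (not from the source), the hierarchical settlement units described above, Gida (compound), Kauaye (village) and Gari (town), with Gona (farmland) adjoining each village, can be sketched as a simple nested data structure:

```python
# Illustrative model of the Hausa settlement hierarchy; the class names
# follow the terms in the text, while the numbers are invented examples.
from dataclasses import dataclass


@dataclass
class Gida:
    """Compound housing an extended family, sub-divided into households."""
    households: int


@dataclass
class Kauaye:
    """Nucleated village: a collection of family compounds plus farmland."""
    compounds: list
    gona_hectares: float = 0.0  # adjoining farmland (Gona)


@dataclass
class Gari:
    """Town at the apex of the hierarchy, made up of villages."""
    villages: list

    def total_households(self):
        return sum(g.households for v in self.villages for g in v.compounds)


town = Gari(villages=[
    Kauaye(compounds=[Gida(3), Gida(5)], gona_hectares=12.0),
    Kauaye(compounds=[Gida(4)], gona_hectares=8.5),
])
print(town.total_households())  # 12
```

The nesting mirrors the text's point that the compound-to-town hierarchy is explicit and countable, which is what gives Hausa settlements their distinct spatial morphology.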
The key concept underpinning the spatial settlement pattern in Hausa land is the triple core of spaces in traditional compounds, and it translates directly to the concept of the town: the compound wall with its entrance mirrors the city wall with its entrance gate. This pattern has evolved over the years through the interplay of the religious and socio-cultural values of the people.
Architectural form of northern Nigeria
Generally speaking, Hausa communities in northern Nigeria are united by a common language and religion. Some scholars have argued that, to the Hausa, Islam is not only a religion but a way of life. Hence its introduction into Hausa land had a mammoth impact on their architecture and building construction, directly influencing the use of conical curvilinear and mud-dome roof structures, as seen in Figures 7c and 8c. Also traceable to the religion are the ideologies of privacy, security, seclusion of women (Purdah) and segregation between male and female. These have given rise to the traditional courtyard design of the family compound shown in Figure 6. The associated building form is also prevalent in the West African savannah areas of the Niger and Chad river basins. In Nigeria, northern vernacular architecture incorporates a courtyard encircled by several rooms, enabling future expansion to house more occupants, including wives, children and slaves, as the family size increases. As Islamic architectural design principles insist on seclusion and privacy for girls and women, a traditional Hausa residence is conceptually subdivided into three parts: an inner core (private area), a central core (semi-private area) and an outer core (public area), as shown in Figure 4.
Figure 4: A traditional layout of a northern residence.
The inner core consists of the women's area, wards and servants' area, with space at the backyard for animal husbandry and refuse disposal. Further to this is an open-air courtyard, mainly in the central core, for domestic and other family activities, which also fulfils the building-services functions of lighting and ventilation. The outer core is the "Zaure", the seat of the master (man) of the house. As explained by Adamu, its major functions include security, reception, privacy, morals, protection, ethnic ideas, decoration and administration. Historically, this concept is believed to have originated around 500 CE in Egyptian domestic architecture. Hausa traditional village layouts of shelter and settlement have likewise manifested in urban spatial morphology. As regards the compound setting, Hausa domestic architecture is influenced by "Purdah" (seclusion of women), described in terms of Haremlik and Selamlik areas (see Figure 5), meaning non-accessible and accessible zones respectively. Whereas the Selamlik is the part of the house kept for men, the Haremlik is the private part meant for women and traditionally forbidden to male strangers. The idea behind the design concept is that visitors are not permitted beyond the Selamlik unless they are members of the immediate household. This results in large compounds that consume fairly large areas of land to accommodate the various spaces required for daily activities.
4. Research methods
The study adopted a case study research design and followed a qualitative approach involving non-participant observation and oral interviews with 18 randomly selected locals in the study area. The scope was limited to Borno State in northern Nigeria on the basis of purposive sampling: Borno has been the State most affected by terrorist attacks, and was thus considered the most appropriate source of data for the stated research objectives. Furthermore, three Local Government Areas in the State with copious evidence of severe terrorist attacks were purposively selected for investigation: Maiduguri, the state capital, with a land mass of 137.36 km² and the highest population in the State (540,016); Nganzai, with a land mass of 2,572.35 km² and a population of 99,074; and Monguno, with a land mass of 1,993.20 km² and a population of 109,834. These three LGAs had witnessed destruction of human lives and property from terrorist attacks in recent times, yet were considered fairly secure at the time of the research, hence their selection.
The primary data collection involved the authors' observations, captured in sketches and photographs and centered on the buildings and structures existing in the localities. In addition, oral interviews were conducted with residents of the study area, focusing on the impact of terrorist activities on houses in particular, and on entire settlements in general, based on the residents' experience of their respective settlements. The interviews were recorded electronically and later transcribed into text documents for analysis. These sources were complemented with anecdotal evidence on security and sustainable architecture. The secondary data were obtained from a review of published literature relevant to the research area, identified from various sources including journals, workshop proceedings and conference papers. Papers were identified via searches on online databases such as Google Scholar, the United Nations database and ScienceDirect, among others. The data were subjected to thematic content analysis, and the results are presented using sketches, pictures and text. Based on the results, inferences were drawn regarding the contribution of security challenges to the rapid abandonment of, and change in, the vernacular architecture of Northern Nigeria.
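The thematic content analysis step might be sketched as follows: transcribed interview text is coded against a small set of predefined themes by counting occurrences of theme keywords. The themes, keywords and transcript snippets below are invented for illustration only and are not the study's actual coding scheme or data:

```python
# Minimal sketch of keyword-based thematic coding of interview transcripts.
# Themes, keyword lists and transcript texts are hypothetical examples.
from collections import Counter
import re

themes = {
    "materials": {"mud", "thatch", "concrete", "cement", "zana"},
    "security": {"attack", "bandits", "fence", "gate", "terror"},
}

transcripts = [
    "We replaced the thatch roof because bandits can set it ablaze.",
    "Most new houses use concrete blocks and a strong gate.",
]

counts = Counter()
for text in transcripts:
    # Tokenise to lowercase words, then count keyword hits per theme.
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    for theme, keywords in themes.items():
        counts[theme] += len(tokens & keywords)

print(dict(counts))  # {'materials': 2, 'security': 2}
```

In practice thematic analysis is interpretive rather than purely lexical, but a frequency pass like this is a common first step for surfacing candidate themes before manual coding.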
Features of Hausa traditional compound in northern Nigeria
The study found a pronounced separation between exterior and interior spaces, strongly emphasized in Hausa architecture. The built-up areas within the interior spaces contain living and sleeping rooms, a kitchen, an entrance point, stores and conveniences. The compound wall is raised high enough to ensure privacy and prevent passers-by from seeing what goes on inside. As seen in Figure 6, the courtyard is well accentuated, and there are usually three common denominators of space: the courtyard for household and social activities; the kitchen, located far away within the compound; and the dining area, used individually or collectively in the parlour or open area. For health and other reasons, privies or latrines are located away from, or at the end of, the compound.
Figure 6: A typical Hausa traditional compound in northern Nigeria.
Source: Authors' sketches.
In addition, a relatively large space in the vicinity of the neighbourhood is necessary for congregational assembly to celebrate social functions such as appointments, weddings and naming ceremonies, and for children's playgrounds. Open spaces are also used to rear small livestock and are reserved for future development as the need arises.
Security-induced changes in architectural form and spatial morphology of settlements in Northern Nigeria
The findings of this study are presented in this section in the form of photographs showing the existing housing architectural forms, the spatial morphology of settlements and building materials, as concrete evidence to support the discussion in the subsequent section.
6. Discussion
Available evidence from this study indicates that terrorism and insurgency have had tremendous negative impacts on housing forms in the Hausa land of Northern Nigeria. Notable among these impacts is the destruction of lives and buildings, especially in the north-eastern part of Nigeria. This scenario has affected the architectural form of Northern Nigeria in two major areas: a preference for modern building materials over traditional materials, and the emergence of more fenced houses with strong iron gates in what are popularly referred to as gated communities. Some of the footprints of insecurity on the architectural form of northern Nigeria are discussed in the following paragraphs.
The indigenous building materials of Hausa land consist mainly of earth, timber, reeds (grasses) and stones, used within a dynamic building process. Adobe or earth (mud) has been the most common and readily available material, while the most widely employed method of construction is wattle-and-daub, which helps to create a notable built environment with attendant architectural merits. Currently, however, this walling system, shown in Figures 7(a-c), is considered deficient in tensile strength, which renders it vulnerable to the impacts of potential attacks or threats. As a result, there is now massive use of masonry and concrete walling systems with greater potential to resist the impacts of explosives and bombs, as shown in Figures 9(a-c). In the vernacular architectural forms identified with this region, the walling system involves pear-shaped, sun-dried mud (adobe) bricks prepared from earth (the red laterite soil) harvested locally and popularly referred to as Jankasa. This soil is rich in fiber and therefore produces excellent material for walls and roofs when skillfully handled by experienced craftsmen. Non-load-bearing walls, as seen in Figures 7(a-c), are made of grasses, thatch or raffia palm knitted together, called a Zana fence. According to one of the informants interviewed, these can easily be set ablaze by bandits or pulled down by attackers, hence the need to jettison such construction materials in contemporary times in the study area.
The traditional need to meet the structural requirement of stability led to an increase in wall thickness in the study area, firstly in response to safety requirements and secondly because of the advantage of thick walls in creating cooler indoor environments, together with the availability of cheap labour to build them. At present, however, security has become a major factor in the development of residential buildings and their surroundings in the study area. Among other effects, this has resulted in the gradual replacement of traditional building materials with Portland cement, steel bars, aluminum and concrete components, as seen in Figures 9(a-c). It is believed that, based on the Islamic injunction that women in purdah be secluded from the outer male reception area, these newly adopted (foreign) building materials will strengthen the security of homes and achieve privacy better than the traditional earth-based materials. Furthermore, as our informants revealed, the prevailing security challenges in the study area play a major role in deciding whether a settlement is nucleated or dispersed. According to them, the more compact dwellings are, the better they can resist and withstand terror attacks. This submission is consistent with the assertion of Agboola and Zango on the spatial settlement pattern of Hausa land, who note that security plays a major role in deciding whether a settlement will be nucleated or dispersed. Arguably, the concentration of leaders and members of their families living in close proximity to the marketplace suggests that this spatial morphology is a product of the defensive instincts of the people, which might explain the emergence and dominance of the nucleated settlement pattern in Hausa communities, as alluded to by those interviewed.
For local craftsmen and builders, the roof is seen as the most gratifying part of the building and also the most challenging aspect of construction, as evidenced by the decoration and effort applied to this building element. With the rising incidence of global warming and climate change, there is an increasing search for adaptive measures and strategies to mitigate their effects, locally and globally. Nigeria lies in the tropics, with extremes of dry and rainy seasons. The study area is marked by scanty rainfall and a significant difference between day and night temperatures: daytime is dominated by bright sunshine and hot, dry air, while nights can be extremely cold. The traditional roof system of thatch and/or mud, shown in Figures 7(a-c), cannot withstand these weather extremes and requires constant maintenance. To address this deficiency, traditional roofing materials are being replaced with corrugated roofing sheets, as shown in Figures 8(a-c). As observed in the study area, the construction of mud roofs involves split palm-frond pieces laid on palm-frond beams in herringbone fashion and plastered with mud on both sides. The use of flat or vaulted mud roofs in Hausa architecture indicates a method of averting the risk of fire outbreaks, which has informed the proscription of thatched roofs within the urban areas. This proscription was a response to security threats and is the source of the disparity between buildings in rural and urban areas: while structures in rural areas are characterized by circular plans with simple thatch roofs, their urban counterparts have rectangular walls with mud roofs. This building morphology has aided the adoption of corrugated iron sheets and helped their diffusion among the people.
Resulting from this is the abandonment of the erstwhile traditional thatch roofs, which are considered highly combustible and liable to spread fire in the event of the terrorist attacks repeatedly witnessed in the study area.
Furthermore, the relocation of people from their natural places of abode to other areas because of insecurity has created a void in the continuity of their architectural form and local construction methods, as supported by the comments of those interviewed. For instance, interviewees pointed out that, as a result of growing insecurity in the study area, family sizes have started to shrink and the courtyard system of family dwellings illustrated in Figure 4 is gradually going extinct. With smaller family sizes, the erstwhile compound family houses are now fragmented and are being modified to accommodate the newly formed nuclear family structure. The original architectural form reflecting the cultural values of the people is being changed to fit the prevailing situation, leading to dramatic changes in building character and in the spatial morphology of traditional settlements. Indeed, evidence from this study, shown in Figures 9(a-c), indicates that buildings today are constructed to meet the needs of the nuclear family without the usual courtyard, the three core areas of the house having been reduced to two in line with contemporary building design. This depicts a change in architectural form due to the security challenges associated with terrorism, and hence the emergence of what has been described as an "architecture of fear", which ultimately promotes progressive fragmentation of public space and a breakdown of social cohesion through spatial segregation and social discrimination. Similarly, as the demand for agricultural space or compound farmland (Gona) found in the traditional Hausa dwelling continues to wane for fear of insecurity, the spatial morphology is also changing rapidly, now without the erstwhile Gona. Compound space is consequently shrinking or becoming underutilized, a major change in the vernacular architecture of the study area.
The review of existing studies also shows that the ornamentation of building façades in northern traditional architecture was, until recently, a thing to behold. Indeed, Umar et al. asserted that Hausa land astounds visitors with its pleasing building forms, full of colourful motifs and decorations, and with the quality of its interior spaces. Adamu categorized decorations in Hausa traditional architecture into three groups, namely surface design, calligraphy and ornamentation; all three categories may be displayed on a single façade of the "Zaure", the choice depending on the status and preference of the users. With recent developments, however, the craft and practice of ornamentation has been eroded in contemporary northern Nigerian traditional architecture. This is mainly because, as noted earlier in this paper, there are no longer mud walls and roofs to decorate: walling materials have changed to concrete components and sandcrete blocks topped with concertina wire. Consequently, the craftsmen with the technical know-how to achieve such wall and surface ornamentation have fled for safety, and the practice is heading towards extinction in the study area. Surface finishes and wall decorations are now executed with paint and Plaster of Paris (P.O.P.) screed, mostly in Maiduguri, enabled by the westernization of culture and architecture, as traditional techniques have been abandoned partly because of the security challenges confronting people in the study area.
7. Conclusion and Recommendation
The study examined the imprints of security challenges on the architectural landscape of northern Nigeria, using Borno State as a case study. It found that the vernacular architecture of the Hausa land of north-east Nigeria, based on the concept of the triple space at both city and compound-family-house level, is gradually undergoing tremendous change. Specifically, the findings reveal that security challenges have greatly influenced the vernacular architecture of northern Nigeria: houses initially built with indigenous/traditional building materials are being rebuilt with modern materials, which residents consider to offer more protection in the face of incessant terrorist attacks. In addition, building forms have changed from circular houses with thatch roofs to rectangular houses with corrugated-iron-sheet roofs, burglar-proof doors, windows and verandas, while gated communities with more restricted access across and around compounds are increasingly emerging as major responses to the need for a secure residential environment. Furthermore, the study found a decline in the traditional decoration and painting of houses, such as the "Graffito" practice, in the study area.
This study offers lessons that will help policy makers and built environment professionals evolve strategies to preserve the country's architectural heritage in the face of security, environmental and economic challenges. The fact that a significant proportion of the population of Hausa land in northern Nigeria still holds on to traditional building materials in spite of the growing security challenges, for financial and religious/cultural reasons, suggests that this architectural heritage can still be preserved amid daunting security challenges. Security challenges should therefore not be a reason for Nigerians to abandon their rich heritage of traditional/vernacular architecture for western styles. In the light of this, and of the issue of climate change, policy makers and built environment professionals need to look inward and devise strategies that integrate indigenous skills and construction techniques with modern architectural and urban planning principles and practices, in evolving human settlements that enhance the security of lives and property in Northern Nigeria and any other part of the country experiencing serious security challenges. To achieve this, it is recommended that concerted efforts be made in the documentation and propagation of indigenous architectural ideas and skills, alongside ongoing research on how best to improve local building materials to meet contemporary requirements of security and the other needs of Nigerian society.
This study is not without limitations. Notably, its findings are limited to the three Local Government Areas of Borno State studied, and to the biases of those interviewed. In view of these limitations, it is recommended that further studies be extended to other locations in Northern Nigeria where there is armed conflict and violence, and that such studies consider using different research designs.
Acknowledgments
The authors wish to acknowledge the assistance of a research assistant, Dorcas Samuel, who, while on special assignment in Maiduguri, helped to take the photographs and interview the locals in their dialect.
def paths(self):
    # Lazily build and cache the ordered list of image paths.
    if self.__paths is None:
        directory = os.path.join(self.root_dir, self.dir_name)
        my_paths = []
        for filename in sorted(os.listdir(directory)):
            # The ordering pattern must capture a numeric sort key.
            match = re.match(self.ordering_pattern, filename)
            if not match:
                continue
            ordering = int(match.groups()[0])
            # Optionally filter to filenames matching img_regex.
            if self.img_regex != "" and not re.match(self.img_regex, filename):
                continue
            my_paths.append((ordering, os.path.join(directory, filename)))
        # Sort numerically by the captured ordering, then drop the key.
        my_paths = [path for _, path in sorted(my_paths)]
        # Keep only the requested subset of indices.
        self.__paths = [my_paths[i] for i in self.__idxs_to_keep]
    return self.__paths
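The lazy, cached listing pattern above can be sketched in a minimal self-contained form. The `FrameSequence` class and its `frame_<n>.png` naming scheme below are our own invention for illustration; the point is the compute-once caching and the numeric sort on the captured ordering key:

```python
import os
import re
import tempfile


class FrameSequence:
    """Minimal sketch of lazy, cached path listing with numeric ordering."""

    def __init__(self, directory, ordering_pattern=r"frame_(\d+)\.png"):
        self.directory = directory
        self.ordering_pattern = ordering_pattern
        self._paths = None  # computed on first access, then cached

    @property
    def paths(self):
        if self._paths is None:
            matched = []
            for filename in sorted(os.listdir(self.directory)):
                match = re.match(self.ordering_pattern, filename)
                if not match:
                    continue  # skip files outside the naming scheme
                matched.append((int(match.group(1)),
                                os.path.join(self.directory, filename)))
            # Sort numerically, so frame_10 follows frame_9, not frame_1.
            self._paths = [path for _, path in sorted(matched)]
        return self._paths


with tempfile.TemporaryDirectory() as d:
    for name in ["frame_10.png", "frame_2.png", "notes.txt"]:
        open(os.path.join(d, name), "w").close()
    seq = FrameSequence(d)
    print([os.path.basename(p) for p in seq.paths])  # ['frame_2.png', 'frame_10.png']
```

Sorting the `(ordering, path)` tuples rather than the raw filenames is what prevents the lexicographic trap where `frame_10` would sort before `frame_2`.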
// Verify compares the expected values and the observed outcome, returning
// an error if the verification failed.
func (c Check) Verify() error {
    // Ensure the response body is always closed, even on early return.
    defer func() {
        if c.observed != nil && c.observed.Body != nil {
            c.observed.Body.Close()
        }
    }()
    switch {
    case c.observed == nil:
        return fmt.Errorf("Run() has not been executed")
    case c.StatusCode == 0 && c.observed.StatusCode > 399:
        return fmt.Errorf("observed status code, %v, was a non-success response (if this is expected, set StatusCode to this value)", c.observed.StatusCode)
    case c.StatusCode != 0 && c.StatusCode != c.observed.StatusCode:
        return fmt.Errorf("expected status code %v but response had status code %v", c.StatusCode, c.observed.StatusCode)
    case c.observed.Body != nil && !c.bodiesMatch():
        observedBody, _ := ioutil.ReadAll(c.observed.Body)
        return fmt.Errorf("expected body response '%s' to match '%s' but it did not", c.ResponseBody, string(observedBody))
    case utils.GetNKeyValuePairsStringMap(c.Headers) > 0 && !c.headersMatch():
        expectedHeaders, _ := json.Marshal(c.Headers)
        responseHeaders := map[string]string{}
        for key, value := range c.observed.Header {
            responseHeaders[key] = strings.Join(value, ",")
        }
        observedHeaders, _ := json.Marshal(responseHeaders)
        return fmt.Errorf("expected headers '%s' but got '%s'", string(expectedHeaders), string(observedHeaders))
    }
    return nil
}
class ArtNetConfigurator:
"""Used for building an ArtNet Server with Nodes"""
pat_universe = re.compile('^universe_[0-9]+$')
pat_universe_lines = re.compile('^universe_[0-9]+_lines$')
@staticmethod
def get_artnet_server(config_artnet=None, config_led_mapping=None):
if config_artnet:
config_artnet = ArtNetConfigurator.get_conf(config_artnet)
else:
raise FileNotFoundError('Please specify a file for config_artnet')
if config_artnet:
config_led_mapping = ArtNetConfigurator.get_conf(config_led_mapping)
else:
raise FileNotFoundError('Please specify a file for config_led_mapping')
artnet_server = ArtNetServer(config_artnet['artnet_server']['ip_address'],
ArtNetConfigurator.get_broadcast_addr(config_artnet),
int(config_artnet['artnet_server']['port']))
# Create ArtNet nodes
for node_entry in config_artnet:
if node_entry.startswith('artnet_node_'):
artnet_node = ArtNetNode(config_artnet[node_entry]['name'],
config_artnet[node_entry]['ip_address'],
int(config_artnet[node_entry]['port']),
int(config_artnet[node_entry]['max_history_size']))
artnet_node.universe = {}
for (node_option_key, node_option_val) in config_artnet[node_entry].items():
if ArtNetConfigurator.pat_universe.match(node_option_key):
universe_id = node_option_key.split('_')[1]
strip_length = int(config_artnet[node_entry][node_option_key])
artnet_node.universe[universe_id] = LEDStrip(strip_length)
if node_option_key == 'color_history':
artnet_node.color_history = config_artnet[node_entry][node_option_key]
for mapping_entry in config_led_mapping:
if mapping_entry.startswith(artnet_node.name):
universe = str(config_led_mapping[mapping_entry]['universe'])
for slot_entry in config_led_mapping[mapping_entry]:
if not slot_entry.startswith('slot_'):
continue
# artnet_node.slots.update({int(slot_entry.split('_')[1]): {'universe': universe,
# 'led': config_led_mapping[mapping_entry][slot_entry]}})
for sub_slot in config_led_mapping[mapping_entry][slot_entry]:
start, end = config_led_mapping[mapping_entry][slot_entry][sub_slot].split('-')
slot_name = slot_entry.split('_')[1] + '.' + str(sub_slot)
if slot_name in artnet_node.slots:
artnet_node.slots[slot_name].universe.append(str(universe))
artnet_node.slots[slot_name].start.append(int(start))
artnet_node.slots[slot_name].end.append(int(end))
else:
artnet_node.slots.update(
{slot_name: Slot([str(universe)], [int(start)], [int(end)], slot_name)})
artnet_server.art_net_nodes.append(artnet_node)
return artnet_server
@staticmethod
def get_artnet_server_no_slots(config_artnet=None):
""":return: ArtNet Server with nodes from the .yml file"""
config = ArtNetConfigurator.get_conf(config_artnet)
artnet_server = ArtNetServer(config['artnet_server']['ip_address'],
                                     ArtNetConfigurator.get_broadcast_addr(config),
int(config['artnet_server']['port']))
# Create ArtNet nodes
for node_entry in config.keys():
if node_entry.startswith('artnet_node_'):
artnet_node = ArtNetNodePlanboard(config[node_entry]['name'], config[node_entry]['ip_address'],
int(config[node_entry]['port']))
artnet_node.universe = {}
for (node_option_key, node_option_val) in config.get(node_entry).items():
if ArtNetConfigurator.pat_universe.match(node_option_key):
universe_id = node_option_key.split('_')[1]
strip_length = int(config[node_entry][node_option_key])
artnet_node.universe[universe_id] = LEDStrip(strip_length)
if node_option_key == 'color_history':
for color in node_option_val.split(','):
artnet_node.color_history.append(color)
if ArtNetConfigurator.pat_universe_lines.match(node_option_key):
universe_id = node_option_key.split('_')[1]
artnet_node.universe_to_lines.update({universe_id: config[node_entry][node_option_key]})
artnet_server.art_net_nodes.append(artnet_node)
return artnet_server
@staticmethod
def get_conf(config_file_name=None):
if not config_file_name:
config_file_name = Path(__file__).absolute().parents[2] / "conf" / "artnet.yml"
with open(str(config_file_name), 'r') as ymlfile:
return yaml.safe_load(ymlfile)
@staticmethod
def get_broadcast_addr(config_artnet):
return str(ipaddress.IPv4Network(config_artnet['artnet_server']['ip_address'] + '/' +
config_artnet['artnet_server']['netmask'], False).broadcast_address) |
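`get_broadcast_addr` is easy to exercise on its own; a self-contained sketch of the same computation, with illustrative addresses:

```python
import ipaddress

def broadcast_addr(ip_address: str, netmask: str) -> str:
    # strict=False lets us pass a host address (not a network address)
    # together with its netmask, exactly as the method above does.
    return str(ipaddress.IPv4Network(ip_address + '/' + netmask,
                                     strict=False).broadcast_address)

print(broadcast_addr('192.168.1.23', '255.255.255.0'))  # 192.168.1.255
```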
<filename>src/main/java/com/project/bookmyshow/db/mappers/SeatsBookingDynamicSqlSupport.java
package com.project.bookmyshow.db.mappers;
import java.sql.JDBCType;
import java.util.Date;
import javax.annotation.Generated;
import org.mybatis.dynamic.sql.SqlColumn;
import org.mybatis.dynamic.sql.SqlTable;
public final class SeatsBookingDynamicSqlSupport {
@Generated(value="org.mybatis.generator.api.MyBatisGenerator", date="2020-02-19T14:45:14.764+05:30", comments="Source Table: TBL_SeatsBooking")
public static final SeatsBooking seatsBooking = new SeatsBooking();
@Generated(value="org.mybatis.generator.api.MyBatisGenerator", date="2020-02-19T14:45:14.764+05:30", comments="Source field: TBL_SeatsBooking.seat_booking_id")
public static final SqlColumn<Integer> seatBookingId = seatsBooking.seatBookingId;
@Generated(value="org.mybatis.generator.api.MyBatisGenerator", date="2020-02-19T14:45:14.764+05:30", comments="Source field: TBL_SeatsBooking.seat_id")
public static final SqlColumn<Integer> seatId = seatsBooking.seatId;
@Generated(value="org.mybatis.generator.api.MyBatisGenerator", date="2020-02-19T14:45:14.765+05:30", comments="Source field: TBL_SeatsBooking.booking_id")
public static final SqlColumn<Integer> bookingId = seatsBooking.bookingId;
@Generated(value="org.mybatis.generator.api.MyBatisGenerator", date="2020-02-19T14:45:14.765+05:30", comments="Source field: TBL_SeatsBooking.scheduled_live_show_id")
public static final SqlColumn<Integer> scheduledLiveShowId = seatsBooking.scheduledLiveShowId;
@Generated(value="org.mybatis.generator.api.MyBatisGenerator", date="2020-02-19T14:45:14.765+05:30", comments="Source field: TBL_SeatsBooking.seat_booking_status")
public static final SqlColumn<Integer> seatBookingStatus = seatsBooking.seatBookingStatus;
@Generated(value="org.mybatis.generator.api.MyBatisGenerator", date="2020-02-19T14:45:14.765+05:30", comments="Source field: TBL_SeatsBooking.created_at")
public static final SqlColumn<Date> createdAt = seatsBooking.createdAt;
@Generated(value="org.mybatis.generator.api.MyBatisGenerator", date="2020-02-19T14:45:14.765+05:30", comments="Source field: TBL_SeatsBooking.modified_at")
public static final SqlColumn<Date> modifiedAt = seatsBooking.modifiedAt;
@Generated(value="org.mybatis.generator.api.MyBatisGenerator", date="2020-02-19T14:45:14.764+05:30", comments="Source Table: TBL_SeatsBooking")
public static final class SeatsBooking extends SqlTable {
public final SqlColumn<Integer> seatBookingId = column("seat_booking_id", JDBCType.INTEGER);
public final SqlColumn<Integer> seatId = column("seat_id", JDBCType.INTEGER);
public final SqlColumn<Integer> bookingId = column("booking_id", JDBCType.INTEGER);
public final SqlColumn<Integer> scheduledLiveShowId = column("scheduled_live_show_id", JDBCType.INTEGER);
public final SqlColumn<Integer> seatBookingStatus = column("seat_booking_status", JDBCType.INTEGER);
public final SqlColumn<Date> createdAt = column("created_at", JDBCType.TIMESTAMP);
public final SqlColumn<Date> modifiedAt = column("modified_at", JDBCType.TIMESTAMP);
public SeatsBooking() {
super("TBL_SeatsBooking");
}
}
} |
package cmd
import (
"fmt"
"go/build"
"os"
"os/exec"
"os/user"
fp "path/filepath"
"runtime"
"strings"
"github.com/spf13/cobra"
)
func dockerCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "docker [target]",
Run: dockerHandler,
Args: cobra.ExactArgs(1),
Short: "Build QML app using Docker image",
Long: "Build QML app using Docker image.\nPossible values are " +
`"linux", "linux-static", "win32", "win32-static", "win64" and "win64-static".`,
}
cmd.Flags().StringP("output", "o", "", "location for executable file")
	cmd.Flags().StringSliceP("tags", "t", []string{}, "space-separated list of build tags to satisfy during the build")
cmd.Flags().Bool("copy-deps", false, "copy dependencies for app with dynamic linking")
	cmd.Flags().Bool("skip-vendoring", false, "if the project uses Go modules, skip updating its vendor directory")
return cmd
}
func dockerHandler(cmd *cobra.Command, args []string) {
cBlueBold.Println("Run `qamel build` from Docker image.")
// Read flags
buildTags, _ := cmd.Flags().GetStringSlice("tags")
outputPath, _ := cmd.Flags().GetString("output")
copyDependencies, _ := cmd.Flags().GetBool("copy-deps")
skipVendoring, _ := cmd.Flags().GetBool("skip-vendoring")
// Get target name
target := args[0]
switch target {
case "linux", "linux-static",
"win32", "win32-static",
"win64", "win64-static":
default:
cRedBold.Printf("Target %s is not supported.\n", target)
os.Exit(1)
}
// Get gopath
gopath := os.Getenv("GOPATH")
if gopath == "" {
gopath = build.Default.GOPATH
}
// Get project directory from current working dir
projectDir, err := os.Getwd()
if err != nil {
cRedBold.Println("Failed to get current working dir:", err)
os.Exit(1)
}
	// If this project uses Go modules, vendor it first before
	// passing it to Docker
vendorDir := fp.Join(projectDir, "vendor", "github.com", "go-qamel", "qamel")
goModFile := fp.Join(projectDir, "go.mod")
usesGoModule := fileExists(goModFile)
if usesGoModule && (!dirExists(vendorDir) || !skipVendoring) {
fmt.Print("Generating vendor files...")
cmdModVendor := exec.Command("go", "mod", "vendor")
cmdOutput, err := cmdModVendor.CombinedOutput()
if err != nil {
fmt.Println()
cRedBold.Println("Failed to vendor app:", err)
cRedBold.Println(string(cmdOutput))
os.Exit(1)
}
cGreen.Println("done")
}
// Get docker user from current active user
currentUser, err := user.Current()
if err != nil {
cRedBold.Println("Failed to get user's data:", err)
os.Exit(1)
}
uid := currentUser.Uid
gid := currentUser.Gid
dockerUser := fmt.Sprintf("%s:%s", uid, gid)
	if runtime.GOOS == "windows" {
		// On Windows the uid is a SID; Docker expects only its last
		// component (the RID).
		uidParts := strings.Split(uid, "-")
		dockerUser = uidParts[len(uidParts)-1]
	}
// Create directory for Go build's cache
goCacheDir := fp.Join(projectDir, ".qamel-cache", target, "go")
err = os.MkdirAll(goCacheDir, os.ModePerm)
if err != nil {
cRedBold.Println("Failed to create cache directory for Go build:", err)
os.Exit(1)
}
// If target is Linux, create ccache dir as well
ccacheDir := ""
if strings.HasPrefix(target, "linux") {
ccacheDir = fp.Join(projectDir, ".qamel-cache", target, "ccache")
err = os.MkdirAll(ccacheDir, os.ModePerm)
if err != nil {
cRedBold.Println("Failed to create cache directory for gcc build:", err)
os.Exit(1)
}
}
// Prepare docker arguments
dockerGopath := unixJoinPath("/", "home", "user", "go")
dockerProjectDir := unixJoinPath(dockerGopath, "src", fp.Base(projectDir))
dockerGoCacheDir := unixJoinPath("/", "home", "user", ".cache", "go-build")
dockerBindProject := fmt.Sprintf(`type=bind,src=%s,dst=%s`,
projectDir, dockerProjectDir)
dockerBindGoSrc := fmt.Sprintf(`type=bind,src=%s,dst=%s`,
unixJoinPath(gopath, "src"),
unixJoinPath(dockerGopath, "src"))
dockerBindGoCache := fmt.Sprintf(`type=bind,src=%s,dst=%s`,
unixJoinPath(goCacheDir), dockerGoCacheDir)
dockerArgs := []string{"run", "--rm",
"--attach", "stdout",
"--attach", "stderr",
"--user", dockerUser,
"--workdir", dockerProjectDir,
"--mount", dockerBindProject,
"--mount", dockerBindGoSrc,
"--mount", dockerBindGoCache}
if ccacheDir != "" {
dockerCcacheDir := unixJoinPath("/", "home", "user", ".ccache")
dockerBindCcache := fmt.Sprintf(`type=bind,src=%s,dst=%s`,
unixJoinPath(ccacheDir), dockerCcacheDir)
dockerArgs = append(dockerArgs, "--mount", dockerBindCcache)
}
	// If the project uses Go modules, set GO111MODULE=on inside the container
if usesGoModule {
dockerArgs = append(dockerArgs, "--env", "GO111MODULE=on")
}
// Finally, set image name
dockerImageName := fmt.Sprintf("radhifadlillah/qamel:%s", target)
dockerArgs = append(dockerArgs, dockerImageName)
// Add qamel arguments
dockerArgs = append(dockerArgs, "--skip-vendoring")
if outputPath != "" {
dockerArgs = append(dockerArgs, "-o", outputPath)
}
if len(buildTags) > 0 {
dockerArgs = append(dockerArgs, "-t")
dockerArgs = append(dockerArgs, buildTags...)
}
if copyDependencies {
dockerArgs = append(dockerArgs, "--copy-deps")
}
// Run docker
cmdDocker := exec.Command("docker", dockerArgs...)
cmdDocker.Stdout = os.Stdout
cmdDocker.Stderr = os.Stderr
err = cmdDocker.Start()
if err != nil {
cRedBold.Println("Failed to start Docker:", err)
os.Exit(1)
}
err = cmdDocker.Wait()
if err != nil {
cRedBold.Println("Failed to build app using Docker:", err)
os.Exit(1)
}
}
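`unixJoinPath` is defined elsewhere in this package; a plausible sketch, assuming it simply joins segments into a forward-slash path (Docker mount specs use Unix-style paths even when the host is Windows):

```go
package main

import (
	"fmt"
	"path"
	"path/filepath"
)

// unixJoinPath (hypothetical reconstruction): normalize each segment to
// forward slashes, then join with the slash-only "path" package rather
// than the OS-specific "path/filepath" join.
func unixJoinPath(parts ...string) string {
	for i, p := range parts {
		parts[i] = filepath.ToSlash(p)
	}
	return path.Join(parts...)
}

func main() {
	fmt.Println(unixJoinPath("/", "home", "user", "go")) // /home/user/go
}
```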
|
def compute_work_item_times(df: pd.DataFrame) -> pd.DataFrame:
    """Compute start, end, and duration for each work item from its phase-transition log."""
    # Plain assignments instead of chained `inplace` fillna, which newer pandas ignores.
    df['from_phase'] = df['from_phase'].fillna('Start')
    df['to_phase'] = df['to_phase'].fillna('End')
relevant_columns = ['work_item', 'timestamp']
start_times = df[df.from_phase == 'Start'][relevant_columns]
end_times = df[df.to_phase == 'End'][relevant_columns]
times = pd.merge(start_times, end_times, on='work_item', how='left')
times.rename(columns={'timestamp_x': 'start', 'timestamp_y': 'end'}, inplace=True)
times['duration'] = times['end'] - times['start']
times['duration_in_days'] = times['duration'].apply(lambda x: round(x.total_seconds() / (24*3600), 2))
return times |
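A minimal usage sketch with a hypothetical three-event log; the function is repeated here, with the fills written as plain assignments, so the example runs standalone:

```python
import pandas as pd

def compute_work_item_times(df: pd.DataFrame) -> pd.DataFrame:
    # Same logic as above; assignment-style fills also work under
    # pandas 2.x copy-on-write, where chained inplace fillna does not.
    df['from_phase'] = df['from_phase'].fillna('Start')
    df['to_phase'] = df['to_phase'].fillna('End')
    relevant_columns = ['work_item', 'timestamp']
    start_times = df[df.from_phase == 'Start'][relevant_columns]
    end_times = df[df.to_phase == 'End'][relevant_columns]
    times = pd.merge(start_times, end_times, on='work_item', how='left')
    times = times.rename(columns={'timestamp_x': 'start', 'timestamp_y': 'end'})
    times['duration'] = times['end'] - times['start']
    times['duration_in_days'] = times['duration'].apply(
        lambda x: round(x.total_seconds() / (24 * 3600), 2))
    return times

# Hypothetical event log: one row per phase transition of work item "A".
events = pd.DataFrame({
    'work_item': ['A', 'A', 'A'],
    'from_phase': [None, 'Dev', 'Test'],
    'to_phase': ['Dev', 'Test', None],
    'timestamp': pd.to_datetime(['2021-01-01', '2021-01-02', '2021-01-03']),
})

times = compute_work_item_times(events)
print(times[['work_item', 'duration_in_days']])  # "A" ran for 2.0 days
```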
// Copyright 2021 Google Inc. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#ifndef BENCHMARK_PERF_COUNTERS_H
#define BENCHMARK_PERF_COUNTERS_H
#include <array>
#include <cstdint>
#include <vector>
#include "benchmark/benchmark.h"
#include "check.h"
#include "log.h"
#ifndef BENCHMARK_OS_WINDOWS
#include <unistd.h>
#endif
namespace benchmark {
namespace internal {
// Typically, we can only read a small number of counters. There is also a
// padding preceding counter values, when reading multiple counters with one
// syscall (which is desirable). PerfCounterValues abstracts these details.
// The implementation ensures the storage is inlined, and allows 0-based
// indexing into the counter values.
// The object is used in conjunction with a PerfCounters object, by passing it
// to Snapshot(). The values are populated such that
// perfCounters->names()[i]'s value is obtained at position i (as given by
// operator[]) of this object.
class PerfCounterValues {
public:
explicit PerfCounterValues(size_t nr_counters) : nr_counters_(nr_counters) {
BM_CHECK_LE(nr_counters_, kMaxCounters);
}
uint64_t operator[](size_t pos) const { return values_[kPadding + pos]; }
static constexpr size_t kMaxCounters = 3;
private:
friend class PerfCounters;
// Get the byte buffer in which perf counters can be captured.
// This is used by PerfCounters::Read
std::pair<char*, size_t> get_data_buffer() {
return {reinterpret_cast<char*>(values_.data()),
sizeof(uint64_t) * (kPadding + nr_counters_)};
}
static constexpr size_t kPadding = 1;
std::array<uint64_t, kPadding + kMaxCounters> values_;
const size_t nr_counters_;
};
// Collect PMU counters. The object, once constructed, is ready to be used by
// calling Snapshot(). PMU counter collection is enabled from the time Create()
// is called, to obtain the object, until the object's destructor is called.
class PerfCounters final {
public:
// True iff this platform supports performance counters.
static const bool kSupported;
bool IsValid() const { return is_valid_; }
static PerfCounters NoCounters() { return PerfCounters(); }
~PerfCounters();
PerfCounters(PerfCounters&&) = default;
PerfCounters(const PerfCounters&) = delete;
// Platform-specific implementations may choose to do some library
// initialization here.
static bool Initialize();
// Return a PerfCounters object ready to read the counters with the names
// specified. The values are user-mode only. The counter name format is
// implementation and OS specific.
  // TODO: once we move to C++17, this should be a std::optional, and then the
// IsValid() boolean can be dropped.
static PerfCounters Create(const std::vector<std::string>& counter_names);
// Take a snapshot of the current value of the counters into the provided
// valid PerfCounterValues storage. The values are populated such that:
// names()[i]'s value is (*values)[i]
BENCHMARK_ALWAYS_INLINE bool Snapshot(PerfCounterValues* values) const {
#ifndef BENCHMARK_OS_WINDOWS
assert(values != nullptr);
assert(IsValid());
auto buffer = values->get_data_buffer();
auto read_bytes = ::read(counter_ids_[0], buffer.first, buffer.second);
return static_cast<size_t>(read_bytes) == buffer.second;
#else
(void)values;
return false;
#endif
}
const std::vector<std::string>& names() const { return counter_names_; }
size_t num_counters() const { return counter_names_.size(); }
private:
PerfCounters(const std::vector<std::string>& counter_names,
std::vector<int>&& counter_ids)
: counter_ids_(std::move(counter_ids)),
counter_names_(counter_names),
is_valid_(true) {}
PerfCounters() : is_valid_(false) {}
std::vector<int> counter_ids_;
const std::vector<std::string> counter_names_;
const bool is_valid_;
};
// Typical usage of the above primitives.
class PerfCountersMeasurement final {
public:
PerfCountersMeasurement(PerfCounters&& c)
: counters_(std::move(c)),
start_values_(counters_.IsValid() ? counters_.names().size() : 0),
end_values_(counters_.IsValid() ? counters_.names().size() : 0) {}
bool IsValid() const { return counters_.IsValid(); }
BENCHMARK_ALWAYS_INLINE void Start() {
assert(IsValid());
// Tell the compiler to not move instructions above/below where we take
// the snapshot.
ClobberMemory();
counters_.Snapshot(&start_values_);
ClobberMemory();
}
BENCHMARK_ALWAYS_INLINE std::vector<std::pair<std::string, double>>
StopAndGetMeasurements() {
assert(IsValid());
// Tell the compiler to not move instructions above/below where we take
// the snapshot.
ClobberMemory();
counters_.Snapshot(&end_values_);
ClobberMemory();
std::vector<std::pair<std::string, double>> ret;
for (size_t i = 0; i < counters_.names().size(); ++i) {
double measurement = static_cast<double>(end_values_[i]) -
static_cast<double>(start_values_[i]);
ret.push_back({counters_.names()[i], measurement});
}
return ret;
}
private:
PerfCounters counters_;
PerfCounterValues start_values_;
PerfCounterValues end_values_;
};
BENCHMARK_UNUSED static bool perf_init_anchor = PerfCounters::Initialize();
} // namespace internal
} // namespace benchmark
#endif // BENCHMARK_PERF_COUNTERS_H
|
#include <algorithm>
#include <cstdlib>
#include <iostream>
#include <string>
#include <vector>

using namespace std;
int main() {
    std::ios_base::sync_with_stdio(false);  // the argument is a bool, not a pointer
cin.tie(NULL);
cout.tie(NULL);
int n, x;
cin >> n >> x;
vector<string> s(n);
for (int i = 0; i < n; i++) {
cin >> s[i];
}
sort(s.begin(), s.end());
for (int i = 0; i < n; i++) {
cout << s[i];
}
cout << endl;
    return 0;
}
|
Lebanese gunmen greeted an edict from the leader of Hezbollah banning the firing of celebratory shots in the air - with a volley of gunfire.
In a televised speech whose opening remarks were greeted with a customary staccato round of shots in central Beirut, Sayyed Hassan Nasrallah criticised the phenomenon which he said was dangerous and provocative.
"On religious holidays, people shoot in the air, on political occasions, they shoot in the air, at funerals of martyrs, they shoot ... when someone graduates from school, they shoot," the Hezbollah leader said.
Nasrallah said he had consulted with Shi'ite Muslim clerics in Iran and Iraq, who ruled that the practice was 'haram', or forbidden in Islam. Several times in his speech he urged an end to it.
At the end of his hour-long speech, however, the familiar sound of gunfire echoed around the centre of the Lebanese capital. But in the Hezbollah stronghold of southern Beirut, residents said discipline reigned and the streets were quiet.
Hezbollah was set up in the 1980s to fight Israeli occupation and is now fighting alongside President Bashar al-Assad's forces in Syria's civil war. The movement is part of a fragile power balance in the caretaker government of ethnically divided Lebanon. |
Even the most creative business card design cannot guarantee your business success; however, it is one of those very first things that make a long-lasting impression. So you shouldn’t miss the opportunity to show your best qualities here, because your business card is much more than just your contact information. It also communicates your company’s values and character.
Never limit your imagination to a plain rectangle! Don’t forget to include some interactivity – people love to play and solve puzzles.
If you happen to be thinking about making a great business card right now, you’ve come to the right place! To help you do that, we collected the 20 most creative and extremely interactive business card designs. Are you ready to breathe in some inspiration?
1. Lush: Business Card Filled With Seeds
The business cards were letter pressed by hand and stuffed with grass seed. The best thing about them is when you hand one out, the seeds shake and instantly pay off the idea. (Advertising Agency: Struck, USA)
2. Business Card for Cosmetic Surgeon
Via the use of two rubber inserts, this business card shows the effect Dr.Kiprov’s cosmetic surgery can have. (Advertising Agency: Demner, Merlicek & Bergmann, Vienna, Austria)
3. Business Card For Fitness Trainer
Zohra Mouhetta helps you strip away your belly! (Advertising Agency: Leo Burnett, Dubai, United Arab Emirates)
4. Another Bloomin’ Designer’s Business Card
“By following the instructions correctly a decent bloom would have been seen within a week.” (Designed by Jamie Wieck)
5. Yoga One: Get Stretchy
If you have stretchy fingers you can make this girl do wonders! (Advertising Agency: Marked for Trade)
6. Knot Business Card for Massage Therapist
Michael Royer is a massage therapist in Toronto. Clients have to loosen the knot to reveal the message.
7. Personal Trainer: Stretchy Business Card
If you want to see the text on this business card, you have to do a little stretching exercise – it’s like a warm up before going to Poul Nielsen.
8. Chest Physician: Balloon Visiting Card
A balloon was used as the medium for a visiting card for Dr. Pramod Niphadkar, a chest physician. In order to read the details of the card, one would have to blow up the balloon – an exercise that would test one’s lung capacity. (Advertising Agency: Ogilvy & Mather, Mumbai, India)
9. TAM Cargo: Transformable Business Card
Transforming the traditional business card into a fun and unusual object: a little box for transporting air cargo. (Advertising Agency: Y&R, São Paulo, Brazil)
10. We Love You…
At first, it may seem that they really love you, but then you see that the gray bar beside the message is a scratch-off. Well, at least they are upfront about their motives. (Designed by Sensus Design Factory)
11. Business Card for a Dentist
This incredibly creative card belongs to Dr. Anita Wehrle Lechmann (Designed by Michael Häne & Remo Caminada)
12. Divorce Lawyer: Hand-Tearable Business Card
Notice that there is a contact information on both sides of the business card.
13. Model Kit Business Card from Tamiya
Tamiya, who make model kits, came up with a DIY business card: all the pieces pop out of the card so you can either build a model race car, airplane, or navy boat. (Designed by Creative Juice)
14. Thumbs Up
Advertising Agency: Cheil Worldwide, Seoul, Korea
15. DJ Mohit Business Card
Designed by Deepak Nagar & Nasheet Shadan |
/**
 * Second part of structure generation: this, for example, places spiderwebs and
 * mob spawners, closes mineshafts at the end, and adds fences.
 */
public boolean addComponentParts(World p_74875_1_, Random p_74875_2_, StructureBoundingBox p_74875_3_)
{
if (this.isLiquidInStructureBoundingBox(p_74875_1_, p_74875_3_))
{
return false;
}
int i = this.sectionCount * 5 - 1;
this.fillWithBlocks(p_74875_1_, p_74875_3_, 0, 0, 0, 2, 1, i, Blocks.air, Blocks.air, false);
this.randomlyFillWithBlocks(p_74875_1_, p_74875_3_, p_74875_2_, 0.8F, 0, 2, 0, 2, 2, i, Blocks.air, Blocks.air, false);
if (this.hasSpiders)
{
randomlyFillWithBlocksLightLimit(p_74875_1_, p_74875_3_, p_74875_2_, 0.6F, 0, 0, 0, 2, 1, i, Blocks.web, Blocks.air, false, 8);
}
int j;
int k;
for (j = 0; j < this.sectionCount; ++j)
{
k = 2 + j * 5;
buildWoodenParts(p_74875_1_, p_74875_3_, 0, 0, k, 2, 2, p_74875_2_);
placeWeb(p_74875_1_, p_74875_3_, p_74875_2_, 0.1F, 0, 2, k - 1);
placeWeb(p_74875_1_, p_74875_3_, p_74875_2_, 0.1F, 2, 2, k - 1);
placeWeb(p_74875_1_, p_74875_3_, p_74875_2_, 0.1F, 0, 2, k + 1);
placeWeb(p_74875_1_, p_74875_3_, p_74875_2_, 0.1F, 2, 2, k + 1);
placeWeb(p_74875_1_, p_74875_3_, p_74875_2_, 0.05F, 0, 2, k - 2);
placeWeb(p_74875_1_, p_74875_3_, p_74875_2_, 0.05F, 2, 2, k - 2);
placeWeb(p_74875_1_, p_74875_3_, p_74875_2_, 0.05F, 0, 2, k + 2);
placeWeb(p_74875_1_, p_74875_3_, p_74875_2_, 0.05F, 2, 2, k + 2);
ChestGenHooks info = ChestGenHooks.getInfo(MINESHAFT_CORRIDOR);
if (p_74875_2_.nextInt(100) == 0)
{
this.generateStructureChestContents(p_74875_1_, p_74875_3_, p_74875_2_, 2, 0, k - 1, info.getItems(p_74875_2_), info.getCount(p_74875_2_));
}
if (p_74875_2_.nextInt(100) == 0)
{
this.generateStructureChestContents(p_74875_1_, p_74875_3_, p_74875_2_, 0, 0, k + 1, info.getItems(p_74875_2_), info.getCount(p_74875_2_));
}
if (this.hasSpiders && !this.spawnerPlaced)
{
int l = this.getYWithOffset(0);
int i1 = k - 1 + p_74875_2_.nextInt(3);
int j1 = this.getXWithOffset(1, i1);
i1 = this.getZWithOffset(1, i1);
if (p_74875_3_.isVecInside(j1, l, i1) && this.skyLight(p_74875_1_, 1, 0, i1, getBoundingBox()) < 8)
{
this.spawnerPlaced = true;
p_74875_1_.setBlock(j1, l, i1, Blocks.mob_spawner, 0, 2);
TileEntityMobSpawner tileentitymobspawner = (TileEntityMobSpawner)p_74875_1_.getTileEntity(j1, l, i1);
if (tileentitymobspawner != null)
{
tileentitymobspawner.func_145881_a().setEntityName("CaveSpider");
}
}
}
}
for (j = 0; j <= 2; ++j)
{
for (k = 0; k <= i; ++k)
{
byte b0 = -1;
Block block1 = this.getBlockAtCurrentPosition(p_74875_1_, j, b0, k, p_74875_3_);
if (block1.getMaterial() == Material.air && this.skyLight(p_74875_1_, j, -1, k, getBoundingBox()) < 8)
{
byte b1 = -1;
this.placeBlockAtCurrentPosition(p_74875_1_, getPlank(), getPlankMeta(), j, b1, k, p_74875_3_);
}
}
}
if (this.hasRails)
{
for (j = 0; j <= i; ++j)
{
Block block = this.getBlockAtCurrentPosition(p_74875_1_, 1, -1, j, p_74875_3_);
if (block.getMaterial() != Material.air && block.func_149730_j())
{
float f = skyLight(p_74875_1_, 1, 0, j, getBoundingBox()) > 8 ? 0.9F : 0.7F;
this.func_151552_a(p_74875_1_, p_74875_3_, p_74875_2_, f, 1, 0, j, Blocks.rail, this.getMetadataWithOffset(Blocks.rail, 0));
}
}
}
return true;
} |
from __future__ import absolute_import
from datetime import timedelta
from django.core.urlresolvers import reverse
from sentry.testutils import APITestCase, SnubaTestCase
from sentry.testutils.helpers.datetime import iso_format, before_now
class OrganizationEventsStatsEndpointTest(APITestCase, SnubaTestCase):
def setUp(self):
super(OrganizationEventsStatsEndpointTest, self).setUp()
self.login_as(user=self.user)
self.day_ago = before_now(days=1).replace(hour=10, minute=0, second=0, microsecond=0)
self.project = self.create_project()
self.project2 = self.create_project()
self.user = self.create_user()
self.user2 = self.create_user()
self.store_event(
data={
"event_id": "a" * 32,
"message": "very bad",
"timestamp": iso_format(self.day_ago + timedelta(minutes=1)),
"fingerprint": ["group1"],
"tags": {"sentry:user": self.user.email},
},
project_id=self.project.id,
)
self.store_event(
data={
"event_id": "b" * 32,
"message": "oh my",
"timestamp": iso_format(self.day_ago + timedelta(hours=1, minutes=1)),
"fingerprint": ["group2"],
"tags": {"sentry:user": self.user2.email},
},
project_id=self.project2.id,
)
self.store_event(
data={
"event_id": "c" * 32,
"message": "very bad",
"timestamp": iso_format(self.day_ago + timedelta(hours=1, minutes=2)),
"fingerprint": ["group2"],
"tags": {"sentry:user": self.user2.email},
},
project_id=self.project2.id,
)
self.url = reverse(
"sentry-api-0-organization-events-stats",
kwargs={"organization_slug": self.project.organization.slug},
)
def test_simple(self):
response = self.client.get(
self.url,
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
},
format="json",
)
assert response.status_code == 200, response.content
assert [attrs for time, attrs in response.data["data"]] == [
[],
[{"count": 1}],
[{"count": 2}],
]
def test_no_projects(self):
org = self.create_organization(owner=self.user)
self.login_as(user=self.user)
url = reverse(
"sentry-api-0-organization-events-stats", kwargs={"organization_slug": org.slug}
)
response = self.client.get(url, format="json")
assert response.status_code == 200, response.content
assert len(response.data["data"]) == 0
def test_groupid_filter(self):
response = self.client.get(
self.url,
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"group": self.group.id,
},
format="json",
)
assert response.status_code == 200, response.content
assert len(response.data["data"])
def test_groupid_filter_invalid_value(self):
url = "%s?group=not-a-number" % (self.url,)
response = self.client.get(url, format="json")
assert response.status_code == 400, response.content
def test_user_count(self):
self.store_event(
data={
"event_id": "d" * 32,
"message": "something",
"timestamp": iso_format(self.day_ago + timedelta(minutes=2)),
"tags": {"sentry:user": self.user2.email},
"fingerprint": ["group2"],
},
project_id=self.project2.id,
)
response = self.client.get(
self.url,
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"yAxis": "user_count",
},
format="json",
)
assert response.status_code == 200, response.content
assert [attrs for time, attrs in response.data["data"]] == [
[],
[{"count": 2}],
[{"count": 1}],
]
def test_discover2_backwards_compatibility(self):
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"yAxis": "user_count",
},
format="json",
)
assert response.status_code == 200, response.content
assert len(response.data["data"]) > 0
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"yAxis": "event_count",
},
format="json",
)
assert response.status_code == 200, response.content
assert len(response.data["data"]) > 0
def test_with_event_count_flag(self):
response = self.client.get(
self.url,
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"yAxis": "event_count",
},
format="json",
)
assert response.status_code == 200, response.content
assert [attrs for time, attrs in response.data["data"]] == [
[],
[{"count": 1}],
[{"count": 2}],
]
def test_aggregate_function_count(self):
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
format="json",
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"yAxis": "count()",
},
)
assert response.status_code == 200, response.content
assert [attrs for time, attrs in response.data["data"]] == [
[],
[{"count": 1}],
[{"count": 2}],
]
def test_invalid_aggregate(self):
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
format="json",
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"yAxis": "rubbish",
},
)
assert response.status_code == 400, response.content
def test_aggregate_function_user_count(self):
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
format="json",
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"yAxis": "count_unique(user)",
},
)
assert response.status_code == 200, response.content
assert [attrs for time, attrs in response.data["data"]] == [
[],
[{"count": 1}],
[{"count": 1}],
]
def test_aggregate_invalid(self):
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
format="json",
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"yAxis": "nope(lol)",
},
)
assert response.status_code == 400, response.content
def test_with_field_and_reference_event_invalid(self):
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
format="json",
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"referenceEvent": "nope-invalid",
"yAxis": "count()",
},
)
assert response.status_code == 400, response.content
assert b"reference" in response.content
def test_only_reference_event(self):
# Create a new event whose message matches events made in setup
event = self.store_event(
data={
"event_id": "e" * 32,
"message": "oh my",
"timestamp": iso_format(self.day_ago + timedelta(minutes=2)),
"tags": {"sentry:user": "<EMAIL>"},
"fingerprint": ["group3"],
},
project_id=self.project.id,
)
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
format="json",
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"interval": "1h",
"referenceEvent": "%s:%s" % (self.project.slug, event.event_id),
"yAxis": "count()",
},
)
assert response.status_code == 200, response.content
# Because we didn't send fields, the reference event is not applied
assert [attrs for time, attrs in response.data["data"]] == [
[],
[{"count": 2}],
[{"count": 2}],
]
def test_field_and_reference_event(self):
# Create a new event whose message matches events made in setup
event = self.store_event(
data={
"event_id": "e" * 32,
"message": "oh my",
"timestamp": iso_format(self.day_ago + timedelta(minutes=2)),
"tags": {"sentry:user": "<EMAIL>"},
"fingerprint": ["group3"],
},
project_id=self.project.id,
)
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
format="json",
data={
"start": iso_format(self.day_ago),
"end": iso_format(self.day_ago + timedelta(hours=1, minutes=59)),
"field": ["message", "count()"],
"interval": "1h",
"referenceEvent": "%s:%s" % (self.project.slug, event.event_id),
"yAxis": "count()",
},
)
assert response.status_code == 200, response.content
assert [attrs for time, attrs in response.data["data"]] == [
[],
[{"count": 1}],
[{"count": 1}],
]
def test_transaction_events(self):
prototype = {
"type": "transaction",
"transaction": "api.issue.delete",
"spans": [],
"contexts": {"trace": {"op": "foobar", "trace_id": "a" * 32, "span_id": "a" * 16}},
"tags": {"important": "yes"},
}
fixtures = (
("d" * 32, before_now(minutes=32)),
("e" * 32, before_now(hours=1, minutes=2)),
("f" * 32, before_now(hours=1, minutes=35)),
)
for fixture in fixtures:
data = prototype.copy()
data["event_id"] = fixture[0]
data["timestamp"] = iso_format(fixture[1])
data["start_timestamp"] = iso_format(fixture[1] - timedelta(seconds=1))
self.store_event(data=data, project_id=self.project.id)
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
format="json",
data={
"end": iso_format(before_now()),
"start": iso_format(before_now(hours=2)),
"query": "event.type:transaction",
"interval": "30m",
"yAxis": "count()",
},
)
assert response.status_code == 200, response.content
items = [item for time, item in response.data["data"] if item]
# We could get more results depending on where the 30 min
# windows land.
assert len(items) >= 3
def test_project_id_query_filter(self):
with self.feature("organizations:events-v2"):
response = self.client.get(
self.url,
format="json",
data={
"end": iso_format(before_now()),
"start": iso_format(before_now(hours=2)),
"query": "project_id:1",
"interval": "30m",
"yAxis": "count()",
},
)
assert response.status_code == 200
|
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# Copyright 2020-2021 by <NAME>. All rights reserved. This file is part
# of the Robot Operating System project, released under the MIT License. Please
# see the LICENSE file included as part of this package.
#
# author: <NAME>
# created: 2020-11-13
# modified: 2020-11-13
#
import pytest
import sys, traceback
from colorama import init, Fore, Style
init()
from lib.config_loader import ConfigLoader
from lib.rate import Rate
from lib.logger import Logger, Level
from lib.rotary_ctrl import RotaryControl
# ..............................................................................
@pytest.mark.unit
def test_rot_control():
_log = Logger("rot-ctrl-test", Level.INFO)
_rot_ctrl = None
try:
# read YAML configuration
_loader = ConfigLoader(Level.INFO)
filename = 'config.yaml'
_config = _loader.configure(filename)
# def __init__(self, config, minimum, maximum, step, level):
# _min = -127
# _max = 127
# _step = 1
_min = 0
_max = 10
_step = 0.5
_rot_ctrl = RotaryControl(_config, _min, _max, _step, Level.INFO)
_rate = Rate(20)
_log.info(Fore.WHITE + Style.BRIGHT + 'waiting for changes to rotary encoder...')
while True:
_value = _rot_ctrl.read()
_log.info(Fore.YELLOW + ' value: {:5.2f}'.format(_value))
_rate.wait()
finally:
if _rot_ctrl:
_log.info('resetting rotary encoder...')
_rot_ctrl.reset()
# main .........................................................................
_rot = None
def main(argv):
try:
test_rot_control()
except KeyboardInterrupt:
print(Fore.CYAN + 'caught Ctrl-C; exiting...' + Style.RESET_ALL)
except Exception:
print(Fore.RED + 'error starting ros: {}'.format(traceback.format_exc()) + Style.RESET_ALL)
# call main ....................................................................
if __name__ == "__main__":
main(sys.argv[1:])
#EOF
|
// Custom Viewholder class for the list items
public class ItemViewHolder extends ViewHolder {
@BindView(R.id.expand_view)
ExpandableTextView expandableTextView;
@BindView(R.id.expandable_text)
TextView textViewReview;
@BindView(R.id.tv_review_author)
TextView textViewReviewAuthor;
private ItemViewHolder(View itemView) {
super(itemView);
ButterKnife.bind(this, itemView);
Utils.setCustomTypeFace(context, textViewReview);
}
} |
/// Removes the least recently seen node from a particular routing bucket in the routing table.
pub fn remove_lrs(&mut self, key: &Key) -> Option<NodeData> {
let index = cmp::min(
self.node_data.id.xor(key).leading_zeros(),
self.buckets.len() - 1,
);
self.buckets[index].remove_lrs()
} |
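The index computation in `remove_lrs` is the standard Kademlia trick: take the number of leading zero bits of `id XOR key` and clamp it to the last bucket. A minimal sketch of the same calculation in Python (8-bit IDs assumed purely for illustration; the actual ID width depends on the `Key` type):

```python
def bucket_index(node_id: int, key: int, num_buckets: int, bits: int = 8) -> int:
    """Kademlia-style routing bucket index: leading zeros of the XOR
    distance between our ID and the key, clamped to the last bucket."""
    distance = node_id ^ key
    # Leading zeros of a `bits`-wide unsigned integer.
    leading_zeros = bits - distance.bit_length()
    return min(leading_zeros, num_buckets - 1)
```

Closer IDs share a longer common prefix and land in higher-numbered buckets; a zero distance (our own ID) is clamped into the last bucket.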
def unblock_contact(source, target):
if source.has_blocked(target):
source.update(pull__blocked=target)
clear_get_user_cache(source)
clear_get_contacts_cache(source, target)
clear_get_followers_cache(target) |
{-# LANGUAGE UndecidableInstances #-}
{-# OPTIONS_GHC -fno-warn-orphans #-}
module Hs.Main.Build.Render
( RenderEffect
, renderBuildingComponent
, renderBuildingDependency
, renderCompilingModule
, renderCompletedDependency
, renderConfiguringComponent
, renderDownloadingDependency
, renderPreprocessingComponent
, renderStderr
, RenderCarrier
, runRender
) where
import Hs.Cabal.Component (Component(..))
import Hs.Cabal.Package (Package(..))
import Control.Concurrent.STM
import Control.Effect
import Control.Effect.Carrier
import Control.Effect.Lift
import Control.Effect.Reader
import Control.Effect.State
import Control.Effect.Sum
import Data.HashMap.Strict (HashMap)
import Data.Set (Set)
import System.Console.ANSI
import System.Console.Concurrent
import System.Console.Regions
import GHC.Conc
import qualified Data.HashMap.Strict as HashMap
import qualified Data.Set as Set
import qualified Data.Text as Text
type Dependency
= Either Package Component
data RenderEffect (m :: Type -> Type) (k :: Type) where
RenderBuildingComponent
:: Component
-> k
-> RenderEffect m k
RenderBuildingDependency
:: Dependency
-> Word64
-> k
-> RenderEffect m k
RenderCompilingModule
:: Int
-> Int
-> Text
-> Bool
-> k
-> RenderEffect m k
RenderCompletedDependency
:: Dependency
-> Word64
-> k
-> RenderEffect m k
RenderConfiguringComponent
:: Component
-> k
-> RenderEffect m k
RenderDownloadingDependency
:: Dependency
-> k
-> RenderEffect m k
RenderPreprocessingComponent
:: Component
-> k
-> RenderEffect m k
RenderStderr
:: Text
-> k
-> RenderEffect m k
deriving stock (Functor)
deriving anyclass (HFunctor)
renderBuildingComponent ::
( Carrier sig m
, Member RenderEffect sig
)
=> Component
-> m ()
renderBuildingComponent component =
send (RenderBuildingComponent component (pure ()))
renderBuildingDependency ::
( Carrier sig m
, Member RenderEffect sig
)
=> Dependency
-> Word64
-> m ()
renderBuildingDependency dep time =
send (RenderBuildingDependency dep time (pure ()))
renderCompilingModule ::
( Carrier sig m
, Member RenderEffect sig
)
=> Int
-> Int
-> Text
-> Bool
-> m ()
renderCompilingModule n m name isBoot =
send (RenderCompilingModule n m name isBoot (pure ()))
renderCompletedDependency ::
( Carrier sig m
, Member RenderEffect sig
)
=> Dependency
-> Word64
-> m ()
renderCompletedDependency dep time =
send (RenderCompletedDependency dep time (pure ()))
renderConfiguringComponent ::
( Carrier sig m
, Member RenderEffect sig
)
=> Component
-> m ()
renderConfiguringComponent component =
send (RenderConfiguringComponent component (pure ()))
renderDownloadingDependency ::
( Carrier sig m
, Member RenderEffect sig
)
=> Dependency
-> m ()
renderDownloadingDependency dep =
send (RenderDownloadingDependency dep (pure ()))
renderPreprocessingComponent ::
( Carrier sig m
, Member RenderEffect sig
)
=> Component
-> m ()
renderPreprocessingComponent component =
send (RenderPreprocessingComponent component (pure ()))
renderStderr ::
( Carrier sig m
, Member RenderEffect sig
)
=> Text
-> m ()
renderStderr line =
send (RenderStderr line (pure ()))
newtype RenderCarrier m a
= RenderCarrier
{ unRenderCarrier ::
ReaderC (TVar (Set (Double, Dependency)))
(StateC (Maybe LocalsConsoleRegion)
(StateC (Maybe DepsConsoleRegion)
(StateC (Maybe Component)
(StateC (HashMap Component ConsoleRegion)
(StateC (HashMap Dependency ConsoleRegion)
(StateC (HashMap Dependency Word64) m)))))) a
} deriving newtype (Applicative, Functor, Monad, MonadIO)
instance (Carrier sig m, Effect sig, LiftRegion m, MonadIO m)
=> Carrier (RenderEffect :+: sig) (RenderCarrier m) where
eff ::
(RenderEffect :+: sig) (RenderCarrier m) (RenderCarrier m a)
-> RenderCarrier m a
eff = \case
L (RenderBuildingComponent _component next) ->
next
L (RenderBuildingDependency dep time next) ->
RenderCarrier $ do
regions :: HashMap Dependency ConsoleRegion <-
get
-- Dependency may already exist in map because it was downloaded first.
when (isNothing (HashMap.lookup dep regions)) $
renderNewDependency dep
modify (HashMap.insert dep time)
unRenderCarrier next
L (RenderCompilingModule n m name isBoot next) ->
RenderCarrier $ do
mcomponent :: Maybe Component <-
get
for_ mcomponent $ \component -> do
mregion :: Maybe ConsoleRegion <-
gets (HashMap.lookup component)
for_ mregion $ \region ->
liftIO
(setConsoleRegion
region
(fold
[ pprComponent component
, " "
, Text.pack (show n)
, "/"
, Text.pack (show m)
, " "
, pprModule name isBoot
, "\n"
]))
unRenderCarrier next
L (RenderCompletedDependency dep endTime next) ->
RenderCarrier $ do
regions :: HashMap Dependency ConsoleRegion <-
get
beginTimes :: HashMap Dependency Word64 <-
get
doneVar :: TVar (Set (Double, Dependency)) <-
ask
let
mregion :: Maybe ConsoleRegion
regions' :: HashMap Dependency ConsoleRegion
(mregion, regions') =
HashMap.alterF
(\case
Nothing -> (Nothing, Nothing)
Just region -> (Just region, Nothing))
dep
regions
liftIO . atomically $ do
for_ mregion $ \region ->
closeConsoleRegion region
for_ (HashMap.lookup dep beginTimes) $ \beginTime ->
let
elapsedTime :: Double
elapsedTime =
fromIntegral (endTime - beginTime) / 1e9
in
modifyTVar'
doneVar
(Set.insert (elapsedTime, dep))
put regions'
unRenderCarrier next
L (RenderConfiguringComponent component next) ->
RenderCarrier $ do
renderNewComponent component
unRenderCarrier next
L (RenderDownloadingDependency dep next) ->
RenderCarrier $ do
renderNewDependency dep
unRenderCarrier next
L (RenderPreprocessingComponent component next) ->
RenderCarrier $ do
regions :: HashMap Component ConsoleRegion <-
get
when (isNothing (HashMap.lookup component regions)) $
renderNewComponent component
unRenderCarrier next
L (RenderStderr line next) ->
RenderCarrier $ do
liftIO (outputConcurrent (line <> "\n"))
unRenderCarrier next
R other ->
RenderCarrier (eff (R (R (R (R (R (R (R (handleCoercible other)))))))))
renderNewComponent ::
( Carrier sig m
, Member (State (Maybe Component)) sig
, Member (State (Maybe LocalsConsoleRegion)) sig
, Member (State (HashMap Component ConsoleRegion)) sig
, LiftRegion m
)
=> Component
-> m ()
renderNewComponent component = do
renderCompletedComponent
localsRegion <-
get >>= \case
Nothing -> do
localsRegion <- liftRegion (openConsoleRegion Linear)
put (Just (LocalsConsoleRegion localsRegion))
pure localsRegion
Just (LocalsConsoleRegion localsRegion) ->
pure localsRegion
region <- liftRegion (openConsoleRegion (InLine localsRegion))
liftRegion (setConsoleRegion region (pprComponent component <> "\n"))
put (Just component)
modify (HashMap.insert component region)
renderCompletedComponent ::
( Carrier sig m
, Member (State (Maybe Component)) sig
, Member (State (HashMap Component ConsoleRegion)) sig
, LiftRegion m
)
=> m ()
renderCompletedComponent = do
mprevComponent :: Maybe Component <- get
for_ mprevComponent $ \prevComponent -> do
mprevRegion <- gets (HashMap.lookup prevComponent)
for_ mprevRegion $ \prevRegion ->
liftRegion (setConsoleRegion prevRegion (pprComponent prevComponent <> "\n"))
renderNewDependency ::
( Carrier sig m
, Member (State (Maybe Component)) sig
, Member (State (Maybe DepsConsoleRegion)) sig
, Member (State (HashMap Component ConsoleRegion)) sig
, Member (State (HashMap Dependency ConsoleRegion)) sig
, LiftRegion m
)
=> Dependency
-> m ()
renderNewDependency dep = do
renderCompletedComponent
depsRegion <-
get >>= \case
Nothing -> do
depsRegion <- liftRegion (openConsoleRegion Linear)
put (Just (DepsConsoleRegion depsRegion))
pure depsRegion
Just (DepsConsoleRegion depsRegion) ->
pure depsRegion
region <- liftRegion (openConsoleRegion (InLine depsRegion))
liftRegion (setConsoleRegion region (pprDependency dep <> "\n"))
modify (HashMap.insert dep region)
pprDependency :: Dependency -> Text
pprDependency =
either pprPackage pprComponent
pprPackage :: Package -> Text
pprPackage (Package name ver) =
brightWhite name <> " " <> brightBlack ver
prPackage :: Package -> Text
prPackage (Package name ver) =
name <> " " <> ver
pprComponent :: Component -> Text
pprComponent = \case
Executable package exeName ->
pprPackage package <> blue " executable " <> brightWhite exeName
Library package@(Package pkgName _) name ->
pprPackage package <>
(if name /= pkgName then blue " library " <> brightWhite name else mempty)
TestSuite package exeName ->
pprPackage package <> blue " test suite " <> brightWhite exeName
prComponent :: Component -> Text
prComponent = \case
Executable package exeName ->
prPackage package <> " executable " <> exeName
Library package@(Package pkgName _) name ->
prPackage package <>
(if name /= pkgName then " library " <> name else mempty)
TestSuite package exeName ->
prPackage package <> " test suite " <> exeName
pprDoneDependency :: (Double, Dependency) -> Text
pprDoneDependency (n, dep) =
brightBlack
(Text.pack (show (round n :: Int)) <> "s " <>
either prPackage prComponent dep)
pprModule :: Text -> Bool -> Text
pprModule name isBoot =
name <> (if isBoot then brightBlack " boot" else Text.empty)
newtype DepsConsoleRegion = DepsConsoleRegion ConsoleRegion
newtype LocalsConsoleRegion = LocalsConsoleRegion ConsoleRegion
runRender ::
( Carrier sig m
, MonadIO m
)
=> RenderCarrier m a
-> m a
runRender (RenderCarrier action) = do
doneDepsVar :: TVar (Set (Double, Dependency)) <-
liftIO (newTVarIO Set.empty)
-- doneDepsRegion :: ConsoleRegion <-
-- liftIO (newConsoleRegion Linear Text.empty)
-- liftIO . setConsoleRegion doneDepsRegion $ do
-- done <- readTVar doneDepsVar
-- unsafeIOToSTM (outputConcurrent (Text.pack (show done) <> "\n"))
-- -- Only add the region when it has at least one elem.
-- when (Set.size done == 1) $ do
-- unsafeIOToSTM (outputConcurrent ("Adding DONE DEPS\n" :: Text))
-- regions <- takeTMVar regionList
-- putTMVar regionList (doneDepsRegion : regions)
-- done
-- & Set.toDescList
-- & map pprDoneDependency
-- & Text.unlines
-- & pure
(regions, (mlastComponent, result)) <-
action
& runReader doneDepsVar
& evalState Nothing
& evalState Nothing
& runState Nothing
& runState HashMap.empty
& evalState HashMap.empty
& evalState HashMap.empty
for_ mlastComponent $ \lastComponent ->
for_ (HashMap.lookup lastComponent regions) $ \region ->
liftIO (setConsoleRegion region (pprComponent lastComponent <> "\n"))
pure result
--------------------------------------------------------------------------------
-- Colors
--------------------------------------------------------------------------------
blue :: Text -> Text
blue s =
Text.pack (setSGRCode [SetColor Foreground Dull Blue]) <> s <> reset
brightBlack :: Text -> Text
brightBlack s =
Text.pack (setSGRCode [SetColor Foreground Vivid Black]) <> s <> reset
brightWhite :: Text -> Text
brightWhite s =
Text.pack (setSGRCode [SetColor Foreground Vivid White]) <> s <> reset
reset :: Text
reset = Text.pack (setSGRCode [Reset])
instance LiftRegion m => LiftRegion (ReaderC r m) where
liftRegion =
ReaderC . const . liftRegion
instance (Functor m, LiftRegion m) => LiftRegion (StateC r m) where
liftRegion m =
StateC (\s -> (s,) <$> liftRegion m)
instance LiftRegion (LiftC IO) where
liftRegion = LiftC . atomically
|
Techniques for magnetic field monitor of the low frequency trapezoidal pulse magnet with the NMR
Measurement of the magnetic field of the lattice bending magnets is important for controlling the beam orbit, the tune, and the timing of the fixed magnetic field. Hall probes and search coils are conventionally used, while NMR has been considered capable of measuring only DC magnetic fields. The authors have developed a technique for monitoring the magnetic field of pulsed magnets such as the main ring magnets of the 12 GeV KEK-PS. During the injection (550 ms) and flat-top (1-2 s) periods, the magnetic field is measured by NMR probes with frequency scanning. If a timing pulse is required at a particular magnetic field during acceleration, an NMR probe can provide it in a fixed-frequency mode. This relies on the principle of NMR: nuclear magnetic resonance occurs at a definite relation between the magnetic field along the quantization axis and the frequency of the rotating magnetic field in the perpendicular plane. The performance of this NMR-based magnetic field monitoring technique and some unique measurements of the magnetic field in the lattice bending magnets are presented. |
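The fixed relation between field and frequency that the abstract relies on is the Larmor condition f = γB/2π. As a numerical illustration only (the proton gyromagnetic ratio is assumed here; the abstract does not state which nucleus its probes use):

```python
import math

# CODATA proton gyromagnetic ratio, rad s^-1 T^-1 (an assumption for
# illustration -- the actual probe nucleus is not given in the abstract).
GAMMA_PROTON = 2.675221874e8

def larmor_frequency(field_tesla: float) -> float:
    """NMR resonance frequency in Hz for a given field: f = gamma * B / (2*pi)."""
    return GAMMA_PROTON * field_tesla / (2.0 * math.pi)

def field_from_frequency(freq_hz: float) -> float:
    """Invert the Larmor condition: recover B from a measured resonance frequency."""
    return 2.0 * math.pi * freq_hz / GAMMA_PROTON
```

Scanning the probe's excitation frequency and noting where resonance occurs is exactly the inverse mapping that `field_from_frequency` expresses.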
package email
import "github.com/flowdev/example-mono/x/logging"
func Send(address, title, text string) {
logging.Log("Sending email to: " + address)
}
|
/**
* This class is responsible for sanitising the apply url to ensure it is formatted correctly.
*/
@Slf4j
public class ApplyURLSanitiser {
public static Vacancy sanitise(Vacancy vacancy) {
String originalURL = vacancy.getApplyURL();
String url = vacancy.getApplyURL();
if (url != null && !url.toLowerCase().matches("^\\w+://.*")) {
url = "https://" + url;
}
String[] schemes = {"http", "https"};
UrlValidator urlValidator = new UrlValidator(schemes);
if (!urlValidator.isValid(url)) {
url = null;
}
log.debug("setting applyurl '" + originalURL + "' to: " + url);
vacancy.setApplyURL(url);
return vacancy;
}
} |
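The two sanitising steps — prepend a default scheme when none is present, then validate against an http/https whitelist — can be sketched in Python. This is a loose analogue, not the production Java code; `urlparse` is far more permissive than Apache Commons' `UrlValidator`:

```python
import re
from urllib.parse import urlparse

def sanitise_apply_url(url):
    """Mirror the Java logic: default the scheme to https, then
    return None for anything that is not a plausible http(s) URL."""
    if url is None:
        return None
    # Prepend a default scheme when none is present (the Java check).
    if not re.match(r"^\w+://", url):
        url = "https://" + url
    parsed = urlparse(url)
    # Accept only http/https URLs that actually have a host part.
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        return None
    return url
```

The Java version lowercases the URL before matching the scheme; `\w` already matches both cases for ASCII letters, so the sketch skips that step.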
// Copyright 2015-2020 SWIM.AI inc.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
import {Input} from "./Input";
import {Output} from "./Output";
import {Parser} from "./Parser";
import {Writer} from "./Writer";
import {Format} from "./Format";
import {Unicode} from "./Unicode";
import {Base10NumberParser} from "./Base10NumberParser";
import {Base10IntegerWriter} from "./Base10IntegerWriter";
/**
* Base-10 (decimal) encoding [[Parser]]/[[Writer]] factory.
*/
export class Base10 {
private constructor() {
// nop
}
/**
* Returns `true` if the Unicode code point `c` is a valid base-10 digit.
*/
static isDigit(c: number): boolean {
return c >= 48/*'0'*/ && c <= 57/*'9'*/;
}
/**
* Returns the decimal quantity between `0` (inclusive) and `10` (exclusive)
* represented by the base-10 digit `c`.
*
* @throws `Error` if `c` is not a valid base-10 digit.
*/
static decodeDigit(c: number): number {
if (c >= 48/*'0'*/ && c <= 57/*'9'*/) {
return c - 48/*'0'*/;
} else {
const message = Unicode.stringOutput();
message.write("Invalid base-10 digit: ");
Format.debugChar(c, message);
throw new Error(message.bind());
}
}
/**
* Returns the Unicode code point of the base-10 digit that encodes the given
* decimal quantity between `0` (inclusive) and `10` (exclusive).
*/
static encodeDigit(b: number): number {
if (b >= 0 && b <= 9) {
return 48/*'0'*/ + b;
} else {
throw new Error("" + b);
}
}
/**
* Returns the number of whole decimal digits in the given absolute `value`.
*/
static countDigits(value: number): number {
let size = 0;
do {
size += 1;
value = (value / 10) | 0;
} while (value !== 0);
return size;
}
static integerParser(): Parser<number> {
return new Base10.NumberParser(void 0, void 0, 0);
}
static parseInteger(input: Input): Parser<number> {
return Base10.NumberParser.parse(input, void 0, void 0, 0);
}
static decimalParser(): Parser<number> {
return new Base10.NumberParser(void 0, void 0, 1);
}
static parseDecimal(input: Input): Parser<number> {
return Base10.NumberParser.parse(input, void 0, void 0, 1);
}
static numberParser(): Parser<number> {
return new Base10.NumberParser();
}
static parseNumber(input: Input): Parser<number> {
return Base10.NumberParser.parse(input);
}
/**
* Returns a `Writer` that, when fed an input `number` value, returns a
* continuation that writes the base-10 (decimal) encoding of the input value.
*/
static integerWriter(): Writer<number, unknown>;
/**
* Returns a `Writer` continuation that writes the base-10 (decimal) encoding
* of the `input` value.
*/
static integerWriter(input: number): Writer<unknown, number>;
static integerWriter(input?: number): Writer<unknown, unknown> {
if (input === void 0) {
return new Base10.IntegerWriter(void 0, 0);
} else {
return new Base10.IntegerWriter(void 0, input);
}
}
/**
* Writes the base-10 (decimal) encoding of the `input` value to the `output`,
* returning a `Writer` continuation that knows how to write any remaining
* output that couldn't be immediately generated.
*/
static writeInteger(input: number, output: Output): Writer<unknown, unknown> {
return Base10.IntegerWriter.write(output, void 0, input);
}
// Forward type declarations
/** @hidden */
static NumberParser: typeof Base10NumberParser; // defined by Base10NumberParser
/** @hidden */
static IntegerWriter: typeof Base10IntegerWriter; // defined by Base10IntegerWriter
}
|
#include "storage_manager.hh"
#include <drogon/HttpAppFramework.h>
#include <thread>
#include <filesystem>
#include "concurrent_cache.hh"
namespace detail {
static hanaru::storage_manager* instance_ = nullptr;
// Hard drive information
std::atomic_int64_t current_free_space_ = 0;
// RAM information
std::atomic_int64_t current_ram_usage_ = 0;
}
hanaru::cached_beatmap::cached_beatmap(
const std::string& name_,
const std::string& content_,
bool retry_,
std::chrono::system_clock::time_point time_
)
: name(name_)
, content(content_)
, retry(retry_)
, timestamp(time_)
{};
hanaru::storage_manager::storage_manager(
size_t maximum_cache_size,
int64_t required_free_space
)
: maximum_cache_size_(std::max(static_cast<size_t>(32), maximum_cache_size))
, required_free_space_(std::max(static_cast<int64_t>(1024), required_free_space))
{
std::filesystem::space_info si = std::filesystem::space(".");
detail::current_free_space_ = si.available - (required_free_space << 20);
detail::instance_ = this;
}
int64_t hanaru::storage_manager::memory_usage() const {
// Convert raw bytes to megabytes
return detail::current_ram_usage_ >> 20;
}
void hanaru::storage_manager::insert(const int64_t id, hanaru::cached_beatmap&& btm) {
detail::current_free_space_ -= btm.content.size();
detail::current_ram_usage_ += btm.content.size();
cache_.insert(id, std::move(btm));
}
std::shared_ptr<hanaru::cached_beatmap> hanaru::storage_manager::find(int64_t id) {
return cache_.get(id);
}
bool hanaru::storage_manager::can_write() const {
return detail::current_free_space_ > 0;
}
hanaru::storage_manager& hanaru::storage_manager::get() {
return *detail::instance_;
}
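The accounting scheme above — start from the available disk space minus a required reserve (shifted left by 20 to convert MiB to bytes), then charge every cached payload against the budget — can be reduced to a few lines. This is an illustrative sketch, not the hanaru code itself:

```python
class StorageBudget:
    """Track a free-space budget the way the C++ storage manager does."""

    def __init__(self, available_bytes: int, required_free_mib: int) -> None:
        # Reserve `required_free_mib` MiB: shifting left by 20 multiplies by 2**20.
        self.free_bytes = available_bytes - (required_free_mib << 20)

    def charge(self, payload: bytes) -> None:
        """Account for a newly cached payload."""
        self.free_bytes -= len(payload)

    def can_write(self) -> bool:
        # Writes are refused once the reserve would be eaten into.
        return self.free_bytes > 0
```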
|
package de.shiirroo.manhunt.command.subcommands.vote;
import de.shiirroo.manhunt.utilis.vote.BossBarCreator;
import de.shiirroo.manhunt.utilis.vote.VoteCreator;
import org.bukkit.entity.Player;
public abstract class Vote {
private final VoteCreator voteCreator = voteCreator();
public VoteCreator getVoteCreator() {
return voteCreator;
}
protected abstract VoteCreator voteCreator();
protected BossBarCreator getBossBarCreator(){
return getVoteCreator().getbossBarCreator();
}
protected abstract void editBossBarCreator();
protected abstract boolean requirement();
protected abstract String requirementMessage();
public void startVote(Player player){
if(requirement()) {
editBossBarCreator();
getVoteCreator().startVote();
getVoteCreator().addVote(player);
} else {
player.sendMessage(requirementMessage());
}
}
}
|
Linux 4.5 Offloads Copying, Improves IPv6 Networking
March 15, 2016
By Sean Michael Kerner
The Linux 4.5 kernel was officially released by Linus Torvalds late on Sunday, March 13, providing the second major kernel milestone update so far in 2016, following Linux 4.4, which debuted on January 10.
Among the big additions in Linux 4.5 is support for the copy_file_range() system call for offloading copies between regular files. The Linux kernel commit for the new system call was authored by Red Hat's Zach Brown, and according to the git entry, the new system call "..gives an interface to underlying layers of the storage stack which can copy without reading and writing all the data."
There is also an update for forward error correction in device mapper that could have a profound impact on Linux performance moving forward.
"Using verity hashes to locate erasures nearly doubles the effectiveness of error correction," Google engineer Sami Tolvanen wrote in his git commit message. "Being able to detect corrupted blocks also improves performance, because only corrupted blocks need to be corrected."
There are also multiple commits in Linux 4.5 that improve IPv6 networking. Among them is a new address generation mode for addrconf (address autoconfiguration). Additionally, Linux 4.5 now provides new generic header support for IPv6 over Low-power Wireless Personal Area Networks (6LoWPAN).
In his release note for Linux 4.5, Linus Torvalds remarked that the networking pull early in the week was larger than he would have wished for:
"But the block layer should be all good now, and David went through all his networking commits an extra time just to make me feel comfy about it, so in the end I didn't see any point to making the release cycle any longer than usual.
And on the whole, everything here is pretty small."
Sean Michael Kerner is a senior editor at LinuxPlanet and InternetNews.com. Follow him on Twitter @TechJournalist |
Tactical Tournament 7x7 in a new format!
Prize fund: 95000 Golden Eagles + “Gladiator” and “Steel Legion” decals
Combined battles 7x7 tournament in Realistic Battles mode:
We are pleased to introduce a new tactical tournament format in which each team has a limit on how many vehicles of each type it can spawn. To win, you will need to use your available vehicle resources strategically and anticipate your opponent's moves.
You can see the limits on how many vehicles of each type a team can use in a single battle in the image below:
The only vehicle type that a team can spawn without limitation is the SPAA.
These are not the only new mechanics: each team has a maximum of 18 spawns, while each member of the team can spawn a maximum of 3 times. This means you will need to plan your spawns during the battle, because not all players on your team may use their maximum number of spawns. (Example: 4 players can spawn 3 times each, while the remaining 3 players can use a maximum of 2 spawns each.)
Just when you thought the 'Russian collusion' narrative couldn't get any more surreal, 3 House Democrats decide to write a letter to the FCC which can only be described as 'criminally stupid', and even that seems generous.
According to the letter, signed by Representatives Anna Eshoo (D-CA), Mike Doyle (D-PA) and Frank Pallone (D-NJ), Sputnik Radio, "a radio network funded by the Russian government, was used as part of the Kremlin's effort to influence the 2016 presidential election." As such, these 3 Democrats demand that the FCC launch an investigation into Sputnik Radio.
And while it may only seem 'marginally stupid' to suggest that propaganda from a Russian-operated radio station might outweigh the $1.2 billion that Hillary spent on her campaign and/or all of the propaganda spewed by the mainstream media, the argument goes full "criminally stupid" when you realize that Sputnik Radio didn't even start broadcasting in the U.S. until June 2017 (which is about 8 months AFTER the 2016 presidential election...for anyone who may have missed the nuance there).
We're writing in response to recent troubling press reports that a radio network funded by the Russian government may have used U.S. airwaves to influence the 2016 presidential election. We ask that you investigate these troubling reports and apply all applicable laws and regulations to enforce the public interest standard for licensed stations that broadcast this network.
An article published by the New York Times Magazine (9/13/17) titled "RT, Sputnik and Russia's New Theory of War" suggests that Sputnik, a radio network funded by the Russian government, was used as part of the Kremlin's effort to influence the 2016 presidential election. In Washington, D.C., listeners can tune their radios to 105.5 FM to hear Sputnik and the Russian government's effort to spread misinformation to influence U.S. policy and undermine our elections. This means the Kremlin's propaganda is being broadcast over a license granted by the FCC and the Russian government may be using our country's own airwaves to undermine our democracy.
Meanwhile, a quick look at the New York Times' original article ("RT, Sputnik and Russia’s New Theory of War") reveals that even they noted that Sputnik Radio didn't start until after the election.
It’s hard to imagine Russia’s state-backed media getting any traction in the United States if there wasn’t already an audience for it. For some subset of Americans, the intelligence report singling out RT and Sputnik was just another attack from the supposed “deep state” that Breitbart, for instance, had been fuming about for months — and it was less than surprising when, this spring, Sputnik hired a former Breitbart reporter, Lee Stranahan, to start a radio show in Washington. As Stranahan told The Atlantic, though his paycheck might now come from the Russians, “Nothing about it really affects my position on stuff that I’ve had for years now.”
What more is there to say? |
module Data.JSONTool.AstTransforms
where
import Data.Aeson
import qualified Data.Vector as Vector
import Data.Vector ((!?))
import qualified Data.HashMap.Strict as HashMap
import Data.List (concatMap, sortOn)
import Data.Monoid.Endo
import qualified Data.Text as Text
-- | Recursively collect every leaf (non-container) value; no leaves
-- becomes Null, a single leaf is unwrapped, many become an Array.
flatten :: Endo Value
flatten = Endo $ \val -> case go val of
[] -> Null
[x] -> x
xs -> Array $ Vector.fromList xs
where
go (Array a) = concatMap go $ Vector.toList a
go (Object o) = concatMap go $ HashMap.elems o
go x = [x]
-- | Index into an Array (by numeric string index) or an Object (by key);
-- a missing element, or any other kind of value, yields Null.
at :: String -> Endo Value
at index = Endo $ \val -> case val of
  Array a -> toJSON . HashMap.lookup index . HashMap.fromList
    . zip (map show [(0 :: Int) ..]) . Vector.toList $ a
  Object o -> toJSON . HashMap.lookup (Text.pack index) $ o
  _ -> Null
-- | Take the first element of an Array; anything else passes through.
first :: Endo Value
first = Endo go
  where
    go (Array a) = case Vector.toList a of
      [] -> Null
      (x:_) -> x
    go x = x
|
/**
* This class parses the CSV file that is created by calling the CDC API for COVID-19 vaccination
* information. The API can be found at:
* <p>
* https://data.cdc.gov/resource/km4m-vcsb
* </p>
* <p>
* The number of doses given in a day is used to select a date for when an individual in the
* simulation will get their vaccine. EnumeratedDistributions are created for each age group, with
* the days weighted by the number of doses administered on that day.
* </p>
*/
public class C19VaccineAgeDistributions {
public static final String DOSE_RATES_FILE = "covid_dose_rates.csv";
public static final String FIRST_SHOT_PROBS_FILE = "covid_first_shot_percentage_by_age.json";
public static final String DATE_COLUMN_HEADER = "date";
public static final LocalDate EXPAND_AGE_TO_TWELVE = LocalDate.of(2021, 5, 10);
public static final DateTimeFormatter CSV_DATE_FORMAT = DateTimeFormatter.ofPattern("M/d/yyyy");
public static final HashMap<AgeRange,
List<Pair<String, Integer>>> rawDistributions = new HashMap<>();
public static final HashMap<AgeRange,
SyncedEnumeratedDistro<String>> distributions = new HashMap<>();
public static final HashMap<AgeRange, Double> firstShotProbByAge = new HashMap<>();
/**
* Representation of an age range in years with logic for parsing the format used by the CDC API.
*/
public static class AgeRange {
public int min;
public int max;
public String display;
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
AgeRange ageRange = (AgeRange) o;
return min == ageRange.min && max == ageRange.max;
}
@Override
public int hashCode() {
return Objects.hash(min, max);
}
/**
* Creates an AgeRange by parsing the display string.
* @param display something like Ages_75+_yrs or Ages_40-49_yrs
*/
public AgeRange(String display) {
this.display = display;
switch (display.length()) {
case 12:
// Example: Ages_75+_yrs
this.min = Integer.parseInt(display.substring(5, 7));
this.max = Integer.MAX_VALUE;
break;
case 14:
// Example: Ages_40-49_yrs
this.min = Integer.parseInt(display.substring(5, 7));
this.max = Integer.parseInt(display.substring(8, 10));
break;
default:
throw new IllegalArgumentException("Display should be 12 or 14 characters long.");
}
}
/**
* Determines whether the specified age falls within this range.
* @param age Age in years
* @return true if the age is contained in the range
*/
public boolean in(int age) {
return age >= min && age <= max;
}
}
/**
 * Load and process all data needed before selectShotTime or chanceOfGettingShot can be called.
 */
public static void initialize() {
loadRawDistribution();
populateDistributions();
loadShotProbabilitiesByAge();
}
/**
* Reads in the CSV file. Creating a List for each AgeRange, where the items are Pairs of the
* String representation of the date and an Integer counting the number of doses administered.
*/
public static void loadRawDistribution() {
String fileName = Config.get("generate.lookup_tables") + DOSE_RATES_FILE;
List<? extends Map<String, String>> rawRates = null;
try {
String csv = Utilities.readResource(fileName);
if (csv.startsWith("\uFEFF")) {
csv = csv.substring(1); // Removes BOM.
}
rawRates = SimpleCSV.parse(csv);
} catch (IOException e) {
  throw new RuntimeException("Couldn't load the dose rates file", e);
}
rawRates.forEach((rowMap) -> {
String date = rowMap.get(DATE_COLUMN_HEADER);
rowMap.forEach((name, value) -> {
if (! name.equals(DATE_COLUMN_HEADER)) {
AgeRange r = new AgeRange(name);
List<Pair<String, Integer>> distroForRange = rawDistributions.get(r);
if (distroForRange == null) {
  distroForRange = new ArrayList<>(200);
  rawDistributions.put(r, distroForRange);
}
distroForRange.add(new Pair<>(date, Integer.parseInt(value)));
}
});
});
// Special case for 12-15. While there were individuals under 16 who received vaccinations
// prior to May 10, 2021, the numbers are pretty small. This strips out that population for
// the sake of simplicity in the simulation
AgeRange specialCaseRange = new AgeRange("Ages_12-15_yrs");
rawDistributions.put(specialCaseRange, rawDistributions.get(specialCaseRange).stream()
.filter(pair -> {
String dateString = pair.getFirst();
LocalDate doseCountDate = CSV_DATE_FORMAT.parse(dateString, LocalDate::from);
return doseCountDate.isAfter(EXPAND_AGE_TO_TWELVE);
})
.collect(Collectors.toList()));
}
/**
* Processes the raw distribution information into the EnumeratedDistributions that will be used
* to select the date of vaccination administration. Must be called after loadRawDistribution().
*/
public static void populateDistributions() {
rawDistributions.forEach((ageRange, dayInfoList) -> {
double totalDosesForRange = dayInfoList.stream()
.map(pair -> pair.getSecond()).collect(Collectors.summingInt(Integer::intValue));
List<Pair<String, Double>> pmf = dayInfoList.stream().map(dayInfo -> {
double dosesForDay = dayInfo.getSecond();
double weight = dosesForDay / totalDosesForRange;
return new Pair<String, Double>(dayInfo.getFirst(), weight);
}).collect(Collectors.toList());
distributions.put(ageRange, new SyncedEnumeratedDistro<>(pmf));
});
}
/**
* Load the JSON file that has a map of age ranges to percentage of that age range that has been
* vaccinated.
*/
public static void loadShotProbabilitiesByAge() {
String fileName = "covid19/" + FIRST_SHOT_PROBS_FILE;
LinkedTreeMap<String, Object> rawShotProbs = null;
try {
String rawJson = Utilities.readResource(fileName);
Gson gson = new Gson();
rawShotProbs = gson.fromJson(rawJson, LinkedTreeMap.class);
} catch (IOException e) {
throw new RuntimeException("Couldn't load the shot probabilities file", e);
}
rawShotProbs.entrySet().forEach(stringObjectEntry -> {
AgeRange ar = new AgeRange(stringObjectEntry.getKey());
Double probability = Double.parseDouble((String) stringObjectEntry.getValue());
probability = probability / 100;
firstShotProbByAge.put(ar, probability);
});
}
/**
* Selects a time for an individual to get the COVID-19 vaccine based on their age and historical
* distributions of vaccine administration based on age.
* @param person The person to use
* @param time Time in the simulation used for age calculation
* @return A time in the simulation when the person should get their first (only?) shot
*/
public static long selectShotTime(Person person, long time) {
int age = person.ageInYears(time);
AgeRange r = distributions.keySet()
.stream()
.filter(ageRange -> ageRange.in(age))
.findFirst()
.get();
SyncedEnumeratedDistro<String> distro = distributions.get(r);
LocalDate shotDate = CSV_DATE_FORMAT.parse(distro.syncedReseededSample(person),
LocalDate::from);
return LocalDateTime.of(shotDate, LocalTime.NOON).toInstant(ZoneOffset.UTC).toEpochMilli();
}
/**
* Determines the likelihood that an individual will get a COVID-19 vaccine based on their age.
* @param person The person to use
* @param time Time in the simulation used for age calculation
* @return a value between 0 - 1 representing the chance the person will get vaccinated
*/
public static double chanceOfGettingShot(Person person, long time) {
int age = person.ageInYears(time);
AgeRange r = distributions.keySet()
.stream()
.filter(ageRange -> ageRange.in(age))
.findFirst()
.get();
return firstShotProbByAge.get(r);
}
} |
def change_next_charge_date(self, charge_id, to_date):
return self.http_put(
f'{self.url}/{charge_id}/change_next_charge_date',
{'next_charge_date': to_date}
) |
/**
* PmPowerUpFpd() - Power up FPD domain
*
* @return Status of the pmu-rom operations
*/
static int PmPowerUpFpd(void)
{
if (0 != (XPfw_Read32(PMU_GLOBAL_PWR_STATE) &
PMU_GLOBAL_PWR_STATE_FP_MASK)) {
PmDbg(DEBUG_DETAILED, "Skipping FPD power up as FPD is on\r\n");
return XST_SUCCESS;
}
int status = XpbrPwrUpFpdHandler();
if (XST_SUCCESS != status) {
goto err;
}
status = PmResetAssertInt(PM_RESET_FPD, PM_RESET_ACTION_RELEASE);
if (XST_SUCCESS != status) {
goto err;
}
XPfw_AibDisable(XPFW_AIB_LPD_TO_DDR);
XPfw_AibDisable(XPFW_AIB_LPD_TO_FPD);
PmFpdRestoreContext();
err:
return status;
} |
#include <iostream>
#include <string>
#include <vector>
using namespace std;
vector<int> z_function(const string& str) {
int n = str.size();
vector<int> z(n);
for (int i = 1, l = 0, r = 0; i < n; ++i) {
if (i <= r) z[i] = min(r - i + 1, z[i - l]);
while (i + z[i] < n && str[z[i]] == str[i + z[i]]) ++z[i];
if (i + z[i] - 1 > r) l = i, r = i + z[i] - 1;
}
return z;
}
int y[1001001];
int main() {
int n, m, i, k, mod = 1e9 + 7;
cin >> n >> m;
string p, s(n, '#');
cin >> p;
for (i = 0; i < m; ++i) cin >> y[i];
y[m] = 2e9;  // sentinel past the last occurrence
for (i = 0; i < m; ++i) {
k = 0;
while (k < p.size() && y[i] + k < y[i + 1]) s[y[i] + k - 1] = p[k], ++k;
}
vector<int> z = z_function(p + "$" + s);
for (i = 0; i < m; ++i) if (z[y[i] + p.size()] != p.size()) return cout << 0 << endl, 0;
long long ans = 1;
for (i = 0; i < n; ++i) if (s[i] == '#') ans = (ans * 26) % mod;
cout << ans << endl;
return 0;
} |
import YamlFilesCreator from './yaml-files.creator';
describe('YamlFilesCreator', () => {
let fileRepository = null;
let yamlFilesCreator = null;
beforeEach(() => {
fileRepository = {
hasAccess: (path, permission) => false,
loadData: (filename, extension) => 'loadData',
saveData: jest.fn(),
};
yamlFilesCreator = new YamlFilesCreator(fileRepository);
});
it('returns true for a supported type', () => {
  const result = yamlFilesCreator.supports('yml');
  expect(result).toBeTruthy();
});
it('returns false for an unsupported type', () => {
  const result = yamlFilesCreator.supports('xyz');
  expect(result).toBeFalsy();
});
it('executes save method once when dataToSave is string', () => {
yamlFilesCreator.save('data', '.', 'test');
expect(fileRepository.saveData).toBeCalledWith('data', 'test', 'yml', '.');
expect(fileRepository.saveData.mock.calls.length).toBe(1);
});
it('executes save method for every language', () => {
const translations = [
{ lang: 'pl_pl', content: 'test' },
{ lang: 'en_US', content: 'test2' },
{ lang: 'de', content: 'test3' },
];
yamlFilesCreator.save(translations, '.', 'test');
expect(fileRepository.saveData).toBeCalledWith(
translations[0].content,
`messages.${translations[0].lang}`,
'yml',
'.'
);
expect(fileRepository.saveData).toBeCalledWith(
translations[1].content,
`messages.${translations[1].lang}`,
'yml',
'.'
);
expect(fileRepository.saveData).toBeCalledWith(
translations[2].content,
`messages.${translations[2].lang}`,
'yml',
'.'
);
expect(fileRepository.saveData).toHaveBeenCalledTimes(3);
});
});
|
Terahertz laser based standoff imaging system
Definition and design of a terahertz standoff imaging system has been theoretically investigated. Utilizing terahertz quantum cascade lasers for the transmitter and local oscillator, a detailed analysis of the expected performance of an active standoff imaging system based on coherent heterodyne detection has been carried out. Five atmospheric windows between 0.3 THz and 4.0 THz have been identified and quantified by carrying out laboratory measurements of atmospheric transmission as a function of relative humidity. Using the approximate center frequency of each of these windows, detailed calculations of expected system performance vs. target distance, pixel resolution, and relative humidity were carried out. It is shown that with 1.5 THz laser radiation, a 10 m standoff distance, a 1 m × 1 m target area, and 1 cm × 1 cm pixel resolution, a viable imaging system should be achievable. Performance calculations for various target distances, target pixel resolutions, and laser frequencies are presented.
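The humidity-dependent tradeoff between standoff distance and received signal described above can be sketched with a simple round-trip Beer-Lambert link budget. This is an illustrative sketch only: the transmit power, absorption coefficient, target reflectivity, NEP, and bandwidth below are assumed placeholder values, not the measurements reported here, and a full heterodyne-receiver analysis would also include mixer conversion gain and quantum-limited noise terms.

```python
import math

def received_power(p_tx_w, alpha_per_m, distance_m, target_reflectivity=0.1):
    """Round-trip Beer-Lambert attenuation for an active standoff imager.

    alpha_per_m is the (humidity-dependent) atmospheric absorption
    coefficient at the chosen window frequency; the beam traverses the
    path twice (out to the target and back).
    """
    path_loss = math.exp(-alpha_per_m * 2.0 * distance_m)
    return p_tx_w * target_reflectivity * path_loss

def snr_db(p_rx_w, nep_w_per_sqrt_hz, bandwidth_hz):
    """Detector-noise-limited SNR estimate in dB."""
    noise_w = nep_w_per_sqrt_hz * math.sqrt(bandwidth_hz)
    return 10.0 * math.log10(p_rx_w / noise_w)

# Placeholder numbers: 10 mW QCL, alpha = 0.05 /m at 1.5 THz and moderate
# humidity, 10 m standoff, 1 MHz detection bandwidth, NEP = 1e-12 W/sqrt(Hz).
p_rx = received_power(10e-3, 0.05, 10.0)
print(f"received power: {p_rx:.3e} W")
print(f"SNR: {snr_db(p_rx, 1e-12, 1e6):.1f} dB")
```

Sweeping `distance_m` and `alpha_per_m` over the five windows reproduces the shape, if not the exact values, of performance-vs-distance curves like those the abstract describes.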
package main
import (
"github.com/ligato/cn-infra/agent"
"github.com/ligato/cn-infra/datasync"
"github.com/ligato/cn-infra/datasync/kvdbsync"
"github.com/ligato/cn-infra/datasync/kvdbsync/local"
"github.com/ligato/cn-infra/datasync/resync"
"github.com/ligato/cn-infra/db/keyval/etcd"
"github.com/ligato/cn-infra/logging"
"github.com/ligato/cn-infra/logging/logmanager"
"github.com/ligato/vpp-agent/cmd/vpp-agent/app"
"github.com/ligato/vpp-agent/plugins/orchestrator"
)
func main() {
// Create an instance of our custom agent.
p := NewCustomAgent()
// Create new agent with our plugin instance.
a := agent.NewAgent(agent.AllPlugins(p))
// Run starts the agent with plugins, waits until shutdown
// and then stops the agent and plugins.
if err := a.Run(); err != nil {
logging.DefaultLogger.Fatalln(err)
}
}
// CustomAgent represents our plugin.
type CustomAgent struct {
LogManager *logmanager.Plugin
app.VPP
app.Linux
Orchestrator *orchestrator.Plugin
KVDBSync *kvdbsync.Plugin
Resync *resync.Plugin
}
// NewCustomAgent returns new CustomAgent instance.
func NewCustomAgent() *CustomAgent {
p := &CustomAgent{
LogManager: &logmanager.DefaultPlugin,
}
etcdDataSync := kvdbsync.NewPlugin(kvdbsync.UseKV(&etcd.DefaultPlugin))
p.KVDBSync = etcdDataSync
p.Resync = &resync.DefaultPlugin
p.VPP = app.DefaultVPP()
p.Linux = app.DefaultLinux()
// connect IfPlugins for Linux & VPP
p.Linux.IfPlugin.VppIfPlugin = p.VPP.IfPlugin
p.VPP.IfPlugin.LinuxIfPlugin = p.Linux.IfPlugin
p.VPP.IfPlugin.NsPlugin = p.Linux.NSPlugin
// Set watcher for KVScheduler.
watchers := datasync.KVProtoWatchers{
local.DefaultRegistry,
etcdDataSync,
}
orch := &orchestrator.DefaultPlugin
orch.Watcher = watchers
p.Orchestrator = orch
return p
}
// String is used to identify the plugin by giving it name.
func (p *CustomAgent) String() string {
return "CustomAgent"
}
// Init is executed on agent initialization.
func (p *CustomAgent) Init() error {
logging.DefaultLogger.Info("Initializing CustomAgent")
return nil
}
// AfterInit is executed after initialization of all plugins. It's optional
// and used for executing operations that require plugins to be initialized.
func (p *CustomAgent) AfterInit() error {
p.Resync.DoResync()
logging.DefaultLogger.Info("CustomAgent is Ready")
return nil
}
// Close is executed on agent shutdown.
func (p *CustomAgent) Close() error {
logging.DefaultLogger.Info("Shutting down CustomAgent")
return nil
}
|
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
ds = pd.read_csv("Salary_Data.csv")
x = ds.iloc[:, :-1].values
y = ds.iloc[:, 1].values
from sklearn.model_selection import train_test_split
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=1/3)
from sklearn.linear_model import LinearRegression
regressor = LinearRegression()
regressor.fit(x_train, y_train)
y_pred = regressor.predict(x_test)
# Visualize the fit on the training set
plt.scatter(x_train, y_train, color='red')
plt.plot(x_train, regressor.predict(x_train))
plt.title("Salary vs years of experience (training set)")
plt.xlabel("Years of experience")
plt.ylabel("Salary")
plt.show()
# Verify on the test set
plt.scatter(x_test, y_test, color='red')
plt.plot(x_train, regressor.predict(x_train))
plt.title("Salary vs years of experience (test set)")
plt.xlabel("Years of experience")
plt.ylabel("Salary")
plt.show()
def capture_snapshots(self, path_globs_and_roots):
return self._scheduler.capture_snapshots(path_globs_and_roots) |
/// Create a memref allocation with the given type and dynamic extents.
static FailureOr<Value> createAlloc(OpBuilder &b, Location loc, MemRefType type,
ValueRange dynShape,
const BufferizationOptions &options) {
if (options.allocationFn)
return (*options.allocationFn)(b, loc, type, dynShape,
options.bufferAlignment);
Value allocated = b.create<memref::AllocOp>(
loc, type, dynShape, b.getI64IntegerAttr(options.bufferAlignment));
return allocated;
} |
import * as React from "react"
import { css } from "theme-ui"
const Footer = () => (
<footer
css={css({
mt: 4,
pt: 3,
})}
>
</footer>
)
export default Footer
|
/** Utility method:
 * Formulate an HTTP POST request to upload the batch query file.
 * @param queryBody the batch query text to upload as the "infile" part
 * @return a multipart POST request ready to execute
 * @throws UnsupportedEncodingException if a string body cannot be encoded
 */
private HttpPost makePost(String queryBody) throws UnsupportedEncodingException {
HttpPost post = new HttpPost(ClueWebSearcher.BATCH_CATB_BASEURL);
InputStreamBody qparams =
new InputStreamBody(
new ReaderInputStream(new StringReader(queryBody)),
"text/plain",
"query.txt");
MultipartEntity entity = new MultipartEntity();
entity.addPart("viewstatus", new StringBody("1"));
entity.addPart("indextype", new StringBody("catbparams"));
entity.addPart("countmax", new StringBody("100"));
entity.addPart("formattype", new StringBody(format));
entity.addPart("infile", qparams);
post.setEntity(entity);
return post;
} |
def random_page(language: str) -> Page:
    """Fetch a random page for the given language, converting request
    errors into a ClickException for clean CLI error reporting."""
    url = create_url(language)
    try:
with requests.get(url) as response:
response.raise_for_status()
return Page(**response.json())
except (requests.exceptions.RequestException, TypeError) as error:
message = str(error)
raise click.ClickException(message) |
# Smallest radius d such that n lanterns at positions a light the whole
# street [0, l]: the uncovered ends, plus half the largest gap between
# adjacent lanterns.
n, l = [int(x) for x in input().split()]
a = [int(x) for x in input().split()]
a.sort()
if n > 1:
    half_gaps = [(a[i + 1] - a[i]) / 2 for i in range(n - 1)]
    print(max(a[0], l - a[n - 1], max(half_gaps)))
else:
    print(max(a[0], l - a[0]))