On his last night, Daxon Stankey went to bed happy and content, just like any other night. At 10 months old, he was busy all day, crawling and happily tucking into any food he was offered. The mere sight of a snack was enough to make the baby grin. It wouldn’t be long before he started to walk—he was already trying.
Because his mom, Krystal Needham, was trying to wean Daxon off breast milk, her boyfriend volunteered to put her son in his crib for the night. When he came back downstairs, he reported that Daxon had settled on his belly. “I didn’t really worry about it,” says Needham. “He always rolls onto his tummy anyway, even if I put him down on his back.”
When Needham woke up in the morning, her first thought was that it seemed late. She went into Daxon’s room to check on him. Lying in the middle of his crib, fists clenched, her young son was lifeless.
“He was face down and he was gone. It seemed like he had been for hours,” recalled Needham, who, five months after Daxon died, is still wracked with grief at her last memories of her little boy. She’s still trying to make sense of his death. “He didn’t suffocate. He could turn his head, he could roll over, he was almost walking,” she says. “His face looked smushed, like something out of a nightmare. He didn’t look like my baby at all.”
Later, when the local Calgary medical examiner called, Needham asked the pathologist if Daxon had died from SIDS—sudden infant death syndrome. The term SIDS has been used since the late 1960s to describe the unexpected deaths of babies who appear to be healthy and developing well but, for no explicable reason, go to sleep and do not wake up. They stop breathing at some point during their sleep. And while researchers warn against taking risks that can make it difficult for babies to breathe—such as having bumpers, stuffed animals or too many blankets in the crib, smoking in the house, keeping the temperature too hot or cold, and putting babies to sleep on their tummies or sides—the actual cause of death is usually impossible to determine with certainty.
SIDS is considered the leading cause of death among babies aged one month to one year old in Canada. Other common terms for the phenomenon include “crib death” or “sudden unexpected infant death.”
But, when Needham asked about SIDS, she was told the term is no longer in use. Instead, Daxon’s death would be classified as “undetermined.” The word immediately made Needham uncomfortable, as if authorities thought she was responsible for Daxon’s death or had done something wrong, she says. Across the country, dozens of bereaved parents who have also been told their infants’ sudden deaths are due to “undetermined” causes are struggling with the same worry and confusion.
Use of the term is part of a cross-country shift by coroners and medical examiners who have decided to stop using SIDS to classify sudden and unexpected infant deaths.
The shift was first suggested in 2010 and, after much research, instituted in 2012, says Lisa Lapointe, BC’s Chief Coroner and chair of the Canadian Forum of Chief Coroners and Chief Medical Examiners. The goal in making the change, says Lapointe, is to improve the accuracy of national statistics related to sudden infant deaths by ensuring all provinces are using the same terminology to classify them. In the past, while some used the term SIDS, others used “sudden unexpected death of an infant” (SUDI) or “undetermined,” making tracking difficult.
“The term SIDS was really introduced so we had some way of explaining why infants died suddenly and unexpectedly where no other cause of death was established at autopsy,” says Lapointe. “Over time, people have come to see it as a diagnosis. But it actually just meant no cause of death had been established,” she explains. “The feeling was, Why are we putting a name to something when, basically, we have no idea why this child died?”
Lapointe says using the term “undetermined” seems to be more accurate. But the word has put many traumatized, grieving parents on edge.
“To see that on paper, it kind of makes you wonder if there’s something that you did wrong,” says Erin Inglehart, a Martensville, Sask., mother whose one-year-old son Nathan died during his sleep, suddenly and unexpectedly, about three years ago. His cause of death was also classified as “undetermined.” “SIDS parents already feel an enormous amount of guilt related to their child’s death,” says Inglehart. “‘Undetermined’ feels like the blame is placed back on the parent.”
Parents also struggle with how to explain a death classified as “undetermined” to their friends and family.
Lapointe says the aim is not to cast suspicion on parents. “There’s never any intention to shame parents at all. We’re always trying to look at what could be done to prevent similar deaths in the future.” Key to that, she says, is moving away from using SIDS as a catch-all term to describe young babies who die unexpectedly during sleep. While some have no identifiable cause of death, others are actually asphyxiated (which can be the result of sleeping on soft surfaces, such as a couch or familial bed, or having too many blankets, stuffed animals or bumpers in the crib). Separating those cases from the truly undetermined cases will narrow researchers’ focus and hopefully improve their chances of one day uncovering the cause of sudden infant death, says Lapointe.
In the meantime, parents need to know that even though the term SIDS is falling out of favour, taking precautions to limit risk factors is still important.
“Infants still die suddenly and unexpectedly for no reason,” says Lapointe. “We know there are some risk factors parents can be really cautious about to help reduce the chances of their child dying.”
Some of those precautions parents can take include putting babies to sleep on their backs in their own crib or bassinet, and not on a shared sleep surface, says Ian Mitchell, a SIDS researcher and professor emeritus of paediatrics at the University of Calgary. Parents should not smoke during pregnancy or in the house. Infant sleeping environments should be free of extra and loose blankets, bumpers, pillows and stuffed animals, and kept at a moderate temperature. But in some cases it’s still not enough. “You can’t prevent every case,” says Mitchell. “But we have been able to prevent many cases. The numbers are way down all over the world.” Still, social trends like co-sleeping have led to a small uptick in cases in North America, he says.
Needham, who is expecting another baby in July, says she will take precautions with the new baby but knows her efforts may not guarantee his safety.
“It doesn’t really seem like you can prevent SIDS, but you can prevent suffocation,” she says. “I’m terrified of it happening again and I’d rather know if there’s something I can avoid doing.”
Read more:
Transitioning from a bassinet: How to get baby to sleep in crib
7 sleep mistakes new parents make
What you need to know about the new safe sleep guidelines for babies
function delay(ms: number) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

async function type(element: HTMLElement, text: string) {
  for (let i = 0; i < text.length; i++) {
    await key(text[i], element);
  }
}

async function key(character: string, element: HTMLElement) {
  switch (character) {
    case "+":
      // Pause
      element.className = "waiting typing";
      break;
    case "-":
      // Backspace
      element.className = "typing";
      element.innerHTML = element.innerHTML.slice(0, -1);
      break;
    case "^":
      // New line
      element.className = "";
      break;
    default:
      element.className = "typing";
      element.innerHTML = element.innerHTML + character;
  }
  await delay(100);
}
async function main() {
  const title = "++++Quality+++++--+---+-6++++ Tech+nolog++ies+++^";
  const subTitle = "Quality is in the name^";
  const contact = "contacr++-+t+@++<EMAIL>+";
  const mainTitleElement = document.getElementById("main-title")!;
  const subTitleElement = document.getElementById("sub-title")!;
  const contactElement = document.getElementById("contact")!;
  mainTitleElement.textContent = "";
  subTitleElement.textContent = "";
  contactElement.textContent = "";
  document.body.className = "";
  await type(mainTitleElement, title);
  await type(subTitleElement, subTitle);
  await type(contactElement, contact);
}
main();
// Test HandleCode with an incorrect state.
func TestHandleHomeCodeErrorState(t *testing.T) {
	token := initToken()
	token.state = "abn2xy"
	err := loadCredentials(token)
	if err != nil {
		t.Fatalf("could not add client credentials %s", err)
	}
	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
		w.Write([]byte(`{"access_token": "abc", "refresh_token": "def", "expires_in": 1800}`))
	}))
	defer server.Close()
	token.tokenURL = server.URL
	handler := token.HandleCode
	req := httptest.NewRequest("GET", "http://127.0.0.1:5001/code?code=abc", nil)
	w := httptest.NewRecorder()
	handler(w, req)
	resp := w.Result()
	statusCode := resp.StatusCode
	if statusCode != 403 {
		t.Errorf("Status code %d != 403", statusCode)
	}
}
import { gql, useQuery } from "@apollo/client";
import { useRouter } from "next/router";
import React, { useContext, useState } from "react";
import { useMediaQuery } from "react-responsive";
import PageWithNavAndPanel from "../components/PageWithNavAndPanel";
import SearchBar from "../components/SearchBar";
import SkillPanel from "../components/SkillPanel";
import UserPanel from "../components/UserPanel";
import { i18nContext } from "../utils/i18nContext";
const SEARCH_QUERY = gql`
query searchSkillsAndProfiles($search: String!) {
skills: ZenikasAverageCurrentSkillsAndDesires(
where: { name: { _ilike: $search } }
order_by: { name: asc }
) {
name
skillLevel: averageSkillLevel
desireLevel: averageDesireLevel
Category {
label
}
}
profiles: User(where: { name: { _ilike: $search } }) {
email
name
picture
UserLatestAgency {
agency
}
}
}
`;
const Search = ({ pathName }) => {
const { query } = useRouter();
const isDesktop = useMediaQuery({
query: "(min-device-width: 1280px)",
});
const { t } = useContext(i18nContext);
const [search, setSearch] = useState("");
const { data, error } = useQuery(SEARCH_QUERY, {
variables: { search: `%${search}%` },
});
if (error) {
console.error(error);
}
const skills = data?.skills;
const profiles = data?.profiles;
return (
<PageWithNavAndPanel pathName={pathName} context={""}>
<div className="flex justify-center mb-16">
<div
className={`flex ${isDesktop ? "w-2/3" : "w-full"} flex-col mx-4 `}
>
<SearchBar
initialValue={search}
setSearch={setSearch}
placeholder={t("search.placeholder")}
/>
{search && search !== "" ? (
<>
<div className="flex flex-col my-2 py-2">
<h1 className="text-xl">{t("search.skills")}</h1>
{skills?.length > 0 ? (
skills.map((skill) => (
<SkillPanel
key={skill.name}
skill={skill}
categoryLabel={skill.Category?.label}
context={"zenika"}
/>
))
) : (
<span className="text-sm">{t("search.noSkill")}</span>
)}
</div>
<div className="flex flex-col my-2">
<h1 className="text-xl">{t("search.profiles")}</h1>
{profiles?.length > 0 ? (
profiles.map((profile) => (
<UserPanel
key={profile.email}
context=""
user={{
name: profile.name,
agency: profile.UserLatestAgency?.agency,
picture: profile.picture,
}}
/>
))
) : (
<span className="text-sm">{t("search.noProfile")}</span>
)}
</div>
</>
) : (
<></>
)}
</div>
</div>
</PageWithNavAndPanel>
);
};
export default Search;
// Close will close all loops in this object and any
// subListeners. This method will also cause an error to be
// raised on both the Accept() and AcceptFailures() methods
// for any listening loops that are above this transport
// object. This method is safe to be called multiple times
// from different goroutines.
func (pt *P2PTransport) Close() error {
	fn := func() {
		close(pt.closeChan)
		go pt.listener.Close()
		pt.logger.Info("P2PTransport closed.")
	}
	go pt.closeOnce.Do(fn)
	return nil
}
export { CrowdReport32 as default } from "../";
Senate Dems push for spending deal
John Boehner (left) and Daniel Inouye still have a way to go to reach a deal on appropriations.
Senate defeat of House Republican spending cuts puts the burden back on Speaker John Boehner to show more flexibility even as Democrats and President Barack Obama must summon more unity if they are to capitalize on the win.
“Today’s vote establishes there is clearly a need to work toward a reasonable middle ground,” White House Budget Director Jack Lew told POLITICO Wednesday. Senate Democratic Leader Harry Reid (D-Nev.) predicted that “this paves the way to get something done.”
Republicans fell well short of a simple majority, let alone the 60 votes needed for Senate passage, on the 56-44 roll call. And, as in the House last month, the measure won not a single Democratic vote—a remarkable failure given the political pressure on moderates running in 2012 to embrace more spending reductions.
Nonetheless, the level of Democratic unhappiness in the Senate is such that Boehner has real opportunities if he can recalibrate the House approach. Just minutes after the Republican defeat, a Democratic budget alternative failed 58-42 after a mix of moderates and liberals walked away, calling the proposal an inadequate response to the debt problems facing the nation.
“Many people are in denial around here,” Sen. Claire McCaskill (D-Mo.) told her colleagues.
“Any plan to tackle our fiscal crisis must make a material difference in reducing the deficit,” said Sen. Michael Bennet (D-Colo.). “And everyone should be asked to shoulder part of the burden.”
Boehner’s initial reaction Wednesday was to stall for more time, saying Democrats had still failed to come up with a legitimate counter-offer to the House bill. “It’s time for Washington Democrats to present a serious plan to cut spending,” the speaker said.
But Reid, who met later with House Democratic leaders, appears open to broadening the discussion now to look beyond appropriations and include tax reform provisions or savings from mandatory programs, such as farm subsidies, for example.
“Our goal is to fund the government for the rest of this year and the out years,” Reid told reporters after the votes. “We are going to try to get a universal deal, something that’s good for the country…We’re going to look at everything.”
That’s very likely a non-starter for many House Republicans, who want to keep a single-minded focus on rolling back domestic and foreign aid appropriations to the levels seen in the last year of the Bush administration. And it’s not clear yet how far the White House, which will be increasingly driving the talks and wants a deal done in the next month, will move in that direction either.
from typing import Sequence


def gpu_scheduler(
    commands: Sequence[str],
    wait_time_in_secs: int = 180,
    log=True,
    maxMemory=1e-4,
    maxLoad=1e-4,
    excludeID=(),
    excludeUUID=(),
):
    print(f'Scheduling {len(commands)} jobs...')
    import os
    import time
    import subprocess
    import GPUtil
    procs = []
    for job_id, command in enumerate(commands):
        # Poll until at least one GPU is (nearly) idle.
        empty_gpus = []
        while len(empty_gpus) == 0:
            empty_gpus = GPUtil.getAvailable(
                order='first',
                maxLoad=maxLoad,
                maxMemory=maxMemory,
                limit=1,
                excludeID=excludeID,
                excludeUUID=excludeUUID,
            )
            time.sleep(1)
        print(f'empty gpus: {empty_gpus}')
        gpu_id = empty_gpus[0]
        command = f"export CUDA_VISIBLE_DEVICES={gpu_id}; {command}"
        command = command.strip()
        if log and '--train_dir' in command:
            # extract_argument is assumed to be defined elsewhere in the module.
            train_dir = extract_argument(command)
            log_path = os.path.join(train_dir, 'log.out')
            command += f" > {log_path} 2>&1 "
            command = f"mkdir -p {train_dir}; \n{command}"
        proc = subprocess.Popen(
            [command],
            shell=True, stdin=None, stdout=None, stderr=None, close_fds=True
        )
        procs.append(proc)
        print('command: ')
        print(command)
        print(f'scheduled job: {job_id} on gpu: {gpu_id}')
        time.sleep(wait_time_in_secs)
    return procs
// Buffered Channels
// By default, channels are unbuffered, which means a channel only
// accepts a value when there is a corresponding receive ready.
// Buffered channels act like internal data stores: they accept a limited
// number of values without a receiver for those values. A send on a
// buffered channel blocks only when the channel is full.
// We'll see how to create a buffered channel below.
package main

import "fmt"

type Info struct {
	Name    string
	Address string
}

func main() {
	// We just made a buffered channel by adding an extra parameter to the
	// make function. The extra parameter specifies the size of the buffer.
	// Here we created a buffered channel that can hold two Info values.
	bufChannel := make(chan Info, 2)
	bufChannel <- Info{
		Name:    "John",
		Address: "Georgia",
	}
	bufChannel <- Info{
		Name:    "James",
		Address: "NY City",
	}
	// Since the size of the buffered channel is 2, sending another value to
	// the channel would result in a deadlock. What is a deadlock?
	// A deadlock is a situation where a set of processes is blocked because
	// each process is holding a resource while waiting for a resource held
	// by another process.

	// prints out the output
	fmt.Println(<-bufChannel)
	fmt.Println(<-bufChannel)
}
/**
* Copyright (C) 2015 The Gravitee team (http://gravitee.io)
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package io.gravitee.am.gateway.handler.oidc.resources.endpoint;
import io.gravitee.am.common.exception.oauth2.InvalidRequestException;
import io.gravitee.am.common.exception.oauth2.InvalidTokenException;
import io.gravitee.am.common.jwt.JWT;
import io.gravitee.am.common.oidc.CustomClaims;
import io.gravitee.am.common.oidc.Scope;
import io.gravitee.am.common.oidc.StandardClaims;
import io.gravitee.am.gateway.handler.common.jwt.JWTService;
import io.gravitee.am.gateway.handler.common.utils.ConstantKeys;
import io.gravitee.am.gateway.handler.common.vertx.utils.UriBuilderRequest;
import io.gravitee.am.gateway.handler.oidc.service.discovery.OpenIDDiscoveryService;
import io.gravitee.am.gateway.handler.oidc.service.idtoken.IDTokenService;
import io.gravitee.am.gateway.handler.oidc.service.jwe.JWEService;
import io.gravitee.am.gateway.handler.oidc.service.request.ClaimsRequest;
import io.gravitee.am.model.Role;
import io.gravitee.am.model.User;
import io.gravitee.am.model.oidc.Client;
import io.gravitee.am.service.UserService;
import io.gravitee.common.http.HttpHeaders;
import io.gravitee.common.http.MediaType;
import io.reactivex.Maybe;
import io.reactivex.Single;
import io.vertx.core.Handler;
import io.vertx.core.json.Json;
import io.vertx.reactivex.ext.web.RoutingContext;
import java.util.*;
import java.util.stream.Collectors;
import static java.util.Optional.ofNullable;
/**
* The Client sends the UserInfo Request using either HTTP GET or HTTP POST.
* The Access Token obtained from an OpenID Connect Authentication Request MUST be sent as a Bearer Token, per Section 2 of OAuth 2.0 Bearer Token Usage [RFC6750].
* It is RECOMMENDED that the request use the HTTP GET method and the Access Token be sent using the Authorization header field.
*
* See <a href="http://openid.net/specs/openid-connect-core-1_0.html#UserInfo">5.3.1. UserInfo Request</a>
*
* The UserInfo Endpoint is an OAuth 2.0 Protected Resource that returns Claims about the authenticated End-User.
* To obtain the requested Claims about the End-User, the Client makes a request to the UserInfo Endpoint using an Access Token obtained through OpenID Connect Authentication.
* These Claims are normally represented by a JSON object that contains a collection of name and value pairs for the Claims.
*
* See <a href="http://openid.net/specs/openid-connect-core-1_0.html#UserInfo">5.3. UserInfo Endpoint</a>
*
* @author <NAME> (david.brassely at graviteesource.com)
* @author <NAME> (titouan.compiegne at graviteesource.com)
* @author GraviteeSource Team
*/
public class UserInfoEndpoint implements Handler<RoutingContext> {
private UserService userService;
private JWTService jwtService;
private JWEService jweService;
private OpenIDDiscoveryService openIDDiscoveryService;
public UserInfoEndpoint(UserService userService,
JWTService jwtService,
JWEService jweService,
OpenIDDiscoveryService openIDDiscoveryService) {
this.userService = userService;
this.jwtService = jwtService;
this.jweService = jweService;
this.openIDDiscoveryService = openIDDiscoveryService;
}
@Override
public void handle(RoutingContext context) {
JWT accessToken = context.get(ConstantKeys.TOKEN_CONTEXT_KEY);
Client client = context.get(ConstantKeys.CLIENT_CONTEXT_KEY);
String subject = accessToken.getSub();
userService.findById(subject)
.switchIfEmpty(Maybe.error(new InvalidTokenException("No user found for this token")))
// enhance user information
.flatMapSingle(user -> enhance(user, accessToken))
// process user claims
.map(user -> processClaims(user, accessToken))
// encode response
.flatMap(claims -> {
if (!expectSignedOrEncryptedUserInfo(client)) {
context.response().putHeader(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON);
return Single.just(Json.encodePrettily(claims));
} else {
context.response().putHeader(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JWT);
JWT jwt = new JWT(claims);
jwt.setIss(openIDDiscoveryService.getIssuer(UriBuilderRequest.resolveProxyRequest(context)));
jwt.setSub(accessToken.getSub());
jwt.setAud(accessToken.getAud());
jwt.setIat(new Date().getTime() / 1000L);
jwt.setExp(accessToken.getExp() / 1000L);
return jwtService.encodeUserinfo(jwt, client)//Sign if needed, else return unsigned JWT
.flatMap(userinfo -> jweService.encryptUserinfo(userinfo, client));//Encrypt if needed, else return JWT
}
}
)
.subscribe(
buffer -> context.response()
.putHeader(HttpHeaders.CACHE_CONTROL, "no-store")
.putHeader(HttpHeaders.PRAGMA, "no-cache")
.end(buffer)
,
error -> context.fail(error)
);
}
/**
* Process user claims against user data and access token information
* @param user the end user
* @param accessToken the access token
* @return user claims
*/
private Map<String, Object> processClaims(User user, JWT accessToken) {
final Map<String, Object> fullProfileClaims = ofNullable(user.getAdditionalInformation()).orElse(Map.of());
if (!fullProfileClaims.containsKey(StandardClaims.SUB)) {
// The sub (subject) Claim MUST always be returned in the UserInfo Response.
// https://openid.net/specs/openid-connect-core-1_0.html#UserInfoResponse
throw new InvalidRequestException("UserInfo response is missing required claims");
}
Map<String, Object> userClaims = new HashMap<>();
// Exchange the sub claim from the identity provider to its technical id
userClaims.put(StandardClaims.SUB, user.getId());
// prepare requested claims
Map<String, Object> requestedClaims = new HashMap<>();
// SUB claim is required
requestedClaims.put(StandardClaims.SUB, user.getId());
boolean requestForSpecificClaims = false;
// processing claims list
// 1. process the request using scope values
if (accessToken.getScope() != null) {
final Set<String> scopes = new HashSet<>(Arrays.asList(accessToken.getScope().split("\\s+")));
requestForSpecificClaims = processScopesRequest(scopes, userClaims, requestedClaims, fullProfileClaims);
}
// 2. process the request using the claims values (If present, the listed Claims are being requested to be added to any Claims that are being requested using scope values.
// If not present, the Claims being requested from the UserInfo Endpoint are only those requested using scope values.)
if (accessToken.getClaimsRequestParameter() != null) {
requestForSpecificClaims = processClaimsRequest((String) accessToken.getClaimsRequestParameter(), fullProfileClaims, requestedClaims);
}
// remove technical claims that are useless for the calling app
IDTokenService.EXCLUDED_CLAIMS.forEach(key -> userClaims.remove(key));
return (requestForSpecificClaims) ? requestedClaims : userClaims;
}
/**
* For OpenID Connect, scopes can be used to request that specific sets of information be made available as Claim Values.
*
* @param scopes scopes request parameter
* @param userClaims requested claims from scope
* @param requestedClaims requested claims
* @param fullProfileClaims full profile claims
* @return true if OpenID Connect scopes have been found
*/
private boolean processScopesRequest(Set<String> scopes,
Map<String, Object> userClaims,
Map<String, Object> requestedClaims,
final Map<String, Object> fullProfileClaims
) {
// if full_profile requested, continue
if (scopes.contains(Scope.FULL_PROFILE.getKey())) {
userClaims.putAll(fullProfileClaims);
return false;
}
// get requested scopes claims
final List<String> scopesClaimKeys = scopes.stream()
.map(String::toUpperCase)
.filter(scope -> Scope.exists(scope) && !Scope.valueOf(scope).getClaims().isEmpty())
.map(Scope::valueOf)
.map(Scope::getClaims)
.flatMap(List::stream)
.collect(Collectors.toList());
// no OpenID Connect scopes requested continue
if (scopesClaimKeys.isEmpty()) {
return false;
}
// return specific available sets of information made by scope value request
scopesClaimKeys.stream()
.filter(fullProfileClaims::containsKey)
.forEach(scopeClaim ->
requestedClaims.putIfAbsent(scopeClaim, fullProfileClaims.get(scopeClaim))
);
return true;
}
/**
* Handle claims request previously made during the authorization request
* @param claimsValue claims request parameter
* @param fullProfileClaims user full claims list
* @param requestedClaims requested claims
* @return true if userinfo claims have been found
*/
private boolean processClaimsRequest(String claimsValue, final Map<String, Object> fullProfileClaims, Map<String, Object> requestedClaims) {
try {
ClaimsRequest claimsRequest = Json.decodeValue(claimsValue, ClaimsRequest.class);
if (claimsRequest != null && claimsRequest.getUserInfoClaims() != null) {
claimsRequest.getUserInfoClaims().forEach((key, value) -> {
if (fullProfileClaims.containsKey(key)) {
requestedClaims.putIfAbsent(key, fullProfileClaims.get(key));
}
});
return true;
}
} catch (Exception e) {
// Any members used that are not understood MUST be ignored.
}
return false;
}
/**
* Enhance user information with roles and groups if the access token contains those scopes
* @param user The end user
* @param accessToken The access token with required scopes
* @return enhanced user
*/
private Single<User> enhance(User user, JWT accessToken) {
if (!loadRoles(user, accessToken) && !loadGroups(accessToken)) {
return Single.just(user);
}
return userService.enhance(user)
.map(user1 -> {
Map<String, Object> userClaims = user.getAdditionalInformation() == null ?
new HashMap<>() :
new HashMap<>(user.getAdditionalInformation());
if (user.getRolesPermissions() != null && !user.getRolesPermissions().isEmpty()) {
userClaims.putIfAbsent(CustomClaims.ROLES, user.getRolesPermissions().stream().map(Role::getName).collect(Collectors.toList()));
}
if (user.getGroups() != null && !user.getGroups().isEmpty()) {
userClaims.putIfAbsent(CustomClaims.GROUPS, user.getGroups());
}
user1.setAdditionalInformation(userClaims);
return user1;
});
}
/**
* @param client Client
* @return Return true if client request signed or encrypted (or both) userinfo.
*/
private boolean expectSignedOrEncryptedUserInfo(Client client) {
return client.getUserinfoSignedResponseAlg() != null || client.getUserinfoEncryptedResponseAlg() != null;
}
private boolean loadRoles(User user, JWT accessToken) {
return accessToken.hasScope(Scope.ROLES.getKey());
}
private boolean loadGroups(JWT accessToken) {
return accessToken.hasScope(Scope.GROUPS.getKey());
}
}
Dermal fibroblasts tumor suppression of ras-transformed keratinocytes is associated with induction of squamous cell differentiation.
We have previously reported that tumor formation of ras-transformed keratinocytes can be suppressed by dermal fibroblasts through production of a diffusible growth inhibitory factor of the transforming growth factor-beta (TGF-beta) family. Keratinocytes transformed by ras and E1a oncogenes or papilloma-derived keratinocytes transformed by a ras oncogene show concomitant resistance to dermal fibroblast tumor suppression and TGF-beta growth inhibition. We report here that dermal fibroblast tumor suppression is associated with a striking induction of squamous cell differentiation and that this effect is blocked in tumors resistant to dermal fibroblast inhibition. This experimental system strongly supports the notion that suppression of tumorigenicity and induction of a differentiated phenotype are closely associated events.
def cancel(self):
    logger.info(f'Loader: Canceling {len(self.queues)} batches')
    self.loop.call_soon_threadsafe(self.stop_progress)
    asyncio.run_coroutine_threadsafe(self.stop(), self.loop)
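A minimal, self-contained sketch of the pattern `cancel()` relies on: the event loop runs in a background thread, and other threads hand it work via `call_soon_threadsafe` and `run_coroutine_threadsafe`. The `Loader` class itself (`queues`, `stop_progress`, `stop`) is assumed; the stand-in `stop()` coroutine here just records that it ran.

```python
import asyncio
import threading

# Run an event loop in a background thread, as the Loader presumably does.
loop = asyncio.new_event_loop()
thread = threading.Thread(target=loop.run_forever, daemon=True)
thread.start()

stopped = []

async def stop():
    # Stand-in for Loader.stop(); records that the coroutine executed.
    stopped.append(True)

# Schedule the coroutine onto the loop from this (foreign) thread.
# run_coroutine_threadsafe returns a concurrent.futures.Future we can wait on.
future = asyncio.run_coroutine_threadsafe(stop(), loop)
future.result(timeout=5)

# Shut the loop down thread-safely, mirroring call_soon_threadsafe in cancel().
loop.call_soon_threadsafe(loop.stop)
thread.join(timeout=5)
print(stopped)  # [True]
```

Calling `loop.stop()` or awaiting a coroutine directly from another thread is not safe; these two functions are the supported cross-thread entry points into a running loop.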
/**
* Create the table showing the labels list
*/
private Table createTable(Composite parent, boolean multiSelect) {
int flags = SWT.BORDER | SWT.H_SCROLL | SWT.V_SCROLL
| SWT.FULL_SELECTION;
if (multiSelect) {
flags |= SWT.MULTI;
}
final Table table = new Table(parent, flags);
table.setHeaderVisible(true);
table.setLinesVisible(true);
table.addDisposeListener(new DisposeListener() {
public void widgetDisposed(DisposeEvent e) {
saveColumnSizes();
PerforceUIPlugin.getPlugin().getPreferenceStore()
.removePropertyChangeListener(LabelsViewer.this);
}
});
GridData gd = new GridData();
gd.grabExcessHorizontalSpace = true;
gd.grabExcessVerticalSpace = true;
gd.horizontalAlignment = GridData.FILL;
gd.verticalAlignment = GridData.FILL;
table.setLayoutData(gd);
TableLayout layout = new TableLayout();
table.setLayout(layout);
SelectionListener headerListener = new SelectionAdapter() {
@Override
public void widgetSelected(SelectionEvent e) {
TableColumn column = (TableColumn) e.widget;
sorter.setSortColumn(column.getText());
labelsList.refresh();
}
};
labelsList = new TableViewer(table);
labelsList.setContentProvider(new ArrayContentProvider());
labelsList.setLabelProvider(new LabelsLabelProvider());
TableColumn labelColumn = addColumn(labelsList.getTable(), LABEL_COLUMN);
addColumn(labelsList.getTable(), OWNER_COLUMN);
addColumn(labelsList.getTable(), ACCESS_COLUMN);
addColumn(labelsList.getTable(), DESCRIPTION_COLUMN);
for (TableColumn column : labelsList.getTable().getColumns()) {
column.addSelectionListener(headerListener);
}
table.setSortColumn(labelColumn);
table.setSortDirection(SWT.UP);
Map<String, Integer> columnSizes = loadColumnSizes();
for (TableColumn column : labelsList.getTable().getColumns()) {
int width = 100;
if (columnSizes.containsKey(column.getText())) {
int size = columnSizes.get(column.getText()).intValue();
if (size > 0) {
width = size;
}
}
layout.addColumnData(new ColumnPixelData(width, true));
}
sorter = new LabelSorter(this.labelsList.getTable(),
labelColumn.getText());
sorter.setAscending();
labelsList.setSorter(sorter);
return table;
}
PORT ST. LUCIE, Fla. — Jon Niese has a new arm angle this spring. The reason is obvious, if complicated.
"Because my shoulder is healthy," he said. "Last year I was just trying to find an angle where it didn't hurt. This year, I'm 100 percent healthy so I'm able to actually work on my mechanics and have proper mechanics."
That is no small feat for Niese. Last year, the Mets pitcher's spring began with a sputter. He underwent two MRIs by Opening Day and started the season on the disabled list. Last month, he said it could have been better to miss the whole first month.
Now, Niese is without issue and focused on preparing for the season. He pitched 2 2/3 innings Sunday, allowing an earned run in 52 pitches after finding trouble in the third against the Red Sox.
Niese is using this month to tinker and tweak his arsenal of pitches.
"Cutting loose on everything," he says. "My cutter and curveball need a little bit more work but there's plenty of time for that."
His fastball command left him late in his outing but Niese attributed that to fatigue this early in spring. It was just his first Grapefruit League start.
Niese will be a part of a starting rotation the Mets are hoping can take them to the postseason. After six losing seasons, the 28-year-old sees the potential for a turnaround.
"I'm excited, especially looking at all the talent we have," Niese said.
"The sky is the limit for us this year as far as the pitching staff goes. I know our hitting has gotten better. I'm really excited about it."
Mike Vorkunov may be reached at [email protected] . Follow him on Twitter @Mike_Vorkunov . Find NJ.com Mets on Facebook |
// repo: yuto-moriizumi/UkrainianPeoplesRepublic
import DeclareWar from "./DeclareWar";
import SetOwner from "./SetOwner";
import Annex from "./Annex";
import Peace from "./Peace";
import ChangeName from "./ChangeName";
import GainAccess from "./GainAccess";
import Effect from "./Effect";
import DispatchEvent from "./DispatchEvent";
export default class EffectCreator {
public static createEffect(effect: any): Effect {
switch (effect.type) {
case "DeclareWar":
return Object.assign(new DeclareWar(), effect);
case "SetOwner":
return Object.assign(new SetOwner(), effect);
case "Annex":
return Object.assign(new Annex(), effect);
case "Peace":
return Object.assign(new Peace(), effect);
case "ChangeName":
return Object.assign(new ChangeName(), effect);
case "GainAccess":
return Object.assign(new GainAccess(), effect);
case "DispatchEvent":
return Object.assign(new DispatchEvent(), effect);
default:
console.log(effect);
throw new Error(
"No matching effect class was found: " + effect.type
);
}
}
}
|
// repo: GDGToulouse/devfest-embedded-devices-monorepo
export interface SocketEmitsUpdateConfig {
deploymentUuid: string;
}
|
def predictive(self, Xnew, output_function_ind=None, kern_list=None):
f_ind = self.Y_metadata['function_index'].flatten()
if output_function_ind is None:
output_function_ind = 0
d = output_function_ind
if kern_list is None:
kern_list = self.kern_list
Xmulti_all_new = self.Xmulti_all.copy()
Xmulti_all_new[f_ind[d]] = Xnew
posteriors = self.inference_method.posteriors(q_u_means=self.q_u_means, q_u_chols=self.q_u_chols, X=Xmulti_all_new,
Y=self.Ymulti_all, Z=self.Z, kern_list=self.kern_list, likelihood=self.likelihood,
B_list=self.B_list, Y_metadata=self.Y_metadata)
posterior_output = posteriors[output_function_ind]
Kx = np.zeros((Xmulti_all_new[f_ind[d]].shape[0], Xnew.shape[0]))
Kxx = np.zeros((Xnew.shape[0], Xnew.shape[0]))
for q, B_q in enumerate(self.B_list):
Kx += B_q.B[output_function_ind, output_function_ind] * kern_list[q].K(Xmulti_all_new[f_ind[d]], Xnew)
Kxx += B_q.B[output_function_ind, output_function_ind] * kern_list[q].K(Xnew, Xnew)
mu = np.dot(Kx.T, posterior_output.woodbury_vector)
Kxx = np.diag(Kxx)
var = (Kxx - np.sum(np.dot(np.atleast_3d(posterior_output.woodbury_inv).T, Kx) * Kx[None, :, :], 1)).T
return mu, np.abs(var) |
# e3nn/nn/__init__.py
from ._extract import Extract, ExtractIr
from ._activation import Activation
from ._batchnorm import BatchNorm
from ._fc import FullyConnectedNet
from ._gate import Gate
from ._identity import Identity
from ._s2act import S2Activation
from ._so3act import SO3Activation
from ._normact import NormActivation
from ._dropout import Dropout
__all__ = [
"Extract",
"ExtractIr",
"BatchNorm",
"FullyConnectedNet",
"Activation",
"Gate",
"Identity",
"S2Activation",
"SO3Activation",
"NormActivation",
"Dropout",
]
|
# argus/design/transformations.py
import numpy as np
from argus.utils import to_array
def to_classes(x, n_classes=2, min_value=None, max_value=None, agg_function=None, verbose=False):
x = to_array(x)
n_classes = int(n_classes)
if min_value is None:
min_value = np.min(x)
if max_value is None:
max_value = np.max(x)
if verbose:
print(f"Value min: {min_value}")
print(f"Value max: {max_value}")
if agg_function is None:
agg_function = np.mean
step = (max_value - min_value) / n_classes
classes_array = np.zeros((x.shape[0],))
for j in range(n_classes):
if verbose:
print(f"Interval lower bound: {min_value + j * step}")
print(f"Interval upper bound: {min_value + (j + 1) * step}")
if j < n_classes - 1:
idx = np.where((0 <= x - min_value - j * step) & (x - min_value - j * step < step))[0]
else:
# if last chunk, take inferior or equal (instead of strictly inferior) to max value
# n_steps * steps =/= max_value because of float approximation
idx = np.where((0 <= x - min_value - j * step) & (x <= max_value))[0]
classes_array[idx] = agg_function(x[idx])
return classes_array |
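For illustration, a minimal standalone sketch of the same equal-width binning idea (the helper name `to_classes_sketch` is hypothetical, and `np.asarray` stands in for `argus.utils.to_array`, which isn't shown here):

```python
import numpy as np

def to_classes_sketch(x, n_classes=2, agg=np.mean):
    # Equal-width bins over [min, max]; each value is replaced by the
    # aggregate of its bin, mirroring the to_classes transformation above.
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(), x.max()
    step = (hi - lo) / n_classes
    out = np.zeros_like(x)
    for j in range(n_classes):
        if j < n_classes - 1:
            idx = (x >= lo + j * step) & (x < lo + (j + 1) * step)
        else:
            # last bin is closed on the right, like the original
            idx = (x >= lo + j * step) & (x <= hi)
        out[idx] = agg(x[idx])
    return out

vals = to_classes_sketch([1.0, 2.0, 3.0, 10.0], n_classes=2)
print(vals)  # [ 2.  2.  2. 10.] -- first three share bin [1, 5.5), 10 sits alone in [5.5, 10]
```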
def _parse_latency_line(data):
_min, _avg, _max = data.split("/")
return {
"latency_min": int(_min),
"latency_avg": int(_avg),
"latency_max": int(_max),
}
def _parse_mode_line(data):
value = data.strip()
return {"mode": value, "is_leader": 1 if (value == "leader") else 0}
def _parse_connections(data):
return {"connections": int(data)}
def _parse_outstanding(data):
return {"outstanding": int(data)}
def _parse_node_count(data):
return {"node_count": int(data)}
STAT_LINE_PARSER_CALLBACK = {
"latency min/avg/max": _parse_latency_line,
"connections": _parse_connections,
"outstanding": _parse_outstanding,
"mode": _parse_mode_line,
"node count": _parse_node_count,
}
def parse_stat_output(raw_output):
result_dict = {}
lines = [l.lower() for l in raw_output.split("\n") if ":" in l]
for line in lines:
key, value = line.split(":", 1)
try:
parser = STAT_LINE_PARSER_CALLBACK[key.strip()]
except KeyError:
continue
else:
parsed = parser(value)
result_dict.update(parsed)
return result_dict
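As a hedged illustration of the dispatch-table pattern used above, here is a condensed standalone version run against a made-up `stat` response (names like `PARSERS` and `parse_stat` are illustrative; real ZooKeeper output has more lines, which are simply skipped):

```python
def _latency(v):
    # "min/avg/max" -> three int fields
    lo, avg, hi = v.split("/")
    return {"latency_min": int(lo), "latency_avg": int(avg), "latency_max": int(hi)}

PARSERS = {
    "latency min/avg/max": _latency,
    "mode": lambda v: {"mode": v.strip(), "is_leader": int(v.strip() == "leader")},
    "connections": lambda v: {"connections": int(v)},
}

def parse_stat(raw):
    out = {}
    # lowercase and keep only "key: value" lines, as in parse_stat_output
    for line in (l.lower() for l in raw.split("\n") if ":" in l):
        key, value = line.split(":", 1)
        parser = PARSERS.get(key.strip())
        if parser:
            out.update(parser(value))
    return out

sample = "Latency min/avg/max: 0/2/10\nMode: leader\nConnections: 5\nZxid: 0x1a"
print(parse_stat(sample))
# {'latency_min': 0, 'latency_avg': 2, 'latency_max': 10, 'mode': 'leader', 'is_leader': 1, 'connections': 5}
```

Unrecognized keys (here `zxid`) fall through the dispatch table without raising, matching the `KeyError`/`continue` behavior above.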
|
On making holes in liquids
Abstract Just as a solid object would, a liquid jet or a stream of droplets impacting a free surface deforms and perforates it. This generic flow interaction, met in everyday life but also in cutting edge industrial processes, has puzzled scientists for centuries. Lee et al. (J. Fluid Mech., vol. 921, 2021, A8) present an experimental study of a simple droplet train interacting with a liquid bath and identify two stages in the interaction: a first where a cavity elongates and finally bursts, and a second where the interface is steadily punched by the incoming stream. Each of these regimes is explained with elementary but effective models arising from first principles, thereby revealing a full and simple picture of the physics of making holes in liquids. |
/**
* Helper class for building discrete distributions.
*
* @author Daniel
*
*/
public abstract class AFiniteDistribution<T> extends ADistributionBase<T>
implements IFiniteDistribution<T>
//, Collection<T> causes problems with serialisers :( Use asList() instead
{
/**
* If >0, the total weight (i.e. the sum of values).<br>
* Methods which edit the weight MUST either adjust this or zero it!
*
* NB: only a few sub-classes actually use this -- maybe move into them
*/
protected double cachedWeight;
@Override
public String toString() {
// avoid costly calculations on big objects
if (size() > 1000) return getClass().getSimpleName()+"[size="+size()+"]";
try {
HashMap map = new HashMap(asMap()); // NB: copy for thread safety
return getClass().getSimpleName()+"["+Printer.toString(Containers
.getValueSortedMap(map, false, 12))+"]";
} catch(Throwable ex) { // paranoia -- bug seen in ZF
Log.i("distro.toString", ex);
return getClass().getSimpleName()+"[size="+size()+"]";
}
}
@Override
public Map<T, Double> asMap() {
return new DistroMap(this);
}
/**
* TODO where should this go? On StreamClassifier??
*
* @param <Label>
* @param <Term>
* @param classifier
* @param nPerModel
* @return
*/
public static <Label, Term> Map<Label, List<Term>> getTopDistinguishingTerms(
Map<Label, AFiniteDistribution<Term>> classifier, int nPerModel) {
// Build list of candidate terms
HashSet<Term> terms = new HashSet<>();
for (AFiniteDistribution<Term> dm : classifier.values()) {
terms.addAll(dm.asList());
}
// what is each one like?
Map<Label,TopNList<Term>> topTerms = new ArrayMap();
for(Label lbl : classifier.keySet()) {
topTerms.put(lbl, new TopNList<Term>(nPerModel));
}
for (Term term : terms) {
ObjectDistribution<Label> od = new ObjectDistribution();
for (Label label : classifier.keySet()) {
AFiniteDistribution<Term> model = classifier.get(label);
od.setProb(label, model.prob(term));
}
if (od.getTotalWeight()==0) continue;
od.normalise();
Label winner = od.getMostLikely();
double p = od.prob(winner);
if (p <= 0.5) {
if (od.size() == 2) continue;
}
TopNList<Term> tops = topTerms.get(winner);
tops.maybeAdd(term, p);
}
return (Map) topTerms;
}
public AFiniteDistribution() {
}
/**
* Add another distribution to this one (editing this), so afterwards<br>
* P'_this(a) = P_this(a) + alpha*P_other(a) <br>
* Does not do any normalisation.
* <p>
* Can be thought of as "create a mixture model of this OR other"
*
* @param alpha
* The weight to apply.
* @param other
* This is not edited
* @see AndDist which is the opposite type of merge.
*/
public void addAll(double alpha, IFiniteDistribution<T> other) {
assert this != other : this;
assert alpha >= 0;
if (alpha == 0)
return;
Map<T, Double> xmap = other.asMap();
for (Entry<T, Double> entry : xmap.entrySet()) {
T key = entry.getKey();
addProb(key, alpha * entry.getValue());
}
}
/**
* Does p(x) = p(x) + dp.
*
* @param obj
* @param dp Must be >= 0
* Should we return the new prob?
* @throws UnsupportedOperationException
* if {@link #setProb(Object, double)} is not supported.
*/
public void addProb(T obj, double dp) throws UnsupportedOperationException {
double p = prob(obj) + dp;
setProb(obj, p);
}
// @Override
public void clear() {
for (Object x : toArray()) {
setProb((T) x, 0);
}
}
// @Override
public boolean contains(Object o) {
return prob((T) o) != 0;
}
// @Override
public boolean containsAll(Collection<?> c) {
for (Object object : c) {
if (!contains(object))
return false;
}
return true;
}
/**
* If this were a particle filter, how many particles do we have?
* @return from 1 to size()
*/
public double getEffectiveParticleCount() {
double sumSq = 0;
double totalWeight = getTotalWeight();
for (T e : this) {
double p = prob(e);
sumSq += p * p;
}
// normalise
sumSq = sumSq / (totalWeight * totalWeight);
// If the particles are evenly weighted with p=1/n, then sumSq = n * (1/n^2) = 1/n
// So the answer would be n.
// At the other end of the spectrum, if 1 particle has all the weight, then sumSq=1.
// So the answer would be 1.
return 1 / sumSq;
}
@Override
public T getMostLikely() {
BestOne<T> best = new BestOne<T>();
for (T t : this) {
best.maybeSet(t, prob(t));
}
return best.getBest();
}
@Override
public List<T> getMostLikely(int n) {
// compare by probability
TopNList<T> best = new TopNList<T>(n, new Comparator<T>() {
@Override
public int compare(T o1, T o2) {
if (o1 == o2)
return 0;
double v1 = prob(o1);
double v2 = prob(o2);
if (v1 == v2)
// arbitrary: order the keys
return Containers.compare(o1, o2);
return -Double.compare(v1, v2);
}
});
for (T t : this) {
// filter the zeroes
if (prob(t) == 0) {
continue;
}
best.maybeAdd(t);
}
// The comparator gives the TopNList a link to this distribution.
// This can cause issues for serialisation, or garbage collection.
// So we create a fresh list.
return new ArrayList<T>(best);
}
@Override
public double getTotalWeight() {
// this is a slow method - so you may wish to cache the value. See ObjectDistribution which does so.
double wt = 0;
for (T x : this) {
wt += prob(x); // NB: includes pseudo-counts
}
return wt;
}
// @Override
public boolean isEmpty() {
return size() == 0;
}
@Override
public double logProb(T x) {
// crappy implementation - override if you can do better
return Math.log(prob(x));
}
@Override
public double normProb(T x) {
if (isNormalised())
return prob(x);
// TODO calculate normalisation
throw new UnsupportedOperationException();
}
/**
* This MUST be called after any methods that alter the probs but do not
* also update the cachedWeight
*/
public void recalcTotalWeight() {
cachedWeight = -1;
normalised = false;
}
/**
* Equivalent to {@link #setProb(Object, double)} with p=0, but a bit less
* efficient (uses an extra hashmap lookup).
*/
// @Override
public boolean remove(Object o) {
double p = prob((T) o);
setProb((T) o, 0);
return p != 0;
}
// @Override
public boolean removeAll(Collection<?> c) {
for (Object x : c) {
setProb((T) x, 0);
}
// this breaks the Collections interface contract!
return true;
}
/**
* WARNING: beware of the class! e.g. String != Tkn ever
*/
// @Override
public boolean retainAll(Collection<?> c) {
ArrayList remove = new ArrayList();
for (T object : this) {
if ( ! c.contains(object)) {
remove.add(object);
}
}
removeAll(remove);
return ! remove.isEmpty();
}
/**
* Sample from this distribution. Works fine with un-normalised
* distributions.
*
* @return an x selected by random weight
* @testedby AFiniteDistributionTest#testSample()
*/
@Override
public T sample() {
double totalWeight = getTotalWeight();
if (totalWeight == 0)
// bummer - pick anything?
throw new IllegalStateException(
"Cannot sample from empty distribution");
double p = random().nextDouble() * totalWeight;
double sum = 0;
for (T e : this) {
sum += prob(e);
if (sum > p)
return e;
}
// What? must be a rounding issue (or we have zero weight). Return
// anything
assert MathUtils.approx(totalWeight, 0) || MathUtils.approx(p, totalWeight) : p + " vs " + totalWeight;
return Utils.getRandomMember(Containers.getList(this));
}
@Override
public void setProb(T obj, double value)
throws UnsupportedOperationException {
throw new UnsupportedOperationException();
}
// @Override
public Object[] toArray() {
// inefficient - but doesn't need to know about how data is stored
ArrayList<T> list = new ArrayList<T>();
for (T x : this) {
list.add(x);
}
return list.toArray();
}
// @Override
public <T2> T2[] toArray(T2[] a) {
// inefficient - but doesn't need to know about how data is stored
ArrayList<T> list = new ArrayList<T>();
for (T x : this) {
list.add(x);
}
return list.toArray(a);
}
public Collection<T> asList() {
return new CollectionWrapper(this);
}
} |
import Data.Char (isAlpha)
import Data.List (sortBy)
type Cup = (Int, String)
main :: IO ()
main = do
n <- readLn
cups <- foldr (\_ acc_ -> do
acc <- acc_
line <- getLine
let cup = if isAlpha $ line !! 0 then do
-- <color> <radius>
let (color:radius_:[]) = words line
(read radius_, color)
else do
-- <diameter> <color>
let (diameter_:color:[]) = words line
((read diameter_) `div` 2, color)
return (cup:acc)
) (return []) [1..n]
mapM_ (\(_, color) -> putStrLn color) (sortBy (\(r1, _) (r2, _) -> compare r1 r2) cups)
|
You know that mutual parting of ways that occurred between the San Francisco 49ers and head coach Jim Harbaugh? The one that we all suspected wasn't actually mutual in any way, especially after Jed York and Trent Baalke played the fools (does that term work plural?) at Harbaugh's post-divorce press conference? Big shocker: it wasn't exactly mutual.
Harbaugh appeared on a podcast hosted by Tim Kawakami of the San Jose Mercury News on Friday, and talked quite a bit about his exit from the team. I haven't had a chance to listen to the full thing just yet, but I've gathered some quotes (primarily from our own Tre9er via Twitter) and of course, there's the link above.
On the podcast, Harbaugh says that he didn't leave the 49ers, but that the 49ers hierarchy left him. He then went on to say that the "mutual parting of ways" was only mutual in that Harbaugh didn't want to put the team in a bad spot by trying to fight with them. It makes sense, given that it would also put Harbaugh in a bad spot to try and stick around with an owner and general manager that were obviously done with him.
This all flies right in the face of the lies told to the fans by York and Baalke. I don't know if it would have necessarily been better if they came out and said "Look, we like power and Harbaugh didn't listen to us. So screw that guy," but at least that would have been honest, right?
From what I've heard, it's a fascinating interview, and Kawakami was clearly very excited to land it (and I know many people here have their own feelings on Kawakami but serious kudos goes to the guy over this). Harbaugh also mentions that things were awkward with Jim Tomsula near the end, which to me calls into question the legitimacy of the team's coaching search following Harbaugh's ousting. Did Tomsula already know the job was his?
I'm sure there's more, but the only other things I've noticed thus far: Harbaugh confirmed that he was told he was out after the Week 15 loss to the Seattle Seahawks, and Harbaugh said that he didn't consider any other NFL team -- Michigan was always his destination.
Welp. |
from typing import Callable, List, Optional
class EnzymaticReactionTokenizer:
"""Constructs a EnzymaticReactionTokenizer using AA sequence."""
def __init__(
self,
aa_sequence_tokenizer_filepath: Optional[str] = None,
smiles_aa_sequence_separator: str = "|",
reaction_separator: str = ">>",
) -> None:
"""Constructs an EnzymaticReactionTokenizer.
Args:
aa_sequence_tokenizer_filepath: file to a serialized AA sequence tokenizer.
smiles_aa_sequence_separator: separator between reactants and AA sequence. Defaults to "|".
reaction_separator: reaction sides separator. Defaults to ">>".
"""
# define tokenization utilities
self.smiles_tokenizer = RegexTokenizer(
regex_pattern=SMILES_TOKENIZER_PATTERN, suffix="_"
)
self.aa_sequence_tokenizer_filepath = aa_sequence_tokenizer_filepath
self.aa_sequence_tokenizer = self._get_aa_tokenizer_fn()
self.smiles_aa_sequence_separator = smiles_aa_sequence_separator
self.reaction_separator = reaction_separator
def tokenize(self, text: str) -> List[str]:
"""Tokenize a text representing an enzymatic reaction with AA sequence information.
Args:
text: text to tokenize.
Returns:
extracted tokens.
"""
product = ""
aa_sequence = ""
try:
reactants_and_aa_sequence, product = text.split(self.reaction_separator)
except ValueError:
reactants_and_aa_sequence = text
try:
reactants, aa_sequence = reactants_and_aa_sequence.split(
self.smiles_aa_sequence_separator
)
except ValueError:
reactants = reactants_and_aa_sequence
tokens = []
tokens.extend(self.smiles_tokenizer.tokenize(reactants))
if aa_sequence:
tokens.append(self.smiles_aa_sequence_separator)
tokens.extend(self.aa_sequence_tokenizer(aa_sequence))
if product:
tokens.append(self.reaction_separator)
tokens.extend(self.smiles_tokenizer.tokenize(product))
return tokens
def _get_aa_tokenizer_fn(self) -> Callable:
"""Definition of the tokenizer for the aa sequence
Returns:
a callable function
"""
if self.aa_sequence_tokenizer_filepath is not None:
fn = AASequenceTokenizer(
tokenizer_filepath=self.aa_sequence_tokenizer_filepath
).tokenize
return fn
else:
return list |
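The split precedence in `tokenize` can be illustrated standalone; here character-level splitting is a toy stand-in for the real SMILES/AA tokenizers (`RegexTokenizer` and `AASequenceTokenizer` are not shown in this file), and the function name is hypothetical:

```python
from typing import List

def tokenize_sketch(text: str, sep: str = "|", rxn: str = ">>") -> List[str]:
    # Same precedence as EnzymaticReactionTokenizer.tokenize: first split off
    # the product, then split reactants from the AA sequence; missing parts
    # are simply omitted from the token stream.
    product = aa = ""
    left = text
    if rxn in text:
        left, product = text.split(rxn)
    reactants = left
    if sep in left:
        reactants, aa = left.split(sep)
    tokens = list(reactants)            # toy stand-in for SMILES tokens
    if aa:
        tokens += [sep] + list(aa)      # toy stand-in for AA tokens
    if product:
        tokens += [rxn] + list(product)
    return tokens

print(tokenize_sketch("CC|MKT>>CO"))
# ['C', 'C', '|', 'M', 'K', 'T', '>>', 'C', 'O']
```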
/*
* Public API Surface of edit-in-place
*/
export * from './lib/editable.module';
export * from './lib/editable.config';
export * from './lib/directives/editable-group-edit.directive';
export * from './lib/directives/editable-group-save.directive';
export * from './lib/directives/editable-group-cancel.directive';
export * from './lib/directives/editable-group.directive';
export * from './lib/directives/editable-on-escape.directive';
export * from './lib/directives/editable-on-enter.directive';
export * from './lib/directives/editable-focus.directive';
export * from './lib/directives/editable-cancel.directive';
export * from './lib/directives/editable-save.directive';
export * from './lib/editable.component';
export * from './lib/directives/edit-mode.directive';
export * from './lib/directives/view-mode.directive';
|
Methods of Pedagogical Psychology in Education
In today's world, where knowledge acquired during study can become outdated before the course is over, methods that give students the opportunity to develop as individuals and improve their competencies take on particular importance. This article discusses an approach that uses the methods of educational psychology to create a learning environment that encourages students' personal development through the improvement of their own skills and abilities. To support practical implementation, it highlights potential problems that may arise and indicates ways to increase the methodology's effectiveness. The methodology is grounded in educational psychology and should be adapted to the specifics of each educational process. The article also describes the basic principles that should guide teachers working with this technique.
/**
* Gathers all composite types for a given root element in the schema.
* <p>
* Composite types are complex types, choice types, arrays.
*
*
*/
public static class RootCompositeType {
public final Map < String, Object > complexTypes;
public final Map < String, Object > choiceTypes;
public final String cobolName;
public RootCompositeType(String cobolName) {
this.cobolName = cobolName;
complexTypes = new LinkedHashMap < String, Object >();
choiceTypes = new LinkedHashMap < String, Object >();
}
@Override
public String toString() {
StringBuilder sb = new StringBuilder();
sb.append("[cobolName=");
sb.append(cobolName);
sb.append(", ");
sb.append("complexTypes=");
sb.append(complexTypes);
if (choiceTypes.size() > 0) {
sb.append(", choiceTypes=");
sb.append(choiceTypes);
}
sb.append("]");
return sb.toString();
}
} |
# Demonstrate failure of MLE for GMMs in high-D case, whereas MAP works
# Based on: https://github.com/probml/pmtk3/blob/master/demos/mixGaussMLvsMAP.m
# Author: <NAME>
import superimport
import numpy as np
import matplotlib.pyplot as plt
from numpy.random import randn, seed
from numpy.linalg import cholesky, LinAlgError
import pyprobml_utils as pml
import gmm_lib
def fill_cov(S, dim):
m, m = S.shape
S_eye = np.identity(dim - m)
S_fill = np.zeros((m, dim - m))
S_fill_left = np.r_[S_fill, S_eye]
S_final = np.r_[S, S_fill.T]
S_final = np.c_[S_final, S_fill_left]
return S_final
def attempt_em_fit(X, k, pi, Sigma, n_attempts=5):
N, M = X.shape
eta = M + 2
n_success_ml = 0
n_success_map = 0
S = X.std(axis=0)
S = np.diag(S ** 2) / (k ** (1 / M))
for n in range(n_attempts):
mu = randn(k, M)
try:
gmm_lib.apply_em(X, pi, mu, Sigma)
n_success_ml += 1
except LinAlgError:
pass
try:
gmm_lib.apply_em(X, pi, mu, Sigma, S=S, eta=eta)
n_success_map += 1
except LinAlgError:
pass
pct_ml = n_success_ml / n_attempts
pct_map = n_success_map / n_attempts
return pct_ml, pct_map
if __name__ == "__main__":
seed(314)
plt.rcParams["axes.spines.right"] = False
plt.rcParams["axes.spines.top"] = False
pi = np.ones(3) / 3
hist_ml, hist_map = [], []
test_dims = np.arange(10, 110, 10)
n_samples = 150
for dim in test_dims:
mu_base = np.array([[-1, 1], [1, -1], [3, -1]])
Sigma1_base = np.array([[1, -0.7], [-0.7, 1]])
Sigma2_base = np.array([[1, 0.7], [0.7, 1]])
Sigma3_base = np.array([[1, 0.9], [0.9, 1]])
mu = np.c_[mu_base, np.zeros((3, dim - 2))]
Sigma1 = fill_cov(Sigma1_base, dim)
Sigma2 = fill_cov(Sigma2_base, dim)
Sigma3 = fill_cov(Sigma3_base, dim)
Sigma = np.stack((Sigma1, Sigma2, Sigma3), axis=0)
R = cholesky(Sigma)
samples = np.ones((n_samples, 1, 1)) * mu[None, ...]
noise = randn(n_samples, dim)
noise = np.einsum("kjm,nj->nkm", R, noise)
samples = samples + noise
samples = samples.reshape(-1, dim)
pct_ml, pct_map = attempt_em_fit(samples, 3, pi, Sigma)
hist_ml.append(1 - pct_ml)
hist_map.append(1 - pct_map)
fig, ax = plt.subplots()
ax.plot(test_dims, hist_ml, c="tab:red", marker="o", label="MLE")
ax.plot(test_dims, hist_map, c="black", marker="o", linestyle="--", label="MAP")
ax.set_xlabel("dimensionality")
ax.set_ylabel("fraction of times EM for GMM fails")
ax.legend()
pml.savefig("gmm_mle_vs_map.pdf")
plt.show()
|
/*********************************************************************
** Copyright (C) 2003 Terabit Pty Ltd. All rights reserved.
**
** This file is part of the POSIX-Proactor module.
**
**
**
**
**
**
** @author <NAME> <<EMAIL>>
**
**********************************************************************/
#ifndef TERABIT_SINGLE_LIST_T_H
#define TERABIT_SINGLE_LIST_T_H
#include /**/ "ace/pre.h"
#if !defined (ACE_LACKS_PRAGMA_ONCE)
# pragma once
#endif /* ACE_LACKS_PRAGMA_ONCE */
#include <functional>
#include <algorithm>
#include "ace/Log_Msg.h"
ACE_BEGIN_VERSIONED_NAMESPACE_DECL
template <typename X> class LinkS_T;
template <typename X> class LinkD_T;
template <typename X, typename F> class Single_List_T;
template <typename X, typename F> class Single_Queue_T;
template <typename X, typename F> class Double_List_T;
//============================================================
//
//
//============================================================
template <typename X>
class LinkS_T
{
private:
template <typename > friend class LinkD_T;
template <typename , typename > friend class Single_List_T;
template <typename , typename > friend class Single_Queue_T;
template <typename , typename > friend class Double_List_T;
public:
static X * end ();
~LinkS_T () {}
LinkS_T () : ptr_ (0) {}
X * get() const { return ptr_;}
X * operator-> () const { return ptr_;}
bool is_free () const { return ptr_ == 0; }
bool is_end () const { return ptr_ == end (); }
private:
void set (X * x)
{
//ACE_ASSERT (x != 0);
ptr_ = x;
}
void set_free () { ptr_ = 0; }
void set_end () { ptr_ = end ();}
LinkS_T (X * x) : ptr_ (x) {}
X * ptr_;
};
template <typename X>
inline X *
LinkS_T<X>::end ()
{
return (X*) (-1L);
}
//------------------------------------------------------------
// class that converts X reference to LinkS_T<X> reference
//------------------------------------------------------------
template <class X>
class LinkS_Functor_T : public std::unary_function < X, LinkS_T <X> >
{
public :
LinkS_T <X> * operator ()( const X * x ) const
{
return const_cast < X* > (x);
}
};
//============================================================
//
//
//============================================================
template < class X, class F = LinkS_Functor_T <X> >
class Single_List_T
{
public:
typedef Single_List_T<X,F> List;
typedef LinkS_T<X> Link;
class iterator
{
friend class Single_List_T<X,F>;
iterator(X * x) : x_ (x) {;}
iterator(const Link & link) : x_ (link.get()) {;}
public:
~iterator () {}
iterator () : x_ (0) {;}
iterator(const iterator & other )
: x_ (other.x_)
{;}
iterator & operator = (const iterator & other )
{
x_ = other.x_;
return *this;
}
bool operator == (const iterator & other )const
{
return (x_ == other.x_ );
}
bool operator != (const iterator & other )const
{
return (x_ != other.x_ );
}
iterator & operator ++ ();
iterator operator ++ (int);
operator X * () const { return x_; }
X * operator * () const { return x_; }
private:
X * x_;
};
Single_List_T () : head_ (Link::end())
{}
bool empty () const { return head_.is_end(); }
size_t size() const;
void push_front (X * x);
void push_back (X * x);
X * pop_front ();
X * pop_back ();
X * front ();
X * back ();
X * find (X *x);
X * remove (X *x);
iterator begin() const { return iterator (head_);}
iterator end () const { return iterator (Link::end());}
void swap (Single_List_T<X, F> & other);
template < class Other_List_T >
void splice (Other_List_T & other);
private:
friend class iterator;
/// Protect from copy and assignment
Single_List_T (const Single_List_T<X, F> & other);
Single_List_T & operator= (const Single_List_T <X, F> & other);
// functor that converts object pointer to link pointer
// does it make sense to have non-static converter ??
static Link * get_link (const X * x);
static X * get_next (const X * x);
Link head_;
};
//-----------------------------------------------------
// Single_List_T::iterator
//-----------------------------------------------------
template <class X, class F>
inline typename Single_List_T<X,F>::iterator &
Single_List_T<X,F>::iterator::operator ++()
{
// behavior unpredictable if iterator not valid
this->x_ = Single_List_T<X,F>::get_next (this->x_);
return *this;
}
template <class X, class F>
inline typename Single_List_T<X,F>::iterator
Single_List_T<X,F>::iterator::operator ++(int)
{
// behavior unpredictable if iterator not valid
iterator itr (*this);
++(*this);
return itr;
}
//-----------------------------------------------------
// Single_List_T
//-----------------------------------------------------
template <class X, class F>
inline typename Single_List_T<X,F>::Link *
Single_List_T<X,F>::get_link (const X * x)
{
ACE_ASSERT (x != 0 && x != Link::end() );
static F funcObj2Link;
return funcObj2Link (x);
}
template <class X, class F>
inline X *
Single_List_T<X,F>::get_next (const X * x)
{
return get_link (x)->get ();
}
template <class X, class F>
inline void
Single_List_T<X,F>::push_front (X * x)
{
//ACE_ASSERT (x != 0 && x != Link::end() && head_.get() != 0);
Link * link = get_link (x);
ACE_ASSERT (link->is_free());
link->set (head_.get());
head_.set (x);
}
template <class X, class F>
inline void
Single_List_T<X,F>::push_back (X * x)
{
//ACE_ASSERT (x != 0 && x != Link::end() && head_.get() != 0);
Link * link = get_link (x);
ACE_ASSERT (link->is_free());
Link * linkLast = &head_;
while (!linkLast->is_end())
{
linkLast = get_link (linkLast->get());
}
linkLast->set (x);
link->set_end ();
}
template <class X, class F>
inline X *
Single_List_T<X,F>::front ()
{
X * x = head_.get();
if (x != Link::end())
{
return x;
}
return 0;
}
template <class X, class F>
inline X *
Single_List_T<X,F>::back ()
{
X * x = 0;
iterator it1 = begin();
iterator it2 = end();
for (; it1 !=it2 ; ++it1)
{
x = *it1;
}
return x;
}
template <class X, class F>
inline size_t
Single_List_T<X,F>::size () const
{
size_t count =0;
iterator it1 = begin();
iterator it2 = end();
for (; it1 !=it2 ; ++count, ++it1)
{
}
return count;
}
template <class X, class F>
inline X *
Single_List_T<X,F>::pop_front ()
{
X * x = head_.get();
if (x == Link::end ())
{
return 0;
}
//ACE_ASSERT (x != 0 );
//ACE_ASSERT (this->get_next (x) != 0);
head_.set (this->get_next(x));
this->get_link(x)->set_free();
ACE_ASSERT (head_.get() != 0);
return x;
}
template <class X, class F>
inline X *
Single_List_T<X,F>::pop_back ()
{
X * x = 0;
Link * prevLink = &head_;
if (prevLink->is_end ())
{
return x;
}
for (;;)
{
x = prevLink->get ();
Link * nextLink = this->get_link (x);
if (nextLink->is_end ())
{
break;
}
prevLink = nextLink;
}
prevLink->set_end ();
get_link(x)->set_free();
//ACE_ASSERT (x != 0 && x != Link::end() && head_.get() != 0);
return x;
}
template <class X, class F>
inline void
Single_List_T<X,F>::swap (Single_List_T<X,F> & other)
{
if (&other == this)
return;
std::swap (head_, other.head_);
}
template <class X, class F>
inline X *
Single_List_T<X,F>::find (X *x)
{
if (x == 0)
return 0;
iterator it1 = begin();
iterator it2 = end();
for (; it1 != it2 ; ++it1)
{
if (x == *it1)
return x;
}
return 0;
}
template <class X, class F>
inline X *
Single_List_T<X,F>::remove (X *x)
{
if (x == 0)
return 0;
Link * link_x = get_link (x);
Link * link_cur = &this->head_;
for (;;)
{
X *next = link_cur->get ();
if (next == Link::end())
break; // not found
if (next == x)
{
link_cur->set (link_x->get ());
link_x->set_free ();
return x;
}
link_cur = get_link (next);
}
return 0;
}
template < class X, class F >
template < class Other_List_T >
inline void
Single_List_T<X,F>::splice (Other_List_T & other)
{
X * x = other.front ();
if (x == 0)
return;
Link * linkLast = &head_;
if (linkLast->get() == x)
{
return; // the same
}
while(!linkLast->is_end())
{
linkLast = get_link (linkLast->get());
}
linkLast->set (x);
Other_List_T tmp;
tmp.swap (other);
//other.head_.set_end();
}
//============================================================
//
//
//============================================================
template <class X, class F = LinkS_Functor_T <X> >
class Single_Queue_T
{
public:
typedef Single_Queue_T<X,F> Queue;
typedef LinkS_T<X> Link;
class iterator
{
friend class Single_Queue_T<X,F>;
iterator(X * x) : x_ (x) {;}
iterator(const Link & link) : x_ (link.get()) {;}
public:
~iterator () {}
iterator () : x_ (0) {;}
iterator(const iterator & other )
: x_ (other.x_)
{;}
iterator & operator = (const iterator & other )
{
x_ = other.x_;
return *this;
}
bool operator == (const iterator & other )const
{
return (x_ == other.x_ );
}
bool operator != (const iterator & other )const
{
return (x_ != other.x_ );
}
iterator & operator ++ ();
iterator operator ++ (int);
operator X * () const { return x_; }
X * operator * () const { return x_; }
private:
X * x_;
};
Single_Queue_T ()
: head_ (Link::end())
, tail_ (Link::end())
, size_ (0)
{}
bool empty () const { return (size_ == 0); }
size_t size() const { return size_;}
void push_front (X * x);
void push_back (X * x);
X * pop_front ();
X * pop_back ();
X * front ();
X * back ();
X * find (X *x);
X * remove (X *x);
void swap (Single_Queue_T<X, F> & other);
template < class Other_List_T >
void splice (Other_List_T & other);
iterator begin() const { return iterator (head_);}
iterator end () const { return iterator (Link::end());}
private:
friend class iterator;
/// Protect from copy and assignment
Single_Queue_T (const Single_Queue_T<X, F> & other);
Single_Queue_T & operator= (const Single_Queue_T <X, F> & other);
// functor that converts object pointer to link pointer
// does it make sense to have non-static converter ??
static Link * get_link (const X * x);
static X * get_next (const X * x);
Link head_;
Link tail_;
size_t size_;
};
//-----------------------------------------------------
// Single_Queue_T::iterator
//-----------------------------------------------------
template <class X, class F>
inline typename Single_Queue_T<X,F>::iterator &
Single_Queue_T<X,F>::iterator::operator ++()
{
// behavior unpredictable if iterator not valid
x_ = Single_Queue_T<X,F>::get_next (x_);
return *this;
}
template <class X, class F>
inline typename Single_Queue_T<X,F>::iterator
Single_Queue_T<X,F>::iterator::operator ++(int)
{
// behavior unpredictable if iterator not valid
iterator itr (*this);
++(*this);
return itr;
}
//-----------------------------------------------------
// Single_Queue_T
//-----------------------------------------------------
template <class X, class F>
inline typename Single_Queue_T<X,F>::Link *
Single_Queue_T<X,F>::get_link (const X * x)
{
ACE_ASSERT (x != 0 && x != Link::end() );
static F funcObj2Link;
return funcObj2Link (x);
}
template <class X, class F>
inline X *
Single_Queue_T<X,F>::get_next (const X * x)
{
return get_link (x)->get ();
}
template <class X, class F>
inline void
Single_Queue_T<X,F>::push_front (X * x)
{
//ACE_ASSERT (x != 0 && x != Link::end() && head_.get() != 0);
Link * link = get_link (x);
ACE_ASSERT (link->is_free());
link->set (head_.get());
head_.set (x);
if (size_ == 0)
{
tail_ = head_;
}
++size_;
}
template <class X, class F>
inline void
Single_Queue_T<X,F>::push_back (X * x)
{
//ACE_ASSERT (x != 0 && x != Link::end() && head_.get() != 0);
Link * link = get_link (x);
ACE_ASSERT (link->is_free());
link->set (Link::end());
if (size_ == 0)
{
head_.set (x);
}
else
{
get_link(tail_.get())->set (x);
}
tail_.set (x);
++size_;
}
template <class X, class F>
inline X *
Single_Queue_T<X,F>::front ()
{
if (size_ == 0)
{
return 0;
}
return head_.get();
}
template <class X, class F>
inline X *
Single_Queue_T<X,F>::back ()
{
if (size_ == 0)
{
return 0;
}
return tail_.get();
}
template <class X, class F>
inline X *
Single_Queue_T<X,F>::pop_front ()
{
X * x = head_.get();
if (x == Link::end ())
{
return 0;
}
head_.set (this->get_next(x));
if (--size_ == 0)
{
tail_ = head_;
}
get_link(x)->set_free();
return x;
}
template <class X, class F>
inline X *
Single_Queue_T<X,F>::pop_back ()
{
if (size_ <= 1)
{
return pop_front ();
}
iterator it1 (head_);
iterator it2 (tail_);
X * xprev = *it1;
X * xlast = *it2;
for (; it1 != it2 ; ++it1)
{
xprev = *it1;
}
--size_;
tail_.set (xprev);
get_link(xprev)->set_end();
get_link(xlast)->set (0);
return xlast;
}
template <class X, class F>
inline void
Single_Queue_T<X,F>::swap (Single_Queue_T<X,F> & other)
{
if (&other == this)
return;
std::swap (head_, other.head_);
std::swap (tail_, other.tail_);
std::swap (size_, other.size_);
}
template <class X, class F>
inline X *
Single_Queue_T<X,F>::find (X *x)
{
if (x == 0)
return 0;
iterator it1 = begin();
iterator it2 = end();
for (; it1 != it2 ; ++it1)
{
if (x == *it1)
return x;
}
return 0;
}
template <class X, class F>
inline X *
Single_Queue_T<X,F>::remove (X *x)
{
if (x == 0)
return 0;
if (head_.get () == x)
return pop_front ();
if (tail_.get () == x)
return pop_back ();
Link * link_x = get_link (x);
Link * link_cur = &this->head_;
for (;;)
{
X *next = link_cur->get ();
if (next == Link::end())
break; // not found
if (next == x)
{
link_cur->set (link_x->get ());
link_x->set_free ();
ACE_ASSERT (this->tail_.get () != x);
--this->size_;
return x;
}
link_cur = get_link (next);
}
return 0;
}
template <class X, class F >
template < class Other_List_T >
inline void
Single_Queue_T<X,F>::splice (Other_List_T & other)
{
if (other.empty())
return;
X * x0 = this->front ();
X * x1 = this->back ();
X * x2 = other.front();
if (x0 == x2) // the same
return;
if (x1 == 0)
{
head_.set (x2);
}
else
{
get_link(x1)->set (x2);
}
this->tail_.set (other.back()); // it is valid back()!!!
this->size_ += other.size ();
Other_List_T tmp;
tmp.swap (other);
// other.head_.set_end();
// other.tail_.set_end();
// other.size_ = 0;
}
ACE_END_VERSIONED_NAMESPACE_DECL
#include /**/ "ace/post.h"
#endif /* TERABIT_SINGLE_LIST_T_H */
import datetime
import logging

def in_words():
week_days = ['ПОНЕДЕЛЬНИК', 'ВТОРНИК', 'СРЕДА', 'ЧЕТВЕРГ', 'ПЯТНИЦА', 'СУББОТА', 'ВОСКРЕСЕНЬЕ']
current_day = datetime.datetime.today().weekday()
week_days[current_day] = 'СЕГОДНЯ'
if current_day > 0:
week_days[current_day - 1] = 'ВЧЕРА'
if current_day < 6:
week_days[current_day + 1] = 'ЗАВТРА'
logging.info(week_days)
    return week_days
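To exercise the relabelling logic deterministically, the helper can be sketched with an explicit weekday index (the `today` parameter is an addition for testability and is not in the original):

```python
import datetime

def in_words(today=None):
    """Return weekday names with the current day and its neighbours relabelled."""
    # Russian weekday names, Monday first (matching datetime.weekday()).
    week_days = ['ПОНЕДЕЛЬНИК', 'ВТОРНИК', 'СРЕДА', 'ЧЕТВЕРГ',
                 'ПЯТНИЦА', 'СУББОТА', 'ВОСКРЕСЕНЬЕ']
    current_day = datetime.datetime.today().weekday() if today is None else today
    week_days[current_day] = 'СЕГОДНЯ'        # "TODAY"
    if current_day > 0:
        week_days[current_day - 1] = 'ВЧЕРА'  # "YESTERDAY"
    if current_day < 6:
        week_days[current_day + 1] = 'ЗАВТРА' # "TOMORROW"
    return week_days
```

Note that the list is not treated as circular: Monday gets no "yesterday" marker and Sunday no "tomorrow" marker.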
// bcgov/time-machine
export * from './client.entity';
export * from './contact.entity';
// export * from './document.entity';
export * from './ministry.entity';
export * from './project.entity';
export * from './projectContacts.entity';
// export * from './projectDocuments.entity';
export * from './projectIntake.entity';
export * from './projectIntakeContacts.entity';
export * from './projectRfx.entity';
export * from './projectSector.entity';
export * from './project_Intake_Category.entity';
export * from './project_Intake_Services.entity';
export * from './rfxPhase.entity';
export * from './rfxType.entity';
export * from './timesheet.entity';
export * from './timesheetEntry.entity';
export * from './user.entity';
export * from './riskQuestions.entity';
export * from './riskAnswers.entity';
export * from './projectRiskAnalysis.entity';
export * from './financeExport.entity';
A support system for brain tumor therapeutic planning
A multidisciplinary approach to the treatment planning of brain tumors is an increasingly common practice worldwide; it is carried out through multidisciplinary team meetings (MDTMs) at which cases are discussed. Studies have identified two main barriers to maximizing the efficiency of the MDTM: lack of information and inadequate presentation of the available data to team members. These difficulties motivated the design and development of an information system called SATTC (the Portuguese acronym for Support System for Brain Tumor's Therapeutic Planning), whose premise is to automatically identify the information essential for treatment planning and present it to team members in an adequate way. Three computational technologies were used in the development of the system: ontologies, data warehousing, and data mining. The main expected benefit of the proposed system is to prevent relevant and crucial information from being hidden from team members at the MDTM.
def _on_removetext(self, event):
num_text = len(self.textList)
if num_text < 1:
if self.parent is not None:
msg = "Remove Text: Nothing to remove. "
wx.PostEvent(self.parent, StatusEvent(status=msg))
else:
                raise RuntimeError("Remove Text: Nothing to remove.")
return
txt = self.textList[num_text - 1]
try:
text_remove = txt.get_text()
txt.remove()
if self.parent is not None:
msg = "Removed Text: '%s'. " % text_remove
wx.PostEvent(self.parent, StatusEvent(status=msg))
except:
if self.parent is not None:
msg = "Remove Text: Error occurred. "
wx.PostEvent(self.parent, StatusEvent(status=msg))
else:
raise
self.textList.remove(txt)
        self.subplot.figure.canvas.draw_idle()
/**
* A driver class in which launchers are created for all known products.
*/
public final class OpenSphereLauncherCreator
{
/**
* The {@link Logger} instance used to capture output.
*/
private static final Logger LOG = Logger.getLogger(OpenSphereLauncherCreator.class);
/**
* The writer used to create the eclipse launcher file.
*/
private final ProjectLauncherWriter myWriter;
/**
* The project reader used to parse the project file(s).
*/
private final AbstractCompositeProjectReader myProjectReader;
/**
* Creates a new launcher creator.
*
* @param projectReader the reader used to parse the project file.
*/
public OpenSphereLauncherCreator(AbstractCompositeProjectReader projectReader)
{
myProjectReader = projectReader;
myWriter = new ProjectLauncherWriter();
}
/**
* Executes the launcher creator. Use a single command line argument to
* specify the profiles to use (comma separated if more than one).
*
* @param args the set of arguments supplied by the user.
*/
public static void main(String[] args)
{
if (args.length < 1)
{
LOG.error("arguments: <opensphere directory name> [<profile>]");
return;
}
String projName = args[0];
String root = System.getProperty("user.dir");
root = root.substring(0, root.indexOf(projName));
String profile = "unclass";
if (args.length == 2)
{
profile = args[1];
}
CompositeProjectModel compositeProjectModel = new CompositeProjectModel(Paths.get(root));
OpenSphereLauncherCreator creator = new OpenSphereLauncherCreator(
new OpenSphereProjectReader(compositeProjectModel, new HashSet<>(Arrays.asList(profile.split(",")))));
creator.processProjects(projName);
}
/**
     * Reads projects using the project reader, and writes a launcher file for each supported OS.
*
* @param projectName the name of the project folder
*/
public void processProjects(String projectName)
{
Project project = myProjectReader.readProject(projectName);
CompositeProjectModel compositeProjectModel = myProjectReader.getCompositeProjectModel();
List<String> modules = compositeProjectModel.getProjects(project);
Set<Dependency> dependencies = compositeProjectModel.getExternalDependencies(project);
Project parent = project.getParent();
while (parent != null)
{
modules.addAll(compositeProjectModel.getProjects(parent));
dependencies.addAll(compositeProjectModel.getExternalDependencies(parent));
parent = parent.getParent();
}
LOG.info(project.getTitle() + " Modules: " + modules.size());
LOG.info(project.getTitle() + " Dependencies: " + dependencies.size());
for (OsInfo osInfo : OsInfo.values())
{
myWriter.write(modules, dependencies, project, osInfo);
}
}
}
// ayseirmak/Front_React
export * from './ContentForm'
// src/Routes/Predict/index.ts
import PredictContainer from "./PredictContainer";
export default PredictContainer
# modules/mod_explosm.py
from modules.module_base import ModuleBase
from tools.imageSender import ImageSender
class ModuleExplosm(ModuleBase):
URL = "https://explosm.net/comics/"
XPATH = '//img[@id="main-comic"]/@src'
def __init__(self, bot):
ModuleBase.__init__(self, bot)
self.name = "ModuleExplosm"
def notify_command(self, message_id, from_attr, date, chat, commandName, commandStr):
if commandName == "explosm":
if commandStr == "" or commandStr == "random":
ImageSender.send_image(self.bot, chat["id"], self.URL + "random", self.XPATH, "Cyanide and Happiness")
elif commandStr == "last":
ImageSender.send_image(self.bot, chat["id"], self.URL + "latest", self.XPATH, "Cyanide and Happiness")
else:
self.bot.sendMessage("Command /%s %s unknown !" % (commandName, commandStr), chat["id"])
def get_commands(self):
return [
("explosm", "Cyanide and Happiness. Keywords : <last/random>"),
]
from typing import Optional

# Note: StcObject, TgnObjectsDict, view_2_config_type and is_false are
# project-level imports omitted from this excerpt.
class StcStats:
"""Represents statistics view.
The statistics dictionary represents a table:
Statistics Name | Object 1 Value | Object 2 Value | ...
object | | |
parents | | |
topLevelName | | |
Stat 1 | | |
...
For example, generatorportresults statistics for two ports might look like the following:
Statistics Name | Object 1 Value | Object 2 Value
object | analyzerportresults1 | analyzerportresults2
parents | project1/port1/analyzer1 | project1/port2/analyzer2
topLevelName | Port 1 | Port 2
GeneratorFrameCount | 1000 | 2000
...
"""
def __init__(self, view: str) -> None:
"""Subscribe to view with default configuration type as defined by config_2_type.
:param view: statistics view to subscribe to. If view is None it is the test responsibility to subscribe with
specific config_type.
"""
self.rds = None
self.statistics = TgnObjectsDict()
if view:
self.subscribe(view)
def subscribe(self, view: str, config_type: Optional[str] = None) -> None:
"""Subscribe to statistics view.
:param view: statistics view to subscribe to.
:param config_type: configuration type to subscribe to.
"""
if view.lower() in view_2_config_type:
if not config_type:
config_type = view_2_config_type[view.lower()]
rds = StcObject.project.api.subscribe(
Parent=StcObject.project.ref, ResultParent=StcObject.project.ref, ConfigType=config_type, ResultType=view
)
self.rds = StcObject(objType="ResultDataSet", parent=StcObject.project, objRef=rds)
else:
StcObject.project.get_children("DynamicResultView")
drv = StcObject.project.get_object_by_name(view)
rc = StcObject.project.command("SubscribeDynamicResultView", DynamicResultView=drv.ref)
self.rds = StcObject(objType="DynamicResultView", parent=StcObject.project, objRef=rc["DynamicResultView"])
def unsubscribe(self) -> None:
""" UnSubscribe from statistics view. """
StcObject.project.api.unsubscribe(self.rds.ref)
def read_stats(self, obj_id_stat: Optional[str] = "topLevelName") -> TgnObjectsDict:
"""Reads the statistics view from STC and saves it in statistics dictionary.
:param obj_id_stat: which statistics name to use as object ID, sometimes topLevelName is
not meaningful and it is better to use other unique identifier like stream ID.
"""
self.statistics = TgnObjectsDict()
if self.rds.type == "dynamicresultview":
self._read_custom_view()
else:
self._read_view(obj_id_stat)
return self.statistics
def get_column_stats(self, name: str) -> TgnObjectsDict:
"""Returns all statistics values for the requested statistics.
N/A for custom views.
:param name: requested statistic name.
"""
column_statistics = TgnObjectsDict()
for obj, obj_values in self.statistics.items():
column_statistics[obj] = obj_values[name]
return column_statistics
#
# Private methods.
#
def _read_custom_view(self):
StcObject.project.command("RefreshResultView", ResultDataSet=self.rds.ref)
StcObject.project.command("UpdateDynamicResultViewCommand", DynamicResultView=self.rds.ref)
presentationResultQuery = self.rds.get_child("PresentationResultQuery")
selectedProperties = presentationResultQuery.get_list_attribute("SelectProperties")
self.objs_stats = []
for rvd in presentationResultQuery.get_children("ResultViewData"):
self.objs_stats.append(rvd.get_list_attribute("ResultData")[: len(selectedProperties)])
self._get_result_data(rvd, len(selectedProperties))
self.statistics = dict(zip(selectedProperties, zip(*self.objs_stats)))
def _get_result_data(self, rvd, num_columns):
StcObject.project.command("ExpandResultViewDataCommand", ResultViewData=rvd.ref)
for child_rvd in rvd.get_children("ResultViewData"):
if is_false(child_rvd.get_attribute("IsDummy")):
self.objs_stats.append(child_rvd.get_list_attribute("ResultData")[:num_columns])
self._get_result_data(child_rvd, num_columns)
def _read_view(self, obj_id_stat: Optional[str] = "topLevelName") -> None:
StcObject.project.command("RefreshResultView", ResultDataSet=self.rds.ref)
for page_number in range(1, int(self.rds.get_attribute("TotalPageCount")) + 1):
self.rds.set_attributes(PageNumber=page_number)
for results in self.rds.get_objects_from_attribute("ResultHandleList"):
parent = results.get_object_from_attribute("parent")
parents = parent.ref
name = ""
while parent != StcObject.project:
if not name and parent.obj_type().lower() in ("port", "emulateddevice", "streamblock"):
name = parent.get_name()
parent = parent.get_object_from_attribute("parent")
parents = parent.ref + "/" + parents
obj_stats = {"object": results.ref, "parents": parents, "topLevelName": name}
obj_stats.update(results.get_attributes())
obj_stats.pop("parent", None)
obj_stats.pop("Name", None)
obj_stats.pop("resultchild-Sources", None)
for stat in obj_stats:
try:
obj_stats[stat] = int(obj_stats[stat])
except ValueError:
pass
                self.statistics[StcObject.project.get_object_by_name(obj_stats[obj_id_stat])] = obj_stats
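The column extraction performed by `get_column_stats` above can be illustrated with plain dicts standing in for `TgnObjectsDict` (the object keys and values here are hypothetical stand-ins for STC objects):

```python
def get_column_stats(statistics, name):
    """Extract one statistic's value for every object in a stats table."""
    # statistics maps object -> {stat_name: value}; the result maps
    # object -> value for the single requested stat_name.
    return {obj: values[name] for obj, values in statistics.items()}

# Per-object rows, shaped like the table described in the class docstring.
stats = {
    'port1': {'topLevelName': 'Port 1', 'GeneratorFrameCount': 1000},
    'port2': {'topLevelName': 'Port 2', 'GeneratorFrameCount': 2000},
}
frame_counts = get_column_stats(stats, 'GeneratorFrameCount')
```

The result is one "column" of the statistics table: a mapping from each object to its value for the requested statistic.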
/*
Keep consistent with python version !
*/
string PatchKPIDToStr( int kpID )
{
switch( kpID ){
case -2:
return "LogoArea";
case -1:
return "vface";
case 0:
return "WinGlassLT";
case 1:
return "WinGlassRT";
case 2:
return "WinGlassLB";
case 3:
return "WinGlassRB ";
case 4:
return "LeftHLamp";
case 5:
return "RightHLamp";
case 6:
return "FrontBumpLB";
case 7:
return "FrontBumpRB";
case 8:
return "VehicleLogo";
case 9:
return "LicensePC";
case 10:
return "MidLineBot";
default:
printf( "Invalid patch or key point id !" );
return "BAD";
}
}
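The header comment asks that this mapping stay consistent with a Python version. A dict-based Python counterpart could look like the following; the function and dict names are assumptions, and only the id-to-name mapping itself comes from the switch above:

```python
# Mapping lifted verbatim from the C++ switch statement.
_PATCH_KP_NAMES = {
    -2: "LogoArea",   -1: "vface",
     0: "WinGlassLT",  1: "WinGlassRT",
     2: "WinGlassLB",  3: "WinGlassRB",
     4: "LeftHLamp",   5: "RightHLamp",
     6: "FrontBumpLB", 7: "FrontBumpRB",
     8: "VehicleLogo", 9: "LicensePC",
    10: "MidLineBot",
}

def patch_kp_id_to_str(kp_id):
    """Return the patch/key-point name for an id, or "BAD" for unknown ids."""
    name = _PATCH_KP_NAMES.get(kp_id)
    if name is None:
        print("Invalid patch or key point id !")
        return "BAD"
    return name
```

Keeping the table in a single dict makes the "keep consistent" requirement easier to audit than a switch: the two languages can share one generated mapping.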
// maxWN/angular-electron-utility
import { CommonModule } from '@angular/common';
import { NgModule } from '@angular/core';
import { AppSettingsComponent } from './app-settings.component';
import { AppSettingsRoutingModule } from './app-settings-routing.module';
// REMEMBER: If you import a module that contains another module you are already importing,
// you can simply use the imported modules version, and remove your local import
@NgModule({
declarations: [
AppSettingsComponent
],
exports: [ AppSettingsComponent ],
imports: [ CommonModule, AppSettingsRoutingModule ],
providers: [],
bootstrap: []
})
export class AppSettingsModule { }
|
/**
* Created by alex.mihai on 6/9/2017.
*/
public class CartValidationTest extends BaseTest {
@Test
public void validateCart() throws InterruptedException {
//open homepage
HomepageObject homepage = new HomepageObject(driver);
homepage.openHomePage();
homepage.acceptPrompt();
//check if the cart is empty
String emptyCartStatus = homepage.getCartHeaderText();
String expectedEmptyCartStatus = "Your cart is empty";
Assert.assertTrue(emptyCartStatus.equals(expectedEmptyCartStatus), "The cart status is not correct ! \nExpected: " + expectedEmptyCartStatus + "\nActual: "+ emptyCartStatus);
System.out.println("Cart text correct: " + emptyCartStatus);
//expand the cart and check if the correct text is present
String emptyExpandedCartStatus = homepage.getCartBodyText();
String expectedEmptyExpandedCartStatus = "Recently added item(s)\n" +
"You have no items in your shopping cart.";
Assert.assertTrue(emptyExpandedCartStatus.equals(expectedEmptyExpandedCartStatus), "The cart status is not correct ! \nExpected: " + expectedEmptyExpandedCartStatus + "\nActual: " + emptyExpandedCartStatus);
System.out.println("Cart text correct : " + emptyExpandedCartStatus);
//add some products
CapsulePageObject capsulePage = homepage.clickCapsuleButton();
capsulePage.waitForCapsulePageToLoad();
capsulePage.addLivanto();
Thread.sleep(10000);
String cartWithItems = capsulePage.getCartHeaderText();
Assert.assertFalse(expectedEmptyCartStatus.equals(cartWithItems), "The cart is still empty ! ");
System.out.println("Products added correctly ! \n" + cartWithItems);
ShoppingCartPageObject shoppingCart = capsulePage.openShoppingCart();
shoppingCart.waitForCartPageToLoad();
//Check item base price
String livantoPrice = shoppingCart.getLivantoPrice();
String actualPrice = shoppingCart.getProductPrice();
Assert.assertTrue(livantoPrice.equals(actualPrice), "The price is wrong ! " + "\nExpected: " + livantoPrice + "\nActual: " + actualPrice);
System.out.println("Prices match ! \nExpected : " + livantoPrice + "\nActual: " + actualPrice);
//Check that the total price is calculated correctly against selected qty
String calculatedPrice = shoppingCart.calculateSubtotal();
String subtotalFromCart = shoppingCart.parseSubtotal();
Assert.assertTrue(calculatedPrice.equals(subtotalFromCart), "The price is wrong !" + "\nExpected: " + calculatedPrice + "\nActual: " + subtotalFromCart);
System.out.println("Price calculated correctly ! \nExpected: " + calculatedPrice + "\nActual: " + subtotalFromCart);
//Modify the QTY of the products
shoppingCart.typeQty("150");
shoppingCart.updateCart();
//Check again that the total price is calculated correctly against selected qty
String calculatedPrice2 = shoppingCart.calculateSubtotal();
String subtotalFromCart2 = shoppingCart.parseSubtotal();
Assert.assertTrue(calculatedPrice2.equals(subtotalFromCart2), "The price is wrong !" + "\nExpected: " + calculatedPrice2 + "\nActual: " + subtotalFromCart2);
System.out.println("Price calculated correctly ! \nExpected: " + calculatedPrice2 + "\nActual: " + subtotalFromCart2);
String minicartStatus = shoppingCart.constructMiniCartStatus();
//Check if elements are visible in the cart page
Assert.assertTrue(shoppingCart.checkVisibleProductImg());
Assert.assertTrue(shoppingCart.checkVisibleContinueShoppingButton());
Assert.assertTrue(shoppingCart.checkVisibleEmptyCartButton());
Assert.assertTrue(shoppingCart.checkVisibleCheckoutButton());
Assert.assertTrue(shoppingCart.checkVisibleUpdateCartButton());
Assert.assertTrue(shoppingCart.checkVisibleProductSKU());
Assert.assertTrue(shoppingCart.checkVisibleProductName());
Assert.assertTrue(shoppingCart.checkVisibleCouponField());
Assert.assertTrue(shoppingCart.checkVisibleDiscountHeader());
Assert.assertTrue(shoppingCart.checkVisibleTax());
//Check if "Continue Shopping" button works and products are still in cart
HomepageObject homepage2 = shoppingCart.clickContinueShopping();
homepage2.waitForHomepageToLoad();
String actualMiniCartText = homepage2.getCartHeaderText();
Assert.assertTrue(minicartStatus.equals(actualMiniCartText), "The minicart status is correct!" + "\nExpected: " + minicartStatus + "\nActual: " + actualMiniCartText);
System.out.println("Minicart status correct ! \nExpected: " + minicartStatus + "\nActual: " + actualMiniCartText);
//Open the cart again and empty it
ShoppingCartPageObject shoppingCart2 = homepage2.openShoppingCart();
shoppingCart2.waitForCartPageToLoad();
//Check again that the total price is calculated correctly against selected qty
String calculatedPrice3 = shoppingCart2.calculateSubtotal();
String subtotalFromCart3 = shoppingCart2.parseSubtotal();
        Assert.assertTrue(calculatedPrice3.equals(subtotalFromCart3), "The price is wrong !" + "\nExpected: " + calculatedPrice3 + "\nActual: " + subtotalFromCart3);
System.out.println("Price calculated correctly ! \nExpected: " + calculatedPrice3 + "\nActual: " + subtotalFromCart3);
//Check if elements are visible in the cart page
Assert.assertTrue(shoppingCart2.checkVisibleProductImg());
Assert.assertTrue(shoppingCart2.checkVisibleContinueShoppingButton());
Assert.assertTrue(shoppingCart2.checkVisibleEmptyCartButton());
Assert.assertTrue(shoppingCart2.checkVisibleCheckoutButton());
Assert.assertTrue(shoppingCart2.checkVisibleUpdateCartButton());
Assert.assertTrue(shoppingCart2.checkVisibleProductSKU());
Assert.assertTrue(shoppingCart2.checkVisibleProductName());
Assert.assertTrue(shoppingCart2.checkVisibleCouponField());
Assert.assertTrue(shoppingCart2.checkVisibleDiscountHeader());
Assert.assertTrue(shoppingCart2.checkVisibleTax());
//Empty cart and validate
shoppingCart2.clickEmptyCartButton();
shoppingCart2.waitForEmptyCartPageToLoad();
String expectedHeader = "Shopping Cart is Empty";
String expectedP1 = "You have no items in your shopping cart.";
String expectedP2 = "Click here to continue shopping.";
String expectedStatus = "Your cart is empty";
String actualHeader = shoppingCart2.getTextEmptyCartHeader();
String actualP1 = shoppingCart2.getTextEmptyCartP1();
String actualP2 = shoppingCart2.getTextEmptyCartP2();
String actualStatus = shoppingCart2.getCartHeaderText();
Assert.assertTrue(expectedHeader.equals(actualHeader), "Cart Header mismatch !" + "\nExpected: " + expectedHeader + "\nActual: " + actualHeader);
Assert.assertTrue(expectedP1.equals(actualP1), "Cart paragraph mismatch !" + "\nExpected: " + expectedP1 + "\nActual: " + actualP1);
Assert.assertTrue(expectedP2.equals(actualP2), "Cart paragraph mismatch !" + "\nExpected: " + expectedP2 + "\nActual: " + actualP2);
Assert.assertTrue(expectedStatus.equals(actualStatus), "Cart status mismatch !" + "\nExpected: " + expectedStatus + "\nActual: " + actualStatus);
}
}
import sys
input = sys.stdin.readline
def solve():
n,k = map(int, input().split())
dp = [None]*(n+1)
p = [None]*(n+1)
dp[0] = 0
q = [0]
qi = 0
while qi < len(q):
x = q[qi]
#print(x)
qi += 1
for j in range(min(k, x)+1):
if x + k - j <= n:
y = x + k - 2*j
if y >= 0 and y <= n and dp[y] is None:
dp[y] = dp[x]+1
p[y] = x
q.append(y)
if dp[n] is None:
print(-1)
sys.stdout.flush()
return
x = n
a = [n]
while x != 0:
x = p[x]
a.append(x)
print(a, file=sys.stderr)
res = 0
have = [False]*n
for i in range(len(a)-2,-1,-1):
x = a[i+1]
g = ['?']
for j in range(min(k, x)+1):
y = x + k - 2*j
if y == a[i]:
l = k - j
for w in range(n):
if have[w] == True:
if j > 0:
j -= 1
g.append(str(w+1))
have[w] = False
else:
if l > 0:
l -= 1
g.append(str(w+1))
have[w] = True
break
print(' '.join(g))
sys.stdout.flush()
res ^= int(input())
print('!', res)
sys.stdout.flush()
solve()
/*
Code for problem by cookiedoth
Generated 28 Jun 2020 at 03.06 PM
,##. ,==.
,# #. \ o ',
# # _ _ \ \
# # (_) (_) / ;
`# #' / .'
`##' "=="
z_z
=_=
¯\_(ツ)_/¯
*/
#include <iostream>
#include <fstream>
#include <vector>
#include <set>
#include <map>
#include <bitset>
#include <algorithm>
#include <iomanip>
#include <cmath>
#include <ctime>
#include <functional>
#include <unordered_set>
#include <unordered_map>
#include <string>
#include <queue>
#include <deque>
#include <stack>
#include <complex>
#include <cassert>
#include <random>
#include <cstring>
#include <numeric>
#define ll long long
#define ld long double
#define null NULL
#define all(a) a.begin(), a.end()
#define rall(a) a.rbegin(), a.rend()
#define debug(a) cerr << #a << " = " << a << endl
#define forn(i, n) for (int i = 0; i < n; ++i)
#define sz(a) (int)a.size()
using namespace std;
template<class T> int chkmax(T &a, T b) {
if (b > a) {
a = b;
return 1;
}
return 0;
}
template<class T> int chkmin(T &a, T b) {
if (b < a) {
a = b;
return 1;
}
return 0;
}
template<class iterator> void output(iterator begin, iterator end, ostream& out = cerr) {
while (begin != end) {
out << (*begin) << " ";
begin++;
}
out << endl;
}
template<class T> void output(T x, ostream& out = cerr) {
output(x.begin(), x.end(), out);
}
void fast_io() {
ios_base::sync_with_stdio(0);
cin.tie(0);
cout.tie(0);
}
const int K = 26;
const int MAX_D = 365;
int c[K], s[MAX_D][K];
struct solver {
int D, sum;
vector<set<int> > S;
vector<int> a;
int prog(int x, int len) {
return len * (len - 1) / 2 * x;
}
solver() {}
solver(int _D, vector<int> _a): D(_D), a(_a) {
// cerr << "D = " << D << endl;
S.resize(K);
for (int i = 0; i < K; ++i) {
S[i].insert(-1);
S[i].insert(D);
}
sum = 0;
for (int i = 0; i < D; ++i) {
sum += s[i][a[i]];
S[a[i]].insert(i);
}
for (int i = 0; i < K; ++i) {
auto it = next(S[i].begin());
while (it != S[i].end()) {
// cerr << "subt prog " << c[i] << " " << (*it) - (*prev(it)) << endl;
sum -= prog(c[i], (*it) - (*prev(it)));
it++;
}
}
// cerr << "sum = " << sum << endl;
}
void find_nb(int clr, int x, int &l, int &r) {
auto it = S[clr].upper_bound(x);
auto it2 = S[clr].lower_bound(x);
r = (it == S[clr].end() ? -1 : (*it));
l = (it2 == S[clr].begin() ? -1 : (*prev(it2)));
}
int modify(int pos, int val) {
// cerr << "modify " << pos << " " << val << endl;
if (a[pos] == val) {
return 0;
}
int old_val = a[pos];
int res = 0, l, r;
find_nb(val, pos, l, r);
res -= prog(c[val], pos - l);
res -= prog(c[val], r - pos);
res += prog(c[val], r - l);
// cerr << "lr for " << val << " " << pos << " = " << l << " " << r << endl;
find_nb(old_val, pos, l, r);
res += prog(c[old_val], pos - l);
res += prog(c[old_val], r - pos);
res -= prog(c[old_val], r - l);
// cerr << "lr for " << old_val << " " << pos << " = " << l << " " << r << endl;
// cerr << "res = " << res << endl;
res -= s[pos][old_val];
res += s[pos][val];
// cerr << "done" << endl;
// cerr << "delta = " << res << endl;
return res;
}
void apply_modification(int pos, int val, int delta) {
sum += delta;
int old_val = a[pos];
S[old_val].erase(pos);
S[val].insert(pos);
a[pos] = val;
}
};
int D;
void read() {
cin >> D;
for (int i = 0; i < K; ++i) {
cin >> c[i];
}
for (int i = 0; i < D; ++i) {
for (int j = 0; j < K; ++j) {
cin >> s[i][j];
}
}
}
vector<int> t;
void read_t() {
t.resize(D);
for (int i = 0; i < D; ++i) {
cin >> t[i];
t[i]--;
}
}
int lst[K];
void solveB() {
fill(lst, lst + K, -1);
int sat = 0;
for (int i = 0; i < D; ++i) {
sat += s[i][t[i]];
lst[t[i]] = i;
for (int j = 0; j < K; ++j) {
sat -= c[j] * (i - lst[j]);
}
cout << sat << "\n";
}
}
solver S;
void solveC() {
S = solver(D, t);
int m;
cin >> m;
for (int it = 0; it < m; ++it) {
int pos, val;
cin >> pos >> val;
pos--;
val--;
int delta = S.modify(pos, val);
S.apply_modification(pos, val, delta);
cout << S.sum << '\n';
}
}
const int INF = 1e9;
void find_t_greedily() {
t.resize(D);
fill(lst, lst + K, -1);
for (int i = 0; i < D; ++i) {
// cerr << "i = " << i << endl;
int best = -INF, opt = -1;
for (int x = 0; x < K; ++x) {
int cur = s[i][x] + c[x] * (i - lst[x]);
if (chkmax(best, cur)) {
opt = x;
}
}
t[i] = opt;
lst[opt] = i;
}
}
void print_cf() {
solver S(D, t);
cerr << S.sum << endl;
}
ld t0 = clock();
ld apple_watch() {
return ((ld)clock() - t0) / CLOCKS_PER_SEC;
}
mt19937 rng(58);
const int MASK = (1 << 9);
ld real() {
return (ld)rng() / (ld)rng.max();
}
const ld maxQ = 4e9;
void local_optimizations() {
solver S(D, t);
cerr << S.sum << endl;
int iter = 0;
while (true) {
iter++;
int pos = rng() % D;
int clr = rng() % K;
int delta = S.modify(pos, clr);
ld Q = (maxQ / (ld)iter);
if (delta >= 0) {
S.apply_modification(pos, clr, delta);
} else {
ld p = exp(-(ld)abs(delta) / (ld)Q);
if (real() < p) {
S.apply_modification(pos, clr, delta);
}
}
if (!(iter & MASK) && apple_watch() > 1.98) {
break;
}
}
cerr << "Iterations performed: " << iter << endl;
t = S.a;
cerr << S.sum << endl;
}
void print_ans() {
for (int i = 0; i < D; ++i) {
cout << t[i] + 1 << '\n';
}
}
signed main() {
fast_io();
read();
find_t_greedily();
local_optimizations();
print_ans();
}
// pkg/apis/tf/v1alpha1/templates/common_config.go
package templates
import (
"text/template"
)
// CommonRunConfig is the template of the common run service actions
var CommonRunConfig = template.Must(template.New("").Parse(`#!/bin/bash
[[ "$LOG_LEVEL" != "SYS_DEBUG" ]] || set -x
cmd_file="/tmp/command.sh"
pid_file="${cmd_file}.pid"
sig_file="${cmd_file}.sighup"
cat <<\EOF > /tmp/command.sh
#!/bin/bash
[[ "$LOG_LEVEL" != "SYS_DEBUG" ]] || set -x
{{ .Command }}
EOF
chmod +x /tmp/command.sh
function wait_file() {
local src=$1
echo "INFO: $(date): wait for $src"
while [ ! -e $src ] ; do sleep 1; done
echo "INFO: $(date): wait for $src completed"
local hash=$(md5sum $src | awk '{print($1)}')
echo $hash > /tmp/$(basename $src).md5sum
if [[ "$LOG_LEVEL" != "SYS_DEBUG" ]] ; then
echo "INFO: $(date): hash $hash"
else
echo -e "INFO: $(date): hash $hash\n$(cat $src)"
fi
}
function link_file() {
local src={{ .ConfigMapMount }}/$1
wait_file $src
if [[ -n "$2" ]] ; then
local ddir={{ .DstConfigPath }}
local dst=$ddir/$2
echo "INFO: $(date): link $src => $dst"
mkdir -p $ddir
ln -sf $src $dst
fi
}
function term_process() {
local pid=$1
local signal=TERM
echo "INFO: $(date): $0: term_command $pid"
if [ -n "$pid" ] ; then
kill -${signal} $pid
echo "INFO: $(date): $0: term_command $pid: wait child job"
for i in {1..20}; do
kill -0 $pid >/dev/null 2>&1 || break
sleep 6
done
if kill -0 $pid >/dev/null 2>&1 ; then
echo "INFO: $(date): $0: term_command $pid: faild to wait child job.. exit to relaunch container"
[ -z "$sig_file" ] || rm -f $sig_file
exit 1
fi
fi
}
function trap_sigterm() {
echo "INFO: $(date): $0: trap_sigterm: start"
local pid=$(cat $pid_file 2>/dev/null)
term_process $pid
echo "INFO: $(date): $0: trap_sigterm: done"
[ -z "$sig_file" ] || rm -f $sig_file
}
function trap_sighup() {
[ -z "$sig_file" ] || touch $sig_file
local pid=$(cat $pid_file 2>/dev/null)
echo "INFO: $(date): $0: trap_sighup: pid=$pid"
kill -HUP $pid
}
function check_hash_impl() {
local src=$1
local new=$(md5sum $src | awk '{print($1)}')
local old=$(cat /tmp/$(basename $src).md5sum)
if [[ "$new" != "$old" ]] ; then
echo "INFO: $(date): File changed $src: old=$old new=$new"
return 1
fi
return 0
}
function check_hash() {
check_hash_impl {{ .ConfigMapMount }}/$1
}
function configs_unchanged() {
local changed=0
{{ range $src, $dst := .Configs }}
check_hash {{ $src }} || changed=1
{{ end }}
check_hash_impl /etc/certificates/server-key-${POD_IP}.pem || changed=1
check_hash_impl /etc/certificates/server-${POD_IP}.crt || changed=1
check_hash_impl {{ .CAFilePath }} || changed=1
return $changed
}
{{ if .InitCommand }}
{{ .InitCommand }}
{{ end }}
export -f trap_sighup
export -f trap_sigterm
export -f wait_file
export -f link_file
update_signal={{ .UpdateSignal }}
trap 'trap_sighup' SIGHUP
trap 'trap_sigterm' SIGTERM
touch $sig_file
while [ -e $sig_file ] ; do
wait_file /etc/certificates/server-key-${POD_IP}.pem
wait_file /etc/certificates/server-${POD_IP}.crt
wait_file {{ .CAFilePath }}
{{ range $src, $dst := .Configs }}
link_file {{ $src }} {{ $dst }}
{{ end }}
while [ -e $sig_file ] ; do
pid=$(cat $pid_file 2>/dev/null)
if [ -z "$pid" ] || ! kill -0 $pid >/dev/null 2>&1 ; then
$cmd_file &
pid=$!
echo $pid > $pid_file
echo "INFO: $(date): command started pid=$pid"
else
if ! configs_unchanged ; then
delay=$(( $RANDOM % 60 ))
echo "INFO: $(date): delay reload for $delay sec"
sleep $delay
if [[ "$update_signal" == 'TERM' ]] ; then
term_process $pid
elif [[ "$update_signal" == 'HUP' ]] ; then
trap_sighup
else
echo "INFO: $(date): unsupported signal $update_signal"
exit 1
fi
break
fi
fi
sleep 10
done
done
`))
|
<filename>engine/netutil/WebSocketConnection.go
package netutil
//
//type WebSocketConnection struct {
// wsconn *websocket.Conn
//}
//
//func NewWebSocketConnection(conn *websocket.Conn) WebSocketConnection {
// return WebSocketConnection(conn)
//}
//
//func (wsc WebSocketConnection) Write(p []byte) (int, error) {
// wsc.wsconn.Write()
//}
//
//func (wsc WebSocketConnection) Read(p []byte) (int, error) {
//
//}
|
#include <iostream>
using namespace std;
#define FOR(i, m, n) for (int i = m; i <= n; ++i)
int main() {
int A, B;
cin >> A >> B;
int cnt = 0;
FOR(i,A,B) {
int s = i % 10, t = i / 10000;
int u = i / 10 % 10, v = i / 1000 % 10;
if ( s == t && u == v ) ++cnt;
}
cout << cnt << endl;
} |
<filename>chapter_002/src/main/java/products/storage/decorator/LimitedStorage.java
package products.storage.decorator;
import products.items.decorator.IFood;
/**
* @author aoliferov
* @since 14.02.2019
*/
public class LimitedStorage<E extends IFood> extends DecoratorStorage<E> {
private int size;
public LimitedStorage(IStorage storage, int size) {
super(storage);
this.size = size;
}
@Override
public boolean add(E item) {
boolean result = false;
if (super.engaged() < this.size) {
result = super.add(item);
}
return result;
}
}
|
import { assertEquals, assertThrows } from "./deps.ts";
import { Ningen } from "./mod.ts";
const ng = new Ningen("/root/dir");
const testRule = ng.rule({
name: "ttt",
command: "ttt -o $out $in",
srcs: [],
});
function generate(): string {
return ng.generateToString({ enableGeneratorRule: false }).trim();
}
Deno.test("generate: writePool: outputs pool", () => {
ng.reset();
ng.pool({
name: "my_pool",
depth: 2,
});
assertEquals(
generate(),
`
pool my_pool
depth = 2
`.trim(),
);
});
Deno.test("generate: writeRule", () => {
ng.reset();
ng.rule({ name: "rrr", command: "cmd goes here" });
assertEquals(
generate(),
`
rule rrr
command = cmd goes here
`.trim(),
);
});
Deno.test("generate: writeRule: string vars", () => {
ng.reset();
ng.rule({
name: "rrr",
command: "cmd goes here",
vars: {
c: "ccc",
b: "bbb",
a: "aaa",
},
});
assertEquals(
generate(),
`
rule rrr
command = cmd goes here
c = ccc
b = bbb
a = aaa
`.trim(),
);
});
Deno.test("generate: writeRule: file vars", () => {
ng.reset();
ng.rule({
name: "rrr",
command: "cmd goes here",
vars: {
c: ng.file("/root/dir/ccc"),
b: ng.file("/abs/bbb"),
a: ng.file("/root/dir/nested/aaa"),
},
});
assertEquals(
generate(),
`
rule rrr
command = cmd goes here
c = ccc
b = ../../abs/bbb
a = nested/aaa
`.trim(),
);
});
Deno.test("generate: writeRule: $binary substituted in command", () => {
ng.reset();
ng.rule({
name: "rrr",
command: "something $binary something",
binary: ng.file("/root/dir/mybinary"),
});
assertEquals(
generate(),
`
rule rrr
command = something ./mybinary something
`.trim(),
);
});
Deno.test("generate: writeRule: error if $binary is not in rule command", () => {
ng.reset();
ng.rule({
name: "rrr",
command: "something something",
binary: ng.file("/root/dir/mybinary"),
});
assertThrows(
() => generate(),
Error,
"binary property defined in rule rrr but not referenced in command: something something",
);
});
Deno.test("generate: writeRule: generator", () => {
ng.reset();
ng.rule({ name: "rrr", command: "cmd goes here", generator: true });
assertEquals(
generate(),
`
rule rrr
command = cmd goes here
generator = 1
`.trim(),
);
});
Deno.test("generate: writeRule: using a pool", () => {
ng.reset();
const pool = ng.pool({ name: "my_pool", depth: 1 });
ng.rule({ name: "rrr", command: "cmd goes here", pool });
assertEquals(
generate(),
`
pool my_pool
depth = 1
rule rrr
command = cmd goes here
pool = my_pool
`.trim(),
);
});
Deno.test("generate: writeRule: using the console pool", () => {
ng.reset();
ng.rule({ name: "rrr", command: "cmd goes here", pool: "console" });
assertEquals(
generate(),
`
rule rrr
command = cmd goes here
pool = console
`.trim(),
);
});
Deno.test("generate: writeRule: substitutes $dir", () => {
ng.reset();
ng.rule({
name: "r1",
command: "cd $dir && something in root",
});
const subdirNg = new Ningen("/root/dir/subdir");
subdirNg.rule({
name: "r2",
command: "cd $dir && something in subdir",
});
assertEquals(
generate(),
`
rule r1
command = cd ./ && something in root
rule r2
command = cd ./subdir && something in subdir
`.trim(),
);
});
Deno.test("generate: writeRule: includes description", () => {
ng.reset();
ng.rule({
name: "r",
command: "rrr",
description: "my description",
});
assertEquals(
generate(),
`
rule r
command = rrr
description = my description
`.trim(),
);
});
// Deno.test("generate: writeRule: depfile", () => {
// ng.rule({ name: "rrr", command: "cmd", depfile: "$out.d" }),
// );
// assertEquals(
// generate(),
// `
// rule rrr
// command = cmd
// depfile = $out.d
// deps = gcc
// `.trim(),
// );
// });
Deno.test("generate: writeTarget: single input and output", () => {
ng.reset();
ng.build({
rule: testRule,
inputs: ng.files("i"),
outputs: ng.files("o"),
});
assertEquals(generate(), `build o: ttt i`);
});
Deno.test("generate: writeTarget: multiple inputs and outputs", () => {
ng.reset();
ng.build({
rule: testRule,
inputs: ng.files("i1", "i2"),
outputs: ng.files("o1", "o2"),
});
assertEquals(
generate(),
`build o1 o2: ttt i1 i2
`.trim(),
);
});
Deno.test("generate: writeTarget: with implicit inputs", () => {
ng.reset();
const r = ng.rule({
name: "r",
command: "c",
srcs: ng.files("x1", "x2"),
});
ng.build({
rule: r,
inputs: ng.files("i"),
outputs: ng.files("o"),
});
assertEquals(
generate(),
`
rule r
command = c
build o: r i | x1 x2
`.trim(),
);
});
Deno.test("generate: writeTarget: vars", () => {
ng.reset();
ng.build({
rule: testRule,
inputs: ng.files("i"),
outputs: ng.files("o"),
vars: {
varA: "A",
varB: ng.file("b/b"),
},
});
assertEquals(
generate(),
`
build o: ttt i
varA = A
varB = b/b
`.trim(),
);
});
Deno.test("generate: writeRule: binary not added to vars", () => {
ng.reset();
const r = ng.rule({
name: "r",
command: "$binary 123",
binary: ng.file("mybinary"),
});
ng.build({
rule: r,
inputs: ng.files("i"),
outputs: ng.files("o"),
});
assertEquals(
generate(),
`
rule r
command = ./mybinary 123
build o: r i | mybinary
`.trim(),
);
});
Deno.test("generate: writeTarget: overriding default pool with new pool", () => {
ng.reset();
const pool1 = ng.pool({ name: "my_pool1", depth: 1 });
const pool2 = ng.pool({ name: "my_pool2", depth: 1 });
const rule = ng.rule({ name: "rrr", command: "cmd goes here", pool: pool1 });
ng.build({
rule,
inputs: ng.files("i"),
outputs: ng.files("o"),
pool: pool2,
});
assertEquals(
generate(),
`
pool my_pool1
depth = 1
pool my_pool2
depth = 1
rule rrr
command = cmd goes here
pool = my_pool1
build o: rrr i
pool = my_pool2
`.trim(),
);
});
Deno.test("generate: writeTarget: overriding default pool with empty string", () => {
ng.reset();
const pool = ng.pool({ name: "my_pool", depth: 1 });
const rule = ng.rule({ name: "rrr", command: "cmd goes here", pool });
ng.build({
rule,
inputs: ng.files("i"),
outputs: ng.files("o"),
pool: "",
});
assertEquals(
generate(),
`
pool my_pool
depth = 1
rule rrr
command = cmd goes here
pool = my_pool
build o: rrr i
pool =
`.trim(),
);
});
Deno.test("generate: write", () => {
ng.reset();
const rule0 = ng.rule({ name: "r0", command: "c0" });
const rule1 = ng.rule({ name: "r1", command: "c1" });
const rule2 = ng.rule({ name: "r2", command: "c2" });
ng.build({
rule: rule0,
inputs: ng.files("i0"),
outputs: ng.files("o0"),
});
ng.build({
rule: rule1,
inputs: ng.files("i1"),
outputs: ng.files("o1"),
});
ng.build({
rule: rule2,
inputs: ng.files("i2"),
outputs: ng.files("o2"),
});
assertEquals(
generate(),
`
rule r0
command = c0
rule r1
command = c1
rule r2
command = c2
build o0: r0 i0
build o1: r1 i1
build o2: r2 i2
`.trim(),
);
});
Deno.test("generate: write: rules written in sorted order", () => {
ng.reset();
ng.rule({ name: "rrr2", command: "cmd goes here" });
ng.rule({ name: "rrr1", command: "cmd goes here" });
assertEquals(
generate(),
`
rule rrr1
command = cmd goes here
rule rrr2
command = cmd goes here
`.trim(),
);
});
Deno.test("generate: write: targets written in original order", () => {
ng.reset();
ng.rule({
name: "ttt",
command: "ttt",
});
ng.build({
rule: testRule,
inputs: ng.files("i3"),
outputs: ng.files("o3"),
});
ng.build({
rule: testRule,
inputs: ng.files("i2"),
outputs: ng.files("o2"),
});
ng.build({
rule: testRule,
inputs: ng.files("i1"),
outputs: ng.files("o1"),
});
assertEquals(
generate(),
`
rule ttt
command = ttt
build o3: ttt i3
build o2: ttt i2
build o1: ttt i1
`.trim(),
);
});
Deno.test("generate: isDefault: non-default targets don't get default statements", () => {
ng.reset();
ng.build({
rule: testRule,
inputs: ng.files("i1"),
outputs: ng.files("o1"),
isDefault: true,
});
ng.build({
rule: testRule,
inputs: ng.files("i2"),
outputs: ng.files("o2"),
// isDefault omitted (defaults to true)
});
ng.build({
rule: testRule,
inputs: ng.files("i3"),
outputs: ng.files("o3"),
isDefault: false,
});
assertEquals(
generate(),
`
build o1: ttt i1
build o2: ttt i2
build o3: ttt i3
default o1
default o2
`.trim(),
);
});
Deno.test("generate: isDefault: when everything is default, no default statements get printed", () => {
ng.reset();
ng.build({
rule: testRule,
inputs: ng.files("i1"),
outputs: ng.files("o1"),
isDefault: true,
});
ng.build({
rule: testRule,
inputs: ng.files("i2"),
outputs: ng.files("o2"),
// isDefault omitted (defaults to true)
});
assertEquals(
generate(),
`
build o1: ttt i1
build o2: ttt i2
`.trim(),
);
});
Deno.test("generate: using default generator rule", () => {
ng.reset();
assertEquals(
ng.generateToString({}).trim(),
`
rule ningen
command = ./BUILD.ts
description = Regenerating Ninja file
generator = 1
build build.ninja: ningen | BUILD.ts
`.trim(),
);
});
Deno.test("generate: override output file", () => {
ng.reset();
assertEquals(
ng.generateToString({ output: ng.file("override.ninja") }).trim(),
`
rule ningen
command = ./BUILD.ts
description = Regenerating Ninja file
generator = 1
build override.ninja: ningen | BUILD.ts
`.trim(),
);
});
Deno.test("generate: override inputs", () => {
ng.reset();
assertEquals(
ng.generateToString({ inputs: ng.files("a.txt", "b.txt") }).trim(),
`
rule ningen
command = ./BUILD.ts
description = Regenerating Ninja file
generator = 1
build build.ninja: ningen a.txt b.txt | BUILD.ts
`.trim(),
);
});
Deno.test("generate: override generator rule", () => {
ng.reset();
const myGeneratorRule = ng.rule({
name: "my-generator",
command: "ggg",
srcs: ng.files("f"),
generator: true,
});
assertEquals(
ng.generateToString({ generatorRule: myGeneratorRule }).trim(),
`
rule my-generator
command = ggg
generator = 1
build build.ninja: my-generator | f
`.trim(),
);
});
Deno.test("generate: throws if overridden rule is not a generator rule", () => {
ng.reset();
const myGeneratorRule = ng.rule({
name: "my-generator",
command: "ggg",
srcs: ng.files("f"),
// generator not set => false
});
assertThrows(
() => ng.generateToString({ generatorRule: myGeneratorRule }),
Error,
"my-generator is not a generator rule",
);
});
|
import static org.easymock.EasyMock.createMock;
import static org.easymock.EasyMock.createNiceMock;

import org.springframework.beans.factory.config.AbstractFactoryBean;

/**
* Factory for creating EasyMock objects as beans.
*
* @author Eric Dalquist
* @version $Revision$
*/
public class EasyMockFactoryBean<T> extends AbstractFactoryBean<T> {
private final Class<? extends T> type;
private boolean nice = true;
/**
* @param nice If a nice mock should be created, defaults to true
*/
public void setNice(boolean nice) {
this.nice = nice;
}
public EasyMockFactoryBean(Class<? extends T> type) {
this.type = type;
}
@Override
public Class<? extends T> getObjectType() {
return this.type;
}
@Override
protected T createInstance() throws Exception {
if (this.nice) {
return createNiceMock(this.type);
}
return createMock(this.type);
}
} |
Civil rights historian and author Frank Sikora, who worked as a reporter for The Birmingham News for more than three decades, has died. He was 80.
The tall, gray-bearded reporter, known for his kind, soft-spoken manner and elegant writing style, died on Monday, according to his family.
Sikora was the author of highly respected books about the civil rights movement, including "Until Justice Rolls Down: The Birmingham Church Bombing Case," published in 1991 and reprinted in 2005, and "The Judge: The Life and Opinions of Judge Frank M. Johnson Jr.," published in 1992 and reprinted in 2006.
His first book, "Selma, Lord, Selma," published in 1980 and reprinted in 1997, was turned into a TV movie by Disney. It's an account of two young girls in 1965 Selma in the days leading up to Bloody Sunday, when law enforcement officers beat civil rights marchers.
Sikora was from Byesville, Ohio, but spent most of his life in Alabama. "He was really from Birmingham," said his granddaughter, SchaScha Smith. "He was very humble, very loving. He didn't care who you were or what you did; if you needed help, he'd be there. He'd always show a sense of humor and make you smile even in a sad situation."
Sikora began his writing career at The Gadsden Times in 1964. He worked for The Birmingham News from 1967 until he retired in 1999. He freelanced for Time magazine from 2001 to 2007.
In 2014, Sikora received the Clarence Cason Award in Non-fiction Writing from the University of Alabama.
Sikora also co-wrote the 2006 novel "The Visitor at Winter Chapel" and a 2007 biography, "Hear the Bugles Calling: My Three Wars as a Combat Infantryman."
A funeral service will be held Monday, March 28, at Gray Brown-Service Mortuary in Anniston. Burial will be at Bethlehem Baptist Cemetery in Oxford. Sikora, who had six children, moved from Birmingham to the Anniston area about four months ago to live with his oldest daughter as his health declined, Smith said. Sikora had just turned 80 on March 18. He died of natural causes, Smith said. "He was just tired and worn out and ready to go," she said. |
<reponame>groupby/sayt-client-javascript
import { test, theon } from 'groupby-client-core';
import { Api } from '../src';
export function middleware(rootApi: { sayt: Api } & theon.Request, action: string) {
afterEach(() => rootApi.getStore().map = {});
it('should set url from customerId', (done) => {
rootApi.getStore().map = { customerId: 'myCustomerId' };
test.expectRootUrl(rootApi.sayt[action], 'https://mycustomerid.groupbycloud.com')
.then(() => done());
});
}
export function validation(api: () => theon.Request) {
test.itShouldFailValidation(api, undefined, 'request validation error: must provide body');
test.itShouldFailValidation(api, {}, 'body validation error: must provide query');
test.itShouldPassValidation(api, { query: 'appl'});
}
|
<reponame>xpsurgery/customer-base<gh_stars>1-10
class CustomerBase:
def __init__(self):
self.customers = []
def add(self, customer):
self.customers.append(customer)
def findByLastName(self, lastName):
result = []
for customer in self.customers:
if customer.lastName == lastName:
result.append(customer)
return result
def findByFirstAndLastName(self, firstName, lastName):
result = []
for customer in self.customers:
if customer.firstName == firstName and customer.lastName == lastName:
result.append(customer)
return result
def findByCreditGreaterThan(self, credit):
result = []
for customer in self.customers:
if customer.credit > credit:
result.append(customer)
return result
|
<filename>src/main/java/org/github/gwttemplate/templates/widgets/InlinePanel.java<gh_stars>0
/**
*
*/
package org.github.gwttemplate.templates.widgets;
import com.google.gwt.user.client.DOM;
import com.google.gwt.user.client.ui.ComplexPanel;
import com.google.gwt.user.client.ui.Widget;
/**
*
*/
public class InlinePanel extends ComplexPanel {
/**
* Creates an empty inline panel.
*/
public InlinePanel() {
setElement(DOM.createSpan());
}
/**
* Adds a new child widget to the panel.
*
* @param w the widget to be added
*/
@Override
public void add(Widget w) {
add(w, getElement());
}
/**
* Inserts a widget before the specified index.
*
* @param w the widget to be inserted
* @param beforeIndex the index before which it will be inserted
* @throws IndexOutOfBoundsException if <code>beforeIndex</code> is out of
* range
*/
public void insert(Widget w, int beforeIndex) {
insert(w, getElement(), beforeIndex, true);
}
}
|
import logging
import random

logger = logging.getLogger(__name__)


# Note: "clib" (the database helper module) is assumed to be imported elsewhere in this file.
def popularity_of_expressions(path_to_db: str) -> dict:
trace_id = str(random.randint(1000000, 9999999))
logger.info("[trace start " + trace_id + "]")
dat = clib.read_db(path_to_db)
expression_popularity_dict = {}
deriv_uses_expr_global_id = {}
for deriv_id in dat["derivations"].keys():
list_of_all_expr_global_IDs_for_this_deriv = []
for step_id, step_dict in dat["derivations"][deriv_id]["steps"].items():
for connection_type in ["inputs", "feeds", "outputs"]:
for expr_local_id in step_dict[connection_type]:
list_of_all_expr_global_IDs_for_this_deriv.append(
dat["expr local to global"][expr_local_id]
)
deriv_uses_expr_global_id[deriv_id] = list(
set(list_of_all_expr_global_IDs_for_this_deriv)
)
for expr_global_id, expr_dict in dat["expressions"].items():
expression_popularity_dict[expr_global_id] = []
for deriv_id, list_of_expr in deriv_uses_expr_global_id.items():
if expr_global_id in list_of_expr:
expression_popularity_dict[expr_global_id].append(deriv_id)
expression_popularity_dict[expr_global_id] = list(
set(expression_popularity_dict[expr_global_id])
)
logger.info("[trace end " + trace_id + "]")
return expression_popularity_dict |
/**
* Dialog for adding constants to subsystems.
*
* @author Sam Carlberg
*/
public class ConstantsAdderDialog extends CenteredDialog {
/**
* The constants property being edited.
*/
private final ConstantsProperty constantsProperty;
/**
* Convenience list for changing constants.
*/
private final List<ValuedParameterDescriptor> constantsList;
/**
* ComboBox for selecting parameter types.
*/
private final JComboBox<String> typeBox = new JComboBox<>(ParameterDescriptor.SUPPORTED_TYPES);
private DefaultCellEditor currentCellEditor;
public ConstantsAdderDialog(RobotComponent command, JFrame owner, boolean modal) {
super(owner, "Add constants");
this.constantsProperty = (ConstantsProperty) command.getProperty("Constants");
initComponents();
setBackground(Color.WHITE);
setForeground(Color.WHITE);
constantsTable.setShowHorizontalLines(true);
constantsTable.setShowVerticalLines(true);
constantsTable.setRowHeight(25);
constantsTable.setBackground(new Color(240, 240, 240));
constantsTable.setGridColor(Color.BLACK);
constantsTable.addKeyListener(new KeyAdapter() {
@Override
public void keyTyped(KeyEvent e) {
if (e.getKeyChar() == KeyEvent.VK_BACK_SPACE
|| e.getKeyChar() == KeyEvent.VK_DELETE) {
deleteSelectedRows();
}
}
});
constantsList = constantsProperty.getValue();
constantsList.stream().forEach(p -> getTableModel().addRow(p.toArray()));
}
public List<ValuedParameterDescriptor> showAndGet() {
setVisible(true);
return getParameters();
}
/**
* Saves the data in the table to the constants property. This will clear
* any data that previously existed in the property.
*/
private void save() {
Vector<Vector> dataVector = getTableModel().getDataVector();
constantsList.clear();
dataVector.stream().forEach((dataRow) -> {
String name = (String) dataRow.get(0);
String type = (String) dataRow.get(1);
Object value = dataRow.get(2);
ValuedParameterDescriptor newParam = new ValuedParameterDescriptor(name, type, value);
constantsList.add(newParam);
});
constantsProperty.setValueAndUpdate(constantsList); // almost certainly redundant
}
/**
* Deletes the selected rows in the table. This does not affect the
* constants property.
*/
private void deleteSelectedRows() {
int[] rows = constantsTable.getSelectedRows();
constantsTable.clearSelection();
currentCellEditor.cancelCellEditing();
for (int i = rows.length - 1; i >= 0; i--) {
if (rows[i] > -1) {
getTableModel().removeRow(rows[i]);
}
}
}
/**
* Checks if the given row is valid. A row is valid if the name of the
* constant in the row is unique, i.e. no other constant in the table has
* the same name.
*
* @param row the row to validate
* @return true if the row is valid, false otherwise
*/
private boolean isRowValid(int row) {
String name = (String) ((Vector) getTableModel().getDataVector().get(row)).get(0);
int count = 0;
count = ((Vector<Vector>) getTableModel().getDataVector())
.stream()
.filter(v -> v.get(0).equals(name))
.map(i -> 1)
.reduce(count, Integer::sum);
if (count != 1) {
return false;
}
return constantsProperty.isValid();
}
public List<ValuedParameterDescriptor> getParameters() {
return constantsProperty.getValue();
}
/**
* This method is called from within the constructor to initialize the form.
* WARNING: Do NOT modify this code. The content of this method is always
* regenerated by the Form Editor.
*/
@SuppressWarnings("unchecked")
// <editor-fold defaultstate="collapsed" desc="Generated Code">//GEN-BEGIN:initComponents
private void initComponents() {
jScrollPane1 = new javax.swing.JScrollPane();
constantsTable = new ParameterDeclarationTable();
addButton = new javax.swing.JButton();
saveButton = new javax.swing.JButton();
setDefaultCloseOperation(javax.swing.WindowConstants.DISPOSE_ON_CLOSE);
constantsTable.setModel(new javax.swing.table.DefaultTableModel(
new Object [][] {
},
new String [] {
"Name", "Type", "Value"
}
) {
Class[] types = new Class [] {
java.lang.String.class, java.lang.String.class, java.lang.String.class
};
public Class getColumnClass(int columnIndex) {
return types [columnIndex];
}
});
constantsTable.setDragEnabled(true);
constantsTable.setShowGrid(true);
constantsTable.getTableHeader().setReorderingAllowed(false);
jScrollPane1.setViewportView(constantsTable);
addButton.setText("Add constant");
addButton.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
addButtonActionPerformed(evt);
}
});
saveButton.setText("Save and close");
saveButton.addActionListener(new java.awt.event.ActionListener() {
public void actionPerformed(java.awt.event.ActionEvent evt) {
saveButtonActionPerformed(evt);
}
});
javax.swing.GroupLayout layout = new javax.swing.GroupLayout(getContentPane());
getContentPane().setLayout(layout);
layout.setHorizontalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addComponent(jScrollPane1, javax.swing.GroupLayout.DEFAULT_SIZE, 400, Short.MAX_VALUE)
.addGroup(layout.createSequentialGroup()
.addGap(0, 0, Short.MAX_VALUE)
.addComponent(addButton)
.addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED)
.addComponent(saveButton)
.addContainerGap())
);
layout.setVerticalGroup(
layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
.addGroup(layout.createSequentialGroup()
.addComponent(jScrollPane1, javax.swing.GroupLayout.DEFAULT_SIZE, 265, Short.MAX_VALUE)
.addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED)
.addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.BASELINE)
.addComponent(addButton)
.addComponent(saveButton)))
);
pack();
}// </editor-fold>//GEN-END:initComponents
private void addButtonActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_addButtonActionPerformed
ValuedParameterDescriptor p = new ValuedParameterDescriptor("[change me]", "String", null);
getTableModel().addRow(p.toArray());
}//GEN-LAST:event_addButtonActionPerformed
private void saveButtonActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_saveButtonActionPerformed
save();
dispose();
}//GEN-LAST:event_saveButtonActionPerformed
/**
* Helper method to get and cast the table model to avoid casting it
* everywhere it's used.
*/
private DefaultTableModel getTableModel() {
return (DefaultTableModel) constantsTable.getModel();
}
private ValuedParameterDescriptor constantForRow(int row) {
Vector<Object> rowData = (Vector) getTableModel().getDataVector().get(row);
String name = (String) rowData.get(0);
ValuedParameterDescriptor p = constantsProperty.getConstantByName(name);
if (p == null) {
p = new ValuedParameterDescriptor(name, (String) rowData.get(1), rowData.get(2));
}
return p;
}
private class ParameterDeclarationTable extends JTable {
public ParameterDeclarationTable() {
setTransferHandler(new TableRowTransferHandler(this));
}
@Override
public TableCellEditor getCellEditor(int row, int column) {
DefaultCellEditor editor;
switch (column) {
case 0: // name
editor = new DefaultCellEditor(new JTextField());
break;
case 1: // type
editor = new DefaultCellEditor(typeBox);
break;
case 2: // value
String type = (String) getValueAt(row, 1);
switch (type) {
case "boolean":
editor = new DefaultCellEditor(new JCheckBox());
break;
case "String":
editor = new DefaultCellEditor(new JTextField());
break;
default: // number
editor = new DefaultCellEditor(new JTextField("0"));
break;
}
break;
default:
editor = (DefaultCellEditor) super.getCellEditor(row, column);
break;
}
editor.setClickCountToStart(2);
currentCellEditor = editor;
return editor;
}
@Override
public void setValueAt(Object aValue, int row, int column) {
if (0 <= row && row < this.getRowCount()) {
super.setValueAt(aValue, row, column);
}
}
@Override
@SuppressWarnings("Convert2Lambda")
public TableCellRenderer getCellRenderer(int row, int column) {
final Object valueHere = super.getValueAt(row, column);
return new TableCellRenderer() {
@Override
public Component getTableCellRendererComponent(JTable table, Object value, boolean isSelected, boolean hasFocus, int row, int column) {
JLabel label = new JLabel(String.valueOf(valueHere));
label.setBorder(BorderFactory.createEmptyBorder(0, 4, 0, 0));
ParameterDescriptor param = constantForRow(row);
if (!param.isValid() || !isRowValid(row)) {
if (isSelected) {
label.setBackground(SELECTED_INVALID_COLOR);
} else {
label.setBackground(INVALID_COLOR);
}
} else if (isSelected) {
label.setBackground(SELECTED_COLOR);
}
label.setOpaque(true);
return label;
}
};
}
}
// Variables declaration - do not modify//GEN-BEGIN:variables
private javax.swing.JButton addButton;
private javax.swing.JTable constantsTable;
private javax.swing.JScrollPane jScrollPane1;
private javax.swing.JButton saveButton;
// End of variables declaration//GEN-END:variables
} |
<commit_msg>wa: Add ApkWorkload to default imports
<commit_before>from wa.framework import pluginloader, signal
from wa.framework.command import Command, ComplexCommand, SubCommand
from wa.framework.configuration import settings
from wa.framework.configuration.core import Status
from wa.framework.exception import HostError, JobError, InstrumentError, ConfigError
from wa.framework.exception import (ResultProcessorError, ResourceError,
CommandError, ToolError)
from wa.framework.exception import (WAError, NotFoundError, ValidationError,
WorkloadError)
from wa.framework.exception import WorkerThreadError, PluginLoaderError
from wa.framework.instrumentation import (Instrument, very_slow, slow, normal, fast,
very_fast)
from wa.framework.plugin import Plugin, Parameter
from wa.framework.processor import ResultProcessor
from wa.framework.resource import (NO_ONE, JarFile, ApkFile, ReventFile, File,
Executable)
from wa.framework.workload import Workload, ApkUiautoWorkload, ReventWorkload
<commit_after>from wa.framework import pluginloader, signal
from wa.framework.command import Command, ComplexCommand, SubCommand
from wa.framework.configuration import settings
from wa.framework.configuration.core import Status
from wa.framework.exception import HostError, JobError, InstrumentError, ConfigError
from wa.framework.exception import (ResultProcessorError, ResourceError,
CommandError, ToolError)
from wa.framework.exception import (WAError, NotFoundError, ValidationError,
WorkloadError)
from wa.framework.exception import WorkerThreadError, PluginLoaderError
from wa.framework.instrumentation import (Instrument, very_slow, slow, normal, fast,
very_fast)
from wa.framework.plugin import Plugin, Parameter
from wa.framework.processor import ResultProcessor
from wa.framework.resource import (NO_ONE, JarFile, ApkFile, ReventFile, File,
Executable)
from wa.framework.workload import Workload, ApkWorkload, ApkUiautoWorkload, ReventWorkload
|
// RenderValue properly renders a value
func (r *Renderer) RenderValue(value *Value) {
if r.Autoescape && value.IsString() && !value.Safe {
r.WriteString(value.Escaped())
} else {
r.WriteString(value.String())
}
} |
def reg_check(self, index, filepath, limit):
item_registered = False
if 'days' not in limit and 'regex' not in limit:
index[filepath] = {}
item_registered = True
elif 'days' in limit and 'regex' not in limit:
timestamp = functions.modification_date(filepath=filepath)
keep = functions.keep(
now_timestamp=self.c['now_timestamp'],
timestamp=timestamp,
days=self.c['days'])
if not keep:
if filepath not in index:
index[filepath] = {}
index[filepath]['t'] = unicode(timestamp)
item_registered = True
elif 'days' not in limit and 'regex' in limit:
match = re.search(self.c['regex'], filepath)
if match:
if filepath not in index:
index[filepath] = {}
index[filepath]['r'] = 'True'
item_registered = True
elif 'days' in limit and 'regex' in limit:
timestamp = functions.modification_date(filepath=filepath)
keep = functions.keep(
now_timestamp=self.c['now_timestamp'],
timestamp=timestamp,
days=self.c['days'])
if not keep:
match = re.search(self.c['regex'], filepath)
if match:
if filepath not in index:
index[filepath] = {}
index[filepath]['r'] = 'True'
item_registered = True
if item_registered:
sys.stdout.write(' ...')
sys.stdout.flush()
index = self.add_sizedata(index=index, filepath=filepath)
return index |
#ifndef _VIDEO_DCT_P6_H_
#define _VIDEO_DCT_P6_H_
// #include <VP_Os/vp_os_types.h>
#include <VLIB/video_dct.h>
////////////////////////////////////////////////////
// Parrot proprietary DCT registers
////////////////////////////////////////////////////
// Parrot DCT address: 0xD00B0000
#define DCT_STATUS 0x000 // Status Register
#define DCT_ITEN 0x004 // Interrupt Enable Register
#define DCT_ITACK 0x008 // Interrupt Acknowledge Register
#define DCT_CONTROL 0x040 // Control Register
#define DCT_DMA 0x010 // Dma Register
#define DCT_DMAINT 0x02C
#define DCT_RESET 0x03C
#define DCT_START 0x00C
#define DCT_CONFIG 0x028
#define DCT_ORIG_Y_ADDR 0x044 // Address Register
#define DCT_ORIG_CU_ADDR 0x048 // Address Register
#define DCT_ORIG_CV_ADDR 0x04C // Address Register
#define DCT_DEST_Y_ADDR 0x050 // Address Register
#define DCT_DEST_CU_ADDR 0x054 // Address Register
#define DCT_DEST_CV_ADDR 0x058 // Address Register
#define DCT_LINEOFFSET 0x05C // Line size
#define DCT_Q_ADDR 0x064 // quantization table
//#define DCT_DEBUG 0x030? // Debug register
//#define DCT_SIGNATURE 0x034? // Signature Register
// Registers bitwise definitions
// Status register
#define DCT_STATUS_END_OK (1<<0) // DCT Done
//#define DCT_STATUS_ERROR (1<<1) // DCT Error ?
// Interrupt enable register
#define DCT_ITEN_END_OK (1<<0) // IT Done enable
//#define DCT_ITEN_ERROR (1<<1) // IT Error enable ?
// Interrupt Acknowledge register
#define DCT_ITACK_END_OK (1<<0) // IT Done acknowledge
//#define DCT_ITACK_ERROR (1<<1) // IT Error acknowledge ?
// DCT control mode (forward or inverse dct)
#define DCT_CTRLMODE_FDCT 0
#define DCT_CTRLMODE_IDCT 1
//! write to a DCT register
#define dct_write_reg( _reg_, _value_ ) \
(*((volatile CYG_WORD32 *)(PARROT5_DCT +(_reg_))) = (CYG_WORD32)(_value_))
//! read a DCT register
#define dct_read_reg(_reg_ ) \
(*((volatile CYG_WORD32 *)(PARROT5_DCT+(_reg_))))
typedef enum {
DCT_DMA_INCR = 0, //!< 4 bytes DMA burst
DCT_DMA_INCR4 = 1, //!< 16 bytes DMA burst
DCT_DMA_INCR8 = 2, //!< 32 bytes DMA burst
DCT_DMA_INCR16 = 3, //!< 64 bytes DMA burst
} DCT_DMA_BURST_MODE;
C_RESULT video_dct_p6_init(void);
C_RESULT video_dct_p6_close(void);
C_RESULT video_dct_p6p_init(void);
int16_t* video_fdct_quant_compute(int16_t* in, int16_t* out, int32_t num_macro_blocks,int32_t quant);
//int16_t* video_idct_compute(int16_t* in, int16_t* out, int32_t num_macro_blocks);
#endif // ! _VIDEO_DCT_P6_H_
|
/**
* Adds a new <code>MenuItem</code> using the given menu item ID and menu item flow class.
*
* @param id The ID for the new menu item.
* @param clazz The flow class to start when the menu item is activated.
* @return The added menu item.
* @since 2.0
*/
public MenuItem addMenuItem(String id, Class<? extends Widget> clazz) {
String path = null;
if (StringUtils.contains(id, MENU_PATH_SEPARATOR)) {
path = StringUtils.substringBeforeLast(id, MENU_PATH_SEPARATOR);
id = StringUtils.substringAfterLast(id, MENU_PATH_SEPARATOR);
}
return addMenuItem(path, new MenuItem(id, clazz));
} |
/**
* A simple {@link Fragment} subclass.
*/
public class EstablistmentListFragment extends BaseFragment<EstablismentsViewModel, FragmentEstablistmentListBinding>
implements EstablismentCallback.RestaurantCallback {
private static final String ESTABLISHMENT_ID_KEY = "establishment_id";
private EstablishmentListAdapter adapter;
@Inject
SharedPreferences preferences;
public static EstablistmentListFragment newInstance(Establishment establistment) {
Bundle args = new Bundle();
args.putInt(ESTABLISHMENT_ID_KEY, establistment.getEstablishment().getId());
EstablistmentListFragment fragment = new EstablistmentListFragment();
fragment.setArguments(args);
return fragment;
}
public EstablistmentListFragment() {
// Required empty public constructor
}
@Override
public Class<EstablismentsViewModel> getViewModel() {
return EstablismentsViewModel.class;
}
@Override
public int getLayoutRes() {
return R.layout.fragment_establistment_list;
}
@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
Bundle savedInstanceState) {
super.onCreateView(inflater, container, savedInstanceState);
adapter = new EstablishmentListAdapter(this);
dataBinding.establishmentListRecyclerview.setLayoutManager(new LinearLayoutManager(getContext()));
dataBinding.establishmentListRecyclerview.setAdapter(adapter);
return dataBinding.getRoot();
}
@Override
public void onActivityCreated(@Nullable Bundle savedInstanceState) {
super.onActivityCreated(savedInstanceState);
int establishmentId = getArguments().getInt(ESTABLISHMENT_ID_KEY);
int entityId = SPUtils.getIntegerPreference(preferences, Constants.ENTITY_ID, 0);
String entityType = SPUtils.getStringPreference(preferences, Constants.ENTITY_TYPE);
viewModel.getEstablishmentList(String.valueOf(establishmentId), entityId, entityType).observe(this, restaurants -> {
dataBinding.setListSize(restaurants.size());
adapter.submitList(restaurants);
});
}
@Override
public void onEstablismentClick(Restaurant restaurant, ImageView sharedElement) {
Intent intent = new Intent(getActivity(), RestaurantDetailActivity.class);
intent.putExtra(Constants.RESTAURANTS_BUNDLE_KEY, restaurant);
Bundle options = ActivityOptionsCompat.makeSceneTransitionAnimation(getActivity(), sharedElement, getString(R.string.shared_element_transition_name)).toBundle();
startActivity(intent, options);
}
@Override
public void onEstablismentMarkerClick(Restaurant restaurant) {
LocationUtils.openGoogleMaps(getActivity(), Double.parseDouble(restaurant.getRestaurant().getLocation().getLatitude()), Double.parseDouble(restaurant.getRestaurant().getLocation().getLongitude()));
}
} |
//
// Created by <NAME> on 31/10/14.
// Copyright (c) 2014 ___FULLUSERNAME___. All rights reserved.
//
#include "Plane.h"
#include "NormalToNoise.h"
#include <iostream>
using namespace kick;
using namespace glm;
using namespace std;
ThePlane::ThePlane(kick::GameObject *gameObject) : Component(gameObject) {
MeshRenderer* mr = gameObject->addComponent<MeshRenderer>();
Material *material = new Material();
material->setShader(Project::loadShader("poly-assets/shaders/diffuse_vertex_colored.shader"));
mr->setMaterial(material);
Mesh *mesh = new Mesh();
mesh->setMeshData(AddNoise(loadPlyData("poly-assets/models","plane.ply")));
mr->setMesh(mesh);
{
auto propellerGO = Engine::activeScene()->createGameObject("Propeller");
MeshRenderer* mr = propellerGO->addComponent<MeshRenderer>();
Material *material = new Material();
material->setShader(Project::loadShader("poly-assets/shaders/diffuse_vertex_colored.shader"));
mr->setMaterial(material);
Mesh *mesh = new Mesh();
mesh->setMeshData(AddNoise(loadPlyData("poly-assets/models","plane-propeller.ply")));
mr->setMesh(mesh);
propeller = propellerGO->transform();
propeller->setParent(transform());
}
transform()->setPosition(vec3{0,20,0});
if (particles.size() == 0){
for (int i=0;i<50;i++){
auto p = Engine::activeScene()->createGameObject("Particle");
particles.push_back(p->addComponent<Exhaustion>());
}
}
}
void ThePlane::update() {
static float exCount = 0;
bool accelerate = KeyInput::pressed(Key::SPACE) || MouseInput::pressed(0);
if (accelerate) {
exCount += Time::delta();
}
if (exCount>0.25f){
exCount = 0;
currentParticle++;
if (currentParticle >= particles.size()){
currentParticle = 0;
}
particles[currentParticle]->spawn(transform()->position());
}
auto trans = transform();
if (accelerate){
float acc = 10;
propellerSpeed += Time::delta()*acc;
propellerSpeed = std::min(500.0f,propellerSpeed);
} else {
float deacc = 0.5f;
propellerSpeed -= propellerSpeed*deacc*Time::delta();
propellerSpeed = std::max(10.0f,propellerSpeed);
}
propellerRotation += propellerSpeed * Time::delta();
float speedFactor = 0.5f;
trans->setPosition(trans->position() + trans->forward()*propellerSpeed*speedFactor*Time::delta());
float torqueAcc = 10;
if (MouseInput::pressed(0)){
float relative = (MouseInput::position().x / (float)Engine::context()->getContextSurfaceDim().x)*2-1;
torque.x -= relative*Time::delta()*torqueAcc;
} else {
if (KeyInput::pressed(Key::LEFT) || KeyInput::pressed(Key::LSHIFT) || KeyInput::pressed(Key::a)) {
torque.x += Time::delta() * torqueAcc;
}
if (KeyInput::pressed(Key::RIGHT) || KeyInput::pressed(Key::RSHIFT) || KeyInput::pressed(Key::d)) {
torque.x -= Time::delta() * torqueAcc;
}
}
/*
if (keyInput.pressed(Key::UP)){
torque.y += Time::getDeltaTime()*torqueAcc;
}
if (keyInput.pressed(Key::DOWN)){
torque.y -= Time::getDeltaTime()*torqueAcc;
} */
torque = glm::clamp(torque, vec2{-1.0f},vec2{1.0f});
float deTorqueAcc = 5;
torque -= torque*deTorqueAcc *Time::delta();
rotation.x += torque.x * Time::delta();
rotation.y += sin(rotation.x) * Time::delta();
trans->setRotation(angleAxis(rotation.y, vec3{0,1,0}) * angleAxis(rotation.x, vec3{0,0,1}));
propeller->setRotationEuler(vec3{0,0,propellerRotation });
}
|
/**
* A {@link io.reactivex.Observer} which has a stable tag. Must be used with {@link AutoResubscribe}
* annotation to set the tag before observer is used.
*/
public abstract class AutoResubscribingObserver<T> implements TaggedObserver<T> {
private String tag;
public final String getTag() {
return tag;
}
void setTag(String tag) {
this.tag = tag;
}
@Override
public void onComplete() {
}
@Override
public void onError(Throwable e) {
}
@Override
public void onNext(T t) {
}
@Override public void onSubscribe(@NonNull Disposable d) {
}
} |
If ‘sex, drugs, and rock and roll’ are the main ingredients of a great rock story, then Mötley Crüe’s oral history The Dirt: Confessions of the World’s Most Notorious Rock Band has it all. My favorite story in the book is recounted here. This jaw-dropping, debaucherous Ozzy Osbourne tale is one of the many reasons that this book landed on my list of the best rock and roll books of all time.
The story starts on a sunny afternoon at a Florida resort/hotel where Ozzy and the Crüe had stopped on tour. Ozzy is wearing a dress that he’d stolen from the purse of an elderly woman (due to some other antics in the bar). The Ozzy/Crüe entourage is hanging by the pool when they run out of cocaine. When Nikki Sixx tells Ozzy there’s “no blow” left, Ozzy insists on being handed a straw. In Nikki’s words:
I handed him the straw, and he walked over to a crack in the sidewalk and bent over it. I saw a long column of ants, marching to a little dugout built where the pavement met the dirt. And as I thought, “No, he wouldn’t,” he did. He put the straw to his nose and, with his bare white ass peeking out from under the dress like a sliced honeydew, sent the entire line of ants tickling up his nose with a single, monstrous snort. He stood up, reared back his head, and concluded with a powerful rightnostriled sniff that probably sent a stray ant or two dripping down his throat. Then he hiked up the sundress, grabbed his dick, and pissed on the pavement. Without even looking at his growing audience–everyone on the tour was watching him while the old women and families on the pool deck were pretending not to–he knelt down and, getting the dress soggy in the puddle, lapped it up. He didn’t just flick it with his tongue, he took a half-dozen long, lingering, and thorough strokes, like a cat. Then he stood up and, eyes blazing and mouth wet with urine, looked straight at me. “Do that, Sixx!”
Read the disgusting conclusion, and a million other debaucherous rock tales, in Mötley Crüe’s The Dirt. |
#pragma once
#include <3ds/types.h>
#include <string.h>
void progIdToStr(char *strEnd, u64 progId);
|
/**
* Checks if the database needs to be synced with the file system
*
* @return True if the database needs to be synced
*/
private boolean needDatabaseSync() {
long folderTimestamp = WordsWithCrossesApplication.CROSSWORDS_DIR.lastModified();
long lastDBSync = prefs.getLong(PREF_LAST_DB_SYNC_TIME, 0);
return (folderTimestamp > lastDBSync);
} |
// Parse an interval. Can be expressed in hours, days, weeks, months or years.
// Return the time interval in seconds.
func ParseInterval(intv string) (int, error) {
alias, ok := intervalAliases[intv]
if ok {
intv = alias
}
if len(intv) == 0 {
return 0, fmt.Errorf("empty interval")
}
var result int
var suffix byte
var err error
if strings.Contains("ymwdh", string(intv[len(intv)-1])) {
result, err = strconv.Atoi(intv[:len(intv)-1])
suffix = intv[len(intv)-1]
} else {
result, err = strconv.Atoi(intv)
}
if err != nil {
return 0, err
}
switch suffix {
case 'y':
result *= 365 * 24 * 3600
case 'm':
result *= 30 * 24 * 3600
case 'w':
result *= 7 * 24 * 3600
case 'd':
result *= 24 * 3600
case 'h':
result *= 3600
default:
if suffix != 0 {
return 0, fmt.Errorf("invalid suffix: %v", suffix)
}
}
return result, nil
} |
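The suffix-multiplier logic of `ParseInterval` can be sketched compactly in Python; the function and table names below are illustrative, not part of any original package:

```python
# Minimal sketch of ParseInterval's logic: an optional trailing suffix
# ('y', 'm', 'w', 'd', 'h') scales the numeric prefix into seconds;
# a bare number is already in seconds.
_MULTIPLIERS = {
    'y': 365 * 24 * 3600,
    'm': 30 * 24 * 3600,
    'w': 7 * 24 * 3600,
    'd': 24 * 3600,
    'h': 3600,
}


def parse_interval(intv: str) -> int:
    """Return the interval in seconds; raise ValueError on bad input."""
    if not intv:
        raise ValueError("empty interval")
    suffix = intv[-1]
    if suffix in _MULTIPLIERS:
        return int(intv[:-1]) * _MULTIPLIERS[suffix]
    return int(intv)  # plain number: already seconds


print(parse_interval("2w"))  # -> 1209600
```

Note that, as in the Go version, 'm' means months (30 days), not minutes.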
import { Command } from "commander";
import * as fs from "fs";
import * as os from "os";
import ow from "ow";
import * as rimraf from "rimraf";
import { MockReadable, MockWritable, stdio as stdioMock } from "stdio-mock--fork-by-wiseteam";
import { App } from "../app";
import { Context } from "../Context";
import { d } from "../util/util";
export class CliTestHelper {
private static REQUIRED_EXEC_ARGS = ["node", "index.js"];
private started: boolean = false;
private args: string[] = [];
private stdio: { stdin: MockReadable; stderr: MockWritable; stdout: MockWritable };
private tmpDir: string;
public constructor() {
this.stdio = stdioMock();
this.tmpDir = this.uniqueTmpDirName();
}
public setArgs(args: string[]): CliTestHelper {
this.args = args;
ow(this.args, ow.array.ofType(ow.string.nonEmpty).label("args"));
return this;
}
public createTestAppContext(): Context {
const context = new Context();
context.stdout = this.stdio.stdout;
context.stdin = this.stdio.stdin;
context.stderr = this.stdio.stderr;
context.log = (msg: string) => this.stdio.stdout.write(msg + "\n");
context.error = (msg?: string, error?: Error) =>
this.stdio.stderr.write(
(msg ? msg : "") +
"\n" +
(error ? d(error).name + ": " + d(error).message + "\n " + (d(error).stack || "") : ""),
);
context.debug = (msg: string) => this.stdio.stdout.write(msg + "\n");
context.info = (msg: string) => this.stdio.stdout.write(msg + "\n");
context.exception = (exception: Error, level?: string) =>
this.stdio.stderr.write(exception.name + ": " + exception.message + "\n " + (exception.stack || ""));
return context;
}
public mockCliTest() {
this.ensureSingleStart();
const appContext = this.createTestAppContext();
const app = new App(appContext, new Command(), [...CliTestHelper.REQUIRED_EXEC_ARGS, ...this.args]);
return { appContext, app, stdio: this.stdio };
}
public async testSetup() {
this.createTestDirectory();
}
public async testTeardown() {
await this.removeRecursivelyTestDirectory();
}
public getStderrLines(): string[] {
return this.stdio.stderr.data();
}
public getStderr(): string {
return this.getStderrLines().join("\n");
}
public getStdoutLines(): string[] {
return this.stdio.stdout.data();
}
public getStdout(): string {
return this.getStdoutLines().join("\n");
}
public writeToStdin(str: string) {
this.stdio.stdin.write(str);
}
public endStdin() {
this.stdio.stdin.end();
}
public getTmpDir(): string {
return this.tmpDir;
}
public writeFile(relativeFilePath: string, contents: string): string {
const filePath = `${this.getTmpDir()}/${relativeFilePath}`;
fs.writeFileSync(filePath, contents, "UTF-8");
return filePath;
}
private uniqueTmpDirName(): string {
return os.tmpdir() + "/wise_cli_test_" + Date.now();
}
private createTestDirectory() {
fs.mkdirSync(this.tmpDir);
}
private removeRecursivelyTestDirectory(): Promise<void> {
return new Promise((resolve, reject) => {
rimraf(this.tmpDir, (error?: Error) => {
if (error) reject(error);
else resolve();
});
});
}
private ensureSingleStart() {
if (this.started) throw new Error("You can start CliTestHelper only once");
this.started = true;
}
}
|
def range(self):
return self.data.index[0], self.data.index[-1] |
// MultivariateFunction.java
//
// (c) 2006- JEBL development team
//
// based on LGPL code from the Phylogenetic Analysis Library (PAL),
// http://www.cebl.auckland.ac.nz/pal-project/
// which is (c) 1999-2001 PAL Development Core Team
//
// This package may be distributed under the
// terms of the Lesser GNU General Public License (LGPL)
package jebl.math;
/**
* interface for a function of several variables
*
* @author <NAME>
*/
public interface MultivariateFunction
{
/**
* compute function value
*
* @param argument function argument (vector)
*
* @return function value
*/
double evaluate(double[] argument);
/**
* get number of arguments
*
* @return number of arguments
*/
int getNumArguments();
/**
* get lower bound of argument n
*
* @param n argument number
*
* @return lower bound
*/
double getLowerBound(int n);
/**
* get upper bound of argument n
*
* @param n argument number
*
* @return upper bound
*/
double getUpperBound(int n);
/**
* @return an OrthogonalHints object that can be used by orthogonal-based optimisers
* to get information about the function, or null if no such information is available
*/
OrthogonalHints getOrthogonalHints();
}
|
def main(unused_argv):
del unused_argv
gin.parse_config_files_and_bindings([FLAGS.gin_file], FLAGS.gin_param)
seed = gin.query_parameter('%seed')
results_dir = gin.query_parameter('%results_dir')
results_dir = os.path.normpath(results_dir)
num_samps = gin.query_parameter('%num_samps')
num_steps = gin.query_parameter('%num_steps')
if not os.path.exists(results_dir):
os.makedirs(results_dir)
torch.manual_seed(seed)
simulation, data_args, _, _ = get_simulation()
inv_cdfs, loan_repaid_probs, pis, _, scores, \
rate_indices = data_args
rate_index_A, rate_index_B = rate_indices
def check(results):
for k, v in results.items():
if isinstance(v, torch.Tensor):
if torch.isnan(v).any():
msg = 'NaN spotted in results for variable ' + k
raise ValueError(msg)
outcome_curve_A = []
outcome_curve_B = []
utility_curve_A = []
utility_curve_B = []
for selection_rate in tqdm(rate_index_A):
f_T = get_dempar_policy_from_selection_rate(selection_rate, inv_cdfs)
simulation.intervene(f_T=f_T)
results = simulation.run(num_steps, num_samps)
check(results)
DeltaA, _ = [mdj.item() for mdj in results['Deltaj']]
if (results['A'] != 0).all():
UmathcalA = 0.
else:
batched_A_mask = \
torch.unsqueeze(results['A'] == 0, 0).repeat(num_steps, 1)
UmathcalA = torch.mean(results['u'][batched_A_mask]).item()
outcome_curve_A.append(DeltaA)
utility_curve_A.append(UmathcalA)
for selection_rate in tqdm(rate_index_B):
f_T = get_dempar_policy_from_selection_rate(selection_rate, inv_cdfs)
simulation.intervene(f_T=f_T)
results = simulation.run(num_steps, num_samps)
check(results)
_, DeltaB = [mdj.item() for mdj in results['Deltaj']]
if (results['A'] != 1).all():
UmathcalB = 0.
else:
batched_A_mask = \
torch.unsqueeze(results['A'] == 1, 0).repeat(num_steps, 1)
UmathcalB = torch.mean(results['u'][batched_A_mask]).item()
outcome_curve_B.append(DeltaB)
utility_curve_B.append(UmathcalB)
outcome_curve_A = np.array(outcome_curve_A)
outcome_curve_B = np.array(outcome_curve_B)
utility_curves = np.array([
utility_curve_A,
utility_curve_B,
])
util_MP = np.amax(utility_curves, axis=1)
utility_curves_MP = np.vstack(
[utility_curves[0] + util_MP[1], utility_curves[1] + util_MP[0]]
)
utility_curves_DP = [[], []]
for i in tqdm(range(len(rate_index_A))):
beta_A = rate_index_A[i]
beta_B = rate_index_B[i]
f_T_at_beta_A = get_dempar_policy_from_selection_rate(
beta_A, inv_cdfs)
simulation.intervene(f_T=f_T_at_beta_A)
results = simulation.run(num_steps, num_samps)
check(results)
Umathcal_at_beta_A = results['Umathcal'].item()
utility_curves_DP[0].append(Umathcal_at_beta_A)
f_T_at_beta_B = get_dempar_policy_from_selection_rate(
beta_B, inv_cdfs)
simulation.intervene(f_T=f_T_at_beta_B)
results = simulation.run(num_steps, num_samps)
check(results)
Umathcal_at_beta_B = results['Umathcal'].item()
utility_curves_DP[1].append(Umathcal_at_beta_B)
utility_curves_DP = np.array(utility_curves_DP)
utility_curves_EO = [[], []]
for i in tqdm(range(len(rate_index_A))):
beta_A = rate_index_A[i]
beta_B = rate_index_B[i]
f_T_at_beta_A = get_eqopp_policy_from_selection_rate(
beta_A, loan_repaid_probs, pis, scores)
simulation.intervene(f_T=f_T_at_beta_A)
results = simulation.run(num_steps, num_samps)
check(results)
Umathcal_at_beta_A = results['Umathcal'].item()
utility_curves_EO[0].append(Umathcal_at_beta_A)
f_T_at_beta_B = get_dempar_policy_from_selection_rate(
beta_B, inv_cdfs)
simulation.intervene(f_T=f_T_at_beta_B)
results = simulation.run(num_steps, num_samps)
check(results)
Umathcal_at_beta_B = results['Umathcal'].item()
utility_curves_EO[1].append(Umathcal_at_beta_B)
utility_curves_EO = np.array(utility_curves_EO)
results.update(dict(
rate_index_A=rate_index_A,
rate_index_B=rate_index_B,
outcome_curve_A=outcome_curve_A,
outcome_curve_B=outcome_curve_B,
utility_curves_MP=utility_curves_MP,
utility_curves_DP=utility_curves_DP,
utility_curves_EO=utility_curves_EO))
if results_dir not in ('.', ):
cmd = 'python ' + ' '.join(sys.argv)
with open(os.path.join(results_dir, 'command.sh'), 'w') as f:
f.write(cmd)
with open(__file__, 'r') as src:
    this_script = src.readlines()
with open(os.path.join(results_dir, __file__), 'w') as f:
f.write(''.join(this_script))
results_filename = os.path.join(results_dir, 'results.p')
with open(results_filename, 'wb') as f:
_ = pickle.dump(results, f)
with open(os.path.join(results_dir, 'config.gin'), 'w') as f:
f.write(gin.operative_config_str()) |
Expert answer
Dear Gina, Lots of psychiatric diagnoses generate controversy in the general public (e.g. attention deficit hyperactivity disorder, juvenile bipolar disorder), although they are noncontroversial in the mental health world. On the other hand, if you want to see mental health professionals spat with each other, ask a few of them what they think about dissociative identity disorder, the condition that used to be more colorfully known as multiple personality disorder.
Many biological psychiatrists who base their practices around medication management will tell you the condition doesn't exist, or that if it exists it is "iatrogenic," meaning it is caused by therapists training their patients to interpret their symptoms as if they have a whole set of distinct personalities. On the other hand, there are clinicians who specialize in the condition and they take the presence of multiple personalities so seriously that they will separate therapeutic meetings with each of a patient's "alters" (i.e. individual personalities). True believers will point to data that different personalities have different electroencephalogram tracings. Cynics will point out that actors can generate different EEG tracings when they switch characters.
Like all psychiatrists, I have my own opinion about dissociative disorders. I like to think it is a middle-of-the-road position, but I'll let you judge for yourself.
The dictionary defines dissociation as "an unexpected partial or complete disruption of the normal integration of a person's conscious or psychological functioning that cannot be easily explained by the person." I don't think anyone could doubt that this phenomenon exists. You can do the mental experiment. Think about a time when you were driving a car and suddenly realized you'd completely lost attention to the last number of miles, or that you'd missed a turn without even realizing it. That is dissociation -- you are doing something important and you lose track of the part of yourself that is doing it.
Like all other mental difficulties, dissociation runs a spectrum from normal to extremely pathological. In my clinical experience it is very common for traumatized and/or very mentally ill people to manifest high rates of dissociation. People who dissociate a lot have conscious experience that is like Swiss cheese: full of holes. But unlike sadness, anger or clear psychosis, it is not usually readily apparent, so it gets less attention than it should. People who suffer with this rarely complain about it, because almost by definition, their fragmented conscious awareness makes it very difficult for them to even notice that they are missing things and/or not aware. We also do not have good pharmacological interventions to reduce dissociation, so it has gotten less money behind it than have many other mental conditions.
There is no doubt that some people behave as if they have multiple personalities. And not all of them have been to therapists who have trained them to interpret their dissociative experiences in this way. Does this mean that dissociative identity disorder exists? In my opinion it depends on what we mean by "exists." Yes, dissociative identity disorder exists if by exists we mean there are people who complain of its symptoms and suffer its consequences. Do I think that some people have many biologically distinct entities packed into their heads? No. I think that some people dissociate so badly that either on their own or as a result of therapeutic experiences it becomes the case that the most convincing way for them to see their own experience is as if it is happening to multiple people.
If this sounds like an endorsement of the condition, it is in a qualified way. I am personally less sanguine, however, about treatments that proceed as if each of the separate personalities really exists concretely and then work to integrate them again. This is the most common therapeutic way to treat the disorder, but I have seen precious few successes and a lot of people made worse by this intervention. In all fairness, however, I used to work intensively on inpatient psychiatric wards and had to care for the train wrecks left behind when integrative therapies failed, so maybe I'm negatively biased.
Here is a final strange paradox regarding the question of whether dissociative identity disorder exists. Whether clinicians believe or disbelieve, they will all tell you that it is one of the most serious psychiatric difficulties. Patients who demonstrate dissociative identity disorder symptoms are all extremely ill in my experience. They have frequently undergone significant trauma, especially early in life. The chaos of their personalities and behavior often leave a tornado track in their wake, and they suffer tremendous emotional discomfort and anxiety. And, as I mentioned above, unlike mental conditions such as depression or psychosis, for which good -- although far from perfect -- treatments exist, there is very little evidence that any currently available interventions are of much help. |
import pandapower.control.basic_controller
import pandapower.control.controller
# --- Controller ---
from pandapower.control.controller.const_control import ConstControl
from pandapower.control.controller.characteristic_control import CharacteristicControl
from pandapower.control.controller.trafo.ContinuousTapControl import ContinuousTapControl
from pandapower.control.controller.trafo.DiscreteTapControl import DiscreteTapControl
from pandapower.control.controller.trafo.VmSetTapControl import VmSetTapControl
from pandapower.control.controller.trafo.USetTapControl import USetTapControl # TODO: drop after next release
from pandapower.control.controller.trafo.TapDependentImpedance import TapDependentImpedance
from pandapower.control.controller.trafo_control import TrafoController
# --- Other ---
from pandapower.control.run_control import *
from pandapower.control.run_control import ControllerNotConverged
from pandapower.control.util.characteristic import Characteristic, SplineCharacteristic
from pandapower.control.util.auxiliary import get_controller_index, plot_characteristic, create_trafo_characteristics
from pandapower.control.util.diagnostic import control_diagnostic, trafo_characteristics_diagnostic
|
The Missouri Lottery Optimizes Its Scheduling and Routing to Improve Efficiency and Balance
The Missouri lottery, a profit-driven nonprofit organization, generates annual revenues of over $800 million by selling lottery tickets; 27.5 percent of the revenue goes to Missouri's public education programs. The lottery sales representatives (LSRs) play a central role in increasing sales by providing excellent customer service to ticket retailers throughout the state. Hence, LSRs must have equitable, balanced work schedules and efficient routes and navigation sequences. Our objective was to provide scheduling and routing policies that minimize LSRs' total travel distance while balancing their workloads and meeting visitation constraints. We modeled the problem as a periodic traveling-salesman problem and developed improvement algorithms specifically to solve this problem. The newly implemented schedules and routes decrease the LSRs' travel distance by 15 percent, improve visitation feasibility by 46 percent, increase the balance of routes by 63 percent, decrease overtime days by 32 percent, and indirectly increase the sales of lottery tickets by improving customer service. |
import React from "react";
import Button, { ButtonProps } from "@arteneo/forge/components/Common/Button";
import axios, { AxiosRequestConfig } from "axios";
import { useHandleCatch } from "@arteneo/forge/contexts/HandleCatch";
import { useLoader } from "@arteneo/forge/contexts/Loader";
interface ButtonDownloadInterface {
requestConfig: AxiosRequestConfig;
}
type ButtonDownloadProps = ButtonDownloadInterface & ButtonProps;
const ButtonDownload = ({ requestConfig, ...props }: ButtonDownloadProps) => {
const { showLoader, hideLoader } = useLoader();
const handleCatch = useHandleCatch();
const onClick = () => {
showLoader();
axios
.request(Object.assign({ responseType: "blob" }, requestConfig))
.then((response) => {
hideLoader();
const url = window.URL.createObjectURL(new Blob([response.data]));
const link = document.createElement("a");
link.href = url;
link.setAttribute("download", response.headers["content-type-filename"]);
link.setAttribute("target", "_blank");
document.body.appendChild(link);
link.click();
})
.catch((error) => handleCatch(error));
};
return (
<Button
{...{
onClick: () => onClick(),
...props,
}}
/>
);
};
export default ButtonDownload;
export { ButtonDownloadProps };
|
package types
import "fmt"
type OrderError struct {
error error
order Order
}
func (e *OrderError) Error() string {
return fmt.Sprintf("%s exchange: %s orderID:%d", e.error.Error(), e.order.Exchange, e.order.OrderID)
}
func (e *OrderError) Order() Order {
return e.order
}
func NewOrderError(e error, o Order) error {
return &OrderError{
error: e,
order: o,
}
}
|
// Associate folders with the right parent folders
// For each new folder, strip off the last component of the folder name.
// Look through new and existing folders for names that match.
void SyncFolderListCommand::MatchParents()
{
const vector<ImapFolderPtr>& folders = m_folderListDiff.GetFolders();
map<string, MojObject> suitableParents;
vector<ImapFolderPtr>::const_iterator it;
BOOST_FOREACH(const ImapFolderPtr& folder, folders) {
assert( !folder->GetId().undefined() );
suitableParents[folder->GetFolderName()] = folder->GetId();
}
BOOST_FOREACH(const ImapFolderPtr& folder, folders) {
const string &folderName = folder->GetFolderName();
size_t end = folderName.find_last_of(folder->GetDelimiter());
if(end != string::npos) {
std::string parentName = folderName.substr(0, end);
if(boost::istarts_with(parentName, ImapFolder::INBOX_FOLDER_NAME)) {
parentName.replace(0, string(ImapFolder::INBOX_FOLDER_NAME).length(), ImapFolder::INBOX_FOLDER_NAME);
}
map<string, MojObject>::iterator parentIt = suitableParents.find(parentName);
if(parentIt != suitableParents.end()) {
MojObject parentId = parentIt->second;
folder->SetParentId(parentId);
}
}
}
} |
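The two passes in `MatchParents` — index every folder by name, then strip the last delimiter-separated component of each name and look the prefix up as a parent — can be sketched in a few lines of Python (function name and dict shapes are illustrative assumptions):

```python
def match_parents(folders, delimiter="/"):
    """Sketch of the parent-matching pass: `folders` maps folder name to id.
    Returns a dict mapping each folder name to its parent's id, or None
    when the name has no delimiter or no folder matches the prefix."""
    parents = {}
    for name in folders:
        head, sep, _tail = name.rpartition(delimiter)
        parents[name] = folders.get(head) if sep else None
    return parents
```

The original additionally normalizes the case of a leading `INBOX` component before the lookup; that step is omitted here.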
import { Commentary } from '../schemas/commentary.schema';
export interface ICommentaryRepository {
create(dto: Commentary);
getAll();
getById(id: string);
update(id: string, dto: Commentary);
delete(id: string);
}
|
// WaitForHealthy polls a URL repeatedly until the server responds with a
// non-server-error status, the context is canceled, or the context's deadline
// is met. WaitForHealthy returns an error in the latter two cases.
func WaitForHealthy(ctx context.Context, u *url.URL) error {
req := &http.Request{
Method: http.MethodGet,
URL: u,
}
req = req.WithContext(ctx)
tick := time.NewTicker(100 * time.Millisecond)
defer tick.Stop()
for {
resp, err := http.DefaultClient.Do(req)
if err == nil {
// Close the body so the underlying connection can be reused across retries.
resp.Body.Close()
if statusCodeInRange(resp.StatusCode, 200) || statusCodeInRange(resp.StatusCode, 400) {
return nil
}
}
select {
case <-tick.C:
case <-ctx.Done():
return xerrors.Errorf("wait for healthy: %w", ctx.Err())
}
}
} |
Competitive Inhibitors of Mycobacterium tuberculosis Ribose-5-phosphate Isomerase B Reveal New Information about the Reaction Mechanism
Ribose-5-phosphate isomerase (Rpi), an important enzyme in the pentose phosphate pathway, catalyzes the interconversion of ribulose 5-phosphate and ribose 5-phosphate. Two unrelated isomerases have been identified, RpiA and RpiB, with different structures and active site residues. The reaction catalyzed by both enzymes is thought to proceed via a high energy enediolate intermediate, by analogy to other carbohydrate isomerases. Here we present studies of RpiB from Mycobacterium tuberculosis together with small molecules designed to resemble the enediolate intermediate. The relative affinities of these inhibitors for RpiB follow a different pattern from that observed previously for the RpiA from spinach. X-ray structures of RpiB in complex with the inhibitors 4-phospho-D-erythronohydroxamic acid (Ki 57 μM) and 4-phospho-D-erythronate (Ki 1.7 mM) refined to resolutions of 2.1 and 2.2 Å, respectively, allowed us to assign roles for most active site residues. These results, combined with docking of the substrates in the position of the most effective inhibitor, now allow us to outline the reaction mechanism for RpiBs. Both enzymes have residues that can catalyze opening of the furanose ring of the ribose 5-phosphate and so can improve the efficiency of the reaction. Both enzymes also have an acidic residue that acts as a base in the isomerization step. A lysine residue in RpiAs provides for more efficient stabilization of the intermediate than the corresponding uncharged groups of RpiBs; this same feature lies behind the more efficient binding of RpiA to 4-phospho-D-erythronate.
# md/tw_txt2md-rc.py (from the unfoldingWord-dev/tools repository)
# -*- coding: utf-8 -*-
# This script converts a repository of tW text files from tStudio to .md format,
# to create a valid resource container. The old folder structure has only a single
# folder for all 1000+ files. This script is intended to do the following:
# Convert each .txt file into an equivalent .md file.
# Determine a location under the target folder for the .md file based on
# matching the file name to a file in the English tW structure. Assumes that
# folders are only one level deep under the English and target language folders.
# Global variables
source_dir = r'C:\DCS\Amharic\TW\01'
target_dir = r'C:\DCS\Amharic\am_tw.RPP'
language_code = 'am'
en_ta_dir = r'C:\DCS\English\en_ta.v9'
en_tw_dir = r'C:\DCS\English\en_tw.v10\bible' # should end in 'bible'
import re
import io
import os
import sys
import convert2md
import string
tapage_re = re.compile(r'\[\[.*/ta/man/(.*)]]', flags=re.UNICODE)
pages = []
# Parse the tA manual page name from the link string.
# Add it to the list of pages to be resolved if it is not in the tA manual
def captureArticle(linkstr):
global pages
page = tapage_re.match(linkstr)
if page:
manpage = page.group(1)
path = os.path.join(en_ta_dir, manpage)
if not os.path.isdir(path):
pages.append(manpage)
# Writes a file, articles.txt, containing a list of unresolved references to tA articles.
def dumpArticles(dir):
pages = convert2md.getBadReferences()
if len(pages) > 0:
pagelist = list(set(pages))
pagelist.sort()
path = os.path.join(dir, "articles.txt")
file = io.open(path, "tw", encoding='utf-8', newline='\n')
for article in pagelist:
file.write(article + '\n')
file.close()
sys.stderr.write("Unresolved links, see " + shortname(path) + '\n')
def shortname(longpath):
shortname = longpath
if source_dir in longpath:
shortname = longpath[len(source_dir)+1:]
return shortname
def makeMdPath(fname):
mdName = os.path.splitext(fname)[0] + ".md"
subdir = "other"
for trydir in os.listdir(en_tw_dir):
tryFolder = os.path.join(en_tw_dir, trydir)
if os.path.isdir(tryFolder):
if os.path.isfile( os.path.join(tryFolder, mdName) ):
subdir = trydir
break
mdFolder = os.path.join(target_dir, subdir)
if not os.path.isdir(mdFolder):
os.mkdir(mdFolder)
return os.path.join(mdFolder, mdName)
# Converts .txt file in fullpath location to .md file in target dir.
def convertFile(fname, fullpath):
sys.stdout.write(fname + '\n')
if os.access(fullpath, os.F_OK):
mdPath = makeMdPath(fname)
convert2md.json2md(fullpath, mdPath, language_code, shortname)
# This method is called to convert the text files in the specified folder.
def convertFolder(folder):
for entry in os.listdir(folder):
path = os.path.join(folder, entry)
if entry[0] != '.' and os.path.isdir(path):
convertFolder(path)
elif entry[-4:].lower() == ".txt":
convertFile(entry, path)
# Processes each directory and its files one at a time
if __name__ == "__main__":
if not target_dir.endswith("bible"):
target_dir = os.path.join(target_dir, "bible")
if not os.path.isdir(target_dir):
os.mkdir(target_dir)
if len(sys.argv) > 1 and sys.argv[1] != 'hard-coded-path':
source_dir = sys.argv[1]
if source_dir and os.path.isdir(source_dir):
convertFolder(source_dir)
print("\nDone.")
else:
sys.stderr.write("Path not found: " + source_dir + '\n')
|
/// Fill a row in the network map array.
fn fill_row(line: &str, row: usize, network_map: &mut Array2<u8>) {
for (idx, bit) in line.as_bytes().iter().enumerate() {
network_map[[row, idx]] = *bit;
}
} |
def slurp_nights(self, make_frameqa=False, remove=True, restrict_nights=None,
write_nights=False, **kwargs):
log = get_logger()
if make_frameqa:
self.make_frameqa(**kwargs)
log.info("Resetting QA_Night objects")
self.qa_nights = []
for night in self.mexp_dict.keys():
if restrict_nights is not None:
if night not in restrict_nights:
continue
qaNight = QA_Night(night, specprod_dir=self.specprod_dir, qaprod_dir=self.qaprod_dir)
qaNight.slurp(remove=remove)
self.qa_nights.append(qaNight)
if write_nights:
qaNight.write_qa_exposures() |
def draw(self, dstrect, hollow=False, color=None):
target = pg.Rect(dstrect)
target.width = max(target.width, self.min_width)
target.height = max(target.height, self.min_height)
bounds = self.area
texture = self.texture
if color:
texture.color, color = color, texture.color
texture.draw(
srcrect=(bounds.left, bounds.top, self.left, self.top),
dstrect=(target.left, target.top, self.left, self.top) )
texture.draw(
srcrect=(bounds.left, bounds.top+self.top, self.left,
bounds.height-self.top-self.bottom),
dstrect=(target.left, target.top+self.top, self.left,
target.height-self.top-self.bottom) )
texture.draw(
srcrect=(bounds.left, bounds.bottom-self.bottom,
self.left, self.bottom),
dstrect=(target.left, target.bottom-self.bottom,
self.left, self.bottom) )
texture.draw(
srcrect=(bounds.right-self.right, bounds.top,
self.right, self.top),
dstrect=(target.right-self.right, target.top,
self.right, self.top) )
texture.draw(
srcrect=(bounds.right-self.right, bounds.top+self.top,
self.right,bounds.height-self.bottom-self.top),
dstrect=(target.right-self.right, target.top+self.top,
self.right, target.height-self.bottom-self.top) )
texture.draw(
srcrect=(bounds.right-self.right, bounds.bottom-self.bottom,
self.right, self.bottom),
dstrect=(target.right-self.right, target.bottom-self.bottom,
self.right, self.bottom) )
if not hollow:
    # Stretched center patch; skipped entirely when drawing a hollow frame.
    texture.draw(
        srcrect=(bounds.left+self.left, bounds.top+self.top,
            bounds.width-self.right-self.left,
            bounds.height-self.top-self.bottom),
        dstrect=(target.left+self.left, target.top+self.top,
            target.width-self.right-self.left,
            target.height-self.top-self.bottom) )
texture.draw(
    srcrect=(bounds.left+self.left, bounds.top,
        bounds.width-self.left-self.right, self.top),
    dstrect=(target.left+self.left, target.top,
        target.width-self.left-self.right, self.top) )
texture.draw(
    srcrect=(bounds.left+self.left, bounds.bottom-self.bottom,
        bounds.width-self.left-self.right, self.bottom),
    dstrect=(target.left+self.left, target.bottom-self.bottom,
        target.width-self.left-self.right, self.bottom) )
if color:
texture.color = color
return target |
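The draw calls above tile a nine-patch: four fixed-size corners, four edges stretched along one axis, and a center stretched along both. The geometry can be computed separately from the drawing; this standalone sketch (a hypothetical helper, not part of the class above) returns the nine (srcrect, dstrect) pairs:

```python
def nine_patch_rects(bounds, target, left, top, right, bottom):
    """Return the nine (srcrect, dstrect) pairs for nine-patch scaling.

    bounds and target are (x, y, w, h) tuples; left/top/right/bottom
    are the fixed border thicknesses. All rects are (x, y, w, h).
    """
    bx, by, bw, bh = bounds
    tx, ty, tw, th = target
    # Each slice is (src_pos, dst_pos, src_size, dst_size) along one axis.
    xs = [(bx, tx, left, left),
          (bx + left, tx + left, bw - left - right, tw - left - right),
          (bx + bw - right, tx + tw - right, right, right)]
    ys = [(by, ty, top, top),
          (by + top, ty + top, bh - top - bottom, th - top - bottom),
          (by + bh - bottom, ty + th - bottom, bottom, bottom)]
    return [((sx, sy, sw, sh), (dx, dy, dw, dh))
            for (sy, dy, sh, dh) in ys
            for (sx, dx, sw, dw) in xs]
```

Corners map 1:1, edge slices stretch only in the long direction, and the middle slice absorbs all remaining width and height.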
/**
* Print information about this gene as a row of tab-delimited text
* @return A text line (like a table entry) representing this gene...
*/
public String printInfo()
{
String line = "";
line += getID() + "\t";
line += getName() + "\t";
if (this.is_ncRNA()) line += "ncRNA\t";
else line += "mRNA\t";
line += getChromosome().getUDPName() + "\t";
if (getStrand()) line += "+\t";
else line += "-\t";
line += getTranscriptStart() + "\t";
line += getTranscriptEnd() + "\t";
line += getTranslationStart() + "\t";
line += getTranslationStop() + "\t";
int eCount = getExonCount();
line += eCount + "\t";
for (int i = 0; i < eCount; i++)
{
Exon e = getExon(i);
line += "[" + e.getStart() + "," + e.getEnd() + "]";
}
return line;
} |
package main
import (
"crypto/rand"
"encoding/hex"
"fmt"
"io/ioutil"
"os"
"strings"
"github.com/spf13/pflag"
flag "github.com/spf13/pflag"
"github.com/stutonk/boxutil"
)
const (
errFmt = "%v: fatal; %v\n"
saltLen = 128
usageFmt = `usage: %v [-h, -v] [-s salt] [passphrase]
If no passphrase given, read from STDIN
Options are:
`
verFmt = "%v version %v\n"
version = "1.2.0"
)
var (
appName = os.Args[0]
helpFlag bool
saltFlag setString
verFlag bool
)
type setString struct {
set bool
value string
}
func (sf *setString) Set(x string) error {
sf.value = x
sf.set = true
return nil
}
func (sf *setString) String() string {
return sf.value
}
func (sf *setString) Type() string {
return "string"
}
func init() {
flag.Usage = func() {
fmt.Fprintf(os.Stderr, usageFmt, appName)
flag.PrintDefaults()
fmt.Println()
}
flag.BoolVarP(
&helpFlag,
"help",
"h",
false,
"display this help and exit",
)
flag.VarP(
&saltFlag,
"salt",
"s",
"provide salt as a hexidecimal string",
)
flag.BoolVarP(
&verFlag,
"version",
"v",
false,
"output version information and exit",
)
flag.Parse()
}
func main() {
switch {
case verFlag:
fmt.Printf(verFmt, appName, version)
return
case helpFlag:
flag.Usage()
return
}
defer func() {
if r := recover(); r != nil {
fmt.Fprintf(os.Stderr, errFmt, appName, r)
}
}()
var (
err error
input []byte
salt []byte
)
if len(pflag.Args()) > 0 {
input = []byte(strings.Join(pflag.Args(), " "))
} else {
input, err = ioutil.ReadAll(os.Stdin)
if err != nil {
panic(err)
}
}
if saltFlag.set {
salt, err = hex.DecodeString(saltFlag.value)
if err != nil {
panic(err)
}
} else {
salt = make([]byte, saltLen)
if _, err := rand.Read(salt); err != nil {
panic(err)
}
}
fmt.Println(hex.EncodeToString((*boxutil.Passkey(input, salt))[:]))
fmt.Println(hex.EncodeToString(salt))
}
|
// jgramoll/terraform-provider-jenkins
package main
import (
"fmt"
"os"
"github.com/jgramoll/terraform-provider-jenkins/client"
)
type GenerateTerraformCodeService struct {
jobService *client.JobService
}
func NewGenerateTerraformCodeService(jobService *client.JobService) *GenerateTerraformCodeService {
return &GenerateTerraformCodeService{
jobService: jobService,
}
}
func (s *GenerateTerraformCodeService) GenerateCode(job *client.Job, outputDir string) error {
if err := s.generateProviderCode(outputDir); err != nil {
return err
}
return s.generatePipelineCode(outputDir, job)
}
func (s *GenerateTerraformCodeService) generateProviderCode(outputDir string) error {
tfCodeFile, err := os.Create(fmt.Sprintf("%s/provider.tf", outputDir))
if err != nil {
return err
}
defer tfCodeFile.Close()
_, err = tfCodeFile.Write([]byte(providerCode(s.jobService.Config.Address)))
return err
}
func (s *GenerateTerraformCodeService) generatePipelineCode(outputDir string, job *client.Job) error {
tfCodeFile, err := os.Create(fmt.Sprintf("%s/pipeline.tf", outputDir))
if err != nil {
return err
}
defer tfCodeFile.Close()
_, err = tfCodeFile.Write([]byte(jobCode(job)))
return err
}
|
// implicitMetaAnyPolicy defines an implicit meta policy whose key and sub_policy are policyName, with rule ANY.
func implicitMetaAnyPolicy(policyName string) (*standardConfigPolicy, error) {
implicitMetaPolicy, err := implicitMetaPolicy(policyName, cb.ImplicitMetaPolicy_ANY)
if err != nil {
return nil, fmt.Errorf("failed to make implicit meta ANY policy: %v", err)
}
return &standardConfigPolicy{
key: policyName,
value: implicitMetaPolicy,
}, nil
} |
/**
* Assumes this json object is an array holding elements that will be converted to the requested element type, returning
* a {@link List} of them.
*/
@Override
public <T> List<T> unmarshallList(final JsonNode node,
final Class<T> elementType) {
return this.unmarshallCollection(node,
elementType,
Collectors.toList());
} |
const withTooManyValues = `Name Email Birthday
Bunk <EMAIL> 3/12/48 Uh Oh.`;
export default withTooManyValues;
|
/**
* Formats and returns the selected fields.
*
* @param entity Java object to update.
* @param selectFields Selected fields.
* @return Formatted selected fields.
*/
private static Map<String, Object> EntityToNameValueList(Object entity, List<String> selectFields) {
if (entity == null) {
return null;
}
ObjectMapper mapper = JsonObjectMapper.getMapper();
Map<String, Object> tempEntity = mapper.convertValue(entity, Map.class);
if (tempEntity == null) {
return null;
}
boolean useSelectedFields = (selectFields != null) && (selectFields.size() > 0);
Map<String, Object> mappedEntity = new HashMap<String, Object>();
for (Map.Entry<String, Object> mapEntry : tempEntity.entrySet()) {
String key = mapEntry.getKey();
if (useSelectedFields) {
if (!selectFields.contains(key)) {
continue;
}
}
if (key.equalsIgnoreCase("id")) {
continue;
}
Map<String, Object> namevalueDic = new HashMap<String, Object>();
namevalueDic.put("name", key);
namevalueDic.put("value", mapEntry.getValue());
mappedEntity.put(key, namevalueDic);
}
return mappedEntity;
} |
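The transformation above wraps every surviving field as a {name, value} pair, filters by the selected fields, and always drops "id". This Python sketch (hypothetical, for illustration only) mirrors the Java logic on a plain dict:

```python
def entity_to_name_value(entity, select_fields=None):
    """Wrap each field of `entity` (a dict) as {"name": key, "value": val}.

    When select_fields is non-empty, only those keys are kept; the "id"
    key is always skipped (case-insensitively), as in the Java method.
    """
    if entity is None:
        return None
    mapped = {}
    for key, value in entity.items():
        if select_fields and key not in select_fields:
            continue
        if key.lower() == "id":
            continue
        mapped[key] = {"name": key, "value": value}
    return mapped
```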
package env
import (
"github.com/gobuffalo/envy"
)
// LogLevel returns the system's
// exposure to internal logs. Defaults
// to debug.
func LogLevel() string {
return envy.Get("ATHENS_LOG_LEVEL", "debug")
}
|
package cinema.shows.rest;
import cinema.shows.dtos.InputShowDTO;
import cinema.shows.dtos.ShowDTOFull;
import cinema.shows.dtos.ShowDTOMin;
import cinema.shows.services.ShowServices;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.sql.Date;
import java.util.List;
@RestController
@RequestMapping("/api/shows")
public class ShowRESTAPIImp implements ShowRESTAPI {
private ShowServices showServices;
public ShowRESTAPIImp(ShowServices services) {
this.showServices = services;
}
@GetMapping
public ResponseEntity<List<ShowDTOMin>> getMinShowsByTheater(@RequestParam Integer theaterId) {
List<ShowDTOMin> showDTOsMin = showServices.getMinShowsByTheater(theaterId);
return new ResponseEntity<>(showDTOsMin, HttpStatus.OK);
}
@PostMapping
public ResponseEntity<ShowDTOMin> addShow(@RequestBody InputShowDTO inputShowDTO) {
ShowDTOMin show = showServices.addShow(inputShowDTO);
return new ResponseEntity<>(show, HttpStatus.OK);
}
@GetMapping("/date")
public ResponseEntity<List<ShowDTOMin>> getAllShowsForTheaterForDate(
@RequestParam("date") String date) {
Date dateLooked = Date.valueOf(date);
List<ShowDTOMin> showDTOsMin = showServices.getMinShowsByDate(dateLooked);
return new ResponseEntity<>(showDTOsMin, HttpStatus.OK);
}
@GetMapping("/dates")
public ResponseEntity<List<ShowDTOMin>> getAllShowsForTheaterForDates(
@RequestParam("dateStart") String dateStart,
@RequestParam("dateEnd") String dateEnd) {
Date dateOne = Date.valueOf(dateStart);
Date dateTwo = Date.valueOf(dateEnd);
List<ShowDTOMin> showDTOsMin = showServices.getMinShowsByDates(dateOne,dateTwo);
return new ResponseEntity<>(showDTOsMin, HttpStatus.OK);
}
// @GetMapping("/{id}")
// public ResponseEntity<ShowDTOFull> getShow(@PathVariable int id){
// //here comes the logic
// return new ResponseEntity<>(ShowDTOFull.class, HttpStatus.OK);
// }
}
|
import { Component, OnInit, ViewChild } from '@angular/core';
import { ClarityIcons } from '@clr/icons';
import { JwtHelperService } from '@auth0/angular-jwt';
import { ClrModal } from '@clr/angular';
import { Router } from '@angular/router';
import { version } from 'src/environments/version';
import { UserService } from './services/user.service';
import { FormGroup, FormControl, Validators, ValidatorFn, ValidationErrors } from '@angular/forms';
import { ServerResponse } from './ServerResponse';
import { AppConfigService } from './app-config.service';
import { SettingsService } from './services/settings.service';
import { availableThemes } from './scenario/terminal-themes/themes';
@Component({
selector: 'app-root',
templateUrl: './app.component.html',
styleUrls: ['./app.component.css']
})
export class AppComponent implements OnInit {
public logoutModalOpened: boolean = false;
public aboutModalOpened: boolean = false;
public changePasswordModalOpened: boolean = false;
public new_password1: string = "";
public new_password2: string = "";
public version: string;
public changePwDangerClosed: boolean = true;
public changePwSuccessClosed: boolean = true;
public changePwDangerAlert: string = "";
public changePwSuccessAlert: string = "";
public accessCodeDangerClosed: boolean = true;
public accessCodeSuccessClosed: boolean = true;
public accessCodeDangerAlert: string = "";
public accessCodeSuccessAlert: string = "";
public newAccessCode: boolean = false;
public fetchingAccessCodes: boolean = false;
public accessCodeModalOpened: boolean = false;
public settingsModalOpened: boolean = false;
public fetchingSettings: boolean = false;
private settings: Map<string,string>;
public accesscodes: string[] = [];
public email: string = "";
private Config = this.config.getConfig();
public title = this.Config.title || "Rancher's Hobby Farm";
private logo = this.Config.logo || '/assets/default/logo.svg';
public availableThemes = availableThemes;
private selectedTheme = availableThemes[0];
constructor(
private helper: JwtHelperService,
private userService: UserService,
private router: Router,
private config: AppConfigService,
private settingsService: SettingsService
) {
this.config.getLogo(this.logo)
.then((obj: string) => {
ClarityIcons.add({
"logo": obj
})
})
if (this.Config.favicon) {
var fi = <HTMLLinkElement>document.querySelector("#favicon")
fi.href = this.Config.favicon;
}
if (version.tag) {
this.version = version.tag;
} else {
this.version = version.revision;
}
}
public matchedPasswordValidator: ValidatorFn = (control: FormGroup): ValidationErrors | null => {
var pw1 = control.get("new_password1").value;
var pw2 = control.get("new_password2").value;
return (pw1 && pw2 && (pw1 == pw2)) ? null : { 'passwordMismatch': true }
}
@ViewChild("logoutmodal", { static: true }) logoutModal: ClrModal;
@ViewChild("aboutmodal", { static: true }) aboutModal: ClrModal;
@ViewChild("changepasswordmodal", { static: true }) changePasswordModal: ClrModal;
@ViewChild("accesscodemodal", {static: true}) accessCodeModal: ClrModal;
@ViewChild("settingsmodal", {static: true}) settingsModal: ClrModal;
public passwordChangeForm: FormGroup = new FormGroup({
'old_password': new FormControl(null, [
Validators.required
]),
'new_password1': new FormControl(null, [
Validators.required
]),
'new_password2': new FormControl(null, [
Validators.required
])
}, { validators: this.matchedPasswordValidator })
public newAccessCodeForm: FormGroup = new FormGroup({
'access_code': new FormControl(null, [
Validators.required,
Validators.minLength(4)
])
})
public settingsForm: FormGroup = new FormGroup({
'selected_theme': new FormControl(null, [
Validators.required
])
})
ngOnInit() {
var tok = this.helper.decodeToken(this.helper.tokenGetter());
this.email = tok.email;
}
public logout() {
this.logoutModal.open();
}
public about() {
this.aboutModal.open();
}
public changePassword() {
this.passwordChangeForm.reset();
this.changePasswordModal.open();
}
public openAccessCodes() {
this.newAccessCodeForm.reset();
this.fetchingAccessCodes = true;
this.userService.getAccessCodes()
.subscribe(
(a: string[]) => {
this.accesscodes = a;
this.fetchingAccessCodes = false;
},
(s: ServerResponse) => {
this.accessCodeDangerClosed = false;
this.accessCodeDangerAlert = s.message;
this.fetchingAccessCodes = false;
}
)
this.accessCodeModal.open();
}
public openSettings() {
this.settingsForm.reset();
this.fetchingSettings = true;
this.settingsService.get(true)
.subscribe(
(a: Map<string,string>) => {
this.settings = a;
this.selectedTheme = availableThemes[0]; //Default to "Hobbyfarm Default Terminal" if no settings for theme are provided
availableThemes.forEach(element => {
if(element.theme === a.get("terminal_theme")){
this.selectedTheme = element;
}
});
this.settingsForm.setValue({
'selected_theme': this.selectedTheme
});
this.fetchingSettings = false;
}
);
this.settingsModal.open();
}
public saveAccessCode() {
var a = this.newAccessCodeForm.get("access_code").value;
this.userService.addAccessCode(a)
.subscribe(
(s: ServerResponse) => {
// success
this.accessCodeSuccessAlert = s.message + " added.";
this.accessCodeSuccessClosed = false;
this.accesscodes.push(a);
this.newAccessCode = false;
setTimeout(() => this.accessCodeSuccessClosed = true, 2000);
},
(s: ServerResponse) => {
// failure
this.accessCodeDangerAlert = s.message;
this.accessCodeDangerClosed = false;
setTimeout(() => this.accessCodeDangerClosed = true, 2000);
}
)
}
private _removeAccessCode(a: string) {
var acIndex = this.accesscodes.findIndex((v: string) => {
return v == a;
});
this.accesscodes.splice(acIndex, 1);
}
public deleteAccessCode(a: string) {
this.userService.deleteAccessCode(a)
.subscribe(
(s: ServerResponse) => {
this.accessCodeSuccessAlert = s.message + " deleted.";
this.accessCodeSuccessClosed = false;
this._removeAccessCode(a);
setTimeout(() => this.accessCodeSuccessClosed = true, 2000);
},
(s: ServerResponse) => {
this.accessCodeDangerAlert = s.message;
this.accessCodeDangerClosed = false;
setTimeout(() => this.accessCodeDangerClosed = true, 2000);
}
)
}
public doSaveSettings(){
if(!this.settings){
this.settings = new Map<string,string>();
}
this.settings.set("terminal_theme", this.settingsForm.get('selected_theme').value.theme);
this.settingsService.set(this.settings);
this.userService.updateSettings(this.settings)
.subscribe(
(s: ServerResponse) => {
this.settingsModalOpened = false
},
(s: ServerResponse) => {
setTimeout(() => this.settingsModalOpened = false, 2000);
}
);
}
public doChangePassword() {
this.userService.changepassword(this.passwordChangeForm.get('old_password').value, this.passwordChangeForm.get('new_password2').value)
.subscribe(
(s: ServerResponse) => {
this.changePwSuccessAlert = s.message + ". Logging you out..."
this.changePwSuccessClosed = false;
setTimeout(() => this.doLogout(), 2000);
},
(s: ServerResponse) => {
this.changePwDangerAlert = s.message;
this.changePwDangerClosed = false;
setTimeout(() => this.changePwDangerClosed = true, 2000);
}
)
}
public doLogout() {
localStorage.removeItem("hobbyfarm_token");
this.router.navigateByUrl("/login");
}
}
|
The crude explosive devices used against runners and spectators at the Boston Marathon were similar to the kitchen pot bombs aimed at U.S. troops on foot patrol in Afghanistan, law enforcement and Congressional officials said Tuesday.
For nearly 10 years, presidential directives, Homeland Security Department reports and local law enforcement officials have warned of the eventual threat within the U.S. of “pressure cooker” bombs developed by terrorists in Afghanistan and Iraq.
After briefings from law enforcement and U.S. intelligence officials, Rep. Michael McCaul (R-Tex.) said the two devices that detonated in Boston appeared to be “pressure cooker” bombs. Law enforcement officials later confirmed that the investigation was proceeding on the basis that the devices were “pressure cooker” bombs.
But Rick DesLauriers, the FBI’s special agent in charge in Boston, stressed that “the investigation is in its infancy.” At a news conference in Boston, DesLauriers said there had thus far been “no claims of responsibility and the range of suspects and motives is wide open.”
DesLauriers asked the public to contact authorities if they knew anyone who had recently expressed a special “interest in research in how to create explosive devices.”
In Afghanistan, Taliban operatives who lacked the wherewithal or expertise to build “improvised explosive devices” capable of destroying the huge, V-shaped hull MRAPs (Mine Resistant Ambush Protected vehicles) turned to “pressure cooker” bombs as anti-personnel devices. They would be placed along paths expected to be used by U.S. troops on foot patrol.
In the summer of 2011, explosives specialists from the 2nd Battalion, Eighth Marines, showed several of the devices they had found and defused to an embedded reporter. One of the devices had even been placed in a tree, the specialists said.
The terrorists would take a simple kitchen pot, a pressure cooker was preferred, pack it with ammonium nitrate and metal filings, nails and even rocks to increase the lethality, and rig the device to detonate by wire or by a remote device, sometimes a garage door opener.
When triggered, the blast would seek the path of least resistance and most of the force and the debris would blow out the top of the pressure cooker, in effect acting much like a military “shaped” charge that concentrates the force of an anti-tank shell.
As far back as 2004, a Homeland Security Department memo warned of the pressure cooker bomb threat, calling it "a technique commonly taught in Afghan terrorist training camps."
"Typically, these bombs are made by placing TNT or other explosives in a pressure cooker and attaching a blasting cap at the top of the pressure cooker," the memo said.
The attraction of the pressure cooker bomb for the terrorist is that they are cheap, relatively easy to make, and lethal.
A presidential directive on homeland security in 2007 warned that “The threat of explosive attacks in the United States is of great concern considering terrorists’ ability to make, obtain, and use explosives, the ready availability of components used in IED construction, the relative technological ease with which an IED can be fashioned, and the nature of our free society.”
In 2010, a joint FBI and Homeland Security intelligence report warned that "Placed carefully, such devices provide little or no indication of an impending attack.”
Two years ago, “Inspire,” the online magazine of Al Qaeda in the Arabian Peninsula, published an article on how to make a pressure cooker bomb. The title of the article was “How To Make A Bomb In The Kitchen Of Your Mom.”
use super::*;
#[test]
fn normal() {
let bag = gen_bag();
const VAL: &str = r#"
{
"data":
{
"id": "6720877a-e27e-4e9e-9ac0-3fff4deb55f2",
"type": "comments",
"attributes":
{
"body": "world"
}
},
"links":
{
"self": "comments/6720877a-e27e-4e9e-9ac0-3fff4deb55f2"
}
}
"#;
let mut deserializer = serde_json::Deserializer::from_str(VAL);
let doc_builder = CibouletteBodyBuilder::deserialize(&mut deserializer)
.expect("to parse the json:api document");
let doc = doc_builder
.build(&bag, &CibouletteIntention::Read)
.expect("to build the document");
assert_eq!(doc.is_compound(), false);
assert_eq!(doc.has_all_ids(), true);
assert_eq!(doc.has_data(), true);
}
#[test]
fn no_data() {
let bag = gen_bag();
const VAL: &str = r#"
{
"errors":
{
"id": "toto",
"status": 400
}
}
"#;
let mut deserializer = serde_json::Deserializer::from_str(VAL);
let doc_builder = CibouletteBodyBuilder::deserialize(&mut deserializer)
.expect("to parse the json:api document");
let doc = doc_builder
.build(&bag, &CibouletteIntention::Read)
.expect("to build the document");
assert_eq!(doc.is_compound(), false);
assert_eq!(doc.has_all_ids(), true);
assert_eq!(doc.has_data(), false);
}
#[test]
fn has_all_ids() {
let bag = gen_bag();
const VAL: &str = r#"
{
"data":
{
"type": "comments",
"attributes":
{
"body": "world"
}
},
"links":
{
"self": "comments/6720877a-e27e-4e9e-9ac0-3fff4deb55f2"
}
}
"#;
let mut deserializer = serde_json::Deserializer::from_str(VAL);
let doc_builder = CibouletteBodyBuilder::deserialize(&mut deserializer)
.expect("to parse the json:api document");
let doc = doc_builder
.build(&bag, &CibouletteIntention::Read)
.expect("to build the document");
assert_eq!(doc.is_compound(), false);
assert_eq!(doc.has_all_ids(), false);
assert_eq!(doc.has_data(), true);
}
#[test]
fn compound_simple() {
let bag = gen_bag();
const VAL: &str = r#"
{
"data":
[
{
"id": "6720877a-e27e-4e9e-9ac0-3fff4deb55f2",
"type": "comments",
"attributes":
{
"body": "world"
}
},
{
"id": "44814756-fdff-4d8f-a7d3-6fb11184af81",
"type": "comments",
"attributes":
{
"body": "world2"
}
}
]
}
"#;
let mut deserializer = serde_json::Deserializer::from_str(VAL);
let doc_builder = CibouletteBodyBuilder::deserialize(&mut deserializer)
.expect("to parse the json:api document");
let doc = doc_builder
.build(&bag, &CibouletteIntention::Read)
.expect("to build the document");
assert_eq!(doc.is_compound(), true);
assert_eq!(doc.has_all_ids(), true);
assert_eq!(doc.has_data(), true);
}
#[test]
fn compound_missing_ids() {
let bag = gen_bag();
const VAL: &str = r#"
{
"data":
[
{
"id": "6720877a-e27e-4e9e-9ac0-3fff4deb55f2",
"type": "comments",
"attributes":
{
"body": "world"
}
},
{
"type": "comments",
"attributes":
{
"body": "world2"
}
}
]
}
"#;
let mut deserializer = serde_json::Deserializer::from_str(VAL);
let doc_builder = CibouletteBodyBuilder::deserialize(&mut deserializer)
.expect("to parse the json:api document");
let doc = doc_builder
.build(&bag, &CibouletteIntention::Read)
.expect("to build the document");
assert_eq!(doc.is_compound(), true);
assert_eq!(doc.has_all_ids(), false);
assert_eq!(doc.has_data(), true);
}
#[test]
fn compound_empty() {
let bag = gen_bag();
const VAL: &str = r#"
{
"data":
[
]
}
"#;
let mut deserializer = serde_json::Deserializer::from_str(VAL);
let doc_builder = CibouletteBodyBuilder::deserialize(&mut deserializer)
.expect("to parse the json:api document");
let doc = doc_builder
.build(&bag, &CibouletteIntention::Read)
.expect("to build the document");
assert_eq!(doc.is_compound(), true);
assert_eq!(doc.has_all_ids(), true);
assert_eq!(doc.has_data(), true);
}
|
// VerifiableDecryptWithDomain decrypts the ciphertext, verifying the
// decrypted data against the El-Gamal C2 value.
// If the plaintext does not match, an error is returned
// The Domain component is meant for scenarios where `msg` is used in more
// than just one setting and should be contextualized. The ciphertext must have
// been generated by EncryptWithDomain
func (dk DecryptionKey) VerifiableDecryptWithDomain(domain []byte, cipherText *CipherText) ([]byte, curves.Scalar, error) {
msgBytes, msgScalar, rhs, err := dk.decryptData(cipherText)
if err != nil {
return nil, nil, err
}
ek := dk.EncryptionKey()
genBytes := append(domain, ek.value.ToAffineUncompressed()...)
genBytes = append(genBytes, cipherText.nonce...)
h := ek.value.Hash(genBytes)
lhs := h.Mul(msgScalar)
if !lhs.Equal(rhs) {
return nil, nil, fmt.Errorf("ciphertext mismatch")
}
return msgBytes, msgScalar, nil
} |
# @-*- coding: utf-8 -*-
# @Time: 2018-10-21T21:21:26+08:00
# @Email: [email protected]
import sys
import math
import bisect
MOD = int(1e9+7)
# n = map(int, raw_input().split())
x1, y1 = map(int, raw_input().split())
x2, y2 = map(int, raw_input().split())
n = int(raw_input())
step = raw_input()
dx, dy = x2 - x1, y2 - y1
st = []
ans = -1
now = [0, 0]
for ind, ch in enumerate(step):
if ch == 'U':
now[-1] += 1
elif ch == 'D':
now[-1] -= 1
elif ch == 'L':
now[0] -= 1
elif ch == 'R':
now[0] += 1
st.append((now[0], now[-1]))
INF = int(1e20)
l, r = 0, INF
def ck(midd):
div = midd / n
mod = midd % n
cx, cy = div*st[-1][0], div*st[-1][-1]
if mod:
cx, cy = cx + st[mod-1][0], cy + st[mod-1][-1]
if abs(cx - dx) + abs(cy - dy) <= midd:
return True
else:
return False
while l < r-1:
# print l,r
mid = l + (r-l)/2
if ck(mid):
r = mid
else:
l = mid
if ck(l):
r = l
if r == INF:
print -1
else:
print r
package linkedlists;
public class SumLists {
    // Digits stored in reverse order (1's digit at the head): add two
    // numbers represented as linked lists, propagating the carry forward.
    public static Node sumLists(Node l1, Node l2, int carry){
if(l1 == null && l2 == null && carry ==0)
return null;
int value = carry;
if(l1 != null)
value += l1.data;
if(l2 != null)
value += l2.data;
Node result = new Node(value%10);
if(l1 != null || l2 != null){
            Node more = sumLists(l1 == null ? null:l1.next, l2 == null ? null:l2.next, value >= 10 ? 1:0);
result.next = more;
}
return result;
}
static class PartialSum{
public Node sum = null;
public int carry = 0;
}
public static int length(Node l1){
int length = 0;
Node temp = l1;
while(temp != null){
length++;
temp = temp.next;
}
return length;
}
public static Node padZero(Node l, int padding){
Node head = l;
for(int i=0; i< padding; i++){
head = addBefore(head,0);
}
return head;
}
public static Node addBefore(Node head, int value){
Node node = new Node(value);
if(head != null)
node.next = head;
return node;
}
    // Digits stored in forward order: pad the shorter list with leading
    // zeros, then recurse to the tail and carry back up.
    public static Node addLists(Node l1, Node l2){
int len1 = length(l1);
int len2 = length(l2);
        if(len1<len2){
            l1 = padZero(l1,len2-len1);
        }
        else{
            l2 = padZero(l2,len1-len2);
        }
PartialSum sum = addListsHelper(l1,l2);
if(sum.carry == 0)
return sum.sum;
else
return addBefore(sum.sum,sum.carry);
}
private static PartialSum addListsHelper(Node l1, Node l2) {
if(l1 == null && l2 == null)
return new PartialSum();
PartialSum sum = addListsHelper(l1.next,l2.next);
int value = sum.carry + l1.data + l2.data;
Node result = addBefore(sum.sum, value % 10);
sum.sum = result;
sum.carry = value/10;
return sum;
}
public static void main(String args[]){
Node exam = new Node(7);
Node head = exam;
exam.appendToTail(1);
exam.appendToTail(6);
Node exam2 = new Node(5);
Node head2 = exam2;
exam2.appendToTail(9);
exam2.appendToTail(2);
Node n = head;
while(n!=null){
System.out.print(n.data+" ");
n=n.next;
}
System.out.print("\n");
n = head2;
while(n!=null){
System.out.print(n.data+" ");
n=n.next;
}
System.out.print("\n");
n = addLists(head,head2);
while(n!=null){
System.out.print(n.data+" ");
n=n.next;
}
}
}